spark-user mailing list archives

From "965" <1097828...@qq.com>
Subject Re: Do we need to kill a spark job every time we change and deploy it?
Date Fri, 30 Nov 2018 14:31:51 GMT
I think that if your job is running and you deploy a new jar that is a newer version of the
same application, Spark will treat the new jar as a separate job, since jobs are
distinguished by their Job ID. So if you want to replace the jar, you have to kill the
running job every time.
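As a rough sketch, a typical redeploy cycle on YARN looks like the commands below (this assumes a YARN cluster manager; the application id, class name, and jar path are placeholders, not from the original thread):

    # stop the currently running application
    yarn application -kill <applicationId>

    # resubmit with the rebuilt jar
    spark-submit \
      --class com.example.MyJob \
      --master yarn \
      --deploy-mode cluster \
      my-job-assembly-2.0.jar

If you are on Spark Standalone in cluster mode instead, the equivalent is
spark-submit --kill <submissionId> --master spark://<host>:6066 before resubmitting.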


------------------ Original Message ------------------
From: "Mina Aslani"<aslanimina@gmail.com>;
Sent: Thursday, November 29, 2018, 2:44 AM
To: "user @spark"<user@spark.apache.org>;

Subject: Do we need to kill a spark job every time we change and deploy it?



Hi,

I have a question for you.
Do we need to kill a spark job every time we change and deploy it to the cluster? Or, is there
a way for Spark to automatically pick up the latest jar version?



Best regards,
Mina