spark-user mailing list archives

From Mina Aslani <aslanim...@gmail.com>
Subject Do we need to kill a spark job every time we change and deploy it?
Date Wed, 28 Nov 2018 18:44:10 GMT
Hi,

I have a question for you.
Do we need to kill a Spark job every time we change it and redeploy it to the
cluster? Or is there a way for Spark to automatically pick up the most recent
version of the jar?
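
For context, by "deploy" I mean re-running spark-submit with the rebuilt jar,
roughly like the sketch below (the class name, master, and jar path are just
placeholders, not our actual job):

  spark-submit \
    --class com.example.MyJob \
    --master yarn \
    --deploy-mode cluster \
    my-job-assembly-0.2.jar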

Best regards,
Mina
