spark-user mailing list archives

From: Irving Duran <irving.du...@gmail.com>
Subject: Re: Do we need to kill a spark job every time we change and deploy it?
Date: Wed, 28 Nov 2018 20:33:29 GMT
Are you referring to having Spark pick up a new jar build?  If so, you can
probably script that in bash.
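
A minimal sketch of such a redeploy script, assuming a YARN cluster and
that the job is submitted with an explicit --name; the app name, main
class, and jar path below are placeholders:

    #!/usr/bin/env bash
    # Hypothetical redeploy script: stop the running job, then submit the
    # new jar. Assumes YARN and that the job was started with --name.

    APP_NAME="my-spark-app"             # placeholder app name
    NEW_JAR="/deploy/my-spark-app.jar"  # placeholder path to the new build

    # Look up the YARN application id of the currently running job by name.
    APP_ID=$(yarn application -list -appStates RUNNING 2>/dev/null \
               | grep "$APP_NAME" | awk '{print $1}')

    # Kill the old job if one is running.
    if [ -n "$APP_ID" ]; then
      yarn application -kill "$APP_ID"
    fi

    # Resubmit with the new build; class and master are placeholders.
    spark-submit \
      --name "$APP_NAME" \
      --class com.example.Main \
      --master yarn \
      --deploy-mode cluster \
      "$NEW_JAR"

The spark-submit step would be whatever submit command the job already
uses today.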

Thank You,

Irving Duran


On Wed, Nov 28, 2018 at 12:44 PM Mina Aslani <aslanimina@gmail.com> wrote:

> Hi,
>
> I have a question for you.
> Do we need to kill a Spark job every time we change and deploy it to the
> cluster? Or is there a way for Spark to automatically pick up the most
> recent jar version?
>
> Best regards,
> Mina
>
