spark-user mailing list archives

From sagar grover <sagargrove...@gmail.com>
Subject Re: Properly stop applications or jobs within the application
Date Thu, 08 Mar 2018 11:07:16 GMT
I am assuming you are running in YARN cluster mode. Have you tried
yarn application -kill <application_id> ?
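
As a sketch of that suggestion (the application id below is a placeholder; these commands assume the Hadoop `yarn` client is on the PATH and pointed at your cluster):

```shell
# Find the id of the running Spark application
yarn application -list -appStates RUNNING

# Ask YARN to kill it; YARN signals the ApplicationMaster to shut down
yarn application -kill application_1520000000000_0001
```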

With regards,
Sagar Grover
Phone - 7022175584

On Thu, Mar 8, 2018 at 4:03 PM, bsikander <behroz89@gmail.com> wrote:

> I have scenarios for both.
> So, I want to kill both batch and streaming midway, if required.
>
> Usecase:
> Normally, if everything is okay we don't kill the application, but sometimes
> something can go wrong while accessing external resources (like Kafka). In
> that case the application becomes useless because it is no longer doing
> anything useful, so we want to kill it midway. When we kill it, the
> application sometimes becomes a zombie and doesn't get killed
> programmatically (at least, this is what we found). A kill through the
> Master UI or a manual kill -9 is required to clean up the zombies.
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
>
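
One common way to avoid the zombie scenario described above is to have the driver react to the SIGTERM that YARN sends on kill and exit its main loop cleanly, rather than being force-killed mid-batch. The sketch below is a generic signal-handling pattern, not Spark API: `run_driver` is a hypothetical stand-in for the streaming driver's loop (a real app would stop its StreamingContext gracefully once the flag is set).

```python
import os
import signal
import threading

shutdown = threading.Event()

def handle_term(signum, frame):
    # Flip the flag so the driver loop can exit cleanly.
    shutdown.set()

signal.signal(signal.SIGTERM, handle_term)

def run_driver():
    # Stand-in for the streaming driver's main loop; a real Spark app
    # would stop its streaming context here once the flag is set.
    while not shutdown.is_set():
        shutdown.wait(0.1)
    return "stopped cleanly"

# Simulate YARN delivering SIGTERM shortly after startup.
threading.Timer(0.2, lambda: os.kill(os.getpid(), signal.SIGTERM)).start()
result = run_driver()
print(result)
```

The point of the pattern is that the kill request becomes a cooperative shutdown: in-flight work can finish before the process exits, so the application does not linger as a half-dead zombie that only kill -9 can remove.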
