I am assuming you are running in YARN cluster mode. Have you tried yarn application -kill <application_id>?
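A minimal sketch of that workflow: list the running applications to find the id, then kill it. The helper below only extracts IDs from the listing output; the ID format (application_<clusterTimestamp>_<sequence>) is an assumption based on common Hadoop versions, so verify it against your cluster first.

```shell
#!/bin/sh
# Extract YARN application IDs from `yarn application -list` output.
# Assumes IDs of the form application_<clusterTimestamp>_<sequence>.
list_app_ids() {
  grep -o 'application_[0-9]*_[0-9]*'
}

# Usage (run on a cluster node with the yarn CLI available):
#   yarn application -list -appStates RUNNING | list_app_ids \
#     | while read id; do yarn application -kill "$id"; done
```

Piping through a helper like this makes it easy to kill several stuck applications in one pass instead of copying IDs by hand.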

With regards,
Sagar Grover
Phone - 7022175584

On Thu, Mar 8, 2018 at 4:03 PM, bsikander <behroz89@gmail.com> wrote:
I have scenarios for both.
So, I want to kill both batch and streaming midway, if required.

Normally, if everything is okay, we don't kill the application, but sometimes
something can go wrong while accessing external resources (like Kafka). In
that case, the application becomes useless because it is no longer doing
anything productive, so we want to kill it midway. When we do, the
application sometimes becomes a zombie and cannot be killed
programmatically (at least, that is what we found). A kill through the
Master UI or a manual kill -9 is required to clean up the zombies.
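For the manual kill -9 route, a sketch of the cleanup: find the leftover driver JVMs by process name, then force-kill them. The "SparkSubmit"/"DriverWrapper" match is an assumption, since the driver's main class varies with deploy mode, so inspect `jps -lm` output on your nodes before running this.

```shell
#!/bin/sh
# Filter `jps -lm`-style lines ("PID main.Class args...") down to the PIDs
# of processes whose main class looks like a Spark driver. The class names
# matched here are assumptions; adjust them to what `jps -lm` shows on
# your cluster.
find_zombie_pids() {
  awk '/SparkSubmit|DriverWrapper/ { print $1 }'
}

# Usage (last resort, run on the affected node):
#   jps -lm | find_zombie_pids | xargs -r kill -9
```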

