spark-user mailing list archives

From bsikander <>
Subject Re: Properly stop applications or jobs within the application
Date Thu, 08 Mar 2018 10:33:29 GMT
I have scenarios for both batch and streaming.
So, I want to be able to kill either kind of application midway, if required.

Normally, if everything is okay we don't kill the application, but sometimes
something goes wrong while accessing external resources (like Kafka). In
that case the application becomes useless because it is no longer doing
anything productive, so we want to kill it midway. When we do, the
application sometimes turns into a zombie and cannot be killed
programmatically (at least, that is what we found). A kill through the
Master UI or a manual kill -9 is then required to clean up the zombies.
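For the manual cleanup described above, here is a minimal sketch of escalating from a polite SIGTERM to kill -9. It uses a long `sleep` as a self-contained stand-in for the zombie driver process; the PID handling is illustrative, not Spark-specific:

```shell
# Stand-in for a zombie driver JVM: a long-running sleep so the
# example is self-contained and safe to run anywhere.
sleep 300 &
ZOMBIE_PID=$!

# Try a polite SIGTERM first; escalate to SIGKILL (kill -9) only if
# the process is still alive a moment later.
kill "$ZOMBIE_PID" 2>/dev/null
sleep 1
if kill -0 "$ZOMBIE_PID" 2>/dev/null; then
  kill -9 "$ZOMBIE_PID"
fi
wait "$ZOMBIE_PID" 2>/dev/null
```

In standalone mode, the Master's REST submission endpoint (by default on port 6066, route `/v1/submissions/kill/<driverId>`) can perform roughly the same cleanup as the Master UI kill button, if you prefer to script it instead of using kill -9 on the host.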
