spark-user mailing list archives

From Bruno Faria <brunocf...@hotmail.com>
Subject Terminate job without killing
Date Wed, 07 Dec 2016 03:03:16 GMT
I have a Python Spark job that runs successfully but never ends (never releases the prompt). I see
messages like "releasing accumulator", but never the expected shutdown message, and the prompt is
never released.


To work around this I call sys.exit(0). Now the job ends, but the application always appears as KILLED,
so I can't control or monitor whether the job finished successfully or not.
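
For reference, the end of the script currently looks roughly like this (the app name and the job
logic are just placeholders, not the real code):

    import sys
    from pyspark import SparkContext

    sc = SparkContext(appName="my_job")  # placeholder app name

    # ... actual job logic goes here ...
    sc.parallelize(range(10)).count()

    # Workaround: force the interpreter to exit so the prompt comes back.
    # Side effect: the application shows up as KILLED.
    sys.exit(0)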


Basically I have 2 questions:

1 - Is sys.exit(0) the best way to end a job, or am I missing something? (I heard sc.stop() is
not a good approach.)

2 - How can I make sure whether the job finished successfully or not? (The idea is to use Airflow
to monitor that.) I sketched below roughly what I had in mind.
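
The sketch below is only what I was considering, not something I know to be correct: stop the
context explicitly and exit with a status code that Airflow (for example a BashOperator wrapping
spark-submit) could check. Again, the app name and job logic are placeholders.

    import sys
    from pyspark import SparkContext

    sc = SparkContext(appName="my_job")  # placeholder app name

    try:
        # ... actual job logic goes here ...
        sc.parallelize(range(10)).count()
        exit_code = 0
    except Exception:
        exit_code = 1
    finally:
        # Shut down the context so executors and accumulators are released.
        sc.stop()

    # Idea: Airflow reads the process exit code to tell success from failure.
    sys.exit(exit_code)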


Any help is really appreciated.


Thanks
