spark-user mailing list archives

From Tao Xiao <xiaotao.cs....@gmail.com>
Subject How to kill a Spark job running in cluster mode ?
Date Tue, 11 Nov 2014 13:58:32 GMT
I'm using Spark 1.0.0 and I'd like to kill a job running in cluster mode,
which means the driver is not running on the local node.

So how can I kill such a job? Is there a command like "hadoop job -kill
<job-id>", which kills a running MapReduce job? A sketch of what I have in
mind is below.
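
For concreteness, the first command below is the MapReduce one I use today;
the other two are only my guesses at possible Spark equivalents (I haven't
verified them on 1.0.0, and the master URL, application ID, and driver ID are
placeholders I made up):

    # Killing a running MapReduce job (the behaviour I'm after):
    hadoop job -kill <job-id>

    # Guess for Spark on YARN: kill the whole YARN application
    yarn application -kill <application-id>

    # Guess for Spark standalone cluster mode: kill the remote driver
    ./bin/spark-class org.apache.spark.deploy.Client kill <master-url> <driver-id>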

Thanks
