Thanks for your replies.

Actually, we can kill a driver with the command "bin/spark-class org.apache.spark.deploy.Client kill <spark-master> <driver-id>", if you know the driver ID.
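For anyone following along, a minimal sketch of that invocation (the master URL and driver ID below are placeholders in the standalone-mode format, not values from this thread):

```shell
# Kill a driver submitted in cluster mode on a standalone master.
# Replace the master URL and driver ID with your own; the driver ID
# (e.g. driver-20141111123456-0001) is shown on the master's web UI.
bin/spark-class org.apache.spark.deploy.Client kill \
  spark://master-host:7077 \
  driver-20141111123456-0001
```

The driver ID can be read off the standalone master's web UI (port 8080 by default) under the running/completed drivers tables.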

2014-11-11 22:35 GMT+08:00 Ritesh Kumar Singh <>:
There is a property:
which needs to be set to true for killing applications directly from the web UI.
Check the link:
Kill Enable spark job
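The property name didn't come through above; assuming the one meant is spark.ui.killEnabled (the standard flag for the web UI kill link), the setting would look like this in conf/spark-defaults.conf:

```
# Assumption: spark.ui.killEnabled is the property referred to above.
# Enables the "kill" links on the master/application web UI.
spark.ui.killEnabled  true
```

It can also be passed on submit with --conf spark.ui.killEnabled=true.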


On Tue, Nov 11, 2014 at 7:42 PM, Sonal Goyal <> wrote:
The web interface has a kill link. You can try using that.

Best Regards,
Founder, Nube Technologies 

On Tue, Nov 11, 2014 at 7:28 PM, Tao Xiao <> wrote:
I'm using Spark 1.0.0 and I'd like to kill a job running in cluster mode, which means the driver is not running on the local node.

So how can I kill such a job? Is there a command like "hadoop job -kill <job-id>" that kills a running MapReduce job?