Thanks for your replies.

Actually, we can kill a driver with the command "bin/spark-class org.apache.spark.deploy.Client kill <spark-master> <driver-id>" if we know the driver id.
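For example, a minimal sketch assuming a standalone cluster (the master URL and driver id below are placeholders; the driver id is the one printed by spark-submit when you submit with --deploy-mode cluster, and it is also listed on the master web UI):

   # placeholder master host/port and driver id -- substitute your own
   bin/spark-class org.apache.spark.deploy.Client kill spark://<master-host>:7077 driver-20141111111111-0001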

2014-11-11 22:35 GMT+08:00 Ritesh Kumar Singh <riteshoneinamillion@gmail.com>:
There is a property:
   spark.ui.killEnabled
which needs to be set to true to allow killing applications directly from the web UI.
Check the link:
Kill Enable spark job
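As a sketch, the usual places to set it (the file path and submit command are just the standard Spark configuration mechanisms, not anything specific to this thread):

   # in conf/spark-defaults.conf on the node serving the web UI
   spark.ui.killEnabled   true

   # or per application at submit time
   bin/spark-submit --conf spark.ui.killEnabled=true ...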

Thanks

On Tue, Nov 11, 2014 at 7:42 PM, Sonal Goyal <sonalgoyal4@gmail.com> wrote:
The web interface has a kill link. You can try using that.

Best Regards,
Sonal
Founder, Nube Technologies

On Tue, Nov 11, 2014 at 7:28 PM, Tao Xiao <xiaotao.cs.nju@gmail.com> wrote:
I'm using Spark 1.0.0 and I'd like to kill a job running in cluster mode, which means the driver is not running on the local node.

So how can I kill such a job? Is there a command like "hadoop job -kill <job-id>", which kills a running MapReduce job?

Thanks