spark-user mailing list archives

From Tao Xiao <xiaotao.cs....@gmail.com>
Subject Re: How to kill a Spark job running in cluster mode ?
Date Wed, 12 Nov 2014 12:20:30 GMT
Thanks for your replies.

Actually, we can kill a driver with the command "bin/spark-class
org.apache.spark.deploy.Client kill <spark-master> <driver-id>", provided
we know the driver ID.
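For the record, the full invocation looks roughly like this. The master URL and driver ID below are placeholders (the real driver ID shows up in the Master web UI or in the output of the original spark-submit), so adjust them for your cluster:

```shell
# Sketch of killing a driver in standalone cluster mode.
# All values below are illustrative assumptions, not real cluster settings.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"    # assumption: Spark install path
MASTER_URL="spark://master-host:7077"     # assumption: your standalone master
DRIVER_ID="driver-20141112122030-0000"    # assumption: taken from the Master web UI

# The actual kill command (commented out so this sketch runs anywhere):
# "$SPARK_HOME/bin/spark-class" org.apache.spark.deploy.Client kill "$MASTER_URL" "$DRIVER_ID"

# Show the command that would be run:
echo "spark-class org.apache.spark.deploy.Client kill $MASTER_URL $DRIVER_ID"
```

Note that this works for drivers launched with "spark-submit --deploy-mode cluster" against a standalone master; it is not a general-purpose job killer.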

2014-11-11 22:35 GMT+08:00 Ritesh Kumar Singh <riteshoneinamillion@gmail.com>:

> There is a property:
>    spark.ui.killEnabled
> which needs to be set to true for killing applications directly from the
> web UI. See the Spark UI configuration docs:
> <http://spark.apache.org/docs/latest/configuration.html#spark-ui>
>
> Thanks
>
> On Tue, Nov 11, 2014 at 7:42 PM, Sonal Goyal <sonalgoyal4@gmail.com>
> wrote:
>
>> The web interface has a kill link. You can try using that.
>>
>> Best Regards,
>> Sonal
>> Founder, Nube Technologies <http://www.nubetech.co>
>>
>> <http://in.linkedin.com/in/sonalgoyal>
>>
>>
>>
>> On Tue, Nov 11, 2014 at 7:28 PM, Tao Xiao <xiaotao.cs.nju@gmail.com>
>> wrote:
>>
>>> I'm using Spark 1.0.0 and I'd like to kill a job running in cluster
>>> mode, which means the driver is not running on the local node.
>>>
>>> How can I kill such a job? Is there a command like "hadoop job -kill
>>> <job-id>", which kills a running MapReduce job?
>>>
>>> Thanks
>>>
>>
>>
>
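For completeness: the kill link in the web UI that Sonal and Ritesh mention is controlled by the spark.ui.killEnabled property, which (as I understand it) defaults to true. A minimal spark-defaults.conf sketch to make it explicit:

```properties
# spark-defaults.conf (sketch; the value shown is the one being enabled)
spark.ui.killEnabled   true
```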
