spark-user mailing list archives

From: Ritesh Kumar Singh <riteshoneinamill...@gmail.com>
Subject: Re: How to kill a Spark job running in cluster mode ?
Date: Tue, 11 Nov 2014 14:35:05 GMT
There is a property:
   spark.ui.killEnabled
which needs to be set to true so that applications can be killed directly from the web UI.
See the Spark UI configuration docs:
<http://spark.apache.org/docs/latest/configuration.html#spark-ui>
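
For example (just a sketch, assuming you submit with spark-submit and use the
default conf/spark-defaults.conf; the class and jar names below are placeholders),
the flag can be set either in spark-defaults.conf:

    spark.ui.killEnabled   true

or per application on the command line:

    spark-submit --conf spark.ui.killEnabled=true --class com.example.YourApp your-app.jar

Once enabled, the web UI shows a kill link next to active stages/jobs, which is
the link Sonal mentioned below.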

Thanks

On Tue, Nov 11, 2014 at 7:42 PM, Sonal Goyal <sonalgoyal4@gmail.com> wrote:

> The web interface has a kill link. You can try using that.
>
> Best Regards,
> Sonal
> Founder, Nube Technologies <http://www.nubetech.co>
>
> <http://in.linkedin.com/in/sonalgoyal>
>
>
>
> On Tue, Nov 11, 2014 at 7:28 PM, Tao Xiao <xiaotao.cs.nju@gmail.com>
> wrote:
>
>> I'm using Spark 1.0.0 and I'd like to kill a job running in cluster mode,
>> which means the driver is not running on the local node.
>>
>> So how can I kill such a job? Is there a command like "hadoop job -kill
>> <job-id>", which kills a running MapReduce job?
>>
>> Thanks
>>
>
>
