spark-user mailing list archives

From Mayur Rustagi <mayur.rust...@gmail.com>
Subject Re: How to kill a spark app ?
Date Sun, 16 Mar 2014 20:17:42 GMT
There is no good way to kill jobs in Spark yet. The closest are
cancelAllJobs and cancelJobGroup on the SparkContext. I have had bugs using
both and am still testing them out. Typically you would start a separate
thread and call these functions from it when you wish to cancel a job.
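
A minimal sketch of that pattern in Scala, assuming an application that
creates its own SparkContext; the group id "demo-group", the local master,
and the sleep-based timing are illustrative, not a tested recipe:

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("cancel-demo").setMaster("local[2]"))

// Run the long job in its own thread, tagged with a job group id.
val worker = new Thread {
  override def run(): Unit = {
    // setJobGroup is thread-local, so call it on the thread that submits the job.
    sc.setJobGroup("demo-group", "long-running job we may want to cancel")
    try {
      sc.parallelize(1 to 1000000, 8).map { i => Thread.sleep(1); i }.count()
    } catch {
      // Cancellation surfaces as a SparkException on the submitting thread.
      case e: Exception => println("job ended: " + e.getMessage)
    }
  }
}
worker.start()

// From a different thread (here, the main one), cancel that job group,
// or fall back to cancelling every running job on the context.
Thread.sleep(2000)
sc.cancelJobGroup("demo-group")
// sc.cancelAllJobs()
worker.join()
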
Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Sun, Mar 16, 2014 at 2:59 PM, Debasish Das <debasish.das83@gmail.com> wrote:

> Are these the right options:
>
> 1. If there is a spark script, just do a ctrl-c from spark-shell and the
> job will be killed properly.
>
> 2. For a spark application, ctrl-c will also kill the job properly on the
> cluster:
>
> Somehow the ctrl-c option did not work for us...
>
> A similar option works fine for Scalding, for example, but we see a lot of
> dead nodes if too many jobs are killed abruptly.
>
> 3. Use the Client script...
>
> /bin/spark-class org.apache.spark.deploy.Client kill spark://myspark.com:7077 app-20140316142129-0000
> Runner java
> Classpath
> :/home/debasish/sag_spark/conf:/home/debasish/sag_spark/assembly/target/scala-2.10/spark-assembly-1.0.0-incubating-SNAPSHOT-hadoop2.0.0-mr1-cdh4.5.0.jar
> Java opts  -Djava.library.path= -Xms512m -Xmx512m
> Options -Dspark.cores.max=16
> Sending kill command to spark://myspark.com:7077
> Driver app-20140316142129-0000 has already finished or does not exist
>
> This option also did not kill the job. I can still see the job running on
> the Spark web UI...
>
> Thanks.
> Deb
>
