spark-user mailing list archives

From Mayur Rustagi <mayur.rust...@gmail.com>
Subject Re: How to kill a spark app ?
Date Sun, 16 Mar 2014 21:35:56 GMT
Are you embedding your driver inside the cluster?
If not, then that command will not kill the driver. You can simply kill the
application by killing the Scala driver process.
So if it is the spark-shell, killing the shell will disconnect the application
from the cluster.
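
If you control the driver program yourself, a slightly cleaner option than
killing the process is to stop the SparkContext before exiting. A minimal
sketch (sc here is just whatever SparkContext your application created):

  // stop the context so executors are released cleanly, then exit the JVM
  sc.stop()
  sys.exit(0)

Either way the application disconnects from the master; stopping the context
is just the more graceful route.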

If the driver is embedded in the cluster, then the above command is required.

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Sun, Mar 16, 2014 at 5:06 PM, Debasish Das <debasish.das83@gmail.com> wrote:

> Thanks Mayur...
>
> I need both... but to start with, even an application killer will help a lot...
>
> Somehow that command did not work for me... I will try it again from the
> Spark main folder.
>
>
> On Sun, Mar 16, 2014 at 1:43 PM, Mayur Rustagi <mayur.rustagi@gmail.com> wrote:
>
>> This is meant to kill the whole driver hosted inside the Master (new
>> feature as of 0.9.0).
>> I assume you are trying to kill a job/task/stage inside Spark rather than
>> the whole application.
>> Regards
>> Mayur
>>
>> Mayur Rustagi
>> Ph: +1 (760) 203 3257
>> http://www.sigmoidanalytics.com
>> @mayur_rustagi <https://twitter.com/mayur_rustagi>
>>
>>
>>
>> On Sun, Mar 16, 2014 at 4:36 PM, Debasish Das <debasish.das83@gmail.com> wrote:
>>
>>> From
>>> http://spark.incubator.apache.org/docs/latest/spark-standalone.html#launching-applications-inside-the-cluster
>>>
>>>
>>> ./bin/spark-class org.apache.spark.deploy.Client kill <driverId>
>>>
>>>
>>> does not work / has bugs?
>>>
>>>
>>> On Sun, Mar 16, 2014 at 1:17 PM, Mayur Rustagi <mayur.rustagi@gmail.com> wrote:
>>>
>>>> There is no good way to kill jobs in Spark yet. The closest are
>>>> cancelAllJobs & cancelJobGroup on the SparkContext. I have hit bugs using
>>>> both and am still testing them out. Typically you would start a different
>>>> thread & call these functions from it when you wish to cancel a job.
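>>>>
>>>> Roughly what I mean, as an untested sketch (the group id "my-group", the
>>>> local master URL and the dummy slow job are made up for illustration;
>>>> setJobGroup/cancelJobGroup are the SparkContext calls I was referring to):
>>>>
>>>> import org.apache.spark.{SparkConf, SparkContext}
>>>>
>>>> val sc = new SparkContext(
>>>>   new SparkConf().setAppName("cancel-demo").setMaster("local[2]"))
>>>>
>>>> // run the action on its own thread, tagged with a job group
>>>> val worker = new Thread(new Runnable {
>>>>   def run(): Unit = {
>>>>     sc.setJobGroup("my-group", "slow job we may want to cancel")
>>>>     sc.parallelize(1 to 100).map { i => Thread.sleep(1000); i }.count()
>>>>   }
>>>> })
>>>> worker.start()
>>>>
>>>> // from any other thread (e.g. the main one), cancel just that group:
>>>> Thread.sleep(5000)
>>>> sc.cancelJobGroup("my-group")   // or sc.cancelAllJobs() for everything
>>>>
>>>> Note that setJobGroup tags the thread that submits the job, so it has to be
>>>> called on the thread that actually runs the action.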
>>>> Regards
>>>> Mayur
>>>>
>>>> Mayur Rustagi
>>>> Ph: +1 (760) 203 3257
>>>> http://www.sigmoidanalytics.com
>>>>  @mayur_rustagi <https://twitter.com/mayur_rustagi>
>>>>
>>>>
>>>>
>>>> On Sun, Mar 16, 2014 at 2:59 PM, Debasish Das <debasish.das83@gmail.com> wrote:
>>>>
>>>>> Are these the right options:
>>>>>
>>>>> 1. If there is a spark script, just do a ctrl-c from the spark-shell and
>>>>> the job will be killed properly.
>>>>>
>>>>> 2. For a spark application, ctrl-c will also kill the job properly on the
>>>>> cluster.
>>>>>
>>>>> Somehow the ctrl-c option did not work for us...
>>>>>
>>>>> A similar option works fine for Scalding, for example, but we see a lot of
>>>>> dead nodes if too many jobs are killed abruptly.
>>>>>
>>>>> 3. Use the Client script...
>>>>>
>>>>> /bin/spark-class org.apache.spark.deploy.Client kill spark://myspark.com:7077 app-20140316142129-0000
>>>>> Runner java
>>>>> Classpath
>>>>> :/home/debasish/sag_spark/conf:/home/debasish/sag_spark/assembly/target/scala-2.10/spark-assembly-1.0.0-incubating-SNAPSHOT-hadoop2.0.0-mr1-cdh4.5.0.jar
>>>>> Java opts  -Djava.library.path= -Xms512m -Xmx512m
>>>>> Options -Dspark.cores.max=16
>>>>> Sending kill command to spark://myspark.com:7077
>>>>> Driver app-20140316142129-0000 has already finished or does not exist
>>>>>
>>>>> This option also did not kill the job. I can still see the job running
>>>>> on the Spark web UI...
>>>>>
>>>>> Thanks.
>>>>> Deb
>>>>>
>>>>
>>>>
>>>
>>
>
