spark-user mailing list archives

From satish saley <satishsale...@gmail.com>
Subject Re: killing spark job which is submitted using SparkSubmit
Date Fri, 06 May 2016 20:00:31 GMT
Hi Anthony,

I am passing

                    --master yarn-cluster
                    --name pysparkexample
                    --executor-memory 1G
                    --driver-memory 1G
                    --conf spark.yarn.historyServer.address=http://localhost:18080
                    --conf spark.eventLog.enabled=true
                    --verbose
                    pi.py


I am able to run the job successfully. I just want it to be killed
automatically whenever I kill my application.
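
One way to get this behavior (a sketch, not something discussed in the
thread so far): instead of calling SparkSubmit.main directly, use the
launcher API. Since Spark 1.6, SparkLauncher.startApplication() returns a
SparkAppHandle that can kill the launched job, so a JVM shutdown hook can
take the Spark job down whenever the submitting application exits. The
master, app name, memory settings, and pi.py below mirror the flags above;
the shutdown-hook pattern itself is an assumption, not a SparkSubmit
feature. Note a shutdown hook runs on normal exit and SIGTERM, but not on
SIGKILL (kill -9).

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class MyClass {
    public static void main(String[] args) throws Exception {
        // Launch through the launcher API so we get a handle
        // to the running application.
        SparkAppHandle handle = new SparkLauncher()
                .setMaster("yarn-cluster")
                .setAppName("pysparkexample")
                .setConf("spark.executor.memory", "1g")
                .setConf("spark.driver.memory", "1g")
                .setConf("spark.eventLog.enabled", "true")
                .setAppResource("pi.py")
                .startApplication();

        // Kill the Spark job when this JVM exits.
        Runtime.getRuntime().addShutdownHook(new Thread(handle::kill));

        // Block until the job reaches a terminal state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
    }
}
```

With this in place there is no need to run SparkSubmit a second time with
--kill; the handle does the cleanup when your process dies.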


On Fri, May 6, 2016 at 11:58 AM, Anthony May <anthonymay@gmail.com> wrote:

> Greetings Satish,
>
> What are the arguments you're passing in?
>
> On Fri, 6 May 2016 at 12:50 satish saley <satishsaleyos@gmail.com> wrote:
>
>> Hello,
>>
>> I am submitting a spark job using SparkSubmit. When I kill my
>> application, it does not kill the corresponding spark job. How would I kill
>> the corresponding spark job? I know one way is to use SparkSubmit again
>> with appropriate options. Is there any way through which I can tell
>> SparkSubmit this at the time of job submission itself? Here is my code:
>>
>>
>> import org.apache.spark.deploy.SparkSubmit;
>>
>> class MyClass {
>>
>>     public static void main(String[] args) {
>>         // preparing args
>>         SparkSubmit.main(args);
>>     }
>> }
>>
>>
