spark-user mailing list archives

From satish saley <satishsale...@gmail.com>
Subject killing spark job which is submitted using SparkSubmit
Date Fri, 06 May 2016 18:50:39 GMT
Hello,

I am submitting a Spark job using SparkSubmit. When I kill my application,
the corresponding Spark job is not killed. How can I kill it? I know one
way is to invoke SparkSubmit again with the appropriate options. Is there
a way to tell SparkSubmit at submission time so that the job is killed
along with my application? Here is my code:

import org.apache.spark.deploy.SparkSubmit;

class MyClass {

    public static void main(String[] args) {
        // preparing args
        SparkSubmit.main(args);
    }
}
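For reference, one common workaround is to launch spark-submit as a separate child process instead of calling SparkSubmit.main in-process, and tie the child's lifetime to the parent JVM with a shutdown hook. (Spark 1.6+ also ships org.apache.spark.launcher.SparkLauncher, whose startApplication() returns a SparkAppHandle with stop() and kill() methods.) Below is a minimal sketch of the child-process pattern; the `sleep 60` command is just a stand-in for the real spark-submit invocation, and the class name is made up for illustration:

```java
import java.util.concurrent.TimeUnit;

public class ChildProcessKill {

    public static void main(String[] args) throws Exception {
        // Stand-in for the real spark-submit call, e.g.
        // new ProcessBuilder("spark-submit", "--class", "MyClass", "my.jar")
        Process child = new ProcessBuilder("sleep", "60").start();

        // Tie the child's lifetime to this JVM: when the application
        // exits normally or receives SIGTERM, the hook destroys the child.
        Runtime.getRuntime().addShutdownHook(new Thread(child::destroy));

        // Simulate the application ending early: kill the child explicitly.
        child.destroy();
        child.waitFor(5, TimeUnit.SECONDS);
        System.out.println("child alive: " + child.isAlive());
    }
}
```

Note that shutdown hooks do not run on `kill -9`, and in cluster deploy modes the driver runs remotely, so killing the local process alone may still leave the job running on the cluster.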
