spark-user mailing list archives

From Andreas Fritzler <>
Subject Programmatically create SparkContext on YARN
Date Mon, 17 Aug 2015 07:34:17 GMT
Hi all,

when running the Spark cluster in standalone mode, I am able to create the
Spark context from Java via the following code snippet:

SparkConf conf = new SparkConf()
    .setAppName("MySparkApp")
    .setMaster("spark://SPARK_MASTER:7077")
    .setJars(jars);
JavaSparkContext sc = new JavaSparkContext(conf);

As soon as I'm done with my processing, I can just close it via

sc.stop();

Now my question: is the same also possible when running Spark on YARN? I
currently don't see how this would be possible without submitting the
application as a packaged jar file. Is there a way to get this kind of
interactivity from within Scala/Java code?
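To make it concrete, what I'd ideally like is something along these lines; this is only a sketch of what I'm hoping for, assuming YARN could be addressed through the master setting (with HADOOP_CONF_DIR pointing at the cluster config) rather than a spark:// URL:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class YarnInteractiveSketch {
    public static void main(String[] args) {
        // Hypothetical: "yarn-client" as the master instead of spark://...,
        // relying on HADOOP_CONF_DIR / YARN_CONF_DIR in the environment to
        // locate the ResourceManager. I don't know whether this works without
        // going through spark-submit with a packaged jar.
        SparkConf conf = new SparkConf()
            .setAppName("MySparkApp")
            .setMaster("yarn-client");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // ... run jobs interactively against the YARN cluster ...

        sc.stop();
    }
}
```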
