spark-user mailing list archives

From Andrew Or <>
Subject Re: Programmatically create SparkContext on YARN
Date Tue, 18 Aug 2015 20:50:20 GMT
Hi Andreas,

I believe the distinction is not between standalone and YARN mode, but
between client and cluster mode.

In client mode, your Spark submit JVM runs your driver code. In cluster
mode, one of the workers (or NodeManagers if you're using YARN) in the
cluster runs your driver code. In the latter case, it doesn't really make
sense to call `setMaster` in your driver because Spark needs to know which
cluster you're submitting the application to.

Instead, the recommended way is to set the master through the `--master`
flag in the command line, e.g.

$ bin/spark-submit \
    --master spark:// \
    --class some.user.Clazz \
    --name "My app name" \
    --jars lib1.jar,lib2.jar \
    --deploy-mode cluster

Both YARN and standalone modes support client and cluster modes, and the
spark-submit script is the common interface through which you can launch
your application. In other words, you shouldn't have to do anything more
than providing a different value to `--master` to use YARN.
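As a sketch of what that looks like (the jar path and class name below are placeholders, not from the original thread), the same submit command pointed at YARN could be:

$ bin/spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --class some.user.Clazz \
    --name "My app name" \
    --jars lib1.jar,lib2.jar \
    my-app.jar

Note that with YARN there is no host:port in the master URL; spark-submit locates the ResourceManager from the Hadoop configuration directory (HADOOP_CONF_DIR or YARN_CONF_DIR) in the environment.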


2015-08-17 0:34 GMT-07:00 Andreas Fritzler <>:

> Hi all,
> when running the Spark cluster in standalone mode I am able to create the
> Spark context from Java via the following code snippet:
>> SparkConf conf = new SparkConf()
>>    .setAppName("MySparkApp")
>>    .setMaster("spark://SPARK_MASTER:7077")
>>    .setJars(jars);
>> JavaSparkContext sc = new JavaSparkContext(conf);
> As soon as I'm done with my processing, I can just close it via
>> sc.stop();
> Now my question: Is the same also possible when running Spark on YARN? I
> currently don't see how this should be possible without submitting your
> application as a packaged jar file. Is there a way to get this kind of
> interactivity from within your Scala/Java code?
> Regards,
> Andrea
