spark-user mailing list archives

From Sun Rui <sunrise_...@163.com>
Subject Re: Application not showing in Spark History
Date Tue, 02 Aug 2016 09:26:23 GMT
bin/spark-submit sets some environment variables, such as SPARK_HOME, that Spark later uses to
locate spark-defaults.conf, from which the default Spark settings are loaded.

I would guess that some configuration option, such as spark.eventLog.enabled in spark-defaults.conf,
is being skipped because you call the SparkSubmit class directly instead of going through bin/spark-submit.
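
For instance, the History Server relies on event logging being enabled through entries such as these
in spark-defaults.conf (the log directory below is only a placeholder, not something from your setup):

    spark.eventLog.enabled           true
    spark.eventLog.dir               hdfs:///spark-logs
    spark.history.fs.logDirectory    hdfs:///spark-logs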

The proper way to launch a Spark application from within Java is to use SparkLauncher. Remember
to call SparkLauncher.setSparkHome() to set the Spark home directory.
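
For example, a minimal sketch (the Spark home, jar path, and main class below are placeholders for
illustration, not values specific to your environment):

    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    public class LaunchExample {
      public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
            .setSparkHome("/opt/spark")              // placeholder: your Spark installation
            .setAppResource("/path/to/my-app.jar")   // placeholder: your application jar
            .setMainClass("com.example.MyApp")       // placeholder: your main class
            .setMaster("yarn")
            .setDeployMode("cluster")
            .setVerbose(true)
            .startApplication();

        // Block until the application reaches a terminal state (FINISHED, FAILED, KILLED).
        while (!handle.getState().isFinal()) {
          Thread.sleep(1000);
        }
      }
    }

Because SparkLauncher runs the regular bin/spark-submit machinery under the hood, spark-defaults.conf
(and therefore the event-log settings) is picked up the same way as when you submit from the shell.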

> On Aug 2, 2016, at 16:53, Rychnovsky, Dusan <Dusan.Rychnovsky@firma.seznam.cz> wrote:
> 
> Hi,
> 
> I am trying to launch my Spark application from within my Java application via the SparkSubmit class, like this:
> 
> 
> List<String> args = new ArrayList<>();
> 
> args.add("--verbose");
> args.add("--deploy-mode=cluster");
> args.add("--master=yarn");
> ...
> 
> SparkSubmit.main(args.toArray(new String[args.size()]));
> 
> 
> This works fine, with one catch - the application does not appear in Spark History after it's finished.
> 
> If, however, I run the application using `spark-submit.sh`, like this:
> 
> 
> spark-submit \
>   --verbose \
>   --deploy-mode=cluster \
>   --master=yarn \
>   ...
> 
> 
> the application appears in Spark History correctly.
> 
> What am I missing?
> 
> Also, is this a good way to launch a Spark application from within a Java application or is there a better way?
> 
> Thanks,
> Dusan

