spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Correct SparkLauncher usage
Date Mon, 07 Nov 2016 23:36:03 GMT
On Mon, Nov 7, 2016 at 3:29 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
> I have been trying to use SparkLauncher.startApplication() to launch a Spark app from
> within Java code, but have been unable to do so. However, the same piece of code works
> if I use SparkLauncher.launch().
>
> Here is the corresponding code snippet:
>
> SparkAppHandle handle = new SparkLauncher()
>         .setSparkHome("/Users/miqbal1/DISTRIBUTED_WORLD/UNPACKED/spark-1.6.1-bin-hadoop2.6")
>         .setJavaHome("/Library/Java/JavaVirtualMachines/jdk1.8.0_92.jdk/Contents/Home")
>         .setAppResource("/Users/miqbal1/wc.jar")
>         .setMainClass("org.myorg.WC")
>         .setMaster("local")
>         .setConf("spark.dynamicAllocation.enabled", "true")
>         .startApplication();
>
>     System.out.println(handle.getAppId());
>     System.out.println(handle.getState());
>
> This prints null and UNKNOWN as output.

The information you're printing is not available immediately after you
call "startApplication()". The Spark app is still starting, so it may
take some time for the app ID and other info to be reported back. The
"startApplication()" method lets you register listeners that are
notified when that information becomes available.
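
For example, a minimal sketch of the listener-based approach (the jar path
and main class are copied from your snippet; the CountDownLatch is just one
way to block until the app reaches a final state):

    import java.util.concurrent.CountDownLatch;

    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    public class LauncherExample {
      public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        SparkAppHandle handle = new SparkLauncher()
            .setAppResource("/Users/miqbal1/wc.jar")
            .setMainClass("org.myorg.WC")
            .setMaster("local")
            .startApplication(new SparkAppHandle.Listener() {
              @Override
              public void stateChanged(SparkAppHandle h) {
                // Called on state transitions (CONNECTED, RUNNING, FINISHED, ...).
                System.out.println("state: " + h.getState());
                if (h.getState().isFinal()) {
                  done.countDown();
                }
              }

              @Override
              public void infoChanged(SparkAppHandle h) {
                // Called when other info, such as the app ID, becomes available.
                System.out.println("app id: " + h.getAppId());
              }
            });

        // At this point getAppId() may still be null; rely on the callbacks instead.
        done.await();
      }
    }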

-- 
Marcelo
