spark-user mailing list archives

From Nerea Ayestarán <nerea.a...@gmail.com>
Subject Using SparkLauncher in cluster mode, in a Mesos cluster
Date Thu, 27 Oct 2016 09:12:58 GMT
I am trying to launch an Apache Spark job from a Java class to an Apache
Mesos cluster in cluster deploy mode. I use a SparkLauncher configured as
follows:

Process sparkProcess = new SparkLauncher()
        .setAppResource("hdfs://auto-ha/path/to/jar/SparkPi.jar")
        .setMainClass("com.ik.SparkPi")
        .setMaster("mesos://dispatcher:7077")
        .setConf("spark.executor.uri", "hdfs://auto-ha/spark/spark-2.0.0-bin-hadoop2.7.tgz")
        .setSparkHome("/local/path/to/spark-2.0.0-bin-hadoop2.7")
        .setDeployMode("cluster")
        .setAppName("PI")
        .setVerbose(true)
        .launch();
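Note that `.launch()` returns a plain `java.lang.Process` whose stdout/stderr are not
consumed automatically, so the verbose spark-submit output (which would show the exact
command and error details) is easy to lose. A minimal sketch for draining a launched
process's output; the `drain` helper and the `echo` demo command are illustrative,
not part of the Spark launcher API, and with SparkLauncher you would pass `sparkProcess`
to it instead:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class DrainProcess {

    // Read a process's combined stdout/stderr to completion and return it,
    // so that verbose launcher logs become visible instead of being dropped.
    static String drain(Process p) throws Exception {
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Demo with a trivial command; for SparkLauncher, merge stderr into
        // stdout via SparkLauncher.redirectErrorStream(true) before launch()
        // and pass the returned sparkProcess here.
        Process p = new ProcessBuilder("echo", "hello")
                .redirectErrorStream(true)
                .start();
        System.out.print(drain(p));
    }
}
```

Draining the stream would at least surface the same diagnostics that spark-submit prints on the console, which should make the "Cannot load main class" failure easier to pin down.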


If I submit the same job with the same configuration using spark-submit, the
job works perfectly, but with SparkLauncher I get the following error:

Error: Cannot load main class from JAR
file:/var/lib/mesos/slaves/00a81353-d68c-4b7c-b050-d9dfb2a74646-S24/frameworks/52806e97-565b-43d0-90a1-979a61196cb8-0007/executors/driver-20161024163345-0095/runs/c904dc98-8365-4270-895c-374c59ff2b34/spark-2.0.0-bin-hadoop2.7/2


If I go to the Mesos task UI, I can see the Spark folder and the
SparkPi.jar.

What am I missing? If I don't specify the local Spark home, it doesn't work
either.

Thanks in advance.
