spark-user mailing list archives

From olegshirokikh <>
Subject Submitting jobs on Spark EC2 cluster: class not found, even if it's on CLASSPATH
Date Sun, 01 Mar 2015 08:39:55 GMT
Hi there,

I'm trying out the Spark Job Server (REST) to submit jobs to a Spark cluster. I
believe my problem is unrelated to this specific piece of software and is rather
a generic issue with missing jars on the classpath. Every application implements
the SparkJob trait, e.g.:

object LongPiJob extends SparkJob {
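
(The fragment above is all this message shows. For reference, a minimal sketch of
what such an object typically looks like against the spark-jobserver SparkJob trait
of that era follows; the actual body of LongPiJob is not shown here, and the pi
computation is only an illustrative placeholder.)

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

object LongPiJob extends SparkJob {
  // Accept any configuration; a real job would validate its inputs here.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    SparkJobValid

  // Monte Carlo estimate of pi, purely to give the job a concrete body.
  override def runJob(sc: SparkContext, config: Config): Any = {
    val n = 100000
    val hits = sc.parallelize(1 to n).map { _ =>
      val (x, y) = (math.random, math.random)
      if (x * x + y * y <= 1.0) 1 else 0
    }.reduce(_ + _)
    4.0 * hits / n
  }
}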

The SparkJob trait is available through the jar file built by the Spark Job Server
Scala application. When I run all of this against a local Spark cluster, everything
works fine once I add the export line below:

export SPARK_CLASSPATH=$SPARK_HOME/job-server/spark-job-server.jar

However, when I do the same on a Spark cluster on EC2, I get this error:

java.lang.NoClassDefFoundError: spark/jobserver/SparkJob

I've added the path on the remote Spark master machine on Amazon:

export MASTER=`cat /root/spark-ec2/cluster-url`
export SPARK_CLASSPATH=/root/spark/job-server/spark-job-server.jar
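
(For comparison, the same jar path can also be supplied through the standard
Spark 1.x classpath properties when a context is constructed. This is only a
sketch of that mechanism; whether the job server lets you set these per context
is an assumption on my part.)

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: point the driver and executor classpaths at the job-server jar
// instead of relying on the SPARK_CLASSPATH environment variable.
val conf = new SparkConf()
  .setAppName("LongPiJob")
  .set("spark.driver.extraClassPath", "/root/spark/job-server/spark-job-server.jar")
  .set("spark.executor.extraClassPath", "/root/spark/job-server/spark-job-server.jar")
val sc = new SparkContext(conf)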


Also, when I run ./bin/ , I can see the required jar, the one that defines the
"missing" class, in the first place:

bin]$ ./
Spark assembly has been built with Hive, including Datanucleus jars on
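
(As a side note, one way I could imagine narrowing this down is to check, from
inside a running context, which JVMs can actually resolve the trait. This is a
hypothetical diagnostic, assuming sc is an existing SparkContext.)

// Hypothetical check: does the driver / do the executors see the trait
// that the NoClassDefFoundError complains about?
def canLoad(name: String): Boolean =
  try { Class.forName(name); true } catch { case _: Throwable => false }

println(s"driver sees SparkJob: ${canLoad("spark.jobserver.SparkJob")}")
val onExecutors = sc.parallelize(1 to sc.defaultParallelism)
  .map(_ => canLoad("spark.jobserver.SparkJob"))
  .collect()
println(s"executors see SparkJob: ${onExecutors.mkString(", ")}")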

What am I missing? I'd greatly appreciate your help.
