spark-user mailing list archives

From Sasi <>
Subject Re: Set EXTRA_JAR environment variable for spark-jobserver
Date Fri, 09 Jan 2015 08:04:14 GMT

Yes, as you mentioned, we are creating a new SparkContext for our job. The
reason is that we define the Apache Cassandra connection through SparkConf.
We expect this approach to work as well.
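For reference, the context setup we mean looks roughly like this (a minimal sketch: the app name, host address, and the `spark.cassandra.connection.host` key assume the spark-cassandra-connector; none of these values are from our actual job):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: carry the Cassandra connection settings on the SparkConf,
// then build the SparkContext from it inside the job.
// "cassandra-job" and "127.0.0.1" are placeholders, not our real values.
val conf = new SparkConf()
  .setAppName("cassandra-job")
  .set("spark.cassandra.connection.host", "127.0.0.1")

val sc = new SparkContext(conf)
```

Note that spark-jobserver normally hands a shared SparkContext to jobs implementing its SparkJob trait; creating our own context as above is the part we are unsure about.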

For uploading the JAR, we followed these steps:
(1) Package the JAR using the *sbt package* command.
(2) Upload it using the *curl --data-binary
localhost:8090/jars/sparking* command,
as mentioned in the link.
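Spelled out, the two steps look roughly like this (a sketch following the spark-jobserver README convention; the jar path and the app name `sparking` are illustrative and depend on the project layout):

```shell
# Step 1: build the job jar (output path below is illustrative)
sbt package

# Step 2: upload it to the job server under the app name "sparking".
# The @ prefix is required so curl sends the file contents as the body.
curl --data-binary @target/scala-2.10/sparking_2.10-0.1.jar \
     localhost:8090/jars/sparking
```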

We had earlier written some samples connecting Apache Cassandra to Spark in
Scala. Initially, we faced the same
*java.lang.NoClassDefFoundError* exception at run time, and we overcame it
using the *--jars {required JAR paths}* option with *spark-submit*. In the
end we were able to run them successfully as regular Spark applications, so
we are confident in what we have written for spark-jobserver.
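The earlier fix looked roughly like this (a sketch: the main class and jar names are placeholders, not our actual artifacts):

```shell
# Pass the Cassandra connector jars explicitly so they are on the
# driver and executor classpaths; paths and class name are hypothetical.
spark-submit \
  --class com.example.CassandraApp \
  --jars /path/to/spark-cassandra-connector.jar,/path/to/cassandra-driver.jar \
  target/scala-2.10/sparking_2.10-0.1.jar
```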

To give you an update: we built an uber JAR (a single JAR bundling all
dependencies), as Pankaj suggested, and are now facing *SparkException: Job
aborted due to stage failure*, for which we will raise a separate post.
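For completeness, we produced the uber JAR with sbt-assembly; a typical setup looks like this (a sketch, with the plugin and Spark version numbers illustrative for the 1.x era):

```scala
// project/plugins.sbt -- add the sbt-assembly plugin (version illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt -- mark Spark itself as "provided" so the cluster's copy is
// used and Spark classes are not bundled into the uber JAR
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"
```

Then *sbt assembly* builds the single JAR, which we uploaded in place of the *sbt package* output.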

Thank you once again for your suggestions.
