spark-user mailing list archives

From yana <yana.kadiy...@gmail.com>
Subject Re: Difference between spark-defaults.conf and SparkConf.set
Date Wed, 01 Jul 2015 12:27:00 GMT
Thanks. Without spark-submit, it seems the more straightforward solution is to just pass it
on the driver's classpath. I was more surprised that the same conf parameter had different
behavior depending on where it's specified: program vs. spark-defaults. I'm all set now, thanks
for replying.

-------- Original message --------
From: Akhil Das <akhil@sigmoidanalytics.com>
Date: 07/01/2015 2:27 AM (GMT-05:00)
To: Yana Kadiyska <yana.kadiyska@gmail.com>
Cc: user@spark.apache.org
Subject: Re: Difference between spark-defaults.conf and SparkConf.set

.addJar works for me when I run it as a stand-alone application (without using
spark-submit)

Thanks
Best Regards

On Tue, Jun 30, 2015 at 7:47 PM, Yana Kadiyska <yana.kadiyska@gmail.com> wrote:
Hi folks, running into a pretty strange issue:

I'm setting
spark.executor.extraClassPath 
spark.driver.extraClassPath

to point to some external JARs. If I set them in spark-defaults.conf everything works perfectly.
However, if I remove spark-defaults.conf and just create a SparkConf and call
.set("spark.executor.extraClassPath", "...")
.set("spark.driver.extraClassPath", "...")
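For comparison, the spark-defaults.conf form of the same two settings looks like this (the jar path is a placeholder):

```
spark.executor.extraClassPath  /path/to/external.jar
spark.driver.extraClassPath    /path/to/external.jar
```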

I get ClassNotFound exceptions from Hadoop Conf:

Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.ceph.CephFileSystem
not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1493)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1585)

This seems like a bug to me -- or does spark-defaults.conf somehow get processed differently?

I have dumped out sparkConf.toDebugString, and in both cases (spark-defaults.conf vs.
in-code .set calls) it appears to contain the same values...
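The Spark configuration docs suggest why the two paths behave differently: spark.driver.extraClassPath only takes effect if it is known before the driver JVM launches, i.e. via spark-defaults.conf or the --driver-class-path flag. Setting it through SparkConf inside the application is too late for the driver, because that JVM is already running; executors, which launch afterwards, do pick it up. A hedged sketch of the spark-submit equivalent (jar path and class name are placeholders):

```shell
# --driver-class-path is the command-line counterpart of
# spark.driver.extraClassPath; it is applied before the driver JVM starts.
spark-submit \
  --driver-class-path /path/to/external.jar \
  --conf spark.executor.extraClassPath=/path/to/external.jar \
  --class com.example.Main \
  app.jar
```

For a stand-alone driver launched without spark-submit, the same reasoning implies the jar must go on the classpath of the java command that starts the application.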
