spark-user mailing list archives

From yana <>
Subject Re: Difference between spark-defaults.conf and SparkConf.set
Date Wed, 01 Jul 2015 12:27:00 GMT
Thanks. Without spark-submit, it seems the more straightforward solution is to just pass it
on the driver's classpath. I was more surprised that the same conf parameter had different
behavior depending on where it's specified (program vs. spark-defaults). I'm all set now, thanks
for replying.
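
(For reference, a minimal sketch of that workaround; the JAR path and main class here are
illustrative, not from the original thread:

    # launch the driver JVM with the external JAR already on its classpath
    java -cp my-app.jar:/path/to/ceph-hadoop.jar com.example.Main

Putting the JAR on the classpath at launch sidesteps the timing issue: by the time
SparkConf.set() runs, the driver JVM and its classloader already exist.)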

-------- Original message --------
From: Akhil Das <>
Date: 07/01/2015 2:27 AM (GMT-05:00)
To: Yana Kadiyska <>
Cc:
Subject: Re: Difference between spark-defaults.conf and SparkConf.set

.addJar works for me when I run it as a stand-alone application (without using spark-submit).
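
(Roughly, with an illustrative JAR path:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("my-app").setMaster("local[*]"))
    sc.addJar("/path/to/external.jar")  // ships the JAR to executors for subsequent tasks

)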

Best Regards

On Tue, Jun 30, 2015 at 7:47 PM, Yana Kadiyska <> wrote:
Hi folks, running into a pretty strange issue:

I'm setting

    spark.driver.extraClassPath
    spark.executor.extraClassPath

to point to some external JARs. If I set them in spark-defaults.conf everything works perfectly.
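
(i.e., entries along these lines in conf/spark-defaults.conf; the JAR path is illustrative:

    spark.driver.extraClassPath     /path/to/ceph-hadoop.jar
    spark.executor.extraClassPath   /path/to/ceph-hadoop.jar

)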
However, if I remove spark-defaults.conf and just create a SparkConf and call

    .set("spark.driver.extraClassPath", ...)
    .set("spark.executor.extraClassPath", ...)

I get ClassNotFound exceptions from Hadoop Conf:

Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.ceph.CephFileSystem
not found
        at org.apache.hadoop.conf.Configuration.getClassByName(
        at org.apache.hadoop.conf.Configuration.getClass(

This seems like a bug to me -- or does spark-defaults.conf somehow get processed differently?

I have dumped out sparkConf.toDebugString, and in both cases (spark-defaults.conf vs. in-code
.set calls) it seems to contain the same values...
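
(For anyone wanting to check the same thing, roughly:

    // prints all explicitly-set Spark properties for this context
    println(sc.getConf.toDebugString)

)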
