spark-user mailing list archives

From Utkarsh Sengar <utkarsh2...@gmail.com>
Subject Re: Exclude slf4j-log4j12 from the classpath via spark-submit
Date Tue, 25 Aug 2015 17:48:18 GMT
This worked for me locally:
spark-1.4.1-bin-hadoop2.4/bin/spark-submit --conf
spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
--conf
spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
--verbose --class runner.SparkRunner target/simspark-0.1-SNAPSHOT.jar


Now I am going to try it out on our mesos cluster.
I assumed "spark.executor.extraClassPath" takes a comma-separated list of
jars the way "--jars" does, but it should be ":"-separated like a regular
classpath.
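[Editorial note: the two separator conventions differ as sketched below; the
jar paths are illustrative, not taken from the thread.]

```shell
# --jars takes a comma-separated list:
#   spark-submit --jars /opt/libs/a.jar,/opt/libs/b.jar ...
# extraClassPath takes a colon-separated classpath, like `java -cp`:
#   --conf spark.executor.extraClassPath=/opt/libs/a.jar:/opt/libs/b.jar

# Converting a comma list to a classpath with standard tools:
CSV="/opt/libs/a.jar,/opt/libs/b.jar"
CP="$(printf '%s' "$CSV" | tr ',' ':')"
echo "$CP"   # -> /opt/libs/a.jar:/opt/libs/b.jar
```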

Thanks for your help!
-Utkarsh


On Mon, Aug 24, 2015 at 5:05 PM, Utkarsh Sengar <utkarsh2012@gmail.com>
wrote:

> I get the same error even when I set the SPARK_CLASSPATH: export
> SPARK_CLASSPATH=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
> And I run the job like this: /spark-1.4.1-bin-hadoop2.4/bin/spark-submit
> --class runner.SparkRunner
> target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar
>
> I am not able to find the code in Spark which adds these jars before the
> Spark classes in the classpath. Or maybe it's a bug. Any suggestions on
> workarounds?
>
> Thanks,
> -Utkarsh
>
>
> On Mon, Aug 24, 2015 at 4:32 PM, Utkarsh Sengar <utkarsh2012@gmail.com>
> wrote:
>
>> I assumed that's the case because of the error I got and the
>> documentation which says: "Extra classpath entries to append to the
>> classpath of the driver."
>>
>> This is where I stand now:
>>         <dependency>
>>             <groupId>org.apache.spark</groupId>
>>             <artifactId>spark-core_2.10</artifactId>
>>             <version>1.4.1</version>
>>             <exclusions>
>>                 <exclusion>
>>                     <groupId>org.slf4j</groupId>
>>                     <artifactId>slf4j-log4j12</artifactId>
>>                 </exclusion>
>>             </exclusions>
>>         </dependency>
>>
>> And no exclusions from my logging lib.
>>
>> And I submit this task: spark-1.4.1-bin-hadoop2.4/bin/spark-submit
>> --class runner.SparkRunner --conf
>> "spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar"
>> --conf
>> "spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar"
>> --conf
>> "spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar"
>> --conf
>> "spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar"
>> target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar
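[Editorial note: the command above passes `--conf spark.driver.extraClassPath=...`
twice. spark-submit keeps a single value per conf key, so the later setting
replaces the earlier one — which is why the working command elsewhere in the
thread joins both jars with ":" in one conf. A minimal sketch of that
last-one-wins behavior, with illustrative paths and a loop that only simulates,
not reproduces, Spark's parsing:]

```shell
# Simulated conf parsing: a key -> value store where a repeated key is
# overwritten by the value seen last (paths illustrative).
confs="spark.driver.extraClassPath=/opt/libs/logback-classic-1.1.2.jar
spark.driver.extraClassPath=/opt/libs/logback-core-1.1.2.jar"

driver_cp=""
while IFS='=' read -r key val; do
  # keep only the most recent value for the key
  [ "$key" = "spark.driver.extraClassPath" ] && driver_cp="$val"
done <<EOF
$confs
EOF

echo "$driver_cp"   # -> /opt/libs/logback-core-1.1.2.jar
# Only logback-core survives; the logback-classic entry was silently dropped.
```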
>>
>> And I get the same error:
>> Caused by: java.lang.ClassCastException:
>> org.slf4j.impl.Log4jLoggerFactory cannot be cast to
>> ch.qos.logback.classic.LoggerContext
>>     at
>> com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
>>     at
>> com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
>>     at com.opentable.logging.Log.<clinit>(Log.java:31)
>>     ... 16 more
>>
>>
>> Thanks,
>> -Utkarsh
>>
>> On Mon, Aug 24, 2015 at 4:11 PM, Marcelo Vanzin <vanzin@cloudera.com>
>> wrote:
>>
>>> On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar <utkarsh2012@gmail.com>
>>> wrote:
>>> > That didn't work since the "extraClassPath" flag was still appending
>>> > the jars at the end, so it's still picking the slf4j jar provided by
>>> > Spark.
>>>
>>> Out of curiosity, how did you verify this? The "extraClassPath"
>>> options are supposed to prepend entries to the classpath, and the code
>>> seems to be doing that. If it's not really doing that in some case,
>>> it's a bug that needs to be fixed.
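[Editorial note: one hedged way to check the actual ordering, in the style of
the thread's own spark-submit invocations, is to enable JVM class-loading
tracing in the driver. `-verbose:class` and `spark.driver.extraJavaOptions`
are standard JVM/Spark options, but the paths below are illustrative and the
log line format varies by JVM; this is a diagnostic sketch, not a verified
recipe.]

```shell
# -verbose:class makes the JVM log "[Loaded <class> from <jar>]" lines,
# which reveal exactly which jar supplied the slf4j binding.
spark-1.4.1-bin-hadoop2.4/bin/spark-submit \
  --conf "spark.driver.extraClassPath=/opt/libs/logback-classic-1.1.2.jar:/opt/libs/logback-core-1.1.2.jar" \
  --conf "spark.driver.extraJavaOptions=-verbose:class" \
  --class runner.SparkRunner target/simspark-0.1-SNAPSHOT.jar 2>&1 \
  | grep 'org.slf4j.impl.StaticLoggerBinder'
# If the binding loads from the spark-assembly jar rather than
# logback-classic, the extra entries were not searched first.
```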
>>>
>>> Another option is setting the "SPARK_CLASSPATH" env variable,
>>> which is deprecated, but might come in handy in case there is actually
>>> a bug in handling those options.
>>>
>>>
>>> --
>>> Marcelo
>>>
>>
>>
>>
>> --
>> Thanks,
>> -Utkarsh
>>
>
>
>
> --
> Thanks,
> -Utkarsh
>



-- 
Thanks,
-Utkarsh
