spark-user mailing list archives

From Jörn Franke <>
Subject Re: Spark does not load all classes in fat jar
Date Mon, 18 Mar 2019 13:08:28 GMT
Maybe that class is already loaded as part of a core library of Spark?

Do you have concrete class names?

If in doubt, create a fat jar and shade the dependencies in question.
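For reference, shading with sbt-assembly might look like the following minimal sketch. The package name `com.example.conflicting` is a placeholder for whichever dependency collides with Spark's own copy; relocating it means the classes bundled in your fat jar are the ones actually loaded at runtime:

```scala
// build.sbt fragment — assumes the sbt-assembly plugin is enabled.
// ShadeRule.rename relocates the named packages inside the fat jar,
// so they no longer clash with the versions shipped by Spark/YARN.
assembly / assemblyShadeRules := Seq(
  ShadeRule
    .rename("com.example.conflicting.**" -> "shaded.com.example.conflicting.@1")
    .inAll
)
```

With Maven, the equivalent is the `relocations` section of the maven-shade-plugin.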

> Am 18.03.2019 um 12:34 schrieb Federico D'Ambrosio <>:
> Hello everyone,
> We're having a serious issue: we get a ClassNotFoundException because, apparently,
> the class is not found within the classpath of Spark, in both the driver and the executors.
> First, I checked with jar tf whether the class was actually within the jar, and it
> actually is. Then, I enabled the following options to see which classes are actually loaded:
> --conf 'spark.driver.extraJavaOptions=-verbose:class' --conf 'spark.executor.extraJavaOptions=-verbose:class'

> and I can see from the YARN stdout logs that some classes, like the one throwing
> the exception, are not actually being loaded, while other classes are.
> I then tried using --jars to pass the jar containing the missing classes, and also
> using addJar() on the Spark context, to no avail.
> This looks like an issue with Spark class loader.
> Any idea about what's happening here? I'm using Spark (HDP 3.0).
> Thank you for your help,
> Federico
