spark-user mailing list archives

From "Federico D'Ambrosio" <>
Subject Re: Spark does not load all classes in fat jar
Date Mon, 18 Mar 2019 14:11:04 GMT
Hi Jörn, thank you for your response.

I'm sorry I didn't mention it in the previous mail: the class name is
'sherlogic_mon_interface', a class of our project, which is intended to be
instantiated dynamically (Class.forName("sherlogic_mon_interface")) within
an executor. So, it's not part of the Spark core library.
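One thing worth ruling out (a sketch of an assumed setup, not code from the thread): on Spark executors, a plain Class.forName(name) resolves against the classloader of the class making the call, which may not be the loader that knows about the application jar. Passing the thread context classloader explicitly is a common workaround; here java.util.ArrayList stands in for sherlogic_mon_interface so the sketch runs standalone.

```java
// Sketch: load a class via the thread context classloader instead of the
// caller's classloader. In a Spark executor, the context classloader is
// typically the one that sees jars shipped with the application, while a
// plain Class.forName(name) uses the loader of the calling class.
public class DynamicLoad {
    static Object instantiate(String className) throws Exception {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        // Third argument: the classloader to resolve against.
        Class<?> clazz = Class.forName(className, true, cl);
        return clazz.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        // "java.util.ArrayList" is a stand-in for the project class.
        Object o = instantiate("java.util.ArrayList");
        System.out.println(o.getClass().getName());
    }
}
```

If the class loads with the context classloader but not with the plain one-argument Class.forName, that would point at a classloader-visibility issue rather than a missing class file.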

To give a bit more context, we've got 3 modules; let's call them core, ext
and util. The missing classes are *some* of the classes defined in the two
modules ext and util (and that's really weird: not all of them are missing).

The weird thing is that we're specifically building the uberjar with all
those dependencies, just as you said, because we thought it was related
to incorrect resolution of the additional jars, so we explicitly added those
modules as dependencies (and checked again with jar tf that the classes are
present in the jar). And still, the same ClassNotFoundException.
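For reference, the shading suggested in the quoted reply would look roughly like this with maven-shade-plugin (a sketch assuming a Maven build; the plugin version, the package name com.example.ext, and the relocation pattern are all illustrative, not taken from the thread):

```xml
<!-- Sketch: relocate the ext/util packages inside the fat jar so they
     cannot collide with same-named classes already on Spark's classpath. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- hypothetical package name -->
            <pattern>com.example.ext</pattern>
            <shadedPattern>shaded.com.example.ext</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that relocation rewrites package names in the bytecode, so any Class.forName string literals referring to relocated packages would need updating to match.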

Feels like I'm missing something.
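For completeness, the "fat jar as the application, not as an additional jar package" advice from the quoted message amounts to submitting the shaded assembly itself, rather than passing it via --jars. Something like the following (the jar name, main class, and master are hypothetical):

```shell
# Sketch: the shaded assembly is the application jar itself,
# not an extra jar added with --jars or addJar().
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.Main \
  app-assembly-shaded.jar
```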


On Mon, 18 Mar 2019 at 14:09, Jörn Franke <>
wrote:

> Fat jar with shading as the application not as an additional jar package
> On 18.03.2019 at 14:08, Jörn Franke <> wrote:
> Maybe that class is already loaded as part of a core library of Spark?
> Do you have concrete class names?
> In doubt create a fat jar and shade the dependencies in question
> On 18.03.2019 at 12:34, Federico D'Ambrosio <> wrote:
> Hello everyone,
> We're having a serious issue, where we get ClassNotFoundException because,
> apparently, the class is not found on Spark's classpath, in both
> the driver and the executors.
> First, I checked whether the class was actually within the jar with jar tf,
> and it actually is. Then, I enabled the following options to see
> which classes are actually loaded:
> --conf 'spark.driver.extraJavaOptions=-verbose:class' --conf
> 'spark.executor.extraJavaOptions=-verbose:class'
> and I can see from the YARN stdout logs that some classes, such as the
> one throwing the exception, are not actually being loaded, while other
> classes are.
> I tried, then, using --jars to pass the jar containing the missing
> classes, and also using addJar() from the spark context, to no avail.
> This looks like an issue with the Spark class loader.
> Any idea about what's happening here? I'm using Spark
> (HDP 3.0).
> Thank you for your help,
> Federico

Federico D'Ambrosio
