spark-user mailing list archives

From Akhil Das <>
Subject Re: NoClassDefFoundError
Date Mon, 08 Dec 2014 16:00:49 GMT
Hi Julius,

You can add those external jars to Spark when creating the SparkContext
(sc.addJar("/path/to/the/jar")). If you are submitting the job with
spark-submit, you can use the --jars option to have those jars shipped
with the application.
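A sketch of the spark-submit approach; the paths, jar names, and the main class below are placeholders, not taken from this thread:

```shell
# Ship extra dependency jars at submit time with --jars
# (comma-separated list); Spark distributes them to the executors.
# com.example.MyApp and all paths are hypothetical.
spark-submit \
  --class com.example.MyApp \
  --master "local[*]" \
  --jars /path/to/dep1.jar,/path/to/dep2.jar \
  /path/to/myapp.jar
```

Alternatively, building a single assembly ("uber") jar with the Maven shade plugin bundles the dependencies so nothing extra needs to be shipped.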

Best Regards

On Sun, Dec 7, 2014 at 11:05 PM, Julius K <> wrote:

> Hi everyone,
> I am new to Spark and encountered a problem.
> I want to use an external library in a Java project. Compiling
> works fine with Maven, but at runtime (locally) I get a
> NoClassDefFoundError.
> Do I have to put the jars somewhere, or tell spark where they are?
> I can send the pom.xml and my imports or source code, if this helps you.
> Best regards
> Julius Kolbe
