spark-user mailing list archives

From Dong Lei <>
Subject RE: ClassNotDefException when using spark-submit with multiple jars and files located on HDFS
Date Tue, 09 Jun 2015 07:59:13 GMT
Thanks Akhil:

The driver fails too quickly for me to get a look at port 4040. Is there any other way to see the
download and ship process for the files?

Is the driver supposed to download these jars from HDFS to some location and then ship them to the executors?
I can see from the log that the driver downloaded the application jar, but not the other jars specified
by "--jars".

Or have I misunderstood the usage of "--jars", and the jars should already be present on every worker,
so the driver will not download them?
Is there any useful documentation on this?

Dong Lei

From: Akhil Das []
Sent: Tuesday, June 9, 2015 3:24 PM
To: Dong Lei
Subject: Re: ClassNotDefException when using spark-submit with multiple jars and files located

Once you submit the application, you can check the Environment tab in the driver UI (running on port 4040)
to see whether the jars you added were shipped. If they were shipped and you are still
getting NoClassDef exceptions, it means you have a jar conflict, which
you can resolve by putting the jar that contains the class at the top of your classpath.
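If it does turn out to be a jar conflict, one way to push a jar to the front of the classpath is via Spark's extraClassPath settings. A minimal sketch, assuming the conflicting class lives in 1.jar and that the jar is available at the same local path on the driver and on every worker (the paths and names here are placeholders, not from this thread):

```shell
# Hypothetical sketch: prepend 1.jar to the driver and executor
# classpaths so its classes win over any conflicting copies.
spark-submit \
  --class myClass \
  --master spark://localhost:7077 \
  --conf spark.driver.extraClassPath=/opt/jars/1.jar \
  --conf spark.executor.extraClassPath=/opt/jars/1.jar \
  main.jar
```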

Best Regards

On Tue, Jun 9, 2015 at 9:05 AM, Dong Lei <> wrote:
Hi, spark-users:

I'm using spark-submit to submit multiple jars and files (all on HDFS) to run a job, with
the following command:

  spark-submit \
    --class myClass \
    --master spark://localhost:7077 \
    --deploy-mode cluster \
    --jars hdfs://localhost/1.jar, hdfs://localhost/2.jar \
    --files hdfs://localhost/1.txt, hdfs://localhost/2.txt
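One detail worth checking in the command above: spark-submit expects the --jars and --files values as a single comma-separated argument with no spaces. With a space after the comma, the shell splits the list into separate arguments, so the second jar is no longer part of the --jars value at all. A small demonstration of the difference, using the same hypothetical paths:

```shell
# With a space after the comma the shell produces three arguments:
# "--jars", "hdfs://localhost/1.jar," and "hdfs://localhost/2.jar".
set -- --jars hdfs://localhost/1.jar, hdfs://localhost/2.jar
echo "with space: $# arguments"   # prints "with space: 3 arguments"

# Without the space the list stays one argument, as spark-submit expects:
set -- --jars hdfs://localhost/1.jar,hdfs://localhost/2.jar
echo "no space: $# arguments"     # prints "no space: 2 arguments"
```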

The stderr of the driver showed a java.lang.NoClassDefFoundError for a class in 1.jar.

I checked the log and saw that Spark added these jars:
     INFO SparkContext: Added JAR hdfs:// …1.jar
     INFO SparkContext: Added JAR hdfs:// …2.jar

In the driver's working folder, I only saw that main.jar was copied there; the other
jars and files were not present.

Could someone explain how I should pass the jars and files needed by the main jar to Spark?

If my class in main.jar refers to these files with a relative path, will Spark copy these files
into one folder?

BTW, my class works in client mode with all jars and files on the local filesystem.
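For reference, the client-mode setup described here might look something like the following (jar and file names mirror the cluster-mode command above; the local paths are hypothetical):

```shell
# Hypothetical client-mode equivalent: the driver runs locally,
# and --jars/--files point at files on the local filesystem.
spark-submit \
  --class myClass \
  --master spark://localhost:7077 \
  --deploy-mode client \
  --jars /local/path/1.jar,/local/path/2.jar \
  --files /local/path/1.txt,/local/path/2.txt \
  main.jar
```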

Dong Lei
