spark-dev mailing list archives

From "Thakrar, Jayesh" <jthak...@conversantmedia.com>
Subject Re: "Spark.jars not adding jars to classpath"
Date Thu, 22 Mar 2018 14:20:46 GMT
Is this in spark-shell or a spark-submit job?
If it is a spark-submit job, is it running in local or cluster mode?

One reliable way of adding jars is to use the command line option "--jars"
See http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management
for more info.
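
For example, a spark-submit invocation might look like the following (the class name,
jar names, and paths are placeholders, not from the original thread):

    spark-submit \
      --class com.example.MyApp \
      --master yarn \
      --deploy-mode cluster \
      --jars /path/to/custom-lib.jar \
      /path/to/my-app.jar

Jars passed via --jars are shipped with the application and placed on the driver and
executor classpaths before the job starts.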

If you add jars after the SparkContext is created, it is too late: the driver and executor
processes (distributed) or threads (local) have already been set up.
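
If you set the property programmatically instead, it has to happen before the
SparkSession/SparkContext is created. A minimal Scala sketch (the jar path is a placeholder):

    import org.apache.spark.sql.SparkSession

    // spark.jars must be set before the SparkContext is created,
    // otherwise the driver and executors are already running without the jar.
    val spark = SparkSession.builder()
      .appName("custom-jar-example")
      .config("spark.jars", "/path/to/custom-lib.jar")
      .getOrCreate()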


From: Ankit Agrahari <ankitagr2312@gmail.com>
Date: Tuesday, March 20, 2018 at 11:34 PM
To: <dev@spark.apache.org>
Subject: "Spark.jars not adding jars to classpath"

I am trying to add my custom jar to a Spark job using the "spark.jars" property.
The logs show the jar being added, but when I check the jars that are actually on the
classpath, I don't find it. Below are the settings I have tried:
1) spark.jars
2) spark.driver.extraLibraryPath
3) spark.executor.extraLibraryPath
4) setJars(Seq[<path>])

But none of them added the jar. I am using Spark 2.2.0 on HDP, and the files are kept locally.
Please let me know what I might be doing wrong.