spark-user mailing list archives

From Arjun kr <arjun...@outlook.com>
Subject Re: Spark user classpath setting
Date Thu, 14 Jun 2018 21:29:26 GMT
Thanks a lot, Marcelo!! That did the trick. :)

Regards,

Arjun

________________________________
From: Marcelo Vanzin <vanzin@cloudera.com>
Sent: Friday, June 15, 2018 2:07 AM
To: Arjun kr
Cc: user@spark.apache.org
Subject: Re: Spark user classpath setting

I only know of a way to do that with YARN.

You can distribute the jar files using "--files" and add just their
names (not the full path) to the "extraClassPath" configs. You don't
need "userClassPathFirst" in that case.

On Thu, Jun 14, 2018 at 1:28 PM, Arjun kr <arjun.kr@outlook.com> wrote:
> Hi All,
>
>
> I am trying to execute a sample Spark script (one that uses Spark JDBC) which
> has dependencies on a set of custom jars. These custom jars need to be added
> first on the classpath. Currently, I have copied the custom lib directory to
> all the nodes and am able to execute the script with the command below.
>
>
> bin/spark-shell  --conf spark.driver.extraClassPath=/custom-jars/* --conf
> "spark.driver.userClassPathFirst=true" --conf
> spark.executor.extraClassPath=/custom-jars/* --conf
> "spark.executor.userClassPathFirst=true" --master yarn -i
> /tmp/spark-test.scala
>
>
> Are there any options that do not require the jars to be copied to all the
> nodes (while still allowing them to be added to the classpath first)? The
> --jars and --archives options do not seem to work for me. Any suggestion
> would be appreciated!
>
>
> Thanks,
>
>
> Arjun
>
>
>



--
Marcelo
