spark-user mailing list archives

From Jan Botorek <Jan.Boto...@infor.com>
Subject RE: Add jar files on classpath when submitting tasks to Spark
Date Tue, 01 Nov 2016 12:56:44 GMT
Thank you for the reply.
I am aware of the parameters used when submitting the tasks (--jars is working for us).

But isn't there any way to specify a jars location (directory) globally, i.e. in the
spark-defaults.conf?
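
A sketch of the kind of global setting being asked about, with hypothetical paths. Note that a bare directory on a JVM classpath only matches .class files; the /* wildcard is what makes the JVM pick up every jar inside the directory, and the executor entry is usually needed as well so tasks running on workers see the same classes:

```
# spark-defaults.conf -- paths are hypothetical examples.
# The trailing /* wildcard is required for the JVM to load the jars
# in the directory (a plain directory entry only matches .class files).
spark.driver.extraClassPath    /path/to/folder/with/jars/*
spark.executor.extraClassPath  /path/to/folder/with/jars/*
```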


From: ayan guha [mailto:guha.ayan@gmail.com]
Sent: Tuesday, November 1, 2016 1:49 PM
To: Jan Botorek <Jan.Botorek@infor.com>
Cc: user <user@spark.apache.org>
Subject: Re: Add jar files on classpath when submitting tasks to Spark


There are options to specify external jars, such as --jars and --driver-class-path, depending
on the Spark version and cluster manager. Please see the configuration section of the Spark
documentation and/or run spark-submit --help to see the available options.
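
One wrinkle with the --jars option mentioned above is that it takes a comma-separated list of jar files, not a directory. A hedged shell sketch (the directory path and application names are hypothetical) for expanding a folder of jars into that form:

```shell
# Expand a folder of jars (hypothetical path) into the comma-separated
# list that spark-submit's --jars option expects.
JARS_DIR=/path/to/folder/with/jars
# List every .jar in the folder and join the lines with commas.
JARS=$(ls "$JARS_DIR"/*.jar 2>/dev/null | paste -sd, -)
echo "$JARS"
# Then pass the list through to spark-submit, e.g.:
# spark-submit --class com.example.Main --jars "$JARS" app.jar
```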
On 1 Nov 2016 23:13, "Jan Botorek" <Jan.Botorek@infor.com> wrote:
Hello,
I have a problem trying to add jar files to be available on classpath when submitting task
to Spark.

In my spark-defaults.conf file I have the configuration:
spark.driver.extraClassPath = path/to/folder/with/jars
All jars in the folder are then available in spark-shell.

The problem is that the jars are not on the classpath for the Spark master; more precisely, when
I submit any job that uses a jar from the external folder, a java.lang.ClassNotFoundException
is thrown.
Moving all external jars into Spark's jars folder solves the problem, but we need to keep the
external files separate.

Thank you for any help
Best regards,
Jan