There are options to specify external jars, such as --jars and --driver-class-path, whose availability depends on the Spark version and cluster manager. Please see the configuration section of the Spark documentation, or run spark-submit --help, to see the available options.
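As a sketch, a submission that distributes extra jars to both the driver and the executors might look like this (the class name, master URL, and paths are placeholders — adjust them to your setup):

```shell
# --jars takes a comma-separated list of jar paths; the listed jars are
# shipped to the driver and every executor, so classes in them are
# resolvable at runtime without copying anything into Spark's jars folder.
spark-submit \
  --class com.example.MyJob \
  --master spark://master-host:7077 \
  --jars /path/to/external/lib1.jar,/path/to/external/lib2.jar \
  my-application.jar
```

Note that --jars does not expand directory globs; each jar must be listed explicitly (or the list built in the shell, e.g. with a command substitution).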
I have a problem trying to make jar files available on the classpath when submitting a job to Spark.
In my spark-defaults.conf file I have this configuration:
spark.driver.extraClassPath = path/to/folder/with/jars
All jars in that folder are available in spark-shell.
The problem is that the jars are not on the classpath for the Spark master; more precisely, when I submit any job that uses a jar from the external folder, it fails with a java.lang. exception.
Moving all the external jars into Spark's jars folder solves the problem, but we need to keep the external files separate.
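One way to keep the jars in their own folder, sketched below under the assumption that the folder exists at the same path on every node: extraClassPath values are plain JVM classpath strings, so a /* wildcard picks up every jar in the directory, and setting the executor-side property as well covers the worker JVMs, not just the driver.

```
# spark-defaults.conf -- hypothetical folder path; entries are
# whitespace-separated (property name, then value).
spark.driver.extraClassPath    /path/to/folder/with/jars/*
spark.executor.extraClassPath  /path/to/folder/with/jars/*
```

These properties only prepend entries to the JVM classpath; they do not copy files, so the folder must already be present on each machine in the cluster.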
Thank you for any help.