spark-user mailing list archives

From rzykov <rzy...@gmail.com>
Subject SPARK 1.1.0 on yarn-cluster and external JARs
Date Thu, 25 Sep 2014 12:25:49 GMT
We build several Spark jobs that depend on external JARs. At the moment I compile
each job by bundling everything into a single assembly, but I am looking for an
approach that puts all of the external JARs into HDFS instead.

We have already put the Spark assembly JAR in an HDFS folder and set the
SPARK_JAR environment variable.
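
For reference, that setup looks roughly like this (the HDFS path and assembly
file name below are just examples, not necessarily what we use):

  # upload the Spark assembly to HDFS once, then point SPARK_JAR at it
  hdfs dfs -put spark-assembly-1.1.0-hadoop2.4.0.jar /user/spark/jars/
  export SPARK_JAR=hdfs:///user/spark/jars/spark-assembly-1.1.0-hadoop2.4.0.jar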
What is the best way to do the same for the other external JARs we depend on
(the MongoDB driver, Algebird, and so on)?
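
Something along these lines is what I have in mind, but I am not sure whether
--jars with hdfs:// URIs is the right mechanism in yarn-cluster mode (the class
name, JAR names, and paths are only placeholders):

  spark-submit --master yarn-cluster \
    --class com.example.MyJob \
    --jars hdfs:///user/spark/jars/mongo-java-driver.jar,hdfs:///user/spark/jars/algebird-core.jar \
    my-job.jar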

Thanks in advance





