spark-user mailing list archives

From santhoma <santhosh.tho...@yahoo.com>
Subject question about setting SPARK_CLASSPATH IN spark_env.sh
Date Wed, 18 Jun 2014 04:26:11 GMT
Hi, 

This is about Spark 0.9.
I have a 3-node Spark cluster. I want to add a locally available jar file
(present on all nodes) to the SPARK_CLASSPATH variable in
/etc/spark/conf/spark-env.sh so that all nodes can access it.

The question is: should I edit 'spark-env.sh' on all nodes to add the jar,
or is it enough to add it only on the master node from which I am
submitting jobs?
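
For reference, the kind of line being described would be a one-line edit to spark-env.sh; the jar path below is a placeholder, not taken from the original message:

```shell
# Appended to /etc/spark/conf/spark-env.sh (path from the question above).
# /opt/libs/myapp.jar is a hypothetical example of the locally available jar:
export SPARK_CLASSPATH="$SPARK_CLASSPATH:/opt/libs/myapp.jar"
```

Since spark-env.sh is sourced by the Spark daemons on each machine where it lives, the crux of the question is whether this file is read cluster-wide or only on the submitting node.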

thanks
Santhosh



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/question-about-setting-SPARK-CLASSPATH-IN-spark-env-sh-tp7809.html
