Not a hack; this is documented here, and it is in fact the proper way of setting per-application Spark configurations.

Additionally, you can specify default Spark configurations so that you don't need to set them manually for every application. If you are running Spark 0.9 or earlier, you can set them through the environment variable SPARK_JAVA_OPTS in conf/
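On those older versions, the defaults were passed to the JVM as system properties. A sketch of what that looked like (the property names here are placeholders, matching the example format below):

```
# conf/spark-env.sh (Spark 0.9 and earlier; deprecated as of 1.0)
export SPARK_JAVA_OPTS="-Dspark.config.one=value1 -Dspark.config.two=value2"
```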

As of Spark 1.0, however, this mechanism is deprecated. The new way of setting default Spark configurations is through conf/spark-defaults.conf, in the following format:

spark.config.one value1
spark.config.two value2
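Concretely, a minimal conf/spark-defaults.conf might look like the following; spark.master and spark.executor.memory are standard Spark settings, and the values are illustrative only:

```
spark.master            spark://master:7077
spark.executor.memory   2g
spark.myapp.myproperty  propertyValue
```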

More details are documented here:

2014-05-16 15:16 GMT-07:00 Theodore Wong <>:
I found that the easiest way was to pass variables in the Spark configuration
object. The only catch is that all of your property keys must begin with
"spark." in order for Spark to propagate the values. So, for example:

SparkConf conf = new SparkConf();
// Note the "spark." prefix: keys without it are not propagated.
conf.set("spark.myapp.myproperty", "propertyValue");

JavaSparkContext context = new JavaSparkContext(conf);

I realize that this is most likely a hack, but it works and is easy (at
least for me) to follow from a programming standpoint compared to setting
environment variables outside of the program.
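The "spark." prefix rule described above can be illustrated with a toy sketch in plain Java. This is not Spark's actual implementation, just a model of the behavior: keys beginning with "spark." survive, anything else is silently dropped.

```java
import java.util.HashMap;
import java.util.Map;

public class PrefixDemo {
    // Toy model of the rule above: only configuration keys that
    // start with "spark." are kept; all other keys are dropped.
    // Illustration only, not Spark's real code.
    static Map<String, String> propagated(Map<String, String> set) {
        Map<String, String> out = new HashMap<>();
        for (Map.Entry<String, String> e : set.entrySet()) {
            if (e.getKey().startsWith("spark.")) {
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.myapp.myproperty", "propertyValue"); // kept
        conf.put("myapp.myproperty", "propertyValue");       // dropped
        System.out.println(propagated(conf).keySet());
    }
}
```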


Theodore Wong


Sent from the Apache Spark User List mailing list archive.