spark-user mailing list archives

From redocpot <>
Subject Environment Variables question
Date Fri, 01 Aug 2014 09:02:39 GMT

According to the configuration guide

"Certain Spark settings can be configured through environment variables,
which are read from the conf/ script in the directory where
Spark is installed (or conf/spark-env.cmd on Windows). In Standalone and
Mesos modes, this file can give machine specific information such as
hostnames.* It is also sourced when running local Spark applications or
submission scripts.*"
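
For illustration, the change I make in conf/spark-env.sh looks roughly like
this (the path is only an example):

    # conf/spark-env.sh -- sourced by the Spark daemons when they start
    export SPARK_LOCAL_DIR=/mnt/disk1/spark-scratch   # scratch space for shuffle/spill files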

I am not sure about the last sentence, because when I set SPARK_LOCAL_DIR to
some specific path, I have to restart the cluster; otherwise the newly
modified env vars (like SPARK_LOCAL_DIR) are not used. The same goes for the
submission scripts and for setting SparkConf in the driver program.
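
Concretely, to pick up a new value I have to bounce the standalone daemons,
something along these lines (assuming the standard sbin scripts under
$SPARK_HOME):

    # restart master and workers so that conf/spark-env.sh is re-sourced
    $SPARK_HOME/sbin/stop-all.sh
    $SPARK_HOME/sbin/start-all.sh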

It seems that spark-env.sh is sourced only when the cluster is started,
which means the env vars are captured by the Spark daemon processes (and
reused for every future Spark application), rather than being read per
application. We cannot use a different SPARK_LOCAL_DIR for different Spark
applications running on the same daemon process, since SPARK_LOCAL_DIR keeps
the value that was loaded at boot time of the Spark daemon.
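
To be concrete about the per-application case: what I try from the driver is
roughly the following (a minimal sketch in Scala, the app name and path are
just examples), and the workers still seem to use the directory that was set
when the daemons started, not this value:

    import org.apache.spark.{SparkConf, SparkContext}

    // Attempt to choose a per-application scratch directory from the driver.
    val conf = new SparkConf()
      .setAppName("local-dir-test")                      // example app name
      .set("spark.local.dir", "/mnt/disk2/app-scratch")  // example path
    val sc = new SparkContext(conf)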

Could someone confirm this? I am willing to give more details if needed.

Thank you.

