spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Is spark-env.sh sourced by Application Master and Executor for Spark on YARN?
Date Thu, 04 Jan 2018 17:16:40 GMT
On Wed, Jan 3, 2018 at 8:18 PM, John Zhuge <john.zhuge@gmail.com> wrote:
> Something like:
>
> Note: When running Spark on YARN, environment variables for the executors
> need to be set using the spark.yarn.executorEnv.[EnvironmentVariableName]
> property in your conf/spark-defaults.conf file or on the command line.
> Environment variables that are set in spark-env.sh will not be reflected in
> the executor process.
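A minimal illustration of the property the proposed note refers to (the variable name FOO and its value are hypothetical, chosen only for the example):

```properties
# conf/spark-defaults.conf — FOO is a hypothetical variable name
spark.yarn.executorEnv.FOO bar
```

The same setting can also be passed on the command line via `--conf spark.yarn.executorEnv.FOO=bar` when submitting the application.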

I'm not against adding docs, but that's probably true for all
backends. No backend I know of sources spark-env.sh before starting
executors.

For example, the standalone worker sources spark-env.sh before
starting the daemon, and those env variables "leak" to the executors.
But you can't customize an individual executor's environment that way
without restarting the service.
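For per-application executor environment variables that don't require editing spark-env.sh or restarting the worker, Spark's documented `spark.executorEnv.[EnvironmentVariableName]` property is the usual route (FOO is again a hypothetical variable name):

```properties
# conf/spark-defaults.conf — set FOO in each executor's environment
spark.executorEnv.FOO bar
```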

-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

