spark-user mailing list archives

From Hemant Bhanawat <hemant9...@gmail.com>
Subject Re: How to set environment of worker applications
Date Mon, 24 Aug 2015 07:30:59 GMT
That's surprising. Passing values as JVM system properties with
spark.executor.extraJavaOptions=-Dmyenvvar=xxx and then reading them on the
executor with System.getProperty("myenvvar") has worked for me.
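A minimal sketch of that round trip, using the illustrative property name myenvvar from the example above: a `-D` flag on the executor's command line becomes a JVM system property, readable anywhere in that JVM. The demo below simulates the flag with `System.setProperty` since there is no Spark cluster here.

```java
// Sketch: a -D flag passed via spark.executor.extraJavaOptions becomes a
// JVM system property on the executor, readable with System.getProperty.
// "myenvvar" is the illustrative name from the example above.
public class SystemPropertyDemo {
    public static void main(String[] args) {
        // Simulate what `-Dmyenvvar=xxx` on the executor command line does:
        System.setProperty("myenvvar", "xxx");

        // Inside a task, code running in the executor JVM can read it back:
        String value = System.getProperty("myenvvar");
        System.out.println(value); // prints "xxx"

        // Note: this is a JVM system property, not a UNIX environment
        // variable -- System.getenv("myenvvar") is unaffected by -D flags.
        System.out.println(System.getenv("myenvvar"));
    }
}
```

On the submitting side, the property would typically be supplied with something like `spark-submit --conf "spark.executor.extraJavaOptions=-Dmyenvvar=xxx" ...` (application jar and arguments elided).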

What error did you get?

On Mon, Aug 24, 2015 at 12:10 AM, Sathish Kumaran Vairavelu <
vsathishkumaran@gmail.com> wrote:

> spark-env.sh works for me in Spark 1.4 but not
> spark.executor.extraJavaOptions.
>
> On Sun, Aug 23, 2015 at 11:27 AM Raghavendra Pandey <
> raghavendra.pandey@gmail.com> wrote:
>
>> I think the only way to pass environment variables to the worker nodes is
>> to set them in the spark-env.sh file on each worker node.
>>
>> On Sun, Aug 23, 2015 at 8:16 PM, Hemant Bhanawat <hemant9379@gmail.com>
>> wrote:
>>
>>> Check spark.driver.extraJavaOptions and
>>> spark.executor.extraJavaOptions in the following article. I think you can
>>> use -D to set system properties:
>>>
>>> spark.apache.org/docs/latest/configuration.html#runtime-environment
>>>
>>> Hi,
>>>
>>> I am starting a spark streaming job in standalone mode with spark-submit.
>>>
>>> Is there a way to make the UNIX environment variables with which
>>> spark-submit is started available to the processes started on the worker
>>> nodes?
>>>
>>> Jan
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: user-help@spark.apache.org
>>>
>>>
>>

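For the spark-env.sh route mentioned in the thread: spark-env.sh is a plain shell script that Spark sources on each node before starting its daemons, so exported variables become real UNIX environment variables (visible via System.getenv) in the processes launched there. A sketch, where MYENVVAR and its value are illustrative names, not something from the thread:

```shell
# conf/spark-env.sh -- placed on each worker node and sourced when the
# worker starts. Exported variables end up in the environment of the
# executor processes that the worker launches.
export MYENVVAR=xxx
```

The drawback, as noted above, is that the file must be maintained on every worker node, whereas extraJavaOptions travels with the submitted job.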