spark-user mailing list archives

From Nithin Asokan <anithi...@gmail.com>
Subject Re: How to set System environment variables in Spark
Date Tue, 29 Sep 2015 19:33:43 GMT
--conf is used to pass any Spark configuration that starts with "spark.*".
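
For example, something along these lines (just a sketch; the application class
and jar names are placeholders, the config keys are standard Spark settings):

spark-submit \
  --class com.example.MyApp \
  --conf spark.executor.memory=4g \
  --conf spark.eventLog.enabled=true \
  my-app.jar

Since "-Dcom.w1.p1.config.runOnEnv=dev" is not a "spark.*" key, passing it
through --conf (as in the original question) is presumably why it had no effect.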

You can also use "--driver-java-options" to pass any system properties you
would like to the driver program.
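
For the property from the original question, an invocation roughly like this
should work (again a sketch; class name and jar are placeholders):

spark-submit \
  --class com.example.MyApp \
  --driver-java-options "-Dcom.w1.p1.config.runOnEnv=dev" \
  my-app.jar

The driver can then read it as a JVM system property, e.g.
sys.props("com.w1.p1.config.runOnEnv") in Scala or
System.getProperty("com.w1.p1.config.runOnEnv") in Java. Note that this sets a
system property on the driver JVM, not an OS environment variable.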

On Tue, Sep 29, 2015 at 2:30 PM swetha <swethakasireddy@gmail.com> wrote:

>
> Hi,
>
> How do I set system environment variables when submitting a job? Suppose I
> have the environment variable shown below. I have been trying to specify
> --- -Dcom.w1.p1.config.runOnEnv=dev and --conf
> -Dcom.w1.p1.config.runOnEnv=dev, but it does not seem to be working. How do
> I set an environment variable when submitting a job in Spark?
>
>
> -Dcom.w1.p1.config.runOnEnv=dev
>
> Thanks,
> Swetha
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-System-environment-variables-in-Spark-tp24875.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
