spark-user mailing list archives

From Sean Owen <>
Subject Re: Can't I mix non-Spark properties into a .properties file and pass it to spark-submit via --properties-file?
Date Mon, 16 Feb 2015 15:28:58 GMT
Since SparkConf is only for Spark properties, it will in general only
pay attention to and preserve "spark.*" properties (you could
experiment to confirm this). In general I wouldn't rely on Spark's
mechanisms for your own configuration; you can use any config
mechanism you like to retain your own properties.
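One way to follow this advice, sketched below under the assumption that the file is plain Java-properties syntax: load the same file yourself with `java.util.Properties`, which reads every key regardless of prefix, and keep `SparkConf` for the `spark.*` keys only. The file path and key names here are hypothetical, mirroring the question.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.Properties;

public class LoadProps {
    public static void main(String[] args) throws IOException {
        // Stand-in for the user's mixed .properties file
        // (Spark and non-Spark keys side by side).
        Path file = Files.createTempFile("mymodule", ".properties");
        Files.write(file, Arrays.asList(
                "spark.master=local[4]",
                "job.output.dir=file:///home/emre/data/mymodule/out"));

        // java.util.Properties loads every key, independently of SparkConf.
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(file)) {
            props.load(in);
        }

        // The non-Spark property is available here even though
        // SparkConf.get("job.output.dir") would throw NoSuchElementException.
        String outputDir = props.getProperty("job.output.dir");
        System.out.println(outputDir);
    }
}
```

You could still pass the same file to spark-submit via --properties-file so that the spark.* entries take effect; Spark simply ignores the rest.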

On Mon, Feb 16, 2015 at 3:26 PM, Emre Sevinc <> wrote:
> Hello,
> I'm using Spark 1.2.1 and have a .properties file that contains
> non-Spark properties as well as Spark properties, e.g.:
>    job.output.dir=file:///home/emre/data/mymodule/out
> I'm trying to pass it to spark-submit via:
>    spark-submit --class com.myModule --master local[4] --deploy-mode client
> --verbose --properties-file /home/emre/data/ mymodule.jar
> And I thought I could read the value of my non-Spark property, namely,
> job.output.dir by using:
>     SparkConf sparkConf = new SparkConf();
>     final String validatedJSONoutputDir = sparkConf.get("job.output.dir");
> But it gives me an exception:
>     Exception in thread "main" java.util.NoSuchElementException:
> job.output.dir
> Is it not possible to mix Spark and non-Spark properties in a single
> .properties file, then pass it via --properties-file and then get the values
> of those non-Spark properties via SparkConf?
> Or is there another object / method to retrieve the values for those
> non-Spark properties?
> --
> Emre Sevinç
