spark-user mailing list archives

From Emre Sevinc <emre.sev...@gmail.com>
Subject Why doesn't the --conf parameter work in yarn-cluster mode (but works in yarn-client and local)?
Date Mon, 23 Mar 2015 12:39:08 GMT
Hello,

According to Spark Documentation at
https://spark.apache.org/docs/1.2.1/submitting-applications.html :

  --conf: Arbitrary Spark configuration property in key=value format. For
values that contain spaces wrap “key=value” in quotes (as shown).
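
Concretely, I pass it on the command line roughly like this (the class
name and jar file below are just placeholders, not my real ones):

    spark-submit \
      --class ConfTest \
      --master yarn-cluster \
      --conf "key=value" \
      my-app.jar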

And indeed, when I use that parameter, I can retrieve the value of the
key in my Spark program by using:

    System.getProperty("key");
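
A stripped-down sketch of what the driver does looks like this (the
ConfTest class name and the app name are made up for illustration):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ConfTest {
      public static void main(String[] args) {
        JavaSparkContext sc =
            new JavaSparkContext(new SparkConf().setAppName("ConfTest"));

        // The value passed via --conf "key=value" to spark-submit:
        String value = System.getProperty("key");
        System.out.println("key = " + value);

        sc.stop();
      }
    }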

This works when I test my program locally and also in yarn-client mode:
I can log the value of the key and see that it matches what I passed on
the command line. However, the very same program gets *null* back when
I submit it in *yarn-cluster* mode.
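
In other words, keeping the placeholder command above the same and
changing only the --master value:

    --master local[*]      =>  System.getProperty("key") returns "value"
    --master yarn-client   =>  System.getProperty("key") returns "value"
    --master yarn-cluster  =>  System.getProperty("key") returns null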

Why can't I retrieve the value of a key given as --conf "key=value" when
I submit my Spark application in *yarn-cluster* mode?

Any ideas and/or workarounds?


-- 
Emre Sevinç
http://www.bigindustries.be/
