spark-user mailing list archives

From MEETHU MATHEW <meethu2...@yahoo.co.in>
Subject Use of SPARK_DAEMON_JAVA_OPTS
Date Wed, 23 Jul 2014 08:04:10 GMT


 Hi all,

Sorry for raising this topic again, but I am still confused about it.

I set SPARK_DAEMON_JAVA_OPTS="-XX:+UseCompressedOops -Xmx8g"

When I run my application, I see the following line in the logs:

Spark Command: java -cp ::/usr/local/spark-1.0.1/conf:/usr/local/spark-1.0.1/assembly/target/scala-2.10/spark-assembly-1.0.1-hadoop1.2.1.jar
-XX:MaxPermSize=128m -XX:+UseCompressedOops -Xmx8g -Dspark.akka.logLifecycleEvents=true -Xms512m
-Xmx512m org.apache.spark.deploy.worker.Worker spark://master:7077


-Xmx is set twice: once from SPARK_DAEMON_JAVA_OPTS, and a second time from bin/spark-class (via SPARK_DAEMON_MEMORY or DEFAULT_MEM).

I believe the second value is the one that takes effect at execution, i.e. the one passed as SPARK_DAEMON_MEMORY or DEFAULT_MEM.

So I would like to know what the purpose of SPARK_DAEMON_JAVA_OPTS is, and how it differs from SPARK_DAEMON_MEMORY.
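For context, my current guess (which may be wrong, hence this question) is that the daemon heap size should come from SPARK_DAEMON_MEMORY, with SPARK_DAEMON_JAVA_OPTS reserved for the remaining non-memory flags, roughly like this in conf/spark-env.sh:

```shell
# conf/spark-env.sh -- a sketch of my guessed split, assuming Spark 1.0.x.
# SPARK_DAEMON_MEMORY is what bin/spark-class turns into -Xmx for the
# master/worker daemons, so the heap size would go here:
SPARK_DAEMON_MEMORY=8g

# SPARK_DAEMON_JAVA_OPTS would then carry only the extra JVM flags,
# not -Xmx (which otherwise ends up set twice, as in my log above):
SPARK_DAEMON_JAVA_OPTS="-XX:+UseCompressedOops"
```

Is this the intended division between the two variables?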


Thanks & Regards, 
Meethu M