spark-dev mailing list archives

From vonnagy <>
Subject Submit job with driver options in Mesos Cluster mode
Date Thu, 06 Oct 2016 16:21:09 GMT
I am trying to submit a job to Spark running in a Mesos cluster. We need to
pass custom Java options to the driver and executor for configuration, but
the driver task never includes the options. Here is an example submission:

GC_OPTS="-verbose:gc -XX:+PrintGCTimeStamps -Xloggc:$appdir/gc.out \
         -XX:+CMSClassUnloadingEnabled"

EXEC_PARAMS="-Dredis.master=${REDIS_MASTER} -Dredis.port=${REDIS_PORT}"

spark-submit \ 
  --name client-events-intake \ 
  --class ClientEventsApp \ 
  --deploy-mode cluster \ 
  --driver-java-options "${EXEC_PARAMS} ${GC_OPTS}" \ 
  --conf "spark.ui.killEnabled=true" \ 
  --conf "spark.mesos.coarse=true" \ 
  --conf "spark.driver.extraJavaOptions=${EXEC_PARAMS}" \ 
  --conf "spark.executor.extraJavaOptions=${EXEC_PARAMS}" \ 
  --master mesos://someip:7077 \ 
  --verbose \
  ../some.jar
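
For reference, here is a self-contained sketch of the variable setup, with placeholder defaults (the `appdir`, `REDIS_MASTER`, and `REDIS_PORT` values are illustrative assumptions, not our real deployment values). The point is the quoting: an unquoted option string would be word-split before spark-submit ever sees it.

```shell
#!/bin/sh
# Placeholder defaults so the sketch runs standalone; real values come
# from the deployment environment.
appdir="${appdir:-/tmp}"
REDIS_MASTER="${REDIS_MASTER:-localhost}"
REDIS_PORT="${REDIS_PORT:-6379}"

# Quote each whole string so the options survive as a single value.
GC_OPTS="-verbose:gc -XX:+PrintGCTimeStamps -Xloggc:$appdir/gc.out -XX:+CMSClassUnloadingEnabled"
EXEC_PARAMS="-Dredis.master=${REDIS_MASTER} -Dredis.port=${REDIS_PORT}"

# Show the combined options exactly as they would be passed along.
printf '%s\n' "$EXEC_PARAMS $GC_OPTS"
```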

When the driver task runs in Mesos it is creating the following command: 

sh -c 'cd spark-1*;  bin/spark-submit --name client-events-intake --class
ClientEventsApp --master mesos://someip:5050 --driver-cores 1.0
--driver-memory 512M ../some.jar ' 

There are no options for the driver here, so the driver app blows up
because it can't find the Java options. However, the environment variables
do contain the executor options: 

SPARK_EXECUTOR_OPTS -> -Dspark.executor.extraJavaOptions=-Dloglevel=DEBUG
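
For illustration, unwrapping that value with plain shell shows it only carries the nested conf, not the GC or redis options (the variable value below is copied from the output above):

```shell
#!/bin/sh
# Value observed in the Mesos task environment.
SPARK_EXECUTOR_OPTS='-Dspark.executor.extraJavaOptions=-Dloglevel=DEBUG'

# Strip the wrapping property name to recover the inner JVM flag.
inner="${SPARK_EXECUTOR_OPTS#-Dspark.executor.extraJavaOptions=}"
printf '%s\n' "$inner"
```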

Any help would be great. I know that we can set "spark.*" settings in the
default configs, but these options are not necessarily Spark-related. This is
not an issue when running the same logic outside of a Mesos cluster, in Spark
standalone mode. 
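
One workaround I may try (sketch only, untested against Mesos cluster mode; the file path and default values are made up): put the options into a Spark properties file and pass it with spark-submit's `--properties-file` flag, so the dispatcher reads them as Spark confs rather than relying on `--driver-java-options`.

```shell
#!/bin/sh
# Illustrative defaults; real values come from the deployment environment.
REDIS_MASTER="${REDIS_MASTER:-localhost}"
REDIS_PORT="${REDIS_PORT:-6379}"
conf_file=/tmp/client-events-intake.conf

# Write the driver/executor options as Spark confs.
cat > "$conf_file" <<EOF
spark.driver.extraJavaOptions=-Dredis.master=${REDIS_MASTER} -Dredis.port=${REDIS_PORT}
spark.executor.extraJavaOptions=-Dredis.master=${REDIS_MASTER} -Dredis.port=${REDIS_PORT}
EOF

cat "$conf_file"
# Then submit with:
#   spark-submit --properties-file "$conf_file" ...
```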

