spark-user mailing list archives

From Rodrick Brown <>
Subject Re: Submit job with driver options in Mesos Cluster mode
Date Fri, 28 Oct 2016 05:27:14 GMT
Try setting the values in $SPARK_HOME/conf/spark-defaults.conf:


$ egrep 'spark.(driver|executor).extra' /data/orchard/spark-2.0.1/conf/spark-defaults.conf
spark.executor.extraJavaOptions    	-Duser.timezone=UTC -Xloggc:garbage-collector.log
spark.driver.extraJavaOptions 	   	-Duser.timezone=UTC -Xloggc:garbage-collector.log
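A minimal sketch of the same idea (the SPARK_HOME path is an assumption, adjust to your install): in Mesos cluster mode the driver is launched by the MesosClusterDispatcher rather than by your local spark-submit, so, as I understand it, the spark-defaults.conf that matters is the one visible to the dispatcher host.

```shell
# Sketch: append the driver/executor JVM options to spark-defaults.conf
# on the host running the MesosClusterDispatcher.
# SPARK_HOME default and mkdir are for illustration only.
SPARK_HOME="${SPARK_HOME:-/tmp/spark-demo}"
mkdir -p "$SPARK_HOME/conf"

cat >> "$SPARK_HOME/conf/spark-defaults.conf" <<'EOF'
spark.driver.extraJavaOptions      -Duser.timezone=UTC -Xloggc:garbage-collector.log
spark.executor.extraJavaOptions    -Duser.timezone=UTC -Xloggc:garbage-collector.log
EOF
```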

Rodrick Brown / DevOPs Engineer 
+1 917 445 6839 / <>
Orchard Platform 
101 5th Avenue, 4th Floor, New York, NY 10003 <>
Orchard Blog <> | Marketplace Lending Meetup <>
> On Oct 6, 2016, at 12:20 PM, vonnagy <> wrote:
> I am trying to submit a job to Spark running on a Mesos cluster. We need to
> pass custom Java options to the driver and executor for configuration, but
> the driver task never includes the options. Here is an example submission:
> GC_OPTS="-XX:+UseConcMarkSweepGC 
>         -verbose:gc -XX:+PrintGCTimeStamps -Xloggc:$appdir/gc.out 
>         -XX:MaxPermSize=512m 
>         -XX:+CMSClassUnloadingEnabled " 
> EXEC_PARAMS="-Dredis.master=${REDIS_MASTER} -Dredis.port=${REDIS_PORT}" 
> spark-submit \ 
>  --name client-events-intake \ 
>  --class ClientEventsApp \ 
>  --deploy-mode cluster \ 
>  --driver-java-options "${EXEC_PARAMS} ${GC_OPTS}" \ 
>  --conf "spark.ui.killEnabled=true" \ 
>  --conf "spark.mesos.coarse=true" \ 
>  --conf "spark.driver.extraJavaOptions=${EXEC_PARAMS}" \ 
>  --conf "spark.executor.extraJavaOptions=${EXEC_PARAMS}" \ 
>  --master mesos://someip:7077 \ 
>  --verbose \ 
>  some.jar 
> When the driver task runs in Mesos it is creating the following command: 
> sh -c 'cd spark-1*;  bin/spark-submit --name client-events-intake --class
> ClientEventsApp --master mesos://someip:5050 --driver-cores 1.0
> --driver-memory 512M ../some.jar ' 
> There are no options for the driver here, thus the driver app blows up
> because it can't find the java options. However, the environment variables
> contain the executor options: 
> SPARK_EXECUTOR_OPTS -> -Dspark.executor.extraJavaOptions=-Dloglevel=DEBUG
> ... 
> Any help would be great. I know that we can set some "spark.*" settings in
> the default configs, but these options are not necessarily Spark-related. This
> is not an issue when running the same logic outside of a Mesos cluster in
> Spark standalone mode. 
> Thanks! 

