Not sure why that is failing, but I found a workaround like:

#!/bin/bash -e

# Pass the JVM flags through the environment instead of --driver-java-options,
# so OPTS never has to carry embedded quotes.
export _JAVA_OPTIONS=-Xmx1g

OPTS+=" --class org.apache.spark.examples.SparkPi"

echo $SPARK_SUBMIT $OPTS lib/spark-examples-1.1.0-hadoop1.0.4.jar

exec $SPARK_SUBMIT $OPTS lib/spark-examples-1.1.0-hadoop1.0.4.jar
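A cleaner variant of the same idea, sketched with a bash array (the class, jar, and SPARK_SUBMIT names are the ones from this thread; the array technique itself is standard bash, nothing Spark-specific): each array element stays a single word no matter what it contains, so an option value with spaces survives intact and no eval is needed.

```shell
#!/bin/bash -e

# Build the arguments as an array instead of a flat string. Each element
# is one word, embedded spaces and all, so the quoting problem never arises.
ARGS=(--class org.apache.spark.examples.SparkPi)
ARGS+=(--driver-java-options "-Da=b -Dc=d")

# "${ARGS[@]}" expands to exactly one word per element.
printf '%s\n' "${ARGS[@]}"

# The real call would then be:
#   exec "$SPARK_SUBMIT" "${ARGS[@]}" lib/spark-examples-1.1.0-hadoop1.0.4.jar
```

The exec form at the end still keeps the same pid, so the run-as-a-service/kill requirement from the original mail is unaffected.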

Best Regards

On Sat, Nov 8, 2014 at 12:31 AM, Koert Kuipers <> wrote:
I need to run spark-submit inside a script with options that are built up programmatically. Oh, and I need to use exec to keep the same pid (so it can run as a service and be killed).

this is what i tried:
#!/bin/bash -e


OPTS="--class org.apache.spark.examples.SparkPi"
OPTS+=" --driver-java-options \"-Da=b -Dc=d\""

echo $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar

exec $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar

No luck. It seems to get confused by the quoted java options. I get:
Exception in thread "main" java.lang.NoClassDefFoundError: "-Da=b
Caused by: java.lang.ClassNotFoundException: "-Da=b
    at Method)
    at java.lang.ClassLoader.loadClass(
    at sun.misc.Launcher$AppClassLoader.loadClass(
    at java.lang.ClassLoader.loadClass(
Could not find the main class: "-Da=b.  Program will exit.

I also tried many other ways of escaping the quoted java options; none of them work.
Strangely, it does work if I replace the last line with (there is no science to this for me, I don't know much about bash, just trying random and probably bad things):
eval exec $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar

I am lost as to why... and there must be a better solution? It looks kind of nasty with the eval + exec.

best, koert
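A short explanation of the behavior above (standard bash semantics, not from the original mails): quote characters stored inside a variable's value are plain data, not shell syntax. An unquoted $OPTS is only word-split on whitespace and is never re-parsed, so spark-submit literally receives the token `"-Da=b` as a class name. eval forces that second parse, which is why the eval + exec line works. A tiny demo:

```shell
#!/bin/bash
# The quotes here are part of the variable's *value*, not shell syntax.
OPTS='--driver-java-options "-Da=b -Dc=d"'

# Unquoted expansion only splits on whitespace; the quote marks survive
# as literal characters, yielding the broken token "-Da=b that spark-submit saw.
set -- $OPTS
echo "without eval: $# words; word 2 is $2"
# prints: without eval: 3 words; word 2 is "-Da=b

# eval re-parses the expanded text, so the quotes group words as intended.
eval set -- "$OPTS"
echo "with eval: $# words; word 2 is $2"
# prints: with eval: 2 words; word 2 is -Da=b -Dc=d
```

This is also why the top reply's workaround helps: moving the JVM flags into _JAVA_OPTIONS means OPTS never needs embedded quotes in the first place.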