spark-user mailing list archives

From Koert Kuipers <>
Subject spark-submit inside script... need some bash help
Date Fri, 07 Nov 2014 19:01:09 GMT
i need to run spark-submit inside a script with options that are built up
programmatically. oh, and i need to use exec so the script keeps the same pid
(so it can run as a service and be killed).

this is what i tried:
#!/bin/bash -e


OPTS="--class org.apache.spark.examples.SparkPi"
OPTS+=" --driver-java-options \"-Da=b -Dc=d\""

echo $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar

exec $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar

no luck. it seems to get confused by the quoted java options. i get:
Exception in thread "main" java.lang.NoClassDefFoundError: "-Da=b
Caused by: java.lang.ClassNotFoundException: "-Da=b
    at Method)
    at java.lang.ClassLoader.loadClass(
    at sun.misc.Launcher$AppClassLoader.loadClass(
    at java.lang.ClassLoader.loadClass(
Could not find the main class: "-Da=b.  Program will exit.

i also tried many other ways of escaping the quoted java options. none of
them worked.
strangely, it does work if i replace the last line with the following (there
is no science to this for me; i don't know much about bash, just trying
random and probably bad things):
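(for anyone reading along: a minimal sketch of what goes wrong. the shell
processes quotes *before* it expands variables, so the escaped quotes inside
$OPTS come out as literal `"` characters, and word splitting still breaks the
java options apart. the option name below is just illustrative:)

```shell
# quotes stored inside a variable are literal data, not shell quoting,
# so unquoted expansion still splits "-Da=b -Dc=d" into two words
OPTS="--driver-java-options \"-Da=b -Dc=d\""
printf '<%s>\n' $OPTS
# prints:
# <--driver-java-options>
# <"-Da=b>
# <-Dc=d">
```

spark-submit then sees `"-Da=b` as a positional argument, which is exactly
the class name in the NoClassDefFoundError above.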
eval exec $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar

i am lost as to why that works... and there must be a better solution? the
eval + exec combination looks kinda nasty.

best, koert
