spark-user mailing list archives

From Vladimir Tretyakov <vladimir.tretya...@sematext.com>
Subject Is there a way to provide individual property to each Spark executor?
Date Thu, 02 Oct 2014 12:25:14 GMT
Hi, here at Sematext we are almost done with Spark monitoring:
http://www.sematext.com/spm/index.html

But we need one thing from Spark, something like this Storm feature:
https://groups.google.com/forum/#!topic/storm-user/2fNCF341yqU

Something like a 'placeholder' in the Java opts which Spark would fill in
for each executor with its executorId (0, 1, 2, 3, ...).

For example, I would write in spark-defaults.conf:

spark.executor.extraJavaOptions -Dcom.sun.management.jmxremote
-javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-
*%executorId*:spark-executor:default

and would get in the executor processes:
-Dcom.sun.management.jmxremote
-javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-*0*
:spark-executor:default
-Dcom.sun.management.jmxremote
-javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-*1*
:spark-executor:default
-Dcom.sun.management.jmxremote
-javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-*2*
:spark-executor:default
...
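The substitution being requested above could be sketched as a simple string replacement performed once per executor before the JVM is launched. The class and method names below are illustrative, not part of Spark; the `%executorId` placeholder is the hypothetical syntax proposed in this message.

```java
// Sketch of the requested behavior: expand a hypothetical %executorId
// placeholder in spark.executor.extraJavaOptions with the actual executor id.
public class ExecutorOptsExpander {

    // Replace every occurrence of the placeholder with the executor's id.
    public static String expand(String template, int executorId) {
        return template.replace("%executorId", Integer.toString(executorId));
    }

    public static void main(String[] args) {
        String opts = "-javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar"
                + "=myValue-%executorId:spark-executor:default";
        // Each executor (0, 1, 2, ...) would see its own expanded options.
        for (int id = 0; id < 3; id++) {
            System.out.println(expand(opts, id));
        }
    }
}
```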



Can I do something like that in Spark for executors? If not, maybe it can be
added in the future? It would be useful.

Thx, best regards, Vladimir Tretyakov.
