Hi Vladimir,

This is not currently supported, but users have asked for it in the past. I have filed an issue for it here: https://issues.apache.org/jira/browse/SPARK-3767 so we can track its progress.


2014-10-02 5:25 GMT-07:00 Vladimir Tretyakov <vladimir.tretyakov@sematext.com>:
Hi, here at Sematext we are almost done with Spark monitoring: http://www.sematext.com/spm/index.html

But we need one thing from Spark, something like what Storm provides: https://groups.google.com/forum/#!topic/storm-user/2fNCF341yqU

Something like a placeholder in the Java opts which Spark would fill in for each executor with its executorId (0, 1, 2, 3, ...).

For example, I would write in spark-defaults.conf:

spark.executor.extraJavaOptions -Dcom.sun.management.jmxremote -javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-%executorId:spark-executor:default

and would get in the executor processes:
-Dcom.sun.management.jmxremote -javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-0:spark-executor:default
-Dcom.sun.management.jmxremote -javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-1:spark-executor:default
-Dcom.sun.management.jmxremote -javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar=myValue-2:spark-executor:default
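To make it concrete, the expansion I have in mind could look roughly like this sketch (the %executorId placeholder and the expansion step are my proposal, not an existing Spark feature):

```java
// Hypothetical sketch of the proposed expansion: before launching each
// executor, Spark would replace a %executorId placeholder in
// spark.executor.extraJavaOptions with that executor's numeric id.
// Nothing here is a real Spark API; it only illustrates the idea.
public class ExecutorOptsExpander {

    // Substitute the (proposed) %executorId placeholder with the actual id.
    static String expand(String optsTemplate, int executorId) {
        return optsTemplate.replace("%executorId", Integer.toString(executorId));
    }

    public static void main(String[] args) {
        String opts = "-Dcom.sun.management.jmxremote "
                + "-javaagent:/opt/spm/spm-monitor/lib/spm-monitor-spark.jar"
                + "=myValue-%executorId:spark-executor:default";
        // Each executor process would receive its own expanded Java opts.
        for (int id = 0; id < 3; id++) {
            System.out.println(expand(opts, id));
        }
    }
}
```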

Can I do something like that in Spark for executors? If not, maybe it can be added in the future? It would be useful.

Thanks, best regards, Vladimir Tretyakov.