spark-user mailing list archives

From Oleg Mazurov <>
Subject Re: Running spark with javaagent configuration
Date Wed, 15 May 2019 18:23:59 GMT
You can see how the Uber JVM profiler handles this:

--conf spark.jars=hdfs://hdfs_url/lib/jvm-profiler-1.0.0.jar
--conf spark.executor.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar
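A complete spark-submit invocation along these lines might look like the sketch below. The idea is that `spark.jars` distributes the profiler jar into each executor's working directory, so `-javaagent:` can reference it by bare file name. The master, deploy mode, class name, and application jar are illustrative placeholders, not from the original message:

```shell
# Sketch: ship the agent jar via spark.jars so it is localized into each
# executor's working directory, then point -javaagent at the bare file name.
# Master, class, and application jar below are assumed placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.jars=hdfs://hdfs_url/lib/jvm-profiler-1.0.0.jar \
  --conf spark.executor.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar \
  --class com.example.MyApp \
  my-app.jar
```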

    -- Oleg

On Wed, May 15, 2019 at 6:28 AM Anton Puzanov <> wrote:

> Hi everyone,
> I want to run my spark application with javaagent, specifically I want to
> use newrelic with my application.
> When I run spark-submit I must pass --conf
> "spark.driver.extraJavaOptions=-javaagent=<full path to newrelic jar>"
> My problem is that I can't specify the full path as I run in cluster mode
> and I don't know the exact host which will serve as the driver.
> *Important:* I know I can upload the jar to every node, but it seems like
> a fragile solution as machines will be added and removed later.
> I have tried specifying the jar with --files but couldn't make it work, as
> I didn't know where exactly I should point the javaagent
> Any suggestions on what is the best practice to handle this kind of
> problems? and what can I do?
> Thanks a lot,
> Anton
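The --files route Anton mentions can be made to work on the same principle: files passed via --files are localized into the working directory of each container (on YARN in cluster mode this includes the driver's container), so a relative `-javaagent:` path resolves there on whichever node runs the process. A hedged sketch, assuming a YARN cluster and a placeholder agent jar name:

```shell
# Sketch, assuming YARN cluster mode. --files copies newrelic.jar into each
# container's working directory, so the bare file name in -javaagent:
# resolves locally on whatever node hosts the driver or executor.
# The jar path, class, and application jar are assumed placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /local/path/newrelic.jar \
  --conf spark.driver.extraJavaOptions=-javaagent:newrelic.jar \
  --conf spark.executor.extraJavaOptions=-javaagent:newrelic.jar \
  --class com.example.MyApp \
  my-app.jar
```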
