spark-user mailing list archives

From "Haryani, Akshay" <>
Subject Re: Get application metric from Spark job
Date Thu, 02 Sep 2021 18:12:23 GMT
Hi Aurélien,

Spark has endpoints that expose the application metrics of a running job; they can be consumed
as a REST API. You can read more about these here:
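As a sketch of what polling those endpoints looks like, the snippet below queries the driver's monitoring REST API while a job runs. The host and port are assumptions (the driver UI defaults to port 4040; adjust to your deployment), and the application id placeholder is hypothetical:

```scala
import scala.io.Source

// Poll the driver's monitoring endpoint while the application is running.
// Default driver UI address assumed; replace host/port as needed.
val base = "http://localhost:4040/api/v1"

// List running applications (JSON array; each entry carries an "id" field).
val apps = Source.fromURL(s"$base/applications").mkString
println(apps)

// With an application id in hand, per-stage metrics are available too, e.g.:
//   Source.fromURL(s"$base/applications/<app-id>/stages").mkString
```

The same endpoints are what the Spark UI itself is built on, so anything visible in the UI (stages, tasks, executors, storage) can be scraped this way at whatever interval you need.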

If you want to build your own custom metrics, you can explore Spark custom plugins. Using
a custom plugin, you can track your own custom metrics and plug them into the Spark metrics
system. Please note that plugins are only supported on Spark 3.0 and above.
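A minimal sketch of such a plugin is below, assuming Spark 3.x on the classpath (`spark-core` as a provided dependency). It registers a Dropwizard counter with the executor's metric registry via the `SparkPlugin` / `PluginContext` API; the class and metric names are illustrative:

```scala
package com.example.monitoring

import java.util.{Map => JMap}

import com.codahale.metrics.Counter
import org.apache.spark.api.plugin.{DriverPlugin, ExecutorPlugin, PluginContext, SparkPlugin}

object RecordCounter {
  // Shared with task code running in the same executor JVM.
  @volatile var processed: Counter = _
}

class CustomMetricsPlugin extends SparkPlugin {
  // No driver-side component in this sketch; returning null is allowed.
  override def driverPlugin(): DriverPlugin = null

  override def executorPlugin(): ExecutorPlugin = new ExecutorPlugin {
    override def init(ctx: PluginContext, extraConf: JMap[String, String]): Unit = {
      // Register a counter with Spark's metrics system; it is then reported
      // through whatever sinks are configured (metrics.properties), and also
      // shows up under the /metrics endpoints.
      RecordCounter.processed = ctx.metricRegistry().counter("recordsProcessed")
    }
  }
}
```

Enable it with `--conf spark.plugins=com.example.monitoring.CustomMetricsPlugin` at submit time, and increment it from task code with `RecordCounter.processed.inc()`. Since the counter lives in the Spark metrics system, it can be retrieved regularly while the job runs, which matches the use case you describe.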

Thanks & Regards,
Akshay Haryani

From: Aurélien Mazoyer <>
Date: Thursday, September 2, 2021 at 8:36 AM
To: <>
Subject: Get application metric from Spark job
Hi community,

I would like to collect information about the execution of a Spark job while it is running.
Could I define some kind of application metrics (such as a counter that would be incremented
in my code) that I could retrieve regularly while the job is running?

Thank you for your help,

