spark-user mailing list archives

From Aurélien Mazoyer <aurel...@aepsilon.com>
Subject Re: Get application metric from Spark job
Date Mon, 06 Sep 2021 12:46:40 GMT
Hi Akshay,

Thank you for your reply. That sounds like a good idea, but unfortunately I
have a 2.6 cluster. Do you know if there is another solution that would work
on 2.6, or do I have no choice but to migrate to 3?

Regards,

Aurélien

On Thu, Sep 2, 2021 at 8:12 PM Haryani, Akshay <akshay.haryani@hpe.com>
wrote:

> Hi Aurélien,
>
>
>
> Spark has endpoints to expose the Spark application metrics. These
> endpoints can be used as a REST API. You can read more about them here:
> https://spark.apache.org/docs/3.1.1/monitoring.html#rest-api
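As a rough illustration of the REST API approach described above, here is a small Python sketch that polls the monitoring endpoints of a running application. The base URL and port 4040 are assumptions (the driver's UI port in a default setup), and the field names (`stageId`, `numActiveTasks`, `numCompleteTasks`) are taken from the `/stages` payload documented in the monitoring guide; adjust for your deployment.

```python
import json
from urllib.request import urlopen

# Assumption: the Spark UI of a running application serves the
# REST API on port 4040 of the driver host.
BASE_URL = "http://localhost:4040/api/v1"

def fetch_json(path):
    """GET one monitoring endpoint and decode its JSON payload."""
    with urlopen(BASE_URL + path) as resp:
        return json.load(resp)

def summarize_stages(stages):
    """Keep a few progress counters per stage from a /stages payload."""
    return [
        {
            "stageId": s["stageId"],
            "numActiveTasks": s["numActiveTasks"],
            "numCompleteTasks": s["numCompleteTasks"],
        }
        for s in stages
    ]

def poll_active_stages(app_id):
    """One polling step: fetch the active stages and summarize them."""
    return summarize_stages(
        fetch_json(f"/applications/{app_id}/stages?status=active")
    )
```

Calling `poll_active_stages` on a timer while the job runs gives a simple progress feed without touching the job's code.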
>
>
>
> Additionally,
>
> If you want to build your own custom metrics, you can explore Spark custom
> plugins. Using a custom plugin, you can track your own custom metrics and
> plug them into the Spark metrics system. Please note that plugins are
> supported on Spark 3.0 and above.
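For context on how such a plugin is wired in, a configuration sketch follows. The class name `com.example.MyMetricsPlugin` is hypothetical; it stands for a user-written class implementing `org.apache.spark.api.plugin.SparkPlugin` that registers its counters with the metrics registry provided by the plugin context.

```properties
# spark-defaults.conf (Spark 3.x): load a custom plugin class that
# registers counters with the Spark metrics system.
# com.example.MyMetricsPlugin is a hypothetical placeholder.
spark.plugins  com.example.MyMetricsPlugin

# Optionally expose metrics via the built-in Prometheus servlet
# (available since Spark 3.0).
spark.ui.prometheus.enabled  true
```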
>
>
>
>
>
> --
>
> Thanks & Regards,
>
> Akshay Haryani
>
>
>
> *From: *Aurélien Mazoyer <aurelien@aepsilon.com>
> *Date: *Thursday, September 2, 2021 at 8:36 AM
> *To: *user@spark.apache.org <user@spark.apache.org>
> *Subject: *Get application metric from Spark job
>
> Hi community,
>
>
>
> I would like to collect information about the execution of a Spark job
> while it is running. Could I define some kind of application metrics (such
> as a counter that would be incremented in my code) that I could retrieve
> regularly while the job is running?
>
>
> Thank you for your help,
>
>
>
> Aurelien
>
