spark-user mailing list archives

From Mike Sukmanowsky <mike.sukmanow...@gmail.com>
Subject Spark Metrics Framework?
Date Mon, 21 Mar 2016 15:54:38 GMT
We make extensive use of the elasticsearch-hadoop library for Hadoop/Spark.
In trying to troubleshoot our Spark applications, it'd be very handy to
have access to some of the many metrics
<https://www.elastic.co/guide/en/elasticsearch/hadoop/current/metrics.html>
that the library makes available when running in MapReduce mode. The library's
author noted
<https://discuss.elastic.co/t/access-es-hadoop-stats-from-spark/44913> that
Spark doesn't offer a comparable metrics API through which these metrics
could be reported or aggregated.

Are there any plans to bring a metrics framework similar to Hadoop's
Counter system to Spark, or is there an alternative way for us to grab the
metrics exposed when using the Hadoop APIs to load/save RDDs?
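(For context, the closest analogue we're aware of is Spark's accumulators:
driver-visible counters that are similar in spirit to Hadoop Counters, though
as far as we can tell they aren't wired into the Hadoop InputFormat/OutputFormat
layer that elasticsearch-hadoop reports through. A minimal sketch of what we'd
like to be able to do — the app name, accumulator name, and paths here are
hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CounterSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("counter-sketch").setMaster("local[*]"))

    // Named accumulators show up per-stage in the Spark web UI,
    // which is the nearest built-in analogue to Hadoop Counters.
    val recordsRead = sc.accumulator(0L, "records.read")

    sc.textFile("hdfs:///some/input")      // hypothetical input path
      .map { line =>
        recordsRead += 1                   // incremented on executors
        line
      }
      .saveAsTextFile("hdfs:///some/output") // hypothetical output path

    // Accumulator values are only reliable on the driver after an action.
    println(s"records read: ${recordsRead.value}")
    sc.stop()
  }
}
```

This covers counters we increment ourselves, but not the counters a Hadoop
library updates internally — hence the question above.)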

Thanks,
Mike
