spark-user mailing list archives

From Mike Sukmanowsky <>
Subject Spark Metrics Framework?
Date Mon, 21 Mar 2016 15:54:38 GMT
We make extensive use of the elasticsearch-hadoop library for Hadoop/Spark.
In trying to troubleshoot our Spark applications, it would be very handy to
have access to some of the many metrics that the library makes available
when running in MapReduce mode. The library's author noted
<> that
Spark doesn't offer a similar metrics API through which these metrics could
be reported or aggregated.

Are there any plans to bring a metrics framework similar to Hadoop's
Counter system to Spark, or is there an alternative means for us to grab
the metrics exposed when using the Hadoop APIs to load/save RDDs?
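For what it's worth, the closest analogue to Hadoop Counters that Spark does ship is its accumulator API (e.g. `sc.accumulator(...)` in PySpark): tasks increment a write-only counter and the driver reads the merged total after the job. The underlying pattern can be sketched in plain Python with no Spark dependency; the partition data and counter names below are purely illustrative:

```python
# Sketch of the counter-aggregation pattern behind Spark's accumulator API:
# each task increments local counters, and the driver merges per-task values.
# Pure Python, no Spark dependency; all names here are illustrative.
from collections import Counter
from functools import reduce

def process_partition(records):
    """Process one 'partition', returning results plus per-task counters."""
    counters = Counter()
    results = []
    for r in records:
        if r is None:
            counters["records.skipped"] += 1
            continue
        counters["records.written"] += 1
        results.append(r)
    return results, counters

# Simulate three partitions of input data.
partitions = [[1, None, 2], [3, 4], [None, None, 5]]
task_outputs = [process_partition(p) for p in partitions]

# Driver-side merge, analogous to reading accumulator.value after the job.
totals = reduce(lambda a, b: a + b, (c for _, c in task_outputs), Counter())
print(totals["records.written"])  # 5
print(totals["records.skipped"])  # 3
```

The catch, of course, is that this only covers counters you increment yourself; it doesn't surface metrics that a third-party library like elasticsearch-hadoop collects internally, which is exactly the gap the question is about.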
