spark-user mailing list archives

From "Shao, Saisai" <>
Subject RE: Executor metrics in spark application
Date Tue, 22 Jul 2014 06:40:55 GMT
Hi Denes,

I think you can register your customized metrics source into the metrics system;
you can take metrics.properties.template as a reference.

Basically you can do as follows if you want to monitor an executor:
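A minimal sketch of what such a metrics.properties fragment might look like (the class name `com.example.AccumulatorSource` is a hypothetical placeholder; note that a source loaded this way is instantiated by reflection, so it needs a no-argument constructor):

```
# Register a custom metrics source on each executor instance
# (class name below is an assumption for illustration)
executor.source.accumulator.class=com.example.AccumulatorSource
```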


I think the code below can only register the metrics source on the driver (client) side.


BTW, it's not a good choice to register through MetricsSystem; it would be nicer to register
through configuration. You can also enable the console sink to verify whether the source is
registered or not.
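For verification, a console sink can be enabled in metrics.properties along these lines (a sketch based on the entries in metrics.properties.template; period/unit values are illustrative):

```
# Dump all registered metrics to stdout every 10 seconds
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds
```

If the custom source is registered correctly, its gauges should appear in the console output alongside the built-in JVM metrics.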


-----Original Message-----
From: Denes [] 
Sent: Tuesday, July 22, 2014 2:02 PM
Subject: Re: Executor metrics in spark application

I'm also pretty interested in how to create custom Sources in Spark. I'm using it with Ganglia,
and the normal metrics from the JVM source do show up. I tried to create my own metric based on
Issac's code, but it does not show up in Ganglia.
Does anyone know where the problem is?
Here's the code snippet:

import com.codahale.metrics.{Gauge, MetricRegistry}
import org.apache.spark.Accumulator
import org.apache.spark.metrics.source.Source

class AccumulatorSource(accumulator: Accumulator[Long], name: String) extends Source {
  val sourceName = "accumulator.metrics"
  val metricRegistry = new MetricRegistry()
  // Expose the accumulator's current value as a gauge
  metricRegistry.register(MetricRegistry.name("accumulator", name), new Gauge[Long] {
    override def getValue: Long = accumulator.value
  })
}

and then in the main:
val longAccumulator = sc.accumulator[Long](0)
val accumulatorMetrics = new AccumulatorSource(longAccumulator, "counters.accumulator")
SparkEnv.get.metricsSystem.registerSource(accumulatorMetrics)
