spark-user mailing list archives

From Steve Loughran <ste...@hortonworks.com>
Subject Re: Get spark metrics in code
Date Sun, 11 Sep 2016 18:23:57 GMT

> On 9 Sep 2016, at 13:20, Han JU <ju.han.felix@gmail.com> wrote:
> 
> Hi,
> 
> I'd like to know if there's a possibility to get spark's metrics from code. For example
> 
>   val sc = new SparkContext(conf)
>   val result = myJob(sc, ...)
>   result.save(...)
>   
>   val gauge = MetricSystem.getGauge("org.apache.spark....")
>   println(gauge.getValue)  // or send it to an internal aggregation service
> 
> I'm aware that there's a configuration for sending metrics to several kinds of sinks,
> but I'm more interested in getting them on a per-job basis, since we use a custom
> log/metric aggregation service for building dashboards.
> 

It's all Coda Hale (Dropwizard) metrics; it should be retrievable somehow, for a loose definition of "somehow"

I'd be interested in knowing what you come up with here. 
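One caveat: Spark's own MetricsSystem class is private[spark], so it can't be queried directly from application code the way the snippet above suggests. A sketch of one workaround, under the assumption that per-task input/runtime totals are what you want to forward to your aggregation service, is to register a SparkListener and accumulate task metrics yourself (the object name and the counters here are made up for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}
import java.util.concurrent.atomic.LongAdder

// Hypothetical sketch: accumulate per-job task metrics via a SparkListener,
// then read the totals once the job has finished.
object JobMetricsSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("metrics-sketch").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Thread-safe counters, since listener callbacks run on the event thread.
    val recordsRead = new LongAdder
    val runTimeMs   = new LongAdder

    sc.addSparkListener(new SparkListener {
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val m = taskEnd.taskMetrics
        if (m != null) {
          recordsRead.add(m.inputMetrics.recordsRead)
          runTimeMs.add(m.executorRunTime)
        }
      }
    })

    val result = sc.parallelize(1 to 1000).map(_ * 2).sum()

    sc.stop()  // stop() drains pending listener events before returning

    // In place of println, ship these to the internal aggregation service.
    println(s"sum=$result recordsRead=${recordsRead.sum} runTimeMs=${runTimeMs.sum}")
  }
}
```

The per-task granularity is coarser than reading the Dropwizard gauges directly, but it stays within public API and gives you numbers at the exact point in the driver program where the quoted snippet wanted them.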


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

