spark-dev mailing list archives

From Reynold Xin <r...@databricks.com>
Subject Re: [Proposal] Enabling time series analysis on spark metrics
Date Tue, 01 Mar 2016 22:06:34 GMT
Is the suggestion just to use a different config (and maybe fallback to
appid) in order to publish metrics? Seems reasonable.


On Tue, Mar 1, 2016 at 8:17 AM, Karan Kumar <karankumar1100@gmail.com>
wrote:

> +dev mailing list
>
> Time series analysis on metrics becomes quite useful when running Spark
> jobs with a workflow manager such as Oozie.
>
> Would love to take this up if the community thinks it's worthwhile.
>
> On Tue, Feb 23, 2016 at 2:59 PM, Karan Kumar <karankumar1100@gmail.com>
> wrote:
>
>> Hi
>>
>> Spark at the moment uses the application ID to report metrics. I was
>> thinking we could add an option to export metrics under a
>> user-controlled key. That would let us do time series analysis on
>> counters by dumping them into a DB such as Graphite.
>>
>> One approach I had in mind is to let the user set a property via the
>> Spark client. If that property is set, use its value to report metrics;
>> otherwise, fall back to the current implementation
>> <https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala>
>> of reporting metrics under the app ID.
>>
>> Thoughts?
>>
>> --
>> Thanks
>> Karan
>>
>
>
>
> --
> Thanks
> Karan
>
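The fallback proposed above can be sketched in plain Scala. The property
name `spark.metrics.namespace` is an illustrative assumption, not
necessarily the config key Spark would adopt:

```scala
// Minimal sketch of the proposed behavior: report metrics under a
// user-set key if present, else under the per-run application ID.
// ("spark.metrics.namespace" is an assumed name for illustration.)
def metricsPrefix(conf: Map[String, String]): String =
  conf.getOrElse("spark.metrics.namespace", conf("spark.app.id"))

// Without a user-set key, each run reports under a fresh app ID:
//   metricsPrefix(Map("spark.app.id" -> "application_1456850_0001"))
//   → "application_1456850_0001"
// With a stable, user-chosen key, consecutive runs of the same
// scheduled job share one Graphite path, enabling time series:
//   metricsPrefix(Map(
//     "spark.app.id" -> "application_1456850_0002",
//     "spark.metrics.namespace" -> "daily-etl"))
//   → "daily-etl"
```

This is why appid-keyed metrics break time series analysis: every
submission gets a new application ID, so each run lands under a
different Graphite prefix.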
