spark-user mailing list archives

From "Shao, Saisai" <>
Subject RE: Better way of measuring custom application metrics
Date Sun, 04 Jan 2015 02:46:12 GMT

I think there’s a StreamingSource in Spark Streaming which exposes the streaming job’s running
status to the metrics sinks; you can connect it to the Graphite sink to expose those metrics to Graphite.
I’m not sure if this is what you want.

Besides that, you can build your own custom Source and Sink for the MetricsSystem and configure
them by class name so that the metrics system loads them; for the details
you can refer to the documentation or the source code.
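
For example, enabling the built-in Graphite sink for all Spark instances only takes a few
lines in conf/metrics.properties (the host value below is a placeholder you would replace
with your own endpoint):

    # conf/metrics.properties
    *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
    *.sink.graphite.host=<your-graphite-host>
    *.sink.graphite.port=2003
    *.sink.graphite.period=10
    *.sink.graphite.unit=seconds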


From: Enno Shioji []
Sent: Sunday, January 4, 2015 7:47 AM
Subject: Better way of measuring custom application metrics

I have a hack to gather custom application metrics in a Streaming job, but I wanted to know
if there is any better way of doing this.

My hack consists of this singleton:

import java.lang.management.ManagementFactory
import java.net.{InetAddress, InetSocketAddress, UnknownHostException}
import java.util.concurrent.TimeUnit

import com.codahale.metrics.MetricRegistry
import com.codahale.metrics.graphite.{Graphite, GraphiteReporter}

object Metriker extends Serializable {
  @transient lazy val mr: MetricRegistry = {
    val metricRegistry = new MetricRegistry()
    // the Graphite host was elided in the archived message; 2003 is the usual plaintext port
    val graphiteEndpoint = new InetSocketAddress("<graphite-host>", 2003)
    GraphiteReporter.forRegistry(metricRegistry)
      .build(new Graphite(graphiteEndpoint))
      .start(5, TimeUnit.SECONDS)
    metricRegistry
  }

  @transient lazy val processId = ManagementFactory.getRuntimeMXBean.getName

  @transient lazy val hostId = {
    try {
      InetAddress.getLocalHost.getHostName
    } catch {
      case e: UnknownHostException => "localhost"
    }
  }

  def metricName(name: String): String = {
    "%s.%s.%s".format(name, hostId, processId)
  }
}
which I then use in my jobs like so:

        .map { i =>
          // the metric-recording call was elided in the archive; something like:
          Metriker.mr.meter(Metriker.metricName("mapped")).mark()
          i * 2
        }
Then I aggregate the metrics on Graphite. This works, but I was curious to know if anyone
has a less hacky way.
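
For what it’s worth, the reason the metricName scheme above plays well with Graphite-side
aggregation is that each JVM reports under its own <name>.<host>.<pid> leaf, and Graphite
wildcards (e.g. sumSeries(requests.*.*)) combine the leaves. A standalone sketch of that
naming convention (the host and pid values here are made up for illustration):

```scala
object MetricNaming {
  // Graphite-style dotted path: <name>.<host>.<pid>, one leaf per JVM
  def metricName(name: String, hostId: String, processId: String): String =
    "%s.%s.%s".format(name, hostId, processId)
}

val a = MetricNaming.metricName("requests", "host-a", "1234")
val b = MetricNaming.metricName("requests", "host-b", "5678")
// Both paths fall under the "requests.*.*" wildcard, so Graphite can sum them.
```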
