spark-user mailing list archives

From Rares Vernica <rvern...@gmail.com>
Subject Set Job Descriptions for Scala application
Date Wed, 05 Aug 2015 19:29:59 GMT
Hello,

My Spark application is written in Scala and submitted to a Spark cluster
in standalone mode. The Spark Jobs for my application are listed in the
Spark UI like this:

Job Id     Description ...
6          saveAsTextFile at Foo.scala:202
5          saveAsTextFile at Foo.scala:201
4          count at Foo.scala:188
3          collect at Foo.scala:182
2          count at Foo.scala:162
1          count at Foo.scala:152
0          collect at Foo.scala:142


Is it possible to assign Job Descriptions to all these jobs in my Scala
code?

Thanks!
Rares
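
[Archive note] For reference, Spark's `SparkContext` exposes `setJobDescription` (and `setJobGroup` for grouping several jobs under one label), which replaces the default "count at Foo.scala:NN" text in the UI's Job table. A minimal sketch; the file name, paths, and data are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Foo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Foo")
    val sc = new SparkContext(conf)

    val data = sc.parallelize(1 to 100)

    // The description set here is shown in the Spark UI Job table
    // for jobs triggered after this call on the current thread.
    sc.setJobDescription("Counting input records")
    val n = data.count()

    // setJobGroup assigns a group id and description to all jobs
    // started from this thread until clearJobGroup() is called.
    sc.setJobGroup("save-phase", "Writing results out")
    data.saveAsTextFile("/tmp/foo-output") // placeholder path
    sc.clearJobGroup()

    sc.stop()
  }
}
```

Both calls set thread-local properties, so in a multi-threaded driver each thread must set its own description.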
