spark-user mailing list archives

From Matt Narrell <matt.narr...@gmail.com>
Subject Re: SPARK UI - Details post job processing
Date Fri, 26 Sep 2014 14:38:01 GMT
Yes, I’m running Hadoop’s Timeline Server, which does this for the YARN/Hadoop logs (and works very nicely, btw). Are you saying I can do the same for the Spark UI as well? Also, where do I set these Spark configurations, given that this will be executed inside a YARN container? On the “client” machine via spark-env.sh? Do I pass these as command-line arguments to spark-submit? Do I set them explicitly on my SparkConf?
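[Editor's note: the properties in question can be supplied in spark-defaults.conf, passed to spark-submit with --conf, or set on the SparkConf before the context is created. A minimal sketch of each; the HDFS log directory and jar name here are illustrative assumptions, not values from this thread:]

```shell
# Hedged sketch: three equivalent ways to enable Spark event logging.
# The HDFS path and application jar name are assumptions.

# (a) Cluster-wide default in $SPARK_HOME/conf/spark-defaults.conf:
#       spark.eventLog.enabled  true
#       spark.eventLog.dir      hdfs:///spark-event-logs

# (b) Per-job, on the spark-submit command line:
spark-submit \
  --master yarn \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=hdfs:///spark-event-logs \
  my-app.jar

# (c) Programmatically on the SparkConf before creating the context,
#     e.g. in Scala:
#       conf.set("spark.eventLog.enabled", "true")
#           .set("spark.eventLog.dir", "hdfs:///spark-event-logs")
```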

Thanks in advance.

mn

On Sep 25, 2014, at 9:13 PM, Andrew Ash <andrew@andrewash.com> wrote:

> Matt, you should be able to set an HDFS path, so you'll get logs written to a unified place instead of to local disk on a random box in the cluster.
> 
> On Thu, Sep 25, 2014 at 1:38 PM, Matt Narrell <matt.narrell@gmail.com> wrote:
> How does this work with a cluster manager like YARN?
> 
> mn
> 
> On Sep 25, 2014, at 2:23 PM, Andrew Or <andrew@databricks.com> wrote:
> 
>> Hi Harsha,
>> 
>> You can turn on `spark.eventLog.enabled` as documented here: http://spark.apache.org/docs/latest/monitoring.html. Then, if you are running in standalone mode, you can access the finished SparkUI through the Master UI. Otherwise, you can start a HistoryServer to display the finished UIs.
>> 
>> -Andrew
>> 
>> 2014-09-25 12:55 GMT-07:00 Harsha HN <99harsha.h.n99@gmail.com>:
>> Hi,
>> 
>> The details laid out in the Spark UI for a job in progress are really interesting and very useful,
>> but they vanish once the job is done.
>> Is there a way to get the job details after processing finishes?
>> 
>> I'm looking for the Spark UI data, not the standard input, output, and error info.
>> 
>> Thanks,
>> Harsha
>> 
> 
> 
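[Editor's note: the HistoryServer Andrew Or mentions is started from Spark's sbin scripts and reads the same event log directory the jobs write to. A minimal sketch; the HDFS path is an assumption carried over from the example above:]

```shell
# Hedged sketch: serving finished Spark UIs from persisted event logs.
# The log directory is an assumption, not a value from this thread.

# Point the history server at the directory the jobs log to:
echo "spark.history.fs.logDirectory hdfs:///spark-event-logs" \
  >> "$SPARK_HOME/conf/spark-defaults.conf"

# Start the daemon; the UI is then served on port 18080 by default:
"$SPARK_HOME/sbin/start-history-server.sh"
```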

