spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Spark Monitoring UI for Hadoop Yarn Cluster
Date Tue, 03 Mar 2015 19:41:12 GMT
Spark applications shown in the RM's UI should have an "Application
Master" link while they're running. That link takes you to the Spark UI
for that application, where you can see all the information you're
looking for.

If you're running a history server and add
"spark.yarn.historyServer.address" to your config, that link becomes a
"History" link once the application finishes, and it takes you to the
history server to view the app's UI.
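As a sketch, the relevant entries in spark-defaults.conf might look like
the following (the HDFS path and host name here are placeholders; adjust
them for your cluster):

```shell
# spark-defaults.conf
# Write event logs so the history server can reconstruct finished apps' UIs.
spark.eventLog.enabled            true
spark.eventLog.dir                hdfs:///spark-logs
# Hypothetical host; point this at wherever your history server runs
# (18080 is the history server's default port).
spark.yarn.historyServer.address  historyserver.example.com:18080
```

The history server itself must be started separately and configured to
read from the same event-log directory.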



On Tue, Mar 3, 2015 at 9:47 AM, Srini Karri <skarri.net@gmail.com> wrote:
> Hi All,
>
> I am having trouble finding information related to my requirement. Here is
> the context: I have tried a standalone Spark installation on Windows, where
> I am able to submit jobs and see the history of events. My question is: is
> it possible to get the same monitoring UI experience with a YARN cluster,
> such as viewing workers and running/completed job stages in the web UI?
> Currently, if we go to our YARN Resource Manager UI, we can see the Spark
> jobs and their logs, but it is not as rich as the Spark standalone master
> UI. Is this a limitation of the Hadoop YARN cluster, or is there a way to
> hook the Spark standalone master UI up to a YARN cluster?
>
> Any help is highly appreciated.
>
> Regards,
> Srini.



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

