spark-user mailing list archives

From Srini Karri <skarri....@gmail.com>
Subject Re: Spark Monitoring UI for Hadoop Yarn Cluster
Date Wed, 04 Mar 2015 18:08:22 GMT
Hi Todd and Marcelo,

Thanks for helping me. I was able to launch the history server on Windows
without any issues. One problem I am running into right now: I always get
the message "no completed applications found" in the history server UI,
even though I am able to browse these applications from the Spark Master.
Do you have any thoughts on what the problem could be? Following are my
settings in the spark conf file:

spark.executor.extraClassPath    D:\\Apache\\spark-1.2.1-bin-hadoop2\\spark-1.2.1-bin-hadoop2.4\\bin\\classes
spark.eventLog.dir               D:/Apache/spark-1.2.1-bin-hadoop2/spark-1.2.1-bin-hadoop2.4/bin/tmp/spark-events
spark.history.fs.logDirectory    D:/Apache/spark-1.2.1-bin-hadoop2/spark-1.2.1-bin-hadoop2.4/bin/tmp/spark-events
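
For reference, a minimal spark-defaults.conf for this setup would look like
the sketch below. The spark.eventLog.enabled line is an assumption on my
side (the history server can only list applications whose event logs were
actually written, and event logging is off by default); the paths are
illustrative, taken from the settings above:

```
# Sketch of a minimal history-server configuration (paths illustrative).
# spark.eventLog.enabled must be true for the *applications* being run,
# otherwise no event logs are written and the history server finds nothing.
spark.eventLog.enabled           true
spark.eventLog.dir               D:/Apache/spark-1.2.1-bin-hadoop2/spark-1.2.1-bin-hadoop2.4/bin/tmp/spark-events
spark.history.fs.logDirectory    D:/Apache/spark-1.2.1-bin-hadoop2/spark-1.2.1-bin-hadoop2.4/bin/tmp/spark-events
```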

I have also attached the Spark Master and Spark History Server UI
screenshots for convenience. All the logs are available, and I have granted
the directory permissions to "Everyone" with full control. Following is the
console output from the history server:

D:\Apache\spark-1.2.1-bin-hadoop2\spark-1.2.1-bin-hadoop2.4\bin>spark-class.cmd org.apache.spark.deploy.history.HistoryServer
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
15/03/04 08:59:42 INFO SecurityManager: Changing view acls to: skarri
15/03/04 08:59:42 INFO SecurityManager: Changing modify acls to: skarri
15/03/04 08:59:42 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(skarri); users with modify permissions: Set(skarri)
15/03/04 08:59:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/03/04 08:59:56 INFO Utils: Successfully started service on port 18080.
15/03/04 08:59:56 INFO HistoryServer: Started HistoryServer at http://skarri-lt05.redmond.corp.microsoft.com:18080

Regards,
Srini.

On Tue, Mar 3, 2015 at 11:41 AM, Marcelo Vanzin <vanzin@cloudera.com> wrote:

> Spark applications shown in the RM's UI should have an "Application
> Master" link when they're running. That takes you to the Spark UI for
> that application where you can see all the information you're looking
> for.
>
> If you're running a history server and add
> "spark.yarn.historyServer.address" to your config, that link will
> become a "History" link after the application is finished, and will
> take you to the history server to view the app's UI.
>
>
>
> On Tue, Mar 3, 2015 at 9:47 AM, Srini Karri <skarri.net@gmail.com> wrote:
> > Hi All,
> >
> > I am having trouble finding data related to my requirement. Here is the
> > context: I have tried a standalone Spark installation on Windows, and I
> > am able to submit the logs and see the history of events. My question
> > is: is it possible to achieve the same monitoring UI experience with a
> > YARN cluster, like viewing workers and running/completed job stages in
> > the web UI? Currently, if we go to our YARN Resource Manager UI, we are
> > able to see the Spark jobs and their logs, but it is not as rich as the
> > Spark standalone master UI. Is this a limitation of the Hadoop YARN
> > cluster, or is there any way we can hook this Spark standalone master
> > to the YARN cluster?
> >
> > Any help is highly appreciated.
> >
> > Regards,
> > Srini.
>
>
>
> --
> Marcelo
>
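
For anyone following the thread: Marcelo's spark.yarn.historyServer.address
suggestion corresponds to a single entry in spark-defaults.conf. The host
and port below are taken from the history server startup log earlier in
this thread; substitute wherever your history server actually runs:

```
# Point this at the host:port where the history server is listening.
spark.yarn.historyServer.address    skarri-lt05.redmond.corp.microsoft.com:18080
```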
