spark-user mailing list archives

From Yong Zhang <>
Subject Re: finding Spark Master
Date Wed, 08 Mar 2017 02:00:23 GMT
This website explains it very clearly, if you are using YARN.

Running Spark Applications on YARN -<>
When Spark applications run on a YARN cluster manager, resource management, scheduling, and
security are controlled by YARN.
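[For readers of the archive: the practical upshot of that page is that on YARN you do not point at a Spark master URL at all; you point Spark at the cluster's Hadoop/YARN config files and submit with `--master yarn`. A minimal sketch, assuming a typical HDP layout (the config path and the application class/jar names below are placeholders, not from this thread):

```shell
# Let Spark find the cluster via the YARN/HDFS client configs
# (default HDP location shown; adjust for your install).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit in client mode: the driver runs on this machine,
# executors are allocated by YARN on the cluster.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  my-app.jar
```
]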

From: Adaryl Wakefield <>
Sent: Tuesday, March 7, 2017 8:53 PM
To: Koert Kuipers
Subject: RE: finding Spark Master

Ah so I see setMaster(‘yarn-client’). Hmm.

What I was ultimately trying to do was develop with Eclipse on my windows box and have the
code point to my cluster so it executes there instead of my local windows machine. Perhaps
I’m going about this wrong.

Adaryl "Bob" Wakefield, MBA
Mass Street Analytics, LLC
Twitter: @BobLovesData

From: Koert Kuipers []
Sent: Tuesday, March 7, 2017 7:47 PM
To: Adaryl Wakefield <>
Subject: Re: finding Spark Master

assuming this is running on yarn there is really no spark master. every job creates its own "master"
within a yarn application.
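[Archive note: in code this means passing "yarn" to setMaster rather than a host:port master URL. A sketch, assuming Spark 2.x with spark-core on the classpath and HADOOP_CONF_DIR set to the cluster's config directory (the app name and class below are illustrative only):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object YarnClientApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("yarn-client-example")
      // "yarn-client" is deprecated in Spark 2.x; use "yarn" and set the
      // deploy mode separately. In client mode the driver runs locally
      // and YARN allocates the executors.
      .setMaster("yarn")
      .set("spark.submit.deployMode", "client")
    val sc = new SparkContext(conf)
    // distributed work runs on the YARN executors
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}
```

Launching from a Windows IDE against a remote cluster additionally requires the cluster's Hadoop client configs (and, on Windows, winutils.exe) to be visible to the driver process.]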

On Tue, Mar 7, 2017 at 6:27 PM, Adaryl Wakefield <> wrote:

I’m running a three-node cluster with Spark and Hadoop as part of an HDP stack.
How do I find my Spark Master? I’m just seeing the clients. I’m trying to figure out what
goes in setMaster() aside from local[*].

Adaryl "Bob" Wakefield, MBA
Mass Street Analytics, LLC
Twitter: @BobLovesData
