spark-user mailing list archives

From Tsuyoshi OZAWA <>
Subject Re: Dynamic Allocation in Spark 1.2.0
Date Sat, 27 Dec 2014 18:07:33 GMT
Hi Anders,

I faced the same issue you mentioned. Yes, you need to install the
Spark shuffle plugin for YARN. Please check the following PRs, which
add documentation on enabling dynamicAllocation:

I could run Spark on YARN with dynamicAllocation by following the
instructions described in the docs.
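For reference, the NodeManager-side part of those instructions boils down to registering Spark's shuffle service as a YARN auxiliary service. A sketch of the yarn-site.xml additions, assuming the spark-*-yarn-shuffle.jar has already been placed on the NodeManager classpath (property names per the Spark 1.2 docs):

```xml
<!-- yarn-site.xml on every NodeManager -->
<property>
  <!-- Add spark_shuffle alongside any existing aux-services
       (e.g. mapreduce_shuffle) rather than replacing them -->
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

After editing, restart all NodeManagers so the service is picked up; otherwise executors will register against a shuffle service that isn't running and the app can hang waiting for containers.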

- Tsuyoshi

On Sat, Dec 27, 2014 at 11:06 PM, Anders Arpteg <> wrote:
> Hey,
> Tried to get the new spark.dynamicAllocation.enabled feature working on YARN
> (Hadoop 2.2), but have been unsuccessful so far. I've tested with the following
> settings:
>       conf
>         .set("spark.dynamicAllocation.enabled", "true")
>         .set("spark.shuffle.service.enabled", "true")
>         .set("spark.dynamicAllocation.minExecutors", "10")
>         .set("spark.dynamicAllocation.maxExecutors", "700")
> The app works fine on Spark 1.2 if dynamicAllocation is not enabled, but
> with the settings above, it will start the app and the first job is listed
> in the web ui. However, no tasks are started and it seems to be stuck
> waiting for a container to be allocated forever.
> Any help would be appreciated. Do I need to do something specific to get the
> external YARN shuffle service running in the node manager?
> TIA,
> Anders

