spark-user mailing list archives

From Anders Arpteg <arp...@spotify.com>
Subject Dynamic Allocation in Spark 1.2.0
Date Sat, 27 Dec 2014 14:06:49 GMT
Hey,

I tried to get the new spark.dynamicAllocation.enabled feature working on
YARN (Hadoop 2.2), but have been unsuccessful so far. I've tested with the
following settings:

      conf
        .set("spark.dynamicAllocation.enabled", "true")
        .set("spark.shuffle.service.enabled", "true")
        .set("spark.dynamicAllocation.minExecutors", "10")
        .set("spark.dynamicAllocation.maxExecutors", "700")

The app works fine on Spark 1.2 if dynamicAllocation is not enabled, but
with the settings above it starts and the first job is listed in the web
UI; however, no tasks are ever started, and it appears to be stuck waiting
forever for a container to be allocated.

Any help would be appreciated. Do I need to do something specific to get
the external YARN shuffle service running in the NodeManager?
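For reference, here is the yarn-site.xml change I understand is needed to run the external shuffle service inside the NodeManager — a sketch only, assuming Spark 1.2's bundled YarnShuffleService and that the spark-1.2.0-yarn-shuffle.jar has been placed on the NodeManager classpath:

```xml
<!-- yarn-site.xml on every NodeManager; restart NodeManagers afterwards.
     Assumes spark-1.2.0-yarn-shuffle.jar is on the NodeManager classpath. -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <!-- keep mapreduce_shuffle if MapReduce jobs also run on this cluster -->
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

Is this the expected setup, or is something else required?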

TIA,
Anders
