spark-user mailing list archives

From Tamas Jambor <jambo...@gmail.com>
Subject Re: Yarn number of containers
Date Thu, 25 Sep 2014 21:20:09 GMT
Thank you.

Where is the number of containers set?

On Thu, Sep 25, 2014 at 7:17 PM, Marcelo Vanzin <vanzin@cloudera.com> wrote:
> On Thu, Sep 25, 2014 at 8:55 AM, jamborta <jamborta@gmail.com> wrote:
>> I am running Spark with the default settings in YARN client mode. For some
>> reason YARN always allocates three containers to the application (I'm
>> wondering where this is set?), and only uses two of them.
>
> The default number of executors in YARN mode is 2, so you have 2
> executors + the application master, i.e. 3 containers.
>
>> Also, the CPUs on the cluster never go over 50%. I turned off the fair
>> scheduler and set spark.cores.max to a high value. Are there additional
>> settings I am missing?
>
> You probably need to request more cores (--executor-cores). I don't
> remember whether that is respected in YARN, but it should be.
>
> --
> Marcelo
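
To make the advice above concrete, here is a sketch of how these settings are typically passed on the command line (the flag names are from Spark's YARN deployment options; the values and application name are illustrative, not taken from this thread):

```shell
# Launch in YARN client mode, explicitly sizing the executors.
# Container count = --num-executors + 1 (the ApplicationMaster),
# so this run would request 5 YARN containers.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --num-executors 4 \
  --executor-cores 4 \
  --executor-memory 2g \
  your_app.py
```

The same values can alternatively be set via spark.executor.instances and spark.executor.cores in spark-defaults.conf or with --conf; if neither is given, the default of 2 executors mentioned above applies.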

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

