spark-user mailing list archives

From Shushant Arora <shushantaror...@gmail.com>
Subject Re: spark on yarn
Date Tue, 14 Jul 2015 17:55:46 GMT
Is yarn.scheduler.maximum-allocation-vcores the setting for the maximum
vcores per container?

What's the setting for the maximum limit of --num-executors ?
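
For reference, this is the kind of submission I'm asking about (the flag
values are just illustrative, not from a real job):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --num-executors 10 \
      --executor-cores 5 \
      --executor-memory 4g \
      my-app.jar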

On Tue, Jul 14, 2015 at 11:18 PM, Marcelo Vanzin <vanzin@cloudera.com>
wrote:

> On Tue, Jul 14, 2015 at 10:40 AM, Shushant Arora <
> shushantarora09@gmail.com> wrote:
>
>> My understanding was that --executor-cores (5 here) is the maximum
>> number of concurrent tasks possible in an executor, and --num-executors
>> (10 here) is the number of executors (containers) requested from the
>> YARN RM by the application master / Spark driver program.
>>
>
> --executor-cores requests cores from YARN. YARN is a resource manager, and
> you're requesting more resources than it has available, so it denies your
> request. If you want to make more than 4 cores available in your NMs, you
> need to change YARN's configuration.
>
> --
> Marcelo
>
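
Following up on Marcelo's point above: the vcore capacity a NodeManager
advertises and the per-container cap the scheduler enforces are both set
in yarn-site.xml. A minimal sketch of the relevant properties (the values
are illustrative, pick them to match your hardware):

    <!-- yarn-site.xml -->
    <property>
      <!-- total vcores each NodeManager offers to the ResourceManager -->
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <value>8</value>
    </property>
    <property>
      <!-- largest vcore allocation the scheduler grants to one container -->
      <name>yarn.scheduler.maximum-allocation-vcores</name>
      <value>8</value>
    </property>

With settings like these, an --executor-cores request of up to 8 should
be grantable per container, assuming the memory side of the request also
fits.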
