spark-user mailing list archives

From Marcelo Vanzin <>
Subject Re: spark on yarn
Date Tue, 14 Jul 2015 17:48:09 GMT
On Tue, Jul 14, 2015 at 10:40 AM, Shushant Arora <> wrote:

> My understanding was that --executor-cores (5 here) is the maximum number
> of concurrent tasks possible in an executor, and --num-executors (10 here)
> is the number of executors or containers that the Application Master /
> Spark driver requests from the YARN RM.

--executor-cores is a request for cores from YARN. YARN is a resource
manager: you're asking for more resources per container than any of its
nodes has available, so it denies the request. If you want more than 4
cores available on your NMs (NodeManagers), you need to change YARN's
configuration.
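A minimal sketch of the mismatch, using the thread's numbers (10 executors, 5 cores each) against a NodeManager that advertises only 4 vcores. The spark-submit invocation in the comments is illustrative, not taken from the thread; `yarn.nodemanager.resource.cpu-vcores` is the standard YARN property controlling per-node vcores:

```shell
# Illustrative submission matching the thread's numbers:
#
#   spark-submit --master yarn \
#     --num-executors 10 \
#     --executor-cores 5 \
#     app.jar
#
# Each executor is one YARN container asking for 5 vcores. If every
# NodeManager advertises only 4 vcores (yarn-site.xml:
#   yarn.nodemanager.resource.cpu-vcores = 4),
# then no node can ever host a 5-vcore container, so YARN denies it.

PER_NM_VCORES=4     # what each NodeManager offers
EXECUTOR_CORES=5    # what each container asks for
NUM_EXECUTORS=10

if [ "$EXECUTOR_CORES" -gt "$PER_NM_VCORES" ]; then
  echo "unsatisfiable: executor needs $EXECUTOR_CORES vcores, NM has $PER_NM_VCORES"
fi

# Total vcores the application would consume if it could be scheduled:
echo "total vcores requested: $((NUM_EXECUTORS * EXECUTOR_CORES))"
```

Raising `yarn.nodemanager.resource.cpu-vcores` (and restarting the NodeManagers) is what makes a 5-core executor schedulable; lowering --executor-cores to 4 or less is the other way out.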

