spark-user mailing list archives

From Vipul Pandey <vipan...@gmail.com>
Subject Re: Problem with giving memory to executors on YARN
Date Fri, 19 Sep 2014 20:49:13 GMT
How many cores do you have in your boxes?
It looks like you are assigning 32 cores "per" executor - is that what you want? Are there other
applications running on the cluster? You might want to check the YARN UI to see how many containers
are actually getting allocated to your application.
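If it helps, here's a quick way to check what YARN actually granted - a rough sketch, assuming you
have the yarn CLI available on a gateway box and substituting your own node id:

    # list running applications (gives the application id and the tracking URL)
    yarn application -list -appStates RUNNING

    # per-node resource usage: running containers, memory used vs. capacity
    yarn node -status <node-id>

The ResourceManager web UI (port 8088 by default) shows the same thing per container. Also keep in
mind that each executor container asks YARN for --executor-memory plus
spark.yarn.executor.memoryOverhead, so with 32g executors on 48G nodes YARN can fit at most one
executor per node.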


On Sep 19, 2014, at 1:37 PM, Soumya Simanta <soumya.simanta@gmail.com> wrote:

> I'm launching a Spark shell with the following parameters
> 
> ./spark-shell --master yarn-client --executor-memory 32g --driver-memory 4g --executor-cores 32 --num-executors 8
> 
> but when I look at the Spark UI it shows only 209.3 GB total memory. 
> 
> 
> Executors (10)
> Memory: 55.9 GB Used (209.3 GB Total)
> This is a 10 node YARN cluster where each node has 48G of memory. 
> 
> Any idea what I'm missing here? 
> 
> Thanks
> -Soumya

