spark-user mailing list archives

From Soumya Simanta <soumya.sima...@gmail.com>
Subject Problem with giving memory to executors on YARN
Date Fri, 19 Sep 2014 20:37:48 GMT
I'm launching a Spark shell with the following parameters:

./spark-shell --master yarn-client --executor-memory 32g --driver-memory 4g
--executor-cores 32 --num-executors 8
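
(To rule out the flags being silently ignored, the effective values can be read back once the shell is up. sc.getConf is the standard SparkContext accessor; the config key names below are my assumption of how spark-submit maps these flags in this version.)

// Read back what the running context actually received; the keys are
// assumed to be the ones spark-submit sets from the CLI flags.
sc.getConf.getOption("spark.executor.memory").foreach(v => println(s"executor memory: $v"))
sc.getConf.getOption("spark.executor.cores").foreach(v => println(s"executor cores: $v"))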

but when I look at the Spark UI, it shows only 209.3 GB of total memory, whereas 8 executors × 32 GB should add up to 256 GB.


Executors (10)

   - Memory: 55.9 GB Used (209.3 GB Total)

This is a 10-node YARN cluster where each node has 48 GB of memory.

Any idea what I'm missing here?
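
In case it helps to diagnose, here is a quick cross-check I can run from inside the same shell (sc.getExecutorMemoryStatus is a standard SparkContext method; as far as I understand, it reports the memory available for caching, which is what the Executors page totals, rather than the full heap):

// Print, per executor, the max memory usable for caching and how much
// of it is still free; these should line up with the UI's Memory column.
sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, free)) =>
  val gb = 1024.0 * 1024 * 1024
  println(f"$executor%-30s total: ${maxMem / gb}%6.1f GB   free: ${free / gb}%6.1f GB")
}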

Thanks
-Soumya
