spark-user mailing list archives

From Lan Jiang <ljia...@gmail.com>
Subject Spark Yarn executor container memory
Date Tue, 16 Aug 2016 02:26:04 GMT
Hello,

My understanding is that the YARN executor container memory is based on "spark.executor.memory"
+ "spark.yarn.executor.memoryOverhead". The first is the heap memory and the second covers
off-heap memory. spark.executor.memory is passed as -Xmx to set the max heap size.
My question is why the container size does not also count the PermGen size and the memory used
by thread stacks, since neither is part of the max heap size. IMHO, the YARN executor container
memory should be set to: spark.executor.memory + [-XX:MaxPermSize] + number_of_threads * [-Xss]
+ spark.yarn.executor.memoryOverhead. What did I miss?
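
For what it's worth, here is a minimal Scala sketch of how I understand the container request
to be sized, assuming the 1.6/2.0-era defaults of roughly a 10% overhead factor with a 384 MB
floor when spark.yarn.executor.memoryOverhead is not set explicitly; the constant names and
exact numbers below are my assumptions, not quoted from the Spark source:

    // Sketch of how the YARN container request for an executor appears to be sized.
    // Assumption: overhead defaults to max(384 MB, 0.10 * executor memory) when
    // spark.yarn.executor.memoryOverhead is not set explicitly.
    object ContainerMemorySketch {
      val OverheadFactor = 0.10   // assumed default factor
      val OverheadMinMb  = 384L   // assumed floor, in MB

      def containerMemoryMb(executorMemoryMb: Long,
                            explicitOverheadMb: Option[Long] = None): Long = {
        val overhead = explicitOverheadMb.getOrElse(
          math.max((OverheadFactor * executorMemoryMb).toLong, OverheadMinMb))
        // Note: PermGen (-XX:MaxPermSize) and thread stacks (-Xss) are apparently
        // expected to fit inside this overhead; they are not added as separate terms.
        executorMemoryMb + overhead
      }

      def main(args: Array[String]): Unit = {
        // e.g. --executor-memory 8g with default overhead -> 8192 + 819 = 9011 MB requested
        println(containerMemoryMb(8192))
      }
    }

If that is accurate, then the overhead term is meant to absorb PermGen and stack memory rather
than the container sizing them explicitly, which is exactly the part I would like confirmed.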

Lan

