spark-user mailing list archives

From Akash Mishra <akash.mishr...@gmail.com>
Subject Understanding spark.executor.memoryOverhead
Date Thu, 09 Aug 2018 10:14:46 GMT
Hi *,

I would like to know which part of the Spark codebase uses
spark.executor.memoryOverhead. I have a job with
spark.executor.memory=18g, but it also needs
spark.executor.memoryOverhead=4g for the same process, otherwise I get
a task error:

ExecutorLostFailure (executor 28 exited caused by one of the running tasks)
Reason: Container killed by YARN for exceeding memory limits. 23.7 GB of 22
GB physical memory used. Consider boosting
spark.yarn.executor.memoryOverhead.
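
For reference, my understanding of how the 22 GB limit above comes
about: YARN enforces executor memory plus overhead, and when the
overhead is unset, Spark defaults it to max(10% of executor memory,
384 MB). A minimal sketch of that arithmetic in Scala (illustrative
only; the constant names are mine, not the actual code in Spark's
YARN Client):

// Sketch of the documented container-sizing formula, not Spark source.
object ContainerSizeSketch {
  val OverheadFactor = 0.10 // documented default fraction
  val OverheadMinMb  = 384L // documented minimum overhead

  def containerMemoryMb(executorMemoryMb: Long,
                        overheadMb: Option[Long]): Long = {
    // Unset overhead defaults to max(10% of executor memory, 384 MB).
    val overhead = overheadMb.getOrElse(
      math.max((executorMemoryMb * OverheadFactor).toLong, OverheadMinMb))
    executorMemoryMb + overhead
  }

  def main(args: Array[String]): Unit = {
    // 18g heap + 4g explicit overhead = 22528 MB, the "22 GB" YARN enforces.
    println(containerMemoryMb(18 * 1024, Some(4 * 1024))) // 22528
    // With no explicit overhead it would be 18432 + 1843 = 20275 MB.
    println(containerMemoryMb(18 * 1024, None))           // 20275
  }
}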

There is only one task running on this executor, and JVM heap usage is
around 14 GB. I am not able to understand what exactly is using the
5.7 GB of memory beyond the 18 GB Java heap.

Is it Netty for block reads, or something else?
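
In case it is useful, one way I know of to inspect off-heap usage is
HotSpot's Native Memory Tracking, enabled through executor JVM options.
A sketch (NMT reports JVM-tracked native memory such as direct buffers,
metaspace, and thread stacks, though not every native allocation):

// Sketch: enable Native Memory Tracking on executors so off-heap
// usage can be inspected with jcmd on the executor host.
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.executor.memory", "18g")
  .set("spark.executor.memoryOverhead", "4g")
  .set("spark.executor.extraJavaOptions",
       "-XX:NativeMemoryTracking=summary")

Then, on the executor node, run: jcmd <executor-pid> VM.native_memory summary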



-- 

Regards,
Akash Mishra.


"It's not our abilities that make us, but our decisions."--Albus Dumbledore
