spark-user mailing list archives

From Sunita Arvind <>
Subject Increasing spark.yarn.executor.memoryOverhead degrades performance
Date Mon, 18 Jul 2016 15:47:53 GMT
Hello Experts,

For one of our streaming applications, we intermittently saw:

WARN yarn.YarnAllocator: Container killed by YARN for exceeding memory
limits. 12.0 GB of 12 GB physical memory used. Consider boosting

Based on what I found on the internet and the error message, I increased
spark.yarn.executor.memoryOverhead to 768. This is actually slowing the
application down. We are on Spark 1.3, so I'm not sure if it's due to GC
pauses. To run some intelligent trials, I want to understand what could be
causing the degradation. Should I increase the driver memoryOverhead as well?
Another interesting observation: bringing the executor memory down to 5 GB
with executor memoryOverhead at 768 showed significant performance gains.
What are the other associated settings?
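For reference, a minimal sketch of how these settings might be passed on a
Spark 1.x YARN deployment. The application jar name is hypothetical, and the
values simply mirror the numbers mentioned above (memoryOverhead is specified
in MB on Spark 1.3); they are illustrations, not recommendations:

```shell
# Hypothetical spark-submit invocation; jar name and values are examples only.
# YARN kills a container when executor memory + overhead exceeds its limit,
# so lowering --executor-memory leaves more headroom for the overhead.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 5g \
  --conf spark.yarn.executor.memoryOverhead=768 \
  --conf spark.yarn.driver.memoryOverhead=768 \
  your-streaming-app.jar
```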

