spark-user mailing list archives

From Eric Bless <>
Subject Boosting spark.yarn.executor.memoryOverhead
Date Tue, 11 Aug 2015 21:40:38 GMT
Previously I was getting a failure which included the message:

    Container killed by YARN for exceeding memory limits. 2.1 GB of 2 GB
    physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.

So I attempted the following:

    spark-submit --jars examples.jar \
        --conf spark.yarn.executor.memoryOverhead=1024 host table
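For context, here is a hedged sketch of what a complete invocation usually looks like. The command above has no application JAR or `--class` between the flags and the arguments, which may be the real problem rather than the `--conf` itself. The class and JAR names below are placeholders I made up, not anything from the original command:

```shell
# Sketch only: spark-submit expects all --conf / --jars flags BEFORE the
# application JAR; everything after the JAR is passed to the app as arguments.
# com.example.MyApp and my-app.jar are hypothetical placeholders.
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  --jars examples.jar \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my-app.jar \
  host table
```

If the JAR is omitted, spark-submit will treat the first positional token (here `host`) as the application resource, which can produce launch failures like the one below.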

This resulted in:

    Application application_1438983806434_24070 failed 2 times due to
    AM Container for appattempt_1438983806434_24070_000002 exited with exitCode: -1000

Am I specifying spark.yarn.executor.memoryOverhead incorrectly?
