spark-user mailing list archives

From "Sea" <>
Subject About extra memory on yarn mode
Date Tue, 14 Jul 2015 12:44:48 GMT
Hi all:
I have a question about why Spark on YARN needs extra memory.
I requested 10 executors with 6 GB of executor memory each, but I find that YARN allocates 1 GB more per executor, so each executor gets 7 GB in total.
I tried setting spark.yarn.executor.memoryOverhead, but it did not help.
1 GB per executor is too much; does anyone know how to adjust its size?
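For reference, here is how I understand the setting is normally passed; the overhead value (512 MB) and the application jar path are placeholders, not from my actual job:

```shell
# Overhead is given in MB; by default YARN adds max(384 MB, 10% of executor memory)
# on top of --executor-memory, which matches the ~1 GB I am seeing with 6 GB executors.
spark-submit \
  --master yarn \
  --num-executors 10 \
  --executor-memory 6g \
  --conf spark.yarn.executor.memoryOverhead=512 \
  my-app.jar
```

Is this the right way to apply it, or does the config need to go somewhere else?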