spark-user mailing list archives

From "Sea" <261810...@qq.com>
Subject Re: About memory leak in spark 1.4.1
Date Sun, 02 Aug 2015 09:16:47 GMT
Hi, Barak
    It works fine with Spark 1.3.0; the problem only appears with Spark 1.4.1.
    I don't think spark.storage.memoryFraction will make any difference, because that memory is still on
the heap.
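
If the growth is really off-heap (for example Netty direct buffers -- that is only a guess on my side,
nothing I have confirmed), maybe capping direct memory on the executors would at least bound it.
A minimal sketch, with an arbitrary 4g limit and a hypothetical app name:

    import org.apache.spark.{SparkConf, SparkContext}

    // Cap the JVM direct (off-heap) buffer pool on every executor.
    // The 4g value is arbitrary; it only bounds the suspected off-heap growth,
    // it does not fix whatever is allocating the memory.
    val conf = new SparkConf()
      .setAppName("direct-memory-cap-example")  // hypothetical app name
      .set("spark.executor.extraJavaOptions", "-XX:MaxDirectMemorySize=4g")
    val sc = new SparkContext(conf)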




------------------ Original Message ------------------
From: "Barak Gitsis" <barakg@similarweb.com>
Sent: Sunday, August 2, 2015, 4:11 PM
To: "Sea" <261810726@qq.com>; "user" <user@spark.apache.org>
Cc: "rxin" <rxin@databricks.com>; "joshrosen" <joshrosen@databricks.com>; "davies" <davies@databricks.com>

Subject: Re: About memory leak in spark 1.4.1



Hi, reducing spark.storage.memoryFraction did the trick for me. The heap doesn't get filled because
that fraction is reserved.
My reasoning is:
I give the executor all the memory I can, so that sets the boundary.
From there I try to make the best use of the memory I have. storage.memoryFraction is, in a sense,
user data space; the rest can be used by the system.
If you don't have so much data that you MUST keep in memory for performance, it is better to give
Spark more space.
I ended up setting it to 0.3.
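
For reference, a minimal sketch of the setting (the app name and 50g heap here are just illustrative,
not taken from my actual job):

    import org.apache.spark.{SparkConf, SparkContext}

    // Fix the executor heap, then shrink the storage pool so more of the heap
    // is left for execution/shuffle and user objects.
    val conf = new SparkConf()
      .setAppName("memory-fraction-example")       // hypothetical app name
      .set("spark.executor.memory", "50g")         // illustrative heap size
      .set("spark.storage.memoryFraction", "0.3")  // default is 0.6 in Spark 1.x
    val sc = new SparkContext(conf)

The same properties can also go into spark-defaults.conf or on the spark-submit command line.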


All that said, this is on Spark 1.3, running on a cluster.


Hope that helps.


On Sat, Aug 1, 2015 at 5:43 PM Sea <261810726@qq.com> wrote:

Hi, all
I upgraded Spark to 1.4.1 and many applications failed... I find that the heap memory is not full,
but the CoarseGrainedExecutorBackend process takes more memory than I expect, and it keeps growing
as time goes on, finally exceeding the server's limit, and then the worker dies.....


Can anyone help?


Mode: standalone


spark.executor.memory 50g


25583 xiaoju    20   0 75.5g  55g  28m S 1729.3 88.1   2172:52 java


55g is more than the 50g I requested.



-- 

-Barak