spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-26750) Estimate memory overhead should taking multi-cores into account
Date Tue, 26 Feb 2019 15:42:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-26750?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-26750.
-------------------------------
    Resolution: Won't Fix

> Estimate memory overhead should taking multi-cores into account
> ---------------------------------------------------------------
>
>                 Key: SPARK-26750
>                 URL: https://issues.apache.org/jira/browse/SPARK-26750
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 2.4.0
>            Reporter: liupengcheng
>            Priority: Major
>
> Currently, Spark estimates the memory overhead without taking multiple cores into account; sometimes this can cause a direct-memory OOM, or the executor being killed by YARN for exceeding the requested physical memory.
> I think the memory overhead is related to the executor's core count (mainly Spark's direct memory and some related JVM native memory, for instance thread stacks, GC data, etc.), so maybe we can improve this estimate by taking the core count into account.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

