spark-issues mailing list archives

From "Hyukjin Kwon (Jira)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-28843) Set OMP_NUM_THREADS to executor cores to reduce Python memory consumption
Date Fri, 30 Aug 2019 01:31:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-28843?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-28843:
------------------------------------

    Assignee: Ryan Blue

> Set OMP_NUM_THREADS to executor cores to reduce Python memory consumption
> --------------------------------------------------------------------------
>
>                 Key: SPARK-28843
>                 URL: https://issues.apache.org/jira/browse/SPARK-28843
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.3.3, 3.0.0, 2.4.3
>            Reporter: Ryan Blue
>            Assignee: Ryan Blue
>            Priority: Major
>              Labels: release-notes
>
> While testing hardware with more cores, we found that the amount of memory required
> by PySpark applications increased, and we tracked the problem to importing numpy. The
> numpy issue is [https://github.com/numpy/numpy/issues/10455]
> NumPy uses OpenMP, which starts a thread pool with one thread per core on the machine
> (and does not respect cgroups). When we set this value lower, we see a significant
> reduction in memory consumption.
> This parallelism setting should be set to the number of cores allocated to the
> executor, not the number of cores available on the machine.
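
A minimal sketch of the workaround described above, in Python: the variable must be
exported before numpy is imported, because the OpenMP runtime sizes its thread pool
once at initialization. The value 4 here is an assumption standing in for the
executor's allocated core count (spark.executor.cores).

    import os

    # Cap OpenMP's thread pool before any OpenMP-backed library loads;
    # setting OMP_NUM_THREADS after numpy has been imported has no effect.
    # setdefault() keeps any value the cluster environment already exported.
    os.environ.setdefault("OMP_NUM_THREADS", "4")  # assumed: 4 executor cores

    import numpy as np  # OpenMP pool now sized to 4 threads, not all machine cores

At the cluster level, Spark's documented spark.executorEnv.[EnvironmentVariableName]
configuration can inject the same variable into executor processes, e.g.
--conf spark.executorEnv.OMP_NUM_THREADS=4.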



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


