spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-26771) Make .unpersist(), .destroy() consistently non-blocking by default
Date Sat, 02 Feb 2019 00:31:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-26771?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-26771.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 3.0.0

Issue resolved by pull request 23685
[https://github.com/apache/spark/pull/23685]

> Make .unpersist(), .destroy() consistently non-blocking by default
> ------------------------------------------------------------------
>
>                 Key: SPARK-26771
>                 URL: https://issues.apache.org/jira/browse/SPARK-26771
>             Project: Spark
>          Issue Type: Improvement
>          Components: GraphX, Spark Core
>    Affects Versions: 2.4.0
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>            Priority: Major
>              Labels: release-notes
>             Fix For: 3.0.0
>
>
> See https://issues.apache.org/jira/browse/SPARK-26728 and https://github.com/apache/spark/pull/23650.
> RDD and DataFrame expose an .unpersist() method with an optional "blocking" argument, as does Broadcast.destroy(). This argument defaults to false everywhere except in the Scala RDD implementation (not PySpark) and its GraphX subclasses. Most callers already request non-blocking behavior, and it is rarely useful to wait for the resources to be freed, except in tests that assert behavior about these methods (which typically request blocking).
> This proposes to make the default false across these methods, and to adjust callers to request the non-default blocking behavior only where it matters, such as in a few key tests.
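The semantics described above can be illustrated with a small toy sketch in Python. This is not Spark code: `ToyBlockManager` and its methods are hypothetical stand-ins that only model the contract of `unpersist(blocking=...)` after this change, where the default (`blocking=False`) returns immediately while resources are freed asynchronously, and `blocking=True` waits, as a test asserting freed state would.

```python
import threading
import time

class ToyBlockManager:
    """Hypothetical stand-in (not Spark code) for a cached resource
    whose cleanup happens asynchronously on remote executors."""

    def __init__(self):
        self.cached = True

    def _release(self):
        time.sleep(0.05)  # simulate executors freeing memory/disk blocks
        self.cached = False

    def unpersist(self, blocking=False):
        # Mirrors the post-SPARK-26771 default: kick off the release and
        # return immediately unless the caller explicitly asks to wait.
        t = threading.Thread(target=self._release)
        t.start()
        if blocking:
            t.join()
        return t

# Non-blocking (the new default): the call returns before release finishes.
mgr = ToyBlockManager()
t = mgr.unpersist()
t.join()  # wait here only so the example terminates cleanly
assert mgr.cached is False

# Blocking (opt-in, e.g. in a test): the call waits for the release.
mgr2 = ToyBlockManager()
mgr2.unpersist(blocking=True)
assert mgr2.cached is False
```

The design point is that callers on the hot path never pay for cleanup latency by default, while tests that need to observe the freed state can still opt into the old blocking behavior explicitly.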



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

