spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-3471) Automatic resource manager for SparkContext in Scala?
Date Sat, 24 Jan 2015 12:21:34 GMT

     [ https://issues.apache.org/jira/browse/SPARK-3471?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-3471.
------------------------------
    Resolution: Not a Problem

This is about adding a try-with-resources equivalent for Scala? Scala has no such language
construct, but the third-party ARM library provides exactly this functionality: https://github.com/jsuereth/scala-arm
In terms of what the Spark code has to do to enable resource management for {{SparkContext}},
there is nothing to do: it already implements {{Closeable}}, and even that is not required for
the library to work. A user application can simply depend on the library if it wants this
behavior, so I don't think any change to Spark is needed here.
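The pattern the ARM library generalizes can be sketched in a few lines of plain Scala. This is a minimal sketch, not scala-arm's actual API: the `using` helper and the `Stub` class are hypothetical names for illustration (Scala 2.13 later added a similar `scala.util.Using` to the standard library). Any `java.io.Closeable`, including `SparkContext`, fits the same shape:

```scala
import java.io.Closeable

// Minimal loan pattern: run the body, then close the resource even if
// the body throws -- the try-with-resources equivalent for Scala.
// `using` is a hypothetical helper name, not scala-arm's API.
def using[A <: Closeable, B](resource: A)(body: A => B): B =
  try body(resource)
  finally resource.close()

// Stand-in resource for demonstration; a real application would pass
// a SparkContext, which implements Closeable.
class Stub extends Closeable {
  var closed = false
  override def close(): Unit = closed = true
}

val stub = new Stub
using(stub) { r => () } // body runs, then close() is guaranteed
assert(stub.closed)
```

Because the helper only requires `Closeable`, it works with `SparkContext` unchanged, which is why nothing on the Spark side needs to be added.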

> Automatic resource manager for SparkContext in Scala?
> -----------------------------------------------------
>
>                 Key: SPARK-3471
>                 URL: https://issues.apache.org/jira/browse/SPARK-3471
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 1.0.2
>            Reporter: Shay Rojansky
>            Priority: Minor
>
> After discussion in SPARK-2972, it seems like a good idea to add "automatic resource
> management" semantics to SparkContext (i.e. "with" in Python (SPARK-3458), Closeable/AutoCloseable
> in Java (SPARK-3470)).
> I have no knowledge of Scala whatsoever, but a quick search seems to indicate that there
> isn't a standard mechanism for this - someone with real Scala knowledge should take a look
> and make a decision...



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

