spark-issues mailing list archives

From "Andrew Ash (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-1860) Standalone Worker cleanup should not clean up running applications
Date Thu, 22 May 2014 04:16:39 GMT

    [ https://issues.apache.org/jira/browse/SPARK-1860?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14005596#comment-14005596 ]

Andrew Ash commented on SPARK-1860:
-----------------------------------

The Spark master web UI shows the running applications, so the master at least knows what's running. Since the cleanup runs on a worker, the worker may need to be told by the master which applications are active. I don't know the Spark internals very well, but there's got to be a way to determine this.
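One possible shape of that fix, sketched in Scala (the names `appDirsToClean`, `runningAppIds`, and `workDir` are hypothetical stand-ins for the Worker's real internal state, not Spark's actual API): skip any work directory whose name matches a currently running application, and only then apply the age-based TTL check.

```scala
import java.io.File

// Hypothetical sketch: pick app work dirs eligible for cleanup, never
// touching directories that belong to applications with live executors.
def appDirsToClean(workDir: File, runningAppIds: Set[String], ttlMillis: Long): Seq[File] = {
  val now = System.currentTimeMillis()
  Option(workDir.listFiles()).getOrElse(Array.empty[File]).toSeq
    .filter(_.isDirectory)
    .filterNot(dir => runningAppIds.contains(dir.getName)) // skip live apps
    .filter(dir => now - dir.lastModified() > ttlMillis)   // only stale dirs
}
```

The key design point is that the running-app exclusion happens before the TTL test, so a long-lived streaming job is safe regardless of how old its jars are.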

> Standalone Worker cleanup should not clean up running applications
> ------------------------------------------------------------------
>
>                 Key: SPARK-1860
>                 URL: https://issues.apache.org/jira/browse/SPARK-1860
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.0.0
>            Reporter: Aaron Davidson
>            Priority: Critical
>             Fix For: 1.1.0
>
>
> With its default settings, the standalone worker cleanup code cleans up all application data
every 7 days. This includes jars that were added to any application that happens to run
for longer than 7 days, which hits streaming jobs especially hard.
> Applications should not be cleaned up while they're still running. Until that is fixed, this
behavior should not be enabled by default.
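As a stopgap until running applications are excluded, the cleanup can be disabled (or its TTL raised well past the longest-running job) through the documented `spark.worker.cleanup.*` properties, set via `SPARK_WORKER_OPTS` on each standalone worker. A sketch of one possible configuration (the specific values shown are illustrative, not recommendations):

```shell
# conf/spark-env.sh on each standalone worker

# Option 1: disable the periodic cleanup entirely.
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=false"

# Option 2: keep cleanup but raise the app-data TTL, e.g. to 30 days
# (both interval and TTL are in seconds; 30 days = 2592000 s):
# export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.interval=1800 -Dspark.worker.cleanup.appDataTtl=2592000"
```

Either setting avoids deleting jars out from under a streaming job, at the cost of letting stale application data accumulate on disk.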



--
This message was sent by Atlassian JIRA
(v6.2#6252)
