spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-25183) Spark HiveServer2 registers shutdown hook with JVM, not ShutdownHookManager; race conditions can arise
Date Wed, 22 Aug 2018 17:28:02 GMT

     [ https://issues.apache.org/jira/browse/SPARK-25183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-25183:
------------------------------------

    Assignee: Apache Spark

> Spark HiveServer2 registers shutdown hook with JVM, not ShutdownHookManager; race conditions can arise
> ------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25183
>                 URL: https://issues.apache.org/jira/browse/SPARK-25183
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Steve Loughran
>            Assignee: Apache Spark
>            Priority: Minor
>
> Spark's HiveServer2 registers a shutdown hook directly with the JVM via {{Runtime.addShutdownHook()}}, so it can run in parallel with the Spark and Hadoop {{ShutdownHookManager}} hooks, which execute their shutdowns in an ordered sequence.
> This has some risks:
> * The filesystem may be shut down before the rename of the logs completes (SPARK-6933).
> * Delays in renames on object stores may block the FS close operation; on clusters where the {{FileSystem.closeAll()}} shutdown hook has a timeout (HADOOP-12950), that hook can be forcibly killed, with other problems following.
> General outcome: the logs aren't present.
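> For illustration, a minimal sketch of the JVM-level registration, under the assumption that the hook is added roughly this way; {{StubHiveServer2}} and {{hiveServer2}} are placeholders, not the actual Spark code:
> {code:scala}
> object JvmHookSketch {
>   // Placeholder for the real HiveServer2 instance; illustrative only.
>   class StubHiveServer2 { def stop(): Unit = println("HiveServer2 stopped") }
>   val hiveServer2 = new StubHiveServer2
>
>   def register(): Unit = {
>     // Registered directly with the JVM: this thread runs concurrently with
>     // every other JVM shutdown hook, with no defined ordering relative to
>     // the Spark/Hadoop ShutdownHookManager sequence (log rename, FS close).
>     Runtime.getRuntime.addShutdownHook(new Thread(new Runnable {
>       override def run(): Unit = hiveServer2.stop()
>     }))
>   }
> }
> {code}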
> Proposed fix: 
> * Register the hook with {{org.apache.spark.util.ShutdownHookManager}}, as sketched below.
> * Use HADOOP-15679 to make the shutdown wait time configurable, so that O(data) renames don't trigger timeouts.
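> A minimal sketch of the proposed registration, assuming the caller sits inside the Spark source tree (the manager object is {{private[spark]}}); {{hiveServer2}} is again a placeholder:
> {code:scala}
> package org.apache.spark.sql.hive.thriftserver
>
> import org.apache.spark.util.ShutdownHookManager
>
> object ManagedHookSketch {
>   // Placeholder for the real HiveServer2 instance; illustrative only.
>   class StubHiveServer2 { def stop(): Unit = () }
>   val hiveServer2 = new StubHiveServer2
>
>   // The hook now runs inside the ordered Spark/Hadoop shutdown sequence,
>   // so the stop (and the log rename it triggers) is intended to complete
>   // before Hadoop's FileSystem.closeAll() hook closes the filesystems.
>   val hookRef: AnyRef = ShutdownHookManager.addShutdownHook { () =>
>     hiveServer2.stop()
>   }
>
>   // On a clean shutdown the hook can be deregistered again:
>   // ShutdownHookManager.removeShutdownHook(hookRef)
> }
> {code}
> If ordering relative to other Spark hooks matters, the priority-taking overload {{ShutdownHookManager.addShutdownHook(priority)(hook)}} could be used instead.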



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


