spark-dev mailing list archives

From Hemant Bhanawat <>
Subject Re: Executor shutdown hooks?
Date Thu, 07 Apr 2016 04:51:36 GMT
As part of the PR, I have added a
killAllTasks function that can be used to kill (rather, interrupt)
individual tasks before an executor exits. If this PR is accepted, we
can add a call to this function before the executor exits to perform
task-level cleanups. The exit thread will wait for a certain period of
time before the executor JVM exits, allowing the tasks to clean up properly.

Hemant Bhanawat <>

On Thu, Apr 7, 2016 at 6:08 AM, Reynold Xin <> wrote:

> On Wed, Apr 6, 2016 at 4:39 PM, Sung Hwan Chung <>
> wrote:
>> My option so far seems to be using JVM's shutdown hook, but I was
>> wondering if Spark itself had an API for tasks.
> Spark would be using that under the hood anyway, so you might as well just
> use the jvm shutdown hook directly.
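A minimal sketch of the approach Reynold suggests: registering a hook via the standard `Runtime.addShutdownHook` API from task code. The cleanup logic here is hypothetical placeholder work; in a real job it would flush buffers, close connections, or delete temp files, and must finish quickly since the JVM gives hooks only a limited window on termination.

```java
public class CleanupHookExample {
    public static void main(String[] args) {
        // Register a JVM shutdown hook; it runs on normal exit or SIGTERM,
        // which is how an executor process is typically asked to stop.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            // Hypothetical per-task cleanup: release any external resources.
            System.out.println("shutdown hook: cleaning up task resources");
        }));

        // Stand-in for the task body doing its actual work.
        System.out.println("task body running");

        // When main returns, the JVM begins shutdown and runs the hook above.
    }
}
```

Note that hooks registered this way are per-JVM, not per-task, so a task that registers one should also deregister it (via `Runtime.removeShutdownHook`) when it completes normally, or the hook will linger for the life of the executor.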
