spark-user mailing list archives

From Tathagata Das <t...@databricks.com>
Subject Re: Graceful shutdown for Spark Streaming
Date Thu, 30 Jul 2015 07:32:38 GMT
How is sleep not working? Are you doing

streamingContext.start()
Thread.sleep(xxx)
streamingContext.stop()
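
For a fixed-duration run, a minimal sketch along these lines should work (assuming Spark 1.3 or later, where awaitTerminationOrTimeout is available; the app name and runDurationMs below are only placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("FixedDurationApp")  // placeholder app name
val ssc = new StreamingContext(conf, Seconds(1))
// ... define the input DStreams and output operations here ...

val runDurationMs = 10 * 60 * 1000L  // placeholder: run for 10 minutes

ssc.start()
// Block until the run time elapses; returns early if the context stops for another reason.
ssc.awaitTerminationOrTimeout(runDurationMs)
// Graceful stop: finish processing the data already received before shutting down.
ssc.stop(stopSparkContext = true, stopGracefully = true)

awaitTerminationOrTimeout blocks for the given number of milliseconds, and stopGracefully = true makes the context finish processing all received data before shutting down.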

On Wed, Jul 29, 2015 at 6:55 PM, anshu shukla <anshushukla0@gmail.com>
wrote:

> If we want to stop the application after a fixed time period, how should that
> work? (How do I specify the duration in the logic? In my case, sleep(t.s.) is
> not working.) So I have been killing the CoarseGrained job on each slave with
> a script. Please suggest something.
>
> On Thu, Jul 30, 2015 at 5:14 AM, Tathagata Das <tdas@databricks.com>
> wrote:
>
>> StreamingContext.stop(stopGracefully = true) stops the streaming context
>> gracefully.
>> Then you can safely terminate the Spark cluster. These are two different
>> steps and need to be done separately, ensuring that the driver process has
>> completely terminated before the Spark cluster is terminated.
>>
>> On Wed, Jul 29, 2015 at 6:43 AM, Michal Čizmazia <micizma@gmail.com>
>> wrote:
>>
>>> How can a graceful shutdown be initiated from outside the Spark Streaming
>>> driver process? This applies both to the local and cluster modes of Spark
>>> Standalone, as well as to EMR.
>>>
>>> Does sbin/stop-all.sh stop the context gracefully? How is it done? Is
>>> there a signal sent to the driver process?
>>>
>>> For EMR, is there a way to terminate an EMR cluster with a graceful Spark
>>> Streaming shutdown?
>>>
>>> Thanks!
>>>
>>>
>>>
>>
>
>
> --
> Thanks & Regards,
> Anshu Shukla
>
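
On the earlier question of initiating a graceful shutdown from outside the driver process: one commonly used pattern (not discussed in this thread; shown here only as an illustrative sketch) is to have the driver poll an external flag, such as a marker file on shared storage, and stop gracefully when it appears. The path below is a placeholder, and ssc is the StreamingContext from the sketch above.

import java.nio.file.{Files, Paths}

// Placeholder path; on a real cluster this would typically be a file on shared
// storage (or a flag in ZooKeeper, a database, etc.) that an external script creates.
val shutdownMarker = Paths.get("/tmp/stop-streaming-app")

ssc.start()
var stopRequested = false
while (!stopRequested) {
  // Wake up every 10 seconds; returns true once the context has terminated.
  val terminated = ssc.awaitTerminationOrTimeout(10000)
  if (terminated) {
    stopRequested = true
  } else if (Files.exists(shutdownMarker)) {
    // Finish processing received data, then stop the context and the SparkContext.
    ssc.stop(stopSparkContext = true, stopGracefully = true)
    stopRequested = true
  }
}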
