spark-user mailing list archives

From Tobias Pfeiffer <>
Subject Re: StreamingContext does not stop
Date Fri, 14 Nov 2014 02:46:34 GMT

I guess I found part of the issue: I wrote
  dstream.transform(rdd => { rdd.foreachPartition(...); rdd })
instead of
  dstream.transform(rdd => { rdd.mapPartitions(...) }),
which is why stop() would not stop the processing: foreachPartition is an
action, so it ran my code eagerly as a side effect of transform instead of
as part of the generated streaming job.
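For reference, a sketch of the two variants (plain Spark Streaming Scala; processPartition and transformPartition are hypothetical placeholder functions, not part of my actual code):

```scala
// Problematic: foreachPartition is an action. It executes the processing
// eagerly, outside the job that Spark generates for the batch, and the
// unchanged rdd is returned.
dstream.transform { rdd =>
  rdd.foreachPartition(partition => processPartition(partition))  // side effect
  rdd
}

// Better: mapPartitions is a lazy transformation, so the processing runs as
// part of the generated job, which Spark can track and stop.
dstream.transform { rdd =>
  rdd.mapPartitions(partition => transformPartition(partition))
}
```

Here processPartition would be an Iterator[T] => Unit side effect, while transformPartition is an Iterator[T] => Iterator[U] transformation.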

Now, with the new version, a non-graceful shutdown works in the sense that
Spark does not wait for my processing to complete; the job generator, job
scheduler, job executor etc. all seem to shut down fine. Only the threads
that do the actual processing do not: even after streamingContext.stop()
returns, I still see logging output from my processing.
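For completeness, this is the stop() variant I am calling; StreamingContext.stop takes flags that control whether the underlying SparkContext is stopped and whether the shutdown waits for in-flight data to be processed (ssc here stands for my StreamingContext instance):

```scala
// Non-graceful: do not wait for received data to be processed,
// but keep the underlying SparkContext alive.
ssc.stop(stopSparkContext = false, stopGracefully = false)

// Graceful alternative: wait until all received data has been processed.
ssc.stop(stopSparkContext = false, stopGracefully = true)
```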

Is there any way to signal to my processing tasks that they should stop the
processing?
