spark-user mailing list archives

From "Evo Eftimov" <>
Subject RE: stop streaming context on job failure
Date Tue, 16 Jun 2015 14:07:30 GMT


Also subscribe to the various listeners for the various metrics types, e.g. job stats/statuses - this
will allow you (in the driver) to decide when to stop the context gracefully (the listening
and stopping can be done from a completely separate thread in the driver).

Class JobProgressListener

Object
All Implemented Interfaces: Logging, SparkListener

public class JobProgressListener
extends Object
implements SparkListener, Logging

:: DeveloperApi :: Tracks task-level information to be displayed in the UI.

All access to the data structures in this class must be synchronized on the class, since the
UI thread and the EventBus loop may otherwise be reading and updating the internal data structures.
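A minimal sketch of the approach described above: a thread-safe failure counter that a listener updates and a separate driver thread polls to decide when to stop the context. The `FailureTracker` and `StopOnFailureListener` names are illustrative, not part of Spark; the Spark wiring (shown in comments) assumes the public `SparkListener.onJobEnd` callback and `SparkContext.addSparkListener`.

```scala
import java.util.concurrent.atomic.AtomicInteger

// Thread-safe failure counter: the listener bus thread records failures,
// while a separate watcher thread in the driver polls shouldStop.
class FailureTracker(maxFailures: Int) {
  private val failures = new AtomicInteger(0)
  def recordFailure(): Int = failures.incrementAndGet()
  def shouldStop: Boolean = failures.get() >= maxFailures
}

// Sketch of the Spark wiring (requires spark-core on the classpath):
//
//   import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, JobFailed}
//
//   class StopOnFailureListener(tracker: FailureTracker) extends SparkListener {
//     override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
//       jobEnd.jobResult match {
//         case JobFailed(_) => tracker.recordFailure()
//         case _            => // job succeeded, nothing to do
//       }
//   }
//
//   sc.addSparkListener(new StopOnFailureListener(tracker))
//   // watcher thread in the driver, polling periodically:
//   //   if (tracker.shouldStop) ssc.stop(stopSparkContext = true, stopGracefully = true)

object Demo extends App {
  val tracker = new FailureTracker(maxFailures = 3)
  (1 to 3).foreach(_ => tracker.recordFailure())
  println(tracker.shouldStop) // true once 3 failures have been recorded
}
```

Keeping the counter in an `AtomicInteger` avoids taking locks on the listener bus thread; the stop call itself should happen from the separate watcher thread, not from inside the listener callback.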





From: Krot Viacheslav [] 
Sent: Tuesday, June 16, 2015 2:35 PM
Subject: stop streaming context on job failure


Hi all,

Is there a way to stop the streaming context when some batch processing fails?

I want to set a reasonable retries count, say 10, and if it still fails - stop the context completely.

Is that possible?
