spark-user mailing list archives

From lalit1303 <>
Subject Re: Lifecycle of RDD in spark-streaming
Date Wed, 26 Nov 2014 09:36:36 GMT
Hi Mukesh,

When you create a streaming job, a DAG is built that captures your job
plan, i.e. all the map transformations and action operations to be performed
on each batch of the streaming application.

Once the job is started, the input DStream takes data from the
specified source, and the transformations/actions are executed on each batch
according to that DAG. After all operations on a batch have completed, the
RDDs underlying the DStream are evicted in LRU fashion.
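To make that concrete, here is a minimal Scala sketch of a streaming job using Spark Streaming's DStream API. The host, port, batch interval, and app name are illustrative assumptions, and it needs a Spark runtime to actually execute; the point is that the transformation chain is recorded lazily into the DAG and replayed on each batch's RDD once an output action is attached:

```scala
// Sketch only: assumes the classic Spark Streaming (DStream) API.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingDagSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StreamingDagSketch")
    // Batches are formed every 10 seconds; each batch becomes one RDD.
    val ssc = new StreamingContext(conf, Seconds(10))

    // Input DStream: reads lines from a socket source (host/port assumed).
    val lines = ssc.socketTextStream("localhost", 9999)

    // Transformations are recorded lazily into the DAG; nothing runs yet.
    val counts = lines.flatMap(_.split(" "))
                      .map(word => (word, 1))
                      .reduceByKey(_ + _)

    // The output action completes the DAG; it is re-executed
    // against the new RDD generated for every batch interval.
    counts.print()

    ssc.start()             // begin receiving data and scheduling batches
    ssc.awaitTermination()  // run until stopped
  }
}
```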

Lalit Yadav