spark-user mailing list archives

From hawkwang <>
Subject Spark Streaming - How to save all items in batches from beginning to a single stream RDD?
Date Tue, 22 Jul 2014 14:59:31 GMT
Hi guys,

Is it possible to maintain a single stream RDD that is updated with each 
new batch RDD's content?

I know that we can use updateStateByKey for aggregation,
but here I just want to keep track of all the historical original content.

I also noticed that we could save to Redis or another storage system,
but can we make this happen using Spark Streaming mechanisms alone?
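
Concretely, the behavior I'm after looks roughly like the sketch below, with plain Python lists standing in for RDDs (the class and method names are just illustrative, not Spark API). In Spark terms it would be something like `historical = historical.union(batchRDD)` inside `foreachRDD`, presumably with periodic checkpointing so the lineage doesn't grow without bound:

```python
class HistoricalUnion:
    """Keeps a running 'union' of every batch seen so far.

    Illustrative only: in Spark Streaming the accumulated value would be
    an RDD, and on_batch would be the body of a foreachRDD callback.
    """

    def __init__(self):
        self.historical = []  # stands in for the accumulated RDD

    def on_batch(self, batch):
        # Union the new batch into the running collection,
        # analogous to historical = historical.union(batch_rdd).
        self.historical = self.historical + batch


acc = HistoricalUnion()
for batch in [[1, 2], [3], [4, 5]]:  # each list plays the role of one batch RDD
    acc.on_batch(batch)

print(acc.historical)  # all items from every batch, from the beginning
```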

Thanks for any suggestions.
