spark-user mailing list archives

From karthik tunga <karthik.tu...@gmail.com>
Subject Spark Streaming join
Date Thu, 02 Jun 2016 06:48:58 GMT
Hi,

I have a scenario where I need to join a DStream with an RDD. This is to add
some metadata to the incoming events. That part is fairly straightforward.
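
In case a sketch helps show what I mean, this is roughly what I have for the
join (Scala, DStream API); the paths, port, and field layout are just
placeholders from my setup:

import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

val sc  = new SparkContext("local[2]", "enrich-events")
val ssc = new StreamingContext(sc, Seconds(5))

// Metadata keyed by the same id field as the incoming events.
val metadata = sc.textFile("hdfs:///path/to/metadata")
  .map { line => val f = line.split(","); (f(0), f(1)) }
  .cache()

// Incoming events as (id, payload) pairs.
val events = ssc.socketTextStream("localhost", 9999)
  .map { line => val f = line.split(","); (f(0), f(1)) }

// transform() exposes each batch as an RDD, so a plain RDD join works.
val enriched = events.transform { batch => batch.join(metadata) }
enriched.print()

ssc.start()
ssc.awaitTermination()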

What I also want to do is refresh this metadata RDD on a fixed schedule (or
when the underlying HDFS file changes). I want to "expire" and reload this RDD
every, say, 10 minutes.
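
Roughly what I'm picturing (reusing sc and events from the sketch above, and
replacing its transform) is to keep the metadata RDD in a driver-side var and,
inside transform(), reload it once it is older than some TTL. The 10-minute TTL
and the reload-inside-transform idea are just my guess, which is really what
I'm asking about:

import org.apache.spark.rdd.RDD

def loadMetadata(): RDD[(String, String)] =
  sc.textFile("hdfs:///path/to/metadata")
    .map { line => val f = line.split(","); (f(0), f(1)) }
    .cache()

var metadataRef = loadMetadata()
var loadedAt    = System.currentTimeMillis()
val ttlMs       = 10 * 60 * 1000L

val enrichedWithRefresh = events.transform { batch =>
  // transform's function runs on the driver for every batch, so it seems like
  // a reasonable place to swap in a fresh metadata RDD once the old one expires.
  if (System.currentTimeMillis() - loadedAt > ttlMs) {
    metadataRef.unpersist(blocking = false)
    metadataRef = loadMetadata()
    loadedAt = System.currentTimeMillis()
  }
  batch.join(metadataRef)
}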

Is this possible?

Apologies if this has been asked before.

Cheers,
Karthik
