spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: [SparkStreaming] Is it possible to delay the start of some DStream in the application?
Date Sun, 17 May 2015 15:28:59 GMT
Why not just trigger your batch job with that event?

If you really need streaming, then you can create a custom receiver and
have the receiver sleep until the event has happened. Note that this will
still run your streaming pipeline in the meantime, just without any data
to process.
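A minimal sketch of such a gated receiver, based on the Spark 1.x custom
receiver API (the `eventHappened` callback and the poll interval are
illustrative assumptions, not part of any Spark API):

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// A receiver that stays idle until an external condition becomes true,
// then begins producing records. `eventHappened` is an assumed callback
// you would wire up to your own readiness check.
class GatedReceiver(eventHappened: () => Boolean)
  extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  def onStart(): Unit = {
    new Thread("Gated Receiver") {
      override def run(): Unit = {
        // Sleep until the event has happened (or the receiver is stopped).
        while (!isStopped && !eventHappened()) {
          Thread.sleep(1000)
        }
        // From here on, receive data and hand it to Spark via store(...).
        while (!isStopped) {
          store("your data here") // replace with real ingestion
        }
      }
    }.start()
  }

  def onStop(): Unit = {
    // The receiving thread checks isStopped, so nothing to do here.
  }
}
```

You would register it with `ssc.receiverStream(new GatedReceiver(...))`
like any other receiver; until the gate opens, each batch simply contains
no records.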

Thanks
Best Regards

On Fri, May 15, 2015 at 4:39 AM, Haopu Wang <HWang@qilinsoft.com> wrote:

> In my application, I want to start a DStream computation only after a
> special event has happened (for example, I want to start the receiver
> only after the reference data has been properly initialized).
>
> My question is: it looks like the DStream will be started right after
> the StreamingContext has been started. Is it possible to delay the start
> of a specific DStream?
>
> Thank you very much!
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
