kafka-users mailing list archives

From "Matthias J. Sax" <matth...@confluent.io>
Subject Re: Writing streams to kafka topic
Date Fri, 01 Sep 2017 16:48:44 GMT

This is not supported by the DSL layer. What you would need to do is add a
custom stateful transform() operator after the window
(`stream.groupByKey().aggregate().toStream().transform().to()`) that
buffers the output and remembers the latest result per key. Second, you would
schedule a punctuation that emits the buffered data whenever you want.
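To make the idea concrete, here is a minimal sketch of the buffer-and-punctuate pattern in plain Java, with no Kafka dependency. The class and method names are illustrative, not Kafka API: in a real Kafka Streams Transformer you would keep the map in a state store, update it in transform(), and forward the buffered records from a punctuation scheduled via the ProcessorContext instead of returning a list.

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative stand-in for the state kept inside a stateful transform():
// it retains only the newest aggregate per key, and a periodic "punctuation"
// drains and emits everything buffered so far.
class WindowResultBuffer<K, V> {
    // Latest aggregate seen per key since the last punctuation.
    private final Map<K, V> latest = new LinkedHashMap<>();

    // Called for every updated window result; intermediate updates for the
    // same key are overwritten, so only the final value is emitted later.
    void update(K key, V value) {
        latest.put(key, value);
    }

    // Called by the scheduled punctuation: emit everything buffered and
    // clear the buffer so the next interval starts fresh. In Kafka Streams
    // you would call context.forward(key, value) here instead.
    List<Map.Entry<K, V>> punctuate() {
        List<Map.Entry<K, V>> out = new ArrayList<>();
        for (Map.Entry<K, V> e : latest.entrySet()) {
            out.add(new AbstractMap.SimpleEntry<>(e.getKey(), e.getValue()));
        }
        latest.clear();
        return out;
    }
}
```

With this shape, repeated updates to key "a" within one interval produce a single emitted record carrying the latest value, which is the suppression behavior the DSL alone does not give you.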

Hope this helps.


On 8/31/17 9:52 PM, Praveen wrote:
> Hi,
> I have a use case where I want to schedule processing of events in the
> future. I am not really sure if this is a proper use of a stream processing
> application, but I was looking at KTable and the Kafka Streams API to see if
> this was possible.
> So far the pattern I have is:
>     FEED -> changelog stream -> groupByKey() -> window -> write to
> different kafka topic
> The window here, I believe, would be a tumbling window for my use case. I'd
> like to write back to a Kafka topic only after the window retention ends.
> The documentation
> <http://docs.confluent.io/current/streams/developer-guide.html#writing-streams-back-to-kafka>
> says that streams may only be written "continuously" to the Kafka topic. Is
> that the case?
> - Praveen
