kafka-users mailing list archives

From Benjamin Manns <benma...@gmail.com>
Subject Kafka Streams demultiplexer
Date Sat, 16 Apr 2016 14:31:27 GMT
Hi all,

I'm taking a look at the new Kafka Streams system in 0.10, and I was
wondering if anyone has an example of, or knows how I might go about,
demultiplexing a topic with Kafka Streams. It looks pretty easy to join
streams together (which is great), but I only see ways to produce a single
topic, or a predetermined number of topics. My use case is that I have a
producer that generates messages

{ name: 'foo', data: 1 }
{ name: 'bar', data: 2 }
{ name: 'foo', data: 4 }

And I want to produce streams

{ name: 'foo', data: 1 }
{ name: 'foo', data: 4 }

{ name: 'bar', data: 2 }

Where the cardinality of name is between 100 and 1000 and will change over
time. (Specifically, I want to split the change-data-capture output from
Maxwell's Daemon <http://maxwells-daemon.io/> or similar into one topic per
table.)

Is there a way to do this? My thinking is that it will be far more
performant to consume per-table topics for stream/KTable joins than to
filter the firehose of every single change in the database.
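To make the shape of what I'm after concrete, here's a minimal sketch of the
routing logic in plain Java (no Kafka involved; Message and demux are just
names I made up for this example). Each bucket would become its own output
topic, e.g. one per table name:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class Demux {
    // Stands in for the JSON records above: a routing key ("name", the
    // table name in my case) plus a payload.
    record Message(String name, int data) {}

    // Group messages into per-name buckets. In the real pipeline each
    // bucket would be produced to its own topic (e.g. "changes." + name),
    // where the set of names is large and changes over time.
    static Map<String, List<Message>> demux(List<Message> input) {
        Map<String, List<Message>> byName = new HashMap<>();
        for (Message m : input) {
            byName.computeIfAbsent(m.name(), k -> new ArrayList<>()).add(m);
        }
        return byName;
    }

    public static void main(String[] args) {
        List<Message> input = List.of(
            new Message("foo", 1),
            new Message("bar", 2),
            new Message("foo", 4));
        demux(input).forEach((name, msgs) ->
            System.out.println(name + " -> " + msgs));
    }
}
```

The sticking point is that this needs a dynamically chosen output topic per
record, whereas the DSL operations I've found so far want the set of output
streams fixed up front.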



Benjamin Manns
(434) 321-8324
