kafka-users mailing list archives

From Henry Thacker <he...@henrythacker.com>
Subject Re: Kafka Streams 0.10.0.1 - multiple consumers not receiving messages
Date Tue, 02 May 2017 07:53:15 GMT
Thanks all for your replies - I have checked out the docs, which were very
helpful.

I have now moved the separate topic streams to different processes, each
with its own app.id, and I'm seeing the following pattern, with no data
consumed:

"Starting stream thread [StreamThread-1]
Discovered coordinator .... for group ..
Marking the coordinator .... dead for group ..
Discovered coordinator .... for group ..
Marking the coordinator .... dead for group .."

This discover/mark-dead cycle repeats every few minutes.

During this time, the broker logs look happy.

One other, hopefully unrelated, point: this cluster is fully SSL encrypted.
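For what it's worth, a repeating discover/mark-dead loop is commonly reported when a client talks to an SSL-only listener without its own SSL settings, so the Streams configuration may be worth double-checking. A minimal sketch of the relevant properties - the application name, broker address, truststore path, and password below are placeholders, not values from this thread:

```java
import java.util.Properties;

public class StreamsSslConfig {

    public static Properties build() {
        Properties props = new Properties();
        // Each process gets its own application.id, which Streams also
        // uses as the underlying consumer group.id
        props.put("application.id", "topic-a-streamer");            // placeholder name
        props.put("bootstrap.servers", "broker1:9093");             // placeholder host:port
        // SSL client settings -- without these, a client against an
        // SSL-only listener cannot hold a stable coordinator connection
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.location", "/path/to/client.truststore.jks"); // placeholder
        props.put("ssl.truststore.password", "changeit");           // placeholder
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("security.protocol"));
    }
}
```

These keys are plain Kafka client config names, so they can be passed straight into a StreamsConfig.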

Thanks,
Henry

-- 
Henry Thacker

On 29 April 2017 at 05:31:30, Matthias J. Sax (matthias@confluent.io) wrote:

> Henry,
>
> you might want to check out the docs, which give an overview of the
> architecture:
> http://docs.confluent.io/current/streams/architecture.html#example
>
> Also, I am wondering why your application did not crash: I would expect
> an exception like
>
> java.lang.IllegalArgumentException: Assigned partition foo-2 for
> non-subscribed topic regex pattern; subscription pattern is bar
>
> Maybe you just don't hit it, because both topics have a single partition
> rather than multiple.
>
> > Out of interest though, had I subscribed for both topics in one subscriber
> > - I would have expected records for both topics interleaved
>
>
> Yes. That should happen.
>
> > why when
> > running this in two separate processes do I not observe the same?
>
>
> Not sure what you mean by this?
>
> > If I fix this by changing the application ID for each streaming process -
> > does this mean I lose the ability to share state stores between the
> > applications?
>
>
> Yes.
>
>
> If both your topics are single partitioned, and you want to share state,
> you will not be able to run with more than one thread in your Streams app.
>
> The only way to work around this would be to copy the data into another
> topic with more partitions before you process it -- of course, this
> would mean data duplication.
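> One hedged sketch of the first step of that workaround, assuming the
> standard Kafka CLI tools of that era; the topic name, partition count,
> and ZooKeeper address are illustrative, not from this thread:
>
> ```shell
> # Pre-create a wider copy of the single-partition input topic.
> # A small copier job (or the Streams app itself, via through())
> # would then write the original records into it; since the source
> # messages are keyless, the producer spreads them across partitions.
> bin/kafka-topics.sh --create --zookeeper localhost:2181 \
>   --topic input-repartitioned --partitions 4 --replication-factor 1
> ```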
>
>
> -Matthias
>
>
> On 4/28/17 12:45 PM, Henry Thacker wrote:
>
> Thanks Michael and Eno for your help - I always thought the unit of
> parallelism was a combination of topic & partition rather than just
> partition.
>
> Out of interest though, had I subscribed for both topics in one subscriber
> - I would have expected records for both topics interleaved. Why, when
> running this in two separate processes, do I not observe the same? Just
> wanting to form a mental model of how this is all working - I will try to
> look through some code over the weekend.
>
> If I fix this by changing the application ID for each streaming process -
> does this mean I lose the ability to share state stores between the
> applications?
>
> Unfortunately, the data on the input topics is provided by a third-party
> component which sends these keyless messages on a single partition per
> topic, so I have little ability to fix this at source :-(
>
> Thanks,
> Henry
>
>
