kafka-users mailing list archives

From Srikrishna Alla <allasrikrish...@gmail.com>
Subject Re: Kafka Connect Consumer reading messages from Kafka recursively
Date Tue, 03 Jan 2017 20:58:09 GMT
Thanks for your response, Ewen. I will update the producer as suggested.
Regarding the Sink Connector consumer, could it be that the connect-offsets
topic is not getting updated with the offset information for each consumer?
If so, will the connector consume the same messages again and again, and how
would I troubleshoot that? I am running a secured Kafka setup with
SASL_PLAINTEXT. Which users/groups should have write access to the default
topics? If that is not the cause, please point me in the right direction.
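
(For context: a sink connector commits its consumer offsets to the internal
__consumer_offsets topic through its consumer group, which Connect names
connect-<connector name>; the connect-offsets topic is only written by source
connectors. Below is a minimal troubleshooting sketch, assuming a broker at
broker1:9092, a worker principal User:connect, and a connector named my-sink,
all placeholders:

    # Is the sink's consumer group committing? Re-run and watch whether
    # CURRENT-OFFSET advances. Some 0.10.x versions of the tool also
    # require the --new-consumer flag.
    bin/kafka-consumer-groups.sh --bootstrap-server broker1:9092 \
      --describe --group connect-my-sink

    # Grant the worker principal access to its consumer group, assuming
    # the default SimpleAclAuthorizer backed by ZooKeeper.
    bin/kafka-acls.sh --authorizer-properties zookeeper.connect=zk1:2181 \
      --add --allow-principal User:connect \
      --operation Read --group connect-my-sink

    # Grant read/write on the Connect internal topics; repeat for
    # connect-configs and connect-status in distributed mode.
    bin/kafka-acls.sh --authorizer-properties zookeeper.connect=zk1:2181 \
      --add --allow-principal User:connect \
      --operation Read --operation Write --topic connect-offsets

If the group's offsets never advance, commits are failing, and every
rebalance will restart from the last committed position, which looks exactly
like re-reading the same messages.)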

Thanks,
Sri

On Tue, Jan 3, 2017 at 1:59 PM, Ewen Cheslack-Postava <ewen@confluent.io>
wrote:

> On Tue, Jan 3, 2017 at 8:38 AM, Srikrishna Alla <allasrikrishna1@gmail.com>
> wrote:
>
> > Hi,
> >
> > I am using Kafka/Kafka Connect to track certain events happening in my
> > application. This is how I have implemented it:
> > 1. My application opens a KafkaProducer every time this event happens
> > and writes to my topic. My application has several components running in
> > Yarn, so I did not find a way to have just one producer and reuse it.
> > Once the event has been published, the producer is closed.
> >
>
> KafkaProducer is thread safe, so you can allocate a single producer per
> process and use it every time the event occurs on any thread. Creating and
> destroying a producer for every event will be very inefficient -- not only
> are you opening new TCP connections and looking up metadata every time,
> you also don't allow the producer to get any benefit from batching, so
> every message will require its own request/response.
>
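
A minimal sketch of that shared-producer pattern (the class name, topic,
bootstrap servers, and String serializers below are illustrative
placeholders, not from this thread):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventPublisher {
        // One producer per JVM; KafkaProducer is thread safe, so every
        // component in the process can share this instance.
        private static final KafkaProducer<String, String> PRODUCER = create();

        private static KafkaProducer<String, String> create() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092"); // placeholder
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            return new KafkaProducer<>(props);
        }

        // Call from any thread when the event occurs; sends are batched
        // internally instead of paying a TCP connect plus metadata lookup
        // per event.
        public static void publish(String key, String value) {
            PRODUCER.send(new ProducerRecord<>("events-topic", key, value));
        }

        // Close once at application shutdown to flush pending batches.
        public static void shutdown() {
            PRODUCER.close();
        }
    }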
>
> > 2. I am using a Kafka Connect Sink Connector to consume from my topic,
> > write to a DB, and do other processing.
> >
> > This setup is working great as long as we have a stable number of events
> > published. The issue I am facing is when we have a huge number of events
> > (in the thousands within minutes) hitting Kafka. In this case, my Sink
> > Connector is going into a loop, reading events from Kafka recursively
> > and not stopping. What could have triggered this? Please provide your
> > valuable insights.
> >
>
> What exactly do you mean by "reading events from Kafka recursively"? Unless
> it's hitting some errors that are causing consumers to fall out of the
> group uncleanly and then rejoin later, you shouldn't be seeing duplicates.
> Is there anything from the logs that might help reveal the problem?
>
> -Ewen
>
>
> >
> > Thanks,
> > Sri
> >
>
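
One note on the rebalance theory above: a sink task that spends too long
processing each batch can miss its consumer session deadline, get evicted
from the group, then rejoin and re-fetch from the last committed offset,
which under heavy load shows up as the same events arriving over and over.
If the Connect logs point that way, one hedged lever is the worker's
consumer overrides (the property name assumes a 0.10.x-era Connect worker;
the value is illustrative, not tuned):

    # connect-distributed.properties (or connect-standalone.properties):
    # fetch fewer records per poll so each put() call returns quickly and
    # the consumer polls again before its session expires.
    consumer.max.poll.records=100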
