kafka-users mailing list archives

From Tousif <tousif.pa...@gmail.com>
Subject Re: kafka sending duplicate content to consumer
Date Fri, 23 Jan 2015 09:48:17 GMT
Thanks. I'm a newbie with regard to Kafka, and I'm using kafka-spout.

Below is the fail handler of kafka-spout. To avoid replaying, do I need to
remove the quoted snippet from the fail handler?
Also, can you point me to the official kafka-spout? Can I use the one
provided under the external folder?

if (_failHandler.shouldReplay(id)) {
    LOG.debug("kafka message id {} failed in topology, adding to buffer again", id);
    _queue.add(id);
}


@Override
public void fail(final Object o) {
    if (o instanceof KafkaMessageId) {
        final KafkaMessageId id = (KafkaMessageId) o;
        // delegate decision of replaying the message to failure policy
        if (_failHandler.shouldReplay(id)) {
            LOG.debug("kafka message id {} failed in topology, adding to buffer again", id);
            _queue.add(id);
        } else {
            LOG.debug("kafka message id {} failed in topology, delegating failure to policy", id);
            // remove message from pending; _failHandler will take action if needed
            _failHandler.fail(id, _inProgress.remove(id));
        }
    }
}
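Rather than deleting the replay branch from the spout, it may be enough to plug in a failure policy whose shouldReplay always answers false, so failed messages are never re-queued. This is only a minimal sketch: the class name NoReplayFailHandler is hypothetical, and the method signatures are inferred from the fail() method above (using Object instead of KafkaMessageId to keep the sketch self-contained), not taken from the actual kafka-spout source.

```java
// Hypothetical no-replay policy. Names and signatures are assumptions
// inferred from the fail() method quoted above, not from kafka-spout itself.
public class NoReplayFailHandler {

    // Never replay: every failed message is dropped instead of re-queued.
    public boolean shouldReplay(Object id) {
        return false;
    }

    // Called for messages that will not be replayed; here we simply drop
    // the message. An alternative is writing it to a dead-letter topic.
    public void fail(Object id, Object message) {
        // intentionally empty
    }
}
```

With a policy like this, the `if (_failHandler.shouldReplay(id))` branch in fail() is never taken, so nothing is added back to the queue and the spout code itself stays untouched.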
On Fri, Jan 23, 2015 at 2:44 PM, svante karlsson <saka@csi.se> wrote:

> A Kafka broker never pushes data to a consumer. It's the consumer that does
> a long fetch, and it provides the offset to read from.
>
> The problem lies in how your consumer handles the, for example, 1000 messages
> that it just got. If you handle 500 of them and crash without committing
> the offsets somewhere (either to Kafka or in some other system), then when
> you restart you will start your fetch again from the last committed offset.
> Kafka has no notion of an already consumed message.
>
>
>
> 2015-01-23 7:54 GMT+01:00 Tousif <tousif.pasha@gmail.com>:
>
> > Hi,
> >
> > I want to know in which situations Kafka sends the same event multiple
> > times to a consumer. Is there a consumer-side configuration to tell Kafka
> > to send it only once and stop retries?
> >
> > --
> >
> >
> > Regards
> > Tousif Khazi
> >
>
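The at-least-once behaviour svante describes can be simulated without a broker. The sketch below is plain Java with no Kafka client at all; the log and offset are just local variables standing in for a partition and its committed offset, to show why crashing before the commit makes the same messages arrive twice.

```java
import java.util.Arrays;
import java.util.List;

public class OffsetDemo {
    // Last committed position in the partition (stands in for Kafka's
    // committed offset; nothing here talks to a real broker).
    static long committedOffset = 0;

    // A fetch always starts from the committed offset; the broker never
    // "pushes" and never tracks which messages were actually processed.
    static List<String> fetch(List<String> log) {
        return log.subList((int) committedOffset, log.size());
    }

    public static void main(String[] args) {
        List<String> log = Arrays.asList("m0", "m1", "m2", "m3");

        // First run: fetch everything, process m0 and m1, then "crash"
        // before committedOffset is updated.
        System.out.println("first fetch:  " + fetch(log));

        // Restart: committedOffset is still 0, so the fetch returns the
        // whole batch again and m0/m1 are delivered a second time.
        System.out.println("second fetch: " + fetch(log));
    }
}
```

Committing the offset only after processing gives duplicates on crash (at-least-once); committing before processing gives lost messages instead (at-most-once). That choice belongs to the consumer, which is why there is no broker-side "send only once" switch.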



-- 


Regards
Tousif Khazi
