kafka-users mailing list archives

From svante karlsson <s...@csi.se>
Subject Re: kafka sending duplicate content to consumer
Date Fri, 23 Jan 2015 09:14:27 GMT
A Kafka broker never pushes data to a consumer. It's the consumer that does
a long fetch, and it provides the offset to read from.

The problem lies in how your consumer handles the, say, 1000 messages it
just fetched. If you process 500 of them and crash without committing the
offsets somewhere (either to Kafka or to some other system), then when you
restart you will begin fetching again from the last committed offset.
Kafka has no notion of an already consumed message.
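A minimal sketch of why this produces duplicates — plain Python with no Kafka client, so all names here (the `fetch` helper, the offset variable) are hypothetical stand-ins for the consumer-driven fetch-and-commit cycle described above:

```python
# Simulation of at-least-once delivery: a "broker" log, a consumer that
# commits its offset only after processing a whole batch, and a crash
# mid-batch. No real Kafka client is used; names are illustrative only.

log = [f"msg-{i}" for i in range(1000)]  # the partition's message log
committed_offset = 0                      # last offset the consumer committed
processed = []                            # what the consumer actually handled

def fetch(offset, max_messages=1000):
    """Consumer-driven fetch: the broker never pushes, it serves from an offset."""
    return log[offset:offset + max_messages]

# First run: process 500 messages, then crash BEFORE committing.
batch = fetch(committed_offset)
for i, msg in enumerate(batch):
    if i == 500:
        break  # simulated crash: the offset was never committed
    processed.append(msg)

# Restart: the fetch resumes from the last *committed* offset, still 0,
# so the first 500 messages are delivered and processed a second time.
batch = fetch(committed_offset)
for msg in batch:
    processed.append(msg)
committed_offset += len(batch)  # commit only after the whole batch succeeds

duplicates = len(processed) - len(set(processed))
print(duplicates)  # 500 messages were seen twice
```

Committing more often (e.g. per message) shrinks the window of redelivery but never eliminates it; exactly-once behavior requires the consumer itself to deduplicate or to store offsets atomically with its processing results.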

2015-01-23 7:54 GMT+01:00 Tousif <tousif.pasha@gmail.com>:

> Hi,
> i want to know in which situations kafka sends the same event multiple
> times to a consumer. Is there a consumer-side configuration to tell kafka
> to send it only once and stop retries?
> --
> Regards
> Tousif Khazi
