kafka-users mailing list archives

From Jun Rao <jun...@gmail.com>
Subject Re: Error Handling and Acknowledgements in Kafka Consumers
Date Fri, 14 Sep 2012 04:32:32 GMT
In this case, you have to wait until 9 is processed before you commit the offset.
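Because a committed offset marks every earlier message in the partition as consumed, the consumer can only advance the commit point past a contiguous run of processed messages. A minimal sketch of that rule (illustrative helper names, not part of any Kafka API):

```python
def highest_committable(last_committed, processed):
    """Return the highest offset that can safely be committed.

    A Kafka offset commit marks *all* earlier messages as consumed,
    so we may only advance across a contiguous run of finished
    messages. `processed` is the set of offsets whose handling has
    completed; `last_committed` is the last offset already committed.
    """
    offset = last_committed
    while offset + 1 in processed:
        offset += 1
    return offset


# Batch 1-10: everything except #9 is done, so the commit point
# stops at 8 -- messages 9 and 10 stay uncommitted for now.
done = {1, 2, 3, 4, 5, 6, 7, 8, 10}
print(highest_committable(0, done))  # -> 8

# Once #9 finally finishes, the commit point jumps past #10 too.
done.add(9)
print(highest_committable(0, done))  # -> 10
```

This is why a slow message #9 holds back the commit even though #10 is already done: committing 10 would silently mark 9 as consumed as well.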



On Thu, Sep 13, 2012 at 8:52 AM, Vito Sabella <vsabella@outlook.com> wrote:

> Kafka-users,
> I'm looking to use kafka in a pub-sub model where a consumer reads from
> Kafka and does some processing on the message.  How would you recommend
> committing to Zookeeper / setting the last-consumed message position when
> processing one of the messages in the pipe is less reliable than the
> others?
> Let's say I read a batch of 10 messages (1-10) and I successfully process
> messages 1-8 and 10 quickly, but message #9 is taking an inordinately long
> time to process. I don't want to write the message as consumed against
> Zookeeper but I also don't want to block forward progress of the pipeline
> for that topic/partition.
> In the edge case, let's say processing every message succeeded except
> message #9, which failed/timed out completely, but a reattempt at
> processing it would not fail again (web-service call, strange network
> condition). In the timeout/failure case, would the suggestion be to
> re-queue the message?
> Does anyone have any recommendations for the above scenarios?
> Thanks,
> Vito
