kafka-users mailing list archives

From "Arunkumar Srambikkal (asrambik)" <asram...@cisco.com>
Subject RE: Kafka producer failed to send but actually does
Date Wed, 04 Mar 2015 11:45:25 GMT
Thanks for responding. 

I was creating an instance of kafka.server.KafkaServer in my code to run some tests; that is
what I referred to as an embedded broker.

The scenario you described is exactly what was happening. In my case, when I kill my broker, it
fails to send an ack. I added handling of duplicates, which resolved the issue.
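The duplicate handling amounts to a consumer-side filter keyed on a producer-assigned message
id (a minimal sketch, not the actual test code; `DedupConsumer` and the id scheme are
illustrative assumptions):

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical consumer-side de-duplication: remember the ids of
// messages already processed, so a message resent after a lost ack
// is dropped instead of being handled twice. Any unique id embedded
// in the message by the producer would work here.
public class DedupConsumer {
    private final Set<String> seenIds = new HashSet<>();

    // Returns true if the message is new and should be processed,
    // false if it is a duplicate of one already handled.
    public boolean shouldProcess(String messageId) {
        // Set.add returns false when the id was already present.
        return seenIds.add(messageId);
    }
}
```

In a real deployment the seen-id set would need to be bounded (e.g. a TTL or LRU eviction) so
it does not grow without limit.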

Thanks
Arun

-----Original Message-----
From: Jiangjie Qin [mailto:jqin@linkedin.com.INVALID] 
Sent: Tuesday, March 03, 2015 11:13 PM
To: users@kafka.apache.org
Subject: Re: Kafka producer failed to send but actually does

What do you mean by Kafka embedded broker?
Anyway, this can happen. For example, the producer sends a message to the broker.
Then a network issue occurs and the producer does not get the confirmation from the broker,
so the producer thinks the send failed, even though the broker actually got the message. The
producer is expected to resend the message, so the broker will end up with duplicate messages.
That's also why we say Kafka guarantees at-least-once delivery.
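That failure mode can be played out in a toy simulation (illustrative only; `send` stands in
for a real produce request, and the lost ack is simulated with a flag):

```java
import java.util.ArrayList;
import java.util.List;

// Toy simulation of the at-least-once failure mode: the broker stores
// the message, but the ack is lost in transit, so the producer treats
// the send as failed, retries, and the broker ends up with a duplicate.
public class AckLossDemo {
    static List<String> brokerLog = new ArrayList<>();

    // Returns whether the producer saw an ack for this send.
    static boolean send(String msg, boolean ackLost) {
        brokerLog.add(msg);   // broker always stores the message
        return !ackLost;      // but the ack may never reach the producer
    }

    public static void main(String[] args) {
        boolean acked = send("m-1", true);   // ack lost: looks like a failure
        if (!acked) {
            send("m-1", false);              // producer retries
        }
        System.out.println(brokerLog);       // broker now holds m-1 twice
    }
}
```

The producer cannot distinguish "message lost" from "ack lost", so retrying is the safe choice
and duplicates are the price, which is why de-duplication belongs on the consumer side.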

-Jiangjie (Becket) Qin

On 3/3/15, 4:01 AM, "Arunkumar Srambikkal (asrambik)" <asrambik@cisco.com>
wrote:

>Hi,
>
>I'm running some tests with the Kafka embedded broker and I see cases 
>where the producer gets the FailedToSendMessageException but in reality 
>the message is transferred and the consumer gets it.
>
>Is this an expected / known issue?
>
>Thanks
>Arun
>
>My producer config =
>
>    props.put("producer.type",         "sync");
>    props.put("serializer.class",      "kafka.serializer.StringEncoder");
>    props.put("partitioner.class",     "com.test.PartMe");
>    props.put("metadata.broker.list",  "127.0.0.1:"+port);
>    props.put("request.required.acks", "-1");
>    props.put("message.send.max.retries", "0");
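
Note on the config above (a hedged reading, assuming the 0.8.x-era Scala producer these
property names belong to): `request.required.acks=-1` makes the broker wait for the full ISR
before acking, and `message.send.max.retries=0` means a single lost ack is reported as
FailedToSendMessageException with no retry, even though the broker may already have committed
the message. Allowing retries masks lost acks at the cost of possible duplicates:

```java
// Illustrative fragment, not a recommendation:
props.put("request.required.acks", "-1");   // wait for all in-sync replicas
props.put("message.send.max.retries", "3"); // retry on lost acks; duplicates
                                            // become possible, so de-duplicate
                                            // on the consumer side
```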

