kafka-users mailing list archives

From christopher palm <cpa...@gmail.com>
Subject KafkaProducer Retries in 0.9.0.1
Date Wed, 06 Apr 2016 04:23:25 GMT
Hi All,

I am working with the KafkaProducer on Kafka 0.9.0.1, using the properties
below, so that the producer keeps trying to send upon failure.
I am forcing a failure by setting my buffer size smaller than my
payload, which causes the expected exception below.

I don't see the producer retry the send after this failure.

Am I missing something in the configuration that would allow the producer to
retry failed sends?

Thanks,
Chris

java.util.concurrent.ExecutionException:
org.apache.kafka.common.errors.RecordTooLargeException: The message is 8027
bytes when serialized which is larger than the total memory buffer you have
configured with the buffer.memory configuration.
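
For reference, this failure surfaces through the returned Future and the send
callback. Below is a sketch (the producer and record variables are assumed to
be built from the configuration that follows) that separates retriable from
non-retriable errors in the callback. RecordTooLargeException does not extend
RetriableException, and the retries setting only re-sends errors Kafka
classifies as retriable, which would explain why no retry is observed:

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.errors.RetriableException;

// Sketch: producer and record are assumed, built from the props below.
producer.send(record, new Callback() {
    @Override
    public void onCompletion(RecordMetadata metadata, Exception exception) {
        if (exception == null) {
            System.out.println("sent to partition " + metadata.partition()
                    + " at offset " + metadata.offset());
        } else if (exception instanceof RetriableException) {
            // Transient broker-side error; this is the class of failure
            // that the retries setting re-sends.
            System.err.println("retriable failure: " + exception);
        } else {
            // Non-retriable, e.g. RecordTooLargeException; no retry is attempted.
            System.err.println("non-retriable failure: " + exception);
        }
    }
});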

 props.put("bootstrap.servers", bootStrapServers);

props.put("acks", "all");

props.put("retries", 3);//Try for 3 strikes

props.put("batch.size", batchSize);//Need to see if this number should
increase under load

props.put("linger.ms", 1);//After 1 ms fire the batch even if the batch
isn't full.

props.put("buffer.memory", buffMemorySize);

props.put("max.block.ms",500);

props.put("max.in.flight.requests.per.connection", 1);

props.put("key.serializer",
"org.apache.kafka.common.serialization.StringSerializer");

props.put("value.serializer",
"org.apache.kafka.common.serialization.ByteArraySerializer");
