kafka-users mailing list archives

From l vic <lvic4...@gmail.com>
Subject RecordTooLargeException on 16M messages in Kafka?
Date Thu, 15 Aug 2019 01:23:20 GMT
My Kafka (1.0.0) producer errors out on large (16 MB) messages.
ERROR Error when sending message to topic test with key: null, value: 16777239 bytes
with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)

org.apache.kafka.common.errors.RecordTooLargeException: The message is 16777327 bytes
when serialized which is larger than the maximum request size you have configured
with the max.request.size configuration.
I found a couple of links describing the solution, for example:
https://stackoverflow.com/questions/21020347/how-can-i-send-large-messages-with-kafka-over-15mb

In my server.properties on the brokers I set:
socket.request.max.bytes=104857600
message.max.bytes=18874368
max.request.size=18874368
replica.fetch.max.bytes=18874368
fetch.message.max.bytes=18874368
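
To double-check that the topic itself isn't capping the size (a topic-level
max.message.bytes would override the broker's message.max.bytes), I was also
going to read the topic config back with the AdminClient, roughly like this
(just a sketch; the broker address and topic name are placeholders for my setup):

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.Config;
import org.apache.kafka.common.config.ConfigResource;

public class CheckTopicConfig {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

        AdminClient admin = AdminClient.create(props);
        try {
            ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "test");
            Config config = admin.describeConfigs(Collections.singleton(topic))
                    .all().get().get(topic);
            // topic-level max.message.bytes overrides the broker's message.max.bytes
            System.out.println("max.message.bytes = " + config.get("max.message.bytes").value());
        } finally {
            admin.close();
        }
    }
}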

Then in my producer.properties I tried to set:
max.request.size=18874368

But no matter how large I set max.request.size, I still have the same
problem... Are there other settings I am missing?
Can it be solved in configuration alone, or do I need to make code changes?
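
If it does take code changes, I assume it would be something like setting
max.request.size directly on the producer's Properties in the application,
instead of relying on producer.properties being picked up. A rough sketch of
what I have in mind (the broker address, serializers, and payload are
stand-ins for what I actually use):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;

public class LargeMessageProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
        // allow requests up to 18M, matching message.max.bytes on the brokers
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "18874368");
        // buffer.memory must also be large enough to hold a single record
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, "33554432");

        KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(props);
        byte[] payload = new byte[16 * 1024 * 1024]; // ~16M test payload
        producer.send(new ProducerRecord<>("test", payload), (metadata, exception) -> {
            if (exception != null) {
                exception.printStackTrace();
            } else {
                System.out.println("sent to " + metadata.topic() + "-" + metadata.partition());
            }
        });
        producer.flush();
        producer.close();
    }
}

I haven't tried this yet; I want to confirm the direction before changing
application code.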
Thank you,
