kafka-users mailing list archives

From Shantanu Deshmukh <shantanu...@gmail.com>
Subject Re: How to handle kafka large messages
Date Tue, 09 Apr 2019 11:27:45 GMT
Well,
from your own synopsis it is clear that the message you want to send is much
larger than the max.message.bytes setting on the broker. You can raise that
limit. However, keep in mind that if you find yourself constantly increasing
it, you should look at the message itself. Does it really need to be that
large? Large messages can stress out the cluster.
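For completeness, raising the limit to 5 MB usually means touching more than one setting: `max.message.bytes` on the topic (or `message.max.bytes` broker-wide), `replica.fetch.max.bytes` on the brokers so replicas can still fetch those records, and `max.request.size` on the producer. A sketch using the stock CLI tools; the topic name `telemetry` and the ZooKeeper address are placeholders:

```shell
# Raise the per-topic limit to 5 MB (5242880 bytes).
# On Kafka 0.11, topic configs are altered via ZooKeeper.
kafka-configs.sh --zookeeper localhost:2181 \
  --entity-type topics --entity-name telemetry \
  --alter --add-config max.message.bytes=5242880

# Brokers must also be able to replicate messages of that size;
# set this in server.properties on every broker and restart:
#   replica.fetch.max.bytes=5242880

# The producer has its own cap, set in its client properties:
#   max.request.size=5242880

# Consumers should be able to fetch at least one full message:
#   max.partition.fetch.bytes=5242880
```

Note that the broker checks the size of the record batch as it arrives, so a compressed payload may fit even when the uncompressed data exceeds the limit.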

On Tue, Apr 9, 2019 at 4:50 PM Rammohan Vanteru <ramz.mohan88@gmail.com>
wrote:

> Hi Users,
>
> Let me know if anyone has faced this issue.
>
> I have gone through multiple articles, but they give different answers. I
> just want to check with kafka users.
>
> Below are the settings I have on the kafka cluster. What tuning
> parameters will overcome this large-message-size issue?
>
>
> Kafka version: 0.11
> Number of nodes in a kafka cluster: 3 nodes
> Number topic and partitions: 1topic and 10 partitions.
> Message size: up to 5 MB
> max.message.bytes on the topic: 2 MB
>
> Error message:
>
> 2019-04-09 00:00:02.469 ERROR 35301 --- [ad | producer-1]
> c.b.a.s.p.KafkaTelemetryConsumer : Failed to send TelemetryHarvesterServer
> with data size 1090517 to kafka.
>
> org.springframework.kafka.core.KafkaProducerException: Failed to send;
> nested exception is org.apache.kafka.common.errors.RecordTooLargeException:
> The request included a message larger than the max message size the server
> will accept.
>
>
