kafka-users mailing list archives

From Joe Lawson <jlaw...@opensourceconnections.com>
Subject Re: kafka “stops working” after a large message is enqueued
Date Wed, 03 Feb 2016 03:35:12 GMT
Make sure the topic is created after message.max.bytes is set.
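For example, the topic-level override (max.message.bytes) can be set explicitly at creation time, or added to an existing topic, so it does not depend on when the broker-level message.max.bytes was applied. A sketch using the 0.9 CLI tools, with a hypothetical topic name and a local zookeeper:

```shell
# Hypothetical topic name; 200000000 matches the broker's
# message.max.bytes quoted below. Create the topic with an
# explicit topic-level size limit:
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --topic large-messages --partitions 1 --replication-factor 1 \
  --config max.message.bytes=200000000

# For a topic that already exists, add the override after the fact:
bin/kafka-configs.sh --zookeeper localhost:2181 --alter \
  --entity-type topics --entity-name large-messages \
  --add-config max.message.bytes=200000000
```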
On Feb 2, 2016 9:04 PM, "Tech Bolek" <techy_bolek@yahoo.com.invalid> wrote:

> I'm running kafka_2.11-0.9.0.0 and a java-based producer/consumer. With
> messages ~70 KB everything works fine. However, after the producer enqueues
> a larger, 70 MB message, kafka appears to stop delivering messages to the
> consumer. That is, not only is the large message not delivered, but neither
> are any subsequent smaller messages. I know the producer succeeds because I use
> kafka callback for the confirmation and I can see the messages in the kafka
> message log.
> kafka config custom changes:
>     message.max.bytes=200000000
>     replica.fetch.max.bytes=200000000
> consumer config:
>     props.put("fetch.message.max.bytes",   "200000000");
>     props.put("max.partition.fetch.bytes", "200000000");
>
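For reference, the consumer settings quoted above can be assembled like this; the bootstrap address and group id are hypothetical placeholders, not taken from the original post:

```java
import java.util.Properties;

public class LargeMessageConsumerConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Hypothetical connection details for illustration only.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "large-message-test");
        // Both fetch limits must be at least as large as the biggest
        // message on the topic, matching the broker's message.max.bytes.
        props.put("fetch.message.max.bytes",   "200000000");
        props.put("max.partition.fetch.bytes", "200000000");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("max.partition.fetch.bytes"));
    }
}
```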
