kafka-users mailing list archives

From Upendra Yadav <upendra1024@gmail.com>
Subject Re: best config for kafka 10.0.0.1 consumer.assign.
Date Wed, 27 Nov 2019 17:08:03 GMT
I have added this to my consumer config, and now it works fine:
receive.buffer.bytes=1048576
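
For anyone finding this thread later, here is a minimal, untested sketch of the full working consumer with the fix applied, assuming the old poll(long) API; the bootstrap address, topic name, and partition number are placeholders, not values from this thread:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class CrossDcConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "remote-dc-broker:9092"); // placeholder address
        props.put("enable.auto.commit", "false");
        props.put("max.poll.records", "4000");
        props.put("max.partition.fetch.bytes", "4096000");
        // The fix from this thread: enlarge the TCP receive buffer for the
        // high-latency cross-DC link (the consumer default is 65536 bytes).
        props.put("receive.buffer.bytes", "1048576");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props);
        // assign() pins the consumer to one partition; there is no group
        // coordination, so session.timeout.ms and heartbeat.interval.ms
        // do not apply to this setup.
        consumer.assign(Collections.singletonList(
                new TopicPartition("my-topic", 0))); // placeholder topic/partition
        while (true) {
            // 15000 ms poll timeout, as in the original post. Offsets are not
            // committed here; the original setup disables auto-commit.
            ConsumerRecords<byte[], byte[]> records = consumer.poll(15000);
            System.out.printf("polled %d records%n", records.count());
        }
    }
}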



On Wed, Nov 13, 2019 at 10:41 AM Upendra Yadav <upendra1024@gmail.com>
wrote:

> Hi,
>
> I'm using the consumer assign() method with a 15000 ms poll timeout to
> consume single-partition data from another DC.
>
> Below are my consumer configs:
> enable.auto.commit=false
> max.poll.records=4000
> max.partition.fetch.bytes=4096000
> key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
> value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
>
> With this, my consumer works fine. But when I change
> max.partition.fetch.bytes to 16384000, my consumer does not receive any
> messages, and there is no exception. Since I'm using consumer assign(), do
> I need to tune the properties below?
> fetch.max.bytes
> session.timeout.ms
> heartbeat.interval.ms
> Please let me know if I'm missing something.
>
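
One plausible explanation for the symptom, with illustrative numbers: a single TCP connection is roughly limited to window size / round-trip time. With the consumer default receive.buffer.bytes=65536 and, say, a 100 ms cross-DC RTT, that caps the transfer at about 65536 / 0.1 s ≈ 640 KB/s, so a 16384000-byte fetch would need 25+ seconds and can outlast a 15000 ms poll timeout; at 1048576 bytes the ceiling rises to about 10 MB/s, which fits the fix above. As for the listed properties: with assign() there is no group coordination, so session.timeout.ms and heartbeat.interval.ms do not come into play, and fetch.max.bytes (which bounds the whole fetch response) only exists from Kafka 0.10.1 onward.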
