kafka-users mailing list archives

From Upendra Yadav <upendra1...@gmail.com>
Subject best config for kafka 10.0.0.1 consumer.assign.
Date Wed, 13 Nov 2019 05:11:24 GMT
Hi,

I'm using the consumer assign() method with a 15000 ms poll timeout to
consume single-partition data from another DC.
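For context, here is a minimal sketch of the assign()-based setup described above. The topic name, partition number, and bootstrap address are placeholders, not from the original message, and the code assumes the kafka-clients dependency is on the classpath.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class AssignPollSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "remote-dc-broker:9092"); // placeholder
        props.put("enable.auto.commit", "false");
        props.put("max.poll.records", "4000");
        props.put("max.partition.fetch.bytes", "4096000");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            // assign() pins the consumer to one partition; no consumer-group
            // rebalancing is involved, so rebalance-related timeouts don't apply.
            consumer.assign(Collections.singletonList(
                    new TopicPartition("my-topic", 0))); // placeholder topic/partition

            while (true) {
                // Block for up to 15000 ms waiting for records.
                ConsumerRecords<byte[], byte[]> records = consumer.poll(15000);
                for (ConsumerRecord<byte[], byte[]> record : records) {
                    // process record...
                }
                consumer.commitSync(); // manual commit since auto-commit is off
            }
        }
    }
}
```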

Below are my consumer configs:
enable.auto.commit=false
max.poll.records=4000
max.partition.fetch.bytes=4096000
key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer

With this configuration my consumer works fine, but when I change
max.partition.fetch.bytes to 16384000, my consumer stops receiving
messages. There is no exception. Since I'm using consumer assign, do I
need to tune the properties below?
fetch.max.bytes
session.timeout.ms
heartbeat.interval.ms
Please let me know if I'm missing something.
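For reference, a sketch of how the fetch-size settings relate; the defaults shown come from the Kafka consumer configuration docs and may differ by client version, so treat this as a checklist rather than a verified fix:

```
# Per-partition cap on fetched data (default 1048576).
max.partition.fetch.bytes=16384000
# Whole-response cap across all partitions (default 52428800);
# it should be >= max.partition.fetch.bytes.
fetch.max.bytes=52428800
# Broker-side limits (message.max.bytes, replica.fetch.max.bytes)
# must also allow messages of the expected size.
```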
