kafka-users mailing list archives

From l vic <lvic4...@gmail.com>
Subject OOM for large messages with compression?
Date Wed, 21 Aug 2019 13:51:42 GMT
I have to deal with large (16 MB) text messages in my Kafka system, so I
increased several message limit settings on the broker/producer/consumer side,
and now the system is able to get them through. I also tried to enable
compression in the producer:
compression.type=gzip
but to my surprise I ended up with OOM exceptions on the producer side:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
	at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)
	at java.lang.StringCoding.encode(StringCoding.java:344)
	at java.lang.String.getBytes(String.java:918)
	at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:43)
	at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:24)
	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:326)
	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:248)
Shouldn't I be able to save memory with compression? Why does it have the
opposite effect?
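For context, the stack trace above points at serialization, not compression: the producer must first materialize the full UTF-8 byte[] of the message in heap (String.getBytes), and gzip is applied only afterwards, when the batch is built. A minimal JDK-only sketch of that order of operations follows; the payload content and repeat count are illustrative assumptions, not the actual message:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class CompressAfterSerialize {

    // Mimics the producer pipeline's second step: gzip the already-serialized bytes.
    public static byte[] gzip(byte[] serialized) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(serialized);
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical ~16 MB repetitive payload standing in for the large text message.
        String message = "some repetitive log line\n".repeat(650_000);

        // Step 1: StringSerializer effectively calls getBytes(UTF_8). This
        // allocates the full uncompressed byte[] on the heap (on top of the
        // String's own storage), regardless of any compression.type setting.
        byte[] serialized = message.getBytes(StandardCharsets.UTF_8);

        // Step 2: only now does gzip run, shrinking the wire payload -- too
        // late to reduce the serialization-time heap footprint.
        byte[] compressed = gzip(serialized);

        System.out.println("uncompressed bytes: " + serialized.length);
        System.out.println("compressed bytes:   " + compressed.length);
    }
}
```

So compression shrinks what goes over the network and into the batch buffers, but the uncompressed serialized form still has to exist in heap first, which is where this OOM fires.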
