kafka-users mailing list archives

From Liam Clarke <liam.cla...@adscale.co.nz>
Subject Re: OOM for large messages with compression?
Date Wed, 21 Aug 2019 23:12:38 GMT
Hi l vic,

Your OOM is happening before any compression is applied. It occurs while
the StringSerializer is converting your string to bytes. Looking deeper
into StringCoding.encode, it first allocates a byte array big enough to
hold the encoded string, and that allocation is where your OOM occurs:
line 300 of StringCoding.java is  byte[] ba = new byte[en];

Compression is applied only after the string has been serialized to bytes,
so you'll need to increase the producer's heap size to hold the uncompressed
message.
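The ordering can be sketched with a minimal standalone demo (this is an
illustration of the allocation order, not Kafka's actual internals; the real
producer gzips whole record batches, but the uncompressed byte array is
allocated first either way):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class SerializeThenCompress {
    public static void main(String[] args) throws IOException {
        // A large, repetitive string standing in for a big text message.
        String message = "payload ".repeat(1_000_000); // ~8 MB of text

        // Step 1: what StringSerializer does internally. The full
        // uncompressed byte array is allocated here, on the producer heap.
        // This is the allocation that throws OutOfMemoryError if the
        // heap is too small.
        byte[] raw = message.getBytes(StandardCharsets.UTF_8);

        // Step 2: only now is gzip applied, to the already-serialized bytes.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(raw);
        }
        byte[] compressed = buf.toByteArray();

        System.out.println("raw bytes:        " + raw.length);
        System.out.println("compressed bytes: " + compressed.length);
    }
}
```

Compression does shrink what goes over the wire, but it can never shrink
the heap needed for step 1.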

Hope that helps :)

Liam Clarke

On Thu, Aug 22, 2019 at 1:52 AM l vic <lvic4594@gmail.com> wrote:

> I have to deal with large (16 MB) text messages in my Kafka system, so I
> increased several message limit settings on the broker/producer/consumer
> side, and now the system is able to get them through. I also tried to
> enable compression in the producer:
> "compression.type" = "gzip"
> but to my surprise ended up with OOM exceptions on the producer side:
> Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>     at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)
>     at java.lang.StringCoding.encode(StringCoding.java:344)
>     at java.lang.String.getBytes(String.java:918)
>     at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:43)
>     at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:24)
>     at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:326)
>     at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:248)
> Shouldn't I be able to save memory with compression? Why does the
> compression have the opposite effect?
