How to consume large records

I have to deal with large (16M) text messages in my Kafka system, so I increased several message limit settings on the broker, producer, and consumer side, and now the system is able to get them through. I also tried to enable compression on the producer ("compression.type" = "gzip"), but to my surprise I ended up with OOM exceptions on the producer side:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
  at java.lang.StringCoding$StringEncoder.encode(
  at java.lang.StringCoding.encode(
  at java.lang.String.getBytes(
  at org.apache.kafka.common.serialization.StringSerializer.serialize(
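For reference, the "message limit settings" mentioned above usually map to a handful of size properties on each side of the pipeline. Here is a minimal sketch of the producer-side portion, assuming a ~20 MB ceiling to leave headroom above the 16M payload; the property names are standard Kafka configs, but the 20 MB value and the broker address are placeholders, not recommendations:

```java
import java.util.Properties;

// Size-related settings for large records (values are assumptions).
// Producer:  max.request.size
// Broker:    message.max.bytes, replica.fetch.max.bytes
// Consumer:  max.partition.fetch.bytes, fetch.max.bytes
public class LargeMessageConfig {
    static final int LIMIT = 20 * 1024 * 1024; // 20 MB ceiling (assumption)

    public static Properties producerProps() {
        Properties p = new Properties();
        p.setProperty("bootstrap.servers", "localhost:9092"); // hypothetical broker
        p.setProperty("max.request.size", String.valueOf(LIMIT));
        // buffer.memory must also be able to hold the in-flight batches
        p.setProperty("buffer.memory", String.valueOf(64L * 1024 * 1024));
        return p;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("max.request.size"));
    }
}
```

Raising only one side is a common trap: the broker's message.max.bytes and the consumer's fetch limits must all accommodate the largest record, or it will be rejected or silently stuck.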

Looking deeper into StringCoding.encode: it first allocates a byte array big enough to fit your whole string, and this is where your OOM is occurring, at line 300: byte[] ba = new byte[en];
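A small-scale sketch of that allocation behaviour (1M characters rather than 16M, so it runs comfortably on a default heap): String.getBytes allocates a brand-new byte array sized to the encoded output, so while the serializer runs, a large message transiently occupies roughly double its footprint on the heap.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class EncodeAllocation {
    // Returns the size of the fresh byte[] that encoding allocates.
    public static int encodedSize(String s) {
        return s.getBytes(StandardCharsets.UTF_8).length; // new byte[] each call
    }

    public static void main(String[] args) {
        char[] chars = new char[1024 * 1024]; // 1M chars; the question's case is 16x this
        Arrays.fill(chars, 'x');
        String big = new String(chars);
        // At this point the heap holds the string's character data AND,
        // transiently, a same-sized byte[] produced by the encoder.
        System.out.println(encodedSize(big));
    }
}
```

Note that the byte array is allocated whether or not compression is enabled; gzip just adds its own buffers on top, which is plausibly why the OOM only surfaced after turning compression on.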

==> Increase the producer's heap size: during serialization the JVM must hold the original string, its freshly encoded byte[], and the compression buffers all at once.
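The simplest fix is to launch the producer with a bigger -Xmx (e.g. java -Xmx2g). As a sketch of how to reason about the number, here is a hypothetical helper; the 3x rule of thumb (string + encoded bytes + compression buffers) is an assumption for illustration, not a Kafka guarantee:

```java
public class HeapCheck {
    // Rule of thumb (assumption): serializing one record transiently needs
    // roughly 3x its size: the String, the encoded byte[], and gzip buffers.
    public static boolean heapFitsRecord(long recordBytes, long maxHeapBytes) {
        return maxHeapBytes >= 3 * recordBytes;
    }

    public static void main(String[] args) {
        long record = 16L * 1024 * 1024; // the 16M records from the question
        // Runtime.maxMemory() reflects the -Xmx the JVM was started with.
        System.out.println(heapFitsRecord(record, Runtime.getRuntime().maxMemory()));
    }
}
```

A check like this can fail fast at startup instead of dying mid-batch with an OutOfMemoryError.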

© Copyright 2018-2022 · Stéphane Derosiaux · All Rights Reserved.