kafka-users mailing list archives

From Bart Vercammen <b...@cloutrix.com>
Subject [Kafka Consumer] deserializer (documentation) mismatch?
Date Mon, 15 Oct 2018 13:38:06 GMT

I found a mismatch between the documentation in
the org.apache.kafka.common.serialization.Deserializer and the
implementation in KafkaConsumer.

The Deserializer documentation says: "serialized bytes; may be null;
implementations are recommended to handle null by returning a value or null
rather than throwing an exception"
but in the KafkaConsumer, 'null' is never passed to the deserializer.

From 'parseRecord' in 'org.apache.kafka.clients.consumer.internals.Fetcher':

            ByteBuffer keyBytes = record.key();
            byte[] keyByteArray = keyBytes == null ? null : Utils.toArray(keyBytes);
            K key = keyBytes == null ? null :
                this.keyDeserializer.deserialize(partition.topic(), headers, keyByteArray);
            ByteBuffer valueBytes = record.value();
            byte[] valueByteArray = valueBytes == null ? null : Utils.toArray(valueBytes);
            V value = valueBytes == null ? null :
                this.valueDeserializer.deserialize(partition.topic(), headers, valueByteArray);
I stumbled upon this discrepancy while trying to pass a valid object from
the deserializer to the application when a 'delete' was received on a
log-compacted topic.
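To illustrate the use case, here is a minimal sketch of the kind of value
deserializer I had in mind: it maps a null payload (a tombstone on a
log-compacted topic) to a real application-level object instead of null. All
class names here are hypothetical, and the Deserializer interface is inlined
so the sketch compiles standalone (the real one is
org.apache.kafka.common.serialization.Deserializer). Because the Fetcher
short-circuits on null, the tombstone branch is currently unreachable via
KafkaConsumer:

```java
import java.nio.charset.StandardCharsets;

// Minimal stand-in for org.apache.kafka.common.serialization.Deserializer<T>,
// inlined so the sketch is self-contained.
interface Deserializer<T> {
    T deserialize(String topic, byte[] data);
}

// Wrapper that distinguishes a real payload from a tombstone
// (the 'delete' marker on a log-compacted topic). Hypothetical name.
final class MaybeValue {
    final String payload;     // null iff this is a tombstone
    final boolean tombstone;

    private MaybeValue(String payload, boolean tombstone) {
        this.payload = payload;
        this.tombstone = tombstone;
    }
    static MaybeValue of(String payload) { return new MaybeValue(payload, false); }
    static MaybeValue tombstone()        { return new MaybeValue(null, true); }
}

// Handles null input as the Deserializer javadoc recommends -- but since the
// Fetcher never calls deserialize(..., null), the null branch below is dead
// code when running under KafkaConsumer.
class TombstoneAwareDeserializer implements Deserializer<MaybeValue> {
    @Override
    public MaybeValue deserialize(String topic, byte[] data) {
        if (data == null)
            return MaybeValue.tombstone();  // never reached via KafkaConsumer
        return MaybeValue.of(new String(data, StandardCharsets.UTF_8));
    }
}

public class Demo {
    public static void main(String[] args) {
        Deserializer<MaybeValue> d = new TombstoneAwareDeserializer();
        System.out.println(d.deserialize("t", "hello".getBytes(StandardCharsets.UTF_8)).payload);
        System.out.println(d.deserialize("t", null).tombstone);
    }
}
```

With the current Fetcher behaviour, the application instead has to check the
consumed record's value for null itself, outside the deserializer.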
So basically my question is the following: is the documentation in the
Deserializer wrong, or is the implementation in the Fetcher wrong?
To me it seems more plausible for 'null' to be processed by the
deserializer than for the Fetcher to short-circuit on 'null' values ...

Any thoughts?
