storm-user mailing list archives

From Manish Sharma <maa...@gmail.com>
Subject custom value.deserializer for storm-kafka-client-1.1.1?
Date Sat, 23 Sep 2017 23:38:55 GMT
Hello,
I am trying to use a custom value.deserializer when consuming from Kafka. I
tried the following:


--snip--
KafkaSpoutConfig kafkaSpoutConfig = KafkaSpoutConfig
        .builder(property.getKafka_consumer_bootstrap_servers(), topics)
        .setFirstPollOffsetStrategy(KafkaSpoutConfig.FirstPollOffsetStrategy.EARLIEST)
        .setGroupId(property.getKafka_consumer_groupid())
        .setProp(ConsumerConfig.CLIENT_ID_CONFIG, "StormKafkaConsumer")
        .setProp(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "EmailObjectDeserializer")
        .build();
--snip--
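
For context, the deserializer is a plain implementation of Kafka's
Deserializer interface, roughly like this (a minimal sketch; EmailObject
and its fromBytes helper are placeholders for the real payload type):

--snip--
import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;

public class EmailObjectDeserializer implements Deserializer<EmailObject> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // no per-instance configuration needed in this sketch
    }

    @Override
    public EmailObject deserialize(String topic, byte[] data) {
        // fromBytes stands in for the real decoding logic
        return data == null ? null : EmailObject.fromBytes(data);
    }

    @Override
    public void close() {
        // nothing to release
    }
}
--snip--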


It didn't take; in the logs I still see the spout executor instantiated with
the default StringDeserializer class:


--snip--
6348 [Thread-18-SMTPInjectionKafkaSpout-executor[2 2]] INFO o.a.k.c.c.ConsumerConfig - ConsumerConfig values:
auto.commit.interval.ms = 5000
auto.offset.reset = latest
bootstrap.servers = [XXXX.XXXX.XXXX:9092]
check.crcs = true
client.id = StormKafkaConsumer
connections.max.idle.ms = 540000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = dev_worker
heartbeat.interval.ms = 3000
interceptor.classes = null
key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 100
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.ms = 50
request.timeout.ms = 305000
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer <-------
--snip--
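
Two thoughts I have not verified yet, in case they matter: Kafka loads the
deserializer by its fully-qualified class name, so the bare
"EmailObjectDeserializer" string may simply never resolve; and the 1.1.x
Builder may also have a typed setValue(Class) overload that takes effect
where setProp does not (I have not confirmed that signature). A sketch of
both, with com.example.serde as a placeholder package:

--snip--
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;

// Assumption 1: value.deserializer needs the fully-qualified class name.
KafkaSpoutConfig kafkaSpoutConfig = KafkaSpoutConfig
        .builder(property.getKafka_consumer_bootstrap_servers(), topics)
        .setProp(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "com.example.serde.EmailObjectDeserializer")
        .build();

// Assumption 2: a typed setValue(Class) overload, if the 1.1.1 Builder
// has one, might be applied even where a setProp-supplied deserializer
// is overridden.
KafkaSpoutConfig kafkaSpoutConfig2 = KafkaSpoutConfig
        .builder(property.getKafka_consumer_bootstrap_servers(), topics)
        .setValue(EmailObjectDeserializer.class)
        .build();
--snip--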


Any thoughts on how to get a custom value.deserializer working with
storm-kafka-client-1.1.1?
