kafka-users mailing list archives

From Kevin Perera <kper...@ippon.fr>
Subject Kafka Consumers - keeping them open
Date Mon, 24 Jun 2019 15:13:31 GMT
Hello! I'm trying to get my Kafka consumer to keep consuming records indefinitely. However,
after a short period of time it stops picking up new records. How do you usually keep a
consumer running? Below is the configuration I use for my KafkaConsumer. Any help would be
greatly appreciated.

hostname = InetAddress.getLocalHost().getHostName();
// configuration for how we consume records.
tweetsProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, hostname + ":9092");
tweetsProps.put(ConsumerConfig.GROUP_ID_CONFIG, "groupID");
tweetsProps.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
tweetsProps.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
tweetsProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
tweetsProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
tweetsProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
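For reference, here is roughly how I drive the consumer with these properties. This is just a sketch of the loop shape, and the "tweets" topic name is a placeholder:

```java
import java.time.Duration;
import java.util.Collections;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(tweetsProps);
consumer.subscribe(Collections.singletonList("tweets")); // placeholder topic name

try {
    while (true) {
        // poll() has to be called continuously; a consumer that stops polling
        // is eventually removed from the group (max.poll.interval.ms).
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
        for (ConsumerRecord<String, String> record : records) {
            System.out.printf("offset=%d key=%s value=%s%n",
                    record.offset(), record.key(), record.value());
        }
    }
} finally {
    consumer.close();
}
```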

