kafka-users mailing list archives

From "Sven Ludwig" <s_lud...@gmx.de>
Subject Re: Doubts in Kafka
Date Thu, 10 Jan 2019 13:26:33 GMT
Okay, but what if one also needs to preserve the order of messages coming from a
particular device?

With Kafka, this is perhaps possible if all messages from a particular device go into the
same partition.

Would it be a good and efficient solution for this approach to set the key of each Kafka
ProducerRecord to the unique ID of the device,
AND to deactivate key-based log compaction on the broker so that it does not delete older
records that have the same key?
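To illustrate why keying by device ID preserves per-device order: the default partitioner maps a record's key to a partition deterministically, so every record with the same key lands in the same partition. The sketch below shows the idea only; Kafka's real default partitioner hashes the serialized key bytes with murmur2, not String.hashCode, and the device ID here is hypothetical:

```java
public class KeyedPartitioning {

    // Simplified stand-in for Kafka's default partitioner. Kafka really
    // computes murmur2(keyBytes) modulo the partition count; any
    // deterministic hash demonstrates the same property.
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the modulo result is never negative.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int p1 = partitionFor("device-42", 1000); // hypothetical device ID
        int p2 = partitionFor("device-42", 1000);
        // Same device ID always maps to the same partition, and Kafka
        // guarantees ordering within a partition, so per-device order
        // is preserved.
        System.out.println(p1 == p2); // prints "true"
    }
}
```

Note that with this scheme no cleaner needs to be "deactivated" per se: a keyed record only makes older records with the same key eligible for removal on topics where compaction is enabled (cleanup.policy=compact); the default policy is time/size-based deletion.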


Sent: Thursday, 10 January 2019 at 08:35
From: "Peter Levart" <peter.levart@gmail.com>
To: users@kafka.apache.org, "aruna ramachandran" <arunaeienec@gmail.com>
Subject: Re: Doubts in Kafka
Hi Aruna,

On 1/10/19 8:19 AM, aruna ramachandran wrote:
> I am using keyed partitions with 1000 partitions, so I need to create 1000
> consumers, because consumer groups and rebalancing do not work in the case
> of manually assigned consumers. Is there any alternative for this problem?

Which API are you using on the KafkaConsumer? Are you calling
subscribe(Collection<String> topics) or
assign(Collection<TopicPartition> partitions)?

The first one (subscribe) is the one you should be using for your use case.
With that call, when you subscribe to a multi-partition topic and you have
multiple KafkaConsumer instances configured with the same consumer group id,
the topic's partitions are dynamically assigned (and possibly reassigned as
consumers come and go) across the set of live consumers. Would this work for
you, and if not, why?
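As a rough sketch of what the group protocol buys you: partitions are spread across however many consumers are alive, so 1000 partitions do not require 1000 consumers. This is not Kafka's actual assignor code (Kafka ships range, round-robin, and sticky assignors, and the rebalance is coordinated broker-side); it only illustrates the resulting distribution, with hypothetical consumer names:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class RangeAssignmentSketch {

    // Illustrative round-robin spread of partitions over group members.
    // Kafka's real assignors differ in detail, but the outcome is the
    // same in spirit: every partition is owned by exactly one live
    // consumer in the group.
    static Map<String, List<Integer>> assign(List<String> consumers, int numPartitions) {
        Map<String, List<Integer>> assignment = new LinkedHashMap<>();
        for (String c : consumers) {
            assignment.put(c, new ArrayList<>());
        }
        for (int p = 0; p < numPartitions; p++) {
            assignment.get(consumers.get(p % consumers.size())).add(p);
        }
        return assignment;
    }

    public static void main(String[] args) {
        // Hypothetical group of 4 consumers sharing a 1000-partition topic:
        Map<String, List<Integer>> a = assign(List.of("c1", "c2", "c3", "c4"), 1000);
        System.out.println(a.get("c1").size()); // prints "250"
    }
}
```

If a consumer leaves or joins, the group rebalances and the same spreading happens again over the new membership, which is exactly the "dynamically assigned (and possibly reassigned)" behavior described above.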

Regards, Peter
