kafka-users mailing list archives

From "Rainer Guessner" <raguess...@gmx.com>
Subject Re: Streaming Data
Date Tue, 09 Apr 2019 22:11:29 GMT
For more flexibility without the need for extensive coding I suggest Esper complex event processing

Sent: Tuesday, April 09, 2019 at 4:26 PM
From: "Nick Torenvliet" <natorenvliet@gmail.com>
To: users@kafka.apache.org
Subject: Streaming Data
Hi all,

Just looking for some general guidance.

We have a kafka -> druid pipeline we intend to use in an industrial setting
to monitor process data.

Our kafka system receives messages on a single topic.

The messages are {"timestamp": "yyyy-mm-ddThh:mm:ss.mmm", "plant_equipment_id":
"id_string", "sensorvalue": float}
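A minimal sketch of what parsing/validating one such message could look like in Node (the field names come from the format above; the function name and the sample equipment id are illustrative):

```javascript
// Hypothetical helper: parse and validate one sensor message of the shape
// {"timestamp": ISO-8601 string, "plant_equipment_id": string, "sensorvalue": number}.
function parseSensorMessage(raw) {
  const msg = JSON.parse(raw);
  if (
    typeof msg.plant_equipment_id !== "string" ||
    typeof msg.sensorvalue !== "number" ||
    Number.isNaN(Date.parse(msg.timestamp)) // rejects missing/garbled timestamps
  ) {
    throw new Error("malformed sensor message");
  }
  return msg;
}

// Example:
const m = parseSensorMessage(
  '{"timestamp":"2019-04-09T16:26:00.000Z","plant_equipment_id":"pump_17","sensorvalue":3.14}'
);
```

Validating at the edge like this keeps malformed producer output from propagating into Druid or the UI.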

For our POC there are about 2,000 unique plant_equipment_ids; this will
quickly grow to 20,000.

The kafka topic streams into druid.

We are building some node.js/react browser based apps for analytics and
real time stream monitoring.

We are thinking that for visualizing historical data sets we will hit druid
for data.

For real time streaming we are wondering what our best option is.

One option is to just poll druid semi-regularly and update the on-screen
visualization as new data arrives there.
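The polling option could look roughly like the sketch below: ask Druid's SQL endpoint (POST {"query": ...} to /druid/v2/sql on the broker) for rows newer than the last timestamp the browser has seen. The datasource name "plant_sensors", the broker address, and the function names are assumptions for illustration:

```javascript
// Build a Druid SQL query for rows newer than lastSeen, restricted to the
// equipment ids this view cares about. Druid SQL timestamp literals use the
// "TIMESTAMP 'yyyy-mm-dd hh:mm:ss'" form.
function buildPollQuery(lastSeen, equipmentIds) {
  const idList = equipmentIds.map((id) => `'${id}'`).join(", ");
  return (
    `SELECT __time, plant_equipment_id, sensorvalue ` +
    `FROM plant_sensors ` +
    `WHERE __time > TIMESTAMP '${lastSeen}' ` +
    `AND plant_equipment_id IN (${idList}) ` +
    `ORDER BY __time`
  );
}

// Poll loop sketch (Node 18+ global fetch; broker host/port are assumptions):
async function poll(lastSeen) {
  const res = await fetch("http://druid-broker:8082/druid/v2/sql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: buildPollQuery(lastSeen, ["pump_17"]) }),
  });
  return res.json(); // array of row objects, one per new data point
}
```

The trade-off is latency: the chart is only as fresh as the poll interval, but there is no extra streaming infrastructure to run.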

Another option is to stream a subset of the topic (somehow) from kafka using
some streams interface.
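For the streaming option, one common shape is a small Node gateway that consumes the topic and fans each message out only to browser clients subscribed to that plant_equipment_id (e.g. over WebSockets). The routing logic below is the testable core; the Kafka and WebSocket wiring (kafkajs, ws, the broker address, topic and group names) is sketched in comments because it needs a running broker, and all of those names are assumptions:

```javascript
// clientSubscriptions: Map from client id -> Set of plant_equipment_ids it wants.
// Returns the client ids that should receive this message.
function routeMessage(msg, clientSubscriptions) {
  const targets = [];
  for (const [clientId, ids] of clientSubscriptions) {
    if (ids.has(msg.plant_equipment_id)) targets.push(clientId);
  }
  return targets;
}

// Wiring sketch (assumes the kafkajs and ws packages; names are illustrative):
//
//   const { Kafka } = require("kafkajs");
//   const consumer = new Kafka({ brokers: ["kafka:9092"] })
//     .consumer({ groupId: "ui-gateway" });
//   await consumer.subscribe({ topic: "plant-sensors" });
//   await consumer.run({
//     eachMessage: async ({ message }) => {
//       const msg = JSON.parse(message.value.toString());
//       for (const clientId of routeMessage(msg, clientSubscriptions)) {
//         sockets.get(clientId).send(JSON.stringify(msg));
//       }
//     },
//   });
```

This keeps one consumer group on the gateway regardless of how many browsers connect, which matters once the id count grows from 2,000 toward 20,000.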

With all the stock ticker apps out there, I have to imagine this is a
really common use case.

Anyone have any thoughts as to what we are best to do?

