kafka-users mailing list archives

From Gary Ogden <gog...@gmail.com>
Subject kafka streams with dynamic content and filtering
Date Mon, 03 Oct 2016 13:51:31 GMT
I have a use case, and I'm wondering if it's possible to do this with Kafka.

Let's say we will have customers that will be uploading JSON to our system,
but that JSON layout will be different between each customer. They are able
to define the schema of the JSON being uploaded.

They will then be able to define the fields in that JSON they want to
gather metrics on (sum, counts etc).

Is there a way with Kafka Streams to dynamically read the configuration
for that customer, process the JSON, and compute the counts and sums for
the fields they've defined?

It's also possible that at any time they may want to modify the
configuration for their JSON: stop counting one field, start counting
another.
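To make the dynamic-configuration part concrete, this is roughly the
per-record logic I have in mind (plain Java sketch with the Kafka
plumbing omitted; all names here are mine, not from any real topology --
I picture the per-customer field list coming from a compacted config
topic and this fold sitting inside an aggregate() call):

```java
import java.util.Map;

// Apply one customer's metric configuration to a single incoming JSON
// event (represented as a flat Map here, to keep the sketch free of a
// JSON-library dependency) and fold it into running aggregates.
public class MetricAggregator {

    // Running aggregates per field, keyed "<field>.count" and "<field>.sum".
    // Because the configured field list is passed in per event, a config
    // change takes effect on the very next record.
    public static Map<String, Double> apply(Map<String, Double> aggregates,
                                            Map<String, Object> event,
                                            Iterable<String> configuredFields) {
        for (String field : configuredFields) {
            Object value = event.get(field);
            if (!(value instanceof Number)) {
                continue; // field absent or non-numeric: skip it
            }
            double v = ((Number) value).doubleValue();
            aggregates.merge(field + ".count", 1.0, Double::sum);
            aggregates.merge(field + ".sum", v, Double::sum);
        }
        return aggregates;
    }
}
```

The point being: nothing in the aggregation itself is compiled against a
fixed schema, so stopping or starting a field would just be a config
update, not a redeploy. Is that a reasonable way to structure it?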

They will also want to do some inference. E.g., if a particular JSON
document is uploaded with a certain field in it, check whether another
JSON document was uploaded within 8 hours.
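For that inference case, what I picture is keeping a last-seen timestamp
per key, roughly like this (plain Java sketch, names are mine; my guess
is that in Kafka Streams proper this would be a windowed stream-stream
join or a state store inside a Transformer, but I'd welcome correction):

```java
import java.util.HashMap;
import java.util.Map;

// Track the last upload timestamp per key and report whether the
// previous upload for the same key arrived within an 8-hour window.
public class UploadCorrelator {
    private static final long WINDOW_MS = 8L * 60 * 60 * 1000; // 8 hours

    private final Map<String, Long> lastSeen = new HashMap<>();

    // Record an upload; returns true if the previous upload for this
    // key fell inside the 8-hour window, false otherwise (including
    // when this is the first upload seen for the key).
    public boolean recordAndCheck(String key, long timestampMs) {
        Long previous = lastSeen.put(key, timestampMs);
        return previous != null && timestampMs - previous <= WINDOW_MS;
    }
}
```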

Is it possible for Kafka Streams to be this dynamic?
