kafka-users mailing list archives

From Guozhang Wang <wangg...@gmail.com>
Subject Re: kafka streams with dynamic content and filtering
Date Tue, 04 Oct 2016 22:25:18 GMT
Hello Gary,

What you described should be workable with the lower-level Processor
interface of Kafka Streams, i.e. dynamic aggregations driven by input
data that indicates changes to the JSON schemas. For detailed examples of
how the Processor API works, please read the corresponding sections in the
web docs:

http://docs.confluent.io/3.0.1/streams/developer-guide.html#processor-api
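For instance, the Kafka-independent core of such a processor might look like the sketch below (hypothetical names throughout; in a real topology this logic would live inside a `Processor#process()` call, the running sums and counts would sit in a `KeyValueStore`, and the per-customer config would typically arrive via a compacted config topic):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sketch: per-customer metric configuration that can change at runtime,
// applied to flat field->value maps parsed from the uploaded JSON.
public class DynamicAggregator {

    // customer -> set of field names to aggregate (hypothetical config shape)
    private final Map<String, Set<String>> fieldsToAggregate = new HashMap<>();

    // "customer:field" -> running sum / count; in Kafka Streams these
    // would be backed by a fault-tolerant KeyValueStore, not a HashMap.
    private final Map<String, Double> sums = new HashMap<>();
    private final Map<String, Long> counts = new HashMap<>();

    // Called whenever a customer updates their metric configuration,
    // e.g. on each record consumed from a compacted "config" topic.
    public void updateConfig(String customer, Set<String> fields) {
        fieldsToAggregate.put(customer, fields);
    }

    // Called for every uploaded JSON record after parsing it into a map.
    public void process(String customer, Map<String, Double> record) {
        Set<String> fields = fieldsToAggregate.get(customer);
        if (fields == null) {
            return; // no configuration seen yet for this customer
        }
        for (String field : fields) {
            Double value = record.get(field);
            if (value == null) {
                continue; // configured field absent from this upload
            }
            String key = customer + ":" + field;
            sums.merge(key, value, Double::sum);
            counts.merge(key, 1L, Long::sum);
        }
    }

    public double sum(String customer, String field) {
        return sums.getOrDefault(customer + ":" + field, 0.0);
    }

    public long count(String customer, String field) {
        return counts.getOrDefault(customer + ":" + field, 0L);
    }
}
```

Because the config lives in mutable state keyed by customer, a config update takes effect on the next record processed, which covers the "stop counting one field, start counting another" case. The 8-hour inference can be handled similarly by recording the arrival timestamp of the triggering document in a state store and checking it (or scheduling a periodic `punctuate()`) when later documents arrive.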


Guozhang

On Mon, Oct 3, 2016 at 6:51 AM, Gary Ogden <gogden@gmail.com> wrote:

> I have a use case, and I'm wondering if it's possible to do this with
> Kafka.
>
> Let's say we will have customers that will be uploading JSON to our system,
> but that JSON layout will be different between each customer. They are able
> to define the schema of the JSON being uploaded.
>
> They will then be able to define the fields in that JSON they want to
> gather metrics on (sum, counts etc).
>
> Is there a way with Kafka Streams to dynamically read the configuration
> for that customer, process the JSON, and compute counts and sums for the
> fields they've defined?
>
> It's also possible that they may want to modify the configuration for
> their JSON at any time: stop counting one field, start counting another.
>
> They will also want to do some inferences, e.g., if a JSON document is
> uploaded with a particular field in it, then check whether another JSON
> document was uploaded within 8 hours.
>
> Is it possible for Kafka Streams to be this dynamic?
>



-- 
-- Guozhang
