kafka-users mailing list archives

From Tim Smith <secs...@gmail.com>
Subject Re: Flume use case for Kafka & HDFS
Date Fri, 25 Sep 2015 02:53:31 GMT
Not out of the box, no - I don't think you can use an attribute of the
posted JSON to select the Kafka topic or the HDFS folder.

For dynamically creating topics in Kafka, you would have to write some kind
of custom Kafka producer - the Kafka channel and sink in Flume both require
the topic to be defined in the Flume config. For the HDFS sink in Flume, you
can set event headers with a custom interceptor (or maybe use
morphlines/grok) and then reference those headers in the sink's path.
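To make the header-driven HDFS path concrete, here is a minimal config sketch. The agent, component, and header names (`a1`, `r1`, `category`, the namenode URI) are assumptions for illustration; the `static` interceptor just hard-codes the header, standing in for the custom interceptor that would derive it per event:

```properties
# Hypothetical agent 'a1': HTTP source -> memory channel -> HDFS sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = http
a1.sources.r1.port = 8080
a1.sources.r1.channels = c1

# Illustrative only: 'static' sets a fixed header; a custom interceptor
# would instead compute 'category' from the incoming request
a1.sources.r1.interceptors = i1
a1.sources.r1.interceptors.i1.type = static
a1.sources.r1.interceptors.i1.key = category
a1.sources.r1.interceptors.i1.value = abc

a1.channels.c1.type = memory

a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
# %{category} expands to the event header, giving a per-event folder
a1.sinks.k1.hdfs.path = hdfs://namenode/events/%{category}
```

The key piece is the `%{header}` escape in `hdfs.path`, which the HDFS sink resolves per event, so everything hinges on getting the right header onto the event before it reaches the sink.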

On Sat, Sep 19, 2015 at 11:26 AM, Hemanth Abbina <HemanthA@eiqnetworks.com>
wrote:
> I'm new to Flume and thinking to use Flume in the below scenario.
> Our system receives events as HTTP POST, and we need to store them in
> Kafka(for processing) as well as HDFS(as permanent store).
> Can we configure Flume as below?
> *         Source:  HTTP (expecting JSON event as HTTP body, with a dynamic
> topic name in the URI)
> *         Channel: KAFKA (should store the received JSON body, to a topic
> mentioned in the URI)
> *         Sink:  HDFS (should store the data in a folder mentioned in the
> URI).
> For example, if I receive a JSON event from an HTTP source with the below
> attributes,
> *         URL: https://xx.xx.xx.xx/event/abc
> *         Body of POST:  { name: xyz, value=123}
> The event should be saved to Kafka channel - with topic 'abc' and written
> to HDFS to a folder as 'abc'.
> This 'abc' will be dynamic and change from event to event.
> Is this possible with Flume ?
> Thanks in advance
> Hemanth
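Since Flume's HTTP source (with its JSON handler) accepts events as a list of header/body objects, one way to carry the dynamic `abc` segment is to have the posting client pull it out of the URI and attach it as a header. A small sketch, assuming a hypothetical header name `category` and helper `build_flume_event` (neither is a Flume built-in):

```python
import json
from urllib.parse import urlparse

def build_flume_event(url, body):
    """Hypothetical helper: take the last URI path segment (e.g. 'abc')
    and attach it as a 'category' header alongside the JSON body, in the
    headers/body event shape the Flume HTTP source's JSON handler expects."""
    segment = urlparse(url).path.rstrip("/").split("/")[-1]
    return [{"headers": {"category": segment}, "body": json.dumps(body)}]

payload = build_flume_event("https://xx.xx.xx.xx/event/abc",
                            {"name": "xyz", "value": 123})
```

An interceptor (or the sink's `%{category}` path escape) can then route on that header without parsing the body at all.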



