kafka-users mailing list archives

From Krishna Raj <reach.krishna...@gmail.com>
Subject Re: kafka consumer to write into DB
Date Sat, 06 Dec 2014 02:03:31 GMT
Hi Sa,

I created a bulk consumer which consumes, processes, and posts to ElasticSearch.

There are configs for the size of the message batch, and you can modify the code
to change what is done with the consumed messages.

https://github.com/reachkrishnaraj/kafka-elasticsearch-standalone-consumer

Thanks,
Kr
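
For reference, the batching pattern that repo implements can be sketched in plain
Java. This is a minimal sketch under assumptions, not code from the repo: the
`BatchBuffer` class and its names are hypothetical, and the comments indicate
where a real consumer's poll/commit calls and the ElasticSearch (or JDBC) bulk
write would go.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical helper: accumulate consumed messages in memory and hand them
// to a bulk-write callback once a configurable batch size is reached.
public class BatchBuffer<T> {
    private final int batchSize;
    private final Consumer<List<T>> flusher; // e.g. an ES bulk request or JDBC batch insert
    private final List<T> buffer = new ArrayList<>();

    public BatchBuffer(int batchSize, Consumer<List<T>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    // Called once per consumed message; flushes when the buffer is full.
    public void add(T message) {
        buffer.add(message);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Write the whole batch, then clear it. In a real consumer you would
    // commit offsets (e.g. consumer.commitSync()) only after this succeeds,
    // so a crash replays the batch instead of losing it.
    public void flush() {
        if (buffer.isEmpty()) {
            return;
        }
        flusher.accept(new ArrayList<>(buffer));
        buffer.clear();
    }
}
```

With a batch size of, say, 500, the consumer loop just calls add() per record
and flush() once on shutdown; no per-message ack is needed.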



On Fri, Dec 5, 2014 at 5:07 PM, Neha Narkhede <neha@confluent.io> wrote:

> Not that I know of.
>
> On Fri, Dec 5, 2014 at 9:44 AM, Sa Li <salicn@gmail.com> wrote:
>
> > Thanks, Neha. Is there a Java version of the batch consumer?
> >
> > thanks
> >
> >
> >
> > On Fri, Dec 5, 2014 at 9:41 AM, Scott Clasen <scott@heroku.com> wrote:
> >
> > > If you are using Scala/Akka, this will handle the batching and acks for
> > > you.
> > >
> > > https://github.com/sclasen/akka-kafka#akkabatchconsumer
> > >
> > > On Fri, Dec 5, 2014 at 9:21 AM, Sa Li <salicn@gmail.com> wrote:
> > >
> > > > Thank you very much for the reply, Neha. I have a question about the
> > > > consumer: I consume data from Kafka and write it into the DB. Of course,
> > > > I have to create a hash map in memory, load the data into it, and
> > > > bulk-copy to the DB instead of inserting into the DB line by line. Does
> > > > that mean I need to ack each message while loading it into memory?
> > > >
> > > > thanks
> > > >
> > > >
> > > >
> > > > On Thu, Dec 4, 2014 at 1:21 PM, Neha Narkhede <neha@confluent.io>
> > wrote:
> > > >
> > > > > This is specific for pentaho but may be useful -
> > > > > https://github.com/RuckusWirelessIL/pentaho-kafka-consumer
> > > > >
> > > > > On Thu, Dec 4, 2014 at 12:58 PM, Sa Li <salicn@gmail.com> wrote:
> > > > >
> > > > > > Hello, all
> > > > > >
> > > > > > I have never developed a Kafka consumer. I want to build an advanced
> > > > > > Kafka consumer in Java that consumes data and continuously writes it
> > > > > > into a PostgreSQL DB. I am thinking of creating a map in memory,
> > > > > > collecting a predefined number of messages, then writing them into
> > > > > > the DB in a batch. Is there an API or sample code that would allow me
> > > > > > to do this?
> > > > > >
> > > > > >
> > > > > > thanks
> > > > > >
> > > > > >
> > > > > > --
> > > > > >
> > > > > > Alec Li
> > > > > >
> > > > >
> > > > >
> > > > >
> > > > > --
> > > > > Thanks,
> > > > > Neha
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > >
> > > > Alec Li
> > > >
> > >
> >
> >
> >
> > --
> >
> > Alec Li
> >
>
>
>
> --
> Thanks,
> Neha
>
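
On the original PostgreSQL question: the usual JDBC idiom is to buffer rows and
write each batch with PreparedStatement.addBatch()/executeBatch(), committing
Kafka offsets only after executeBatch() succeeds. A minimal, database-free
sketch of the chunking step (the `RowChunker` name is hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: split buffered rows into fixed-size chunks, each of
// which would then be written with one PreparedStatement.executeBatch() call.
public class RowChunker {
    public static <T> List<List<T>> chunk(List<T> rows, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += size) {
            chunks.add(rows.subList(i, Math.min(i + size, rows.size())));
        }
        return chunks;
    }
}
```

Each chunk then maps onto a series of stmt.addBatch() calls followed by one
stmt.executeBatch(); for very large loads into PostgreSQL, the driver's COPY
support is typically faster still than batched INSERTs.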
