kafka-users mailing list archives

From Sa Li <sal...@gmail.com>
Subject Re: kafka consumer to write into DB
Date Fri, 05 Dec 2014 17:44:18 GMT
Thanks, Neha. Is there a Java version of the batch consumer?
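For what it's worth, here is a minimal sketch of the batch-and-flush pattern discussed in this thread: accumulate polled messages in memory and hand them off in fixed-size batches. The `BatchBuffer` class and the demo in `main` are hypothetical, not from any library; in a real consumer the flush callback would do the PostgreSQL bulk write (e.g. JDBC `addBatch`/`executeBatch` or the pgjdbc `CopyManager`) and commit Kafka offsets only after the batch is safely in the DB, so there is no need to ack each message individually while loading it into memory.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of a batch buffer: accumulate records in memory and flush
// when a predefined batch size is reached. In a real consumer the
// flush handler would bulk-copy the batch into PostgreSQL and then
// commit the Kafka offsets, acking the whole batch at once.
public class BatchBuffer<T> {
    private final int batchSize;
    private final Consumer<List<T>> flushHandler;
    private final List<T> buffer = new ArrayList<>();

    public BatchBuffer(int batchSize, Consumer<List<T>> flushHandler) {
        this.batchSize = batchSize;
        this.flushHandler = flushHandler;
    }

    public void add(T record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    public void flush() {
        if (!buffer.isEmpty()) {
            // Hand the handler its own copy so it can keep the batch around.
            flushHandler.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        List<Integer> flushSizes = new ArrayList<>();
        BatchBuffer<String> buf =
                new BatchBuffer<>(3, batch -> flushSizes.add(batch.size()));
        for (int i = 0; i < 7; i++) {
            buf.add("msg-" + i); // stands in for a message polled from Kafka
        }
        buf.flush(); // flush the remaining partial batch on shutdown
        System.out.println(flushSizes); // [3, 3, 1]
    }
}
```

The poll loop would just call `add(record)` for each message and `flush()` on shutdown; whether 3, 500, or 10000 is the right batch size depends on message size and DB write latency.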

thanks



On Fri, Dec 5, 2014 at 9:41 AM, Scott Clasen <scott@heroku.com> wrote:

> if you are using scala/akka this will handle the batching and acks for you.
>
> https://github.com/sclasen/akka-kafka#akkabatchconsumer
>
> On Fri, Dec 5, 2014 at 9:21 AM, Sa Li <salicn@gmail.com> wrote:
>
> > Thank you very much for the reply, Neha. I have a question about the
> > consumer: I consume data from Kafka and write it into the DB. Of course I
> > have to create a hash map in memory, load the data into it, and bulk-copy
> > it to the DB instead of inserting into the DB line by line. Does that mean
> > I need to ack each message as it is loaded into memory?
> >
> > thanks
> >
> >
> >
> > On Thu, Dec 4, 2014 at 1:21 PM, Neha Narkhede <neha@confluent.io> wrote:
> >
> > > This is specific for pentaho but may be useful -
> > > https://github.com/RuckusWirelessIL/pentaho-kafka-consumer
> > >
> > > On Thu, Dec 4, 2014 at 12:58 PM, Sa Li <salicn@gmail.com> wrote:
> > >
> > > > Hello, all
> > > >
> > > > I have never developed a Kafka consumer. I want to build an advanced
> > > > Kafka consumer in Java that consumes data and continuously writes it
> > > > into a PostgreSQL DB. I am thinking of creating a map in memory,
> > > > collecting a predefined number of messages, and then writing them to
> > > > the DB in a batch. Is there an API or sample code that would allow me
> > > > to do this?
> > > >
> > > >
> > > > thanks
> > > >
> > > >
> > > > --
> > > >
> > > > Alec Li
> > > >
> > >
> > >
> > >
> > > --
> > > Thanks,
> > > Neha
> > >
> >
> >
> >
> > --
> >
> > Alec Li
> >
>



-- 

Alec Li
