kafka-users mailing list archives

From Neha Narkhede <neha.narkh...@gmail.com>
Subject Re: Kafka Hadoop Consumer for multiple brokers
Date Tue, 04 Jun 2013 16:06:41 GMT
It is probably possible to make the serde format pluggable. You could
start that discussion on the Camus mailing list.
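
As a very rough sketch of the idea, a pluggable decoder for protobuf
payloads could look something like the following (the interface, the class
names, and the MyEvent message are hypothetical and are not Camus's actual
API; they only illustrate the shape such a plugin might take):

    // Hypothetical decoder interface, shown only to illustrate what a
    // pluggable serde could mean for protobuf payloads.
    public interface MessageDecoder<T> {
        T decode(byte[] payload) throws Exception;
    }

    // Sketch of a protobuf-backed implementation. MyEvent stands in for a
    // message class generated by protoc from the project's .proto file.
    public class ProtobufMessageDecoder implements MessageDecoder<MyEvent> {
        @Override
        public MyEvent decode(byte[] payload) throws Exception {
            // parseFrom(byte[]) is the parse method protoc generates for
            // Java message classes.
            return MyEvent.parseFrom(payload);
        }
    }

The consumer job could then be pointed at whichever decoder matches the
topic's wire format, with the existing Avro handling remaining the default.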

Thanks,
Neha
On Jun 4, 2013 8:51 AM, "Samir Madhavan" <
samir.madhavan@fluturasolutions.com> wrote:

> Thanks Jun.
>
> For one of our projects, the data is coming in binary protobuf format.
> Camus is based on Avro, so what would be the best way to handle the
> protobuf data files?
>
>
> On Tue, Jun 4, 2013 at 8:40 PM, Jun Rao <junrao@gmail.com> wrote:
>
> > The idea is that each mapper only connects to a single Kafka broker.
> > Each line in the input file specifies the broker URI, topic, partition,
> > and offset.
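> >
> > For illustration (the hostnames, topic name, and field layout below are
> > made up; the exact format is whatever the contrib consumer expects), an
> > input file covering two brokers might contain lines such as:
> >
> >   tcp://broker1.example.com:9092  mytopic  0  0
> >   tcp://broker2.example.com:9092  mytopic  0  0
> >
> > With a line per broker (and per partition), each mapper works against a
> > single broker, and multiple brokers are covered by multiple mappers.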
> >
> > The Hadoop consumer in contrib is probably a bit outdated. The one that
> > LinkedIn uses now can be found at https://github.com/linkedin/camus
> >
> > Thanks,
> >
> > Jun
> >
> >
> > On Tue, Jun 4, 2013 at 7:29 AM, Samir Madhavan <
> > samir.madhavan@fluturasolutions.com> wrote:
> >
> > > Hi,
> > >
> > > I was going through the hadoop-consumer in the contrib folder. There is
> > > a property that asks for the Kafka server URI. This might sound silly,
> > > but from looking at it, it seems to be only for a single Kafka broker.
> > >
> > > Since we have multiple brokers, how do we implement the hadoop-consumer
> > > for them?
> > >
> > > Regards,
> > > Samir
> > >
> >
>
>
>
> --
> *Samir Madhavan *| Data Scientist | Flutura Business Solutions Pvt. Ltd |
>   4th
> Floor, 'Geetanjali', #693, 15th Cross, J.P Nagar 2nd Phase, Bangalore,
> India - 560078 | Mobile: +91 9886139631 | email: *
> samir.madhavan@fluturasolutions.com*| www.fluturasolutions.com |
>
