kafka-users mailing list archives

From: Jay Kreps <jay.kr...@gmail.com>
Subject: Re: default producer to retro-fit existing log files collection process?
Date: Wed, 04 Sep 2013 16:44:33 GMT
As Neha says, the best thing we currently provide is the console producer.
Providing a more flexible framework specifically targeted at log slurping
would be a cool open source project.

-Jay
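A minimal sketch of the tail-plus-console-producer approach, assuming a Kafka 0.8-style distribution layout (the log path, broker address, and topic name `weblogs` are placeholders, and the producer script's flags differ between Kafka versions):

```shell
# Follow the active log file across rotations (-F reopens the file when the
# rotation process recreates it) and pipe each line into Kafka's console
# producer, which sends one message per line of stdin.
tail -F /var/log/tomcat/access.log | \
  bin/kafka-console-producer.sh --broker-list localhost:9092 --topic weblogs
```

Using `tail -F` rather than `-f` matters here: `-F` keeps following the log by name after rotation replaces the file, so the pipeline survives the nightly logrotate.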


On Wed, Sep 4, 2013 at 7:34 AM, Neha Narkhede <neha.narkhede@gmail.com> wrote:

> A quick-and-dirty solution would be to tail the logs and use the console
> producer to send the data to Kafka.
>
> Thanks,
> Neha
> On Sep 3, 2013 2:09 PM, "Maxime Petazzoni" <Maxime.Petazzoni@turn.com>
> wrote:
>
> > Tomcat uses commons-logging for logging. You might be able to write an
> > adapter that sends to Kafka, similar to the log4j-kafka appender. I think
> > this would be cleaner than writing something Tomcat-specific that
> > intercepts your requests and logs them through Kafka.
> >
> > /Max
> > --
> > Maxime Petazzoni
> > Sr. Platform Engineer
> > m 408.310.0595
> > www.turn.com
> >
> > ________________________________________
> > From: Yang [teddyyyy123@gmail.com]
> > Sent: Tuesday, September 03, 2013 10:09 AM
> > To: users@kafka.apache.org
> > Subject: default producer to retro-fit existing log files collection
> > process?
> >
> > In many setups, production web server logs are rotated on local disks and
> > then collected by some sort of scp process.
> >
> > I guess the ideal way to use Kafka would be to write a module for Tomcat
> > that intercepts each request and sends it through the Kafka API. But is
> > there a "quick and dirty" producer included with Kafka that just reads
> > the existing rotated logs and sends them through the Kafka API? That
> > would avoid having to touch the existing Java code.
> >
> > thanks
> > Yang
> >
>
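Maxime's appender suggestion above could be sketched as a log4j configuration, assuming the appender bundled with 0.8-era Kafka (`kafka.producer.KafkaLog4jAppender`); the class name, property names, and the broker/topic values are illustrative and have changed across Kafka versions:

```properties
# Sketch: route log4j output into Kafka via the bundled appender.
# All values below are placeholders; property names differ between Kafka releases.
log4j.rootLogger=INFO, KAFKA
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d %-5p %c - %m%n
log4j.appender.KAFKA.BrokerList=localhost:9092
log4j.appender.KAFKA.Topic=weblogs
```

With a configuration along these lines, anything the application logs through commons-logging/log4j would flow into Kafka without changes to the application code itself.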
