kafka-users mailing list archives

From: Benjamin Black <...@b3k.us>
Subject: Re: default producer to retro-fit existing log files collection process?
Date: Wed, 04 Sep 2013 18:25:06 GMT
commons-logging has a log4j logger, so perhaps you just need to use it and
the log4j-kafka appender to achieve your goal?
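For reference, the log4j route would look roughly like the sketch below. This is only an illustration: the appender class name and its property names have varied between Kafka releases (and the broker list, topic name, and pattern here are made-up placeholders), so check them against the Kafka version you are actually running.

```properties
# Sketch of a log4j.properties using the Kafka log4j appender.
# NOTE: the appender class and property names below are assumptions --
# they differ between Kafka versions; verify against your release.
log4j.rootLogger=INFO, KAFKA

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.BrokerList=localhost:9092
log4j.appender.KAFKA.Topic=weblogs
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d %-5p %c - %m%n
```

With commons-logging delegating to log4j, Tomcat's log lines would then flow to Kafka without touching the application code.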


On Tue, Sep 3, 2013 at 2:08 PM, Maxime Petazzoni
<Maxime.Petazzoni@turn.com> wrote:

> Tomcat uses commons-logging for logging. You might be able to write an
> adapter for Kafka, similar to the log4j-kafka appender. I think this
> would be cleaner than writing something Tomcat-specific that intercepts
> your requests and logs them through Kafka.
>
> /Max
> --
> Maxime Petazzoni
> Sr. Platform Engineer
> m 408.310.0595
> www.turn.com
>
> ________________________________________
> From: Yang [teddyyyy123@gmail.com]
> Sent: Tuesday, September 03, 2013 10:09 AM
> To: users@kafka.apache.org
> Subject: default producer to retro-fit existing log files collection
> process?
>
> In many setups we have production web server logs rotated on local disks
> and then collected by some sort of scp process.
>
> I guess the ideal way to use Kafka is to write a module for Tomcat that
> catches each request and sends it through the Kafka API. But is there a
> "quick and dirty" producer included with Kafka that just reads the
> existing rotated logs and sends them through the Kafka API? That would
> avoid having to touch the existing Java code.
>
> thanks
> Yang
>
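On the "quick and dirty" side: as far as I know Kafka doesn't ship a log-file producer, but the tailing logic is small enough to sketch. Below is a minimal Python sketch of a generator that follows a file and reopens it when rotation swaps the inode out from under it; where each line would be handed to a Kafka producer is left as a comment, since the client API depends on which Kafka library and version you use.

```python
import os
import time


def follow(path, poll_interval=0.5):
    """Yield lines from `path`, reopening the file when log
    rotation replaces it (detected via an inode change)."""
    f = open(path)
    inode = os.fstat(f.fileno()).st_ino
    while True:
        line = f.readline()
        if line:
            yield line.rstrip("\n")
            continue
        # No new data: check whether the file was rotated away.
        try:
            if os.stat(path).st_ino != inode:
                f.close()
                f = open(path)  # reopen the freshly rotated file
                inode = os.fstat(f.fileno()).st_ino
                continue
        except FileNotFoundError:
            pass  # rotation in progress; retry after the sleep
        time.sleep(poll_interval)


# Each yielded line would go to a Kafka producer, e.g. (pseudo):
#   for line in follow("/var/log/tomcat/access.log"):
#       producer.send("weblogs", line)
```

Even simpler, you can skip the code entirely and pipe `tail -F` into the console producer that ships with Kafka, something along the lines of `tail -F /var/log/tomcat/access.log | bin/kafka-console-producer.sh --broker-list localhost:9092 --topic weblogs` (flag names vary by Kafka version, so check `--help` on yours).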
