kafka-users mailing list archives

From Arjun Kota <ar...@socialtwist.com>
Subject Re: kafka consumer not consuming messages
Date Wed, 12 Feb 2014 16:52:13 GMT
Hi,

No, I haven't changed auto.commit.enable. That one message is the one
which arrived a long time back (about two weeks ago). After that I
started working on this again recently and things started behaving weird.

I don't have the request log right now; I will check and let you know.

Thanks
Arjun Narasimha K
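
(For reference: if I remember right, auto.commit.enable defaults to true in 0.8, so the high-level consumer commits its offsets to ZooKeeper periodically. A minimal sketch of the consumer settings being discussed in this thread, for the 0.8 high-level java consumer, could look like the lines below; the values are only illustrative and not taken from the actual setup.)

import java.util.Properties;
import kafka.consumer.ConsumerConfig;

// Sketch only: consumer settings discussed in this thread, with illustrative values.
Properties props = new Properties();
props.put("zookeeper.connect", "127.0.0.1:2181,127.0.0.1:2182,127.0.0.1:2183");
props.put("group.id", "group1");
props.put("auto.commit.enable", "true");        // default: commit offsets to ZooKeeper periodically
props.put("auto.commit.interval.ms", "60000");  // assumed default commit interval
props.put("fetch.min.bytes", "128");            // values mentioned later in this thread
props.put("fetch.wait.max.ms", "10000");
ConsumerConfig config = new ConsumerConfig(props);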
On Feb 12, 2014 9:27 PM, "Jun Rao" <junrao@gmail.com> wrote:

> Interesting. So you have 4 messages in the broker. The checkpointed offset
> for the consumer is at the 3rd message. Did you change the default setting
> of auto.commit.enable? Also, if you look at the
> request log, what's the offset in the fetch request from this consumer?
> Thanks,
> Jun
>
>
> On Tue, Feb 11, 2014 at 10:07 PM, Arjun <arjun@socialtwist.com> wrote:
>
> > The topic name is correct. The output of ConsumerOffsetChecker is
> >
> > arjunn@arjunn-lt:~/Downloads/Kafka0.8/new/kafka_2.8.0-0.8.0$
> > bin/kafka-run-class.sh kafka.tools.ConsumerOffsetChecker --group group1
> > --zkconnect 127.0.0.1:2181,127.0.0.1:2182,127.0.0.1:2183 --topic
> > taf.referral.emails.service
> > Group    Topic                        Pid  Offset  logSize  Lag  Owner
> > group1   taf.referral.emails.service  0    2       4        2    group1_arjunn-lt-1392133080519-e24b249b-0
> > group1   taf.referral.emails.service  1    2       4        2    group1_arjunn-lt-1392133080519-e24b249b-0
> >
> > Thanks
> > Arjun Narasimha Kota
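
(Reading that output: for each of the two partitions, logSize is 4 and the committed offset is 2, so Lag = logSize - Offset = 4 - 2 = 2 unconsumed messages per partition, and the Owner column shows the same consumer instance owning both partitions, so the group is registered but lagging.)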
> >
> >
> >
> >
> > On Wednesday 12 February 2014 10:21 AM, Jun Rao wrote:
> >
> >> Could you double check that you used the correct topic name? If so,
> >> could you run ConsumerOffsetChecker as described in
> >> https://cwiki.apache.org/confluence/display/KAFKA/FAQ and see if there
> >> is any lag?
> >>
> >> Thanks,
> >>
> >> Jun
> >>
> >>
> >> On Tue, Feb 11, 2014 at 8:45 AM, Arjun Kota <arjun@socialtwist.com>
> >> wrote:
> >>
> >>> fetch.wait.max.ms=10000
> >>> fetch.min.bytes=128
> >>>
> >>> My message size is much more than that.
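
(For reference, if I am reading the fetch settings right: with fetch.min.bytes=128 and fetch.wait.max.ms=10000, the broker answers a fetch request as soon as at least 128 bytes are available, and otherwise returns whatever it has after 10 seconds, so these two settings should add at most about a 10-second delay rather than stop consumption entirely.)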
> >>> On Feb 11, 2014 9:21 PM, "Jun Rao" <junrao@gmail.com> wrote:
> >>>
> >>>> What's the fetch.wait.max.ms and fetch.min.bytes you used?
> >>>>
> >>>> Thanks,
> >>>>
> >>>> Jun
> >>>>
> >>>>
> >>>> On Tue, Feb 11, 2014 at 12:54 AM, Arjun <arjun@socialtwist.com>
> >>>> wrote:
> >>>>
> >>>>> With the same group id from the console consumer it's working fine.
> >>>>>
> >>>>>
> >>>>> On Tuesday 11 February 2014 01:59 PM, Guozhang Wang wrote:
> >>>>>
> >>>>>> Arjun,
> >>>>>>
> >>>>>> Are you using the same group name for the console consumer and the
> >>>>>> java consumer?
> >>>>>>
> >>>>>> Guozhang
> >>>>>>
> >>>>>>
> >>>>>> On Mon, Feb 10, 2014 at 11:38 PM, Arjun <arjun@socialtwist.com>
> >>>>>> wrote:
> >>>>>>
> >>>>>>> Hi Jun,
> >>>>>>>
> >>>>>>> No, it's not that problem. I am not getting what the problem is.
> >>>>>>> Can you please help?
> >>>>>>>
> >>>>>>> Thanks
> >>>>>>> Arjun Narasimha Kota
> >>>>>>>
> >>>>>>>
> >>>>>>> On Monday 10 February 2014 09:10 PM, Jun Rao wrote:
> >>>>>>>
> >>>>>>>> Does
> >>>>>>>> https://cwiki.apache.org/confluence/display/KAFKA/FAQ#FAQ-Whydoesmyconsumernevergetanydata?
> >>>>>>>> apply?
> >>>>>>>>
> >>>>>>>> Thanks,
> >>>>>>>>
> >>>>>>>> Jun
> >>>>>>>>
> >>>>>>>>
> >>>>>>>> On Sun, Feb 9, 2014 at 10:27 PM, Arjun <arjun@socialtwist.com>
> >>>>>>>> wrote:
> >>>>>>>>
> >>>>>>>>> Hi,
> >>>>>>>>>
> >>>>>>>>> I started using Kafka some time back. I was experimenting with
> >>>>>>>>> 0.8. My problem is that Kafka is unable to consume the messages.
> >>>>>>>>> My configuration is a Kafka broker on the local host and
> >>>>>>>>> zookeeper on the local host. I have only one broker and one
> >>>>>>>>> consumer at present.
> >>>>>>>>>
> >>>>>>>>> What have I done:
> >>>>>>>>>         1) I used the java examples in the kafka src and pushed
> >>>>>>>>> some 600 messages to the broker
> >>>>>>>>>         2) I used the console consumer to check whether the
> >>>>>>>>> messages are there in the broker or not. The console consumer
> >>>>>>>>> printed all 600 messages
> >>>>>>>>>         3) Now I used the java Consumer code and tried to get
> >>>>>>>>> those messages. This is not printing any messages. It just got
> >>>>>>>>> stuck
> >>>>>>>>>
> >>>>>>>>> When was it working earlier:
> >>>>>>>>>         - When I tried with three brokers and three consumers on
> >>>>>>>>> the same machine, with the same configuration, it worked fine.
> >>>>>>>>>         - I changed the properties accordingly when I tried to
> >>>>>>>>> make it work with one broker and one consumer
> >>>>>>>>>
> >>>>>>>>> What does the log say:
> >>>>>>>>>         - attaching the logs as well
> >>>>>>>>>
> >>>>>>>>> If someone points out where I am going wrong, it would be helpful.
> >>>>>>>>>
> >>>>>>>>> Thanks
> >>>>>>>>> Arjun Narasimha Kota
> >>>>>>>>>
> >
>
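
For anyone who hits the same symptom: below is a minimal sketch of a 0.8 high-level (java) consumer loop, assuming the topic and ZooKeeper addresses from the ConsumerOffsetChecker output above. It is an illustration of the API only, not the actual code from this thread.

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;

public class MinimalConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zookeeper.connect", "127.0.0.1:2181,127.0.0.1:2182,127.0.0.1:2183");
        props.put("group.id", "group1");
        // props.put("consumer.timeout.ms", "30000"); // optional: the iterator then throws
        //                                            // ConsumerTimeoutException instead of blocking forever

        // Connect to ZooKeeper and register this consumer in the group.
        ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // Ask for one stream for the topic; both partitions end up on this single stream.
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put("taf.referral.emails.service", 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                connector.createMessageStreams(topicCountMap);

        // Iterate over the stream; this blocks while no messages are available.
        for (MessageAndMetadata<byte[], byte[]> m :
                streams.get("taf.referral.emails.service").get(0)) {
            System.out.println("partition " + m.partition() + " offset " + m.offset()
                    + ": " + new String(m.message()));
        }

        connector.shutdown();
    }
}

If the console consumer sees messages under the same group id but a loop like this stays silent, comparing the committed offset from ConsumerOffsetChecker with the offset in the broker's fetch request log (as suggested above) is a reasonable next step.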
