kafka-users mailing list archives

From "Matthias J. Sax" <matth...@confluent.io>
Subject Re: Debugging Kafka Streams Windowing
Date Mon, 08 May 2017 22:10:08 GMT
Great! Glad 0.10.2.1 fixes it for you!

-Matthias

On 5/7/17 8:57 PM, Mahendra Kariya wrote:
> Upgrading to 0.10.2.1 seems to have fixed the issue.
> 
> Until now, we were looking at a random one-hour window of data to analyse the
> issue. Over the weekend, we wrote a simple test that continuously checks for
> inconsistencies in real time and reports any issues.
> 
> No issues have been reported for the last 24 hours. Will update this thread
> if we find any issue.
> 
> Thanks for all the support!
> 
> 
> 
> On Fri, May 5, 2017 at 3:55 AM, Matthias J. Sax <matthias@confluent.io>
> wrote:
> 
>> About
>>
>>> 07:44:08.493 [StreamThread-10] INFO o.a.k.c.c.i.AbstractCoordinator -
>>> Discovered coordinator broker-05:6667 for group group-2.
>>
>> Please upgrade to Streams 0.10.2.1 -- we fixed a couple of bugs and I would
>> assume this issue is fixed, too. If not, please report back.
>>
>>> Another question that I have is: is there a way for us to detect how many
>>> messages have come out of order? And if possible, what is the delay?
>>
>> There is no metric or API for this. What you could do, though, is use
>> #transform() to forward each record unchanged and, as a side task, extract
>> the timestamp via `context#timestamp()` and do some bookkeeping to determine
>> whether the record is out-of-order and what the delay was.
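[Editorial sketch of the bookkeeping described above. The class and method names here are illustrative, not part of the Kafka Streams API; in a real topology you would call track() from inside a Transformer's transform() method, passing in context().timestamp(), and forward the record unchanged.]

```java
// Tracks how many records arrive out of order and the worst observed delay.
// "Out of order" means the record's timestamp is smaller than the largest
// timestamp seen so far on this stream task.
public class OutOfOrderTracker {
    private long maxTimestampSeen = Long.MIN_VALUE;
    private long outOfOrderCount = 0;
    private long maxDelayMs = 0;

    /** Record one event timestamp; returns its delay in ms (0 if in order). */
    public long track(long timestampMs) {
        if (timestampMs < maxTimestampSeen) {
            long delay = maxTimestampSeen - timestampMs;
            outOfOrderCount++;
            if (delay > maxDelayMs) {
                maxDelayMs = delay;
            }
            return delay;
        }
        maxTimestampSeen = timestampMs;
        return 0;
    }

    public long getOutOfOrderCount() { return outOfOrderCount; }
    public long getMaxDelayMs() { return maxDelayMs; }
}
```

Keeping the counting logic separate from the Streams API like this also makes it easy to unit-test without a running topology.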
>>
>>
>>>>>  - same for .mapValues()
>>>>>
>>>>
>>>> I am not sure how to check this.
>>
>> The same way as you do for filter()?
>>
>>
>> -Matthias
>>
>>
>> On 5/4/17 10:29 AM, Mahendra Kariya wrote:
>>> Hi Matthias,
>>>
>>> Please find the answers below.
>>>
>>> I would recommend to double check the following:
>>>>
>>>>  - can you confirm that the filter does not remove all data for those
>>>> time periods?
>>>>
>>>
>>> Filter does not remove all data. There is a lot of data coming in even
>>> after the filter stage.
>>>
>>>
>>>>  - I would also check input for your AggregatorFunction() -- does it
>>>> receive everything?
>>>>
>>>
>>> Yes. Aggregate function seems to be receiving everything.
>>>
>>>
>>>>  - same for .mapValues()
>>>>
>>>
>>> I am not sure how to check this.
>>>
>>
>>
> 

