nifi-users mailing list archives

From Joe Percivall <>
Subject Re: Getting the number of logs
Date Thu, 10 Nov 2016 01:16:57 GMT
Hello Sai,
Let me paraphrase what I think your use case is first; let me know if this is wrong. You
want to keep track of the number of logs coming in, and every hour you want to record how
many came in during that hour. NiFi currently doesn't handle this type of "stateful" event
processing very well, so with what it offers out of the box you are quite limited.
That said, I've done some work to help move NiFi into the "stateful" event processing
space that may help you. I currently have an open PR[1] to add state to UpdateAttribute. This
allows you to keep stateful values (like a count) and even acts as a stateful rule engine
(using UpdateAttribute's 'Advanced Tab').
So to solve your use case, you can set up one stateful UpdateAttribute along your
main flow that counts all your incoming FlowFiles. Then add a GenerateFlowFile processor
running on an hourly cron schedule, routed to the stateful UpdateAttribute to act as a trigger.
When the stateful UpdateAttribute is triggered, it adds the count as an attribute on the
triggering FlowFile and resets the count. Then a RouteOnAttribute after the stateful
UpdateAttribute separates the triggering FlowFile from the incoming data so it can be put to
Elasticsearch.
That may not have been the best explanation; if it isn't clear, I can create a template and
take screenshots tomorrow if you're interested. One thing to keep in mind: the stateful
processing in this PR only works with local state. So no tracking counts across a whole
cluster, just per node.
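The trigger-and-reset pattern described above can be sketched in plain Python. This is only an illustration of the logic the stateful UpdateAttribute would perform, not the processor's actual API; the class and attribute names here ("trigger", "log.count") are hypothetical.

```python
# Illustrative sketch of the count-and-reset logic: data FlowFiles
# increment a local counter, and an hourly trigger FlowFile reads the
# count into an attribute and resets it. All names are assumptions.

class StatefulCounter:
    """Counts data FlowFiles; a trigger FlowFile reads and resets the count."""

    def __init__(self):
        self.count = 0  # local state only: per node, not cluster-wide

    def on_flowfile(self, attributes):
        if attributes.get("trigger") == "true":
            # Trigger from the hourly GenerateFlowFile: stamp the count
            # onto the FlowFile and reset, so a downstream RouteOnAttribute
            # can separate it from the data and send it to Elasticsearch.
            attributes["log.count"] = str(self.count)
            self.count = 0
        else:
            # Ordinary data FlowFile from the main flow: just count it.
            self.count += 1
        return attributes
```

A RouteOnAttribute matching on the presence of the count attribute then plays the role of the final split between the trigger FlowFile and the main data.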
Joe

- - - - - -
Joseph Percivall

    On Wednesday, November 9, 2016 11:41 AM, "Peddy, Sai" <>

Hi All,

Previously posted this in the Dev listserv; moving it over to the Users listserv.

I'm currently working on a use case to track the number of individual logs that come in and
put that information into Elasticsearch. I wanted to see if there is an easy way to do this
and whether anyone had any good ideas.

The current approach I'm considering: route the incoming log files to a SplitText and
RouteText processor, to make sure no empty logs get through and to get the individual log
count when files contain multiple logs. At the end of this, the total number of logs is
visible in the UI queue, where it displays the queueCount, but that information is not
readily available to any processor. My current thought is to use an ExecuteScript processor
to update a local file that keeps track of the count, and insert the document into
Elasticsearch hourly.

Any advice would be appreciated.

Thanks,
Sai Peddy
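The ExecuteScript idea above (a local file as the counter, flushed hourly) could be sketched like this in Python. The file path and function names are illustrative assumptions, and a real ExecuteScript body would additionally read FlowFiles from the NiFi session; this shows only the file-backed counting logic.

```python
# Hedged sketch of a file-backed counter: one function adds newly seen
# logs to a persisted count, another reads and resets it on the hourly
# schedule so the value can be indexed into Elasticsearch.
import os

COUNT_FILE = "/tmp/nifi_log_count.txt"  # assumed location for local state

def add_to_count(n):
    """Add n newly seen logs to the persisted count."""
    current = 0
    if os.path.exists(COUNT_FILE):
        with open(COUNT_FILE) as f:
            current = int(f.read().strip() or "0")
    with open(COUNT_FILE, "w") as f:
        f.write(str(current + n))

def flush_count():
    """Return the count accumulated since the last flush and reset it;
    the caller would index the returned value into Elasticsearch."""
    if not os.path.exists(COUNT_FILE):
        return 0
    with open(COUNT_FILE) as f:
        total = int(f.read().strip() or "0")
    with open(COUNT_FILE, "w") as f:
        f.write("0")
    return total
```

Note that, like the stateful UpdateAttribute approach, this is per-node state: each node in a cluster would keep its own file.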
