nifi-users mailing list archives

From Adam Williams <aaronfwilli...@outlook.com>
Subject RE: CSV to Mongo
Date Mon, 21 Sep 2015 19:19:01 GMT
Sorry about that, this should work.  I've attached the template; the error is below:
2015-09-21 14:36:02,821 ERROR [Timer-Driven Process Thread-10] o.a.nifi.processors.mongodb.PutMongo
PutMongo[id=480877a4-f349-4ef7-9538-8e3e3e108e06] Failed to insert StandardFlowFileRecord[uuid=bbd7048f-d5a1-4db4-b938-da64b67e810e,claim=org.apache.nifi.controller.repository.claim.StandardContentClaim@8893ae38,offset=0,name=GDELT.MASTERREDUCEDV2.TXT,size=6581409407]
into MongoDB due to java.lang.NegativeArraySizeException: java.lang.NegativeArraySizeException
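For context, the flow file in that log line is roughly 6.5 GB (size=6581409407 bytes), which is larger than Integer.MAX_VALUE.  Below is a minimal standalone Java sketch (not NiFi's actual code) showing how a size that large can surface as a NegativeArraySizeException if the whole content is read into a single byte array:

public class NegativeArraySizeDemo {
    public static void main(String[] args) {
        // Size taken from the log line above: about 6.5 GB.
        long flowFileSize = 6581409407L;

        // Casting a value larger than Integer.MAX_VALUE (2147483647) to int
        // overflows and yields a negative number.
        int bufferSize = (int) flowFileSize;
        System.out.println("bufferSize = " + bufferSize);   // prints -2008525185

        // Allocating a byte[] with that negative size throws
        // java.lang.NegativeArraySizeException, matching the PutMongo error.
        byte[] buffer = new byte[bufferSize];
    }
}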

Date: Mon, 21 Sep 2015 15:12:43 -0400
Subject: Re: CSV to Mongo
From: bbende@gmail.com
To: users@nifi.apache.org

Adam, 
I imported the template and it looks like it only captured the PutMongo processor. Can you
try deselecting everything on the graph and creating the template again so we can take a look
at the rest of the flow? Or, if you have other stuff on your graph, select all of the processors
you described so they all get captured.
Also, can you provide any of the stacktrace for the exception you are seeing? The log is in
NIFI_HOME/logs/nifi-app.log
Thanks,
Bryan

On Mon, Sep 21, 2015 at 3:03 PM, Bryan Bende <bbende@gmail.com> wrote:
Adam,
Thanks for attaching the template, we will take a look and see what is going on.
Thanks,
Bryan

On Mon, Sep 21, 2015 at 2:50 PM, Adam Williams <aaronfwilliams@outlook.com> wrote:



Hey Joe,
Sure thing.  I attached the template; I'm just using the GDELT data set for the GetFile processor,
which works.  The error I get is a negative array size (NegativeArraySizeException).


> Date: Mon, 21 Sep 2015 14:24:50 -0400
> Subject: Re: CSV to Mongo
> From: joe.witt@gmail.com
> To: users@nifi.apache.org
> 
> Adam,
> 
> Regarding moving from Storm to NiFi, I'd say they make better teammates
> than competitors.  The use case outlined above should be quite easy
> for NiFi, but there are analytic/processing functions that Storm is
> probably a better answer for.  We're happy to help explore that with
> you as you progress.
> 
> If you ever run into an ArrayIndexOutOfBoundsException (or a
> NegativeArraySizeException like this one), it will always be 100% a
> coding error.  Would you mind sending your flow.xml.gz over or making
> a template of the flow (assuming it contains nothing sensitive)?  If
> at all possible, sample data which exposes the issue would be ideal.
> As an alternative, can you go ahead and send us the resulting stack
> trace/error that comes out?
> 
> We'll get this addressed.
> 
> Thanks
> Joe
> 
> On Mon, Sep 21, 2015 at 2:17 PM, Adam Williams
> <aaronfwilliams@outlook.com> wrote:
> > Hello,
> >
> > I'm moving from Storm to NiFi and trying to do a simple test of getting a
> > large CSV file dumped into MongoDB.  The CSV file has a header with column
> > names and it is structured; my only problem is dumping it into MongoDB.  At
> > a high level, do the following processor steps look correct?  All I want is
> > to just pull the whole CSV file over to MongoDB without a regex or anything
> > fancy (yet).  I eventually always seem to hit trouble with array index
> > problems with the PutMongo processor (see the sketch at the end of this thread):
> >
> > GetFile --> ExtractText --> RouteOnAttribute (not a null line) --> PutMongo.
> >
> > Does that seem to be the right way to do this in NiFi?
> >
> > Thank you,
> > Adam
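
For reference, here is a rough standalone Java sketch of what the flow above is trying to accomplish, using the MongoDB Java driver (3.x) to turn each CSV row into a document.  The host, database and collection names, and the comma delimiter are illustrative assumptions, not taken from the thread:

import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CsvToMongo {
    public static void main(String[] args) throws IOException {
        // Hypothetical connection details, database, and collection names.
        MongoClient mongo = new MongoClient("localhost", 27017);
        MongoCollection<Document> collection =
                mongo.getDatabase("gdelt").getCollection("events");

        try (BufferedReader reader =
                     new BufferedReader(new FileReader("GDELT.MASTERREDUCEDV2.TXT"))) {
            // Assume the first line is the header with the column names.
            String headerLine = reader.readLine();
            if (headerLine == null) {
                return;
            }
            String[] columns = headerLine.split(",");    // assumed comma-delimited

            String line;
            while ((line = reader.readLine()) != null) {
                if (line.trim().isEmpty()) {
                    continue;                            // skip empty/null lines
                }
                String[] values = line.split(",", -1);
                Document doc = new Document();
                for (int i = 0; i < columns.length && i < values.length; i++) {
                    doc.append(columns[i], values[i]);
                }
                collection.insertOne(doc);               // one Mongo document per CSV row
            }
        } finally {
            mongo.close();
        }
    }
}

Each row becomes one Mongo document here; in the NiFi flow, whatever sits in front of PutMongo needs to do the equivalent, handing it content it can insert as a document rather than the entire multi-gigabyte file at once.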