nifi-users mailing list archives

From "Mohit" <mohit.j...@open-insights.co.in>
Subject RE: Validate Record issue
Date Wed, 02 May 2018 17:50:26 GMT
Mark,

 

Thanks for the input. I’ll implement the same.

 

Regards,

Mohit

 

From: Mark Payne <markap14@hotmail.com> 
Sent: 02 May 2018 23:13
To: users@nifi.apache.org
Subject: Re: Validate Record issue

 

Mohit, 

 

Correct - I was saying that it *should* allow that but due to a bug (NIFI-5141) it currently does not. So in the meantime, you would have to update your schema to allow for either double or int (or long, if you prefer) types.
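Mark's suggestion can be illustrated with a minimal sketch. This is not NiFi's actual validation code; it is a hypothetical checker that shows why an integer value fails a double-only union but passes once the union also lists an integer type:

```python
# Hypothetical sketch of Avro union-type checking (not NiFi's real code).
def matches(value, avro_type):
    """True if value is acceptable for a single Avro type name."""
    if avro_type == "null":
        return value is None
    if avro_type in ("int", "long"):
        return isinstance(value, int) and not isinstance(value, bool)
    if avro_type == "double":
        # A lenient checker could also accept ints here; per NIFI-5141,
        # ValidateRecord did not, hence the schema workaround below.
        return isinstance(value, float)
    if avro_type == "string":
        return isinstance(value, str)
    return False

def valid_for_union(value, union):
    """An Avro union accepts a value if any branch accepts it."""
    return any(matches(value, t) for t in union)

# An integer hs_kbps fails a double-only union...
print(valid_for_union(151, ["null", "double"]))          # False
# ...but passes once the union also allows long:
print(valid_for_union(151, ["null", "long", "double"]))  # True
```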

 

Thanks

-Mark

 





On May 2, 2018, at 1:38 PM, Mohit <mohit.jain@open-insights.co.in> wrote:

 

Hi Mark,

 

I set Strict Type Checking to false, but the records are still not allowed through.

 

Thanks,

Mohit

 

From: Mark Payne <markap14@hotmail.com>
Sent: 02 May 2018 23:00
To: users@nifi.apache.org
Subject: Re: Validate Record issue

 

Mohit,

 

If you look at the Provenance events that are emitted by the processor, they show the reason that the records are considered invalid. Specifically, for this use case, it shows: The following 2 fields had values whose type did not match the schema: [/hs_kbps, /site_id]

It appears that in your incoming data, the values are integers, instead of doubles. If you have the "Strict Type Checking" value set to "false" in the processor, then it should allow this. Unfortunately, though, it appears that there is a bug that causes integer values not to be considered valid when the schema says that the field is a double. I created a JIRA [1] for this.

 

In the meantime, if you update your schema to allow for those fields to be ["null", "long", "double"] then you should be good.
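For example, the two flagged fields could be declared like this (a sketch only; note that in Avro the default value must correspond to the first branch of the union, so listing "null" first pairs with a null default):

```json
{"name": "hs_kbps", "type": ["null", "long", "double"], "default": null},
{"name": "site_id", "type": ["null", "long", "double"], "default": null}
```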

 

Thanks

-Mark

 

[1] https://issues.apache.org/jira/browse/NIFI-5141

 






On May 2, 2018, at 11:57 AM, Mohit <mohit.jain@open-insights.co.in> wrote:

 

Hi,

I’m using the ValidateRecord processor to validate CSV and write it out as Avro.

For one file it is transferring all the records to the invalid relationship, although the same file works fine with the ConvertCsvToAvro processor.

 

Avro Schema -

{
  "type": "record",
  "name": "cell_kpi_dump_geo",
  "namespace": "cell_kpi_dump_geo",
  "fields": [
    {"name": "month", "type": ["null", "string"], "default": null},
    {"name": "cell", "type": ["null", "string"], "default": null},
    {"name": "availability", "type": ["int", "null"], "default": 0},
    {"name": "cssr_speech", "type": ["int", "null"], "default": 0},
    {"name": "dcr_speech", "type": ["int", "null"], "default": 0},
    {"name": "hs_kbps", "type": ["double", "null"], "default": 0.0},
    {"name": "eul_kbps", "type": ["int", "null"], "default": 0},
    {"name": "tech", "type": ["null", "string"], "default": null},
    {"name": "site_id", "type": ["double", "null"], "default": 0.0},
    {"name": "longitude", "type": ["double", "null"], "default": 0.0},
    {"name": "latitude", "type": ["double", "null"], "default": 0.0}
  ]
}

 

 

Sample record – 

 

May-16,KA4371D,95,100,0,151,,2G,4371,-1.606926,6.67223
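Pairing the columns of the sample row with the field names from the schema above makes the two flagged fields easy to spot; this small sketch (plain Python, outside NiFi) just does that mapping:

```python
# Pair the sample CSV row with the schema's field names to see which
# columns hold integer-looking text where the schema expects a double.
import csv
import io

fields = ["month", "cell", "availability", "cssr_speech", "dcr_speech",
          "hs_kbps", "eul_kbps", "tech", "site_id", "longitude", "latitude"]
row = next(csv.reader(io.StringIO(
    "May-16,KA4371D,95,100,0,151,,2G,4371,-1.606926,6.67223")))
record = dict(zip(fields, row))

print(record["hs_kbps"])  # -> 151 : integer-looking, but schema says double
print(record["site_id"])  # -> 4371 : likewise
```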

 

Is there something I’m doing wrong?

 

 

Regards,

Mohit

 

