nifi-users mailing list archives

From l vic <lvic4...@gmail.com>
Subject Re: [EXT] ExecuteSQL: convertToAvroStream failure with SQlite integer
Date Fri, 09 Nov 2018 21:06:12 GMT
Works if I change the column type to UNSIGNED BIG INT in SQLite.
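A minimal sketch of that workaround, using the table and column names from the thread. Note two assumptions: SQLite has no ALTER COLUMN, so changing the declared type means recreating the table; and the fix presumably works because the JDBC driver picks a SQL type from the *declared* column type, so "UNSIGNED BIG INT" gets mapped to a long rather than an int. Whether ExecuteSQL then emits Avro "long" depends on the driver, not on anything SQLite stores.

```python
import sqlite3

# Sketch: change the declared type of sched.start_date by recreating
# the table (SQLite cannot alter a column's type in place).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sched (start_date INTEGER)")
conn.execute("INSERT INTO sched VALUES (1536548297955)")

conn.execute("CREATE TABLE sched_new (start_date UNSIGNED BIG INT)")
conn.execute("INSERT INTO sched_new SELECT start_date FROM sched")
conn.execute("DROP TABLE sched")
conn.execute("ALTER TABLE sched_new RENAME TO sched")

# The stored value is unchanged; only the declared type differs, which
# is what a JDBC driver typically inspects when choosing a SQL type.
decl = conn.execute("PRAGMA table_info(sched)").fetchall()[0][2]
row = conn.execute("SELECT start_date FROM sched").fetchone()[0]
print(decl, row)   # UNSIGNED BIG INT 1536548297955
```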

On Fri, Nov 9, 2018 at 9:33 AM Colin Dean <colin.dean@arcadia.io> wrote:

> You could try dropping in the relevant NAR from 1.8.0 but otherwise, no.
> The 1.8.0 version will make it a lot easier to figure out what SQL type is
> causing the problem, though, and will probably result in a bug report.
>
>
>
>
>
> *From: *l vic <lvic4594@gmail.com>
> *Reply-To: *"users@nifi.apache.org" <users@nifi.apache.org>
> *Date: *Friday, November 9, 2018 at 6:52 AM
> *To: *"users@nifi.apache.org" <users@nifi.apache.org>
> *Subject: *Re: [EXT] ExecuteSQL: convertToAvroStream failure with SQlite
> integer
>
>
>
> Using ver. 1.7... Is there any way around it in the existing version?
>
>
>
> On Thu, Nov 8, 2018 at 5:53 PM Colin Dean <colin.dean@arcadia.io> wrote:
>
> What version of NiFi are you using? An error like this comes up every now
> and then; one was just fixed in NiFi 1.8.0 but it was related to JDBC
> drivers that return Long for unsigned ints. 1.8.0 also improved the error
> message so that it should show the type of the object that was passed into
> the unresolvable union.
>
>
>
> https://github.com/apache/nifi/pull/3032
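The `Not in union ["null","int"]` failure quoted below can be illustrated with a quick range check: Avro "int" is a signed 32-bit type, and the epoch-millisecond value from the thread is far outside its range (it does fit Avro "long"). The later `["null","long"]` failure is a different matter: Avro's union resolution matches on the Java class of the value, so a value that fits a long can still fail if the driver hands it over wrapped in an unexpected class, which is the kind of mismatch the 1.8.0 error-message change exposes.

```python
# Why 1536548297955 can never satisfy a ["null","int"] union:
value = 1536548297955          # start_date from the failing row
INT32_MAX = 2**31 - 1          # Avro "int" upper bound (2147483647)
INT64_MAX = 2**63 - 1          # Avro "long" upper bound

fits_int = -2**31 <= value <= INT32_MAX
fits_long = -2**63 <= value <= INT64_MAX
print(fits_int, fits_long)     # False True
```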
>
>
>
>
>
> *From: *l vic <lvic4594@gmail.com>
> *Reply-To: *"users@nifi.apache.org" <users@nifi.apache.org>
> *Date: *Thursday, November 8, 2018 at 5:43 PM
> *To: *"users@nifi.apache.org" <users@nifi.apache.org>
> *Subject: *[EXT] ExecuteSQL: convertToAvroStream failure with SQlite
> integer
>
>
>
> Hi, I am trying to use ExecuteSQL to get an "epoch time" value from a
> SQLite table:
>
> select start_date from sched
>
> where start_date is defined as INTEGER
>
> If the start_date = 1536548297955, I see the following exception:
>
> failed to process due to
> org.apache.avro.file.DataFileWriter$AppendWriteException:
> org.apache.avro.UnresolvedUnionException: Not in union ["null","int"]:
> 1536548297955; rolling back session: {}
>
> org.apache.avro.file.DataFileWriter$AppendWriteException:
> org.apache.avro.UnresolvedUnionException: Not in union ["null","int"]:
> 1536548297955
>
> at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:308)
>
> at
>
> Caused by: org.apache.avro.UnresolvedUnionException: Not in union
> ["null","int"]: 1536548297955
>
> at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:709)
>
>
>
> This is obviously an Avro conversion issue, as this works from the sqlite3 CLI.
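The SQLite side of that observation can be reproduced with Python's built-in sqlite3 module (a stand-in for the sqlite3 CLI): the database returns the epoch-millisecond value without complaint, which supports the point that the failure happens in the Avro conversion step, not in the query itself.

```python
import sqlite3

# Reproduce the query from the thread; SQLite itself has no trouble
# storing or returning the value as a 64-bit integer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sched (start_date INTEGER)")
conn.execute("INSERT INTO sched VALUES (1536548297955)")
value = conn.execute("SELECT start_date FROM sched").fetchone()[0]
print(value)   # 1536548297955
```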
>
> If I try to define it as BIGINT, I get
> org.apache.avro.UnresolvedUnionException: Not in union
> ["null","long"]: 1536548297955;
>
> Any idea how I can resolve this?
>
> Thanks,
>
> -V
>
>
