drill-user mailing list archives

From Abhishek Girish <abhishek.gir...@gmail.com>
Subject Re: JSON field changes from scalar to list
Date Fri, 15 May 2015 23:31:11 GMT
Actually, this still constitutes a schema change across records. And
store.json.all_text_mode wouldn't work, since the type changes from scalar
to list. I tried a few things but couldn't find a workaround! I would
suggest pre-processing the data to remove such schema changes.
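
If it helps, a rough (untested) pre-processing sketch in Python could look
something like the one below - the paths, and the assumption that the file
holds concatenated JSON objects shaped like your sample, are just examples.
It wraps any scalar "a" inside "data" in a single-element list so the type
is consistent across records:

import json

# Rough, untested sketch: the input/output paths and the field name "a"
# are taken from the example in this thread and are only placeholders.
decoder = json.JSONDecoder()

with open("/tmp/test.json") as src:
    text = src.read()

records = []
pos = 0
while pos < len(text):
    # Skip whitespace between the concatenated JSON objects.
    while pos < len(text) and text[pos].isspace():
        pos += 1
    if pos >= len(text):
        break
    record, pos = decoder.raw_decode(text, pos)
    # Wrap a scalar "a" in a single-element list so every record exposes
    # the same type for the field.
    for element in record.get("data", []):
        if isinstance(element, dict) and "a" in element and not isinstance(element["a"], list):
            element["a"] = [element["a"]]
    records.append(record)

# Write one JSON object per line and point the query at this file instead.
with open("/tmp/test_normalized.json", "w") as dst:
    for record in records:
        dst.write(json.dumps(record) + "\n")

After that, querying dfs.tmp.`test_normalized.json` should see "a" as a list
in every record.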


On Fri, May 15, 2015 at 4:07 PM, Vince Gonzalez <vince.gonzalez@gmail.com> wrote:

> Actually, the example I gave wasn't a representative one - is there any way
> to work around the same error given this data? Is there no way to "cast" the
> simple value into a list? Will I need to modify the "a" column in
> pre-processing?
>
>
> { "timestamp": 1431717028,
>   "data": [
>     { "a": 1 }
>   ]
> }
> { "timestamp": 1431717029,
>   "data": [
>     { "a": [ 1,2,3 ] }
>   ]
> }
>
>
> 0: jdbc:drill:zk=local> select t.data.a from dfs.tmp.`test.json` t;
> Query failed: DATA_READ ERROR: You tried to start when you are using a
> ValueWriter of type NullableBigIntWriterImpl.
>
> File  /tmp/test.json
> Record  2
> Line  8
> Column  14
> Field  a
> Fragment 0:0
>
> [5686ca9d-bce2-4dd1-8ad0-c9a187e5eeff on 172.30.1.73:31010]
>
>
>
> On Fri, May 15, 2015 at 4:04 PM, Abhishek Girish <abhishek.girish@gmail.com> wrote:
>
> > Hey Vince,
> >
> > I don't think that's supported. The array "data" is heterogeneous (one is
> > a simple value and the other is an array).
> >
> > -Abhishek
> >
> >
> > On Fri, May 15, 2015 at 12:12 PM, Vince Gonzalez <vince.gonzalez@gmail.com> wrote:
> >
> > > Given the following data sample (stored in /tmp/test.json) and query, is
> > > there a good way to avoid an error?
> > >
> > > { "timestamp": 1431717028,
> > >   "data": [
> > >     { "a": 1 },
> > >     { "a": [ 1,2,3 ] }
> > >   ]
> > > }
> > >
> > >
> > > 0: jdbc:drill:zk=local> select flatten(data) as data from
> > > dfs.tmp.`test.json`;
> > > Query failed: DATA_READ ERROR: You tried to start when you are using a
> > > ValueWriter of type NullableBigIntWriterImpl.
> > >
> > > File  /tmp/test.json
> > > Record  1
> > > Line  4
> > > Column  14
> > > Field  a
> > > Fragment 0:0
> > >
> > > [b4ea66d1-98c9-426f-a2ce-aeb211a378da on 172.30.1.73:31010]
> > > Error: exception while executing query: Failure while executing query.
> > > (state=,code=0)
> > >
> >
>
