hive-issues mailing list archives

From "Yongzhi Chen (JIRA)" <>
Subject [jira] [Commented] (HIVE-12378) Exception on HBaseSerDe.serialize binary field
Date Fri, 13 Nov 2015 18:08:11 GMT


Yongzhi Chen commented on HIVE-12378:

Binary cannot be null. This is consistent with the other data types for Hive HBase tables: for example, when I tried insert into test9 values (5, NULL); (test9's second column is string), or the same against test1 (second column is int), I got a similar exception:
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive
Runtime Error while processing row {"tmp_values_col1":"5","tmp_values_col2":null}
	at org.apache.hadoop.mapred.MapTask.runOldMapper(
	at org.apache.hadoop.mapred.YarnChild$
	at Method)
	at org.apache.hadoop.mapred.YarnChild.main(
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing
row {"tmp_values_col1":"5","tmp_values_col2":null}
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(
	... 8 more
Caused by: java.lang.IllegalArgumentException: No columns to insert
	at org.apache.hadoop.hbase.client.HTable.validatePut(
	at org.apache.hadoop.hbase.client.BufferedMutatorImpl.validatePut(
	at org.apache.hadoop.hbase.client.BufferedMutatorImpl.doMutate(
	at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(
	at org.apache.hadoop.hbase.client.HTable.put(
	at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(
	at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(
	at org.apache.hadoop.hive.ql.exec.Operator.forward(
	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(
	at org.apache.hadoop.hive.ql.exec.Operator.forward(
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(
	... 9 more
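The trace above bottoms out in HTable.validatePut: once the serializer skips the NULL field, the Put carries no column mutations, so HBase rejects it with "No columns to insert". A standalone sketch of that pattern (hypothetical class and method names, not the real HBase or Hive code):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: a null column value is skipped during
// serialization, the resulting "put" holds no columns, and the
// validation step rejects it -- mirroring the error seen above.
public class NullColumnSketch {

    // Stand-in for row serialization: null fields are skipped, not written.
    static List<byte[]> serializeRow(byte[] key, byte[] value) {
        List<byte[]> columns = new ArrayList<>();
        if (value != null) {
            columns.add(value);
        }
        return columns;
    }

    // Stand-in for HTable.validatePut: an empty put is an error.
    static void validatePut(List<byte[]> columns) {
        if (columns.isEmpty()) {
            throw new IllegalArgumentException("No columns to insert");
        }
    }

    public static void main(String[] args) {
        try {
            validatePut(serializeRow("5".getBytes(), null));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // prints "No columns to insert"
        }
    }
}
```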


The following code is added because I want LazyBioBinary to be consistent with LazyBinary. The LazyBinary.init method calls super.init(bytes, start, length), which is LazyObject.init, and it contains the same check as the following:
    if (bytes == null) {
      throw new RuntimeException("bytes cannot be null!");
    }
    this.isNull = false;
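To make the delegation concrete, here is a self-contained sketch of the init pattern described above. The class names mirror Hive's LazyObject/LazyBinary but this is an illustrative sketch, not the actual Hive sources:

```java
// Sketch of the init pattern: the subclass delegates to a base-class
// init that rejects null byte arrays outright, so a null input throws
// before isNull is ever cleared.
abstract class LazyObjectSketch {
    protected boolean isNull;

    // Base-class init, as quoted above: null bytes are a hard error.
    public void init(byte[] bytes, int start, int length) {
        if (bytes == null) {
            throw new RuntimeException("bytes cannot be null!");
        }
        this.isNull = false;
    }

    public boolean getIsNull() {
        return isNull;
    }
}

class LazyBinarySketch extends LazyObjectSketch {
    // Delegates to the base class, so null throws here as well,
    // matching the behavior the comment describes for LazyBinary.
    @Override
    public void init(byte[] bytes, int start, int length) {
        super.init(bytes, start, length);
    }
}
```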

> Exception on HBaseSerDe.serialize binary field
> ----------------------------------------------
>                 Key: HIVE-12378
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: HBase Handler, Serializers/Deserializers
>    Affects Versions: 1.0.0, 1.1.0, 2.0.0
>            Reporter: Yongzhi Chen
>            Assignee: Yongzhi Chen
>         Attachments: HIVE-12378.1.patch
> An issue was reproduced with binary-typed HBase columns in Hive.
> The following works fine:
> CREATE TABLE test9 (key int, val string)
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
> "hbase.columns.mapping" = ":key,cf:val#b"
> );
> insert into test9 values(1,"hello");
> But when the string type is changed to binary:
> CREATE TABLE test2 (key int, val binary)
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
> "hbase.columns.mapping" = ":key,cf:val#b"
> );
> insert into table test2 values(1, 'hello');
> The following exception is thrown:
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while
processing row {"tmp_values_col1":"1","tmp_values_col2":"hello"}
> ...
> Caused by: java.lang.RuntimeException: Hive internal error.
> at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitive(
> at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(
> at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serializeField(
> at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(
> at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(
> ... 16 more
> We should support Hive binary type columns for HBase tables.

This message was sent by Atlassian JIRA
