sqoop-dev mailing list archives

From "Markus Kemper (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SQOOP-3078) Sqoop (export + --hcatalog) with Oracle BINARY_DOUBLE fails with NPE
Date Mon, 12 Dec 2016 12:59:59 GMT
Markus Kemper created SQOOP-3078:
------------------------------------

             Summary: Sqoop (export + --hcatalog) with Oracle BINARY_DOUBLE fails with NPE
                 Key: SQOOP-3078
                 URL: https://issues.apache.org/jira/browse/SQOOP-3078
             Project: Sqoop
          Issue Type: Bug
          Components: hive-integration
            Reporter: Markus Kemper


According to the documentation, BINARY_DOUBLE should be supported, so this appears to be a defect.

{noformat}
#################
# DOC - Type Binary Double is supported
#################

http://archive.cloudera.com/cdh5/cdh/5/sqoop-1.4.6-cdh5.8.0/SqoopUserGuide.html#_supported_data_types
25.8.2.4. Supported Data Types
The following Oracle data types are supported by the Data Connector for Oracle and Hadoop:
BINARY_DOUBLE 

#################
# STEP 01 - Setup Oracle Table and Data
#################

export MYCONN=jdbc:oracle:thin:@oracle.cloudera.com:1521/orcl12c;
export MYUSER=sqoop
export MYPSWD=sqoop

sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "drop table t1_oracle"
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "create table t1_oracle (c1 varchar(10), c2 binary_double, c3 binary_double, c4 varchar(10))"
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "delete from t1_oracle"
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into t1_oracle values ('one.one', 1.1, 1.1, 'one.one')"
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1_oracle"

Output:
-------------------------------------------------
| C1         | C2       | C3       | C4         | 
-------------------------------------------------
| one.one    | 1.1      | 1.1      | one.one    | 
-------------------------------------------------

#################
# STEP 02 - Verify Import (--target-dir) and Export (--export-dir)
#################

sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --target-dir /user/root/t1_oracle --delete-target-dir --num-mappers 1 --fields-terminated-by ','
hdfs dfs -cat /user/root/t1_oracle/part*
sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --export-dir /user/root/t1_oracle --num-mappers 1 --input-fields-terminated-by ','
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1_oracle"

Output:
one.one,1.1,1.1,one.one
---
-------------------------------------------------
| C1         | C2       | C3       | C4         | 
-------------------------------------------------
| one.one    | 1.1      | 1.1      | one.one    | 
| one.one    | 1.1      | 1.1      | one.one    | 
-------------------------------------------------

#################
# STEP 03 - Verify Import (--hive-import) and Export (--export-dir)
#################

sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "delete from t1_oracle"
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into t1_oracle values ('one.one', 1.1, 1.1, 'one.one')"
sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --target-dir /user/root/t1_oracle --delete-target-dir --num-mappers 1 --hive-import --hive-database default --hive-table t1_oracle --fields-terminated-by ','
beeline -u jdbc:hive2:// -e "use default; select * from t1_oracle;"
hdfs dfs -cat /user/hive/warehouse/t1_oracle/part*
sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --export-dir /user/hive/warehouse/t1_oracle --num-mappers 1 --input-fields-terminated-by ','
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1_oracle"

Output:
+---------------+---------------+---------------+---------------+--+
| t1_oracle.c1  | t1_oracle.c2  | t1_oracle.c3  | t1_oracle.c4  |
+---------------+---------------+---------------+---------------+--+
| one.one       | 1.1           | 1.1           | one.one       |
+---------------+---------------+---------------+---------------+--+
---
one.one,1.1,1.1,one.one
---
-------------------------------------------------
| C1         | C2       | C3       | C4         | 
-------------------------------------------------
| one.one    | 1.1      | 1.1      | one.one    | 
| one.one    | 1.1      | 1.1      | one.one    | 
-------------------------------------------------

#################
# STEP 04 - Verify Import (--hive-import) and Export (--hcatalog)
# This is the defect reproduction
#################

sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "delete from t1_oracle"
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1_oracle"
sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --hcatalog-database default --hcatalog-table t1_oracle --num-mappers 1 --input-fields-terminated-by ','

Output:
-------------------------------------------------
| C1         | C2       | C3       | C4         | 
-------------------------------------------------
-------------------------------------------------
---
16/12/07 16:37:14 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
	at org.apache.hadoop.io.Text.encode(Text.java:450)
	at org.apache.hadoop.io.Text.set(Text.java:198)
	at org.apache.hadoop.io.Text.<init>(Text.java:88)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureExportInputFormat(SqoopHCatUtilities.java:857)
	at org.apache.sqoop.mapreduce.JdbcExportJob.configureInputFormat(JdbcExportJob.java:73)
	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:432)
	at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:455)
	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
	at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
16/12/07 16:37:14 INFO hive.metastore: Closed a connection to metastore, current connections: 0

#################
# STEP 05 - Verify Export (--hcatalog) with (--columns)
# This is also a defect reproduction
#################

sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --hcatalog-database default --hcatalog-table t1_oracle --num-mappers 1 --input-fields-terminated-by ',' --columns C1
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1_oracle"

-------------------------------------------------
| C1         | C2       | C3       | C4         | 
-------------------------------------------------
| one.one    | (null)   | (null)   | (null)     | 
-------------------------------------------------

sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --hcatalog-database default --hcatalog-table t1_oracle --num-mappers 1 --input-fields-terminated-by ',' --columns C1,C4
sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1_oracle"

-------------------------------------------------
| C1         | C2       | C3       | C4         | 
-------------------------------------------------
| one.one    | (null)   | (null)   | (null)     | 
| one.one    | (null)   | (null)   | one.one    | 
-------------------------------------------------

sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1_ORACLE --hcatalog-database default --hcatalog-table t1_oracle --num-mappers 1 --input-fields-terminated-by ',' --columns C2

16/12/09 12:34:44 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
	at org.apache.hadoop.io.Text.encode(Text.java:450)
	at org.apache.hadoop.io.Text.set(Text.java:198)
	at org.apache.hadoop.io.Text.<init>(Text.java:88)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureExportInputFormat(SqoopHCatUtilities.java:857)
	at org.apache.sqoop.mapreduce.JdbcExportJob.configureInputFormat(JdbcExportJob.java:73)
	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:432)
	at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:455)
	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
	at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
{noformat}
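One plausible reading of the stack trace: SqoopHCatUtilities.configureExportInputFormat looks up a per-column Hive type string and hands it to the org.apache.hadoop.io.Text constructor, which encodes the string immediately and throws NullPointerException on null input; a column type with no mapping (here, the one backing BINARY_DOUBLE) would produce exactly this trace. A minimal Java sketch of that failure mode (NOT Sqoop source; the column names and type map are hypothetical stand-ins):

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Sketch (not Sqoop code) of how a missing per-column type mapping can
// surface as an NPE inside Text.encode(): the lookup returns null and
// the null string is encoded without a guard.
public class NpeSketch {

    // Stand-in for org.apache.hadoop.io.Text encoding: throws NPE on null.
    static byte[] encode(String s) {
        return s.getBytes(StandardCharsets.UTF_8);
    }

    static String describe(String col, Map<String, String> hiveTypeFor) {
        String hiveType = hiveTypeFor.get(col); // null if col has no mapping
        try {
            return col + " -> " + encode(hiveType).length + " bytes";
        } catch (NullPointerException e) {
            return "NPE for " + col + " (no Hive type mapped)";
        }
    }

    public static void main(String[] args) {
        Map<String, String> hiveTypeFor = new HashMap<>();
        hiveTypeFor.put("C1", "string");
        hiveTypeFor.put("C4", "string");
        // C2/C3 (the BINARY_DOUBLE-backed columns) intentionally have no entry.
        for (String col : new String[] {"C1", "C2"}) {
            System.out.println(describe(col, hiveTypeFor));
        }
    }
}
```

This would also be consistent with the --columns results above: exports listing only C1 or C1,C4 succeed, while any export touching C2 hits the NPE.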



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
