sqoop-dev mailing list archives

From "Ying Cao (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SQOOP-3211) Sqoop1 (import + --target-dir + --table) using DB2 with case-sensitive-table fails with error (DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704)
Date Thu, 27 Jul 2017 08:42:00 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-3211?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16102938#comment-16102938 ]

Ying Cao commented on SQOOP-3211:
---------------------------------

The DB2 error SQLCODE=-204 is the typical "table not found" error.

From the error details:

{noformat}
17/07/17 14:14:47 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM t1_lower AS t WHERE 1=0
17/07/17 14:14:47 ERROR manager.SqlManager: Error executing statement: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
	at com.ibm.db2.jcc.am.fd.a(fd.java:739)
{noformat}

Sqoop sends the SQL "SELECT t.* FROM t1_lower AS t WHERE 1=0" to DB2, but DB2 folds unquoted identifiers to upper case, so this statement looks for the table "T1_LOWER" instead of "t1_lower". DB2 therefore correctly returns SQLCODE -204.

Returning to your issue: escaping the table name with \"\" so that the lower-case name reaches DB2 as a quoted identifier will fix the error.

For example, to import the DB2 table "test":

{noformat}
sqoop import --connect <url> --username <user> --password <password> --table \"\"test\"\"
{noformat}

It will then access the table "test":

{noformat}
INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "test" AS t WHERE 1=0
{noformat}
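As a side note, the shell strips one layer of quoting before Sqoop ever sees the argument. A quick local sketch (using printf as a stand-in for sqoop, so no DB2 connection is needed) shows what each --table spelling actually delivers to the program:

```shell
# Sketch: printf stands in for sqoop and prints the argument exactly
# as the program would receive it after the shell removes its quoting.
printf '%s\n' t1_lower        # program sees: t1_lower   (DB2 folds it to T1_LOWER)
printf '%s\n' \"t1_lower\"    # program sees: "t1_lower" (quoted identifier survives)
```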

> Sqoop1 (import + --target-dir + --table) using DB2 with case-sensitive-table fails with error (DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704)
> -------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-3211
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3211
>             Project: Sqoop
>          Issue Type: Bug
>         Environment: $ sqoop version
> 17/07/17 14:22:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.10.1
>            Reporter: Markus Kemper
>
> Sqoop1 (import + --target-dir + --table) using DB2 with case-sensitive-table fails with error (DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704), see test case below.
> *Test Case*
> {noformat}
> #################
> # STEP 01 - Create Table and Data
> #################
> export MYCONN=jdbc:db2://host.domain.com:50000/SQOOP
> export MYUSER=sqoop;
> export MYPSWD=sqoop;
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "drop table t1_default"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "create table t1_default (c1_default int)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into t1_default values (1)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select c1_default from t1_default"
> ---------------
> | C1_DEFAULT  | 
> ---------------
> | 1           | 
> ---------------
> #################
> # STEP 02 - Import Data (baseline) using (--as-textfile, --as-avrodatafile, --as-parquetfile)
> #################
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1_default --target-dir /user/root/t1_default --delete-target-dir --num-mappers 1 --as-textfile
> hdfs dfs -cat /user/root/t1_default/part*
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1_default --target-dir /user/root/t1_default --delete-target-dir --num-mappers 1 --as-avrodatafile
> avro-tools tojson --pretty 'hdfs://host.domain.com/user/root/t1_default/part-m-00000.avro'
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1_default --target-dir /user/root/t1_default --delete-target-dir --num-mappers 1 --as-parquetfile
> hdfs dfs -ls /user/root/t1_default/*.parquet
> parquet-tools cat --json 'hdfs://host.domain.com/user/root/t1_default/c2a7687c-b2bd-40d6-8959-d8ce9c240ae6.parquet'
> Output:
> 17/07/17 14:00:41 INFO mapreduce.ImportJobBase: Transferred 2 bytes in 30.2579 seconds (0.0661 bytes/sec)
> 17/07/17 14:00:41 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> 1
> ~~~~~
> 17/07/17 14:02:05 INFO mapreduce.ImportJobBase: Transferred 263 bytes in 22.8116 seconds (11.5292 bytes/sec)
> 17/07/17 14:02:05 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> {
>   "C1_DEFAULT" : {
>     "int" : 1
>   }
> }
> ~~~~~
> 17/07/17 14:06:29 INFO mapreduce.ImportJobBase: Transferred 1.2578 KB in 29.3071 seconds (43.9484 bytes/sec)
> 17/07/17 14:06:29 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> -rw-r--r--   3 root root        449 2017-07-17 14:06 /user/root/t1_default/c2a7687c-b2bd-40d6-8959-d8ce9c240ae6.parquet
> ~~~~~
> {"C1_DEFAULT":1}
> #################
> # STEP 03 - Create Table and Data with case-sensitive DB2 table name
> #################
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "drop table \"t1_lower\""
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "create table \"t1_lower\" (\"c1_lower\" int)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into \"t1_lower\" values (1)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select \"c1_lower\" from \"t1_lower\""
> sqoop list-tables --connect $MYCONN --username $MYUSER --password $MYPSWD | egrep -i "t1"
> Output:
> ---------------
> | c1_lower    | 
> ---------------
> | 1           | 
> ---------------
> ~~~~~
> T1_DEFAULT
> t1_lower
> #################
> # STEP 04 - Import Data with case-sensitive DB2 table name using (--as-textfile, --as-avrodatafile, --as-parquetfile)
> #################
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table \"t1_lower\" --target-dir /user/root/t1_lower --delete-target-dir --num-mappers 1 --as-textfile
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table \"t1_lower\" --target-dir /user/root/t1_lower --delete-target-dir --num-mappers 1 --as-avrodatafile
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table \"t1_lower\" --target-dir /user/root/t1_lower --delete-target-dir --num-mappers 1 --as-parquetfile
> Output:
> 17/07/17 14:13:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM t1_lower AS t WHERE 1=0
> 17/07/17 14:13:59 ERROR manager.SqlManager: Error executing statement: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
> com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
> 	at com.ibm.db2.jcc.am.fd.a(fd.java:739)
> ~~~~~
> 17/07/17 14:14:47 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM t1_lower AS t WHERE 1=0
> 17/07/17 14:14:47 ERROR manager.SqlManager: Error executing statement: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
> com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
> 	at com.ibm.db2.jcc.am.fd.a(fd.java:739)
> ~~~~~
> 17/07/17 14:15:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM t1_lower AS t WHERE 1=0
> 17/07/17 14:15:26 ERROR manager.SqlManager: Error executing statement: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
> com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2INST1.T1_LOWER, DRIVER=4.16.53
> 	at com.ibm.db2.jcc.am.fd.a(fd.java:739)
> #################
> # STEP 05 - Import Data with case-sensitive DB2 table name using (--query) as workaround
> #################
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from \"t1_lower\" where \$CONDITIONS" --target-dir /user/root/t1_lower --delete-target-dir --num-mappers 1 --as-textfile
> hdfs dfs -cat /user/root/t1_lower/part*
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from \"t1_lower\" where \$CONDITIONS" --target-dir /user/root/t1_lower --delete-target-dir --num-mappers 1 --as-avrodatafile
> avro-tools tojson --pretty 'hdfs://host.domain.com/user/root/t1_lower/part-m-00000.avro'
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from \"t1_lower\" where \$CONDITIONS" --target-dir /user/root/t1_lower --delete-target-dir --num-mappers 1 --as-parquetfile
> hdfs dfs -ls /user/root/t1_lower/*.parquet
> parquet-tools cat --json 'hdfs://host.domain.com/user/root/t1_lower/0a9f9927-1e9e-4f6b-90af-adc68403fea0.parquet'
> Output:
> 17/07/17 14:17:34 INFO mapreduce.ImportJobBase: Transferred 2 bytes in 23.1601 seconds (0.0864 bytes/sec)
> 17/07/17 14:17:34 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> 1
> ~~~~~
> 17/07/17 14:30:30 INFO mapreduce.ImportJobBase: Transferred 270 bytes in 21.6549 seconds (12.4683 bytes/sec)
> 17/07/17 14:30:30 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> {
>   "c1_lower" : {
>     "int" : 1
>   }
> }
> ~~~~~
> 17/07/17 14:32:51 INFO mapreduce.ImportJobBase: Transferred 1.2832 KB in 24.6545 seconds (53.2966 bytes/sec)
> 17/07/17 14:32:51 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> -rw-r--r--   3 root root        461 2017-07-17 14:32 /user/root/t1_lower/0a9f9927-1e9e-4f6b-90af-adc68403fea0.parquet
> ~~~~~
> {"c1_lower":1}
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
