sqoop-dev mailing list archives

From "Eric Lin (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SQOOP-3188) Sqoop1 (import + --target-dir) with empty directory (/usr/lib/hive) fails with error (java.lang.NoClassDefFoundError: org/json/JSONObject)
Date Thu, 12 Jul 2018 08:33:00 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-3188?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16541316#comment-16541316 ]

Eric Lin commented on SQOOP-3188:
---------------------------------

I still can't reproduce the issue after multiple attempts against trunk code. Maybe it has
already been fixed?

> Sqoop1 (import + --target-dir) with empty directory (/usr/lib/hive) fails with error (java.lang.NoClassDefFoundError: org/json/JSONObject)
> ------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-3188
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3188
>             Project: Sqoop
>          Issue Type: Bug
>            Reporter: Markus Kemper
>            Assignee: Eric Lin
>            Priority: Major
>
> Sqoop1 (import + --target-dir) with empty directory (/usr/lib/hive) fails with error (java.lang.NoClassDefFoundError: org/json/JSONObject), see test case below.
> *Test Case*
> {noformat}
> #################
> # STEP 01 - Create Table and Data
> #################
> export MYCONN=jdbc:mysql://mysql.sqoop.com:3306/sqoop
> export MYUSER=sqoop
> export MYPSWD=sqoop
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "drop table t1"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "create table t1 (c1 int, c2 date, c3 varchar(10))"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into t1 values (1, current_date, 'some data')"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1"
> Output:
> -----------------------------------------
> | c1          | c2         | c3         | 
> -----------------------------------------
> | 1           | 2017-05-10 | some data  | 
> -----------------------------------------
> #################
> # STEP 02 - Import Data into HDFS 
> #################
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1
> hdfs dfs -cat /user/root/t1/part*
> Output:
> 17/05/10 13:46:24 INFO mapreduce.ImportJobBase: Transferred 23 bytes in 22.65 seconds (1.0155 bytes/sec)
> 17/05/10 13:46:24 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> 1,2017-05-10,some data
> #################
> # STEP 03 - Create Bogus Hive Directory and Attempt to Import into HDFS
> #################
> mkdir /usr/lib/hive
> chmod 777 /usr/lib/hive
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1
> Output:
> 17/05/10 13:47:44 INFO mapreduce.ImportJobBase: Beginning import of t1
> Exception in thread "main" java.lang.NoClassDefFoundError: org/json/JSONObject
> 	at org.apache.sqoop.util.SqoopJsonUtil.getJsonStringforMap(SqoopJsonUtil.java:43)
> 	at org.apache.sqoop.SqoopOptions.writeProperties(SqoopOptions.java:776)
> 	at org.apache.sqoop.mapreduce.JobBase.putSqoopOptionsToConfiguration(JobBase.java:388)
> 	at org.apache.sqoop.mapreduce.JobBase.createJob(JobBase.java:374)
> 	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:256)
> 	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
> 	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
> 	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
> 	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
> 	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> 	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
> 	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> Caused by: java.lang.ClassNotFoundException: org.json.JSONObject
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	... 15 more
> #################
> # STEP 04 - Remove Bogus Hive Directory and Attempt to Import into HDFS 
> #################
> rm -rf /usr/lib/hive
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1
> hdfs dfs -cat /user/root/t1/part*
> Output:
> 17/05/10 13:52:30 INFO mapreduce.ImportJobBase: Transferred 23 bytes in 22.6361 seconds (1.0161 bytes/sec)
> 17/05/10 13:52:30 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> 1,2017-05-10,some data
> {noformat}
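
The failure mode in STEP 03 is consistent with a launcher script that tests only for the Hive directory's existence before trusting it as a Hive install: an empty /usr/lib/hive passes the check but contributes no jars, so a class otherwise picked up from Hive's lib (here org.json.JSONObject) vanishes from the classpath. The snippet below illustrates that mechanism; the function and variable names are illustrative, not Sqoop's actual configure-sqoop code.

```shell
# Illustrative sketch: a directory-existence check accepts an empty
# "Hive home", and the jar glob inside it then adds nothing.
hive_jars() {
  hive_home=$1
  if [ -d "$hive_home" ]; then
    # The -d test passes even when the directory holds no jars at all.
    for f in "$hive_home"/lib/*.jar; do
      [ -e "$f" ] && printf '%s\n' "$f"   # emit only jars that really exist
    done
  fi
}
```

Under this sketch, `hive_jars /usr/lib/hive` on the bogus empty directory from STEP 03 prints nothing, which matches the observation that removing the directory in STEP 04 restores the working classpath.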



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
