sqoop-user mailing list archives

From Jarek Jarcec Cecho <jar...@apache.org>
Subject Re: importing to hive with partitions
Date Sun, 04 Mar 2012 12:38:15 GMT
Hi Marcin,
can you try running the command without the --target-dir parameter?

It looks to me like you are trying to write the sqoop output directly into the Hive
warehouse directory. However, sqoop first exports the data to a separate HDFS staging
directory and then calls Hive to create the table and load (move) the data into the
warehouse; pointing --target-dir at the warehouse path is why the load step finds a
nested directory there.
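A minimal sketch of the adjusted command, reusing the connection details from your mail
with --target-dir dropped as suggested (the staging location sqoop picks is an
assumption; by default it lands under the importing user's HDFS home directory):

```shell
# Same import, but without --target-dir: sqoop stages the rows in its own
# temporary HDFS directory, and Hive's LOAD DATA step then moves them into
# the warehouse partition itself, so no nested directory is created under
# /user/hive/warehouse/sample_rm3.
bin/sqoop import --driver oracle.jdbc.driver.OracleDriver \
    --connect jdbc:oracle:thin:@172.16.17.232:1521:dwh \
    --username had_test \
    --password abc1234 \
    --query "select table_name, tablespace_name, cluster_name, iot_name, status, pct_free, pct_used from all_tables where owner = 'ME' and \$CONDITIONS" \
    --num-mappers 1 \
    --hive-import \
    --hive-table "sample_rm3" \
    --hive-partition-key 'OWNER' \
    --hive-partition-value 'ME' \
    --mysql-delimiters \
    --verbose
```

If this sqoop version insists on a destination for free-form --query imports, pointing
--target-dir at a staging path outside the warehouse (for example /tmp/sqoop/sample_rm3,
a made-up path) should avoid the collision just as well, since Hive moves the files into
the owner=ME partition directory on its own.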

Jarcec

On Thu, Mar 01, 2012 at 02:56:23PM +0100, Marcin Cylke wrote:
> Hi
> I'm trying to import some data from DB to hive tables. My sqoop
> version is 1.4.1-incubating__hadoop-1.0.0 (http://ftp.tpnet.pl/vol/d1/apache//incubator/sqoop/sqoop-1.4.1-incubating/sqoop-1.4.1-incubating__hadoop-1.0.0.tar.gz).
> I use this command:
> 
> bin/sqoop import --driver oracle.jdbc.driver.OracleDriver \
>           --connect jdbc:oracle:thin:@172.16.17.232:1521:dwh \
>           --username had_test \
>           --password abc1234 \
>           --query "select table_name, tablespace_name, cluster_name, iot_name, status, pct_free, pct_used from all_tables where owner = 'ME' and \$CONDITIONS" \
>           --num-mappers 1 \
>           --hive-import \
>           --hive-table "sample_rm3" \
>           --hive-partition-key 'OWNER' \
>           --hive-partition-value 'ME' \
>           --target-dir /user/hive/warehouse/sample_rm3 \
>           --mysql-delimiters \
>           --verbose
> 
> and am getting the following error when loading data to partitions:
> 
> 12/03/01 14:44:38 INFO hive.HiveImport: Loading data to table
> default.sample_rm3 partition (owner=ME)
> 12/03/01 14:44:38 INFO hive.HiveImport: Failed with exception
> checkPaths: hdfs://hadoop1:9000/user/hive/warehouse/sample_rm3 has
> nested directory
> hdfs://hadoop1:9000/user/hive/warehouse/sample_rm3/owner=ME
> 12/03/01 14:44:38 INFO hive.HiveImport: FAILED: Execution Error,
> return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
> 12/03/01 14:44:38 ERROR tool.ImportTool: Encountered IOException
> running import job: java.io.IOException: Hive exited with status 9
>         at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:347)
>         at
> org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:297)
>         at
> org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:239)
>         at
> org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:392)
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> 
> 
> Is this a known issue, is there some kind of workaround?
> 
> It seems that this topic is loosely connected to this JIRA ticket:
> https://issues.apache.org/jira/browse/SQOOP-312 Are there any plans
> to commit such support to sqoop?
> 
> Best regards
> Marcin
