sqoop-user mailing list archives

From Jarek Jarcec Cecho <jar...@apache.org>
Subject Re: Sqoop Import into Hive
Date Mon, 24 Jun 2013 15:02:41 GMT
Hi Manickam,
the Hive import has two phases. In the first phase Sqoop imports the data into a temporary
directory, just like in a normal non-Hive import. The second phase then loads the data into
Hive. Your job seems to be failing because the output directory required for the first phase
already exists. When neither the --target-dir nor the --warehouse-dir parameter is used, the
directory is created in the HDFS home directory of the user that executed Sqoop.
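If you are not sure where that first-phase directory ended up, one way to check is to list the HDFS home directory of the user that ran the job. This is just a sketch; it assumes the job ran as root (as your log suggests), so adjust the path to your actual user:

```shell
# List the HDFS home directory of the user that ran Sqoop
# (assumes the job ran as root; adjust /user/root to your user)
hadoop dfs -ls /user/root

# Then look for a directory named after the imported table
hadoop dfs -ls /user/root/EMPLOYEE
```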

What exact HDFS command are you using to look for the directory? I would advise running the following to remove it:

  hadoop dfs -rmr EMPLOYEE
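Alternatively, you can point the first phase at an explicit staging location so it cannot collide with a leftover directory. A sketch, reusing the options from your command (the JDBC URL is truncated in your mail, so it is left elided here):

```shell
# Stage the first-phase output in an explicit, disposable directory
# via --target-dir instead of the default HDFS home location
./sqoop-import --connect jdbc:oracle:thin:@ \
  --username scott --password tiger \
  --table EMPLOYEE \
  --hive-import --hive-table Employee --create-hive-table \
  --target-dir /tmp/sqoop-staging/EMPLOYEE
```

Newer Sqoop releases also document a --delete-target-dir option that removes the staging directory before the import starts; check whether the version you are running supports it.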


On Mon, Jun 24, 2013 at 06:59:46PM +0530, Manickam P wrote:
> Hi,
> I'm trying to import from my Oracle DB and insert into Hive. For that I used the below script:
>
>   ./sqoop-import --connect jdbc:oracle:thin:@ --username scott --password tiger --table=EMPLOYEE --hive-table Employee --create-hive-table --hive-import --hive-home /path to hive home/
>
> But I am getting the below error:
>
>   13/06/24 09:06:10 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory EMPLOYEE already exists
>   13/06/24 09:06:10 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory EMPLOYEE already exists
>       at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
>       at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:889)
>       at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:396)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>       at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
>       at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
>       at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
>       at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
>       at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:221)
>       at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:545)
>       at org.apache.sqoop.manager.OracleManager.importTable(OracleManager.java:380)
>       at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
>       at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
>       at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>       at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>       at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>
> I am unable to find the directory in my HDFS. I tried to execute the dfs command but there is no directory like that. Please help me.
>
> Thanks,
> Manickam P
