sqoop-user mailing list archives

From David Langer <david_lan...@hotmail.com>
Subject RE: The --hive-overwrite doesn't overwrite data
Date Sat, 21 Jan 2012 00:40:10 GMT

Duh, that took care of it!
 
Thanx for the help!
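[For the archives: the fix was Kathleen's suggestion below, adding --hive-import so that --hive-overwrite takes effect. A sketch of the corrected invocation, with the connection string, query, and table name copied from the session log quoted below:]

```shell
# Sketch of the corrected command from this thread: --hive-overwrite only
# takes effect when --hive-import is also given, so both flags appear here.
# All values (connect string, query, split column, table name) are taken
# verbatim from the verbose log pasted further down in the thread.
sqoop import \
  --connect 'jdbc:mysql://localhost/AdventureWorks?zeroDateTimeBehavior=round' \
  --username cloudera \
  --query 'SELECT *, 87 AS JobID FROM SalesPerson WHERE $CONDITIONS' \
  --split-by BusinessEntityID \
  --target-dir /tmp/SalesPerson \
  --hive-import \
  --hive-overwrite \
  --hive-table NDW_AdventureWorks_SalesPerson
```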

 

> From: kathleen@cloudera.com
> Date: Fri, 20 Jan 2012 15:58:47 -0800
> Subject: Re: The --hive-overwrite doesn't overwrite data
> To: sqoop-user@incubator.apache.org
> 
> Dave - can you try adding the --hive-import option?
> 
> Regards, Kathleen
> 
> On Fri, Jan 20, 2012 at 3:07 PM, David Langer <david_langer@hotmail.com> wrote:
> > Sure. Here it is:
> >
> > [cloudera@localhost ~]$ hive;
> > Hive history
> > file=/tmp/cloudera/hive_job_log_cloudera_201201201806_30238324.txt
> > hive> show tables;
> > OK
> > ndw_adventureworks_salesperson
> > Time taken: 3.716 seconds
> > hive> quit;
> > [cloudera@localhost ~]$ sqoop import --connect
> > 'jdbc:mysql://localhost/AdventureWorks?zeroDateTimeBehavior=round'
> > --username cloudera --query 'SELECT *, 87 AS JobID FROM SalesPerson WHERE
> > $CONDITIONS' --split-by BusinessEntityID  --target-dir /tmp/SalesPerson
> > --hive-overwrite --hive-table NDW_AdventureWorks_SalesPerson --verbose
> > 12/01/20 18:02:34 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > 12/01/20 18:02:34 DEBUG sqoop.ConnFactory: Added factory
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > /usr/lib/sqoop/conf/managers.d/mssqoop-sqlserver
> > 12/01/20 18:02:34 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 12/01/20 18:02:34 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 12/01/20 18:02:34 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 12/01/20 18:02:34 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 12/01/20 18:02:34 DEBUG manager.DefaultManagerFactory: Trying with scheme:
> > jdbc:mysql:
> > 12/01/20 18:02:34 INFO manager.MySQLManager: Preparing to use a MySQL
> > streaming resultset.
> > 12/01/20 18:02:34 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > com.cloudera.sqoop.manager.MySQLManager@303020ad
> > 12/01/20 18:02:34 INFO tool.CodeGenTool: Beginning code generation
> > 12/01/20 18:02:35 DEBUG manager.SqlManager: No connection paramenters
> > specified. Using regular API for making connection.
> > 12/01/20 18:02:35 DEBUG manager.SqlManager: Using fetchSize for next query:
> > -2147483648
> > 12/01/20 18:02:35 INFO manager.SqlManager: Executing SQL statement: SELECT
> > *, 87 AS JobID FROM SalesPerson WHERE  (1 = 0)
> > 12/01/20 18:02:35 DEBUG manager.SqlManager: Using fetchSize for next query:
> > -2147483648
> > 12/01/20 18:02:35 INFO manager.SqlManager: Executing SQL statement: SELECT
> > *, 87 AS JobID FROM SalesPerson WHERE  (1 = 0)
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter: selected columns:
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   BusinessEntityID
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   TerritoryID
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   SalesQuota
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   Bonus
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   CommissionPct
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   SalesYTD
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   SalesLastYear
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   rowguid
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   ModifiedDate
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter:   JobID
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter: Writing source file:
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/QueryResult.java
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter: Table name: null
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter: Columns: BusinessEntityID:4,
> > TerritoryID:4, SalesQuota:3, Bonus:3, CommissionPct:3, SalesYTD:3,
> > SalesLastYear:3, rowguid:12, ModifiedDate:93, JobID:-5,
> > 12/01/20 18:02:35 DEBUG orm.ClassWriter: sourceFilename is QueryResult.java
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager: Found existing
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/
> > 12/01/20 18:02:35 INFO orm.CompilationManager: HADOOP_HOME is
> > /usr/lib/hadoop
> > 12/01/20 18:02:35 INFO orm.CompilationManager: Found hadoop core jar at:
> > /usr/lib/hadoop/hadoop-core.jar
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager: Adding source file:
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/QueryResult.java
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager: Invoking javac with args:
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager:   -sourcepath
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager:
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager:   -d
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager:
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager:   -classpath
> > 12/01/20 18:02:35 DEBUG orm.CompilationManager:
> > /usr/lib/hadoop/conf:/usr/java/jdk1.6.0_21/lib/tools.jar:/usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u2.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u2.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/hue-plugins-1.2.0-cdh3u2.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.1.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.1.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/etc/zookeeper::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/avro-1.5.4.jar:/usr/lib/sqoop/lib/avro-ipc-1.5.4.jar:/usr/lib/sqoop/lib/avro-mapred-1.5.4.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:
/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/lib/sqoop/lib/jopt-simple-3.2.jar:/usr/lib/sqoop/lib/mysql-connector-java-5.0.8-bin.jar:/usr/lib/sqoop/lib/paranamer-2.3.jar:/usr/lib/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/lib/hadoop/conf:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u2.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u2.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/hue-plugins-1.2.0-cdh3u2.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.1.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.1.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../conf:/usr/java/jdk1.6.0_21/lib/tools.jar:/usr/lib/hbase/bin/..:/usr/lib/hbase/bin/../hb
ase-0.90.4-cdh3u2.jar:/usr/lib/hbase/bin/../hbase-0.90.4-cdh3u2-tests.jar:/usr/lib/hbase/bin/../lib/activation-1.1.jar:/usr/lib/hbase/bin/../lib/asm-3.1.jar:/usr/lib/hbase/bin/../lib/avro-1.5.4.jar:/usr/lib/hbase/bin/../lib/avro-ipc-1.5.4.jar:/usr/lib/hbase/bin/../lib/commons-cli-1.2.jar:/usr/lib/hbase/bin/../lib/commons-codec-1.4.jar:/usr/lib/hbase/bin/../lib/commons-el-1.0.jar:/usr/lib/hbase/bin/../lib/commons-httpclient-3.1.jar:/usr/lib/hbase/bin/../lib/commons-lang-2.5.jar:/usr/lib/hbase/bin/../lib/commons-logging-1.1.1.jar:/usr/lib/hbase/bin/../lib/commons-net-1.4.1.jar:/usr/lib/hbase/bin/../lib/core-3.1.1.jar:/usr/lib/hbase/bin/../lib/guava-r06.jar:/usr/lib/hbase/bin/../lib/hadoop-core.jar:/usr/lib/hbase/bin/../lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.1.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.4.jar:/usr/lib/hbase/bin/../lib/jettison-1.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.0.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1.jar:/usr/lib/hbase/bin/../lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.16.jar:/usr/lib/hbase/bin/../lib/netty-3.2.4.Final.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5.jar:/usr/lib/hbase/bin/../lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/bin/../lib/slf4j-log4j12-1.5.8.jar:/
usr/lib/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/usr/lib/hbase/bin/../lib/stax-api-1.0.1.jar:/usr/lib/hbase/bin/../lib/thrift-0.2.0.jar:/usr/lib/hbase/bin/../lib/velocity-1.5.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u2.jar:/usr/lib/sqoop/sqoop-test-1.3.0-cdh3u2.jar::/usr/lib/hadoop/hadoop-core.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u2.jar
> > 12/01/20 18:02:36 ERROR orm.CompilationManager: Could not rename
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/QueryResult.java
> > to /home/cloudera/./QueryResult.java
> > java.io.IOException: Destination '/home/cloudera/./QueryResult.java' already
> > exists
> >         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
> >         at
> > com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> >         at
> > com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> >         at
> > com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:369)
> >         at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:455)
> >         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > 12/01/20 18:02:36 INFO orm.CompilationManager: Writing jar file:
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/QueryResult.jar
> > 12/01/20 18:02:36 DEBUG orm.CompilationManager: Scanning for .class files in
> > directory: /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99
> > 12/01/20 18:02:36 DEBUG orm.CompilationManager: Got classfile:
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/QueryResult.class
> > -> QueryResult.class
> > 12/01/20 18:02:36 DEBUG orm.CompilationManager: Finished writing jar file
> > /tmp/sqoop-cloudera/compile/d93e798470bd6dd21aa2d218ef8d4f99/QueryResult.jar
> > 12/01/20 18:02:36 INFO mapreduce.ImportJobBase: Beginning query import.
> > 12/01/20 18:02:37 DEBUG mapreduce.DataDrivenImportJob: Using table class:
> > QueryResult
> > 12/01/20 18:02:37 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat:
> > class com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u2.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.0.8-bin.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u2.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u2.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/paranamer-2.3.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/sqljdbc4.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/avro-mapred-1.5.4.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/avro-1.5.4.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/avro-ipc-1.5.4.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/sqoop-sqlserver-1.0.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.0.8-bin.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/snappy-java-1.0.3.2.jar
> > 12/01/20 18:02:37 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/jopt-simple-3.2.jar
> > 12/01/20 18:02:38 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT
> > MIN(BusinessEntityID), MAX(BusinessEntityID) FROM (SELECT *, 87 AS JobID
> > FROM SalesPerson WHERE  (1 = 1) ) AS t1
> > 12/01/20 18:02:38 DEBUG db.IntegerSplitter: Splits:
> > [                         274 to                          290] into 4 parts
> > 12/01/20 18:02:38 DEBUG db.IntegerSplitter:                          274
> > 12/01/20 18:02:38 DEBUG db.IntegerSplitter:                          278
> > 12/01/20 18:02:38 DEBUG db.IntegerSplitter:                          282
> > 12/01/20 18:02:38 DEBUG db.IntegerSplitter:                          286
> > 12/01/20 18:02:38 DEBUG db.IntegerSplitter:                          290
> > 12/01/20 18:02:39 INFO mapred.JobClient: Running job: job_201201201632_0008
> > 12/01/20 18:02:40 INFO mapred.JobClient:  map 0% reduce 0%
> > 12/01/20 18:02:54 INFO mapred.JobClient:  map 50% reduce 0%
> > 12/01/20 18:02:59 INFO mapred.JobClient:  map 75% reduce 0%
> > 12/01/20 18:03:00 INFO mapred.JobClient:  map 100% reduce 0%
> > 12/01/20 18:03:02 INFO mapred.JobClient: Job complete: job_201201201632_0008
> > 12/01/20 18:03:02 INFO mapred.JobClient: Counters: 12
> > 12/01/20 18:03:02 INFO mapred.JobClient:   Job Counters
> > 12/01/20 18:03:02 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=28816
> > 12/01/20 18:03:02 INFO mapred.JobClient:     Total time spent by all reduces
> > waiting after reserving slots (ms)=0
> > 12/01/20 18:03:02 INFO mapred.JobClient:     Total time spent by all maps
> > waiting after reserving slots (ms)=0
> > 12/01/20 18:03:02 INFO mapred.JobClient:     Launched map tasks=4
> > 12/01/20 18:03:02 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > 12/01/20 18:03:02 INFO mapred.JobClient:   FileSystemCounters
> > 12/01/20 18:03:02 INFO mapred.JobClient:     HDFS_BYTES_READ=505
> > 12/01/20 18:03:02 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=270332
> > 12/01/20 18:03:02 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=1867
> > 12/01/20 18:03:02 INFO mapred.JobClient:   Map-Reduce Framework
> > 12/01/20 18:03:02 INFO mapred.JobClient:     Map input records=17
> > 12/01/20 18:03:02 INFO mapred.JobClient:     Spilled Records=0
> > 12/01/20 18:03:02 INFO mapred.JobClient:     Map output records=17
> > 12/01/20 18:03:02 INFO mapred.JobClient:     SPLIT_RAW_BYTES=505
> > 12/01/20 18:03:02 INFO mapreduce.ImportJobBase: Transferred 1.8232 KB in
> > 25.4856 seconds (73.2572 bytes/sec)
> > 12/01/20 18:03:02 INFO mapreduce.ImportJobBase: Retrieved 17 records.
> > [cloudera@localhost ~]$
> >
> >
> >> From: kathleen@cloudera.com
> >> Date: Fri, 20 Jan 2012 14:41:32 -0800
> >
> >> Subject: Re: The --hive-overwrite doesn't overwrite data
> >> To: sqoop-user@incubator.apache.org
> >>
> >> Dave - to aid in debugging, please re-run your Sqoop job with the
> >> --verbose flag and then paste the console log.
> >>
> >> Thanks, Kathleen
> >>
> >> > On Fri, Jan 20, 2012 at 11:51 AM, David Langer
> >> > <david_langer@hotmail.com> wrote:
> >> >> Greetings!
> >> >>
> >> >> Hopefully this isn't too much of a newbie question, but I am unable to
> >> >> get the --hive-overwrite argument working. I'm using sqoop 1.3.0-cdh3u2
> >> >> on the Cloudera VMWare Player VM.
> >> >>
> >> >>
> >> >> The following sqoop invocation succeeds in creating the Hive table and
> >> >> populates it with data:
> >> >>
> >> >> sqoop import --connect
> >> >> 'jdbc:mysql://localhost/MyDB?zeroDateTimeBehavior=round' --username
> >> >> cloudera
> >> >> --query 'SELECT *, 47 AS JobID FROM SalesPerson WHERE $CONDITIONS'
> >> >> --split-by ID  --target-dir /tmp/SalesPerson --create-hive-table
> >> >> --hive-import --hive-table MyDB_SalesPerson
> >> >>
> >> >>
> >> >> However, while the following sqoop invocation does produce the desired
> >> >> data in HDFS (i.e., /tmp/SalesPerson), it does not overwrite the data
> >> >> in the Hive table:
> >> >>
> >> >> sqoop import --connect
> >> >> 'jdbc:mysql://localhost/MyDB?zeroDateTimeBehavior=round' --username
> >> >> cloudera
> >> >> --query 'SELECT *, 87 AS JobID FROM SalesPerson WHERE $CONDITIONS'
> >> >> --split-by ID  --target-dir /tmp/SalesPerson --hive-overwrite
> >> >> --hive-table
> >> >> MyDB_salesperson
> >> >>
> >> >>
> >> >> There is nothing in Hive.log that indicates the --hive-overwrite sqoop
> >> >> invocation is interacting with Hive (e.g., no exceptions).
> >> >>
> >> >> Any assistance would be greatly appreciated.
> >> >>
> >> >> Thanx,
> >> >>
> >> >> Dave