sqoop-user mailing list archives

From Anirudh <techie.anir...@gmail.com>
Subject Re: Error while sqoop-export from Hive to SQL Server
Date Thu, 12 Apr 2012 23:28:27 GMT
Hi Bhavesh,

It is better to continue on the same thread than to create a new one for
the same issue; it makes the problem easier to track.
Would it be possible for you to share the table schema and the generated
"tmptempmeasurereport.java" file?

Thanks,
Anirudh

On Mon, Apr 2, 2012 at 1:58 AM, Bhavesh Shah <bhavesh25shah@gmail.com> wrote:

> Hello,
> I am facing some errors while running sqoop-export. I posted the same
> question previously and got a reply suggesting that I specify the field
> delimiter as well and try:
> --input-fields-terminated-by ','
>
> I tried that too, but the error is the same. What mistake am I making here?
>
> Please suggest a solution that will get me past this.
>
> (I have also created the target table in SQL Server, and the field
> delimiter of my table in Hive is ",".)
>
>
> hadoop@ubuntu:~/sqoop-1.3.0-cdh3u1/bin$ ./sqoop-export --connect
> 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=HadoopTest'
> --table tmptempmeasurereport --export-dir
> /user/hive/warehouse/tmptempmeasurereport
> 12/04/02 14:21:09 INFO SqlServer.MSSQLServerManagerFactory: Using
> Microsoft's SQL Server - Hadoop Connector
> 12/04/02 14:21:09 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/04/02 14:21:09 INFO tool.CodeGenTool: Beginning code generation
> 12/04/02 14:21:09 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [tmptempmeasurereport]
> 12/04/02 14:21:09 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [tmptempmeasurereport]
> 12/04/02 14:21:09 INFO orm.CompilationManager: HADOOP_HOME is
> /home/hadoop/hadoop-0.20.2-cdh3u2
> 12/04/02 14:21:10 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-hadoop/compile/98b22347ba65c1dd0a2317c96dea3b41/tmptempmeasurereport.jar
> 12/04/02 14:21:10 INFO mapreduce.ExportJobBase: Beginning export of
> tmptempmeasurereport
> 12/04/02 14:21:10 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [tmptempmeasurereport]
> 12/04/02 14:21:12 INFO input.FileInputFormat: Total input paths to process
> : 1
> 12/04/02 14:21:12 INFO input.FileInputFormat: Total input paths to process
> : 1
> 12/04/02 14:21:12 INFO mapred.JobClient: Running job: job_201203291108_2778
> 12/04/02 14:21:13 INFO mapred.JobClient:  map 0% reduce 0%
> 12/04/02 14:21:18 INFO mapred.JobClient: Task Id :
> attempt_201203291108_2778_m_000000_0, Status : FAILED
> java.util.NoSuchElementException
>     at java.util.AbstractList$Itr.next(AbstractList.java:350)
>     at tmptempmeasurereport.__loadFromFields(tmptempmeasurereport.java:383)
>     at tmptempmeasurereport.parse(tmptempmeasurereport.java:332)
>     at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:79)
>     at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>     at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>
> 12/04/02 14:21:23 INFO mapred.JobClient: Task Id :
> attempt_201203291108_2778_m_000000_1, Status : FAILED
> java.util.NoSuchElementException
>     at java.util.AbstractList$Itr.next(AbstractList.java:350)
>     at tmptempmeasurereport.__loadFromFields(tmptempmeasurereport.java:383)
>     at tmptempmeasurereport.parse(tmptempmeasurereport.java:332)
>     at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:79)
>     at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>     at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>
> 12/04/02 14:21:27 INFO mapred.JobClient: Task Id :
> attempt_201203291108_2778_m_000000_2, Status : FAILED
> java.util.NoSuchElementException
>     at java.util.AbstractList$Itr.next(AbstractList.java:350)
>     at tmptempmeasurereport.__loadFromFields(tmptempmeasurereport.java:383)
>     at tmptempmeasurereport.parse(tmptempmeasurereport.java:332)
>     at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:79)
>     at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>     at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>
> 12/04/02 14:21:33 INFO mapred.JobClient: Job complete:
> job_201203291108_2778
> 12/04/02 14:21:33 INFO mapred.JobClient: Counters: 7
> 12/04/02 14:21:33 INFO mapred.JobClient:   Job Counters
> 12/04/02 14:21:33 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=19198
> 12/04/02 14:21:33 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 12/04/02 14:21:33 INFO mapred.JobClient:     Total time spent by all maps
> waiting after reserving slots (ms)=0
> 12/04/02 14:21:33 INFO mapred.JobClient:     Launched map tasks=4
> 12/04/02 14:21:33 INFO mapred.JobClient:     Data-local map tasks=4
> 12/04/02 14:21:33 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 12/04/02 14:21:33 INFO mapred.JobClient:     Failed map tasks=1
> 12/04/02 14:21:33 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> 22.8352 seconds (0 bytes/sec)
> 12/04/02 14:21:33 INFO mapreduce.ExportJobBase: Exported 0 records.
> 12/04/02 14:21:33 ERROR tool.ExportTool: Error during export: Export job
> failed!
> hadoop@ubuntu:~/sqoop-1.3.0-cdh3u1/bin$
>
>
> --
> Regards,
> Bhavesh Shah
>
>
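
Also, one thing worth ruling out while you gather that information: this
failure pattern often means the files under
/user/hive/warehouse/tmptempmeasurereport are not actually comma-delimited
(Hive's default field delimiter is '\001', i.e. Ctrl-A). A quick
"hadoop fs -cat" of one of the files in that directory will show which
delimiter is really in use. If it does turn out to be the default, the
export invocation with the delimiters and Hive-style NULLs spelled out
explicitly would look roughly like this (same connection details as in
your log; adjust the delimiter to whatever the files really contain):

./sqoop-export \
  --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=HadoopTest' \
  --table tmptempmeasurereport \
  --export-dir /user/hive/warehouse/tmptempmeasurereport \
  --input-fields-terminated-by '\001' \
  --input-lines-terminated-by '\n' \
  --input-null-string '\\N' \
  --input-null-non-string '\\N'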
