sqoop-user mailing list archives

From YouPeng Yang <yypvsxf19870...@gmail.com>
Subject Re: export job failed with oracle
Date Fri, 19 Apr 2013 06:14:08 GMT
Hi All
  Hmm, sorry to bother you again. I found the explanation in the Sqoop User
Guide [1].

  It looks like one has to make the corresponding change.

[1]==========================================================
21.3.1. Dates and Times

Oracle JDBC represents DATE and TIME SQL types as TIMESTAMP values.
Any DATE columns
in an Oracle database will be imported as a TIMESTAMP in Sqoop, and
Sqoop-generated code will store these values in java.sql.Timestamp fields.
When exporting data back to a database, Sqoop parses text fields as
TIMESTAMP types (with the form yyyy-mm-dd HH:MM:SS.ffffffff) even if you
expect these fields to be formatted with the JDBC date escape format of
yyyy-mm-dd. Dates exported to Oracle should be formatted as full timestamps.

 Regards


2013/4/19 YouPeng Yang <yypvsxf19870706@gmail.com>

> Hi All
>
>    I think I have found the reason.
>    There is a DATE column in my table NMS_CMTS_CPU_CDX_TEST.
>
>    I found the same error described at this URL:
>
> https://groups.google.com/a/cloudera.org/forum/?fromgroups=#!topic/sqoop-user/I0zqhKOdOyQ
>    Following that thread, I installed the Quest Data Connector for Oracle
> and Hadoop, and the export succeeds when I leave out the DATE column in my
> table and the corresponding data in my HDFS file.
>    However, when I run the export again with the DATE column included, the
> exception comes out.
>
>    Has anyone encountered the same trouble, or does anyone have a suggestion?
>
>
> Regards
>
>
>
>
>
>
> 2013/4/19 YouPeng Yang <yypvsxf19870706@gmail.com>
>
>> Hi All
>>
>>    I am running an export job to export data to my Oracle 10g database:
>> /home/sqoop-1.4.1-cdh4.1.2/bin/sqoop export --connect
>> jdbc:oracle:thin:@10.167.14.225:1521:wxoss -username XUJINGYU -password
>> 123456  --export-dir sqoop/NMS_CMTS_CPU_CDX --table NMS_CMTS_CPU_CDX_TEST
>> --input-fields-terminated-by  "|"
>>
>>   However, I get the exception [1].
>>   It is weird because my import job from Oracle succeeded.
>>
>>   Any suggestion will be appreciated.
>>
>> Thank you.
>>
>>
>> [1]===========================================================
>> ...
>> 13/04/19 10:12:17 INFO mapreduce.Job: The url to track the job:
>> http://Hadoop01:8088/proxy/application_1364348895095_0040/
>> 13/04/19 10:12:17 INFO mapreduce.Job: Running job: job_1364348895095_0040
>> 13/04/19 10:12:30 INFO mapreduce.Job: Job job_1364348895095_0040 running
>> in uber mode : false
>> 13/04/19 10:12:30 INFO mapreduce.Job:  map 0% reduce 0%
>> 13/04/19 10:12:40 INFO mapreduce.Job: Task Id :
>> attempt_1364348895095_0040_m_000002_0, Status : FAILED
>> Error: java.lang.RuntimeException:
>> java.lang.reflect.InvocationTargetException
>>         at
>> org.apache.sqoop.mapreduce.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:166)
>>         at
>> org.apache.sqoop.mapreduce.CombineFileRecordReader.<init>(CombineFileRecordReader.java:125)
>>         at
>> org.apache.sqoop.mapreduce.ExportInputFormat.createRecordReader(ExportInputFormat.java:94)
>>         at
>> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:455)
>>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:697)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>         at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>         at
>> org.apache.sqoop.mapreduce.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:148)
>>         ... 10 more
>> Caused by: java.net.ConnectException: Call From Hadoop04/10.167.14.224 to
>> Hadoop01:8020 failed on connection exception: java.net.ConnectException:
>> Connection refused; For more details see:
>> http://wiki.apache.org/hadoop/ConnectionRefused
>>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:721)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1164)
>>         at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>>         at $Proxy10.getFileInfo(Unknown Source)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>>         at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>>         at $Proxy10.getFileInfo(Unknown Source)
>>         at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:628)
>>         at
>> org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1507)
>>         at
>> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:783)
>>         at
>> org.apache.sqoop.mapreduce.ExportJobBase.getFileType(ExportJobBase.java:110)
>>         at
>> org.apache.sqoop.mapreduce.ExportJobBase.isSequenceFiles(ExportJobBase.java:98)
>>         at
>> org.apache.sqoop.mapreduce.CombineShimRecordReader.createChildReader(CombineShimRecordReader.java:120)
>>         at
>> org.apache.sqoop.mapreduce.CombineShimRecordReader.<init>(CombineShimRecordReader.java:60)
>>         ... 15 more
>> Caused by: java.net.ConnectException: Connection refused
>>         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>>         at
>> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
>>         at
>> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:207)
>>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:523)
>>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:488)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:476)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:570)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:220)
>>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1213)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1140)
>>         ... 31 more
>>
>>
>>
>
