sqoop-user mailing list archives

From Jarek Jarcec Cecho <jar...@apache.org>
Subject Re: Sqoop Halts - Exporting from HDFS to Oracle (Date/Timestamp issue?)
Date Sat, 02 Nov 2013 01:25:28 GMT
Hi Tanzir,
would you mind sharing with us the following information?

* Sqoop command that you are using
* Entire generated log when executed with --verbose parameter

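For reference, a typical Sqoop export invocation with verbose logging looks roughly like the following. This is a hypothetical sketch: the connection string, username, table name, and paths are placeholders, not values taken from this thread.

```shell
# Hypothetical example -- host, SID, user, table, and paths are
# placeholders; substitute the values from your own environment.
sqoop export \
  --connect jdbc:oracle:thin:@db-host:1521:ORCL \
  --username scott \
  --password-file /user/scott/.oracle_password \
  --table TARGET_TABLE \
  --export-dir /user/scott/export-data \
  --input-fields-terminated-by ',' \
  --verbose
```

Running with --verbose makes Sqoop log the generated SQL and per-record failures, which is usually what is needed to pin down a type-conversion problem like the one described below.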

On Wed, Oct 30, 2013 at 10:28:41PM +0600, Tanzir Musabbir wrote:
> I'm encountering an issue when exporting data from HDFS to Oracle. I'm not sure whether
> this is a known issue or not. When I carefully checked the log, I saw that one of the columns
> has "2/13/2013 4:35:50", whereas the corresponding column type in Oracle is varchar2.
> Can't we export date or time values simply as strings? When my HDFS data contains values
> like that, Sqoop halts, and in the end the task is killed by the MapReduce task timeout.
> Based on this: http://qnalist.com/questions/31561/sqoop-modifies-the-date-format-in-the-exported-data,
> I'm getting the impression that the Oracle driver converts the value to a timestamp, tries to
> insert it, and halts when the column type is not a timestamp. Please correct me if I'm wrong.
> Is there any way to skip that kind of conversion and just export the values as strings?
> Thanks in advance.
> Sincerely,
> Tanzir
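One approach sometimes suggested for this kind of type mismatch (a sketch only, not confirmed as the fix for this particular thread) is to override Sqoop's Java type mapping with --map-column-java, so the value is handled as a plain String and inserted into the varchar2 column without any timestamp conversion. The column name EVENT_TIME and all connection details below are hypothetical placeholders:

```shell
# Hypothetical: EVENT_TIME is a placeholder column name.
# --map-column-java makes Sqoop's generated record class treat the
# column as java.lang.String instead of java.sql.Timestamp.
sqoop export \
  --connect jdbc:oracle:thin:@db-host:1521:ORCL \
  --username scott \
  --password-file /user/scott/.oracle_password \
  --table TARGET_TABLE \
  --export-dir /user/scott/export-data \
  --map-column-java EVENT_TIME=String \
  --verbose
```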
