sqoop-user mailing list archives

From Gwen Shapira <gshap...@cloudera.com>
Subject Re: Sqoop import/Export CLOB Datatype
Date Wed, 27 Aug 2014 17:20:06 GMT
Try converting your Hive timestamp field to a string in Hive's timestamp format:
yyyy-mm-dd hh:mm:ss[.f...]

Gwen
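For example, a staging Hive table with the timestamp pre-cast to a string could look like this (a sketch only; the source table and columns follow the thread, but `sqp_clob_data7_export` is a hypothetical name):

```sql
-- Hypothetical staging table: same data, but SUBMIT_DATE rendered as a
-- string so the exported text parses cleanly on the Oracle side.
CREATE TABLE sqp_clob_data7_export AS
SELECT
  document_id,
  xml_data,
  CAST(submit_date AS STRING) AS submit_date  -- yyyy-mm-dd hh:mm:ss[.f...]
FROM sqp_clob_data7;
```

The Sqoop export would then point at this table's warehouse directory instead of the original one.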

On Wed, Aug 27, 2014 at 10:11 AM, Duddu, Rajasekhar
<Rajasekhar.Duddu@verizonwireless.com> wrote:
> Hi,
>
> I didn’t understand the doc; could anyone please let me know which type casts I have to apply to make this work?
> Appreciate your help.
>
> Thanks.
>
> -----Original Message-----
> From: Gwen Shapira [mailto:gshapira@cloudera.com]
> Sent: Tuesday, August 26, 2014 5:02 PM
> To: user@sqoop.apache.org
> Subject: Re: Sqoop import/Export CLOB Datatype
>
> TIMESTAMP in Hive is actually a Long (seconds since 1970).
>
> I'm assuming the matching column in Oracle is DATE or TIMESTAMP, and Oracle does not automatically convert a Long to DATE/TIMESTAMP.
>
> Take a look here on how to handle it:
> http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_dates_and_times
>
> On Tue, Aug 26, 2014 at 4:29 PM, Duddu, Rajasekhar <Rajasekhar.Duddu@verizonwireless.com> wrote:
>> Hi,
>>
>>
>>
>> I have successfully imported an Oracle table with CLOB data type to
>> Hive Table. After processing that table, I am supposed to export back
>> the same table to Oracle.
>>
>>
>>
>> Source Oracle table: (DOCUMENT_ID NUMBER, XML_DATA CLOB, SUBMIT_DATE DATE)
>>
>> Hive table: (document_id INT, xml_data STRING, submit_date TIMESTAMP)
>>
>> Export Oracle table: (DOCUMENT_ID NUMBER, XML_DATA VARCHAR2(1000 BYTE), SUBMIT_DATE DATE)
>>
>> I created this export table because I cannot export a CLOB directly
>> from the Hive STRING type, so I first export into this table and
>> then convert to CLOB in Oracle.
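
On the Oracle side, the staged VARCHAR2 data can be promoted to CLOB with TO_CLOB (a sketch; `TARGET_TABLE` and `EXPORT_TABLE` are hypothetical names standing in for the real tables):

```sql
-- Move the exported rows into the real CLOB-typed table,
-- converting the staged VARCHAR2 column as it is inserted.
INSERT INTO TARGET_TABLE (DOCUMENT_ID, XML_DATA, SUBMIT_DATE)
SELECT DOCUMENT_ID, TO_CLOB(XML_DATA), SUBMIT_DATE
FROM EXPORT_TABLE;
```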
>>
>>
>>
>>
>>
>> Problem:
>>
>> While exporting, it hangs for a while and then fails. I found this
>> error in the JobTracker:
>>
>>
>>
>> ERROR org.apache.sqoop.mapreduce.AsyncSqlOutputFormat: Got exception
>> in update thread: java.sql.SQLException: ORA-01461: can bind a LONG
>> value only for insert into a LONG column
>>
>>
>>
>> Export Error:
>>
>> 14/08/26 18:53:58 INFO mapred.JobClient:  map 0% reduce 0%
>>
>> 14/08/26 18:54:19 INFO mapred.JobClient:  map 100% reduce 0%
>>
>> 14/08/26 19:04:20 INFO mapred.JobClient:  map 0% reduce 0%
>>
>> 14/08/26 19:04:22 INFO mapred.JobClient: Task Id :
>> attempt_201408041327_29224_m_000000_0, Status : FAILED
>>
>> Task attempt_201408041327_29224_m_000000_0 failed to report status for
>> 600 seconds. Killing!
>>
>> 14/08/26 19:04:32 INFO mapred.JobClient:  map 100% reduce 0%
>>
>> 14/08/26 19:14:32 INFO mapred.JobClient:  map 0% reduce 0%
>>
>> 14/08/26 19:14:34 INFO mapred.JobClient: Task Id :
>> attempt_201408041327_29224_m_000000_1, Status : FAILED
>>
>> Task attempt_201408041327_29224_m_000000_1 failed to report status for
>> 600 seconds. Killing!
>>
>> 14/08/26 19:14:45 INFO mapred.JobClient:  map 100% reduce 0%
>>
>>
>>
>>
>>
>> My steps:
>>
>> Hive>create table sqp_clob_data7 (DOCUMENT_ID INT, XML_DATA STRING,
>> SUBMIT_DATE TIMESTAMP);
>>
>>
>>
>> sqoop import options:
>>
>> sqoop import --connect jdbc:oracle:thin:@<host> --username <username> -P
>> --table <tablename> --hive-import --hive-table sqp_clob_data7
>> --hive-overwrite --hive-drop-import-delims
>> --map-column-java XML_DATA=String --verbose -m 1
>>
>>
>>
>> export options:
>>
>> sqoop export --connect jdbc:oracle:thin:@<host> --username <username> -P
>> --table <tablename> --export-dir /apps/hive/warehouse/sqp_clob_data7
>> --verbose -m 1 --input-fields-terminated-by '\001'
>>
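
ORA-01461 often indicates that a bound string value is too long for the target VARCHAR2 column, so it is worth checking whether the XML data fits the export table's 1000-byte limit before exporting (a sketch in HiveQL, using the table name from the thread):

```sql
-- If the maximum length exceeds 1000 bytes, rows will not fit the
-- VARCHAR2(1000 BYTE) export column and the insert can fail.
SELECT MAX(LENGTH(xml_data)) FROM sqp_clob_data7;
```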
>>
>>
>> Versions:
>>
>> Hadoop – HW 1.3.2
>>
>> Hive - 0.11.0.1.3.3.0
>>
>> Sqoop - 1.4.3.1.3.3.0
>>
>>
>>
>> I am not clear on the error I see in the JobTracker; please let me
>> know if anyone has come across this issue and whether there is a fix for it.
>>
>>
>>
>> Thanks & Regards
>>
>> Rajasekhar D
