sqoop-user mailing list archives

From David Robson <David.Rob...@software.dell.com>
Subject RE: Import Partitions from Oracle to Hive Partitions
Date Tue, 05 Aug 2014 22:40:21 GMT
Hi Venkat,

I'm not sure what this will do with regard to Hive partitions - I'll test it out when I get
into the office and get back to you. In the meantime, this option will produce one file for
each Oracle partition - which might be of interest to you.

Match Hadoop Files to Oracle Table Partitions

-Doraoop.chunk.method={ROWID|PARTITION}

To import data from a partitioned table in such a way that the resulting HDFS folder structure
in Hadoop will match the table's partitions, set the chunk method to PARTITION. The alternative
(default) chunk method is ROWID.

Notes:
- For the number of Hadoop files to match the number of Oracle partitions, set the number
of mappers to be greater than or equal to the number of partitions.
- If the table is not partitioned, then the value PARTITION will lead to an error.
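For example, a sqoop command along these lines should give you one HDFS file per Oracle
partition. This is just a sketch - the connection string, credentials, table name and target
directory are placeholders for your environment, and it assumes the OraOop connector is
installed so it kicks in for the Oracle connection:

  # Sketch only - replace the placeholders; requires the OraOop connector to be installed
  sqoop import \
    -Doraoop.chunk.method=PARTITION \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username SCOTT --password tiger \
    --table SALES \
    --num-mappers 8 \
    --target-dir /user/venkat/sales

Per the first note above, set --num-mappers to at least the number of partitions in the table.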

David


From: Venkat, Ankam [mailto:Ankam.Venkat@centurylink.com]
Sent: Wednesday, 6 August 2014 3:56 AM
To: 'user@sqoop.apache.org'
Subject: Import Partitions from Oracle to Hive Partitions

I am trying to import partitions from an Oracle table to Hive partitions.

Can somebody provide the syntax using the regular JDBC connector and the Oraoop connector?

Thanks in advance.

Regards,
Venkat


