sqoop-user mailing list archives

From Gwen Shapira <gshap...@cloudera.com>
Subject Re: Import Partitions from Oracle to Hive Partitions
Date Tue, 05 Aug 2014 22:44:27 GMT
Hive expects a directory for each partition, so getting the data in with
OraOop will require some post-processing: copying the files into properly
named directories and adding the new partitions to the Hive table.
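
A rough sketch of that post-processing, assuming the import landed in
/user/etl/sales_stage and the Hive table "sales" is partitioned by sale_date
(all paths, table and column names here are made-up placeholders):

  # move the imported files into a directory named after the partition value
  hdfs dfs -mkdir -p /user/hive/warehouse/sales/sale_date=2014-08-01
  hdfs dfs -mv /user/etl/sales_stage/part-m-* /user/hive/warehouse/sales/sale_date=2014-08-01/

  # register the new directory as a partition of the Hive table
  hive -e "ALTER TABLE sales ADD PARTITION (sale_date='2014-08-01')"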

Sqoop has the --hive-partition-key and --hive-partition-value options, but
these assume that all of the sqooped data will fit into a single
partition.
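
For a single partition that can look something like this (the JDBC URL,
credentials, table and column names below are placeholders):

  sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username scott --password tiger \
    --table SALES \
    --where "SALE_DATE = DATE '2014-08-01'" \
    --hive-import --hive-table sales \
    --hive-partition-key sale_date \
    --hive-partition-value 2014-08-01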


On Tue, Aug 5, 2014 at 3:40 PM, David Robson
<David.Robson@software.dell.com> wrote:
> Hi Venkat,
>
>
>
> I’m not sure what this will do with regard to Hive partitions – I’ll test it
> out when I get into the office and get back to you. But this option will
> produce one file for each Oracle partition – which might be of interest to
> you.
>
>
>
> Match Hadoop Files to Oracle Table Partitions
>
>
>
> -Doraoop.chunk.method={ROWID|PARTITION}
>
>
>
> To import data from a partitioned table in such a way that the resulting
> HDFS folder structure in Hadoop will match the table’s partitions, set the
> chunk method to PARTITION. The alternative (default) chunk method is ROWID.
>
> Notes:
>
> - For the number of Hadoop files to match the number of Oracle partitions,
>   set the number of mappers to be greater than or equal to the number of
>   partitions.
> - If the table is not partitioned then the value PARTITION will lead to an
>   error.
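>
> For illustration, a run using that option could look like the following,
> with --num-mappers at least the number of Oracle partitions per the notes
> above (the JDBC URL, credentials, table name and target directory are
> placeholders, and the OraOop connector must be handling the job – e.g. the
> standalone OraOop install, or --direct on Sqoop versions that bundle it):
>
>   sqoop import \
>     -Doraoop.chunk.method=PARTITION \
>     --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
>     --username scott --password tiger \
>     --table SALES \
>     --target-dir /user/etl/sales_by_partition \
>     --num-mappers 8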
>
>
>
> David
>
>
>
>
>
> From: Venkat, Ankam [mailto:Ankam.Venkat@centurylink.com]
> Sent: Wednesday, 6 August 2014 3:56 AM
> To: 'user@sqoop.apache.org'
> Subject: Import Partitions from Oracle to Hive Partitions
>
>
>
> I am trying to import partitions from an Oracle table to Hive partitions.
>
>
>
> Can somebody provide the syntax using the regular JDBC connector and the
> OraOop connector?
>
>
>
> Thanks in advance.
>
>
>
> Regards,
>
> Venkat
>
>
>
>
