sqoop-user mailing list archives

From Gwen Shapira <gshap...@cloudera.com>
Subject Re: Sqoop export hive to Oracle fails
Date Tue, 17 Mar 2015 15:00:35 GMT
It looks like for some reason Sqoop is trying to export your partition
as if it were a Kite dataset.

What's the file format of the table? (e.g. Avro? Parquet? Text?)
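If you're not sure, a quick way to check (using the placeholder names from
your command) is to look at the table's storage format in the metastore and
at the files sitting in the partition directory:

hive -e "DESCRIBE FORMATTED <db>.<hivetable>;"    # look at the InputFormat / SerDe lines
hdfs dfs -ls /user/hive/warehouse/<db>/<hivetable>/<partition1>/<partition2>

Sqoop's Parquet handling goes through Kite, so a Parquet-backed table would
explain why it is looking for a .metadata directory. If that turns out to be
the case, one thing worth trying (just a sketch, reusing your placeholders,
not something I've run against your setup) is letting Sqoop read the table
through HCatalog instead of pointing --export-dir at the raw partition
directory:

sqoop export \
  --connect jdbc:oracle:thin:@//<host:Port>/<DBNAME> \
  --username <user> --password <password> \
  --table <oracletable> \
  --hcatalog-database <db> \
  --hcatalog-table <hivetable>

(--hcatalog-table is available in Sqoop 1.4.4+. It picks up the table's
format and partitions from the metastore, so --export-dir and the text
formatting options like --enclosed-by aren't used with it.)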

On Tue, Mar 17, 2015 at 7:42 AM, Suresh Kumar Sethuramaswamy
<rockssk@gmail.com> wrote:
>
> Hi,
>
> I have a partitioned Hive table which I want to export to an Oracle table.
>
> Sqoop statement
> ----------------
>
> sqoop export \
>   -D mapred.child.java.opts="-Djava.security.egd=file:/dev/../dev/urandom" \
>   --connect jdbc:oracle:thin:@//<host:Port>/<DBNAME> \
>   --username <user> --password <password> \
>   --table <oracletable> \
>   --export-dir /user/hive/warehouse/<db>/<hivetable>/<partition1>/<partition2> \
>   --enclosed-by '\"'
>
> Env:
> -----
> CDH 5.3.0
> Sqoop 1.4.5
> Hive 0.13
>
> Error:
> ----
> org.kitesdk.data.DatasetNotFoundException: Descriptor location does not exist:
> hdfs://<namenode>:8020/user/hive/warehouse/<db>/<hivetable>/<partition1>/<partition2>/.metadata
>
> Please help resolve this, or suggest a better option for exporting data
> from a partitioned Hive table to an Oracle table.
>
> Regards
> Suresh
