sqoop-dev mailing list archives

From Suraj Nayak <snay...@gmail.com>
Subject Re: Binary data transfer using Sqoop
Date Thu, 16 Jul 2015 02:23:31 GMT
Hi Abe,

Thanks for highlighting missing required info quickly. Below are the
details:

   - Version: Sqoop 1.4.5
   - Sqoop command:
     sqoop import --connect jdbc:teradata://aa.bb.cc.internal/DATABASE=someDB
     --username sqoop_usr --password sqoop_usr --table ENCRYPTED_TBL
     --fields-terminated-by \\001 -m 1 --target-dir /tmp/ENC_TBL
     --connection-manager "org.apache.sqoop.manager.GenericJdbcManager"
     --driver com.teradata.jdbc.TeraDriver
   - Table structure: id: varchar, count: int, first_name: binary,
     email: binary, column5: varchar

Binary is used as the data is encrypted.
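For reference, one workaround sometimes suggested for this situation (a sketch only, not verified against Teradata): importing as a SequenceFile instead of text, so records are stored as Hadoop Writables and Sqoop does not serialize the binary columns to delimited text. The connection details below are the same placeholders as in the command above; the target directory name is hypothetical.

```shell
# Possible workaround (untested): store records as a SequenceFile so
# binary column values are not cast to String on the way into HDFS.
sqoop import \
  --connect jdbc:teradata://aa.bb.cc.internal/DATABASE=someDB \
  --username sqoop_usr --password sqoop_usr \
  --table ENCRYPTED_TBL \
  --as-sequencefile \
  --target-dir /tmp/ENC_TBL_SEQ \
  --connection-manager "org.apache.sqoop.manager.GenericJdbcManager" \
  --driver com.teradata.jdbc.TeraDriver \
  -m 1
```

Whether the downstream decryption job can consume the generated SequenceFile records directly would still need to be confirmed.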

Thanks!



On Wed, Jul 15, 2015 at 6:44 PM, Abraham Elmahrek <abe@cloudera.com> wrote:

> Hey man,
>
> Need some details to help:
>
>    - What version of Sqoop?
>    - Sqoop command?
>    - Database table structure (preferably a describe on the database)
>
> -Abe
>
> On Wed, Jul 15, 2015 at 6:42 PM, Suraj Nayak <snayakm@gmail.com> wrote:
>
> > Hi Sqoop Users and Developers,
> >
> > How can I import a binary data column from a table into HDFS without
> > converting it into a String?
> >
> > I have encrypted data in an RDBMS, and I need to import this column as
> is,
> > without converting it into a string. As of now, Sqoop is typecasting the
> > data into String/text, and decryption is failing in Hadoop.
> >
> > Can someone provide pointers to solve this? Any workaround?
> >
> > --
> > Thanks
> > Suraj Nayak M
> >
>



-- 
Thanks
Suraj Nayak M
