sqoop-user mailing list archives

From Adarsh Sharma <eddy.ada...@gmail.com>
Subject Re: BigInt Support of postgresql in Sqoop
Date Fri, 24 Aug 2012 05:10:46 GMT
Thanks, Jarcec, for the update. So Sqoop is not suitable for shifting data
from the DB to HDFS if some columns have integer[] or bigint[] datatypes.

Is there any way I can shift data having bigint[] datatypes from the PostgreSQL
DB to HDFS using Sqoop, or do I need to try another tool such as Talend?
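
One thing I was thinking of trying, assuming Sqoop's free-form --query import
can be used here, is to flatten the array columns to plain text on the
PostgreSQL side so that Sqoop only sees varchar columns. Just a rough sketch
(the "id" split column below is a placeholder for a real column in my table):

# flatten the array column with array_to_string so the result set has no ARRAY types
bin/sqoop import \
    --connect jdbc:postgresql://localhost/hadooppipeline \
    --username postgres -P \
    --query "SELECT id, array_to_string(rc_list, ',') AS rc_list FROM test_table WHERE \$CONDITIONS" \
    --split-by id \
    --target-dir /user/hadoop/test_table

Would something along these lines be a reasonable workaround, or is a tool
like Talend the better option for array columns?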


Thanks


On Thu, Aug 23, 2012 at 11:45 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:

> Hi Adarsh,
> as far as I know, Sqoop should not have any issues with the bigint data
> type.
>
> Based on the provided log fragment, it seems that you're having issues with
> SQL type 2003, which should be ARRAY (see 1). I'm afraid that arrays are
> really not supported in Sqoop at the moment.
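>
> One way to double-check which columns are arrays on the database side is to
> query information_schema (just a sketch, assuming you can run psql against
> the same database; adjust the table and user names as needed):
>
> # list the ARRAY-typed columns of test_table
> psql -U postgres -d hadooppipeline -c \
>   "SELECT column_name, udt_name FROM information_schema.columns
>    WHERE table_name = 'test_table' AND data_type = 'ARRAY';"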
>
> Jarcec
>
> 1:
> http://docs.oracle.com/javase/1.5.0/docs/api/constant-values.html#java.sql.Types.ARRAY
>
> On Thu, Aug 23, 2012 at 08:47:10PM +0530, Adarsh Sharma wrote:
> > Hi all,
> >
> > Please forgive me if I violate any rule in posting to this mailing list.
> > I am using Sqoop for some testing in my Hadoop standalone setup.
> >
> > Hadoop Version     : 0.20.2-cdh3u5, 580d1d26c7ad6a7c6ba72950d8605e2c6fbc96cc
> > Sqoop Version      : Sqoop 1.4.1-incubating
> > Also tried         : Sqoop 1.4.0-incubating
> > PostgreSQL Version : edb-psql (9.0.4.14)
> >
> >
> > I am able to export data from HDFS to PostgreSQL, but when I try to
> > import data from the DB to HDFS, the problem below arises:
> >
> > hadoop@test123:~/project/sqoop-1.4.1-incubating__hadoop-0.20$ bin/sqoop import \
> >     --connect jdbc:postgresql://localhost/hadooppipeline \
> >     --table test_table --username postgres --password postgres
> > Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> > Please set $HBASE_HOME to the root of your HBase installation.
> > 12/08/23 19:25:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> > 12/08/23 19:25:19 INFO manager.SqlManager: Using default fetchSize of 1000
> > 12/08/23 19:25:19 INFO tool.CodeGenTool: Beginning code generation
> > 12/08/23 19:25:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "test_table" AS t LIMIT 1
> > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > 12/08/23 19:25:19 ERROR orm.ClassWriter: No Java type for SQL type 2003 for column rc_list
> >
> > There are 4 bigint columns in the table. Please guide me on whether Sqoop
> > supports bigint columns or not.
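> >
> > To check whether plain bigint on its own is the problem, I may try a small
> > isolated test first (just a sketch with a made-up table name, reusing the
> > same connection settings as above):
> >
> > # single-column bigint table; import with one mapper to keep it simple
> > psql -U postgres -d hadooppipeline -c "CREATE TABLE bigint_only_test (id bigint PRIMARY KEY);"
> > bin/sqoop import --connect jdbc:postgresql://localhost/hadooppipeline \
> >     --table bigint_only_test --username postgres -P -m 1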
> >
> > I did some R&D and found only one link, but I was not able to solve the issue:
> >
> > https://issues.cloudera.org/browse/SQOOP-48?page=com.atlassian.jira.plugin.system.issuetabpanels%3Aall-tabpanel
> >
> >
> > Thanks
>
