sqoop-user mailing list archives

From Arvind Prabhakar <arv...@apache.org>
Subject Re: [sqoop-user] Sqoop with Teradata
Date Thu, 08 Sep 2011 01:43:02 GMT
Srini,

One thing you can try is building and using the latest Sqoop from
trunk. Also, when you run Sqoop, specify the --verbose option to
produce detailed output that helps pinpoint what could be going wrong.
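
For example, a rough sketch of that workflow (the SVN URL and ant
target are assumptions -- adjust for your environment):

$ svn checkout http://svn.apache.org/repos/asf/incubator/sqoop/trunk sqoop-trunk
$ cd sqoop-trunk
$ ant package
$ ./bin/sqoop export --verbose --connect "..." ...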

Thanks,
Arvind

On Fri, Sep 2, 2011 at 2:21 PM, SRINIVAS SURASANI <vasajb@gmail.com> wrote:
> Arvind,
> Now I'm getting a strange error. I made sure the table has an equal
> number of attributes and that the data is not corrupted.
> 1st error:
> sqoop export -libjars /jdbc/13.00.00.07/lib/tdgssconfig.jar --verbose --driver
> com.teradata.jdbc.TeraDriver --connect jdbc:teradata://PTD/EW1_CMTS_WORK
> --username EDWBD_CMTS --password  --table EW1_CMS_WORK.TMRB_TEST
> --export-dir /user/hadrdev/sqoop_msrb_test.txt --fields-terminated-by ','
> --lines-terminated-by '\n'
>
>
> 11/09/02 16:25:14 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 11/09/02 16:25:14 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM EW1_CMS_WORK.TMRB_TEST AS t WHERE 1=0
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/jdbc/13.00.00.07/common/lib/terajdbc4.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/commons-io-1.4.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/sqljdbc4.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> 11/09/02 16:25:14 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
> 922 for hadrdev
> 11/09/02 16:25:14 INFO security.TokenCache: Got dt for
> hdfs://idoop:9000/tmp/hadoop-mapred/mapred/staging/hadrdev/.staging/job_201108311434_0051/libjars/tdgssconfig.jar;uri=10.128.225.1:9000;t.service=10.128.225.1:9000
> 11/09/02 16:25:14 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=4
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Total input
> bytes=7854508
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: maxSplitSize=1963627
> 11/09/02 16:25:14 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadrdev/sqoop_msrb_test.txt:0+7854508 Locations:idoop.com:;
> 11/09/02 16:25:15 INFO mapred.JobClient: Running job: job_201108311434_0051
> 11/09/02 16:25:16 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:25:24 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0051_m_000000_0, Status : FAILED
> java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata
> Database] [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error:
> expected something between ')' and ','.
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:217)
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:45)
>         at
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:530)
>         at
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
>         at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:646)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>         at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database]
> [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error: expected
> something between ')' and ','.
>         at
> com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:288)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:102)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:285)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:176)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:108)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementController.run(StatementController.java:99)
>         at
> com.teradata.jdbc.jdbc_4.Statement.executeStatement(Statement.java:331)
>         at
> com.teradata.jdbc.jdbc_4.Statement.prepareRequest(Statement.java:491)
>         at
> com.teradata.jdbc.jdbc_4.PreparedStatement.<init>(PreparedStatement.java:56)
>         at
> com.teradata.jdbc.jdbc_4.TDSession.createPreparedStatement(TDSession.java:689)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalPreparedStatement.<init>(TeraLocalPreparedStatement.java:84)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.prepareStatement(TeraLocalConnection.java:327)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.prepareStatement(TeraLocalConnection.java:148)
>         at
> com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.getPreparedStatement(ExportOutputFormat.java:142)
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.execUpdate(AsyncSqlRecordWriter.java:146)
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:212)
>
> 2nd error, for another command of the same kind: here the mapper
> retries three times and then the job fails.
> duce.ExportInputFormat: Total input bytes=24
> 11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat: maxSplitSize=24
> 11/09/02 16:34:24 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadrdev/sqoop_test.txt:0+24 Locations:idoop.ms.com:;
> 11/09/02 16:34:25 INFO mapred.JobClient: Running job: job_201108311434_0052
> 11/09/02 16:34:26 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:34:37 INFO mapred.JobClient:  map 100% reduce 0%
> 11/09/02 16:44:37 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:44:37 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0052_m_000000_0, Status : FAILED
> Task attempt_201108311434_0052_m_000000_0 failed to report status for 600
> seconds. Killing!
> 11/09/02 16:44:46 INFO mapred.JobClient:  map 100% reduce 0%
> 11/09/02 16:54:46 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:54:46 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0052_m_000000_1, Status : FAILED
> Task attempt_201108311434_0052_m_000000_1 failed to report status for 600
> seconds. Killing!
> 11/09/02 16:54:56 INFO mapred.JobClient:  map 100% reduce 0%
> 11/09/02 17:04:56 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 17:04:56 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0052_m_000000_2, Status : FAILED
> Task attempt_201108311434_0052_m_000000_2 failed to report status for 600
> seconds. Killing!
> 11/09/02 17:05:07 INFO mapred.JobClient:  map 100% reduce 0%
>
> Thanks as always, Arvind, for taking your valuable time to help.
>
>
> On Fri, Sep 2, 2011 at 2:12 AM, Arvind Prabhakar <arvind@apache.org> wrote:
>>
>> Please try specifying the extra jar file using the -libjars argument.
>> This is a generic Hadoop argument that Sqoop passes down to the
>> framework and should allow the inclusion of other jar files in the
>> classpath. Note that it must be specified before any Sqoop-specific
>> argument. For example:
>>
>> $ bin/sqoop import -libjars /path/to/gssjar --connect "..."
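>>
>> If you need more than one extra jar (for example both the Teradata
>> driver and the GSS config jar), -libjars also accepts a comma-separated
>> list -- a sketch with placeholder paths:
>>
>> $ bin/sqoop export -libjars /path/to/terajdbc4.jar,/path/to/tdgssconfig.jar \
>>     --connect "..." --table "..." --export-dir "..."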
>>
>> Thanks,
>> Arvind
>>
>> On Thu, Sep 1, 2011 at 8:54 PM, SRINIVAS SURASANI <vasajb@gmail.com>
>> wrote:
>> > Arvind,
>> >
>> > I understand that the GSS config jar needs to be placed in the Sqoop
>> > lib directory. I was wondering whether there is an alternative way to
>> > achieve this [meaning: how is terajdbc4.jar added to the distributed
>> > cache automatically before launching Map-Reduce?].
>> >
>> > Thanks,
>> > Srini
>> >
>> >
>> >
>> > On Wed, Aug 31, 2011 at 1:44 PM, Arvind Prabhakar <arvind@apache.org>
>> > wrote:
>> >>
>> >> Srini,
>> >>
>> >> This is happening because the GSS config Jar is not getting put in
>> >> Distributed Cache. Sqoop only puts certain jars in the cache as
>> >> opposed to putting every jar that exists in its classpath. In order to
>> >> force any Jar to be put in the Distributed Cache, you must copy it
>> >> over to Sqoop's lib directory.
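>> >>
>> >> For example, a minimal sketch, assuming the standard CDH layout under
>> >> /usr/lib/sqoop and a placeholder source path for the jar:
>> >>
>> >> $ sudo cp /path/to/tdgssconfig.jar /usr/lib/sqoop/lib/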
>> >>
>> >> Thanks,
>> >> Arvind
>> >>
>> >> On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <vasajb@gmail.com>
>> >> wrote:
>> >> > Getting the error below while exporting. From my observation: while
>> >> > compiling the generated .java file, Sqoop sets the classpath for both
>> >> > terajdbc4.jar and tdgssconfig.jar (as marked below), but just before
>> >> > launching map-reduce it is not adding tdgssconfig.jar to the job
>> >> > classpath.
>> >> > I set HADOOP_CLASSPATH=<path to>terajdbc4.jar:<path to>tdgssconfig.jar
>> >> > Any help appreciated.
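>> >> > Concretely, that setting was along the lines of this sketch, with
>> >> > placeholder paths:
>> >> > $ export HADOOP_CLASSPATH=/path/to/terajdbc4.jar:/path/to/tdgssconfig.jar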
>> >> > $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver
>> >> > --connect jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table
>> >> > --export-dir /user/hadoop/sqoop_test.txt --fields-terminated-by ','
>> >> > --lines-terminated-by '\n' -m 1
>> >> > 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> >> > Enter password:
>> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> >> > 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of
>> >> > 1000
>> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> >> > com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
>> >> > 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
>> >> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > 1000
>> >> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> >> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > 1000
>> >> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
>> >> > DB_Temp_Table.java
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> >> > 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
>> >> > /usr/lib/hadoop
>> >> > 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar
>> >> > at:
>> >> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with
>> >> > args:
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> >> >
>> >> >
>> >> > /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/
>> >> > protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.jar:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > Note:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> >> > uses or overrides a deprecated API.
>> >> > Note: Recompile with -Xlint:deprecation for details.
>> >> > 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class
>> >> > files in
>> >> > directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
>> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
>> >> > -> DB_Temp_Table.class
>> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar
>> >> > file
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> >> > 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
>> >> > DB.Temp_Table
>> >> > 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
>> >> > com.cloudera.sqoop.mapreduce.ExportInputFormat
>> >> > 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > 1000
>> >> > 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:<somepath>/lib/terajdbc4.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/sqljdbc4.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
>> >> > 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN
>> >> > token
>> >> > 795 for hadoop
>> >> > 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
>> >> >
>> >> >
>> >> > hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
>> >> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
>> >> > process :
>> >> > 1
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target
>> >> > numMapTasks=1
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input
>> >> > bytes=18
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
>> >> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
>> >> > process :
>> >> > 1
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated
>> >> > splits:
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
>> >> > Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
>> >> > 11/08/30 22:59:53 INFO mapred.JobClient: Running job:
>> >> > job_201107010928_0398
>> >> > 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
>> >> > 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
>> >> > attempt_201107010928_0398_m_000000_0, Status : FAILED
>> >> > java.io.IOException: java.lang.NullPointerException
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
>> >> >         at
>> >> >
>> >> >
>> >> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
>> >> >         at
>> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
>> >> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>> >> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> >> >         at java.security.AccessController.doPrivileged(Native Method)
>> >> >         at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >         at
>> >> >
>> >> >
>> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>> >> >         at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> >> > Caused by: java.lang.NullPointerException
>> >> >         at
>> >> > com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>> >> >         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> >         at
>> >> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >> >         at
>> >> > com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >> >         at
>> >> > com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >> >         at
>> >> > com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >> >         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >> >         at
>> >> > java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >         at
>> >> > java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
>> >> >         ... 8 more
>> >> >
>> >> > attempt_201107010928_0398_m_000000_0: GSSException: Failure
>> >> > unspecified at GSS-API level (Mechanism level: UserFile parameter null)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
>> >> > On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <vasajb@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> Arvind,
>> >> >>
>> >> >> I have subscribed to sqoop-user@incubator.apache.org and posted the
>> >> >> question there. Sorry for the inconvenience from my end; since I'm
>> >> >> close to a deadline, I'm taking up your valuable time.
>> >> >>
>> >> >> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
>> >> >> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
>> >> >> I am getting the following error:
>> >> >> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of
>> >> >> 1000
>> >> >> GSSException: Failure unspecified at GSS-API level (Mechanism level:
>> >> >> UserFile parameter null)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> >> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >> >>         at
>> >> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >> >>         at
>> >> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >> >>         at
>> >> >> com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>> >> >>         at
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>> >> >>         at
>> >> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>> >> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>> >> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>> >> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> >> >> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
>> >> >> java.lang.NullPointerException
>> >> >> java.lang.NullPointerException
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>> >> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >> >>         at
>> >> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >> >>         at
>> >> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >> >>         at
>> >> >> com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>> >> >>         at
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>> >> >>         at
>> >> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>> >> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>> >> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>> >> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> >> >> Thanks,
>> >> >> Srini
>> >> >>
>> >> >>
>> >> >>
>> >> >> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
>> >> >> <arvind@cloudera.com> wrote:
>> >> >>>
>> >> >>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>> >> >>>
>> >> >>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>> >> >>> use the generic JDBC connector, you will have to specify the driver
>> >> >>> class explicitly via the command line option --driver
>> >> >>> com.teradata.jdbc.TeraDriver.
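>> >> >>>
>> >> >>> Put together, a sketch (the jar path is a placeholder):
>> >> >>>
>> >> >>> $ export HADOOP_CLASSPATH=/path/to/terajdbc4.jar
>> >> >>> $ sqoop list-tables --driver com.teradata.jdbc.TeraDriver \
>> >> >>>     --connect jdbc:teradata://HOST/DATABASE --username user -P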
>> >> >>>
>> >> >>> Thanks,
>> >> >>> Arvind
>> >> >>>
>> >> >>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI
>> >> >>> <vasajb@gmail.com>
>> >> >>> wrote:
>> >> >>> > Arvind,
>> >> >>> > I have set the classpath to terajdbc4.jar [I have not placed
>> >> >>> > terajdbc4.jar in the sqoop lib directory, as I don't have
>> >> >>> > permissions].
>> >> >>> > I'm getting the following error:
>> >> >>> >
>> >> >>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK
>> >> >>> > --username
>> >> >>> > srini -P
>> >> >>> > ERROR tool.BaseSqoopTool: Got error creating database manager:
>> >> >>> > java.io.IOException: No manager for connect string:
>> >> >>> > jdbc:teradata:///PKTD/E1_CMS_WORK
>> >> >>> >    at
>> >> >>> > com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>> >> >>> >    at
>> >> >>> >
>> >> >>> > com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>> >> >>> >    ...
>> >> >>> >    ...
>> >> >>> > Thanks,
>> >> >>> > Srini
>> >> >>> >
>> >> >>> >
>> >> >>> >
>> >> >>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI
>> >> >>> > <vasajb@gmail.com>
>> >> >>> > wrote:
>> >> >>> >>
>> >> >>> >> Thanks a lot, Arvind.
>> >> >>> >>
>> >> >>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>> >> >>> >> <arvind@cloudera.com>
>> >> >>> >> wrote:
>> >> >>> >>>
>> >> >>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
>> >> >>> >>>
>> >> >>> >>> Hi Srini,
>> >> >>> >>>
>> >> >>> >>> You should be able to use the generic JDBC connector to
>> >> >>> >>> import/export from Teradata. There is also a specialized
>> >> >>> >>> connector available for use with Teradata if you are
>> >> >>> >>> interested. This connector is not a part of Sqoop and can be
>> >> >>> >>> obtained from Cloudera by going to:
>> >> >>> >>>
>> >> >>> >>> http://www.cloudera.com/partners/connectors/
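>> >> >>> >>>
>> >> >>> >>> With the generic connector, an export would look roughly like
>> >> >>> >>> this sketch (host, database, and table names are placeholders):
>> >> >>> >>>
>> >> >>> >>> $ sqoop export --driver com.teradata.jdbc.TeraDriver \
>> >> >>> >>>     --connect jdbc:teradata://HOST/DATABASE --username user -P \
>> >> >>> >>>     --table DATABASE.MY_TABLE --export-dir /user/me/data.csv \
>> >> >>> >>>     --fields-terminated-by ','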
>> >> >>> >>>
>> >> >>> >>> Thanks,
>> >> >>> >>> Arvind
>> >> >>> >>>
>> >> >>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI
>> >> >>> >>> <vasajb@gmail.com>
>> >> >>> >>> wrote:
>> >> >>> >>> > I have a CSV file in Hadoop and am looking to load it into
>> >> >>> >>> > Teradata. I was wondering: does Sqoop work with Teradata
>> >> >>> >>> > (with the JDBC jar placed in the sqoop lib dir)?
>> >> >>> >>> >
>> >> >>> >>> > Regards
>> >> >>> >>> > Srini
>> >> >>> >>> >