sqoop-user mailing list archives

From Brenden Cobb <Brenden.C...@humedica.com>
Subject Re: Sqoop to HDFS error Cannot initialize cluster
Date Fri, 31 Jan 2014 18:57:39 GMT
Hi Abe, thanks for your continued help on this issue. I changed my Sqoop environment setting
to point to the 0.20 MapReduce libraries and seem to have progressed (or stepped back?).

It appears I've moved past the MapReduce error and into JDBC connector errors. Any thoughts
are appreciated.
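For anyone hitting the same wall: the environment change described above amounts to repointing Sqoop at the MRv1 client libraries before running the import. A minimal sketch, assuming the CDH parcel layout that appears later in this log (your parcel path may differ):

```shell
# Point Sqoop's code generation and job submission at the MRv1 (0.20)
# MapReduce client instead of MR2/YARN. The path is the CDH parcel
# default seen later in this log; adjust to your installation.
export HADOOP_MAPRED_HOME=/opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce
echo "HADOOP_MAPRED_HOME=$HADOOP_MAPRED_HOME"
```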


$ sqoop import --connect jdbc:oracle:thin:@som-dmsandbox03.humedica.net:1521:DB11G --username
SQOOP --password xx --table SQOOP.test --verbose

14/01/31 13:50:01 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.5.0
14/01/31 13:50:01 DEBUG tool.BaseSqoopTool: Enabled debug logging.
14/01/31 13:50:01 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure.
Consider using -P instead.
14/01/31 13:50:01 DEBUG util.ClassLoaderStack: Checking for existing class: com.quest.oraoop.OraOopManagerFactory
14/01/31 13:50:01 DEBUG util.ClassLoaderStack: Class is already available. Skipping jar /opt/cloudera/parcels/CDH/lib/sqoop/lib/oraoop-1.6.0.jar
14/01/31 13:50:01 DEBUG sqoop.ConnFactory: Added factory com.quest.oraoop.OraOopManagerFactory
in jar /opt/cloudera/parcels/CDH/lib/sqoop/lib/oraoop-1.6.0.jar specified by /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/bin/../lib/sqoop/conf/managers.d/oraoop
14/01/31 13:50:01 DEBUG sqoop.ConnFactory: Loaded manager factory: com.quest.oraoop.OraOopManagerFactory
14/01/31 13:50:01 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
14/01/31 13:50:01 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.quest.oraoop.OraOopManagerFactory
14/01/31 13:50:01 INFO manager.SqlManager: Using default fetchSize of 1000
14/01/31 13:50:02 INFO oraoop.OraOopManagerFactory:
***********************************************************************
*** Using Quest® Data Connector for Oracle and Hadoop 1.6.0-cdh4-20 ***
*** Copyright 2012 Quest Software, Inc.                             ***
*** ALL RIGHTS RESERVED.                                            ***
***********************************************************************
14/01/31 13:50:02 INFO oraoop.OraOopManagerFactory: Oracle Database version: Oracle Database
11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
14/01/31 13:50:02 INFO oraoop.OraOopManagerFactory: This Oracle database is not a RAC.
14/01/31 13:50:02 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.quest.oraoop.OraOopConnManager@40f892a4
14/01/31 13:50:02 INFO tool.CodeGenTool: Beginning code generation
14/01/31 13:50:02 DEBUG manager.SqlManager: Execute getColumnTypesRawQuery : SELECT "FIRST","LAST","EMAIL"
FROM SQOOP.test WHERE 0=1
14/01/31 13:50:02 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
14/01/31 13:50:02 INFO manager.SqlManager: Executing SQL statement: SELECT "FIRST","LAST","EMAIL"
FROM SQOOP.test WHERE 0=1
14/01/31 13:50:02 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
14/01/31 13:50:02 INFO manager.SqlManager: Executing SQL statement: SELECT FIRST,LAST,EMAIL
FROM SQOOP.test WHERE 1=0
14/01/31 13:50:02 DEBUG orm.ClassWriter: selected columns:
14/01/31 13:50:02 DEBUG orm.ClassWriter:   FIRST
14/01/31 13:50:02 DEBUG orm.ClassWriter:   LAST
14/01/31 13:50:02 DEBUG orm.ClassWriter:   EMAIL
14/01/31 13:50:02 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/SQOOP_test.java
14/01/31 13:50:02 DEBUG orm.ClassWriter: Table name: SQOOP.test
14/01/31 13:50:02 DEBUG orm.ClassWriter: Columns: FIRST:12, LAST:12, EMAIL:12,
14/01/31 13:50:02 DEBUG orm.ClassWriter: sourceFilename is SQOOP_test.java
14/01/31 13:50:02 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/
14/01/31 13:50:02 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce
14/01/31 13:50:02 INFO orm.CompilationManager: Found hadoop core jar at: /opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/hadoop-core.jar
14/01/31 13:50:02 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/SQOOP_test.java
14/01/31 13:50:02 DEBUG orm.CompilationManager: Invoking javac with args:
14/01/31 13:50:02 DEBUG orm.CompilationManager:   -sourcepath
14/01/31 13:50:02 DEBUG orm.CompilationManager:   /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/
14/01/31 13:50:02 DEBUG orm.CompilationManager:   -d
14/01/31 13:50:02 DEBUG orm.CompilationManager:   /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/
14/01/31 13:50:02 DEBUG orm.CompilationManager:   -classpath
14/01/31 13:50:02 DEBUG orm.CompilationManager:   /etc/hadoop/conf:/opt/cloudera/parcels/CDH/lib/hadoop/lib/jersey-json-1.8.

-- jars removed for brevity --

Note: /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/SQOOP_test.java uses or overrides
a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/01/31 13:50:03 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/SQOOP_test.java
to /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/etc/sqoop/conf.dist/./SQOOP_test.java
java.io.FileNotFoundException: /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/etc/sqoop/conf.dist/./SQOOP_test.java
(Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:194)
at java.io.FileOutputStream.<init>(FileOutputStream.java:145)
at org.apache.commons.io.FileUtils.doCopyFile(FileUtils.java:936)
at org.apache.commons.io.FileUtils.copyFile(FileUtils.java:888)
at org.apache.commons.io.FileUtils.copyFile(FileUtils.java:835)
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2385)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:97)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:396)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
at org.apache.sqoop.Sqoop.main(Sqoop.java:240)
14/01/31 13:50:03 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/SQOOP.test.jar
14/01/31 13:50:03 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a
14/01/31 13:50:03 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/SQOOP_test.class
-> SQOOP_test.class
14/01/31 13:50:03 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-oracle/compile/95f371afda8b0247e432776caf5d5f4a/SQOOP.test.jar
14/01/31 13:50:03 INFO mapreduce.ImportJobBase: Beginning import of SQOOP.test
14/01/31 13:50:03 DEBUG db.DBConfiguration: Securing password into job credentials store
14/01/31 13:50:03 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
14/01/31 13:50:03 INFO manager.SqlManager: Executing SQL statement: SELECT FIRST,LAST,EMAIL
FROM SQOOP.test WHERE 1=0
14/01/31 13:50:03 DEBUG mapreduce.DataDrivenImportJob: Using table class: SQOOP_test
14/01/31 13:50:03 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat: class com.quest.oraoop.OraOopDataDrivenDBInputFormat
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/sqoop-1.4.3-cdh4.5.0.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ojdbc6.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/oraoop-1.6.0.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/sqoop-1.4.3-cdh4.5.0.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/xz-1.0.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/paranamer-2.3.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/snappy-java-1.0.4.1.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ant-contrib-1.0b3.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/netty-3.4.0.Final.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/jackson-mapper-asl-1.8.8.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/servlet-api-2.5-20081211.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/avro-mapred-1.7.4-hadoop2.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ojdbc6.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/commons-compress-1.4.1.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/jackson-core-asl-1.8.8.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/avro-1.7.4.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/avro-ipc-1.7.4-tests.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/jetty-util-6.1.26.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/oraoop-1.6.0.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/commons-io-1.4.jar
14/01/31 13:50:03 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/hsqldb-1.8.0.10.jar
14/01/31 13:50:04 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments.
Applications should implement Tool for the same.
14/01/31 13:50:05 WARN oraoop.OraOopUtilities: System property java.security.egd is not set
to file:///dev/urandom - Oracle connections may time out.
14/01/31 13:50:05 DEBUG db.DBConfiguration: Fetching password from job credentials store
14/01/31 13:50:05 INFO oraoop.OracleConnectionFactory: Session Time Zone set to GMT
14/01/31 13:50:05 INFO oraoop.OracleConnectionFactory: Initializing Oracle session with SQL
:
begin
  dbms_application_info.set_module(module_name => 'Quest® Data Connector for Oracle and
Hadoop', action_name => 'import 20140131135002EST');
end;
14/01/31 13:50:05 INFO oraoop.OracleConnectionFactory: Initializing Oracle session with SQL
: alter session disable parallel query
14/01/31 13:50:05 INFO oraoop.OracleConnectionFactory: Initializing Oracle session with SQL
: alter session set "_serial_direct_read"=true
14/01/31 13:50:05 INFO oraoop.OracleConnectionFactory: Initializing Oracle session with SQL
: alter session set tracefile_identifier=oraoop
14/01/31 13:50:05 INFO mapred.JobClient: Cleaning up the staging area hdfs://som-dmsandbox01.humedica.net:8020/user/oracle/.staging/job_201401311341_0004
14/01/31 13:50:05 ERROR security.UserGroupInformation: PriviledgedActionException as:oracle
(auth:SIMPLE) cause:java.io.IOException: java.sql.SQLSyntaxErrorException: ORA-00942: table
or view does not exist

14/01/31 13:50:05 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException:
java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist

at com.quest.oraoop.OraOopDataDrivenDBInputFormat.getSplits(OraOopDataDrivenDBInputFormat.java:120)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1079)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1096)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:177)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:995)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:948)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:948)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:606)
at com.quest.oraoop.OraOopConnManager.importTable(OraOopConnManager.java:260)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
at org.apache.sqoop.Sqoop.main(Sqoop.java:240)
Caused by: java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist

at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:447)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:396)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:951)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:513)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:227)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:531)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:208)
at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:886)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1175)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1296)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3613)
at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3657)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1495)
at com.quest.oraoop.OraOopOracleQueries.getOracleDataChunksExtent(OraOopOracleQueries.java:265)
at com.quest.oraoop.OraOopDataDrivenDBInputFormat.getSplits(OraOopDataDrivenDBInputFormat.java:74)
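The ORA-00942 above is thrown from OraOopOracleQueries.getOracleDataChunksExtent, i.e. while the Quest connector reads segment metadata from the data dictionary, not while reading the table itself (the earlier SELECT against SQOOP.test succeeded). Two common causes, both hedged guesses here: the SQOOP user cannot see the dictionary view the connector consults for extent chunks (a DBA grant such as SELECT_CATALOG_ROLE typically cures that), or the table-name case does not match, since Oracle stores unquoted identifiers in upper case. A first retry might look like this (note -P instead of --password, per the warning at the top of the log):

```shell
# Sketch only: upper-cased table name, prompting for the password.
sqoop import \
  --connect jdbc:oracle:thin:@som-dmsandbox03.humedica.net:1521:DB11G \
  --username SQOOP -P \
  --table SQOOP.TEST --verbose
```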

From: Abraham Elmahrek <abe@cloudera.com>
Reply-To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Date: Thursday, January 30, 2014 7:12 PM
To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Subject: Re: Sqoop to HDFS error Cannot initialize cluster

It looks like you're using a newer version of Hadoop that requires the mapreduce.jobtracker.address
property to be set. You can likely use the same value as what you've provided for mapred.job.tracker.

-Abe
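Spelled out as a config entry, that suggestion would look something like the following in mapred-site.xml. The host:port is copied from the mapred.job.tracker value quoted further down the thread; treat it as an example, not a verified fix:

```xml
<property>
  <!-- Mirror the MRv1 JobTracker address under the Hadoop-2 property name -->
  <name>mapreduce.jobtracker.address</name>
  <value>som-dmsandbox01.humedica.net:8021</value>
</property>
```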


On Thu, Jan 30, 2014 at 3:58 PM, Brenden Cobb <Brenden.Cobb@humedica.com>
wrote:
Here we go, thanks:

[oracle@som-dmsandbox03 ~]$ sqoop import --connect jdbc:oracle:thin:@som-dmsandbox03.humedica.net:1521:DB11G
--username sqoop --password xx --table sqoop.test --verbose
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
14/01/30 18:51:26 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.5.0
14/01/30 18:51:26 DEBUG tool.BaseSqoopTool: Enabled debug logging.
14/01/30 18:51:26 DEBUG util.ClassLoaderStack: Checking for existing class: com.quest.oraoop.OraOopManagerFactory
14/01/30 18:51:26 DEBUG util.ClassLoaderStack: Class is already available. Skipping jar /opt/cloudera/parcels/CDH/lib/sqoop/lib/oraoop-1.6.0.jar
14/01/30 18:51:26 DEBUG sqoop.ConnFactory: Added factory com.quest.oraoop.OraOopManagerFactory
in jar /opt/cloudera/parcels/CDH/lib/sqoop/lib/oraoop-1.6.0.jar specified by /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/bin/../lib/sqoop/conf/managers.d/oraoop
14/01/30 18:51:26 DEBUG sqoop.ConnFactory: Loaded manager factory: com.quest.oraoop.OraOopManagerFactory
14/01/30 18:51:26 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
14/01/30 18:51:26 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.quest.oraoop.OraOopManagerFactory
14/01/30 18:51:26 INFO manager.SqlManager: Using default fetchSize of 1000
14/01/30 18:51:28 INFO oraoop.OraOopManagerFactory:
***********************************************************************
*** Using Quest® Data Connector for Oracle and Hadoop 1.6.0-cdh4-20 ***
*** Copyright 2012 Quest Software, Inc.                             ***
*** ALL RIGHTS RESERVED.                                            ***
***********************************************************************
14/01/30 18:51:28 INFO oraoop.OraOopManagerFactory: Oracle Database version: Oracle Database
11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
14/01/30 18:51:28 INFO oraoop.OraOopManagerFactory: This Oracle database is not a RAC.
14/01/30 18:51:28 WARN conf.Configuration: mapred.map.max.attempts is deprecated. Instead,
use mapreduce.map.maxattempts
14/01/30 18:51:28 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.quest.oraoop.OraOopConnManager@5a20f443
14/01/30 18:51:28 INFO tool.CodeGenTool: Beginning code generation
14/01/30 18:51:28 DEBUG manager.SqlManager: Execute getColumnTypesRawQuery : SELECT "FIRST","LAST","EMAIL"
FROM sqoop.test WHERE 0=1
14/01/30 18:51:28 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
14/01/30 18:51:28 INFO manager.SqlManager: Executing SQL statement: SELECT "FIRST","LAST","EMAIL"
FROM sqoop.test WHERE 0=1
14/01/30 18:51:28 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
14/01/30 18:51:28 INFO manager.SqlManager: Executing SQL statement: SELECT FIRST,LAST,EMAIL
FROM sqoop.test WHERE 1=0
14/01/30 18:51:28 DEBUG orm.ClassWriter: selected columns:
14/01/30 18:51:28 DEBUG orm.ClassWriter:   FIRST
14/01/30 18:51:28 DEBUG orm.ClassWriter:   LAST
14/01/30 18:51:28 DEBUG orm.ClassWriter:   EMAIL
14/01/30 18:51:28 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/sqoop_test.java
14/01/30 18:51:28 DEBUG orm.ClassWriter: Table name: sqoop.test
14/01/30 18:51:28 DEBUG orm.ClassWriter: Columns: FIRST:12, LAST:12, EMAIL:12,
14/01/30 18:51:28 DEBUG orm.ClassWriter: sourceFilename is sqoop_test.java
14/01/30 18:51:28 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/
14/01/30 18:51:28 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
14/01/30 18:51:28 INFO orm.CompilationManager: Found hadoop core jar at: /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar
14/01/30 18:51:28 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/sqoop_test.java
14/01/30 18:51:28 DEBUG orm.CompilationManager: Invoking javac with args:
14/01/30 18:51:28 DEBUG orm.CompilationManager:   -sourcepath
14/01/30 18:51:28 DEBUG orm.CompilationManager:   /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/
14/01/30 18:51:28 DEBUG orm.CompilationManager:   -d
14/01/30 18:51:28 DEBUG orm.CompilationManager:   /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/
14/01/30 18:51:28 DEBUG orm.CompilationManager:   -classpath
Note: /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/sqoop_test.java uses or overrides
a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/01/30 18:51:30 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/sqoop_test.java
to /usr/local/oracle/./sqoop_test.java

......jars output removed for brevity......

org.apache.commons.io.FileExistsException: Destination '/usr/local/oracle/./sqoop_test.java'
already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:97)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:396)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
at org.apache.sqoop.Sqoop.main(Sqoop.java:240)
14/01/30 18:51:30 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/sqoop.test.jar
14/01/30 18:51:30 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b
14/01/30 18:51:30 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/sqoop_test.class
-> sqoop_test.class
14/01/30 18:51:30 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-oracle/compile/e3be71c52b86d50ad8177d621099250b/sqoop.test.jar
14/01/30 18:51:30 INFO mapreduce.ImportJobBase: Beginning import of sqoop.test
14/01/30 18:51:30 WARN conf.Configuration: mapred.job.tracker is deprecated. Instead, use
mapreduce.jobtracker.address
14/01/30 18:51:30 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/01/30 18:51:30 DEBUG db.DBConfiguration: Securing password into job credentials store
14/01/30 18:51:30 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
14/01/30 18:51:30 INFO manager.SqlManager: Executing SQL statement: SELECT FIRST,LAST,EMAIL
FROM sqoop.test WHERE 1=0
14/01/30 18:51:30 DEBUG mapreduce.DataDrivenImportJob: Using table class: sqoop_test
14/01/30 18:51:30 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat: class com.quest.oraoop.OraOopDataDrivenDBInputFormat
14/01/30 18:51:31 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/sqoop-1.4.3-cdh4.5.0.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ojdbc6.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/oraoop-1.6.0.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/sqoop-1.4.3-cdh4.5.0.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/xz-1.0.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/paranamer-2.3.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/snappy-java-1.0.4.1.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ant-contrib-1.0b3.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/netty-3.4.0.Final.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/jackson-mapper-asl-1.8.8.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/servlet-api-2.5-20081211.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/avro-mapred-1.7.4-hadoop2.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/ojdbc6.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/commons-compress-1.4.1.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/jackson-core-asl-1.8.8.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/avro-1.7.4.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/avro-ipc-1.7.4-tests.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/jetty-util-6.1.26.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/oraoop-1.6.0.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/commons-io-1.4.jar
14/01/30 18:51:31 DEBUG mapreduce.JobBase: Adding to job classpath: file:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/sqoop/lib/hsqldb-1.8.0.10.jar
14/01/30 18:51:31 INFO mapreduce.Cluster: Failed to use org.apache.hadoop.mapred.LocalClientProtocolProvider
due to error: Invalid "mapreduce.jobtracker.address" configuration value for LocalJobRunner
: "som-dmsandbox01.humedica.net:8021"
14/01/30 18:51:31 ERROR security.UserGroupInformation: PriviledgedActionException as:oracle
(auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration
for mapreduce.framework.name and the correspond server
addresses.
14/01/30 18:51:31 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException:
Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name
and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:122)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:84)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:77)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1239)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1235)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1234)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1263)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:606)
at com.quest.oraoop.OraOopConnManager.importTable(OraOopConnManager.java:260)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
at org.apache.sqoop.Sqoop.main(Sqoop.java:240)

From: Abraham Elmahrek <abe@cloudera.com>
Reply-To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Date: Thursday, January 30, 2014 6:32 PM

To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Subject: Re: Sqoop to HDFS error Cannot initialize cluster

Hmm could you please run the command with the --verbose option at the end and provide the
sqoop log?
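A convenient way to capture that from the gateway shell, assuming sqoop is on the PATH, is to tee the verbose run into a file and attach it:

```shell
# Hypothetical invocation; substitute your real connect string and table.
sqoop import --connect jdbc:oracle:thin:@host:1521:SID \
  --username SQOOP -P --table SQOOP.TEST --verbose 2>&1 | tee sqoop-import.log
```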


On Thu, Jan 30, 2014 at 3:09 PM, Brenden Cobb <Brenden.Cobb@humedica.com>
wrote:
Thanks very much for the detailed instructions. However, I am still receiving the error:

14/01/30 18:07:58 ERROR security.UserGroupInformation: PriviledgedActionException as:root
(auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration
for mapreduce.framework.name and the correspond server
addresses.
14/01/30 18:07:58 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException:
Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name
and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:122)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:84)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:77)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1239)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1235)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1234)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1263)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:606)
at com.quest.oraoop.OraOopConnManager.importTable(OraOopConnManager.java:260)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
at org.apache.sqoop.Sqoop.main(Sqoop.java:240)


Any other thoughts?

Thanks

From: Abraham Elmahrek <abe@cloudera.com>
Reply-To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Date: Thursday, January 30, 2014 3:01 PM

To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Subject: Re: Sqoop to HDFS error Cannot initialize cluster

It seems like mapreduce.framework.name is missing from this configuration. You should be
able to use a safety valve to manually add it in Cloudera Manager. I believe the correct
value here is "classic", since you don't have YARN deployed.

To add a safety valve configuration for MapReduce, go to Services -> Mapreduce -> Configuration
-> View and Edit -> Service Wide -> Advanced -> Safety valve configuration for
mapred-site.xml. You should be able to add the entry:

<property>
  <name>mapreduce.framework.name</name>
  <value>classic</value>
</property>

Then save and restart MR. Let us know how it goes.
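For context, here's roughly what is failing: Cluster.initialize() asks each client-protocol provider whether it accepts the configuration, and the exception is thrown when none does. A simplified illustration in Python — this is a sketch of the selection logic, not Hadoop's actual code:

```python
# Simplified illustration (not Hadoop's actual code) of how the MR client
# picks a job runner. mapred.job.tracker is deprecation-mapped to
# mapreduce.jobtracker.address, which is why a CM-managed MR1 config
# still sets the tracker address but not the framework name.
def init_cluster(conf):
    framework = conf.get("mapreduce.framework.name")
    tracker = conf.get("mapreduce.jobtracker.address", "local")

    # LocalClientProtocolProvider: only accepts a truly local setup.
    if framework == "local" or (framework is None and tracker == "local"):
        return "LocalJobRunner"
    # JTProtocolProvider (MR1): requires framework explicitly set to "classic".
    if framework == "classic":
        return "JobTracker at " + tracker
    # No provider accepted -> the IOException seen in the logs above.
    raise IOError('Cannot initialize Cluster. Please check your configuration '
                  'for mapreduce.framework.name and the correspond server addresses.')
```

With only the jobtracker address set (as in your mapred-site.xml), neither branch accepts the configuration, which is why explicitly setting "classic" should help.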

-Abe


On Thu, Jan 30, 2014 at 11:18 AM, Brenden Cobb <Brenden.Cobb@humedica.com>
wrote:
Mapred-site.xml:

<!--Autogenerated by Cloudera CM on 2013-12-04T22:38:07.943Z-->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>som-dmsandbox01.humedica.net:8021</value>
  </property>
  <property>
    <name>mapred.job.tracker.http.address</name>
    <value>0.0.0.0:50030</value>
  </property>
  <property>
    <name>mapreduce.job.counters.max</name>
    <value>120</value>
  </property>
  <property>
    <name>mapred.output.compress</name>
    <value>false</value>
  </property>
  <property>
    <name>mapred.output.compression.type</name>
    <value>BLOCK</value>
  </property>
  <property>
    <name>mapred.output.compression.codec</name>
    <value>org.apache.hadoop.io.compress.DefaultCodec</value>
  </property>
  <property>
    <name>mapred.map.output.compression.codec</name>
    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
  </property>
  <property>
    <name>mapred.compress.map.output</name>
    <value>true</value>
  </property>
  <property>
    <name>zlib.compress.level</name>
    <value>DEFAULT_COMPRESSION</value>
  </property>
  <property>
    <name>io.sort.factor</name>
    <value>64</value>
  </property>
  <property>
    <name>io.sort.record.percent</name>
    <value>0.05</value>
  </property>
  <property>
    <name>io.sort.spill.percent</name>
    <value>0.8</value>
  </property>
  <property>
    <name>mapred.reduce.parallel.copies</name>
    <value>10</value>
  </property>
  <property>
    <name>mapred.submit.replication</name>
    <value>2</value>
  </property>
  <property>
    <name>mapred.reduce.tasks</name>
    <value>2</value>
  </property>
  <property>
    <name>mapred.userlog.retain.hours</name>
    <value>24</value>
  </property>
  <property>
    <name>io.sort.mb</name>
    <value>71</value>
  </property>
  <property>
    <name>mapred.child.java.opts</name>
    <value> -Xmx298061516</value>
  </property>
  <property>
    <name>mapred.job.reuse.jvm.num.tasks</name>
    <value>1</value>
  </property>
  <property>
    <name>mapred.map.tasks.speculative.execution</name>
    <value>false</value>
  </property>
  <property>
    <name>mapred.reduce.tasks.speculative.execution</name>
    <value>false</value>
  </property>
  <property>
    <name>mapred.reduce.slowstart.completed.maps</name>
    <value>0.8</value>
  </property>
</configuration>
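(Side note: a quick way to confirm the property really is absent from a client config like the one above — a throwaway sketch, not part of any Hadoop tooling; substitute whatever path your deployed mapred-site.xml lives at:)

```python
import xml.etree.ElementTree as ET

def get_conf(xml_text):
    """Parse a Hadoop *-site.xml into a {name: value} dict."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value")
            for p in root.iter("property")}

# In practice: conf = get_conf(open("/etc/hadoop/conf/mapred-site.xml").read())
sample = """<configuration>
  <property><name>mapred.job.tracker</name><value>host:8021</value></property>
</configuration>"""
conf = get_conf(sample)
print("mapreduce.framework.name" in conf)  # -> False: the property is missing
```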

From: Abraham Elmahrek <abe@cloudera.com>
Reply-To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Date: Thursday, January 30, 2014 2:13 PM

To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Subject: Re: Sqoop to HDFS error Cannot initialize cluster

Hmmm, could you provide your mapred-site.xml? It seems like you need to set
mapreduce.framework.name to "classic" if you're using MR1.

-Abe


On Thu, Jan 30, 2014 at 11:02 AM, Brenden Cobb <Brenden.Cobb@humedica.com>
wrote:
Hi Abe-

Sqoop 1.4.3 was installed as part of CDH 4.5

Using the server domain instead of localhost did push things along a bit, but now the job
complains that the jobtracker address (pointing at the "master" node) is an invalid
configuration value for the LocalJobRunner:

14/01/30 13:49:18 INFO mapreduce.Cluster: Failed to use org.apache.hadoop.mapred.LocalClientProtocolProvider
due to error: Invalid "mapreduce.jobtracker.address" configuration value for LocalJobRunner
: "som-dmsandbox01.humedica.net:8021"
14/01/30 13:49:18 ERROR security.UserGroupInformation: PriviledgedActionException as:oracle
(auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration
for mapreduce.framework.name and the correspond server
addresses.
14/01/30 13:49:18 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException:
Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name
and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:122)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:84)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:77)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1239)


The log above mentions som-dmsandbox01.humedica.net (the master), while the machine I'm
executing on is som-dmsandbox03.humedica.net.

-BC

From: Abraham Elmahrek <abe@cloudera.com>
Reply-To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Date: Thursday, January 30, 2014 1:49 PM
To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Subject: Re: Sqoop to HDFS error Cannot initialize cluster

Hey there,

Sqoop1 is actually just a really heavy client. It creates MapReduce jobs to transfer the data.

With that being said, I'm curious: how was Sqoop installed? What version of Sqoop1 are
you running? It might be as simple as setting the HADOOP_HOME environment variable or updating
one of the configs.

-Abe


On Thu, Jan 30, 2014 at 10:36 AM, Brenden Cobb <Brenden.Cobb@humedica.com>
wrote:
I think I have part of the answer: I'm specifying localhost when I should be using the
actual hostname; otherwise Sqoop thinks it's not in distributed mode?

-BC
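(One more thing worth double-checking while changing the host: this thread mixes the two Oracle thin URL shapes — @host:port:SID and @host:port/ServiceName — and they are not interchangeable. A tiny sketch of the two forms; the hostname below is just the one from this thread:)

```python
def thin_url(host, port, db, by_service=False):
    """Build a jdbc:oracle:thin URL: ':' selects by SID, '/' by service name."""
    sep = "/" if by_service else ":"
    return "jdbc:oracle:thin:@%s:%d%s%s" % (host, port, sep, db)

print(thin_url("som-dmsandbox03.humedica.net", 1521, "DB11G"))
# -> jdbc:oracle:thin:@som-dmsandbox03.humedica.net:1521:DB11G  (SID form)
```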

From: Brenden Cobb <brenden.cobb@humedica.com>
Reply-To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Date: Thursday, January 30, 2014 12:34 PM
To: "user@sqoop.apache.org" <user@sqoop.apache.org>
Subject: Sqoop to HDFS error Cannot initialize cluster

Hello-

I'm trying to Sqoop data from Oracle to HDFS but am getting the following error:

$ sqoop import --connect jdbc:oracle:thin:@localhost:1521/DB11G --username sqoop --password
xx --table sqoop.test

…
14/01/30 10:58:10 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-oracle/compile/fa0ce9acd6ac6d0c349389a6dbfee62b/sqoop.test.jar
14/01/30 10:58:10 INFO mapreduce.ImportJobBase: Beginning import of sqoop.test
14/01/30 10:58:10 WARN conf.Configuration: mapred.job.tracker is deprecated. Instead, use
mapreduce.jobtracker.address
14/01/30 10:58:10 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/01/30 10:58:10 INFO manager.SqlManager: Executing SQL statement: SELECT FIRST,LAST,EMAIL
FROM sqoop.test WHERE 1=0
14/01/30 10:58:11 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/01/30 10:58:11 ERROR security.UserGroupInformation: PriviledgedActionException as:oracle
(auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration
for mapreduce.framework.name and the correspond server
addresses.
14/01/30 10:58:11 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException:
Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name
and the correspond server addresses.

at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:122)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:84)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:77)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1239)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1235)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1234)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1263)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:606)
at com.quest.oraoop.OraOopConnManager.importTable(OraOopConnManager.java:260)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
at org.apache.sqoop.Sqoop.main(Sqoop.java:240)


Checking just the Database side works ok:
$ sqoop list-tables --connect jdbc:oracle:thin:@localhost:1521:DB11G --username sqoop --password
xx
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
14/01/30 12:12:20 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.5.0
14/01/30 12:12:20 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure.
Consider using -P instead.
14/01/30 12:12:20 INFO manager.SqlManager: Using default fetchSize of 1000
14/01/30 12:12:21 INFO manager.OracleManager: Time zone has been set to GMT
TEST


Any thoughts?

Thanks,
BC





