sqoop-user mailing list archives

From: Jarek Jarcec Cecho <jar...@apache.org>
Subject: Re: Running Sqoop job from Oozie fails on create database
Date: Tue, 27 Nov 2012 17:04:06 GMT
Hi John,
Sqoop does not support Hive integration when running from Oozie. The recommended workaround is to first run the Sqoop import into a temporary directory (without the Hive import) and then load that data into Hive in a separate Hive action.
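
For illustration only, a minimal sketch of that two-step pattern in a workflow.xml could look like the following (the action names, staging directory, JDBC host/database, and script name are placeholders, not taken from your job):

    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- plain HDFS import into a staging directory; note there is no --hive-import -->
            <command>import --connect jdbc:mysql://dbhost/mydb --table admin_users --target-dir /tmp/staging/admin_users</command>
        </sqoop>
        <ok to="hive-load"/>
        <error to="fail"/>
    </action>
    <action name="hive-load">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- load_admin_users.hql would contain something like:
                 LOAD DATA INPATH '/tmp/staging/admin_users' INTO TABLE admin_users; -->
            <script>load_admin_users.hql</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>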

Jarcec

On Tue, Nov 27, 2012 at 04:49:59PM +0000, John Dasher wrote:
> Hi,
> 
> I am attempting to run a Sqoop job from Oozie to load a table in Hive, incrementally. The Oozie job errors with: "org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db'"
> 
> We have Hive set up to store the metadata in a MySQL database, so I'm lost trying to find out where/why it's trying to create a database in Derby. Any pointers or information would be greatly appreciated.
> 
> 
> Thank you,
> 
> John
> 
> 
> We're using CDH4 (Free Edition):
> 
> Hadoop 2.0.0-cdh4.0.1
> 
> Oozie client build version: 3.1.3-cdh4.0.1
> 
> Sqoop 1.4.1-cdh4.0.1
> 
> 
> Sqoop command and syslog below.
> 
> Sqoop command arguments:
>              job
>              --meta-connect
>              jdbc:hsqldb:hsql://hadoopdw4:16000/sqoop
>              --exec
>              sq_admin_users_hive
> 
> 
> syslog logs
> 
> 
> 
> 2012-11-27 14:40:13,395 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 2012-11-27 14:40:13,617 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
> 2012-11-27 14:40:14,048 WARN org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
> 2012-11-27 14:40:14,049 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
> 2012-11-27 14:40:14,757 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
> 2012-11-27 14:40:14,763 INFO org.apache.hadoop.mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1ce3570c
> 2012-11-27 14:40:15,004 WARN org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library is available
> 2012-11-27 14:40:15,004 INFO org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library loaded
> 2012-11-27 14:40:15,011 WARN mapreduce.Counters: Counter name MAP_INPUT_BYTES is deprecated. Use FileInputFormatCounters as group name and  BYTES_READ as counter name instead
> 2012-11-27 14:40:15,015 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 0
> 2012-11-27 14:40:15,549 WARN org.apache.sqoop.tool.SqoopTool: $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
> 2012-11-27 14:40:15,950 WARN org.apache.sqoop.ConnFactory: $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
> 2012-11-27 14:40:16,036 INFO org.apache.sqoop.manager.MySQLManager: Preparing to use a MySQL streaming resultset.
> 2012-11-27 14:40:16,044 INFO org.apache.sqoop.tool.CodeGenTool: Beginning code generation
> 2012-11-27 14:40:16,598 INFO org.apache.sqoop.manager.SqlManager: Executing SQL statement: SELECT t.* FROM `admin_users` AS t LIMIT 1
> 2012-11-27 14:40:16,640 INFO org.apache.sqoop.manager.SqlManager: Executing SQL statement: SELECT t.* FROM `admin_users` AS t LIMIT 1
> 2012-11-27 14:40:16,660 INFO org.apache.sqoop.orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop-0.20-mapreduce
> 2012-11-27 14:40:16,661 INFO org.apache.sqoop.orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop-0.20-mapreduce/hadoop-core.jar
> 2012-11-27 14:40:20,544 INFO org.apache.sqoop.orm.CompilationManager: Writing jar file: /tmp/sqoop-mapred/compile/7fef46c7a9af683cd26c7cf826f91b6e/admin_users.jar
> 2012-11-27 14:40:20,600 INFO org.apache.sqoop.tool.ImportTool: Incremental import based on column `updated_at`
> 2012-11-27 14:40:20,602 INFO org.apache.sqoop.tool.ImportTool: Lower bound value: '2012-11-26 21:12:01.0'
> 2012-11-27 14:40:20,602 INFO org.apache.sqoop.tool.ImportTool: Upper bound value: '2012-11-27 14:40:20.0'
> 2012-11-27 14:40:20,602 WARN org.apache.sqoop.manager.MySQLManager: It looks like you are importing from mysql.
> 2012-11-27 14:40:20,604 WARN org.apache.sqoop.manager.MySQLManager: This transfer can be faster! Use the --direct
> 2012-11-27 14:40:20,604 WARN org.apache.sqoop.manager.MySQLManager: option to exercise a MySQL-specific fast path.
> 2012-11-27 14:40:20,604 INFO org.apache.sqoop.manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
> 2012-11-27 14:40:20,627 INFO org.apache.sqoop.mapreduce.ImportJobBase: Beginning import of admin_users
> 2012-11-27 14:40:20,698 WARN org.apache.sqoop.mapreduce.JobBase: SQOOP_HOME is unset. May not be able to find all job dependencies.
> 2012-11-27 14:40:21,017 WARN org.apache.hadoop.mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 2012-11-27 14:40:21,493 INFO org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `admin_users` WHERE ( `updated_at` >= '2012-11-26 21:12:01.0' AND `updated_at` < '2012-11-27 14:40:20.0' )
> 2012-11-27 14:40:21,942 INFO org.apache.hadoop.mapred.JobClient: Running job: job_201210230847_10616
> 2012-11-27 14:40:22,945 INFO org.apache.hadoop.mapred.JobClient:  map 0% reduce 0%
> 2012-11-27 14:40:32,974 INFO org.apache.hadoop.mapred.JobClient:  map 100% reduce 0%
> 2012-11-27 14:40:35,985 INFO org.apache.hadoop.mapred.JobClient: Job complete: job_201210230847_10616
> 2012-11-27 14:40:35,988 INFO org.apache.hadoop.mapred.JobClient: Counters: 23
> 2012-11-27 14:40:35,988 INFO org.apache.hadoop.mapred.JobClient:   File System Counters
> 2012-11-27 14:40:35,997 INFO org.apache.hadoop.mapred.JobClient:     FILE: Number of bytes read=0
> 2012-11-27 14:40:35,997 INFO org.apache.hadoop.mapred.JobClient:     FILE: Number of bytes written=66125
> 2012-11-27 14:40:35,997 INFO org.apache.hadoop.mapred.JobClient:     FILE: Number of read operations=0
> 2012-11-27 14:40:35,997 INFO org.apache.hadoop.mapred.JobClient:     FILE: Number of large read operations=0
> 2012-11-27 14:40:35,998 INFO org.apache.hadoop.mapred.JobClient:     FILE: Number of write operations=0
> 2012-11-27 14:40:35,998 INFO org.apache.hadoop.mapred.JobClient:     HDFS: Number of bytes read=105
> 2012-11-27 14:40:35,998 INFO org.apache.hadoop.mapred.JobClient:     HDFS: Number of bytes written=0
> 2012-11-27 14:40:35,998 INFO org.apache.hadoop.mapred.JobClient:     HDFS: Number of read operations=1
> 2012-11-27 14:40:35,998 INFO org.apache.hadoop.mapred.JobClient:     HDFS: Number of large read operations=0
> 2012-11-27 14:40:35,998 INFO org.apache.hadoop.mapred.JobClient:     HDFS: Number of write operations=1
> 2012-11-27 14:40:36,004 INFO org.apache.hadoop.mapred.JobClient:   Job Counters
> 2012-11-27 14:40:36,004 INFO org.apache.hadoop.mapred.JobClient:     Launched map tasks=1
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Total time spent
by all maps in occupied slots (ms)=12802
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Total time spent
by all reduces in occupied slots (ms)=0
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Total time spent
by all maps waiting after reserving slots (ms)=0
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Total time spent
by all reduces waiting after reserving slots (ms)=0
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:   Map-Reduce Framework
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Map input records=0
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Map output records=0
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Input split bytes=105
> 2012-11-27 14:40:36,005 INFO org.apache.hadoop.mapred.JobClient:     Spilled Records=0
> 2012-11-27 14:40:36,006 INFO org.apache.hadoop.mapred.JobClient:     CPU time spent (ms)=710
> 2012-11-27 14:40:36,006 INFO org.apache.hadoop.mapred.JobClient:     Physical memory (bytes) snapshot=84942848
> 2012-11-27 14:40:36,006 INFO org.apache.hadoop.mapred.JobClient:     Virtual memory (bytes) snapshot=1023770624
> 2012-11-27 14:40:36,006 INFO org.apache.hadoop.mapred.JobClient:     Total committed heap usage (bytes)=59572224
> 2012-11-27 14:40:36,019 INFO org.apache.sqoop.mapreduce.ImportJobBase: Transferred 0 bytes in 15.309 seconds (0 bytes/sec)
> 2012-11-27 14:40:36,022 INFO org.apache.sqoop.mapreduce.ImportJobBase: Retrieved 0 records.
> 2012-11-27 14:40:36,024 INFO org.apache.sqoop.hive.HiveImport: Removing temporary files from import process: admin_users/_logs
> 2012-11-27 14:40:36,029 INFO org.apache.sqoop.hive.HiveImport: Loading uploaded data into Hive
> 2012-11-27 14:40:36,057 INFO org.apache.sqoop.manager.SqlManager: Executing SQL statement: SELECT t.* FROM `admin_users` AS t LIMIT 1
> 2012-11-27 14:40:36,064 WARN org.apache.sqoop.hive.TableDefWriter: Column reset_password_sent_at had to be cast to a less precise type in Hive
> 2012-11-27 14:40:36,064 WARN org.apache.sqoop.hive.TableDefWriter: Column remember_created_at had to be cast to a less precise type in Hive
> 2012-11-27 14:40:36,064 WARN org.apache.sqoop.hive.TableDefWriter: Column current_sign_in_at had to be cast to a less precise type in Hive
> 2012-11-27 14:40:36,064 WARN org.apache.sqoop.hive.TableDefWriter: Column last_sign_in_at had to be cast to a less precise type in Hive
> 2012-11-27 14:40:36,064 WARN org.apache.sqoop.hive.TableDefWriter: Column created_at had to be cast to a less precise type in Hive
> 2012-11-27 14:40:36,064 WARN org.apache.sqoop.hive.TableDefWriter: Column updated_at had to be cast to a less precise type in Hive
> 2012-11-27 14:40:38,326 INFO org.apache.sqoop.hive.HiveImport: WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
> 2012-11-27 14:40:39,995 INFO org.apache.sqoop.hive.HiveImport: Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-0.8.1-cdh4.0.1.jar!/hive-log4j.properties
> 2012-11-27 14:40:40,012 INFO org.apache.sqoop.hive.HiveImport: Hive history file=/tmp/mapred/hive_job_log_mapred_201211271440_1425102933.txt
> 2012-11-27 14:40:44,451 INFO org.apache.sqoop.hive.HiveImport: org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:44,451 INFO org.apache.sqoop.hive.HiveImport: NestedThrowables:
> 2012-11-27 14:40:44,454 INFO org.apache.sqoop.hive.HiveImport: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:44,456 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:991)
> 2012-11-27 14:40:44,456 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:976)
> 2012-11-27 14:40:44,456 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:7852)
> 2012-11-27 14:40:44,457 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:7251)
> 2012-11-27 14:40:44,457 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:243)
> 2012-11-27 14:40:44,457 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:430)
> 2012-11-27 14:40:44,457 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
> 2012-11-27 14:40:44,457 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889)
> 2012-11-27 14:40:44,457 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
> 2012-11-27 14:40:44,458 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
> 2012-11-27 14:40:44,458 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
> 2012-11-27 14:40:44,458 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:338)
> 2012-11-27 14:40:44,458 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:436)
> 2012-11-27 14:40:44,458 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:446)
> 2012-11-27 14:40:44,459 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:642)
> 2012-11-27 14:40:44,459 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
> 2012-11-27 14:40:44,459 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport:  at java.lang.reflect.Method.invoke(Method.java:597)
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport: Caused by: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport: NestedThrowables:
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:298)
> 2012-11-27 14:40:44,461 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:601)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at java.lang.reflect.Method.invoke(Method.java:597)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at java.security.AccessController.doPrivileged(Native Method)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275)
> 2012-11-27 14:40:44,462 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208)
> 2012-11-27 14:40:44,463 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183)
> 2012-11-27 14:40:44,463 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
> 2012-11-27 14:40:44,463 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
> 2012-11-27 14:40:44,463 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:407)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:359)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:504)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:266)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:228)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:114)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2111)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2121)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:989)
> 2012-11-27 14:40:44,468 INFO org.apache.sqoop.hive.HiveImport:  ... 20 more
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport: Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at java.sql.DriverManager.getConnection(DriverManager.java:582)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at java.sql.DriverManager.getConnection(DriverManager.java:185)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
> 2012-11-27 14:40:44,469 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
> 2012-11-27 14:40:44,470 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
> 2012-11-27 14:40:44,470 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
> 2012-11-27 14:40:44,470 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
> 2012-11-27 14:40:44,470 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
> 2012-11-27 14:40:44,470 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
> 2012-11-27 14:40:44,478 INFO org.apache.sqoop.hive.HiveImport:  ... 47 more
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport: Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  ... 73 more
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport: Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  ... 70 more
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport: Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
> 2012-11-27 14:40:44,479 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
> 2012-11-27 14:40:44,480 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
> 2012-11-27 14:40:44,480 INFO org.apache.sqoop.hive.HiveImport:  at java.security.AccessController.doPrivileged(Native Method)
> 2012-11-27 14:40:44,480 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
> 2012-11-27 14:40:44,480 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
> 2012-11-27 14:40:44,480 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
> 2012-11-27 14:40:44,480 INFO org.apache.sqoop.hive.HiveImport:  at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
> 2012-11-27 14:40:44,488 INFO org.apache.sqoop.hive.HiveImport:  ... 70 more
> 2012-11-27 14:40:46,665 INFO org.apache.sqoop.hive.HiveImport: FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:46,665 INFO org.apache.sqoop.hive.HiveImport: NestedThrowables:
> 2012-11-27 14:40:46,666 INFO org.apache.sqoop.hive.HiveImport: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
> 2012-11-27 14:40:46,669 INFO org.apache.sqoop.hive.HiveImport: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
> 2012-11-27 14:40:46,692 ERROR org.apache.sqoop.tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9
>         at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:388)
>         at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:338)
>         at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:249)
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
>         at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:233)
>         at org.apache.sqoop.tool.JobTool.run(JobTool.java:288)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>         at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:205)
>         at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:174)
>         at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
>         at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:47)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:454)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:393)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>         at org.apache.hadoop.mapred.Child.main(Child.java:264)
> 
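
As for why Hive reaches for Derby even though your metastore lives in MySQL: the Hive CLI that Sqoop spawns inside the Oozie launcher task only sees whatever hive-site.xml happens to be on that task's classpath. If your MySQL metastore configuration is not shipped with the action, Hive falls back to the packaged default, which on CDH points the metastore at an embedded Derby database under /var/lib/hive/metastore/metastore_db, a directory the task user cannot create. This is a guess, not something visible in your logs, but making your hive-site.xml available to the action should point it back at MySQL; the relevant properties look roughly like this (host, database name, and credentials are placeholders):

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://metastore-host/metastore</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>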
