sqoop-user mailing list archives

From: Bhavesh Shah <bhavesh25s...@gmail.com>
Subject: Re: Table not creating in hive
Date: Fri, 03 Feb 2012 08:09:46 GMT
Hello Alex,
Thanks for your reply.
I have observed that this happens only with some tables.
Some tables import with complete data, while some do not.
But either way, whether the import completes or not, the entry is not
listed by the "SHOW TABLES" command.

I cannot figure out why this is happening.
Is there any problem in my configuration?
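One thing worth checking (a rough sketch only; it assumes hive-site.xml
lives under $HIVE_HOME/conf) is which metastore Hive is actually pointing at:

    # show the configured metastore connection URL
    grep -A 1 'javax.jdo.option.ConnectionURL' $HIVE_HOME/conf/hive-site.xml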



-
Thanks and Regards,
Bhavesh Shah




On Fri, Feb 3, 2012 at 1:34 PM, alo alt <wget.null@googlemail.com> wrote:

> 0 records were imported, so the table will not be created since it has no
> data. Also check the file:
> > /java.io.IOException: Destination
> '/home/hadoop/sqoop-1.3.0-cdh3u1/bin/./Appointment.java' already exists
>
> sqoop will move it, but it still exists.
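> A sketch of the cleanup (paths taken from your log; --outdir and --bindir
> are standard sqoop code-generation options for redirecting these files):
>
>     # remove the stale generated source
>     rm /home/hadoop/sqoop-1.3.0-cdh3u1/bin/Appointment.java
>
>     # or keep generated code out of bin/ on future runs
>     sqoop import --connect '...' --table Appointment \
>         --outdir /tmp/sqoop-gen --bindir /tmp/sqoop-gen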
>
> - Alex
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
>
> On Feb 3, 2012, at 6:22 AM, Bhavesh Shah wrote:
>
> >
> >
> > ---------- Forwarded message ----------
> > From: Bhavesh Shah <bhavesh25shah@gmail.com>
> > Date: Fri, Feb 3, 2012 at 10:38 AM
> > Subject: Re: Table not creating in hive
> > To: dev@hive.apache.org, sqoop-user@incubator.apache.org
> >
> >
> > Hello Bejoy & Alexis,
> > Thanks for your reply.
> > I am using MySQL as the metastore database (not Derby).
> > Previously I was using --split-by and it was working fine, but when I
> installed MySQL and changed the database I got an error for the --split-by
> option, and that is why I now use -m 1.
> > But even with -m 1, it shows that 0 records were retrieved.
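> > (Note: --split-by expects a column name rather than a number. A sketch of
> > a multi-mapper run, where CreatedDate is only a guessed choice of split
> > column from the schema in the logs below:
> >
> >     ./sqoop-import --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=FIGMDHadoopTest' \
> >         --table Appointment --hive-table appointment --hive-import \
> >         --split-by CreatedDate -m 4
> > )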
> >
> > Here are the logs.
> > hadoop@ubuntu:~/sqoop-1.3.0-cdh3u1/bin$ ./sqoop-import  --connect
> 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=FIGMDHadoopTest'
> --table Appointment --hive-table appointment -m 1 --hive-import --verbose
> >
> > 12/01/31 22:33:40 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > 12/01/31 22:33:40 INFO tool.BaseSqoopTool: Using Hive-specific
> delimiters for output. You can override
> > 12/01/31 22:33:40 INFO tool.BaseSqoopTool: delimiters with
> --fields-terminated-by, etc.
> > 12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Added factory
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> /home/hadoop/sqoop-1.3.0-cdh3u1/conf/managers.d/mssqoop-sqlserver
> > 12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.cloudera.sqoop.manager.DefaultManagerFactory
> > 12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 12/01/31 22:33:40 INFO SqlServer.MSSQLServerManagerFactory: Using
> Microsoft's SQL Server - Hadoop Connector
> > 12/01/31 22:33:40 INFO manager.SqlManager: Using default fetchSize of
> 1000
> > 12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> com.microsoft.sqoop.SqlServer.MSSQLServerManager@116471f
> > 12/01/31 22:33:40 INFO tool.CodeGenTool: Beginning code generation
> > 12/01/31 22:33:40 DEBUG manager.SqlManager: No connection paramenters
> specified. Using regular API for making connection.
> > 12/01/31 22:33:41 DEBUG manager.SqlManager: Using fetchSize for next
> query: 1000
> > 12/01/31 22:33:41 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [Appointment]
> > 12/01/31 22:33:41 DEBUG manager.SqlManager: Using fetchSize for next
> query: 1000
> > 12/01/31 22:33:41 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [Appointment]
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter: selected columns:
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   AppointmentUid
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   ExternalID
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   PatientUid
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   StartTime
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   EndTime
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   ResourceUid
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   Note
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   AppointmentTypeUid
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   AppointmentStatusUid
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   CheckOutNote
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   CreatedDate
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter:   CreatedByUid
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter: Writing source file:
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.java
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter: Table name: Appointment
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter: Columns: AppointmentUid:1,
> ExternalID:12, PatientUid:1, StartTime:93, EndTime:93, ResourceUid:1,
> RenderringProviderUid:1, ReferringProviderUid:1, ServiceLocationUid:1,
> Note:-1, AppointmentTypeUid:1, AppointmentStatusUid:1, CheckOutNote:12,
> ROSxml:-1, CreatedDate:93, CreatedByUid:1, ModifiedDate:93,
> ModifiedByUid:1, SingleDayAppointmentGroupUid:1,
> MultiDayAppointmentGroupUid:1,
> > 12/01/31 22:33:41 DEBUG orm.ClassWriter: sourceFilename is
> Appointment.java
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager: Found existing
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/
> > 12/01/31 22:33:41 INFO orm.CompilationManager: HADOOP_HOME is
> /home/hadoop/hadoop-0.20.2-cdh3u2
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager: Adding source file:
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.java
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager: Invoking javac with args:
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager:   -sourcepath
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager:   -d
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager:   -classpath
> > 12/01/31 22:33:41 DEBUG orm.CompilationManager:
> /home/hadoop/hadoop-0.20.2-cdh3u2//conf:/usr/lib/jvm/java-6-sun-1.6.0.26//lib/tools.jar:/home/hadoop/hadoop-0.20.2-cdh3u2/:/home/hadoop/hadoop-0.20.2-cdh3u2//hadoop-core-0.20.2-cdh3u2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/ant-contrib-1.0b3.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-cli-1.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-codec-1.4.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-el-1.0.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-httpclient-3.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-logging-1.0.4.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/core-3.1.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/hadoop-fairscheduler-0.20.2-cdh3u2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jackson-core-asl-1.5.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jackson-mapper-asl-1.5.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jetty-6.1.26.cloudera.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jetty-util-6.1.26.cloudera.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jsch-0.1.42.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/junit-4.5.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/kfs-0.2.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/libfb303.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/libthrift.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/log4j-1.2.15.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/mockito-all-1.8.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/oro-2.0.8.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/servlet-api-2.5-6.1.14.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/xmlenc-0.52.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1/conf/::/home/hadoop/sqoop-1.3.0-cdh3u1//lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/avro-1.5.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/avro-ipc-1.5.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/avro-mapred-1.5.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/commons-io-1.4.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/jopt-simple-3.2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/paranamer-2.3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/snappy-java-1.0.3-rc2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/sqljdbc4.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/sqoop-sqlserver-1.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/hadoop/hbase-0.90.1-cdh3u0/:/home/hadoop/hbase-0.90.1-cdh3u0//hbase-0.90.1-cdh3u0.jar:/home/hadoop/hbase-0.90.1-cdh3
u0//hbase-0.90.1-cdh3u0-tests.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/activation-1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/asm-3.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/avro-1.3.3.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-cli-1.2.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-codec-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-el-1.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-httpclient-3.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-lang-2.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-logging-1.1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-net-1.4.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/core-3.1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/guava-r06.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/hadoop-core-0.20.2-cdh3u0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/hbase-0.90.1-cdh3u0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-core-asl-1.5.2.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-jaxrs-1.5.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-mapper-asl-1.5.2.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-xc-1.5.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jasper-compiler-5.5.23.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jasper-runtime-5.5.23.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jaxb-api-2.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jaxb-impl-2.1.12.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jersey-core-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jersey-json-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jersey-server-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jettison-1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jetty-6.1.26.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jetty-util-6.1.26.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jruby-complete-1.0.3.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsp-2.1-6.1.14.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsp-api-2.1-6.1.14.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsp-api-2.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsr311-api-1.1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/log4j-1.2.16.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/protobuf-java-2.3.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/servlet-api-2.5-6.1.14.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/servlet-api-2.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/slf4j-api-1.5.8.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/slf4j-log4j12-1.5.8.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/stax-api-1.0.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/thrift-0.2.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/xmlenc-0.52.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/zookeeper-3.3.3-cdh3u0.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//sqoop-1.3.0-cdh3u1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//sqoop-test-1.3.0-cdh3u1.jar::/home/hadoop/hadoop-0.20.2-cdh3u2/hadoop-core-0.20.2-cdh3u2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1/sqoop-1.3.0-cdh3u1.jar
> > 12/01/31 22:33:42 ERROR orm.CompilationManager: Could not rename
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.java
> to /home/hadoop/sqoop-1.3.0-cdh3u1/bin/./Appointment.java
> > java.io.IOException: Destination
> '/home/hadoop/sqoop-1.3.0-cdh3u1/bin/./Appointment.java' already exists
> >     at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
> >     at
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
> >     at
> com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
> >     at
> com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:337)
> >     at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
> >     at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >     at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
> >     at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
> >     at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
> >     at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> > 12/01/31 22:33:42 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.jar
> > 12/01/31 22:33:42 DEBUG orm.CompilationManager: Scanning for .class
> files in directory:
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff
> > 12/01/31 22:33:42 DEBUG orm.CompilationManager: Got classfile:
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.class
> -> Appointment.class
> > 12/01/31 22:33:42 DEBUG orm.CompilationManager: Finished writing jar
> file
> /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.jar
> > 12/01/31 22:33:42 INFO mapreduce.ImportJobBase: Beginning import of
> Appointment
> > 12/01/31 22:33:42 DEBUG manager.SqlManager: Using fetchSize for next
> query: 1000
> > 12/01/31 22:33:42 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [Appointment]
> > 12/01/31 22:33:42 DEBUG mapreduce.DataDrivenImportJob: Using table
> class: Appointment
> > 12/01/31 22:33:42 DEBUG mapreduce.DataDrivenImportJob: Using
> InputFormat: class com.microsoft.sqoop.SqlServer.MSSQLServerDBInputFormat
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/sqoop-1.3.0-cdh3u1.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqljdbc4.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqoop-sqlserver-1.0.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/sqoop-1.3.0-cdh3u1.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/jopt-simple-3.2.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/jackson-mapper-asl-1.7.3.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/snappy-java-1.0.3-rc2.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/ant-eclipse-1.0-jvm1.2.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqoop-sqlserver-1.0.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/avro-mapred-1.5.1.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/avro-1.5.1.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/avro-ipc-1.5.1.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/paranamer-2.3.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqljdbc4.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/jackson-core-asl-1.7.3.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/commons-io-1.4.jar
> > 12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/ant-contrib-1.0b3.jar
> > 12/01/31 22:33:43 INFO mapred.JobClient: Running job:
> job_201201311414_0051
> > 12/01/31 22:33:44 INFO mapred.JobClient:  map 0% reduce 0%
> > 12/01/31 22:33:48 INFO mapred.JobClient:  map 100% reduce 0%
> > 12/01/31 22:33:48 INFO mapred.JobClient: Job complete:
> job_201201311414_0051
> > 12/01/31 22:33:48 INFO mapred.JobClient: Counters: 11
> > 12/01/31 22:33:48 INFO mapred.JobClient:   Job Counters
> > 12/01/31 22:33:48 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=4152
> > 12/01/31 22:33:48 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> > 12/01/31 22:33:48 INFO mapred.JobClient:     Total time spent by all
> maps waiting after reserving slots (ms)=0
> > 12/01/31 22:33:48 INFO mapred.JobClient:     Launched map tasks=1
> > 12/01/31 22:33:48 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > 12/01/31 22:33:48 INFO mapred.JobClient:   FileSystemCounters
> > 12/01/31 22:33:48 INFO mapred.JobClient:     HDFS_BYTES_READ=87
> > 12/01/31 22:33:48 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=61985
> > 12/01/31 22:33:48 INFO mapred.JobClient:   Map-Reduce Framework
> > 12/01/31 22:33:48 INFO mapred.JobClient:     Map input records=0
> > 12/01/31 22:33:48 INFO mapred.JobClient:     Spilled Records=0
> > 12/01/31 22:33:48 INFO mapred.JobClient:     Map output records=0
> > 12/01/31 22:33:48 INFO mapred.JobClient:     SPLIT_RAW_BYTES=87
> > 12/01/31 22:33:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
> 6.2606 seconds (0 bytes/sec)
> > 12/01/31 22:33:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> > 12/01/31 22:33:48 INFO hive.HiveImport: Removing temporary files from
> import process: Appointment/_logs
> > 12/01/31 22:33:48 INFO hive.HiveImport: Loading uploaded data into Hive
> > 12/01/31 22:33:48 DEBUG hive.HiveImport: Hive.inputTable: Appointment
> > 12/01/31 22:33:48 DEBUG hive.HiveImport: Hive.outputTable: appointment
> > 12/01/31 22:33:48 DEBUG manager.SqlManager: No connection paramenters
> specified. Using regular API for making connection.
> > 12/01/31 22:33:48 DEBUG manager.SqlManager: Using fetchSize for next
> query: 1000
> > 12/01/31 22:33:48 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [Appointment]
> > 12/01/31 22:33:48 DEBUG manager.SqlManager: Using fetchSize for next
> query: 1000
> > 12/01/31 22:33:48 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [Appointment]
> > 12/01/31 22:33:48 WARN hive.TableDefWriter: Column StartTime had to be
> cast to a less precise type in Hive
> > 12/01/31 22:33:48 WARN hive.TableDefWriter: Column EndTime had to be
> cast to a less precise type in Hive
> > 12/01/31 22:33:48 WARN hive.TableDefWriter: Column CreatedDate had to be
> cast to a less precise type in Hive
> > 12/01/31 22:33:48 WARN hive.TableDefWriter: Column ModifiedDate had to
> be cast to a less precise type in Hive
> > 12/01/31 22:33:48 DEBUG hive.TableDefWriter: Create statement: CREATE
> TABLE IF NOT EXISTS `appointment` ( `AppointmentUid` STRING, `ExternalID`
> STRING, `PatientUid` STRING, `StartTime` STRING, `EndTime` STRING,`Note`
> STRING, `AppointmentTypeUid` STRING, `AppointmentStatusUid` STRING,
> `CheckOutNote` STRING, `CreatedDate` STRING, `CreatedByUid` STRING) COMMENT
> 'Imported by sqoop on 2012/01/31 22:33:48' ROW FORMAT DELIMITED FIELDS
> TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
> > 12/01/31 22:33:48 DEBUG hive.TableDefWriter: Load statement: LOAD DATA
> INPATH 'hdfs://localhost:54310/user/hadoop/Appointment' INTO TABLE
> `appointment`
> > 12/01/31 22:33:48 DEBUG hive.HiveImport: Using external Hive process.
> > 12/01/31 22:33:50 INFO hive.HiveImport: Hive history
> file=/tmp/hadoop/hive_job_log_hadoop_201201312233_1008229902.txt
> > 12/01/31 22:33:52 INFO hive.HiveImport: OK
> > 12/01/31 22:33:52 INFO hive.HiveImport: Time taken: 2.006 seconds
> > 12/01/31 22:33:53 INFO hive.HiveImport: Loading data to table
> default.appointment
> > 12/01/31 22:33:53 INFO hive.HiveImport: OK
> > 12/01/31 22:33:53 INFO hive.HiveImport: Time taken: 0.665 seconds
> > 12/01/31 22:33:53 INFO hive.HiveImport: Hive import complete.
> >
> >
> >
> >
> >
> > --
> > Thanks and Regards,
> > Bhavesh Shah
> >
> >
> >
> > On Thu, Feb 2, 2012 at 8:20 PM, Alexis De La Cruz Toledo <
> alexisdct@gmail.com> wrote:
> > This is because of the metastore.
> > If you have not configured it in a database,
> > Hive creates an embedded Derby metastore in the directory
> > from which you launched Hive; remember where that was.
> > There you should find a directory named metastore_db,
> > and you must launch Hive from that same directory.
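> > (A quick way to find stray embedded metastores, as a sketch that assumes
> > Derby's default metastore_db directory name:
> >
> >     find ~ -type d -name metastore_db 2>/dev/null
> >
> > Then start hive from that directory, or configure a shared metastore in
> > hive-site.xml.)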
> >
> > Regards.
> >
> > On February 2, 2012 at 05:46, Bhavesh Shah <bhavesh25shah@gmail.com>
> > wrote:
> >
> > > Hello all,
> > >
> > > After successfully importing the tables, I am not able to see them in
> > > Hive.
> > > When I imported a table I saw its directory on HDFS (under
> > > /user/hive/warehouse/), but when I execute the "SHOW TABLES" command in
> > > Hive, the table is not in the list.
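> > > (A minimal way to compare the two views, assuming the default warehouse
> > > path from the logs:
> > >
> > >     hadoop fs -ls /user/hive/warehouse/
> > >     hive -e 'SHOW TABLES;'
> > > )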
> > >
> > > I have searched a lot about this but have not found anything.
> > > Please suggest a solution.
> > >
> > >
> > >
> > >
> > > --
> > > Thanks and Regards,
> > > Bhavesh Shah
> > >
> >
> >
> >
> > --
> > Ing. Alexis de la Cruz Toledo.
> > *Av. Instituto Politécnico Nacional No. 2508 Col. San Pedro Zacatenco.
> México,
> > D.F, 07360 *
> > *CINVESTAV, DF.*
> >
> >
> >
> >
> >
> >
> > --
> > Regards,
> > Bhavesh Shah
> >
>
>

