sqoop-user mailing list archives

From Abraham Elmahrek <...@cloudera.com>
Subject Re: import mysql query to hbase using sqoop
Date Thu, 05 Jun 2014 07:14:41 GMT
Hey there,

What do the contents of the directory '/usr/local/Cellar/hbase/0.98.1'
look like? The command 'find /usr/local/Cellar/hbase/0.98.1' should help
figure that out. Please paste the output of that command here.

The environment variables listed above are used in many different areas of
Hadoop. I've typically seen them in startup scripts to resolve libraries
and binaries. In this case, it's used to include jars in the classpath when
you call Sqoop.
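As a rough illustration (a simplified sketch, not Sqoop's actual `bin/configure-sqoop` logic, which may differ by version), the launcher walks the lib directory under HBASE_HOME and appends every jar it finds to the classpath:

```shell
# Simplified sketch of how a *_HOME variable turns into classpath entries.
# The real logic lives in Sqoop's bin/configure-sqoop and may differ.
HBASE_HOME="${HBASE_HOME:-/usr/local/Cellar/hbase/0.98.1}"
SQOOP_CLASSPATH=""
for jar in "$HBASE_HOME"/lib/*.jar; do
  # The -e guard skips the literal pattern when the glob matches nothing.
  [ -e "$jar" ] && SQOOP_CLASSPATH="$SQOOP_CLASSPATH:$jar"
done
echo "classpath: $SQOOP_CLASSPATH"
```

If `$HBASE_HOME/lib` contains no jars at all (which can happen with Homebrew installs, where the jars usually sit under `libexec/lib`), the classpath stays empty and Sqoop reports exactly the "HBase jars are not present in classpath" error below.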

Are you able to import to HDFS?
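For reference, a minimal HDFS-only variant of the failing command would look something like the sketch below (the `--target-dir` path is a hypothetical choice, and it needs a running Hadoop and Sqoop, so it guards for machines where sqoop is not on the PATH):

```shell
# Hedged sketch: same connection settings as the failing HBase import, but
# writing plain files to HDFS instead. --target-dir is a hypothetical path.
if command -v sqoop >/dev/null 2>&1; then
  sqoop import \
    --connect jdbc:mysql:///joshLocal \
    --username root \
    --query "SELECT * FROM BITLOG WHERE \$CONDITIONS" \
    --split-by oozie_job.id \
    --target-dir /tmp/bitlog_hdfs_test
else
  echo "sqoop not found on PATH; skipping"
fi
```

If this succeeds, the MySQL connectivity and base Sqoop setup are fine and the problem is isolated to the HBase classpath.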


On Wed, Jun 4, 2014 at 6:59 PM, Josh Millstein <joshua.millstein@gmail.com> wrote:

> I've got Hadoop / HBase / Hive set up and running (I can create files on
> HDFS, run MapReduce jobs, and create a "table" in both HBase and Hive) on
> my Mac with OS X 10.9. I'm now trying to import data from a MySQL table
> into HBase with Sqoop (using --query rather than --table, etc.). I'm
> getting this error with this command:
> sqoop import --connect jdbc:mysql:///joshLocal --username root \
>   --query "SELECT * FROM BITLOG WHERE \$CONDITIONS" --split-by oozie_job.id \
>   --hbase-table bitlogTest --hbase-create-table --column-family bitLogColumn
> ERROR tool.ImportTool: Error during import: HBase jars are not present in classpath, cannot import to HBase!
> I believe that all the export vars are set up correctly. I have the
> following in sqoop-env.sh:
> export HADOOP_HOME="/usr/local/Cellar/hadoop/2.4.0"
> export HBASE_HOME="/usr/local/Cellar/hbase/0.98.1"
> export HIVE_HOME="/usr/local/Cellar/hive/0.13.0"
> export ZOOCFGDIR="/usr/local/etc/zookeeper"
> export HCAT_HOME="/usr/local/Cellar/hive/0.13.0/libexec/hcatalog"
> One thing I did that gave a different message was to change HBASE_HOME
> to point to HBASE_HOME/libexec/lib in sqoop-env.sh. That gave me an
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
> error. I've seen advice saying that I need to copy Hadoop security jar
> files over to HBase's installation. I don't know exactly which files would
> need to go over, or whether that is even the issue; the only reason I
> thought it might be is that java.security.SecureClassLoader appears in the
> stack trace. I'm sorry if this is a really basic Java question, but I'm a
> complete novice with it.
> One other, even more basic Java question: when we define HADOOP_HOME,
> HBASE_HOME, etc., what are we "telling" the other Java programs that rely
> on that info? Are we saying "here is the executable Java file", or are we
> saying "here are the jar files in the lib folder"? I don't quite understand
> what I should actually be pointing to, because I don't know how that path
> is used.
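To the *_HOME question at the end of the quoted message: by convention these variables point at an install root, not at an executable or at the lib directory itself; launcher scripts derive both the binaries and the jars relative to that root. A hedged sketch of the convention (paths are illustrative, not Sqoop's exact lookup):

```shell
# Illustrative only: *_HOME names the install root; scripts derive the rest.
HBASE_HOME="${HBASE_HOME:-/usr/local/Cellar/hbase/0.98.1}"
echo "binary : $HBASE_HOME/bin/hbase"   # "here is the executable"
echo "jars   : $HBASE_HOME/lib/*.jar"   # "here are the jars in the lib folder"
# Pointing HBASE_HOME at .../libexec/lib breaks both derivations, which is
# consistent with the NoClassDefFoundError in the quoted message.
```

So the answer is "both": the variable names the directory from which the executables *and* the jars are found.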
