sqoop-user mailing list archives

From John Zhao <jz...@alpinenow.com>
Subject Re: Help Sqoop Import
Date Wed, 12 Mar 2014 16:38:08 GMT
No, you do not need to manually copy the jar files.
This usually happens when the job runs in MR local mode instead of being submitted to YARN. Check your
Hadoop and Sqoop settings to make sure the client is pointing at the correct job tracker.
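For example, on Hadoop 2.x the client-side setting that decides local mode vs. YARN is usually `mapreduce.framework.name` in `mapred-site.xml`. A sketch of the relevant properties (the hostname `oak` is taken from the error message below; your ResourceManager host and port may differ):

```xml
<!-- mapred-site.xml: submit MapReduce jobs to YARN instead of running in local mode -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>

<!-- yarn-site.xml: point clients at the ResourceManager (8032 is the default port) -->
<property>
  <name>yarn.resourcemanager.address</name>
  <value>oak:8032</value>
</property>
```

If `mapreduce.framework.name` is unset or `local`, the job is submitted locally, which can lead to the kind of path confusion shown in the stack trace below.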

John.

On Mar 12, 2014, at 9:05 AM, Kleiton Silva <kleiton.contato@gmail.com> wrote:

> Hello my friends, 
> 
> I have a doubt about Sqoop and I hope you can help me.
> 
> I am trying to import a table with two columns from MySQL. When I try to execute the import
> with the following command:
> 
> start job --jid 2
> 
> I get this error:
> 
> 2014-03-13 12:54:31 PDT: FAILURE_ON_SUBMIT 
> Exception: java.io.FileNotFoundException: File does not exist: hdfs://oak:54310/usr/local/Cellar/hadoop/2.2.0/libexec/share/hadoop/common/lib/guava-11.0.2.jar
> 
> Commands I had to run before hitting this error:
> 
> hdfs dfs -mkdir /usr/lib/sqoop/lib
> hdfs dfs -copyFromLocal /usr/lib/sqoop/lib/*.jar /usr/lib/sqoop/lib
> 
> hdfs dfs -mkdir -p /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib
> hdfs dfs -copyFromLocal /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib/*.jar /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib
> 
> 
> Is it really necessary to copy all the jars to HDFS, or is there a smarter solution?
> 
> Thank you.
> 
> Kleiton Silva
> 

