sqoop-user mailing list archives

From: Jarek Jarcec Cecho <jar...@apache.org>
Subject: Re: Sqoop 1.4.x incremental job with hdfs error
Date: Tue, 02 Jul 2013 14:45:57 GMT
Hi sir,
Sqoop requires the Hadoop configuration files to be available on the machine where you run
Sqoop. I'm wondering if the config files from machine "C" (the HDFS gateway, I suppose) are
also available on machine "A", where Sqoop is running.
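
If they are not, that would explain the "expected: file:///" part of the error: without the
cluster config, the client falls back to the local filesystem as its default FS. A minimal
sketch of what machine "A" would need, assuming the namenode address from your command, a
Hadoop 1.x-style core-site.xml (newer Hadoop calls the property fs.defaultFS), and
/etc/hadoop/conf as a hypothetical location for the copied files:

    # on machine A: point the Hadoop client (and Sqoop, which launches through it) at the config
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    <!-- /etc/hadoop/conf/core-site.xml, copied from machine C -->
    <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://my.hdfs.com:54310</value>
      </property>
    </configuration>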

Jarcec

On Tue, Jul 02, 2013 at 07:02:41PM +0900, corbacho anthony wrote:
> Hi!
> 
> I am trying to create a sqoop job with incremental option.
> I want to save it into my hdfs, so I use the option --target-dir,
> but Sqoop throws an error: tool.ImportTool: Imported Failed: Wrong FS:
> hdfs://my.hdfs.com:54310/job_import_incrt, expected: file:///
> 
> My sqoop job:
> sqoop job --verbose --create job_import_0 -- import \
>   --connect jdbc:mysql://db.mysql.com:3306/DB --table TABLE_TEST \
>   --target-dir hdfs://my.hdfs.com:54310/db_import \
>   --username xxx --password xxx \
>   --incremental append --check-column id --last-value 1
> 
> I run Sqoop on machine A, the sqoop-metastore is on machine B, and my
> HDFS is on machine C.
> 
> What should I do to "force Sqoop" to save it into my HDFS and not on my
> local machine?
> 
> PS: If I change --target-dir to a local directory, it works like a charm.
> 
> Thank you
> Anthony
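
A quick way to check this on machine "A" (a sketch, assuming the hadoop client is installed
there; the namenode URI is the one from your command):

    # If core-site.xml is missing or points to file:///, this lists the local disk,
    # which is exactly the situation the "expected: file:///" error describes:
    hadoop fs -ls /

    # A fully qualified URI should work either way, confirming the cluster is reachable:
    hadoop fs -ls hdfs://my.hdfs.com:54310/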
