Hey man,

Hadoop is required for sqoop to run.

-Abe

On Thu, Jan 22, 2015 at 12:18 PM, Narasimha Tadepalli <Narasimha.Tadepalli@actian.com> wrote:

Hi Abraham


Thanks for the response. I know that Sqoop1 can write to the local FS. But my main question is: can I run the Sqoop1 client at all without having Hadoop installed on the system?


Thanks

Narasimha


From: Abraham Elmahrek [mailto:abe@cloudera.com]
Sent: Wednesday, January 21, 2015 6:36 PM
To: user@sqoop.apache.org
Subject: Re: Can I run only sqoop1 client without prerequisite of Hadoop?


Hey there,


I'm assuming you'd like to use Sqoop to transfer data to a local file so that you can transport it out of your closed environment? If so, I'd check out "local" FS support: https://sqoop.apache.org/docs/1.4.5/SqoopUserGuide.html#_using_generic_and_specific_arguments. Essentially, you can write to the local file system that way.

AFAIK, Sqoop1 doesn't support local FS => HDFS data transfers. In Sqoop2, such a general data-transfer use case is being worked on. You can use an HDFS "put" in the meantime, I'd imagine.
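For reference, a minimal sketch of such an import using the generic Hadoop arguments (the connection URL, credentials, table, and target directory below are placeholders, not from your setup):

```shell
# Sketch: write the import output to the local filesystem instead of HDFS.
# Generic options like -fs must come before the tool-specific arguments.
sqoop import \
  -fs local \
  -jt local \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table EMPLOYEES \
  --as-avrodatafile \
  --target-dir /tmp/employees_avro
```

With `-fs local` the files land under `/tmp/employees_avro` on the local disk, where you can pick them up for transport.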
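Something along these lines, once the Avro files have been moved to a machine that can reach the cluster (paths are placeholders):

```shell
# Copy the transported Avro files into HDFS with a plain "put".
hdfs dfs -mkdir -p /user/narasimha/employees_avro
hdfs dfs -put /tmp/employees_avro/part-m-*.avro /user/narasimha/employees_avro/
```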


-Abe


On Wed, Jan 21, 2015 at 4:20 PM, Narasimha Tadepalli <Narasimha.Tadepalli@actian.com> wrote:

We have a very complex distributed environment where our Hadoop cluster and Oracle database are on two different private network infrastructures. We want to use Sqoop1 to import the Oracle database as Avro, then transport that data close to the HDFS environment, and then export it with Sqoop1 into the Hadoop systems. Is there a way I can run Sqoop1 without the Hadoop prerequisite on the system? Maybe by just adding a few dependency jars to the path?


Thanks

Narasimha