sqoop-dev mailing list archives

From Jarek Jarcec Cecho <jar...@apache.org>
Subject Re: how to modify or use sqoop to write to a different destination
Date Sun, 09 Jun 2013 15:25:17 GMT
Hi Jane,
Sqoop is a client utility that can be installed and executed anywhere, including, for example,
on your own computer. The only requirement is that the machine have the appropriate Hadoop
libraries and configuration files for your cluster available.
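
For illustration, a run from such a client machine might look roughly like the sketch below.
The hostname, database, table and paths are made up for the example; only the standard import
options and the HADOOP_CONF_DIR convention are real:

    # point Sqoop at a local copy of the cluster's Hadoop configuration files
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    # hypothetical connection details; -P prompts for the database password
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/mydb \
      --username dbuser -P \
      --table orders \
      --target-dir /user/jane/orders \
      --num-mappers 4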

Jarcec

On Wed, Jun 05, 2013 at 07:01:45PM -0400, Jane Wayne wrote:
> thanks. importing into HDFS is fine for now. but i have another question
> now.
> 
> let's say i have 3 servers.
> 
> W: web server
> H: hadoop server
> D: database server
> 
> what i want to do is use sqoop on W to import data from D to H.
> unfortunately, H is locked down (no new software besides hadoop may be
> installed on it for now). is this scenario possible with sqoop? from
> reading the documentation, it seems sqoop has to be installed on H and run
> from H, but modifications to H are restricted.
> 
> please note that i am experimenting with sqoop 1.4.3.
> 
> 
> 
> On Wed, Jun 5, 2013 at 5:30 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> 
> > Hi Jane,
> > Sqoop currently supports import into HDFS, Hive and HBase. One possible
> > workaround to get data into a different system would be to import the data
> > into HDFS first and then export it from there to the other destination.
> >
> > Jarcec
> >
> > On Wed, Jun 05, 2013 at 04:28:49PM -0400, Jane Wayne wrote:
> > > hi,
> > >
> > > as i understand, when sqoop imports from a rdbms, it can import directly
> > > into hdfs or hive. however, i would like to import into a different
> > > destination (perhaps a different NoSQL store). how can i do this? is
> > there
> > > a "hook" somewhere in the API?
> > >
> > > thanks,
> >
