sqoop-user mailing list archives

From Andrew Allaway <andrewalla...@outlook.com>
Subject Sqoop, sending for a loop - newby lost - SQL Server/Sqoop
Date Mon, 28 Oct 2013 23:04:52 GMT
Sorry for the bad title:)

Have:
3 nodes
Debian/wheezy
Hadoop 1.2.1
Hive 0.11.0

All's working great:)

Want to connect SQL Server 2012 and SQL Server 2014 CTP to the above


I'm totally lost

Namenode (aka node1): 192.168.10.10
Node2 192.168.10.11
Node3 192.168.10.12

Have Windows 7 (static IPv4 192.168.10.13), connected via Ethernet through a switch. I can
ssh into nodes 1-3 easily.

All's swell.

On Win7 have a full sql server instance "bob", database "test_db", schema "test_schema" &
table "test_table" login "abc" pw "xyz".

On the cluster I've hadoop here:
/usr/local/hadoop

Just untarred Sqoop to /usr/lib/sqoop

Then, when I tried to run $ sqoop help from that dir, it said it didn't know where my Hadoop
was.  So I exported the Hadoop home variable, pointing it at /usr/local....

Then I ran $ sqoop help again and it said it can't find HDFS.  So I did the same kind of
export for HDFS, again pointing at /usr/local....
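For reference, here's roughly what I've ended up putting in my shell. The variable names (HADOOP_COMMON_HOME, HADOOP_MAPRED_HOME, SQOOP_HOME) are my best guess from the Sqoop 1.4 docs, and the paths are just where I installed things, so correct me if these are wrong:

```shell
# My guess at the env vars Sqoop wants; paths are my install locations.
export HADOOP_COMMON_HOME=/usr/local/hadoop
export HADOOP_MAPRED_HOME=/usr/local/hadoop
export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin

# Sanity check that the vars took.
echo "$HADOOP_COMMON_HOME"
echo "$SQOOP_HOME"
```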

Then I ran $ sqoop help and it said it needs HBase????

Does it?  Why does it need HBase to run?

Not sure where to go from here.  I want to install these packages as I learn them. I don't
intend to learn HBase at the moment, so can I "live" w/o it?

Even if Sqoop worked, I still don't understand how to pull the table above (test_table) into
HDFS and then into Hive.
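In case it helps anyone answer: from reading the Sqoop docs, I imagine the eventual import would look roughly like the sketch below. This is purely a guess, not something I've run. It assumes the Windows box is reachable at 192.168.10.13, that the named instance "bob" is listening on the default port 1433, that Microsoft's JDBC driver jar has been copied into $SQOOP_HOME/lib, and that the SQL Server connector accepts a schema as an extra argument after --:

```shell
# Untested sketch - my reading of the Sqoop import docs, not a working command.
sqoop import \
  --connect "jdbc:sqlserver://192.168.10.13:1433;databaseName=test_db" \
  --username abc --password xyz \
  --table test_table \
  --hive-import \
  -m 1 \
  -- --schema test_schema
```

My understanding is that -m 1 avoids needing a primary key to split on, and --hive-import should load the data into a Hive table after the HDFS import. Is that even close?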

Thoughts?

Best,
Andy



