sqoop-user mailing list archives

From Andrew Allaway <andrewalla...@outlook.com>
Subject RE: Sqoop, sending for a loop - newby lost - SQL Server/Sqoop
Date Thu, 31 Oct 2013 03:31:31 GMT
Hi Jarcec, et al -

Yes, that was rather vague of me.  Apologies, and I appreciate your question and help :)

I have some updates, but I'm still getting thrown for a loop by Hadoop-Sqoop :)  /* I can't resist:
0>    <-- not a bad ice-cream cone   :)  */

* I was able to uninstall the sqoop 1.4.4 package and re-install the version labeled sqoop-bin.1.4.4.
That worked magically!  Woohoo, I can now run sqoop!  What is the difference between the stable
release with bin vs. without bin?

* Now the downside: I can't, for the life of me, get sqoop to connect to my SQL Server.  I have
the SQL Server JDBC jar in place on the namenode at ...sqoop/lib/sqljdbc4.jar

Questions:
1) When I submit the below from .../sqoop/, I get nowhere :)  Advice?

bin/sqoop list-databases --connect 'jdbc:sqlserver://Andy-ABC-1-HP\BOB:1433;databaseName=andy_dev;user=Andy-ABC-1;password=***;'

What I think is the relevant part of the error:
...

13/10/30 06:20:29 ERROR manager.CatalogQueryManager: Failed to list databases
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host Andy-ABC-1-HP,
port 1433 has failed. Error: "null. Verify the connection properties. Make sure that an instance
of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure
that TCP connections to the port are not blocked by a firewall.".
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:190)
        at com.microsoft.sqlserver.jdbc.SQLServerException.ConvertConnectExceptionToSQLServerException(SQLServerException.java:241)
..
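In case it helps anyone diagnosing: as I understand the Microsoft JDBC driver, that "connection
refused / null" pattern means nothing answered on the port, and a named instance like BOB is
often not on 1433 at all.  Two connect-string variants I plan to try (a sketch only; the
instanceName property is how I understand the driver addresses a named instance, and the host
and credentials are just my setup from below):

```shell
# Variant 1: address the named instance by name and let SQL Server
# Browser (UDP 1434) resolve it to whatever port BOB really listens on.
bin/sqoop list-databases \
  --connect 'jdbc:sqlserver://192.168.10.13;instanceName=BOB;databaseName=andy_dev' \
  --username 'Andy-ABC-1' --password '***'

# Variant 2: skip the instance name and go straight to a fixed port
# (this only works if BOB is actually pinned to TCP 1433).
bin/sqoop list-databases \
  --connect 'jdbc:sqlserver://192.168.10.13:1433;databaseName=andy_dev' \
  --username 'Andy-ABC-1' --password '***'
```

(Command sketch only; it obviously needs a live SQL Server on that host to return anything.)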

What I've got:

a)
My HP laptop, set to a static IPv4 address: 192.168.10.13
SQL Server running on that laptop:
Host name: Andy-ABC-1-HP
Instance: BOB
Port: 1433 (per Start > SQL Server Configuration Manager > SQL Native Client > Client Protocols > TCP/IP > Properties)
DB: andy_dev
Schema: dbo
Login: Andy-ABC-1
Pw:

b) The namenode (192.168.10.10, Debian) can ping 192.168.10.13, and 192.168.10.13 can ssh into
the namenode
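One quick check that narrows this down (a sketch; nc is in Debian's netcat package, and the IP
and ports are from my setup above):

```shell
# From the namenode: is anything on the laptop listening on TCP 1433?
nc -zv 192.168.10.13 1433

# Named instances like BOB often sit on a *dynamic* port rather than 1433;
# SQL Server Browser answers on UDP 1434 to map instance name -> port,
# so that service needs to be reachable too if the port isn't fixed.
nc -zvu 192.168.10.13 1434
```

(Network-dependent sketch; if the TCP check fails, it's a listener/firewall problem rather than
a sqoop one.)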

2) When I launch sqoop, it keeps saying "Error: /usr/lib/hadoop does not exist! Please set
$HADOOP_COMMON_HOME to the root of your Hadoop installation." and "Please set $HADOOP_MAPRED_HOME
to the root of your Hadoop MapReduce installation."  I then run export HADOOP_COMMON_HOME=/usr/local/hadoop
and export HADOOP_MAPRED_HOME=/usr/local/hadoop. That works, but after a reboot it's back??? How
do I set them permanently?
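To make them survive a reboot, appending the exports to a shell startup file should do it (a
sketch, assuming bash and my paths; ~/.bashrc covers one user, a file under /etc/profile.d/
would cover all users):

```shell
# Append the Hadoop variables to ~/.bashrc so every new shell gets them.
cat >> ~/.bashrc <<'EOF'
export HADOOP_COMMON_HOME=/usr/local/hadoop
export HADOOP_MAPRED_HOME=/usr/local/hadoop
EOF

# Pick them up in the current shell without rebooting.
source ~/.bashrc
```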

The firewall was off when all of the above was run...

Any advice appreciated!

Thanks!
Andy

> Date: Wed, 30 Oct 2013 14:28:52 -0700
> From: jarcec@apache.org
> To: user@sqoop.apache.org
> Subject: Re: Sqoop, sending for a loop - newby lost - SQL Server/Sqoop
> 
> Hi Andrew,
> would you mind sharing with us the exact commands and exact exceptions that you are seeing?
> It will help us understand your issue better.
> 
> Jarcec
> 
> On Mon, Oct 28, 2013 at 07:29:47PM -0500, Andrew Allaway wrote:
> > Hey Abe - 
> > 
> > I haven't, because I'm working on running a really lean install on an ARM cluster.
> > 
> > I want a bare-bones build (Hadoop, Hive & Sqoop).  I'm starting to feel that although
> > these packages are open source (Hadoop, Hive, Sqoop, etc.), figuring out how to build a
> > solution from the bottom up, without a full package (Bigtop, Cloudera, MapR, Horton, et al.),
> > is quite hard, because I don't know what dependencies are required.
> > 
> > The docs for Sqoop 1.4.4 don't mention HBase being required. What am I missing?
> > 
> > Anyone tried running a barebones setup and know what I'm missing?
> > 
> > Thanks, Abe, for the tip. I have your distro on a VM and it has all the bells and whistles;
> > I was just hoping to get a three-node solution running well with the bare bones. I just can't
> > figure out what base packages are needed to tie SQL Server 2012/14 and a cluster together
> > with only Hadoop, Hive and Sqoop.
> > 
> > Also, I forgot to mention: when I run sqoop help, it also says something about a missing
> > Java class.  I have OpenJDK running, pointing to ARM (i.e. export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-armhf)
> > 
> > Thanks to all in advance for your suggestions!
> > 
> > 
> > Andy
> > 
> > 
> > 
> > 
> > > On Oct 28, 2013, at 6:20 PM, "Abraham Elmahrek" <abe@cloudera.com> wrote:
> > > 
> > > Andy,
> > > 
> > > Have you tried installing using Apache Bigtop, or some other packaged installation
> > > provider? HBase client libs are used for HBase import. Sqoop is compiled with HBase
> > > support, I think.
> > > 
> > > -Abe
> > > 
> > > 
> > >> On Mon, Oct 28, 2013 at 4:04 PM, Andrew Allaway <andrewallaway@outlook.com> wrote:
> > >> Sorry for the bad title:)
> > >> 
> > >> Have:
> > >> 3 nodes
> > >> Debian/wheezy
> > >> Hadoop 1.2.1
> > >> Hive 0.11.0
> > >> 
> > >> All's working great:)
> > >> 
> > >> Want to connect SQL Server 2012 and SQL Serv. 2014 CTP to the above
> > >> 
> > >> 
> > >> I'm totally lost
> > >> 
> > >> Namenode (aka node1): 192.168.10.10
> > >> Node2 192.168.10.11
> > >> Node3 192.168.10.12
> > >> 
> > >> Have Windows 7 (static IPv4 192.168.10.13), connected via ethernet through a
> > >> switch. I can ssh into nodes 1-3 easily.
> > >> 
> > >> All's swell.
> > >> 
> > >> On Win7 I have a full SQL Server instance "bob", database "test_db", schema
> > >> "test_schema", table "test_table", login "abc", pw "xyz".
> > >> 
> > >> On the cluster I've hadoop here:
> > >> /usr/local/hadoop
> > >> 
> > >> Just untarred Sqoop to /usr/lib/sqoop
> > >> 
> > >> Then, when I tried to run $ sqoop help from the above dir, it said it didn't
> > >> know where my hadoop was.  So I exported hadoop_home=/usr/local....
> > >> 
> > >> Then I ran $ sqoop help and it said it can't find hdfs.  So I ran the same:
> > >> $ export home_hdfs usr/local....
> > >> 
> > >> Then ran sqoop help and it said it needs Hbase????
> > >> 
> > >> Does it?  Why does it need Hbase to run?
> > >> 
> > >> Not sure where to go from here.  I want to install these packages as I learn
> > >> them. I don't intend to learn HBase at the moment; can I "live" without it?
> > >> 
> > >> Even if sqoop worked, I still don't understand how to pull the table above
> > >> (test_table) into HDFS and into Hive??
> > >> 
> > >> Thoughts?
> > >> 
> > >> Best,
> > >> Andy
> > > 
 		 	   		  