From: Abraham Elmahrek <abe@cloudera.com>
To: user@sqoop.apache.org
Date: Mon, 28 Oct 2013 16:20:22 -0700
Subject: Re: Sqoop, sending for a loop - newby lost - SQL Server/Sqoop

Andy,

Have you tried installing with Apache Bigtop, or some other packaged installation provider? The HBase client libs are only used for HBase imports. Sqoop ships compiled with HBase support, I think.
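
On Debian the packaged route can be as simple as pointing apt at a Bigtop release repo and installing sqoop from there. A rough sketch (the repo URL and codename here are illustrative placeholders, not verified):

    $ sudo wget -O /etc/apt/sources.list.d/bigtop.list \
          http://archive.apache.org/dist/bigtop/bigtop-0.7.0/repos/wheezy/bigtop.list
    $ sudo apt-get update
    $ sudo apt-get install sqoop

A packaged install should also wire up the Hadoop environment detection for you, which sidesteps the export dance described below.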

-Abe


On Mon, Oct 28, 2013 at 4:04 PM, Andrew Allaway <andrewallaway@outlook.com> wrote:
Sorry for the bad title:)

Have:
3 nodes
Debian/wheezy
Hadoop 1.2.1
Hive 0.11.0

All's working great:)

Want to connect SQL Server 2012 and SQL Server 2014 CTP to the above


I'm totally lost

Namenode (aka node1): 192.168.10.10
Node2 192.168.10.11
Node3 192.168.10.12

Have Windows 7 (static IPv4 192.168.10.13), connected via Ethernet through a switch. I can ssh into nodes 1-3 easily.

All's swell.

On Win7 have a full SQL Server instance "bob", database "test_db", schema "test_schema" & table "test_table", login "abc", pw "xyz".
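
Those details map onto a JDBC URL. As a sketch (assuming the Microsoft SQL Server JDBC driver jar, e.g. sqljdbc4.jar, has been copied into Sqoop's lib directory and the "bob" instance is reachable over TCP), connectivity can be sanity-checked with:

    $ sqoop list-tables \
          --connect "jdbc:sqlserver://192.168.10.13;instanceName=bob;databaseName=test_db" \
          --username abc --password xyz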

On the cluster I have Hadoop here:
/usr/local/hadoop

Just untarred Sqoop to /usr/lib/sqoop

Then when I tried to run $ sqoop help from the above dir, it said it didn't know where my Hadoop was. So I set hadoop_home to /usr/local....
Then ran $ sqoop help and it said it can't find hdfs. So I ran the same: $ export home_hdfs usr/local....
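
A plain Sqoop 1.4.x tarball is probably looking for HADOOP_COMMON_HOME and HADOOP_MAPRED_HOME here; the stock configure-sqoop script reads those, falling back to HADOOP_HOME. A sketch, assuming the stock script and the layout above:

    $ export HADOOP_COMMON_HOME=/usr/local/hadoop   # Hadoop install root
    $ export HADOOP_MAPRED_HOME=/usr/local/hadoop   # same tree on Hadoop 1.x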

Then ran sqoop help and it said it needs Hbase????

Does it? Why does it need Hbase to run?

Not sure how to go from here. I want to install these packages as I learn them. I don't intend to learn Hbase at the moment; can I "live" w/o it?

Even if sqoop worked I still don't understand how to pull the table above (test_table) into hdfs and into Hive??
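
For reference, once sqoop help runs cleanly, a one-shot import of that table into Hive might look roughly like the following. This is a sketch only: it assumes the SQL Server JDBC driver jar is in Sqoop's lib directory, and a Sqoop version whose SQL Server connector accepts the --schema extra argument (needed for schemas other than dbo):

    $ sqoop import \
          --connect "jdbc:sqlserver://192.168.10.13;instanceName=bob;databaseName=test_db" \
          --username abc --password xyz \
          --table test_table \
          --hive-import \
          -m 1 \
          -- --schema test_schema

--hive-import stages the rows in HDFS and then creates and loads the Hive table, and -m 1 keeps the job to a single mapper so no --split-by column is needed.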

Thoughts?

Best,
Andy



