From: Alexandre Jaquet <alexjaquet@gmail.com>
To: hbase-user@hadoop.apache.org
Date: Thu, 11 Jun 2009 17:41:15 +0200
Subject: Re: Windows installation

Good to know. I'll read the book first before asking other questions. :)

Thx

2009/6/11 jason hadoop:

> I don't actually use HBase, so I can't give you a direct answer. There is
> a section in my book, in chapter 5, on using Spring to initialize a
> mapper or reducer.
>
> On Thu, Jun 11, 2009 at 8:32 AM, Alexandre Jaquet wrote:
>
>> I was too excited to begin the reading: the password was just the email
>> address I provided.
>>
>> One more question: does HBase provide a ConnectionFactory or
>> SessionFactory that can be integrated with Spring?
>>
>> Thanks
>>
>> 2009/6/11 jason hadoop:
>>
>>> I don't know the password for that; you will need to contact Apress
>>> support.
>>>
>>> On Thu, Jun 11, 2009 at 7:07 AM, Alexandre Jaquet wrote:
>>>
>>>> I got your book just now (but it has password protection; can you
>>>> mail the password to alexjaquet@gmail.com?). One more question, about
>>>> HBase rather than Hadoop: is HBase well suited for very large
>>>> applications such as an auction website or a big community forum?
>>>>
>>>> thx
>>>>
>>>> 2009/6/11 Alexandre Jaquet:
>>>>
>>>>> Thanks, I'm off right now to buy your ebook!
>>>>>
>>>>> 2009/6/11 jason hadoop:
>>>>>
>>>>>> My book has a small section on setting up under Windows.
>>>>>>
>>>>>> The key piece is that you must have a Cygwin installation on the
>>>>>> machine, and include the Cygwin installation's bin directory in your
>>>>>> Windows system PATH environment variable (Control Panel | System |
>>>>>> Advanced | Environment Variables | System variables | Path). There
>>>>>> is constant confusion between the paths on the Windows side (as seen
>>>>>> by the JVM) and the paths seen by the Hadoop scripts through Cygwin.
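[A minimal sketch of the PATH change jason describes, run from a Windows
command prompt. It assumes Cygwin is installed at C:\cygwin, which the
thread does not state; adjust the path to your installation.

    REM Append Cygwin's bin directory to PATH for the current cmd session.
    REM For a permanent, system-wide change, use the Environment Variables
    REM dialog named above instead.
    set PATH=%PATH%;C:\cygwin\bin

    REM Check that the shell the Hadoop scripts need is now resolvable.
    bash --version

This matters because Hadoop shells out from the JVM to Unix tools (bash,
df, du, and so on), so they must be on the Windows PATH, not just visible
inside a Cygwin terminal.]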
>>>>>> On Thu, Jun 11, 2009 at 6:47 AM, Alexandre Jaquet wrote:
>>>>>>
>>>>>>> As I read in the docs, Windows is supported as a development
>>>>>>> platform through the use of Cygwin (but it will be no pain for me
>>>>>>> if I have to switch to Linux! :)
>>>>>>>
>>>>>>> thx
>>>>>>>
>>>>>>> Pre-requisites - Supported Platforms:
>>>>>>>
>>>>>>> - GNU/Linux is supported as a development and production platform.
>>>>>>>   Hadoop has been demonstrated on GNU/Linux clusters with 2000
>>>>>>>   nodes.
>>>>>>> - Win32 is supported as a *development platform*. Distributed
>>>>>>>   operation has not been well tested on Win32, so it is not
>>>>>>>   supported as a *production platform*.
>>>>>>>
>>>>>>> 2009/6/11 Nick Cen:
>>>>>>>
>>>>>>>> As far as I know, Hadoop has not been ported to Windows.
>>>>>>>>
>>>>>>>> 2009/6/11 Alexandre Jaquet:
>>>>>>>>
>>>>>>>>> Hello,
>>>>>>>>>
>>>>>>>>> For my first try I will use Windows as a non-clustered system.
>>>>>>>>>
>>>>>>>>> I've been trying to run it after setting up the JAVA_HOME env
>>>>>>>>> variable, but when I run the following command:
>>>>>>>>>
>>>>>>>>> bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
>>>>>>>>>
>>>>>>>>> I'm getting this:
>>>>>>>>>
>>>>>>>>> $ bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
>>>>>>>>> /cygdrive/c/Documents and Settings/Alexandre Jaquet/Mes documents/hadoop-0.20.0/hadoop-0.20.0/bin/../conf/hadoop-env.sh: line 2: $'\r': command not found
>>>>>>>>> .../conf/hadoop-env.sh: line 7: $'\r': command not found
>>>>>>>>> .../conf/hadoop-env.sh: line 9: export: `Files/Java/jdk1.6.0_12': not a valid identifier
>>>>>>>>> [the same "$'\r': command not found" error repeats for lines 10,
>>>>>>>>> 13, 16, 19, 29, 32, 35, 38, 41, 46, 49, and 52]
>>>>>>>>> bin/hadoop: line 258: C:/Program/bin/java: No such file or directory
>>>>>>>>> bin/hadoop: line 289: C:/Program/bin/java: No such file or directory
>>>>>>>>> bin/hadoop: line 289: exec: C:/Program/bin/java: cannot execute: No such file or directory
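[The $'\r' errors above mean conf/hadoop-env.sh was saved with Windows
CRLF line endings, so bash reads the stray carriage return at the end of
each line as part of a command. A minimal sketch of one fix, run from the
hadoop-0.20.0 directory in a Cygwin shell (d2u is the dos2unix tool from
Cygwin's cygutils package; the sed variant avoids that dependency):

    # Strip carriage returns from the shell scripts Hadoop sources.
    d2u conf/*.sh
    # Or, without cygutils:
    sed -i 's/\r$//' conf/*.sh

Re-saving the file with Unix line endings in an editor works just as
well.]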
>>>>>>>>>
>>>>>>>>> Here is my hadoop-env.sh:
>>>>>>>>>
>>>>>>>>> # Set Hadoop-specific environment variables here.
>>>>>>>>>
>>>>>>>>> # The only required environment variable is JAVA_HOME. All others are
>>>>>>>>> # optional. When running a distributed configuration it is best to
>>>>>>>>> # set JAVA_HOME in this file, so that it is correctly defined on
>>>>>>>>> # remote nodes.
>>>>>>>>>
>>>>>>>>> # The java implementation to use. Required.
>>>>>>>>> export JAVA_HOME=C:/Program Files/Java/jdk1.6.0_12/bin
>>>>>>>>>
>>>>>>>>> # Extra Java CLASSPATH elements. Optional.
>>>>>>>>> # export HADOOP_CLASSPATH=
>>>>>>>>>
>>>>>>>>> # The maximum amount of heap to use, in MB. Default is 1000.
>>>>>>>>> # export HADOOP_HEAPSIZE=2000
>>>>>>>>>
>>>>>>>>> # Extra Java runtime options. Empty by default.
>>>>>>>>> # export HADOOP_OPTS=-server
>>>>>>>>>
>>>>>>>>> # Command specific options appended to HADOOP_OPTS when specified
>>>>>>>>> export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
>>>>>>>>> export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
>>>>>>>>> export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
>>>>>>>>> export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
>>>>>>>>> export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
>>>>>>>>> # export HADOOP_TASKTRACKER_OPTS=
>>>>>>>>> # The following applies to multiple commands (fs, dfs, fsck, distcp etc)
>>>>>>>>> # export HADOOP_CLIENT_OPTS
>>>>>>>>>
>>>>>>>>> # Extra ssh options. Empty by default.
>>>>>>>>> # export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"
>>>>>>>>>
>>>>>>>>> # Where log files are stored. $HADOOP_HOME/logs by default.
>>>>>>>>> # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs
>>>>>>>>>
>>>>>>>>> # File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
>>>>>>>>> # export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves
>>>>>>>>>
>>>>>>>>> # host:path where hadoop code should be rsync'd from. Unset by default.
>>>>>>>>> # export HADOOP_MASTER=master:/home/$USER/src/hadoop
>>>>>>>>>
>>>>>>>>> # Seconds to sleep between slave commands. Unset by default. This
>>>>>>>>> # can be useful in large clusters, where, e.g., slave rsyncs can
>>>>>>>>> # otherwise arrive faster than the master can service them.
>>>>>>>>> # export HADOOP_SLAVE_SLEEP=0.1
>>>>>>>>>
>>>>>>>>> # The directory where pid files are stored. /tmp by default.
>>>>>>>>> # export HADOOP_PID_DIR=/var/hadoop/pids
>>>>>>>>>
>>>>>>>>> # A string representing this instance of hadoop. $USER by default.
>>>>>>>>> # export HADOOP_IDENT_STRING=$USER
>>>>>>>>>
>>>>>>>>> # The scheduling priority for daemon processes. See 'man nice'.
>>>>>>>>> # export HADOOP_NICENESS=10
>>>>>>>>>
>>>>>>>>> Thanks in advance!
>>>>>>>>>
>>>>>>>>> Alexandre Jaquet
>>>>>>>>
>>>>>>>> --
>>>>>>>> http://daily.appspot.com/food/
>
> --
> Pro Hadoop, a book to guide you from beginner to hadoop mastery,
> http://www.apress.com/book/view/9781430219422
> www.prohadoopbook.com a community for Hadoop Professionals
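[The JAVA_HOME line in the hadoop-env.sh quoted above explains the
remaining errors: the unquoted space in "Program Files" makes bash split
the value, which produces both the "`Files/Java/jdk1.6.0_12': not a valid
identifier" message and the truncated C:/Program/bin/java path. JAVA_HOME
should also point at the JDK root, not its bin directory, since the
scripts append bin/java themselves. A minimal sketch of a corrected line,
assuming the JDK location shown in the thread (Progra~1 is typically the
8.3 short name for "Program Files" and sidesteps the space entirely):

    # The java implementation to use. Required.
    export JAVA_HOME="C:/Progra~1/Java/jdk1.6.0_12"

Quoting a path containing the space would fix the parse error in
hadoop-env.sh itself, but the short name is safer because some of the
Hadoop shell scripts may expand $JAVA_HOME unquoted when building the
java command line.]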