whirr-user mailing list archives

From Deb Ghosh <dgcloud...@gmail.com>
Subject Re: Whirr installation issue
Date Fri, 20 Apr 2012 00:00:13 GMT
Hi Ashish,
Now it's another problem, with the OS. My laptop has both Windows 7 and Ubuntu
(not on separate partitions); I had been using this setup for the last 3 months.
I downloaded Ubuntu 11.10 from the Ubuntu site, and during booting I get the
option of Ubuntu or Windows.

Today I tried the same thing, pressing Enter to select Ubuntu as usual, but now
I get the GRUB prompt, so I am not able to get into Ubuntu...

Is there a command that will boot into Ubuntu? I have no experience in this
area.
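
A minimal sketch of booting Ubuntu by hand from the GRUB prompt (Ubuntu 11.10
uses GRUB 2). The partition (hd0,1) and the root device /dev/sda1 below are
assumptions; run ls at the prompt first to see which partition actually holds
Ubuntu, and tab-completion works on the file paths:

grub> ls
grub> set root=(hd0,1)
grub> linux /vmlinuz root=/dev/sda1
grub> initrd /initrd.img
grub> boot

Once booted, running sudo update-grub (and, if needed, sudo grub-install
/dev/sda) should restore the normal boot menu.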

Thanks
Debashis

On Wed, Apr 18, 2012 at 5:28 PM, Ashish <paliwalashish@gmail.com> wrote:

> Download from a nearby mirror http://www.apache.org/dyn/closer.cgi/whirr/
>
> untar it, and you should be good to go.
>
> I would recommend the following steps (a sketch follows the list):
> 1. Use an existing recipe for launching a cluster
> 2. Customize the existing recipe according to your needs
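>
> A minimal sketch of those steps, assuming Whirr 0.7.1 and one of the stock
> recipes (the mirror host and recipe file name are illustrative; pick a real
> mirror from the closer.cgi page and check the recipes/ directory for the
> exact file names in your release):
>
> # download and unpack
> wget http://<mirror>/apache/whirr/whirr-0.7.1/whirr-0.7.1.tar.gz
> tar xzf whirr-0.7.1.tar.gz && cd whirr-0.7.1
>
> # credentials the stock recipes read from the environment (placeholders)
> export AWS_ACCESS_KEY_ID=...
> export AWS_SECRET_ACCESS_KEY=...
>
> # launch from an existing recipe, tear down when finished
> bin/whirr launch-cluster --config recipes/hadoop.properties
> bin/whirr destroy-cluster --config recipes/hadoop.properties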
>
> HTH
> ashish
>
> On Thu, Apr 19, 2012 at 3:45 AM, Deb Ghosh <dgcloudera@gmail.com> wrote:
> > Hi Ashish,
> >
> > Just wanted to clarify a little more on this:
> >
> > So I will install Whirr 0.7.1 on my local Ubuntu machine. In that case,
> > from which site should I download it for Ubuntu 11.10?
> > Any clue is appreciated.
> >
> > Thanks in advance
> > Debashis
> >
> > On Tue, Apr 17, 2012 at 5:41 PM, Ashish <paliwalashish@gmail.com> wrote:
> >>
> >> I hope you are using Whirr 0.7.1.
> >> Whirr installs a JDK for you; with 0.7.1 the default is OpenJDK.
> >>
> >> To avoid this JDK issue, I used a custom AMI that has the JDK
> >> pre-installed, and in the Whirr recipe you can specify the JAVA_HOME
> >> variable to be used. (A sketch follows below.)
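> >>
> >> A minimal recipe sketch of that approach. whirr.cluster-name,
> >> whirr.instance-templates, whirr.provider, whirr.image-id and
> >> whirr.hardware-id are real Whirr properties; the AMI id is a placeholder,
> >> and the JAVA_HOME line is hypothetical - the exact key for pointing Whirr
> >> at a pre-installed Java varies by version, so check the docs for your
> >> release:
> >>
> >> whirr.cluster-name=myhadoopcluster
> >> whirr.instance-templates=1 hadoop-namenode+hadoop-jobtracker,1 hadoop-datanode+hadoop-tasktracker
> >> whirr.provider=aws-ec2
> >> # custom AMI with the JDK baked in (region/id are placeholders)
> >> whirr.image-id=us-east-1/ami-xxxxxxxx
> >> whirr.hardware-id=m1.large
> >> # hypothetical: where the pre-installed JDK lives on the image
> >> whirr.env.JAVA_HOME=/usr/lib/jvm/java-6-sun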
> >>
> >>
> >> Andrei/Karel - thoughts on this?
> >>
> >> cheers
> >> ashish
> >>
> >> On Wed, Apr 18, 2012 at 2:09 AM, Deb Ghosh <dgcloudera@gmail.com> wrote:
> >> > Hi Ashish,
> >> >
> >> > First of all, many thanks for your time and response. Still not able to
> >> > get it working; however, I have SSHed to the remote machine and found the
> >> > following... it seems Java is not installed on the remote!
> >> >
> >> > On the remote machine I have:
> >> > ubuntu@ip-10-140-10-242:/tmp/runscript$ ls -l
> >> > total 28
> >> > -rwxr--r-- 1 ubuntu ubuntu   764 2012-04-17 20:10 runscript.sh
> >> > -rw-r--r-- 1 root   root    4111 2012-04-17 20:10 stderr.log
> >> > -rw-r--r-- 1 root   root   12683 2012-04-17 20:10 stdout.log
> >> > ======
> >> >
> >> > 1) At the tail of stderr.log I have the following:
> >> >
> >> > ++ echo nn,jt
> >> > ++ tr , '\n'
> >> > + for role in '$(echo "$ROLES" | tr "," "\n")'
> >> > + case $role in
> >> > + start_namenode
> >> > + which dpkg
> >> > + apt-get -y install hadoop-0.20-namenode
> >> > dpkg-preconfigure: unable to re-open stdin:
> >> > update-rc.d: warning: hadoop-0.20-namenode start runlevel arguments
> >> > (2 3 4 5) do not match LSB Default-Start values (3 5)
> >> > update-rc.d: warning: hadoop-0.20-namenode stop runlevel arguments
> >> > (0 1 6) do not match LSB Default-Stop values (0 1 2 4 6)
> >> > + AS_HDFS='su -s /bin/bash - hdfs -c'
> >> > + '[' '!' -e /mnt/hadoop/hdfs ']'
> >> > + su -s /bin/bash - hdfs -c 'hadoop-0.20 namenode -format'
> >> >
> >> > +======================================================================+
> >> > |      Error: JAVA_HOME is not set and Java could not be found         |
> >> > +----------------------------------------------------------------------+
> >> > | Please download the latest Sun JDK from the Sun Java web site        |
> >> > |       > http://java.sun.com/javase/downloads/ <                      |
> >> > |                                                                      |
> >> > | Hadoop requires Java 1.6 or later.                                   |
> >> > | NOTE: This script will find Sun Java whether you install using the   |
> >> > |       binary or the RPM based installer.                             |
> >> > +======================================================================+
> >> >
> >> >
> >> > 2) And at the tail of stdout.log
> >> > (ubuntu@ip-10-140-10-242:/tmp/runscript$ vi stdout.log) I have:
> >> >
> >> >
> >> > Setting up hadoop-0.20 (0.20.2+923.197-1~lucid-cdh3) ...
> >> > update-alternatives: using /etc/hadoop-0.20/conf.empty to provide
> >> > /etc/hadoop-0.20/conf (hadoop-0.20-conf) in auto mode.
> >> > update-alternatives: using /usr/bin/hadoop-0.20 to provide
> >> > /usr/bin/hadoop
> >> > (hadoop-default) in auto mode.
> >> >
> >> > Setting up hadoop-0.20-native (0.20.2+923.197-1~lucid-cdh3) ...
> >> >
> >> > Processing triggers for libc-bin ...
> >> > ldconfig deferred processing now taking place
> >> > update-alternatives: using /etc/hadoop-0.20/conf.dist to provide
> >> > /etc/hadoop-0.20/conf (hadoop-0.20-conf) in auto mode.
> >> > Reading package lists...
> >> > Building dependency tree...
> >> > Reading state information...
> >> > The following NEW packages will be installed:
> >> >   hadoop-0.20-namenode
> >> > 0 upgraded, 1 newly installed, 0 to remove and 122 not upgraded.
> >> > Need to get 248kB of archives.
> >> > After this operation, 324kB of additional disk space will be used.
> >> > Get:1 http://archive.cloudera.com/debian/ lucid-cdh3/contrib
> >> > hadoop-0.20-namenode 0.20.2+923.197-1~lucid-cdh3 [248kB]
> >> > Fetched 248kB in 0s (14.8MB/s)
> >> > Selecting previously deselected package hadoop-0.20-namenode.
> >> > (Reading database ... 24766 files and directories currently installed.)
> >> > Unpacking hadoop-0.20-namenode (from
> >> > .../hadoop-0.20-namenode_0.20.2+923.197-1~lucid-cdh3_all.deb) ...
> >> >
> >> > =========================
> >> >
> >> > So do I need to install Java on the remote machine? A bit confused here.
> >> >
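> >> > In case it helps: a minimal sketch of fixing this by hand on the remote
> >> > node, assuming the instance runs Ubuntu/Debian (the package name and JVM
> >> > path vary by release - check with ls /usr/lib/jvm/):
> >> >
> >> > sudo apt-get update
> >> > sudo apt-get -y install openjdk-6-jdk
> >> > # make the Hadoop scripts see it (path is an assumption)
> >> > echo 'export JAVA_HOME=/usr/lib/jvm/java-6-openjdk' | sudo tee -a /etc/hadoop-0.20/conf/hadoop-env.sh
> >> > sudo service hadoop-0.20-namenode restart
> >> >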
> >> > Your time is appreciated.
> >> >
> >> > Thanks
> >> > Debashis
> >> >
> >> > On Sat, Apr 14, 2012 at 10:20 PM, Ashish <paliwalashish@gmail.com>
> >> > wrote:
> >> >>
> >> >> Debashish,
> >> >>
> >> >> Can you check in the logs that the services got started? You can do this
> >> >> by scanning whirr.log or by logging into the machines and verifying it.
> >> >> If something is missing, you can check /tmp/log on the EC2 machine to get
> >> >> an insight into what went wrong.
> >> >>
> >> >> I would do the following
> >> >> 1. Verify everything went fine in whirr.log
> >> >> 2. Log into the machine and check all services are running (using ps or
> >> >> jps - see the sketch after this list)
> >> >> 3. If all is working fine, then you should get the UI.
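> >> >>
> >> >> A minimal sketch of step 2, with assumptions: the login user (ubuntu),
> >> >> the key (whatever whirr.private-key-file points at, ~/.ssh/id_rsa by
> >> >> default) and the hostname (taken from your earlier output):
> >> >>
> >> >> # from your laptop
> >> >> ssh -i ~/.ssh/id_rsa ubuntu@ec2-23-20-228-116.compute-1.amazonaws.com
> >> >>
> >> >> # on the node: JVM-level view - expect NameNode/JobTracker here,
> >> >> # DataNode/TaskTracker on the worker
> >> >> jps
> >> >>
> >> >> # OS-level view, in case jps is not on the PATH
> >> >> ps aux | grep [h]adoop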
> >> >>
> >> >> If nothing works out, I would use an existing recipe from
> >> >> whirr_install/recipes and try that. It should work, as it is battle-tested.
> >> >>
> >> >> You can also join #whirr on IRC; a lot of folks hang out there and
> >> >> might help you in real time :)
> >> >>
> >> >> HTH !
> >> >> ashish
> >> >>
> >> >> On Fri, Apr 13, 2012 at 2:43 AM, Deb Ghosh <dgcloudera@gmail.com>
> >> >> wrote:
> >> >> > Hi,
> >> >> > The problem is that I am on Ubuntu 11.10 as my OS. After launching the
> >> >> > Hadoop EC2 cluster and running the proxy setup script as shown below
> >> >> > after the ====, I am trying to open, in the Firefox browser, node pages
> >> >> > such as http://ec2-23-20-228-116.compute-1.amazonaws.com:50070/ (the web
> >> >> > UI for the HDFS name node), but it does not connect.
> >> >> >
> >> >> > But when I use a Hadoop single-node cluster (the Yahoo Hadoop setup) on
> >> >> > the same OS and use the local host web UI, Firefox does reach the web UI
> >> >> > at http://localhost:50070/ (the web UI for the HDFS name node).
> >> >> >
> >> >> > =============
> >> >> >
> >> >> > Completed launch of myhadoopcluster
> >> >> > Web UI available at http://ec2-23-20-228-116.compute-1.amazonaws.com
> >> >> > Wrote Hadoop site file /home/debashig/.whirr/myhadoopcluster/hadoop-site.xml
> >> >> > Wrote Hadoop proxy script /home/debashig/.whirr/myhadoopcluster/hadoop-proxy.sh
> >> >> > Started cluster of 2 instances
> >> >> > HadoopCluster{instances=[Instance{roles=[jt, nn],
> >> >> > publicAddress=ec2-23-20-228-116.compute-1.amazonaws.com/23.20.228.116,
> >> >> > privateAddress=/10.64.74.23}, Instance{roles=[tt, dn],
> >> >> > publicAddress=/50.17.54.86, privateAddress=/10.204.74.58}],
> >> >> > configuration={fs.default.name=hdfs://ec2-23-20-228-116.compute-1.amazonaws.com:8020/,
> >> >> > mapred.job.tracker=ec2-23-20-228-116.compute-1.amazonaws.com:8021,
> >> >> > hadoop.job.ugi=root,root,
> >> >> > hadoop.rpc.socket.factory.class.default=org.apache.hadoop.net.SocksSocketFactory,
> >> >> > hadoop.socks.server=localhost:6666}}
> >> >> > debashig@ubuntu:~/amazon/Ec2_basic_setup/cloudera/whirr-0.1.0+23$ sh ~/.whirr/myhadoopcluster/hadoop-proxy.sh
> >> >> > Running proxy to Hadoop cluster at ec2-23-20-228-116.compute-1.amazonaws.com. Use Ctrl-c to quit.
> >> >> > Warning: Permanently added 'ec2-23-20-228-116.compute-1.amazonaws.com,23.20.228.116' (RSA) to the list of known hosts.
> >> >> > =====================================
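> >> >> >
> >> >> > With the proxy script running, one quick check of the tunnel from
> >> >> > another terminal before involving the browser (a sketch; assumes curl
> >> >> > is installed, hostname as in the output above):
> >> >> >
> >> >> > # the cluster config sets hadoop.socks.server=localhost:6666, so route a
> >> >> > # request through that SOCKS proxy; expect the HDFS UI's HTML back
> >> >> > curl --socks5 localhost:6666 http://ec2-23-20-228-116.compute-1.amazonaws.com:50070/dfshealth.jsp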
> >> >> >
> >> >> > Please provide your thoughts or a solution, if any.
> >> >> >
> >> >> > Thanks
> >> >> > Debashis
> >> >> >
> >> >> >
> >> >> > On Tue, Apr 10, 2012 at 8:19 PM, Ashish <paliwalashish@gmail.com>
> >> >> > wrote:
> >> >> >>
> >> >> >> The way I do this is to pick up the Name Node and Job Tracker URLs from
> >> >> >> the Whirr logs or console and punch them into the browser without any
> >> >> >> change, and it works.
> >> >> >>
> >> >> >> I hope this is what you are trying to achieve.
> >> >> >>
> >> >> >>
> >> >> >> On Wed, Apr 11, 2012 at 8:35 AM, Deb Ghosh <dgcloudera@gmail.com>
> >> >> >> wrote:
> >> >> >> >
> >> >> >> > Hello,
> >> >> >> >
> >> >> >> > Would appreciate help with the following issue:
> >> >> >> >
> >> >> >> > I was installing Whirr on an Amazon EC2 cluster; the launch of Whirr
> >> >> >> > using my Ubuntu 11.10 machine was OK.
> >> >> >> >
> >> >> >> > Then we did the following to run the proxy server:
> >> >> >> >
> >> >> >> > sh ~/.whirr/myhadoopcluster/hadoop-proxy.sh
> >> >> >> >
> >> >> >> >    Running proxy to Hadoop cluster at
> >> >> >> >    ec2-72-44-45-199.compute-1.amazonaws.com.
> >> >> >> >    Use Ctrl-c to quit.
> >> >> >> >
> >> >> >> >
> >> >> >> > The hadoop-proxy.sh is used to access the web interface of Hadoop
> >> >> >> > securely. When we run it, it tunnels through to the cluster and gives
> >> >> >> > us access in the web browser via a SOCKS proxy. For this we changed
> >> >> >> > the internet options for the proxy, and after that, when we tried to
> >> >> >> > access the following JSP page:
> >> >> >> >
> >> >> >> > http://<hostname>:50070/dfshealth.jsp
> >> >> >> >
> >> >> >> > it was not working (here the hostname was the real DNS address on
> >> >> >> > Amazon EC2).
> >> >> >> >
> >> >> >> > It waits for some time and then fails to connect to the Amazon
> >> >> >> > server.
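> >> >> >> >
> >> >> >> > For the browser-side step, a minimal proxy auto-config (PAC) sketch of
> >> >> >> > the usual setup: send only the cluster hosts through the SOCKS tunnel
> >> >> >> > on localhost:6666 and let everything else go direct (the domain
> >> >> >> > patterns are assumptions - adjust them to your region):
> >> >> >> >
> >> >> >> > function FindProxyForURL(url, host) {
> >> >> >> >   // EC2 public and internal names go through the hadoop-proxy.sh tunnel
> >> >> >> >   if (shExpMatch(host, "*.compute-1.amazonaws.com") ||
> >> >> >> >       shExpMatch(host, "*.ec2.internal")) {
> >> >> >> >     return "SOCKS5 localhost:6666";
> >> >> >> >   }
> >> >> >> >   return "DIRECT"; // normal browsing stays off the proxy
> >> >> >> > }
> >> >> >> >
> >> >> >> > In Firefox this goes under Preferences > Advanced > Network > Settings,
> >> >> >> > either as an automatic proxy configuration URL pointing at the PAC file
> >> >> >> > or as a manual SOCKS host (localhost, port 6666).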
> >> >> >> >
> >> >> >> >
> >> >> >> > Please let me know how I can resolve this ASAP.
> >> >> >> >
> >> >> >> >
> >> >> >> > Thanks in advance
> >> >> >> > Debashis  ( mobile # 5103662639)
> >> >> >> >
> >> >> >> >
> >> >> >> >
> >> >> >>
> >> >> >>
> >> >> >>
> >> >> >> --
> >> >> >> thanks
> >> >> >> ashish
> >> >> >>
> >> >> >> Blog: http://www.ashishpaliwal.com/blog
> >> >> >> My Photo Galleries: http://www.pbase.com/ashishpaliwal
> >> >> >
> >> >> >
> >> >>
> >> >>
> >> >>
> >> >> --
> >> >> thanks
> >> >> ashish
> >> >>
> >> >> Blog: http://www.ashishpaliwal.com/blog
> >> >> My Photo Galleries: http://www.pbase.com/ashishpaliwal
> >> >
> >> >
> >>
> >>
> >>
> >> --
> >> thanks
> >> ashish
> >>
> >> Blog: http://www.ashishpaliwal.com/blog
> >> My Photo Galleries: http://www.pbase.com/ashishpaliwal
> >
> >
>
>
>
> --
> thanks
> ashish
>
> Blog: http://www.ashishpaliwal.com/blog
> My Photo Galleries: http://www.pbase.com/ashishpaliwal
>
