hbase-user mailing list archives

From praba karan <prabas...@gmail.com>
Subject Re: Preparing Hbase for the Bulk Load
Date Mon, 28 Feb 2011 06:30:25 GMT
Stack, sorry. I was sidelined by other work.

It's sixteen columns. I have a pseudo-distributed setup and I was using it to
develop a model to implement on the bigger clusters. Yes, I am not using the
bulk loader; I am using a MapReduce program to do the bulk load. I used the
following code from the link below

http://wiki.apache.org/hadoop/Hbase/MapReduce
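
A rough sketch of the kind of job I adapted is below. This is only an outline
in the style of the SampleUploader example against the 0.90-era client API,
not my exact code; the table name "Sample", the family "cf", and the
tab-separated input layout are placeholders:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

public class SampleUploader {

  // Map-only job: each input line becomes one Put written straight to HBase.
  static class Uploader
      extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {

    @Override
    public void map(LongWritable key, Text line, Context context)
        throws IOException, InterruptedException {
      // Tab-separated line: first field is the UUID row key, the rest are
      // the sixteen column values.
      String[] fields = line.toString().split("\t");
      byte[] row = Bytes.toBytes(fields[0]);
      Put put = new Put(row);
      for (int i = 1; i < fields.length; i++) {
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("c" + i),
            Bytes.toBytes(fields[i]));
      }
      context.write(new ImmutableBytesWritable(row), put);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "sample-upload");
    job.setJarByClass(SampleUploader.class);
    job.setMapperClass(Uploader.class);
    job.setInputFormatClass(TextInputFormat.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    // Wires the job to write into the "Sample" table via TableOutputFormat;
    // there is no reducer, so the Puts go out directly from the mappers.
    TableMapReduceUtil.initTableReducerJob("Sample", null, job);
    job.setNumReduceTasks(0);
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}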

I modified it for the hbase-0.89 API and loaded the data into HBase 0.89. But
after uploading the Sample data, around 3GB, to my pseudo-distributed machine,
HBase works fine until I restart. After I restart the machine, HBase stops
working and throws a MasterNotRunning exception.

The number of rows is close to 24 million.

The Hadoop version is 0.20.


Regards
Jason



On Thu, Feb 17, 2011 at 10:56 PM, Stack <stack@duboce.net> wrote:

> On Thu, Feb 17, 2011 at 2:15 AM, praba karan <prabaster@gmail.com> wrote:
> > Hi all,
> >
> > I've been trying to load HBase with a huge amount of data using a
> > MapReduce program. The HBase table contains 16 columns and the row ids
> > are generated by UUIDs.
>
> Is that 16 columns or 16 column families?
>
> When you say huge, what sizes are you talking?
>
> What's your cluster size?
>
> You are not using the bulk loader?
>
> How many mappers do you have running on each machine?
>
>
> > When I try to load, it takes some time and then gives
> > the exception discussed in the following link.
> >
> > http://web.archiveorange.com/archive/v/gMxNALiU1zbHXVoaJzOT
> >
>
> That exception is pretty generic.
>
> > After that, the HBase shell stopped working. I tried restarting the cluster.
> > When I tried to disable and drop the table, it produced the following
> > exception:
> >
> >
> > "ERROR: org.apache.hadoop.hbase.RegionException: Retries exhausted, it
> took
> > too long to wait for the table Sample to be disabled."
> >
>
> >
> > How do I recover my HBase 0.89, and is there any procedure to prepare
> > HBase for the bulk upload? My data contains rows in the millions!
> >
>
> What size are these rows?
>
> Please update to hbase 0.90. What version of hadoop?
>
> St.Ack
>
