hbase-user mailing list archives

From Andrew Purtell <apurt...@apache.org>
Subject Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar
Date Tue, 07 Jun 2011 06:10:05 GMT
Pardon if I've missed something, but I think this thread comes down to:

On Mon, 6/6/11, Mike Spreitzer <mspreitz@us.ibm.com> wrote:
> So my suggestion is to be unequivocal about it: when running
> distributed, always build your own Hadoop and put its -core
> JAR into your HBase installation (or use Cloudera, which has
> done this for you).

This is a good suggestion. We try, but it seems not all the avenues to this information are
covered. Suggestions on where to improve are helpful. Our Web UIs put up a big fat warning.
We cover this issue in the online book, but as this thread suggests, we might make that better
by pulling in some of Michael Noll's material, either directly with his permission or by reference.
We don't "own" the Hadoop wiki, so we can't do anything there. Our own wiki (and website) needs
a refresh; when that happens we can cover this issue, perhaps with a compatibility matrix
(with links to build or distro instructions), somewhere up front.

> Also: explicitly explain how the file has to be named (there
> is a strict naming requirement so that the launching scripts
> work, right?).

In my experience, what the jar is named is not important. Remove the old Hadoop jars from HBase's
lib/ directory and drop in a suitable Hadoop -append variant core jar. We have an internal
"frankenbase" HBase build, and we simply do that, and also replace the ZooKeeper jar (we have a
variant of it that can do SASL authentication), and all is well.
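For what it's worth, here is a rough sketch of that swap. The lib/ path and jar file names below
are just placeholders for whatever your install and your build produced, not requirements:

    # Sketch only: swap the bundled Hadoop core jar for a locally built -append jar.
    # Paths and file names here are assumptions; adjust to your own layout.
    import glob, os, shutil

    hbase_lib = "/usr/local/hbase/lib"            # assumed HBase install location
    new_core_jar = "/tmp/hadoop-core-append.jar"  # your locally built -append core jar

    # Remove whatever Hadoop core jar(s) shipped with HBase ...
    for old in glob.glob(os.path.join(hbase_lib, "hadoop-core*.jar")):
        os.remove(old)

    # ... and drop in the replacement. The name it ends up with does not matter.
    shutil.copy(new_core_jar, hbase_lib)

Restart HBase afterwards so the new jar gets picked up on the classpath.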

   - Andy

