hbase-user mailing list archives

From Marc Limotte <mslimo...@gmail.com>
Subject Re: Seeing errors after loading a fair amount of data. KeeperException$NoNodeException, IOException
Date Thu, 07 Jan 2010 21:44:57 GMT
Thanks, Robert.

I'm using Fedora, so it probably works the same way as you suggest. Setting
the ulimit and xcievers values as described on the Troubleshooting page didn't
seem to help, but I'm going to try again with your suggestion.

Marc
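
[Editor's note: before re-trying, the limits Marc mentions can be confirmed
from a shell. A read-only sketch along these lines; the hdfs-site.xml path
and HADOOP_CONF_DIR fallback are assumptions, adjust for your install:]

```shell
# Per-process soft limit on open file descriptors for this user
ulimit -Sn

# System-wide open-file ceiling (the value Robert's advice raises)
cat /proc/sys/fs/file-max

# Confirm dfs.datanode.max.xcievers actually reached the DataNode config.
# The path below is an assumption; point it at your real hdfs-site.xml.
grep -A1 'dfs.datanode.max.xcievers' \
    "${HADOOP_CONF_DIR:-/etc/hadoop/conf}/hdfs-site.xml" 2>/dev/null \
    || echo "xcievers setting not found in hdfs-site.xml"
```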

On Thu, Jan 7, 2010 at 12:56 PM, Andrew Purtell <apurtell@apache.org> wrote:

> Robert,
>
> Thanks for that. I updated the relevant section of the Troubleshooting page
> up on the HBase wiki with this advice.
>
> Best regards,
>
>   - Andy
>
>
>
> ----- Original Message ----
> > From: "Gibbon, Robert, VF-Group" <Robert.Gibbon@vodafone.com>
> > To: hbase-user@hadoop.apache.org
> > Sent: Thu, January 7, 2010 5:04:58 AM
> > Subject: RE: Seeing errors after loading a fair amount of data.
> KeeperException$NoNodeException, IOException
> >
> > Maybe you are running Red Hat? Just changing limits.conf I think won't
> > work because RH has a maximum total open files across the whole system,
> > which is 4096 by default, unless you do something like this too
> >
> > echo "32768" > /proc/sys/fs/file-max
> > service network restart
> >
> > To make it permanent edit /etc/sysctl.conf to include the line:
> >       fs.file-max = 32768
> > Kind regards,
> > Robert
> [...]
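
[Editor's note: Robert's fix has two parts: raise the running value, then
persist it in /etc/sysctl.conf. A small sanity-check sketch, assuming the
32768 target from this thread. One correction worth noting: a write to
/proc/sys/fs/file-max takes effect immediately, so the "service network
restart" in the quoted mail is not required for this particular setting.]

```shell
#!/bin/sh
# Compare current limits against the 32768 target from this thread and
# print the fix-up commands Robert suggests when a limit is too low.
# This script only reads; the echo/sysctl commands it prints need root.
WANT=32768
SYS=$(cat /proc/sys/fs/file-max)
PROC=$(ulimit -Sn)
[ "$PROC" = "unlimited" ] && PROC=$WANT   # treat unlimited as high enough

echo "system-wide file-max: $SYS"
echo "per-process nofile:   $PROC"

if [ "$SYS" -lt "$WANT" ]; then
    echo "too low: run (as root)   echo $WANT > /proc/sys/fs/file-max"
    echo "persist it: add 'fs.file-max = $WANT' to /etc/sysctl.conf"
fi
if [ "$PROC" -lt "$WANT" ]; then
    echo "too low: raise 'nofile' in /etc/security/limits.conf and log in again"
fi
```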
