hbase-user mailing list archives

From Amandeep Khurana <ama...@gmail.com>
Subject Re: Connection problem during data import into hbase
Date Sat, 21 Feb 2009 09:12:29 GMT
I just did a "count 'in_table'" and it did work. So I truncated the table and
started the MapReduce job again to load the entire file. For some reason, I
had to kill the job. After that, the count command gives the following
error:


count 'in_table'

NativeException: org.apache.hadoop.hbase.client.RetriesExhaustedException:
Trying to contact region server 171.69.102.51:60020 for region
in_table,,1235194191883, row '', but failed after 5 attempts.
Exceptions:
org.apache.hadoop.hbase.NotServingRegionException:
org.apache.hadoop.hbase.NotServingRegionException: in_table,,1235194191883
    at
org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:2065)
    at
org.apache.hadoop.hbase.regionserver.HRegionServer.openScanner(HRegionServer.java:1699)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:632)
    at
org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:895)

org.apache.hadoop.hbase.NotServingRegionException:
org.apache.hadoop.hbase.NotServingRegionException: in_table,,1235194191883
    at
org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:2065)
    at
org.apache.hadoop.hbase.regionserver.HRegionServer.openScanner(HRegionServer.java:1699)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:632)
    at
org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:895)


It seems like the region server daemon is somehow getting corrupted by the
MapReduce job... I can't really understand what's happening.
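The "failed after 5 attempts" message above is the HBase client exhausting its retry budget against a region server that keeps answering NotServingRegionException. A minimal plain-Java sketch of that client-side retry pattern (the class, method names, and the simulated exception below are illustrative only, not HBase source code):

```java
import java.util.ArrayList;
import java.util.List;

public class RetrySketch {
    // Simulates a client calling a region server that fails on every
    // attempt, the way NotServingRegionException shows up repeatedly in
    // the error above. Returns the number of attempts made before giving up.
    public static int callWithRetries(int maxRetries) {
        List<String> causes = new ArrayList<>();
        for (int attempt = 1; attempt <= maxRetries; attempt++) {
            try {
                // Stand-in for the remote call that keeps failing.
                throw new IllegalStateException(
                        "NotServingRegionException: in_table,,1235194191883");
            } catch (IllegalStateException e) {
                causes.add(e.getMessage());
            }
        }
        // The real client then throws RetriesExhaustedException carrying
        // the collected causes, which is the message the shell printed.
        System.out.println("failed after " + causes.size() + " attempts");
        return causes.size();
    }

    public static void main(String[] args) {
        callWithRetries(5); // prints "failed after 5 attempts"
    }
}
```

The point of the sketch: the shell command itself is fine; the repeated identical causes in the trace mean the same region stayed unassigned across all retries, which points at server-side state rather than the client.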

Amandeep

Amandeep Khurana
Computer Science Graduate Student
University of California, Santa Cruz


On Sat, Feb 21, 2009 at 1:01 AM, Amandeep Khurana <amansk@gmail.com> wrote:

> Yes, the table exists before I start the job.
>
> I am not using TableOutputFormat. I picked up the sample code from the docs
> and am using it.
>
> Here's the job conf:
>
> JobConf conf = new JobConf(getConf(), IN_TABLE_IMPORT.class);
> FileInputFormat.setInputPaths(conf, new Path("import_data"));
> conf.setMapperClass(MapClass.class);
> conf.setNumReduceTasks(0);
> conf.setOutputFormat(NullOutputFormat.class);
> JobClient.runJob(conf);
>
>
> Interestingly, the hbase shell isn't working now either. It's giving errors
> even when I give the command "list"...
>
>
>
> Amandeep Khurana
> Computer Science Graduate Student
> University of California, Santa Cruz
>
>
> On Sat, Feb 21, 2009 at 12:10 AM, stack <stack@duboce.net> wrote:
>
>> The table exists before you start the MR job?
>>
>> When you say 'midway through the job', are you using TableOutputFormat to
>> insert into your table?
>>
>> Which version of hbase?
>>
>> St.Ack
>>
>> On Fri, Feb 20, 2009 at 9:55 PM, Amandeep Khurana <amansk@gmail.com>
>> wrote:
>>
>> > I don't know if this is related or not, but it seems to be. After this
>> > MapReduce job, I tried to count the number of entries in the table in
>> > hbase through the shell. It failed with the following error:
>> >
>> > hbase(main):002:0> count 'in_table'
>> > NativeException: java.lang.NullPointerException: null
>> >    from java.lang.String:-1:in `<init>'
>> >    from org/apache/hadoop/hbase/util/Bytes.java:92:in `toString'
>> >    from
>> org/apache/hadoop/hbase/client/RetriesExhaustedException.java:50:in
>> > `getMessage'
>> >    from
>> org/apache/hadoop/hbase/client/RetriesExhaustedException.java:40:in
>> > `<init>'
>> >    from org/apache/hadoop/hbase/client/HConnectionManager.java:841:in
>> > `getRegionServerWithRetries'
>> >    from org/apache/hadoop/hbase/client/MetaScanner.java:56:in `metaScan'
>> >    from org/apache/hadoop/hbase/client/MetaScanner.java:30:in `metaScan'
>> >    from org/apache/hadoop/hbase/client/HConnectionManager.java:411:in
>> > `getHTableDescriptor'
>> >    from org/apache/hadoop/hbase/client/HTable.java:219:in
>> > `getTableDescriptor'
>> >    from sun.reflect.NativeMethodAccessorImpl:-2:in `invoke0'
>> >    from sun.reflect.NativeMethodAccessorImpl:-1:in `invoke'
>> >    from sun.reflect.DelegatingMethodAccessorImpl:-1:in `invoke'
>> >    from java.lang.reflect.Method:-1:in `invoke'
>> >    from org/jruby/javasupport/JavaMethod.java:250:in
>> > `invokeWithExceptionHandling'
>> >    from org/jruby/javasupport/JavaMethod.java:219:in `invoke'
>> >    from org/jruby/javasupport/JavaClass.java:416:in `execute'
>> > ... 145 levels...
>> >    from org/jruby/internal/runtime/methods/DynamicMethod.java:74:in
>> `call'
>> >    from org/jruby/internal/runtime/methods/CompiledMethod.java:48:in
>> `call'
>> >    from org/jruby/runtime/CallSite.java:123:in `cacheAndCall'
>> >    from org/jruby/runtime/CallSite.java:298:in `call'
>> >    from
>> >
>> >
>> ruby/hadoop/install/hbase_minus_0_dot_19_dot_0/bin//hadoop/install/hbase/bin/../bin/hirb.rb:429:in
>> > `__file__'
>> >    from
>> >
>> >
>> ruby/hadoop/install/hbase_minus_0_dot_19_dot_0/bin//hadoop/install/hbase/bin/../bin/hirb.rb:-1:in
>> > `__file__'
>> >    from
>> >
>> >
>> ruby/hadoop/install/hbase_minus_0_dot_19_dot_0/bin//hadoop/install/hbase/bin/../bin/hirb.rb:-1:in
>> > `load'
>> >    from org/jruby/Ruby.java:512:in `runScript'
>> >    from org/jruby/Ruby.java:432:in `runNormally'
>> >    from org/jruby/Ruby.java:312:in `runFromMain'
>> >    from org/jruby/Main.java:144:in `run'
>> >    from org/jruby/Main.java:89:in `run'
>> >    from org/jruby/Main.java:80:in `main'
>> >    from /hadoop/install/hbase/bin/../bin/HBase.rb:444:in `count'
>> >    from /hadoop/install/hbase/bin/../bin/hirb.rb:348:in `count'
>> >    from (hbase):3:in `binding'
>> >
>> >
>> > Amandeep Khurana
>> > Computer Science Graduate Student
>> > University of California, Santa Cruz
>> >
>> >
>> > On Fri, Feb 20, 2009 at 9:46 PM, Amandeep Khurana <amansk@gmail.com>
>> > wrote:
>> >
>> > > Here's what it throws on the console:
>> > >
>> > > 09/02/20 21:45:29 INFO mapred.JobClient: Task Id :
>> > > attempt_200902201300_0019_m_000006_0, Status : FAILED
>> > > java.io.IOException: table is null
>> > >         at IN_TABLE_IMPORT$MapClass.map(IN_TABLE_IMPORT.java:33)
>> > >         at IN_TABLE_IMPORT$MapClass.map(IN_TABLE_IMPORT.java:1)
>> > >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>> > >         at org.apache.hadoop.mapred.Child.main(Child.java:155)
>> > >
>> > > attempt_200902201300_0019_m_000006_0:
>> > > org.apache.hadoop.hbase.client.NoServerForRegionException: Timed out
>> > trying
>> > > to locate root region
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRootRegion(HConnectionManager.java:768)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:448)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:430)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:557)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:457)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:430)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:557)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:461)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:423)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > > org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:114)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > > org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:97)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > > IN_TABLE_IMPORT$MapClass.configure(IN_TABLE_IMPORT.java:120)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:58)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:83)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > > org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:58)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > >
>> >
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:83)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > > org.apache.hadoop.mapred.MapTask.run(MapTask.java:328)
>> > > attempt_200902201300_0019_m_000006_0:   at
>> > > org.apache.hadoop.mapred.Child.main(Child.java:155)
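The "table is null" IOException in the trace above, combined with the NoServerForRegionException raised inside configure(), suggests the mapper caught (or deferred) the HTable constructor's connection failure and left its table field null, so map() later died with a less informative message. A hedged plain-Java sketch of that pattern and a fail-fast alternative (class and method names here are illustrative, not the poster's actual code or the HBase API):

```java
import java.io.IOException;

public class TableInitSketch {
    // Stands in for org.apache.hadoop.hbase.client.HTable.
    private Object table;

    // Fragile version: the connection error is swallowed in configure(),
    // so map() only sees "table is null" much later.
    public void configureSwallowing(boolean clusterReachable) {
        try {
            table = openTable(clusterReachable);
        } catch (IOException e) {
            // swallowed: table stays null
        }
    }

    // Safer version: fail fast so the task log shows the root cause
    // (e.g. "Timed out trying to locate root region") at setup time.
    public void configureFailFast(boolean clusterReachable) {
        try {
            table = openTable(clusterReachable);
        } catch (IOException e) {
            throw new RuntimeException("could not open in_table", e);
        }
    }

    // Simulated table open: fails the same way the trace above does
    // when the root region cannot be located.
    private Object openTable(boolean clusterReachable) throws IOException {
        if (!clusterReachable) {
            throw new IOException("Timed out trying to locate root region");
        }
        return new Object();
    }

    public boolean tableIsNull() {
        return table == null;
    }
}
```

Either way, the underlying problem is that map tasks could not locate the root region at all, so the cluster-side failure, not the mapper code, is the thing to chase first.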
>> > >
>> > >
>> > >
>> > >
>> > >
>> > > Amandeep Khurana
>> > > Computer Science Graduate Student
>> > > University of California, Santa Cruz
>> > >
>> > >
>> > > On Fri, Feb 20, 2009 at 9:43 PM, Amandeep Khurana <amansk@gmail.com
>> > >wrote:
>> > >
>> > >> I am trying to import data from a flat file into HBase using a
>> > >> MapReduce job. There are close to 2 million rows. Midway into the
>> > >> job, it starts giving me connection problems and eventually kills
>> > >> the job. When the error comes, the hbase shell also stops working.
>> > >>
>> > >> This is what I get:
>> > >>
>> > >> 2009-02-20 21:37:14,407 INFO org.apache.hadoop.ipc.HBaseClass:
>> > >> Retrying connect to server: /171.69.102.52:60020. Already tried 0 time(s).
>> > >>
>> > >> What could be going wrong?
>> > >>
>> > >> Amandeep
>> > >>
>> > >>
>> > >> Amandeep Khurana
>> > >> Computer Science Graduate Student
>> > >> University of California, Santa Cruz
>> > >>
>> > >
>> > >
>> >
>>
>
>
