hbase-user mailing list archives

From: Amandeep Khurana <ama...@gmail.com>
Subject: Re: Connection problem during data import into hbase
Date: Sat, 21 Feb 2009 05:55:47 GMT
I don't know if this is related or not, but it seems to be. After this MapReduce
job, I tried to count the number of entries in the table in HBase through the
shell. It failed with the following error:

hbase(main):002:0> count 'in_table'
NativeException: java.lang.NullPointerException: null
    from java.lang.String:-1:in `<init>'
    from org/apache/hadoop/hbase/util/Bytes.java:92:in `toString'
    from org/apache/hadoop/hbase/client/RetriesExhaustedException.java:50:in `getMessage'
    from org/apache/hadoop/hbase/client/RetriesExhaustedException.java:40:in `<init>'
    from org/apache/hadoop/hbase/client/HConnectionManager.java:841:in `getRegionServerWithRetries'
    from org/apache/hadoop/hbase/client/MetaScanner.java:56:in `metaScan'
    from org/apache/hadoop/hbase/client/MetaScanner.java:30:in `metaScan'
    from org/apache/hadoop/hbase/client/HConnectionManager.java:411:in `getHTableDescriptor'
    from org/apache/hadoop/hbase/client/HTable.java:219:in `getTableDescriptor'
    from sun.reflect.NativeMethodAccessorImpl:-2:in `invoke0'
    from sun.reflect.NativeMethodAccessorImpl:-1:in `invoke'
    from sun.reflect.DelegatingMethodAccessorImpl:-1:in `invoke'
    from java.lang.reflect.Method:-1:in `invoke'
    from org/jruby/javasupport/JavaMethod.java:250:in `invokeWithExceptionHandling'
    from org/jruby/javasupport/JavaMethod.java:219:in `invoke'
    from org/jruby/javasupport/JavaClass.java:416:in `execute'
... 145 levels...
    from org/jruby/internal/runtime/methods/DynamicMethod.java:74:in `call'
    from org/jruby/internal/runtime/methods/CompiledMethod.java:48:in `call'
    from org/jruby/runtime/CallSite.java:123:in `cacheAndCall'
    from org/jruby/runtime/CallSite.java:298:in `call'
    from ruby/hadoop/install/hbase_minus_0_dot_19_dot_0/bin//hadoop/install/hbase/bin/../bin/hirb.rb:429:in `__file__'
    from ruby/hadoop/install/hbase_minus_0_dot_19_dot_0/bin//hadoop/install/hbase/bin/../bin/hirb.rb:-1:in `__file__'
    from ruby/hadoop/install/hbase_minus_0_dot_19_dot_0/bin//hadoop/install/hbase/bin/../bin/hirb.rb:-1:in `load'
    from org/jruby/Ruby.java:512:in `runScript'
    from org/jruby/Ruby.java:432:in `runNormally'
    from org/jruby/Ruby.java:312:in `runFromMain'
    from org/jruby/Main.java:144:in `run'
    from org/jruby/Main.java:89:in `run'
    from org/jruby/Main.java:80:in `main'
    from /hadoop/install/hbase/bin/../bin/HBase.rb:444:in `count'
    from /hadoop/install/hbase/bin/../bin/hirb.rb:348:in `count'
    from (hbase):3:in `binding'
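
Reading the trace, the NullPointerException looks secondary: count apparently hit a
RetriesExhaustedException (no region server reachable), and building that exception's
message seems to pass a null region server address into Bytes.toString(), which hands
it straight to the String constructor (hence the java.lang.String:<init> frame at the
top). A minimal sketch of that mechanism, with hypothetical names standing in for the
HBase classes:

// Minimal sketch (not HBase source) of why a missing region server address
// can surface as a NullPointerException from java.lang.String.<init>:
// the byte[] holding the server name is null, and converting it to a
// String dereferences it. All names here are hypothetical.
public class NpeSketch {

    // Stand-in for org.apache.hadoop.hbase.util.Bytes.toString(byte[])
    static String bytesToString(byte[] b) throws Exception {
        return new String(b, "UTF-8");   // throws NullPointerException when b == null
    }

    public static void main(String[] args) throws Exception {
        byte[] regionServerAddress = null;   // no server could be located
        System.out.println(bytesToString(regionServerAddress));
    }
}

So the thing to chase is why no region server is reachable, not the NPE itself.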


Amandeep Khurana
Computer Science Graduate Student
University of California, Santa Cruz


On Fri, Feb 20, 2009 at 9:46 PM, Amandeep Khurana <amansk@gmail.com> wrote:

> Here's what it throws on the console:
>
> 09/02/20 21:45:29 INFO mapred.JobClient: Task Id : attempt_200902201300_0019_m_000006_0, Status : FAILED
> java.io.IOException: table is null
>         at IN_TABLE_IMPORT$MapClass.map(IN_TABLE_IMPORT.java:33)
>         at IN_TABLE_IMPORT$MapClass.map(IN_TABLE_IMPORT.java:1)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>         at org.apache.hadoop.mapred.Child.main(Child.java:155)
>
> attempt_200902201300_0019_m_000006_0: org.apache.hadoop.hbase.client.NoServerForRegionException: Timed out trying to locate root region
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRootRegion(HConnectionManager.java:768)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:448)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:430)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:557)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:457)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:430)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:557)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:461)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:423)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:114)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:97)
> attempt_200902201300_0019_m_000006_0:   at IN_TABLE_IMPORT$MapClass.configure(IN_TABLE_IMPORT.java:120)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:58)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:83)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:58)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:83)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:328)
> attempt_200902201300_0019_m_000006_0:   at org.apache.hadoop.mapred.Child.main(Child.java:155)
>
>
>
>
>
> Amandeep Khurana
> Computer Science Graduate Student
> University of California, Santa Cruz
>
>
> On Fri, Feb 20, 2009 at 9:43 PM, Amandeep Khurana <amansk@gmail.com> wrote:
>
>> I am trying to import data from a flat file into HBase using a MapReduce
>> job. There are close to 2 million rows. Midway through the job, it starts
>> giving me connection problems and eventually kills the job. When the error
>> appears, the HBase shell also stops working.
>>
>> This is what I get:
>>
>> 2009-02-20 21:37:14,407 INFO org.apache.hadoop.ipc.HBaseClass: Retrying connect to server: /171.69.102.52:60020. Already tried 0 time(s).
>>
>> What could be going wrong?
>>
>> Amandeep
>>
>>
>> Amandeep Khurana
>> Computer Science Graduate Student
>> University of California, Santa Cruz
>>
>
>
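
Taking the thread together, the two symptoms point at the same thing: the HTable
created in the job's configure() cannot locate the ROOT region (the region server at
171.69.102.52:60020 apparently stops answering part way through the import), the
exception is swallowed, the table field stays null, and every later map() call then
fails with "table is null". Below is a hedged sketch of the mapper pattern the traces
suggest, written against the old org.apache.hadoop.mapred API and the 0.19-era HBase
client; it is not the original IN_TABLE_IMPORT code, and the class name, the
"data:value" column, and the tab-separated input layout are assumptions for
illustration. The only point is to fail fast in configure() so the task report shows
the real cause:

// Hedged sketch, not the original IN_TABLE_IMPORT code. Assumes the 0.19-era
// client API (HTable, BatchUpdate) and the old mapred interfaces visible in
// the stack traces above.
import java.io.IOException;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.BatchUpdate;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class ImportMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, Text> {

  private HTable table;

  public void configure(JobConf job) {
    try {
      // One HTable per task attempt, reused for every map() call.
      table = new HTable(new HBaseConfiguration(job), "in_table");
    } catch (IOException e) {
      // Fail the task immediately instead of leaving table == null; the task
      // log then shows the root cause (e.g. NoServerForRegionException)
      // rather than a later "table is null" from map().
      throw new RuntimeException("Cannot connect to HBase table", e);
    }
  }

  public void map(LongWritable key, Text line,
      OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    // Assumed flat-file layout: rowkey<TAB>value
    String[] fields = line.toString().split("\t", 2);
    BatchUpdate update = new BatchUpdate(fields[0]);
    update.put("data:value", Bytes.toBytes(fields[1]));
    table.commit(update);   // 0.19-era write path
  }
}

This does not fix the underlying connectivity problem (the repeated "Retrying connect
to server: /171.69.102.52:60020" suggests that region server died or became
unreachable under the write load), but it makes the real error show up in the failed
task's report instead of the misleading "table is null".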
