hbase-user mailing list archives

From: Jean-Daniel Cryans <jdcry...@apache.org>
Subject: Re: RetriesExhaustedException!
Date: Tue, 10 Mar 2009 12:45:22 GMT
NguyenHuynh,

Please take a look at:

http://wiki.apache.org/hadoop/Hbase/Troubleshooting#5 and 6

http://hadoop.apache.org/hbase/docs/r0.19.0/api/overview-summary.html#overview_description
See if you meet the requirements listed there.

When HBase cannot obtain a block (the root error in your case), it's
because something went wrong on the HDFS side.
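
In case it helps, the two usual culprits behind "Could not obtain block"
are a low open-file limit and the datanode xceiver setting. A rough sanity
check on each node could look like the following (the values here are only
examples, adjust them to your cluster):

  # file descriptor limit for the user running the datanodes/regionservers
  $ ulimit -n
  1024            # common default; raise it (e.g. to 32768) in /etc/security/limits.conf

  # in conf/hadoop-site.xml (0.19.x spells the property "xcievers")
  <property>
    <name>dfs.datanode.max.xcievers</name>
    <value>2048</value>
  </property>

  # ask HDFS directly whether the block is missing or corrupt
  $ hadoop fsck /hbase -files -blocks | grep blk_3404782455271254354

If fsck says the block is fine, also look in the datanode logs around the
time of the failure for "Too many open files" or xceiver-count errors.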

J-D

On Mon, Mar 9, 2009 at 2:10 AM, nguyenhuynh <nguyenhuynh@asnet.com.vn> wrote:
>
> Hi all!
>
>
> I have a Map/Reduce job used to fetch URIs. The URIs are stored in HBase.
> When I run the job, the following error appears:
>
> org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to contact region server
> 192.168.1.182:60020 for region TestHarisCrawlDatum,http://www.imedeen.us,1235699240314, row
> 'http://www.nghenhac.info/Search/Index.html?SearchQuery=SabrinaSalerno&tab=1', but failed
> after 10 attempts.
> Exceptions:
> java.io.IOException: java.io.IOException: Could not obtain block: blk_3404782455271254354_52804
> file=/hbase/TestHarisCrawlDatum/551384203/crd_urs/mapfiles/2937286311309188187/data
>        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1462)
>        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1312)
>        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1417)
>        at java.io.DataInputStream.readFully(DataInputStream.java:178)
>        at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:64)
>        at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:102)
>        at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1933)
>        at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1833)
>        at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1879)
>        at org.apache.hadoop.io.MapFile$Reader.next(MapFile.java:516)
>        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.getNext(StoreFileScanner.java:312)
>        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:195)
>        at org.apache.hadoop.hbase.regionserver.HStoreScanner.next(HStoreScanner.java:196)
>        at org.apache.hadoop.hbase.regionserver.HRegion$HScanner.next(HRegion.java:2027)
>        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1087)
>        at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.hbase.ipc.HbaseRPC$Server.call(HbaseRPC.java:554)
>        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:888)
>
>
> * I have 3 machines: 1 master and 2 slaves.
> * I am running with more than 13,000 URIs.
>
> Please help!
>
> Thanks
> Best regards,
>
> NguyenHuynh.
>
