hbase-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: HBase-Trunk_matrix » latest1.7,yahoo-not-h2 #1213
Date Tue, 12 Jul 2016 06:54:48 GMT
See <https://builds.apache.org/job/HBase-Trunk_matrix/jdk=latest1.7,label=yahoo-not-h2/1213/>

------------------------------------------
[...truncated 5603 lines...]
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 159.828 sec - in org.apache.hadoop.hbase.TestZooKeeper
Running org.apache.hadoop.hbase.mapreduce.TestHLogRecordReader
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.404 sec - in org.apache.hadoop.hbase.mapreduce.TestHLogRecordReader
Running org.apache.hadoop.hbase.mapreduce.TestHRegionPartitioner
Running org.apache.hadoop.hbase.mapreduce.TestImportTsv
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.145 sec - in org.apache.hadoop.hbase.mapreduce.TestHRegionPartitioner
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.91 sec - in org.apache.hadoop.hbase.mapreduce.TestTableInputFormat
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 261.159 sec - in org.apache.hadoop.hbase.client.TestReplicasClient
Running org.apache.hadoop.hbase.mapreduce.TestWALPlayer
Running org.apache.hadoop.hbase.mapreduce.TestImportExport
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.214 sec - in org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFilesSplitRecovery
Running org.apache.hadoop.hbase.mapreduce.TestRowCounter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.655 sec - in org.apache.hadoop.hbase.mapreduce.TestWALPlayer
Running org.apache.hadoop.hbase.mapreduce.TestImportTSVWithVisibilityLabels
Running org.apache.hadoop.hbase.mapreduce.TestHashTable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.76 sec - in org.apache.hadoop.hbase.mapreduce.TestHashTable
Running org.apache.hadoop.hbase.mapreduce.TestImportTSVWithTTLs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.266 sec - in org.apache.hadoop.hbase.mapreduce.TestImportTSVWithTTLs
Running org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat2
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.27 sec - in org.apache.hadoop.hbase.mapreduce.TestRowCounter
Running org.apache.hadoop.hbase.mapreduce.TestWALRecordReader
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 88.721 sec - in org.apache.hadoop.hbase.mapreduce.TestImportExport
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.367 sec - in org.apache.hadoop.hbase.mapreduce.TestWALRecordReader
Tests run: 16, Failures: 0, Errors: 0, Skipped: 14, Time elapsed: 29.209 sec - in org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat2
Running org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 78.217 sec - in org.apache.hadoop.hbase.mapreduce.TestImportTSVWithVisibilityLabels
Running org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1
Running org.apache.hadoop.hbase.mapreduce.TestTimeRangeMapRed
Running org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.871 sec - in org.apache.hadoop.hbase.mapreduce.TestTimeRangeMapRed
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 137.907 sec - in org.apache.hadoop.hbase.mapreduce.TestImportTsv
Running org.apache.hadoop.hbase.mapreduce.TestSyncTable
Running org.apache.hadoop.hbase.mapreduce.TestCopyTable
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.922 sec - in org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.738 sec - in org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1
Running org.apache.hadoop.hbase.io.TestFileLink
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.882 sec - in org.apache.hadoop.hbase.mapreduce.TestSyncTable
Running org.apache.hadoop.hbase.TestMetaTableAccessor
Running org.apache.hadoop.hbase.io.hfile.TestHFileBlockIndex
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.688 sec - in org.apache.hadoop.hbase.io.TestFileLink
Running org.apache.hadoop.hbase.io.hfile.TestSeekBeforeWithInlineBlocks
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.268 sec - in org.apache.hadoop.hbase.io.hfile.TestHFileBlockIndex
Running org.apache.hadoop.hbase.io.hfile.TestScannerSelectionUsingTTL
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.059 sec - in org.apache.hadoop.hbase.io.hfile.TestSeekBeforeWithInlineBlocks
Running org.apache.hadoop.hbase.io.hfile.TestHFileSeek
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.429 sec - in org.apache.hadoop.hbase.io.hfile.TestHFileSeek
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.222 sec - in org.apache.hadoop.hbase.TestMetaTableAccessor
Running org.apache.hadoop.hbase.io.hfile.TestHFileBlock
Running org.apache.hadoop.hbase.io.hfile.TestCacheConfig
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.715 sec - in org.apache.hadoop.hbase.mapreduce.TestCopyTable
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.493 sec - in org.apache.hadoop.hbase.io.hfile.TestCacheConfig
Running org.apache.hadoop.hbase.io.hfile.TestScannerFromBucketCache
Running org.apache.hadoop.hbase.io.hfile.TestForceCacheImportantBlocks
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.531 sec - in org.apache.hadoop.hbase.io.hfile.TestScannerFromBucketCache
Running org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.412 sec - in org.apache.hadoop.hbase.io.hfile.TestForceCacheImportantBlocks
Running org.apache.hadoop.hbase.io.encoding.TestEncodedSeekers
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.438 sec - in org.apache.hadoop.hbase.io.hfile.TestScannerSelectionUsingTTL
Running org.apache.hadoop.hbase.io.encoding.TestDataBlockEncoders
Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 91.829 sec - in org.apache.hadoop.hbase.io.hfile.TestHFileBlock
Tests run: 8, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 182.085 sec <<< FAILURE! - in org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat
testWithMockedMapReduceSingleRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)  Time elapsed: 72.336 sec  <<< ERROR!
org.apache.hadoop.hbase.client.ScannerTimeoutException: 60010ms passed since the last invocation, timeout is currently set to 60000
	at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:433)
	at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:363)
	at org.apache.hadoop.hbase.client.ClientSimpleScanner.next(ClientSimpleScanner.java:51)
	at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:776)
	at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:702)
	at org.apache.hadoop.hbase.MetaTableAccessor.getTableRegionsAndLocations(MetaTableAccessor.java:624)
	at org.apache.hadoop.hbase.MetaTableAccessor.getTableRegions(MetaTableAccessor.java:447)
	at org.apache.hadoop.hbase.client.HBaseAdmin.getTableRegions(HBaseAdmin.java:2112)
	at org.apache.hadoop.hbase.snapshot.SnapshotTestingUtils.confirmSnapshotValid(SnapshotTestingUtils.java:247)
	at org.apache.hadoop.hbase.snapshot.SnapshotTestingUtils.createSnapshotAndValidate(SnapshotTestingUtils.java:410)
	at org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormatTestBase.createTableAndSnapshot(TableSnapshotInputFormatTestBase.java:213)
	at org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMockedMapReduce(TestTableSnapshotInputFormat.java:201)
Caused by: org.apache.hadoop.hbase.UnknownScannerException: org.apache.hadoop.hbase.UnknownScannerException:
Unknown scanner '16'. This can happen due to any of the following reasons: a) Scanner id given
is wrong, b) Scanner lease expired because of long wait between consecutive client checkins,
c) Server may be closing down, d) RegionServer restart during upgrade.
If the issue is due to reason (b), a possible fix would be increasing the value of 'hbase.client.scanner.timeout.period'
configuration.
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2546)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34818)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2212)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:118)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:189)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:169)

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hbase.ipc.AsyncCall.setFailed(AsyncCall.java:159)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:81)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:38)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1320)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:905)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:563)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:504)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:418)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:390)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:742)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.UnknownScannerException:
Unknown scanner '16'. This can happen due to any of the following reasons: a) Scanner id given
is wrong, b) Scanner lease expired because of long wait between consecutive client checkins,
c) Server may be closing down, d) RegionServer restart during upgrade.
If the issue is due to reason (b), a possible fix would be increasing the value of 'hbase.client.scanner.timeout.period'
configuration.
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2546)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34818)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2212)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:118)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:189)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:169)

	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.createRemoteException(AsyncServerResponseHandler.java:120)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:76)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:38)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1320)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:905)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:563)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:504)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:418)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:390)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:742)
	at java.lang.Thread.run(Thread.java:745)
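
The fix suggested in the exception text above is raising the client scanner timeout. As a minimal, hypothetical sketch (not part of this build output; the class name and the 120000 ms value are assumptions for illustration), the property could be raised programmatically before opening the scanner:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

// Hypothetical sketch: raise the scanner timeout named in the exception above.
// The default 60000 ms is what the failing scan exceeded (60010 ms elapsed).
public class ScannerTimeoutSketch {
  public static void main(String[] args) {
    Configuration conf = HBaseConfiguration.create();
    conf.setInt("hbase.client.scanner.timeout.period", 120000); // illustrative value, not from this build
    System.out.println(conf.get("hbase.client.scanner.timeout.period"));
  }
}

If the lease expiry of reason (b) is hit on the region server side, the same property would usually be set in hbase-site.xml across the cluster rather than only in client code.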

Running org.apache.hadoop.hbase.io.encoding.TestBufferedDataBlockEncoder
Running org.apache.hadoop.hbase.io.asyncfs.TestFanOutOneBlockAsyncDFSOutput
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.559 sec - in org.apache.hadoop.hbase.io.encoding.TestBufferedDataBlockEncoder
Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 56.419 sec - in org.apache.hadoop.hbase.io.encoding.TestDataBlockEncoders
Running org.apache.hadoop.hbase.io.asyncfs.TestSaslFanOutOneBlockAsyncDFSOutput
Running org.apache.hadoop.hbase.filter.TestScanRowPrefix
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.551 sec - in org.apache.hadoop.hbase.filter.TestScanRowPrefix
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.569 sec - in org.apache.hadoop.hbase.io.asyncfs.TestFanOutOneBlockAsyncDFSOutput
Running org.apache.hadoop.hbase.filter.TestFilterWithScanLimits
Running org.apache.hadoop.hbase.filter.TestFilterWrapper
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 109.421 sec - in org.apache.hadoop.hbase.io.encoding.TestEncodedSeekers
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.877 sec - in org.apache.hadoop.hbase.filter.TestFilterWrapper
Running org.apache.hadoop.hbase.filter.TestFuzzyRowFilterEndToEnd
Running org.apache.hadoop.hbase.filter.TestFilterListOrOperatorWithBlkCnt
Tests run: 72, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 160.688 sec - in org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.728 sec - in org.apache.hadoop.hbase.filter.TestFilterListOrOperatorWithBlkCnt
Running org.apache.hadoop.hbase.filter.TestColumnRangeFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.112 sec - in org.apache.hadoop.hbase.filter.TestFilterWithScanLimits
Running org.apache.hadoop.hbase.filter.TestFuzzyRowAndColumnRangeFilter
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 84.068 sec - in org.apache.hadoop.hbase.io.asyncfs.TestSaslFanOutOneBlockAsyncDFSOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.661 sec - in org.apache.hadoop.hbase.filter.TestFuzzyRowAndColumnRangeFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.461 sec - in org.apache.hadoop.hbase.filter.TestColumnRangeFilter
Running org.apache.hadoop.hbase.filter.TestMultiRowRangeFilter
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.873 sec - in org.apache.hadoop.hbase.filter.TestMultiRowRangeFilter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 301.871 sec - in org.apache.hadoop.hbase.filter.TestFuzzyRowFilterEndToEnd

Results :

Tests in error: 
  TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:90->testWithMockedMapReduce:201->TableSnapshotInputFormatTestBase.createTableAndSnapshot:213 » ScannerTimeout

Tests run: 2072, Failures: 0, Errors: 1, Skipped: 24

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [  2.556 s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [  0.665 s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [  0.253 s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [  0.197 s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [  2.970 s]
[INFO] Apache HBase - Common ............................. SUCCESS [01:30 min]
[INFO] Apache HBase - Procedure .......................... SUCCESS [01:35 min]
[INFO] Apache HBase - Client ............................. SUCCESS [ 41.909 s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [  7.612 s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [  9.611 s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [ 10.601 s]
[INFO] Apache HBase - Server ............................. FAILURE [  01:04 h]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - RSGroup ............................ SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - External Block Cache ............... SKIPPED
[INFO] Apache HBase - Spark .............................. SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] Apache HBase - Archetypes ......................... SKIPPED
[INFO] Apache HBase - Exemplar for hbase-client archetype  SKIPPED
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype  SKIPPED
[INFO] Apache HBase - Archetype builder .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:09 h
[INFO] Finished at: 2016-07-12T06:51:14+00:00
[INFO] Final Memory: 70M/477M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/HBase-Trunk_matrix/jdk=latest1.7,label=yahoo-not-h2/ws/hbase-server/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : # Run zombie detector script
./dev-support/zombie-detector.sh --jenkins ${BUILD_ID}
[yahoo-not-h2] $ /bin/bash -xe /tmp/hudson7392298012295699824.sh
+ ./dev-support/zombie-detector.sh --jenkins 1213
Tue Jul 12 06:51:22 UTC 2016 We're ok: there is no zombie test


    {color:green}+1 zombies{color}. No zombie tests found running at the end of the build.
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
[FINDBUGS] Skipping publisher since build result is FAILURE
[CHECKSTYLE] Skipping publisher since build result is FAILURE
