hbase-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: HBase-1.1-JDK7 #1709
Date Wed, 04 May 2016 08:33:54 GMT
See <https://builds.apache.org/job/HBase-1.1-JDK7/1709/changes>

Changes:

[tedyu] HBASE-15742 Reduce allocation of objects in metrics (Phil Yang)

------------------------------------------
[...truncated 4191 lines...]
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint.testNonHfileFolder » OutOfMemory ...
  Run 2: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testNonHfileFolder:389->TestLoadIncrementalHFiles.testNonHfileFolder:422
» TableNotFound
  Run 3: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testNonHfileFolder:389->TestLoadIncrementalHFiles.testNonHfileFolder:422
» TableNotFound

  TestSecureLoadIncrementalHFilesSplitRecovery.setupCluster:60 » Runtime java.la...
  TestTableInputFormatScan1>TestTableInputFormatScanBase.setUpBeforeClass:85 » IO
org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:146->TableSnapshotInputFormatTestBase.setupCluster:60
» IO
  Run 2: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:146->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory
  Run 3: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:146->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testRestoreSnapshotDoesNotCreateBackRefLinks(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:114->TableSnapshotInputFormatTestBase.setupCluster:60
» IO
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:114->TableSnapshotInputFormatTestBase.setupCluster:60
» IO
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:114->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:108->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:108->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» IO
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:108->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:102->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» IO
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:102->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» IO
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:102->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceSingleRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:97->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:97->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:97->TableSnapshotInputFormatTestBase.testWithMapReduce:157->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMockedMapReduceMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceMultiRegion:92->testWithMockedMapReduce:187->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory
  Run 2: TestTableSnapshotInputFormat.testWithMockedMapReduceMultiRegion » Remote java....
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceMultiRegion:92->testWithMockedMapReduce:187->TableSnapshotInputFormatTestBase.setupCluster:60
» IO

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMockedMapReduceSingleRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:87->testWithMockedMapReduce:187->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:87->testWithMockedMapReduce:187->TableSnapshotInputFormatTestBase.setupCluster:60
» OutOfMemory
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:87->testWithMockedMapReduce:187->TableSnapshotInputFormatTestBase.setupCluster:60
» IO

  TestPerColumnFamilyFlush.testCompareStoreFileCount:600->doPut:558 » Runtime ja...
  TestPerColumnFamilyFlush.testFlushingWhenLogRolling:466 » IO Shutting down
  TestPerColumnFamilyFlush.testLogReplayWithDistributedLogSplit:430->doTestLogReplay:341
» IO
  TestPerColumnFamilyFlush.testLogReplayWithDistributedReplay:423->doTestLogReplay:341
» IllegalState
  TestFSHLog.setUpBeforeClass:126 » OutOfMemory unable to create new native thre...
  TestWALReplay.setUpBeforeClass:140 » OutOfMemory unable to create new native t...
  TestWALReplayCompressed.setUpBeforeClass:34->TestWALReplay.setUpBeforeClass:140 » IO
  TestZKInterProcessReadWriteLock.beforeAllTests:82 » ConnectionLoss KeeperError...
  TestZKInterProcessReadWriteLock.testMultipleClients » OutOfMemory unable to cr...
Flaked tests: 
org.apache.hadoop.hbase.mapreduce.TestCopyTable.testCopyTableWithBulkload(org.apache.hadoop.hbase.mapreduce.TestCopyTable)
  Run 1: TestCopyTable.testCopyTableWithBulkload:131->doCopyTableTest:95 » IO Unable to...
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testNonHfileFolderWithUnmatchedFamilyName(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint.testNonHfileFolderWithUnmatchedFamilyName
» OutOfMemory
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testNonexistentColumnFamilyLoad(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testNonexistentColumnFamilyLoad:376
Incorrect exception message, expected message: [Unmatched family names found], current message:
[All datanodes 127.0.0.1:46214 are bad. Aborting...]
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testRegionCrossingHFileSplitRowColBloom(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint.testRegionCrossingHFileSplitRowColBloom
» Remote
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testRegionCrossingLoad(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testRegionCrossingLoad:129->TestLoadIncrementalHFiles.runTest:228->TestLoadIncrementalHFiles.runTest:242->TestLoadIncrementalHFiles.runTest:248->TestLoadIncrementalHFiles.runTest:275
» TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testRegionCrossingRowBloom(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint.testRegionCrossingRowBloom » Remote
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testSimpleHFileSplit(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint.testSimpleHFileSplit » Remote Fil...
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testSimpleLoad(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint.testSimpleLoad » Remote unable to...
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testSplitStoreFileWithNoneToNone(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint.testSplitStoreFileWithNoneToNone »
Remote
  Run 2: PASS

org.apache.hadoop.hbase.namespace.TestNamespaceAuditor.testRegionMerge(org.apache.hadoop.hbase.namespace.TestNamespaceAuditor)
  Run 1: TestNamespaceAuditor.testRegionMerge » Remote java.lang.OutOfMemoryError: unab...
  Run 2: PASS

org.apache.hadoop.hbase.namespace.TestNamespaceAuditor.testRegionOperations(org.apache.hadoop.hbase.namespace.TestNamespaceAuditor)
  Run 1: TestNamespaceAuditor.testRegionOperations:454 » Runtime java.lang.OutOfMemoryE...
  Run 2: PASS

org.apache.hadoop.hbase.namespace.TestNamespaceAuditor.testStatePreserve(org.apache.hadoop.hbase.namespace.TestNamespaceAuditor)
  Run 1: TestNamespaceAuditor.testStatePreserve:604 » IO java.util.concurrent.Execution...
  Run 2: TestNamespaceAuditor.testStatePreserve » Remote unable to create new native th...
  Run 3: PASS

org.apache.hadoop.hbase.regionserver.TestRecoveredEdits.testReplayWorksThoughLotsOfFlushing(org.apache.hadoop.hbase.regionserver.TestRecoveredEdits)
  Run 1: TestRecoveredEdits.testReplayWorksThoughLotsOfFlushing:134->verifyAllEditsMadeItIn:179
» TestTimedOut
  Run 2: TestRecoveredEdits.testReplayWorksThoughLotsOfFlushing:134->verifyAllEditsMadeItIn:179
» TestTimedOut
  Run 3: PASS

org.apache.hadoop.hbase.regionserver.TestTags.testFlushAndCompactionwithCombinations(org.apache.hadoop.hbase.regionserver.TestTags)
  Run 1: TestTags.testFlushAndCompactionwithCombinations » Remote unable to create new ...
  Run 2: PASS

org.apache.hadoop.hbase.regionserver.TestTags.testTags(org.apache.hadoop.hbase.regionserver.TestTags)
  Run 1: TestTags.testTags:119 » IO java.util.concurrent.ExecutionException: java.io.IO...
  Run 2: PASS

org.apache.hadoop.hbase.regionserver.compactions.TestCompactionWithThroughputController.testCompaction(org.apache.hadoop.hbase.regionserver.compactions.TestCompactionWithThroughputController)
  Run 1: TestCompactionWithThroughputController.testCompaction:163->testCompactionWithThroughputLimit:119->prepareData:95
» Runtime
  Run 2: PASS

org.apache.hadoop.hbase.regionserver.compactions.TestCompactionWithThroughputController.testGetCompactionPressureForStripedStore(org.apache.hadoop.hbase.regionserver.compactions.TestCompactionWithThroughputController)
  Run 1: TestCompactionWithThroughputController.testGetCompactionPressureForStripedStore:259
» Runtime
  Run 2: PASS

org.apache.hadoop.hbase.regionserver.wal.TestSecureWALReplay.testCompactedBulkLoadedFiles(org.apache.hadoop.hbase.regionserver.wal.TestSecureWALReplay)
  Run 1: TestSecureWALReplay>TestWALReplay.testCompactedBulkLoadedFiles:457->TestWALReplay.access$000:117->TestWALReplay.runWALSplit:1153
» OutOfMemory
  Run 2: PASS


Tests run: 1474, Failures: 1, Errors: 24, Skipped: 5, Flakes: 18
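
Most of the errors above trace back to TableSnapshotInputFormatTestBase.setupCluster:60 failing with OutOfMemoryError ("unable to create new native thread") or IO errors while the mini cluster starts. As a hedged local follow-up (not part of this build's output, and assuming a plain surefire invocation from the source root; the exact goals and profiles this job passes are not shown in the truncated log), one of the affected classes can be re-run in isolation:

  # Re-run a single failed test class in the hbase-server module
  mvn test -pl hbase-server -Dtest=TestTableSnapshotInputFormat
  # Re-run one of the flaky bulk-load classes with surefire's rerun support
  # (the "Run 1/Run 2/Run 3" entries above come from rerunFailingTestsCount)
  mvn test -pl hbase-server -Dtest=TestLoadIncrementalHFilesUseSecurityEndPoint -Dsurefire.rerunFailingTestsCount=2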

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [7.213s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [1.536s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.473s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [2.524s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [21.268s]
[INFO] Apache HBase - Common ............................. SUCCESS [2:09.044s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [3:58.013s]
[INFO] Apache HBase - Client ............................. SUCCESS [2:05.424s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [11.425s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [17.181s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [20.546s]
[INFO] Apache HBase - Server ............................. FAILURE [2:19:19.009s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:28:57.330s
[INFO] Finished at: Wed May 04 08:31:12 UTC 2016
[INFO] Final Memory: 300M/533M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: java.lang.RuntimeException: There was an error in the forked process
[ERROR] java.lang.ArrayIndexOutOfBoundsException: 1
[ERROR] at org.apache.maven.surefire.common.junit4.JUnit4ProviderUtil.generateFailingTests(JUnit4ProviderUtil.java:64)
[ERROR] at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:151)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:203)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:155)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
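
A minimal sketch of the suggested resume, run from the workspace root and assuming "clean test" as the goals (the goals actually passed to this job are not shown in the truncated log, so substitute your own for <goals>):

  # Resume the reactor from the failed module, as Maven suggests above
  mvn clean test -rf :hbase-server
  # Add -e for the full stack trace or -X for debug logging, per the [ERROR] hints above
  mvn clean test -rf :hbase-server -e
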
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : # Post-build task script. TODO: Check this in and have all builds reference check-in.
pwd && ls
# NOTE!!!! The below code has been copied and pasted from ./dev-tools/run-test.sh
# Do not change here without syncing there and vice-versa.
ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
 echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
 sleep 30
 ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
 if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
   echo " {color:red}There appear to be $ZOMBIE_TESTS_COUNT zombie tests{color}, they should
have been killed by surefire but survived"
   jps -v | grep surefirebooter | grep -e '-Dhbase.test'
   jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
   # Exit with error
   exit 1
 else
   echo "We're ok: there is no zombie test, but some tests took some time to stop"
 fi
else
  echo "We're ok: there is no zombie test"
fi
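
For reference, the check above condenses to a pipeline that can be run by hand on a build slave; a sketch assuming jps and jstack from the same JDK are on the PATH (this is not part of the Jenkins script, just the same filter spelled out):

  # Count leftover surefire fork JVMs launched with -Dhbase.test
  jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l
  # Dump thread stacks of any survivors to see where they are stuck
  jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
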
[HBase-1.1-JDK7] $ /bin/bash -xe /tmp/hudson7457043399166959306.sh
+ pwd
<https://builds.apache.org/job/HBase-1.1-JDK7/ws/>
+ ls
bin
CHANGES.txt
conf
dev-support
hbase-annotations
hbase-assembly
hbase-checkstyle
hbase-client
hbase-common
hbase-examples
hbase-hadoop2-compat
hbase-hadoop-compat
hbase-it
hbase-native-client
hbase-prefix-tree
hbase-procedure
hbase-protocol
hbase-resource-bundle
hbase-rest
hbase-server
hbase-shaded
hbase-shell
hbase-testing-util
hbase-thrift
LICENSE.txt
NOTICE.txt
pom.xml
README.txt
src
target
++ jps -v
++ grep surefirebooter
++ wc -l
++ grep -e -Dhbase.test
+ ZOMBIE_TESTS_COUNT=0
+ [[ 0 != 0 ]]
+ echo 'We'\''re ok: there is no zombie test'
We're ok: there is no zombie test
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
Updating HBASE-15742
