hadoop-mapreduce-dev mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Hadoop-Mapreduce-trunk-Java8 - Build # 855 - Still Failing
Date Thu, 07 Jan 2016 01:43:35 GMT
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/855/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10980 lines...]
Running org.apache.hadoop.mapred.pipes.TestPipes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.03 sec - in org.apache.hadoop.mapred.pipes.TestPipes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.173 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.936 sec - in org.apache.hadoop.mapred.TestReporter

Results :

Failed tests: 
  TestNetworkedJob.testNetworkedJob:174 expected:<[[Thu Jan 07 00:50:42 +0000 2016] Application is Activated, waiting for resources to be assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>

Tests in error: 
  TestJobName>ClusterMapReduceTestCase.tearDown:143->ClusterMapReduceTestCase.stopCluster:128 » NoClassDefFound
  TestJobName>ClusterMapReduceTestCase.setUp:56->ClusterMapReduceTestCase.startCluster:87 » YarnRuntime

Tests run: 515, Failures: 1, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.306 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.931 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.184 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:17 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:55 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-01-07T01:43:28+00:00
[INFO] Final Memory: 38M/144M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter6491762453151308964.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire7947797009002537192tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2747753843000009429056tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestJobName.testComplexNameWithRegex

Error Message:
org/apache/hadoop/util/ShutdownThreadsHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownThreadsHelper
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:666)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:171)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:257)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.stopCluster(ClusterMapReduceTestCase.java:128)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.tearDown(ClusterMapReduceTestCase.java:143)
	at junit.framework.TestCase.runBare(TestCase.java:146)
	at junit.framework.TestResult$1.protect(TestResult.java:122)
	at junit.framework.TestResult.runProtected(TestResult.java:142)
	at junit.framework.TestResult.run(TestResult.java:125)
	at junit.framework.TestCase.run(TestCase.java:129)
	at junit.framework.TestSuite.runTest(TestSuite.java:255)
	at junit.framework.TestSuite.run(TestSuite.java:250)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.ShutdownThreadsHelper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:666)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:171)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:257)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.stopCluster(ClusterMapReduceTestCase.java:128)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.tearDown(ClusterMapReduceTestCase.java:143)
	at junit.framework.TestCase.runBare(TestCase.java:146)
	at junit.framework.TestResult$1.protect(TestResult.java:122)
	at junit.framework.TestResult.runProtected(TestResult.java:142)
	at junit.framework.TestResult.run(TestResult.java:125)
	at junit.framework.TestCase.run(TestCase.java:129)
	at junit.framework.TestSuite.runTest(TestSuite.java:255)
	at junit.framework.TestSuite.run(TestSuite.java:250)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
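
Editor's note: the NoClassDefFoundError above fires the first time HistoryFileManager.serviceStop tries to load org.apache.hadoop.util.ShutdownThreadsHelper, which usually points at a stale or incomplete hadoop-common artifact on the forked test JVM's classpath rather than at the test logic itself. A minimal stand-alone sketch (not part of the Hadoop test suite; the class name is made up) that would surface the same classpath problem eagerly instead of during tearDown:

    // Hypothetical diagnostic, not Hadoop code: resolve the class up front so a
    // broken classpath fails fast rather than masking the real failure in tearDown.
    public class ClasspathSanityCheck {
        public static void main(String[] args) {
            try {
                Class.forName("org.apache.hadoop.util.ShutdownThreadsHelper");
                System.out.println("hadoop-common classes are resolvable");
            } catch (ClassNotFoundException e) {
                System.err.println("hadoop-common jar missing or stale on the classpath: " + e);
            }
        }
    }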


FAILED:  org.apache.hadoop.mapred.TestJobName.testComplexName

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
	at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:153)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at junit.framework.TestCase.runBare(TestCase.java:139)
	at junit.framework.TestResult$1.protect(TestResult.java:122)
	at junit.framework.TestResult.runProtected(TestResult.java:142)
	at junit.framework.TestResult.run(TestResult.java:125)
	at junit.framework.TestCase.run(TestCase.java:129)
	at junit.framework.TestSuite.runTest(TestSuite.java:255)
	at junit.framework.TestSuite.run(TestSuite.java:250)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1705)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:149)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at junit.framework.TestCase.runBare(TestCase.java:139)
	at junit.framework.TestResult$1.protect(TestResult.java:122)
	at junit.framework.TestResult.runProtected(TestResult.java:142)
	at junit.framework.TestResult.run(TestResult.java:125)
	at junit.framework.TestCase.run(TestCase.java:129)
	at junit.framework.TestSuite.runTest(TestSuite.java:255)
	at junit.framework.TestSuite.run(TestSuite.java:250)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
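
Editor's note: this YarnRuntimeException looks like a symptom of the same classpath problem. fs.AbstractFileSystem.file.impl=null means core-default.xml (which normally binds the "file" scheme to org.apache.hadoop.fs.local.LocalFs) was not loaded, so MiniYARNCluster could not obtain a local FileContext. A minimal sketch, assuming hadoop-common is on the classpath, of supplying that binding explicitly; it is illustrative only, not a proposed change to the test:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileContext;

    // Illustrative sketch: bind the "file" scheme by hand when core-default.xml is
    // not picked up, then resolve the same local FileContext that the
    // MiniYARNCluster constructor failed on in the stack trace above.
    public class LocalFsBinding {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.AbstractFileSystem.file.impl",
                     "org.apache.hadoop.fs.local.LocalFs");
            FileContext fc = FileContext.getLocalFSFileContext(conf);
            System.out.println("local FileContext resolved, cwd = " + fc.getWorkingDirectory());
        }
    }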


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob

Error Message:
expected:<[[Thu Jan 07 00:50:42 +0000 2016] Application is Activated, waiting for resources to be assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>

Stack Trace:
org.junit.ComparisonFailure: expected:<[[Thu Jan 07 00:50:42 +0000 2016] Application is Activated, waiting for resources to be assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>
	at org.junit.Assert.assertEquals(Assert.java:115)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob(TestNetworkedJob.java:174)
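
Editor's note: unlike the two failures above, this one is an assertion mismatch. The test expected the scheduler's "Application is Activated, waiting for resources..." diagnostics at TestNetworkedJob.java:174 but read an empty string, which looks like a timing race between job submission and the ResourceManager recording the diagnostics. A sketch of a bounded poll that would make such a check less timing-sensitive; the Supplier stands in for whatever call the test uses to fetch diagnostics, and none of this is the actual test code:

    import java.util.concurrent.TimeUnit;
    import java.util.function.Supplier;

    // Illustrative sketch: poll a diagnostics source until it is non-empty or a
    // deadline passes, instead of asserting against a single immediate read.
    public class DiagnosticsPoll {
        static String awaitNonEmpty(Supplier<String> source, long timeoutMillis)
                throws InterruptedException {
            long deadline = System.currentTimeMillis() + timeoutMillis;
            String value = source.get();
            while (value.isEmpty() && System.currentTimeMillis() < deadline) {
                TimeUnit.MILLISECONDS.sleep(500);
                value = source.get();
            }
            return value;
        }

        public static void main(String[] args) throws InterruptedException {
            // Stand-in source; a real test would wrap its JobClient/RM report call here.
            Supplier<String> source = () -> "[sample] Application is Activated ...";
            System.out.println(awaitNonEmpty(source, 30_000L));
        }
    }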


