flink-issues mailing list archives

From "Robert Metzger (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-1069) "flink-hadoop-compatibility" fails to build with Hadoop 2.5.0 dependencies
Date Wed, 17 Sep 2014 15:17:34 GMT

    [ https://issues.apache.org/jira/browse/FLINK-1069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14137376#comment-14137376 ]

Robert Metzger commented on FLINK-1069:
---------------------------------------

Ok, I'm assigning myself to the issue. I'll look into the shading of Guava.
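For context, the {{IllegalAccessError}} below points to a Guava version conflict: Hadoop 2.5.0's {{FileInputFormat}} was compiled against an older Guava in which the {{Stopwatch}} constructor was public, while a newer Guava on Flink's classpath makes it package-private. A minimal sketch of what shading Guava with the maven-shade-plugin could look like (the plugin version and the shaded package name are illustrative assumptions, not the actual Flink build configuration):

```xml
<!-- Hypothetical sketch: relocate Guava classes bundled with Flink so that
     Hadoop resolves its own Guava version at runtime instead of the
     (incompatible) one on Flink's classpath. The shadedPattern below is an
     assumed name, not taken from the Flink pom. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>org.apache.flink.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With the relocation in place, Flink's own bytecode references the renamed package, so the unshaded {{com.google.common.base.Stopwatch}} seen by Hadoop comes solely from Hadoop's own Guava dependency.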

> "flink-hadoop-compatibility" fails to build with Hadoop 2.5.0 dependencies
> --------------------------------------------------------------------------
>
>                 Key: FLINK-1069
>                 URL: https://issues.apache.org/jira/browse/FLINK-1069
>             Project: Flink
>          Issue Type: Bug
>    Affects Versions: 0.6-incubating, 0.7-incubating
>            Reporter: Robert Metzger
>
> {{mvn clean verify  -Dhadoop.profile=2 -Dhadoop.version=2.5.0}} fails with
> {code}
> org.apache.flink.runtime.client.JobExecutionException: java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.mapreduce.lib.input.FileInputFormat
> 	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:379)
> 	at org.apache.flink.hadoopcompatibility.mapreduce.HadoopInputFormat.createInputSplits(HadoopInputFormat.java:156)
> 	at org.apache.flink.hadoopcompatibility.mapreduce.HadoopInputFormat.createInputSplits(HadoopInputFormat.java:53)
> 	at org.apache.flink.runtime.jobgraph.JobInputVertex.getInputSplits(JobInputVertex.java:101)
> 	at org.apache.flink.runtime.executiongraph.ExecutionGraph.createVertex(ExecutionGraph.java:495)
> 	at org.apache.flink.runtime.executiongraph.ExecutionGraph.constructExecutionGraph(ExecutionGraph.java:281)
> 	at org.apache.flink.runtime.executiongraph.ExecutionGraph.<init>(ExecutionGraph.java:177)
> 	at org.apache.flink.runtime.jobmanager.JobManager.submitJob(JobManager.java:469)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.flink.runtime.ipc.RPC$Server.call(RPC.java:422)
> 	at org.apache.flink.runtime.ipc.Server$Handler.run(Server.java:958)
> 	at org.apache.flink.runtime.client.JobClient.submitJobAndWait(JobClient.java:268)
> 	at org.apache.flink.test.util.JavaProgramTestBase$TestEnvironment.execute(JavaProgramTestBase.java:148)
> 	at org.apache.flink.hadoopcompatibility.mapreduce.example.WordCount.main(WordCount.java:86)
> 	at org.apache.flink.test.hadoopcompatibility.mapreduce.HadoopInputOutputITCase.testProgram(HadoopInputOutputITCase.java:44)
> 	at org.apache.flink.test.util.JavaProgramTestBase.testJob(JavaProgramTestBase.java:100)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> 	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> 	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> 	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> 	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> 	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> 	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> 	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> 	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> 	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> 	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> 	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
