flink-issues mailing list archives

From "morvenhuang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-12163) Hadoop Compatibility, Could not load the TypeInformation due to incorrect classloader
Date Thu, 11 Apr 2019 09:06:00 GMT

    [ https://issues.apache.org/jira/browse/FLINK-12163?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16815228#comment-16815228 ]

morvenhuang commented on FLINK-12163:
-------------------------------------

Discussion: [http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Is-copying-flink-hadoop-compatibility-jar-to-FLINK-HOME-lib-the-only-way-to-make-it-work-td27181.html]

> Hadoop Compatibility, Could not load the TypeInformation due to incorrect classloader
> -------------------------------------------------------------------------------------
>
>                 Key: FLINK-12163
>                 URL: https://issues.apache.org/jira/browse/FLINK-12163
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hadoop Compatibility
>    Affects Versions: 1.7.2
>         Environment: Flink 1.5.6 standalone, Flink 1.7.2 standalone, 
> Hadoop 2.9.1 standalone
>            Reporter: morvenhuang
>            Priority: Critical
>
> For Flink 1.5.6 and 1.7.2, I keep getting the following error when using Hadoop Compatibility:
> {code:java}
> Caused by: java.lang.RuntimeException: Could not load the TypeInformation for the class 'org.apache.hadoop.io.Writable'. You may be missing the 'flink-hadoop-compatibility' dependency.
> at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2140)
> at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1759)
> at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1701)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:956)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createSubTypesInfo(TypeExtractor.java:1176)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:889)
> at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:839)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:805)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:798)
> at org.apache.flink.api.common.typeinfo.TypeHint.<init>(TypeHint.java:50)
> {code}
> Packaging the flink-hadoop-compatibility dependency with my code into a fat jar doesn't help.
> The error won't go away until I copy the flink-hadoop-compatibility jar to FLINK_HOME/lib.
> Looking into TypeExtractor#createHadoopWritableTypeInfo, this seems to be a classloader issue:
> {code:java}
> Class<?> typeInfoClass;
> try {
>     // Resolves against the classloader that loaded TypeExtractor (Flink's lib/
>     // classloader), not the user-code classloader that sees the fat jar.
>     typeInfoClass = Class.forName(HADOOP_WRITABLE_TYPEINFO_CLASS, false, TypeExtractor.class.getClassLoader());
> }
> catch (ClassNotFoundException e) {
>     throw new RuntimeException("Could not load the TypeInformation for the class '"
>         + HADOOP_WRITABLE_CLASS + "'. You may be missing the 'flink-hadoop-compatibility' dependency.");
> }
> {code}
>  
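The failure mode quoted above can be reproduced outside Flink. In the sketch below, a parentless URLClassLoader with no URLs of its own stands in for Flink's lib/ classloader, and the application classloader stands in for the user-code classloader that can see the fat jar; the class name and loaders are illustrative, not Flink's actual setup.

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ClassLoaderDemo {
    public static void main(String[] args) throws Exception {
        // Stands in for the Hadoop TypeInformation class shipped in the fat jar.
        String className = "ClassLoaderDemo";

        // Loading via the application classloader succeeds, because this loader
        // can see everything on the application classpath.
        Class<?> ok = Class.forName(className, false,
                ClassLoaderDemo.class.getClassLoader());
        System.out.println("loaded via app classloader: " + ok.getName());

        // A loader with no URLs and the bootstrap loader as parent cannot see
        // application classes -- analogous to resolving against
        // TypeExtractor.class.getClassLoader() when the dependency is only
        // inside the user's fat jar.
        ClassLoader restricted = new URLClassLoader(new URL[0], null);
        try {
            Class.forName(className, false, restricted);
            System.out.println("unexpectedly loaded");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException from restricted loader");
        }
    }
}
```

This is consistent with the workaround in the report: placing the jar in FLINK_HOME/lib puts it on the classpath that TypeExtractor's own classloader can see.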



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
