flink-issues mailing list archives

From "Fabian Hueske (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (FLINK-12163) Hadoop Compatibility, could not load the TypeInformation due to incorrect classloader
Date Mon, 15 Apr 2019 13:14:00 GMT

     [ https://issues.apache.org/jira/browse/FLINK-12163?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Fabian Hueske reassigned FLINK-12163:
-------------------------------------

    Assignee: Hai Yu

> Hadoop Compatibility, could not load the TypeInformation due to incorrect classloader
> -------------------------------------------------------------------------------------
>
>                 Key: FLINK-12163
>                 URL: https://issues.apache.org/jira/browse/FLINK-12163
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hadoop Compatibility
>    Affects Versions: 1.7.2, 1.8.0
>         Environment: Flink 1.5.6 standalone, Flink 1.7.2 standalone, Hadoop 2.9.1 standalone
>            Reporter: morvenhuang
>            Assignee: Hai Yu
>            Priority: Critical
>
> For Flink 1.5.6 and 1.7.2, I keep getting this error when using Hadoop Compatibility:
> {code:java}
> Caused by: java.lang.RuntimeException: Could not load the TypeInformation for the class 'org.apache.hadoop.io.Writable'. You may be missing the 'flink-hadoop-compatibility' dependency.
>     at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2140)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1759)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1701)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:956)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.createSubTypesInfo(TypeExtractor.java:1176)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:889)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:839)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:805)
>     at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:798)
>     at org.apache.flink.api.common.typeinfo.TypeHint.<init>(TypeHint.java:50)
> {code}
> Packaging the flink-hadoop-compatibility dependency with my code into a fat jar doesn't help.
> The error won't go away until I copy the flink-hadoop-compatibility jar to FLINK_HOME/lib.
> This seems to be a classloader issue, judging from TypeExtractor#createHadoopWritableTypeInfo:
> {code:java}
> Class<?> typeInfoClass;
> try {
>     typeInfoClass = Class.forName(HADOOP_WRITABLE_TYPEINFO_CLASS, false, TypeExtractor.class.getClassLoader());
> }
> catch (ClassNotFoundException e) {
>     throw new RuntimeException("Could not load the TypeInformation for the class '"
>         + HADOOP_WRITABLE_CLASS + "'. You may be missing the 'flink-hadoop-compatibility' dependency.");
> }
> {code}
>  
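The snippet above resolves the class against TypeExtractor's own classloader, which is the cluster's parent classloader and cannot see jars shipped inside the user's fat jar. A minimal sketch of the general fallback pattern, assuming the user-code classloader is reachable as the thread context classloader (which is not confirmed in this ticket for Flink's runtime); the class `ClassLoaderFallbackDemo` and method `loadWithFallback` are hypothetical names for illustration, not Flink API:

```java
// Hypothetical sketch of a classloader fallback: try the declaring class's
// loader first (as TypeExtractor does), then fall back to the thread context
// classloader, which in many frameworks points at the user-code classloader.
public class ClassLoaderFallbackDemo {

    static Class<?> loadWithFallback(String className) throws ClassNotFoundException {
        try {
            // First attempt: the loader that loaded this class (the parent
            // classloader in a parent-first deployment).
            return Class.forName(className, false,
                    ClassLoaderFallbackDemo.class.getClassLoader());
        } catch (ClassNotFoundException e) {
            // Fallback: the context classloader, which may see jars that the
            // parent loader cannot (e.g. classes packaged in a fat jar).
            return Class.forName(className, false,
                    Thread.currentThread().getContextClassLoader());
        }
    }

    public static void main(String[] args) throws Exception {
        // java.util.ArrayList is visible to every classloader, so the lookup
        // succeeds on the first attempt here.
        System.out.println(loadWithFallback("java.util.ArrayList").getName());
    }
}
```

With a fallback like this, a class bundled only in the user's fat jar could still be found even though the resolving class itself was loaded from FLINK_HOME/lib, which matches the reporter's observation that only copying the jar into lib works today.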



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
