flink-issues mailing list archives

From "Fabian Hueske (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-1076) Support function-level compatibility for Hadoop's wrapper functions
Date Fri, 29 Aug 2014 10:21:53 GMT

    [ https://issues.apache.org/jira/browse/FLINK-1076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14115084#comment-14115084 ]

Fabian Hueske commented on FLINK-1076:
--------------------------------------

+1 The goal should be to have simple wrappers for Hadoop Map and Reduce functions, similar
to the wrappers for Hadoop's InputFormat and OutputFormat, which can be mixed into "regular" Flink programs.
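A minimal sketch of the adapter pattern proposed here, using simplified stand-in interfaces (the names `HadoopStyleMapper`, `FlinkStyleFlatMap`, and `HadoopMapWrapper` are hypothetical; the real interfaces live in `org.apache.hadoop.mapred` and `org.apache.flink.api.common.functions`, and the actual wrapper API was still under discussion at this point):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for a Hadoop-style Mapper: emits (key, value)
// pairs into a collector-style list.
interface HadoopStyleMapper<KI, VI, KO, VO> {
    void map(KI key, VI value, List<Map.Entry<KO, VO>> collector);
}

// Hypothetical stand-in for Flink's FlatMapFunction.
interface FlinkStyleFlatMap<IN, OUT> {
    void flatMap(IN value, List<OUT> out);
}

// Sketch of a JobConf-free wrapper: a Hadoop-style mapper becomes an
// ordinary Flink-style function that can be mixed into a program.
class HadoopMapWrapper<KI, VI, KO, VO>
        implements FlinkStyleFlatMap<Map.Entry<KI, VI>, Map.Entry<KO, VO>> {
    private final HadoopStyleMapper<KI, VI, KO, VO> mapper;

    HadoopMapWrapper(HadoopStyleMapper<KI, VI, KO, VO> mapper) {
        this.mapper = mapper;
    }

    @Override
    public void flatMap(Map.Entry<KI, VI> record, List<Map.Entry<KO, VO>> out) {
        // Delegate each record to the wrapped Hadoop-style mapper;
        // no JobConf is needed at this level.
        mapper.map(record.getKey(), record.getValue(), out);
    }
}

public class WrapperSketch {
    public static void main(String[] args) {
        // A word-length mapper written against the Hadoop-style interface.
        HadoopStyleMapper<Long, String, String, Integer> lengths =
                (key, line, out) -> {
                    for (String word : line.split(" ")) {
                        out.add(new SimpleEntry<>(word, word.length()));
                    }
                };

        // Wrapped, it is usable as a plain Flink-style function.
        FlinkStyleFlatMap<Map.Entry<Long, String>, Map.Entry<String, Integer>> fn =
                new HadoopMapWrapper<>(lengths);

        List<Map.Entry<String, Integer>> result = new ArrayList<>();
        fn.flatMap(new SimpleEntry<>(0L, "hello flink"), result);
        System.out.println(result); // [hello=5, flink=5]
    }
}
```

The point of the sketch is that the wrapper carries no configuration of its own, so a user can supply a Mapper (or Reducer) directly as a separate component of a Flink job.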

> Support function-level compatibility for Hadoop's wrapper functions
> ----------------------------------------------------------------------
>
>                 Key: FLINK-1076
>                 URL: https://issues.apache.org/jira/browse/FLINK-1076
>             Project: Flink
>          Issue Type: New Feature
>          Components: Hadoop Compatibility
>    Affects Versions: 0.7-incubating
>            Reporter: Artem Tsikiridis
>            Assignee: Artem Tsikiridis
>              Labels: features
>
> While the Flink wrappers for Hadoop Map and Reduce tasks are implemented in https://github.com/apache/incubator-flink/pull/37
it is currently not possible to use the {{HadoopMapFunction}} and the {{HadoopReduceFunction}}
without a {{JobConf}}. It would be useful if we could specify a Hadoop Mapper, Reducer (or
Combiner) and use them as separate components in a Flink job.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
