flink-issues mailing list archives

From "Aljoscha Krettek (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (FLINK-403) Support for Hadoop Types in Java/Scala
Date Mon, 22 Sep 2014 13:40:33 GMT

     [ https://issues.apache.org/jira/browse/FLINK-403?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aljoscha Krettek resolved FLINK-403.
------------------------------------
    Resolution: Fixed

Scala also supports them after the merge of the scala-rework branch.
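
For illustration only (not part of the original message): a minimal sketch of what this resolution enables, a job that uses plain Hadoop Writable types as element types in the Scala API. It assumes the post-rename org.apache.flink.api.scala package and Hadoop's Text/IntWritable on the classpath; the example object name is made up.

    import org.apache.flink.api.scala._
    import org.apache.hadoop.io.{IntWritable, Text}

    // Hypothetical example object, not from the issue or the codebase.
    object HadoopTypesExample {
      def main(args: Array[String]): Unit = {
        val env = ExecutionEnvironment.getExecutionEnvironment

        // Plain Hadoop Writables used directly as element types,
        // without wrapping them in Value types.
        val words = env.fromElements(new Text("to"), new Text("be"), new Text("to"))

        // Pair each word with a Writable count; the Scala type analysis
        // has to accept Writable subclasses for this to work.
        val counts = words.map(w => (w, new IntWritable(1)))

        counts.print()
      }
    }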

> Support for Hadoop Types in Java/Scala
> --------------------------------------
>
>                 Key: FLINK-403
>                 URL: https://issues.apache.org/jira/browse/FLINK-403
>             Project: Flink
>          Issue Type: Sub-task
>            Reporter: GitHub Import
>              Labels: github-import
>             Fix For: pre-apache
>
>
> It would be very valuable if users could use plain Hadoop objects as types in the Java and Scala API.
> Since they are serialized and deserialized by a mechanism very similar to the Stratosphere Value types, this should be easy to add.
> The following is necessary:
>   - Add setField(...) and getField(...) methods to the Record for subclasses of Hadoop's Writable. That should do it for the Java API.
>   - The Scala API needs an additional check in the type generator so that subclasses of Writable are also accepted, not only subclasses of Value.
> ---------------- Imported from GitHub ----------------
> Url: https://github.com/stratosphere/stratosphere/issues/403
> Created by: [StephanEwen|https://github.com/StephanEwen]
> Labels: core, user satisfaction
> Created at: Thu Jan 16 14:20:02 CET 2014
> State: open
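
For illustration of the check described in the quoted issue's second bullet, a hedged sketch (not the actual type-generator code; the helper name is hypothetical, and the Value import location is assumed): the rule boils down to accepting Writable subclasses alongside Value subclasses.

    import org.apache.hadoop.io.Writable
    import org.apache.flink.types.Value

    // Hypothetical helper, not the real type generator: illustrates the
    // extra acceptance rule the issue asks for.
    object FieldTypeCheck {
      def isSupportedFieldType(clazz: Class[_]): Boolean =
        classOf[Value].isAssignableFrom(clazz) ||      // existing rule: Value subclasses
          classOf[Writable].isAssignableFrom(clazz)    // requested rule: Writable subclasses
    }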



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
