flink-issues mailing list archives

From "Till Rohrmann (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-1979) Implement Loss Functions
Date Wed, 06 May 2015 12:52:01 GMT

    [ https://issues.apache.org/jira/browse/FLINK-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14530471#comment-14530471 ]

Till Rohrmann commented on FLINK-1979:

Hi [~jguenther], great to hear that you want to pick this topic up. We're currently developing
an optimization framework which could benefit tremendously from more loss functions. See FLINK-1889
and FLINK-1807 for more details.

The corresponding pull request with the current state is [https://github.com/apache/flink/pull/613].
We hope to merge it in the next few days. You'll find the interface for the loss functions
in the file LossFunction.scala. There is also an implementation of the squared loss function.

Be aware that the prediction functions will still change a little bit, but they will more or
less have the following interface:

trait PredictionFunction {
  def predict(x: Vector, weights: WeightVector): Double
  def gradient(x: Vector, weights: WeightVector): Vector
}
So in order to implement new loss functions you simply have to implement the abstract methods.
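As a rough, self-contained sketch of how this could fit together: the Vector and WeightVector types below are simplified stand-ins for Flink ML's actual types, LinearPrediction is a hypothetical implementation of the interface above, and SquaredLoss mirrors the squared loss mentioned in the pull request. The exact LossFunction interface lives in the PR and may differ once it is merged.

```scala
object LossFunctionSketch {
  // Simplified stand-ins for Flink ML's Vector and WeightVector types
  type Vector = Array[Double]
  case class WeightVector(weights: Vector, intercept: Double)

  trait PredictionFunction {
    def predict(x: Vector, weights: WeightVector): Double
    def gradient(x: Vector, weights: WeightVector): Vector
  }

  // Hypothetical linear model: prediction = w . x + b;
  // its gradient with respect to the weights is simply x
  object LinearPrediction extends PredictionFunction {
    def predict(x: Vector, wv: WeightVector): Double =
      x.zip(wv.weights).map { case (a, b) => a * b }.sum + wv.intercept
    def gradient(x: Vector, wv: WeightVector): Vector = x
  }

  // Squared loss: L(p, y) = 0.5 * (p - y)^2, derivative (p - y)
  object SquaredLoss {
    def loss(prediction: Double, label: Double): Double =
      0.5 * math.pow(prediction - label, 2)
    def lossDerivative(prediction: Double, label: Double): Double =
      prediction - label
  }

  def main(args: Array[String]): Unit = {
    val wv = WeightVector(Array(1.0, 2.0), 0.5)
    val x: Vector = Array(3.0, 4.0)
    val p = LinearPrediction.predict(x, wv) // 1*3 + 2*4 + 0.5 = 11.5
    println(SquaredLoss.loss(p, 12.0))      // 0.5 * (-0.5)^2 = 0.125
  }
}
```

A new loss function would then only need to supply its own loss and lossDerivative, reusing the prediction function's predict and gradient for the chain rule inside the optimizer.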

Shall I assign this issue to you, Johannes?

> Implement Loss Functions
> ------------------------
>                 Key: FLINK-1979
>                 URL: https://issues.apache.org/jira/browse/FLINK-1979
>             Project: Flink
>          Issue Type: Improvement
>          Components: Machine Learning Library
>            Reporter: Johannes Günther
>            Priority: Minor
>              Labels: ML
> For convex optimization problems, optimizer methods like SGD rely on a pluggable implementation
of a loss function and its first derivative.

This message was sent by Atlassian JIRA
