spark-user mailing list archives

From Xiangrui Meng <men...@gmail.com>
Subject Re: Setting a custom loss function for GradientDescent
Date Mon, 30 Mar 2015 21:08:07 GMT
You can extend Gradient, e.g.,
https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/optimization/Gradient.scala#L266,
and then plug your implementation into GradientDescent:
https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/optimization/GradientDescent.scala#L149.
Please note that this is a developer API. -Xiangrui
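
A minimal sketch of what this could look like, assuming a plain squared-error loss L(w) = (1/2)(w·x - y)^2 for a linear model (the class name `SquaredLossGradient` and the hard-coded hyperparameters below are illustrative, not part of MLlib):

```scala
import org.apache.spark.mllib.linalg.{Vector, Vectors}
import org.apache.spark.mllib.optimization.{Gradient, GradientDescent, SimpleUpdater}

// Hypothetical custom loss: L(w) = 0.5 * (w·x - y)^2,
// whose gradient w.r.t. w is (w·x - y) * x.
class SquaredLossGradient extends Gradient {

  override def compute(data: Vector, label: Double, weights: Vector): (Vector, Double) = {
    val margin = data.toArray.zip(weights.toArray).map { case (x, w) => x * w }.sum
    val diff = margin - label
    val grad = Vectors.dense(data.toArray.map(_ * diff))
    (grad, 0.5 * diff * diff)
  }

  override def compute(
      data: Vector,
      label: Double,
      weights: Vector,
      cumGradient: Vector): Double = {
    // Naive accumulation for illustration; real implementations use BLAS.axpy,
    // which is private to MLlib.
    val (grad, loss) = compute(data, label, weights)
    val cum = cumGradient.toArray // backing array for a DenseVector
    grad.toArray.zipWithIndex.foreach { case (g, i) => cum(i) += g }
    loss
  }
}

// Usage sketch, assuming `data: RDD[(Double, Vector)]` of (label, features) pairs:
// val (weights, lossHistory) = GradientDescent.runMiniBatchSGD(
//   data, new SquaredLossGradient, new SimpleUpdater,
//   0.1,   // stepSize
//   100,   // numIterations
//   0.0,   // regParam
//   1.0,   // miniBatchFraction
//   Vectors.dense(0.0, 0.0)) // initialWeights
```

Alternatively, since the optimizer of an algorithm such as LinearRegressionWithSGD is a GradientDescent instance, you can call setGradient on it directly, e.g. `new LinearRegressionWithSGD().optimizer.setGradient(new SquaredLossGradient)`.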

On Fri, Mar 27, 2015 at 7:11 PM, shmoanne <jlshin@eng.ucsd.edu> wrote:
> I am working with the mllib.optimization.GradientDescent class, and I'm
> confused about how to set a custom loss function using setGradient.
>
> For instance, if I wanted my loss function to be x^2, how would I go about
> setting it with setGradient?
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Setting-a-custom-loss-function-for-GradientDescent-tp22263.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>

