spark-user mailing list archives

From Justin Yip <yipjus...@prediction.io>
Subject Implementing custom metrics under MLPipeline's BinaryClassificationEvaluator
Date Mon, 18 May 2015 05:35:21 GMT
Hello,

I would like to use other metrics in BinaryClassificationEvaluator; I am
thinking of simple ones (e.g. PrecisionByThreshold). From the API site,
I can't tell much about how to implement this.

From the code, it seems I will have to override this function, reusing
most of the existing code that checks the column schema, and then replace
the line that computes the actual score
<https://github.com/apache/spark/blob/1b8625f4258d6d1a049d0ba60e39e9757f5a568b/mllib/src/main/scala/org/apache/spark/ml/evaluation/BinaryClassificationEvaluator.scala#L72>
.

Is my understanding correct? Or is there a more convenient way of
implementing a metric so it can be used by an ML pipeline?
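For reference, the metric itself is straightforward to compute. Below is a
minimal pure-Python sketch (not Spark code; the function name is my own) of
precision at a fixed threshold, which is the per-threshold value that
MLlib's BinaryClassificationMetrics.precisionByThreshold produces from its
(score, label) pairs:

```python
# Hypothetical sketch, not Spark code: precision at a fixed threshold,
# computed over plain (score, label) pairs with labels 0.0/1.0,
# mirroring the input that BinaryClassificationMetrics consumes.
def precision_at_threshold(score_and_labels, threshold):
    """Precision among examples whose score >= threshold."""
    predicted_positive = [(s, l) for s, l in score_and_labels if s >= threshold]
    if not predicted_positive:
        return 0.0  # no predicted positives: define precision as 0
    true_positive = sum(1 for _, l in predicted_positive if l == 1.0)
    return true_positive / len(predicted_positive)

pairs = [(0.9, 1.0), (0.8, 0.0), (0.3, 1.0)]
print(precision_at_threshold(pairs, 0.5))  # 0.5: two predicted positives, one correct
```

A custom evaluator would compute something like this over the scoreAndLabels
extracted from the DataFrame, in place of the areaUnderROC call at the
linked line.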

Thanks.

Justin



