[ https://issues.apache.org/jira/browse/SPARK-6192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14352459#comment-14352459 ]
Yan Ni edited comment on SPARK-6192 at 3/9/15 2:45 AM:
-------------------------------------------------------
Hello, I am a senior-year undergraduate student and have experience in Python & ML. I am now
interested in distributed platforms like Spark but don't have any hands-on experience. I would
like to take this project as my starting point in Spark. Any advice?
Thanks!
was (Author: leckie-chn):
> Enhance MLlib's Python API (GSoC 2015)
> --------------------------------------
>
> Key: SPARK-6192
> URL: https://issues.apache.org/jira/browse/SPARK-6192
> Project: Spark
> Issue Type: Umbrella
> Components: ML, MLlib, PySpark
> Reporter: Xiangrui Meng
> Assignee: Manoj Kumar
> Labels: gsoc, gsoc2015, mentor
>
> This is an umbrella JIRA for [~MechCoder]'s GSoC 2015 project. The main theme is to enhance
MLlib's Python API, to make it on par with the Scala/Java API. The main tasks are:
> 1. For all models in MLlib, provide save/load methods. This also
> includes save/load in Scala.
> 2. Python API for evaluation metrics.
> 3. Python API for streaming ML algorithms.
> 4. Python API for distributed linear algebra.
> 5. Simplify MLLibPythonAPI using DataFrames. Currently, we use
> customized serialization, which makes MLLibPythonAPI hard to maintain. It
> would be nice to use DataFrames for serialization instead.
> I'll link the JIRAs for each of the tasks.
> Note that this doesn't mean all these JIRAs are pre-assigned to [~MechCoder]. The TODO
list will be dynamic based on the backlog.
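As a toy illustration of task 2 (Python API for evaluation metrics): one metric such an API exposes is area under the ROC curve. The sketch below is a pure-Python, single-machine computation via the rank-statistic (Mann-Whitney U) formulation — it is not MLlib's distributed implementation, and the function name `area_under_roc` is made up for this example.

```python
def area_under_roc(score_label_pairs):
    """AUC via the rank-statistic (Mann-Whitney U) formulation.

    score_label_pairs: iterable of (score, label) with label in {0, 1}.
    """
    pairs = sorted(score_label_pairs, key=lambda p: p[0])
    n = len(pairs)
    rank_sum_pos = 0.0  # sum of tie-averaged, 1-based ranks of positives
    idx = 0
    while idx < n:
        # Find the run of tied scores pairs[idx..j].
        j = idx
        while j + 1 < n and pairs[j + 1][0] == pairs[idx][0]:
            j += 1
        avg_rank = ((idx + 1) + (j + 1)) / 2.0
        for k in range(idx, j + 1):
            if pairs[k][1] == 1:
                rank_sum_pos += avg_rank
        idx = j + 1
    n_pos = sum(1 for _, label in pairs if label == 1)
    n_neg = n - n_pos
    # AUC = fraction of (positive, negative) pairs ranked correctly.
    return (rank_sum_pos - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)

# Three positives, three negatives; one negative (0.7) outscores a positive (0.6).
scores = [(0.9, 1), (0.8, 1), (0.7, 0), (0.6, 1), (0.4, 0), (0.2, 0)]
print(area_under_roc(scores))  # 8/9 ~= 0.8889
```

In MLlib proper, the same quantity is computed over an RDD of (score, label) pairs; the Python API work tracked here wraps that distributed Scala implementation rather than reimplementing the metric.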
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)