spark-user mailing list archives

From Xiangrui Meng <men...@gmail.com>
Subject Re: mllib performance on mesos cluster
Date Mon, 22 Sep 2014 19:36:08 GMT
1) MLlib 1.1 should be faster than 1.0 in general. What's the size of
your dataset? Is the RDD evenly distributed across nodes? You can
check the storage tab of the Spark WebUI.
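Besides the Storage tab, partition balance can be checked programmatically. A minimal sketch: the per-partition record counts below are made up for illustration; in PySpark, real ones could be obtained with `rdd.glom().map(len).collect()` (`glom()` groups each partition's elements into a list).

```python
# Hypothetical per-partition record counts; in PySpark you could obtain
# real ones with: counts = rdd.glom().map(len).collect()
counts = [10_000, 9_800, 10_200, 150, 9_900]

mean = sum(counts) / len(counts)
skew = max(counts) / mean  # ratio of the largest partition to the average
print(f"max/mean partition skew: {skew:.2f}")
```

A skew well above 1.0, or a near-empty partition like the fourth one above, suggests the RDD is not evenly distributed and some executors sit idle while others do most of the work.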

2) I don't have much experience with Mesos deployment. Someone else
may be able to answer your question.

-Xiangrui

On Fri, Sep 19, 2014 at 12:17 PM, SK <skrishna.id@gmail.com> wrote:
> Hi,
>
> I have a program similar to the BinaryClassifier example that I am running
> using my data (which is fairly small). I run this for 100 iterations. I
> observed the following performance:
>
> Standalone mode cluster with 10 nodes (with Spark 1.0.2):  5 minutes
> Standalone mode cluster with 10 nodes (with Spark 1.1.0):  8.9 minutes
> Mesos cluster with 10 nodes (with Spark 1.1.0): 17 minutes
>
> 1) According to the documentation, Spark 1.1.0 has better performance. So I
> would like to understand why the runtime is longer on Spark 1.1.0.
>
> 2) Why is the runtime on Mesos significantly higher than in standalone
> mode? I wanted to find out whether anyone else has observed poor
> performance for MLlib-based programs on a Mesos cluster. I looked through
> the application detail logs and found that some of the scheduler delay
> values were higher on Mesos than in standalone mode (40 ms vs. 10 ms).
> Is the Mesos scheduler slower?
>
> thanks
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/mllib-performance-on-mesos-cluster-tp14692.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
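Putting the reported numbers side by side makes the gap concrete. A small sketch (the times are the ones quoted above; the per-stage scheduler-delay estimate assumes roughly one affected stage per iteration, which is an assumption, not something stated in the thread):

```python
# Reported wall-clock times (minutes) from the thread above
times = {
    "standalone, Spark 1.0.2": 5.0,
    "standalone, Spark 1.1.0": 8.9,
    "Mesos, Spark 1.1.0": 17.0,
}
base = times["standalone, Spark 1.0.2"]
for setup, t in times.items():
    print(f"{setup}: {t / base:.2f}x baseline")

# Back-of-the-envelope check: an extra 30 ms of scheduler delay per stage,
# over ~100 iterations (assumed one stage each), adds only about 3 seconds.
extra_delay_s = 100 * (0.040 - 0.010)
print(f"extra scheduler delay over 100 stages: {extra_delay_s:.1f} s")
```

If that assumption is roughly right, the 40 ms vs. 10 ms scheduler delay accounts for seconds, not the ~8-minute gap between Mesos and standalone mode, so the slowdown likely lies elsewhere (e.g. data locality, executor placement, or resource offers).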

