spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Measuring Performance in Spark
Date Tue, 28 Oct 2014 06:18:19 GMT
One approach would be to write equivalent pure MapReduce and Spark jobs
(e.g. word count, filter, join, groupBy) and benchmark them against each
other; a rough timing sketch is below. Another would be to pick something
that runs on top of MapReduce/Spark and benchmark at that level (e.g.
benchmark Hive against Spark SQL).
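
For the first approach, something like the following (a minimal sketch
using the Spark 1.x Scala API; the app name and input path are
placeholders, not from the original thread) times a word-count job end
to end:

import org.apache.spark.{SparkConf, SparkContext}

object WordCountBench {
  def main(args: Array[String]): Unit = {
    // App name and input path are placeholders -- point them at your data.
    val sc = new SparkContext(new SparkConf().setAppName("WordCountBench"))

    val start = System.nanoTime()
    val counts = sc.textFile("hdfs:///tmp/input.txt") // hypothetical path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1L))
      .reduceByKey(_ + _)
    counts.count() // transformations are lazy; force the job to actually run
    val elapsedMs = (System.nanoTime() - start) / 1e6

    println(s"Word count finished in $elapsedMs ms")
    sc.stop()
  }
}

The equivalent MapReduce word count can be timed the same way, wrapping
System.nanoTime() around job.waitForCompletion(true).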

Thanks
Best Regards

On Mon, Oct 27, 2014 at 10:52 PM, mahsa <mahsa.hanifi@gmail.com> wrote:

> Hi,
> I want to test the performance of MapReduce and Spark on a program: find
> the bottlenecks, calculate the performance of each part of the program,
> and so on. I was wondering if there is a monitoring tool, like Ganglia,
> that could help me in this regard.
> Thanks!
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Measuring-Performance-in-Spark-tp17376.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
