spark-user mailing list archives

From 牛兆捷 <nzjem...@gmail.com>
Subject Workload for spark testing
Date Sun, 14 Sep 2014 01:23:02 GMT
Hi All:

We know that some of Spark's memory is used for computation (e.g.,
spark.shuffle.memoryFraction) and some is used for caching RDDs for future
reuse (e.g., spark.storage.memoryFraction).

Is there any existing workload that utilizes both of them during its
running life cycle? I want to do some performance testing by adjusting the
ratio between them.
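For reference, a minimal sketch of such a workload — assuming the Spark 1.x static memory model where these two fractions apply — would cache a large RDD (exercising storage memory) and then run a shuffle-heavy aggregation over it (exercising shuffle memory). The object name, dataset size, and fraction values below are illustrative, not from any existing benchmark:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object MixedMemoryWorkload {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("mixed-memory-workload")
      // Ratios to sweep in the experiment (Spark 1.x defaults: 0.2 and 0.6).
      .set("spark.shuffle.memoryFraction", "0.2")
      .set("spark.storage.memoryFraction", "0.6")
    val sc = new SparkContext(conf)

    // Storage memory: cache a large RDD that is reused across jobs.
    val base = sc.parallelize(1 to 10000000)
      .map(i => (i % 1000, i.toLong))
      .persist(StorageLevel.MEMORY_ONLY)
    base.count() // materialize the cache

    // Shuffle memory: wide aggregation over the cached data.
    val sums = base.reduceByKey(_ + _)
    sums.count()

    sc.stop()
  }
}
```

Running this sketch at different fraction settings (keeping their sum bounded) should show the trade-off between cache eviction on the storage side and spilling on the shuffle side.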

Thanks.

-- 
*Regards,*
*Zhaojie*
