spark-user mailing list archives

From ZHANG Wei <wezh...@outlook.com>
Subject Re: [Spark Core]: Does an executor only cache the partitions it requires for its computations or always the full RDD?
Date Thu, 16 Apr 2020 09:31:50 GMT
As far as I know, if you are talking about RDD.cache(), the answer is that each executor caches
only the partitions it computes for its own tasks, not the full RDD. Caching is lazy: a partition
is materialized in the executor's storage memory the first time an action computes it.
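
A minimal sketch of how to observe this yourself (assuming local mode; the object and app
names are just for illustration). Mark the RDD with cache(), run an action that only touches
some partitions, then inspect SparkContext.getRDDStorageInfo to see how many partitions
were actually cached:

```scala
import org.apache.spark.sql.SparkSession

object CachePartitionsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cache-partitions-sketch")
      .master("local[2]")
      .getOrCreate()
    val sc = spark.sparkContext

    val rdd = sc.parallelize(1 to 100, numSlices = 10)
    rdd.cache() // only marks the RDD for caching; nothing is materialized yet

    // take(5) typically computes just the first partition(s),
    // so only those partitions end up cached.
    rdd.take(5)

    // Report cached partitions vs. total partitions per persisted RDD.
    sc.getRDDStorageInfo.foreach { info =>
      println(s"RDD ${info.id}: ${info.numCachedPartitions} of " +
        s"${info.numPartitions} partitions cached")
    }

    spark.stop()
  }
}
```

After take(5) you should see fewer cached partitions than total partitions; after a full
action such as rdd.count() every partition the executors computed will be cached.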

Cheers,
-z

________________________________________
From: zwithouta <ralf.schmidtner@web.de>
Sent: Tuesday, April 14, 2020 18:28
To: user@spark.apache.org
Subject: [Spark Core]: Does an executor only cache the partitions it requires for its computations
or always the full RDD?

Provided caching is activated for a RDD, does each executor of a cluster only
cache the partitions it requires for its computations or always the full
RDD?



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


