spark-user mailing list archives

From condor join <spark_ker...@outlook.com>
Subject Question About OFF_HEAP Caching
Date Mon, 18 Jul 2016 07:11:45 GMT
Hi All,

I have some questions about OFF_HEAP caching. In Spark 1.x, when we use rdd.persist(StorageLevel.OFF_HEAP),
the RDD is cached in Tachyon (Alluxio). However, in Spark 2.x we can use OFF_HEAP for caching directly
(https://issues.apache.org/jira/browse/SPARK-13992?jql=project%20%3D%20SPARK%20AND%20text%20~%20%22off-heap%20caching%22).
I am confused about this and have the following questions:
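For context, here is roughly how I have been using it; the config keys and values below are from my own setup
and from memory, so please correct me if I have something wrong.

In Spark 1.6 we did something like:

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.storage.StorageLevel

  val conf = new SparkConf()
    .setAppName("TachyonOffHeapCache")
    // points Spark at our Tachyon master; key name recalled from memory
    .set("spark.externalBlockStore.url", "tachyon://tachyon-master:19998")
  val sc = new SparkContext(conf)

  val events = sc.textFile("hdfs:///data/events")   // example input path
  events.persist(StorageLevel.OFF_HEAP)             // in 1.6 this put the cached blocks into Tachyon
  events.count()

And in Spark 2.0 I am trying (guessing at the required settings):

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.storage.StorageLevel

  val spark = SparkSession.builder()
    .appName("NativeOffHeapCache")
    // my assumption: these off-heap memory settings are needed for OFF_HEAP persistence to take effect
    .config("spark.memory.offHeap.enabled", "true")
    .config("spark.memory.offHeap.size", "2g")
    .getOrCreate()

  val events = spark.sparkContext.textFile("hdfs:///data/events")  // same example input
  events.persist(StorageLevel.OFF_HEAP)             // now stored in Spark's own off-heap memory, not Tachyon?
  events.count()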

1. In Spark 2.x, how should we use Tachyon for caching?

2. Is there a reason for this change (i.e. using off-heap memory directly instead of Tachyon)?

Thanks a lot!

