spark-user mailing list archives

From Vin J <>
Subject Spark 2.x OFF_HEAP persistence
Date Wed, 04 Jan 2017 18:50:13 GMT
Up to Spark 1.6, I see there were specific properties to configure, such as
the external block store master URL (spark.externalBlockStore.url), in order
to use the OFF_HEAP storage level, which made it clear that an external
Tachyon-style block store was required/used for OFF_HEAP storage.
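For reference, the Spark 1.6-era properties I am referring to were set in
spark-defaults.conf along these lines (the URL below is just the documented
default, not my actual deployment):

```
# Spark 1.6: point the block manager at an external (Tachyon) block store
spark.externalBlockStore.url      tachyon://localhost:19998
spark.externalBlockStore.baseDir  /tmp_spark_tachyon
```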

Can someone clarify how this has changed in Spark 2.x? I no longer see
config settings that point Spark to an external block store like Tachyon
(now Alluxio), or am I missing them?

I understand there are ways to use Alluxio with Spark, but what about
OFF_HEAP storage: can Spark 2.x OFF_HEAP RDD persistence still exploit
Alluxio or another external block store? Any pointers to design decisions or
Spark JIRAs related to this would also help.
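For comparison, the only off-heap-related settings I can find in the Spark
2.x configuration docs concern Spark-managed (Tungsten) off-heap memory
rather than any external store; a minimal sketch (the size value is
illustrative) would be:

```
# Spark 2.x: Spark-managed off-heap memory; no external block store URL
spark.memory.offHeap.enabled  true
spark.memory.offHeap.size     2g
```

with, e.g., `rdd.persist(StorageLevel.OFF_HEAP)` in the application, which
is what prompted my question about where an external block store now fits in.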
