spark-user mailing list archives

From Hien Luu <hien...@gmail.com>
Subject Re: disable spark disk cache
Date Mon, 04 Mar 2019 00:25:16 GMT
Hi Andrey,

Below is the description of MEMORY_ONLY from
https://spark.apache.org/docs/latest/rdd-programming-guide.html

"Store RDD as deserialized Java objects in the JVM. If the RDD does not fit
in memory, some partitions will not be cached and will be recomputed on the
fly each time they're needed. This is the default level."
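
For reference, here is a minimal sketch of requesting MEMORY_ONLY
explicitly (this assumes a live SparkContext named sc, and the input
path is hypothetical):

    import org.apache.spark.storage.StorageLevel

    // For RDDs, persist(StorageLevel.MEMORY_ONLY) is equivalent to cache().
    val rdd = sc.textFile("hdfs:///data/events.txt")
    rdd.persist(StorageLevel.MEMORY_ONLY)
    // count() forces evaluation; partitions that don't fit in memory are
    // dropped and recomputed on later use, not spilled to disk.
    rdd.count()

One thing worth double-checking: Dataset/DataFrame cache() defaults to
MEMORY_AND_DISK rather than MEMORY_ONLY, so if you are caching
DataFrames, that alone could explain the disk usage you are seeing.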

Just curious, how do you know Spark will use the disk even when the
MEMORY_ONLY option is chosen?

Cheers,

Hien

On Sun, Mar 3, 2019 at 1:47 PM Andrey Dudin <dudin.andrey@gmail.com> wrote:

> Hello everyone,
>
> Is there a way to prevent caching data to disk even if the memory (RAM)
> runs out?
> As far as I know, Spark will use the disk even if I use MEMORY_ONLY. How
> can I disable this mechanism? I want to get something like an
> out-of-memory exception if the memory (RAM) runs out.
>
>
> Thanks,
> Andrey
>


-- 
Regards,
