spark-user mailing list archives

From Jack Kolokasis <>
Subject Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x
Date Fri, 20 Mar 2020 14:45:07 GMT
Hello Michel,

Spark separates executor memory using an adaptive boundary between 
storage and execution memory. If there is no caching and execution 
memory needs more space, it will borrow a portion of the storage region.
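
For reference, in Spark 2.3.x the unified region is roughly 
(executor heap - 300 MB reserved) * spark.memory.fraction (default 0.6), 
and spark.memory.storageFraction (default 0.5) only marks how much of 
that region is immune to eviction by execution. A rough back-of-the-envelope 
sketch in plain Scala (not Spark's internal code; the 4 GB heap is just an 
illustrative number):

    // Illustrative sizing of the unified memory regions in Spark 2.3.x.
    object MemoryRegions {
      def main(args: Array[String]): Unit = {
        val executorHeap    = 4L * 1024 * 1024 * 1024 // hypothetical 4 GB executor heap
        val reservedMemory  = 300L * 1024 * 1024      // fixed 300 MB reserved by Spark
        val memoryFraction  = 0.6                     // spark.memory.fraction (default)
        val storageFraction = 0.5                     // spark.memory.storageFraction (default)

        val usable        = executorHeap - reservedMemory
        val unifiedRegion = (usable * memoryFraction).toLong          // shared by storage + execution
        val storageRegion = (unifiedRegion * storageFraction).toLong  // eviction-immune part

        println(f"Unified region:  ${unifiedRegion / 1e9}%.2f GB")
        println(f"Storage region:  ${storageRegion / 1e9}%.2f GB")
        // Execution can still borrow from the storage region when nothing is
        // cached; storageFraction only protects cached blocks from eviction.
      }
    }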

If your program does not use caching, you can reduce the storage fraction, as in the sketch below.
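
Purely as an illustration (the app name and the value 0.1 are made up, not a 
recommendation), lowering spark.memory.storageFraction for a job that never 
calls cache()/persist() could look like this in a spark-shell or script:

    import org.apache.spark.sql.SparkSession

    // Shrink the eviction-immune storage region for a job with no caching.
    val spark = SparkSession.builder()
      .appName("no-caching-job")
      .config("spark.memory.storageFraction", "0.1")
      .getOrCreate()

Equivalently you can pass --conf spark.memory.storageFraction=0.1 to spark-submit.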


On 20/3/20 4:40 p.m., msumbul wrote:
> Hello,
> I'm asking myself about the exact meaning of the setting
> spark.memory.storageFraction.
> The documentation mentions:
> "Amount of storage memory immune to eviction, expressed as a fraction of the
> size of the region set aside by spark.memory.fraction. The higher this is,
> the less working memory may be available to execution and tasks may spill to
> disk more often"
> Does that mean that, if there is no caching, that part of the memory will not
> be used at all?
> In the Spark UI, in the "Executors" tab, I can see that the "storage memory"
> is always zero. Does that mean that part of the memory is never used at
> all and I can reduce it, or that it is just never used for storage specifically?
> Thanks in advance for your help,
> Michel
