spark-user mailing list archives

From Michel Sumbul <michelsum...@gmail.com>
Subject Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x
Date Fri, 20 Mar 2020 15:51:07 GMT
Hi Iacovos,

Thanks for the reply, it's super clear.
Do you know if there is a way to see the maximum memory usage?
In the Spark UI (2.3.x), the "peak memory usage" metric is always at zero.

Thanks,
Michel


On Fri, 20 Mar 2020 at 14:56, Jack Kolokasis <kolokasis@ics.forth.gr>
wrote:

> This is just a counter that shows the size of cached RDDs. If it is zero,
> it means that no caching has occurred. Also, even if storage memory is
> being used for computation, the counter will show zero.
>
> Iacovos
> On 20/3/20 4:51 p.m., Michel Sumbul wrote:
>
> Hi,
>
> Thanks for the very quick reply!
> If the "storage memory" metric is always at 0, does that mean that the
> memory is used neither for caching nor for computing?
>
> Thanks,
> Michel
>
>
>
> On Fri, 20 Mar 2020 at 14:45, Jack Kolokasis <kolokasis@ics.forth.gr>
> wrote:
>
>> Hello Michel,
>>
>> Spark separates executor memory into storage and execution regions using
>> an adaptive boundary. If there is no caching and execution memory needs
>> more space, it will use a portion of the storage memory.
>>
>> If your program does not use caching, you can reduce the storage fraction.
>>
>> Iacovos
>>
>> On 20/3/20 4:40 p.m., msumbul wrote:
>> > Hello,
>> >
>> > I'm asking myself about the exact meaning of the
>> > spark.memory.storageFraction setting.
>> > The documentation mentions:
>> >
>> > "Amount of storage memory immune to eviction, expressed as a fraction of
>> > the size of the region set aside by spark.memory.fraction. The higher
>> > this is, the less working memory may be available to execution and tasks
>> > may spill to disk more often"
>> >
>> > Does that mean that if there is no caching, that part of the memory will
>> > not be used at all?
>> > In the Spark UI, in the "Executors" tab, I can see that the "storage
>> > memory" is always zero. Does that mean that this part of the memory is
>> > never used at all (so I can reduce it), or just never used for storage
>> > specifically?
>> >
>> > Thanks in advance for your help,
>> > Michel
>> >
>> >
>> >
>> > --
>> > Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>> >
>>
>>
>>
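For readers of the archive: the memory split described in this thread can be sketched numerically. The snippet below is an illustration only, not Spark's actual implementation; it applies the documented formulas with the default values spark.memory.fraction = 0.6 and spark.memory.storageFraction = 0.5, and the fixed 300 MB reserve Spark subtracts from the heap before splitting.

```python
# Illustrative sketch of Spark's unified memory split (not Spark source code).
# Assumes the documented defaults: spark.memory.fraction = 0.6,
# spark.memory.storageFraction = 0.5, and a fixed 300 MiB reserve.

RESERVED_MEMORY = 300 * 1024 * 1024  # bytes Spark sets aside before the split

def unified_memory_split(heap_bytes, memory_fraction=0.6, storage_fraction=0.5):
    """Return (unified, execution, storage) sizes in bytes.

    `storage` is only the eviction-immune portion: execution can borrow
    unused storage memory, and storage can borrow unused execution memory.
    """
    unified = (heap_bytes - RESERVED_MEMORY) * memory_fraction
    storage = unified * storage_fraction   # immune to eviction by execution
    execution = unified - storage          # guaranteed minimum for execution
    return unified, execution, storage

unified, execution, storage = unified_memory_split(4 * 1024**3)  # 4 GiB heap
print(f"unified: {unified / 1024**2:.0f} MiB, "
      f"eviction-immune storage: {storage / 1024**2:.0f} MiB")
```

So with a 4 GiB heap and the defaults, roughly 2.3 GiB is unified memory and about half of it is immune to eviction; as Iacovos notes, a job that never caches can safely lower spark.memory.storageFraction.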
