spark-user mailing list archives

From jane thorpe <janethor...@aol.com.INVALID>
Subject Re: [Pyspark] - Spark uses all available memory; unrelated to size of dataframe
Date Thu, 16 Apr 2020 05:24:28 GMT

The Web UI only shows:

"The Storage Memory column shows the amount of memory used and reserved for caching data."

The Web UI does not show the values of -Xmx, -Xms, or -Xss.

Without those, you are never going to know the cause of an
OutOfMemoryError or StackOverflowError.

The visual tool is as useless as it can possibly be.

On Thursday, 16 April 2020 Yeikel <email@yeikel.com> wrote:
The memory that you see in Spark's UI page, under Storage, is not the memory
used by your processing but the amount of memory that you persisted from
your RDDs and DataFrames.

Read more here :
https://spark.apache.org/docs/3.0.0-preview/web-ui.html#storage-tab

We need more details to be able to help you (sample code helps)



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

