spark-user mailing list archives

From Yeikel <em...@yeikel.com>
Subject Re: [Pyspark] - Spark uses all available memory; unrelated to size of dataframe
Date Thu, 16 Apr 2020 03:17:57 GMT
The memory you see in the Storage tab of Spark's UI is not the memory
used by your processing; it is the memory occupied by the RDDs and
DataFrames you have explicitly persisted.

Read more here:
https://spark.apache.org/docs/3.0.0-preview/web-ui.html#storage-tab

We need more details to be able to help you; sample code would help.



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

