spark-user mailing list archives

From Yeikel <>
Subject Re: [Pyspark] - Spark uses all available memory; unrelated to size of dataframe
Date Thu, 16 Apr 2020 03:17:57 GMT
The memory shown on the Storage tab of Spark's UI is not the memory used by
your processing; it is the amount of memory occupied by the RDDs and
DataFrames you have persisted.

Read more here:
We need more details to be able to help you (sample code helps).
