spark-user mailing list archives

From Yeikel <em...@yeikel.com>
Subject Re: Get Size of a column in Bytes for a Pyspark Dataframe
Date Thu, 16 Apr 2020 21:30:51 GMT
As far as I know, one option is to persist it and then check the Spark UI.

df.select("field").persist().count() //


I'd like to hear other options too. 





--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

