spark-user mailing list archives

Subject How to not write empty RDD partitions in RDD.saveAsTextFile()
Date Sat, 18 Oct 2014 12:30:30 GMT

I am developing a program using Spark in which I apply a filter such as:

    cleanedData = data.filter(lambda x: x is not None and x != '')

After saveAsTextFile() I end up with a lot of empty output files (probably from those partitions
whose contents were entirely filtered out). Is there some way to prevent Spark from writing these
empty files?
Thank you in advance for any help.
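One common remedy is to merge partitions (e.g. with RDD.coalesce or RDD.repartition) before saving, so records are packed into fewer, non-empty partitions. Below is a minimal plain-Python sketch (no Spark required) of why the empty files appear and how coalescing helps; the partition lists, record values, and the `coalesce` helper are illustrative stand-ins, not Spark's actual implementation.

```python
# A hypothetical RDD modeled as a list of partitions (lists of records).
# saveAsTextFile() writes one part file per partition, even empty ones.
partitions = [["a", None, ""], [None, ""], ["b", "c"]]

# Apply the filter per partition, as Spark would:
filtered = [[x for x in part if x is not None and x != ""]
            for part in partitions]
# The middle partition is now empty, so saveAsTextFile() would emit
# an empty part file for it.
empty_parts = sum(1 for part in filtered if not part)

def coalesce(parts, n):
    """Naive stand-in for RDD.coalesce(n): pack all records into n partitions."""
    records = [x for part in parts for x in part]
    out = [[] for _ in range(n)]
    for i, x in enumerate(records):
        out[i % n].append(x)
    return out

# After coalescing to a single partition, no partition is empty,
# so no empty part file would be written.
coalesced = coalesce(filtered, 1)
```

In real Spark code this would be cleanedData.coalesce(n).saveAsTextFile(path), where n is chosen small enough that each partition holds data.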
