spark-user mailing list archives

From Michal Čizmazia <mici...@gmail.com>
Subject change default storage level
Date Thu, 09 Jul 2015 14:09:38 GMT
Is there a way to change the default storage level?

If not, what is the proper way to set the storage level wherever necessary when
my input and intermediate results do not fit into memory?

In this example:

context.wholeTextFiles(...)
    .flatMap(s -> ...)
    .flatMap(s -> ...)

Does persist() need to be called after every transformation?

context.wholeTextFiles(...)
    .persist(StorageLevel.MEMORY_AND_DISK())
    .flatMap(s -> ...)
    .persist(StorageLevel.MEMORY_AND_DISK())
    .flatMap(s -> ...)
    .persist(StorageLevel.MEMORY_AND_DISK())
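For reference, the chained-persist variant can be written out as a complete
program. This is only a sketch, assuming Spark's Java API in local mode; the
"data/" input path and the line-splitting lambda are placeholders standing in
for the elided logic:

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.storage.StorageLevel;

public class PersistSketch {
    public static void main(String[] args) {
        // Local-mode context just for illustration.
        SparkConf conf = new SparkConf()
            .setAppName("persist-sketch")
            .setMaster("local[2]");
        JavaSparkContext context = new JavaSparkContext(conf);

        // wholeTextFiles yields (path, fileContent) pairs; "data/" is a placeholder.
        JavaPairRDD<String, String> files = context.wholeTextFiles("data/");

        // MEMORY_AND_DISK spills partitions to disk when they do not fit in memory.
        // (Spark 1.x Java API: the flatMap lambda returns an Iterable;
        //  from Spark 2.0 on it must return an Iterator instead.)
        JavaRDD<String> lines = files
            .values()
            .persist(StorageLevel.MEMORY_AND_DISK())
            .flatMap(content -> Arrays.asList(content.split("\n")))
            .persist(StorageLevel.MEMORY_AND_DISK());

        long count = lines.count();  // first action materializes the persisted RDDs
        System.out.println("lines: " + count);

        context.stop();
    }
}
```

Note that persist() returns the RDD it was called on, which is why it can be
chained between transformations like this.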

Thanks!
