spark-dev mailing list archives

From Nicholas Chammas <>
Subject Expanded docs for the various storage levels
Date Thu, 07 Jul 2016 19:07:59 GMT
I’m looking at the docs here:

A newcomer to Spark won’t understand what the _2 suffix means, what _SER
means (or why you’d choose it), or how exactly memory and disk interact
when a level like MEMORY_AND_DISK is selected.
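For what it’s worth, each storage level boils down to a handful of flags. Here is a minimal Python sketch of how the suffixes map to those flags; this is a toy model for illustration (loosely patterned on Spark’s StorageLevel constructor arguments), not the real pyspark class, and it omits details like off-heap storage:

```python
from dataclasses import dataclass

# Toy model of a Spark storage level (an illustrative sketch,
# NOT the real pyspark.StorageLevel class).
@dataclass(frozen=True)
class StorageLevel:
    use_disk: bool        # spill partitions to disk when they don't fit in memory
    use_memory: bool      # cache partitions in memory
    deserialized: bool    # True: store as objects; False: store as serialized bytes
    replication: int = 1  # how many nodes hold a copy of each cached partition

# The suffixes in the docs then read as combinations of these flags:
MEMORY_ONLY       = StorageLevel(use_disk=False, use_memory=True, deserialized=True)
MEMORY_ONLY_SER   = StorageLevel(use_disk=False, use_memory=True, deserialized=False)  # _SER
MEMORY_AND_DISK   = StorageLevel(use_disk=True,  use_memory=True, deserialized=True)
MEMORY_AND_DISK_2 = StorageLevel(use_disk=True,  use_memory=True, deserialized=True,
                                 replication=2)                                        # _2
```

Under this reading, _SER trades CPU (serialization cost) for a smaller memory footprint, _2 replicates each cached partition on a second node for fault tolerance, and MEMORY_AND_DISK means partitions that don’t fit in memory are spilled to disk rather than recomputed on access.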

Is there a place in the docs that expands on the storage levels a bit? If
not, shall we create a JIRA and expand this documentation? I don’t mind
taking on this task, though frankly I’m interested in this because I don’t
fully understand the differences myself. :)

