spark-user mailing list archives

From James <alcaid1...@gmail.com>
Subject Bug in DISK related Storage level?
Date Mon, 03 Nov 2014 12:43:40 GMT
Hello,

I am trying to load a very large graph in order to run a GraphX algorithm, and the
graph does not fit in memory.
I found that if I use the DISK_ONLY or MEMORY_AND_DISK_SER storage level, the
program hits an OOM error, but if I use MEMORY_ONLY_SER, it does not.
What difference between these storage levels could expose the program to OOM?
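
For reference, here is a minimal sketch of the kind of loading code in question, assuming the graph is read with GraphLoader.edgeListFile; the path and exact options below are placeholders, not the actual job:

    import org.apache.spark.SparkContext
    import org.apache.spark.graphx.GraphLoader
    import org.apache.spark.storage.StorageLevel

    object LoadLargeGraph {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext()

        // Placeholder edge-list path.
        val path = "hdfs:///data/edges.txt"

        // MEMORY_ONLY_SER is the level that runs without OOM here.
        val graph = GraphLoader.edgeListFile(
          sc, path,
          edgeStorageLevel = StorageLevel.MEMORY_ONLY_SER,
          vertexStorageLevel = StorageLevel.MEMORY_ONLY_SER)

        // Switching both levels to DISK_ONLY or MEMORY_AND_DISK_SER is the
        // variant that runs into OOM as described above.
        println(graph.edges.count())

        sc.stop()
      }
    }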

Alcaid
