spark-user mailing list archives

From Sguj <>
Subject Re: Spark 1.0.0 java.lang.OutOfMemoryError: Java heap space
Date Wed, 18 Jun 2014 14:20:02 GMT
I got rid of most of my heap errors by increasing the number of partitions in
my RDDs by 8-16x. I found in the tuning page <> that heap-space errors can be
caused by the hash tables built during shuffle operations, so by splitting the
data across more partitions, each shuffle task holds less in memory at once,
and the errors went away. Thanks for putting me on the right path with the
partitions.
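
In case it helps anyone else hitting this, here's a rough sketch of the kind
of change that fixed it for me (the input path, the tab-delimited keys, and
the partition counts are all just placeholders, not my actual job):

import org.apache.spark.{SparkConf, SparkContext}

object RepartitionSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("repartition-sketch"))

    // Illustrative input path and tab-delimited format; substitute your own.
    val lines = sc.textFile("hdfs:///data/events")

    // Splitting the RDD into 8-16x more partitions means each shuffle
    // task builds a much smaller hash table, so it fits in the heap.
    val repartitioned = lines.repartition(lines.partitions.size * 8)

    // Shuffle operations like reduceByKey also accept an explicit
    // partition count, which splits the shuffle directly:
    val counts = repartitioned
      .map(line => (line.split("\t")(0), 1))
      .reduceByKey(_ + _, 256)

    counts.saveAsTextFile("hdfs:///data/event-counts")
    sc.stop()
  }
}

Setting spark.default.parallelism in the job configuration accomplishes the
same thing for shuffles that don't pass an explicit partition count.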
