mahout-user mailing list archives

From Rodolfo Viana <rodolfodelimavi...@gmail.com>
Subject java.lang.OutOfMemoryError with Mahout 0.10 and Spark 1.1.1
Date Mon, 20 Jul 2015 20:40:24 GMT
I’m trying to run Mahout 0.10 on Spark 1.1.1.
I have input files of 8k, 10M, 20M, and 25M.

So far I have run the following configurations:

8k with 1, 2, 3 slaves
10M with 1, 2, 3 slaves
20M with 1, 2, 3 slaves

But when I try to run
bin/mahout spark-itemsimilarity --master spark://node1:7077 --input
filein.txt --output out --sparkExecutorMem 6g

with the 25M input, I get this error:

java.lang.OutOfMemoryError: Java heap space

or

java.lang.OutOfMemoryError: GC overhead limit exceeded


Is that normal? When I ran the 20M input I didn’t get any error, and this
input is only 5M larger.

Any ideas why this is happening?
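For reference, this is a sketch of what I understand the relevant Spark-side memory settings to be (assuming the standard Spark 1.1 configuration properties set in conf/spark-defaults.conf; the values below are placeholders, not something I have verified fixes this):

```shell
# conf/spark-defaults.conf on the driver machine (hypothetical values):
#
#   spark.driver.memory            4g     # heap for the driver JVM, where
#                                         # "GC overhead limit exceeded" can
#                                         # also originate
#   spark.storage.memoryFraction   0.4    # default is 0.6; lowering it leaves
#                                         # more executor heap for computation
#                                         # instead of cached RDD blocks

# The invocation itself, unchanged apart from the settings above taking
# effect; --sparkExecutorMem 6g sets the executor heap as before:
bin/mahout spark-itemsimilarity \
  --master spark://node1:7077 \
  --input filein.txt \
  --output out \
  --sparkExecutorMem 6g
```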

-- 
Rodolfo de Lima Viana
Undergraduate in Computer Science at UFCG
