spark-user mailing list archives

From Abel Coronado Iruegas <acoronadoirue...@gmail.com>
Subject Re: java.lang.OutOfMemoryError: GC overhead limit exceeded
Date Mon, 21 Jul 2014 14:16:14 GMT
Hi Yifan,

This works for me:

export SPARK_JAVA_OPTS="-Xms10g -Xmx40g -XX:MaxPermSize=10g"
export ADD_JARS=/home/abel/spark/MLI/target/MLI-assembly-1.0.jar
export SPARK_MEM=40g
./spark-shell
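
As a side note, in Spark 1.x these environment variables (SPARK_MEM, SPARK_JAVA_OPTS, ADD_JARS) are deprecated in favor of spark-submit style options, which bin/spark-shell also accepts. A rough equivalent might look like the following (the memory values are illustrative, not taken from this thread):

```shell
# Pass driver heap size and extra jars directly to spark-shell
# (spark-shell forwards these options to spark-submit in Spark 1.0+)
./bin/spark-shell \
  --driver-memory 10g \
  --jars /home/abel/spark/MLI/target/MLI-assembly-1.0.jar

# Or set defaults once in conf/spark-defaults.conf:
# spark.driver.memory   10g
# spark.executor.memory 10g
```

For a standalone shell session the driver and the executors run on the same machine, so --driver-memory is usually the setting that matters for an OutOfMemoryError in the shell.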

Regards


On Mon, Jul 21, 2014 at 7:48 AM, Yifan LI <iamyifanli@gmail.com> wrote:

> Hi,
>
> I am trying to load the GraphX example dataset (LiveJournal, 1.08GB)
> through the *Scala Shell* on my standalone multicore machine (8 CPUs, 16GB
> mem), but an OutOfMemory error was returned when the code below was run:
>
> val graph = GraphLoader.edgeListFile(sc, path, minEdgePartitions =
> 16).partitionBy(PartitionStrategy.RandomVertexCut)
>
> I guess I should set some parameters for the JVM, like "-Xmx5120m"?
> But how do I do this in the Scala Shell?
> I started Spark directly with "bin/spark-shell", and everything seems to
> work correctly in the WebUI.
>
> Or should I set these parameters somewhere else (Spark 1.0.1)?
>
>
>
> Best,
> Yifan LI
>
