spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: GC overhead limit exceeded in Spark-interactive shell
Date Mon, 24 Mar 2014 08:43:22 GMT
PS you have a typo in "DEAMON" - it's DAEMON. Thanks, Latin.
On Mar 24, 2014 7:25 AM, "Sai Prasanna" <ansaiprasanna@gmail.com> wrote:

> Hi All! I am getting the following error in the interactive spark-shell
> [0.8.1]
>
>
>  *org.apache.spark.SparkException: Job aborted: Task 0.0:0 failed more
> than 0 times; aborting job java.lang.OutOfMemoryError: GC overhead limit
> exceeded*
>
>
> But I had set the following in spark-env.sh and hadoop-env.sh
>
> export SPARK_DEAMON_MEMORY=8g
> export SPARK_WORKER_MEMORY=8g
> export SPARK_DEAMON_JAVA_OPTS="-Xms8g -Xmx8g"
> export SPARK_JAVA_OPTS="-Xms8g -Xmx8g"
>
>
> export HADOOP_HEAPSIZE=4000
>
> Any suggestions?
>
> --
> *Sai Prasanna. AN*
> *II M.Tech (CS), SSSIHL*
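For later readers: as the reply notes, the variable is spelled DAEMON, so the misspelled `SPARK_DEAMON_*` exports above are simply ignored. A corrected spark-env.sh sketch is below (variable names as used in the Spark 0.8.x standalone docs; the 8g sizes are just carried over from the original post, adjust to your machine):

```shell
# spark-env.sh — corrected spelling; the original DEAMON variables were ignored.
export SPARK_DAEMON_MEMORY=8g                 # heap for the master/worker daemons
export SPARK_WORKER_MEMORY=8g                 # total memory a worker may give to executors
export SPARK_DAEMON_JAVA_OPTS="-Xms8g -Xmx8g" # JVM options for the daemons
export SPARK_JAVA_OPTS="-Xms8g -Xmx8g"        # JVM options for Spark jobs themselves
```

Note that a "GC overhead limit exceeded" in the shell itself may also mean the driver/REPL JVM is undersized, not just the workers, so the job-level options matter as much as the daemon ones.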
