spark-user mailing list archives

From Sai Prasanna <ansaiprasa...@gmail.com>
Subject Re: GC overhead limit exceeded in Spark-interactive shell
Date Mon, 24 Mar 2014 18:43:39 GMT
Thanks Aaron !!
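
A minimal sketch of the configuration that worked, collected in one place, assuming Spark 0.8.x variable names. The 8g values come from this thread; the 1g daemon value and the GC flag are illustrative assumptions, not recommendations from the thread:

```shell
# spark-env.sh -- sketch of the settings discussed in this thread
# (Spark 0.8.x variable names; 8g values are from the thread, adjust to your nodes)

export SPARK_MEM=8g                # per-process memory; the setting that finally worked
export SPARK_WORKER_MEMORY=8g      # total memory a worker can hand out to executors
export SPARK_DAEMON_MEMORY=1g      # note: DAEMON, not DEAMON; value is an assumed example
export SPARK_JAVA_OPTS="-XX:+UseConcMarkSweepGC"  # extra JVM flags, not the heap size itself
```

Note that `SPARK_JAVA_OPTS` carries JVM properties other than the heap size; the heap itself is governed by `SPARK_MEM`, per Aaron's point 2 below.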


On Mon, Mar 24, 2014 at 10:58 PM, Aaron Davidson <ilikerps@gmail.com> wrote:

> 1. Not sure on this; I don't believe we change the defaults from Java.
>
> 2. SPARK_JAVA_OPTS can be used to set the various Java properties (other
> than the heap size itself).
>
> 3. If you want to have 8 GB executors then, yes, only two can run on each
> 16 GB node. (In fact, you should also keep a significant amount of memory
> free for the OS to use for buffer caching and such.)
> An executor may use many cores, though, so this shouldn't be an issue.
>
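
The sizing rule in point 3 can be sketched as a quick arithmetic check (the 1 GB OS headroom figure is an assumption for illustration, not from the thread):

```shell
# Quick executor-count check for point 3 above.
NODE_MEM_GB=16      # total RAM per node (from the thread)
EXECUTOR_MEM_GB=8   # memory per executor (from the thread)
OS_HEADROOM_GB=1    # memory left for OS buffer caching -- assumed value

EXECUTORS=$((NODE_MEM_GB / EXECUTOR_MEM_GB))
echo "executors per node (naive): $EXECUTORS"   # prints 2
echo "executors once headroom is reserved: $(( (NODE_MEM_GB - OS_HEADROOM_GB) / EXECUTOR_MEM_GB ))"   # prints 1
```

The second number illustrates Aaron's caveat: if you insist on 8 GB executors and also reserve OS headroom, the node no longer fits two, so in practice you would size executors slightly below half the node's RAM.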
>
> On Mon, Mar 24, 2014 at 2:44 AM, Sai Prasanna <ansaiprasanna@gmail.com>wrote:
>
>> Thanks Aaron and Sean...
>>
>> Setting SPARK_MEM finally worked. But I have a small doubt.
>> 1) What is the default value allocated for the JVM heap and for the
>> garbage collector?
>>
>> 2) Usually we set 1/3 of total memory for the heap. What is the recommended
>> practice for Spark processes, where and how should we set them, and what
>> default value does Spark assume?
>>
>> 3) Moreover, if we set SPARK_MEM to say 8g and I have 16 GB of RAM, can
>> at most two executors run on a node of the cluster?
>>
>>
>> Thanks Again !!
>>
>>
>>
>>
>> On Mon, Mar 24, 2014 at 2:13 PM, Sean Owen <sowen@cloudera.com> wrote:
>>
>>> PS you have a typo in "DEAMON" - it's DAEMON. Thanks, Latin.
>>> On Mar 24, 2014 7:25 AM, "Sai Prasanna" <ansaiprasanna@gmail.com> wrote:
>>>
>>>> Hi All !! I am getting the following error in the interactive spark-shell
>>>> [0.8.1]
>>>>
>>>>
>>>>  *org.apache.spark.SparkException: Job aborted: Task 0.0:0 failed more
>>>> than 0 times; aborting job java.lang.OutOfMemoryError: GC overhead limit
>>>> exceeded*
>>>>
>>>>
>>>> But I had set the following in spark-env.sh and hadoop-env.sh:
>>>>
>>>> export SPARK_DEAMON_MEMORY=8g
>>>> export SPARK_WORKER_MEMORY=8g
>>>> export SPARK_DEAMON_JAVA_OPTS="-Xms8g -Xmx8g"
>>>> export SPARK_JAVA_OPTS="-Xms8g -Xmx8g"
>>>>
>>>>
>>>> export HADOOP_HEAPSIZE=4000
>>>>
>>>> Any suggestions ??
>>>>
>>>> --
>>>> *Sai Prasanna. AN*
>>>> *II M.Tech (CS), SSSIHL*
>>>>
>>>>
>>>>
>>
>>
>> --
>> *Sai Prasanna. AN*
>> *II M.Tech (CS), SSSIHL*
>>
>>
>> *Entire water in the ocean can never sink a ship, Unless it gets inside.
>> All the pressures of life can never hurt you, Unless you let them in.*
>>
>
>


-- 
*Sai Prasanna. AN*
*II M.Tech (CS), SSSIHL*


*Entire water in the ocean can never sink a ship, Unless it gets inside.
All the pressures of life can never hurt you, Unless you let them in.*
