spark-user mailing list archives

From Sandy Ryza <sandy.r...@cloudera.com>
Subject Re: java.lang.OutOfMemoryError: GC overhead limit exceeded
Date Tue, 27 Jan 2015 21:33:42 GMT
Hi Antony,

If you look in the YARN NodeManager logs, do you see that it's killing the
executors?  Or are they crashing for a different reason?

-Sandy

On Tue, Jan 27, 2015 at 12:43 PM, Antony Mayi <antonymayi@yahoo.com.invalid>
wrote:

> Hi,
>
> I am using spark.yarn.executor.memoryOverhead=8192, yet my executors are
> still crashing with this error.
>
> Does that mean I genuinely don't have enough RAM, or is this a matter of
> config tuning?
>
> other config options used:
> spark.storage.memoryFraction=0.3
> SPARK_EXECUTOR_MEMORY=14G
>
> running Spark 1.2.0 in yarn-client mode on a cluster of 10 nodes (the
> workload is ALS trainImplicit on a ~15GB dataset)
>
> thanks for any ideas,
> Antony.
>
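[For reference, the settings quoted above can be expressed as spark-submit flags. This is a sketch, not the poster's actual command line (the application jar name is a placeholder); flag names match Spark 1.2. Note that spark.yarn.executor.memoryOverhead is specified in MB, so 8192 means 8 GiB of off-heap headroom per executor on top of the 14 GiB JVM heap.]

```shell
# Sketch of the configuration described above (Spark 1.2, yarn-client mode).
# "app.jar" is a placeholder for the actual application.
spark-submit \
  --master yarn-client \
  --executor-memory 14G \
  --conf spark.yarn.executor.memoryOverhead=8192 \
  --conf spark.storage.memoryFraction=0.3 \
  app.jar

# Each executor's YARN container request is roughly:
#   14 GiB (heap) + 8 GiB (overhead) = 22 GiB
# so each node must have at least that much memory available to
# NodeManager per executor it hosts.
```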
