spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: Maximum memory limits
Date Sun, 16 Mar 2014 18:42:06 GMT
Are you using HEAD or 0.9.0? I know a memory issue was fixed a few weeks ago
that made ALS use far more memory than it needed:

https://github.com/apache/incubator-spark/pull/629

Try the latest code.

--
Sean Owen | Director, Data Science | London


On Sun, Mar 16, 2014 at 11:40 AM, Debasish Das <debasish.das83@gmail.com> wrote:

> Hi,
>
> I gave my Spark job 16 GB of memory and it is running on 8 executors.
>
> The job needs more memory due to ALS requirements (a 20M x 1M matrix).
>
> Each node has 96 GB of memory and I am using only 16 GB of it. I want to
> increase the memory, but I am not sure what the right way to do that
> is...
>
> If I give each of the 8 executors 96 GB, it might be an issue due to GC...
>
> Ideally, on 8 nodes I would run 48 executors, each executor getting 16 GB
> of memory... 48 JVMs in total...
>
> Is it possible to increase the number of executors per node?
>
> Thanks.
> Deb
>
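
A note on the question above about running more executors per node: on a
standalone Spark cluster this is done by running multiple worker instances
per machine. A minimal sketch, assuming the standalone deploy mode and a
conf/spark-env.sh on each worker node (the specific numbers are
illustrative, not a recommendation):

    # conf/spark-env.sh on every worker node (Spark standalone mode)
    # Run 6 workers per node at 16 GB each: 8 nodes x 6 = 48 executors total.
    export SPARK_WORKER_INSTANCES=6   # worker JVMs per machine (default: 1)
    export SPARK_WORKER_MEMORY=16g    # memory each worker may hand out
    export SPARK_WORKER_CORES=2      # cores per worker; adjust to the machine

The application should then request a matching per-executor heap via the
spark.executor.memory property, so that each of the 48 JVMs keeps a heap
small enough for reasonable GC pauses.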
