spark-user mailing list archives

From Andrew Or <and...@databricks.com>
Subject Re: spark setting maximum available memory
Date Thu, 22 May 2014 20:11:39 GMT
Hi Ibrahim,

If your worker machines have only 8 GB of memory, then giving the executors all of it
leaves no room for system processes. There is no hard guideline, but I usually leave
around 1 GB free just to be safe, so

conf.set("spark.executor.memory", "7g")

Andrew


2014-05-22 7:23 GMT-07:00 İbrahim Rıza HALLAÇ <ibrahimrizahallac@live.com>:

> In my situation each slave has 8 GB of memory. I want to use the maximum
> memory that I can: .set("spark.executor.memory", "?g")
> How can I determine the amount of memory I should set? It fails when I
> set it to 8 GB.
>
