spark-user mailing list archives

From Aliaksei Litouka <aliaksei.lito...@gmail.com>
Subject Re: How to specify executor memory in EC2 ?
Date Thu, 12 Jun 2014 21:32:34 GMT
spark-env.sh doesn't seem to contain any settings related to memory size :(
I will continue searching for a solution and will post it if I find it :)
Thank you, anyway


On Wed, Jun 11, 2014 at 12:19 AM, Matei Zaharia <matei.zaharia@gmail.com>
wrote:

> It might be that conf/spark-env.sh on EC2 is configured to set it to 512,
> and is overriding the application’s settings. Take a look in there and
> delete that line if possible.
>
> Matei
>
> On Jun 10, 2014, at 2:38 PM, Aliaksei Litouka <aliaksei.litouka@gmail.com>
> wrote:
>
> > I am testing my application in an EC2 cluster of m3.medium machines. By
> default, only 512 MB of memory on each machine is used. I want to increase
> this amount, and I'm trying to do so by passing the --executor-memory 2G
> option to the spark-submit script, but it doesn't seem to work - each
> machine still uses only 512 MB instead of 2 gigabytes. What am I doing
> wrong? How do I increase the amount of memory?
>
>
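[Editor's note: the advice in the thread above can be sketched as follows. The settings shown (SPARK_EXECUTOR_MEMORY in conf/spark-env.sh, the --executor-memory flag, and spark.executor.memory in conf/spark-defaults.conf) are standard Spark configuration knobs; the host name, application class, and jar name are hypothetical placeholders, and the memory values are illustrative.]

```shell
# 1. Check conf/spark-env.sh on the cluster. If the EC2 launch scripts wrote
#    a hard-coded value there, it can override what spark-submit passes.
#    Either delete the line or raise the value:
#
#      export SPARK_EXECUTOR_MEMORY=512m    # delete this, or change to 2g
#
# 2. Then pass the desired amount at submit time (placeholders are
#    hypothetical):
./bin/spark-submit \
  --master spark://<master-host>:7077 \
  --executor-memory 2g \
  --class com.example.MyApp \
  my-app.jar
```

Alternatively, the same setting can live in conf/spark-defaults.conf as a line `spark.executor.memory 2g`, which applies to every application submitted from that machine without needing the command-line flag.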
