spark-user mailing list archives

From Ian Ferreira <ianferre...@hotmail.com>
Subject Re: Easy one
Date Wed, 07 May 2014 18:39:55 GMT
Spoke too soon: while I can get the worker memory up, I can't seem to
increase the executor memory; it still seems locked at 512MB.

Even with the context set up like so:

      sconf.setExecutorEnv("spark.executor.memory", "1g")
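
A likely cause, though not confirmed in this thread: setExecutorEnv() sets
operating-system environment variables for the executor processes, whereas
spark.executor.memory is a Spark configuration property, so it has no effect
when passed that way. A minimal sketch of setting it on the SparkConf
directly (the master URL and app name below are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // spark.executor.memory is a Spark config property, so set it with
    // SparkConf.set rather than setExecutorEnv
    val sconf = new SparkConf()
      .setMaster("spark://master:7077")    // placeholder master URL
      .setAppName("MyApp")                 // placeholder app name
      .set("spark.executor.memory", "1g")
    val sc = new SparkContext(sconf)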

From:  Ian Ferreira <ianferreira@hotmail.com>
Date:  Tuesday, May 6, 2014 at 6:10 PM
To:  <user@spark.apache.org>
Subject:  Re: Easy one

Thanks!

From:  Aaron Davidson <ilikerps@gmail.com>
Reply-To:  <user@spark.apache.org>
Date:  Tuesday, May 6, 2014 at 5:32 PM
To:  <user@spark.apache.org>
Subject:  Re: Easy one

If you're using standalone mode, you need to make sure the Spark Workers
know about the extra memory. This can be configured in spark-env.sh on the
workers as

export SPARK_WORKER_MEMORY=4g
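
(SPARK_WORKER_MEMORY only raises the total a worker can offer to executors
on that machine; each application still requests its per-executor size
through the spark.executor.memory property, which defaulted to 512m in
releases of this era — hence the 512MB ceiling reported above until that
property is set as well.)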


On Tue, May 6, 2014 at 5:29 PM, Ian Ferreira <ianferreira@hotmail.com>
wrote:
> Hi there,
> 
> Why can't I seem to kick the executor memory higher? See below, from an EC2
> deployment using m1.large instances.
> 
> 
> And in spark-env.sh:
> export SPARK_MEM=6154m
> 
> 
> And in the Spark context:
> sconf.setExecutorEnv("spark.executor.memory", "4g")
> 
> Cheers
> - Ian
> 



