spark-user mailing list archives

From Jim Blomo <jim.bl...@gmail.com>
Subject Re: Pyspark worker memory
Date Wed, 19 Mar 2014 07:48:25 GMT
Thanks for the suggestion, Matei.  I've tracked this down to a setting
I had to make on the driver.  It looks like spark-env.sh has no impact
on the executor, which confused me for a long while with settings like
SPARK_EXECUTOR_MEMORY.  The only thing that mattered was setting the
system property in the *driver* (in this case pyspark/shell.py) or
passing -Dspark.executor.memory in SPARK_JAVA_OPTS *on the master*.  I'm
not sure how this differs from the 0.9.0 release, but it seems to work
on SNAPSHOT.
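
For the record, a rough sketch of the two approaches that worked for me (the 5g value is just what I used; exact behavior may differ on other builds):

```shell
# Option 1: set the system property in the driver itself, before the
# SparkContext is created (in my case, in pyspark/shell.py):
#
#   conf = SparkConf().set("spark.executor.memory", "5g")
#   sc = SparkContext(conf=conf)
#
# Option 2: pass the property as a Java option on the master, e.g. in
# spark-env.sh on the master node:
export SPARK_JAVA_OPTS="-Dspark.executor.memory=5g"
```

Setting SPARK_EXECUTOR_MEMORY in spark-env.sh on the workers had no effect for me.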

On Tue, Mar 18, 2014 at 11:52 PM, Matei Zaharia <matei.zaharia@gmail.com> wrote:
> Try checking spark-env.sh on the workers as well. Maybe code there is
> somehow overriding the spark.executor.memory setting.
>
> Matei
>
> On Mar 18, 2014, at 6:17 PM, Jim Blomo <jim.blomo@gmail.com> wrote:
>
> Hello, I'm using the GitHub snapshot of PySpark and having trouble setting
> the worker memory correctly. I've set spark.executor.memory to 5g, but
> somewhere along the way Xmx is getting capped to 512M. This was not
> occurring with the same setup on 0.9.0. How many places do I need to
> configure the memory? Thank you!
>
>
