spark-user mailing list archives

From Jim Blomo <>
Subject Re: Pyspark worker memory
Date Wed, 19 Mar 2014 07:48:25 GMT
Thanks for the suggestion, Matei. I've tracked this down to a setting
I had to make on the driver. It looks like has no impact
on the executor, which confused me for a long while with settings like
SPARK_EXECUTOR_MEMORY. The only setting that mattered was setting the
system property in the *driver* (in this case pyspark/ or
using -Dspark.executor.memory in SPARK_JAVA_OPTS *on the master*). I'm
not sure how this varies from the 0.9.0 release, but it seems to work on
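For reference, the driver-side fix Jim describes could be sketched roughly as below. This is an assumption-laden config fragment, not the exact files from the thread: it assumes a standalone deployment of that era, uses the 5g value from the original question, and the spark-env.sh path is illustrative.

```shell
# Sketch: set the executor heap from the driver/master side, since
# worker-side environment variables were apparently being ignored.

# Option 1: pass the system property via SPARK_JAVA_OPTS on the master,
# e.g. in conf/spark-env.sh (path assumed, not stated in the thread):
export SPARK_JAVA_OPTS="-Dspark.executor.memory=5g"

# Option 2 (later Spark releases): set the property when building the
# SparkConf in the PySpark driver script instead:
#   conf = SparkConf().set("spark.executor.memory", "5g")
#   sc = SparkContext(conf=conf)
```

Either way, the property originates on the driver side, which matches Jim's observation that only the driver setting had any effect.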

On Tue, Mar 18, 2014 at 11:52 PM, Matei Zaharia <> wrote:
> Try checking on the workers as well. Maybe code there is
> somehow overriding the spark.executor.memory setting.
> Matei
> On Mar 18, 2014, at 6:17 PM, Jim Blomo <> wrote:
> Hello, I'm using the Github snapshot of PySpark and having trouble setting
> the worker memory correctly. I've set spark.executor.memory to 5g, but
> somewhere along the way Xmx is getting capped to 512M. This was not
> occurring with the same setup and 0.9.0. How many places do I need to
> configure the memory? Thank you!
