spark-user mailing list archives

From Matei Zaharia <matei.zaha...@gmail.com>
Subject Re: Pyspark worker memory
Date Wed, 19 Mar 2014 06:52:00 GMT
Try checking spark-env.sh on the workers as well. Maybe something in that script is
overriding the spark.executor.memory setting.
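
For reference, a minimal sketch (not from the original thread) of one way to set the
executor memory through SparkConf before the SparkContext is created; the master URL and
application name below are placeholders, not values taken from this discussion:

  from pyspark import SparkConf, SparkContext

  # Set the executor memory on the driver side, before the context exists,
  # so it is easy to confirm what value is actually being requested.
  conf = (SparkConf()
          .setMaster("spark://master:7077")      # placeholder cluster URL
          .setAppName("memory-check")            # placeholder app name
          .set("spark.executor.memory", "5g"))   # value that should reach the workers
  sc = SparkContext(conf=conf)

  # If the executors still launch with -Xmx512m, a per-worker override
  # (for example in spark-env.sh on the workers) is the likely culprit.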

Matei

On Mar 18, 2014, at 6:17 PM, Jim Blomo <jim.blomo@gmail.com> wrote:

> Hello, I'm using the GitHub snapshot of PySpark and having trouble setting the worker
> memory correctly. I've set spark.executor.memory to 5g, but somewhere along the way Xmx is
> getting capped to 512M. This was not occurring with the same setup on 0.9.0. How many places
> do I need to configure the memory? Thank you!
> 

