spark-user mailing list archives

From Matei Zaharia <>
Subject Re: Pyspark worker memory
Date Wed, 19 Mar 2014 06:52:00 GMT
Try checking on the workers as well. Maybe code there is somehow overriding the
spark.executor.memory setting.


On Mar 18, 2014, at 6:17 PM, Jim Blomo <> wrote:

> Hello, I'm using the GitHub snapshot of PySpark and having trouble setting the worker
> memory correctly. I've set spark.executor.memory to 5g, but somewhere along the way Xmx is
> getting capped to 512M. This was not occurring with the same setup on 0.9.0. How many places
> do I need to configure the memory? Thank you!
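
For anyone hitting the same symptom: in Spark of this era, memory limits set on the workers can override what the driver requests via spark.executor.memory, which matches the advice above to check the workers. A sketch of the worker-side settings worth inspecting (the file path and values are assumptions for a typical standalone deployment, not a confirmed diagnosis of this report):

```shell
# conf/spark-env.sh on EACH worker node (path is an assumption)
# Total memory the worker daemon is allowed to hand out to executors:
export SPARK_WORKER_MEMORY=5g
# A stray -Xmx here would cap every executor JVM regardless of
# spark.executor.memory, e.g. -Xmx512m would produce exactly the
# 512M ceiling described above:
export SPARK_JAVA_OPTS=""
```

After editing, the worker daemons need a restart for the new values to take effect; the executor's actual -Xmx can be confirmed with `ps aux | grep java` on a worker while a job runs.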
