spark-user mailing list archives

From Jim Blomo <jim.bl...@gmail.com>
Subject Pyspark worker memory
Date Wed, 19 Mar 2014 01:17:28 GMT
Hello, I'm using the GitHub snapshot of PySpark and having trouble setting
the worker memory correctly. I've set spark.executor.memory to 5g, but
somewhere along the way -Xmx is getting capped to 512M. This was not
occurring with the same setup on 0.9.0. In how many places do I need to
configure the memory? Thank you!
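
For context, a minimal sketch of how I'm applying the setting (the master
URL and app name below are placeholders); my understanding is that
spark.executor.memory has to be on the SparkConf before the SparkContext,
and hence its JVM, is created:

    from pyspark import SparkConf, SparkContext

    # Set executor memory on the conf *before* creating the context;
    # once the JVM has launched, changing this setting has no effect.
    conf = (SparkConf()
            .setMaster("local")          # placeholder master URL
            .setAppName("memory-test")   # placeholder app name
            .set("spark.executor.memory", "5g"))

    sc = SparkContext(conf=conf)
    print(conf.get("spark.executor.memory"))  # prints "5g", yet -Xmx stays 512M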
