spark-user mailing list archives

From mrm <>
Subject driver memory
Date Wed, 23 Jul 2014 10:29:52 GMT

How do I increase the driver memory? These are my configs right now:

sed 's/INFO/ERROR/' spark/conf/   >
sed 's/INFO/ERROR/' spark/conf/  >
# Environment variables and Spark properties
export SPARK_WORKER_MEMORY="30g" # Whole memory per worker node, independent of application (default: total memory on worker node minus 1 GB)
# SPARK_WORKER_CORES = total number of cores an application can use on a machine
# SPARK_WORKER_INSTANCES = how many workers per machine? Limit the number of cores per worker if more than one worker on a machine
export SPARK_JAVA_OPTS="-Dspark.executor.memory=30g -Dspark.speculation.quantile=0.5 -Dspark.speculation=true -Dspark.cores.max=80 -Dspark.akka.frameSize=1000 -Dspark.rdd.compress=true"
# spark.executor.memory = memory taken by Spark on a machine

In the application UI, it says my driver has 295 MB of memory. I am trying to
broadcast a variable that is 0.15 GB, and it is throwing OutOfMemory
errors, so I am trying to see whether increasing the driver memory will fix it.
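For reference, the driver heap is configured separately from SPARK_WORKER_MEMORY and spark.executor.memory. A minimal sketch of the usual ways to raise it (the application name and jar below are placeholders, not taken from this message):

```shell
# Option 1: flag at submit time
spark-submit --driver-memory 2g --class MyApp my-app.jar

# Option 2: Spark property in conf/spark-defaults.conf
#   spark.driver.memory  2g

# Option 3: environment variable in conf/spark-env.sh
export SPARK_DRIVER_MEMORY="2g"
```

Note that because the driver JVM is already running by the time the application's SparkConf is read, spark.driver.memory generally has to be set outside the application (via the flag, the defaults file, or the environment variable) rather than programmatically in client mode.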

