spark-user mailing list archives

From "Sun, Rui" <>
Subject RE: How to set memory for SparkR with master="local[*]"
Date Mon, 02 Nov 2015 05:47:35 GMT
Hi, Matej,

For the convenience of SparkR users who start SparkR without using bin/sparkR (for
example, in RStudio), a recent change enables the setting of
“spark.driver.memory” (and other similar options, like spark.driver.extraClassPath,
spark.driver.extraJavaOptions, and spark.driver.extraLibraryPath) in the sparkEnvir parameter
of sparkR.init() to take effect.

Would you like to give it a try? Note that the change is on the master branch, so you have to
build Spark from source before using it.
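With that change, the sparkEnvir approach from the original question should work as written. A minimal sketch (assuming a source build of the master branch is installed and its SparkR package is on your R library path):

```r
# Assumes SparkR from a master-branch source build is on the library path
library(SparkR)

# With the change, driver-side options passed via sparkEnvir take effect
# even when SparkR is started outside bin/sparkR (e.g. from RStudio),
# because they are applied before the driver JVM is launched.
sc <- sparkR.init(master = "local[*]",
                  sparkEnvir = list(spark.driver.memory = "5g"))
```

You can then verify the allocated Storage Memory on the Executors/Environment pages of the web UI at port 4040, as in the original question.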

From: Sun, Rui []
Sent: Monday, October 26, 2015 10:24 AM
To: Dirceu Semighini Filho
Cc: user
Subject: RE: How to set memory for SparkR with master="local[*]"

As documented in,
Note for “spark.driver.memory”:
Note: In client mode, this config must not be set through the SparkConf directly in your application,
because the driver JVM has already started at that point. Instead, please set this through
the --driver-memory command line option or in your default properties file.
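The properties-file alternative mentioned in that note would look like this (a sketch; assumes the standard conf/spark-defaults.conf location under your Spark install, with the 5g value from the question below):

```
# conf/spark-defaults.conf
spark.driver.memory    5g
```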

If you start a SparkR shell using bin/sparkR, then you can use bin/sparkR --driver-memory.
There is no way to change the driver memory size after the R shell has been launched via bin/sparkR.

But if you start a SparkR shell manually without using bin/sparkR (for example, in
RStudio), you can:
Sys.setenv("SPARKR_SUBMIT_ARGS" = "--conf spark.driver.memory=2g sparkr-shell")
sc <- sparkR.init()

From: Dirceu Semighini Filho []
Sent: Friday, October 23, 2015 7:53 PM
Cc: user
Subject: Re: How to set memory for SparkR with master="local[*]"

Hi Matej,
I'm also using this and I'm seeing the same behavior here: my driver has only 530 MB, which
is the default value.

Maybe this is a bug.

2015-10-23 9:43 GMT-02:00 Matej Holec <>:

How to adjust the memory settings properly for SparkR with master="local[*]"
in R?

*When running from R -- SparkR doesn't accept memory settings :(*

I use the following commands:

R>  library(SparkR)
R>  sc <- sparkR.init(master = "local[*]",
R>                    sparkEnvir = list(spark.driver.memory = "5g"))

Although the variable spark.driver.memory is correctly set (checked at
http://node:4040/environment/), the driver has only the default amount of
memory allocated (Storage Memory 530.3 MB).

*But when running from spark-1.5.1-bin-hadoop2.6/bin/sparkR -- OK*

The following command:

]$ spark-1.5.1-bin-hadoop2.6/bin/sparkR --driver-memory 5g

creates a SparkR session with properly adjusted driver memory (Storage Memory
2.6 GB).

Any suggestion?

