spark-user mailing list archives

From Tamas Jambor <jambo...@gmail.com>
Subject Re: spark.driver.memory is not set (pyspark, 1.1.0)
Date Wed, 01 Oct 2014 16:59:21 GMT
thanks Marcelo.

Why is it not possible in cluster mode either?
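For the record, the command-line / config-file routes Marcelo describes look roughly like this (a sketch; the 1G values are just the ones from this thread, and your_app.py is a placeholder name):

```shell
# Set driver memory when launching the shell or submitting the app,
# so it takes effect before the driver JVM starts:
./bin/pyspark --driver-memory 1G
./bin/spark-submit --driver-memory 1G --executor-memory 1G your_app.py

# Or set it persistently in conf/spark-defaults.conf:
#   spark.driver.memory   1g
#   spark.executor.memory 1g
```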

On Wed, Oct 1, 2014 at 5:42 PM, Marcelo Vanzin <vanzin@cloudera.com> wrote:
> You can't set the driver memory programmatically in client mode. In
> that mode, the same JVM is already running the driver, so you can't
> modify its command line options anymore when initializing the SparkContext.
>
> (And you can't really start cluster mode apps that way, so the only
> way to set this is through the command line / config files.)
>
> On Wed, Oct 1, 2014 at 9:26 AM, jamborta <jamborta@gmail.com> wrote:
>> Hi all,
>>
>> I cannot figure out why this command is not setting the driver memory (it is
>> setting the executor memory):
>>
>>     conf = (SparkConf()
>>                 .setMaster("yarn-client")
>>                 .setAppName("test")
>>                 .set("spark.driver.memory", "1G")
>>                 .set("spark.executor.memory", "1G")
>>                 .set("spark.executor.instances", 2)
>>                 .set("spark.executor.cores", 4))
>>     sc = SparkContext(conf=conf)
>>
>> whereas if I run the spark console:
>> ./bin/pyspark --driver-memory 1G
>>
>> it sets it correctly. Seemingly they both generate the same commands in the
>> logs.
>>
>> thanks a lot,
>>
>> --
>> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-driver-memory-is-not-set-pyspark-1-1-0-tp15498.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>
>
>
> --
> Marcelo


