spark-user mailing list archives

From Mohammed Guller <moham...@glassbeam.com>
Subject RE: Setting the number of executors in standalone mode
Date Fri, 20 Feb 2015 18:08:02 GMT
AFAIK, in standalone mode, each Spark application gets one executor on each worker. You can,
however, run multiple workers on a single machine.
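To illustrate the workaround above: a minimal spark-env.sh sketch for running multiple worker processes per machine in standalone mode. `SPARK_WORKER_INSTANCES` is the standard standalone-mode setting for this; the specific values here (4 workers, 2g/2 cores each) are illustrative, not a recommendation.

```shell
# conf/spark-env.sh on each worker machine (standalone mode)

# Start this many worker processes on this machine; each running
# application then gets one executor per worker process.
SPARK_WORKER_INSTANCES=4

# Give each worker process a share of the machine's memory and cores,
# so the workers do not oversubscribe the host.
SPARK_WORKER_MEMORY=2g
SPARK_WORKER_CORES=2
```

With these settings, an application whose spark.executor.memory fits within SPARK_WORKER_MEMORY would get up to 4 executors on that machine, one per worker process.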

Mohammed

From: Yiannis Gkoufas [mailto:johngouf85@gmail.com]
Sent: Friday, February 20, 2015 9:48 AM
To: Mohammed Guller
Cc: user@spark.apache.org
Subject: Re: Setting the number of executors in standalone mode

Hi Mohammed,

thanks a lot for the reply.
OK, so from what I understand, I cannot control the number of executors per worker in standalone
cluster mode.
Is that correct?

BR

On 20 February 2015 at 17:46, Mohammed Guller <mohammed@glassbeam.com> wrote:
SPARK_WORKER_MEMORY=8g
will allocate 8GB of memory to Spark on each worker node. It has nothing to do with the number of executors.


Mohammed

From: Yiannis Gkoufas <johngouf85@gmail.com>
Sent: Friday, February 20, 2015 4:55 AM
To: user@spark.apache.org
Subject: Setting the number of executors in standalone mode

Hi there,

I am trying to increase the number of executors per worker in standalone mode, but I have not
been able to achieve that.
I loosely followed the instructions in this thread: http://stackoverflow.com/questions/26645293/spark-configuration-memory-instance-cores

and set the following:
spark.executor.memory           1g
SPARK_WORKER_MEMORY=8g

hoping to get 8 executors per worker, but it is still 1.
Also, the --num-executors option is not available in standalone mode.
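For comparison, a hedged sketch of how resources are typically requested in standalone mode: the executor count is not set directly; instead you cap the total cores for the application and the memory per executor. The master URL and application file below are placeholders.

```shell
# Submitting to a standalone cluster (master URL and app are placeholders).
# --num-executors is a YARN-only flag; in standalone mode you instead cap
# the total cores the application may take across the cluster.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 8 \
  --executor-memory 1g \
  myapp.py
```

The same caps can be set in spark-defaults.conf via spark.cores.max and spark.executor.memory instead of on the command line.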

Thanks a lot!
