AFAIK, in standalone mode, each Spark application gets one executor on each worker. You could, however, run multiple workers on a single machine.
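As a sketch of what that looks like (assuming the usual `conf/spark-env.sh` mechanism; the numbers below are illustrative, not from this thread), multiple workers per machine are typically configured with `SPARK_WORKER_INSTANCES`:

```shell
# conf/spark-env.sh on each machine (illustrative values)
# Run 4 worker processes on this machine; an application can then get
# up to 4 executors here, one per worker.
export SPARK_WORKER_INSTANCES=4
# Cap resources per worker so the 4 workers share the machine sensibly.
export SPARK_WORKER_CORES=2    # cores each worker may offer
export SPARK_WORKER_MEMORY=8g  # memory each worker may hand out
```

With this in place, `start-slave.sh` (or the cluster start scripts) launches one worker process per instance on the machine.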
From: Yiannis Gkoufas [mailto:firstname.lastname@example.org]
Sent: Friday, February 20, 2015 9:48 AM
To: Mohammed Guller
Subject: Re: Setting the number of executors in standalone mode
Thanks a lot for the reply.
OK, so from what I understand, I cannot control the number of executors per worker in standalone cluster mode.
Is that correct?
On 20 February 2015 at 17:46, Mohammed Guller <email@example.com> wrote:
That will allocate 8GB of memory to Spark on each worker node. It has nothing to do with the number of executors.
I have tried to increase the number of executors per worker in standalone mode, but have failed to achieve that.
I loosely followed the instructions in this thread: http://stackoverflow.com/questions/26645293/spark-configuration-memory-instance-cores
and did that:
hoping to get 8 executors per worker, but it's still 1.
Also, the --num-executors option is not available in standalone mode.
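For reference, a sketch of how an application's resources are usually sized in standalone mode instead (the master URL and application jar below are placeholders, not values from this thread):

```shell
# --num-executors is a YARN-only flag. In standalone mode an application
# is sized with a total-core cap and per-executor memory instead.
# (master URL and application jar are placeholders)
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 16 \
  --executor-memory 4g \
  my-app.jar
```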
Thanks a lot!