Hi,

Currently, there is only one executor per worker. There is a JIRA ticket to relax this:

https://issues.apache.org/jira/browse/SPARK-1706

But if you want to use more cores, you can try increasing SPARK_WORKER_INSTANCES, which increases the number of workers per machine. Take a look here:
http://spark.apache.org/docs/1.2.0/spark-standalone.html
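For example, in conf/spark-env.sh on each worker machine you could try something like this (just a sketch; the 8 x 1-core split is my assumption, adjust it to your hardware):

    SPARK_WORKER_INSTANCES=8
    SPARK_WORKER_CORES=1
    SPARK_WORKER_MEMORY=1g

If you set SPARK_WORKER_INSTANCES, make sure to also set SPARK_WORKER_CORES and SPARK_WORKER_MEMORY explicitly, otherwise each instance will try to use all the cores and memory of the machine.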

Hope this helps!
Kelvin


On Fri, Feb 20, 2015 at 10:08 AM, Mohammed Guller <mohammed@glassbeam.com> wrote:

AFAIK, in standalone mode, each Spark application gets one executor on each worker. You could run multiple workers on a machine, though.
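For what it's worth, the per-application knobs in standalone mode are spark.cores.max and spark.executor.memory. A rough example (spark://master:7077 and yourapp.jar are placeholders):

    spark-submit --master spark://master:7077 \
      --conf spark.cores.max=8 \
      --conf spark.executor.memory=1g \
      yourapp.jar

Spark will spread those 8 cores across the workers, but still with one executor per worker per application.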


Mohammed


From: Yiannis Gkoufas [mailto:johngouf85@gmail.com]
Sent: Friday, February 20, 2015 9:48 AM
To: Mohammed Guller
Cc: user@spark.apache.org
Subject: Re: Setting the number of executors in standalone mode


Hi Mohammed,


thanks a lot for the reply.

OK, so from what I understand, I cannot control the number of executors per worker in standalone cluster mode.

Is that correct?


BR


On 20 February 2015 at 17:46, Mohammed Guller <mohammed@glassbeam.com> wrote:

SPARK_WORKER_MEMORY=8g

will allocate 8GB of memory to Spark on each worker node. It has nothing to do with the number of executors.
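In other words, with SPARK_WORKER_MEMORY=8g and spark.executor.memory=1g, the worker offers 8GB in total, but your application still gets a single 1GB executor on that worker; the remaining 7GB simply stays available for other applications.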



Mohammed


From: Yiannis Gkoufas [mailto:johngouf85@gmail.com]
Sent: Friday, February 20, 2015 4:55 AM
To: user@spark.apache.org
Subject: Setting the number of executors in standalone mode


Hi there,


I am trying to increase the number of executors per worker in standalone mode, but I have failed to achieve it.


I set the following (in spark-defaults.conf and spark-env.sh, respectively):

spark.executor.memory           1g

SPARK_WORKER_MEMORY=8g


hoping to get 8 executors per worker, but it's still 1.

And the --num-executors option is not available in standalone mode.


Thanks a lot!