spark-user mailing list archives

From Judy Nash <judyn...@exchange.microsoft.com>
Subject RE: spark standalone with multiple executors in one work node
Date Thu, 05 Mar 2015 21:59:40 GMT
I meant from one app, yes.

I was asking because our previous tuning experiments showed that Spark-on-YARN runs faster when
workers are overloaded with executors (i.e. if a worker has 4 cores, creating 2 executors that each
use 4 cores gives a speed boost over a single executor with 4 cores).
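
For reference, the knobs we tuned on YARN were the --num-executors and --executor-cores flags
to spark-submit, along these lines (a sketch; the class name, jar, memory value, and counts below
are placeholders rather than our exact job):

  spark-submit --master yarn \
    --num-executors 8 \
    --executor-cores 4 \
    --executor-memory 4g \
    --class com.example.MyJob myjob.jar

Whether two such executors actually land on the same node depends on the YARN scheduler and the
vcore/memory capacity configured on each NodeManager.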

I have found an equivalent solution for standalone that has given me a speed boost. Instead
of adding more executors, I set SPARK_WORKER_CORES to 2x the number of physical CPU cores on the
worker. We are seeing better performance because CPU utilization is now consistently at 100%.
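
Concretely, that just means raising the advertised core count in conf/spark-env.sh on each worker,
roughly like this (a sketch assuming 8 physical cores; adjust to your hardware):

  # conf/spark-env.sh on each worker node
  export SPARK_WORKER_CORES=16   # 2x the 8 physical cores, so the worker offers more task slots

After restarting the workers, the master can schedule up to twice as many concurrent tasks per
node, which is what keeps the CPUs saturated.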

-----Original Message-----
From: Sean Owen [mailto:sowen@cloudera.com] 
Sent: Thursday, February 26, 2015 2:11 AM
To: Judy Nash
Cc: user@spark.apache.org
Subject: Re: spark standalone with multiple executors in one work node

--num-executors is the total number of executors. In YARN there is not quite the same notion
of a Spark worker. Of course, one worker has an executor for each running app, so yes, but
did you mean for one app? It's possible, though not usual, to run multiple executors for one app
on one worker. This may be useful if your executor heap size is otherwise getting huge.
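
One way to get that effect in standalone mode is to run more than one worker instance per machine,
e.g. something like the following in conf/spark-env.sh (a sketch with illustrative values):

  export SPARK_WORKER_INSTANCES=2   # two worker processes, hence two executors per app on this node
  export SPARK_WORKER_CORES=2       # cores each worker instance advertises
  export SPARK_WORKER_MEMORY=8g     # memory each worker instance advertises

Each app then gets one smaller executor from each worker instance instead of a single large one.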

On Thu, Feb 26, 2015 at 1:58 AM, Judy Nash <judynash@exchange.microsoft.com> wrote:
> Hello,
>
> Does Spark standalone support running multiple executors on one worker node?
>
> It seems YARN has the parameter --num-executors to set the number of
> executors to deploy, but I cannot find the equivalent parameter in Spark standalone.
>
> Thanks,
>
> Judy