spark-user mailing list archives

From Du Li <>
Subject Re: How to use more executors
Date Thu, 12 Mar 2015 00:42:32 GMT
Is it being merged in the next release? It's indeed a critical patch!

     On Wednesday, January 21, 2015 3:59 PM, Nan Zhu <> wrote:

 …not sure when will it be reviewed…
but for now you can work around by allowing multiple worker instances on a single machine
-- Nan Zhu On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:
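The workaround Nan Zhu describes can be sketched as settings in conf/spark-env.sh on each worker machine (a sketch, not from the thread; the per-instance core and memory values are illustrative):

```shell
# conf/spark-env.sh on each worker machine
# Launch two Worker daemons per machine, so an application can get
# two executors per machine even when standalone mode allows only
# one executor per worker per application.
export SPARK_WORKER_INSTANCES=2
# Resources are per worker instance, so split the machine's capacity:
export SPARK_WORKER_CORES=3
export SPARK_WORKER_MEMORY=4g
```

After changing spark-env.sh, the workers must be restarted (e.g. via sbin/stop-all.sh and sbin/start-all.sh) for the new instances to appear.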

 Will SPARK-1706 be included in the next release?
On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu <> wrote:

Please see SPARK-1706
On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu <> wrote:

I tried to submit a job with --conf "spark.cores.max=6" or --total-executor-cores 6
on a standalone cluster, but I don't see more than one executor on each worker. I am wondering
how to use multiple executors when submitting jobs.
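The submission being described looks roughly like the following (a sketch; the master URL, class, and jar are placeholders). Note that in standalone mode at the time of this thread, an application gets at most one executor per worker regardless of these settings, which is exactly the limitation SPARK-1706 addresses:

```shell
# Cap the application at 6 cores across the cluster; with only one
# executor per worker, those cores are spread as one executor on each
# worker rather than as multiple executors per worker.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 6 \
  --class com.example.MyApp \
  my-app.jar
```

The equivalent configuration form is --conf "spark.cores.max=6"; both limit total cores, not the number of executors per worker.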

