spark-user mailing list archives

From Axel Dahl <>
Subject how do I execute a job on a single worker node in standalone mode
Date Mon, 17 Aug 2015 22:36:13 GMT
I have a 4-node cluster and have been playing around with the num-executors,
executor-memory, and executor-cores parameters.

I set the following:

But when I run the job, I see that each worker is running one executor
with 2 cores and 2.5G memory.
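For context, a spark-submit invocation using the parameters named above might look like the following. The flag values here are illustrative assumptions, not the settings from the original message (which were omitted):

```shell
# Hypothetical standalone-mode submission; values are placeholders, not the
# poster's actual settings. --executor-memory and --executor-cores set the
# per-executor resources; in standalone mode the master then spreads
# executors across workers by default.
spark-submit \
  --master spark://master-host:7077 \
  --executor-memory 2500M \
  --executor-cores 2 \
  my_job.py
```

Note that in standalone mode the scheduler spreads executors across all available workers by default, which matches the behavior described above.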

What I'd like instead is for Spark to allocate the job to a single
worker node.

Is that possible in standalone mode, or do I need a job/resource scheduler
like YARN to do that?

Thanks in advance,

