spark-user mailing list archives

From Matei Zaharia <>
Subject Re: Single application using all the cores - preventing other applications from running
Date Fri, 31 Jan 2014 22:48:11 GMT
You can set the spark.cores.max property in your application to limit the maximum number of
cores it will take. Check out
It’s also possible to control scheduling within a Spark application, or if you run on other
cluster managers, such as Mesos. That’s described in more detail here:


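As a rough sketch of in-application scheduling (an assumed configuration, not part of the
original reply), you can switch the scheduler to fair mode and assign concurrent jobs to pools:

import org.apache.spark.{SparkConf, SparkContext}

// Schedule concurrent jobs within this application fairly instead of FIFO.
// The application and pool names are illustrative.
val conf = new SparkConf()
  .setAppName("multi-job-app")
  .set("spark.scheduler.mode", "FAIR")

val sc = new SparkContext(conf)

// Jobs submitted from this thread are placed in the named pool.
sc.setLocalProperty("spark.scheduler.pool", "interactive")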
On Jan 31, 2014, at 2:42 PM, Timothee Besset <> wrote:

> Hello,
> What are my options to balance resources between multiple applications running against
a Spark cluster?
> I am using the standalone cluster [1] setup on my local machine, and starting a single
application uses all the available cores. As long as that first application is running, no
other application does any processing.
> I tried to run more workers using fewer cores with SPARK_WORKER_CORES, but the single
application still takes everything (see
> Is there any strategy to reallocate resources based on the number of applications running
against the cluster, or is the design mostly geared towards having a single application running
at a time?
> Thank you,
> TTimo
> [1]
