spark-user mailing list archives

From: Matei Zaharia <matei.zaha...@gmail.com>
Subject: Re: Single application using all the cores - preventing other applications from running
Date: Fri, 31 Jan 2014 22:48:11 GMT
You can set the spark.cores.max property in your application to limit the maximum number of
cores it will take. Check out http://spark.incubator.apache.org/docs/latest/spark-standalone.html#resource-scheduling.
It’s also possible to control scheduling in more detail within a Spark application, or if
you run on other cluster managers, like Mesos. That’s described in more detail here: http://spark.incubator.apache.org/docs/latest/job-scheduling.html.
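For example, here is a minimal sketch using the SparkConf API (the master URL, app name, and core count below are placeholders, not values from this thread):

    import org.apache.spark.{SparkConf, SparkContext}

    // Cap this application at 4 cores across the whole cluster so the
    // standalone scheduler has cores left over for other applications.
    val conf = new SparkConf()
      .setMaster("spark://127.0.0.1:7077")   // placeholder master URL
      .setAppName("capped-app")              // placeholder app name
      .set("spark.cores.max", "4")           // cluster-wide core cap for this app
    val sc = new SparkContext(conf)

If an application doesn't set spark.cores.max, the standalone cluster's default is to give it all available cores, which is the behavior you're seeing.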

Matei

On Jan 31, 2014, at 2:42 PM, Timothee Besset <ttimo@ttimo.net> wrote:

> Hello,
> 
> What are my options to balance resources between multiple applications running against a Spark cluster?
> 
> I am using the standalone cluster [1] setup on my local machine, and starting a single application uses all the available cores. As long as that first application is running, no other application does any processing.
> 
> I tried to run more workers using fewer cores with SPARK_WORKER_CORES, but the single application still takes everything (see https://dl.dropboxusercontent.com/u/1529870/spark%20-%20multiple%20applications.png).
> 
> Is there any strategy to reallocate resources based on the number of applications running against the cluster, or is the design mostly geared towards having a single application running at a time?
> 
> Thank you,
> TTimo
> 
> [1] http://spark.incubator.apache.org/docs/latest/spark-standalone.html
> 

