spark-user mailing list archives

From Mayur Rustagi <mayur.rust...@gmail.com>
Subject Re: Single application using all the cores - preventing other applications from running
Date Fri, 31 Jan 2014 22:45:55 GMT
Go for the fair scheduler with different weights. The default is FIFO. If you
are feeling adventurous, try out the Sparrow scheduler.
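For the fair scheduler route, the setup is roughly the following; the pool name,
weight, and file path are only illustrative, not taken from your setup:

    import org.apache.spark.{SparkConf, SparkContext}

    // Switch the in-application scheduler from the default FIFO to FAIR and
    // point it at an allocation file defining pools and their weights, e.g.
    // a conf/fairscheduler.xml with <pool name="heavy"><weight>2</weight></pool>.
    val conf = new SparkConf()
      .setAppName("fair-scheduling-sketch")
      .set("spark.scheduler.mode", "FAIR")
      .set("spark.scheduler.allocation.file", "/path/to/fairscheduler.xml")
    val sc = new SparkContext(conf)

    // Jobs submitted from this thread run in the "heavy" pool; pools with a
    // larger weight get a proportionally larger share of the resources.
    sc.setLocalProperty("spark.scheduler.pool", "heavy")

Note that these pools balance jobs inside a single application; whether that is
enough in your case depends on how your applications submit their work.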
Regards
Mayur
On Feb 1, 2014 4:12 AM, "Timothee Besset" <ttimo@ttimo.net> wrote:

> Hello,
>
> What are my options to balance resources between multiple applications
> running against a Spark cluster?
>
> I am using the standalone cluster [1] setup on my local machine, and
> starting a single application uses all the available cores. As long as that
> first application is running, no other application does any processing.
>
> I tried to run more workers using fewer cores with SPARK_WORKER_CORES, but
> the single application still takes everything (see
> https://dl.dropboxusercontent.com/u/1529870/spark%20-%20multiple%20applications.png).
>
> Is there any strategy to reallocate resources based on number of
> applications running against the cluster, or is the design mostly geared
> towards having a single application running at a time?
>
> Thank you,
> TTimo
>
> [1] http://spark.incubator.apache.org/docs/latest/spark-standalone.html
>
>
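In standalone mode there is also spark.cores.max, which caps how many cores a
single application may claim so the rest stay available to other applications.
This is an untested sketch; the master URL and core count are picked arbitrarily:

    import org.apache.spark.{SparkConf, SparkContext}

    // Without spark.cores.max a standalone application grabs every available core;
    // capping it here leaves the remaining cores free for other applications.
    val conf = new SparkConf()
      .setAppName("core-capped-app")
      .setMaster("spark://localhost:7077")
      .set("spark.cores.max", "4")
    val sc = new SparkContext(conf)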
