spark-user mailing list archives

From: Subacini B <>
Subject: Spark Worker Core Allocation
Date: Sun, 08 Jun 2014 05:54:48 GMT
Hi All,

My cluster has 5 workers, each with 4 cores (20 cores in total). It runs in
standalone mode (not Mesos or YARN). I want two programs to run at the same
time, so I set "spark.cores.max=3", but when I run the first program it
allocates the three cores by taking one core from each of three workers, so
three workers end up running the program.
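
(For reference, this is roughly how I set the cap in the driver's SparkConf;
the master URL and app name below are placeholders, not my real values:)

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("spark://master:7077")   // placeholder master URL
      .setAppName("FirstProgram")         // placeholder app name
      .set("spark.cores.max", "3")        // cap this app at 3 cores total

    val sc = new SparkContext(conf)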

How can I configure it so that the program takes all 3 cores from a single
worker, leaving the other workers free for the second program?
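
(Would the master-side spark.deploy.spreadOut property be the right knob? The
standalone docs describe it as controlling whether the master spreads an
application out across nodes or consolidates it onto as few workers as
possible. I assume it would be set on the master, e.g. in conf/spark-env.sh,
like this:)

    # conf/spark-env.sh on the master node (assumption: this is where
    # the master picks up the property, via SPARK_MASTER_OPTS)
    SPARK_MASTER_OPTS="-Dspark.deploy.spreadOut=false"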

Thanks in advance
