Hi Ted,

Can one specify the cores as follows, for example 12 cores?

  val conf = new SparkConf().
  val sc = new SparkContext(conf)
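For illustration, a minimal sketch of how that chain might be completed, assuming the standalone spark.cores.max cap is what "12 cores" refers to (the master URL and application name below are placeholders, not from this thread):

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf().
    setMaster("spark://<hostname>:7077").  // placeholder standalone master URL
    setAppName("MyApp").                   // placeholder application name
    set("spark.cores.max", "12")           // cap this application at 12 cores cluster-wide
  val sc = new SparkContext(conf)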
Dr Mich Talebzadeh
On 30 March 2016 at 14:59, Ted Yu <firstname.lastname@example.org> wrote:
bq. sc.getConf().set()

I think you should use this pattern (shown in https://spark.apache.org/docs/latest/spark-standalone.html):

  val conf = new SparkConf()
    .setMaster(...)
    .setAppName(...)
    .set("spark.cores.max", "1")
  val sc = new SparkContext(conf)

That page also documents the worker's --cores option: "Total CPU cores to allow Spark applications to use on the machine (default: all available); only on worker".

On Wed, Mar 30, 2016 at 5:46 AM, vetal king <email@example.com> wrote:

Hi all,

While submitting a Spark job I am specifying the options --executor-cores 1 and --driver-cores 1. However, when the job was submitted, it used all available cores. So I tried to limit the cores within my main function with sc.getConf().set("spark.cores.max", "1"); however, it still used all available cores.

I am using Spark in standalone mode (spark://<hostname>:7077).

Any idea what I am missing?

Thanks in advance,
Shridhar
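For reference, a short sketch of why the in-code attempt has no effect: SparkContext.getConf returns a copy of the context's configuration, so mutating it after the context exists is ignored. The cap has to be on the SparkConf before the SparkContext is built (the object name and master URL below are placeholders):

  import org.apache.spark.{SparkConf, SparkContext}

  object CoreCapExample {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf()
        .setMaster("spark://<hostname>:7077")  // placeholder standalone master URL
        .setAppName("CoreCapExample")
        .set("spark.cores.max", "1")           // takes effect: set before the context is created
      val sc = new SparkContext(conf)

      // Has NO effect: getConf returns a clone, not the live configuration.
      sc.getConf.set("spark.cores.max", "1")

      sc.stop()
    }
  }

Equivalently, the same property can be supplied at submit time, e.g. spark-submit --conf spark.cores.max=1 ..., instead of setting it in code.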