spark-user mailing list archives

From Yanbo Liang <yanboha...@gmail.com>
Subject Re: Determine number of running executors
Date Fri, 21 Nov 2014 09:30:12 GMT
You can read configuration parameters such as spark.executor.memory from the driver, but you cannot get the number of executors or cores per executor that way, because those are parameters of the Spark deploy environment (YARN, in your case), not of the SparkContext.

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().set("spark.executor.memory", "2G")
val sc = new SparkContext(conf)

// Both return the value that was explicitly set above ("2G"):
sc.getConf.get("spark.executor.memory")
conf.get("spark.executor.memory")
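
That said, there is a common workaround (not a dedicated API for this, so treat it as a sketch): each executor registers a block manager with the driver, and the driver registers one for itself, so counting the entries of getExecutorMemoryStatus and subtracting one approximates the executor count. sc.defaultParallelism is also often a reasonable input for repartitioning. Assuming a running SparkContext sc:

```scala
// Approximate the number of running executors: one block manager per
// executor, plus one for the driver itself, hence the "- 1".
// Executors register asynchronously, so this may undercount shortly
// after startup.
val numExecutors = sc.getExecutorMemoryStatus.size - 1

// Default parallelism (on YARN: roughly total cores across executors)
// is often a sensible target for repartition():
val parts = sc.defaultParallelism
```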

2014-11-21 15:35 GMT+08:00 Tobias Pfeiffer <tgp@preferred.jp>:

> Hi,
>
> when running on YARN, is there a way for the Spark driver to know how many
> executors, cores per executor etc. there are? I want to know this so I can
> repartition to a good number.
>
> Thanks
> Tobias
>
