spark-user mailing list archives

From Sandy Ryza <sandy.r...@cloudera.com>
Subject Re: Determine number of running executors
Date Fri, 21 Nov 2014 18:49:09 GMT
Hi Tobias,

One way to find out the number of executors is through
SparkContext#getExecutorMemoryStatus.  You can find out the number of cores
per executor by asking the SparkConf for the "spark.executor.cores"
property, which, if not set, defaults to 1 on YARN.
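
Putting that together, a minimal sketch (assuming a live SparkContext `sc`
on a YARN deployment; the `repartition` call at the end is just an
illustrative use of the result):

```scala
// getExecutorMemoryStatus maps "host:port" -> (max memory, remaining memory)
// for every block manager, which includes the driver itself, so subtract
// one to count only the executors.
val executorCount = sc.getExecutorMemoryStatus.size - 1

// Cores per executor: read the conf, falling back to YARN's default of 1
// when "spark.executor.cores" was not set explicitly.
val coresPerExecutor = sc.getConf.getInt("spark.executor.cores", 1)

val totalCores = executorCount * coresPerExecutor

// A common rule of thumb is a small multiple of the total core count:
// rdd.repartition(totalCores * 2)
```

Note that executors can come and go (failures, dynamic allocation), so this
is a snapshot at the time of the call, not a fixed property of the job.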

-Sandy


On Fri, Nov 21, 2014 at 1:30 AM, Yanbo Liang <yanbohappy@gmail.com> wrote:

> You can get parameters such as spark.executor.memory, but you can not get
> the executor or core counts, because executors and cores are properties of
> the Spark deploy environment, not of the SparkContext.
>
> val conf = new SparkConf().set("spark.executor.memory","2G")
> val sc = new SparkContext(conf)
>
> sc.getConf.get("spark.executor.memory")
> conf.get("spark.executor.memory")
>
> 2014-11-21 15:35 GMT+08:00 Tobias Pfeiffer <tgp@preferred.jp>:
>
>> Hi,
>>
>> when running on YARN, is there a way for the Spark driver to know how
>> many executors, cores per executor etc. there are? I want to know this so I
>> can repartition to a good number.
>>
>> Thanks
>> Tobias
>>
>
>
