spark-user mailing list archives

From "bit1129@163.com" <bit1...@163.com>
Subject Re: Re: How to find how many cores are allocated to Executor
Date Tue, 30 Jun 2015 07:24:59 GMT
Thanks Akhil.

Your suggestion is doable, but I still wonder why no core information is available for each
executor on the UI, given that we can see the memory used for each executor.



bit1129@163.com
 
From: Akhil Das
Date: 2015-06-30 15:33
To: bit1129@163.com
CC: user
Subject: Re: How to find how many cores are allocated to Executor
One way would be: if you have enough partitions available for a stage (> total # of cores),
then you can open the Tasks tab of the UI and from there see the cores associated with each
executor (the number of tasks running in parallel on the same node).
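Another option, if the application set the core count explicitly, is to read it back from the
driver's SparkConf. A minimal spark-shell sketch, assuming spark.executor.cores was passed via
--executor-cores or spark-defaults.conf (if it was not set explicitly, the key is simply absent
from the conf and the cluster manager's default applies):

    // Run in spark-shell, where sc is the active SparkContext.
    // spark.executor.cores only appears in the conf when it was set explicitly.
    val coresPerExecutor = sc.getConf.getOption("spark.executor.cores")
    println(coresPerExecutor.getOrElse("spark.executor.cores not set explicitly"))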

Thanks
Best Regards

On Mon, Jun 29, 2015 at 1:59 PM, bit1129@163.com <bit1129@163.com> wrote:
Hi,

I am on Spark 1.3.1. The following is copied from the Executors page, but there is no information
about how many cores are allocated to each executor. Is there a way to figure out how many cores
are allocated to each executor? This would be helpful when there are many applications running on
the cluster.




bit1129@163.com

[Attachment: CatchD702.jpg (130K)]