spark-user mailing list archives

From "" <>
Subject Re: Re: How to find how many cores are allocated to Executor
Date Tue, 30 Jun 2015 07:24:59 GMT
Thanks Akhil.

Your suggestion is doable, but I still wonder why no core count is available for each
executor on the UI, given that memory usage is shown for each executor.
From: Akhil Das
Date: 2015-06-30 15:33
CC: user
Subject: Re: How to find how many cores are allocated to Executor
One way would be: if a stage has more partitions available than the total number of cores,
you can open the Tasks tab of the UI and read off the cores associated with each executor
as the number of tasks running in parallel on the same node.
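The inference described above can be sketched in plain Python. This is an illustration of the counting idea only, not a Spark API: the `(executor_id, start, end)` task intervals are made-up example data, standing in for what you would read off the Tasks tab.

```python
# Sketch of the suggestion above: when a stage has more partitions than
# total cores, the peak number of tasks running simultaneously on one
# executor approximates that executor's core count.
from collections import defaultdict

def peak_concurrency(tasks):
    """tasks: iterable of (executor_id, start, end) task intervals.
    Returns the peak number of simultaneously running tasks per executor."""
    events = defaultdict(list)
    for ex, start, end in tasks:
        events[ex].append((start, 1))   # task starts: +1 running
        events[ex].append((end, -1))    # task ends:   -1 running
    peaks = {}
    for ex, evs in events.items():
        running = peak = 0
        for _, delta in sorted(evs):    # sweep events in time order
            running += delta
            peak = max(peak, running)
        peaks[ex] = peak
    return peaks

# Hypothetical task timeline: exec-1 runs up to 3 tasks at once,
# exec-2 up to 2, suggesting 3 and 2 cores respectively.
tasks = [
    ("exec-1", 0, 5), ("exec-1", 1, 6), ("exec-1", 2, 7), ("exec-1", 6, 9),
    ("exec-2", 0, 4), ("exec-2", 1, 5),
]
print(peak_concurrency(tasks))  # {'exec-1': 3, 'exec-2': 2}
```

Note this is only a lower bound on the core count: an executor shows fewer concurrent tasks than cores when the stage runs out of pending partitions.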

Best Regards

On Mon, Jun 29, 2015 at 1:59 PM, <> wrote:

I am on Spark 1.3.1. The following is copied from the Executors page, but there is no information
about how many cores are allocated to each executor. Is there a way to figure out how many cores
are allocated to each executor? This would be helpful when many applications run on
the cluster.
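Aside from the UI, the per-executor core count is whatever was requested at submit time via the real `--executor-cores` flag (or the `spark.executor.cores` property), so one option is simply to check the submitted configuration. As a minimal sketch, the hypothetical helper below recovers that value from a spark-submit command line; the command string is illustrative.

```python
# Hypothetical helper: recover the cores-per-executor setting implied
# by a spark-submit command line (--executor-cores or
# --conf spark.executor.cores=N). Pure string parsing, no Spark needed.
import shlex

def executor_cores(submit_cmd, default=1):
    args = shlex.split(submit_cmd)
    cores = default
    for i, a in enumerate(args):
        if a == "--executor-cores" and i + 1 < len(args):
            cores = int(args[i + 1])
        elif a == "--conf" and i + 1 < len(args):
            key, _, value = args[i + 1].partition("=")
            if key == "spark.executor.cores":
                cores = int(value)
    return cores

cmd = "spark-submit --master yarn --executor-cores 4 --executor-memory 2g app.jar"
print(executor_cores(cmd))  # 4
```

The same property can also be read from a running application's `SparkConf` if it was set explicitly; when it is unset, the default depends on the cluster manager.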

1 attachment: CatchD702.jpg (130K)