spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: How to find how many cores are allocated to Executor
Date Tue, 30 Jun 2015 07:03:04 GMT
One way would be: if a stage has enough partitions (more than the total # of
cores), you can open the tasks tab of the UI and count how many tasks are
running in parallel on the same executor/node. That count is the number of
cores allocated to that executor.

Thanks
Best Regards

On Mon, Jun 29, 2015 at 1:59 PM, bit1129@163.com <bit1129@163.com> wrote:

> Hi,
>
> I am on Spark 1.3.1. The following is copied from the Executors page, but
> there is no information about how many cores are allocated to each
> executor. Is there a way to figure out how many cores are allocated to
> each executor? This would be helpful when there are many applications
> running on the cluster.
>
>
> ------------------------------
> bit1129@163.com
>
