spark-issues mailing list archives

From "Zhan Zhang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-17637) Packed scheduling for Spark tasks across executors
Date Fri, 23 Sep 2016 16:09:20 GMT

    [ https://issues.apache.org/jira/browse/SPARK-17637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15516851#comment-15516851 ]

Zhan Zhang commented on SPARK-17637:
------------------------------------

[~jerryshao] The idea is straightforward. Instead of doing round robin over the executors with
available cores, the new scheduling will try to allocate tasks to the executors with the fewest
available cores. As a result, executors with more free resources may not get new tasks
allocated. With dynamic allocation enabled, those idle executors can then be released so that
other jobs can get the resources they need from the underlying resource manager.

It is not specifically bound to dynamic allocation, but dynamic allocation is an easy way to
see the gains of the new scheduler. In addition, the patch (soon to be sent out) also includes
another scheduler that does exactly the opposite: it allocates tasks to the executors with the
most available cores in order to balance the workload across all executors.
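To make the two policies concrete, here is a minimal sketch of the placement decision for a single task. This is purely illustrative pseudologic, not the actual Spark TaskSchedulerImpl code; the executor ids and core counts are made up.

```python
def pick_executor(free_cores, policy):
    """Return the executor a single one-core task would be assigned to.

    free_cores: dict mapping executor id -> number of idle cores.
    policy: "packed" picks the executor with the fewest idle cores
            (still > 0), so busy executors fill up first and idle ones
            can be released by dynamic allocation; "balanced" picks the
            executor with the most idle cores, spreading load evenly.
    """
    # Only executors that can still accept a task are candidates.
    candidates = {e: c for e, c in free_cores.items() if c > 0}
    if not candidates:
        return None  # no executor has a free core
    if policy == "packed":
        return min(candidates, key=candidates.get)
    return max(candidates, key=candidates.get)

free = {"exec-1": 1, "exec-2": 4, "exec-3": 2}
print(pick_executor(free, "packed"))    # exec-1: fills the busiest executor
print(pick_executor(free, "balanced"))  # exec-2: targets the idlest executor
```

Under "packed", exec-2 and exec-3 stay untouched as long as exec-1 has capacity, so with dynamic allocation they can eventually be reclaimed; under "balanced", every executor stays warm.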

> Packed scheduling for Spark tasks across executors
> --------------------------------------------------
>
>                 Key: SPARK-17637
>                 URL: https://issues.apache.org/jira/browse/SPARK-17637
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>            Reporter: Zhan Zhang
>            Priority: Minor
>
> Currently the Spark scheduler implements round-robin scheduling of tasks to executors,
> which is great because it distributes the load evenly across the cluster, but it leads to
> significant resource waste in some cases, especially when dynamic allocation is enabled.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

