hive-issues mailing list archives

From "Lefty Leverenz (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-16799) Control the max number of task for a stage in a spark job
Date Sun, 04 Jun 2017 05:02:04 GMT

    [ https://issues.apache.org/jira/browse/HIVE-16799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16036167#comment-16036167 ]

Lefty Leverenz commented on HIVE-16799:
---------------------------------------

Doc note:  This adds *hive.spark.stage.max.tasks* to HiveConf.java, so it needs to be documented
in the Spark section of Configuration Properties.

* [ConfigurationProperties -- Spark | https://cwiki.apache.org/confluence/display/Hive/Configuration+Properties#ConfigurationProperties-Spark]
* [hive.spark.stage.max.tasks | https://cwiki.apache.org/confluence/display/Hive/Configuration+Properties#ConfigurationProperties-hive.spark.stage.max.tasks]
 (This link won't work until the documentation is done.)

Thanks for the TODOC3.0 label, Xuefu.

> Control the max number of task for a stage in a spark job
> ---------------------------------------------------------
>
>                 Key: HIVE-16799
>                 URL: https://issues.apache.org/jira/browse/HIVE-16799
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Xuefu Zhang
>            Assignee: Xuefu Zhang
>              Labels: TODOC3.0
>             Fix For: 3.0.0
>
>         Attachments: HIVE-16799.1.patch, HIVE-16799.patch
>
>
> HIVE-16552 gives admins an option to control the maximum number of tasks a Spark job may
> have. However, this may not be sufficient, as it tends to penalize jobs that have many stages
> while favoring jobs that have fewer stages. Ideally, we should also limit the number of tasks
> in a stage, which is closer to the maximum number of mappers or reducers in an MR job.
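
A minimal sketch of how the new property might be used, assuming it is settable like other
HiveConf properties (the value 1000 below is illustrative, not the shipped default):

    -- cap each Spark stage of the current session's queries at 1000 tasks
    SET hive.spark.stage.max.tasks=1000;

or cluster-wide in hive-site.xml:

    <property>
      <name>hive.spark.stage.max.tasks</name>
      <value>1000</value>
    </property>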



