spark-issues mailing list archives

From "Andrew Or (JIRA)" <>
Subject [jira] [Closed] (SPARK-4126) Do not set `spark.executor.instances` if not needed (yarn-cluster)
Date Wed, 29 Oct 2014 21:03:33 GMT


Andrew Or closed SPARK-4126.
    Resolution: Won't Fix

superseded by SPARK-4138

> Do not set `spark.executor.instances` if not needed (yarn-cluster)
> ------------------------------------------------------------------
>                 Key: SPARK-4126
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.2.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>            Priority: Minor
> In yarn-cluster mode, we currently always set `spark.executor.instances`, regardless of
> whether the user set it. While not a huge deal, this prevents us from knowing whether
> the user specified a starting number of executors.
> This is needed in SPARK-3795 to throw the appropriate exception when this is set AND
> dynamic executor allocation is turned on.
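The validation described above depends on the config key being absent unless the user set it. A minimal sketch of that check (hypothetical code, not the actual Spark patch; `validateExecutorSettings` and the plain `Map` standing in for `SparkConf` are illustrative assumptions):

```scala
// Sketch: if yarn-cluster mode never writes a default into
// spark.executor.instances, the presence of the key tells us the user
// set it, and we can reject the conflicting combination from SPARK-3795.
object ExecutorSettingsCheck {
  def validateExecutorSettings(conf: Map[String, String]): Unit = {
    val dynamicAllocation =
      conf.get("spark.dynamicAllocation.enabled").exists(_.toBoolean)
    // Only meaningful if Spark did NOT set this key on the user's behalf.
    val userSetInstances = conf.contains("spark.executor.instances")
    if (dynamicAllocation && userSetInstances) {
      throw new IllegalArgumentException(
        "spark.executor.instances is not allowed with dynamic allocation")
    }
  }
}
```

If Spark itself always populated `spark.executor.instances`, the `contains` check above would trip for every user with dynamic allocation enabled, which is why the issue asks to stop setting it unconditionally.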

This message was sent by Atlassian JIRA
