spark-issues mailing list archives

From "Andrew Or (JIRA)" <j...@apache.org>
Subject [jira] [Closed] (SPARK-4126) Do not set `spark.executor.instances` if not needed (yarn-cluster)
Date Wed, 29 Oct 2014 21:03:33 GMT

     [ https://issues.apache.org/jira/browse/SPARK-4126?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or closed SPARK-4126.
----------------------------
    Resolution: Won't Fix

Superseded by SPARK-4138.

> Do not set `spark.executor.instances` if not needed (yarn-cluster)
> ------------------------------------------------------------------
>
>                 Key: SPARK-4126
>                 URL: https://issues.apache.org/jira/browse/SPARK-4126
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.2.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>            Priority: Minor
>
> In yarn-cluster mode, we currently always set `spark.executor.instances` regardless of whether it is set by the user. While not a huge deal, this prevents us from knowing whether the user did specify a starting number of executors.
> This is needed in SPARK-3795 to throw the appropriate exception when this is set AND dynamic executor allocation is turned on.
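
For context, the guard SPARK-3795 calls for could look roughly like the following Scala sketch. It only becomes possible once yarn-cluster mode stops setting `spark.executor.instances` on the user's behalf; the object name and exception message here are illustrative, not the actual patch.

    import org.apache.spark.SparkConf

    // Minimal sketch (assumption, not the real SPARK-3795 change): if the
    // launcher no longer forces spark.executor.instances, then its presence
    // in the conf means the user explicitly asked for a fixed executor count,
    // which conflicts with dynamic allocation.
    object ExecutorConfCheck {
      def validate(conf: SparkConf): Unit = {
        val userSetInstances = conf.contains("spark.executor.instances")
        val dynamicAllocation = conf.getBoolean("spark.dynamicAllocation.enabled", false)
        if (userSetInstances && dynamicAllocation) {
          throw new IllegalArgumentException(
            "spark.executor.instances is not allowed when dynamic executor allocation is enabled")
        }
      }
    }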



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

