spark-user mailing list archives

From Imran Rashid <iras...@cloudera.com.INVALID>
Subject Re: Spark on Yarn, is it possible to manually blacklist nodes before running spark job?
Date Wed, 23 Jan 2019 22:09:22 GMT
Serega, can you explain a bit more why you want this ability?
If the node is really bad, wouldn't you want to decommission the NM entirely?
If you've got heterogeneous resources, then node labels seem like they would
be more appropriate -- and I don't feel great about adding workarounds for
the node-label limitations into blacklisting.

I don't want to be stuck supporting a configuration with too limited a use
case.

(may be better to move discussion to
https://issues.apache.org/jira/browse/SPARK-26688 so it's better archived;
I'm responding here in case you aren't watching that issue)
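For reference, Spark does expose YARN node-label expressions through configuration, which is the mechanism Jörn suggests below. A minimal sketch, assuming a cluster where a label (here "stable", a placeholder name) has already been created and assigned to healthy nodes by the cluster admin:

```shell
# Assumes the YARN admin has already run something like:
#   yarn rmadmin -addToClusterNodeLabels "stable"
# and assigned the "stable" label to the healthy nodes.
# The label name, class, and jar below are hypothetical placeholders.
spark-submit \
  --master yarn \
  --conf spark.yarn.am.nodeLabelExpression=stable \
  --conf spark.yarn.executor.nodeLabelExpression=stable \
  --class com.example.MyApp \
  myapp.jar
```

This whitelists nodes (containers only land where the label matches) rather than blacklisting specific bad nodes in advance, which is exactly the distinction under discussion in SPARK-26688.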

On Tue, Jan 22, 2019 at 6:09 AM Jörn Franke <jornfranke@gmail.com> wrote:

> You can try with Yarn node labels:
>
> https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/NodeLabel.html
>
> Then you can whitelist nodes.
>
> Am 19.01.2019 um 00:20 schrieb Serega Sheypak <serega.sheypak@gmail.com>:
>
> Hi, is there any possibility to tell Scheduler to blacklist specific nodes
> in advance?
>
>
