spark-user mailing list archives

From "jonathan.keebler" <>
Subject Worker with no Executor (YARN client-mode)
Date Fri, 03 Oct 2014 21:59:36 GMT
Hi all,

We're running Spark 1.0 on CDH 5.1.2, and we're using Spark in YARN client mode.

We're seeing that one of our nodes is not being assigned any tasks, and no
resources (RAM, CPU) are being used on it. In the Cloudera Manager UI this
worker node is in good health, and the Spark Worker process is running
alongside the yarn-NODEMANAGER and hdfs-DATANODE processes.

We've tried restarting the Spark Worker process while the application is
running, but still no tasks are being assigned to that worker.

Any hints or thoughts on this? We can wait until the current job finishes
and restart Spark, YARN, etc., but I wonder if there is a way to make the
currently running job recognize the worker and begin assigning it tasks.

- Jon
