spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Programmatically get status of job (WAITING/RUNNING)
Date Thu, 07 Dec 2017 19:50:58 GMT
On Thu, Dec 7, 2017 at 11:40 AM, bsikander <behroz89@gmail.com> wrote:
> For example, if an application wanted 4 executors
> (spark.executor.instances=4) but the spark cluster can only provide 1
> executor. This means that I will only receive 1 onExecutorAdded event. Will
> the application state change to RUNNING (even if 1 executor was allocated)?

What application state are you talking about? That's the thing that
you seem to be confused about here.

As you've already learned, SparkLauncher only cares about the driver.
So RUNNING means the driver is running.
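For illustration only, here's a minimal SparkLauncher sketch in Java (the jar path, main class and master URL are placeholders I made up); note that the stateChanged callback only ever reflects the driver's lifecycle:

  import org.apache.spark.launcher.SparkAppHandle;
  import org.apache.spark.launcher.SparkLauncher;

  public class DriverStateExample {
    public static void main(String[] args) throws Exception {
      // All paths, class names and the master URL below are placeholders.
      SparkAppHandle handle = new SparkLauncher()
          .setAppResource("/path/to/app.jar")
          .setMainClass("com.example.MyApp")
          .setMaster("spark://master:7077")
          .setConf("spark.executor.instances", "4")
          .startApplication(new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle h) {
              // RUNNING here means the driver process is up; it says
              // nothing about how many executors were actually granted.
              System.out.println("Driver state: " + h.getState());
            }

            @Override
            public void infoChanged(SparkAppHandle h) {
              // No-op; app id / info updates arrive here.
            }
          });

      // Block until the driver reaches a final state.
      while (!handle.getState().isFinal()) {
        Thread.sleep(1000);
      }
    }
  }

In that sketch, handle.getState() moves through states like CONNECTED, SUBMITTED and RUNNING and eventually a final state, independent of how many executors the cluster actually granted.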

And there's no concept of RUNNING anywhere else that I know of that is
exposed to Spark applications. So I don't know which code you're
referring to when you say "the application state changes to RUNNING".
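
If the underlying question is how many executors you were actually
granted, one option, sketched under my own assumptions rather than taken
from your code, is to register a SparkListener inside the application and
count onExecutorAdded events:

  import java.util.concurrent.atomic.AtomicInteger;

  import org.apache.spark.SparkContext;
  import org.apache.spark.scheduler.SparkListener;
  import org.apache.spark.scheduler.SparkListenerExecutorAdded;
  import org.apache.spark.sql.SparkSession;

  public class ExecutorCountExample {
    public static void main(String[] args) {
      SparkSession spark = SparkSession.builder().appName("executor-count").getOrCreate();
      SparkContext sc = spark.sparkContext();

      AtomicInteger granted = new AtomicInteger();
      sc.addSparkListener(new SparkListener() {
        @Override
        public void onExecutorAdded(SparkListenerExecutorAdded event) {
          // Fires once per executor the cluster actually gives you,
          // which may be fewer than spark.executor.instances.
          System.out.println("Executors so far: " + granted.incrementAndGet());
        }
      });

      // ... the rest of the application ...
      spark.stop();
    }
  }

That count reflects what the cluster manager actually gave you, not what
spark.executor.instances asked for.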

-- 
Marcelo


