spark-user mailing list archives

From Marcelo Vanzin <>
Subject Re: Application details for failed and terminated jobs
Date Thu, 02 Oct 2014 18:35:04 GMT
You may want to take a look at this PR:

Long story short: while showing running applications is not a terrible
idea, your particular case should be solved differently. Currently,
applications are responsible for calling "SparkContext.stop()" at the
end of their run, so you should make sure your code does that even
when something goes wrong.

If that is done, they'll show up in the History Server.
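The usual way to guarantee that is a try/finally around the job body. The sketch below is illustrative only: `FakeSparkContext` is a hypothetical stand-in so it runs without a Spark installation; with real PySpark you would wrap an actual `SparkContext` the same way, and its `stop()` is what lets the application be marked complete for the History Server.

```python
class FakeSparkContext:
    """Hypothetical stand-in for pyspark.SparkContext, used only so this
    sketch is runnable without Spark. The real stop() finalizes the
    application's event log."""
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True


def run_job(sc, job):
    """Run the job body, guaranteeing sc.stop() even if it raises."""
    try:
        return job(sc)
    finally:
        sc.stop()  # always executes, on success or failure


sc = FakeSparkContext()
try:
    run_job(sc, lambda ctx: 1 / 0)  # job fails mid-run
except ZeroDivisionError:
    pass

print(sc.stopped)  # True: stop() ran despite the failure
```

The same effect can be had with `contextlib.closing(sc)` in a `with` block, since `SparkContext` exposes a `stop`-like cleanup via `close`/`stop` in recent PySpark versions; the try/finally form works everywhere.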

On Thu, Oct 2, 2014 at 11:31 AM, SK <> wrote:
> Hi,
> Currently the history server provides application details only for
> successfully completed jobs (where the APPLICATION_COMPLETE file is
> generated). However, (long-running) jobs that we terminate manually, or
> failed jobs where APPLICATION_COMPLETE may not be generated, don't show
> up on the history server page. They do show up on the 4040 interface
> as long as they are running. Is it possible to save those logs and load
> them into the history server (even when APPLICATION_COMPLETE is not
> present)? This would allow us to troubleshoot the failed and terminated
> jobs without holding up the cluster.
> thanks

