spark-user mailing list archives

From Ryan <ryan.hd....@gmail.com>
Subject Re: How can I tell if a Spark job is successful or not?
Date Fri, 11 Aug 2017 02:04:40 GMT
You could exit with an error code, just like a normal Java/Scala application,
and read it from the driver process or from YARN.
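A minimal sketch of that approach, with a hypothetical runJob() standing in for the real Spark actions: catch any failure in the driver's main method and turn it into a nonzero exit code. When the driver exits nonzero, YARN reports the application as failed, so the outcome can be read from the driver process or from YARN.

```java
// Sketch of the exit-code approach. runJob() is a stand-in for the
// application's actual Spark actions; it throws to simulate a failed job.
public class DriverExitDemo {

    static void runJob(boolean shouldFail) {
        if (shouldFail) {
            throw new RuntimeException("stage failed");
        }
    }

    // Returns 0 on success and 1 on failure -- the value a real driver
    // would pass to System.exit(...).
    static int runAndReportStatus(boolean shouldFail) {
        try {
            runJob(shouldFail);
            return 0; // success
        } catch (RuntimeException e) {
            e.printStackTrace();
            return 1; // failure: a nonzero driver exit marks the YARN app as failed
        }
    }

    public static void main(String[] args) {
        int status = runAndReportStatus(args.length > 0);
        System.out.println("exit status: " + status);
        // In a real driver you would finish with: System.exit(status)
    }
}
```

After spark-submit returns, a wrapper script can branch on the exit status (`$?`), or query YARN afterwards with `yarn application -status <applicationId>`.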

On Fri, Aug 11, 2017 at 9:55 AM, Wei Zhang <zhwe@microsoft.com.invalid>
wrote:

> I suppose you can find the job status in the YARN UI application view.
>
>
>
> Cheers,
>
> -z
>
>
>
> *From:* 陈宇航 [mailto:yuhang.chen@foxmail.com]
> *Sent:* Thursday, August 10, 2017 5:23 PM
> *To:* user <user@spark.apache.org>
> *Subject:* How can I tell if a Spark job is successful or not?
>
>
>
> I want to do some clean-up after a Spark job finishes, and the action I
> take depends on whether the job succeeded or not.
>
> So where can I get the result of the job?
>
> I already tried SparkListener; it works fine when the job succeeds, but
> when the job fails, the listener does not seem to be called.
>
