spark-user mailing list archives

From Richard Marscher <rmarsc...@localytics.com>
Subject Re: Spark Job triggers second attempt
Date Thu, 07 May 2015 14:58:45 GMT
Hi,

I think you may want to use this setting:

spark.task.maxFailures (default: 4): Number of individual task failures before
giving up on the job. Should be greater than or equal to 1. Number of allowed
retries = this value - 1.
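
For example, a minimal sketch of setting it when building the SparkConf (I'm
assuming a Scala driver that constructs its own SparkContext; the app name is
just a placeholder):

import org.apache.spark.{SparkConf, SparkContext}

// Limit every task to a single attempt so the first failure fails the job
// immediately (allowed retries = maxFailures - 1).
val conf = new SparkConf()
  .setAppName("debug-job") // placeholder name
  .set("spark.task.maxFailures", "1")
  .set("spark.yarn.maxAppAttempts", "1") // keep YARN to one application attempt
val sc = new SparkContext(conf)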

On Thu, May 7, 2015 at 2:34 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepujain@gmail.com> wrote:

> How can I stop Spark from triggering a second attempt when the first one
> fails?
> I do not want to wait for the second attempt to fail again, so that I can
> debug faster.
>
> .set("spark.yarn.maxAppAttempts", "0") OR .set("spark.yarn.maxAppAttempts",
> "1")
>
> is not helping.
>
> --
> Deepak
>
>
