spark-user mailing list archives

From ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com>
Subject Spark Job triggers second attempt
Date Thu, 07 May 2015 06:34:06 GMT
How can I stop Spark from triggering a second attempt when the first one
fails?
I do not want to wait for the second attempt to fail as well, so that I can
debug faster.

Neither .set("spark.yarn.maxAppAttempts", "0") nor
.set("spark.yarn.maxAppAttempts", "1")

is helping.
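For context, a minimal sketch of how this property is usually passed, assuming a YARN deployment ("my-job.jar" is a placeholder for the actual application jar). Note that "0" is not a valid value, since YARN requires at least one attempt; also, setting the property in code via SparkConf may be too late if the SparkContext is created after the application has already been submitted to YARN, so passing it at submit time is the common approach:

```
# Sketch, assuming YARN cluster mode: cap the application at one attempt
# so a failed job is not retried.
# spark.yarn.maxAppAttempts must be >= 1; it is also capped by the
# cluster-side yarn.resourcemanager.am.max-attempts setting.
spark-submit \
  --master yarn \
  --conf spark.yarn.maxAppAttempts=1 \
  my-job.jar
```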

-- 
Deepak
