spark-user mailing list archives

From Anders Arpteg <arp...@spotify.com>
Subject Re: Failing jobs runs twice
Date Tue, 13 Jan 2015 19:00:59 GMT
Yes Andrew, I am. I tried setting spark.yarn.applicationMaster.waitTries to 1
(thanks Sean), but with no luck. Any ideas?
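For context, the duplicate run on YARN is usually the ResourceManager retrying the application master after a failure, rather than Spark itself. A minimal sketch of capping those retries, assuming a Spark version that exposes spark.yarn.maxAppAttempts (added in releases after 1.2; on older versions only the cluster-wide YARN setting applies):

```shell
# Per-application: cap AM retries so a failed app is not rerun.
# spark.yarn.maxAppAttempts is not available in Spark 1.2 itself;
# my_app.jar is a placeholder for your application jar.
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.maxAppAttempts=1 \
  my_app.jar

# Cluster-wide alternative in yarn-site.xml (affects all YARN apps):
#   <property>
#     <name>yarn.resourcemanager.am.max-attempts</name>
#     <value>1</value>
#   </property>
```

With max attempts set to 1, the logs from the single failed attempt are preserved instead of being overwritten by a second run.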

On Tue, Jan 13, 2015 at 7:58 PM, Andrew Or <andrew@databricks.com> wrote:

> Hi Anders, are you using YARN by any chance?
>
> 2015-01-13 0:32 GMT-08:00 Anders Arpteg <arpteg@spotify.com>:
>
> Since I started using Spark 1.2, I've experienced an annoying issue with
>> failing apps that get executed twice. I'm not talking about tasks inside a
>> job, which may be retried multiple times before the whole app fails.
>> I'm talking about the whole app, which seems to close the previous Spark
>> context, start a new one, and rerun the app again.
>>
>> This is annoying since it overwrites the log files as well, and it becomes
>> hard to troubleshoot the failing app. Does anyone know how to turn this
>> "feature" off?
>>
>> Thanks,
>> Anders
>>
>
>
