spark-user mailing list archives

From Andrew Or <and...@databricks.com>
Subject Re: Failing jobs runs twice
Date Tue, 13 Jan 2015 18:58:57 GMT
Hi Anders, are you using YARN by any chance?

2015-01-13 0:32 GMT-08:00 Anders Arpteg <arpteg@spotify.com>:

> Since starting to use Spark 1.2, I've experienced an annoying issue with
> failing apps that get executed twice. I'm not talking about tasks inside a
> job, which may be retried multiple times before the whole app fails. I'm
> talking about the whole app: it seems to close the previous Spark
> context, start a new one, and rerun the app again.
>
> This is annoying since it also overwrites the log files, which makes it
> hard to troubleshoot the failing app. Does anyone know how to turn this
> "feature" off?
>
> Thanks,
> Anders
>
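[Archive note: if the app runs on YARN, as Andrew's question suggests, the likely cause is YARN's ApplicationMaster retry: the ResourceManager resubmits a failed AM up to `yarn.resourcemanager.am.max-attempts` times (default 2), so the whole Spark app appears to run twice. A minimal sketch of capping the attempts, assuming a yarn-cluster deployment; the application class and jar names are hypothetical, and the exact property names should be verified against your Hadoop and Spark versions:]

```shell
# Option 1: cluster-wide, in yarn-site.xml on the ResourceManager:
#
#   <property>
#     <name>yarn.resourcemanager.am.max-attempts</name>
#     <value>1</value>
#   </property>
#
# Option 2: per-application, via a Spark conf at submit time
# (spark.yarn.maxAppAttempts may require a newer Spark than 1.2;
# treat it as an assumption to check in your version's docs):
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.maxAppAttempts=1 \
  --class com.example.MyApp \
  my-app.jar
```

With attempts capped at 1, a failed app fails once and its logs are preserved rather than overwritten by a second attempt.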
