spark-user mailing list archives

From Anders Arpteg <arp...@spotify.com>
Subject Failing jobs run twice
Date Tue, 13 Jan 2015 08:32:23 GMT
Since I started using Spark 1.2, I've run into an annoying issue with
failing apps getting executed twice. I'm not talking about tasks inside a
job, which should be retried multiple times before the whole app fails.
I'm talking about the whole app: Spark seems to close the previous Spark
context, start a new one, and rerun the entire app.
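For context, a sketch of how this retry behavior can typically be controlled when running on YARN. This is an assumption about the setup (the message doesn't say which cluster manager is in use), and `spark.yarn.maxAppAttempts` may not be available in all Spark versions; the YARN-side property `yarn.resourcemanager.am.max-attempts` is the cluster-wide fallback:

```
# Assuming a YARN deployment: limit the app to a single attempt so a
# failed ApplicationMaster is not automatically rerun.
spark-submit \
  --master yarn \
  --conf spark.yarn.maxAppAttempts=1 \
  my_app.jar

# Alternatively, cap retries cluster-wide in yarn-site.xml
# (applies to all applications, not just Spark):
#   <property>
#     <name>yarn.resourcemanager.am.max-attempts</name>
#     <value>1</value>
#   </property>
```

With a single attempt, the log files from the failed run are no longer overwritten by a second attempt, which addresses the troubleshooting problem described below.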

This is annoying since the rerun overwrites the log files as well, which
makes it hard to troubleshoot the failing app. Does anyone know how to
turn this "feature" off?

Thanks,
Anders
