spark-user mailing list archives

From Aaron Davidson <ilike...@gmail.com>
Subject Re: How to enable fault-tolerance?
Date Mon, 09 Jun 2014 17:33:45 GMT
Looks like your problem is local mode:
https://github.com/apache/spark/blob/640f9a0efefd42cff86aecd4878a3a57f5ae85fa/core/src/main/scala/org/apache/spark/SparkContext.scala#L1430

For some reason, someone decided not to do retries when running in local
mode. I'm not exactly sure why; feel free to submit a JIRA on this.
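One possible workaround, going by the master-URL parsing in the SparkContext source linked above: the `local[N, maxFailures]` master URL form sets the per-task failure limit explicitly, which should enable retries even in local mode. A minimal sketch, assuming the Scala API of Spark around 1.0 (the app name and the RDD operation are illustrative only):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LocalRetrySketch {
  def main(args: Array[String]): Unit = {
    // "local[2, 4]": run with 2 worker threads and allow up to 4
    // failures per task before the job is aborted. Plain "local"
    // or "local[N]" uses maxFailures = 1, i.e. no retries.
    val conf = new SparkConf().setAppName("local-retry-sketch")
    val sc = new SparkContext("local[2, 4]", "local-retry-sketch", conf)

    // Illustrative work; a task that throws here would be retried
    // up to the limit set above.
    val sum = sc.parallelize(1 to 100).map(_ * 2).reduce(_ + _)
    println(sum)

    sc.stop()
  }
}
```

In cluster modes the same limit is usually controlled by the `spark.task.maxFailures` configuration property instead of the master URL.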


On Mon, Jun 9, 2014 at 8:59 AM, Peng Cheng <pc175@uow.edu.au> wrote:

> I speculate that Spark will only retry on exceptions that are registered
> with TaskSetScheduler, so a definitely-will-fail task fails quickly
> without consuming more resources. However, I haven't found any
> documentation or web page on this.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-enable-fault-tolerance-tp7250p7255.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
