spark-user mailing list archives

From Mayur Rustagi <mayur.rust...@gmail.com>
Subject Re: How to terminate job from the task code?
Date Sat, 21 Jun 2014 12:14:25 GMT
You can terminate a job group from the SparkContext. You'll have to make
the SparkContext reachable from your task.
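Roughly, the job-group route looks like this (a sketch, assuming the Spark 1.x
API; setJobGroup and cancelJobGroup are real SparkContext methods, but the
group id, the shared error flag, and the future-based structure are
illustrative choices; note SparkContext itself is not serializable, so tasks
signal the driver rather than cancel directly):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

object CancelJobGroupSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("cancel-sketch").setMaster("local[2]"))

    // Tag all jobs started from this thread with a group id.
    // interruptOnCancel = true asks Spark to interrupt running tasks
    // when the group is cancelled.
    sc.setJobGroup("my-group", "cancellable work", interruptOnCancel = true)

    // Run the job asynchronously so the driver thread stays free to cancel.
    val job = Future {
      sc.parallelize(1 to 1000000).foreach { x =>
        // ... task body; on an unrecoverable error, signal the driver
        // (e.g. via an external store or by failing fast) instead of
        // letting Spark retry the task.
      }
    }

    // On the driver, when an unrecoverable error is detected, cancel
    // every job in the group immediately:
    // sc.cancelJobGroup("my-group")

    Await.ready(job, Duration.Inf)
    sc.stop()
  }
}
```

Cancelling the group stops all outstanding jobs tagged with that id, which
also prevents further task retries for them.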
On 21 Jun 2014 01:09, "Piotr Kołaczkowski" <pkolaczk@datastax.com> wrote:

> If the task detects unrecoverable error, i.e. an error that we can't
> expect to fix by retrying nor moving the task to another node, how to stop
> the job / prevent Spark from retrying it?
>
> def process(taskContext: TaskContext, data: Iterator[T]) {
>    ...
>
>    if (unrecoverableError) {
>       ??? // terminate the job immediately
>    }
>    ...
>  }
>
> Somewhere else:
> rdd.sparkContext.runJob(rdd, something.process _)
>
>
> Thanks,
> Piotr
>
>
> --
> Piotr Kolaczkowski, Lead Software Engineer
> pkolaczk@datastax.com
>
> http://www.datastax.com/
> 777 Mariners Island Blvd., Suite 510
> San Mateo, CA 94404
>
