spark-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: How to timeout a task?
Date Sat, 27 Jun 2015 16:26:26 GMT
Have you looked at:
http://stackoverflow.com/questions/2281850/timeout-function-if-it-takes-too-long-to-finish

FYI
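
The answers there use a signal-based alarm. A rough, untested sketch of how
that could be wrapped around the extractor call -- run_extractors and
DEFAULT_VALUE are placeholders for whatever your gist actually uses, and
signal.SIGALRM is Unix-only, takes whole seconds, and has to run in the main
thread of the worker process:

import signal


class ExtractorTimeout(Exception):
    pass


def run_with_timeout(func, args, timeout_secs, default):
    # Raise inside the running extractor when the alarm fires.
    def _handler(signum, frame):
        raise ExtractorTimeout()

    old_handler = signal.signal(signal.SIGALRM, _handler)
    signal.alarm(timeout_secs)
    try:
        return func(*args)
    except ExtractorTimeout:
        return default
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)


# e.g. inside the RDD operation:
# rdd.map(lambda rec: run_with_timeout(run_extractors, (rec,), 5, DEFAULT_VALUE))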

On Sat, Jun 27, 2015 at 8:33 AM, wasauce <wferrell@gmail.com> wrote:

> Hello!
>
> We use pyspark to run a set of data extractors (think regex). The
> extractors (regexes) generally run quite quickly and find a few matches,
> which are returned and stored in a database.
>
> My question is: is it possible to make the function that runs the
> extractors have a timeout? I.e., if an extractor runs for more than X
> seconds on a given file, it terminates and returns a default value?
>
> Here is a code snippet of what we are doing, with some comments noting
> which function I am looking to time out.
>
> code: https://gist.github.com/wasauce/42a956a1371a2b564918
>
> Thank you
>
> - Bill
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-timeout-a-task-tp23513.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
