spark-user mailing list archives

From Wei Chen <weic...@apache.org>
Subject Set TimeOut and continue with other tasks
Date Wed, 10 Jul 2019 05:47:05 GMT
Hello All,

I am using Spark to process a number of files in parallel.
Most files are processed within 3 seconds, but we occasionally get stuck
on 1 or 2 files that never finish (or would take more than 48 hours).
Since the converter is a third-party file conversion tool, we are not able
to debug why it gets stuck.
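Roughly, the job looks like this today (a simplified sketch; convertFile
and filePaths are placeholders for the actual converter call and input list):

    // Simplified sketch of the current job.
    // convertFile and filePaths are placeholders, not the real names.
    val converted = spark.sparkContext
      .parallelize(filePaths)             // one element per input file
      .map(path => convertFile(path))     // a handful of these calls never return
    converted.count()                     // just to force the job to run in this sketch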

Is it possible to set a timeout for these tasks, throw an exception for the
ones that exceed it,
and still continue with the other tasks that succeed?
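
Something like the following is what I have in mind, but I am not sure it is
the right way to do it in Spark (a rough, untested sketch; the 10-second limit
is just an example, and the same placeholder names as above are used):

    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration._
    import scala.util.Try

    val results = spark.sparkContext
      .parallelize(filePaths)
      .map { path =>
        // Run the converter in a Future and give up after 10 seconds.
        // Note: a timed-out converter thread may keep running in the background.
        val output = Try(Await.result(Future(convertFile(path)), 10.seconds)).toOption
        (path, output)   // None means the conversion timed out or failed
      }

I could then filter out the None entries and carry on with the files that
converted successfully.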

Best Regards
Wei
