spark-user mailing list archives

From Hemalatha A <hemalatha.amru...@googlemail.com>
Subject Re: Fail a batch in Spark Streaming forcefully based on business rules
Date Thu, 28 Jul 2016 10:11:40 GMT
Another use case where I need this: if Exception A is caught, I should just
log it and move on, but if Exception B occurs, I have to stop processing the
batch and mark it as failed.
Is it possible to achieve this? Any hints would be appreciated.
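To make the question concrete, this is roughly what I have in mind (only a
minimal sketch in Scala; ExceptionA, ExceptionB and the validate() helper are
placeholders for my application code, and I am not sure that rethrowing from
foreachRDD is the intended way to fail a batch):

import org.apache.spark.streaming.dstream.DStream

// Placeholder exceptions standing in for my real business-rule failures.
class ExceptionA(msg: String) extends RuntimeException(msg)
class ExceptionB(msg: String) extends RuntimeException(msg)

object BatchFailureSketch {

  // Placeholder for my per-record processing, which may throw ExceptionA or ExceptionB.
  def validate(record: String): Unit = ()

  // Exceptions thrown on executors usually reach the driver wrapped in a
  // SparkException, so walk the cause chain to find the original type.
  def rootCause(t: Throwable): Throwable =
    if (t.getCause == null || t.getCause == t) t else rootCause(t.getCause)

  def handle(stream: DStream[String]): Unit = {
    stream.foreachRDD { rdd =>
      try {
        rdd.foreach(record => validate(record))
      } catch {
        case e: Throwable if rootCause(e).isInstanceOf[ExceptionA] =>
          // Ignore: log it and let this batch complete normally.
          println(s"Ignoring ExceptionA: ${rootCause(e).getMessage}")
        case e: Throwable if rootCause(e).isInstanceOf[ExceptionB] =>
          // Rethrow so the output operation fails and the batch is reported as failed.
          // (The failed task is retried spark.task.maxFailures times first, and I
          // suspect the uncaught error may also stop the StreamingContext.)
          throw e
      }
    }
  }
}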


On Wed, Jul 27, 2016 at 10:42 AM, Hemalatha A <
hemalatha.amrutha@googlemail.com> wrote:

> Hello,
>
> I have a use case where I have to fail certain batches in my streaming
> job, based on application-specific business rules.
> Ex: If in a batch of 2 seconds I don't receive 100 messages, I should fail
> the batch and move on.
>
> How can I achieve this behavior?
>
> --
>
>
> Regards
> Hemalatha
>
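For the count-based use case quoted above, this is the kind of thing I am
experimenting with (again only a sketch; TooFewMessagesException and the
processing step are placeholders, and I do not know whether an uncaught error
here just fails that batch or stops the whole StreamingContext, which is part
of my question):

import org.apache.spark.streaming.dstream.DStream

// Placeholder exception for the "too few messages in this batch" rule.
class TooFewMessagesException(msg: String) extends RuntimeException(msg)

object MinCountSketch {

  def enforceMinimumCount(stream: DStream[String], minCount: Long = 100L): Unit = {
    stream.foreachRDD { (rdd, time) =>
      val count = rdd.count() // runs a job just to count the records in this batch
      if (count < minCount) {
        // Throwing from the output operation makes Spark report this batch's job as failed.
        throw new TooFewMessagesException(
          s"Batch at $time had only $count messages, expected at least $minCount")
      } else {
        rdd.foreach(record => println(record)) // placeholder for the real processing
      }
    }
  }
}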



-- 


Regards
Hemalatha
