spark-user mailing list archives

From HARSH TAKKAR <takkarha...@gmail.com>
Subject Re: Back pressure not working on streaming
Date Wed, 02 Jan 2019 03:49:22 GMT
There is a separate property for the max rate; by default it is not set, so if
you want to limit the max rate you should give that property a value.

Initial rate = 10 means it will pick only 10 records per second per receiver
for the first batch when you start the process.

Depending upon the consumption rate, it will then increase the number of
records consumed for processing in each batch.

However, I feel 10 is way too low a number for a 32-partition Kafka topic.
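A minimal sketch of the relevant settings (the property names are from the Spark Streaming configuration; the ceiling values chosen below are purely illustrative):

```properties
# spark-defaults.conf (or pass via --conf) -- illustrative values
spark.streaming.backpressure.enabled          true
# Starting rate (records/sec per receiver) used for the first batch only:
spark.streaming.backpressure.initialRate      10
# Hard ceiling for the direct Kafka API, in records/sec per partition.
# Unset by default, so without it backpressure has no upper bound:
spark.streaming.kafka.maxRatePerPartition     1000
# Equivalent ceiling for the receiver-based API (records/sec per receiver):
spark.streaming.receiver.maxRate              10000
```

With the illustrative maxRatePerPartition of 1000, 32 partitions, and a 300-second batch interval, a single batch would top out at 1000 × 32 × 300 = 9,600,000 records.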



Regards
Harsh
Happy New Year

On Wed 2 Jan, 2019, 08:33 JF Chen <darouwan@gmail.com> wrote:

> I have set spark.streaming.backpressure.enabled to true and spark.streaming.backpressure.initialRate
> to 10.
> Once my application started, it received 32 million messages in the first
> batch.
> My application runs every 300 seconds, with 32 Kafka partitions. So what
> is the max rate if I set initial rate to 10?
>
> Thanks!
>
>
> Regards,
> Junfeng Chen
>
