spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: max receiving rate in spark streaming
Date Wed, 07 Jan 2015 11:16:10 GMT
If you are using the low-level consumer
<https://github.com/dibbhatt/kafka-spark-consumer>, then you have the option
to tweak the rate by setting the _fetchSizeBytes
<https://github.com/dibbhatt/kafka-spark-consumer/blob/master/src/main/java/consumer/kafka/KafkaConfig.java#L37>
value. The default is 64 KB; you can increase it up to 1 MB or more
depending on your cluster size.
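As a rough sketch, the fetch size is usually passed in through the consumer's
configuration properties. Note the property key used below
("consumer.fetchsizebytes") is an assumption for illustration; check the
KafkaConfig.java source linked above for the authoritative key name.

```java
import java.util.Properties;

public class FetchSizeExample {
    // Hypothetical property key; confirm against KafkaConfig.java linked above.
    static final String FETCH_SIZE_KEY = "consumer.fetchsizebytes";

    public static Properties consumerProps(int fetchSizeBytes) {
        Properties props = new Properties();
        // Raise the per-fetch size from the 64 KB default to increase
        // the receiving rate on a larger cluster.
        props.setProperty(FETCH_SIZE_KEY, String.valueOf(fetchSizeBytes));
        return props;
    }

    public static void main(String[] args) {
        // 1 MB fetch size instead of the 64 KB default.
        Properties props = consumerProps(1024 * 1024);
        System.out.println(props.getProperty(FETCH_SIZE_KEY));
    }
}
```

Larger fetches mean fewer round trips to the Kafka brokers per batch, but
each receiver buffers more data in memory, so size it against your executor
memory and batch interval.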

Thanks
Best Regards

On Wed, Jan 7, 2015 at 4:41 PM, Hafiz Mujadid <hafizmujadid00@gmail.com>
wrote:

> Hi experts!
>
>
> Is there any way to decide what an effective receiving rate would be for
> Kafka Spark streaming?
>
> Thanks
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/max-receiving-rate-in-spark-streaming-tp21013.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
