spark-user mailing list archives

From "peter" <peter1...@qq.com>
Subject Re: Issue : KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey
Date Mon, 21 Oct 2019 13:12:07 GMT
You can try increasing spark.streaming.kafka.consumer.cache.maxCapacity.
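
A minimal sketch of how that setting can be passed, assuming Scala and a SparkSession entry point (the value 128 and the app name are arbitrary examples). Note that spark.streaming.kafka.consumer.cache.maxCapacity applies to the DStream (spark-streaming-kafka-0-10) API; for Structured Streaming on Spark 2.4 the consumer cache size is instead read from spark.sql.kafkaConsumerCache.capacity (default 64, which matches the warning below), so both are shown:

    import org.apache.spark.sql.SparkSession

    // Both settings must be in the SparkConf before the context starts,
    // so they reach the executors, where the consumer cache lives.
    val spark = SparkSession.builder()
      .appName("kafka-cache-sizing")  // hypothetical app name
      // DStream API consumer cache:
      .config("spark.streaming.kafka.consumer.cache.maxCapacity", "128")
      // Structured Streaming consumer cache (Spark 2.4):
      .config("spark.sql.kafkaConsumerCache.capacity", "128")
      .getOrCreate()

The same can be done at submit time, e.g. spark-submit --conf spark.sql.kafkaConsumerCache.capacity=128 ... The cache is per executor and keyed by group id and topic partition, so a capacity of at least the number of partitions a single executor reads is a reasonable starting point.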


From: Shyam P [mailto:shyamabigdata@gmail.com]
Sent: October 21, 2019 20:43
To: kafka-clients@googlegroups.com; spark users <user@spark.apache.org>
Subject: Issue : KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey

Hi,

I am using spark-sql 2.4.1 with Kafka.

I am facing a slow consumer issue. I see the warning "KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey(spark-kafka-source-33321dde-bfad-49f3-bdf7-09f95883b6e9--1249540122-executor)" in the logs.

More on the same here:

https://stackoverflow.com/questions/58456939/how-to-set-spark-consumer-cache-to-fix-kafkaconsumer-cache-hitting-max-capaci

Can anyone please advise how to fix this and improve my consumer performance?

Thank you.