spark-user mailing list archives

From Shyam P <shyamabigd...@gmail.com>
Subject Issue : KafkaConsumer cache hitting max capacity of 64, removing consumer for CacheKey
Date Mon, 21 Oct 2019 12:42:38 GMT
Hi,

I am using spark-sql 2.4.1 with Kafka.

I am facing a slow-consumer issue, and I see the following warning in the logs:

"KafkaConsumer cache hitting max capacity of 64, removing consumer for
CacheKey(spark-kafka-source-33321dde-bfad-49f3-bdf7-09f95883b6e9--1249540122-executor)"


More details on the same issue:
https://stackoverflow.com/questions/58456939/how-to-set-spark-consumer-cache-to-fix-kafkaconsumer-cache-hitting-max-capaci


Can anyone please advise how to fix this and improve my consumer
performance?
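For context, the Stack Overflow post above discusses raising the consumer cache size. In the Spark 2.4 source this appears to be governed by the internal option `spark.sql.kafkaConsumerCache.capacity` (default 64); that name is taken from the Spark 2.4 code and is not a documented public setting, so treat it as an assumption and verify it against your Spark version. A minimal sketch of raising it via spark-submit:

```
# Config fragment; spark.sql.kafkaConsumerCache.capacity is an internal
# Spark 2.4 setting (default 64), not a documented public option.
# Sizing it to at least the number of topic partitions assigned per
# executor should stop the cache from evicting live consumers.
spark-submit \
  --conf spark.sql.kafkaConsumerCache.capacity=128 \
  ...
```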


Thank you.
