The symptom is simple: the broker is not responding within 120 seconds.
That's why Debabrata asked about the broker config.
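For context: in spark-streaming-kafka-0-10, the 120000 ms poll timeout is read from spark.streaming.kafka.consumer.poll.ms and falls back to spark.network.timeout (default 120s) when unset.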

What I can suggest is to check the earlier printout which logs the Kafka consumer settings.
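
For comparison, here is a minimal sketch of a spark-streaming-kafka-0-10 direct stream showing where those consumer settings come from. The broker address, batch interval and poll timeout value below are placeholders, not values taken from the actual setup:

    import org.apache.kafka.clients.consumer.ConsumerConfig
    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010._

    val sparkConf = new SparkConf()
      .setAppName("service-spark-ingestion")
      // The 120000 ms in the error is this poll timeout; raising it only
      // helps if the broker is slow rather than unreachable.
      .set("spark.streaming.kafka.consumer.poll.ms", "180000")

    val ssc = new StreamingContext(sparkConf, Seconds(10))

    // These are the settings the consumer prints at startup -- compare them
    // with the printout in your own driver/executor logs.
    val kafkaParams = Map[String, Object](
      ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG -> "broker1:9092", // placeholder
      ConsumerConfig.GROUP_ID_CONFIG -> "service-spark-ingestion",
      ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG -> classOf[StringDeserializer],
      ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG -> classOf[StringDeserializer]
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("dice-ingestion"), kafkaParams)
    )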


On Tue, Apr 14, 2020 at 11:44 AM ZHANG Wei <wezhang@outlook.com> wrote:
Here is the assertion error message format:

   s"Failed to get records for $groupId $topic $partition $offset after polling for $timeout")

You might have to check the Kafka service, given this error in the log:

> 20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
> java.lang.AssertionError: assertion failed: Failed to get records for spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 120000
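
Reading the actual message against that format string: groupId = spark-executor-service-spark-ingestion (Spark prefixes the configured group id, here service-spark-ingestion, with "spark-executor-" for the consumers it runs on executors), topic = dice-ingestion, partition = 11, offset = 0, timeout = 120000 ms.

One way to rule Spark out is to poll the same partition with a bare Kafka consumer. A sketch, assuming a kafka-clients 2.0+ dependency and with the broker address a placeholder:

    import java.time.Duration
    import java.util.Properties
    import org.apache.kafka.clients.consumer.KafkaConsumer
    import org.apache.kafka.common.TopicPartition

    val props = new Properties()
    props.put("bootstrap.servers", "broker1:9092") // placeholder
    props.put("group.id", "poll-check")
    props.put("key.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")

    val consumer = new KafkaConsumer[String, String](props)
    val tp = new TopicPartition("dice-ingestion", 11) // partition from the error
    consumer.assign(java.util.Collections.singletonList(tp))
    consumer.seek(tp, 0L) // offset from the error

    // If this also returns nothing after two minutes, the broker side
    // (connectivity, advertised listeners, partition leader) is the
    // problem, not Spark.
    val records = consumer.poll(Duration.ofMillis(120000L))
    println(s"fetched ${records.count()} records")
    consumer.close()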

Cheers,
-z

________________________________________
From: Debabrata Ghosh <mailfordebu@gmail.com>
Sent: Saturday, April 11, 2020 2:25
To: user
Subject: Re: Spark Streaming not working

Any solution, please?

On Fri, Apr 10, 2020 at 11:04 PM Debabrata Ghosh <mailfordebu@gmail.com> wrote:
Hi,
        I have a Spark Streaming application where Kafka is producing records, but unfortunately Spark Streaming isn't able to consume them.

I am hitting the following error:

20/04/10 17:28:04 ERROR Executor: Exception in task 0.5 in stage 0.0 (TID 24)
java.lang.AssertionError: assertion failed: Failed to get records for spark-executor-service-spark-ingestion dice-ingestion 11 0 after polling for 120000
        at scala.Predef$.assert(Predef.scala:170)
        at org.apache.spark.streaming.kafka010.CachedKafkaConsumer.get(CachedKafkaConsumer.scala:74)
        at org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:223)
        at org.apache.spark.streaming.kafka010.KafkaRDD$KafkaRDDIterator.next(KafkaRDD.scala:189)

Would you please be able to help with a resolution?

Thanks,
Debu
