spark-user mailing list archives

From ravidspark <>
Subject Spark maxTaskFailures is not recognized with Cassandra
Date Tue, 05 Jun 2018 19:19:49 GMT
Hi All,

I set spark.task.maxFailures to 10 in my Spark application, which reads from
Kafka and ingests data into Cassandra. When the Cassandra service is down, the
job does not retry the 10 times I configured; instead it retries with the
default maxFailures of 4. Is there something else I need to do to make Spark
retry the connection to Cassandra more than 4 times?
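For reference, spark.task.maxFailures is read once when the SparkContext is created, so it generally has to be supplied at submit time (or in SparkConf before the context exists) rather than set afterwards. A sketch of how it is typically passed; the master URL, class, and jar names below are placeholders, not details from the original post:

```shell
# Hedged sketch: supply spark.task.maxFailures when submitting the job.
# "spark://master:7077", "com.example.KafkaToCassandraJob", and
# "my-ingest-job.jar" are hypothetical placeholders.
spark-submit \
  --master spark://master:7077 \
  --conf spark.task.maxFailures=10 \
  --class com.example.KafkaToCassandraJob \
  my-ingest-job.jar
```

Note also that this property applies in cluster modes; when running with a plain local master, failed tasks are not retried unless the master URL itself requests it, e.g. `--master "local[4,10]"` (4 threads, up to 10 task failures).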

Thanks in Advance,
