spark-user mailing list archives

Site index · List index
Message view « Date » · « Thread »
Top « Date » · « Thread »
From ravidspark <ravi.pegas...@gmail.com>
Subject Spark maxTaskFailures is not recognized with Cassandra
Date Tue, 05 Jun 2018 19:19:49 GMT
Hi All,

I configured spark.task.maxFailures=10 in my Spark application, which reads from
Kafka and ingests the data into Cassandra. I observed that when the Cassandra
service is down, Spark does not retry failed tasks 10 times as configured;
it retries only 4 times, the default value of spark.task.maxFailures. Is there
something I need to do to make Spark retry the connection to Cassandra more
than 4 times?
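
For reference, here is a minimal sketch of how I am setting the property (the
app name, Cassandra host, and the Kafka/Cassandra details below are
placeholders, not my real values):

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // spark.task.maxFailures is read when the SparkContext is created,
    // so it is set before getOrCreate() (it could equally be passed via
    // spark-submit --conf spark.task.maxFailures=10).
    val conf = new SparkConf()
      .setAppName("kafka-to-cassandra-ingest")              // placeholder
      .set("spark.task.maxFailures", "10")
      .set("spark.cassandra.connection.host", "127.0.0.1")  // placeholder

    val spark = SparkSession.builder().config(conf).getOrCreate()
    // ... read from Kafka, transform, and write to Cassandra ...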

Thanks in advance,
Ravi





