spark-user mailing list archives

From Pablo Federigi <pablo.feder...@mercadolibre.com>
Subject SaveToCassandra - how to handle failed inserts?
Date Fri, 07 Oct 2016 15:30:38 GMT
Hello

In the following example I'm using the saveToCassandra method from the
spark-cassandra-connector:

RDDJavaFunctions<Tuple2<String, Integer>> dsJF1 =
        CassandraJavaUtil.javaFunctions(result);
dsJF1.writerBuilder("test_keyspace", "test",
                CassandraJavaUtil.mapTupleToRow(String.class, Integer.class))
        .withColumnSelector(CassandraJavaUtil.someColumns("column1", "column2"))
        .saveToCassandra();

In the example above, suppose that result has 1000 records and just one
record fails to write to Cassandra (even after the configured retry policy
has been exhausted).

I'd like to know how to handle records that the driver was not able to
write to Cassandra (for example, due to a timeout exception). Is there a
way to log the failed records?
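For context, one pattern I've been considering (a sketch only, not the
connector's API) is to skip saveToCassandra and instead write records
individually inside foreachPartition, collecting the failures so they can be
logged or replayed. In this sketch, writeRecord is a hypothetical stand-in
for a real single-row write via the Cassandra Java driver; it simulates a
failure on negative values:

```java
import java.util.ArrayList;
import java.util.List;

public class FailedRecordCollector {

    // Hypothetical stand-in for a real single-row write (e.g. via the
    // Cassandra Java driver). Here it rejects negative values to simulate
    // a write failure such as a timeout.
    static void writeRecord(int value) {
        if (value < 0) {
            throw new RuntimeException("simulated write failure for " + value);
        }
    }

    // The pattern you could run inside rdd.foreachPartition(...): attempt
    // each record, and collect failures instead of failing the whole batch.
    static List<Integer> writeAll(List<Integer> records) {
        List<Integer> failed = new ArrayList<>();
        for (int r : records) {
            try {
                writeRecord(r);
            } catch (RuntimeException e) {
                failed.add(r); // log here, or persist for later replay
            }
        }
        return failed;
    }

    public static void main(String[] args) {
        List<Integer> failed = writeAll(List.of(1, -2, 3, -4));
        System.out.println("failed records: " + failed);
    }
}
```

The trade-off is losing the connector's batching and token-aware routing,
so this would only make sense where per-record failure handling matters
more than write throughput.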

Thanks,
Pablo




