I can reply from a user’s perspective – I’ll defer on semantic guarantees to someone with more experience.

I’ve successfully implemented the following using a custom Accumulable class:
Everything works as expected in terms of functionality, with two caveats:
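For reference, a custom accumulator along these lines boils down to implementing a zero element plus a merge operation. Below is a minimal sketch of that contract in plain Python, with no Spark runtime attached — the method names mirror pyspark's AccumulatorParam interface (zero, addInPlace) rather than the Scala Accumulable API, and the VectorAccumulatorParam class and values here are purely illustrative:

```python
# Sketch of the contract a custom Spark accumulator follows, in plain
# Python (no Spark runtime). Method names mirror pyspark's
# AccumulatorParam interface; the class itself is illustrative.

class VectorAccumulatorParam:
    """Accumulates element-wise sums of fixed-length vectors."""

    def zero(self, initial_value):
        # The identity element: a zero vector of the same length.
        return [0.0] * len(initial_value)

    def addInPlace(self, v1, v2):
        # Merge two partial results. This must be commutative and
        # associative, since per-task updates are merged in no
        # guaranteed order.
        for i in range(len(v1)):
            v1[i] += v2[i]
        return v1

# Simulate what the driver does: start from zero, fold in the
# update produced by each completed task.
param = VectorAccumulatorParam()
total = param.zero([0.0, 0.0, 0.0])
for task_update in ([1.0, 2.0, 3.0], [10.0, 20.0, 30.0]):
    total = param.addInPlace(total, task_update)
print(total)  # [11.0, 22.0, 33.0]
```

The commutative/associative requirement is the important design point: the driver cannot control the order in which task results arrive, so the merge must give the same answer for any ordering.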
Hope this helps,

From: "Sela, Amit"
Date: Monday, October 26, 2015 at 11:13 AM
To: "user@spark.apache.org"
Subject: Accumulators internals and reliability

It seems like there is not much literature about Spark’s Accumulators, so I thought I’d ask here:

Do Accumulators reside in a Task? Are they serialized with the task, and sent back on task completion as part of the ResultTask?

Are they reliable? If so, when? Can I rely on an accumulator’s value only after the task has completed successfully (meaning in the driver), or during task execution as well (and what about speculative execution)?

What are the limitations on the number (or size) of Accumulators ?