I'd like to know whether this architecture is correct. We are evaluating Spark as our ETL engine, and we have a UI designer for the graph; this gives us a model that we want to translate into the corresponding Spark executions.
That brings us to Akka FSM. Using the same SparkContext for all actors, we assume that parallelism will be handled by Spark (depending on configuration, of course) and by Akka; each node actor will be executed only once all of its incoming edges are completed.
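To make the idea concrete, here is a minimal sketch of the dependency rule (a node runs only after all of its in-edges complete). It uses plain Java `CompletableFuture` instead of Akka actors, and an in-memory list in place of the real Spark jobs; the node names and the graph itself are made up for illustration:

```java
import java.util.*;
import java.util.concurrent.*;

public class DagRunner {
    // Hypothetical 4-node DAG (names are illustrative): A and B feed C, C feeds D.
    // Each node runs only after all of its parents have completed, mirroring the
    // "node actor fires once its incoming edges are done" idea from the question.
    public static List<String> runDag() throws Exception {
        Map<String, List<String>> parents = new LinkedHashMap<>();
        parents.put("A", List.of());
        parents.put("B", List.of());
        parents.put("C", List.of("A", "B"));
        parents.put("D", List.of("C"));

        ExecutorService pool = Executors.newFixedThreadPool(4);
        Map<String, CompletableFuture<Void>> done = new HashMap<>();
        List<String> order = Collections.synchronizedList(new ArrayList<>());

        // Iterate in insertion (topological) order so parent futures already exist.
        for (String node : parents.keySet()) {
            CompletableFuture[] deps = parents.get(node).stream()
                    .map(done::get).toArray(CompletableFuture[]::new);
            // allOf(empty array) is already completed, so roots start immediately.
            done.put(node, CompletableFuture.allOf(deps)
                    .thenRunAsync(() -> order.add(node), pool)); // stand-in for the Spark job
        }
        done.get("D").join(); // D is the only sink; joining it waits for the whole DAG
        pool.shutdown();
        return new ArrayList<>(order);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runDag()); // A and B may finish in either order
    }
}
```

With actors, the same structure would hold: each node actor waits for "done" messages from its parent actors before submitting its own job on the shared SparkContext.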
Is that correct, or is there a simpler way to do it? I took a look at GraphX, but it looks like it is aimed more at joins than at this kind of graph manipulation, or maybe I've overlooked something.
Regards,
Ing. Ivaldi Andres