spark-user mailing list archives

From Andrés Ivaldi <>
Subject Multiple Spark tasks with Akka FSM
Date Wed, 09 Mar 2016 16:46:52 GMT

I'd like to know whether this architecture is correct. We are evaluating
Spark as our ETL engine. We have a UI designer for the graph, which gives us
a model that we want to translate into the corresponding Spark executions.
That led us to Akka FSM: using the same SparkContext for all actors, we
assume parallelism will be handled by Spark (depending on configuration, of
course) and by Akka, and each node actor will execute only once all of its
incoming edges have completed.
Is that correct, or is there a simpler way to do it? I took a look at
GraphX, but it looks like it is more for joins than for this kind of graph
manipulation, or maybe I've overlooked something.
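
For reference, the gating rule each node actor should enforce (run only
once every in-edge is complete) can be sketched independently of the Spark
and Akka parts, as a Kahn-style ordering over the ETL graph. The node names
and the `EtlScheduler` object here are hypothetical, just to illustrate the
dependency logic; in the real design each "ready" node would be an actor
submitting its job on the shared SparkContext:

```scala
// Sketch of the dependency-gating logic only (no Spark/Akka specifics).
// A node becomes runnable when all of its in-edges (dependencies) are done.
object EtlScheduler {
  // deps: node -> set of nodes it depends on (its in-edges)
  def executionOrder(deps: Map[String, Set[String]]): List[String] = {
    // Include nodes that only appear as dependencies (sources with no in-edges)
    val nodes = deps.keySet ++ deps.values.flatten
    var remaining = nodes.map(n => n -> deps.getOrElse(n, Set.empty[String])).toMap
    val order = scala.collection.mutable.ListBuffer.empty[String]
    while (remaining.nonEmpty) {
      // Nodes whose dependencies have all completed are ready to run
      val ready = remaining.collect { case (n, d) if d.isEmpty => n }
      require(ready.nonEmpty, "cycle detected in ETL graph")
      order ++= ready
      // Mark the ready nodes as completed for the next round
      remaining = (remaining -- ready).map { case (n, d) => n -> (d -- ready) }
    }
    order.toList
  }
}
```

Nodes with no pending in-edges in the same round could be submitted
concurrently (separate jobs on one SparkContext are scheduled in parallel
when driven from separate threads, subject to the scheduler configuration).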

Ing. Ivaldi Andres
