spark-user mailing list archives

From Luis Guerra <>
Subject Spark execution plan
Date Wed, 23 Jul 2014 09:03:30 GMT
Hi all,

I was wondering how Spark deals with an execution plan. Taking Pig and its DAG
execution as an example, I would like to manage Spark in a similar way.

For instance, suppose my code has three different "parts", where A and B are
self-sufficient:

Part A:
var output_a
Part B:
var output_b
Part C:
...using output_a and output_b

What would the execution plan be in Spark? Could parts A and B somehow be
executed in parallel?

Related to this, are there thread implementations in Scala? Could that be a
solution for this scenario?
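For reference, one common pattern for this scenario is to submit the two independent parts from separate driver-side threads using Scala Futures, and have part C wait on both. The sketch below uses plain local computations as hypothetical stand-ins for the Spark jobs (`computePartA`/`computePartB` are illustrative names, not Spark API); in real code each would invoke a Spark action such as `count()` or `saveAsTextFile()`, letting the scheduler run both jobs concurrently if resources allow.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Hypothetical stand-ins for the two self-sufficient parts; in practice
// each would trigger a Spark action on its own RDD lineage.
def computePartA(): Long = (1L to 100L).sum                  // part A -> output_a
def computePartB(): Long = (1L to 100L).map(x => x * x).sum  // part B -> output_b

// Parts A and B are kicked off concurrently: each Future runs on a
// separate driver thread, so their jobs can be scheduled in parallel.
val futureA: Future[Long] = Future { computePartA() }
val futureB: Future[Long] = Future { computePartB() }

// Part C: combine output_a and output_b once both are available.
val outputC: Long = Await.result(
  for { a <- futureA; b <- futureB } yield a + b,
  30.seconds
)

println(outputC) // 5050 + 338350 = 343400
```

Note that by default Spark schedules concurrently submitted jobs FIFO; a fair scheduler pool can be configured if the jobs should share cluster resources more evenly.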

