My team is investigating a number of technologies in the Big Data space. A team member recently got turned on to Cascading as an application layer for orchestrating complex workflows/scenarios. He asked me whether Spark has an "application layer." My initial reaction was "no": Spark would not have a separate orchestration/application layer. Instead, the core Spark API (along with Spark Streaming) would compete directly with Cascading for this kind of functionality, and the two would not likely be all that complementary. I realize that I am exposing my ignorance here and could be way off. Is there anyone who knows a bit about both of these technologies who could speak to this in broad strokes?