Thanks for the quick answer :)
Sadly, the comment in the page doesn’t answer my questions. More specifically:
1. GraphFrames' last activity on GitHub was 2 months ago, and the last release was on 12 Nov 2016. Until recently, 2 months was close to a full Spark release cycle. Why has there been no major development since mid-November?
2. The page you linked refers to a *plan* to move GraphFrames to the standard Spark release cycle. Is this *plan* publicly available / visible?
3. I couldn’t find any statement of intent to preserve one API or the other, or to merge them: in other words, there seems to be no overarching plan for a cohesive & comprehensive graph API (I apologise in advance if I’m wrong).
4. I was initially impressed by GraphFrames' query syntax, which in places is similar to Neo4j Cypher (now open source), but I later understood it to be an incomplete, lightweight experiment (with no intention of moving to full compatibility, perhaps for good reasons). To me it sort of sent the wrong message.
5. In the meantime, the world of graphs is changing. The GraphBLAS forum seems to be gaining some traction: a library based on GraphBLAS, Graphulo, has been made available on Accumulo. Assuming that Spark is NOT going to adopt similar lines, nor to follow DataStax with TinkerPop and Gremlin, again, what is the new, cohesive & comprehensive API that Spark is going to deliver?
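To make point 4 concrete, the Cypher-like part of GraphFrames is its motif-finding API. A minimal sketch (the session setup and the vertex/edge data are illustrative, and this assumes the graphframes package is on the classpath):

```scala
import org.apache.spark.sql.SparkSession
import org.graphframes.GraphFrame

val spark = SparkSession.builder
  .appName("motif-sketch")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Toy vertex and edge DataFrames; "id", "src" and "dst"
// are the column names GraphFrames expects.
val vertices = Seq(("a", "Alice"), ("b", "Bob"), ("c", "Carol"))
  .toDF("id", "name")
val edges = Seq(("a", "b", "follows"), ("b", "c", "follows"))
  .toDF("src", "dst", "relationship")

val g = GraphFrame(vertices, edges)

// Motif query: all two-hop chains x -> y -> z, roughly the Cypher
//   MATCH (x)-[]->(y)-[]->(z) RETURN x, y, z
val chains = g.find("(x)-[e1]->(y); (y)-[e2]->(z)")
chains.show()
```

The resemblance to Cypher stops at fixed-length motifs like the one above; further conditions are expressed as ordinary DataFrame filters rather than a declarative graph query language, which is part of why it feels like a lightweight experiment.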
Sadly, this API uncertainty may push developers towards platforms with more stable APIs and roadmaps.
Your question is answered here under "Will GraphFrames be part of Apache Spark?", no?
Please see this email trail: no answer so far on the user@spark list. Trying the developer list for better luck.
I am a bit confused by the current roadmap for graph and graph analytics in Apache Spark.
I understand that we have had for some time two libraries (the following is my understanding - please amend as appropriate!):
. GraphX, part of the Spark project. This library is based on RDDs and is only accessible via Scala. It doesn’t look as though this library has been enhanced recently.
. GraphFrames, an independent (at the moment?) library for Spark. This library is based on Spark DataFrames and is accessible from Scala & Python. The last commit on GitHub was 2 months ago.
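For illustration, the difference between the two APIs on the same toy graph might look like this (a sketch only; the data and application names are made up, and graphframes is assumed to be on the classpath):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.graphx.{Edge, Graph}
import org.graphframes.GraphFrame

val spark = SparkSession.builder
  .appName("graph-apis")
  .master("local[*]")
  .getOrCreate()
val sc = spark.sparkContext
import spark.implicits._

// GraphX: RDD-based, Scala only.
val vRdd = sc.parallelize(Seq((1L, "Alice"), (2L, "Bob")))
val eRdd = sc.parallelize(Seq(Edge(1L, 2L, "follows")))
val rddGraph = Graph(vRdd, eRdd)
println(rddGraph.inDegrees.collect().mkString(", "))

// GraphFrames: DataFrame-based, also usable from Python.
val v = Seq(("1", "Alice"), ("2", "Bob")).toDF("id", "name")
val e = Seq(("1", "2", "follows")).toDF("src", "dst", "relationship")
val dfGraph = GraphFrame(v, e)
dfGraph.inDegrees.show()
```

The two graphs hold the same data but expose different abstractions (RDDs vs DataFrames), which is exactly why the lack of a convergence plan between them is confusing.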
GraphFrames came about with the promise of, at some point, being integrated into Apache Spark.
I can see other projects coming up with interesting libraries and ideas (e.g. Graphulo on Accumulo, a new project with the goal of implementing the GraphBLAS building blocks for graph algorithms on top of Accumulo).
Where is Apache Spark going?
Where are graph libraries in the roadmap?
Thanks for any clarity brought to this matter.
Begin forwarded message:
Subject: Re: Question on Spark's graph libraries
Date: 10 March 2017 at 13:13:15 CET