Hi Ankur,

This is a great question as I've heard similar concerns about Spark on Mesos.

When I started contributing to Spark on Mesos about half a year ago, the Mesos scheduler and related code hadn't really gotten much attention from anyone and was pretty much in maintenance mode.

As a Mesos PMC member who is really interested in Spark, I started going through the different JIRAs and PRs around the Mesos scheduler, and from there went on to fix various bugs in Spark, add documentation, and fix related issues on the Mesos side as well.

Just recently for 1.4 we've merged in cluster mode and Docker support, and there are also pending PRs around framework authentication, multi-role support, dynamic allocation, finer-tuned coarse-grained mode scheduling configurations, etc.
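
For anyone who wants to try the Docker support, here's a rough sketch of what it looks like from the application side (the master URL and image name below are placeholders, and spark.mesos.executor.docker.image is the property documented for 1.4):

  import org.apache.spark.{SparkConf, SparkContext}

  // Launch executors from a Docker image on a Mesos cluster.
  // Substitute your own Mesos master URL and image name.
  val conf = new SparkConf()
    .setAppName("mesos-docker-example")
    .setMaster("mesos://zk://zk1:2181,zk2:2181/mesos")
    .set("spark.mesos.executor.docker.image", "example/spark:1.4.0")
    .set("spark.executor.memory", "2g")

  val sc = new SparkContext(conf)
  println(sc.parallelize(1 to 1000).sum())  // quick smoke test
  sc.stop()

Cluster mode goes through the new MesosClusterDispatcher: if I recall the docs correctly, you start it with sbin/start-mesos-dispatcher.sh and then run spark-submit with --deploy-mode cluster against the dispatcher's mesos://host:7077 URL.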

And finally, I just want to mention that Mesosphere and Typesafe are collaborating to bring a certified distribution (https://databricks.com/spark/certification/certified-spark-distribution) of Spark to Mesos and DCOS, and we will be pouring resources into not just maintaining Spark on Mesos but driving more features into the Mesos scheduler, and into Mesos itself, so stateful services can leverage new APIs and features to make better scheduling decisions and optimizations.

I don't have a solidified roadmap to share yet, but we will be discussing this and hopefully can share one with the community soon.

In summary, Spark on Mesos is not dead or in maintenance mode, and you can look forward to seeing a lot more changes from us and the community.

Tim

On Thu, May 14, 2015 at 11:30 PM, Ankur Chauhan <ankur@malloc64.com> wrote:

Hi,

This is both a survey-type question and a roadmap query. Of the cluster
options for running Spark (i.e. YARN and Mesos), YARN seems to be
getting a lot more attention and patches compared to Mesos.

Would it be correct to assume that Spark on Mesos is more or less dead,
or something like a maintenance-only feature, and that YARN is the
recommended way to go?

What is the roadmap for Spark on Mesos, and what is the roadmap for
Spark on YARN? I like Mesos, so as much as I would like to see it
thrive, I don't think that part of the Spark community is active (or
maybe it just appears that way).

Another, more community-oriented question: what do most people use to
run Spark in production or in more-than-POC products? Why did you make
that decision?

There was a similar post from early 2014 where Matei answered that
Mesos and YARN were equally important, but has this changed now that
Spark has almost reached the 1.4.0 stage?

-- Ankur Chauhan
