spark-dev mailing list archives

From "Reynold Xin" <r...@databricks.com>
Subject Spark 3.0 branch cut and code freeze on Jan 31?
Date Mon, 23 Dec 2019 23:48:28 GMT
We've pushed out 3.0 multiple times. The latest release window documented on the website (
http://spark.apache.org/versioning-policy.html ) says we'd code freeze and cut branch-3.0
in early December. It looks like we are suffering a bit from the tragedy of the commons, in
that nobody is pushing to get the release out. I understand the natural tendency for each
individual to finish or extend the feature or bug fix they have been working on. At some point
we need to say "this is it" and get the release out. I'm happy to help drive this process.

To be realistic, I don't think we should code freeze *today*. Although we have updated
the website, contributors have all been operating under the assumption that all active
development is still going on. I propose we *cut the branch on Jan 31*, code freeze and
switch over to bug-squashing mode, and try to get the official 3.0 release out in Q1. That is,
by default no new features can go into the branch starting Jan 31.

What do you think?

And happy holidays everybody.