spark-dev mailing list archives

From Hyukjin Kwon <gurwls...@gmail.com>
Subject Re: Spark 3.0 branch cut and code freeze on Jan 31?
Date Tue, 24 Dec 2019 07:25:16 GMT
Sounds fine.
I am trying to get the pandas UDF redesign (SPARK-28264
<https://issues.apache.org/jira/browse/SPARK-28264>) done on time. Hope I can
make it.
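For context, a minimal sketch of the type-hint-based pandas UDF style that the
SPARK-28264 discussion moves toward; this is illustrative only, assuming the
Series-to-Series form, and not the final API decided in the JIRA:

    import pandas as pd
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import pandas_udf

    spark = SparkSession.builder.getOrCreate()

    # Declare the UDF kind through Python type hints on the function
    # signature rather than an explicit PandasUDFType argument.
    @pandas_udf("long")
    def plus_one(s: pd.Series) -> pd.Series:
        return s + 1

    df = spark.range(3)
    df.select(plus_one(df["id"])).show()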

On Tue, Dec 24, 2019 at 4:17 PM, Wenchen Fan <cloud0fan@gmail.com> wrote:

> Sounds good!
>
> On Tue, Dec 24, 2019 at 7:48 AM Reynold Xin <rxin@databricks.com> wrote:
>
>> We've pushed the 3.0 release date out multiple times. The latest release
>> window documented on the website
>> <http://spark.apache.org/versioning-policy.html> says we'd code freeze
>> and cut branch-3.0 in early December. It looks like we are suffering a
>> bit from the tragedy of the commons, in that nobody is pushing to get
>> the release out. I understand that each individual's natural tendency
>> is to finish or extend the feature or bug fix they have been working
>> on, but at some point we need to say "this is it" and get the release
>> out. I'm happy to help drive this process.
>>
>> To be realistic, I don't think we should code freeze today. Although we
>> have updated the website, contributors have all been operating under
>> the assumption that all active development is still going on. I propose
>> we cut the branch on Jan 31, code freeze and switch over to
>> bug-squashing mode, and try to get the official 3.0 release out in Q1.
>> That is, by default no new features can go into the branch starting
>> Jan 31.
>>
>> What do you think?
>>
>> And happy holidays everybody.
