spark-dev mailing list archives

From Xiangrui Meng <m...@databricks.com>
Subject Re: [VOTE] SPARK 2.4.0 (RC2)
Date Mon, 01 Oct 2018 17:11:40 GMT
On Mon, Oct 1, 2018 at 9:52 AM Holden Karau <holden.karau@gmail.com> wrote:

> Oh that does look like an important correctness issue.
> -1
>
> On Mon, Oct 1, 2018, 9:57 AM Marco Gaido <marcogaido91@gmail.com> wrote:
>
>> -1, I was able to reproduce SPARK-25538 with the provided data.
>>
>> On Mon, Oct 1, 2018 at 09:11, Ted Yu <yuzhihong@gmail.com> wrote:
>>
>>> +1
>>>
>>> -------- Original message --------
>>> From: Denny Lee <denny.g.lee@gmail.com>
>>> Date: 9/30/18 10:30 PM (GMT-08:00)
>>> To: Stavros Kontopoulos <stavros.kontopoulos@lightbend.com>
>>> Cc: Sean Owen <srowen@gmail.com>, Wenchen Fan <cloud0fan@gmail.com>,
>>> dev <dev@spark.apache.org>
>>> Subject: Re: [VOTE] SPARK 2.4.0 (RC2)
>>>
>>> +1 (non-binding)
>>>
>>>
>>> On Sat, Sep 29, 2018 at 10:24 AM Stavros Kontopoulos <
>>> stavros.kontopoulos@lightbend.com> wrote:
>>>
>>>> +1
>>>>
>>>> Stavros
>>>>
>>>> On Sat, Sep 29, 2018 at 5:59 AM, Sean Owen <srowen@gmail.com> wrote:
>>>>
>>>>> +1, with comments:
>>>>>
>>>>> There are 5 critical issues for 2.4, and no blockers:
>>>>> SPARK-25378 ArrayData.toArray(StringType) assume UTF8String in 2.4
>>>>> SPARK-25325 ML, Graph 2.4 QA: Update user guide for new features & APIs
>>>>> SPARK-25319 Spark MLlib, GraphX 2.4 QA umbrella
>>>>> SPARK-25326 ML, Graph 2.4 QA: Programming guide update and migration guide
>>>>> SPARK-25323 ML 2.4 QA: API: Python API coverage
>>>>>
>>>>> Xiangrui, is SPARK-25378 important enough we need to get it into 2.4?
>>>>>
>>>>
IMHO, the use case (spark-tensorflow-connector) is very important, but
whether we need to fix it in the 2.4 branch depends on the release timeline.
See my comment in the JIRA:
https://issues.apache.org/jira/browse/SPARK-25378


>
>>>>> I found two issues resolved for 2.4.1 that got into this RC, so marked
>>>>> them as resolved in 2.4.0.
>>>>>
>>>>> I checked the licenses and NOTICE and they look correct now in source
>>>>> and binary builds.
>>>>>
>>>>> The 2.12 artifacts are as I'd expect.
>>>>>
>>>>> I ran all tests for 2.11 and 2.12 and they pass with -Pyarn
>>>>> -Pkubernetes -Pmesos -Phive -Phadoop-2.7 -Pscala-2.12.
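[For readers reproducing this check: a minimal sketch of an RC test run with
the profile set Sean lists. The build entry points (`build/mvn`,
`dev/change-scala-version.sh`) are the usual ones in a Spark source checkout;
the exact sequence here is an assumption, not necessarily what Sean ran, so
the commands are printed rather than executed.]

```shell
# Sketch of a full-profile RC test run; commands are echoed so the recipe
# is visible in one place. Run them manually from an RC source checkout.
PROFILES="-Pyarn -Pkubernetes -Pmesos -Phive -Phadoop-2.7 -Pscala-2.12"
# For the Scala 2.12 build, switch the POMs first:
echo "./dev/change-scala-version.sh 2.12"
# Build once without tests, then run the full suite:
echo "./build/mvn $PROFILES -DskipTests clean install"
echo "./build/mvn $PROFILES test"
```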
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Thu, Sep 27, 2018 at 10:00 PM Wenchen Fan <cloud0fan@gmail.com>
>>>>> wrote:
>>>>> >
>>>>> > Please vote on releasing the following candidate as Apache Spark
>>>>> > version 2.4.0.
>>>>> >
>>>>> > The vote is open until October 1 PST and passes if a majority +1 PMC
>>>>> > votes are cast, with a minimum of 3 +1 votes.
>>>>> >
>>>>> > [ ] +1 Release this package as Apache Spark 2.4.0
>>>>> > [ ] -1 Do not release this package because ...
>>>>> >
>>>>> > To learn more about Apache Spark, please see
>>>>> > http://spark.apache.org/
>>>>> >
>>>>> > The tag to be voted on is v2.4.0-rc2 (commit
>>>>> > 42f25f309e91c8cde1814e3720099ac1e64783da):
>>>>> > https://github.com/apache/spark/tree/v2.4.0-rc2
>>>>> >
>>>>> > The release files, including signatures, digests, etc. can be found
>>>>> > at:
>>>>> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc2-bin/
>>>>> >
>>>>> > Signatures used for Spark RCs can be found in this file:
>>>>> > https://dist.apache.org/repos/dist/dev/spark/KEYS
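[For readers verifying the staged artifacts: a sketch of checking a release
signature against the KEYS file above. The artifact name below is an
assumption -- list the -bin directory for the real file names. The commands
are printed rather than executed so the recipe stays self-contained.]

```shell
# Sketch of signature verification for one staged binary artifact.
BASE="https://dist.apache.org/repos/dist/dev/spark"
ARTIFACT="spark-2.4.0-bin-hadoop2.7.tgz"   # assumed name; check the listing
echo "curl -sO $BASE/KEYS"
echo "curl -sO $BASE/v2.4.0-rc2-bin/$ARTIFACT"
echo "curl -sO $BASE/v2.4.0-rc2-bin/$ARTIFACT.asc"
echo "gpg --import KEYS"
echo "gpg --verify $ARTIFACT.asc $ARTIFACT"  # should report a good signature
```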
>>>>> >
>>>>> > The staging repository for this release can be found at:
>>>>> > https://repository.apache.org/content/repositories/orgapachespark-1287
>>>>> >
>>>>> > The documentation corresponding to this release can be found at:
>>>>> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc2-docs/
>>>>> >
>>>>> > The list of bug fixes going into 2.4.0 can be found at the following
>>>>> > URL:
>>>>> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>>>>> >
>>>>> > FAQ
>>>>> >
>>>>> > =========================
>>>>> > How can I help test this release?
>>>>> > =========================
>>>>> >
>>>>> > If you are a Spark user, you can help us test this release by taking
>>>>> > an existing Spark workload, running it on this release candidate, and
>>>>> > then reporting any regressions.
>>>>> >
>>>>> > If you're working in PySpark, you can set up a virtual env, install
>>>>> > the current RC, and see if anything important breaks. In Java/Scala,
>>>>> > you can add the staging repository to your project's resolvers and
>>>>> > test with the RC (make sure to clean up the artifact cache
>>>>> > before/after so you don't end up building with an out-of-date RC
>>>>> > going forward).
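[For readers following the testing instructions above: a sketch of both
paths. The pyspark tarball name is an assumption -- check the -bin
directory for the real one -- and the resolver line is an sbt example;
Maven users would add the staging URL as a `<repository>` instead. The
commands are printed rather than executed.]

```shell
# Sketch of the two RC test paths described above.
RC_BIN="https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc2-bin"
STAGING="https://repository.apache.org/content/repositories/orgapachespark-1287"
# PySpark: isolated virtualenv, then install the RC tarball directly:
echo "python -m venv rc2-env && . rc2-env/bin/activate"
echo "pip install $RC_BIN/pyspark-2.4.0.tar.gz"   # assumed tarball name
# Java/Scala (sbt): add the staging repo as a resolver, e.g. in build.sbt:
echo "resolvers += \"spark-rc2\" at \"$STAGING/\""
# Clear the local artifact cache (e.g. ~/.ivy2/cache/org.apache.spark)
# before and after testing so a stale RC doesn't linger.
```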
>>>>> >
>>>>> > ===========================================
>>>>> > What should happen to JIRA tickets still targeting 2.4.0?
>>>>> > ===========================================
>>>>> >
>>>>> > The current list of open tickets targeted at 2.4.0 can be found at:
>>>>> > https://issues.apache.org/jira/projects/SPARK and search for
>>>>> > "Target Version/s" = 2.4.0
>>>>> >
>>>>> > Committers should look at those and triage. Extremely important bug
>>>>> > fixes, documentation, and API tweaks that impact compatibility should
>>>>> > be worked on immediately. Everything else please retarget to an
>>>>> > appropriate release.
>>>>> >
>>>>> > ==================
>>>>> > But my bug isn't fixed?
>>>>> > ==================
>>>>> >
>>>>> > In order to make timely releases, we will typically not hold the
>>>>> > release unless the bug in question is a regression from the previous
>>>>> > release. That being said, if there is something which is a regression
>>>>> > that has not been correctly targeted please ping me or a committer to
>>>>> > help target the issue.
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>>>>
>>>>>
>>>> --

Xiangrui Meng

Software Engineer

Databricks Inc. <http://databricks.com/>
