spark-dev mailing list archives

From: Takeshi Yamamuro <linguin....@gmail.com>
Subject: Re: [VOTE] SPARK 3.0.0-preview (RC2)
Date: Fri, 01 Nov 2019 21:55:19 GMT
+1, too.

On Sat, Nov 2, 2019 at 3:36 AM Hyukjin Kwon <gurwls223@gmail.com> wrote:

> +1
>
> On Fri, 1 Nov 2019, 15:36 Wenchen Fan, <cloud0fan@gmail.com> wrote:
>
>> The PR builder uses the Hadoop 2.7 profile, which makes me think that 2.7
>> is more stable and that we should make releases with 2.7 by default.
>>
>> +1
>>
>> On Fri, Nov 1, 2019 at 7:16 AM Xiao Li <lixiao@databricks.com> wrote:
>>
>>> Spark 3.0 will still use the Hadoop 2.7 profile by default, I think.
>>> The Hadoop 2.7 profile is much more stable than the Hadoop 3.2 profile.
>>>
>>> On Thu, Oct 31, 2019 at 3:54 PM Sean Owen <srowen@gmail.com> wrote:
>>>
>>>> This isn't a big thing, but I see that the pyspark build includes
>>>> Hadoop 2.7 rather than 3.2. Maybe later we can change the build to use
>>>> 3.2 by default.
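>>>>
>>>> For reference, one quick way to confirm which Hadoop version a given
>>>> pyspark build actually bundles (just a sketch; SparkContext._jvm is an
>>>> internal py4j handle, used here only for inspection):
>>>>
>>>>     from pyspark.sql import SparkSession
>>>>
>>>>     # start a tiny local session and ask the bundled Hadoop jars
>>>>     # for their version string (e.g. 2.7.x vs 3.2.x)
>>>>     spark = SparkSession.builder.master("local[1]").getOrCreate()
>>>>     jvm = spark.sparkContext._jvm
>>>>     print(jvm.org.apache.hadoop.util.VersionInfo.getVersion())
>>>>     spark.stop()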
>>>>
>>>> Otherwise, the tests all seem to pass with JDK 8 / 11 with all
>>>> profiles enabled, so I'm +1 on it.
>>>>
>>>>
>>>> On Thu, Oct 31, 2019 at 1:00 AM Xingbo Jiang <jiangxb1987@gmail.com>
>>>> wrote:
>>>> >
>>>> > Please vote on releasing the following candidate as Apache Spark
>>>> version 3.0.0-preview.
>>>> >
>>>> > The vote is open until November 3 PST and passes if a majority of +1
>>>> > PMC votes are cast, with a minimum of 3 +1 votes.
>>>> >
>>>> > [ ] +1 Release this package as Apache Spark 3.0.0-preview
>>>> > [ ] -1 Do not release this package because ...
>>>> >
>>>> > To learn more about Apache Spark, please see http://spark.apache.org/
>>>> >
>>>> > The tag to be voted on is v3.0.0-preview-rc2 (commit
>>>> 007c873ae34f58651481ccba30e8e2ba38a692c4):
>>>> > https://github.com/apache/spark/tree/v3.0.0-preview-rc2
>>>> >
>>>> > The release files, including signatures, digests, etc. can be found
>>>> at:
>>>> > https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview-rc2-bin/
>>>> >
>>>> > Signatures used for Spark RCs can be found in this file:
>>>> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>> >
>>>> > The staging repository for this release can be found at:
>>>> >
>>>> https://repository.apache.org/content/repositories/orgapachespark-1336/
>>>> >
>>>> > The documentation corresponding to this release can be found at:
>>>> > https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview-rc2-docs/
>>>> >
>>>> > The list of bug fixes going into 3.0.0 can be found at the following
>>>> URL:
>>>> > https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>> >
>>>> > FAQ
>>>> >
>>>> > =========================
>>>> > How can I help test this release?
>>>> > =========================
>>>> >
>>>> > If you are a Spark user, you can help us test this release by taking
>>>> > an existing Spark workload and running it on this release candidate,
>>>> > then reporting any regressions.
>>>> >
>>>> > If you're working in PySpark, you can set up a virtual env, install
>>>> > the current RC, and see if anything important breaks. In Java/Scala,
>>>> > you can add the staging repository to your project's resolvers and
>>>> > test with the RC (make sure to clean up the artifact cache
>>>> > before/after so you don't end up building with an out-of-date RC
>>>> > going forward).
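>>>> >
>>>> > As a minimal PySpark sketch (assuming a fresh virtual env with the
>>>> > pyspark package from the -bin/ directory above pip-installed into
>>>> > it), a quick smoke test could look like:
>>>> >
>>>> >     from pyspark.sql import SparkSession
>>>> >
>>>> >     # run inside the virtual env that holds the RC install
>>>> >     spark = (SparkSession.builder
>>>> >              .master("local[2]")
>>>> >              .appName("rc-smoke-test")
>>>> >              .getOrCreate())
>>>> >
>>>> >     # exercise a simple aggregation end to end:
>>>> >     # ids 0..99 modulo 7 yield exactly 7 distinct buckets
>>>> >     df = spark.range(100).selectExpr("id", "id % 7 AS bucket")
>>>> >     assert df.groupBy("bucket").count().count() == 7
>>>> >
>>>> >     print(spark.version)  # should report the 3.0.0-preview version
>>>> >     spark.stop()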
>>>> >
>>>> > ===========================================
>>>> > What should happen to JIRA tickets still targeting 3.0.0?
>>>> > ===========================================
>>>> >
>>>> > The current list of open tickets targeted at 3.0.0 can be found at
>>>> > https://issues.apache.org/jira/projects/SPARK by searching for
>>>> > "Target Version/s" = 3.0.0.
>>>> >
>>>> > Committers should look at those and triage. Extremely important bug
>>>> > fixes, documentation, and API tweaks that impact compatibility should
>>>> > be worked on immediately.
>>>> >
>>>> > ==================
>>>> > But my bug isn't fixed?
>>>> > ==================
>>>> >
>>>> > In order to make timely releases, we will typically not hold the
>>>> > release unless the bug in question is a regression from the previous
>>>> > release. That said, if a regression has not been correctly targeted,
>>>> > please ping me or a committer to help target the issue.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>>>
>>>>
>>>
>>

-- 
---
Takeshi Yamamuro
