spark-dev mailing list archives

From Wenchen Fan <cloud0...@gmail.com>
Subject Re: [VOTE] SPARK 2.4.0 (RC1)
Date Tue, 18 Sep 2018 14:41:43 GMT
Thanks Marcelo for pointing out my gpg key issue! I've re-generated it and
uploaded it to the ASF spark repo. Let's see if it works in the next RC.
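For reference, updating the project KEYS file amounts to appending the new armored public key once. A minimal sketch of that idea (the helper name and layout here are illustrative assumptions, not the actual release tooling):

```python
# Hedged sketch: append a re-generated armored public key to a KEYS-style
# file of concatenated key blocks, skipping the write if that exact block
# is already present. Illustrative only; not the real ASF release scripts.

def append_key_to_keys_file(armored_key: str, keys_text: str) -> str:
    """Return the KEYS file content with the new key appended at most once."""
    if armored_key.strip() in keys_text:
        return keys_text  # key already present; nothing to do
    sep = "" if (not keys_text or keys_text.endswith("\n")) else "\n"
    return keys_text + sep + armored_key.strip() + "\n"
```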

Thanks Saisai for pointing out the Python doc issue; I'll fix it in the next RC.

This RC fails because of:
1. the missing Scala 2.12 build
2. the gpg key issue
3. the Python doc issue
4. some other potential blocker issues.

I'll start RC2 once these blocker issues are either resolved or we decide
to mark them as non-blocker.

Thanks,
Wenchen

On Tue, Sep 18, 2018 at 9:48 PM Marco Gaido <marcogaido91@gmail.com> wrote:

> Sorry but I am -1 because of what was reported here:
> https://issues.apache.org/jira/browse/SPARK-22036?focusedCommentId=16618104&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16618104
> It is unfortunately a regression. Although the impact is not huge and there
> are workarounds, I think we should include the fix in 2.4.0. I created
> SPARK-25454 and submitted a PR for it.
> Sorry for the trouble.
>
> On Tue, Sep 18, 2018 at 05:23 Holden Karau <holden@pigscanfly.ca> wrote:
>
>> Deprecating Py 2 in the 2.4 release probably doesn't belong in the RC
>> vote thread. Personally I think we might be a little too late in the game
>> to deprecate it in 2.4, but I think calling it out as "soon to be
>> deprecated" in the release docs would be sensible to give folks extra time
>> to prepare.
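A "soon to be deprecated" notice could be as simple as a runtime warning; a hypothetical sketch (the function name and message are assumptions, not PySpark's actual behavior):

```python
import warnings

def warn_if_python2(version_info):
    """Emit a DeprecationWarning when running under Python 2.

    Sketch of the 'soon to be deprecated' notice discussed above; any
    real PySpark check may look different.
    """
    if version_info[0] < 3:
        warnings.warn(
            "Support for Python 2 is deprecated and will be removed "
            "in a future release.",
            DeprecationWarning,
        )
```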
>>
>> On Mon, Sep 17, 2018 at 2:04 PM Erik Erlandson <eerlands@redhat.com>
>> wrote:
>>
>>>
>>> I have no binding vote, but I second Stavros’ recommendation for
>>> SPARK-23200.
>>>
>>> Per the parallel threads on Py2 support, I would also like to propose
>>> deprecating Py2 starting with this 2.4 release.
>>>
>>> On Mon, Sep 17, 2018 at 10:38 AM Marcelo Vanzin
>>> <vanzin@cloudera.com.invalid> wrote:
>>>
>>>> You can log in to https://repository.apache.org and see what's wrong.
>>>> Just find that staging repo and look at the messages. In your case it
>>>> seems related to your signature.
>>>>
>>>> failureMessage: No public key: Key with id: (xxxx) was not able to be
>>>> located on http://gpg-keyserver.de/. Upload your public key and try
>>>> the operation again.
>>>> On Sun, Sep 16, 2018 at 10:00 PM Wenchen Fan <cloud0fan@gmail.com>
>>>> wrote:
>>>> >
>>>> > I confirmed that
>>>> > https://repository.apache.org/content/repositories/orgapachespark-1285
>>>> > is not accessible. I did it via ./dev/create-release/do-release-docker.sh
>>>> > -d /my/work/dir -s publish, and I'm not sure what's going wrong. I didn't
>>>> > see any error message during it.
>>>> >
>>>> > Any insights are appreciated, so that I can fix it in the next RC. Thanks!
>>>> >
>>>> > On Mon, Sep 17, 2018 at 11:31 AM Sean Owen <srowen@apache.org> wrote:
>>>> >>
>>>> >> I think one build is enough, but haven't thought it through. The
>>>> >> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>>>> >> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
>>>> >> Really, whatever's the easy thing to do.
>>>> >> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan <cloud0fan@gmail.com> wrote:
>>>> >> >
>>>> >> > Ah, I missed the Scala 2.12 build. Do you mean we should publish a
>>>> >> > Scala 2.12 build this time? Currently for Scala 2.11 we have 3 builds:
>>>> >> > with hadoop 2.7, with hadoop 2.6, and without hadoop. Shall we do the
>>>> >> > same thing for Scala 2.12?
>>>> >> >
>>>> >> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen <srowen@apache.org> wrote:
>>>> >> >>
>>>> >> >> A few preliminary notes:
>>>> >> >>
>>>> >> >> Wenchen, for some weird reason, when I hit your key in gpg --import,
>>>> >> >> it asks for a passphrase. When I skip it, it's fine; gpg can still
>>>> >> >> verify the signature. No issue there really.
>>>> >> >>
>>>> >> >> The staging repo gives a 404:
>>>> >> >> https://repository.apache.org/content/repositories/orgapachespark-1285/
>>>> >> >> 404 - Repository "orgapachespark-1285 (staging: open)"
>>>> >> >> [id=orgapachespark-1285] exists but is not exposed.
>>>> >> >>
>>>> >> >> The (revamped) licenses are OK, though there are some minor glitches
>>>> >> >> in the final release tarballs (my fault): there's an extra directory,
>>>> >> >> and the source release has both binary and source licenses. I'll fix
>>>> >> >> that. Not strictly necessary to reject the release over those.
>>>> >> >>
>>>> >> >> Last, when I check the staging repo I'll get my answer, but were you
>>>> >> >> able to build 2.12 artifacts as well?
>>>> >> >>
>>>> >> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan <cloud0fan@gmail.com> wrote:
>>>> >> >> >
>>>> >> >> > Please vote on releasing the following candidate as Apache Spark
>>>> >> >> > version 2.4.0.
>>>> >> >> >
>>>> >> >> > The vote is open until September 20 PST and passes if a majority
>>>> >> >> > of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>> >> >> >
>>>> >> >> > [ ] +1 Release this package as Apache Spark 2.4.0
>>>> >> >> > [ ] -1 Do not release this package because ...
>>>> >> >> >
>>>> >> >> > To learn more about Apache Spark, please see http://spark.apache.org/
>>>> >> >> >
>>>> >> >> > The tag to be voted on is v2.4.0-rc1 (commit
>>>> >> >> > 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
>>>> >> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
>>>> >> >> >
>>>> >> >> > The release files, including signatures, digests, etc. can be
>>>> >> >> > found at:
>>>> >> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
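One quick way to check downloaded artifacts against the published digests is a small SHA-512 comparison; a minimal sketch (parsing of the published .sha512 files is left out, and the file names in any usage are illustrative):

```python
import hashlib

def sha512_matches(path, expected_hex):
    """Compare a local file's SHA-512 against a published hex digest.

    Sketch only: the real .sha512 files in the dist area may use a
    layout (e.g. `shasum -a 512` output) that needs parsing first.
    """
    h = hashlib.sha512()
    with open(path, "rb") as f:
        # Hash incrementally so large release tarballs don't need to
        # fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex.strip().lower()
```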
>>>> >> >> >
>>>> >> >> > Signatures used for Spark RCs can be found in this file:
>>>> >> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>> >> >> >
>>>> >> >> > The staging repository for this release can be found at:
>>>> >> >> > https://repository.apache.org/content/repositories/orgapachespark-1285/
>>>> >> >> >
>>>> >> >> > The documentation corresponding to this release can be found at:
>>>> >> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
>>>> >> >> >
>>>> >> >> > The list of bug fixes going into 2.4.0 can be found at the
>>>> >> >> > following URL:
>>>> >> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>>>> >> >> >
>>>> >> >> > FAQ
>>>> >> >> >
>>>> >> >> > =========================
>>>> >> >> > How can I help test this release?
>>>> >> >> > =========================
>>>> >> >> >
>>>> >> >> > If you are a Spark user, you can help us test this release by
>>>> >> >> > taking an existing Spark workload and running it on this release
>>>> >> >> > candidate, then reporting any regressions.
>>>> >> >> >
>>>> >> >> > If you're working in PySpark, you can set up a virtual env and
>>>> >> >> > install the current RC and see if anything important breaks; in
>>>> >> >> > Java/Scala, you can add the staging repository to your project's
>>>> >> >> > resolvers and test with the RC (make sure to clean up the artifact
>>>> >> >> > cache before/after so you don't end up building with an out-of-date
>>>> >> >> > RC going forward).
>>>> >> >> >
>>>> >> >> > ===========================================
>>>> >> >> > What should happen to JIRA tickets still targeting 2.4.0?
>>>> >> >> > ===========================================
>>>> >> >> >
>>>> >> >> > The current list of open tickets targeted at 2.4.0 can be found at:
>>>> >> >> > https://issues.apache.org/jira/projects/SPARK and search for
>>>> >> >> > "Target Version/s" = 2.4.0
>>>> >> >> >
>>>> >> >> > Committers should look at those and triage. Extremely important
>>>> >> >> > bug fixes, documentation, and API tweaks that impact compatibility
>>>> >> >> > should be worked on immediately. Everything else, please retarget
>>>> >> >> > to an appropriate release.
>>>> >> >> >
>>>> >> >> > ==================
>>>> >> >> > But my bug isn't fixed?
>>>> >> >> > ==================
>>>> >> >> >
>>>> >> >> > In order to make timely releases, we will typically not hold the
>>>> >> >> > release unless the bug in question is a regression from the
>>>> >> >> > previous release. That being said, if there is something which is
>>>> >> >> > a regression that has not been correctly targeted, please ping me
>>>> >> >> > or a committer to help target the issue.
>>>>
>>>>
>>>>
>>>> --
>>>> Marcelo
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>>>
>>>>
>>
>> --
>> Twitter: https://twitter.com/holdenkarau
>> Books (Learning Spark, High Performance Spark, etc.):
>> https://amzn.to/2MaRAG9
>> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>>
>
