Might be off topic: regarding SPARK-24211 (flaky tests in StreamingJoinSuite), I might volunteer to take a look, but if the tests are not flaky on branch-2.4 and EOL for branch-2.3 is coming soon (in a few months), I wonder whether we still want to tackle it at all.

On Thu, Feb 7, 2019 at 2:21 PM, Sean Owen <srowen@gmail.com> wrote:
+1 from me. I built and tested the source release on the same env and
this time I'm not seeing failures. Good; no idea what happened before.

I updated Fix Version on JIRAs that were marked as 2.3.4 but went in
before the RC2 tag.

I'm kinda concerned that this test keeps failing in branch 2.3:

org.apache.spark.sql.streaming.StreamingOuterJoinSuite.left outer join
with non-key condition violated

It's among the items tracked in SPARK-24211.
I don't think it needs to block the release, since we believe it's
just the test that's flaky, but I'm wondering whether others are
seeing this failure when testing the release?
I did not see it fail in my own test runs.

On Tue, Feb 5, 2019 at 5:07 PM Takeshi Yamamuro <linguin.m.s@gmail.com> wrote:
> Please vote on releasing the following candidate as Apache Spark version 2.3.3.
> The vote is open until February 8, 6:00 PM (PST) and passes if a majority of +1 PMC votes are cast, with
> a minimum of 3 +1 votes.
> [ ] +1 Release this package as Apache Spark 2.3.3
> [ ] -1 Do not release this package because ...
> To learn more about Apache Spark, please see http://spark.apache.org/
> The tag to be voted on is v2.3.3-rc2 (commit 66fd9c34bf406a4b5f86605d06c9607752bd637a):
> https://github.com/apache/spark/tree/v2.3.3-rc2
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.3.3-rc2-bin/
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1298/
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.3.3-rc2-docs/
> The list of bug fixes going into 2.3.3 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12343759
> =========================
> How can I help test this release?
> =========================
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
> If you're working in PySpark, you can set up a virtual env, install
> the current RC, and see if anything important breaks; in Java/Scala,
> you can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
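
As a minimal sketch of the resolver step above, a Maven pom.xml fragment pointing at the staging repository listed earlier might look like this (the repository id is an arbitrary label chosen for this example):

```xml
<!-- Sketch: points Maven at the RC staging repository from this thread.
     The <id> value is arbitrary; only the <url> matters. -->
<repositories>
  <repository>
    <id>spark-2.3.3-rc2-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1298/</url>
  </repository>
</repositories>
```

Per the caveat above, remove this entry and clear the cached Spark artifacts (e.g. under ~/.m2/repository/org/apache/spark) once the vote concludes, so later builds don't pick up the stale RC.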
> ===========================================
> What should happen to JIRA tickets still targeting 2.3.3?
> ===========================================
> The current list of open tickets targeted at 2.3.3 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 2.3.3
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
> ==================
> But my bug isn't fixed?
> ==================
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
> P.S.
> I checked that all the tests passed on an Amazon Linux 2 AMI:
> $ java -version
> openjdk version "1.8.0_191"
> OpenJDK Runtime Environment (build 1.8.0_191-b12)
> OpenJDK 64-Bit Server VM (build 25.191-b12, mixed mode)
> $ ./build/mvn -Pyarn -Phadoop-2.7 -Phive -Phive-thriftserver -Pmesos -Psparkr test
> --
> Takeshi Yamamuro
