Cool, congrats!

Bests,
Takeshi

On Mon, Aug 26, 2019 at 1:01 PM Hichame El Khalfi <email@example.com> wrote:

That's Awesome !!!
Thanks to everyone that made this possible :cheers:
On Sun, Aug 25, 2019 at 6:03 AM Xiao Li <firstname.lastname@example.org> wrote:
Thank you for your contributions! This is a great feature for Spark 3.0! We finally achieved it!
On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung <email@example.com> wrote:
From: ☼ R Nair <firstname.lastname@example.org>
Sent: Saturday, August 24, 2019 10:57:31 AM
To: Dongjoon Hyun <email@example.com>
Cc: firstname.lastname@example.org <email@example.com>; user@spark <firstname.lastname@example.org>
Subject: Re: JDK11 Support in Apache Spark

Finally!!! Congrats
On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun <email@example.com> wrote:
Thanks to your many, many contributions, the Apache Spark master branch started to pass on JDK11 as of today (with the `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6).
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
(JDK11 is used for building and testing.)
We already verified all UTs (including PySpark/SparkR) before.
Please feel free to use JDK11 to build/test/run the `master` branch and share your experience, including any issues. It will help the Apache Spark 3.0.0 release.
For the follow-ups, please follow https://issues.apache.org/jira/browse/SPARK-24417. The next step is how to support JDK8/JDK11 together in a single artifact.
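For readers wanting to try this locally, the announcement above can be sketched as a small shell script. This is a minimal, hedged sketch, not an official recipe: the JDK install path is an assumption, and the only Spark-specific pieces taken from the announcement are the `master` checkout's `build/mvn` wrapper and the `hadoop-3.2` Maven profile.

```shell
#!/bin/sh
# Sketch: building the Spark `master` branch on JDK11 with the
# `hadoop-3.2` profile (Apache Hadoop 3.2 and Hive 2.3.6), per the
# announcement above. Paths and version checks are assumptions.

# Extract the major version number from a `java -version` banner line,
# e.g. 'openjdk version "11.0.4" 2019-07-16' -> 11.
jdk_major() {
  printf '%s\n' "$1" | awk -F '"' '{split($2, v, "."); print v[1]}'
}

# Only attempt the build when run from inside a Spark checkout.
if [ -x build/mvn ]; then
  banner=$(java -version 2>&1 | head -n 1)
  if [ "$(jdk_major "$banner")" = "11" ]; then
    # Skip tests for a quicker package build; drop -DskipTests to run them.
    ./build/mvn -Phadoop-3.2 -DskipTests clean package
  else
    echo "Please point JAVA_HOME at a JDK11 install first." >&2
  fi
fi
```

Note that JDK8-built artifacts were not yet interchangeable with JDK11-built ones at this point, which is exactly the single-artifact follow-up tracked in SPARK-24417.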