spark-dev mailing list archives

From DB Tsai <d_t...@apple.com.INVALID>
Subject [DISCUSSION]JDK11 for Apache 2.x?
Date Tue, 27 Aug 2019 19:01:20 GMT
Hello everyone,

Thank you all for working on supporting JDK11 in Apache Spark 3.0 as a community.

Java 8 has already reached end of life for commercial users, and many companies are moving to Java 11.
Apache Spark 3.0 does not yet have a release date, and upgrading from Spark 2.x involves many API
incompatibilities. As a result, asking users to move to Spark 3.0 just to use JDK 11 is not realistic.

Should we backport PRs for JDK11 and cut a release in 2.x to support JDK11?

Should we cut a new Apache Spark 2.5, since the patches involve dependency changes that are not
desirable in a minor release?

Thanks.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org

