spark-dev mailing list archives

From Jean Georges Perrin <...@jgp.net>
Subject Re: [DISCUSSION]JDK11 for Apache 2.x?
Date Tue, 27 Aug 2019 21:14:54 GMT
Not a contributor, but a user perspective…

As Spark 3.x will be an evolution, I am not completely shocked that it would imply a Java
11 requirement as well. It would be great to have both Java 8 and Java 11 supported, but at
some point one needs to be able to say goodbye. Java 8 is great, and we still use it actively
in production, but we know its time is limited, so, by the time we move to Spark 3, we could
combine that move with Java 11.

On the other hand, not everybody may think this way, and it may slow down the adoption of
Spark 3…

However, I concur with Sean: I don’t think another 2.x release is needed for Java 11.

> On Aug 27, 2019, at 3:09 PM, Sean Owen <srowen@gmail.com> wrote:
> 
> I think one of the key problems here is the required dependency
> upgrades. It would mean many minor breaking changes and a few bigger
> ones, notably around Hive, and it forces a Scala 2.12-only update. My
> question is whether that even makes sense as a minor release:
> it wouldn't be backwards compatible with 2.4 enough to call it a
> low-risk update. It would be a smaller step than moving all the way to
> 3.0, sure. I am not strongly against it, but we have to keep in mind how
> much work it would then be to maintain two LTS 2.x releases, 2.4 and
> the sort-of-compatible 2.5, while proceeding with 3.x.
> 
> On Tue, Aug 27, 2019 at 2:01 PM DB Tsai <d_tsai@apple.com.invalid> wrote:
>> 
>> Hello everyone,
>> 
>> Thank you all for working on supporting JDK11 in Apache Spark 3.0 as a community.
>> 
>> Java 8 has already reached end of life for commercial users, and many companies are
>> moving to Java 11. The release date for Apache Spark 3.0 is still not set, and there
>> are many API incompatibility issues when upgrading from Spark 2.x. As a result, asking
>> users to move to Spark 3.0 in order to use JDK 11 is not realistic.
>> 
>> Should we backport PRs for JDK11 and cut a release in 2.x to support JDK11?
>> 
>> Should we cut a new Apache Spark 2.5, since the patches involve some dependency
>> changes that are not desired in a minor release?
>> 
>> Thanks.
>> 
>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
>> 
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>> 
> 

