spark-dev mailing list archives

From Mark Hamstra <m...@clearstorydata.com>
Subject Re: Make Scala 2.12 as default Scala version in Spark 3.0
Date Wed, 07 Nov 2018 18:32:38 GMT
I'm not following "exclude Scala 2.13". Is there something inherent in
making 2.12 the default Scala version in Spark 3.0 that would prevent us
from supporting the option of building with 2.13?
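
[Editor's sketch: in Spark's Maven build, the default Scala version and the set of supported versions are largely orthogonal — the default only determines what the POMs point at out of the box, while other versions are selected per-build. The commands below are illustrative; `dev/change-scala-version.sh` and the `-Pscala-2.12` profile exist in the Spark repo, but the 2.13 equivalents are assumptions here, since 2.13 support did not yet exist when this thread was written.]

```shell
# Rewrite the POMs' Scala binary version (script ships in the Spark repo).
./dev/change-scala-version.sh 2.12

# Build against Scala 2.12 via the matching Maven profile.
./build/mvn -Pscala-2.12 -DskipTests clean package

# Supporting 2.13 alongside a 2.12 default would, in the same way, mean
# adding a scala-2.13 profile (hypothetical at the time of this thread),
# not something the choice of default forecloses:
#   ./dev/change-scala-version.sh 2.13
#   ./build/mvn -Pscala-2.13 -DskipTests clean package
```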

On Tue, Nov 6, 2018 at 5:48 PM Sean Owen <srowen@gmail.com> wrote:

> That's possible here, sure. The issue is: would you exclude Scala 2.13
> support in 3.0 for this, if it were otherwise ready to go?
> I think it's not a hard rule that something has to be deprecated
> previously to be removed in a major release. The notice is helpful,
> sure, but there are lots of ways to provide that notice to end users.
> Lots of things are breaking changes in a major release. Or: deprecate
> in Spark 2.4.1, if desired?
>
> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cloud0fan@gmail.com> wrote:
> >
> > We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10 in
> Spark 2.3. Shall we follow that precedent and drop Scala 2.11 at some point
> in the Spark 3.x line?
> >
> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <rxin@databricks.com> wrote:
> >>
> >> Have we deprecated Scala 2.11 already in an existing release?
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>
