spark-dev mailing list archives

From Sean Owen <>
Subject Re: time for Apache Spark 3.0?
Date Thu, 19 Apr 2018 13:32:02 GMT
That certainly sounds beneficial, maybe to several other projects as well. If
there's no downside and it takes away the API issues, it seems like a win.

On Thu, Apr 19, 2018 at 5:28 AM Dean Wampler <> wrote:

> I spoke with Martin Odersky and Lightbend's Scala Team about the known API
> issue with method disambiguation. They offered to implement a small patch
> in a new release of Scala 2.12 to handle the issue without requiring a
> Spark API change. They would cut a 2.12.6 release for it. I'm told that
> Scala 2.13 should already handle the issue without modification (it's not
> yet released, to be clear). They can also offer feedback on updating the
> closure cleaner.
> So, this approach would support Scala 2.12 in Spark, limited to 2.12.6+,
> without requiring an API change, though the closure cleaner would still need
> updating. Hence, it could be done for Spark 2.X.
> Let me know if you want to pursue this approach.
> dean
