spark-dev mailing list archives

From Dean Wampler <deanwamp...@gmail.com>
Subject Re: time for Apache Spark 3.0?
Date Thu, 19 Apr 2018 10:28:11 GMT
I spoke with Martin Odersky and Lightbend's Scala Team about the known API
issue with method disambiguation. They offered to implement a small patch
in a new release of Scala 2.12 to handle the issue without requiring a
Spark API change. They would cut a 2.12.6 release for it. I'm told that
Scala 2.13 should already handle the issue without modification (it's not
yet released, to be clear). They can also offer feedback on updating the
closure cleaner.
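
For readers not following the JIRA, the disambiguation issue can be sketched in a few lines of standalone Scala (illustrative trait and class names, not Spark's actual API):

```scala
// Hypothetical stand-in for a Java functional interface like
// org.apache.spark.api.java.function.ForeachFunction -- a single
// abstract method (SAM) type.
trait VoidFunction[T] { def call(t: T): Unit }

class Dataset[T] {
  def foreach(f: T => Unit): Unit = ???        // Scala-friendly overload
  def foreach(f: VoidFunction[T]): Unit = ???  // Java-friendly overload
}

// Under Scala 2.11, a lambda argument only matches the T => Unit
// overload. Under Scala 2.12, lambdas also SAM-convert to
// VoidFunction[T], so this call becomes ambiguous and fails to compile:
//
//   new Dataset[Int].foreach(x => println(x))
```

This is why collapsing the overloaded pairs to a single method (or patching overload resolution in the compiler, as proposed above) resolves the problem.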

So, this approach would support Scala 2.12 in Spark, limited to 2.12.6+,
without requiring the API change; the closure cleaner would still need
updating. Hence, it could be done for Spark 2.X.

Let me know if you want to pursue this approach.

dean




*Dean Wampler, Ph.D.*

*VP, Fast Data Engineering at Lightbend*
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do>, Fast Data Architectures
for Streaming Applications
<http://www.oreilly.com/data/free/fast-data-architectures-for-streaming-applications.csp>,
and other content from O'Reilly
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com
https://github.com/deanwampler

On Thu, Apr 5, 2018 at 8:13 PM, Marcelo Vanzin <vanzin@cloudera.com> wrote:

> On Thu, Apr 5, 2018 at 10:30 AM, Matei Zaharia <matei.zaharia@gmail.com>
> wrote:
> > Sorry, but just to be clear here, this is the 2.12 API issue:
> https://issues.apache.org/jira/browse/SPARK-14643, with more details in
> this doc: https://docs.google.com/document/d/1P_wmH3U356f079AYgSsN53HKixuNdxSEvo8nw_tgLgM/edit.
> >
> > Basically, if we are allowed to change Spark’s API a little to have only
> one version of methods that are currently overloaded between Java and
> Scala, we can get away with a single source three for all Scala versions
> and Java ABI compatibility against any type of Spark (whether using Scala
> 2.11 or 2.12).
>
> Fair enough. To play devil's advocate, most of those methods seem to
> be marked "Experimental / Evolving", which could be used as a reason
> to change them for this purpose in a minor release.
>
> Not all of them are, though (e.g. foreach / foreachPartition are not
> experimental).
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>
