spark-dev mailing list archives

From "DW @ Gmail" <deanwamp...@gmail.com>
Subject Re: Scala 2.12 support
Date Wed, 06 Jun 2018 19:03:46 GMT
I feel good ;)

I know there is a lot of interest among Spark users. Since the compiler change won’t force
a Spark API change, can we target Spark 2.4?

Sent from my rotary phone. 


> On Jun 6, 2018, at 11:33 AM, Holden Karau <holden@pigscanfly.ca> wrote:
> 
> Just chatted with Dean @ the summit, and it sounds from Adriaan like there is a fix in
> 2.13 for the API change issue that could be backported to 2.12, so how about we try and
> get this ball rolling?
> 
> It sounds like it would also need a closure cleaner change, which could be backwards
> compatible, but since it’s such a core component we might want to be cautious with it: we
> could use the old cleaner code when building for 2.11 and the new code for 2.12, so we
> don’t break anyone.
> 
> How do folks feel about this?
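A minimal sketch of the 2.11/2.12 cleaner split described above. This is illustrative only:
the trait, the two objects, and the runtime version check are hypothetical (not actual Spark
code), and the real proposal would more likely select the code at build time, e.g. via
version-specific source directories, rather than at runtime.

    import scala.util.Properties

    // Hypothetical interface for a closure cleaner implementation.
    trait ClosureCleanerLike {
      def clean(closure: AnyRef): Unit
    }

    // Placeholder for the existing cleaner logic used on Scala 2.11.
    object LegacyClosureCleaner extends ClosureCleanerLike {
      def clean(closure: AnyRef): Unit = { /* existing 2.11-era logic */ }
    }

    // Placeholder for the new logic needed on Scala 2.12, where anonymous
    // functions compile to Java 8-style lambdas instead of anonymous classes.
    object LambdaAwareClosureCleaner extends ClosureCleanerLike {
      def clean(closure: AnyRef): Unit = { /* new 2.12 logic */ }
    }

    object CleanerSelector {
      // Pick the implementation based on the Scala version in use.
      val cleaner: ClosureCleanerLike =
        if (Properties.versionNumberString.startsWith("2.12")) LambdaAwareClosureCleaner
        else LegacyClosureCleaner
    }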
> 
>> On Sat, Apr 21, 2018 at 5:32 AM Dean Wampler <deanwampler@gmail.com> wrote:
>> Hi, Reynold,
>> 
>> Sorry for the delay in replying; I was traveling.
>> 
>> The Scala changes would avoid the need to change the API now. Basically, the compiler
>> would be modified to detect the particular case of the two ambiguous, overloaded methods
>> and then pick the best fit in a more "intelligent" way. (They can provide more specific
>> details.) This would not address the closure cleaner changes required; however, the Scala
>> team offered to provide suggestions or review changes.
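For readers unfamiliar with the ambiguity being discussed, here is a minimal, self-contained
illustration; FakeDataset and ForeachFunction are made-up names, not Spark's actual classes.
An API like this overloads a method for both a Scala function and a Java-style functional
interface. On 2.11 a Scala lambda only matches the first overload, but on 2.12 it can also be
SAM-converted to the Java-style interface, so both overloads apply and the compiler may
report the call as ambiguous.

    // Stand-in for a Java-friendly functional interface
    // (compare org.apache.spark.api.java.function.ForeachFunction).
    trait ForeachFunction[T] {
      def call(t: T): Unit
    }

    class FakeDataset[T](data: Seq[T]) {
      // Scala-facing overload
      def foreach(f: T => Unit): Unit = data.foreach(f)
      // Java-facing overload
      def foreach(f: ForeachFunction[T]): Unit = data.foreach(x => f.call(x))
    }

    object AmbiguityDemo extends App {
      val ds = new FakeDataset(Seq(1, 2, 3))
      // On 2.11 this resolves to the T => Unit overload.
      // On 2.12 the lambda is also eligible for SAM conversion to
      // ForeachFunction[Int], so both overloads apply.
      ds.foreach((i: Int) => println(i))
    }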
>> 
>> dean
>> 
>> Dean Wampler, Ph.D.
>> VP, Fast Data Engineering at Lightbend
>> Author: Programming Scala, 2nd Edition, Fast Data Architectures for Streaming Applications,
>> and other content from O'Reilly
>> @deanwampler
>> http://polyglotprogramming.com
>> https://github.com/deanwampler
>> 
>>> On Thu, Apr 19, 2018 at 6:46 PM, Reynold Xin <rxin@databricks.com> wrote:
>>> Forking the thread to focus on Scala 2.12.
>>> 
>>> Dean,
>>> 
>>> There are a couple of different issues with Scala 2.12 (closure cleaner, API breaking
>>> changes). Which one do you think we can address with a Scala upgrade? (I haven't spent a
>>> lot of time looking at the closure cleaner one, but it might involve more Spark-side
>>> changes.)
>>> 
>>>> On Thu, Apr 19, 2018 at 3:28 AM, Dean Wampler <deanwampler@gmail.com> wrote:
>>>> I spoke with Martin Odersky and Lightbend's Scala Team about the known API issue with
>>>> method disambiguation. They offered to implement a small patch in a new release of
>>>> Scala 2.12 to handle the issue without requiring a Spark API change. They would cut a
>>>> 2.12.6 release for it. I'm told that Scala 2.13 should already handle the issue without
>>>> modification (it's not yet released, to be clear). They can also offer feedback on
>>>> updating the closure cleaner.
>>>> 
>>>> So, this approach would support Scala 2.12 in Spark (limited to 2.12.6+) without the
>>>> API change requirement, but the closure cleaner would still need updating. Hence, it
>>>> could be done for Spark 2.X.
>>>> 
>>>> Let me know if you want to pursue this approach.
>>>> 
>>>> dean
>>>> 
>>>> 
>>>> 
>>>> Dean Wampler, Ph.D.
>>>> VP, Fast Data Engineering at Lightbend
>>>> Author: Programming Scala, 2nd Edition, Fast Data Architectures for Streaming
>>>> Applications, and other content from O'Reilly
>>>> @deanwampler
>>>> http://polyglotprogramming.com
>>>> https://github.com/deanwampler
>>>> 
>>>>> On Thu, Apr 5, 2018 at 8:13 PM, Marcelo Vanzin <vanzin@cloudera.com> wrote:
>>>>> On Thu, Apr 5, 2018 at 10:30 AM, Matei Zaharia <matei.zaharia@gmail.com> wrote:
>>>>> > Sorry, but just to be clear here, this is the 2.12 API issue:
>>>>> > https://issues.apache.org/jira/browse/SPARK-14643, with more details in this doc:
>>>>> > https://docs.google.com/document/d/1P_wmH3U356f079AYgSsN53HKixuNdxSEvo8nw_tgLgM/edit.
>>>>> >
>>>>> > Basically, if we are allowed to change Spark’s API a little to have only one
>>>>> > version of methods that are currently overloaded between Java and Scala, we can get
>>>>> > away with a single source tree for all Scala versions and Java ABI compatibility
>>>>> > against any type of Spark (whether using Scala 2.11 or 2.12).
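One possible shape of the "only one version of the method" idea, sketched under the
assumption that the surviving method takes the Java-style interface; the real design is in
the linked doc, and the names here are illustrative. On 2.12 Scala callers can still pass a
lambda via SAM conversion (on 2.11 that conversion is only available behind -Xexperimental,
which is part of why the change interacts with the Scala upgrade).

    trait ForeachFunction[T] { def call(t: T): Unit }

    class UnifiedDataset[T](data: Seq[T]) {
      // Single method shared by Java and Scala callers: no overload, so no
      // ambiguity. Java callers pass a lambda implementing ForeachFunction;
      // on Scala 2.12 a Scala lambda is SAM-converted to the same interface.
      def foreach(f: ForeachFunction[T]): Unit = data.foreach(x => f.call(x))
    }

    object UnifiedDemo extends App {
      val ds = new UnifiedDataset(Seq("a", "b", "c"))
      ds.foreach(x => println(x)) // compiles on 2.12 via SAM conversion
    }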
>>>>> 
>>>>> Fair enough. To play devil's advocate, most of those methods seem to
>>>>> be marked "Experimental / Evolving", which could be used as a reason
>>>>> to change them for this purpose in a minor release.
>>>>> 
>>>>> Not all of them are, though (e.g. foreach / foreachPartition are not
>>>>> experimental).
>>>>> 
>>>>> -- 
>>>>> Marcelo
>>>>> 
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>>>> 
>>>> 
>>> 
>> 
> -- 
> Twitter: https://twitter.com/holdenkarau
