spark-dev mailing list archives

From Sean Owen <sro...@gmail.com>
Subject Re: Interesting implications of supporting Scala 2.13
Date Sat, 11 May 2019 23:25:26 GMT
For those interested, here's the first significant problem I see that
will require separate source trees or a breaking change:
https://issues.apache.org/jira/browse/SPARK-27683?focusedCommentId=16837967&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16837967
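
To illustrate the general shape of that kind of problem (a made-up example, not the actual API discussed in the JIRA comment): a public signature that mentions a 2.12-only type, like CanBuildFrom, can't compile on 2.13 from the same source without changing the API.

  // Hypothetical utility, not Spark code. CanBuildFrom exists on 2.12 but was
  // removed in 2.13 (replaced by BuildFrom), so a signature like this can only
  // be kept by changing the API or splitting the source per Scala version.
  import scala.collection.generic.CanBuildFrom

  def transformValues[K, V, W, That](m: Map[K, V])(f: V => W)(
      implicit cbf: CanBuildFrom[Map[K, V], (K, W), That]): That = {
    val builder = cbf(m)
    m.foreach { case (k, v) => builder += ((k, f(v))) }
    builder.result()
  }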

Interested in thoughts on how to proceed on something like this, as
there will probably be a few more similar issues.

On Fri, May 10, 2019 at 3:32 PM Reynold Xin <rxin@databricks.com> wrote:
>
> Yea my main point is when we do support 2.13, it'd be great if we don't have to break APIs. That's why doing the prep work in 3.0 would be great.
>
>
> On Fri, May 10, 2019 at 1:30 PM, Imran Rashid <irashid@cloudera.com> wrote:
>>
>> +1 on making whatever api changes we can now for 3.0.
>>
>> I don't think that is making any commitments to supporting Scala 2.13 in any specific version. We'll have to deal with all the other points you raised when we do cross that bridge, but hopefully those are things we can cover in a minor release.
>>
>> On Fri, May 10, 2019 at 2:31 PM Sean Owen <srowen@gmail.com> wrote:
>>>
>>> I really hope we don't have to have separate source trees for some files, but yeah, it's an option too. OK, I will start looking into changes we can make now that don't break anything, and deprecations we need to make proactively.
>>>
>>> I should also say that supporting Scala 2.13 will mean our dependencies have to support Scala 2.13, and that could take a while, because there are a lot of them.
>>> In particular, I think we'll find our SBT 0.13 build won't make it, perhaps just because of the plugins it needs. I tried updating to SBT 1.x and it seemed to need quite a lot of rewriting, again in part due to how newer plugin versions changed. I failed and gave up.
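>>>
>>> (For reference, the cross-building config itself is simple enough, roughly the following in build.sbt, with placeholder version numbers; the hard part is getting sbt itself and the plugins to versions that can handle 2.13:)
>>>
>>>   // Rough sketch of a cross-built build.sbt, not our actual build:
>>>   name := "example"
>>>   scalaVersion := "2.12.8"
>>>   crossScalaVersions := Seq("2.12.8", "2.13.0")
>>>   // Then `sbt +compile` / `sbt +test` runs against each listed Scala version.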
>>>
>>> At some point maybe we figure out whether we can remove the SBT-based build if it's super painful, but only if there's not much other choice. That is for a future thread.
>>>
>>>
>>> On Fri, May 10, 2019 at 1:51 PM Reynold Xin <rxin@databricks.com> wrote:
>>>>
>>>> Looks like a great idea to make changes in Spark 3.0 to prepare for the Scala 2.13 upgrade.
>>>>
>>>> Are there breaking changes that would require us to have two different source trees for 2.12 vs 2.13?
>>>>
>>>>
>>>> On Fri, May 10, 2019 at 11:41 AM, Sean Owen <srowen@gmail.com> wrote:
>>>>>
>>>>> While that's not happening soon (2.13 isn't out), note that some of the changes to collections will be fairly breaking changes.
>>>>>
>>>>> https://issues.apache.org/jira/browse/SPARK-25075
>>>>> https://docs.scala-lang.org/overviews/core/collections-migration-213.html
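>>>>>
>>>>> To give a flavor of what's in that migration guide (generic 2.13 examples, not Spark code):
>>>>>
>>>>>   // scala.Seq changes meaning: it aliases collection.Seq on 2.12 but
>>>>>   // collection.immutable.Seq on 2.13, so this signature quietly changes
>>>>>   // (a mutable ArrayBuffer is an acceptable argument on 2.12, not on 2.13):
>>>>>   def firstArg(args: Seq[String]): String = args.head
>>>>>
>>>>>   // Conversions change shape too: `xs.to[Vector]` on 2.12 becomes
>>>>>   // `xs.to(Vector)` on 2.13, and CanBuildFrom goes away entirely.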
>>>>>
>>>>> Some of this may impact a public API, so we may need to start proactively fixing stuff for 2.13 before 3.0 comes out, where possible.
>>>>>
>>>>> Here's an example: Traversable goes away. We have a method SparkConf.setAll(Traversable). We can't support 2.13 while that still exists. Of course, we can decide to deprecate it with a replacement (use Iterable) and remove it in the version that supports 2.13. But that would mean a small breaking change, and we either have to accept that for a future 3.x release, or it waits until 4.x.
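>>>>>
>>>>> The deprecate-and-replace step could look roughly like this on 2.12 (a toy stand-in for SparkConf just to show the shape; the deprecation message and version string are placeholders):
>>>>>
>>>>>   import scala.collection.mutable
>>>>>
>>>>>   // Toy config class standing in for SparkConf.
>>>>>   class Conf {
>>>>>     private val settings = mutable.Map.empty[String, String]
>>>>>     def set(key: String, value: String): Conf = { settings(key) = value; this }
>>>>>
>>>>>     // Existing signature: Traversable doesn't survive the 2.13 upgrade.
>>>>>     @deprecated("Use setAll(Iterable) instead", "3.0.0")
>>>>>     def setAll(entries: Traversable[(String, String)]): Conf = {
>>>>>       entries.foreach { case (k, v) => set(k, v) }
>>>>>       this
>>>>>     }
>>>>>
>>>>>     // Replacement added now; the Traversable overload is what gets removed
>>>>>     // in whatever release first builds against 2.13 (the breaking part).
>>>>>     // Existing callers keep compiling, with a deprecation warning, until then.
>>>>>     def setAll(entries: Iterable[(String, String)]): Conf = {
>>>>>       entries.foreach { case (k, v) => set(k, v) }
>>>>>       this
>>>>>     }
>>>>>   }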
>>>>>
>>>>> I wanted to put that on the radar now to gather opinions about whether this would be acceptable, or whether we really need to get methods like that changed before 3.0.
>>>>>
>>>>> Also: there are plenty of straightforward but medium-sized changes we can make now in anticipation of 2.13 support, like making the type of Seq we use everywhere explicit (that will easily be a 1000-file change, I'm sure), seeing whether we can swap out Traversable everywhere, removing MutableList, etc.
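>>>>>
>>>>> The mechanical changes are mostly of this kind (illustrative snippets, not actual Spark code; which Seq we standardize on is still an open question):
>>>>>
>>>>>   // Be explicit about which Seq is meant, e.g. if we settle on immutable.Seq,
>>>>>   // since scala.Seq aliases collection.Seq on 2.12 but immutable.Seq on 2.13:
>>>>>   import scala.collection.immutable.Seq
>>>>>   def parseArgs(args: Seq[String]): Map[String, String] =
>>>>>     args.grouped(2).collect { case Seq(k, v) => k -> v }.toMap
>>>>>
>>>>>   // MutableList is gone in 2.13; ListBuffer covers the same use on both:
>>>>>   import scala.collection.mutable.ListBuffer
>>>>>   val buf = ListBuffer.empty[String]
>>>>>   buf += "first"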
>>>>>
>>>>> I was going to start fiddling with that unless it just sounds too disruptive.
>>>>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org

