spark-dev mailing list archives

From Holden Karau <hol...@pigscanfly.ca>
Subject Re: Scala 2.12 support
Date Fri, 08 Jun 2018 00:49:11 GMT
Tests can just be changed to accept either output too :p
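For example (purely illustrative; the helper name and sample lines are made up, not taken from Spark's test suite), a comparison that sorts lines before diffing, as Dean suggests below, would accept either ordering:

```scala
// Illustrative helper: compare console output line-by-line, ignoring order,
// so a test passes whether the welcome banner or the Spark UI URL prints first.
object OutputCompare {
  def sameLinesIgnoringOrder(expected: String, actual: String): Boolean = {
    def norm(s: String): Seq[String] =
      s.split("\n").map(_.trim).filter(_.nonEmpty).toSeq.sorted
    norm(expected) == norm(actual)
  }

  def main(args: Array[String]): Unit = {
    val oldOrder = "Spark context Web UI available at http://host:4040\nWelcome to Spark"
    val newOrder = "Welcome to Spark\nSpark context Web UI available at http://host:4040"
    assert(sameLinesIgnoringOrder(oldOrder, newOrder))
  }
}
```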

On Thu, Jun 7, 2018, 5:19 PM Dean Wampler <deanwampler@gmail.com> wrote:

> Do the tests expect a particular console output order? That would annoy
> them. ;) You could sort the expected and output lines, then diff...
>
>
> *Dean Wampler, Ph.D.*
>
> *VP, Fast Data Engineering at Lightbend*
> Author: Programming Scala, 2nd Edition
> <http://shop.oreilly.com/product/0636920033073.do>, Fast Data
> Architectures for Streaming Applications
> <http://www.oreilly.com/data/free/fast-data-architectures-for-streaming-applications.csp>,
> and other content from O'Reilly
> @deanwampler <http://twitter.com/deanwampler>
> http://polyglotprogramming.com
> https://github.com/deanwampler
>
> On Thu, Jun 7, 2018 at 5:09 PM, Holden Karau <holden@pigscanfly.ca> wrote:
>
>> If the difference is the order of the welcome message I think that should
>> be fine.
>>
>> On Thu, Jun 7, 2018, 4:43 PM Dean Wampler <deanwampler@gmail.com> wrote:
>>
>>> I'll point the Scala team to this issue, but it's unlikely to get fixed
>>> any time soon.
>>>
>>> dean
>>>
>>>
>>>
>>> On Thu, Jun 7, 2018 at 4:27 PM, DB Tsai <d_tsai@apple.com> wrote:
>>>
>>>> Thanks Felix for bringing this up.
>>>>
>>>> Currently, in Scala 2.11.8, we initialize Spark by overriding
>>>> loadFiles() before the REPL sees any files, since there is no good hook
>>>> in Scala to load our initialization code.
>>>>
>>>> In Scala 2.11.12 and newer versions of Scala 2.12.x, the loadFiles()
>>>> method was removed.
>>>>
>>>> Alternatively, in newer versions of Scala we can override
>>>> initializeSynchronous(), as suggested by Som Snytt; I have a working PR
>>>> with this approach,
>>>> https://github.com/apache/spark/pull/21495 , and this approach should
>>>> work for older versions of Scala too.
>>>>
>>>> However, in the newer versions of Scala, the first thing the REPL calls
>>>> is printWelcome, so with this approach the welcome message is shown
>>>> first and then the URL of the SparkUI. This causes UI inconsistencies
>>>> between different versions of Scala.
>>>>
>>>> We can also initialize Spark in printWelcome, which I find more hacky.
>>>> It would only work for newer versions of Scala, since in older versions
>>>> of Scala, printWelcome is called at the end of the initialization
>>>> process. If we decide to go this route, users basically cannot use
>>>> Scala older than 2.11.9.
>>>>
>>>> I think this is also a blocker for us to move to newer versions of
>>>> Scala 2.12.x, since they have the same issue.
>>>>
>>>> In my opinion, Scala should fix the root cause and provide a stable
>>>> hook for third-party developers to initialize their custom code.
>>>>
>>>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>>>> Apple, Inc
>>>>
>>>> > On Jun 7, 2018, at 6:43 AM, Felix Cheung <felixcheung_m@hotmail.com>
>>>> wrote:
>>>> >
>>>> > +1
>>>> >
>>>> > Spoke to Dean as well and mentioned the problem with 2.11.12
>>>> https://github.com/scala/bug/issues/10913
>>>> >
>>>> > _____________________________
>>>> > From: Sean Owen <srowen@gmail.com>
>>>> > Sent: Wednesday, June 6, 2018 12:23 PM
>>>> > Subject: Re: Scala 2.12 support
>>>> > To: Holden Karau <holden@pigscanfly.ca>
>>>> > Cc: Dean Wampler <deanwampler@gmail.com>, Reynold Xin <
>>>> rxin@databricks.com>, dev <dev@spark.apache.org>
>>>> >
>>>> >
>>>> > If it means no change to 2.11 support, it seems OK to me for Spark
>>>> 2.4.0. The 2.12 support is separate and has never been mutually
>>>> compatible with 2.11 builds anyway. (I also hope, and suspect, that the
>>>> changes are minimal; tests already almost entirely pass with no change
>>>> to the closure cleaner when built for 2.12.)
>>>> >
>>>> > On Wed, Jun 6, 2018 at 1:33 PM Holden Karau <holden@pigscanfly.ca>
>>>> wrote:
>>>> > Just chatted with Dean at the summit, and it sounds like from Adriaan
>>>> there is a fix in 2.13 for the API change issue that could be backported
>>>> to 2.12, so how about we try and get this ball rolling?
>>>> >
>>>> > It sounds like it would also need a closure cleaner change, which
>>>> could be backwards compatible, but since it's such a core component we
>>>> might want to be cautious with it: when building for 2.11 we could use
>>>> the old cleaner code, and for 2.12 the new code, so we don't break
>>>> anyone.
>>>> >
>>>> > How do folks feel about this?
>>>> >
>>>> >
>>>> >
>>>>
>>>>
>>>
>
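As a rough illustration of the startup-order issue DB describes above (a mock, not actual Spark or Scala REPL code; the real change lives in the linked PR): in 2.11.12+ and 2.12.x the REPL calls printWelcome first, so anything printed from an initializeSynchronous-style hook lands after the banner.

```scala
// Mock of the REPL startup ordering described in the thread.
// In Scala <= 2.11.8 the init hook (loadFiles-style) ran before printWelcome;
// in 2.11.12+/2.12.x printWelcome runs first, so Spark's UI URL would
// appear after the welcome banner.
trait Repl {
  val log = scala.collection.mutable.Buffer[String]()
  def printWelcome(): Unit = log += "welcome banner"
  def initializeSynchronous(): Unit = ()
  def start(): Seq[String]
}

class NewRepl extends Repl {
  // 2.11.12+/2.12.x order: banner first, then the init hook
  def start(): Seq[String] = { printWelcome(); initializeSynchronous(); log.toSeq }
}

class SparkInit extends NewRepl {
  override def initializeSynchronous(): Unit = {
    super.initializeSynchronous()
    log += "Spark UI at http://host:4040" // hypothetical init output
  }
}
```

Running `new SparkInit().start()` yields the banner line before the UI line, matching the inconsistency DB points out.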
