spark-dev mailing list archives

From Kostas Sakellis <kos...@cloudera.com>
Subject Re: Discuss: commit to Scala 2.10 support for Spark 2.x lifecycle
Date Wed, 06 Apr 2016 01:54:53 GMT
From both this and the JDK thread, I've noticed that people (myself
included) have different notions of the compatibility guarantees between
major and minor versions.
A simple question I have is: what compatibility can we break in a minor
release vs. a major release?

It might be worth getting on the same page wrt compatibility guarantees.

Just a thought,
Kostas

On Tue, Apr 5, 2016 at 4:39 PM, Holden Karau <holden@pigscanfly.ca> wrote:

> One minor downside to having both 2.10 and 2.11 (and eventually 2.12) is
> deprecation warnings in our builds that we can't fix without introducing a
> wrapper or Scala-version-specific code. This isn't a big deal, and if we
> drop 2.10 in the 3-6 month time frame talked about, we can clean up those
> warnings once we get there.
>
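(For context on the wrapper pattern mentioned above: a minimal sketch of
version-specific source directories in an sbt build, assuming a typical
cross-built layout — the version numbers and paths here are illustrative,
not Spark's actual build definition.)

```scala
// build.sbt -- illustrative sketch only, not Spark's real build.
// Cross-compile against both Scala versions; any code that must differ
// per version lives under src/main/scala-2.10 and src/main/scala-2.11,
// selected automatically by the binary version being compiled.
crossScalaVersions := Seq("2.10.6", "2.11.8")

unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / s"scala-${scalaBinaryVersion.value}"
```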
> On Fri, Apr 1, 2016 at 10:00 PM, Raymond Honderdors <
> Raymond.Honderdors@sizmek.com> wrote:
>
>> What about a separate branch for Scala 2.10?
>>
>>
>>
>> Sent from my Samsung Galaxy smartphone.
>>
>>
>> -------- Original message --------
>> From: Koert Kuipers <koert@tresata.com>
>> Date: 4/2/2016 02:10 (GMT+02:00)
>> To: Michael Armbrust <michael@databricks.com>
>> Cc: Matei Zaharia <matei.zaharia@gmail.com>, Mark Hamstra <
>> mark@clearstorydata.com>, Cody Koeninger <cody@koeninger.org>, Sean Owen
>> <sowen@cloudera.com>, dev@spark.apache.org
>> Subject: Re: Discuss: commit to Scala 2.10 support for Spark 2.x
>> lifecycle
>>
>> As long as we don't lock ourselves into supporting Scala 2.10 for the
>> entire Spark 2 lifespan, it sounds reasonable to me.
>>
>> On Wed, Mar 30, 2016 at 3:25 PM, Michael Armbrust <michael@databricks.com
>> > wrote:
>>
>>> +1 to Matei's reasoning.
>>>
>>> On Wed, Mar 30, 2016 at 9:21 AM, Matei Zaharia <matei.zaharia@gmail.com>
>>> wrote:
>>>
>>>> I agree that putting it in 2.0 doesn't mean keeping Scala 2.10 for the
>>>> entire 2.x line. My vote is to keep Scala 2.10 in Spark 2.0, because it's
>>>> the default version we built with in 1.x. We want to make the transition
>>>> from 1.x to 2.0 as easy as possible. In 2.0, we'll have the default
>>>> downloads be for Scala 2.11, so people will more easily move, but we
>>>> shouldn't create obstacles that lead to fragmenting the community and
>>>> slowing down Spark 2.0's adoption. I've seen companies that stayed on an
>>>> old Scala version for multiple years because switching it, or mixing
>>>> versions, would affect the company's entire codebase.
>>>>
>>>> Matei
>>>>
>>>> On Mar 30, 2016, at 12:08 PM, Koert Kuipers <koert@tresata.com> wrote:
>>>>
>>>> oh wow, had no idea it got ripped out
>>>>
>>>> On Wed, Mar 30, 2016 at 11:50 AM, Mark Hamstra <mark@clearstorydata.com
>>>> > wrote:
>>>>
>>>>> No, with 2.0 Spark really doesn't use Akka:
>>>>> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkConf.scala#L744
>>>>>
>>>>> On Wed, Mar 30, 2016 at 9:10 AM, Koert Kuipers <koert@tresata.com>
>>>>> wrote:
>>>>>
>>>>>> Spark still runs on Akka. So if you want the benefits of the latest
>>>>>> Akka (not saying we do, was just an example) then you need to drop
>>>>>> Scala 2.10.
>>>>>> On Mar 30, 2016 10:44 AM, "Cody Koeninger" <cody@koeninger.org>
>>>>>> wrote:
>>>>>>
>>>>>>> I agree with Mark in that I don't see how supporting Scala 2.10 for
>>>>>>> Spark 2.0 implies supporting it for all of Spark 2.x.
>>>>>>>
>>>>>>> Regarding Koert's comment on Akka, I thought all Akka dependencies
>>>>>>> had been removed from Spark after SPARK-7997 and the recent removal
>>>>>>> of external/akka.
>>>>>>>
>>>>>>> On Wed, Mar 30, 2016 at 9:36 AM, Mark Hamstra <
>>>>>>> mark@clearstorydata.com> wrote:
>>>>>>> > Dropping Scala 2.10 support has to happen at some point, so I'm
>>>>>>> > not fundamentally opposed to the idea; but I've got questions
>>>>>>> > about how we go about making the change and what degree of
>>>>>>> > negative consequences we are willing to accept.  Until now, we
>>>>>>> > have been saying that 2.10 support will be continued in Spark
>>>>>>> > 2.0.0.  Switching to 2.11 will be non-trivial for some Spark
>>>>>>> > users, so abruptly dropping 2.10 support is very likely to delay
>>>>>>> > migration to Spark 2.0 for those users.
>>>>>>> >
>>>>>>> > What about continuing 2.10 support in 2.0.x, but repeatedly
>>>>>>> > making an obvious announcement in multiple places that such
>>>>>>> > support is deprecated, that we are not committed to maintaining
>>>>>>> > it throughout 2.x, and that it is, in fact, scheduled to be
>>>>>>> > removed in 2.1.0?
>>>>>>> > On Wed, Mar 30, 2016 at 7:45 AM, Sean Owen <sowen@cloudera.com>
>>>>>>> > wrote:
>>>>>>> >>
>>>>>>> >> (This should fork as its own thread, though it began during
>>>>>>> >> discussion of whether to continue Java 7 support in Spark 2.x.)
>>>>>>> >>
>>>>>>> >> Simply: would like to more clearly take the temperature of all
>>>>>>> >> interested parties about whether to support Scala 2.10 in the
>>>>>>> >> Spark 2.x lifecycle. Some of the arguments appear to be:
>>>>>>> >>
>>>>>>> >> Pro
>>>>>>> >> - Some third party dependencies do not support Scala 2.11+ yet
>>>>>>> >> and so would not be usable in a Spark app
>>>>>>> >>
>>>>>>> >> Con
>>>>>>> >> - Lower maintenance overhead -- no separate 2.10 build,
>>>>>>> >> cross-building, tests to check, esp considering support of 2.12
>>>>>>> >> will be needed
>>>>>>> >> - Can use 2.11+ features freely
>>>>>>> >> - 2.10 was EOL in late 2014 and the Spark 2.x lifecycle is years
>>>>>>> >> to come
>>>>>>> >>
>>>>>>> >> I would like to not support 2.10 for Spark 2.x, myself.
>>>>>>> >>
>>>>>>> >> ---------------------------------------------------------------------
>>>>>>> >> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>>>>> >> For additional commands, e-mail: dev-help@spark.apache.org
>>>>>>> >>
>>>>>>> >
>>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
>
> --
> Cell : 425-233-8271
> Twitter: https://twitter.com/holdenkarau
>
