spark-dev mailing list archives

From Raymond Honderdors <Raymond.Honderd...@sizmek.com>
Subject RE: Discuss: commit to Scala 2.10 support for Spark 2.x lifecycle
Date Sat, 02 Apr 2016 05:00:34 GMT
What about a separate branch for Scala 2.10?



Sent from my Samsung Galaxy smartphone.


-------- Original message --------
From: Koert Kuipers <koert@tresata.com>
Date: 4/2/2016 02:10 (GMT+02:00)
To: Michael Armbrust <michael@databricks.com>
Cc: Matei Zaharia <matei.zaharia@gmail.com>, Mark Hamstra <mark@clearstorydata.com>,
Cody Koeninger <cody@koeninger.org>, Sean Owen <sowen@cloudera.com>, dev@spark.apache.org
Subject: Re: Discuss: commit to Scala 2.10 support for Spark 2.x lifecycle

As long as we don't lock ourselves into supporting Scala 2.10 for the entire Spark 2 lifespan,
it sounds reasonable to me.

On Wed, Mar 30, 2016 at 3:25 PM, Michael Armbrust <michael@databricks.com<mailto:michael@databricks.com>>
wrote:
+1 to Matei's reasoning.

On Wed, Mar 30, 2016 at 9:21 AM, Matei Zaharia <matei.zaharia@gmail.com<mailto:matei.zaharia@gmail.com>>
wrote:
I agree that putting it in 2.0 doesn't mean keeping Scala 2.10 for the entire 2.x line. My
vote is to keep Scala 2.10 in Spark 2.0, because it's the default version we built with in
1.x. We want to make the transition from 1.x to 2.0 as easy as possible. In 2.0, we'll have
the default downloads be for Scala 2.11, so people will more easily move, but we shouldn't
create obstacles that lead to fragmenting the community and slowing down Spark 2.0's adoption.
I've seen companies that stayed on an old Scala version for multiple years because switching
it, or mixing versions, would affect the company's entire codebase.
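Matei's point about mixing versions comes down to Scala's lack of binary compatibility across minor releases: every library artifact is published separately per Scala binary version. A minimal sbt illustration (hypothetical project; the version strings are assumptions, not taken from the thread):

```scala
// build.sbt -- sketch of why Scala versions can't be mixed in one build.
// The %% operator appends the Scala binary suffix to the artifact name,
// so the dependency below resolves to spark-core_2.11.
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"

// Pinning a mismatched artifact by hand, e.g.
//   "org.apache.spark" % "spark-core_2.10" % "1.6.1"
// would pull 2.10-compiled bytecode into a 2.11 build and typically
// fails at runtime with NoSuchMethodError-style binary mismatches --
// which is why a whole codebase has to move versions at once.
```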

Matei

On Mar 30, 2016, at 12:08 PM, Koert Kuipers <koert@tresata.com<mailto:koert@tresata.com>>
wrote:

oh wow, had no idea it got ripped out

On Wed, Mar 30, 2016 at 11:50 AM, Mark Hamstra <mark@clearstorydata.com<mailto:mark@clearstorydata.com>>
wrote:
No, with 2.0 Spark really doesn't use Akka: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkConf.scala#L744

On Wed, Mar 30, 2016 at 9:10 AM, Koert Kuipers <koert@tresata.com<mailto:koert@tresata.com>>
wrote:

Spark still runs on Akka. So if you want the benefits of the latest Akka (not saying we do,
it was just an example), then you need to drop Scala 2.10.

On Mar 30, 2016 10:44 AM, "Cody Koeninger" <cody@koeninger.org<mailto:cody@koeninger.org>>
wrote:
I agree with Mark in that I don't see how supporting Scala 2.10 for
Spark 2.0 implies supporting it for all of Spark 2.x.

Regarding Koert's comment on Akka, I thought all Akka dependencies
had been removed from Spark after SPARK-7997 and the recent removal
of external/akka.

On Wed, Mar 30, 2016 at 9:36 AM, Mark Hamstra <mark@clearstorydata.com<mailto:mark@clearstorydata.com>>
wrote:
> Dropping Scala 2.10 support has to happen at some point, so I'm not
> fundamentally opposed to the idea; but I've got questions about how we go
> about making the change and what degree of negative consequences we are
> willing to accept.  Until now, we have been saying that 2.10 support will be
> continued in Spark 2.0.0.  Switching to 2.11 will be non-trivial for some
> Spark users, so abruptly dropping 2.10 support is very likely to delay
> migration to Spark 2.0 for those users.
>
> What about continuing 2.10 support in 2.0.x, but repeatedly making an
> obvious announcement in multiple places that such support is deprecated,
> that we are not committed to maintaining it throughout 2.x, and that it is,
> in fact, scheduled to be removed in 2.1.0?
>
> On Wed, Mar 30, 2016 at 7:45 AM, Sean Owen <sowen@cloudera.com<mailto:sowen@cloudera.com>>
wrote:
>>
>> (This should fork as its own thread, though it began during discussion
>> of whether to continue Java 7 support in Spark 2.x.)
>>
>> Simply: would like to more clearly take the temperature of all
>> interested parties about whether to support Scala 2.10 in the Spark
>> 2.x lifecycle. Some of the arguments appear to be:
>>
>> Pro
>> - Some third party dependencies do not support Scala 2.11+ yet and so
>> would not be usable in a Spark app
>>
>> Con
>> - Lower maintenance overhead -- no separate 2.10 build,
>> cross-building, tests to check, esp considering support of 2.12 will
>> be needed
>> - Can use 2.11+ features freely
>> - 2.10 was EOL in late 2014 and Spark 2.x lifecycle is years to come
>>
>> I would like to not support 2.10 for Spark 2.x, myself.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org<mailto:dev-unsubscribe@spark.apache.org>
>> For additional commands, e-mail: dev-help@spark.apache.org<mailto:dev-help@spark.apache.org>
>>
>
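The cross-building overhead weighed in the "Con" list above can be sketched in sbt terms. This is an illustrative build fragment, not Spark's actual build configuration, and the version strings are assumptions:

```scala
// build.sbt -- sketch of what "cross-building" costs a project:
// each Scala binary version listed here means a separate compile,
// test run, and published artifact (suffixed _2.10, _2.11, ...).
scalaVersion       := "2.11.8"                 // default build
crossScalaVersions := Seq("2.10.6", "2.11.8")  // versions to support

// Running `sbt +test` or `sbt +publish` repeats the whole pipeline
// once per entry in crossScalaVersions, so dropping 2.10 removes an
// entire build/test matrix row -- and adding 2.12 later adds one back.
```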
