spark-dev mailing list archives

From Sean Owen <sro...@apache.org>
Subject Re: time for Apache Spark 3.0?
Date Tue, 13 Nov 2018 00:00:45 GMT
My non-definitive takes --

I would personally like to remove all deprecated methods for Spark 3.
I started by removing 'old' deprecated methods in that commit. Whether
things deprecated as recently as 2.4 should also be removed is less clear.
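
For a concrete (if minor) example of what removal looks like from a
user's side: registerTempTable was deprecated back in 2.0 in favor of
createOrReplaceTempView, so code still on the old method needs a
one-line change (sketch below; the input path is just illustrative):

    import org.apache.spark.sql.SparkSession

    // local[*] master so the sketch runs standalone
    val spark = SparkSession.builder()
      .appName("deprecation-example")
      .master("local[*]")
      .getOrCreate()

    // "people.json" is an illustrative input path
    val df = spark.read.json("people.json")

    // Deprecated since 2.0; stops compiling if removed in 3.0:
    // df.registerTempTable("people")

    // Replacement, available since 2.0:
    df.createOrReplaceTempView("people")
    spark.sql("SELECT * FROM people").show()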

Everything's fair game for removal or change in a major release. So
far the items under discussion seem to be dropping support for Scala
2.11, Python 2, and R versions before 3.4. I don't know about other APIs.

Generally, take a look at JIRA for items targeted at version 3.0. Not
everything targeted for 3.0 will go in, but items filed by committers
are more likely to than others. Breaking changes ought to be tagged
'release-notes' with a description of the change. The release itself
has a migration guide that's being updated as we go.
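
If it helps with tracking, a JIRA filter along these lines should pull
up the list (the exact field and label names here are from memory, so
treat them as approximate):

    project = SPARK AND "Target Version/s" = 3.0.0 AND labels = release-notes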


On Mon, Nov 12, 2018 at 5:49 PM Matt Cheah <mcheah@palantir.com> wrote:
>
> I wanted to clarify what categories of APIs are eligible to be broken in Spark 3.0. Specifically:
>
> - Are we removing all deprecated methods? If we’re only removing some subset of deprecated methods, what is that subset? I see a bunch were removed in https://github.com/apache/spark/pull/22921 for example. Are we only committed to removing methods that were deprecated in some Spark version and earlier?
> - Aside from removing support for Scala 2.11, what other kinds of (non-experimental and non-evolving) APIs are eligible to be broken?
> - Is there going to be a way to track the current list of all proposed breaking changes / JIRA tickets? Perhaps we can tag them in JIRA in a way that can be filtered down somehow?
>


