spot-dev mailing list archives

From kant kodali <kanth...@gmail.com>
Subject Re: should we upgrade to spark 2.1 ?
Date Mon, 24 Apr 2017 16:06:03 GMT
It's a no-brainer to go with 2.1, especially since we are dealing with Spark
streaming. Actually, 2.2 will be released in one more month (going by their
standard release process), and there are a lot of features one may find
necessary when working with Spark Streaming that are not available in 2.1 or
below.

I had problems with 2.0.0 that got fixed in 2.0.1.

Strongly recommend 2.1.0 or above.

On Mon, Apr 24, 2017 at 8:03 AM, Barona, Ricardo <ricardo.barona@intel.com>
wrote:

> In general, I’ve seen that Spark 2.0.0 and 2.1.0 are faster than 1.6.0
> because of “whole-stage code generation” – as per the release notes, 2–10X
> performance speedups for common operators in SQL and DataFrames, including
> joins. The only thing that concerns me is the MLlib deprecation in 2.1.0.
>
> Given that, I’d say we should migrate to 2.0.x, start experimenting with
> Spark ML – LDA, and give support for 1.6.0, like Nate says, for one year or
> so.
>
> On 4/21/17, 6:59 PM, "Austin Leahy" <Austin@digitalminion.com> wrote:
>
>     Damn Michael beat me to it ;D
>     On Fri, Apr 21, 2017 at 4:58 PM Michael Ridley <mridley@cloudera.com>
>     wrote:
>
>     > Given that the project has not had a release, I don't see any reason
>     > to stick with 1.6 support. Now seems like a good time to switch to 2
>     > if that's what people want to do. I haven't had time to do a deep
>     > dive on Spark 2 yet, so I don't have enough information to have a
>     > technical opinion, other than that I hear a lot of excitement and
>     > preference for Spark 2.
>     >
>     > Michael Ridley
>     > Senior Solutions Architect
>     > Cloudera
>     >
>     > Sent from my mobile.
>     > Pardon any spelling errors.
>     >
>     > > On Apr 21, 2017, at 6:39 PM, Segerlind, Nathan L <
>     > > nathan.l.segerlind@intel.com> wrote:
>     > >
>     > > Hi everybody.
>     > >
>     > > There's been some talk about upgrading to Spark 2.1.
>     > >
>     > > Do people think this is worthwhile?
>     > >
>     > > Would others like to see continued support for 1.6? For how long
>     > > and in what capacity?
>     > >
>     > > Should we maintain two branches?
>     > >
>     > > Or perhaps drive the 2.1 branch forward and only send bug fixes to
>     > > the 1.6 branch for another year or so?
>     > >
>     > >
>     >
>
>
>
