spark-user mailing list archives

From Yana Kadiyska <yana.kadiy...@gmail.com>
Subject Re: What factors need to be considered when upgrading to Spark 2.1.0 from Spark 1.6.0
Date Fri, 29 Sep 2017 13:49:48 GMT
One thing to note, if you are using Mesos, is that the minimum supported
Mesos version changed from 0.21 to 1.0.0. So moving to a newer Spark might
push you into larger infrastructure upgrades.

On Fri, Sep 22, 2017 at 2:39 PM, Gokula Krishnan D <email2dgk@gmail.com>
wrote:

> Hello All,
>
> Currently our batch ETL jobs run on Spark 1.6.0, and we are planning to
> upgrade to Spark 2.1.0.
>
> With minor code changes (configuration and moving from SparkContext/SQLContext
> to SparkSession), we were able to run the existing jobs on Spark 2.1.0 (a
> sketch of this change follows the quoted message).
>
> But we noticed that job completion times are much better in Spark 1.6.0
> than in Spark 2.1.0.
>
> For instance, Job A completed in 50s in Spark 1.6.0.
>
> With the same input, Job A completed in 1.5 minutes in Spark 2.1.0.
>
> Are there any specific factors that need to be considered when switching to
> Spark 2.1.0 from Spark 1.6.0?
>
>
>
> Thanks & Regards,
> Gokula Krishnan (Gokul)
>
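For readers making the same move, here is a minimal sketch of the SparkSession
change mentioned in the quoted message, based on the public Spark 2.x Scala API.
The JobA object name, the input/output paths, and the ETL step itself are
hypothetical placeholders, not details from the original job:

import org.apache.spark.sql.SparkSession

object JobA {
  def main(args: Array[String]): Unit = {
    // Spark 1.6 style, for comparison:
    //   val conf = new SparkConf().setAppName("JobA")
    //   val sc   = new SparkContext(conf)
    //   val sqlContext = new SQLContext(sc)

    // Spark 2.x: SparkSession is the single entry point for SQL and DataFrames.
    val spark = SparkSession.builder()
      .appName("JobA")
      .getOrCreate()

    // The underlying SparkContext is still available for RDD-based code.
    val sc = spark.sparkContext

    // Hypothetical ETL step standing in for the real job logic.
    val df = spark.read.parquet("/path/to/input")
    df.write.mode("overwrite").parquet("/path/to/output")

    spark.stop()
  }
}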
