spark-dev mailing list archives

From Steve Loughran <>
Subject Re: [discuss] ending support for Java 7 in Spark 2.0
Date Tue, 05 Apr 2016 10:41:02 GMT

> On 4 Apr 2016, at 20:58, Ofir Manor <> wrote:
> I think that a backup plan could be to announce that JDK7 is deprecated in Spark 2.0
> and support for it will be fully removed in Spark 2.1. This gives admins enough warning to
> install JDK8 alongside their "main" JDK (or fully migrate to it), while allowing the project
> to merge JDK8-specific changes to trunk right after the 2.0 release.

Announcing a plan is good; anything that can be done to help mixed-JVM deployment (documentation,
testing) would be useful too.
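As a concrete sketch of what a mixed-JVM deployment can look like: Spark picks up `JAVA_HOME` from `conf/spark-env.sh`, so a side-installed JDK8 can be used for Spark without touching the cluster's default JVM. The path below is illustrative only; adjust it to wherever JDK8 actually lives on your hosts.

```shell
# conf/spark-env.sh -- sketch: run Spark under a side-installed JDK8
# while the rest of the system keeps its "main" JDK.
# The path below is an example; substitute your actual JDK8 install dir.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
export PATH="$JAVA_HOME/bin:$PATH"
```

This is the kind of thing worth documenting and testing if JDK7 support is kept through a deprecation cycle.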

> However, I personally think it is better to drop JDK7 now. I'm sure that both the community
> and the distributors (Databricks, Cloudera, Hortonworks, MapR, IBM etc) will all rush to help
> their customers migrate their environment to support Spark 2.0, so I think any backlash won't
> be dramatic or lasting.

People using Spark tend to be pretty aggressive about wanting the latest version, at least
on the 1.x line; so far there have been no major problems allowing mixed Spark version deployments,
provided shared bits of infrastructure (the Spark history server) were recent. Hive metastore
access is the other big issue: moving Spark up to Hive 1.2.1 addresses that for the moment.
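On the metastore point: Spark can be pointed at a specific Hive metastore version via the `spark.sql.hive.metastore.version` and `spark.sql.hive.metastore.jars` settings, which is what makes mixed deployments against a shared metastore workable. A sketch of the relevant `conf/spark-defaults.conf` entries (the jars path is illustrative):

```
# conf/spark-defaults.conf -- sketch: target a shared Hive 1.2.1 metastore.
# The classpath below is an example; point it at your actual Hive 1.2.1 jars.
spark.sql.hive.metastore.version  1.2.1
spark.sql.hive.metastore.jars     /opt/hive-1.2.1/lib/*
```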

I don't know about organisations' adoption of JDK8 vs JDK7, or how anyone would react to having
to move to Java 8 for Spark 2. Maybe it'll be a barrier to adoption; maybe it'll be an incentive
to upgrade.

Oh, I do know that Java 9 is going to be trouble. Different topic.