spark-dev mailing list archives

From Daniel Siegmann <>
Subject Re: Straw poll: dropping support for things like Scala 2.10
Date Wed, 26 Oct 2016 17:45:38 GMT
Is the deprecation of JDK 7 and Scala 2.10 documented anywhere outside the
release notes for Spark 2.0.0? I do not consider release notes to be
sufficient public notice for deprecating supported platforms; this
should be noted in the documentation somewhere. Here are the only
mentions I could find:

At it says:

"Note: Starting version 2.0, Spark is built with Scala 2.11 by default.
Scala 2.10 users should download the Spark source package and build with
Scala 2.10 support."
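For reference, the "build with Scala 2.10 support" step the downloads page alludes to looks roughly like the following, per Spark's "Building Spark" documentation for the 2.x line (the exact profiles, such as -Pyarn, depend on your deployment and are shown here only as an example):

```shell
# From the root of the unpacked Spark 2.x source package:
# switch the build's Scala version from the default 2.11 to 2.10
./dev/change-scala-version.sh 2.10

# build with the Scala 2.10 profile enabled (example profile set)
./build/mvn -Pyarn -Dscala-2.10 -DskipTests clean package
```

This is a sketch of the documented procedure, not a substitute for the build docs themselves.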

At it says:

"Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API,
Spark 2.0.1 uses Scala 2.11. You will need to use a compatible Scala
version (2.11.x)."

it says:

   - "Spark 2.0.1 is built and distributed to work with Scala 2.11 by
   default. (Spark can be built to work with other versions of Scala, too.) To
   write applications in Scala, you will need to use a compatible Scala
   version (e.g. 2.11.X)."
   - "Spark 2.0.1 works with Java 7 and higher. If you are using Java 8,
   Spark supports lambda expressions for concisely writing functions,
   otherwise you can use the classes in the …"
   - "Spark 2.0.1 works with Python 2.6+ or Python 3.4+. It can use the
   standard CPython interpreter, so C libraries like NumPy can be used. It
   also works with PyPy 2.3+."
