spark-dev mailing list archives

From Jacek Laskowski <ja...@japila.pl>
Subject Incorrect Scala version for Spark 2.4.x releases in the docs?
Date Thu, 17 Sep 2020 14:11:46 GMT
Hi,

Just found this paragraph in
http://spark.apache.org/docs/2.4.6/index.html#downloading:

"Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API,
Spark 2.4.6 uses Scala 2.12. You will need to use a compatible Scala
version (2.12.x)."

That seems to contradict the Scala version in the pom.xml [1], which
is 2.11.12. The paragraph claims that Spark 2.4.6 uses Scala 2.12 by
default, which looks incorrect to me. Am I missing something?

My question is: what is the official Scala version of Spark 2.4.6 (and
the other releases in the 2.4.x line)?

(I do know that Spark 2.4.x can be compiled with Scala 2.12, but that
requires the scala-2.12 profile [2] to be enabled.)

[1] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L158
[2] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L2830
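For reference, the profile-based cross-build mentioned above can be sketched roughly as follows. This is an assumption based on the scala-2.12 profile in the linked pom.xml and the Spark build documentation for the 2.4 line, not an authoritative recipe; check the 2.4.x building-spark docs for the exact steps:

```shell
# Switch the build's Scala binary version to 2.12 (rewrites _2.11
# artifact suffixes in the POMs to _2.12).
./dev/change-scala-version.sh 2.12

# Build with the scala-2.12 profile enabled, skipping tests.
./build/mvn -Pscala-2.12 -DskipTests clean package
```

Without the -Pscala-2.12 profile, the build uses the default scala.version from the pom.xml, i.e. 2.11.12.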

Pozdrawiam,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski

