spark-dev mailing list archives

From Jacek Laskowski <ja...@japila.pl>
Subject Re: Incorrect Scala version for Spark 2.4.x releases in the docs?
Date Thu, 17 Sep 2020 15:35:15 GMT
Thanks Sean for such a quick response! Let me propose a fix for the docs.

Best regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski



On Thu, Sep 17, 2020 at 4:16 PM Sean Owen <srowen@gmail.com> wrote:

> The pre-built binary distros should use 2.11 in 2.4.x. Artifacts for
> both Scala versions are available, yes.
> Yeah I think it should really say you can use 2.11 or 2.12.
>
> On Thu, Sep 17, 2020 at 9:12 AM Jacek Laskowski <jacek@japila.pl> wrote:
> >
> > Hi,
> >
> > Just found this paragraph in
> http://spark.apache.org/docs/2.4.6/index.html#downloading:
> >
> > "Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API,
> Spark 2.4.6 uses Scala 2.12. You will need to use a compatible Scala
> version (2.12.x)."
> >
> > That seems to contradict the version of Scala in the pom.xml [1], which
> is 2.11.12. The docs appear to claim that Spark 2.4.6 uses Scala 2.12 by
> default, which seems incorrect to me. Am I missing something?
> >
> > My question is what's the official Scala version of Spark 2.4.6 (and
> others in 2.4.x release line)?
> >
> > (I do know that Spark 2.4.x can be compiled with Scala 2.12, but that
> requires the scala-2.12 profile [2] to be enabled)
> >
> > [1] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L158
> > [2] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L2830
> >
> > Best regards,
> > Jacek Laskowski
> > ----
> > https://about.me/JacekLaskowski
> > "The Internals Of" Online Books
> > Follow me on https://twitter.com/jaceklaskowski
> >
>
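For readers landing on this thread later: a minimal sketch of building Spark 2.4.x against Scala 2.12 using the scala-2.12 profile referenced in [2]. These commands follow the standard build scripts shipped in the Spark source tree; verify them against the "Building Spark" page for your exact 2.4.x release.

```shell
# Sketch: building Spark 2.4.6 from source against Scala 2.12.
# Assumes a checkout of the v2.4.6 tag with the standard build scripts present.

# Switch the build's Scala cross-version from the default 2.11 to 2.12
# (rewrites the _2.11 artifact suffixes in the POMs).
./dev/change-scala-version.sh 2.12

# Build with the scala-2.12 profile enabled.
./build/mvn -Pscala-2.12 -DskipTests clean package
```

With the default profiles, the published artifacts carry the `_2.11` suffix (e.g. `spark-core_2.11`); the 2.12 cross-build publishes `_2.12` artifacts instead, which matches Sean's note that artifacts for both Scala versions are available.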
