spark-user mailing list archives

From Todd Nist <tsind...@gmail.com>
Subject Re: Newbie Help for spark compilation problem
Date Sun, 25 Oct 2015 22:51:45 GMT
So yes, the individual artifacts are released; however, there is no
prebuilt deployable bundle for Spark 1.5.1 and Scala 2.11.7, something
like spark-1.5.1-bin-hadoop-2.6_scala-2.11.tgz.  The Spark site even
states this:

*Note: Scala 2.11 users should download the Spark source package and
build with Scala 2.11 support
<http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211>.*
So if you want a single deployable for a standalone environment, I
thought you had to run make-distribution like I described.
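For reference, the kind of build I mean looks roughly like this (a sketch based on the Spark 1.5 build docs; the --name and -Phadoop-2.6 values here are just one example configuration, adjust for your Hadoop version):

```shell
# Switch the build to Scala 2.11 (script ships with the Spark source tree)
./dev/change-scala-version.sh 2.11

# Build a deployable .tgz; profile and name values are examples
./make-distribution.sh --name hadoop-2.6_scala-2.11 --tgz \
  -Phadoop-2.6 -Dscala-2.11 -DskipTests
```

That produces a spark-1.5.1-bin-hadoop-2.6_scala-2.11.tgz you can unpack on the standalone cluster nodes.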

Clearly the individual artifacts are there, as you state; is there a
prebuilt 2.11 tgz available as well?  I did not think there was.  If
there is, should the documentation on the download site be changed to
reflect this?

Sorry for the confusion.

-Todd

On Sun, Oct 25, 2015 at 4:07 PM, Sean Owen <sowen@cloudera.com> wrote:

> No, 2.11 artifacts are in fact published:
> http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22spark-parent_2.11%22
>
> On Sun, Oct 25, 2015 at 7:37 PM, Todd Nist <tsindotg@gmail.com> wrote:
> > Sorry Sean, you are absolutely right, it supports 2.11.  All I meant is
> there is
> > no release available as a standard download and that one has to build it.
> > Thanks for the clarification.
> > -Todd
> >
>
