spark-dev mailing list archives

From Reynold Xin <r...@databricks.com>
Subject Re: Building Spark against Scala 2.10.1 virtualized
Date Fri, 18 Jul 2014 20:55:11 GMT
Yes.


On Fri, Jul 18, 2014 at 12:50 PM, Meisam Fathi <meisam.fathi@gmail.com>
wrote:

> Sorry for resurrecting this thread, but project/SparkBuild.scala was
> completely rewritten recently (in this commit:
> https://github.com/apache/spark/tree/628932b). Should library
> dependencies be defined in pom.xml files after this commit?
>
> Thanks
> Meisam
>
> On Thu, Jun 5, 2014 at 4:51 PM, Matei Zaharia <matei.zaharia@gmail.com>
> wrote:
> > You can modify project/SparkBuild.scala and build Spark with sbt instead
> of Maven.
> >
> >
> > On Jun 5, 2014, at 12:36 PM, Meisam Fathi <meisam.fathi@gmail.com>
> wrote:
> >
> >> Hi community,
> >>
> >> How should I change sbt to compile Spark core with a different version
> >> of Scala? I see the Maven pom files define dependencies on Scala 2.10.4. I
> >> need to override/ignore the Maven dependencies and use Scala
> >> virtualized, which needs these lines in a build.sbt file:
> >>
> >> scalaOrganization := "org.scala-lang.virtualized"
> >>
> >> scalaVersion := "2.10.1"
> >>
> >> libraryDependencies += "EPFL" %% "lms" % "0.3-SNAPSHOT"
> >>
> >> scalacOptions += "-Yvirtualize"
> >>
> >>
> >> Thanks,
> >> Meisam
> >
>
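
[Editor's note: a minimal sketch of how the build.sbt settings quoted above might be carried over into project/SparkBuild.scala, as Matei suggests, for the pre-628932b sbt build. The virtualizedSettings and core names below are illustrative assumptions, not taken from the real file.]

  // Sketch only; the real SparkBuild.scala of that era defines many more projects and keys.
  import sbt._
  import sbt.Keys._

  object SparkBuild extends Build {
    // Settings mirroring the build.sbt lines quoted above.
    lazy val virtualizedSettings: Seq[Setting[_]] = Seq(
      scalaOrganization := "org.scala-lang.virtualized",
      scalaVersion := "2.10.1",
      libraryDependencies += "EPFL" %% "lms" % "0.3-SNAPSHOT",
      scalacOptions += "-Yvirtualize"
    )

    // Hypothetical project wiring showing where the settings would attach.
    lazy val core = Project("core", file("core"),
      settings = Defaults.defaultSettings ++ virtualizedSettings)
  }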
