spark-user mailing list archives

From Matei Zaharia <matei.zaha...@gmail.com>
Subject Re: Spark dependency library causing problems with conflicting versions at import
Date Tue, 08 Oct 2013 03:15:28 GMT
Yeah, you have to use the scala-2.10 branch to use Scala 2.10. Scala doesn't really support
mixed-version builds. We are actively maintaining the 2.10 branch and plan on merging it into
master soon and having it be part of Spark 0.9. Right now it's up to date with the 0.8 release,
so it should be fine to use. I think you'll have to either do that or find another version of
the library you're using.
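
For example, if you build that branch and run sbt publish-local, depending on it from a Scala
2.10 project is just a couple of lines of build.sbt (the version string below is a placeholder
for whatever your local publish of the branch actually produces):

    // build.sbt -- rough sketch; adjust the version to match your locally
    // published build of the scala-2.10 branch.
    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.0-SNAPSHOT"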

Matei

On Oct 7, 2013, at 8:10 PM, Matt Cheah <mcheah@palantir.com> wrote:

> Hi Matei,
> 
> We are mainly having trouble with Scala. Our project currently depends on the Scala 2.10
> library. We have tried private-namespacing the Scala library, but this breaks Akka because,
> as you have observed, the reflection logic breaks.
> 
> I looked at the Scala 2.10 branch, and there are two problems I think we'd have with it:
> 1. It's not a major official release (we prefer to stick to official releases), and
> 2. It's 33 commits behind master.
> Are there plans to actively maintain this branch and eventually release it officially?
> 
> -Matt Cheah
> 
> From: Matei Zaharia <matei.zaharia@gmail.com>
> Date: Monday, October 7, 2013 7:49 PM
> To: "user@spark.incubator.apache.org" <user@spark.incubator.apache.org>
> Cc: Andrew Winings <mcheah@palantir.com>, Tarun Singh <tsingh@palantir.com>
> Subject: Re: Spark dependency library causing problems with conflicting versions at import
> 
> Hi Mingyu,
> 
> The latest version of Spark works with Scala 2.9.3, which is the latest Scala 2.9 version.
> There's also a branch called branch-2.10 on GitHub that uses 2.10.3. What specific libraries
> are you having trouble with?
>> I see other open source projects private-namespacing the dependencies to work around this
>> problem. For example, Elasticsearch prepends dependent libraries with "org.elasticsearch….".
>> Would it be possible for Spark to take this path in the future?
> This can be tough to do because some libraries use reflection or require only one instance
> per JVM (e.g. Log4J). You may be able to package Spark in this way by modifying the Maven
> file, though.
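
As a rough sketch of what that relocation idea looks like, here it is expressed with the
sbt-assembly plugin's shade rules instead of the Maven shade plugin (the package names are
only illustrative); note that anything Akka looks up reflectively by its original class name
will not follow the rename, which is the breakage described above:

    // build.sbt fragment -- requires the sbt-assembly plugin; a sketch only.
    assemblyShadeRules in assembly := Seq(
      // Rewrite Akka's packages to a private prefix inside the assembled jar.
      ShadeRule.rename("akka.**" -> "myproject.shaded.akka.@1").inAll
    )
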
>> We intend to just create a remote connection from our code base, so it's not like we really
>> need the full Spark implementation in our code base. Is it possible to separate out a "client"
>> project that only allows remote connections via SparkContext, such that the "client" jar only
>> contains a few dependencies?
> Yeah, good question; this is hard today but might be possible later. I'd recommend writing a
> little RPC service on top and accessing that.
> 
> Matei
> 
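
To make the RPC-service suggestion concrete, here is a toy sketch of one: Spark (and with it
Akka and its Scala version) stays entirely inside the server's JVM, and callers only need an
HTTP client, so none of the conflicting jars ever reach their classpath. The endpoint, port,
and job below are all made up for illustration.

    import java.net.InetSocketAddress
    import com.sun.net.httpserver.{HttpExchange, HttpHandler, HttpServer}
    import org.apache.spark.SparkContext

    object SparkRpcServer {
      def main(args: Array[String]): Unit = {
        // The SparkContext lives only in this process.
        val sc = new SparkContext("local[2]", "rpc-example")

        // Expose one operation over plain HTTP using the JDK's built-in server.
        val server = HttpServer.create(new InetSocketAddress(8090), 0)
        server.createContext("/evenCount", new HttpHandler {
          override def handle(exchange: HttpExchange): Unit = {
            // Run a trivial job and return the result as text.
            val result = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
            val body = result.toString.getBytes("UTF-8")
            exchange.sendResponseHeaders(200, body.length)
            val os = exchange.getResponseBody
            os.write(body)
            os.close()
          }
        })
        server.start()
      }
    }

A client can then hit http://<server-host>:8090/evenCount with any HTTP library and never link
against Spark itself.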

