spark-user mailing list archives

From Mingyu Kim <m...@palantir.com>
Subject Spark dependency library causing problems with conflicting versions at import
Date Tue, 08 Oct 2013 01:18:21 GMT
Hi all,

I'm trying to use Spark in our existing code base. However, many of Spark's
dependencies are not up to date, and they conflict with the versions of those
libraries that we use, most notably Scala (2.9.2 vs. 2.10.1). Have other
people run into these problems? How did you work around them?

A few more specific questions I have are:
* I remember there is a plan to upgrade Spark to the latest version of Scala.
Is there an ETA?
* I see other open source projects private-namespacing their dependencies to
work around this problem. For example, Elasticsearch relocates dependent
libraries under an "org.elasticsearch…" prefix. Would it be possible for
Spark to take this path in the future?
* We intend to just create a remote connection from our code base, so we
don't really need the full Spark implementation. Would it be possible to
separate out a "client" project that only allows remote connections via
SparkContext, such that the "client" jar contains only a few dependencies?
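(For reference, the Elasticsearch-style private namespacing is usually done
with dependency relocation, e.g. via the Maven Shade plugin. Below is a
minimal sketch assuming a Maven build; the `my.shaded` prefix and the choice
of `com.google` as the relocated package are hypothetical examples, not
anything Spark actually does.)

```xml
<!-- Sketch: relocate a conflicting dependency under a private namespace
     using the Maven Shade plugin. "my.shaded" is a made-up prefix. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- Rewrites com.google.* classes and all references to them -->
            <pattern>com.google</pattern>
            <shadedPattern>my.shaded.com.google</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that relocation helps with ordinary Java library conflicts, but it
wouldn't resolve the Scala version mismatch itself, since artifacts compiled
against Scala 2.9.x are not binary compatible with 2.10.x.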
Mingyu


