spark-dev mailing list archives

From Matei Zaharia <matei.zaha...@gmail.com>
Subject Re: fixed hbase version in SparkBuild (spark-0.8)
Date Wed, 31 Jul 2013 04:08:05 GMT
Yeah, and maybe we'll want to switch to Maven as the recommended tool for building the assembly.
I want to look into this more for the 0.8 release.

Matei

On Jul 30, 2013, at 9:04 PM, Konstantin Boudnik <cos@apache.org> wrote:

> On Tue, Jul 30, 2013 at 08:44PM, Matei Zaharia wrote:
>> Let's at the very least make it configurable, but an even better thing would
>> be to make the sbt assembly not include it. I think the only thing that depends
>> on HBase is the examples project, but unfortunately SBT puts all its JARs in
>> the lib_managed folder and just stupidly creates an assembly by grouping
>> those. The Maven build, for example, should not do that.
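
As a rough sketch of that second option (keeping HBase out of the sbt assembly),
the sbt-assembly plugin's excludedJars setting can filter jars out of the fat jar.
The filter below, and where it would live in SparkBuild.scala, are illustrative
rather than what the build currently does:

    // Illustrative only: drop any HBase jar that landed in lib_managed
    // so sbt-assembly does not roll it into the assembly (sbt 0.12 syntax).
    excludedJars in assembly <<= (fullClasspath in assembly) map { cp =>
      cp filter { _.data.getName.startsWith("hbase") }
    }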
> 
> It is very easy to exclude dependencies in the Maven assembly, as is done for
> Hadoop. Lemme send out a pull request - a good finding indeed, Dmitriy, thank
> you!
> 
> Cos
> 
>> Matei
>> 
>> On Jul 30, 2013, at 7:40 PM, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:
>> 
>>> Hello,
>>> 
>>> After a couple of days(!) of trying to understand where I was getting the
>>> "NoSuchMethod" error, I traced it down to the fact that 0.8 now includes
>>> HBase.
>>> 
>>> While it is assumed that the Hadoop version is specified by the user, the
>>> HBase version is fixed. This seems to create a problem if HBase is used with
>>> a particular version of the CDH Hadoop client on the backend (there's a known
>>> compatibility bug).
>>> 
>>> Wouldn't it make sense in this case to allow declaring the HBase version as
>>> well, perhaps even tying it to the CDH version?
>>> 
>>> At the very least, I think it deserves a specific mention in the header
>>> section, with an opportunity to override it just as the Hadoop version has.
>>> 
>>> Thanks.
>>> -D
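
A minimal sketch of what such an override could look like in SparkBuild.scala,
mirroring the way the Hadoop version can be picked up from the environment; the
names and the default version below are illustrative, not the project's actual
settings:

    // Illustrative only: default the HBase version but let it be overridden
    // from the environment, following the same pattern used for Hadoop.
    import scala.util.Properties

    val DEFAULT_HBASE_VERSION = "0.94.6"
    val hbaseVersion = Properties.envOrElse("SPARK_HBASE_VERSION", DEFAULT_HBASE_VERSION)

    // ...and in the examples project's dependency list:
    libraryDependencies += "org.apache.hbase" % "hbase" % hbaseVersion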
>> 

