phoenix-dev mailing list archives

From Anil Gupta <anilgupt...@gmail.com>
Subject Re: [DISCUSS] cut new RC or do follow-on point release?
Date Mon, 25 Aug 2014 17:46:46 GMT
+1 on sinking this RC. I ran into that problem with the Pig-Phoenix integration
earlier. I gave up on using Pig with Phoenix because I had no idea it was this
kind of compilation issue.

Thanks,
Anil
Sent from my iPhone

> On Aug 25, 2014, at 9:24 AM, Jesse Yates <jesse.k.yates@gmail.com> wrote:
> 
> I'd say the release is broken if we say we support hadoop2 (and we do,
> AFAIK) and then it breaks when running with hadoop2.
> 
> The right way would be to sink the RC and spin a new one. Hopefully voting
> won't take too long, as the new RC should be pretty close to the original.
>> On Aug 25, 2014 9:19 AM, "James Taylor" <jamestaylor@apache.org> wrote:
>> 
>> See PHOENIX-1183. We're compiling our pig & flume modules only
>> against hadoop1, not separately for hadoop1 and hadoop2. As a result,
>> our bundled phoenix-pig jar doesn't work correctly with distributions
>> that rely on hadoop2 (like CDH 5.1).
>> 
>> Should we cancel the current RC and spin up a new one?
>> Or should we release the current RC and follow up with a point release?
>> 
>> Thanks,
>> James
>> 
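
For anyone blocked on this in the meantime, one possible workaround is to
rebuild the phoenix-pig module against hadoop2 locally. A minimal sketch,
assuming the build selects hadoop2 via a -Dhadoop.profile=2 property (the
actual profile/property name should be checked against the POMs of this RC):

    # Rebuild phoenix-pig (plus the modules it depends on) against hadoop2.
    # -Dhadoop.profile=2 is an assumed profile switch, not verified here.
    mvn clean package -pl phoenix-pig -am -DskipTests -Dhadoop.profile=2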
