hadoop-common-dev mailing list archives

From Marcus Herou <marcus.he...@tailsweep.com>
Subject Re: Developing cross-component patches post-split
Date Wed, 01 Jul 2009 21:10:35 GMT

My 5 cents about svn:externals.
I could not live without it, but... I always tend to forget to update our
svn:externals, and about once a month I wonder why I accidentally released
bleeding-edge code in our production environment *smile* (should've written
that auto-branching script waaay back)....



On Wed, Jul 1, 2009 at 10:54 PM, Doug Cutting <cutting@apache.org> wrote:

> Todd Lipcon wrote:
>> - Whenever a task will need to touch both Common and one of the components
>> (Mapred/HDFS) should there be two JIRAs or is it sufficient to have just
>> one "HADOOP" JIRA with separate patches uploaded for the two repositories?
>
> Two Jiras, I think.  In the long run, such issues should be few.  E.g., we
> should not be changing the FileSystem API incompatibly much.
>
>> - If we're to do two separate JIRAs, is the best bet to use JIRA's
>> "linking" feature to show the dependency between them?
>
> +1
>
>> - When we have one of these cross-project changes, how are we supposed to
>> do builds?
>
> I talked with Owen & Nigel about this, and we thought that, for now, it
> might be reasonable to have the mapreduce and hdfs trunk each contain an svn
> external link to the current common jar.  Then folks can commit new versions
> of hadoop-common-trunk.jar as they commit changes to common's trunk.  We'd
> need to remove or update this svn external link when branching.  Thoughts?
>
> Doug
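For illustration, the external link Doug describes could be a file external set via an svn:externals property on each component's trunk. A sketch of what that property value might look like follows; the local path and URL are hypothetical, not taken from the actual repositories, and file externals require Subversion 1.6 or later:

```
# Hypothetical svn:externals property on mapreduce/trunk and hdfs/trunk.
# Tracks the jar that committers re-commit to common's trunk; the external
# would be removed, or pinned to a fixed revision with -r, when cutting a
# release branch.
lib/hadoop-common-trunk.jar https://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-common-trunk.jar
```

Pinning with `-r` on release branches is what avoids the problem Marcus describes above, where an unpinned external silently tracks bleeding-edge trunk code into a release.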

Marcus Herou CTO and co-founder Tailsweep AB
