hadoop-mapreduce-dev mailing list archives

From Tsuyoshi OZAWA <ozawa.tsuyo...@gmail.com>
Subject Re: Fault injection framework for testing
Date Wed, 16 Jan 2013 04:18:04 GMT
Thanks for your comment; it was very helpful.

I'd like to go with the second approach, MOP with Groovy. In that case, how
can I add the test code to trunk?
Is it acceptable for the Hadoop project to include test code written in Groovy?


On Wed, Jan 16, 2013 at 12:13 PM, Konstantin Boudnik <cos@apache.org> wrote:
> Hadoop-1 includes a framework called Herriot that allows you to develop
> on-the-cluster FI (fault injection) system tests. However, because of some timing
> issues, it hasn't been hooked into the Maven build system of the Hadoop-2 branches.
> Basically, I see two ways of doing what you need to do here:
>   - wait until Herriot is integrated back (that might take a while,
>     actually)
>   - go along with MOP using Groovy and develop a cluster test for your
>     feature. MOP requires pretty much nothing but a groovy jar to be
>     added to the classpath of the Java process(es) in question. With it in
>     place you can instrument anything you want, the way you need, during the
>     application bootstrap. In fact, I think Herriot would be better off with
>     that approach instead of its initial AspectJ build-time mechanism.
> Hope it helps,
>   Cos
> On Wed, Jan 16, 2013 at 02:19AM, Tsuyoshi OZAWA wrote:
>> Hi,
>> I've created a patch for MAPREDUCE-4502. I've confirmed that it works
>> well in the usual case, and I've also added code to handle MapTask failure.
>> As a next step, I need to add test code for MapTask failure.
>> So I have some questions:
>> Is there a fault injection facility in the MapReduce testing framework?
>> If not, do you have any ideas on how to test it?
>> Thanks,
>> OZAWA Tsuyoshi
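
For what it's worth, the MOP approach described above could look roughly like
the sketch below. FakeMapTask and its run() method are invented purely for
illustration here; they are not real Hadoop classes, and a real test would
instrument actual task classes during application bootstrap instead:

```groovy
import java.io.IOException

// Illustrative stand-in for a task runner; NOT a real Hadoop class.
class FakeMapTask {
    String run(String split) {
        return "processed:" + split
    }
}

// Fault injection at bootstrap: override run() through the metaclass
// so subsequent invocations simulate a MapTask failure. This only needs
// the groovy jar on the classpath of the process being instrumented.
FakeMapTask.metaClass.run = { String split ->
    throw new IOException("injected failure on split " + split)
}

def task = new FakeMapTask()
try {
    task.run("split-0")
    assert false : "fault should have been injected"
} catch (IOException e) {
    println "caught: " + e.message
}
```

Because the override happens at runtime rather than at build time, no
AspectJ-style weaving step is needed, which is the advantage Cos mentions.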

OZAWA Tsuyoshi
