spark-dev mailing list archives

From Andrew Or <and...@databricks.com>
Subject Re: Recent Spark test failures
Date Mon, 11 May 2015 20:08:46 GMT
Hi Ted,

Yes, those two options can be useful, but in general I think the standard we
should set is that tests should never fail. Tests that fail sometimes but
not others are actually the worst case, because we can't reproduce the
failures deterministically. Using -M and -A tolerates flaky tests to a
certain extent, and I would prefer to instead increase the determinism in
these tests.
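
For reference, this is roughly how the two ScalaTest Runner options Ted
mentions fit together (a sketch based on the scalatest.org user guide; the
jar name, classpath, and file name below are hypothetical):

```shell
# -M <file> memorizes the names of any failed or canceled tests into <file>.
# -A <file> ("again") runs only the tests recorded in <file> on a later run.
# Jar version, test classpath, and the failures file are placeholders.

# First run: execute the suites and record failures in failed-tests.txt
scala -cp scalatest.jar org.scalatest.tools.Runner \
  -R target/test-classes -o -M failed-tests.txt

# Follow-up run: re-run only the previously failed tests
scala -cp scalatest.jar org.scalatest.tools.Runner \
  -R target/test-classes -o -A failed-tests.txt
```

As Andrew notes, this workflow only re-runs known failures; it does not
make a nondeterministic test any more reproducible.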

-Andrew

2015-05-08 17:56 GMT-07:00 Ted Yu <yuzhihong@gmail.com>:

> Andrew:
> Do you think the -M and -A options described here can be used in test
> runs?
> http://scalatest.org/user_guide/using_the_runner
>
> Cheers
>
> On Wed, May 6, 2015 at 5:41 PM, Andrew Or <andrew@databricks.com> wrote:
>
>> Dear all,
>>
>> I'm sure you have all noticed that the Spark tests have been fairly
>> unstable recently. I wanted to share a tool that I use to track which
>> tests
>> have been failing most often in order to prioritize fixing these flaky
>> tests.
>>
>> Here is an output of the tool. This spreadsheet reports the top 10 failed
>> tests this week (ending yesterday 5/5):
>>
>> https://docs.google.com/spreadsheets/d/1Iv_UDaTFGTMad1sOQ_s4ddWr6KD3PuFIHmTSzL7LSb4
>>
>> It is produced by a small project:
>> https://github.com/andrewor14/spark-test-failures
>>
>> I have been filing JIRAs on flaky tests based on this tool. Hopefully we
>> can collectively stabilize the build a little more as we near the release
>> for Spark 1.4.
>>
>> -Andrew
>>
>
>
