spark-dev mailing list archives

From Bryan Cutler <cutl...@gmail.com>
Subject Re: Run a specific PySpark test or group of tests
Date Tue, 15 Aug 2017 19:26:31 GMT
This generally works for me for running the tests within a class, or even a
single test. It's not as flexible as pytest's -k option, which would be nice..

$ SPARK_TESTING=1 bin/pyspark pyspark.sql.tests ArrowTests
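
For comparison, unittest's own CLI can also select a single test method by its
dotted name. A minimal sketch; the file and test names below are hypothetical
stand-ins for pyspark.sql.tests, not the actual PySpark tests:

```shell
# Write a minimal unittest-style file to demonstrate dotted-name selection
# (a stand-in for pyspark.sql.tests; all names here are hypothetical).
cat > /tmp/demo_tests.py <<'EOF'
import unittest

class ArrowLikeTests(unittest.TestCase):
    def test_one(self):
        self.assertEqual(1 + 1, 2)

    def test_two(self):
        self.assertTrue(True)
EOF

# Run only one method by its dotted path, module.Class.method:
cd /tmp && python3 -m unittest demo_tests.ArrowLikeTests.test_one -v
```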

On Tue, Aug 15, 2017 at 5:49 AM, Nicholas Chammas <
nicholas.chammas@gmail.com> wrote:

> Pytest does support unittest-based tests
> <https://docs.pytest.org/en/latest/unittest.html>, allowing for
> incremental adoption. I'll see how convenient it is to use with our current
> test layout.
>
> On Tue, Aug 15, 2017 at 1:03 AM Hyukjin Kwon <gurwls223@gmail.com> wrote:
>
>> I would like this if it can be done with relatively small changes.
>> How about adding more granular options, for example for specifying or
>> filtering a smaller set of test goals in the run-tests.py script?
>> I think it would be a fairly small change, and if I understood correctly
>> it would roughly get us to this goal.
>>
>>
>> 2017-08-15 3:06 GMT+09:00 Nicholas Chammas <nicholas.chammas@gmail.com>:
>>
>>> Say you’re working on something and you want to rerun the PySpark tests,
>>> focusing on a specific test or group of tests. Is there a way to do that?
>>>
>>> I know that you can test entire modules with this:
>>>
>>> ./python/run-tests --modules pyspark-sql
>>>
>>> But I’m looking for something more granular, like pytest’s -k option.
>>>
>>> On that note, does anyone else think it would be valuable to use a test
>>> runner like pytest to run our Python tests? The biggest benefits would be
>>> the use of fixtures <https://docs.pytest.org/en/latest/fixture.html>,
>>> and more flexibility on test running and reporting. Just wondering if we’ve
>>> already considered this.
>>>
>>> Nick
>>>
>>>
>>
>>
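
As background on the incremental-adoption point above: pytest can collect
unittest-style tests without modification. A minimal sketch, with hypothetical
names standing in for the real PySpark test classes:

```python
import unittest

# A unittest-style test case like the ones in pyspark.sql.tests
# (the class and test names here are hypothetical stand-ins).
class ArrowLikeTests(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

    def test_truthiness(self):
        self.assertTrue(bool([1]))

# pytest collects TestCase subclasses as-is, so a file like this could be
# driven with, e.g., `pytest -k addition this_file.py` to select
# individual tests by name, while still running under plain unittest.
```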
