pytest.skip not respected after initial figure created in a test #172

Closed
greglucas opened this issue Jun 28, 2022 · 2 comments · Fixed by #171
@greglucas

In Cartopy, we have some parameterized tests that create a figure, try to do some things with that figure in a try/except clause, and call pytest.skip() if the exception is hit, to indicate we want to skip that test. In the recent 0.16 release this no longer works: the images are still compared and produce a failure because there is no image to compare against (since we want the test skipped). See the comment here: SciTools/cartopy#2058 (review)
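
A minimal sketch of the pattern (the failing step is simulated with a plain exception here; in the real Cartopy tests it happens while plotting):

import matplotlib.pyplot as plt
import pytest


@pytest.mark.mpl_image_compare
def test_skip_after_figure_created():
    fig = plt.figure()
    fig.add_subplot()
    try:
        # Stand-in for the plotting step that can fail in the real tests.
        raise OSError('required resource unavailable')
    except OSError:
        pytest.skip('required resource unavailable')
    return fig

Prior to 0.16 the skip took effect immediately; with 0.16 the image comparison still runs and fails because no baseline image exists for the skipped case.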

@dopplershift
Contributor

We're also seeing this over in MetPy where we have a decorator that uses importorskip:

import functools

import pytest


def needs_cartopy(test_func):
    """Decorate a test function or fixture as requiring CartoPy.

    Will skip the decorated test, or any test using the decorated fixture, if ``cartopy`` is
    unable to be imported.
    """
    @functools.wraps(test_func)
    def wrapped(*args, **kwargs):
        pytest.importorskip('cartopy')
        return test_func(*args, **kwargs)
    return wrapped
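
For reference, a hypothetical test combining this decorator with pytest-mpl might look like the following (the body is illustrative, not an actual MetPy test):

@needs_cartopy
@pytest.mark.mpl_image_compare
def test_map_plot():
    import matplotlib.pyplot as plt

    fig = plt.figure()
    fig.add_subplot()
    return fig

When cartopy cannot be imported, pytest.importorskip raises a skip inside the wrapper before a figure is ever returned, yet the comparison machinery still expects one.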

@ConorMacBride
Member

Prior to v0.16.0, pytest-mpl wrapped the user's entire test function in a pytest-mpl wrapper which ran the image comparison tests. That way, the function always exited as soon as the test was skipped or failed etc. However, this approach resulted in tests inside classes not being run under pytest 7. I fixed the issue with the classes in #164, but in doing so I changed how pytest-mpl integrates with pytest, since that seemed like the simplest long-term solution (the design of wrapping most of pytest-mpl inside each of the user's test functions didn't seem great). That change is what has now produced this issue.

The issue is that in the pytest_runtest_call hook inside pytest-mpl, we assume a figure will always be returned for testing if the test is marked with mpl_image_compare (see the code below). However, this is not the case if the test is skipped or if an exception is raised during the test. In those cases we should return from the hook without running any image comparison.

There is a very simple fix for this which will work for #171, so I'll push an update to that PR tomorrow which will solve this issue (and I'll add some tests for it). That PR introduces yet another change to the pytest hooks. Maybe the best option would be to revert to the pre-v0.16.0 pytest hooks, which worked for a long time, and find a different solution to the classes bug? It would be great if a pytest dev could review these hooks.

@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_call(self, item):
    yield
    if get_compare(item) is not None:
        close_mpl_figure(self.result)

ConorMacBride added a commit to ConorMacBride/pytest-mpl that referenced this issue Jun 29, 2022
If the return value hasn't been added to the dictionary, it means that the user's test didn't reach the end: it raised an exception or was otherwise stopped early by pytest (skipped, failed, etc.). In any case, we should not proceed with figure testing and should instead use the result that pytest has already determined.

Fixes matplotlib#172
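
A rough sketch of that guard, under the assumption that the hook stores each test's return value in a dictionary keyed by the test item (the name self.return_value and the exact keying are illustrative; the real change is in #171):

@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_call(self, item):
    yield
    if get_compare(item) is None:
        return
    if item not in self.return_value:
        # The test never returned a figure: it was skipped, failed, or
        # raised before reaching the end, so keep the result pytest has
        # already determined and skip the image comparison entirely.
        return
    close_mpl_figure(self.return_value[item])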