
Preprocess async fixtures for each event loop #906

Merged: 7 commits into pytest-dev:main (Aug 9, 2024)

Conversation

@cstruct (Contributor) commented Aug 5, 2024

Caching of fixture preprocessing is now also keyed by event loop id. Sync fixtures can be processed if they wrap an async fixture. Each async fixture has a mapping from root scope name to fixture id, which is now used to dynamically look up the event loop fixture.

Three tests fail for me locally and two more are flaky; some of these fail without my changes.

I'm not sure whether I'm conflating fixture scope and loop scope, and whether that is contributing to the failing tests.

I don't feel I've written the clearest code, but the only way out of that I can see is a broader refactoring. Before undertaking that, I'd like feedback on whether this approach is even worth pursuing.

This fixes #862.
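
A minimal sketch of the idea, using hypothetical names (this is not the plugin's actual code): preprocessing results are cached per fixture and loop scope, and the event loop fixture for a given scope is looked up dynamically via request.getfixturevalue when the fixture is set up.

# Illustrative sketch only; all names here are hypothetical, not pytest-asyncio internals.
_processed_fixtures: dict[tuple[str, str], object] = {}  # keyed by (fixture id, loop scope)

# hypothetical map from a loop scope name to the id of the fixture providing that loop
_loop_fixture_ids = {
    "session": "_session_event_loop",
    "module": "_module_event_loop",
}

def resolve_event_loop(request, loop_scope):
    """Look up the event loop fixture for the requested scope at setup time."""
    return request.getfixturevalue(_loop_fixture_ids[loop_scope])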

@seifertm (Contributor) commented Aug 7, 2024

Thanks! I'll have a look at it right away.

It seems like this issue should be tackled as part of the v0.24 release. Version 0.24 is supposed to allow users of v0.21 and v0.23 to upgrade to a common version before overrides of the event_loop fixture are removed. The removal will allow a bunch of refactoring and simplification of the code. Therefore, I think we should hold off on larger refactorings until v0.24 is out.

@seifertm seifertm added this to the v0.24 milestone Aug 7, 2024
@minrk commented Aug 7, 2024

I ran into this issue myself in jupyterhub/jupyterhub#4664, and this PR seems to mostly fix it.

The one case that still reproduces it for me is adding an autouse=True, function-scoped async fixture combined with a module-scoped loop:

import pytest_asyncio

@pytest_asyncio.fixture(loop_scope="module", autouse=True)
async def cleanup_after():
    yield
    print("async cleanup after")

run in this sample test environment:

> pytest
================================= test session starts =================================
platform darwin -- Python 3.11.9, pytest-8.3.2, pluggy-1.5.0 -- /Users/minrk/.virtualenvs/pyt-asy/bin/python
cachedir: .pytest_cache
rootdir: /Users/minrk/dev/jpy/temp/pytest-atest
configfile: pytest.ini
plugins: asyncio-0.24.0a1.dev1+gee75d65
asyncio: mode=Mode.AUTO, default_loop_scope=module
collected 2 items
run-last-failure: rerun previous 1 failure first

test_a.py::test_a ERROR                                                         [ 50%]
test_b.py::test_a PASSED                                                        [100%]

======================================= ERRORS ========================================
______________________________ ERROR at setup of test_a _______________________________
file /Users/minrk/dev/jpy/temp/pytest-atest/test_a.py, line 5
  async def test_a(current_loop):
      assert current_loop is asyncio.get_running_loop()
file /Users/minrk/dev/jpy/temp/pytest-atest/conftest.py, line 8
  @pytest_asyncio.fixture(loop_scope="module", autouse=True)
  async def cleanup_after():
      yield
      print("cleaning up after each test")
E       fixture 'test_b.py::<event_loop>' not found
>       available fixtures: _session_event_loop, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, cleanup_after, current_loop, doctest_namespace, event_loop, event_loop_policy, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, test_a.py::<event_loop>, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, unused_tcp_port, unused_tcp_port_factory, unused_udp_port, unused_udp_port_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/Users/minrk/dev/jpy/temp/pytest-atest/conftest.py:8
=============================== short test summary info ===============================
ERROR test_a.py::test_a
============================= 1 passed, 1 error in 0.01s ==============================
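
For context, here is a guess at the surrounding repro files, reconstructed from the traceback above; the actual sample environment is not shown, so treat the current_loop fixture and the test bodies as assumptions (asyncio auto mode with default_loop_scope=module, as in the output).

# conftest.py (hypothetical reconstruction)
import asyncio

import pytest_asyncio

@pytest_asyncio.fixture(loop_scope="module")
async def current_loop():
    # expose the loop this fixture runs on, so tests can compare against it
    return asyncio.get_running_loop()

@pytest_asyncio.fixture(loop_scope="module", autouse=True)
async def cleanup_after():
    yield
    print("async cleanup after")

# test_a.py (test_b.py assumed to be analogous)
import asyncio

async def test_a(current_loop):
    assert current_loop is asyncio.get_running_loop()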

@seifertm seifertm linked an issue Aug 7, 2024 that may be closed by this pull request
@seifertm (Contributor) commented Aug 7, 2024

@cstruct I managed to get the parametrization tests of the event_loop fixture working by adding event_loop argument injection next to the injection of the request arg. However, that approach ended up with tests referencing (non-autouse) fixtures from completely different modules, which I didn't manage to solve.

Therefore, I combined your idea of using request.getfixturevalue(…) in the fixture synchronizers with determining the event loop at runtime (see #668, #669). Back then, I had to close that PR because it relied on event_loop no longer being an explicit argument in fixtures and tests.
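
As a rough illustration of that idea (a simplified sketch with made-up names, not the merged synchronizer code), the sync wrapper around an async fixture can resolve its event loop via request.getfixturevalue when it runs, instead of declaring the loop fixture as an argument:

def synchronize_async_fixture(async_gen_func, loop_fixture_name):
    """Build a sync generator fixture body that resolves its event loop at runtime."""
    def sync_fixture(request):
        # Look up the loop fixture at setup time rather than declaring it as an
        # argument, so the correct per-scope loop is chosen for each test module.
        loop = request.getfixturevalue(loop_fixture_name)
        agen = async_gen_func()
        value = loop.run_until_complete(agen.__anext__())
        yield value
        try:
            # run the async fixture's teardown on the same loop
            loop.run_until_complete(agen.__anext__())
        except StopAsyncIteration:
            pass
    return sync_fixture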

It looks like we got it working, thanks to your efforts. I left the commit history as is, but most of it should be squashed in my opinion.

The new code creates problems with pytest 7, though. I cannot say why this is the case at the moment.

@minrk Thanks for raising this. I haven't checked your example against the most recent changes yet.

@seifertm seifertm force-pushed the main branch 2 times, most recently from f228b0d to e7b5c69 Compare August 7, 2024 11:06
@minrk commented Aug 7, 2024

jupyterhub fixtures are now working with f228b0d

@seifertm seifertm force-pushed the main branch 2 times, most recently from 32704d2 to 0bf2566 Compare August 7, 2024 11:15
@cstruct (Contributor, Author) commented Aug 7, 2024

Thank you @seifertm! I played whack-a-mole with failing tests this morning to no avail, so I am very happy you picked this up.

@seifertm seifertm force-pushed the main branch 3 times, most recently from 0aa2854 to 2044e56 Compare August 7, 2024 12:01
@seifertm (Contributor) commented Aug 7, 2024

Apparently, the changes require at least pytest 8.2. Otherwise, the test suite emits ResourceWarnings.
This is possibly related to pytest-dev/pytest#1489

@codecov-commenter commented Aug 7, 2024

Codecov Report

Attention: Patch coverage is 92.10526% with 3 lines in your changes missing coverage. Please review.

Project coverage is 91.30%. Comparing base (f45aa18) to head (171da4f).

Files                     Patch %   Lines
pytest_asyncio/plugin.py  92.10%    1 Missing and 2 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #906      +/-   ##
==========================================
- Coverage   91.50%   91.30%   -0.20%     
==========================================
  Files           2        2              
  Lines         506      506              
  Branches      100       98       -2     
==========================================
- Hits          463      462       -1     
- Misses         25       26       +1     
  Partials       18       18              


@seifertm seifertm linked an issue Aug 7, 2024 that may be closed by this pull request
@seifertm (Contributor) left a review comment

In my opinion it's fine to require pytest 8.2 or newer. If it solves a bug, I'd rather do that than try to work around whatever issue is causing the ResourceWarnings.

@seifertm seifertm added this pull request to the merge queue Aug 9, 2024
@seifertm seifertm removed this pull request from the merge queue due to a manual request Aug 9, 2024
@seifertm seifertm enabled auto-merge August 9, 2024 11:52
@seifertm seifertm added this pull request to the merge queue Aug 9, 2024
Merged via the queue into pytest-dev:main with commit 59afabf Aug 9, 2024
17 checks passed
@cstruct (Contributor, Author) commented Aug 9, 2024

Thank you for your hard work @seifertm! It was a pleasure to be involved!
