
options for context and fail when unused fixtures are present #4

Open
wants to merge 3 commits into main
Conversation

will-byrne-cardano

@mikicz - Thanks for creating this plugin; I find it pretty useful for keeping things clean when refactoring or working in large codebases.

In this PR I have attempted to add a couple of extra options to extend it a bit.

--unused-fixtures-fail-when-present

A boolean option that, when set to true, sets the pytest exit code to failure when unused fixtures are present in the test session. I think this could be useful for CI/CD scenarios where this plugin is used.
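To illustrate, such an option could plausibly be wired up through pytest's `pytest_addoption` and `pytest_sessionfinish` hooks. This is a minimal sketch, not the PR's actual implementation; the helper `compute_exit_status` and the `_unused_fixtures` attribute are hypothetical names used only for illustration.

```python
def pytest_addoption(parser):
    # Hypothetical wiring; the flag name comes from the PR description.
    parser.addoption(
        "--unused-fixtures-fail-when-present",
        action="store_true",
        default=False,
        help="Fail the pytest session when unused fixtures are found.",
    )


def compute_exit_status(current_status: int, unused_count: int,
                        fail_when_present: bool) -> int:
    """Return the exit status the session should finish with.

    Only escalate an otherwise-passing run (status 0) to a failure (1,
    i.e. pytest.ExitCode.TESTS_FAILED); never mask a real failure.
    """
    if fail_when_present and unused_count > 0 and current_status == 0:
        return 1
    return current_status


def pytest_sessionfinish(session, exitstatus):
    # `_unused_fixtures` is a made-up attribute standing in for wherever
    # the plugin stores its findings during collection.
    unused = getattr(session.config, "_unused_fixtures", [])
    if session.config.getoption("--unused-fixtures-fail-when-present"):
        session.exitstatus = compute_exit_status(exitstatus, len(unused), True)
```

Keeping the decision in a pure helper like `compute_exit_status` makes the behaviour easy to unit-test without spinning up a full pytest session.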

--unused-fixtures-context

Pass one or more directories that act as the context for unused fixtures, i.e. only report fixtures as unused if they are defined in one of the context directories. This makes it easy to limit the scope of the check to just the tests directory in the current repo.
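The directory check described above could be done with `pathlib`. A sketch, assuming each fixture carries the path of the file that defines it; the helper name `in_context` is made up for illustration:

```python
from pathlib import Path


def in_context(fixture_path: str, context_dirs: list[str]) -> bool:
    """Return True if the fixture's defining file lives under one of the
    given context directories."""
    path = Path(fixture_path).resolve()
    for context in context_dirs:
        try:
            # relative_to raises ValueError when `path` is outside `context`.
            path.relative_to(Path(context).resolve())
            return True
        except ValueError:
            continue
    return False
```

The plugin would then only report a fixture as unused when `in_context(...)` is true, so fixtures shipped by pytest itself or by other installed plugins fall outside the context and are skipped.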

baseid check

I added the baseid check to pytest_collection_finish because the tests were failing when I ran them locally before making any changes. The logs reported unused fixtures from the _pytest module (even with my virtual environment named venv). Example log output:

------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------
============================= test session starts =============================
platform win32 -- Python 3.11.9, pytest-8.3.3, pluggy-1.5.0
rootdir: XXX\pytest-unused-fixtures
configfile: pyproject.toml
plugins: unused-fixtures-0.1.5, xdist-3.6.1
collected 1 item

..\..\..\..\..\..\GitHub\pytest-unused-fixtures\test_default.py .        [100%]

=============================== UNUSED FIXTURES ===============================
cache -- C:555
capsysbinary -- C:1005
capfd -- C:1033
capfdbinary -- C:1061
capsys -- C:977
doctest_namespace [session scope] -- C:740
pytestconfig [session scope] -- C:1344
record_property -- C:279
record_xml_attribute -- C:302
record_testsuite_property [session scope] -- C:340
tmpdir_factory [session scope] -- C:297
tmpdir -- C:304
caplog -- C:597
monkeypatch -- C:30
recwarn -- C:34
tmp_path_factory [session scope] -- C:241
tmp_path -- C:256
--------------------- fixtures defined from test_default ----------------------
fixture_c -- test_default.py:14
--------------------- fixtures defined from xdist.plugin ----------------------
worker_id [session scope] -- C:354
testrun_uid [session scope] -- C:363
============================== 1 passed in 1.04s ==============================
================================================================= short test summary info =================================================================
FAILED tests/test_base_class.py::test_default - Failed: nomatch: '*UNUSED FIXTURES*'
FAILED tests/test_ignore.py::test_default - Failed: nomatch: '*UNUSED FIXTURES*'
==================================================================== 2 failed in 2.86s ====================================================================

I'm not 100% sure this is the right thing to do, as I'm not an expert on the pytest internals. There's some info in the docs here.
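For context, the built-in and plugin fixtures appearing in the log above (`cache`, `capsys`, `worker_id`, etc.) have, as far as I understand, an empty `baseid` on their `FixtureDef`, while fixtures defined in collected files carry a non-empty one, so a filter along these lines would drop them. This sketch uses a stand-in dataclass rather than pytest's real `FixtureDef`:

```python
from dataclasses import dataclass


@dataclass
class FixtureInfo:
    # Minimal stand-in for pytest's FixtureDef: plugin-provided fixtures
    # (e.g. those from _pytest or xdist) have an empty baseid.
    name: str
    baseid: str


def fixtures_from_collected_files(fixtures: list[FixtureInfo]) -> list[FixtureInfo]:
    """Keep only fixtures defined within the collected tree (non-empty
    baseid), dropping built-ins such as cache, capsys, or worker_id."""
    return [f for f in fixtures if f.baseid]
```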

Tests

I haven't added any tests for the new functionality yet; I wanted to get your feedback on the proposed changes first. Happy to add these if you are happy with the proposed features.

Let me know your thoughts

fail-when-present option to fail the pytest session if there are unused fixtures found
@mikicz
Owner

mikicz commented Oct 21, 2024

Hey - all of those changes make sense to me! As long as the current tests pass I am happy with the baseid check.

Could I ask you to write tests and update the README with the new options? Then I'd be happy to merge.

@will-byrne-cardano
Author

will-byrne-cardano commented Oct 21, 2024

Thanks for agreeing to my proposed changes.

I will get round to the updates to tests and README this week and tag you for a review once complete

@will-byrne-cardano will-byrne-cardano marked this pull request as draft October 21, 2024 12:17
@will-byrne-cardano will-byrne-cardano marked this pull request as ready for review October 21, 2024 21:27
@will-byrne-cardano
Author

@mikicz - tests added and README updated. Let me know if you have any comments

mikicz previously approved these changes Oct 22, 2024
@mikicz mikicz dismissed their stale review October 22, 2024 08:25

Tests are failing

@mikicz
Owner

mikicz commented Oct 22, 2024

Tests seem to be failing on CI, could you have a look why? (It does pass locally for me.)

@will-byrne-cardano
Copy link
Author

Tests seem to be failing on CI, could you have a look why? (It does pass locally for me.)

Yes, I will take a look. They were passing locally for me as well.
