🧪 Make pytest notify us about future warnings #15620
Conversation
@fosterseth this is what I mentioned some time ago when looking at that rrule bug ^
In essence, this configures Python to turn any warnings emitted at runtime into errors [1]. This is a best practice that allows reacting to future deprecation announcements coming from dependencies (direct, transitive, or even CPython itself) [2].

The typical workflow looks like this:

1. If a dependency is updated and a warning is hit in tests, the deprecated thing should be replaced with newer APIs.
2. If a dependency is transitive, or we otherwise have no control over it, the specific warning (a regex matching its message, plus the module reference where possible) can be added to the list of temporary ignores in `pytest.ini`.
3. The list of temporary ignores should be reevaluated periodically, including whenever dependencies are re-pinned in the lockfile.

[1]: https://docs.python.org/3/using/cmdline.html#cmdoption-W
[2]: https://pytest-with-eric.com/configuration/pytest-ignore-warnings/
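For readers unfamiliar with the mechanism, here is a minimal sketch of what this looks like in `pytest.ini`; the entry under the FIXME is a hypothetical placeholder, not one of the actual ignores in this PR:

```ini
[pytest]
filterwarnings =
    # Escalate every warning that reaches the test run into an error.
    error
    # Temporary, documented exceptions follow. Each line is
    # action:message-regex:category:module and carries a FIXME explaining
    # what has to change before the entry can be deleted.
    # FIXME: Drop this once example_dep >= 2.0 is pinned.
    once:old_api\(\) is deprecated:DeprecationWarning:example_dep.module
```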
pytest.ini
# FIXME: Use `open()` via a context manager
# FIXME: in `awx/main/utils/common.py` to close hanging file descriptors
# FIXME: and then delete the entry.
once:unclosed file <_io.BufferedWriter name='[^']+'>:ResourceWarning:awx.main.utils.common
This feels like re-arranging deck chairs on the Titanic. Great that your change gets us exposed to new warnings, but it will make dependency upgrades take longer, which already happen too infrequently, and it almost guarantees that none of these issues will be worked. And they really should be.
The methods here are fine, but I'd like to at least have a plan to knock out the borderline absurd ones in the list.
I skipped addressing them because that wasn't my goal. I wanted to show how to configure pytest properly. Fixing them can be planned for whenever tech debt is worked on.
I don't think it causes more work; it just shifts when the mandatory changes need to be made. Besides, you can always add more ignores if you want to postpone.
Is this PR, itself, not tech debt? And like I said, I'm fine to file a followup, but "whenever tech debt is worked on" doesn't happen. Even a catch-all issue referencing your list will never get attention.
I meant that as a comment on the warning about hanging file descriptors, because absolutely nobody will disagree with that change. For example:
Lines 27 to 28 in 69baa73
fd = open("/var/lib/awx/.tower_version", "r")
if fd.read().strip() != tower_version:
I don't want to just have this added to a forever-ignore list, unless, for some reason I'm wrong about the complexity of it. Again, filing issues about the specific issues being ignored might be a workable approach.
@AlanCoding I see that, but this is as much as I'm willing to contribute right now. So merging it already would be more useful than waiting for months to see if I'll ever get to it.
This PR is useful in that it would prevent new similar problems from appearing silently over time. That's what I want out of it. This patch is not tech debt; rather, it documents the existing issues, which need to be addressed regardless of whether the strictness setting is in place.
This doesn't just document existing issues; it changes the way of working by introducing an ignore list. The reason the warnings were not silenced before is a shared belief that they are important and should be resolved with code changes. That disposition doesn't necessarily change... but if you're not clear, people will do the wrong thing. I don't trust that without an example, so I created one:
Getting a new process for contributions without getting some contributions (separate PRs are fine) is not something I'm really happy with. But here, I'm seeking an explicit consensus between you and me, and specific followup issues.
Okay, that makes sense in a way. Though, I don't think it changes the process. Previously, deprecation warnings would have to be addressed at some point, and they were invisible (by default, `UserWarning` instances are printed on the terminal, while many things, `DeprecationWarning` and `PendingDeprecationWarning` included, never show up anywhere). To reveal those, one has to know upfront to invoke everything with `python -Werror -Im thing` (or similar via an env var). And then, those places would have to be fixed. The problem is that with invisible warnings accumulating over time, one would have to be aware of all of them, and the process amounted to fixing the errors too late, once upstreams replaced warnings with hard failures.
The current setup makes these things visible ahead of time, which is pretty much a standard best practice in the wider Python world. So to me, saying that the process changed would be too strong. I'd rather say it became more transparent.
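To make the visibility point concrete, here is a small standalone sketch (not code from this repo) showing both behaviours:

```python
import warnings


def old_api():
    """Stand-in for a dependency announcing a future removal."""
    warnings.warn("old_api() goes away in 2.0, use new_api()", DeprecationWarning, stacklevel=2)


# Default filters: DeprecationWarning is only displayed when it is triggered
# from __main__; raised from an imported module, it is dropped silently.
old_api()

# What `python -W error` / pytest's `filterwarnings = error` do, in effect:
warnings.simplefilter("error")
try:
    old_api()
except DeprecationWarning as exc:
    print(f"surfaced as an error: {exc}")
```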
Remaining easily workable categories I see are:
- FIXME: Delete this entry once `USE_L10N` use is removed
- remove distutils from repo (only use, not dep)
- Delete this entry once naive dates aren't passed to DB lookup
- 'index_together' is deprecated. Use 'Meta.indexes'
- Using QuerySet.iterator.. after prefetch_related.. without specifying chunk_size
- remove always-true assertions
- fix Pagination may yield inconsistent results with an unordered object_list. .class
- Channel's inbuilt http protocol AsgiHandler is deprecated. Use Django's get_asgi_application
- Use `codecs.open()` via a context manager
I would like these to be filed, so @djyasin can have them available for the next sprint. I would have liked to have them even for story pointing, but it might be too late for the current round.
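To show how mechanical some of these are, here is roughly what the `index_together` item amounts to, sketched against a made-up model (the real AWX models, field names, and the resulting migration still need to be checked):

```python
from django.db import models


class JobEvent(models.Model):  # hypothetical model, not the real AWX one
    job_id = models.IntegerField()
    counter = models.PositiveIntegerField()

    class Meta:
        # Deprecated spelling that triggers the warning:
        # index_together = [("job_id", "counter")]
        # Replacement:
        indexes = [models.Index(fields=["job_id", "counter"])]
```

Django's makemigrations then emits the corresponding index operations, so each converted model needs a follow-up migration.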
@AlanCoding fair. I was also postponing fixing things like `open()` under a context manager, since those are straightforward and could be a good learning opportunity for folks to try addressing, as opposed to more involved problems.
By the way, in the linked place in asgi, an even more elegant solution would be `pathlib.Path('/var/lib/awx/.tower_version').read_text().strip()`. This is to say that every place needs additional inspection, beyond the generic suggestion I've put in the comments.
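Spelled out against the snippet quoted earlier (assuming nothing else in the surrounding code needs the open file object), that would be roughly:

```python
from pathlib import Path

TOWER_VERSION_FILE = "/var/lib/awx/.tower_version"  # path from the quoted snippet
tower_version = "x.y.z"  # placeholder; defined elsewhere in the real module

# Before: the descriptor returned by open() is never closed, which is what
# the ResourceWarning entry in the ignore list is about.
fd = open(TOWER_VERSION_FILE, "r")
out_of_date = fd.read().strip() != tower_version

# After: read_text() opens and closes the file internally, so nothing leaks.
out_of_date = Path(TOWER_VERSION_FILE).read_text().strip() != tower_version
```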
# FIXME: Use `open()` via a context manager
# FIXME: in `awx/main/tests/unit/test_tasks.py` to close hanging file
# FIXME: descriptors and then delete the entry.
once:unclosed file <_io.TextIOWrapper name='[^']+' mode='r' encoding='UTF-8'>:ResourceWarning:awx.main.tests.unit.test_tasks
so this line can be removed with @AlanCoding's commit?
@fosterseth I think he removed one entry already, I'm not sure about others. There were several different warnings in some files. Somebody needs to check it.
with open(config_loc, 'r') as f:
    shade_config = f.read()
@AlanCoding here's an example of what I meant @ #15620 (comment)
Suggested change:
-with open(config_loc, 'r') as f:
-    shade_config = f.read()
+shade_config = Path(config_loc).read_text()
def tmp_write(path, data):
    with open(path, 'wb') as f:
        f.write(data)

threading.Thread(target=tmp_write, args=(path, data)).start()
@AlanCoding another example of #15620 (comment):
Suggested change:
-def tmp_write(path, data):
-    with open(path, 'wb') as f:
-        f.write(data)
-threading.Thread(target=tmp_write, args=(path, data)).start()
+threading.Thread(target=lambda p, d: Path(p).write_bytes(d), args=(path, data)).start()
This is one of the reasons I didn't really want to include fixing things in the scope of this PR: a good solution might need to be iterated on, which could make the scope grow indefinitely.
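As one example of such iteration, a named helper could keep the readability of the original while still dropping the manual `open()`; this is purely illustrative, since the real call site may have other constraints:

```python
import threading
from pathlib import Path

path, data = "/tmp/example.bin", b"payload"  # stand-ins for the real arguments


def tmp_write(path, data):
    # Path.write_bytes() opens and closes the file itself, so no descriptor
    # can leak, and the helper keeps a descriptive name instead of a lambda.
    Path(path).write_bytes(data)


threading.Thread(target=tmp_write, args=(path, data)).start()
```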
That's fine with me. I suspected `Path` could do this more clearly. It's your PR, you can include it or not.
I didn't exactly expect you to merge it. I could have changed the base after your PR.
@AlanCoding Ah, I was under the impression that you wanted it to be a part of the PR before merging. Anyway, it's not really blocking, I just wanted to be explicit about my observations. I think I'll add these code suggestions on top and then wait until somebody hits merge, then.
Or maybe not… This import isn't there, and I won't be doing this from the browser. Let's leave it as is and document somewhere that it would be nice to address.