Compare commits

...

60 Commits

Author SHA1 Message Date
Alex Waygood
1a30934c33 [ty] Cleanup various APIs 2025-10-30 16:52:13 -04:00
Alex Waygood
13375d0e42 [ty] Use the top materialization of classes for narrowing in class-patterns for match statements (#21150) 2025-10-30 20:44:51 +00:00
Douglas Creager
c0b04d4b7c [ty] Update "constraint implication" relation to work on constraints between two typevars (#21068)
It's possible for a constraint to mention two typevars. For instance, in
the body of

```py
def f[S: int, T: S](): ...
```

the baseline constraint set would be `(T ≤ S) ∧ (S ≤ int)`. That is, `S`
must specialize to some subtype of `int`, and `T` must specialize to a
subtype of the type that `S` specializes to.

This PR updates the new "constraint implication" relationship from
#21010 to work on these kinds of constraint sets. For instance, in the
example above, we should be able to see that `T ≤ int` must always hold:

```py
def f[S, T]():
    constraints = ConstraintSet.range(Never, S, int) & ConstraintSet.range(Never, T, S)
    static_assert(constraints.implies_subtype_of(T, int))  # now succeeds!
```

This did not require major changes to the implementation of
`implies_subtype_of`. That method already relies on how our `simplify`
and `domain` methods expand a constraint set to include the transitive
closure of the constraints that it mentions, and to mark certain
combinations of constraints as impossible. Previously, that transitive
closure logic only looked at pairs of constraints that constrain the
same typevar. (For instance, to notice that `(T ≤ bool) ∧ ¬(T ≤ int)` is
impossible.)

Now we also look at pairs of constraints that constrain different
typevars, if one of the constraints is bound by the other — that is,
pairs of the form `T ≤ S` and `S ≤ something`, or `S ≤ T` and
`something ≤ S`. In those cases, transitivity lets us add a new derived
constraint that `T ≤ something` or `something ≤ T`, respectively. Having
done that, our existing `implies_subtype_of` logic finds and takes that
derived constraint into account.
2025-10-30 16:11:04 -04:00
Brent Westbrook
1c7ea690a8 [flake8-type-checking] Fix TC003 false positive with future-annotations (#21125)
Summary
--

Fixes #21121 by upgrading `RuntimeEvaluated` annotations like
`dataclasses.KW_ONLY` to `RuntimeRequired`. We already had special
handling for `TypingOnly` annotations in this context but not for
`RuntimeEvaluated`. Combining that with the `future-annotations`
setting, which allowed ignoring the `RuntimeEvaluated` flag, led to the
reported bug where we would try to move `KW_ONLY` into a
`TYPE_CHECKING` block.

Test Plan
--

A new test based on the issue
2025-10-30 14:14:29 -04:00
Alex Waygood
9bacd19c5a [ty] Fix lookup of __new__ on instances (#21147)
## Summary

We weren't correctly modeling it as a `staticmethod` in all cases,
leading us to incorrectly infer that the `cls` argument would be bound
if it was accessed on an instance (rather than the class object).
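A small sketch of the underlying Python semantics being modeled here:

```py
class C:
    def __new__(cls) -> "C":
        return super().__new__(cls)

c = C()
# `__new__` is implicitly a staticmethod: looking it up on an instance
# yields a plain function, so `cls` is never bound automatically.
obj = c.__new__(C)
```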

## Test Plan

Added mdtests that fail on `main`. The primer output also looks good!
2025-10-30 13:42:46 -04:00
Brent Westbrook
f0fe6d62fb Fix syntax error false positive on nested alternative patterns (#21104)
## Summary

Fixes #21101 by storing the child visitor's names in the parent visitor.
This makes sure that `visitor.names` on line 1818 isn't empty after we
visit a nested OR pattern.

## Test Plan

New inline test cases derived from the issue,
[playground](https://play.ruff.rs/7b6439ac-ee8f-4593-9a3e-c2aa34a595d0)
2025-10-30 13:40:03 -04:00
Prakhar Pratyush
10bda3df00 [pyupgrade] Fix false positive for TypeVar with default on Python <3.13 (UP046,UP047) (#21045)
## Summary

Type parameter defaults were added in Python 3.13 (PEP 696).

`typing_extensions.TypeVar` backports the `default` argument to earlier
versions.

`UP046` and `UP047` were triggered when a `typing_extensions.TypeVar`
with a `default` argument was used on Python versions below 3.13. The
rules shouldn't fire there, since the suggested rewrite requires 3.13
syntax.

This commit fixes the bug by adding a Python version check before
triggering the rules.

Fixes #20929.

## Test Plan

### Manual testing 1

As the issue author pointed out in
https://github.com/astral-sh/ruff/issues/20929#issuecomment-3413194511,
I ran the following on the `main` branch:
> % cargo run -p ruff -- check ../efax/ --target-version py312 --no-cache

<details><summary>Output</summary>

```zsh
   Compiling ruff_linter v0.14.1 (/Users/prakhar/ruff/crates/ruff_linter)
   Compiling ruff v0.14.1 (/Users/prakhar/ruff/crates/ruff)
   Compiling ruff_graph v0.1.0 (/Users/prakhar/ruff/crates/ruff_graph)
   Compiling ruff_workspace v0.0.0 (/Users/prakhar/ruff/crates/ruff_workspace)
   Compiling ruff_server v0.2.2 (/Users/prakhar/ruff/crates/ruff_server)
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 6.72s
     Running `target/debug/ruff check ../efax/ --target-version py312 --no-cache`
UP046 Generic class `ExpectationParametrization` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/efax/efax/_src/expectation_parametrization.py:17:48
   |
17 | class ExpectationParametrization(Distribution, Generic[NP]):
   |                                                ^^^^^^^^^^^
18 |     """The expectation parametrization of an exponential family distribution.
   |
help: Use type parameters

UP046 Generic class `ExpToNat` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/efax/efax/_src/mixins/exp_to_nat/exp_to_nat.py:27:68
   |
26 | @dataclass
27 | class ExpToNat(ExpectationParametrization[NP], SimpleDistribution, Generic[NP]):
   |                                                                    ^^^^^^^^^^^
28 |     """This mixin implements the conversion from expectation to natural parameters.
   |
help: Use type parameters

UP046 Generic class `HasEntropyEP` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/efax/efax/_src/mixins/has_entropy.py:25:20
   |
23 |                    HasEntropy,
24 |                    JaxAbstractClass,
25 |                    Generic[NP]):
   |                    ^^^^^^^^^^^
26 |     @abstract_jit
27 |     @abstractmethod
   |
help: Use type parameters

UP046 Generic class `HasEntropyNP` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/efax/efax/_src/mixins/has_entropy.py:64:20
   |
62 | class HasEntropyNP(NaturalParametrization[EP],
63 |                    HasEntropy,
64 |                    Generic[EP]):
   |                    ^^^^^^^^^^^
65 |     @jit
66 |     @final
   |
help: Use type parameters

UP046 Generic class `NaturalParametrization` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/efax/efax/_src/natural_parametrization.py:43:30
   |
41 | class NaturalParametrization(Distribution,
42 |                              JaxAbstractClass,
43 |                              Generic[EP, Domain]):
   |                              ^^^^^^^^^^^^^^^^^^^
44 |     """The natural parametrization of an exponential family distribution.
   |
help: Use type parameters

UP046 Generic class `Structure` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/efax/efax/_src/structure/structure.py:31:17
   |
30 | @dataclass
31 | class Structure(Generic[P]):
   |                 ^^^^^^^^^^
32 |     """This class generalizes the notion of type for Distribution objects.
   |
help: Use type parameters

UP046 Generic class `DistributionInfo` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/efax/tests/distribution_info.py:20:24
   |
20 | class DistributionInfo(Generic[NP, EP, Domain]):
   |                        ^^^^^^^^^^^^^^^^^^^^^^^
21 |     def __init__(self, dimensions: int = 1, safety: float = 0.0) -> None:
22 |         super().__init__()
   |
help: Use type parameters

Found 7 errors.
No fixes available (7 hidden fixes can be enabled with the `--unsafe-fixes` option).
```
</details> 

Running it after the changes:
```zsh
ruff % cargo run -p ruff -- check ../efax/ --target-version py312 --no-cache
   Compiling ruff_linter v0.14.1 (/Users/prakhar/ruff/crates/ruff_linter)
   Compiling ruff v0.14.1 (/Users/prakhar/ruff/crates/ruff)
   Compiling ruff_graph v0.1.0 (/Users/prakhar/ruff/crates/ruff_graph)
   Compiling ruff_workspace v0.0.0 (/Users/prakhar/ruff/crates/ruff_workspace)
   Compiling ruff_server v0.2.2 (/Users/prakhar/ruff/crates/ruff_server)
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 7.86s
     Running `target/debug/ruff check ../efax/ --target-version py312 --no-cache`
All checks passed!
```

---

### Manual testing 2

Ran the check on the following script (mainly to verify `UP047`):
```py
from __future__ import annotations                                                                                                                                                    

from typing import Generic

from typing_extensions import TypeVar

T = TypeVar("T", default=int)


def generic_function(var: T) -> T:
    return var


Q = TypeVar("Q", default=str)


class GenericClass(Generic[Q]):
    var: Q
```

On `main` branch:
> ruff % cargo run -p ruff -- check ~/up046.py --target-version py312 --preview --no-cache

<details><summary>Output</summary>

```zsh
   Compiling ruff_linter v0.14.1 (/Users/prakhar/ruff/crates/ruff_linter)
   Compiling ruff v0.14.1 (/Users/prakhar/ruff/crates/ruff)
   Compiling ruff_graph v0.1.0 (/Users/prakhar/ruff/crates/ruff_graph)
   Compiling ruff_workspace v0.0.0 (/Users/prakhar/ruff/crates/ruff_workspace)
   Compiling ruff_server v0.2.2 (/Users/prakhar/ruff/crates/ruff_server)
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 7.43s
     Running `target/debug/ruff check /Users/prakhar/up046.py --target-version py312 --preview --no-cache`
UP047 Generic function `generic_function` should use type parameters
  --> /Users/prakhar/up046.py:10:5
   |
10 | def generic_function(var: T) -> T:
   |     ^^^^^^^^^^^^^^^^^^^^^^^^
11 |     return var
   |
help: Use type parameters

UP046 Generic class `GenericClass` uses `Generic` subclass instead of type parameters
  --> /Users/prakhar/up046.py:17:20
   |
17 | class GenericClass(Generic[Q]):
   |                    ^^^^^^^^^^
18 |     var: Q
   |
help: Use type parameters

Found 2 errors.
No fixes available (2 hidden fixes can be enabled with the `--unsafe-fixes` option).
```

</details> 

After the fix (this branch):
```zsh
ruff % cargo run -p ruff -- check ~/up046.py --target-version py312 --preview --no-cache
   Compiling ruff_linter v0.14.1 (/Users/prakhar/ruff/crates/ruff_linter)
   Compiling ruff v0.14.1 (/Users/prakhar/ruff/crates/ruff)
   Compiling ruff_graph v0.1.0 (/Users/prakhar/ruff/crates/ruff_graph)
   Compiling ruff_workspace v0.0.0 (/Users/prakhar/ruff/crates/ruff_workspace)
   Compiling ruff_server v0.2.2 (/Users/prakhar/ruff/crates/ruff_server)
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 7.40s
     Running `target/debug/ruff check /Users/prakhar/up046.py --target-version py312 --preview --no-cache`
All checks passed!
```

Signed-off-by: Prakhar Pratyush <prakhar1144@gmail.com>
2025-10-30 12:59:07 -04:00
David Peter
e55bc943e5 [ty] Reachability and narrowing for enum methods (#21130)
## Summary

Adds proper type narrowing and reachability analysis for matching on
non-inferable type variables bound to enums. For example:

```py
from enum import Enum

class Answer(Enum):
    NO = 0
    YES = 1

    def is_yes(self) -> bool:  # no error here!
        match self:
            case Answer.YES:
                return True
            case Answer.NO:
                return False
```

closes https://github.com/astral-sh/ty/issues/1404

## Test Plan

Added regression tests
2025-10-30 15:38:57 +01:00
David Peter
1b0ee4677e [ty] Use range instead of custom IntIterable (#21138)
## Summary

We previously didn't understand `range` and wrote these custom
`IntIterable`/`IntIterator` classes for tests. We can now remove them
and make the tests shorter in some places.
2025-10-30 15:21:55 +01:00
Dan Parizher
1ebedf6df5 [ruff] Add support for additional eager conversion patterns (RUF065) (#20657)
## Summary

Fixes #20583
2025-10-29 21:45:08 +00:00
Dan Parizher
980b4c55b2 [ruff-ecosystem] Fix CLI crash on Python 3.14 (#21092) 2025-10-29 21:37:39 +00:00
David Peter
5139f76d1f [ty] Infer type of self for decorated methods and properties (#21123)
## Summary

Infer a type of unannotated `self` parameters in decorated methods /
properties.
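An illustrative case (the decorator choice here is hypothetical, not from the issue):

```py
from functools import lru_cache

class Point:
    def __init__(self, x: int) -> None:
        self.x = x

    # Even under a wrapping decorator, the unannotated `self` parameter
    # should be inferred as an instance of `Point`.
    @lru_cache
    def doubled(self) -> int:
        return self.x * 2
```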

closes https://github.com/astral-sh/ty/issues/1448

## Test Plan

Existing tests, some new tests.
2025-10-29 21:22:38 +00:00
Jonas Vacek
aca8ba76a4 [flake8-bandit] Fix correct example for S308 (#21128)

## Summary
Fixed the incorrect import example in the "correct example" section.

## Test Plan
🤷 
2025-10-29 15:03:56 -04:00
Matthew Mckee
7045898ffa [ty] Don't provide goto definition for definitions which are not reexported in builtins (#21127) 2025-10-29 18:39:36 +00:00
Wei Lee
d38a5292d2 [airflow] warn that airflow....DAG.create_dagrun has been removed (AIR301) (#21093) 2025-10-29 14:57:37 +00:00
Shunsuke Shibayama
83a00c0ac8 [ty] follow the breaking API changes made in salsa-rs/salsa#1015 (#21117) 2025-10-29 14:56:12 +00:00
Alex Waygood
8b22fd1a5f [ty] Rename Type::into_nominal_instance (#21124) 2025-10-29 10:18:33 -04:00
Andrew Gallant
765257bdce [ty] Filter out "unimported" from the current module
Note that this doesn't change the evaluation results unfortunately.
In particular, prior to this fix, the correct result was ranked above
the redundant result. Our MRR-based evaluation doesn't care about
anything below the rank of the correct answer, and so this change isn't
reflected in our evaluation.

Fixes astral-sh/ty#1445
2025-10-29 09:13:49 -04:00
Andrew Gallant
2d4e0edee4 [ty] Add evaluation test for auto-import including symbols in current module
This shouldn't happen. And indeed, currently, this results in a
sub-optimal ranking.
2025-10-29 09:13:49 -04:00
Andrew Gallant
9ce3fa3fe3 [ty] Refactor ty_ide completion tests
The status quo grew organically and didn't do well when one wanted to
mix and match different settings to generate a snapshot.

This does a small refactor to use more of a builder to generate
snapshots.
2025-10-29 09:13:49 -04:00
Andrew Gallant
196a68e4c8 [ty] Render import <...> in completions when "label details" isn't supported
This fixes a bug where the `import module` part of a completion for
unimported candidates would be missing. This makes it especially
confusing because the user can't tell where the symbol is coming from,
and there is no hint that an `import` statement will be inserted.

Previously, we were using [`CompletionItemLabelDetails`] to render the
`import module` part of the suggestion. But this is only supported in
clients that support version 3.17 (or newer) of the LSP specification.
It turns out that this support isn't widespread yet. In particular,
Helix doesn't seem to support "label details."

To fix this, we take a [cue from rust-analyzer][rust-analyzer-details].
We detect if the client supports "label details," and if so, use it.
Otherwise, we push the `import module` text into the completion label
itself.
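A rough Python sketch of the fallback logic (the names here are illustrative, not ty's actual API):

```py
def render_completion(label: str, module: str, supports_label_details: bool) -> dict:
    """Build a completion item for an unimported candidate symbol."""
    if supports_label_details:
        # LSP 3.17+ clients render the import hint in a separate field.
        return {"label": label, "labelDetails": {"description": f"import {module}"}}
    # Older clients: fold the hint into the completion label itself.
    return {"label": f"{label} (import {module})"}
```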

Fixes https://github.com/astral-sh/ruff/pull/20439#issuecomment-3313689568

[`CompletionItemLabelDetails`]: https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#completionItemLabelDetails
[rust-analyzer-details]: 5d905576d4/crates/rust-analyzer/src/lsp/to_proto.rs (L391-L404)
2025-10-29 08:50:41 -04:00
Dan Parizher
349061117c [refurb] Preserve digit separators in Decimal constructor (FURB157) (#20588)
## Summary

Fixes #20572
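For illustration, the equivalence the preserved separators rely on:

```py
from decimal import Decimal

# FURB157 suggests replacing the verbose string argument; the fix now
# keeps digit separators, rewriting Decimal("1_000") to Decimal(1_000)
# rather than Decimal(1000). All three spellings are equal in value:
assert Decimal("1_000") == Decimal(1_000) == Decimal(1000)
```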

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-10-28 17:47:52 -04:00
Takayuki Maeda
d0aebaa253 [ISC001] fix panic when string literals are unclosed (#21034)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-10-28 19:14:58 +00:00
Douglas Creager
17850eee4b [ty] Reformat constraint set mdtests (#21111)
This PR updates the mdtests that test how our generics solver interacts
with our new constraint set implementation. Because the rendering of a
constraint set can get long, this standardizes on putting the `revealed`
assertion on a separate line. We also add a `static_assert` test for
each constraint set to verify that they are all coerced into simple
`bool`s correctly.

This is a pure reformatting (not even a refactoring!) that changes no
behavior. I've pulled it out of #20093 to reduce the amount of effort
that will be required to review that PR.
2025-10-28 14:59:49 -04:00
Douglas Creager
4d2ee41e24 [ty] Move constraint set mdtest functions into ConstraintSet class (#21108)
We have several functions in `ty_extensions` for testing our constraint
set implementation. This PR refactors those functions so that they are
all methods of the `ConstraintSet` class, rather than being standalone
top-level functions. 🎩 to @sharkdp for pointing out that
`KnownBoundMethod` gives us what we need to implement that!
2025-10-28 14:32:41 -04:00
Takayuki Maeda
7b959ef44b Avoid sending an unnecessary "clear diagnostics" message for clients supporting pull diagnostics (#21105) 2025-10-28 18:24:35 +00:00
renovate[bot]
4c4ddc8c29 Update Rust crate ignore to v0.4.24 (#20979)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-10-28 17:49:26 +00:00
Micha Reiser
ae0343f848 [ty] Rename inner query for better debugging experience (#21106) 2025-10-28 11:26:05 +00:00
Douglas Creager
29462ea1d4 [ty] Add new "constraint implication" typing relation (#21010)
This PR adds the new **_constraint implication_** relationship between
types, aka `is_subtype_of_given`, which tests whether one type is a
subtype of another _assuming that the constraints in a particular
constraint set hold_.

For concrete types, constraint implication is exactly the same as
subtyping. (A concrete type is any fully static type that is not a
typevar. It can _contain_ a typevar, though — `list[T]` is considered
concrete.)

The interesting case is typevars. The other typing relationships (TODO:
will) all "punt" on the question when considering a typevar, by
translating the desired relationship into a constraint set. At some
point, though, we need to resolve a constraint set; at that point, we
can no longer punt on the question. Unlike with concrete types, the
answer will depend on the constraint set that we are considering.
2025-10-27 22:01:08 -04:00
Bhuminjay Soni
7fee62b2de [semantic error tests]: refactor semantic error tests to separate files (#20926)

## Summary

This PR refactors the semantic error tests into separate files.


## Test Plan


## CC
- @ntBre

---------

Signed-off-by: 11happy <soni5happy@gmail.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-10-27 21:18:11 +00:00
Brent Westbrook
96b60c11d9 Respect --output-format with --watch (#21097)
Summary
--

Fixes #19550

This PR copies our non-watch diagnostic rendering code into
`Printer::write_continuously` in preview mode, allowing it to use
whatever output format is passed in.

I initially marked this as also fixing #19552; that isn't true yet, but
it will be once this is stabilized and we can remove the warning.

Test Plan
--

Existing tests, but I don't think we have any `watch` tests, so some
manual testing as well. The default with just `ruff check --watch` is
still `concise`, adding just `--preview` still gives the `full` output,
and then specifying any other output format works, with JSON as one
example:

<img width="695" height="719" alt="Screenshot 2025-10-27 at 9 21 41 AM"
src="https://github.com/user-attachments/assets/98957911-d216-4fc4-8b6c-22c56c963b3f"
/>
2025-10-27 12:04:55 -04:00
Dylan
fffbe5a879 [pyflakes] Revert to stable behavior if imports for module lie in alternate branches for F401 (#20878)
Closes #20839
2025-10-27 10:23:36 -05:00
Dylan
116611bd39 Fix finding keyword range for clause header after statement ending with semicolon (#21067)
When formatting clause headers for clauses that are not their own node,
like an `else` clause or `finally` clause, we begin searching for the
keyword at the end of the previous statement. However, if the previous
statement ended in a semicolon this caused a panic because we only
expected trivia between the end of the last statement and the keyword.

This PR adjusts the starting point of our search for the keyword to
begin after the optional semicolon in these cases.
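A minimal shape of the code that previously triggered the panic (illustrative, based on the description above):

```py
# The statement before `finally` ends with a semicolon; the formatter's
# search for the `finally` keyword used to start before the `;`, find
# non-trivia there, and panic.
try:
    x = 1;
finally:
    done = True
```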

Closes #21065
2025-10-27 09:52:17 -05:00
Alex Waygood
db0e921db1 [ty] Fix bug where ty would think all types had an __mro__ attribute (#20995) 2025-10-27 11:19:12 +00:00
Micha Reiser
3c7f56f582 Restore indent.py (#21094) 2025-10-27 10:34:29 +00:00
Dan Parizher
8a73519b25 [flake8-django] Apply DJ001 to annotated fields (#20907) 2025-10-27 09:19:15 +01:00
Shahar Naveh
fa12fd0184 Clearer error message when line-length goes beyond threshold (#21072)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-10-27 07:42:48 +00:00
renovate[bot]
fdb8ea487c Update upload and download artifacts github actions (#21083)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-10-27 08:29:32 +01:00
renovate[bot]
d846a0319a Update dependency mdformat-mkdocs to v4.4.2 (#21088)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-27 07:47:46 +01:00
renovate[bot]
bca5d33385 Update cargo-bins/cargo-binstall action to v1.15.9 (#21086)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-27 07:47:12 +01:00
renovate[bot]
c83c4d52a4 Update Rust crate clap to v4.5.50 (#21090)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-27 07:46:38 +01:00
renovate[bot]
e692b7f1ee Update Rust crate get-size2 to v0.7.1 (#21091)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-27 07:44:02 +01:00
renovate[bot]
8e51db3ecd Update Rust crate bstr to v1.12.1 (#21089)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-27 07:43:39 +01:00
Auguste Lalande
64ab79e572 Add missing docstring sections to the numpy list (#20931)

## Summary

Add docstring sections that were missing from the numpy list, as
pointed out in #20923. For now these are only the official sections as
documented
[here](https://numpydoc.readthedocs.io/en/latest/format.html#sections).

## Test Plan

Added a test case for DOC102
2025-10-24 17:19:30 -04:00
Dan Parizher
1ade9a5943 [pydoclint] Fix false positive on explicit exception re-raising (DOC501, DOC502) (#21011)
## Summary
Fixes #20973: a `docstring-extraneous-exception` (DOC502) false
positive when exceptions mentioned in docstrings are caught and
explicitly re-raised using `raise e` or `raise e from None`.

## Problem Analysis
The DOC502 rule was incorrectly flagging exceptions mentioned in
docstrings as "not explicitly raised" when they were actually being
explicitly re-raised through exception variables bound in `except`
clauses.

**Root Cause**: The `BodyVisitor` in `check_docstring.rs` only checked
for direct exception references (like `raise OSError()`) but didn't
recognize when a variable bound to an exception in an `except` clause
was being re-raised.

**Example of the bug**:
```python
def f():
    """Do nothing.

    Raises
    ------
    OSError
        If the OS errors.
    """
    try:
        pass
    except OSError as e:
        raise e  # This was incorrectly flagged as not explicitly raising OSError
```

The issue occurred because `resolve_qualified_name(e)` couldn't resolve
the variable `e` to a qualified exception name, since `e` is just a
variable binding, not a direct reference to an exception class.

## Approach
Modified the `BodyVisitor` in
`crates/ruff_linter/src/rules/pydoclint/rules/check_docstring.rs` to:

1. **Track exception variable bindings**: Added `exception_variables`
field to map exception variable names to their exception types within
`except` clauses
2. **Enhanced raise statement detection**: Updated `visit_stmt` to check
if a `raise` statement uses a variable name that's bound to an exception
in the current `except` clause
3. **Proper scope management**: Clear exception variable mappings when
leaving `except` handlers to prevent cross-contamination

**Key changes**:
- Added `exception_variables: FxHashMap<&'a str, QualifiedName<'a>>` to
track variable-to-exception mappings
- Enhanced `visit_except_handler` to store exception variable bindings
when entering `except` clauses
- Modified `visit_stmt` to check for variable-based re-raising: `raise
e` → lookup `e` in `exception_variables`
- Clear mappings when exiting `except` handlers to maintain proper scope

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-10-24 16:54:09 -04:00
Ibraheem Ahmed
304ac22e74 [ty] Use constructor parameter types as type context (#21054)
## Summary

Resolves https://github.com/astral-sh/ty/issues/1408.
2025-10-24 20:14:18 +00:00
Douglas Creager
c3de8847d5 [ty] Consider domain of BDD when checking whether always satisfiable (#21050)
That PR title might be a bit inscrutable.

Consider the two constraints `T ≤ bool` and `T ≤ int`. Since
`bool ≤ int`, by transitivity `T ≤ bool` implies `T ≤ int`. (Every type
that is a subtype of `bool` is necessarily also a subtype of `int`.)
That means that `T ≤ bool ∧ T ≰ int` is an impossible combination of
constraints, and is therefore not a valid input to any BDD. We say that
that assignment is not in the _domain_ of the BDD.

The implication `T ≤ bool → T ≤ int` can be rewritten as
`T ≰ bool ∨ T ≤ int`. (That's the definition of implication.) If we
construct that constraint set in an mdtest, we should get a constraint
set that is always satisfiable. Previously, that constraint set would
correctly _display_ as `always`, but a `static_assert` on it would
fail.

The underlying cause is that our `is_always_satisfied` method would only
test if the BDD was the `AlwaysTrue` terminal node. `T ≰ bool ∨ T ≤ int`
does not simplify that far, because we purposefully keep around those
constraints in the BDD structure so that it's easier to compare against
other BDDs that reference those constraints.

To fix this, we need a more nuanced definition of "always satisfied".
Instead of evaluating to `true` for _every_ input, we only need it to
evaluate to `true` for every _valid_ input — that is, every input in its
domain.
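The new definition can be sanity-checked by brute force over the two boolean variables (a standalone sketch, not ty's BDD code):

```py
# Let `a` stand for "T ≤ bool" and `b` for "T ≤ int".
assignments = [(a, b) for a in (False, True) for b in (False, True)]

# Since bool ≤ int, the combination a ∧ ¬b is impossible: it lies
# outside the BDD's domain.
domain = [(a, b) for (a, b) in assignments if not (a and not b)]

def implies(a: bool, b: bool) -> bool:
    # "T ≰ bool ∨ T ≤ int": the implication rewritten as a disjunction.
    return (not a) or b

# Not true for *every* assignment, so this is not the AlwaysTrue terminal...
assert not all(implies(a, b) for (a, b) in assignments)
# ...but it is true for every assignment *in the domain*: "always satisfied".
assert all(implies(a, b) for (a, b) in domain)
```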
2025-10-24 13:37:56 -04:00
Ibraheem Ahmed
f17ddd62ad [ty] Avoid duplicate diagnostics during multi-inference of standalone expressions (#21056)
## Summary

Resolves https://github.com/astral-sh/ty/issues/1428.
2025-10-24 13:21:39 -04:00
Micha Reiser
adbf05802a [ty] Fix rare panic with highly cyclic TypeVar definitions (#21059) 2025-10-24 18:30:54 +02:00
Micha Reiser
eb8c0ad87c [ty] Add --no-progress option (#21063) 2025-10-24 18:00:00 +02:00
Shahar Naveh
a2d0d39853 Configurable "unparse mode" for ruff_python_codegen::Generator (#21041)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-10-24 15:44:48 +00:00
Micha Reiser
f36fa7d6c1 [ty] Fix missing newline before first diagnostic (#21058) 2025-10-24 17:35:23 +02:00
William Woodruff
6f0982d2d6 chore: bump zizmor (#21064) 2025-10-24 10:58:23 -04:00
Dan Parizher
3e8685d2ec [pyflakes] Fix false positive for __class__ in lambda expressions within class definitions (F821) (#20564)
## Summary

Fixes #20562
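An illustrative reproducer (hypothetical): CPython makes the implicit `__class__` closure cell available to any function defined in a class body, lambdas included, so F821 should not flag the name here.

```py
class A:
    # The compiler creates the `__class__` cell for this lambda because it
    # is defined directly in the class body and references the name.
    get_class = lambda self: __class__
```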

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-10-24 10:07:19 -04:00
Dan Parizher
7576669297 [flake8-pyi] Fix PYI034 to not trigger on metaclasses (PYI034) (#20881)
## Summary

Fixes #20781

---------

Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-10-24 13:40:26 +00:00
Alex Waygood
bf74c824eb [ty] Delegate truthiness inference of an enum Literal type to its enum-instance supertype (#21060) 2025-10-24 14:34:16 +01:00
Alex Waygood
e196c2ab37 [ty] Consider __len__ when determining the truthiness of an instance of a tuple class or a @final class (#21049) 2025-10-24 09:29:55 +00:00
Micha Reiser
4522f35ea7 [ty] Add comment explaining why HasTrackedScope is implemented for Identifier and why it works (#21057) 2025-10-24 09:48:57 +02:00
Micha Reiser
be5a62f7e5 [ty] Timeout based workspace diagnostic progress reports (#21019) 2025-10-24 09:06:19 +02:00
wangxiaolei
28aed61a22 [pylint] Implement stop-iteration-return (PLR1708) (#20733)
## Summary

Implement the Pylint rule stop-iteration-return (R1708).
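The pattern the rule targets, sketched (hypothetical example): since PEP 479, a `StopIteration` raised inside a generator is converted into a `RuntimeError`, so a plain `return` should be used instead.

```py
def first_n(it, n):
    count = 0
    for item in it:
        if count == n:
            # PLR1708: under PEP 479 this surfaces as RuntimeError at
            # runtime; a bare `return` is the correct way to stop.
            raise StopIteration
        yield item
        count += 1
```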

## Test Plan


---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-10-23 15:02:41 -07:00
237 changed files with 8845 additions and 3282 deletions

View File

@@ -438,7 +438,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo-binstall"
- uses: cargo-bins/cargo-binstall@a66119fbb1c952daba62640c2609111fe0803621 # v1.15.7
+ uses: cargo-bins/cargo-binstall@afcf9780305558bcc9e4bc94b7589ab2bb8b6106 # v1.15.9
- name: "Install cargo-fuzz"
# Download the latest version from quick install and not the github releases because github releases only has MUSL targets.
run: cargo binstall cargo-fuzz --force --disable-strategies crate-meta-data --no-confirm
@@ -531,8 +531,7 @@ jobs:
persist-credentials: false
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
with:
- # TODO: figure out why `ruff-ecosystem` crashes on Python 3.14
- python-version: "3.13"
+ python-version: ${{ env.PYTHON_VERSION }}
activate-environment: true
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
@@ -699,7 +698,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- - uses: cargo-bins/cargo-binstall@a66119fbb1c952daba62640c2609111fe0803621 # v1.15.7
+ - uses: cargo-bins/cargo-binstall@afcf9780305558bcc9e4bc94b7589ab2bb8b6106 # v1.15.9
- run: cargo binstall --no-confirm cargo-shear
- run: cargo shear

View File

@@ -70,7 +70,7 @@ jobs:
shell: bash
run: "curl --proto '=https' --tlsv1.2 -LsSf https://github.com/axodotdev/cargo-dist/releases/download/v0.30.0/cargo-dist-installer.sh | sh"
- name: Cache dist
- uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
+ uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: cargo-dist-cache
path: ~/.cargo/bin/dist
@@ -86,7 +86,7 @@ jobs:
cat plan-dist-manifest.json
echo "manifest=$(jq -c "." plan-dist-manifest.json)" >> "$GITHUB_OUTPUT"
- name: "Upload dist-manifest.json"
- uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
+ uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: artifacts-plan-dist-manifest
path: plan-dist-manifest.json
@@ -128,14 +128,14 @@ jobs:
persist-credentials: false
submodules: recursive
- name: Install cached dist
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
name: cargo-dist-cache
path: ~/.cargo/bin/
- run: chmod +x ~/.cargo/bin/dist
# Get all the local artifacts for the global tasks to use (for e.g. checksums)
- name: Fetch local artifacts
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
pattern: artifacts-*
path: target/distrib/
@@ -153,7 +153,7 @@ jobs:
cp dist-manifest.json "$BUILD_MANIFEST_NAME"
- name: "Upload artifacts"
uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: artifacts-build-global
path: |
@@ -179,14 +179,14 @@ jobs:
persist-credentials: false
submodules: recursive
- name: Install cached dist
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
name: cargo-dist-cache
path: ~/.cargo/bin/
- run: chmod +x ~/.cargo/bin/dist
# Fetch artifacts from scratch-storage
- name: Fetch artifacts
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
pattern: artifacts-*
path: target/distrib/
@@ -200,7 +200,7 @@ jobs:
cat dist-manifest.json
echo "manifest=$(jq -c "." dist-manifest.json)" >> "$GITHUB_OUTPUT"
- name: "Upload dist-manifest.json"
uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
# Overwrite the previous copy
name: artifacts-dist-manifest
@@ -256,7 +256,7 @@ jobs:
submodules: recursive
# Create a GitHub Release while uploading all files to it
- name: "Download GitHub Artifacts"
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
pattern: artifacts-*
path: artifacts

2
.github/zizmor.yml vendored
View File

@@ -1,5 +1,5 @@
# Configuration for the zizmor static analysis tool, run via pre-commit in CI
# https://woodruffw.github.io/zizmor/configuration/
# https://docs.zizmor.sh/configuration/
#
# TODO: can we remove the ignores here so that our workflows are more secure?
rules:

View File

@@ -102,7 +102,7 @@ repos:
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/zizmorcore/zizmor-pre-commit
rev: v1.15.2
rev: v1.16.0
hooks:
- id: zizmor

33
Cargo.lock generated
View File

@@ -295,9 +295,9 @@ checksum = "36f64beae40a84da1b4b26ff2761a5b895c12adc41dc25aaee1c4f2bbfe97a6e"
[[package]]
name = "bstr"
version = "1.12.0"
version = "1.12.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "234113d19d0d7d613b40e86fb654acf958910802bcceab913a4f9e7cda03b1a4"
checksum = "63044e1ae8e69f3b5a92c736ca6269b8d12fa7efe39bf34ddb06d102cf0e2cab"
dependencies = [
"memchr",
"regex-automata",
@@ -433,9 +433,9 @@ dependencies = [
[[package]]
name = "clap"
version = "4.5.49"
version = "4.5.50"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f4512b90fa68d3a9932cea5184017c5d200f5921df706d45e853537dea51508f"
checksum = "0c2cfd7bf8a6017ddaa4e32ffe7403d547790db06bd171c1c53926faab501623"
dependencies = [
"clap_builder",
"clap_derive",
@@ -443,9 +443,9 @@ dependencies = [
[[package]]
name = "clap_builder"
version = "4.5.49"
version = "4.5.50"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0025e98baa12e766c67ba13ff4695a887a1eba19569aad00a472546795bd6730"
checksum = "0a4c05b9e80c5ccd3a7ef080ad7b6ba7d6fc00a985b8b157197075677c82c7a0"
dependencies = [
"anstream",
"anstyle",
@@ -1007,7 +1007,7 @@ dependencies = [
"libc",
"option-ext",
"redox_users",
"windows-sys 0.59.0",
"windows-sys 0.60.2",
]
[[package]]
@@ -1224,9 +1224,9 @@ dependencies = [
[[package]]
name = "get-size-derive2"
version = "0.7.0"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e3814abc7da8ab18d2fd820f5b540b5e39b6af0a32de1bdd7c47576693074843"
checksum = "46b134aa084df7c3a513a1035c52f623e4b3065dfaf3d905a4f28a2e79b5bb3f"
dependencies = [
"attribute-derive",
"quote",
@@ -1235,9 +1235,9 @@ dependencies = [
[[package]]
name = "get-size2"
version = "0.7.0"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5dfe2cec5b5ce8fb94dcdb16a1708baa4d0609cc3ce305ca5d3f6f2ffb59baed"
checksum = "c0d51c9f2e956a517619ad9e7eaebc7a573f9c49b38152e12eade750f89156f9"
dependencies = [
"compact_str",
"get-size-derive2",
@@ -1523,9 +1523,9 @@ dependencies = [
[[package]]
name = "ignore"
version = "0.4.23"
version = "0.4.24"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6d89fd380afde86567dfba715db065673989d6253f42b88179abd3eae47bda4b"
checksum = "81776e6f9464432afcc28d03e52eb101c93b6f0566f52aef2427663e700f0403"
dependencies = [
"crossbeam-deque",
"globset",
@@ -3563,7 +3563,7 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.24.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=d38145c29574758de7ffbe8a13cd4584c3b09161#d38145c29574758de7ffbe8a13cd4584c3b09161"
source = "git+https://github.com/salsa-rs/salsa.git?rev=cdd0b85516a52c18b8a6d17a2279a96ed6c3e198#cdd0b85516a52c18b8a6d17a2279a96ed6c3e198"
dependencies = [
"boxcar",
"compact_str",
@@ -3587,12 +3587,12 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.24.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=d38145c29574758de7ffbe8a13cd4584c3b09161#d38145c29574758de7ffbe8a13cd4584c3b09161"
source = "git+https://github.com/salsa-rs/salsa.git?rev=cdd0b85516a52c18b8a6d17a2279a96ed6c3e198#cdd0b85516a52c18b8a6d17a2279a96ed6c3e198"
[[package]]
name = "salsa-macros"
version = "0.24.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=d38145c29574758de7ffbe8a13cd4584c3b09161#d38145c29574758de7ffbe8a13cd4584c3b09161"
source = "git+https://github.com/salsa-rs/salsa.git?rev=cdd0b85516a52c18b8a6d17a2279a96ed6c3e198#cdd0b85516a52c18b8a6d17a2279a96ed6c3e198"
dependencies = [
"proc-macro2",
"quote",
@@ -4521,7 +4521,6 @@ name = "ty_test"
version = "0.0.0"
dependencies = [
"anyhow",
"bitflags 2.9.4",
"camino",
"colored 3.0.0",
"insta",

View File

@@ -146,7 +146,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "d38145c29574758de7ffbe8a13cd4584c3b09161", default-features = false, features = [
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "cdd0b85516a52c18b8a6d17a2279a96ed6c3e198", default-features = false, features = [
"compact_str",
"macros",
"salsa_unstable",

View File

@@ -9,9 +9,7 @@ use itertools::{Itertools, iterate};
use ruff_linter::linter::FixTable;
use serde::Serialize;
use ruff_db::diagnostic::{
Diagnostic, DiagnosticFormat, DisplayDiagnosticConfig, DisplayDiagnostics, SecondaryCode,
};
use ruff_db::diagnostic::{Diagnostic, DisplayDiagnosticConfig, SecondaryCode};
use ruff_linter::fs::relativize_path;
use ruff_linter::logging::LogLevel;
use ruff_linter::message::{EmitterContext, render_diagnostics};
@@ -390,21 +388,18 @@ impl Printer {
let context = EmitterContext::new(&diagnostics.notebook_indexes);
let format = if preview {
DiagnosticFormat::Full
self.format
} else {
DiagnosticFormat::Concise
OutputFormat::Concise
};
let config = DisplayDiagnosticConfig::default()
.preview(preview)
.hide_severity(true)
.color(!cfg!(test) && colored::control::SHOULD_COLORIZE.should_colorize())
.with_show_fix_status(show_fix_status(self.fix_mode, fixables.as_ref()))
.format(format)
.with_fix_applicability(self.unsafe_fixes.required_applicability());
write!(
writer,
"{}",
DisplayDiagnostics::new(&context, &config, &diagnostics.inner)
)?;
.with_fix_applicability(self.unsafe_fixes.required_applicability())
.show_fix_diff(preview);
render_diagnostics(writer, format, config, &context, &diagnostics.inner)?;
}
writer.flush()?;

View File

@@ -226,7 +226,7 @@ static STATIC_FRAME: Benchmark = Benchmark::new(
max_dep_date: "2025-08-09",
python_version: PythonVersion::PY311,
},
750,
800,
);
#[track_caller]

View File

@@ -200,7 +200,12 @@ impl System for OsSystem {
/// The walker ignores files according to [`ignore::WalkBuilder::standard_filters`]
/// when setting [`WalkDirectoryBuilder::standard_filters`] to true.
fn walk_directory(&self, path: &SystemPath) -> WalkDirectoryBuilder {
WalkDirectoryBuilder::new(path, OsDirectoryWalker {})
WalkDirectoryBuilder::new(
path,
OsDirectoryWalker {
cwd: self.current_directory().to_path_buf(),
},
)
}
fn glob(
@@ -454,7 +459,9 @@ struct ListedDirectory {
}
#[derive(Debug)]
struct OsDirectoryWalker;
struct OsDirectoryWalker {
cwd: SystemPathBuf,
}
impl DirectoryWalker for OsDirectoryWalker {
fn walk(
@@ -473,6 +480,7 @@ impl DirectoryWalker for OsDirectoryWalker {
};
let mut builder = ignore::WalkBuilder::new(first.as_std_path());
builder.current_dir(self.cwd.as_std_path());
builder.standard_filters(standard_filters);
builder.hidden(hidden);

View File

@@ -10,6 +10,7 @@ from airflow.datasets import (
)
from airflow.datasets.manager import DatasetManager
from airflow.lineage.hook import DatasetLineageInfo, HookLineageCollector
from airflow.models.dag import DAG
from airflow.providers.amazon.aws.auth_manager.aws_auth_manager import AwsAuthManager
from airflow.providers.apache.beam.hooks import BeamHook, NotAir302HookError
from airflow.providers.google.cloud.secrets.secret_manager import (
@@ -20,6 +21,7 @@ from airflow.providers_manager import ProvidersManager
from airflow.secrets.base_secrets import BaseSecretsBackend
from airflow.secrets.local_filesystem import LocalFilesystemBackend
# airflow.Dataset
dataset_from_root = DatasetFromRoot()
dataset_from_root.iter_datasets()
@@ -56,6 +58,10 @@ hlc.add_input_dataset()
hlc.add_output_dataset()
hlc.collected_datasets()
# airflow.models.dag.DAG
test_dag = DAG(dag_id="test_dag")
test_dag.create_dagrun()
# airflow.providers.amazon.auth_manager.aws_auth_manager
aam = AwsAuthManager()
aam.is_authorized_dataset()
@@ -96,3 +102,15 @@ base_secret_backend.get_connections()
# airflow.secrets.local_filesystem
lfb = LocalFilesystemBackend()
lfb.get_connections()
from airflow.models import DAG
# airflow.DAG
test_dag = DAG(dag_id="test_dag")
test_dag.create_dagrun()
from airflow import DAG
# airflow.DAG
test_dag = DAG(dag_id="test_dag")
test_dag.create_dagrun()

View File

@@ -46,3 +46,9 @@ class CorrectModel(models.Model):
max_length=255, null=True, blank=True, unique=True
)
urlfieldu = models.URLField(max_length=255, null=True, blank=True, unique=True)
class IncorrectModelWithSimpleAnnotations(models.Model):
charfield: models.CharField = models.CharField(max_length=255, null=True)
textfield: models.TextField = models.TextField(max_length=255, null=True)
slugfield: models.SlugField = models.SlugField(max_length=255, null=True)

View File

@@ -0,0 +1,7 @@
# Regression test for https://github.com/astral-sh/ruff/issues/21023
'' '
"" ""
'' '' '
"" "" "
f"" f"
f"" f"" f"

View File

@@ -359,3 +359,29 @@ class Generic5(list[PotentialTypeVar]):
def __new__(cls: type[Generic5]) -> Generic5: ...
def __enter__(self: Generic5) -> Generic5: ...
# Test cases based on issue #20781 - metaclasses that trigger IsMetaclass::Maybe
class MetaclassInWhichSelfCannotBeUsed5(type(Protocol)):
def __new__(
cls, name: str, bases: tuple[type[Any], ...], attrs: dict[str, Any], **kwargs: Any
) -> MetaclassInWhichSelfCannotBeUsed5:
new_class = super().__new__(cls, name, bases, attrs, **kwargs)
return new_class
import django.db.models.base
class MetaclassInWhichSelfCannotBeUsed6(django.db.models.base.ModelBase):
def __new__(cls, name: str, bases: tuple[Any, ...], attrs: dict[str, Any], **kwargs: Any) -> MetaclassInWhichSelfCannotBeUsed6:
...
class MetaclassInWhichSelfCannotBeUsed7(django.db.models.base.ModelBase):
def __new__(cls, /, name: str, bases: tuple[object, ...], attrs: dict[str, object], **kwds: object) -> MetaclassInWhichSelfCannotBeUsed7:
...
class MetaclassInWhichSelfCannotBeUsed8(django.db.models.base.ModelBase):
def __new__(cls, name: builtins.str, bases: tuple, attributes: dict, /, **kw) -> MetaclassInWhichSelfCannotBeUsed8:
...

View File

@@ -252,3 +252,28 @@ from some_module import PotentialTypeVar
class Generic5(list[PotentialTypeVar]):
def __new__(cls: type[Generic5]) -> Generic5: ...
def __enter__(self: Generic5) -> Generic5: ...
# Test case based on issue #20781 - metaclass that triggers IsMetaclass::Maybe
class MetaclassInWhichSelfCannotBeUsed5(type(Protocol)):
def __new__(
cls, name: str, bases: tuple[type[Any], ...], attrs: dict[str, Any], **kwargs: Any
) -> MetaclassInWhichSelfCannotBeUsed5: ...
import django.db.models.base
class MetaclassInWhichSelfCannotBeUsed6(django.db.models.base.ModelBase):
def __new__(cls, name: str, bases: tuple[Any, ...], attrs: dict[str, Any], **kwargs: Any) -> MetaclassInWhichSelfCannotBeUsed6:
...
class MetaclassInWhichSelfCannotBeUsed7(django.db.models.base.ModelBase):
def __new__(cls, /, name: str, bases: tuple[object, ...], attrs: dict[str, object], **kwds: object) -> MetaclassInWhichSelfCannotBeUsed7:
...
class MetaclassInWhichSelfCannotBeUsed8(django.db.models.base.ModelBase):
def __new__(cls, name: builtins.str, bases: tuple, attributes: dict, /, **kw) -> MetaclassInWhichSelfCannotBeUsed8:
...

View File

@@ -14,3 +14,14 @@ def f():
import os
print(os)
# regression test for https://github.com/astral-sh/ruff/issues/21121
from dataclasses import KW_ONLY, dataclass
@dataclass
class DataClass:
a: int
_: KW_ONLY # should be an exception to TC003, even with future-annotations
b: int

View File

@@ -370,3 +370,22 @@ class Foo:
The flag converter instance with all flags parsed.
"""
return
# OK
def baz(x: int) -> int:
"""
Show a `Warnings` DOC102 false positive.
Parameters
----------
x : int
Warnings
--------
This function demonstrates a DOC102 false positive
Returns
-------
int
"""
return x

View File

@@ -81,3 +81,55 @@ def calculate_speed(distance: float, time: float) -> float:
except TypeError:
print("Not a number? Shame on you!")
raise
# This should NOT trigger DOC502 because OSError is explicitly re-raised
def f():
"""Do nothing.
Raises:
OSError: If the OS errors.
"""
try:
pass
except OSError as e:
raise e
# This should NOT trigger DOC502 because OSError is explicitly re-raised with from None
def g():
"""Do nothing.
Raises:
OSError: If the OS errors.
"""
try:
pass
except OSError as e:
raise e from None
# This should NOT trigger DOC502 because ValueError is explicitly re-raised from tuple exception
def h():
"""Do nothing.
Raises:
ValueError: If something goes wrong.
"""
try:
pass
except (ValueError, TypeError) as e:
raise e
# This should NOT trigger DOC502 because TypeError is explicitly re-raised from tuple exception
def i():
"""Do nothing.
Raises:
TypeError: If something goes wrong.
"""
try:
pass
except (ValueError, TypeError) as e:
raise e

View File

@@ -0,0 +1,21 @@
class C:
f = lambda self: __class__
print(C().f().__name__)
# Test: nested lambda
class D:
g = lambda self: (lambda: __class__)
print(D().g()().__name__)
# Test: lambda outside class (should still fail)
h = lambda: __class__
# Test: lambda referencing module-level variable (should not be flagged as F821)
import uuid
class E:
uuid = lambda: str(uuid.uuid4())
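The first two cases above work at runtime because CPython gives any function defined in a class body (lambdas included) an implicit `__class__` closure cell (PEP 3135), which is why flagging the name as undefined would be a false positive. A small sketch of the behavior the checker change accounts for:

```python
class C:
    f = lambda self: __class__  # resolves via the implicit __class__ cell

assert C().f() is C

# Outside a class body there is no cell, so the name lookup fails at call time.
g = lambda: __class__
try:
    g()
    raised = False
except NameError:
    raised = True
assert raised
```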

View File

@@ -0,0 +1,131 @@
"""Test cases for PLR1708 stop-iteration-return."""
# Valid cases - should not trigger the rule
def normal_function():
raise StopIteration # Not a generator, should not trigger
def normal_function_with_value():
raise StopIteration("value") # Not a generator, should not trigger
def generator_with_return():
yield 1
yield 2
return "finished" # This is the correct way
def generator_with_yield_from():
yield from [1, 2, 3]
def generator_without_stop_iteration():
yield 1
yield 2
# No explicit termination
def generator_with_other_exception():
yield 1
raise ValueError("something else") # Different exception
# Invalid cases - should trigger the rule
def generator_with_stop_iteration():
yield 1
yield 2
raise StopIteration # Should trigger
def generator_with_stop_iteration_value():
yield 1
yield 2
raise StopIteration("finished") # Should trigger
def generator_with_stop_iteration_expr():
yield 1
yield 2
raise StopIteration(1 + 2) # Should trigger
def async_generator_with_stop_iteration():
yield 1
yield 2
raise StopIteration("async") # Should trigger
def nested_generator():
def inner_gen():
yield 1
raise StopIteration("inner") # Should trigger
yield from inner_gen()
def generator_in_class():
class MyClass:
def generator_method(self):
yield 1
raise StopIteration("method") # Should trigger
return MyClass
# Complex cases
def complex_generator():
try:
yield 1
yield 2
raise StopIteration("complex") # Should trigger
except ValueError:
yield 3
finally:
pass
def generator_with_conditional_stop_iteration(condition):
yield 1
if condition:
raise StopIteration("conditional") # Should trigger
yield 2
# Edge cases
def generator_with_bare_stop_iteration():
yield 1
raise StopIteration # Should trigger (no arguments)
def generator_with_stop_iteration_in_loop():
for i in range(5):
yield i
if i == 3:
raise StopIteration("loop") # Should trigger
# Should not trigger - different exceptions
def generator_with_runtime_error():
yield 1
raise RuntimeError("not StopIteration") # Should not trigger
def generator_with_custom_exception():
yield 1
raise CustomException("custom") # Should not trigger
class CustomException(Exception):
pass
# Generator comprehensions should not be affected
list_comp = [x for x in range(10)] # Should not trigger
# Lambda in generator context
def generator_with_lambda():
yield 1
func = lambda x: x # Just a regular lambda
yield 2
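The fixture above exercises the core of PLR1708: since PEP 479 (Python 3.7), a `StopIteration` raised inside a generator body surfaces as a `RuntimeError` rather than silently ending iteration, so `return` is the only correct way to finish early. A minimal sketch of the runtime behavior:

```python
def bad_gen():
    yield 1
    raise StopIteration  # PEP 479: re-raised as RuntimeError by the interpreter

def good_gen():
    yield 1
    return  # the correct way to terminate a generator early

try:
    list(bad_gen())
    outcome = "no error"
except RuntimeError:
    outcome = "RuntimeError"

assert outcome == "RuntimeError"
assert list(good_gen()) == [1]
```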

View File

@@ -0,0 +1,17 @@
"""
Regression test for an ecosystem hit on
https://github.com/astral-sh/ruff/pull/21125.
We should mark all of the components of special dataclass annotations as
runtime-required, not just the first layer.
"""
from dataclasses import dataclass
from typing import ClassVar, Optional
@dataclass(frozen=True)
class EmptyCell:
_singleton: ClassVar[Optional["EmptyCell"]] = None
# the behavior of _singleton above should match a non-ClassVar
_doubleton: "EmptyCell"

View File

@@ -0,0 +1,13 @@
"""This is placed in a separate fixture as `TypeVar` needs to be imported
from `typing_extensions` to support default arguments in Python versions < 3.13.
We verify that UP046 doesn't apply in this case.
"""
from typing import Generic
from typing_extensions import TypeVar
T = TypeVar("T", default=str)
class DefaultTypeVar(Generic[T]):
var: T

View File

@@ -0,0 +1,12 @@
"""This is placed in a separate fixture as `TypeVar` needs to be imported
from `typing_extensions` to support default arguments in Python versions < 3.13.
We verify that UP047 doesn't apply in this case.
"""
from typing_extensions import TypeVar
T = TypeVar("T", default=int)
def default_var(var: T) -> T:
return var

View File

@@ -69,3 +69,19 @@ Decimal(float("\N{space}\N{hyPHen-MINus}nan"))
Decimal(float("\x20\N{character tabulation}\N{hyphen-minus}nan"))
Decimal(float(" -" "nan"))
Decimal(float("-nAn"))
# Test cases for digit separators (safe fixes)
# https://github.com/astral-sh/ruff/issues/20572
Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
# Test cases for non-thousands separators
Decimal("12_34_56_78") # Safe fix: preserves non-thousands separators
Decimal("1234_5678") # Safe fix: preserves non-thousands separators
# Separators _and_ leading zeros
Decimal("0001_2345")
Decimal("000_1_2345")
Decimal("000_000")
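These fixtures rely on the fact that, since Python 3.6, both numeric literals and `Decimal` string arguments accept underscore grouping, so dropping the quotes is value-preserving for these inputs. A quick check of the equivalence the safe fix assumes:

```python
from decimal import Decimal

# The string form and the numeric-literal form denote the same value.
assert Decimal("15_000_000") == Decimal(15_000_000) == Decimal(15000000)
assert Decimal("-5_000") == Decimal(-5_000)
assert Decimal("+9_999") == Decimal(9_999)
```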

View File

@@ -43,3 +43,29 @@ logging.warning("Value: %r", repr(42))
logging.error("Error: %r", repr([1, 2, 3]))
logging.info("Debug info: %s", repr("test\nstring"))
logging.warning("Value: %s", repr(42))
# %s + ascii()
logging.info("ASCII: %s", ascii("Hello\nWorld"))
logging.warning("ASCII: %s", ascii("test"))
# %s + oct()
logging.info("Octal: %s", oct(42))
logging.warning("Octal: %s", oct(255))
# %s + hex()
logging.info("Hex: %s", hex(42))
logging.warning("Hex: %s", hex(255))
# Test with imported functions
from logging import info, log
info("ASCII: %s", ascii("Hello\nWorld"))
log(logging.INFO, "ASCII: %s", ascii("test"))
info("Octal: %s", oct(42))
log(logging.INFO, "Octal: %s", oct(255))
info("Hex: %s", hex(42))
log(logging.INFO, "Hex: %s", hex(255))
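Each of these eager conversions has a lazy `%`-style equivalent, so the call can be deferred until the log record is actually emitted: `%r` applies `repr()`, `%a` applies `ascii()`, and `%#o`/`%#x` render integers the way `oct()`/`hex()` do. A quick check of the equivalences the rule relies on:

```python
assert "%r" % ("test\nstring",) == repr("test\nstring")
assert "%a" % ("Hello\nWorld",) == ascii("Hello\nWorld")
assert "%#o" % 255 == oct(255)
assert "%#x" % 255 == hex(255)
```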

View File

@@ -0,0 +1,12 @@
async def f(): return [[x async for x in foo(n)] for n in range(3)]
async def test(): return [[x async for x in elements(n)] async for n in range(3)]
async def f(): [x for x in foo()] and [x async for x in foo()]
async def f():
def g(): ...
[x async for x in foo()]
[x async for x in y]

View File

@@ -0,0 +1,3 @@
match x:
case Point(x=1, x=2):
pass

View File

@@ -0,0 +1,3 @@
match x:
case {'key': 1, 'key': 2}:
pass
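PEP 634 makes a mapping pattern with duplicate literal keys a compile-time error, so CPython agrees with the `DuplicateMatchKey` diagnostic here (on versions before 3.10 the `match` syntax itself fails to parse, so a `SyntaxError` results either way):

```python
src = """
match x:
    case {'key': 1, 'key': 2}:
        pass
"""
try:
    compile(src, "<fixture>", "exec")
    raised = False
except SyntaxError:
    raised = True

assert raised
```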

View File

@@ -0,0 +1 @@
class C[T, T]: pass
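On Python 3.12+ a repeated PEP 695 type parameter is rejected at compile time, matching the `DuplicateTypeParameter` diagnostic; on older versions the bracketed syntax itself fails to parse, so the result is a `SyntaxError` either way:

```python
try:
    compile("class C[T, T]: pass", "<fixture>", "exec")
    raised = False
except SyntaxError:
    raised = True

assert raised
```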

View File

@@ -0,0 +1,29 @@
def f(a):
global a
def g(a):
if True:
global a
def h(a):
def inner():
global a
def i(a):
try:
global a
except Exception:
pass
def f(a):
a = 1
global a
def f(a):
a = 1
a = 2
global a
def f(a):
class Inner:
global a # ok

View File

@@ -0,0 +1,8 @@
type X[T: (yield 1)] = int
type Y = (yield 1)
def f[T](x: int) -> (y := 3): return x
class C[T]((yield from [object])):
pass

View File

@@ -0,0 +1,8 @@
def func():
return *x
for *x in range(10):
pass
def func():
yield *x

View File

@@ -0,0 +1,11 @@
match value:
case _:
pass
case 1:
pass
match value:
case irrefutable:
pass
case 1:
pass

View File

@@ -0,0 +1,5 @@
match x:
case [a, a]:
pass
case _:
pass

View File

@@ -0,0 +1 @@
[x:= 2 for x in range(2)]
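CPython rejects this pattern at compile time ("assignment expression cannot rebind comprehension iteration variable"), which is the behavior the `ReboundComprehensionVariable` diagnostic mirrors:

```python
try:
    compile("[x := 2 for x in range(2)]", "<fixture>", "eval")
    raised = False
except SyntaxError as err:
    raised = True
    message = str(err)

assert raised
assert "rebind" in message
```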

View File

@@ -0,0 +1 @@
*a = [1, 2, 3, 4]
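A lone starred target is likewise rejected by CPython at compile time ("starred assignment target must be in a list or tuple"); placing it inside a tuple target makes the statement legal:

```python
try:
    compile("*a = [1, 2, 3, 4]", "<fixture>", "exec")
    raised = False
except SyntaxError:
    raised = True
assert raised

# The legal spelling: the starred name sits inside a one-element tuple target.
*a, = [1, 2, 3, 4]
assert a == [1, 2, 3, 4]
```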

View File

@@ -0,0 +1,7 @@
__debug__ = False
def process(__debug__):
pass
class Generic[__debug__]:
pass

View File

@@ -951,6 +951,9 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if checker.is_rule_enabled(Rule::MisplacedBareRaise) {
pylint::rules::misplaced_bare_raise(checker, raise);
}
if checker.is_rule_enabled(Rule::StopIterationReturn) {
pylint::rules::stop_iteration_return(checker, raise);
}
}
Stmt::AugAssign(aug_assign @ ast::StmtAugAssign { target, .. }) => {
if checker.is_rule_enabled(Rule::GlobalStatement) {

View File

@@ -1400,6 +1400,14 @@ impl<'a> Visitor<'a> for Checker<'a> {
AnnotationContext::RuntimeRequired => {
self.visit_runtime_required_annotation(annotation);
}
AnnotationContext::RuntimeEvaluated
if flake8_type_checking::helpers::is_dataclass_meta_annotation(
annotation,
self.semantic(),
) =>
{
self.visit_runtime_required_annotation(annotation);
}
AnnotationContext::RuntimeEvaluated => {
self.visit_runtime_evaluated_annotation(annotation);
}
@@ -2116,7 +2124,7 @@ impl<'a> Visitor<'a> for Checker<'a> {
| Expr::DictComp(_)
| Expr::SetComp(_) => {
self.analyze.scopes.push(self.semantic.scope_id);
self.semantic.pop_scope();
self.semantic.pop_scope(); // Lambda/Generator/Comprehension scope
}
_ => {}
}
@@ -3041,7 +3049,35 @@ impl<'a> Checker<'a> {
if let Some(parameters) = parameters {
self.visit_parameters(parameters);
}
// Here we add the implicit scope surrounding a lambda which allows code in the
// lambda to access `__class__` at runtime when the lambda is defined within a class.
// See the `ScopeKind::DunderClassCell` docs for more information.
let added_dunder_class_scope = if self
.semantic
.current_scopes()
.any(|scope| scope.kind.is_class())
{
self.semantic.push_scope(ScopeKind::DunderClassCell);
let binding_id = self.semantic.push_binding(
TextRange::default(),
BindingKind::DunderClassCell,
BindingFlags::empty(),
);
self.semantic
.current_scope_mut()
.add("__class__", binding_id);
true
} else {
false
};
self.visit_expr(body);
// Pop the DunderClassCell scope if it was added
if added_dunder_class_scope {
self.semantic.pop_scope();
}
}
}
self.semantic.restore(snapshot);

View File

@@ -286,6 +286,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Pylint, "R1702") => rules::pylint::rules::TooManyNestedBlocks,
(Pylint, "R1704") => rules::pylint::rules::RedefinedArgumentFromLocal,
(Pylint, "R1706") => rules::pylint::rules::AndOrTernary,
(Pylint, "R1708") => rules::pylint::rules::StopIterationReturn,
(Pylint, "R1711") => rules::pylint::rules::UselessReturn,
(Pylint, "R1714") => rules::pylint::rules::RepeatedEqualityComparison,
(Pylint, "R1722") => rules::pylint::rules::SysExitAlias,

View File

@@ -11,6 +11,8 @@ pub(crate) static GOOGLE_SECTIONS: &[SectionKind] = &[
SectionKind::References,
SectionKind::Returns,
SectionKind::SeeAlso,
SectionKind::Warnings,
SectionKind::Warns,
SectionKind::Yields,
// Google-only
SectionKind::Args,
@@ -32,7 +34,5 @@ pub(crate) static GOOGLE_SECTIONS: &[SectionKind] = &[
SectionKind::Tip,
SectionKind::Todo,
SectionKind::Warning,
SectionKind::Warnings,
SectionKind::Warns,
SectionKind::Yield,
];

View File

@@ -11,11 +11,14 @@ pub(crate) static NUMPY_SECTIONS: &[SectionKind] = &[
SectionKind::References,
SectionKind::Returns,
SectionKind::SeeAlso,
SectionKind::Warnings,
SectionKind::Warns,
SectionKind::Yields,
// NumPy-only
SectionKind::ExtendedSummary,
SectionKind::OtherParams,
SectionKind::OtherParameters,
SectionKind::Parameters,
SectionKind::Receives,
SectionKind::ShortSummary,
];

View File

@@ -36,6 +36,7 @@ pub(crate) enum SectionKind {
OtherParameters,
Parameters,
Raises,
Receives,
References,
Return,
Returns,
@@ -76,6 +77,7 @@ impl SectionKind {
"other parameters" => Some(Self::OtherParameters),
"parameters" => Some(Self::Parameters),
"raises" => Some(Self::Raises),
"receives" => Some(Self::Receives),
"references" => Some(Self::References),
"return" => Some(Self::Return),
"returns" => Some(Self::Returns),
@@ -117,6 +119,7 @@ impl SectionKind {
Self::OtherParameters => "Other Parameters",
Self::Parameters => "Parameters",
Self::Raises => "Raises",
Self::Receives => "Receives",
Self::References => "References",
Self::Return => "Return",
Self::Returns => "Returns",

View File

@@ -14,7 +14,7 @@ use ruff_text_size::TextSize;
/// The length of a line of text that is considered too long.
///
/// The allowed range of values is 1..=320
#[derive(Clone, Copy, Debug, Eq, PartialEq, serde::Serialize, serde::Deserialize)]
#[derive(Clone, Copy, Debug, Eq, PartialEq, serde::Serialize)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct LineLength(
#[cfg_attr(feature = "schemars", schemars(range(min = 1, max = 320)))] NonZeroU16,
@@ -46,6 +46,21 @@ impl fmt::Display for LineLength {
}
}
impl<'de> serde::Deserialize<'de> for LineLength {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
let value = u16::deserialize(deserializer)?;
Self::try_from(value).map_err(|_| {
serde::de::Error::custom(format!(
"line-length must be between 1 and {} (got {value})",
Self::MAX,
))
})
}
}
impl CacheKey for LineLength {
fn cache_key(&self, state: &mut CacheKeyHasher) {
state.write_u16(self.0.get());

View File

@@ -919,17 +919,6 @@ mod tests {
Ok(())
}
/// Wrapper around `test_contents_syntax_errors` for testing a snippet of code instead of a
/// file.
fn test_snippet_syntax_errors(contents: &str, settings: &LinterSettings) -> Vec<Diagnostic> {
let contents = dedent(contents);
test_contents_syntax_errors(
&SourceKind::Python(contents.to_string()),
Path::new("<filename>"),
settings,
)
}
/// A custom test runner that prints syntax errors in addition to other diagnostics. Adapted
/// from `flakes` in pyflakes/mod.rs.
fn test_contents_syntax_errors(
@@ -972,245 +961,38 @@ mod tests {
}
#[test_case(
"async_in_sync_error_on_310",
"async def f(): return [[x async for x in foo(n)] for n in range(3)]",
PythonVersion::PY310,
"AsyncComprehensionOutsideAsyncFunction"
Path::new("async_comprehension_outside_async_function.py"),
PythonVersion::PY311
)]
#[test_case(
"async_in_sync_okay_on_311",
"async def f(): return [[x async for x in foo(n)] for n in range(3)]",
PythonVersion::PY311,
"AsyncComprehensionOutsideAsyncFunction"
Path::new("async_comprehension_outside_async_function.py"),
PythonVersion::PY310
)]
#[test_case(
"async_in_sync_okay_on_310",
"async def test(): return [[x async for x in elements(n)] async for n in range(3)]",
PythonVersion::PY310,
"AsyncComprehensionOutsideAsyncFunction"
)]
#[test_case(
"deferred_function_body",
"
async def f(): [x for x in foo()] and [x async for x in foo()]
async def f():
def g(): ...
[x async for x in foo()]
",
PythonVersion::PY310,
"AsyncComprehensionOutsideAsyncFunction"
)]
#[test_case(
"async_in_sync_false_positive",
"[x async for x in y]",
PythonVersion::PY310,
"AsyncComprehensionOutsideAsyncFunction"
)]
#[test_case(
"rebound_comprehension",
"[x:= 2 for x in range(2)]",
PythonVersion::PY310,
"ReboundComprehensionVariable"
)]
#[test_case(
"duplicate_type_param",
"class C[T, T]: pass",
PythonVersion::PY312,
"DuplicateTypeParameter"
)]
#[test_case(
"multiple_case_assignment",
"
match x:
case [a, a]:
pass
case _:
pass
",
PythonVersion::PY310,
"MultipleCaseAssignment"
)]
#[test_case(
"duplicate_match_key",
"
match x:
case {'key': 1, 'key': 2}:
pass
",
PythonVersion::PY310,
"DuplicateMatchKey"
)]
#[test_case(
"global_parameter",
"
def f(a):
global a
#[test_case(Path::new("rebound_comprehension.py"), PythonVersion::PY310)]
#[test_case(Path::new("duplicate_type_parameter.py"), PythonVersion::PY312)]
#[test_case(Path::new("multiple_case_assignment.py"), PythonVersion::PY310)]
#[test_case(Path::new("duplicate_match_key.py"), PythonVersion::PY310)]
#[test_case(Path::new("duplicate_match_class_attribute.py"), PythonVersion::PY310)]
#[test_case(Path::new("invalid_star_expression.py"), PythonVersion::PY310)]
#[test_case(Path::new("irrefutable_case_pattern.py"), PythonVersion::PY310)]
#[test_case(Path::new("single_starred_assignment.py"), PythonVersion::PY310)]
#[test_case(Path::new("write_to_debug.py"), PythonVersion::PY312)]
#[test_case(Path::new("write_to_debug.py"), PythonVersion::PY310)]
#[test_case(Path::new("invalid_expression.py"), PythonVersion::PY312)]
#[test_case(Path::new("global_parameter.py"), PythonVersion::PY310)]
fn test_semantic_errors(path: &Path, python_version: PythonVersion) -> Result<()> {
let snapshot = format!(
"semantic_syntax_error_{}_{}",
path.to_string_lossy(),
python_version
);
let path = Path::new("resources/test/fixtures/semantic_errors").join(path);
let contents = std::fs::read_to_string(&path)?;
let source_kind = SourceKind::Python(contents);
def g(a):
if True:
global a
def h(a):
def inner():
global a
def i(a):
try:
global a
except Exception:
pass
def f(a):
a = 1
global a
def f(a):
a = 1
a = 2
global a
def f(a):
class Inner:
global a # ok
",
PythonVersion::PY310,
"GlobalParameter"
)]
#[test_case(
"duplicate_match_class_attribute",
"
match x:
case Point(x=1, x=2):
pass
",
PythonVersion::PY310,
"DuplicateMatchClassAttribute"
)]
#[test_case(
"invalid_star_expression",
"
def func():
return *x
",
PythonVersion::PY310,
"InvalidStarExpression"
)]
#[test_case(
"invalid_star_expression_for",
"
for *x in range(10):
pass
",
PythonVersion::PY310,
"InvalidStarExpression"
)]
#[test_case(
"invalid_star_expression_yield",
"
def func():
yield *x
",
PythonVersion::PY310,
"InvalidStarExpression"
)]
#[test_case(
"irrefutable_case_pattern_wildcard",
"
match value:
case _:
pass
case 1:
pass
",
PythonVersion::PY310,
"IrrefutableCasePattern"
)]
#[test_case(
"irrefutable_case_pattern_capture",
"
match value:
case irrefutable:
pass
case 1:
pass
",
PythonVersion::PY310,
"IrrefutableCasePattern"
)]
#[test_case(
"single_starred_assignment",
"*a = [1, 2, 3, 4]",
PythonVersion::PY310,
"SingleStarredAssignment"
)]
#[test_case(
"write_to_debug",
"
__debug__ = False
",
PythonVersion::PY310,
"WriteToDebug"
)]
#[test_case(
"write_to_debug_in_function_param",
"
def process(__debug__):
pass
",
PythonVersion::PY310,
"WriteToDebug"
)]
#[test_case(
"write_to_debug_class_type_param",
"
class Generic[__debug__]:
pass
",
PythonVersion::PY312,
"WriteToDebug"
)]
#[test_case(
"invalid_expression_yield_in_type_param",
"
type X[T: (yield 1)] = int
",
PythonVersion::PY312,
"InvalidExpression"
)]
#[test_case(
"invalid_expression_yield_in_type_alias",
"
type Y = (yield 1)
",
PythonVersion::PY312,
"InvalidExpression"
)]
#[test_case(
"invalid_expression_walrus_in_return_annotation",
"
def f[T](x: int) -> (y := 3): return x
",
PythonVersion::PY312,
"InvalidExpression"
)]
#[test_case(
"invalid_expression_yield_from_in_base_class",
"
class C[T]((yield from [object])):
pass
",
PythonVersion::PY312,
"InvalidExpression"
)]
fn test_semantic_errors(
name: &str,
contents: &str,
python_version: PythonVersion,
error_type: &str,
) {
let snapshot = format!("semantic_syntax_error_{error_type}_{name}_{python_version}");
let diagnostics = test_snippet_syntax_errors(
contents,
let diagnostics = test_contents_syntax_errors(
&source_kind,
&path,
&LinterSettings {
rules: settings::rule_table::RuleTable::empty(),
unresolved_target_version: python_version.into(),
@@ -1218,7 +1000,11 @@ mod tests {
..Default::default()
},
);
assert_diagnostics!(snapshot, diagnostics);
insta::with_settings!({filters => vec![(r"\\", "/")]}, {
assert_diagnostics!(format!("{snapshot}"), diagnostics);
});
Ok(())
}
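Several of the snippets exercised by these test cases are rejected by CPython itself at compile time, which is what the "semantic syntax error" machinery replicates. That can be sanity-checked directly (an illustrative aside, not part of the test suite):

```python
# CPython rejects each of these at compile time, mirroring the
# SingleStarredAssignment, WriteToDebug, and InvalidStarExpression cases above.
snippets = [
    "*a = [1, 2, 3, 4]",            # SingleStarredAssignment
    "__debug__ = False",             # WriteToDebug
    "def func():\n    return *x",    # InvalidStarExpression
]
rejected = []
for src in snippets:
    try:
        compile(src, "<snippet>", "exec")
    except SyntaxError:
        rejected.append(src)
assert len(rejected) == len(snippets)
```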
#[test_case(PythonVersion::PY310)]

View File

@@ -492,6 +492,12 @@ fn check_method(checker: &Checker, call_expr: &ExprCall) {
"collected_datasets" => Replacement::AttrName("collected_assets"),
_ => return,
},
["airflow", "models", "dag", "DAG"] | ["airflow", "models", "DAG"] | ["airflow", "DAG"] => {
match attr.as_str() {
"create_dagrun" => Replacement::None,
_ => return,
}
}
["airflow", "providers_manager", "ProvidersManager"] => match attr.as_str() {
"initialize_providers_dataset_uri_resources" => {
Replacement::AttrName("initialize_providers_asset_uri_resources")

View File

@@ -288,10 +288,12 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
},
// airflow.model..DAG
["airflow", "models", .., "DAG"] => Replacement::SourceModuleMoved {
module: "airflow.sdk",
name: "DAG".to_string(),
},
["airflow", "models", "dag", "DAG"] | ["airflow", "models", "DAG"] | ["airflow", "DAG"] => {
Replacement::SourceModuleMoved {
module: "airflow.sdk",
name: "DAG".to_string(),
}
}
// airflow.sensors.base
[

View File

@@ -1,6 +1,25 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR311 [*] `airflow.DAG` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
--> AIR311_args.py:13:1
|
13 | DAG(dag_id="class_sla_callback", sla_miss_callback=sla_callback)
| ^^^
|
help: Use `DAG` from `airflow.sdk` instead.
2 |
3 | from datetime import timedelta
4 |
- from airflow import DAG, dag
5 + from airflow import dag
6 | from airflow.operators.datetime import BranchDateTimeOperator
7 + from airflow.sdk import DAG
8 |
9 |
10 | def sla_callback(*arg, **kwargs):
note: This is an unsafe fix and may change runtime behavior
AIR311 `sla_miss_callback` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
--> AIR311_args.py:13:34
|

View File

@@ -375,7 +375,7 @@ impl Violation for SuspiciousEvalUsage {
///
///
/// def render_username(username):
/// return django.utils.html.format_html("<i>{}</i>", username) # username is escaped.
/// return format_html("<i>{}</i>", username) # username is escaped.
/// ```
///
/// ## References

View File

@@ -61,9 +61,14 @@ pub(crate) fn nullable_model_string_field(checker: &Checker, body: &[Stmt]) {
}
for statement in body {
let Stmt::Assign(ast::StmtAssign { value, .. }) = statement else {
continue;
let value = match statement {
Stmt::Assign(ast::StmtAssign { value, .. }) => value,
Stmt::AnnAssign(ast::StmtAnnAssign {
value: Some(value), ..
}) => value,
_ => continue,
};
if let Some(field_name) = is_nullable_field(value, checker.semantic()) {
checker.report_diagnostic(
DjangoNullableModelStringField {
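The new match arm covers annotated assignments. The distinction it handles is visible in Python's own AST: an assignment with an annotation parses as `AnnAssign`, not `Assign` (a minimal sketch using a placeholder class name `C`):

```python
import ast

# `charfield: C = C()` is an AnnAssign node; `textfield = C()` is an Assign.
# The old code only matched Assign, so annotated model fields were skipped.
module = ast.parse(
    "charfield: C = C()\n"
    "textfield = C()\n"
)
kinds = [type(stmt).__name__ for stmt in module.body]
assert kinds == ["AnnAssign", "Assign"]
```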

View File

@@ -186,3 +186,32 @@ DJ001 Avoid using `null=True` on string-based fields such as `URLField`
30 | urlfield = models.URLField(max_length=255, null=True)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
DJ001 Avoid using `null=True` on string-based fields such as `CharField`
--> DJ001.py:52:35
|
51 | class IncorrectModelWithSimpleAnnotations(models.Model):
52 | charfield: models.CharField = models.CharField(max_length=255, null=True)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
53 | textfield: models.TextField = models.TextField(max_length=255, null=True)
54 | slugfield: models.SlugField = models.SlugField(max_length=255, null=True)
|
DJ001 Avoid using `null=True` on string-based fields such as `TextField`
--> DJ001.py:53:35
|
51 | class IncorrectModelWithSimpleAnnotations(models.Model):
52 | charfield: models.CharField = models.CharField(max_length=255, null=True)
53 | textfield: models.TextField = models.TextField(max_length=255, null=True)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
54 | slugfield: models.SlugField = models.SlugField(max_length=255, null=True)
|
DJ001 Avoid using `null=True` on string-based fields such as `SlugField`
--> DJ001.py:54:35
|
52 | charfield: models.CharField = models.CharField(max_length=255, null=True)
53 | textfield: models.TextField = models.TextField(max_length=255, null=True)
54 | slugfield: models.SlugField = models.SlugField(max_length=255, null=True)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|

View File

@@ -23,6 +23,14 @@ mod tests {
Rule::MultiLineImplicitStringConcatenation,
Path::new("ISC_syntax_error.py")
)]
#[test_case(
Rule::SingleLineImplicitStringConcatenation,
Path::new("ISC_syntax_error_2.py")
)]
#[test_case(
Rule::MultiLineImplicitStringConcatenation,
Path::new("ISC_syntax_error_2.py")
)]
#[test_case(Rule::ExplicitStringConcatenation, Path::new("ISC.py"))]
fn rules(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!("{}_{}", rule_code.noqa_code(), path.to_string_lossy());

View File

@@ -1,13 +1,12 @@
use std::borrow::Cow;
use itertools::Itertools;
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::str::{leading_quote, trailing_quote};
use ruff_python_ast::StringFlags;
use ruff_python_index::Indexer;
use ruff_python_parser::{TokenKind, Tokens};
use ruff_python_parser::{Token, TokenKind, Tokens};
use ruff_source_file::LineRanges;
use ruff_text_size::{Ranged, TextRange};
use ruff_text_size::{Ranged, TextLen, TextRange};
use crate::Locator;
use crate::checkers::ast::LintContext;
@@ -169,7 +168,8 @@ pub(crate) fn implicit(
SingleLineImplicitStringConcatenation,
TextRange::new(a_range.start(), b_range.end()),
) {
if let Some(fix) = concatenate_strings(a_range, b_range, locator) {
if let Some(fix) = concatenate_strings(a_token, b_token, a_range, b_range, locator)
{
diagnostic.set_fix(fix);
}
}
@@ -177,38 +177,55 @@ pub(crate) fn implicit(
}
}
fn concatenate_strings(a_range: TextRange, b_range: TextRange, locator: &Locator) -> Option<Fix> {
let a_text = locator.slice(a_range);
let b_text = locator.slice(b_range);
let a_leading_quote = leading_quote(a_text)?;
let b_leading_quote = leading_quote(b_text)?;
// Require, for now, that the leading quotes are the same.
if a_leading_quote != b_leading_quote {
/// Concatenates two strings
///
/// The `a_string_range` and `b_string_range` are the ranges of the entire strings,
/// not just of the string tokens themselves (important for interpolated strings,
/// where the start token doesn't span the entire string).
fn concatenate_strings(
a_token: &Token,
b_token: &Token,
a_string_range: TextRange,
b_string_range: TextRange,
locator: &Locator,
) -> Option<Fix> {
if a_token.string_flags()?.is_unclosed() || b_token.string_flags()?.is_unclosed() {
return None;
}
let a_trailing_quote = trailing_quote(a_text)?;
let b_trailing_quote = trailing_quote(b_text)?;
let a_string_flags = a_token.string_flags()?;
let b_string_flags = b_token.string_flags()?;
// Require, for now, that the trailing quotes are the same.
if a_trailing_quote != b_trailing_quote {
let a_prefix = a_string_flags.prefix();
let b_prefix = b_string_flags.prefix();
// Require, for now, that the strings have the same prefix,
// quote style, and number of quotes
if a_prefix != b_prefix
|| a_string_flags.quote_style() != b_string_flags.quote_style()
|| a_string_flags.is_triple_quoted() != b_string_flags.is_triple_quoted()
{
return None;
}
let a_text = locator.slice(a_string_range);
let b_text = locator.slice(b_string_range);
let quotes = a_string_flags.quote_str();
let opener_len = a_string_flags.opener_len();
let closer_len = a_string_flags.closer_len();
let mut a_body =
Cow::Borrowed(&a_text[a_leading_quote.len()..a_text.len() - a_trailing_quote.len()]);
let b_body = &b_text[b_leading_quote.len()..b_text.len() - b_trailing_quote.len()];
Cow::Borrowed(&a_text[TextRange::new(opener_len, a_text.text_len() - closer_len)]);
let b_body = &b_text[TextRange::new(opener_len, b_text.text_len() - closer_len)];
if a_leading_quote.find(['r', 'R']).is_none()
&& matches!(b_body.bytes().next(), Some(b'0'..=b'7'))
{
if !a_string_flags.is_raw_string() && matches!(b_body.bytes().next(), Some(b'0'..=b'7')) {
normalize_ending_octal(&mut a_body);
}
let concatenation = format!("{a_leading_quote}{a_body}{b_body}{a_trailing_quote}");
let range = TextRange::new(a_range.start(), b_range.end());
let concatenation = format!("{a_prefix}{quotes}{a_body}{b_body}{quotes}");
let range = TextRange::new(a_string_range.start(), b_string_range.end());
Some(Fix::safe_edit(Edit::range_replacement(
concatenation,
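The `normalize_ending_octal` special case exists because naively splicing two string bodies can merge a trailing escape with the following digits; a quick illustration of the hazard:

```python
# Implicit concatenation keeps two separate literals: chr(1), then the text "07".
pieces = "\1" "07"
# Naively gluing the source bodies would produce "\107" — a single octal
# escape for chr(0o107), i.e. "G" — a different string entirely.
naive = "\107"
assert pieces == "\x01" + "07"   # three characters
assert naive == "G"              # one character
assert pieces != naive
```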

View File

@@ -0,0 +1,134 @@
---
source: crates/ruff_linter/src/rules/flake8_implicit_str_concat/mod.rs
---
ISC001 Implicitly concatenated string literals on one line
--> ISC_syntax_error_2.py:2:1
|
1 | # Regression test for https://github.com/astral-sh/ruff/issues/21023
2 | '' '
| ^^^^
3 | "" ""
4 | '' '' '
|
help: Combine string literals
invalid-syntax: missing closing quote in string literal
--> ISC_syntax_error_2.py:2:4
|
1 | # Regression test for https://github.com/astral-sh/ruff/issues/21023
2 | '' '
| ^
3 | "" ""
4 | '' '' '
|
ISC001 Implicitly concatenated string literals on one line
--> ISC_syntax_error_2.py:3:1
|
1 | # Regression test for https://github.com/astral-sh/ruff/issues/21023
2 | '' '
3 | "" ""
| ^^^^^
4 | '' '' '
5 | "" "" "
|
help: Combine string literals
ISC001 Implicitly concatenated string literals on one line
--> ISC_syntax_error_2.py:4:1
|
2 | '' '
3 | "" ""
4 | '' '' '
| ^^^^^
5 | "" "" "
6 | f"" f"
|
help: Combine string literals
ISC001 Implicitly concatenated string literals on one line
--> ISC_syntax_error_2.py:4:4
|
2 | '' '
3 | "" ""
4 | '' '' '
| ^^^^
5 | "" "" "
6 | f"" f"
|
help: Combine string literals
invalid-syntax: missing closing quote in string literal
--> ISC_syntax_error_2.py:4:7
|
2 | '' '
3 | "" ""
4 | '' '' '
| ^
5 | "" "" "
6 | f"" f"
|
ISC001 Implicitly concatenated string literals on one line
--> ISC_syntax_error_2.py:5:1
|
3 | "" ""
4 | '' '' '
5 | "" "" "
| ^^^^^
6 | f"" f"
7 | f"" f"" f"
|
help: Combine string literals
ISC001 Implicitly concatenated string literals on one line
--> ISC_syntax_error_2.py:5:4
|
3 | "" ""
4 | '' '' '
5 | "" "" "
| ^^^^
6 | f"" f"
7 | f"" f"" f"
|
help: Combine string literals
invalid-syntax: missing closing quote in string literal
--> ISC_syntax_error_2.py:5:7
|
3 | "" ""
4 | '' '' '
5 | "" "" "
| ^
6 | f"" f"
7 | f"" f"" f"
|
invalid-syntax: f-string: unterminated string
--> ISC_syntax_error_2.py:6:7
|
4 | '' '' '
5 | "" "" "
6 | f"" f"
| ^
7 | f"" f"" f"
|
ISC001 Implicitly concatenated string literals on one line
--> ISC_syntax_error_2.py:7:1
|
5 | "" "" "
6 | f"" f"
7 | f"" f"" f"
| ^^^^^^^
|
help: Combine string literals
invalid-syntax: f-string: unterminated string
--> ISC_syntax_error_2.py:7:11
|
5 | "" "" "
6 | f"" f"
7 | f"" f"" f"
| ^
|

View File

@@ -0,0 +1,53 @@
---
source: crates/ruff_linter/src/rules/flake8_implicit_str_concat/mod.rs
---
invalid-syntax: missing closing quote in string literal
--> ISC_syntax_error_2.py:2:4
|
1 | # Regression test for https://github.com/astral-sh/ruff/issues/21023
2 | '' '
| ^
3 | "" ""
4 | '' '' '
|
invalid-syntax: missing closing quote in string literal
--> ISC_syntax_error_2.py:4:7
|
2 | '' '
3 | "" ""
4 | '' '' '
| ^
5 | "" "" "
6 | f"" f"
|
invalid-syntax: missing closing quote in string literal
--> ISC_syntax_error_2.py:5:7
|
3 | "" ""
4 | '' '' '
5 | "" "" "
| ^
6 | f"" f"
7 | f"" f"" f"
|
invalid-syntax: f-string: unterminated string
--> ISC_syntax_error_2.py:6:7
|
4 | '' '' '
5 | "" "" "
6 | f"" f"
| ^
7 | f"" f"" f"
|
invalid-syntax: f-string: unterminated string
--> ISC_syntax_error_2.py:7:11
|
5 | "" "" "
6 | f"" f"
7 | f"" f"" f"
| ^
|

View File

@@ -50,6 +50,29 @@ use ruff_text_size::Ranged;
/// 1. `__aiter__` methods that return `AsyncIterator`, despite the class
/// inheriting directly from `AsyncIterator`.
///
/// The rule attempts to avoid flagging methods on metaclasses, since
/// [PEP 673] specifies that `Self` is disallowed in metaclasses. Ruff can
/// detect a class as being a metaclass if it inherits from a stdlib
/// metaclass such as `builtins.type` or `abc.ABCMeta`, and additionally
/// infers that a class may be a metaclass if it has a `__new__` method
/// with a similar signature to `type.__new__`. The heuristic used to
/// identify a metaclass-like `__new__` method signature is that it:
///
/// 1. Has exactly 5 parameters (including `cls`)
/// 1. Has a second parameter annotated with `str`
/// 1. Has a third parameter annotated with a `tuple` type
/// 1. Has a fourth parameter annotated with a `dict` type
/// 1. Has a fifth parameter that is keyword-variadic (`**kwargs`)
///
/// For example, the following class would be detected as a metaclass, disabling
/// the rule:
///
/// ```python
/// class MyMetaclass(django.db.models.base.ModelBase):
/// def __new__(cls, name: str, bases: tuple[Any, ...], attrs: dict[str, Any], **kwargs: Any) -> MyMetaclass:
/// ...
/// ```
///
/// ## Example
///
/// ```pyi
@@ -87,6 +110,8 @@ use ruff_text_size::Ranged;
///
/// ## References
/// - [Python documentation: `typing.Self`](https://docs.python.org/3/library/typing.html#typing.Self)
///
/// [PEP 673]: https://peps.python.org/pep-0673/#valid-locations-for-self
#[derive(ViolationMetadata)]
#[violation_metadata(stable_since = "v0.0.271")]
pub(crate) struct NonSelfReturnType {
@@ -143,7 +168,10 @@ pub(crate) fn non_self_return_type(
};
// PEP 673 forbids the use of `typing(_extensions).Self` in metaclasses.
if analyze::class::is_metaclass(class_def, semantic).is_yes() {
if !matches!(
analyze::class::is_metaclass(class_def, semantic),
analyze::class::IsMetaclass::No
) {
return;
}
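The five-parameter shape the heuristic describes matches how real metaclasses override `type.__new__` at runtime; a minimal sketch (the `Meta` and `Widget` names are illustrative, not from the rule's fixtures):

```python
from typing import Any

# A metaclass __new__ with the shape the heuristic looks for:
# exactly 5 parameters including `cls`, with str/tuple/dict annotations
# on the second through fourth, and a keyword-variadic fifth.
class Meta(type):
    def __new__(
        cls, name: str, bases: tuple[Any, ...], namespace: dict[str, Any], **kwargs: Any
    ):
        return super().__new__(cls, name, bases, namespace, **kwargs)

class Widget(metaclass=Meta):
    pass

assert isinstance(Widget, Meta)
assert Widget.__name__ == "Widget"
```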

View File

@@ -451,6 +451,7 @@ help: Use `Self` as return type
359 + def __new__(cls) -> typing.Self: ...
360 | def __enter__(self: Generic5) -> Generic5: ...
361 |
362 |
note: This is an unsafe fix and may change runtime behavior
PYI034 [*] `__enter__` methods in classes like `Generic5` usually return `self` at runtime
@@ -468,4 +469,6 @@ help: Use `Self` as return type
- def __enter__(self: Generic5) -> Generic5: ...
360 + def __enter__(self) -> typing.Self: ...
361 |
362 |
363 | # Test cases based on issue #20781 - metaclasses that triggers IsMetaclass::Maybe
note: This is an unsafe fix and may change runtime behavior

View File

@@ -431,6 +431,8 @@ help: Use `Self` as return type
- def __new__(cls: type[Generic5]) -> Generic5: ...
253 + def __new__(cls) -> typing.Self: ...
254 | def __enter__(self: Generic5) -> Generic5: ...
255 |
256 |
note: This is a display-only fix and is likely to be incorrect
PYI034 [*] `__enter__` methods in classes like `Generic5` usually return `self` at runtime
@@ -447,4 +449,7 @@ help: Use `Self` as return type
253 | def __new__(cls: type[Generic5]) -> Generic5: ...
- def __enter__(self: Generic5) -> Generic5: ...
254 + def __enter__(self) -> typing.Self: ...
255 |
256 |
257 | # Test case based on issue #20781 - metaclass that triggers IsMetaclass::Maybe
note: This is a display-only fix and is likely to be incorrect

View File

@@ -98,6 +98,26 @@ mod tests {
Ok(())
}
#[test_case(Rule::TypingOnlyStandardLibraryImport, Path::new("TC003.py"))]
fn add_future_import_dataclass_kw_only_py313(rule: Rule, path: &Path) -> Result<()> {
let snapshot = format!(
"add_future_import_kw_only__{}_{}",
rule.noqa_code(),
path.to_string_lossy()
);
let diagnostics = test_path(
Path::new("flake8_type_checking").join(path).as_path(),
&settings::LinterSettings {
future_annotations: true,
// The issue in #21121 also didn't trigger on Python 3.14
unresolved_target_version: PythonVersion::PY313.into(),
..settings::LinterSettings::for_rule(rule)
},
)?;
assert_diagnostics!(snapshot, diagnostics);
Ok(())
}
// we test these rules as a pair, since they're opposites of one another
// so we want to make sure their fixes are not going around in circles.
#[test_case(Rule::UnquotedTypeAlias, Path::new("TC007.py"))]

View File

@@ -0,0 +1,28 @@
---
source: crates/ruff_linter/src/rules/flake8_type_checking/mod.rs
---
TC003 [*] Move standard library import `os` into a type-checking block
--> TC003.py:8:12
|
7 | def f():
8 | import os
| ^^
9 |
10 | x: os
|
help: Move into type-checking block
2 |
3 | For typing-only import detection tests, see `TC002.py`.
4 | """
5 + from typing import TYPE_CHECKING
6 +
7 + if TYPE_CHECKING:
8 + import os
9 |
10 |
11 | def f():
- import os
12 |
13 | x: os
14 |
note: This is an unsafe fix and may change runtime behavior

View File

@@ -9,6 +9,7 @@ use ruff_python_semantic::{Definition, SemanticModel};
use ruff_python_stdlib::identifiers::is_identifier;
use ruff_source_file::{LineRanges, NewlineWithTrailingNewline};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
use rustc_hash::FxHashMap;
use crate::Violation;
use crate::checkers::ast::Checker;
@@ -823,6 +824,8 @@ struct BodyVisitor<'a> {
currently_suspended_exceptions: Option<&'a ast::Expr>,
raised_exceptions: Vec<ExceptionEntry<'a>>,
semantic: &'a SemanticModel<'a>,
/// Maps exception variable names to their exception expressions in the current except clause
exception_variables: FxHashMap<&'a str, &'a ast::Expr>,
}
impl<'a> BodyVisitor<'a> {
@@ -833,6 +836,7 @@ impl<'a> BodyVisitor<'a> {
currently_suspended_exceptions: None,
raised_exceptions: Vec::new(),
semantic,
exception_variables: FxHashMap::default(),
}
}
@@ -864,20 +868,47 @@ impl<'a> BodyVisitor<'a> {
raised_exceptions,
}
}
/// Store `exception` if its qualified name does not correspond to one of the exempt types.
fn maybe_store_exception(&mut self, exception: &'a Expr, range: TextRange) {
let Some(qualified_name) = self.semantic.resolve_qualified_name(exception) else {
return;
};
if is_exception_or_base_exception(&qualified_name) {
return;
}
self.raised_exceptions.push(ExceptionEntry {
qualified_name,
range,
});
}
}
impl<'a> Visitor<'a> for BodyVisitor<'a> {
fn visit_except_handler(&mut self, handler: &'a ast::ExceptHandler) {
let ast::ExceptHandler::ExceptHandler(handler_inner) = handler;
self.currently_suspended_exceptions = handler_inner.type_.as_deref();
// Track exception variable bindings
if let Some(name) = handler_inner.name.as_ref() {
if let Some(exceptions) = self.currently_suspended_exceptions {
// Store the exception expression(s) for later resolution
self.exception_variables
.insert(name.id.as_str(), exceptions);
}
}
visitor::walk_except_handler(self, handler);
self.currently_suspended_exceptions = None;
// Clear exception variables when leaving the except handler
self.exception_variables.clear();
}
fn visit_stmt(&mut self, stmt: &'a Stmt) {
match stmt {
Stmt::Raise(ast::StmtRaise { exc, .. }) => {
if let Some(exc) = exc.as_ref() {
// First try to resolve the exception directly
if let Some(qualified_name) =
self.semantic.resolve_qualified_name(map_callable(exc))
{
@@ -885,28 +916,27 @@ impl<'a> Visitor<'a> for BodyVisitor<'a> {
qualified_name,
range: exc.range(),
});
} else if let ast::Expr::Name(name) = exc.as_ref() {
// If it's a variable name, check if it's bound to an exception in the
// current except clause
if let Some(exception_expr) = self.exception_variables.get(name.id.as_str())
{
if let ast::Expr::Tuple(tuple) = exception_expr {
for exception in tuple {
self.maybe_store_exception(exception, stmt.range());
}
} else {
self.maybe_store_exception(exception_expr, stmt.range());
}
}
}
} else if let Some(exceptions) = self.currently_suspended_exceptions {
let mut maybe_store_exception = |exception| {
let Some(qualified_name) = self.semantic.resolve_qualified_name(exception)
else {
return;
};
if is_exception_or_base_exception(&qualified_name) {
return;
}
self.raised_exceptions.push(ExceptionEntry {
qualified_name,
range: stmt.range(),
});
};
if let ast::Expr::Tuple(tuple) = exceptions {
for exception in tuple {
maybe_store_exception(exception);
self.maybe_store_exception(exception, stmt.range());
}
} else {
maybe_store_exception(exceptions);
self.maybe_store_exception(exceptions, stmt.range());
}
}
}
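The new `exception_variables` map targets re-raises of a bound exception variable. A hypothetical function of the kind DOC501 would now flag (illustrative names, not from the fixtures):

```python
def divide(a: float, b: float) -> float:
    """Divide a by b.

    Returns:
        The quotient.
    """
    try:
        return a / b
    except ZeroDivisionError as exc:
        # The visitor now resolves `exc` back to `ZeroDivisionError` and
        # reports it as missing from the docstring's Raises section.
        raise exc
```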

View File

@@ -61,6 +61,66 @@ DOC501 Raised exception `FasterThanLightError` missing from docstring
|
help: Add `FasterThanLightError` to the docstring
DOC501 Raised exception `ZeroDivisionError` missing from docstring
--> DOC501_google.py:70:5
|
68 | # DOC501
69 | def calculate_speed(distance: float, time: float) -> float:
70 | / """Calculate speed as distance divided by time.
71 | |
72 | | Args:
73 | | distance: Distance traveled.
74 | | time: Time spent traveling.
75 | |
76 | | Returns:
77 | | Speed as distance divided by time.
78 | | """
| |_______^
79 | try:
80 | return distance / time
|
help: Add `ZeroDivisionError` to the docstring
DOC501 Raised exception `ValueError` missing from docstring
--> DOC501_google.py:88:5
|
86 | # DOC501
87 | def calculate_speed(distance: float, time: float) -> float:
88 | / """Calculate speed as distance divided by time.
89 | |
90 | | Args:
91 | | distance: Distance traveled.
92 | | time: Time spent traveling.
93 | |
94 | | Returns:
95 | | Speed as distance divided by time.
96 | | """
| |_______^
97 | try:
98 | return distance / time
|
help: Add `ValueError` to the docstring
DOC501 Raised exception `ZeroDivisionError` missing from docstring
--> DOC501_google.py:88:5
|
86 | # DOC501
87 | def calculate_speed(distance: float, time: float) -> float:
88 | / """Calculate speed as distance divided by time.
89 | |
90 | | Args:
91 | | distance: Distance traveled.
92 | | time: Time spent traveling.
93 | |
94 | | Returns:
95 | | Speed as distance divided by time.
96 | | """
| |_______^
97 | try:
98 | return distance / time
|
help: Add `ZeroDivisionError` to the docstring
DOC501 Raised exception `AnotherError` missing from docstring
--> DOC501_google.py:106:5
|

View File

@@ -61,6 +61,66 @@ DOC501 Raised exception `FasterThanLightError` missing from docstring
|
help: Add `FasterThanLightError` to the docstring
DOC501 Raised exception `ZeroDivisionError` missing from docstring
--> DOC501_google.py:70:5
|
68 | # DOC501
69 | def calculate_speed(distance: float, time: float) -> float:
70 | / """Calculate speed as distance divided by time.
71 | |
72 | | Args:
73 | | distance: Distance traveled.
74 | | time: Time spent traveling.
75 | |
76 | | Returns:
77 | | Speed as distance divided by time.
78 | | """
| |_______^
79 | try:
80 | return distance / time
|
help: Add `ZeroDivisionError` to the docstring
DOC501 Raised exception `ValueError` missing from docstring
--> DOC501_google.py:88:5
|
86 | # DOC501
87 | def calculate_speed(distance: float, time: float) -> float:
88 | / """Calculate speed as distance divided by time.
89 | |
90 | | Args:
91 | | distance: Distance traveled.
92 | | time: Time spent traveling.
93 | |
94 | | Returns:
95 | | Speed as distance divided by time.
96 | | """
| |_______^
97 | try:
98 | return distance / time
|
help: Add `ValueError` to the docstring
DOC501 Raised exception `ZeroDivisionError` missing from docstring
--> DOC501_google.py:88:5
|
86 | # DOC501
87 | def calculate_speed(distance: float, time: float) -> float:
88 | / """Calculate speed as distance divided by time.
89 | |
90 | | Args:
91 | | distance: Distance traveled.
92 | | time: Time spent traveling.
93 | |
94 | | Returns:
95 | | Speed as distance divided by time.
96 | | """
| |_______^
97 | try:
98 | return distance / time
|
help: Add `ZeroDivisionError` to the docstring
DOC501 Raised exception `AnotherError` missing from docstring
--> DOC501_google.py:106:5
|

View File

@@ -166,6 +166,7 @@ mod tests {
#[test_case(Rule::UndefinedName, Path::new("F821_30.py"))]
#[test_case(Rule::UndefinedName, Path::new("F821_31.py"))]
#[test_case(Rule::UndefinedName, Path::new("F821_32.pyi"))]
#[test_case(Rule::UndefinedName, Path::new("F821_33.py"))]
#[test_case(Rule::UndefinedExport, Path::new("F822_0.py"))]
#[test_case(Rule::UndefinedExport, Path::new("F822_0.pyi"))]
#[test_case(Rule::UndefinedExport, Path::new("F822_1.py"))]
@@ -527,6 +528,38 @@ mod tests {
import a",
"f401_use_in_between_imports"
)]
#[test_case(
r"
if cond:
import a
import a.b
a.foo()
",
"f401_same_branch"
)]
#[test_case(
r"
try:
import a.b.c
except ImportError:
import argparse
import a
a.b = argparse.Namespace()
",
"f401_different_branch"
)]
#[test_case(
r"
import mlflow.pyfunc.loaders.chat_agent
import mlflow.pyfunc.loaders.chat_model
import mlflow.pyfunc.loaders.code_model
from mlflow.utils.pydantic_utils import IS_PYDANTIC_V2_OR_NEWER
if IS_PYDANTIC_V2_OR_NEWER:
import mlflow.pyfunc.loaders.responses_agent
",
"f401_type_checking"
)]
fn f401_preview_refined_submodule_handling(contents: &str, snapshot: &str) {
let diagnostics = test_contents(
&SourceKind::Python(dedent(contents).to_string()),

View File

@@ -898,6 +898,10 @@ fn best_match<'a, 'b>(
#[inline]
fn has_simple_shadowed_bindings(scope: &Scope, id: BindingId, semantic: &SemanticModel) -> bool {
let Some(binding_node) = semantic.binding(id).source else {
return false;
};
scope.shadowed_bindings(id).enumerate().all(|(i, shadow)| {
let shadowed_binding = semantic.binding(shadow);
// Bail if one of the shadowed bindings is
@@ -912,6 +916,34 @@ fn has_simple_shadowed_bindings(scope: &Scope, id: BindingId, semantic: &Semanti
if i > 0 && shadowed_binding.is_used() {
return false;
}
// We want to allow a situation like this:
//
// ```python
// import a.b
// if TYPE_CHECKING:
// import a.b.c
// ```
// but bail in a situation like this:
//
// ```python
// try:
// import a.b
// except ImportError:
// import argparse
// import a
// a.b = argparse.Namespace()
// ```
//
// So we require that all the shadowed bindings dominate the
// last live binding for the import. That is: if the last live
// binding is executed it should imply that all the shadowed
// bindings were executed as well.
if shadowed_binding
.source
.is_none_or(|node_id| !semantic.dominates(node_id, binding_node))
{
return false;
}
matches!(
shadowed_binding.kind,
BindingKind::Import(_) | BindingKind::SubmoduleImport(_)
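The reason `import a` and `import a.b` shadow each other at all is that a submodule import binds only the top-level name; that can be observed directly with the stdlib:

```python
import os.path

# `import os.path` binds the name `os`, not `path`. So `import a` followed
# by `import a.b` rebinds the same name `a`, which is why the analysis above
# checks dominance between the shadowed bindings and the last live one.
assert "os" in globals()
assert "path" not in globals()
assert os.path.basename("/tmp/example.txt") == "example.txt"
```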

View File

@@ -0,0 +1,12 @@
---
source: crates/ruff_linter/src/rules/pyflakes/mod.rs
---
F821 Undefined name `__class__`
--> F821_33.py:15:13
|
14 | # Test: lambda outside class (should still fail)
15 | h = lambda: __class__
| ^^^^^^^^^
16 |
17 | # Test: lambda referencing module-level variable (should not be flagged as F821)
|
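The snapshot above matches CPython's runtime behavior: `__class__` is an implicit closure cell that only exists for functions defined inside a class body, so a module-level lambda cannot resolve it:

```python
# A lambda outside any class body has no __class__ cell,
# so referencing it fails at call time with NameError.
h = lambda: __class__
try:
    h()
    outcome = "resolved"
except NameError:
    outcome = "NameError"
assert outcome == "NameError"
```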

View File

@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/pyflakes/mod.rs
---

View File

@@ -0,0 +1,18 @@
---
source: crates/ruff_linter/src/rules/pyflakes/mod.rs
---
F401 [*] `a.b` imported but unused
--> f401_preview_submodule.py:4:12
|
2 | if cond:
3 | import a
4 | import a.b
| ^^^
5 | a.foo()
|
help: Remove unused import: `a.b`
1 |
2 | if cond:
3 | import a
- import a.b
4 | a.foo()

View File

@@ -0,0 +1,66 @@
---
source: crates/ruff_linter/src/rules/pyflakes/mod.rs
---
F401 [*] `mlflow.pyfunc.loaders.chat_agent` imported but unused
--> f401_preview_submodule.py:2:8
|
2 | import mlflow.pyfunc.loaders.chat_agent
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
3 | import mlflow.pyfunc.loaders.chat_model
4 | import mlflow.pyfunc.loaders.code_model
|
help: Remove unused import: `mlflow.pyfunc.loaders.chat_agent`
1 |
- import mlflow.pyfunc.loaders.chat_agent
2 | import mlflow.pyfunc.loaders.chat_model
3 | import mlflow.pyfunc.loaders.code_model
4 | from mlflow.utils.pydantic_utils import IS_PYDANTIC_V2_OR_NEWER
F401 [*] `mlflow.pyfunc.loaders.chat_model` imported but unused
--> f401_preview_submodule.py:3:8
|
2 | import mlflow.pyfunc.loaders.chat_agent
3 | import mlflow.pyfunc.loaders.chat_model
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
4 | import mlflow.pyfunc.loaders.code_model
5 | from mlflow.utils.pydantic_utils import IS_PYDANTIC_V2_OR_NEWER
|
help: Remove unused import: `mlflow.pyfunc.loaders.chat_model`
1 |
2 | import mlflow.pyfunc.loaders.chat_agent
- import mlflow.pyfunc.loaders.chat_model
3 | import mlflow.pyfunc.loaders.code_model
4 | from mlflow.utils.pydantic_utils import IS_PYDANTIC_V2_OR_NEWER
5 |
F401 [*] `mlflow.pyfunc.loaders.code_model` imported but unused
--> f401_preview_submodule.py:4:8
|
2 | import mlflow.pyfunc.loaders.chat_agent
3 | import mlflow.pyfunc.loaders.chat_model
4 | import mlflow.pyfunc.loaders.code_model
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
5 | from mlflow.utils.pydantic_utils import IS_PYDANTIC_V2_OR_NEWER
|
help: Remove unused import: `mlflow.pyfunc.loaders.code_model`
1 |
2 | import mlflow.pyfunc.loaders.chat_agent
3 | import mlflow.pyfunc.loaders.chat_model
- import mlflow.pyfunc.loaders.code_model
4 | from mlflow.utils.pydantic_utils import IS_PYDANTIC_V2_OR_NEWER
5 |
6 | if IS_PYDANTIC_V2_OR_NEWER:
F401 [*] `mlflow.pyfunc.loaders.responses_agent` imported but unused
--> f401_preview_submodule.py:8:12
|
7 | if IS_PYDANTIC_V2_OR_NEWER:
8 | import mlflow.pyfunc.loaders.responses_agent
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
help: Remove unused import: `mlflow.pyfunc.loaders.responses_agent`
5 | from mlflow.utils.pydantic_utils import IS_PYDANTIC_V2_OR_NEWER
6 |
7 | if IS_PYDANTIC_V2_OR_NEWER:
- import mlflow.pyfunc.loaders.responses_agent
8 + pass

View File

@@ -52,6 +52,7 @@ mod tests {
#[test_case(Rule::ManualFromImport, Path::new("import_aliasing.py"))]
#[test_case(Rule::IfStmtMinMax, Path::new("if_stmt_min_max.py"))]
#[test_case(Rule::SingleStringSlots, Path::new("single_string_slots.py"))]
#[test_case(Rule::StopIterationReturn, Path::new("stop_iteration_return.py"))]
#[test_case(Rule::SysExitAlias, Path::new("sys_exit_alias_0.py"))]
#[test_case(Rule::SysExitAlias, Path::new("sys_exit_alias_1.py"))]
#[test_case(Rule::SysExitAlias, Path::new("sys_exit_alias_2.py"))]

View File

@@ -75,6 +75,7 @@ pub(crate) use shallow_copy_environ::*;
pub(crate) use single_string_slots::*;
pub(crate) use singledispatch_method::*;
pub(crate) use singledispatchmethod_function::*;
pub(crate) use stop_iteration_return::*;
pub(crate) use subprocess_popen_preexec_fn::*;
pub(crate) use subprocess_run_without_check::*;
pub(crate) use super_without_brackets::*;
@@ -185,6 +186,7 @@ mod shallow_copy_environ;
mod single_string_slots;
mod singledispatch_method;
mod singledispatchmethod_function;
mod stop_iteration_return;
mod subprocess_popen_preexec_fn;
mod subprocess_run_without_check;
mod super_without_brackets;

View File

@@ -0,0 +1,114 @@
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast as ast;
use ruff_python_ast::visitor::{Visitor, walk_expr, walk_stmt};
use ruff_text_size::Ranged;

use crate::Violation;
use crate::checkers::ast::Checker;

/// ## What it does
/// Checks for explicit `raise StopIteration` in generator functions.
///
/// ## Why is this bad?
/// Raising `StopIteration` in a generator function causes a `RuntimeError`
/// when the generator is iterated over.
///
/// Instead of `raise StopIteration`, use `return` in generator functions.
///
/// ## Example
/// ```python
/// def my_generator():
///     yield 1
///     yield 2
///     raise StopIteration  # This causes RuntimeError at runtime
/// ```
///
/// Use instead:
/// ```python
/// def my_generator():
///     yield 1
///     yield 2
///     return  # Use return instead
/// ```
///
/// ## References
/// - [PEP 479](https://peps.python.org/pep-0479/)
/// - [Python documentation](https://docs.python.org/3/library/exceptions.html#StopIteration)
#[derive(ViolationMetadata)]
#[violation_metadata(preview_since = "0.14.3")]
pub(crate) struct StopIterationReturn;

impl Violation for StopIterationReturn {
    #[derive_message_formats]
    fn message(&self) -> String {
        "Explicit `raise StopIteration` in generator".to_string()
    }

    fn fix_title(&self) -> Option<String> {
        Some("Use `return` instead".to_string())
    }
}

/// PLR1708
pub(crate) fn stop_iteration_return(checker: &Checker, raise_stmt: &ast::StmtRaise) {
    // Fast path: only continue if this is `raise StopIteration` (with or without arguments).
    let Some(exc) = &raise_stmt.exc else {
        return;
    };
    let is_stop_iteration = match exc.as_ref() {
        ast::Expr::Call(ast::ExprCall { func, .. }) => {
            checker.semantic().match_builtin_expr(func, "StopIteration")
        }
        expr => checker.semantic().match_builtin_expr(expr, "StopIteration"),
    };
    if !is_stop_iteration {
        return;
    }

    // Now check the (more expensive) generator context.
    if !in_generator_context(checker) {
        return;
    }

    checker.report_diagnostic(StopIterationReturn, raise_stmt.range());
}

/// Returns `true` if we're inside a function whose body contains any
/// `yield`/`yield from` expressions (i.e., a generator function).
fn in_generator_context(checker: &Checker) -> bool {
    for scope in checker.semantic().current_scopes() {
        if let ruff_python_semantic::ScopeKind::Function(function_def) = scope.kind {
            if contains_yield_statement(&function_def.body) {
                return true;
            }
        }
    }
    false
}

/// Returns `true` if a statement list contains any `yield` or `yield from` expressions.
fn contains_yield_statement(body: &[ast::Stmt]) -> bool {
    struct YieldFinder {
        found: bool,
    }

    impl Visitor<'_> for YieldFinder {
        fn visit_expr(&mut self, expr: &ast::Expr) {
            if matches!(expr, ast::Expr::Yield(_) | ast::Expr::YieldFrom(_)) {
                self.found = true;
            } else {
                walk_expr(self, expr);
            }
        }
    }

    let mut finder = YieldFinder { found: false };
    for stmt in body {
        walk_stmt(&mut finder, stmt);
        if finder.found {
            return true;
        }
    }
    false
}
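For reference, the runtime behavior this rule guards against is easy to reproduce; this is plain PEP 479 semantics, independent of the linter:

```python
# PEP 479: a StopIteration raised inside a generator body is converted
# to a RuntimeError when the generator is advanced, instead of silently
# ending iteration.
def bad_generator():
    yield 1
    raise StopIteration  # becomes RuntimeError under PEP 479


def good_generator():
    yield 1
    return  # the idiomatic way to end a generator early


gen = bad_generator()
assert next(gen) == 1
try:
    next(gen)
except RuntimeError as err:
    print(f"RuntimeError: {err}")

assert list(good_generator()) == [1]
```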


@@ -0,0 +1,109 @@
---
source: crates/ruff_linter/src/rules/pylint/mod.rs
---
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:38:5
|
36 | yield 1
37 | yield 2
38 | raise StopIteration # Should trigger
| ^^^^^^^^^^^^^^^^^^^
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:44:5
|
42 | yield 1
43 | yield 2
44 | raise StopIteration("finished") # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:50:5
|
48 | yield 1
49 | yield 2
50 | raise StopIteration(1 + 2) # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:56:5
|
54 | yield 1
55 | yield 2
56 | raise StopIteration("async") # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:62:9
|
60 | def inner_gen():
61 | yield 1
62 | raise StopIteration("inner") # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
63 |
64 | yield from inner_gen()
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:71:13
|
69 | def generator_method(self):
70 | yield 1
71 | raise StopIteration("method") # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
72 |
73 | return MyClass
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:81:9
|
79 | yield 1
80 | yield 2
81 | raise StopIteration("complex") # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
82 | except ValueError:
83 | yield 3
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:91:9
|
89 | yield 1
90 | if condition:
91 | raise StopIteration("conditional") # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
92 | yield 2
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:98:5
|
96 | def generator_with_bare_stop_iteration():
97 | yield 1
98 | raise StopIteration # Should trigger (no arguments)
| ^^^^^^^^^^^^^^^^^^^
|
help: Use `return` instead
PLR1708 Explicit `raise StopIteration` in generator
--> stop_iteration_return.py:105:13
|
103 | yield i
104 | if i == 3:
105 | raise StopIteration("loop") # Should trigger
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
help: Use `return` instead


@@ -64,6 +64,7 @@ mod tests {
#[test_case(Rule::QuotedAnnotation, Path::new("UP037_0.py"))]
#[test_case(Rule::QuotedAnnotation, Path::new("UP037_1.py"))]
#[test_case(Rule::QuotedAnnotation, Path::new("UP037_2.pyi"))]
#[test_case(Rule::QuotedAnnotation, Path::new("UP037_3.py"))]
#[test_case(Rule::RedundantOpenModes, Path::new("UP015.py"))]
#[test_case(Rule::RedundantOpenModes, Path::new("UP015_1.py"))]
#[test_case(Rule::ReplaceStdoutStderr, Path::new("UP022.py"))]
@@ -111,7 +112,7 @@ mod tests {
#[test_case(Rule::NonPEP695TypeAlias, Path::new("UP040.pyi"))]
#[test_case(Rule::NonPEP695GenericClass, Path::new("UP046_0.py"))]
#[test_case(Rule::NonPEP695GenericClass, Path::new("UP046_1.py"))]
#[test_case(Rule::NonPEP695GenericFunction, Path::new("UP047.py"))]
#[test_case(Rule::NonPEP695GenericFunction, Path::new("UP047_0.py"))]
#[test_case(Rule::PrivateTypeParameter, Path::new("UP049_0.py"))]
#[test_case(Rule::PrivateTypeParameter, Path::new("UP049_1.py"))]
#[test_case(Rule::UselessClassMetaclassType, Path::new("UP050.py"))]
@@ -125,6 +126,22 @@ mod tests {
Ok(())
}
#[test_case(Rule::NonPEP695GenericClass, Path::new("UP046_2.py"))]
#[test_case(Rule::NonPEP695GenericFunction, Path::new("UP047_1.py"))]
fn rules_not_applied_default_typevar_backported(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = path.to_string_lossy().to_string();
let diagnostics = test_path(
Path::new("pyupgrade").join(path).as_path(),
&settings::LinterSettings {
preview: PreviewMode::Enabled,
unresolved_target_version: PythonVersion::PY312.into(),
..settings::LinterSettings::for_rule(rule_code)
},
)?;
assert_diagnostics!(snapshot, diagnostics);
Ok(())
}
#[test_case(Rule::SuperCallWithParameters, Path::new("UP008.py"))]
#[test_case(Rule::TypingTextStrAlias, Path::new("UP019.py"))]
fn rules_preview(rule_code: Rule, path: &Path) -> Result<()> {
@@ -140,11 +157,25 @@ mod tests {
Ok(())
}
#[test_case(Rule::QuotedAnnotation, Path::new("UP037_3.py"))]
fn rules_py313(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!("rules_py313__{}", path.to_string_lossy());
let diagnostics = test_path(
Path::new("pyupgrade").join(path).as_path(),
&settings::LinterSettings {
unresolved_target_version: PythonVersion::PY313.into(),
..settings::LinterSettings::for_rule(rule_code)
},
)?;
assert_diagnostics!(snapshot, diagnostics);
Ok(())
}
#[test_case(Rule::NonPEP695TypeAlias, Path::new("UP040.py"))]
#[test_case(Rule::NonPEP695TypeAlias, Path::new("UP040.pyi"))]
#[test_case(Rule::NonPEP695GenericClass, Path::new("UP046_0.py"))]
#[test_case(Rule::NonPEP695GenericClass, Path::new("UP046_1.py"))]
#[test_case(Rule::NonPEP695GenericFunction, Path::new("UP047.py"))]
#[test_case(Rule::NonPEP695GenericFunction, Path::new("UP047_0.py"))]
fn type_var_default_preview(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!("{}__preview_diff", path.to_string_lossy());
assert_diagnostics_diff!(


@@ -6,8 +6,8 @@ use std::fmt::Display;
use itertools::Itertools;
use ruff_python_ast::{
- self as ast, Arguments, Expr, ExprCall, ExprName, ExprSubscript, Identifier, Stmt, StmtAssign,
- TypeParam, TypeParamParamSpec, TypeParamTypeVar, TypeParamTypeVarTuple,
+ self as ast, Arguments, Expr, ExprCall, ExprName, ExprSubscript, Identifier, PythonVersion,
+ Stmt, StmtAssign, TypeParam, TypeParamParamSpec, TypeParamTypeVar, TypeParamTypeVarTuple,
name::Name,
visitor::{self, Visitor},
};
@@ -369,15 +369,19 @@ fn in_nested_context(checker: &Checker) -> bool {
}
/// Deduplicate `vars`, returning `None` if `vars` is empty or any duplicates are found.
- /// Also returns `None` if any `TypeVar` has a default value and preview mode is not enabled.
+ /// Also returns `None` if any `TypeVar` has a default value and the target Python version
+ /// is below 3.13 or preview mode is not enabled. Note that `typing_extensions` backports
+ /// the default argument, but the rule should be skipped in that case.
fn check_type_vars<'a>(vars: Vec<TypeVar<'a>>, checker: &Checker) -> Option<Vec<TypeVar<'a>>> {
if vars.is_empty() {
return None;
}
- // If any type variables have defaults and preview mode is not enabled, skip the rule
+ // If any type variables have defaults, skip the rule unless
+ // running with preview mode enabled and targeting Python 3.13+.
if vars.iter().any(|tv| tv.default.is_some())
- && !is_type_var_default_enabled(checker.settings())
+ && (checker.target_version() < PythonVersion::PY313
+     || !is_type_var_default_enabled(checker.settings()))
{
return None;
}
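For context on the new gate: PEP 696 `TypeVar` defaults only reached the stdlib `typing` module in Python 3.13, so on earlier targets a default can only come from the `typing_extensions` backport and cannot be expressed with new-style `class C[T = int]` syntax. A version-guarded sketch of the feature being gated:

```python
import sys
import typing

# TypeVar(..., default=...) exists in the stdlib only on Python 3.13+;
# earlier interpreters need the `typing_extensions` backport instead.
if sys.version_info >= (3, 13):
    T = typing.TypeVar("T", default=int)
    assert T.has_default()
    assert T.__default__ is int
else:
    print("stdlib TypeVar gained `default` in 3.13")
```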


@@ -0,0 +1,36 @@
---
source: crates/ruff_linter/src/rules/pyupgrade/mod.rs
---
UP037 [*] Remove quotes from type annotation
--> UP037_3.py:15:35
|
13 | @dataclass(frozen=True)
14 | class EmptyCell:
15 | _singleton: ClassVar[Optional["EmptyCell"]] = None
| ^^^^^^^^^^^
16 | # the behavior of _singleton above should match a non-ClassVar
17 | _doubleton: "EmptyCell"
|
help: Remove quotes
12 |
13 | @dataclass(frozen=True)
14 | class EmptyCell:
- _singleton: ClassVar[Optional["EmptyCell"]] = None
15 + _singleton: ClassVar[Optional[EmptyCell]] = None
16 | # the behavior of _singleton above should match a non-ClassVar
17 | _doubleton: "EmptyCell"
UP037 [*] Remove quotes from type annotation
--> UP037_3.py:17:17
|
15 | _singleton: ClassVar[Optional["EmptyCell"]] = None
16 | # the behavior of _singleton above should match a non-ClassVar
17 | _doubleton: "EmptyCell"
| ^^^^^^^^^^^
|
help: Remove quotes
14 | class EmptyCell:
15 | _singleton: ClassVar[Optional["EmptyCell"]] = None
16 | # the behavior of _singleton above should match a non-ClassVar
- _doubleton: "EmptyCell"
17 + _doubleton: EmptyCell
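Why removing the quotes is sound here: under PEP 563 (`from __future__ import annotations`), class-body annotations are stored as strings and never evaluated at class-creation time, so the forward self-reference needs no quoting. A minimal sketch mirroring the `EmptyCell` fixture:

```python
from __future__ import annotations

from dataclasses import dataclass, fields
from typing import ClassVar, Optional


@dataclass(frozen=True)
class EmptyCell:
    # Unquoted self-references are fine: the annotations are never evaluated.
    _singleton: ClassVar[Optional[EmptyCell]] = None
    _doubleton: EmptyCell


# Annotations are stored as plain strings, and the ClassVar is excluded
# from the dataclass fields, matching the non-ClassVar behavior the
# fixture comments on.
assert EmptyCell.__annotations__["_doubleton"] == "EmptyCell"
assert [f.name for f in fields(EmptyCell)] == ["_doubleton"]
assert EmptyCell._singleton is None
```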


@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/pyupgrade/mod.rs
---


@@ -2,7 +2,7 @@
source: crates/ruff_linter/src/rules/pyupgrade/mod.rs
---
UP047 [*] Generic function `f` should use type parameters
--> UP047.py:12:5
--> UP047_0.py:12:5
|
12 | def f(t: T) -> T:
| ^^^^^^^
@@ -20,7 +20,7 @@ help: Use type parameters
note: This is an unsafe fix and may change runtime behavior
UP047 [*] Generic function `g` should use type parameters
--> UP047.py:16:5
--> UP047_0.py:16:5
|
16 | def g(ts: tuple[*Ts]) -> tuple[*Ts]:
| ^^^^^^^^^^^^^^^^^
@@ -38,7 +38,7 @@ help: Use type parameters
note: This is an unsafe fix and may change runtime behavior
UP047 [*] Generic function `h` should use type parameters
--> UP047.py:20:5
--> UP047_0.py:20:5
|
20 | def h(
| _____^
@@ -62,7 +62,7 @@ help: Use type parameters
note: This is an unsafe fix and may change runtime behavior
UP047 [*] Generic function `i` should use type parameters
--> UP047.py:29:5
--> UP047_0.py:29:5
|
29 | def i(s: S) -> S:
| ^^^^^^^
@@ -80,7 +80,7 @@ help: Use type parameters
note: This is an unsafe fix and may change runtime behavior
UP047 [*] Generic function `broken_fix` should use type parameters
--> UP047.py:39:5
--> UP047_0.py:39:5
|
37 | # TypeVars with the new-style generic syntax and will be rejected by type
38 | # checkers
@@ -100,7 +100,7 @@ help: Use type parameters
note: This is an unsafe fix and may change runtime behavior
UP047 [*] Generic function `any_str_param` should use type parameters
--> UP047.py:43:5
--> UP047_0.py:43:5
|
43 | def any_str_param(s: AnyStr) -> AnyStr:
| ^^^^^^^^^^^^^^^^^^^^^^^^


@@ -11,7 +11,7 @@ Added: 1
--- Added ---
UP047 [*] Generic function `default_var` should use type parameters
--> UP047.py:51:5
--> UP047_0.py:51:5
|
51 | def default_var(v: V) -> V:
| ^^^^^^^^^^^^^^^^^


@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/pyupgrade/mod.rs
---


@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/pyupgrade/mod.rs
---


@@ -1,5 +1,6 @@
use ruff_macros::{ViolationMetadata, derive_message_formats};
use itertools::Itertools;
use ruff_python_ast::{self as ast, Expr};
use ruff_python_trivia::PythonWhitespace;
use ruff_text_size::Ranged;
@@ -91,18 +92,20 @@ pub(crate) fn verbose_decimal_constructor(checker: &Checker, call: &ast::ExprCal
// using this regex:
// https://github.com/python/cpython/blob/ac556a2ad1213b8bb81372fe6fb762f5fcb076de/Lib/_pydecimal.py#L6060-L6077
// _after_ trimming whitespace from the string and removing all occurrences of "_".
- let mut trimmed = Cow::from(str_literal.to_str().trim_whitespace());
- if memchr::memchr(b'_', trimmed.as_bytes()).is_some() {
-     trimmed = Cow::from(trimmed.replace('_', ""));
- }
+ let original_str = str_literal.to_str().trim_whitespace();
// Extract the unary sign, if any.
- let (unary, rest) = if let Some(trimmed) = trimmed.strip_prefix('+') {
-     ("+", Cow::from(trimmed))
- } else if let Some(trimmed) = trimmed.strip_prefix('-') {
-     ("-", Cow::from(trimmed))
+ let (unary, original_str) = if let Some(trimmed) = original_str.strip_prefix('+') {
+     ("+", trimmed)
+ } else if let Some(trimmed) = original_str.strip_prefix('-') {
+     ("-", trimmed)
} else {
-     ("", trimmed)
+     ("", original_str)
};
+ let mut rest = Cow::from(original_str);
+ let has_digit_separators = memchr::memchr(b'_', rest.as_bytes()).is_some();
+ if has_digit_separators {
+     rest = Cow::from(rest.replace('_', ""));
+ }
// Early return if we now have an empty string
// or a very long string:
@@ -118,6 +121,13 @@ pub(crate) fn verbose_decimal_constructor(checker: &Checker, call: &ast::ExprCal
return;
}
// If the original string had digit separators, normalize them
let rest = if has_digit_separators {
Cow::from(normalize_digit_separators(original_str))
} else {
Cow::from(rest)
};
// If all the characters are zeros, then the value is zero.
let rest = match (unary, rest.is_empty()) {
// `Decimal("-0")` is not the same as `Decimal("0")`
@@ -126,10 +136,11 @@ pub(crate) fn verbose_decimal_constructor(checker: &Checker, call: &ast::ExprCal
return;
}
(_, true) => "0",
- _ => rest,
+ _ => &rest,
};
let replacement = format!("{unary}{rest}");
let mut diagnostic = checker.report_diagnostic(
VerboseDecimalConstructor {
replacement: replacement.clone(),
@@ -186,6 +197,22 @@ pub(crate) fn verbose_decimal_constructor(checker: &Checker, call: &ast::ExprCal
}
}
/// Normalizes digit separators in a numeric string by:
/// - Stripping leading underscores and zeros, and trailing underscores
/// - Collapsing medial underscore sequences to single underscores
fn normalize_digit_separators(original_str: &str) -> String {
    // Strip leading underscores and zeros, and trailing underscores
    let trimmed = original_str
        .trim_start_matches(['_', '0'])
        .trim_end_matches('_');

    // Collapse medial runs of underscores to single underscores
    trimmed
        .chars()
        .dedup_by(|a, b| *a == '_' && a == b)
        .collect()
}
// ```console
// $ python
// >>> import sys
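The separator handling can be sketched in Python (a hypothetical `normalize_digit_separators` mirroring the Rust helper; note that `Decimal` itself discards every underscore in a string input before parsing, which is what makes the normalized fix safe):

```python
import re
from decimal import Decimal


def normalize_digit_separators(s: str) -> str:
    """Strip leading underscores/zeros and trailing underscores, then
    collapse medial runs of underscores to a single underscore."""
    s = s.lstrip("_0").rstrip("_")
    return re.sub(r"_+", "_", s)


# `Decimal` removes all underscores from string inputs, so these are equal:
assert Decimal("__1____000") == Decimal("1_000") == Decimal(1000)

assert normalize_digit_separators("__1____000") == "1_000"
assert normalize_digit_separators("0001_2345") == "1_2345"
assert normalize_digit_separators("15_000_000") == "15_000_000"
assert normalize_digit_separators("000_000") == ""  # all zeros: the rule emits `0`
```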


@@ -167,12 +167,12 @@ FURB157 [*] Verbose expression in `Decimal` constructor
| ^^^^^^^
24 | Decimal("__1____000")
|
help: Replace with `1000`
help: Replace with `1_000`
20 | # See https://github.com/astral-sh/ruff/issues/13807
21 |
22 | # Errors
- Decimal("1_000")
23 + Decimal(1000)
23 + Decimal(1_000)
24 | Decimal("__1____000")
25 |
26 | # Ok
@@ -187,12 +187,12 @@ FURB157 [*] Verbose expression in `Decimal` constructor
25 |
26 | # Ok
|
help: Replace with `1000`
help: Replace with `1_000`
21 |
22 | # Errors
23 | Decimal("1_000")
- Decimal("__1____000")
24 + Decimal(1000)
24 + Decimal(1_000)
25 |
26 | # Ok
27 | Decimal("2e-4")
@@ -494,6 +494,7 @@ help: Replace with `"nan"`
69 + Decimal("nan")
70 | Decimal(float(" -" "nan"))
71 | Decimal(float("-nAn"))
72 |
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:70:9
@@ -511,6 +512,8 @@ help: Replace with `"nan"`
- Decimal(float(" -" "nan"))
70 + Decimal("nan")
71 | Decimal(float("-nAn"))
72 |
73 | # Test cases for digit separators (safe fixes)
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:71:9
@@ -519,6 +522,8 @@ FURB157 [*] Verbose expression in `Decimal` constructor
70 | Decimal(float(" -" "nan"))
71 | Decimal(float("-nAn"))
| ^^^^^^^^^^^^^
72 |
73 | # Test cases for digit separators (safe fixes)
|
help: Replace with `"nan"`
68 | Decimal(float("\N{space}\N{hyPHen-MINus}nan"))
@@ -526,3 +531,173 @@ help: Replace with `"nan"`
70 | Decimal(float(" -" "nan"))
- Decimal(float("-nAn"))
71 + Decimal("nan")
72 |
73 | # Test cases for digit separators (safe fixes)
74 | # https://github.com/astral-sh/ruff/issues/20572
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:75:9
|
73 | # Test cases for digit separators (safe fixes)
74 | # https://github.com/astral-sh/ruff/issues/20572
75 | Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
| ^^^^^^^^^^^^
76 | Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
77 | Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
|
help: Replace with `15_000_000`
72 |
73 | # Test cases for digit separators (safe fixes)
74 | # https://github.com/astral-sh/ruff/issues/20572
- Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
75 + Decimal(15_000_000) # Safe fix: normalizes separators, becomes Decimal(15_000_000)
76 | Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
77 | Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
78 | Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:76:9
|
74 | # https://github.com/astral-sh/ruff/issues/20572
75 | Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
76 | Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
| ^^^^^^^^^^^
77 | Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
78 | Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
|
help: Replace with `1_234_567`
73 | # Test cases for digit separators (safe fixes)
74 | # https://github.com/astral-sh/ruff/issues/20572
75 | Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
- Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
76 + Decimal(1_234_567) # Safe fix: normalizes separators, becomes Decimal(1_234_567)
77 | Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
78 | Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
79 |
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:77:9
|
75 | Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
76 | Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
77 | Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
| ^^^^^^^^
78 | Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
|
help: Replace with `-5_000`
74 | # https://github.com/astral-sh/ruff/issues/20572
75 | Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
76 | Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
- Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
77 + Decimal(-5_000) # Safe fix: normalizes separators, becomes Decimal(-5_000)
78 | Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
79 |
80 | # Test cases for non-thousands separators
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:78:9
|
76 | Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
77 | Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
78 | Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
| ^^^^^^^^
79 |
80 | # Test cases for non-thousands separators
|
help: Replace with `+9_999`
75 | Decimal("15_000_000") # Safe fix: normalizes separators, becomes Decimal(15_000_000)
76 | Decimal("1_234_567") # Safe fix: normalizes separators, becomes Decimal(1_234_567)
77 | Decimal("-5_000") # Safe fix: normalizes separators, becomes Decimal(-5_000)
- Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
78 + Decimal(+9_999) # Safe fix: normalizes separators, becomes Decimal(+9_999)
79 |
80 | # Test cases for non-thousands separators
81 | Decimal("12_34_56_78") # Safe fix: preserves non-thousands separators
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:81:9
|
80 | # Test cases for non-thousands separators
81 | Decimal("12_34_56_78") # Safe fix: preserves non-thousands separators
| ^^^^^^^^^^^^^
82 | Decimal("1234_5678") # Safe fix: preserves non-thousands separators
|
help: Replace with `12_34_56_78`
78 | Decimal("+9_999") # Safe fix: normalizes separators, becomes Decimal(+9_999)
79 |
80 | # Test cases for non-thousands separators
- Decimal("12_34_56_78") # Safe fix: preserves non-thousands separators
81 + Decimal(12_34_56_78) # Safe fix: preserves non-thousands separators
82 | Decimal("1234_5678") # Safe fix: preserves non-thousands separators
83 |
84 | # Separators _and_ leading zeros
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:82:9
|
80 | # Test cases for non-thousands separators
81 | Decimal("12_34_56_78") # Safe fix: preserves non-thousands separators
82 | Decimal("1234_5678") # Safe fix: preserves non-thousands separators
| ^^^^^^^^^^^
83 |
84 | # Separators _and_ leading zeros
|
help: Replace with `1234_5678`
79 |
80 | # Test cases for non-thousands separators
81 | Decimal("12_34_56_78") # Safe fix: preserves non-thousands separators
- Decimal("1234_5678") # Safe fix: preserves non-thousands separators
82 + Decimal(1234_5678) # Safe fix: preserves non-thousands separators
83 |
84 | # Separators _and_ leading zeros
85 | Decimal("0001_2345")
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:85:9
|
84 | # Separators _and_ leading zeros
85 | Decimal("0001_2345")
| ^^^^^^^^^^^
86 | Decimal("000_1_2345")
87 | Decimal("000_000")
|
help: Replace with `1_2345`
82 | Decimal("1234_5678") # Safe fix: preserves non-thousands separators
83 |
84 | # Separators _and_ leading zeros
- Decimal("0001_2345")
85 + Decimal(1_2345)
86 | Decimal("000_1_2345")
87 | Decimal("000_000")
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:86:9
|
84 | # Separators _and_ leading zeros
85 | Decimal("0001_2345")
86 | Decimal("000_1_2345")
| ^^^^^^^^^^^^
87 | Decimal("000_000")
|
help: Replace with `1_2345`
83 |
84 | # Separators _and_ leading zeros
85 | Decimal("0001_2345")
- Decimal("000_1_2345")
86 + Decimal(1_2345)
87 | Decimal("000_000")
FURB157 [*] Verbose expression in `Decimal` constructor
--> FURB157.py:87:9
|
85 | Decimal("0001_2345")
86 | Decimal("000_1_2345")
87 | Decimal("000_000")
| ^^^^^^^^^
|
help: Replace with `0`
84 | # Separators _and_ leading zeros
85 | Decimal("0001_2345")
86 | Decimal("000_1_2345")
- Decimal("000_000")
87 + Decimal(0)


@@ -63,19 +63,44 @@ use crate::rules::flake8_logging_format::rules::{LoggingCallType, find_logging_c
#[violation_metadata(preview_since = "0.13.2")]
pub(crate) struct LoggingEagerConversion {
pub(crate) format_conversion: FormatConversion,
pub(crate) function_name: Option<&'static str>,
}
impl Violation for LoggingEagerConversion {
#[derive_message_formats]
fn message(&self) -> String {
- let LoggingEagerConversion { format_conversion } = self;
- let (format_str, call_arg) = match format_conversion {
-     FormatConversion::Str => ("%s", "str()"),
-     FormatConversion::Repr => ("%r", "repr()"),
-     FormatConversion::Ascii => ("%a", "ascii()"),
-     FormatConversion::Bytes => ("%b", "bytes()"),
- };
- format!("Unnecessary `{call_arg}` conversion when formatting with `{format_str}`")
+ let LoggingEagerConversion {
+     format_conversion,
+     function_name,
+ } = self;
match (format_conversion, function_name.as_deref()) {
(FormatConversion::Str, Some("oct")) => {
"Unnecessary `oct()` conversion when formatting with `%s`. \
Use `%#o` instead of `%s`"
.to_string()
}
(FormatConversion::Str, Some("hex")) => {
"Unnecessary `hex()` conversion when formatting with `%s`. \
Use `%#x` instead of `%s`"
.to_string()
}
(FormatConversion::Str, _) => {
"Unnecessary `str()` conversion when formatting with `%s`".to_string()
}
(FormatConversion::Repr, _) => {
"Unnecessary `repr()` conversion when formatting with `%s`. \
Use `%r` instead of `%s`"
.to_string()
}
(FormatConversion::Ascii, _) => {
"Unnecessary `ascii()` conversion when formatting with `%s`. \
Use `%a` instead of `%s`"
.to_string()
}
(FormatConversion::Bytes, _) => {
"Unnecessary `bytes()` conversion when formatting with `%b`".to_string()
}
}
}
}
@@ -118,12 +143,71 @@ pub(crate) fn logging_eager_conversion(checker: &Checker, call: &ast::ExprCall)
continue;
};
- // Check for use of %s with str()
- if checker.semantic().match_builtin_expr(func.as_ref(), "str")
-     && matches!(format_conversion, FormatConversion::Str)
- {
-     checker
-         .report_diagnostic(LoggingEagerConversion { format_conversion }, arg.range());
+ // Check for various eager conversion patterns
+ match format_conversion {
// %s with str() - remove str() call
FormatConversion::Str
if checker.semantic().match_builtin_expr(func.as_ref(), "str") =>
{
checker.report_diagnostic(
LoggingEagerConversion {
format_conversion,
function_name: None,
},
arg.range(),
);
}
// %s with repr() - suggest using %r instead
FormatConversion::Str
if checker.semantic().match_builtin_expr(func.as_ref(), "repr") =>
{
checker.report_diagnostic(
LoggingEagerConversion {
format_conversion: FormatConversion::Repr,
function_name: None,
},
arg.range(),
);
}
// %s with ascii() - suggest using %a instead
FormatConversion::Str
if checker
.semantic()
.match_builtin_expr(func.as_ref(), "ascii") =>
{
checker.report_diagnostic(
LoggingEagerConversion {
format_conversion: FormatConversion::Ascii,
function_name: None,
},
arg.range(),
);
}
// %s with oct() - suggest using %#o instead
FormatConversion::Str
if checker.semantic().match_builtin_expr(func.as_ref(), "oct") =>
{
checker.report_diagnostic(
LoggingEagerConversion {
format_conversion: FormatConversion::Str,
function_name: Some("oct"),
},
arg.range(),
);
}
// %s with hex() - suggest using %#x instead
FormatConversion::Str
if checker.semantic().match_builtin_expr(func.as_ref(), "hex") =>
{
checker.report_diagnostic(
LoggingEagerConversion {
format_conversion: FormatConversion::Str,
function_name: Some("hex"),
},
arg.range(),
);
}
_ => {}
}
}
}
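The suggested conversion specifiers produce exactly the text the eager calls would, while letting `logging` defer the formatting until a record is actually rendered; the equivalences can be checked with plain `%`-formatting:

```python
# Each lazy conversion specifier matches its eager counterpart:
assert "%r" % ("World!",) == "%s" % (repr("World!"),) == "'World!'"
assert "%a" % ("Héllo",) == "%s" % (ascii("Héllo"),)
assert "%#o" % 42 == "%s" % (oct(42),) == "0o52"
assert "%#x" % 255 == "%s" % (hex(255),) == "0xff"
```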


@@ -21,6 +21,26 @@ RUF065 Unnecessary `str()` conversion when formatting with `%s`
7 | # %s + repr()
|
RUF065 Unnecessary `repr()` conversion when formatting with `%s`. Use `%r` instead of `%s`
--> RUF065.py:8:26
|
7 | # %s + repr()
8 | logging.info("Hello %s", repr("World!"))
| ^^^^^^^^^^^^^^
9 | logging.log(logging.INFO, "Hello %s", repr("World!"))
|
RUF065 Unnecessary `repr()` conversion when formatting with `%s`. Use `%r` instead of `%s`
--> RUF065.py:9:39
|
7 | # %s + repr()
8 | logging.info("Hello %s", repr("World!"))
9 | logging.log(logging.INFO, "Hello %s", repr("World!"))
| ^^^^^^^^^^^^^^
10 |
11 | # %r + str()
|
RUF065 Unnecessary `str()` conversion when formatting with `%s`
--> RUF065.py:22:18
|
@@ -40,3 +60,160 @@ RUF065 Unnecessary `str()` conversion when formatting with `%s`
24 |
25 | # %s + repr()
|
RUF065 Unnecessary `repr()` conversion when formatting with `%s`. Use `%r` instead of `%s`
--> RUF065.py:26:18
|
25 | # %s + repr()
26 | info("Hello %s", repr("World!"))
| ^^^^^^^^^^^^^^
27 | log(logging.INFO, "Hello %s", repr("World!"))
|
RUF065 Unnecessary `repr()` conversion when formatting with `%s`. Use `%r` instead of `%s`
--> RUF065.py:27:31
|
25 | # %s + repr()
26 | info("Hello %s", repr("World!"))
27 | log(logging.INFO, "Hello %s", repr("World!"))
| ^^^^^^^^^^^^^^
28 |
29 | # %r + str()
|
RUF065 Unnecessary `repr()` conversion when formatting with `%s`. Use `%r` instead of `%s`
--> RUF065.py:44:32
|
42 | logging.warning("Value: %r", repr(42))
43 | logging.error("Error: %r", repr([1, 2, 3]))
44 | logging.info("Debug info: %s", repr("test\nstring"))
| ^^^^^^^^^^^^^^^^^^^^
45 | logging.warning("Value: %s", repr(42))
|
RUF065 Unnecessary `repr()` conversion when formatting with `%s`. Use `%r` instead of `%s`
--> RUF065.py:45:30
|
43 | logging.error("Error: %r", repr([1, 2, 3]))
44 | logging.info("Debug info: %s", repr("test\nstring"))
45 | logging.warning("Value: %s", repr(42))
| ^^^^^^^^
46 |
47 | # %s + ascii()
|
RUF065 Unnecessary `ascii()` conversion when formatting with `%s`. Use `%a` instead of `%s`
--> RUF065.py:48:27
|
47 | # %s + ascii()
48 | logging.info("ASCII: %s", ascii("Hello\nWorld"))
| ^^^^^^^^^^^^^^^^^^^^^
49 | logging.warning("ASCII: %s", ascii("test"))
|
RUF065 Unnecessary `ascii()` conversion when formatting with `%s`. Use `%a` instead of `%s`
--> RUF065.py:49:30
|
47 | # %s + ascii()
48 | logging.info("ASCII: %s", ascii("Hello\nWorld"))
49 | logging.warning("ASCII: %s", ascii("test"))
| ^^^^^^^^^^^^^
50 |
51 | # %s + oct()
|
RUF065 Unnecessary `oct()` conversion when formatting with `%s`. Use `%#o` instead of `%s`
--> RUF065.py:52:27
|
51 | # %s + oct()
52 | logging.info("Octal: %s", oct(42))
| ^^^^^^^
53 | logging.warning("Octal: %s", oct(255))
|
RUF065 Unnecessary `oct()` conversion when formatting with `%s`. Use `%#o` instead of `%s`
--> RUF065.py:53:30
|
51 | # %s + oct()
52 | logging.info("Octal: %s", oct(42))
53 | logging.warning("Octal: %s", oct(255))
| ^^^^^^^^
54 |
55 | # %s + hex()
|
RUF065 Unnecessary `hex()` conversion when formatting with `%s`. Use `%#x` instead of `%s`
--> RUF065.py:56:25
|
55 | # %s + hex()
56 | logging.info("Hex: %s", hex(42))
| ^^^^^^^
57 | logging.warning("Hex: %s", hex(255))
|
RUF065 Unnecessary `hex()` conversion when formatting with `%s`. Use `%#x` instead of `%s`
--> RUF065.py:57:28
|
55 | # %s + hex()
56 | logging.info("Hex: %s", hex(42))
57 | logging.warning("Hex: %s", hex(255))
| ^^^^^^^^
|
RUF065 Unnecessary `ascii()` conversion when formatting with `%s`. Use `%a` instead of `%s`
--> RUF065.py:63:19
|
61 | from logging import info, log
62 |
63 | info("ASCII: %s", ascii("Hello\nWorld"))
| ^^^^^^^^^^^^^^^^^^^^^
64 | log(logging.INFO, "ASCII: %s", ascii("test"))
|
RUF065 Unnecessary `ascii()` conversion when formatting with `%s`. Use `%a` instead of `%s`
--> RUF065.py:64:32
|
63 | info("ASCII: %s", ascii("Hello\nWorld"))
64 | log(logging.INFO, "ASCII: %s", ascii("test"))
| ^^^^^^^^^^^^^
65 |
66 | info("Octal: %s", oct(42))
|
RUF065 Unnecessary `oct()` conversion when formatting with `%s`. Use `%#o` instead of `%s`
--> RUF065.py:66:19
|
64 | log(logging.INFO, "ASCII: %s", ascii("test"))
65 |
66 | info("Octal: %s", oct(42))
| ^^^^^^^
67 | log(logging.INFO, "Octal: %s", oct(255))
|
RUF065 Unnecessary `oct()` conversion when formatting with `%s`. Use `%#o` instead of `%s`
--> RUF065.py:67:32
|
66 | info("Octal: %s", oct(42))
67 | log(logging.INFO, "Octal: %s", oct(255))
| ^^^^^^^^
68 |
69 | info("Hex: %s", hex(42))
|
RUF065 Unnecessary `hex()` conversion when formatting with `%s`. Use `%#x` instead of `%s`
--> RUF065.py:69:17
|
67 | log(logging.INFO, "Octal: %s", oct(255))
68 |
69 | info("Hex: %s", hex(42))
| ^^^^^^^
70 | log(logging.INFO, "Hex: %s", hex(255))
|
RUF065 Unnecessary `hex()` conversion when formatting with `%s`. Use `%#x` instead of `%s`
--> RUF065.py:70:30
|
69 | info("Hex: %s", hex(42))
70 | log(logging.INFO, "Hex: %s", hex(255))
| ^^^^^^^^
|
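The RUF065 diagnostics above claim that the `%#o`, `%#x`, and `%a` conversions are drop-in replacements for `%s` applied to `oct()`, `hex()`, and `ascii()`. A minimal sketch (not part of the snapshot) checking that the suggested fixes are value-preserving in Python 3:

```python
# RUF065 suggests replacing "%s" % oct(n) with "%#o" % n, and likewise
# for hex()/%#x and ascii()/%a. The '#' flag selects the alternate form,
# which includes the '0o'/'0x' prefix that oct()/hex() also produce.
assert "Octal: %s" % oct(255) == "Octal: %#o" % 255      # 'Octal: 0o377'
assert "Hex: %s" % hex(255) == "Hex: %#x" % 255          # 'Hex: 0xff'
assert "ASCII: %s" % ascii("test") == "ASCII: %a" % "test"
print("all RUF065 replacements are equivalent")
```

The same equivalence is what makes the rule's autofix safe for the `logging` calls shown above, since `logging` defers `%`-formatting to the handler.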

@@ -1,11 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: attribute name `x` repeated in class pattern
--> <filename>:3:21
|
2 | match x:
3 | case Point(x=1, x=2):
| ^
4 | pass
|

@@ -1,56 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: name `a` is parameter and global
--> <filename>:3:12
|
2 | def f(a):
3 | global a
| ^
4 |
5 | def g(a):
|
invalid-syntax: name `a` is parameter and global
--> <filename>:7:16
|
5 | def g(a):
6 | if True:
7 | global a
| ^
8 |
9 | def h(a):
|
invalid-syntax: name `a` is parameter and global
--> <filename>:15:16
|
13 | def i(a):
14 | try:
15 | global a
| ^
16 | except Exception:
17 | pass
|
invalid-syntax: name `a` is parameter and global
--> <filename>:21:12
|
19 | def f(a):
20 | a = 1
21 | global a
| ^
22 |
23 | def f(a):
|
invalid-syntax: name `a` is parameter and global
--> <filename>:26:12
|
24 | a = 1
25 | a = 2
26 | global a
| ^
27 |
28 | def f(a):
|

@@ -1,9 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: named expression cannot be used within a generic definition
--> <filename>:2:22
|
2 | def f[T](x: int) -> (y := 3): return x
| ^^^^^^
|

@@ -1,10 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: yield expression cannot be used within a generic definition
--> <filename>:2:13
|
2 | class C[T]((yield from [object])):
| ^^^^^^^^^^^^^^^^^^^
3 | pass
|

@@ -1,9 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: yield expression cannot be used within a type alias
--> <filename>:2:11
|
2 | type Y = (yield 1)
| ^^^^^^^
|

@@ -1,9 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: yield expression cannot be used within a TypeVar bound
--> <filename>:2:12
|
2 | type X[T: (yield 1)] = int
| ^^^^^^^
|

@@ -1,10 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: Starred expression cannot be used here
--> <filename>:3:12
|
2 | def func():
3 | return *x
| ^^
|

@@ -1,10 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: Starred expression cannot be used here
--> <filename>:2:5
|
2 | for *x in range(10):
| ^^
3 | pass
|

@@ -1,10 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: Starred expression cannot be used here
--> <filename>:3:11
|
2 | def func():
3 | yield *x
| ^^
|

@@ -1,12 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: name capture `irrefutable` makes remaining patterns unreachable
--> <filename>:3:10
|
2 | match value:
3 | case irrefutable:
| ^^^^^^^^^^^
4 | pass
5 | case 1:
|

@@ -1,12 +0,0 @@
---
source: crates/ruff_linter/src/linter.rs
---
invalid-syntax: wildcard makes remaining patterns unreachable
--> <filename>:3:10
|
2 | match value:
3 | case _:
| ^
4 | pass
5 | case 1:
|

Some files were not shown because too many files have changed in this diff.