Compare commits

...

177 Commits

Author SHA1 Message Date
Zanie Blue
40b4aa28f9 Zizmor 2025-07-03 07:23:11 -05:00
Zanie Blue
ea4bf00c23 Revert "Only run the relevant test"
This reverts commit 82dc27f2680b8085280136848ffe2ee1d2952a4e.
2025-07-03 07:01:28 -05:00
Zanie Blue
7f4aa4b3fb Update for Depot 2025-07-03 07:01:28 -05:00
Zanie Blue
34c98361ae Do not set TMP 2025-07-03 07:01:28 -05:00
Zanie Blue
38bb96a6c2 Remove log variables 2025-07-03 07:01:28 -05:00
Zanie Blue
a014d55455 Remove fuzz corpus hack 2025-07-03 07:01:27 -05:00
Zanie Blue
306f6f17a9 Enable more logs 2025-07-03 07:01:27 -05:00
Zanie Blue
b233888f00 Only run the relevant test 2025-07-03 07:01:27 -05:00
Zanie Blue
540cbd9085 Add debug logs? 2025-07-03 07:01:27 -05:00
Zanie Blue
0112f7f0e4 Use a dev drive for testing on Windows 2025-07-03 07:01:27 -05:00
GiGaGon
d0f0577ac7 [flake8-pyi] Make example error out-of-the-box (PYI014, PYI015) (#19097) 2025-07-03 12:54:35 +01:00
Dhruv Manilawala
dc56c33618 [ty] Initial support for workspace diagnostics (#18939)
## Summary

This PR adds initial support for workspace diagnostics in the ty server.

Reference spec:
https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_diagnostic

This is currently implemented via the **pull diagnostics method**, which
was added in the current version (3.17); the server advertises it via
the `diagnosticProvider.workspaceDiagnostics` server capability.

**Note:** This might be a bit confusing, but workspace diagnostics are
not scoped to a single workspace; they cover all the workspaces that the
server handles. These are the ones that the server received during
initialization. Currently, the ty server doesn't support multiple
workspaces, so this capability is also limited to providing diagnostics
only for a single workspace (the first one, if the client provided
multiple).

A new `ty.diagnosticMode` server setting is added, which can be either
`workspace` (for workspace diagnostics) or `openFilesOnly` (for checking
only open files; the default). This is the same as the
`python.analysis.diagnosticMode` setting that Pyright / Pylance uses. In the
future, we could use the value under the `python.*` namespace as a
fallback, so that users don't have to set the value multiple times.

Part of: astral-sh/ty#81

## Test Plan

This capability was introduced in the current LSP version (~3 years ago),
and the way it's implemented varies a bit between clients. I've
provided notes on what I've noticed and what would need to be done on
our side to further improve the experience.

### VS Code

VS Code sends a `workspace/diagnostic` request every ~2 seconds:

```
[Trace - 12:12:32 PM] Sending request 'workspace/diagnostic - (403)'.
[Trace - 12:12:32 PM] Received response 'workspace/diagnostic - (403)' in 2ms.
[Trace - 12:12:34 PM] Sending request 'workspace/diagnostic - (404)'.
[Trace - 12:12:34 PM] Received response 'workspace/diagnostic - (404)' in 2ms.
[Trace - 12:12:36 PM] Sending request 'workspace/diagnostic - (405)'.
[Trace - 12:12:36 PM] Received response 'workspace/diagnostic - (405)' in 2ms.
[Trace - 12:12:38 PM] Sending request 'workspace/diagnostic - (406)'.
[Trace - 12:12:38 PM] Received response 'workspace/diagnostic - (406)' in 3ms.
[Trace - 12:12:40 PM] Sending request 'workspace/diagnostic - (407)'.
[Trace - 12:12:40 PM] Received response 'workspace/diagnostic - (407)' in 2ms.
...
```

I couldn't really find any resource that explains this behavior. But
this does mean that we'd need to implement the caching layer via the
previous result IDs sooner. This would allow the server to avoid sending
all the diagnostics on every request and instead just respond that the
diagnostics haven't changed. This could possibly be
achieved by using the salsa ID.
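
For illustration, here is a rough sketch (not ty's implementation) of what a single "unchanged" item in a `workspace/diagnostic` response looks like per the LSP 3.17 spec; the URI and `resultId` values are hypothetical placeholders:

```py
# Hypothetical shape of an "unchanged" report in a workspace/diagnostic
# response (LSP 3.17). Returning this instead of a "full" report would let
# the server skip re-sending identical diagnostics for the file.
unchanged_item = {
    "kind": "unchanged",
    "uri": "file:///workspace/example.py",  # placeholder file URI
    "version": 3,                           # document version, if known
    "resultId": "salsa-rev-42",             # placeholder previous-result id
}
```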

If we switch from workspace diagnostics to open-files diagnostics, the
server sends diagnostics only via the `textDocument/diagnostic`
endpoint. Here, when a document containing a diagnostic is closed, the
server sends a publish-diagnostics notification with an empty list to
clear the diagnostics for that document. The issue is that VS Code
doesn't seem to clear the diagnostics in this case even though it
receives the notification. (I'm going to open an issue on the VS Code
side for this today.)


https://github.com/user-attachments/assets/b0c0833d-386c-49f5-8a15-0ac9133e15ed

### Zed

Zed's implementation refreshes the workspace diagnostics whenever the
content of a document changes. This seems like very reasonable behavior,
and I was a bit surprised that VS Code doesn't use this heuristic.


https://github.com/user-attachments/assets/71c7b546-7970-434a-9ba0-4fa620647f6c

### Neovim

Neovim only recently added support for workspace diagnostics
(https://github.com/neovim/neovim/pull/34262, merged ~3 weeks ago) so
it's only available on nightly versions.

The initial support is limited and requires fetching the workspace
diagnostics manually as demonstrated in the video. It doesn't support
refreshing the workspace diagnostics either, so that would need to be
done manually as well. I'm assuming that these are just temporary
limitations and will be addressed before the stable release.


https://github.com/user-attachments/assets/25b4a0e5-9833-4877-88ad-279904fffaf9
2025-07-03 11:04:54 +00:00
Dhruv Manilawala
a95c18a8e1 [ty] Add background request task support (#19041)
## Summary

This PR adds a new trait to support running a request in the background.

Currently, there exists a `BackgroundDocumentRequestHandler` trait, which
is similar but scoped to a specific document (a file in an editor
context). The new `BackgroundRequestHandler` trait is tied to neither a
specific document nor a specific project; it operates on the entire
workspace.

This is added to support running workspace wide requests like computing
the [workspace
diagnostics](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_diagnostic)
or [workspace
symbols](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_symbol).

**Note:** There's a slight difference in what a "workspace" means
between the server and ty. Currently, there's a 1-1 relationship between
a workspace in an editor and the corresponding project database in ty,
but this could change in the future when Micha adds support for multiple
workspaces or multi-root workspaces.

The data required by the request handler (based on implementing
workspace diagnostics) is the list of databases (`ProjectDatabase`)
corresponding to the projects in the workspace and the index (`Index`)
that contains the open documents. The `WorkspaceSnapshot` represents
this and is passed to the handler similarly to `DocumentSnapshot`.

## Test Plan

This is used to implement workspace diagnostics, which is where it is
tested.
2025-07-03 11:01:10 +00:00
David Peter
e212dc2e8e [ty] Restructure/move dataclass tests (#19117)
Before adding even more dataclass-related files, let's organize them
in a separate folder.
2025-07-03 10:36:14 +00:00
Aria Desires
c4f2eec865 [ty] Remove last vestiges of std::path from ty_server (#19088)
Fixes https://github.com/astral-sh/ty/issues/603
2025-07-03 15:18:30 +05:30
Zanie Blue
9fc04d6bf0 Use "python" for markdown code fences in on-hover content (#19082)
Instead of "text".

Closes https://github.com/astral-sh/ty/issues/749

We may not want this because the type display implementations are not
guaranteed to be valid Python. However, unless they're going to
highlight invalid syntax, this seems like a better interim value than
"text"? I'm not the expert though. See
https://github.com/astral-sh/ty/issues/749#issuecomment-3026201114 for
prior commentary.

edit: Going back further to
https://github.com/astral-sh/ruff/pull/17057#discussion_r2028151621 for
prior context, it turns out they _do_ highlight invalid syntax in red,
which is quite unfortunate and probably a blocker here.
2025-07-03 10:50:34 +05:30
Matthew Mckee
352b896c89 [ty] Add subtyping between SubclassOf and CallableType (#19026)
## Summary

Part of https://github.com/astral-sh/ty/issues/129

There were previously some false positives here.
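
As a rough illustration (my example, not from the PR), this is the kind of relation involved: a `type[...]` object is callable, so it should be assignable to a compatible `Callable` type without a false positive:

```py
from typing import Callable


def make_factory(cls: type[int]) -> Callable[..., int]:
    # `type[int]` is callable and constructs an `int`, so returning it where a
    # `Callable[..., int]` is expected should be accepted.
    return cls
```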

## Test Plan

Updated `is_subtype_of.md` and `is_assignable_to.md`
2025-07-02 19:22:31 -07:00
GiGaGon
321575e48f [flake8-pyi] Make example error out-of-the-box (PYI042) (#19101)

## Summary

Part of #18972

This PR makes [snake-case-type-alias
(PYI042)](https://docs.astral.sh/ruff/rules/snake-case-type-alias/#snake-case-type-alias-pyi042)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/8fafec81-2228-4ffe-81e8-1989b724cb47)
```py
type_alias_name: TypeAlias = int
```

[New example](https://play.ruff.rs/b396746c-e6d2-423c-bc13-01a533bb0747)
```py
from typing import TypeAlias

type_alias_name: TypeAlias = int
```

Imports were also added to the "use instead" section.

## Test Plan

N/A, no functionality/tests affected
2025-07-02 22:31:15 +01:00
GiGaGon
066018859f [pyflakes] Fix backslash in docs (F621) (#19098)

## Summary

This fixes the docs for [expressions-in-star-assignment
(F621)](https://docs.astral.sh/ruff/rules/expressions-in-star-assignment/#expressions-in-star-assignment-f621)
having a backslash `\` before the left shifts `<<`. I'm not sure why
this happened in the first place, as the docstring looks fine, but
putting the `<<` inside a code block fixes it. I was not able to track
down the source of the issue either. The only other rule with a `<<` is
[missing-whitespace-around-bitwise-or-shift-operator
(E227)](https://docs.astral.sh/ruff/rules/missing-whitespace-around-bitwise-or-shift-operator/#missing-whitespace-around-bitwise-or-shift-operator-e227),
which already has it in a code block.

Old docs page:

![image](https://github.com/user-attachments/assets/993106c6-5d83-4aed-836b-e252f5b64916)
> In Python 3, no more than 1 \\<< 8 assignments are allowed before a
starred expression, and no more than 1 \\<< 24 expressions are allowed
after a starred expression.

New docs page:

![image](https://github.com/user-attachments/assets/3b40b35f-f39e-49f1-8b2e-262dda4085b4)
> In Python 3, no more than `1 << 8` assignments are allowed before a
starred expression, and no more than `1 << 24` expressions are allowed
after a starred expression.

## Test Plan

N/A, no tests/functionality affected.
2025-07-02 15:00:33 -04:00
David Peter
f76d3f87cf [ty] Allow declared-only class-level attributes to be accessed on the class (#19071)
## Summary

Allow declared-only class-level attributes to be accessed on the class:
```py
class C:
    attr: int

C.attr  # this is now allowed
``` 

closes https://github.com/astral-sh/ty/issues/384
closes https://github.com/astral-sh/ty/issues/553

## Ecosystem analysis


* We see many removed `unresolved-attribute` false-positives for code
that makes use of sqlalchemy, as expected (see changes for `prefect`)
* We see many removed `call-non-callable` false-positives for uses of
`pytest.skip` and similar, as expected
* Most new diagnostics seem to be related to cases like the following,
where we previously inferred `int` for `Derived().x`, but now we infer
`int | None`. I think this should be a
conflicting-declarations/bad-override error anyway? The new behavior may
even be preferred here?
  ```py
  class Base:
      x: int | None
  
  
  class Derived(Base):
      def __init__(self):
          self.x: int = 1
  ```
2025-07-02 18:03:56 +02:00
Micha Reiser
5f426b9f8b [ty] Remove ScopedExpressionId (#19019)
## Summary

The motivation of `ScopedExpressionId` was that we have an expression
identifier that's local to a scope and, therefore, unlikely to change if
a user makes changes in another scope. A local identifier like this has
the advantage that query results may remain unchanged even if other
parts of the file change, which in turn allows Salsa to short-circuit
dependent queries.

However, I noticed that we aren't using `ScopedExpressionId` in a place
where it's important that the identifier is local. Its main use is
inside `infer`, which we always run for the entire file. The one
exception to this is `Unpack`, but unpack runs as part of `infer`.

Edit: The above isn't entirely correct. We used `ScopedExpressionId` in
`TypeInference`, which is a query result. Using `ExpressionNodeKey`
instead does mean that a change to the AST invalidates most, if not all,
`TypeInference` results of a single file. Salsa then has to run all
dependent queries to see if they're affected by this change, even if the
change was local to another scope.

If this locality proves to be important, I suggest that we create two
queries on top of `TypeInference`: one that returns the expression map
(which is mainly used in the linter and type inference) and a second
that returns all remaining fields. This should give us a similar
optimization at a much lower cost.

I also considered removing `ScopedUseId`, but I believe that one is
still useful because using `ExpressionNodeKey` for it instead would mean
that all `UseDefMap`s change when a single AST node changes. Whether
this is important is difficult to assess; I'm simply not familiar
enough with the `UseDefMap`. If the locality doesn't matter for the
`UseDefMap`, then a similar change could be made and `bindings_by_use`
could be changed to an `FxHashMap<UseId, Bindings>` where `UseId` is a
thin wrapper around `NodeKey`.

Closes https://github.com/astral-sh/ty/issues/721
2025-07-02 17:57:32 +02:00
GiGaGon
37ba185c04 [flake8-pyi] Make example error out-of-the-box (PYI059) (#19080)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-02 16:49:54 +01:00
David Peter
93413d3631 [ty] Update docs links (#19092)
Point everything to the new documentation at https://docs.astral.sh/ty/
2025-07-02 17:34:56 +02:00
Zanie Blue
efd9b75352 Avoid reformatting comments in rules reference documentation (#19093)
closes https://github.com/astral-sh/ty/issues/754
2025-07-02 17:16:44 +02:00
David Peter
4cf56d7ad4 [ty] Fix lint summary wording (#19091) 2025-07-02 16:32:11 +02:00
David Peter
4e4e428a95 [ty] Fix link in generate_ty_rules (#19090) 2025-07-02 14:21:32 +00:00
Zanie Blue
522fd4462e Fix header levels in generated settings reference (#19089)
The headers were one level too deep for child items, and the top-level
`rules` header was way off.
2025-07-02 16:01:23 +02:00
David Peter
e599c9d0d3 [ty] Adapt generate_ty_rules for MkDocs (#19087)
## Summary

Adapts the Markdown for the rules-reference documentation page for
MkDocs.
2025-07-02 16:01:10 +02:00
Micha Reiser
e9b5ea71b3 Update Salsa (#19020)
## Summary

This PR updates Salsa to pull in Ibraheem's multithreading improvements (https://github.com/salsa-rs/salsa/pull/921).

## Performance

A small regression for single-threaded benchmarks is expected because
papaya is slightly slower than a `Mutex<FxHashMap>` in the uncontested
case (~10%). However, this shouldn't matter as much in practice because:

1. Salsa has a fast path when using only one DB instance, which is the
common case in production. This fast path is not impacted by the changes,
but we measure the slow paths in our benchmarks (because we use multiple
DB instances).
2. Fixing the 10x slowdown for the contended case (multithreading)
outweighs the downside of a 10% perf regression for single-threaded
use cases, especially considering that ty is heavily multithreaded.

## Test Plan

`cargo test`
2025-07-02 09:55:37 -04:00
Ibraheem Ahmed
ebc70a4002 [ty] Support LSP go-to with vendored typeshed stubs (#19057)
## Summary

Extracts the vendored typeshed stubs lazily and caches them on the local
filesystem to support go-to in the LSP.

Resolves https://github.com/astral-sh/ty/issues/77.
2025-07-02 07:58:58 -04:00
Micha Reiser
f7fc8fb084 [ty] Request configuration from client (#18984)
## Summary

This PR makes the necessary changes to the server so that it can request
configurations from the client using the `configuration` request.
This PR doesn't make use of the request yet; it only sets up the
foundation (mainly the coordination between client and server)
so that future PRs can pull specific settings.

I plan to use this for pulling the Python environment from the Python
extension.

Deno does something very similar to this.

## Test Plan

Tested that diagnostics are still shown.
2025-07-02 14:31:41 +05:30
GiGaGon
cdf91b8b74 [flake8-pyi] Make example error out-of-the-box (PYI062) (#19079)

## Summary

Part of #18972

This PR makes [duplicate-literal-member
(PYI062)](https://docs.astral.sh/ruff/rules/duplicate-literal-member/#duplicate-literal-member-pyi062)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/6b00b41c-c1c5-4421-873d-fc2a143e7337)
```py
foo: Literal["a", "b", "a"]
```

[New example](https://play.ruff.rs/1aea839b-9ae8-4848-bb83-2637e1a68ce4)
```py
from typing import Literal

foo: Literal["a", "b", "a"]
```

Imports were also added to the "use instead" section.

## Test Plan

N/A, no functionality/tests affected
2025-07-02 08:21:39 +01:00
Dhruv Manilawala
d1e705738e [ty] Log target names at trace level (#19084)
Follow-up to https://github.com/astral-sh/ruff/pull/19083: also log
target names like `ty_python_semantic::module_resolver::resolver` (as in
`2025-07-02 10:12:20.188697000 DEBUG
ty_python_semantic::module_resolver::resolver: Adding first-party search
path '/Users/dhruv/playground/ty_server'`) at trace level.
2025-07-02 04:49:36 +00:00
Dhruv Manilawala
c3d9b21db5 [ty] Use better datetime format for server logs (#19083)
This PR improves the timestamp format for ty server logs to be the same as Ruff's.

Ref: https://github.com/astral-sh/ruff/pull/16389
2025-07-02 04:39:12 +00:00
Alex Waygood
316c1b21e2 [ty] Add some missing calls to normalized_impl (#19074)
## Summary

I hoped this might fix the latest stack overflows on
https://github.com/astral-sh/ruff/pull/18659... it doesn't look like it
does, but these changes seem like they're probably correct anyway...?

## Test Plan

2025-07-01 17:57:52 +01:00
Brent Westbrook
47733c0647 Remove new codspeed dependencies (#19073)
Summary
--

Updates to codspeed 3.0.2, removing some of the new dependencies
introduced in 3.0. See https://github.com/astral-sh/uv/pull/14396 and
https://github.com/CodSpeedHQ/codspeed-rust/pull/108 for the similar PR
in uv and the upstream fix, respectively.

Test Plan
--

N/a
2025-07-01 12:20:31 -04:00
NamelessGO
48366a7bbb Docs: Add Anki to Who's Using Ruff (#19072)
https://github.com/ankitects/anki/pull/4119


## Summary

Add Anki to Who's Using Ruff (README)

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-01 15:44:04 +00:00
GiGaGon
cc736c3a51 [refurb] Fix false positive on empty tuples (FURB168) (#19058)

## Summary

This PR fixes #19047 / the [isinstance-type-none
(FURB168)](https://docs.astral.sh/ruff/rules/isinstance-type-none/#isinstance-type-none-furb168)
tuple false positive by adding a check for whether the tuple is empty.
I also noticed another false positive with the other tuple check in the
same function, so I fixed it the same way.
`Union[()]` is invalid at runtime with `TypeError: Cannot take a Union
of no types.`, but it is accepted by `basedpyright`
[playground](https://basedpyright.com/?pythonVersion=3.8&typeCheckingMode=all&code=GYJw9gtgBALgngBwJYDsDmUkQWEMoCqKSYKAsAFAgCmAbtQIYA2A%2BvAtQBREkoDanAJQBdQUA)
and is equivalent to `Never`, so I fixed it anyway. I'm getting on a
side tangent here, but it looks like MyPy doesn't accept it, and ty
[playground](https://play.ty.dev/c2c468b6-38e4-4dd9-a9fa-0276e843e395)
gives `@Todo`.
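
For illustration, a minimal sketch (mine, based on the issue) of the empty-tuple case that is no longer flagged:

```py
def check(value):
    # An empty tuple of classes can never match, so suggesting `value is None`
    # here was a false positive; FURB168 now skips empty tuples.
    return isinstance(value, ())
```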

## Test Plan

Added two test cases for the two false positives.
[playground](https://play.ruff.rs/a53afc21-9a1d-4b9b-9346-abfbeabeb449)
2025-07-01 10:26:41 -04:00
GiGaGon
8cc14ad02d [flake8-datetimez] Make DTZ901 example error out-of-the-box (#19056)

## Summary

Part of #18972

This PR makes [datetime-min-max
(DTZ901)](https://docs.astral.sh/ruff/rules/datetime-min-max/#datetime-min-max-dtz901)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/c1202727-1a18-4d3f-92a4-334ede07ed3e)
```py
datetime.max
```

[New example](https://play.ruff.rs/af2c76aa-9beb-46bc-8e27-faf53ecdbe8c)
```py
import datetime

datetime.datetime.max
```

I also added imports to the problem demonstration and use instead.

## Test Plan

N/A, no functionality/tests affected
2025-07-01 09:57:34 -04:00
Илья Любавский
667dc62038 [ruff] Fix syntax error introduced for an empty string followed by a u-prefixed string (UP025) (#18899)
## Summary
Closes #18895
## Test Plan

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-01 09:34:08 -04:00
David Peter
dac4e356eb [ty] Use all reachable bindings for instance attributes and deferred lookups (#18955)
## Summary

Remove a hack in control flow modeling that was treating `return`
statements at the end of function bodies in a special way (basically
considering the state *just before* the `return` statement as the
end-of-scope state). This is not needed anymore now that #18750 has been
merged.

In order to make this work, we now use *all reachable bindings* for
purposes of finding implicit instance attribute assignments as well as
for deferred lookups of symbols. Both would otherwise be affected by
this change:
```py
class C:
    def f(self):
        self.x = 1  # a reachable binding that is not visible at the end of the scope
        return
```

```py
def f():
    class X: ...  # a reachable binding that is not visible at the end of the scope
    x: "X" = X()  # deferred use of `X`
    return
```

Implicit instance attributes also required another change. We previously
kept track of possibly-unbound instance attributes in some cases, but we
now give up on that completely and always consider *implicit* instance
attributes to be bound if we see a reachable binding in a reachable
method. The previous behavior was somewhat inconsistent anyway because
we also do not consider attributes possibly-unbound in other scenarios:
we do not (and cannot) keep track of whether the methods that define
these attributes are ever called.
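
For example (my sketch of the patterns mentioned in the ecosystem analysis below), an attribute assigned inside a `for` loop in a reachable method is now treated as bound rather than possibly unbound:

```py
class C:
    def update(self, items: list[int]) -> None:
        for item in items:
            # A reachable binding in a reachable method: `value` is now
            # considered bound on instances of `C`, even though the loop body
            # might not execute at runtime.
            self.value = item
```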

closes https://github.com/astral-sh/ty/issues/711

## Ecosystem analysis

I think this looks very positive!

* We see an unsurprising drop in `possibly-unbound-attribute`
diagnostics (599), mostly for classes that define attributes in `try …
except` blocks, `for` loops, or `if … else: raise …` constructs. There
might obviously also be true positives that got removed, but the vast
majority should be false positives.
* There is also a drop in `possibly-unresolved-reference` /
`unresolved-reference` diagnostics (279+13) from the change to deferred
lookups.
* Some `invalid-type-form` false positives got resolved (13), because we
can now properly look up the names in the annotations.
* There are some new *true* positives in `attrs`, since we understand
the `Attribute` annotation that was previously inferred as `Unknown`
because of a re-assignment after the class definition.


## Test Plan

The existing attributes.md test suite has sufficient coverage here.
2025-07-01 14:38:36 +02:00
Alex Waygood
ebf59e2bef [ty] Rework disjointness of protocol instances vs types with possibly unbound attributes (#19043) 2025-07-01 12:47:27 +01:00
Alex Waygood
c6fd11fe36 [ty] Eagerly evaluate more constraints based on the raw AST (#19068) 2025-07-01 10:17:22 +00:00
David Peter
7d468ee58a [ty] Model reachability of star import definitions for nonlocal lookups (#19066)
## Summary

Temporarily modify `UseDefMapBuilder::reachability` for star imports in
order for new definitions to pick up the right reachability. This was
already working for `UseDefMapBuilder::place_states`, but not for
`UseDefMapBuilder::reachable_definitions`.

closes https://github.com/astral-sh/ty/issues/728

## Test Plan

Regression test
2025-07-01 11:06:37 +02:00
David Peter
4016521bf6 [ty] Eagerly evaluate TYPE_CHECKING constraints (#19044)
## Summary

Evaluate `TYPE_CHECKING` to `ALWAYS_TRUE` and `not TYPE_CHECKING` to
`ALWAYS_FALSE` during semantic index building. This is a follow-up to
https://github.com/astral-sh/ruff/pull/18998 and is in principle just a
performance optimization. We see some (favorable) ecosystem changes
because we can eliminate definitely-unreachable branches early now and
retain narrowing constraints without solving
https://github.com/astral-sh/ty/issues/690 first.
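
A small sketch (mine, not from the PR) of the kind of code this affects: the `if TYPE_CHECKING:` branch is now eagerly known to be taken during semantic index building, so the import below is treated as reachable for deferred annotation lookups:

```py
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # This branch now eagerly evaluates to ALWAYS_TRUE, so `Helper` is
    # considered defined when the annotation below is resolved.
    from helpers import Helper  # hypothetical module


def f(h: "Helper") -> None: ...
```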
2025-07-01 11:05:52 +02:00
GiGaGon
b8653a9d3a [flake8-pyi] Make PYI032 example error out-of-the-box (#19061) 2025-07-01 07:50:58 +01:00
github-actions[bot]
966adca6f6 [ty] Sync vendored typeshed stubs (#19060)
Close and reopen this PR to trigger CI

Co-authored-by: typeshedbot <>
2025-07-01 07:45:06 +01:00
renovate[bot]
77941af1c6 Update Rust crate get-size2 to v0.5.1 (#19035)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [get-size2](https://redirect.github.com/bircni/get-size2) | workspace.dependencies | patch | `0.5.0` -> `0.5.1` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>bircni/get-size2 (get-size2)</summary>

###
[`v0.5.1`](https://redirect.github.com/bircni/get-size2/blob/HEAD/CHANGELOG.md#051---2025-06-25)

[Compare
Source](https://redirect.github.com/bircni/get-size2/compare/0.5.0...0.5.1)

##### Bug Fixes

- correctly determine size for enums
([#&#8203;24](https://redirect.github.com/bircni/get-size2/issues/24)) -
([3c5bd18](3c5bd18cac))
- Nicolas

##### Miscellaneous Chores

- add top-level `heap_size` function
([#&#8203;25](https://redirect.github.com/bircni/get-size2/issues/25)) -
([f3b5e6e](f3b5e6e38c))
- Ibraheem Ahmed

##### Build

- update to newer cargo-verset to set dependency version automatically -
([b1154e4](b1154e4572))
- Nicolas

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 23:17:36 -04:00
Dylan
4bc170a5c1 Make dependency get-size2 truly optional in ruff_python_ast (#19052)
Gates all uses of `get-size2` behind the feature `get-size` in the crate
`ruff_python_ast`. Also requires that `ruff_text_size` is pulled in with
the feature `get-size` enabled if we enable the same-named feature for
`ruff_python_ast`.
2025-06-30 21:50:59 -05:00
Robsdedude
28ab61d885 [pyupgrade] Avoid PEP-604 unions with typing.NamedTuple (UP007, UP045) (#18682)

## Summary
Make `UP045` ignore `Optional[NamedTuple]` as `NamedTuple` is a function
(not a proper type). Rewriting it to `NamedTuple | None` breaks at
runtime. While type checkers currently accept `NamedTuple` as a type,
they arguably shouldn't. Therefore, we outright ignore it and don't
touch or lint on it.

For a more detailed discussion, see the linked issue.
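
A minimal example (mine) of why the rewrite breaks: `typing.NamedTuple` is a function, so it doesn't support the `|` operator at runtime:

```py
from typing import NamedTuple, Optional

x: Optional[NamedTuple] = None  # UP045 now leaves this alone
# y: NamedTuple | None = None   # would raise at import time:
#                               # TypeError: unsupported operand type(s) for |
```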

## Test Plan
Added examples to the existing tests.

## Related Issues
Fixes: https://github.com/astral-sh/ruff/issues/18619
2025-06-30 17:22:23 -04:00
GiGaGon
4963835d0d [flake8-bandit] Make S604 and S609 examples error out-of-the-box (#19049)

## Summary

Part of #18972

Both in one PR since they are in the same file.

S604
---

This PR makes [call-with-shell-equals-true
(S604)](https://docs.astral.sh/ruff/rules/call-with-shell-equals-true/#call-with-shell-equals-true-s604)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/a054fb79-7653-47f7-9ab5-3d8b7540c810)
```py
import subprocess

user_input = input("Enter a command: ")
subprocess.run(user_input, shell=True)
```

[New example](https://play.ruff.rs/6fea81b4-e745-4b85-8bea-faaabea5c86d)
```py
import my_custom_subprocess

user_input = input("Enter a command: ")
my_custom_subprocess.run(user_input, shell=True)
```

The old example doesn't raise `S604` because it gets overwritten by
[subprocess-popen-with-shell-equals-true
(S602)](https://docs.astral.sh/ruff/rules/subprocess-popen-with-shell-equals-true/#subprocess-popen-with-shell-equals-true-s602)
(which is a good idea to prevent two lints saying the same thing from
being raised)

S609
---

This PR makes [unix-command-wildcard-injection
(S609)](https://docs.astral.sh/ruff/rules/unix-command-wildcard-injection/#unix-command-wildcard-injection-s609)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/849860fa-0d12-4916-bdbc-64a0fa14cd9b)
```py
import subprocess

subprocess.Popen(["chmod", "777", "*.py"])
```

[New example](https://play.ruff.rs/77a54d7c-cf78-4158-bcf8-96dd698cf366)
```py
import subprocess

subprocess.Popen(["chmod", "777", "*.py"], shell=True)
```

I'm not familiar enough with `subprocess` to know why `shell=True` is
required to make `S609` raise here, but it works.

## Test Plan

N/A, no functionality/tests affected
2025-06-30 16:10:14 -05:00
GiGaGon
09fa80f94c [flake8-datetimez] Make DTZ011 example error out-of-the-box (#19055)

## Summary

Part of #18972

This PR makes [call-date-today
(DTZ011)](https://docs.astral.sh/ruff/rules/call-date-today/#call-date-today-dtz011)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/b42d6aef-7777-4b3b-9f96-19132000b765)
```py
import datetime

datetime.datetime.today()
```

[New example](https://play.ruff.rs/8577c3c1-cfa8-425b-b1e1-4c53b2a48375)
```py
import datetime

datetime.date.today()
```

## Test Plan

N/A, no functionality/tests affected
2025-06-30 15:54:04 -05:00
GiGaGon
fde82fc563 [flake8-bugbear] Make B028 example error out-of-the-box (#19054)

## Summary

Part of #18972

This PR makes [no-explicit-stacklevel
(B028)](https://docs.astral.sh/ruff/rules/no-explicit-stacklevel/#no-explicit-stacklevel-b028)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/1ee80aec-2d6e-4a3f-8e98-da82b6a9f544)
```py
warnings.warn("This is a warning")
```

[New example](https://play.ruff.rs/343593aa-38a0-4d76-a32b-5abd0a4306cc)
```py
import warnings

warnings.warn("This is a warning")
```

Imports were also added to the "use instead" section

## Test Plan

N/A, no functionality/tests affected
2025-06-30 15:49:40 -05:00
GiGaGon
96decb17a9 [flake8-bugbear] Make B911 example error out-of-the-box (#19051)

## Summary

Part of #18972

This PR makes [batched-without-explicit-strict
(B911)](https://docs.astral.sh/ruff/rules/batched-without-explicit-strict/#batched-without-explicit-strict-b911)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/a897d96b-0749-4291-8a62-dfd4caf290a0)
```py
itertools.batched(iterable, n)
```

[New example](https://play.ruff.rs/1c1e0ab7-014c-4dc2-abed-c2cb6cd01f70)
```py
import itertools

itertools.batched(iterable, n)
```

Imports were also added to the "use instead" sections

## Test Plan

N/A, no functionality/tests affected
2025-06-30 15:48:02 -05:00
Carl Meyer
2ae0bd9464 [ty] Normalize recursive types using Any (#19003)
## Summary

This just replaces one temporary solution to recursive protocols (the
`SelfReference` mechanism) with another one (track seen types when
recursively descending in `normalize` and replace recursive references
with `Any`). But this temporary solution can handle mutually-recursive
types, not just self-referential ones, and it's sufficient for the
primer ecosystem and some other projects we are testing on to no longer
stack overflow.

The follow-up here will be to properly handle these self-references
instead of replacing them with `Any`.

We will also eventually need cycle detection on more recursive-descent
type transformations and tests.
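
A rough sketch (mine, not the added mdtest) of the mutually-recursive shape that previously overflowed:

```py
from typing import Protocol


class A(Protocol):
    def other(self) -> "B": ...


class B(Protocol):
    def other(self) -> "A": ...


# Normalizing `A` requires normalizing `B` and vice versa; the recursive
# references are now replaced with `Any` instead of recursing forever.
```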

## Test Plan

Existing tests (including recursive-protocol tests) and primer.

Added mdtest for mutually-recursive protocols that stack-overflowed
before this PR.
2025-06-30 12:07:57 -07:00
Robsdedude
34052a1185 [flake8-comprehensions] Fix C420 to prepend whitespace when needed (#18616)

## Summary
This PR fixes rule C420's autofix. The fix replaces `{...}` with
`dict....(...)`; therefore, if there is an identifier or similar token
right before the fix range, the fix fuses that previous token with `dict...`.

The example in the issue is
```python
0 or{x: None for x in "x"}
# gets "fixed" to
0 ordict.fromkeys(iterable)
```
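
With whitespace prepended, the fix presumably now produces something like:

```py
0 or dict.fromkeys("x")
```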

## Related Issues

Fixes: https://github.com/astral-sh/ruff/issues/18599
2025-06-30 12:38:26 -04:00
Dan Parizher
9f0d3cca89 [pydocstyle] Fix D413 infinite loop for parenthesized docstring (#18930)

## Summary

Fixes #18908
2025-06-30 10:49:13 -04:00
Robsdedude
eb9d9c3646 [perflint] Fix PERF403 panic on attribute or subscription loop variable (#19042)
## Summary

Fixes: https://github.com/astral-sh/ruff/issues/19005

## Test Plan

Reproducer from issue report plus some extra cases that would cause the
panic were added.
2025-06-30 10:47:49 -04:00
Adrien Cacciaguerra
4fbf7e9de8 Bump CodSpeed to v3 (#19046)
## Summary

<!-- What's the purpose of the change? What does it do, and why? -->
As discussed on Slack, there was an issue with the walltime metrics with
`divan`. The issue was fixed with the latest version of `cargo-codspeed`
and `codspeed-divan-compat`:
https://github.com/CodSpeedHQ/codspeed-rust/releases/tag/v3.0.0.

This PR updates all crates related to CodSpeed. A performance increase
in the following benchmarks is expected, as the correct metric will now
be used.

```
crates/ruff_benchmark/benches/ty_walltime.rs::multithreaded[pydantic]
crates/ruff_benchmark/benches/ty_walltime.rs::small[altair]
crates/ruff_benchmark/benches/ty_walltime.rs::small[freqtrade]
crates/ruff_benchmark/benches/ty_walltime.rs::small[pydantic]
crates/ruff_benchmark/benches/ty_walltime.rs::small[tanjun]
```

Once this is merged, we will update the historic data of the affected
benchmark on the CodSpeed UI, so that no false positives will appear.
2025-06-30 10:11:23 -04:00
GiGaGon
b23b4071eb [flake8-async] Make ASYNC220, ASYNC221, and ASYNC222 examples error out-of-the-box (#18978)

## Summary

Part of #18972

All three in one PR since they are in the same file.

This PR makes [create-subprocess-in-async-function
(ASYNC220)](https://docs.astral.sh/ruff/rules/create-subprocess-in-async-function/#create-subprocess-in-async-function-async220)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/465036af-d75f-4bda-ba24-e50e8618bf16)
```py
async def foo():
    os.popen(cmd)
```

[New example](https://play.ruff.rs/8cf43d50-f9e1-45d6-b711-968c7135f2e0)
```py
import os


async def foo():
    os.popen(cmd)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

This PR makes [run-process-in-async-function
(ASYNC221)](https://docs.astral.sh/ruff/rules/run-process-in-async-function/#run-process-in-async-function-async221)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/0698aaa1-c722-4f04-b56c-61edec06945c)
```py
async def foo():
    subprocess.run(cmd)
```

[New example](https://play.ruff.rs/e05bfcbc-e681-4a28-8f50-2c0c2537d038)
```py
import subprocess


async def foo():
    subprocess.run(cmd)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

This PR makes [wait-for-process-in-async-function
(ASYNC222)](https://docs.astral.sh/ruff/rules/wait-for-process-in-async-function/#wait-for-process-in-async-function-async222)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/4305d477-8995-462d-83ae-435731d71e67)
```py
async def foo():
    os.waitpid(0)
```

[New example](https://play.ruff.rs/ad10c042-3b18-49ca-8f5c-5ab720516da1)
```py
import os


async def foo():
    os.waitpid(0)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

## Test Plan

N/A, no functionality/tests affected
2025-06-30 09:47:29 -04:00
GiGaGon
462dbadee4 [Airflow] Make AIR302 example error out-of-the-box (#18988)

## Summary

Part of #18972

This PR makes [airflow3-moved-to-provider
(AIR302)](https://docs.astral.sh/ruff/rules/airflow3-moved-to-provider/#airflow3-moved-to-provider-air302)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/1026c008-57bc-4330-93b9-141444f2a611)
```py
from airflow.auth.managers.fab.fab_auth_manage import FabAuthManager
```

[New example](https://play.ruff.rs/b690e809-a81d-4265-9fde-1494caa0b7fd)
```py
from airflow.auth.managers.fab.fab_auth_manager import FabAuthManager

fab_auth_manager_app = FabAuthManager().get_fastapi_app()
```

## Test Plan

N/A, no functionality/tests affected
2025-06-30 09:45:15 -04:00
Robsdedude
a3638b3adc [pyupgrade] Mark UP008 fix safe if no comments in range (#18683)

## Summary
Mark `UP008`'s fix safe if it won't delete comments.
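
For reference, a hedged sketch (mine) of the call this rule rewrites; the fix is now marked safe when no comments would be removed:

```py
class Parent:
    def greet(self) -> None: ...


class Child(Parent):
    def greet(self) -> None:
        # UP008: `super(Child, self)` can be rewritten to `super()`; the fix
        # is safe here because there are no comments inside the call.
        super(Child, self).greet()
```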

## Relevant Issues
Fixes: https://github.com/astral-sh/ruff/issues/18533

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-06-30 09:42:05 -04:00
GiGaGon
f857546aeb [flake8-bandit] Make S201 example error out-of-the-box (#19017)

## Summary

Part of #18972

This PR makes [flask-debug-true
(S201)](https://docs.astral.sh/ruff/rules/flask-debug-true/#flask-debug-true-s201)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/d5e1a013-1107-4223-9094-0e8393ad3c64)
```py
import flask

app = Flask()

app.run(debug=True)
```

[New example](https://play.ruff.rs/c4aebd2c-0448-4471-8bad-3e38ace68367)
```py
from flask import Flask

app = Flask()

app.run(debug=True)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

## Test Plan

N/A, no functionality/tests affected
2025-06-30 09:39:59 -04:00
हिमांशु
d78f18cda9 [flake8-executable] Allow uvx in shebang line (EXE003) (#18967)
## Summary
closes #18902 

## Test Plan
I have added a test case
2025-06-30 09:38:18 -04:00
David Peter
db3dcd8ad6 [ty] Eagerly simplify 'True' and 'False' constraints (#18998)
## Summary

Simplifies literal `True` and `False` conditions to `ALWAYS_TRUE` /
`ALWAYS_FALSE` during semantic index building. This allows us to eagerly
evaluate more constraints, which should help with performance (looks
like there is a tiny 1% improvement in instrumented benchmarks), but
also allows us to eliminate definitely-unreachable branches in
control-flow merging. This can lead to better type inference in some
cases because it allows us to retain narrowing constraints without
solving https://github.com/astral-sh/ty/issues/690 first:
```py
def _(c: int | None):
    if c is None:
        assert False
    
    reveal_type(c)  # int, previously: int | None
```

closes https://github.com/astral-sh/ty/issues/713

## Test Plan

* Regression test for https://github.com/astral-sh/ty/issues/713
* Made sure that all ecosystem diffs trace back to removed false
positives
2025-06-30 13:11:52 +02:00
David Peter
54769ac9f9 [ty] While loop modeling cleanup (#18994)
## Summary

I found the previous code here very confusing, and it also did some
unnecessary work. Hopefully this is a bit easier to understand.
2025-06-30 11:38:25 +02:00
renovate[bot]
c80762debd Update pre-commit dependencies (#19038)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit) | repository | minor | `v0.11.13` -> `v0.12.1` |
| [python-jsonschema/check-jsonschema](https://redirect.github.com/python-jsonschema/check-jsonschema) | repository | patch | `0.33.0` -> `0.33.1` |
| [rbubley/mirrors-prettier](https://redirect.github.com/rbubley/mirrors-prettier) | repository | minor | `v3.5.3` -> `v3.6.2` |
| [woodruffw/zizmor-pre-commit](https://redirect.github.com/woodruffw/zizmor-pre-commit) | repository | minor | `v1.9.0` -> `v1.10.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.1`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.1)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.0...v0.12.1)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.1

###
[`v0.12.0`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.0)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.11.13...v0.12.0)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.0

</details>

<details>
<summary>python-jsonschema/check-jsonschema
(python-jsonschema/check-jsonschema)</summary>

###
[`v0.33.1`](https://redirect.github.com/python-jsonschema/check-jsonschema/blob/HEAD/CHANGELOG.rst#0331)

[Compare
Source](https://redirect.github.com/python-jsonschema/check-jsonschema/compare/0.33.0...0.33.1)

- Update vendored schemas: bamboo-spec, bitbucket-pipelines, circle-ci,
cloudbuild,
compose-spec, dependabot, drone-ci, github-actions, github-workflows,
gitlab-ci,
mergify, readthedocs, renovate, taskfile, travis, woodpecker-ci
(2025-06-22)
- Fix: support `click==8.2.0`
- Fix a bug in `Last-Modified` header parsing which used local time and
could
  result in improper caching. Thanks :user:`fenuks`! (:pr:`565`)

</details>

<details>
<summary>rbubley/mirrors-prettier (rbubley/mirrors-prettier)</summary>

###
[`v3.6.2`](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.1...v3.6.2)

[Compare
Source](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.1...v3.6.2)

###
[`v3.6.1`](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.0...v3.6.1)

[Compare
Source](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.0...v3.6.1)

###
[`v3.6.0`](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.5.3...v3.6.0)

[Compare
Source](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.5.3...v3.6.0)

</details>

<details>
<summary>woodruffw/zizmor-pre-commit
(woodruffw/zizmor-pre-commit)</summary>

###
[`v1.10.0`](https://redirect.github.com/zizmorcore/zizmor-pre-commit/releases/tag/v1.10.0)

[Compare
Source](https://redirect.github.com/woodruffw/zizmor-pre-commit/compare/v1.9.0...v1.10.0)

See: https://github.com/zizmorcore/zizmor/releases/tag/v1.10.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2025-06-30 08:43:39 +00:00
Robsdedude
4103d73224 Minor code simplification (#19022)
When inside a typing only annotation, the code is always inside an
annotation, too.
2025-06-30 13:42:59 +05:30
renovate[bot]
bedb53daec Update dependency smol-toml to v1.4.0 (#19037)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
| [smol-toml](https://redirect.github.com/squirrelchat/smol-toml) | [`1.3.4` -> `1.4.0`](https://renovatebot.com/diffs/npm/smol-toml/1.3.4/1.4.0) | [![age](https://developer.mend.io/api/mc/badges/age/npm/smol-toml/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![adoption](https://developer.mend.io/api/mc/badges/adoption/npm/smol-toml/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![passing](https://developer.mend.io/api/mc/badges/compatibility/npm/smol-toml/1.3.4/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/smol-toml/1.3.4/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>squirrelchat/smol-toml (smol-toml)</summary>

###
[`v1.4.0`](https://redirect.github.com/squirrelchat/smol-toml/releases/tag/v1.4.0)

[Compare
Source](https://redirect.github.com/squirrelchat/smol-toml/compare/v1.3.4...v1.4.0)

This release introduces better support for integers, courtesy of
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor)! It is
now possible to parse integers that are larger than 53 bits as BigInts.
It is also possible to parse all integers as BigInts, enabling full type
preservation of TOML documents.

The project is now tested against Node 24 as well, ensuring stability on
the latest versions of Node.js.

##### What's Changed

- Use more detailed TOML types by
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) in
[https://github.com/squirrelchat/smol-toml/pull/40](https://redirect.github.com/squirrelchat/smol-toml/pull/40)
- Serialize all numbers as floats by
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) in
[https://github.com/squirrelchat/smol-toml/pull/42](https://redirect.github.com/squirrelchat/smol-toml/pull/42)
- Use bigint for large numbers by
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) in
[https://github.com/squirrelchat/smol-toml/pull/41](https://redirect.github.com/squirrelchat/smol-toml/pull/41)

##### New Contributors

- [@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) made
their first contribution in
[https://github.com/squirrelchat/smol-toml/pull/40](https://redirect.github.com/squirrelchat/smol-toml/pull/40)

**Full Changelog**:
https://github.com/squirrelchat/smol-toml/compare/v1.3.4...v1.4.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:37:20 +05:30
med1844
0ec2ad2fa5 [ty] Emit error for invalid binary operations in type expressions (#18991)
## Summary

This PR adds a diagnostic for invalid binary operators in type
expressions. It should close https://github.com/astral-sh/ty/issues/706
if merged.

Please feel free to suggest better wordings for the diagnostic message.
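
A small sketch (mine) of an annotation that now produces the diagnostic:

```py
from __future__ import annotations  # keep the bad annotation from evaluating at runtime


def f(x: int + str) -> None:  # binary operator in a type expression: now reported as an error
    ...
```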

## Test Plan

I modified `mdtest/annotations/invalid.md`, added a test for each binary
operator, and fixed tests that were broken by the new diagnostic.
2025-06-30 10:06:01 +02:00
renovate[bot]
9469a982cc Update dependency ruff to v0.12.1 (#19032)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
| [ruff](https://docs.astral.sh/ruff) ([source](https://redirect.github.com/astral-sh/ruff), [changelog](https://redirect.github.com/astral-sh/ruff/blob/main/CHANGELOG.md)) | `==0.12.0` -> `==0.12.1` | [![age](https://developer.mend.io/api/mc/badges/age/pypi/ruff/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![adoption](https://developer.mend.io/api/mc/badges/adoption/pypi/ruff/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![passing](https://developer.mend.io/api/mc/badges/compatibility/pypi/ruff/0.12.0/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/ruff/0.12.0/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>astral-sh/ruff (ruff)</summary>

###
[`v0.12.1`](https://redirect.github.com/astral-sh/ruff/blob/HEAD/CHANGELOG.md#0121)

[Compare
Source](https://redirect.github.com/astral-sh/ruff/compare/0.12.0...0.12.1)

##### Preview features

- \[`flake8-errmsg`] Extend `EM101` to support byte strings
([#&#8203;18867](https://redirect.github.com/astral-sh/ruff/pull/18867))
- \[`flake8-use-pathlib`] Add autofix for `PTH202`
([#&#8203;18763](https://redirect.github.com/astral-sh/ruff/pull/18763))
- \[`pygrep-hooks`] Add `AsyncMock` methods to `invalid-mock-access`
(`PGH005`)
([#&#8203;18547](https://redirect.github.com/astral-sh/ruff/pull/18547))
- \[`pylint`] Ignore `__init__.py` files in (`PLC0414`)
([#&#8203;18400](https://redirect.github.com/astral-sh/ruff/pull/18400))
- \[`ruff`] Trigger `RUF037` for empty string and byte strings
([#&#8203;18862](https://redirect.github.com/astral-sh/ruff/pull/18862))
- \[formatter] Fix missing blank lines before decorated classes in
`.pyi` files
([#&#8203;18888](https://redirect.github.com/astral-sh/ruff/pull/18888))

##### Bug fixes

- Avoid generating diagnostics with per-file ignores
([#&#8203;18801](https://redirect.github.com/astral-sh/ruff/pull/18801))
- Handle parenthesized arguments in `remove_argument`
([#&#8203;18805](https://redirect.github.com/astral-sh/ruff/pull/18805))
- \[`flake8-logging`] Avoid false positive for `exc_info=True` outside
`logger.exception` (`LOG014`)
([#&#8203;18737](https://redirect.github.com/astral-sh/ruff/pull/18737))
- \[`flake8-pytest-style`] Enforce `pytest` import for decorators
([#&#8203;18779](https://redirect.github.com/astral-sh/ruff/pull/18779))
- \[`flake8-pytest-style`] Mark autofix for `PT001` and `PT023` as
unsafe if there's comments in the decorator
([#&#8203;18792](https://redirect.github.com/astral-sh/ruff/pull/18792))
- \[`flake8-pytest-style`] `PT001`/`PT023` fix makes syntax error on
parenthesized decorator
([#&#8203;18782](https://redirect.github.com/astral-sh/ruff/pull/18782))
- \[`flake8-raise`] Make fix unsafe if it deletes comments (`RSE102`)
([#&#8203;18788](https://redirect.github.com/astral-sh/ruff/pull/18788))
- \[`flake8-simplify`] Fix `SIM911` autofix creating a syntax error
([#&#8203;18793](https://redirect.github.com/astral-sh/ruff/pull/18793))
- \[`flake8-simplify`] Fix false negatives for shadowed bindings
(`SIM910`, `SIM911`)
([#&#8203;18794](https://redirect.github.com/astral-sh/ruff/pull/18794))
- \[`flake8-simplify`] Preserve original behavior for `except ()` and
bare `except` (`SIM105`)
([#&#8203;18213](https://redirect.github.com/astral-sh/ruff/pull/18213))
- \[`flake8-pyi`] Fix `PYI041`'s fix causing `TypeError` with `None |
None | ...`
([#&#8203;18637](https://redirect.github.com/astral-sh/ruff/pull/18637))
- \[`perflint`] Fix `PERF101` autofix creating a syntax error and mark
autofix as unsafe if there are comments in the `list` call expr
([#&#8203;18803](https://redirect.github.com/astral-sh/ruff/pull/18803))
- \[`perflint`] Fix false negative in `PERF401`
([#&#8203;18866](https://redirect.github.com/astral-sh/ruff/pull/18866))
- \[`pylint`] Avoid flattening nested `min`/`max` when outer call has
single argument (`PLW3301`)
([#&#8203;16885](https://redirect.github.com/astral-sh/ruff/pull/16885))
- \[`pylint`] Fix `PLC2801` autofix creating a syntax error
([#&#8203;18857](https://redirect.github.com/astral-sh/ruff/pull/18857))
- \[`pylint`] Mark `PLE0241` autofix as unsafe if there's comments in
the base classes
([#&#8203;18832](https://redirect.github.com/astral-sh/ruff/pull/18832))
- \[`pylint`] Suppress `PLE2510`/`PLE2512`/`PLE2513`/`PLE2514`/`PLE2515`
autofix if the text contains an odd number of backslashes
([#&#8203;18856](https://redirect.github.com/astral-sh/ruff/pull/18856))
- \[`refurb`] Detect more exotic float literals in `FURB164`
([#&#8203;18925](https://redirect.github.com/astral-sh/ruff/pull/18925))
- \[`refurb`] Fix `FURB163` autofix creating a syntax error for `yield`
expressions
([#&#8203;18756](https://redirect.github.com/astral-sh/ruff/pull/18756))
- \[`refurb`] Mark `FURB129` autofix as unsafe if there's comments in
the `readlines` call
([#&#8203;18858](https://redirect.github.com/astral-sh/ruff/pull/18858))
- \[`ruff`] Fix false positives and negatives in `RUF010`
([#&#8203;18690](https://redirect.github.com/astral-sh/ruff/pull/18690))
- Fix casing of `analyze.direction` variant names
([#&#8203;18892](https://redirect.github.com/astral-sh/ruff/pull/18892))

##### Rule changes

- Fix f-string interpolation escaping in generated fixes
([#&#8203;18882](https://redirect.github.com/astral-sh/ruff/pull/18882))
- \[`flake8-return`] Mark `RET501` fix unsafe if comments are inside
([#&#8203;18780](https://redirect.github.com/astral-sh/ruff/pull/18780))
- \[`flake8-async`] Fix detection for large integer sleep durations in
`ASYNC116` rule
([#&#8203;18767](https://redirect.github.com/astral-sh/ruff/pull/18767))
- \[`flake8-async`] Mark autofix for `ASYNC115` as unsafe if the call
expression contains comments
([#&#8203;18753](https://redirect.github.com/astral-sh/ruff/pull/18753))
- \[`flake8-bugbear`] Mark autofix for `B004` as unsafe if the `hasattr`
call expr contains comments
([#&#8203;18755](https://redirect.github.com/astral-sh/ruff/pull/18755))
- \[`flake8-comprehension`] Mark autofix for `C420` as unsafe if there's
comments inside the dict comprehension
([#&#8203;18768](https://redirect.github.com/astral-sh/ruff/pull/18768))
- \[`flake8-comprehensions`] Handle template strings for comprehension
fixes
([#&#8203;18710](https://redirect.github.com/astral-sh/ruff/pull/18710))
- \[`flake8-future-annotations`] Add autofix (`FA100`)
([#&#8203;18903](https://redirect.github.com/astral-sh/ruff/pull/18903))
- \[`pyflakes`] Mark `F504`/`F522`/`F523` autofix as unsafe if there's a
call with side effect
([#&#8203;18839](https://redirect.github.com/astral-sh/ruff/pull/18839))
- \[`pylint`] Allow fix with comments and document performance
implications (`PLW3301`)
([#&#8203;18936](https://redirect.github.com/astral-sh/ruff/pull/18936))
- \[`pylint`] Detect more exotic `NaN` literals in `PLW0177`
([#&#8203;18630](https://redirect.github.com/astral-sh/ruff/pull/18630))
- \[`pylint`] Fix `PLC1802` autofix creating a syntax error and mark
autofix as unsafe if there's comments in the `len` call
([#&#8203;18836](https://redirect.github.com/astral-sh/ruff/pull/18836))
- \[`pyupgrade`] Extend version detection to include
`sys.version_info.major` (`UP036`)
([#&#8203;18633](https://redirect.github.com/astral-sh/ruff/pull/18633))
- \[`ruff`] Add lint rule `RUF064` for calling `chmod` with non-octal
integers
([#&#8203;18541](https://redirect.github.com/astral-sh/ruff/pull/18541))
- \[`ruff`] Added `cls.__dict__.get('__annotations__')` check (`RUF063`)
([#&#8203;18233](https://redirect.github.com/astral-sh/ruff/pull/18233))
- \[`ruff`] Frozen `dataclass` default should be valid (`RUF009`)
([#&#8203;18735](https://redirect.github.com/astral-sh/ruff/pull/18735))

##### Server

- Consider virtual path for various server actions
([#&#8203;18910](https://redirect.github.com/astral-sh/ruff/pull/18910))

##### Documentation

- Add fix safety sections
([#&#8203;18940](https://redirect.github.com/astral-sh/ruff/pull/18940),[#&#8203;18841](https://redirect.github.com/astral-sh/ruff/pull/18841),[#&#8203;18802](https://redirect.github.com/astral-sh/ruff/pull/18802),[#&#8203;18837](https://redirect.github.com/astral-sh/ruff/pull/18837),[#&#8203;18800](https://redirect.github.com/astral-sh/ruff/pull/18800),[#&#8203;18415](https://redirect.github.com/astral-sh/ruff/pull/18415),[#&#8203;18853](https://redirect.github.com/astral-sh/ruff/pull/18853),[#&#8203;18842](https://redirect.github.com/astral-sh/ruff/pull/18842))
- Use updated pre-commit id
([#&#8203;18718](https://redirect.github.com/astral-sh/ruff/pull/18718))
- \[`perflint`] Small docs improvement to `PERF401`
([#&#8203;18786](https://redirect.github.com/astral-sh/ruff/pull/18786))
- \[`pyupgrade`]: Use `super()`, not `__super__` in error messages
(`UP008`)
([#&#8203;18743](https://redirect.github.com/astral-sh/ruff/pull/18743))
- \[`flake8-pie`] Small docs fix to `PIE794`
([#&#8203;18829](https://redirect.github.com/astral-sh/ruff/pull/18829))
- \[`flake8-pyi`] Correct `collections-named-tuple` example to use
PascalCase assignment
([#&#8203;16884](https://redirect.github.com/astral-sh/ruff/pull/16884))
- \[`flake8-pie`] Add note on type checking benefits to
`unnecessary-dict-kwargs` (`PIE804`)
([#&#8203;18666](https://redirect.github.com/astral-sh/ruff/pull/18666))
- \[`pycodestyle`] Clarify PEP 8 relationship to
`whitespace-around-operator` rules
([#&#8203;18870](https://redirect.github.com/astral-sh/ruff/pull/18870))

##### Other changes

- Disallow newlines in format specifiers of single quoted f- or
t-strings
([#&#8203;18708](https://redirect.github.com/astral-sh/ruff/pull/18708))
- \[`flake8-logging`] Add fix safety section to `LOG002`
([#&#8203;18840](https://redirect.github.com/astral-sh/ruff/pull/18840))
- \[`pyupgrade`] Add fix safety section to `UP010`
([#&#8203;18838](https://redirect.github.com/astral-sh/ruff/pull/18838))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:12:53 +05:30
renovate[bot]
e33980fd5c Update PyO3/maturin-action action to v1.49.3 (#19033)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [PyO3/maturin-action](https://redirect.github.com/PyO3/maturin-action)
| action | patch | `v1.49.2` -> `v1.49.3` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>PyO3/maturin-action (PyO3/maturin-action)</summary>

###
[`v1.49.3`](https://redirect.github.com/PyO3/maturin-action/releases/tag/v1.49.3)

[Compare
Source](https://redirect.github.com/PyO3/maturin-action/compare/v1.49.2...v1.49.3)

##### What's Changed

- Fix `musllinux` builds with `rust-toolchain.toml` containing channel
that is not `stable` by
[@&#8203;FloezeTv](https://redirect.github.com/FloezeTv) in
[https://github.com/PyO3/maturin-action/pull/359](https://redirect.github.com/PyO3/maturin-action/pull/359)

##### New Contributors

- [@&#8203;FloezeTv](https://redirect.github.com/FloezeTv) made their
first contribution in
[https://github.com/PyO3/maturin-action/pull/359](https://redirect.github.com/PyO3/maturin-action/pull/359)

**Full Changelog**:
https://github.com/PyO3/maturin-action/compare/v1.49.2...v1.49.3

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:08:04 +05:30
renovate[bot]
053739f698 Update astral-sh/setup-uv action to v6.3.1 (#19031)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [astral-sh/setup-uv](https://redirect.github.com/astral-sh/setup-uv) |
action | patch | `v6.3.0` -> `v6.3.1` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>astral-sh/setup-uv (astral-sh/setup-uv)</summary>

###
[`v6.3.1`](https://redirect.github.com/astral-sh/setup-uv/releases/tag/v6.3.1):
🌈 Do not warn when version not in manifest-file

[Compare
Source](https://redirect.github.com/astral-sh/setup-uv/compare/v6.3.0...v6.3.1)

##### Changes

This is a hotfix that downgrades the warning that a version could not
be found in the local manifest file to the info level.

A `setup-uv` release contains a version-manifest.json file with information
on all available `uv` releases. When a new `uv` version is released, it is
not contained in this file until the file gets updated and a new
`setup-uv` release is made.
We will overhaul this process in the future, but for now the warning spam
has been removed.

##### 🐛 Bug fixes

- Do not warn when version not in manifest-file
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;462](https://redirect.github.com/astral-sh/setup-uv/issues/462))

##### 🧰 Maintenance

- chore: update known versions for 0.7.14
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;459](https://redirect.github.com/astral-sh/setup-uv/issues/459))
- Revert "Set expected cache dir drive to C: on windows
([#&#8203;451](https://redirect.github.com/astral-sh/setup-uv/issues/451))"
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;460](https://redirect.github.com/astral-sh/setup-uv/issues/460))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:06:39 +05:30
renovate[bot]
192569c848 Update Rust crate clearscreen to v4.0.2 (#19034)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [clearscreen](https://redirect.github.com/watchexec/clearscreen) |
workspace.dependencies | patch | `4.0.1` -> `4.0.2` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>watchexec/clearscreen (clearscreen)</summary>

###
[`v4.0.2`](https://redirect.github.com/watchexec/clearscreen/blob/HEAD/CHANGELOG.md#v402-2025-06-25)

[Compare
Source](https://redirect.github.com/watchexec/clearscreen/compare/v4.0.1...v4.0.2)

- **Deps:** Upgrade which from 7.0.2 to 8.0.0
  ([#&#8203;34](https://redirect.github.com/watchexec/clearscreen/issues/34)) -
  ([09bc029](09bc0299f4))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:06:10 +05:30
renovate[bot]
ae7eaa9913 Update Rust crate ordermap to v0.5.8 (#19036)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [ordermap](https://redirect.github.com/indexmap-rs/ordermap) |
workspace.dependencies | patch | `0.5.7` -> `0.5.8` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>indexmap-rs/ordermap (ordermap)</summary>

###
[`v0.5.8`](https://redirect.github.com/indexmap-rs/ordermap/blob/HEAD/RELEASES.md#058-2025-06-26)

[Compare
Source](https://redirect.github.com/indexmap-rs/ordermap/compare/0.5.7...0.5.8)

- Added `extract_if` methods to `OrderMap` and `OrderSet`, similar to
the
methods for `HashMap` and `HashSet` with ranges like `Vec::extract_if`.
- Added more `#[track_caller]` annotations to functions that may panic.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:05:52 +05:30
renovate[bot]
d88bebca32 Update Rust crate indexmap to v2.10.0 (#19040)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [indexmap](https://redirect.github.com/indexmap-rs/indexmap) |
workspace.dependencies | minor | `2.9.0` -> `2.10.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>indexmap-rs/indexmap (indexmap)</summary>

###
[`v2.10.0`](https://redirect.github.com/indexmap-rs/indexmap/blob/HEAD/RELEASES.md#2100-2025-06-26)

[Compare
Source](https://redirect.github.com/indexmap-rs/indexmap/compare/2.9.0...2.10.0)

- Added `extract_if` methods to `IndexMap` and `IndexSet`, similar to
the
methods for `HashMap` and `HashSet` with ranges like `Vec::extract_if`.
- Added more `#[track_caller]` annotations to functions that may panic.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:04:43 +05:30
InSync
e7aadfc28b [ty] Add special-cased inference for __import__(name) and importlib.import_module(name) (#19008) 2025-06-29 11:49:23 +01:00
Shunsuke Shibayama
de1f8177be [ty] Improve protocol member type checking and relation handling (#18847)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-06-29 10:46:33 +00:00
Ibraheem Ahmed
9218bf72ad [ty] Print salsa memory usage totals in mypy primer CI runs (#18973)
## Summary

Print the [new salsa memory usage
dumps](https://github.com/astral-sh/ruff/pull/18928) in mypy primer CI
runs to help us catch memory regressions. The numbers are rounded to the
nearest power of 1.1 (about a 5% threshold between buckets) to avoid overly sensitive diffs.
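
As a minimal Python sketch of the bucketing idea (the helper name `round_to_power` is hypothetical and this is not the actual CI implementation):

```py
import math


def round_to_power(value: float, base: float = 1.1) -> float:
    """Round `value` to the nearest power of `base` (~5% buckets for base 1.1)."""
    return base ** round(math.log(value, base))


# Two nearby measurements land in the same bucket, so the diff stays quiet.
assert round_to_power(100.0) == round_to_power(101.0)
```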
2025-06-28 15:09:50 -04:00
Micha Reiser
29927f2b59 Update Rust toolchain to 1.88 and MSRV to 1.86 (#19011) 2025-06-28 20:24:00 +02:00
GiGaGon
c5995c40d3 [flake8-async] Make ASYNC105 example error out-of-the-box (#19002)
## Summary

Part of #18972

This PR makes [trio-sync-call
(ASYNC105)](https://docs.astral.sh/ruff/rules/trio-sync-call/#trio-sync-call-async105)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/5b267e01-1c0a-4902-949e-45fc46f8b0d0)
```py
async def double_sleep(x):
    trio.sleep(2 * x)
```

[New example](https://play.ruff.rs/eba6ea40-ff88-4ea8-8cb4-cea472c15c53)
```py
import trio


async def double_sleep(x):
    trio.sleep(2 * x)
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:18:06 -05:00
GiGaGon
68f98cfcd8 [Airflow] Make AIR312 example error out-of-the-box (#18989)
## Summary

Part of #18972

This PR makes [airflow3-suggested-to-move-to-provider
(AIR312)](https://docs.astral.sh/ruff/rules/airflow3-suggested-to-move-to-provider/#airflow3-suggested-to-move-to-provider-air312)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/1be0d654-1ed5-4a0b-8791-cc5db73333d5)
```py
from airflow.operators.python import PythonOperator
```

[New example](https://play.ruff.rs/b6260206-fa19-4ab2-8d45-ddd43c46a759)
```py
from airflow.operators.python import PythonOperator


def print_context(ds=None, **kwargs):
    print(kwargs)
    print(ds)


print_the_context = PythonOperator(
    task_id="print_the_context", python_callable=print_context
)
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:17:11 -05:00
GiGaGon
315adba906 [flake8-async] Make ASYNC251 example error out-of-the-box (#18990)
## Summary

Part of #18972

This PR makes [blocking-sleep-in-async-function
(ASYNC251)](https://docs.astral.sh/ruff/rules/blocking-sleep-in-async-function/#blocking-sleep-in-async-function-async251)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/796684a2-c437-4390-b754-491e576ffe5e)
```py
async def fetch():
    time.sleep(1)
```

[New example](https://play.ruff.rs/90741192-fd0d-49fb-a04e-3127312da659)
```py
import time


async def fetch():
    time.sleep(1)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:15:34 -05:00
GiGaGon
523174e8be [flake8-async] Make ASYNC100 example error out-of-the-box (#18993)
## Summary

Part of #18972

This PR makes [cancel-scope-no-checkpoint
(ASYNC100)](https://docs.astral.sh/ruff/rules/cancel-scope-no-checkpoint/#cancel-scope-no-checkpoint-async100)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/6a399ae5-9b89-4438-b808-6604f1e40a70)
```py
async def func():
    async with asyncio.timeout(2):
        do_something()
```

[New example](https://play.ruff.rs/c44db531-d2f8-4a61-9e04-e5fc0ea989e3)
```py
import asyncio


async def func():
    async with asyncio.timeout(2):
        do_something()
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:13:54 -05:00
GiGaGon
ed2e90371b [flake8-async] Make ASYNC210 example error out-of-the-box (#18977)
## Summary

Part of #18972

This PR makes [blocking-http-call-in-async-function
(ASYNC210)](https://docs.astral.sh/ruff/rules/blocking-http-call-in-async-function/#blocking-http-call-in-async-function-async210)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/20cba4f4-fe2f-428a-a721-311d1a081e64)
```py
async def fetch():
    urllib.request.urlopen("https://example.com/foo/bar").read()
```

[New example](https://play.ruff.rs/5ca2a10d-5294-49ee-baee-0447f7188d9b)
```py
import urllib


async def fetch():
    urllib.request.urlopen("https://example.com/foo/bar").read()
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:11:38 -05:00
David Peter
90cb0d3a7b [ty] Reduce 'complex_constrained_attributes_2' runtime (#19001)
Re: https://github.com/astral-sh/ruff/pull/18979#issuecomment-3012541095

Each check increases the runtime by a factor of 3, so this should be an
order of magnitude faster.
2025-06-27 23:15:45 +02:00
Alex Waygood
1297d6a9eb [ty] Followups to tuple constructor improvements in #18987 (#19000) 2025-06-27 22:09:14 +01:00
Douglas Creager
caf3c916e8 [ty] Refactor argument matching / type checking in call binding (#18997)
This PR extracts a lot of the complex logic in the `match_parameters`
and `check_types` methods of our call binding machinery into separate
helper types. This is setup for #18996, which will update this logic to
handle variadic arguments. To do so, it is helpful to have the
per-argument logic extracted into a method that we can call repeatedly
for each _element_ of a variadic argument.

This should be a pure refactoring, with no behavioral changes.
2025-06-27 17:01:52 -04:00
Douglas Creager
c60e590b4c [ty] Support variable-length tuples in unpacking assignments (#18948)
This PR updates our unpacking assignment logic to use the new tuple
machinery. As a result, we can now unpack variable-length tuples
correctly.
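
A small sketch of the kind of unpacking this enables (the revealed types in the comments are illustrative, not copied from the PR's test suite):

```py
def f(t: tuple[int, *tuple[str, ...]]) -> None:
    first, *rest = t
    reveal_type(first)  # int
    reveal_type(rest)   # list[str]
```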

As part of this, the `TupleSpec` classes have been renamed to `Tuple`,
and can now contain any element (Rust) type, not just `Type<'db>`. The
unpacker uses a tuple of `UnionBuilder`s to maintain the types that will
be assigned to each target, as we iterate through potentially many union
elements on the rhs. We also add a new consuming iterator for tuples,
and update the `all_elements` methods to wrap the result in an enum
(similar to `itertools::Position`) letting you know which part of the
tuple each element appears in. I also added a new
`UnionBuilder::try_build`, which lets you specify a different fallback
type if the union contains no elements.
2025-06-27 15:29:04 -04:00
Alex Waygood
a50a993b9c [ty] Make tuple instantiations sound (#18987)
## Summary

Ensure that we correctly infer calls such as `tuple((1, 2))`,
`tuple(range(42))`, etc. Ensure that we emit errors on invalid calls
such as `tuple[int, str]()`.
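
A minimal sketch of the calls described above (the inferred types in the comments are approximations, not the exact output of ty):

```py
t1 = tuple((1, 2))      # inferred as a two-element tuple type
t2 = tuple(range(42))   # inferred as a variable-length tuple of `int`
t3 = tuple[int, str]()  # error: invalid call to a subscripted `tuple` form
```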

## Test Plan

Mdtests
2025-06-27 19:37:16 +01:00
Robsdedude
6802c4702f [flake8-pyi] Expand Optional[A] to A | None (PYI016) (#18572)
## Summary
Under preview 🧪 I've expanded rule `PYI016` to also flag type
union duplicates containing `None` and `Optional`.
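
A minimal sketch of the new case (example constructed for illustration; the exact diagnostic text may differ):

```py
from typing import Optional

# `Optional[int]` already includes `None`, so the explicit `| None` is a
# duplicate union member that PYI016 now flags under preview.
x: Optional[int] | None = None
```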

## Test Plan
Examples/tests have been added. I've made sure that the existing
examples did not change unless preview is enabled.

## Relevant Issues
* https://github.com/astral-sh/ruff/issues/18508 (discussing
introducing/extending a rule to flag `Optional[None]`)
* https://github.com/astral-sh/ruff/issues/18546 (where I discussed this
addition with @AlexWaygood)

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-06-27 15:43:11 +00:00
Brent Westbrook
96f3c8d1ab Convert OldDiagnostic::noqa_code to an Option<String> (#18946)
## Summary

I think this should be the last step before combining `OldDiagnostic`
and `ruff_db::Diagnostic`. We can't store a `NoqaCode` on
`ruff_db::Diagnostic`, so I converted the `noqa_code` field to an
`Option<String>` and then propagated this change to all of the callers.

I tried to use `&str` everywhere it was possible, so I think the
remaining `to_string` calls are necessary. I spent some time trying to
convert _everything_ to `&str` but ran into lifetime issues, especially
in the `FixTable`. Maybe we can take another look at that if it causes a
performance regression, but hopefully these paths aren't too hot. We
also avoid some `to_string` calls, so it might even out a bit too.

## Test Plan

Existing tests

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-27 11:36:55 -04:00
Andrew Gallant
efcb63fe3a [ty] Fix playground (#18986)
I renamed a field on a `Completion` struct in #18982, and it looks like
this caused the playground to fail to build:

https://github.com/astral-sh/ruff/actions/runs/15928550050/job/44931734349

Maybe building that playground can be added to CI for pull requests?
2025-06-27 10:43:36 -04:00
Andrew Gallant
5f6b0ded21 [ty] Add builtins to completions derived from scope (#18982)
Most of the work here was doing some light refactoring to facilitate
sensible testing. That is, we don't want to list every builtin included
in most tests, so we add some structure to the completion type returned.
Tests can now filter based on whether a completion is a builtin or not.

Otherwise, builtins are found using the existing infrastructure for
`object.attr` completions (where we hard-code the module name
`builtins`).

I did consider changing the sort order based on whether a completion
suggestion was a builtin or not. In particular, it seemed like it might
be a good idea to sort builtins after other scope based completions,
but before the dunder and sunder attributes. Namely, it seems likely
that there is an inverse correlation between the size of a scope and
the likelihood of an item in that scope being used at any given point.
So it *might* be a good idea to prioritize the likelier candidates in
the completions returned.

Additionally, the number of items introduced by adding builtins is quite
large. So I wondered whether mixing them in with everything else would
become too noisy.

However, it's not totally clear to me that this is the right thing to
do. Right now, I feel like there is a very obvious lexicographic
ordering that makes "finding" the right suggestion to activate
potentially easier than if the ranking mechanism is less clear.
(Technically, the dunder and sunder attributes are not sorted
lexicographically, but I'd put forward that most folks don't have an
intuitive understanding of where `_` ranks lexicographically with
respect to "regular" letters. Moreover, since dunder and sunder
attributes are all grouped together, I think the ordering here ends up
being very obvious after even a quick glance.)
2025-06-27 10:20:01 -04:00
Matthew Mckee
a3c79d8170 [ty] Don't add incorrect subdiagnostic for unresolved reference (#18487)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
Co-authored-by: Carl Meyer <carl@astral.sh>
2025-06-27 12:40:33 +00:00
Alex Waygood
57bd7d055d [ty] Simplify KnownClass::check_call() and KnownFunction::check_call() (#18981) 2025-06-27 12:23:29 +01:00
David Peter
3c18d85c7d [ty] Add micro-benchmark for #711 (#18979)
## Summary

Add a benchmark for the problematic case in
https://github.com/astral-sh/ty/issues/711, which will potentially be
solved in https://github.com/astral-sh/ruff/pull/18955
2025-06-27 11:34:51 +02:00
GiGaGon
e5e3d998c5 [flake8-annotations] Make ANN401 example error out-of-the-box (#18974) 2025-06-27 07:06:11 +00:00
GiGaGon
85b2a08b5c [flake8-async] Make ASYNC110 example error out-of-the-box (#18975) 2025-06-27 09:01:02 +02:00
Jordy Williams
1874d52eda [pandas]: Fix issue on non pandas dataframe in-place usage (PD002) (#18963) 2025-06-27 06:56:13 +00:00
Yair Peretz
18efe2ab46 [pylint] Fix PLC0415 example (#18970)
Fixed documentation error
2025-06-26 18:33:33 -04:00
Ibraheem Ahmed
6f7b1c9bb3 [ty] Add environment variable to dump Salsa memory usage stats (#18928)
## Summary

Setting `TY_MEMORY_REPORT=full` will generate and print a memory usage
report to the CLI after a `ty check` run:

```
=======SALSA STRUCTS=======
`Definition`                                       metadata=7.24MB   fields=17.38MB  count=181062
`Expression`                                       metadata=4.45MB   fields=5.94MB   count=92804
`member_lookup_with_policy_::interned_arguments`   metadata=1.97MB   fields=2.25MB   count=35176
...
=======SALSA QUERIES=======
`File -> ty_python_semantic::semantic_index::SemanticIndex`
    metadata=11.46MB  fields=88.86MB  count=1638
`Definition -> ty_python_semantic::types::infer::TypeInference`
    metadata=24.52MB  fields=86.68MB  count=146018
`File -> ruff_db::parsed::ParsedModule`
    metadata=0.12MB   fields=69.06MB  count=1642
...
=======SALSA SUMMARY=======
TOTAL MEMORY USAGE: 577.61MB
    struct metadata = 29.00MB
    struct fields = 35.68MB
    memo metadata = 103.87MB
    memo fields = 409.06MB
```

Eventually, we should integrate these numbers into CI in some form. The
one limitation currently is that heap allocations in salsa structs (e.g.
interned values) are not tracked, but memoized values should have full
coverage. We may also want a peak memory usage counter (that accounts
for non-salsa memory), but that is relatively simple to profile manually
(e.g. `time -v ty check`) and would require a compile-time option to
avoid runtime overhead.
2025-06-26 21:27:51 +00:00
Victor Hugo Gomes
a1579d82d0 [pylint] Fix PLW0108 autofix introducing a syntax error when the lambda's body contains an assignment expression (#18678)

## Summary

This PR also suppresses the fix if the assignment expression target
shadows one of the lambda's parameters.

Fixes #18675

## Test Plan

Add regression tests.
2025-06-26 16:56:17 -04:00
Dylan
32c54189cb Bump 0.12.1 (#18969) 2025-06-26 15:20:31 -05:00
GiGaGon
b85c219283 [FastAPI] Add fix safety section to FAST002 (#18940)
## Summary

Part of #15584

This PR adds a fix safety section to [fast-api-non-annotated-dependency
(FAST002)](https://docs.astral.sh/ruff/rules/fast-api-non-annotated-dependency/#fast-api-non-annotated-dependency-fast002).
It also re-words the availability section since I found it confusing.

The lint/fix was added in #11579 as always unsafe.
No reasoning is given in the original PR/code as to why this was chosen.
Example of why the fix is unsafe:
https://play.ruff.rs/3bd0566e-1ef6-4cec-ae34-3b07cd308155
```py
from fastapi import Depends, FastAPI, Query

app = FastAPI()

# Fix will remove the parameter default value
@app.get("/items/")
async def read_items(commons: dict = Depends(common_parameters)):
    return commons

# Fix will delete comment and change default parameter value
@app.get("/items/")
async def read_items_1(q: str = Query(  # This comment will be deleted
    default="rick")):
    return q
```
After fixing both instances of `FAST002`:
```py
from fastapi import Depends, FastAPI, Query
from typing import Annotated

app = FastAPI()

# Fix will remove the parameter default value
@app.get("/items/")
async def read_items(commons: Annotated[dict, Depends(common_parameters)]):
    return commons

# Fix will delete comment and change default parameter value
@app.get("/items/")
async def read_items_1(q: Annotated[str, Query()] = "rick"):
    return q
```
2025-06-26 12:38:02 -05:00
Andrew Gallant
b1d1cf1d38 [ty] Add regression test for leading tab mis-alignment in diagnostic rendering (#18965)
It turns out that astral-sh/ty#18692 also fixed astral-sh/ty#203. This
PR adds a regression test for it. (Locally, I "unfixed" the bug and
confirmed that this is actually a regression test.)

Fixes astral-sh/ty#203
2025-06-26 16:27:26 +00:00
Micha Reiser
1dcdf7f41d [ty] Resolve python environment in Options::to_program_settings (#18960)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-06-26 17:57:16 +02:00
Victor Hugo Gomes
d00697621e [ruff] Fix false positives and negatives in RUF010 (#18690) 2025-06-26 17:53:52 +02:00
Andrew Gallant
76619b96e5 [ty] Fix rendering of long lines that are indented with tabs
It turns out that `annotate-snippets` doesn't do a great job of
consistently handling tabs. The intent of the implementation is clearly
to expand tabs into 4 ASCII whitespace characters. But there are a few
places where the column computation wasn't taking this expansion into
account. In particular, the `unicode-width` crate returns `None` for a
`\t` input, and `annotate-snippets` would in turn treat this as either
zero columns or one column. Both are wrong.

In patching this, it caused one of the existing `annotate-snippets`
tests to fail. I spent a fair bit of time on it trying to fix it before
coming to the conclusion that the test itself was wrong. In particular,
the annotation ranges are 4 bytes off. However, when the range was
wrong, the buggy code was rendering the example as intended since `\t`
characters were treated as taking up zero columns of space. Now that
they are correctly computed as taking up 4 columns of space, the offsets
of the test needed to be adjusted.

Fixes #670
2025-06-26 11:12:16 -04:00
Andrew Gallant
6e25cfba2b [ty] Add regression test for diagnostic rendering panic
This converts the MRE in #670 into a fixture test for
`annotate-snippets`.
2025-06-26 11:12:16 -04:00
Micha Reiser
76387295a5 [ty] Move venv and conda env discovery to SearchPath::from_settings (#18938)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-06-26 16:39:27 +02:00
David Peter
d04e63a6d9 [ty] Add regression-benchmark for attribute-assignment hang (#18957)
## Summary

Adds a new micro-benchmark as a regression test for
https://github.com/astral-sh/ty/issues/627.

## Test Plan

Ran the benchmark on the parent commit of
89d915a1e3,
and verified that it took > 1s, while it takes ~10 ms after the fix.
2025-06-26 15:21:08 +02:00
David Peter
86fd9b634e [ty] Format conflicting types as an enumeration (#18956)
## Summary

Format conflicting declared types as
```
`str`, `int` and `bytes`
```

Thanks to @AlexWaygood for the initial draft.

@dcreager, looking forward to your one-character follow-up PR.
2025-06-26 14:29:33 +02:00
David Peter
c0beb3412f [ty] Prevent union builder construction for just one declaration (#18954)
## Summary

Avoid the construction of the `DeclaredTypeBuilder` if there is just one
declared type.
2025-06-26 13:00:09 +02:00
David Peter
b01003f81d [ty] Infer nonlocal types as unions of all reachable bindings (#18750)
## Summary

This PR includes a behavioral change to how we infer types for public
uses of symbols within a module. Where we would previously use the type
that a use at the end of the scope would see, we now consider all
reachable bindings and union the results:

```py
x = None

def f():
    reveal_type(x)  # previously `Unknown | Literal[1]`, now `Unknown | None | Literal[1]`

f()

x = 1

f()
```

This helps especially in cases where the end of the scope is not
reachable:

```py
def outer(x: int):
    def inner():
        reveal_type(x)  # previously `Unknown`, now `int`

    raise ValueError
```

This PR also proposes to skip the boundness analysis of public uses.
This is consistent with the "all reachable bindings" strategy, because
the implicit `x = <unbound>` binding is also always reachable, and we
would have to emit "possibly-unresolved" diagnostics for every public
use otherwise. Changing this behavior allows common use-cases like the
following to type check without any errors:

```py
def outer(flag: bool):
    if flag:
        x = 1

        def inner():
            print(x)  # previously: possibly-unresolved-reference, now: no error
```

closes https://github.com/astral-sh/ty/issues/210
closes https://github.com/astral-sh/ty/issues/607
closes https://github.com/astral-sh/ty/issues/699

## Follow up

It is now possible to resolve the following TODO, but I would like to do
that as a follow-up, because it requires some changes to how we treat
implicit attribute assignments, which could result in ecosystem changes
that I'd like to see separately.


315fb0f3da/crates/ty_python_semantic/src/semantic_index/builder.rs (L1095-L1117)

## Ecosystem analysis

[**Full report**](https://shark.fish/diff-public-types.html)

* This change obviously removes a lot of `possibly-unresolved-reference`
diagnostics (7818) because we do not analyze boundness for public uses
of symbols inside modules anymore.
* As the primary goal here, this change also removes a lot of
false-positive `unresolved-reference` diagnostics (231) in scenarios
like this:
    ```py
    def _(flag: bool):
        if flag:
            x = 1
    
            def inner():
                x
    
            raise
    ```
* This change also introduces some new false positives for cases like:
    ```py
    def _():
        x = None
    
        x = "test"
    
        def inner():
            x.upper()  # Attribute `upper` on type `Unknown | None | Literal["test"]` is possibly unbound
    ```
We have test cases for these situations and it's plausible that we can
improve this in a follow-up.


## Test Plan

New Markdown tests
2025-06-26 12:24:40 +02:00
Victor Hugo Gomes
2362263d5e [pyflakes] Mark F504/F522/F523 autofix as unsafe if there's a call with side effect (#18839)
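
A minimal sketch of the hazard the title describes (the snippet is constructed for illustration and is not taken from the PR):

```py
def side_effect() -> int:
    print("called")
    return 1


# F504 flags the unused named argument `b`; removing it would also delete the
# `side_effect()` call, which is why the autofix is now marked unsafe here.
"%(a)s" % {"a": 1, "b": side_effect()}
```
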
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-26 08:48:29 +00:00
GiGaGon
170ccd80b4 [playground] Add ruff logo docs link to Header.tsx (#18947) 2025-06-26 08:54:45 +02:00
Alex Waygood
4a5715b97a [ty] Reduce the overwhelming complexity of TypeInferenceBuilder::infer_call_expression (#18943)
## Summary

This function is huge, and hugely indented. This PR breaks most of it
out into two helper functions: `KnownFunction::check_call()` and
`KnownClass::check_call()`.

My immediate motivation is that we need to add yet more special cases to
this function in order to properly handle `tuple` instantiations and
instantiations of tuple subclasses. But I really don't relish the
thought of doing that with the function's current structure 😆

## Test Plan

Existing tests all pass. No new ones are added; this is a pure refactor
that should have no functional change.
2025-06-25 21:10:55 +01:00
Alex Waygood
c77e72ea1a [ty] Add subdiagnostic about empty bodies in more cases (#18942) 2025-06-25 20:25:00 +01:00
Micha Reiser
5d546c600a [ty] Move search path resolution to Options::to_program_settings (#18937) 2025-06-25 18:00:38 +02:00
Nikolas Hearp
8b22992988 [flake8-errmsg] Extend EM101 to support byte strings (#18867)
## Summary

Fixes #18765
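
A minimal sketch of what this extension covers (the exact diagnostic message is assumed, not quoted from the PR):

```py
def check(payload: bytes) -> None:
    if not payload:
        # EM101 now also flags byte-string messages passed directly to an
        # exception; the suggestion is to assign the message to a variable first.
        raise ValueError(b"empty payload")
```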

## Test Plan

Added test
2025-06-25 10:53:56 -04:00
GiGaGon
f6def1c86d Move big rule implementations (#18931)

## Summary

Here's the part that was split out of #18906. I wanted to move these
into the rule files since the rest of the rules in
`deferred_scope`/`statement` have that same structure of implementations
being in the rule definition file. It also resolves the dilemma of where
to put the comment, at least for these rules.

## Test Plan

N/A, no test/functionality affected
2025-06-25 10:46:25 -04:00
Brent Westbrook
5aab49880a [pylint] Allow fix with comments and document performance implications (PLW3301) (#18936)
Summary
--

Closes #18849 by adding a `## Known issues` section describing the
potential performance issues when fixing nested iterables. I also
deleted the comment check since the fix is already unsafe and added a
note to the `## Fix safety` docs.

Test Plan
--

Existing tests, updated to allow a fix when comments are present since
the fix is already unsafe.
2025-06-25 09:29:23 -04:00
Brent Westbrook
7783cea14f [flake8-future-annotations] Add autofix (FA100) (#18903)
Summary
--

This PR resolves the easiest part of
https://github.com/astral-sh/ruff/issues/18502 by adding an autofix that
just adds
`from __future__ import annotations` at the top of the file, in the same
way
as FA102, which already has an identical unsafe fix.
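
Roughly, the new fix behaves like this (example constructed for illustration; the exact trigger conditions are described in the rule's documentation):

```py
# Before: FA100 flags the module because the annotation could be written with
# newer syntax once `annotations` is imported from `__future__`.
import typing


def f(x: typing.Optional[int]) -> None: ...
```

```py
# After the (unsafe) fix: the import is inserted at the top of the file;
# the annotations themselves are left for other rules to rewrite.
from __future__ import annotations

import typing


def f(x: typing.Optional[int]) -> None: ...
```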

Test Plan
--

Existing snapshots, updated to add the fixes.
2025-06-25 08:37:18 -04:00
Micha Reiser
c1fed55d51 Delete the ruff_python_resolver crate (#18933) 2025-06-25 12:53:13 +02:00
David Peter
689797a984 [ty] Type narrowing in comprehensions (#18934)
## Summary

Add type narrowing inside comprehensions:

```py
def _(xs: list[int | None]):
    [reveal_type(x) for x in xs if x is not None]  # revealed: int
```

closes https://github.com/astral-sh/ty/issues/680

## Test Plan

* New Markdown tests
* Made sure the example from https://github.com/astral-sh/ty/issues/680
now checks without errors
* Made sure that all removed ecosystem diagnostics were actually false
positives
2025-06-25 11:30:28 +02:00
Victor Hugo Gomes
66dbea90f1 [perflint] Fix false negative in PERF401 (#18866) 2025-06-25 10:44:32 +02:00
GiGaGon
d2684a00c6 Fix f-string interpolation escaping (#18882) 2025-06-25 10:04:15 +02:00
Robsdedude
2a0c5669f2 [refurb] Detect more exotic float literals in FURB164 (#18925) 2025-06-25 09:08:25 +02:00
GiGaGon
cb152b4725 [Internal] Use more report_diagnostic_if_enabled (#18924)

## Summary

From @ntBre
https://github.com/astral-sh/ruff/pull/18906#discussion_r2162843366 :
> This could be a good target for a follow-up PR, but we could fold
these `if checker.is_rule_enabled { checker.report_diagnostic` checks
into calls to `checker.report_diagnostic_if_enabled`. I didn't notice
these when adding that method.
> 
> Also, the docs on `Checker::report_diagnostic_if_enabled` and
`LintContext::report_diagnostic_if_enabled` are outdated now that the
`Rule` conversion is basically free 😅
> 
> No pressure to take on this refactor, just an idea if you're
interested!

This PR folds those calls. I also updated the doc comments by copying
from `report_diagnostic`.

Note: It seems odd to me that the doc comment for `Checker` says
`Diagnostic` while `LintContext` says `OldDiagnostic`; I'm not sure whether
that needs a bigger docs change to fix the inconsistency.

<details>
<summary>Python script to do the changes</summary>

This script assumes it is placed in the top level `ruff` directory (ie
next to `.git`/`crates`/`README.md`)

```py
import re
from copy import copy
from pathlib import Path

ruff_crates = Path(__file__).parent / "crates"

for path in ruff_crates.rglob("**/*.rs"):
    with path.open(encoding="utf-8", newline="") as f:
        original_content = f.read()
    if "is_rule_enabled" not in original_content or "report_diagnostic" not in original_content:
        continue
    original_content_position = 0
    changed_content = ""
    for match in re.finditer(r"(?m)(?:^[ \n]*|(?<=(?P<else>else )))if[ \n]+checker[ \n]*\.is_rule_enabled\([ \n]*Rule::\w+[ \n]*\)[ \n]*{[ \n]*checker\.report_diagnostic\(", original_content):
        # Content between last match and start of this one is unchanged
        changed_content += original_content[original_content_position:match.start()]
        # If this was an else if, a { needs to be added at the start
        if match.group("else"):
            changed_content += "{"
        # This will result in bad formatting, but the precommit cargo format will handle it
        changed_content += "checker.report_diagnostic_if_enabled("
        # Depth tracking would fail if a string/comment included a { or }, but unlikely given the context
        depth = 1
        position = match.end()
        while depth > 0:
            if original_content[position] == "{":
                depth += 1
            if original_content[position] == "}":
                depth -= 1
            position += 1
        # pos - 1 is the closing }
        changed_content += original_content[match.end():position - 1]
        # If this was an else if, a } needs to be added at the end
        if match.group("else"):
            changed_content += "}"
        # Skip the closing }
        original_content_position = position
        if original_content[original_content_position] == "\n":
            # If the } is followed by a \n, also skip it for better formatting
            original_content_position += 1
    # Add remaining content between last match and file end
    changed_content += original_content[original_content_position:]
    with path.open("w", encoding="utf-8", newline="") as f:
        f.write(changed_content)
```

</details>

## Test Plan

N/A, no tests/functionality affected.
2025-06-24 21:43:22 -04:00
GiGaGon
90f47e9b7b Add missing rule code comments (#18906)

## Summary

While making some of my other changes, I noticed that some of the lints
were missing comments with their lint code, or had the wrong lint code in
the comment. These comments are super useful since they make it easy to
quickly find the source code of a lint, so I decided to try to normalize
them.

Most of them were fairly straightforward, just adding a doc
comment/comment in the appropriate place.

I decided to make all of the `Pylint` rules use the `PL` prefix.
Previously it was split between unprefixed and prefixed codes, but I
decided to normalize to the prefixed form since that's what the docs use,
and prefixed codes show up in unprefixed searches, while the reverse is
not true.

I also ran into a lot of rules with implementations in "non-standard"
places (where "standard" means inside a file matching the glob
`crates/ruff_linter/rules/*/rules/**/*.rs` and/or the same rule file
where the rule `struct`/`ViolationMetadata` is defined).

I decided to move all the implementations out of
`crates/ruff_linter/src/checkers/ast/analyze/deferred_scopes.rs` and
into their own files, since that is what the rest of the rules in
`deferred_scopes.rs` did, and those were just the outliers.

There were several rules which I did not end up moving, which you can
see as the extra paths I had to add to my Python code besides the
"standard" glob. These rules are generally the error-type rules that
just wrap an error from the parser; they have very small
implementations, are very tightly linked to the module they are in, and
generally every rule of that type was implemented in its module instead
of in the "standard" place.

Resolving that requires answering a question I don't think I'm equipped
to handle: Is the point of these comments to give quick access to the
rule definition/docs, or the rule implementation? For all the rules with
implementations in the "standard" location this isn't a problem, as they
are the same, but it is an issue for all of these error type rules. In
the end I chose to leave the implementations where they were, but I'm
not sure if that was the right choice.

<details>
<summary>Python script I wrote to find missing comments</summary>

This script assumes it is placed in the top level `ruff` directory (ie
next to `.git`/`crates`/`README.md`)

```py
import re
from copy import copy
from pathlib import Path

linter_to_code_prefix = {
    "Airflow": "AIR",
    "Eradicate": "ERA",
    "FastApi": "FAST",
    "Flake82020": "YTT",
    "Flake8Annotations": "ANN",
    "Flake8Async": "ASYNC",
    "Flake8Bandit": "S",
    "Flake8BlindExcept": "BLE",
    "Flake8BooleanTrap": "FBT",
    "Flake8Bugbear": "B",
    "Flake8Builtins": "A",
    "Flake8Commas": "COM",
    "Flake8Comprehensions": "C4",
    "Flake8Copyright": "CPY",
    "Flake8Datetimez": "DTZ",
    "Flake8Debugger": "T10",
    "Flake8Django": "DJ",
    "Flake8ErrMsg": "EM",
    "Flake8Executable": "EXE",
    "Flake8Fixme": "FIX",
    "Flake8FutureAnnotations": "FA",
    "Flake8GetText": "INT",
    "Flake8ImplicitStrConcat": "ISC",
    "Flake8ImportConventions": "ICN",
    "Flake8Logging": "LOG",
    "Flake8LoggingFormat": "G",
    "Flake8NoPep420": "INP",
    "Flake8Pie": "PIE",
    "Flake8Print": "T20",
    "Flake8Pyi": "PYI",
    "Flake8PytestStyle": "PT",
    "Flake8Quotes": "Q",
    "Flake8Raise": "RSE",
    "Flake8Return": "RET",
    "Flake8Self": "SLF",
    "Flake8Simplify": "SIM",
    "Flake8Slots": "SLOT",
    "Flake8TidyImports": "TID",
    "Flake8Todos": "TD",
    "Flake8TypeChecking": "TC",
    "Flake8UnusedArguments": "ARG",
    "Flake8UsePathlib": "PTH",
    "Flynt": "FLY",
    "Isort": "I",
    "McCabe": "C90",
    "Numpy": "NPY",
    "PandasVet": "PD",
    "PEP8Naming": "N",
    "Perflint": "PERF",
    "Pycodestyle": "",
    "Pydoclint": "DOC",
    "Pydocstyle": "D",
    "Pyflakes": "F",
    "PygrepHooks": "PGH",
    "Pylint": "PL",
    "Pyupgrade": "UP",
    "Refurb": "FURB",
    "Ruff": "RUF",
    "Tryceratops": "TRY",
}

ruff = Path(__file__).parent / "crates"

ruff_linter = ruff / "ruff_linter" / "src"

code_to_rule_name = {}

with open(ruff_linter / "codes.rs") as codes_file:
    for linter, code, rule_name in re.findall(
        # The (?<! skips ruff test rules
        # Only Preview|Stable rules are checked
        r"(?<!#\[cfg\(any\(feature = \"test-rules\", test\)\)\]\n)        \((\w+), \"(\w+)\"\) => \(RuleGroup::(?:Preview|Stable), [\w:]+::(\w+)\)",
        codes_file.read(),
    ):
        code_to_rule_name[linter_to_code_prefix[linter] + code] = (rule_name, [])

ruff_linter_rules = ruff_linter / "rules"
for rule_file_path in [
    *ruff_linter_rules.rglob("*/rules/**/*.rs"),
    ruff / "ruff_python_parser" / "src" / "semantic_errors.rs",
    ruff_linter / "pyproject_toml.rs",
    ruff_linter / "checkers" / "noqa.rs",
    ruff_linter / "checkers" / "ast" / "mod.rs",
    ruff_linter / "checkers" / "ast" / "analyze" / "unresolved_references.rs",
    ruff_linter / "checkers" / "ast" / "analyze" / "expression.rs",
    ruff_linter / "checkers" / "ast" / "analyze" / "statement.rs",
]:
    with open(rule_file_path, encoding="utf-8") as f:
        rule_file_content = f.read()
    for code, (rule, _) in copy(code_to_rule_name).items():
        if rule in rule_file_content:
            if f"// {code}" in rule_file_content or f", {code}" in rule_file_content:
                del code_to_rule_name[code]
            else:
                code_to_rule_name[code][1].append(rule_file_path)

for code, rule in code_to_rule_name.items():
    print(code, rule[0])
    for path in rule[1]:
        print(path)
```

</details>

## Test Plan

N/A, no tests/functionality affected.
2025-06-24 21:18:57 -04:00
Carl Meyer
62975b3ab2 [ty] eliminate is_fully_static (#18799)
## Summary

Having a recursive type method to check whether a type is fully static
is inefficient, unnecessary, and makes us overly strict about subtyping
relations.

It's inefficient because we end up re-walking the same types many times
to check for fully-static-ness.

It's unnecessary because we can check relations involving the dynamic
type appropriately, depending on whether the relation is subtyping or
assignability.

We use the subtyping relation to simplify unions and intersections. We
can usefully consider that `S <: T` for gradual types also, as long as
it remains true that `S | T` is equivalent to `T` and `S & T` is
equivalent to `S`.

One conservative definition (implemented here) that satisfies this
requirement is that we consider `S <: T` if, for every possible pair of
materializations `S'` and `T'`, `S' <: T'`. Or, put differently, the top
materialization of `S` (`S+` -- the union of all possible
materializations of `S`) is a subtype of the bottom materialization of
`T` (`T-` -- the intersection of all possible materializations of `T`).
In the most basic cases we can usefully say that `Any <: object` and
that `Never <: Any`, and we can handle more complex cases inductively
from there.

This definition of subtyping for gradual subtypes is not reflexive
(`Any` is not a subtype of `Any`).

As a corollary, we also remove `is_gradual_equivalent_to` --
`is_equivalent_to` now has the meaning that `is_gradual_equivalent_to`
used to have. If necessary, we could restore an
`is_fully_static_equivalent_to` or similar (which would not do an
`is_fully_static` pre-check of the types, but would instead pass a
relation-kind enum down through a recursive equivalence check, similar
to `has_relation_to`), but so far this doesn't appear to be necessary.

Credit to @JelleZijlstra for the observation that `is_fully_static` is
unnecessary and overly restrictive on subtyping.

There is another possible definition of gradual subtyping: instead of
requiring that `S+ <: T-`, we could instead require that `S+ <: T+` and
`S- <: T-`. In other words, instead of requiring all materializations of
`S` to be a subtype of every materialization of `T`, we just require
that every materialization of `S` be a subtype of _some_ materialization
of `T`, and that every materialization of `T` be a supertype of some
materialization of `S`. This definition also preserves the core
invariant that `S <: T` implies that `S | T = T` and `S & T = S`, and it
restores reflexivity: under this definition, `Any` is a subtype of
`Any`, and for any equivalent types `S` and `T`, `S <: T` and `T <: S`.
But unfortunately, this definition breaks transitivity of subtyping,
because nominal subclasses in Python use assignability ("consistent
subtyping") to define acceptable overrides. This means that we may have
a class `A` with `def method(self) -> Any` and a subtype `B(A)` with
`def method(self) -> int`, since `int` is assignable to `Any`. This
means that if we have a protocol `P` with `def method(self) -> Any`, we
would have `B <: A` (from nominal subtyping) and `A <: P` (`Any` is a
subtype of `Any`), but not `B <: P` (`int` is not a subtype of `Any`).
Breaking transitivity of subtyping is not tenable, so we don't use this
definition of subtyping.
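
For illustration, here is that transitivity-breaking scenario as a minimal
Python sketch (the class names mirror the prose above; this is not code from
the PR or its test suite):

```python
from typing import Any, Protocol


class A:
    def method(self) -> Any: ...


class B(A):
    # Accepted override: `int` is assignable to `Any` ("consistent
    # subtyping"), so `B` is a nominal subtype of `A`.
    def method(self) -> int:
        return 0


class P(Protocol):
    def method(self) -> Any: ...


# Under the rejected definition we would get `B <: A` and `A <: P`, but not
# `B <: P` (since `int` is not a subtype of `Any`) -- transitivity breaks.
```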

## Test Plan

Existing tests (modified in some cases to account for updated
semantics.)

Stable property tests pass at a million iterations:
`QUICKCHECK_TESTS=1000000 cargo test -p ty_python_semantic -- --ignored
types::property_tests::stable`

### Changes to property test type generation

Since we no longer have a method of categorizing built types as
fully-static or not-fully-static, I had to add a previously-discussed
feature to the property tests so that some tests can build types that
are known by construction to be fully static, because there are still
properties that only apply to fully-static types (for example,
reflexivity of subtyping).

## Changes to handling of `*args, **kwargs` signatures

This PR "discovered" that, once we allow non-fully-static types to
participate in subtyping under the above definitions, `(*args: Any,
**kwargs: Any) -> Any` is now a subtype of `() -> object`. This is true,
if we take a literal interpretation of the former signature: all
materializations of the parameters `*args: Any, **kwargs: Any` can
accept zero arguments, making the former signature a subtype of the
latter. But the spec actually says that `*args: Any, **kwargs: Any`
should be interpreted as equivalent to `...`, and that makes a
difference here: `(...) -> Any` is not a subtype of `() -> object`,
because (unlike a literal reading of `(*args: Any, **kwargs: Any)`),
`...` can materialize to _any_ signature, including a signature with
required positional arguments.

This matters for this PR because it makes the "any two types are both
assignable to their union" property test fail if we don't implement the
equivalence to `...`. `FunctionType.__call__` has the signature
`(*args: Any, **kwargs: Any) -> Any`, and if we take that at face value
it's a subtype of `() -> object`, making `FunctionType` a subtype of
`() -> object` -- but then a function with a required argument is also a
subtype of `FunctionType`, but not a subtype of `() -> object`. So I
went ahead and implemented the equivalence to `...` in this PR.
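
To make the distinction concrete, a small sketch (the function names
`literal_sig` and `requires_arg` are made up for illustration):

```python
from typing import Any, Callable


def literal_sig(*args: Any, **kwargs: Any) -> Any: ...


def requires_arg(x: int) -> int: ...


# Read literally, `literal_sig` accepts zero arguments, so it looks like a
# subtype of `() -> object`:
thunk: Callable[[], object] = literal_sig

# But the spec treats its signature as `...` (i.e. `Callable[..., Any]`),
# which can also materialize to a signature like that of `requires_arg` --
# clearly not a zero-argument callable -- so the subtyping does not hold.
```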

## Ecosystem analysis

* Most of the ecosystem report consists of cases of improved union/intersection
simplification. For example, we can now simplify a union like `bool |
(bool & Unknown) | Unknown` to simply `bool | Unknown`, because we can
now observe that every possible materialization of `bool & Unknown` is
still a subtype of `bool` (whereas before we would set aside `bool &
Unknown` as a not-fully-static type.) This is clearly an improvement.
* The `possibly-unresolved-reference` errors in sockeye, pymongo,
ignite, scrapy and others are true positives for conditional imports
that were formerly silenced by bogus conflicting-declarations (which we
currently don't issue a diagnostic for), because we considered two
different declarations of `Unknown` to be conflicting (we used
`is_equivalent_to` not `is_gradual_equivalent_to`). In this PR that
distinction disappears and all equivalence is gradual, so a declaration
of `Unknown` no longer conflicts with a declaration of `Unknown`, which
then results in us surfacing the possibly-unbound error.
* We will now issue "redundant cast" for casting from a typevar with a
gradual bound to the same typevar (the hydra-zen diagnostic). This seems
like an improvement.
* The new diagnostics in bandersnatch are interesting. For some reason
primer in CI seems to be checking bandersnatch on Python 3.10 (not yet
sure why; this doesn't happen when I run it locally). But bandersnatch
uses `enum.StrEnum`, which doesn't exist on 3.10. That makes the `class
SimpleDigest(StrEnum)` a class that inherits from `Unknown` (and
bypasses our current TODO handling for accessing attributes on enum
classes, since we don't recognize it as an enum class at all). This PR
improves our understanding of assignability to classes that inherit from
`Any` / `Unknown`, and we now recognize that a string literal is not
assignable to a class inheriting `Any` or `Unknown`.
2025-06-24 18:02:05 -07:00
Dan Parizher
eee5a5a3d6 [docs] Typo fix for playground (#18929)

## Summary

I saw the smallest typo while familiarizing myself with the playground;
it bothered me so much I just had to make a PR for it 😂
2025-06-24 21:01:48 -04:00
Douglas Creager
66f50fb04b [ty] Add property test generators for variable-length tuples (#18901)
Add property test generators for the new variable-length tuples. This
covers homogeneous tuples as well.

The property tests did their job! This identified several fixes we
needed to make to various type property methods.

cf https://github.com/astral-sh/ruff/pull/18600#issuecomment-2993764471

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-06-24 18:13:47 -04:00
Robsdedude
919af9628d [pygrep_hooks] Add AsyncMock methods to invalid-mock-access (PGH005) (#18547)
## Summary
This PR expands PGH005 to also check for AsyncMock methods in the same
vein. E.g., currently `assert mock.not_called` is linted; this PR adds
the corresponding async assertions, such as `assert mock.not_awaited()`.
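
An illustrative snippet of the kind of always-passing assertion the rule now
flags (not taken from the rule's fixtures):

```python
from unittest.mock import AsyncMock

mock = AsyncMock()

# Attribute access on a mock silently creates a child mock, so this
# "assertion" is always truthy and can never fail:
assert mock.not_awaited()

# What was almost certainly intended:
mock.assert_not_awaited()
```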

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-06-24 17:27:21 -04:00
Alex Waygood
9d8cba4e8b [ty] Improve disjointness inference for NominalInstanceTypes and SubclassOfTypes (#18864)
Co-authored-by: Carl Meyer <carl@astral.sh>
2025-06-24 20:27:37 +00:00
Josiah Kane
d89f75f9cc Fix link typo in ty's CONTRIBUTING.md (#18923) 2025-06-24 20:23:31 +00:00
Alex Waygood
e44c489273 [ty] Fix false positives when subscripting an object inferred as having an Intersection type (#18920) 2025-06-24 18:39:02 +00:00
chiri
3220242dec [flake8-use-pathlib] Add autofix for PTH202 (#18763)

## Summary
/closes #2331

## Test Plan
update snapshots

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-06-24 17:58:31 +00:00
Andrew Gallant
6abafcb565 [ty] Add relative import completion tests
This tests things like `from ...foo import <CURSOR>`.

I had previously tested this on an ad hoc basis inside
of my editor, so the token state machine already recognizes
this pattern.

Ref https://github.com/astral-sh/ruff/pull/18830#discussion_r2159670033
2025-06-24 11:41:16 -04:00
Andrew Gallant
cef1a522dc [ty] Clarify what "cursor" means
This commit does a small refactor to combine the file and
cursor offset into a single type. I think this makes it
clearer that even if there are multiple files in the cursor
test, this one in particular corresponds to the file that
contains the `<CURSOR>` marker.
2025-06-24 11:41:16 -04:00
Andrew Gallant
40731f0589 [ty] Add a cursor test builder
This doesn't change any functionality of the cursor tests, but does
re-arrange the code a bit. Firstly, it's now in a builder. And secondly,
there's an API to add multiple files to the test (but exactly one must
have a `<CURSOR>` marker).
2025-06-24 11:41:16 -04:00
Andrew Gallant
1461137407 [ty] Enforce sort order of completions (#18917)
We achieve this by setting the "sort text" field of every completion.
Since we are trying to be smart about the order, we want the client to
respect our order.

Prior to this change, VS Code was re-sorting completions in
lexicographic order. This in turn resulted in dunder attributes
appearing before "normal" attributes.
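
Roughly, the idea is the following (a Python sketch of the concept only; the
actual server is implemented in Rust):

```python
# Give each completion a zero-padded rank so that a client sorting
# lexicographically by `sortText` reproduces the server's intended order.
ranked = ["value", "name", "__init__"]  # server's preferred order
items = [
    {"label": label, "sortText": f"{index:04}"}
    for index, label in enumerate(ranked)
]
assert [it["label"] for it in sorted(items, key=lambda it: it["sortText"])] == ranked
```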
2025-06-24 11:31:08 -04:00
K
47653ca88a [formatter] Fix missing blank lines before decorated classes in .pyi files (#18888)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-24 16:25:44 +02:00
Brent Westbrook
02ae8e1210 Apply fix availability and applicability when adding to DiagnosticGuard and remove NoqaCode::rule (#18834)
## Summary

This PR removes the last two places we were using `NoqaCode::rule` in
`linter.rs` (see
https://github.com/astral-sh/ruff/pull/18391#discussion_r2154637329 and
https://github.com/astral-sh/ruff/pull/18391#discussion_r2154649726) by
checking whether fixes are actually desired before adding them to a
`DiagnosticGuard`. I implemented this by storing a `Violation`'s `Rule`
on the `DiagnosticGuard` so that we could check if it was enabled in the
embedded `LinterSettings` when trying to set a fix.

All of the corresponding `set_fix` methods on `OldDiagnostic` were now
unused (except in tests where I just set `.fix` directly), so I moved
these to the guard instead of keeping both sets.

The very last place where we were using `NoqaCode::rule` was in the
cache. I just reverted this to parsing the `Rule` from the name. I had
forgotten to update the comment there anyway. Hopefully this doesn't
cause too much of a perf hit.

In terms of binary size, we're back down almost to where `main` was two
days ago
(https://github.com/astral-sh/ruff/pull/18391#discussion_r2155034320):

```
41,559,344 bytes for main 2 days ago
41,669,840 bytes for #18391
41,653,760 bytes for main now (after #18391 merged)
41,602,224 bytes for this branch
```

Only 43 kb up, but that shouldn't all be me this time :)

## Test Plan

Existing tests and benchmarks on this PR
2025-06-24 10:08:36 -04:00
David Peter
0edbd6c390 py-fuzzer: allow relative executable paths (#18915)
## Summary

I tried running `py-fuzzer` using executables in the current working
directory, but that failed with:
```
▶ uvx --from ./python/py-fuzzer --reinstall fuzz --test-executable ./ty_feature --bin=ty --baseline-executable ./ty_main --only-new-bugs 0-500
Usage: fuzz [-h] [--only-new-bugs] [--quiet] [--test-executable TEST_EXECUTABLE] [--baseline-executable BASELINE_EXECUTABLE] --bin {ruff,ty} seeds [seeds ...]
fuzz: error: Bad argument passed to `--baseline-executable`: no such file or executable PosixPath('ty_main')
 "Bad argument passed to `--baseline-executable`: no such file or executable PosixPath('ty_main')"
```

Using `.absolute()` on the `Path` fixes this.
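
A hedged sketch of what the fix amounts to (the real argument handling in
py-fuzzer may differ):

```python
from pathlib import Path


def executable(value: str) -> Path:
    # Anchor the user-supplied path to the current working directory up
    # front, so a relative path like "./ty_main" stays valid however it is
    # normalized or used later.
    return Path(value).absolute()
```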


## Test Plan

Successful `py-fuzzer` run with the invocation above.
2025-06-24 15:16:21 +02:00
Micha Reiser
833be2e66a [ty] Change environment.root to accept multiple paths (#18913) 2025-06-24 14:52:36 +02:00
Micha Reiser
0194452928 [ty] Rename src.root setting to environment.root (#18760)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-06-24 14:40:44 +02:00
Dhruv Manilawala
2c4c015f74 Use file path for detecting package root (#18914)
Ref: https://github.com/astral-sh/ruff/pull/18910#discussion_r2163847956
2025-06-24 12:32:41 +00:00
Dhruv Manilawala
66fc7c8fc0 Consider virtual path for various server actions (#18910)
## Summary

Ref:
https://github.com/astral-sh/ruff/issues/14820#issuecomment-2996690681

This PR fixes a bug where virtual paths, or any paths that don't exist
on the file system, weren't being considered for checking inclusion /
exclusion. This was because the logic used `file_path`, which returns
`None` for those paths. This PR fixes that by using the
`virtual_file_path` method that returns a `Path` corresponding to the
actual file on disk or any kind of virtual path.

This should ideally just fix the above linked issue by way of excluding
the documents representing the interactive window because they aren't in
the inclusion set. It failed only on Windows previously because the file
path construction would fail and then Ruff would default to including
all the files.

## Test Plan

On my machine, the `.interactive` paths are always excluded so I'm using
the inclusion set instead:

```json
{
  "ruff.nativeServer": "on",
  "ruff.path": ["/Users/dhruv/work/astral/ruff/target/debug/ruff"],
  "ruff.configuration": {
    "extend-include": ["*.interactive"]
  }
}
```

The diagnostics are shown for both the file paths and the interactive
window:

<img width="1727" alt="Screenshot 2025-06-24 at 14 56 40"
src="https://github.com/user-attachments/assets/d36af96a-777e-4367-8acf-4d9c9014d025"
/>

And, the logs:

```
2025-06-24 14:56:26.478275000 DEBUG notification{method="notebookDocument/didChange"}: Included path via `extend-include`: /Interactive-1.interactive
```

And, when using `ruff.exclude` via:

```json
{
	"ruff.exclude": ["*.interactive"]
}
```

With logs:

```
2025-06-24 14:58:41.117743000 DEBUG notification{method="notebookDocument/didChange"}: Ignored path via `exclude`: /Interactive-1.interactive
```
2025-06-24 12:24:28 +00:00
Alex Waygood
237a5821ba [ty] Introduce UnionType::try_from_elements and UnionType::try_map (#18911) 2025-06-24 12:09:02 +00:00
Alex Waygood
27eee5a1a8 [ty] Support narrowing on isinstance()/issubclass() if the second argument is a dynamic, intersection, union or typevar type (#18900) 2025-06-24 10:55:26 +00:00
med1844
fd2cc37f90 [ty] Add decorator check for implicit attribute assignments (#18587)
## Summary

Previously, the checks for implicit attribute assignments didn't
properly account for method decorators. This PR fixes that by:

- Adding a decorator check in `implicit_instance_attribute`. This allows
it to filter out methods with mismatching decorators when analyzing
attribute assignments.
- Adding attribute search for implicit class attributes: if an attribute
can't be found directly in the class body, the
`ClassLiteral::own_class_member` function will now search in
classmethods.
- Adding `staticmethod`: it has been added to `KnownClass` and, together
with the new decorator check, it will no longer expose attributes when
the assignment target name is the same as the first method parameter
name.

If accepted, it should fix https://github.com/astral-sh/ty/issues/205
and https://github.com/astral-sh/ty/issues/207.
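
A minimal sketch of the two patterns described above (my reading of the
change; the class and attribute names are invented, not taken from the PR's
mdtests):

```python
class Counter:
    @classmethod
    def reset(cls) -> None:
        # Assignments in a classmethod are now searched when resolving
        # implicit class attributes such as `Counter.count`.
        cls.count = 0

    @staticmethod
    def misleading(self) -> None:
        # A staticmethod whose first parameter merely happens to be named
        # `self` no longer contributes implicit instance attributes.
        self.flag = True
```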

## Test Plan

This is tested with existing mdtest suites and is able to get most of
the TODO marks for implicit assignments in classmethods and
staticmethods removed.

However, there's one specific test case I failed to figure out how to
correctly resolve:


b279508bdc/crates/ty_python_semantic/resources/mdtest/attributes.md (L754-L755)

I tried to add an `instance_member().is_unbound()` check in this [else
branch](b279508bdc/crates/ty_python_semantic/src/types/infer.rs (L3299-L3301))
but it causes tests with class attributes defined in the class body to fail.
While it's possible to implicitly add `ClassVar` to qualifiers to make
this assignment fail and keep everything else passing, it doesn't feel
like the right solution.
2025-06-24 11:42:10 +02:00
Victor Hugo Gomes
ca7933804e [ruff] Trigger RUF037 for empty string and byte strings (#18862) 2025-06-24 08:26:28 +02:00
Dhruv Manilawala
e474f36473 [ty] Avoid duplicate diagnostic in unpacking (#18897)
## Summary

This PR fixes astral-sh/ty#185 by avoiding inferring the value expression
for an unpacking.

This is done simply by only inferring the value expression in a
non-unpacking branch for assignment statements, for statements, with
statements, and comprehensions.

This is a simpler alternative to
https://github.com/astral-sh/ruff/pull/18890 which I only realized in
hindsight! Ideally, the solution would be to consider the "unpack" as its
own region and do all of the inference of every expression involved in
an unpacking inside the unpack query and then merge the results in the
outer query. This would require access to the `Unpack` ingredient which
is stored on the `Definition`, and it would require creating the said
`Definition`s for all attributes and subscript expressions. It does
simplify the target inference logic by streamlining it into a single
`infer_target` method instead of the `infer_target`/`infer_target_impl`
split.

Additionally, #18890 also solves a couple of TODOs around raising errors
around attribute / subscript assignment.

## Test Plan

Update the existing test, go through a couple of ecosystem diagnostic.
2025-06-24 07:49:44 +05:30
Igor Drokin
da16e00751 [pyupgrade] Extend version detection to include sys.version_info.major (UP036) (#18633)
## Summary

Resolves #18165 

Added pattern `["sys", "version_info", "major"]` to the existing matches
for `sys.version_info` to ensure consistent handling of both the base
object and its major version attribute.
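
For instance (an illustrative snippet, not one of the rule's fixtures):

```python
import sys

# With a minimum supported version of Python 3, both blocks below are dead
# code. The rule already understood comparisons on `sys.version_info` itself;
# it now also recognizes the `.major` attribute access.
if sys.version_info < (3,):
    print("running on Python 2")

if sys.version_info.major < 3:
    print("running on Python 2")
```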

## Test Plan
`cargo nextest run` and `cargo insta test`

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-06-23 20:01:55 +00:00
Илья Любавский
885dc9091f [ruff] Frozen Dataclass default should be valid (RUF009) (#18735)

## Summary
/closes #17424

## Test Plan

2025-06-23 19:10:12 +00:00
Andrew Gallant
d01e0faee3 [ty] Include imported sub-modules as attributes on modules for completions (#18898)
This also adds a new `ModuleName::relative_to` public API to help with
this.

Kudos to @AlexWaygood for the meat of this patch!

Ref https://github.com/astral-sh/ruff/pull/18830#discussion_r2161770991
2025-06-23 12:48:16 -04:00
Suneet Tipirneni
ef8281b695 [ty] add support for mapped union and intersection subscript loads (#18846)
## Summary

Note this modifies the diagnostics a bit. Previously performing
subscript access on something like `NotSubscriptable1 |
NotSubscriptable2` would report the full type as not being
subscriptable:

```
[non-subscriptable] "Cannot subscript object of type `NotSubscriptable1 | NotSubscriptable2` with no `__getitem__` method"
```

Now each erroneous constituent has a separate error:

```
[non-subscriptable] "Cannot subscript object of type `NotSubscriptable2` with no `__getitem__` method"
[non-subscriptable] "Cannot subscript object of type `NotSubscriptable1` with no `__getitem__` method"
```
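
A snippet that would produce those two separate diagnostics might look like
this (illustrative; the PR's own mdtests may differ):

```python
class NotSubscriptable1: ...
class NotSubscriptable2: ...


def f(x: NotSubscriptable1 | NotSubscriptable2) -> None:
    x[0]  # one diagnostic per constituent that lacks `__getitem__`
```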

Closes https://github.com/astral-sh/ty/issues/625

## Test Plan

 mdtest

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-06-23 16:38:01 +00:00
Andrew Gallant
a77db3da3f [ty] Add completions for from module import <CURSOR> (#18830)
There were two main challenges in this PR.

The first was mostly just figuring out how to get the symbols
corresponding to `module`. It turns out that we do this in a couple
of places in ty already, but through different means. In one approach,
we use [`exported_names`]. In another approach, we get a `Type`
corresponding to the module. We take the latter approach here, which is
consistent with how we do completions elsewhere. (I looked into
factoring this logic out into its own function, but it ended up being
pretty constrained. e.g., There's only one other place where we want to
go from `ast::StmtImportFrom` to a module `Type`, and that code also
wants the module name.)

The second challenge was recognizing the `from module import <CURSOR>`
pattern in the code. I initially started with some fixed token patterns
to get a proof of concept working. But I ended up switching to mini
state machine over tokens. I looked at the parser for `StmtImportFrom`
to determine what kinds of tokens we can expect.

[`exported_names`]:
23a3b6ef23/crates/ty_python_semantic/src/semantic_index/re_exports.rs (L47)
2025-06-23 10:43:25 -04:00
Victor Hugo Gomes
9e9c4fe17b [flake8-simplify] Fix SIM911 autofix creating a syntax error (#18793)

## Summary
The fix would create a syntax error if there wasn't a space between the
`in` keyword and the following expression.
For example:
```python
for country, stars in(zip)(flag_stars.keys(), flag_stars.values()):...
```

I also noticed that the tests for `SIM911` were not being run, so I
fixed that.

Fixes #18776

## Test Plan

Add regression test
2025-06-23 16:24:47 +02:00
Victor Hugo Gomes
f4c6ff3f68 [pylint] Fix PLC2801 autofix creating a syntax error (#18857)

## Summary
This PR fixes `PLC2801` autofix creating a syntax error due to lack of
padding if it is directly after a keyword.

Fixes https://github.com/astral-sh/ruff/issues/18813

## Test Plan
Add regression test
2025-06-23 10:15:53 -04:00
Vasco Schiavo
ca8ed35275 [flake8-simplify] Preserve original behavior for except () and bare except (SIM105) (#18213)
The PR addresses issue #18209.
2025-06-23 10:01:57 -04:00
GiGaGon
315fb0f3da [flake8-logging] Add fix safety section to LOG002 (#18840)
## Summary

Part of #15584

This adds a `Fix safety` section to [invalid-get-logger-argument
(LOG002)](https://docs.astral.sh/ruff/rules/invalid-get-logger-argument/#invalid-get-logger-argument-log002).

The fix/lint was introduced in #7399
No reasoning is given on the unsafety in the PR/code
Unsafe fix demonstration:
[playground](https://play.ruff.rs/e8008cbf-2ef5-4d38-8255-324f90e624cb)
```py
import logging
logger = logging.getLogger(__file__)
```

---------

Co-authored-by: Dylan <dylwil3@gmail.com>
2025-06-23 13:23:01 +00:00
GiGaGon
bbc26b2f11 [pyupgrade] Add fix safety section to UP004 (#18853)

## Summary

Part of #15584

This adds a `Fix safety` section to [useless-object-inheritance
(UP004)](https://docs.astral.sh/ruff/rules/useless-object-inheritance/#useless-object-inheritance-up004)

I could not track down the original PR as this rule is so old it has
gone through several large ruff refactors.
No reasoning is given on the unsafety in the PR/code.
The unsafety is determined here:

f24e650dfd/crates/ruff_linter/src/rules/pyupgrade/rules/useless_class_metaclass_type.rs (L76-L80)

Unsafe fix demonstration:
[playground](https://play.ruff.rs/12b24eb4-d7a5-4ae0-93bb-492d64967ae3)
```py
class A(  # will be deleted
    object
):
    ...
```

## Test Plan

N/A, no tests/functionality affected
2025-06-23 08:22:36 -05:00
GiGaGon
ec07a0f885 [flake8-use-pathlib] Add fix safety section to PTH201 (#18837)

## Summary

Part of #15584

This adds a `Fix safety` section to [path-constructor-current-directory
(PTH201)](https://docs.astral.sh/ruff/rules/path-constructor-current-directory/#path-constructor-current-directory-pth201)

I could not track down the original PR as this rule is so old it has
gone through several large ruff refactors.
The unsafety is determined here:

d9266284df/crates/ruff_linter/src/rules/flake8_use_pathlib/rules/path_constructor_current_directory.rs (L55-L59)
Unsafe code example:
[playground](https://play.ruff.rs/76da532a-c7ad-4ef9-bba3-4626296e5317)
```py
from pathlib import Path
Path(#
    "."#
)
```

## Test Plan

N/A, no tests/functionality affected
2025-06-23 08:22:00 -05:00
GiGaGon
861dff1dd8 [refurb] Add fix safety section to FURB122 (#18842)
## Summary

Part of #15584

This adds a `Fix safety` section to [for-loop-writes
(FURB122)](https://docs.astral.sh/ruff/rules/for-loop-writes/#for-loop-writes-furb122).

The fix/lint was introduced in #10630
No reasoning is given on the unsafety in the PR/code.
The unsafety is determined here:

ea812d0813/crates/ruff_linter/src/rules/refurb/rules/for_loop_writes.rs (L200-L204)
Unsafe fix demonstration:
[playground](https://play.ruff.rs/06592f33-10b9-4a77-b31e-0d3a98f402f4)
```py
with open("issue.txt", "w") as f:
    for i in range(10):
        # will be deleted
        f.write(str(i))
```

---------

Co-authored-by: Dylan <dylwil3@gmail.com>
2025-06-23 13:21:37 +00:00
GiGaGon
8be205df99 [pyupgrade] Add fix safety section to UP010 (#18838)

## Summary

Part of #15584

This adds a `Fix safety` section to [unnecessary-future-import
(UP010)](https://docs.astral.sh/ruff/rules/unnecessary-future-import/#unnecessary-future-import-up010)

The unsafety is determined here:

d9266284df/crates/ruff_linter/src/rules/pyupgrade/rules/unnecessary_future_import.rs (L128-L132)

Unsafe code example:
[playground](https://play.ruff.rs/c07d8c41-9ab8-4b86-805b-8cf482d450d9)
```py
from __future__ import (print_function,# ...
__annotations__)  # ...
```

Edit: It looks like there was already a PR for this, #17490, but I
missed it since they said `UP029` instead of `UP010` :/

## Test Plan

N/A, no tests/functionality affected
2025-06-23 08:20:55 -05:00
GiGaGon
34fd44eb55 [flake8-logging] Add fix safety section to LOG001 (#18841)
Part of #15584

This adds a `Fix safety` section to [direct-logger-instantiation
(LOG001)](https://docs.astral.sh/ruff/rules/direct-logger-instantiation/#direct-logger-instantiation-log001).

The fix/lint was introduced in #7397
No reasoning is given on the unsafety in the PR/code
Unsafe fix demonstration:
[playground](https://play.ruff.rs/72e7277e-a9db-4cd9-9afb-2c56ef2db361)
```py
import logging
logger = logging.Logger(__name__)
```

## Test Plan

N/A, no tests/functionality affected

---------

Co-authored-by: Dylan <dylwil3@gmail.com>
2025-06-23 13:19:23 +00:00
Micha Reiser
b64307f65e fix casing of analyze.direction variant names (#18892)
## Summary

Fixes `analyze.direction` to use kebab-case for the variant names. 

Fixes https://github.com/astral-sh/ruff/issues/18887

## Test Plan

Created a `ruff.toml` and tested that both `dependents` and `Dependents`
were accepted
2025-06-23 14:30:30 +02:00
Victor Hugo Gomes
291413b126 [perflint] Fix PERF101 autofix creating a syntax error and mark autofix as unsafe if there are comments in the list call expr (#18803)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-23 11:51:46 +00:00
Dan Parizher
7ec7853cec [flake8-pytest-style] PT001/PT023 fix makes syntax error on parenthesized decorator (#18782)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-23 13:46:15 +02:00
David Peter
907c291877 [ty] Update mypy_primer, add two new projects (#18891)
## Summary

Pull in latest changes to mypy_primer:
01a7ca325f..e5f5544796
2025-06-23 13:08:11 +02:00
David Peter
21303d1a02 [ty] Minor change to builtins.md test (#18889)
## Summary

As far as I can tell, the two existing tests did the exact same thing.
Remove the redundant test, and add tests for all combinations of
declared/not-declared and local/"public" use of the name.

Proposing this as a separate PR before the behavior might change via
https://github.com/astral-sh/ruff/pull/18750
2025-06-23 12:32:50 +02:00
Victor Hugo Gomes
528ae8083b Remove redundant settings field from Checker (#18845)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-23 11:06:44 +02:00
Victor Hugo Gomes
0ce022e64e [refurb] Fix FURB163 autofix creating a syntax error for yield expressions (#18756) 2025-06-23 10:13:03 +02:00
Victor Hugo Gomes
659ecba477 [pylint] Supress PLE2510/2512/2513/2514/2515 autofix if the text contains an odd number of backslashes (#18856)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-23 10:11:51 +02:00
671 changed files with 22305 additions and 12324 deletions


@@ -49,7 +49,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build sdist"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
command: sdist
args: --out dist
@@ -79,7 +79,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels - x86_64"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: x86_64
args: --release --locked --out dist
@@ -121,7 +121,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels - aarch64"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: aarch64
args: --release --locked --out dist
@@ -177,7 +177,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
args: --release --locked --out dist
@@ -230,7 +230,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
manylinux: auto
@@ -304,7 +304,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
@@ -370,7 +370,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
manylinux: musllinux_1_2
@@ -435,7 +435,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: musllinux_1_2


@@ -321,14 +321,30 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- name: Setup Dev Drive
run: ${{ github.workspace }}/.github/workflows/setup-dev-drive.ps1
# actions/checkout does not let us clone into anywhere outside `github.workspace`, so we have to copy the clone
- name: Copy Git Repo to Dev Drive
env:
RUFF_WORKSPACE: ${{ env.RUFF_WORKSPACE }}
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${env:RUFF_WORKSPACE}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:
workspaces: ${{ env.RUFF_WORKSPACE }}
- name: "Install Rust toolchain"
working-directory: ${{ env.RUFF_WORKSPACE }}
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
with:
tool: cargo-nextest
- name: "Run tests"
working-directory: ${{ env.RUFF_WORKSPACE }}
shell: bash
env:
NEXTEST_PROFILE: "ci"
@@ -460,7 +476,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
name: Download Ruff binary to test
id: download-cached-binary
@@ -661,7 +677,7 @@ jobs:
branch: ${{ github.event.pull_request.base.ref }}
workflow: "ci.yaml"
check_artifacts: true
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: Fuzz
env:
FORCE_COLOR: 1
@@ -712,7 +728,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
args: --out dist
- name: "Test wheel"
@@ -731,7 +747,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
@@ -774,7 +790,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: Install uv
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uv pip install -r docs/requirements-insiders.txt --system
@@ -906,7 +922,7 @@ jobs:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Rust toolchain"
run: rustup show
@@ -939,7 +955,7 @@ jobs:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Rust toolchain"
run: rustup show


@@ -34,7 +34,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"


@@ -37,7 +37,7 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:
@@ -48,6 +48,8 @@ jobs:
- name: Run mypy_primer
shell: bash
env:
TY_MEMORY_REPORT: mypy_primer
run: |
cd ruff
@@ -70,7 +72,7 @@ jobs:
echo "Project selector: $PRIMER_SELECTOR"
# Allow the exit code to be 0 or 1, only fail for actual mypy_primer crashes/bugs
uvx \
--from="git+https://github.com/hauntsaninja/mypy_primer@01a7ca325f674433c58e02416a867178d1571128" \
--from="git+https://github.com/hauntsaninja/mypy_primer@e5f55447969d33ae3c7ccdb183e2a37101867270" \
mypy_primer \
--repo ruff \
--type-checker ty \


@@ -22,7 +22,7 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
pattern: wheels-*

.github/workflows/setup-dev-drive.ps1 vendored Normal file

@@ -0,0 +1,93 @@
# Configures a drive for testing in CI.
#
# When using standard GitHub Actions runners, a `D:` drive is present and has
# similar or better performance characteristics than a ReFS dev drive. Sometimes
# using a larger runner is still more performant (e.g., when running the test
# suite) and we need to create a dev drive. This script automatically configures
# the appropriate drive.
#
# When using GitHub Actions' "larger runners", the `D:` drive is not present and
# we create a DevDrive mount on `C:`. This is purported to be more performant
# than an ReFS drive, though we did not see a change when we switched over.
#
# When using Depot runners, the underlying infrastructure is EC2, which does not
# support Hyper-V. The `New-VHD` commandlet only works with Hyper-V, but we can
# create a ReFS drive using `diskpart` and `format` directly. We cannot use a
# DevDrive, as that also requires Hyper-V. The Depot runners use `D:` already,
# so we must check if it's a Depot runner first, and we use `V:` as the target
# instead.
if ($env:DEPOT_RUNNER -eq "1") {
Write-Output "DEPOT_RUNNER detected, setting up custom dev drive..."
# Create VHD and configure drive using diskpart
$vhdPath = "C:\ruff_dev_drive.vhdx"
@"
create vdisk file="$vhdPath" maximum=20480 type=expandable
attach vdisk
create partition primary
active
assign letter=V
"@ | diskpart
# Format the drive as ReFS
format V: /fs:ReFS /q /y
$Drive = "V:"
Write-Output "Custom dev drive created at $Drive"
} elseif (Test-Path "D:\") {
# Note `Get-PSDrive` is not sufficient because the drive letter is assigned.
Write-Output "Using existing drive at D:"
$Drive = "D:"
} else {
# The size (20 GB) is chosen empirically to be large enough for our
# workflows; larger drives can take longer to set up.
$Volume = New-VHD -Path C:/ruff_dev_drive.vhdx -SizeBytes 20GB |
Mount-VHD -Passthru |
Initialize-Disk -Passthru |
New-Partition -AssignDriveLetter -UseMaximumSize |
Format-Volume -DevDrive -Confirm:$false -Force
$Drive = "$($Volume.DriveLetter):"
# Set the drive as trusted
# See https://learn.microsoft.com/en-us/windows/dev-drive/#how-do-i-designate-a-dev-drive-as-trusted
fsutil devdrv trust $Drive
# Disable antivirus filtering on dev drives
# See https://learn.microsoft.com/en-us/windows/dev-drive/#how-do-i-configure-additional-filters-on-dev-drive
fsutil devdrv enable /disallowAv
# Remount so the changes take effect
Dismount-VHD -Path C:/ruff_dev_drive.vhdx
Mount-VHD -Path C:/ruff_dev_drive.vhdx
# Show some debug information
Write-Output $Volume
fsutil devdrv query $Drive
Write-Output "Using Dev Drive at $Volume"
}
$Tmp = "$($Drive)\ruff-tmp"
# Create the directory ahead of time in an attempt to avoid race-conditions
New-Item $Tmp -ItemType Directory
# Move Cargo to the dev drive
New-Item -Path "$($Drive)/.cargo/bin" -ItemType Directory -Force
if (Test-Path "C:/Users/runneradmin/.cargo") {
Copy-Item -Path "C:/Users/runneradmin/.cargo/*" -Destination "$($Drive)/.cargo/" -Recurse -Force
}
Write-Output `
"DEV_DRIVE=$($Drive)" `
"TMP=$($Tmp)" `
"TEMP=$($Tmp)" `
"UV_INTERNAL__TEST_DIR=$($Tmp)" `
"RUSTUP_HOME=$($Drive)/.rustup" `
"CARGO_HOME=$($Drive)/.cargo" `
"RUFF_WORKSPACE=$($Drive)/ruff" `
"PATH=$($Drive)/.cargo/bin;$env:PATH" `
>> $env:GITHUB_ENV


@@ -32,7 +32,7 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:


@@ -81,7 +81,7 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.13
rev: v0.12.1
hooks:
- id: ruff-format
- id: ruff
@@ -91,7 +91,7 @@ repos:
# Prettier
- repo: https://github.com/rbubley/mirrors-prettier
rev: v3.5.3
rev: v3.6.2
hooks:
- id: prettier
types: [yaml]
@@ -99,12 +99,12 @@ repos:
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/woodruffw/zizmor-pre-commit
rev: v1.9.0
rev: v1.10.0
hooks:
- id: zizmor
- repo: https://github.com/python-jsonschema/check-jsonschema
rev: 0.33.0
rev: 0.33.1
hooks:
- id: check-github-workflows


@@ -1,5 +1,81 @@
# Changelog
## 0.12.1
### Preview features
- \[`flake8-errmsg`\] Extend `EM101` to support byte strings ([#18867](https://github.com/astral-sh/ruff/pull/18867))
- \[`flake8-use-pathlib`\] Add autofix for `PTH202` ([#18763](https://github.com/astral-sh/ruff/pull/18763))
- \[`pygrep-hooks`\] Add `AsyncMock` methods to `invalid-mock-access` (`PGH005`) ([#18547](https://github.com/astral-sh/ruff/pull/18547))
- \[`pylint`\] Ignore `__init__.py` files in (`PLC0414`) ([#18400](https://github.com/astral-sh/ruff/pull/18400))
- \[`ruff`\] Trigger `RUF037` for empty string and byte strings ([#18862](https://github.com/astral-sh/ruff/pull/18862))
- [formatter] Fix missing blank lines before decorated classes in `.pyi` files ([#18888](https://github.com/astral-sh/ruff/pull/18888))
### Bug fixes
- Avoid generating diagnostics with per-file ignores ([#18801](https://github.com/astral-sh/ruff/pull/18801))
- Handle parenthesized arguments in `remove_argument` ([#18805](https://github.com/astral-sh/ruff/pull/18805))
- \[`flake8-logging`\] Avoid false positive for `exc_info=True` outside `logger.exception` (`LOG014`) ([#18737](https://github.com/astral-sh/ruff/pull/18737))
- \[`flake8-pytest-style`\] Enforce `pytest` import for decorators ([#18779](https://github.com/astral-sh/ruff/pull/18779))
- \[`flake8-pytest-style`\] Mark autofix for `PT001` and `PT023` as unsafe if there's comments in the decorator ([#18792](https://github.com/astral-sh/ruff/pull/18792))
- \[`flake8-pytest-style`\] `PT001`/`PT023` fix makes syntax error on parenthesized decorator ([#18782](https://github.com/astral-sh/ruff/pull/18782))
- \[`flake8-raise`\] Make fix unsafe if it deletes comments (`RSE102`) ([#18788](https://github.com/astral-sh/ruff/pull/18788))
- \[`flake8-simplify`\] Fix `SIM911` autofix creating a syntax error ([#18793](https://github.com/astral-sh/ruff/pull/18793))
- \[`flake8-simplify`\] Fix false negatives for shadowed bindings (`SIM910`, `SIM911`) ([#18794](https://github.com/astral-sh/ruff/pull/18794))
- \[`flake8-simplify`\] Preserve original behavior for `except ()` and bare `except` (`SIM105`) ([#18213](https://github.com/astral-sh/ruff/pull/18213))
- \[`flake8-pyi`\] Fix `PYI041`'s fix causing `TypeError` with `None | None | ...` ([#18637](https://github.com/astral-sh/ruff/pull/18637))
- \[`perflint`\] Fix `PERF101` autofix creating a syntax error and mark autofix as unsafe if there are comments in the `list` call expr ([#18803](https://github.com/astral-sh/ruff/pull/18803))
- \[`perflint`\] Fix false negative in `PERF401` ([#18866](https://github.com/astral-sh/ruff/pull/18866))
- \[`pylint`\] Avoid flattening nested `min`/`max` when outer call has single argument (`PLW3301`) ([#16885](https://github.com/astral-sh/ruff/pull/16885))
- \[`pylint`\] Fix `PLC2801` autofix creating a syntax error ([#18857](https://github.com/astral-sh/ruff/pull/18857))
- \[`pylint`\] Mark `PLE0241` autofix as unsafe if there's comments in the base classes ([#18832](https://github.com/astral-sh/ruff/pull/18832))
- \[`pylint`\] Suppress `PLE2510`/`PLE2512`/`PLE2513`/`PLE2514`/`PLE2515` autofix if the text contains an odd number of backslashes ([#18856](https://github.com/astral-sh/ruff/pull/18856))
- \[`refurb`\] Detect more exotic float literals in `FURB164` ([#18925](https://github.com/astral-sh/ruff/pull/18925))
- \[`refurb`\] Fix `FURB163` autofix creating a syntax error for `yield` expressions ([#18756](https://github.com/astral-sh/ruff/pull/18756))
- \[`refurb`\] Mark `FURB129` autofix as unsafe if there's comments in the `readlines` call ([#18858](https://github.com/astral-sh/ruff/pull/18858))
- \[`ruff`\] Fix false positives and negatives in `RUF010` ([#18690](https://github.com/astral-sh/ruff/pull/18690))
- Fix casing of `analyze.direction` variant names ([#18892](https://github.com/astral-sh/ruff/pull/18892))
### Rule changes
- Fix f-string interpolation escaping in generated fixes ([#18882](https://github.com/astral-sh/ruff/pull/18882))
- \[`flake8-return`\] Mark `RET501` fix unsafe if comments are inside ([#18780](https://github.com/astral-sh/ruff/pull/18780))
- \[`flake8-async`\] Fix detection for large integer sleep durations in `ASYNC116` rule ([#18767](https://github.com/astral-sh/ruff/pull/18767))
- \[`flake8-async`\] Mark autofix for `ASYNC115` as unsafe if the call expression contains comments ([#18753](https://github.com/astral-sh/ruff/pull/18753))
- \[`flake8-bugbear`\] Mark autofix for `B004` as unsafe if the `hasattr` call expr contains comments ([#18755](https://github.com/astral-sh/ruff/pull/18755))
- \[`flake8-comprehension`\] Mark autofix for `C420` as unsafe if there's comments inside the dict comprehension ([#18768](https://github.com/astral-sh/ruff/pull/18768))
- \[`flake8-comprehensions`\] Handle template strings for comprehension fixes ([#18710](https://github.com/astral-sh/ruff/pull/18710))
- \[`flake8-future-annotations`\] Add autofix (`FA100`) ([#18903](https://github.com/astral-sh/ruff/pull/18903))
- \[`pyflakes`\] Mark `F504`/`F522`/`F523` autofix as unsafe if there's a call with side effect ([#18839](https://github.com/astral-sh/ruff/pull/18839))
- \[`pylint`\] Allow fix with comments and document performance implications (`PLW3301`) ([#18936](https://github.com/astral-sh/ruff/pull/18936))
- \[`pylint`\] Detect more exotic `NaN` literals in `PLW0177` ([#18630](https://github.com/astral-sh/ruff/pull/18630))
- \[`pylint`\] Fix `PLC1802` autofix creating a syntax error and mark autofix as unsafe if there's comments in the `len` call ([#18836](https://github.com/astral-sh/ruff/pull/18836))
- \[`pyupgrade`\] Extend version detection to include `sys.version_info.major` (`UP036`) ([#18633](https://github.com/astral-sh/ruff/pull/18633))
- \[`ruff`\] Add lint rule `RUF064` for calling `chmod` with non-octal integers ([#18541](https://github.com/astral-sh/ruff/pull/18541))
- \[`ruff`\] Added `cls.__dict__.get('__annotations__')` check (`RUF063`) ([#18233](https://github.com/astral-sh/ruff/pull/18233))
- \[`ruff`\] Frozen `dataclass` default should be valid (`RUF009`) ([#18735](https://github.com/astral-sh/ruff/pull/18735))
### Server
- Consider virtual path for various server actions ([#18910](https://github.com/astral-sh/ruff/pull/18910))
### Documentation
- Add fix safety sections ([#18940](https://github.com/astral-sh/ruff/pull/18940),[#18841](https://github.com/astral-sh/ruff/pull/18841),[#18802](https://github.com/astral-sh/ruff/pull/18802),[#18837](https://github.com/astral-sh/ruff/pull/18837),[#18800](https://github.com/astral-sh/ruff/pull/18800),[#18415](https://github.com/astral-sh/ruff/pull/18415),[#18853](https://github.com/astral-sh/ruff/pull/18853),[#18842](https://github.com/astral-sh/ruff/pull/18842))
- Use updated pre-commit id ([#18718](https://github.com/astral-sh/ruff/pull/18718))
- \[`perflint`\] Small docs improvement to `PERF401` ([#18786](https://github.com/astral-sh/ruff/pull/18786))
- \[`pyupgrade`\]: Use `super()`, not `__super__` in error messages (`UP008`) ([#18743](https://github.com/astral-sh/ruff/pull/18743))
- \[`flake8-pie`\] Small docs fix to `PIE794` ([#18829](https://github.com/astral-sh/ruff/pull/18829))
- \[`flake8-pyi`\] Correct `collections-named-tuple` example to use PascalCase assignment ([#16884](https://github.com/astral-sh/ruff/pull/16884))
- \[`flake8-pie`\] Add note on type checking benefits to `unnecessary-dict-kwargs` (`PIE804`) ([#18666](https://github.com/astral-sh/ruff/pull/18666))
- \[`pycodestyle`\] Clarify PEP 8 relationship to `whitespace-around-operator` rules ([#18870](https://github.com/astral-sh/ruff/pull/18870))
### Other changes
- Disallow newlines in format specifiers of single quoted f- or t-strings ([#18708](https://github.com/astral-sh/ruff/pull/18708))
- \[`flake8-logging`\] Add fix safety section to `LOG002` ([#18840](https://github.com/astral-sh/ruff/pull/18840))
- \[`pyupgrade`\] Add fix safety section to `UP010` ([#18838](https://github.com/astral-sh/ruff/pull/18838))
## 0.12.0
Check out the [blog post](https://astral.sh/blog/ruff-v0.12.0) for a migration
@@ -16,7 +92,7 @@ guide and overview of the changes!
- **New default Python version handling for syntax errors**
Ruff will default to the _latest_ supported Python version (3.13) when
Ruff will default to the *latest* supported Python version (3.13) when
checking for the version-related syntax errors mentioned above to prevent
false positives in projects without a Python version configured. The default
in all other cases, like applying lint rules, is unchanged and remains at the
@@ -71,7 +147,7 @@ The following rules have been stabilized and are no longer in preview:
- [`class-with-mixed-type-vars`](https://docs.astral.sh/ruff/rules/class-with-mixed-type-vars) (`RUF053`)
- [`unnecessary-round`](https://docs.astral.sh/ruff/rules/unnecessary-round) (`RUF057`)
- [`starmap-zip`](https://docs.astral.sh/ruff/rules/starmap-zip) (`RUF058`)
- [`non-pep604-annotation-optional`](https://docs.astral.sh/ruff/rules/non-pep604-annotation-optional) (`UP045`)
- [`non-pep604-annotation-optional`] (`UP045`)
- [`non-pep695-generic-class`](https://docs.astral.sh/ruff/rules/non-pep695-generic-class) (`UP046`)
- [`non-pep695-generic-function`](https://docs.astral.sh/ruff/rules/non-pep695-generic-function) (`UP047`)
- [`private-type-parameter`](https://docs.astral.sh/ruff/rules/private-type-parameter) (`UP049`)

Cargo.lock generated

@@ -132,6 +132,15 @@ version = "1.0.98"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e16d2d3311acee920a9eb8d33b8cbc1787ce4a264e85f964c2404b969bdcd487"
[[package]]
name = "approx"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cab112f0a86d568ea0e627cc1d6be74a1e9cd55214684db5561995f6dad897c6"
dependencies = [
"num-traits",
]
[[package]]
name = "arc-swap"
version = "1.7.1"
@@ -169,6 +178,36 @@ dependencies = [
"tempfile",
]
[[package]]
name = "attribute-derive"
version = "0.10.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0053e96dd3bec5b4879c23a138d6ef26f2cb936c9cdc96274ac2b9ed44b5bb54"
dependencies = [
"attribute-derive-macro",
"derive-where",
"manyhow",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "attribute-derive-macro"
version = "0.10.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "463b53ad0fd5b460af4b1915fe045ff4d946d025fb6c4dc3337752eaa980f71b"
dependencies = [
"collection_literals",
"interpolator",
"manyhow",
"proc-macro-utils",
"proc-macro2",
"quote",
"quote-use",
"syn",
]
[[package]]
name = "autocfg"
version = "1.4.0"
@@ -181,6 +220,15 @@ version = "0.13.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9e1b586273c5702936fe7b7d6896644d8be71e6314cfe09d3167c95f712589e8"
[[package]]
name = "bincode"
version = "1.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1f45e9417d87227c7a56d22e471c6206462cba514c7590c09aff4cf6d1ddcad"
dependencies = [
"serde",
]
[[package]]
name = "bincode"
version = "2.0.1"
@@ -419,9 +467,9 @@ checksum = "f46ad14479a25103f283c0f10005961cf086d8dc42205bb44c46ac563475dca6"
[[package]]
name = "clearscreen"
version = "4.0.1"
version = "4.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c41dc435a7b98e4608224bbf65282309f5403719df9113621b30f8b6f74e2f4"
checksum = "85a8ab73a1c02b0c15597b22e09c7dc36e63b2f601f9d1e83ac0c3decd38b1ae"
dependencies = [
"nix 0.29.0",
"terminfo",
@@ -432,22 +480,27 @@ dependencies = [
[[package]]
name = "codspeed"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "93f4cce9c27c49c4f101fffeebb1826f41a9df2e7498b7cd4d95c0658b796c6c"
checksum = "922018102595f6668cdd09c03f4bff2d951ce2318c6dca4fe11bdcb24b65b2bf"
dependencies = [
"anyhow",
"bincode 1.3.3",
"colored 2.2.0",
"glob",
"libc",
"nix 0.29.0",
"serde",
"serde_json",
"statrs",
"uuid",
]
[[package]]
name = "codspeed-criterion-compat"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c3c23d880a28a2aab52d38ca8481dd7a3187157d0a952196b6db1db3c8499725"
checksum = "24d8ad82d2383cb74995f58993cbdd2914aed57b2f91f46580310dd81dc3d05a"
dependencies = [
"codspeed",
"codspeed-criterion-compat-walltime",
@@ -456,9 +509,9 @@ dependencies = [
[[package]]
name = "codspeed-criterion-compat-walltime"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7b0a2f7365e347f4f22a67e9ea689bf7bc89900a354e22e26cf8a531a42c8fbb"
checksum = "61badaa6c452d192a29f8387147888f0ab358553597c3fe9bf8a162ef7c2fa64"
dependencies = [
"anes",
"cast",
@@ -481,9 +534,9 @@ dependencies = [
[[package]]
name = "codspeed-divan-compat"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8620a09dfaf37b3c45f982c4b65bd8f9b0203944da3ffa705c0fcae6b84655ff"
checksum = "3acf1d6fe367c2ff5ff136ca723f678490c3691d59d7f2b83d5e53b7b25ac91e"
dependencies = [
"codspeed",
"codspeed-divan-compat-macros",
@@ -492,9 +545,9 @@ dependencies = [
[[package]]
name = "codspeed-divan-compat-macros"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "30fe872bc4214626b35d3a1706a905d0243503bb6ba3bb7be2fc59083d5d680c"
checksum = "bcfa2013d7bee54a497d0e1410751d5de690fd67a3e9eb728ca049b6a3d16d0b"
dependencies = [
"divan-macros",
"itertools 0.14.0",
@@ -506,9 +559,9 @@ dependencies = [
[[package]]
name = "codspeed-divan-compat-walltime"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "104caa97b36d4092d89e24e4b103b40ede1edab03c0372d19e14a33f9393132b"
checksum = "e513100fb0e7ba02fb3824546ecd2abfb8f334262f0972225b463aad07f99ff0"
dependencies = [
"cfg-if",
"clap",
@@ -519,6 +572,12 @@ dependencies = [
"regex-lite",
]
[[package]]
name = "collection_literals"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "186dce98367766de751c42c4f03970fc60fc012296e706ccbb9d5df9b6c1e271"
[[package]]
name = "colorchoice"
version = "1.0.3"
@@ -808,6 +867,17 @@ dependencies = [
"parking_lot_core",
]
[[package]]
name = "derive-where"
version = "1.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "510c292c8cf384b1a340b816a9a6cf2599eb8f566a44949024af88418000c50b"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "diff"
version = "0.1.13"
@@ -930,35 +1000,12 @@ version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34aa73646ffb006b8f5147f3dc182bd4bcb190227ce861fc4a4844bf8e3cb2c0"
[[package]]
name = "env_filter"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "186e05a59d4c50738528153b83b0b0194d3a29507dfec16eccd4b342903397d0"
dependencies = [
"log",
"regex",
]
[[package]]
name = "env_home"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c7f84e12ccf0a7ddc17a6c41c93326024c42920d7ee630d04950e6926645c0fe"
[[package]]
name = "env_logger"
version = "0.11.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "13c863f0904021b108aa8b2f55046443e6b1ebde8fd4a15c399893aae4fa069f"
dependencies = [
"anstream",
"anstyle",
"env_filter",
"jiff",
"log",
]
[[package]]
name = "equivalent"
version = "1.0.2"
@@ -1090,6 +1137,29 @@ dependencies = [
"version_check",
]
[[package]]
name = "get-size-derive2"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1aac2af9f9a6a50e31b1e541d05b7925add83d3982c2793193fe9d4ee584323c"
dependencies = [
"attribute-derive",
"quote",
"syn",
]
[[package]]
name = "get-size2"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "624a0312efd19e1c45922dfcc2d6806d3ffc4bca261f89f31fcc4f63f438d885"
dependencies = [
"compact_str",
"get-size-derive2",
"hashbrown 0.15.4",
"smallvec",
]
[[package]]
name = "getopts"
version = "0.2.21"
@@ -1400,9 +1470,9 @@ dependencies = [
[[package]]
name = "indexmap"
version = "2.9.0"
version = "2.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cea70ddb795996207ad57735b50c5982d8844f38ba9ee5f1aedcfb708a2aa11e"
checksum = "fe4cd85333e22411419a0bcae1297d25e58c9443848b11dc6a86fefe8c78a661"
dependencies = [
"equivalent",
"hashbrown 0.15.4",
@@ -1478,6 +1548,12 @@ dependencies = [
"serde_json",
]
[[package]]
name = "interpolator"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "71dd52191aae121e8611f1e8dc3e324dd0dd1dee1e6dd91d10ee07a3cfb4d9d8"
[[package]]
name = "intrusive-collections"
version = "0.9.7"
@@ -1739,9 +1815,9 @@ checksum = "241eaef5fd12c88705a01fc1066c48c4b36e0dd4377dcdc7ec3942cea7a69956"
[[package]]
name = "lock_api"
version = "0.4.12"
version = "0.4.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "07af8b9cdd281b7915f413fa73f29ebd5d55d0d3f0155584dade1ff18cea1b17"
checksum = "96936507f153605bddfcda068dd804796c84324ed2510809e5b2a624c81da765"
dependencies = [
"autocfg",
"scopeguard",
@@ -1778,6 +1854,29 @@ dependencies = [
"url",
]
[[package]]
name = "manyhow"
version = "0.11.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b33efb3ca6d3b07393750d4030418d594ab1139cee518f0dc88db70fec873587"
dependencies = [
"manyhow-macros",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "manyhow-macros"
version = "0.11.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "46fce34d199b78b6e6073abf984c9cf5fd3e9330145a93ee0738a7443e371495"
dependencies = [
"proc-macro-utils",
"proc-macro2",
"quote",
]
[[package]]
name = "markdown"
version = "1.0.0"
@@ -2004,9 +2103,9 @@ checksum = "04744f49eae99ab78e0d5c0b603ab218f515ea8cfe5a456d7629ad883a3b6e7d"
[[package]]
name = "ordermap"
version = "0.5.7"
version = "0.5.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7d31b8b7a99f71bdff4235faf9ce9eada0ad3562c8fbeb7d607d9f41a6ec569d"
checksum = "6d6bff06e4a5dc6416bead102d3e63c480dd852ffbb278bf8cfeb4966b329609"
dependencies = [
"indexmap",
"serde",
@@ -2037,6 +2136,16 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]]
name = "papaya"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f92dd0b07c53a0a0c764db2ace8c541dc47320dad97c2200c2a637ab9dd2328f"
dependencies = [
"equivalent",
"seize",
]
[[package]]
name = "parking_lot"
version = "0.12.3"
@@ -2338,6 +2447,17 @@ dependencies = [
"toml_edit",
]
[[package]]
name = "proc-macro-utils"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eeaf08a13de400bc215877b5bdc088f241b12eb42f0a548d3390dc1c56bb7071"
dependencies = [
"proc-macro2",
"quote",
"smallvec",
]
[[package]]
name = "proc-macro2"
version = "1.0.95"
@@ -2414,6 +2534,28 @@ dependencies = [
"proc-macro2",
]
[[package]]
name = "quote-use"
version = "0.8.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9619db1197b497a36178cfc736dc96b271fe918875fbf1344c436a7e93d0321e"
dependencies = [
"quote",
"quote-use-macros",
]
[[package]]
name = "quote-use-macros"
version = "0.8.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82ebfb7faafadc06a7ab141a6f67bcfb24cb8beb158c6fe933f2f035afa99f35"
dependencies = [
"proc-macro-utils",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "r-efi"
version = "5.2.0"
@@ -2582,18 +2724,19 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.12.0"
version = "0.12.1"
dependencies = [
"anyhow",
"argfile",
"assert_fs",
"bincode",
"bincode 2.0.1",
"bitflags 2.9.1",
"cachedir",
"clap",
"clap_complete_command",
"clearscreen",
"colored 3.0.0",
"dunce",
"filetime",
"globwalk",
"ignore",
@@ -2703,6 +2846,7 @@ dependencies = [
"dunce",
"etcetera",
"filetime",
"get-size2",
"glob",
"ignore",
"insta",
@@ -2818,6 +2962,7 @@ dependencies = [
name = "ruff_index"
version = "0.0.0"
dependencies = [
"get-size2",
"ruff_macros",
"salsa",
"static_assertions",
@@ -2825,7 +2970,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.12.0"
version = "0.12.1"
dependencies = [
"aho-corasick",
"anyhow",
@@ -2835,6 +2980,7 @@ dependencies = [
"fern",
"glob",
"globset",
"hashbrown 0.15.4",
"imperative",
"insta",
"is-macro",
@@ -2930,6 +3076,7 @@ dependencies = [
"aho-corasick",
"bitflags 2.9.1",
"compact_str",
"get-size2",
"is-macro",
"itertools 0.14.0",
"memchr",
@@ -3029,6 +3176,7 @@ dependencies = [
"bitflags 2.9.1",
"bstr",
"compact_str",
"get-size2",
"insta",
"memchr",
"ruff_annotate_snippets",
@@ -3046,16 +3194,6 @@ dependencies = [
"walkdir",
]
[[package]]
name = "ruff_python_resolver"
version = "0.0.0"
dependencies = [
"env_logger",
"insta",
"log",
"tempfile",
]
[[package]]
name = "ruff_python_semantic"
version = "0.0.0"
@@ -3145,6 +3283,7 @@ dependencies = [
name = "ruff_source_file"
version = "0.0.0"
dependencies = [
"get-size2",
"memchr",
"ruff_text_size",
"serde",
@@ -3154,6 +3293,7 @@ dependencies = [
name = "ruff_text_size"
version = "0.0.0"
dependencies = [
"get-size2",
"schemars",
"serde",
"serde_test",
@@ -3162,7 +3302,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.12.0"
version = "0.12.1"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -3281,8 +3421,8 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=09627e450566f894956710a3fd923dc80462ae6d#09627e450566f894956710a3fd923dc80462ae6d"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
dependencies = [
"boxcar",
"compact_str",
@@ -3292,6 +3432,7 @@ dependencies = [
"hashlink",
"indexmap",
"intrusive-collections",
"papaya",
"parking_lot",
"portable-atomic",
"rayon",
@@ -3305,13 +3446,13 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=09627e450566f894956710a3fd923dc80462ae6d#09627e450566f894956710a3fd923dc80462ae6d"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
[[package]]
name = "salsa-macros"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=09627e450566f894956710a3fd923dc80462ae6d#09627e450566f894956710a3fd923dc80462ae6d"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
dependencies = [
"proc-macro2",
"quote",
@@ -3364,6 +3505,16 @@ version = "4.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1c107b6f4780854c8b126e228ea8869f4d7b71260f962fefb57b996b8959ba6b"
[[package]]
name = "seize"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e4b8d813387d566f627f3ea1b914c068aac94c40ae27ec43f5f33bde65abefe7"
dependencies = [
"libc",
"windows-sys 0.52.0",
]
[[package]]
name = "serde"
version = "1.0.219"
@@ -3564,6 +3715,16 @@ version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a2eb9349b6444b326872e140eb1cf5e7c522154d69e7a0ffb0fb81c06b37543f"
[[package]]
name = "statrs"
version = "0.18.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a3fe7c28c6512e766b0874335db33c94ad7b8f9054228ae1c2abd47ce7d335e"
dependencies = [
"approx",
"num-traits",
]
[[package]]
name = "strip-ansi-escapes"
version = "0.2.1"
@@ -4024,6 +4185,7 @@ dependencies = [
"camino",
"colored 3.0.0",
"crossbeam",
"get-size2",
"globset",
"insta",
"notify",
@@ -4063,6 +4225,7 @@ dependencies = [
"countme",
"dir-test",
"drop_bomb",
"get-size2",
"glob",
"hashbrown 0.15.4",
"indexmap",
@@ -4124,6 +4287,7 @@ dependencies = [
"ty_ide",
"ty_project",
"ty_python_semantic",
"ty_vendored",
]
[[package]]
@@ -4163,6 +4327,7 @@ version = "0.0.0"
dependencies = [
"path-slash",
"ruff_db",
"static_assertions",
"walkdir",
"zip",
]
@@ -4597,11 +4762,10 @@ dependencies = [
[[package]]
name = "which"
version = "7.0.3"
version = "8.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24d643ce3fd3e5b54854602a080f34fb10ab75e0b813ee32d00ca2b44fa74762"
checksum = "d3fabb953106c3c8eea8306e4393700d7657561cb43122571b172bbfb7c7ba1d"
dependencies = [
"either",
"env_home",
"rustix",
"winsafe",

View File

@@ -5,7 +5,7 @@ resolver = "2"
[workspace.package]
# Please update rustfmt.toml when bumping the Rust edition
edition = "2024"
rust-version = "1.85"
rust-version = "1.86"
homepage = "https://docs.astral.sh/ruff"
documentation = "https://docs.astral.sh/ruff"
repository = "https://github.com/astral-sh/ruff"
@@ -62,8 +62,8 @@ camino = { version = "1.1.7" }
clap = { version = "4.5.3", features = ["derive"] }
clap_complete_command = { version = "0.6.0" }
clearscreen = { version = "4.0.0" }
divan = { package = "codspeed-divan-compat", version = "2.10.1" }
codspeed-criterion-compat = { version = "2.6.0", default-features = false }
divan = { package = "codspeed-divan-compat", version = "3.0.2" }
codspeed-criterion-compat = { version = "3.0.2", default-features = false }
colored = { version = "3.0.0" }
console_error_panic_hook = { version = "0.1.7" }
console_log = { version = "1.0.0" }
@@ -75,11 +75,16 @@ dashmap = { version = "6.0.1" }
dir-test = { version = "0.4.0" }
dunce = { version = "1.0.5" }
drop_bomb = { version = "0.1.5" }
env_logger = { version = "0.11.0" }
etcetera = { version = "0.10.0" }
fern = { version = "0.7.0" }
filetime = { version = "0.2.23" }
getrandom = { version = "0.3.1" }
get-size2 = { version = "0.5.0", features = [
"derive",
"smallvec",
"hashbrown",
"compact-str"
] }
glob = { version = "0.3.1" }
globset = { version = "0.4.14" }
globwalk = { version = "0.9.1" }
@@ -132,7 +137,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "09627e450566f894956710a3fd923dc80462ae6d" }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "fc00eba89e5dcaa5edba51c41aa5f309b5cb126b" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }
@@ -222,6 +227,7 @@ unnecessary_debug_formatting = "allow" # too many instances, the display also d
# Without the hashes we run into a `rustfmt` bug in some snapshot tests, see #13250
needless_raw_string_hashes = "allow"
# Disallowed restriction lints
ignore_without_reason = "allow" # Too many existing instances, and there's no auto fix.
print_stdout = "warn"
print_stderr = "warn"
dbg_macro = "warn"

View File

@@ -148,8 +148,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.12.0/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.12.0/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.12.1/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.12.1/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -182,7 +182,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.12.0
rev: v0.12.1
hooks:
# Run the linter.
- id: ruff-check
@@ -423,6 +423,7 @@ Ruff is used by a number of major open-source projects and companies, including:
- [Albumentations](https://github.com/albumentations-team/albumentations)
- Amazon ([AWS SAM](https://github.com/aws/serverless-application-model))
- [Anki](https://apps.ankiweb.net/)
- Anthropic ([Python SDK](https://github.com/anthropics/anthropic-sdk-python))
- [Apache Airflow](https://github.com/apache/airflow)
- AstraZeneca ([Magnus](https://github.com/AstraZeneca/magnus-core))

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.12.0"
version = "0.12.1"
publish = true
authors = { workspace = true }
edition = { workspace = true }
@@ -68,6 +68,7 @@ ruff_linter = { workspace = true, features = ["clap", "test-rules"] }
assert_fs = { workspace = true }
# Avoid writing colored snapshots when running tests from the terminal
colored = { workspace = true, features = ["no-color"] }
dunce = { workspace = true }
indoc = { workspace = true }
insta = { workspace = true, features = ["filters", "json"] }
insta-cmd = { workspace = true }

View File

@@ -442,7 +442,7 @@ impl LintCacheData {
// Parse the kebab-case rule name into a `Rule`. This will fail for syntax errors, so
// this also serves to filter them out, but we shouldn't be caching files with syntax
// errors anyway.
.filter_map(|msg| Some((msg.noqa_code().and_then(|code| code.rule())?, msg)))
.filter_map(|msg| Some((msg.name().parse().ok()?, msg)))
.map(|(rule, msg)| {
// Make sure that all messages use the same source file.
assert_eq!(

View File

@@ -6,7 +6,6 @@ use anyhow::Result;
use bitflags::bitflags;
use colored::Colorize;
use itertools::{Itertools, iterate};
use ruff_linter::codes::NoqaCode;
use ruff_linter::linter::FixTable;
use serde::Serialize;
@@ -15,7 +14,7 @@ use ruff_linter::logging::LogLevel;
use ruff_linter::message::{
AzureEmitter, Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter,
JsonEmitter, JsonLinesEmitter, JunitEmitter, OldDiagnostic, PylintEmitter, RdjsonEmitter,
SarifEmitter, TextEmitter,
SarifEmitter, SecondaryCode, TextEmitter,
};
use ruff_linter::notify_user;
use ruff_linter::settings::flags::{self};
@@ -36,8 +35,8 @@ bitflags! {
}
#[derive(Serialize)]
struct ExpandedStatistics {
code: Option<NoqaCode>,
struct ExpandedStatistics<'a> {
code: Option<&'a SecondaryCode>,
name: &'static str,
count: usize,
fixable: bool,
@@ -303,11 +302,12 @@ impl Printer {
let statistics: Vec<ExpandedStatistics> = diagnostics
.inner
.iter()
.map(|message| (message.noqa_code(), message))
.map(|message| (message.secondary_code(), message))
.sorted_by_key(|(code, message)| (*code, message.fixable()))
.fold(
vec![],
|mut acc: Vec<((Option<NoqaCode>, &OldDiagnostic), usize)>, (code, message)| {
|mut acc: Vec<((Option<&SecondaryCode>, &OldDiagnostic), usize)>,
(code, message)| {
if let Some(((prev_code, _prev_message), count)) = acc.last_mut() {
if *prev_code == code {
*count += 1;
@@ -349,12 +349,7 @@ impl Printer {
);
let code_width = statistics
.iter()
.map(|statistic| {
statistic
.code
.map_or_else(String::new, |rule| rule.to_string())
.len()
})
.map(|statistic| statistic.code.map_or(0, |s| s.len()))
.max()
.unwrap();
let any_fixable = statistics.iter().any(|statistic| statistic.fixable);
@@ -370,7 +365,8 @@ impl Printer {
statistic.count.to_string().bold(),
statistic
.code
.map_or_else(String::new, |rule| rule.to_string())
.map(SecondaryCode::as_str)
.unwrap_or_default()
.red()
.bold(),
if any_fixable {

View File

@@ -17,6 +17,7 @@ fn command() -> Command {
command.arg("analyze");
command.arg("graph");
command.arg("--preview");
command.env_clear();
command
}
@@ -565,8 +566,8 @@ fn venv() -> Result<()> {
----- stderr -----
ruff failed
Cause: Invalid search path settings
Cause: Failed to discover the site-packages directory: Invalid `--python` argument `none`: does not point to a Python executable or a directory on disk
Cause: Invalid `--python` argument `none`: does not point to a Python executable or a directory on disk
Cause: No such file or directory (os error 2)
");
});

View File

@@ -612,7 +612,7 @@ fn extend_passed_via_config_argument() {
#[test]
fn nonexistent_extend_file() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
fs::write(
project_dir.join("ruff.toml"),
r#"
@@ -653,7 +653,7 @@ extend = "ruff3.toml"
#[test]
fn circular_extend() -> Result<()> {
let tempdir = TempDir::new()?;
let project_path = tempdir.path().canonicalize()?;
let project_path = dunce::canonicalize(tempdir.path())?;
fs::write(
project_path.join("ruff.toml"),
@@ -698,7 +698,7 @@ extend = "ruff.toml"
#[test]
fn parse_error_extends() -> Result<()> {
let tempdir = TempDir::new()?;
let project_path = tempdir.path().canonicalize()?;
let project_path = dunce::canonicalize(tempdir.path())?;
fs::write(
project_path.join("ruff.toml"),
@@ -2130,7 +2130,7 @@ select = ["UP006"]
#[test]
fn requires_python_no_tool() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("pyproject.toml");
fs::write(
&ruff_toml,
@@ -2441,7 +2441,7 @@ requires-python = ">= 3.11"
#[test]
fn requires_python_no_tool_target_version_override() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("pyproject.toml");
fs::write(
&ruff_toml,
@@ -2752,7 +2752,7 @@ requires-python = ">= 3.11"
#[test]
fn requires_python_no_tool_with_check() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("pyproject.toml");
fs::write(
&ruff_toml,
@@ -2797,7 +2797,7 @@ requires-python = ">= 3.11"
#[test]
fn requires_python_ruff_toml_no_target_fallback() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -3118,7 +3118,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_ruff_toml_no_target_fallback_check() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -3173,7 +3173,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_pyproject_toml_above() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let outer_pyproject = tempdir.path().join("pyproject.toml");
fs::write(
&outer_pyproject,
@@ -3200,7 +3200,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/foo/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/"),(r"(?m)^foo\\test","foo/test")]
@@ -3499,7 +3499,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_pyproject_toml_above_with_tool() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let outer_pyproject = tempdir.path().join("pyproject.toml");
fs::write(
&outer_pyproject,
@@ -3528,7 +3528,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/foo/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/"),(r"foo\\","foo/")]
@@ -3827,7 +3827,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_ruff_toml_above() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -3856,7 +3856,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/foo/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/")]
@@ -4441,7 +4441,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_extend_from_shared_config() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -4479,7 +4479,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/")]

View File

@@ -12,10 +12,8 @@ fn display_default_settings() -> anyhow::Result<()> {
// Tempdir paths on macOS are symlinks, which don't play nicely with
// our snapshot filtering.
let project_dir = tempdir
.path()
.canonicalize()
.context("Failed to canonical tempdir path.")?;
let project_dir =
dunce::canonicalize(tempdir.path()).context("Failed to canonical tempdir path.")?;
std::fs::write(
project_dir.join("pyproject.toml"),

View File

@@ -352,7 +352,7 @@ impl DisplaySet<'_> {
// FIXME: `unicode_width` sometimes disagrees with terminals on how wide a `char`
// is. For now, just accept that sometimes the code line will be longer than
// desired.
let next = unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1);
let next = char_width(ch).unwrap_or(1);
if taken + next > right - left {
was_cut_right = true;
break;
@@ -377,7 +377,7 @@ impl DisplaySet<'_> {
let left: usize = text
.chars()
.take(left)
.map(|ch| unicode_width::UnicodeWidthChar::width(ch).unwrap_or(1))
.map(|ch| char_width(ch).unwrap_or(1))
.sum();
let mut annotations = annotations.clone();
@@ -821,11 +821,7 @@ impl DisplaySourceAnnotation<'_> {
// Length of this annotation as displayed in the stderr output
fn len(&self) -> usize {
// Account for usize underflows
if self.range.1 > self.range.0 {
self.range.1 - self.range.0
} else {
self.range.0 - self.range.1
}
self.range.1.abs_diff(self.range.0)
}
fn takes_space(&self) -> bool {
@@ -1393,6 +1389,7 @@ fn format_body<'m>(
let line_length: usize = line.len();
let line_range = (current_index, current_index + line_length);
let end_line_size = end_line.len();
body.push(DisplayLine::Source {
lineno: Some(current_line),
inline_marks: vec![],
@@ -1452,12 +1449,12 @@ fn format_body<'m>(
let annotation_start_col = line
[0..(start - line_start_index).min(line_length)]
.chars()
.map(|c| unicode_width::UnicodeWidthChar::width(c).unwrap_or(0))
.map(|c| char_width(c).unwrap_or(0))
.sum::<usize>();
let mut annotation_end_col = line
[0..(end - line_start_index).min(line_length)]
.chars()
.map(|c| unicode_width::UnicodeWidthChar::width(c).unwrap_or(0))
.map(|c| char_width(c).unwrap_or(0))
.sum::<usize>();
if annotation_start_col == annotation_end_col {
// At least highlight something
@@ -1499,7 +1496,7 @@ fn format_body<'m>(
let annotation_start_col = line
[0..(start - line_start_index).min(line_length)]
.chars()
.map(|c| unicode_width::UnicodeWidthChar::width(c).unwrap_or(0))
.map(|c| char_width(c).unwrap_or(0))
.sum::<usize>();
let annotation_end_col = annotation_start_col + 1;
@@ -1558,7 +1555,7 @@ fn format_body<'m>(
{
let end_mark = line[0..(end - line_start_index).min(line_length)]
.chars()
.map(|c| unicode_width::UnicodeWidthChar::width(c).unwrap_or(0))
.map(|c| char_width(c).unwrap_or(0))
.sum::<usize>()
.saturating_sub(1);
// If the annotation ends on a line-end character, we
@@ -1754,3 +1751,11 @@ fn format_inline_marks(
}
Ok(())
}
fn char_width(c: char) -> Option<usize> {
if c == '\t' {
Some(4)
} else {
unicode_width::UnicodeWidthChar::width(c)
}
}
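
For context, a minimal standalone sketch of what this new `char_width` helper changes (assuming the `unicode-width` crate; the assertion values are illustrative, not taken from the test suite). `unicode_width` reports `None` for the tab control character, so the old call sites fell back to a width of 0 or 1 for tabs, whereas the helper now renders them four columns wide:

```rust
// Sketch only: mirrors the helper added above.
fn char_width(c: char) -> Option<usize> {
    if c == '\t' {
        Some(4) // hard tabs now occupy four columns
    } else {
        unicode_width::UnicodeWidthChar::width(c)
    }
}

fn main() {
    assert_eq!(unicode_width::UnicodeWidthChar::width('\t'), None); // control char
    assert_eq!(char_width('\t'), Some(4));
    assert_eq!(char_width('a'), Some(1));
}
```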

View File

@@ -0,0 +1,38 @@
<svg width="740px" height="146px" xmlns="http://www.w3.org/2000/svg">
<style>
.fg { fill: #AAAAAA }
.bg { background: #000000 }
.fg-bright-blue { fill: #5555FF }
.fg-bright-red { fill: #FF5555 }
.container {
padding: 0 10px;
line-height: 18px;
}
.bold { font-weight: bold; }
tspan {
font: 14px SFMono-Regular, Consolas, Liberation Mono, Menlo, monospace;
white-space: pre;
line-height: 18px;
}
</style>
<rect width="100%" height="100%" y="0" rx="4.5" class="bg" />
<text xml:space="preserve" class="container fg">
<tspan x="10px" y="28px"><tspan class="fg-bright-red bold">error[E0308]</tspan><tspan>: </tspan><tspan class="bold">call-non-callable</tspan>
</tspan>
<tspan x="10px" y="46px"><tspan> </tspan><tspan class="fg-bright-blue bold">--&gt;</tspan><tspan> $DIR/main.py:5:9</tspan>
</tspan>
<tspan x="10px" y="64px"><tspan> </tspan><tspan class="fg-bright-blue bold">|</tspan>
</tspan>
<tspan x="10px" y="82px"><tspan class="fg-bright-blue bold">4 |</tspan><tspan> def f():</tspan>
</tspan>
<tspan x="10px" y="100px"><tspan class="fg-bright-blue bold">5 |</tspan><tspan> return (1 == '2')() # Tab indented</tspan>
</tspan>
<tspan x="10px" y="118px"><tspan> </tspan><tspan class="fg-bright-blue bold">|</tspan><tspan> </tspan><tspan class="fg-bright-red bold">^^^^^^^^^^^^</tspan>
</tspan>
<tspan x="10px" y="136px"><tspan> </tspan><tspan class="fg-bright-blue bold">|</tspan>
</tspan>
</text>
</svg>


View File

@@ -0,0 +1,45 @@
# [crates/ruff_db/src/diagnostic/render.rs:123:47] diag.to_annotate() = Message {
# level: Error,
# id: Some(
# "call-non-callable",
# ),
# title: "Object of type `bool` is not callable",
# snippets: [
# Snippet {
# origin: Some(
# "main.py",
# ),
# line_start: 1,
# source: "def f():\n\treturn (1 == '2')() # Tab indented\n",
# annotations: [
# Annotation {
# range: 17..29,
# label: None,
# level: Error,
# },
# ],
# fold: false,
# },
# ],
# footer: [],
# }
[message]
level = "Error"
id = "E0308"
title = "call-non-callable"
[[message.snippets]]
source = "def f():\n\treturn (1 == '2')() # Tab indented\n"
line_start = 4
origin = "$DIR/main.py"
[[message.snippets.annotations]]
label = ""
level = "Error"
range = [17, 29]
[renderer]
# anonymized_line_numbers = true
color = true

View File

@@ -0,0 +1,36 @@
<svg width="1196px" height="128px" xmlns="http://www.w3.org/2000/svg">
<style>
.fg { fill: #AAAAAA }
.bg { background: #000000 }
.fg-bright-blue { fill: #5555FF }
.fg-bright-red { fill: #FF5555 }
.container {
padding: 0 10px;
line-height: 18px;
}
.bold { font-weight: bold; }
tspan {
font: 14px SFMono-Regular, Consolas, Liberation Mono, Menlo, monospace;
white-space: pre;
line-height: 18px;
}
</style>
<rect width="100%" height="100%" y="0" rx="4.5" class="bg" />
<text xml:space="preserve" class="container fg">
<tspan x="10px" y="28px"><tspan class="fg-bright-red bold">error[E0308]</tspan><tspan>: </tspan><tspan class="bold">mismatched types</tspan>
</tspan>
<tspan x="10px" y="46px"><tspan> </tspan><tspan class="fg-bright-blue bold">--&gt;</tspan><tspan> $DIR/non-whitespace-trimming.rs:4:6</tspan>
</tspan>
<tspan x="10px" y="64px"><tspan> </tspan><tspan class="fg-bright-blue bold">|</tspan>
</tspan>
<tspan x="10px" y="82px"><tspan class="fg-bright-blue bold">4 |</tspan><tspan> </tspan><tspan class="fg-bright-blue bold">...</tspan><tspan> s_data['d_dails'] = bb['contacted'][hostip]['ansible_facts']['ansible_devices']['vda']['vendor'] + " " + bb['contacted'][hostip</tspan><tspan class="fg-bright-blue bold">...</tspan>
</tspan>
<tspan x="10px" y="100px"><tspan> </tspan><tspan class="fg-bright-blue bold">|</tspan><tspan> </tspan><tspan class="fg-bright-red bold">^^^^^^</tspan>
</tspan>
<tspan x="10px" y="118px"><tspan> </tspan><tspan class="fg-bright-blue bold">|</tspan>
</tspan>
</text>
</svg>


View File

@@ -0,0 +1,20 @@
[message]
level = "Error"
id = "E0308"
title = "mismatched types"
[[message.snippets]]
source = """
s_data['d_dails'] = bb['contacted'][hostip]['ansible_facts']['ansible_devices']['vda']['vendor'] + " " + bb['contacted'][hostip]['an
"""
line_start = 4
origin = "$DIR/non-whitespace-trimming.rs"
[[message.snippets.annotations]]
label = ""
level = "Error"
range = [5, 11]
[renderer]
# anonymized_line_numbers = true
color = true

View File

@@ -21,7 +21,7 @@
<text xml:space="preserve" class="container fg">
<tspan x="10px" y="28px"><tspan class="fg-bright-red bold">error[E0308]</tspan><tspan>: </tspan><tspan class="bold">mismatched types</tspan>
</tspan>
<tspan x="10px" y="46px"><tspan> </tspan><tspan class="fg-bright-blue bold">--&gt;</tspan><tspan> $DIR/non-whitespace-trimming.rs:4:242</tspan>
<tspan x="10px" y="46px"><tspan> </tspan><tspan class="fg-bright-blue bold">--&gt;</tspan><tspan> $DIR/non-whitespace-trimming.rs:4:238</tspan>
</tspan>
<tspan x="10px" y="64px"><tspan> </tspan><tspan class="fg-bright-blue bold">|</tspan>
</tspan>


View File

@@ -13,12 +13,12 @@ origin = "$DIR/non-whitespace-trimming.rs"
[[message.snippets.annotations]]
label = "expected `()`, found integer"
level = "Error"
range = [241, 243]
range = [237, 239]
[[message.snippets.annotations]]
label = "expected due to this"
level = "Error"
range = [236, 238]
range = [232, 234]
[renderer]

View File

@@ -348,6 +348,99 @@ fn benchmark_many_tuple_assignments(criterion: &mut Criterion) {
});
}
fn benchmark_complex_constrained_attributes_1(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("ty_micro[complex_constrained_attributes_1]", |b| {
b.iter_batched_ref(
|| {
// This is a regression benchmark for https://github.com/astral-sh/ty/issues/627.
// Before this was fixed, the following sample would take >1s to type check.
setup_micro_case(
r#"
class C:
def f(self: "C"):
if isinstance(self.a, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
"#,
)
},
|case| {
let Case { db, .. } = case;
let result = db.check();
assert!(!result.is_empty());
},
BatchSize::SmallInput,
);
});
}
fn benchmark_complex_constrained_attributes_2(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("ty_micro[complex_constrained_attributes_2]", |b| {
b.iter_batched_ref(
|| {
// This is similar to the case above, but now the attributes are actually defined.
// https://github.com/astral-sh/ty/issues/711
setup_micro_case(
r#"
class C:
def f(self: "C"):
self.a = ""
self.b = ""
if isinstance(self.a, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
"#,
)
},
|case| {
let Case { db, .. } = case;
let result = db.check();
assert_eq!(result.len(), 0);
},
BatchSize::SmallInput,
);
});
}
struct ProjectBenchmark<'a> {
project: InstalledProject<'a>,
fs: MemoryFileSystem,
@@ -377,8 +470,7 @@ impl<'a> ProjectBenchmark<'a> {
metadata.apply_options(Options {
environment: Some(EnvironmentOptions {
python_version: Some(RangedValue::cli(self.project.config.python_version)),
python: (!self.project.config().dependencies.is_empty())
.then_some(RelativePathBuf::cli(SystemPath::new(".venv"))),
python: Some(RelativePathBuf::cli(SystemPath::new(".venv"))),
..EnvironmentOptions::default()
}),
..Options::default()
@@ -483,6 +575,8 @@ criterion_group!(
micro,
benchmark_many_string_assignments,
benchmark_many_tuple_assignments,
benchmark_complex_constrained_attributes_1,
benchmark_complex_constrained_attributes_2,
);
criterion_group!(project, anyio, attrs, hydra);
criterion_main!(check_file, micro, project);

View File

@@ -36,8 +36,7 @@ impl<'a> Benchmark<'a> {
metadata.apply_options(Options {
environment: Some(EnvironmentOptions {
python_version: Some(RangedValue::cli(self.project.config.python_version)),
python: (!self.project.config().dependencies.is_empty())
.then_some(RelativePathBuf::cli(SystemPath::new(".venv"))),
python: Some(RelativePathBuf::cli(SystemPath::new(".venv"))),
..EnvironmentOptions::default()
}),
..Options::default()

View File

@@ -74,19 +74,17 @@ impl<'a> RealWorldProject<'a> {
};
// Install dependencies if specified
if !checkout.project().dependencies.is_empty() {
tracing::debug!(
"Installing {} dependencies for project '{}'...",
checkout.project().dependencies.len(),
checkout.project().name
);
let start = std::time::Instant::now();
install_dependencies(&checkout)?;
tracing::debug!(
"Dependency installation completed in {:.2}s",
start.elapsed().as_secs_f64()
);
}
tracing::debug!(
"Installing {} dependencies for project '{}'...",
checkout.project().dependencies.len(),
checkout.project().name
);
let start_install = std::time::Instant::now();
install_dependencies(&checkout)?;
tracing::debug!(
"Dependency installation completed in {:.2}s",
start_install.elapsed().as_secs_f64()
);
tracing::debug!("Project setup took: {:.2}s", start.elapsed().as_secs_f64());
@@ -281,6 +279,14 @@ fn install_dependencies(checkout: &Checkout) -> Result<()> {
String::from_utf8_lossy(&output.stderr)
);
if checkout.project().dependencies.is_empty() {
tracing::debug!(
"No dependencies to install for project '{}'",
checkout.project().name
);
return Ok(());
}
// Install dependencies with date constraint in the isolated environment
let mut cmd = Command::new("uv");
cmd.args([

View File

@@ -14,10 +14,10 @@ license = { workspace = true }
ruff_annotate_snippets = { workspace = true }
ruff_cache = { workspace = true, optional = true }
ruff_notebook = { workspace = true }
ruff_python_ast = { workspace = true }
ruff_python_ast = { workspace = true, features = ["get-size"] }
ruff_python_parser = { workspace = true }
ruff_python_trivia = { workspace = true }
ruff_source_file = { workspace = true }
ruff_source_file = { workspace = true, features = ["get-size"] }
ruff_text_size = { workspace = true }
anstyle = { workspace = true }
@@ -27,17 +27,18 @@ countme = { workspace = true }
dashmap = { workspace = true }
dunce = { workspace = true }
filetime = { workspace = true }
get-size2 = { workspace = true }
glob = { workspace = true }
ignore = { workspace = true, optional = true }
matchit = { workspace = true }
path-slash = { workspace = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
schemars = { workspace = true, optional = true }
serde = { workspace = true, optional = true }
path-slash = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true, optional = true }
rustc-hash = { workspace = true }
zip = { workspace = true }
[target.'cfg(target_arch="wasm32")'.dependencies]

View File

@@ -19,7 +19,7 @@ mod stylesheet;
/// characteristics in the inputs given to the tool. Typically, but not always,
/// a characteristic is a deficiency. An example of a characteristic that is
/// _not_ a deficiency is the `reveal_type` diagnostic for our type checker.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct Diagnostic {
/// The actual diagnostic.
///
@@ -270,7 +270,7 @@ impl Diagnostic {
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
struct DiagnosticInner {
id: DiagnosticId,
severity: Severity,
@@ -342,7 +342,7 @@ impl Eq for RenderingSortKey<'_> {}
/// Currently, the order in which sub-diagnostics are rendered relative to one
/// another (for a single parent diagnostic) is the order in which they were
/// attached to the diagnostic.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct SubDiagnostic {
/// Like with `Diagnostic`, we box the `SubDiagnostic` to make it
/// pointer-sized.
@@ -443,7 +443,7 @@ impl SubDiagnostic {
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
struct SubDiagnosticInner {
severity: Severity,
message: DiagnosticMessage,
@@ -471,7 +471,7 @@ struct SubDiagnosticInner {
///
/// Messages attached to annotations should also be as brief and specific as
/// possible. Long messages could negatively impact the quality of rendering.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct Annotation {
/// The span of this annotation, corresponding to some subsequence of the
/// user's input that we want to highlight.
@@ -591,7 +591,7 @@ impl Annotation {
///
/// These tags are used to provide additional information about the annotation.
/// and are passed through to the language server protocol.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub enum DiagnosticTag {
/// Unused or unnecessary code. Used for unused parameters, unreachable code, etc.
Unnecessary,
@@ -605,7 +605,7 @@ pub enum DiagnosticTag {
/// be in kebab case, e.g. `no-foo` (all lower case).
///
/// Rules use kebab case, e.g. `no-foo`.
#[derive(Debug, Copy, Clone, PartialOrd, Ord, PartialEq, Eq, Hash)]
#[derive(Debug, Copy, Clone, PartialOrd, Ord, PartialEq, Eq, Hash, get_size2::GetSize)]
pub struct LintName(&'static str);
impl LintName {
@@ -645,7 +645,7 @@ impl PartialEq<&str> for LintName {
}
/// Uniquely identifies the kind of a diagnostic.
#[derive(Debug, Copy, Clone, Eq, PartialEq, PartialOrd, Ord, Hash)]
#[derive(Debug, Copy, Clone, Eq, PartialEq, PartialOrd, Ord, Hash, get_size2::GetSize)]
pub enum DiagnosticId {
Panic,
@@ -735,6 +735,9 @@ pub enum DiagnosticId {
/// # no `[overrides.rules]`
/// ```
UselessOverridesSection,
/// Use of a deprecated setting.
DeprecatedSetting,
}
impl DiagnosticId {
@@ -773,6 +776,7 @@ impl DiagnosticId {
DiagnosticId::EmptyInclude => "empty-include",
DiagnosticId::UnnecessaryOverridesSection => "unnecessary-overrides-section",
DiagnosticId::UselessOverridesSection => "useless-overrides-section",
DiagnosticId::DeprecatedSetting => "deprecated-setting",
}
}
@@ -796,7 +800,7 @@ impl std::fmt::Display for DiagnosticId {
///
/// This enum presents a unified interface to these two types for the sake of creating [`Span`]s and
/// emitting diagnostics from both ty and ruff.
#[derive(Debug, Clone, PartialEq, Eq)]
#[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
pub enum UnifiedFile {
Ty(File),
Ruff(SourceFile),
@@ -848,7 +852,7 @@ impl DiagnosticSource {
/// It consists of a `File` and an optional range into that file. When the
/// range isn't present, it semantically implies that the diagnostic refers to
/// the entire file. For example, when the file should be executable but isn't.
#[derive(Debug, Clone, PartialEq, Eq)]
#[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
pub struct Span {
file: UnifiedFile,
range: Option<TextRange>,
@@ -920,7 +924,7 @@ impl From<crate::files::FileRange> for Span {
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd, get_size2::GetSize)]
pub enum Severity {
Info,
Warning,
@@ -1083,7 +1087,7 @@ impl std::fmt::Display for ConciseMessage<'_> {
/// In most cases, callers shouldn't need to use this. Instead, there is
/// a blanket trait implementation for `IntoDiagnosticMessage` for
/// anything that implements `std::fmt::Display`.
#[derive(Clone, Debug, Eq, PartialEq)]
#[derive(Clone, Debug, Eq, PartialEq, get_size2::GetSize)]
pub struct DiagnosticMessage(Box<str>);
impl DiagnosticMessage {

View File

@@ -637,6 +637,22 @@ pub trait FileResolver {
fn input(&self, file: File) -> Input;
}
impl<T> FileResolver for T
where
T: Db,
{
fn path(&self, file: File) -> &str {
relativize_path(self.system().current_directory(), file.path(self).as_str())
}
fn input(&self, file: File) -> Input {
Input {
text: source_text(self, file),
line_index: line_index(self, file),
}
}
}
impl FileResolver for &dyn Db {
fn path(&self, file: File) -> &str {
relativize_path(self.system().current_directory(), file.path(*self).as_str())
@@ -708,7 +724,6 @@ fn relativize_path<'p>(cwd: &SystemPath, path: &'p str) -> &'p str {
#[cfg(test)]
mod tests {
use crate::Upcast;
use crate::diagnostic::{Annotation, DiagnosticId, Severity, Span};
use crate::files::system_path_to_file;
use crate::system::{DbWithWritableSystem, SystemPath};
@@ -2221,7 +2236,7 @@ watermelon
///
/// (This will set the "printed" flag on `Diagnostic`.)
fn render(&self, diag: &Diagnostic) -> String {
diag.display(&self.db.upcast(), &self.config).to_string()
diag.display(&self.db, &self.config).to_string()
}
}

View File

@@ -263,12 +263,23 @@ impl Files {
impl fmt::Debug for Files {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut map = f.debug_map();
if f.alternate() {
let mut map = f.debug_map();
for entry in self.inner.system_by_path.iter() {
map.entry(entry.key(), entry.value());
for entry in self.inner.system_by_path.iter() {
map.entry(entry.key(), entry.value());
}
map.finish()
} else {
f.debug_struct("Files")
.field("system_by_path", &self.inner.system_by_path.len())
.field(
"system_virtual_by_path",
&self.inner.system_virtual_by_path.len(),
)
.field("vendored_by_path", &self.inner.vendored_by_path.len())
.finish()
}
map.finish()
}
}
@@ -308,6 +319,9 @@ pub struct File {
count: Count<File>,
}
// The Salsa heap is tracked separately.
impl get_size2::GetSize for File {}
impl File {
/// Reads the content of the file into a [`String`].
///

View File

@@ -36,12 +36,6 @@ pub trait Db: salsa::Database {
fn python_version(&self) -> PythonVersion;
}
/// Trait for upcasting a reference to a base trait object.
pub trait Upcast<T: ?Sized> {
fn upcast(&self) -> &T;
fn upcast_mut(&mut self) -> &mut T;
}
/// Returns the maximum number of tasks that ty is allowed
/// to process in parallel.
///
@@ -76,11 +70,11 @@ pub trait RustDoc {
mod tests {
use std::sync::{Arc, Mutex};
use crate::Db;
use crate::files::Files;
use crate::system::TestSystem;
use crate::system::{DbWithTestSystem, System};
use crate::vendored::VendoredFileSystem;
use crate::{Db, Upcast};
type Events = Arc<Mutex<Vec<salsa::Event>>>;
@@ -153,15 +147,6 @@ mod tests {
}
}
impl Upcast<dyn Db> for TestDb {
fn upcast(&self) -> &(dyn Db + 'static) {
self
}
fn upcast_mut(&mut self) -> &mut (dyn Db + 'static) {
self
}
}
impl DbWithTestSystem for TestDb {
fn test_system(&self) -> &TestSystem {
&self.system

View File

@@ -2,6 +2,7 @@ use std::fmt::Formatter;
use std::sync::Arc;
use arc_swap::ArcSwapOption;
use get_size2::GetSize;
use ruff_python_ast::{AnyRootNodeRef, ModModule, NodeIndex};
use ruff_python_parser::{ParseOptions, Parsed, parse_unchecked};
@@ -20,7 +21,7 @@ use crate::source::source_text;
/// reflected in the changed AST offsets.
/// The other reason is that Ruff's AST doesn't implement `Eq` which Salsa requires
/// for determining if a query result is unchanged.
#[salsa::tracked(returns(ref), no_eq)]
#[salsa::tracked(returns(ref), no_eq, heap_size=get_size2::GetSize::get_heap_size)]
pub fn parsed_module(db: &dyn Db, file: File) -> ParsedModule {
let _span = tracing::trace_span!("parsed_module", ?file).entered();
@@ -44,9 +45,10 @@ pub fn parsed_module_impl(db: &dyn Db, file: File) -> Parsed<ModModule> {
///
/// This type manages instances of the module AST. A particular instance of the AST
/// is represented with the [`ParsedModuleRef`] type.
#[derive(Clone)]
#[derive(Clone, get_size2::GetSize)]
pub struct ParsedModule {
file: File,
#[get_size(size_fn = arc_swap_size)]
inner: Arc<ArcSwapOption<indexed::IndexedModule>>,
}
@@ -142,6 +144,18 @@ impl std::ops::Deref for ParsedModuleRef {
}
}
/// Returns the heap-size of the currently stored `T` in the `ArcSwap`.
fn arc_swap_size<T>(arc_swap: &Arc<ArcSwapOption<T>>) -> usize
where
T: GetSize,
{
if let Some(value) = &*arc_swap.load() {
T::get_heap_size(value)
} else {
0
}
}
mod indexed {
use std::sync::Arc;
@@ -150,7 +164,7 @@ mod indexed {
use ruff_python_parser::Parsed;
/// A wrapper around the AST that allows access to AST nodes by index.
#[derive(Debug)]
#[derive(Debug, get_size2::GetSize)]
pub struct IndexedModule {
index: Box<[AnyRootNodeRef<'static>]>,
pub parsed: Parsed<ModModule>,

View File

@@ -11,7 +11,7 @@ use crate::Db;
use crate::files::{File, FilePath};
/// Reads the source text of a python text file (must be valid UTF8) or notebook.
#[salsa::tracked]
#[salsa::tracked(heap_size=get_size2::GetSize::get_heap_size)]
pub fn source_text(db: &dyn Db, file: File) -> SourceText {
let path = file.path(db);
let _span = tracing::trace_span!("source_text", file = %path).entered();
@@ -65,7 +65,7 @@ fn is_notebook(path: &FilePath) -> bool {
/// The file containing the source text can either be a text file or a notebook.
///
/// Cheap cloneable in `O(1)`.
#[derive(Clone, Eq, PartialEq)]
#[derive(Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct SourceText {
inner: Arc<SourceTextInner>,
}
@@ -123,8 +123,9 @@ impl std::fmt::Debug for SourceText {
}
}
#[derive(Eq, PartialEq)]
#[derive(Eq, PartialEq, get_size2::GetSize)]
struct SourceTextInner {
#[get_size(ignore)]
count: Count<SourceText>,
kind: SourceTextKind,
read_error: Option<SourceTextError>,
@@ -136,6 +137,19 @@ enum SourceTextKind {
Notebook(Box<Notebook>),
}
impl get_size2::GetSize for SourceTextKind {
fn get_heap_size(&self) -> usize {
match self {
SourceTextKind::Text(text) => text.get_heap_size(),
// TODO: The `get-size` derive does not support ignoring enum variants.
//
// Jupyter notebooks are not very relevant for memory profiling, and contain
// arbitrary JSON values that do not implement the `GetSize` trait.
SourceTextKind::Notebook(_) => 0,
}
}
}
impl From<String> for SourceTextKind {
fn from(value: String) -> Self {
SourceTextKind::Text(value)
@@ -148,7 +162,7 @@ impl From<Notebook> for SourceTextKind {
}
}
#[derive(Debug, thiserror::Error, PartialEq, Eq, Clone)]
#[derive(Debug, thiserror::Error, PartialEq, Eq, Clone, get_size2::GetSize)]
pub enum SourceTextError {
#[error("Failed to read notebook: {0}`")]
FailedToReadNotebook(String),
@@ -157,7 +171,7 @@ pub enum SourceTextError {
}
/// Computes the [`LineIndex`] for `file`.
#[salsa::tracked]
#[salsa::tracked(heap_size=get_size2::GetSize::get_heap_size)]
pub fn line_index(db: &dyn Db, file: File) -> LineIndex {
let _span = tracing::trace_span!("line_index", ?file).entered();

View File

@@ -124,6 +124,11 @@ pub trait System: Debug {
/// Returns `None` if no such convention exists for the system.
fn user_config_directory(&self) -> Option<SystemPathBuf>;
/// Returns the directory path where cached files are stored.
///
/// Returns `None` if no such convention exists for the system.
fn cache_dir(&self) -> Option<SystemPathBuf>;
/// Iterate over the contents of the directory at `path`.
///
/// The returned iterator must have the following properties:
@@ -186,6 +191,9 @@ pub trait System: Debug {
Err(std::env::VarError::NotPresent)
}
/// Returns a handle to a [`WritableSystem`] if this system is writeable.
fn as_writable(&self) -> Option<&dyn WritableSystem>;
fn as_any(&self) -> &dyn std::any::Any;
fn as_any_mut(&mut self) -> &mut dyn std::any::Any;
@@ -226,11 +234,52 @@ impl fmt::Display for CaseSensitivity {
/// System trait for non-readonly systems.
pub trait WritableSystem: System {
/// Creates a file at the given path.
///
/// Returns an error if the file already exists.
fn create_new_file(&self, path: &SystemPath) -> Result<()>;
/// Writes the given content to the file at the given path.
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()>;
/// Creates a directory at `path` as well as any intermediate directories.
fn create_directory_all(&self, path: &SystemPath) -> Result<()>;
/// Reads the provided file from the system cache, or creates the file if necessary.
///
/// Returns `Ok(None)` if the system does not expose a suitable cache directory.
fn get_or_cache(
&self,
path: &SystemPath,
read_contents: &dyn Fn() -> Result<String>,
) -> Result<Option<SystemPathBuf>> {
let Some(cache_dir) = self.cache_dir() else {
return Ok(None);
};
let cache_path = cache_dir.join(path);
// The file has already been cached.
if self.is_file(&cache_path) {
return Ok(Some(cache_path));
}
// Read the file contents.
let contents = read_contents()?;
// Create the parent directory.
self.create_directory_all(cache_path.parent().unwrap())?;
// Create and write to the file on the system.
//
// Note that `create_new_file` will fail if the file has already been created. This
// ensures that only one thread/process ever attempts to write to it to avoid corrupting
// the cache.
self.create_new_file(&cache_path)?;
self.write_file(&cache_path, &contents)?;
Ok(Some(cache_path))
}
}
#[derive(Clone, Debug, Eq, PartialEq)]
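
Taken together, `cache_dir`, `as_writable`, `create_new_file`, and the new `get_or_cache` default method form a simple read-through file cache. The following is a minimal sketch of how a caller might use it; the helper function, import paths, and `load` closure are assumptions for illustration rather than call sites from this PR, and `Result` is taken to be the crate's io-based result alias:

```rust
use ruff_db::system::{Result, System, SystemPath, SystemPathBuf};

// Hypothetical helper: return a cached on-disk copy of `path`, materializing
// it on first use via the (presumably expensive) `load` closure.
fn cached_copy(
    system: &dyn System,
    path: &SystemPath,
    load: &dyn Fn() -> Result<String>,
) -> Result<Option<SystemPathBuf>> {
    // Read-only systems expose no writer, and therefore no cache.
    let Some(writable) = system.as_writable() else {
        return Ok(None);
    };
    // `get_or_cache` returns the cached path if it already exists; otherwise it
    // creates the file exactly once (`create_new_file` fails if the file exists,
    // so concurrent writers cannot corrupt the cache) and writes `load()`'s output.
    writable.get_or_cache(path, load)
}
```

The cache location itself comes from `cache_dir`, whose OS implementation (further down in this diff) resolves to a user-level ty cache directory.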

View File

@@ -1,4 +1,5 @@
use std::collections::BTreeMap;
use std::collections::{BTreeMap, btree_map};
use std::io;
use std::iter::FusedIterator;
use std::sync::{Arc, RwLock, RwLockWriteGuard};
@@ -153,6 +154,26 @@ impl MemoryFileSystem {
virtual_files.contains_key(&path.to_path_buf())
}
pub(crate) fn create_new_file(&self, path: &SystemPath) -> Result<()> {
let normalized = self.normalize_path(path);
let mut by_path = self.inner.by_path.write().unwrap();
match by_path.entry(normalized) {
btree_map::Entry::Vacant(entry) => {
entry.insert(Entry::File(File {
content: String::new(),
last_modified: file_time_now(),
}));
Ok(())
}
btree_map::Entry::Occupied(_) => Err(io::Error::new(
io::ErrorKind::AlreadyExists,
"File already exists",
)),
}
}
/// Stores a new file in the file system.
///
/// The operation overrides the content for an existing file with the same normalized `path`.
@@ -278,14 +299,14 @@ impl MemoryFileSystem {
let normalized = fs.normalize_path(path);
match by_path.entry(normalized) {
std::collections::btree_map::Entry::Occupied(entry) => match entry.get() {
btree_map::Entry::Occupied(entry) => match entry.get() {
Entry::File(_) => {
entry.remove();
Ok(())
}
Entry::Directory(_) => Err(is_a_directory()),
},
std::collections::btree_map::Entry::Vacant(_) => Err(not_found()),
btree_map::Entry::Vacant(_) => Err(not_found()),
}
}
@@ -345,14 +366,14 @@ impl MemoryFileSystem {
}
match by_path.entry(normalized.clone()) {
std::collections::btree_map::Entry::Occupied(entry) => match entry.get() {
btree_map::Entry::Occupied(entry) => match entry.get() {
Entry::Directory(_) => {
entry.remove();
Ok(())
}
Entry::File(_) => Err(not_a_directory()),
},
std::collections::btree_map::Entry::Vacant(_) => Err(not_found()),
btree_map::Entry::Vacant(_) => Err(not_found()),
}
}

View File

@@ -160,6 +160,39 @@ impl System for OsSystem {
None
}
/// Returns an absolute cache directory on the system.
///
/// On Linux and macOS, uses `$XDG_CACHE_HOME/ty` or `.cache/ty`.
/// On Windows, uses `C:\Users\User\AppData\Local\ty\cache`.
#[cfg(not(target_arch = "wasm32"))]
fn cache_dir(&self) -> Option<SystemPathBuf> {
use etcetera::BaseStrategy as _;
let cache_dir = etcetera::base_strategy::choose_base_strategy()
.ok()
.map(|dirs| dirs.cache_dir().join("ty"))
.map(|cache_dir| {
if cfg!(windows) {
// On Windows, we append `cache` to the LocalAppData directory, i.e., prefer
// `C:\Users\User\AppData\Local\ty\cache` over `C:\Users\User\AppData\Local\ty`.
cache_dir.join("cache")
} else {
cache_dir
}
})
.and_then(|path| SystemPathBuf::from_path_buf(path).ok())
.unwrap_or_else(|| SystemPathBuf::from(".ty_cache"));
Some(cache_dir)
}
// TODO: Remove this feature gating once `ruff_wasm` no longer indirectly depends on `ruff_db` with the
// `os` feature enabled (via `ruff_workspace` -> `ruff_graph` -> `ruff_db`).
#[cfg(target_arch = "wasm32")]
fn cache_dir(&self) -> Option<SystemPathBuf> {
None
}
/// Creates a builder to recursively walk `path`.
///
/// The walker ignores files according to [`ignore::WalkBuilder::standard_filters`]
@@ -192,6 +225,10 @@ impl System for OsSystem {
})
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn Any {
self
}
@@ -310,6 +347,10 @@ impl OsSystem {
}
impl WritableSystem for OsSystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
std::fs::File::create_new(path).map(drop)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
std::fs::write(path.as_std_path(), content)
}

View File

@@ -503,6 +503,12 @@ impl ToOwned for SystemPath {
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct SystemPathBuf(#[cfg_attr(feature = "schemars", schemars(with = "String"))] Utf8PathBuf);
impl get_size2::GetSize for SystemPathBuf {
fn get_heap_size(&self) -> usize {
self.0.capacity()
}
}
impl SystemPathBuf {
pub fn new() -> Self {
Self(Utf8PathBuf::new())

View File

@@ -102,6 +102,10 @@ impl System for TestSystem {
self.system().user_config_directory()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
self.system().cache_dir()
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -123,6 +127,10 @@ impl System for TestSystem {
self.system().glob(pattern)
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
@@ -149,6 +157,10 @@ impl Default for TestSystem {
}
impl WritableSystem for TestSystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
self.system().create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.system().write_file(path, content)
}
@@ -335,6 +347,10 @@ impl System for InMemorySystem {
self.user_config_directory.lock().unwrap().clone()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
None
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -357,6 +373,10 @@ impl System for InMemorySystem {
Ok(Box::new(iterator))
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
@@ -377,6 +397,10 @@ impl System for InMemorySystem {
}
impl WritableSystem for InMemorySystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
self.memory_fs.create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.memory_fs.write_file(path, content)
}

View File

@@ -212,7 +212,7 @@ impl Display for Error {
path: Some(path),
err,
} => {
write!(f, "IO error for operation on {}: {}", path, err)
write!(f, "IO error for operation on {path}: {err}")
}
ErrorKind::Io { path: None, err } => err.fmt(f),
ErrorKind::NonUtf8Path { path } => {

View File

@@ -4,12 +4,12 @@ use std::fmt::{self, Debug};
use std::io::{self, Read, Write};
use std::sync::{Arc, Mutex, MutexGuard};
use crate::file_revision::FileRevision;
use zip::result::ZipResult;
use zip::write::FileOptions;
use zip::{CompressionMethod, ZipArchive, ZipWriter, read::ZipFile};
pub use self::path::{VendoredPath, VendoredPathBuf};
use crate::file_revision::FileRevision;
mod path;

View File

@@ -87,6 +87,12 @@ impl ToOwned for VendoredPath {
#[derive(Debug, Eq, PartialEq, Clone, Hash)]
pub struct VendoredPathBuf(Utf8PathBuf);
impl get_size2::GetSize for VendoredPathBuf {
fn get_heap_size(&self) -> usize {
self.0.capacity()
}
}
impl Default for VendoredPathBuf {
fn default() -> Self {
Self::new()

View File

@@ -114,6 +114,7 @@ fn generate_set(output: &mut String, set: Set, parents: &mut Vec<Set>) {
parents.pop();
}
#[derive(Debug)]
enum Set {
Toplevel(OptionSet),
Named { name: String, set: OptionSet },
@@ -136,7 +137,7 @@ impl Set {
}
fn emit_field(output: &mut String, name: &str, field: &OptionField, parents: &[Set]) {
let header_level = if parents.is_empty() { "###" } else { "####" };
let header_level = "#".repeat(parents.len() + 1);
let _ = writeln!(output, "{header_level} `{name}`");
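
As a quick sanity check on the new depth formula (a standalone sketch; `heading_prefix` is just an illustrative stand-in for the computation above):

```rust
fn heading_prefix(parent_count: usize) -> String {
    // One `#` more than the number of enclosing option sets.
    "#".repeat(parent_count + 1)
}

fn main() {
    assert_eq!(heading_prefix(0), "#");
    assert_eq!(heading_prefix(1), "##");
    assert_eq!(heading_prefix(2), "###");
}
```
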

View File

@@ -73,12 +73,20 @@ fn generate_markdown() -> String {
for lint in lints {
let _ = writeln!(&mut output, "## `{rule_name}`\n", rule_name = lint.name());
// Increase the header-level by one
// Reformat headers as bold text
let mut in_code_fence = false;
let documentation = lint
.documentation_lines()
.map(|line| {
if line.starts_with('#') {
Cow::Owned(format!("#{line}"))
// Toggle the code fence state if we encounter a boundary
if line.starts_with("```") {
in_code_fence = !in_code_fence;
}
if !in_code_fence && line.starts_with('#') {
Cow::Owned(format!(
"**{line}**\n",
line = line.trim_start_matches('#').trim_start()
))
} else {
Cow::Borrowed(line)
}
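
The fence tracking above is easier to see outside the diff; here is a minimal standalone sketch of the same idea, assuming plain `&str` lines rather than the lint's `documentation_lines()` iterator:

```rust
use std::borrow::Cow;

/// Rewrites Markdown headings as bold text, leaving lines inside fenced
/// code blocks untouched so that `# comment` lines in examples survive.
fn rewrite_headings<'a>(lines: impl Iterator<Item = &'a str>) -> Vec<Cow<'a, str>> {
    let mut in_code_fence = false;
    lines
        .map(|line| {
            // Toggle the fence state whenever a ``` boundary is crossed.
            if line.starts_with("```") {
                in_code_fence = !in_code_fence;
            }
            if !in_code_fence && line.starts_with('#') {
                Cow::Owned(format!("**{}**", line.trim_start_matches('#').trim_start()))
            } else {
                Cow::Borrowed(line)
            }
        })
        .collect()
}

fn main() {
    let doc = ["## What it does", "```python", "# not a heading", "```"];
    for line in rewrite_headings(doc.into_iter()) {
        println!("{line}");
    }
}
```
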
@@ -87,21 +95,15 @@ fn generate_markdown() -> String {
let _ = writeln!(
&mut output,
r#"**Default level**: {level}
<details>
<summary>{summary}</summary>
r#"<small>
Default level: [`{level}`](../rules.md#rule-levels "This lint has a default level of '{level}'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20{encoded_name}) ·
[View source](https://github.com/astral-sh/ruff/blob/main/{file}#L{line})
</small>
{documentation}
### Links
* [Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20{encoded_name})
* [View source](https://github.com/astral-sh/ruff/blob/main/{file}#L{line})
</details>
"#,
level = lint.default_level(),
// GitHub doesn't support markdown in `summary` headers
summary = replace_inline_code(lint.summary()),
encoded_name = url::form_urlencoded::byte_serialize(lint.name().as_str().as_bytes())
.collect::<String>(),
file = url::form_urlencoded::byte_serialize(lint.file().replace('\\', "/").as_bytes())
@@ -113,25 +115,6 @@ fn generate_markdown() -> String {
output
}
/// Replaces inline code blocks (`code`) with `<code>code</code>`
fn replace_inline_code(input: &str) -> String {
let mut output = String::new();
let mut parts = input.split('`');
while let Some(before) = parts.next() {
if let Some(between) = parts.next() {
output.push_str(before);
output.push_str("<code>");
output.push_str(between);
output.push_str("</code>");
} else {
output.push_str(before);
}
}
output
}
#[cfg(test)]
mod tests {
use anyhow::Result;

View File

@@ -1,15 +1,15 @@
use anyhow::Result;
use anyhow::{Context, Result};
use std::sync::Arc;
use zip::CompressionMethod;
use ruff_db::Db as SourceDb;
use ruff_db::files::{File, Files};
use ruff_db::system::{OsSystem, System, SystemPathBuf};
use ruff_db::vendored::{VendoredFileSystem, VendoredFileSystemBuilder};
use ruff_db::{Db as SourceDb, Upcast};
use ruff_python_ast::PythonVersion;
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{
Db, Program, ProgramSettings, PythonPath, PythonPlatform, PythonVersionSource,
Db, Program, ProgramSettings, PythonEnvironment, PythonPlatform, PythonVersionSource,
PythonVersionWithSource, SearchPathSettings, SysPrefixPathOrigin, default_lint_registry,
};
@@ -35,38 +35,37 @@ impl ModuleDb {
python_version: PythonVersion,
venv_path: Option<SystemPathBuf>,
) -> Result<Self> {
let mut search_paths = SearchPathSettings::new(src_roots);
if let Some(venv_path) = venv_path {
search_paths.python_path =
PythonPath::sys_prefix(venv_path, SysPrefixPathOrigin::PythonCliFlag);
}
let db = Self::default();
let mut search_paths = SearchPathSettings::new(src_roots);
// TODO: Consider calling `PythonEnvironment::discover` if the `venv_path` is not provided.
if let Some(venv_path) = venv_path {
let environment =
PythonEnvironment::new(venv_path, SysPrefixPathOrigin::PythonCliFlag, db.system())?;
search_paths.site_packages_paths = environment
.site_packages_paths(db.system())
.context("Failed to discover the site-packages directory")?
.into_vec();
}
let search_paths = search_paths
.to_search_paths(db.system(), db.vendored())
.context("Invalid search path settings")?;
Program::from_settings(
&db,
ProgramSettings {
python_version: Some(PythonVersionWithSource {
python_version: PythonVersionWithSource {
version: python_version,
source: PythonVersionSource::default(),
}),
},
python_platform: PythonPlatform::default(),
search_paths,
},
)?;
);
Ok(db)
}
}
impl Upcast<dyn SourceDb> for ModuleDb {
fn upcast(&self) -> &(dyn SourceDb + 'static) {
self
}
fn upcast_mut(&mut self) -> &mut (dyn SourceDb + 'static) {
self
}
}
#[salsa::db]
impl SourceDb for ModuleDb {
fn vendored(&self) -> &VendoredFileSystem {

View File

@@ -36,14 +36,20 @@ impl fmt::Display for AnalyzeSettings {
}
#[derive(Default, Debug, Copy, Clone, PartialEq, Eq, CacheKey)]
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
#[cfg_attr(
feature = "serde",
derive(serde::Serialize, serde::Deserialize),
serde(rename_all = "kebab-case")
)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
#[cfg_attr(feature = "clap", derive(clap::ValueEnum))]
pub enum Direction {
/// Construct a map from module to its dependencies (i.e., the modules that it imports).
#[default]
#[cfg_attr(feature = "serde", serde(alias = "Dependencies"))]
Dependencies,
/// Construct a map from module to its dependents (i.e., the modules that import it).
#[cfg_attr(feature = "serde", serde(alias = "Dependents"))]
Dependents,
}
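
With `rename_all = "kebab-case"` plus the explicit aliases, both the new lowercase spelling and the old capitalized one deserialize. A minimal sketch (a standalone re-declaration purely for illustration, assuming `serde` and `serde_json` are available):

```rust
use serde::Deserialize;

#[derive(Debug, PartialEq, Deserialize)]
#[serde(rename_all = "kebab-case")]
enum Direction {
    #[serde(alias = "Dependencies")]
    Dependencies,
    #[serde(alias = "Dependents")]
    Dependents,
}

fn main() {
    // The kebab-case form is canonical...
    let new: Direction = serde_json::from_str("\"dependencies\"").unwrap();
    // ...while the old capitalized form is still accepted via the alias.
    let old: Direction = serde_json::from_str("\"Dependencies\"").unwrap();
    assert_eq!(new, old);
}
```
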

View File

@@ -14,6 +14,7 @@ license = { workspace = true }
doctest = false
[dependencies]
get-size2 = { workspace = true }
ruff_macros = { workspace = true }
salsa = { workspace = true, optional = true }

View File

@@ -6,7 +6,7 @@ use std::marker::PhantomData;
use std::ops::{Deref, DerefMut, RangeBounds};
/// An owned sequence of `T` indexed by `I`
#[derive(Clone, PartialEq, Eq, Hash)]
#[derive(Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
#[repr(transparent)]
pub struct IndexVec<I, T> {
pub raw: Vec<T>,
@@ -191,6 +191,6 @@ where
#[expect(unsafe_code)]
unsafe fn maybe_update(old_pointer: *mut Self, new_value: Self) -> bool {
let old_vec: &mut IndexVec<I, T> = unsafe { &mut *old_pointer };
unsafe { salsa::Update::maybe_update(&mut old_vec.raw, new_value.raw) }
unsafe { salsa::Update::maybe_update(&raw mut old_vec.raw, new_value.raw) }
}
}

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.12.0"
version = "0.12.1"
publish = false
authors = { workspace = true }
edition = { workspace = true }
@@ -38,6 +38,7 @@ colored = { workspace = true }
fern = { workspace = true }
glob = { workspace = true }
globset = { workspace = true }
hashbrown = { workspace = true }
imperative = { workspace = true }
is-macro = { workspace = true }
is-wsl = { workspace = true }

View File

@@ -0,0 +1,5 @@
foo or{x: None for x in bar}
# C420 fix must make sure to insert a leading space if needed,
# See https://github.com/astral-sh/ruff/issues/18599

View File

@@ -0,0 +1,6 @@
def f_byte():
raise RuntimeError(b"This is an example exception")
def f_byte_empty():
raise RuntimeError(b"")

View File

@@ -0,0 +1,2 @@
#!/usr/bin/env -S uv tool run ruff check --isolated --select EXE003
print("hello world")

View File

@@ -0,0 +1,2 @@
#!/usr/bin/env -S uvx ruff check --isolated --select EXE003
print("hello world")

View File

@@ -119,4 +119,26 @@ field35: "int | str | int" # Error
# Technically, this falls into the domain of the rule, but it is an unlikely edge case:
# it only works if you have `from __future__ import annotations` at the top of the file,
# and stringified annotations are discouraged in stub files.
field36: "int | str" | int # Ok
field36: "int | str" | int # Ok
# https://github.com/astral-sh/ruff/issues/18546
# Expand Optional[T] to Union[T, None]
# OK
field37: typing.Optional[int]
field38: typing.Union[int, None]
# equivalent to None
field39: typing.Optional[None]
# equivalent to int | None
field40: typing.Union[typing.Optional[int], None]
field41: typing.Optional[typing.Union[int, None]]
field42: typing.Union[typing.Optional[int], typing.Optional[int]]
field43: typing.Optional[int] | None
field44: typing.Optional[int | None]
field45: typing.Optional[int] | typing.Optional[int]
# equivalent to int | dict | None
field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
field47: typing.Optional[int] | typing.Optional[dict]
# avoid reporting twice
field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
field49: typing.Optional[complex | complex] | complex

View File

@@ -111,3 +111,25 @@ field33: typing.Union[typing.Union[int | int] | typing.Union[int | int]] # Error
# Test case for mixed union type
field34: typing.Union[list[int], str] | typing.Union[bytes, list[int]] # Error
# https://github.com/astral-sh/ruff/issues/18546
# Expand Optional[T] to Union[T, None]
# OK
field37: typing.Optional[int]
field38: typing.Union[int, None]
# equivalent to None
field39: typing.Optional[None]
# equivalent to int | None
field40: typing.Union[typing.Optional[int], None]
field41: typing.Optional[typing.Union[int, None]]
field42: typing.Union[typing.Optional[int], typing.Optional[int]]
field43: typing.Optional[int] | None
field44: typing.Optional[int | None]
field45: typing.Optional[int] | typing.Optional[int]
# equivalent to int | dict | None
field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
field47: typing.Optional[int] | typing.Optional[dict]
# avoid reporting twice
field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
field49: typing.Optional[complex | complex] | complex

View File

@@ -83,3 +83,13 @@ def aliased_parentheses_no_params_multiline():
# scope="module"
)
def my_fixture(): ...
@(pytest.fixture())
def outer_paren_fixture_no_params():
return 42
@(fixture())
def outer_paren_imported_fixture_no_params():
return 42

View File

@@ -88,3 +88,14 @@ class TestClass:
# ),
)
def test_bar(param1, param2): ...
@(pytest.mark.foo())
def test_outer_paren_mark_function():
pass
class TestClass:
@(pytest.mark.foo())
def test_method_outer_paren():
pass

View File

@@ -128,3 +128,16 @@ def write_models(directory, Models):
pass; \
\
#
# Regression tests for: https://github.com/astral-sh/ruff/issues/18209
try:
1 / 0
except ():
pass
BaseException = ValueError
try:
int("a")
except BaseException:
pass

View File

@@ -29,3 +29,8 @@ def foo():
dict = {}
for country, stars in zip(dict.keys(), dict.values()):
...
# https://github.com/astral-sh/ruff/issues/18776
flag_stars = {}
for country, stars in(zip)(flag_stars.keys(), flag_stars.values()):...

View File

@@ -2,13 +2,81 @@ import os.path
from pathlib import Path
from os.path import getsize
filename = "filename"
filename1 = __file__
filename2 = Path("filename")
os.path.getsize("filename")
os.path.getsize(b"filename")
os.path.getsize(Path("filename"))
os.path.getsize(__file__)
os.path.getsize(filename)
os.path.getsize(filename1)
os.path.getsize(filename2)
os.path.getsize(filename="filename")
os.path.getsize(filename=b"filename")
os.path.getsize(filename=Path("filename"))
os.path.getsize(filename=__file__)
getsize("filename")
getsize(b"filename")
getsize(Path("filename"))
getsize(__file__)
getsize(filename="filename")
getsize(filename=b"filename")
getsize(filename=Path("filename"))
getsize(filename=__file__)
getsize(filename)
getsize(filename1)
getsize(filename2)
os.path.getsize(
"filename", # comment
)
os.path.getsize(
# comment
"filename"
,
# comment
)
os.path.getsize(
# comment
b"filename"
# comment
)
os.path.getsize( # comment
Path(__file__)
# comment
) # comment
getsize( # comment
"filename")
getsize( # comment
b"filename",
#comment
)
os.path.getsize("file" + "name")
getsize \
\
\
( # comment
"filename",
)
getsize(Path("filename").resolve())
import pathlib
os.path.getsize(pathlib.Path("filename"))

View File

@@ -0,0 +1,5 @@
import os
os.path.getsize(filename="filename")
os.path.getsize(filename=b"filename")
os.path.getsize(filename=__file__)

View File

@@ -85,3 +85,19 @@ import builtins
for i in builtins.list(nested_tuple): # PERF101
pass
# https://github.com/astral-sh/ruff/issues/18783
items = (1, 2, 3)
for i in(list)(items):
print(i)
# https://github.com/astral-sh/ruff/issues/18784
items = (1, 2, 3)
for i in ( # 1
list # 2
# 3
)( # 4
items # 5
# 6
):
print(i)

View File

@@ -163,4 +163,32 @@ def foo():
for k, v in src:
if lambda: 0:
dst[k] = v
dst[k] = v
# https://github.com/astral-sh/ruff/issues/18859
def foo():
v = {}
for o,(x,)in():
v[x,]=o
# https://github.com/astral-sh/ruff/issues/19005
def issue_19005_1():
c = {}
a = object()
for a.b in ():
c[a.b] = a.b
def issue_19005_2():
a = object()
c = {}
for a.k, a.v in ():
c[a.k] = a.v
def issue_19005_3():
a = [None, None]
c = {}
for a[0], a[1] in ():
c[a[0]] = a[1]

View File

@@ -69,3 +69,11 @@ def func():
Returns:
the value
"""
def func():
("""Docstring.
Raises:
ValueError: An error.
""")

View File

@@ -14,3 +14,6 @@ hidden = {"a": "!"}
"" % {
'test1': '',
'test2': '',
}
# https://github.com/astral-sh/ruff/issues/18806

View File

@@ -4,3 +4,11 @@
"{bar:{spam}}".format(bar=2, spam=3, eggs=4, ham=5) # F522
(''
.format(x=2)) # F522
# https://github.com/astral-sh/ruff/issues/18806
# The fix here is unsafe because the unused argument has a side effect
"Hello, {name}".format(greeting=print(1), name="World")
# The fix here is safe because the unused argument has no side effect,
# even though the used argument has a side effect
"Hello, {name}".format(greeting="Pikachu", name=print(1))

View File

@@ -35,3 +35,11 @@
# Removing the final argument.
"Hello".format("world")
"Hello".format("world", key="value")
# https://github.com/astral-sh/ruff/issues/18806
# The fix here is unsafe because the unused argument has a side effect
"Hello, {0}".format("world", print(1))
# The fix here is safe because the unused argument has no side effect,
# even though the used argument has a side effect
"Hello, {0}".format(print(1), "Pikachu")

View File

@@ -1,3 +1,5 @@
# Mock objects
# ============
# Errors
assert my_mock.not_called()
assert my_mock.called_once_with()
@@ -17,3 +19,25 @@ my_mock.assert_called()
my_mock.assert_called_once_with()
"""like :meth:`Mock.assert_called_once_with`"""
"""like :meth:`MagicMock.assert_called_once_with`"""
# AsyncMock objects
# =================
# Errors
assert my_mock.not_awaited()
assert my_mock.awaited_once_with()
assert my_mock.not_awaited
assert my_mock.awaited_once_with
my_mock.assert_not_awaited
my_mock.assert_awaited
my_mock.assert_awaited_once_with
my_mock.assert_awaited_once_with
MyMock.assert_awaited_once_with
assert my_mock.awaited
# OK
assert my_mock.await_count == 1
my_mock.assert_not_awaited()
my_mock.assert_awaited()
my_mock.assert_awaited_once_with()
"""like :meth:`Mock.assert_awaited_once_with`"""
"""like :meth:`MagicMock.assert_awaited_once_with`"""

View File

@@ -14,7 +14,7 @@ min(1, min(2, 3, key=test))
# This will still trigger, to merge the calls without keyword args.
min(1, min(2, 3, key=test), min(4, 5))
# Don't provide a fix if there are comments within the call.
# The fix is already unsafe, so deleting comments is okay.
min(
1, # This is a comment.
min(2, 3),

View File

@@ -129,3 +129,6 @@ blah = dict[{"a": 1}.__delitem__("a")] # OK
# https://github.com/astral-sh/ruff/issues/14597
assert "abc".__str__() == "abc"
# https://github.com/astral-sh/ruff/issues/18813
three = 1 if 1 else(3.0).__str__()

View File

@@ -57,3 +57,7 @@ _ = lambda x: z(lambda y: x + y)(x)
# lambda uses an additional keyword
_ = lambda *args: f(*args, y=1)
_ = lambda *args: f(*args, y=x)
# https://github.com/astral-sh/ruff/issues/18675
_ = lambda x: (string := str)(x)
_ = lambda x: ((x := 1) and str)(x)

View File

@@ -90,3 +90,52 @@ class AClass:
def myfunc(param: "tuple[Union[int, 'AClass', None], str]"):
print(param)
from typing import NamedTuple, Union
import typing_extensions
from typing_extensions import (
NamedTuple as NamedTupleTE,
Union as UnionTE,
)
# Regression test for https://github.com/astral-sh/ruff/issues/18619
# Don't emit lint for `NamedTuple`
a_plain_1: Union[NamedTuple, int] = None
a_plain_2: Union[int, NamedTuple] = None
a_plain_3: Union[NamedTuple, None] = None
a_plain_4: Union[None, NamedTuple] = None
a_plain_te_1: UnionTE[NamedTupleTE, int] = None
a_plain_te_2: UnionTE[int, NamedTupleTE] = None
a_plain_te_3: UnionTE[NamedTupleTE, None] = None
a_plain_te_4: UnionTE[None, NamedTupleTE] = None
a_plain_typing_1: UnionTE[typing.NamedTuple, int] = None
a_plain_typing_2: UnionTE[int, typing.NamedTuple] = None
a_plain_typing_3: UnionTE[typing.NamedTuple, None] = None
a_plain_typing_4: UnionTE[None, typing.NamedTuple] = None
a_string_1: "Union[NamedTuple, int]" = None
a_string_2: "Union[int, NamedTuple]" = None
a_string_3: "Union[NamedTuple, None]" = None
a_string_4: "Union[None, NamedTuple]" = None
a_string_te_1: "UnionTE[NamedTupleTE, int]" = None
a_string_te_2: "UnionTE[int, NamedTupleTE]" = None
a_string_te_3: "UnionTE[NamedTupleTE, None]" = None
a_string_te_4: "UnionTE[None, NamedTupleTE]" = None
a_string_typing_1: "typing.Union[typing.NamedTuple, int]" = None
a_string_typing_2: "typing.Union[int, typing.NamedTuple]" = None
a_string_typing_3: "typing.Union[typing.NamedTuple, None]" = None
a_string_typing_4: "typing.Union[None, typing.NamedTuple]" = None
b_plain_1: Union[NamedTuple] = None
b_plain_2: Union[NamedTuple, None] = None
b_plain_te_1: UnionTE[NamedTupleTE] = None
b_plain_te_2: UnionTE[NamedTupleTE, None] = None
b_plain_typing_1: UnionTE[typing.NamedTuple] = None
b_plain_typing_2: UnionTE[typing.NamedTuple, None] = None
b_string_1: "Union[NamedTuple]" = None
b_string_2: "Union[NamedTuple, None]" = None
b_string_te_1: "UnionTE[NamedTupleTE]" = None
b_string_te_2: "UnionTE[NamedTupleTE, None]" = None
b_string_typing_1: "typing.Union[typing.NamedTuple]" = None
b_string_typing_2: "typing.Union[typing.NamedTuple, None]" = None

View File

@@ -105,3 +105,23 @@ import builtins
class C:
def f(self):
builtins.super(C, self)
# see: https://github.com/astral-sh/ruff/issues/18533
class ClassForCommentEnthusiasts(BaseClass):
def with_comments(self):
super(
# super helpful comment
ClassForCommentEnthusiasts,
self
).f()
super(
ClassForCommentEnthusiasts,
# even more helpful comment
self
).f()
super(
ClassForCommentEnthusiasts,
self
# also a comment
).f()

View File

@@ -26,3 +26,9 @@ def hello():
f"foo"u"bar" # OK
f"foo" u"bar" # OK
# https://github.com/astral-sh/ruff/issues/18895
""u""
""u"hi"
""""""""""""""""""""u"hi"
""U"helloooo"

View File

@@ -71,3 +71,47 @@ if sys.version_info <= (3, 14, 0):
if sys.version_info <= (3, 14, 15):
print()
# https://github.com/astral-sh/ruff/issues/18165
if sys.version_info.major >= 3:
print("3")
else:
print("2")
if sys.version_info.major > 3:
print("3")
else:
print("2")
if sys.version_info.major <= 3:
print("3")
else:
print("2")
if sys.version_info.major < 3:
print("3")
else:
print("2")
if sys.version_info.major == 3:
print("3")
else:
print("2")
# Semantically incorrect, skip fixing
if sys.version_info.major[1] > 3:
print(3)
else:
print(2)
if sys.version_info.major > (3, 13):
print(3)
else:
print(2)
if sys.version_info.major[:2] > (3, 13):
print(3)
else:
print(2)

View File

@@ -47,3 +47,25 @@ class ServiceRefOrValue:
# Test for: https://github.com/astral-sh/ruff/issues/18508
# Optional[None] should not be offered a fix
foo: Optional[None] = None
from typing import NamedTuple, Optional
import typing_extensions
from typing_extensions import (
NamedTuple as NamedTupleTE,
Optional as OptionalTE,
)
# Regression test for https://github.com/astral-sh/ruff/issues/18619
# Don't emit lint for `NamedTuple`
a1: Optional[NamedTuple] = None
a2: typing.Optional[NamedTuple] = None
a3: OptionalTE[NamedTuple] = None
a4: typing_extensions.Optional[NamedTuple] = None
a5: Optional[typing.NamedTuple] = None
a6: typing.Optional[typing.NamedTuple] = None
a7: OptionalTE[typing.NamedTuple] = None
a8: typing_extensions.Optional[typing.NamedTuple] = None
a9: "Optional[NamedTuple]" = None
a10: Optional[NamedTupleTE] = None

View File

@@ -44,6 +44,13 @@ log(1, math.e)
math.log(1, 2.0001)
math.log(1, 10.0001)
# https://github.com/astral-sh/ruff/issues/18747
def log():
yield math.log((yield), math.e)
def log():
yield math.log((yield from x), math.e)
# see: https://github.com/astral-sh/ruff/issues/18639
math.log(1, 10 # comment

View File

@@ -20,6 +20,12 @@ _ = Decimal.from_float(float("-inf"))
_ = Decimal.from_float(float("Infinity"))
_ = Decimal.from_float(float("-Infinity"))
_ = Decimal.from_float(float("nan"))
_ = Decimal.from_float(float("-NaN"))
_ = Decimal.from_float(float(" \n+nan \t"))
_ = Decimal.from_float(float(" iNf\n\t "))
_ = Decimal.from_float(float(" -inF\n \t"))
_ = Decimal.from_float(float(" InfinIty\n\t "))
_ = Decimal.from_float(float(" -InfinIty\n \t"))
# OK
_ = Fraction(0.1)

View File

@@ -85,3 +85,10 @@ def _():
if isinstance(foo, type(None)):
...
# https://github.com/astral-sh/ruff/issues/19047
if isinstance(foo, ()):
pass
if isinstance(foo, Union[()]):
pass

View File

@@ -110,3 +110,19 @@ class ShouldMatchB008RuleOfImmutableTypeAnnotationIgnored:
# ignored
this_is_fine: int = f()
# Test for:
# https://github.com/astral-sh/ruff/issues/17424
@dataclass(frozen=True)
class C:
foo: int = 1
@dataclass
class D:
c: C = C()
@dataclass
class E:
c: C = C()

View File

@@ -73,3 +73,55 @@ class IntConversionDescriptor:
@frozen
class InventoryItem:
quantity_on_hand: IntConversionDescriptor = IntConversionDescriptor(default=100)
# Test for:
# https://github.com/astral-sh/ruff/issues/17424
@frozen
class C:
foo: int = 1
@attr.frozen
class D:
foo: int = 1
@define
class E:
c: C = C()
d: D = D()
@attr.s
class F:
foo: int = 1
@attr.mutable
class G:
foo: int = 1
@attr.attrs
class H:
f: F = F()
g: G = G()
@attr.define
class I:
f: F = F()
g: G = G()
@attr.frozen
class J:
f: F = F()
g: G = G()
@attr.mutable
class K:
f: F = F()
g: G = G()

View File

@@ -36,5 +36,19 @@ f"{ascii(bla)}" # OK
)
# OK
# https://github.com/astral-sh/ruff/issues/16325
f"{str({})}"
f"{str({} | {})}"
import builtins
f"{builtins.repr(1)}"
f"{repr(1)=}"
f"{repr(lambda: 1)}"
f"{repr(x := 2)}"
f"{str(object=3)}"

View File

@@ -1,5 +1,5 @@
from collections import deque
import collections
from collections import deque
def f():
@@ -91,3 +91,14 @@ def f():
def f():
deque([], iterable=[])
# https://github.com/astral-sh/ruff/issues/18854
deque("")
deque(b"")
deque(f"")
deque(f"" "")
deque(f"" f"")
deque("abc") # OK
deque(b"abc") # OK
deque(f"" "a") # OK
deque(f"{x}" "") # OK

View File

@@ -34,7 +34,7 @@ pub(crate) fn bindings(checker: &Checker) {
if binding.kind.is_bound_exception()
&& binding.is_unused()
&& !checker
.settings
.settings()
.dummy_variable_rgx
.is_match(binding.name(checker.source()))
{
@@ -67,7 +67,7 @@ pub(crate) fn bindings(checker: &Checker) {
flake8_import_conventions::rules::unconventional_import_alias(
checker,
binding,
&checker.settings.flake8_import_conventions.aliases,
&checker.settings().flake8_import_conventions.aliases,
);
}
if checker.is_rule_enabled(Rule::UnaliasedCollectionsAbcSetImport) {

View File

@@ -1,12 +1,7 @@
use ruff_python_semantic::analyze::visibility;
use ruff_python_semantic::{Binding, BindingKind, Imported, ResolvedReference, ScopeKind};
use ruff_text_size::Ranged;
use rustc_hash::FxHashMap;
use ruff_python_semantic::{Binding, ScopeKind};
use crate::Fix;
use crate::checkers::ast::Checker;
use crate::codes::Rule;
use crate::fix;
use crate::rules::{
flake8_builtins, flake8_pyi, flake8_type_checking, flake8_unused_arguments, pep8_naming,
pyflakes, pylint, ruff,
@@ -76,7 +71,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
flake8_type_checking::helpers::is_valid_runtime_import(
binding,
&checker.semantic,
&checker.settings.flake8_type_checking,
&checker.settings().flake8_type_checking,
)
})
.collect()
@@ -94,260 +89,19 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
}
if checker.is_rule_enabled(Rule::GlobalVariableNotAssigned) {
for (name, binding_id) in scope.bindings() {
let binding = checker.semantic.binding(binding_id);
// If the binding is a `global`, then it's a top-level `global` that was never
// assigned in the current scope. If it were assigned, the `global` would be
// shadowed by the assignment.
if binding.kind.is_global() {
// If the binding was conditionally deleted, it will include a reference within
// a `Del` context, but won't be shadowed by a `BindingKind::Deletion`, as in:
// ```python
// if condition:
// del var
// ```
if binding
.references
.iter()
.map(|id| checker.semantic.reference(*id))
.all(ResolvedReference::is_load)
{
checker.report_diagnostic(
pylint::rules::GlobalVariableNotAssigned {
name: (*name).to_string(),
},
binding.range(),
);
}
}
}
pylint::rules::global_variable_not_assigned(checker, scope);
}
if checker.is_rule_enabled(Rule::RedefinedArgumentFromLocal) {
for (name, binding_id) in scope.bindings() {
for shadow in checker.semantic.shadowed_bindings(scope_id, binding_id) {
let binding = &checker.semantic.bindings[shadow.binding_id()];
if !matches!(
binding.kind,
BindingKind::LoopVar
| BindingKind::BoundException
| BindingKind::WithItemVar
) {
continue;
}
let shadowed = &checker.semantic.bindings[shadow.shadowed_id()];
if !shadowed.kind.is_argument() {
continue;
}
if checker.settings.dummy_variable_rgx.is_match(name) {
continue;
}
let scope = &checker.semantic.scopes[binding.scope];
if scope.kind.is_generator() {
continue;
}
checker.report_diagnostic(
pylint::rules::RedefinedArgumentFromLocal {
name: name.to_string(),
},
binding.range(),
);
}
}
pylint::rules::redefined_argument_from_local(checker, scope_id, scope);
}
if checker.is_rule_enabled(Rule::ImportShadowedByLoopVar) {
for (name, binding_id) in scope.bindings() {
for shadow in checker.semantic.shadowed_bindings(scope_id, binding_id) {
// If the shadowing binding isn't a loop variable, abort.
let binding = &checker.semantic.bindings[shadow.binding_id()];
if !binding.kind.is_loop_var() {
continue;
}
// If the shadowed binding isn't an import, abort.
let shadowed = &checker.semantic.bindings[shadow.shadowed_id()];
if !matches!(
shadowed.kind,
BindingKind::Import(..)
| BindingKind::FromImport(..)
| BindingKind::SubmoduleImport(..)
| BindingKind::FutureImport
) {
continue;
}
// If the bindings are in different forks, abort.
if shadowed.source.is_none_or(|left| {
binding
.source
.is_none_or(|right| !checker.semantic.same_branch(left, right))
}) {
continue;
}
checker.report_diagnostic(
pyflakes::rules::ImportShadowedByLoopVar {
name: name.to_string(),
row: checker.compute_source_row(shadowed.start()),
},
binding.range(),
);
}
}
pyflakes::rules::import_shadowed_by_loop_var(checker, scope_id, scope);
}
if checker.is_rule_enabled(Rule::RedefinedWhileUnused) {
// Index the redefined bindings by statement.
let mut redefinitions = FxHashMap::default();
for (name, binding_id) in scope.bindings() {
for shadow in checker.semantic.shadowed_bindings(scope_id, binding_id) {
// If the shadowing binding is a loop variable, abort, to avoid overlap
// with F402.
let binding = &checker.semantic.bindings[shadow.binding_id()];
if binding.kind.is_loop_var() {
continue;
}
// If the shadowed binding is used, abort.
let shadowed = &checker.semantic.bindings[shadow.shadowed_id()];
if shadowed.is_used() {
continue;
}
// If the shadowing binding isn't considered a "redefinition" of the
// shadowed binding, abort.
if !binding.redefines(shadowed) {
continue;
}
if shadow.same_scope() {
// If the symbol is a dummy variable, abort, unless the shadowed
// binding is an import.
if !matches!(
shadowed.kind,
BindingKind::Import(..)
| BindingKind::FromImport(..)
| BindingKind::SubmoduleImport(..)
| BindingKind::FutureImport
) && checker.settings.dummy_variable_rgx.is_match(name)
{
continue;
}
let Some(node_id) = shadowed.source else {
continue;
};
// If this is an overloaded function, abort.
if shadowed.kind.is_function_definition() {
if checker
.semantic
.statement(node_id)
.as_function_def_stmt()
.is_some_and(|function| {
visibility::is_overload(
&function.decorator_list,
&checker.semantic,
)
})
{
continue;
}
}
} else {
// Only enforce cross-scope shadowing for imports.
if !matches!(
shadowed.kind,
BindingKind::Import(..)
| BindingKind::FromImport(..)
| BindingKind::SubmoduleImport(..)
| BindingKind::FutureImport
) {
continue;
}
}
// If the bindings are in different forks, abort.
if shadowed.source.is_none_or(|left| {
binding
.source
.is_none_or(|right| !checker.semantic.same_branch(left, right))
}) {
continue;
}
redefinitions
.entry(binding.source)
.or_insert_with(Vec::new)
.push((shadowed, binding));
}
}
// Create a fix for each source statement.
let mut fixes = FxHashMap::default();
for (source, entries) in &redefinitions {
let Some(source) = source else {
continue;
};
let member_names = entries
.iter()
.filter_map(|(shadowed, binding)| {
if let Some(shadowed_import) = shadowed.as_any_import() {
if let Some(import) = binding.as_any_import() {
if shadowed_import.qualified_name() == import.qualified_name() {
return Some(import.member_name());
}
}
}
None
})
.collect::<Vec<_>>();
if !member_names.is_empty() {
let statement = checker.semantic.statement(*source);
let parent = checker.semantic.parent_statement(*source);
let Ok(edit) = fix::edits::remove_unused_imports(
member_names.iter().map(std::convert::AsRef::as_ref),
statement,
parent,
checker.locator(),
checker.stylist(),
checker.indexer(),
) else {
continue;
};
fixes.insert(
*source,
Fix::safe_edit(edit).isolate(Checker::isolation(
checker.semantic().parent_statement_id(*source),
)),
);
}
}
// Create diagnostics for each statement.
for (source, entries) in &redefinitions {
for (shadowed, binding) in entries {
let mut diagnostic = checker.report_diagnostic(
pyflakes::rules::RedefinedWhileUnused {
name: binding.name(checker.source()).to_string(),
row: checker.compute_source_row(shadowed.start()),
},
binding.range(),
);
if let Some(range) = binding.parent_range(&checker.semantic) {
diagnostic.set_parent(range.start());
}
if let Some(fix) = source.as_ref().and_then(|source| fixes.get(source)) {
diagnostic.set_fix(fix.clone());
}
}
}
pyflakes::rules::redefined_while_unused(checker, scope_id, scope);
}
if checker.source_type.is_stub()
@@ -402,7 +156,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
&& binding.is_unused()
&& !binding.is_nonlocal()
&& !binding.is_global()
&& !checker.settings.dummy_variable_rgx.is_match(name)
&& !checker.settings().dummy_variable_rgx.is_match(name)
&& !matches!(
name,
"__tracebackhide__"

View File

@@ -167,7 +167,7 @@ pub(crate) fn definitions(checker: &mut Checker) {
if enforce_docstrings || enforce_pydoclint {
if pydocstyle::helpers::should_ignore_definition(
definition,
&checker.settings.pydocstyle,
&checker.settings().pydocstyle,
&checker.semantic,
) {
continue;
@@ -253,7 +253,7 @@ pub(crate) fn definitions(checker: &mut Checker) {
pydocstyle::rules::non_imperative_mood(
checker,
&docstring,
&checker.settings.pydocstyle,
&checker.settings().pydocstyle,
);
}
if checker.is_rule_enabled(Rule::SignatureInDocstring) {
@@ -292,7 +292,7 @@ pub(crate) fn definitions(checker: &mut Checker) {
if enforce_sections || enforce_pydoclint {
let section_contexts = pydocstyle::helpers::get_section_contexts(
&docstring,
checker.settings.pydocstyle.convention(),
checker.settings().pydocstyle.convention(),
);
if enforce_sections {
@@ -300,7 +300,7 @@ pub(crate) fn definitions(checker: &mut Checker) {
checker,
&docstring,
&section_contexts,
checker.settings.pydocstyle.convention(),
checker.settings().pydocstyle.convention(),
);
}
@@ -310,7 +310,7 @@ pub(crate) fn definitions(checker: &mut Checker) {
definition,
&docstring,
&section_contexts,
checker.settings.pydocstyle.convention(),
checker.settings().pydocstyle.convention(),
);
}
}

View File

@@ -41,7 +41,7 @@ pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &Checker)
except_handler,
type_.as_deref(),
body,
checker.settings.flake8_bandit.check_typed_exception,
checker.settings().flake8_bandit.check_typed_exception,
);
}
if checker.is_rule_enabled(Rule::TryExceptContinue) {
@@ -50,7 +50,7 @@ pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &Checker)
except_handler,
type_.as_deref(),
body,
checker.settings.flake8_bandit.check_typed_exception,
checker.settings().flake8_bandit.check_typed_exception,
);
}
if checker.is_rule_enabled(Rule::ExceptWithEmptyTuple) {

View File

@@ -7,6 +7,7 @@ use ruff_python_semantic::analyze::typing;
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
use crate::preview::is_optional_as_none_in_union_enabled;
use crate::registry::Rule;
use crate::rules::{
airflow, flake8_2020, flake8_async, flake8_bandit, flake8_boolean_trap, flake8_bugbear,
@@ -35,7 +36,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
&& checker.target_version() < PythonVersion::PY310
&& checker.target_version() >= PythonVersion::PY37
&& checker.semantic.in_annotation()
&& !checker.settings.pyupgrade.keep_runtime_typing
&& !checker.settings().pyupgrade.keep_runtime_typing
{
flake8_future_annotations::rules::future_rewritable_type_annotation(
checker, value,
@@ -51,7 +52,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
|| (checker.target_version() >= PythonVersion::PY37
&& checker.semantic.future_annotations_or_stub()
&& checker.semantic.in_annotation()
&& !checker.settings.pyupgrade.keep_runtime_typing)
&& !checker.settings().pyupgrade.keep_runtime_typing)
{
pyupgrade::rules::non_pep604_annotation(checker, expr, slice, operator);
}
@@ -90,7 +91,13 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if checker.is_rule_enabled(Rule::UnnecessaryLiteralUnion) {
flake8_pyi::rules::unnecessary_literal_union(checker, expr);
}
if checker.is_rule_enabled(Rule::DuplicateUnionMember) {
if checker.is_rule_enabled(Rule::DuplicateUnionMember)
// Avoid duplicate checks inside `Optional`
&& !(
is_optional_as_none_in_union_enabled(checker.settings())
&& checker.semantic.inside_optional()
)
{
flake8_pyi::rules::duplicate_union_member(checker, expr);
}
if checker.is_rule_enabled(Rule::RedundantLiteralUnion) {
@@ -288,7 +295,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
&& checker.target_version() < PythonVersion::PY39
&& checker.target_version() >= PythonVersion::PY37
&& checker.semantic.in_annotation()
&& !checker.settings.pyupgrade.keep_runtime_typing
&& !checker.settings().pyupgrade.keep_runtime_typing
{
flake8_future_annotations::rules::future_rewritable_type_annotation(checker, expr);
}
@@ -299,7 +306,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
|| (checker.target_version() >= PythonVersion::PY37
&& checker.semantic.future_annotations_or_stub()
&& checker.semantic.in_annotation()
&& !checker.settings.pyupgrade.keep_runtime_typing)
&& !checker.settings().pyupgrade.keep_runtime_typing)
{
pyupgrade::rules::use_pep585_annotation(
checker,
@@ -395,7 +402,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
&& checker.target_version() < PythonVersion::PY39
&& checker.target_version() >= PythonVersion::PY37
&& checker.semantic.in_annotation()
&& !checker.settings.pyupgrade.keep_runtime_typing
&& !checker.settings().pyupgrade.keep_runtime_typing
{
flake8_future_annotations::rules::future_rewritable_type_annotation(
checker, expr,
@@ -408,7 +415,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
|| (checker.target_version() >= PythonVersion::PY37
&& checker.semantic.future_annotations_or_stub()
&& checker.semantic.in_annotation()
&& !checker.settings.pyupgrade.keep_runtime_typing)
&& !checker.settings().pyupgrade.keep_runtime_typing)
{
pyupgrade::rules::use_pep585_annotation(checker, expr, &replacement);
}
@@ -539,14 +546,13 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
let location = expr.range();
match pyflakes::format::FormatSummary::try_from(string_value.to_str()) {
Err(e) => {
if checker.is_rule_enabled(Rule::StringDotFormatInvalidFormat) {
checker.report_diagnostic(
pyflakes::rules::StringDotFormatInvalidFormat {
message: pyflakes::format::error_to_string(&e),
},
location,
);
}
// F521
checker.report_diagnostic_if_enabled(
pyflakes::rules::StringDotFormatInvalidFormat {
message: pyflakes::format::error_to_string(&e),
},
location,
);
}
Ok(summary) => {
if checker
@@ -841,7 +847,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
flake8_comprehensions::rules::unnecessary_collection_call(
checker,
call,
&checker.settings.flake8_comprehensions,
&checker.settings().flake8_comprehensions,
);
}
if checker.is_rule_enabled(Rule::UnnecessaryLiteralWithinTupleCall) {
@@ -1006,7 +1012,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
Rule::PrintfInGetTextFuncCall,
]) && flake8_gettext::is_gettext_func_call(
func,
&checker.settings.flake8_gettext.functions_names,
&checker.settings().flake8_gettext.functions_names,
) {
if checker.is_rule_enabled(Rule::FStringInGetTextFuncCall) {
flake8_gettext::rules::f_string_in_gettext_func_call(checker, args);
@@ -1056,7 +1062,6 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
Rule::OsPathSplitext,
Rule::BuiltinOpen,
Rule::PyPath,
Rule::OsPathGetsize,
Rule::OsPathGetatime,
Rule::OsPathGetmtime,
Rule::OsPathGetctime,
@@ -1066,6 +1071,9 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
]) {
flake8_use_pathlib::rules::replaceable_by_pathlib(checker, call);
}
if checker.is_rule_enabled(Rule::OsPathGetsize) {
flake8_use_pathlib::rules::os_path_getsize(checker, call);
}
if checker.is_rule_enabled(Rule::PathConstructorCurrentDirectory) {
flake8_use_pathlib::rules::path_constructor_current_directory(checker, call);
}
@@ -1313,26 +1321,22 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
typ: CFormatErrorType::UnsupportedFormatChar(c),
..
}) => {
if checker
.is_rule_enabled(Rule::PercentFormatUnsupportedFormatCharacter)
{
checker.report_diagnostic(
pyflakes::rules::PercentFormatUnsupportedFormatCharacter {
char: c,
},
location,
);
}
// F509
checker.report_diagnostic_if_enabled(
pyflakes::rules::PercentFormatUnsupportedFormatCharacter {
char: c,
},
location,
);
}
Err(e) => {
if checker.is_rule_enabled(Rule::PercentFormatInvalidFormat) {
checker.report_diagnostic(
pyflakes::rules::PercentFormatInvalidFormat {
message: e.to_string(),
},
location,
);
}
// F501
checker.report_diagnostic_if_enabled(
pyflakes::rules::PercentFormatInvalidFormat {
message: e.to_string(),
},
location,
);
}
Ok(summary) => {
if checker.is_rule_enabled(Rule::PercentFormatExpectedMapping) {
@@ -1433,6 +1437,11 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if !checker.semantic.in_nested_union() {
if checker.is_rule_enabled(Rule::DuplicateUnionMember)
&& checker.semantic.in_type_definition()
// Avoid duplicate checks inside `Optional`
&& !(
is_optional_as_none_in_union_enabled(checker.settings())
&& checker.semantic.inside_optional()
)
{
flake8_pyi::rules::duplicate_union_member(checker, expr);
}

View File

@@ -45,18 +45,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
}
if checker.is_rule_enabled(Rule::NonlocalWithoutBinding) {
if !checker.semantic.scope_id.is_global() {
for name in names {
if checker.semantic.nonlocal(name).is_none() {
checker.report_diagnostic(
pylint::rules::NonlocalWithoutBinding {
name: name.to_string(),
},
name.range(),
);
}
}
}
pylint::rules::nonlocal_without_binding(checker, nonlocal);
}
if checker.is_rule_enabled(Rule::NonlocalAndGlobal) {
pylint::rules::nonlocal_and_global(checker, nonlocal);
@@ -132,7 +121,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
name,
decorator_list,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
&checker.semantic,
);
}
@@ -189,7 +178,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker.semantic.current_scope(),
stmt,
name,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::GlobalStatement) {
@@ -240,7 +229,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
name,
body,
checker.settings.mccabe.max_complexity,
checker.settings().mccabe.max_complexity,
);
}
if checker.is_rule_enabled(Rule::HardcodedPasswordDefault) {
@@ -265,7 +254,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
stmt,
body,
checker.settings.pylint.max_returns,
checker.settings().pylint.max_returns,
);
}
if checker.is_rule_enabled(Rule::TooManyBranches) {
@@ -273,7 +262,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
stmt,
body,
checker.settings.pylint.max_branches,
checker.settings().pylint.max_branches,
);
}
if checker.is_rule_enabled(Rule::TooManyStatements) {
@@ -281,7 +270,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
stmt,
body,
checker.settings.pylint.max_statements,
checker.settings().pylint.max_statements,
);
}
if checker.any_rule_enabled(&[
@@ -431,7 +420,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pylint::rules::too_many_public_methods(
checker,
class_def,
checker.settings.pylint.max_public_methods,
checker.settings().pylint.max_public_methods,
);
}
if checker.is_rule_enabled(Rule::GlobalStatement) {
@@ -459,7 +448,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
stmt,
name,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::ErrorSuffixOnExceptionName) {
@@ -468,7 +457,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
arguments.as_deref(),
name,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if !checker.source_type.is_stub() {
@@ -646,7 +635,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::LowercaseImportedAsNonLowercase) {
@@ -656,7 +645,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::CamelcaseImportedAsLowercase) {
@@ -666,7 +655,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::CamelcaseImportedAsConstant) {
@@ -676,7 +665,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::CamelcaseImportedAsAcronym) {
@@ -692,7 +681,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
&alias.name,
asname,
&checker.settings.flake8_import_conventions.banned_aliases,
&checker.settings().flake8_import_conventions.banned_aliases,
);
}
}
@@ -828,6 +817,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pyflakes::rules::future_feature_not_defined(checker, alias);
}
} else if &alias.name == "*" {
// F406
if checker.is_rule_enabled(Rule::UndefinedLocalWithNestedImportStarUsage) {
if !matches!(checker.semantic.current_scope().kind, ScopeKind::Module) {
checker.report_diagnostic(
@@ -838,14 +828,13 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
);
}
}
if checker.is_rule_enabled(Rule::UndefinedLocalWithImportStar) {
checker.report_diagnostic(
pyflakes::rules::UndefinedLocalWithImportStar {
name: helpers::format_import_from(level, module).to_string(),
},
stmt.range(),
);
}
// F403
checker.report_diagnostic_if_enabled(
pyflakes::rules::UndefinedLocalWithImportStar {
name: helpers::format_import_from(level, module).to_string(),
},
stmt.range(),
);
}
if checker.is_rule_enabled(Rule::RelativeImports) {
flake8_tidy_imports::rules::banned_relative_import(
@@ -854,7 +843,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
level,
module,
checker.module.qualified_name(),
checker.settings.flake8_tidy_imports.ban_relative_imports,
checker.settings().flake8_tidy_imports.ban_relative_imports,
);
}
if checker.is_rule_enabled(Rule::Debugger) {
@@ -869,7 +858,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
&qualified_name,
asname,
&checker.settings.flake8_import_conventions.banned_aliases,
&checker.settings().flake8_import_conventions.banned_aliases,
);
}
}
@@ -881,7 +870,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::LowercaseImportedAsNonLowercase) {
@@ -891,7 +880,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::CamelcaseImportedAsLowercase) {
@@ -901,7 +890,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::CamelcaseImportedAsConstant) {
@@ -911,7 +900,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
&checker.settings().pep8_naming.ignore_names,
);
}
if checker.is_rule_enabled(Rule::CamelcaseImportedAsAcronym) {
@@ -947,7 +936,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
stmt,
&helpers::format_import_from(level, module),
&checker.settings.flake8_import_conventions.banned_from,
&checker.settings().flake8_import_conventions.banned_from,
);
}
if checker.is_rule_enabled(Rule::ByteStringUsage) {

View File

@@ -34,7 +34,7 @@ pub(crate) fn string_like(string_like: StringLike, checker: &Checker) {
flake8_quotes::rules::unnecessary_escaped_quote(checker, string_like);
}
if checker.is_rule_enabled(Rule::AvoidableEscapedQuote)
&& checker.settings.flake8_quotes.avoid_escape
&& checker.settings().flake8_quotes.avoid_escape
{
flake8_quotes::rules::avoidable_escaped_quote(checker, string_like);
}

View File

@@ -13,15 +13,15 @@ pub(crate) fn unresolved_references(checker: &Checker) {
for reference in checker.semantic.unresolved_references() {
if reference.is_wildcard_import() {
if checker.is_rule_enabled(Rule::UndefinedLocalWithImportStarUsage) {
checker.report_diagnostic(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: reference.name(checker.source()).to_string(),
},
reference.range(),
);
}
// F405
checker.report_diagnostic_if_enabled(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: reference.name(checker.source()).to_string(),
},
reference.range(),
);
} else {
// F821
if checker.is_rule_enabled(Rule::UndefinedName) {
if checker.semantic.in_no_type_check() {
continue;

View File

@@ -28,7 +28,7 @@ use itertools::Itertools;
use log::debug;
use rustc_hash::{FxHashMap, FxHashSet};
use ruff_diagnostics::IsolationLevel;
use ruff_diagnostics::{Applicability, Fix, IsolationLevel};
use ruff_notebook::{CellOffsets, NotebookIndex};
use ruff_python_ast::helpers::{collect_import_from_member, is_docstring_stmt, to_module_path};
use ruff_python_ast::identifier::Identifier;
@@ -207,8 +207,6 @@ pub(crate) struct Checker<'a> {
/// The [`NoqaMapping`] for the current analysis (i.e., the mapping from line number to
/// suppression commented line number).
noqa_line_for: &'a NoqaMapping,
/// The [`LinterSettings`] for the current analysis, including the enabled rules.
pub(crate) settings: &'a LinterSettings,
/// The [`Locator`] for the current file, which enables extraction of source code from byte
/// offsets.
locator: &'a Locator<'a>,
@@ -260,13 +258,12 @@ impl<'a> Checker<'a> {
notebook_index: Option<&'a NotebookIndex>,
target_version: TargetVersion,
context: &'a LintContext<'a>,
) -> Checker<'a> {
) -> Self {
let semantic = SemanticModel::new(&settings.typing_modules, path, module);
Self {
parsed,
parsed_type_annotation: None,
parsed_annotations_cache: ParsedAnnotationsCache::new(parsed_annotations_arena),
settings,
noqa_line_for,
noqa,
path,
@@ -391,7 +388,7 @@ impl<'a> Checker<'a> {
/// Return a [`DiagnosticGuard`] for reporting a diagnostic.
///
/// The guard derefs to a [`Diagnostic`], so it can be used to further modify the diagnostic
/// The guard derefs to an [`OldDiagnostic`], so it can be used to further modify the diagnostic
/// before it is added to the collection in the checker on `Drop`.
pub(crate) fn report_diagnostic<'chk, T: Violation>(
&'chk self,
@@ -404,8 +401,8 @@ impl<'a> Checker<'a> {
/// Return a [`DiagnosticGuard`] for reporting a diagnostic if the corresponding rule is
/// enabled.
///
/// Prefer [`Checker::report_diagnostic`] in general because the conversion from a `Diagnostic`
/// to a `Rule` is somewhat expensive.
/// The guard derefs to an [`OldDiagnostic`], so it can be used to further modify the diagnostic
/// before it is added to the collection in the checker on `Drop`.
pub(crate) fn report_diagnostic_if_enabled<'chk, T: Violation>(
&'chk self,
kind: T,
@@ -464,6 +461,11 @@ impl<'a> Checker<'a> {
&self.semantic
}
/// The [`LinterSettings`] for the current analysis, including the enabled rules.
pub(crate) const fn settings(&self) -> &'a LinterSettings {
self.context.settings
}
/// The [`Path`] to the file under analysis.
pub(crate) const fn path(&self) -> &'a Path {
self.path
@@ -576,7 +578,7 @@ impl<'a> Checker<'a> {
) -> Option<TypingImporter<'b, 'a>> {
let source_module = if self.target_version() >= version_added_to_typing {
"typing"
} else if !self.settings.typing_extensions {
} else if !self.settings().typing_extensions {
return None;
} else {
"typing_extensions"
@@ -623,6 +625,7 @@ impl SemanticSyntaxContext for Checker<'_> {
fn report_semantic_error(&self, error: SemanticSyntaxError) {
match error.kind {
SemanticSyntaxErrorKind::LateFutureImport => {
// F404
if self.is_rule_enabled(Rule::LateFutureImport) {
self.report_diagnostic(LateFutureImport, error.range);
}
@@ -644,6 +647,7 @@ impl SemanticSyntaxContext for Checker<'_> {
}
}
SemanticSyntaxErrorKind::ReturnOutsideFunction => {
// F706
if self.is_rule_enabled(Rule::ReturnOutsideFunction) {
self.report_diagnostic(ReturnOutsideFunction, error.range);
}
@@ -1054,7 +1058,7 @@ impl<'a> Visitor<'a> for Checker<'a> {
let annotation = AnnotationContext::from_function(
function_def,
&self.semantic,
self.settings,
self.settings(),
self.target_version(),
);
@@ -1256,7 +1260,7 @@ impl<'a> Visitor<'a> for Checker<'a> {
}) => {
match AnnotationContext::from_model(
&self.semantic,
self.settings,
self.settings(),
self.target_version(),
) {
AnnotationContext::RuntimeRequired => {
@@ -1868,8 +1872,8 @@ impl<'a> Visitor<'a> for Checker<'a> {
match typing::match_annotated_subscript(
value,
&self.semantic,
self.settings.typing_modules.iter().map(String::as_str),
&self.settings.pyflakes.extend_generics,
self.settings().typing_modules.iter().map(String::as_str),
&self.settings().pyflakes.extend_generics,
) {
// Ex) Literal["Class"]
Some(typing::SubscriptKind::Literal) => {
@@ -2476,6 +2480,7 @@ impl<'a> Checker<'a> {
fn bind_builtins(&mut self) {
let target_version = self.target_version();
let settings = self.settings();
let mut bind_builtin = |builtin| {
// Add the builtin to the scope.
let binding_id = self.semantic.push_builtin();
@@ -2489,7 +2494,7 @@ impl<'a> Checker<'a> {
for builtin in MAGIC_GLOBALS {
bind_builtin(builtin);
}
for builtin in &self.settings.builtins {
for builtin in &settings.builtins {
bind_builtin(builtin);
}
}
@@ -2760,9 +2765,7 @@ impl<'a> Checker<'a> {
self.semantic.restore(snapshot);
if self.semantic.in_annotation()
&& self.semantic.in_typing_only_annotation()
{
if self.semantic.in_typing_only_annotation() {
if self.is_rule_enabled(Rule::QuotedAnnotation) {
pyupgrade::rules::quoted_annotation(self, annotation, range);
}
@@ -2805,6 +2808,7 @@ impl<'a> Checker<'a> {
Err(parse_error) => {
self.semantic.restore(snapshot);
// F722
if self.is_rule_enabled(Rule::ForwardAnnotationSyntaxError) {
self.report_type_diagnostic(
pyflakes::rules::ForwardAnnotationSyntaxError {
@@ -2952,6 +2956,7 @@ impl<'a> Checker<'a> {
self.semantic.flags -= SemanticModelFlags::DUNDER_ALL_DEFINITION;
} else {
if self.semantic.global_scope().uses_star_imports() {
// F405
if self.is_rule_enabled(Rule::UndefinedLocalWithImportStarUsage) {
self.report_diagnostic(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
@@ -2962,8 +2967,9 @@ impl<'a> Checker<'a> {
.set_parent(definition.start());
}
} else {
// F822
if self.is_rule_enabled(Rule::UndefinedExport) {
if is_undefined_export_in_dunder_init_enabled(self.settings)
if is_undefined_export_in_dunder_init_enabled(self.settings())
|| !self.path.ends_with("__init__.py")
{
self.report_diagnostic(
@@ -3115,7 +3121,6 @@ pub(crate) struct LintContext<'a> {
diagnostics: RefCell<Vec<OldDiagnostic>>,
source_file: SourceFile,
rules: RuleTable,
#[expect(unused, reason = "TODO(brent) use this instead of Checker::settings")]
settings: &'a LinterSettings,
}
@@ -3143,7 +3148,7 @@ impl<'a> LintContext<'a> {
/// Return a [`DiagnosticGuard`] for reporting a diagnostic.
///
/// The guard derefs to an [`OldDiagnostic`], so it can be used to further modify the diagnostic
/// before it is added to the collection in the collector on `Drop`.
/// before it is added to the collection in the context on `Drop`.
pub(crate) fn report_diagnostic<'chk, T: Violation>(
&'chk self,
kind: T,
@@ -3152,23 +3157,26 @@ impl<'a> LintContext<'a> {
DiagnosticGuard {
context: self,
diagnostic: Some(OldDiagnostic::new(kind, range, &self.source_file)),
rule: T::rule(),
}
}
/// Return a [`DiagnosticGuard`] for reporting a diagnostic if the corresponding rule is
/// enabled.
///
/// Prefer [`DiagnosticsCollector::report_diagnostic`] in general because the conversion from an
/// `OldDiagnostic` to a `Rule` is somewhat expensive.
/// The guard derefs to an [`OldDiagnostic`], so it can be used to further modify the diagnostic
/// before it is added to the collection in the context on `Drop`.
pub(crate) fn report_diagnostic_if_enabled<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
) -> Option<DiagnosticGuard<'chk, 'a>> {
if self.is_rule_enabled(T::rule()) {
let rule = T::rule();
if self.is_rule_enabled(rule) {
Some(DiagnosticGuard {
context: self,
diagnostic: Some(OldDiagnostic::new(kind, range, &self.source_file)),
rule,
})
} else {
None
@@ -3220,6 +3228,7 @@ pub(crate) struct DiagnosticGuard<'a, 'b> {
///
/// This is always `Some` until the `Drop` (or `defuse`) call.
diagnostic: Option<OldDiagnostic>,
rule: Rule,
}
impl DiagnosticGuard<'_, '_> {
@@ -3232,6 +3241,50 @@ impl DiagnosticGuard<'_, '_> {
}
}
impl DiagnosticGuard<'_, '_> {
fn resolve_applicability(&self, fix: &Fix) -> Applicability {
self.context
.settings
.fix_safety
.resolve_applicability(self.rule, fix.applicability())
}
/// Set the [`Fix`] used to fix the diagnostic.
#[inline]
pub(crate) fn set_fix(&mut self, fix: Fix) {
if !self.context.rules.should_fix(self.rule) {
self.fix = None;
return;
}
let applicability = self.resolve_applicability(&fix);
self.fix = Some(fix.with_applicability(applicability));
}
/// Set the [`Fix`] used to fix the diagnostic, if the provided function returns `Ok`.
/// Otherwise, log the error.
#[inline]
pub(crate) fn try_set_fix(&mut self, func: impl FnOnce() -> anyhow::Result<Fix>) {
match func() {
Ok(fix) => self.set_fix(fix),
Err(err) => log::debug!("Failed to create fix for {}: {}", self.name(), err),
}
}
/// Set the [`Fix`] used to fix the diagnostic, if the provided function returns `Ok`.
/// Otherwise, log the error.
#[inline]
pub(crate) fn try_set_optional_fix(
&mut self,
func: impl FnOnce() -> anyhow::Result<Option<Fix>>,
) {
match func() {
Ok(None) => {}
Ok(Some(fix)) => self.set_fix(fix),
Err(err) => log::debug!("Failed to create fix for {}: {}", self.name(), err),
}
}
}
impl std::ops::Deref for DiagnosticGuard<'_, '_> {
type Target = OldDiagnostic;

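For orientation, a minimal sketch of a call site for the guard-based API above, assuming a rule type `SomeViolation: Violation` and a precomputed `edit: Edit`; both names are illustrative and not taken from this diff.

    // Hypothetical rule body; `SomeViolation` and `edit` are placeholders.
    if let Some(mut diagnostic) = context.report_diagnostic_if_enabled(SomeViolation, range) {
        // `set_fix` drops the fix when the rule is marked unfixable and applies
        // any `fix_safety` applicability overrides before storing it.
        diagnostic.set_fix(Fix::safe_edit(edit));
    }
    // The finished diagnostic is added to the `LintContext` when the guard drops.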

@@ -22,6 +22,7 @@ use crate::{Edit, Fix, Locator};
use super::ast::LintContext;
/// RUF100
pub(crate) fn check_noqa(
context: &mut LintContext,
path: &Path,
@@ -34,39 +35,34 @@ pub(crate) fn check_noqa(
// Identify any codes that are globally exempted (within the current file).
let file_noqa_directives =
FileNoqaDirectives::extract(locator, comment_ranges, &settings.external, path);
let exemption = FileExemption::from(&file_noqa_directives);
// Extract all `noqa` directives.
let mut noqa_directives =
NoqaDirectives::from_commented_ranges(comment_ranges, &settings.external, path, locator);
if file_noqa_directives.is_empty() && noqa_directives.is_empty() {
return Vec::new();
}
let exemption = FileExemption::from(&file_noqa_directives);
// Indices of diagnostics that were ignored by a `noqa` directive.
let mut ignored_diagnostics = vec![];
// Remove any ignored diagnostics.
'outer: for (index, diagnostic) in context.iter().enumerate() {
// Can't ignore syntax errors.
let Some(code) = diagnostic.noqa_code() else {
let Some(code) = diagnostic.secondary_code() else {
continue;
};
if code == Rule::BlanketNOQA.noqa_code() {
if *code == Rule::BlanketNOQA.noqa_code() {
continue;
}
match &exemption {
FileExemption::All(_) => {
// If the file is exempted, ignore all diagnostics.
ignored_diagnostics.push(index);
continue;
}
FileExemption::Codes(codes) => {
// If the diagnostic is ignored by a global exemption, ignore it.
if codes.contains(&&code) {
ignored_diagnostics.push(index);
continue;
}
}
if exemption.contains_secondary_code(code) {
ignored_diagnostics.push(index);
continue;
}
let noqa_offsets = diagnostic
@@ -81,13 +77,21 @@ pub(crate) fn check_noqa(
{
let suppressed = match &directive_line.directive {
Directive::All(_) => {
directive_line.matches.push(code);
let Ok(rule) = Rule::from_code(code) else {
debug_assert!(false, "Invalid secondary code `{code}`");
continue;
};
directive_line.matches.push(rule);
ignored_diagnostics.push(index);
true
}
Directive::Codes(directive) => {
if directive.includes(code) {
directive_line.matches.push(code);
let Ok(rule) = Rule::from_code(code) else {
debug_assert!(false, "Invalid secondary code `{code}`");
continue;
};
directive_line.matches.push(rule);
ignored_diagnostics.push(index);
true
} else {
@@ -146,11 +150,11 @@ pub(crate) fn check_noqa(
if seen_codes.insert(original_code) {
let is_code_used = if is_file_level {
context
.iter()
.any(|diag| diag.noqa_code().is_some_and(|noqa| noqa == code))
context.iter().any(|diag| {
diag.secondary_code().is_some_and(|noqa| *noqa == code)
})
} else {
matches.iter().any(|match_| *match_ == code)
matches.iter().any(|match_| match_.noqa_code() == code)
} || settings
.external
.iter()

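The per-diagnostic decision above boils down to a small check; a hedged sketch, where the helper name and the simplified signature are assumptions rather than part of this change:

    // Sketch: does a single `noqa` directive suppress a diagnostic with this code?
    fn suppresses(directive: &Directive, code: &SecondaryCode) -> bool {
        match directive {
            // A bare `# noqa` suppresses every diagnostic on its line.
            Directive::All(_) => true,
            // `# noqa: RUF100, E501` suppresses only the listed codes.
            Directive::Codes(codes) => codes.includes(code),
        }
    }

In the real function each match is also recorded, now as a `Rule` obtained via `Rule::from_code`, so that unused `# noqa` codes can still be reported (RUF100).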

@@ -46,6 +46,12 @@ impl PartialEq<&str> for NoqaCode {
}
}
impl PartialEq<NoqaCode> for &str {
fn eq(&self, other: &NoqaCode) -> bool {
other.eq(self)
}
}
impl serde::Serialize for NoqaCode {
fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
where

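The new mirror impl makes string comparisons with `NoqaCode` symmetric. A tiny illustration, where the function name is hypothetical and `RUF100` is simply a real rule code used as an example:

    fn is_ruf100(code: NoqaCode) -> bool {
        // Both operand orders compile now; previously only `code == "RUF100"` did.
        code == "RUF100" && "RUF100" == code
    }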

@@ -3,8 +3,8 @@ use anyhow::{Result, bail};
use libcst_native::{
Arg, Attribute, Call, Comparison, CompoundStatement, Dict, Expression, FormattedString,
FormattedStringContent, FormattedStringExpression, FunctionDef, GeneratorExp, If, Import,
ImportAlias, ImportFrom, ImportNames, IndentedBlock, Lambda, ListComp, Module, Name,
SmallStatement, Statement, Suite, Tuple, With,
ImportAlias, ImportFrom, ImportNames, IndentedBlock, Lambda, ListComp, Module, SmallStatement,
Statement, Suite, Tuple, With,
};
use ruff_python_codegen::Stylist;
@@ -104,14 +104,6 @@ pub(crate) fn match_attribute<'a, 'b>(
}
}
pub(crate) fn match_name<'a, 'b>(expression: &'a Expression<'b>) -> Result<&'a Name<'b>> {
if let Expression::Name(name) = expression {
Ok(name)
} else {
bail!("Expected Expression::Name")
}
}
pub(crate) fn match_arg<'a, 'b>(call: &'a Call<'b>) -> Result<&'a Arg<'b>> {
if let Some(arg) = call.args.first() {
Ok(arg)


@@ -749,15 +749,16 @@ x = 1 \
let diag = {
use crate::rules::pycodestyle::rules::MissingNewlineAtEndOfFile;
let mut iter = edits.into_iter();
OldDiagnostic::new(
let mut diagnostic = OldDiagnostic::new(
MissingNewlineAtEndOfFile, // The choice of rule here is arbitrary.
TextRange::default(),
&SourceFileBuilder::new("<filename>", "<code>").finish(),
)
.with_fix(Fix::safe_edits(
);
diagnostic.fix = Some(Fix::safe_edits(
iter.next().ok_or(anyhow!("expected edits nonempty"))?,
iter,
))
));
diagnostic
};
assert_eq!(apply_fixes([diag].iter(), &locator).code, expect);
Ok(())


@@ -63,7 +63,7 @@ fn apply_fixes<'a>(
let mut source_map = SourceMap::default();
for (code, name, fix) in diagnostics
.filter_map(|msg| msg.noqa_code().map(|code| (code, msg.name(), msg)))
.filter_map(|msg| msg.secondary_code().map(|code| (code, msg.name(), msg)))
.filter_map(|(code, name, diagnostic)| diagnostic.fix().map(|fix| (code, name, fix)))
.sorted_by(|(_, name1, fix1), (_, name2, fix2)| cmp_fix(name1, name2, fix1, fix2))
{
@@ -186,12 +186,13 @@ mod tests {
edit.into_iter()
.map(|edit| {
// The choice of rule here is arbitrary.
let diagnostic = OldDiagnostic::new(
let mut diagnostic = OldDiagnostic::new(
MissingNewlineAtEndOfFile,
edit.range(),
&SourceFileBuilder::new(filename, source).finish(),
);
diagnostic.with_fix(Fix::safe_edit(edit))
diagnostic.fix = Some(Fix::safe_edit(edit));
diagnostic
})
.collect()
}


@@ -1,12 +1,11 @@
use std::borrow::Cow;
use std::collections::hash_map::Entry;
use std::path::Path;
use anyhow::{Result, anyhow};
use colored::Colorize;
use itertools::Itertools;
use ruff_python_parser::semantic_errors::SemanticSyntaxError;
use rustc_hash::FxHashMap;
use rustc_hash::FxBuildHasher;
use ruff_notebook::Notebook;
use ruff_python_ast::{ModModule, PySourceType, PythonVersion};
@@ -23,10 +22,10 @@ use crate::checkers::imports::check_imports;
use crate::checkers::noqa::check_noqa;
use crate::checkers::physical_lines::check_physical_lines;
use crate::checkers::tokens::check_tokens;
use crate::codes::NoqaCode;
use crate::directives::Directives;
use crate::doc_lines::{doc_lines_from_ast, doc_lines_from_tokens};
use crate::fix::{FixResult, fix_file};
use crate::message::SecondaryCode;
use crate::noqa::add_noqa;
use crate::package::PackageRoot;
use crate::preview::is_py314_support_enabled;
@@ -95,25 +94,25 @@ struct FixCount {
/// A mapping from a noqa code to the corresponding lint name and a count of applied fixes.
#[derive(Debug, Default, PartialEq)]
pub struct FixTable(FxHashMap<NoqaCode, FixCount>);
pub struct FixTable(hashbrown::HashMap<SecondaryCode, FixCount, rustc_hash::FxBuildHasher>);
impl FixTable {
pub fn counts(&self) -> impl Iterator<Item = usize> {
self.0.values().map(|fc| fc.count)
}
pub fn entry(&mut self, code: NoqaCode) -> FixTableEntry {
FixTableEntry(self.0.entry(code))
pub fn entry<'a>(&'a mut self, code: &'a SecondaryCode) -> FixTableEntry<'a> {
FixTableEntry(self.0.entry_ref(code))
}
pub fn iter(&self) -> impl Iterator<Item = (NoqaCode, &'static str, usize)> {
pub fn iter(&self) -> impl Iterator<Item = (&SecondaryCode, &'static str, usize)> {
self.0
.iter()
.map(|(code, FixCount { rule_name, count })| (*code, *rule_name, *count))
.map(|(code, FixCount { rule_name, count })| (code, *rule_name, *count))
}
pub fn keys(&self) -> impl Iterator<Item = NoqaCode> {
self.0.keys().copied()
pub fn keys(&self) -> impl Iterator<Item = &SecondaryCode> {
self.0.keys()
}
pub fn is_empty(&self) -> bool {
@@ -121,7 +120,9 @@ impl FixTable {
}
}
pub struct FixTableEntry<'a>(Entry<'a, NoqaCode, FixCount>);
pub struct FixTableEntry<'a>(
hashbrown::hash_map::EntryRef<'a, 'a, SecondaryCode, SecondaryCode, FixCount, FxBuildHasher>,
);
impl<'a> FixTableEntry<'a> {
pub fn or_default(self, rule_name: &'static str) -> &'a mut usize {
@@ -378,32 +379,7 @@ pub fn check_path(
let (mut diagnostics, source_file) = context.into_parts();
if parsed.has_valid_syntax() {
// Remove fixes for any rules marked as unfixable.
for diagnostic in &mut diagnostics {
if diagnostic
.noqa_code()
.and_then(|code| code.rule())
.is_none_or(|rule| !settings.rules.should_fix(rule))
{
diagnostic.fix = None;
}
}
// Update fix applicability to account for overrides
if !settings.fix_safety.is_empty() {
for diagnostic in &mut diagnostics {
if let Some(fix) = diagnostic.fix.take() {
if let Some(rule) = diagnostic.noqa_code().and_then(|code| code.rule()) {
let fixed_applicability = settings
.fix_safety
.resolve_applicability(rule, fix.applicability());
diagnostic.set_fix(fix.with_applicability(fixed_applicability));
}
}
}
}
} else {
if !parsed.has_valid_syntax() {
// Avoid fixing in case the source code contains syntax errors.
for diagnostic in &mut diagnostics {
diagnostic.fix = None;
@@ -703,18 +679,16 @@ pub fn lint_fix<'a>(
}
}
fn collect_rule_codes(rules: impl IntoIterator<Item = NoqaCode>) -> String {
rules
.into_iter()
.map(|rule| rule.to_string())
.sorted_unstable()
.dedup()
.join(", ")
fn collect_rule_codes<T>(rules: impl IntoIterator<Item = T>) -> String
where
T: Ord + PartialEq + std::fmt::Display,
{
rules.into_iter().sorted_unstable().dedup().join(", ")
}
#[expect(clippy::print_stderr)]
fn report_failed_to_converge_error(path: &Path, transformed: &str, diagnostics: &[OldDiagnostic]) {
let codes = collect_rule_codes(diagnostics.iter().filter_map(OldDiagnostic::noqa_code));
let codes = collect_rule_codes(diagnostics.iter().filter_map(OldDiagnostic::secondary_code));
if cfg!(debug_assertions) {
eprintln!(
"{}{} Failed to converge after {} iterations in `{}` with rule codes {}:---\n{}\n---",
@@ -746,11 +720,11 @@ This indicates a bug in Ruff. If you could open an issue at:
}
#[expect(clippy::print_stderr)]
fn report_fix_syntax_error(
fn report_fix_syntax_error<'a>(
path: &Path,
transformed: &str,
error: &ParseError,
rules: impl IntoIterator<Item = NoqaCode>,
rules: impl IntoIterator<Item = &'a SecondaryCode>,
) {
let codes = collect_rule_codes(rules);
if cfg!(debug_assertions) {

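Tying the `FixTable` changes in this file together, a hedged sketch of how the new `entry_ref`-based API might be driven; the `applied_fixes` iterator and its item types are assumptions made for illustration:

    // Hypothetical counting loop over (code, rule_name) pairs for applied fixes,
    // where `code: &SecondaryCode` and `rule_name: &'static str`.
    let mut table = FixTable::default();
    for (code, rule_name) in applied_fixes {
        // `entry_ref` borrows the key for the lookup and only converts it into an
        // owned `SecondaryCode` when a new entry is actually inserted.
        *table.entry(code).or_default(rule_name) += 1;
    }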
Some files were not shown because too many files have changed in this diff.