Compare commits

...

103 Commits

Author SHA1 Message Date
Zanie Blue
40b4aa28f9 Zizmor 2025-07-03 07:23:11 -05:00
Zanie Blue
ea4bf00c23 Revert "Only run the relevant test"
This reverts commit 82dc27f2680b8085280136848ffe2ee1d2952a4e.
2025-07-03 07:01:28 -05:00
Zanie Blue
7f4aa4b3fb Update for Depot 2025-07-03 07:01:28 -05:00
Zanie Blue
34c98361ae Do not set TMP 2025-07-03 07:01:28 -05:00
Zanie Blue
38bb96a6c2 Remove log variables 2025-07-03 07:01:28 -05:00
Zanie Blue
a014d55455 Remove fuzz corpus hack 2025-07-03 07:01:27 -05:00
Zanie Blue
306f6f17a9 Enable more logs 2025-07-03 07:01:27 -05:00
Zanie Blue
b233888f00 Only run the relevant test 2025-07-03 07:01:27 -05:00
Zanie Blue
540cbd9085 Add debug logs? 2025-07-03 07:01:27 -05:00
Zanie Blue
0112f7f0e4 Use a dev drive for testing on Windows 2025-07-03 07:01:27 -05:00
GiGaGon
d0f0577ac7 [flake8-pyi] Make example error out-of-the-box (PYI014, PYI015) (#19097) 2025-07-03 12:54:35 +01:00
Dhruv Manilawala
dc56c33618 [ty] Initial support for workspace diagnostics (#18939)
## Summary

This PR adds initial support for workspace diagnostics in the ty server.

Reference spec:
https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_diagnostic

This is currently implemented via the **pull diagnostics method**, which
was added in the current version (3.17); the server advertises it via
the `diagnosticProvider.workspaceDiagnostics` server capability.
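
For reference, this capability lives in `DiagnosticOptions` in the LSP 3.17 spec. A minimal sketch of the advertised shape (field names come from the spec; the `identifier` value here is made up for illustration):

```python
# Illustrative shape of the diagnostic server capability per LSP 3.17.
# DiagnosticOptions fields: identifier?, interFileDependencies,
# workspaceDiagnostics. The identifier value below is hypothetical.
capabilities = {
    "diagnosticProvider": {
        "identifier": "ty",
        "interFileDependencies": True,
        # Opting in to workspace/diagnostic requests:
        "workspaceDiagnostics": True,
    }
}

print(capabilities["diagnosticProvider"]["workspaceDiagnostics"])  # True
```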

**Note:** This might be a bit confusing, but workspace diagnostics are
not scoped to a single workspace: they cover all the workspaces that the
server handles, i.e., the ones the server received during
initialization. Currently, the ty server doesn't support multiple
workspaces, so this capability is likewise limited to providing
diagnostics for a single workspace (the first one, if the client
provided multiple).

A new `ty.diagnosticMode` server setting is added, which can be either
`workspace` (for workspace diagnostics) or `openFilesOnly` (for checking
only open files; the default). This is the same as the
`python.analysis.diagnosticMode` setting that Pyright / Pylance
utilizes. In the future, we could use the value under the `python.*`
namespace as a fallback, so users don't need to set the value multiple
times.

Part of: astral-sh/ty#81

## Test Plan

This capability was introduced in the current LSP version (~3 years
ago), and the way it's implemented varies a bit between clients. I've
provided notes on what I've noticed and what would need to be done on
our side to further improve the experience.

### VS Code

VS Code sends the `workspace/diagnostic` requests every ~2 seconds:

```
[Trace - 12:12:32 PM] Sending request 'workspace/diagnostic - (403)'.
[Trace - 12:12:32 PM] Received response 'workspace/diagnostic - (403)' in 2ms.
[Trace - 12:12:34 PM] Sending request 'workspace/diagnostic - (404)'.
[Trace - 12:12:34 PM] Received response 'workspace/diagnostic - (404)' in 2ms.
[Trace - 12:12:36 PM] Sending request 'workspace/diagnostic - (405)'.
[Trace - 12:12:36 PM] Received response 'workspace/diagnostic - (405)' in 2ms.
[Trace - 12:12:38 PM] Sending request 'workspace/diagnostic - (406)'.
[Trace - 12:12:38 PM] Received response 'workspace/diagnostic - (406)' in 3ms.
[Trace - 12:12:40 PM] Sending request 'workspace/diagnostic - (407)'.
[Trace - 12:12:40 PM] Received response 'workspace/diagnostic - (407)' in 2ms.
...
```

I couldn't really find any resource that explains this behavior. But it
does mean that we'd need to implement the caching layer via the
previous result ids sooner. This will allow the server to avoid sending
all the diagnostics on every request and instead just send a response
stating that the diagnostics haven't changed. This could possibly be
achieved by using the salsa ID.
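
That caching layer can be sketched roughly like this (helper names are hypothetical; in ty the result id would presumably come from a salsa revision rather than a content hash). Per the spec, an item whose `resultId` matches the one the client sent back can be answered with an `unchanged` report instead of a full diagnostics list:

```python
import hashlib

def compute_result_id(content: str) -> str:
    # Hypothetical stand-in for a real revision id (e.g. derived from salsa).
    return hashlib.sha1(content.encode()).hexdigest()

def check(content: str) -> list[dict]:
    # Hypothetical checker: flags lines containing "???".
    return [
        {"message": "found ???", "line": lineno}
        for lineno, line in enumerate(content.splitlines())
        if "???" in line
    ]

def workspace_diagnostics(previous_result_ids: dict[str, str],
                          documents: dict[str, str]) -> list[dict]:
    items = []
    for uri, content in documents.items():
        result_id = compute_result_id(content)
        if previous_result_ids.get(uri) == result_id:
            # Client already has these diagnostics: answer with an
            # "unchanged" report carrying only the result id.
            items.append({"kind": "unchanged", "uri": uri, "resultId": result_id})
        else:
            items.append({"kind": "full", "uri": uri,
                          "resultId": result_id, "items": check(content)})
    return items

docs = {"file:///a.py": "x = ???"}
first = workspace_diagnostics({}, docs)
print(first[0]["kind"])  # full
seen = {item["uri"]: item["resultId"] for item in first}
print(workspace_diagnostics(seen, docs)[0]["kind"])  # unchanged
```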

If we switch from workspace diagnostics to open-files diagnostics, the
server sends diagnostics only via the `textDocument/diagnostic`
endpoint. Here, when a document containing a diagnostic is closed, the
server sends a publish-diagnostics notification with an empty list of
diagnostics to clear the diagnostics for that document. The issue is
that VS Code doesn't seem to clear the diagnostics in this case, even
though it receives the notification. (I'm going to open an issue on the
VS Code side for this today.)


https://github.com/user-attachments/assets/b0c0833d-386c-49f5-8a15-0ac9133e15ed

### Zed

Zed's implementation works by refreshing the workspace diagnostics
whenever the content of a document changes. This seems like a very
reasonable behavior, and I was a bit surprised that VS Code doesn't use
this heuristic.


https://github.com/user-attachments/assets/71c7b546-7970-434a-9ba0-4fa620647f6c

### Neovim

Neovim only recently added support for workspace diagnostics
(https://github.com/neovim/neovim/pull/34262, merged ~3 weeks ago) so
it's only available on nightly versions.

The initial support is limited and requires fetching the workspace
diagnostics manually, as demonstrated in the video. It doesn't support
refreshing the workspace diagnostics either, so that would need to be
done manually as well. I'm assuming these are just temporary
limitations that will be addressed before the stable release.


https://github.com/user-attachments/assets/25b4a0e5-9833-4877-88ad-279904fffaf9
2025-07-03 11:04:54 +00:00
Dhruv Manilawala
a95c18a8e1 [ty] Add background request task support (#19041)
## Summary

This PR adds a new trait to support running a request in the background.

Currently, there exists a `BackgroundDocumentRequestHandler` trait,
which is similar but is scoped to a specific document (a file, in an
editor context). The new trait, `BackgroundRequestHandler`, is tied to
neither a specific document nor a specific project; it operates on the
entire workspace.

This is added to support running workspace-wide requests, like computing
the [workspace
diagnostics](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_diagnostic)
or [workspace
symbols](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_symbol).

**Note:** There's a slight difference in what a "workspace" means
between the server and ty. Currently, there's a 1-1 relationship between
a workspace in an editor and the project database corresponding to that
workspace in ty, but this could change in the future when Micha adds
support for multiple workspaces or multi-root workspaces.

The data required by the request handler (based on implementing
workspace diagnostics) is the list of databases (`ProjectDatabase`)
corresponding to the projects in the workspace, and the index (`Index`)
that contains the open documents. The `WorkspaceSnapshot` represents
this and is passed to the handler, similar to `DocumentSnapshot`.
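
In spirit, the snapshot just bundles those two pieces of state together. A Python-flavored sketch (the real types are Rust, and these field names are illustrative, not the actual ones):

```python
from dataclasses import dataclass

# Sketch of the WorkspaceSnapshot described above. The real
# ProjectDatabase and Index types are Rust; strings/dicts stand in here.
@dataclass(frozen=True)
class WorkspaceSnapshot:
    project_databases: list  # one database per project in the workspace
    index: dict              # open documents, keyed by URI

snapshot = WorkspaceSnapshot(
    project_databases=["db0"],
    index={"file:///a.py": "x = 1"},
)
print(len(snapshot.project_databases))  # 1
```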

## Test Plan

This is used in implementing the workspace diagnostics which is where
this is tested.
2025-07-03 11:01:10 +00:00
David Peter
e212dc2e8e [ty] Restructure/move dataclass tests (#19117)
Before adding even more dataclass-related files, let's organize them
in a separate folder.
2025-07-03 10:36:14 +00:00
Aria Desires
c4f2eec865 [ty] Remove last vestiges of std::path from ty_server (#19088)
Fixes https://github.com/astral-sh/ty/issues/603
2025-07-03 15:18:30 +05:30
Zanie Blue
9fc04d6bf0 Use "python" for markdown code fences in on-hover content (#19082)
Instead of "text".

Closes https://github.com/astral-sh/ty/issues/749

We may not want this because the type display implementations are not
guaranteed to be valid Python; however, unless clients highlight
invalid syntax, this seems like a better interim value than "text".
I'm not the expert though. See
https://github.com/astral-sh/ty/issues/749#issuecomment-3026201114 for
prior commentary.

edit: Going back further to
https://github.com/astral-sh/ruff/pull/17057#discussion_r2028151621 for
prior context, it turns out they _do_ highlight invalid syntax in red,
which is quite unfortunate and probably a blocker here.
2025-07-03 10:50:34 +05:30
Matthew Mckee
352b896c89 [ty] Add subtyping between SubclassOf and CallableType (#19026)
## Summary

Part of https://github.com/astral-sh/ty/issues/129

There were previously some false positives here.

## Test Plan

Updated `is_subtype_of.md` and `is_assignable_to.md`
2025-07-02 19:22:31 -07:00
GiGaGon
321575e48f [flake8-pyi] Make example error out-of-the-box (PYI042) (#19101)

## Summary


Part of #18972

This PR makes [snake-case-type-alias
(PYI042)](https://docs.astral.sh/ruff/rules/snake-case-type-alias/#snake-case-type-alias-pyi042)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/8fafec81-2228-4ffe-81e8-1989b724cb47)
```py
type_alias_name: TypeAlias = int
```

[New example](https://play.ruff.rs/b396746c-e6d2-423c-bc13-01a533bb0747)
```py
from typing import TypeAlias

type_alias_name: TypeAlias = int
```

Imports were also added to the "use instead" section.

## Test Plan


N/A, no functionality/tests affected
2025-07-02 22:31:15 +01:00
GiGaGon
066018859f [pyflakes] Fix backslash in docs (F621) (#19098)

## Summary


This fixes the docs for [expressions-in-star-assignment
(F621)](https://docs.astral.sh/ruff/rules/expressions-in-star-assignment/#expressions-in-star-assignment-f621)
having a backslash `\` before the left shifts `<<`. I'm not sure why
this happened in the first place, as the docstring looks fine, but
putting the `<<` inside a code block fixes it. I was not able to track
down the source of the issue either. The only other rule with a `<<` is
[missing-whitespace-around-bitwise-or-shift-operator
(E227)](https://docs.astral.sh/ruff/rules/missing-whitespace-around-bitwise-or-shift-operator/#missing-whitespace-around-bitwise-or-shift-operator-e227),
which already has it in a code block.

Old docs page:

![image](https://github.com/user-attachments/assets/993106c6-5d83-4aed-836b-e252f5b64916)
> In Python 3, no more than 1 \\<< 8 assignments are allowed before a
starred expression, and no more than 1 \\<< 24 expressions are allowed
after a starred expression.

New docs page:

![image](https://github.com/user-attachments/assets/3b40b35f-f39e-49f1-8b2e-262dda4085b4)
> In Python 3, no more than `1 << 8` assignments are allowed before a
starred expression, and no more than `1 << 24` expressions are allowed
after a starred expression.
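
Those limits come from CPython's compiler, which encodes the pre- and post-star counts in the `UNPACK_EX` instruction. A quick way to see the `1 << 8` limit in action on current CPython versions (counts chosen to just cross it):

```python
# CPython rejects star-unpacking assignments with 1 << 8 (256) or more
# targets before the starred expression.
targets = ", ".join(f"x{i}" for i in range(1 << 8))
src = f"{targets}, *rest = range(300)"
try:
    compile(src, "<demo>", "exec")
except SyntaxError as exc:
    print(exc.msg)  # too many expressions in star-unpacking assignment
```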

## Test Plan


N/A, no tests/functionality affected.
2025-07-02 15:00:33 -04:00
David Peter
f76d3f87cf [ty] Allow declared-only class-level attributes to be accessed on the class (#19071)
## Summary

Allow declared-only class-level attributes to be accessed on the class:
```py
class C:
    attr: int

C.attr  # this is now allowed
``` 

closes https://github.com/astral-sh/ty/issues/384
closes https://github.com/astral-sh/ty/issues/553

## Ecosystem analysis


* We see many removed `unresolved-attribute` false-positives for code
that makes use of sqlalchemy, as expected (see changes for `prefect`)
* We see many removed `call-non-callable` false-positives for uses of
`pytest.skip` and similar, as expected
* Most new diagnostics seem to be related to cases like the following,
where we previously inferred `int` for `Derived().x`, but now we infer
`int | None`. I think this should be a
conflicting-declarations/bad-override error anyway? The new behavior may
even be preferred here?
  ```py
  class Base:
      x: int | None
  
  
  class Derived(Base):
      def __init__(self):
          self.x: int = 1
  ```
2025-07-02 18:03:56 +02:00
Micha Reiser
5f426b9f8b [ty] Remove ScopedExpressionId (#19019)
## Summary

The motivation of `ScopedExpressionId` was that we have an expression
identifier that's local to a scope and, therefore, unlikely to change if
a user makes changes in another scope. A local identifier like this has
the advantage that query results may remain unchanged even if other
parts of the file change, which in turn allows Salsa to short-circuit
dependent queries.

However, I noticed that we aren't using `ScopedExpressionId` in a place
where it's important that the identifier is local. Its main use is
inside `infer`, which we always run for the entire file. The one
exception to this is `Unpack`, but unpack runs as part of `infer`.

Edit: The above isn't entirely correct. We used `ScopedExpressionId` in
`TypeInference`, which is a query result. Using `ExpressionNodeKey`
instead does mean that a change to the AST invalidates most, if not all,
`TypeInference` results of a single file. Salsa then has to run all
dependent queries to see if they're affected by this change, even if the
change was local to another scope.

If this locality proves to be important, I suggest that we create two
queries on top of `TypeInference`: one that returns the expression map
(which is mainly used in the linter and type inference) and a second
that returns all remaining fields. This should give us a similar
optimization at a much lower cost.

I also considered removing `ScopedUseId`, but I believe that one is
still useful, because using `ExpressionNodeKey` for it instead would
mean that all `UseDefMap`s change when a single AST node changes.
Whether this is important is difficult to assess; I'm simply not
familiar enough with the `UseDefMap`. If the locality doesn't matter for
the `UseDefMap`, then a similar change could be made, and
`bindings_by_use` could be changed to an `FxHashMap<UseId, Bindings>`
where `UseId` is a thin wrapper around `NodeKey`.

Closes https://github.com/astral-sh/ty/issues/721
2025-07-02 17:57:32 +02:00
GiGaGon
37ba185c04 [flake8-pyi] Make example error out-of-the-box (PYI059) (#19080)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-02 16:49:54 +01:00
David Peter
93413d3631 [ty] Update docs links (#19092)
Point everything to the new documentation at https://docs.astral.sh/ty/
2025-07-02 17:34:56 +02:00
Zanie Blue
efd9b75352 Avoid reformatting comments in rules reference documentation (#19093)
closes https://github.com/astral-sh/ty/issues/754
2025-07-02 17:16:44 +02:00
David Peter
4cf56d7ad4 [ty] Fix lint summary wording (#19091) 2025-07-02 16:32:11 +02:00
David Peter
4e4e428a95 [ty] Fix link in generate_ty_rules (#19090) 2025-07-02 14:21:32 +00:00
Zanie Blue
522fd4462e Fix header levels in generated settings reference (#19089)
The headers were one level too deep for child items, and the top-level
`rules` header was way off.
2025-07-02 16:01:23 +02:00
David Peter
e599c9d0d3 [ty] Adapt generate_ty_rules for MkDocs (#19087)
## Summary

Adapts the Markdown for the rules-reference documentation page for
MkDocs.
2025-07-02 16:01:10 +02:00
Micha Reiser
e9b5ea71b3 Update Salsa (#19020)
## Summary

This PR updates Salsa to pull in Ibraheem's multithreading improvements (https://github.com/salsa-rs/salsa/pull/921).

## Performance

A small regression for single-threaded benchmarks is expected because
papaya is slightly slower than a `Mutex<FxHashMap>` in the uncontested
case (~10%). However, this shouldn't matter as much in practice because:

1. Salsa has a fast path when only one DB instance is used, which is the
common case in production. This fast path is not impacted by the
changes, but our benchmarks measure the slow paths (because we use
multiple db instances).
2. Fixing the 10x slowdown for the congested (multi-threaded) case
outweighs the downsides of a ~10% perf regression for single-threaded
use cases, especially considering that ty is heavily multi-threaded.

## Test Plan

`cargo test`
2025-07-02 09:55:37 -04:00
Ibraheem Ahmed
ebc70a4002 [ty] Support LSP go-to with vendored typeshed stubs (#19057)
## Summary

Extracts the vendored typeshed stubs lazily and caches them on the local
filesystem to support go-to in the LSP.

Resolves https://github.com/astral-sh/ty/issues/77.
2025-07-02 07:58:58 -04:00
Micha Reiser
f7fc8fb084 [ty] Request configuration from client (#18984)
## Summary

This PR makes the necessary changes to the server so that it can request
configurations from the client using the `configuration` request.
This PR doesn't make use of the request yet. It only sets up the
foundation (mainly the coordination between client and server)
so that future PRs can pull specific settings.

I plan to use this for pulling the Python environment from the Python
extension.

Deno does something very similar to this.

## Test Plan

Tested that diagnostics are still shown.
2025-07-02 14:31:41 +05:30
GiGaGon
cdf91b8b74 [flake8-pyi] Make example error out-of-the-box (PYI062) (#19079)

## Summary


Part of #18972

This PR makes [duplicate-literal-member
(PYI062)](https://docs.astral.sh/ruff/rules/duplicate-literal-member/#duplicate-literal-member-pyi062)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/6b00b41c-c1c5-4421-873d-fc2a143e7337)
```py
foo: Literal["a", "b", "a"]
```

[New example](https://play.ruff.rs/1aea839b-9ae8-4848-bb83-2637e1a68ce4)
```py
from typing import Literal

foo: Literal["a", "b", "a"]
```

Imports were also added to the "use instead" section.

## Test Plan


N/A, no functionality/tests affected
2025-07-02 08:21:39 +01:00
Dhruv Manilawala
d1e705738e [ty] Log target names at trace level (#19084)
Follow-up to https://github.com/astral-sh/ruff/pull/19083: also log the
target names, like `ty_python_semantic::module_resolver::resolver` in
`2025-07-02 10:12:20.188697000 DEBUG
ty_python_semantic::module_resolver::resolver: Adding first-party search
path '/Users/dhruv/playground/ty_server'`, at trace level.
2025-07-02 04:49:36 +00:00
Dhruv Manilawala
c3d9b21db5 [ty] Use better datetime format for server logs (#19083)
This PR improves the timestamp format for ty server logs to be the same as Ruff's.

Ref: https://github.com/astral-sh/ruff/pull/16389
2025-07-02 04:39:12 +00:00
Alex Waygood
316c1b21e2 [ty] Add some missing calls to normalized_impl (#19074)
## Summary

I hoped this might fix the latest stack overflows on
https://github.com/astral-sh/ruff/pull/18659... it doesn't look like it
does, but these changes seem like they're probably correct anyway...?

## Test Plan

2025-07-01 17:57:52 +01:00
Brent Westbrook
47733c0647 Remove new codspeed dependencies (#19073)
Summary
--

Updates to codspeed 3.0.2, removing some of the new dependencies
introduced in 3.0. See https://github.com/astral-sh/uv/pull/14396 and
https://github.com/CodSpeedHQ/codspeed-rust/pull/108 for the similar PR
in uv and the upstream fix, respectively.

Test Plan
--

N/a
2025-07-01 12:20:31 -04:00
NamelessGO
48366a7bbb Docs: Add Anki to Who's Using Ruff (#19072)
https://github.com/ankitects/anki/pull/4119


## Summary


Add Anki to Who's Using Ruff (README)

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-01 15:44:04 +00:00
GiGaGon
cc736c3a51 [refurb] Fix false positive on empty tuples (FURB168) (#19058)

## Summary


This PR fixes #19047 / the [isinstance-type-none
(FURB168)](https://docs.astral.sh/ruff/rules/isinstance-type-none/#isinstance-type-none-furb168)
tuple false positive by adding a check for whether the tuple is empty.
I also noticed there was another false positive with the other
tuple check in the same function, so I fixed it the same way.
`Union[()]` is invalid at runtime with `TypeError: Cannot take a Union
of no types.`, but it is accepted by `basedpyright`
[playground](https://basedpyright.com/?pythonVersion=3.8&typeCheckingMode=all&code=GYJw9gtgBALgngBwJYDsDmUkQWEMoCqKSYKAsAFAgCmAbtQIYA2A%2BvAtQBREkoDanAJQBdQUA)
and is equivalent to `Never`, so I fixed it anyway. On a side tangent:
it looks like MyPy doesn't accept it, and ty
[playground](https://play.ty.dev/c2c468b6-38e4-4dd9-a9fa-0276e843e395)
gives `@Todo`.
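
For context, the empty-tuple case has to be excluded because an empty tuple of classes never matches anything, so it is not equivalent to an `is None` check:

```python
x = None

# FURB168 targets checks like this one, which is equivalent to `x is None`:
print(isinstance(x, type(None)))  # True

# An empty tuple of classes matches nothing; rewriting this to `x is None`
# would change the result, so the rule must not fire here:
print(isinstance(x, ()))  # False
```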

## Test Plan


Added two test cases for the two false positives.
[playground](https://play.ruff.rs/a53afc21-9a1d-4b9b-9346-abfbeabeb449)
2025-07-01 10:26:41 -04:00
GiGaGon
8cc14ad02d [flake8-datetimez] Make DTZ901 example error out-of-the-box (#19056)

## Summary


Part of #18972

This PR makes [datetime-min-max
(DTZ901)](https://docs.astral.sh/ruff/rules/datetime-min-max/#datetime-min-max-dtz901)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/c1202727-1a18-4d3f-92a4-334ede07ed3e)
```py
datetime.max
```

[New example](https://play.ruff.rs/af2c76aa-9beb-46bc-8e27-faf53ecdbe8c)
```py
import datetime

datetime.datetime.max
```

I also added imports to the problem demonstration and use instead.

## Test Plan


N/A, no functionality/tests affected
2025-07-01 09:57:34 -04:00
Илья Любавский
667dc62038 [ruff] Fix syntax error introduced for an empty string followed by a u-prefixed string (UP025) (#18899)
## Summary
Closes #18895
## Test Plan

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-01 09:34:08 -04:00
David Peter
dac4e356eb [ty] Use all reachable bindings for instance attributes and deferred lookups (#18955)
## Summary

Remove a hack in control flow modeling that was treating `return`
statements at the end of function bodies in a special way (basically
considering the state *just before* the `return` statement as the
end-of-scope state). This is not needed anymore now that #18750 has been
merged.

In order to make this work, we now use *all reachable bindings* for
purposes of finding implicit instance attribute assignments as well as
for deferred lookups of symbols. Both would otherwise be affected by
this change:
```py
class C:
    def f(self):
        self.x = 1  # a reachable binding that is not visible at the end of the scope
        return
```

```py
def f():
    class X: ...  # a reachable binding that is not visible at the end of the scope
    x: "X" = X()  # deferred use of `X`
    return
```

Implicit instance attributes also required another change. We previously
kept track of possibly-unbound instance attributes in some cases, but we
now give up on that completely and always consider *implicit* instance
attributes to be bound if we see a reachable binding in a reachable
method. The previous behavior was somewhat inconsistent anyway because
we also do not consider attributes possibly-unbound in other scenarios:
we do not (and cannot) keep track of whether the methods that define
these attributes are ever called.

closes https://github.com/astral-sh/ty/issues/711

## Ecosystem analysis

I think this looks very positive!

* We see an unsurprising drop in `possibly-unbound-attribute`
diagnostics (599), mostly for classes that define attributes in `try …
except` blocks, `for` loops, or `if … else: raise …` constructs. There
might obviously also be true positives that got removed, but the vast
majority should be false positives.
* There is also a drop in `possibly-unresolved-reference` /
`unresolved-reference` diagnostics (279+13) from the change to deferred
lookups.
* Some `invalid-type-form` false positives got resolved (13), because we
can now properly look up the names in the annotations.
* There are some new *true* positives in `attrs`, since we understand
the `Attribute` annotation that was previously inferred as `Unknown`
because of a re-assignment after the class definition.


## Test Plan

The existing attributes.md test suite has sufficient coverage here.
2025-07-01 14:38:36 +02:00
Alex Waygood
ebf59e2bef [ty] Rework disjointness of protocol instances vs types with possibly unbound attributes (#19043) 2025-07-01 12:47:27 +01:00
Alex Waygood
c6fd11fe36 [ty] Eagerly evaluate more constraints based on the raw AST (#19068) 2025-07-01 10:17:22 +00:00
David Peter
7d468ee58a [ty] Model reachability of star import definitions for nonlocal lookups (#19066)
## Summary

Temporarily modify `UseDefMapBuilder::reachability` for star imports in
order for new definitions to pick up the right reachability. This was
already working for `UseDefMapBuilder::place_states`, but not for
`UseDefMapBuilder::reachable_definitions`.

closes https://github.com/astral-sh/ty/issues/728

## Test Plan

Regression test
2025-07-01 11:06:37 +02:00
David Peter
4016521bf6 [ty] Eagerly evaluate TYPE_CHECKING constraints (#19044)
## Summary

Evaluate `TYPE_CHECKING` to `ALWAYS_TRUE` and `not TYPE_CHECKING` to
`ALWAYS_FALSE` during semantic index building. This is a follow-up to
https://github.com/astral-sh/ruff/pull/18998 and is in principle just a
performance optimization. We see some (favorable) ecosystem changes
because we can eliminate definitely-unreachable branches early now and
retain narrowing constraints without solving
https://github.com/astral-sh/ty/issues/690 first.
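
For reference, the guard this evaluates eagerly is the standard `TYPE_CHECKING` idiom (example mine, not from the PR):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Treated as ALWAYS_TRUE during semantic index building: the import is
    # visible for annotations but never executed at runtime.
    from collections import OrderedDict

def size(d: "OrderedDict[str, int]") -> int:
    return len(d)

print(size({"a": 1, "b": 2}))  # 2
```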
2025-07-01 11:05:52 +02:00
GiGaGon
b8653a9d3a [flake8-pyi] Make PYI032 example error out-of-the-box (#19061) 2025-07-01 07:50:58 +01:00
github-actions[bot]
966adca6f6 [ty] Sync vendored typeshed stubs (#19060)
Close and reopen this PR to trigger CI

Co-authored-by: typeshedbot <>
2025-07-01 07:45:06 +01:00
renovate[bot]
77941af1c6 Update Rust crate get-size2 to v0.5.1 (#19035)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [get-size2](https://redirect.github.com/bircni/get-size2) |
workspace.dependencies | patch | `0.5.0` -> `0.5.1` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>bircni/get-size2 (get-size2)</summary>

###
[`v0.5.1`](https://redirect.github.com/bircni/get-size2/blob/HEAD/CHANGELOG.md#051---2025-06-25)

[Compare
Source](https://redirect.github.com/bircni/get-size2/compare/0.5.0...0.5.1)

##### Bug Fixes

- correctly determine size for enums
([#&#8203;24](https://redirect.github.com/bircni/get-size2/issues/24)) -
([3c5bd18](3c5bd18cac))
- Nicolas

##### Miscellaneous Chores

- add top-level `heap_size` function
([#&#8203;25](https://redirect.github.com/bircni/get-size2/issues/25)) -
([f3b5e6e](f3b5e6e38c))
- Ibraheem Ahmed

##### Build

- update to newer cargo-verset to set dependency version automatically -
([b1154e4](b1154e4572))
- Nicolas

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 23:17:36 -04:00
Dylan
4bc170a5c1 Make dependency get-size2 truly optional in ruff_python_ast (#19052)
Gates all uses of `get-size2` behind the feature `get-size` in the crate
`ruff_python_ast`. Also requires that `ruff_text_size` is pulled in with
the feature `get-size` enabled if we enable the same-named feature for
`ruff_python_ast`.
2025-06-30 21:50:59 -05:00
Robsdedude
28ab61d885 [pyupgrade] Avoid PEP-604 unions with typing.NamedTuple (UP007, UP045) (#18682)

## Summary
Make `UP045` ignore `Optional[NamedTuple]` as `NamedTuple` is a function
(not a proper type). Rewriting it to `NamedTuple | None` breaks at
runtime. While type checkers currently accept `NamedTuple` as a type,
they arguably shouldn't. Therefore, we outright ignore it and don't
touch or lint on it.

For a more detailed discussion, see the linked issue.
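
The runtime breakage is easy to demonstrate; on recent CPython (where `typing.NamedTuple` is a function), the subscripted spelling is tolerated but the PEP 604 spelling raises:

```python
from typing import NamedTuple, Optional

# Accepted at runtime (typing allows callables as type arguments),
# even though NamedTuple is arguably not a proper type:
ok = Optional[NamedTuple]

# The PEP 604 rewrite fails, since NamedTuple doesn't support `|`:
try:
    NamedTuple | None
except TypeError as exc:
    print(f"TypeError: {exc}")
```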

## Test Plan
Added examples to the existing tests.

## Related Issues
Fixes: https://github.com/astral-sh/ruff/issues/18619
2025-06-30 17:22:23 -04:00
GiGaGon
4963835d0d [flake8-bandit] Make S604 and S609 examples error out-of-the-box (#19049)

## Summary


Part of #18972

Both in one PR since they are in the same file.

S604
---

This PR makes [call-with-shell-equals-true
(S604)](https://docs.astral.sh/ruff/rules/call-with-shell-equals-true/#call-with-shell-equals-true-s604)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/a054fb79-7653-47f7-9ab5-3d8b7540c810)
```py
import subprocess

user_input = input("Enter a command: ")
subprocess.run(user_input, shell=True)
```

[New example](https://play.ruff.rs/6fea81b4-e745-4b85-8bea-faaabea5c86d)
```py
import my_custom_subprocess

user_input = input("Enter a command: ")
my_custom_subprocess.run(user_input, shell=True)
```

The old example doesn't raise `S604` because it is superseded by
[subprocess-popen-with-shell-equals-true
(S602)](https://docs.astral.sh/ruff/rules/subprocess-popen-with-shell-equals-true/#subprocess-popen-with-shell-equals-true-s602),
which is deliberate: it keeps two lints from reporting the same problem.

S609
---

This PR makes [unix-command-wildcard-injection
(S609)](https://docs.astral.sh/ruff/rules/unix-command-wildcard-injection/#unix-command-wildcard-injection-s609)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/849860fa-0d12-4916-bdbc-64a0fa14cd9b)
```py
import subprocess

subprocess.Popen(["chmod", "777", "*.py"])
```

[New example](https://play.ruff.rs/77a54d7c-cf78-4158-bcf8-96dd698cf366)
```py
import subprocess

subprocess.Popen(["chmod", "777", "*.py"], shell=True)
```

I'm not familiar enough with `subprocess` to know why `shell=True` is
required to make `S609` raise here, but it works.
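
For context on the rule itself (separate from the `shell=True` detection detail): the wildcard risk only exists when a shell interprets the argument, since without one the glob reaches the program literally. A minimal illustration, assuming a POSIX shell and an `echo` binary on `PATH`:

```python
import subprocess

# Without a shell, "*.txt" reaches the program as a literal string:
literal = subprocess.run(["echo", "*.txt"], capture_output=True, text=True)
print(literal.stdout)

# With shell=True, the shell expands the glob against the current
# directory first -- the injection vector S609 warns about:
expanded = subprocess.run("echo *.txt", shell=True, capture_output=True, text=True)
print(expanded.stdout)
```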

## Test Plan

<!-- How was it tested? -->

N/A, no functionality/tests affected
2025-06-30 16:10:14 -05:00
GiGaGon
09fa80f94c [flake8-datetimez] Make DTZ011 example error out-of-the-box (#19055)

## Summary

Part of #18972

This PR makes [call-date-today
(DTZ011)](https://docs.astral.sh/ruff/rules/call-date-today/#call-date-today-dtz011)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/b42d6aef-7777-4b3b-9f96-19132000b765)
```py
import datetime

datetime.datetime.today()
```

[New example](https://play.ruff.rs/8577c3c1-cfa8-425b-b1e1-4c53b2a48375)
```py
import datetime

datetime.date.today()
```
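
The rule's underlying concern is the implicit reliance on the system's local timezone; a timezone-explicit alternative (a sketch, not necessarily the exact wording the rule docs recommend):

```python
import datetime

# date.today() silently uses the machine's local timezone.
# Making the timezone explicit avoids the ambiguity DTZ011 flags:
today_utc = datetime.datetime.now(tz=datetime.timezone.utc).date()
print(today_utc)
```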

## Test Plan

N/A, no functionality/tests affected
2025-06-30 15:54:04 -05:00
GiGaGon
fde82fc563 [flake8-bugbear] Make B028 example error out-of-the-box (#19054)

## Summary

Part of #18972

This PR makes [no-explicit-stacklevel
(B028)](https://docs.astral.sh/ruff/rules/no-explicit-stacklevel/#no-explicit-stacklevel-b028)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/1ee80aec-2d6e-4a3f-8e98-da82b6a9f544)
```py
warnings.warn("This is a warning")
```

[New example](https://play.ruff.rs/343593aa-38a0-4d76-a32b-5abd0a4306cc)
```py
import warnings

warnings.warn("This is a warning")
```

Imports were also added to the "use instead" section
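
For reference, the change the rule asks for is an explicit `stacklevel`; a minimal sketch (function and message are illustrative):

```python
import warnings


def old_api():
    # stacklevel=2 attributes the warning to the caller of old_api(),
    # which is the explicit choice B028 wants spelled out.
    warnings.warn("use new_api() instead", DeprecationWarning, stacklevel=2)


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    old_api()
print(caught[0].message)
```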

## Test Plan

N/A, no functionality/tests affected
2025-06-30 15:49:40 -05:00
GiGaGon
96decb17a9 [flake8-bugbear] Make B911 example error out-of-the-box (#19051)

## Summary

Part of #18972

This PR makes [batched-without-explicit-strict
(B911)](https://docs.astral.sh/ruff/rules/batched-without-explicit-strict/#batched-without-explicit-strict-b911)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/a897d96b-0749-4291-8a62-dfd4caf290a0)
```py
itertools.batched(iterable, n)
```

[New example](https://play.ruff.rs/1c1e0ab7-014c-4dc2-abed-c2cb6cd01f70)
```py
import itertools

itertools.batched(iterable, n)
```

Imports were also added to the "use instead" sections
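
Note that `itertools.batched` itself only exists on Python 3.12+, and the `strict` keyword the rule wants made explicit only on 3.13+; a version-guarded sketch:

```python
import itertools
import sys

data = "ABCDEFGH"
if sys.version_info >= (3, 12):
    # On 3.13+ you would write itertools.batched(data, 3, strict=False)
    # to satisfy B911; strict=True raises if the final batch is short.
    batches = list(itertools.batched(data, 3))
else:
    # Rough equivalent for older interpreters.
    batches = [tuple(data[i : i + 3]) for i in range(0, len(data), 3)]
print(batches)
```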

## Test Plan

N/A, no functionality/tests affected
2025-06-30 15:48:02 -05:00
Carl Meyer
2ae0bd9464 [ty] Normalize recursive types using Any (#19003)
## Summary

This just replaces one temporary solution to recursive protocols (the
`SelfReference` mechanism) with another one (track seen types when
recursively descending in `normalize` and replace recursive references
with `Any`). But this temporary solution can handle mutually-recursive
types, not just self-referential ones, and it's sufficient for the
primer ecosystem and some other projects we are testing on to no longer
stack overflow.

The follow-up here will be to properly handle these self-references
instead of replacing them with `Any`.

We will also eventually need cycle detection on more recursive-descent
type transformations and tests.

## Test Plan

Existing tests (including recursive-protocol tests) and primer.

Added mdtest for mutually-recursive protocols that stack-overflowed
before this PR.
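
A sketch of the shape of types involved (illustrative names, not the PR's actual mdtest): two protocols referring to each other force `normalize` to recurse through both, producing the cycle described above.

```python
from typing import Protocol


# Mutually-recursive protocols: normalizing A requires normalizing B,
# which requires A again -- the cycle this PR breaks by substituting Any.
class A(Protocol):
    def to_b(self) -> "B": ...


class B(Protocol):
    def to_a(self) -> "A": ...


print(A.__name__, B.__name__)
```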
2025-06-30 12:07:57 -07:00
Robsdedude
34052a1185 [flake8-comprehensions] Fix C420 to prepend whitespace when needed (#18616)

## Summary
This PR fixes rule C420's fix. The fix replaces `{...}` with
`dict.fromkeys(...)`. Therefore, if an identifier or keyword sits right
before the replaced range, the fix fuses that token with `dict`.

The example in the issue is
```python
0 or{x: None for x in "x"}
# gets "fixed" to
0 ordict.fromkeys(iterable)
```

## Related Issues

Fixes: https://github.com/astral-sh/ruff/issues/18599
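
With the whitespace prepended, the example above should instead be fixed to valid code along these lines (a sketch):

```python
# The corrected fix keeps the keyword and the call separated:
result = 0 or dict.fromkeys("x")
print(result)
```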
2025-06-30 12:38:26 -04:00
Dan Parizher
9f0d3cca89 [pydocstyle] Fix D413 infinite loop for parenthesized docstring (#18930)

## Summary

Fixes #18908
2025-06-30 10:49:13 -04:00
Robsdedude
eb9d9c3646 [perflint] Fix PERF403 panic on attribute or subscription loop variable (#19042)
## Summary

Fixes: https://github.com/astral-sh/ruff/issues/19005

## Test Plan

The reproducer from the issue report was added, plus some extra cases
that would also have triggered the panic.
2025-06-30 10:47:49 -04:00
Adrien Cacciaguerra
4fbf7e9de8 Bump CodSpeed to v3 (#19046)
## Summary

As discussed on Slack, there was an issue with the walltime metrics with
`divan`. The issue was fixed with the latest version of `cargo-codspeed`
and `codspeed-divan-compat`:
https://github.com/CodSpeedHQ/codspeed-rust/releases/tag/v3.0.0.

This PR updates all crates related to CodSpeed. A performance increase
of the following benchmarks is expected, as now the correct metric will
be used.

```
crates/ruff_benchmark/benches/ty_walltime.rs::multithreaded[pydantic]
crates/ruff_benchmark/benches/ty_walltime.rs::small[altair]
crates/ruff_benchmark/benches/ty_walltime.rs::small[freqtrade]
crates/ruff_benchmark/benches/ty_walltime.rs::small[pydantic]
crates/ruff_benchmark/benches/ty_walltime.rs::small[tanjun]
```

Once this is merged, we will update the historic data of the affected
benchmark on the CodSpeed UI, so that no false positives will appear.
2025-06-30 10:11:23 -04:00
GiGaGon
b23b4071eb [flake8-async] Make ASYNC220, ASYNC221, and ASYNC222 examples error out-of-the-box (#18978)

## Summary

Part of #18972

All three in one PR since they are in the same file.

This PR makes [create-subprocess-in-async-function
(ASYNC220)](https://docs.astral.sh/ruff/rules/create-subprocess-in-async-function/#create-subprocess-in-async-function-async220)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/465036af-d75f-4bda-ba24-e50e8618bf16)
```py
async def foo():
    os.popen(cmd)
```

[New example](https://play.ruff.rs/8cf43d50-f9e1-45d6-b711-968c7135f2e0)
```py
import os


async def foo():
    os.popen(cmd)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

This PR makes [run-process-in-async-function
(ASYNC221)](https://docs.astral.sh/ruff/rules/run-process-in-async-function/#run-process-in-async-function-async221)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/0698aaa1-c722-4f04-b56c-61edec06945c)
```py
async def foo():
    subprocess.run(cmd)
```

[New example](https://play.ruff.rs/e05bfcbc-e681-4a28-8f50-2c0c2537d038)
```py
import subprocess


async def foo():
    subprocess.run(cmd)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

This PR makes [wait-for-process-in-async-function
(ASYNC222)](https://docs.astral.sh/ruff/rules/wait-for-process-in-async-function/#wait-for-process-in-async-function-async222)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/4305d477-8995-462d-83ae-435731d71e67)
```py
async def foo():
    os.waitpid(0)
```

[New example](https://play.ruff.rs/ad10c042-3b18-49ca-8f5c-5ab720516da1)
```py
import os


async def foo():
    os.waitpid(0)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.
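
For context, the non-blocking replacement these rules steer toward in plain-asyncio code is asyncio's own subprocess API; a sketch (the command is illustrative and assumes a POSIX shell):

```python
import asyncio


async def run_cmd(cmd: str) -> bytes:
    # Non-blocking alternative to os.popen()/subprocess.run()/os.waitpid()
    # inside async functions (the pattern ASYNC220-222 warn about):
    proc = await asyncio.create_subprocess_shell(cmd, stdout=asyncio.subprocess.PIPE)
    out, _ = await proc.communicate()
    return out


print(asyncio.run(run_cmd("echo hi")))
```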

## Test Plan

N/A, no functionality/tests affected
2025-06-30 09:47:29 -04:00
GiGaGon
462dbadee4 [Airflow] Make AIR302 example error out-of-the-box (#18988)

## Summary

Part of #18972

This PR makes [airflow3-moved-to-provider
(AIR302)](https://docs.astral.sh/ruff/rules/airflow3-moved-to-provider/#airflow3-moved-to-provider-air302)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/1026c008-57bc-4330-93b9-141444f2a611)
```py
from airflow.auth.managers.fab.fab_auth_manage import FabAuthManager
```

[New example](https://play.ruff.rs/b690e809-a81d-4265-9fde-1494caa0b7fd)
```py
from airflow.auth.managers.fab.fab_auth_manager import FabAuthManager

fab_auth_manager_app = FabAuthManager().get_fastapi_app()
```

## Test Plan

N/A, no functionality/tests affected
2025-06-30 09:45:15 -04:00
Robsdedude
a3638b3adc [pyupgrade] Mark UP008 fix safe if no comments in range (#18683)

## Summary
Mark `UP008`'s fix safe if it won't delete comments.

## Relevant Issues
Fixes: https://github.com/astral-sh/ruff/issues/18533
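
For context, UP008 rewrites the legacy two-argument `super(Class, self)` call to the zero-argument form; a minimal sketch of the post-fix code (class names are illustrative):

```python
class Base:
    def greet(self) -> str:
        return "hello"


class Child(Base):
    def greet(self) -> str:
        # UP008's fix: super(Child, self).greet() -> super().greet()
        return super().greet() + " from Child"


print(Child().greet())
```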

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-06-30 09:42:05 -04:00
GiGaGon
f857546aeb [flake8-bandit] Make S201 example error out-of-the-box (#19017)

## Summary

Part of #18972

This PR makes [flask-debug-true
(S201)](https://docs.astral.sh/ruff/rules/flask-debug-true/#flask-debug-true-s201)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/d5e1a013-1107-4223-9094-0e8393ad3c64)
```py
import flask

app = Flask()

app.run(debug=True)
```

[New example](https://play.ruff.rs/c4aebd2c-0448-4471-8bad-3e38ace68367)
```py
from flask import Flask

app = Flask()

app.run(debug=True)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

## Test Plan

N/A, no functionality/tests affected
2025-06-30 09:39:59 -04:00
हिमांशु
d78f18cda9 [flake8-executable] Allow uvx in shebang line (EXE003) (#18967)
## Summary
closes #18902 

## Test Plan
I have added a test case
2025-06-30 09:38:18 -04:00
David Peter
db3dcd8ad6 [ty] Eagerly simplify 'True' and 'False' constraints (#18998)
## Summary

Simplifies literal `True` and `False` conditions to `ALWAYS_TRUE` /
`ALWAYS_FALSE` during semantic index building. This allows us to eagerly
evaluate more constraints, which should help with performance (looks
like there is a tiny 1% improvement in instrumented benchmarks), but
also allows us to eliminate definitely-unreachable branches in
control-flow merging. This can lead to better type inference in some
cases because it allows us to retain narrowing constraints without
solving https://github.com/astral-sh/ty/issues/690 first:
```py
def _(c: int | None):
    if c is None:
        assert False
    
    reveal_type(c)  # int, previously: int | None
```

closes https://github.com/astral-sh/ty/issues/713

## Test Plan

* Regression test for https://github.com/astral-sh/ty/issues/713
* Made sure that all ecosystem diffs trace back to removed false
positives
2025-06-30 13:11:52 +02:00
David Peter
54769ac9f9 [ty] While loop modeling cleanup (#18994)
## Summary

I found the previous code here very confusing, and it also did some
unnecessary work. Hopefully this is a bit easier to understand.
2025-06-30 11:38:25 +02:00
renovate[bot]
c80762debd Update pre-commit dependencies (#19038)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit)
| repository | minor | `v0.11.13` -> `v0.12.1` |
|
[python-jsonschema/check-jsonschema](https://redirect.github.com/python-jsonschema/check-jsonschema)
| repository | patch | `0.33.0` -> `0.33.1` |
|
[rbubley/mirrors-prettier](https://redirect.github.com/rbubley/mirrors-prettier)
| repository | minor | `v3.5.3` -> `v3.6.2` |
|
[woodruffw/zizmor-pre-commit](https://redirect.github.com/woodruffw/zizmor-pre-commit)
| repository | minor | `v1.9.0` -> `v1.10.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.1`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.1)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.0...v0.12.1)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.1

###
[`v0.12.0`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.0)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.11.13...v0.12.0)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.0

</details>

<details>
<summary>python-jsonschema/check-jsonschema
(python-jsonschema/check-jsonschema)</summary>

###
[`v0.33.1`](https://redirect.github.com/python-jsonschema/check-jsonschema/blob/HEAD/CHANGELOG.rst#0331)

[Compare
Source](https://redirect.github.com/python-jsonschema/check-jsonschema/compare/0.33.0...0.33.1)

- Update vendored schemas: bamboo-spec, bitbucket-pipelines, circle-ci,
cloudbuild,
compose-spec, dependabot, drone-ci, github-actions, github-workflows,
gitlab-ci,
mergify, readthedocs, renovate, taskfile, travis, woodpecker-ci
(2025-06-22)
- Fix: support `click==8.2.0`
- Fix a bug in `Last-Modified` header parsing which used local time and
could
  result in improper caching. Thanks :user:`fenuks`! (:pr:`565`)

</details>

<details>
<summary>rbubley/mirrors-prettier (rbubley/mirrors-prettier)</summary>

###
[`v3.6.2`](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.1...v3.6.2)

[Compare
Source](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.1...v3.6.2)

###
[`v3.6.1`](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.0...v3.6.1)

[Compare
Source](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.6.0...v3.6.1)

###
[`v3.6.0`](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.5.3...v3.6.0)

[Compare
Source](https://redirect.github.com/rbubley/mirrors-prettier/compare/v3.5.3...v3.6.0)

</details>

<details>
<summary>woodruffw/zizmor-pre-commit
(woodruffw/zizmor-pre-commit)</summary>

###
[`v1.10.0`](https://redirect.github.com/zizmorcore/zizmor-pre-commit/releases/tag/v1.10.0)

[Compare
Source](https://redirect.github.com/woodruffw/zizmor-pre-commit/compare/v1.9.0...v1.10.0)

See: https://github.com/zizmorcore/zizmor/releases/tag/v1.10.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2025-06-30 08:43:39 +00:00
Robsdedude
4103d73224 Minor code simplification (#19022)
When inside a typing-only annotation, the code is always inside an
annotation, too.
2025-06-30 13:42:59 +05:30
renovate[bot]
bedb53daec Update dependency smol-toml to v1.4.0 (#19037)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
| [smol-toml](https://redirect.github.com/squirrelchat/smol-toml) |
[`1.3.4` ->
`1.4.0`](https://renovatebot.com/diffs/npm/smol-toml/1.3.4/1.4.0) |
[![age](https://developer.mend.io/api/mc/badges/age/npm/smol-toml/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/npm/smol-toml/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/npm/smol-toml/1.3.4/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/smol-toml/1.3.4/1.4.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>squirrelchat/smol-toml (smol-toml)</summary>

###
[`v1.4.0`](https://redirect.github.com/squirrelchat/smol-toml/releases/tag/v1.4.0)

[Compare
Source](https://redirect.github.com/squirrelchat/smol-toml/compare/v1.3.4...v1.4.0)

This release introduces better support for integers, courtesy of
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor)! It is
now possible to parse integers that are larger than 53 bits as BigInts.
It is also possible to parse all integers as BigInts, enabling full type
preservation of TOML documents.

The project is now tested against Node 24 as well, ensuring stability on
the latest versions of Node.js.

##### What's Changed

- Use more detailed TOML types by
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) in
[https://github.com/squirrelchat/smol-toml/pull/40](https://redirect.github.com/squirrelchat/smol-toml/pull/40)
- Serialize all numbers as floats by
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) in
[https://github.com/squirrelchat/smol-toml/pull/42](https://redirect.github.com/squirrelchat/smol-toml/pull/42)
- Use bigint for large numbers by
[@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) in
[https://github.com/squirrelchat/smol-toml/pull/41](https://redirect.github.com/squirrelchat/smol-toml/pull/41)

##### New Contributors

- [@&#8203;Gouvernathor](https://redirect.github.com/Gouvernathor) made
their first contribution in
[https://github.com/squirrelchat/smol-toml/pull/40](https://redirect.github.com/squirrelchat/smol-toml/pull/40)

**Full Changelog**:
https://github.com/squirrelchat/smol-toml/compare/v1.3.4...v1.4.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:37:20 +05:30
med1844
0ec2ad2fa5 [ty] Emit error for invalid binary operations in type expressions (#18991)
## Summary

This PR adds a diagnostic for invalid binary operators in type
expressions. It should close https://github.com/astral-sh/ty/issues/706
if merged.

Please feel free to suggest better wordings for the diagnostic message.

## Test Plan

I modified `mdtest/annotations/invalid.md` and added a test for each
binary operator, and fixed tests that were broken by the new diagnostic.
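
The kind of annotation being flagged also fails at runtime when evaluated; a small illustration:

```python
# `|` is the only binary operator with meaning in a type expression;
# something like `int + str` is invalid, and also raises when evaluated:
try:
    int + str  # what an annotation `x: int + str` would evaluate
except TypeError as exc:
    print(f"flagged: {exc}")
```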
2025-06-30 10:06:01 +02:00
renovate[bot]
9469a982cc Update dependency ruff to v0.12.1 (#19032)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
| [ruff](https://docs.astral.sh/ruff)
([source](https://redirect.github.com/astral-sh/ruff),
[changelog](https://redirect.github.com/astral-sh/ruff/blob/main/CHANGELOG.md))
| `==0.12.0` -> `==0.12.1` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/ruff/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/pypi/ruff/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/pypi/ruff/0.12.0/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/ruff/0.12.0/0.12.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>astral-sh/ruff (ruff)</summary>

###
[`v0.12.1`](https://redirect.github.com/astral-sh/ruff/blob/HEAD/CHANGELOG.md#0121)

[Compare
Source](https://redirect.github.com/astral-sh/ruff/compare/0.12.0...0.12.1)

##### Preview features

- \[`flake8-errmsg`] Extend `EM101` to support byte strings
([#&#8203;18867](https://redirect.github.com/astral-sh/ruff/pull/18867))
- \[`flake8-use-pathlib`] Add autofix for `PTH202`
([#&#8203;18763](https://redirect.github.com/astral-sh/ruff/pull/18763))
- \[`pygrep-hooks`] Add `AsyncMock` methods to `invalid-mock-access`
(`PGH005`)
([#&#8203;18547](https://redirect.github.com/astral-sh/ruff/pull/18547))
- \[`pylint`] Ignore `__init__.py` files in (`PLC0414`)
([#&#8203;18400](https://redirect.github.com/astral-sh/ruff/pull/18400))
- \[`ruff`] Trigger `RUF037` for empty string and byte strings
([#&#8203;18862](https://redirect.github.com/astral-sh/ruff/pull/18862))
- \[formatter] Fix missing blank lines before decorated classes in
`.pyi` files
([#&#8203;18888](https://redirect.github.com/astral-sh/ruff/pull/18888))

##### Bug fixes

- Avoid generating diagnostics with per-file ignores
([#&#8203;18801](https://redirect.github.com/astral-sh/ruff/pull/18801))
- Handle parenthesized arguments in `remove_argument`
([#&#8203;18805](https://redirect.github.com/astral-sh/ruff/pull/18805))
- \[`flake8-logging`] Avoid false positive for `exc_info=True` outside
`logger.exception` (`LOG014`)
([#&#8203;18737](https://redirect.github.com/astral-sh/ruff/pull/18737))
- \[`flake8-pytest-style`] Enforce `pytest` import for decorators
([#&#8203;18779](https://redirect.github.com/astral-sh/ruff/pull/18779))
- \[`flake8-pytest-style`] Mark autofix for `PT001` and `PT023` as
unsafe if there's comments in the decorator
([#&#8203;18792](https://redirect.github.com/astral-sh/ruff/pull/18792))
- \[`flake8-pytest-style`] `PT001`/`PT023` fix makes syntax error on
parenthesized decorator
([#&#8203;18782](https://redirect.github.com/astral-sh/ruff/pull/18782))
- \[`flake8-raise`] Make fix unsafe if it deletes comments (`RSE102`)
([#&#8203;18788](https://redirect.github.com/astral-sh/ruff/pull/18788))
- \[`flake8-simplify`] Fix `SIM911` autofix creating a syntax error
([#&#8203;18793](https://redirect.github.com/astral-sh/ruff/pull/18793))
- \[`flake8-simplify`] Fix false negatives for shadowed bindings
(`SIM910`, `SIM911`)
([#&#8203;18794](https://redirect.github.com/astral-sh/ruff/pull/18794))
- \[`flake8-simplify`] Preserve original behavior for `except ()` and
bare `except` (`SIM105`)
([#&#8203;18213](https://redirect.github.com/astral-sh/ruff/pull/18213))
- \[`flake8-pyi`] Fix `PYI041`'s fix causing `TypeError` with `None |
None | ...`
([#&#8203;18637](https://redirect.github.com/astral-sh/ruff/pull/18637))
- \[`perflint`] Fix `PERF101` autofix creating a syntax error and mark
autofix as unsafe if there are comments in the `list` call expr
([#&#8203;18803](https://redirect.github.com/astral-sh/ruff/pull/18803))
- \[`perflint`] Fix false negative in `PERF401`
([#&#8203;18866](https://redirect.github.com/astral-sh/ruff/pull/18866))
- \[`pylint`] Avoid flattening nested `min`/`max` when outer call has
single argument (`PLW3301`)
([#&#8203;16885](https://redirect.github.com/astral-sh/ruff/pull/16885))
- \[`pylint`] Fix `PLC2801` autofix creating a syntax error
([#&#8203;18857](https://redirect.github.com/astral-sh/ruff/pull/18857))
- \[`pylint`] Mark `PLE0241` autofix as unsafe if there's comments in
the base classes
([#&#8203;18832](https://redirect.github.com/astral-sh/ruff/pull/18832))
- \[`pylint`] Suppress `PLE2510`/`PLE2512`/`PLE2513`/`PLE2514`/`PLE2515`
autofix if the text contains an odd number of backslashes
([#&#8203;18856](https://redirect.github.com/astral-sh/ruff/pull/18856))
- \[`refurb`] Detect more exotic float literals in `FURB164`
([#&#8203;18925](https://redirect.github.com/astral-sh/ruff/pull/18925))
- \[`refurb`] Fix `FURB163` autofix creating a syntax error for `yield`
expressions
([#&#8203;18756](https://redirect.github.com/astral-sh/ruff/pull/18756))
- \[`refurb`] Mark `FURB129` autofix as unsafe if there's comments in
the `readlines` call
([#&#8203;18858](https://redirect.github.com/astral-sh/ruff/pull/18858))
- \[`ruff`] Fix false positives and negatives in `RUF010`
([#&#8203;18690](https://redirect.github.com/astral-sh/ruff/pull/18690))
- Fix casing of `analyze.direction` variant names
([#&#8203;18892](https://redirect.github.com/astral-sh/ruff/pull/18892))

##### Rule changes

- Fix f-string interpolation escaping in generated fixes
([#&#8203;18882](https://redirect.github.com/astral-sh/ruff/pull/18882))
- \[`flake8-return`] Mark `RET501` fix unsafe if comments are inside
([#&#8203;18780](https://redirect.github.com/astral-sh/ruff/pull/18780))
- \[`flake8-async`] Fix detection for large integer sleep durations in
`ASYNC116` rule
([#&#8203;18767](https://redirect.github.com/astral-sh/ruff/pull/18767))
- \[`flake8-async`] Mark autofix for `ASYNC115` as unsafe if the call
expression contains comments
([#&#8203;18753](https://redirect.github.com/astral-sh/ruff/pull/18753))
- \[`flake8-bugbear`] Mark autofix for `B004` as unsafe if the `hasattr`
call expr contains comments
([#&#8203;18755](https://redirect.github.com/astral-sh/ruff/pull/18755))
- \[`flake8-comprehension`] Mark autofix for `C420` as unsafe if there's
comments inside the dict comprehension
([#&#8203;18768](https://redirect.github.com/astral-sh/ruff/pull/18768))
- \[`flake8-comprehensions`] Handle template strings for comprehension
fixes
([#&#8203;18710](https://redirect.github.com/astral-sh/ruff/pull/18710))
- \[`flake8-future-annotations`] Add autofix (`FA100`)
([#&#8203;18903](https://redirect.github.com/astral-sh/ruff/pull/18903))
- \[`pyflakes`] Mark `F504`/`F522`/`F523` autofix as unsafe if there's a
call with side effect
([#&#8203;18839](https://redirect.github.com/astral-sh/ruff/pull/18839))
- \[`pylint`] Allow fix with comments and document performance
implications (`PLW3301`)
([#&#8203;18936](https://redirect.github.com/astral-sh/ruff/pull/18936))
- \[`pylint`] Detect more exotic `NaN` literals in `PLW0177`
([#&#8203;18630](https://redirect.github.com/astral-sh/ruff/pull/18630))
- \[`pylint`] Fix `PLC1802` autofix creating a syntax error and mark
autofix as unsafe if there's comments in the `len` call
([#&#8203;18836](https://redirect.github.com/astral-sh/ruff/pull/18836))
- \[`pyupgrade`] Extend version detection to include
`sys.version_info.major` (`UP036`)
([#&#8203;18633](https://redirect.github.com/astral-sh/ruff/pull/18633))
- \[`ruff`] Add lint rule `RUF064` for calling `chmod` with non-octal
integers
([#&#8203;18541](https://redirect.github.com/astral-sh/ruff/pull/18541))
- \[`ruff`] Added `cls.__dict__.get('__annotations__')` check (`RUF063`)
([#&#8203;18233](https://redirect.github.com/astral-sh/ruff/pull/18233))
- \[`ruff`] Frozen `dataclass` default should be valid (`RUF009`)
([#&#8203;18735](https://redirect.github.com/astral-sh/ruff/pull/18735))

##### Server

- Consider virtual path for various server actions
([#&#8203;18910](https://redirect.github.com/astral-sh/ruff/pull/18910))

##### Documentation

- Add fix safety sections
([#&#8203;18940](https://redirect.github.com/astral-sh/ruff/pull/18940),[#&#8203;18841](https://redirect.github.com/astral-sh/ruff/pull/18841),[#&#8203;18802](https://redirect.github.com/astral-sh/ruff/pull/18802),[#&#8203;18837](https://redirect.github.com/astral-sh/ruff/pull/18837),[#&#8203;18800](https://redirect.github.com/astral-sh/ruff/pull/18800),[#&#8203;18415](https://redirect.github.com/astral-sh/ruff/pull/18415),[#&#8203;18853](https://redirect.github.com/astral-sh/ruff/pull/18853),[#&#8203;18842](https://redirect.github.com/astral-sh/ruff/pull/18842))
- Use updated pre-commit id
([#&#8203;18718](https://redirect.github.com/astral-sh/ruff/pull/18718))
- \[`perflint`] Small docs improvement to `PERF401`
([#&#8203;18786](https://redirect.github.com/astral-sh/ruff/pull/18786))
- \[`pyupgrade`] Use `super()`, not `__super__`, in error messages
  (`UP008`)
  ([#&#8203;18743](https://redirect.github.com/astral-sh/ruff/pull/18743))
- \[`flake8-pie`] Small docs fix to `PIE794`
([#&#8203;18829](https://redirect.github.com/astral-sh/ruff/pull/18829))
- \[`flake8-pyi`] Correct `collections-named-tuple` example to use
PascalCase assignment
([#&#8203;16884](https://redirect.github.com/astral-sh/ruff/pull/16884))
- \[`flake8-pie`] Add note on type checking benefits to
`unnecessary-dict-kwargs` (`PIE804`)
([#&#8203;18666](https://redirect.github.com/astral-sh/ruff/pull/18666))
- \[`pycodestyle`] Clarify PEP 8 relationship to
`whitespace-around-operator` rules
([#&#8203;18870](https://redirect.github.com/astral-sh/ruff/pull/18870))

##### Other changes

- Disallow newlines in format specifiers of single quoted f- or
t-strings
([#&#8203;18708](https://redirect.github.com/astral-sh/ruff/pull/18708))
- \[`flake8-logging`] Add fix safety section to `LOG002`
([#&#8203;18840](https://redirect.github.com/astral-sh/ruff/pull/18840))
- \[`pyupgrade`] Add fix safety section to `UP010`
([#&#8203;18838](https://redirect.github.com/astral-sh/ruff/pull/18838))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:12:53 +05:30
renovate[bot]
e33980fd5c Update PyO3/maturin-action action to v1.49.3 (#19033)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [PyO3/maturin-action](https://redirect.github.com/PyO3/maturin-action)
| action | patch | `v1.49.2` -> `v1.49.3` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>PyO3/maturin-action (PyO3/maturin-action)</summary>

###
[`v1.49.3`](https://redirect.github.com/PyO3/maturin-action/releases/tag/v1.49.3)

[Compare
Source](https://redirect.github.com/PyO3/maturin-action/compare/v1.49.2...v1.49.3)

##### What's Changed

- Fix `musllinux` builds with `rust-toolchain.toml` containing channel
that is not `stable` by
[@&#8203;FloezeTv](https://redirect.github.com/FloezeTv) in
[https://github.com/PyO3/maturin-action/pull/359](https://redirect.github.com/PyO3/maturin-action/pull/359)

##### New Contributors

- [@&#8203;FloezeTv](https://redirect.github.com/FloezeTv) made their
first contribution in
[https://github.com/PyO3/maturin-action/pull/359](https://redirect.github.com/PyO3/maturin-action/pull/359)

**Full Changelog**:
https://github.com/PyO3/maturin-action/compare/v1.49.2...v1.49.3

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:08:04 +05:30
renovate[bot]
053739f698 Update astral-sh/setup-uv action to v6.3.1 (#19031)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [astral-sh/setup-uv](https://redirect.github.com/astral-sh/setup-uv) |
action | patch | `v6.3.0` -> `v6.3.1` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>astral-sh/setup-uv (astral-sh/setup-uv)</summary>

###
[`v6.3.1`](https://redirect.github.com/astral-sh/setup-uv/releases/tag/v6.3.1):
🌈 Do not warn when version not in manifest-file

[Compare
Source](https://redirect.github.com/astral-sh/setup-uv/compare/v6.3.0...v6.3.1)

##### Changes

This is a hotfix that downgrades the warning emitted when a version
cannot be found in the local manifest-file to info level.

A `setup-uv` release contains a version-manifest.json file with info on
all available `uv` releases. When a new `uv` version is released, it is
not listed in this file until the file is updated and a new `setup-uv`
release is made. We will overhaul this process in the future, but for
now the warning spam is removed.

##### 🐛 Bug fixes

- Do not warn when version not in manifest-file
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;462](https://redirect.github.com/astral-sh/setup-uv/issues/462))

##### 🧰 Maintenance

- chore: update known versions for 0.7.14
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;459](https://redirect.github.com/astral-sh/setup-uv/issues/459))
- Revert "Set expected cache dir drive to C: on windows
([#&#8203;451](https://redirect.github.com/astral-sh/setup-uv/issues/451))"
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;460](https://redirect.github.com/astral-sh/setup-uv/issues/460))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:06:39 +05:30
renovate[bot]
192569c848 Update Rust crate clearscreen to v4.0.2 (#19034)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [clearscreen](https://redirect.github.com/watchexec/clearscreen) |
workspace.dependencies | patch | `4.0.1` -> `4.0.2` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>watchexec/clearscreen (clearscreen)</summary>

###
[`v4.0.2`](https://redirect.github.com/watchexec/clearscreen/blob/HEAD/CHANGELOG.md#v402-2025-06-25)

[Compare
Source](https://redirect.github.com/watchexec/clearscreen/compare/v4.0.1...v4.0.2)

- **Deps:** Upgrade which from 7.0.2 to 8.0.0
([#&#8203;34](https://redirect.github.com/watchexec/clearscreen/issues/34))
-
([09bc029](09bc0299f4))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:06:10 +05:30
renovate[bot]
ae7eaa9913 Update Rust crate ordermap to v0.5.8 (#19036)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [ordermap](https://redirect.github.com/indexmap-rs/ordermap) |
workspace.dependencies | patch | `0.5.7` -> `0.5.8` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>indexmap-rs/ordermap (ordermap)</summary>

###
[`v0.5.8`](https://redirect.github.com/indexmap-rs/ordermap/blob/HEAD/RELEASES.md#058-2025-06-26)

[Compare
Source](https://redirect.github.com/indexmap-rs/ordermap/compare/0.5.7...0.5.8)

- Added `extract_if` methods to `OrderMap` and `OrderSet`, similar to
the
methods for `HashMap` and `HashSet` with ranges like `Vec::extract_if`.
- Added more `#[track_caller]` annotations to functions that may panic.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:05:52 +05:30
renovate[bot]
d88bebca32 Update Rust crate indexmap to v2.10.0 (#19040)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [indexmap](https://redirect.github.com/indexmap-rs/indexmap) |
workspace.dependencies | minor | `2.9.0` -> `2.10.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>indexmap-rs/indexmap (indexmap)</summary>

###
[`v2.10.0`](https://redirect.github.com/indexmap-rs/indexmap/blob/HEAD/RELEASES.md#2100-2025-06-26)

[Compare
Source](https://redirect.github.com/indexmap-rs/indexmap/compare/2.9.0...2.10.0)

- Added `extract_if` methods to `IndexMap` and `IndexSet`, similar to
the
methods for `HashMap` and `HashSet` with ranges like `Vec::extract_if`.
- Added more `#[track_caller]` annotations to functions that may panic.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 13:04:43 +05:30
InSync
e7aadfc28b [ty] Add special-cased inference for __import__(name) and importlib.import_module(name) (#19008) 2025-06-29 11:49:23 +01:00
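To illustrate the special-cased inference in the commit above: when the module name is a string literal, ty can infer the concrete module that `__import__(name)` and `importlib.import_module(name)` return, rather than a bare `ModuleType`. A minimal runtime sketch using standard-library modules:

```python
import importlib

# With a literal module name, the return value is the actual module, so
# a checker that special-cases these calls can see its full API.
math = __import__("math")
json = importlib.import_module("json")

assert math.sqrt(4) == 2.0
assert json.dumps({"a": 1}) == '{"a": 1}'
```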
Shunsuke Shibayama
de1f8177be [ty] Improve protocol member type checking and relation handling (#18847)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-06-29 10:46:33 +00:00
Ibraheem Ahmed
9218bf72ad [ty] Print salsa memory usage totals in mypy primer CI runs (#18973)
## Summary

Print the [new salsa memory usage
dumps](https://github.com/astral-sh/ruff/pull/18928) in mypy primer CI
runs to help us catch memory regressions. The numbers are rounded to the
nearest power of 1.1 (about a 5% threshold between buckets) to avoid overly sensitive diffs.
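The bucketing described above amounts to rounding in log-space. A sketch of the idea (an illustrative reimplementation, not the actual CI code):

```python
import math

def round_to_power(n_bytes: float, base: float = 1.1) -> float:
    """Round a value to the nearest power of `base`.

    With base 1.1, adjacent buckets differ by ~10%, so any value is at
    most ~5% away from its bucket; small memory fluctuations between CI
    runs land in the same bucket and produce no diff.
    """
    if n_bytes <= 0:
        return 0.0
    return base ** round(math.log(n_bytes, base))
```

For example, 955 MB and 1000 MB round to the same bucket, so run-to-run noise below the ~5% threshold disappears from the report.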
2025-06-28 15:09:50 -04:00
Micha Reiser
29927f2b59 Update Rust toolchain to 1.88 and MSRV to 1.86 (#19011) 2025-06-28 20:24:00 +02:00
GiGaGon
c5995c40d3 [flake8-async] Make ASYNC105 example error out-of-the-box (#19002)
## Summary

Part of #18972

This PR makes [trio-sync-call
(ASYNC105)](https://docs.astral.sh/ruff/rules/trio-sync-call/#trio-sync-call-async105)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/5b267e01-1c0a-4902-949e-45fc46f8b0d0)
```py
async def double_sleep(x):
    trio.sleep(2 * x)
```

[New example](https://play.ruff.rs/eba6ea40-ff88-4ea8-8cb4-cea472c15c53)
```py
import trio


async def double_sleep(x):
    trio.sleep(2 * x)
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:18:06 -05:00
GiGaGon
68f98cfcd8 [Airflow] Make AIR312 example error out-of-the-box (#18989)
## Summary

Part of #18972

This PR makes [airflow3-suggested-to-move-to-provider
(AIR312)](https://docs.astral.sh/ruff/rules/airflow3-suggested-to-move-to-provider/#airflow3-suggested-to-move-to-provider-air312)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/1be0d654-1ed5-4a0b-8791-cc5db73333d5)
```py
from airflow.operators.python import PythonOperator
```

[New example](https://play.ruff.rs/b6260206-fa19-4ab2-8d45-ddd43c46a759)
```py
from airflow.operators.python import PythonOperator


def print_context(ds=None, **kwargs):
    print(kwargs)
    print(ds)


print_the_context = PythonOperator(
    task_id="print_the_context", python_callable=print_context
)
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:17:11 -05:00
GiGaGon
315adba906 [flake8-async] Make ASYNC251 example error out-of-the-box (#18990)
## Summary

Part of #18972

This PR makes [blocking-sleep-in-async-function
(ASYNC251)](https://docs.astral.sh/ruff/rules/blocking-sleep-in-async-function/#blocking-sleep-in-async-function-async251)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/796684a2-c437-4390-b754-491e576ffe5e)
```py
async def fetch():
    time.sleep(1)
```

[New example](https://play.ruff.rs/90741192-fd0d-49fb-a04e-3127312da659)
```py
import time


async def fetch():
    time.sleep(1)
```

Imports were also added to the `Use instead:` section to make it valid
code out-of-the-box.

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:15:34 -05:00
GiGaGon
523174e8be [flake8-async] Make ASYNC100 example error out-of-the-box (#18993)
## Summary

Part of #18972

This PR makes [cancel-scope-no-checkpoint
(ASYNC100)](https://docs.astral.sh/ruff/rules/cancel-scope-no-checkpoint/#cancel-scope-no-checkpoint-async100)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/6a399ae5-9b89-4438-b808-6604f1e40a70)
```py
async def func():
    async with asyncio.timeout(2):
        do_something()
```

[New example](https://play.ruff.rs/c44db531-d2f8-4a61-9e04-e5fc0ea989e3)
```py
import asyncio


async def func():
    async with asyncio.timeout(2):
        do_something()
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:13:54 -05:00
GiGaGon
ed2e90371b [flake8-async] Make ASYNC210 example error out-of-the-box (#18977)
## Summary

Part of #18972

This PR makes [blocking-http-call-in-async-function
(ASYNC210)](https://docs.astral.sh/ruff/rules/blocking-http-call-in-async-function/#blocking-http-call-in-async-function-async210)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/20cba4f4-fe2f-428a-a721-311d1a081e64)
```py
async def fetch():
    urllib.request.urlopen("https://example.com/foo/bar").read()
```

[New example](https://play.ruff.rs/5ca2a10d-5294-49ee-baee-0447f7188d9b)
```py
import urllib


async def fetch():
    urllib.request.urlopen("https://example.com/foo/bar").read()
```

## Test Plan

N/A, no functionality/tests affected
2025-06-28 10:11:38 -05:00
David Peter
90cb0d3a7b [ty] Reduce 'complex_constrained_attributes_2' runtime (#19001)
Re: https://github.com/astral-sh/ruff/pull/18979#issuecomment-3012541095

Each check increases the runtime by a factor of 3, so this should be an
order of magnitude faster.
2025-06-27 23:15:45 +02:00
Alex Waygood
1297d6a9eb [ty] Followups to tuple constructor improvements in #18987 (#19000) 2025-06-27 22:09:14 +01:00
Douglas Creager
caf3c916e8 [ty] Refactor argument matching / type checking in call binding (#18997)
This PR extracts a lot of the complex logic in the `match_parameters`
and `check_types` methods of our call binding machinery into separate
helper types. This is setup for #18996, which will update this logic to
handle variadic arguments. To do so, it is helpful to have the
per-argument logic extracted into a method that we can call repeatedly
for each _element_ of a variadic argument.

This should be a pure refactoring, with no behavioral changes.
2025-06-27 17:01:52 -04:00
Douglas Creager
c60e590b4c [ty] Support variable-length tuples in unpacking assignments (#18948)
This PR updates our unpacking assignment logic to use the new tuple
machinery. As a result, we can now unpack variable-length tuples
correctly.

As part of this, the `TupleSpec` classes have been renamed to `Tuple`,
and can now contain any element (Rust) type, not just `Type<'db>`. The
unpacker uses a tuple of `UnionBuilder`s to maintain the types that will
be assigned to each target, as we iterate through potentially many union
elements on the rhs. We also add a new consuming iterator for tuples,
and update the `all_elements` methods to wrap the result in an enum
(similar to `itertools::Position`) letting you know which part of the
tuple each element appears in. I also added a new
`UnionBuilder::try_build`, which lets you specify a different fallback
type if the union contains no elements.
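As a runtime illustration of the unpacking targets the new machinery models (plain Python semantics, shown for context rather than ty internals):

```python
# Unpacking a variable-length tuple: the starred target collects the
# middle elements into a list, while the fixed targets take the ends.
row: tuple[int, ...] = (1, 2, 3, 4)
first, *middle, last = row

assert first == 1
assert middle == [2, 3]
assert last == 4
```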
2025-06-27 15:29:04 -04:00
Alex Waygood
a50a993b9c [ty] Make tuple instantiations sound (#18987)
## Summary

Ensure that we correctly infer calls such as `tuple((1, 2))`,
`tuple(range(42))`, etc. Ensure that we emit errors on invalid calls
such as `tuple[int, str]()`.

## Test Plan

Mdtests
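For context, the runtime behavior of the valid calls mentioned above; the `tuple[int, str]()` case is the one ty now rejects as a type error, even though CPython happens to forward such calls at runtime:

```python
# The tuple constructor accepts any iterable; these are the call shapes
# whose element types ty can now infer precisely.
assert tuple((1, 2)) == (1, 2)
assert tuple(range(42))[:3] == (0, 1, 2)
assert tuple("ab") == ("a", "b")
```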
2025-06-27 19:37:16 +01:00
Robsdedude
6802c4702f [flake8-pyi] Expand Optional[A] to A | None (PYI016) (#18572)
## Summary
Under preview 🧪 I've expanded rule `PYI016` to also flag type
union duplicates containing `None` and `Optional`.

## Test Plan
Examples/tests have been added. I've made sure that the existing
examples did not change unless preview is enabled.

## Relevant Issues
* https://github.com/astral-sh/ruff/issues/18508 (discussing
introducing/extending a rule to flag `Optional[None]`)
* https://github.com/astral-sh/ruff/issues/18546 (where I discussed this
addition with @AlexWaygood)

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-06-27 15:43:11 +00:00
Brent Westbrook
96f3c8d1ab Convert OldDiagnostic::noqa_code to an Option<String> (#18946)
## Summary

I think this should be the last step before combining `OldDiagnostic`
and `ruff_db::Diagnostic`. We can't store a `NoqaCode` on
`ruff_db::Diagnostic`, so I converted the `noqa_code` field to an
`Option<String>` and then propagated this change to all of the callers.

I tried to use `&str` everywhere it was possible, so I think the
remaining `to_string` calls are necessary. I spent some time trying to
convert _everything_ to `&str` but ran into lifetime issues, especially
in the `FixTable`. Maybe we can take another look at that if it causes a
performance regression, but hopefully these paths aren't too hot. We
also avoid some `to_string` calls, so it might even out a bit too.

## Test Plan

Existing tests

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-27 11:36:55 -04:00
Andrew Gallant
efcb63fe3a [ty] Fix playground (#18986)
I renamed a field on a `Completion` struct in #18982, and it looks like
this caused the playground to fail to build:

https://github.com/astral-sh/ruff/actions/runs/15928550050/job/44931734349

Maybe building that playground can be added to CI for pull requests?
2025-06-27 10:43:36 -04:00
Andrew Gallant
5f6b0ded21 [ty] Add builtins to completions derived from scope (#18982)
Most of the work here was doing some light refactoring to facilitate
sensible testing. That is, we don't want to list every builtin included
in most tests, so we add some structure to the completion type returned.
Tests can now filter based on whether a completion is a builtin or not.

Otherwise, builtins are found using the existing infrastructure for
`object.attr` completions (where we hard-code the module name
`builtins`).

I did consider changing the sort order based on whether a completion
suggestion was a builtin or not. In particular, it seemed like it might
be a good idea to sort builtins after other scope based completions,
but before the dunder and sunder attributes. Namely, it seems likely
that there is an inverse correlation between the size of a scope and
the likelihood of an item in that scope being used at any given point.
So it *might* be a good idea to prioritize the likelier candidates in
the completions returned.

Additionally, the number of items introduced by adding builtins is quite
large. So I wondered whether mixing them in with everything else would
become too noisy.

However, it's not totally clear to me that this is the right thing to
do. Right now, I feel like there is a very obvious lexicographic
ordering that makes "finding" the right suggestion to activate
potentially easier than if the ranking mechanism is less clear.
(Technically, the dunder and sunder attributes are not sorted
lexicographically, but I'd put forward that most folks don't have an
intuitive understanding of where `_` ranks lexicographically with
respect to "regular" letters. Moreover, since dunder and sunder
attributes are all grouped together, I think the ordering here ends up
being very obvious after even a quick glance.)
2025-06-27 10:20:01 -04:00
Matthew Mckee
a3c79d8170 [ty] Don't add incorrect subdiagnostic for unresolved reference (#18487)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
Co-authored-by: Carl Meyer <carl@astral.sh>
2025-06-27 12:40:33 +00:00
Alex Waygood
57bd7d055d [ty] Simplify KnownClass::check_call() and KnownFunction::check_call() (#18981) 2025-06-27 12:23:29 +01:00
David Peter
3c18d85c7d [ty] Add micro-benchmark for #711 (#18979)
## Summary

Add a benchmark for the problematic case in
https://github.com/astral-sh/ty/issues/711, which will potentially be
solved in https://github.com/astral-sh/ruff/pull/18955
2025-06-27 11:34:51 +02:00
GiGaGon
e5e3d998c5 [flake8-annotations] Make ANN401 example error out-of-the-box (#18974) 2025-06-27 07:06:11 +00:00
GiGaGon
85b2a08b5c [flake8-async] Make ASYNC110 example error out-of-the-box (#18975) 2025-06-27 09:01:02 +02:00
Jordy Williams
1874d52eda [pandas]: Fix issue on non pandas dataframe in-place usage (PD002) (#18963) 2025-06-27 06:56:13 +00:00
Yair Peretz
18efe2ab46 [pylint] Fix PLC0415 example (#18970)
Fixed documentation error
2025-06-26 18:33:33 -04:00
Ibraheem Ahmed
6f7b1c9bb3 [ty] Add environment variable to dump Salsa memory usage stats (#18928)
## Summary

Setting `TY_MEMORY_REPORT=full` will generate and print a memory usage
report to the CLI after a `ty check` run:

```
=======SALSA STRUCTS=======
`Definition`                                       metadata=7.24MB   fields=17.38MB  count=181062
`Expression`                                       metadata=4.45MB   fields=5.94MB   count=92804
`member_lookup_with_policy_::interned_arguments`   metadata=1.97MB   fields=2.25MB   count=35176
...
=======SALSA QUERIES=======
`File -> ty_python_semantic::semantic_index::SemanticIndex`
    metadata=11.46MB  fields=88.86MB  count=1638
`Definition -> ty_python_semantic::types::infer::TypeInference`
    metadata=24.52MB  fields=86.68MB  count=146018
`File -> ruff_db::parsed::ParsedModule`
    metadata=0.12MB   fields=69.06MB  count=1642
...
=======SALSA SUMMARY=======
TOTAL MEMORY USAGE: 577.61MB
    struct metadata = 29.00MB
    struct fields = 35.68MB
    memo metadata = 103.87MB
    memo fields = 409.06MB
```

Eventually, we should integrate these numbers into CI in some form. The
one limitation currently is that heap allocations in salsa structs (e.g.
interned values) are not tracked, but memoized values should have full
coverage. We may also want a peak memory usage counter (that accounts
for non-salsa memory), but that is relatively simple to profile manually
(e.g. `time -v ty check`) and would require a compile-time option to
avoid runtime overhead.
2025-06-26 21:27:51 +00:00
Victor Hugo Gomes
a1579d82d0 [pylint] Fix PLW0108 autofix introducing a syntax error when the lambda's body contains an assignment expression (#18678)

## Summary

This PR also suppresses the fix if the assignment expression target
shadows one of the lambda's parameters.

Fixes #18675

## Test Plan

Add regression tests.
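An illustrative reproduction with hypothetical names: the PLW0108 autofix previously produced a syntax error when the lambda's body contained an assignment expression like this one.

```python
def process(x):
    return x * 2

# PLW0108 suggests replacing a forwarding lambda with the function
# itself, but the walrus makes the body more than a plain forward, so
# the autofix must handle (or skip) this case rather than emit broken
# code. The value of an assignment expression is the value assigned.
handler = lambda x: (result := process(x))

assert handler(3) == 6
```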
2025-06-26 16:56:17 -04:00
319 changed files with 11057 additions and 4087 deletions

View File

@@ -49,7 +49,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build sdist"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
command: sdist
args: --out dist
@@ -79,7 +79,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels - x86_64"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: x86_64
args: --release --locked --out dist
@@ -121,7 +121,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels - aarch64"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: aarch64
args: --release --locked --out dist
@@ -177,7 +177,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
args: --release --locked --out dist
@@ -230,7 +230,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
manylinux: auto
@@ -304,7 +304,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
@@ -370,7 +370,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
manylinux: musllinux_1_2
@@ -435,7 +435,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
-uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
+uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: musllinux_1_2

View File

@@ -321,14 +321,30 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- name: Setup Dev Drive
run: ${{ github.workspace }}/.github/workflows/setup-dev-drive.ps1
# actions/checkout does not let us clone into anywhere outside `github.workspace`, so we have to copy the clone
- name: Copy Git Repo to Dev Drive
env:
RUFF_WORKSPACE: ${{ env.RUFF_WORKSPACE }}
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${env:RUFF_WORKSPACE}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:
workspaces: ${{ env.RUFF_WORKSPACE }}
- name: "Install Rust toolchain"
working-directory: ${{ env.RUFF_WORKSPACE }}
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
with:
tool: cargo-nextest
- name: "Run tests"
working-directory: ${{ env.RUFF_WORKSPACE }}
shell: bash
env:
NEXTEST_PROFILE: "ci"
@@ -460,7 +476,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
-- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
+- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
name: Download Ruff binary to test
id: download-cached-binary
@@ -661,7 +677,7 @@ jobs:
branch: ${{ github.event.pull_request.base.ref }}
workflow: "ci.yaml"
check_artifacts: true
-- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
+- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: Fuzz
env:
FORCE_COLOR: 1
@@ -712,7 +728,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
uses: PyO3/maturin-action@35be3186fc8e037e329f06b68dcd807d83dcc6dc # v1.49.2
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
args: --out dist
- name: "Test wheel"
@@ -731,7 +747,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
@@ -774,7 +790,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: Install uv
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uv pip install -r docs/requirements-insiders.txt --system
@@ -906,7 +922,7 @@ jobs:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Rust toolchain"
run: rustup show
@@ -939,7 +955,7 @@ jobs:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Rust toolchain"
run: rustup show

View File

@@ -34,7 +34,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"

View File

@@ -37,7 +37,7 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:
@@ -48,6 +48,8 @@ jobs:
- name: Run mypy_primer
shell: bash
env:
TY_MEMORY_REPORT: mypy_primer
run: |
cd ruff

View File

@@ -22,7 +22,7 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
pattern: wheels-*

.github/workflows/setup-dev-drive.ps1 (new file, 93 lines)
View File

@@ -0,0 +1,93 @@
# Configures a drive for testing in CI.
#
# When using standard GitHub Actions runners, a `D:` drive is present and has
# performance characteristics similar to or better than a ReFS dev drive. Sometimes
# using a larger runner is still more performant (e.g., when running the test
# suite) and we need to create a dev drive. This script automatically configures
# the appropriate drive.
#
# When using GitHub Actions' "larger runners", the `D:` drive is not present and
# we create a DevDrive mount on `C:`. This is purported to be more performant
# than a ReFS drive, though we did not see a change when we switched over.
#
# When using Depot runners, the underlying infrastructure is EC2, which does not
# support Hyper-V. The `New-VHD` cmdlet only works with Hyper-V, but we can
# create a ReFS drive using `diskpart` and `format` directly. We cannot use a
# DevDrive, as that also requires Hyper-V. The Depot runners use `D:` already,
# so we must check if it's a Depot runner first, and we use `V:` as the target
# instead.
if ($env:DEPOT_RUNNER -eq "1") {
Write-Output "DEPOT_RUNNER detected, setting up custom dev drive..."
# Create VHD and configure drive using diskpart
$vhdPath = "C:\ruff_dev_drive.vhdx"
@"
create vdisk file="$vhdPath" maximum=20480 type=expandable
attach vdisk
create partition primary
active
assign letter=V
"@ | diskpart
# Format the drive as ReFS
format V: /fs:ReFS /q /y
$Drive = "V:"
Write-Output "Custom dev drive created at $Drive"
} elseif (Test-Path "D:\") {
# Note `Get-PSDrive` is not sufficient because the drive letter is assigned.
Write-Output "Using existing drive at D:"
$Drive = "D:"
} else {
# The size (20 GB) is chosen empirically to be large enough for our
# workflows; larger drives can take longer to set up.
$Volume = New-VHD -Path C:/ruff_dev_drive.vhdx -SizeBytes 20GB |
Mount-VHD -Passthru |
Initialize-Disk -Passthru |
New-Partition -AssignDriveLetter -UseMaximumSize |
Format-Volume -DevDrive -Confirm:$false -Force
$Drive = "$($Volume.DriveLetter):"
# Set the drive as trusted
# See https://learn.microsoft.com/en-us/windows/dev-drive/#how-do-i-designate-a-dev-drive-as-trusted
fsutil devdrv trust $Drive
# Disable antivirus filtering on dev drives
# See https://learn.microsoft.com/en-us/windows/dev-drive/#how-do-i-configure-additional-filters-on-dev-drive
fsutil devdrv enable /disallowAv
# Remount so the changes take effect
Dismount-VHD -Path C:/ruff_dev_drive.vhdx
Mount-VHD -Path C:/ruff_dev_drive.vhdx
# Show some debug information
Write-Output $Volume
fsutil devdrv query $Drive
Write-Output "Using Dev Drive at $Drive"
}
$Tmp = "$($Drive)\ruff-tmp"
# Create the directory ahead of time in an attempt to avoid race conditions
New-Item $Tmp -ItemType Directory
# Move Cargo to the dev drive
New-Item -Path "$($Drive)/.cargo/bin" -ItemType Directory -Force
if (Test-Path "C:/Users/runneradmin/.cargo") {
Copy-Item -Path "C:/Users/runneradmin/.cargo/*" -Destination "$($Drive)/.cargo/" -Recurse -Force
}
Write-Output `
"DEV_DRIVE=$($Drive)" `
"TMP=$($Tmp)" `
"TEMP=$($Tmp)" `
"UV_INTERNAL__TEST_DIR=$($Tmp)" `
"RUSTUP_HOME=$($Drive)/.rustup" `
"CARGO_HOME=$($Drive)/.cargo" `
"RUFF_WORKSPACE=$($Drive)/ruff" `
"PATH=$($Drive)/.cargo/bin;$env:PATH" `
>> $env:GITHUB_ENV
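The trailing `Write-Output ... >> $env:GITHUB_ENV` block relies on the GitHub Actions convention that each `KEY=value` line appended to the file named by `GITHUB_ENV` becomes an environment variable for subsequent steps. A minimal sketch of that mechanism in Rust (the fallback file name here is a stand-in, not the real runner-provided path):

```rust
use std::env;
use std::fs::OpenOptions;
use std::io::Write;

// Append a `KEY=value` line to a GITHUB_ENV-style file; later CI steps
// would see `key` exported as an environment variable.
fn append_env(path: &str, key: &str, value: &str) -> std::io::Result<()> {
    let mut file = OpenOptions::new().create(true).append(true).open(path)?;
    writeln!(file, "{key}={value}")
}

fn main() -> std::io::Result<()> {
    // On a real runner this would be `env::var("GITHUB_ENV")`.
    let path = env::var("GITHUB_ENV").unwrap_or_else(|_| "github_env.txt".into());
    append_env(&path, "DEV_DRIVE", "V:")?;
    append_env(&path, "TMP", r"V:\ruff-tmp")?;
    println!("{}", std::fs::read_to_string(&path)?);
    Ok(())
}
```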

View File

@@ -32,7 +32,7 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@445689ea25e0de0a23313031f5fe577c74ae45a1 # v6.3.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:

View File

@@ -81,7 +81,7 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.13
rev: v0.12.1
hooks:
- id: ruff-format
- id: ruff
@@ -91,7 +91,7 @@ repos:
# Prettier
- repo: https://github.com/rbubley/mirrors-prettier
rev: v3.5.3
rev: v3.6.2
hooks:
- id: prettier
types: [yaml]
@@ -99,12 +99,12 @@ repos:
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/woodruffw/zizmor-pre-commit
rev: v1.9.0
rev: v1.10.0
hooks:
- id: zizmor
- repo: https://github.com/python-jsonschema/check-jsonschema
rev: 0.33.0
rev: 0.33.1
hooks:
- id: check-github-workflows

Cargo.lock (generated; 257 lines changed)
View File

@@ -132,6 +132,15 @@ version = "1.0.98"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e16d2d3311acee920a9eb8d33b8cbc1787ce4a264e85f964c2404b969bdcd487"
[[package]]
name = "approx"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cab112f0a86d568ea0e627cc1d6be74a1e9cd55214684db5561995f6dad897c6"
dependencies = [
"num-traits",
]
[[package]]
name = "arc-swap"
version = "1.7.1"
@@ -169,6 +178,36 @@ dependencies = [
"tempfile",
]
[[package]]
name = "attribute-derive"
version = "0.10.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0053e96dd3bec5b4879c23a138d6ef26f2cb936c9cdc96274ac2b9ed44b5bb54"
dependencies = [
"attribute-derive-macro",
"derive-where",
"manyhow",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "attribute-derive-macro"
version = "0.10.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "463b53ad0fd5b460af4b1915fe045ff4d946d025fb6c4dc3337752eaa980f71b"
dependencies = [
"collection_literals",
"interpolator",
"manyhow",
"proc-macro-utils",
"proc-macro2",
"quote",
"quote-use",
"syn",
]
[[package]]
name = "autocfg"
version = "1.4.0"
@@ -181,6 +220,15 @@ version = "0.13.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9e1b586273c5702936fe7b7d6896644d8be71e6314cfe09d3167c95f712589e8"
[[package]]
name = "bincode"
version = "1.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1f45e9417d87227c7a56d22e471c6206462cba514c7590c09aff4cf6d1ddcad"
dependencies = [
"serde",
]
[[package]]
name = "bincode"
version = "2.0.1"
@@ -419,9 +467,9 @@ checksum = "f46ad14479a25103f283c0f10005961cf086d8dc42205bb44c46ac563475dca6"
[[package]]
name = "clearscreen"
version = "4.0.1"
version = "4.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c41dc435a7b98e4608224bbf65282309f5403719df9113621b30f8b6f74e2f4"
checksum = "85a8ab73a1c02b0c15597b22e09c7dc36e63b2f601f9d1e83ac0c3decd38b1ae"
dependencies = [
"nix 0.29.0",
"terminfo",
@@ -432,22 +480,27 @@ dependencies = [
[[package]]
name = "codspeed"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "93f4cce9c27c49c4f101fffeebb1826f41a9df2e7498b7cd4d95c0658b796c6c"
checksum = "922018102595f6668cdd09c03f4bff2d951ce2318c6dca4fe11bdcb24b65b2bf"
dependencies = [
"anyhow",
"bincode 1.3.3",
"colored 2.2.0",
"glob",
"libc",
"nix 0.29.0",
"serde",
"serde_json",
"statrs",
"uuid",
]
[[package]]
name = "codspeed-criterion-compat"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c3c23d880a28a2aab52d38ca8481dd7a3187157d0a952196b6db1db3c8499725"
checksum = "24d8ad82d2383cb74995f58993cbdd2914aed57b2f91f46580310dd81dc3d05a"
dependencies = [
"codspeed",
"codspeed-criterion-compat-walltime",
@@ -456,9 +509,9 @@ dependencies = [
[[package]]
name = "codspeed-criterion-compat-walltime"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7b0a2f7365e347f4f22a67e9ea689bf7bc89900a354e22e26cf8a531a42c8fbb"
checksum = "61badaa6c452d192a29f8387147888f0ab358553597c3fe9bf8a162ef7c2fa64"
dependencies = [
"anes",
"cast",
@@ -481,9 +534,9 @@ dependencies = [
[[package]]
name = "codspeed-divan-compat"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8620a09dfaf37b3c45f982c4b65bd8f9b0203944da3ffa705c0fcae6b84655ff"
checksum = "3acf1d6fe367c2ff5ff136ca723f678490c3691d59d7f2b83d5e53b7b25ac91e"
dependencies = [
"codspeed",
"codspeed-divan-compat-macros",
@@ -492,9 +545,9 @@ dependencies = [
[[package]]
name = "codspeed-divan-compat-macros"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "30fe872bc4214626b35d3a1706a905d0243503bb6ba3bb7be2fc59083d5d680c"
checksum = "bcfa2013d7bee54a497d0e1410751d5de690fd67a3e9eb728ca049b6a3d16d0b"
dependencies = [
"divan-macros",
"itertools 0.14.0",
@@ -506,9 +559,9 @@ dependencies = [
[[package]]
name = "codspeed-divan-compat-walltime"
version = "2.10.1"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "104caa97b36d4092d89e24e4b103b40ede1edab03c0372d19e14a33f9393132b"
checksum = "e513100fb0e7ba02fb3824546ecd2abfb8f334262f0972225b463aad07f99ff0"
dependencies = [
"cfg-if",
"clap",
@@ -519,6 +572,12 @@ dependencies = [
"regex-lite",
]
[[package]]
name = "collection_literals"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "186dce98367766de751c42c4f03970fc60fc012296e706ccbb9d5df9b6c1e271"
[[package]]
name = "colorchoice"
version = "1.0.3"
@@ -808,6 +867,17 @@ dependencies = [
"parking_lot_core",
]
[[package]]
name = "derive-where"
version = "1.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "510c292c8cf384b1a340b816a9a6cf2599eb8f566a44949024af88418000c50b"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "diff"
version = "0.1.13"
@@ -1067,6 +1137,29 @@ dependencies = [
"version_check",
]
[[package]]
name = "get-size-derive2"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1aac2af9f9a6a50e31b1e541d05b7925add83d3982c2793193fe9d4ee584323c"
dependencies = [
"attribute-derive",
"quote",
"syn",
]
[[package]]
name = "get-size2"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "624a0312efd19e1c45922dfcc2d6806d3ffc4bca261f89f31fcc4f63f438d885"
dependencies = [
"compact_str",
"get-size-derive2",
"hashbrown 0.15.4",
"smallvec",
]
[[package]]
name = "getopts"
version = "0.2.21"
@@ -1377,9 +1470,9 @@ dependencies = [
[[package]]
name = "indexmap"
version = "2.9.0"
version = "2.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cea70ddb795996207ad57735b50c5982d8844f38ba9ee5f1aedcfb708a2aa11e"
checksum = "fe4cd85333e22411419a0bcae1297d25e58c9443848b11dc6a86fefe8c78a661"
dependencies = [
"equivalent",
"hashbrown 0.15.4",
@@ -1455,6 +1548,12 @@ dependencies = [
"serde_json",
]
[[package]]
name = "interpolator"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "71dd52191aae121e8611f1e8dc3e324dd0dd1dee1e6dd91d10ee07a3cfb4d9d8"
[[package]]
name = "intrusive-collections"
version = "0.9.7"
@@ -1716,9 +1815,9 @@ checksum = "241eaef5fd12c88705a01fc1066c48c4b36e0dd4377dcdc7ec3942cea7a69956"
[[package]]
name = "lock_api"
version = "0.4.12"
version = "0.4.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "07af8b9cdd281b7915f413fa73f29ebd5d55d0d3f0155584dade1ff18cea1b17"
checksum = "96936507f153605bddfcda068dd804796c84324ed2510809e5b2a624c81da765"
dependencies = [
"autocfg",
"scopeguard",
@@ -1755,6 +1854,29 @@ dependencies = [
"url",
]
[[package]]
name = "manyhow"
version = "0.11.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b33efb3ca6d3b07393750d4030418d594ab1139cee518f0dc88db70fec873587"
dependencies = [
"manyhow-macros",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "manyhow-macros"
version = "0.11.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "46fce34d199b78b6e6073abf984c9cf5fd3e9330145a93ee0738a7443e371495"
dependencies = [
"proc-macro-utils",
"proc-macro2",
"quote",
]
[[package]]
name = "markdown"
version = "1.0.0"
@@ -1981,9 +2103,9 @@ checksum = "04744f49eae99ab78e0d5c0b603ab218f515ea8cfe5a456d7629ad883a3b6e7d"
[[package]]
name = "ordermap"
version = "0.5.7"
version = "0.5.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7d31b8b7a99f71bdff4235faf9ce9eada0ad3562c8fbeb7d607d9f41a6ec569d"
checksum = "6d6bff06e4a5dc6416bead102d3e63c480dd852ffbb278bf8cfeb4966b329609"
dependencies = [
"indexmap",
"serde",
@@ -2014,6 +2136,16 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]]
name = "papaya"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f92dd0b07c53a0a0c764db2ace8c541dc47320dad97c2200c2a637ab9dd2328f"
dependencies = [
"equivalent",
"seize",
]
[[package]]
name = "parking_lot"
version = "0.12.3"
@@ -2315,6 +2447,17 @@ dependencies = [
"toml_edit",
]
[[package]]
name = "proc-macro-utils"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eeaf08a13de400bc215877b5bdc088f241b12eb42f0a548d3390dc1c56bb7071"
dependencies = [
"proc-macro2",
"quote",
"smallvec",
]
[[package]]
name = "proc-macro2"
version = "1.0.95"
@@ -2391,6 +2534,28 @@ dependencies = [
"proc-macro2",
]
[[package]]
name = "quote-use"
version = "0.8.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9619db1197b497a36178cfc736dc96b271fe918875fbf1344c436a7e93d0321e"
dependencies = [
"quote",
"quote-use-macros",
]
[[package]]
name = "quote-use-macros"
version = "0.8.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82ebfb7faafadc06a7ab141a6f67bcfb24cb8beb158c6fe933f2f035afa99f35"
dependencies = [
"proc-macro-utils",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "r-efi"
version = "5.2.0"
@@ -2564,13 +2729,14 @@ dependencies = [
"anyhow",
"argfile",
"assert_fs",
"bincode",
"bincode 2.0.1",
"bitflags 2.9.1",
"cachedir",
"clap",
"clap_complete_command",
"clearscreen",
"colored 3.0.0",
"dunce",
"filetime",
"globwalk",
"ignore",
@@ -2680,6 +2846,7 @@ dependencies = [
"dunce",
"etcetera",
"filetime",
"get-size2",
"glob",
"ignore",
"insta",
@@ -2795,6 +2962,7 @@ dependencies = [
name = "ruff_index"
version = "0.0.0"
dependencies = [
"get-size2",
"ruff_macros",
"salsa",
"static_assertions",
@@ -2812,6 +2980,7 @@ dependencies = [
"fern",
"glob",
"globset",
"hashbrown 0.15.4",
"imperative",
"insta",
"is-macro",
@@ -2907,6 +3076,7 @@ dependencies = [
"aho-corasick",
"bitflags 2.9.1",
"compact_str",
"get-size2",
"is-macro",
"itertools 0.14.0",
"memchr",
@@ -3006,6 +3176,7 @@ dependencies = [
"bitflags 2.9.1",
"bstr",
"compact_str",
"get-size2",
"insta",
"memchr",
"ruff_annotate_snippets",
@@ -3112,6 +3283,7 @@ dependencies = [
name = "ruff_source_file"
version = "0.0.0"
dependencies = [
"get-size2",
"memchr",
"ruff_text_size",
"serde",
@@ -3121,6 +3293,7 @@ dependencies = [
name = "ruff_text_size"
version = "0.0.0"
dependencies = [
"get-size2",
"schemars",
"serde",
"serde_test",
@@ -3248,8 +3421,8 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=09627e450566f894956710a3fd923dc80462ae6d#09627e450566f894956710a3fd923dc80462ae6d"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
dependencies = [
"boxcar",
"compact_str",
@@ -3259,6 +3432,7 @@ dependencies = [
"hashlink",
"indexmap",
"intrusive-collections",
"papaya",
"parking_lot",
"portable-atomic",
"rayon",
@@ -3272,13 +3446,13 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=09627e450566f894956710a3fd923dc80462ae6d#09627e450566f894956710a3fd923dc80462ae6d"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
[[package]]
name = "salsa-macros"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=09627e450566f894956710a3fd923dc80462ae6d#09627e450566f894956710a3fd923dc80462ae6d"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
dependencies = [
"proc-macro2",
"quote",
@@ -3331,6 +3505,16 @@ version = "4.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1c107b6f4780854c8b126e228ea8869f4d7b71260f962fefb57b996b8959ba6b"
[[package]]
name = "seize"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e4b8d813387d566f627f3ea1b914c068aac94c40ae27ec43f5f33bde65abefe7"
dependencies = [
"libc",
"windows-sys 0.52.0",
]
[[package]]
name = "serde"
version = "1.0.219"
@@ -3531,6 +3715,16 @@ version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a2eb9349b6444b326872e140eb1cf5e7c522154d69e7a0ffb0fb81c06b37543f"
[[package]]
name = "statrs"
version = "0.18.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a3fe7c28c6512e766b0874335db33c94ad7b8f9054228ae1c2abd47ce7d335e"
dependencies = [
"approx",
"num-traits",
]
[[package]]
name = "strip-ansi-escapes"
version = "0.2.1"
@@ -3991,6 +4185,7 @@ dependencies = [
"camino",
"colored 3.0.0",
"crossbeam",
"get-size2",
"globset",
"insta",
"notify",
@@ -4030,6 +4225,7 @@ dependencies = [
"countme",
"dir-test",
"drop_bomb",
"get-size2",
"glob",
"hashbrown 0.15.4",
"indexmap",
@@ -4091,6 +4287,7 @@ dependencies = [
"ty_ide",
"ty_project",
"ty_python_semantic",
"ty_vendored",
]
[[package]]
@@ -4130,6 +4327,7 @@ version = "0.0.0"
dependencies = [
"path-slash",
"ruff_db",
"static_assertions",
"walkdir",
"zip",
]
@@ -4564,11 +4762,10 @@ dependencies = [
[[package]]
name = "which"
version = "7.0.3"
version = "8.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24d643ce3fd3e5b54854602a080f34fb10ab75e0b813ee32d00ca2b44fa74762"
checksum = "d3fabb953106c3c8eea8306e4393700d7657561cb43122571b172bbfb7c7ba1d"
dependencies = [
"either",
"env_home",
"rustix",
"winsafe",

View File

@@ -5,7 +5,7 @@ resolver = "2"
[workspace.package]
# Please update rustfmt.toml when bumping the Rust edition
edition = "2024"
rust-version = "1.85"
rust-version = "1.86"
homepage = "https://docs.astral.sh/ruff"
documentation = "https://docs.astral.sh/ruff"
repository = "https://github.com/astral-sh/ruff"
@@ -62,8 +62,8 @@ camino = { version = "1.1.7" }
clap = { version = "4.5.3", features = ["derive"] }
clap_complete_command = { version = "0.6.0" }
clearscreen = { version = "4.0.0" }
divan = { package = "codspeed-divan-compat", version = "2.10.1" }
codspeed-criterion-compat = { version = "2.6.0", default-features = false }
divan = { package = "codspeed-divan-compat", version = "3.0.2" }
codspeed-criterion-compat = { version = "3.0.2", default-features = false }
colored = { version = "3.0.0" }
console_error_panic_hook = { version = "0.1.7" }
console_log = { version = "1.0.0" }
@@ -79,6 +79,12 @@ etcetera = { version = "0.10.0" }
fern = { version = "0.7.0" }
filetime = { version = "0.2.23" }
getrandom = { version = "0.3.1" }
get-size2 = { version = "0.5.0", features = [
"derive",
"smallvec",
"hashbrown",
"compact-str"
] }
glob = { version = "0.3.1" }
globset = { version = "0.4.14" }
globwalk = { version = "0.9.1" }
@@ -131,7 +137,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "09627e450566f894956710a3fd923dc80462ae6d" }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "fc00eba89e5dcaa5edba51c41aa5f309b5cb126b" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }
@@ -221,6 +227,7 @@ unnecessary_debug_formatting = "allow" # too many instances, the display also d
# Without the hashes we run into a `rustfmt` bug in some snapshot tests, see #13250
needless_raw_string_hashes = "allow"
# Disallowed restriction lints
ignore_without_reason = "allow" # Too many existing instances, and there's no auto fix.
print_stdout = "warn"
print_stderr = "warn"
dbg_macro = "warn"

View File

@@ -423,6 +423,7 @@ Ruff is used by a number of major open-source projects and companies, including:
- [Albumentations](https://github.com/albumentations-team/albumentations)
- Amazon ([AWS SAM](https://github.com/aws/serverless-application-model))
- [Anki](https://apps.ankiweb.net/)
- Anthropic ([Python SDK](https://github.com/anthropics/anthropic-sdk-python))
- [Apache Airflow](https://github.com/apache/airflow)
- AstraZeneca ([Magnus](https://github.com/AstraZeneca/magnus-core))

View File

@@ -68,6 +68,7 @@ ruff_linter = { workspace = true, features = ["clap", "test-rules"] }
assert_fs = { workspace = true }
# Avoid writing colored snapshots when running tests from the terminal
colored = { workspace = true, features = ["no-color"] }
dunce = { workspace = true }
indoc = { workspace = true }
insta = { workspace = true, features = ["filters", "json"] }
insta-cmd = { workspace = true }

View File

@@ -6,7 +6,6 @@ use anyhow::Result;
use bitflags::bitflags;
use colored::Colorize;
use itertools::{Itertools, iterate};
use ruff_linter::codes::NoqaCode;
use ruff_linter::linter::FixTable;
use serde::Serialize;
@@ -15,7 +14,7 @@ use ruff_linter::logging::LogLevel;
use ruff_linter::message::{
AzureEmitter, Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter,
JsonEmitter, JsonLinesEmitter, JunitEmitter, OldDiagnostic, PylintEmitter, RdjsonEmitter,
SarifEmitter, TextEmitter,
SarifEmitter, SecondaryCode, TextEmitter,
};
use ruff_linter::notify_user;
use ruff_linter::settings::flags::{self};
@@ -36,8 +35,8 @@ bitflags! {
}
#[derive(Serialize)]
struct ExpandedStatistics {
code: Option<NoqaCode>,
struct ExpandedStatistics<'a> {
code: Option<&'a SecondaryCode>,
name: &'static str,
count: usize,
fixable: bool,
@@ -303,11 +302,12 @@ impl Printer {
let statistics: Vec<ExpandedStatistics> = diagnostics
.inner
.iter()
.map(|message| (message.noqa_code(), message))
.map(|message| (message.secondary_code(), message))
.sorted_by_key(|(code, message)| (*code, message.fixable()))
.fold(
vec![],
|mut acc: Vec<((Option<NoqaCode>, &OldDiagnostic), usize)>, (code, message)| {
|mut acc: Vec<((Option<&SecondaryCode>, &OldDiagnostic), usize)>,
(code, message)| {
if let Some(((prev_code, _prev_message), count)) = acc.last_mut() {
if *prev_code == code {
*count += 1;
@@ -349,12 +349,7 @@ impl Printer {
);
let code_width = statistics
.iter()
.map(|statistic| {
statistic
.code
.map_or_else(String::new, |rule| rule.to_string())
.len()
})
.map(|statistic| statistic.code.map_or(0, |s| s.len()))
.max()
.unwrap();
let any_fixable = statistics.iter().any(|statistic| statistic.fixable);
@@ -370,7 +365,8 @@ impl Printer {
statistic.count.to_string().bold(),
statistic
.code
.map_or_else(String::new, |rule| rule.to_string())
.map(SecondaryCode::as_str)
.unwrap_or_default()
.red()
.bold(),
if any_fixable {
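The fold above run-length counts diagnostics that were pre-sorted by code. Stripped of the printer types, the counting logic looks like this (a sketch with plain `&str` codes standing in for `SecondaryCode`):

```rust
// Count consecutive equal codes in an already-sorted slice, mirroring the
// fold in `Printer::write_statistics` (codes simplified to plain &str).
fn count_runs<'a>(codes: &[&'a str]) -> Vec<(&'a str, usize)> {
    codes.iter().fold(Vec::new(), |mut acc, &code| {
        match acc.last_mut() {
            // Same code as the previous entry: bump its count.
            Some((prev, count)) if *prev == code => *count += 1,
            // New code: start a fresh (code, count) run.
            _ => acc.push((code, 1)),
        }
        acc
    })
}

fn main() {
    let counts = count_runs(&["E501", "E501", "F401", "F401", "F401", "W291"]);
    assert_eq!(counts, [("E501", 2), ("F401", 3), ("W291", 1)]);
    println!("{counts:?}");
}
```

Sorting first and then counting adjacent runs avoids a hash map and preserves the sorted display order the table needs.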

View File

@@ -612,7 +612,7 @@ fn extend_passed_via_config_argument() {
#[test]
fn nonexistent_extend_file() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
fs::write(
project_dir.join("ruff.toml"),
r#"
@@ -653,7 +653,7 @@ extend = "ruff3.toml"
#[test]
fn circular_extend() -> Result<()> {
let tempdir = TempDir::new()?;
let project_path = tempdir.path().canonicalize()?;
let project_path = dunce::canonicalize(tempdir.path())?;
fs::write(
project_path.join("ruff.toml"),
@@ -698,7 +698,7 @@ extend = "ruff.toml"
#[test]
fn parse_error_extends() -> Result<()> {
let tempdir = TempDir::new()?;
let project_path = tempdir.path().canonicalize()?;
let project_path = dunce::canonicalize(tempdir.path())?;
fs::write(
project_path.join("ruff.toml"),
@@ -2130,7 +2130,7 @@ select = ["UP006"]
#[test]
fn requires_python_no_tool() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("pyproject.toml");
fs::write(
&ruff_toml,
@@ -2441,7 +2441,7 @@ requires-python = ">= 3.11"
#[test]
fn requires_python_no_tool_target_version_override() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("pyproject.toml");
fs::write(
&ruff_toml,
@@ -2752,7 +2752,7 @@ requires-python = ">= 3.11"
#[test]
fn requires_python_no_tool_with_check() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("pyproject.toml");
fs::write(
&ruff_toml,
@@ -2797,7 +2797,7 @@ requires-python = ">= 3.11"
#[test]
fn requires_python_ruff_toml_no_target_fallback() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -3118,7 +3118,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_ruff_toml_no_target_fallback_check() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -3173,7 +3173,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_pyproject_toml_above() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let outer_pyproject = tempdir.path().join("pyproject.toml");
fs::write(
&outer_pyproject,
@@ -3200,7 +3200,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/foo/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/"),(r"(?m)^foo\\test","foo/test")]
@@ -3499,7 +3499,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_pyproject_toml_above_with_tool() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let outer_pyproject = tempdir.path().join("pyproject.toml");
fs::write(
&outer_pyproject,
@@ -3528,7 +3528,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/foo/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/"),(r"foo\\","foo/")]
@@ -3827,7 +3827,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_ruff_toml_above() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -3856,7 +3856,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/foo/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/")]
@@ -4441,7 +4441,7 @@ from typing import Union;foo: Union[int, str] = 1
#[test]
fn requires_python_extend_from_shared_config() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = tempdir.path().canonicalize()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
@@ -4479,7 +4479,7 @@ from typing import Union;foo: Union[int, str] = 1
"#,
)?;
let testpy_canon = testpy.canonicalize()?;
let testpy_canon = dunce::canonicalize(testpy)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&testpy_canon).as_str(), "[TMP]/test.py"),(tempdir_filter(&project_dir).as_str(), "[TMP]/")]
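The repeated `canonicalize` → `dunce::canonicalize` swap exists because on Windows `std::fs::canonicalize` returns verbatim (`\\?\`-prefixed) paths, which break the tempdir snapshot filters. A std-only sketch of the simplification `dunce` performs (the real crate checks several more conditions before stripping the prefix):

```rust
// Drop the Windows verbatim prefix when present (simplified dunce-style
// behavior); non-Windows and already-plain paths pass through unchanged.
fn simplify(path: &str) -> &str {
    path.strip_prefix(r"\\?\").unwrap_or(path)
}

fn main() {
    assert_eq!(simplify(r"\\?\C:\Users\runner\ruff"), r"C:\Users\runner\ruff");
    assert_eq!(simplify("/tmp/ruff"), "/tmp/ruff");
    println!("ok");
}
```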

View File

@@ -12,10 +12,8 @@ fn display_default_settings() -> anyhow::Result<()> {
// Tempdir paths on macOS are symlinks, which doesn't play nicely with
// our snapshot filtering.
let project_dir = tempdir
.path()
.canonicalize()
.context("Failed to canonical tempdir path.")?;
let project_dir =
dunce::canonicalize(tempdir.path()).context("Failed to canonical tempdir path.")?;
std::fs::write(
project_dir.join("pyproject.toml"),

View File

@@ -821,11 +821,7 @@ impl DisplaySourceAnnotation<'_> {
// Length of this annotation as displayed in the stderr output
fn len(&self) -> usize {
// Account for usize underflows
if self.range.1 > self.range.0 {
self.range.1 - self.range.0
} else {
self.range.0 - self.range.1
}
self.range.1.abs_diff(self.range.0)
}
fn takes_space(&self) -> bool {

View File
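The `len` change above replaces a manual underflow-avoiding branch with `usize::abs_diff`. A minimal sketch of why the two are equivalent:

```rust
fn main() {
    let (start, end): (usize, usize) = (3, 10);
    // `usize::abs_diff` (stable since Rust 1.60) computes |a - b| without
    // ever underflowing, matching the old if/else in either operand order.
    assert_eq!(end.abs_diff(start), 7);
    assert_eq!(start.abs_diff(end), 7);
}
```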

@@ -348,10 +348,10 @@ fn benchmark_many_tuple_assignments(criterion: &mut Criterion) {
});
}
fn benchmark_many_attribute_assignments(criterion: &mut Criterion) {
fn benchmark_complex_constrained_attributes_1(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("ty_micro[many_attribute_assignments]", |b| {
criterion.bench_function("ty_micro[complex_constrained_attributes_1]", |b| {
b.iter_batched_ref(
|| {
// This is a regression benchmark for https://github.com/astral-sh/ty/issues/627.
@@ -400,6 +400,47 @@ fn benchmark_many_attribute_assignments(criterion: &mut Criterion) {
});
}
fn benchmark_complex_constrained_attributes_2(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("ty_micro[complex_constrained_attributes_2]", |b| {
b.iter_batched_ref(
|| {
// This is similar to the case above, but now the attributes are actually defined.
// https://github.com/astral-sh/ty/issues/711
setup_micro_case(
r#"
class C:
def f(self: "C"):
self.a = ""
self.b = ""
if isinstance(self.a, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
if isinstance(self.b, str):
return
"#,
)
},
|case| {
let Case { db, .. } = case;
let result = db.check();
assert_eq!(result.len(), 0);
},
BatchSize::SmallInput,
);
});
}
struct ProjectBenchmark<'a> {
project: InstalledProject<'a>,
fs: MemoryFileSystem,
@@ -534,7 +575,8 @@ criterion_group!(
micro,
benchmark_many_string_assignments,
benchmark_many_tuple_assignments,
benchmark_many_attribute_assignments,
benchmark_complex_constrained_attributes_1,
benchmark_complex_constrained_attributes_2,
);
criterion_group!(project, anyio, attrs, hydra);
criterion_main!(check_file, micro, project);

View File

@@ -14,10 +14,10 @@ license = { workspace = true }
ruff_annotate_snippets = { workspace = true }
ruff_cache = { workspace = true, optional = true }
ruff_notebook = { workspace = true }
ruff_python_ast = { workspace = true }
ruff_python_ast = { workspace = true, features = ["get-size"] }
ruff_python_parser = { workspace = true }
ruff_python_trivia = { workspace = true }
ruff_source_file = { workspace = true }
ruff_source_file = { workspace = true, features = ["get-size"] }
ruff_text_size = { workspace = true }
anstyle = { workspace = true }
@@ -27,6 +27,7 @@ countme = { workspace = true }
dashmap = { workspace = true }
dunce = { workspace = true }
filetime = { workspace = true }
get-size2 = { workspace = true }
glob = { workspace = true }
ignore = { workspace = true, optional = true }
matchit = { workspace = true }

View File

@@ -19,7 +19,7 @@ mod stylesheet;
/// characteristics in the inputs given to the tool. Typically, but not always,
/// a characteristic is a deficiency. An example of a characteristic that is
/// _not_ a deficiency is the `reveal_type` diagnostic for our type checker.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct Diagnostic {
/// The actual diagnostic.
///
@@ -270,7 +270,7 @@ impl Diagnostic {
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
struct DiagnosticInner {
id: DiagnosticId,
severity: Severity,
@@ -342,7 +342,7 @@ impl Eq for RenderingSortKey<'_> {}
/// Currently, the order in which sub-diagnostics are rendered relative to one
/// another (for a single parent diagnostic) is the order in which they were
/// attached to the diagnostic.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct SubDiagnostic {
/// Like with `Diagnostic`, we box the `SubDiagnostic` to make it
/// pointer-sized.
@@ -443,7 +443,7 @@ impl SubDiagnostic {
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
struct SubDiagnosticInner {
severity: Severity,
message: DiagnosticMessage,
@@ -471,7 +471,7 @@ struct SubDiagnosticInner {
///
/// Messages attached to annotations should also be as brief and specific as
/// possible. Long messages could negatively impact the quality of rendering.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct Annotation {
/// The span of this annotation, corresponding to some subsequence of the
/// user's input that we want to highlight.
@@ -591,7 +591,7 @@ impl Annotation {
///
/// These tags are used to provide additional information about the annotation
/// and are passed through to the language server protocol.
#[derive(Debug, Clone, Eq, PartialEq)]
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
pub enum DiagnosticTag {
/// Unused or unnecessary code. Used for unused parameters, unreachable code, etc.
Unnecessary,
@@ -605,7 +605,7 @@ pub enum DiagnosticTag {
/// be in kebab case, e.g. `no-foo` (all lower case).
///
/// Rules use kebab case, e.g. `no-foo`.
#[derive(Debug, Copy, Clone, PartialOrd, Ord, PartialEq, Eq, Hash)]
#[derive(Debug, Copy, Clone, PartialOrd, Ord, PartialEq, Eq, Hash, get_size2::GetSize)]
pub struct LintName(&'static str);
impl LintName {
@@ -645,7 +645,7 @@ impl PartialEq<&str> for LintName {
}
/// Uniquely identifies the kind of a diagnostic.
#[derive(Debug, Copy, Clone, Eq, PartialEq, PartialOrd, Ord, Hash)]
#[derive(Debug, Copy, Clone, Eq, PartialEq, PartialOrd, Ord, Hash, get_size2::GetSize)]
pub enum DiagnosticId {
Panic,
@@ -800,7 +800,7 @@ impl std::fmt::Display for DiagnosticId {
///
/// This enum presents a unified interface to these two types for the sake of creating [`Span`]s and
/// emitting diagnostics from both ty and ruff.
#[derive(Debug, Clone, PartialEq, Eq)]
#[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
pub enum UnifiedFile {
Ty(File),
Ruff(SourceFile),
@@ -852,7 +852,7 @@ impl DiagnosticSource {
/// It consists of a `File` and an optional range into that file. When the
/// range isn't present, it semantically implies that the diagnostic refers to
/// the entire file. For example, when the file should be executable but isn't.
#[derive(Debug, Clone, PartialEq, Eq)]
#[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
pub struct Span {
file: UnifiedFile,
range: Option<TextRange>,
@@ -924,7 +924,7 @@ impl From<crate::files::FileRange> for Span {
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd, get_size2::GetSize)]
pub enum Severity {
Info,
Warning,
@@ -1087,7 +1087,7 @@ impl std::fmt::Display for ConciseMessage<'_> {
/// In most cases, callers shouldn't need to use this. Instead, there is
/// a blanket trait implementation for `IntoDiagnosticMessage` for
/// anything that implements `std::fmt::Display`.
#[derive(Clone, Debug, Eq, PartialEq)]
#[derive(Clone, Debug, Eq, PartialEq, get_size2::GetSize)]
pub struct DiagnosticMessage(Box<str>);
impl DiagnosticMessage {

View File

@@ -637,6 +637,22 @@ pub trait FileResolver {
fn input(&self, file: File) -> Input;
}
impl<T> FileResolver for T
where
T: Db,
{
fn path(&self, file: File) -> &str {
relativize_path(self.system().current_directory(), file.path(self).as_str())
}
fn input(&self, file: File) -> Input {
Input {
text: source_text(self, file),
line_index: line_index(self, file),
}
}
}
impl FileResolver for &dyn Db {
fn path(&self, file: File) -> &str {
relativize_path(self.system().current_directory(), file.path(*self).as_str())
@@ -708,7 +724,6 @@ fn relativize_path<'p>(cwd: &SystemPath, path: &'p str) -> &'p str {
#[cfg(test)]
mod tests {
use crate::Upcast;
use crate::diagnostic::{Annotation, DiagnosticId, Severity, Span};
use crate::files::system_path_to_file;
use crate::system::{DbWithWritableSystem, SystemPath};
@@ -2221,7 +2236,7 @@ watermelon
///
/// (This will set the "printed" flag on `Diagnostic`.)
fn render(&self, diag: &Diagnostic) -> String {
diag.display(&self.db.upcast(), &self.config).to_string()
diag.display(&self.db, &self.config).to_string()
}
}

View File
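The new blanket impl above means any `T: Db` automatically implements `FileResolver`, which is what allows the `Upcast` trait to be deleted later in this diff. A toy reproduction of the pattern (the trait bodies here are illustrative, not the real `ruff_db` definitions):

```rust
// Every `T: Db` automatically implements `FileResolver`, so callers no
// longer need an upcasting shim like `self.db.upcast()`.
trait Db {
    fn current_directory(&self) -> &str;
}

trait FileResolver {
    fn path(&self, file: &str) -> String;
}

impl<T: Db> FileResolver for T {
    fn path(&self, file: &str) -> String {
        // Stand-in for `relativize_path(self.system().current_directory(), ...)`.
        format!("{}/{}", self.current_directory(), file)
    }
}

struct TestDb;

impl Db for TestDb {
    fn current_directory(&self) -> &str {
        "/project"
    }
}

fn main() {
    // The blanket impl provides `path` with no explicit `FileResolver` impl.
    assert_eq!(TestDb.path("src/lib.rs"), "/project/src/lib.rs");
}
```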

@@ -263,12 +263,23 @@ impl Files {
impl fmt::Debug for Files {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut map = f.debug_map();
if f.alternate() {
let mut map = f.debug_map();
for entry in self.inner.system_by_path.iter() {
map.entry(entry.key(), entry.value());
for entry in self.inner.system_by_path.iter() {
map.entry(entry.key(), entry.value());
}
map.finish()
} else {
f.debug_struct("Files")
.field("system_by_path", &self.inner.system_by_path.len())
.field(
"system_virtual_by_path",
&self.inner.system_virtual_by_path.len(),
)
.field("vendored_by_path", &self.inner.vendored_by_path.len())
.finish()
}
map.finish()
}
}
@@ -308,6 +319,9 @@ pub struct File {
count: Count<File>,
}
// The Salsa heap is tracked separately.
impl get_size2::GetSize for File {}
impl File {
/// Reads the content of the file into a [`String`].
///

View File

@@ -36,12 +36,6 @@ pub trait Db: salsa::Database {
fn python_version(&self) -> PythonVersion;
}
/// Trait for upcasting a reference to a base trait object.
pub trait Upcast<T: ?Sized> {
fn upcast(&self) -> &T;
fn upcast_mut(&mut self) -> &mut T;
}
/// Returns the maximum number of tasks that ty is allowed
/// to process in parallel.
///
@@ -76,11 +70,11 @@ pub trait RustDoc {
mod tests {
use std::sync::{Arc, Mutex};
use crate::Db;
use crate::files::Files;
use crate::system::TestSystem;
use crate::system::{DbWithTestSystem, System};
use crate::vendored::VendoredFileSystem;
use crate::{Db, Upcast};
type Events = Arc<Mutex<Vec<salsa::Event>>>;
@@ -153,15 +147,6 @@ mod tests {
}
}
impl Upcast<dyn Db> for TestDb {
fn upcast(&self) -> &(dyn Db + 'static) {
self
}
fn upcast_mut(&mut self) -> &mut (dyn Db + 'static) {
self
}
}
impl DbWithTestSystem for TestDb {
fn test_system(&self) -> &TestSystem {
&self.system

View File

@@ -2,6 +2,7 @@ use std::fmt::Formatter;
use std::sync::Arc;
use arc_swap::ArcSwapOption;
use get_size2::GetSize;
use ruff_python_ast::{AnyRootNodeRef, ModModule, NodeIndex};
use ruff_python_parser::{ParseOptions, Parsed, parse_unchecked};
@@ -20,7 +21,7 @@ use crate::source::source_text;
/// reflected in the changed AST offsets.
/// The other reason is that Ruff's AST doesn't implement `Eq` which Salsa requires
/// for determining if a query result is unchanged.
#[salsa::tracked(returns(ref), no_eq)]
#[salsa::tracked(returns(ref), no_eq, heap_size=get_size2::GetSize::get_heap_size)]
pub fn parsed_module(db: &dyn Db, file: File) -> ParsedModule {
let _span = tracing::trace_span!("parsed_module", ?file).entered();
@@ -44,9 +45,10 @@ pub fn parsed_module_impl(db: &dyn Db, file: File) -> Parsed<ModModule> {
///
/// This type manages instances of the module AST. A particular instance of the AST
/// is represented with the [`ParsedModuleRef`] type.
#[derive(Clone)]
#[derive(Clone, get_size2::GetSize)]
pub struct ParsedModule {
file: File,
#[get_size(size_fn = arc_swap_size)]
inner: Arc<ArcSwapOption<indexed::IndexedModule>>,
}
@@ -142,6 +144,18 @@ impl std::ops::Deref for ParsedModuleRef {
}
}
/// Returns the heap-size of the currently stored `T` in the `ArcSwap`.
fn arc_swap_size<T>(arc_swap: &Arc<ArcSwapOption<T>>) -> usize
where
T: GetSize,
{
if let Some(value) = &*arc_swap.load() {
T::get_heap_size(value)
} else {
0
}
}
mod indexed {
use std::sync::Arc;
@@ -150,7 +164,7 @@ mod indexed {
use ruff_python_parser::Parsed;
/// A wrapper around the AST that allows access to AST nodes by index.
#[derive(Debug)]
#[derive(Debug, get_size2::GetSize)]
pub struct IndexedModule {
index: Box<[AnyRootNodeRef<'static>]>,
pub parsed: Parsed<ModModule>,

View File

@@ -11,7 +11,7 @@ use crate::Db;
use crate::files::{File, FilePath};
/// Reads the source text of a python text file (must be valid UTF8) or notebook.
#[salsa::tracked]
#[salsa::tracked(heap_size=get_size2::GetSize::get_heap_size)]
pub fn source_text(db: &dyn Db, file: File) -> SourceText {
let path = file.path(db);
let _span = tracing::trace_span!("source_text", file = %path).entered();
@@ -65,7 +65,7 @@ fn is_notebook(path: &FilePath) -> bool {
/// The file containing the source text can either be a text file or a notebook.
///
/// Cheap cloneable in `O(1)`.
#[derive(Clone, Eq, PartialEq)]
#[derive(Clone, Eq, PartialEq, get_size2::GetSize)]
pub struct SourceText {
inner: Arc<SourceTextInner>,
}
@@ -123,8 +123,9 @@ impl std::fmt::Debug for SourceText {
}
}
#[derive(Eq, PartialEq)]
#[derive(Eq, PartialEq, get_size2::GetSize)]
struct SourceTextInner {
#[get_size(ignore)]
count: Count<SourceText>,
kind: SourceTextKind,
read_error: Option<SourceTextError>,
@@ -136,6 +137,19 @@ enum SourceTextKind {
Notebook(Box<Notebook>),
}
impl get_size2::GetSize for SourceTextKind {
fn get_heap_size(&self) -> usize {
match self {
SourceTextKind::Text(text) => text.get_heap_size(),
// TODO: The `get-size` derive does not support ignoring enum variants.
//
// Jupyter notebooks are not very relevant for memory profiling, and contain
// arbitrary JSON values that do not implement the `GetSize` trait.
SourceTextKind::Notebook(_) => 0,
}
}
}
impl From<String> for SourceTextKind {
fn from(value: String) -> Self {
SourceTextKind::Text(value)
@@ -148,7 +162,7 @@ impl From<Notebook> for SourceTextKind {
}
}
#[derive(Debug, thiserror::Error, PartialEq, Eq, Clone)]
#[derive(Debug, thiserror::Error, PartialEq, Eq, Clone, get_size2::GetSize)]
pub enum SourceTextError {
#[error("Failed to read notebook: {0}")]
FailedToReadNotebook(String),
@@ -157,7 +171,7 @@ pub enum SourceTextError {
}
/// Computes the [`LineIndex`] for `file`.
#[salsa::tracked]
#[salsa::tracked(heap_size=get_size2::GetSize::get_heap_size)]
pub fn line_index(db: &dyn Db, file: File) -> LineIndex {
let _span = tracing::trace_span!("line_index", ?file).entered();

View File

@@ -124,6 +124,11 @@ pub trait System: Debug {
/// Returns `None` if no such convention exists for the system.
fn user_config_directory(&self) -> Option<SystemPathBuf>;
/// Returns the directory path where cached files are stored.
///
/// Returns `None` if no such convention exists for the system.
fn cache_dir(&self) -> Option<SystemPathBuf>;
/// Iterate over the contents of the directory at `path`.
///
/// The returned iterator must have the following properties:
@@ -186,6 +191,9 @@ pub trait System: Debug {
Err(std::env::VarError::NotPresent)
}
/// Returns a handle to a [`WritableSystem`] if this system is writeable.
fn as_writable(&self) -> Option<&dyn WritableSystem>;
fn as_any(&self) -> &dyn std::any::Any;
fn as_any_mut(&mut self) -> &mut dyn std::any::Any;
@@ -226,11 +234,52 @@ impl fmt::Display for CaseSensitivity {
/// System trait for non-readonly systems.
pub trait WritableSystem: System {
/// Creates a file at the given path.
///
/// Returns an error if the file already exists.
fn create_new_file(&self, path: &SystemPath) -> Result<()>;
/// Writes the given content to the file at the given path.
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()>;
/// Creates a directory at `path` as well as any intermediate directories.
fn create_directory_all(&self, path: &SystemPath) -> Result<()>;
/// Reads the provided file from the system cache, or creates the file if necessary.
///
/// Returns `Ok(None)` if the system does not expose a suitable cache directory.
fn get_or_cache(
&self,
path: &SystemPath,
read_contents: &dyn Fn() -> Result<String>,
) -> Result<Option<SystemPathBuf>> {
let Some(cache_dir) = self.cache_dir() else {
return Ok(None);
};
let cache_path = cache_dir.join(path);
// The file has already been cached.
if self.is_file(&cache_path) {
return Ok(Some(cache_path));
}
// Read the file contents.
let contents = read_contents()?;
// Create the parent directory.
self.create_directory_all(cache_path.parent().unwrap())?;
// Create and write to the file on the system.
//
// Note that `create_new_file` will fail if the file has already been created. This
// ensures that only one thread/process ever attempts to write to it to avoid corrupting
// the cache.
self.create_new_file(&cache_path)?;
self.write_file(&cache_path, &contents)?;
Ok(Some(cache_path))
}
}
#[derive(Clone, Debug, Eq, PartialEq)]

View File
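The `get_or_cache` default method above leans on `create_new_file` failing when the entry already exists, so only one writer ever fills a cache entry. A standalone sketch of the same fill-once pattern using only `std` (the paths and file names are illustrative):

```rust
use std::fs;
use std::io::Write;
use std::path::{Path, PathBuf};

// Fill-once cache pattern: `File::create_new` (stable since Rust 1.77)
// errors if the file already exists, so at most one writer fills an entry.
fn get_or_cache(cache_dir: &Path, name: &str, contents: &str) -> std::io::Result<PathBuf> {
    let cache_path = cache_dir.join(name);
    // Fast path: the entry has already been cached.
    if cache_path.is_file() {
        return Ok(cache_path);
    }
    fs::create_dir_all(cache_path.parent().unwrap())?;
    // Create-exclusive write: a concurrent second writer fails here instead
    // of corrupting a half-written entry.
    let mut file = fs::File::create_new(&cache_path)?;
    file.write_all(contents.as_bytes())?;
    Ok(cache_path)
}

fn main() -> std::io::Result<()> {
    let dir = std::env::temp_dir().join("ty_cache_demo");
    let _ = fs::remove_dir_all(&dir);
    let path = get_or_cache(&dir, "typeshed/builtins.pyi", "class object: ...")?;
    assert_eq!(fs::read_to_string(&path)?, "class object: ...");
    // A second call hits the cache; the original contents are preserved.
    let again = get_or_cache(&dir, "typeshed/builtins.pyi", "ignored")?;
    assert_eq!(fs::read_to_string(&again)?, "class object: ...");
    Ok(())
}
```

Note that the trait version in the diff splits this into `create_new_file` plus `write_file` so in-memory test systems can share the default method.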

@@ -1,4 +1,5 @@
use std::collections::BTreeMap;
use std::collections::{BTreeMap, btree_map};
use std::io;
use std::iter::FusedIterator;
use std::sync::{Arc, RwLock, RwLockWriteGuard};
@@ -153,6 +154,26 @@ impl MemoryFileSystem {
virtual_files.contains_key(&path.to_path_buf())
}
pub(crate) fn create_new_file(&self, path: &SystemPath) -> Result<()> {
let normalized = self.normalize_path(path);
let mut by_path = self.inner.by_path.write().unwrap();
match by_path.entry(normalized) {
btree_map::Entry::Vacant(entry) => {
entry.insert(Entry::File(File {
content: String::new(),
last_modified: file_time_now(),
}));
Ok(())
}
btree_map::Entry::Occupied(_) => Err(io::Error::new(
io::ErrorKind::AlreadyExists,
"File already exists",
)),
}
}
/// Stores a new file in the file system.
///
/// The operation overrides the content for an existing file with the same normalized `path`.
@@ -278,14 +299,14 @@ impl MemoryFileSystem {
let normalized = fs.normalize_path(path);
match by_path.entry(normalized) {
std::collections::btree_map::Entry::Occupied(entry) => match entry.get() {
btree_map::Entry::Occupied(entry) => match entry.get() {
Entry::File(_) => {
entry.remove();
Ok(())
}
Entry::Directory(_) => Err(is_a_directory()),
},
std::collections::btree_map::Entry::Vacant(_) => Err(not_found()),
btree_map::Entry::Vacant(_) => Err(not_found()),
}
}
@@ -345,14 +366,14 @@ impl MemoryFileSystem {
}
match by_path.entry(normalized.clone()) {
std::collections::btree_map::Entry::Occupied(entry) => match entry.get() {
btree_map::Entry::Occupied(entry) => match entry.get() {
Entry::Directory(_) => {
entry.remove();
Ok(())
}
Entry::File(_) => Err(not_a_directory()),
},
std::collections::btree_map::Entry::Vacant(_) => Err(not_found()),
btree_map::Entry::Vacant(_) => Err(not_found()),
}
}

View File

@@ -160,6 +160,39 @@ impl System for OsSystem {
None
}
/// Returns an absolute cache directory on the system.
///
/// On Linux and macOS, uses `$XDG_CACHE_HOME/ty` or `.cache/ty`.
/// On Windows, uses `C:\Users\User\AppData\Local\ty\cache`.
#[cfg(not(target_arch = "wasm32"))]
fn cache_dir(&self) -> Option<SystemPathBuf> {
use etcetera::BaseStrategy as _;
let cache_dir = etcetera::base_strategy::choose_base_strategy()
.ok()
.map(|dirs| dirs.cache_dir().join("ty"))
.map(|cache_dir| {
if cfg!(windows) {
// On Windows, we append `cache` to the LocalAppData directory, i.e., prefer
// `C:\Users\User\AppData\Local\ty\cache` over `C:\Users\User\AppData\Local\ty`.
cache_dir.join("cache")
} else {
cache_dir
}
})
.and_then(|path| SystemPathBuf::from_path_buf(path).ok())
.unwrap_or_else(|| SystemPathBuf::from(".ty_cache"));
Some(cache_dir)
}
// TODO: Remove this feature gating once `ruff_wasm` no longer indirectly depends on `ruff_db` with the
// `os` feature enabled (via `ruff_workspace` -> `ruff_graph` -> `ruff_db`).
#[cfg(target_arch = "wasm32")]
fn cache_dir(&self) -> Option<SystemPathBuf> {
None
}
/// Creates a builder to recursively walk `path`.
///
/// The walker ignores files according to [`ignore::WalkBuilder::standard_filters`]
@@ -192,6 +225,10 @@ impl System for OsSystem {
})
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn Any {
self
}
@@ -310,6 +347,10 @@ impl OsSystem {
}
impl WritableSystem for OsSystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
std::fs::File::create_new(path).map(drop)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
std::fs::write(path.as_std_path(), content)
}

View File
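The `OsSystem::cache_dir` above delegates platform lookup to the `etcetera` crate. A simplified, dependency-free sketch of the Linux/macOS fallback order described in its doc comment, taking the environment values as parameters rather than reading them globally:

```rust
use std::path::PathBuf;

// Prefer $XDG_CACHE_HOME, fall back to $HOME/.cache, then append the tool
// name. (The real code's `etcetera` also handles Windows conventions.)
fn cache_dir(xdg_cache_home: Option<&str>, home: Option<&str>, tool: &str) -> Option<PathBuf> {
    let base = match (xdg_cache_home, home) {
        (Some(xdg), _) => PathBuf::from(xdg),
        (None, Some(home)) => PathBuf::from(home).join(".cache"),
        (None, None) => return None,
    };
    Some(base.join(tool))
}

fn main() {
    assert_eq!(
        cache_dir(Some("/tmp/xdg"), None, "ty"),
        Some(PathBuf::from("/tmp/xdg/ty"))
    );
    assert_eq!(
        cache_dir(None, Some("/home/u"), "ty"),
        Some(PathBuf::from("/home/u/.cache/ty"))
    );
    assert_eq!(cache_dir(None, None, "ty"), None);
}
```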

@@ -503,6 +503,12 @@ impl ToOwned for SystemPath {
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct SystemPathBuf(#[cfg_attr(feature = "schemars", schemars(with = "String"))] Utf8PathBuf);
impl get_size2::GetSize for SystemPathBuf {
fn get_heap_size(&self) -> usize {
self.0.capacity()
}
}
impl SystemPathBuf {
pub fn new() -> Self {
Self(Utf8PathBuf::new())

View File
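The manual impls above (for `SystemPathBuf` here, and `VendoredPathBuf` later) follow the `get-size2` convention that `get_heap_size` reports only heap-owned bytes, not the inline size of the value itself. A hand-rolled sketch of that convention (the trait name is illustrative; the real crate is `get-size2`):

```rust
// Stand-in for `get_size2::GetSize`: heap footprint only.
trait GetHeapSize {
    fn get_heap_size(&self) -> usize;
}

impl GetHeapSize for String {
    // Mirrors the path-buffer impls above: the heap footprint of an owned
    // string is its allocated capacity, even if not all of it is used.
    fn get_heap_size(&self) -> usize {
        self.capacity()
    }
}

impl<T: GetHeapSize> GetHeapSize for Option<T> {
    fn get_heap_size(&self) -> usize {
        self.as_ref().map_or(0, GetHeapSize::get_heap_size)
    }
}

fn main() {
    let s = String::with_capacity(32);
    assert!(s.get_heap_size() >= 32);
    assert_eq!(None::<String>.get_heap_size(), 0);
}
```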

@@ -102,6 +102,10 @@ impl System for TestSystem {
self.system().user_config_directory()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
self.system().cache_dir()
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -123,6 +127,10 @@ impl System for TestSystem {
self.system().glob(pattern)
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
@@ -149,6 +157,10 @@ impl Default for TestSystem {
}
impl WritableSystem for TestSystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
self.system().create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.system().write_file(path, content)
}
@@ -335,6 +347,10 @@ impl System for InMemorySystem {
self.user_config_directory.lock().unwrap().clone()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
None
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -357,6 +373,10 @@ impl System for InMemorySystem {
Ok(Box::new(iterator))
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
@@ -377,6 +397,10 @@ impl System for InMemorySystem {
}
impl WritableSystem for InMemorySystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
self.memory_fs.create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.memory_fs.write_file(path, content)
}

View File

@@ -212,7 +212,7 @@ impl Display for Error {
path: Some(path),
err,
} => {
write!(f, "IO error for operation on {}: {}", path, err)
write!(f, "IO error for operation on {path}: {err}")
}
ErrorKind::Io { path: None, err } => err.fmt(f),
ErrorKind::NonUtf8Path { path } => {

View File

@@ -4,12 +4,12 @@ use std::fmt::{self, Debug};
use std::io::{self, Read, Write};
use std::sync::{Arc, Mutex, MutexGuard};
use crate::file_revision::FileRevision;
use zip::result::ZipResult;
use zip::write::FileOptions;
use zip::{CompressionMethod, ZipArchive, ZipWriter, read::ZipFile};
pub use self::path::{VendoredPath, VendoredPathBuf};
use crate::file_revision::FileRevision;
mod path;

View File

@@ -87,6 +87,12 @@ impl ToOwned for VendoredPath {
#[derive(Debug, Eq, PartialEq, Clone, Hash)]
pub struct VendoredPathBuf(Utf8PathBuf);
impl get_size2::GetSize for VendoredPathBuf {
fn get_heap_size(&self) -> usize {
self.0.capacity()
}
}
impl Default for VendoredPathBuf {
fn default() -> Self {
Self::new()

View File

@@ -114,6 +114,7 @@ fn generate_set(output: &mut String, set: Set, parents: &mut Vec<Set>) {
parents.pop();
}
#[derive(Debug)]
enum Set {
Toplevel(OptionSet),
Named { name: String, set: OptionSet },
@@ -136,7 +137,7 @@ impl Set {
}
fn emit_field(output: &mut String, name: &str, field: &OptionField, parents: &[Set]) {
let header_level = if parents.is_empty() { "###" } else { "####" };
let header_level = "#".repeat(parents.len() + 1);
let _ = writeln!(output, "{header_level} `{name}`");

View File

@@ -73,12 +73,20 @@ fn generate_markdown() -> String {
for lint in lints {
let _ = writeln!(&mut output, "## `{rule_name}`\n", rule_name = lint.name());
// Increase the header-level by one
// Reformat headers as bold text
let mut in_code_fence = false;
let documentation = lint
.documentation_lines()
.map(|line| {
if line.starts_with('#') {
Cow::Owned(format!("#{line}"))
// Toggle the code fence state if we encounter a boundary
if line.starts_with("```") {
in_code_fence = !in_code_fence;
}
if !in_code_fence && line.starts_with('#') {
Cow::Owned(format!(
"**{line}**\n",
line = line.trim_start_matches('#').trim_start()
))
} else {
Cow::Borrowed(line)
}
@@ -87,21 +95,15 @@ fn generate_markdown() -> String {
let _ = writeln!(
&mut output,
r#"**Default level**: {level}
<details>
<summary>{summary}</summary>
r#"<small>
Default level: [`{level}`](../rules.md#rule-levels "This lint has a default level of '{level}'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20{encoded_name}) ·
[View source](https://github.com/astral-sh/ruff/blob/main/{file}#L{line})
</small>
{documentation}
### Links
* [Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20{encoded_name})
* [View source](https://github.com/astral-sh/ruff/blob/main/{file}#L{line})
</details>
"#,
level = lint.default_level(),
// GitHub doesn't support markdown in `summary` headers
summary = replace_inline_code(lint.summary()),
encoded_name = url::form_urlencoded::byte_serialize(lint.name().as_str().as_bytes())
.collect::<String>(),
file = url::form_urlencoded::byte_serialize(lint.file().replace('\\', "/").as_bytes())
@@ -113,25 +115,6 @@ fn generate_markdown() -> String {
output
}
/// Replaces inline code blocks (`code`) with `<code>code</code>`
fn replace_inline_code(input: &str) -> String {
let mut output = String::new();
let mut parts = input.split('`');
while let Some(before) = parts.next() {
if let Some(between) = parts.next() {
output.push_str(before);
output.push_str("<code>");
output.push_str(between);
output.push_str("</code>");
} else {
output.push_str(before);
}
}
output
}
#[cfg(test)]
mod tests {
use anyhow::Result;

View File
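The fence-tracking rewrite above can be sketched in isolation. This simplified version (it omits the trailing blank line the generator appends after each bold header) shows why the toggle matters: `#` lines inside code fences are Python comments, not Markdown headers:

```rust
// Rewrite Markdown headers as bold text, leaving fenced code untouched.
fn rewrite_headers(doc: &str) -> String {
    let mut in_fence = false;
    let mut out = Vec::new();
    for line in doc.lines() {
        // Toggle the code-fence state at each ``` boundary.
        if line.starts_with("```") {
            in_fence = !in_fence;
        }
        if !in_fence && line.starts_with('#') {
            out.push(format!("**{}**", line.trim_start_matches('#').trim_start()));
        } else {
            out.push(line.to_string());
        }
    }
    out.join("\n")
}

fn main() {
    let doc = "## Why\n```py\n# a comment\n```";
    // The header becomes bold; the Python comment survives unchanged.
    assert_eq!(rewrite_headers(doc), "**Why**\n```py\n# a comment\n```");
}
```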

@@ -2,10 +2,10 @@ use anyhow::{Context, Result};
use std::sync::Arc;
use zip::CompressionMethod;
use ruff_db::Db as SourceDb;
use ruff_db::files::{File, Files};
use ruff_db::system::{OsSystem, System, SystemPathBuf};
use ruff_db::vendored::{VendoredFileSystem, VendoredFileSystemBuilder};
use ruff_db::{Db as SourceDb, Upcast};
use ruff_python_ast::PythonVersion;
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{
@@ -66,15 +66,6 @@ impl ModuleDb {
}
}
impl Upcast<dyn SourceDb> for ModuleDb {
fn upcast(&self) -> &(dyn SourceDb + 'static) {
self
}
fn upcast_mut(&mut self) -> &mut (dyn SourceDb + 'static) {
self
}
}
#[salsa::db]
impl SourceDb for ModuleDb {
fn vendored(&self) -> &VendoredFileSystem {

View File

@@ -14,6 +14,7 @@ license = { workspace = true }
doctest = false
[dependencies]
get-size2 = { workspace = true }
ruff_macros = { workspace = true }
salsa = { workspace = true, optional = true }

View File

@@ -6,7 +6,7 @@ use std::marker::PhantomData;
use std::ops::{Deref, DerefMut, RangeBounds};
/// An owned sequence of `T` indexed by `I`
#[derive(Clone, PartialEq, Eq, Hash)]
#[derive(Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
#[repr(transparent)]
pub struct IndexVec<I, T> {
pub raw: Vec<T>,
@@ -191,6 +191,6 @@ where
#[expect(unsafe_code)]
unsafe fn maybe_update(old_pointer: *mut Self, new_value: Self) -> bool {
let old_vec: &mut IndexVec<I, T> = unsafe { &mut *old_pointer };
unsafe { salsa::Update::maybe_update(&mut old_vec.raw, new_value.raw) }
unsafe { salsa::Update::maybe_update(&raw mut old_vec.raw, new_value.raw) }
}
}

View File

@@ -38,6 +38,7 @@ colored = { workspace = true }
fern = { workspace = true }
glob = { workspace = true }
globset = { workspace = true }
hashbrown = { workspace = true }
imperative = { workspace = true }
is-macro = { workspace = true }
is-wsl = { workspace = true }

View File

@@ -0,0 +1,5 @@
foo or{x: None for x in bar}
# C420 fix must make sure to insert a leading space if needed,
# See https://github.com/astral-sh/ruff/issues/18599

View File

@@ -0,0 +1,2 @@
#!/usr/bin/env -S uv tool run ruff check --isolated --select EXE003
print("hello world")

View File

@@ -0,0 +1,2 @@
#!/usr/bin/env -S uvx ruff check --isolated --select EXE003
print("hello world")

View File

@@ -119,4 +119,26 @@ field35: "int | str | int" # Error
# Technically, this falls into the domain of the rule but it is an unlikely edge case,
# only works if you have `from __future__ import annotations` at the top of the file,
# and stringified annotations are discouraged in stub files.
field36: "int | str" | int # Ok
field36: "int | str" | int # Ok
# https://github.com/astral-sh/ruff/issues/18546
# Expand Optional[T] to Union[T, None]
# OK
field37: typing.Optional[int]
field38: typing.Union[int, None]
# equivalent to None
field39: typing.Optional[None]
# equivalent to int | None
field40: typing.Union[typing.Optional[int], None]
field41: typing.Optional[typing.Union[int, None]]
field42: typing.Union[typing.Optional[int], typing.Optional[int]]
field43: typing.Optional[int] | None
field44: typing.Optional[int | None]
field45: typing.Optional[int] | typing.Optional[int]
# equivalent to int | dict | None
field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
field47: typing.Optional[int] | typing.Optional[dict]
# avoid reporting twice
field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
field49: typing.Optional[complex | complex] | complex

View File

@@ -111,3 +111,25 @@ field33: typing.Union[typing.Union[int | int] | typing.Union[int | int]] # Error
# Test case for mixed union type
field34: typing.Union[list[int], str] | typing.Union[bytes, list[int]] # Error
# https://github.com/astral-sh/ruff/issues/18546
# Expand Optional[T] to Union[T, None]
# OK
field37: typing.Optional[int]
field38: typing.Union[int, None]
# equivalent to None
field39: typing.Optional[None]
# equivalent to int | None
field40: typing.Union[typing.Optional[int], None]
field41: typing.Optional[typing.Union[int, None]]
field42: typing.Union[typing.Optional[int], typing.Optional[int]]
field43: typing.Optional[int] | None
field44: typing.Optional[int | None]
field45: typing.Optional[int] | typing.Optional[int]
# equivalent to int | dict | None
field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
field47: typing.Optional[int] | typing.Optional[dict]
# avoid reporting twice
field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
field49: typing.Optional[complex | complex] | complex

View File

@@ -170,3 +170,25 @@ def foo():
v = {}
for o,(x,)in():
v[x,]=o
# https://github.com/astral-sh/ruff/issues/19005
def issue_19005_1():
c = {}
a = object()
for a.b in ():
c[a.b] = a.b
def issue_19005_2():
a = object()
c = {}
for a.k, a.v in ():
c[a.k] = a.v
def issue_19005_3():
a = [None, None]
c = {}
for a[0], a[1] in ():
c[a[0]] = a[1]

View File

@@ -69,3 +69,11 @@ def func():
Returns:
the value
"""
def func():
("""Docstring.
Raises:
ValueError: An error.
""")

View File

@@ -57,3 +57,7 @@ _ = lambda x: z(lambda y: x + y)(x)
# lambda uses an additional keyword
_ = lambda *args: f(*args, y=1)
_ = lambda *args: f(*args, y=x)
# https://github.com/astral-sh/ruff/issues/18675
_ = lambda x: (string := str)(x)
_ = lambda x: ((x := 1) and str)(x)

View File

@@ -90,3 +90,52 @@ class AClass:
def myfunc(param: "tuple[Union[int, 'AClass', None], str]"):
print(param)
from typing import NamedTuple, Union
import typing_extensions
from typing_extensions import (
NamedTuple as NamedTupleTE,
Union as UnionTE,
)
# Regression test for https://github.com/astral-sh/ruff/issues/18619
# Don't emit lint for `NamedTuple`
a_plain_1: Union[NamedTuple, int] = None
a_plain_2: Union[int, NamedTuple] = None
a_plain_3: Union[NamedTuple, None] = None
a_plain_4: Union[None, NamedTuple] = None
a_plain_te_1: UnionTE[NamedTupleTE, int] = None
a_plain_te_2: UnionTE[int, NamedTupleTE] = None
a_plain_te_3: UnionTE[NamedTupleTE, None] = None
a_plain_te_4: UnionTE[None, NamedTupleTE] = None
a_plain_typing_1: UnionTE[typing.NamedTuple, int] = None
a_plain_typing_2: UnionTE[int, typing.NamedTuple] = None
a_plain_typing_3: UnionTE[typing.NamedTuple, None] = None
a_plain_typing_4: UnionTE[None, typing.NamedTuple] = None
a_string_1: "Union[NamedTuple, int]" = None
a_string_2: "Union[int, NamedTuple]" = None
a_string_3: "Union[NamedTuple, None]" = None
a_string_4: "Union[None, NamedTuple]" = None
a_string_te_1: "UnionTE[NamedTupleTE, int]" = None
a_string_te_2: "UnionTE[int, NamedTupleTE]" = None
a_string_te_3: "UnionTE[NamedTupleTE, None]" = None
a_string_te_4: "UnionTE[None, NamedTupleTE]" = None
a_string_typing_1: "typing.Union[typing.NamedTuple, int]" = None
a_string_typing_2: "typing.Union[int, typing.NamedTuple]" = None
a_string_typing_3: "typing.Union[typing.NamedTuple, None]" = None
a_string_typing_4: "typing.Union[None, typing.NamedTuple]" = None
b_plain_1: Union[NamedTuple] = None
b_plain_2: Union[NamedTuple, None] = None
b_plain_te_1: UnionTE[NamedTupleTE] = None
b_plain_te_2: UnionTE[NamedTupleTE, None] = None
b_plain_typing_1: UnionTE[typing.NamedTuple] = None
b_plain_typing_2: UnionTE[typing.NamedTuple, None] = None
b_string_1: "Union[NamedTuple]" = None
b_string_2: "Union[NamedTuple, None]" = None
b_string_te_1: "UnionTE[NamedTupleTE]" = None
b_string_te_2: "UnionTE[NamedTupleTE, None]" = None
b_string_typing_1: "typing.Union[typing.NamedTuple]" = None
b_string_typing_2: "typing.Union[typing.NamedTuple, None]" = None


@@ -105,3 +105,23 @@ import builtins
class C:
def f(self):
builtins.super(C, self)
# see: https://github.com/astral-sh/ruff/issues/18533
class ClassForCommentEnthusiasts(BaseClass):
def with_comments(self):
super(
# super helpful comment
ClassForCommentEnthusiasts,
self
).f()
super(
ClassForCommentEnthusiasts,
# even more helpful comment
self
).f()
super(
ClassForCommentEnthusiasts,
self
# also a comment
).f()


@@ -26,3 +26,9 @@ def hello():
f"foo"u"bar" # OK
f"foo" u"bar" # OK
# https://github.com/astral-sh/ruff/issues/18895
""u""
""u"hi"
""""""""""""""""""""u"hi"
""U"helloooo"


@@ -47,3 +47,25 @@ class ServiceRefOrValue:
# Test for: https://github.com/astral-sh/ruff/issues/18508
# Optional[None] should not be offered a fix
foo: Optional[None] = None
from typing import NamedTuple, Optional
import typing_extensions
from typing_extensions import (
NamedTuple as NamedTupleTE,
Optional as OptionalTE,
)
# Regression test for https://github.com/astral-sh/ruff/issues/18619
# Don't emit lint for `NamedTuple`
a1: Optional[NamedTuple] = None
a2: typing.Optional[NamedTuple] = None
a3: OptionalTE[NamedTuple] = None
a4: typing_extensions.Optional[NamedTuple] = None
a5: Optional[typing.NamedTuple] = None
a6: typing.Optional[typing.NamedTuple] = None
a7: OptionalTE[typing.NamedTuple] = None
a8: typing_extensions.Optional[typing.NamedTuple] = None
a9: "Optional[NamedTuple]" = None
a10: Optional[NamedTupleTE] = None


@@ -85,3 +85,10 @@ def _():
if isinstance(foo, type(None)):
...
# https://github.com/astral-sh/ruff/issues/19047
if isinstance(foo, ()):
pass
if isinstance(foo, Union[()]):
pass


@@ -7,6 +7,7 @@ use ruff_python_semantic::analyze::typing;
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
use crate::preview::is_optional_as_none_in_union_enabled;
use crate::registry::Rule;
use crate::rules::{
airflow, flake8_2020, flake8_async, flake8_bandit, flake8_boolean_trap, flake8_bugbear,
@@ -90,7 +91,13 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if checker.is_rule_enabled(Rule::UnnecessaryLiteralUnion) {
flake8_pyi::rules::unnecessary_literal_union(checker, expr);
}
if checker.is_rule_enabled(Rule::DuplicateUnionMember) {
if checker.is_rule_enabled(Rule::DuplicateUnionMember)
// Avoid duplicate checks inside `Optional`
&& !(
is_optional_as_none_in_union_enabled(checker.settings())
&& checker.semantic.inside_optional()
)
{
flake8_pyi::rules::duplicate_union_member(checker, expr);
}
if checker.is_rule_enabled(Rule::RedundantLiteralUnion) {
@@ -1430,6 +1437,11 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if !checker.semantic.in_nested_union() {
if checker.is_rule_enabled(Rule::DuplicateUnionMember)
&& checker.semantic.in_type_definition()
// Avoid duplicate checks inside `Optional`
&& !(
is_optional_as_none_in_union_enabled(checker.settings())
&& checker.semantic.inside_optional()
)
{
flake8_pyi::rules::duplicate_union_member(checker, expr);
}


@@ -2765,9 +2765,7 @@ impl<'a> Checker<'a> {
self.semantic.restore(snapshot);
if self.semantic.in_annotation()
&& self.semantic.in_typing_only_annotation()
{
if self.semantic.in_typing_only_annotation() {
if self.is_rule_enabled(Rule::QuotedAnnotation) {
pyupgrade::rules::quoted_annotation(self, annotation, range);
}


@@ -35,39 +35,34 @@ pub(crate) fn check_noqa(
// Identify any codes that are globally exempted (within the current file).
let file_noqa_directives =
FileNoqaDirectives::extract(locator, comment_ranges, &settings.external, path);
let exemption = FileExemption::from(&file_noqa_directives);
// Extract all `noqa` directives.
let mut noqa_directives =
NoqaDirectives::from_commented_ranges(comment_ranges, &settings.external, path, locator);
if file_noqa_directives.is_empty() && noqa_directives.is_empty() {
return Vec::new();
}
let exemption = FileExemption::from(&file_noqa_directives);
// Indices of diagnostics that were ignored by a `noqa` directive.
let mut ignored_diagnostics = vec![];
// Remove any ignored diagnostics.
'outer: for (index, diagnostic) in context.iter().enumerate() {
// Can't ignore syntax errors.
let Some(code) = diagnostic.noqa_code() else {
let Some(code) = diagnostic.secondary_code() else {
continue;
};
if code == Rule::BlanketNOQA.noqa_code() {
if *code == Rule::BlanketNOQA.noqa_code() {
continue;
}
match &exemption {
FileExemption::All(_) => {
// If the file is exempted, ignore all diagnostics.
ignored_diagnostics.push(index);
continue;
}
FileExemption::Codes(codes) => {
// If the diagnostic is ignored by a global exemption, ignore it.
if codes.contains(&&code) {
ignored_diagnostics.push(index);
continue;
}
}
if exemption.contains_secondary_code(code) {
ignored_diagnostics.push(index);
continue;
}
let noqa_offsets = diagnostic
@@ -82,13 +77,21 @@ pub(crate) fn check_noqa(
{
let suppressed = match &directive_line.directive {
Directive::All(_) => {
directive_line.matches.push(code);
let Ok(rule) = Rule::from_code(code) else {
debug_assert!(false, "Invalid secondary code `{code}`");
continue;
};
directive_line.matches.push(rule);
ignored_diagnostics.push(index);
true
}
Directive::Codes(directive) => {
if directive.includes(code) {
directive_line.matches.push(code);
let Ok(rule) = Rule::from_code(code) else {
debug_assert!(false, "Invalid secondary code `{code}`");
continue;
};
directive_line.matches.push(rule);
ignored_diagnostics.push(index);
true
} else {
@@ -147,11 +150,11 @@ pub(crate) fn check_noqa(
if seen_codes.insert(original_code) {
let is_code_used = if is_file_level {
context
.iter()
.any(|diag| diag.noqa_code().is_some_and(|noqa| noqa == code))
context.iter().any(|diag| {
diag.secondary_code().is_some_and(|noqa| *noqa == code)
})
} else {
matches.iter().any(|match_| *match_ == code)
matches.iter().any(|match_| match_.noqa_code() == code)
} || settings
.external
.iter()


@@ -46,6 +46,12 @@ impl PartialEq<&str> for NoqaCode {
}
}
impl PartialEq<NoqaCode> for &str {
fn eq(&self, other: &NoqaCode) -> bool {
other.eq(self)
}
}
impl serde::Serialize for NoqaCode {
fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
where

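The reverse `impl PartialEq<NoqaCode> for &str` added here makes comparisons work in either operand order by delegating to the forward impl. A minimal sketch of the same idiom, using a hypothetical `Code` newtype rather than the real `NoqaCode`:

```rust
// Hypothetical newtype mirroring the symmetric PartialEq pattern above.
#[derive(Debug)]
struct Code(String);

impl PartialEq<&str> for Code {
    fn eq(&self, other: &&str) -> bool {
        self.0 == *other
    }
}

// The reverse impl just delegates, so `"E501" == code` also compiles.
impl PartialEq<Code> for &str {
    fn eq(&self, other: &Code) -> bool {
        other.eq(self)
    }
}

fn main() {
    let code = Code("E501".to_string());
    assert!(code == "E501");
    assert!("E501" == code);
    assert!("W605" != code);
    println!("ok");
}
```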

@@ -63,7 +63,7 @@ fn apply_fixes<'a>(
let mut source_map = SourceMap::default();
for (code, name, fix) in diagnostics
.filter_map(|msg| msg.noqa_code().map(|code| (code, msg.name(), msg)))
.filter_map(|msg| msg.secondary_code().map(|code| (code, msg.name(), msg)))
.filter_map(|(code, name, diagnostic)| diagnostic.fix().map(|fix| (code, name, fix)))
.sorted_by(|(_, name1, fix1), (_, name2, fix2)| cmp_fix(name1, name2, fix1, fix2))
{


@@ -1,12 +1,11 @@
use std::borrow::Cow;
use std::collections::hash_map::Entry;
use std::path::Path;
use anyhow::{Result, anyhow};
use colored::Colorize;
use itertools::Itertools;
use ruff_python_parser::semantic_errors::SemanticSyntaxError;
use rustc_hash::FxHashMap;
use rustc_hash::FxBuildHasher;
use ruff_notebook::Notebook;
use ruff_python_ast::{ModModule, PySourceType, PythonVersion};
@@ -23,10 +22,10 @@ use crate::checkers::imports::check_imports;
use crate::checkers::noqa::check_noqa;
use crate::checkers::physical_lines::check_physical_lines;
use crate::checkers::tokens::check_tokens;
use crate::codes::NoqaCode;
use crate::directives::Directives;
use crate::doc_lines::{doc_lines_from_ast, doc_lines_from_tokens};
use crate::fix::{FixResult, fix_file};
use crate::message::SecondaryCode;
use crate::noqa::add_noqa;
use crate::package::PackageRoot;
use crate::preview::is_py314_support_enabled;
@@ -95,25 +94,25 @@ struct FixCount {
/// A mapping from a noqa code to the corresponding lint name and a count of applied fixes.
#[derive(Debug, Default, PartialEq)]
pub struct FixTable(FxHashMap<NoqaCode, FixCount>);
pub struct FixTable(hashbrown::HashMap<SecondaryCode, FixCount, rustc_hash::FxBuildHasher>);
impl FixTable {
pub fn counts(&self) -> impl Iterator<Item = usize> {
self.0.values().map(|fc| fc.count)
}
pub fn entry(&mut self, code: NoqaCode) -> FixTableEntry {
FixTableEntry(self.0.entry(code))
pub fn entry<'a>(&'a mut self, code: &'a SecondaryCode) -> FixTableEntry<'a> {
FixTableEntry(self.0.entry_ref(code))
}
pub fn iter(&self) -> impl Iterator<Item = (NoqaCode, &'static str, usize)> {
pub fn iter(&self) -> impl Iterator<Item = (&SecondaryCode, &'static str, usize)> {
self.0
.iter()
.map(|(code, FixCount { rule_name, count })| (*code, *rule_name, *count))
.map(|(code, FixCount { rule_name, count })| (code, *rule_name, *count))
}
pub fn keys(&self) -> impl Iterator<Item = NoqaCode> {
self.0.keys().copied()
pub fn keys(&self) -> impl Iterator<Item = &SecondaryCode> {
self.0.keys()
}
pub fn is_empty(&self) -> bool {
@@ -121,7 +120,9 @@ impl FixTable {
}
}
pub struct FixTableEntry<'a>(Entry<'a, NoqaCode, FixCount>);
pub struct FixTableEntry<'a>(
hashbrown::hash_map::EntryRef<'a, 'a, SecondaryCode, SecondaryCode, FixCount, FxBuildHasher>,
);
impl<'a> FixTableEntry<'a> {
pub fn or_default(self, rule_name: &'static str) -> &'a mut usize {
@@ -678,18 +679,16 @@ pub fn lint_fix<'a>(
}
}
fn collect_rule_codes(rules: impl IntoIterator<Item = NoqaCode>) -> String {
rules
.into_iter()
.map(|rule| rule.to_string())
.sorted_unstable()
.dedup()
.join(", ")
fn collect_rule_codes<T>(rules: impl IntoIterator<Item = T>) -> String
where
T: Ord + PartialEq + std::fmt::Display,
{
rules.into_iter().sorted_unstable().dedup().join(", ")
}
#[expect(clippy::print_stderr)]
fn report_failed_to_converge_error(path: &Path, transformed: &str, diagnostics: &[OldDiagnostic]) {
let codes = collect_rule_codes(diagnostics.iter().filter_map(OldDiagnostic::noqa_code));
let codes = collect_rule_codes(diagnostics.iter().filter_map(OldDiagnostic::secondary_code));
if cfg!(debug_assertions) {
eprintln!(
"{}{} Failed to converge after {} iterations in `{}` with rule codes {}:---\n{}\n---",
@@ -721,11 +720,11 @@ This indicates a bug in Ruff. If you could open an issue at:
}
#[expect(clippy::print_stderr)]
fn report_fix_syntax_error(
fn report_fix_syntax_error<'a>(
path: &Path,
transformed: &str,
error: &ParseError,
rules: impl IntoIterator<Item = NoqaCode>,
rules: impl IntoIterator<Item = &'a SecondaryCode>,
) {
let codes = collect_rule_codes(rules);
if cfg!(debug_assertions) {

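The rewritten `collect_rule_codes` above is generic over any `Ord + Display` item instead of taking `NoqaCode` concretely, so it works for both old and new code types. A std-only sketch of the same sort/dedup/join shape (the real function uses `itertools`):

```rust
use std::fmt::Display;

// Sort, dedup, and comma-join any displayable codes; a std-only
// equivalent of the `sorted_unstable().dedup().join(", ")` chain.
fn collect_rule_codes<T: Ord + Display>(rules: impl IntoIterator<Item = T>) -> String {
    let mut codes: Vec<String> = rules.into_iter().map(|r| r.to_string()).collect();
    codes.sort_unstable();
    codes.dedup();
    codes.join(", ")
}

fn main() {
    // Duplicates collapse and output is deterministic regardless of input order.
    let joined = collect_rule_codes(["F401", "E501", "F401"]);
    assert_eq!(joined, "E501, F401");
    println!("{joined}");
}
```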

@@ -33,7 +33,7 @@ impl Emitter for AzureEmitter {
line = location.line,
col = location.column,
code = diagnostic
.noqa_code()
.secondary_code()
.map_or_else(String::new, |code| format!("code={code};")),
body = diagnostic.body(),
)?;


@@ -33,7 +33,7 @@ impl Emitter for GithubEmitter {
writer,
"::error title=Ruff{code},file={file},line={row},col={column},endLine={end_row},endColumn={end_column}::",
code = diagnostic
.noqa_code()
.secondary_code()
.map_or_else(String::new, |code| format!(" ({code})")),
file = diagnostic.filename(),
row = source_location.line,
@@ -50,7 +50,7 @@ impl Emitter for GithubEmitter {
column = location.column,
)?;
if let Some(code) = diagnostic.noqa_code() {
if let Some(code) = diagnostic.secondary_code() {
write!(writer, " {code}")?;
}


@@ -90,18 +90,15 @@ impl Serialize for SerializedMessages<'_> {
}
fingerprints.insert(message_fingerprint);
let (description, check_name) = if let Some(code) = diagnostic.noqa_code() {
(diagnostic.body().to_string(), code.to_string())
let (description, check_name) = if let Some(code) = diagnostic.secondary_code() {
(diagnostic.body().to_string(), code.as_str())
} else {
let description = diagnostic.body();
let description_without_prefix = description
.strip_prefix("SyntaxError: ")
.unwrap_or(description);
(
description_without_prefix.to_string(),
"syntax-error".to_string(),
)
(description_without_prefix.to_string(), "syntax-error")
};
let value = json!({


@@ -87,7 +87,7 @@ pub(crate) fn message_to_json_value(message: &OldDiagnostic, context: &EmitterCo
}
json!({
"code": message.noqa_code().map(|code| code.to_string()),
"code": message.secondary_code(),
"url": message.to_url(),
"message": message.body(),
"fix": fix,


@@ -59,7 +59,7 @@ impl Emitter for JunitEmitter {
body = message.body()
));
let mut case = TestCase::new(
if let Some(code) = message.noqa_code() {
if let Some(code) = message.secondary_code() {
format!("org.ruff.{code}")
} else {
"org.ruff".to_string()


@@ -62,7 +62,7 @@ pub struct OldDiagnostic {
pub fix: Option<Fix>,
pub parent: Option<TextSize>,
pub(crate) noqa_offset: Option<TextSize>,
pub(crate) noqa_code: Option<NoqaCode>,
pub(crate) secondary_code: Option<SecondaryCode>,
}
impl OldDiagnostic {
@@ -79,7 +79,7 @@ impl OldDiagnostic {
fix: None,
parent: None,
noqa_offset: None,
noqa_code: None,
secondary_code: None,
}
}
@@ -115,7 +115,7 @@ impl OldDiagnostic {
fix,
parent,
noqa_offset,
noqa_code: Some(rule.noqa_code()),
secondary_code: Some(SecondaryCode(rule.noqa_code().to_string())),
}
}
@@ -247,9 +247,9 @@ impl OldDiagnostic {
self.fix().is_some()
}
/// Returns the [`NoqaCode`] corresponding to the diagnostic message.
pub fn noqa_code(&self) -> Option<NoqaCode> {
self.noqa_code
/// Returns the noqa code for the diagnostic message as a string.
pub fn secondary_code(&self) -> Option<&SecondaryCode> {
self.secondary_code.as_ref()
}
/// Returns the URL for the rule documentation, if it exists.
@@ -384,6 +384,68 @@ impl<'a> EmitterContext<'a> {
}
}
/// A secondary identifier for a lint diagnostic.
///
/// For Ruff rules this means the noqa code.
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Default, Hash, serde::Serialize)]
#[serde(transparent)]
pub struct SecondaryCode(String);
impl SecondaryCode {
pub fn new(code: String) -> Self {
Self(code)
}
pub fn as_str(&self) -> &str {
&self.0
}
}
impl Display for SecondaryCode {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str(&self.0)
}
}
impl std::ops::Deref for SecondaryCode {
type Target = str;
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl PartialEq<&str> for SecondaryCode {
fn eq(&self, other: &&str) -> bool {
self.0 == *other
}
}
impl PartialEq<SecondaryCode> for &str {
fn eq(&self, other: &SecondaryCode) -> bool {
other.eq(self)
}
}
impl PartialEq<NoqaCode> for SecondaryCode {
fn eq(&self, other: &NoqaCode) -> bool {
&self.as_str() == other
}
}
impl PartialEq<SecondaryCode> for NoqaCode {
fn eq(&self, other: &SecondaryCode) -> bool {
other.eq(self)
}
}
// for `hashbrown::EntryRef`
impl From<&SecondaryCode> for SecondaryCode {
fn from(value: &SecondaryCode) -> Self {
value.clone()
}
}
#[cfg(test)]
mod tests {
use rustc_hash::FxHashMap;

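The `From<&SecondaryCode> for SecondaryCode` impl at the end exists so `hashbrown`'s `entry_ref` can look a key up by reference and only clone it when an insert actually happens. With `std::collections::HashMap` the entry API requires an owned key up front, which is the allocation this change avoids; a sketch of that baseline cost, using a hypothetical `bump` helper:

```rust
use std::collections::HashMap;

fn bump(counts: &mut HashMap<String, usize>, code: &str) {
    // std's `entry` takes an owned key, so we allocate a String on every
    // call, even when `code` is already present. hashbrown's `entry_ref`
    // accepts `&Q` and converts (clones) the key only on first insertion.
    *counts.entry(code.to_string()).or_insert(0) += 1;
}

fn main() {
    let mut counts = HashMap::new();
    bump(&mut counts, "E501");
    bump(&mut counts, "E501");
    bump(&mut counts, "F401");
    assert_eq!(counts["E501"], 2);
    assert_eq!(counts["F401"], 1);
    println!("{counts:?}");
}
```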

@@ -26,7 +26,7 @@ impl Emitter for PylintEmitter {
diagnostic.compute_start_location().line
};
let body = if let Some(code) = diagnostic.noqa_code() {
let body = if let Some(code) = diagnostic.secondary_code() {
format!("[{code}] {body}", body = diagnostic.body())
} else {
diagnostic.body().to_string()


@@ -71,7 +71,7 @@ fn message_to_rdjson_value(message: &OldDiagnostic) -> Value {
"range": rdjson_range(start_location, end_location),
},
"code": {
"value": message.noqa_code().map(|code| code.to_string()),
"value": message.secondary_code(),
"url": message.to_url(),
},
"suggestions": rdjson_suggestions(fix.edits(), &source_code),
@@ -84,7 +84,7 @@ fn message_to_rdjson_value(message: &OldDiagnostic) -> Value {
"range": rdjson_range(start_location, end_location),
},
"code": {
"value": message.noqa_code().map(|code| code.to_string()),
"value": message.secondary_code(),
"url": message.to_url(),
},
})


@@ -8,9 +8,8 @@ use serde_json::json;
use ruff_source_file::OneIndexed;
use crate::VERSION;
use crate::codes::NoqaCode;
use crate::fs::normalize_path;
use crate::message::{Emitter, EmitterContext, OldDiagnostic};
use crate::message::{Emitter, EmitterContext, OldDiagnostic, SecondaryCode};
use crate::registry::{Linter, RuleNamespace};
pub struct SarifEmitter;
@@ -29,7 +28,7 @@ impl Emitter for SarifEmitter {
let unique_rules: HashSet<_> = results.iter().filter_map(|result| result.code).collect();
let mut rules: Vec<SarifRule> = unique_rules.into_iter().map(SarifRule::from).collect();
rules.sort_by(|a, b| a.code.cmp(&b.code));
rules.sort_by(|a, b| a.code.cmp(b.code));
let output = json!({
"$schema": "https://json.schemastore.org/sarif-2.1.0.json",
@@ -54,26 +53,25 @@ impl Emitter for SarifEmitter {
#[derive(Debug, Clone)]
struct SarifRule<'a> {
name: &'a str,
code: String,
code: &'a SecondaryCode,
linter: &'a str,
summary: &'a str,
explanation: Option<&'a str>,
url: Option<String>,
}
impl From<NoqaCode> for SarifRule<'_> {
fn from(code: NoqaCode) -> Self {
let code_str = code.to_string();
impl<'a> From<&'a SecondaryCode> for SarifRule<'a> {
fn from(code: &'a SecondaryCode) -> Self {
// This is a manual re-implementation of Rule::from_code, but we also want the Linter. This
// avoids calling Linter::parse_code twice.
let (linter, suffix) = Linter::parse_code(&code_str).unwrap();
let (linter, suffix) = Linter::parse_code(code).unwrap();
let rule = linter
.all_rules()
.find(|rule| rule.noqa_code().suffix() == suffix)
.expect("Expected a valid noqa code corresponding to a rule");
Self {
name: rule.into(),
code: code_str,
code,
linter: linter.name(),
summary: rule.message_formats()[0],
explanation: rule.explanation(),
@@ -111,8 +109,8 @@ impl Serialize for SarifRule<'_> {
}
#[derive(Debug)]
struct SarifResult {
code: Option<NoqaCode>,
struct SarifResult<'a> {
code: Option<&'a SecondaryCode>,
level: String,
message: String,
uri: String,
@@ -122,14 +120,14 @@ struct SarifResult {
end_column: OneIndexed,
}
impl SarifResult {
impl<'a> SarifResult<'a> {
#[cfg(not(target_arch = "wasm32"))]
fn from_message(message: &OldDiagnostic) -> Result<Self> {
fn from_message(message: &'a OldDiagnostic) -> Result<Self> {
let start_location = message.compute_start_location();
let end_location = message.compute_end_location();
let path = normalize_path(&*message.filename());
Ok(Self {
code: message.noqa_code(),
code: message.secondary_code(),
level: "error".to_string(),
message: message.body().to_string(),
uri: url::Url::from_file_path(&path)
@@ -144,12 +142,12 @@ impl SarifResult {
#[cfg(target_arch = "wasm32")]
#[expect(clippy::unnecessary_wraps)]
fn from_message(message: &OldDiagnostic) -> Result<Self> {
fn from_message(message: &'a OldDiagnostic) -> Result<Self> {
let start_location = message.compute_start_location();
let end_location = message.compute_end_location();
let path = normalize_path(&*message.filename());
Ok(Self {
code: message.noqa_code(),
code: message.secondary_code(),
level: "error".to_string(),
message: message.body().to_string(),
uri: path.display().to_string(),
@@ -161,7 +159,7 @@ impl SarifResult {
}
}
impl Serialize for SarifResult {
impl Serialize for SarifResult<'_> {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
@@ -184,7 +182,7 @@ impl Serialize for SarifResult {
}
}
}],
"ruleId": self.code.map(|code| code.to_string()),
"ruleId": self.code,
})
.serialize(serializer)
}


@@ -14,7 +14,7 @@ use crate::Locator;
use crate::fs::relativize_path;
use crate::line_width::{IndentWidth, LineWidthBuilder};
use crate::message::diff::Diff;
use crate::message::{Emitter, EmitterContext, OldDiagnostic};
use crate::message::{Emitter, EmitterContext, OldDiagnostic, SecondaryCode};
use crate::settings::types::UnsafeFixes;
bitflags! {
@@ -151,8 +151,8 @@ impl Display for RuleCodeAndBody<'_> {
if let Some(fix) = self.message.fix() {
// Do not display an indicator for inapplicable fixes
if fix.applies(self.unsafe_fixes.required_applicability()) {
if let Some(code) = self.message.noqa_code() {
write!(f, "{} ", code.to_string().red().bold())?;
if let Some(code) = self.message.secondary_code() {
write!(f, "{} ", code.red().bold())?;
}
return write!(
f,
@@ -164,11 +164,11 @@ impl Display for RuleCodeAndBody<'_> {
}
}
if let Some(code) = self.message.noqa_code() {
if let Some(code) = self.message.secondary_code() {
write!(
f,
"{code} {body}",
code = code.to_string().red().bold(),
code = code.red().bold(),
body = self.message.body(),
)
} else {
@@ -254,8 +254,9 @@ impl Display for MessageCodeFrame<'_> {
let label = self
.message
.noqa_code()
.map_or_else(String::new, |code| code.to_string());
.secondary_code()
.map(SecondaryCode::as_str)
.unwrap_or_default();
let line_start = self.notebook_index.map_or_else(
|| start_index.get(),
@@ -269,7 +270,7 @@ impl Display for MessageCodeFrame<'_> {
let span = usize::from(source.annotation_range.start())
..usize::from(source.annotation_range.end());
let annotation = Level::Error.span(span).label(&label);
let annotation = Level::Error.span(span).label(label);
let snippet = Snippet::source(&source.text)
.line_start(line_start)
.annotation(annotation)


@@ -16,9 +16,8 @@ use rustc_hash::FxHashSet;
use crate::Edit;
use crate::Locator;
use crate::codes::NoqaCode;
use crate::fs::relativize_path;
use crate::message::OldDiagnostic;
use crate::message::{OldDiagnostic, SecondaryCode};
use crate::registry::Rule;
use crate::rule_redirects::get_redirect_target;
@@ -106,9 +105,9 @@ impl Codes<'_> {
/// Returns `true` if the string list of `codes` includes `code` (or an alias
/// thereof).
pub(crate) fn includes(&self, needle: NoqaCode) -> bool {
pub(crate) fn includes<T: for<'a> PartialEq<&'a str>>(&self, needle: &T) -> bool {
self.iter()
.any(|code| needle == get_redirect_target(code.as_str()).unwrap_or(code.as_str()))
.any(|code| *needle == get_redirect_target(code.as_str()).unwrap_or(code.as_str()))
}
}
@@ -140,48 +139,55 @@ pub(crate) fn rule_is_ignored(
Ok(Some(NoqaLexerOutput {
directive: Directive::Codes(codes),
..
})) => codes.includes(code.noqa_code()),
})) => codes.includes(&code.noqa_code()),
_ => false,
}
}
/// A summary of the file-level exemption as extracted from [`FileNoqaDirectives`].
#[derive(Debug)]
pub(crate) enum FileExemption<'a> {
pub(crate) enum FileExemption {
/// The file is exempt from all rules.
All(Vec<&'a NoqaCode>),
All(Vec<Rule>),
/// The file is exempt from the given rules.
Codes(Vec<&'a NoqaCode>),
Codes(Vec<Rule>),
}
impl FileExemption<'_> {
/// Returns `true` if the file is exempt from the given rule.
pub(crate) fn includes(&self, needle: Rule) -> bool {
let needle = needle.noqa_code();
impl FileExemption {
/// Returns `true` if the file is exempt from the given rule, as identified by its noqa code.
pub(crate) fn contains_secondary_code(&self, needle: &SecondaryCode) -> bool {
match self {
FileExemption::All(_) => true,
FileExemption::Codes(codes) => codes.iter().any(|code| needle == **code),
FileExemption::Codes(codes) => codes.iter().any(|code| *needle == code.noqa_code()),
}
}
/// Returns `true` if the file is exempt from the given rule.
pub(crate) fn includes(&self, needle: Rule) -> bool {
match self {
FileExemption::All(_) => true,
FileExemption::Codes(codes) => codes.contains(&needle),
}
}
/// Returns `true` if the file exemption lists the rule directly, rather than via a blanket
/// exemption.
pub(crate) fn enumerates(&self, needle: Rule) -> bool {
let needle = needle.noqa_code();
let codes = match self {
FileExemption::All(codes) => codes,
FileExemption::Codes(codes) => codes,
};
codes.iter().any(|code| needle == **code)
codes.contains(&needle)
}
}
impl<'a> From<&'a FileNoqaDirectives<'a>> for FileExemption<'a> {
impl<'a> From<&'a FileNoqaDirectives<'a>> for FileExemption {
fn from(directives: &'a FileNoqaDirectives) -> Self {
let codes = directives
.lines()
.iter()
.flat_map(|line| &line.matches)
.copied()
.collect();
if directives
.lines()
@@ -203,7 +209,7 @@ pub(crate) struct FileNoqaDirectiveLine<'a> {
/// The blanket noqa directive.
pub(crate) parsed_file_exemption: Directive<'a>,
/// The codes that are ignored by the parsed exemptions.
pub(crate) matches: Vec<NoqaCode>,
pub(crate) matches: Vec<Rule>,
}
impl Ranged for FileNoqaDirectiveLine<'_> {
@@ -270,7 +276,7 @@ impl<'a> FileNoqaDirectives<'a> {
if let Ok(rule) = Rule::from_code(get_redirect_target(code).unwrap_or(code))
{
Some(rule.noqa_code())
Some(rule)
} else {
#[expect(deprecated)]
let line = locator.compute_line_index(range.start());
@@ -303,6 +309,10 @@ impl<'a> FileNoqaDirectives<'a> {
pub(crate) fn lines(&self) -> &[FileNoqaDirectiveLine] {
&self.0
}
pub(crate) fn is_empty(&self) -> bool {
self.0.is_empty()
}
}
/// Output of lexing a `noqa` directive.
@@ -830,7 +840,7 @@ fn build_noqa_edits_by_line<'a>(
struct NoqaComment<'a> {
line: TextSize,
code: NoqaCode,
code: &'a SecondaryCode,
directive: Option<&'a Directive<'a>>,
}
@@ -846,24 +856,14 @@ fn find_noqa_comments<'a>(
// Mark any non-ignored diagnostics.
for message in diagnostics {
let Some(code) = message.noqa_code() else {
let Some(code) = message.secondary_code() else {
comments_by_line.push(None);
continue;
};
match &exemption {
FileExemption::All(_) => {
// If the file is exempted, don't add any noqa directives.
comments_by_line.push(None);
continue;
}
FileExemption::Codes(codes) => {
// If the diagnostic is ignored by a global exemption, don't add a noqa directive.
if codes.contains(&&code) {
comments_by_line.push(None);
continue;
}
}
if exemption.contains_secondary_code(code) {
comments_by_line.push(None);
continue;
}
// Is the violation ignored by a `noqa` directive on the parent line?
@@ -921,7 +921,7 @@ fn find_noqa_comments<'a>(
struct NoqaEdit<'a> {
edit_range: TextRange,
noqa_codes: FxHashSet<NoqaCode>,
noqa_codes: FxHashSet<&'a SecondaryCode>,
codes: Option<&'a Codes<'a>>,
line_ending: LineEnding,
}
@@ -942,13 +942,13 @@ impl NoqaEdit<'_> {
writer,
self.noqa_codes
.iter()
.map(ToString::to_string)
.chain(codes.iter().map(ToString::to_string))
.map(|code| code.as_str())
.chain(codes.iter().map(Code::as_str))
.sorted_unstable(),
);
}
None => {
push_codes(writer, self.noqa_codes.iter().map(ToString::to_string));
push_codes(writer, self.noqa_codes.iter().sorted_unstable());
}
}
write!(writer, "{}", self.line_ending.as_str()).unwrap();
@@ -964,7 +964,7 @@ impl Ranged for NoqaEdit<'_> {
fn generate_noqa_edit<'a>(
directive: Option<&'a Directive>,
offset: TextSize,
noqa_codes: FxHashSet<NoqaCode>,
noqa_codes: FxHashSet<&'a SecondaryCode>,
locator: &Locator,
line_ending: LineEnding,
) -> Option<NoqaEdit<'a>> {
@@ -1017,7 +1017,7 @@ pub(crate) struct NoqaDirectiveLine<'a> {
/// The noqa directive.
pub(crate) directive: Directive<'a>,
/// The codes that are ignored by the directive.
pub(crate) matches: Vec<NoqaCode>,
pub(crate) matches: Vec<Rule>,
/// Whether the directive applies to `range.end`.
pub(crate) includes_end: bool,
}
@@ -1142,6 +1142,10 @@ impl<'a> NoqaDirectives<'a> {
pub(crate) fn lines(&self) -> &[NoqaDirectiveLine] {
&self.inner
}
pub(crate) fn is_empty(&self) -> bool {
self.inner.is_empty()
}
}
/// Remaps offsets falling into one of the ranges to instead check for a noqa comment on the

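After this refactor, `FileExemption` stores owned `Rule`s instead of borrowed `NoqaCode`s, which drops the lifetime parameter and turns `includes` into a plain `contains`. A reduced sketch of the All/Codes matching logic, with a hypothetical two-variant `Rule` enum standing in for the real one:

```rust
// Hypothetical reduced model of the FileExemption matching above.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Rule {
    LineTooLong,
    UnusedImport,
}

enum FileExemption {
    // File-level `# ruff: noqa` -- every rule is exempt.
    All(Vec<Rule>),
    // File-level `# ruff: noqa: <codes>` -- only the listed rules are exempt.
    Codes(Vec<Rule>),
}

impl FileExemption {
    fn includes(&self, needle: Rule) -> bool {
        match self {
            FileExemption::All(_) => true,
            FileExemption::Codes(codes) => codes.contains(&needle),
        }
    }
}

fn main() {
    let listed = FileExemption::Codes(vec![Rule::UnusedImport]);
    assert!(listed.includes(Rule::UnusedImport));
    assert!(!listed.includes(Rule::LineTooLong));

    // A blanket exemption ignores everything, listed or not.
    let blanket = FileExemption::All(vec![]);
    assert!(blanket.includes(Rule::LineTooLong));
    println!("ok");
}
```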

@@ -90,6 +90,11 @@ pub(crate) const fn is_ignore_init_files_in_useless_alias_enabled(
settings.preview.is_enabled()
}
// https://github.com/astral-sh/ruff/pull/18572
pub(crate) const fn is_optional_as_none_in_union_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}
// https://github.com/astral-sh/ruff/pull/18547
pub(crate) const fn is_invalid_async_mock_access_check_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
@@ -99,3 +104,10 @@ pub(crate) const fn is_invalid_async_mock_access_check_enabled(settings: &Linter
pub(crate) const fn is_raise_exception_byte_string_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}
// https://github.com/astral-sh/ruff/pull/18683
pub(crate) const fn is_safe_super_call_with_parameters_fix_enabled(
settings: &LinterSettings,
) -> bool {
settings.preview.is_enabled()
}


@@ -23,12 +23,16 @@ use crate::{FixAvailability, Violation};
///
/// ## Example
/// ```python
/// from airflow.auth.managers.fab.fab_auth_manage import FabAuthManager
/// from airflow.auth.managers.fab.fab_auth_manager import FabAuthManager
///
/// fab_auth_manager_app = FabAuthManager().get_fastapi_app()
/// ```
///
/// Use instead:
/// ```python
/// from airflow.providers.fab.auth_manager.fab_auth_manage import FabAuthManager
/// from airflow.providers.fab.auth_manager.fab_auth_manager import FabAuthManager
///
/// fab_auth_manager_app = FabAuthManager().get_fastapi_app()
/// ```
#[derive(ViolationMetadata)]
pub(crate) struct Airflow3MovedToProvider<'a> {


@@ -24,11 +24,31 @@ use ruff_text_size::TextRange;
/// ## Example
/// ```python
/// from airflow.operators.python import PythonOperator
///
///
/// def print_context(ds=None, **kwargs):
/// print(kwargs)
/// print(ds)
///
///
/// print_the_context = PythonOperator(
/// task_id="print_the_context", python_callable=print_context
/// )
/// ```
///
/// Use instead:
/// ```python
/// from airflow.providers.standard.operators.python import PythonOperator
///
///
/// def print_context(ds=None, **kwargs):
/// print(kwargs)
/// print(ds)
///
///
/// print_the_context = PythonOperator(
/// task_id="print_the_context", python_callable=print_context
/// )
/// ```
#[derive(ViolationMetadata)]
pub(crate) struct Airflow3SuggestedToMoveToProvider<'a> {


@@ -476,12 +476,18 @@ impl Violation for MissingReturnTypeClassMethod {
/// ## Example
///
/// ```python
/// from typing import Any
///
///
/// def foo(x: Any): ...
/// ```
///
/// Use instead:
///
/// ```python
/// from typing import Any
///
///
/// def foo(x: int): ...
/// ```
///


@@ -16,6 +16,8 @@ use crate::rules::flake8_async::helpers::AsyncModule;
///
/// ## Example
/// ```python
/// import asyncio
///
/// DONE = False
///
///
@@ -26,6 +28,8 @@ use crate::rules::flake8_async::helpers::AsyncModule;
///
/// Use instead:
/// ```python
/// import asyncio
///
/// DONE = asyncio.Event()
///
///


@@ -20,12 +20,18 @@ use crate::checkers::ast::Checker;
///
/// ## Example
/// ```python
/// import urllib
///
///
/// async def fetch():
/// urllib.request.urlopen("https://example.com/foo/bar").read()
/// ```
///
/// Use instead:
/// ```python
/// import aiohttp
///
///
/// async def fetch():
/// async with aiohttp.ClientSession() as session:
/// async with session.get("https://example.com/foo/bar") as resp:


@@ -21,12 +21,18 @@ use crate::checkers::ast::Checker;
///
/// ## Example
/// ```python
/// import os
///
///
/// async def foo():
/// os.popen(cmd)
/// ```
///
/// Use instead:
/// ```python
/// import asyncio
///
///
/// async def foo():
/// asyncio.create_subprocess_shell(cmd)
/// ```
@@ -54,12 +60,18 @@ impl Violation for CreateSubprocessInAsyncFunction {
///
/// ## Example
/// ```python
/// import subprocess
///
///
/// async def foo():
/// subprocess.run(cmd)
/// ```
///
/// Use instead:
/// ```python
/// import asyncio
///
///
/// async def foo():
/// asyncio.create_subprocess_shell(cmd)
/// ```
@@ -87,12 +99,19 @@ impl Violation for RunProcessInAsyncFunction {
///
/// ## Example
/// ```python
/// import os
///
///
/// async def foo():
/// os.waitpid(0)
/// ```
///
/// Use instead:
/// ```python
/// import asyncio
/// import os
///
///
/// def wait_for_process():
/// os.waitpid(0)
///


@@ -19,12 +19,18 @@ use crate::checkers::ast::Checker;
///
/// ## Example
/// ```python
/// import time
///
///
/// async def fetch():
/// time.sleep(1)
/// ```
///
/// Use instead:
/// ```python
/// import asyncio
///
///
/// async def fetch():
/// await asyncio.sleep(1)
/// ```

View File

@@ -21,6 +21,9 @@ use crate::rules::flake8_async::helpers::MethodName;
///
/// ## Example
/// ```python
/// import asyncio
///
///
/// async def func():
/// async with asyncio.timeout(2):
/// do_something()
@@ -28,6 +31,9 @@ use crate::rules::flake8_async::helpers::MethodName;
///
/// Use instead:
/// ```python
/// import asyncio
///
///
/// async def func():
/// async with asyncio.timeout(2):
/// do_something()

View File

@@ -18,12 +18,18 @@ use crate::{Edit, Fix, FixAvailability, Violation};
///
/// ## Example
/// ```python
/// import trio
///
///
/// async def double_sleep(x):
/// trio.sleep(2 * x)
/// ```
///
/// Use instead:
/// ```python
/// import trio
///
///
/// async def double_sleep(x):
/// await trio.sleep(2 * x)
/// ```

View File

@@ -18,7 +18,7 @@ use crate::checkers::ast::Checker;
///
/// ## Example
/// ```python
/// import flask
/// from flask import Flask
///
/// app = Flask()
///
@@ -27,7 +27,9 @@ use crate::checkers::ast::Checker;
///
/// Use instead:
/// ```python
/// import flask
/// import os
///
/// from flask import Flask
///
/// app = Flask()
///

View File

@@ -108,10 +108,10 @@ impl Violation for SubprocessWithoutShellEqualsTrue {
///
/// ## Example
/// ```python
/// import subprocess
/// import my_custom_subprocess
///
/// user_input = input("Enter a command: ")
/// subprocess.run(user_input, shell=True)
/// my_custom_subprocess.run(user_input, shell=True)
/// ```
///
/// ## References
@@ -265,14 +265,14 @@ impl Violation for StartProcessWithPartialPath {
/// ```python
/// import subprocess
///
/// subprocess.Popen(["chmod", "777", "*.py"])
/// subprocess.Popen(["chmod", "777", "*.py"], shell=True)
/// ```
///
/// Use instead:
/// ```python
/// import subprocess
///
/// subprocess.Popen(["chmod", "777", "main.py"])
/// subprocess.Popen(["chmod", "777", "main.py"], shell=True)
/// ```
///
/// ## References

View File

@@ -20,16 +20,22 @@ use crate::{FixAvailability, Violation};
///
/// ## Example
/// ```python
/// import itertools
///
/// itertools.batched(iterable, n)
/// ```
///
/// Use instead if the batches must be of uniform length:
/// ```python
/// import itertools
///
/// itertools.batched(iterable, n, strict=True)
/// ```
///
/// Or if the batches can be of non-uniform length:
/// ```python
/// import itertools
///
/// itertools.batched(iterable, n, strict=False)
/// ```
///

View File

@@ -20,11 +20,15 @@ use crate::{checkers::ast::Checker, fix::edits::add_argument};
///
/// ## Example
/// ```python
/// import warnings
///
/// warnings.warn("This is a warning")
/// ```
///
/// Use instead:
/// ```python
/// import warnings
///
/// warnings.warn("This is a warning", stacklevel=2)
/// ```
///

View File

@@ -24,6 +24,7 @@ mod tests {
#[test_case(Rule::UnnecessaryComprehensionInCall, Path::new("C419_2.py"))]
#[test_case(Rule::UnnecessaryDictComprehensionForIterable, Path::new("C420.py"))]
#[test_case(Rule::UnnecessaryDictComprehensionForIterable, Path::new("C420_1.py"))]
#[test_case(Rule::UnnecessaryDictComprehensionForIterable, Path::new("C420_2.py"))]
#[test_case(Rule::UnnecessaryDoubleCastOrProcess, Path::new("C414.py"))]
#[test_case(Rule::UnnecessaryGeneratorDict, Path::new("C402.py"))]
#[test_case(Rule::UnnecessaryGeneratorList, Path::new("C400.py"))]

View File

@@ -7,6 +7,7 @@ use ruff_python_ast::{self as ast, Arguments, Comprehension, Expr, ExprCall, Exp
use ruff_text_size::{Ranged, TextRange};
use crate::checkers::ast::Checker;
use crate::fix::edits::pad_start;
use crate::{Edit, Fix, FixAvailability, Violation};
/// ## What it does
@@ -136,12 +137,16 @@ pub(crate) fn unnecessary_dict_comprehension_for_iterable(
if checker.semantic().has_builtin_binding("dict") {
let edit = Edit::range_replacement(
checker
.generator()
.expr(&fix_unnecessary_dict_comprehension(
dict_comp.value.as_ref(),
generator,
)),
pad_start(
checker
.generator()
.expr(&fix_unnecessary_dict_comprehension(
dict_comp.value.as_ref(),
generator,
)),
dict_comp.start(),
checker.locator(),
),
dict_comp.range(),
);
diagnostic.set_fix(Fix::applicable_edit(
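The `pad_start` call above exists because splicing `dict.fromkeys(bar)` directly over the comprehension in `foo or{x: None for x in bar}` would fuse the tokens into `foo ordict.fromkeys(bar)`. A minimal standalone sketch of that padding rule (hypothetical signature, not the actual ruff helper, which works on a `Locator`):

```rust
// Sketch: prepend a space to `replacement` when the character immediately
// before the edit would otherwise merge with it into one token.
fn pad_start(replacement: &str, preceding: Option<char>) -> String {
    match preceding {
        Some(c) if c.is_alphanumeric() || c == '_' => format!(" {replacement}"),
        _ => replacement.to_string(),
    }
}

fn main() {
    // In `foo or{...}` the char before the comprehension is 'r', so pad:
    assert_eq!(pad_start("dict.fromkeys(bar)", Some('r')), " dict.fromkeys(bar)");
    // After whitespace, no padding is needed.
    assert_eq!(pad_start("dict.fromkeys(bar)", Some(' ')), "dict.fromkeys(bar)");
    println!("ok");
}
```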

View File

@@ -0,0 +1,16 @@
---
source: crates/ruff_linter/src/rules/flake8_comprehensions/mod.rs
---
C420_2.py:1:7: C420 [*] Unnecessary dict comprehension for iterable; use `dict.fromkeys` instead
|
1 | foo or{x: None for x in bar}
| ^^^^^^^^^^^^^^^^^^^^^^ C420
|
= help: Replace with `dict.fromkeys(iterable, value)`
Safe fix
1 |-foo or{x: None for x in bar}
1 |+foo or dict.fromkeys(bar)
2 2 |
3 3 |
4 4 | # C420 fix must make sure to insert a leading space if needed,

View File

@@ -24,7 +24,7 @@ use crate::checkers::ast::Checker;
/// ```python
/// import datetime
///
/// datetime.datetime.today()
/// datetime.date.today()
/// ```
///
/// Use instead:

View File

@@ -18,19 +18,25 @@ use crate::checkers::ast::Checker;
/// unexpectedly, as in:
///
/// ```python
/// import datetime
///
/// # Timezone: UTC-14
/// datetime.min.timestamp() # ValueError: year 0 is out of range
/// datetime.max.timestamp() # ValueError: year 10000 is out of range
/// datetime.datetime.min.timestamp() # ValueError: year 0 is out of range
/// datetime.datetime.max.timestamp() # ValueError: year 10000 is out of range
/// ```
///
/// ## Example
/// ```python
/// datetime.max
/// import datetime
///
/// datetime.datetime.max
/// ```
///
/// Use instead:
/// ```python
/// datetime.max.replace(tzinfo=datetime.UTC)
/// import datetime
///
/// datetime.datetime.max.replace(tzinfo=datetime.UTC)
/// ```
#[derive(ViolationMetadata)]
pub(crate) struct DatetimeMinMax {

View File

@@ -22,6 +22,8 @@ mod tests {
#[test_case(Path::new("EXE002_3.py"))]
#[test_case(Path::new("EXE003.py"))]
#[test_case(Path::new("EXE003_uv.py"))]
#[test_case(Path::new("EXE003_uv_tool.py"))]
#[test_case(Path::new("EXE003_uvx.py"))]
#[test_case(Path::new("EXE004_1.py"))]
#[test_case(Path::new("EXE004_2.py"))]
#[test_case(Path::new("EXE004_3.py"))]

View File

@@ -47,7 +47,12 @@ pub(crate) fn shebang_missing_python(
shebang: &ShebangDirective,
context: &LintContext,
) {
if shebang.contains("python") || shebang.contains("pytest") || shebang.contains("uv run") {
if shebang.contains("python")
|| shebang.contains("pytest")
|| shebang.contains("uv run")
|| shebang.contains("uvx")
|| shebang.contains("uv tool run")
{
return;
}
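The updated check is a plain substring test over the shebang text, now extended to the `uvx` and `uv tool run` invocations. A standalone sketch of the same logic (hypothetical function name):

```rust
// A shebang counts as launching Python if it mentions any known runner.
fn is_python_shebang(shebang: &str) -> bool {
    ["python", "pytest", "uv run", "uvx", "uv tool run"]
        .into_iter()
        .any(|needle| shebang.contains(needle))
}

fn main() {
    assert!(is_python_shebang("#!/usr/bin/env python3"));
    assert!(is_python_shebang("#!/usr/bin/env -S uvx ruff"));
    assert!(!is_python_shebang("#!/bin/sh"));
    println!("ok");
}
```

Note that `"uvx"` and `"uv tool run"` are checked separately: neither contains `"uv run"` as a substring, so both new branches are needed.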

View File

@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/flake8_executable/mod.rs
---

View File

@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/flake8_executable/mod.rs
---

View File

@@ -11,6 +11,7 @@ mod tests {
use crate::registry::Rule;
use crate::rules::pep8_naming;
use crate::settings::types::PreviewMode;
use crate::test::test_path;
use crate::{assert_diagnostics, settings};
@@ -172,4 +173,23 @@ mod tests {
assert_diagnostics!(snapshot, diagnostics);
Ok(())
}
#[test_case(Rule::DuplicateUnionMember, Path::new("PYI016.py"))]
#[test_case(Rule::DuplicateUnionMember, Path::new("PYI016.pyi"))]
fn preview_rules(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!(
"preview__{}_{}",
rule_code.noqa_code(),
path.to_string_lossy()
);
let diagnostics = test_path(
Path::new("flake8_pyi").join(path).as_path(),
&settings::LinterSettings {
preview: PreviewMode::Enabled,
..settings::LinterSettings::for_rule(rule_code)
},
)?;
assert_diagnostics!(snapshot, diagnostics);
Ok(())
}
}

View File

@@ -28,8 +28,10 @@ use crate::{AlwaysFixableViolation, Edit, Fix};
/// ## Example
///
/// ```pyi
/// from typing import Any
///
/// class Foo:
/// def __eq__(self, obj: typing.Any) -> bool: ...
/// def __eq__(self, obj: Any) -> bool: ...
/// ```
///
/// Use instead:

View File

@@ -19,11 +19,15 @@ use crate::{AlwaysFixableViolation, Applicability, Edit, Fix};
///
/// ## Example
/// ```python
/// from typing import Literal
///
/// foo: Literal["a", "b", "a"]
/// ```
///
/// Use instead:
/// ```python
/// from typing import Literal
///
/// foo: Literal["a", "b"]
/// ```
///

View File

@@ -1,17 +1,16 @@
use std::collections::HashSet;
use rustc_hash::FxHashSet;
use std::collections::HashSet;
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::comparable::ComparableExpr;
use ruff_python_ast::{Expr, ExprBinOp, Operator, PythonVersion};
use ruff_python_semantic::analyze::typing::traverse_union;
use ruff_text_size::{Ranged, TextRange};
use crate::checkers::ast::Checker;
use crate::{Applicability, Edit, Fix, FixAvailability, Violation};
use ruff_python_ast::{AtomicNodeIndex, Expr, ExprBinOp, ExprNoneLiteral, Operator, PythonVersion};
use ruff_python_semantic::analyze::typing::{traverse_union, traverse_union_and_optional};
use ruff_text_size::{Ranged, TextRange, TextSize};
use super::generate_union_fix;
use crate::checkers::ast::Checker;
use crate::preview::is_optional_as_none_in_union_enabled;
use crate::{Applicability, Edit, Fix, FixAvailability, Violation};
/// ## What it does
/// Checks for duplicate union members.
@@ -71,21 +70,35 @@ pub(crate) fn duplicate_union_member<'a>(checker: &Checker, expr: &'a Expr) {
union_type = UnionKind::PEP604;
}
let virtual_expr = if is_optional_as_none_in_union_enabled(checker.settings())
&& is_optional_type(checker, expr)
{
// If the union member is an `Optional`, add a virtual `None` literal.
&VIRTUAL_NONE_LITERAL
} else {
expr
};
// If we've already seen this union member, raise a violation.
if seen_nodes.insert(expr.into()) {
unique_nodes.push(expr);
if seen_nodes.insert(virtual_expr.into()) {
unique_nodes.push(virtual_expr);
} else {
diagnostics.push(checker.report_diagnostic(
DuplicateUnionMember {
duplicate_name: checker.generator().expr(expr),
duplicate_name: checker.generator().expr(virtual_expr),
},
// Use the real expression's range for diagnostics,
expr.range(),
));
}
};
// Traverse the union, collect all diagnostic members
traverse_union(&mut check_for_duplicate_members, checker.semantic(), expr);
if is_optional_as_none_in_union_enabled(checker.settings()) {
traverse_union_and_optional(&mut check_for_duplicate_members, checker.semantic(), expr);
} else {
traverse_union(&mut check_for_duplicate_members, checker.semantic(), expr);
}
if diagnostics.is_empty() {
return;
@@ -178,3 +191,12 @@ fn generate_pep604_fix(
applicability,
)
}
static VIRTUAL_NONE_LITERAL: Expr = Expr::NoneLiteral(ExprNoneLiteral {
node_index: AtomicNodeIndex::dummy(),
range: TextRange::new(TextSize::new(0), TextSize::new(0)),
});
fn is_optional_type(checker: &Checker, expr: &Expr) -> bool {
checker.semantic().match_typing_expr(expr, "Optional")
}
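The `VIRTUAL_NONE_LITERAL` trick treats `typing.Optional[T]` as carrying an implicit `None` member, so a union like `Optional[int] | None` contains a duplicate `None`. A simplified string-level sketch of that expansion (hypothetical helpers; the real rule traverses AST nodes, not strings):

```rust
use std::collections::HashSet;

// Expand `Optional[T]` into its members `T` and a virtual `None`,
// mirroring what `traverse_union_and_optional` visits.
fn flatten(member: &str) -> Vec<String> {
    if let Some(inner) = member
        .strip_prefix("Optional[")
        .and_then(|rest| rest.strip_suffix(']'))
    {
        vec![inner.to_string(), "None".to_string()]
    } else {
        vec![member.to_string()]
    }
}

// Report every member seen more than once after expansion.
fn duplicate_members(union: &[&str]) -> Vec<String> {
    let mut seen = HashSet::new();
    let mut dups = Vec::new();
    for member in union {
        for part in flatten(member) {
            if !seen.insert(part.clone()) {
                dups.push(part);
            }
        }
    }
    dups
}

fn main() {
    // `Optional[int] | None` flattens to [int, None, None]: `None` is duplicated.
    assert_eq!(duplicate_members(&["Optional[int]", "None"]), vec!["None"]);
    // `Optional[int] | Optional[int]` duplicates both `int` and `None`.
    assert_eq!(
        duplicate_members(&["Optional[int]", "Optional[int]"]),
        vec!["int", "None"]
    );
    println!("ok");
}
```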

View File

@@ -21,27 +21,47 @@ use crate::{Fix, FixAvailability, Violation};
///
/// For example:
/// ```python
/// from collections.abc import Container, Iterable, Sized
/// from typing import Generic, TypeVar
///
///
/// T = TypeVar("T")
/// K = TypeVar("K")
/// V = TypeVar("V")
///
///
/// class LinkedList(Generic[T], Sized):
/// def push(self, item: T) -> None:
/// self._items.append(item)
///
///
/// class MyMapping(
/// Generic[K, V],
/// Iterable[Tuple[K, V]],
/// Container[Tuple[K, V]],
/// Iterable[tuple[K, V]],
/// Container[tuple[K, V]],
/// ):
/// ...
/// ```
///
/// Use instead:
/// ```python
/// from collections.abc import Container, Iterable, Sized
/// from typing import Generic, TypeVar
///
///
/// T = TypeVar("T")
/// K = TypeVar("K")
/// V = TypeVar("V")
///
///
/// class LinkedList(Sized, Generic[T]):
/// def push(self, item: T) -> None:
/// self._items.append(item)
///
///
/// class MyMapping(
/// Iterable[Tuple[K, V]],
/// Container[Tuple[K, V]],
/// Iterable[tuple[K, V]],
/// Container[tuple[K, V]],
/// Generic[K, V],
/// ):
/// ...

View File

@@ -75,7 +75,7 @@ impl AlwaysFixableViolation for TypedArgumentDefaultInStub {
/// ## Example
///
/// ```pyi
/// def foo(arg=[]) -> None: ...
/// def foo(arg=bar()) -> None: ...
/// ```
///
/// Use instead:
@@ -120,7 +120,7 @@ impl AlwaysFixableViolation for ArgumentDefaultInStub {
///
/// ## Example
/// ```pyi
/// foo: str = "..."
/// foo: str = bar()
/// ```
///
/// Use instead:

View File

@@ -14,11 +14,15 @@ use crate::checkers::ast::Checker;
///
/// ## Example
/// ```pyi
/// from typing import TypeAlias
///
/// type_alias_name: TypeAlias = int
/// ```
///
/// Use instead:
/// ```pyi
/// from typing import TypeAlias
///
/// TypeAliasName: TypeAlias = int
/// ```
#[derive(ViolationMetadata)]

View File

@@ -914,4 +914,79 @@ PYI016.py:115:23: PYI016 [*] Duplicate union member `int`
115 |+field35: "int | str" # Error
116 116 |
117 117 |
118 118 |
PYI016.py:134:45: PYI016 [*] Duplicate union member `typing.Optional[int]`
|
132 | field40: typing.Union[typing.Optional[int], None]
133 | field41: typing.Optional[typing.Union[int, None]]
134 | field42: typing.Union[typing.Optional[int], typing.Optional[int]]
| ^^^^^^^^^^^^^^^^^^^^ PYI016
135 | field43: typing.Optional[int] | None
136 | field44: typing.Optional[int | None]
|
= help: Remove duplicate union member `typing.Optional[int]`
Safe fix
131 131 | # equivalent to int | None
132 132 | field40: typing.Union[typing.Optional[int], None]
133 133 | field41: typing.Optional[typing.Union[int, None]]
134 |-field42: typing.Union[typing.Optional[int], typing.Optional[int]]
134 |+field42: typing.Optional[int]
135 135 | field43: typing.Optional[int] | None
136 136 | field44: typing.Optional[int | None]
137 137 | field45: typing.Optional[int] | typing.Optional[int]
PYI016.py:137:33: PYI016 [*] Duplicate union member `typing.Optional[int]`
|
135 | field43: typing.Optional[int] | None
136 | field44: typing.Optional[int | None]
137 | field45: typing.Optional[int] | typing.Optional[int]
| ^^^^^^^^^^^^^^^^^^^^ PYI016
138 | # equivalent to int | dict | None
139 | field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
|
= help: Remove duplicate union member `typing.Optional[int]`
Safe fix
134 134 | field42: typing.Union[typing.Optional[int], typing.Optional[int]]
135 135 | field43: typing.Optional[int] | None
136 136 | field44: typing.Optional[int | None]
137 |-field45: typing.Optional[int] | typing.Optional[int]
137 |+field45: typing.Optional[int]
138 138 | # equivalent to int | dict | None
139 139 | field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
140 140 | field47: typing.Optional[int] | typing.Optional[dict]
PYI016.py:143:61: PYI016 [*] Duplicate union member `complex`
|
142 | # avoid reporting twice
143 | field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
| ^^^^^^^ PYI016
144 | field49: typing.Optional[complex | complex] | complex
|
= help: Remove duplicate union member `complex`
Safe fix
140 140 | field47: typing.Optional[int] | typing.Optional[dict]
141 141 |
142 142 | # avoid reporting twice
143 |-field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
143 |+field48: typing.Union[typing.Optional[complex], complex]
144 144 | field49: typing.Optional[complex | complex] | complex
PYI016.py:144:36: PYI016 [*] Duplicate union member `complex`
|
142 | # avoid reporting twice
143 | field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
144 | field49: typing.Optional[complex | complex] | complex
| ^^^^^^^ PYI016
|
= help: Remove duplicate union member `complex`
Safe fix
141 141 |
142 142 | # avoid reporting twice
143 143 | field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
144 |-field49: typing.Optional[complex | complex] | complex
144 |+field49: typing.Optional[complex] | complex

View File

@@ -1,6 +1,5 @@
---
source: crates/ruff_linter/src/rules/flake8_pyi/mod.rs
snapshot_kind: text
---
PYI016.pyi:7:15: PYI016 [*] Duplicate union member `str`
|
@@ -883,6 +882,8 @@ PYI016.pyi:113:61: PYI016 [*] Duplicate union member `list[int]`
112 | # Test case for mixed union type
113 | field34: typing.Union[list[int], str] | typing.Union[bytes, list[int]] # Error
| ^^^^^^^^^ PYI016
114 |
115 | # https://github.com/astral-sh/ruff/issues/18546
|
= help: Remove duplicate union member `list[int]`
@@ -892,3 +893,81 @@ PYI016.pyi:113:61: PYI016 [*] Duplicate union member `list[int]`
112 112 | # Test case for mixed union type
113 |-field34: typing.Union[list[int], str] | typing.Union[bytes, list[int]] # Error
113 |+field34: typing.Union[list[int], str, bytes] # Error
114 114 |
115 115 | # https://github.com/astral-sh/ruff/issues/18546
116 116 | # Expand Optional[T] to Union[T, None]
PYI016.pyi:125:45: PYI016 [*] Duplicate union member `typing.Optional[int]`
|
123 | field40: typing.Union[typing.Optional[int], None]
124 | field41: typing.Optional[typing.Union[int, None]]
125 | field42: typing.Union[typing.Optional[int], typing.Optional[int]]
| ^^^^^^^^^^^^^^^^^^^^ PYI016
126 | field43: typing.Optional[int] | None
127 | field44: typing.Optional[int | None]
|
= help: Remove duplicate union member `typing.Optional[int]`
Safe fix
122 122 | # equivalent to int | None
123 123 | field40: typing.Union[typing.Optional[int], None]
124 124 | field41: typing.Optional[typing.Union[int, None]]
125 |-field42: typing.Union[typing.Optional[int], typing.Optional[int]]
125 |+field42: typing.Optional[int]
126 126 | field43: typing.Optional[int] | None
127 127 | field44: typing.Optional[int | None]
128 128 | field45: typing.Optional[int] | typing.Optional[int]
PYI016.pyi:128:33: PYI016 [*] Duplicate union member `typing.Optional[int]`
|
126 | field43: typing.Optional[int] | None
127 | field44: typing.Optional[int | None]
128 | field45: typing.Optional[int] | typing.Optional[int]
| ^^^^^^^^^^^^^^^^^^^^ PYI016
129 | # equivalent to int | dict | None
130 | field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
|
= help: Remove duplicate union member `typing.Optional[int]`
Safe fix
125 125 | field42: typing.Union[typing.Optional[int], typing.Optional[int]]
126 126 | field43: typing.Optional[int] | None
127 127 | field44: typing.Optional[int | None]
128 |-field45: typing.Optional[int] | typing.Optional[int]
128 |+field45: typing.Optional[int]
129 129 | # equivalent to int | dict | None
130 130 | field46: typing.Union[typing.Optional[int], typing.Optional[dict]]
131 131 | field47: typing.Optional[int] | typing.Optional[dict]
PYI016.pyi:134:61: PYI016 [*] Duplicate union member `complex`
|
133 | # avoid reporting twice
134 | field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
| ^^^^^^^ PYI016
135 | field49: typing.Optional[complex | complex] | complex
|
= help: Remove duplicate union member `complex`
Safe fix
131 131 | field47: typing.Optional[int] | typing.Optional[dict]
132 132 |
133 133 | # avoid reporting twice
134 |-field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
134 |+field48: typing.Union[typing.Optional[complex], complex]
135 135 | field49: typing.Optional[complex | complex] | complex
PYI016.pyi:135:36: PYI016 [*] Duplicate union member `complex`
|
133 | # avoid reporting twice
134 | field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
135 | field49: typing.Optional[complex | complex] | complex
| ^^^^^^^ PYI016
|
= help: Remove duplicate union member `complex`
Safe fix
132 132 |
133 133 | # avoid reporting twice
134 134 | field48: typing.Union[typing.Optional[typing.Union[complex, complex]], complex]
135 |-field49: typing.Optional[complex | complex] | complex
135 |+field49: typing.Optional[complex] | complex

Some files were not shown because too many files have changed in this diff.