Compare commits

50 Commits

Author SHA1 Message Date
Charlie Marsh
1099c76fde Recommend running prek against upstream 2026-01-11 13:42:46 -05:00
Charlie Marsh
28cde5d7a6 Set priority for prek to enable concurrent execution 2026-01-11 13:33:20 -05:00
Charlie Marsh
2487233716 Use prek in documentation 2026-01-11 13:30:28 -05:00
Charlie Marsh
2c68057c4b [ty] Preserve argument signature in @total_ordering (#22496)
## Summary

Closes https://github.com/astral-sh/ty/issues/2435.
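For context, a minimal sketch of what `@total_ordering` does at runtime (the `Version` class and its field are hypothetical; the fix above concerns ty preserving the `other` parameter's signature on the synthesized comparison methods):

```python
from functools import total_ordering


@total_ordering
class Version:
    def __init__(self, n: int) -> None:
        self.n = n

    def __eq__(self, other: object) -> bool:
        return isinstance(other, Version) and self.n == other.n

    # total_ordering synthesizes __le__, __gt__, and __ge__ from __lt__
    # and __eq__; the fix keeps the `other` parameter's annotation intact
    # in those synthesized signatures instead of losing it.
    def __lt__(self, other: "Version") -> bool:
        return self.n < other.n
```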
2026-01-10 14:35:58 -05:00
Charlie Marsh
8e29be9c1c Add let chains preference to development guidelines (#22497) 2026-01-10 12:29:07 -05:00
Dylan
880513a013 Respect fmt: skip for multiple statements on same logical line (#22119)
This PR adjusts the logic for skipping formatting so that a `fmt: skip`
can affect multiple statements if they lie on the same line.

Specifically, a `fmt: skip` comment will now suppress all the statements
in the suite in which it appears whose range intersects the line
containing the skip directive. For example:

```python
x=[
'1'
];x=2 # fmt: skip
```

remains unchanged after formatting.

(Note that compound statements are somewhat special and were handled in
a previous PR - see #20633).


Closes #17331 and #11430.

Simplest to review commit by commit - the key diffs of interest are the
commit introducing the core logic, and the diff between the snapshots
introduced in the last commit (compared to the second commit).

# Implementation

On `main` we format a suite of statements by iterating through them. If
we meet a statement with a leading or trailing (own-line) `fmt: off`
comment, then we suppress formatting until we meet a `fmt: on` comment.
Otherwise we format the statement using its own formatting rule.

How are `fmt: skip` comments handled then? They are handled internally
to the formatting of each statement. Specifically, calling `.fmt` on a
statement node will first check to see if there is a trailing,
end-of-line `fmt: skip` (or `fmt: off`/`yapf: off`), and if so then
write the node with suppressed formatting.

In this PR we move the responsibility for handling `fmt: skip` into the
formatting logic of the suite itself. This is done as follows:

- Before beginning to format the suite, we do a pass through the
statements and collect the data of ranges with skipped formatting. More
specifically, we create a map with key given by the _first_ skipped
statement in a block and value a pair consisting of the _last_ skipped
statement and the _range_ to write verbatim.
- We iterate as before, but if we meet a statement that is a key in the
map constructed above, we pause to write the associated range verbatim.
We then advance the iterator to the last statement in the block and
proceed as before.

## Addendum on range formatting

We also had to make some changes to range formatting in order to support
this new behavior. For example, we want to make sure that

```python
<RANGE_START>x=1<RANGE_END>;x=2 # fmt: skip
```

formats verbatim, rather than becoming 

```python
x = 1;x=2 # fmt: skip
```

Recall that range formatting proceeds in two steps:
1. Find the smallest enclosing node containing the range AND that has
enough info to format the range (so it may be larger than you think,
e.g. a docstring has enclosing node given by the suite, not the string
itself.)
2. Carve out the formatted range from the result of formatting that
enclosing node.

We had to modify (1), since the suite knows how to format skipped nodes,
but nodes may not "know" they are skipped. To do this we altered the
`visit_body` method of the `FindEnclosingNode` visitor: now we iterate
through the statements and check for skipped ranges intersecting the
format range. If we find them, we return without descending. The result
is to consider the statement containing the suite as the enclosing node
in this case.
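The suite-level pass described above can be sketched with a toy model (hypothetical and heavily simplified: statements are modeled as `(start_line, end_line)` ranges, and a `fmt: skip` suppresses every statement whose range intersects the line carrying the directive):

```python
def build_skip_map(statements, skip_lines):
    """Map: index of the first skipped statement in a block ->
    (index of the last skipped statement, (start, end) range to emit verbatim)."""
    skip_map = {}
    i = 0
    while i < len(statements):
        start, end = statements[i]
        # Does a `fmt: skip` line intersect this statement's range?
        line = next((l for l in skip_lines if start <= l <= end), None)
        if line is None:
            i += 1
            continue
        # Extend forward over later statements on the same logical line.
        last = i
        while last + 1 < len(statements) and statements[last + 1][0] <= line:
            last += 1
        skip_map[i] = (last, (start, statements[last][1]))
        i = last + 1  # resume iteration after the suppressed block
    return skip_map


# The example from above: the first statement spans lines 1-3, the second
# sits on line 3 alongside the directive -- both are written verbatim.
assert build_skip_map([(1, 3), (3, 3)], skip_lines=[3]) == {0: (1, (1, 3))}
```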
2026-01-10 14:56:58 +00:00
Charlie Marsh
046c5a46d8 [ty] Support dataclass_transform as a function call (#22378)
## Summary

Instead of just as a decorator.

Closes https://github.com/astral-sh/ty/issues/2319.
2026-01-10 08:45:45 -05:00
Micha Reiser
cfed34334c Update mypy primer pin (#22490) 2026-01-10 14:00:00 +01:00
Charlie Marsh
11cc324449 [ty] Detect invalid @total_ordering applications in non-decorator contexts (#22486)
## Summary

E.g., `ValidOrderedClass = total_ordering(HasOrderingMethod)`.
2026-01-09 19:37:58 -05:00
Charlie Marsh
c88e1a0663 [ty] Avoid emitting Liskov repeated violations from grandparent to child (#22484)
## Summary

If parent violates LSP against grandparent, and child has the same
violation (but matches parent), we no longer flag the LSP violation on
child, since it can't be fixed without violating parent.

If parent violates LSP against grandparent, and child violates LSP
against both parent and grandparent, we emit two diagnostics (one for
each violation).

If parent violates LSP against grandparent, and child violates LSP
against parent (but not grandparent), we flag it.

Closes https://github.com/astral-sh/ty/issues/2000.
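A minimal illustration of the first case (hypothetical classes; the incompatible return-type override stands in for the LSP violation):

```python
class Grandparent:
    def get(self) -> int:
        return 0


class Parent(Grandparent):
    # LSP violation relative to Grandparent: the diagnostic is emitted here.
    def get(self) -> str:
        return "0"


class Child(Parent):
    # Same violation, but consistent with Parent: no longer re-flagged,
    # since it cannot be fixed without breaking compatibility with Parent.
    def get(self) -> str:
        return "0"
```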
2026-01-09 19:27:57 -05:00
William Woodruff
2c7ac17b1e Remove two old zizmor ignore comments (#22485) 2026-01-09 18:02:44 -05:00
Matthias Schoettle
a0f2cd0ded Add language: golang to actionlint pre-commit hook (#22483) 2026-01-09 19:30:00 +00:00
Alex Waygood
dc61104726 [ty] Derive Default in a few more places in place.rs (#22481) 2026-01-09 18:37:36 +00:00
Jason K Hall
f40c578ffb [ruff] document RUF100 trailing comment fix behavior (#22479)
## Summary
Ruff's `--fix` for `RUF100` can inadvertently remove trailing comments
(e.g., `pylint` or `mypy` suppressions) by interpreting them as
descriptions. This PR adds a "Conflict with other linters" section to
the rule documentation to clarify this behavior and provide the
double-hash (`# noqa # pylint`) workaround.

## Fixes
Fixes #20762
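A sketch of the documented workaround (the suppression codes here are illustrative): when `--fix` removes an unused `# noqa`, a trailing directive can be read as the suppression's description and deleted with it; a second `#` makes it a separate comment.

```python
import os

# The extra `#` separates the pylint directive from the noqa comment,
# so the RUF100 fix leaves it in place:
value = os.sep  # noqa # pylint: disable=invalid-name
```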
2026-01-09 09:01:09 -08:00
Leo Hayashi
baaf6966f6 [ruff] Split WASM build and publish into separate workflows (#12387) (#22476)
Co-authored-by: Hayashi Reo <reo@wantedly.com>
2026-01-09 17:41:13 +01:00
Zanie Blue
a6df4a3be7 Add llms.txt support for documentation (#22463)
Matching uv.

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-09 09:56:41 -06:00
Micha Reiser
1094009790 Fix configuration path in --show-settings (#22478) 2026-01-09 16:28:57 +01:00
Andrew Gallant
10eb3d52d5 [ty] Rank top-level module symbols above most other symbols
This makes it so, e.g., `os<CURSOR>` will suggest the top-level stdlib
`os` module even if there is an `os` symbol elsewhere in your project.

The way this is done is somewhat overwrought, but it's done to avoid
suggesting top-level modules over other symbols already in scope.

Fixes astral-sh/issues#1852
2026-01-09 10:11:37 -05:00
Andrew Gallant
91d24ebb92 [ty] Give exact completion matches very high ranking
Aside from ruling out "impossible" completions in the current context,
we give completions with an exact match the highest possible ranking.
2026-01-09 09:55:32 -05:00
Andrew Gallant
e68fba20a9 [ty] Remove type-var-typing-over-ast evaluation task
This is now strictly redundant with the `typing-gets-priority` task.
2026-01-09 09:55:32 -05:00
Andrew Gallant
64117c1146 [ty] Improve completion ranking based on origin of symbols
Part of this was already done, but it was half-assed. We now look at the
search path that a symbol came from and centralize a symbol's origin
classification.

The preference ordering here is maybe not the right one, but we can
iterate as users give us feedback. Note also that the preference
ordering based on the origin is pretty low in the relevance sorting.
This means that other more specific criteria will and can override this.

This results in some nice improvements to our evaluation tasks.
2026-01-09 09:55:32 -05:00
Andrew Gallant
2196ef3a33 [ty] Add package priority completion evaluation tasks
This commit adds two new tests. One checks that a symbol in the current
project gets priority over a symbol in the standard library. Another
checks that a symbol in a third party dependency gets priority over a
symbol in the standard library. We don't get either of these right
today.

Note that these comparisons are done ceteris paribus. A symbol from the
standard library could still be ranked above a symbol elsewhere.
(Although I believe currently this is somewhat rare.)
2026-01-09 09:55:32 -05:00
Andrew Gallant
c36397031b [ty] Actually ignore hidden directories in completion evaluation
I apparently don't know how to use my own API. Previously,
we would skip, e.g., `.venv`, but still descend into it.

This was annoying in practice because I sometimes have an
environment in one of the truth task directories. The eval
command should ignore that entirely, but it ended up
choking on it without properly ignoring hidden files
and directories.
2026-01-09 09:55:32 -05:00
Andrew Gallant
eef34958f9 [ty] More sensible sorting for results returned by all_symbols
Previously, we would only sort by name and file path (unless the symbol
refers to an entire module). This led to somewhat confusing ordering,
since the file path is absolute and can be anything.

Sorting by module name after symbol name gives a more predictable
ordering in common practice.

This is mostly just an improvement for debugging purposes, i.e., when
looking at the output of `all_symbols` directly. This mostly shouldn't
impact completion ordering since completions do their own ranking.
2026-01-09 09:55:32 -05:00
Andrew Gallant
f1bd5f1941 [ty] Rank unimported symbols from typing higher
This addresses a number of minor annoyances where users really want
imports from `typing` most of the time.

For example:
https://github.com/astral-sh/ty/issues/1274#issuecomment-3345884227
2026-01-09 09:55:32 -05:00
Andrew Gallant
56862f8241 [ty] Refactor ranking
This moves the information we want to use to rank completions into
`Completion` itself. (This was the primary motivation for creating a
`CompletionBuilder` in the first place.)

The principal advantage here is that we now only need to compute the
relevance information for each completion exactly once. Previously, we
were computing it on every comparison, which might end up doing
redundant work for a not insignificant number of completions.

The relevance information is also specifically constructed from the
builder so that, in the future, we might choose to short-circuit
construction if we know we'll never send it back to the client (e.g.,
its ranking is worse than the lowest ranked completion). But we don't
implement that optimization yet.
2026-01-09 09:55:32 -05:00
Andrew Gallant
e9cc2f6f42 [ty] Refactor completion construction to use a builder
I want to be able to attach extra data to each `Completion`, but not
burden callers with the need to construct it. This commit helps get us
to that point by requiring callers to use a `CompletionBuilder` for
construction instead of a `Completion` itself.

I think this will also help in the future if it proves to be the case
that we can improve performance by delaying work until we actually build
a `Completion`, which might only happen if we know we won't throw it
out. But we aren't quite there yet.

This also lets us tighten things up a little bit and makes completion
construction less noisy. The downside is that callers no longer need to
consider "every" completion field.

There should not be any behavior changes here.
2026-01-09 09:55:32 -05:00
Andrew Gallant
8dd56d4264 [ty] Rename internal completion Rank type to Relevance
I think "relevance" captures the meaning of this type more precisely.
2026-01-09 09:55:32 -05:00
Andrew Gallant
d4c1b0ccc7 [ty] Add evaluation tasks for a few cases
I went through https://github.com/astral-sh/ty/issues/1274 and tried to
extract what I could into eval tasks. Some of the suggestions from that
issue have already been done, but most haven't.

This captures the status quo.
2026-01-09 09:55:32 -05:00
Micha Reiser
e61657ff3c [ty] Enable unused-type-ignore-comment by default (#22474) 2026-01-09 10:58:43 +00:00
Micha Reiser
ba5dd5837c [ty] Pass slice to specialize (#22421) 2026-01-09 09:45:39 +01:00
Micha Reiser
c5f6a74da5 [ty] Fix goto definition for relative imports in third-party files (#22457) 2026-01-09 09:30:31 +01:00
Micha Reiser
b3cde98cd1 [ty] Update salsa (#22473) 2026-01-09 09:21:45 +01:00
Carl Meyer
f9f7a6901b Complete minor TODO in ty_python_semantic crate (#22468)
This TODO is very old -- we have long since recorded this definition.
Updating the test to actually assert the declaration requires a new
helper method for declarations, to complement the existing
`first_public_binding` helper.

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-08 16:04:34 -08:00
Dylan
c920cf8cdb Bump 0.14.11 (#22462) 2026-01-08 12:51:47 -06:00
Micha Reiser
bb757b5a79 [ty] Don't show diagnostics for excluded files (#22455) 2026-01-08 18:27:28 +01:00
Charlie Marsh
1f49e8ef51 Include configured src directories when resolving graphs (#22451)
## Summary

This PR augments the detected source paths with the user-configured
`src` when computing roots for `ruff analyze graph`.
2026-01-08 15:19:15 +00:00
Douglas Creager
701f5134ab [ty] Only consider fully static pivots when deriving transitive constraints (#22444)
When working with constraint sets, we track transitive relationships
between the constraints in the set. For instance, in `S ≤ int ∧ int ≤
T`, we can infer that `S ≤ T`. However, we should only consider fully
static types when looking for a "pivot" for this kind of transitive
relationship. The same pattern does not hold for `S ≤ Any ∧ Any ≤ T`;
because the two `Any`s can materialize to different types, we cannot
infer that `S ≤ T`.

Fixes https://github.com/astral-sh/ty/issues/2371
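The rule can be modeled with a toy constraint set over type names (hypothetical and highly simplified; ty's actual representation is richer):

```python
def derive_transitive(constraints, gradual=frozenset({"Any"})):
    """One derivation pass: from a <= pivot and pivot <= b, add a <= b,
    but only when the pivot is fully static. Gradual types such as Any
    may materialize to different types on each side, so they are not
    valid pivots."""
    derived = set(constraints)
    for a, p1 in constraints:
        for p2, b in constraints:
            if p1 == p2 and p1 not in gradual:
                derived.add((a, b))
    return derived
```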
2026-01-08 09:31:55 -05:00
Micha Reiser
eea9ad8352 Pin maturin version (#22454) 2026-01-08 12:39:51 +01:00
Alex Waygood
eeac2bd3ee [ty] Optimize union building for unions with many enum-literal members (#22363) 2026-01-08 10:50:04 +00:00
Jason K Hall
7319c37f4e docs: fix jupyter notebook discovery info for editors (#22447)
Resolves #21892

## Summary

This PR updates `docs/editors/features.md` to clarify that Jupyter
Notebooks are now included by default as of version 0.6.0.
2026-01-08 11:52:01 +05:30
Amethyst Reese
805503c19a [ruff] Improve fix title for RUF102 invalid rule code (#22100)
## Summary

Updates the fix title for RUF102 to either specify which rule code to
remove, or clarify
that the entire suppression comment should be removed.

## Test Plan

Updated test snapshots.
2026-01-07 17:23:18 -08:00
Charlie Marsh
68a2f6c57d [ty] Fix super() with TypeVar-annotated self and cls parameter (#22208)
## Summary

This PR fixes `super()` handling when the first parameter (`self` or
`cls`) is annotated with a TypeVar, like `Self`.

Previously, `super()` would incorrectly resolve TypeVars to their bounds
before creating the `BoundSuperType`. So if you had `self: Self` where
`Self` is bounded by `Parent`, we'd process `Parent` as a
`NominalInstance` and end up with `SuperOwnerKind::Instance(Parent)`.

As a result:

```python
class Parent:
    @classmethod
    def create(cls) -> Self:
        return cls()

class Child(Parent):
    @classmethod
    def create(cls) -> Self:
        return super().create()  # Error: Argument type `Self@create` does not satisfy upper bound `Parent`
```

We now track two additional variants on `SuperOwnerKind` for TypeVar
owners:

- `InstanceTypeVar`: for instance methods where self is a TypeVar (e.g.,
`self: Self`).
- `ClassTypeVar`: for classmethods where `cls` is a `TypeVar` wrapped in
`type[...]` (e.g., `cls: type[Self]`).

Closes https://github.com/astral-sh/ty/issues/2122.

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2026-01-07 19:56:09 -05:00
Alex Waygood
abaa735e1d [ty] Improve UnionBuilder performance by changing Type::is_subtype_of calls to Type::is_redundant_with (#22337) 2026-01-07 22:17:44 +00:00
Jelle Zijlstra
c02d164357 Check required-version before parsing rules (#22410)
Co-authored-by: Micha Reiser <micha@reiser.io>
2026-01-07 17:29:27 +00:00
Andrew Gallant
88aa3f82f0 [ty] Fix generally poor ranking in playground completions
We enabled [`CompletionList::isIncomplete`] in our LSP server a while back
in order to have more of a say in how we rank and filter completions.
When it isn't set, the client tends to ask for completions less
frequently and will instead do its own filtering.

But... we did not enable it for the playground. Which I guess didn't
result in anything noticeably bad until we started limiting completions
to 1,000 suggestions. This meant that if the _initial_ completion
response didn't include the ultimate desired answer, then it would never
show up in the results until the client requested completions again.
This in turn led to some very poor completions in some cases.

This all gets fixed by simply enabling `isIncomplete` for Monaco.

Fixes astral-sh/ty#2340

[`CompletionList::isIncomplete`](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#completionList)
2026-01-07 12:25:26 -05:00
Carl Meyer
30902497db [ty] Make signature return and parameter types non-optional (#22425)
## Summary

Fixes https://github.com/astral-sh/ty/issues/2363
Fixes https://github.com/astral-sh/ty/issues/2013

And several other bugs with the same root cause. And makes any similar
bugs impossible by construction.

Previously we distinguished "no annotation" (Rust `None`) from
"explicitly annotated with something of type `Unknown`" (which is not an
error, and results in the annotation being of Rust type
`Some(Type::DynamicType(Unknown))`), even though semantically these
should be treated the same.

This was a bit of a bug magnet, because it was easy to forget to make
this `None` -> `Unknown` translation everywhere we needed to. And in
fact we did fail to do it in the case of materializing a callable,
leading to a top-materialized callable still having (rust) `None` return
type, which should have instead materialized to `object`.

This also fixes several other bugs related to not handling un-annotated
return types correctly:
1. We previously considered the return type of an unannotated `async
def` to be `Unknown`, where it should be `CoroutineType[Any, Any,
Unknown]`.
2. We previously failed to infer a ParamSpec if the return type of the
callable we are inferring against was not annotated.
3. We previously wrongly returned `Unknown` from `some_dict.get("key",
None)` if the value type of `some_dict` included a callable type with
un-annotated return type.

We now make signature return types and annotated parameter types
required, and we eagerly insert `Unknown` if there's no annotation. Most
of the diff is just a bunch of mechanical code changes where we
construct these types, and simplifications where we use them.

One exception is type display: when a callable type has un-annotated
parameters, we want to display them as un-annotated, but if it has a
parameter explicitly annotated with something of `Unknown` type, we want
to display that parameter as `x: Unknown` (it would be confusing if it
looked like your annotation just disappeared entirely).

Fortunately, we already have a mechanism in place for handling this: the
`inferred_annotation` flag, which suppresses display of an annotation.
Previously we used it only for `self` and `cls` parameters with an
inferred annotated type -- but we now also set it for any un-annotated
parameter, for which we infer `Unknown` type.

We also need to normalize `inferred_annotation`, since it's display-only
and shouldn't impact type equivalence. (This is technically a
previously-existing bug, it just never came up when it only affected
self types -- now it comes up because we have tests asserting that `def
f(x)` and `def g(x: Unknown)` are equivalent.)
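The display rule can be modeled in miniature (a sketch of the behavior described, not ty's actual data structures; all names here are hypothetical):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Param:
    name: str
    ty: str                  # always present now; never a Rust-style None
    inferred: bool = False   # suppresses display of an inferred annotation


def make_param(name, annotation=None):
    # Eagerly insert Unknown for un-annotated parameters, flagging the
    # annotation as inferred so display can omit it.
    if annotation is None:
        return Param(name, "Unknown", inferred=True)
    return Param(name, annotation)


def display(p):
    return p.name if p.inferred else f"{p.name}: {p.ty}"


# `def f(x)` displays bare, while an explicit `x: Unknown` stays visible:
assert display(make_param("x")) == "x"
assert display(make_param("x", "Unknown")) == "x: Unknown"
```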

## Test Plan

Added mdtests.
2026-01-07 09:18:39 -08:00
Alex Waygood
3ad99fb1f4 [ty] Fix an mdtest title (#22439) 2026-01-07 16:34:56 +00:00
Micha Reiser
d0ff59cfe5 [ty] Use Pool from regex_automata to reuse the matches allocations (#22438) 2026-01-07 17:22:35 +01:00
Andrew Gallant
952193e0c6 [ty] Offer completions for T when a value has type Unknown | T
Fixes astral-sh/ty#2197
2026-01-07 10:15:36 -05:00
173 changed files with 6270 additions and 2262 deletions


@@ -5,4 +5,4 @@ rustup component add clippy rustfmt
cargo install cargo-insta
cargo fetch
-pip install maturin pre-commit
+pip install maturin prek


@@ -5,5 +5,4 @@
[rules]
possibly-unresolved-reference = "warn"
possibly-missing-import = "warn"
-unused-ignore-comment = "warn"
division-by-zero = "warn"


@@ -51,6 +51,7 @@ jobs:
- name: "Build sdist"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
command: sdist
args: --out dist
- name: "Test sdist"
@@ -81,6 +82,7 @@ jobs:
- name: "Build wheels - x86_64"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: x86_64
args: --release --locked --out dist
- name: "Upload wheels"
@@ -123,6 +125,7 @@ jobs:
- name: "Build wheels - aarch64"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: aarch64
args: --release --locked --out dist
- name: "Test wheel - aarch64"
@@ -179,6 +182,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.platform.target }}
args: --release --locked --out dist
env:
@@ -232,6 +236,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.target }}
manylinux: auto
args: --release --locked --out dist
@@ -308,6 +313,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.platform.target }}
manylinux: auto
docker-options: ${{ matrix.platform.maturin_docker_options }}
@@ -374,6 +380,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.target }}
manylinux: musllinux_1_2
args: --release --locked --out dist
@@ -439,6 +446,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.platform.target }}
manylinux: musllinux_1_2
args: --release --locked --out dist

.github/workflows/build-wasm.yml (new file, 58 lines)

@@ -0,0 +1,58 @@
# Build ruff_wasm for npm.
#
# Assumed to run as a subworkflow of .github/workflows/release.yml; specifically, as a local
# artifacts job within `cargo-dist`.
name: "Build wasm"
on:
workflow_call:
inputs:
plan:
required: true
type: string
pull_request:
paths:
- .github/workflows/build-wasm.yml
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
env:
CARGO_INCREMENTAL: 0
CARGO_NET_RETRY: 10
CARGO_TERM_COLOR: always
RUSTUP_MAX_RETRIES: 10
jobs:
build:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
runs-on: ubuntu-latest
strategy:
matrix:
target: [web, bundler, nodejs]
fail-fast: false
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: jetli/wasm-pack-action@0d096b08b4e5a7de8c28de67e11e945404e9eefa # v0.4.0
with:
version: v0.13.1
- uses: jetli/wasm-bindgen-action@20b33e20595891ab1a0ed73145d8a21fc96e7c29 # v0.2.0
- name: "Run wasm-pack build"
run: wasm-pack build --target ${{ matrix.target }} crates/ruff_wasm
- name: "Rename generated package"
run: | # Replace the package name w/ jq
jq '.name="@astral-sh/ruff-wasm-${{ matrix.target }}"' crates/ruff_wasm/pkg/package.json > /tmp/package.json
mv /tmp/package.json crates/ruff_wasm/pkg
- run: cp LICENSE crates/ruff_wasm/pkg # wasm-pack does not put the LICENSE file in the pkg
- name: "Upload wasm artifact"
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: artifacts-wasm-${{ matrix.target }}
path: crates/ruff_wasm/pkg


@@ -1,25 +1,18 @@
-# Build and publish ruff-api for wasm.
+# Publish ruff_wasm to npm.
#
# Assumed to run as a subworkflow of .github/workflows/release.yml; specifically, as a publish
# job within `cargo-dist`.
-name: "Build and publish wasm"
+name: "Publish wasm"
on:
workflow_dispatch:
workflow_call:
inputs:
plan:
required: true
type: string
env:
CARGO_INCREMENTAL: 0
CARGO_NET_RETRY: 10
CARGO_TERM_COLOR: always
RUSTUP_MAX_RETRIES: 10
jobs:
-ruff_wasm:
+publish:
runs-on: ubuntu-latest
permissions:
contents: read
@@ -29,31 +22,19 @@ jobs:
target: [web, bundler, nodejs]
fail-fast: false
steps:
-- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+- uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
persist-credentials: false
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: jetli/wasm-pack-action@0d096b08b4e5a7de8c28de67e11e945404e9eefa # v0.4.0
with:
version: v0.13.1
- uses: jetli/wasm-bindgen-action@20b33e20595891ab1a0ed73145d8a21fc96e7c29 # v0.2.0
- name: "Run wasm-pack build"
run: wasm-pack build --target ${{ matrix.target }} crates/ruff_wasm
- name: "Rename generated package"
run: | # Replace the package name w/ jq
jq '.name="@astral-sh/ruff-wasm-${{ matrix.target }}"' crates/ruff_wasm/pkg/package.json > /tmp/package.json
mv /tmp/package.json crates/ruff_wasm/pkg
- run: cp LICENSE crates/ruff_wasm/pkg # wasm-pack does not put the LICENSE file in the pkg
name: artifacts-wasm-${{ matrix.target }}
path: pkg
- uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 24
registry-url: "https://registry.npmjs.org"
- name: "Publish (dry-run)"
if: ${{ inputs.plan == '' || fromJson(inputs.plan).announcement_tag_is_implicit }}
-run: npm publish --dry-run crates/ruff_wasm/pkg
+run: npm publish --dry-run pkg
- name: "Publish"
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
-run: npm publish --provenance --access public crates/ruff_wasm/pkg
+run: npm publish --provenance --access public pkg
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}


@@ -112,12 +112,22 @@ jobs:
"contents": "read"
"packages": "write"
custom-build-wasm:
needs:
- plan
if: ${{ needs.plan.outputs.publishing == 'true' || fromJson(needs.plan.outputs.val).ci.github.pr_run_mode == 'upload' || inputs.tag == 'dry-run' }}
uses: ./.github/workflows/build-wasm.yml
with:
plan: ${{ needs.plan.outputs.val }}
secrets: inherit
# Build and package all the platform-agnostic(ish) things
build-global-artifacts:
needs:
- plan
- custom-build-binaries
- custom-build-docker
- custom-build-wasm
runs-on: "depot-ubuntu-latest-4"
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@@ -165,9 +175,10 @@ jobs:
- plan
- custom-build-binaries
- custom-build-docker
- custom-build-wasm
- build-global-artifacts
# Only run if we're "publishing", and only if plan, local and global didn't fail (skipped is fine)
-if: ${{ always() && needs.plan.result == 'success' && needs.plan.outputs.publishing == 'true' && (needs.build-global-artifacts.result == 'skipped' || needs.build-global-artifacts.result == 'success') && (needs.custom-build-binaries.result == 'skipped' || needs.custom-build-binaries.result == 'success') && (needs.custom-build-docker.result == 'skipped' || needs.custom-build-docker.result == 'success') }}
+if: ${{ always() && needs.plan.result == 'success' && needs.plan.outputs.publishing == 'true' && (needs.build-global-artifacts.result == 'skipped' || needs.build-global-artifacts.result == 'success') && (needs.custom-build-binaries.result == 'skipped' || needs.custom-build-binaries.result == 'success') && (needs.custom-build-docker.result == 'skipped' || needs.custom-build-docker.result == 'success') && (needs.custom-build-wasm.result == 'skipped' || needs.custom-build-wasm.result == 'success') }}
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
runs-on: "depot-ubuntu-latest-4"


@@ -40,12 +40,12 @@ jobs:
- name: Install the latest version of uv
uses: astral-sh/setup-uv@681c641aba71e4a1c380be3ab5e12ad51f415867 # v7.1.6
with:
-enable-cache: true # zizmor: ignore[cache-poisoning] acceptable risk for CloudFlare pages artifact
+enable-cache: true
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
with:
workspaces: "ruff"
-lookup-only: false # zizmor: ignore[cache-poisoning] acceptable risk for CloudFlare pages artifact
+lookup-only: false
- name: Install Rust toolchain
run: rustup show


@@ -21,15 +21,62 @@ exclude: |
)$
repos:
# Priority 0: Read-only hooks; hooks that modify disjoint file types.
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v6.0.0
hooks:
- id: check-merge-conflict
priority: 0
- repo: https://github.com/abravalheri/validate-pyproject
rev: v0.24.1
hooks:
- id: validate-pyproject
priority: 0
- repo: https://github.com/crate-ci/typos
rev: v1.40.0
hooks:
- id: typos
priority: 0
- repo: local
hooks:
- id: cargo-fmt
name: cargo fmt
entry: cargo fmt --
language: system
types: [rust]
pass_filenames: false # This makes it a lot faster
priority: 0
# Prettier
- repo: https://github.com/rbubley/mirrors-prettier
rev: v3.7.4
hooks:
- id: prettier
types: [yaml]
priority: 0
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/zizmorcore/zizmor-pre-commit
rev: v1.19.0
hooks:
- id: zizmor
priority: 0
- repo: https://github.com/python-jsonschema/check-jsonschema
rev: 0.36.0
hooks:
- id: check-github-workflows
priority: 0
- repo: https://github.com/shellcheck-py/shellcheck-py
rev: v0.11.0.1
hooks:
- id: shellcheck
priority: 0
- repo: https://github.com/executablebooks/mdformat
rev: 1.0.0
@@ -44,7 +91,20 @@ repos:
docs/formatter/black\.md
| docs/\w+\.md
)$
priority: 0
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.14.10
hooks:
- id: ruff-format
priority: 0
- id: ruff-check
args: [--fix, --exit-non-zero-on-fix]
types_or: [python, pyi]
require_serial: true
priority: 1
# Priority 1: Second-pass fixers (e.g., markdownlint-fix runs after mdformat).
- repo: https://github.com/igorshubovych/markdownlint-cli
rev: v0.47.0
hooks:
@@ -54,7 +114,9 @@ repos:
docs/formatter/black\.md
| docs/\w+\.md
)$
priority: 1
# Priority 2: blacken-docs runs after markdownlint-fix (both modify markdown).
- repo: https://github.com/adamchainz/blacken-docs
rev: 1.20.0
hooks:
@@ -68,48 +130,7 @@ repos:
)$
additional_dependencies:
- black==25.12.0
- repo: https://github.com/crate-ci/typos
rev: v1.40.0
hooks:
- id: typos
- repo: local
hooks:
- id: cargo-fmt
name: cargo fmt
entry: cargo fmt --
language: system
types: [rust]
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.14.10
hooks:
- id: ruff-format
- id: ruff-check
args: [--fix, --exit-non-zero-on-fix]
types_or: [python, pyi]
require_serial: true
# Prettier
- repo: https://github.com/rbubley/mirrors-prettier
rev: v3.7.4
hooks:
- id: prettier
types: [yaml]
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/zizmorcore/zizmor-pre-commit
rev: v1.19.0
hooks:
- id: zizmor
- repo: https://github.com/python-jsonschema/check-jsonschema
rev: 0.36.0
hooks:
- id: check-github-workflows
priority: 2
# `actionlint` hook, for verifying correct syntax in GitHub Actions workflows.
# Some additional configuration for `actionlint` can be found in `.github/actionlint.yaml`.
@@ -125,13 +146,10 @@ repos:
args:
- "-ignore=SC2129" # ignorable stylistic lint from shellcheck
- "-ignore=SC2016" # another shellcheck lint: seems to have false positives?
language: golang # means renovate will also update `additional_dependencies`
additional_dependencies:
# actionlint has a shellcheck integration which extracts shell scripts in `run:` steps from GitHub Actions
# and checks these with shellcheck. This is arguably its most useful feature,
# but the integration only works if shellcheck is installed
- "github.com/wasilibs/go-shellcheck/cmd/shellcheck@v0.11.1"
- repo: https://github.com/shellcheck-py/shellcheck-py
rev: v0.11.0.1
hooks:
- id: shellcheck
priority: 0
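
The `priority` annotations added throughout this config can be read as a minimal scheduling scheme: hooks that share a priority may run concurrently, and each level starts only after the previous one finishes, so second-pass fixers always observe the first-pass formatters' output. A hedged sketch of the idea (the hook IDs and entry scripts below are illustrative, not taken from this config):

```yaml
repos:
  - repo: local
    hooks:
      # Priority 0: first-pass formatters. Hooks at the same
      # priority may run concurrently with each other.
      - id: format-code
        name: format code
        entry: ./scripts/format.sh
        language: system
        priority: 0
      # Priority 1: fixers that must see the formatters' output,
      # so they only start once every priority-0 hook has finished.
      - id: fix-docs
        name: fix docs
        entry: ./scripts/fix-docs.sh
        language: system
        priority: 1
```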

View File

@@ -1,5 +1,63 @@
# Changelog
## 0.14.11
Released on 2026-01-08.
### Preview features
- Consolidate diagnostics for matched disable/enable suppression comments ([#22099](https://github.com/astral-sh/ruff/pull/22099))
- Report diagnostics for invalid/unmatched range suppression comments ([#21908](https://github.com/astral-sh/ruff/pull/21908))
- \[`airflow`\] Passing positional argument into `airflow.lineage.hook.HookLineageCollector.create_asset` is not allowed (`AIR303`) ([#22046](https://github.com/astral-sh/ruff/pull/22046))
- \[`refurb`\] Mark `FURB192` fix as always unsafe ([#22210](https://github.com/astral-sh/ruff/pull/22210))
- \[`ruff`\] Add `non-empty-init-module` (`RUF067`) ([#22143](https://github.com/astral-sh/ruff/pull/22143))
### Bug fixes
- Fix GitHub format for multi-line diagnostics ([#22108](https://github.com/astral-sh/ruff/pull/22108))
- \[`flake8-unused-arguments`\] Mark `**kwargs` in `TypeVar` as used (`ARG001`) ([#22214](https://github.com/astral-sh/ruff/pull/22214))
### Rule changes
- Add `help:` subdiagnostics for several Ruff rules that can sometimes appear to disagree with `ty` ([#22331](https://github.com/astral-sh/ruff/pull/22331))
- \[`pylint`\] Demote `PLW1510` fix to display-only ([#22318](https://github.com/astral-sh/ruff/pull/22318))
- \[`pylint`\] Ignore identical members (`PLR1714`) ([#22220](https://github.com/astral-sh/ruff/pull/22220))
- \[`pylint`\] Improve diagnostic range for `PLC0206` ([#22312](https://github.com/astral-sh/ruff/pull/22312))
- \[`ruff`\] Improve fix title for `RUF102` invalid rule code ([#22100](https://github.com/astral-sh/ruff/pull/22100))
- \[`flake8-simplify`\] Avoid unnecessary builtins import for `SIM105` ([#22358](https://github.com/astral-sh/ruff/pull/22358))
### Configuration
- Allow Python 3.15 as valid `target-version` value in preview ([#22419](https://github.com/astral-sh/ruff/pull/22419))
- Check `required-version` before parsing rules ([#22410](https://github.com/astral-sh/ruff/pull/22410))
- Include configured `src` directories when resolving graphs ([#22451](https://github.com/astral-sh/ruff/pull/22451))
### Documentation
- Update `T201` suggestion to not use root logger to satisfy `LOG015` ([#22059](https://github.com/astral-sh/ruff/pull/22059))
- Fix `iter` example in unsafe fixes doc ([#22118](https://github.com/astral-sh/ruff/pull/22118))
- \[`flake8_print`\] better suggestion for `basicConfig` in `T201` docs ([#22101](https://github.com/astral-sh/ruff/pull/22101))
- \[`pylint`\] Restore the fix safety docs for `PLW0133` ([#22211](https://github.com/astral-sh/ruff/pull/22211))
- Fix Jupyter notebook discovery info for editors ([#22447](https://github.com/astral-sh/ruff/pull/22447))
### Contributors
- [@charliermarsh](https://github.com/charliermarsh)
- [@ntBre](https://github.com/ntBre)
- [@cenviity](https://github.com/cenviity)
- [@njhearp](https://github.com/njhearp)
- [@cbachhuber](https://github.com/cbachhuber)
- [@jelle-openai](https://github.com/jelle-openai)
- [@AlexWaygood](https://github.com/AlexWaygood)
- [@ValdonVitija](https://github.com/ValdonVitija)
- [@BurntSushi](https://github.com/BurntSushi)
- [@Jkhall81](https://github.com/Jkhall81)
- [@PeterJCLaw](https://github.com/PeterJCLaw)
- [@harupy](https://github.com/harupy)
- [@amyreese](https://github.com/amyreese)
- [@sjyangkevin](https://github.com/sjyangkevin)
- [@woodruffw](https://github.com/woodruffw)
## 0.14.10
Released on 2025-12-18.

View File

@@ -65,6 +65,7 @@ When working on ty, PR titles should start with `[ty]` and be tagged with the `t
- All changes must be tested. If you're not testing your changes, you're not done.
- Get your tests to pass. If you didn't run the tests, your code does not work.
- Follow existing code style. Check neighboring files for patterns.
- Always run `uvx pre-commit run -a` at the end of a task.
- Always run `uvx prek run --from-ref @{upstream}` at the end of a task.
- Avoid writing significant amounts of new code. This is often a sign that we're missing an existing method or mechanism that could help solve the problem. Look for existing utilities first.
- Avoid falling back to patterns that require `panic!`, `unreachable!`, or `.unwrap()`. Instead, try to encode those constraints in the type system.
- Prefer let chains (`if let` combined with `&&`) over nested `if let` statements to reduce indentation and improve readability.
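
The guideline about avoiding `panic!`/`.unwrap()` by encoding constraints in the type system can be sketched in miniature (all names here are hypothetical, not from the Ruff codebase): rather than carrying an `Option` and unwrapping it later, split the two states into distinct types so the invalid access cannot be written at all.

```rust
// Hypothetical sketch: a resolved configuration cannot be constructed
// without a path, so downstream code never needs `.unwrap()`.
struct UnresolvedConfig {
    path: Option<String>,
}

struct ResolvedConfig {
    path: String, // presence is guaranteed by construction
}

impl UnresolvedConfig {
    // The fallible step happens exactly once, at the boundary.
    fn resolve(self) -> Result<ResolvedConfig, String> {
        match self.path {
            Some(path) => Ok(ResolvedConfig { path }),
            None => Err("no configuration path provided".to_string()),
        }
    }
}

fn main() {
    let config = UnresolvedConfig {
        path: Some("ruff.toml".to_string()),
    };
    // Callers handle the error here; everything after this line
    // works with `ResolvedConfig` and cannot observe a missing path.
    match config.resolve() {
        Ok(resolved) => println!("resolved: {}", resolved.path),
        Err(message) => eprintln!("error: {message}"),
    }
}
```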

View File

@@ -57,8 +57,8 @@ You can optionally install pre-commit hooks to automatically run the validation
when making a commit:
```shell
uv tool install pre-commit
pre-commit install
uv tool install prek
prek install
```
We recommend [nextest](https://nexte.st/) to run Ruff's test suite (via `cargo nextest run`),
@@ -85,7 +85,7 @@ and that it passes both the lint and test validation checks:
```shell
cargo clippy --workspace --all-targets --all-features -- -D warnings # Rust linting
RUFF_UPDATE_SCHEMA=1 cargo test # Rust testing and updating ruff.schema.json
uvx pre-commit run --all-files --show-diff-on-failure # Rust and Python formatting, Markdown and Python linting, etc.
uvx prek run -a # Rust and Python formatting, Markdown and Python linting, etc.
```
These checks will run on GitHub Actions when you open your pull request, but running them locally

Cargo.lock generated
View File

@@ -2912,7 +2912,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.14.10"
version = "0.14.11"
dependencies = [
"anyhow",
"argfile",
@@ -2928,6 +2928,7 @@ dependencies = [
"filetime",
"globwalk",
"ignore",
"indexmap",
"indoc",
"insta",
"insta-cmd",
@@ -3171,7 +3172,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.14.10"
version = "0.14.11"
dependencies = [
"aho-corasick",
"anyhow",
@@ -3529,7 +3530,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.14.10"
version = "0.14.11"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -3645,7 +3646,7 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=309c249088fdeef0129606fa34ec2eefc74736ff#309c249088fdeef0129606fa34ec2eefc74736ff"
source = "git+https://github.com/salsa-rs/salsa.git?rev=9860ff6ca0f1f8f3a8d6b832020002790b501254#9860ff6ca0f1f8f3a8d6b832020002790b501254"
dependencies = [
"boxcar",
"compact_str",
@@ -3670,12 +3671,12 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=309c249088fdeef0129606fa34ec2eefc74736ff#309c249088fdeef0129606fa34ec2eefc74736ff"
source = "git+https://github.com/salsa-rs/salsa.git?rev=9860ff6ca0f1f8f3a8d6b832020002790b501254#9860ff6ca0f1f8f3a8d6b832020002790b501254"
[[package]]
name = "salsa-macros"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=309c249088fdeef0129606fa34ec2eefc74736ff#309c249088fdeef0129606fa34ec2eefc74736ff"
source = "git+https://github.com/salsa-rs/salsa.git?rev=9860ff6ca0f1f8f3a8d6b832020002790b501254#9860ff6ca0f1f8f3a8d6b832020002790b501254"
dependencies = [
"proc-macro2",
"quote",
@@ -4443,6 +4444,7 @@ version = "0.0.0"
dependencies = [
"bitflags 2.10.0",
"camino",
"compact_str",
"get-size2",
"insta",
"itertools 0.14.0",

View File

@@ -150,7 +150,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "309c249088fdeef0129606fa34ec2eefc74736ff", default-features = false, features = [
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "9860ff6ca0f1f8f3a8d6b832020002790b501254", default-features = false, features = [
"compact_str",
"macros",
"salsa_unstable",

View File

@@ -150,8 +150,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.14.10/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.14.10/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.14.11/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.14.11/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -184,7 +184,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.14.10
rev: v0.14.11
hooks:
# Run the linter.
- id: ruff-check

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.14.10"
version = "0.14.11"
publish = true
authors = { workspace = true }
edition = { workspace = true }
@@ -48,6 +48,7 @@ colored = { workspace = true }
filetime = { workspace = true }
globwalk = { workspace = true }
ignore = { workspace = true }
indexmap = { workspace = true }
is-macro = { workspace = true }
itertools = { workspace = true }
jiff = { workspace = true }

View File

@@ -2,6 +2,7 @@ use crate::args::{AnalyzeGraphArgs, ConfigArguments};
use crate::resolve::resolve;
use crate::{ExitStatus, resolve_default_files};
use anyhow::Result;
use indexmap::IndexSet;
use log::{debug, warn};
use path_absolutize::CWD;
use ruff_db::system::{SystemPath, SystemPathBuf};
@@ -11,7 +12,7 @@ use ruff_linter::source_kind::SourceKind;
use ruff_linter::{warn_user, warn_user_once};
use ruff_python_ast::{PySourceType, SourceType};
use ruff_workspace::resolver::{ResolvedFile, match_exclusion, python_files_in_path};
use rustc_hash::FxHashMap;
use rustc_hash::{FxBuildHasher, FxHashMap};
use std::io::Write;
use std::path::{Path, PathBuf};
use std::sync::{Arc, Mutex};
@@ -59,17 +60,34 @@ pub(crate) fn analyze_graph(
})
.collect::<FxHashMap<_, _>>();
// Create a database from the source roots.
let src_roots = package_roots
.values()
.filter_map(|package| package.as_deref())
.filter_map(|package| package.parent())
.map(Path::to_path_buf)
.filter_map(|path| SystemPathBuf::from_path_buf(path).ok())
.collect();
// Create a database from the source roots, combining configured `src` paths with detected
// package roots. Configured paths are added first so they take precedence, and duplicates
// are removed.
let mut src_roots: IndexSet<SystemPathBuf, FxBuildHasher> = IndexSet::default();
// Add configured `src` paths first (for precedence), filtering to only include existing
// directories.
src_roots.extend(
pyproject_config
.settings
.linter
.src
.iter()
.filter(|path| path.is_dir())
.filter_map(|path| SystemPathBuf::from_path_buf(path.clone()).ok()),
);
// Add detected package roots.
src_roots.extend(
package_roots
.values()
.filter_map(|package| package.as_deref())
.filter_map(|path| path.parent())
.filter_map(|path| SystemPathBuf::from_path_buf(path.to_path_buf()).ok()),
);
let db = ModuleDb::from_src_roots(
src_roots,
src_roots.into_iter().collect(),
pyproject_config
.settings
.analyze
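
The precedence logic above hinges on an insertion-ordered set: extending it with configured `src` paths first and detected package roots second means a path present in both sources keeps its earlier (configured) position, and later duplicates are dropped. A minimal stdlib stand-in for the `indexmap::IndexSet` the real code uses:

```rust
use std::collections::HashSet;

/// First-wins, order-preserving dedup: items from earlier sources
/// take precedence, and later duplicates are silently dropped.
fn ordered_dedup(sources: &[&[&str]]) -> Vec<String> {
    let mut seen = HashSet::new();
    let mut out = Vec::new();
    for source in sources {
        for &item in *source {
            // `insert` returns false for items already present,
            // so only the first occurrence reaches `out`.
            if seen.insert(item.to_string()) {
                out.push(item.to_string());
            }
        }
    }
    out
}

fn main() {
    // Configured `src` paths first, detected package roots second.
    let configured = ["lib", "app"];
    let detected = ["app", "vendor"];
    let roots = ordered_dedup(&[&configured, &detected]);
    assert_eq!(roots, ["lib", "app", "vendor"]);
    println!("{roots:?}");
}
```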

View File

@@ -29,10 +29,10 @@ pub(crate) fn show_settings(
bail!("No files found under the given path");
};
let settings = resolver.resolve(&path);
let (settings, config_path) = resolver.resolve_with_path(&path);
writeln!(writer, "Resolved settings for: \"{}\"", path.display())?;
if let Some(settings_path) = pyproject_config.path.as_ref() {
if let Some(settings_path) = config_path {
writeln!(writer, "Settings path: \"{}\"", settings_path.display())?;
}
write!(writer, "{settings}")?;

View File

@@ -714,6 +714,121 @@ fn notebook_basic() -> Result<()> {
Ok(())
}
/// Test that the `src` configuration option is respected.
///
/// This is useful for monorepos where there are multiple source directories that need to be
/// included in the module resolution search path.
#[test]
fn src_option() -> Result<()> {
let tempdir = TempDir::new()?;
let root = ChildPath::new(tempdir.path());
// Create a lib directory with a package.
root.child("lib")
.child("mylib")
.child("__init__.py")
.write_str("def helper(): pass")?;
// Create an app directory with a file that imports from mylib.
root.child("app").child("__init__.py").write_str("")?;
root.child("app")
.child("main.py")
.write_str("from mylib import helper")?;
// Without src configured, the import from mylib won't resolve.
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"app/__init__.py": [],
"app/main.py": []
}
----- stderr -----
"#);
});
// With src = ["lib"], the import should resolve.
root.child("ruff.toml").write_str(indoc::indoc! {r#"
src = ["lib"]
"#})?;
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"app/__init__.py": [],
"app/main.py": [
"lib/mylib/__init__.py"
]
}
----- stderr -----
"#);
});
Ok(())
}
/// Test that glob patterns in `src` are expanded.
#[test]
fn src_glob_expansion() -> Result<()> {
let tempdir = TempDir::new()?;
let root = ChildPath::new(tempdir.path());
// Create multiple lib directories with packages.
root.child("libs")
.child("lib_a")
.child("pkg_a")
.child("__init__.py")
.write_str("def func_a(): pass")?;
root.child("libs")
.child("lib_b")
.child("pkg_b")
.child("__init__.py")
.write_str("def func_b(): pass")?;
// Create an app that imports from both packages.
root.child("app").child("__init__.py").write_str("")?;
root.child("app")
.child("main.py")
.write_str("from pkg_a import func_a\nfrom pkg_b import func_b")?;
// Use a glob pattern to include all lib directories.
root.child("ruff.toml").write_str(indoc::indoc! {r#"
src = ["libs/*"]
"#})?;
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"app/__init__.py": [],
"app/main.py": [
"libs/lib_a/pkg_a/__init__.py",
"libs/lib_b/pkg_b/__init__.py"
]
}
----- stderr -----
"#);
});
Ok(())
}
#[test]
fn notebook_with_magic() -> Result<()> {
let tempdir = TempDir::new()?;

View File

@@ -1126,6 +1126,35 @@ import os
Ok(())
}
#[test]
fn required_version_fails_to_parse() -> Result<()> {
let fixture = CliTest::with_file(
"ruff.toml",
r#"
required-version = "pikachu"
"#,
)?;
assert_cmd_snapshot!(fixture
.check_command(), @r#"
success: false
exit_code: 2
----- stdout -----
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Failed to parse [TMP]/ruff.toml
Cause: TOML parse error at line 2, column 20
|
2 | required-version = "pikachu"
| ^^^^^^^^^
Failed to parse version: Unexpected end of version specifier, expected operator:
pikachu
^^^^^^^
"#);
Ok(())
}
#[test]
fn required_version_exact_mismatch() -> Result<()> {
let version = env!("CARGO_PKG_VERSION");
@@ -1137,10 +1166,10 @@ required-version = "0.1.0"
"#,
)?;
insta::with_settings!({
filters => vec![(version, "[VERSION]")]
}, {
assert_cmd_snapshot!(fixture
let mut settings = insta::Settings::clone_current();
settings.add_filter(version, "[VERSION]");
settings.bind(|| {
assert_cmd_snapshot!(fixture
.check_command()
.arg("--config")
.arg("ruff.toml")
@@ -1154,6 +1183,7 @@ import os
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Required version `==0.1.0` does not match the running version `[VERSION]`
");
});
@@ -1212,10 +1242,10 @@ required-version = ">{version}"
),
)?;
insta::with_settings!({
filters => vec![(version, "[VERSION]")]
}, {
assert_cmd_snapshot!(fixture
let mut settings = insta::Settings::clone_current();
settings.add_filter(version, "[VERSION]");
settings.bind(|| {
assert_cmd_snapshot!(fixture
.check_command()
.arg("--config")
.arg("ruff.toml")
@@ -1229,6 +1259,48 @@ import os
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Required version `>[VERSION]` does not match the running version `[VERSION]`
");
});
Ok(())
}
#[test]
fn required_version_precedes_rule_validation() -> Result<()> {
let version = env!("CARGO_PKG_VERSION");
let fixture = CliTest::with_file(
"ruff.toml",
&format!(
r#"
required-version = ">{version}"
[lint]
select = ["RUF999"]
"#
),
)?;
let mut settings = insta::Settings::clone_current();
settings.add_filter(version, "[VERSION]");
settings.bind(|| {
assert_cmd_snapshot!(fixture
.check_command()
.arg("--config")
.arg("ruff.toml")
.arg("-")
.pass_stdin(r#"
import os
"#), @"
success: false
exit_code: 2
----- stdout -----
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Required version `>[VERSION]` does not match the running version `[VERSION]`
");
});

View File

@@ -16,6 +16,7 @@ success: true
exit_code: 0
----- stdout -----
Resolved settings for: "[TMP]/foo/test.py"
Settings path: "[TMP]/foo/pyproject.toml"
# General Settings
cache_dir = "[TMP]/foo/.ruff_cache"

View File

@@ -50,6 +50,56 @@ ignore = [
Ok(())
}
#[test]
fn display_settings_from_nested_directory() -> anyhow::Result<()> {
let tempdir = TempDir::new().context("Failed to create temp directory.")?;
// Tempdir paths on macOS are symlinks, which don't play nicely with
// our snapshot filtering.
let project_dir =
dunce::canonicalize(tempdir.path()).context("Failed to canonicalize tempdir path.")?;
// Root pyproject.toml.
std::fs::write(
project_dir.join("pyproject.toml"),
r#"
[tool.ruff]
line-length = 100
[tool.ruff.lint]
select = ["E", "F"]
"#,
)?;
// Create a subdirectory with its own pyproject.toml.
let subdir = project_dir.join("subdir");
std::fs::create_dir(&subdir)?;
std::fs::write(
subdir.join("pyproject.toml"),
r#"
[tool.ruff]
line-length = 120
[tool.ruff.lint]
select = ["E", "F", "I"]
"#,
)?;
std::fs::write(subdir.join("test.py"), r#"import os"#).context("Failed to write test.py.")?;
insta::with_settings!({filters => vec![
(&*tempdir_filter(&project_dir), "<temp_dir>/"),
(r#"\\(\w\w|\s|\.|")"#, "/$1"),
]}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--show-settings", "subdir/test.py"])
.current_dir(&project_dir));
});
Ok(())
}
fn tempdir_filter(project_dir: &Path) -> String {
format!(r#"{}\\?/?"#, regex::escape(project_dir.to_str().unwrap()))
}

View File

@@ -0,0 +1,410 @@
---
source: crates/ruff/tests/show_settings.rs
info:
program: ruff
args:
- check
- "--show-settings"
- subdir/test.py
---
success: true
exit_code: 0
----- stdout -----
Resolved settings for: "<temp_dir>/subdir/test.py"
Settings path: "<temp_dir>/subdir/pyproject.toml"
# General Settings
cache_dir = "<temp_dir>/subdir/.ruff_cache"
fix = false
fix_only = false
output_format = full
show_fixes = false
unsafe_fixes = hint
# File Resolver Settings
file_resolver.exclude = [
".bzr",
".direnv",
".eggs",
".git",
".git-rewrite",
".hg",
".ipynb_checkpoints",
".mypy_cache",
".nox",
".pants.d",
".pyenv",
".pytest_cache",
".pytype",
".ruff_cache",
".svn",
".tox",
".venv",
".vscode",
"__pypackages__",
"_build",
"buck-out",
"dist",
"node_modules",
"site-packages",
"venv",
]
file_resolver.extend_exclude = []
file_resolver.force_exclude = false
file_resolver.include = [
"*.py",
"*.pyi",
"*.ipynb",
"**/pyproject.toml",
]
file_resolver.extend_include = []
file_resolver.respect_gitignore = true
file_resolver.project_root = "<temp_dir>/subdir"
# Linter Settings
linter.exclude = []
linter.project_root = "<temp_dir>/subdir"
linter.rules.enabled = [
unsorted-imports (I001),
missing-required-import (I002),
mixed-spaces-and-tabs (E101),
multiple-imports-on-one-line (E401),
module-import-not-at-top-of-file (E402),
line-too-long (E501),
multiple-statements-on-one-line-colon (E701),
multiple-statements-on-one-line-semicolon (E702),
useless-semicolon (E703),
none-comparison (E711),
true-false-comparison (E712),
not-in-test (E713),
not-is-test (E714),
type-comparison (E721),
bare-except (E722),
lambda-assignment (E731),
ambiguous-variable-name (E741),
ambiguous-class-name (E742),
ambiguous-function-name (E743),
io-error (E902),
unused-import (F401),
import-shadowed-by-loop-var (F402),
undefined-local-with-import-star (F403),
late-future-import (F404),
undefined-local-with-import-star-usage (F405),
undefined-local-with-nested-import-star-usage (F406),
future-feature-not-defined (F407),
percent-format-invalid-format (F501),
percent-format-expected-mapping (F502),
percent-format-expected-sequence (F503),
percent-format-extra-named-arguments (F504),
percent-format-missing-argument (F505),
percent-format-mixed-positional-and-named (F506),
percent-format-positional-count-mismatch (F507),
percent-format-star-requires-sequence (F508),
percent-format-unsupported-format-character (F509),
string-dot-format-invalid-format (F521),
string-dot-format-extra-named-arguments (F522),
string-dot-format-extra-positional-arguments (F523),
string-dot-format-missing-arguments (F524),
string-dot-format-mixing-automatic (F525),
f-string-missing-placeholders (F541),
multi-value-repeated-key-literal (F601),
multi-value-repeated-key-variable (F602),
expressions-in-star-assignment (F621),
multiple-starred-expressions (F622),
assert-tuple (F631),
is-literal (F632),
invalid-print-syntax (F633),
if-tuple (F634),
break-outside-loop (F701),
continue-outside-loop (F702),
yield-outside-function (F704),
return-outside-function (F706),
default-except-not-last (F707),
forward-annotation-syntax-error (F722),
redefined-while-unused (F811),
undefined-name (F821),
undefined-export (F822),
undefined-local (F823),
unused-variable (F841),
unused-annotation (F842),
raise-not-implemented (F901),
]
linter.rules.should_fix = [
unsorted-imports (I001),
missing-required-import (I002),
mixed-spaces-and-tabs (E101),
multiple-imports-on-one-line (E401),
module-import-not-at-top-of-file (E402),
line-too-long (E501),
multiple-statements-on-one-line-colon (E701),
multiple-statements-on-one-line-semicolon (E702),
useless-semicolon (E703),
none-comparison (E711),
true-false-comparison (E712),
not-in-test (E713),
not-is-test (E714),
type-comparison (E721),
bare-except (E722),
lambda-assignment (E731),
ambiguous-variable-name (E741),
ambiguous-class-name (E742),
ambiguous-function-name (E743),
io-error (E902),
unused-import (F401),
import-shadowed-by-loop-var (F402),
undefined-local-with-import-star (F403),
late-future-import (F404),
undefined-local-with-import-star-usage (F405),
undefined-local-with-nested-import-star-usage (F406),
future-feature-not-defined (F407),
percent-format-invalid-format (F501),
percent-format-expected-mapping (F502),
percent-format-expected-sequence (F503),
percent-format-extra-named-arguments (F504),
percent-format-missing-argument (F505),
percent-format-mixed-positional-and-named (F506),
percent-format-positional-count-mismatch (F507),
percent-format-star-requires-sequence (F508),
percent-format-unsupported-format-character (F509),
string-dot-format-invalid-format (F521),
string-dot-format-extra-named-arguments (F522),
string-dot-format-extra-positional-arguments (F523),
string-dot-format-missing-arguments (F524),
string-dot-format-mixing-automatic (F525),
f-string-missing-placeholders (F541),
multi-value-repeated-key-literal (F601),
multi-value-repeated-key-variable (F602),
expressions-in-star-assignment (F621),
multiple-starred-expressions (F622),
assert-tuple (F631),
is-literal (F632),
invalid-print-syntax (F633),
if-tuple (F634),
break-outside-loop (F701),
continue-outside-loop (F702),
yield-outside-function (F704),
return-outside-function (F706),
default-except-not-last (F707),
forward-annotation-syntax-error (F722),
redefined-while-unused (F811),
undefined-name (F821),
undefined-export (F822),
undefined-local (F823),
unused-variable (F841),
unused-annotation (F842),
raise-not-implemented (F901),
]
linter.per_file_ignores = {}
linter.safety_table.forced_safe = []
linter.safety_table.forced_unsafe = []
linter.unresolved_target_version = none
linter.per_file_target_version = {}
linter.preview = disabled
linter.explicit_preview_rules = false
linter.extension = ExtensionMapping({})
linter.allowed_confusables = []
linter.builtins = []
linter.dummy_variable_rgx = ^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$
linter.external = []
linter.ignore_init_module_imports = true
linter.logger_objects = []
linter.namespace_packages = []
linter.src = [
"<temp_dir>/subdir",
"<temp_dir>/subdir/src",
]
linter.tab_size = 4
linter.line_length = 120
linter.task_tags = [
TODO,
FIXME,
XXX,
]
linter.typing_modules = []
linter.typing_extensions = true
# Linter Plugins
linter.flake8_annotations.mypy_init_return = false
linter.flake8_annotations.suppress_dummy_args = false
linter.flake8_annotations.suppress_none_returning = false
linter.flake8_annotations.allow_star_arg_any = false
linter.flake8_annotations.ignore_fully_untyped = false
linter.flake8_bandit.hardcoded_tmp_directory = [
/tmp,
/var/tmp,
/dev/shm,
]
linter.flake8_bandit.check_typed_exception = false
linter.flake8_bandit.extend_markup_names = []
linter.flake8_bandit.allowed_markup_calls = []
linter.flake8_bugbear.extend_immutable_calls = []
linter.flake8_builtins.allowed_modules = []
linter.flake8_builtins.ignorelist = []
linter.flake8_builtins.strict_checking = false
linter.flake8_comprehensions.allow_dict_calls_with_keyword_arguments = false
linter.flake8_copyright.notice_rgx = (?i)Copyright\s+((?:\(C\)|©)\s+)?\d{4}((-|,\s)\d{4})*
linter.flake8_copyright.author = none
linter.flake8_copyright.min_file_size = 0
linter.flake8_errmsg.max_string_length = 0
linter.flake8_gettext.function_names = [
_,
gettext,
ngettext,
]
linter.flake8_implicit_str_concat.allow_multiline = true
linter.flake8_import_conventions.aliases = {
altair = alt,
holoviews = hv,
matplotlib = mpl,
matplotlib.pyplot = plt,
networkx = nx,
numpy = np,
numpy.typing = npt,
pandas = pd,
panel = pn,
plotly.express = px,
polars = pl,
pyarrow = pa,
seaborn = sns,
tensorflow = tf,
tkinter = tk,
xml.etree.ElementTree = ET,
}
linter.flake8_import_conventions.banned_aliases = {}
linter.flake8_import_conventions.banned_from = []
linter.flake8_pytest_style.fixture_parentheses = false
linter.flake8_pytest_style.parametrize_names_type = tuple
linter.flake8_pytest_style.parametrize_values_type = list
linter.flake8_pytest_style.parametrize_values_row_type = tuple
linter.flake8_pytest_style.raises_require_match_for = [
BaseException,
Exception,
ValueError,
OSError,
IOError,
EnvironmentError,
socket.error,
]
linter.flake8_pytest_style.raises_extend_require_match_for = []
linter.flake8_pytest_style.mark_parentheses = false
linter.flake8_quotes.inline_quotes = double
linter.flake8_quotes.multiline_quotes = double
linter.flake8_quotes.docstring_quotes = double
linter.flake8_quotes.avoid_escape = true
linter.flake8_self.ignore_names = [
_make,
_asdict,
_replace,
_fields,
_field_defaults,
_name_,
_value_,
]
linter.flake8_tidy_imports.ban_relative_imports = "parents"
linter.flake8_tidy_imports.banned_api = {}
linter.flake8_tidy_imports.banned_module_level_imports = []
linter.flake8_type_checking.strict = false
linter.flake8_type_checking.exempt_modules = [
typing,
typing_extensions,
]
linter.flake8_type_checking.runtime_required_base_classes = []
linter.flake8_type_checking.runtime_required_decorators = []
linter.flake8_type_checking.quote_annotations = false
linter.flake8_unused_arguments.ignore_variadic_names = false
linter.isort.required_imports = []
linter.isort.combine_as_imports = false
linter.isort.force_single_line = false
linter.isort.force_sort_within_sections = false
linter.isort.detect_same_package = true
linter.isort.case_sensitive = false
linter.isort.force_wrap_aliases = false
linter.isort.force_to_top = []
linter.isort.known_modules = {}
linter.isort.order_by_type = true
linter.isort.relative_imports_order = furthest_to_closest
linter.isort.single_line_exclusions = []
linter.isort.split_on_trailing_comma = true
linter.isort.classes = []
linter.isort.constants = []
linter.isort.variables = []
linter.isort.no_lines_before = []
linter.isort.lines_after_imports = -1
linter.isort.lines_between_types = 0
linter.isort.forced_separate = []
linter.isort.section_order = [
known { type = future },
known { type = standard_library },
known { type = third_party },
known { type = first_party },
known { type = local_folder },
]
linter.isort.default_section = known { type = third_party }
linter.isort.no_sections = false
linter.isort.from_first = false
linter.isort.length_sort = false
linter.isort.length_sort_straight = false
linter.mccabe.max_complexity = 10
linter.pep8_naming.ignore_names = [
setUp,
tearDown,
setUpClass,
tearDownClass,
setUpModule,
tearDownModule,
asyncSetUp,
asyncTearDown,
setUpTestData,
failureException,
longMessage,
maxDiff,
]
linter.pep8_naming.classmethod_decorators = []
linter.pep8_naming.staticmethod_decorators = []
linter.pycodestyle.max_line_length = 120
linter.pycodestyle.max_doc_length = none
linter.pycodestyle.ignore_overlong_task_comments = false
linter.pyflakes.extend_generics = []
linter.pyflakes.allowed_unused_imports = []
linter.pylint.allow_magic_value_types = [
str,
bytes,
]
linter.pylint.allow_dunder_method_names = []
linter.pylint.max_args = 5
linter.pylint.max_positional_args = 5
linter.pylint.max_returns = 6
linter.pylint.max_bool_expr = 5
linter.pylint.max_branches = 12
linter.pylint.max_statements = 50
linter.pylint.max_public_methods = 20
linter.pylint.max_locals = 15
linter.pylint.max_nested_blocks = 5
linter.pyupgrade.keep_runtime_typing = false
linter.ruff.parenthesize_tuple_in_subscript = false
linter.ruff.strictly_empty_init_modules = false
# Formatter Settings
formatter.exclude = []
formatter.unresolved_target_version = 3.10
formatter.per_file_target_version = {}
formatter.preview = disabled
formatter.line_width = 120
formatter.line_ending = auto
formatter.indent_style = space
formatter.indent_width = 4
formatter.quote_style = double
formatter.magic_trailing_comma = respect
formatter.docstring_code_format = disabled
formatter.docstring_code_line_width = dynamic
# Analyze Settings
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.10
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
analyze.type_checking_imports = true
----- stderr -----

View File

@@ -15,7 +15,7 @@ use ruff_db::files::{File, system_path_to_file};
use ruff_db::source::source_text;
use ruff_db::system::{InMemorySystem, MemoryFileSystem, SystemPath, SystemPathBuf, TestSystem};
use ruff_python_ast::PythonVersion;
use ty_project::metadata::options::{EnvironmentOptions, Options};
use ty_project::metadata::options::{AnalysisOptions, EnvironmentOptions, Options};
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
use ty_project::watch::{ChangeEvent, ChangedKind};
use ty_project::{CheckMode, Db, ProjectDatabase, ProjectMetadata};
@@ -67,6 +67,7 @@ fn tomllib_path(file: &TestFile) -> SystemPathBuf {
SystemPathBuf::from("src").join(file.name())
}
#[expect(clippy::needless_update)]
fn setup_tomllib_case() -> Case {
let system = TestSystem::default();
let fs = system.memory_file_system().clone();
@@ -85,6 +86,10 @@ fn setup_tomllib_case() -> Case {
python_version: Some(RangedValue::cli(PythonVersion::PY312)),
..EnvironmentOptions::default()
}),
analysis: Some(AnalysisOptions {
respect_type_ignore_comments: Some(false),
..AnalysisOptions::default()
}),
..Options::default()
});
@@ -755,7 +760,7 @@ fn datetype(criterion: &mut Criterion) {
max_dep_date: "2025-07-04",
python_version: PythonVersion::PY313,
},
2,
4,
);
bench_project(&benchmark, criterion);

View File

@@ -71,6 +71,8 @@ impl Display for Benchmark<'_> {
}
}
#[track_caller]
#[expect(clippy::cast_precision_loss)]
fn check_project(db: &ProjectDatabase, project_name: &str, max_diagnostics: usize) {
let result = db.check();
let diagnostics = result.len();
@@ -79,6 +81,12 @@ fn check_project(db: &ProjectDatabase, project_name: &str, max_diagnostics: usiz
diagnostics > 1 && diagnostics <= max_diagnostics,
"Expected between 1 and {max_diagnostics} diagnostics on project '{project_name}' but got {diagnostics}",
);
+if (max_diagnostics - diagnostics) as f64 / max_diagnostics as f64 > 0.10 {
+tracing::warn!(
+"The expected diagnostics for project `{project_name}` can be reduced: expected {max_diagnostics} but got {diagnostics}"
+);
+}
}
static ALTAIR: Benchmark = Benchmark::new(
@@ -101,7 +109,7 @@ static ALTAIR: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
-1000,
+850,
);
static COLOUR_SCIENCE: Benchmark = Benchmark::new(
@@ -120,7 +128,7 @@ static COLOUR_SCIENCE: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY310,
},
-1070,
+350,
);
static FREQTRADE: Benchmark = Benchmark::new(
@@ -163,7 +171,7 @@ static PANDAS: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
-4000,
+3800,
);
static PYDANTIC: Benchmark = Benchmark::new(
@@ -181,7 +189,7 @@ static PYDANTIC: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY39,
},
-7000,
+3200,
);
static SYMPY: Benchmark = Benchmark::new(
@@ -194,7 +202,7 @@ static SYMPY: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
-13116,
+13400,
);
static TANJUN: Benchmark = Benchmark::new(
@@ -207,7 +215,7 @@ static TANJUN: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
-320,
+110,
);
static STATIC_FRAME: Benchmark = Benchmark::new(
@@ -223,7 +231,7 @@ static STATIC_FRAME: Benchmark = Benchmark::new(
max_dep_date: "2025-08-09",
python_version: PythonVersion::PY311,
},
-1100,
+1700,
);
#[track_caller]

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
-version = "0.14.10"
+version = "0.14.11"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -52,6 +52,7 @@ impl InvalidRuleCodeKind {
pub(crate) struct InvalidRuleCode {
pub(crate) rule_code: String,
pub(crate) kind: InvalidRuleCodeKind,
+pub(crate) whole_comment: bool,
}
impl AlwaysFixableViolation for InvalidRuleCode {
@@ -65,7 +66,11 @@ impl AlwaysFixableViolation for InvalidRuleCode {
}
fn fix_title(&self) -> String {
-"Remove the rule code".to_string()
+if self.whole_comment {
+format!("Remove the {} comment", self.kind.as_str())
+} else {
+format!("Remove the rule code `{}`", self.rule_code)
+}
}
}
@@ -122,6 +127,7 @@ fn all_codes_invalid_diagnostic(
.collect::<Vec<_>>()
.join(", "),
kind: InvalidRuleCodeKind::Noqa,
+whole_comment: true,
},
directive.range(),
)
@@ -139,6 +145,7 @@ fn some_codes_are_invalid_diagnostic(
InvalidRuleCode {
rule_code: invalid_code.to_string(),
kind: InvalidRuleCodeKind::Noqa,
+whole_comment: false,
},
invalid_code.range(),
)

View File

@@ -52,6 +52,25 @@ impl UnusedNOQAKind {
/// foo.bar()
/// ```
///
+/// ## Conflict with other linters
+/// When using `RUF100` with the `--fix` option, Ruff may remove trailing comments
+/// that follow a `# noqa` directive on the same line, as it interprets the
+/// remainder of the line as a description for the suppression.
+///
+/// To prevent Ruff from removing suppressions for other tools (like `pylint`
+/// or `mypy`), separate them with a second `#` character:
+///
+/// ```python
+/// # Bad: Ruff --fix will remove the pylint comment
+/// def visit_ImportFrom(self, node): # noqa: N802, pylint: disable=invalid-name
+/// pass
+///
+///
+/// # Good: Ruff will preserve the pylint comment
+/// def visit_ImportFrom(self, node): # noqa: N802 # pylint: disable=invalid-name
+/// pass
+/// ```
+///
/// ## Options
/// - `lint.external`
///

View File

@@ -10,7 +10,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
3 | # External code
4 | import re # noqa: V123
|
-help: Remove the rule code
+help: Remove the `# noqa` comment
1 | # Invalid code
- import os # noqa: INVALID123
2 + import os
@@ -28,7 +28,7 @@ RUF102 [*] Invalid rule code in `# noqa`: V123
5 | # Valid noqa
6 | import sys # noqa: E402
|
-help: Remove the rule code
+help: Remove the `# noqa` comment
1 | # Invalid code
2 | import os # noqa: INVALID123
3 | # External code
@@ -48,7 +48,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
|
-help: Remove the rule code
+help: Remove the rule code `INVALID456`
4 | import re # noqa: V123
5 | # Valid noqa
6 | import sys # noqa: E402
@@ -68,7 +68,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID789
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
|
-help: Remove the rule code
+help: Remove the `# noqa` comment
5 | # Valid noqa
6 | import sys # noqa: E402
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
@@ -88,7 +88,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID000
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
|
-help: Remove the rule code
+help: Remove the `# noqa` comment
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
@@ -108,7 +108,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
|
-help: Remove the rule code
+help: Remove the rule code `INVALID123`
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
11 | # Mixed valid and invalid
@@ -128,7 +128,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID100
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
-help: Remove the rule code
+help: Remove the rule code `INVALID100`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -148,7 +148,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID200
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
-help: Remove the rule code
+help: Remove the rule code `INVALID200`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -168,7 +168,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID300
17 | # Test for mixed code types
18 | import json # noqa: E402, INVALID400, V100
|
-help: Remove the rule code
+help: Remove the rule code `INVALID300`
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
15 | # Test for preserving valid codes when fixing
@@ -188,7 +188,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID400
19 | # Test for rule redirects
20 | import pandas as pd # noqa: TCH002
|
-help: Remove the rule code
+help: Remove the rule code `INVALID400`
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
17 | # Test for mixed code types
@@ -207,7 +207,7 @@ RUF102 [*] Invalid rule code in `# noqa`: V100
19 | # Test for rule redirects
20 | import pandas as pd # noqa: TCH002
|
-help: Remove the rule code
+help: Remove the rule code `V100`
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
17 | # Test for mixed code types

View File

@@ -10,7 +10,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
3 | # External code
4 | import re # noqa: V123
|
-help: Remove the rule code
+help: Remove the `# noqa` comment
1 | # Invalid code
- import os # noqa: INVALID123
2 + import os
@@ -28,7 +28,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
|
-help: Remove the rule code
+help: Remove the rule code `INVALID456`
4 | import re # noqa: V123
5 | # Valid noqa
6 | import sys # noqa: E402
@@ -48,7 +48,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID789
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
|
-help: Remove the rule code
+help: Remove the `# noqa` comment
5 | # Valid noqa
6 | import sys # noqa: E402
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
@@ -68,7 +68,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID000
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
|
-help: Remove the rule code
+help: Remove the `# noqa` comment
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
@@ -88,7 +88,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
|
-help: Remove the rule code
+help: Remove the rule code `INVALID123`
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
11 | # Mixed valid and invalid
@@ -108,7 +108,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID100
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
-help: Remove the rule code
+help: Remove the rule code `INVALID100`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -128,7 +128,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID200
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
-help: Remove the rule code
+help: Remove the rule code `INVALID200`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -148,7 +148,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID300
17 | # Test for mixed code types
18 | import json # noqa: E402, INVALID400, V100
|
-help: Remove the rule code
+help: Remove the rule code `INVALID300`
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
15 | # Test for preserving valid codes when fixing
@@ -168,7 +168,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID400
19 | # Test for rule redirects
20 | import pandas as pd # noqa: TCH002
|
-help: Remove the rule code
+help: Remove the rule code `INVALID400`
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
17 | # Test for mixed code types

View File

@@ -552,7 +552,7 @@ RUF102 [*] Invalid rule code in suppression: YF829
97 | # ruff: enable[YF829]
| -----
|
-help: Remove the rule code
+help: Remove the suppression comment
90 |
91 | def f():
92 | # Unknown rule codes
@@ -578,7 +578,7 @@ RUF102 [*] Invalid rule code in suppression: RQW320
| ------
97 | # ruff: enable[YF829]
|
-help: Remove the rule code
+help: Remove the rule code `RQW320`
91 | def f():
92 | # Unknown rule codes
93 | # ruff: disable[YF829]

View File

@@ -607,13 +607,21 @@ impl TryFrom<String> for RequiredVersion {
type Error = pep440_rs::VersionSpecifiersParseError;
fn try_from(value: String) -> Result<Self, Self::Error> {
value.parse()
}
}
impl FromStr for RequiredVersion {
type Err = pep440_rs::VersionSpecifiersParseError;
fn from_str(value: &str) -> Result<Self, Self::Err> {
// Treat `0.3.1` as `==0.3.1`, for backwards compatibility.
-if let Ok(version) = pep440_rs::Version::from_str(&value) {
+if let Ok(version) = pep440_rs::Version::from_str(value) {
Ok(Self(VersionSpecifiers::from(
VersionSpecifier::equals_version(version),
)))
} else {
-Ok(Self(VersionSpecifiers::from_str(&value)?))
+Ok(Self(VersionSpecifiers::from_str(value)?))
}
}
}

View File

@@ -91,6 +91,12 @@ pub(crate) struct Suppression {
comments: DisableEnableComments,
}
+impl Suppression {
+fn codes(&self) -> &[TextRange] {
+&self.comments.disable_comment().codes
+}
+}
#[derive(Debug)]
pub(crate) enum DisableEnableComments {
/// An implicitly closed disable comment without a matching enable comment.
@@ -201,6 +207,7 @@ impl Suppressions {
InvalidRuleCode {
rule_code: suppression.code.to_string(),
kind: InvalidRuleCodeKind::Suppression,
+whole_comment: suppression.codes().len() == 1,
},
);
}

View File

@@ -0,0 +1,133 @@
class Simple:
x=1
x=2 # fmt: skip
x=3
class Semicolon:
x=1
x=2;x=3 # fmt: skip
x=4
class TrailingSemicolon:
x=1
x=2;x=3 ; # fmt: skip
x=4
class SemicolonNewLogicalLine:
x=1;
x=2;x=3 # fmt: skip
x=4
class ManySemicolonOneLine:
x=1
x=2;x=3;x=4 # fmt: skip
x=5
class CompoundInSuite:
x=1
def foo(): y=1 # fmt: skip
x=2
class CompoundInSuiteNewline:
x=1
def foo():
y=1 # fmt: skip
x=2
class MultiLineSkip:
x=1
x = [
'1',
'2',
] # fmt: skip
class MultiLineSemicolon:
x=1
x = [
'1',
'2',
]; x=2 # fmt: skip
class LineContinuationSemicolonAfter:
x=1
x = ['a']\
; y=1 # fmt: skip
class LineContinuationSemicolonBefore:
x=1
x = ['a']; \
y=1 # fmt: skip
class LineContinuationSemicolonAndNewline:
x=1
x = ['a']; \
y=1 # fmt: skip
class LineContinuationSemicolonAndNewlineAndComment:
x=1
x = ['a']; \
# 1
y=1 # fmt: skip
class RepeatedLineContinuation:
x=1
x = ['a']; \
\
\
y=1 # fmt: skip
class MultiLineSemicolonComments:
x=1
# 1
x = [ # 2
'1', # 3
'2',
# 4
]; x=2 # 5 # fmt: skip # 6
class DocstringSkipped:
'''This is a docstring''' # fmt: skip
x=1
class MultilineDocstringSkipped:
'''This is a docstring
''' # fmt: skip
x=1
class FirstStatementNewlines:
x=1 # fmt: skip
class ChainingSemicolons:
x=[
'1',
'2',
'3',
];x=1;x=[
'1',
'2',
'3'
];x=1;x=1 # fmt: skip
class LotsOfComments:
# 1
x=[ # 2
'1', # 3
'2',
'3'
] ;x=2;x=3 # 4 # fmt: skip # 5
# 6
class MixingCompound:
def foo(): bar(); import zoo # fmt: skip
# https://github.com/astral-sh/ruff/issues/17331
def main() -> None:
import ipdb; ipdb.set_trace() # noqa: E402,E702,I001 # fmt: skip
# https://github.com/astral-sh/ruff/issues/11430
print(); print() # noqa # fmt: skip

View File

@@ -0,0 +1,22 @@
# 0
'''Module docstring
multiple lines''' # 1 # fmt: skip # 2
import a;import b; from c import (
x,y,z
); import f # fmt: skip
x=1;x=2;x=3;x=4 # fmt: skip
# 1
x=[ # 2
'1', # 3
'2',
'3'
];x=1;x=1 # 4 # fmt: skip # 5
# 6
def foo(): x=[
'1',
'2',
];x=1 # fmt: skip

View File

@@ -0,0 +1,68 @@
class Simple:
# Range comprises skip range
x=1
<RANGE_START>x=2 <RANGE_END># fmt: skip
x=3
class Semicolon:
# Range is part of skip range
x=1
x=2;<RANGE_START>x=3<RANGE_END> # fmt: skip
x=4
class FormatFirst:
x=1
<RANGE_START>x=2<RANGE_END>;x=3 # fmt: skip
x=4
class FormatMiddle:
x=1
x=2;<RANGE_START>x=3<RANGE_END>;x=4 # fmt: skip
x=5
class SemicolonNewLogicalLine:
# Range overlaps on right side
<RANGE_START>x=1;
x=2<RANGE_END>;x=3 # fmt: skip
x=4
class SemicolonNewLogicalLine:
# Range overlaps on left side
x=1;
x=2;<RANGE_START>x=3 # fmt: skip
x=4<RANGE_END>
class ManySemicolonOneLine:
x=1
x=2;x=3;x=4 # fmt: skip
x=5
class CompoundInSuite:
x=1
<RANGE_START>def foo(): y=1 <RANGE_END># fmt: skip
x=2
class CompoundInSuiteNewline:
x=1
def foo():
y=1 # fmt: skip
x=2
class MultiLineSkip:
# Range inside statement
x=1
x = <RANGE_START>[
'1',
'2',<RANGE_END>
] # fmt: skip
class LotsOfComments:
# 1
x=[ # 2
'1', # 3<RANGE_START>
'2',
'3'
] ;x=2;x=3 # 4<RANGE_END> # fmt: skip # 5
# 6

View File

@@ -24,7 +24,6 @@ pub use crate::options::{
};
use crate::range::is_logical_line;
pub use crate::shared_traits::{AsFormat, FormattedIter, FormattedIterExt, IntoFormat};
-use crate::verbatim::suppressed_node;
pub(crate) mod builders;
pub mod cli;
@@ -61,51 +60,39 @@ where
let node_ref = AnyNodeRef::from(node);
let node_comments = comments.leading_dangling_trailing(node_ref);
-if self.is_suppressed(node_comments.trailing, f.context()) {
-suppressed_node(node_ref).fmt(f)
-} else {
-leading_comments(node_comments.leading).fmt(f)?;
+leading_comments(node_comments.leading).fmt(f)?;
-// Emit source map information for nodes that are valid "narrowing" targets
-// in range formatting. Never emit source map information if they're disabled
-// for performance reasons.
-let emit_source_position = (is_logical_line(node_ref) || node_ref.is_mod_module())
-&& f.options().source_map_generation().is_enabled();
+// Emit source map information for nodes that are valid "narrowing" targets
+// in range formatting. Never emit source map information if they're disabled
+// for performance reasons.
+let emit_source_position = (is_logical_line(node_ref) || node_ref.is_mod_module())
+&& f.options().source_map_generation().is_enabled();
-emit_source_position
-.then_some(source_position(node.start()))
-.fmt(f)?;
+emit_source_position
+.then_some(source_position(node.start()))
+.fmt(f)?;
-self.fmt_fields(node, f)?;
+self.fmt_fields(node, f)?;
-debug_assert!(
-node_comments
-.dangling
-.iter()
-.all(SourceComment::is_formatted),
-"The node has dangling comments that need to be formatted manually. Add the special dangling comments handling to `fmt_fields`."
-);
+debug_assert!(
+node_comments
+.dangling
+.iter()
+.all(SourceComment::is_formatted),
+"The node has dangling comments that need to be formatted manually. Add the special dangling comments handling to `fmt_fields`."
+);
-write!(
-f,
-[
-emit_source_position.then_some(source_position(node.end())),
-trailing_comments(node_comments.trailing)
-]
-)
-}
+write!(
+f,
+[
+emit_source_position.then_some(source_position(node.end())),
+trailing_comments(node_comments.trailing)
+]
+)
}
/// Formats the node's fields.
fn fmt_fields(&self, item: &N, f: &mut PyFormatter) -> FormatResult<()>;
-fn is_suppressed(
-&self,
-_trailing_comments: &[SourceComment],
-_context: &PyFormatContext,
-) -> bool {
-false
-}
}
#[derive(Error, Debug, salsa::Update, PartialEq, Eq)]

View File

@@ -1,9 +1,10 @@
use ruff_formatter::write;
use ruff_python_ast::Decorator;
+use ruff_text_size::Ranged;
-use crate::comments::SourceComment;
use crate::expression::maybe_parenthesize_expression;
use crate::expression::parentheses::Parenthesize;
+use crate::verbatim::verbatim_text;
use crate::{has_skip_comment, prelude::*};
#[derive(Default)]
@@ -11,26 +12,27 @@ pub struct FormatDecorator;
impl FormatNodeRule<Decorator> for FormatDecorator {
fn fmt_fields(&self, item: &Decorator, f: &mut PyFormatter) -> FormatResult<()> {
-let Decorator {
-expression,
-range: _,
-node_index: _,
-} = item;
+let comments = f.context().comments();
+let trailing = comments.trailing(item);
-write!(
-f,
-[
-token("@"),
-maybe_parenthesize_expression(expression, item, Parenthesize::Optional)
-]
-)
-}
+if has_skip_comment(trailing, f.context().source()) {
+comments.mark_verbatim_node_comments_formatted(item.into());
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
+verbatim_text(item.range()).fmt(f)
+} else {
+let Decorator {
+expression,
+range: _,
+node_index: _,
+} = item;
+write!(
+f,
+[
+token("@"),
+maybe_parenthesize_expression(expression, item, Parenthesize::Optional)
+]
+)
+}
}
}

View File

@@ -15,7 +15,7 @@ use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
use crate::comments::Comments;
use crate::context::{IndentLevel, NodeLevel};
use crate::prelude::*;
-use crate::statement::suite::DocstringStmt;
+use crate::statement::suite::{DocstringStmt, skip_range};
use crate::verbatim::{ends_suppression, starts_suppression};
use crate::{FormatModuleError, PyFormatOptions, format_module_source};
@@ -251,7 +251,29 @@ impl<'ast> SourceOrderVisitor<'ast> for FindEnclosingNode<'_, 'ast> {
// We only visit statements that aren't suppressed that's why we don't need to track the suppression
// state in a stack. Assert that this assumption is safe.
debug_assert!(self.suppressed.is_no());
-walk_body(self, body);
+let mut iter = body.iter();
+while let Some(stmt) = iter.next() {
+// If the range intersects a skip range then we need to
+// format the entire suite to properly handle the case
+// where a `fmt: skip` affects multiple statements.
+//
+// For example, in the case
+//
+// ```
+// <RANGE_START>x=1<RANGE_END>;x=2 # fmt: skip
+// ```
+//
+// the statement `x=1` does not "know" that it is
+// suppressed, but the suite does.
+if let Some(verbatim_range) = skip_range(stmt, iter.as_slice(), self.context)
+&& verbatim_range.intersect(self.range).is_some()
+{
+break;
+}
+self.visit_stmt(stmt);
+}
self.suppressed = Suppressed::No;
}
}
@@ -561,7 +583,7 @@ impl NarrowRange<'_> {
}
pub(crate) const fn is_logical_line(node: AnyNodeRef) -> bool {
-// Make sure to update [`FormatEnclosingLine`] when changing this.
+// Make sure to update [`FormatEnclosingNode`] when changing this.
node.is_statement()
|| node.is_decorator()
|| node.is_except_handler()

View File

@@ -1,14 +1,13 @@
use ruff_formatter::write;
use ruff_python_ast::StmtAnnAssign;
-use crate::comments::SourceComment;
use crate::expression::is_splittable_expression;
use crate::expression::parentheses::Parentheses;
+use crate::prelude::*;
use crate::statement::stmt_assign::{
AnyAssignmentOperator, AnyBeforeOperator, FormatStatementsLastExpression,
};
use crate::statement::trailing_semicolon;
-use crate::{has_skip_comment, prelude::*};
#[derive(Default)]
pub struct FormatStmtAnnAssign;
@@ -84,12 +83,4 @@ impl FormatNodeRule<StmtAnnAssign> for FormatStmtAnnAssign {
Ok(())
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -2,10 +2,9 @@ use ruff_formatter::prelude::{space, token};
use ruff_formatter::write;
use ruff_python_ast::StmtAssert;
-use crate::comments::SourceComment;
use crate::expression::maybe_parenthesize_expression;
use crate::expression::parentheses::Parenthesize;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtAssert;
@@ -41,12 +40,4 @@ impl FormatNodeRule<StmtAssert> for FormatStmtAssert {
Ok(())
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -19,6 +19,7 @@ use crate::expression::{
maybe_parenthesize_expression,
};
use crate::other::interpolated_string::InterpolatedStringLayout;
+use crate::prelude::*;
use crate::preview::is_parenthesize_lambda_bodies_enabled;
use crate::statement::trailing_semicolon;
use crate::string::StringLikeExtensions;
@@ -26,7 +27,6 @@ use crate::string::implicit::{
FormatImplicitConcatenatedStringExpanded, FormatImplicitConcatenatedStringFlat,
ImplicitConcatenatedLayout,
};
-use crate::{has_skip_comment, prelude::*};
#[derive(Default)]
pub struct FormatStmtAssign;
@@ -104,14 +104,6 @@ impl FormatNodeRule<StmtAssign> for FormatStmtAssign {
Ok(())
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}
/// Formats a single target with the equal operator.

View File

@@ -1,15 +1,14 @@
use ruff_formatter::write;
use ruff_python_ast::StmtAugAssign;
-use crate::comments::SourceComment;
use crate::expression::parentheses::is_expression_parenthesized;
+use crate::prelude::*;
use crate::statement::stmt_assign::{
AnyAssignmentOperator, AnyBeforeOperator, FormatStatementsLastExpression,
has_target_own_parentheses,
};
use crate::statement::trailing_semicolon;
use crate::{AsFormat, FormatNodeRule};
-use crate::{has_skip_comment, prelude::*};
#[derive(Default)]
pub struct FormatStmtAugAssign;
@@ -62,12 +61,4 @@ impl FormatNodeRule<StmtAugAssign> for FormatStmtAugAssign {
Ok(())
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,7 +1,6 @@
use ruff_python_ast::StmtBreak;
-use crate::comments::SourceComment;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtBreak;
@@ -10,12 +9,4 @@ impl FormatNodeRule<StmtBreak> for FormatStmtBreak {
fn fmt_fields(&self, _item: &StmtBreak, f: &mut PyFormatter) -> FormatResult<()> {
token("break").fmt(f)
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,7 +1,6 @@
use ruff_python_ast::StmtContinue;
-use crate::comments::SourceComment;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtContinue;
@@ -10,12 +9,4 @@ impl FormatNodeRule<StmtContinue> for FormatStmtContinue {
fn fmt_fields(&self, _item: &StmtContinue, f: &mut PyFormatter) -> FormatResult<()> {
token("continue").fmt(f)
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -3,10 +3,10 @@ use ruff_python_ast::StmtDelete;
use ruff_text_size::Ranged;
use crate::builders::{PyFormatterExtensions, parenthesize_if_expands};
-use crate::comments::{SourceComment, dangling_node_comments};
+use crate::comments::dangling_node_comments;
use crate::expression::maybe_parenthesize_expression;
use crate::expression::parentheses::Parenthesize;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtDelete;
@@ -57,12 +57,4 @@ impl FormatNodeRule<StmtDelete> for FormatStmtDelete {
}
}
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,12 +1,10 @@
use ruff_python_ast as ast;
use ruff_python_ast::{Expr, Operator, StmtExpr};
-use crate::comments::SourceComment;
use crate::expression::maybe_parenthesize_expression;
use crate::expression::parentheses::Parenthesize;
+use crate::prelude::*;
use crate::statement::trailing_semicolon;
-use crate::{has_skip_comment, prelude::*};
#[derive(Default)]
pub struct FormatStmtExpr;
@@ -30,14 +28,6 @@ impl FormatNodeRule<StmtExpr> for FormatStmtExpr {
Ok(())
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}
const fn is_arithmetic_like(expression: &Expr) -> bool {

View File

@@ -1,8 +1,6 @@
use ruff_formatter::{format_args, write};
use ruff_python_ast::StmtGlobal;
-use crate::comments::SourceComment;
-use crate::has_skip_comment;
use crate::prelude::*;
#[derive(Default)]
@@ -47,12 +45,4 @@ impl FormatNodeRule<StmtGlobal> for FormatStmtGlobal {
)
}
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,8 +1,7 @@
use ruff_formatter::{format_args, write};
use ruff_python_ast::StmtImport;
-use crate::comments::SourceComment;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtImport;
@@ -21,12 +20,4 @@ impl FormatNodeRule<StmtImport> for FormatStmtImport {
});
write!(f, [token("import"), space(), names])
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -3,9 +3,7 @@ use ruff_python_ast::StmtImportFrom;
use ruff_text_size::Ranged;
use crate::builders::{PyFormatterExtensions, TrailingComma, parenthesize_if_expands};
-use crate::comments::SourceComment;
use crate::expression::parentheses::parenthesized;
-use crate::has_skip_comment;
use crate::other::identifier::DotDelimitedIdentifier;
use crate::prelude::*;
@@ -72,12 +70,4 @@ impl FormatNodeRule<StmtImportFrom> for FormatStmtImportFrom {
.fmt(f)
}
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,8 +1,7 @@
use ruff_python_ast::StmtIpyEscapeCommand;
use ruff_text_size::Ranged;
-use crate::comments::SourceComment;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtIpyEscapeCommand;
@@ -11,12 +10,4 @@ impl FormatNodeRule<StmtIpyEscapeCommand> for FormatStmtIpyEscapeCommand {
fn fmt_fields(&self, item: &StmtIpyEscapeCommand, f: &mut PyFormatter) -> FormatResult<()> {
source_text_slice(item.range()).fmt(f)
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,8 +1,6 @@
use ruff_formatter::{format_args, write};
use ruff_python_ast::StmtNonlocal;
-use crate::comments::SourceComment;
-use crate::has_skip_comment;
use crate::prelude::*;
#[derive(Default)]
@@ -47,12 +45,4 @@ impl FormatNodeRule<StmtNonlocal> for FormatStmtNonlocal {
)
}
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,7 +1,6 @@
use ruff_python_ast::StmtPass;
-use crate::comments::SourceComment;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtPass;
@@ -10,12 +9,4 @@ impl FormatNodeRule<StmtPass> for FormatStmtPass {
fn fmt_fields(&self, _item: &StmtPass, f: &mut PyFormatter) -> FormatResult<()> {
token("pass").fmt(f)
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,10 +1,9 @@
use ruff_formatter::write;
use ruff_python_ast::StmtRaise;
-use crate::comments::SourceComment;
use crate::expression::maybe_parenthesize_expression;
use crate::expression::parentheses::Parenthesize;
-use crate::{has_skip_comment, prelude::*};
+use crate::prelude::*;
#[derive(Default)]
pub struct FormatStmtRaise;
@@ -43,12 +42,4 @@ impl FormatNodeRule<StmtRaise> for FormatStmtRaise {
}
Ok(())
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,10 +1,9 @@
use ruff_formatter::write;
use ruff_python_ast::{Expr, StmtReturn};
-use crate::comments::SourceComment;
use crate::expression::expr_tuple::TupleParentheses;
+use crate::prelude::*;
use crate::statement::stmt_assign::FormatStatementsLastExpression;
-use crate::{has_skip_comment, prelude::*};
#[derive(Default)]
pub struct FormatStmtReturn;
@@ -43,12 +42,4 @@ impl FormatNodeRule<StmtReturn> for FormatStmtReturn {
None => Ok(()),
}
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -1,11 +1,10 @@
use ruff_formatter::write;
use ruff_python_ast::StmtTypeAlias;
-use crate::comments::SourceComment;
+use crate::prelude::*;
use crate::statement::stmt_assign::{
AnyAssignmentOperator, AnyBeforeOperator, FormatStatementsLastExpression,
};
-use crate::{has_skip_comment, prelude::*};
#[derive(Default)]
pub struct FormatStmtTypeAlias;
@@ -42,12 +41,4 @@ impl FormatNodeRule<StmtTypeAlias> for FormatStmtTypeAlias {
]
)
}
-fn is_suppressed(
-&self,
-trailing_comments: &[SourceComment],
-context: &PyFormatContext,
-) -> bool {
-has_skip_comment(trailing_comments, context.source())
-}
}

View File

@@ -4,11 +4,15 @@ use ruff_formatter::{
use ruff_python_ast::helpers::is_compound_statement;
use ruff_python_ast::{self as ast, Expr, PySourceType, Stmt, Suite};
use ruff_python_ast::{AnyNodeRef, StmtExpr};
-use ruff_python_trivia::{lines_after, lines_after_ignoring_end_of_line_trivia, lines_before};
+use ruff_python_trivia::{
+SimpleTokenKind, SimpleTokenizer, lines_after, lines_after_ignoring_end_of_line_trivia,
+lines_before,
+};
use ruff_text_size::{Ranged, TextRange};
use crate::comments::{
-Comments, LeadingDanglingTrailingComments, leading_comments, trailing_comments,
+Comments, LeadingDanglingTrailingComments, has_skip_comment, leading_comments,
+trailing_comments,
};
use crate::context::{NodeLevel, TopLevelStatementPosition, WithIndentLevel, WithNodeLevel};
use crate::other::string_literal::StringLiteralKind;
@@ -16,9 +20,9 @@ use crate::prelude::*;
use crate::preview::{
is_allow_newline_after_block_open_enabled, is_blank_line_before_decorated_class_in_stub_enabled,
};
use crate::statement::stmt_expr::FormatStmtExpr;
use crate::statement::trailing_semicolon;
use crate::verbatim::{
suppressed_node, write_suppressed_statements_starting_with_leading_comment,
write_skipped_statements, write_suppressed_statements_starting_with_leading_comment,
write_suppressed_statements_starting_with_trailing_comment,
};
@@ -152,7 +156,21 @@ impl FormatRule<Suite, PyFormatContext<'_>> for FormatSuite {
let first_comments = comments.leading_dangling_trailing(first);
let (mut preceding, mut empty_line_after_docstring) = if first_comments
let (mut preceding, mut empty_line_after_docstring) = if let Some(verbatim_range) =
skip_range(first.statement(), iter.as_slice(), f.context())
{
let preceding =
write_skipped_statements(first.statement(), &mut iter, verbatim_range, f)?;
// Insert a newline after a module level docstring, but treat
// it as a docstring otherwise. See: https://github.com/psf/black/pull/3932.
let empty_line_after_docstring =
matches!(self.kind, SuiteKind::TopLevel | SuiteKind::Class)
&& DocstringStmt::try_from_statement(preceding, self.kind, f.context())
.is_some();
(preceding, empty_line_after_docstring)
} else if first_comments
.leading
.iter()
.any(|comment| comment.is_suppression_off_comment(source))
@@ -391,7 +409,10 @@ impl FormatRule<Suite, PyFormatContext<'_>> for FormatSuite {
}
}
if following_comments
if let Some(verbatim_range) = skip_range(following, iter.as_slice(), f.context()) {
preceding = write_skipped_statements(following, &mut iter, verbatim_range, f)?;
preceding_comments = comments.leading_dangling_trailing(preceding);
} else if following_comments
.leading
.iter()
.any(|comment| comment.is_suppression_off_comment(source))
@@ -840,61 +861,57 @@ impl Format<PyFormatContext<'_>> for DocstringStmt<'_> {
let comments = f.context().comments().clone();
let node_comments = comments.leading_dangling_trailing(self.docstring);
if FormatStmtExpr.is_suppressed(node_comments.trailing, f.context()) {
suppressed_node(self.docstring).fmt(f)
} else {
// SAFETY: Safe because `DocStringStmt` guarantees that it only ever wraps a `ExprStmt` containing a `ExprStringLiteral`.
let string_literal = self
.docstring
.as_expr_stmt()
.unwrap()
.value
.as_string_literal_expr()
.unwrap();
// SAFETY: Safe because `DocStringStmt` guarantees that it only ever wraps a `ExprStmt` containing a `ExprStringLiteral`.
let string_literal = self
.docstring
.as_expr_stmt()
.unwrap()
.value
.as_string_literal_expr()
.unwrap();
// We format the expression, but the statement carries the comments
write!(
f,
[
leading_comments(node_comments.leading),
f.options()
.source_map_generation()
.is_enabled()
.then_some(source_position(self.docstring.start())),
string_literal
.format()
.with_options(StringLiteralKind::Docstring),
f.options()
.source_map_generation()
.is_enabled()
.then_some(source_position(self.docstring.end())),
]
)?;
// We format the expression, but the statement carries the comments
write!(
f,
[
leading_comments(node_comments.leading),
f.options()
.source_map_generation()
.is_enabled()
.then_some(source_position(self.docstring.start())),
string_literal
.format()
.with_options(StringLiteralKind::Docstring),
f.options()
.source_map_generation()
.is_enabled()
.then_some(source_position(self.docstring.end())),
]
)?;
if self.suite_kind == SuiteKind::Class {
// Comments after class docstrings need a newline between the docstring and the
// comment (https://github.com/astral-sh/ruff/issues/7948).
// ```python
// class ModuleBrowser:
// """Browse module classes and functions in IDLE."""
// # ^ Insert a newline above here
//
// def __init__(self, master, path, *, _htest=False, _utest=False):
// pass
// ```
if let Some(own_line) = node_comments
.trailing
.iter()
.find(|comment| comment.line_position().is_own_line())
{
if lines_before(own_line.start(), f.context().source()) < 2 {
empty_line().fmt(f)?;
}
if self.suite_kind == SuiteKind::Class {
// Comments after class docstrings need a newline between the docstring and the
// comment (https://github.com/astral-sh/ruff/issues/7948).
// ```python
// class ModuleBrowser:
// """Browse module classes and functions in IDLE."""
// # ^ Insert a newline above here
//
// def __init__(self, master, path, *, _htest=False, _utest=False):
// pass
// ```
if let Some(own_line) = node_comments
.trailing
.iter()
.find(|comment| comment.line_position().is_own_line())
{
if lines_before(own_line.start(), f.context().source()) < 2 {
empty_line().fmt(f)?;
}
}
trailing_comments(node_comments.trailing).fmt(f)
}
trailing_comments(node_comments.trailing).fmt(f)
}
}
@@ -938,6 +955,58 @@ impl Format<PyFormatContext<'_>> for SuiteChildStatement<'_> {
}
}
pub(crate) fn skip_range(
first: &Stmt,
statements: &[Stmt],
context: &PyFormatContext,
) -> Option<TextRange> {
let start = first.start();
let mut last_statement = first;
let comments = context.comments();
let source = context.source();
for statement in statements {
if new_logical_line_between_statements(
source,
TextRange::new(last_statement.end(), statement.start()),
) {
break;
}
last_statement = statement;
}
if has_skip_comment(comments.trailing(last_statement), source) {
Some(TextRange::new(
start,
trailing_semicolon(last_statement.into(), source)
.map_or_else(|| last_statement.end(), ruff_text_size::TextRange::end),
))
} else {
None
}
}
fn new_logical_line_between_statements(source: &str, between_statement_range: TextRange) -> bool {
let mut tokenizer = SimpleTokenizer::new(source, between_statement_range).map(|tok| tok.kind());
while let Some(token) = tokenizer.next() {
match token {
SimpleTokenKind::Continuation => {
tokenizer.next();
}
SimpleTokenKind::Newline => {
return true;
}
// Since we are between statements, there are
// no non-trivia tokens, so there is no need to check
// for these and do an early return.
_ => {}
}
}
false
}
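For readers skimming the Rust above: `skip_range` extends the skip region across every statement that shares a logical line with the `# fmt: skip` comment, and `new_logical_line_between_statements` decides where that line ends. The core of that helper can be sketched in Python (a simplified sketch: the real implementation uses ruff's `SimpleTokenizer`, so it also handles trivia the string scan below ignores):

```python
def new_logical_line_between(source: str) -> bool:
    """Return True if the trivia between two statements contains a
    newline that is not escaped by a backslash continuation."""
    i = 0
    while i < len(source):
        if source[i] == "\\":
            # A continuation swallows the newline token that follows it,
            # so `x = ['a']; \` + newline stays on one logical line.
            i += 2
        elif source[i] == "\n":
            return True
        else:
            i += 1
    return False
```

So for `x=2;x=3 # fmt: skip`, the text between the two statements (`;`) has no newline and both are suppressed, while a plain newline before the next statement ends the skip range.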
#[cfg(test)]
mod tests {
use ruff_formatter::format;

View File

@@ -2,6 +2,7 @@ use std::borrow::Cow;
use std::iter::FusedIterator;
use std::slice::Iter;
use itertools::PeekingNext;
use ruff_formatter::{FormatError, write};
use ruff_python_ast::AnyNodeRef;
use ruff_python_ast::Stmt;
@@ -451,6 +452,40 @@ fn write_suppressed_statements<'a>(
}
}
#[cold]
pub(crate) fn write_skipped_statements<'a>(
first_skipped: &'a Stmt,
statements: &mut std::slice::Iter<'a, Stmt>,
verbatim_range: TextRange,
f: &mut PyFormatter,
) -> FormatResult<&'a Stmt> {
let comments = f.context().comments().clone();
comments.mark_verbatim_node_comments_formatted(first_skipped.into());
let mut preceding = first_skipped;
while let Some(prec) = statements.peeking_next(|next| next.end() <= verbatim_range.end()) {
comments.mark_verbatim_node_comments_formatted(prec.into());
preceding = prec;
}
let first_leading = comments.leading(first_skipped);
let preceding_trailing = comments.trailing(preceding);
// Write the outer comments and format the node as verbatim
write!(
f,
[
leading_comments(first_leading),
source_position(verbatim_range.start()),
verbatim_text(verbatim_range),
source_position(verbatim_range.end()),
trailing_comments(preceding_trailing)
]
)?;
Ok(preceding)
}
#[derive(Copy, Clone, Debug)]
enum InSuppression {
No,
@@ -893,65 +928,6 @@ impl Format<PyFormatContext<'_>> for VerbatimText {
}
}
/// Disables formatting for `node` and instead uses the same formatting as the node has in source.
///
/// The `node` gets indented as any formatted node to avoid syntax errors when the indentation string changes (e.g. from 2 spaces to 4).
/// The `node`s leading and trailing comments are formatted as usual, except if they fall into the suppressed node's range.
#[cold]
pub(crate) fn suppressed_node<'a, N>(node: N) -> FormatSuppressedNode<'a>
where
N: Into<AnyNodeRef<'a>>,
{
FormatSuppressedNode { node: node.into() }
}
pub(crate) struct FormatSuppressedNode<'a> {
node: AnyNodeRef<'a>,
}
impl Format<PyFormatContext<'_>> for FormatSuppressedNode<'_> {
fn fmt(&self, f: &mut Formatter<PyFormatContext<'_>>) -> FormatResult<()> {
let comments = f.context().comments().clone();
let node_comments = comments.leading_dangling_trailing(self.node);
// Mark all comments as formatted that fall into the node range
for comment in node_comments.leading {
if comment.start() > self.node.start() {
comment.mark_formatted();
}
}
for comment in node_comments.trailing {
if comment.start() < self.node.end() {
comment.mark_formatted();
}
}
// Some statements may end with a semicolon. Preserve the semicolon
let semicolon_range = self
.node
.is_statement()
.then(|| trailing_semicolon(self.node, f.context().source()))
.flatten();
let verbatim_range = semicolon_range.map_or(self.node.range(), |semicolon| {
TextRange::new(self.node.start(), semicolon.end())
});
comments.mark_verbatim_node_comments_formatted(self.node);
// Write the outer comments and format the node as verbatim
write!(
f,
[
leading_comments(node_comments.leading),
source_position(verbatim_range.start()),
verbatim_text(verbatim_range),
source_position(verbatim_range.end()),
trailing_comments(node_comments.trailing)
]
)
}
}
#[cold]
pub(crate) fn write_suppressed_clause_header(
header: ClauseHeader,

View File

@@ -0,0 +1,299 @@
---
source: crates/ruff_python_formatter/tests/fixtures.rs
---
## Input
```python
class Simple:
x=1
x=2 # fmt: skip
x=3
class Semicolon:
x=1
x=2;x=3 # fmt: skip
x=4
class TrailingSemicolon:
x=1
x=2;x=3 ; # fmt: skip
x=4
class SemicolonNewLogicalLine:
x=1;
x=2;x=3 # fmt: skip
x=4
class ManySemicolonOneLine:
x=1
x=2;x=3;x=4 # fmt: skip
x=5
class CompoundInSuite:
x=1
def foo(): y=1 # fmt: skip
x=2
class CompoundInSuiteNewline:
x=1
def foo():
y=1 # fmt: skip
x=2
class MultiLineSkip:
x=1
x = [
'1',
'2',
] # fmt: skip
class MultiLineSemicolon:
x=1
x = [
'1',
'2',
]; x=2 # fmt: skip
class LineContinuationSemicolonAfter:
x=1
x = ['a']\
; y=1 # fmt: skip
class LineContinuationSemicolonBefore:
x=1
x = ['a']; \
y=1 # fmt: skip
class LineContinuationSemicolonAndNewline:
x=1
x = ['a']; \
y=1 # fmt: skip
class LineContinuationSemicolonAndNewlineAndComment:
x=1
x = ['a']; \
# 1
y=1 # fmt: skip
class RepeatedLineContinuation:
x=1
x = ['a']; \
\
\
y=1 # fmt: skip
class MultiLineSemicolonComments:
x=1
# 1
x = [ # 2
'1', # 3
'2',
# 4
]; x=2 # 5 # fmt: skip # 6
class DocstringSkipped:
'''This is a docstring''' # fmt: skip
x=1
class MultilineDocstringSkipped:
'''This is a docstring
''' # fmt: skip
x=1
class FirstStatementNewlines:
x=1 # fmt: skip
class ChainingSemicolons:
x=[
'1',
'2',
'3',
];x=1;x=[
'1',
'2',
'3'
];x=1;x=1 # fmt: skip
class LotsOfComments:
# 1
x=[ # 2
'1', # 3
'2',
'3'
] ;x=2;x=3 # 4 # fmt: skip # 5
# 6
class MixingCompound:
def foo(): bar(); import zoo # fmt: skip
# https://github.com/astral-sh/ruff/issues/17331
def main() -> None:
import ipdb; ipdb.set_trace() # noqa: E402,E702,I001 # fmt: skip
# https://github.com/astral-sh/ruff/issues/11430
print(); print() # noqa # fmt: skip
```
## Output
```python
class Simple:
x = 1
x=2 # fmt: skip
x = 3
class Semicolon:
x = 1
x=2;x=3 # fmt: skip
x = 4
class TrailingSemicolon:
x = 1
x=2;x=3 ; # fmt: skip
x = 4
class SemicolonNewLogicalLine:
x = 1
x=2;x=3 # fmt: skip
x = 4
class ManySemicolonOneLine:
x = 1
x=2;x=3;x=4 # fmt: skip
x = 5
class CompoundInSuite:
x = 1
def foo(): y=1 # fmt: skip
x = 2
class CompoundInSuiteNewline:
x = 1
def foo():
y=1 # fmt: skip
x = 2
class MultiLineSkip:
x = 1
x = [
'1',
'2',
] # fmt: skip
class MultiLineSemicolon:
x = 1
x = [
'1',
'2',
]; x=2 # fmt: skip
class LineContinuationSemicolonAfter:
x = 1
x = ['a']\
; y=1 # fmt: skip
class LineContinuationSemicolonBefore:
x = 1
x = ['a']; \
y=1 # fmt: skip
class LineContinuationSemicolonAndNewline:
x = 1
x = ["a"]
y=1 # fmt: skip
class LineContinuationSemicolonAndNewlineAndComment:
x = 1
x = ["a"]
# 1
y=1 # fmt: skip
class RepeatedLineContinuation:
x = 1
x = ['a']; \
\
\
y=1 # fmt: skip
class MultiLineSemicolonComments:
x = 1
# 1
x = [ # 2
'1', # 3
'2',
# 4
]; x=2 # 5 # fmt: skip # 6
class DocstringSkipped:
'''This is a docstring''' # fmt: skip
x = 1
class MultilineDocstringSkipped:
'''This is a docstring
''' # fmt: skip
x = 1
class FirstStatementNewlines:
x=1 # fmt: skip
class ChainingSemicolons:
x=[
'1',
'2',
'3',
];x=1;x=[
'1',
'2',
'3'
];x=1;x=1 # fmt: skip
class LotsOfComments:
# 1
x=[ # 2
'1', # 3
'2',
'3'
] ;x=2;x=3 # 4 # fmt: skip # 5
# 6
class MixingCompound:
def foo(): bar(); import zoo # fmt: skip
# https://github.com/astral-sh/ruff/issues/17331
def main() -> None:
import ipdb; ipdb.set_trace() # noqa: E402,E702,I001 # fmt: skip
# https://github.com/astral-sh/ruff/issues/11430
print(); print() # noqa # fmt: skip
```

View File

@@ -0,0 +1,56 @@
---
source: crates/ruff_python_formatter/tests/fixtures.rs
---
## Input
```python
# 0
'''Module docstring
multiple lines''' # 1 # fmt: skip # 2
import a;import b; from c import (
x,y,z
); import f # fmt: skip
x=1;x=2;x=3;x=4 # fmt: skip
# 1
x=[ # 2
'1', # 3
'2',
'3'
];x=1;x=1 # 4 # fmt: skip # 5
# 6
def foo(): x=[
'1',
'2',
];x=1 # fmt: skip
```
## Output
```python
# 0
'''Module docstring
multiple lines''' # 1 # fmt: skip # 2
import a;import b; from c import (
x,y,z
); import f # fmt: skip
x=1;x=2;x=3;x=4 # fmt: skip
# 1
x=[ # 2
'1', # 3
'2',
'3'
];x=1;x=1 # 4 # fmt: skip # 5
# 6
def foo():
x=[
'1',
'2',
];x=1 # fmt: skip
```

View File

@@ -0,0 +1,147 @@
---
source: crates/ruff_python_formatter/tests/fixtures.rs
---
## Input
```python
class Simple:
# Range comprises skip range
x=1
<RANGE_START>x=2 <RANGE_END># fmt: skip
x=3
class Semicolon:
# Range is part of skip range
x=1
x=2;<RANGE_START>x=3<RANGE_END> # fmt: skip
x=4
class FormatFirst:
x=1
<RANGE_START>x=2<RANGE_END>;x=3 # fmt: skip
x=4
class FormatMiddle:
x=1
x=2;<RANGE_START>x=3<RANGE_END>;x=4 # fmt: skip
x=5
class SemicolonNewLogicalLine:
# Range overlaps on right side
<RANGE_START>x=1;
x=2<RANGE_END>;x=3 # fmt: skip
x=4
class SemicolonNewLogicalLine:
# Range overlaps on left side
x=1;
x=2;<RANGE_START>x=3 # fmt: skip
x=4<RANGE_END>
class ManySemicolonOneLine:
x=1
x=2;x=3;x=4 # fmt: skip
x=5
class CompoundInSuite:
x=1
<RANGE_START>def foo(): y=1 <RANGE_END># fmt: skip
x=2
class CompoundInSuiteNewline:
x=1
def foo():
y=1 # fmt: skip
x=2
class MultiLineSkip:
# Range inside statement
x=1
x = <RANGE_START>[
'1',
'2',<RANGE_END>
] # fmt: skip
class LotsOfComments:
# 1
x=[ # 2
'1', # 3<RANGE_START>
'2',
'3'
] ;x=2;x=3 # 4<RANGE_END> # fmt: skip # 5
# 6
```
## Output
```python
class Simple:
# Range comprises skip range
x=1
x=2 # fmt: skip
x=3
class Semicolon:
# Range is part of skip range
x=1
x=2;x=3 # fmt: skip
x=4
class FormatFirst:
x=1
x=2;x=3 # fmt: skip
x=4
class FormatMiddle:
x=1
x=2;x=3;x=4 # fmt: skip
x=5
class SemicolonNewLogicalLine:
# Range overlaps on right side
x = 1
x=2;x=3 # fmt: skip
x=4
class SemicolonNewLogicalLine:
# Range overlaps on left side
x=1;
x=2;x=3 # fmt: skip
x = 4
class ManySemicolonOneLine:
x=1
x=2;x=3;x=4 # fmt: skip
x=5
class CompoundInSuite:
x=1
def foo(): y=1 # fmt: skip
x=2
class CompoundInSuiteNewline:
x=1
def foo():
y=1 # fmt: skip
x=2
class MultiLineSkip:
# Range inside statement
x=1
x = [
'1',
'2',
] # fmt: skip
class LotsOfComments:
# 1
x=[ # 2
'1', # 3
'2',
'3'
] ;x=2;x=3 # 4 # fmt: skip # 5
# 6
```

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_wasm"
version = "0.14.10"
version = "0.14.11"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -7,7 +7,6 @@ use std::collections::BTreeMap;
use std::env::VarError;
use std::num::{NonZeroU8, NonZeroU16};
use std::path::{Path, PathBuf};
use std::str::FromStr;
use anyhow::{Context, Result, anyhow};
use glob::{GlobError, Paths, PatternError, glob};
@@ -36,8 +35,7 @@ use ruff_linter::settings::{
DEFAULT_SELECTORS, DUMMY_VARIABLE_RGX, LinterSettings, TASK_TAGS, TargetVersion,
};
use ruff_linter::{
RUFF_PKG_VERSION, RuleSelector, fs, warn_user_once, warn_user_once_by_id,
warn_user_once_by_message,
RuleSelector, fs, warn_user_once, warn_user_once_by_id, warn_user_once_by_message,
};
use ruff_python_ast as ast;
use ruff_python_formatter::{
@@ -53,6 +51,7 @@ use crate::options::{
Flake8UnusedArgumentsOptions, FormatOptions, IsortOptions, LintCommonOptions, LintOptions,
McCabeOptions, Options, Pep8NamingOptions, PyUpgradeOptions, PycodestyleOptions,
PydoclintOptions, PydocstyleOptions, PyflakesOptions, PylintOptions, RuffOptions,
validate_required_version,
};
use crate::settings::{
EXCLUDE, FileResolverSettings, FormatterSettings, INCLUDE, INCLUDE_PREVIEW, LineEnding,
@@ -155,13 +154,7 @@ pub struct Configuration {
impl Configuration {
pub fn into_settings(self, project_root: &Path) -> Result<Settings> {
if let Some(required_version) = &self.required_version {
let ruff_pkg_version = pep440_rs::Version::from_str(RUFF_PKG_VERSION)
.expect("RUFF_PKG_VERSION is not a valid PEP 440 version specifier");
if !required_version.contains(&ruff_pkg_version) {
return Err(anyhow!(
"Required version `{required_version}` does not match the running version `{RUFF_PKG_VERSION}`"
));
}
validate_required_version(required_version)?;
}
let linter_target_version = TargetVersion(self.target_version);

View File

@@ -1,15 +1,19 @@
use anyhow::Result;
use regex::Regex;
use rustc_hash::{FxBuildHasher, FxHashMap, FxHashSet};
use serde::de::{self};
use serde::{Deserialize, Deserializer, Serialize};
use std::collections::{BTreeMap, BTreeSet};
use std::path::PathBuf;
use std::str::FromStr;
use strum::IntoEnumIterator;
use unicode_normalization::UnicodeNormalization;
use crate::settings::LineEnding;
use ruff_formatter::IndentStyle;
use ruff_graph::Direction;
use ruff_linter::RUFF_PKG_VERSION;
use ruff_linter::line_width::{IndentWidth, LineLength};
use ruff_linter::rules::flake8_import_conventions::settings::BannedAliases;
use ruff_linter::rules::flake8_pytest_style::settings::SettingsError;
@@ -556,6 +560,17 @@ pub struct LintOptions {
pub future_annotations: Option<bool>,
}
pub fn validate_required_version(required_version: &RequiredVersion) -> anyhow::Result<()> {
let ruff_pkg_version = pep440_rs::Version::from_str(RUFF_PKG_VERSION)
.expect("RUFF_PKG_VERSION is not a valid PEP 440 version specifier");
if !required_version.contains(&ruff_pkg_version) {
return Err(anyhow::anyhow!(
"Required version `{required_version}` does not match the running version `{RUFF_PKG_VERSION}`"
));
}
Ok(())
}
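The helper above delegates real PEP 440 matching to `pep440_rs`. As a rough illustration of the shape of the check, here is a hypothetical, greatly simplified Python matcher supporting only `==` and `>=` (real PEP 440 matching also covers epochs, pre-releases, `~=`, `!=`, wildcards, and more):

```python
def version_tuple(v: str) -> tuple[int, ...]:
    # Hypothetical helper: split a release segment like "0.14.11".
    return tuple(int(part) for part in v.split("."))

def matches(running: str, specifier: str) -> bool:
    # Sketch only; not a PEP 440 implementation.
    if specifier.startswith(">="):
        return version_tuple(running) >= version_tuple(specifier[2:])
    if specifier.startswith("=="):
        return version_tuple(running) == version_tuple(specifier[2:])
    raise ValueError(f"unsupported specifier: {specifier!r}")
```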
/// Newtype wrapper for [`LintCommonOptions`] that allows customizing the JSON schema and omitting the fields from the [`OptionsMetadata`].
#[derive(Clone, Debug, PartialEq, Eq, Default, Serialize, Deserialize)]
#[serde(transparent)]

View File

@@ -5,12 +5,13 @@ use std::path::{Path, PathBuf};
use anyhow::{Context, Result};
use log::debug;
use pep440_rs::{Operator, Version, VersionSpecifiers};
use serde::de::DeserializeOwned;
use serde::{Deserialize, Serialize};
use strum::IntoEnumIterator;
use ruff_linter::settings::types::PythonVersion;
use ruff_linter::settings::types::{PythonVersion, RequiredVersion};
use crate::options::Options;
use crate::options::{Options, validate_required_version};
#[derive(Debug, PartialEq, Eq, Serialize, Deserialize)]
struct Tools {
@@ -40,20 +41,38 @@ impl Pyproject {
}
}
/// Parse a `ruff.toml` file.
fn parse_ruff_toml<P: AsRef<Path>>(path: P) -> Result<Options> {
fn parse_toml<P: AsRef<Path>, T: DeserializeOwned>(path: P, table_path: &[&str]) -> Result<T> {
let path = path.as_ref();
let contents = std::fs::read_to_string(path)
.with_context(|| format!("Failed to read {}", path.display()))?;
toml::from_str(&contents).with_context(|| format!("Failed to parse {}", path.display()))
// Parse the TOML document once into a spanned representation so we can:
// - Inspect `required-version` without triggering strict deserialization errors.
// - Deserialize with precise spans (line/column and excerpt) on errors.
let root = toml::de::DeTable::parse(&contents)
.with_context(|| format!("Failed to parse {}", path.display()))?;
check_required_version(root.get_ref(), table_path)?;
let deserializer = toml::de::Deserializer::from(root);
T::deserialize(deserializer)
.map_err(|mut err| {
// `Deserializer::from` doesn't have access to the original input, but we do.
// Attach it so TOML errors include line/column and a source excerpt.
err.set_input(Some(&contents));
err
})
.with_context(|| format!("Failed to parse {}", path.display()))
}
/// Parse a `ruff.toml` file.
fn parse_ruff_toml<P: AsRef<Path>>(path: P) -> Result<Options> {
parse_toml(path, &[])
}
/// Parse a `pyproject.toml` file.
fn parse_pyproject_toml<P: AsRef<Path>>(path: P) -> Result<Pyproject> {
let path = path.as_ref();
let contents = std::fs::read_to_string(path)
.with_context(|| format!("Failed to read {}", path.display()))?;
toml::from_str(&contents).with_context(|| format!("Failed to parse {}", path.display()))
parse_toml(path, &["tool", "ruff"])
}
/// Return `true` if a `pyproject.toml` contains a `[tool.ruff]` section.
@@ -98,6 +117,33 @@ pub fn find_settings_toml<P: AsRef<Path>>(path: P) -> Result<Option<PathBuf>> {
Ok(None)
}
fn check_required_version(value: &toml::de::DeTable, table_path: &[&str]) -> Result<()> {
let mut current = value;
for key in table_path {
let Some(next) = current.get(*key) else {
return Ok(());
};
let toml::de::DeValue::Table(next) = next.get_ref() else {
return Ok(());
};
current = next;
}
let required_version = current
.get("required-version")
.and_then(|value| value.get_ref().as_str());
let Some(required_version) = required_version else {
return Ok(());
};
// If it doesn't parse, we just fall through to normal parsing; it will give a nicer error message.
if let Ok(required_version) = required_version.parse::<RequiredVersion>() {
validate_required_version(&required_version)?;
}
Ok(())
}
/// Derive target version from `required-version` in `pyproject.toml`, if
/// such a file exists in an ancestor directory.
pub fn find_fallback_target_version<P: AsRef<Path>>(path: P) -> Option<PythonVersion> {

View File

@@ -102,8 +102,8 @@ impl Relativity {
#[derive(Debug)]
pub struct Resolver<'a> {
pyproject_config: &'a PyprojectConfig,
/// All [`Settings`] that have been added to the resolver.
settings: Vec<Settings>,
/// All [`Settings`] that have been added to the resolver, along with their config file paths.
settings: Vec<(Settings, PathBuf)>,
/// A router from path to index into the `settings` vector.
router: Router<usize>,
}
@@ -146,8 +146,8 @@ impl<'a> Resolver<'a> {
}
/// Add a resolved [`Settings`] under a given [`PathBuf`] scope.
fn add(&mut self, path: &Path, settings: Settings) {
self.settings.push(settings);
fn add(&mut self, path: &Path, settings: Settings, config_path: PathBuf) {
self.settings.push((settings, config_path));
// Normalize the path to use `/` separators and escape the '{' and '}' characters,
// which matchit uses for routing parameters.
@@ -172,13 +172,27 @@ impl<'a> Resolver<'a> {
/// Return the appropriate [`Settings`] for a given [`Path`].
pub fn resolve(&self, path: &Path) -> &Settings {
self.resolve_with_path(path).0
}
/// Return the appropriate [`Settings`] and config file path for a given [`Path`].
pub fn resolve_with_path(&self, path: &Path) -> (&Settings, Option<&Path>) {
match self.pyproject_config.strategy {
PyprojectDiscoveryStrategy::Fixed => &self.pyproject_config.settings,
PyprojectDiscoveryStrategy::Fixed => (
&self.pyproject_config.settings,
self.pyproject_config.path.as_deref(),
),
PyprojectDiscoveryStrategy::Hierarchical => self
.router
.at(path.to_slash_lossy().as_ref())
.map(|Match { value, .. }| &self.settings[*value])
.unwrap_or(&self.pyproject_config.settings),
.map(|Match { value, .. }| {
let (settings, config_path) = &self.settings[*value];
(settings, Some(config_path.as_path()))
})
.unwrap_or((
&self.pyproject_config.settings,
self.pyproject_config.path.as_deref(),
)),
}
}
@@ -255,7 +269,8 @@ impl<'a> Resolver<'a> {
/// Return an iterator over the resolved [`Settings`] in this [`Resolver`].
pub fn settings(&self) -> impl Iterator<Item = &Settings> {
std::iter::once(&self.pyproject_config.settings).chain(&self.settings)
std::iter::once(&self.pyproject_config.settings)
.chain(self.settings.iter().map(|(settings, _)| settings))
}
}
@@ -379,17 +394,17 @@ pub fn resolve_configuration(
/// Extract the project root (scope) and [`Settings`] from a given
/// `pyproject.toml`.
fn resolve_scoped_settings<'a>(
pyproject: &'a Path,
fn resolve_scoped_settings(
pyproject: &Path,
transformer: &dyn ConfigurationTransformer,
origin: ConfigurationOrigin,
) -> Result<(&'a Path, Settings)> {
) -> Result<(PathBuf, Settings)> {
let relativity = Relativity::from(origin);
let configuration = resolve_configuration(pyproject, transformer, origin)?;
let project_root = relativity.resolve(pyproject);
let settings = configuration.into_settings(project_root)?;
Ok((project_root, settings))
Ok((project_root.to_path_buf(), settings))
}
/// Extract the [`Settings`] from a given `pyproject.toml` and process the
@@ -455,7 +470,7 @@ pub fn python_files_in_path<'a>(
transformer,
ConfigurationOrigin::Ancestor,
)?;
resolver.add(root, settings);
resolver.add(&root, settings, pyproject);
// We found the closest configuration.
break;
}
@@ -647,7 +662,11 @@ impl ParallelVisitor for PythonFilesVisitor<'_, '_> {
ConfigurationOrigin::Ancestor,
) {
Ok((root, settings)) => {
self.global.resolver.write().unwrap().add(root, settings);
self.global
.resolver
.write()
.unwrap()
.add(&root, settings, pyproject);
}
Err(err) => {
self.local_error = Err(err);
@@ -767,7 +786,7 @@ pub fn python_file_at_path(
if let Some(pyproject) = settings_toml(ancestor)? {
let (root, settings) =
resolve_scoped_settings(&pyproject, transformer, ConfigurationOrigin::Unknown)?;
resolver.add(root, settings);
resolver.add(&root, settings, pyproject);
break;
}
}

View File

@@ -38,8 +38,8 @@ You can optionally install pre-commit hooks to automatically run the validation
when making a commit:
```shell
uv tool install pre-commit
pre-commit install
uv tool install prek
prek install
```
We recommend [nextest](https://nexte.st/) to run ty's test suite (via `cargo nextest run`),
@@ -66,7 +66,7 @@ and that it passes both the lint and test validation checks:
```shell
cargo clippy --workspace --all-targets --all-features -- -D warnings # Rust linting
cargo test # Rust testing
uvx pre-commit run --all-files --show-diff-on-failure # Rust and Python formatting, Markdown and Python linting, etc.
uvx prek run -a # Rust and Python formatting, Markdown and Python linting, etc.
```
These checks will run on GitHub Actions when you open your pull request, but running them locally

View File

@@ -467,7 +467,7 @@ def test(): -> "int":
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'warn'."><code>warn</code></a> ·
Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.0.1-alpha.1</a> ·
<a href="https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20ignore-comment-unknown-rule" target="_blank">Related issues</a> ·
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L50" target="_blank">View source</a>
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L54" target="_blank">View source</a>
</small>
@@ -1047,7 +1047,7 @@ class D(Generic[U, T]): ...
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'warn'."><code>warn</code></a> ·
Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.0.1-alpha.1</a> ·
<a href="https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-ignore-comment" target="_blank">Related issues</a> ·
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L75" target="_blank">View source</a>
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L79" target="_blank">View source</a>
</small>
@@ -2895,7 +2895,7 @@ A() + A() # TypeError: unsupported operand type(s) for +: 'A' and 'A'
## `unused-ignore-comment`
<small>
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'ignore'."><code>ignore</code></a> ·
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'warn'."><code>warn</code></a> ·
Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.0.1-alpha.1</a> ·
<a href="https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unused-ignore-comment" target="_blank">Related issues</a> ·
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L25" target="_blank">View source</a>
@@ -2904,11 +2904,11 @@ Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.
**What it does**
Checks for `type: ignore` or `ty: ignore` directives that are no longer applicable.
Checks for `ty: ignore` or `type: ignore` directives that are no longer applicable.
**Why is this bad?**
A `type: ignore` directive that no longer matches any diagnostic violations is likely
A `ty: ignore` directive that no longer matches any diagnostic violations is likely
included by mistake, and should be removed to avoid confusion.
**Examples**
@@ -2923,6 +2923,11 @@ Use instead:
a = 20 / 2
```
**Options**
Set [`analysis.respect-type-ignore-comments`](https://docs.astral.sh/ty/reference/configuration/#respect-type-ignore-comments)
to `false` to prevent this rule from reporting unused `type: ignore` comments.
## `useless-overload-body`
<small>

View File

@@ -1,9 +1,10 @@
name,file,index,rank
auto-import-includes-modules,main.py,0,1
auto-import-includes-modules,main.py,1,7
auto-import-includes-modules,main.py,1,2
auto-import-includes-modules,main.py,2,1
auto-import-skips-current-module,main.py,0,1
class-arg-completion,main.py,0,1
exact-over-fuzzy,main.py,0,1
fstring-completions,main.py,0,1
higher-level-symbols-preferred,main.py,0,
higher-level-symbols-preferred,main.py,1,1
@@ -16,8 +17,10 @@ import-deprioritizes-type_check_only,main.py,3,2
import-deprioritizes-type_check_only,main.py,4,3
import-keyword-completion,main.py,0,1
internal-typeshed-hidden,main.py,0,2
local-over-auto-import,main.py,0,1
modules-over-other-symbols,main.py,0,1
none-completion,main.py,0,1
numpy-array,main.py,0,159
numpy-array,main.py,0,57
numpy-array,main.py,1,1
object-attr-instance-methods,main.py,0,1
object-attr-instance-methods,main.py,1,1
@@ -26,7 +29,12 @@ raise-uses-base-exception,main.py,0,1
scope-existing-over-new-import,main.py,0,1
scope-prioritize-closer,main.py,0,2
scope-simple-long-identifier,main.py,0,1
third-party-over-stdlib,main.py,0,1
tighter-over-looser-scope,main.py,0,3
tstring-completions,main.py,0,1
ty-extensions-lower-stdlib,main.py,0,9
type-var-typing-over-ast,main.py,0,3
type-var-typing-over-ast,main.py,1,253
ty-extensions-lower-stdlib,main.py,0,1
typing-gets-priority,main.py,0,1
typing-gets-priority,main.py,1,1
typing-gets-priority,main.py,2,1
typing-gets-priority,main.py,3,1
typing-gets-priority,main.py,4,1


@@ -540,16 +540,17 @@ fn copy_project(src_dir: &SystemPath, dst_dir: &SystemPath) -> anyhow::Result<Ve
     std::fs::create_dir_all(dst_dir).with_context(|| dst_dir.to_string())?;
     let mut cursors = vec![];
-    for result in walkdir::WalkDir::new(src_dir.as_std_path()) {
+    let it = walkdir::WalkDir::new(src_dir.as_std_path())
+        .into_iter()
+        .filter_entry(|dent| {
+            !dent
+                .file_name()
+                .to_str()
+                .is_some_and(|name| name.starts_with('.'))
+        });
+    for result in it {
         let dent =
             result.with_context(|| format!("failed to get directory entry from {src_dir}"))?;
-        if dent
-            .file_name()
-            .to_str()
-            .is_some_and(|name| name.starts_with('.'))
-        {
-            continue;
-        }
         let src = SystemPath::from_std_path(dent.path()).ok_or_else(|| {
             anyhow::anyhow!("path `{}` is not valid UTF-8", dent.path().display())
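The change above moves the dot-file check into `walkdir`'s `filter_entry`, which prunes hidden directories during traversal rather than skipping each entry after it has already been yielded. The same skip-hidden logic can be sketched with only the standard library (no `walkdir` crate); this is an illustrative sketch, and the function name is hypothetical:

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Recursively collect non-hidden files, pruning any entry whose name
// starts with '.'. Because the check happens before recursing, hidden
// directories (e.g. `.git/`) are never descended into at all.
fn collect_visible(dir: &Path, out: &mut Vec<PathBuf>) -> std::io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let entry = entry?;
        // Skip dot-prefixed names, mirroring the `filter_entry` predicate.
        if entry
            .file_name()
            .to_str()
            .is_some_and(|name| name.starts_with('.'))
        {
            continue;
        }
        let path = entry.path();
        if path.is_dir() {
            collect_visible(&path, out)?;
        } else {
            out.push(path);
        }
    }
    Ok(())
}
```

The practical difference from the pre-change code is where the pruning happens: filtering inside the walk (or before recursing, as here) avoids enumerating the contents of hidden directories entirely, which matters for large trees like `.git`.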


@@ -0,0 +1,3 @@
pattern = 1
pttn = 1
pttn<CURSOR: pttn>


@@ -0,0 +1,2 @@
[settings]
auto-import = true


@@ -0,0 +1,11 @@
def foo(x):
# We specifically want the local `x` to be
# suggested first here, and NOT an `x` or an
# `X` from some other module (via auto-import).
# We'd also like this to come before `except`,
# which is a keyword that contains `x`, but is
# not an exact match (whereas `x` is). `except`
# also isn't legal in this context, although
# that sort of context sensitivity is a bit
# trickier.
return x<CURSOR: x>


@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []


@@ -0,0 +1,8 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }


@@ -0,0 +1,2 @@
[settings]
auto-import = true


@@ -0,0 +1 @@
os<CURSOR: os>


@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []


@@ -0,0 +1 @@
os = 1


@@ -0,0 +1,78 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "regex"
version = "2025.11.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cc/a9/546676f25e573a4cf00fe8e119b78a37b6a8fe2dc95cda877b30889c9c45/regex-2025.11.3.tar.gz", hash = "sha256:1fedc720f9bb2494ce31a58a1631f9c82df6a09b49c19517ea5cc280b4541e01", size = 414669, upload-time = "2025-11-03T21:34:22.089Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e1/a7/dda24ebd49da46a197436ad96378f17df30ceb40e52e859fc42cac45b850/regex-2025.11.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c1e448051717a334891f2b9a620fe36776ebf3dd8ec46a0b877c8ae69575feb4", size = 489081, upload-time = "2025-11-03T21:31:55.9Z" },
{ url = "https://files.pythonhosted.org/packages/19/22/af2dc751aacf88089836aa088a1a11c4f21a04707eb1b0478e8e8fb32847/regex-2025.11.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9b5aca4d5dfd7fbfbfbdaf44850fcc7709a01146a797536a8f84952e940cca76", size = 291123, upload-time = "2025-11-03T21:31:57.758Z" },
{ url = "https://files.pythonhosted.org/packages/a3/88/1a3ea5672f4b0a84802ee9891b86743438e7c04eb0b8f8c4e16a42375327/regex-2025.11.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:04d2765516395cf7dda331a244a3282c0f5ae96075f728629287dfa6f76ba70a", size = 288814, upload-time = "2025-11-03T21:32:01.12Z" },
{ url = "https://files.pythonhosted.org/packages/fb/8c/f5987895bf42b8ddeea1b315c9fedcfe07cadee28b9c98cf50d00adcb14d/regex-2025.11.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d9903ca42bfeec4cebedba8022a7c97ad2aab22e09573ce9976ba01b65e4361", size = 798592, upload-time = "2025-11-03T21:32:03.006Z" },
{ url = "https://files.pythonhosted.org/packages/99/2a/6591ebeede78203fa77ee46a1c36649e02df9eaa77a033d1ccdf2fcd5d4e/regex-2025.11.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:639431bdc89d6429f6721625e8129413980ccd62e9d3f496be618a41d205f160", size = 864122, upload-time = "2025-11-03T21:32:04.553Z" },
{ url = "https://files.pythonhosted.org/packages/94/d6/be32a87cf28cf8ed064ff281cfbd49aefd90242a83e4b08b5a86b38e8eb4/regex-2025.11.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f117efad42068f9715677c8523ed2be1518116d1c49b1dd17987716695181efe", size = 912272, upload-time = "2025-11-03T21:32:06.148Z" },
{ url = "https://files.pythonhosted.org/packages/62/11/9bcef2d1445665b180ac7f230406ad80671f0fc2a6ffb93493b5dd8cd64c/regex-2025.11.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4aecb6f461316adf9f1f0f6a4a1a3d79e045f9b71ec76055a791affa3b285850", size = 803497, upload-time = "2025-11-03T21:32:08.162Z" },
{ url = "https://files.pythonhosted.org/packages/e5/a7/da0dc273d57f560399aa16d8a68ae7f9b57679476fc7ace46501d455fe84/regex-2025.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3b3a5f320136873cc5561098dfab677eea139521cb9a9e8db98b7e64aef44cbc", size = 787892, upload-time = "2025-11-03T21:32:09.769Z" },
{ url = "https://files.pythonhosted.org/packages/da/4b/732a0c5a9736a0b8d6d720d4945a2f1e6f38f87f48f3173559f53e8d5d82/regex-2025.11.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:75fa6f0056e7efb1f42a1c34e58be24072cb9e61a601340cc1196ae92326a4f9", size = 858462, upload-time = "2025-11-03T21:32:11.769Z" },
{ url = "https://files.pythonhosted.org/packages/0c/f5/a2a03df27dc4c2d0c769220f5110ba8c4084b0bfa9ab0f9b4fcfa3d2b0fc/regex-2025.11.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:dbe6095001465294f13f1adcd3311e50dd84e5a71525f20a10bd16689c61ce0b", size = 850528, upload-time = "2025-11-03T21:32:13.906Z" },
{ url = "https://files.pythonhosted.org/packages/d6/09/e1cd5bee3841c7f6eb37d95ca91cdee7100b8f88b81e41c2ef426910891a/regex-2025.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:454d9b4ae7881afbc25015b8627c16d88a597479b9dea82b8c6e7e2e07240dc7", size = 789866, upload-time = "2025-11-03T21:32:15.748Z" },
{ url = "https://files.pythonhosted.org/packages/eb/51/702f5ea74e2a9c13d855a6a85b7f80c30f9e72a95493260193c07f3f8d74/regex-2025.11.3-cp313-cp313-win32.whl", hash = "sha256:28ba4d69171fc6e9896337d4fc63a43660002b7da53fc15ac992abcf3410917c", size = 266189, upload-time = "2025-11-03T21:32:17.493Z" },
{ url = "https://files.pythonhosted.org/packages/8b/00/6e29bb314e271a743170e53649db0fdb8e8ff0b64b4f425f5602f4eb9014/regex-2025.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:bac4200befe50c670c405dc33af26dad5a3b6b255dd6c000d92fe4629f9ed6a5", size = 277054, upload-time = "2025-11-03T21:32:19.042Z" },
{ url = "https://files.pythonhosted.org/packages/25/f1/b156ff9f2ec9ac441710764dda95e4edaf5f36aca48246d1eea3f1fd96ec/regex-2025.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:2292cd5a90dab247f9abe892ac584cb24f0f54680c73fcb4a7493c66c2bf2467", size = 270325, upload-time = "2025-11-03T21:32:21.338Z" },
{ url = "https://files.pythonhosted.org/packages/20/28/fd0c63357caefe5680b8ea052131acbd7f456893b69cc2a90cc3e0dc90d4/regex-2025.11.3-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:1eb1ebf6822b756c723e09f5186473d93236c06c579d2cc0671a722d2ab14281", size = 491984, upload-time = "2025-11-03T21:32:23.466Z" },
{ url = "https://files.pythonhosted.org/packages/df/ec/7014c15626ab46b902b3bcc4b28a7bae46d8f281fc7ea9c95e22fcaaa917/regex-2025.11.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1e00ec2970aab10dc5db34af535f21fcf32b4a31d99e34963419636e2f85ae39", size = 292673, upload-time = "2025-11-03T21:32:25.034Z" },
{ url = "https://files.pythonhosted.org/packages/23/ab/3b952ff7239f20d05f1f99e9e20188513905f218c81d52fb5e78d2bf7634/regex-2025.11.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a4cb042b615245d5ff9b3794f56be4138b5adc35a4166014d31d1814744148c7", size = 291029, upload-time = "2025-11-03T21:32:26.528Z" },
{ url = "https://files.pythonhosted.org/packages/21/7e/3dc2749fc684f455f162dcafb8a187b559e2614f3826877d3844a131f37b/regex-2025.11.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:44f264d4bf02f3176467d90b294d59bf1db9fe53c141ff772f27a8b456b2a9ed", size = 807437, upload-time = "2025-11-03T21:32:28.363Z" },
{ url = "https://files.pythonhosted.org/packages/1b/0b/d529a85ab349c6a25d1ca783235b6e3eedf187247eab536797021f7126c6/regex-2025.11.3-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7be0277469bf3bd7a34a9c57c1b6a724532a0d235cd0dc4e7f4316f982c28b19", size = 873368, upload-time = "2025-11-03T21:32:30.4Z" },
{ url = "https://files.pythonhosted.org/packages/7d/18/2d868155f8c9e3e9d8f9e10c64e9a9f496bb8f7e037a88a8bed26b435af6/regex-2025.11.3-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0d31e08426ff4b5b650f68839f5af51a92a5b51abd8554a60c2fbc7c71f25d0b", size = 914921, upload-time = "2025-11-03T21:32:32.123Z" },
{ url = "https://files.pythonhosted.org/packages/2d/71/9d72ff0f354fa783fe2ba913c8734c3b433b86406117a8db4ea2bf1c7a2f/regex-2025.11.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e43586ce5bd28f9f285a6e729466841368c4a0353f6fd08d4ce4630843d3648a", size = 812708, upload-time = "2025-11-03T21:32:34.305Z" },
{ url = "https://files.pythonhosted.org/packages/e7/19/ce4bf7f5575c97f82b6e804ffb5c4e940c62609ab2a0d9538d47a7fdf7d4/regex-2025.11.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:0f9397d561a4c16829d4e6ff75202c1c08b68a3bdbfe29dbfcdb31c9830907c6", size = 795472, upload-time = "2025-11-03T21:32:36.364Z" },
{ url = "https://files.pythonhosted.org/packages/03/86/fd1063a176ffb7b2315f9a1b08d17b18118b28d9df163132615b835a26ee/regex-2025.11.3-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:dd16e78eb18ffdb25ee33a0682d17912e8cc8a770e885aeee95020046128f1ce", size = 868341, upload-time = "2025-11-03T21:32:38.042Z" },
{ url = "https://files.pythonhosted.org/packages/12/43/103fb2e9811205e7386366501bc866a164a0430c79dd59eac886a2822950/regex-2025.11.3-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:ffcca5b9efe948ba0661e9df0fa50d2bc4b097c70b9810212d6b62f05d83b2dd", size = 854666, upload-time = "2025-11-03T21:32:40.079Z" },
{ url = "https://files.pythonhosted.org/packages/7d/22/e392e53f3869b75804762c7c848bd2dd2abf2b70fb0e526f58724638bd35/regex-2025.11.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c56b4d162ca2b43318ac671c65bd4d563e841a694ac70e1a976ac38fcf4ca1d2", size = 799473, upload-time = "2025-11-03T21:32:42.148Z" },
{ url = "https://files.pythonhosted.org/packages/4f/f9/8bd6b656592f925b6845fcbb4d57603a3ac2fb2373344ffa1ed70aa6820a/regex-2025.11.3-cp313-cp313t-win32.whl", hash = "sha256:9ddc42e68114e161e51e272f667d640f97e84a2b9ef14b7477c53aac20c2d59a", size = 268792, upload-time = "2025-11-03T21:32:44.13Z" },
{ url = "https://files.pythonhosted.org/packages/e5/87/0e7d603467775ff65cd2aeabf1b5b50cc1c3708556a8b849a2fa4dd1542b/regex-2025.11.3-cp313-cp313t-win_amd64.whl", hash = "sha256:7a7c7fdf755032ffdd72c77e3d8096bdcb0eb92e89e17571a196f03d88b11b3c", size = 280214, upload-time = "2025-11-03T21:32:45.853Z" },
{ url = "https://files.pythonhosted.org/packages/8d/d0/2afc6f8e94e2b64bfb738a7c2b6387ac1699f09f032d363ed9447fd2bb57/regex-2025.11.3-cp313-cp313t-win_arm64.whl", hash = "sha256:df9eb838c44f570283712e7cff14c16329a9f0fb19ca492d21d4b7528ee6821e", size = 271469, upload-time = "2025-11-03T21:32:48.026Z" },
{ url = "https://files.pythonhosted.org/packages/31/e9/f6e13de7e0983837f7b6d238ad9458800a874bf37c264f7923e63409944c/regex-2025.11.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9697a52e57576c83139d7c6f213d64485d3df5bf84807c35fa409e6c970801c6", size = 489089, upload-time = "2025-11-03T21:32:50.027Z" },
{ url = "https://files.pythonhosted.org/packages/a3/5c/261f4a262f1fa65141c1b74b255988bd2fa020cc599e53b080667d591cfc/regex-2025.11.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e18bc3f73bd41243c9b38a6d9f2366cd0e0137a9aebe2d8ff76c5b67d4c0a3f4", size = 291059, upload-time = "2025-11-03T21:32:51.682Z" },
{ url = "https://files.pythonhosted.org/packages/8e/57/f14eeb7f072b0e9a5a090d1712741fd8f214ec193dba773cf5410108bb7d/regex-2025.11.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:61a08bcb0ec14ff4e0ed2044aad948d0659604f824cbd50b55e30b0ec6f09c73", size = 288900, upload-time = "2025-11-03T21:32:53.569Z" },
{ url = "https://files.pythonhosted.org/packages/3c/6b/1d650c45e99a9b327586739d926a1cd4e94666b1bd4af90428b36af66dc7/regex-2025.11.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9c30003b9347c24bcc210958c5d167b9e4f9be786cb380a7d32f14f9b84674f", size = 799010, upload-time = "2025-11-03T21:32:55.222Z" },
{ url = "https://files.pythonhosted.org/packages/99/ee/d66dcbc6b628ce4e3f7f0cbbb84603aa2fc0ffc878babc857726b8aab2e9/regex-2025.11.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4e1e592789704459900728d88d41a46fe3969b82ab62945560a31732ffc19a6d", size = 864893, upload-time = "2025-11-03T21:32:57.239Z" },
{ url = "https://files.pythonhosted.org/packages/bf/2d/f238229f1caba7ac87a6c4153d79947fb0261415827ae0f77c304260c7d3/regex-2025.11.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6538241f45eb5a25aa575dbba1069ad786f68a4f2773a29a2bd3dd1f9de787be", size = 911522, upload-time = "2025-11-03T21:32:59.274Z" },
{ url = "https://files.pythonhosted.org/packages/bd/3d/22a4eaba214a917c80e04f6025d26143690f0419511e0116508e24b11c9b/regex-2025.11.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce22519c989bb72a7e6b36a199384c53db7722fe669ba891da75907fe3587db", size = 803272, upload-time = "2025-11-03T21:33:01.393Z" },
{ url = "https://files.pythonhosted.org/packages/84/b1/03188f634a409353a84b5ef49754b97dbcc0c0f6fd6c8ede505a8960a0a4/regex-2025.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:66d559b21d3640203ab9075797a55165d79017520685fb407b9234d72ab63c62", size = 787958, upload-time = "2025-11-03T21:33:03.379Z" },
{ url = "https://files.pythonhosted.org/packages/99/6a/27d072f7fbf6fadd59c64d210305e1ff865cc3b78b526fd147db768c553b/regex-2025.11.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:669dcfb2e38f9e8c69507bace46f4889e3abbfd9b0c29719202883c0a603598f", size = 859289, upload-time = "2025-11-03T21:33:05.374Z" },
{ url = "https://files.pythonhosted.org/packages/9a/70/1b3878f648e0b6abe023172dacb02157e685564853cc363d9961bcccde4e/regex-2025.11.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:32f74f35ff0f25a5021373ac61442edcb150731fbaa28286bbc8bb1582c89d02", size = 850026, upload-time = "2025-11-03T21:33:07.131Z" },
{ url = "https://files.pythonhosted.org/packages/dd/d5/68e25559b526b8baab8e66839304ede68ff6727237a47727d240006bd0ff/regex-2025.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e6c7a21dffba883234baefe91bc3388e629779582038f75d2a5be918e250f0ed", size = 789499, upload-time = "2025-11-03T21:33:09.141Z" },
{ url = "https://files.pythonhosted.org/packages/fc/df/43971264857140a350910d4e33df725e8c94dd9dee8d2e4729fa0d63d49e/regex-2025.11.3-cp314-cp314-win32.whl", hash = "sha256:795ea137b1d809eb6836b43748b12634291c0ed55ad50a7d72d21edf1cd565c4", size = 271604, upload-time = "2025-11-03T21:33:10.9Z" },
{ url = "https://files.pythonhosted.org/packages/01/6f/9711b57dc6894a55faf80a4c1b5aa4f8649805cb9c7aef46f7d27e2b9206/regex-2025.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:9f95fbaa0ee1610ec0fc6b26668e9917a582ba80c52cc6d9ada15e30aa9ab9ad", size = 280320, upload-time = "2025-11-03T21:33:12.572Z" },
{ url = "https://files.pythonhosted.org/packages/f1/7e/f6eaa207d4377481f5e1775cdeb5a443b5a59b392d0065f3417d31d80f87/regex-2025.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:dfec44d532be4c07088c3de2876130ff0fbeeacaa89a137decbbb5f665855a0f", size = 273372, upload-time = "2025-11-03T21:33:14.219Z" },
{ url = "https://files.pythonhosted.org/packages/c3/06/49b198550ee0f5e4184271cee87ba4dfd9692c91ec55289e6282f0f86ccf/regex-2025.11.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ba0d8a5d7f04f73ee7d01d974d47c5834f8a1b0224390e4fe7c12a3a92a78ecc", size = 491985, upload-time = "2025-11-03T21:33:16.555Z" },
{ url = "https://files.pythonhosted.org/packages/ce/bf/abdafade008f0b1c9da10d934034cb670432d6cf6cbe38bbb53a1cfd6cf8/regex-2025.11.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:442d86cf1cfe4faabf97db7d901ef58347efd004934da045c745e7b5bd57ac49", size = 292669, upload-time = "2025-11-03T21:33:18.32Z" },
{ url = "https://files.pythonhosted.org/packages/f9/ef/0c357bb8edbd2ad8e273fcb9e1761bc37b8acbc6e1be050bebd6475f19c1/regex-2025.11.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:fd0a5e563c756de210bb964789b5abe4f114dacae9104a47e1a649b910361536", size = 291030, upload-time = "2025-11-03T21:33:20.048Z" },
{ url = "https://files.pythonhosted.org/packages/79/06/edbb67257596649b8fb088d6aeacbcb248ac195714b18a65e018bf4c0b50/regex-2025.11.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bf3490bcbb985a1ae97b2ce9ad1c0f06a852d5b19dde9b07bdf25bf224248c95", size = 807674, upload-time = "2025-11-03T21:33:21.797Z" },
{ url = "https://files.pythonhosted.org/packages/f4/d9/ad4deccfce0ea336296bd087f1a191543bb99ee1c53093dcd4c64d951d00/regex-2025.11.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3809988f0a8b8c9dcc0f92478d6501fac7200b9ec56aecf0ec21f4a2ec4b6009", size = 873451, upload-time = "2025-11-03T21:33:23.741Z" },
{ url = "https://files.pythonhosted.org/packages/13/75/a55a4724c56ef13e3e04acaab29df26582f6978c000ac9cd6810ad1f341f/regex-2025.11.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f4ff94e58e84aedb9c9fce66d4ef9f27a190285b451420f297c9a09f2b9abee9", size = 914980, upload-time = "2025-11-03T21:33:25.999Z" },
{ url = "https://files.pythonhosted.org/packages/67/1e/a1657ee15bd9116f70d4a530c736983eed997b361e20ecd8f5ca3759d5c5/regex-2025.11.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7eb542fd347ce61e1321b0a6b945d5701528dca0cd9759c2e3bb8bd57e47964d", size = 812852, upload-time = "2025-11-03T21:33:27.852Z" },
{ url = "https://files.pythonhosted.org/packages/b8/6f/f7516dde5506a588a561d296b2d0044839de06035bb486b326065b4c101e/regex-2025.11.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:d6c2d5919075a1f2e413c00b056ea0c2f065b3f5fe83c3d07d325ab92dce51d6", size = 795566, upload-time = "2025-11-03T21:33:32.364Z" },
{ url = "https://files.pythonhosted.org/packages/d9/dd/3d10b9e170cc16fb34cb2cef91513cf3df65f440b3366030631b2984a264/regex-2025.11.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:3f8bf11a4827cc7ce5a53d4ef6cddd5ad25595d3c1435ef08f76825851343154", size = 868463, upload-time = "2025-11-03T21:33:34.459Z" },
{ url = "https://files.pythonhosted.org/packages/f5/8e/935e6beff1695aa9085ff83195daccd72acc82c81793df480f34569330de/regex-2025.11.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:22c12d837298651e5550ac1d964e4ff57c3f56965fc1812c90c9fb2028eaf267", size = 854694, upload-time = "2025-11-03T21:33:36.793Z" },
{ url = "https://files.pythonhosted.org/packages/92/12/10650181a040978b2f5720a6a74d44f841371a3d984c2083fc1752e4acf6/regex-2025.11.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:62ba394a3dda9ad41c7c780f60f6e4a70988741415ae96f6d1bf6c239cf01379", size = 799691, upload-time = "2025-11-03T21:33:39.079Z" },
{ url = "https://files.pythonhosted.org/packages/67/90/8f37138181c9a7690e7e4cb388debbd389342db3c7381d636d2875940752/regex-2025.11.3-cp314-cp314t-win32.whl", hash = "sha256:4bf146dca15cdd53224a1bf46d628bd7590e4a07fbb69e720d561aea43a32b38", size = 274583, upload-time = "2025-11-03T21:33:41.302Z" },
{ url = "https://files.pythonhosted.org/packages/8f/cd/867f5ec442d56beb56f5f854f40abcfc75e11d10b11fdb1869dd39c63aaf/regex-2025.11.3-cp314-cp314t-win_amd64.whl", hash = "sha256:adad1a1bcf1c9e76346e091d22d23ac54ef28e1365117d99521631078dfec9de", size = 284286, upload-time = "2025-11-03T21:33:43.324Z" },
{ url = "https://files.pythonhosted.org/packages/20/31/32c0c4610cbc070362bf1d2e4ea86d1ea29014d400a6d6c2486fcfd57766/regex-2025.11.3-cp314-cp314t-win_arm64.whl", hash = "sha256:c54f768482cef41e219720013cd05933b6f971d9562544d691c68699bf2b6801", size = 274741, upload-time = "2025-11-03T21:33:45.557Z" },
]
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "regex" },
]
[package.metadata]
requires-dist = [{ name = "regex", specifier = ">=2025.11.3" }]


@@ -0,0 +1,2 @@
[settings]
auto-import = true


@@ -0,0 +1,12 @@
# This test was originally written to
# check that a third party dependency
# gets priority over stdlib. But it was
# not clearly the right choice[1].
#
# 2026-01-09: We stuck with it for now,
# but it seems likely that we'll want
# to regress this task in favor of
# another.
#
# [1]: https://github.com/astral-sh/ruff/pull/22460#discussion_r2676343225
fullma<CURSOR: regex.fullmatch>


@@ -0,0 +1,7 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = [
"regex>=2025.11.3",
]


@@ -0,0 +1,78 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "regex"
version = "2025.11.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cc/a9/546676f25e573a4cf00fe8e119b78a37b6a8fe2dc95cda877b30889c9c45/regex-2025.11.3.tar.gz", hash = "sha256:1fedc720f9bb2494ce31a58a1631f9c82df6a09b49c19517ea5cc280b4541e01", size = 414669, upload-time = "2025-11-03T21:34:22.089Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e1/a7/dda24ebd49da46a197436ad96378f17df30ceb40e52e859fc42cac45b850/regex-2025.11.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c1e448051717a334891f2b9a620fe36776ebf3dd8ec46a0b877c8ae69575feb4", size = 489081, upload-time = "2025-11-03T21:31:55.9Z" },
{ url = "https://files.pythonhosted.org/packages/19/22/af2dc751aacf88089836aa088a1a11c4f21a04707eb1b0478e8e8fb32847/regex-2025.11.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9b5aca4d5dfd7fbfbfbdaf44850fcc7709a01146a797536a8f84952e940cca76", size = 291123, upload-time = "2025-11-03T21:31:57.758Z" },
{ url = "https://files.pythonhosted.org/packages/a3/88/1a3ea5672f4b0a84802ee9891b86743438e7c04eb0b8f8c4e16a42375327/regex-2025.11.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:04d2765516395cf7dda331a244a3282c0f5ae96075f728629287dfa6f76ba70a", size = 288814, upload-time = "2025-11-03T21:32:01.12Z" },
{ url = "https://files.pythonhosted.org/packages/fb/8c/f5987895bf42b8ddeea1b315c9fedcfe07cadee28b9c98cf50d00adcb14d/regex-2025.11.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d9903ca42bfeec4cebedba8022a7c97ad2aab22e09573ce9976ba01b65e4361", size = 798592, upload-time = "2025-11-03T21:32:03.006Z" },
{ url = "https://files.pythonhosted.org/packages/99/2a/6591ebeede78203fa77ee46a1c36649e02df9eaa77a033d1ccdf2fcd5d4e/regex-2025.11.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:639431bdc89d6429f6721625e8129413980ccd62e9d3f496be618a41d205f160", size = 864122, upload-time = "2025-11-03T21:32:04.553Z" },
{ url = "https://files.pythonhosted.org/packages/94/d6/be32a87cf28cf8ed064ff281cfbd49aefd90242a83e4b08b5a86b38e8eb4/regex-2025.11.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f117efad42068f9715677c8523ed2be1518116d1c49b1dd17987716695181efe", size = 912272, upload-time = "2025-11-03T21:32:06.148Z" },
{ url = "https://files.pythonhosted.org/packages/62/11/9bcef2d1445665b180ac7f230406ad80671f0fc2a6ffb93493b5dd8cd64c/regex-2025.11.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4aecb6f461316adf9f1f0f6a4a1a3d79e045f9b71ec76055a791affa3b285850", size = 803497, upload-time = "2025-11-03T21:32:08.162Z" },
{ url = "https://files.pythonhosted.org/packages/e5/a7/da0dc273d57f560399aa16d8a68ae7f9b57679476fc7ace46501d455fe84/regex-2025.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3b3a5f320136873cc5561098dfab677eea139521cb9a9e8db98b7e64aef44cbc", size = 787892, upload-time = "2025-11-03T21:32:09.769Z" },
{ url = "https://files.pythonhosted.org/packages/da/4b/732a0c5a9736a0b8d6d720d4945a2f1e6f38f87f48f3173559f53e8d5d82/regex-2025.11.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:75fa6f0056e7efb1f42a1c34e58be24072cb9e61a601340cc1196ae92326a4f9", size = 858462, upload-time = "2025-11-03T21:32:11.769Z" },
{ url = "https://files.pythonhosted.org/packages/0c/f5/a2a03df27dc4c2d0c769220f5110ba8c4084b0bfa9ab0f9b4fcfa3d2b0fc/regex-2025.11.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:dbe6095001465294f13f1adcd3311e50dd84e5a71525f20a10bd16689c61ce0b", size = 850528, upload-time = "2025-11-03T21:32:13.906Z" },
{ url = "https://files.pythonhosted.org/packages/d6/09/e1cd5bee3841c7f6eb37d95ca91cdee7100b8f88b81e41c2ef426910891a/regex-2025.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:454d9b4ae7881afbc25015b8627c16d88a597479b9dea82b8c6e7e2e07240dc7", size = 789866, upload-time = "2025-11-03T21:32:15.748Z" },
{ url = "https://files.pythonhosted.org/packages/eb/51/702f5ea74e2a9c13d855a6a85b7f80c30f9e72a95493260193c07f3f8d74/regex-2025.11.3-cp313-cp313-win32.whl", hash = "sha256:28ba4d69171fc6e9896337d4fc63a43660002b7da53fc15ac992abcf3410917c", size = 266189, upload-time = "2025-11-03T21:32:17.493Z" },
{ url = "https://files.pythonhosted.org/packages/8b/00/6e29bb314e271a743170e53649db0fdb8e8ff0b64b4f425f5602f4eb9014/regex-2025.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:bac4200befe50c670c405dc33af26dad5a3b6b255dd6c000d92fe4629f9ed6a5", size = 277054, upload-time = "2025-11-03T21:32:19.042Z" },
{ url = "https://files.pythonhosted.org/packages/25/f1/b156ff9f2ec9ac441710764dda95e4edaf5f36aca48246d1eea3f1fd96ec/regex-2025.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:2292cd5a90dab247f9abe892ac584cb24f0f54680c73fcb4a7493c66c2bf2467", size = 270325, upload-time = "2025-11-03T21:32:21.338Z" },
{ url = "https://files.pythonhosted.org/packages/20/28/fd0c63357caefe5680b8ea052131acbd7f456893b69cc2a90cc3e0dc90d4/regex-2025.11.3-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:1eb1ebf6822b756c723e09f5186473d93236c06c579d2cc0671a722d2ab14281", size = 491984, upload-time = "2025-11-03T21:32:23.466Z" },
{ url = "https://files.pythonhosted.org/packages/df/ec/7014c15626ab46b902b3bcc4b28a7bae46d8f281fc7ea9c95e22fcaaa917/regex-2025.11.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1e00ec2970aab10dc5db34af535f21fcf32b4a31d99e34963419636e2f85ae39", size = 292673, upload-time = "2025-11-03T21:32:25.034Z" },
{ url = "https://files.pythonhosted.org/packages/23/ab/3b952ff7239f20d05f1f99e9e20188513905f218c81d52fb5e78d2bf7634/regex-2025.11.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a4cb042b615245d5ff9b3794f56be4138b5adc35a4166014d31d1814744148c7", size = 291029, upload-time = "2025-11-03T21:32:26.528Z" },
{ url = "https://files.pythonhosted.org/packages/21/7e/3dc2749fc684f455f162dcafb8a187b559e2614f3826877d3844a131f37b/regex-2025.11.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:44f264d4bf02f3176467d90b294d59bf1db9fe53c141ff772f27a8b456b2a9ed", size = 807437, upload-time = "2025-11-03T21:32:28.363Z" },
{ url = "https://files.pythonhosted.org/packages/1b/0b/d529a85ab349c6a25d1ca783235b6e3eedf187247eab536797021f7126c6/regex-2025.11.3-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7be0277469bf3bd7a34a9c57c1b6a724532a0d235cd0dc4e7f4316f982c28b19", size = 873368, upload-time = "2025-11-03T21:32:30.4Z" },
{ url = "https://files.pythonhosted.org/packages/7d/18/2d868155f8c9e3e9d8f9e10c64e9a9f496bb8f7e037a88a8bed26b435af6/regex-2025.11.3-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0d31e08426ff4b5b650f68839f5af51a92a5b51abd8554a60c2fbc7c71f25d0b", size = 914921, upload-time = "2025-11-03T21:32:32.123Z" },
{ url = "https://files.pythonhosted.org/packages/2d/71/9d72ff0f354fa783fe2ba913c8734c3b433b86406117a8db4ea2bf1c7a2f/regex-2025.11.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e43586ce5bd28f9f285a6e729466841368c4a0353f6fd08d4ce4630843d3648a", size = 812708, upload-time = "2025-11-03T21:32:34.305Z" },
{ url = "https://files.pythonhosted.org/packages/e7/19/ce4bf7f5575c97f82b6e804ffb5c4e940c62609ab2a0d9538d47a7fdf7d4/regex-2025.11.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:0f9397d561a4c16829d4e6ff75202c1c08b68a3bdbfe29dbfcdb31c9830907c6", size = 795472, upload-time = "2025-11-03T21:32:36.364Z" },
{ url = "https://files.pythonhosted.org/packages/03/86/fd1063a176ffb7b2315f9a1b08d17b18118b28d9df163132615b835a26ee/regex-2025.11.3-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:dd16e78eb18ffdb25ee33a0682d17912e8cc8a770e885aeee95020046128f1ce", size = 868341, upload-time = "2025-11-03T21:32:38.042Z" },
{ url = "https://files.pythonhosted.org/packages/12/43/103fb2e9811205e7386366501bc866a164a0430c79dd59eac886a2822950/regex-2025.11.3-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:ffcca5b9efe948ba0661e9df0fa50d2bc4b097c70b9810212d6b62f05d83b2dd", size = 854666, upload-time = "2025-11-03T21:32:40.079Z" },
{ url = "https://files.pythonhosted.org/packages/7d/22/e392e53f3869b75804762c7c848bd2dd2abf2b70fb0e526f58724638bd35/regex-2025.11.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c56b4d162ca2b43318ac671c65bd4d563e841a694ac70e1a976ac38fcf4ca1d2", size = 799473, upload-time = "2025-11-03T21:32:42.148Z" },
{ url = "https://files.pythonhosted.org/packages/4f/f9/8bd6b656592f925b6845fcbb4d57603a3ac2fb2373344ffa1ed70aa6820a/regex-2025.11.3-cp313-cp313t-win32.whl", hash = "sha256:9ddc42e68114e161e51e272f667d640f97e84a2b9ef14b7477c53aac20c2d59a", size = 268792, upload-time = "2025-11-03T21:32:44.13Z" },
{ url = "https://files.pythonhosted.org/packages/e5/87/0e7d603467775ff65cd2aeabf1b5b50cc1c3708556a8b849a2fa4dd1542b/regex-2025.11.3-cp313-cp313t-win_amd64.whl", hash = "sha256:7a7c7fdf755032ffdd72c77e3d8096bdcb0eb92e89e17571a196f03d88b11b3c", size = 280214, upload-time = "2025-11-03T21:32:45.853Z" },
{ url = "https://files.pythonhosted.org/packages/8d/d0/2afc6f8e94e2b64bfb738a7c2b6387ac1699f09f032d363ed9447fd2bb57/regex-2025.11.3-cp313-cp313t-win_arm64.whl", hash = "sha256:df9eb838c44f570283712e7cff14c16329a9f0fb19ca492d21d4b7528ee6821e", size = 271469, upload-time = "2025-11-03T21:32:48.026Z" },
{ url = "https://files.pythonhosted.org/packages/31/e9/f6e13de7e0983837f7b6d238ad9458800a874bf37c264f7923e63409944c/regex-2025.11.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9697a52e57576c83139d7c6f213d64485d3df5bf84807c35fa409e6c970801c6", size = 489089, upload-time = "2025-11-03T21:32:50.027Z" },
{ url = "https://files.pythonhosted.org/packages/a3/5c/261f4a262f1fa65141c1b74b255988bd2fa020cc599e53b080667d591cfc/regex-2025.11.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e18bc3f73bd41243c9b38a6d9f2366cd0e0137a9aebe2d8ff76c5b67d4c0a3f4", size = 291059, upload-time = "2025-11-03T21:32:51.682Z" },
{ url = "https://files.pythonhosted.org/packages/8e/57/f14eeb7f072b0e9a5a090d1712741fd8f214ec193dba773cf5410108bb7d/regex-2025.11.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:61a08bcb0ec14ff4e0ed2044aad948d0659604f824cbd50b55e30b0ec6f09c73", size = 288900, upload-time = "2025-11-03T21:32:53.569Z" },
{ url = "https://files.pythonhosted.org/packages/3c/6b/1d650c45e99a9b327586739d926a1cd4e94666b1bd4af90428b36af66dc7/regex-2025.11.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9c30003b9347c24bcc210958c5d167b9e4f9be786cb380a7d32f14f9b84674f", size = 799010, upload-time = "2025-11-03T21:32:55.222Z" },
{ url = "https://files.pythonhosted.org/packages/99/ee/d66dcbc6b628ce4e3f7f0cbbb84603aa2fc0ffc878babc857726b8aab2e9/regex-2025.11.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4e1e592789704459900728d88d41a46fe3969b82ab62945560a31732ffc19a6d", size = 864893, upload-time = "2025-11-03T21:32:57.239Z" },
{ url = "https://files.pythonhosted.org/packages/bf/2d/f238229f1caba7ac87a6c4153d79947fb0261415827ae0f77c304260c7d3/regex-2025.11.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6538241f45eb5a25aa575dbba1069ad786f68a4f2773a29a2bd3dd1f9de787be", size = 911522, upload-time = "2025-11-03T21:32:59.274Z" },
{ url = "https://files.pythonhosted.org/packages/bd/3d/22a4eaba214a917c80e04f6025d26143690f0419511e0116508e24b11c9b/regex-2025.11.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce22519c989bb72a7e6b36a199384c53db7722fe669ba891da75907fe3587db", size = 803272, upload-time = "2025-11-03T21:33:01.393Z" },
{ url = "https://files.pythonhosted.org/packages/84/b1/03188f634a409353a84b5ef49754b97dbcc0c0f6fd6c8ede505a8960a0a4/regex-2025.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:66d559b21d3640203ab9075797a55165d79017520685fb407b9234d72ab63c62", size = 787958, upload-time = "2025-11-03T21:33:03.379Z" },
{ url = "https://files.pythonhosted.org/packages/99/6a/27d072f7fbf6fadd59c64d210305e1ff865cc3b78b526fd147db768c553b/regex-2025.11.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:669dcfb2e38f9e8c69507bace46f4889e3abbfd9b0c29719202883c0a603598f", size = 859289, upload-time = "2025-11-03T21:33:05.374Z" },
{ url = "https://files.pythonhosted.org/packages/9a/70/1b3878f648e0b6abe023172dacb02157e685564853cc363d9961bcccde4e/regex-2025.11.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:32f74f35ff0f25a5021373ac61442edcb150731fbaa28286bbc8bb1582c89d02", size = 850026, upload-time = "2025-11-03T21:33:07.131Z" },
{ url = "https://files.pythonhosted.org/packages/dd/d5/68e25559b526b8baab8e66839304ede68ff6727237a47727d240006bd0ff/regex-2025.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e6c7a21dffba883234baefe91bc3388e629779582038f75d2a5be918e250f0ed", size = 789499, upload-time = "2025-11-03T21:33:09.141Z" },
{ url = "https://files.pythonhosted.org/packages/fc/df/43971264857140a350910d4e33df725e8c94dd9dee8d2e4729fa0d63d49e/regex-2025.11.3-cp314-cp314-win32.whl", hash = "sha256:795ea137b1d809eb6836b43748b12634291c0ed55ad50a7d72d21edf1cd565c4", size = 271604, upload-time = "2025-11-03T21:33:10.9Z" },
{ url = "https://files.pythonhosted.org/packages/01/6f/9711b57dc6894a55faf80a4c1b5aa4f8649805cb9c7aef46f7d27e2b9206/regex-2025.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:9f95fbaa0ee1610ec0fc6b26668e9917a582ba80c52cc6d9ada15e30aa9ab9ad", size = 280320, upload-time = "2025-11-03T21:33:12.572Z" },
{ url = "https://files.pythonhosted.org/packages/f1/7e/f6eaa207d4377481f5e1775cdeb5a443b5a59b392d0065f3417d31d80f87/regex-2025.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:dfec44d532be4c07088c3de2876130ff0fbeeacaa89a137decbbb5f665855a0f", size = 273372, upload-time = "2025-11-03T21:33:14.219Z" },
{ url = "https://files.pythonhosted.org/packages/c3/06/49b198550ee0f5e4184271cee87ba4dfd9692c91ec55289e6282f0f86ccf/regex-2025.11.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ba0d8a5d7f04f73ee7d01d974d47c5834f8a1b0224390e4fe7c12a3a92a78ecc", size = 491985, upload-time = "2025-11-03T21:33:16.555Z" },
{ url = "https://files.pythonhosted.org/packages/ce/bf/abdafade008f0b1c9da10d934034cb670432d6cf6cbe38bbb53a1cfd6cf8/regex-2025.11.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:442d86cf1cfe4faabf97db7d901ef58347efd004934da045c745e7b5bd57ac49", size = 292669, upload-time = "2025-11-03T21:33:18.32Z" },
{ url = "https://files.pythonhosted.org/packages/f9/ef/0c357bb8edbd2ad8e273fcb9e1761bc37b8acbc6e1be050bebd6475f19c1/regex-2025.11.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:fd0a5e563c756de210bb964789b5abe4f114dacae9104a47e1a649b910361536", size = 291030, upload-time = "2025-11-03T21:33:20.048Z" },
{ url = "https://files.pythonhosted.org/packages/79/06/edbb67257596649b8fb088d6aeacbcb248ac195714b18a65e018bf4c0b50/regex-2025.11.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bf3490bcbb985a1ae97b2ce9ad1c0f06a852d5b19dde9b07bdf25bf224248c95", size = 807674, upload-time = "2025-11-03T21:33:21.797Z" },
{ url = "https://files.pythonhosted.org/packages/f4/d9/ad4deccfce0ea336296bd087f1a191543bb99ee1c53093dcd4c64d951d00/regex-2025.11.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3809988f0a8b8c9dcc0f92478d6501fac7200b9ec56aecf0ec21f4a2ec4b6009", size = 873451, upload-time = "2025-11-03T21:33:23.741Z" },
{ url = "https://files.pythonhosted.org/packages/13/75/a55a4724c56ef13e3e04acaab29df26582f6978c000ac9cd6810ad1f341f/regex-2025.11.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f4ff94e58e84aedb9c9fce66d4ef9f27a190285b451420f297c9a09f2b9abee9", size = 914980, upload-time = "2025-11-03T21:33:25.999Z" },
{ url = "https://files.pythonhosted.org/packages/67/1e/a1657ee15bd9116f70d4a530c736983eed997b361e20ecd8f5ca3759d5c5/regex-2025.11.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7eb542fd347ce61e1321b0a6b945d5701528dca0cd9759c2e3bb8bd57e47964d", size = 812852, upload-time = "2025-11-03T21:33:27.852Z" },
{ url = "https://files.pythonhosted.org/packages/b8/6f/f7516dde5506a588a561d296b2d0044839de06035bb486b326065b4c101e/regex-2025.11.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:d6c2d5919075a1f2e413c00b056ea0c2f065b3f5fe83c3d07d325ab92dce51d6", size = 795566, upload-time = "2025-11-03T21:33:32.364Z" },
{ url = "https://files.pythonhosted.org/packages/d9/dd/3d10b9e170cc16fb34cb2cef91513cf3df65f440b3366030631b2984a264/regex-2025.11.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:3f8bf11a4827cc7ce5a53d4ef6cddd5ad25595d3c1435ef08f76825851343154", size = 868463, upload-time = "2025-11-03T21:33:34.459Z" },
{ url = "https://files.pythonhosted.org/packages/f5/8e/935e6beff1695aa9085ff83195daccd72acc82c81793df480f34569330de/regex-2025.11.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:22c12d837298651e5550ac1d964e4ff57c3f56965fc1812c90c9fb2028eaf267", size = 854694, upload-time = "2025-11-03T21:33:36.793Z" },
{ url = "https://files.pythonhosted.org/packages/92/12/10650181a040978b2f5720a6a74d44f841371a3d984c2083fc1752e4acf6/regex-2025.11.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:62ba394a3dda9ad41c7c780f60f6e4a70988741415ae96f6d1bf6c239cf01379", size = 799691, upload-time = "2025-11-03T21:33:39.079Z" },
{ url = "https://files.pythonhosted.org/packages/67/90/8f37138181c9a7690e7e4cb388debbd389342db3c7381d636d2875940752/regex-2025.11.3-cp314-cp314t-win32.whl", hash = "sha256:4bf146dca15cdd53224a1bf46d628bd7590e4a07fbb69e720d561aea43a32b38", size = 274583, upload-time = "2025-11-03T21:33:41.302Z" },
{ url = "https://files.pythonhosted.org/packages/8f/cd/867f5ec442d56beb56f5f854f40abcfc75e11d10b11fdb1869dd39c63aaf/regex-2025.11.3-cp314-cp314t-win_amd64.whl", hash = "sha256:adad1a1bcf1c9e76346e091d22d23ac54ef28e1365117d99521631078dfec9de", size = 284286, upload-time = "2025-11-03T21:33:43.324Z" },
{ url = "https://files.pythonhosted.org/packages/20/31/32c0c4610cbc070362bf1d2e4ea86d1ea29014d400a6d6c2486fcfd57766/regex-2025.11.3-cp314-cp314t-win_arm64.whl", hash = "sha256:c54f768482cef41e219720013cd05933b6f971d9562544d691c68699bf2b6801", size = 274741, upload-time = "2025-11-03T21:33:45.557Z" },
]
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "regex" },
]
[package.metadata]
requires-dist = [{ name = "regex", specifier = ">=2025.11.3" }]

@@ -0,0 +1,2 @@
[settings]
auto-import = true

@@ -0,0 +1,10 @@
scope1 = 1
def x():
scope2 = 1
def xx():
scope3 = 1
def xxx():
# We specifically want `scope3` to be
# suggested first here, since that's in
# the "tighter" scope.
scope<CURSOR: scope3>
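The fixture above encodes the expectation that candidates from the nearest enclosing scope rank first. A minimal sketch of that ranking, with hypothetical names and scoring rather than ty's actual implementation:

```rust
/// Hypothetical completion candidate: a name plus the nesting depth
/// of the scope it was found in (0 = module, higher = more deeply nested).
struct Candidate {
    name: &'static str,
    scope_depth: u32,
}

/// Rank candidates so names from tighter (deeper) scopes come first;
/// ties fall back to name order for determinism.
fn rank(mut candidates: Vec<Candidate>) -> Vec<&'static str> {
    candidates.sort_by(|a, b| b.scope_depth.cmp(&a.scope_depth).then(a.name.cmp(b.name)));
    candidates.into_iter().map(|c| c.name).collect()
}

fn main() {
    let ranked = rank(vec![
        Candidate { name: "scope1", scope_depth: 0 },
        Candidate { name: "scope2", scope_depth: 1 },
        Candidate { name: "scope3", scope_depth: 2 },
    ]);
    // `scope3` lives in the tightest scope, so it is suggested first.
    assert_eq!(ranked, ["scope3", "scope2", "scope1"]);
}
```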

@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []

@@ -0,0 +1,8 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }

@@ -1,12 +0,0 @@
# This one demands that `TypeVa` complete to `typing.TypeVar`
# even though there is also an `ast.TypeVar`. Getting this one
# right seems tricky, and probably requires module-specific
# heuristics.
#
# ref: https://github.com/astral-sh/ty/issues/1274#issuecomment-3345884227
TypeVa<CURSOR: typing.TypeVar>
# This is a similar case of `ctypes.cast` being preferred over
# `typing.cast`. Maybe `typing` should just get a slightly higher
# weight than most other stdlib modules?
cas<CURSOR: typing.cast>

@@ -0,0 +1,2 @@
[settings]
auto-import = true

@@ -0,0 +1,19 @@
# Many of these came from discussion in:
# <https://github.com/astral-sh/ty/issues/1274>
# We should prefer `typing` over `asyncio` here.
class Foo(Protoco<CURSOR: typing.Protocol>): ...
# We should prefer `typing` over `ty_extensions`
# or `typing_extensions`.
reveal_<CURSOR: typing.reveal_type>
# We should prefer `typing` over `ast`.
TypeVa<CURSOR: typing.TypeVar>
# We should prefer `typing` over `ctypes`.
cast<CURSOR: typing.cast>
# We should prefer a non-stdlib project import
# over a stdlib `typing` import.
NoRetur<CURSOR: sub1.NoReturn>
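The comments in this fixture describe a preference order for auto-import sources: project imports over the stdlib, and `typing` over other stdlib modules. One way to sketch such an ordering is a weight function; the weights and names here are hypothetical, not ty's actual scoring:

```rust
/// Hypothetical weight for an auto-import source module: lower sorts first.
/// This mirrors the preferences described in the fixture, not ty's scoring.
fn module_weight(module: &str, is_stdlib: bool) -> u32 {
    if !is_stdlib {
        0 // a non-stdlib project import beats any stdlib import
    } else if module == "typing" {
        1 // within the stdlib, `typing` beats e.g. `ast`, `ctypes`, `asyncio`
    } else {
        2
    }
}

fn main() {
    let mut sources = vec![("ast", true), ("typing", true), ("sub1", false)];
    sources.sort_by_key(|&(module, is_stdlib)| module_weight(module, is_stdlib));
    // The project-local `sub1` wins, then `typing`, then the rest of the stdlib.
    assert_eq!(sources[0].0, "sub1");
    assert_eq!(sources[1].0, "typing");
}
```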

@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []

@@ -0,0 +1 @@
NoReturn = 1

@@ -0,0 +1,8 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }

@@ -27,6 +27,7 @@ ty_project = { workspace = true, features = ["testing"] }
ty_python_semantic = { workspace = true }
ty_vendored = { workspace = true }
compact_str = { workspace = true }
get-size2 = { workspace = true }
itertools = { workspace = true }
rayon = { workspace = true }

@@ -64,7 +64,11 @@ pub fn all_symbols<'db>(
continue;
}
s.spawn(move |_| {
let symbols_for_file_span = tracing::debug_span!(parent: all_symbols_span, "symbols_for_file_global_only", ?file);
let symbols_for_file_span = tracing::debug_span!(
parent: all_symbols_span,
"symbols_for_file_global_only",
path = %file.path(&*db),
);
let _entered = symbols_for_file_span.entered();
if query.is_match_symbol_name(module.name(&*db)) {
@@ -94,11 +98,13 @@ pub fn all_symbols<'db>(
let key1 = (
s1.name_in_file()
.unwrap_or_else(|| s1.module().name(db).as_str()),
s1.module().name(db).as_str(),
s1.file.path(db).as_str(),
);
let key2 = (
s2.name_in_file()
.unwrap_or_else(|| s2.module().name(db).as_str()),
s2.module().name(db).as_str(),
s2.file.path(db).as_str(),
);
key1.cmp(&key2)
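The hunk above extends the symbol sort key with the file path so the ordering stays deterministic when the same name appears in several files. The lexicographic tuple comparison can be sketched with standalone types (hypothetical stand-ins for ty's internals):

```rust
/// Hypothetical flattened symbol record standing in for ty's internal type.
struct Symbol<'a> {
    name_in_file: Option<&'a str>,
    module_name: &'a str,
    file_path: &'a str,
}

/// Build the three-part sort key: displayed name, then module, then path.
/// Module-level entries with no in-file name fall back to the module name.
fn sort_key<'a>(s: &Symbol<'a>) -> (&'a str, &'a str, &'a str) {
    (
        s.name_in_file.unwrap_or(s.module_name),
        s.module_name,
        s.file_path,
    )
}

fn main() {
    let mut symbols = vec![
        Symbol { name_in_file: Some("cast"), module_name: "typing", file_path: "typing.pyi" },
        Symbol { name_in_file: Some("cast"), module_name: "ctypes", file_path: "ctypes/__init__.pyi" },
        Symbol { name_in_file: None, module_name: "ast", file_path: "ast.pyi" },
    ];
    // Tuples compare lexicographically: equal names tie-break on module, then path.
    symbols.sort_by(|s1, s2| sort_key(s1).cmp(&sort_key(s2)));
    assert_eq!(symbols[0].module_name, "ast"); // "ast" < "cast"
    assert_eq!(symbols[1].module_name, "ctypes"); // same name, "ctypes" < "typing"
}
```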

File diff suppressed because it is too large.

@@ -34,8 +34,8 @@ pub struct ParameterDetails<'db> {
pub name: String,
/// The parameter label in the signature (e.g., "param1: str")
pub label: String,
/// The annotated type of the parameter, if any
pub ty: Option<Type<'db>>,
/// The annotated type of the parameter. If no annotation was provided, this is `Unknown`.
pub ty: Type<'db>,
/// Documentation specific to the parameter, typically extracted from the
/// function's docstring
pub documentation: Option<String>,
@@ -237,7 +237,7 @@ fn create_parameters_from_offsets<'db>(
docstring: Option<&Docstring>,
parameter_names: &[String],
parameter_kinds: &[ParameterKind],
parameter_types: &[Option<Type<'db>>],
parameter_types: &[Type<'db>],
) -> Vec<ParameterDetails<'db>> {
// Extract parameter documentation from the function's docstring if available.
let param_docs = if let Some(docstring) = docstring {
@@ -264,12 +264,11 @@ fn create_parameters_from_offsets<'db>(
parameter_kinds.get(i),
Some(ParameterKind::PositionalOnly { .. })
);
let ty = parameter_types.get(i).copied().flatten();
ParameterDetails {
name: param_name.to_string(),
label,
ty,
ty: parameter_types[i],
documentation: param_docs.get(param_name).cloned(),
is_positional_only,
}
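This hunk replaces the optional annotation type (`Option<Type<'db>>`) with an always-present type that defaults to `Unknown` when no annotation was written. The shape of that change can be sketched with a hypothetical stand-in for ty's `Type`:

```rust
/// Hypothetical stand-in for ty's `Type`: an explicit `Unknown` variant
/// replaces the `None` case of `Option<Type>`.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Type {
    Unknown,
    Named(&'static str),
}

struct ParameterDetails {
    name: &'static str,
    ty: Type,
}

/// Convert a possibly-missing annotation into an always-present type.
fn parameter(name: &'static str, annotation: Option<Type>) -> ParameterDetails {
    ParameterDetails {
        name,
        // Callers no longer need to handle `None`: absence becomes `Unknown`.
        ty: annotation.unwrap_or(Type::Unknown),
    }
}

fn main() {
    assert_eq!(parameter("x", Some(Type::Named("str"))).ty, Type::Named("str"));
    assert_eq!(parameter("y", None).ty, Type::Unknown);
}
```

Folding the `None` case into the type itself means downstream code (signature help, hover) handles one representation instead of an optional one.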

Some files were not shown because too many files have changed in this diff.