Compare commits

..

61 Commits

Author SHA1 Message Date
Alex Waygood
c872fe4c08 generalize fast path more...? 2026-01-10 15:26:09 +00:00
Alex Waygood
f497167798 use an FxOrderSet in UnionType 2026-01-10 14:07:13 +00:00
Charlie Marsh
046c5a46d8 [ty] Support dataclass_transform as a function call (#22378)
## Summary

Instead of just as a decorator.

Closes https://github.com/astral-sh/ty/issues/2319.
2026-01-10 08:45:45 -05:00
Micha Reiser
cfed34334c Update mypy primer pin (#22490) 2026-01-10 14:00:00 +01:00
Charlie Marsh
11cc324449 [ty] Detect invalid @total_ordering applications in non-decorator contexts (#22486)
## Summary

E.g., `ValidOrderedClass = total_ordering(HasOrderingMethod)`.
2026-01-09 19:37:58 -05:00
Charlie Marsh
c88e1a0663 [ty] Avoid emitting Liskov repeated violations from grandparent to child (#22484)
## Summary

If parent violates LSP against grandparent, and child has the same
violation (but matches parent), we no longer flag the LSP violation on
child, since it can't be fixed without violating parent.

If parent violates LSP against grandparent, and child violates LSP
against both parent and grandparent, we emit two diagnostics (one for
each violation).

If parent violates LSP against grandparent, and child violates LSP
against parent (but not grandparent), we flag it.
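The three cases above reduce to a small decision rule. A hedged sketch (the boolean inputs stand in for ty's actual Liskov signature-compatibility checks; names are illustrative):

```python
def child_lsp_diagnostics(ok_vs_parent, ok_vs_grandparent, parent_violates_grandparent):
    """Which LSP diagnostics to emit on the child class (illustrative sketch)."""
    diags = []
    if not ok_vs_parent:
        diags.append("child-vs-parent")
    if not ok_vs_grandparent and not (ok_vs_parent and parent_violates_grandparent):
        # Suppressed when the child matches the parent and the parent already
        # violates the grandparent: the child can't be fixed without breaking
        # compatibility with the parent.
        diags.append("child-vs-grandparent")
    return diags
```

Feeding in the three cases from the description yields no diagnostic, two diagnostics, and one diagnostic, respectively.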

Closes https://github.com/astral-sh/ty/issues/2000.
2026-01-09 19:27:57 -05:00
William Woodruff
2c7ac17b1e Remove two old zizmor ignore comments (#22485) 2026-01-09 18:02:44 -05:00
Matthias Schoettle
a0f2cd0ded Add language: golang to actionlint pre-commit hook (#22483) 2026-01-09 19:30:00 +00:00
Alex Waygood
dc61104726 [ty] Derive Default in a few more places in place.rs (#22481) 2026-01-09 18:37:36 +00:00
Jason K Hall
f40c578ffb [ruff] document RUF100 trailing comment fix behavior (#22479)
## Summary
Ruff's `--fix` for `RUF100` can inadvertently remove trailing comments
(e.g., `pylint` or `mypy` suppressions) by interpreting them as
descriptions. This PR adds a "Conflict with other linters" section to
the rule documentation to clarify this behavior and provide the
double-hash (`# noqa # pylint`) workaround.
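The documented workaround looks like this (the specific rule codes are chosen for illustration):

```python
# With a single comment, RUF100's fix may read the trailing pylint pragma as a
# "description" of the noqa and delete it along with an unused code.
# A second `#` makes the pragma a separate comment that survives the fix:
import sys  # noqa: F401 # pylint: disable=unused-import
```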

## Fixes
Fixes #20762
2026-01-09 09:01:09 -08:00
Leo Hayashi
baaf6966f6 [ruff] Split WASM build and publish into separate workflows (#12387) (#22476)
Co-authored-by: Hayashi Reo <reo@wantedly.com>
2026-01-09 17:41:13 +01:00
Zanie Blue
a6df4a3be7 Add llms.txt support for documentation (#22463)
Matching uv.

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-09 09:56:41 -06:00
Micha Reiser
1094009790 Fix configuration path in --show-settings (#22478) 2026-01-09 16:28:57 +01:00
Andrew Gallant
10eb3d52d5 [ty] Rank top-level module symbols above most other symbols
This makes it so, e.g., `os<CURSOR>` will suggest the top-level stdlib
`os` module even if there is an `os` symbol elsewhere in your project.

The way this is done is somewhat overwrought, but it's done to avoid
suggesting top-level modules over other symbols already in scope.

Fixes astral-sh/issues#1852
2026-01-09 10:11:37 -05:00
Andrew Gallant
91d24ebb92 [ty] Give exact completion matches very high ranking
Aside from ruling out "impossible" completions in the current context,
we give completions with an exact match the highest possible ranking.
2026-01-09 09:55:32 -05:00
Andrew Gallant
e68fba20a9 [ty] Remove type-var-typing-over-ast evaluation task
This is now strictly redundant with the `typing-gets-priority` task.
2026-01-09 09:55:32 -05:00
Andrew Gallant
64117c1146 [ty] Improve completion ranking based on origin of symbols
Part of this was already done, but it was half-assed. We now look at the
search path that a symbol came from and centralize a symbol's origin
classification.

The preference ordering here is maybe not the right one, but we can
iterate as users give us feedback. Note also that the preference
ordering based on the origin is pretty low in the relevance sorting.
This means that other more specific criteria will and can override this.

This results in some nice improvements to our evaluation tasks.
2026-01-09 09:55:32 -05:00
Andrew Gallant
2196ef3a33 [ty] Add package priority completion evaluation tasks
This commit adds two new tests. One checks that a symbol in the current
project gets priority over a symbol in the standard library. Another
checks that a symbol in a third party dependency gets priority over a
symbol in the standard library. We don't get either of these right
today.

Note that these comparisons are done ceteris paribus. A symbol from the
standard library could still be ranked above a symbol elsewhere.
(Although I believe currently this is somewhat rare.)
2026-01-09 09:55:32 -05:00
Andrew Gallant
c36397031b [ty] Actually ignore hidden directories in completion evaluation
I apparently don't know how to use my own API. Previously,
we would skip, e.g., `.venv`, but still descend into it.

This was annoying in practice because I sometimes have an
environment in one of the truth task directories. The eval
command should ignore that entirely, but it ended up
choking on it without properly ignoring hidden files
and directories.
2026-01-09 09:55:32 -05:00
Andrew Gallant
eef34958f9 [ty] More sensible sorting for results returned by all_symbols
Previously, we would only sort by name and file path (unless the symbol
refers to an entire module). This led to somewhat confusing ordering,
since the file path is absolute and can be anything.

Sorting by module name after symbol name gives a more predictable
ordering in common practice.

This is mostly just an improvement for debugging purposes, i.e., when
looking at the output of `all_symbols` directly. This mostly shouldn't
impact completion ordering since completions do their own ranking.
2026-01-09 09:55:32 -05:00
Andrew Gallant
f1bd5f1941 [ty] Rank unimported symbols from typing higher
This addresses a number of minor annoyances where users really want
imports from `typing` most of the time.

For example:
https://github.com/astral-sh/ty/issues/1274#issuecomment-3345884227
2026-01-09 09:55:32 -05:00
Andrew Gallant
56862f8241 [ty] Refactor ranking
This moves the information we want to use to rank completions into
`Completion` itself. (This was the primary motivation for creating a
`CompletionBuilder` in the first place.)

The principal advantage here is that we now only need to compute the
relevance information for each completion exactly once. Previously, we
were computing it on every comparison, which might end up doing
redundant work for a not insignificant number of completions.
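Computing the key once per item rather than per comparison is Python's decorate-sort pattern; a sketch with an invented scoring function:

```python
calls = 0

def relevance(name):
    """Hypothetical score: an exact match for the query "os" first, then shorter names."""
    global calls
    calls += 1
    return (name != "os", len(name), name)

completions = ["ossaudiodev", "os", "os_helper"]
# `key=` computes the relevance exactly once per completion,
# not on every pairwise comparison.
ranked = sorted(completions, key=relevance)
```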

The relevance information is also specifically constructed from the
builder so that, in the future, we might choose to short-circuit
construction if we know we'll never send it back to the client (e.g.,
its ranking is worse than the lowest ranked completion). But we don't
implement that optimization yet.
2026-01-09 09:55:32 -05:00
Andrew Gallant
e9cc2f6f42 [ty] Refactor completion construction to use a builder
I want to be able to attach extra data to each `Completion`, but not
burden callers with the need to construct it. This commit helps get us
to that point by requiring callers to use a `CompletionBuilder` for
construction instead of a `Completion` itself.

I think this will also help in the future if it proves to be the case
that we can improve performance by delaying work until we actually build
a `Completion`, which might only happen if we know we won't throw it
out. But we aren't quite there yet.

This also lets us tighten things up a little bit and makes completion
construction less noisy. The downside is that callers no longer need to
consider "every" completion field.

There should not be any behavior changes here.
2026-01-09 09:55:32 -05:00
Andrew Gallant
8dd56d4264 [ty] Rename internal completion Rank type to Relevance
I think "relevance" captures the meaning of this type more precisely.
2026-01-09 09:55:32 -05:00
Andrew Gallant
d4c1b0ccc7 [ty] Add evaluation tasks for a few cases
I went through https://github.com/astral-sh/ty/issues/1274 and tried to
extract what I could into eval tasks. Some of the suggestions from that
issue have already been done, but most haven't.

This captures the status quo.
2026-01-09 09:55:32 -05:00
Micha Reiser
e61657ff3c [ty] Enable unused-type-ignore-comment by default (#22474) 2026-01-09 10:58:43 +00:00
Micha Reiser
ba5dd5837c [ty] Pass slice to specialize (#22421) 2026-01-09 09:45:39 +01:00
Micha Reiser
c5f6a74da5 [ty] Fix goto definition for relative imports in third-party files (#22457) 2026-01-09 09:30:31 +01:00
Micha Reiser
b3cde98cd1 [ty] Update salsa (#22473) 2026-01-09 09:21:45 +01:00
Carl Meyer
f9f7a6901b Complete minor TODO in ty_python_semantic crate (#22468)
This TODO is very old -- we have long since recorded this definition.
Updating the test to actually assert the declaration requires a new
helper method for declarations, to complement the existing
`first_public_binding` helper.

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-08 16:04:34 -08:00
Dylan
c920cf8cdb Bump 0.14.11 (#22462) 2026-01-08 12:51:47 -06:00
Micha Reiser
bb757b5a79 [ty] Don't show diagnostics for excluded files (#22455) 2026-01-08 18:27:28 +01:00
Charlie Marsh
1f49e8ef51 Include configured src directories when resolving graphs (#22451)
## Summary

This PR augments the detected source paths with the user-configured
`src` when computing roots for `ruff analyze graph`.
2026-01-08 15:19:15 +00:00
Douglas Creager
701f5134ab [ty] Only consider fully static pivots when deriving transitive constraints (#22444)
When working with constraint sets, we track transitive relationships
between the constraints in the set. For instance, in `S ≤ int ∧ int ≤
T`, we can infer that `S ≤ T`. However, we should only consider fully
static types when looking for a "pivot" for this kind of transitive
relationship. The same pattern does not hold for `S ≤ Any ∧ Any ≤ T`;
because the two `Any`s can materialize to different types, we cannot
infer that `S ≤ T`.
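A minimal sketch of the pivot rule (string type names and the fully-static check are illustrative, not ty's real representation):

```python
def transitive_closure_step(constraints, is_fully_static):
    """From S <= P and P <= T, derive S <= T, but only for fully static pivots P."""
    derived = set(constraints)
    for lhs, pivot in constraints:
        for pivot2, rhs in constraints:
            if pivot == pivot2 and is_fully_static(pivot):
                derived.add((lhs, rhs))
    return derived

fully_static = lambda ty: ty != "Any"
result = transitive_closure_step(
    {("S", "int"), ("int", "T"), ("U", "Any"), ("Any", "V")}, fully_static
)
# ("S", "T") is derived through the `int` pivot; ("U", "V") is not,
# because the two `Any`s may materialize to different types.
```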

Fixes https://github.com/astral-sh/ty/issues/2371
2026-01-08 09:31:55 -05:00
Micha Reiser
eea9ad8352 Pin maturin version (#22454) 2026-01-08 12:39:51 +01:00
Alex Waygood
eeac2bd3ee [ty] Optimize union building for unions with many enum-literal members (#22363) 2026-01-08 10:50:04 +00:00
Jason K Hall
7319c37f4e docs: fix jupyter notebook discovery info for editors (#22447)
Resolves #21892

## Summary

This PR updates `docs/editors/features.md` to clarify that Jupyter
Notebooks are now included by default as of version 0.6.0.
2026-01-08 11:52:01 +05:30
Amethyst Reese
805503c19a [ruff] Improve fix title for RUF102 invalid rule code (#22100)
## Summary

Updates the fix title for RUF102 to either specify which rule code to
remove, or clarify
that the entire suppression comment should be removed.

## Test Plan

Updated test snapshots.
2026-01-07 17:23:18 -08:00
Charlie Marsh
68a2f6c57d [ty] Fix super() with TypeVar-annotated self and cls parameter (#22208)
## Summary

This PR fixes `super()` handling when the first parameter (`self` or
`cls`) is annotated with a TypeVar, like `Self`.

Previously, `super()` would incorrectly resolve TypeVars to their bounds
before creating the `BoundSuperType`. So if you had `self: Self` where
`Self` is bounded by `Parent`, we'd process `Parent` as a
`NominalInstance` and end up with `SuperOwnerKind::Instance(Parent)`.

As a result:

```python
class Parent:
    @classmethod
    def create(cls) -> Self:
        return cls()

class Child(Parent):
    @classmethod
    def create(cls) -> Self:
        return super().create()  # Error: Argument type `Self@create` does not satisfy upper bound `Parent`
```

We now track two additional variants on `SuperOwnerKind` for TypeVar
owners:

- `InstanceTypeVar`: for instance methods where self is a TypeVar (e.g.,
`self: Self`).
- `ClassTypeVar`: for classmethods where `cls` is a `TypeVar` wrapped in
`type[...]` (e.g., `cls: type[Self]`).

Closes https://github.com/astral-sh/ty/issues/2122.

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2026-01-07 19:56:09 -05:00
Alex Waygood
abaa735e1d [ty] Improve UnionBuilder performance by changing Type::is_subtype_of calls to Type::is_redundant_with (#22337) 2026-01-07 22:17:44 +00:00
Jelle Zijlstra
c02d164357 Check required-version before parsing rules (#22410)
Co-authored-by: Micha Reiser <micha@reiser.io>
2026-01-07 17:29:27 +00:00
Andrew Gallant
88aa3f82f0 [ty] Fix generally poor ranking in playground completions
We enabled [`CompletionList::isIncomplete`] in our LSP server a while back
in order to have more of a say in how we rank and filter completions.
When it isn't set, the client tends to ask for completions less
frequently and will instead do its own filtering.

But... we did not enable it for the playground. Which I guess didn't
result in anything noticeably bad until we started limiting completions
to 1,000 suggestions. This meant that if the _initial_ completion
response didn't include the ultimate desired answer, then it would never
show up in the results until the client requested completions again.
This in turn led to some very poor completions in some cases.

This all gets fixed by simply enabling `isIncomplete` for Monaco.

Fixes astral-sh/ty#2340

[`CompletionList::isIncomplete`](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#completionList)
2026-01-07 12:25:26 -05:00
Carl Meyer
30902497db [ty] Make signature return and parameter types non-optional (#22425)
## Summary

Fixes https://github.com/astral-sh/ty/issues/2363
Fixes https://github.com/astral-sh/ty/issues/2013

And several other bugs with the same root cause. And makes any similar
bugs impossible by construction.

Previously we distinguished "no annotation" (Rust `None`) from
"explicitly annotated with something of type `Unknown`" (which is not an
error, and results in the annotation being of Rust type
`Some(Type::DynamicType(Unknown))`), even though semantically these
should be treated the same.

This was a bit of a bug magnet, because it was easy to forget to make
this `None` -> `Unknown` translation everywhere we needed to. And in
fact we did fail to do it in the case of materializing a callable,
leading to a top-materialized callable still having (rust) `None` return
type, which should have instead materialized to `object`.

This also fixes several other bugs related to not handling un-annotated
return types correctly:
1. We previously considered the return type of an unannotated `async
def` to be `Unknown`, where it should be `CoroutineType[Any, Any,
Unknown]`.
2. We previously failed to infer a ParamSpec if the return type of the
callable we are inferring against was not annotated.
3. We previously wrongly returned `Unknown` from `some_dict.get("key",
None)` if the value type of `some_dict` included a callable type with
un-annotated return type.

We now make signature return types and annotated parameter types
required, and we eagerly insert `Unknown` if there's no annotation. Most
of the diff is just a bunch of mechanical code changes where we
construct these types, and simplifications where we use them.

One exception is type display: when a callable type has un-annotated
parameters, we want to display them as un-annotated, but if it has a
parameter explicitly annotated with something of `Unknown` type, we want
to display that parameter as `x: Unknown` (it would be confusing if it
looked like your annotation just disappeared entirely).

Fortunately, we already have a mechanism in place for handling this: the
`inferred_annotation` flag, which suppresses display of an annotation.
Previously we used it only for `self` and `cls` parameters with an
inferred annotated type -- but we now also set it for any un-annotated
parameter, for which we infer `Unknown` type.

We also need to normalize `inferred_annotation`, since it's display-only
and shouldn't impact type equivalence. (This is technically a
previously-existing bug, it just never came up when it only affected
self types -- now it comes up because we have tests asserting that `def
f(x)` and `def g(x: Unknown)` are equivalent.)
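The display/equivalence split can be sketched like so (field names follow the description above, but this is an illustrative model, not ty's code):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Parameter:
    name: str
    annotation: str            # required: "Unknown" is inserted eagerly, never None
    inferred_annotation: bool  # display-only: hide the annotation when rendering

    def display(self) -> str:
        return self.name if self.inferred_annotation else f"{self.name}: {self.annotation}"

    def normalized(self) -> "Parameter":
        # inferred_annotation is display-only and must not affect type equivalence
        return replace(self, inferred_annotation=False)

# `def f(x)` vs. `def g(x: Unknown)`:
implicit = Parameter("x", "Unknown", inferred_annotation=True)
explicit = Parameter("x", "Unknown", inferred_annotation=False)
```

They display differently but normalize to the same parameter, so the two signatures compare equivalent.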

## Test Plan

Added mdtests.
2026-01-07 09:18:39 -08:00
Alex Waygood
3ad99fb1f4 [ty] Fix an mdtest title (#22439) 2026-01-07 16:34:56 +00:00
Micha Reiser
d0ff59cfe5 [ty] Use Pool from regex_automata to reuse the matches allocations (#22438) 2026-01-07 17:22:35 +01:00
Andrew Gallant
952193e0c6 [ty] Offer completions for T when a value has type Unknown | T
Fixes astral-sh/ty#2197
2026-01-07 10:15:36 -05:00
Alex Waygood
4cba2e8f91 [ty] Generalize len() narrowing somewhat (#22330) 2026-01-07 13:57:50 +00:00
Alex Waygood
1a7f53022a [ty] Link to Callable __name__ FAQ directly from unresolved-attribute diagnostic (#22437) 2026-01-07 13:22:53 +00:00
Micha Reiser
266a7bc4c5 [ty] Fix stack overflow due to too small stack size (#22433) 2026-01-07 13:55:23 +01:00
Micha Reiser
3b7a5e4de8 [ty] Allow including files with no extension (#22243) 2026-01-07 11:38:02 +01:00
Micha Reiser
93039d055d [ty] Add --add-ignore CLI option (#21696) 2026-01-07 11:17:05 +01:00
Jason K Hall
3b61da0da3 Allow Python 3.15 as valid target-version value in preview (#22419) 2026-01-07 09:38:36 +01:00
Alex Waygood
5933cc0101 [ty] Optimize and simplify some object-related code (#22366)
## Summary

I wondered if this might improve performance a little. It doesn't seem
to, but it's a net reduction in LOC and I think the changes make sense.
I think it's worth it anyway just in terms of simplifying the code.

## Test Plan

Our existing tests all pass and the primer report is clean (aside from
our usual flakes).
2026-01-07 08:35:26 +00:00
Dhruv Manilawala
2190fcebe0 [ty] Substitute ParamSpec in overloaded functions (#22416)
## Summary

fixes: https://github.com/astral-sh/ty/issues/2027

This PR fixes a bug where the type mapping for a `ParamSpec` was not
being applied in an overloaded function.

This PR also fixes https://github.com/astral-sh/ty/issues/2081 and
reveals new diagnostics that don't look related to the bug:

```py
from prefect import flow, task

@task
def task_get() -> int:
    """Task get integer."""
    return 42

@task
def task_add(x: int, y: int) -> int:
    """Task add two integers."""
    print(f"Adding {x} and {y}")
    return x + y

@flow
def my_flow():
    """My flow."""
    x = 23
    future_y = task_get.submit()

	# error: [no-matching-overload]
    task_add(future_y, future_y)
	# error: [no-matching-overload]
    task_add(x, future_y)
```

The reason is that the type of `future_y` is `PrefectFuture[int]` while
the type of `task_add` is `Task[(x: int, y: int), int]` which means that
the assignment between `int` and `PrefectFuture[int]` fails which
results in no overload matching. Pyright also raises the invalid
argument type error on all three usages of `future_y` in those two
calls.

## Test Plan

Add regression mdtest from the linked issue.
2026-01-07 13:30:34 +05:30
Douglas Creager
df9d6886d4 [ty] Remove redundant apply_specialization type mappings (#22422)
@dhruvmanila encountered this in #22416 — there are two different
`TypeMapping` variants for apply a specialization to a type. One
operates on a full `Specialization` instance, the other on a partially
constructed one. If we move this enum-ness "down a level" it reduces
some copy/paste in places where we are operating on a `TypeMapping`.
2026-01-07 13:10:26 +05:30
Aria Desires
5133fa4516 [ty] fix typo in CODEOWNERS (#22430) 2026-01-07 07:44:46 +01:00
Amethyst Reese
21c5cfe236 Consolidate diagnostics for matched disable/enable suppression comments (#22099)
## Summary

Combines diagnostics for matched suppression comments, so that ranges
and autofixes for both
the `#ruff:disable` and `#ruff:enable` comments will be reported as a
single diagnostic.

## Test Plan

Snapshot changes, added new snapshot for full output from preview mode
rather than just a diff.

Issue #3711
2026-01-06 18:42:51 -08:00
Carl Meyer
f97da18267 [ty] improve typevar solving from constraint sets (#22411)
## Summary

Fixes https://github.com/astral-sh/ty/issues/2292

When solving a bounded typevar, we preferred the upper bound over the
actual type seen in the call. This change fixes that.
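In Python terms, the fix means a call like the following solves the typevar to the argument's type rather than its upper bound (a minimal illustration, not the mdtest from the PR):

```python
from typing import TypeVar

T = TypeVar("T", bound=int)

def identity(x: T) -> T:
    return x

# Solving T from this call should yield `bool` (the argument type seen in the
# call), not `int` (the upper bound).
flag = identity(True)
```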

## Test Plan

Added mdtest, existing tests pass.
2026-01-06 13:10:51 -08:00
Alex Waygood
bc191f59b9 Convert more ty snapshots to the new format (#22424) 2026-01-06 20:01:41 +00:00
Alex Waygood
00f86c39e0 Add Alex Waygood back as a ty_ide codeowner (#22423) 2026-01-06 19:24:13 +00:00
Alex Waygood
2ec29b7418 [ty] Optimize Type::negate() (#22402) 2026-01-06 19:17:59 +00:00
190 changed files with 7233 additions and 2623 deletions

4
.github/CODEOWNERS vendored

@@ -21,10 +21,10 @@
/crates/ty* @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager
/crates/ruff_db/ @carljm @MichaReiser @sharkdp @dcreager
/crates/ty_project/ @carljm @MichaReiser @sharkdp @dcreager @Gankra
-/crates/ty_ide/ @carljm @MichaReiser @sharkdp @dcreager @Gankra
+/crates/ty_ide/ @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager @Gankra
/crates/ty_server/ @carljm @MichaReiser @sharkdp @dcreager @Gankra
/crates/ty/ @carljm @MichaReiser @sharkdp @dcreager
/crates/ty_wasm/ @carljm @MichaReiser @sharkdp @dcreager @Gankra
/scripts/ty_benchmark/ @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager
/crates/ty_python_semantic/ @carljm @AlexWaygood @sharkdp @dcreager
-/crates/ty_module_resolver/ @carljm @MichaReiser @AlexWaygood Gankra
+/crates/ty_module_resolver/ @carljm @MichaReiser @AlexWaygood @Gankra


@@ -5,5 +5,4 @@
[rules]
possibly-unresolved-reference = "warn"
possibly-missing-import = "warn"
-unused-ignore-comment = "warn"
division-by-zero = "warn"


@@ -51,6 +51,7 @@ jobs:
- name: "Build sdist"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
command: sdist
args: --out dist
- name: "Test sdist"
@@ -81,6 +82,7 @@ jobs:
- name: "Build wheels - x86_64"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: x86_64
args: --release --locked --out dist
- name: "Upload wheels"
@@ -123,6 +125,7 @@ jobs:
- name: "Build wheels - aarch64"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: aarch64
args: --release --locked --out dist
- name: "Test wheel - aarch64"
@@ -179,6 +182,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.platform.target }}
args: --release --locked --out dist
env:
@@ -232,6 +236,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.target }}
manylinux: auto
args: --release --locked --out dist
@@ -308,6 +313,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.platform.target }}
manylinux: auto
docker-options: ${{ matrix.platform.maturin_docker_options }}
@@ -374,6 +380,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.target }}
manylinux: musllinux_1_2
args: --release --locked --out dist
@@ -439,6 +446,7 @@ jobs:
- name: "Build wheels"
uses: PyO3/maturin-action@86b9d133d34bc1b40018696f782949dac11bd380 # v1.49.4
with:
+maturin-version: v1.9.6
target: ${{ matrix.platform.target }}
manylinux: musllinux_1_2
args: --release --locked --out dist

58
.github/workflows/build-wasm.yml vendored Normal file

@@ -0,0 +1,58 @@
+# Build ruff_wasm for npm.
+#
+# Assumed to run as a subworkflow of .github/workflows/release.yml; specifically, as a local
+# artifacts job within `cargo-dist`.
+name: "Build wasm"
+on:
+  workflow_call:
+    inputs:
+      plan:
+        required: true
+        type: string
+  pull_request:
+    paths:
+      - .github/workflows/build-wasm.yml
+concurrency:
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: true
+permissions: {}
+env:
+  CARGO_INCREMENTAL: 0
+  CARGO_NET_RETRY: 10
+  CARGO_TERM_COLOR: always
+  RUSTUP_MAX_RETRIES: 10
+jobs:
+  build:
+    if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        target: [web, bundler, nodejs]
+      fail-fast: false
+    steps:
+      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+        with:
+          persist-credentials: false
+      - name: "Install Rust toolchain"
+        run: rustup target add wasm32-unknown-unknown
+      - uses: jetli/wasm-pack-action@0d096b08b4e5a7de8c28de67e11e945404e9eefa # v0.4.0
+        with:
+          version: v0.13.1
+      - uses: jetli/wasm-bindgen-action@20b33e20595891ab1a0ed73145d8a21fc96e7c29 # v0.2.0
+      - name: "Run wasm-pack build"
+        run: wasm-pack build --target ${{ matrix.target }} crates/ruff_wasm
+      - name: "Rename generated package"
+        run: | # Replace the package name w/ jq
+          jq '.name="@astral-sh/ruff-wasm-${{ matrix.target }}"' crates/ruff_wasm/pkg/package.json > /tmp/package.json
+          mv /tmp/package.json crates/ruff_wasm/pkg
+      - run: cp LICENSE crates/ruff_wasm/pkg # wasm-pack does not put the LICENSE file in the pkg
+      - name: "Upload wasm artifact"
+        uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
+        with:
+          name: artifacts-wasm-${{ matrix.target }}
+          path: crates/ruff_wasm/pkg


@@ -1,25 +1,18 @@
-# Build and publish ruff-api for wasm.
+# Publish ruff_wasm to npm.
#
# Assumed to run as a subworkflow of .github/workflows/release.yml; specifically, as a publish
# job within `cargo-dist`.
-name: "Build and publish wasm"
+name: "Publish wasm"
on:
-workflow_dispatch:
workflow_call:
inputs:
plan:
required: true
type: string
-env:
-CARGO_INCREMENTAL: 0
-CARGO_NET_RETRY: 10
-CARGO_TERM_COLOR: always
-RUSTUP_MAX_RETRIES: 10
jobs:
-ruff_wasm:
+publish:
runs-on: ubuntu-latest
permissions:
contents: read
@@ -29,31 +22,19 @@ jobs:
target: [web, bundler, nodejs]
fail-fast: false
steps:
-- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+- uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
-persist-credentials: false
-- name: "Install Rust toolchain"
-run: rustup target add wasm32-unknown-unknown
-- uses: jetli/wasm-pack-action@0d096b08b4e5a7de8c28de67e11e945404e9eefa # v0.4.0
-with:
-version: v0.13.1
-- uses: jetli/wasm-bindgen-action@20b33e20595891ab1a0ed73145d8a21fc96e7c29 # v0.2.0
-- name: "Run wasm-pack build"
-run: wasm-pack build --target ${{ matrix.target }} crates/ruff_wasm
-- name: "Rename generated package"
-run: | # Replace the package name w/ jq
-jq '.name="@astral-sh/ruff-wasm-${{ matrix.target }}"' crates/ruff_wasm/pkg/package.json > /tmp/package.json
-mv /tmp/package.json crates/ruff_wasm/pkg
-- run: cp LICENSE crates/ruff_wasm/pkg # wasm-pack does not put the LICENSE file in the pkg
+name: artifacts-wasm-${{ matrix.target }}
+path: pkg
- uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 24
registry-url: "https://registry.npmjs.org"
- name: "Publish (dry-run)"
if: ${{ inputs.plan == '' || fromJson(inputs.plan).announcement_tag_is_implicit }}
-run: npm publish --dry-run crates/ruff_wasm/pkg
+run: npm publish --dry-run pkg
- name: "Publish"
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
-run: npm publish --provenance --access public crates/ruff_wasm/pkg
+run: npm publish --provenance --access public pkg
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}


@@ -112,12 +112,22 @@ jobs:
"contents": "read"
"packages": "write"
+custom-build-wasm:
+needs:
+- plan
+if: ${{ needs.plan.outputs.publishing == 'true' || fromJson(needs.plan.outputs.val).ci.github.pr_run_mode == 'upload' || inputs.tag == 'dry-run' }}
+uses: ./.github/workflows/build-wasm.yml
+with:
+plan: ${{ needs.plan.outputs.val }}
+secrets: inherit
# Build and package all the platform-agnostic(ish) things
build-global-artifacts:
needs:
- plan
- custom-build-binaries
- custom-build-docker
+- custom-build-wasm
runs-on: "depot-ubuntu-latest-4"
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@@ -165,9 +175,10 @@ jobs:
- plan
- custom-build-binaries
- custom-build-docker
+- custom-build-wasm
- build-global-artifacts
# Only run if we're "publishing", and only if plan, local and global didn't fail (skipped is fine)
-if: ${{ always() && needs.plan.result == 'success' && needs.plan.outputs.publishing == 'true' && (needs.build-global-artifacts.result == 'skipped' || needs.build-global-artifacts.result == 'success') && (needs.custom-build-binaries.result == 'skipped' || needs.custom-build-binaries.result == 'success') && (needs.custom-build-docker.result == 'skipped' || needs.custom-build-docker.result == 'success') }}
+if: ${{ always() && needs.plan.result == 'success' && needs.plan.outputs.publishing == 'true' && (needs.build-global-artifacts.result == 'skipped' || needs.build-global-artifacts.result == 'success') && (needs.custom-build-binaries.result == 'skipped' || needs.custom-build-binaries.result == 'success') && (needs.custom-build-docker.result == 'skipped' || needs.custom-build-docker.result == 'success') && (needs.custom-build-wasm.result == 'skipped' || needs.custom-build-wasm.result == 'success') }}
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
runs-on: "depot-ubuntu-latest-4"


@@ -40,12 +40,12 @@ jobs:
- name: Install the latest version of uv
uses: astral-sh/setup-uv@681c641aba71e4a1c380be3ab5e12ad51f415867 # v7.1.6
with:
-enable-cache: true # zizmor: ignore[cache-poisoning] acceptable risk for CloudFlare pages artifact
+enable-cache: true
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
with:
workspaces: "ruff"
-lookup-only: false # zizmor: ignore[cache-poisoning] acceptable risk for CloudFlare pages artifact
+lookup-only: false
- name: Install Rust toolchain
run: rustup show


@@ -125,6 +125,7 @@ repos:
args:
- "-ignore=SC2129" # ignorable stylistic lint from shellcheck
- "-ignore=SC2016" # another shellcheck lint: seems to have false positives?
+language: golang # means renovate will also update `additional_dependencies`
additional_dependencies:
# actionlint has a shellcheck integration which extracts shell scripts in `run:` steps from GitHub Actions
# and checks these with shellcheck. This is arguably its most useful feature,


@@ -1,5 +1,63 @@
# Changelog
## 0.14.11
Released on 2026-01-08.
### Preview features
- Consolidate diagnostics for matched disable/enable suppression comments ([#22099](https://github.com/astral-sh/ruff/pull/22099))
- Report diagnostics for invalid/unmatched range suppression comments ([#21908](https://github.com/astral-sh/ruff/pull/21908))
- \[`airflow`\] Passing positional argument into `airflow.lineage.hook.HookLineageCollector.create_asset` is not allowed (`AIR303`) ([#22046](https://github.com/astral-sh/ruff/pull/22046))
- \[`refurb`\] Mark `FURB192` fix as always unsafe ([#22210](https://github.com/astral-sh/ruff/pull/22210))
- \[`ruff`\] Add `non-empty-init-module` (`RUF067`) ([#22143](https://github.com/astral-sh/ruff/pull/22143))
### Bug fixes
- Fix GitHub format for multi-line diagnostics ([#22108](https://github.com/astral-sh/ruff/pull/22108))
- \[`flake8-unused-arguments`\] Mark `**kwargs` in `TypeVar` as used (`ARG001`) ([#22214](https://github.com/astral-sh/ruff/pull/22214))
### Rule changes
- Add `help:` subdiagnostics for several Ruff rules that can sometimes appear to disagree with `ty` ([#22331](https://github.com/astral-sh/ruff/pull/22331))
- \[`pylint`\] Demote `PLW1510` fix to display-only ([#22318](https://github.com/astral-sh/ruff/pull/22318))
- \[`pylint`\] Ignore identical members (`PLR1714`) ([#22220](https://github.com/astral-sh/ruff/pull/22220))
- \[`pylint`\] Improve diagnostic range for `PLC0206` ([#22312](https://github.com/astral-sh/ruff/pull/22312))
- \[`ruff`\] Improve fix title for `RUF102` invalid rule code ([#22100](https://github.com/astral-sh/ruff/pull/22100))
- \[`flake8-simplify`\]: Avoid unnecessary builtins import for `SIM105` ([#22358](https://github.com/astral-sh/ruff/pull/22358))
### Configuration
- Allow Python 3.15 as valid `target-version` value in preview ([#22419](https://github.com/astral-sh/ruff/pull/22419))
- Check `required-version` before parsing rules ([#22410](https://github.com/astral-sh/ruff/pull/22410))
- Include configured `src` directories when resolving graphs ([#22451](https://github.com/astral-sh/ruff/pull/22451))
### Documentation
- Update `T201` suggestion to not use root logger to satisfy `LOG015` ([#22059](https://github.com/astral-sh/ruff/pull/22059))
- Fix `iter` example in unsafe fixes doc ([#22118](https://github.com/astral-sh/ruff/pull/22118))
- \[`flake8_print`\] Better suggestion for `basicConfig` in `T201` docs ([#22101](https://github.com/astral-sh/ruff/pull/22101))
- \[`pylint`\] Restore the fix safety docs for `PLW0133` ([#22211](https://github.com/astral-sh/ruff/pull/22211))
- Fix Jupyter notebook discovery info for editors ([#22447](https://github.com/astral-sh/ruff/pull/22447))
### Contributors
- [@charliermarsh](https://github.com/charliermarsh)
- [@ntBre](https://github.com/ntBre)
- [@cenviity](https://github.com/cenviity)
- [@njhearp](https://github.com/njhearp)
- [@cbachhuber](https://github.com/cbachhuber)
- [@jelle-openai](https://github.com/jelle-openai)
- [@AlexWaygood](https://github.com/AlexWaygood)
- [@ValdonVitija](https://github.com/ValdonVitija)
- [@BurntSushi](https://github.com/BurntSushi)
- [@Jkhall81](https://github.com/Jkhall81)
- [@PeterJCLaw](https://github.com/PeterJCLaw)
- [@harupy](https://github.com/harupy)
- [@amyreese](https://github.com/amyreese)
- [@sjyangkevin](https://github.com/sjyangkevin)
- [@woodruffw](https://github.com/woodruffw)
## 0.14.10
Released on 2025-12-18.

Cargo.lock generated
View File

@@ -2912,7 +2912,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.14.10"
version = "0.14.11"
dependencies = [
"anyhow",
"argfile",
@@ -2928,6 +2928,7 @@ dependencies = [
"filetime",
"globwalk",
"ignore",
"indexmap",
"indoc",
"insta",
"insta-cmd",
@@ -3171,7 +3172,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.14.10"
version = "0.14.11"
dependencies = [
"aho-corasick",
"anyhow",
@@ -3529,7 +3530,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.14.10"
version = "0.14.11"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -3645,7 +3646,7 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=309c249088fdeef0129606fa34ec2eefc74736ff#309c249088fdeef0129606fa34ec2eefc74736ff"
source = "git+https://github.com/salsa-rs/salsa.git?rev=9860ff6ca0f1f8f3a8d6b832020002790b501254#9860ff6ca0f1f8f3a8d6b832020002790b501254"
dependencies = [
"boxcar",
"compact_str",
@@ -3670,12 +3671,12 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=309c249088fdeef0129606fa34ec2eefc74736ff#309c249088fdeef0129606fa34ec2eefc74736ff"
source = "git+https://github.com/salsa-rs/salsa.git?rev=9860ff6ca0f1f8f3a8d6b832020002790b501254#9860ff6ca0f1f8f3a8d6b832020002790b501254"
[[package]]
name = "salsa-macros"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=309c249088fdeef0129606fa34ec2eefc74736ff#309c249088fdeef0129606fa34ec2eefc74736ff"
source = "git+https://github.com/salsa-rs/salsa.git?rev=9860ff6ca0f1f8f3a8d6b832020002790b501254#9860ff6ca0f1f8f3a8d6b832020002790b501254"
dependencies = [
"proc-macro2",
"quote",
@@ -4443,6 +4444,7 @@ version = "0.0.0"
dependencies = [
"bitflags 2.10.0",
"camino",
"compact_str",
"get-size2",
"insta",
"itertools 0.14.0",
@@ -4511,11 +4513,13 @@ dependencies = [
"regex-automata",
"ruff_cache",
"ruff_db",
"ruff_diagnostics",
"ruff_macros",
"ruff_memory_usage",
"ruff_options_metadata",
"ruff_python_ast",
"ruff_python_formatter",
"ruff_python_trivia",
"ruff_text_size",
"rustc-hash",
"salsa",

View File

@@ -150,7 +150,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "309c249088fdeef0129606fa34ec2eefc74736ff", default-features = false, features = [
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "9860ff6ca0f1f8f3a8d6b832020002790b501254", default-features = false, features = [
"compact_str",
"macros",
"salsa_unstable",

View File

@@ -150,8 +150,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.14.10/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.14.10/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.14.11/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.14.11/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -184,7 +184,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.14.10
rev: v0.14.11
hooks:
# Run the linter.
- id: ruff-check

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.14.10"
version = "0.14.11"
publish = true
authors = { workspace = true }
edition = { workspace = true }
@@ -48,6 +48,7 @@ colored = { workspace = true }
filetime = { workspace = true }
globwalk = { workspace = true }
ignore = { workspace = true }
indexmap = { workspace = true }
is-macro = { workspace = true }
itertools = { workspace = true }
jiff = { workspace = true }

View File

@@ -2,6 +2,7 @@ use crate::args::{AnalyzeGraphArgs, ConfigArguments};
use crate::resolve::resolve;
use crate::{ExitStatus, resolve_default_files};
use anyhow::Result;
use indexmap::IndexSet;
use log::{debug, warn};
use path_absolutize::CWD;
use ruff_db::system::{SystemPath, SystemPathBuf};
@@ -11,7 +12,7 @@ use ruff_linter::source_kind::SourceKind;
use ruff_linter::{warn_user, warn_user_once};
use ruff_python_ast::{PySourceType, SourceType};
use ruff_workspace::resolver::{ResolvedFile, match_exclusion, python_files_in_path};
use rustc_hash::FxHashMap;
use rustc_hash::{FxBuildHasher, FxHashMap};
use std::io::Write;
use std::path::{Path, PathBuf};
use std::sync::{Arc, Mutex};
@@ -59,17 +60,34 @@ pub(crate) fn analyze_graph(
})
.collect::<FxHashMap<_, _>>();
// Create a database from the source roots.
let src_roots = package_roots
.values()
.filter_map(|package| package.as_deref())
.filter_map(|package| package.parent())
.map(Path::to_path_buf)
.filter_map(|path| SystemPathBuf::from_path_buf(path).ok())
.collect();
// Create a database from the source roots, combining configured `src` paths with detected
// package roots. Configured paths are added first so they take precedence, and duplicates
// are removed.
let mut src_roots: IndexSet<SystemPathBuf, FxBuildHasher> = IndexSet::default();
// Add configured `src` paths first (for precedence), filtering to only include existing
// directories.
src_roots.extend(
pyproject_config
.settings
.linter
.src
.iter()
.filter(|path| path.is_dir())
.filter_map(|path| SystemPathBuf::from_path_buf(path.clone()).ok()),
);
// Add detected package roots.
src_roots.extend(
package_roots
.values()
.filter_map(|package| package.as_deref())
.filter_map(|path| path.parent())
.filter_map(|path| SystemPathBuf::from_path_buf(path.to_path_buf()).ok()),
);
let db = ModuleDb::from_src_roots(
src_roots,
src_roots.into_iter().collect(),
pyproject_config
.settings
.analyze
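
The hunk above replaces a plain collection of package-root parents with an insertion-ordered set, so that configured `src` paths take precedence over detected package roots and duplicates are dropped. A minimal sketch of that order-preserving dedup using only the standard library (the real code uses `indexmap::IndexSet` with an `FxBuildHasher`; the string roots here are placeholders):

```rust
use std::collections::HashSet;

fn main() {
    // Configured `src` paths are inserted first, then detected package roots.
    let configured = ["lib", "src"];
    let detected = ["src", "app"];

    // `seen` tracks membership; `src_roots` preserves first-insertion order,
    // which is what `IndexSet::extend` gives the real implementation.
    let mut seen = HashSet::new();
    let mut src_roots = Vec::new();
    for root in configured.iter().chain(detected.iter()) {
        if seen.insert(*root) {
            src_roots.push(*root);
        }
    }

    // "src" appears once, at its configured (first-seen) position.
    assert_eq!(src_roots, ["lib", "src", "app"]);
}
```

Because the first insertion wins, a path listed in `src` keeps its position even when the package-root detection later yields the same directory.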

View File

@@ -29,10 +29,10 @@ pub(crate) fn show_settings(
bail!("No files found under the given path");
};
let settings = resolver.resolve(&path);
let (settings, config_path) = resolver.resolve_with_path(&path);
writeln!(writer, "Resolved settings for: \"{}\"", path.display())?;
if let Some(settings_path) = pyproject_config.path.as_ref() {
if let Some(settings_path) = config_path {
writeln!(writer, "Settings path: \"{}\"", settings_path.display())?;
}
write!(writer, "{settings}")?;

View File

@@ -714,6 +714,121 @@ fn notebook_basic() -> Result<()> {
Ok(())
}
/// Test that the `src` configuration option is respected.
///
/// This is useful for monorepos where there are multiple source directories that need to be
/// included in the module resolution search path.
#[test]
fn src_option() -> Result<()> {
let tempdir = TempDir::new()?;
let root = ChildPath::new(tempdir.path());
// Create a lib directory with a package.
root.child("lib")
.child("mylib")
.child("__init__.py")
.write_str("def helper(): pass")?;
// Create an app directory with a file that imports from mylib.
root.child("app").child("__init__.py").write_str("")?;
root.child("app")
.child("main.py")
.write_str("from mylib import helper")?;
// Without src configured, the import from mylib won't resolve.
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"app/__init__.py": [],
"app/main.py": []
}
----- stderr -----
"#);
});
// With src = ["lib"], the import should resolve.
root.child("ruff.toml").write_str(indoc::indoc! {r#"
src = ["lib"]
"#})?;
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"app/__init__.py": [],
"app/main.py": [
"lib/mylib/__init__.py"
]
}
----- stderr -----
"#);
});
Ok(())
}
/// Test that glob patterns in `src` are expanded.
#[test]
fn src_glob_expansion() -> Result<()> {
let tempdir = TempDir::new()?;
let root = ChildPath::new(tempdir.path());
// Create multiple lib directories with packages.
root.child("libs")
.child("lib_a")
.child("pkg_a")
.child("__init__.py")
.write_str("def func_a(): pass")?;
root.child("libs")
.child("lib_b")
.child("pkg_b")
.child("__init__.py")
.write_str("def func_b(): pass")?;
// Create an app that imports from both packages.
root.child("app").child("__init__.py").write_str("")?;
root.child("app")
.child("main.py")
.write_str("from pkg_a import func_a\nfrom pkg_b import func_b")?;
// Use a glob pattern to include all lib directories.
root.child("ruff.toml").write_str(indoc::indoc! {r#"
src = ["libs/*"]
"#})?;
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"app/__init__.py": [],
"app/main.py": [
"libs/lib_a/pkg_a/__init__.py",
"libs/lib_b/pkg_b/__init__.py"
]
}
----- stderr -----
"#);
});
Ok(())
}
#[test]
fn notebook_with_magic() -> Result<()> {
let tempdir = TempDir::new()?;

View File

@@ -1126,6 +1126,35 @@ import os
Ok(())
}
#[test]
fn required_version_fails_to_parse() -> Result<()> {
let fixture = CliTest::with_file(
"ruff.toml",
r#"
required-version = "pikachu"
"#,
)?;
assert_cmd_snapshot!(fixture
.check_command(), @r#"
success: false
exit_code: 2
----- stdout -----
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Failed to parse [TMP]/ruff.toml
Cause: TOML parse error at line 2, column 20
|
2 | required-version = "pikachu"
| ^^^^^^^^^
Failed to parse version: Unexpected end of version specifier, expected operator:
pikachu
^^^^^^^
"#);
Ok(())
}
#[test]
fn required_version_exact_mismatch() -> Result<()> {
let version = env!("CARGO_PKG_VERSION");
@@ -1137,10 +1166,10 @@ required-version = "0.1.0"
"#,
)?;
insta::with_settings!({
filters => vec![(version, "[VERSION]")]
}, {
assert_cmd_snapshot!(fixture
let mut settings = insta::Settings::clone_current();
settings.add_filter(version, "[VERSION]");
settings.bind(|| {
assert_cmd_snapshot!(fixture
.check_command()
.arg("--config")
.arg("ruff.toml")
@@ -1154,6 +1183,7 @@ import os
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Required version `==0.1.0` does not match the running version `[VERSION]`
");
});
@@ -1212,10 +1242,10 @@ required-version = ">{version}"
),
)?;
insta::with_settings!({
filters => vec![(version, "[VERSION]")]
}, {
assert_cmd_snapshot!(fixture
let mut settings = insta::Settings::clone_current();
settings.add_filter(version, "[VERSION]");
settings.bind(|| {
assert_cmd_snapshot!(fixture
.check_command()
.arg("--config")
.arg("ruff.toml")
@@ -1229,6 +1259,48 @@ import os
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Required version `>[VERSION]` does not match the running version `[VERSION]`
");
});
Ok(())
}
#[test]
fn required_version_precedes_rule_validation() -> Result<()> {
let version = env!("CARGO_PKG_VERSION");
let fixture = CliTest::with_file(
"ruff.toml",
&format!(
r#"
required-version = ">{version}"
[lint]
select = ["RUF999"]
"#
),
)?;
let mut settings = insta::Settings::clone_current();
settings.add_filter(version, "[VERSION]");
settings.bind(|| {
assert_cmd_snapshot!(fixture
.check_command()
.arg("--config")
.arg("ruff.toml")
.arg("-")
.pass_stdin(r#"
import os
"#), @"
success: false
exit_code: 2
----- stdout -----
----- stderr -----
ruff failed
Cause: Failed to load configuration `[TMP]/ruff.toml`
Cause: Required version `>[VERSION]` does not match the running version `[VERSION]`
");
});

View File

@@ -16,6 +16,7 @@ success: true
exit_code: 0
----- stdout -----
Resolved settings for: "[TMP]/foo/test.py"
Settings path: "[TMP]/foo/pyproject.toml"
# General Settings
cache_dir = "[TMP]/foo/.ruff_cache"

View File

@@ -50,6 +50,56 @@ ignore = [
Ok(())
}
#[test]
fn display_settings_from_nested_directory() -> anyhow::Result<()> {
let tempdir = TempDir::new().context("Failed to create temp directory.")?;
// Tempdir paths on macOS are symlinks, which doesn't play nicely with
// our snapshot filtering.
let project_dir =
dunce::canonicalize(tempdir.path()).context("Failed to canonicalize tempdir path.")?;
// Root pyproject.toml.
std::fs::write(
project_dir.join("pyproject.toml"),
r#"
[tool.ruff]
line-length = 100
[tool.ruff.lint]
select = ["E", "F"]
"#,
)?;
// Create a subdirectory with its own pyproject.toml.
let subdir = project_dir.join("subdir");
std::fs::create_dir(&subdir)?;
std::fs::write(
subdir.join("pyproject.toml"),
r#"
[tool.ruff]
line-length = 120
[tool.ruff.lint]
select = ["E", "F", "I"]
"#,
)?;
std::fs::write(subdir.join("test.py"), r#"import os"#).context("Failed to write test.py.")?;
insta::with_settings!({filters => vec![
(&*tempdir_filter(&project_dir), "<temp_dir>/"),
(r#"\\(\w\w|\s|\.|")"#, "/$1"),
]}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--show-settings", "subdir/test.py"])
.current_dir(&project_dir));
});
Ok(())
}
fn tempdir_filter(project_dir: &Path) -> String {
format!(r#"{}\\?/?"#, regex::escape(project_dir.to_str().unwrap()))
}

View File

@@ -0,0 +1,410 @@
---
source: crates/ruff/tests/show_settings.rs
info:
program: ruff
args:
- check
- "--show-settings"
- subdir/test.py
---
success: true
exit_code: 0
----- stdout -----
Resolved settings for: "<temp_dir>/subdir/test.py"
Settings path: "<temp_dir>/subdir/pyproject.toml"
# General Settings
cache_dir = "<temp_dir>/subdir/.ruff_cache"
fix = false
fix_only = false
output_format = full
show_fixes = false
unsafe_fixes = hint
# File Resolver Settings
file_resolver.exclude = [
".bzr",
".direnv",
".eggs",
".git",
".git-rewrite",
".hg",
".ipynb_checkpoints",
".mypy_cache",
".nox",
".pants.d",
".pyenv",
".pytest_cache",
".pytype",
".ruff_cache",
".svn",
".tox",
".venv",
".vscode",
"__pypackages__",
"_build",
"buck-out",
"dist",
"node_modules",
"site-packages",
"venv",
]
file_resolver.extend_exclude = []
file_resolver.force_exclude = false
file_resolver.include = [
"*.py",
"*.pyi",
"*.ipynb",
"**/pyproject.toml",
]
file_resolver.extend_include = []
file_resolver.respect_gitignore = true
file_resolver.project_root = "<temp_dir>/subdir"
# Linter Settings
linter.exclude = []
linter.project_root = "<temp_dir>/subdir"
linter.rules.enabled = [
unsorted-imports (I001),
missing-required-import (I002),
mixed-spaces-and-tabs (E101),
multiple-imports-on-one-line (E401),
module-import-not-at-top-of-file (E402),
line-too-long (E501),
multiple-statements-on-one-line-colon (E701),
multiple-statements-on-one-line-semicolon (E702),
useless-semicolon (E703),
none-comparison (E711),
true-false-comparison (E712),
not-in-test (E713),
not-is-test (E714),
type-comparison (E721),
bare-except (E722),
lambda-assignment (E731),
ambiguous-variable-name (E741),
ambiguous-class-name (E742),
ambiguous-function-name (E743),
io-error (E902),
unused-import (F401),
import-shadowed-by-loop-var (F402),
undefined-local-with-import-star (F403),
late-future-import (F404),
undefined-local-with-import-star-usage (F405),
undefined-local-with-nested-import-star-usage (F406),
future-feature-not-defined (F407),
percent-format-invalid-format (F501),
percent-format-expected-mapping (F502),
percent-format-expected-sequence (F503),
percent-format-extra-named-arguments (F504),
percent-format-missing-argument (F505),
percent-format-mixed-positional-and-named (F506),
percent-format-positional-count-mismatch (F507),
percent-format-star-requires-sequence (F508),
percent-format-unsupported-format-character (F509),
string-dot-format-invalid-format (F521),
string-dot-format-extra-named-arguments (F522),
string-dot-format-extra-positional-arguments (F523),
string-dot-format-missing-arguments (F524),
string-dot-format-mixing-automatic (F525),
f-string-missing-placeholders (F541),
multi-value-repeated-key-literal (F601),
multi-value-repeated-key-variable (F602),
expressions-in-star-assignment (F621),
multiple-starred-expressions (F622),
assert-tuple (F631),
is-literal (F632),
invalid-print-syntax (F633),
if-tuple (F634),
break-outside-loop (F701),
continue-outside-loop (F702),
yield-outside-function (F704),
return-outside-function (F706),
default-except-not-last (F707),
forward-annotation-syntax-error (F722),
redefined-while-unused (F811),
undefined-name (F821),
undefined-export (F822),
undefined-local (F823),
unused-variable (F841),
unused-annotation (F842),
raise-not-implemented (F901),
]
linter.rules.should_fix = [
unsorted-imports (I001),
missing-required-import (I002),
mixed-spaces-and-tabs (E101),
multiple-imports-on-one-line (E401),
module-import-not-at-top-of-file (E402),
line-too-long (E501),
multiple-statements-on-one-line-colon (E701),
multiple-statements-on-one-line-semicolon (E702),
useless-semicolon (E703),
none-comparison (E711),
true-false-comparison (E712),
not-in-test (E713),
not-is-test (E714),
type-comparison (E721),
bare-except (E722),
lambda-assignment (E731),
ambiguous-variable-name (E741),
ambiguous-class-name (E742),
ambiguous-function-name (E743),
io-error (E902),
unused-import (F401),
import-shadowed-by-loop-var (F402),
undefined-local-with-import-star (F403),
late-future-import (F404),
undefined-local-with-import-star-usage (F405),
undefined-local-with-nested-import-star-usage (F406),
future-feature-not-defined (F407),
percent-format-invalid-format (F501),
percent-format-expected-mapping (F502),
percent-format-expected-sequence (F503),
percent-format-extra-named-arguments (F504),
percent-format-missing-argument (F505),
percent-format-mixed-positional-and-named (F506),
percent-format-positional-count-mismatch (F507),
percent-format-star-requires-sequence (F508),
percent-format-unsupported-format-character (F509),
string-dot-format-invalid-format (F521),
string-dot-format-extra-named-arguments (F522),
string-dot-format-extra-positional-arguments (F523),
string-dot-format-missing-arguments (F524),
string-dot-format-mixing-automatic (F525),
f-string-missing-placeholders (F541),
multi-value-repeated-key-literal (F601),
multi-value-repeated-key-variable (F602),
expressions-in-star-assignment (F621),
multiple-starred-expressions (F622),
assert-tuple (F631),
is-literal (F632),
invalid-print-syntax (F633),
if-tuple (F634),
break-outside-loop (F701),
continue-outside-loop (F702),
yield-outside-function (F704),
return-outside-function (F706),
default-except-not-last (F707),
forward-annotation-syntax-error (F722),
redefined-while-unused (F811),
undefined-name (F821),
undefined-export (F822),
undefined-local (F823),
unused-variable (F841),
unused-annotation (F842),
raise-not-implemented (F901),
]
linter.per_file_ignores = {}
linter.safety_table.forced_safe = []
linter.safety_table.forced_unsafe = []
linter.unresolved_target_version = none
linter.per_file_target_version = {}
linter.preview = disabled
linter.explicit_preview_rules = false
linter.extension = ExtensionMapping({})
linter.allowed_confusables = []
linter.builtins = []
linter.dummy_variable_rgx = ^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$
linter.external = []
linter.ignore_init_module_imports = true
linter.logger_objects = []
linter.namespace_packages = []
linter.src = [
"<temp_dir>/subdir",
"<temp_dir>/subdir/src",
]
linter.tab_size = 4
linter.line_length = 120
linter.task_tags = [
TODO,
FIXME,
XXX,
]
linter.typing_modules = []
linter.typing_extensions = true
# Linter Plugins
linter.flake8_annotations.mypy_init_return = false
linter.flake8_annotations.suppress_dummy_args = false
linter.flake8_annotations.suppress_none_returning = false
linter.flake8_annotations.allow_star_arg_any = false
linter.flake8_annotations.ignore_fully_untyped = false
linter.flake8_bandit.hardcoded_tmp_directory = [
/tmp,
/var/tmp,
/dev/shm,
]
linter.flake8_bandit.check_typed_exception = false
linter.flake8_bandit.extend_markup_names = []
linter.flake8_bandit.allowed_markup_calls = []
linter.flake8_bugbear.extend_immutable_calls = []
linter.flake8_builtins.allowed_modules = []
linter.flake8_builtins.ignorelist = []
linter.flake8_builtins.strict_checking = false
linter.flake8_comprehensions.allow_dict_calls_with_keyword_arguments = false
linter.flake8_copyright.notice_rgx = (?i)Copyright\s+((?:\(C\)|©)\s+)?\d{4}((-|,\s)\d{4})*
linter.flake8_copyright.author = none
linter.flake8_copyright.min_file_size = 0
linter.flake8_errmsg.max_string_length = 0
linter.flake8_gettext.function_names = [
_,
gettext,
ngettext,
]
linter.flake8_implicit_str_concat.allow_multiline = true
linter.flake8_import_conventions.aliases = {
altair = alt,
holoviews = hv,
matplotlib = mpl,
matplotlib.pyplot = plt,
networkx = nx,
numpy = np,
numpy.typing = npt,
pandas = pd,
panel = pn,
plotly.express = px,
polars = pl,
pyarrow = pa,
seaborn = sns,
tensorflow = tf,
tkinter = tk,
xml.etree.ElementTree = ET,
}
linter.flake8_import_conventions.banned_aliases = {}
linter.flake8_import_conventions.banned_from = []
linter.flake8_pytest_style.fixture_parentheses = false
linter.flake8_pytest_style.parametrize_names_type = tuple
linter.flake8_pytest_style.parametrize_values_type = list
linter.flake8_pytest_style.parametrize_values_row_type = tuple
linter.flake8_pytest_style.raises_require_match_for = [
BaseException,
Exception,
ValueError,
OSError,
IOError,
EnvironmentError,
socket.error,
]
linter.flake8_pytest_style.raises_extend_require_match_for = []
linter.flake8_pytest_style.mark_parentheses = false
linter.flake8_quotes.inline_quotes = double
linter.flake8_quotes.multiline_quotes = double
linter.flake8_quotes.docstring_quotes = double
linter.flake8_quotes.avoid_escape = true
linter.flake8_self.ignore_names = [
_make,
_asdict,
_replace,
_fields,
_field_defaults,
_name_,
_value_,
]
linter.flake8_tidy_imports.ban_relative_imports = "parents"
linter.flake8_tidy_imports.banned_api = {}
linter.flake8_tidy_imports.banned_module_level_imports = []
linter.flake8_type_checking.strict = false
linter.flake8_type_checking.exempt_modules = [
typing,
typing_extensions,
]
linter.flake8_type_checking.runtime_required_base_classes = []
linter.flake8_type_checking.runtime_required_decorators = []
linter.flake8_type_checking.quote_annotations = false
linter.flake8_unused_arguments.ignore_variadic_names = false
linter.isort.required_imports = []
linter.isort.combine_as_imports = false
linter.isort.force_single_line = false
linter.isort.force_sort_within_sections = false
linter.isort.detect_same_package = true
linter.isort.case_sensitive = false
linter.isort.force_wrap_aliases = false
linter.isort.force_to_top = []
linter.isort.known_modules = {}
linter.isort.order_by_type = true
linter.isort.relative_imports_order = furthest_to_closest
linter.isort.single_line_exclusions = []
linter.isort.split_on_trailing_comma = true
linter.isort.classes = []
linter.isort.constants = []
linter.isort.variables = []
linter.isort.no_lines_before = []
linter.isort.lines_after_imports = -1
linter.isort.lines_between_types = 0
linter.isort.forced_separate = []
linter.isort.section_order = [
known { type = future },
known { type = standard_library },
known { type = third_party },
known { type = first_party },
known { type = local_folder },
]
linter.isort.default_section = known { type = third_party }
linter.isort.no_sections = false
linter.isort.from_first = false
linter.isort.length_sort = false
linter.isort.length_sort_straight = false
linter.mccabe.max_complexity = 10
linter.pep8_naming.ignore_names = [
setUp,
tearDown,
setUpClass,
tearDownClass,
setUpModule,
tearDownModule,
asyncSetUp,
asyncTearDown,
setUpTestData,
failureException,
longMessage,
maxDiff,
]
linter.pep8_naming.classmethod_decorators = []
linter.pep8_naming.staticmethod_decorators = []
linter.pycodestyle.max_line_length = 120
linter.pycodestyle.max_doc_length = none
linter.pycodestyle.ignore_overlong_task_comments = false
linter.pyflakes.extend_generics = []
linter.pyflakes.allowed_unused_imports = []
linter.pylint.allow_magic_value_types = [
str,
bytes,
]
linter.pylint.allow_dunder_method_names = []
linter.pylint.max_args = 5
linter.pylint.max_positional_args = 5
linter.pylint.max_returns = 6
linter.pylint.max_bool_expr = 5
linter.pylint.max_branches = 12
linter.pylint.max_statements = 50
linter.pylint.max_public_methods = 20
linter.pylint.max_locals = 15
linter.pylint.max_nested_blocks = 5
linter.pyupgrade.keep_runtime_typing = false
linter.ruff.parenthesize_tuple_in_subscript = false
linter.ruff.strictly_empty_init_modules = false
# Formatter Settings
formatter.exclude = []
formatter.unresolved_target_version = 3.10
formatter.per_file_target_version = {}
formatter.preview = disabled
formatter.line_width = 120
formatter.line_ending = auto
formatter.indent_style = space
formatter.indent_width = 4
formatter.quote_style = double
formatter.magic_trailing_comma = respect
formatter.docstring_code_format = disabled
formatter.docstring_code_line_width = dynamic
# Analyze Settings
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.10
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
analyze.type_checking_imports = true
----- stderr -----

View File

@@ -15,7 +15,7 @@ use ruff_db::files::{File, system_path_to_file};
use ruff_db::source::source_text;
use ruff_db::system::{InMemorySystem, MemoryFileSystem, SystemPath, SystemPathBuf, TestSystem};
use ruff_python_ast::PythonVersion;
use ty_project::metadata::options::{EnvironmentOptions, Options};
use ty_project::metadata::options::{AnalysisOptions, EnvironmentOptions, Options};
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
use ty_project::watch::{ChangeEvent, ChangedKind};
use ty_project::{CheckMode, Db, ProjectDatabase, ProjectMetadata};
@@ -67,6 +67,7 @@ fn tomllib_path(file: &TestFile) -> SystemPathBuf {
SystemPathBuf::from("src").join(file.name())
}
#[expect(clippy::needless_update)]
fn setup_tomllib_case() -> Case {
let system = TestSystem::default();
let fs = system.memory_file_system().clone();
@@ -85,6 +86,10 @@ fn setup_tomllib_case() -> Case {
python_version: Some(RangedValue::cli(PythonVersion::PY312)),
..EnvironmentOptions::default()
}),
analysis: Some(AnalysisOptions {
respect_type_ignore_comments: Some(false),
..AnalysisOptions::default()
}),
..Options::default()
});
@@ -221,7 +226,7 @@ fn setup_micro_case(code: &str) -> Case {
let file_path = "src/test.py";
fs.write_file_all(
SystemPathBuf::from(file_path),
ruff_python_trivia::textwrap::dedent(code),
&*ruff_python_trivia::textwrap::dedent(code),
)
.unwrap();
@@ -755,7 +760,7 @@ fn datetype(criterion: &mut Criterion) {
max_dep_date: "2025-07-04",
python_version: PythonVersion::PY313,
},
2,
4,
);
bench_project(&benchmark, criterion);

View File

@@ -71,6 +71,8 @@ impl Display for Benchmark<'_> {
}
}
#[track_caller]
#[expect(clippy::cast_precision_loss)]
fn check_project(db: &ProjectDatabase, project_name: &str, max_diagnostics: usize) {
let result = db.check();
let diagnostics = result.len();
@@ -79,6 +81,12 @@ fn check_project(db: &ProjectDatabase, project_name: &str, max_diagnostics: usiz
diagnostics > 1 && diagnostics <= max_diagnostics,
"Expected between 1 and {max_diagnostics} diagnostics on project '{project_name}' but got {diagnostics}",
);
if (max_diagnostics - diagnostics) as f64 / max_diagnostics as f64 > 0.10 {
tracing::warn!(
"The expected diagnostics for project `{project_name}` can be reduced: expected {max_diagnostics} but got {diagnostics}"
);
}
}
static ALTAIR: Benchmark = Benchmark::new(
@@ -101,7 +109,7 @@ static ALTAIR: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
1000,
850,
);
static COLOUR_SCIENCE: Benchmark = Benchmark::new(
@@ -120,7 +128,7 @@ static COLOUR_SCIENCE: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY310,
},
1070,
350,
);
static FREQTRADE: Benchmark = Benchmark::new(
@@ -163,7 +171,7 @@ static PANDAS: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
4000,
3800,
);
static PYDANTIC: Benchmark = Benchmark::new(
@@ -181,7 +189,7 @@ static PYDANTIC: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY39,
},
7000,
3200,
);
static SYMPY: Benchmark = Benchmark::new(
@@ -194,7 +202,7 @@ static SYMPY: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
13116,
13400,
);
static TANJUN: Benchmark = Benchmark::new(
@@ -207,7 +215,7 @@ static TANJUN: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
320,
110,
);
static STATIC_FRAME: Benchmark = Benchmark::new(
@@ -223,7 +231,7 @@ static STATIC_FRAME: Benchmark = Benchmark::new(
max_dep_date: "2025-08-09",
python_version: PythonVersion::PY311,
},
1100,
1700,
);
#[track_caller]

View File

@@ -1,3 +1,4 @@
use std::fmt::Formatter;
use std::sync::Arc;
use std::sync::atomic::AtomicBool;
@@ -49,3 +50,15 @@ impl CancellationToken {
self.cancelled.load(std::sync::atomic::Ordering::Relaxed)
}
}
/// The operation was canceled by the provided [`CancellationToken`].
#[derive(Debug)]
pub struct Canceled;
impl std::error::Error for Canceled {}
impl std::fmt::Display for Canceled {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
f.write_str("operation was canceled")
}
}
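
The new `Canceled` error type pairs with the existing token so long-running operations can bail out with a proper `std::error::Error`. A self-contained sketch of how a caller might combine the two (the `CancellationToken` here is a simplified stand-in mirroring the API in the hunk, and `check_files` is a hypothetical helper, not part of the source):

```rust
use std::fmt::{self, Formatter};
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};

// Simplified stand-in for the token type in the diff.
#[derive(Clone, Default)]
struct CancellationToken {
    cancelled: Arc<AtomicBool>,
}

impl CancellationToken {
    fn cancel(&self) {
        self.cancelled.store(true, Ordering::Relaxed);
    }
    fn is_cancelled(&self) -> bool {
        self.cancelled.load(Ordering::Relaxed)
    }
}

/// The operation was canceled by the provided `CancellationToken`.
#[derive(Debug)]
struct Canceled;

impl std::error::Error for Canceled {}

impl fmt::Display for Canceled {
    fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
        f.write_str("operation was canceled")
    }
}

// A hypothetical long-running operation that checks the token between units
// of work and returns `Err(Canceled)` once cancellation is requested.
fn check_files(token: &CancellationToken, files: &[&str]) -> Result<usize, Canceled> {
    let mut checked = 0;
    for _file in files {
        if token.is_cancelled() {
            return Err(Canceled);
        }
        checked += 1;
    }
    Ok(checked)
}

fn main() {
    let token = CancellationToken::default();
    assert_eq!(check_files(&token, &["a.py", "b.py"]).unwrap(), 2);
    token.cancel();
    assert!(check_files(&token, &["a.py"]).is_err());
}
```

Returning a dedicated unit error, rather than a bare `bool`, lets cancellation propagate through `?` and compose with other error types.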

View File

@@ -98,6 +98,44 @@ impl Diagnostic {
diag
}
/// Adds sub-diagnostics that tell the user that this is a bug in ty
/// and ask them to open an issue on GitHub.
pub fn add_bug_sub_diagnostics(&mut self, url_encoded_title: &str) {
self.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
"This indicates a bug in ty.",
));
self.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
format_args!(
"If you could open an issue at https://github.com/astral-sh/ty/issues/new?title={url_encoded_title}, we'd be very appreciative!"
),
));
self.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
format!(
"Platform: {os} {arch}",
os = std::env::consts::OS,
arch = std::env::consts::ARCH
),
));
if let Some(version) = crate::program_version() {
self.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
format!("Version: {version}"),
));
}
self.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
format!(
"Args: {args:?}",
args = std::env::args().collect::<Vec<_>>()
),
));
}
/// Add an annotation to this diagnostic.
///
/// Annotations for a diagnostic are optional, but if any are added,
@@ -1019,6 +1057,13 @@ impl DiagnosticId {
matches!(self, DiagnosticId::Lint(_))
}
pub const fn as_lint(&self) -> Option<LintName> {
match self {
DiagnosticId::Lint(name) => Some(*name),
_ => None,
}
}
/// Returns `true` if this `DiagnosticId` represents a lint with the given name.
pub fn is_lint_named(&self, name: &str) -> bool {
matches!(self, DiagnosticId::Lint(self_name) if self_name == name)


@@ -14,6 +14,7 @@ use crate::diagnostic::{Span, UnifiedFile};
use crate::file_revision::FileRevision;
use crate::files::file_root::FileRoots;
use crate::files::private::FileStatus;
use crate::source::SourceText;
use crate::system::{SystemPath, SystemPathBuf, SystemVirtualPath, SystemVirtualPathBuf};
use crate::vendored::{VendoredPath, VendoredPathBuf};
use crate::{Db, FxDashMap, vendored};
@@ -323,6 +324,17 @@ pub struct File {
/// the file has been deleted is to change the status to `Deleted`.
#[default]
status: FileStatus,
/// Overrides the result of [`source_text`](crate::source::source_text).
///
/// This is useful when running queries after modifying a file's content but
/// before the content is written to disk. For example, to verify that the applied fixes
/// didn't introduce any new errors.
///
/// The override gets automatically removed the next time the file changes.
#[default]
#[returns(ref)]
pub source_text_override: Option<SourceText>,
}
// The Salsa heap is tracked separately.
@@ -444,20 +456,28 @@ impl File {
_ => (FileStatus::NotFound, FileRevision::zero(), None),
};
let mut clear_override = false;
if file.status(db) != status {
tracing::debug!("Updating the status of `{}`", file.path(db));
file.set_status(db).to(status);
clear_override = true;
}
if file.revision(db) != revision {
tracing::debug!("Updating the revision of `{}`", file.path(db));
file.set_revision(db).to(revision);
clear_override = true;
}
if file.permissions(db) != permission {
tracing::debug!("Updating the permissions of `{}`", file.path(db));
file.set_permissions(db).to(permission);
}
if clear_override && file.source_text_override(db).is_some() {
file.set_source_text_override(db).to(None);
}
}
/// Returns `true` if the file exists.


@@ -85,6 +85,13 @@ pub fn max_parallelism() -> NonZeroUsize {
})
}
// Use a reasonably large stack size to avoid running into stack overflows too easily. The
// size was chosen in such a way as to still be able to handle large expressions involving
// binary operators (x + x + … + x) both during the AST walk in semantic index building as
// well as during type checking. Using this stack size, we can handle expressions
// that are several times larger than the corresponding limits in existing type checkers.
pub const STACK_SIZE: usize = 16 * 1024 * 1024;
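A constant like `STACK_SIZE` is typically consumed by handing it to the worker-thread setup. A std-only sketch under that assumption (how this crate actually wires it up is not shown in the diff):

```rust
use std::thread;

// Same value as the constant above: 16 MiB.
const STACK_SIZE: usize = 16 * 1024 * 1024;

fn main() {
    // Spawn a worker with an enlarged stack so deep recursion (e.g. walking a
    // long `x + x + … + x` expression chain) doesn't overflow.
    let handle = thread::Builder::new()
        .name("worker".to_string())
        .stack_size(STACK_SIZE)
        .spawn(|| {
            fn depth(n: u32) -> u32 {
                if n == 0 { 0 } else { 1 + depth(n - 1) }
            }
            depth(10_000)
        })
        .expect("failed to spawn thread");

    assert_eq!(handle.join().unwrap(), 10_000);
}
```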
/// Trait for types that can provide Rust documentation.
///
/// Use `derive(RustDoc)` to automatically implement this trait for types that have a static string documentation.


@@ -1,6 +1,8 @@
use std::borrow::Cow;
use std::ops::Deref;
use std::sync::Arc;
use ruff_diagnostics::SourceMap;
use ruff_notebook::Notebook;
use ruff_python_ast::PySourceType;
use ruff_source_file::LineIndex;
@@ -16,6 +18,10 @@ pub fn source_text(db: &dyn Db, file: File) -> SourceText {
let _span = tracing::trace_span!("source_text", file = %path).entered();
let mut read_error = None;
if let Some(source) = file.source_text_override(db) {
return source.clone();
}
let kind = if is_notebook(db.system(), path) {
file.read_to_notebook(db)
.unwrap_or_else(|error| {
@@ -90,6 +96,45 @@ impl SourceText {
pub fn read_error(&self) -> Option<&SourceTextError> {
self.inner.read_error.as_ref()
}
/// Returns a new instance for this file with the updated source text (Python code).
///
/// Uses the `source_map` to preserve the cell boundaries.
#[must_use]
pub fn with_text(&self, new_text: String, source_map: &SourceMap) -> Self {
let new_kind = match &self.inner.kind {
SourceTextKind::Text(_) => SourceTextKind::Text(new_text),
SourceTextKind::Notebook { notebook } => {
let mut new_notebook = notebook.as_ref().clone();
new_notebook.update(source_map, new_text);
SourceTextKind::Notebook {
notebook: new_notebook.into(),
}
}
};
Self {
inner: Arc::new(SourceTextInner {
kind: new_kind,
read_error: self.inner.read_error.clone(),
}),
}
}
pub fn to_bytes(&self) -> Cow<'_, [u8]> {
match &self.inner.kind {
SourceTextKind::Text(source) => Cow::Borrowed(source.as_bytes()),
SourceTextKind::Notebook { notebook } => {
let mut output: Vec<u8> = Vec::new();
notebook
.write(&mut output)
.expect("writing to a Vec should never fail");
Cow::Owned(output)
}
}
}
}
impl Deref for SourceText {
@@ -117,13 +162,13 @@ impl std::fmt::Debug for SourceText {
}
}
#[derive(Eq, PartialEq, get_size2::GetSize)]
#[derive(Eq, PartialEq, get_size2::GetSize, Clone)]
struct SourceTextInner {
kind: SourceTextKind,
read_error: Option<SourceTextError>,
}
#[derive(Eq, PartialEq, get_size2::GetSize)]
#[derive(Eq, PartialEq, get_size2::GetSize, Clone)]
enum SourceTextKind {
Text(String),
Notebook {


@@ -271,7 +271,12 @@ pub trait WritableSystem: System {
fn create_new_file(&self, path: &SystemPath) -> Result<()>;
/// Writes the given content to the file at the given path.
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()>;
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.write_file_bytes(path, content.as_bytes())
}
/// Writes the given content to the file at the given path.
fn write_file_bytes(&self, path: &SystemPath, content: &[u8]) -> Result<()>;
/// Creates a directory at `path` as well as any intermediate directories.
fn create_directory_all(&self, path: &SystemPath) -> Result<()>;
@@ -311,6 +316,8 @@ pub trait WritableSystem: System {
Ok(Some(cache_path))
}
fn dyn_clone(&self) -> Box<dyn WritableSystem>;
}
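The hunk above converts `write_file` from a required method into a provided one that delegates to a new byte-level `write_file_bytes`, so implementors only supply the bytes path. A self-contained sketch of that trait-default pattern (the `Writable`/`Sink` names here are illustrative, not from the crate):

```rust
use std::cell::RefCell;
use std::io;

// The `&str` overload becomes a provided method delegating to a single
// required byte-level method.
trait Writable {
    fn write_bytes(&self, content: &[u8]) -> io::Result<()>;

    fn write_str(&self, content: &str) -> io::Result<()> {
        self.write_bytes(content.as_bytes())
    }
}

// Toy implementor: appends everything into an in-memory buffer.
struct Sink(RefCell<Vec<u8>>);

impl Writable for Sink {
    fn write_bytes(&self, content: &[u8]) -> io::Result<()> {
        self.0.borrow_mut().extend_from_slice(content);
        Ok(())
    }
}

fn main() {
    let sink = Sink(RefCell::new(Vec::new()));
    // Uses the provided default, which routes through `write_bytes`.
    sink.write_str("abc").unwrap();
    assert_eq!(sink.0.borrow().clone(), b"abc".to_vec());
}
```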
#[derive(Clone, Debug, Eq, PartialEq)]


@@ -122,7 +122,9 @@ impl MemoryFileSystem {
let entry = by_path.get(&normalized).ok_or_else(not_found)?;
match entry {
Entry::File(file) => Ok(file.content.clone()),
Entry::File(file) => {
String::from_utf8(file.content.to_vec()).map_err(|_| invalid_utf8())
}
Entry::Directory(_) => Err(is_a_directory()),
}
}
@@ -139,7 +141,7 @@ impl MemoryFileSystem {
.get(&path.as_ref().to_path_buf())
.ok_or_else(not_found)?;
Ok(file.content.clone())
String::from_utf8(file.content.to_vec()).map_err(|_| invalid_utf8())
}
pub fn exists(&self, path: &SystemPath) -> bool {
@@ -161,7 +163,7 @@ impl MemoryFileSystem {
match by_path.entry(normalized) {
btree_map::Entry::Vacant(entry) => {
entry.insert(Entry::File(File {
content: String::new(),
content: Box::default(),
last_modified: file_time_now(),
}));
@@ -177,13 +179,17 @@ impl MemoryFileSystem {
/// Stores a new file in the file system.
///
/// The operation overrides the content for an existing file with the same normalized `path`.
pub fn write_file(&self, path: impl AsRef<SystemPath>, content: impl ToString) -> Result<()> {
pub fn write_file(
&self,
path: impl AsRef<SystemPath>,
content: impl AsRef<[u8]>,
) -> Result<()> {
let mut by_path = self.inner.by_path.write().unwrap();
let normalized = self.normalize_path(path.as_ref());
let file = get_or_create_file(&mut by_path, &normalized)?;
file.content = content.to_string();
file.content = content.as_ref().to_vec().into_boxed_slice();
file.last_modified = file_time_now();
Ok(())
@@ -214,7 +220,7 @@ impl MemoryFileSystem {
pub fn write_file_all(
&self,
path: impl AsRef<SystemPath>,
content: impl ToString,
content: impl AsRef<[u8]>,
) -> Result<()> {
let path = path.as_ref();
@@ -228,19 +234,24 @@ impl MemoryFileSystem {
/// Stores a new virtual file in the file system.
///
/// The operation overrides the content for an existing virtual file with the same `path`.
pub fn write_virtual_file(&self, path: impl AsRef<SystemVirtualPath>, content: impl ToString) {
pub fn write_virtual_file(
&self,
path: impl AsRef<SystemVirtualPath>,
content: impl AsRef<[u8]>,
) {
let path = path.as_ref();
let mut virtual_files = self.inner.virtual_files.write().unwrap();
let content = content.as_ref().to_vec().into_boxed_slice();
match virtual_files.entry(path.to_path_buf()) {
std::collections::hash_map::Entry::Vacant(entry) => {
entry.insert(File {
content: content.to_string(),
content,
last_modified: file_time_now(),
});
}
std::collections::hash_map::Entry::Occupied(mut entry) => {
entry.get_mut().content = content.to_string();
entry.get_mut().content = content;
}
}
}
@@ -468,7 +479,7 @@ impl Entry {
#[derive(Debug)]
struct File {
content: String,
content: Box<[u8]>,
last_modified: FileTime,
}
@@ -497,6 +508,13 @@ fn directory_not_empty() -> std::io::Error {
std::io::Error::other("directory not empty")
}
fn invalid_utf8() -> std::io::Error {
std::io::Error::new(
std::io::ErrorKind::InvalidData,
"stream did not contain valid UTF-8",
)
}
fn create_dir_all(
paths: &mut RwLockWriteGuard<BTreeMap<Utf8PathBuf, Entry>>,
normalized: &Utf8Path,
@@ -533,7 +551,7 @@ fn get_or_create_file<'a>(
let entry = paths.entry(normalized.to_path_buf()).or_insert_with(|| {
Entry::File(File {
content: String::new(),
content: Box::default(),
last_modified: file_time_now(),
})
});
@@ -844,7 +862,7 @@ mod tests {
let fs = with_files(["c.py"]);
let error = fs
.write_file(SystemPath::new("a/b.py"), "content".to_string())
.write_file(SystemPath::new("a/b.py"), "content")
.unwrap_err();
assert_eq!(error.kind(), ErrorKind::NotFound);
@@ -855,7 +873,7 @@ mod tests {
let fs = with_files(["a/b.py"]);
let error = fs
.write_file_all(SystemPath::new("a/b.py/c"), "content".to_string())
.write_file_all(SystemPath::new("a/b.py/c"), "content")
.unwrap_err();
assert_eq!(error.kind(), ErrorKind::Other);
@@ -878,7 +896,7 @@ mod tests {
let fs = MemoryFileSystem::new();
let path = SystemPath::new("a.py");
fs.write_file_all(path, "Test content".to_string())?;
fs.write_file_all(path, "Test content")?;
assert_eq!(fs.read_to_string(path)?, "Test content");
@@ -915,9 +933,7 @@ mod tests {
fs.create_directory_all("a")?;
let error = fs
.write_file(SystemPath::new("a"), "content".to_string())
.unwrap_err();
let error = fs.write_file(SystemPath::new("a"), "content").unwrap_err();
assert_eq!(error.kind(), ErrorKind::Other);
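The switch from `String` to `Box<[u8]>` content means `read_to_string` must now validate UTF-8 itself, mirroring `std::fs::read_to_string`. A std-only sketch of the conversion the `invalid_utf8()` helper above supports (the free function here is a simplified stand-in for the method on `MemoryFileSystem`):

```rust
use std::io;

// The in-memory file stores raw bytes; reading it as a string validates UTF-8
// and maps failure to an `InvalidData` I/O error, like `std::fs::read_to_string`.
fn read_to_string(content: &[u8]) -> io::Result<String> {
    String::from_utf8(content.to_vec()).map_err(|_| {
        io::Error::new(
            io::ErrorKind::InvalidData,
            "stream did not contain valid UTF-8",
        )
    })
}

fn main() {
    assert_eq!(read_to_string(b"hello").unwrap(), "hello");

    // 0xFF is never valid in UTF-8, so this must fail with `InvalidData`.
    let err = read_to_string(&[0xff, 0xfe]).unwrap_err();
    assert_eq!(err.kind(), io::ErrorKind::InvalidData);
}
```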


@@ -361,13 +361,17 @@ impl WritableSystem for OsSystem {
std::fs::File::create_new(path).map(drop)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
fn write_file_bytes(&self, path: &SystemPath, content: &[u8]) -> Result<()> {
std::fs::write(path.as_std_path(), content)
}
fn create_directory_all(&self, path: &SystemPath) -> Result<()> {
std::fs::create_dir_all(path.as_std_path())
}
fn dyn_clone(&self) -> Box<dyn WritableSystem> {
Box::new(self.clone())
}
}
impl Default for OsSystem {


@@ -205,13 +205,17 @@ impl WritableSystem for TestSystem {
self.system().create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.system().write_file(path, content)
fn write_file_bytes(&self, path: &SystemPath, content: &[u8]) -> Result<()> {
self.system().write_file_bytes(path, content)
}
fn create_directory_all(&self, path: &SystemPath) -> Result<()> {
self.system().create_directory_all(path)
}
fn dyn_clone(&self) -> Box<dyn WritableSystem> {
Box::new(self.clone())
}
}
/// Extension trait for databases that use a [`WritableSystem`].
@@ -283,7 +287,11 @@ pub trait DbWithTestSystem: Db + Sized {
///
/// ## Panics
/// If the db isn't using the [`InMemorySystem`].
fn write_virtual_file(&mut self, path: impl AsRef<SystemVirtualPath>, content: impl ToString) {
fn write_virtual_file(
&mut self,
path: impl AsRef<SystemVirtualPath>,
content: impl AsRef<[u8]>,
) {
let path = path.as_ref();
self.test_system()
.memory_file_system()
@@ -322,23 +330,23 @@ where
}
}
#[derive(Default, Debug)]
#[derive(Clone, Default, Debug)]
pub struct InMemorySystem {
user_config_directory: Mutex<Option<SystemPathBuf>>,
user_config_directory: Arc<Mutex<Option<SystemPathBuf>>>,
memory_fs: MemoryFileSystem,
}
impl InMemorySystem {
pub fn new(cwd: SystemPathBuf) -> Self {
Self {
user_config_directory: Mutex::new(None),
user_config_directory: Mutex::new(None).into(),
memory_fs: MemoryFileSystem::with_current_directory(cwd),
}
}
pub fn from_memory_fs(memory_fs: MemoryFileSystem) -> Self {
Self {
user_config_directory: Mutex::new(None),
user_config_directory: Mutex::new(None).into(),
memory_fs,
}
}
@@ -440,10 +448,7 @@ impl System for InMemorySystem {
}
fn dyn_clone(&self) -> Box<dyn System> {
Box::new(Self {
user_config_directory: Mutex::new(self.user_config_directory.lock().unwrap().clone()),
memory_fs: self.memory_fs.clone(),
})
Box::new(self.clone())
}
}
@@ -452,11 +457,15 @@ impl WritableSystem for InMemorySystem {
self.memory_fs.create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
fn write_file_bytes(&self, path: &SystemPath, content: &[u8]) -> Result<()> {
self.memory_fs.write_file(path, content)
}
fn create_directory_all(&self, path: &SystemPath) -> Result<()> {
self.memory_fs.create_directory_all(path)
}
fn dyn_clone(&self) -> Box<dyn WritableSystem> {
Box::new(self.clone())
}
}


@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.14.10"
version = "0.14.11"
publish = false
authors = { workspace = true }
edition = { workspace = true }


@@ -26,6 +26,7 @@ use crate::doc_lines::{doc_lines_from_ast, doc_lines_from_tokens};
use crate::fix::{FixResult, fix_file};
use crate::noqa::add_noqa;
use crate::package::PackageRoot;
use crate::preview::is_py315_support_enabled;
use crate::registry::Rule;
#[cfg(any(feature = "test-rules", test))]
use crate::rules::ruff::rules::test_rules::{self, TEST_RULES, TestRule};
@@ -33,7 +34,7 @@ use crate::settings::types::UnsafeFixes;
use crate::settings::{LinterSettings, TargetVersion, flags};
use crate::source_kind::SourceKind;
use crate::suppression::Suppressions;
use crate::{Locator, directives, fs};
use crate::{Locator, directives, fs, warn_user_once};
pub(crate) mod float;
@@ -404,8 +405,7 @@ pub fn add_noqa_to_path(
);
// Parse range suppression comments
let suppressions =
Suppressions::from_tokens(settings, locator.contents(), parsed.tokens(), &indexer);
let suppressions = Suppressions::from_tokens(settings, locator.contents(), parsed.tokens());
// Generate diagnostics, ignoring any existing `noqa` directives.
let diagnostics = check_path(
@@ -451,6 +451,14 @@ pub fn lint_only(
) -> LinterResult {
let target_version = settings.resolve_target_version(path);
if matches!(target_version.linter_version(), PythonVersion::PY315)
&& !is_py315_support_enabled(settings)
{
warn_user_once!(
"Support for Python 3.15 is under development and may be unstable. Enable `preview` to remove this warning."
);
}
let parsed = source.into_parsed(source_kind, source_type, target_version.parser_version());
// Map row and column locations to byte slices (lazily).
@@ -471,8 +479,7 @@ pub fn lint_only(
);
// Parse range suppression comments
let suppressions =
Suppressions::from_tokens(settings, locator.contents(), parsed.tokens(), &indexer);
let suppressions = Suppressions::from_tokens(settings, locator.contents(), parsed.tokens());
// Generate diagnostics.
let diagnostics = check_path(
@@ -557,6 +564,14 @@ pub fn lint_fix<'a>(
let target_version = settings.resolve_target_version(path);
if matches!(target_version.linter_version(), PythonVersion::PY315)
&& !is_py315_support_enabled(settings)
{
warn_user_once!(
"Support for Python 3.15 is under development and may be unstable. Enable `preview` to remove this warning."
);
}
// Continuously fix until the source code stabilizes.
loop {
// Parse once.
@@ -581,8 +596,7 @@ pub fn lint_fix<'a>(
);
// Parse range suppression comments
let suppressions =
Suppressions::from_tokens(settings, locator.contents(), parsed.tokens(), &indexer);
let suppressions = Suppressions::from_tokens(settings, locator.contents(), parsed.tokens());
// Generate diagnostics.
let diagnostics = check_path(
@@ -964,8 +978,7 @@ mod tests {
&locator,
&indexer,
);
let suppressions =
Suppressions::from_tokens(settings, locator.contents(), parsed.tokens(), &indexer);
let suppressions = Suppressions::from_tokens(settings, locator.contents(), parsed.tokens());
let mut diagnostics = check_path(
path,
None,


@@ -296,3 +296,8 @@ pub(crate) const fn is_s310_resolve_string_literal_bindings_enabled(
pub(crate) const fn is_range_suppressions_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}
// https://github.com/astral-sh/ruff/pull/22419
pub(crate) const fn is_py315_support_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}


@@ -957,7 +957,7 @@ mod tests {
&indexer,
);
let suppressions =
Suppressions::from_tokens(&settings, locator.contents(), parsed.tokens(), &indexer);
Suppressions::from_tokens(&settings, locator.contents(), parsed.tokens());
let mut messages = check_path(
Path::new("<filename>"),
None,


@@ -52,6 +52,7 @@ impl InvalidRuleCodeKind {
pub(crate) struct InvalidRuleCode {
pub(crate) rule_code: String,
pub(crate) kind: InvalidRuleCodeKind,
pub(crate) whole_comment: bool,
}
impl AlwaysFixableViolation for InvalidRuleCode {
@@ -65,7 +66,11 @@ impl AlwaysFixableViolation for InvalidRuleCode {
}
fn fix_title(&self) -> String {
"Remove the rule code".to_string()
if self.whole_comment {
format!("Remove the {} comment", self.kind.as_str())
} else {
format!("Remove the rule code `{}`", self.rule_code)
}
}
}
@@ -122,6 +127,7 @@ fn all_codes_invalid_diagnostic(
.collect::<Vec<_>>()
.join(", "),
kind: InvalidRuleCodeKind::Noqa,
whole_comment: true,
},
directive.range(),
)
@@ -139,6 +145,7 @@ fn some_codes_are_invalid_diagnostic(
InvalidRuleCode {
rule_code: invalid_code.to_string(),
kind: InvalidRuleCodeKind::Noqa,
whole_comment: false,
},
invalid_code.range(),
)


@@ -52,6 +52,25 @@ impl UnusedNOQAKind {
/// foo.bar()
/// ```
///
/// ## Conflict with other linters
/// When using `RUF100` with the `--fix` option, Ruff may remove trailing comments
/// that follow a `# noqa` directive on the same line, as it interprets the
/// remainder of the line as a description for the suppression.
///
/// To prevent Ruff from removing suppressions for other tools (like `pylint`
/// or `mypy`), separate them with a second `#` character:
///
/// ```python
/// # Bad: Ruff --fix will remove the pylint comment
/// def visit_ImportFrom(self, node): # noqa: N802, pylint: disable=invalid-name
/// pass
///
///
/// # Good: Ruff will preserve the pylint comment
/// def visit_ImportFrom(self, node): # noqa: N802 # pylint: disable=invalid-name
/// pass
/// ```
///
/// ## Options
/// - `lint.external`
///


@@ -10,7 +10,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
3 | # External code
4 | import re # noqa: V123
|
help: Remove the rule code
help: Remove the `# noqa` comment
1 | # Invalid code
- import os # noqa: INVALID123
2 + import os
@@ -28,7 +28,7 @@ RUF102 [*] Invalid rule code in `# noqa`: V123
5 | # Valid noqa
6 | import sys # noqa: E402
|
help: Remove the rule code
help: Remove the `# noqa` comment
1 | # Invalid code
2 | import os # noqa: INVALID123
3 | # External code
@@ -48,7 +48,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
|
help: Remove the rule code
help: Remove the rule code `INVALID456`
4 | import re # noqa: V123
5 | # Valid noqa
6 | import sys # noqa: E402
@@ -68,7 +68,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID789
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
|
help: Remove the rule code
help: Remove the `# noqa` comment
5 | # Valid noqa
6 | import sys # noqa: E402
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
@@ -88,7 +88,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID000
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
|
help: Remove the rule code
help: Remove the `# noqa` comment
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
@@ -108,7 +108,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
|
help: Remove the rule code
help: Remove the rule code `INVALID123`
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
11 | # Mixed valid and invalid
@@ -128,7 +128,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID100
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
help: Remove the rule code
help: Remove the rule code `INVALID100`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -148,7 +148,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID200
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
help: Remove the rule code
help: Remove the rule code `INVALID200`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -168,7 +168,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID300
17 | # Test for mixed code types
18 | import json # noqa: E402, INVALID400, V100
|
help: Remove the rule code
help: Remove the rule code `INVALID300`
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
15 | # Test for preserving valid codes when fixing
@@ -188,7 +188,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID400
19 | # Test for rule redirects
20 | import pandas as pd # noqa: TCH002
|
help: Remove the rule code
help: Remove the rule code `INVALID400`
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
17 | # Test for mixed code types
@@ -207,7 +207,7 @@ RUF102 [*] Invalid rule code in `# noqa`: V100
19 | # Test for rule redirects
20 | import pandas as pd # noqa: TCH002
|
help: Remove the rule code
help: Remove the rule code `V100`
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
17 | # Test for mixed code types


@@ -10,7 +10,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
3 | # External code
4 | import re # noqa: V123
|
help: Remove the rule code
help: Remove the `# noqa` comment
1 | # Invalid code
- import os # noqa: INVALID123
2 + import os
@@ -28,7 +28,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
|
help: Remove the rule code
help: Remove the rule code `INVALID456`
4 | import re # noqa: V123
5 | # Valid noqa
6 | import sys # noqa: E402
@@ -48,7 +48,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID789
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
|
help: Remove the rule code
help: Remove the `# noqa` comment
5 | # Valid noqa
6 | import sys # noqa: E402
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
@@ -68,7 +68,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID000
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
|
help: Remove the rule code
help: Remove the `# noqa` comment
7 | from functools import cache # Preceeding comment # noqa: F401, INVALID456
8 | from itertools import product # Preceeding comment # noqa: INVALID789
9 | # Succeeding comment
@@ -88,7 +88,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID123
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
|
help: Remove the rule code
help: Remove the rule code `INVALID123`
9 | # Succeeding comment
10 | import math # noqa: INVALID000 # Succeeding comment
11 | # Mixed valid and invalid
@@ -108,7 +108,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID100
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
help: Remove the rule code
help: Remove the rule code `INVALID100`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -128,7 +128,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID200
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
|
help: Remove the rule code
help: Remove the rule code `INVALID200`
11 | # Mixed valid and invalid
12 | from typing import List # noqa: F401, INVALID123
13 | # Test for multiple invalid
@@ -148,7 +148,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID300
17 | # Test for mixed code types
18 | import json # noqa: E402, INVALID400, V100
|
help: Remove the rule code
help: Remove the rule code `INVALID300`
13 | # Test for multiple invalid
14 | from collections import defaultdict # noqa: INVALID100, INVALID200, F401
15 | # Test for preserving valid codes when fixing
@@ -168,7 +168,7 @@ RUF102 [*] Invalid rule code in `# noqa`: INVALID400
19 | # Test for rule redirects
20 | import pandas as pd # noqa: TCH002
|
help: Remove the rule code
help: Remove the rule code `INVALID400`
15 | # Test for preserving valid codes when fixing
16 | from itertools import chain # noqa: E402, INVALID300, F401
17 | # Test for mixed code types


@@ -7,7 +7,7 @@ source: crates/ruff_linter/src/rules/ruff/mod.rs
--- Summary ---
Removed: 15
Added: 23
Added: 20
--- Removed ---
E741 Ambiguous variable name: `I`
@@ -301,6 +301,7 @@ RUF100 [*] Unused suppression (non-enabled: `E501`)
| ^^^^^^^^^^^^^^^^^^^^^
47 | I = 1
48 | # ruff: enable[E501]
| --------------------
|
help: Remove unused suppression
43 | def f():
@@ -308,26 +309,10 @@ help: Remove unused suppression
45 | # logged to user
- # ruff: disable[E501]
46 | I = 1
47 | # ruff: enable[E501]
48 |
RUF100 [*] Unused suppression (non-enabled: `E501`)
--> suppressions.py:48:5
|
46 | # ruff: disable[E501]
47 | I = 1
48 | # ruff: enable[E501]
| ^^^^^^^^^^^^^^^^^^^^
|
help: Remove unused suppression
45 | # logged to user
46 | # ruff: disable[E501]
47 | I = 1
- # ruff: enable[E501]
47 |
48 |
49 |
50 | def f():
49 | def f():
RUF100 [*] Unused `noqa` directive (unused: `E741`, `F841`)
@@ -563,8 +548,11 @@ RUF102 [*] Invalid rule code in suppression: YF829
| ^^^^^
94 | # ruff: disable[F841, RQW320]
95 | value = 0
96 | # ruff: enable[F841, RQW320]
97 | # ruff: enable[YF829]
| -----
|
help: Remove the rule code
help: Remove the suppression comment
90 |
91 | def f():
92 | # Unknown rule codes
@@ -572,6 +560,10 @@ help: Remove the rule code
93 | # ruff: disable[F841, RQW320]
94 | value = 0
95 | # ruff: enable[F841, RQW320]
- # ruff: enable[YF829]
96 |
97 |
98 | def f():
RUF102 [*] Invalid rule code in suppression: RQW320
@@ -583,30 +575,15 @@ RUF102 [*] Invalid rule code in suppression: RQW320
| ^^^^^^
95 | value = 0
96 | # ruff: enable[F841, RQW320]
| ------
97 | # ruff: enable[YF829]
|
help: Remove the rule code
help: Remove the rule code `RQW320`
91 | def f():
92 | # Unknown rule codes
93 | # ruff: disable[YF829]
- # ruff: disable[F841, RQW320]
94 + # ruff: disable[F841]
95 | value = 0
96 | # ruff: enable[F841, RQW320]
97 | # ruff: enable[YF829]
RUF102 [*] Invalid rule code in suppression: RQW320
--> suppressions.py:96:26
|
94 | # ruff: disable[F841, RQW320]
95 | value = 0
96 | # ruff: enable[F841, RQW320]
| ^^^^^^
97 | # ruff: enable[YF829]
|
help: Remove the rule code
93 | # ruff: disable[YF829]
94 | # ruff: disable[F841, RQW320]
95 | value = 0
- # ruff: enable[F841, RQW320]
96 + # ruff: enable[F841]
@@ -615,24 +592,6 @@ help: Remove the rule code
99 |
RUF102 [*] Invalid rule code in suppression: YF829
--> suppressions.py:97:20
|
95 | value = 0
96 | # ruff: enable[F841, RQW320]
97 | # ruff: enable[YF829]
| ^^^^^
|
help: Remove the rule code
94 | # ruff: disable[F841, RQW320]
95 | value = 0
96 | # ruff: enable[F841, RQW320]
- # ruff: enable[YF829]
97 |
98 |
99 | def f():
RUF103 [*] Invalid suppression comment: missing suppression codes like `[E501, ...]`
--> suppressions.py:109:5
|


@@ -36,6 +36,7 @@ pub enum PythonVersion {
Py312,
Py313,
Py314,
Py315,
}
impl Default for PythonVersion {
@@ -58,6 +59,7 @@ impl TryFrom<ast::PythonVersion> for PythonVersion {
ast::PythonVersion::PY312 => Ok(Self::Py312),
ast::PythonVersion::PY313 => Ok(Self::Py313),
ast::PythonVersion::PY314 => Ok(Self::Py314),
ast::PythonVersion::PY315 => Ok(Self::Py315),
_ => Err(format!("unrecognized python version {value}")),
}
}
@@ -88,6 +90,7 @@ impl PythonVersion {
Self::Py312 => (3, 12),
Self::Py313 => (3, 13),
Self::Py314 => (3, 14),
Self::Py315 => (3, 15),
}
}
}
@@ -604,13 +607,21 @@ impl TryFrom<String> for RequiredVersion {
type Error = pep440_rs::VersionSpecifiersParseError;
fn try_from(value: String) -> Result<Self, Self::Error> {
value.parse()
}
}
impl FromStr for RequiredVersion {
type Err = pep440_rs::VersionSpecifiersParseError;
fn from_str(value: &str) -> Result<Self, Self::Err> {
// Treat `0.3.1` as `==0.3.1`, for backwards compatibility.
if let Ok(version) = pep440_rs::Version::from_str(&value) {
if let Ok(version) = pep440_rs::Version::from_str(value) {
Ok(Self(VersionSpecifiers::from(
VersionSpecifier::equals_version(version),
)))
} else {
Ok(Self(VersionSpecifiers::from_str(&value)?))
Ok(Self(VersionSpecifiers::from_str(value)?))
}
}
}

File diff suppressed because it is too large


@@ -235,8 +235,7 @@ pub(crate) fn test_contents<'a>(
&locator,
&indexer,
);
let suppressions =
Suppressions::from_tokens(settings, locator.contents(), parsed.tokens(), &indexer);
let suppressions = Suppressions::from_tokens(settings, locator.contents(), parsed.tokens());
let messages = check_path(
path,
path.parent()
@@ -304,7 +303,7 @@ pub(crate) fn test_contents<'a>(
);
let suppressions =
Suppressions::from_tokens(settings, locator.contents(), parsed.tokens(), &indexer);
Suppressions::from_tokens(settings, locator.contents(), parsed.tokens());
let fixed_messages = check_path(
path,
None,


@@ -35,6 +35,10 @@ impl PythonVersion {
major: 3,
minor: 14,
};
pub const PY315: PythonVersion = PythonVersion {
major: 3,
minor: 15,
};
pub fn iter() -> impl Iterator<Item = PythonVersion> {
[
@@ -46,6 +50,7 @@ impl PythonVersion {
PythonVersion::PY312,
PythonVersion::PY313,
PythonVersion::PY314,
PythonVersion::PY315,
]
.into_iter()
}
@@ -61,7 +66,7 @@ impl PythonVersion {
/// The latest Python version supported in preview
pub fn latest_preview() -> Self {
let latest_preview = Self::PY314;
let latest_preview = Self::PY315;
debug_assert!(latest_preview >= Self::latest());
latest_preview
}


@@ -4,7 +4,7 @@
use ruff_python_ast::Stmt;
use ruff_python_ast::token::{TokenKind, Tokens};
use ruff_python_trivia::{
BlockRanges, CommentRanges, has_leading_content, has_trailing_content, is_python_whitespace,
CommentRanges, has_leading_content, has_trailing_content, is_python_whitespace,
};
use ruff_source_file::LineRanges;
use ruff_text_size::{Ranged, TextRange, TextSize};
@@ -26,9 +26,6 @@ pub struct Indexer {
/// The range of all comments in the source document.
comment_ranges: CommentRanges,
/// The ranges of all indent/dedent blocks in the source document.
block_ranges: BlockRanges,
}
impl Indexer {
@@ -39,8 +36,6 @@ impl Indexer {
let mut multiline_ranges_builder = MultilineRangesBuilder::default();
let mut continuation_lines = Vec::new();
let mut comment_ranges = Vec::new();
let mut indent_ranges = Vec::new();
let mut dedent_ranges = Vec::new();
// Token, end
let mut prev_end = TextSize::default();
@@ -81,12 +76,6 @@ impl Indexer {
TokenKind::Comment => {
comment_ranges.push(token.range());
}
TokenKind::Indent => {
indent_ranges.push(token.range());
}
TokenKind::Dedent => {
dedent_ranges.push(token.range());
}
_ => {}
}
@@ -98,7 +87,6 @@ impl Indexer {
interpolated_string_ranges: interpolated_string_ranges_builder.finish(),
multiline_ranges: multiline_ranges_builder.finish(),
comment_ranges: CommentRanges::new(comment_ranges),
block_ranges: BlockRanges::new(indent_ranges, dedent_ranges),
}
}
@@ -107,11 +95,6 @@ impl Indexer {
&self.comment_ranges
}
/// Returns the block indent/dedent ranges.
pub const fn block_ranges(&self) -> &BlockRanges {
&self.block_ranges
}
/// Returns the byte offset ranges of interpolated strings.
pub const fn interpolated_string_ranges(&self) -> &InterpolatedStringRanges {
&self.interpolated_string_ranges

View File

@@ -1,48 +0,0 @@
use ruff_text_size::TextRange;
/// Stores the ranges of indents and dedents sorted by [`TextRange::start`] in increasing order.
#[derive(Clone, Debug, Default)]
pub struct BlockRanges {
raw: Vec<BlockRange>,
}
#[derive(Clone, Copy, Debug, Default)]
pub struct BlockRange {
pub indent: TextRange,
pub dedent: TextRange,
}
impl BlockRanges {
pub fn new(indent_ranges: Vec<TextRange>, dedent_ranges: Vec<TextRange>) -> Self {
let mut index = 0;
let mut stack = Vec::new();
let mut blocks = Vec::new();
for dedent in &dedent_ranges {
while index < indent_ranges.len() && indent_ranges[index].end() < dedent.start() {
stack.push(indent_ranges[index]);
index += 1;
}
if let Some(indent) = stack.pop() {
blocks.push(BlockRange {
indent,
dedent: *dedent,
});
}
}
blocks.sort_by_key(|b| b.indent.start());
Self { raw: blocks }
}
pub fn containing(&self, range: &TextRange) -> Vec<&BlockRange> {
self.raw
.iter()
.filter(|block| {
block.indent.start() <= range.start() && block.dedent.end() > range.end()
})
.collect()
}
}
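The pairing logic in `BlockRanges::new` matches each `Dedent` with the most recent unmatched `Indent`, stack-style. A stdlib-only sketch of the same matching over plain `(start, end)` offsets (a simplified model, not the crate's `TextRange` types):

```rust
// For each dedent (in source order), push every indent that ends before it
// onto a stack, then pop the most recent one -- so innermost indents pair
// with the earliest dedents.
fn pair_blocks(
    indents: &[(u32, u32)],
    dedents: &[(u32, u32)],
) -> Vec<((u32, u32), (u32, u32))> {
    let mut index = 0;
    let mut stack = Vec::new();
    let mut blocks = Vec::new();
    for &dedent in dedents {
        while index < indents.len() && indents[index].1 < dedent.0 {
            stack.push(indents[index]);
            index += 1;
        }
        if let Some(indent) = stack.pop() {
            blocks.push((indent, dedent));
        }
    }
    // Sort by indent start so lookups scan blocks in source order.
    blocks.sort_by_key(|(indent, _)| indent.0);
    blocks
}

fn main() {
    // Two nested blocks: outer indent at 0..4, inner at 10..14; the inner
    // dedent (20..24) closes first, the outer (30..34) closes last.
    let blocks = pair_blocks(&[(0, 4), (10, 14)], &[(20, 24), (30, 34)]);
    assert_eq!(blocks, vec![((0, 4), (30, 34)), ((10, 14), (20, 24))]);
}
```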

View File

@@ -1,4 +1,3 @@
mod block_ranges;
mod comment_ranges;
mod comments;
mod cursor;
@@ -7,7 +6,6 @@ pub mod textwrap;
mod tokenizer;
mod whitespace;
pub use block_ranges::BlockRanges;
pub use comment_ranges::CommentRanges;
pub use comments::*;
pub use cursor::*;

View File

@@ -120,12 +120,8 @@ pub(crate) fn check(
let directives = extract_directives(parsed.tokens(), Flags::all(), &locator, &indexer);
// Parse range suppression comments
let suppressions = Suppressions::from_tokens(
&settings.linter,
locator.contents(),
parsed.tokens(),
&indexer,
);
let suppressions =
Suppressions::from_tokens(&settings.linter, locator.contents(), parsed.tokens());
// Generate checks.
let diagnostics = check_path(

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_wasm"
version = "0.14.10"
version = "0.14.11"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -213,12 +213,8 @@ impl Workspace {
&indexer,
);
let suppressions = Suppressions::from_tokens(
&self.settings.linter,
locator.contents(),
parsed.tokens(),
&indexer,
);
let suppressions =
Suppressions::from_tokens(&self.settings.linter, locator.contents(), parsed.tokens());
// Generate checks.
let diagnostics = check_path(

View File

@@ -7,7 +7,6 @@ use std::collections::BTreeMap;
use std::env::VarError;
use std::num::{NonZeroU8, NonZeroU16};
use std::path::{Path, PathBuf};
use std::str::FromStr;
use anyhow::{Context, Result, anyhow};
use glob::{GlobError, Paths, PatternError, glob};
@@ -36,8 +35,7 @@ use ruff_linter::settings::{
DEFAULT_SELECTORS, DUMMY_VARIABLE_RGX, LinterSettings, TASK_TAGS, TargetVersion,
};
use ruff_linter::{
RUFF_PKG_VERSION, RuleSelector, fs, warn_user_once, warn_user_once_by_id,
warn_user_once_by_message,
RuleSelector, fs, warn_user_once, warn_user_once_by_id, warn_user_once_by_message,
};
use ruff_python_ast as ast;
use ruff_python_formatter::{
@@ -53,6 +51,7 @@ use crate::options::{
Flake8UnusedArgumentsOptions, FormatOptions, IsortOptions, LintCommonOptions, LintOptions,
McCabeOptions, Options, Pep8NamingOptions, PyUpgradeOptions, PycodestyleOptions,
PydoclintOptions, PydocstyleOptions, PyflakesOptions, PylintOptions, RuffOptions,
validate_required_version,
};
use crate::settings::{
EXCLUDE, FileResolverSettings, FormatterSettings, INCLUDE, INCLUDE_PREVIEW, LineEnding,
@@ -155,13 +154,7 @@ pub struct Configuration {
impl Configuration {
pub fn into_settings(self, project_root: &Path) -> Result<Settings> {
if let Some(required_version) = &self.required_version {
let ruff_pkg_version = pep440_rs::Version::from_str(RUFF_PKG_VERSION)
.expect("RUFF_PKG_VERSION is not a valid PEP 440 version specifier");
if !required_version.contains(&ruff_pkg_version) {
return Err(anyhow!(
"Required version `{required_version}` does not match the running version `{RUFF_PKG_VERSION}`"
));
}
validate_required_version(required_version)?;
}
let linter_target_version = TargetVersion(self.target_version);

View File

@@ -1,15 +1,19 @@
use anyhow::Result;
use regex::Regex;
use rustc_hash::{FxBuildHasher, FxHashMap, FxHashSet};
use serde::de::{self};
use serde::{Deserialize, Deserializer, Serialize};
use std::collections::{BTreeMap, BTreeSet};
use std::path::PathBuf;
use std::str::FromStr;
use strum::IntoEnumIterator;
use unicode_normalization::UnicodeNormalization;
use crate::settings::LineEnding;
use ruff_formatter::IndentStyle;
use ruff_graph::Direction;
use ruff_linter::RUFF_PKG_VERSION;
use ruff_linter::line_width::{IndentWidth, LineLength};
use ruff_linter::rules::flake8_import_conventions::settings::BannedAliases;
use ruff_linter::rules::flake8_pytest_style::settings::SettingsError;
@@ -556,6 +560,17 @@ pub struct LintOptions {
pub future_annotations: Option<bool>,
}
pub fn validate_required_version(required_version: &RequiredVersion) -> anyhow::Result<()> {
let ruff_pkg_version = pep440_rs::Version::from_str(RUFF_PKG_VERSION)
.expect("RUFF_PKG_VERSION is not a valid PEP 440 version specifier");
if !required_version.contains(&ruff_pkg_version) {
return Err(anyhow::anyhow!(
"Required version `{required_version}` does not match the running version `{RUFF_PKG_VERSION}`"
));
}
Ok(())
}
/// Newtype wrapper for [`LintCommonOptions`] that allows customizing the JSON schema and omitting the fields from the [`OptionsMetadata`].
#[derive(Clone, Debug, PartialEq, Eq, Default, Serialize, Deserialize)]
#[serde(transparent)]
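`validate_required_version` boils down to: parse the running version and test it against a PEP 440 specifier set via `pep440_rs`. A stdlib-only sketch under the simplifying assumption of a single `>=x.y.z` specifier (the real implementation handles the full PEP 440 grammar):

```rust
// Parse "x.y.z" into a comparable tuple. Returns None on malformed input.
fn parse_version(s: &str) -> Option<(u32, u32, u32)> {
    let mut parts = s.split('.').map(|p| p.parse::<u32>().ok());
    Some((parts.next()??, parts.next()??, parts.next()??))
}

// Simplified model of `required_version.contains(&running)`, supporting
// only ">=x.y.z" specifiers.
fn satisfies(required: &str, running: &str) -> Option<bool> {
    let spec = required.strip_prefix(">=")?;
    Some(parse_version(running)? >= parse_version(spec.trim())?)
}

fn main() {
    assert_eq!(satisfies(">=0.14.0", "0.14.11"), Some(true));
    assert_eq!(satisfies(">=0.15.0", "0.14.11"), Some(false));
}
```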

View File

@@ -5,12 +5,13 @@ use std::path::{Path, PathBuf};
use anyhow::{Context, Result};
use log::debug;
use pep440_rs::{Operator, Version, VersionSpecifiers};
use serde::de::DeserializeOwned;
use serde::{Deserialize, Serialize};
use strum::IntoEnumIterator;
use ruff_linter::settings::types::PythonVersion;
use ruff_linter::settings::types::{PythonVersion, RequiredVersion};
use crate::options::Options;
use crate::options::{Options, validate_required_version};
#[derive(Debug, PartialEq, Eq, Serialize, Deserialize)]
struct Tools {
@@ -40,20 +41,38 @@ impl Pyproject {
}
}
/// Parse a `ruff.toml` file.
fn parse_ruff_toml<P: AsRef<Path>>(path: P) -> Result<Options> {
fn parse_toml<P: AsRef<Path>, T: DeserializeOwned>(path: P, table_path: &[&str]) -> Result<T> {
let path = path.as_ref();
let contents = std::fs::read_to_string(path)
.with_context(|| format!("Failed to read {}", path.display()))?;
toml::from_str(&contents).with_context(|| format!("Failed to parse {}", path.display()))
// Parse the TOML document once into a spanned representation so we can:
// - Inspect `required-version` without triggering strict deserialization errors.
// - Deserialize with precise spans (line/column and excerpt) on errors.
let root = toml::de::DeTable::parse(&contents)
.with_context(|| format!("Failed to parse {}", path.display()))?;
check_required_version(root.get_ref(), table_path)?;
let deserializer = toml::de::Deserializer::from(root);
T::deserialize(deserializer)
.map_err(|mut err| {
// `Deserializer::from` doesn't have access to the original input, but we do.
// Attach it so TOML errors include line/column and a source excerpt.
err.set_input(Some(&contents));
err
})
.with_context(|| format!("Failed to parse {}", path.display()))
}
/// Parse a `ruff.toml` file.
fn parse_ruff_toml<P: AsRef<Path>>(path: P) -> Result<Options> {
parse_toml(path, &[])
}
/// Parse a `pyproject.toml` file.
fn parse_pyproject_toml<P: AsRef<Path>>(path: P) -> Result<Pyproject> {
let path = path.as_ref();
let contents = std::fs::read_to_string(path)
.with_context(|| format!("Failed to read {}", path.display()))?;
toml::from_str(&contents).with_context(|| format!("Failed to parse {}", path.display()))
parse_toml(path, &["tool", "ruff"])
}
/// Return `true` if a `pyproject.toml` contains a `[tool.ruff]` section.
@@ -98,6 +117,33 @@ pub fn find_settings_toml<P: AsRef<Path>>(path: P) -> Result<Option<PathBuf>> {
Ok(None)
}
fn check_required_version(value: &toml::de::DeTable, table_path: &[&str]) -> Result<()> {
let mut current = value;
for key in table_path {
let Some(next) = current.get(*key) else {
return Ok(());
};
let toml::de::DeValue::Table(next) = next.get_ref() else {
return Ok(());
};
current = next;
}
let required_version = current
.get("required-version")
.and_then(|value| value.get_ref().as_str());
let Some(required_version) = required_version else {
return Ok(());
};
// If it doesn't parse, we just fall through to normal parsing; it will give a nicer error message.
if let Ok(required_version) = required_version.parse::<RequiredVersion>() {
validate_required_version(&required_version)?;
}
Ok(())
}
/// Derive target version from `required-version` in `pyproject.toml`, if
/// such a file exists in an ancestor directory.
pub fn find_fallback_target_version<P: AsRef<Path>>(path: P) -> Option<PythonVersion> {
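`check_required_version` walks `table_path` (e.g. `["tool", "ruff"]` for a `pyproject.toml`) through nested TOML tables, bailing out silently if any segment is missing or isn't a table. A toy model of that traversal over a hand-rolled `Value` type (the real code uses `toml::de::DeTable`):

```rust
use std::collections::HashMap;

// Minimal stand-in for a parsed TOML document.
enum Value {
    Str(String),
    Table(HashMap<String, Value>),
}

// Follow `path` through nested tables; None if any hop is missing or not
// a table -- mirroring the early `return Ok(())` exits in the diff.
fn lookup<'a>(
    mut current: &'a HashMap<String, Value>,
    path: &[&str],
) -> Option<&'a HashMap<String, Value>> {
    for key in path {
        match current.get(*key) {
            Some(Value::Table(next)) => current = next,
            _ => return None,
        }
    }
    Some(current)
}

fn main() {
    // { tool = { ruff = { required-version = ">=0.14.0" } } }
    let mut ruff = HashMap::new();
    ruff.insert(
        "required-version".to_string(),
        Value::Str(">=0.14.0".to_string()),
    );
    let mut tool = HashMap::new();
    tool.insert("ruff".to_string(), Value::Table(ruff));
    let mut root = HashMap::new();
    root.insert("tool".to_string(), Value::Table(tool));

    let table = lookup(&root, &["tool", "ruff"]).expect("table exists");
    assert!(matches!(table.get("required-version"), Some(Value::Str(_))));
    // A missing segment ends the walk without error.
    assert!(lookup(&root, &["tool", "poetry"]).is_none());
}
```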

View File

@@ -102,8 +102,8 @@ impl Relativity {
#[derive(Debug)]
pub struct Resolver<'a> {
pyproject_config: &'a PyprojectConfig,
/// All [`Settings`] that have been added to the resolver.
settings: Vec<Settings>,
/// All [`Settings`] that have been added to the resolver, along with their config file paths.
settings: Vec<(Settings, PathBuf)>,
/// A router from path to index into the `settings` vector.
router: Router<usize>,
}
@@ -146,8 +146,8 @@ impl<'a> Resolver<'a> {
}
/// Add a resolved [`Settings`] under a given [`PathBuf`] scope.
fn add(&mut self, path: &Path, settings: Settings) {
self.settings.push(settings);
fn add(&mut self, path: &Path, settings: Settings, config_path: PathBuf) {
self.settings.push((settings, config_path));
// Normalize the path to use `/` separators and escape the '{' and '}' characters,
// which matchit uses for routing parameters.
@@ -172,13 +172,27 @@ impl<'a> Resolver<'a> {
/// Return the appropriate [`Settings`] for a given [`Path`].
pub fn resolve(&self, path: &Path) -> &Settings {
self.resolve_with_path(path).0
}
/// Return the appropriate [`Settings`] and config file path for a given [`Path`].
pub fn resolve_with_path(&self, path: &Path) -> (&Settings, Option<&Path>) {
match self.pyproject_config.strategy {
PyprojectDiscoveryStrategy::Fixed => &self.pyproject_config.settings,
PyprojectDiscoveryStrategy::Fixed => (
&self.pyproject_config.settings,
self.pyproject_config.path.as_deref(),
),
PyprojectDiscoveryStrategy::Hierarchical => self
.router
.at(path.to_slash_lossy().as_ref())
.map(|Match { value, .. }| &self.settings[*value])
.unwrap_or(&self.pyproject_config.settings),
.map(|Match { value, .. }| {
let (settings, config_path) = &self.settings[*value];
(settings, Some(config_path.as_path()))
})
.unwrap_or((
&self.pyproject_config.settings,
self.pyproject_config.path.as_deref(),
)),
}
}
@@ -255,7 +269,8 @@ impl<'a> Resolver<'a> {
/// Return an iterator over the resolved [`Settings`] in this [`Resolver`].
pub fn settings(&self) -> impl Iterator<Item = &Settings> {
std::iter::once(&self.pyproject_config.settings).chain(&self.settings)
std::iter::once(&self.pyproject_config.settings)
.chain(self.settings.iter().map(|(settings, _)| settings))
}
}
@@ -379,17 +394,17 @@ pub fn resolve_configuration(
/// Extract the project root (scope) and [`Settings`] from a given
/// `pyproject.toml`.
fn resolve_scoped_settings<'a>(
pyproject: &'a Path,
fn resolve_scoped_settings(
pyproject: &Path,
transformer: &dyn ConfigurationTransformer,
origin: ConfigurationOrigin,
) -> Result<(&'a Path, Settings)> {
) -> Result<(PathBuf, Settings)> {
let relativity = Relativity::from(origin);
let configuration = resolve_configuration(pyproject, transformer, origin)?;
let project_root = relativity.resolve(pyproject);
let settings = configuration.into_settings(project_root)?;
Ok((project_root, settings))
Ok((project_root.to_path_buf(), settings))
}
/// Extract the [`Settings`] from a given `pyproject.toml` and process the
@@ -455,7 +470,7 @@ pub fn python_files_in_path<'a>(
transformer,
ConfigurationOrigin::Ancestor,
)?;
resolver.add(root, settings);
resolver.add(&root, settings, pyproject);
// We found the closest configuration.
break;
}
@@ -647,7 +662,11 @@ impl ParallelVisitor for PythonFilesVisitor<'_, '_> {
ConfigurationOrigin::Ancestor,
) {
Ok((root, settings)) => {
self.global.resolver.write().unwrap().add(root, settings);
self.global
.resolver
.write()
.unwrap()
.add(&root, settings, pyproject);
}
Err(err) => {
self.local_error = Err(err);
@@ -767,7 +786,7 @@ pub fn python_file_at_path(
if let Some(pyproject) = settings_toml(ancestor)? {
let (root, settings) =
resolve_scoped_settings(&pyproject, transformer, ConfigurationOrigin::Unknown)?;
resolver.add(root, settings);
resolver.add(&root, settings, pyproject);
break;
}
}
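The resolver change threads each config file's path through alongside its `Settings`, so `resolve_with_path` can report which config applied to a file. A stdlib sketch of hierarchical resolution by longest matching directory prefix (the real code routes with `matchit`; the names here are illustrative):

```rust
// Each entry: (directory prefix, settings label, config file path).
// resolve_with_path picks the entry with the longest prefix containing
// `path`, falling back to the root config -- which may have no path.
fn resolve_with_path<'a>(
    entries: &'a [(&'a str, &'a str, &'a str)],
    fallback: (&'a str, Option<&'a str>),
    path: &str,
) -> (&'a str, Option<&'a str>) {
    entries
        .iter()
        .filter(|(prefix, _, _)| path.starts_with(prefix))
        .max_by_key(|(prefix, _, _)| prefix.len())
        .map(|(_, settings, config)| (*settings, Some(*config)))
        .unwrap_or(fallback)
}

fn main() {
    let entries = [
        ("/repo/", "root-settings", "/repo/pyproject.toml"),
        ("/repo/sub/", "sub-settings", "/repo/sub/ruff.toml"),
    ];
    let fallback = ("default-settings", None);
    // A file under the nested scope resolves to the nested config.
    assert_eq!(
        resolve_with_path(&entries, fallback, "/repo/sub/main.py"),
        ("sub-settings", Some("/repo/sub/ruff.toml"))
    );
    // A file outside every scope falls back to the root settings.
    assert_eq!(
        resolve_with_path(&entries, fallback, "/elsewhere/x.py"),
        ("default-settings", None)
    );
}
```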

crates/ty/docs/cli.md (generated)
View File

@@ -37,7 +37,8 @@ ty check [OPTIONS] [PATH]...
<h3 class="cli-reference">Options</h3>
<dl class="cli-reference"><dt id="ty-check--color"><a href="#ty-check--color"><code>--color</code></a> <i>when</i></dt><dd><p>Control when colored output is used</p>
<dl class="cli-reference"><dt id="ty-check--add-ignore"><a href="#ty-check--add-ignore"><code>--add-ignore</code></a></dt><dd><p>Adds <code>ty: ignore</code> comments to suppress all rule diagnostics</p>
</dd><dt id="ty-check--color"><a href="#ty-check--color"><code>--color</code></a> <i>when</i></dt><dd><p>Control when colored output is used</p>
<p>Possible values:</p>
<ul>
<li><code>auto</code>: Display colors if the output goes to an interactive terminal</li>

View File

@@ -467,7 +467,7 @@ def test(): -> "int":
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'warn'."><code>warn</code></a> ·
Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.0.1-alpha.1</a> ·
<a href="https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20ignore-comment-unknown-rule" target="_blank">Related issues</a> ·
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L50" target="_blank">View source</a>
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L54" target="_blank">View source</a>
</small>
@@ -1047,7 +1047,7 @@ class D(Generic[U, T]): ...
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'warn'."><code>warn</code></a> ·
Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.0.1-alpha.1</a> ·
<a href="https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-ignore-comment" target="_blank">Related issues</a> ·
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L75" target="_blank">View source</a>
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L79" target="_blank">View source</a>
</small>
@@ -2895,7 +2895,7 @@ A() + A() # TypeError: unsupported operand type(s) for +: 'A' and 'A'
## `unused-ignore-comment`
<small>
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'ignore'."><code>ignore</code></a> ·
Default level: <a href="../../rules#rule-levels" title="This lint has a default level of 'warn'."><code>warn</code></a> ·
Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.0.1-alpha.1</a> ·
<a href="https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unused-ignore-comment" target="_blank">Related issues</a> ·
<a href="https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Fsuppression.rs#L25" target="_blank">View source</a>
@@ -2904,11 +2904,11 @@ Added in <a href="https://github.com/astral-sh/ty/releases/tag/0.0.1-alpha.1">0.
**What it does**
Checks for `type: ignore` or `ty: ignore` directives that are no longer applicable.
Checks for `ty: ignore` or `type: ignore` directives that are no longer applicable.
**Why is this bad?**
A `type: ignore` directive that no longer matches any diagnostic violations is likely
A `ty: ignore` directive that no longer matches any diagnostic violations is likely
included by mistake, and should be removed to avoid confusion.
**Examples**
@@ -2923,6 +2923,11 @@ Use instead:
a = 20 / 2
```
**Options**
Set [`analysis.respect-type-ignore-comments`](https://docs.astral.sh/ty/reference/configuration/#respect-type-ignore-comments)
to `false` to prevent this rule from reporting unused `type: ignore` comments.
## `useless-overload-body`
<small>

View File

@@ -54,6 +54,10 @@ pub(crate) struct CheckCommand {
)]
pub paths: Vec<SystemPathBuf>,
/// Adds `ty: ignore` comments to suppress all rule diagnostics.
#[arg(long)]
pub(crate) add_ignore: bool,
/// Run the command within the given project directory.
///
/// All `pyproject.toml` files will be discovered by walking up the directory tree from the given project directory,

View File

@@ -4,37 +4,36 @@ mod printer;
mod python_version;
mod version;
pub use args::Cli;
use ty_project::metadata::settings::TerminalSettings;
use ty_static::EnvVars;
use std::fmt::Write;
use std::process::{ExitCode, Termination};
use std::sync::Mutex;
use anyhow::Result;
use crate::args::{CheckCommand, Command, TerminalColor};
use crate::logging::{VerbosityLevel, setup_tracing};
use crate::printer::Printer;
use anyhow::{Context, anyhow};
use clap::{CommandFactory, Parser};
use colored::Colorize;
use crossbeam::channel as crossbeam_channel;
use rayon::ThreadPoolBuilder;
use ruff_db::cancellation::{CancellationToken, CancellationTokenSource};
use ruff_db::cancellation::{Canceled, CancellationToken, CancellationTokenSource};
use ruff_db::diagnostic::{
Diagnostic, DiagnosticId, DisplayDiagnosticConfig, DisplayDiagnostics, Severity,
};
use ruff_db::files::File;
use ruff_db::max_parallelism;
use ruff_db::system::{OsSystem, SystemPath, SystemPathBuf};
use ruff_db::{STACK_SIZE, max_parallelism};
use salsa::Database;
use ty_project::metadata::options::ProjectOptionsOverrides;
use ty_project::metadata::settings::TerminalSettings;
use ty_project::watch::ProjectWatcher;
use ty_project::{CollectReporter, Db, watch};
use ty_project::{CollectReporter, Db, suppress_all_diagnostics, watch};
use ty_project::{ProjectDatabase, ProjectMetadata};
use ty_server::run_server;
use ty_static::EnvVars;
use crate::args::{CheckCommand, Command, TerminalColor};
use crate::logging::{VerbosityLevel, setup_tracing};
use crate::printer::Printer;
pub use args::Cli;
pub fn run() -> anyhow::Result<ExitStatus> {
setup_rayon();
@@ -112,6 +111,12 @@ fn run_check(args: CheckCommand) -> anyhow::Result<ExitStatus> {
.map(|path| SystemPath::absolute(path, &cwd))
.collect();
let mode = if args.add_ignore {
MainLoopMode::AddIgnore
} else {
MainLoopMode::Check
};
let system = OsSystem::new(&cwd);
let watch = args.watch;
let exit_zero = args.exit_zero;
@@ -144,7 +149,7 @@ fn run_check(args: CheckCommand) -> anyhow::Result<ExitStatus> {
}
let (main_loop, main_loop_cancellation_token) =
MainLoop::new(project_options_overrides, printer);
MainLoop::new(mode, project_options_overrides, printer);
// Listen to Ctrl+C and abort the watch mode.
let main_loop_cancellation_token = Mutex::new(Some(main_loop_cancellation_token));
@@ -215,6 +220,8 @@ impl Termination for ExitStatus {
}
struct MainLoop {
mode: MainLoopMode,
/// Sender that can be used to send messages to the main loop.
sender: crossbeam_channel::Sender<MainLoopMessage>,
@@ -237,6 +244,7 @@ struct MainLoop {
impl MainLoop {
fn new(
mode: MainLoopMode,
project_options_overrides: ProjectOptionsOverrides,
printer: Printer,
) -> (Self, MainLoopCancellationToken) {
@@ -247,6 +255,7 @@ impl MainLoop {
(
Self {
mode,
sender: sender.clone(),
receiver,
watcher: None,
@@ -325,80 +334,78 @@ impl MainLoop {
result,
revision: check_revision,
} => {
let terminal_settings = db.project().settings(db).terminal();
let display_config = DisplayDiagnosticConfig::default()
.format(terminal_settings.output_format.into())
.color(colored::control::SHOULD_COLORIZE.should_colorize())
.with_cancellation_token(Some(self.cancellation_token.clone()))
.show_fix_diff(true);
if check_revision == revision {
if db.project().files(db).is_empty() {
tracing::warn!("No python files found under the given path(s)");
}
// TODO: We should have an official flag to silence workspace diagnostics.
if std::env::var("TY_MEMORY_REPORT").as_deref() == Ok("mypy_primer") {
return Ok(ExitStatus::Success);
}
let is_human_readable = terminal_settings.output_format.is_human_readable();
if result.is_empty() {
if is_human_readable {
writeln!(
self.printer.stream_for_success_summary(),
"{}",
"All checks passed!".green().bold()
)?;
}
if self.watcher.is_none() {
return Ok(ExitStatus::Success);
}
} else {
let diagnostics_count = result.len();
let mut stdout = self.printer.stream_for_details().lock();
let exit_status =
exit_status_from_diagnostics(&result, terminal_settings);
// Only render diagnostics if they're going to be displayed, since doing
// so is expensive.
if stdout.is_enabled() {
write!(
stdout,
"{}",
DisplayDiagnostics::new(db, &display_config, &result)
)?;
}
if !self.cancellation_token.is_cancelled() {
if is_human_readable {
writeln!(
self.printer.stream_for_failure_summary(),
"Found {} diagnostic{}",
diagnostics_count,
if diagnostics_count > 1 { "s" } else { "" }
)?;
}
if exit_status.is_internal_error() {
tracing::warn!(
"A fatal error occurred while checking some files. Not all project files were analyzed. See the diagnostics list above for details."
);
}
}
if self.watcher.is_none() {
return Ok(exit_status);
}
}
} else {
if check_revision != revision {
tracing::debug!(
"Discarding check result for outdated revision: current: {revision}, result revision: {check_revision}"
);
continue;
}
if db.project().files(db).is_empty() {
tracing::warn!("No python files found under the given path(s)");
}
let result = match self.mode {
MainLoopMode::Check => {
// TODO: We should have an official flag to silence workspace diagnostics.
if std::env::var("TY_MEMORY_REPORT").as_deref() == Ok("mypy_primer") {
return Ok(ExitStatus::Success);
}
self.write_diagnostics(db, &result)?;
if self.cancellation_token.is_cancelled() {
Err(Canceled)
} else {
Ok(result)
}
}
MainLoopMode::AddIgnore => {
if let Ok(result) =
suppress_all_diagnostics(db, result, &self.cancellation_token)
{
self.write_diagnostics(db, &result.diagnostics)?;
let terminal_settings = db.project().settings(db).terminal();
let is_human_readable =
terminal_settings.output_format.is_human_readable();
if is_human_readable {
writeln!(
self.printer.stream_for_failure_summary(),
"Added {} ignore comment{}",
result.count,
if result.count > 1 { "s" } else { "" }
)?;
}
Ok(result.diagnostics)
} else {
Err(Canceled)
}
}
};
let exit_status = match result.as_deref() {
Ok([]) => ExitStatus::Success,
Ok(diagnostics) => {
let terminal_settings = db.project().settings(db).terminal();
exit_status_from_diagnostics(diagnostics, terminal_settings)
}
Err(Canceled) => ExitStatus::Success,
};
if exit_status.is_internal_error() {
tracing::warn!(
"A fatal error occurred while checking some files. Not all project files were analyzed. See the diagnostics list above for details."
);
}
if self.watcher.is_some() {
continue;
}
return Ok(exit_status);
}
MainLoopMessage::ApplyChanges(changes) => {
@@ -425,6 +432,65 @@ impl MainLoop {
Ok(ExitStatus::Success)
}
fn write_diagnostics(
&self,
db: &ProjectDatabase,
diagnostics: &[Diagnostic],
) -> anyhow::Result<()> {
let terminal_settings = db.project().settings(db).terminal();
let is_human_readable = terminal_settings.output_format.is_human_readable();
match diagnostics {
[] => {
if is_human_readable {
writeln!(
self.printer.stream_for_success_summary(),
"{}",
"All checks passed!".green().bold()
)?;
}
}
diagnostics => {
let diagnostics_count = diagnostics.len();
let mut stdout = self.printer.stream_for_details().lock();
// Only render diagnostics if they're going to be displayed, since doing
// so is expensive.
if stdout.is_enabled() {
let display_config = DisplayDiagnosticConfig::default()
.format(terminal_settings.output_format.into())
.color(colored::control::SHOULD_COLORIZE.should_colorize())
.with_cancellation_token(Some(self.cancellation_token.clone()))
.show_fix_diff(true);
write!(
stdout,
"{}",
DisplayDiagnostics::new(db, &display_config, diagnostics)
)?;
}
if !self.cancellation_token.is_cancelled() && is_human_readable {
writeln!(
self.printer.stream_for_failure_summary(),
"Found {} diagnostic{}",
diagnostics_count,
if diagnostics_count > 1 { "s" } else { "" }
)?;
}
}
}
Ok(())
}
}
#[derive(Copy, Clone, Debug)]
enum MainLoopMode {
Check,
AddIgnore,
}
fn exit_status_from_diagnostics(
@@ -559,12 +625,7 @@ fn set_colored_override(color: Option<TerminalColor>) {
fn setup_rayon() {
ThreadPoolBuilder::default()
.num_threads(max_parallelism().get())
// Use a reasonably large stack size to avoid running into stack overflows too easily. The
// size was chosen in such a way as to still be able to handle large expressions involving
// binary operators (x + x + … + x) both during the AST walk in semantic index building as
// well as during type checking. Using this stack size, we can handle handle expressions
// that are several times larger than the corresponding limits in existing type checkers.
.stack_size(16 * 1024 * 1024)
.stack_size(STACK_SIZE)
.build_global()
.unwrap();
}

View File

@@ -160,6 +160,65 @@ fn configuration_include() -> anyhow::Result<()> {
Ok(())
}
/// Files without extensions can be included by adding a literal glob to `include` that matches
/// the path exactly. A literal glob is a glob without any meta characters.
#[test]
fn configuration_include_no_extension() -> anyhow::Result<()> {
let case = CliTest::with_files([(
"src/main",
r#"
print(undefined_var) # error: unresolved-reference
"#,
)])?;
// By default, `src/main` is excluded because the file has no supported extension.
case.write_file(
"ty.toml",
r#"
[src]
include = ["src"]
"#,
)?;
assert_cmd_snapshot!(case.command(), @r"
success: true
exit_code: 0
----- stdout -----
All checks passed!
----- stderr -----
WARN No python files found under the given path(s)
");
// The file can be included by adding an exactly matching pattern
case.write_file(
"ty.toml",
r#"
[src]
include = ["src", "src/main"]
"#,
)?;
assert_cmd_snapshot!(case.command(), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-reference]: Name `undefined_var` used when not defined
--> src/main:2:7
|
2 | print(undefined_var) # error: unresolved-reference
| ^^^^^^^^^^^^^
|
info: rule `unresolved-reference` is enabled by default
Found 1 diagnostic
----- stderr -----
");
Ok(())
}
/// Test configuration file exclude functionality
#[test]
fn configuration_exclude() -> anyhow::Result<()> {

View File

@@ -0,0 +1,114 @@
use insta_cmd::assert_cmd_snapshot;
use crate::CliTest;
#[test]
fn add_ignore() -> anyhow::Result<()> {
let case = CliTest::with_file(
"different_violations.py",
r#"
import sys
x = 1 + a
if sys.does_not_exist:
...
def test(a, b): ...
test(x = 10, b = 12)
"#,
)?;
assert_cmd_snapshot!(case.command().arg("--add-ignore"), @r"
success: true
exit_code: 0
----- stdout -----
All checks passed!
Added 4 ignore comments
----- stderr -----
");
// There should be no diagnostics when running ty again
assert_cmd_snapshot!(case.command(), @r"
success: true
exit_code: 0
----- stdout -----
All checks passed!
----- stderr -----
");
Ok(())
}
#[test]
fn add_ignore_unfixable() -> anyhow::Result<()> {
let case = CliTest::with_files([
("has_syntax_error.py", r"print(x # [unresolved-reference]"),
(
"different_violations.py",
r#"
import sys
x = 1 + a
reveal_type(x)
if sys.does_not_exist:
...
"#,
),
(
"repeated_violations.py",
r#"
x = (
1 +
a * b
)
y = y # ty: ignore[unresolved-reference]
"#,
),
])?;
assert_cmd_snapshot!(case.command().arg("--add-ignore").env("RUST_BACKTRACE", "1"), @r"
success: false
exit_code: 1
----- stdout -----
info[revealed-type]: Revealed type
--> different_violations.py:6:13
|
4 | x = 1 + a # ty:ignore[unresolved-reference]
5 |
6 | reveal_type(x) # ty:ignore[undefined-reveal]
| ^ `Unknown`
7 |
8 | if sys.does_not_exist: # ty:ignore[unresolved-attribute]
|
error[unresolved-reference]: Name `x` used when not defined
--> has_syntax_error.py:1:7
|
1 | print(x # [unresolved-reference]
| ^
|
info: rule `unresolved-reference` is enabled by default
error[invalid-syntax]: unexpected EOF while parsing
--> has_syntax_error.py:1:34
|
1 | print(x # [unresolved-reference]
| ^
|
Found 3 diagnostics
Added 5 ignore comments
----- stderr -----
WARN Skipping file `<temp_dir>/has_syntax_error.py` with syntax errors
");
Ok(())
}

View File

@@ -2,6 +2,7 @@ mod analysis_options;
mod config_option;
mod exit_code;
mod file_selection;
mod fixes;
mod python_environment;
mod rule_selection;

View File

@@ -1,9 +1,10 @@
name,file,index,rank
auto-import-includes-modules,main.py,0,1
auto-import-includes-modules,main.py,1,7
auto-import-includes-modules,main.py,1,2
auto-import-includes-modules,main.py,2,1
auto-import-skips-current-module,main.py,0,1
class-arg-completion,main.py,0,1
exact-over-fuzzy,main.py,0,1
fstring-completions,main.py,0,1
higher-level-symbols-preferred,main.py,0,
higher-level-symbols-preferred,main.py,1,1
@@ -16,8 +17,10 @@ import-deprioritizes-type_check_only,main.py,3,2
import-deprioritizes-type_check_only,main.py,4,3
import-keyword-completion,main.py,0,1
internal-typeshed-hidden,main.py,0,2
local-over-auto-import,main.py,0,1
modules-over-other-symbols,main.py,0,1
none-completion,main.py,0,1
numpy-array,main.py,0,159
numpy-array,main.py,0,57
numpy-array,main.py,1,1
object-attr-instance-methods,main.py,0,1
object-attr-instance-methods,main.py,1,1
@@ -26,7 +29,12 @@ raise-uses-base-exception,main.py,0,1
scope-existing-over-new-import,main.py,0,1
scope-prioritize-closer,main.py,0,2
scope-simple-long-identifier,main.py,0,1
third-party-over-stdlib,main.py,0,1
tighter-over-looser-scope,main.py,0,3
tstring-completions,main.py,0,1
ty-extensions-lower-stdlib,main.py,0,9
type-var-typing-over-ast,main.py,0,3
type-var-typing-over-ast,main.py,1,253
ty-extensions-lower-stdlib,main.py,0,1
typing-gets-priority,main.py,0,1
typing-gets-priority,main.py,1,1
typing-gets-priority,main.py,2,1
typing-gets-priority,main.py,3,1
typing-gets-priority,main.py,4,1

@@ -540,16 +540,17 @@ fn copy_project(src_dir: &SystemPath, dst_dir: &SystemPath) -> anyhow::Result<Ve
     std::fs::create_dir_all(dst_dir).with_context(|| dst_dir.to_string())?;
     let mut cursors = vec![];
-    for result in walkdir::WalkDir::new(src_dir.as_std_path()) {
+    let it = walkdir::WalkDir::new(src_dir.as_std_path())
+        .into_iter()
+        .filter_entry(|dent| {
+            !dent
+                .file_name()
+                .to_str()
+                .is_some_and(|name| name.starts_with('.'))
+        });
+    for result in it {
         let dent =
             result.with_context(|| format!("failed to get directory entry from {src_dir}"))?;
-        if dent
-            .file_name()
-            .to_str()
-            .is_some_and(|name| name.starts_with('.'))
-        {
-            continue;
-        }
         let src = SystemPath::from_std_path(dent.path()).ok_or_else(|| {
             anyhow::anyhow!("path `{}` is not valid UTF-8", dent.path().display())
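The change above is more than cosmetic: `filter_entry` prunes a matching directory before `walkdir` descends into it, whereas the old loop still walked the contents of hidden directories and then skipped each entry one at a time. A minimal Python sketch of the same pruning idea (a hypothetical helper, not code from this repository):

```python
import os

def visible_files(src: str) -> list[str]:
    """Walk `src`, pruning dot-prefixed directories up front instead of
    skipping their entries individually after the fact."""
    seen = []
    for root, dirs, files in os.walk(src):
        # Mutating `dirs` in place stops os.walk from descending into
        # hidden directories at all, analogous to walkdir's filter_entry.
        dirs[:] = [d for d in dirs if not d.startswith(".")]
        seen.extend(f for f in files if not f.startswith("."))
    return sorted(seen)
```

With pruning, a large `.git` or `.venv` tree is never traversed at all, which is the practical win of the refactor.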

@@ -0,0 +1,3 @@
pattern = 1
pttn = 1
pttn<CURSOR: pttn>

@@ -0,0 +1,2 @@
[settings]
auto-import = true

@@ -0,0 +1,11 @@
def foo(x):
# We specifically want the local `x` to be
# suggested first here, and NOT an `x` or an
# `X` from some other module (via auto-import).
# We'd also like this to come before `except`,
# which is a keyword that contains `x`, but is
# not an exact match (whereas `x` is). `except`
# also isn't legal in this context, although
# that sort of context sensitivity is a bit
# trickier.
return x<CURSOR: x>
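The fixture above encodes an ordering rule: an exact identifier match should outrank keyword or fuzzy matches that merely contain the query. A toy illustration of that ordering (not ty's actual scoring code):

```python
def rank_candidates(candidates: list[str], query: str) -> list[str]:
    """Order completion candidates: exact matches first, then prefix
    matches, then fuzzy (substring) matches. A toy model only."""
    def score(name: str) -> int:
        if name == query:
            return 0  # exact match, e.g. the local `x` for query "x"
        if name.startswith(query):
            return 1
        if query in name:
            return 2  # fuzzy: `except` contains "x" but is not exact
        return 3
    return sorted(candidates, key=score)
```

Under this model, `rank_candidates(["except", "x"], "x")` puts `x` ahead of `except`, matching the fixture's expectation.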

@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []

@@ -0,0 +1,8 @@
version = 1
revision = 3
requires-python = ">=3.13"

[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }

@@ -0,0 +1,2 @@
[settings]
auto-import = true

@@ -0,0 +1 @@
os<CURSOR: os>

@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []

@@ -0,0 +1 @@
os = 1

@@ -0,0 +1,78 @@
version = 1
revision = 3
requires-python = ">=3.13"

[[package]]
name = "regex"
version = "2025.11.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cc/a9/546676f25e573a4cf00fe8e119b78a37b6a8fe2dc95cda877b30889c9c45/regex-2025.11.3.tar.gz", hash = "sha256:1fedc720f9bb2494ce31a58a1631f9c82df6a09b49c19517ea5cc280b4541e01", size = 414669, upload-time = "2025-11-03T21:34:22.089Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e1/a7/dda24ebd49da46a197436ad96378f17df30ceb40e52e859fc42cac45b850/regex-2025.11.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c1e448051717a334891f2b9a620fe36776ebf3dd8ec46a0b877c8ae69575feb4", size = 489081, upload-time = "2025-11-03T21:31:55.9Z" },
{ url = "https://files.pythonhosted.org/packages/19/22/af2dc751aacf88089836aa088a1a11c4f21a04707eb1b0478e8e8fb32847/regex-2025.11.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9b5aca4d5dfd7fbfbfbdaf44850fcc7709a01146a797536a8f84952e940cca76", size = 291123, upload-time = "2025-11-03T21:31:57.758Z" },
{ url = "https://files.pythonhosted.org/packages/a3/88/1a3ea5672f4b0a84802ee9891b86743438e7c04eb0b8f8c4e16a42375327/regex-2025.11.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:04d2765516395cf7dda331a244a3282c0f5ae96075f728629287dfa6f76ba70a", size = 288814, upload-time = "2025-11-03T21:32:01.12Z" },
{ url = "https://files.pythonhosted.org/packages/fb/8c/f5987895bf42b8ddeea1b315c9fedcfe07cadee28b9c98cf50d00adcb14d/regex-2025.11.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d9903ca42bfeec4cebedba8022a7c97ad2aab22e09573ce9976ba01b65e4361", size = 798592, upload-time = "2025-11-03T21:32:03.006Z" },
{ url = "https://files.pythonhosted.org/packages/99/2a/6591ebeede78203fa77ee46a1c36649e02df9eaa77a033d1ccdf2fcd5d4e/regex-2025.11.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:639431bdc89d6429f6721625e8129413980ccd62e9d3f496be618a41d205f160", size = 864122, upload-time = "2025-11-03T21:32:04.553Z" },
{ url = "https://files.pythonhosted.org/packages/94/d6/be32a87cf28cf8ed064ff281cfbd49aefd90242a83e4b08b5a86b38e8eb4/regex-2025.11.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f117efad42068f9715677c8523ed2be1518116d1c49b1dd17987716695181efe", size = 912272, upload-time = "2025-11-03T21:32:06.148Z" },
{ url = "https://files.pythonhosted.org/packages/62/11/9bcef2d1445665b180ac7f230406ad80671f0fc2a6ffb93493b5dd8cd64c/regex-2025.11.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4aecb6f461316adf9f1f0f6a4a1a3d79e045f9b71ec76055a791affa3b285850", size = 803497, upload-time = "2025-11-03T21:32:08.162Z" },
{ url = "https://files.pythonhosted.org/packages/e5/a7/da0dc273d57f560399aa16d8a68ae7f9b57679476fc7ace46501d455fe84/regex-2025.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3b3a5f320136873cc5561098dfab677eea139521cb9a9e8db98b7e64aef44cbc", size = 787892, upload-time = "2025-11-03T21:32:09.769Z" },
{ url = "https://files.pythonhosted.org/packages/da/4b/732a0c5a9736a0b8d6d720d4945a2f1e6f38f87f48f3173559f53e8d5d82/regex-2025.11.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:75fa6f0056e7efb1f42a1c34e58be24072cb9e61a601340cc1196ae92326a4f9", size = 858462, upload-time = "2025-11-03T21:32:11.769Z" },
{ url = "https://files.pythonhosted.org/packages/0c/f5/a2a03df27dc4c2d0c769220f5110ba8c4084b0bfa9ab0f9b4fcfa3d2b0fc/regex-2025.11.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:dbe6095001465294f13f1adcd3311e50dd84e5a71525f20a10bd16689c61ce0b", size = 850528, upload-time = "2025-11-03T21:32:13.906Z" },
{ url = "https://files.pythonhosted.org/packages/d6/09/e1cd5bee3841c7f6eb37d95ca91cdee7100b8f88b81e41c2ef426910891a/regex-2025.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:454d9b4ae7881afbc25015b8627c16d88a597479b9dea82b8c6e7e2e07240dc7", size = 789866, upload-time = "2025-11-03T21:32:15.748Z" },
{ url = "https://files.pythonhosted.org/packages/eb/51/702f5ea74e2a9c13d855a6a85b7f80c30f9e72a95493260193c07f3f8d74/regex-2025.11.3-cp313-cp313-win32.whl", hash = "sha256:28ba4d69171fc6e9896337d4fc63a43660002b7da53fc15ac992abcf3410917c", size = 266189, upload-time = "2025-11-03T21:32:17.493Z" },
{ url = "https://files.pythonhosted.org/packages/8b/00/6e29bb314e271a743170e53649db0fdb8e8ff0b64b4f425f5602f4eb9014/regex-2025.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:bac4200befe50c670c405dc33af26dad5a3b6b255dd6c000d92fe4629f9ed6a5", size = 277054, upload-time = "2025-11-03T21:32:19.042Z" },
{ url = "https://files.pythonhosted.org/packages/25/f1/b156ff9f2ec9ac441710764dda95e4edaf5f36aca48246d1eea3f1fd96ec/regex-2025.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:2292cd5a90dab247f9abe892ac584cb24f0f54680c73fcb4a7493c66c2bf2467", size = 270325, upload-time = "2025-11-03T21:32:21.338Z" },
{ url = "https://files.pythonhosted.org/packages/20/28/fd0c63357caefe5680b8ea052131acbd7f456893b69cc2a90cc3e0dc90d4/regex-2025.11.3-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:1eb1ebf6822b756c723e09f5186473d93236c06c579d2cc0671a722d2ab14281", size = 491984, upload-time = "2025-11-03T21:32:23.466Z" },
{ url = "https://files.pythonhosted.org/packages/df/ec/7014c15626ab46b902b3bcc4b28a7bae46d8f281fc7ea9c95e22fcaaa917/regex-2025.11.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1e00ec2970aab10dc5db34af535f21fcf32b4a31d99e34963419636e2f85ae39", size = 292673, upload-time = "2025-11-03T21:32:25.034Z" },
{ url = "https://files.pythonhosted.org/packages/23/ab/3b952ff7239f20d05f1f99e9e20188513905f218c81d52fb5e78d2bf7634/regex-2025.11.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a4cb042b615245d5ff9b3794f56be4138b5adc35a4166014d31d1814744148c7", size = 291029, upload-time = "2025-11-03T21:32:26.528Z" },
{ url = "https://files.pythonhosted.org/packages/21/7e/3dc2749fc684f455f162dcafb8a187b559e2614f3826877d3844a131f37b/regex-2025.11.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:44f264d4bf02f3176467d90b294d59bf1db9fe53c141ff772f27a8b456b2a9ed", size = 807437, upload-time = "2025-11-03T21:32:28.363Z" },
{ url = "https://files.pythonhosted.org/packages/1b/0b/d529a85ab349c6a25d1ca783235b6e3eedf187247eab536797021f7126c6/regex-2025.11.3-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7be0277469bf3bd7a34a9c57c1b6a724532a0d235cd0dc4e7f4316f982c28b19", size = 873368, upload-time = "2025-11-03T21:32:30.4Z" },
{ url = "https://files.pythonhosted.org/packages/7d/18/2d868155f8c9e3e9d8f9e10c64e9a9f496bb8f7e037a88a8bed26b435af6/regex-2025.11.3-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0d31e08426ff4b5b650f68839f5af51a92a5b51abd8554a60c2fbc7c71f25d0b", size = 914921, upload-time = "2025-11-03T21:32:32.123Z" },
{ url = "https://files.pythonhosted.org/packages/2d/71/9d72ff0f354fa783fe2ba913c8734c3b433b86406117a8db4ea2bf1c7a2f/regex-2025.11.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e43586ce5bd28f9f285a6e729466841368c4a0353f6fd08d4ce4630843d3648a", size = 812708, upload-time = "2025-11-03T21:32:34.305Z" },
{ url = "https://files.pythonhosted.org/packages/e7/19/ce4bf7f5575c97f82b6e804ffb5c4e940c62609ab2a0d9538d47a7fdf7d4/regex-2025.11.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:0f9397d561a4c16829d4e6ff75202c1c08b68a3bdbfe29dbfcdb31c9830907c6", size = 795472, upload-time = "2025-11-03T21:32:36.364Z" },
{ url = "https://files.pythonhosted.org/packages/03/86/fd1063a176ffb7b2315f9a1b08d17b18118b28d9df163132615b835a26ee/regex-2025.11.3-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:dd16e78eb18ffdb25ee33a0682d17912e8cc8a770e885aeee95020046128f1ce", size = 868341, upload-time = "2025-11-03T21:32:38.042Z" },
{ url = "https://files.pythonhosted.org/packages/12/43/103fb2e9811205e7386366501bc866a164a0430c79dd59eac886a2822950/regex-2025.11.3-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:ffcca5b9efe948ba0661e9df0fa50d2bc4b097c70b9810212d6b62f05d83b2dd", size = 854666, upload-time = "2025-11-03T21:32:40.079Z" },
{ url = "https://files.pythonhosted.org/packages/7d/22/e392e53f3869b75804762c7c848bd2dd2abf2b70fb0e526f58724638bd35/regex-2025.11.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c56b4d162ca2b43318ac671c65bd4d563e841a694ac70e1a976ac38fcf4ca1d2", size = 799473, upload-time = "2025-11-03T21:32:42.148Z" },
{ url = "https://files.pythonhosted.org/packages/4f/f9/8bd6b656592f925b6845fcbb4d57603a3ac2fb2373344ffa1ed70aa6820a/regex-2025.11.3-cp313-cp313t-win32.whl", hash = "sha256:9ddc42e68114e161e51e272f667d640f97e84a2b9ef14b7477c53aac20c2d59a", size = 268792, upload-time = "2025-11-03T21:32:44.13Z" },
{ url = "https://files.pythonhosted.org/packages/e5/87/0e7d603467775ff65cd2aeabf1b5b50cc1c3708556a8b849a2fa4dd1542b/regex-2025.11.3-cp313-cp313t-win_amd64.whl", hash = "sha256:7a7c7fdf755032ffdd72c77e3d8096bdcb0eb92e89e17571a196f03d88b11b3c", size = 280214, upload-time = "2025-11-03T21:32:45.853Z" },
{ url = "https://files.pythonhosted.org/packages/8d/d0/2afc6f8e94e2b64bfb738a7c2b6387ac1699f09f032d363ed9447fd2bb57/regex-2025.11.3-cp313-cp313t-win_arm64.whl", hash = "sha256:df9eb838c44f570283712e7cff14c16329a9f0fb19ca492d21d4b7528ee6821e", size = 271469, upload-time = "2025-11-03T21:32:48.026Z" },
{ url = "https://files.pythonhosted.org/packages/31/e9/f6e13de7e0983837f7b6d238ad9458800a874bf37c264f7923e63409944c/regex-2025.11.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9697a52e57576c83139d7c6f213d64485d3df5bf84807c35fa409e6c970801c6", size = 489089, upload-time = "2025-11-03T21:32:50.027Z" },
{ url = "https://files.pythonhosted.org/packages/a3/5c/261f4a262f1fa65141c1b74b255988bd2fa020cc599e53b080667d591cfc/regex-2025.11.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e18bc3f73bd41243c9b38a6d9f2366cd0e0137a9aebe2d8ff76c5b67d4c0a3f4", size = 291059, upload-time = "2025-11-03T21:32:51.682Z" },
{ url = "https://files.pythonhosted.org/packages/8e/57/f14eeb7f072b0e9a5a090d1712741fd8f214ec193dba773cf5410108bb7d/regex-2025.11.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:61a08bcb0ec14ff4e0ed2044aad948d0659604f824cbd50b55e30b0ec6f09c73", size = 288900, upload-time = "2025-11-03T21:32:53.569Z" },
{ url = "https://files.pythonhosted.org/packages/3c/6b/1d650c45e99a9b327586739d926a1cd4e94666b1bd4af90428b36af66dc7/regex-2025.11.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9c30003b9347c24bcc210958c5d167b9e4f9be786cb380a7d32f14f9b84674f", size = 799010, upload-time = "2025-11-03T21:32:55.222Z" },
{ url = "https://files.pythonhosted.org/packages/99/ee/d66dcbc6b628ce4e3f7f0cbbb84603aa2fc0ffc878babc857726b8aab2e9/regex-2025.11.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4e1e592789704459900728d88d41a46fe3969b82ab62945560a31732ffc19a6d", size = 864893, upload-time = "2025-11-03T21:32:57.239Z" },
{ url = "https://files.pythonhosted.org/packages/bf/2d/f238229f1caba7ac87a6c4153d79947fb0261415827ae0f77c304260c7d3/regex-2025.11.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6538241f45eb5a25aa575dbba1069ad786f68a4f2773a29a2bd3dd1f9de787be", size = 911522, upload-time = "2025-11-03T21:32:59.274Z" },
{ url = "https://files.pythonhosted.org/packages/bd/3d/22a4eaba214a917c80e04f6025d26143690f0419511e0116508e24b11c9b/regex-2025.11.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce22519c989bb72a7e6b36a199384c53db7722fe669ba891da75907fe3587db", size = 803272, upload-time = "2025-11-03T21:33:01.393Z" },
{ url = "https://files.pythonhosted.org/packages/84/b1/03188f634a409353a84b5ef49754b97dbcc0c0f6fd6c8ede505a8960a0a4/regex-2025.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:66d559b21d3640203ab9075797a55165d79017520685fb407b9234d72ab63c62", size = 787958, upload-time = "2025-11-03T21:33:03.379Z" },
{ url = "https://files.pythonhosted.org/packages/99/6a/27d072f7fbf6fadd59c64d210305e1ff865cc3b78b526fd147db768c553b/regex-2025.11.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:669dcfb2e38f9e8c69507bace46f4889e3abbfd9b0c29719202883c0a603598f", size = 859289, upload-time = "2025-11-03T21:33:05.374Z" },
{ url = "https://files.pythonhosted.org/packages/9a/70/1b3878f648e0b6abe023172dacb02157e685564853cc363d9961bcccde4e/regex-2025.11.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:32f74f35ff0f25a5021373ac61442edcb150731fbaa28286bbc8bb1582c89d02", size = 850026, upload-time = "2025-11-03T21:33:07.131Z" },
{ url = "https://files.pythonhosted.org/packages/dd/d5/68e25559b526b8baab8e66839304ede68ff6727237a47727d240006bd0ff/regex-2025.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e6c7a21dffba883234baefe91bc3388e629779582038f75d2a5be918e250f0ed", size = 789499, upload-time = "2025-11-03T21:33:09.141Z" },
{ url = "https://files.pythonhosted.org/packages/fc/df/43971264857140a350910d4e33df725e8c94dd9dee8d2e4729fa0d63d49e/regex-2025.11.3-cp314-cp314-win32.whl", hash = "sha256:795ea137b1d809eb6836b43748b12634291c0ed55ad50a7d72d21edf1cd565c4", size = 271604, upload-time = "2025-11-03T21:33:10.9Z" },
{ url = "https://files.pythonhosted.org/packages/01/6f/9711b57dc6894a55faf80a4c1b5aa4f8649805cb9c7aef46f7d27e2b9206/regex-2025.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:9f95fbaa0ee1610ec0fc6b26668e9917a582ba80c52cc6d9ada15e30aa9ab9ad", size = 280320, upload-time = "2025-11-03T21:33:12.572Z" },
{ url = "https://files.pythonhosted.org/packages/f1/7e/f6eaa207d4377481f5e1775cdeb5a443b5a59b392d0065f3417d31d80f87/regex-2025.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:dfec44d532be4c07088c3de2876130ff0fbeeacaa89a137decbbb5f665855a0f", size = 273372, upload-time = "2025-11-03T21:33:14.219Z" },
{ url = "https://files.pythonhosted.org/packages/c3/06/49b198550ee0f5e4184271cee87ba4dfd9692c91ec55289e6282f0f86ccf/regex-2025.11.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ba0d8a5d7f04f73ee7d01d974d47c5834f8a1b0224390e4fe7c12a3a92a78ecc", size = 491985, upload-time = "2025-11-03T21:33:16.555Z" },
{ url = "https://files.pythonhosted.org/packages/ce/bf/abdafade008f0b1c9da10d934034cb670432d6cf6cbe38bbb53a1cfd6cf8/regex-2025.11.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:442d86cf1cfe4faabf97db7d901ef58347efd004934da045c745e7b5bd57ac49", size = 292669, upload-time = "2025-11-03T21:33:18.32Z" },
{ url = "https://files.pythonhosted.org/packages/f9/ef/0c357bb8edbd2ad8e273fcb9e1761bc37b8acbc6e1be050bebd6475f19c1/regex-2025.11.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:fd0a5e563c756de210bb964789b5abe4f114dacae9104a47e1a649b910361536", size = 291030, upload-time = "2025-11-03T21:33:20.048Z" },
{ url = "https://files.pythonhosted.org/packages/79/06/edbb67257596649b8fb088d6aeacbcb248ac195714b18a65e018bf4c0b50/regex-2025.11.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bf3490bcbb985a1ae97b2ce9ad1c0f06a852d5b19dde9b07bdf25bf224248c95", size = 807674, upload-time = "2025-11-03T21:33:21.797Z" },
{ url = "https://files.pythonhosted.org/packages/f4/d9/ad4deccfce0ea336296bd087f1a191543bb99ee1c53093dcd4c64d951d00/regex-2025.11.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3809988f0a8b8c9dcc0f92478d6501fac7200b9ec56aecf0ec21f4a2ec4b6009", size = 873451, upload-time = "2025-11-03T21:33:23.741Z" },
{ url = "https://files.pythonhosted.org/packages/13/75/a55a4724c56ef13e3e04acaab29df26582f6978c000ac9cd6810ad1f341f/regex-2025.11.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f4ff94e58e84aedb9c9fce66d4ef9f27a190285b451420f297c9a09f2b9abee9", size = 914980, upload-time = "2025-11-03T21:33:25.999Z" },
{ url = "https://files.pythonhosted.org/packages/67/1e/a1657ee15bd9116f70d4a530c736983eed997b361e20ecd8f5ca3759d5c5/regex-2025.11.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7eb542fd347ce61e1321b0a6b945d5701528dca0cd9759c2e3bb8bd57e47964d", size = 812852, upload-time = "2025-11-03T21:33:27.852Z" },
{ url = "https://files.pythonhosted.org/packages/b8/6f/f7516dde5506a588a561d296b2d0044839de06035bb486b326065b4c101e/regex-2025.11.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:d6c2d5919075a1f2e413c00b056ea0c2f065b3f5fe83c3d07d325ab92dce51d6", size = 795566, upload-time = "2025-11-03T21:33:32.364Z" },
{ url = "https://files.pythonhosted.org/packages/d9/dd/3d10b9e170cc16fb34cb2cef91513cf3df65f440b3366030631b2984a264/regex-2025.11.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:3f8bf11a4827cc7ce5a53d4ef6cddd5ad25595d3c1435ef08f76825851343154", size = 868463, upload-time = "2025-11-03T21:33:34.459Z" },
{ url = "https://files.pythonhosted.org/packages/f5/8e/935e6beff1695aa9085ff83195daccd72acc82c81793df480f34569330de/regex-2025.11.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:22c12d837298651e5550ac1d964e4ff57c3f56965fc1812c90c9fb2028eaf267", size = 854694, upload-time = "2025-11-03T21:33:36.793Z" },
{ url = "https://files.pythonhosted.org/packages/92/12/10650181a040978b2f5720a6a74d44f841371a3d984c2083fc1752e4acf6/regex-2025.11.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:62ba394a3dda9ad41c7c780f60f6e4a70988741415ae96f6d1bf6c239cf01379", size = 799691, upload-time = "2025-11-03T21:33:39.079Z" },
{ url = "https://files.pythonhosted.org/packages/67/90/8f37138181c9a7690e7e4cb388debbd389342db3c7381d636d2875940752/regex-2025.11.3-cp314-cp314t-win32.whl", hash = "sha256:4bf146dca15cdd53224a1bf46d628bd7590e4a07fbb69e720d561aea43a32b38", size = 274583, upload-time = "2025-11-03T21:33:41.302Z" },
{ url = "https://files.pythonhosted.org/packages/8f/cd/867f5ec442d56beb56f5f854f40abcfc75e11d10b11fdb1869dd39c63aaf/regex-2025.11.3-cp314-cp314t-win_amd64.whl", hash = "sha256:adad1a1bcf1c9e76346e091d22d23ac54ef28e1365117d99521631078dfec9de", size = 284286, upload-time = "2025-11-03T21:33:43.324Z" },
{ url = "https://files.pythonhosted.org/packages/20/31/32c0c4610cbc070362bf1d2e4ea86d1ea29014d400a6d6c2486fcfd57766/regex-2025.11.3-cp314-cp314t-win_arm64.whl", hash = "sha256:c54f768482cef41e219720013cd05933b6f971d9562544d691c68699bf2b6801", size = 274741, upload-time = "2025-11-03T21:33:45.557Z" },
]

[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "regex" },
]

[package.metadata]
requires-dist = [{ name = "regex", specifier = ">=2025.11.3" }]

@@ -0,0 +1,2 @@
[settings]
auto-import = true

@@ -0,0 +1,12 @@
# This test was originally written to
# check that a third party dependency
# gets priority over stdlib. But it was
# not clearly the right choice[1].
#
# 2026-01-09: We stuck with it for now,
# but it seems likely that we'll want
# to regress this task in favor of
# another.
#
# [1]: https://github.com/astral-sh/ruff/pull/22460#discussion_r2676343225
fullma<CURSOR: regex.fullmatch>

@@ -0,0 +1,7 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = [
"regex>=2025.11.3",
]

@@ -0,0 +1,78 @@
version = 1
revision = 3
requires-python = ">=3.13"

[[package]]
name = "regex"
version = "2025.11.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cc/a9/546676f25e573a4cf00fe8e119b78a37b6a8fe2dc95cda877b30889c9c45/regex-2025.11.3.tar.gz", hash = "sha256:1fedc720f9bb2494ce31a58a1631f9c82df6a09b49c19517ea5cc280b4541e01", size = 414669, upload-time = "2025-11-03T21:34:22.089Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e1/a7/dda24ebd49da46a197436ad96378f17df30ceb40e52e859fc42cac45b850/regex-2025.11.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c1e448051717a334891f2b9a620fe36776ebf3dd8ec46a0b877c8ae69575feb4", size = 489081, upload-time = "2025-11-03T21:31:55.9Z" },
{ url = "https://files.pythonhosted.org/packages/19/22/af2dc751aacf88089836aa088a1a11c4f21a04707eb1b0478e8e8fb32847/regex-2025.11.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9b5aca4d5dfd7fbfbfbdaf44850fcc7709a01146a797536a8f84952e940cca76", size = 291123, upload-time = "2025-11-03T21:31:57.758Z" },
{ url = "https://files.pythonhosted.org/packages/a3/88/1a3ea5672f4b0a84802ee9891b86743438e7c04eb0b8f8c4e16a42375327/regex-2025.11.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:04d2765516395cf7dda331a244a3282c0f5ae96075f728629287dfa6f76ba70a", size = 288814, upload-time = "2025-11-03T21:32:01.12Z" },
{ url = "https://files.pythonhosted.org/packages/fb/8c/f5987895bf42b8ddeea1b315c9fedcfe07cadee28b9c98cf50d00adcb14d/regex-2025.11.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d9903ca42bfeec4cebedba8022a7c97ad2aab22e09573ce9976ba01b65e4361", size = 798592, upload-time = "2025-11-03T21:32:03.006Z" },
{ url = "https://files.pythonhosted.org/packages/99/2a/6591ebeede78203fa77ee46a1c36649e02df9eaa77a033d1ccdf2fcd5d4e/regex-2025.11.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:639431bdc89d6429f6721625e8129413980ccd62e9d3f496be618a41d205f160", size = 864122, upload-time = "2025-11-03T21:32:04.553Z" },
{ url = "https://files.pythonhosted.org/packages/94/d6/be32a87cf28cf8ed064ff281cfbd49aefd90242a83e4b08b5a86b38e8eb4/regex-2025.11.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f117efad42068f9715677c8523ed2be1518116d1c49b1dd17987716695181efe", size = 912272, upload-time = "2025-11-03T21:32:06.148Z" },
{ url = "https://files.pythonhosted.org/packages/62/11/9bcef2d1445665b180ac7f230406ad80671f0fc2a6ffb93493b5dd8cd64c/regex-2025.11.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4aecb6f461316adf9f1f0f6a4a1a3d79e045f9b71ec76055a791affa3b285850", size = 803497, upload-time = "2025-11-03T21:32:08.162Z" },
{ url = "https://files.pythonhosted.org/packages/e5/a7/da0dc273d57f560399aa16d8a68ae7f9b57679476fc7ace46501d455fe84/regex-2025.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3b3a5f320136873cc5561098dfab677eea139521cb9a9e8db98b7e64aef44cbc", size = 787892, upload-time = "2025-11-03T21:32:09.769Z" },
{ url = "https://files.pythonhosted.org/packages/da/4b/732a0c5a9736a0b8d6d720d4945a2f1e6f38f87f48f3173559f53e8d5d82/regex-2025.11.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:75fa6f0056e7efb1f42a1c34e58be24072cb9e61a601340cc1196ae92326a4f9", size = 858462, upload-time = "2025-11-03T21:32:11.769Z" },
{ url = "https://files.pythonhosted.org/packages/0c/f5/a2a03df27dc4c2d0c769220f5110ba8c4084b0bfa9ab0f9b4fcfa3d2b0fc/regex-2025.11.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:dbe6095001465294f13f1adcd3311e50dd84e5a71525f20a10bd16689c61ce0b", size = 850528, upload-time = "2025-11-03T21:32:13.906Z" },
{ url = "https://files.pythonhosted.org/packages/d6/09/e1cd5bee3841c7f6eb37d95ca91cdee7100b8f88b81e41c2ef426910891a/regex-2025.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:454d9b4ae7881afbc25015b8627c16d88a597479b9dea82b8c6e7e2e07240dc7", size = 789866, upload-time = "2025-11-03T21:32:15.748Z" },
{ url = "https://files.pythonhosted.org/packages/eb/51/702f5ea74e2a9c13d855a6a85b7f80c30f9e72a95493260193c07f3f8d74/regex-2025.11.3-cp313-cp313-win32.whl", hash = "sha256:28ba4d69171fc6e9896337d4fc63a43660002b7da53fc15ac992abcf3410917c", size = 266189, upload-time = "2025-11-03T21:32:17.493Z" },
{ url = "https://files.pythonhosted.org/packages/8b/00/6e29bb314e271a743170e53649db0fdb8e8ff0b64b4f425f5602f4eb9014/regex-2025.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:bac4200befe50c670c405dc33af26dad5a3b6b255dd6c000d92fe4629f9ed6a5", size = 277054, upload-time = "2025-11-03T21:32:19.042Z" },
{ url = "https://files.pythonhosted.org/packages/25/f1/b156ff9f2ec9ac441710764dda95e4edaf5f36aca48246d1eea3f1fd96ec/regex-2025.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:2292cd5a90dab247f9abe892ac584cb24f0f54680c73fcb4a7493c66c2bf2467", size = 270325, upload-time = "2025-11-03T21:32:21.338Z" },
{ url = "https://files.pythonhosted.org/packages/20/28/fd0c63357caefe5680b8ea052131acbd7f456893b69cc2a90cc3e0dc90d4/regex-2025.11.3-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:1eb1ebf6822b756c723e09f5186473d93236c06c579d2cc0671a722d2ab14281", size = 491984, upload-time = "2025-11-03T21:32:23.466Z" },
{ url = "https://files.pythonhosted.org/packages/df/ec/7014c15626ab46b902b3bcc4b28a7bae46d8f281fc7ea9c95e22fcaaa917/regex-2025.11.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1e00ec2970aab10dc5db34af535f21fcf32b4a31d99e34963419636e2f85ae39", size = 292673, upload-time = "2025-11-03T21:32:25.034Z" },
{ url = "https://files.pythonhosted.org/packages/23/ab/3b952ff7239f20d05f1f99e9e20188513905f218c81d52fb5e78d2bf7634/regex-2025.11.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a4cb042b615245d5ff9b3794f56be4138b5adc35a4166014d31d1814744148c7", size = 291029, upload-time = "2025-11-03T21:32:26.528Z" },
{ url = "https://files.pythonhosted.org/packages/21/7e/3dc2749fc684f455f162dcafb8a187b559e2614f3826877d3844a131f37b/regex-2025.11.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:44f264d4bf02f3176467d90b294d59bf1db9fe53c141ff772f27a8b456b2a9ed", size = 807437, upload-time = "2025-11-03T21:32:28.363Z" },
{ url = "https://files.pythonhosted.org/packages/1b/0b/d529a85ab349c6a25d1ca783235b6e3eedf187247eab536797021f7126c6/regex-2025.11.3-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7be0277469bf3bd7a34a9c57c1b6a724532a0d235cd0dc4e7f4316f982c28b19", size = 873368, upload-time = "2025-11-03T21:32:30.4Z" },
{ url = "https://files.pythonhosted.org/packages/7d/18/2d868155f8c9e3e9d8f9e10c64e9a9f496bb8f7e037a88a8bed26b435af6/regex-2025.11.3-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0d31e08426ff4b5b650f68839f5af51a92a5b51abd8554a60c2fbc7c71f25d0b", size = 914921, upload-time = "2025-11-03T21:32:32.123Z" },
{ url = "https://files.pythonhosted.org/packages/2d/71/9d72ff0f354fa783fe2ba913c8734c3b433b86406117a8db4ea2bf1c7a2f/regex-2025.11.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e43586ce5bd28f9f285a6e729466841368c4a0353f6fd08d4ce4630843d3648a", size = 812708, upload-time = "2025-11-03T21:32:34.305Z" },
{ url = "https://files.pythonhosted.org/packages/e7/19/ce4bf7f5575c97f82b6e804ffb5c4e940c62609ab2a0d9538d47a7fdf7d4/regex-2025.11.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:0f9397d561a4c16829d4e6ff75202c1c08b68a3bdbfe29dbfcdb31c9830907c6", size = 795472, upload-time = "2025-11-03T21:32:36.364Z" },
{ url = "https://files.pythonhosted.org/packages/03/86/fd1063a176ffb7b2315f9a1b08d17b18118b28d9df163132615b835a26ee/regex-2025.11.3-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:dd16e78eb18ffdb25ee33a0682d17912e8cc8a770e885aeee95020046128f1ce", size = 868341, upload-time = "2025-11-03T21:32:38.042Z" },
{ url = "https://files.pythonhosted.org/packages/12/43/103fb2e9811205e7386366501bc866a164a0430c79dd59eac886a2822950/regex-2025.11.3-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:ffcca5b9efe948ba0661e9df0fa50d2bc4b097c70b9810212d6b62f05d83b2dd", size = 854666, upload-time = "2025-11-03T21:32:40.079Z" },
{ url = "https://files.pythonhosted.org/packages/7d/22/e392e53f3869b75804762c7c848bd2dd2abf2b70fb0e526f58724638bd35/regex-2025.11.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c56b4d162ca2b43318ac671c65bd4d563e841a694ac70e1a976ac38fcf4ca1d2", size = 799473, upload-time = "2025-11-03T21:32:42.148Z" },
{ url = "https://files.pythonhosted.org/packages/4f/f9/8bd6b656592f925b6845fcbb4d57603a3ac2fb2373344ffa1ed70aa6820a/regex-2025.11.3-cp313-cp313t-win32.whl", hash = "sha256:9ddc42e68114e161e51e272f667d640f97e84a2b9ef14b7477c53aac20c2d59a", size = 268792, upload-time = "2025-11-03T21:32:44.13Z" },
{ url = "https://files.pythonhosted.org/packages/e5/87/0e7d603467775ff65cd2aeabf1b5b50cc1c3708556a8b849a2fa4dd1542b/regex-2025.11.3-cp313-cp313t-win_amd64.whl", hash = "sha256:7a7c7fdf755032ffdd72c77e3d8096bdcb0eb92e89e17571a196f03d88b11b3c", size = 280214, upload-time = "2025-11-03T21:32:45.853Z" },
{ url = "https://files.pythonhosted.org/packages/8d/d0/2afc6f8e94e2b64bfb738a7c2b6387ac1699f09f032d363ed9447fd2bb57/regex-2025.11.3-cp313-cp313t-win_arm64.whl", hash = "sha256:df9eb838c44f570283712e7cff14c16329a9f0fb19ca492d21d4b7528ee6821e", size = 271469, upload-time = "2025-11-03T21:32:48.026Z" },
{ url = "https://files.pythonhosted.org/packages/31/e9/f6e13de7e0983837f7b6d238ad9458800a874bf37c264f7923e63409944c/regex-2025.11.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9697a52e57576c83139d7c6f213d64485d3df5bf84807c35fa409e6c970801c6", size = 489089, upload-time = "2025-11-03T21:32:50.027Z" },
{ url = "https://files.pythonhosted.org/packages/a3/5c/261f4a262f1fa65141c1b74b255988bd2fa020cc599e53b080667d591cfc/regex-2025.11.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e18bc3f73bd41243c9b38a6d9f2366cd0e0137a9aebe2d8ff76c5b67d4c0a3f4", size = 291059, upload-time = "2025-11-03T21:32:51.682Z" },
{ url = "https://files.pythonhosted.org/packages/8e/57/f14eeb7f072b0e9a5a090d1712741fd8f214ec193dba773cf5410108bb7d/regex-2025.11.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:61a08bcb0ec14ff4e0ed2044aad948d0659604f824cbd50b55e30b0ec6f09c73", size = 288900, upload-time = "2025-11-03T21:32:53.569Z" },
{ url = "https://files.pythonhosted.org/packages/3c/6b/1d650c45e99a9b327586739d926a1cd4e94666b1bd4af90428b36af66dc7/regex-2025.11.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9c30003b9347c24bcc210958c5d167b9e4f9be786cb380a7d32f14f9b84674f", size = 799010, upload-time = "2025-11-03T21:32:55.222Z" },
{ url = "https://files.pythonhosted.org/packages/99/ee/d66dcbc6b628ce4e3f7f0cbbb84603aa2fc0ffc878babc857726b8aab2e9/regex-2025.11.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4e1e592789704459900728d88d41a46fe3969b82ab62945560a31732ffc19a6d", size = 864893, upload-time = "2025-11-03T21:32:57.239Z" },
{ url = "https://files.pythonhosted.org/packages/bf/2d/f238229f1caba7ac87a6c4153d79947fb0261415827ae0f77c304260c7d3/regex-2025.11.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6538241f45eb5a25aa575dbba1069ad786f68a4f2773a29a2bd3dd1f9de787be", size = 911522, upload-time = "2025-11-03T21:32:59.274Z" },
{ url = "https://files.pythonhosted.org/packages/bd/3d/22a4eaba214a917c80e04f6025d26143690f0419511e0116508e24b11c9b/regex-2025.11.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce22519c989bb72a7e6b36a199384c53db7722fe669ba891da75907fe3587db", size = 803272, upload-time = "2025-11-03T21:33:01.393Z" },
{ url = "https://files.pythonhosted.org/packages/84/b1/03188f634a409353a84b5ef49754b97dbcc0c0f6fd6c8ede505a8960a0a4/regex-2025.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:66d559b21d3640203ab9075797a55165d79017520685fb407b9234d72ab63c62", size = 787958, upload-time = "2025-11-03T21:33:03.379Z" },
{ url = "https://files.pythonhosted.org/packages/99/6a/27d072f7fbf6fadd59c64d210305e1ff865cc3b78b526fd147db768c553b/regex-2025.11.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:669dcfb2e38f9e8c69507bace46f4889e3abbfd9b0c29719202883c0a603598f", size = 859289, upload-time = "2025-11-03T21:33:05.374Z" },
{ url = "https://files.pythonhosted.org/packages/9a/70/1b3878f648e0b6abe023172dacb02157e685564853cc363d9961bcccde4e/regex-2025.11.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:32f74f35ff0f25a5021373ac61442edcb150731fbaa28286bbc8bb1582c89d02", size = 850026, upload-time = "2025-11-03T21:33:07.131Z" },
{ url = "https://files.pythonhosted.org/packages/dd/d5/68e25559b526b8baab8e66839304ede68ff6727237a47727d240006bd0ff/regex-2025.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e6c7a21dffba883234baefe91bc3388e629779582038f75d2a5be918e250f0ed", size = 789499, upload-time = "2025-11-03T21:33:09.141Z" },
{ url = "https://files.pythonhosted.org/packages/fc/df/43971264857140a350910d4e33df725e8c94dd9dee8d2e4729fa0d63d49e/regex-2025.11.3-cp314-cp314-win32.whl", hash = "sha256:795ea137b1d809eb6836b43748b12634291c0ed55ad50a7d72d21edf1cd565c4", size = 271604, upload-time = "2025-11-03T21:33:10.9Z" },
{ url = "https://files.pythonhosted.org/packages/01/6f/9711b57dc6894a55faf80a4c1b5aa4f8649805cb9c7aef46f7d27e2b9206/regex-2025.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:9f95fbaa0ee1610ec0fc6b26668e9917a582ba80c52cc6d9ada15e30aa9ab9ad", size = 280320, upload-time = "2025-11-03T21:33:12.572Z" },
{ url = "https://files.pythonhosted.org/packages/f1/7e/f6eaa207d4377481f5e1775cdeb5a443b5a59b392d0065f3417d31d80f87/regex-2025.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:dfec44d532be4c07088c3de2876130ff0fbeeacaa89a137decbbb5f665855a0f", size = 273372, upload-time = "2025-11-03T21:33:14.219Z" },
{ url = "https://files.pythonhosted.org/packages/c3/06/49b198550ee0f5e4184271cee87ba4dfd9692c91ec55289e6282f0f86ccf/regex-2025.11.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ba0d8a5d7f04f73ee7d01d974d47c5834f8a1b0224390e4fe7c12a3a92a78ecc", size = 491985, upload-time = "2025-11-03T21:33:16.555Z" },
{ url = "https://files.pythonhosted.org/packages/ce/bf/abdafade008f0b1c9da10d934034cb670432d6cf6cbe38bbb53a1cfd6cf8/regex-2025.11.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:442d86cf1cfe4faabf97db7d901ef58347efd004934da045c745e7b5bd57ac49", size = 292669, upload-time = "2025-11-03T21:33:18.32Z" },
{ url = "https://files.pythonhosted.org/packages/f9/ef/0c357bb8edbd2ad8e273fcb9e1761bc37b8acbc6e1be050bebd6475f19c1/regex-2025.11.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:fd0a5e563c756de210bb964789b5abe4f114dacae9104a47e1a649b910361536", size = 291030, upload-time = "2025-11-03T21:33:20.048Z" },
{ url = "https://files.pythonhosted.org/packages/79/06/edbb67257596649b8fb088d6aeacbcb248ac195714b18a65e018bf4c0b50/regex-2025.11.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bf3490bcbb985a1ae97b2ce9ad1c0f06a852d5b19dde9b07bdf25bf224248c95", size = 807674, upload-time = "2025-11-03T21:33:21.797Z" },
{ url = "https://files.pythonhosted.org/packages/f4/d9/ad4deccfce0ea336296bd087f1a191543bb99ee1c53093dcd4c64d951d00/regex-2025.11.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3809988f0a8b8c9dcc0f92478d6501fac7200b9ec56aecf0ec21f4a2ec4b6009", size = 873451, upload-time = "2025-11-03T21:33:23.741Z" },
{ url = "https://files.pythonhosted.org/packages/13/75/a55a4724c56ef13e3e04acaab29df26582f6978c000ac9cd6810ad1f341f/regex-2025.11.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f4ff94e58e84aedb9c9fce66d4ef9f27a190285b451420f297c9a09f2b9abee9", size = 914980, upload-time = "2025-11-03T21:33:25.999Z" },
{ url = "https://files.pythonhosted.org/packages/67/1e/a1657ee15bd9116f70d4a530c736983eed997b361e20ecd8f5ca3759d5c5/regex-2025.11.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7eb542fd347ce61e1321b0a6b945d5701528dca0cd9759c2e3bb8bd57e47964d", size = 812852, upload-time = "2025-11-03T21:33:27.852Z" },
{ url = "https://files.pythonhosted.org/packages/b8/6f/f7516dde5506a588a561d296b2d0044839de06035bb486b326065b4c101e/regex-2025.11.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:d6c2d5919075a1f2e413c00b056ea0c2f065b3f5fe83c3d07d325ab92dce51d6", size = 795566, upload-time = "2025-11-03T21:33:32.364Z" },
{ url = "https://files.pythonhosted.org/packages/d9/dd/3d10b9e170cc16fb34cb2cef91513cf3df65f440b3366030631b2984a264/regex-2025.11.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:3f8bf11a4827cc7ce5a53d4ef6cddd5ad25595d3c1435ef08f76825851343154", size = 868463, upload-time = "2025-11-03T21:33:34.459Z" },
{ url = "https://files.pythonhosted.org/packages/f5/8e/935e6beff1695aa9085ff83195daccd72acc82c81793df480f34569330de/regex-2025.11.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:22c12d837298651e5550ac1d964e4ff57c3f56965fc1812c90c9fb2028eaf267", size = 854694, upload-time = "2025-11-03T21:33:36.793Z" },
{ url = "https://files.pythonhosted.org/packages/92/12/10650181a040978b2f5720a6a74d44f841371a3d984c2083fc1752e4acf6/regex-2025.11.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:62ba394a3dda9ad41c7c780f60f6e4a70988741415ae96f6d1bf6c239cf01379", size = 799691, upload-time = "2025-11-03T21:33:39.079Z" },
{ url = "https://files.pythonhosted.org/packages/67/90/8f37138181c9a7690e7e4cb388debbd389342db3c7381d636d2875940752/regex-2025.11.3-cp314-cp314t-win32.whl", hash = "sha256:4bf146dca15cdd53224a1bf46d628bd7590e4a07fbb69e720d561aea43a32b38", size = 274583, upload-time = "2025-11-03T21:33:41.302Z" },
{ url = "https://files.pythonhosted.org/packages/8f/cd/867f5ec442d56beb56f5f854f40abcfc75e11d10b11fdb1869dd39c63aaf/regex-2025.11.3-cp314-cp314t-win_amd64.whl", hash = "sha256:adad1a1bcf1c9e76346e091d22d23ac54ef28e1365117d99521631078dfec9de", size = 284286, upload-time = "2025-11-03T21:33:43.324Z" },
{ url = "https://files.pythonhosted.org/packages/20/31/32c0c4610cbc070362bf1d2e4ea86d1ea29014d400a6d6c2486fcfd57766/regex-2025.11.3-cp314-cp314t-win_arm64.whl", hash = "sha256:c54f768482cef41e219720013cd05933b6f971d9562544d691c68699bf2b6801", size = 274741, upload-time = "2025-11-03T21:33:45.557Z" },
]
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "regex" },
]
[package.metadata]
requires-dist = [{ name = "regex", specifier = ">=2025.11.3" }]

View File

@@ -0,0 +1,2 @@
[settings]
auto-import = true

View File

@@ -0,0 +1,10 @@
scope1 = 1
def x():
scope2 = 1
def xx():
scope3 = 1
def xxx():
# We specifically want `scope3` to be
# suggested first here, since that's in
# the "tighter" scope.
scope<CURSOR: scope3>

View File

@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []

View File

@@ -0,0 +1,8 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }

View File

@@ -1,12 +0,0 @@
# This one demands that `TypeVa` complete to `typing.TypeVar`
# even though there is also an `ast.TypeVar`. Getting this one
# right seems tricky, and probably requires module-specific
# heuristics.
#
# ref: https://github.com/astral-sh/ty/issues/1274#issuecomment-3345884227
TypeVa<CURSOR: typing.TypeVar>
# This is a similar case of `ctypes.cast` being preferred over
# `typing.cast`. Maybe `typing` should just get a slightly higher
# weight than most other stdlib modules?
cas<CURSOR: typing.cast>

View File

@@ -0,0 +1,2 @@
[settings]
auto-import = true

View File

@@ -0,0 +1,19 @@
# Many of these came from discussion in:
# <https://github.com/astral-sh/ty/issues/1274>
# We should prefer `typing` over `asyncio` here.
class Foo(Protoco<CURSOR: typing.Protocol>): ...
# We should prefer `typing` over `ty_extensions`
# or `typing_extensions`.
reveal_<CURSOR: typing.reveal_type>
# We should prefer `typing` over `ast`.
TypeVa<CURSOR: typing.TypeVar>
# We should prefer `typing` over `ctypes`.
cast<CURSOR: typing.cast>
# We should prefer a non-stdlib project import
# over a stdlib `typing` import.
NoRetur<CURSOR: sub1.NoReturn>

View File

@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []

View File

@@ -0,0 +1 @@
NoReturn = 1

View File

@@ -0,0 +1,8 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }

View File

@@ -27,6 +27,7 @@ ty_project = { workspace = true, features = ["testing"] }
ty_python_semantic = { workspace = true }
ty_vendored = { workspace = true }
compact_str = { workspace = true }
get-size2 = { workspace = true }
itertools = { workspace = true }
rayon = { workspace = true }

View File

@@ -64,7 +64,11 @@ pub fn all_symbols<'db>(
continue;
}
s.spawn(move |_| {
let symbols_for_file_span = tracing::debug_span!(parent: all_symbols_span, "symbols_for_file_global_only", ?file);
let symbols_for_file_span = tracing::debug_span!(
parent: all_symbols_span,
"symbols_for_file_global_only",
path = %file.path(&*db),
);
let _entered = symbols_for_file_span.entered();
if query.is_match_symbol_name(module.name(&*db)) {
@@ -94,11 +98,13 @@ pub fn all_symbols<'db>(
let key1 = (
s1.name_in_file()
.unwrap_or_else(|| s1.module().name(db).as_str()),
s1.module().name(db).as_str(),
s1.file.path(db).as_str(),
);
let key2 = (
s2.name_in_file()
.unwrap_or_else(|| s2.module().name(db).as_str()),
s2.module().name(db).as_str(),
s2.file.path(db).as_str(),
);
key1.cmp(&key2)
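The hunk above changes the symbol sort key to tie-break on the file path rather than the module name, so symbols with the same name get a deterministic order across files. A minimal standalone sketch of that composite-key sort (the `Sym` struct and its field names are hypothetical stand-ins for the real types):

```rust
// Hypothetical stand-in for the real symbol type used in `all_symbols`.
struct Sym {
    name_in_file: Option<&'static str>,
    module_name: &'static str,
    file_path: &'static str,
}

/// Sorts symbols by (display name, file path), falling back to the
/// module name when the symbol has no name of its own.
fn sort_symbols(syms: &mut [Sym]) {
    syms.sort_by(|s1, s2| {
        let key1 = (s1.name_in_file.unwrap_or(s1.module_name), s1.file_path);
        let key2 = (s2.name_in_file.unwrap_or(s2.module_name), s2.file_path);
        key1.cmp(&key2)
    });
}

fn main() {
    let mut syms = vec![
        Sym { name_in_file: Some("cast"), module_name: "typing", file_path: "b.pyi" },
        Sym { name_in_file: Some("cast"), module_name: "ctypes", file_path: "a.pyi" },
    ];
    sort_symbols(&mut syms);
    // Equal names tie-break on the file path, not the module name.
    assert_eq!(syms[0].file_path, "a.pyi");
}
```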

View File

@@ -5,8 +5,8 @@ use ruff_diagnostics::Edit;
use ruff_python_ast::find_node::covering_node;
use ruff_text_size::TextRange;
use ty_project::Db;
use ty_python_semantic::create_suppression_fix;
use ty_python_semantic::lint::LintId;
use ty_python_semantic::suppress_single;
use ty_python_semantic::types::{UNDEFINED_REVEAL, UNRESOLVED_REFERENCE};
/// A `QuickFix` Code Action
@@ -42,7 +42,7 @@ pub fn code_actions(
// Suggest just suppressing the lint (always a valid option, but never ideal)
actions.push(QuickFix {
title: format!("Ignore '{}' for this line", lint_id.name()),
edits: create_suppression_fix(db, file, lint_id, diagnostic_range).into_edits(),
edits: suppress_single(db, file, lint_id, diagnostic_range).into_edits(),
preferred: false,
});
@@ -437,6 +437,38 @@ mod tests {
"#);
}
#[test]
fn add_ignore_line_continuation_empty_lines() {
let test = CodeActionTest::with_source(
r#"b = bbbbb \
[ ccc # test
+ <START>ddd<END> \
] # test
"#,
);
assert_snapshot!(test.code_actions(&UNRESOLVED_REFERENCE), @r"
info[code-action]: Ignore 'unresolved-reference' for this line
--> main.py:4:11
|
2 | [ ccc # test
3 |
4 | + ddd \
| ^^^
5 |
6 | ] # test
|
2 | [ ccc # test
3 |
4 | + ddd \
-
5 + # ty:ignore[unresolved-reference]
6 | ] # test
");
}
#[test]
fn undefined_reveal_type() {
let test = CodeActionTest::with_source(

File diff suppressed because it is too large

View File

@@ -34,8 +34,8 @@ pub struct ParameterDetails<'db> {
pub name: String,
/// The parameter label in the signature (e.g., "param1: str")
pub label: String,
/// The annotated type of the parameter, if any
pub ty: Option<Type<'db>>,
/// The annotated type of the parameter. If no annotation was provided, this is `Unknown`.
pub ty: Type<'db>,
/// Documentation specific to the parameter, typically extracted from the
/// function's docstring
pub documentation: Option<String>,
@@ -237,7 +237,7 @@ fn create_parameters_from_offsets<'db>(
docstring: Option<&Docstring>,
parameter_names: &[String],
parameter_kinds: &[ParameterKind],
parameter_types: &[Option<Type<'db>>],
parameter_types: &[Type<'db>],
) -> Vec<ParameterDetails<'db>> {
// Extract parameter documentation from the function's docstring if available.
let param_docs = if let Some(docstring) = docstring {
@@ -264,12 +264,11 @@ fn create_parameters_from_offsets<'db>(
parameter_kinds.get(i),
Some(ParameterKind::PositionalOnly { .. })
);
let ty = parameter_types.get(i).copied().flatten();
ParameterDetails {
name: param_name.to_string(),
label,
ty,
ty: parameter_types[i],
documentation: param_docs.get(param_name).cloned(),
is_positional_only,
}
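The hunk above replaces `Option<Type>` with a plain `Type`, folding the "no annotation" case into an explicit `Unknown` type. A hedged sketch of that pattern using a toy `Ty` enum (not the real type representation):

```rust
// Toy stand-in for the semantic type: the "no annotation" case becomes
// an explicit `Unknown` variant instead of `Option::None`.
#[derive(Copy, Clone, Debug, PartialEq)]
enum Ty {
    Unknown,
    Named(&'static str),
}

/// Converts an optional annotation into a `Ty`, defaulting to `Unknown`.
fn from_annotation(annotation: Option<&'static str>) -> Ty {
    annotation.map(Ty::Named).unwrap_or(Ty::Unknown)
}

fn main() {
    assert_eq!(from_annotation(Some("str")), Ty::Named("str"));
    // Unannotated parameters surface as `Unknown` rather than `None`,
    // so callers no longer need to handle a nested `Option`.
    assert_eq!(from_annotation(None), Ty::Unknown);
}
```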

View File

@@ -554,10 +554,17 @@ impl SearchPath {
)
}
/// Is this search path in "first party" code? i.e., The
/// end user's project code.
pub fn is_first_party(&self) -> bool {
matches!(&*self.0, SearchPathInner::FirstParty(_))
}
/// Is the module in a site-packages directory?
pub fn is_site_packages(&self) -> bool {
matches!(&*self.0, SearchPathInner::SitePackages(_))
}
fn is_valid_extension(&self, extension: &str) -> bool {
if self.is_standard_library() {
extension == "pyi"
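The two new predicates rely on `matches!` against the inner enum. A self-contained sketch of the same pattern (variant names mirror the diff, but the types here are simplified stand-ins; the real `SearchPath` wraps its inner enum in a smart pointer):

```rust
// Simplified stand-ins for the real search-path types.
#[allow(dead_code)]
enum SearchPathInner {
    FirstParty(&'static str),
    SitePackages(&'static str),
}

struct SearchPath(SearchPathInner);

impl SearchPath {
    /// Is this search path in "first party" code, i.e. the end user's project?
    fn is_first_party(&self) -> bool {
        matches!(self.0, SearchPathInner::FirstParty(_))
    }

    /// Is the module in a site-packages directory?
    fn is_site_packages(&self) -> bool {
        matches!(self.0, SearchPathInner::SitePackages(_))
    }
}

fn main() {
    let p = SearchPath(SearchPathInner::SitePackages("/site-packages"));
    // `matches!` expands to a `match` that returns a bool, keeping the
    // predicate methods one-liners.
    assert!(p.is_site_packages());
    assert!(!p.is_first_party());
}
```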

View File

@@ -14,6 +14,7 @@ license.workspace = true
[dependencies]
ruff_cache = { workspace = true }
ruff_db = { workspace = true, features = ["cache", "serde"] }
ruff_diagnostics = { workspace = true }
ruff_macros = { workspace = true }
ruff_memory_usage = { workspace = true }
ruff_options_metadata = { workspace = true }
@@ -30,7 +31,7 @@ anyhow = { workspace = true }
camino = { workspace = true }
colored = { workspace = true }
crossbeam = { workspace = true }
get-size2 = { workspace = true }
get-size2 = { workspace = true, features = ["ordermap"] }
globset = { workspace = true }
notify = { workspace = true }
ordermap = { workspace = true, features = ["serde"] }
@@ -48,8 +49,10 @@ toml = { workspace = true }
tracing = { workspace = true }
[dev-dependencies]
insta = { workspace = true, features = ["redactions", "ron"] }
ruff_db = { workspace = true, features = ["testing"] }
ruff_python_trivia = { workspace = true }
insta = { workspace = true, features = ["redactions", "ron"] }
[features]
default = ["zstd"]

View File

@@ -0,0 +1,794 @@
use ruff_db::cancellation::{Canceled, CancellationToken};
use ruff_db::diagnostic::{DisplayDiagnosticConfig, DisplayDiagnostics};
use ruff_db::parsed::parsed_module;
use ruff_db::source::SourceText;
use ruff_db::system::{SystemPath, WritableSystem};
use ruff_db::{
diagnostic::{Annotation, Diagnostic, DiagnosticId, Severity, Span},
files::File,
source::source_text,
};
use ruff_diagnostics::{Fix, IsolationLevel, SourceMap};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
use rustc_hash::FxHashSet;
use salsa::Setter as _;
use std::collections::BTreeMap;
use thiserror::Error;
use ty_python_semantic::{UNUSED_IGNORE_COMMENT, suppress_all};
use crate::Db;
pub struct SuppressAllResult {
/// The non-lint diagnostics that can't be suppressed, along with the diagnostics of files
/// that couldn't be suppressed (because ty failed to write the result back to disk,
/// or the file contains syntax errors).
pub diagnostics: Vec<Diagnostic>,
/// The number of diagnostics that were suppressed.
pub count: usize,
}
/// Adds suppressions to all lint diagnostics and writes the changed files back to disk.
///
/// Returns how many diagnostics were suppressed, along with the remaining, non-suppressed diagnostics.
///
/// ## Panics
/// If the `db`'s system isn't [writable](WritableSystem).
pub fn suppress_all_diagnostics(
db: &mut dyn Db,
mut diagnostics: Vec<Diagnostic>,
cancellation_token: &CancellationToken,
) -> Result<SuppressAllResult, Canceled> {
let system = WritableSystem::dyn_clone(
db.system()
.as_writable()
.expect("System should be writable"),
);
let has_fixable = diagnostics.iter().any(|diagnostic| {
diagnostic
.primary_span()
.and_then(|span| span.range())
.is_some()
&& diagnostic.id().is_lint()
&& diagnostic.id() != DiagnosticId::Lint(UNUSED_IGNORE_COMMENT.name())
});
// Early return if there are no diagnostics that can be suppressed to avoid all the heavy work below.
if !has_fixable {
return Ok(SuppressAllResult {
diagnostics,
count: 0,
});
}
let mut by_file: BTreeMap<File, Vec<_>> = BTreeMap::new();
// Group the diagnostics by file, leave the file-agnostic diagnostics in `diagnostics`.
for diagnostic in diagnostics.extract_if(.., |diagnostic| diagnostic.primary_span().is_some()) {
let span = diagnostic
.primary_span()
.expect("should be set because `extract_if` only yields elements with a primary_span");
by_file
.entry(span.expect_ty_file())
.or_default()
.push(diagnostic);
}
let mut fixed_count = 0usize;
let project = db.project();
// Try to suppress all lint-diagnostics in the given file.
for (&file, file_diagnostics) in &mut by_file {
if cancellation_token.is_cancelled() {
return Err(Canceled);
}
let Some(path) = file.path(db).as_system_path() else {
tracing::debug!(
"Skipping file `{}` with non-system path because vendored and system virtual file paths are read-only",
file.path(db)
);
continue;
};
let parsed = parsed_module(db, file);
if parsed.load(db).has_syntax_errors() {
tracing::warn!("Skipping file `{path}` with syntax errors");
continue;
}
let fixable_diagnostics: Vec<_> = file_diagnostics
.iter()
.filter_map(|diagnostic| {
let lint_id = diagnostic.id().as_lint()?;
// Don't suppress unused ignore comments.
if lint_id == UNUSED_IGNORE_COMMENT.name() {
return None;
}
// We can't suppress diagnostics without a corresponding file or range.
let span = diagnostic.primary_span()?;
let range = span.range()?;
Some((lint_id, range))
})
.collect();
if fixable_diagnostics.is_empty() {
tracing::debug!(
"Skipping file `{path}` because it contains no suppressable diagnostics"
);
continue;
}
tracing::debug!(
"Suppressing {} diagnostics in `{path}`.",
fixable_diagnostics.len()
);
// Required to work around borrow checker issues.
let path = path.to_path_buf();
let fixes = suppress_all(db, file, &fixable_diagnostics);
let source = source_text(db, file);
// TODO: Handle overlapping fixes when adding support for `--fix` by iterating until all fixes
// were successfully applied. We don't need to do that for suppressions because suppression fixes
// should never overlap (and, if they did, the worst outcome is that some suppressions would be missing).
let FixedCode {
source: new_source,
source_map,
} = apply_fixes(&source, fixes).unwrap_or_else(|fixed| fixed);
let new_source = source.with_text(new_source, &source_map);
// Verify that the fix didn't introduce any syntax errors by overriding
// the source text for `file`.
let mut source_guard = WithUpdatedSourceGuard::new(db, file, &source, new_source.clone());
let db = source_guard.db();
let new_parsed = parsed_module(db, file);
let new_parsed = new_parsed.load(db);
if new_parsed.has_syntax_errors() {
let mut diag = Diagnostic::new(
DiagnosticId::InternalError,
Severity::Fatal,
format_args!(
"Adding suppressions introduced a syntax error. Reverting all changes."
),
);
let mut file_annotation = Annotation::primary(Span::from(file));
file_annotation.hide_snippet(true);
diag.annotate(file_annotation);
let parse_diagnostics: Vec<_> = new_parsed
.errors()
.iter()
.map(|error| {
Diagnostic::invalid_syntax(Span::from(file), &error.error, error.location)
})
.collect();
diag.add_bug_sub_diagnostics("%5BFix%20error%5D");
let file_db: &dyn ruff_db::Db = db;
diag.info(format_args!(
"Introduced syntax errors:\n\n{}",
DisplayDiagnostics::new(
&file_db,
&DisplayDiagnosticConfig::default(),
&parse_diagnostics
)
));
file_diagnostics.push(diag);
continue;
}
// Write the changes back to disk.
if let Err(err) = write_changes(db, &*system, file, &path, &new_source) {
let mut diag = Diagnostic::new(
DiagnosticId::Io,
Severity::Error,
format_args!("Failed to write fixes to file: {err}"),
);
diag.annotate(Annotation::primary(Span::from(file)));
diagnostics.push(diag);
continue;
}
// If we got here then we've been successful. Re-check to get the diagnostics with the
// updated source, and update the fix count.
if fixable_diagnostics.len() == file_diagnostics.len() {
file_diagnostics.clear();
} else {
// If there are any other file level diagnostics, call `check_file` to re-compute them
// with updated ranges.
let diagnostics = project.check_file(db, file);
*file_diagnostics = diagnostics;
}
fixed_count += fixable_diagnostics.len();
// Don't restore the source text or we risk a panic when rendering the diagnostics
// if reading any of the fixed files fails (for whatever reason).
// The override will get removed on the next `File::sync_path` call.
source_guard.defuse();
}
// Stitch the remaining diagnostics back together.
diagnostics.extend(by_file.into_values().flatten());
diagnostics.sort_by(|left, right| {
left.rendering_sort_key(db)
.cmp(&right.rendering_sort_key(db))
});
Ok(SuppressAllResult {
diagnostics,
count: fixed_count,
})
}
fn write_changes(
db: &dyn Db,
system: &dyn WritableSystem,
file: File,
path: &SystemPath,
new_source: &SourceText,
) -> Result<(), WriteChangesError> {
let metadata = system.path_metadata(path)?;
if metadata.revision() != file.revision(db) {
return Err(WriteChangesError::FileWasModified);
}
system.write_file_bytes(path, &new_source.to_bytes())?;
Ok(())
}
#[derive(Debug, Error)]
enum WriteChangesError {
#[error("failed to write changes to disk: {0}")]
Io(#[from] std::io::Error),
#[error("the file has been modified")]
FileWasModified,
}
/// Apply a series of fixes to `File` and returns the updated source code along with the source map.
///
/// Returns an error if not all fixes were applied because some fixes are overlapping.
fn apply_fixes(source: &str, mut fixes: Vec<Fix>) -> Result<FixedCode, FixedCode> {
let mut output = String::with_capacity(source.len());
let mut last_pos: Option<TextSize> = None;
let mut has_overlapping_fixes = false;
let mut isolated: FxHashSet<u32> = FxHashSet::default();
let mut source_map = SourceMap::default();
fixes.sort_unstable_by_key(Fix::min_start);
for fix in fixes {
let mut edits = fix.edits().iter().peekable();
// If the fix contains at least one new edit, enforce isolation and positional requirements.
if let Some(first) = edits.peek() {
// If this fix requires isolation, and we've already applied another fix in the
// same isolation group, skip it.
if let IsolationLevel::Group(id) = fix.isolation() {
if !isolated.insert(id) {
has_overlapping_fixes = true;
continue;
}
}
// If this fix overlaps with a fix we've already applied, skip it.
if last_pos.is_some_and(|last_pos| last_pos >= first.start()) {
has_overlapping_fixes = true;
continue;
}
}
let mut applied_edits = Vec::with_capacity(fix.edits().len());
for edit in edits {
// Add all contents from `last_pos` to `fix.location`.
let slice = &source[TextRange::new(last_pos.unwrap_or_default(), edit.start())];
output.push_str(slice);
// Add the start source marker for the patch.
source_map.push_start_marker(edit, output.text_len());
// Add the patch itself.
output.push_str(edit.content().unwrap_or_default());
// Add the end source marker for the added patch.
source_map.push_end_marker(edit, output.text_len());
// Track that the edit was applied.
last_pos = Some(edit.end());
applied_edits.push(edit);
}
}
// Add the remaining content.
let slice = &source[last_pos.unwrap_or_default().to_usize()..];
output.push_str(slice);
let fixed = FixedCode {
source: output,
source_map,
};
if has_overlapping_fixes {
Err(fixed)
} else {
Ok(fixed)
}
}
struct FixedCode {
/// Source map that allows mapping positions in the fixed code back to positions in the original
/// source code (useful for mapping fixed lines back to their original notebook cells).
source_map: SourceMap,
/// The fixed source code
source: String,
}
/// Guard that sets [`File::set_source_text_override`] and guarantees to restore the original source
/// text unless the guard is explicitly defused.
struct WithUpdatedSourceGuard<'db> {
db: &'db mut dyn Db,
file: File,
old_source: Option<SourceText>,
}
impl<'db> WithUpdatedSourceGuard<'db> {
fn new(
db: &'db mut dyn Db,
file: File,
old_source: &SourceText,
new_source: SourceText,
) -> Self {
file.set_source_text_override(db).to(Some(new_source));
Self {
db,
file,
old_source: Some(old_source.clone()),
}
}
fn defuse(&mut self) {
self.old_source = None;
}
fn db(&mut self) -> &mut dyn Db {
self.db
}
}
impl Drop for WithUpdatedSourceGuard<'_> {
fn drop(&mut self) {
if let Some(old_source) = self.old_source.take() {
// We don't set `source_text_override` to `None` here because setting the value
// invalidates the `source_text` query and there's the chance that reading the file's content
// will fail this time (e.g. because the file was deleted), resulting in ty panicking
// when trying to render any diagnostic for that file (because all offsets now point nowhere).
// The override will be cleared by `File::sync_path`, the next time the revision changes.
self.file
.set_source_text_override(self.db)
.to(Some(old_source));
}
}
}
#[cfg(test)]
mod tests {
use std::collections::hash_map::Entry;
use std::hash::{DefaultHasher, Hash, Hasher};
use insta::assert_snapshot;
use ruff_db::cancellation::CancellationTokenSource;
use ruff_db::diagnostic::{Diagnostic, DisplayDiagnosticConfig, DisplayDiagnostics};
use ruff_db::files::{File, system_path_to_file};
use ruff_db::parsed::parsed_module;
use ruff_db::source::source_text;
use ruff_db::system::{DbWithWritableSystem, SystemPath, SystemPathBuf};
use ruff_python_ast::name::Name;
use rustc_hash::FxHashMap;
use ty_python_semantic::UNUSED_IGNORE_COMMENT;
use ty_python_semantic::lint::Level;
use crate::db::tests::TestDb;
use crate::metadata::options::Rules;
use crate::metadata::value::RangedValue;
use crate::{Db, ProjectMetadata, suppress_all_diagnostics};
#[test]
fn simple_suppression() {
assert_snapshot!(
suppress_all_in(r#"
a = b + 10"#
),
@r"
Added 1 suppressions
## Fixed source
```py
a = b + 10 # ty:ignore[unresolved-reference]
```
");
}
#[test]
fn multiple_suppressions_same_code() {
assert_snapshot!(
suppress_all_in(r#"
a = b + 10 + c"#
),
@r"
Added 2 suppressions
## Fixed source
```py
a = b + 10 + c # ty:ignore[unresolved-reference]
```
");
}
#[test]
fn multiple_suppressions_different_codes() {
assert_snapshot!(
suppress_all_in(r#"
import sys
a = b + 10 + sys.veeersion"#
),
@r"
Added 2 suppressions
## Fixed source
```py
import sys
a = b + 10 + sys.veeersion # ty:ignore[unresolved-attribute, unresolved-reference]
```
");
}
#[test]
fn dont_fix_unused_ignore() {
assert_snapshot!(
suppress_all_in(r#"
import sys
a = 5 + 10 # ty: ignore[unresolved-reference]"#
),
@r"
Added 0 suppressions
## Fixed source
```py
import sys
a = 5 + 10 # ty: ignore[unresolved-reference]
```
## Diagnostics after applying fixes
warning[unused-ignore-comment]: Unused `ty: ignore` directive
--> test.py:2:13
|
1 | import sys
2 | a = 5 + 10 # ty: ignore[unresolved-reference]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
help: Remove the unused suppression comment
");
}
#[test]
fn dont_fix_files_containing_syntax_errors() {
assert_snapshot!(
suppress_all_in(r#"
import sys
a = x +
"#
),
@r"
Added 0 suppressions
## Fixed source
```py
import sys
a = x +
```
## Diagnostics after applying fixes
error[unresolved-reference]: Name `x` used when not defined
--> test.py:2:5
|
1 | import sys
2 | a = x +
| ^
|
info: rule `unresolved-reference` is enabled by default
error[invalid-syntax]: Expected an expression
--> test.py:2:8
|
1 | import sys
2 | a = x +
| ^
|
");
}
#[test]
fn arguments() {
assert_snapshot!(
suppress_all_in(r#"
def test(a, b):
pass
test(
a = 10,
c = "unknown"
)
"#
),
@r#"
Added 2 suppressions
## Fixed source
```py
def test(a, b):
pass
test(
a = 10,
c = "unknown" # ty:ignore[unknown-argument]
) # ty:ignore[missing-argument]
```
"#);
}
#[test]
fn return_type() {
assert_snapshot!(
suppress_all_in(r#"class A:
def test(self, b: int) -> str:
return "test"
class B(A):
def test(
self,
b: str
) -> A.b:
pass"#
),
@r#"
Added 2 suppressions
## Fixed source
```py
class A:
def test(self, b: int) -> str:
return "test"
class B(A):
def test(
self,
b: str
) -> A.b: # ty:ignore[invalid-method-override, unresolved-attribute]
pass
```
"#);
}
#[test]
fn existing_ty_ignore() {
assert_snapshot!(
suppress_all_in(r#"class A:
def test(self, b: int) -> str:
return "test"
class B(A):
def test( # ty:ignore[unresolved-reference]
self,
b: str
) -> A.b:
pass"#
),
@r#"
Added 2 suppressions
## Fixed source
```py
class A:
def test(self, b: int) -> str:
return "test"
class B(A):
def test( # ty:ignore[unresolved-reference, invalid-method-override]
self,
b: str
) -> A.b: # ty:ignore[unresolved-attribute]
pass
```
## Diagnostics after applying fixes
warning[unused-ignore-comment]: Unused `ty: ignore` directive: 'unresolved-reference'
--> test.py:7:28
|
6 | class B(A):
7 | def test( # ty:ignore[unresolved-reference, invalid-method-override]
| ^^^^^^^^^^^^^^^^^^^^
8 | self,
9 | b: str
|
help: Remove the unused suppression code
"#);
}
#[track_caller]
fn suppress_all_in(source: &str) -> String {
use std::fmt::Write as _;
let mut metadata = ProjectMetadata::new(Name::new_static("test"), SystemPathBuf::from("."));
metadata.options.rules = Some(Rules::from_iter([(
RangedValue::cli(UNUSED_IGNORE_COMMENT.name.to_string()),
RangedValue::cli(Level::Warn),
)]));
let mut db = TestDb::new(metadata);
db.init_program().unwrap();
db.write_file(
"test.py",
ruff_python_trivia::textwrap::dedent(source).trim(),
)
.unwrap();
let file = system_path_to_file(&db, "test.py").unwrap();
let parsed_before = parsed_module(&db, file);
let had_syntax_errors = parsed_before.load(&db).has_syntax_errors();
let diagnostics = db.project().check_file(&db, file);
let total_diagnostics = diagnostics.len();
let cancellation_token_source = CancellationTokenSource::new();
let fixes =
suppress_all_diagnostics(&mut db, diagnostics, &cancellation_token_source.token())
.expect("operation never gets cancelled");
assert_eq!(fixes.count, total_diagnostics - fixes.diagnostics.len());
File::sync_path(&mut db, SystemPath::new("test.py"));
let fixed = source_text(&db, file);
let parsed = parsed_module(&db, file);
let parsed = parsed.load(&db);
let diagnostics_after_applying_fixes = db.project().check_file(&db, file);
let mut output = String::new();
writeln!(
output,
"Added {} suppressions\n\n## Fixed source\n\n```py\n{}\n```\n",
fixes.count,
fixed.as_str()
)
.unwrap();
if !fixes.diagnostics.is_empty() {
writeln!(
output,
"## Diagnostics after applying fixes\n\n{diagnostics}\n",
diagnostics = DisplayDiagnostics::new(
&db,
&DisplayDiagnosticConfig::default(),
&fixes.diagnostics
)
)
.unwrap();
}
assert!(
!parsed.has_syntax_errors() || had_syntax_errors,
"Fix introduced syntax errors\n\n{output}"
);
        let new_diagnostics =
            diff_diagnostics(&fixes.diagnostics, &diagnostics_after_applying_fixes);

        if !new_diagnostics.is_empty() {
            writeln!(
                &mut output,
                "## New diagnostics after re-checking file\n\n{diagnostics}\n",
                diagnostics = DisplayDiagnostics::new(
                    &db,
                    &DisplayDiagnosticConfig::default(),
                    &new_diagnostics
                )
            )
            .unwrap();
        }

        output
    }

    fn diff_diagnostics<'a>(before: &'a [Diagnostic], after: &'a [Diagnostic]) -> Vec<Diagnostic> {
        let before = DiagnosticFingerprint::group_diagnostics(before);
        let after = DiagnosticFingerprint::group_diagnostics(after);

        after
            .into_iter()
            .filter(|(key, _)| !before.contains_key(key))
            .map(|(_, diagnostic)| diagnostic.clone())
            .collect()
    }

    #[derive(Copy, Clone, Eq, PartialEq, Hash)]
    struct DiagnosticFingerprint(u64);

    impl DiagnosticFingerprint {
        fn group_diagnostics(diagnostics: &[Diagnostic]) -> FxHashMap<Self, &Diagnostic> {
            let mut result = FxHashMap::default();
            for diagnostic in diagnostics {
                Self::from_diagnostic(diagnostic, &mut result);
            }
            result
        }

        fn from_diagnostic<'a>(
            diagnostic: &'a Diagnostic,
            seen: &mut FxHashMap<DiagnosticFingerprint, &'a Diagnostic>,
        ) -> DiagnosticFingerprint {
            let mut disambiguator = 0u64;

            loop {
                let mut h = DefaultHasher::default();
                disambiguator.hash(&mut h);
                diagnostic.id().hash(&mut h);
                let key = DiagnosticFingerprint(h.finish());

                match seen.entry(key) {
                    Entry::Occupied(_) => {
                        disambiguator += 1;
                    }
                    Entry::Vacant(entry) => {
                        entry.insert(diagnostic);
                        return key;
                    }
                }
            }
        }
    }
}
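The `DiagnosticFingerprint` above hashes an incrementing disambiguator together with the diagnostic ID, so N duplicate diagnostics yield N distinct keys and `diff_diagnostics` only reports genuinely *new* occurrences. A minimal standalone sketch of that technique, using a plain `&str` in place of `Diagnostic` and std's `HashMap`/`DefaultHasher` (an assumption for illustration; the real code uses `FxHashMap`):

```rust
use std::collections::HashMap;
use std::collections::hash_map::{DefaultHasher, Entry};
use std::hash::{Hash, Hasher};

/// Produce a fingerprint for `name`, re-hashing with an incrementing
/// disambiguator until an unused key is found. Duplicates of the same
/// name therefore get distinct fingerprints.
fn fingerprint<'a>(name: &'a str, seen: &mut HashMap<u64, &'a str>) -> u64 {
    let mut disambiguator = 0u64;
    loop {
        let mut h = DefaultHasher::default();
        disambiguator.hash(&mut h);
        name.hash(&mut h);
        let key = h.finish();
        match seen.entry(key) {
            // Key already taken by an earlier duplicate: bump and retry.
            Entry::Occupied(_) => disambiguator += 1,
            Entry::Vacant(entry) => {
                entry.insert(name);
                return key;
            }
        }
    }
}

fn main() {
    let mut seen = HashMap::new();
    let a = fingerprint("unused-ignore-comment", &mut seen);
    let b = fingerprint("unused-ignore-comment", &mut seen);
    // Two duplicates, two distinct fingerprints.
    assert_ne!(a, b);
    assert_eq!(seen.len(), 2);
    println!("ok");
}
```

Diffing two fingerprint sets built this way means a fix that leaves one copy of a previously-duplicated diagnostic in place is not misreported as introducing a new one.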


@@ -1,5 +1,6 @@
 use ruff_db::system::SystemPath;
+use crate::glob::include::MatchFile;

 pub(crate) use exclude::{ExcludeFilter, ExcludeFilterBuilder};
 pub(crate) use include::{IncludeFilter, IncludeFilterBuilder};
 pub(crate) use portable::{
@@ -39,7 +40,9 @@ impl IncludeExcludeFilter {
         if self.exclude.match_directory(path, mode) {
             IncludeResult::Excluded
         } else if self.include.match_directory(path) {
-            IncludeResult::Included
+            IncludeResult::Included {
+                literal_match: None,
+            }
         } else {
             IncludeResult::NotIncluded
         }
@@ -52,10 +55,16 @@ impl IncludeExcludeFilter {
     ) -> IncludeResult {
         if self.exclude.match_file(path, mode) {
             IncludeResult::Excluded
-        } else if self.include.match_file(path) {
-            IncludeResult::Included
         } else {
-            IncludeResult::NotIncluded
+            match self.include.match_file(path) {
+                MatchFile::Literal => IncludeResult::Included {
+                    literal_match: Some(true),
+                },
+                MatchFile::Pattern => IncludeResult::Included {
+                    literal_match: Some(false),
+                },
+                MatchFile::No => IncludeResult::NotIncluded,
+            }
         }
     }
 }
@@ -86,7 +95,7 @@ pub(crate) enum IncludeResult {
     ///
     /// For directories: This isn't a guarantee that any file in this directory gets included
     /// but we need to traverse it to make this decision.
-    Included,
+    Included { literal_match: Option<bool> },
     /// The path matches an exclude pattern.
     Excluded,

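The hunks above replace the boolean-style `IncludeResult::Included` with a variant carrying `literal_match: Option<bool>`: `Some(true)` for an exact-path include, `Some(false)` for a glob-pattern include, and `None` for directories, where the distinction doesn't apply. A standalone sketch of that mapping (simplified types mirroring the names in the diff, not the actual crate API):

```rust
/// Result of checking a path against include/exclude filters.
#[derive(Debug, PartialEq)]
enum IncludeResult {
    Included { literal_match: Option<bool> },
    #[allow(dead_code)]
    Excluded,
    NotIncluded,
}

/// Tri-state outcome of matching a file against the include patterns.
enum MatchFile {
    Literal,
    Pattern,
    No,
}

/// Collapse the tri-state file match into the richer `Included` variant.
fn to_result(m: MatchFile) -> IncludeResult {
    match m {
        MatchFile::Literal => IncludeResult::Included { literal_match: Some(true) },
        MatchFile::Pattern => IncludeResult::Included { literal_match: Some(false) },
        MatchFile::No => IncludeResult::NotIncluded,
    }
}

fn main() {
    assert_eq!(
        to_result(MatchFile::Literal),
        IncludeResult::Included { literal_match: Some(true) }
    );
    assert_eq!(to_result(MatchFile::No), IncludeResult::NotIncluded);
    println!("ok");
}
```

Encoding the literal/pattern distinction in the result lets callers treat an explicitly named file differently from one swept in by a glob, without a second lookup.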

@@ -40,25 +40,22 @@ impl ExcludeFilter {
     }

     fn matches(&self, path: &SystemPath, mode: GlobFilterCheckMode, directory: bool) -> bool {
+        // If the path is excluded, return `ignore`
+        if self.ignore.matched(path, directory).is_ignore() {
+            return true;
+        }
+
         match mode {
             GlobFilterCheckMode::TopDown => {
-                match self.ignore.matched(path, directory) {
-                    // No hit or an allow hit means the file or directory is not excluded.
-                    Match::None | Match::Allow => false,
-                    Match::Ignore => true,
-                }
+                // No hit or an allow hit means the file or directory is not excluded.
+                false
             }
             GlobFilterCheckMode::Adhoc => {
-                for ancestor in path.ancestors() {
-                    match self.ignore.matched(ancestor, directory) {
-                        // If the path is allowlisted or there's no hit, try the parent to ensure we don't return false
-                        // for a folder where there's an exclude for a parent.
-                        Match::None | Match::Allow => {}
-                        Match::Ignore => return true,
-                    }
-                }
-                false
+                // If the path is allowlisted or there's no hit, try the parent to ensure we don't return false
+                // for a folder where there's an exclude for a parent.
+                path.ancestors()
+                    .skip(1)
+                    .any(|ancestor| self.ignore.matched(ancestor, true).is_ignore())
             }
         }
     }
@@ -124,7 +121,7 @@ struct Gitignore {
     set: GlobSet,
     globs: Vec<IgnoreGlob>,
     #[get_size(ignore)]
-    matches: Option<Arc<Pool<Vec<usize>>>>,
+    matches: Arc<Pool<Vec<usize>>>,
 }

 impl Gitignore {
@@ -140,7 +137,7 @@ impl Gitignore {
             return Match::None;
         }
-        let mut matches = self.matches.as_ref().unwrap().get();
+        let mut matches = self.matches.get();
         let candidate = Candidate::new(path);
         self.set.matches_candidate_into(&candidate, &mut matches);
         for &i in matches.iter().rev() {
@@ -187,6 +184,12 @@ enum Match {
     Allow,
 }

+impl Match {
+    const fn is_ignore(self) -> bool {
+        matches!(self, Match::Ignore)
+    }
+}
+
 #[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
 struct IgnoreGlob {
     /// The pattern that was originally parsed.
@@ -232,7 +235,7 @@ impl GitignoreBuilder {
         Ok(Gitignore {
             set,
             globs: self.globs.clone(),
-            matches: Some(Arc::new(Pool::new(Vec::new))),
+            matches: Arc::new(Pool::new(Vec::new)),
         })
     }

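The refactored `Adhoc` branch above checks the path itself first and then walks every ancestor directory, excluding the path if any level hits an ignore pattern. A minimal standalone sketch of that ancestor walk, using plain path-prefix matching instead of gitignore globs (an assumption for illustration; `is_excluded` and its arguments are hypothetical names):

```rust
/// A path is excluded if the exclude list hits the path itself or any
/// of its ancestor directories.
fn is_excluded(path: &str, excluded: &[&str]) -> bool {
    // Produce `path`, then each ancestor by chopping off the last segment.
    std::iter::successors(Some(path), |p| p.rfind('/').map(|i| &p[..i]))
        .any(|p| excluded.contains(&p))
}

fn main() {
    // A file under an excluded directory is excluded...
    assert!(is_excluded("src/generated/types.rs", &["src/generated"]));
    // ...but a sibling path is not, even if it shares a prefix string.
    assert!(!is_excluded("src/gen.rs", &["src/generated"]));
    println!("ok");
}
```

Walking segment boundaries (rather than comparing raw string prefixes) is what keeps `src/gen.rs` from being swallowed by an exclude on `src/generated`, mirroring why the real code matches each ancestor individually instead of prefix-testing the whole path.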
Some files were not shown because too many files have changed in this diff.