Compare commits

...

350 Commits

Author SHA1 Message Date
David Peter
e5b3de30a1 [ty] Remove unused lint registry functionality 2025-08-01 09:49:34 +02:00
Matthew Mckee
b30d97e5e0 [ty] Support __setitem__ and improve __getitem__ related diagnostics (#19578)
## Summary

Adds validation to subscript assignment expressions.

```py
class Foo: ...

class Bar:
    __setitem__ = None

class Baz:
    def __setitem__(self, index: str, value: int) -> None:
        pass

# We now emit a diagnostic on these statements
Foo()[1] = 2
Bar()[1] = 2
Baz()[1] = 2

```

Also improves error messages on invalid `__getitem__` expressions

## Test Plan

Update mdtests and add more to `subscript/instance.md`

---------

Co-authored-by: David Peter <sharkdp@users.noreply.github.com>
Co-authored-by: David Peter <mail@david-peter.de>
2025-08-01 09:23:27 +02:00
github-actions[bot]
5c5d50d57a [ty] Sync vendored typeshed stubs (#19676)
Close and reopen this PR to trigger CI

---------

Co-authored-by: typeshedbot <>
2025-08-01 08:43:42 +02:00
Dan Parizher
b3a26a50ad [flake8-use-pathlib] Expand PTH201 to check all PurePath subclasses (#19440)
## Summary

Fixes #19437
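
As a rough sketch of the expanded coverage (the exact set of flagged constructors is defined by the rule, so treat this as illustrative):

```py
from pathlib import Path, PurePath, PurePosixPath

Path(".")           # already flagged by PTH201: prefer `Path()`
PurePath(".")       # with this change, PurePath subclasses are flagged too
PurePosixPath(".")  # with this change, PurePath subclasses are flagged too
```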
2025-07-31 22:18:07 -04:00
GiGaGon
6a2d358d7a [refurb] Make example error out-of-the-box (FURB180) (#19672)

## Summary


Part of #18972

This PR makes [meta-class-abc-meta
(FURB180)](https://docs.astral.sh/ruff/rules/meta-class-abc-meta/#meta-class-abc-meta-furb180)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/6beca1be-45cd-4e5a-aafa-6a0584c10d64)
```py
class C(metaclass=ABCMeta):
    pass
```

[New example](https://play.ruff.rs/bbad34da-bf07-44e6-9f34-53337e8f57d4)
```py
import abc


class C(metaclass=abc.ABCMeta):
    pass
```

The "Use instead" section as also modified similarly.

## Test Plan


N/A, no functionality/tests affected
2025-07-31 17:14:15 -04:00
Dan Parizher
b07def07c9 [pyupgrade] Prevent infinite loop with I002 (UP010, UP035) (#19413)
## Summary

Fixes #18729 and fixes #16802

## Test Plan

Manually verified via CLI that Ruff no longer enters an infinite loop by
running:
```sh
echo 1 | ruff --isolated check - --select I002,UP010 --fix
```
with `required-imports = ["from __future__ import generator_stop"]` set
in the config, confirming “All checks passed!” and that no snapshots were
generated.
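
A hedged illustration of why the loop occurred (file contents are made up; the configured import is the one above):

```py
# I002 inserts the configured required import at the top of the file...
from __future__ import generator_stop

# ...while UP010 considers that `__future__` import unnecessary on supported
# Python versions and removes it again, so the two fixes never converge.
1
```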

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-07-31 15:17:27 -04:00
Alex Waygood
2ab1502e51 [ty] Improve the Display for generic type[] types (#19667) 2025-07-31 19:45:01 +01:00
Alex Waygood
a3f28baab4 [ty] Refactor TypeInferenceBuilder::infer_subscript_expression_types (#19658) 2025-07-31 13:38:43 +00:00
Brent Westbrook
a71513bae1 Fix tests on 32-bit architectures (#19652)
Summary
--

Fixes #19640. I'm not sure these are the exact fixes we really want, but
I
reproduced the issue in a 32-bit Docker container and tracked down the
causes,
so I figured I'd open a PR.

As I commented on the issue, the `goto_references` test depends on the
iteration
order of the files in an `FxHashSet` in `Indexed`. In this case, we can
just
sort the output in test code.

Similarly, the tuple case depended on the order of overloads inserted in
an
`FxHashMap`. `FxIndexMap` seemed like a convenient drop-in replacement,
but I
don't know if that will have other detrimental effects. I did have to
change the
assertion for the tuple test, but I think it should now be stable across
architectures.

Test Plan
--

Running the tests in the aforementioned Docker container
2025-07-31 08:52:19 -04:00
Alex Waygood
d2d4b115e3 [ty] Move pandas-stubs to bad.txt (#19659) 2025-07-31 12:33:24 +01:00
Alex Waygood
27b03a9d7b [ty] Remove special casing for string-literal-in-tuple __contains__ (#19642) 2025-07-31 11:28:03 +01:00
Harshil
32c454bb56 Update pre-commit's ruff id (#19654) 2025-07-31 07:17:04 +02:00
Ibraheem Ahmed
f6b7418def Update salsa (#19449)
## Summary

Pulls in a bunch of salsa micro-optimizations.
2025-07-30 15:31:46 -04:00
Ibraheem Ahmed
8f8c39c435 Simplify get_size2 usage (#19643)
## Summary

These were added in the 0.5.0 release.
2025-07-30 15:31:37 -04:00
Matthew Mckee
4739bc8d14 [ty] Fix incorrect diagnostic when calling __setitem__ (#19645)
## Summary

Resolves https://github.com/astral-sh/ty/issues/862 by not emitting a
diagnostic.

## Test Plan

Add test to show we don't emit the diagnostic
2025-07-30 20:34:52 +02:00
Alex Waygood
7b4103bcb6 [ty] Remove special casing for tuple addition (#19636) 2025-07-30 16:25:42 +00:00
Jim Hoekstra
38049aae12 fix missing-required-import introducing syntax error after docstring ending with backslash (#19505)
Issue: https://github.com/astral-sh/ruff/issues/19498

## Summary


[missing-required-import](https://docs.astral.sh/ruff/rules/missing-required-import/)
inserts the missing import on the line immediately following the last
line of the docstring. However, if the docstring is immediately followed
by a continuation token (i.e., a backslash), this leads to a syntax
error because Python interprets the docstring and the inserted import as
being on the same line.

The proposed solution in this PR is to check if the first token after a
file docstring is a continuation character, and if so, to advance an
additional line before inserting the missing import.

## Test Plan

Added a unit test, and the following example was verified manually:

Given this simple test Python file:

```python
"Hello, World!"\

print(__doc__)
```

and this ruff linting configuration in the `pyproject.toml` file:

```toml
[tool.ruff.lint]
select = ["I"]

[tool.ruff.lint.isort]
required-imports = ["import sys"]
```

Without the changes in this PR, the ruff linter would try to insert the
missing import in line 2, resulting in a syntax error, and report the
following:

`error: Fix introduced a syntax error. Reverting all changes.`

With the changes in this PR, ruff correctly advances one more line
before adding the missing import, resulting in the following output:

```python
"Hello, World!"\

import sys

print(__doc__)
```

---------

Co-authored-by: Jim Hoekstra <jim.hoekstra@pacmed.nl>
2025-07-30 12:12:46 -04:00
Alex Waygood
ec3d5ebda2 [ty] Upcast heterogeneous and mixed tuples to homogeneous tuples where it's necessary to solve a TypeVar (#19635)
## Summary

This PR improves our generics solver such that we are able to solve the
`TypeVar` in this snippet to `int | str` (the union of the elements in
the heterogeneous tuple) by upcasting the heterogeneous tuple to its
pure-homogeneous-tuple supertype:

```py
def f[T](x: tuple[T, ...]) -> T:
    return x[0]

def g(x: tuple[int, str]):
    reveal_type(f(x))
```

## Test Plan

Mdtests. Some TODOs remain in the mdtest regarding solving `TypeVar`s
for mixed tuples, but I think this PR on its own is a significant step
forward for our generics solver when it comes to tuple types.

---------

Co-authored-by: Douglas Creager <dcreager@dcreager.net>
2025-07-30 17:12:21 +01:00
Micha Reiser
d797592f70 [ty] Fix server panic in workspace diagnostics request handler when typing (#19631) 2025-07-30 16:40:42 +01:00
David Peter
eb02aa5676 [ty] Async for loops and async iterables (#19634)
## Summary

Add support for `async for` loops and async iterables.

part of https://github.com/astral-sh/ty/issues/151
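
A minimal sketch of the kind of code this now covers (class and variable names are invented for illustration):

```py
import asyncio

class Countdown:
    def __init__(self, start: int) -> None:
        self.current = start

    def __aiter__(self) -> "Countdown":
        return self

    async def __anext__(self) -> int:
        if self.current == 0:
            raise StopAsyncIteration
        self.current -= 1
        return self.current

async def main() -> None:
    async for value in Countdown(3):  # `value` can now be inferred as `int`
        print(value)

asyncio.run(main())
```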

## Ecosystem impact

```diff
- boostedblob/listing.py:445:54: warning[unused-ignore-comment] Unused blanket `type: ignore` directive
```

This is correct. We now find a true positive in the `# type: ignore`'d
code.

All of the other ecosystem hits are of the type

```diff
trio (https://github.com/python-trio/trio)
+ src/trio/_core/_tests/test_guest_mode.py:532:24: error[not-iterable] Object of type `MemorySendChannel[int] | MemoryReceiveChannel[int]` may not be iterable
```

The message is correct, because only `MemoryReceiveChannel` has an
`__aiter__` method, but `MemorySendChannel` does not. What's not correct
is our inferred type here. It should be `MemoryReceiveChannel[int]`, not
the union of the two. This is due to missing unpacking support for tuple
subclasses, which @AlexWaygood is working on. I don't think this should
block merging this PR, because those wrong types are already there,
without this PR.

## Test Plan

New Markdown tests and snapshot tests for diagnostics.
2025-07-30 17:40:24 +02:00
Dan Parizher
e593761232 [ruff] Parenthesize generator expressions in f-strings (RUF010) (#19434)
## Summary

Fixes #19433

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-30 15:02:31 +00:00
Brent Westbrook
8979271ea8 Always expand tabs to four spaces in diagnostics (#19618)
## Summary

I was a bit stuck on some snapshot differences I was seeing in #19415,
but @BurntSushi pointed out that `annotate-snippets` already normalizes
tabs on its own, which was very helpful! Instead of applying this change
directly to the other branch, I wanted to try applying it in
`ruff_linter` first. This should very slightly reduce the number of
changes in #19415 proper.

It looks like `annotate-snippets` always expands a tab to four spaces,
whereas I think we were aligning to tab stops:

```diff
  6 | spam(ham[1], { eggs: 2})
  7 | #: E201:1:6
- 8 | spam(   ham[1], {eggs: 2})
-   |      ^^^ E201
+ 8 | spam(    ham[1], {eggs: 2})
+   |      ^^^^ E201
```

```diff
61 | #: E203:2:15 E702:2:16
 62 | if x == 4:
-63 |     print(x, y) ; x, y = y, x
-   |                ^ E203
+63 |     print(x, y)    ; x, y = y, x
+   |                ^^^^ E203
```

```diff
 E27.py:15:6: E271 [*] Multiple spaces after keyword
    |
-13 | True        and False
+13 | True        and    False
 14 | #: E271
 15 | a and  b
    |      ^^ E271
```

I don't think this is too bad and has the major benefit of allowing us
to pass the non-tab-expanded range to `annotate-snippets` in #19415,
where it's also displayed in the header. Ruff doesn't have this problem
currently because it uses its own concise diagnostic output as the
header for full diagnostics, where the pre-expansion range is used
directly.

## Test Plan

Existing tests with a few snapshot updates
2025-07-30 11:00:36 -04:00
Andrew Gallant
d1a286226c [ty] Update module resolution diagram to account for typeshed VERSIONS file
This does unfortunately add a fair bit of complexity to the flow
diagram.

Ref https://github.com/astral-sh/ruff/pull/19620#issuecomment-3133684294
2025-07-30 10:34:58 -04:00
Micha Reiser
1ba32684da Fix copy and line separator colors in dark mode (#19630) 2025-07-30 15:08:31 +01:00
Micha Reiser
70d4b271da [ty] Remove AssertUnwindSafe requirement from ProgressReporter (#19637) 2025-07-30 12:46:44 +00:00
Alex Waygood
feaedb1812 [ty] Synthesize precise __getitem__ overloads for tuple subclasses (#19493) 2025-07-30 11:25:44 +00:00
Micha Reiser
6237ecb4db [ty] Add progress reporting to workspace diagnostics (#19616) 2025-07-30 10:27:34 +00:00
Micha Reiser
2a5ace6e55 [ty] Implement diagnostic caching (#19605) 2025-07-30 11:04:34 +01:00
David Peter
4ecf1d205a [ty] Support async/await, async with and yield from (#19595)
## Summary

- Add support for the return types of `async` functions
- Add type inference for `await` expressions
- Add support for `async with` / async context managers
- Add support for `yield from` expressions

This PR is generally lacking proper error handling in some cases (e.g.
illegal `__await__` attributes). I'm planning to work on this in a
follow-up.

part of https://github.com/astral-sh/ty/issues/151

closes https://github.com/astral-sh/ty/issues/736
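
A small sketch of the constructs listed above (names invented for illustration):

```py
import asyncio
from collections.abc import Iterator

class Resource:
    async def __aenter__(self) -> int:
        return 1

    async def __aexit__(self, *args: object) -> None:
        return None

async def fetch() -> int:
    return 42

def delegate() -> Iterator[int]:
    yield from range(3)  # `yield from` expressions are now typed

async def main() -> None:
    value = await fetch()             # `await` uses the async function's return type
    async with Resource() as handle:  # async context managers are supported
        print(value, handle, list(delegate()))

asyncio.run(main())
```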

## Ecosystem

There are a lot of true positives on `prefect` which look similar to:
```diff
prefect (https://github.com/PrefectHQ/prefect)
+ src/integrations/prefect-aws/tests/workers/test_ecs_worker.py:406:12: error[unresolved-attribute] Type `str` has no attribute `status_code`
```

This is due to a wrong return type annotation
[here](e926b8c4c1/src/integrations/prefect-aws/tests/workers/test_ecs_worker.py (L355-L391)).

```diff
mitmproxy (https://github.com/mitmproxy/mitmproxy)
+ test/mitmproxy/addons/test_clientplayback.py:18:1: error[invalid-argument-type] Argument to function `asynccontextmanager` is incorrect: Expected `(...) -> AsyncIterator[Unknown]`, found `def tcp_server(handle_conn, **server_args) -> Unknown | tuple[str, int]`
```


[This](a4d794c59a/test/mitmproxy/addons/test_clientplayback.py (L18-L19))
is a true positive. That function should return
`AsyncIterator[Address]`, not `Address`.

I looked through almost all of the other new diagnostics and they all
look like known problems or true positives.

## Typing conformance

The typing conformance diff looks good.

## Test Plan

New Markdown tests
2025-07-30 11:51:21 +02:00
Brent Westbrook
c5ac998892 Bump 0.12.7 (#19627)
## Test Plan

- [x] Download the [sdist
artifact](https://github.com/astral-sh/ruff/actions/runs/16608501774/artifacts/3643617012)
and check that the LICENSE is present
2025-07-29 18:18:42 -04:00
Brent Westbrook
04a8f64cd7 Revert license and license-files changes in pyproject.toml (#19624)
Summary
--

This partially reverts commit 13634ff433
after issues in the release today.

Test Plan
--

```shell
uv build --sdist
tar -tzf dist/ruff-0.12.6.tar.gz | grep ruff-0.12.6/LICENSE
```

which finds the license now.
2025-07-29 17:27:55 -04:00
Brent Westbrook
6e00adf308 Bump 0.12.6 (#19622) 2025-07-29 16:31:01 -04:00
Brent Westbrook
864196b988 Add Checker::context method, deduplicate Unicode checks (#19609)
Summary
--

This PR adds a `Checker::context` method that returns the underlying
`LintContext` to unify `Candidate::into_diagnostic` and
`Candidate::report_diagnostic` in our ambiguous Unicode character
checks. This avoids some duplication and also avoids collecting a `Vec`
of `Candidate`s only to iterate over it later.

Test Plan
--

Existing tests
2025-07-29 16:07:55 -04:00
Thomas Mattone
ae26fa020c [flake8-pyi] Preserve inline comment in ellipsis removal (PYI013) (#19399)
## Summary

Fixes #19385.

Based on [unnecessary-placeholder
(PIE790)](https://docs.astral.sh/ruff/rules/unnecessary-placeholder/)
behavior, [ellipsis-in-non-empty-class-body
(PYI013)](https://docs.astral.sh/ruff/rules/ellipsis-in-non-empty-class-body/)
now safely preserves inline comments on ellipsis removal.

## Test Plan

A new test class was added:

```python
class NonEmptyChildWithInlineComment:
    value: int
    ... # preserve me
```

with the following expected fix:

```python
class NonEmptyChildWithInlineComment:
    value: int
    # preserve me
```
2025-07-29 15:06:04 -04:00
Andrew Gallant
88a679945c [ty] Add flow diagram for import resolution
The diagram is written in the Dot language, which can
be converted to SVG (or any other image) by GraphViz.

I thought it was a good idea to write this down in
preparation for adding routines that list modules.
Code reuse is likely to be difficult and I wanted to
be sure I understood how it worked.
2025-07-29 14:49:20 -04:00
Andrew Gallant
941be52358 [ty] Add comments to some core resolver functions
Some of the contracts were a little tricky to discover from just the
parameter types, so I added some docs (and fixed what I believe was one
typo).
2025-07-29 14:49:20 -04:00
Andrew Gallant
13624ce17f [ty] Add missing ticks and use consistent quoting
This irked me while I was reading the code, so I just tried to fix what
I could see.
2025-07-29 14:49:20 -04:00
Andrew Gallant
edb2f8e997 [ty] Reflow some long lines
I mostly just did this because the long string literals were annoying
me. And these can make rustfmt give up on formatting.

I also re-flowed some long comment lines while I was here.
2025-07-29 14:49:20 -04:00
Andrew Gallant
5e6ad849ff [ty] Unexport helper function
I'm not sure if this used to be used elsewhere, but it no longer is.
And it looks like an internal-only helper function, so just un-export
it.

And note that `ModuleNameIngredient` is also un-exported, so this
function isn't really usable outside of its defining module anyway.
2025-07-29 14:49:20 -04:00
Andrew Gallant
865a9b3424 [ty] Remove offset from CompletionTargetTokens::Unknown
At some point, the surrounding code was refactored so that the
cursor offset was always passed around, so storing it here is
no longer necessary.
2025-07-29 14:49:20 -04:00
Dan Parizher
d449c541cb [pyupgrade] Fix UP030 to avoid modifying double curly braces in format strings (#19378)
## Summary

Fixes #19348
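
A hypothetical before/after illustrating the fix (UP030 rewrites explicit positional references like `{0}` to implicit ones):

```py
# Flagged: the explicit positional index can be rewritten to "{}".format(1).
"{0}".format(1)

# Left alone after this fix: `{{0}}` is the escaped, literal text "{0}",
# not a placeholder, so rewriting it would change the output.
"{{0}}".format(1)
```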

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-07-29 18:35:54 +00:00
हिमांशु
f7c6a6b2d0 [ty] fix a typo (#19621)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-29 17:53:44 +00:00
justin
656273bf3d [ty] synthesize __replace__ for dataclasses (>=3.13) (#19545)
## Summary
https://github.com/astral-sh/ty/issues/111

adds support for the new `copy.replace` and `__replace__` protocol
[added in 3.13](https://docs.python.org/3/whatsnew/3.13.html#copy)

- docs: https://docs.python.org/3/library/copy.html#object.__replace__
- some discussion on pyright/mypy implementations:
https://discuss.python.org/t/dataclass-transform-and-replace/69067



### Burndown
- [x] add tests
- [x] implement `__replace__`
- [ ]
[collections.namedtuple()](https://docs.python.org/3/library/collections.html#collections.namedtuple)
- [x]
[dataclasses.dataclass](https://docs.python.org/3/library/dataclasses.html#dataclasses.dataclass)
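
A small sketch of what the synthesized method enables (requires Python 3.13+; names invented for illustration):

```py
import copy
from dataclasses import dataclass

@dataclass
class Point:
    x: int
    y: int

p = Point(1, 2)
# `copy.replace` dispatches to the synthesized `Point.__replace__`,
# so `q` is understood as a `Point` with `y` replaced.
q = copy.replace(p, y=3)
print(q)  # Point(x=1, y=3)
```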

## Test Plan
new mdtests

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-29 17:32:01 +02:00
Alex Waygood
81867ea7ce [ty] Discard Definitions when normalizing Signatures (#19615) 2025-07-29 14:37:47 +01:00
Brent Westbrook
a54061e757 [ty] Fix empty spans following a line terminator and unprintable character spans in diagnostics (#19535)
## Summary

This was previously the last commit in #19415, split out to make it
easier to review. This applies the fixes from c9b99e4, 5021f32, and
2922490cb8 to the new rendering code in `ruff_db`. I initially intended
only to fix the empty span after a line terminator (as you can see in
the branch name), but the two fixes were tied pretty closely together,
and my initial fix for the empty spans needed a big change after trying
to handle unprintable characters too. I can still split this up if it
would help with review. I would just start with the unprintable
characters first.

The implementation here is essentially copy-pasted from
`ruff_linter::message::text.rs`, with the `SourceCode` struct renamed to
`EscapedSourceCode` since there's already a `SourceCode` in scope in
`render.rs`. It's also updated slightly to account for the multiple
annotations for a single snippet. The original implementation used some
types from the `line_width` module from `ruff_linter`. I copied over
heavily stripped-down versions of these instead of trying to import
them. We could inline the remaining code entirely, if we want, but I
thought it was nice enough to keep.

I also moved over `ceil_char_boundary`, which is unchanged except to
make it a free function taking a `&str` instead of a `Locator` method.
All of this code could be deleted from `ruff_linter` if we also move
over the `grouped` output format, which will be the last user after
#19415.

## Test Plan

I added new tests in `ruff_linter` that call into the new rendering code
to snapshot the diagnostics for the affected cases. These are copies of
existing snapshots in Ruff, so it's helpful to compare them. These are a
bit noisy because of the other rendering differences in the header, but
all of the `^^^` indicators should be the same.

<details><summary>`empty_span_after_line_terminator` diff</summary>

```diff
diff --git a/crates/ruff_linter/src/rules/pycodestyle/snapshots/ruff_linter__rules__pycodestyle__tests__E112_E11.py.snap b/crates/ruff_linter/src/message/snapshots/ruff_linter__message__text__tests__empty_span_after_line_terminator.snap
index 5ade4346e0..6df75c16f0 100644
--- a/crates/ruff_linter/src/rules/pycodestyle/snapshots/ruff_linter__rules__pycodestyle__tests__E112_E11.py.snap
+++ b/crates/ruff_linter/src/message/snapshots/ruff_linter__message__text__tests__empty_span_after_line_terminator.snap
@@ -1,17 +1,20 @@
 ---
-source: crates/ruff_linter/src/rules/pycodestyle/mod.rs
+source: crates/ruff_linter/src/message/text.rs
+expression: value.to_string()
 ---
-E11.py:9:1: E112 Expected an indented block
+error[no-indented-block]: Expected an indented block
+  --> E11.py:9:1
    |
  7 | #: E112
  8 | if False:
  9 | print()
-   | ^ E112
+   | ^
 10 | #: E113
 11 | print()
    |
 
-E11.py:9:1: SyntaxError: Expected an indented block after `if` statement
+error[invalid-syntax]: SyntaxError: Expected an indented block after `if` statement
+  --> E11.py:9:1
    |
  7 | #: E112
  8 | if False:
@@ -21,7 +24,8 @@ E11.py:9:1: SyntaxError: Expected an indented block after `if` statement
 11 | print()
    |
 
-E11.py:12:1: SyntaxError: Unexpected indentation
+error[invalid-syntax]: SyntaxError: Unexpected indentation
+  --> E11.py:12:1
    |
 10 | #: E113
 11 | print()
@@ -31,7 +35,8 @@ E11.py:12:1: SyntaxError: Unexpected indentation
 14 | mimetype = 'application/x-directory'
    |
 
-E11.py:14:1: SyntaxError: Expected a statement
+error[invalid-syntax]: SyntaxError: Expected a statement
+  --> E11.py:14:1
    |
 12 |     print()
 13 | #: E114 E116
@@ -41,17 +46,19 @@ E11.py:14:1: SyntaxError: Expected a statement
 16 | create_date = False
    |
 
-E11.py:45:1: E112 Expected an indented block
+error[no-indented-block]: Expected an indented block
+  --> E11.py:45:1
    |
 43 | #: E112
 44 | if False:  #
 45 | print()
-   | ^ E112
+   | ^
 46 | #:
 47 | if False:
    |
 
-E11.py:45:1: SyntaxError: Expected an indented block after `if` statement
+error[invalid-syntax]: SyntaxError: Expected an indented block after `if` statement
+  --> E11.py:45:1
    |
 43 | #: E112
 44 | if False:  #
```

</details>

<details><summary>`unprintable_characters` diff</summary>

```diff
diff --git a/crates/ruff_linter/src/rules/pylint/snapshots/ruff_linter__rules__pylint__tests__PLE2512_invalid_characters.py.snap b/crates/ruff_linter/src/message/snapshots/ruff_linter__message__text__tests__unprintable_characters.snap
index 52cfdf9cce..fcfa1ac9f1 100644
--- a/crates/ruff_linter/src/rules/pylint/snapshots/ruff_linter__rules__pylint__tests__PLE2512_invalid_characters.py.snap
+++ b/crates/ruff_linter/src/message/snapshots/ruff_linter__message__text__tests__unprintable_characters.snap
@@ -1,161 +1,115 @@
 ---
-source: crates/ruff_linter/src/rules/pylint/mod.rs
+source: crates/ruff_linter/src/message/text.rs
+expression: value.to_string()
 ---
-invalid_characters.py:24:12: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:24:12
    |
 22 | cr_ok = f'\\r'
 23 |
 24 | sub = 'sub '
-   |            ^ PLE2512
+   |            ^
 25 | sub = f'sub '
    |
-   = help: Replace with escape sequence
+help: Replace with escape sequence
 
-ℹ Safe fix
-21 21 | cr_ok = '\\r'
-22 22 | cr_ok = f'\\r'
-23 23 | 
-24    |-sub = 'sub '
-   24 |+sub = 'sub \x1A'
-25 25 | sub = f'sub '
-26 26 | 
-27 27 | sub_ok = '\x1a'
-
-invalid_characters.py:25:13: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:25:13
    |
 24 | sub = 'sub '
 25 | sub = f'sub '
-   |             ^ PLE2512
+   |             ^
 26 |
 27 | sub_ok = '\x1a'
    |
-   = help: Replace with escape sequence
-
-ℹ Safe fix
-22 22 | cr_ok = f'\\r'
-23 23 | 
-24 24 | sub = 'sub '
-25    |-sub = f'sub '
-   25 |+sub = f'sub \x1A'
-26 26 | 
-27 27 | sub_ok = '\x1a'
-28 28 | sub_ok = f'\x1a'
+help: Replace with escape sequence
 
-invalid_characters.py:55:25: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:55:25
    |
 53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ ​​"
 54 |
 55 | nested_fstrings = f'␈{f'{f'␛'}'}'
-   |                         ^ PLE2512
+   |                         ^
 56 |
 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
    |
-   = help: Replace with escape sequence
-
-ℹ Safe fix
-52 52 | zwsp_after_multicharacter_grapheme_cluster = "ಫ್ರಾನ್ಸಿಸ್ಕೊ ​​"
-53 53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ ​​"
-54 54 | 
-55    |-nested_fstrings = f'␈{f'{f'␛'}'}'
-   55 |+nested_fstrings = f'␈{f'\x1A{f'␛'}'}'
-56 56 | 
-57 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
-58 58 | x = f"""}}ab"""
+help: Replace with escape sequence
 
-invalid_characters.py:58:12: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:58:12
    |
 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
 58 | x = f"""}}ab"""
-   |            ^ PLE2512
+   |            ^
 59 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998256
 60 | x = f"""}}a␛b"""
    |
-   = help: Replace with escape sequence
+help: Replace with escape sequence
 
-ℹ Safe fix
-55 55 | nested_fstrings = f'␈{f'{f'␛'}'}'
-56 56 | 
-57 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
-58    |-x = f"""}}ab"""
-   58 |+x = f"""}}a\x1Ab"""
-59 59 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998256
-60 60 | x = f"""}}a␛b"""
-61 61 | 
-
-invalid_characters.py:64:12: PLE2512 Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:64:12
    |
 63 | # https://github.com/astral-sh/ruff/issues/13294
 64 | print(r"""␈␛�​
-   |            ^ PLE2512
+   |            ^
 65 | """)
 66 | print(fr"""␈␛�​
    |
-   = help: Replace with escape sequence
+help: Replace with escape sequence
 
-invalid_characters.py:66:13: PLE2512 Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:66:13
    |
 64 | print(r"""␈␛�​
 65 | """)
 66 | print(fr"""␈␛�​
-   |             ^ PLE2512
+   |             ^
 67 | """)
 68 | print(Rf"""␈␛�​
    |
-   = help: Replace with escape sequence
+help: Replace with escape sequence
 
-invalid_characters.py:68:13: PLE2512 Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:68:13
    |
 66 | print(fr"""␈␛�​
 67 | """)
 68 | print(Rf"""␈␛�​
-   |             ^ PLE2512
+   |             ^
 69 | """)
    |
-   = help: Replace with escape sequence
+help: Replace with escape sequence
 
-invalid_characters.py:73:9: PLE2512 Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:73:9
    |
 71 | # https://github.com/astral-sh/ruff/issues/18815
 72 | b = "\␈"
 73 | sub = "\"
-   |         ^ PLE2512
+   |         ^
 74 | esc = "\␛"
 75 | zwsp = "\​"
    |
-   = help: Replace with escape sequence
+help: Replace with escape sequence
 
-invalid_characters.py:80:25: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:80:25
    |
 78 | # tstrings
 79 | esc = t'esc esc ␛'
 80 | nested_tstrings = t'␈{t'{t'␛'}'}'
-   |                         ^ PLE2512
+   |                         ^
 81 | nested_ftstrings = t'␈{f'{t'␛'}'}'
    |
-   = help: Replace with escape sequence
-
-ℹ Safe fix
-77 77 | 
-78 78 | # tstrings
-79 79 | esc = t'esc esc ␛'
-80    |-nested_tstrings = t'␈{t'{t'␛'}'}'
-   80 |+nested_tstrings = t'␈{t'\x1A{t'␛'}'}'
-81 81 | nested_ftstrings = t'␈{f'{t'␛'}'}'
-82 82 | 
+help: Replace with escape sequence
 
-invalid_characters.py:81:26: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
+error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
+  --> invalid_characters.py:81:26
    |
 79 | esc = t'esc esc ␛'
 80 | nested_tstrings = t'␈{t'{t'␛'}'}'
 81 | nested_ftstrings = t'␈{f'{t'␛'}'}'
-   |                          ^ PLE2512
+   |                          ^
    |
-   = help: Replace with escape sequence
-
-ℹ Safe fix
-78 78 | # tstrings
-79 79 | esc = t'esc esc ␛'
-80 80 | nested_tstrings = t'␈{t'{t'␛'}'}'
-81    |-nested_ftstrings = t'␈{f'{t'␛'}'}'
-   81 |+nested_ftstrings = t'␈{f'\x1A{t'␛'}'}'
-82 82 |
+help: Replace with escape sequence
```

</details>
2025-07-29 08:25:58 -04:00
Brent Westbrook
19569bf838 Add LinterContext::settings to avoid passing separate settings (#19608)
Summary
--

I noticed while reviewing #19390 that in `check_tokens` we were still
passing
around an extra `LinterSettings`, despite all of the same functions also
receiving a `LintContext` with its own settings.

This PR adds the `LintContext::settings` method and calls that instead
of using
the separate `LinterSettings`.

Test Plan
--

Existing tests
2025-07-29 08:13:22 -04:00
Charlie Marsh
e0f4f25d28 Support .pyi files in ruff analyze graph (#19611)
## Summary

We now return both the `.pyi` and `.py` files. Previously, we only
returned the `.pyi` file.
2025-07-28 22:00:27 -04:00
github-actions[bot]
c6a123290d [ty] Sync vendored typeshed stubs (#19607)
Co-authored-by: typeshedbot <>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-07-28 22:06:33 +00:00
Alex Waygood
d4f64cd474 [ty] Bump docstring-adder pin (#19606) 2025-07-28 22:59:56 +01:00
Igor Drokin
e4f64480da [perflint] Ignore rule if target is global or nonlocal (PERF401) (#19539)
## Summary

Resolves #19531

I've implemented a check to determine whether the for_stmt target is
declared as global or nonlocal. I believe we should skip the rule in all
such cases, since variables declared this way are intended for use
outside the loop scope, making value changes expected behavior.
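
A hedged illustration of the case that is now skipped (names invented):

```py
results = []
item = None

def collect(values):
    # `item` is declared global, so after the loop it keeps the last value seen.
    # Rewriting the loop into a comprehension would stop assigning to the global,
    # which is why PERF401 no longer fires here.
    global item
    for item in values:
        results.append(item * 2)
```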

## Test Plan

Added two test cases, for global and nonlocal variables, to the snapshot.
2025-07-28 17:03:22 -04:00
Micha Reiser
4016aff057 Add license classifier back to pyproject.toml (#19599)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-28 20:58:16 +01:00
UnboundVariable
24134837f3 [ty] Add stub mapping support to signature help (#19570)
This PR improves the "signature help" language server feature in two
ways:
1. It adds support for the recently-introduced "stub mapper" which maps
symbol declarations within stubs to their implementation counterparts.
This allows the signature help to display docstrings from the original
implementation.
2. It incorporates a more robust fix to a bug that was addressed in a
[previous PR](https://github.com/astral-sh/ruff/pull/19542). It also
adds more comprehensive tests to cover this case.

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-28 10:57:18 -07:00
Douglas Creager
130d4e1135 [ty] Don't panic with argument that doesn't actually implement Iterable (#19602)
This eliminates the panic reported in
https://github.com/astral-sh/ty/issues/909, though it doesn't address
the underlying cause, which is that we aren't yet checking the types of
the fields of a protocol when checking whether a class implements the
protocol. And in particular, if a class explicitly opts out of iteration
via

```py
class NotIterable:
    __iter__ = None
```

we currently treat that as having an `__iter__` member, and therefore
implementing `Iterable`.

Note that the assumption that was in the comment before is still
correct: call binding will have already checked that the argument
satisfies `Iterable`, and so it shouldn't be an error to iterate over
said argument. But arguably, the new logic in this PR is a better way to
discharge that assumption — instead of panicking if we happen to be
wrong, fall back on an unknown iteration result.
2025-07-28 12:09:54 -04:00
Dan Parizher
e63dfa3d18 [flake8-commas] Add support for trailing comma checks in type parameter lists (COM812,COM819) (#19390)
## Summary

Fixes #18844

I'm not too sure if the solution is as simple as the way I implemented
it, but I'm curious to see if we are covering all cases correctly here.
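
A rough sketch of the newly covered position (whether this exact snippet is flagged depends on the final implementation):

```py
# A multi-line PEP 695 type parameter list: COM812 would now ask for a
# trailing comma after the final type parameter.
def first[
    T
](items: list[T]) -> T:
    return items[0]
```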

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-07-28 10:53:04 -04:00
Junhson Jean-Baptiste
6d0f3ef3a5 [pylint] Implement auto-fix for missing-maxsplit-arg (PLC0207) (#19387)

## Summary
As a follow-up to #18949 (suggested
[here](https://github.com/astral-sh/ruff/pull/18949#pullrequestreview-2998417889)),
this PR implements auto-fix logic for `PLC0207`.
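
An illustrative before/after for the new auto-fix (the exact fix output may differ):

```py
# Before: the whole string is split even though only the first piece is used.
first = "a,b,c".split(",")[0]

# After the fix: pass `maxsplit` so splitting stops after the needed piece.
first = "a,b,c".split(",", maxsplit=1)[0]
```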

## Test Plan


Existing tests pass, with updates to the snapshot so that it expects the
new output that comes along with the auto-fix.
2025-07-28 10:45:26 -04:00
Dan Parizher
201b079084 [refurb] Mark int and bool cases for Decimal.from_float as safe fixes in FURB164 tests (#19468)
## Summary

Fixes #19460
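
An illustrative sketch of the affected cases (treat the exact fix text as an assumption):

```py
from decimal import Decimal

# FURB164 rewrites `Decimal.from_float(...)` to the plain constructor. For
# `int` and `bool` arguments both forms produce the same value, so the fix
# is now marked as safe.
x = Decimal.from_float(4)     # fix: Decimal(4)
y = Decimal.from_float(True)  # fix: Decimal(True)
```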

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-28 14:21:38 +00:00
David Peter
2680f2ed81 [ty] Minor: test isolation (#19597)
## Summary

Split the "Generator functions" tests into two parts. The first part
(synchronous) refers to a function called `i` from a function `i2`. But
`i` is later redeclared in the asynchronous part, which was probably not
intended.
2025-07-28 15:52:59 +02:00
Micha Reiser
afdfa042f3 [ty] Remove AssertUnwindSafe from BackgroundRequestHandler api (#19598) 2025-07-28 13:28:09 +00:00
Micha Reiser
8c0743df97 [ty] Fix "peek definition" in playground (#19592) 2025-07-28 09:13:00 +01:00
Dimitri Papadopoulos Orfanos
13634ff433 Use PEP 639 license information for Ruff itself instead of classifier (#19499)
## Summary

Declare licenses using only these two fields, as per PEP 639:
* `license`: SPDX license expression consisting of one or more license
identifiers
* `license-files`: list of license file glob patterns

Supported by maturin ≥ 1.9.0:
https://www.maturin.rs/changelog.html

## Test Plan

N/A
2025-07-28 09:43:50 +02:00
renovate[bot]
7a541f597f Update NPM Development dependencies (#19588)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-28 07:55:22 +01:00
renovate[bot]
2deb50f4e3 Update Rust crate get-size2 to 0.6.0 (#19590)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-28 07:55:03 +01:00
renovate[bot]
85e22645aa Update pre-commit hook astral-sh/ruff-pre-commit to v0.12.5 (#19585)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit)
| repository | patch | `v0.12.4` -> `v0.12.5` |

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.5`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.5)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.4...v0.12.5)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.5

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-28 08:48:14 +02:00
renovate[bot]
d3f6de8b0e Update cargo-bins/cargo-binstall action to v1.14.2 (#19583)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[cargo-bins/cargo-binstall](https://redirect.github.com/cargo-bins/cargo-binstall)
| action | patch | `v1.14.1` -> `v1.14.2` |

---

### Release Notes

<details>
<summary>cargo-bins/cargo-binstall (cargo-bins/cargo-binstall)</summary>

###
[`v1.14.2`](https://redirect.github.com/cargo-bins/cargo-binstall/releases/tag/v1.14.2)

[Compare
Source](https://redirect.github.com/cargo-bins/cargo-binstall/compare/v1.14.1...v1.14.2)

*Binstall is a tool to fetch and install Rust-based executables as
binaries. It aims to be a drop-in replacement for `cargo install` in
most cases. Install it today with `cargo install cargo-binstall`, from
the binaries below, or if you already have it, upgrade with `cargo
binstall cargo-binstall`.*

##### In this release:

- Upgrade dependencies

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-28 08:47:33 +02:00
renovate[bot]
9eb8174209 Update CodSpeedHQ/action action to v3.8.0 (#19587)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [CodSpeedHQ/action](https://redirect.github.com/CodSpeedHQ/action) |
action | minor | `v3.7.0` -> `v3.8.0` |

---

### Release Notes

<details>
<summary>CodSpeedHQ/action (CodSpeedHQ/action)</summary>

###
[`v3.8.0`](https://redirect.github.com/CodSpeedHQ/action/releases/tag/v3.8.0)

[Compare
Source](https://redirect.github.com/CodSpeedHQ/action/compare/v3.7.0...v3.8.0)

##### What's Changed

##### 🐛 Bug Fixes

- Adjust offset for symbols of module loaded at preferred base by
[@&#8203;not-matthias](https://redirect.github.com/not-matthias) in
[#&#8203;97](https://redirect.github.com/CodSpeedHQ/runner/pull/97)
- Run with --scope to allow perf to trace the benchmark process by
[@&#8203;not-matthias](https://redirect.github.com/not-matthias)
- Run with bash to support complex scripts by
[@&#8203;not-matthias](https://redirect.github.com/not-matthias)
- Execute pre- and post-bench scripts for non-perf walltime runner by
[@&#8203;not-matthias](https://redirect.github.com/not-matthias) in
[#&#8203;96](https://redirect.github.com/CodSpeedHQ/runner/pull/96)

##### 🏗️ Refactor

- Process memory mappings in separate function by
[@&#8203;not-matthias](https://redirect.github.com/not-matthias)

##### ⚙️ Internals

- Add debug logs for perf.map collection by
[@&#8203;not-matthias](https://redirect.github.com/not-matthias)
- Add complex cmd and env tests by
[@&#8203;not-matthias](https://redirect.github.com/not-matthias)

**Full Changelog**:
https://github.com/CodSpeedHQ/action/compare/v3.7.0...v3.8.0
**Full Runner Changelog**:
https://github.com/CodSpeedHQ/runner/blob/main/CHANGELOG.md

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-28 08:47:05 +02:00
renovate[bot]
9c68616d91 Update Rust crate criterion to 0.7.0 (#19589)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [criterion](https://bheisler.github.io/criterion.rs/book/index.html)
([source](https://redirect.github.com/bheisler/criterion.rs)) |
workspace.dependencies | minor | `0.6.0` -> `0.7.0` |

---

### Release Notes

<details>
<summary>bheisler/criterion.rs (criterion)</summary>

###
[`v0.7.0`](https://redirect.github.com/bheisler/criterion.rs/blob/HEAD/CHANGELOG.md#070---2025-07-25)

[Compare
Source](https://redirect.github.com/bheisler/criterion.rs/compare/0.6.0...0.7.0)

- Bump version of criterion-plot to align dependencies.

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-28 08:35:36 +02:00
renovate[bot]
9280c7e945 Update astral-sh/setup-uv action to v6.4.3 (#19586)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [astral-sh/setup-uv](https://redirect.github.com/astral-sh/setup-uv) |
action | minor | `v6.3.1` -> `v6.4.3` |

---

### Release Notes

<details>
<summary>astral-sh/setup-uv (astral-sh/setup-uv)</summary>

###
[`v6.4.3`](https://redirect.github.com/astral-sh/setup-uv/releases/tag/v6.4.3):
🌈 fix relative paths starting with dots

[Compare
Source](https://redirect.github.com/astral-sh/setup-uv/compare/v6.4.2...v6.4.3)

#### 🐛 Bug fixes

- fix relative paths starting with dots
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;500](https://redirect.github.com/astral-sh/setup-uv/issues/500))

###
[`v6.4.2`](https://redirect.github.com/astral-sh/setup-uv/releases/tag/v6.4.2):
🌈 Interpret relative inputs as under working-directory

[Compare
Source](https://redirect.github.com/astral-sh/setup-uv/compare/v6.4.1...v6.4.2)

#### Changes

This release will interpret relative paths in inputs as relative
to the value of `working-directory` (default is `${{ github.workspace
}}`).
This means the following configuration

```yaml
- uses: astral-sh/setup-uv@v6
   with:
     working-directory: /my/path
     cache-dependency-glob: uv.lock
```

will look for the `cache-dependency-glob` under `/my/path/uv.lock`

#### 🐛 Bug fixes

- interpret relative inputs as under working-directory
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;498](https://redirect.github.com/astral-sh/setup-uv/issues/498))

#### 🧰 Maintenance

- chore: update known versions for 0.8.1/0.8.2
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;497](https://redirect.github.com/astral-sh/setup-uv/issues/497))
- chore: update known versions for 0.8.0
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;491](https://redirect.github.com/astral-sh/setup-uv/issues/491))

###
[`v6.4.1`](https://redirect.github.com/astral-sh/setup-uv/releases/tag/v6.4.1):
🌈 Hotfix: Ignore deps starting with uv when finding uv version

[Compare
Source](https://redirect.github.com/astral-sh/setup-uv/compare/v6.4.0...v6.4.1)

##### Changes

Thank you [@&#8203;phpmypython](https://redirect.github.com/phpmypython)
for raising a PR to fix this issue!

##### 🐛 Bug fixes

- Ignore deps starting with uv when finding uv version
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;492](https://redirect.github.com/astral-sh/setup-uv/issues/492))

###
[`v6.4.0`](https://redirect.github.com/astral-sh/setup-uv/releases/tag/v6.4.0):
🌈 Add input `version-file`

[Compare
Source](https://redirect.github.com/astral-sh/setup-uv/compare/v6.3.1...v6.4.0)

##### Changes

You can now use the `version-file` input to specify a file that contains
the version of uv to install.
This can either be a `pyproject.toml` or `uv.toml` file which defines a
`required-version` or
uv defined as a dependency in `pyproject.toml` or `requirements.txt`.

```yaml
- name: Install uv based on the version defined in requirements.txt
  uses: astral-sh/setup-uv@v6
  with:
    version-file: "requirements.txt"
```

##### 🚀 Enhancements

- Add input version-file
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;486](https://redirect.github.com/astral-sh/setup-uv/issues/486))

##### 🧰 Maintenance

- chore: update known versions for 0.7.22
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;488](https://redirect.github.com/astral-sh/setup-uv/issues/488))
- Bump dependencies
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;487](https://redirect.github.com/astral-sh/setup-uv/issues/487))
- chore: update known versions for 0.7.21
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;483](https://redirect.github.com/astral-sh/setup-uv/issues/483))
- chore: update known versions for 0.7.20
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;480](https://redirect.github.com/astral-sh/setup-uv/issues/480))
- chore: update known versions for 0.7.19
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;475](https://redirect.github.com/astral-sh/setup-uv/issues/475))
- chore: update known versions for 0.7.18
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;473](https://redirect.github.com/astral-sh/setup-uv/issues/473))
- chore: update known versions for 0.7.17
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;468](https://redirect.github.com/astral-sh/setup-uv/issues/468))
- chore: update known versions for 0.7.16
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;466](https://redirect.github.com/astral-sh/setup-uv/issues/466))
- chore: update known versions for 0.7.15
@&#8203;[github-actions\[bot\]](https://redirect.github.com/apps/github-actions)
([#&#8203;463](https://redirect.github.com/astral-sh/setup-uv/issues/463))

##### 📚 Documentation

- Add FAQ on changed cache and cache upload behavior
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;477](https://redirect.github.com/astral-sh/setup-uv/issues/477))

##### ⬆️ Dependency updates

- Bump dependencies
[@&#8203;eifinger](https://redirect.github.com/eifinger)
([#&#8203;487](https://redirect.github.com/astral-sh/setup-uv/issues/487))

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-28 08:33:44 +02:00
Dhruv Manilawala
e19145040f [ty] Add workflow to comment conformance tests diff (#19555)
## Summary

As a follow-up to https://github.com/astral-sh/ruff/pull/19556, this PR adds
the workflow that computes the diagnostic diff, which the workflow
introduced in the linked PR will post as a comment.

This workflow is similar to the [ty ecosystem-analyzer
workflow](d781a6ab3f/.github/workflows/ty-ecosystem-analyzer.yaml).

Closes: astral-sh/ty#212

## Test Plan

1. Initially there's no diff to show
2. This
[commit](d0db9937df)
comments out a rule which updates the comment with the diff
3. Later, that commit is reverted and the diff goes away

Use the comment history to look at the diff output; the order of the
history corresponds to the steps mentioned above in reverse order,
i.e., the edit in the middle will contain the diff output:

<img width="1082" height="313" alt="Screenshot 2025-07-25 at 21 09 26"
src="https://github.com/user-attachments/assets/6aceb60c-1987-4b9a-9063-e3999844f035"
/>
2025-07-28 11:33:22 +05:30
renovate[bot]
ef3a195f28 Update dependency ruff to v0.12.5 (#19584) 2025-07-27 22:22:23 -04:00
Dylan
008bbfdf5a Disallow implicit concatenation of t-strings and other string types (#19485)
As of [this cpython PR](https://github.com/python/cpython/pull/135996),
it is not allowed to concatenate t-strings with non-t-strings,
implicitly or explicitly. Expressions such as `"foo" t"{bar}"` are now
syntax errors.
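
A hypothetical illustration of the new grammar rule (requires Python 3.14; the erroring line is commented out so the snippet stays valid):

```py
name = "world"

t"Hello, {name}" t"!"    # OK: a t-string implicitly concatenated with a t-string
# "Hello, " t"{name}"    # SyntaxError under the new grammar: t-string mixed with a plain string
```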

This PR updates some AST nodes and parsing to reflect this change.

The structural change is that `TStringPart` is no longer needed, since,
as in the case of `BytesStringLiteral`, the only possibilities are that
we have a single `TString` or a vector of such (representing an implicit
concatenation of t-strings). This removes a level of nesting from many
AST expressions (which is what all the snapshot changes reflect), and
simplifies some logic in the implementation of visitors, for example.

The other change of note is in the parser. When we meet an implicit
concatenation of string-like literals, we now count the number of
t-string literals. If these do not exhaust the total number of
implicitly concatenated pieces, then we emit a syntax error. To recover
from this syntax error, we encode any t-string pieces as _invalid_
string literals (which means we flag them as invalid, record their
range, and record the value as `""`). Note that if at least one of the
pieces is an f-string we prefer to parse the entire string as an
f-string; otherwise we parse it as a string.

This logic is exactly the same as how we currently treat
`BytesStringLiteral` parsing and error recovery - and carries with it
the same pros and cons.

Finally, note that I have not implemented any changes in the
implementation of the formatter. As far as I can tell, none are needed.
I did change a few of the fixtures so that we are always concatenating
t-strings with t-strings.
2025-07-27 12:41:03 +00:00
Alex Waygood
df5eba7583 [ty] Mark all_type_assignable_to_iterable_are_iterable as flaky (#19574) 2025-07-27 11:04:13 +00:00
Micha Reiser
469c50b0b7 [ty] Support stdlib files in playground (#19557) 2025-07-26 19:33:38 +01:00
UnboundVariable
738246627f [ty] Implemented support for "selection range" language server feature (#19567)
This PR adds support for the "selection range" language server feature.
This feature was recently requested by a ty user in [this feature
request](https://github.com/astral-sh/ty/issues/882).

This feature allows a client to implement "smart selection expansion"
based on the structure of the parse tree. For example, if you type
"shift-ctrl-right-arrow" in VS Code, the current selection will be
expanded to include the parent AST node. Conversely,
"shift-ctrl-left-arrow" shrinks the selection.

We will probably need to tune the granularity of selection expansion
based on user feedback. The initial implementation includes most AST
nodes, but users may find this to be too fine-grained. We have the
option of skipping some AST nodes that are not as meaningful when
editing code.

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-26 09:08:36 -07:00
Douglas Creager
e867830848 [ty] Don't include already-bound legacy typevars in function generic context (#19558)
We now correctly exclude legacy typevars from enclosing scopes when
constructing the generic context for a generic function.

more detail:

A function is generic if it refers to legacy typevars in its signature:

```py
from typing import TypeVar

T = TypeVar("T")

def f(t: T) -> T:
    return t
```

Generic functions are allowed to appear inside of other generic
contexts. When they do, they can refer to the typevars of those
enclosing generic contexts, and that should not rebind the typevar:

```py
from typing import TypeVar, Generic

T = TypeVar("T")
U = TypeVar("U")

class C(Generic[T]):
    @staticmethod
    def method(t: T, u: U) -> None: ...

# revealed: def method(t: int, u: U) -> None
reveal_type(C[int].method)
```

This substitution was already being performed correctly, but we were
also still including the enclosing legacy typevars in the method's own
generic context, which can be seen via `ty_extensions.generic_context`
(which has been updated to work on generic functions and methods):

```py
from ty_extensions import generic_context

# before: tuple[T, U]
# after: tuple[U]
reveal_type(generic_context(C[int].method))
```

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-25 18:14:19 -04:00
Elliot Simpson
72fdb7d439 [flake8-blind-except] Change BLE001 to permit logging.critical(..., exc_info=True). (#19520)
## Summary

Changing `BLE001` (blind-except) so that it does not flag `except`
clauses which include `logging.critical(..., exc_info=True)`.

## Test Plan

It passes the following (whereas the `main` branch does not):
```sh
$ cargo run -p ruff -- check somefile.py --no-cache --select=BLE001
```
```python
# somefile.py

import logging


try:
    print("Hello world!")
except Exception:
    logging.critical("Did not run.", exc_info=True)
```
Related: https://github.com/astral-sh/ruff/issues/19519
2025-07-25 17:52:58 -04:00
Dylan
fbf1dfc782 Reword preview warning for target-version Python 3.14 (#19563)
Small rewording to indicate that core development is done but that we
may add breaking changes.

Feel free to bikeshed!

Test:

```console
❯ echo "t''" | cargo run -p ruff -- check --no-cache --isolated --target-version py314 -
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.13s
     Running `target/debug/ruff check --no-cache --isolated --target-version py314 -`
warning: Support for Python 3.14 is in preview and may undergo breaking changes. Enable `preview` to remove this warning.
All checks passed!
```
2025-07-25 16:09:45 -05:00
UnboundVariable
a0d8ff51dd [ty] Added support for "document symbols" and "workspace symbols" (#19521)
This PR adds support for "document symbols" and "workspace symbols"
language server features. Most of the logic to implement these features
is shared.

The "document symbols" feature returns a list of all symbols within a
specified source file. Clients can specify whether they want a flat or
hierarchical list. Document symbols are typically presented by a client
in an "outline" form. Here's what this looks like in VS Code, for
example.

<img width="240" height="249" alt="image"
src="https://github.com/user-attachments/assets/82b11f4f-32ec-4165-ba01-d6496ad13bdf"
/>


The "workspace symbols" feature returns a list of all symbols across the
entire workspace that match some user-supplied query string. This allows
the user to quickly find and navigate to any symbol within their code.

<img width="450" height="134" alt="image"
src="https://github.com/user-attachments/assets/aac131e0-9464-4adf-8a6c-829da028c759"
/>

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-25 13:07:38 -07:00
Brent Westbrook
165091a31c Add TextEmitter::with_color and disable colors in unreadable_files test (#19562)
Summary
--

I looked at other uses of `TextEmitter`, and I think this should be the
only one affected by this. The other integration tests must work
properly since they're run with `assert_cmd_snapshot!`, which I assume
triggers the `SHOULD_COLORIZE` case, and the `cfg!(test)` check will
work for uses in `ruff_linter`.


4a4dc38b5b/crates/ruff_linter/src/message/text.rs (L36-L44)

Alternatively, we could probably move this to a CLI test instead.

Test Plan
--

`cargo test -p ruff`, which was failing on `main` with color codes in
the output before this
2025-07-25 15:47:49 -04:00
UnboundVariable
4a4dc38b5b [ty] Added support for document highlights in playground. (#19540)
This PR adds support for the "document highlights" feature in the ty
playground.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-25 08:55:40 -07:00
Dan Parizher
3e366fdf13 [refurb] Ignore decorated functions for FURB118 (#19339)
## Summary

Fixes #19305
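
A minimal sketch of the kind of case this covers (the decorator and function names here are illustrative, not taken from the linked issue):

```py
import functools


@functools.cache
def add(x, y):
    # FURB118 would normally suggest replacing this with `operator.add`,
    # but rewriting a decorated function would drop the decorator's
    # behavior, so decorated functions are now skipped.
    return x + y
```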
2025-07-25 10:43:17 -05:00
Dhruv Manilawala
53e9e4421c [ty] Add workflow to comment diagnostic diff for conformance tests (#19556)
## Summary

This PR adds a workflow to comment the diff of diagnostics when running
ty between `main` and a pull request on the [typing conformance test
suite](https://github.com/python/typing/tree/main/conformance/tests).

The main workflow is introduced in
https://github.com/astral-sh/ruff/pull/19555 which this workflow depends
on. This workflow is similar to the [mypy primer
comment](d781a6ab3f/.github/workflows/mypy_primer_comment.yaml)
workflow.

## Test Plan

I cannot test this workflow without merging it into `main`, unless anyone
knows of a way to do so.
2025-07-25 20:54:28 +05:30
Alex Waygood
859262bd49 [ty] Move zope.interface to good.txt for primer runs (#19208) 2025-07-25 14:12:17 +01:00
David Peter
c0768dfd96 [ty] Attribute access on intersections with negative parts (#19524)
## Summary

We currently infer a `@Todo` type whenever we access an attribute on an
intersection type with negative components. This can happen very
naturally. Consequently, this `@Todo` type is rather pervasive and hides
a lot of true positives that ty could otherwise detect:

```py
class Foo:
    attr: int = 1

def _(f: Foo | None):
    if f:
        reveal_type(f)  # Foo & ~AlwaysFalsy

        reveal_type(f.attr)  # now: int, previously: @Todo
```

The changeset here proposes to handle member access on these
intersection types by simply ignoring all negative contributions. This
is not always ideal: a negative contribution like `~<Protocol with
members 'attr'>` could be a hint that `.attr` should not be accessible
on the full intersection type. The behavior can certainly be improved in
the future, but this seems like a reasonable initial step to get rid of
this unnecessary `@Todo` type.

## Ecosystem analysis

There are quite a few changes here. I spot-checked them and found one
bug where attribute access on pure negation types (`~P == object & ~P`)
would not allow attributes on `object` to be accessed. After that was
fixed, I only see true positives and known problems. The fact that a lot
of `unused-ignore-comment` diagnostics go away is also evidence for the
fact that this touches a sensitive area, where static analysis clashes
with dynamically adding attributes to objects:
```py
… # type: ignore # Runtime attribute access
```

## Test Plan

Updated tests.
2025-07-25 14:56:14 +02:00
David Peter
d4eb4277ad [ty] Add basic support for dataclasses.field (#19553)
## Summary

Add basic support for `dataclasses.field`:
* remove fields with `init=False` from the signature of the synthesized
`__init__` method
* infer correct default value types from `default` or `default_factory`
arguments

```py
from dataclasses import dataclass, field

def default_roles() -> list[str]:
    return ["user"]

@dataclass
class Member:
    name: str
    roles: list[str] = field(default_factory=default_roles)
    tag: str | None = field(default=None, init=False)

# revealed: (self: Member, name: str, roles: list[str] = list[str]) -> None
reveal_type(Member.__init__)
```

Support for `kw_only` has **not** been added.

part of https://github.com/astral-sh/ty/issues/111

## Test Plan

New Markdown tests
2025-07-25 14:56:04 +02:00
Micha Reiser
b033fb6bfd [ty] Split ScopedPlaceId into ScopedSymbolId and ScopedMemberId (#19497) 2025-07-25 13:54:33 +02:00
Alex Waygood
f722bfa9e6 [ty] Do not consider a type T to satisfy a method member on a protocol unless the method is available on the meta-type of T (#19187) 2025-07-25 11:16:04 +01:00
Shunsuke Shibayama
b124e182ca [ty] improve lazy scope place lookup (#19321)
Co-authored-by: David Peter <sharkdp@users.noreply.github.com>
Co-authored-by: Carl Meyer <carl@oddbird.net>
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-25 07:11:11 +00:00
Dhruv Manilawala
57373a7e4d [ty] Derive Serialize unconditionally on client options (#19549) 2025-07-25 04:03:03 +00:00
Carl Meyer
ae9d450b5f [ty] Fallback to Unknown if no type is stored for an expression (#19517)
## Summary

See discussion at
https://github.com/astral-sh/ruff/pull/19478/files#r2223870292

Fixes https://github.com/astral-sh/ty/issues/865

## Test Plan

Added one mdtest for invalid Callable annotation; removed `pull-types:
skip` from that test file.

Co-authored-by: lipefree <willy.ngo.2000@gmail.com>
2025-07-25 02:05:32 +00:00
UnboundVariable
c8c80e054e [ty] Fix bug #879 in signature help (#19542)
This PR fixes bug [#879](https://github.com/astral-sh/ty/issues/879)
where the signature help popup remains visible after typing the closing
paren in a call expression.

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-24 16:26:14 -07:00
UnboundVariable
4bc34b82ef [ty] Added support for "document highlights" language server feature. (#19515)
This PR adds support for the "document highlights" language server
feature.

This feature allows a client to highlight all instances of a selected
name within a document. Without this feature, editors perform
highlighting based on a simple text match. This adds semantic knowledge.

The implementation of this feature largely overlaps that of the
recently-added "references" feature. This PR refactors the existing
"references.rs" module, separating out the functionality and tests that
are specific to the other language feature into a "goto_references.rs"
module. The "references.rs" module now contains the functionality that
is common to "goto references", "document highlights" and "rename"
(which is not yet implemented).

As part of this PR, I also created a new `ReferenceTarget` type which is
similar to the existing `NavigationTarget` type but better suited for
references. This idea was suggested by @MichaReiser in [this code review
feedback](https://github.com/astral-sh/ruff/pull/19475#discussion_r2224061006)
from a previous PR. Notably, this new type contains a field that
specifies the "kind" of the reference (read, write or other). This
"kind" is needed for the document highlights feature.

Before: all textual instances of `foo` are highlighted
<img width="156" height="126" alt="Screenshot 2025-07-23 at 12 51 09 PM"
src="https://github.com/user-attachments/assets/37ccdb2f-d48a-473d-89d5-8e89cb6c394e"
/>

After: only semantic matches are highlighted
<img width="164" height="157" alt="Screenshot 2025-07-23 at 12 52 05 PM"
src="https://github.com/user-attachments/assets/2efadadd-4691-4815-af04-b031e74c81b7"
/>

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-24 13:06:25 -07:00
Charlie Marsh
d9cab4d242 Add support for specifying minimum dots in detected string imports (#19538)
## Summary

Defaults to requiring two dots, which matches the Pants default.
2025-07-24 15:48:23 -04:00
David Peter
d77b7312b0 [ty] Minor: fix incomplete docstring (#19534) 2025-07-24 21:01:15 +02:00
Dhruv Manilawala
f9091ea8bb [ty] Move server tests as integration tests (#19522)
## Summary

Reference:
https://github.com/astral-sh/ruff/pull/19391#discussion_r2222780892
2025-07-24 16:10:17 +00:00
Robsdedude
1d2181623c [ruff] Offer fixes for RUF039 in more cases (#19065)
## Summary
Expand cases in which ruff can offer a fix for `RUF039` (some of which
are unsafe).

While turning `"\n"` (== `\n`) into `r"\n"` (== `\\n`) is not equivalent
at run-time, it's still functionally equivalent to do so in the context
of [regex
patterns](https://docs.python.org/3/library/re.html#regular-expression-syntax)
as they themselves interpret the escape sequence. Therefore, an unsafe
fix can be offered.

Further, this PR also makes ruff offer fixes for byte string literals,
not only string literals as before.
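
A hedged sketch of the kinds of patterns involved (illustrative only; the exact set of fixable cases comes from the rule's implementation):

```py
import re

# Previously flagged by RUF039 without a fix being offered:
re.compile("\n+")
re.compile(b"\n+")  # byte-string patterns are now handled too

# The (unsafe) fix rewrites them as raw literals; the regex engine interprets
# the \n escape itself, so the patterns match the same inputs:
re.compile(r"\n+")
re.compile(rb"\n+")
```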

## Test Plan
Tests for all escape sequences have been added.

## Related
Closes: https://github.com/astral-sh/ruff/issues/16713

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-24 11:45:45 -04:00
David Peter
dc6be457b5 [ty] Support dataclasses.InitVar (#19527)
## Summary

Missing support for `dataclasses.InitVar` creates a lot of false positives
in the ecosystem, and it was relatively easy to add basic support for it.
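
As a rough illustration (names are made up; the exact inferred signature display may differ):

```py
from dataclasses import InitVar, dataclass


@dataclass
class Database:
    name: str
    create_tables: InitVar[bool] = True  # __init__ parameter, not an instance attribute

    def __post_init__(self, create_tables: bool) -> None:
        if create_tables:
            ...


# The synthesized __init__ should accept the InitVar parameter, while
# `Database(...).create_tables` should not be treated as an instance attribute.
reveal_type(Database.__init__)
```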

Some preliminary work on this was done by @InSyncWithFoo — thank you.

part of https://github.com/astral-sh/ty/issues/111

## Ecosystem analysis

The results look good.

## Test Plan

New Markdown tests

---------

Co-authored-by: InSync <insyncwithfoo@gmail.com>
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-24 16:33:33 +02:00
Robsdedude
1079975b35 [ruff] Fix RUF033 breaking with named default expressions (#19115)
## Summary
The generated fix for `RUF033` would cause a syntax error for named
expressions as parameter defaults.
```python
from dataclasses import InitVar, dataclass
@dataclass
class Foo:
    def __post_init__(self, bar: int = (x := 1)) -> None:
        pass
```
would be turned into
```python
from dataclasses import InitVar, dataclass
@dataclass
class Foo:
    x: InitVar[int] = x := 1
    def __post_init__(self, bar: int = (x := 1)) -> None:
        pass
```
instead of the syntactically correct
```python
# ...
x: InitVar[int] = (x := 1)
# ...
```

## Test Plan
Test reproducer (plus some extra tests) have been added to the test
suite.

## Related
Fixes: https://github.com/astral-sh/ruff/issues/18950
2025-07-24 09:45:49 -04:00
Brent Westbrook
39eb0f6c6c Update pre-commit hook name (#19530)
## Summary

A couple of months ago now
(https://github.com/astral-sh/ruff-pre-commit/pull/124) we changed the
hook ID from just `ruff` to `ruff-check` to mirror `ruff-format`. I
noticed the `ruff (legacy alias)` when running pre-commit on the release
today and realized we should probably update.

## Test Plan

Commit on this PR:

```shell
> git commit -m "Update pre-commit hook name"
check for merge conflicts................................................Passed
Validate pyproject.toml..............................(no files to check)Skipped
mdformat.............................................(no files to check)Skipped
markdownlint-fix.....................................(no files to check)Skipped
blacken-docs.........................................(no files to check)Skipped
typos....................................................................Passed
cargo fmt............................................(no files to check)Skipped
ruff format..........................................(no files to check)Skipped
ruff check...........................................(no files to check)Skipped  <-- 
prettier.................................................................Passed
zizmor...............................................(no files to check)Skipped
Validate GitHub Workflows............................(no files to check)Skipped
shellcheck...........................................(no files to check)Skipped
```

Compared to the release branch:

```shell
> pre-commit run
...
cargo fmt............................................(no files to check)Skipped
ruff format..........................................(no files to check)Skipped
ruff (legacy alias)..................................(no files to check)Skipped
...
```
2025-07-24 09:44:47 -04:00
Brent Westbrook
d13228ab85 Bump 0.12.5 (#19528) 2025-07-24 09:12:50 -04:00
David Peter
9461d3076f [ty] Rename type_api => ty_extensions (#19523) 2025-07-24 08:24:26 +00:00
UnboundVariable
63d1d332b3 [ty] Added support for "go to references" in ty playground. (#19516)
This PR adds support for "go to references" in the ty playground.

<img width="393" height="168" alt="image"
src="https://github.com/user-attachments/assets/ce3ae1bf-c17c-4510-9f77-20b10f6170c4"
/>

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-23 22:46:42 -07:00
Douglas Creager
e0149cd9f3 [ty] Return a tuple spec from the iterator protocol (#19496)
This PR updates our iterator protocol machinery to return a tuple spec
describing the elements that are returned, instead of a type. That
allows us to track heterogeneous iterators more precisely, and
consolidates the logic in unpacking and splatting, which are the two
places where we can take advantage of that more precise information.
(Other iterator consumers, like `for` loops, have to collapse the
iterated elements down to a single type regardless, and we provide a new
helper method on `TupleSpec` to perform that summarization.)
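
A small sketch of the behavior this enables (following the usual `reveal_type` conventions from the mdtests; exact output may differ):

```py
def f(pair: tuple[int, str]) -> None:
    # Unpacking can now use the per-element information from the tuple spec:
    a, b = pair
    reveal_type(a)  # int
    reveal_type(b)  # str

    # A `for` loop still collapses the elements to a single type:
    for item in pair:
        reveal_type(item)  # int | str
```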
2025-07-23 17:11:44 -04:00
David Peter
2a00eca66b [ty] Exhaustiveness checking & reachability for match statements (#19508)
## Summary

Implements proper reachability analysis and — in effect — exhaustiveness
checking for `match` statements. This allows us to check the following
code without any errors (leads to *"can implicitly return `None`"* on
`main`):

```py
from enum import Enum, auto

class Color(Enum):
    RED = auto()
    GREEN = auto()
    BLUE = auto()

def hex(color: Color) -> str:
    match color:
        case Color.RED:
            return "#ff0000"
        case Color.GREEN:
            return "#00ff00"
        case Color.BLUE:
            return "#0000ff"
```

Note that code like this already worked fine if there was a
`assert_never(color)` statement in a catch-all case, because we would
then consider that `assert_never` call terminal. But now this also works
without the wildcard case. Adding a member to the enum would still lead
to an error here, if that case would not be handled in `hex`.

What needed to happen to support this is a new way of evaluating match
pattern constraints. Previously, we would simply compare the type of the
subject expression against the patterns. For the last case here, the
subject type would still be `Color` and the value type would be
`Literal[Color.BLUE]`, so we would infer an ambiguous truthiness.

Now, before we compare the subject type against the pattern, we first
generate a union type that corresponds to the set of all values that
would have *definitely been matched* by previous patterns. Then, we
build a "narrowed" subject type by computing `subject_type &
~already_matched_type`, and compare *that* against the pattern type. For
the example here, `already_matched_type = Literal[Color.RED] |
Literal[Color.GREEN]`, and so we have a narrowed subject type of `Color
& ~(Literal[Color.RED] | Literal[Color.GREEN]) = Literal[Color.BLUE]`,
which allows us to infer a reachability of `AlwaysTrue`.

<details>

<summary>A note on negated reachability constraints</summary>

It might seem that we now perform duplicate work, because we also record
*negated* reachability constraints. But that is still important for
cases like the following (and possibly also for more realistic
scenarios):

```py
from typing import Literal

def _(x: int | str):
    match x:
        case None:
            pass # never reachable
        case _:
            y = 1

    y
```

</details>

closes https://github.com/astral-sh/ty/issues/99

## Test Plan

* I verified that this solves all examples from the linked ticket (the
first example needs a PEP 695 type alias, because we don't support
legacy type aliases yet)
* Verified that the ecosystem changes are all because of removed false
positives
* Updated tests
2025-07-23 22:45:45 +02:00
David Peter
3d17897c02 [ty] Fix narrowing and reachability of class patterns with arguments (#19512)
## Summary

I noticed that our type narrowing and reachability analysis was
incorrect for class patterns that are not irrefutable. The test cases
below compare the old and the new behavior:

```py
from dataclasses import dataclass

@dataclass
class Point:
    x: int
    y: int

class Other: ...

def _(target: Point):
    y = 1

    match target:
        case Point(0, 0):
            y = 2
        case Point(x=0, y=1):
            y = 3
        case Point(x=1, y=0):
            y = 4
    
    reveal_type(y)  # revealed: Literal[1, 2, 3, 4]    (previously: Literal[2])


def _(target: Point | Other):
    match target:
        case Point(0, 0):
            reveal_type(target)  # revealed: Point
        case Point(x=0, y=1):
            reveal_type(target)  # revealed: Point    (previously: Never)
        case Point(x=1, y=0):
            reveal_type(target)  # revealed: Point    (previously: Never)
        case Other():
            reveal_type(target)  # revealed: Other    (previously: Other & ~Point)
```

## Test Plan

New Markdown test
2025-07-23 18:45:03 +02:00
UnboundVariable
fa1df4cedc [ty] Implemented partial support for "find references" language server feature. (#19475)
This PR adds basic support for the "find all references" language server feature.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-23 09:16:22 -07:00
chiri
89258f1938 [flake8-use-pathlib] Add autofix for PTH101, PTH104, PTH105, PTH121 (#19404)
## Summary

Part of https://github.com/astral-sh/ruff/issues/2331
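
For illustration, assuming PTH101 corresponds to `os.chmod` (the other codes follow the same pattern):

```py
import os
from pathlib import Path

os.chmod("pyproject.toml", 0o644)    # flagged; an autofix is now offered
Path("pyproject.toml").chmod(0o644)  # the pathlib equivalent it rewrites to
```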

## Test Plan

`cargo nextest run flake8_use_pathlib`
2025-07-23 12:13:43 -04:00
हिमांशु
1dcef1a011 [perflint] Parenthesize generator expressions (PERF401) (#19325)
## Summary
closes #19204 

## Test Plan
1. test case is added in dedicated file
2. locally tested the code manually

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
Co-authored-by: CodeMan62 <sharmahimanshu150082007@gmail.com>
2025-07-23 12:08:15 -04:00
Dan Parizher
ba629fe262 [pep8-naming] Fix N802 false positives for CGIHTTPRequestHandler and SimpleHTTPRequestHandler (#19432)
## Summary

Fixes #19422
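
A plausible example of the kind of override involved (an assumption based on the rule and the classes named in the title; the exact cases come from the issue):

```py
from http.server import SimpleHTTPRequestHandler


class Handler(SimpleHTTPRequestHandler):
    # Overriding the stdlib's mixed-case handler methods should no longer
    # trigger N802 ("function name should be lowercase").
    def do_GET(self) -> None:
        ...
```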
2025-07-23 12:04:11 -04:00
frank
bb3a05f92b [pylint] Handle empty comments after line continuation (PLR2044) (#19405)
fixes #19326
2025-07-23 11:56:49 -04:00
Brent Westbrook
4daf59e5e7 Move concise diagnostic rendering to ruff_db (#19398)
## Summary

This PR moves most of the work of rendering concise diagnostics in Ruff
into `ruff_db`, where the code is shared with ty. To accomplish this
without breaking backwards compatibility in Ruff, there are two main
changes on the `ruff_db`/ty side:
- Added the logic from Ruff for remapping notebook line numbers to cells
- Reordered the fields in the diagnostic to match Ruff and rustc
  ```text
  # old
error[invalid-assignment] try.py:3:1: Object of type `Literal[1]` is not
assignable to `str`
  # new
try.py:3:1: error[invalid-assignment]: Object of type `Literal[1]` is
not assignable to `str`
  ```

I don't think the notebook change failed any tests on its own, and only
a handful of snaphots changed in ty after reordering the fields, but
this will obviously affect any other uses of the concise format, outside
of tests, too.

The other big change should only affect Ruff:

- Added three new `DisplayDiagnosticConfig` options
Micha and I hoped that we could get by with one option
(`hide_severity`), but Ruff also toggles `show_fix_status` itself,
independently (there are cases where we want neither severity nor the
fix status), and during the implementation I realized we also needed
access to an `Applicability`. The main goals here are to suppress the
severity (`error` above), since Ruff only uses the `error` severity, and
to use the secondary/noqa code instead of the lint name
(`invalid-assignment` above).
  ```text
  # ty - same as "new" above
try.py:3:1: error[invalid-assignment]: Object of type `Literal[1]` is
not assignable to `str`
  # ruff
try.py:3:1: RUF123 [*] Object of type `Literal[1]` is not assignable to
`str`
  ```

This part of the concise diagnostic is actually shared with the `full`
output format in Ruff, but with the settings above, there are no
snapshot changes to either format.

## Test Plan

Existing tests with the handful of updates mentioned above, as well as
some new tests in the `concise` module.

Also this PR. Swapping the fields might have broken mypy_primer, unless
it occasionally times out on its own.

I also ran this script in the root of my Ruff checkout, which also has
CPython in it:

```shell
flags=(--isolated --no-cache --no-respect-gitignore --output-format concise .)
diff <(target/release/ruff check ${flags[@]} 2> /dev/null) \
     <(ruff check ${flags[@]} 2> /dev/null)
```

This yielded an expected diff due to some t-string error changes on main
since 0.12.4:
```diff
33622c33622
< crates/ruff_python_parser/resources/inline/err/f_string_lambda_without_parentheses.py:1:15: SyntaxError: Expected an element of or the end of the f-string
---
> crates/ruff_python_parser/resources/inline/err/f_string_lambda_without_parentheses.py:1:15: SyntaxError: Expected an f-string or t-string element or the end of the f-string or t-string
33742c33742
< crates/ruff_python_parser/resources/inline/err/implicitly_concatenated_unterminated_string_multiline.py:4:1: SyntaxError: Expected an element of or the end of the f-string
---
> crates/ruff_python_parser/resources/inline/err/implicitly_concatenated_unterminated_string_multiline.py:4:1: SyntaxError: Expected an f-string or t-string element or the end of the f-string or t-string
34131c34131
< crates/ruff_python_parser/resources/inline/err/t_string_lambda_without_parentheses.py:2:15: SyntaxError: Expected an element of or the end of the t-string
---
> crates/ruff_python_parser/resources/inline/err/t_string_lambda_without_parentheses.py:2:15: SyntaxError: Expected an f-string or t-string element or the end of the f-string or t-string
```

So modulo color, the results are identical on 38,186 errors in our test
suite and CPython 3.10.

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-23 11:43:32 -04:00
Jack O'Connor
88bd82938f [ty] highlight the argument in static_assert error messages (#19426)
Closes https://github.com/astral-sh/ty/issues/209.

Before:
```
error[static-assert-error]: Static assertion error: custom message
 --> test.py:2:1
  |
1 | from ty_extensions import static_assert
2 | static_assert(3 > 4, "custom message")
  | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |
```

After:
```
error[static-assert-error]: Static assertion error: custom message
 --> test.py:2:1
  |
1 | from ty_extensions import static_assert
2 | static_assert(3 > 4, "custom message")
  | ^^^^^^^^^^^^^^-----^^^^^^^^^^^^^^^^^^^
  |               |
  |               Inferred type of argument is `Literal[False]`
  |
```
2025-07-23 08:24:12 -07:00
David Peter
5a55bab3f3 [ty] Infer single-valuedness for enums based on int/str (#19510)
## Summary

We previously didn't recognize `Literal[Color.RED]` as single-valued, if
the enum also derived from `str` or `int`:
```py
from enum import Enum

class Color(str, Enum):
    RED = "red"
    GREEN = "green"
    BLUE = "blue"

def _(color: Color):
    if color == Color.RED:
        reveal_type(color)  # previously: Color, now: Literal[Color.RED]
```

The reason for that was that `int` and `str` have "custom" `__eq__` and
`__ne__` implementations that return `bool`. We do not treat enum
literals from classes with custom `__eq__` and `__ne__` implementations
as single-valued, but of course we know that `int.__eq__` and
`str.__eq__` are well-behaved.

## Test Plan

New Markdown tests.
2025-07-23 15:55:42 +02:00
Andrew Gallant
cc5885e564 [ty] Restructure submodule query around File dependency
This makes caching of submodules independent of whether `Module`
is itself a Salsa ingredient. In fact, this makes the work done in
the prior commit superfluous. But we're possibly keeping it as an
ingredient for now since it's a bit of a tedious change and we might
need it in the near future.

Ref https://github.com/astral-sh/ruff/pull/19495#pullrequestreview-3045736715
2025-07-23 09:46:40 -04:00
Andrew Gallant
4573a0f6a0 [ty] Make Module a Salsa ingredient
We want to write queries that depend on `Module` for caching. While it
could probably be done without making `Module` an ingredient, it seems to
be [best practice to do so].

[best practice to do so]: https://github.com/astral-sh/ruff/pull/19408#discussion_r2215867301
2025-07-23 09:46:40 -04:00
David Peter
905b9d7f51 [ty] Reachability analysis for isinstance(…) branches (#19503)
## Summary

Add more precise type inference for a limited set of `isinstance(…)`
calls, i.e. return `Literal[True]` if we can be sure that this is the
correct result. This improves exhaustiveness checking / reachability
analysis for if-elif-else chains with `isinstance` checks. For example:

```py
def is_number(x: int | str) -> bool:  # no "can implicitly return `None` error here anymore
    if isinstance(x, int):
        return True
    elif isinstance(x, str):
        return False

    # code here is now detected as being unreachable
```

This PR also adds a new test suite for exhaustiveness checking.

## Test Plan

New Markdown tests

### Ecosystem analysis

The removed diagnostics look good. There's [one
case](f52c4f1afd/torchvision/io/video_reader.py (L125-L143))
where a "true positive" is removed in unreachable code. `src` is
annotated as being of type `str`, but there is an `elif isinstance(src,
bytes)` branch, which we now detect as unreachable. And so the
diagnostic inside that branch is silenced. I don't think this is a
problem, especially once we have a "graying out" feature, or a lint that
warns about unreachable code.
2025-07-23 13:06:30 +02:00
David Peter
b605c3e232 [ty] Normalize single-member enums to their instance type (#19502)
## Summary

Fixes https://github.com/astral-sh/ty/issues/874

Labeling this as `internal`, since we haven't released the
enum-expansion feature.
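
A sketch of the equivalence this relies on (my reading of the title; the exact display may differ):

```py
from enum import Enum


class Answer(Enum):
    YES = 1  # the only member


def f(a: Answer):
    if a is Answer.YES:
        # `Literal[Answer.YES]` and `Answer` describe the same set of values,
        # so the literal is normalized to the instance type.
        reveal_type(a)  # Answer
```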

## Test Plan

New Markdown tests
2025-07-23 10:14:20 +02:00
Micha Reiser
c281891b5c [ty] Invert ty_ide and ty_project dependency (#19501) 2025-07-23 07:37:46 +00:00
Dhruv Manilawala
53d795da67 [ty] Implement mock language server for testing (#19391)
## Summary

Closes: astral-sh/ty#88

This PR implements an initial version of a mock language server that can
be used to write e2e tests using the real server running in the
background.

The way it works is that you'd use the `TestServerBuilder` to help
construct the `TestServer` with the setup data. This could be the
workspace folders, populating the file and it's content in the memory
file system, setting the right client capabilities to make the server
respond correctly, etc. This can be expanded as we write more test
cases.

There are still a few things to follow-up on:
- ~In the `Drop` implementation, we should assert that there are no
pending notifications, requests, and responses from the server that the
test code hasn't handled yet~ Implemented in [`afd1f82`
(#19391)](afd1f82bde)
- Reduce the setup boilerplate in any way we can
- Improve the final assertion, currently I'm just snapshotting the final
output

## Test Plan

Written a few test cases.
2025-07-23 12:26:58 +05:30
David Peter
385d6fa608 [ty] Detect enums if metaclass is a subtype of EnumType/EnumMeta (#19481)
## Summary

This PR implements the following section from the [typing spec on
enums](https://typing.python.org/en/latest/spec/enums.html#enum-definition):

> Enum classes can also be defined using a subclass of `enum.Enum` **or
any class that uses `enum.EnumType` (or a subclass thereof) as a
metaclass**. Note that `enum.EnumType` was named `enum.EnumMeta` prior
to Python 3.11.

part of https://github.com/astral-sh/ty/issues/183
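
A sketch following the spec wording above (assuming the metaclass-only form; exact diagnostics aside):

```py
from enum import EnumMeta  # spelled `EnumType` on Python 3.11+


class CustomEnumMeta(EnumMeta): ...


class Color(metaclass=CustomEnumMeta):
    RED = 1
    GREEN = 2


# ty now treats `Color` as an enum class even though it does not inherit
# from `enum.Enum` directly.
```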

## Test Plan

New Markdown tests
2025-07-23 08:46:51 +02:00
Jack O'Connor
ba070bb6d5 [ty] perform type narrowing for places marked global too (#19381)
Fixes https://github.com/astral-sh/ty/issues/311.
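A minimal sketch of the behavior (assumed from the title; the linked issue has the real reproduction):

```py
x: int | None = None


def read() -> int:
    global x
    if x is None:
        x = 0
    # Narrowing now applies to `x` even though it is marked `global`,
    # so the return type here is `int`, not `int | None`.
    return x
```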
2025-07-22 16:42:10 -07:00
Micha Reiser
dc10ab81bd [ty] Use ThinVec for sub segments in PlaceExpr (#19470) 2025-07-22 20:39:39 +02:00
Douglas Creager
7673d46b71 [ty] Splat variadic arguments into parameter list (#18996)
This PR updates our call binding logic to handle splatted arguments.

Complicating matters is that we have separated call bind analysis into
two phases: parameter matching and type checking. Parameter matching
looks at the arity of the function signature and call site, and assigns
arguments to parameters. Importantly, we don't yet know the type of each
argument! This is needed so that we can decide whether to infer the type
of each argument as a type form or value form, depending on the
requirements of the parameter that the argument was matched to.

This is an issue when splatting an argument, since we need to know how
many elements the splatted argument contains to know how many positional
parameters to match it against. And to know how many elements the
splatted argument has, we need to know its type.

To get around this, we now make the assumption that splatted arguments
can only be used with value-form parameters. (If you end up splatting an
argument into a type-form parameter, we will silently pass in its
value-form type instead.) That allows us to preemptively infer the
(value-form) type of any splatted argument, so that we have its arity
available during parameter matching. We defer inference of non-splatted
arguments until after parameter matching has finished, as before.

We reuse a lot of the new tuple machinery to make this happen — in
particular resizing the tuple spec representing the number of arguments
passed in with the tuple length representing the number of parameters
the splat was matched with.
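
A rough sketch of what this enables (the diagnostic wording is an assumption):

```py
def callee(a: int, b: str) -> None: ...


good: tuple[int, str] = (1, "two")
bad: tuple[str, int] = ("two", 1)

callee(*good)  # each element of the splat is matched to its own parameter
callee(*bad)   # error: the elements are now checked against `a: int` and `b: str`
```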

This work also shows that we might need to change how we are performing
argument expansion during overload resolution. At the moment, when we
expand parameters, we assume that each argument will still be matched to
the same parameters as before, and only retry the type-checking phase.
With splatted arguments, this is no longer the case, since the inferred
arity of each union element might be different than the arity of the
union as a whole, which can affect how many parameters the splatted
argument is matched to. See the regression test case in
`mdtest/call/function.md` for more details.
2025-07-22 14:33:08 -04:00
frank
9d5ecacdc5 [flake8-pyi] Skip fix if all Union members are None (PYI016) (#19416)
patches #19403

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-22 17:03:14 +00:00
Brent Westbrook
9af8597608 Skip notebook with errors in ecosystem check (#19491)
Summary
--

I've been noticing this failure in the formatter ecosystem check and
decided to
look into it. We fail to parse the
[notebook](https://github.com/openai/openai-cookbook/blob/main/examples/mcp/databricks_mcp_cookbook.ipynb)
because some of the `code` cells
have non-Python code in them. `ruff format` only reports one of these,
corresponding to a shell snippet, but `ruff check` emits some additional
errors
about JS code later in the file too:

```
databricks_mcp_cookbook.ipynb:cell 21:1:11: SyntaxError: Simple statements must be separated by newlines or semicolons
databricks_mcp_cookbook.ipynb:cell 21:1:19: SyntaxError: Simple statements must be separated by newlines or semicolons
databricks_mcp_cookbook.ipynb:cell 21:1:50: SyntaxError: Simple statements must be separated by newlines or semicolons
databricks_mcp_cookbook.ipynb:cell 30:4:7: SyntaxError: Simple statements must be separated by newlines or semicolons
databricks_mcp_cookbook.ipynb:cell 30:4:41: E703 Statement ends with an unnecessary semicolon
databricks_mcp_cookbook.ipynb:cell 30:5:14: SyntaxError: Expected ':', found '{'
databricks_mcp_cookbook.ipynb:cell 30:6:9: SyntaxError: Expected ',', found '{'
databricks_mcp_cookbook.ipynb:cell 30:6:25: SyntaxError: Expected ',', found '='
databricks_mcp_cookbook.ipynb:cell 30:6:46: SyntaxError: Expected ',', found ';'
databricks_mcp_cookbook.ipynb:cell 30:6:47: SyntaxError: Expected '}', found newline
databricks_mcp_cookbook.ipynb:cell 30:7:1: SyntaxError: Unexpected indentation
databricks_mcp_cookbook.ipynb:cell 30:7:13: SyntaxError: Expected ':', found 'break'
databricks_mcp_cookbook.ipynb:cell 30:7:18: E703 Statement ends with an unnecessary semicolon
databricks_mcp_cookbook.ipynb:cell 30:8:28: SyntaxError: Simple statements must be separated by newlines or semicolons
databricks_mcp_cookbook.ipynb:cell 30:8:55: E703 Statement ends with an unnecessary semicolon
databricks_mcp_cookbook.ipynb:cell 30:9:18: SyntaxError: Expected an expression
databricks_mcp_cookbook.ipynb:cell 30:10:11: SyntaxError: Expected ',', found name
databricks_mcp_cookbook.ipynb:cell 30:10:16: SyntaxError: Expected ',', found '='
databricks_mcp_cookbook.ipynb:cell 30:10:22: SyntaxError: Expected ',', found name
databricks_mcp_cookbook.ipynb:cell 30:10:24: SyntaxError: Expected ',', found ';'
databricks_mcp_cookbook.ipynb:cell 30:11:27: SyntaxError: Expected ',', found '='
databricks_mcp_cookbook.ipynb:cell 30:11:34: SyntaxError: Expected ',', found name
databricks_mcp_cookbook.ipynb:cell 30:11:48: SyntaxError: Expected ',', found ';'
databricks_mcp_cookbook.ipynb:cell 30:11:49: SyntaxError: Expected '}', found NonLogicalNewline
databricks_mcp_cookbook.ipynb:cell 30:12:1: SyntaxError: Unexpected indentation
databricks_mcp_cookbook.ipynb:cell 30:12:16: E703 Statement ends with an unnecessary semicolon
databricks_mcp_cookbook.ipynb:cell 30:13:3: SyntaxError: Expected a statement
databricks_mcp_cookbook.ipynb:cell 30:13:4: SyntaxError: Expected a statement
databricks_mcp_cookbook.ipynb:cell 30:13:5: SyntaxError: Expected a statement
databricks_mcp_cookbook.ipynb:cell 30:13:5: E703 Statement ends with an unnecessary semicolon
databricks_mcp_cookbook.ipynb:cell 30:13:6: SyntaxError: Expected a statement
databricks_mcp_cookbook.ipynb:cell 30:14:1: SyntaxError: Expected a statement
databricks_mcp_cookbook.ipynb:cell 30:14:2: SyntaxError: Expected a statement
```

Test Plan
--

This PR
2025-07-22 12:29:38 -04:00
David Peter
64e5780037 [ty] Consistent use of American english (in rules) (#19488)
## Summary

Just noticed this as a minor inconsistency in our rules, and had Claude
do a few more automated replacements.
2025-07-22 16:10:38 +02:00
David Peter
da8aa6a631 [ty] Support iterating over enums (#19486)
## Summary

Infer the correct type in a scenario like this:

```py
class Color(Enum):
    RED = 1
    GREEN = 2
    BLUE = 3

for color in Color:
    reveal_type(color)  # revealed: Color
```

We should eventually support this out-of-the-box when
https://github.com/astral-sh/ty/issues/501 is implemented. For this
reason, @AlexWaygood would prefer to keep things as they are (we
currently infer `Unknown`, so false positives seem unlikely). But it
seemed relatively easy to support, so I'm opening this for discussion.

part of https://github.com/astral-sh/ty/issues/183

## Test Plan

Adapted existing test.

## Ecosystem analysis

```diff
- warning[unused-ignore-comment] rotkehlchen/chain/aggregator.py:591:82: Unused blanket `type: ignore` directive
```

This `unused-ignore-comment` goes away due to a new true positive.
2025-07-22 16:09:28 +02:00
David Peter
ee69d38000 Fix panic for illegal Literal[…] annotations with inner subscript expressions (#19489)
## Summary

Fixes pull-types panics for illegal annotations like
`Literal[object[index]]`.

Originally reported by @AlexWaygood

## Test Plan

* Verified that this caused panics in the playground, when typing (and
potentially hovering over) `x: Literal[obj[0]]`.
* Added a regression test
2025-07-22 14:07:20 +00:00
Brent Westbrook
fd335eb8b7 Move fix suggestion to subdiagnostic (#19464)
Summary
--

This PR tweaks Ruff's internal usage of the new diagnostic model to more
closely
match the intended use, as I understand it. Specifically, it moves the
fix/help
suggestion from the primary annotation's message to a subdiagnostic. In
turn, it
adds the secondary/noqa code as the new primary annotation message. As
shown in
the new `ruff_db` tests, this more closely mirrors Ruff's current
diagnostic
output.

I also added `Severity::Help` to render the fix suggestion with a
`help:` prefix
instead of `info:`.

These changes don't have any external impact now but should help a bit
with #19415.

Test Plan
--

New full output format tests in `ruff_db`

Rendered Diagnostics
--

Full diagnostic output from `annotate-snippets` in this PR:

``` 
error[unused-import]: `os` imported but unused
  --> fib.py:1:8
   |
 1 | import os
   |        ^^
   |
 help: Remove unused import: `os`
```

Current Ruff output for the same code:

```
fib.py:1:8: F401 [*] `os` imported but unused
  |
1 | import os
  |        ^^ F401
  |
  = help: Remove unused import: `os`
```

Proposed final output after #19415:

``` 
F401 [*] `os` imported but unused
  --> fib.py:1:8
   |
 1 | import os
   |        ^^
   |
 help: Remove unused import: `os`
```

These are slightly updated from
https://github.com/astral-sh/ruff/pull/19464#issuecomment-3097377634
below to remove the extra noqa codes in the primary annotation messages
for the first and third cases.
2025-07-22 10:03:58 -04:00
Aria Desires
c82fa94e0a [ty] Implement non-stdlib stub mapping for classes and functions (#19471)
This implements mapping of definitions in stubs to definitions in the
"real" implementation using the approach described in
https://github.com/astral-sh/ty/issues/788#issuecomment-3097000287

I've tested this with goto-definition in vscode with code that uses
`colorama` and `types-colorama`.

Notably this implementation does not add support for stub-mapping stdlib
modules, which can be done as an essentially orthogonal followup in the
implementation of `resolve_real_module`.

Part of https://github.com/astral-sh/ty/issues/788
2025-07-22 12:42:55 +00:00
David Peter
6d4687c9af [ty] Disallow illegal uses of ClassVar (#19483)
## Summary

It was faster to implement this than to write the ticket: Disallow
`ClassVar` annotations almost everywhere outside of class body scopes.
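
A short sketch of what is now rejected (the diagnostic names are assumptions):

```py
from typing import ClassVar


class C:
    counter: ClassVar[int] = 0  # allowed: class-body annotation


def f(x: ClassVar[int]) -> None:  # error: `ClassVar` is not allowed here
    y: ClassVar[int] = 1          # error: nor here
```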

## Test Plan

New Markdown tests
2025-07-22 14:21:29 +02:00
David Peter
9180cd094d [ty] Disallow Final in function parameter/return-type annotations (#19480)
## Summary

Disallow `Final` in function parameter- and return-type annotations.

[Typing
spec](https://typing.python.org/en/latest/spec/qualifiers.html#uppercase-final):

> `Final` may only be used in assignments or variable annotations. Using
it in any other position is an error. In particular, `Final` can’t be
used in annotations for function arguments
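
A minimal sketch of what is now flagged (exact diagnostic wording aside):

```py
from typing import Final


def f(x: Final[int]) -> Final[int]:  # both annotations are now rejected
    return x
```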

## Test Plan

Updated MD test
2025-07-22 13:15:19 +02:00
David Peter
9d98a66f65 [ty] Extend Final test suite (#19476)
## Summary

Restructures and cleans up the `typing.Final` test suite. Also adds a
few more tests with TODOs based on the [typing spec for
`typing.Final`](https://typing.python.org/en/latest/spec/qualifiers.html#uppercase-final).
2025-07-22 12:06:47 +02:00
David Peter
cb60ecef6b [ty] Minor change to diagnostic message for invalid Literal uses (#19482) 2025-07-22 11:42:12 +02:00
David Peter
215a1c55d4 [ty] Detect illegal non-enum attribute accesses in Literal annotation (#19477)
## Summary

Detect illegal attribute accesses in `Literal[X.Y]` annotations if `X`
is not an enum class.
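
For example (a sketch; the exact error message may differ):

```py
from typing import Literal


class Color:  # a plain class, not an enum
    RED = 1


x: Literal[Color.RED]  # error: only enum members may appear inside Literal[...]
```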

## Test Plan

New Markdown test
2025-07-22 11:42:03 +02:00
Micha Reiser
5e29278aa2 [ty] Reduce size of TypeInference (#19435) 2025-07-22 11:36:36 +02:00
David Peter
af62d0368f Run MD tests for Markdown-only changes (#19479)
## Summary

Exclusions in Git pathspecs [are not
order-sensitive](https://css-tricks.com/git-pathspecs-and-how-to-use-them/#aa-exclude):

> After all other pathspecs have been resolved, all pathspecs with an
exclude signature are resolved and then removed from the returned paths.

This means that we can't write chains like we had here before to exclude
Markdown file changes *unless* they are in
`crates/ty_python_semantic/resources/mdtest`. This doesn't work. The
exclude pattern will just overwrite the second pattern and all Markdown
changes will be excluded:

```bash
':!**/*.md' \
':crates/ty_python_semantic/resources/mdtest/**/*.md' \
```

The configuration we had here before meant that tests wouldn't run on
MD-test only PRs, see e.g. https://github.com/astral-sh/ruff/pull/19476.

So here, I'm proposing to remove the broad `:!**/*.md` pattern. We can
always add more fine-grained exclusion patterns, if that's needed. The
`docs` folder is already excluded.

## Test Plan

Tested with local `git diff` invocations.
2025-07-22 11:29:07 +02:00
David Peter
30683e3a93 Revert "[ty] Detect illegal non-enum attribute accesses in Literal annotation"
This reverts commit cbc8c08016.
2025-07-22 09:19:44 +02:00
David Peter
cbc8c08016 [ty] Detect illegal non-enum attribute accesses in Literal annotation 2025-07-22 09:18:50 +02:00
UnboundVariable
897889d1ce [ty] Added semantic token support for more identifiers (#19473)
I noticed that the semantic token implementation was not handling
identifiers in a few cases. This adds support for identifiers that
appear in `except`, `case`, `nonlocal`, and `global` statements.

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-21 15:39:40 -07:00
Alex Waygood
cb5a9ff8dc [ty] Make tuple subclass constructors sound (#19469) 2025-07-21 21:25:11 +00:00
David Peter
fcdffe4ac9 [ty] Pass down specialization to generic dataclass bases (#19472)
## Summary

closes https://github.com/astral-sh/ty/issues/853
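
A sketch of the kind of code the linked issue covers (names made up; the exact signature display may differ):

```py
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")


@dataclass
class Box(Generic[T]):
    value: T


@dataclass
class IntBox(Box[int]):
    label: str


# The `int` specialization is now propagated to the inherited field,
# roughly: (self: IntBox, value: int, label: str) -> None
reveal_type(IntBox.__init__)
```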

## Test Plan

Regression test
2025-07-21 20:51:58 +02:00
Douglas Creager
88de5727df [ty] Garbage-collect reachability constraints (#19414)
This is a follow-on to #19410 that further reduces the memory usage of
our reachability constraints. When finishing the building of a use-def
map, we walk through all of the "final" states and mark only those
reachability constraints as "used". We then throw away the interior TDD
nodes of any reachability constraints that weren't marked as used.

(This helps because we build up quite a few intermediate TDD nodes when
constructing complex reachability constraints. These nodes can never be
accessed if they were _only_ used as an intermediate TDD node. The
marking step ensures that we keep any nodes that ended up being referred
to in some accessible use-def map state.)
2025-07-21 14:16:27 -04:00
David Peter
b8dec79182 [ty] Implicit instance attributes declared Final (#19462)
## Summary

Adds proper type inference for implicit instance attributes that are
declared with a "bare" `Final` and adds `invalid-assignment` diagnostics
for all implicit instance attributes that are declared `Final` or
`Final[…]`.
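
A small sketch (the exact type inferred for the bare `Final` is an assumption):

```py
from typing import Final


class Config:
    def __init__(self) -> None:
        self.timeout: Final = 10       # bare Final on an implicit instance attribute
        self.name: Final[str] = "ty"


c = Config()
reveal_type(c.timeout)  # inferred from the assigned value (e.g. `int`)
c.name = "other"        # error: invalid-assignment
```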

## Test Plan

New and updated MD tests.

## Ecosystem analysis

```diff
pytest (https://github.com/pytest-dev/pytest)
+ error[invalid-return-type] src/_pytest/fixtures.py:1662:24: Return type does not match returned value: expected `Scope`, found `Scope | (Unknown & ~None & ~((...) -> object) & ~str) | (((str, Config, /) -> Unknown) & ~((...) -> object) & ~str) | (Unknown & ~str)
```

The definition of the `scope` attribute is [here](

5f99385635/src/_pytest/fixtures.py (L1020-L1028)).
Looks like this is a new false positive due to missing `TypeAlias`
support that is surfaced here because we now infer a more precise type
for `FixtureDef._scope`.
2025-07-21 20:01:07 +02:00
David Peter
dc66019fbc [ty] Expansion of enums into unions of literals (#19382)
## Summary

Implement expansion of enums into unions of enum literals (and the
reverse operation). For the enum below, this allows us to understand
that `Color = Literal[Color.RED, Color.GREEN, Color.BLUE]`, or that
`Color & ~Literal[Color.RED] = Literal[Color.GREEN, Color.BLUE]`. This
helps in exhaustiveness checking, which is why we see some removed
`assert_never` false positives. And since exhaustiveness checking also
helps with understanding terminal control flow, we also see a few
removed `invalid-return-type` and `possibly-unresolved-reference` false
positives. This PR also adds expansion of enums in overload resolution
and type narrowing constructs.

```py
from enum import Enum
from typing_extensions import Literal, assert_never
from ty_extensions import Intersection, Not, static_assert, is_equivalent_to

class Color(Enum):
    RED = 1
    GREEN = 2
    BLUE = 3

type Red = Literal[Color.RED]
type Green = Literal[Color.GREEN]
type Blue = Literal[Color.BLUE]

static_assert(is_equivalent_to(Red | Green | Blue, Color))
static_assert(is_equivalent_to(Intersection[Color, Not[Red]], Green | Blue))


def color_name(color: Color) -> str:  # no error here (we detect that this can not implicitly return None)
    if color is Color.RED:
        return "Red"
    elif color is Color.GREEN:
        return "Green"
    elif color is Color.BLUE:
        return "Blue"
    else:
        assert_never(color)  # no error here
```

## Performance

I avoided an initial regression here for large enums, but the
`UnionBuilder` and `IntersectionBuilder` parts can certainly still be
optimized. We might want to use the same technique that we also use for
unions of other literals. I didn't see any problems in our benchmarks so
far, so this is not included yet.

## Test Plan

Many new Markdown tests
2025-07-21 19:37:55 +02:00
Micha Reiser
926e83323a [ty] Avoid rechecking the entire project when changing the opened files (#19463) 2025-07-21 18:05:03 +02:00
Micha Reiser
5cace28c3e [ty] Add warning for unknown TY_MEMORY_REPORT value (#19465) 2025-07-21 14:29:24 +00:00
github-actions[bot]
3785e13231 [ty] Sync vendored typeshed stubs (#19461)
Co-authored-by: typeshedbot <>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-07-21 14:01:42 +01:00
Alex Waygood
c2380fa0e2 [ty] Extend tuple __len__ and __bool__ special casing to also cover tuple subclasses (#19289)
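A sketch of what the title describes (assumed; there is no PR description to confirm the exact inferred types):

```py
class Pair(tuple[int, str]):
    ...


p = Pair((1, "x"))
reveal_type(len(p))   # Literal[2]
reveal_type(bool(p))  # Literal[True]
```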
Co-authored-by: Brent Westbrook
2025-07-21 12:50:46 +00:00
Alex Waygood
4dec44ae49 [ty] bump docstring-adder pin (#19458) 2025-07-21 13:38:40 +01:00
David Peter
b6579eaf04 [ty] Disallow assignment to Final class attributes (#19457)
## Summary

Emit errors for the following assignments:
```py
class C:
    CLASS_LEVEL_CONSTANT: Final[int] = 1

C.CLASS_LEVEL_CONSTANT = 2
C().CLASS_LEVEL_CONSTANT = 2
```

## Test Plan

Updated and new MD tests
2025-07-21 14:27:56 +02:00
renovate[bot]
f063c0e874 Update dependency ruff to v0.12.4 (#19442)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:32:09 +02:00
renovate[bot]
6a65734ee3 Update pre-commit hook astral-sh/ruff-pre-commit to v0.12.4 (#19443)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:31:55 +02:00
renovate[bot]
00066e094c Update rui314/setup-mold digest to 702b190 (#19441)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:31:47 +02:00
renovate[bot]
37a1958374 Update taiki-e/install-action action to v2.56.19 (#19448)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:31:18 +02:00
renovate[bot]
2535d791ae Update Rust crate strum_macros to v0.27.2 (#19447)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:31:07 +02:00
renovate[bot]
05c4399e7b Update Rust crate strum to v0.27.2 (#19446)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:30:51 +02:00
renovate[bot]
b18434b0f6 Update Rust crate rand to v0.9.2 (#19444)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:30:07 +02:00
renovate[bot]
17779c9a17 Update Rust crate serde_json to v1.0.141 (#19445)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-21 08:29:51 +02:00
Dylan
53fc0614da Fix unreachable panic in parser (#19183)
Parsing the (invalid) expression `f"{\t"i}"` caused a panic because the
`TStringMiddle` character was "unreachable" due to the way the parser
recovered from the line continuation (it ate the t-string start).

The cause of the issue is as follows: 

The parser begins parsing the f-string and expects to see a list of
objects, essentially alternating between _interpolated elements_ and
ordinary strings. It is happy to see the first left brace, but then
there is a lexical error caused by the line-continuation character. So
instead of the parser seeing a list of elements with just one member, it
sees a list that starts like this:

- Interpolated element with an invalid token, stored as a `Name`
- Something else built from tokens beginning with `TStringStart` and
`TStringMiddle`

When it sees the `TStringStart`, error recovery says "that's a list
element I don't know what to do with, let's skip it". When it sees
`TStringMiddle` it says "oh, that looks like the middle of _some
interpolated string_ so let's try to parse it as one of the literal
elements of my `FString`". Unfortunately, the function being used to
parse individual list elements thinks (arguably correctly) that it's not
possible to have a `TStringMiddle` sitting in your `FString`, and hits
`unreachable`.

Two potential ways (among many) to solve this issue are:

1. Allow a `TStringMiddle` as a valid "literal" part of an f-string
during parsing (with the hope/understanding that this would only occur
in an invalid context)
2. Skip the `TStringMiddle` as an "unexpected/invalid list item" in the
same way that we skipped `TStringStart`.

I have opted for the second approach since it seems somehow more morally
correct, even though it loses more information. To implement this, the
recovery context needs to know whether we are in an f-string or t-string
- hence the changes to that enum. As a bonus we get slightly more
specific error messages in some cases.

Closes #18860
2025-07-20 22:04:14 +00:00
Dan Parizher
59249f483b [ruff] Support byte strings (RUF055) (#18926)
## Summary

Closes #18739

## Test Plan

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-20 17:58:40 -04:00
Micha Reiser
84e76f4d04 [ty] Avoid second lookup for infer_maybe_standalone_expression (#19439) 2025-07-20 18:22:04 +02:00
UnboundVariable
0acc273286 [ty] Implemented "go to definition" support for import statements (#19428)
This PR extends the "go to declaration" and "go to definition"
functionality to support import statements — both standard imports and
"from" import forms.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-19 11:22:07 -07:00
Micha Reiser
93a9fabb26 [ty] Avoid secondary tree traversal to get call expression for keyword arguments (#19429) 2025-07-19 18:21:27 +02:00
Micha Reiser
98d1811dd1 [ty] Add goto definition to playground (#19425) 2025-07-19 15:44:44 +02:00
Aria Desires
06f9f52e59 [ty] Add support for @warnings.deprecated (#19376)
* [x] basic handling
  * [x] parse and discover `@warnings.deprecated` attributes
  * [x] associate them with function definitions
  * [x] associate them with class definitions
  * [x] add a new "deprecated" diagnostic
* [x] ensure diagnostic is styled appropriately for LSPs
(DiagnosticTag::Deprecated)

* [x] functions
  * [x] fire on calls
  * [x] fire on arbitrary references 
* [x] classes
  * [x] fire on initializers
  * [x] fire on arbitrary references
* [x] methods
  * [x] fire on calls
  * [x] fire on arbitrary references
* [ ] overloads
  * [ ] fire on calls
  * [ ] fire on arbitrary references(??? maybe not ???)
  * [ ] only fire if the actual selected overload is deprecated 

* [ ] dunder desugarring (warn on deprecated `__add__` if `+` is
invoked)
* [ ] alias supression? (don't warn on uses of variables that deprecated
items were assigned to)

* [ ] import logic
  * [x] fire on imports of deprecated items
* [ ] suppress subsequent diagnostics if the import diagnostic fired (is
this handled by alias supression?)
  * [x] fire on all qualified references (`module.mydeprecated`)
  * [x] fire on all references that depend on a `*` import
    


Fixes https://github.com/astral-sh/ty/issues/153
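
A minimal sketch of the basic behavior checked off above (using `warnings.deprecated`, available on Python 3.13+; `typing_extensions.deprecated` elsewhere):

```py
from warnings import deprecated


@deprecated("Use new_api() instead")
def old_api() -> None: ...


def new_api() -> None: ...


old_api()  # ty now emits a "deprecated" diagnostic here, tagged for LSP clients
```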
2025-07-18 23:50:29 +00:00
Jack O'Connor
e9a64e5825 [ty] make del x force local resolution of x in the current scope (#19389)
Fixes https://github.com/astral-sh/ty/issues/769.

**Updated:** The preferred approach here is to keep the SemanticIndex
simple (`del` of any name marks that name "bound" in the current scope)
and to move complexity to type inference (free variable resolution stops
when it finds a binding, unless that binding is declared `nonlocal`). As
part of this change, free variable resolution will now union the types
it finds as it walks in enclosing scopes. This approach is still
incomplete, because it doesn't consider inner scopes or sibling scopes,
but it improves the common case.
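
A minimal sketch of the scoping rule (assumed from the description; the linked issue has the original reproduction):

```py
x = 1


def f():
    del x  # `x` is now treated as bound (local) in this scope...
    print(x)  # ...so this reference no longer resolves to the global `x`
```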

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-07-18 14:58:32 -07:00
UnboundVariable
360eb7005f [ty] Added support for "go to definition" for attribute accesses and keyword arguments (#19417)
This PR builds upon #19371. It addresses a few additional code review
suggestions and adds support for attribute accesses (expressions of the
form `x.y`) and keyword arguments within call expressions.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-18 11:33:57 -07:00
Micha Reiser
630c7a3152 [ty] Reduce number of inline stored definitions per place (#19409) 2025-07-18 18:28:46 +02:00
Ibraheem Ahmed
e6e029a8b7 Update salsa (#19258)
## Summary

Pulls in https://github.com/salsa-rs/salsa/pull/934.
2025-07-18 12:14:28 -04:00
Andrew Gallant
64f9481fd0 [ty] Add caching for submodule completion suggestions (#19408)
This change makes it so we aren't doing a directory traversal every time
we ask for completions from a module. Specifically, submodules that
aren't attributes of their parent module can only be discovered by
looking at the directory tree. But we want to avoid doing a directory
scan unless we think there are changes.

To make this work, this change does a little bit of surgery to
`FileRoot`. Previously, a `FileRoot` was only used for library search
paths. Its revision was bumped whenever a file in that tree was added,
deleted, or even modified (to support the discovery of `pth` files and
changes to their contents). This generally seems fine, since these are
presumably dependency paths that shouldn't change frequently.

In this change, we add a `FileRoot` for the project. But having the
`FileRoot`'s revision bumped for every change in the project makes
caching based on that `FileRoot` rather ineffective. That is, cache
invalidation will occur too aggressively. To the point that there is
little point in adding caching in the first place. To mitigate this, a
`FileRoot`'s revision is only bumped on a change to a child file's
contents when the `FileRoot` is a `LibrarySearchPath`. Otherwise, we
only bump the revision when a file is created or deleted.

The effect is that, at least in VS Code, when a new module is added or
removed, this change is picked up and the cache is properly invalidated.
Other LSP clients with worse support for file watching (which seems to
be the case for the CoC vim plugin that I use) don't work as well. Here,
the cache is less likely to be invalidated which might cause completions
to have stale results. Unless there's an obvious way to fix or improve
this, I propose punting on improvements here for now.
2025-07-18 11:54:27 -04:00
Dhruv Manilawala
99d0ac60b4 [ty] Track open files in the server (#19264)
## Summary

This PR updates the server to keep track of open files, both system and
virtual files.

This is done by updating the project: adding the file to the open file
set in the `didOpen` notification and removing it in the `didClose`
notification.

This does mean that for workspace diagnostics, ty will only check open
files, because the different diagnostic builders first check
`is_file_open` and only add diagnostics for open files. So, this
required updating the `is_file_open` model to a `should_check_file`
model, which validates whether the file needs to be checked based on the
`CheckMode`. If the check mode is open files only then it will check
whether the file is open. If it's all files then it'll return `true` by
default.

Closes: astral-sh/ty#619

## Test Plan

### Before

There are two files in the project: `__init__.py` and `diagnostics.py`.

In the video, I'm demonstrating the old behavior where making changes to
the (open) `diagnostics.py` file results in re-parsing the file:


https://github.com/user-attachments/assets/c2ac0ecd-9c77-42af-a924-c3744b146045

### After

Same setup as above.

In the video, I'm demonstrating the new behavior where making changes to
the (open) `diagnostics.py` file doesn't result in re-parsing the file:


https://github.com/user-attachments/assets/7b82fe92-f330-44c7-b527-c841c4545f8f
2025-07-18 19:33:35 +05:30
Andrew Gallant
ba7ed3a6f9 [ty] Use `…` as the "cut" indicator in diagnostic rendering (#19420)
This makes ty match ruff's behavior. Specifically, we want to use `…`
instead of the default `...` because `...` has special significance in
Python.
2025-07-18 07:46:48 -04:00
justin
39b41838f3 [ty] synthesize __setattr__ for frozen dataclasses (#19307)
## Summary

Synthesize a `__setattr__` method with a return type of `Never` for
frozen dataclasses.

https://docs.python.org/3/library/dataclasses.html#frozen-instances

https://docs.python.org/3/library/dataclasses.html#dataclasses.FrozenInstanceError
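
A small illustrative example (not taken from the PR's tests) of the pattern this affects:

```py
from dataclasses import dataclass


@dataclass(frozen=True)
class Point:
    x: int
    y: int


p = Point(1, 2)
# With the synthesized `__setattr__` returning `Never`, ty can flag this
# assignment, which raises `dataclasses.FrozenInstanceError` at runtime.
p.x = 3
```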

### Related
https://github.com/astral-sh/ty/issues/111
https://github.com/astral-sh/ruff/pull/17974#discussion_r2108527106
https://github.com/astral-sh/ruff/pull/18347#discussion_r2128174665

## Test Plan

New Markdown tests

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-18 11:35:05 +02:00
UnboundVariable
c7640a433e [ty] Fixed bug in semantic token provider for parameters. (#19418)
This fixes https://github.com/astral-sh/ty/issues/832.

New tests were added to prevent future regressions.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-18 00:02:23 -07:00
Micha Reiser
1765014be3 [ty] Shrink reachability constraints (#19410) 2025-07-18 07:36:18 +02:00
Brent Westbrook
997dc2e7cc Move JUnit rendering to ruff_db (#19370)
Summary
--

This PR moves the JUnit output format to the new rendering
infrastructure. As I
mention in a TODO in the code, there's some code that will be shared
with the
`grouped` output format. Hopefully I'll have that PR up too by the time
this one
is reviewed.

Test Plan
--

Existing tests moved to `ruff_db`

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-17 18:24:13 -04:00
Douglas Creager
4aee0398cb [ty] Show the raw argument type in reveal_type (#19400)
This PR changes how `reveal_type` determines what type to reveal, in
a way that should be a no-op to most callers.

Previously, we would reveal the type of the first parameter, _after_ all
of the call binding machinery had done its work. This includes inferring
the specialization of a generic function, and then applying that
specialization to all parameter and argument types, which is relevant
since the typeshed definition of `reveal_type` is generic:

```pyi
def reveal_type(obj: _T, /) -> _T: ...
```

Normally this does not matter, since we infer `_T = [arg type]` and
apply that to the parameter type, yielding `[arg type]`. But applying
that specialization also simplifies the argument type, which makes
`reveal_type` less useful as a debugging aid when we want to see the
actual, raw, unsimplified argument type.

With this patch, we now grab the original unmodified argument type and
reveal that instead.

In addition to making the debugging aid example work, this also makes
our `reveal_type` implementation more robust to custom typeshed
definitions, such as

```py
def reveal_type(obj: Any) -> Any: ...
```

(That custom definition is probably not what anyone would want, since
you wouldn't be able to depend on the return type being equivalent to
the argument type, but still)
2025-07-17 16:50:29 -04:00
Brent Westbrook
1fd9103e81 Canonicalize path before filtering (#19407)
## Summary

This came up on
[Discord](https://discord.com/channels/1039017663004942429/1343692072921731082/1395447082520678440)
and also in #19387, but on macOS the tmp directory is a symlink to
`/private/tmp`, which breaks this filter. I'm still not quite sure why
only these tests are affected when we use the `tempdir_filter`
elsewhere, but hopefully this fixes the immediate issue. Just
`tempdir.path().canonicalize()` also worked, but I used `dunce` since
that's what I saw in other tests (I guess it's not _just_ these tests).

Some related links from uv:
-
1b2f212e8b/crates/uv/tests/it/common/mod.rs (L1161-L1178)
-
1b2f212e8b/crates/uv/tests/it/common/mod.rs (L424-L438)
- https://github.com/astral-sh/uv/pull/14290

Thanks to @zanieb for those!

## Test Plan

I tested the `main` branch on my MacBook and reproduced the test
failure, then confirmed that the tests pass after the change. Now to
make sure it passes on Windows, which caused most of the trouble in the
first PR!
2025-07-17 14:02:17 -04:00
Dylan
ee2759b365 Bump 0.12.4 (#19406) 2025-07-17 12:14:01 -05:00
Aria Desires
35f33d9bf5 [ty] publish settings diagnostics (#19335) 2025-07-17 11:57:00 -04:00
chiri
5d78b3117a [flake8-use-pathlib] Add autofix for PTH109 (#19245)
## Summary

Part of #2331
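
For context, a minimal sketch (not from the PR's fixtures) of the pattern PTH109 flags and the replacement the new autofix produces:

```py
import os
from pathlib import Path

cwd = os.getcwd()  # PTH109: `os.getcwd()` should be replaced by `Path.cwd()`
cwd = Path.cwd()   # what the autofix rewrites it to
```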

## Test Plan

`cargo nextest run flake8_use_pathlib`
2025-07-17 10:11:43 -04:00
Dhruv Manilawala
c2a05b4825 [ty] Use bitflags for resolved client capabilities (#19393)
## Summary

This PR updates the `ResolvedClientCapabilities` to be represented as
`bitflags`. This allows us to remove the `Arc` as the type becomes copy.

Additionally, this PR also fixed the goto definition and declaration
code to use the `textDocument.definition.linkSupport` and
`textDocument.declaration.linkSupport` client capability.

This PR also removes the unused client capabilities which are
`code_action_deferred_edit_resolution`, `apply_edit`, and
`document_changes` which are all related to auto-fix ability.
2025-07-17 15:31:47 +05:30
UnboundVariable
fae0b5c89e [ty] Initial implementation of declaration and definition providers. (#19371)
This PR implements "go to definition" and "go to declaration"
functionality for name nodes only. Future PRs will add support for
attributes, module names in import statements, keyword argument names,
etc.

This PR:
* Registers a declaration and definition request handler for the
language server.
* Splits out the `goto_type_definition` into its own module. The `goto`
module contains functionality that is common to `goto_type_definition`,
`goto_declaration` and `goto_definition`.
* Roughs in a new module `stub_mapping` that is not yet implemented. It
will be responsible for mapping a definition in a stub file to its
corresponding definition(s) in an implementation (source) file.
* Adds a new IDE support function `definitions_for_name` that collects
all of the definitions associated with a name and resolves any imports
(recursively) to find the original definitions associated with that
name.
* Adds a new `VisibleAncestorsIter` struct that iterates up the scope
hierarchy but skips scopes that are not visible to the starting scope.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-16 15:07:24 -07:00
Matthew Mckee
cbe94b094b [ty] Support empty function bodies in if TYPE_CHECKING blocks (#19372)
## Summary

Resolves https://github.com/astral-sh/ty/issues/339

Supports having a blank function body inside an `if TYPE_CHECKING` block or
in the `elif` or `else` of an `if not TYPE_CHECKING` block.

```py
if TYPE_CHECKING:
    def foo() -> int: ...

if not TYPE_CHECKING: ...
else:     
    def bar() -> int: ...
```

## Test Plan

Update `function/return_type.md`

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-07-16 14:48:04 -06:00
Dan Parizher
029de784f1 [flake8-use-pathlib] Fix false negative on direct Path() instantiation (PTH210) (#19388)
## Summary

Fixes #19329
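
An illustrative example (assumed from the rule's description, not taken from the PR) of the previously missed pattern:

```py
from pathlib import Path

# The false negative: `.with_suffix()` called directly on a `Path(...)`
# constructor call was previously not flagged.
Path("foo.txt").with_suffix("py")  # PTH210: suffix should start with a dot (".py")
```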

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-16 20:43:03 +00:00
frank
ff94fe7447 Treat form feed as valid whitespace before a semicolon (#19343)
fixes #19310
2025-07-16 16:39:05 -04:00
Dan Parizher
b2501b45e0 [pylint] Detect indirect pathlib.Path usages for unspecified-encoding (PLW1514) (#19304)
## Summary

Fixes #19294
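
A minimal sketch (assumed, not from the PR) of the indirect usage now detected:

```py
from pathlib import Path

path = Path("data.txt")
# The `Path` value reaches `.open()` indirectly through a variable; the rule
# now recognizes this and flags the missing `encoding` argument.
with path.open() as f:  # PLW1514: consider `path.open(encoding="utf-8")`
    contents = f.read()
```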

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-07-16 12:57:31 -04:00
Dan Parizher
291699b375 [refurb] FURB164 fix should validate arguments and should usually be marked unsafe (#19136)
## Summary

Fixes #19076

An attempt at fixing #19076 where the rule could change program behavior
by incorrectly converting from_float/from_decimal method calls to
constructor calls.

The fix implements argument validation using Ruff's existing type
inference system (`ResolvedPythonType`, `typing::is_int`,
`typing::is_float`) to determine when conversions are actually safe,
adds logic to detect invalid method calls (wrong argument counts,
incorrect keyword names) and suppress fixes for them, and changes the
default fix applicability from `Safe` to `Unsafe` with safe fixes only
offered when the argument type is known to be compatible and no
problematic keywords are used.

One uncertainty is whether the type inference catches all possible edge
cases in complex codebases, but the new approach is significantly more
conservative and safer than the previous implementation.
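
An illustrative example (not from the PR's fixtures) of how the rule now behaves:

```py
from decimal import Decimal
from fractions import Fraction


def build(value):
    x = Decimal.from_float(4.5)       # FURB164: can be written as `Decimal(4.5)`
    y = Fraction.from_decimal(value)  # FURB164: the fix is now offered as unsafe,
                                      # since the type of `value` is unknown
    return x, y
```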

## Test Plan

I updated the existing test fixtures with edge cases from the issue and
manually verified behavior with temporary test files for
valid/unsafe/invalid scenarios.

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-07-16 15:38:33 +00:00
Jack O'Connor
64ac7d7dbf [ty] use the name span rather than the statement span for unresolved global lints (#19379)
This is a follow-up to https://github.com/astral-sh/ruff/pull/19344 that
improves the error formatting slightly. For example with this program:

```py
def f():
    global foo, bar
```

Before we printed:

```
1 | def f():
2 |     global foo, bar
  |     ^^^^^^^^^^^^^^^ `foo` has no declarations or bindings in the global scope
...
1 | def f():
2 |     global foo, bar
  |     ^^^^^^^^^^^^^^^ `bar` has no declarations or bindings in the global scope
```

Now we print:

```
1 | def f():
2 |     global foo, bar
  |            ^^^ `foo` has no declarations or bindings in the global scope
...
1 | def f():
2 |     global foo, bar
  |                 ^^^ `bar` has no declarations or bindings in the global scope
```

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-16 15:34:47 +00:00
Jack O'Connor
5f2e855c29 allow reads of "free" variables to refer to a global declaration
Previously this worked if there was also a binding in the same scope as
the `global` declaration (probably almost always the case), but CPython
doesn't require this.

This change surfaced an error in an existing test, where a global
variable was only ever declared and bound using the `global` keyword,
and never mentioned explicitly in the global scope. @AlexWaygood
suggested we probably want to keep that requirement, so I'm adding a
new test for that on top of fixing the failing test.
2025-07-16 08:30:42 -07:00
Jack O'Connor
3b4667ec32 respect annotation-only declarations in infer_place_load 2025-07-16 08:30:42 -07:00
Brent Westbrook
893f5727e5 [flake8-type-checking, pyupgrade, ruff] Add from __future__ import annotations when it would allow new fixes (TC001, TC002, TC003, UP037, RUF013) (#19100)
## Summary

This is a second attempt at addressing
https://github.com/astral-sh/ruff/issues/18502 instead of reusing
`FA100` (#18919).

This PR:
- adds a new `lint.allow-importing-future-annotations` option
- uses the option to add a `__future__` import when it would trigger
`TC001`, `TC002`, or `TC003`
- uses the option to add an import when it would allow unquoting more
annotations in [quoted-annotation
(UP037)](https://docs.astral.sh/ruff/rules/quoted-annotation/#quoted-annotation-up037)
- uses the option to allow the `|` union syntax before 3.10 in
[implicit-optional
(RUF013)](https://docs.astral.sh/ruff/rules/implicit-optional/#implicit-optional-ruf013)

I started adding a fix for [runtime-string-union
(TC010)](https://docs.astral.sh/ruff/rules/runtime-string-union/#runtime-string-union-tc010)
too, as mentioned in my previous
[comment](https://github.com/astral-sh/ruff/issues/18502#issuecomment-3005238092),
but some of the existing tests already imported `from __future__ import
annotations`, so I think we intentionally flag these cases for the user
to inspect. Adding the import is _a_ fix but probably not the best one.
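
A rough sketch (hypothetical code; the option name is taken from the summary above) of the kind of fix enabling the option unlocks for TC002:

```py
# With `lint.allow-importing-future-annotations` enabled, TC002's fix can add
# the `__future__` import itself, so the runtime import can move into an
# `if TYPE_CHECKING:` block.
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import pandas as pd


def summarize(df: pd.DataFrame) -> None: ...
```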

## Test Plan

Existing `TC` tests, new copies of them with the option enabled, and new
tests based on ideas in
https://github.com/astral-sh/ruff/pull/18919#discussion_r2166292705 and
the following thread. For UP037 and RUF013, the new tests are also
copies of the existing tests, with the new option enabled. The easiest
way to review them is probably by their diffs from the existing
snapshots:

### UP037

`UP037_0.py` and `UP037_2.pyi` have no diffs. The diff for `UP037_1.py`
is below. It correctly unquotes an annotation in module scope that would
otherwise be invalid.

<details><summary>UP037_1.py</summary>

```diff
3d2
< snapshot_kind: text
23c22,42
< 12 12 |
---
> 12 12 |
>
> UP037_1.py:14:4: UP037 [*] Remove quotes from type annotation
>    |
> 13 | # OK
> 14 | X: "Tuple[int, int]" = (0, 0)
>    |    ^^^^^^^^^^^^^^^^^ UP037
>    |
>    = help: Remove quotes
>
> ℹ Unsafe fix
>    1  |+from __future__ import annotations
> 1  2  | from typing import TYPE_CHECKING
> 2  3  |
> 3  4  | if TYPE_CHECKING:
> --------------------------------------------------------------------------------
> 11 12 |
> 12 13 |
> 13 14 | # OK
> 14    |-X: "Tuple[int, int]" = (0, 0)
>    15 |+X: Tuple[int, int] = (0, 0)
```

</details>

### RUF013

The diffs here are mostly just the imports because the original snaps
were on 3.13. So we're getting the same fixes now on 3.9.

<details><summary>RUF013_0.py</summary>

```diff
3d2
< snapshot_kind: text
14,16c13,20
< 17 17 |     pass
< 18 18 | 
< 19 19 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 17 18 |     pass
> 18 19 | 
> 19 20 | 
18,21c22,25
<    20 |+def f(arg: int | None = None):  # RUF013
< 21 21 |     pass
< 22 22 | 
< 23 23 | 
---
>    21 |+def f(arg: int | None = None):  # RUF013
> 21 22 |     pass
> 22 23 | 
> 23 24 | 
32,34c36,43
< 21 21 |     pass
< 22 22 | 
< 23 23 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 21 22 |     pass
> 22 23 | 
> 23 24 | 
36,39c45,48
<    24 |+def f(arg: str | None = None):  # RUF013
< 25 25 |     pass
< 26 26 | 
< 27 27 | 
---
>    25 |+def f(arg: str | None = None):  # RUF013
> 25 26 |     pass
> 26 27 | 
> 27 28 | 
50,52c59,66
< 25 25 |     pass
< 26 26 | 
< 27 27 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 25 26 |     pass
> 26 27 | 
> 27 28 | 
54,57c68,71
<    28 |+def f(arg: Tuple[str] | None = None):  # RUF013
< 29 29 |     pass
< 30 30 | 
< 31 31 | 
---
>    29 |+def f(arg: Tuple[str] | None = None):  # RUF013
> 29 30 |     pass
> 30 31 | 
> 31 32 | 
68,70c82,89
< 55 55 |     pass
< 56 56 | 
< 57 57 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 55 56 |     pass
> 56 57 | 
> 57 58 | 
72,75c91,94
<    58 |+def f(arg: Union | None = None):  # RUF013
< 59 59 |     pass
< 60 60 | 
< 61 61 | 
---
>    59 |+def f(arg: Union | None = None):  # RUF013
> 59 60 |     pass
> 60 61 | 
> 61 62 | 
86,88c105,112
< 59 59 |     pass
< 60 60 | 
< 61 61 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 59 60 |     pass
> 60 61 | 
> 61 62 | 
90,93c114,117
<    62 |+def f(arg: Union[int] | None = None):  # RUF013
< 63 63 |     pass
< 64 64 | 
< 65 65 | 
---
>    63 |+def f(arg: Union[int] | None = None):  # RUF013
> 63 64 |     pass
> 64 65 | 
> 65 66 | 
104,106c128,135
< 63 63 |     pass
< 64 64 | 
< 65 65 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 63 64 |     pass
> 64 65 | 
> 65 66 | 
108,111c137,140
<    66 |+def f(arg: Union[int, str] | None = None):  # RUF013
< 67 67 |     pass
< 68 68 | 
< 69 69 | 
---
>    67 |+def f(arg: Union[int, str] | None = None):  # RUF013
> 67 68 |     pass
> 68 69 | 
> 69 70 | 
122,124c151,158
< 82 82 |     pass
< 83 83 | 
< 84 84 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 82 83 |     pass
> 83 84 | 
> 84 85 | 
126,129c160,163
<    85 |+def f(arg: int | float | None = None):  # RUF013
< 86 86 |     pass
< 87 87 | 
< 88 88 | 
---
>    86 |+def f(arg: int | float | None = None):  # RUF013
> 86 87 |     pass
> 87 88 | 
> 88 89 | 
140,142c174,181
< 86 86 |     pass
< 87 87 | 
< 88 88 | 
---
>    1  |+from __future__ import annotations
> 1  2  | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 86 87 |     pass
> 87 88 | 
> 88 89 | 
144,147c183,186
<    89 |+def f(arg: int | float | str | bytes | None = None):  # RUF013
< 90 90 |     pass
< 91 91 | 
< 92 92 | 
---
>    90 |+def f(arg: int | float | str | bytes | None = None):  # RUF013
> 90 91 |     pass
> 91 92 | 
> 92 93 | 
158,160c197,204
< 105 105 |     pass
< 106 106 | 
< 107 107 | 
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 105 106 |     pass
> 106 107 | 
> 107 108 | 
162,165c206,209
<     108 |+def f(arg: Literal[1] | None = None):  # RUF013
< 109 109 |     pass
< 110 110 | 
< 111 111 | 
---
>     109 |+def f(arg: Literal[1] | None = None):  # RUF013
> 109 110 |     pass
> 110 111 | 
> 111 112 | 
176,178c220,227
< 109 109 |     pass
< 110 110 | 
< 111 111 | 
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 109 110 |     pass
> 110 111 | 
> 111 112 | 
180,183c229,232
<     112 |+def f(arg: Literal[1, "foo"] | None = None):  # RUF013
< 113 113 |     pass
< 114 114 | 
< 115 115 | 
---
>     113 |+def f(arg: Literal[1, "foo"] | None = None):  # RUF013
> 113 114 |     pass
> 114 115 | 
> 115 116 | 
194,196c243,250
< 128 128 |     pass
< 129 129 | 
< 130 130 | 
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 128 129 |     pass
> 129 130 | 
> 130 131 | 
198,201c252,255
<     131 |+def f(arg: Annotated[int | None, ...] = None):  # RUF013
< 132 132 |     pass
< 133 133 | 
< 134 134 | 
---
>     132 |+def f(arg: Annotated[int | None, ...] = None):  # RUF013
> 132 133 |     pass
> 133 134 | 
> 134 135 | 
212,214c266,273
< 132 132 |     pass
< 133 133 | 
< 134 134 | 
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 132 133 |     pass
> 133 134 | 
> 134 135 | 
216,219c275,278
<     135 |+def f(arg: Annotated[Annotated[int | str | None, ...], ...] = None):  # RUF013
< 136 136 |     pass
< 137 137 | 
< 138 138 | 
---
>     136 |+def f(arg: Annotated[Annotated[int | str | None, ...], ...] = None):  # RUF013
> 136 137 |     pass
> 137 138 | 
> 138 139 | 
232,234c291,298
< 148 148 | 
< 149 149 | 
< 150 150 | def f(
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 148 149 | 
> 149 150 | 
> 150 151 | def f(
236,239c300,303
<     151 |+    arg1: int | None = None,  # RUF013
< 152 152 |     arg2: Union[int, float] = None,  # RUF013
< 153 153 |     arg3: Literal[1, 2, 3] = None,  # RUF013
< 154 154 | ):
---
>     152 |+    arg1: int | None = None,  # RUF013
> 152 153 |     arg2: Union[int, float] = None,  # RUF013
> 153 154 |     arg3: Literal[1, 2, 3] = None,  # RUF013
> 154 155 | ):
253,255c317,324
< 149 149 | 
< 150 150 | def f(
< 151 151 |     arg1: int = None,  # RUF013
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 149 150 | 
> 150 151 | def f(
> 151 152 |     arg1: int = None,  # RUF013
257,260c326,329
<     152 |+    arg2: Union[int, float] | None = None,  # RUF013
< 153 153 |     arg3: Literal[1, 2, 3] = None,  # RUF013
< 154 154 | ):
< 155 155 |     pass
---
>     153 |+    arg2: Union[int, float] | None = None,  # RUF013
> 153 154 |     arg3: Literal[1, 2, 3] = None,  # RUF013
> 154 155 | ):
> 155 156 |     pass
274,276c343,350
< 150 150 | def f(
< 151 151 |     arg1: int = None,  # RUF013
< 152 152 |     arg2: Union[int, float] = None,  # RUF013
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 150 151 | def f(
> 151 152 |     arg1: int = None,  # RUF013
> 152 153 |     arg2: Union[int, float] = None,  # RUF013
278,281c352,355
<     153 |+    arg3: Literal[1, 2, 3] | None = None,  # RUF013
< 154 154 | ):
< 155 155 |     pass
< 156 156 | 
---
>     154 |+    arg3: Literal[1, 2, 3] | None = None,  # RUF013
> 154 155 | ):
> 155 156 |     pass
> 156 157 | 
292,294c366,373
< 178 178 |     pass
< 179 179 | 
< 180 180 | 
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 178 179 |     pass
> 179 180 | 
> 180 181 | 
296,299c375,378
<     181 |+def f(arg: Union[Annotated[int, ...], Union[str, bytes]] | None = None):  # RUF013
< 182 182 |     pass
< 183 183 | 
< 184 184 | 
---
>     182 |+def f(arg: Union[Annotated[int, ...], Union[str, bytes]] | None = None):  # RUF013
> 182 183 |     pass
> 183 184 | 
> 184 185 | 
307c386
<     = help: Convert to `T | None`
---
>     = help: Convert to `Optional[T]`
314c393
<     188 |+def f(arg: "int | None" = None):  # RUF013
---
>     188 |+def f(arg: "Optional[int]" = None):  # RUF013
325c404
<     = help: Convert to `T | None`
---
>     = help: Convert to `Optional[T]`
332c411
<     192 |+def f(arg: "str | None" = None):  # RUF013
---
>     192 |+def f(arg: "Optional[str]" = None):  # RUF013
343c422
<     = help: Convert to `T | None`
---
>     = help: Convert to `Optional[T]`
354,356c433,440
< 201 201 |     pass
< 202 202 | 
< 203 203 | 
---
>     1   |+from __future__ import annotations
> 1   2   | from typing import Annotated, Any, Literal, Optional, Tuple, Union, Hashable
> 2   3   | 
> 3   4   | 
> --------------------------------------------------------------------------------
> 201 202 |     pass
> 202 203 | 
> 203 204 | 
358,361c442,445
<     204 |+def f(arg: Union["int", "str"] | None = None):  # RUF013
< 205 205 |     pass
< 206 206 | 
< 207 207 |
---
>     205 |+def f(arg: Union["int", "str"] | None = None):  # RUF013
> 205 206 |     pass
> 206 207 | 
> 207 208 |
```

</details>

<details><summary>RUF013_1.py</summary>

```diff
3d2
< snapshot_kind: text
15,16c14,16
< 2 2 |
< 3 3 |
---
>   2 |+from __future__ import annotations
> 2 3 |
> 3 4 |
18,19c18,19
<   4 |+def f(arg: int | None = None):  # RUF013
< 5 5 |     pass
---
>   5 |+def f(arg: int | None = None):  # RUF013
> 5 6 |     pass
```

</details>

<details><summary>RUF013_3.py</summary>

```diff
3d2
< snapshot_kind: text
14,16c13,16
< 1 1 | import typing
< 2 2 | 
< 3 3 | 
---
>   1 |+from __future__ import annotations
> 1 2 | import typing
> 2 3 | 
> 3 4 | 
18,21c18,21
<   4 |+def f(arg: typing.List[str] | None = None):  # RUF013
< 5 5 |     pass
< 6 6 | 
< 7 7 | 
---
>   5 |+def f(arg: typing.List[str] | None = None):  # RUF013
> 5 6 |     pass
> 6 7 | 
> 7 8 | 
32,34c32,39
< 19 19 |     pass
< 20 20 | 
< 21 21 | 
---
>    1  |+from __future__ import annotations
> 1  2  | import typing
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 19 20 |     pass
> 20 21 | 
> 21 22 | 
36,39c41,44
<    22 |+def f(arg: typing.Union[int, str] | None = None):  # RUF013
< 23 23 |     pass
< 24 24 | 
< 25 25 | 
---
>    23 |+def f(arg: typing.Union[int, str] | None = None):  # RUF013
> 23 24 |     pass
> 24 25 | 
> 25 26 | 
50,52c55,62
< 26 26 | # Literal
< 27 27 | 
< 28 28 | 
---
>    1  |+from __future__ import annotations
> 1  2  | import typing
> 2  3  | 
> 3  4  | 
> --------------------------------------------------------------------------------
> 26 27 | # Literal
> 27 28 | 
> 28 29 | 
54,55c64,65
<    29 |+def f(arg: typing.Literal[1, "foo", True] | None = None):  # RUF013
< 30 30 |     pass
---
>    30 |+def f(arg: typing.Literal[1, "foo", True] | None = None):  # RUF013
> 30 31 |     pass
```

</details>

<details><summary>RUF013_4.py</summary>

```diff
3d2
< snapshot_kind: text
13,15c12,20
< 12 12 | def multiple_1(arg1: Optional, arg2: Optional = None): ...
< 13 13 |
< 14 14 |
---
> 1  1  | # https://github.com/astral-sh/ruff/issues/13833
>    2  |+from __future__ import annotations
> 2  3  |
> 3  4  | from typing import Optional
> 4  5  |
> --------------------------------------------------------------------------------
> 12 13 | def multiple_1(arg1: Optional, arg2: Optional = None): ...
> 13 14 |
> 14 15 |
17,20c22,25
<    15 |+def multiple_2(arg1: Optional, arg2: Optional = None, arg3: int | None = None): ...
< 16 16 |
< 17 17 |
< 18 18 | def return_type(arg: Optional = None) -> Optional: ...
---
>    16 |+def multiple_2(arg1: Optional, arg2: Optional = None, arg3: int | None = None): ...
> 16 17 |
> 17 18 |
> 18 19 | def return_type(arg: Optional = None) -> Optional: ...
```

</details>

## Future work

This PR does not touch UP006, UP007, or UP045, which are currently
coupled to FA100. If this new approach turns out well, we may eventually
want to deprecate FA100 and add a `__future__` import in those rules'
fixes too.

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-16 08:50:52 -04:00
Cornelius Roemer
b8dddd514f chore: Document Material for MkDocs Insiders limitations in CONTRIBUTING.md in more detail (#19373)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-16 08:42:25 +00:00
Jack O'Connor
e73a8ba571 lint on the global keyword if there's no explicit definition in the global scope 2025-07-15 16:56:54 -07:00
David Peter
a1edb69ea5 [ty] Enum literal types (#19328)
## Summary

Add a new `Type::EnumLiteral(…)` variant and infer this type for member
accesses on enums.

**Example**: No more `@Todo` types here:
```py
from enum import Enum

class Answer(Enum):
    YES = 1
    NO = 2

    def is_yes(self) -> bool:
        return self == Answer.YES

reveal_type(Answer.YES)  # revealed: Literal[Answer.YES]
reveal_type(Answer.YES == Answer.NO)  # revealed: Literal[False]
reveal_type(Answer.YES.is_yes())  # revealed: bool
```

## Test Plan

* Many new Markdown tests for the new type variant
* Added enum literal types to property tests, ran property tests

## Ecosystem analysis

Summary:

Lots of false positives removed. All of the new diagnostics are
either new true positives (the majority) or known problems.

Details:

```diff
AutoSplit (https://github.com/Toufool/AutoSplit)
+ error[call-non-callable] src/capture_method/__init__.py:137:9: Method `__getitem__` of type `bound method CaptureMethodDict.__getitem__(key: Never, /) -> type[CaptureMethodBase]` is not callable on object of type `CaptureMethodDict`
+ error[call-non-callable] src/capture_method/__init__.py:147:9: Method `__getitem__` of type `bound method CaptureMethodDict.__getitem__(key: Never, /) -> type[CaptureMethodBase]` is not callable on object of type `CaptureMethodDict`
+ error[call-non-callable] src/capture_method/__init__.py:148:1: Method `__getitem__` of type `bound method CaptureMethodDict.__getitem__(key: Never, /) -> type[CaptureMethodBase]` is not callable on object of type `CaptureMethodDict`
```

New true positives. That `__getitem__` method is apparently annotated
with `Never` to prevent developers from using it.


```diff
dd-trace-py (https://github.com/DataDog/dd-trace-py)
+ error[invalid-assignment] ddtrace/vendor/psutil/_common.py:29:5: Object of type `None` is not assignable to `Literal[AddressFamily.AF_INET6]`
+ error[invalid-assignment] ddtrace/vendor/psutil/_common.py:33:5: Object of type `None` is not assignable to `Literal[AddressFamily.AF_UNIX]`
```

Arguably true positives:
e0a772c28b/ddtrace/vendor/psutil/_common.py (L29)

```diff
ignite (https://github.com/pytorch/ignite)
+ error[invalid-argument-type] tests/ignite/engine/test_custom_events.py:190:34: Argument to bound method `__call__` is incorrect: Expected `((...) -> Unknown) | None`, found `Literal["123"]`
+ error[invalid-argument-type] tests/ignite/engine/test_custom_events.py:220:37: Argument to function `default_event_filter` is incorrect: Expected `Engine`, found `None`
+ error[invalid-argument-type] tests/ignite/engine/test_custom_events.py:220:43: Argument to function `default_event_filter` is incorrect: Expected `int`, found `None`
+ error[call-non-callable] tests/ignite/engine/test_custom_events.py:561:9: Object of type `CustomEvents` is not callable
+ error[invalid-argument-type] tests/ignite/metrics/test_frequency.py:50:38: Argument to bound method `attach` is incorrect: Expected `Events`, found `CallableEventWithFilter`
```

All true positives. Some of them are inside `pytest.raises(TypeError,
…)` blocks 🙃

```diff
meson (https://github.com/mesonbuild/meson)
+ error[invalid-argument-type] unittests/internaltests.py:243:51: Argument to bound method `__init__` is incorrect: Expected `bool`, found `Literal[MachineChoice.HOST]`
+ error[invalid-argument-type] unittests/internaltests.py:271:51: Argument to bound method `__init__` is incorrect: Expected `bool`, found `Literal[MachineChoice.HOST]`
```

New true positives. Enum literals cannot be assigned to `bool`, even if
their value types are `0` and `1`.

```diff
poetry (https://github.com/python-poetry/poetry)
+ error[invalid-assignment] src/poetry/console/exceptions.py:101:5: Object of type `Literal[""]` is not assignable to `InitVar[str]`
```

New false positive, missing support for `InitVar`.

```diff
prefect (https://github.com/PrefectHQ/prefect)
+ error[invalid-argument-type] src/integrations/prefect-dask/tests/test_task_runners.py:193:17: Argument is incorrect: Expected `StateType`, found `Literal[StateType.COMPLETED]`
```

This is confusing. There are two definitions
([one](74d8cd93ee/src/prefect/client/schemas/objects.py (L89-L100)),
[two](https://github.com/PrefectHQ/prefect/blob/main/src/prefect/server/schemas/states.py#L40))
of the `StateType` enum. Here, we're trying to assign one to the other.
I don't think that should be allowed, so this is a true positive (?).

```diff
python-htmlgen (https://github.com/srittau/python-htmlgen)
+ error[invalid-assignment] test_htmlgen/form.py:51:9: Object of type `str` is not assignable to attribute `autocomplete` of type `Autocomplete | None`
+ error[invalid-assignment] test_htmlgen/video.py:38:9: Object of type `str` is not assignable to attribute `preload` of type `Preload | None`
```

True positives. [The stubs are
wrong](01e3b911ac/htmlgen/form.pyi (L8-L10)).
These should not contain type annotations, but rather just `OFF = ...`.

```diff
rotki (https://github.com/rotki/rotki)
+ error[invalid-argument-type] rotkehlchen/tests/unit/test_serialization.py:62:30: Argument to bound method `deserialize` is incorrect: Expected `str`, found `Literal[15]`
```

New true positive.

```diff
vision (https://github.com/pytorch/vision)
+ error[unresolved-attribute] test/test_extended_models.py:302:17: Type `type[WeightsEnum]` has no attribute `DEFAULT`
+ error[unresolved-attribute] test/test_extended_models.py:302:58: Type `type[WeightsEnum]` has no attribute `DEFAULT`
```

Also new true positives. No `DEFAULT` member exists on `WeightsEnum`.
2025-07-15 21:31:53 +02:00
github-actions[bot]
a0d4e1f854 [ty] Sync vendored typeshed stubs (#19368)
Co-authored-by: typeshedbot <>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-07-15 18:14:46 +00:00
Alex Waygood
c0d04f2d56 Fix typeshed-sync workflow (#19367) 2025-07-15 19:07:38 +01:00
Alex Waygood
8d7d02193e Rework typeshed-sync workflow to also add docstrings for Windows- and MacOS-specific APIs (#19360) 2025-07-15 18:14:32 +01:00
Zanie Blue
78dfc8af0f [ty] Allow -qq for silent output mode (#19366)
This matches uv's behavior.

Briefly discussed at
https://github.com/astral-sh/ruff/pull/19233#discussion_r2197930360

I think the most useful case is to avoid piping to `/dev/null`, which is
hard to do properly in a cross-platform script.
2025-07-15 17:08:19 +00:00
Zanie Blue
0c84652cc5 [ty] Allow -q short alias for --quiet (#19364) 2025-07-15 12:00:07 -05:00
Alex Waygood
560ae04346 Add shellcheck to pre-commit (#19361) 2025-07-15 16:49:13 +00:00
Jack O'Connor
a357a68fc9 distinguish references from definitions in infer_nonlocal
The initial implementation of `infer_nonlocal`, which landed in
https://github.com/astral-sh/ruff/pull/19112, fails to report an error
for this example:

```py
x = 1
def f():
    # This is only a usage of `x`, not a definition. It shouldn't be
    # enough to make the `nonlocal` statement below allowed.
    print(x)
    def g():
        nonlocal x
```

Fix this by continuing to walk enclosing scopes when the place we've
found isn't bound, declared, or `nonlocal`.
2025-07-15 07:55:40 -07:00
Dylan
00e7d1ffd6 [pycodestyle] Handle brace escapes for t-strings in logical lines (#19358)
Tracks both f and t-strings in the logical line rules for `pycodestyle`.

Progress towards #15506
2025-07-15 14:48:48 +00:00
Douglas Creager
f4d0273532 [ty] Combine CallArguments and CallArgumentTypes (#19337)
We previously had separate `CallArguments` and `CallArgumentTypes` types
in support of our two-phase call binding logic. `CallArguments` would
store only the arity/kind of each argument (positional, keyword,
variadic, etc). We then performed parameter matching using only this
arity/kind information, and then infered the type of each argument,
placing the result of this second phase into a new `CallArgumentTypes`.

In #18996, we will need to infer the types of splatted arguments
_before_ performing parameter matching, since we need to know the
argument type to accurately infer its length, which informs how many
parameters the splatted argument is matched against.

That makes this separation of Rust types no longer useful. This PR
merges everything back into a single `CallArguments`. In the case where
we are performing two-phase call binding, the types will be initialized
to `None`, and updated to the actual argument type during the second
`check_types` phase.

_[This is a refactoring in support of fixing the merge conflicts on
#18996. I've pulled this out into a separate PR to make it easier to
review in isolation.]_
2025-07-15 10:20:58 -04:00
Brent Westbrook
e9cac3684a Move Pylint rendering to ruff_db (#19340)
Summary
--

This is a very simple output format, the only decision is what to do if
the file
is missing from the diagnostic. For now, I opted to `unwrap_or_default`
both the
path and the `OneIndexed` row number, giving `:1: main diagnostic
message` in
the test without a file.

Another quirk here is that the path is relativized. I just pasted in the
`relativize_path` and `get_cwd` implementations from `ruff_linter::fs`
for now,
but maybe there's a better place for them.

I didn't see any details about why this needs to be relativized in the
original
[issue](https://github.com/astral-sh/ruff/issues/1953),
[PR](https://github.com/astral-sh/ruff/pull/1995), or in the pylint

[docs](https://flake8.pycqa.org/en/latest/internal/formatters.html#pylint-formatter),
but it did change the results of the CLI integration test when I tried
deleting
it. I haven't been able to reproduce that in the CLI, though, so it may
only
happen with `Command::current_dir`.

Test Plan
--

Tests ported from `ruff_linter` and a new test for the case with no file

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-15 10:14:49 -04:00
Dylan
92a302e291 [pylint] Extend invalid string character rules to include t-strings (#19355)
Handle t-strings in PLE2510-15

Progress towards #15506
2025-07-15 07:59:51 -05:00
Alex Waygood
7b8161e80d Make TC010 docs example more realistic (#19356) 2025-07-15 13:52:21 +01:00
Brent Westbrook
e9b0c33703 Move RDJSON rendering to ruff_db (#19293)
## Summary

Another output format like #19133. This is the
[reviewdog](https://github.com/reviewdog/reviewdog) output format, which
is somewhat similar to regular JSON. Like #19270, in the first commit I
converted from using `json!` to `Serialize` structs, then in the second
commit I moved the module to `ruff_db`.

The reviewdog
[schema](320a8e73a9/proto/rdf/jsonschema/DiagnosticResult.json)
seems a bit more flexible than our JSON schema, so I'm not sure if we
need any preview checks here. I'll flag the places I wasn't sure about
as review comments.

## Test Plan

New tests in `rdjson.rs`, ported from the old `rjdson.rs` module, as
well as the new CLI output tests.

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-15 12:39:21 +00:00
Dylan
82391b5675 [flake8-use-pathlib] Skip single dots for invalid-pathlib-with-suffix (PTH210) on versions >= 3.14 (#19331)
Skips [invalid-pathlib-with-suffix
(PTH210)](https://docs.astral.sh/ruff/rules/invalid-pathlib-with-suffix/#invalid-pathlib-with-suffix-pth210)
for `.with_suffix(".")` on Python versions 3.14 and greater, as per [the
docs](https://docs.python.org/3.14/library/pathlib.html#pathlib.PurePath.with_suffix).

Progress towards #15506
2025-07-15 07:05:00 -05:00
Dylan
464144f1c6 [ruff] Allow strict kwarg when checking for starmap-zip (RUF058) in Python 3.14+ (#19333)
In Python 3.14 the keyword-argument `strict` was [added to
`map`](https://docs.python.org/3.14/library/functions.html#map). This PR
adds support for this when replacing a starmap-zip call with map in
[starmap-zip
(RUF058)](https://docs.astral.sh/ruff/rules/starmap-zip/#starmap-zip-ruf058).
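
An illustrative example (not from the PR) of the fix on Python 3.14+:

```py
from itertools import starmap

xs = [1, 2, 3]
ys = [4, 5, 6]


def add(x, y):
    return x + y


starmap(add, zip(xs, ys, strict=True))  # RUF058
# On Python 3.14+, the replacement can now preserve strictness:
map(add, xs, ys, strict=True)
```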

Progress towards #15506
2025-07-15 07:04:23 -05:00
Alex Waygood
002f9057db [ty] Reduce false positives for TypedDict types (#19354) 2025-07-15 12:47:19 +01:00
Dhruv Manilawala
f3a27406c9 [ty] Remove ConnectionInitializer (#19353)
## Summary

This PR removes the `ConnectionInitializer` and inlines the
`initialize_start` and `initialize_finish` calls.

The main benefit of this is that it will allow us to use
[`Connection::memory`](https://docs.rs/lsp-server/latest/lsp_server/struct.Connection.html#method.memory)
in the mock server. That method returns two `Connection` where one of
them will represent the client side connection and the other will be
sent to the `Server::new` call to be used by the server. This way the
mock client can send notifications and requests to mimic the editor.

## Test Plan

I tested out the initialization process and checked that the initialized
result contains the server capabilities and server info.
2025-07-15 17:02:44 +05:30
Alex Waygood
2c9da80985 [ty] Use Type::string_literal() more (#19352) 2025-07-15 11:09:07 +00:00
David Peter
8e61da740a [ty] Add ecosystem-report workflow (#19349)
## Summary

Adds a new workflow that generates an ecosystem report of all
diagnostics and publishes it to Cloudflare pages.

## Test Plan

Not yet tested.
2025-07-15 12:29:44 +02:00
Micha Reiser
e506296cec [ty] Make use of salsa Lookup when interning values (#19347) 2025-07-15 09:54:43 +02:00
github-actions[bot]
966cc9d6e9 [ty] Sync vendored typeshed stubs (#19345)
Close and reopen this PR to trigger CI

Co-authored-by: typeshedbot <>
2025-07-15 11:46:59 +05:30
GiGaGon
7b27fe966e [pylint] Make example error out-of-the-box (PLE2502) (#19272)
<!--
Thank you for contributing to Ruff/ty! To help us out with reviewing,
please consider the following:

- Does this pull request include a summary of the change? (See below.)
- Does this pull request include a descriptive title? (Please prefix
with `[ty]` for ty pull
  requests.)
- Does this pull request include references to any relevant issues?
-->

## Summary

<!-- What's the purpose of the change? What does it do, and why? -->

Part of #18972
Fixes #14346

This PR makes [bidirectional-unicode
(PLE2502)](https://docs.astral.sh/ruff/rules/bidirectional-unicode/#bidirectional-unicode-ple2502)'s
example error out-of-the-box, by converting it to use one of the test
cases. The documentation in general is also updated to replace
"bidirectional unicode character" with "bidirectional formatting
character", as those are the only ones checked for, and the "unicode"
suffix is redundant. The new example section looks like this:
<img width="1074" height="264" alt="image"
src="https://github.com/user-attachments/assets/cc1d2cb4-b590-4f20-a4d2-15b744872cdd"
/>

The "References" section link is also updated to reflect the rule's
actual behavior.

## Test Plan

<!-- How was it tested? -->

N/A, no functionality/tests affected
2025-07-14 14:46:23 -04:00
GiGaGon
966fd6f57a [pydoclint] Fix SyntaxError from fixes with line continuations (D201, D202) (#19246)
<!--
Thank you for contributing to Ruff/ty! To help us out with reviewing,
please consider the following:

- Does this pull request include a summary of the change? (See below.)
- Does this pull request include a descriptive title? (Please prefix
with `[ty]` for ty pull
  requests.)
- Does this pull request include references to any relevant issues?
-->

## Summary

<!-- What's the purpose of the change? What does it do, and why? -->

This PR fixes #7172 by suppressing the fixes for
[docstring-missing-returns
(DOC201)](https://docs.astral.sh/ruff/rules/docstring-missing-returns/#docstring-missing-returns-doc201)
/ [docstring-extraneous-returns
(DOC202)](https://docs.astral.sh/ruff/rules/docstring-extraneous-returns/#docstring-extraneous-returns-doc202)
if there is a surrounding line continuation character `\` that would
make the fix cause a syntax error.

To do this, the lints are changed from `AlwaysFixableViolation` to
`Violation` with `FixAvailability::Sometimes`.

In the case of `DOC201`, the fix is not given if the non-break line ends
in a line continuation character `\`. Note that lines are iterated in
reverse from the docstring to the function definition.

In the case of `DOC202`, the fix is not given if the docstring ends with
a line continuation character `\`.

## Test Plan

<!-- How was it tested? -->

Added a test case.
2025-07-14 13:31:36 -04:00
github-actions[bot]
4f60f0e925 [ty] Sync vendored typeshed stubs (#19334)
Co-authored-by: typeshedbot <>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-07-14 17:34:09 +01:00
GiGaGon
059e90a98f [refurb] Make example error out-of-the-box (FURB122) (#19297)
## Summary

Part of #18972

This PR makes [for-loop-writes
(FURB122)](https://docs.astral.sh/ruff/rules/for-loop-writes/#for-loop-writes-furb122)'s
example error out-of-the-box. I also had to re-name the second case's
variables to get both to raise at the same time, I suspect because of
limitations in ruff's current semantic model. New names subject to
bikeshedding, I just went with the least effort `_b` for binary suffix.

[Old example](https://play.ruff.rs/19e8e47a-8058-4013-aef5-e9b5eab65962)
```py
with Path("file").open("w") as f:
    for line in lines:
        f.write(line)

with Path("file").open("wb") as f:
    for line in lines:
        f.write(line.encode())
```

[New example](https://play.ruff.rs/e96b00e5-3c63-47c3-996d-dace420dd711)
```py
from pathlib import Path

with Path("file").open("w") as f:
    for line in lines:
        f.write(line)

with Path("file").open("wb") as f_b:
    for line_b in lines_b:
        f_b.write(line_b.encode())
```

The "Use instead" section was also modified similarly.

## Test Plan

<!-- How was it tested? -->

N/A, no functionality/tests affected
2025-07-14 11:24:16 -05:00
Juriah
a4562ac673 [refurb] Make example error out-of-the-box (FURB177) (#19309)
<!--
Thank you for contributing to Ruff/ty! To help us out with reviewing,
please consider the following:

- Does this pull request include a summary of the change? (See below.)
- Does this pull request include a descriptive title? (Please prefix
with `[ty]` for ty pull
  requests.)
- Does this pull request include references to any relevant issues?
-->

## Summary

Part of #18972
This PR makes
[implicit-cwd(FURB177)](https://docs.astral.sh/ruff/rules/implicit-cwd/)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/a0bef229-9626-426f-867f-55cb95ee64d8)
```python
cwd = Path().resolve()
```
[New example](https://play.ruff.rs/bdbea4af-e276-4603-a1b6-88757dfaa399)
```python
from pathlib import Path

cwd = Path().resolve()
```
<!-- What's the purpose of the change? What does it do, and why? -->


## Test Plan

<!-- How was it tested? -->
N/A, no functionality/tests affected
2025-07-14 11:23:02 -05:00
Alex Waygood
021a70d30c [ty] ignore errors when reformatting codemodded typeshed (#19332) 2025-07-14 16:14:01 +00:00
Alex Waygood
fddf2f33d2 [ty] Provide docstrings for stdlib APIs when hovering over them in an IDE (#19311) 2025-07-14 17:00:45 +01:00
Dhruv Manilawala
b4c42eb83b [ty] Add virtual files to the only project database (#19322)
## Summary

Previously, the virtual files were being added to the default database
that's present on the session. This is wrong because the default
database is for any files that don't belong to any project, i.e., they're
outside of any projects managed by the server. Virtual files are neither
part of the project nor outside of it. This was not the
intention as in the initial version, virtual files were being added to
the only project database managed by the server.

This PR fixes this by reverting back to the original behavior where
virtual files will be added to the only project database present. When
support for multiple workspace and project is added, this will require
updating (https://github.com/astral-sh/ty/issues/794).

This is required for #19264 because workspace diagnostics doesn't check
the default project database yet. Ideally, the default db should be
checked as well.

The implementation of this PR means that virtual files are now being
included for workspace diagnostics, but it doesn't work completely. For
example, if I save an untitled file, the diagnostics disappear but don't
reappear for the (now) saved file on disk, as shown in the following
video demonstration:



https://github.com/user-attachments/assets/123e8d20-1e95-4c7d-b7eb-eb65be8c476e
2025-07-14 20:17:51 +05:30
Dylan
2a2cc37158 Add t-string fixtures for rules that do not need to be modified (#19146)
I used a script to attempt to identify those rules with the following
property: changing f-strings to t-strings in the corresponding fixture
altered the number of lint errors emitted. In other words, those rules
for which f-strings and t-strings are not treated the same in the
current implementation.

This PR documents the subset of such rules where this is fine and no
changes need to be made to the implementation of the rule. Mostly these
are the rules where it is relevant that an f-string evaluates to type
`str` at runtime whereas t-strings do not.

In theory many of these fixtures are not super necessary - it's unlikely
t-strings would be used for most of these. However, the internal
handling of t-strings is tightly coupled with that of f-strings, and may
become even more so as we implement the upcoming changes due to
https://github.com/python/cpython/pull/135996 . So I'd like to keep
these around as regression tests.

Note: The `flake8-bandit` fixtures were already added during the
original t-string implementation.

| Rule(s) | Reason |
| --- | --- |
| [`unused-method-argument` (`ARG002`)](https://docs.astral.sh/ruff/rules/unused-method-argument/#unused-method-argument-arg002) | f-strings exempted for msg in `NotImplementedError` not relevant for t-strings |
| [`logging-f-string` (`G004`)](https://docs.astral.sh/ruff/rules/logging-f-string/#logging-f-string-g004) | t-strings cannot be used here |
| [`f-string-in-get-text-func-call` (`INT001`)](https://docs.astral.sh/ruff/rules/f-string-in-get-text-func-call/#f-string-in-get-text-func-call-int001) | rule justified by eager evaluation of interpolations |
| [`flake8-bandit`](https://docs.astral.sh/ruff/rules/#flake8-bandit-s) | rules justified by eager evaluation of interpolations |
| [`single-string-slots` (`PLC0205`)](https://docs.astral.sh/ruff/rules/single-string-slots/#single-string-slots-plc0205) | t-strings cannot be slots in general |
| [`unnecessary-encode-utf8` (`UP012`)](https://docs.astral.sh/ruff/rules/unnecessary-encode-utf8/#unnecessary-encode-utf8-up012) | cannot encode t-strings |
| [`no-self-use` (`PLR6301`)](https://docs.astral.sh/ruff/rules/no-self-use/#no-self-use-plr6301) | f-strings exempted for msg in NotImplementedError not relevant for t-strings |
| [`pytest-raises-too-broad` (`PT011`)](https://docs.astral.sh/ruff/rules/pytest-raises-too-broad/) / [`pytest-fail-without-message` (`PT016`)](https://docs.astral.sh/ruff/rules/pytest-fail-without-message/#pytest-fail-without-message-pt016) / [`pytest-warns-too-broad` (`PT030`)](https://docs.astral.sh/ruff/rules/pytest-warns-too-broad/#pytest-warns-too-broad-pt030) | t-strings cannot be empty or used as messages |
| [`assert-on-string-literal` (`PLW0129`)](https://docs.astral.sh/ruff/rules/assert-on-string-literal/#assert-on-string-literal-plw0129) | t-strings are not strings and cannot be empty |
| [`native-literals` (`UP018`)](https://docs.astral.sh/ruff/rules/native-literals/#native-literals-up018) | t-strings are not native literals |
2025-07-14 09:46:31 -05:00
Dhruv Manilawala
8a217e5920 [ty] Remove FileLookupError (#19323)
## Summary

This PR removes the `FileLookupError` as it's not really required. The
original intention was that this would be returned from the `.file`
lookup to the different handlers but we've since moved the logic of
"lookup file and add trace message if file unavailable with the reason"
under the `file_ok` method which all of the handlers use.
2025-07-14 13:35:14 +00:00
Andrew Gallant
f7973ac870 [ty] Fix handling of metaclasses in object.<CURSOR> completions
Basically, we weren't quite using `Type::member` in every case
correctly. Specifically, this example from @sharkdp:

```
class Meta(type):
    @property
    def meta_attr(self) -> int:
        return 0

class C(metaclass=Meta): ...

C.<CURSOR>
```

While we would return `C.meta_attr` here, we were claiming its type was
`property`. But its type should be `int`.

Ref https://github.com/astral-sh/ruff/pull/19216#discussion_r2197065241
2025-07-14 08:24:23 -04:00
Micha Reiser
3560f86450 [ty] Use an interval map for scopes by expression (#19025) 2025-07-14 13:50:58 +02:00
David Peter
f22da352db [ty] List all enum members (#19283)
## Summary

Adds a way to list all members of an `Enum` and implements almost all of
the mechanisms by which members are distinguished from non-members
([spec](https://typing.python.org/en/latest/spec/enums.html#defining-members)).
This has no effect on actual enums, so far.
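
For reference, a sketch (using stdlib `enum` helpers, not the PR's test code) of some of the member/non-member distinctions the spec defines:

```py
from enum import Enum, member, nonmember


class Answer(Enum):
    YES = 1                   # member
    NO = 2                    # member
    threshold = nonmember(3)  # explicitly not a member

    @member
    def shrug(self) -> None: ...  # explicitly a member

    def is_yes(self) -> bool:  # plain methods are not members
        return self is Answer.YES
```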

## Test Plan

New Markdown tests using `ty_extensions.enum_members`.
2025-07-14 13:18:17 +02:00
Micha Reiser
cb530a0216 [ty] Handle configuration errors in LSP more gracefully (#19262) 2025-07-14 12:27:52 +02:00
Micha Reiser
90026047f9 [ty] Use python version and path from Python extension (#19012) 2025-07-14 09:47:27 +00:00
w0nder1ng
26f736bc46 [pep8_naming] Avoid false positives on standard library functions with uppercase names (N802) (#18907)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-14 08:26:57 +00:00
renovate[bot]
c9f95e8714 Update Rust crate toml to 0.9.0 (#19320)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [toml](https://redirect.github.com/toml-rs/toml) |
workspace.dependencies | minor | `0.8.11` -> `0.9.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>toml-rs/toml (toml)</summary>

###
[`v0.9.2`](https://redirect.github.com/toml-rs/toml/compare/toml-v0.9.1...toml-v0.9.2)

[Compare
Source](https://redirect.github.com/toml-rs/toml/compare/toml-v0.9.1...toml-v0.9.2)

###
[`v0.9.1`](https://redirect.github.com/toml-rs/toml/compare/toml-v0.9.0...toml-v0.9.1)

[Compare
Source](https://redirect.github.com/toml-rs/toml/compare/toml-v0.9.0...toml-v0.9.1)

###
[`v0.9.0`](https://redirect.github.com/toml-rs/toml/compare/toml-v0.8.23...toml-v0.9.0)

[Compare
Source](https://redirect.github.com/toml-rs/toml/compare/toml-v0.8.23...toml-v0.9.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4yMy4yIiwidXBkYXRlZEluVmVyIjoiNDEuMjMuMiIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW50ZXJuYWwiXX0=-->

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2025-07-14 13:11:10 +05:30
Micha Reiser
3da8b51dc1 [ty] Fix server version (#19284) 2025-07-14 09:06:34 +02:00
renovate[bot]
3cbf2fe82e Update NPM Development dependencies (#19319)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-14 08:45:59 +02:00
renovate[bot]
221f3258d4 Update taiki-e/install-action action to v2.56.13 (#19317)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-14 08:40:43 +02:00
renovate[bot]
3fae3f248c Update Rust crate clap to v4.5.41 (#19315)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-14 08:40:26 +02:00
renovate[bot]
6c73837bc4 Update Rust crate get-size2 to v0.5.2 (#19316)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-14 08:40:03 +02:00
renovate[bot]
6f8f38cb68 Update CodSpeedHQ/action action to v3.7.0 (#19318)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-14 08:35:06 +02:00
renovate[bot]
4e2d6f5e45 Update pre-commit hook astral-sh/ruff-pre-commit to v0.12.3 (#19314)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit)
| repository | patch | `v0.12.2` -> `v0.12.3` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.3`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.3)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.2...v0.12.3)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.3

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-14 11:52:42 +05:30
renovate[bot]
3ed3852c38 Update dependency ruff to v0.12.3 (#19313)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [ruff](https://docs.astral.sh/ruff)
([source](https://redirect.github.com/astral-sh/ruff),
[changelog](https://redirect.github.com/astral-sh/ruff/blob/main/CHANGELOG.md))
| `==0.12.2` -> `==0.12.3` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/ruff/0.12.3?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/ruff/0.12.2/0.12.3?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>astral-sh/ruff (ruff)</summary>

###
[`v0.12.3`](https://redirect.github.com/astral-sh/ruff/blob/HEAD/CHANGELOG.md#0123)

[Compare
Source](https://redirect.github.com/astral-sh/ruff/compare/0.12.2...0.12.3)

##### Preview features

- \[`flake8-bugbear`] Support non-context-manager calls in `B017`
([#&#8203;19063](https://redirect.github.com/astral-sh/ruff/pull/19063))
- \[`flake8-use-pathlib`] Add autofixes for `PTH100`, `PTH106`,
`PTH107`, `PTH108`, `PTH110`, `PTH111`, `PTH112`, `PTH113`, `PTH114`,
`PTH115`, `PTH117`, `PTH119`, `PTH120`
([#&#8203;19213](https://redirect.github.com/astral-sh/ruff/pull/19213))
- \[`flake8-use-pathlib`] Add autofixes for `PTH203`, `PTH204`, `PTH205`
([#&#8203;18922](https://redirect.github.com/astral-sh/ruff/pull/18922))

##### Bug fixes

- \[`flake8-return`] Fix false-positive for variables used inside nested
functions in `RET504`
([#&#8203;18433](https://redirect.github.com/astral-sh/ruff/pull/18433))
- Treat form feed as valid whitespace before a line continuation
([#&#8203;19220](https://redirect.github.com/astral-sh/ruff/pull/19220))
- \[`flake8-type-checking`] Fix syntax error introduced by fix (`TC008`)
([#&#8203;19150](https://redirect.github.com/astral-sh/ruff/pull/19150))
- \[`pyupgrade`] Keyword arguments in `super` should suppress the
`UP008` fix
([#&#8203;19131](https://redirect.github.com/astral-sh/ruff/pull/19131))

##### Documentation

- \[`flake8-pyi`] Make example error out-of-the-box (`PYI007`, `PYI008`)
([#&#8203;19103](https://redirect.github.com/astral-sh/ruff/pull/19103))
- \[`flake8-simplify`] Make example error out-of-the-box (`SIM116`)
([#&#8203;19111](https://redirect.github.com/astral-sh/ruff/pull/19111))
- \[`flake8-type-checking`] Make example error out-of-the-box (`TC001`)
([#&#8203;19151](https://redirect.github.com/astral-sh/ruff/pull/19151))
- \[`flake8-use-pathlib`] Make example error out-of-the-box (`PTH210`)
([#&#8203;19189](https://redirect.github.com/astral-sh/ruff/pull/19189))
- \[`pycodestyle`] Make example error out-of-the-box (`E272`)
([#&#8203;19191](https://redirect.github.com/astral-sh/ruff/pull/19191))
- \[`pycodestyle`] Make example not raise unnecessary `SyntaxError`
(`E114`)
([#&#8203;19190](https://redirect.github.com/astral-sh/ruff/pull/19190))
- \[`pydoclint`] Make example error out-of-the-box (`DOC501`)
([#&#8203;19218](https://redirect.github.com/astral-sh/ruff/pull/19218))
- \[`pylint`, `pyupgrade`] Fix syntax errors in examples (`PLW1501`,
`UP028`)
([#&#8203;19127](https://redirect.github.com/astral-sh/ruff/pull/19127))
- \[`pylint`] Update `missing-maxsplit-arg` docs and error to suggest
proper usage (`PLC0207`)
([#&#8203;18949](https://redirect.github.com/astral-sh/ruff/pull/18949))
- \[`flake8-bandit`] Make example error out-of-the-box (`S412`)
([#&#8203;19241](https://redirect.github.com/astral-sh/ruff/pull/19241))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-14 11:52:25 +05:30
GiGaGon
dca594f89f [pyupgrade] Make example error out-of-the-box (UP040) (#19296)

## Summary

Part of #18972

This PR makes [non-pep695-type-alias
(UP040)](https://docs.astral.sh/ruff/rules/non-pep695-type-alias/#non-pep695-type-alias-up040)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/6beca1be-45cd-4e5a-aafa-6a0584c10d64)
```py
ListOfInt: TypeAlias = list[int]
PositiveInt = TypeAliasType("PositiveInt", Annotated[int, Gt(0)])
```

[New example](https://play.ruff.rs/bbad34da-bf07-44e6-9f34-53337e8f57d4)
```py
from typing import Annotated, TypeAlias, TypeAliasType
from annotated_types import Gt

ListOfInt: TypeAlias = list[int]
PositiveInt = TypeAliasType("PositiveInt", Annotated[int, Gt(0)])
```

Imports were also added to the "Use instead" section.

## Test Plan

N/A, no functionality/tests affected
2025-07-12 18:39:25 +01:00
GiGaGon
4bc27133a9 [pyupgrade] Make example error out-of-the-box (UP046) (#19295) 2025-07-12 14:54:56 +01:00
GiGaGon
7154b64248 [pylint] Make example error out-of-the-box (PLE1507) (#19288)
## Summary

Part of #18972

This PR makes [invalid-envvar-value
(PLE1507)](https://docs.astral.sh/ruff/rules/invalid-envvar-value/#invalid-envvar-value-ple1507)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/a46a9bca-edd5-4474-b20d-e6b6d87291ca)
```py
os.getenv(1)
```

[New example](https://play.ruff.rs/8348d32d-71fa-422c-b228-e2bc343765b1)
```py
import os

os.getenv(1)
```

The "Use instead" section was also updated similarly.

## Test Plan

N/A, no functionality/tests affected
2025-07-11 16:08:47 -05:00
GiGaGon
6d01c487a5 [pyupgrade] Make example error out-of-the-box (UP041) (#19292)

## Summary

Part of #18972

This PR makes [timeout-error-alias
(UP041)](https://docs.astral.sh/ruff/rules/timeout-error-alias/#timeout-error-alias-up041)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/87e20352-d80a-46ec-98a2-6f6ea700438b)
```py
raise asyncio.TimeoutError
```

[New example](https://play.ruff.rs/d3b95557-46a2-4856-bd71-30d5f3f5ca44)
```py
import asyncio

raise asyncio.TimeoutError
```

## Test Plan

N/A, no functionality/tests affected
2025-07-11 16:08:20 -05:00
GiGaGon
6660b11422 [pyupgrade] Make example error out-of-the-box (UP023) (#19291)
## Summary

Part of #18972

This PR makes [deprecated-c-element-tree
(UP023)](https://docs.astral.sh/ruff/rules/deprecated-c-element-tree/#deprecated-c-element-tree-up023)'s
example error out-of-the-box. I have no clue why the `import
xml.etree.cElementTree` and `from xml.etree import cElementTree` cases
are specifically carved out if they do not have an `as ...`, but the
tests explicitly call this out, and that's how it is in `pyupgrade`'s
source as well.


b5c5f710fc/crates/ruff_linter/resources/test/fixtures/pyupgrade/UP023.py (L23-L31)

[Old example](https://play.ruff.rs/632b8ce1-393d-45e5-9504-5444ae71a0d8)
```py
from xml.etree import cElementTree
```

[New example](https://play.ruff.rs/fef4d378-8c54-41b2-8778-2d02bcbbd7d3)
```py
from xml.etree import cElementTree as ET
```

The "Use instead" section was also updated similarly.

## Test Plan

N/A, no functionality/tests affected
2025-07-11 16:07:34 -05:00
Brent Westbrook
b5c5f710fc Render Azure, JSON, and JSON lines output with the new diagnostics (#19133)
## Summary

This was originally stacked on #19129, but some of the changes I made
for JSON also impacted the Azure format, so I went ahead and combined
them. The main changes here are:

- Implementing `FileResolver` for Ruff's `EmitterContext`
- Adding `FileResolver::notebook_index` and `FileResolver::is_notebook`
methods
- Adding a `DisplayDiagnostics` (with an "s") type for rendering a group
of diagnostics at once
- Adding `Azure`, `Json`, and `JsonLines` as new `DiagnosticFormat`s

I tried a couple of alternatives to the `FileResolver::notebook` methods
like passing down the `NotebookIndex` separately and trying to reparse a
`Notebook` from Ruff's `SourceFile`. The latter seemed promising, but
the `SourceFile` only stores the concatenated plain text of the
notebook, not the re-parsable JSON. I guess the current version is just
a variation on passing the `NotebookIndex`, but at least we can reuse
the existing `resolver` argument. I think a lot of this can be cleaned
up once Ruff has its own actual file resolver.

As suggested, I also tried deleting the corresponding `Emitter` files in
`ruff_linter`, but it doesn't look like git was able to follow this as a
rename. It did, however, track that the tests were moved, so the
snapshots should be easy to review.

## Test Plan

Existing Ruff tests ported to tests in `ruff_db`. I think some other
existing ruff tests also cover parts of this refactor.

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-11 15:04:46 -04:00
Dan Parizher
ee88abf77c [flake8_django] Fix DJ008 false positive for abstract models with type-annotated abstract field (#19221)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-11 16:50:59 +00:00
Jack O'Connor
78bd73f25a [ty] add support for nonlocal statements 2025-07-11 09:44:54 -07:00
Dan Parizher
110765154f [flake8-bugbear] Fix B017 false negatives for keyword exception arguments (#19217)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-11 16:43:09 +00:00
Dan Parizher
30ee44770d Fix I002 import insertion after docstring with multiple string statements (#19222) 2025-07-11 18:35:41 +02:00
Dhruv Manilawala
fd69533fe5 [ty] Make sure to always respond to client requests (#19277)
## Summary

This PR fixes a bug that didn't return a response to the client if the
document snapshotting failed.

This is resolved by making sure that the server always creates the
document snapshot and embeds any failures inside the snapshot.

Closes: astral-sh/ty#798

## Test Plan

Using the test case as described in the linked issue:



https://github.com/user-attachments/assets/f32833f8-03e5-4641-8c7f-2a536fe2e270
2025-07-11 14:27:27 +00:00
Zanie Blue
39c6364545 Only build tests in the msrv job (#19261)
Alternative to https://github.com/astral-sh/ruff/pull/19260
2025-07-11 09:16:12 -05:00
Andrew Gallant
100d765ddf [ty] Document path separator usage in VendoredFileSystem
Ref https://github.com/astral-sh/ruff/pull/19266#discussion_r2198530383
2025-07-11 10:06:35 -04:00
Andrew Gallant
6ea231e458 [ty] Add debug output with completion request timings
I had this in a branch somewhere but forgot to get it
merged. So I'm sneaking it in here.

This is useful for very ad hoc performance testing.
2025-07-11 10:06:35 -04:00
Alex Waygood
c9df4ddf6a [ty] Add completions for submodule imports
While we did previously support submodule completions via our
`all_members` API, that only works when submodules are attributes of
their parent module. For example, `os.path`. But that didn't work when
the submodule was not an attribute of its parent. For example,
`http.client`. To make the latter work, we read the directory of the
parent module to discover its submodules.
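
A rough illustration of the difference, using standard-library modules only (not ty's API):

```py
import os
import http.client  # `client` must be imported explicitly; it is not an attribute of `http`

print(os.path)      # `path` is exposed as an attribute of the `os` module
print(http.client)  # only discoverable by reading the parent package's directory
```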
2025-07-11 10:06:35 -04:00
Andrew Gallant
948463aafa [ty] Move SystemOrVendoredPathRef
This moves the type and adds a few methods so that it can
be used elsewhere.
2025-07-11 10:06:35 -04:00
Andrew Gallant
729fa12575 [ty] Add "readdir" for vendored file systems
This is mostly just holding a zip file in the right way
to simulate reading a directory. We want this to be able
to discover sub-modules for completions.
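
A minimal Python sketch of the idea (the real implementation is in Rust inside the vendored file system and differs in detail):

```py
import zipfile


def readdir(archive_path: str, directory: str) -> set[str]:
    """List the immediate children of `directory` inside a zip archive."""
    prefix = directory.rstrip("/") + "/"
    entries: set[str] = set()
    with zipfile.ZipFile(archive_path) as zf:
        for name in zf.namelist():
            if name.startswith(prefix) and name != prefix:
                # Keep only the immediate child (file or subdirectory name).
                entries.add(name[len(prefix):].split("/", 1)[0])
    return entries
```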
2025-07-11 10:06:35 -04:00
Brent Westbrook
f14ee9edd5 Use structs for JSON serialization (#19270)
## Summary

See https://github.com/astral-sh/ruff/pull/19133#discussion_r2198413586
for recent discussion. This PR moves to using structs for the types in
our JSON output format instead of the `json!` macro.

I didn't rename any of the `message` references because that should be
handled when rebasing #19133 onto this.

My plan for handling the `preview` behavior with the new diagnostics is
to use a wrapper enum. Something like:

```rust
#[derive(Serialize)]
#[serde(untagged)]
pub(crate) enum JsonDiagnostic<'a> {
    Old(OldJsonDiagnostic<'a>),
}

#[derive(Serialize)]
pub(crate) struct OldJsonDiagnostic<'a> {
    // ...
}
```

Initially I thought I could use a `&dyn Serialize` for the affected
fields, but I see that `Serialize` isn't dyn-compatible in testing this
now.

## Test Plan

Existing tests. One quirk of the new types is that their fields are in
alphabetical order. I guess `json!` sorts the fields alphabetically? The
tests were failing before I sorted the struct fields.

## Other formats

It looks like the `rdjson`, `sarif`, and `gitlab` formats also use
`json!`, so if we decide to merge this, I can do something similar for
those before moving them to the new diagnostic format.
2025-07-11 09:37:44 -04:00
Alex Waygood
a67630f907 [ty] Filter out private type aliases from stub files when offering autocomplete suggestions (#19282) 2025-07-11 13:20:16 +00:00
Brent Westbrook
5bc81f26c8 Bump 0.12.3 (#19279) 2025-07-11 09:07:50 -04:00
Brent Westbrook
6908e2682f Filter ruff_linter::VERSION out of SARIF output tests (#19280)
Summary
--

Fixes the test failures in #19279. This is the same variable used to
construct the SARIF output:


350d563c88/crates/ruff_linter/src/message/sarif.rs (L39-L44)

Test Plan
--

Existing tests with the modified filter
2025-07-11 08:55:51 -04:00
Dhruv Manilawala
25c4295564 [ty] Avoid stale diagnostics for open files diagnostic mode (#19273)
## Summary

This PR fixes a bug where in `openFilesOnly` diagnostic mode, VS Code
wouldn't clean up the diagnostics even though the server asked it to by
sending an empty publish diagnostics.

This is not the long-term solution but a quick fix. Ideally, the server
would dynamically register for workspace diagnostics but that requires
listening for `didChangeConfiguration` notification which I'm going to
be working on with https://github.com/astral-sh/ty/issues/82.

## Test Plan

### Before

This uses the latest stable version of ty.


https://github.com/user-attachments/assets/0cc6c513-ccad-4955-a1b6-a0ee242119d6

### After

This uses the debug build of ty from this PR.


https://github.com/user-attachments/assets/e539d569-d852-46a9-bbfc-d54375127c62
2025-07-11 16:29:16 +05:30
Micha Reiser
426fa4bb12 [ty] Add signature help provider to playground (#19276) 2025-07-11 09:58:14 +02:00
UnboundVariable
b0b65c24ff [ty] Initial implementation of signature help provider (#19194)
This PR includes:
* Implemented core signature help logic
* Added new docstring method on Definition that returns a docstring for
function and class definitions
* Modified the display code for Signature that allows a signature string
to be broken into text ranges that correspond to each parameter in the
signature
* Augmented Signature struct so it can track the Definition for a
signature when available; this allows us to find the docstring
associated with the signature
* Added utility functions for parsing parameter documentation from three
popular docstring formats (Google, NumPy and reST)
* Implemented tests for all of the above

"Signature help" is displayed by an editor when you are typing a
function call expression. It is typically triggered when you type an
open parenthesis. The language server provides information about the
target function's signature (or multiple signatures), documentation, and
parameters.
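
For example, a Google-style docstring of the kind the parameter-documentation parsing targets (names are illustrative only):

```py
def connect(host: str, port: int = 5432) -> None:
    """Open a connection to a server.

    Args:
        host: Hostname or IP address of the server.
        port: TCP port to connect to.
    """
```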

Here is how this appears:


![image](https://github.com/user-attachments/assets/40dce616-ed74-4810-be62-42a5b5e4b334)

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-10 19:32:00 -07:00
Brent Westbrook
08bc6d2589 Add simple integration tests for all output formats (#19265)
Summary
--

I spun this off from #19133 to be sure to get an accurate baseline
before modifying any of the formats. I picked the code snippet to
include a lint diagnostic with a fix, one without a fix, and one syntax
error. I'm happy to expand it if there are any other kinds we want to
test.

I initially passed `CONTENT` on stdin, but I was a bit surprised to
notice that some of our output formats include an absolute path to the
file. I switched to a `TempDir` to use the `tempdir_filter`.

Test Plan
--

New CLI tests
2025-07-10 17:57:48 -04:00
Victor Hugo Gomes
f2ae12bab3 [flake8-return] Fix false-positive for variables used inside nested functions in RET504 (#18433)

## Summary

This PR is the same as #17656.

I accidentally deleted the branch of that PR, so I'm creating a new one.

Fixes #14052

## Test Plan

Add regression tests
2025-07-10 16:10:22 -04:00
Zanie Blue
965f415212 [ty] Add a --quiet mode (#19233)
Adds a `--quiet` flag which silences diagnostics, warning logs, and
messages like "all checks passed", while retaining summary messages that
indicate problems, e.g., the number of diagnostics.

I'm a bit on the fence regarding filtering out warning logs, because it
can omit important details, e.g., the message that a fatal diagnostic
was encountered. Let's discuss that in
https://github.com/astral-sh/ruff/pull/19233#discussion_r2195408693

The implementation recycles the `Printer` abstraction used in uv, which
is intended to replace all direct usage of `std::io::stdout`. See
https://github.com/astral-sh/ruff/pull/19233#discussion_r2195140197

I ended up futzing with the progress bar more than I probably should
have to ensure it was also using the printer, but it doesn't seem like a
big deal. See
https://github.com/astral-sh/ruff/pull/19233#discussion_r2195330467

Closes https://github.com/astral-sh/ty/issues/772
2025-07-10 09:40:47 -05:00
frank
83b5bbf004 Treat form feed as valid whitespace before a line continuation (#19220)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-10 14:09:34 +00:00
Micha Reiser
87f6f08ef5 [ty] Make check_file a salsa query (#19255)
## Summary
We noticed that all files get reparsed when workspace diagnostics are
enabled.

I realised that this is because `check_file_impl` accesses the parsed
module but isn't itself a salsa query.
This PR makes `check_file_impl` a salsa query, so that we only access
the `parsed_module` when the file actually changed. I decided to remove
the salsa query from `check_types` because most functions it calls are
salsa queries themselves, and having both `check_types` and `check_file`
as salsa queries has the downside that we cache the diagnostics twice.

## Test Plan

**Before**

```
2025-07-10 12:54:16.620766000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c0c))}: File `/Users/micha/astral/test/yaml/yaml-stubs/__init__.pyi` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.621942000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c13))}: File `/Users/micha/astral/test/ignore2 2/nested-repository/main.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.622107000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c09))}: File `/Users/micha/astral/test/notebook.ipynb` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.622357000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c04))}: File `/Users/micha/astral/test/no-trailing.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.622634000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c02))}: File `/Users/micha/astral/test/simple.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.623056000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c07))}: File `/Users/micha/astral/test/open/more.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.623254000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c11))}: File `/Users/micha/astral/test/ignore-bug/backend/src/subdir/log/some_logging_lib.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.623450000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c0f))}: File `/Users/micha/astral/test/yaml/tomllib/__init__.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.624599000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c05))}: File `/Users/micha/astral/test/create.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.624784000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c00))}: File `/Users/micha/astral/test/lib.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.624911000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c0a))}: File `/Users/micha/astral/test/sub/test.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625032000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c12))}: File `/Users/micha/astral/test/ignore2/nested-repository/main.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625101000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c08))}: File `/Users/micha/astral/test/open/test.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625227000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c03))}: File `/Users/micha/astral/test/pseudocode_with_bom.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625353000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c0b))}: File `/Users/micha/astral/test/yaml/yaml-stubs/loader.pyi` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625543000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c01))}: File `/Users/micha/astral/test/test_trailing.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625616000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c0d))}: File `/Users/micha/astral/test/yaml/tomllib/_re.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625667000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c06))}: File `/Users/micha/astral/test/yaml/main.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.625779000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c10))}: File `/Users/micha/astral/test/yaml/tomllib/_types.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.627526000  WARN request{id=19 method="workspace/diagnostic"}:Project::check:check_file{file=file(Id(c0e))}: File `/Users/micha/astral/test/yaml/tomllib/_parser.py` was reparsed after being collected in the current Salsa revision
2025-07-10 12:54:16.627959000 DEBUG request{id=19 method="workspace/diagnostic"}:Project::check: Checking all files took 0.007s
```

Now, no more logs regarding reparsing
2025-07-10 18:46:56 +05:30
Alex Waygood
59114d0301 [ty] Consolidate submodule resolving code between types.rs and ide_support.rs (#19256) 2025-07-10 13:10:09 +00:00
Micha Reiser
492f5bf2aa [ty] Remove countme from salsa-structs (#19257) 2025-07-10 11:45:09 +00:00
Alex Waygood
934aaa23f3 [ty] Improve and document equivalence for module-literal types (#19243) 2025-07-10 09:11:10 +00:00
Alex Waygood
59aa869724 [ty] Optimize protocol subtyping by removing expensive and unnecessary equivalence check from the top of Type::has_relation_to() (#19230) 2025-07-10 09:42:27 +01:00
David Peter
edaffa6c4f [ty] Ecosystem analyzer: parallelize, fix race condition (#19252)
## Summary

Pulls in two fixes and a performance optimization:

- Fix a bug with the Markdown table formatting.
- Combine the two `analyze` commands into a single `diff` command. This
means we only need to set up the projects once, which is faster and also
avoids a race condition where projects could change between the two
`analyze` runs.
2025-07-10 10:25:24 +02:00
Micha Reiser
5fb2fb916b [ty] Add completion kind to playground (#19251) 2025-07-10 07:41:59 +00:00
David Peter
801f69a7b4 [ty] Deploy ecosystem diff to Cloudflare pages (#19234)
## Summary

Changes the ecosystem-analyzer workflow to deploy the diff to Cloudflare
pages and post a link in the PR. Also adds a summary statistics to that
PR comment.

## Test Plan

The comment below:
https://github.com/astral-sh/ruff/pull/19234#issuecomment-3053205937. I
previously had some dummy changes on this PR to see a non-zero diff. And
I didn't reapply the label after I reverted that change, such that it's
still visible for reviewers.
2025-07-10 09:03:42 +02:00
Micha Reiser
3926dd8424 [ty] Add semantic token provider to playground (#19232) 2025-07-10 07:50:28 +02:00
Faisal
563268ce53 [docs] add capital one to who's using ruff (#19248)

## Summary

Add Capital One to Who's Using Ruff (README)
Also thanks for the fantastic project!
2025-07-09 23:50:27 +00:00
Dan Parizher
221edcba5c [pyupgrade] Keyword arguments in super should suppress the UP008 fix (#19131)
## Summary

Fixes #19096
2025-07-09 15:13:22 -04:00
chiri
beb98dae7c [flake8-use-pathlib] Add autofixes for PTH100, PTH106, PTH107, PTH108, PTH110, PTH111, PTH112, PTH113, PTH114, PTH115, PTH117, PTH119, PTH120 (#19213)
## Summary

Part of #2331

## Test Plan

update snapshots for preview mode
2025-07-09 14:54:33 -04:00
InSync
05b1b788a0 [ty] Do not run mypy_primer.yaml when all changed files are Markdown files (#19244) 2025-07-09 19:40:43 +01:00
GiGaGon
a18f76158d [flake8-bandit] Make example error out-of-the-box (S412) (#19241)

## Summary

Part of #18972

This PR makes [suspicious-httpoxy-import
(S412)](https://docs.astral.sh/ruff/rules/suspicious-httpoxy-import/#suspicious-httpoxy-import-s412)'s
example error out-of-the-box. Since the checked imports are classes
instead of modules, the example isn't valid. See #19009 for more details
```
PS ~>py -c "import wsgiref.handlers.CGIHandler"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
    import wsgiref.handlers.CGIHandler
ModuleNotFoundError: No module named 'wsgiref.handlers.CGIHandler'; 'wsgiref.handlers' is not a package
PS ~>py -c "from wsgiref.handlers import CGIHandler"
PS ~>
```

[Old example](https://play.ruff.rs/bf48c901-6a46-4795-ba1d-c6af79d5c96e)
```py
import wsgiref.handlers.CGIHandler
```

[New example](https://play.ruff.rs/1f0e1e60-1f0f-484a-9a17-2d0290a68f2a)
```py
from wsgiref.handlers import CGIHandler
```

## Test Plan

N/A, no functionality/tests affected
2025-07-09 14:25:27 -04:00
GiGaGon
8f400bb37a [pydoclint] Make example error out-of-the-box (DOC501) (#19218)

## Summary

Part of #18972

This PR makes [docstring-missing-exception
(DOC501)](https://docs.astral.sh/ruff/rules/docstring-missing-exception/#docstring-missing-exception-doc501)'s
example error out-of-the-box. Since the exceptions in the function body
need to undergo name resolution to figure out if one of them is
`NotImplementedError`, `DOC501` won't lint if the raised name is not
defined. This could be considered a limitation, but should be fine since
`F821` already covers undefined names. I did discover a different edge
case, but it's not relevant to the example.

[Old example](https://play.ruff.rs/d213e87d-e5c7-49d8-a908-931f61f06055)
```py
def calculate_speed(distance: float, time: float) -> float:
    """Calculate speed as distance divided by time.

    Args:
        distance: Distance traveled.
        time: Time spent traveling.

    Returns:
        Speed as distance divided by time.
    """
    try:
        return distance / time
    except ZeroDivisionError as exc:
        raise FasterThanLightError from exc
```

[New example](https://play.ruff.rs/cb41e0b7-b950-4fa0-842d-cecab9c8e842)
```py
class FasterThanLightError(ArithmeticError): ...


def calculate_speed(distance: float, time: float) -> float:
    """Calculate speed as distance divided by time.

    Args:
        distance: Distance traveled.
        time: Time spent traveling.

    Returns:
        Speed as distance divided by time.
    """
    try:
        return distance / time
    except ZeroDivisionError as exc:
        raise FasterThanLightError from exc
```

The "Use instead" section was also updated similarly.

## Test Plan

N/A, no functionality/tests affected
2025-07-09 12:59:31 -04:00
Andrew Gallant
1eff0300d3 [ty] Add "kind" to completion suggestions
This makes use of the new `Type` field on `Completion` to figure out the
"kind" of a `Completion`.

The mapping here is perhaps a little suspect for some cases.

Closes astral-sh/ty#775
2025-07-09 12:03:56 -04:00
Andrew Gallant
fea84e8777 [ty] Add type information to all_members API
Since we generally need (so far) to get the type information of each
suggestion to figure out its boundness anyway, we might as well expose
it here. Completions want to use this information to enhance the
metadata on each suggestion for a more pleasant user experience.

For the most part, this was pretty straightforward. The most exciting
part was computing the types for instance attributes. I'm not 100%
sure it's correct or that it's the best way to do it.
2025-07-09 12:03:56 -04:00
Andrew Gallant
79fe538458 [ty] Expand API of all_members to return a struct
This commit doesn't change any behavior, but makes it so `all_members`
returns a `Vec<Member>` instead of `Vec<Name>`, where a `Member`
contains a `Name`. This gives us an expansion point to include other
data (such as the type of the `Name`).
2025-07-09 12:03:56 -04:00
David Peter
f7234cb474 [ty] Ecosystem analyzer PR comment workflow (#19237)
## Summary

Add PR comment workflow as a prerequisite for
https://github.com/astral-sh/ruff/pull/19234

## Test Plan

Not yet tested. Need to merge this first.
2025-07-09 18:02:05 +02:00
Micha Reiser
35a33f045e [ty] Merge ty_macros into ruff_macros (#19229) 2025-07-09 11:28:21 +00:00
Matthew Mckee
f32f7a3b48 [ty] Fix ClassLiteral.into_callable for dataclasses (#19192)
## Summary

Change `ClassLiteral.into_callable` to also look for `__init__` functions
of type `Type::Callable` (such as synthesized `__init__` functions of
dataclasses).
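
A rough example of the kind of assignability this enables (illustrative only):

```py
from dataclasses import dataclass
from typing import Callable


@dataclass
class Point:
    x: int
    y: int


# The class object is callable through its synthesized __init__, so it should be
# usable where a matching Callable type is expected.
make_point: Callable[[int, int], Point] = Point
print(make_point(1, 2))
```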

Fixes https://github.com/astral-sh/ty/issues/760

## Test Plan

Add subtype test

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-09 10:04:55 +02:00
David Peter
68106dd631 [ty] dataclasses.field support (#19140)
## Summary

Add an initial set of tests for `dataclasses.field`.
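
For reference, a typical `dataclasses.field` usage of the kind covered by the tests (example only):

```py
from dataclasses import dataclass, field


@dataclass
class Config:
    name: str
    tags: list[str] = field(default_factory=list)  # per-instance mutable default


print(Config("demo"))  # Config(name='demo', tags=[])
```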
2025-07-09 09:18:08 +02:00
David Peter
ab3af924ef [ty] Fix panic for attribute expressions with empty value (#19069)
## Summary

closes https://github.com/astral-sh/ty/issues/738

## Test Plan

Added corpus test
2025-07-09 08:46:33 +02:00
Matthew Mckee
05139a323b [ty] Return CallableType from BoundMethodType.into_callable_type (#19193) 2025-07-08 20:33:43 +01:00
Dan Parizher
5eb5ec987d [flake8-bugbear] Support non-context-manager calls in B017 (#19063)
## Summary

Fixes #19050

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-07-08 15:04:55 -04:00
David Peter
1a099886ab [ty] Improved diagnostic for reassignments of Final symbols (#19214)
## Summary

Implement [this
suggestion](https://github.com/astral-sh/ruff/pull/19178#discussion_r2192658146)
by @AlexWaygood.


![image](https://github.com/user-attachments/assets/f183d691-ef6e-43a2-b005-3a32205bc408)
2025-07-08 20:29:07 +02:00
David Peter
a8f2c26143 [ty] Use full range for assignment definitions (#19211)
## Summary

Fix the `full_range` function for (annotated) assignment definition
kinds.

## Test Plan

Update snapshot tests
2025-07-08 19:51:09 +02:00
Junhson Jean-Baptiste
fda188953f [pylint] Update missing-maxsplit-arg docs and error to suggest proper usage (PLC0207) (#18949)

## Summary

Fix #18383 by updating the documentation and error message to explain
that users should use `rsplit` in order to access the last element of
the result with `maxsplit=1`.
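
A quick illustration of why `rsplit` is the right suggestion here:

```py
path = "usr/local/bin"
print(path.split("/", maxsplit=1)[-1])   # 'local/bin' -- not the final component
print(path.rsplit("/", maxsplit=1)[-1])  # 'bin'       -- rsplit splits from the right
```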

## Test Plan

Only documentation and an error message were changed. As such, snapshots
were updated to reflect the new error message. With this change, all
existing tests pass.
2025-07-08 12:53:23 -04:00
Ibraheem Ahmed
546f1b7b39 [ty] Add set -eu to mypy-primer script (#19212)
## Summary

So that the CI job fails if ty panics.
2025-07-08 12:16:09 -04:00
Alex Waygood
7533a0bfdb [ty] Upgrade mypy_primer (#19207) 2025-07-08 15:56:54 +01:00
Charlie Marsh
3ee3434187 Auto-generate environment variable references for ty (#19205)
## Summary

This PR mirrors the environment variable implementation we have in uv:
efc361223c/crates/uv-static/src/env_vars.rs (L6-L7).

See: https://github.com/astral-sh/ty/issues/773.
2025-07-08 10:48:31 -04:00
David Peter
149350bf39 [ty] Enforce typing.Final (#19178)
## Summary

Emit a diagnostic when a `Final`-qualified symbol is modified. This
first iteration only works for name targets. Tests with TODO comments
were added for attribute assignments as well.

related ticket: https://github.com/astral-sh/ty/issues/158
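
A minimal example of the pattern this diagnostic targets:

```py
from typing import Final

MAX_RETRIES: Final = 3

MAX_RETRIES = 5  # reassigning a `Final`-qualified name should now be flagged
```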

## Ecosystem impact

Correctly identified [modification of a `Final`
symbol](7b4164a5f2/sphinx/__init__.py (L44))
(behind a `# type: ignore`):
```diff
- warning[unused-ignore-comment] sphinx/__init__.py:44:56: Unused blanket `type: ignore` directive
```
And the same
[here](5471a37e82/src/trio/_core/_run.py (L128)):
```diff
- warning[unused-ignore-comment] src/trio/_core/_run.py:128:45: Unused blanket `type: ignore` directive
```

## Test Plan

New Markdown tests
2025-07-08 16:26:09 +02:00
Aria Desires
6a42d28867 [ty] Do not report settings diagnostics in check_file (#19206)
This is the trivial first part of
https://github.com/astral-sh/ty/issues/613

Ideally we should surface these elsewhere, but this is definitely Not
the place to surface them.
2025-07-08 10:18:32 -04:00
David Peter
ce2bdb9357 [ty] Conditionally defined dataclass fields (#19197)
## Summary

Fixes a bug where conditionally defined dataclass fields were previously
ignored.

Thanks to @lipefree for reporting this.
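
A minimal sketch of the pattern that was previously ignored (field and condition chosen for illustration):

```py
import sys
from dataclasses import dataclass


@dataclass
class Settings:
    name: str
    if sys.platform == "win32":
        use_registry: bool = False  # conditionally defined dataclass field
```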

## Test Plan

New Markdown tests
2025-07-08 16:16:50 +02:00
GiGaGon
d78d10dd94 [pycodestyle] Make example not raise unnecessary SyntaxError (E114) (#19190)

## Summary

Part of #18972

This PR makes [indentation-with-invalid-multiple-comment
(E114)](https://docs.astral.sh/ruff/rules/indentation-with-invalid-multiple-comment/#indentation-with-invalid-multiple-comment-e114)'s
example not raise a syntax error by adding a 4 space indented `...`. The
example still gave `E114` without this, but adding the `...` both makes
the change in indentation of the comment clearer, and makes it not give
a `SyntaxError`.

## Test Plan

N/A, no functionality/tests affected
2025-07-08 10:00:14 -04:00
GiGaGon
36276143be [pycodestyle] Make example error out-of-the-box (E272) (#19191)

## Summary

Part of #18972

This PR makes [multiple-spaces-before-keyword
(E272)](https://docs.astral.sh/ruff/rules/multiple-spaces-before-keyword/#multiple-spaces-before-keyword-e272)'s
example error out-of-the-box. Since `True` is also a keyword, the old
example raises `E271` instead.

[Old example](https://play.ruff.rs/23ec3774-5038-471c-be3f-1c1e36f85cbb)
```py
True  and False
```

[New example](https://play.ruff.rs/d77432e2-fd99-4db2-9cd0-bc08675c0aca)
```py
x  and y
```

The "Use instead" section was also updated similarly.

## Test Plan

N/A, no functionality/tests affected
2025-07-08 09:58:04 -04:00
Brent Westbrook
2643dc5b7a Rename Diagnostic::syntax_error methods, separate Ord implementation (#19179)
## Summary

This PR addresses some additional feedback on #19053:

- Renaming the `syntax_error` methods to `invalid_syntax` to match the
lint id
- Moving the standalone `diagnostic_from_violation` function to
`Violation::into_diagnostic`
- Removing the `Ord` and `PartialOrd` implementations from `Diagnostic`
in favor of `Diagnostic::start_ordering`

## Test Plan

Existing tests

## Additional Follow-ups

Besides these, I also put the following comments on my todo list, but
they seemed like they might be big enough to have their own PRs:

- [Use `LintId::IOError` for IO
errors](https://github.com/astral-sh/ruff/pull/19053#discussion_r2189425922)
- [Move `Fix` and
`Edit`](https://github.com/astral-sh/ruff/pull/19053#discussion_r2189448647)
- [Avoid so many
unwraps](https://github.com/astral-sh/ruff/pull/19053#discussion_r2189465980)
2025-07-08 09:54:19 -04:00
justin
738692baff [ty] Fix __setattr__ call check precedence during attribute assignment (#18347)
## Summary

Related:

- https://github.com/astral-sh/ty/issues/111
- https://github.com/astral-sh/ruff/pull/17974#discussion_r2108527106

Previously, when validating an attribute assignment, a `__setattr__`
call check was only done if the attribute wasn't found as either a class
member or an instance member.

This PR changes the `__setattr__` call check to be attempted first,
prior to the "[normal
mechanism](https://docs.python.org/3/reference/datamodel.html#object.__setattr__)",
as a defined `__setattr__` should take precedence over setting an
attribute on the instance dictionary directly.

If the return type of `__setattr__` is `Never`, an `invalid-assignment`
diagnostic is emitted.
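
A rough example of the new precedence (assumes Python 3.11+ for `typing.Never`):

```py
from typing import Never


class Frozen:
    x: int = 0

    def __setattr__(self, name: str, value: object) -> Never:
        raise AttributeError("instances are immutable")


obj = Frozen()
obj.x = 1  # `__setattr__` is checked first; its `Never` return type makes this an invalid assignment
```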

Once this is merged, a subsequent PR will synthesize a `__setattr__`
method with a `Never` return type for frozen dataclasses.

## Test Plan

Existing tests + mypy_primer

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-08 15:34:34 +02:00
David Peter
9a4b85d845 [ty] Add tests for dataclass fields annotated with Final (#19202)
## Summary

Adds some tests for dataclass fields that are annotated with `Final`
(see comment
[here](https://github.com/astral-sh/ruff/pull/15768#issuecomment-3044737645)).
Turns out that nothing is needed here, everything already works as
expected (apart from the fact that we can assign to `Final` fields,
which is tracked in https://github.com/astral-sh/ty/issues/158).
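
For reference, the kind of field these tests cover (example only):

```py
from dataclasses import dataclass
from typing import Final


@dataclass
class Release:
    version: Final[str]  # still an init field; set once in __init__


r = Release("1.0.0")
r.version = "2.0.0"  # should be rejected; currently still allowed, per the linked issue
```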

## Test Plan

New Markdown tests
2025-07-08 12:33:46 +00:00
David Peter
6d8c84bde9 [ty] Clarify diagnostic message (#19203)
This diagnostic message was missing the word "type"
2025-07-08 14:21:20 +02:00
Alex Waygood
e16473d260 [ty] Add a new property test: all types assignable to Iterable[object] should be considered iterable (#19186) 2025-07-08 10:54:06 +01:00
Alex Waygood
220a584c11 [ty] Add an instance of an Any subclass to the property tests (#19180) 2025-07-08 10:53:50 +01:00
Dhruv Manilawala
1ddda241f6 [ty] Add an empty line to separate bullet points (#19195)
Without the newline, the rendering would just combine all the bullet
points in a single line like in
https://docs.astral.sh/ty/reference/configuration/#exclude_1. With the
empty line, it would be similar to
https://docs.astral.sh/ty/reference/configuration/#include_1.
2025-07-08 05:10:31 +00:00
UnboundVariable
278f93022a [ty] First cut at semantic token provider (#19108)
This PR implements a basic semantic token provider for ty's language
server. This allows for more accurate semantic highlighting / coloring
within editors that support this LSP functionality.

Here are screen shots that show how code appears in VS Code using the
"rainbow" theme both before and after this change.


![461737617-15630625-d4a9-4ec5-9886-77b00eb7a41a](https://github.com/user-attachments/assets/f963b55b-3195-41d1-ba38-ac2e7508d5f5)


![461737624-d6dcf5f0-7b9b-47de-a410-e202c63e2058](https://github.com/user-attachments/assets/111ca2c5-bb4f-4c8a-a0b5-6c1b2b6f246b)

The token types and modifier tags in this implementation largely mirror
those used in Microsoft's default language server for Python.

The implementation supports two LSP interfaces. The first provides
semantic tokens for an entire document, and the second returns semantic
tokens for a requested range within a document.

The PR includes unit tests. It also includes comments that document
known limitations and areas for future improvements.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-07 15:34:47 -07:00
GiGaGon
4dd2c03144 [flake8-simplify] Make example error out-of-the-box (SIM116) (#19111)

## Summary

Part of #18972

This PR makes [if-else-block-instead-of-dict-lookup
(SIM116)](https://docs.astral.sh/ruff/rules/if-else-block-instead-of-dict-lookup/#if-else-block-instead-of-dict-lookup-sim116)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/718f17ee-fbe2-4520-97c6-153bc0f4502d)
```py
if x == 1:
    return "Hello"
elif x == 2:
    return "Goodbye"
else:
    return "Goodnight"
```

[New example](https://play.ruff.rs/8a9b47b4-da46-4a50-8576-362cdd707cee)
```py
def find_phrase(x):
    if x == 1:
        return "Hello"
    elif x == 2:
        return "Goodbye"
    elif x == 3:
        return "Good morning"
    else:
        return "Goodnight"
```

The "Use instead" section was also updated to reflect the new case. I
also changed it to use an intermediary variable since I find the `return
<long dict>.get` very ugly and hard to read.

## Test Plan

N/A, no functionality/tests affected
2025-07-07 17:17:55 -04:00
GiGaGon
de5264fe13 [flake8-use-pathlib] Make example error out-of-the-box (PTH210) (#19189)

## Summary

Part of #18972

This PR makes [invalid-pathlib-with-suffix
(PTH210)](https://docs.astral.sh/ruff/rules/invalid-pathlib-with-suffix/#invalid-pathlib-with-suffix-pth210)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/d45720cc-fd08-4443-820f-b3bc9756ac59)
```py
path.with_suffix("py")
```

[New example](https://play.ruff.rs/4103669e-19c5-464a-a3fb-6e7d190ce5fd)
```py
from pathlib import Path

path = Path()

path.with_suffix("py")
```

The "Use instead" section was also modified similarly.

## Test Plan

N/A, no functionality/tests affected
2025-07-07 17:04:35 -04:00
chiri
e23780c2e1 [flake8-use-pathlib] Add autofixes for PTH203, PTH204, PTH205 (#18922)

## Summary
Part of #2331 |
[#18763](https://github.com/astral-sh/ruff/pull/18763#issuecomment-2988340436)

## Test Plan
update snapshots
2025-07-07 16:56:21 -04:00
GiGaGon
47f88b3008 [flake8-type-checking] Fix syntax error introduced by fix (TC008) (#19150)

## Summary

I noticed this while working on #18972. If the string targeted by
[quoted-type-alias
(TC008)](https://docs.astral.sh/ruff/rules/quoted-type-alias/#quoted-type-alias-tc008)
is a multiline string, the fix would introduce a syntax error. This PR
fixes that by adding parentheses around the resulting replacement if the
string contains any newline characters (`\n`, `\r`) and the annotation
doesn't already have parentheses, either outside (`("""...""")`) or
inside (`"""(...)"""`).

Failing examples:
https://play.ruff.rs/8793eb95-860a-4bb3-9cbc-6a042fee2946
```
PS D:\rust_projects\ruff> Get-Content issue.py
```
```py
from typing import TypeAlias

OptInt: TypeAlias = """int
| None"""

type OptInt = """int
| None"""
```
```
PS D:\rust_projects\ruff> uvx ruff check issue.py --isolated --select TC008 --fix --diff --preview
```
```

error: Fix introduced a syntax error. Reverting all changes.

This indicates a bug in Ruff. If you could open an issue at:

    https://github.com/astral-sh/ruff/issues/new?title=%5BFix%20error%5D

...quoting the contents of `issue.py`, the rule codes TC008, along with the `pyproject.toml` settings and executed command, we'd be very appreciative!
```

This PR also makes the example error out-of-the-box for #18972

Old example: https://play.ruff.rs/f6cd5adb-7f9b-444d-bb3e-8c045241d93e
```py
OptInt: TypeAlias = "int | None"
```

New example: https://play.ruff.rs/906c1056-72c0-4777-b70b-2114eb9e6eaf
```py
from typing import TypeAlias

OptInt: TypeAlias = "int | None"
```

The import was also added to the "Use instead" section.

## Test Plan

Added multiple test cases
2025-07-07 15:34:14 -05:00
GiGaGon
6e77e1b760 [flake8-pyi] Make example error out-of-the-box (PYI007, PYI008) (#19103)
## Summary

Part of #18972

Both in one PR since they are in the same file

No playground links since the playground does not support rules that
only apply to PYI files

PYI007
---

This PR makes [unrecognized-platform-check
(PYI007)](https://docs.astral.sh/ruff/rules/unrecognized-platform-check/#unrecognized-platform-check-pyi007)'s
example error out-of-the-box

Old example:
```
PS ~\Desktop\New_folder\ruff>echo @"
```
```py
if sys.platform.startswith("linux"):
    # Linux specific definitions
    ...
else:
    # Posix specific definitions
    ...
```
```
"@ | uvx ruff check --isolated --preview --select PYI007 --stdin-filename "test.pyi" -
```
```
All checks passed!
```

New example:
```
PS ~\Desktop\New_folder\ruff>echo @"
```
```py
import sys

if sys.platform is "linux":
    # Linux specific definitions
    ...
else:
    # Posix specific definitions
    ...
```
```
"@ | uvx ruff check --isolated --preview --select PYI007 --stdin-filename "test.pyi" -
```
```snap
test.pyi:3:4: PYI007 Unrecognized `sys.platform` check
  |
1 | import sys
2 |
3 | if sys.platform is "linux":
  |    ^^^^^^^^^^^^^^^^^^^^^^^ PYI007
4 |     # Linux specific definitions
5 |     ...
  |

Found 1 error.
```

Imports were also added to the "use instead" section

> [!NOTE]
> `PYI007` is really hard to trigger: it only fires for a comparison
where the operator is not `!=` or `==`. The original
example raises [complex-if-statement-in-stub
(PYI002)](https://docs.astral.sh/ruff/rules/complex-if-statement-in-stub/#complex-if-statement-in-stub-pyi002)
with or without the `import sys`.

PYI008
---

This PR makes [unrecognized-platform-name
(PYI008)](https://docs.astral.sh/ruff/rules/unrecognized-platform-name/#unrecognized-platform-name-pyi008)'s
example error out-of-the-box

Old example:
```
PS ~\Desktop\New_folder\ruff>echo @"
```
```py
if sys.platform == "linus": ...
```
```
"@ | uvx ruff check --isolated --preview --select PYI008 --stdin-filename "test.pyi" -
```
```
All checks passed!
```

New example:
```
PS ~\Desktop\New_folder\ruff>echo @"
```
```py
import sys

if sys.platform == "linus": ...
```
```
"@ | uvx ruff check --isolated --preview --select PYI008 --stdin-filename "test.pyi" -
```
```snap
test.pyi:3:20: PYI008 Unrecognized platform `linus`
  |
1 | import sys
2 |
3 | if sys.platform == "linus": ...
  |                    ^^^^^^^ PYI008
  |

Found 1 error.
```

Imports were also added to the "use instead" section

> [!NOTE]
> The original example raises `PYI002` instead

## Test Plan

N/A, no functionality/tests affected

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-07 21:11:43 +01:00
renovate[bot]
845a1eeba6 Update Rust crate indicatif to 0.18.0 (#19165)
## Summary

Updates `indicatif` and `tracing-indicatif`.
2025-07-07 13:19:23 -04:00
Ibraheem Ahmed
cd848986d7 [ty] Add separate CI job for memory usage stats (#19134)
## Summary

As discussed in https://github.com/astral-sh/ruff/pull/19059.
2025-07-07 12:17:02 -04:00
Dhruv Manilawala
56258bb3b7 [ty] Add documentation for server traits (#19137)
This PR adds some basic documentation for the traits in the server
implementation.
2025-07-07 14:26:09 +00:00
Dhruv Manilawala
8cf1b876ee Rename to SessionSnapshot, move unwind assertion closer (#19177)
This PR addresses the post-merge review comments from
https://github.com/astral-sh/ruff/pull/19041, specifically it:
- Rename `WorkspaceSnapshot` to `SessionSnapshot`
- Rename `take_workspace_snapshot` to `take_session_snapshot`
- Rename `take_snapshot` to `take_document_snapshot`
- Move `AssertUnwindSafe` closer to the `catch_unwind` call which
requires the assertion
2025-07-07 19:44:23 +05:30
GiGaGon
1fd48120ba [flake8-type-checking] Make example error out-of-the-box (TC001) (#19151)
## Summary

Part of #18972

This PR makes [typing-only-first-party-import
(TC001)](https://docs.astral.sh/ruff/rules/typing-only-first-party-import/#typing-only-first-party-import-tc001)'s
example error out-of-the-box. The old example raised `TC002` instead of
`TC001`, so this makes it a `from .` import to fix that.

[Old example](https://play.ruff.rs/1fdbb293-86fc-4ed2-b2ff-b4836cea0c59)
```py
from __future__ import annotations

import local_module


def func(sized: local_module.Container) -> int:
    return len(sized)
```

[New example](https://play.ruff.rs/b886535c-9203-48bb-812b-1aa306f2c287)
```py
from __future__ import annotations

from . import local_module


def func(sized: local_module.Container) -> int:
    return len(sized)
```

The "Use instead" section was also modified similarly.

## Test Plan

<!-- How was it tested? -->

N/A, no functionality/tests affected
2025-07-07 08:53:37 -05:00
David Peter
e7fb3684e8 [ty] Bare ClassVar annotations (#15768)
## Summary

It was recently clarified in the [typing
spec](https://typing.python.org/en/latest/spec/class-compat.html#classvar)
that bare `ClassVar` annotations are allowed. For annotated assignments
with a right hand side value, the spec requires type checkers to infer
the type as something "to which [the] value is assignable". For a value
of `2`, the spec suggests `int`, `Literal[2]`, or `Any` as examples.
Here, we choose `Unknown | Literal[2]` instead, conforming with our
usual treatment of attribute types.

closes https://github.com/astral-sh/ty/issues/211
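
A minimal sketch of the behavior described above (the exact `reveal_type` rendering shown in the comments is an assumption about ty's output format):

```py
from typing import ClassVar

class C:
    # Bare ClassVar with a right-hand-side value: the spec only requires a
    # type to which `2` is assignable; ty picks `Unknown | Literal[2]`.
    x: ClassVar = 2
    # Explicitly annotated ClassVar: the annotation wins.
    y: ClassVar[int] = 2

reveal_type(C.x)  # expected: Unknown | Literal[2]
reveal_type(C.y)  # expected: int
```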
2025-07-07 15:04:27 +02:00
David Peter
4aaf32476a [ty] Re-enable multithreaded pydantic benchmark (#19176)
## Summary

I played with those numbers a bit locally and `sample_size=3,
sample_count=8` seemed like a rather stable setup. This means a single
sample consists of 3 iterations of checking pydantic multithreaded.
And this is repeated 8 times for statistics. A single check took ~300 ms
previously on the runners, so this should only take 7 s.
2025-07-07 14:28:15 +02:00
Alex Waygood
a6637964d2 [ty] Implement equivalence for protocols with method members (#18659)
## Summary

This PR implements the following pieces of `Protocol` semantics:
1. A protocol with a method member that does not have a fully static
signature should not be considered fully static. I.e., this protocol is
not fully static because `Foo.f` has no return type; we previously
incorrectly considered that it was:
  ```py
  class Foo(Protocol):
      def f(self): ...
  ```
2. Two protocols `P1` and `P2`, both with method members `x`, should be
considered equivalent if the signature of `P1.x` is equivalent to the
signature of `P2.x`. Currently we do not recognize this.

Implementing these semantics requires distinguishing between method
members and non-method members. The stored type of a method member must
be eagerly upcast to a `Callable` type when collecting the protocol's
interface: doing otherwise would mean that it would be hard to implement
equivalence of protocols even in the face of differently ordered unions,
since the two equivalent protocols would have different Salsa IDs even
when normalized.

The semantics implemented by this PR are that we consider something a
method member if:
1. It is accessible on the class itself; and
2. It is a function-like callable: a callable type that also has a
`__get__` method, meaning it can be used as a method when accessed on
instances.
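
Concretely, the two rules above roughly amount to the following (an illustrative sketch only, not code from this PR):

```py
from typing import Protocol

class NotFullyStatic(Protocol):
    # Method member without a return annotation: its signature is not fully
    # static, so the protocol as a whole is not considered fully static.
    def f(self): ...

class P1(Protocol):
    def f(self, x: int) -> str: ...

class P2(Protocol):
    def f(self, x: int) -> str: ...

# P1 and P2 have method members `f` with equivalent signatures, so the two
# protocols should now be recognized as equivalent types.
```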

Note that the spec has complicated things to say about classmethod
members and staticmethod members. These semantics are not implemented by
this PR; they are all deferred for now.

The infrastructure added in this PR fixes bugs in its own right, but
also lays the groundwork for implementing subtyping and assignability
rules for method members of protocols. A (currently failing) test is
added to verify this.

## Test Plan

mdtests
2025-07-07 12:28:32 +01:00
David Peter
c15aa572ff [ty] Use RHS inferred type for bare Final symbols (#19142)
## Summary

Infer the type of symbols with a `Final` qualifier as their
right-hand-side inferred type:
```py
x: Final = 1
y: Final[int] = 1

def _():
    reveal_type(x)  # previously: Unknown, now: Literal[1]
    reveal_type(y)  # int, same as before
```
Part of https://github.com/astral-sh/ty/issues/158

## Ecosystem analysis

### aiohttp

```diff
aiohttp (https://github.com/aio-libs/aiohttp)
+ error[invalid-argument-type] aiohttp/compression_utils.py:131:54: Argument to bound method `__init__` is incorrect: Expected `ZLibBackendProtocol`, found `<module 'zlib'>`
```

This code [creates a
protocol](a83597fa88/aiohttp/compression_utils.py (L52-L77))
that looks like
```pyi
class ZLibBackendProtocol(Protocol):
    Z_FULL_FLUSH: int
    Z_SYNC_FLUSH: int
    # more fields…
```

It then [tries to
assign](a83597fa88/aiohttp/compression_utils.py (L131))
the module literal `zlib` to that protocol. However, in typeshed, these
`zlib` members are annotated like this:
```pyi
Z_FULL_FLUSH: Final = 3
Z_SYNC_FLUSH: Final = 2
```
With the proposed change here, we now infer these as `Literal[3]` /
`Literal[2]`. Since protocol members have to be assignable both ways
(invariance), we do not consider `zlib` assignable to this protocol
anymore.
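
Reduced to a self-contained example, the incompatibility looks roughly like this (same member names as the aiohttp protocol; the exact diagnostic is omitted):

```py
from typing import Final, Protocol

class ZLibBackendProtocol(Protocol):
    # Mutable attribute member: a matching implementation must have a type
    # that is assignable to `int` *and* to which `int` is assignable.
    Z_FULL_FLUSH: int

# In typeshed, the corresponding module-level constant is declared as:
Z_FULL_FLUSH: Final = 3  # now inferred as Literal[3] instead of Unknown

# Literal[3] is assignable to int, but int is not assignable to Literal[3],
# so a module with this member no longer satisfies the protocol.
```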

That seems rather unfortunate. Not sure who is to blame here? That
`ZLibBackendProtocol` protocol should probably not annotate the members
with `int`, given that `typeshed` doesn't use an explicit annotation
here either? But what should they do instead? Annotate those fields with
`Any`?

Or is it another case where we should consider literal-widening?

FYI @AlexWaygood 

### cloud-init

```diff
cloud-init (https://github.com/canonical/cloud-init)
+ error[invalid-argument-type] tests/unittests/sources/test_smartos.py:575:32: Argument to function `oct` is incorrect: Expected `SupportsIndex`, found `int | float`
+ error[invalid-argument-type] tests/unittests/sources/test_smartos.py:593:32: Argument to function `oct` is incorrect: Expected `SupportsIndex`, found `int | float`
+ error[invalid-argument-type] tests/unittests/sources/test_smartos.py:647:35: Argument to function `oct` is incorrect: Expected `SupportsIndex`, found `int | float`
```

New false positives on expressions like
`oct(os.stat(legacy_script_f)[stat.ST_MODE])`. We now correctly infer
`stat.ST_MODE` as `Literal[0]`, because in typeshed, it is annotated as
`ST_MODE: Final = 0`. `os.stat` returns a `stat_result` which is a tuple
subclass. Accessing it at index 0 should return an `int`, but we
currently return `int | float`, presumably due to missing support for
tuple subclasses (FYI @AlexWaygood):
```pyi
class stat_result(structseq[float], tuple[int, int, int, int, int, int, int, float, float, float]):
```
In terms of `typing.Final`, things are working as expected here.
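
A reduced version of the cloud-init pattern (the file path is illustrative):

```py
import os
import stat

# `stat.ST_MODE` is declared as `ST_MODE: Final = 0` in typeshed, so it is
# now inferred as Literal[0].
mode = os.stat("some_file")[stat.ST_MODE]

# Indexing os.stat_result (a tuple subclass) at 0 should yield `int`, but ty
# currently infers `int | float`, which is why `oct(mode)` is flagged.
oct(mode)
```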


### pywin-32

Many new false positives similar to:

```diff
pywin32 (https://github.com/mhammond/pywin32)
+ error[invalid-argument-type] Pythonwin/pywin/docking/DockingBar.py:288:55: Argument to function `LoadCursor` is incorrect: Expected `PyResourceId`, found `Literal[32645]`
```

The line in question calls `win32api.LoadCursor(0, win32con.IDC_ARROW)`.
The `win32con.IDC_ARROW` symbol is annotated as [`IDC_ARROW: Final =
32512` in
typeshed](2408c028f4/stubs/pywin32/win32/lib/win32con.pyi (L594)),
but
[`LoadCursor`](2408c028f4/stubs/pywin32/win32/win32api.pyi (L197))
expects a
[`PyResourceId`](2408c028f4/stubs/pywin32/_win32typing.pyi (L1252)),
which is an empty class. So.. this seems like a true positive to me,
unless that typeshed annotation of `IDC_ARROW` is meant to imply that
the type should be `Unknown`/`Any`?

### streamlit

```diff
streamlit (https://github.com/streamlit/streamlit)
+ error[invalid-argument-type] lib/streamlit/string_util.py:163:37: Argument to bound method `translate` is incorrect: Expected `bytes`, found `bytearray`
```

This looks like a true positive? The code calls `inp.translate(None,
TEXTCHARS)`. `inp` is `bytes`, and `TEXTCHARS` is:
```py
TEXTCHARS: Final = bytearray(
    {7, 8, 9, 10, 12, 13, 27} | set(range(0x20, 0x100)) - {0x7F}
)
```
~~We now infer this as `bytearray`, but `bytes.translate` [expects
`bytes` for its `delete`
parameter](2408c028f4/stdlib/builtins.pyi (L710)).
This seems to work at runtime, so maybe the typeshed annotation is
wrong?~~ (Edit: this is now fixed in typeshed)
```pycon
>>> b"abc".translate(None, bytearray(b"b"))
b'ac'
```

## rotki

```diff
+ error[invalid-return-type] rotkehlchen/chain/ethereum/modules/yearn/decoder.py:412:13: Return type does not match returned value: expected `dict[Unknown, str]`, found `dict[Unknown, Literal["yearn-v1", "yearn-v2"]]`
```

The code in question looks like
```py
    def addresses_to_counterparties(self) -> dict[ChecksumEvmAddress, str]:
        return dict.fromkeys(self.vaults, CPT_BEEFY_FINANCE)
```
where `CPT_BEEFY_FINANCE: Final = 'beefy_finance'`. We previously
inferred the value type of the returned `dict` as `Unknown`, and now we
infer it as `Literal["beefy_finance"]`, which does not match the
annotated return type because `dict` is invariant in the value type.
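
For reference, a minimal reproduction of the invariance issue (names are illustrative, not taken from rotki):

```py
from typing import Final

CPT: Final = "beefy_finance"  # now inferred as Literal["beefy_finance"]

def addresses_to_counterparties(keys: list[str]) -> dict[str, str]:
    # Inferred as dict[str, Literal["beefy_finance"]], which is not assignable
    # to dict[str, str] because dict is invariant in its value type.
    return dict.fromkeys(keys, CPT)
```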

```diff
+ error[invalid-argument-type] rotkehlchen/tests/unit/decoders/test_curve.py:249:9: Argument is incorrect: Expected `int`, found `FVal`
```
There are true positives that were previously silenced through the
`Unknown`.

## Test Plan

New Markdown tests
2025-07-07 13:16:40 +02:00
Ivan Yakushev
e0b7f496f2 [ty] Support declaration-only attributes (#19048)
## Summary

Following ty issue [#698](https://github.com/astral-sh/ty/issues/698)
this PR adds support for declaration-only attributes.

closes #698

## Test Plan

Tested against mdtest (specifically attributes).

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-07 12:55:32 +02:00
github-actions[bot]
b6edfbc70f [ty] Sync vendored typeshed stubs (#19174)
Close and reopen this PR to trigger CI

---------

Co-authored-by: typeshedbot <>
Co-authored-by: David Peter <mail@david-peter.de>
2025-07-07 12:00:09 +02:00
renovate[bot]
9be8276616 Update dependency pyodide to ^0.28.0 (#19164)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:38:29 +02:00
renovate[bot]
d0099fd012 Update NPM Development dependencies (#19170)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:33:59 +02:00
renovate[bot]
0d3802998b Update taiki-e/install-action action to v2.56.7 (#19169)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:25:43 +02:00
renovate[bot]
1a03b5841b Update pre-commit dependencies (#19162)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit)
| repository | patch | `v0.12.1` -> `v0.12.2` |
| [crate-ci/typos](https://redirect.github.com/crate-ci/typos) |
repository | minor | `v1.33.1` -> `v1.34.0` |
|
[python-jsonschema/check-jsonschema](https://redirect.github.com/python-jsonschema/check-jsonschema)
| repository | patch | `0.33.1` -> `0.33.2` |
|
[woodruffw/zizmor-pre-commit](https://redirect.github.com/woodruffw/zizmor-pre-commit)
| repository | minor | `v1.10.0` -> `v1.11.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.2`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.2)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.1...v0.12.2)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.2

</details>

<details>
<summary>crate-ci/typos (crate-ci/typos)</summary>

###
[`v1.34.0`](https://redirect.github.com/crate-ci/typos/releases/tag/v1.34.0)

[Compare
Source](https://redirect.github.com/crate-ci/typos/compare/v1.33.1...v1.34.0)

#### \[1.34.0] - 2025-06-30

##### Features

- Updated the dictionary with the [June
2025](https://redirect.github.com/crate-ci/typos/issues/1309) changes

</details>

<details>
<summary>python-jsonschema/check-jsonschema
(python-jsonschema/check-jsonschema)</summary>

###
[`v0.33.2`](https://redirect.github.com/python-jsonschema/check-jsonschema/blob/HEAD/CHANGELOG.rst#0332)

[Compare
Source](https://redirect.github.com/python-jsonschema/check-jsonschema/compare/0.33.1...0.33.2)

- Update vendored schemas: bitbucket-pipelines, mergify, renovate
(2025-06-29)
- Fix a bug in the evaluation of the `date-time` format on non-string
data,
which incorrectly rejected values for which `string` was one of several
  valid types. Thanks :user:`katylava`! (:issue:`571`)

</details>

<details>
<summary>woodruffw/zizmor-pre-commit
(woodruffw/zizmor-pre-commit)</summary>

###
[`v1.11.0`](https://redirect.github.com/zizmorcore/zizmor-pre-commit/releases/tag/v1.11.0)

[Compare
Source](https://redirect.github.com/woodruffw/zizmor-pre-commit/compare/v1.10.0...v1.11.0)

See: https://github.com/zizmorcore/zizmor/releases/tag/v1.11.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4xNy4yIiwidXBkYXRlZEluVmVyIjoiNDEuMTcuMiIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW50ZXJuYWwiXX0=-->

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2025-07-07 04:07:44 +00:00
renovate[bot]
a7fafb96be Update dependency smol-toml to v1.4.1 (#19161)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [smol-toml](https://redirect.github.com/squirrelchat/smol-toml) |
[`1.4.0` ->
`1.4.1`](https://renovatebot.com/diffs/npm/smol-toml/1.4.0/1.4.1) |
[![age](https://developer.mend.io/api/mc/badges/age/npm/smol-toml/1.4.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/smol-toml/1.4.0/1.4.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>squirrelchat/smol-toml (smol-toml)</summary>

###
[`v1.4.1`](https://redirect.github.com/squirrelchat/smol-toml/releases/tag/v1.4.1)

[Compare
Source](https://redirect.github.com/squirrelchat/smol-toml/compare/v1.4.0...v1.4.1)

A little fix for `asNeeded` not being implemented correctly.

#### What's Changed

fix: properly implement asNeeded by
[@&#8203;cyyynthia](https://redirect.github.com/cyyynthia)

**Full Changelog**:
https://github.com/squirrelchat/smol-toml/compare/v1.4.0...v1.4.1

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4xNy4yIiwidXBkYXRlZEluVmVyIjoiNDEuMTcuMiIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW50ZXJuYWwiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 08:54:52 +05:30
renovate[bot]
22d809b8ce Update dependency ruff to v0.12.2 (#19160)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [ruff](https://docs.astral.sh/ruff)
([source](https://redirect.github.com/astral-sh/ruff),
[changelog](https://redirect.github.com/astral-sh/ruff/blob/main/CHANGELOG.md))
| `==0.12.1` -> `==0.12.2` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/ruff/0.12.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/ruff/0.12.1/0.12.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>astral-sh/ruff (ruff)</summary>

###
[`v0.12.2`](https://redirect.github.com/astral-sh/ruff/blob/HEAD/CHANGELOG.md#0122)

[Compare
Source](https://redirect.github.com/astral-sh/ruff/compare/0.12.1...0.12.2)

##### Preview features

- \[`flake8-pyi`] Expand `Optional[A]` to `A | None` (`PYI016`)
([#&#8203;18572](https://redirect.github.com/astral-sh/ruff/pull/18572))
- \[`pyupgrade`] Mark `UP008` fix safe if no comments are in range
([#&#8203;18683](https://redirect.github.com/astral-sh/ruff/pull/18683))

##### Bug fixes

- \[`flake8-comprehensions`] Fix `C420` to prepend whitespace when
needed
([#&#8203;18616](https://redirect.github.com/astral-sh/ruff/pull/18616))
- \[`perflint`] Fix `PERF403` panic on attribute or subscription loop
variable
([#&#8203;19042](https://redirect.github.com/astral-sh/ruff/pull/19042))
- \[`pydocstyle`] Fix `D413` infinite loop for parenthesized docstring
([#&#8203;18930](https://redirect.github.com/astral-sh/ruff/pull/18930))
- \[`pylint`] Fix `PLW0108` autofix introducing a syntax error when the
lambda's body contains an assignment expression
([#&#8203;18678](https://redirect.github.com/astral-sh/ruff/pull/18678))
- \[`refurb`] Fix false positive on empty tuples (`FURB168`)
([#&#8203;19058](https://redirect.github.com/astral-sh/ruff/pull/19058))
- \[`ruff`] Allow more `field` calls from `attrs` (`RUF009`)
([#&#8203;19021](https://redirect.github.com/astral-sh/ruff/pull/19021))
- \[`ruff`] Fix syntax error introduced for an empty string followed by
a u-prefixed string (`UP025`)
([#&#8203;18899](https://redirect.github.com/astral-sh/ruff/pull/18899))

##### Rule changes

- \[`flake8-executable`] Allow `uvx` in shebang line (`EXE003`)
([#&#8203;18967](https://redirect.github.com/astral-sh/ruff/pull/18967))
- \[`pandas`] Avoid flagging `PD002` if `pandas` is not imported
([#&#8203;18963](https://redirect.github.com/astral-sh/ruff/pull/18963))
- \[`pyupgrade`] Avoid PEP-604 unions with `typing.NamedTuple` (`UP007`,
`UP045`)
([#&#8203;18682](https://redirect.github.com/astral-sh/ruff/pull/18682))

##### Documentation

- Document link between `import-outside-top-level (PLC0415)` and
`lint.flake8-tidy-imports.banned-module-level-imports`
([#&#8203;18733](https://redirect.github.com/astral-sh/ruff/pull/18733))
- Fix description of the `format.skip-magic-trailing-comma` example
([#&#8203;19095](https://redirect.github.com/astral-sh/ruff/pull/19095))
- \[`airflow`] Make `AIR302` example error out-of-the-box
([#&#8203;18988](https://redirect.github.com/astral-sh/ruff/pull/18988))
- \[`airflow`] Make `AIR312` example error out-of-the-box
([#&#8203;18989](https://redirect.github.com/astral-sh/ruff/pull/18989))
- \[`flake8-annotations`] Make `ANN401` example error out-of-the-box
([#&#8203;18974](https://redirect.github.com/astral-sh/ruff/pull/18974))
- \[`flake8-async`] Make `ASYNC100` example error out-of-the-box
([#&#8203;18993](https://redirect.github.com/astral-sh/ruff/pull/18993))
- \[`flake8-async`] Make `ASYNC105` example error out-of-the-box
([#&#8203;19002](https://redirect.github.com/astral-sh/ruff/pull/19002))
- \[`flake8-async`] Make `ASYNC110` example error out-of-the-box
([#&#8203;18975](https://redirect.github.com/astral-sh/ruff/pull/18975))
- \[`flake8-async`] Make `ASYNC210` example error out-of-the-box
([#&#8203;18977](https://redirect.github.com/astral-sh/ruff/pull/18977))
- \[`flake8-async`] Make `ASYNC220`, `ASYNC221`, and `ASYNC222` examples
error out-of-the-box
([#&#8203;18978](https://redirect.github.com/astral-sh/ruff/pull/18978))
- \[`flake8-async`] Make `ASYNC251` example error out-of-the-box
([#&#8203;18990](https://redirect.github.com/astral-sh/ruff/pull/18990))
- \[`flake8-bandit`] Make `S201` example error out-of-the-box
([#&#8203;19017](https://redirect.github.com/astral-sh/ruff/pull/19017))
- \[`flake8-bandit`] Make `S604` and `S609` examples error
out-of-the-box
([#&#8203;19049](https://redirect.github.com/astral-sh/ruff/pull/19049))
- \[`flake8-bugbear`] Make `B028` example error out-of-the-box
([#&#8203;19054](https://redirect.github.com/astral-sh/ruff/pull/19054))
- \[`flake8-bugbear`] Make `B911` example error out-of-the-box
([#&#8203;19051](https://redirect.github.com/astral-sh/ruff/pull/19051))
- \[`flake8-datetimez`] Make `DTZ011` example error out-of-the-box
([#&#8203;19055](https://redirect.github.com/astral-sh/ruff/pull/19055))
- \[`flake8-datetimez`] Make `DTZ901` example error out-of-the-box
([#&#8203;19056](https://redirect.github.com/astral-sh/ruff/pull/19056))
- \[`flake8-pyi`] Make `PYI032` example error out-of-the-box
([#&#8203;19061](https://redirect.github.com/astral-sh/ruff/pull/19061))
- \[`flake8-pyi`] Make example error out-of-the-box (`PYI014`, `PYI015`)
([#&#8203;19097](https://redirect.github.com/astral-sh/ruff/pull/19097))
- \[`flake8-pyi`] Make example error out-of-the-box (`PYI042`)
([#&#8203;19101](https://redirect.github.com/astral-sh/ruff/pull/19101))
- \[`flake8-pyi`] Make example error out-of-the-box (`PYI059`)
([#&#8203;19080](https://redirect.github.com/astral-sh/ruff/pull/19080))
- \[`flake8-pyi`] Make example error out-of-the-box (`PYI062`)
([#&#8203;19079](https://redirect.github.com/astral-sh/ruff/pull/19079))
- \[`flake8-pytest-style`] Make example error out-of-the-box (`PT023`)
([#&#8203;19104](https://redirect.github.com/astral-sh/ruff/pull/19104))
- \[`flake8-pytest-style`] Make example error out-of-the-box (`PT030`)
([#&#8203;19105](https://redirect.github.com/astral-sh/ruff/pull/19105))
- \[`flake8-quotes`] Make example error out-of-the-box (`Q003`)
([#&#8203;19106](https://redirect.github.com/astral-sh/ruff/pull/19106))
- \[`flake8-simplify`] Make example error out-of-the-box (`SIM110`)
([#&#8203;19113](https://redirect.github.com/astral-sh/ruff/pull/19113))
- \[`flake8-simplify`] Make example error out-of-the-box (`SIM113`)
([#&#8203;19109](https://redirect.github.com/astral-sh/ruff/pull/19109))
- \[`flake8-simplify`] Make example error out-of-the-box (`SIM401`)
([#&#8203;19110](https://redirect.github.com/astral-sh/ruff/pull/19110))
- \[`pyflakes`] Fix backslash in docs (`F621`)
([#&#8203;19098](https://redirect.github.com/astral-sh/ruff/pull/19098))
- \[`pylint`] Fix `PLC0415` example
([#&#8203;18970](https://redirect.github.com/astral-sh/ruff/pull/18970))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4xNy4yIiwidXBkYXRlZEluVmVyIjoiNDEuMTcuMiIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW50ZXJuYWwiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 08:54:35 +05:30
renovate[bot]
cc190a550c Update Rust crate serde_with to v3.14.0 (#19167)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [serde_with](https://redirect.github.com/jonasbb/serde_with) |
workspace.dependencies | minor | `3.12.0` -> `3.14.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>jonasbb/serde_with (serde_with)</summary>

###
[`v3.14.0`](https://redirect.github.com/jonasbb/serde_with/releases/tag/v3.14.0):
serde_with v3.14.0

[Compare
Source](https://redirect.github.com/jonasbb/serde_with/compare/v3.13.0...v3.14.0)

##### Added

- Add support for `Range`, `RangeFrom`, `RangeTo`, `RangeInclusive`
([#&#8203;851](https://redirect.github.com/jonasbb/serde_with/issues/851))
  `RangeToInclusive` is currently unsupported by serde.
- Add `schemars` implementations for `Bound`, `Range`, `RangeFrom`,
`RangeTo`, `RangeInclusive`.
- Added support for `schemars` v1 under the `schemars_1` feature flag

###
[`v3.13.0`](https://redirect.github.com/jonasbb/serde_with/releases/tag/v3.13.0):
serde_with v3.13.0

[Compare
Source](https://redirect.github.com/jonasbb/serde_with/compare/v3.12.0...v3.13.0)

##### Added

- Added support for `schemars` v0.9.0 under the `schemars_0_9` feature
flag by [@&#8203;swlynch99](https://redirect.github.com/swlynch99)
([#&#8203;849](https://redirect.github.com/jonasbb/serde_with/issues/849))
- Introduce `SerializeDisplayAlt` derive macro
([#&#8203;833](https://redirect.github.com/jonasbb/serde_with/issues/833))
An alternative to the `SerializeDisplay` macro except instead of using
the
  plain formatting like `format!("{}", ...)`, it serializes with the
  `Formatter::alternate` flag set to true, like `format!("{:#}", ...)`

##### Changed

- Generalize `serde_with::rust::unwrap_or_skip` to support deserializing
references by [@&#8203;beroal](https://redirect.github.com/beroal)
([#&#8203;832](https://redirect.github.com/jonasbb/serde_with/issues/832))
- Bump MSRV to 1.71, since that is required for the `jsonschema`
dev-dependency.
- Make `serde_conv` available without the `std` feature by
[@&#8203;arilou](https://redirect.github.com/arilou)
([#&#8203;839](https://redirect.github.com/jonasbb/serde_with/issues/839))
- Bump MSRV to 1.74, since that is required for `schemars` v0.9.0 by
[@&#8203;swlynch99](https://redirect.github.com/swlynch99)
([#&#8203;849](https://redirect.github.com/jonasbb/serde_with/issues/849))

##### Fixed

- Make the `DurationSeconds` types and other variants more accessible
even without `std`
([#&#8203;845](https://redirect.github.com/jonasbb/serde_with/issues/845))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4xNy4yIiwidXBkYXRlZEluVmVyIjoiNDEuMTcuMiIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW50ZXJuYWwiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 08:52:38 +05:30
renovate[bot]
ab024f9de1 Update Rust crate notify to v8.1.0 (#19166)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [notify](https://redirect.github.com/notify-rs/notify) |
workspace.dependencies | minor | `8.0.0` -> `8.1.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>notify-rs/notify (notify)</summary>

###
[`v8.1.0`](https://redirect.github.com/notify-rs/notify/blob/HEAD/CHANGELOG.md#notify-810-2025-07-03)

[Compare
Source](https://redirect.github.com/notify-rs/notify/compare/notify-8.0.0...notify-8.1.0)

- FEATURE: added support for the [`flume`](https://docs.rs/flume) crate
- FIX: kqueue-backend: do not double unwatch top-level directory when
recursively unwatching
\[[#&#8203;683](https://redirect.github.com/notify-rs/notify/issues/683)]
- FIX: Return the crate error `PathNotFound` instead bubbling up the
std::io error
\[[#&#8203;685](https://redirect.github.com/notify-rs/notify/issues/685)]
- FIX: fix server hangs when trashing folders on Windows
\[[#&#8203;674](https://redirect.github.com/notify-rs/notify/issues/674)]

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4xNy4yIiwidXBkYXRlZEluVmVyIjoiNDEuMTcuMiIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW50ZXJuYWwiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 08:51:09 +05:30
renovate[bot]
3d2a0c3cd6 Update Swatinem/rust-cache action to v2.8.0 (#19168)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [Swatinem/rust-cache](https://redirect.github.com/Swatinem/rust-cache)
| action | minor | `v2.7.8` -> `v2.8.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>Swatinem/rust-cache (Swatinem/rust-cache)</summary>

###
[`v2.8.0`](https://redirect.github.com/Swatinem/rust-cache/releases/tag/v2.8.0)

[Compare
Source](https://redirect.github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0)

##### What's Changed

- Add cache-workspace-crates feature by
[@&#8203;jbransen](https://redirect.github.com/jbransen) in
[https://github.com/Swatinem/rust-cache/pull/246](https://redirect.github.com/Swatinem/rust-cache/pull/246)
- Feat: support warpbuild cache provider by
[@&#8203;stegaBOB](https://redirect.github.com/stegaBOB) in
[https://github.com/Swatinem/rust-cache/pull/247](https://redirect.github.com/Swatinem/rust-cache/pull/247)

##### New Contributors

- [@&#8203;jbransen](https://redirect.github.com/jbransen) made their
first contribution in
[https://github.com/Swatinem/rust-cache/pull/246](https://redirect.github.com/Swatinem/rust-cache/pull/246)
- [@&#8203;stegaBOB](https://redirect.github.com/stegaBOB) made their
first contribution in
[https://github.com/Swatinem/rust-cache/pull/247](https://redirect.github.com/Swatinem/rust-cache/pull/247)

**Full Changelog**:
https://github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4xNy4yIiwidXBkYXRlZEluVmVyIjoiNDEuMTcuMiIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW50ZXJuYWwiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 08:50:29 +05:30
Alex Waygood
08d8819c8a [ty] Fix descriptor lookups for most types that overlap with None (#19120) 2025-07-05 19:34:23 +01:00
Alex Waygood
44f2f77748 [ty] Add a DateType benchmark (#19148)
## Summary

The [`DateType`](https://github.com/glyph/DateType) library has some
very large protocols in it. Currently we type-check it quite quickly,
but the current version of https://github.com/astral-sh/ruff/pull/18659
makes our execution time on this library pathologically slow. That PR
doesn't seem to have a big impact on any of our current benchmarks,
however, so it seems we have some missing coverage in this area; I
therefore propose that we add `DateType` as a benchmark.

Currently the benchmark runs pretty quickly (about half the runtime of
attrs, which is our fastest real-world benchmark currently), and the
library has 0 third-party dependencies, so the benchmark is quick to
setup.

## Test Plan

`cargo bench -p ruff_benchmark --bench=ty`
2025-07-04 21:11:47 +01:00
NamelessGO
1c710c2840 Add Weblate to Who's Using Ruff (#19124)
https://github.com/WeblateOrg/weblate/issues/8867
2025-07-04 15:24:44 -04:00
Abhijeet Prasad Bodas
f4bd74ab6a [ty] Correctly handle calls to functions marked as returning Never / NoReturn (#18333)
## Summary

`ty` does not understand that calls to functions which have been
annotated as having a return type of `Never` / `NoReturn` are terminal.

This PR fixes that by adding new reachability constraints when call
expressions are seen. If a call expression evaluates to `Never`, the
code following it is considered unreachable. Note that, for
adding these constraints, we only consider call expressions at the
statement level, and that too only inside function scopes. This is
because otherwise, the number of such constraints becomes too high, and
evaluating them later on during type inference results in a major
performance degradation.

Fixes https://github.com/astral-sh/ty/issues/180
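
A minimal illustration of the new behavior (the diagnostic wording in the comment is paraphrased):

```py
from typing import NoReturn

def fail(msg: str) -> NoReturn:
    raise RuntimeError(msg)

def f(flag: bool) -> int:
    if flag:
        return 1
    # This statement-level call evaluates to Never, so the rest of this scope
    # is now treated as unreachable; ty no longer reports that the function
    # "can implicitly return None".
    fail("flag must be set")
```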

## Test Plan

New mdtests.

## Ecosystem changes

This PR removes the following false-positives:
- "Function can implicitly return `None`, which is not assignable to
...".
- "Name `foo` used when possibly not defind" - because the branch in
which it is not defined has a `NoReturn` call, or when `foo` was
imported in a `try`, and the except had a `NoReturn` call.

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-04 11:52:52 -07:00
GiGaGon
a33cff2b12 Fix F701 to F707 errors in tests (#19125)
## Summary

Per @ntBre in https://github.com/astral-sh/ruff/pull/19111, it would be
a good idea to make the tests no longer have these syntax errors, so
this PR updates the tests and snapshots.

`B031` gave me a lot of trouble since the ending test of declaring a
function named `groupby` makes it so that inside other functions, it's
unclear which `groupby` is referred to since it depends on when the
function is called. To fix it I made each function have it's own `from
itertools import groupby` so there's no more ambiguity.
2025-07-04 13:43:18 -05:00
GiGaGon
f48a34fbab [pylint, pyupgrade] Fix syntax errors in examples (PLW1501, UP028) (#19127)
## Summary

From me and @ntBre's discussion in #19111.

This PR makes these two examples into valid code, since they previously
had `F701`-`F707` syntax errors. `SIM110` was already fixed in a
different PR, I just forgot to pull.
2025-07-04 13:38:37 -05:00
Carl Meyer
411cccb35e [ty] detect cycles in Type::is_disjoint_from (#19139) 2025-07-04 06:31:44 -07:00
Carl Meyer
7712c2fd15 [ty] don't allow first-party code to shadow stdlib types module (#19128) 2025-07-04 10:36:26 +00:00
David Peter
25bdb67d9a [ty] Remove TODOs regarding legacy generics (#19141) 2025-07-04 10:45:06 +02:00
Matthew Mckee
3be83d36a5 [ty] Add into_callable method for Type (#19130)
## Summary

Was just playing around with this, there's definitely more to do with
this function, but it seems like maybe a better option than having so
many arms in has_relation_to for (_, Callable).

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
Co-authored-by: Carl Meyer <carl@astral.sh>
2025-07-03 19:04:03 -07:00
Alex Waygood
333191b7f7 [ty] Rewrite Type::any_over_type using a new generalised TypeVisitor trait (#19094) 2025-07-03 18:19:23 +00:00
Brent Westbrook
77a5c5ac80 Combine OldDiagnostic and Diagnostic (#19053)
## Summary

This PR is a collaboration with @AlexWaygood from our pairing session
last Friday.

The main goal here is removing `ruff_linter::message::OldDiagnostic` in
favor of
using `ruff_db::diagnostic::Diagnostic` directly. This involved a few
major steps:

- Transferring the fields
- Transferring the methods and trait implementations, where possible
- Converting some constructor methods to free functions
- Moving the `SecondaryCode` struct
- Updating the method names

I'm hoping that some of the methods, especially those in the
`expect_ruff_*`
family, won't be necessary long-term, but I avoided trying to replace
them
entirely for now to keep the already-large diff a bit smaller.

### Related refactors

Alex and I noticed a few refactoring opportunities while looking at the
code,
specifically the very similar implementations for
`create_parse_diagnostic`,
`create_unsupported_syntax_diagnostic`, and
`create_semantic_syntax_diagnostic`.
We combined these into a single generic function, which I then copied
into
`ruff_linter::message` with some small changes and a TODO to combine
them in the
future.

I also deleted the `DisplayParseErrorType` and `TruncateAtNewline` types
for
reporting parse errors. These were added in #4124, I believe to work
around the
error messages from LALRPOP. Removing these didn't affect any tests, so
I think
they were unnecessary now that we fully control the error messages from
the
parser.

On a more minor note, I factored out some calls to the
`OldDiagnostic::filename`
(now `Diagnostic::expect_ruff_filename`) function to avoid repeatedly
allocating
`String`s in some places.

### Snapshot changes

The `show_statistics_syntax_errors` integration test changed because the
`OldDiagnostic::name` method used `syntax-error` instead of
`invalid-syntax`
like in ty. I think this (`--statistics`) is one of the only places we
actually
use this name for syntax errors, so I hope this is okay. An alternative
is to
use `syntax-error` in ty too.

The other snapshot changes are from removing this code, as discussed on

[Discord](https://discord.com/channels/1039017663004942429/1228460843033821285/1388252408848847069):


34052a1185/crates/ruff_linter/src/message/mod.rs (L128-L135)

I think both of these are technically breaking changes, but they only
affect
syntax errors and are very narrow in scope, while also pretty
substantially
simplifying the refactor, so I hope they're okay to include in a patch
release.

## Test plan

Existing tests, with the adjustments mentioned above

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-03 13:01:09 -04:00
1490 changed files with 140331 additions and 28469 deletions

View File

@@ -143,12 +143,12 @@ jobs:
env:
MERGE_BASE: ${{ steps.merge_base.outputs.sha }}
run: |
if git diff --quiet "${MERGE_BASE}...HEAD" -- ':**' \
':!**/*.md' \
':crates/ty_python_semantic/resources/mdtest/**/*.md' \
# NOTE: Do not exclude all Markdown files here, but rather use
# specific exclude patterns like 'docs/**'), because tests for
# 'ty' are written in Markdown.
if git diff --quiet "${MERGE_BASE}...HEAD" -- \
':!docs/**' \
':!assets/**' \
':.github/workflows/ci.yaml' \
; then
echo "changed=false" >> "$GITHUB_OUTPUT"
else
@@ -214,7 +214,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: |
rustup component add clippy
@@ -234,17 +234,17 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@85c79d00377f0d32cdbae595a46de6f7c2fa6599 # v1
uses: rui314/setup-mold@702b1908b5edf30d71a8d1666b724e0f0c6fa035 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
uses: taiki-e/install-action@c99cc51b309eee71a866715cfa08c922f11cf898 # v2.56.19
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
uses: taiki-e/install-action@c99cc51b309eee71a866715cfa08c922f11cf898 # v2.56.19
with:
tool: cargo-insta
- name: ty mdtests (GitHub annotations)
@@ -292,17 +292,17 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@85c79d00377f0d32cdbae595a46de6f7c2fa6599 # v1
uses: rui314/setup-mold@702b1908b5edf30d71a8d1666b724e0f0c6fa035 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
uses: taiki-e/install-action@c99cc51b309eee71a866715cfa08c922f11cf898 # v2.56.19
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
uses: taiki-e/install-action@c99cc51b309eee71a866715cfa08c922f11cf898 # v2.56.19
with:
tool: cargo-insta
- name: "Run tests"
@@ -321,11 +321,11 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
uses: taiki-e/install-action@c99cc51b309eee71a866715cfa08c922f11cf898 # v2.56.19
with:
tool: cargo-nextest
- name: "Run tests"
@@ -348,7 +348,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
@@ -377,11 +377,11 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@85c79d00377f0d32cdbae595a46de6f7c2fa6599 # v1
uses: rui314/setup-mold@702b1908b5edf30d71a8d1666b724e0f0c6fa035 # v1
- name: "Build"
run: cargo build --release --locked
@@ -400,27 +400,18 @@ jobs:
with:
file: "Cargo.toml"
field: "workspace.package.rust-version"
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
env:
MSRV: ${{ steps.msrv.outputs.value }}
run: rustup default "${MSRV}"
- name: "Install mold"
uses: rui314/setup-mold@85c79d00377f0d32cdbae595a46de6f7c2fa6599 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
with:
tool: cargo-insta
- name: "Run tests"
uses: rui314/setup-mold@702b1908b5edf30d71a8d1666b724e0f0c6fa035 # v1
- name: "Build tests"
shell: bash
env:
NEXTEST_PROFILE: "ci"
MSRV: ${{ steps.msrv.outputs.value }}
run: cargo "+${MSRV}" insta test --all-features --unreferenced reject --test-runner nextest
run: cargo "+${MSRV}" test --no-run --all-features
cargo-fuzz-build:
name: "cargo fuzz build"
@@ -432,13 +423,13 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: "fuzz -> target"
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo-binstall"
uses: cargo-bins/cargo-binstall@8aac5aa2bf0dfaa2863eccad9f43c68fe40e5ec8 # v1.14.1
uses: cargo-bins/cargo-binstall@808dcb1b503398677d089d3216c51ac7cc11e7ab # v1.14.2
with:
tool: cargo-fuzz@0.11.2
- name: "Install cargo-fuzz"
@@ -460,7 +451,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
name: Download Ruff binary to test
id: download-cached-binary
@@ -494,7 +485,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup component add rustfmt
# Run all code generation scripts, and verify that the current output is
@@ -661,7 +652,7 @@ jobs:
branch: ${{ github.event.pull_request.base.ref }}
workflow: "ci.yaml"
check_artifacts: true
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: Fuzz
env:
FORCE_COLOR: 1
@@ -691,7 +682,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: cargo-bins/cargo-binstall@8aac5aa2bf0dfaa2863eccad9f43c68fe40e5ec8 # v1.14.1
- uses: cargo-bins/cargo-binstall@808dcb1b503398677d089d3216c51ac7cc11e7ab # v1.14.2
- run: cargo binstall --no-confirm cargo-shear
- run: cargo shear
@@ -708,7 +699,7 @@ jobs:
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: x64
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
@@ -731,8 +722,8 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: 22
@@ -765,7 +756,7 @@ jobs:
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: "3.13"
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Add SSH key"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
uses: webfactory/ssh-agent@a6f90b1f127823b31d4d4a8d96047790581349bd # v0.9.1
@@ -774,7 +765,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: Install uv
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uv pip install -r docs/requirements-insiders.txt --system
@@ -804,7 +795,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Run checks"
@@ -874,7 +865,7 @@ jobs:
persist-credentials: false
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: 22
@@ -905,14 +896,14 @@ jobs:
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
uses: taiki-e/install-action@c99cc51b309eee71a866715cfa08c922f11cf898 # v2.56.19
with:
tool: cargo-codspeed
@@ -920,7 +911,7 @@ jobs:
run: cargo codspeed build --features "codspeed,instrumented" --no-default-features -p ruff_benchmark
- name: "Run benchmarks"
uses: CodSpeedHQ/action@0010eb0ca6e89b80c88e8edaaa07cfe5f3e6664d # v3.5.0
uses: CodSpeedHQ/action@0b6e7a3d96c9d2a6057e7bcea6b45aaf2f7ce60b # v3.8.0
with:
run: cargo codspeed run
token: ${{ secrets.CODSPEED_TOKEN }}
@@ -938,14 +929,14 @@ jobs:
with:
persist-credentials: false
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
uses: taiki-e/install-action@c99cc51b309eee71a866715cfa08c922f11cf898 # v2.56.19
with:
tool: cargo-codspeed
@@ -953,7 +944,7 @@ jobs:
run: cargo codspeed build --features "codspeed,walltime" --no-default-features -p ruff_benchmark
- name: "Run benchmarks"
uses: CodSpeedHQ/action@0010eb0ca6e89b80c88e8edaaa07cfe5f3e6664d # v3.5.0
uses: CodSpeedHQ/action@0b6e7a3d96c9d2a6057e7bcea6b45aaf2f7ce60b # v3.8.0
with:
run: cargo codspeed run
token: ${{ secrets.CODSPEED_TOKEN }}

View File

@@ -34,12 +34,12 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@85c79d00377f0d32cdbae595a46de6f7c2fa6599 # v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: rui314/setup-mold@702b1908b5edf30d71a8d1666b724e0f0c6fa035 # v1
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: Build ruff
# A debug build means the script runs slower once it gets started,
# but this is outweighed by the fact that a release build takes *much* longer to compile in CI

View File

@@ -12,6 +12,7 @@ on:
- ".github/workflows/mypy_primer.yaml"
- ".github/workflows/mypy_primer_comment.yaml"
- "Cargo.lock"
- "!**.md"
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.event.pull_request.number || github.sha }}
@@ -37,9 +38,9 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: "ruff"
@@ -49,46 +50,12 @@ jobs:
- name: Run mypy_primer
shell: bash
env:
TY_MEMORY_REPORT: mypy_primer
PRIMER_SELECTOR: crates/ty_python_semantic/resources/primer/good.txt
DIFF_FILE: mypy_primer.diff
run: |
cd ruff
echo "Enabling mypy primer specific configuration overloads (see .github/mypy-primer-ty.toml)"
mkdir -p ~/.config/ty
cp .github/mypy-primer-ty.toml ~/.config/ty/ty.toml
PRIMER_SELECTOR="$(paste -s -d'|' crates/ty_python_semantic/resources/primer/good.txt)"
echo "new commit"
git rev-list --format=%s --max-count=1 "$GITHUB_SHA"
MERGE_BASE="$(git merge-base "$GITHUB_SHA" "origin/$GITHUB_BASE_REF")"
git checkout -b base_commit "$MERGE_BASE"
echo "base commit"
git rev-list --format=%s --max-count=1 base_commit
cd ..
echo "Project selector: $PRIMER_SELECTOR"
# Allow the exit code to be 0 or 1, only fail for actual mypy_primer crashes/bugs
uvx \
--from="git+https://github.com/hauntsaninja/mypy_primer@e5f55447969d33ae3c7ccdb183e2a37101867270" \
mypy_primer \
--repo ruff \
--type-checker ty \
--old base_commit \
--new "$GITHUB_SHA" \
--project-selector "/($PRIMER_SELECTOR)\$" \
--output concise \
--debug > mypy_primer.diff || [ $? -eq 1 ]
# Output diff with ANSI color codes
cat mypy_primer.diff
# Remove ANSI color codes before uploading
sed -ie 's/\x1b\[[0-9;]*m//g' mypy_primer.diff
echo ${{ github.event.number }} > pr-number
scripts/mypy_primer.sh
echo ${{ github.event.number }} > ../pr-number
- name: Upload diff
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
@@ -101,3 +68,41 @@ jobs:
with:
name: pr-number
path: pr-number
memory_usage:
name: Run memory statistics
runs-on: depot-ubuntu-22.04-32
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:
workspaces: "ruff"
- name: Install Rust toolchain
run: rustup show
- name: Run mypy_primer
shell: bash
env:
TY_MAX_PARALLELISM: 1 # for deterministic memory numbers
TY_MEMORY_REPORT: mypy_primer
PRIMER_SELECTOR: crates/ty_python_semantic/resources/primer/memory.txt
DIFF_FILE: mypy_primer_memory.diff
run: |
cd ruff
scripts/mypy_primer.sh
- name: Upload diff
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: mypy_primer_memory_diff
path: mypy_primer_memory.diff

View File

@@ -45,15 +45,28 @@ jobs:
if_no_artifact_found: ignore
allow_forks: true
- uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
name: "Download mypy_primer memory results"
id: download-mypy_primer_memory_diff
if: steps.pr-number.outputs.pr-number
with:
name: mypy_primer_memory_diff
workflow: mypy_primer.yaml
pr: ${{ steps.pr-number.outputs.pr-number }}
path: pr/mypy_primer_memory_diff
workflow_conclusion: completed
if_no_artifact_found: ignore
allow_forks: true
- name: Generate comment content
id: generate-comment
if: steps.download-mypy_primer_diff.outputs.found_artifact == 'true'
if: ${{ steps.download-mypy_primer_diff.outputs.found_artifact == 'true' && steps.download-mypy_primer_memory_diff.outputs.found_artifact == 'true' }}
run: |
# Guard against malicious mypy_primer results that symlink to a secret
# file on this runner
if [[ -L pr/mypy_primer_diff/mypy_primer.diff ]]
if [[ -L pr/mypy_primer_diff/mypy_primer.diff ]] || [[ -L pr/mypy_primer_memory_diff/mypy_primer_memory.diff ]]
then
echo "Error: mypy_primer.diff cannot be a symlink"
echo "Error: mypy_primer.diff and mypy_primer_memory.diff cannot be a symlink"
exit 1
fi
@@ -74,6 +87,18 @@ jobs:
echo 'No ecosystem changes detected ✅' >> comment.txt
fi
if [ -s "pr/mypy_primer_memory_diff/mypy_primer_memory.diff" ]; then
echo '<details>' >> comment.txt
echo '<summary>Memory usage changes were detected when running on open source projects</summary>' >> comment.txt
echo '' >> comment.txt
echo '```diff' >> comment.txt
cat pr/mypy_primer_memory_diff/mypy_primer_memory.diff >> comment.txt
echo '```' >> comment.txt
echo '</details>' >> comment.txt
else
echo 'No memory usage changes detected ✅' >> comment.txt
fi
echo 'comment<<EOF' >> "$GITHUB_OUTPUT"
cat comment.txt >> "$GITHUB_OUTPUT"
echo 'EOF' >> "$GITHUB_OUTPUT"

View File

@@ -68,7 +68,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}

View File

@@ -22,7 +22,7 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
pattern: wheels-*

View File

@@ -1,5 +1,25 @@
name: Sync typeshed
# How this works:
#
# 1. A Linux worker:
# a. Checks out Ruff and typeshed
# b. Deletes the vendored typeshed stdlib stubs from Ruff
# c. Copies the latest versions of the stubs from typeshed
# d. Uses docstring-adder to sync all docstrings available on Linux
# e. Creates a new branch on the upstream astral-sh/ruff repository
# f. Commits the changes it's made and pushes them to the new upstream branch
# 2. Once the Linux worker is done, a Windows worker:
# a. Checks out the branch created by the Linux worker
# b. Syncs all docstrings available on Windows that are not available on Linux
# c. Commits the changes and pushes them to the same upstream branch
# 3. Once the Windows worker is done, a macOS worker:
# a. Checks out the branch created by the Linux worker
# b. Syncs all docstrings available on macOS that are not available on Linux or Windows
# c. Commits the changes and pushes them to the same upstream branch
# d. Creates a PR against the `main` branch using the branch all three workers have pushed to
# 4. If any of steps 1-3 failed, an issue is created in the `astral-sh/ruff` repository
on:
workflow_dispatch:
schedule:
@@ -10,7 +30,17 @@ env:
FORCE_COLOR: 1
GH_TOKEN: ${{ github.token }}
# The name of the upstream branch that the first worker creates,
# and which all three workers push to.
UPSTREAM_BRANCH: typeshedbot/sync-typeshed
# The path to the directory that contains the vendored typeshed stubs,
# relative to the root of the Ruff repository.
VENDORED_TYPESHED: crates/ty_vendored/vendor/typeshed
jobs:
# Sync typeshed stubs, and sync all docstrings available on Linux.
# Push the changes to a new branch on the upstream repository.
sync:
name: Sync typeshed
runs-on: ubuntu-latest
@@ -19,7 +49,6 @@ jobs:
if: ${{ github.repository == 'astral-sh/ruff' || github.event_name != 'schedule' }}
permissions:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
name: Checkout Ruff
@@ -36,37 +65,130 @@ jobs:
run: |
git config --global user.name typeshedbot
git config --global user.email '<>'
- name: Sync typeshed
id: sync
run: |
rm -rf ruff/crates/ty_vendored/vendor/typeshed
mkdir ruff/crates/ty_vendored/vendor/typeshed
cp typeshed/README.md ruff/crates/ty_vendored/vendor/typeshed
cp typeshed/LICENSE ruff/crates/ty_vendored/vendor/typeshed
cp -r typeshed/stdlib ruff/crates/ty_vendored/vendor/typeshed/stdlib
rm -rf ruff/crates/ty_vendored/vendor/typeshed/stdlib/@tests
git -C typeshed rev-parse HEAD > ruff/crates/ty_vendored/vendor/typeshed/source_commit.txt
- name: Commit the changes
id: commit
if: ${{ steps.sync.outcome == 'success' }}
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: Sync typeshed stubs
run: |
rm -rf "ruff/${VENDORED_TYPESHED}"
mkdir "ruff/${VENDORED_TYPESHED}"
cp typeshed/README.md "ruff/${VENDORED_TYPESHED}"
cp typeshed/LICENSE "ruff/${VENDORED_TYPESHED}"
# The pyproject.toml file is needed by a later job for the black configuration.
# It's deleted before creating the PR.
cp typeshed/pyproject.toml "ruff/${VENDORED_TYPESHED}"
cp -r typeshed/stdlib "ruff/${VENDORED_TYPESHED}/stdlib"
rm -rf "ruff/${VENDORED_TYPESHED}/stdlib/@tests"
git -C typeshed rev-parse HEAD > "ruff/${VENDORED_TYPESHED}/source_commit.txt"
cd ruff
git checkout -b typeshedbot/sync-typeshed
git checkout -b "${UPSTREAM_BRANCH}"
git add .
git diff --staged --quiet || git commit -m "Sync typeshed. Source commit: https://github.com/python/typeshed/commit/$(git -C ../typeshed rev-parse HEAD)"
- name: Create a PR
if: ${{ steps.sync.outcome == 'success' && steps.commit.outcome == 'success' }}
git commit -m "Sync typeshed. Source commit: https://github.com/python/typeshed/commit/$(git -C ../typeshed rev-parse HEAD)" --allow-empty
- name: Sync Linux docstrings
if: ${{ success() }}
run: |
cd ruff
git push --force origin typeshedbot/sync-typeshed
gh pr list --repo "$GITHUB_REPOSITORY" --head typeshedbot/sync-typeshed --json id --jq length | grep 1 && exit 0 # exit if there is existing pr
./scripts/codemod_docstrings.sh
git commit -am "Sync Linux docstrings" --allow-empty
- name: Push the changes
id: commit
if: ${{ success() }}
run: git -C ruff push --force --set-upstream origin "${UPSTREAM_BRANCH}"
# Checkout the branch created by the sync job,
# and sync all docstrings available on Windows that are not available on Linux.
# Commit the changes and push them to the same branch.
docstrings-windows:
runs-on: windows-latest
timeout-minutes: 20
needs: [sync]
# Don't run the cron job on forks.
# The job will also be skipped if the sync job failed, because it's specified in `needs` above,
# and we haven't used `always()` in the `if` condition here
# (https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#example-requiring-successful-dependent-jobs)
if: ${{ github.repository == 'astral-sh/ruff' || github.event_name != 'schedule' }}
permissions:
contents: write
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
name: Checkout Ruff
with:
persist-credentials: true
ref: ${{ env.UPSTREAM_BRANCH}}
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: Setup git
run: |
git config --global user.name typeshedbot
git config --global user.email '<>'
- name: Sync Windows docstrings
id: docstrings
shell: bash
run: ./scripts/codemod_docstrings.sh
- name: Commit the changes
if: ${{ steps.docstrings.outcome == 'success' }}
run: |
git commit -am "Sync Windows docstrings" --allow-empty
git push
# Checkout the branch created by the sync job,
# and sync all docstrings available on macOS that are not available on Linux or Windows.
# Push the changes to the same branch and create a PR against the `main` branch using that branch.
docstrings-macos-and-pr:
runs-on: macos-latest
timeout-minutes: 20
needs: [sync, docstrings-windows]
# Don't run the cron job on forks.
# The job will also be skipped if the sync or docstrings-windows jobs failed,
# because they're specified in `needs` above and we haven't used an `always()` condition in the `if` here
# (https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#example-requiring-successful-dependent-jobs)
if: ${{ github.repository == 'astral-sh/ruff' || github.event_name != 'schedule' }}
permissions:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
name: Checkout Ruff
with:
persist-credentials: true
ref: ${{ env.UPSTREAM_BRANCH}}
- uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- name: Setup git
run: |
git config --global user.name typeshedbot
git config --global user.email '<>'
- name: Sync macOS docstrings
run: ./scripts/codemod_docstrings.sh
- name: Commit and push the changes
if: ${{ success() }}
run: |
git commit -am "Sync macOS docstrings" --allow-empty
# Here we just reformat the codemodded stubs so that they are
# consistent with the other typeshed stubs around them.
# Typeshed formats code using black in their CI, so we just invoke
# black on the stubs the same way that typeshed does.
uvx black "${VENDORED_TYPESHED}/stdlib" --config "${VENDORED_TYPESHED}/pyproject.toml" || true
git commit -am "Format codemodded docstrings" --allow-empty
rm "${VENDORED_TYPESHED}/pyproject.toml"
git commit -am "Remove pyproject.toml file"
git push
- name: Create a PR
if: ${{ success() }}
run: |
gh pr list --repo "${GITHUB_REPOSITORY}" --head "${UPSTREAM_BRANCH}" --json id --jq length | grep 1 && exit 0 # exit if there is existing pr
gh pr create --title "[ty] Sync vendored typeshed stubs" --body "Close and reopen this PR to trigger CI" --label "ty"
create-issue-on-failure:
name: Create an issue if the typeshed sync failed
runs-on: ubuntu-latest
needs: [sync]
if: ${{ github.repository == 'astral-sh/ruff' && always() && github.event_name == 'schedule' && needs.sync.result == 'failure' }}
needs: [sync, docstrings-windows, docstrings-macos-and-pr]
if: ${{ github.repository == 'astral-sh/ruff' && always() && github.event_name == 'schedule' && (needs.sync.result == 'failure' || needs.docstrings-windows.result == 'failure' || needs.docstrings-macos-and-pr.result == 'failure') }}
permissions:
issues: write
steps:
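The comments at the top of this workflow describe a three-stage relay: a Linux worker vendors the stubs and pushes a branch, then Windows and macOS workers each layer on the docstrings only available on their platform before the final job opens the PR. Each per-OS pass reduces to the same few commands; a minimal sketch, assuming the runner has already checked out `$UPSTREAM_BRANCH` with push credentials as configured above:

```bash
# Per-OS docstring pass (sketch). The real jobs above run scripts/codemod_docstrings.sh
# and then commit and push to the shared typeshedbot/sync-typeshed branch.
git config --global user.name typeshedbot
git config --global user.email '<>'

./scripts/codemod_docstrings.sh   # add docstrings only available on this OS

# --allow-empty keeps the relay intact even when this OS contributes nothing new,
# so the downstream jobs and the final PR step still have commits to build on.
git commit -am "Sync $(uname -s) docstrings" --allow-empty
git push
```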

View File

@@ -17,6 +17,7 @@ env:
RUSTUP_MAX_RETRIES: 10
RUST_BACKTRACE: 1
REF_NAME: ${{ github.ref_name }}
CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
jobs:
ty-ecosystem-analyzer:
@@ -32,9 +33,9 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: "ruff"
@@ -63,32 +64,75 @@ jobs:
cd ..
uv tool install "git+https://github.com/astral-sh/ecosystem-analyzer@9c34dc514ee9aef6735db1dfebb80f63acbc3440"
uv tool install "git+https://github.com/astral-sh/ecosystem-analyzer@27dd66d9e397d986ef9c631119ee09556eab8af9"
ecosystem-analyzer \
--repository ruff \
analyze \
--projects ruff/projects_old.txt \
--commit old_commit \
--output diagnostics_old.json
diff \
--projects-old ruff/projects_old.txt \
--projects-new ruff/projects_new.txt \
--old old_commit \
--new new_commit \
--output-old diagnostics-old.json \
--output-new diagnostics-new.json
ecosystem-analyzer \
--repository ruff \
analyze \
--projects ruff/projects_new.txt \
--commit new_commit \
--output diagnostics_new.json
mkdir dist
ecosystem-analyzer \
generate-diff \
diagnostics_old.json \
diagnostics_new.json \
diagnostics-old.json \
diagnostics-new.json \
--old-name "main (merge base)" \
--new-name "$REF_NAME" \
--output-html diff.html
--output-html dist/diff.html
- name: Upload HTML diff report
ecosystem-analyzer \
generate-diff-statistics \
diagnostics-old.json \
diagnostics-new.json \
--old-name "main (merge base)" \
--new-name "$REF_NAME" \
--output diff-statistics.md
echo '## `ecosystem-analyzer` results' > comment.md
echo >> comment.md
cat diff-statistics.md >> comment.md
cat diff-statistics.md >> "$GITHUB_STEP_SUMMARY"
echo ${{ github.event.number }} > pr-number
- name: "Deploy to Cloudflare Pages"
if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
id: deploy
uses: cloudflare/wrangler-action@da0e0dfe58b7a431659754fdf3f186c529afbe65 # v3.14.1
with:
apiToken: ${{ secrets.CF_API_TOKEN }}
accountId: ${{ secrets.CF_ACCOUNT_ID }}
command: pages deploy dist --project-name=ty-ecosystem --branch ${{ github.head_ref }} --commit-hash ${GITHUB_SHA}
- name: "Append deployment URL"
if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
env:
DEPLOYMENT_URL: ${{ steps.deploy.outputs.pages-deployment-alias-url }}
run: |
echo >> comment.md
echo "**[Full report with detailed diff]($DEPLOYMENT_URL/diff)**" >> comment.md
- name: Upload comment
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: comment.md
path: comment.md
- name: Upload pr-number
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: pr-number
path: pr-number
- name: Upload diagnostics diff
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: diff.html
path: diff.html
path: dist/diff.html

View File

@@ -0,0 +1,85 @@
name: PR comment (ty ecosystem-analyzer)
on: # zizmor: ignore[dangerous-triggers]
workflow_run:
workflows: [ty ecosystem-analyzer]
types: [completed]
workflow_dispatch:
inputs:
workflow_run_id:
description: The ty ecosystem-analyzer workflow that triggers the workflow run
required: true
jobs:
comment:
runs-on: ubuntu-24.04
permissions:
pull-requests: write
steps:
- uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
name: Download PR number
with:
name: pr-number
run_id: ${{ github.event.workflow_run.id || github.event.inputs.workflow_run_id }}
if_no_artifact_found: ignore
allow_forks: true
- name: Parse pull request number
id: pr-number
run: |
if [[ -f pr-number ]]
then
echo "pr-number=$(<pr-number)" >> "$GITHUB_OUTPUT"
fi
- uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
name: "Download comment.md"
id: download-comment
if: steps.pr-number.outputs.pr-number
with:
name: comment.md
workflow: ty-ecosystem-analyzer.yaml
pr: ${{ steps.pr-number.outputs.pr-number }}
path: pr/comment
workflow_conclusion: completed
if_no_artifact_found: ignore
allow_forks: true
- name: Generate comment content
id: generate-comment
if: ${{ steps.download-comment.outputs.found_artifact == 'true' }}
run: |
# Guard against malicious ty ecosystem-analyzer results that symlink to a secret
# file on this runner
if [[ -L pr/comment/comment.md ]]
then
echo "Error: comment.md cannot be a symlink"
exit 1
fi
# Note: this identifier is used to find the comment to update on subsequent runs
echo '<!-- generated-comment ty ecosystem-analyzer -->' > comment.md
echo >> comment.md
cat pr/comment/comment.md >> comment.md
echo 'comment<<EOF' >> "$GITHUB_OUTPUT"
cat comment.md >> "$GITHUB_OUTPUT"
echo 'EOF' >> "$GITHUB_OUTPUT"
- name: Find existing comment
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e # v3.1.0
if: steps.generate-comment.outcome == 'success'
id: find-comment
with:
issue-number: ${{ steps.pr-number.outputs.pr-number }}
comment-author: "github-actions[bot]"
body-includes: "<!-- generated-comment ty ecosystem-analyzer -->"
- name: Create or update comment
if: steps.find-comment.outcome == 'success'
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4
with:
comment-id: ${{ steps.find-comment.outputs.comment-id }}
issue-number: ${{ steps.pr-number.outputs.pr-number }}
body-path: comment.md
edit-mode: replace

View File

@@ -0,0 +1,76 @@
name: ty ecosystem-report
permissions: {}
on:
workflow_dispatch:
schedule:
# Run every Wednesday at 5:00 UTC:
- cron: 0 5 * * 3
env:
CARGO_INCREMENTAL: 0
CARGO_NET_RETRY: 10
CARGO_TERM_COLOR: always
RUSTUP_MAX_RETRIES: 10
RUST_BACKTRACE: 1
CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
jobs:
ty-ecosystem-report:
name: Create ecosystem report
runs-on: depot-ubuntu-22.04-32
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: "ruff"
- name: Install Rust toolchain
run: rustup show
- name: Create report
shell: bash
run: |
cd ruff
echo "Enabling configuration overloads (see .github/mypy-primer-ty.toml)"
mkdir -p ~/.config/ty
cp .github/mypy-primer-ty.toml ~/.config/ty/ty.toml
cd ..
uv tool install "git+https://github.com/astral-sh/ecosystem-analyzer@27dd66d9e397d986ef9c631119ee09556eab8af9"
ecosystem-analyzer \
--verbose \
--repository ruff \
analyze \
--projects ruff/crates/ty_python_semantic/resources/primer/good.txt \
--output ecosystem-diagnostics.json
mkdir dist
ecosystem-analyzer \
generate-report \
--max-diagnostics-per-project=1200 \
ecosystem-diagnostics.json \
--output dist/index.html
- name: "Deploy to Cloudflare Pages"
if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
id: deploy
uses: cloudflare/wrangler-action@da0e0dfe58b7a431659754fdf3f186c529afbe65 # v3.14.1
with:
apiToken: ${{ secrets.CF_API_TOKEN }}
accountId: ${{ secrets.CF_ACCOUNT_ID }}
command: pages deploy dist --project-name=ty-ecosystem --branch main --commit-hash ${GITHUB_SHA}

View File

@@ -0,0 +1,109 @@
name: Run typing conformance
permissions: {}
on:
pull_request:
paths:
- "crates/ty*/**"
- "crates/ruff_db"
- "crates/ruff_python_ast"
- "crates/ruff_python_parser"
- ".github/workflows/typing_conformance.yaml"
- ".github/workflows/typing_conformance_comment.yaml"
- "Cargo.lock"
- "!**.md"
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true
env:
CARGO_INCREMENTAL: 0
CARGO_NET_RETRY: 10
CARGO_TERM_COLOR: always
RUSTUP_MAX_RETRIES: 10
RUST_BACKTRACE: 1
jobs:
typing_conformance:
name: Compute diagnostic diff
runs-on: depot-ubuntu-22.04-32
timeout-minutes: 10
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
repository: python/typing
ref: d4f39b27a4a47aac8b6d4019e1b0b5b3156fabdc
path: typing
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: "ruff"
- name: Install Rust toolchain
run: rustup show
- name: Compute diagnostic diff
shell: bash
run: |
RUFF_DIR="$GITHUB_WORKSPACE/ruff"
# Build the executable for the old and new commit
(
cd ruff
echo "new commit"
git checkout -b new_commit "${{ github.event.pull_request.head.sha }}"
git rev-list --format=%s --max-count=1 new_commit
cargo build --release --bin ty
mv target/release/ty ty-new
echo "old commit (merge base)"
MERGE_BASE="$(git merge-base "$GITHUB_SHA" "origin/$GITHUB_BASE_REF")"
git checkout -b old_commit "$MERGE_BASE"
git rev-list --format=%s --max-count=1 old_commit
cargo build --release --bin ty
mv target/release/ty ty-old
)
(
cd typing/conformance/tests
echo "Running ty on old commit (merge base)"
"$RUFF_DIR/ty-old" check --color=never --output-format=concise . > "$GITHUB_WORKSPACE/old-output.txt" 2>&1 || true
echo "Running ty on new commit"
"$RUFF_DIR/ty-new" check --color=never --output-format=concise . > "$GITHUB_WORKSPACE/new-output.txt" 2>&1 || true
)
if ! diff -u old-output.txt new-output.txt > typing_conformance_diagnostics.diff; then
echo "Differences found between base and PR"
else
echo "No differences found"
touch typing_conformance_diagnostics.diff
fi
echo ${{ github.event.number }} > pr-number
- name: Upload diff
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: typing_conformance_diagnostics_diff
path: typing_conformance_diagnostics.diff
- name: Upload pr-number
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: pr-number
path: pr-number
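The job builds ty twice, once at the merge base and once at the PR head, runs both binaries over the pinned python/typing conformance tests, and uploads the unified diff of their concise output. A rough way to reproduce the same comparison locally (the `/tmp` staging paths are illustrative, not part of the workflow):

```bash
# Run from a ruff checkout with the PR branch active and origin/main fetched.
git clone https://github.com/python/typing ../typing
git -C ../typing checkout d4f39b27a4a47aac8b6d4019e1b0b5b3156fabdc

cargo build --release --bin ty && cp target/release/ty /tmp/ty-new
git checkout "$(git merge-base HEAD origin/main)"
cargo build --release --bin ty && cp target/release/ty /tmp/ty-old

cd ../typing/conformance/tests
/tmp/ty-old check --color=never --output-format=concise . > /tmp/old-output.txt 2>&1 || true
/tmp/ty-new check --color=never --output-format=concise . > /tmp/new-output.txt 2>&1 || true
diff -u /tmp/old-output.txt /tmp/new-output.txt
```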

View File

@@ -0,0 +1,97 @@
name: PR comment (typing_conformance)
on: # zizmor: ignore[dangerous-triggers]
workflow_run:
workflows: [Run typing conformance]
types: [completed]
workflow_dispatch:
inputs:
workflow_run_id:
description: The typing_conformance workflow that triggers the workflow run
required: true
jobs:
comment:
runs-on: ubuntu-24.04
permissions:
pull-requests: write
steps:
- uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
name: Download PR number
with:
name: pr-number
run_id: ${{ github.event.workflow_run.id || github.event.inputs.workflow_run_id }}
if_no_artifact_found: ignore
allow_forks: true
- name: Parse pull request number
id: pr-number
run: |
if [[ -f pr-number ]]
then
echo "pr-number=$(<pr-number)" >> "$GITHUB_OUTPUT"
fi
- uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
name: "Download typing_conformance results"
id: download-typing_conformance_diff
if: steps.pr-number.outputs.pr-number
with:
name: typing_conformance_diagnostics_diff
workflow: typing_conformance.yaml
pr: ${{ steps.pr-number.outputs.pr-number }}
path: pr/typing_conformance_diagnostics_diff
workflow_conclusion: completed
if_no_artifact_found: ignore
allow_forks: true
- name: Generate comment content
id: generate-comment
if: ${{ steps.download-typing_conformance_diff.outputs.found_artifact == 'true' }}
run: |
# Guard against malicious typing_conformance results that symlink to a secret
# file on this runner
if [[ -L pr/typing_conformance_diagnostics_diff/typing_conformance_diagnostics.diff ]]
then
echo "Error: typing_conformance_diagnostics.diff cannot be a symlink"
exit 1
fi
# Note this identifier is used to find the comment to update on
# subsequent runs
echo '<!-- generated-comment typing_conformance_diagnostics_diff -->' >> comment.txt
echo '## Diagnostic diff on typing conformance tests' >> comment.txt
if [ -s "pr/typing_conformance_diagnostics_diff/typing_conformance_diagnostics.diff" ]; then
echo '<details>' >> comment.txt
echo '<summary>Changes were detected when running ty on typing conformance tests</summary>' >> comment.txt
echo '' >> comment.txt
echo '```diff' >> comment.txt
cat pr/typing_conformance_diagnostics_diff/typing_conformance_diagnostics.diff >> comment.txt
echo '```' >> comment.txt
echo '</details>' >> comment.txt
else
echo 'No changes detected when running ty on typing conformance tests ✅' >> comment.txt
fi
echo 'comment<<EOF' >> "$GITHUB_OUTPUT"
cat comment.txt >> "$GITHUB_OUTPUT"
echo 'EOF' >> "$GITHUB_OUTPUT"
- name: Find existing comment
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e # v3.1.0
if: steps.generate-comment.outcome == 'success'
id: find-comment
with:
issue-number: ${{ steps.pr-number.outputs.pr-number }}
comment-author: "github-actions[bot]"
body-includes: "<!-- generated-comment typing_conformance_diagnostics_diff -->"
- name: Create or update comment
if: steps.find-comment.outcome == 'success'
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4
with:
comment-id: ${{ steps.find-comment.outputs.comment-id }}
issue-number: ${{ steps.pr-number.outputs.pr-number }}
body-path: comment.txt
edit-mode: replace

.github/zizmor.yml (vendored, 2 lines changed)
View File

@@ -10,6 +10,8 @@ rules:
ignore:
- build-docker.yml
- publish-playground.yml
- ty-ecosystem-analyzer.yaml
- ty-ecosystem-report.yaml
excessive-permissions:
# it's hard to test what the impact of removing these ignores would be
# without actually running the release workflow...

View File

@@ -6,7 +6,7 @@ exclude: |
crates/ty_vendored/vendor/.*|
crates/ty_project/resources/.*|
crates/ty_python_semantic/resources/corpus/.*|
crates/ty/docs/(configuration|rules|cli).md|
crates/ty/docs/(configuration|rules|cli|environment).md|
crates/ruff_benchmark/resources/.*|
crates/ruff_linter/resources/.*|
crates/ruff_linter/src/rules/.*/snapshots/.*|
@@ -67,7 +67,7 @@ repos:
- black==25.1.0
- repo: https://github.com/crate-ci/typos
rev: v1.33.1
rev: v1.34.0
hooks:
- id: typos
@@ -81,10 +81,10 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.12.1
rev: v0.12.5
hooks:
- id: ruff-format
- id: ruff
- id: ruff-check
args: [--fix, --exit-non-zero-on-fix]
types_or: [python, pyi]
require_serial: true
@@ -99,12 +99,12 @@ repos:
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/woodruffw/zizmor-pre-commit
rev: v1.10.0
rev: v1.11.0
hooks:
- id: zizmor
- repo: https://github.com/python-jsonschema/check-jsonschema
rev: 0.33.1
rev: 0.33.2
hooks:
- id: check-github-workflows
@@ -128,5 +128,10 @@ repos:
# but the integration only works if shellcheck is installed
- "github.com/wasilibs/go-shellcheck/cmd/shellcheck@v0.10.0"
- repo: https://github.com/shellcheck-py/shellcheck-py
rev: v0.10.0.1
hooks:
- id: shellcheck
ci:
skip: [cargo-fmt, dev-generate-all]

View File

@@ -1,5 +1,112 @@
# Changelog
## 0.12.7
This is a follow-up release to 0.12.6. Because of an issue in the package metadata, 0.12.6 failed to publish fully to PyPI and has been yanked. Similarly, there is no GitHub release or Git tag for 0.12.6. The contents of the 0.12.7 release are identical to 0.12.6, except for the updated metadata.
## 0.12.6
### Preview features
- \[`flake8-commas`\] Add support for trailing comma checks in type parameter lists (`COM812`, `COM819`) ([#19390](https://github.com/astral-sh/ruff/pull/19390))
- \[`pylint`\] Implement auto-fix for `missing-maxsplit-arg` (`PLC0207`) ([#19387](https://github.com/astral-sh/ruff/pull/19387))
- \[`ruff`\] Offer fixes for `RUF039` in more cases ([#19065](https://github.com/astral-sh/ruff/pull/19065))
### Bug fixes
- Support `.pyi` files in ruff analyze graph ([#19611](https://github.com/astral-sh/ruff/pull/19611))
- \[`flake8-pyi`\] Preserve inline comment in ellipsis removal (`PYI013`) ([#19399](https://github.com/astral-sh/ruff/pull/19399))
- \[`perflint`\] Ignore rule if target is `global` or `nonlocal` (`PERF401`) ([#19539](https://github.com/astral-sh/ruff/pull/19539))
- \[`pyupgrade`\] Fix `UP030` to avoid modifying double curly braces in format strings ([#19378](https://github.com/astral-sh/ruff/pull/19378))
- \[`refurb`\] Ignore decorated functions for `FURB118` ([#19339](https://github.com/astral-sh/ruff/pull/19339))
- \[`refurb`\] Mark `int` and `bool` cases for `Decimal.from_float` as safe fixes (`FURB164`) ([#19468](https://github.com/astral-sh/ruff/pull/19468))
- \[`ruff`\] Fix `RUF033` for named default expressions ([#19115](https://github.com/astral-sh/ruff/pull/19115))
### Rule changes
- \[`flake8-blind-except`\] Change `BLE001` to permit `logging.critical(..., exc_info=True)` ([#19520](https://github.com/astral-sh/ruff/pull/19520))
### Performance
- Add support for specifying minimum dots in detected string imports ([#19538](https://github.com/astral-sh/ruff/pull/19538))
## 0.12.5
### Preview features
- \[`flake8-use-pathlib`\] Add autofix for `PTH101`, `PTH104`, `PTH105`, `PTH121` ([#19404](https://github.com/astral-sh/ruff/pull/19404))
- \[`ruff`\] Support byte strings (`RUF055`) ([#18926](https://github.com/astral-sh/ruff/pull/18926))
### Bug fixes
- Fix `unreachable` panic in parser ([#19183](https://github.com/astral-sh/ruff/pull/19183))
- \[`flake8-pyi`\] Skip fix if all `Union` members are `None` (`PYI016`) ([#19416](https://github.com/astral-sh/ruff/pull/19416))
- \[`perflint`\] Parenthesize generator expressions (`PERF401`) ([#19325](https://github.com/astral-sh/ruff/pull/19325))
- \[`pylint`\] Handle empty comments after line continuation (`PLR2044`) ([#19405](https://github.com/astral-sh/ruff/pull/19405))
### Rule changes
- \[`pep8-naming`\] Fix `N802` false positives for `CGIHTTPRequestHandler` and `SimpleHTTPRequestHandler` ([#19432](https://github.com/astral-sh/ruff/pull/19432))
## 0.12.4
### Preview features
- \[`flake8-type-checking`, `pyupgrade`, `ruff`\] Add `from __future__ import annotations` when it would allow new fixes (`TC001`, `TC002`, `TC003`, `UP037`, `RUF013`) ([#19100](https://github.com/astral-sh/ruff/pull/19100))
- \[`flake8-use-pathlib`\] Add autofix for `PTH109` ([#19245](https://github.com/astral-sh/ruff/pull/19245))
- \[`pylint`\] Detect indirect `pathlib.Path` usages for `unspecified-encoding` (`PLW1514`) ([#19304](https://github.com/astral-sh/ruff/pull/19304))
### Bug fixes
- \[`flake8-bugbear`\] Fix `B017` false negatives for keyword exception arguments ([#19217](https://github.com/astral-sh/ruff/pull/19217))
- \[`flake8-use-pathlib`\] Fix false negative on direct `Path()` instantiation (`PTH210`) ([#19388](https://github.com/astral-sh/ruff/pull/19388))
- \[`flake8-django`\] Fix `DJ008` false positive for abstract models with type-annotated `abstract` field ([#19221](https://github.com/astral-sh/ruff/pull/19221))
- \[`isort`\] Fix `I002` import insertion after docstring with multiple string statements ([#19222](https://github.com/astral-sh/ruff/pull/19222))
- \[`isort`\] Treat form feed as valid whitespace before a semicolon ([#19343](https://github.com/astral-sh/ruff/pull/19343))
- \[`pydoclint`\] Fix `SyntaxError` from fixes with line continuations (`D201`, `D202`) ([#19246](https://github.com/astral-sh/ruff/pull/19246))
- \[`refurb`\] `FURB164` fix should validate arguments and should usually be marked unsafe ([#19136](https://github.com/astral-sh/ruff/pull/19136))
### Rule changes
- \[`flake8-use-pathlib`\] Skip single dots for `invalid-pathlib-with-suffix` (`PTH210`) on versions >= 3.14 ([#19331](https://github.com/astral-sh/ruff/pull/19331))
- \[`pep8_naming`\] Avoid false positives on standard library functions with uppercase names (`N802`) ([#18907](https://github.com/astral-sh/ruff/pull/18907))
- \[`pycodestyle`\] Handle brace escapes for t-strings in logical lines ([#19358](https://github.com/astral-sh/ruff/pull/19358))
- \[`pylint`\] Extend invalid string character rules to include t-strings ([#19355](https://github.com/astral-sh/ruff/pull/19355))
- \[`ruff`\] Allow `strict` kwarg when checking for `starmap-zip` (`RUF058`) in Python 3.14+ ([#19333](https://github.com/astral-sh/ruff/pull/19333))
### Documentation
- \[`flake8-type-checking`\] Make `TC010` docs example more realistic ([#19356](https://github.com/astral-sh/ruff/pull/19356))
- Make more documentation examples error out-of-the-box ([#19288](https://github.com/astral-sh/ruff/pull/19288),[#19272](https://github.com/astral-sh/ruff/pull/19272),[#19291](https://github.com/astral-sh/ruff/pull/19291),[#19296](https://github.com/astral-sh/ruff/pull/19296),[#19292](https://github.com/astral-sh/ruff/pull/19292),[#19295](https://github.com/astral-sh/ruff/pull/19295),[#19297](https://github.com/astral-sh/ruff/pull/19297),[#19309](https://github.com/astral-sh/ruff/pull/19309))
## 0.12.3
### Preview features
- \[`flake8-bugbear`\] Support non-context-manager calls in `B017` ([#19063](https://github.com/astral-sh/ruff/pull/19063))
- \[`flake8-use-pathlib`\] Add autofixes for `PTH100`, `PTH106`, `PTH107`, `PTH108`, `PTH110`, `PTH111`, `PTH112`, `PTH113`, `PTH114`, `PTH115`, `PTH117`, `PTH119`, `PTH120` ([#19213](https://github.com/astral-sh/ruff/pull/19213))
- \[`flake8-use-pathlib`\] Add autofixes for `PTH203`, `PTH204`, `PTH205` ([#18922](https://github.com/astral-sh/ruff/pull/18922))
### Bug fixes
- \[`flake8-return`\] Fix false-positive for variables used inside nested functions in `RET504` ([#18433](https://github.com/astral-sh/ruff/pull/18433))
- Treat form feed as valid whitespace before a line continuation ([#19220](https://github.com/astral-sh/ruff/pull/19220))
- \[`flake8-type-checking`\] Fix syntax error introduced by fix (`TC008`) ([#19150](https://github.com/astral-sh/ruff/pull/19150))
- \[`pyupgrade`\] Keyword arguments in `super` should suppress the `UP008` fix ([#19131](https://github.com/astral-sh/ruff/pull/19131))
### Documentation
- \[`flake8-pyi`\] Make example error out-of-the-box (`PYI007`, `PYI008`) ([#19103](https://github.com/astral-sh/ruff/pull/19103))
- \[`flake8-simplify`\] Make example error out-of-the-box (`SIM116`) ([#19111](https://github.com/astral-sh/ruff/pull/19111))
- \[`flake8-type-checking`\] Make example error out-of-the-box (`TC001`) ([#19151](https://github.com/astral-sh/ruff/pull/19151))
- \[`flake8-use-pathlib`\] Make example error out-of-the-box (`PTH210`) ([#19189](https://github.com/astral-sh/ruff/pull/19189))
- \[`pycodestyle`\] Make example error out-of-the-box (`E272`) ([#19191](https://github.com/astral-sh/ruff/pull/19191))
- \[`pycodestyle`\] Make example not raise unnecessary `SyntaxError` (`E114`) ([#19190](https://github.com/astral-sh/ruff/pull/19190))
- \[`pydoclint`\] Make example error out-of-the-box (`DOC501`) ([#19218](https://github.com/astral-sh/ruff/pull/19218))
- \[`pylint`, `pyupgrade`\] Fix syntax errors in examples (`PLW1501`, `UP028`) ([#19127](https://github.com/astral-sh/ruff/pull/19127))
- \[`pylint`\] Update `missing-maxsplit-arg` docs and error to suggest proper usage (`PLC0207`) ([#18949](https://github.com/astral-sh/ruff/pull/18949))
- \[`flake8-bandit`\] Make example error out-of-the-box (`S412`) ([#19241](https://github.com/astral-sh/ruff/pull/19241))
## 0.12.2
### Preview features

View File

@@ -266,6 +266,13 @@ Finally, regenerate the documentation and generated code with `cargo dev generat
## MkDocs
> [!NOTE]
>
> The documentation uses Material for MkDocs Insiders, which is closed-source software.
> This means only members of the Astral organization can preview the documentation exactly as it
> will appear in production.
> Outside contributors can still preview the documentation, but there will be some differences. Consult [the Material for MkDocs documentation](https://squidfunk.github.io/mkdocs-material/insiders/benefits/#features) for which features are exclusively available in the insiders version.
To preview any changes to the documentation locally:
1. Install the [Rust toolchain](https://www.rust-lang.org/tools/install).

Cargo.lock (generated, 755 lines changed): diff suppressed because it is too large.

View File

@@ -44,6 +44,7 @@ ty_ide = { path = "crates/ty_ide" }
ty_project = { path = "crates/ty_project", default-features = false }
ty_python_semantic = { path = "crates/ty_python_semantic" }
ty_server = { path = "crates/ty_server" }
ty_static = { path = "crates/ty_static" }
ty_test = { path = "crates/ty_test" }
ty_vendored = { path = "crates/ty_vendored" }
@@ -56,6 +57,9 @@ assert_fs = { version = "1.1.0" }
argfile = { version = "0.2.0" }
bincode = { version = "2.0.0" }
bitflags = { version = "2.5.0" }
bitvec = { version = "1.0.1", default-features = false, features = [
"alloc",
] }
bstr = { version = "1.9.1" }
cachedir = { version = "0.3.1" }
camino = { version = "1.1.7" }
@@ -69,7 +73,7 @@ console_error_panic_hook = { version = "0.1.7" }
console_log = { version = "1.0.0" }
countme = { version = "3.0.1" }
compact_str = "0.9.0"
criterion = { version = "0.6.0", default-features = false }
criterion = { version = "0.7.0", default-features = false }
crossbeam = { version = "0.8.4" }
dashmap = { version = "6.0.1" }
dir-test = { version = "0.4.0" }
@@ -79,11 +83,11 @@ etcetera = { version = "0.10.0" }
fern = { version = "0.7.0" }
filetime = { version = "0.2.23" }
getrandom = { version = "0.3.1" }
get-size2 = { version = "0.5.0", features = [
get-size2 = { version = "0.6.0", features = [
"derive",
"smallvec",
"hashbrown",
"compact-str"
"compact-str",
] }
glob = { version = "0.3.1" }
globset = { version = "0.4.14" }
@@ -98,7 +102,7 @@ ignore = { version = "0.4.22" }
imara-diff = { version = "0.1.5" }
imperative = { version = "1.0.4" }
indexmap = { version = "2.6.0" }
indicatif = { version = "0.17.8" }
indicatif = { version = "0.18.0" }
indoc = { version = "2.0.4" }
insta = { version = "1.35.1" }
insta-cmd = { version = "0.6.0" }
@@ -137,7 +141,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "fc00eba89e5dcaa5edba51c41aa5f309b5cb126b" }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "f3dc2f30f9a250618161e35600a00de7fe744953" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }
@@ -149,7 +153,7 @@ serde_with = { version = "3.6.0", default-features = false, features = [
] }
shellexpand = { version = "3.0.0" }
similar = { version = "2.4.0", features = ["inline"] }
smallvec = { version = "1.13.2" }
smallvec = { version = "1.13.2", features = ["union", "const_generics", "const_new"] }
snapbox = { version = "0.6.0", features = [
"diff",
"term-svg",
@@ -164,16 +168,16 @@ tempfile = { version = "3.9.0" }
test-case = { version = "3.3.1" }
thiserror = { version = "2.0.0" }
tikv-jemallocator = { version = "0.6.0" }
toml = { version = "0.8.11" }
toml = { version = "0.9.0" }
tracing = { version = "0.1.40" }
tracing-flame = { version = "0.2.0" }
tracing-indicatif = { version = "0.3.6" }
tracing-indicatif = { version = "0.3.11" }
tracing-log = { version = "0.2.0" }
tracing-subscriber = { version = "0.3.18", default-features = false, features = [
"env-filter",
"fmt",
"ansi",
"smallvec"
"smallvec",
] }
tryfn = { version = "0.2.1" }
typed-arena = { version = "2.0.2" }
@@ -183,11 +187,7 @@ unicode-width = { version = "0.2.0" }
unicode_names2 = { version = "1.2.2" }
unicode-normalization = { version = "0.1.23" }
url = { version = "2.5.0" }
uuid = { version = "1.6.1", features = [
"v4",
"fast-rng",
"macro-diagnostics",
] }
uuid = { version = "1.6.1", features = ["v4", "fast-rng", "macro-diagnostics"] }
walkdir = { version = "2.3.2" }
wasm-bindgen = { version = "0.2.92" }
wasm-bindgen-test = { version = "0.3.42" }
@@ -222,8 +222,8 @@ must_use_candidate = "allow"
similar_names = "allow"
single_match_else = "allow"
too_many_lines = "allow"
needless_continue = "allow" # An explicit continue can be more readable, especially if the alternative is an empty block.
unnecessary_debug_formatting = "allow" # too many instances, the display also doesn't quote the path which is often desired in logs where we use them the most often.
needless_continue = "allow" # An explicit continue can be more readable, especially if the alternative is an empty block.
unnecessary_debug_formatting = "allow" # too many instances, the display also doesn't quote the path which is often desired in logs where we use them the most often.
# Without the hashes we run into a `rustfmt` bug in some snapshot tests, see #13250
needless_raw_string_hashes = "allow"
# Disallowed restriction lints

View File

@@ -148,8 +148,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.12.2/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.12.2/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.12.7/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.12.7/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -182,7 +182,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.12.2
rev: v0.12.7
hooks:
# Run the linter.
- id: ruff-check
@@ -430,6 +430,7 @@ Ruff is used by a number of major open-source projects and companies, including:
- [Babel](https://github.com/python-babel/babel)
- Benchling ([Refac](https://github.com/benchling/refac))
- [Bokeh](https://github.com/bokeh/bokeh)
- Capital One ([datacompy](https://github.com/capitalone/datacompy))
- CrowdCent ([NumerBlox](https://github.com/crowdcent/numerblox)) <!-- typos: ignore -->
- [Cryptography (PyCA)](https://github.com/pyca/cryptography)
- CERN ([Indico](https://getindico.io/))
@@ -506,6 +507,7 @@ Ruff is used by a number of major open-source projects and companies, including:
- [Streamlit](https://github.com/streamlit/streamlit)
- [The Algorithms](https://github.com/TheAlgorithms/Python)
- [Vega-Altair](https://github.com/altair-viz/altair)
- [Weblate](https://weblate.org/)
- WordPress ([Openverse](https://github.com/WordPress/openverse))
- [ZenML](https://github.com/zenml-io/zenml)
- [Zulip](https://github.com/zulip/zulip)

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.12.2"
version = "0.12.7"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -169,6 +169,9 @@ pub struct AnalyzeGraphCommand {
/// Attempt to detect imports from string literals.
#[clap(long)]
detect_string_imports: bool,
/// The minimum number of dots in a string import to consider it a valid import.
#[clap(long)]
min_dots: Option<usize>,
/// Enable preview mode. Use `--no-preview` to disable.
#[arg(long, overrides_with("no_preview"))]
preview: bool,
@@ -808,6 +811,7 @@ impl AnalyzeGraphCommand {
} else {
None
},
string_imports_min_dots: self.min_dots,
preview: resolve_bool_arg(self.preview, self.no_preview).map(PreviewMode::from),
target_version: self.target_version.map(ast::PythonVersion::from),
..ExplicitConfigOverrides::default()
@@ -1305,6 +1309,7 @@ struct ExplicitConfigOverrides {
show_fixes: Option<bool>,
extension: Option<Vec<ExtensionPair>>,
detect_string_imports: Option<bool>,
string_imports_min_dots: Option<usize>,
}
impl ConfigurationTransformer for ExplicitConfigOverrides {
@@ -1392,6 +1397,9 @@ impl ConfigurationTransformer for ExplicitConfigOverrides {
if let Some(detect_string_imports) = &self.detect_string_imports {
config.analyze.detect_string_imports = Some(*detect_string_imports);
}
if let Some(string_imports_min_dots) = &self.string_imports_min_dots {
config.analyze.string_imports_min_dots = Some(*string_imports_min_dots);
}
config
}
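The new `--min-dots` flag flows from the CLI override into the analyze settings as `string_imports_min_dots`, so string-literal import detection can require a minimum number of dots before a string counts as an import. The integration test later in this diff exercises exactly this combination:

```bash
# Count string literals with at least one dot (e.g. a value like "ruff.c") as imports;
# bare single-word strings are ignored.
ruff analyze graph --detect-string-imports --min-dots 1
```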

View File

@@ -18,14 +18,15 @@ use rustc_hash::FxHashMap;
use tempfile::NamedTempFile;
use ruff_cache::{CacheKey, CacheKeyHasher};
use ruff_db::diagnostic::Diagnostic;
use ruff_diagnostics::Fix;
use ruff_linter::message::OldDiagnostic;
use ruff_linter::message::create_lint_diagnostic;
use ruff_linter::package::PackageRoot;
use ruff_linter::{VERSION, warn_user};
use ruff_macros::CacheKey;
use ruff_notebook::NotebookIndex;
use ruff_source_file::SourceFileBuilder;
use ruff_text_size::{Ranged, TextRange, TextSize};
use ruff_text_size::{TextRange, TextSize};
use ruff_workspace::Settings;
use ruff_workspace::resolver::Resolver;
@@ -348,7 +349,7 @@ impl FileCache {
lint.messages
.iter()
.map(|msg| {
OldDiagnostic::lint(
create_lint_diagnostic(
&msg.body,
msg.suggestion.as_ref(),
msg.range,
@@ -428,11 +429,11 @@ pub(crate) struct LintCacheData {
impl LintCacheData {
pub(crate) fn from_diagnostics(
diagnostics: &[OldDiagnostic],
diagnostics: &[Diagnostic],
notebook_index: Option<NotebookIndex>,
) -> Self {
let source = if let Some(msg) = diagnostics.first() {
msg.source_file().source_text().to_owned()
msg.expect_ruff_source_file().source_text().to_owned()
} else {
String::new() // No messages, no need to keep the source!
};
@@ -446,16 +447,16 @@ impl LintCacheData {
.map(|(rule, msg)| {
// Make sure that all message use the same source file.
assert_eq!(
msg.source_file(),
diagnostics.first().unwrap().source_file(),
msg.expect_ruff_source_file(),
diagnostics.first().unwrap().expect_ruff_source_file(),
"message uses a different source file"
);
CacheMessage {
rule,
body: msg.body().to_string(),
suggestion: msg.suggestion().map(ToString::to_string),
range: msg.range(),
parent: msg.parent,
suggestion: msg.first_help_text().map(ToString::to_string),
range: msg.expect_range(),
parent: msg.parent(),
fix: msg.fix().cloned(),
noqa_offset: msg.noqa_offset(),
}
@@ -608,12 +609,12 @@ mod tests {
use anyhow::Result;
use filetime::{FileTime, set_file_mtime};
use itertools::Itertools;
use ruff_linter::settings::LinterSettings;
use test_case::test_case;
use ruff_cache::CACHE_DIR_NAME;
use ruff_linter::message::OldDiagnostic;
use ruff_db::diagnostic::Diagnostic;
use ruff_linter::package::PackageRoot;
use ruff_linter::settings::LinterSettings;
use ruff_linter::settings::flags;
use ruff_linter::settings::types::UnsafeFixes;
use ruff_python_ast::{PySourceType, PythonVersion};
@@ -680,7 +681,7 @@ mod tests {
UnsafeFixes::Enabled,
)
.unwrap();
if diagnostics.inner.iter().any(OldDiagnostic::is_syntax_error) {
if diagnostics.inner.iter().any(Diagnostic::is_invalid_syntax) {
parse_errors.push(path.clone());
}
paths.push(path);

View File

@@ -102,7 +102,7 @@ pub(crate) fn analyze_graph(
// Resolve the per-file settings.
let settings = resolver.resolve(path);
let string_imports = settings.analyze.detect_string_imports;
let string_imports = settings.analyze.string_imports;
let include_dependencies = settings.analyze.include_dependencies.get(path).cloned();
// Skip excluded files.

View File

@@ -11,13 +11,13 @@ use log::{debug, error, warn};
use rayon::prelude::*;
use rustc_hash::FxHashMap;
use ruff_db::diagnostic::Diagnostic;
use ruff_db::panic::catch_unwind;
use ruff_linter::OldDiagnostic;
use ruff_linter::package::PackageRoot;
use ruff_linter::registry::Rule;
use ruff_linter::settings::types::UnsafeFixes;
use ruff_linter::settings::{LinterSettings, flags};
use ruff_linter::{IOError, fs, warn_user_once};
use ruff_linter::{IOError, Violation, fs, warn_user_once};
use ruff_source_file::SourceFileBuilder;
use ruff_text_size::TextRange;
use ruff_workspace::resolver::{
@@ -129,11 +129,7 @@ pub(crate) fn check(
SourceFileBuilder::new(path.to_string_lossy().as_ref(), "").finish();
Diagnostics::new(
vec![OldDiagnostic::new(
IOError { message },
TextRange::default(),
&dummy,
)],
vec![IOError { message }.into_diagnostic(TextRange::default(), &dummy)],
FxHashMap::default(),
)
} else {
@@ -166,7 +162,9 @@ pub(crate) fn check(
|a, b| (a.0 + b.0, a.1 + b.1),
);
all_diagnostics.inner.sort();
all_diagnostics
.inner
.sort_by(Diagnostic::ruff_start_ordering);
// Store the caches.
caches.persist()?;
@@ -281,6 +279,7 @@ mod test {
TextEmitter::default()
.with_show_fix_status(true)
.with_color(false)
.emit(
&mut output,
&diagnostics.inner,

View File

@@ -1,6 +1,7 @@
use std::path::Path;
use anyhow::Result;
use ruff_db::diagnostic::Diagnostic;
use ruff_linter::package::PackageRoot;
use ruff_linter::packaging;
use ruff_linter::settings::flags;
@@ -52,6 +53,8 @@ pub(crate) fn check_stdin(
noqa,
fix_mode,
)?;
diagnostics.inner.sort_unstable();
diagnostics
.inner
.sort_unstable_by(Diagnostic::ruff_start_ordering);
Ok(diagnostics)
}

View File

@@ -10,35 +10,35 @@ use std::path::Path;
use anyhow::{Context, Result};
use colored::Colorize;
use log::{debug, warn};
use rustc_hash::FxHashMap;
use ruff_linter::OldDiagnostic;
use ruff_db::diagnostic::Diagnostic;
use ruff_linter::codes::Rule;
use ruff_linter::linter::{FixTable, FixerResult, LinterResult, ParseSource, lint_fix, lint_only};
use ruff_linter::message::create_syntax_error_diagnostic;
use ruff_linter::package::PackageRoot;
use ruff_linter::pyproject_toml::lint_pyproject_toml;
use ruff_linter::settings::types::UnsafeFixes;
use ruff_linter::settings::{LinterSettings, flags};
use ruff_linter::source_kind::{SourceError, SourceKind};
use ruff_linter::{IOError, fs};
use ruff_linter::{IOError, Violation, fs};
use ruff_notebook::{Notebook, NotebookError, NotebookIndex};
use ruff_python_ast::{PySourceType, SourceType, TomlSourceType};
use ruff_source_file::SourceFileBuilder;
use ruff_text_size::TextRange;
use ruff_workspace::Settings;
use rustc_hash::FxHashMap;
use crate::cache::{Cache, FileCacheKey, LintCacheData};
#[derive(Debug, Default, PartialEq)]
pub(crate) struct Diagnostics {
pub(crate) inner: Vec<OldDiagnostic>,
pub(crate) inner: Vec<Diagnostic>,
pub(crate) fixed: FixMap,
pub(crate) notebook_indexes: FxHashMap<String, NotebookIndex>,
}
impl Diagnostics {
pub(crate) fn new(
diagnostics: Vec<OldDiagnostic>,
diagnostics: Vec<Diagnostic>,
notebook_indexes: FxHashMap<String, NotebookIndex>,
) -> Self {
Self {
@@ -62,13 +62,12 @@ impl Diagnostics {
let name = path.map_or_else(|| "-".into(), Path::to_string_lossy);
let source_file = SourceFileBuilder::new(name, "").finish();
Self::new(
vec![OldDiagnostic::new(
vec![
IOError {
message: err.to_string(),
},
TextRange::default(),
&source_file,
)],
}
.into_diagnostic(TextRange::default(), &source_file),
],
FxHashMap::default(),
)
} else {
@@ -98,10 +97,10 @@ impl Diagnostics {
let name = path.map_or_else(|| "-".into(), Path::to_string_lossy);
let dummy = SourceFileBuilder::new(name, "").finish();
Self::new(
vec![OldDiagnostic::syntax_error(
vec![create_syntax_error_diagnostic(
dummy,
err,
TextRange::default(),
dummy,
)],
FxHashMap::default(),
)

View File

@@ -131,6 +131,7 @@ pub fn run(
}: Args,
) -> Result<ExitStatus> {
{
ruff_db::set_program_version(crate::version::version().to_string()).unwrap();
let default_panic_hook = std::panic::take_hook();
std::panic::set_hook(Box::new(move |info| {
#[expect(clippy::print_stderr)]
@@ -439,7 +440,7 @@ pub fn check(args: CheckCommand, global_options: GlobalConfigArgs) -> Result<Exi
if cli.statistics {
printer.write_statistics(&diagnostics, &mut summary_writer)?;
} else {
printer.write_once(&diagnostics, &mut summary_writer)?;
printer.write_once(&diagnostics, &mut summary_writer, preview)?;
}
if !cli.exit_zero {

View File

@@ -9,12 +9,14 @@ use itertools::{Itertools, iterate};
use ruff_linter::linter::FixTable;
use serde::Serialize;
use ruff_db::diagnostic::{
Diagnostic, DiagnosticFormat, DisplayDiagnosticConfig, DisplayDiagnostics, SecondaryCode,
};
use ruff_linter::fs::relativize_path;
use ruff_linter::logging::LogLevel;
use ruff_linter::message::{
AzureEmitter, Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter,
JsonEmitter, JsonLinesEmitter, JunitEmitter, OldDiagnostic, PylintEmitter, RdjsonEmitter,
SarifEmitter, SecondaryCode, TextEmitter,
Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter, SarifEmitter,
TextEmitter,
};
use ruff_linter::notify_user;
use ruff_linter::settings::flags::{self};
@@ -201,6 +203,7 @@ impl Printer {
&self,
diagnostics: &Diagnostics,
writer: &mut dyn Write,
preview: bool,
) -> Result<()> {
if matches!(self.log_level, LogLevel::Silent) {
return Ok(());
@@ -228,16 +231,32 @@ impl Printer {
match self.format {
OutputFormat::Json => {
JsonEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Json)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Rdjson => {
RdjsonEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Rdjson)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::JsonLines => {
JsonLinesEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::JsonLines)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Junit => {
JunitEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Junit)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Concise | OutputFormat::Full => {
TextEmitter::default()
@@ -245,6 +264,7 @@ impl Printer {
.with_show_fix_diff(self.flags.intersects(Flags::SHOW_FIX_DIFF))
.with_show_source(self.format == OutputFormat::Full)
.with_unsafe_fixes(self.unsafe_fixes)
.with_preview(preview)
.emit(writer, &diagnostics.inner, &context)?;
if self.flags.intersects(Flags::SHOW_FIX_SUMMARY) {
@@ -279,10 +299,18 @@ impl Printer {
GitlabEmitter::default().emit(writer, &diagnostics.inner, &context)?;
}
OutputFormat::Pylint => {
PylintEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Pylint)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Azure => {
AzureEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Azure)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Sarif => {
SarifEmitter.emit(writer, &diagnostics.inner, &context)?;
@@ -306,8 +334,7 @@ impl Printer {
.sorted_by_key(|(code, message)| (*code, message.fixable()))
.fold(
vec![],
|mut acc: Vec<((Option<&SecondaryCode>, &OldDiagnostic), usize)>,
(code, message)| {
|mut acc: Vec<((Option<&SecondaryCode>, &Diagnostic), usize)>, (code, message)| {
if let Some(((prev_code, _prev_message), count)) = acc.last_mut() {
if *prev_code == code {
*count += 1;

View File

@@ -57,33 +57,40 @@ fn dependencies() -> Result<()> {
.write_str(indoc::indoc! {r#"
def f(): pass
"#})?;
root.child("ruff")
.child("e.pyi")
.write_str(indoc::indoc! {r#"
def f() -> None: ...
"#})?;
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().current_dir(&root), @r###"
success: true
exit_code: 0
----- stdout -----
{
"ruff/__init__.py": [],
"ruff/a.py": [
"ruff/b.py"
],
"ruff/b.py": [
"ruff/c.py"
],
"ruff/c.py": [
"ruff/d.py"
],
"ruff/d.py": [
"ruff/e.py"
],
"ruff/e.py": []
}
assert_cmd_snapshot!(command().current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"ruff/__init__.py": [],
"ruff/a.py": [
"ruff/b.py"
],
"ruff/b.py": [
"ruff/c.py"
],
"ruff/c.py": [
"ruff/d.py"
],
"ruff/d.py": [
"ruff/e.py",
"ruff/e.pyi"
],
"ruff/e.py": [],
"ruff/e.pyi": []
}
----- stderr -----
"###);
----- stderr -----
"#);
});
Ok(())
@@ -197,23 +204,43 @@ fn string_detection() -> Result<()> {
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("--detect-string-imports").current_dir(&root), @r###"
success: true
exit_code: 0
----- stdout -----
{
"ruff/__init__.py": [],
"ruff/a.py": [
"ruff/b.py"
],
"ruff/b.py": [
"ruff/c.py"
],
"ruff/c.py": []
}
assert_cmd_snapshot!(command().arg("--detect-string-imports").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"ruff/__init__.py": [],
"ruff/a.py": [
"ruff/b.py"
],
"ruff/b.py": [],
"ruff/c.py": []
}
----- stderr -----
"###);
----- stderr -----
"#);
});
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().arg("--detect-string-imports").arg("--min-dots").arg("1").current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"ruff/__init__.py": [],
"ruff/a.py": [
"ruff/b.py"
],
"ruff/b.py": [
"ruff/c.py"
],
"ruff/c.py": []
}
----- stderr -----
"#);
});
Ok(())

View File

@@ -120,7 +120,7 @@ fn nonexistent_config_file() {
#[test]
fn config_override_rejected_if_invalid_toml() {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["format", "--config", "foo = bar", "."]), @r#"
.args(["format", "--config", "foo = bar", "."]), @r"
success: false
exit_code: 2
----- stdout -----
@@ -137,12 +137,11 @@ fn config_override_rejected_if_invalid_toml() {
TOML parse error at line 1, column 7
|
1 | foo = bar
| ^
invalid string
expected `"`, `'`
| ^^^
string values must be quoted, expected literal string
For more information, try '--help'.
"#);
");
}
#[test]

View File

@@ -1067,7 +1067,7 @@ fn show_statistics_syntax_errors() {
success: false
exit_code: 1
----- stdout -----
1 syntax-error
1 invalid-syntax
Found 1 error.
----- stderr -----
@@ -1080,7 +1080,7 @@ fn show_statistics_syntax_errors() {
success: false
exit_code: 1
----- stdout -----
1 syntax-error
1 invalid-syntax
Found 1 error.
----- stderr -----
@@ -1093,7 +1093,7 @@ fn show_statistics_syntax_errors() {
success: false
exit_code: 1
----- stdout -----
1 syntax-error
1 invalid-syntax
Found 1 error.
----- stderr -----
@@ -2246,8 +2246,7 @@ fn pyproject_toml_stdin_syntax_error() {
success: false
exit_code: 1
----- stdout -----
pyproject.toml:1:9: RUF200 Failed to parse pyproject.toml: invalid table header
expected `.`, `]`
pyproject.toml:1:9: RUF200 Failed to parse pyproject.toml: unclosed table, expected `]`
|
1 | [project
| ^ RUF200


@@ -534,7 +534,7 @@ fn nonexistent_config_file() {
fn config_override_rejected_if_invalid_toml() {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(STDIN_BASE_OPTIONS)
.args(["--config", "foo = bar", "."]), @r#"
.args(["--config", "foo = bar", "."]), @r"
success: false
exit_code: 2
----- stdout -----
@@ -551,12 +551,11 @@ fn config_override_rejected_if_invalid_toml() {
TOML parse error at line 1, column 7
|
1 | foo = bar
| ^
invalid string
expected `"`, `'`
| ^^^
string values must be quoted, expected literal string
For more information, try '--help'.
"#);
");
}
#[test]
@@ -733,9 +732,8 @@ select = [E501]
Cause: TOML parse error at line 3, column 11
|
3 | select = [E501]
| ^
invalid array
expected `]`
| ^^^^
string values must be quoted, expected literal string
");
});
@@ -876,7 +874,7 @@ fn each_toml_option_requires_a_new_flag_1() {
|
1 | extend-select=['F841'], line-length=90
| ^
expected newline, `#`
unexpected key or value, expected newline, `#`
For more information, try '--help'.
");
@@ -907,7 +905,7 @@ fn each_toml_option_requires_a_new_flag_2() {
|
1 | extend-select=['F841'] line-length=90
| ^
expected newline, `#`
unexpected key or value, expected newline, `#`
For more information, try '--help'.
");
@@ -995,6 +993,7 @@ fn value_given_to_table_key_is_not_inline_table_2() {
- `lint.exclude`
- `lint.preview`
- `lint.typing-extensions`
- `lint.future-annotations`
For more information, try '--help'.
");
@@ -2423,7 +2422,7 @@ requires-python = ">= 3.11"
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.11
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -2735,7 +2734,7 @@ requires-python = ">= 3.11"
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.10
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -3099,7 +3098,7 @@ from typing import Union;foo: Union[int, str] = 1
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.11
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -3479,7 +3478,7 @@ from typing import Union;foo: Union[int, str] = 1
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.11
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -3807,7 +3806,7 @@ from typing import Union;foo: Union[int, str] = 1
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.10
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -4135,7 +4134,7 @@ from typing import Union;foo: Union[int, str] = 1
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.9
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -4420,7 +4419,7 @@ from typing import Union;foo: Union[int, str] = 1
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.9
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -4758,7 +4757,7 @@ from typing import Union;foo: Union[int, str] = 1
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.10
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
@@ -5692,3 +5691,82 @@ class Foo:
"
);
}
#[test_case::test_case("concise")]
#[test_case::test_case("full")]
#[test_case::test_case("json")]
#[test_case::test_case("json-lines")]
#[test_case::test_case("junit")]
#[test_case::test_case("grouped")]
#[test_case::test_case("github")]
#[test_case::test_case("gitlab")]
#[test_case::test_case("pylint")]
#[test_case::test_case("rdjson")]
#[test_case::test_case("azure")]
#[test_case::test_case("sarif")]
fn output_format(output_format: &str) -> Result<()> {
const CONTENT: &str = "\
import os # F401
x = y # F821
match 42: # invalid-syntax
case _: ...
";
let tempdir = TempDir::new()?;
let input = tempdir.path().join("input.py");
fs::write(&input, CONTENT)?;
let snapshot = format!("output_format_{output_format}");
let project_dir = dunce::canonicalize(tempdir.path())?;
insta::with_settings!({
filters => vec![
(tempdir_filter(&project_dir).as_str(), "[TMP]/"),
(tempdir_filter(&tempdir).as_str(), "[TMP]/"),
(r#""[^"]+\\?/?input.py"#, r#""[TMP]/input.py"#),
(ruff_linter::VERSION, "[VERSION]"),
]
}, {
assert_cmd_snapshot!(
snapshot,
Command::new(get_cargo_bin(BIN_NAME))
.args([
"check",
"--no-cache",
"--output-format",
output_format,
"--select",
"F401,F821",
"--target-version",
"py39",
"input.py",
])
.current_dir(&tempdir),
);
});
Ok(())
}
#[test]
fn future_annotations_preview_warning() {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.args(STDIN_BASE_OPTIONS)
.args(["--config", "lint.future-annotations = true"])
.args(["--select", "F"])
.arg("--no-preview")
.arg("-")
.pass_stdin("1"),
@r"
success: true
exit_code: 0
----- stdout -----
All checks passed!
----- stderr -----
warning: The `lint.future-annotations` setting will have no effect because `preview` is disabled
",
);
}


@@ -0,0 +1,23 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- azure
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
##vso[task.logissue type=error;sourcepath=[TMP]/input.py;linenumber=1;columnnumber=8;code=F401;]`os` imported but unused
##vso[task.logissue type=error;sourcepath=[TMP]/input.py;linenumber=2;columnnumber=5;code=F821;]Undefined name `y`
##vso[task.logissue type=error;sourcepath=[TMP]/input.py;linenumber=3;columnnumber=1;]SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)
----- stderr -----


@@ -0,0 +1,25 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- concise
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
input.py:1:8: F401 [*] `os` imported but unused
input.py:2:5: F821 Undefined name `y`
input.py:3:1: SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)
Found 3 errors.
[*] 1 fixable with the `--fix` option.
----- stderr -----


@@ -0,0 +1,49 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- full
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
input.py:1:8: F401 [*] `os` imported but unused
|
1 | import os # F401
| ^^ F401
2 | x = y # F821
3 | match 42: # invalid-syntax
|
= help: Remove unused import: `os`
input.py:2:5: F821 Undefined name `y`
|
1 | import os # F401
2 | x = y # F821
| ^ F821
3 | match 42: # invalid-syntax
4 | case _: ...
|
input.py:3:1: SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)
|
1 | import os # F401
2 | x = y # F821
3 | match 42: # invalid-syntax
| ^^^^^
4 | case _: ...
|
Found 3 errors.
[*] 1 fixable with the `--fix` option.
----- stderr -----


@@ -0,0 +1,23 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- github
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
::error title=Ruff (F401),file=[TMP]/input.py,line=1,col=8,endLine=1,endColumn=10::input.py:1:8: F401 `os` imported but unused
::error title=Ruff (F821),file=[TMP]/input.py,line=2,col=5,endLine=2,endColumn=6::input.py:2:5: F821 Undefined name `y`
::error title=Ruff,file=[TMP]/input.py,line=3,col=1,endLine=3,endColumn=6::input.py:3:1: SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)
----- stderr -----


@@ -0,0 +1,60 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- gitlab
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
[
{
"check_name": "F401",
"description": "`os` imported but unused",
"fingerprint": "4dbad37161e65c72",
"location": {
"lines": {
"begin": 1,
"end": 1
},
"path": "input.py"
},
"severity": "major"
},
{
"check_name": "F821",
"description": "Undefined name `y`",
"fingerprint": "7af59862a085230",
"location": {
"lines": {
"begin": 2,
"end": 2
},
"path": "input.py"
},
"severity": "major"
},
{
"check_name": "syntax-error",
"description": "Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)",
"fingerprint": "e558cec859bb66e8",
"location": {
"lines": {
"begin": 3,
"end": 3
},
"path": "input.py"
},
"severity": "major"
}
]
----- stderr -----


@@ -0,0 +1,27 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- grouped
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
input.py:
1:8 F401 [*] `os` imported but unused
2:5 F821 Undefined name `y`
3:1 SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)
Found 3 errors.
[*] 1 fixable with the `--fix` option.
----- stderr -----


@@ -0,0 +1,23 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- json-lines
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
{"cell":null,"code":"F401","end_location":{"column":10,"row":1},"filename":"[TMP]/input.py","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":1,"row":2},"location":{"column":1,"row":1}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":1},"message":"`os` imported but unused","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":null,"code":"F821","end_location":{"column":6,"row":2},"filename":"[TMP]/input.py","fix":null,"location":{"column":5,"row":2},"message":"Undefined name `y`","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/undefined-name"}
{"cell":null,"code":null,"end_location":{"column":6,"row":3},"filename":"[TMP]/input.py","fix":null,"location":{"column":1,"row":3},"message":"SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)","noqa_row":null,"url":null}
----- stderr -----


@@ -0,0 +1,88 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- json
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
[
{
"cell": null,
"code": "F401",
"end_location": {
"column": 10,
"row": 1
},
"filename": "[TMP]/input.py",
"fix": {
"applicability": "safe",
"edits": [
{
"content": "",
"end_location": {
"column": 1,
"row": 2
},
"location": {
"column": 1,
"row": 1
}
}
],
"message": "Remove unused import: `os`"
},
"location": {
"column": 8,
"row": 1
},
"message": "`os` imported but unused",
"noqa_row": 1,
"url": "https://docs.astral.sh/ruff/rules/unused-import"
},
{
"cell": null,
"code": "F821",
"end_location": {
"column": 6,
"row": 2
},
"filename": "[TMP]/input.py",
"fix": null,
"location": {
"column": 5,
"row": 2
},
"message": "Undefined name `y`",
"noqa_row": 2,
"url": "https://docs.astral.sh/ruff/rules/undefined-name"
},
{
"cell": null,
"code": null,
"end_location": {
"column": 6,
"row": 3
},
"filename": "[TMP]/input.py",
"fix": null,
"location": {
"column": 1,
"row": 3
},
"message": "SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)",
"noqa_row": null,
"url": null
}
]
----- stderr -----


@@ -0,0 +1,34 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- junit
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff" tests="3" failures="3" errors="0">
<testsuite name="[TMP]/input.py" tests="3" disabled="0" errors="0" failures="3" package="org.ruff">
<testcase name="org.ruff.F401" classname="[TMP]/input" line="1" column="8">
<failure message="`os` imported but unused">line 1, col 8, `os` imported but unused</failure>
</testcase>
<testcase name="org.ruff.F821" classname="[TMP]/input" line="2" column="5">
<failure message="Undefined name `y`">line 2, col 5, Undefined name `y`</failure>
</testcase>
<testcase name="org.ruff.invalid-syntax" classname="[TMP]/input" line="3" column="1">
<failure message="SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)">line 3, col 1, SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)</failure>
</testcase>
</testsuite>
</testsuites>
----- stderr -----


@@ -0,0 +1,23 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- pylint
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
input.py:1: [F401] `os` imported but unused
input.py:2: [F821] Undefined name `y`
input.py:3: [invalid-syntax] SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)
----- stderr -----


@@ -0,0 +1,102 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- rdjson
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
{
"diagnostics": [
{
"code": {
"url": "https://docs.astral.sh/ruff/rules/unused-import",
"value": "F401"
},
"location": {
"path": "[TMP]/input.py",
"range": {
"end": {
"column": 10,
"line": 1
},
"start": {
"column": 8,
"line": 1
}
}
},
"message": "`os` imported but unused",
"suggestions": [
{
"range": {
"end": {
"column": 1,
"line": 2
},
"start": {
"column": 1,
"line": 1
}
},
"text": ""
}
]
},
{
"code": {
"url": "https://docs.astral.sh/ruff/rules/undefined-name",
"value": "F821"
},
"location": {
"path": "[TMP]/input.py",
"range": {
"end": {
"column": 6,
"line": 2
},
"start": {
"column": 5,
"line": 2
}
}
},
"message": "Undefined name `y`"
},
{
"code": {
"value": "invalid-syntax"
},
"location": {
"path": "[TMP]/input.py",
"range": {
"end": {
"column": 6,
"line": 3
},
"start": {
"column": 1,
"line": 3
}
}
},
"message": "SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)"
}
],
"severity": "WARNING",
"source": {
"name": "ruff",
"url": "https://docs.astral.sh/ruff"
}
}
----- stderr -----


@@ -0,0 +1,142 @@
---
source: crates/ruff/tests/lint.rs
info:
program: ruff
args:
- check
- "--no-cache"
- "--output-format"
- sarif
- "--select"
- "F401,F821"
- "--target-version"
- py39
- input.py
---
success: false
exit_code: 1
----- stdout -----
{
"$schema": "https://json.schemastore.org/sarif-2.1.0.json",
"runs": [
{
"results": [
{
"level": "error",
"locations": [
{
"physicalLocation": {
"artifactLocation": {
"uri": "[TMP]/input.py"
},
"region": {
"endColumn": 10,
"endLine": 1,
"startColumn": 8,
"startLine": 1
}
}
}
],
"message": {
"text": "`os` imported but unused"
},
"ruleId": "F401"
},
{
"level": "error",
"locations": [
{
"physicalLocation": {
"artifactLocation": {
"uri": "[TMP]/input.py"
},
"region": {
"endColumn": 6,
"endLine": 2,
"startColumn": 5,
"startLine": 2
}
}
}
],
"message": {
"text": "Undefined name `y`"
},
"ruleId": "F821"
},
{
"level": "error",
"locations": [
{
"physicalLocation": {
"artifactLocation": {
"uri": "[TMP]/input.py"
},
"region": {
"endColumn": 6,
"endLine": 3,
"startColumn": 1,
"startLine": 3
}
}
}
],
"message": {
"text": "SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)"
},
"ruleId": null
}
],
"tool": {
"driver": {
"informationUri": "https://github.com/astral-sh/ruff",
"name": "ruff",
"rules": [
{
"fullDescription": {
"text": "## What it does\nChecks for unused imports.\n\n## Why is this bad?\nUnused imports add a performance overhead at runtime, and risk creating\nimport cycles. They also increase the cognitive load of reading the code.\n\nIf an import statement is used to check for the availability or existence\nof a module, consider using `importlib.util.find_spec` instead.\n\nIf an import statement is used to re-export a symbol as part of a module's\npublic interface, consider using a \"redundant\" import alias, which\ninstructs Ruff (and other tools) to respect the re-export, and avoid\nmarking it as unused, as in:\n\n```python\nfrom module import member as member\n```\n\nAlternatively, you can use `__all__` to declare a symbol as part of the module's\ninterface, as in:\n\n```python\n# __init__.py\nimport some_module\n\n__all__ = [\"some_module\"]\n```\n\n## Fix safety\n\nFixes to remove unused imports are safe, except in `__init__.py` files.\n\nApplying fixes to `__init__.py` files is currently in preview. The fix offered depends on the\ntype of the unused import. Ruff will suggest a safe fix to export first-party imports with\neither a redundant alias or, if already present in the file, an `__all__` entry. If multiple\n`__all__` declarations are present, Ruff will not offer a fix. Ruff will suggest an unsafe fix\nto remove third-party and standard library imports -- the fix is unsafe because the module's\ninterface changes.\n\n## Example\n\n```python\nimport numpy as np # unused import\n\n\ndef area(radius):\n return 3.14 * radius**2\n```\n\nUse instead:\n\n```python\ndef area(radius):\n return 3.14 * radius**2\n```\n\nTo check the availability of a module, use `importlib.util.find_spec`:\n\n```python\nfrom importlib.util import find_spec\n\nif find_spec(\"numpy\") is not None:\n print(\"numpy is installed\")\nelse:\n print(\"numpy is not installed\")\n```\n\n## Preview\nWhen [preview](https://docs.astral.sh/ruff/preview/) is enabled,\nthe criterion for determining whether an import is first-party\nis stricter, which could affect the suggested fix. See [this FAQ section](https://docs.astral.sh/ruff/faq/#how-does-ruff-determine-which-of-my-imports-are-first-party-third-party-etc) for more details.\n\n## Options\n- `lint.ignore-init-module-imports`\n- `lint.pyflakes.allowed-unused-imports`\n\n## References\n- [Python documentation: `import`](https://docs.python.org/3/reference/simple_stmts.html#the-import-statement)\n- [Python documentation: `importlib.util.find_spec`](https://docs.python.org/3/library/importlib.html#importlib.util.find_spec)\n- [Typing documentation: interface conventions](https://typing.python.org/en/latest/source/libraries.html#library-interface-public-and-private-symbols)\n"
},
"help": {
"text": "`{name}` imported but unused; consider using `importlib.util.find_spec` to test for availability"
},
"helpUri": "https://docs.astral.sh/ruff/rules/unused-import",
"id": "F401",
"properties": {
"id": "F401",
"kind": "Pyflakes",
"name": "unused-import",
"problem.severity": "error"
},
"shortDescription": {
"text": "`{name}` imported but unused; consider using `importlib.util.find_spec` to test for availability"
}
},
{
"fullDescription": {
"text": "## What it does\nChecks for uses of undefined names.\n\n## Why is this bad?\nAn undefined name is likely to raise `NameError` at runtime.\n\n## Example\n```python\ndef double():\n return n * 2 # raises `NameError` if `n` is undefined when `double` is called\n```\n\nUse instead:\n```python\ndef double(n):\n return n * 2\n```\n\n## Options\n- [`target-version`]: Can be used to configure which symbols Ruff will understand\n as being available in the `builtins` namespace.\n\n## References\n- [Python documentation: Naming and binding](https://docs.python.org/3/reference/executionmodel.html#naming-and-binding)\n"
},
"help": {
"text": "Undefined name `{name}`. {tip}"
},
"helpUri": "https://docs.astral.sh/ruff/rules/undefined-name",
"id": "F821",
"properties": {
"id": "F821",
"kind": "Pyflakes",
"name": "undefined-name",
"problem.severity": "error"
},
"shortDescription": {
"text": "Undefined name `{name}`. {tip}"
}
}
],
"version": "[VERSION]"
}
}
}
],
"version": "2.1.0"
}
----- stderr -----


@@ -392,7 +392,7 @@ formatter.docstring_code_line_width = dynamic
analyze.exclude = []
analyze.preview = disabled
analyze.target_version = 3.7
analyze.detect_string_imports = false
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}


@@ -60,7 +60,7 @@ fn config_option_ignored_but_validated() {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.arg("version")
.args(["--config", "foo = bar"]), @r#"
.args(["--config", "foo = bar"]), @r"
success: false
exit_code: 2
----- stdout -----
@@ -77,12 +77,11 @@ fn config_option_ignored_but_validated() {
TOML parse error at line 1, column 7
|
1 | foo = bar
| ^
invalid string
expected `"`, `'`
| ^^^
string values must be quoted, expected literal string
For more information, try '--help'.
"#
"
);
});
}


@@ -2,6 +2,7 @@
use ruff_benchmark::criterion;
use ruff_benchmark::real_world_projects::{InstalledProject, RealWorldProject};
use std::fmt::Write;
use std::ops::Range;
use criterion::{BatchSize, Criterion, criterion_group, criterion_main};
@@ -17,7 +18,7 @@ use ruff_python_ast::PythonVersion;
use ty_project::metadata::options::{EnvironmentOptions, Options};
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
use ty_project::watch::{ChangeEvent, ChangedKind};
use ty_project::{Db, ProjectDatabase, ProjectMetadata};
use ty_project::{CheckMode, Db, ProjectDatabase, ProjectMetadata};
struct Case {
db: ProjectDatabase,
@@ -101,6 +102,7 @@ fn setup_tomllib_case() -> Case {
let re = re.unwrap();
db.set_check_mode(CheckMode::OpenFiles);
db.project().set_open_files(&mut db, tomllib_files);
let re_path = re.path(&db).as_system_path().unwrap().to_owned();
@@ -236,6 +238,7 @@ fn setup_micro_case(code: &str) -> Case {
let mut db = ProjectDatabase::new(metadata, system).unwrap();
let file = system_path_to_file(&db, SystemPathBuf::from(file_path)).unwrap();
db.set_check_mode(CheckMode::OpenFiles);
db.project()
.set_open_files(&mut db, FxHashSet::from_iter([file]));
@@ -348,6 +351,41 @@ fn benchmark_many_tuple_assignments(criterion: &mut Criterion) {
});
}
fn benchmark_tuple_implicit_instance_attributes(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("ty_micro[many_tuple_assignments]", |b| {
b.iter_batched_ref(
|| {
// This is a regression benchmark for a case that used to hang:
// https://github.com/astral-sh/ty/issues/765
setup_micro_case(
r#"
from typing import Any
class A:
foo: tuple[Any, ...]
class B(A):
def __init__(self, parent: "C", x: tuple[Any]):
self.foo = parent.foo + x
class C(A):
def __init__(self, parent: B, x: tuple[Any]):
self.foo = parent.foo + x
"#,
)
},
|case| {
let Case { db, .. } = case;
let result = db.check();
assert_eq!(result.len(), 0);
},
BatchSize::SmallInput,
);
});
}
fn benchmark_complex_constrained_attributes_1(criterion: &mut Criterion) {
setup_rayon();
@@ -441,6 +479,37 @@ fn benchmark_complex_constrained_attributes_2(criterion: &mut Criterion) {
});
}
fn benchmark_many_enum_members(criterion: &mut Criterion) {
const NUM_ENUM_MEMBERS: usize = 512;
setup_rayon();
let mut code = String::new();
writeln!(&mut code, "from enum import Enum").ok();
writeln!(&mut code, "class E(Enum):").ok();
for i in 0..NUM_ENUM_MEMBERS {
writeln!(&mut code, " m{i} = {i}").ok();
}
writeln!(&mut code).ok();
for i in 0..NUM_ENUM_MEMBERS {
writeln!(&mut code, "print(E.m{i})").ok();
}
criterion.bench_function("ty_micro[many_enum_members]", |b| {
b.iter_batched_ref(
|| setup_micro_case(&code),
|case| {
let Case { db, .. } = case;
let result = db.check();
assert_eq!(result.len(), 0);
},
BatchSize::SmallInput,
);
});
}
struct ProjectBenchmark<'a> {
project: InstalledProject<'a>,
fs: MemoryFileSystem,
@@ -493,17 +562,21 @@ impl<'a> ProjectBenchmark<'a> {
#[track_caller]
fn bench_project(benchmark: &ProjectBenchmark, criterion: &mut Criterion) {
fn check_project(db: &mut ProjectDatabase, max_diagnostics: usize) {
fn check_project(db: &mut ProjectDatabase, project_name: &str, max_diagnostics: usize) {
let result = db.check();
let diagnostics = result.len();
assert!(
diagnostics > 1 && diagnostics <= max_diagnostics,
"Expected between {} and {} diagnostics but got {}",
1,
max_diagnostics,
diagnostics
);
if diagnostics > max_diagnostics {
let details = result
.into_iter()
.map(|diagnostic| diagnostic.concise_message().to_string())
.collect::<Vec<_>>()
.join("\n ");
assert!(
diagnostics <= max_diagnostics,
"{project_name}: Expected <={max_diagnostics} diagnostics but got {diagnostics}:\n {details}",
);
}
}
setup_rayon();
@@ -513,7 +586,7 @@ fn bench_project(benchmark: &ProjectBenchmark, criterion: &mut Criterion) {
group.bench_function(benchmark.project.config.name, |b| {
b.iter_batched_ref(
|| benchmark.setup_iteration(),
|db| check_project(db, benchmark.max_diagnostics),
|db| check_project(db, benchmark.project.config.name, benchmark.max_diagnostics),
BatchSize::SmallInput,
);
});
@@ -570,13 +643,32 @@ fn anyio(criterion: &mut Criterion) {
bench_project(&benchmark, criterion);
}
fn datetype(criterion: &mut Criterion) {
let benchmark = ProjectBenchmark::new(
RealWorldProject {
name: "DateType",
repository: "https://github.com/glyph/DateType",
commit: "57c9c93cf2468069f72945fc04bf27b64100dad8",
paths: vec![SystemPath::new("src")],
dependencies: vec![],
max_dep_date: "2025-07-04",
python_version: PythonVersion::PY313,
},
2,
);
bench_project(&benchmark, criterion);
}
criterion_group!(check_file, benchmark_cold, benchmark_incremental);
criterion_group!(
micro,
benchmark_many_string_assignments,
benchmark_many_tuple_assignments,
benchmark_tuple_implicit_instance_attributes,
benchmark_complex_constrained_attributes_1,
benchmark_complex_constrained_attributes_2,
benchmark_many_enum_members,
);
criterion_group!(project, anyio, attrs, hydra);
criterion_group!(project, anyio, attrs, hydra, datetype);
criterion_main!(check_file, micro, project);


@@ -242,20 +242,19 @@ fn large(bencher: Bencher, benchmark: &Benchmark) {
run_single_threaded(bencher, benchmark);
}
// Currently disabled because the benchmark is too noisy (± 10%) to give useful feedback.
// #[bench(args=[&*PYDANTIC], sample_size=3, sample_count=3)]
// fn multithreaded(bencher: Bencher, benchmark: &Benchmark) {
// let thread_pool = ThreadPoolBuilder::new().build().unwrap();
#[bench(args=[&*PYDANTIC], sample_size=3, sample_count=8)]
fn multithreaded(bencher: Bencher, benchmark: &Benchmark) {
let thread_pool = ThreadPoolBuilder::new().build().unwrap();
// bencher
// .with_inputs(|| benchmark.setup_iteration())
// .bench_local_values(|db| {
// thread_pool.install(|| {
// check_project(&db, benchmark.max_diagnostics);
// db
// })
// });
// }
bencher
.with_inputs(|| benchmark.setup_iteration())
.bench_local_values(|db| {
thread_pool.install(|| {
check_project(&db, benchmark.max_diagnostics);
db
})
});
}
fn main() {
ThreadPoolBuilder::new()
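The previously commented-out multithreaded benchmark is re-enabled above. It relies on rayon's `ThreadPoolBuilder` plus `install`, which runs a closure inside a dedicated pool so that any parallel work spawned within it uses that pool's threads. A minimal standalone sketch of that mechanism (not the benchmark itself; assumes `rayon` as a dependency):

```rust
// Minimal sketch of rayon's ThreadPoolBuilder/install pattern used by the
// re-enabled multithreaded benchmark above (not the benchmark itself).
use rayon::prelude::*;

fn main() {
    let pool = rayon::ThreadPoolBuilder::new()
        .num_threads(4)
        .build()
        .unwrap();

    // Work executed inside `install` runs on the dedicated pool's threads.
    let sum: u64 = pool.install(|| (1..=1_000u64).into_par_iter().sum());
    assert_eq!(sum, 500_500);
}
```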


@@ -13,17 +13,18 @@ license = { workspace = true }
[dependencies]
ruff_annotate_snippets = { workspace = true }
ruff_cache = { workspace = true, optional = true }
ruff_diagnostics = { workspace = true }
ruff_notebook = { workspace = true }
ruff_python_ast = { workspace = true, features = ["get-size"] }
ruff_python_parser = { workspace = true }
ruff_python_trivia = { workspace = true }
ruff_source_file = { workspace = true, features = ["get-size"] }
ruff_text_size = { workspace = true }
ty_static = { workspace = true }
anstyle = { workspace = true }
arc-swap = { workspace = true }
camino = { workspace = true }
countme = { workspace = true }
dashmap = { workspace = true }
dunce = { workspace = true }
filetime = { workspace = true }
@@ -32,10 +33,12 @@ glob = { workspace = true }
ignore = { workspace = true, optional = true }
matchit = { workspace = true }
path-slash = { workspace = true }
quick-junit = { workspace = true, optional = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
schemars = { workspace = true, optional = true }
serde = { workspace = true, optional = true }
serde_json = { workspace = true, optional = true }
thiserror = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true, optional = true }
@@ -53,7 +56,13 @@ tempfile = { workspace = true }
[features]
cache = ["ruff_cache"]
junit = ["dep:quick-junit"]
os = ["ignore", "dep:etcetera"]
serde = ["dep:serde", "camino/serde1"]
serde = [
"camino/serde1",
"dep:serde",
"dep:serde_json",
"ruff_diagnostics/serde",
]
# Exposes testing utilities.
testing = ["tracing-subscriber"]


@@ -1,12 +1,14 @@
use std::{fmt::Formatter, sync::Arc};
use std::{fmt::Formatter, path::Path, sync::Arc};
use render::{FileResolver, Input};
use ruff_source_file::{SourceCode, SourceFile};
use ruff_diagnostics::{Applicability, Fix};
use ruff_source_file::{LineColumn, SourceCode, SourceFile};
use ruff_annotate_snippets::Level as AnnotateLevel;
use ruff_text_size::{Ranged, TextRange};
use ruff_text_size::{Ranged, TextRange, TextSize};
pub use self::render::DisplayDiagnostic;
pub use self::render::{
DisplayDiagnostic, DisplayDiagnostics, FileResolver, Input, ceil_char_boundary,
};
use crate::{Db, files::File};
mod render;
@@ -19,7 +21,7 @@ mod stylesheet;
/// characteristics in the inputs given to the tool. Typically, but not always,
/// a characteristic is a deficiency. An example of a characteristic that is
/// _not_ a deficiency is the `reveal_type` diagnostic for our type checker.
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
#[derive(Debug, Clone, Eq, PartialEq, Hash, get_size2::GetSize)]
pub struct Diagnostic {
/// The actual diagnostic.
///
@@ -62,10 +64,37 @@ impl Diagnostic {
message: message.into_diagnostic_message(),
annotations: vec![],
subs: vec![],
fix: None,
parent: None,
noqa_offset: None,
secondary_code: None,
});
Diagnostic { inner }
}
/// Creates a `Diagnostic` for a syntax error.
///
/// Unlike the more general [`Diagnostic::new`], this requires a [`Span`] and a [`TextRange`]
/// attached to it.
///
/// This should _probably_ be a method on the syntax errors, but
/// at time of writing, `ruff_db` depends on `ruff_python_parser` instead of
/// the other way around. And since we want to do this conversion in a couple
/// places, it makes sense to centralize it _somewhere_. So it's here for now.
///
/// Note that `message` is stored in the primary annotation, _not_ in the primary diagnostic
/// message.
pub fn invalid_syntax(
span: impl Into<Span>,
message: impl IntoDiagnosticMessage,
range: impl Ranged,
) -> Diagnostic {
let mut diag = Diagnostic::new(DiagnosticId::InvalidSyntax, Severity::Error, "");
let span = span.into().with_range(range.range());
diag.annotate(Annotation::primary(span).message(message));
diag
}
/// Add an annotation to this diagnostic.
///
/// Annotations for a diagnostic are optional, but if any are added,
@@ -95,7 +124,14 @@ impl Diagnostic {
/// directly. If callers want or need to avoid cloning the diagnostic
/// message, then they can also pass a `DiagnosticMessage` directly.
pub fn info<'a>(&mut self, message: impl IntoDiagnosticMessage + 'a) {
self.sub(SubDiagnostic::new(Severity::Info, message));
self.sub(SubDiagnostic::new(SubDiagnosticSeverity::Info, message));
}
/// Adds a "help" sub-diagnostic with the given message.
///
/// See the closely related [`Diagnostic::info`] method for more details.
pub fn help<'a>(&mut self, message: impl IntoDiagnosticMessage + 'a) {
self.sub(SubDiagnostic::new(SubDiagnosticSeverity::Help, message));
}
/// Adds a "sub" diagnostic to this diagnostic.
@@ -226,6 +262,11 @@ impl Diagnostic {
self.primary_annotation().map(|ann| ann.span.clone())
}
/// Returns a reference to the primary span of this diagnostic.
pub fn primary_span_ref(&self) -> Option<&Span> {
self.primary_annotation().map(|ann| &ann.span)
}
/// Returns the tags from the primary annotation of this diagnostic if it exists.
pub fn primary_tags(&self) -> Option<&[DiagnosticTag]> {
self.primary_annotation().map(|ann| ann.tags.as_slice())
@@ -268,15 +309,187 @@ impl Diagnostic {
pub fn sub_diagnostics(&self) -> &[SubDiagnostic] {
&self.inner.subs
}
/// Returns the fix for this diagnostic if it exists.
pub fn fix(&self) -> Option<&Fix> {
self.inner.fix.as_ref()
}
/// Set the fix for this diagnostic.
pub fn set_fix(&mut self, fix: Fix) {
debug_assert!(
self.primary_span().is_some(),
"Expected a source file for a diagnostic with a fix"
);
Arc::make_mut(&mut self.inner).fix = Some(fix);
}
/// Remove the fix for this diagnostic.
pub fn remove_fix(&mut self) {
Arc::make_mut(&mut self.inner).fix = None;
}
/// Returns `true` if the diagnostic contains a [`Fix`].
pub fn fixable(&self) -> bool {
self.fix().is_some()
}
/// Returns the offset of the parent statement for this diagnostic if it exists.
///
/// This is primarily used for checking noqa/secondary code suppressions.
pub fn parent(&self) -> Option<TextSize> {
self.inner.parent
}
/// Set the offset of the diagnostic's parent statement.
pub fn set_parent(&mut self, parent: TextSize) {
Arc::make_mut(&mut self.inner).parent = Some(parent);
}
/// Returns the remapped offset for a suppression comment if it exists.
///
/// Like [`Diagnostic::parent`], this is used for noqa code suppression comments in Ruff.
pub fn noqa_offset(&self) -> Option<TextSize> {
self.inner.noqa_offset
}
/// Set the remapped offset for a suppression comment.
pub fn set_noqa_offset(&mut self, noqa_offset: TextSize) {
Arc::make_mut(&mut self.inner).noqa_offset = Some(noqa_offset);
}
/// Returns the secondary code for the diagnostic if it exists.
///
/// The "primary" code for the diagnostic is its lint name. Diagnostics in ty don't have
/// secondary codes (yet), but in Ruff the noqa code is used.
pub fn secondary_code(&self) -> Option<&SecondaryCode> {
self.inner.secondary_code.as_ref()
}
/// Set the secondary code for this diagnostic.
pub fn set_secondary_code(&mut self, code: SecondaryCode) {
Arc::make_mut(&mut self.inner).secondary_code = Some(code);
}
/// Returns the name used to represent the diagnostic.
pub fn name(&self) -> &'static str {
self.id().as_str()
}
/// Returns `true` if `self` is a syntax error message.
pub fn is_invalid_syntax(&self) -> bool {
self.id().is_invalid_syntax()
}
/// Returns the message body to display to the user.
pub fn body(&self) -> &str {
self.primary_message()
}
/// Returns the message of the first sub-diagnostic with a `Help` severity.
///
/// Note that this is used as the fix title/suggestion for some of Ruff's output formats, but in
/// general this is not the guaranteed meaning of such a message.
pub fn first_help_text(&self) -> Option<&str> {
self.sub_diagnostics()
.iter()
.find(|sub| matches!(sub.inner.severity, SubDiagnosticSeverity::Help))
.map(|sub| sub.inner.message.as_str())
}
/// Returns the URL for the rule documentation, if it exists.
pub fn to_ruff_url(&self) -> Option<String> {
if self.is_invalid_syntax() {
None
} else {
Some(format!(
"{}/rules/{}",
env!("CARGO_PKG_HOMEPAGE"),
self.name()
))
}
}
/// Returns the filename for the message.
///
/// Panics if the diagnostic has no primary span, or if its file is not a `SourceFile`.
pub fn expect_ruff_filename(&self) -> String {
self.expect_primary_span()
.expect_ruff_file()
.name()
.to_string()
}
/// Computes the start source location for the message.
///
/// Panics if the diagnostic has no primary span, if its file is not a `SourceFile`, or if the
/// span has no range.
pub fn expect_ruff_start_location(&self) -> LineColumn {
self.expect_primary_span()
.expect_ruff_file()
.to_source_code()
.line_column(self.expect_range().start())
}
/// Computes the end source location for the message.
///
/// Panics if the diagnostic has no primary span, if its file is not a `SourceFile`, or if the
/// span has no range.
pub fn expect_ruff_end_location(&self) -> LineColumn {
self.expect_primary_span()
.expect_ruff_file()
.to_source_code()
.line_column(self.expect_range().end())
}
/// Returns the [`SourceFile`] which the message belongs to.
pub fn ruff_source_file(&self) -> Option<&SourceFile> {
self.primary_span_ref()?.as_ruff_file()
}
/// Returns the [`SourceFile`] which the message belongs to.
///
/// Panics if the diagnostic has no primary span, or if its file is not a `SourceFile`.
pub fn expect_ruff_source_file(&self) -> &SourceFile {
self.ruff_source_file()
.expect("Expected a ruff source file")
}
/// Returns the [`TextRange`] for the diagnostic.
pub fn range(&self) -> Option<TextRange> {
self.primary_span()?.range()
}
/// Returns the [`TextRange`] for the diagnostic.
///
/// Panics if the diagnostic has no primary span or if the span has no range.
pub fn expect_range(&self) -> TextRange {
self.range().expect("Expected a range for the primary span")
}
/// Returns the ordering of diagnostics based on the start of their ranges, if they have any.
///
/// Panics if either diagnostic has no primary span, if the span has no range, or if its file is
/// not a `SourceFile`.
pub fn ruff_start_ordering(&self, other: &Self) -> std::cmp::Ordering {
(self.expect_ruff_source_file(), self.expect_range().start()).cmp(&(
other.expect_ruff_source_file(),
other.expect_range().start(),
))
}
}
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
#[derive(Debug, Clone, Eq, PartialEq, Hash, get_size2::GetSize)]
struct DiagnosticInner {
id: DiagnosticId,
severity: Severity,
message: DiagnosticMessage,
annotations: Vec<Annotation>,
subs: Vec<SubDiagnostic>,
fix: Option<Fix>,
parent: Option<TextSize>,
noqa_offset: Option<TextSize>,
secondary_code: Option<SecondaryCode>,
}
struct RenderingSortKey<'a> {
@@ -342,7 +555,7 @@ impl Eq for RenderingSortKey<'_> {}
/// Currently, the order in which sub-diagnostics are rendered relative to one
/// another (for a single parent diagnostic) is the order in which they were
/// attached to the diagnostic.
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
#[derive(Debug, Clone, Eq, PartialEq, Hash, get_size2::GetSize)]
pub struct SubDiagnostic {
/// Like with `Diagnostic`, we box the `SubDiagnostic` to make it
/// pointer-sized.
@@ -367,7 +580,10 @@ impl SubDiagnostic {
/// Callers can pass anything that implements `std::fmt::Display`
/// directly. If callers want or need to avoid cloning the diagnostic
/// message, then they can also pass a `DiagnosticMessage` directly.
pub fn new<'a>(severity: Severity, message: impl IntoDiagnosticMessage + 'a) -> SubDiagnostic {
pub fn new<'a>(
severity: SubDiagnosticSeverity,
message: impl IntoDiagnosticMessage + 'a,
) -> SubDiagnostic {
let inner = Box::new(SubDiagnosticInner {
severity,
message: message.into_diagnostic_message(),
@@ -443,9 +659,9 @@ impl SubDiagnostic {
}
}
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
#[derive(Debug, Clone, Eq, PartialEq, Hash, get_size2::GetSize)]
struct SubDiagnosticInner {
severity: Severity,
severity: SubDiagnosticSeverity,
message: DiagnosticMessage,
annotations: Vec<Annotation>,
}
@@ -471,7 +687,7 @@ struct SubDiagnosticInner {
///
/// Messages attached to annotations should also be as brief and specific as
/// possible. Long messages could negatively impact the quality of rendering.
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
#[derive(Debug, Clone, Eq, PartialEq, Hash, get_size2::GetSize)]
pub struct Annotation {
/// The span of this annotation, corresponding to some subsequence of the
/// user's input that we want to highlight.
@@ -591,7 +807,7 @@ impl Annotation {
///
/// These tags are used to provide additional information about the annotation.
/// and are passed through to the language server protocol.
#[derive(Debug, Clone, Eq, PartialEq, get_size2::GetSize)]
#[derive(Debug, Clone, Eq, PartialEq, Hash, get_size2::GetSize)]
pub enum DiagnosticTag {
/// Unused or unnecessary code. Used for unused parameters, unreachable code, etc.
Unnecessary,
@@ -800,7 +1016,7 @@ impl std::fmt::Display for DiagnosticId {
///
/// This enum presents a unified interface to these two types for the sake of creating [`Span`]s and
/// emitting diagnostics from both ty and ruff.
#[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
#[derive(Debug, Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
pub enum UnifiedFile {
Ty(File),
Ruff(SourceFile),
@@ -814,6 +1030,18 @@ impl UnifiedFile {
}
}
/// Return the file's path relative to the current working directory.
pub fn relative_path<'a>(&'a self, resolver: &'a dyn FileResolver) -> &'a Path {
let cwd = resolver.current_directory();
let path = Path::new(self.path(resolver));
if let Ok(path) = path.strip_prefix(cwd) {
return path;
}
path
}
fn diagnostic_source(&self, resolver: &dyn FileResolver) -> DiagnosticSource {
match self {
UnifiedFile::Ty(file) => DiagnosticSource::Ty(resolver.input(*file)),
@@ -852,7 +1080,7 @@ impl DiagnosticSource {
/// It consists of a `File` and an optional range into that file. When the
/// range isn't present, it semantically implies that the diagnostic refers to
/// the entire file. For example, when the file should be executable but isn't.
#[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
#[derive(Debug, Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
pub struct Span {
file: UnifiedFile,
range: Option<TextRange>,
@@ -897,9 +1125,15 @@ impl Span {
///
/// Panics if the file is a [`UnifiedFile::Ty`] instead of a [`UnifiedFile::Ruff`].
pub fn expect_ruff_file(&self) -> &SourceFile {
self.as_ruff_file()
.expect("Expected a ruff `SourceFile`, found a ty `File`")
}
/// Returns the [`SourceFile`] attached to this [`Span`].
pub fn as_ruff_file(&self) -> Option<&SourceFile> {
match &self.file {
UnifiedFile::Ty(_) => panic!("Expected a ruff `SourceFile`, found a ty `File`"),
UnifiedFile::Ruff(file) => file,
UnifiedFile::Ty(_) => None,
UnifiedFile::Ruff(file) => Some(file),
}
}
}
@@ -924,7 +1158,7 @@ impl From<crate::files::FileRange> for Span {
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd, get_size2::GetSize)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd, Hash, get_size2::GetSize)]
pub enum Severity {
Info,
Warning,
@@ -954,6 +1188,32 @@ impl Severity {
}
}
/// Like [`Severity`] but exclusively for sub-diagnostics.
///
/// This type only exists to add an additional `Help` severity that isn't present in `Severity` or
/// used for main diagnostics. If we want to add `Severity::Help` in the future, this type could be
/// deleted and the two combined again.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd, Hash, get_size2::GetSize)]
pub enum SubDiagnosticSeverity {
Help,
Info,
Warning,
Error,
Fatal,
}
impl SubDiagnosticSeverity {
fn to_annotate(self) -> AnnotateLevel {
match self {
SubDiagnosticSeverity::Help => AnnotateLevel::Help,
SubDiagnosticSeverity::Info => AnnotateLevel::Info,
SubDiagnosticSeverity::Warning => AnnotateLevel::Warning,
SubDiagnosticSeverity::Error => AnnotateLevel::Error,
SubDiagnosticSeverity::Fatal => AnnotateLevel::Error,
}
}
}
/// Configuration for rendering diagnostics.
#[derive(Clone, Debug)]
pub struct DisplayDiagnosticConfig {
@@ -974,6 +1234,21 @@ pub struct DisplayDiagnosticConfig {
/// here for now as the most "sensible" place for it to live until
/// we had more concrete use cases. ---AG
context: usize,
/// Whether to use preview formatting for Ruff diagnostics.
#[allow(
dead_code,
reason = "This is currently only used for JSON but will be needed soon for other formats"
)]
preview: bool,
/// Whether to hide the real `Severity` of diagnostics.
///
/// This is intended for temporary use by Ruff, which only has a single `error` severity at the
/// moment. We should be able to remove this option when Ruff gets more severities.
hide_severity: bool,
/// Whether to show the availability of a fix in a diagnostic.
show_fix_status: bool,
/// The lowest applicability that should be shown when reporting diagnostics.
fix_applicability: Applicability,
}
impl DisplayDiagnosticConfig {
@@ -994,6 +1269,43 @@ impl DisplayDiagnosticConfig {
..self
}
}
/// Whether to enable preview behavior or not.
pub fn preview(self, yes: bool) -> DisplayDiagnosticConfig {
DisplayDiagnosticConfig {
preview: yes,
..self
}
}
/// Whether to hide a diagnostic's severity or not.
pub fn hide_severity(self, yes: bool) -> DisplayDiagnosticConfig {
DisplayDiagnosticConfig {
hide_severity: yes,
..self
}
}
/// Whether to show a fix's availability or not.
pub fn show_fix_status(self, yes: bool) -> DisplayDiagnosticConfig {
DisplayDiagnosticConfig {
show_fix_status: yes,
..self
}
}
/// Set the lowest fix applicability that should be shown.
///
/// In other words, an applicability of `Safe` (the default) would suppress showing fixes or fix
/// availability for unsafe or display-only fixes.
///
/// Note that this option is currently ignored when `hide_severity` is false.
pub fn fix_applicability(self, applicability: Applicability) -> DisplayDiagnosticConfig {
DisplayDiagnosticConfig {
fix_applicability: applicability,
..self
}
}
}
impl Default for DisplayDiagnosticConfig {
@@ -1002,6 +1314,10 @@ impl Default for DisplayDiagnosticConfig {
format: DiagnosticFormat::default(),
color: false,
context: 2,
preview: false,
hide_severity: false,
show_fix_status: false,
fix_applicability: Applicability::Safe,
}
}
}
@@ -1029,6 +1345,31 @@ pub enum DiagnosticFormat {
///
/// This may use color when printing to a `tty`.
Concise,
/// Print diagnostics in the [Azure Pipelines] format.
///
/// [Azure Pipelines]: https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash#logissue-log-an-error-or-warning
Azure,
/// Print diagnostics in JSON format.
///
/// Unlike `json-lines`, this prints all of the diagnostics as a JSON array.
#[cfg(feature = "serde")]
Json,
/// Print diagnostics in JSON format, one per line.
///
/// This will print each diagnostic as a separate JSON object on its own line. See the `json`
/// format for an array of all diagnostics. See <https://jsonlines.org/> for more details.
#[cfg(feature = "serde")]
JsonLines,
/// Print diagnostics in the JSON format expected by [reviewdog].
///
/// [reviewdog]: https://github.com/reviewdog/reviewdog
#[cfg(feature = "serde")]
Rdjson,
/// Print diagnostics in the format emitted by Pylint.
Pylint,
/// Print diagnostics in the format expected by JUnit.
#[cfg(feature = "junit")]
Junit,
}
/// A representation of the kinds of messages inside a diagnostic.
@@ -1087,7 +1428,7 @@ impl std::fmt::Display for ConciseMessage<'_> {
/// In most cases, callers shouldn't need to use this. Instead, there is
/// a blanket trait implementation for `IntoDiagnosticMessage` for
/// anything that implements `std::fmt::Display`.
#[derive(Clone, Debug, Eq, PartialEq, get_size2::GetSize)]
#[derive(Clone, Debug, Eq, PartialEq, Hash, get_size2::GetSize)]
pub struct DiagnosticMessage(Box<str>);
impl DiagnosticMessage {
@@ -1147,41 +1488,52 @@ impl<T: std::fmt::Display> IntoDiagnosticMessage for T {
}
}
/// Creates a `Diagnostic` from a parse error.
/// A secondary identifier for a lint diagnostic.
///
/// This should _probably_ be a method on `ruff_python_parser::ParseError`, but
/// at time of writing, `ruff_db` depends on `ruff_python_parser` instead of
/// the other way around. And since we want to do this conversion in a couple
/// places, it makes sense to centralize it _somewhere_. So it's here for now.
pub fn create_parse_diagnostic(file: File, err: &ruff_python_parser::ParseError) -> Diagnostic {
let mut diag = Diagnostic::new(DiagnosticId::InvalidSyntax, Severity::Error, "");
let span = Span::from(file).with_range(err.location);
diag.annotate(Annotation::primary(span).message(&err.error));
diag
/// For Ruff rules this means the noqa code.
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Default, Hash, get_size2::GetSize)]
#[cfg_attr(feature = "serde", derive(serde::Serialize), serde(transparent))]
pub struct SecondaryCode(String);
impl SecondaryCode {
pub fn new(code: String) -> Self {
Self(code)
}
pub fn as_str(&self) -> &str {
&self.0
}
}
/// Creates a `Diagnostic` from an unsupported syntax error.
///
/// See [`create_parse_diagnostic`] for more details.
pub fn create_unsupported_syntax_diagnostic(
file: File,
err: &ruff_python_parser::UnsupportedSyntaxError,
) -> Diagnostic {
let mut diag = Diagnostic::new(DiagnosticId::InvalidSyntax, Severity::Error, "");
let span = Span::from(file).with_range(err.range);
diag.annotate(Annotation::primary(span).message(err.to_string()));
diag
impl std::fmt::Display for SecondaryCode {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str(&self.0)
}
}
/// Creates a `Diagnostic` from a semantic syntax error.
///
/// See [`create_parse_diagnostic`] for more details.
pub fn create_semantic_syntax_diagnostic(
file: File,
err: &ruff_python_parser::semantic_errors::SemanticSyntaxError,
) -> Diagnostic {
let mut diag = Diagnostic::new(DiagnosticId::InvalidSyntax, Severity::Error, "");
let span = Span::from(file).with_range(err.range);
diag.annotate(Annotation::primary(span).message(err.to_string()));
diag
impl std::ops::Deref for SecondaryCode {
type Target = str;
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl PartialEq<&str> for SecondaryCode {
fn eq(&self, other: &&str) -> bool {
self.0 == *other
}
}
impl PartialEq<SecondaryCode> for &str {
fn eq(&self, other: &SecondaryCode) -> bool {
other.eq(self)
}
}
// for `hashbrown::EntryRef`
impl From<&SecondaryCode> for SecondaryCode {
fn from(value: &SecondaryCode) -> Self {
value.clone()
}
}
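The new setters above (`set_fix`, `set_parent`, `set_noqa_offset`, `set_secondary_code`) all mutate the shared inner value through `Arc::make_mut`, which clones `DiagnosticInner` only when the `Arc` is still shared. A minimal standalone sketch of that copy-on-write pattern, with a hypothetical simplified `Inner` standing in for `DiagnosticInner`:

```rust
// Minimal sketch of the Arc::make_mut copy-on-write pattern used by the new
// Diagnostic setters above. `Inner` is a hypothetical, simplified stand-in
// for `DiagnosticInner`.
use std::sync::Arc;

#[derive(Clone, Debug, PartialEq)]
struct Inner {
    message: String,
    secondary_code: Option<String>,
}

#[derive(Clone, Debug)]
struct Diag {
    inner: Arc<Inner>,
}

impl Diag {
    fn new(message: &str) -> Self {
        Self {
            inner: Arc::new(Inner {
                message: message.to_string(),
                secondary_code: None,
            }),
        }
    }

    // Clones the inner value only if the Arc is shared, then mutates in place.
    fn set_secondary_code(&mut self, code: &str) {
        Arc::make_mut(&mut self.inner).secondary_code = Some(code.to_string());
    }
}

fn main() {
    let mut a = Diag::new("`os` imported but unused");
    let b = a.clone(); // `a` and `b` share the same inner allocation.

    a.set_secondary_code("F401"); // Triggers a clone; `b` is untouched.

    assert_eq!(a.inner.secondary_code.as_deref(), Some("F401"));
    assert_eq!(b.inner.secondary_code, None);
}
```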

File diff suppressed because it is too large


@@ -0,0 +1,83 @@
use ruff_source_file::LineColumn;
use crate::diagnostic::{Diagnostic, Severity};
use super::FileResolver;
pub(super) struct AzureRenderer<'a> {
resolver: &'a dyn FileResolver,
}
impl<'a> AzureRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver) -> Self {
Self { resolver }
}
}
impl AzureRenderer<'_> {
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
for diag in diagnostics {
let severity = match diag.severity() {
Severity::Info | Severity::Warning => "warning",
Severity::Error | Severity::Fatal => "error",
};
write!(f, "##vso[task.logissue type={severity};")?;
if let Some(span) = diag.primary_span() {
let filename = span.file().path(self.resolver);
write!(f, "sourcepath={filename};")?;
if let Some(range) = span.range() {
let location = if self.resolver.notebook_index(span.file()).is_some() {
// We can't give a reasonable location for the structured formats,
// so we show one that's clearly a fallback
LineColumn::default()
} else {
span.file()
.diagnostic_source(self.resolver)
.as_source_code()
.line_column(range.start())
};
write!(
f,
"linenumber={line};columnnumber={col};",
line = location.line,
col = location.column,
)?;
}
}
writeln!(
f,
"{code}]{body}",
code = diag
.secondary_code()
.map_or_else(String::new, |code| format!("code={code};")),
body = diag.body(),
)?;
}
Ok(())
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{create_diagnostics, create_syntax_error_diagnostics},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Azure);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Azure);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
}
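For reference, the renderer above emits one Azure Pipelines logging command per diagnostic. A minimal sketch of the line format it produces, using plain placeholder values in place of the resolver and `Diagnostic` types:

```rust
// Minimal sketch of the Azure Pipelines "logissue" line produced by the
// renderer above, with plain placeholder values standing in for the
// resolver and Diagnostic types.
fn main() {
    let (severity, path, line, col, code, body) =
        ("error", "input.py", 1, 8, "F401", "`os` imported but unused");

    println!(
        "##vso[task.logissue type={severity};sourcepath={path};\
         linenumber={line};columnnumber={col};code={code};]{body}"
    );
    // ##vso[task.logissue type=error;sourcepath=input.py;linenumber=1;columnnumber=8;code=F401;]`os` imported but unused
}
```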


@@ -0,0 +1,195 @@
use crate::diagnostic::{
Diagnostic, DisplayDiagnosticConfig, Severity,
stylesheet::{DiagnosticStylesheet, fmt_styled},
};
use super::FileResolver;
pub(super) struct ConciseRenderer<'a> {
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
}
impl<'a> ConciseRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver, config: &'a DisplayDiagnosticConfig) -> Self {
Self { resolver, config }
}
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
let stylesheet = if self.config.color {
DiagnosticStylesheet::styled()
} else {
DiagnosticStylesheet::plain()
};
let sep = fmt_styled(":", stylesheet.separator);
for diag in diagnostics {
if let Some(span) = diag.primary_span() {
write!(
f,
"{path}",
path = fmt_styled(
span.file().relative_path(self.resolver).to_string_lossy(),
stylesheet.emphasis
)
)?;
if let Some(range) = span.range() {
let diagnostic_source = span.file().diagnostic_source(self.resolver);
let start = diagnostic_source
.as_source_code()
.line_column(range.start());
if let Some(notebook_index) = self.resolver.notebook_index(span.file()) {
write!(
f,
"{sep}cell {cell}{sep}{line}{sep}{col}",
cell = notebook_index.cell(start.line).unwrap_or_default(),
line = notebook_index.cell_row(start.line).unwrap_or_default(),
col = start.column,
)?;
} else {
write!(
f,
"{sep}{line}{sep}{col}",
line = start.line,
col = start.column,
)?;
}
}
write!(f, "{sep} ")?;
}
if self.config.hide_severity {
if let Some(code) = diag.secondary_code() {
write!(
f,
"{code} ",
code = fmt_styled(code, stylesheet.secondary_code)
)?;
}
if self.config.show_fix_status {
if let Some(fix) = diag.fix() {
// Do not display an indicator for inapplicable fixes
if fix.applies(self.config.fix_applicability) {
write!(f, "[{fix}] ", fix = fmt_styled("*", stylesheet.separator))?;
}
}
}
} else {
let (severity, severity_style) = match diag.severity() {
Severity::Info => ("info", stylesheet.info),
Severity::Warning => ("warning", stylesheet.warning),
Severity::Error => ("error", stylesheet.error),
Severity::Fatal => ("fatal", stylesheet.error),
};
write!(
f,
"{severity}[{id}] ",
severity = fmt_styled(severity, severity_style),
id = fmt_styled(diag.id(), stylesheet.emphasis)
)?;
}
writeln!(f, "{message}", message = diag.concise_message())?;
}
Ok(())
}
}
#[cfg(test)]
mod tests {
use ruff_diagnostics::Applicability;
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{
TestEnvironment, create_diagnostics, create_notebook_diagnostics,
create_syntax_error_diagnostics,
},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Concise);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r"
fib.py:1:8: error[unused-import] `os` imported but unused
fib.py:6:5: error[unused-variable] Local variable `x` is assigned to but never used
undef.py:1:4: error[undefined-name] Undefined name `a`
");
}
#[test]
fn show_fixes() {
let (mut env, diagnostics) = create_diagnostics(DiagnosticFormat::Concise);
env.hide_severity(true);
env.show_fix_status(true);
env.fix_applicability(Applicability::DisplayOnly);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r"
fib.py:1:8: F401 [*] `os` imported but unused
fib.py:6:5: F841 [*] Local variable `x` is assigned to but never used
undef.py:1:4: F821 Undefined name `a`
");
}
#[test]
fn show_fixes_preview() {
let (mut env, diagnostics) = create_diagnostics(DiagnosticFormat::Concise);
env.hide_severity(true);
env.show_fix_status(true);
env.fix_applicability(Applicability::DisplayOnly);
env.preview(true);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r"
fib.py:1:8: F401 [*] `os` imported but unused
fib.py:6:5: F841 [*] Local variable `x` is assigned to but never used
undef.py:1:4: F821 Undefined name `a`
");
}
#[test]
fn show_fixes_syntax_errors() {
let (mut env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Concise);
env.hide_severity(true);
env.show_fix_status(true);
env.fix_applicability(Applicability::DisplayOnly);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r"
syntax_errors.py:1:15: SyntaxError: Expected one or more symbol names after import
syntax_errors.py:3:12: SyntaxError: Expected ')', found newline
");
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Concise);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r"
syntax_errors.py:1:15: error[invalid-syntax] SyntaxError: Expected one or more symbol names after import
syntax_errors.py:3:12: error[invalid-syntax] SyntaxError: Expected ')', found newline
");
}
#[test]
fn notebook_output() {
let (env, diagnostics) = create_notebook_diagnostics(DiagnosticFormat::Concise);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r"
notebook.ipynb:cell 1:2:8: error[unused-import] `os` imported but unused
notebook.ipynb:cell 2:2:8: error[unused-import] `math` imported but unused
notebook.ipynb:cell 3:4:5: error[unused-variable] Local variable `x` is assigned to but never used
");
}
#[test]
fn missing_file() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Concise);
let diag = env.err().build();
insta::assert_snapshot!(
env.render(&diag),
@"error[test-diagnostic] main diagnostic message",
);
}
}

View File

@@ -0,0 +1,201 @@
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat, Severity,
render::tests::{TestEnvironment, create_diagnostics, create_syntax_error_diagnostics},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Full);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r#"
error[unused-import]: `os` imported but unused
--> fib.py:1:8
|
1 | import os
| ^^
|
help: Remove unused import: `os`
error[unused-variable]: Local variable `x` is assigned to but never used
--> fib.py:6:5
|
4 | def fibonacci(n):
5 | """Compute the nth number in the Fibonacci sequence."""
6 | x = 1
| ^
7 | if n == 0:
8 | return 0
|
help: Remove assignment to unused variable `x`
error[undefined-name]: Undefined name `a`
--> undef.py:1:4
|
1 | if a == 1: pass
| ^
|
"#);
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Full);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics), @r"
error[invalid-syntax]: SyntaxError: Expected one or more symbol names after import
--> syntax_errors.py:1:15
|
1 | from os import
| ^
2 |
3 | if call(foo
|
error[invalid-syntax]: SyntaxError: Expected ')', found newline
--> syntax_errors.py:3:12
|
1 | from os import
2 |
3 | if call(foo
| ^
4 | def bar():
5 | pass
|
");
}
/// Check that the new `full` rendering code in `ruff_db` handles cases fixed by commit c9b99e4.
///
/// For example, without the fix, we get diagnostics like this:
///
/// ```
/// error[no-indented-block]: Expected an indented block
/// --> example.py:3:1
/// |
/// 2 | if False:
/// | ^
/// 3 | print()
/// |
/// ```
///
/// where the caret points to the end of the previous line instead of the start of the next.
#[test]
fn empty_span_after_line_terminator() {
let mut env = TestEnvironment::new();
env.add(
"example.py",
r#"
if False:
print()
"#,
);
env.format(DiagnosticFormat::Full);
let diagnostic = env
.builder(
"no-indented-block",
Severity::Error,
"Expected an indented block",
)
.primary("example.py", "3:0", "3:0", "")
.build();
insta::assert_snapshot!(env.render(&diagnostic), @r"
error[no-indented-block]: Expected an indented block
--> example.py:3:1
|
2 | if False:
3 | print()
| ^
|
");
}
/// Check that the new `full` rendering code in `ruff_db` handles cases fixed by commit 2922490.
///
/// For example, without the fix, we get diagnostics like this:
///
/// ```
/// error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
/// --> example.py:1:25
/// |
/// 1 | nested_fstrings = f'␈{f'{f'␛'}'}'
/// | ^
/// |
/// ```
///
/// where the caret points to the `f` in the f-string instead of the start of the invalid
/// character (`^Z`).
#[test]
fn unprintable_characters() {
let mut env = TestEnvironment::new();
env.add("example.py", "nested_fstrings = f'{f'{f''}'}'");
env.format(DiagnosticFormat::Full);
let diagnostic = env
.builder(
"invalid-character-sub",
Severity::Error,
r#"Invalid unescaped character SUB, use "\x1A" instead"#,
)
.primary("example.py", "1:24", "1:24", "")
.build();
insta::assert_snapshot!(env.render(&diagnostic), @r#"
error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
--> example.py:1:25
|
1 | nested_fstrings = f'␈{f'{f'␛'}'}'
| ^
|
"#);
}
#[test]
fn multiple_unprintable_characters() -> std::io::Result<()> {
let mut env = TestEnvironment::new();
env.add("example.py", "");
env.format(DiagnosticFormat::Full);
let diagnostic = env
.builder(
"invalid-character-sub",
Severity::Error,
r#"Invalid unescaped character SUB, use "\x1A" instead"#,
)
.primary("example.py", "1:1", "1:1", "")
.build();
insta::assert_snapshot!(env.render(&diagnostic), @r#"
error[invalid-character-sub]: Invalid unescaped character SUB, use "\x1A" instead
--> example.py:1:2
|
1 | ␈
| ^
|
"#);
Ok(())
}
/// Ensure that the header column matches the column in the user's input, even if we've replaced
/// tabs with spaces for rendering purposes.
#[test]
fn tab_replacement() {
let mut env = TestEnvironment::new();
env.add("example.py", "def foo():\n\treturn 1");
env.format(DiagnosticFormat::Full);
let diagnostic = env.err().primary("example.py", "2:1", "2:9", "").build();
insta::assert_snapshot!(env.render(&diagnostic), @r"
error[test-diagnostic]: main diagnostic message
--> example.py:2:2
|
1 | def foo():
2 | return 1
| ^^^^^^^^
|
");
}
}

View File

@@ -0,0 +1,352 @@
use serde::{Serialize, Serializer, ser::SerializeSeq};
use serde_json::{Value, json};
use ruff_diagnostics::{Applicability, Edit};
use ruff_notebook::NotebookIndex;
use ruff_source_file::{LineColumn, OneIndexed};
use ruff_text_size::Ranged;
use crate::diagnostic::{Diagnostic, DiagnosticSource, DisplayDiagnosticConfig, SecondaryCode};
use super::FileResolver;
pub(super) struct JsonRenderer<'a> {
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
}
impl<'a> JsonRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver, config: &'a DisplayDiagnosticConfig) -> Self {
Self { resolver, config }
}
}
impl JsonRenderer<'_> {
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
write!(
f,
"{:#}",
diagnostics_to_json_value(diagnostics, self.resolver, self.config)
)
}
}
fn diagnostics_to_json_value<'a>(
diagnostics: impl IntoIterator<Item = &'a Diagnostic>,
resolver: &dyn FileResolver,
config: &DisplayDiagnosticConfig,
) -> Value {
let values: Vec<_> = diagnostics
.into_iter()
.map(|diag| diagnostic_to_json(diag, resolver, config))
.collect();
json!(values)
}
pub(super) fn diagnostic_to_json<'a>(
diagnostic: &'a Diagnostic,
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
) -> JsonDiagnostic<'a> {
let span = diagnostic.primary_span_ref();
let filename = span.map(|span| span.file().path(resolver));
let range = span.and_then(|span| span.range());
let diagnostic_source = span.map(|span| span.file().diagnostic_source(resolver));
let source_code = diagnostic_source
.as_ref()
.map(|diagnostic_source| diagnostic_source.as_source_code());
let notebook_index = span.and_then(|span| resolver.notebook_index(span.file()));
let mut start_location = None;
let mut end_location = None;
let mut noqa_location = None;
let mut notebook_cell_index = None;
if let Some(source_code) = source_code {
noqa_location = diagnostic
.noqa_offset()
.map(|offset| source_code.line_column(offset));
if let Some(range) = range {
let mut start = source_code.line_column(range.start());
let mut end = source_code.line_column(range.end());
if let Some(notebook_index) = &notebook_index {
notebook_cell_index =
Some(notebook_index.cell(start.line).unwrap_or(OneIndexed::MIN));
start = notebook_index.translate_line_column(&start);
end = notebook_index.translate_line_column(&end);
noqa_location =
noqa_location.map(|location| notebook_index.translate_line_column(&location));
}
start_location = Some(start);
end_location = Some(end);
}
}
let fix = diagnostic.fix().map(|fix| JsonFix {
applicability: fix.applicability(),
message: diagnostic.first_help_text(),
edits: ExpandedEdits {
edits: fix.edits(),
notebook_index,
config,
diagnostic_source,
},
});
// In preview, the locations and filename can be optional.
if config.preview {
JsonDiagnostic {
code: diagnostic.secondary_code(),
url: diagnostic.to_ruff_url(),
message: diagnostic.body(),
fix,
cell: notebook_cell_index,
location: start_location.map(JsonLocation::from),
end_location: end_location.map(JsonLocation::from),
filename,
noqa_row: noqa_location.map(|location| location.line),
}
} else {
JsonDiagnostic {
code: diagnostic.secondary_code(),
url: diagnostic.to_ruff_url(),
message: diagnostic.body(),
fix,
cell: notebook_cell_index,
location: Some(start_location.unwrap_or_default().into()),
end_location: Some(end_location.unwrap_or_default().into()),
filename: Some(filename.unwrap_or_default()),
noqa_row: noqa_location.map(|location| location.line),
}
}
}
struct ExpandedEdits<'a> {
edits: &'a [Edit],
notebook_index: Option<NotebookIndex>,
config: &'a DisplayDiagnosticConfig,
diagnostic_source: Option<DiagnosticSource>,
}
impl Serialize for ExpandedEdits<'_> {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let mut s = serializer.serialize_seq(Some(self.edits.len()))?;
for edit in self.edits {
let (location, end_location) = if let Some(diagnostic_source) = &self.diagnostic_source
{
let source_code = diagnostic_source.as_source_code();
let mut location = source_code.line_column(edit.start());
let mut end_location = source_code.line_column(edit.end());
if let Some(notebook_index) = &self.notebook_index {
// There exists a newline between each cell's source code in the
// concatenated source code in Ruff. This newline doesn't actually
// exist in the JSON source field.
//
// Now, certain edits may try to remove this newline, which means
// the edit will spill over to the first character of the next cell.
// If it does, we need to translate the end location to the last
// character of the previous cell.
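//
// For example (hypothetical numbers): if cell 1 ends on concatenated
// line 3 and an edit's end lands on line 4, column 1 (the start of the
// next cell), the end location is clamped back to the last column of
// line 3 in the previous cell.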
match (
notebook_index.cell(location.line),
notebook_index.cell(end_location.line),
) {
(Some(start_cell), Some(end_cell)) if start_cell != end_cell => {
debug_assert_eq!(end_location.column.get(), 1);
let prev_row = end_location.line.saturating_sub(1);
end_location = LineColumn {
line: notebook_index.cell_row(prev_row).unwrap_or(OneIndexed::MIN),
column: source_code
.line_column(source_code.line_end_exclusive(prev_row))
.column,
};
}
(Some(_), None) => {
debug_assert_eq!(end_location.column.get(), 1);
let prev_row = end_location.line.saturating_sub(1);
end_location = LineColumn {
line: notebook_index.cell_row(prev_row).unwrap_or(OneIndexed::MIN),
column: source_code
.line_column(source_code.line_end_exclusive(prev_row))
.column,
};
}
_ => {
end_location = notebook_index.translate_line_column(&end_location);
}
}
location = notebook_index.translate_line_column(&location);
}
(Some(location), Some(end_location))
} else {
(None, None)
};
// In preview, the locations can be optional.
let value = if self.config.preview {
JsonEdit {
content: edit.content().unwrap_or_default(),
location: location.map(JsonLocation::from),
end_location: end_location.map(JsonLocation::from),
}
} else {
JsonEdit {
content: edit.content().unwrap_or_default(),
location: Some(location.unwrap_or_default().into()),
end_location: Some(end_location.unwrap_or_default().into()),
}
};
s.serialize_element(&value)?;
}
s.end()
}
}
/// A serializable version of `Diagnostic`.
///
/// The `Old` variant only exists to preserve backwards compatibility. Both this and `JsonEdit`
/// should become structs with the `New` definitions in a future Ruff release.
#[derive(Serialize)]
pub(crate) struct JsonDiagnostic<'a> {
cell: Option<OneIndexed>,
code: Option<&'a SecondaryCode>,
end_location: Option<JsonLocation>,
filename: Option<&'a str>,
fix: Option<JsonFix<'a>>,
location: Option<JsonLocation>,
message: &'a str,
noqa_row: Option<OneIndexed>,
url: Option<String>,
}
#[derive(Serialize)]
struct JsonFix<'a> {
applicability: Applicability,
edits: ExpandedEdits<'a>,
message: Option<&'a str>,
}
#[derive(Serialize)]
struct JsonLocation {
column: OneIndexed,
row: OneIndexed,
}
impl From<LineColumn> for JsonLocation {
fn from(location: LineColumn) -> Self {
JsonLocation {
row: location.line,
column: location.column,
}
}
}
#[derive(Serialize)]
struct JsonEdit<'a> {
content: &'a str,
end_location: Option<JsonLocation>,
location: Option<JsonLocation>,
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{
TestEnvironment, create_diagnostics, create_notebook_diagnostics,
create_syntax_error_diagnostics,
},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Json);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Json);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn notebook_output() {
let (env, diagnostics) = create_notebook_diagnostics(DiagnosticFormat::Json);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn missing_file_stable() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Json);
env.preview(false);
let diag = env.err().build();
insta::assert_snapshot!(
env.render(&diag),
@r#"
[
{
"cell": null,
"code": null,
"end_location": {
"column": 1,
"row": 1
},
"filename": "",
"fix": null,
"location": {
"column": 1,
"row": 1
},
"message": "main diagnostic message",
"noqa_row": null,
"url": "https://docs.astral.sh/ruff/rules/test-diagnostic"
}
]
"#,
);
}
#[test]
fn missing_file_preview() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Json);
env.preview(true);
let diag = env.err().build();
insta::assert_snapshot!(
env.render(&diag),
@r#"
[
{
"cell": null,
"code": null,
"end_location": null,
"filename": null,
"fix": null,
"location": null,
"message": "main diagnostic message",
"noqa_row": null,
"url": "https://docs.astral.sh/ruff/rules/test-diagnostic"
}
]
"#,
);
}
}

View File

@@ -0,0 +1,59 @@
use crate::diagnostic::{Diagnostic, DisplayDiagnosticConfig, render::json::diagnostic_to_json};
use super::FileResolver;
pub(super) struct JsonLinesRenderer<'a> {
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
}
impl<'a> JsonLinesRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver, config: &'a DisplayDiagnosticConfig) -> Self {
Self { resolver, config }
}
}
impl JsonLinesRenderer<'_> {
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
for diag in diagnostics {
writeln!(
f,
"{}",
serde_json::json!(diagnostic_to_json(diag, self.resolver, self.config))
)?;
}
Ok(())
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{
create_diagnostics, create_notebook_diagnostics, create_syntax_error_diagnostics,
},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::JsonLines);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::JsonLines);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn notebook_output() {
let (env, diagnostics) = create_notebook_diagnostics(DiagnosticFormat::JsonLines);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
}

View File

@@ -0,0 +1,195 @@
use std::{collections::BTreeMap, ops::Deref, path::Path};
use quick_junit::{NonSuccessKind, Report, TestCase, TestCaseStatus, TestSuite, XmlString};
use ruff_source_file::LineColumn;
use crate::diagnostic::{Diagnostic, SecondaryCode, render::FileResolver};
/// A renderer for diagnostics in the [JUnit] format.
///
/// See [`junit.xsd`] for the specification in the JUnit repository and an annotated [version]
/// linked from the [`quick_junit`] docs.
///
/// [JUnit]: https://junit.org/
/// [`junit.xsd`]: https://github.com/junit-team/junit-framework/blob/2870b7d8fd5bf7c1efe489d3991d3ed3900e82bb/platform-tests/src/test/resources/jenkins-junit.xsd
/// [version]: https://llg.cubic.org/docs/junit/
/// [`quick_junit`]: https://docs.rs/quick-junit/latest/quick_junit/
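///
/// Each source file becomes a `<testsuite>` and each diagnostic a failing
/// `<testcase>`; for example (abridged from the syntax-error snapshot below):
///
/// ```xml
/// <testcase name="org.ruff.invalid-syntax" classname="syntax_errors" line="1" column="15">
///     <failure message="SyntaxError: ...">line 1, col 15, SyntaxError: ...</failure>
/// </testcase>
/// ```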
pub struct JunitRenderer<'a> {
resolver: &'a dyn FileResolver,
}
impl<'a> JunitRenderer<'a> {
pub fn new(resolver: &'a dyn FileResolver) -> Self {
Self { resolver }
}
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
let mut report = Report::new("ruff");
if diagnostics.is_empty() {
let mut test_suite = TestSuite::new("ruff");
test_suite
.extra
.insert(XmlString::new("package"), XmlString::new("org.ruff"));
let mut case = TestCase::new("No errors found", TestCaseStatus::success());
case.set_classname("ruff");
test_suite.add_test_case(case);
report.add_test_suite(test_suite);
} else {
for (filename, diagnostics) in group_diagnostics_by_filename(diagnostics, self.resolver)
{
let mut test_suite = TestSuite::new(filename);
test_suite
.extra
.insert(XmlString::new("package"), XmlString::new("org.ruff"));
let classname = Path::new(filename).with_extension("");
for diagnostic in diagnostics {
let DiagnosticWithLocation {
diagnostic,
start_location: location,
} = diagnostic;
let mut status = TestCaseStatus::non_success(NonSuccessKind::Failure);
status.set_message(diagnostic.body());
if let Some(location) = location {
status.set_description(format!(
"line {row}, col {col}, {body}",
row = location.line,
col = location.column,
body = diagnostic.body()
));
} else {
status.set_description(diagnostic.body());
}
let code = diagnostic
.secondary_code()
.map_or_else(|| diagnostic.name(), SecondaryCode::as_str);
let mut case = TestCase::new(format!("org.ruff.{code}"), status);
case.set_classname(classname.to_str().unwrap());
if let Some(location) = location {
case.extra.insert(
XmlString::new("line"),
XmlString::new(location.line.to_string()),
);
case.extra.insert(
XmlString::new("column"),
XmlString::new(location.column.to_string()),
);
}
test_suite.add_test_case(case);
}
report.add_test_suite(test_suite);
}
}
let adapter = FmtAdapter { fmt: f };
report.serialize(adapter).map_err(|_| std::fmt::Error)
}
}
// TODO(brent) this and `group_diagnostics_by_filename` are also used by the `grouped` output
// format. I think they'd make more sense in that file, but I started here first. I'll move them to
// that module when adding the `grouped` output format.
struct DiagnosticWithLocation<'a> {
diagnostic: &'a Diagnostic,
start_location: Option<LineColumn>,
}
impl Deref for DiagnosticWithLocation<'_> {
type Target = Diagnostic;
fn deref(&self) -> &Self::Target {
self.diagnostic
}
}
fn group_diagnostics_by_filename<'a>(
diagnostics: &'a [Diagnostic],
resolver: &'a dyn FileResolver,
) -> BTreeMap<&'a str, Vec<DiagnosticWithLocation<'a>>> {
let mut grouped_diagnostics = BTreeMap::default();
for diagnostic in diagnostics {
let (filename, start_location) = diagnostic
.primary_span_ref()
.map(|span| {
let file = span.file();
let start_location =
span.range()
.filter(|_| !resolver.is_notebook(file))
.map(|range| {
file.diagnostic_source(resolver)
.as_source_code()
.line_column(range.start())
});
(span.file().path(resolver), start_location)
})
.unwrap_or_default();
grouped_diagnostics
.entry(filename)
.or_insert_with(Vec::new)
.push(DiagnosticWithLocation {
diagnostic,
start_location,
});
}
grouped_diagnostics
}
struct FmtAdapter<'a> {
fmt: &'a mut dyn std::fmt::Write,
}
impl std::io::Write for FmtAdapter<'_> {
fn write(&mut self, buf: &[u8]) -> std::io::Result<usize> {
self.fmt
.write_str(std::str::from_utf8(buf).map_err(|_| {
std::io::Error::new(
std::io::ErrorKind::InvalidData,
"Invalid UTF-8 in JUnit report",
)
})?)
.map_err(std::io::Error::other)?;
Ok(buf.len())
}
fn flush(&mut self) -> std::io::Result<()> {
Ok(())
}
fn write_fmt(&mut self, args: std::fmt::Arguments<'_>) -> std::io::Result<()> {
self.fmt.write_fmt(args).map_err(std::io::Error::other)
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{create_diagnostics, create_syntax_error_diagnostics},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Junit);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Junit);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
}

View File

@@ -0,0 +1,97 @@
use crate::diagnostic::{Diagnostic, SecondaryCode, render::FileResolver};
/// Generate violations in Pylint format.
///
/// The format is given by this string:
///
/// ```python
/// "%(path)s:%(row)d: [%(code)s] %(text)s"
/// ```
///
/// See: [Flake8 documentation](https://flake8.pycqa.org/en/latest/internal/formatters.html#pylint-formatter)
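///
/// For example, the diagnostics in the tests below render as:
///
/// ```text
/// fib.py:1: [F401] `os` imported but unused
/// ```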
pub(super) struct PylintRenderer<'a> {
resolver: &'a dyn FileResolver,
}
impl<'a> PylintRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver) -> Self {
Self { resolver }
}
}
impl PylintRenderer<'_> {
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
for diagnostic in diagnostics {
let (filename, row) = diagnostic
.primary_span_ref()
.map(|span| {
let file = span.file();
let row = span
.range()
.filter(|_| !self.resolver.is_notebook(file))
.map(|range| {
file.diagnostic_source(self.resolver)
.as_source_code()
.line_column(range.start())
.line
});
(file.relative_path(self.resolver).to_string_lossy(), row)
})
.unwrap_or_default();
let code = diagnostic
.secondary_code()
.map_or_else(|| diagnostic.name(), SecondaryCode::as_str);
let row = row.unwrap_or_default();
writeln!(
f,
"{path}:{row}: [{code}] {body}",
path = filename,
body = diagnostic.body()
)?;
}
Ok(())
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{TestEnvironment, create_diagnostics, create_syntax_error_diagnostics},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Pylint);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Pylint);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn missing_file() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Pylint);
let diag = env.err().build();
insta::assert_snapshot!(
env.render(&diag),
@":1: [test-diagnostic] main diagnostic message",
);
}
}

View File

@@ -0,0 +1,235 @@
use serde::ser::SerializeSeq;
use serde::{Serialize, Serializer};
use ruff_diagnostics::{Edit, Fix};
use ruff_source_file::{LineColumn, SourceCode};
use ruff_text_size::Ranged;
use crate::diagnostic::Diagnostic;
use super::FileResolver;
pub struct RdjsonRenderer<'a> {
resolver: &'a dyn FileResolver,
}
impl<'a> RdjsonRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver) -> Self {
Self { resolver }
}
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
write!(
f,
"{:#}",
serde_json::json!(RdjsonDiagnostics::new(diagnostics, self.resolver))
)
}
}
struct ExpandedDiagnostics<'a> {
resolver: &'a dyn FileResolver,
diagnostics: &'a [Diagnostic],
}
impl Serialize for ExpandedDiagnostics<'_> {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let mut s = serializer.serialize_seq(Some(self.diagnostics.len()))?;
for diagnostic in self.diagnostics {
let value = diagnostic_to_rdjson(diagnostic, self.resolver);
s.serialize_element(&value)?;
}
s.end()
}
}
fn diagnostic_to_rdjson<'a>(
diagnostic: &'a Diagnostic,
resolver: &'a dyn FileResolver,
) -> RdjsonDiagnostic<'a> {
let span = diagnostic.primary_span_ref();
let source_file = span.map(|span| {
let file = span.file();
(file.path(resolver), file.diagnostic_source(resolver))
});
let location = source_file.as_ref().map(|(path, source)| {
let range = diagnostic.range().map(|range| {
let source_code = source.as_source_code();
let start = source_code.line_column(range.start());
let end = source_code.line_column(range.end());
RdjsonRange::new(start, end)
});
RdjsonLocation { path, range }
});
let edits = diagnostic.fix().map(Fix::edits).unwrap_or_default();
RdjsonDiagnostic {
message: diagnostic.body(),
location,
code: RdjsonCode {
value: diagnostic
.secondary_code()
.map_or_else(|| diagnostic.name(), |code| code.as_str()),
url: diagnostic.to_ruff_url(),
},
suggestions: rdjson_suggestions(
edits,
source_file
.as_ref()
.map(|(_, source)| source.as_source_code()),
),
}
}
fn rdjson_suggestions<'a>(
edits: &'a [Edit],
source_code: Option<SourceCode>,
) -> Vec<RdjsonSuggestion<'a>> {
if edits.is_empty() {
return Vec::new();
}
let Some(source_code) = source_code else {
debug_assert!(false, "Expected a source file for a diagnostic with a fix");
return Vec::new();
};
edits
.iter()
.map(|edit| {
let start = source_code.line_column(edit.start());
let end = source_code.line_column(edit.end());
let range = RdjsonRange::new(start, end);
RdjsonSuggestion {
range,
text: edit.content().unwrap_or_default(),
}
})
.collect()
}
#[derive(Serialize)]
struct RdjsonDiagnostics<'a> {
diagnostics: ExpandedDiagnostics<'a>,
severity: &'static str,
source: RdjsonSource,
}
impl<'a> RdjsonDiagnostics<'a> {
fn new(diagnostics: &'a [Diagnostic], resolver: &'a dyn FileResolver) -> Self {
Self {
source: RdjsonSource {
name: "ruff",
url: env!("CARGO_PKG_HOMEPAGE"),
},
severity: "WARNING",
diagnostics: ExpandedDiagnostics {
diagnostics,
resolver,
},
}
}
}
#[derive(Serialize)]
struct RdjsonSource {
name: &'static str,
url: &'static str,
}
#[derive(Serialize)]
struct RdjsonDiagnostic<'a> {
code: RdjsonCode<'a>,
#[serde(skip_serializing_if = "Option::is_none")]
location: Option<RdjsonLocation<'a>>,
message: &'a str,
#[serde(skip_serializing_if = "Vec::is_empty")]
suggestions: Vec<RdjsonSuggestion<'a>>,
}
#[derive(Serialize)]
struct RdjsonLocation<'a> {
path: &'a str,
#[serde(skip_serializing_if = "Option::is_none")]
range: Option<RdjsonRange>,
}
#[derive(Default, Serialize)]
struct RdjsonRange {
end: LineColumn,
start: LineColumn,
}
impl RdjsonRange {
fn new(start: LineColumn, end: LineColumn) -> Self {
Self { start, end }
}
}
#[derive(Serialize)]
struct RdjsonCode<'a> {
#[serde(skip_serializing_if = "Option::is_none")]
url: Option<String>,
value: &'a str,
}
#[derive(Serialize)]
struct RdjsonSuggestion<'a> {
range: RdjsonRange,
text: &'a str,
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{TestEnvironment, create_diagnostics, create_syntax_error_diagnostics},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Rdjson);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Rdjson);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn missing_file_stable() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Rdjson);
env.preview(false);
let diag = env.err().build();
insta::assert_snapshot!(env.render(&diag));
}
#[test]
fn missing_file_preview() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Rdjson);
env.preview(true);
let diag = env.err().build();
insta::assert_snapshot!(env.render(&diag));
}
}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/azure.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/azure.rs
expression: env.render_diagnostics(&diagnostics)
---
##vso[task.logissue type=error;sourcepath=fib.py;linenumber=1;columnnumber=8;code=F401;]`os` imported but unused
##vso[task.logissue type=error;sourcepath=fib.py;linenumber=6;columnnumber=5;code=F841;]Local variable `x` is assigned to but never used

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/azure.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/azure.rs
expression: env.render_diagnostics(&diagnostics)
---
##vso[task.logissue type=error;sourcepath=syntax_errors.py;linenumber=1;columnnumber=15;]SyntaxError: Expected one or more symbol names after import
##vso[task.logissue type=error;sourcepath=syntax_errors.py;linenumber=3;columnnumber=12;]SyntaxError: Expected ')', found newline

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json.rs
expression: env.render_diagnostics(&diagnostics)
---
[
{
@@ -84,8 +83,8 @@ snapshot_kind: text
{
"content": "",
"end_location": {
"column": 10,
"row": 4
"column": 1,
"row": 5
},
"location": {
"column": 1,

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json.rs
expression: env.render_diagnostics(&diagnostics)
---
[
{

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json.rs
expression: env.render_diagnostics(&diagnostics)
---
[
{

View File

@@ -1,8 +1,7 @@
---
source: crates/ruff_linter/src/message/json_lines.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":1,"code":"F401","end_location":{"column":10,"row":2},"filename":"notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":10,"row":2},"location":{"column":1,"row":2}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":2},"message":"`os` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":2,"code":"F401","end_location":{"column":12,"row":2},"filename":"notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":1,"row":3},"location":{"column":1,"row":2}}],"message":"Remove unused import: `math`"},"location":{"column":8,"row":2},"message":"`math` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":3,"code":"F841","end_location":{"column":6,"row":4},"filename":"notebook.ipynb","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":10,"row":4},"location":{"column":1,"row":4}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":4},"message":"Local variable `x` is assigned to but never used","noqa_row":4,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}
{"cell":3,"code":"F841","end_location":{"column":6,"row":4},"filename":"notebook.ipynb","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":5},"location":{"column":1,"row":4}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":4},"message":"Local variable `x` is assigned to but never used","noqa_row":4,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json_lines.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":null,"code":"F401","end_location":{"column":10,"row":1},"filename":"fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":2},"location":{"column":1,"row":1}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":1},"message":"`os` imported but unused","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":null,"code":"F841","end_location":{"column":6,"row":6},"filename":"fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":10,"row":6},"location":{"column":5,"row":6}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":6},"message":"Local variable `x` is assigned to but never used","noqa_row":6,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json_lines.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":null,"code":null,"end_location":{"column":1,"row":2},"filename":"syntax_errors.py","fix":null,"location":{"column":15,"row":1},"message":"SyntaxError: Expected one or more symbol names after import","noqa_row":null,"url":null}
{"cell":null,"code":null,"end_location":{"column":1,"row":4},"filename":"syntax_errors.py","fix":null,"location":{"column":12,"row":3},"message":"SyntaxError: Expected ')', found newline","noqa_row":null,"url":null}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/junit.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/junit.rs
expression: env.render_diagnostics(&diagnostics)
---
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff" tests="3" failures="3" errors="0">

View File

@@ -1,15 +1,14 @@
---
source: crates/ruff_linter/src/message/junit.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/junit.rs
expression: env.render_diagnostics(&diagnostics)
---
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff" tests="2" failures="2" errors="0">
<testsuite name="syntax_errors.py" tests="2" disabled="0" errors="0" failures="2" package="org.ruff">
<testcase name="org.ruff" classname="syntax_errors" line="1" column="15">
<testcase name="org.ruff.invalid-syntax" classname="syntax_errors" line="1" column="15">
<failure message="SyntaxError: Expected one or more symbol names after import">line 1, col 15, SyntaxError: Expected one or more symbol names after import</failure>
</testcase>
<testcase name="org.ruff" classname="syntax_errors" line="3" column="12">
<testcase name="org.ruff.invalid-syntax" classname="syntax_errors" line="3" column="12">
<failure message="SyntaxError: Expected &apos;)&apos;, found newline">line 3, col 12, SyntaxError: Expected &apos;)&apos;, found newline</failure>
</testcase>
</testsuite>

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/pylint.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/pylint.rs
expression: env.render_diagnostics(&diagnostics)
---
fib.py:1: [F401] `os` imported but unused
fib.py:6: [F841] Local variable `x` is assigned to but never used

View File

@@ -0,0 +1,6 @@
---
source: crates/ruff_db/src/diagnostic/render/pylint.rs
expression: env.render_diagnostics(&diagnostics)
---
syntax_errors.py:1: [invalid-syntax] SyntaxError: Expected one or more symbol names after import
syntax_errors.py:3: [invalid-syntax] SyntaxError: Expected ')', found newline

View File

@@ -0,0 +1,20 @@
---
source: crates/ruff_db/src/diagnostic/render/rdjson.rs
expression: env.render(&diag)
---
{
"diagnostics": [
{
"code": {
"url": "https://docs.astral.sh/ruff/rules/test-diagnostic",
"value": "test-diagnostic"
},
"message": "main diagnostic message"
}
],
"severity": "WARNING",
"source": {
"name": "ruff",
"url": "https://docs.astral.sh/ruff"
}
}

View File

@@ -0,0 +1,20 @@
---
source: crates/ruff_db/src/diagnostic/render/rdjson.rs
expression: env.render(&diag)
---
{
"diagnostics": [
{
"code": {
"url": "https://docs.astral.sh/ruff/rules/test-diagnostic",
"value": "test-diagnostic"
},
"message": "main diagnostic message"
}
],
"severity": "WARNING",
"source": {
"name": "ruff",
"url": "https://docs.astral.sh/ruff"
}
}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/rdjson.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/rdjson.rs
expression: env.render_diagnostics(&diagnostics)
---
{
"diagnostics": [
@@ -96,7 +95,7 @@ snapshot_kind: text
"message": "Undefined name `a`"
}
],
"severity": "warning",
"severity": "WARNING",
"source": {
"name": "ruff",
"url": "https://docs.astral.sh/ruff"

View File

@@ -1,14 +1,12 @@
---
source: crates/ruff_linter/src/message/rdjson.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/rdjson.rs
expression: env.render_diagnostics(&diagnostics)
---
{
"diagnostics": [
{
"code": {
"url": null,
"value": null
"value": "invalid-syntax"
},
"location": {
"path": "syntax_errors.py",
@@ -27,8 +25,7 @@ snapshot_kind: text
},
{
"code": {
"url": null,
"value": null
"value": "invalid-syntax"
},
"location": {
"path": "syntax_errors.py",
@@ -46,7 +43,7 @@ snapshot_kind: text
"message": "SyntaxError: Expected ')', found newline"
}
],
"severity": "warning",
"severity": "WARNING",
"source": {
"name": "ruff",
"url": "https://docs.astral.sh/ruff"

View File

@@ -41,6 +41,8 @@ pub struct DiagnosticStylesheet {
pub(crate) line_no: Style,
pub(crate) emphasis: Style,
pub(crate) none: Style,
pub(crate) separator: Style,
pub(crate) secondary_code: Style,
}
impl Default for DiagnosticStylesheet {
@@ -62,6 +64,8 @@ impl DiagnosticStylesheet {
line_no: bright_blue.effects(Effects::BOLD),
emphasis: Style::new().effects(Effects::BOLD),
none: Style::new(),
separator: AnsiColor::Cyan.on_default(),
secondary_code: AnsiColor::Red.on_default().effects(Effects::BOLD),
}
}
@@ -75,6 +79,8 @@ impl DiagnosticStylesheet {
line_no: Style::new(),
emphasis: Style::new(),
none: Style::new(),
separator: Style::new(),
secondary_code: Style::new(),
}
}
}

View File

@@ -1,7 +1,6 @@
use std::fmt;
use std::sync::Arc;
use countme::Count;
use dashmap::mapref::entry::Entry;
pub use file_root::{FileRoot, FileRootKind};
pub use path::FilePath;
@@ -232,7 +231,7 @@ impl Files {
let roots = inner.roots.read().unwrap();
for root in roots.all() {
if root.path(db).starts_with(&path) {
if path.starts_with(root.path(db)) {
root.set_revision(db).to(FileRevision::now());
}
}
@@ -312,11 +311,6 @@ pub struct File {
/// the file has been deleted is to change the status to `Deleted`.
#[default]
status: FileStatus,
/// Counter that counts the number of created file instances and active file instances.
/// Only enabled in debug builds.
#[default]
count: Count<File>,
}
// The Salsa heap is tracked separately.
@@ -375,12 +369,25 @@ impl File {
}
/// Refreshes the file metadata by querying the file system if needed.
///
/// This also "touches" the file root associated with the given path.
/// This means that any Salsa queries that depend on the corresponding
/// root's revision will become invalidated.
pub fn sync_path(db: &mut dyn Db, path: &SystemPath) {
let absolute = SystemPath::absolute(path, db.system().current_directory());
Files::touch_root(db, &absolute);
Self::sync_system_path(db, &absolute, None);
}
/// Refreshes *only* the file metadata by querying the file system if needed.
///
/// This specifically does not touch any file root associated with the
/// given file path.
pub fn sync_path_only(db: &mut dyn Db, path: &SystemPath) {
let absolute = SystemPath::absolute(path, db.system().current_directory());
Self::sync_system_path(db, &absolute, None);
}
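// Illustrative usage (assumption, not from this diff): a caller that has
// already touched the relevant root, e.g. during a batched watch-event
// sweep, can use `File::sync_path_only` to avoid bumping the root revision
// a second time, while one-off refreshes use `File::sync_path`.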
/// Increments the revision for the virtual file at `path`.
pub fn sync_virtual_path(db: &mut dyn Db, path: &SystemVirtualPath) {
if let Some(virtual_file) = db.files().try_virtual_file(path) {
@@ -486,7 +493,7 @@ impl fmt::Debug for File {
///
/// This is a wrapper around a [`File`] that provides additional methods to interact with a virtual
/// file.
#[derive(Copy, Clone)]
#[derive(Copy, Clone, Debug)]
pub struct VirtualFile(File);
impl VirtualFile {

View File

@@ -23,7 +23,7 @@ pub struct FileRoot {
pub path: SystemPathBuf,
/// The kind of the root at the time of its creation.
kind_at_time_of_creation: FileRootKind,
pub kind_at_time_of_creation: FileRootKind,
/// A revision that changes when the contents of the source root change.
///

View File

@@ -5,6 +5,7 @@ use ruff_python_ast::PythonVersion;
use rustc_hash::FxHasher;
use std::hash::BuildHasherDefault;
use std::num::NonZeroUsize;
use ty_static::EnvVars;
pub mod diagnostic;
pub mod display;
@@ -27,6 +28,21 @@ pub use web_time::{Instant, SystemTime, SystemTimeError};
pub type FxDashMap<K, V> = dashmap::DashMap<K, V, BuildHasherDefault<FxHasher>>;
pub type FxDashSet<K> = dashmap::DashSet<K, BuildHasherDefault<FxHasher>>;
static VERSION: std::sync::OnceLock<String> = std::sync::OnceLock::new();
/// Returns the version of the executing program if set.
pub fn program_version() -> Option<&'static str> {
VERSION.get().map(|version| version.as_str())
}
/// Sets the version of the executing program.
///
/// ## Errors
/// If the version has already been initialized (can only be set once).
pub fn set_program_version(version: String) -> Result<(), String> {
VERSION.set(version)
}
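// Illustrative usage (assumption, not from this diff):
//
//     set_program_version("0.1.0".to_string()).unwrap();
//     assert_eq!(program_version(), Some("0.1.0"));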
/// Most basic database that gives access to files, the host system, source code, and parsed AST.
#[salsa::db]
pub trait Db: salsa::Database {
@@ -50,8 +66,8 @@ pub trait Db: salsa::Database {
/// ty can still spawn more threads for other tasks, e.g. to wait for a Ctrl+C signal or
/// watching the files for changes.
pub fn max_parallelism() -> NonZeroUsize {
std::env::var("TY_MAX_PARALLELISM")
.or_else(|_| std::env::var("RAYON_NUM_THREADS"))
std::env::var(EnvVars::TY_MAX_PARALLELISM)
.or_else(|_| std::env::var(EnvVars::RAYON_NUM_THREADS))
.ok()
.and_then(|s| s.parse().ok())
.unwrap_or_else(|| {

View File

@@ -21,7 +21,7 @@ use crate::source::source_text;
/// reflected in the changed AST offsets.
/// The other reason is that Ruff's AST doesn't implement `Eq` which Salsa requires
/// for determining if a query result is unchanged.
#[salsa::tracked(returns(ref), no_eq, heap_size=get_size2::GetSize::get_heap_size)]
#[salsa::tracked(returns(ref), no_eq, heap_size=get_size2::heap_size)]
pub fn parsed_module(db: &dyn Db, file: File) -> ParsedModule {
let _span = tracing::trace_span!("parsed_module", ?file).entered();

View File

@@ -1,8 +1,6 @@
use std::ops::Deref;
use std::sync::Arc;
use countme::Count;
use ruff_notebook::Notebook;
use ruff_python_ast::PySourceType;
use ruff_source_file::LineIndex;
@@ -11,7 +9,7 @@ use crate::Db;
use crate::files::{File, FilePath};
/// Reads the source text of a python text file (must be valid UTF8) or notebook.
#[salsa::tracked(heap_size=get_size2::GetSize::get_heap_size)]
#[salsa::tracked(heap_size=get_size2::heap_size)]
pub fn source_text(db: &dyn Db, file: File) -> SourceText {
let path = file.path(db);
let _span = tracing::trace_span!("source_text", file = %path).entered();
@@ -38,11 +36,7 @@ pub fn source_text(db: &dyn Db, file: File) -> SourceText {
};
SourceText {
inner: Arc::new(SourceTextInner {
kind,
read_error,
count: Count::new(),
}),
inner: Arc::new(SourceTextInner { kind, read_error }),
}
}
@@ -75,21 +69,21 @@ impl SourceText {
pub fn as_str(&self) -> &str {
match &self.inner.kind {
SourceTextKind::Text(source) => source,
SourceTextKind::Notebook(notebook) => notebook.source_code(),
SourceTextKind::Notebook { notebook } => notebook.source_code(),
}
}
/// Returns the underlying notebook if this is a notebook file.
pub fn as_notebook(&self) -> Option<&Notebook> {
match &self.inner.kind {
SourceTextKind::Notebook(notebook) => Some(notebook),
SourceTextKind::Notebook { notebook } => Some(notebook),
SourceTextKind::Text(_) => None,
}
}
/// Returns `true` if this is a notebook source file.
pub fn is_notebook(&self) -> bool {
matches!(&self.inner.kind, SourceTextKind::Notebook(_))
matches!(&self.inner.kind, SourceTextKind::Notebook { .. })
}
/// Returns `true` if there was an error when reading the content of the file.
@@ -114,7 +108,7 @@ impl std::fmt::Debug for SourceText {
SourceTextKind::Text(text) => {
dbg.field(text);
}
SourceTextKind::Notebook(notebook) => {
SourceTextKind::Notebook { notebook } => {
dbg.field(notebook);
}
}
@@ -125,29 +119,19 @@ impl std::fmt::Debug for SourceText {
#[derive(Eq, PartialEq, get_size2::GetSize)]
struct SourceTextInner {
#[get_size(ignore)]
count: Count<SourceText>,
kind: SourceTextKind,
read_error: Option<SourceTextError>,
}
#[derive(Eq, PartialEq)]
#[derive(Eq, PartialEq, get_size2::GetSize)]
enum SourceTextKind {
Text(String),
Notebook(Box<Notebook>),
}
impl get_size2::GetSize for SourceTextKind {
fn get_heap_size(&self) -> usize {
match self {
SourceTextKind::Text(text) => text.get_heap_size(),
// TODO: The `get-size` derive does not support ignoring enum variants.
//
// Jupyter notebooks are not very relevant for memory profiling, and contain
// arbitrary JSON values that do not implement the `GetSize` trait.
SourceTextKind::Notebook(_) => 0,
}
}
Notebook {
// Jupyter notebooks are not very relevant for memory profiling, and contain
// arbitrary JSON values that do not implement the `GetSize` trait.
#[get_size(ignore)]
notebook: Box<Notebook>,
},
}
impl From<String> for SourceTextKind {
@@ -158,7 +142,9 @@ impl From<String> for SourceTextKind {
impl From<Notebook> for SourceTextKind {
fn from(notebook: Notebook) -> Self {
SourceTextKind::Notebook(Box::new(notebook))
SourceTextKind::Notebook {
notebook: Box::new(notebook),
}
}
}
@@ -171,7 +157,7 @@ pub enum SourceTextError {
}
/// Computes the [`LineIndex`] for `file`.
#[salsa::tracked(heap_size=get_size2::GetSize::get_heap_size)]
#[salsa::tracked(heap_size=get_size2::heap_size)]
pub fn line_index(db: &dyn Db, file: File) -> LineIndex {
let _span = tracing::trace_span!("line_index", ?file).entered();

View File

@@ -21,6 +21,19 @@ type LockedZipArchive<'a> = MutexGuard<'a, VendoredZipArchive>;
///
/// "Files" in the `VendoredFileSystem` are read-only and immutable.
/// Directories are supported, but symlinks and hardlinks cannot exist.
///
/// # Path separators
///
/// At time of writing (2025-07-11), this implementation always uses `/` as a
/// path separator, even in Windows environments where `\` is traditionally
/// used as a file path separator. In practice, it is currently only used with zip
/// files built by `crates/ty_vendored/build.rs`.
///
/// Callers may provide paths that use `\` as a separator; these are
/// transparently normalized to `/`.
///
/// This is particularly important because the presence of a trailing separator
/// in a zip file is conventionally used to indicate a directory entry.
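///
/// For example (illustrative), a caller passing `stdlib\asyncio` has it
/// normalized to `stdlib/asyncio`, and `stdlib/asyncio/` (with the trailing
/// slash) names the directory entry rather than a file.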
#[derive(Clone)]
pub struct VendoredFileSystem {
inner: Arc<Mutex<VendoredZipArchive>>,
@@ -115,6 +128,68 @@ impl VendoredFileSystem {
read_to_string(self, path.as_ref())
}
/// Read the direct children of the directory
/// identified by `path`.
///
/// If `path` is not a directory, then this will
/// return an empty `Vec`.
pub fn read_directory(&self, dir: impl AsRef<VendoredPath>) -> Vec<DirectoryEntry> {
// N.B. We specifically do not return an iterator here to avoid
// holding a lock for the lifetime of the iterator returned.
// That is, it seems like a footgun to keep the zip archive
// locked during iteration, since the unit of work for each
// item in the iterator could be arbitrarily long. Allocating
// a `Vec` up front and collecting all entries into it is probably the
// simplest solution, and that's what we do here. If this becomes
// a problem, there are other strategies we could pursue.
// (Amortizing allocs, using a different synchronization
// behavior or even exposing additional APIs.) ---AG
fn read_directory(fs: &VendoredFileSystem, dir: &VendoredPath) -> Vec<DirectoryEntry> {
let mut normalized = NormalizedVendoredPath::from(dir);
if !normalized.as_str().ends_with('/') {
normalized = normalized.with_trailing_slash();
}
let archive = fs.lock_archive();
let mut entries = vec![];
for name in archive.0.file_names() {
// Any entry that doesn't have the `path` (with a
// trailing slash) as a prefix cannot possibly be in
// the directory referenced by `path`.
let Some(without_dir_prefix) = name.strip_prefix(normalized.as_str()) else {
continue;
};
// Filter out an entry equivalent to the path given
// since we only want children of the directory.
if without_dir_prefix.is_empty() {
continue;
}
// We only want *direct* children. Files that are
// direct children cannot have any slashes (or else
// they are not direct children). Directories that
// are direct children can only have one slash and
// it must be at the end.
//
// (We do this manually ourselves to avoid doing a
// full file lookup and metadata retrieval via the
// `zip` crate.)
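// For example, when listing `stdlib/` in the mock typeshed used by the
// tests below, `functools.pyi` (no slash) and `asyncio/` (one trailing
// slash) are direct children, while `asyncio/tasks.pyi` (a slash before
// the file name) is skipped.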
let file_type = FileType::from_zip_file_name(without_dir_prefix);
let slash_count = without_dir_prefix.matches('/').count();
match file_type {
FileType::File if slash_count > 0 => continue,
FileType::Directory if slash_count > 1 => continue,
_ => {}
}
entries.push(DirectoryEntry {
path: VendoredPathBuf::from(name),
file_type,
});
}
entries
}
read_directory(self, dir.as_ref())
}
/// Acquire a lock on the underlying zip archive.
/// The call will block until it is able to acquire the lock.
///
@@ -206,6 +281,14 @@ pub enum FileType {
}
impl FileType {
fn from_zip_file_name(name: &str) -> FileType {
if name.ends_with('/') {
FileType::Directory
} else {
FileType::File
}
}
pub const fn is_file(self) -> bool {
matches!(self, Self::File)
}
@@ -244,6 +327,30 @@ impl Metadata {
}
}
#[derive(Debug, PartialEq, Eq)]
pub struct DirectoryEntry {
path: VendoredPathBuf,
file_type: FileType,
}
impl DirectoryEntry {
pub fn new(path: VendoredPathBuf, file_type: FileType) -> Self {
Self { path, file_type }
}
pub fn into_path(self) -> VendoredPathBuf {
self.path
}
pub fn path(&self) -> &VendoredPath {
&self.path
}
pub fn file_type(&self) -> FileType {
self.file_type
}
}
/// Newtype wrapper around a ZipArchive.
#[derive(Debug)]
struct VendoredZipArchive(ZipArchive<io::Cursor<Cow<'static, [u8]>>>);
@@ -498,6 +605,60 @@ pub(crate) mod tests {
test_directory("./stdlib/asyncio/../asyncio/")
}
fn readdir_snapshot(fs: &VendoredFileSystem, path: &str) -> String {
let mut paths = fs
.read_directory(VendoredPath::new(path))
.into_iter()
.map(|entry| entry.path().to_string())
.collect::<Vec<String>>();
paths.sort();
paths.join("\n")
}
#[test]
fn read_directory_stdlib() {
let mock_typeshed = mock_typeshed();
assert_snapshot!(readdir_snapshot(&mock_typeshed, "stdlib"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
assert_snapshot!(readdir_snapshot(&mock_typeshed, "stdlib/"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
assert_snapshot!(readdir_snapshot(&mock_typeshed, "./stdlib"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
assert_snapshot!(readdir_snapshot(&mock_typeshed, "./stdlib/"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
}
#[test]
fn read_directory_asyncio() {
let mock_typeshed = mock_typeshed();
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "stdlib/asyncio"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "./stdlib/asyncio"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "stdlib/asyncio/"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "./stdlib/asyncio/"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
}
fn test_nonexistent_path(path: &str) {
let mock_typeshed = mock_typeshed();
let path = VendoredPath::new(path);

View File

@@ -17,6 +17,10 @@ impl VendoredPath {
unsafe { &*(path as *const Utf8Path as *const VendoredPath) }
}
pub fn file_name(&self) -> Option<&str> {
self.0.file_name()
}
pub fn to_path_buf(&self) -> VendoredPathBuf {
VendoredPathBuf(self.0.to_path_buf())
}

View File

@@ -13,6 +13,7 @@ license = { workspace = true }
[dependencies]
ty = { workspace = true }
ty_project = { workspace = true, features = ["schemars"] }
ty_static = { workspace = true }
ruff = { workspace = true }
ruff_formatter = { workspace = true }
ruff_linter = { workspace = true, features = ["schemars"] }

View File

@@ -4,7 +4,7 @@ use anyhow::Result;
use crate::{
generate_cli_help, generate_docs, generate_json_schema, generate_ty_cli_reference,
generate_ty_options, generate_ty_rules, generate_ty_schema,
generate_ty_env_vars_reference, generate_ty_options, generate_ty_rules, generate_ty_schema,
};
pub(crate) const REGENERATE_ALL_COMMAND: &str = "cargo dev generate-all";
@@ -44,5 +44,8 @@ pub(crate) fn main(args: &Args) -> Result<()> {
generate_ty_options::main(&generate_ty_options::Args { mode: args.mode })?;
generate_ty_rules::main(&generate_ty_rules::Args { mode: args.mode })?;
generate_ty_cli_reference::main(&generate_ty_cli_reference::Args { mode: args.mode })?;
generate_ty_env_vars_reference::main(&generate_ty_env_vars_reference::Args {
mode: args.mode,
})?;
Ok(())
}

View File

@@ -0,0 +1,119 @@
//! Generate the environment variables reference from `ty_static::EnvVars`.
use std::collections::BTreeSet;
use std::fs;
use std::path::PathBuf;
use anyhow::bail;
use pretty_assertions::StrComparison;
use ty_static::EnvVars;
use crate::generate_all::Mode;
#[derive(clap::Args)]
pub(crate) struct Args {
#[arg(long, default_value_t, value_enum)]
pub(crate) mode: Mode,
}
pub(crate) fn main(args: &Args) -> anyhow::Result<()> {
let reference_string = generate();
let filename = "environment.md";
let reference_path = PathBuf::from(env!("CARGO_MANIFEST_DIR"))
.parent()
.unwrap()
.parent()
.unwrap()
.join("crates")
.join("ty")
.join("docs")
.join(filename);
match args.mode {
Mode::DryRun => {
println!("{reference_string}");
}
Mode::Check => match fs::read_to_string(&reference_path) {
Ok(current) => {
if current == reference_string {
println!("Up-to-date: {filename}");
} else {
let comparison = StrComparison::new(&current, &reference_string);
bail!(
"{filename} changed, please run `cargo dev generate-ty-env-vars-reference`:\n{comparison}"
);
}
}
Err(err) if err.kind() == std::io::ErrorKind::NotFound => {
bail!(
"{filename} not found, please run `cargo dev generate-ty-env-vars-reference`"
);
}
Err(err) => {
bail!(
"{filename} changed, please run `cargo dev generate-ty-env-vars-reference`:\n{err}"
);
}
},
Mode::Write => {
// Ensure the docs directory exists
if let Some(parent) = reference_path.parent() {
fs::create_dir_all(parent)?;
}
match fs::read_to_string(&reference_path) {
Ok(current) => {
if current == reference_string {
println!("Up-to-date: {filename}");
} else {
println!("Updating: {filename}");
fs::write(&reference_path, reference_string.as_bytes())?;
}
}
Err(err) if err.kind() == std::io::ErrorKind::NotFound => {
println!("Updating: {filename}");
fs::write(&reference_path, reference_string.as_bytes())?;
}
Err(err) => {
bail!(
"{filename} changed, please run `cargo dev generate-ty-env-vars-reference`:\n{err}"
);
}
}
}
}
Ok(())
}
fn generate() -> String {
let mut output = String::new();
output.push_str("# Environment variables\n\n");
// Partition and sort environment variables into TY_ and external variables.
let (ty_vars, external_vars): (BTreeSet<_>, BTreeSet<_>) = EnvVars::metadata()
.iter()
.partition(|(var, _)| var.starts_with("TY_"));
output.push_str("ty defines and respects the following environment variables:\n\n");
for (var, doc) in ty_vars {
output.push_str(&render(var, doc));
}
output.push_str("## Externally-defined variables\n\n");
output.push_str("ty also reads the following externally defined environment variables:\n\n");
for (var, doc) in external_vars {
output.push_str(&render(var, doc));
}
output
}
/// Render an environment variable and its documentation.
fn render(var: &str, doc: &str) -> String {
format!("### `{var}`\n\n{doc}\n\n")
}

View File

@@ -18,6 +18,7 @@ mod generate_json_schema;
mod generate_options;
mod generate_rules_table;
mod generate_ty_cli_reference;
mod generate_ty_env_vars_reference;
mod generate_ty_options;
mod generate_ty_rules;
mod generate_ty_schema;
@@ -53,6 +54,8 @@ enum Command {
/// Generate a Markdown-compatible listing of configuration options.
GenerateOptions,
GenerateTyOptions(generate_ty_options::Args),
/// Generate environment variables reference for ty.
GenerateTyEnvVarsReference(generate_ty_env_vars_reference::Args),
/// Generate CLI help.
GenerateCliHelp(generate_cli_help::Args),
/// Generate Markdown docs.
@@ -98,6 +101,7 @@ fn main() -> Result<ExitCode> {
Command::GenerateTyRules(args) => generate_ty_rules::main(&args)?,
Command::GenerateOptions => println!("{}", generate_options::generate()),
Command::GenerateTyOptions(args) => generate_ty_options::main(&args)?,
Command::GenerateTyEnvVarsReference(args) => generate_ty_env_vars_reference::main(&args)?,
Command::GenerateCliHelp(args) => generate_cli_help::main(&args)?,
Command::GenerateDocs(args) => generate_docs::main(&args)?,
Command::PrintAST(args) => print_ast::main(&args)?,

View File

@@ -16,5 +16,6 @@ doctest = false
[dependencies]
ruff_text_size = { workspace = true }
get-size2 = { workspace = true }
is-macro = { workspace = true }
serde = { workspace = true, optional = true, features = [] }

View File

@@ -7,7 +7,7 @@ use ruff_text_size::{Ranged, TextRange, TextSize};
/// A text edit to be applied to a source file. Inserts, deletes, or replaces
/// content at a given location.
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
#[derive(Clone, Debug, PartialEq, Eq, Hash, get_size2::GetSize)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct Edit {
/// The start location of the edit.

View File

@@ -6,7 +6,9 @@ use ruff_text_size::{Ranged, TextSize};
use crate::edit::Edit;
/// Indicates if a fix can be applied.
#[derive(Copy, Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Ord, is_macro::Is)]
#[derive(
Copy, Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Ord, is_macro::Is, get_size2::GetSize,
)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
#[cfg_attr(feature = "serde", serde(rename_all = "lowercase"))]
pub enum Applicability {
@@ -30,7 +32,7 @@ pub enum Applicability {
}
/// Indicates the level of isolation required to apply a fix.
#[derive(Default, Copy, Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Ord)]
#[derive(Default, Copy, Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Ord, get_size2::GetSize)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum IsolationLevel {
/// The fix should be applied as long as no other fixes in the same group have been applied.
@@ -41,7 +43,7 @@ pub enum IsolationLevel {
}
/// A collection of [`Edit`] elements to be applied to a source file.
#[derive(Debug, PartialEq, Eq, Clone)]
#[derive(Debug, PartialEq, Eq, Clone, Hash, get_size2::GetSize)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct Fix {
/// The [`Edit`] elements to be applied, sorted by [`Edit::start`] in ascending order.
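A minimal sketch of what the new `get_size2::GetSize` derives enable, assuming the crate exposes a `get_size()` method on the derived trait, as the original `get-size` crate does (the struct below is a hypothetical stand-in, not the real `Edit`):

// Hypothetical stand-in with fields that only loosely mirror `Edit`.
#[derive(get_size2::GetSize)]
struct ExampleEdit {
    content: Option<String>,
    start: u32,
}

fn main() {
    let edit = ExampleEdit {
        content: Some("replacement text".to_string()),
        start: 0,
    };
    // Assumed API: reports the stack size plus owned heap allocations.
    println!("approximate size in bytes: {}", get_size2::GetSize::get_size(&edit));
}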

View File

@@ -20,6 +20,7 @@ ty_python_semantic = { workspace = true }
anyhow = { workspace = true }
clap = { workspace = true, optional = true }
memchr = { workspace = true }
salsa = { workspace = true }
schemars = { workspace = true, optional = true }
serde = { workspace = true, optional = true }

View File

@@ -1,3 +1,4 @@
use crate::StringImports;
use ruff_python_ast::visitor::source_order::{
SourceOrderVisitor, walk_expr, walk_module, walk_stmt,
};
@@ -10,13 +11,13 @@ pub(crate) struct Collector<'a> {
/// The path to the current module.
module_path: Option<&'a [String]>,
/// Whether to detect imports from string literals.
string_imports: bool,
string_imports: StringImports,
/// The collected imports from the Python AST.
imports: Vec<CollectedImport>,
}
impl<'a> Collector<'a> {
pub(crate) fn new(module_path: Option<&'a [String]>, string_imports: bool) -> Self {
pub(crate) fn new(module_path: Option<&'a [String]>, string_imports: StringImports) -> Self {
Self {
module_path,
string_imports,
@@ -118,7 +119,7 @@ impl<'ast> SourceOrderVisitor<'ast> for Collector<'_> {
| Stmt::Continue(_)
| Stmt::IpyEscapeCommand(_) => {
// Only traverse simple statements when string import detection is enabled.
if self.string_imports {
if self.string_imports.enabled {
walk_stmt(self, stmt);
}
}
@@ -126,20 +127,26 @@ impl<'ast> SourceOrderVisitor<'ast> for Collector<'_> {
}
fn visit_expr(&mut self, expr: &'ast Expr) {
if self.string_imports {
if self.string_imports.enabled {
if let Expr::StringLiteral(ast::ExprStringLiteral {
value,
range: _,
node_index: _,
}) = expr
{
// Determine whether the string literal "looks like" an import statement: contains
// a dot, and consists solely of valid Python identifiers.
let value = value.to_str();
if let Some(module_name) = ModuleName::new(value) {
self.imports.push(CollectedImport::Import(module_name));
// Determine whether the string literal "looks like" an import statement: contains
// the requisite number of dots, and consists solely of valid Python identifiers.
if self.string_imports.min_dots == 0
|| memchr::memchr_iter(b'.', value.as_bytes()).count()
>= self.string_imports.min_dots
{
if let Some(module_name) = ModuleName::new(value) {
self.imports.push(CollectedImport::Import(module_name));
}
}
}
walk_expr(self, expr);
}
}
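For illustration, a standalone sketch of the dot-count heuristic above, reusing the `memchr` crate and assuming the default of `min_dots = 2` (the helper name is made up, and the real code additionally requires the string to parse as a valid `ModuleName`):

// A string literal only "looks like" an import when it contains at least
// `min_dots` dots, or when the threshold is disabled by setting it to 0.
fn looks_like_string_import(value: &str, min_dots: usize) -> bool {
    min_dots == 0 || memchr::memchr_iter(b'.', value.as_bytes()).count() >= min_dots
}

fn main() {
    assert!(looks_like_string_import("importlib.machinery.SourceFileLoader", 2));
    assert!(!looks_like_string_import("foo.bar", 2)); // only one dot
    assert!(looks_like_string_import("foo", 0)); // threshold disabled
}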

View File

@@ -87,7 +87,7 @@ impl SourceDb for ModuleDb {
#[salsa::db]
impl Db for ModuleDb {
fn is_file_open(&self, file: File) -> bool {
fn should_check_file(&self, file: File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -9,7 +9,7 @@ use ruff_python_parser::{Mode, ParseOptions, parse};
use crate::collector::Collector;
pub use crate::db::ModuleDb;
use crate::resolver::Resolver;
pub use crate::settings::{AnalyzeSettings, Direction};
pub use crate::settings::{AnalyzeSettings, Direction, StringImports};
mod collector;
mod db;
@@ -26,7 +26,7 @@ impl ModuleImports {
db: &ModuleDb,
path: &SystemPath,
package: Option<&SystemPath>,
string_imports: bool,
string_imports: StringImports,
) -> Result<Self> {
// Read and parse the source code.
let source = std::fs::read_to_string(path)?;
@@ -42,13 +42,11 @@ impl ModuleImports {
// Resolve the imports.
let mut resolved_imports = ModuleImports::default();
for import in imports {
let Some(resolved) = Resolver::new(db).resolve(import) else {
continue;
};
let Some(path) = resolved.as_system_path() else {
continue;
};
resolved_imports.insert(path.to_path_buf());
for resolved in Resolver::new(db).resolve(import) {
if let Some(path) = resolved.as_system_path() {
resolved_imports.insert(path.to_path_buf());
}
}
}
Ok(resolved_imports)

View File

@@ -1,5 +1,5 @@
use ruff_db::files::FilePath;
use ty_python_semantic::resolve_module;
use ty_python_semantic::{ModuleName, resolve_module, resolve_real_module};
use crate::ModuleDb;
use crate::collector::CollectedImport;
@@ -16,24 +16,67 @@ impl<'a> Resolver<'a> {
}
/// Resolve the [`CollectedImport`] into a [`FilePath`].
pub(crate) fn resolve(&self, import: CollectedImport) -> Option<&'a FilePath> {
pub(crate) fn resolve(&self, import: CollectedImport) -> impl Iterator<Item = &'a FilePath> {
match import {
CollectedImport::Import(import) => {
let module = resolve_module(self.db, &import)?;
Some(module.file()?.path(self.db))
// Attempt to resolve the module (e.g., given `import foo`, look for `foo`).
let file = self.resolve_module(&import);
// If the file is a stub, look for the corresponding source file.
let source_file = file
.is_some_and(|file| file.extension() == Some("pyi"))
.then(|| self.resolve_real_module(&import))
.flatten();
std::iter::once(file)
.chain(std::iter::once(source_file))
.flatten()
}
CollectedImport::ImportFrom(import) => {
// Attempt to resolve the member (e.g., given `from foo import bar`, look for `foo.bar`).
if let Some(file) = self.resolve_module(&import) {
// If the file is a stub, look for the corresponding source file.
let source_file = (file.extension() == Some("pyi"))
.then(|| self.resolve_real_module(&import))
.flatten();
return std::iter::once(Some(file))
.chain(std::iter::once(source_file))
.flatten();
}
// Attempt to resolve the module (e.g., given `from foo import bar`, look for `foo`).
let parent = import.parent();
let file = parent
.as_ref()
.and_then(|parent| self.resolve_module(parent));
let module = resolve_module(self.db, &import).or_else(|| {
// Attempt to resolve the module (e.g., given `from foo import bar`, look for `foo`).
// If the file is a stub, look for the corresponding source file.
let source_file = file
.is_some_and(|file| file.extension() == Some("pyi"))
.then(|| {
parent
.as_ref()
.and_then(|parent| self.resolve_real_module(parent))
})
.flatten();
resolve_module(self.db, &parent?)
})?;
Some(module.file()?.path(self.db))
std::iter::once(file)
.chain(std::iter::once(source_file))
.flatten()
}
}
}
/// Resolves a module name to the path of its module file.
fn resolve_module(&self, module_name: &ModuleName) -> Option<&'a FilePath> {
let module = resolve_module(self.db, module_name)?;
Some(module.file(self.db)?.path(self.db))
}
/// Resolves a module name to the path of its module file, ignoring stub files.
fn resolve_real_module(&self, module_name: &ModuleName) -> Option<&'a FilePath> {
let module = resolve_real_module(self.db, module_name)?;
Some(module.file(self.db)?.path(self.db))
}
}
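The chaining pattern above yields the resolved file first and, when it turns out to be a `.pyi` stub, the corresponding source module as well; a minimal sketch with string stand-ins for `FilePath` (both paths are illustrative only):

fn main() {
    // Stand-ins for the resolver results; the real code works with `FilePath`s.
    let stub: Option<&str> = Some("foo/__init__.pyi");
    let source: Option<&str> = stub
        .filter(|path| path.ends_with(".pyi"))
        .map(|_| "foo/__init__.py");

    // Yield the stub, then the source file, dropping any `None`s.
    let resolved: Vec<&str> = std::iter::once(stub)
        .chain(std::iter::once(source))
        .flatten()
        .collect();

    assert_eq!(resolved, ["foo/__init__.pyi", "foo/__init__.py"]);
}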

View File

@@ -11,7 +11,7 @@ pub struct AnalyzeSettings {
pub exclude: FilePatternSet,
pub preview: PreviewMode,
pub target_version: PythonVersion,
pub detect_string_imports: bool,
pub string_imports: StringImports,
pub include_dependencies: BTreeMap<PathBuf, (PathBuf, Vec<String>)>,
pub extension: ExtensionMapping,
}
@@ -26,7 +26,7 @@ impl fmt::Display for AnalyzeSettings {
self.exclude,
self.preview,
self.target_version,
self.detect_string_imports,
self.string_imports,
self.extension | debug,
self.include_dependencies | debug,
]
@@ -35,6 +35,31 @@ impl fmt::Display for AnalyzeSettings {
}
}
#[derive(Debug, Copy, Clone, CacheKey)]
pub struct StringImports {
pub enabled: bool,
pub min_dots: usize,
}
impl Default for StringImports {
fn default() -> Self {
Self {
enabled: false,
min_dots: 2,
}
}
}
impl fmt::Display for StringImports {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
if self.enabled {
write!(f, "enabled (min_dots: {})", self.min_dots)
} else {
write!(f, "disabled")
}
}
}
#[derive(Default, Debug, Copy, Clone, PartialEq, Eq, CacheKey)]
#[cfg_attr(
feature = "serde",

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.12.2"
version = "0.12.7"
publish = false
authors = { workspace = true }
edition = { workspace = true }
@@ -15,7 +15,7 @@ license = { workspace = true }
[dependencies]
ruff_annotate_snippets = { workspace = true }
ruff_cache = { workspace = true }
ruff_db = { workspace = true }
ruff_db = { workspace = true, features = ["junit", "serde"] }
ruff_diagnostics = { workspace = true, features = ["serde"] }
ruff_notebook = { workspace = true }
ruff_macros = { workspace = true }
@@ -55,7 +55,6 @@ path-absolutize = { workspace = true, features = [
pathdiff = { workspace = true }
pep440_rs = { workspace = true }
pyproject-toml = { workspace = true }
quick-junit = { workspace = true }
regex = { workspace = true }
rustc-hash = { workspace = true }
schemars = { workspace = true, optional = true }

View File

@@ -25,5 +25,5 @@ def my_func():
# t-strings - all ok
t"0.0.0.0"
"0.0.0.0" t"0.0.0.0{expr}0.0.0.0"
"0.0.0.0" f"0.0.0.0{expr}0.0.0.0" t"0.0.0.0{expr}0.0.0.0"
t"0.0.0.0" t"0.0.0.0{expr}0.0.0.0"
t"0.0.0.0" t"0.0.0.0{expr}0.0.0.0" t"0.0.0.0{expr}0.0.0.0"

View File

@@ -1,29 +1,30 @@
try:
pass
except Exception:
continue
for _ in []:
try:
pass
except Exception:
continue
try:
pass
except:
continue
try:
pass
except:
continue
try:
pass
except (Exception,):
continue
try:
pass
except (Exception,):
continue
try:
pass
except (Exception, ValueError):
continue
try:
pass
except (Exception, ValueError):
continue
try:
pass
except ValueError:
continue
try:
pass
except ValueError:
continue
try:
pass
except (ValueError,):
continue
try:
pass
except (ValueError,):
continue

View File

@@ -94,7 +94,7 @@ except Exception:
logging.error("...", exc_info=True)
from logging import error, exception
from logging import critical, error, exception
try:
pass
@@ -114,6 +114,23 @@ except Exception:
error("...", exc_info=None)
try:
pass
except Exception:
critical("...")
try:
pass
except Exception:
critical("...", exc_info=False)
try:
pass
except Exception:
critical("...", exc_info=None)
try:
pass
except Exception:
@@ -125,6 +142,13 @@ try:
except Exception:
error("...", exc_info=True)
try:
pass
except Exception:
critical("...", exc_info=True)
try:
...
except Exception as e:

Some files were not shown because too many files have changed in this diff.