Compare commits

...

76 Commits

Author SHA1 Message Date
Aria Desires
7ef86c9637 fixup 2025-09-18 21:08:49 -04:00
Aria Desires
793d0d0dd4 Implement additional implicit re-exports idiom for __init__.pyi 2025-09-18 15:09:01 -04:00
Ibraheem Ahmed
e84d523bcf [ty] Infer more precise types for collection literals (#20360)
## Summary

Part of https://github.com/astral-sh/ty/issues/168. Infer more precise types for collection literals (currently, only `list` and `set`). For example,

```py
x = [1, 2, 3] # revealed: list[Unknown | int]
y: list[int] = [1, 2, 3] # revealed: list[int]
```

This could easily be extended to `dict` literals, but I am intentionally limiting scope for now.
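By analogy with the `list` example above, the `set` case would look like this (an assumed analogue based on the summary; the `revealed` comments are not copied from ty output):

```py
# Assumed set-literal analogue of the list example above:
s = {1, 2, 3}             # revealed: set[Unknown | int]
t: set[int] = {1, 2, 3}   # revealed: set[int]
```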
2025-09-17 18:51:50 -04:00
chiri
bfb0902446 [flake8-use-pathlib] Add fix for PTH123 (#20169)
## Summary
Part of https://github.com/astral-sh/ruff/issues/2331

## Test Plan
`cargo nextest run flake8_use_pathlib`
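As a sketch of the pattern this fix targets (assuming the usual PTH123 rewrite from the `open()` builtin to `pathlib`):

```py
import os
import tempfile
from pathlib import Path

# Create a scratch file so the example is self-contained.
fd, name = tempfile.mkstemp()
os.close(fd)
Path(name).write_text("hello")

# Before (triggers PTH123): with open(name) as f: ...
# After (the kind of code the autofix produces):
with Path(name).open() as f:
    content = f.read()

os.unlink(name)
```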

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-09-17 15:47:29 -04:00
Shaygan Hooshyari
05622ae757 [ty] Bind Self typevar to method context (#20366)
Fixes: https://github.com/astral-sh/ty/issues/1173


## Summary

This PR changes the logic of binding `Self` type variables so that
`Self` is bound to the immediate method in which it is used.
Since we bind `Self` to methods and not to the class itself, we need
to ensure that we bind it consistently.

The fix is to traverse the scopes containing the `Self` reference, find
the first function inside a class, and use that function to bind the
typevar.

If no such scope is found, we fall back to the normal behavior. Using `Self`
outside of a class scope is not legal anyway.
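For context, this is the standard `Self` pattern the binding logic has to get right (a generic illustration, not a case taken from the linked issue):

```py
try:
    from typing import Self  # Python 3.11+
except ImportError:  # pragma: no cover
    Self = "Self"  # annotation-only fallback for older interpreters

class Builder:
    def set_name(self, name: str) -> Self:
        # `Self` is bound in the context of this method's class, so
        # calls on a subclass instance return the subclass type.
        self.name = name
        return self

class SubBuilder(Builder):
    def set_age(self, age: int) -> Self:
        self.age = age
        return self

b = SubBuilder().set_name("x").set_age(3)
```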

## Test Plan

Added a new mdtest.

Checked the diagnostics that are not emitted anymore in [primer
results](https://github.com/astral-sh/ruff/pull/20366#issuecomment-3289411424).
It looks good, although I don't completely understand what was wrong
before.

---------

Co-authored-by: Douglas Creager <dcreager@dcreager.net>
2025-09-17 14:58:54 -04:00
Andrew Gallant
3fcbe8bde6 [ty] Remove TODO about using a non-panicking lookup method
Ref https://github.com/astral-sh/ruff/pull/20439#discussion_r2355082049

Ref https://github.com/astral-sh/ruff/pull/18455#discussion_r2126833137
2025-09-17 13:59:28 -04:00
Andrew Gallant
64a4e2889e [ty] Add new completion data to wasm bridge 2025-09-17 13:59:28 -04:00
Andrew Gallant
5816985ecd [ruff] Remove Locator from Importer
It seems like we'd like to remove `Locator` since it's a bit
awkward in how it works:
https://github.com/astral-sh/ruff/pull/20439#discussion_r2354683797

It looked pretty easy to rip it out of the `Importer`, so that's
one less thing using it.
2025-09-17 13:59:28 -04:00
Andrew Gallant
b6a29592e7 [ty] Add a new abstraction for adding imports to a module
This is somewhat inspired by a similar abstraction in
`ruff_linter`. The main idea is to create an importer once
for a module that you want to add imports to. And then call
`import` to generate an edit for each symbol you want to
add.

I haven't done any performance profiling here yet. I don't
know if it will be a bottleneck. In particular, I do expect
`Importer::import` (but not `Importer::new`) to get called
many times for a single completion request when auto-import
is enabled. Particularly in projects with a lot of unimported
symbols. Because I don't know the perf impact, I didn't do
any premature optimization here. But there are surely some
low hanging fruit if this does prove to be a problem.

New tests make up a big portion of the diff here. I tried to
think of a bunch of different cases, although I'm sure there
are more.
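The shape of the abstraction is roughly: construct an importer once per module, then ask it for one edit per symbol. A hypothetical Python sketch of the idea (names and behavior are assumptions, not the actual Rust API):

```py
# Hypothetical sketch only; not ty's API.
class Importer:
    def __init__(self, source: str):
        self.source = source

    def import_edit(self, module: str, name: str) -> tuple[int, str]:
        """Return (offset, text): an edit adding `from module import name`."""
        offset = 0
        # Simplification: insert after the leading block of imports/blank lines.
        for line in self.source.splitlines(keepends=True):
            if line.startswith(("import ", "from ")) or not line.strip():
                offset += len(line)
            else:
                break
        return offset, f"from {module} import {name}\n"

importer = Importer("import os\n\nx = 1\n")
edit = importer.import_edit("pathlib", "Path")
```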
2025-09-17 13:59:28 -04:00
Andrew Gallant
bcc8d6910b [ty] Refactor to handle unimported completions
This rejiggers some stuff in the main completions entrypoint
in `ty_ide`. A more refined `Completion` type is defined
with more information. In particular, to support auto-import,
we now include a module name and an "edit" for inserting an
import.

This also rolls the old "detailed completion" into the new
completion type. Previously, we were relying on the completion
type for `ty_python_semantic`. But `ty_ide` is really the code
that owns completions.

Note that this code doesn't build as-is. The next commit will
add the importer used here in `add_unimported_completions`.
2025-09-17 13:59:28 -04:00
Andrew Gallant
02ee22db78 [ty] Add module to result returned by "all symbols" API
Based on how this API is currently implemented, this doesn't
really cost us anything. But it gives us access to more
information about where the symbol is defined.
2025-09-17 13:59:28 -04:00
Andrew Gallant
0a2325c5fe [ty] Move CompletionKind to ty_ide
I think this is a better home for it. This way, `ty_ide`
more clearly owns how the "kind" of a completion is computed.
In particular, it is computed differently for things where
we know its type versus unimported symbols.
2025-09-17 13:59:28 -04:00
Andrew Gallant
6c3c963f8a [ty] Include definition site for "members of" API
In the course of writing the "add an import" implementation,
I realized that we needed to know which symbols were in scope
and how they were defined. This was necessary to be able to
determine how to add a new import in a way that (minimally)
does not conflict with existing symbols.

I'm not sure that this is fully correct (especially for
symbol bindings) and it's unclear to me in which cases a
definition site will be missing. But this seems to work for
some of the basic cases that I tried.
2025-09-17 13:59:28 -04:00
Andrew Gallant
6ec52991cb [ty] Fix a bug with "all_submodule_names_for_package" API
The names of the submodules returned should be *complete*. This
is the contract of `Module::name`. However, we were previously
only returning the basename of the submodule.
2025-09-17 13:59:28 -04:00
Andrew Gallant
cf16fc4aa4 [ty] Export some stuff from ty_python_semantic
We're going to want to use this outside of `ty_python_semantic`.
Specifically, in `ty_ide`.
2025-09-17 13:59:28 -04:00
Andrew Gallant
61a49c89eb [ruff] Add TextRange::to_std_range
This can already be accomplished via a `From` impl (and indeed,
that's how this is implemented). But in a generic context, the
turbo-fishing that needs to be applied is quite annoying.
2025-09-17 13:59:28 -04:00
Andrew Gallant
da5eb85087 [ruff] Add API for splicing into an existing import statement
Basically, given a `from module import name1, name2, ...` statement,
we'd like to be able to insert another name in that list.

This new `Insertion::existing_import` API provides such
functionality. There isn't much to it, although we are careful
to try and avoid inserting nonsense for import statements
that are already invalid.
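Conceptually, the splice looks like this (a toy Python illustration, not the Rust implementation):

```py
def splice_import(stmt: str, new_name: str) -> str:
    """Insert `new_name` into an existing `from module import ...` statement."""
    prefix, _, names = stmt.partition(" import ")
    parts = [n.strip() for n in names.split(",")]
    if new_name not in parts:
        parts.append(new_name)
    return f"{prefix} import {', '.join(sorted(parts))}"

spliced = splice_import("from os import path, sep", "getcwd")
# e.g. "from os import getcwd, path, sep"
```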
2025-09-17 13:59:28 -04:00
Andrew Gallant
a47a50e6e2 [ruff] Provide a way to get an owned Stylist
This makes it easier to test with in some cases and generally shouldn't
cost anything.
2025-09-17 13:59:28 -04:00
Andrew Gallant
880a867696 [ruff] Move Insertion abstraction out of ruff_linter
This refactors the importer abstraction to use a shared
`Insertion`. This is mostly just moving some code around
with some slight tweaks.

The plan here is to keep the rest of the importing code
in `ruff_linter` and then write something ty-specific on
top of `Insertion`. This ends up sharing some code, but
not as much as would be ideal. In particular, the
`ruff_linter` importer is pretty tightly coupled with
ruff's semantic model. So to share the code, we'd need to
abstract over that.
2025-09-17 13:59:28 -04:00
Andrew Gallant
ec2720c814 [ruff] Small tweak to import
As I was playing around in this file, it was much nicer
to just use `cst::` everywhere, similar to what we do with
`ruff_python_ast`.
2025-09-17 13:59:28 -04:00
chiri
cb3c3ba94d [flake8-use-pathlib] A bit clean up PTH100 (#20452)
## Summary

Part of https://github.com/astral-sh/ruff/pull/20215

## Test Plan
2025-09-17 12:11:30 -04:00
chiri
c585c9f6d4 [flake8-use-pathlib] Make PTH111 fix unsafe because it can change behavior (#20215)
## Summary

Fixes https://github.com/astral-sh/ruff/issues/20214

## Test Plan
2025-09-17 11:23:55 -04:00
Brent Westbrook
ac5488086f [ty] Add GitHub output format (#20358)
## Summary

This PR wires up the GitHub output format moved to `ruff_db` in #20320
to the ty CLI.

It's a bit smaller than the GitLab version (#20155) because some of the
helpers were already in place, but I did factor out a few
`DisplayDiagnosticConfig` constructor calls in Ruff. I also exposed the
`GithubRenderer` and a wrapper `DisplayGithubDiagnostics` type because
we needed a way to configure the program name displayed in the GitHub
diagnostics. This was previously hard-coded to `Ruff`:

<img width="675" height="247" alt="image"
src="https://github.com/user-attachments/assets/592da860-d2f5-4abd-bc5a-66071d742509"
/>

Another option would be to drop the program name in the output format,
but I think it can be helpful in workflows with multiple programs
emitting annotations (such as Ruff and ty!)

## Test Plan

New CLI test, and a manual test with `--config 'terminal.output-format =
"github"'`
2025-09-17 09:50:25 -04:00
Carl Meyer
7e464b8150 [ty] move graphql-core to good.txt (#20447)
## Summary

With https://github.com/astral-sh/ruff/pull/20446, graphql-core now
checks without error; we can move it to `good.txt`.

## Test Plan

CI
2025-09-17 10:09:32 +02:00
David Peter
ffd650e5fd [ty] Update mypy_primer (#20433)
## Summary

Revert the materialize-changes, see
https://github.com/hauntsaninja/mypy_primer/pull/208

## Test Plan

CI
2025-09-17 09:51:48 +02:00
Carl Meyer
99ec4d2c69 [ty] detect cycles in binary comparison inference (#20446)
## Summary

Catch infinite recursion in binary-compare inference.

Fixes the stack overflow in `graphql-core` in mypy-primer.

## Test Plan

Added two tests that stack-overflowed before this PR.
2025-09-17 09:45:25 +02:00
justin
9f0b942b9e [ty] infer name and value for enum members (#20311)
## summary
- this pr implements the following attributes for `Enum` members:
  - `name`
  - `_name_`
  - `value`
  - `_value_`
- adds a TODO test for `my_enum_class_instance.name`
- only implements these attributes if the instance is a subclass of `Enum`, per this
[comment](https://github.com/astral-sh/ruff/pull/19481#issuecomment-3103460307)
and the existing
[test](c34449ed7c/crates/ty_python_semantic/resources/mdtest/enums.md (L625))
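These attributes follow standard `enum` runtime behavior:

```py
from enum import Enum

class Color(Enum):
    RED = 1

# The attributes this PR teaches ty to infer precisely:
assert Color.RED.name == "RED"    # also available as _name_
assert Color.RED.value == 1       # also available as _value_
assert Color.RED._name_ == "RED"
assert Color.RED._value_ == 1
```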

### pointers
- https://github.com/astral-sh/ty/issues/876
- https://typing.python.org/en/latest/spec/enums.html#enum-definition
- https://github.com/astral-sh/ruff/pull/19481#issuecomment-3103460307

## test plan
- mdtests
- triaged conformance diffs here:
https://diffswarm.dev/d-01k531ag4nee3xmdeq4f3j66pb
- triaged mypy primer diffs here for django-stubs:
https://diffswarm.dev/d-01k5331n13k9yx8tvnxnkeawp3
  - added a TODO test for overriding `.value`
- discord diff seems reasonable

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-09-17 09:36:27 +02:00
Carl Meyer
c2fa449954 [ty] support type aliases in binary compares (#20445)
## Summary

Add missing `Type::TypeAlias` clauses to `infer_binary_type_comparison`.

## Test Plan

Added mdtests that failed before.
2025-09-17 09:33:26 +02:00
Carl Meyer
681ad2fd92 [ty] move primer projects to good.txt (#20444)
## Summary

After https://github.com/astral-sh/ruff/pull/20359 we can move all but
three remaining projects over to `good.txt`.

## Test Plan

CI
2025-09-17 09:31:27 +02:00
Takayuki Maeda
98071b49c2 [playground] Enable inline noqa for multiline strings in playground (#20442) 2025-09-17 09:29:40 +02:00
Carl Meyer
d121a76aef [ty] no more diverging query cycles in type expressions (#20359)
## Summary

Use `Type::Divergent` to short-circuit diverging types in type
expressions. This avoids panicking in a wide variety of cases of
recursive type expressions.

Avoids many panics (but not yet all -- I'll be tracking down the rest)
from https://github.com/astral-sh/ty/issues/256 by falling back to
Divergent. For many of these recursive type aliases, we'd like to
support them properly (i.e. really understand the recursive nature of
the type, not just fall back to Divergent) but that will be future work.

This switches `Type::has_divergent_type` from using `any_over_type` to a
custom set of visit methods, because `any_over_type` visits more than we
need to visit, and exercises some lazy attributes of type, causing
significantly more work. This change means this diff doesn't regress
perf; it even reclaims some of the perf regression from
https://github.com/astral-sh/ruff/pull/20333.
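A representative recursive type alias of the kind described above (an illustrative example, not necessarily the exact case from the linked issue):

```py
from typing import Union

# A classic recursive alias: JSON values contain JSON values.
JSON = Union[str, int, float, bool, None, list["JSON"], dict[str, "JSON"]]

def roundtrip(value: JSON) -> JSON:
    return value

result = roundtrip([1, "a", {"k": None}])
```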

## Test Plan

Added mdtest for recursive type alias that panics on main.

Verified that we can now type-check `packaging` (and projects depending
on it) without panic; this will allow moving a number of mypy-primer
projects from `bad.txt` to `good.txt` in a subsequent PR.
2025-09-16 16:44:11 -07:00
Bhuminjay Soni
c3f2187fda [syntax-errors]: import from * only allowed at module scope (F406) (#20166)

## Summary

This PR implements F406
https://docs.astral.sh/ruff/rules/undefined-local-with-nested-import-star-usage/
as a semantic syntax error
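The construct in question is a compile-time error in CPython, which a quick check confirms (a minimal illustration of the rule):

```py
code = """
def f():
    from os import *
"""

try:
    compile(code, "<example>", "exec")
    star_import_in_function_is_valid = True
except SyntaxError:
    # CPython: "import * only allowed at module level"
    star_import_in_function_is_valid = False
```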

## Test Plan

I have written inline tests as directed in #17412

---------

Signed-off-by: 11happy <soni5happy@gmail.com>
2025-09-16 15:53:28 -04:00
Amethyst Reese
8a027b0d74 [ruff] Treat panics as fatal diagnostics, sort panics last (#20258)
- Convert panics to diagnostics with id `Panic`, severity `Fatal`, and
the error as the diagnostic message, annotated with a `Span` with empty
code block and no range.
- Updates the post-linting message diagnostic handling to track the
maximum severity seen, and then prints the "report a bug in ruff"
message only if the max severity was `Fatal`

This depends on the sorting changes since it creates diagnostics with no
range specified.
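The post-linting logic described above boils down to tracking a maximum severity (a small sketch; the severity names are assumed from the description):

```py
from enum import IntEnum

class Severity(IntEnum):
    WARNING = 1
    ERROR = 2
    FATAL = 3  # panics are converted to diagnostics at this severity

diagnostics = [Severity.ERROR, Severity.FATAL, Severity.WARNING]
max_severity = max(diagnostics, default=Severity.WARNING)

# The "report a bug in ruff" message is printed only for Fatal.
show_bug_report = max_severity == Severity.FATAL
```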
2025-09-16 11:33:37 -07:00
Dan Parizher
aa63c24b8f [pycodestyle] Fix E301 to only trigger for functions immediately within a class (#19768)
## Summary

Fixes #19752
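The kind of code E301 targets is a method immediately within a class body (illustrative):

```py
class C:
    def a(self):
        return 1
    def b(self):  # E301: expected 1 blank line, got 0
        return 2

# After this fix, E301 no longer fires for functions nested more
# deeply, e.g. a helper defined inside a method.
```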

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-09-16 11:00:07 -04:00
Douglas Creager
1f46c18921 [ty] More constraint set simplifications via simpler constraint representation (#20423)
Previously, we used a very fine-grained representation for individual
constraints: each constraint was _either_ a range constraint, a
not-equivalent constraint, or an incomparable constraint. These three
pieces are enough to represent all of the "real" constraints we need to
create — range constraints and their negation.

However, it meant that we weren't picking up as many chances to simplify
constraint sets as we could. Our simplification logic depends on being
able to look at _pairs_ of constraints or clauses to see if they
simplify relative to each other. With our fine-grained representation,
we could easily encounter situations that we should have been able to
simplify, but that would require looking at three or more individual
constraints.

For instance, negating a range constraint would produce:

```
¬(Base ≤ T ≤ Super) = ((T ≤ Base) ∧ (T ≠ Base)) ∨ (T ≁ Base) ∨
                      ((Super ≤ T) ∧ (T ≠ Super)) ∨ (T ≁ Super)
```

That is, `T` must be (strictly) less than `Base`, (strictly) greater
than `Super`, or incomparable to either.

If we tried to union those back together, we should get `always`, since
`x ∨ ¬x` should always be true, no matter what `x` is. But instead we
would get:

```
(Base ≤ T ≤ Super) ∨ ((T ≤ Base) ∧ (T ≠ Base)) ∨ (T ≁ Base) ∨ ((Super ≤ T) ∧ (T ≠
 Super)) ∨ (T ≁ Super)
```

Nothing would simplify relative to each other, because we'd have to look
at all five union elements to see that together they do in fact combine
to `always`.

The fine-grained representation was nice, because it made it easier to
[work out the math](https://dcreager.net/theory/constraints/) for
intersections and unions of each kind of constraint. But being able to
simplify is more important, since the example above comes up immediately
in #20093 when trying to handle constrained typevars.

The fix in this PR is to go back to a more coarse-grained
representation, where each individual constraint consists of a positive
range (which might be `always` / `Never ≤ T ≤ object`), and zero or more
negative ranges. The intuition is to think of a constraint as a region
of the type space (representable as a range) with zero or more "holes"
removed from it.

With this representation, negating a range constraint produces:

```
¬(Base ≤ T ≤ Super) = (always ∧ ¬(Base ≤ T ≤ Super))
```

(That looks trivial, because it is! We just move the positive range to
the negative side.)

The math is not that much harder than before, because there are only
three combinations to consider (each for intersection and union) —
though the fact that there can be multiple holes in a constraint does
require some nested loops. But the mdtest suite gives me confidence that
this is not introducing any new issues, and it definitely removes a
troublesome TODO.

(As an aside, this change also means that we are back to having each
clause contain no more than one individual constraint for any typevar.
This turned out to be important, because part of our simplification
logic was also depending on that!)
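The `x ∨ ¬x = always` identity from the example can be sanity-checked with a toy model over a finite universe (Python sets standing in for regions of the type space; purely illustrative):

```py
# Toy model: each "type" is an integer; a range constraint is a set.
universe = set(range(10))   # stand-in for Never ≤ T ≤ object ("always")
rng = set(range(3, 7))      # stand-in for Base ≤ T ≤ Super

# Coarse-grained negation: the positive range moves to the negative
# side, i.e. ¬(Base ≤ T ≤ Super) = always minus the range.
negation = universe - rng

# x ∨ ¬x simplifies back to always.
assert rng | negation == universe
```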

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-09-16 10:05:01 -04:00
Takayuki Maeda
0d424d8e78 [ruff] Add analyze.string-imports-min-dots to settings documentation (#20375) 2025-09-16 13:19:34 +02:00
Alex Waygood
e6b321eccc Disable flamegraph uploads for walltime benchmarks (#20419) 2025-09-16 09:08:22 +00:00
David Peter
2a6dde4acb [ty] Remove Self from generic context when binding Self (#20364)
## Summary

This mainly removes an internal inconsistency, where we didn't remove
the `Self` type variable when eagerly binding `Self` to an instance
type. It has no observable effect, apparently.

builds on top of https://github.com/astral-sh/ruff/pull/20328

## Test Plan

None
2025-09-15 17:40:10 +02:00
David Peter
25cbf38a47 [ty] Patch Self for fallback-methods on NamedTuples and TypedDicts (#20328)
## Summary

We use classes like
[`_typeshed._type_checker_internals.NamedTupleFallback`](d9c76e1d9f/stdlib/_typeshed/_type_checker_internals.pyi (L54-L75))
to tack on additional attributes/methods to instances of user-defined
`NamedTuple`s (or `TypedDict`s), even though these classes are not
present in the MRO of those types.

The problem is that those classes use implicit and explicit `Self`
annotations which refer to `NamedTupleFallback` itself, instead of to
the actual type that we're adding those methods to:
```py
class NamedTupleFallback(tuple[Any, ...]):
    # […]
    def _replace(self, **kwargs: Any) -> typing_extensions.Self: ...
```

In effect, when we access `_replace` on an instance of a custom
`NamedTuple` instance, its `self` parameter and return type refer to the
wrong `Self`. This leads to incorrect *"Argument to bound method
`_replace` is incorrect: Argument type `Person` does not satisfy upper
bound `NamedTupleFallback` of type variable `Self`"* errors on #18007.
It would also lead to similar errors on `TypedDict`s, if they would
already implement assignability properly.


## Test Plan

I applied the following patch to typeshed and verified that no errors
appear anymore.

<details>

```diff
diff --git a/crates/ty_vendored/vendor/typeshed/stdlib/_typeshed/_type_checker_internals.pyi b/crates/ty_vendored/vendor/typeshed/stdlib/_typeshed/_type_checker_internals.pyi
index feb22aae00..8e41034f19 100644
--- a/crates/ty_vendored/vendor/typeshed/stdlib/_typeshed/_type_checker_internals.pyi
+++ b/crates/ty_vendored/vendor/typeshed/stdlib/_typeshed/_type_checker_internals.pyi
@@ -29,27 +29,27 @@ class TypedDictFallback(Mapping[str, object], metaclass=ABCMeta):
         __readonly_keys__: ClassVar[frozenset[str]]
         __mutable_keys__: ClassVar[frozenset[str]]
 
-    def copy(self) -> typing_extensions.Self: ...
+    def copy(self: typing_extensions.Self) -> typing_extensions.Self: ...
     # Using Never so that only calls using mypy plugin hook that specialize the signature
     # can go through.
-    def setdefault(self, k: Never, default: object) -> object: ...
+    def setdefault(self: typing_extensions.Self, k: Never, default: object) -> object: ...
     # Mypy plugin hook for 'pop' expects that 'default' has a type variable type.
-    def pop(self, k: Never, default: _T = ...) -> object: ...  # pyright: ignore[reportInvalidTypeVarUse]
-    def update(self, m: typing_extensions.Self, /) -> None: ...
-    def __delitem__(self, k: Never) -> None: ...
-    def items(self) -> dict_items[str, object]: ...
-    def keys(self) -> dict_keys[str, object]: ...
-    def values(self) -> dict_values[str, object]: ...
+    def pop(self: typing_extensions.Self, k: Never, default: _T = ...) -> object: ...  # pyright: ignore[reportInvalidTypeVarUse]
+    def update(self: typing_extensions.Self, m: typing_extensions.Self, /) -> None: ...
+    def __delitem__(self: typing_extensions.Self, k: Never) -> None: ...
+    def items(self: typing_extensions.Self) -> dict_items[str, object]: ...
+    def keys(self: typing_extensions.Self) -> dict_keys[str, object]: ...
+    def values(self: typing_extensions.Self) -> dict_values[str, object]: ...
     @overload
-    def __or__(self, value: typing_extensions.Self, /) -> typing_extensions.Self: ...
+    def __or__(self: typing_extensions.Self, value: typing_extensions.Self, /) -> typing_extensions.Self: ...
     @overload
-    def __or__(self, value: dict[str, Any], /) -> dict[str, object]: ...
+    def __or__(self: typing_extensions.Self, value: dict[str, Any], /) -> dict[str, object]: ...
     @overload
-    def __ror__(self, value: typing_extensions.Self, /) -> typing_extensions.Self: ...
+    def __ror__(self: typing_extensions.Self, value: typing_extensions.Self, /) -> typing_extensions.Self: ...
     @overload
-    def __ror__(self, value: dict[str, Any], /) -> dict[str, object]: ...
+    def __ror__(self: typing_extensions.Self, value: dict[str, Any], /) -> dict[str, object]: ...
     # supposedly incompatible definitions of __or__ and __ior__
-    def __ior__(self, value: typing_extensions.Self, /) -> typing_extensions.Self: ...  # type: ignore[misc]
+    def __ior__(self: typing_extensions.Self, value: typing_extensions.Self, /) -> typing_extensions.Self: ...  # type: ignore[misc]
 
 # Fallback type providing methods and attributes that appear on all `NamedTuple` types.
 class NamedTupleFallback(tuple[Any, ...]):
@@ -61,18 +61,18 @@ class NamedTupleFallback(tuple[Any, ...]):
         __orig_bases__: ClassVar[tuple[Any, ...]]
 
     @overload
-    def __init__(self, typename: str, fields: Iterable[tuple[str, Any]], /) -> None: ...
+    def __init__(self: typing_extensions.Self, typename: str, fields: Iterable[tuple[str, Any]], /) -> None: ...
     @overload
     @typing_extensions.deprecated(
         "Creating a typing.NamedTuple using keyword arguments is deprecated and support will be removed in Python 3.15"
     )
-    def __init__(self, typename: str, fields: None = None, /, **kwargs: Any) -> None: ...
+    def __init__(self: typing_extensions.Self, typename: str, fields: None = None, /, **kwargs: Any) -> None: ...
     @classmethod
     def _make(cls, iterable: Iterable[Any]) -> typing_extensions.Self: ...
-    def _asdict(self) -> dict[str, Any]: ...
-    def _replace(self, **kwargs: Any) -> typing_extensions.Self: ...
+    def _asdict(self: typing_extensions.Self) -> dict[str, Any]: ...
+    def _replace(self: typing_extensions.Self, **kwargs: Any) -> typing_extensions.Self: ...
     if sys.version_info >= (3, 13):
-        def __replace__(self, **kwargs: Any) -> typing_extensions.Self: ...
+        def __replace__(self: typing_extensions.Self, **kwargs: Any) -> typing_extensions.Self: ...
 
 # Non-default variations to accommodate couroutines, and `AwaitableGenerator` having a 4th type parameter.
 _S = TypeVar("_S")
```

</details>
2025-09-15 16:21:53 +02:00
Salaheddine EL FARISSI
9a9ebc316c [docs] Update README.md with Albumentations new repository URL (#20415)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-09-15 13:59:43 +00:00
Alex Waygood
e8b4450125 [ty] Remove unneeded disjoint-base special casing for builtins.tuple (#20414) 2025-09-15 13:12:20 +01:00
renovate[bot]
eb71536dce Update dependency monaco-editor to ^0.53.0 (#20395)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-09-15 13:47:51 +02:00
Alex Waygood
8341da7f63 [ty] Allow annotation expressions to be ast::Attribute nodes (#20413)
Fixes https://github.com/astral-sh/ty/issues/1187
2025-09-15 12:06:48 +01:00
renovate[bot]
1f1365a0fa Update dependency vite to v7.0.7 (#20323)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-09-15 11:57:06 +02:00
renovate[bot]
25c13ea91c Update CodSpeedHQ/action action to v4 (#20409)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-09-15 11:44:52 +02:00
renovate[bot]
6581f9bf2a Update actions/github-script action to v8 (#20406)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-09-15 11:28:18 +02:00
renovate[bot]
c4b0d10438 Update actions/download-artifact action to v5 (#20405)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-09-15 11:26:47 +02:00
renovate[bot]
3adb478c6b Update actions/setup-python action to v6 (#20408)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-09-15 11:25:04 +02:00
renovate[bot]
ac8ac2c677 Update actions/setup-node action to v5 (#20407)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-09-15 11:24:05 +02:00
renovate[bot]
2c8aa6e9e3 Update dependency node to v22 (#20410)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-09-15 11:23:37 +02:00
Takayuki Maeda
093fa72656 [ty] Include NamedTupleFallback members in NamedTuple instance completions (#20356)
## Summary

Fixes https://github.com/astral-sh/ty/issues/1161

Include `NamedTupleFallback` members in `NamedTuple` instance
completions.

- Augment instance attribute completions when completing on NamedTuple
instances by merging members from
`_typeshed._type_checker_internals.NamedTupleFallback`

## Test Plan

Adds a minimal completion test `namedtuple_fallback_instance_methods`

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-09-15 11:00:03 +02:00
David Peter
02c58f1beb [ty] Remove 'materialize' from the ecosystem projects (#20412)
## Summary

This project was [recently removed from
mypy_primer](https://github.com/astral-sh/ruff/pull/20378), so we need
to remove it from `good.txt` in order for ecosystem-analyzer to work
correctly.

## Test Plan

Run mypy_primer and ecosystem-analyzer on this branch.
2025-09-15 10:42:35 +02:00
github-actions[bot]
276ee1bb1e [ty] Sync vendored typeshed stubs (#20394)
Close and reopen this PR to trigger CI

---------

Co-authored-by: typeshedbot <>
Co-authored-by: David Peter <mail@david-peter.de>
2025-09-15 09:30:28 +02:00
renovate[bot]
9e4acd8bdd Update actions/checkout action to v5 (#20404) 2025-09-14 22:51:05 -04:00
renovate[bot]
e061c39119 Update taiki-e/install-action action to v2.61.3 (#20403) 2025-09-14 22:50:58 -04:00
renovate[bot]
9299bb42ca Update Rust crate ctrlc to v3.5.0 (#20398) 2025-09-14 22:50:52 -04:00
renovate[bot]
876d800205 Update Rust crate rayon to v1.11.0 (#20400) 2025-09-15 01:54:11 +00:00
renovate[bot]
89d83282b9 Update Rust crate tempfile to v3.22.0 (#20401) 2025-09-15 01:53:05 +00:00
renovate[bot]
97ebb1492f Update Rust crate uuid to v1.18.1 (#20402) 2025-09-15 01:51:31 +00:00
renovate[bot]
afc207ec0f Update Rust crate camino to v1.2.0 (#20397) 2025-09-15 01:50:11 +00:00
renovate[bot]
326c878adb Update dependency ruff to v0.13.0 (#20396) 2025-09-14 21:44:23 -04:00
renovate[bot]
b9a96535bc Update astral-sh/setup-uv action to v6.7.0 (#20389) 2025-09-14 21:29:47 -04:00
renovate[bot]
59330be95d Update Rust crate serde_json to v1.0.145 (#20387) 2025-09-14 21:29:41 -04:00
renovate[bot]
963974146d Update Rust crate serde to v1.0.223 (#20386) 2025-09-14 21:29:36 -04:00
renovate[bot]
4dc28a483e Update Rust crate pyproject-toml to v0.13.6 (#20385) 2025-09-14 21:29:29 -04:00
renovate[bot]
850eefc0c4 Update Rust crate ordermap to v0.5.10 (#20384) 2025-09-14 21:29:16 -04:00
renovate[bot]
fc3e571804 Update Rust crate indexmap to v2.11.1 (#20383) 2025-09-14 21:29:10 -04:00
renovate[bot]
0a36b1e4d7 Update cargo-bins/cargo-binstall action to v1.15.5 (#20382) 2025-09-14 21:28:58 -04:00
renovate[bot]
72e6698550 Update Rust crate unicode-ident to v1.0.19 (#20388) 2025-09-14 21:27:18 -04:00
Alex Waygood
9edbeb44dd bump mypy_primer pin (#20378)
Pulls in
06849fda40
2025-09-14 19:16:35 +01:00
William Woodruff
1fa64a24b8 Revert "[ruff]: Build loongarch64 binaries in CI (#20361)" (#20372) 2025-09-12 17:21:04 -04:00
Alex Waygood
1745554809 [ty] Temporary hack to reduce false positives around builtins.open() (#20367)
## Summary

https://github.com/astral-sh/ruff/pull/20165 added a lot of false
positives around calls to `builtins.open()`, because our missing support
for PEP-613 type aliases means that we don't understand typeshed's
overloads for `builtins.open()` at all yet, and therefore always select
the first overload. This didn't use to matter very much, but now that we
have a much stricter implementation of protocol assignability/subtyping
it matters a lot, because most of the stdlib functions dealing with I/O
(`pickle`, `marshal`, `io`, `json`, etc.) are annotated in typeshed as
taking in protocols of some kind.

In lieu of full PEP-613 support, which is blocked on various things and
might not land in time for our next alpha release, this PR adds some
temporary special-casing for `builtins.open()` to avoid the false
positives. We just infer `Todo` for anything that isn't meant to match
typeshed's first `open()` overload. This should be easy to rip out again
once we have proper support for PEP-613 type aliases, which hopefully
should be pretty soon!
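The affected pattern is ubiquitous stdlib I/O code like the following (illustrative; these stdlib functions accept protocol-typed arguments, so selecting the wrong `open()` overload turned such calls into false positives):

```py
import json
import os
import tempfile
from pathlib import Path

fd, name = tempfile.mkstemp(suffix=".json")
os.close(fd)
Path(name).write_text('{"k": 1}')

# json.load takes a read-protocol argument; the file object returned by
# open() must be understood via the correct overload for this to check.
with open(name) as f:
    data = json.load(f)

os.unlink(name)
```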

## Test Plan

Added an mdtest
2025-09-12 22:20:38 +01:00
Alex Waygood
98708976e4 [ty] Fix subtyping/assignability of function- and class-literal types to callback protocols (#20363)
## Summary

Fixes https://github.com/astral-sh/ty/issues/377.

We were treating any function as being assignable to any callback
protocol, because we were trying to figure out a type's `Callable`
supertype by looking up the `__call__` attribute on the type's
meta-type. But a function-literal's meta-type is `types.FunctionType`,
and `types.FunctionType.__call__` is `(...) -> Any`, which is not very
helpful!

While working on this PR, I also realised that assignability between
class-literals and callback protocols was somewhat broken too, so I
fixed that at the same time.
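A minimal callback-protocol case of the kind this fixes (illustrative):

```py
from typing import Protocol

class IntToStr(Protocol):
    def __call__(self, x: int) -> str: ...

def matches(x: int) -> str:
    return str(x)

def does_not_match(x: str) -> str:
    return x

f: IntToStr = matches           # OK
# g: IntToStr = does_not_match  # should be rejected; previously any
#                               # function was accepted here
result = f(3)
```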

## Test Plan

Added mdtests
2025-09-12 22:20:09 +01:00
Igor Drokin
c7f6b85fb3 [ruff] Allow dataclass attribute value instantiation from nested frozen dataclass (RUF009) (#20352)
## Summary
Resolves #20266

Definition of the frozen dataclass attribute can be instantiation of a
nested frozen dataclass as well as a non-nested one.

### Problem explanation
The `function_call_in_dataclass_default` function is invoked during the
"defined scope" stage, after all scopes have been processed. At this
point, the semantic references the top-level scope. When
`SemanticModel::lookup_attribute` executes, it searches for bindings in
the top-level module scope rather than the class scope, resulting in an
error.

To solve this issue, the lookup should be evaluated through the class
scope.
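For illustration, this is the kind of code the fix is meant to accept (a sketch, not the exact test case from the issue):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Outer:
    @dataclass(frozen=True)
    class Inner:
        value: int = 0

    # The default below instantiates a frozen dataclass defined in the
    # enclosing class scope; resolving `Inner` requires looking through
    # `Outer`'s class scope, not the module scope.
    inner: Inner = Inner()
```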

## Test Plan
- Added test case from issue

Co-authored-by: Igor Drokin <drokinii1017@gmail.com>
2025-09-12 16:46:49 -04:00
Takayuki Maeda
151ba49b36 [pyupgrade] Prevent infinite loop with I002 and UP026 (#20327)
## Summary

Fixes #19842

Prevent infinite loop with I002 and UP026

- Implement isort-aware handling for UP026 (deprecated mock import)
- Add CLI integration tests in `crates/ruff/tests/lint.rs`

## Test Plan

I have added two integration tests
`pyupgrade_up026_respects_isort_required_import_fix` and
`pyupgrade_up026_respects_isort_required_import_from_fix` in
`crates/ruff/tests/lint.rs`.
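The loop arises because the two fixes undo each other; a sketch of the scenario (the configuration shown in comments is illustrative):

```python
# With isort configured as `required-imports = ["import mock"]`, I002 keeps
# inserting `import mock`, while UP026's fix rewrites it to the stdlib form
# below -- each fix reverting the other, hence the infinite loop.
from unittest import mock  # the form UP026 rewrites `import mock` into

m = mock.MagicMock()
m.process(42)
m.process.assert_called_once_with(42)
```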
2025-09-12 15:45:26 -05:00
Carl Meyer
82796df9b5 [ty] add cycle handling to BoundMethodType::into_callable_type() (#20369)
## Summary

This looks like it should fix the errors that we've been seeing in sympy
in recent mypy-primer runs.

## Test Plan

I wasn't able to reproduce the sympy failures locally; it looks like
there is probably a dependency on the order in which files are checked.
So I don't have a minimal reproducible example, and wasn't able to add a
test :/ Obviously I would be happier if we could commit a regression
test here, but since the change is straightforward and clearly
desirable, I'm not sure how many hours it's worth trying to track it
down.

Mypy-primer is still failing in CI on this PR, because it fails on the
"old" ty commit already (i.e. on main). But it passes [on a no-op PR
stacked on top of this](https://github.com/astral-sh/ruff/pull/20370),
which strongly suggests this PR fixes the problem.
2025-09-12 13:39:38 -07:00
172 changed files with 7174 additions and 2228 deletions

View File

@@ -39,11 +39,11 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: "Prep README.md"
@@ -68,11 +68,11 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
runs-on: macos-14
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: x64
@@ -110,11 +110,11 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
runs-on: macos-14
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: arm64
@@ -166,11 +166,11 @@ jobs:
- target: aarch64-pc-windows-msvc
arch: x64
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: ${{ matrix.platform.arch }}
@@ -219,11 +219,11 @@ jobs:
- x86_64-unknown-linux-gnu
- i686-unknown-linux-gnu
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: x64
@@ -294,18 +294,13 @@ jobs:
arch: arm
- target: riscv64gc-unknown-linux-gnu
arch: riscv64
- target: loongarch64-unknown-linux-gnu
arch: loong64
# There's currently no loong64 support for Ubuntu so we are using Debian
base_image: --platform=linux/loong64 ghcr.io/loong64/debian:trixie
maturin_docker_options: -e JEMALLOC_SYS_WITH_LG_PAGE=16
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: "Prep README.md"
@@ -323,15 +318,12 @@ jobs:
with:
arch: ${{ matrix.platform.arch == 'arm' && 'armv6' || matrix.platform.arch }}
distro: ${{ matrix.platform.arch == 'arm' && 'bullseye' || 'ubuntu20.04' }}
base_image: ${{ matrix.platform.base_image }}
githubToken: ${{ github.token }}
install: |
apt-get update
apt-get install -y --no-install-recommends python3 python3-pip python3-venv libatomic1
run: |
python3 -m venv .venv
source .venv/bin/activate
apt-get install -y --no-install-recommends python3 python3-pip libatomic1
pip3 install -U pip
run: |
pip3 install ${{ env.PACKAGE_NAME }} --no-index --find-links dist/ --force-reinstall
ruff --help
- name: "Upload wheels"
@@ -369,11 +361,11 @@ jobs:
- x86_64-unknown-linux-musl
- i686-unknown-linux-musl
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: x64
@@ -435,11 +427,11 @@ jobs:
arch: armv7
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: "Prep README.md"

View File

@@ -33,7 +33,7 @@ jobs:
- linux/amd64
- linux/arm64
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: recursive
persist-credentials: false
@@ -113,7 +113,7 @@ jobs:
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
steps:
- name: Download digests
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
path: /tmp/digests
pattern: digests-*
@@ -256,7 +256,7 @@ jobs:
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
steps:
- name: Download digests
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
path: /tmp/digests
pattern: digests-*

View File

@@ -43,7 +43,7 @@ jobs:
# Flag that is set to "true" when code related to the playground changes.
playground: ${{ steps.check_playground.outputs.changed }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
@@ -209,7 +209,7 @@ jobs:
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Install Rust toolchain"
@@ -223,7 +223,7 @@ jobs:
if: ${{ needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main' }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -243,7 +243,7 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -252,15 +252,15 @@ jobs:
- name: "Install mold"
uses: rui314/setup-mold@725a8794d15fc7563f59595bd9556495c0564878 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@6064345e6658255e90e9500fdf9a06ab77e6909c # v2.57.6
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@6064345e6658255e90e9500fdf9a06ab77e6909c # v2.57.6
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
with:
tool: cargo-insta
- name: "Install uv"
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
with:
enable-cache: "true"
- name: ty mdtests (GitHub annotations)
@@ -305,7 +305,7 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -314,15 +314,15 @@ jobs:
- name: "Install mold"
uses: rui314/setup-mold@725a8794d15fc7563f59595bd9556495c0564878 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@6064345e6658255e90e9500fdf9a06ab77e6909c # v2.57.6
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@6064345e6658255e90e9500fdf9a06ab77e6909c # v2.57.6
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
with:
tool: cargo-insta
- name: "Install uv"
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
with:
enable-cache: "true"
- name: "Run tests"
@@ -338,18 +338,18 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@6064345e6658255e90e9500fdf9a06ab77e6909c # v2.57.6
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
with:
tool: cargo-nextest
- name: "Install uv"
uses: astral-sh/setup-uv@e92bafb6253dcd438e0484186d7669ea7a8ca1cc # v6.4.3
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
with:
enable-cache: "true"
- name: "Run tests"
@@ -369,15 +369,15 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 10
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 20
node-version: 22
cache: "npm"
cache-dependency-path: playground/package-lock.json
- uses: jetli/wasm-pack-action@0d096b08b4e5a7de8c28de67e11e945404e9eefa # v0.4.0
@@ -398,7 +398,7 @@ jobs:
if: ${{ github.ref == 'refs/heads/main' }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -416,7 +416,7 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: SebRollen/toml-action@b1b3628f55fc3a28208d4203ada8b737e9687876 # v1.2.0
@@ -444,7 +444,7 @@ jobs:
if: ${{ github.ref == 'refs/heads/main' || needs.determine_changes.outputs.fuzz == 'true' || needs.determine_changes.outputs.code == 'true' }}
timeout-minutes: 10
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -453,7 +453,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo-binstall"
uses: cargo-bins/cargo-binstall@837578dfb436769f1e6669b2e23ffea9d9d2da8f # v1.15.4
uses: cargo-bins/cargo-binstall@20aa316bab4942180bbbabe93237858e8d77f1ed # v1.15.5
- name: "Install cargo-fuzz"
# Download the latest version from quick install and not the github releases because github releases only has MUSL targets.
run: cargo binstall cargo-fuzz --force --disable-strategies crate-meta-data --no-confirm
@@ -470,11 +470,11 @@ jobs:
env:
FORCE_COLOR: 1
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
name: Download Ruff binary to test
id: download-cached-binary
with:
@@ -504,7 +504,7 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 5
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -534,14 +534,14 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && github.event_name == 'pull_request' && needs.determine_changes.outputs.code == 'true' }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
name: Download comparison Ruff binary
id: ruff-target
with:
@@ -658,10 +658,10 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && github.event_name == 'pull_request' && (needs.determine_changes.outputs.ty == 'true' || needs.determine_changes.outputs.py-fuzzer == 'true') }}
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
name: Download new ty binary
id: ty-new
with:
@@ -674,7 +674,7 @@ jobs:
branch: ${{ github.event.pull_request.base.ref }}
workflow: "ci.yaml"
check_artifacts: true
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: Fuzz
env:
FORCE_COLOR: 1
@@ -701,10 +701,10 @@ jobs:
needs: determine_changes
if: ${{ needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main' }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: cargo-bins/cargo-binstall@837578dfb436769f1e6669b2e23ffea9d9d2da8f # v1.15.4
- uses: cargo-bins/cargo-binstall@20aa316bab4942180bbbabe93237858e8d77f1ed # v1.15.5
- run: cargo binstall --no-confirm cargo-shear
- run: cargo shear
@@ -714,10 +714,10 @@ jobs:
timeout-minutes: 20
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: x64
@@ -741,12 +741,12 @@ jobs:
runs-on: depot-ubuntu-22.04-16
timeout-minutes: 10
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 22
- name: "Cache pre-commit"
@@ -772,10 +772,10 @@ jobs:
env:
MKDOCS_INSIDERS_SSH_KEY_EXISTS: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY != '' }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: "3.13"
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -787,7 +787,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: Install uv
uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uv pip install -r docs/requirements-insiders.txt --system
@@ -814,7 +814,7 @@ jobs:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.formatter == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 10
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
@@ -840,18 +840,18 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
name: "Download ruff-lsp source"
with:
persist-credentials: false
repository: "astral-sh/ruff-lsp"
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
# installation fails on 3.13 and newer
python-version: "3.12"
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
name: Download development ruff binary
id: ruff-target
with:
@@ -882,13 +882,13 @@ jobs:
- determine_changes
if: ${{ (needs.determine_changes.outputs.playground == 'true') }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 22
cache: "npm"
@@ -914,18 +914,18 @@ jobs:
timeout-minutes: 20
steps:
- name: "Checkout Branch"
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@6064345e6658255e90e9500fdf9a06ab77e6909c # v2.57.6
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
with:
tool: cargo-codspeed
@@ -933,8 +933,9 @@ jobs:
run: cargo codspeed build --features "codspeed,instrumented" --no-default-features -p ruff_benchmark
- name: "Run benchmarks"
uses: CodSpeedHQ/action@76578c2a7ddd928664caa737f0e962e3085d4e7c # v3.8.1
uses: CodSpeedHQ/action@653fdc30e6c40ffd9739e40c8a0576f4f4523ca1 # v4.0.1
with:
mode: instrumentation
run: cargo codspeed run
token: ${{ secrets.CODSPEED_TOKEN }}
@@ -947,18 +948,18 @@ jobs:
TY_LOG: ruff_benchmark=debug
steps:
- name: "Checkout Branch"
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@6064345e6658255e90e9500fdf9a06ab77e6909c # v2.57.6
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
with:
tool: cargo-codspeed
@@ -966,7 +967,13 @@ jobs:
run: cargo codspeed build --features "codspeed,walltime" --no-default-features -p ruff_benchmark
- name: "Run benchmarks"
uses: CodSpeedHQ/action@76578c2a7ddd928664caa737f0e962e3085d4e7c # v3.8.1
uses: CodSpeedHQ/action@653fdc30e6c40ffd9739e40c8a0576f4f4523ca1 # v4.0.1
env:
# enabling walltime flamegraphs adds ~6 minutes to the CI time, and they don't
# appear to provide much useful insight for our walltime benchmarks right now
# (see https://github.com/astral-sh/ruff/pull/20419)
CODSPEED_PERF_ENABLED: false
with:
mode: walltime
run: cargo codspeed run
token: ${{ secrets.CODSPEED_TOKEN }}

View File

@@ -31,10 +31,10 @@ jobs:
# Don't run the cron job on forks:
if: ${{ github.repository == 'astral-sh/ruff' || github.event_name != 'schedule' }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
@@ -65,7 +65,7 @@ jobs:
permissions:
issues: write
steps:
- uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
- uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |

View File

@@ -32,14 +32,14 @@ jobs:
runs-on: depot-ubuntu-22.04-32
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
@@ -75,14 +75,14 @@ jobs:
runs-on: depot-ubuntu-22.04-32
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:

View File

@@ -17,7 +17,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: "Update pre-commit mirror"
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
github-token: ${{ secrets.RUFF_PRE_COMMIT_PAT }}
script: |

View File

@@ -23,12 +23,12 @@ jobs:
env:
MKDOCS_INSIDERS_SSH_KEY_EXISTS: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY != '' }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ inputs.ref }}
persist-credentials: true
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12

View File

@@ -24,12 +24,12 @@ jobs:
env:
CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 22
cache: "npm"

View File

@@ -22,13 +22,11 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
pattern: wheels-*
path: wheels
merge-multiple: true
- name: Remove wheels unsupported by PyPI
run: rm wheels/*loong*
- name: Publish to PyPi
run: uv publish -v wheels/*

View File

@@ -30,12 +30,12 @@ jobs:
env:
CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 22
- uses: jetli/wasm-bindgen-action@20b33e20595891ab1a0ed73145d8a21fc96e7c29 # v0.2.0

View File

@@ -29,7 +29,7 @@ jobs:
target: [web, bundler, nodejs]
fail-fast: false
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Install Rust toolchain"
@@ -45,9 +45,9 @@ jobs:
jq '.name="@astral-sh/ruff-wasm-${{ matrix.target }}"' crates/ruff_wasm/pkg/package.json > /tmp/package.json
mv /tmp/package.json crates/ruff_wasm/pkg
- run: cp LICENSE crates/ruff_wasm/pkg # wasm-pack does not put the LICENSE file in the pkg
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 20
node-version: 22
registry-url: "https://registry.npmjs.org"
- name: "Publish (dry-run)"
if: ${{ inputs.plan == '' || fromJson(inputs.plan).announcement_tag_is_implicit }}

View File

@@ -50,12 +50,12 @@ jobs:
permissions:
contents: write
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
name: Checkout Ruff
with:
path: ruff
persist-credentials: true
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
name: Checkout typeshed
with:
repository: python/typeshed
@@ -65,7 +65,7 @@ jobs:
run: |
git config --global user.name typeshedbot
git config --global user.email '<>'
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: Sync typeshed stubs
run: |
rm -rf "ruff/${VENDORED_TYPESHED}"
@@ -112,12 +112,12 @@ jobs:
permissions:
contents: write
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
name: Checkout Ruff
with:
persist-credentials: true
ref: ${{ env.UPSTREAM_BRANCH}}
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: Setup git
run: |
git config --global user.name typeshedbot
@@ -150,12 +150,12 @@ jobs:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
name: Checkout Ruff
with:
persist-credentials: true
ref: ${{ env.UPSTREAM_BRANCH}}
- uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- name: Setup git
run: |
git config --global user.name typeshedbot
@@ -192,7 +192,7 @@ jobs:
permissions:
issues: write
steps:
- uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
- uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |


@@ -26,14 +26,14 @@ jobs:
timeout-minutes: 20
if: contains(github.event.label.name, 'ecosystem-analyzer')
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
@@ -64,7 +64,7 @@ jobs:
cd ..
uv tool install "git+https://github.com/astral-sh/ecosystem-analyzer@1f560d07d672effae250e3d271da53d96c5260ff"
uv tool install "git+https://github.com/astral-sh/ecosystem-analyzer@fc0f612798710b0dd69bb7528bc9b361dc60bd43"
ecosystem-analyzer \
--repository ruff \


@@ -22,14 +22,14 @@ jobs:
runs-on: depot-ubuntu-22.04-32
timeout-minutes: 20
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@557e51de59eb14aaaba2ed9621916900a91d50c6 # v6.6.1
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:


@@ -32,13 +32,13 @@ jobs:
runs-on: depot-ubuntu-22.04-32
timeout-minutes: 10
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: ruff
fetch-depth: 0
persist-credentials: false
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
repository: python/typing
ref: ${{ env.CONFORMANCE_SUITE_COMMIT }}

Cargo.lock

@@ -322,11 +322,11 @@ dependencies = [
[[package]]
name = "camino"
version = "1.1.12"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dd0b03af37dad7a14518b7691d81acb0f8222604ad3d1b02f6b4bed5188c0cd5"
checksum = "e1de8bc0aa9e9385ceb3bf0c152e3a9b9544f6c4a912c8ae504e80c1f0368603"
dependencies = [
"serde",
"serde_core",
]
[[package]]
@@ -376,7 +376,7 @@ dependencies = [
"android-tzdata",
"iana-time-zone",
"num-traits",
"windows-link",
"windows-link 0.1.3",
]
[[package]]
@@ -603,7 +603,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "117725a109d387c937a1533ce01b450cbde6b88abceea8473c4d7a85853cda3c"
dependencies = [
"lazy_static",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -612,7 +612,7 @@ version = "3.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fde0e0ec90c9dfb3b4b1a0891a7dcd0e2bffde2f7efed5fe7c9bb00e5bfb915e"
dependencies = [
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -826,12 +826,13 @@ dependencies = [
[[package]]
name = "ctrlc"
version = "3.4.7"
version = "3.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "46f93780a459b7d656ef7f071fe699c4d3d2cb201c4b24d085b6ddc505276e73"
checksum = "881c5d0a13b2f1498e2306e82cbada78390e152d4b1378fb28a84f4dcd0dc4f3"
dependencies = [
"dispatch",
"nix 0.30.1",
"windows-sys 0.59.0",
"windows-sys 0.61.0",
]
[[package]]
@@ -958,6 +959,12 @@ dependencies = [
"windows-sys 0.60.2",
]
[[package]]
name = "dispatch"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bd0c93bb4b0c6d9b77f4435b0ae98c24d17f1c45b2ff844c6151a07256ca923b"
[[package]]
name = "displaydoc"
version = "0.2.5"
@@ -1035,7 +1042,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "778e2ac28f6c47af28e4907f13ffd1e1ddbd400980a9abd7c8df189bf578a5ad"
dependencies = [
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -1486,9 +1493,9 @@ dependencies = [
[[package]]
name = "indexmap"
version = "2.11.0"
version = "2.11.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f2481980430f9f78649238835720ddccc57e52df14ffce1c6f37391d61b563e9"
checksum = "206a8042aec68fa4a62e8d3f7aa4ceb508177d9324faf261e1959e495b7a1921"
dependencies = [
"equivalent",
"hashbrown 0.15.5",
@@ -1617,7 +1624,7 @@ checksum = "e04d7f318608d35d4b61ddd75cbdaee86b023ebe2bd5a66ee0915f0bf93095a9"
dependencies = [
"hermit-abi",
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -1681,7 +1688,7 @@ dependencies = [
"portable-atomic",
"portable-atomic-util",
"serde",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -2126,9 +2133,9 @@ checksum = "04744f49eae99ab78e0d5c0b603ab218f515ea8cfe5a456d7629ad883a3b6e7d"
[[package]]
name = "ordermap"
version = "0.5.9"
version = "0.5.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2fd6fedcd996c8c97932075cc3811d83f53280f48d5620e4e3cab7f6a12678c4"
checksum = "0dcd63f1ae4b091e314a26627c467dd8810d674ba798abc0e566679955776c63"
dependencies = [
"indexmap",
"serde",
@@ -2475,16 +2482,16 @@ dependencies = [
[[package]]
name = "pyproject-toml"
version = "0.13.5"
version = "0.13.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7b0f6160dc48298b9260d9b958ad1d7f96f6cd0b9df200b22329204e09334663"
checksum = "ec768e063102b426e8962989758115e8659485124de9207bc365fab524125d65"
dependencies = [
"indexmap",
"pep440_rs",
"pep508_rs",
"serde",
"thiserror 2.0.16",
"toml 0.8.23",
"toml",
]
[[package]]
@@ -2635,9 +2642,9 @@ dependencies = [
[[package]]
name = "rayon"
version = "1.10.0"
version = "1.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b418a60154510ca1a002a752ca9714984e21e4241e804d32555251faf8b78ffa"
checksum = "368f01d005bf8fd9b1206fb6fa653e6c4a81ceb1466406b81792d87c5677a58f"
dependencies = [
"either",
"rayon-core",
@@ -2645,9 +2652,9 @@ dependencies = [
[[package]]
name = "rayon-core"
version = "1.12.1"
version = "1.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1465873a3dfdaa8ae7cb14b4383657caab0b3e8a0aa9ae8e04b044854c8dfce2"
checksum = "22e18b0f0062d30d4230b2e85ff77fdfe4326feb054b9783a3460d8435c8ab91"
dependencies = [
"crossbeam-deque",
"crossbeam-utils",
@@ -2773,7 +2780,7 @@ dependencies = [
"test-case",
"thiserror 2.0.16",
"tikv-jemallocator",
"toml 0.9.5",
"toml",
"tracing",
"walkdir",
"wild",
@@ -2789,7 +2796,7 @@ dependencies = [
"ruff_annotate_snippets",
"serde",
"snapbox",
"toml 0.9.5",
"toml",
"tryfn",
"unicode-width 0.2.1",
]
@@ -2908,7 +2915,7 @@ dependencies = [
"similar",
"strum",
"tempfile",
"toml 0.9.5",
"toml",
"tracing",
"tracing-indicatif",
"tracing-subscriber",
@@ -3009,6 +3016,7 @@ dependencies = [
"ruff_notebook",
"ruff_python_ast",
"ruff_python_codegen",
"ruff_python_importer",
"ruff_python_index",
"ruff_python_literal",
"ruff_python_parser",
@@ -3028,7 +3036,7 @@ dependencies = [
"tempfile",
"test-case",
"thiserror 2.0.16",
"toml 0.9.5",
"toml",
"typed-arena",
"unicode-normalization",
"unicode-width 0.2.1",
@@ -3159,6 +3167,21 @@ dependencies = [
"tracing",
]
[[package]]
name = "ruff_python_importer"
version = "0.0.0"
dependencies = [
"anyhow",
"insta",
"ruff_diagnostics",
"ruff_python_ast",
"ruff_python_codegen",
"ruff_python_parser",
"ruff_python_trivia",
"ruff_source_file",
"ruff_text_size",
]
[[package]]
name = "ruff_python_index"
version = "0.0.0"
@@ -3286,7 +3309,7 @@ dependencies = [
"serde_json",
"shellexpand",
"thiserror 2.0.16",
"toml 0.9.5",
"toml",
"tracing",
"tracing-log",
"tracing-subscriber",
@@ -3376,7 +3399,7 @@ dependencies = [
"shellexpand",
"strum",
"tempfile",
"toml 0.9.5",
"toml",
"unicode-normalization",
]
@@ -3412,7 +3435,7 @@ dependencies = [
"errno",
"libc",
"linux-raw-sys",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -3514,10 +3537,11 @@ checksum = "1c107b6f4780854c8b126e228ea8869f4d7b71260f962fefb57b996b8959ba6b"
[[package]]
name = "serde"
version = "1.0.219"
version = "1.0.223"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5f0e2c6ed6606019b4e29e69dbaba95b11854410e5347d525002456dbbb786b6"
checksum = "a505d71960adde88e293da5cb5eda57093379f64e61cf77bf0e6a63af07a7bac"
dependencies = [
"serde_core",
"serde_derive",
]
@@ -3533,10 +3557,19 @@ dependencies = [
]
[[package]]
name = "serde_derive"
version = "1.0.219"
name = "serde_core"
version = "1.0.223"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5b0276cf7f2c73365f7157c8123c21cd9a50fbbd844757af28ca1f5925fc2a00"
checksum = "20f57cbd357666aa7b3ac84a90b4ea328f1d4ddb6772b430caa5d9e1309bb9e9"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.223"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3d428d07faf17e306e699ec1e91996e5a165ba5d6bce5b5155173e91a8a01a56"
dependencies = [
"proc-macro2",
"quote",
@@ -3556,14 +3589,15 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.143"
version = "1.0.145"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d401abef1d108fbd9cbaebc3e46611f4b1021f714a0597a71f41ee463f5f4a5a"
checksum = "402a6f66d8c709116cf22f558eab210f5a50187f702eb4d7e5ef38d9a7f1c79c"
dependencies = [
"itoa",
"memchr",
"ryu",
"serde",
"serde_core",
]
[[package]]
@@ -3577,15 +3611,6 @@ dependencies = [
"syn",
]
[[package]]
name = "serde_spanned"
version = "0.6.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bf41e0cfaf7226dca15e8197172c295a782857fcb97fad1808a166870dee75a3"
dependencies = [
"serde",
]
[[package]]
name = "serde_spanned"
version = "1.0.0"
@@ -3797,15 +3822,15 @@ checksum = "55937e1799185b12863d447f42597ed69d9928686b8d88a1df17376a097d8369"
[[package]]
name = "tempfile"
version = "3.20.0"
version = "3.22.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e8a64e3985349f2441a1a9ef0b853f869006c3855f2cda6862a94d26ebb9d6a1"
checksum = "84fa4d11fadde498443cca10fd3ac23c951f0dc59e080e9f4b93d4df4e4eea53"
dependencies = [
"fastrand",
"getrandom 0.3.3",
"once_cell",
"rustix",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -3997,18 +4022,6 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "toml"
version = "0.8.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc1beb996b9d83529a9e75c17a1686767d148d70663143c7854d8b4a09ced362"
dependencies = [
"serde",
"serde_spanned 0.6.9",
"toml_datetime 0.6.11",
"toml_edit",
]
[[package]]
name = "toml"
version = "0.9.5"
@@ -4017,7 +4030,7 @@ checksum = "75129e1dc5000bfbaa9fee9d1b21f974f9fbad9daec557a521ee6e080825f6e8"
dependencies = [
"indexmap",
"serde",
"serde_spanned 1.0.0",
"serde_spanned",
"toml_datetime 0.7.0",
"toml_parser",
"toml_writer",
@@ -4029,9 +4042,6 @@ name = "toml_datetime"
version = "0.6.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "22cddaf88f4fbc13c51aebbf5f8eceb5c7c5a9da2ac40a13519eb5b0a0e8f11c"
dependencies = [
"serde",
]
[[package]]
name = "toml_datetime"
@@ -4049,8 +4059,6 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "41fe8c660ae4257887cf66394862d21dbca4a6ddd26f04a3560410406a2f819a"
dependencies = [
"indexmap",
"serde",
"serde_spanned 0.6.9",
"toml_datetime 0.6.11",
"winnow",
]
@@ -4192,7 +4200,7 @@ dependencies = [
"ruff_python_trivia",
"salsa",
"tempfile",
"toml 0.9.5",
"toml",
"tracing",
"tracing-flame",
"tracing-subscriber",
@@ -4226,9 +4234,12 @@ dependencies = [
"rayon",
"regex",
"ruff_db",
"ruff_diagnostics",
"ruff_index",
"ruff_memory_usage",
"ruff_python_ast",
"ruff_python_codegen",
"ruff_python_importer",
"ruff_python_parser",
"ruff_python_trivia",
"ruff_source_file",
@@ -4271,7 +4282,7 @@ dependencies = [
"schemars",
"serde",
"thiserror 2.0.16",
"toml 0.9.5",
"toml",
"tracing",
"ty_combine",
"ty_python_semantic",
@@ -4399,7 +4410,7 @@ dependencies = [
"smallvec",
"tempfile",
"thiserror 2.0.16",
"toml 0.9.5",
"toml",
"tracing",
"ty_python_semantic",
"ty_static",
@@ -4508,9 +4519,9 @@ checksum = "10103c57044730945224467c09f71a4db0071c123a0648cc3e818913bde6b561"
[[package]]
name = "unicode-ident"
version = "1.0.18"
version = "1.0.19"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a5f39404a5da50712a4c1eecf25e90dd62b613502b7e925fd4e4d19b5c96512"
checksum = "f63a545481291138910575129486daeaf8ac54aee4387fe7906919f7830c7d9d"
[[package]]
name = "unicode-normalization"
@@ -4611,9 +4622,9 @@ checksum = "06abde3611657adf66d383f00b093d7faecc7fa57071cce2578660c9f1010821"
[[package]]
name = "uuid"
version = "1.17.0"
version = "1.18.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3cf4199d1e5d15ddd86a694e4d0dffa9c323ce759fea589f00fef9d81cc1931d"
checksum = "2f87b8aa10b915a06587d0dec516c282ff295b475d94abf425d62b57710070a2"
dependencies = [
"getrandom 0.3.3",
"js-sys",
@@ -4624,9 +4635,9 @@ dependencies = [
[[package]]
name = "uuid-macro-internal"
version = "1.17.0"
version = "1.18.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26b682e8c381995ea03130e381928e0e005b7c9eb483c6c8682f50e07b33c2b7"
checksum = "d9384a660318abfbd7f8932c34d67e4d1ec511095f95972ddc01e19d7ba8413f"
dependencies = [
"proc-macro2",
"quote",
@@ -4878,7 +4889,7 @@ version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cf221c93e13a30d793f7645a0e7762c55d169dbb0a49671918a2319d289b10bb"
dependencies = [
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -4889,7 +4900,7 @@ checksum = "c0fdd3ddb90610c7638aa2b3a3ab2904fb9e5cdbecc643ddb3647212781c4ae3"
dependencies = [
"windows-implement",
"windows-interface",
"windows-link",
"windows-link 0.1.3",
"windows-result",
"windows-strings",
]
@@ -4922,13 +4933,19 @@ version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5e6ad25900d524eaabdbbb96d20b4311e1e7ae1699af4fb28c17ae66c80d798a"
[[package]]
name = "windows-link"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "45e46c0661abb7180e7b9c281db115305d49ca1709ab8242adf09666d2173c65"
[[package]]
name = "windows-result"
version = "0.3.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "56f42bd332cc6c8eac5af113fc0c1fd6a8fd2aa08a0119358686e5160d0586c6"
dependencies = [
"windows-link",
"windows-link 0.1.3",
]
[[package]]
@@ -4937,7 +4954,7 @@ version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "56e6c93f3a0c3b36176cb1327a4958a0353d5d166c2a35cb268ace15e91d3b57"
dependencies = [
"windows-link",
"windows-link 0.1.3",
]
[[package]]
@@ -4967,6 +4984,15 @@ dependencies = [
"windows-targets 0.53.3",
]
[[package]]
name = "windows-sys"
version = "0.61.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e201184e40b2ede64bc2ea34968b28e33622acdbbf37104f0e4a33f7abe657aa"
dependencies = [
"windows-link 0.2.0",
]
[[package]]
name = "windows-targets"
version = "0.52.6"
@@ -4989,7 +5015,7 @@ version = "0.53.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5fe6031c4041849d7c496a8ded650796e7b6ecc19df1a431c1a363342e5dc91"
dependencies = [
"windows-link",
"windows-link 0.1.3",
"windows_aarch64_gnullvm 0.53.0",
"windows_aarch64_msvc 0.53.0",
"windows_i686_gnu 0.53.0",


@@ -29,6 +29,7 @@ ruff_options_metadata = { path = "crates/ruff_options_metadata" }
ruff_python_ast = { path = "crates/ruff_python_ast" }
ruff_python_codegen = { path = "crates/ruff_python_codegen" }
ruff_python_formatter = { path = "crates/ruff_python_formatter" }
ruff_python_importer = { path = "crates/ruff_python_importer" }
ruff_python_index = { path = "crates/ruff_python_index" }
ruff_python_literal = { path = "crates/ruff_python_literal" }
ruff_python_parser = { path = "crates/ruff_python_parser" }


@@ -421,7 +421,7 @@ Ruff is released under the MIT license.
Ruff is used by a number of major open-source projects and companies, including:
- [Albumentations](https://github.com/albumentations-team/albumentations)
- [Albumentations](https://github.com/albumentations-team/AlbumentationsX)
- Amazon ([AWS SAM](https://github.com/aws/serverless-application-model))
- [Anki](https://apps.ankiweb.net/)
- Anthropic ([Python SDK](https://github.com/anthropics/anthropic-sdk-python))


@@ -85,7 +85,7 @@ dist = true
[target.'cfg(target_os = "windows")'.dependencies]
mimalloc = { workspace = true }
[target.'cfg(all(not(target_os = "windows"), not(target_os = "openbsd"), not(target_os = "aix"), not(target_os = "android"), any(target_arch = "x86_64", target_arch = "aarch64", target_arch = "powerpc64", target_arch = "riscv64", target_arch = "loongarch64")))'.dependencies]
[target.'cfg(all(not(target_os = "windows"), not(target_os = "openbsd"), not(target_os = "aix"), not(target_os = "android"), any(target_arch = "x86_64", target_arch = "aarch64", target_arch = "powerpc64", target_arch = "riscv64")))'.dependencies]
tikv-jemallocator = { workspace = true }
[lints]


@@ -6,12 +6,14 @@ use std::time::Instant;
use anyhow::Result;
use colored::Colorize;
use ignore::Error;
use log::{debug, error, warn};
use log::{debug, warn};
#[cfg(not(target_family = "wasm"))]
use rayon::prelude::*;
use rustc_hash::FxHashMap;
use ruff_db::diagnostic::Diagnostic;
use ruff_db::diagnostic::{
Annotation, Diagnostic, DiagnosticId, Span, SubDiagnostic, SubDiagnosticSeverity,
};
use ruff_db::panic::catch_unwind;
use ruff_linter::package::PackageRoot;
use ruff_linter::registry::Rule;
@@ -193,21 +195,24 @@ fn lint_path(
match result {
Ok(inner) => inner,
Err(error) => {
let message = r"This indicates a bug in Ruff. If you could open an issue at:
https://github.com/astral-sh/ruff/issues/new?title=%5BLinter%20panic%5D
...with the relevant file contents, the `pyproject.toml` settings, and the following stack trace, we'd be very appreciative!
";
error!(
"{}{}{} {message}\n{error}",
"Panicked while linting ".bold(),
fs::relativize_path(path).bold(),
":".bold()
let message = match error.payload.as_str() {
Some(summary) => format!("Fatal error while linting: {summary}"),
_ => "Fatal error while linting".to_owned(),
};
let mut diagnostic = Diagnostic::new(
DiagnosticId::Panic,
ruff_db::diagnostic::Severity::Fatal,
message,
);
Ok(Diagnostics::default())
let span = Span::from(SourceFileBuilder::new(path.to_string_lossy(), "").finish());
let mut annotation = Annotation::primary(span);
annotation.set_file_level(true);
diagnostic.annotate(annotation);
diagnostic.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
format!("{error}"),
));
Ok(Diagnostics::new(vec![diagnostic], FxHashMap::default()))
}
}
}


@@ -9,10 +9,11 @@ use std::sync::mpsc::channel;
use anyhow::Result;
use clap::CommandFactory;
use colored::Colorize;
use log::warn;
use log::{error, warn};
use notify::{RecursiveMode, Watcher, recommended_watcher};
use args::{GlobalConfigArgs, ServerCommand};
use ruff_db::diagnostic::{Diagnostic, Severity};
use ruff_linter::logging::{LogLevel, set_up_logging};
use ruff_linter::settings::flags::FixMode;
use ruff_linter::settings::types::OutputFormat;
@@ -444,6 +445,27 @@ pub fn check(args: CheckCommand, global_options: GlobalConfigArgs) -> Result<Exi
}
if !cli.exit_zero {
let max_severity = diagnostics
.inner
.iter()
.map(Diagnostic::severity)
.max()
.unwrap_or(Severity::Info);
if max_severity.is_fatal() {
// When a panic/fatal error is reported, prompt the user to open an issue on github.
// Diagnostics with severity `fatal` will be sorted to the bottom, and printing the
// message here instead of attaching it to the diagnostic ensures that we only print
// it once instead of repeating it for each diagnostic. Prints to stderr to prevent
// the message from being captured by tools parsing the normal output.
let message = "Panic during linting indicates a bug in Ruff. If you could open an issue at:
https://github.com/astral-sh/ruff/issues/new?title=%5BLinter%20panic%5D
...with the relevant file contents, the `pyproject.toml` settings, and the stack trace above, we'd be very appreciative!
";
error!("{message}");
return Ok(ExitStatus::Error);
}
if cli.diff {
// If we're printing a diff, we always want to exit non-zero if there are
// any fixable violations (since we've printed the diff, but not applied the


@@ -20,8 +20,7 @@ static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
target_arch = "x86_64",
target_arch = "aarch64",
target_arch = "powerpc64",
target_arch = "riscv64",
target_arch = "loongarch64",
target_arch = "riscv64"
)
))]
#[global_allocator]


@@ -10,7 +10,8 @@ use ruff_linter::linter::FixTable;
use serde::Serialize;
use ruff_db::diagnostic::{
Diagnostic, DiagnosticFormat, DisplayDiagnosticConfig, DisplayDiagnostics, SecondaryCode,
Diagnostic, DiagnosticFormat, DisplayDiagnosticConfig, DisplayDiagnostics,
DisplayGithubDiagnostics, GithubRenderer, SecondaryCode,
};
use ruff_linter::fs::relativize_path;
use ruff_linter::logging::LogLevel;
@@ -224,32 +225,26 @@ impl Printer {
let context = EmitterContext::new(&diagnostics.notebook_indexes);
let fixables = FixableStatistics::try_from(diagnostics, self.unsafe_fixes);
let config = DisplayDiagnosticConfig::default().preview(preview);
match self.format {
OutputFormat::Json => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Json)
.preview(preview);
let config = config.format(DiagnosticFormat::Json);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Rdjson => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Rdjson)
.preview(preview);
let config = config.format(DiagnosticFormat::Rdjson);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::JsonLines => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::JsonLines)
.preview(preview);
let config = config.format(DiagnosticFormat::JsonLines);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Junit => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Junit)
.preview(preview);
let config = config.format(DiagnosticFormat::Junit);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
@@ -288,30 +283,22 @@ impl Printer {
self.write_summary_text(writer, diagnostics)?;
}
OutputFormat::Github => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Github)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
let renderer = GithubRenderer::new(&context, "Ruff");
let value = DisplayGithubDiagnostics::new(&renderer, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Gitlab => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Gitlab)
.preview(preview);
let config = config.format(DiagnosticFormat::Gitlab);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Pylint => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Pylint)
.preview(preview);
let config = config.format(DiagnosticFormat::Pylint);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Azure => {
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Azure)
.preview(preview);
let config = config.format(DiagnosticFormat::Azure);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}


@@ -246,6 +246,59 @@ fn string_detection() -> Result<()> {
Ok(())
}
#[test]
fn string_detection_from_config() -> Result<()> {
let tempdir = TempDir::new()?;
let root = ChildPath::new(tempdir.path());
// Configure string import detection with a lower min-dots via ruff.toml
root.child("ruff.toml").write_str(indoc::indoc! {r#"
[analyze]
detect-string-imports = true
string-imports-min-dots = 1
"#})?;
root.child("ruff").child("__init__.py").write_str("")?;
root.child("ruff")
.child("a.py")
.write_str(indoc::indoc! {r#"
import ruff.b
"#})?;
root.child("ruff")
.child("b.py")
.write_str(indoc::indoc! {r#"
import importlib
importlib.import_module("ruff.c")
"#})?;
root.child("ruff").child("c.py").write_str("")?;
insta::with_settings!({
filters => INSTA_FILTERS.to_vec(),
}, {
assert_cmd_snapshot!(command().current_dir(&root), @r#"
success: true
exit_code: 0
----- stdout -----
{
"ruff/__init__.py": [],
"ruff/a.py": [
"ruff/b.py"
],
"ruff/b.py": [
"ruff/c.py"
],
"ruff/c.py": []
}
----- stderr -----
"#);
});
Ok(())
}
#[test]
fn globs() -> Result<()> {
let tempdir = TempDir::new()?;


@@ -5059,6 +5059,59 @@ fn flake8_import_convention_unused_aliased_import_no_conflict() {
);
}
// https://github.com/astral-sh/ruff/issues/19842
#[test]
fn pyupgrade_up026_respects_isort_required_import_fix() {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.arg("--isolated")
.arg("check")
.arg("-")
.args(["--select", "I002,UP026"])
.arg("--config")
.arg(r#"lint.isort.required-imports=["import mock"]"#)
.arg("--fix")
.arg("--no-cache")
.pass_stdin("1\n"),
@r"
success: true
exit_code: 0
----- stdout -----
import mock
1
----- stderr -----
Found 1 error (1 fixed, 0 remaining).
"
);
}
// https://github.com/astral-sh/ruff/issues/19842
#[test]
fn pyupgrade_up026_respects_isort_required_import_from_fix() {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.arg("--isolated")
.arg("check")
.arg("-")
.args(["--select", "I002,UP026"])
.arg("--config")
.arg(r#"lint.isort.required-imports = ["from mock import mock"]"#)
.arg("--fix")
.arg("--no-cache")
.pass_stdin("from mock import mock\n"),
@r"
success: true
exit_code: 0
----- stdout -----
from mock import mock
----- stderr -----
All checks passed!
"
);
}
// See: https://github.com/astral-sh/ruff/issues/16177
#[test]
fn flake8_pyi_redundant_none_literal() {
@@ -5838,3 +5891,113 @@ fn show_fixes_in_full_output_with_preview_enabled() {
",
);
}
#[test]
fn rule_panic_mixed_results_concise() -> Result<()> {
let tempdir = TempDir::new()?;
// Create python files
let file_a_path = tempdir.path().join("normal.py");
let file_b_path = tempdir.path().join("panic.py");
fs::write(&file_a_path, b"import os")?;
fs::write(&file_b_path, b"print('hello, world!')")?;
insta::with_settings!({
filters => vec![
(tempdir_filter(&tempdir).as_str(), "[TMP]/"),
(r"\\", r"/"),
]
}, {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--select", "RUF9", "--preview", "--output-format=concise", "--no-cache"])
.args([file_a_path, file_b_path]),
@r"
success: false
exit_code: 2
----- stdout -----
[TMP]/normal.py:1:1: RUF900 Hey this is a stable test rule.
[TMP]/normal.py:1:1: RUF901 [*] Hey this is a stable test rule with a safe fix.
[TMP]/normal.py:1:1: RUF902 Hey this is a stable test rule with an unsafe fix.
[TMP]/normal.py:1:1: RUF903 Hey this is a stable test rule with a display only fix.
[TMP]/normal.py:1:1: RUF911 Hey this is a preview test rule.
[TMP]/normal.py:1:1: RUF950 Hey this is a test rule that was redirected from another.
[TMP]/panic.py: panic: Fatal error while linting: This is a fake panic for testing.
Found 7 errors.
[*] 1 fixable with the `--fix` option (1 hidden fix can be enabled with the `--unsafe-fixes` option).
----- stderr -----
error: Panic during linting indicates a bug in Ruff. If you could open an issue at:
https://github.com/astral-sh/ruff/issues/new?title=%5BLinter%20panic%5D
...with the relevant file contents, the `pyproject.toml` settings, and the stack trace above, we'd be very appreciative!
");
});
Ok(())
}
#[test]
fn rule_panic_mixed_results_full() -> Result<()> {
let tempdir = TempDir::new()?;
// Create python files
let file_a_path = tempdir.path().join("normal.py");
let file_b_path = tempdir.path().join("panic.py");
fs::write(&file_a_path, b"import os")?;
fs::write(&file_b_path, b"print('hello, world!')")?;
insta::with_settings!({
filters => vec![
(tempdir_filter(&tempdir).as_str(), "[TMP]/"),
(r"\\", r"/"),
]
}, {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--select", "RUF9", "--preview", "--output-format=full", "--no-cache"])
.args([file_a_path, file_b_path]),
@r"
success: false
exit_code: 2
----- stdout -----
RUF900 Hey this is a stable test rule.
--> [TMP]/normal.py:1:1
RUF901 [*] Hey this is a stable test rule with a safe fix.
--> [TMP]/normal.py:1:1
1 + # fix from stable-test-rule-safe-fix
2 | import os
RUF902 Hey this is a stable test rule with an unsafe fix.
--> [TMP]/normal.py:1:1
RUF903 Hey this is a stable test rule with a display only fix.
--> [TMP]/normal.py:1:1
RUF911 Hey this is a preview test rule.
--> [TMP]/normal.py:1:1
RUF950 Hey this is a test rule that was redirected from another.
--> [TMP]/normal.py:1:1
panic: Fatal error while linting: This is a fake panic for testing.
--> [TMP]/panic.py:1:1
info: panicked at crates/ruff_linter/src/rules/ruff/rules/test_rules.rs:511:9:
This is a fake panic for testing.
run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Found 7 errors.
[*] 1 fixable with the `--fix` option (1 hidden fix can be enabled with the `--unsafe-fixes` option).
----- stderr -----
error: Panic during linting indicates a bug in Ruff. If you could open an issue at:
https://github.com/astral-sh/ruff/issues/new?title=%5BLinter%20panic%5D
...with the relevant file contents, the `pyproject.toml` settings, and the stack trace above, we'd be very appreciative!
");
});
Ok(())
}


@@ -86,5 +86,5 @@ walltime = ["ruff_db/os", "ty_project", "divan"]
[target.'cfg(target_os = "windows")'.dev-dependencies]
mimalloc = { workspace = true }
[target.'cfg(all(not(target_os = "windows"), not(target_os = "openbsd"), any(target_arch = "x86_64", target_arch = "aarch64", target_arch = "powerpc64", target_arch = "riscv64", target_arch = "loongarch64")))'.dev-dependencies]
[target.'cfg(all(not(target_os = "windows"), not(target_os = "openbsd"), any(target_arch = "x86_64", target_arch = "aarch64", target_arch = "powerpc64", target_arch = "riscv64")))'.dev-dependencies]
tikv-jemallocator = { workspace = true }


@@ -22,8 +22,7 @@ static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
target_arch = "x86_64",
target_arch = "aarch64",
target_arch = "powerpc64",
target_arch = "riscv64",
target_arch = "loongarch64"
target_arch = "riscv64"
)
))]
#[global_allocator]


@@ -19,8 +19,7 @@ static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
target_arch = "x86_64",
target_arch = "aarch64",
target_arch = "powerpc64",
target_arch = "riscv64",
target_arch = "loongarch64"
target_arch = "riscv64"
)
))]
#[global_allocator]


@@ -27,8 +27,7 @@ static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
target_arch = "x86_64",
target_arch = "aarch64",
target_arch = "powerpc64",
target_arch = "riscv64",
target_arch = "loongarch64"
target_arch = "riscv64"
)
))]
#[global_allocator]
@@ -45,8 +44,7 @@ static GLOBAL: tikv_jemallocator::Jemalloc = tikv_jemallocator::Jemalloc;
target_arch = "x86_64",
target_arch = "aarch64",
target_arch = "powerpc64",
target_arch = "riscv64",
target_arch = "loongarch64"
target_arch = "riscv64"
)
))]
#[unsafe(export_name = "_rjem_malloc_conf")]


@@ -21,8 +21,7 @@ static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
target_arch = "x86_64",
target_arch = "aarch64",
target_arch = "powerpc64",
target_arch = "riscv64",
target_arch = "loongarch64"
target_arch = "riscv64"
)
))]
#[global_allocator]


@@ -8,6 +8,7 @@ use ruff_text_size::{Ranged, TextRange, TextSize};
pub use self::render::{
DisplayDiagnostic, DisplayDiagnostics, FileResolver, Input, ceil_char_boundary,
github::{DisplayGithubDiagnostics, GithubRenderer},
};
use crate::{Db, files::File};
@@ -499,10 +500,12 @@ impl Diagnostic {
/// Panics if either diagnostic has no primary span, or if its file is not a `SourceFile`.
pub fn ruff_start_ordering(&self, other: &Self) -> std::cmp::Ordering {
let a = (
self.severity().is_fatal(),
self.expect_ruff_source_file(),
self.range().map(|r| r.start()),
);
let b = (
other.severity().is_fatal(),
other.expect_ruff_source_file(),
other.range().map(|r| r.start()),
);
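The `ruff_start_ordering` keys above rely on lexicographic tuple comparison. A minimal Python sketch of the same idea (illustrative data; assuming standard ascending order, where `False` sorts before `True`):

```python
# Hypothetical diagnostics as (is_fatal, file, start) tuples, mirroring the
# sort keys built in ruff_start_ordering above.
diags = [
    (False, "b.py", 10),
    (True, "a.py", 1),
    (False, "a.py", 5),
]

# Tuples compare element by element: non-fatal diagnostics sort first,
# then by file name, then by start offset.
ordered = sorted(diags)
```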


@@ -31,7 +31,7 @@ use pylint::PylintRenderer;
mod azure;
mod concise;
mod full;
mod github;
pub mod github;
#[cfg(feature = "serde")]
mod gitlab;
#[cfg(feature = "serde")]
@@ -145,7 +145,7 @@ impl std::fmt::Display for DisplayDiagnostics<'_> {
gitlab::GitlabRenderer::new(self.resolver).render(f, self.diagnostics)?;
}
DiagnosticFormat::Github => {
GithubRenderer::new(self.resolver).render(f, self.diagnostics)?;
GithubRenderer::new(self.resolver, "ty").render(f, self.diagnostics)?;
}
}


@@ -1,12 +1,13 @@
use crate::diagnostic::{Diagnostic, FileResolver};
use crate::diagnostic::{Diagnostic, FileResolver, Severity};
pub(super) struct GithubRenderer<'a> {
pub struct GithubRenderer<'a> {
resolver: &'a dyn FileResolver,
program: &'a str,
}
impl<'a> GithubRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver) -> Self {
Self { resolver }
pub fn new(resolver: &'a dyn FileResolver, program: &'a str) -> Self {
Self { resolver, program }
}
pub(super) fn render(
@@ -15,9 +16,15 @@ impl<'a> GithubRenderer<'a> {
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
for diagnostic in diagnostics {
let severity = match diagnostic.severity() {
Severity::Info => "notice",
Severity::Warning => "warning",
Severity::Error | Severity::Fatal => "error",
};
write!(
f,
"::error title=Ruff ({code})",
"::{severity} title={program} ({code})",
program = self.program,
code = diagnostic.secondary_code_or_id()
)?;
@@ -75,6 +82,26 @@ impl<'a> GithubRenderer<'a> {
}
}
pub struct DisplayGithubDiagnostics<'a> {
renderer: &'a GithubRenderer<'a>,
diagnostics: &'a [Diagnostic],
}
impl<'a> DisplayGithubDiagnostics<'a> {
pub fn new(renderer: &'a GithubRenderer<'a>, diagnostics: &'a [Diagnostic]) -> Self {
Self {
renderer,
diagnostics,
}
}
}
impl std::fmt::Display for DisplayGithubDiagnostics<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.renderer.render(f, self.diagnostics)
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
@@ -103,7 +130,7 @@ mod tests {
insta::assert_snapshot!(
env.render(&diag),
@"::error title=Ruff (test-diagnostic)::test-diagnostic: main diagnostic message",
@"::error title=ty (test-diagnostic)::test-diagnostic: main diagnostic message",
);
}
}
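The `::{severity} title={program} ({code})` strings above follow the GitHub Actions workflow-command syntax. A small Python sketch of the renderer's severity mapping (function and dict names here are illustrative, not the renderer's API):

```python
# Map diagnostic severities to GitHub Actions annotation levels, as the
# GithubRenderer above does (Info -> notice, Warning -> warning,
# Error/Fatal -> error).
SEVERITY_TO_COMMAND = {
    "info": "notice",
    "warning": "warning",
    "error": "error",
    "fatal": "error",
}


def annotation(severity: str, program: str, code: str, message: str) -> str:
    # GitHub Actions workflow-command syntax: ::level title=...::message
    level = SEVERITY_TO_COMMAND[severity]
    return f"::{level} title={program} ({code})::{message}"
```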


@@ -2,6 +2,6 @@
source: crates/ruff_db/src/diagnostic/render/github.rs
expression: env.render_diagnostics(&diagnostics)
---
::error title=Ruff (F401),file=fib.py,line=1,col=8,endLine=1,endColumn=10::fib.py:1:8: F401 `os` imported but unused
::error title=Ruff (F841),file=fib.py,line=6,col=5,endLine=6,endColumn=6::fib.py:6:5: F841 Local variable `x` is assigned to but never used
::error title=Ruff (F821),file=undef.py,line=1,col=4,endLine=1,endColumn=5::undef.py:1:4: F821 Undefined name `a`
::error title=ty (F401),file=fib.py,line=1,col=8,endLine=1,endColumn=10::fib.py:1:8: F401 `os` imported but unused
::error title=ty (F841),file=fib.py,line=6,col=5,endLine=6,endColumn=6::fib.py:6:5: F841 Local variable `x` is assigned to but never used
::error title=ty (F821),file=undef.py,line=1,col=4,endLine=1,endColumn=5::undef.py:1:4: F821 Undefined name `a`


@@ -2,5 +2,5 @@
source: crates/ruff_db/src/diagnostic/render/github.rs
expression: env.render_diagnostics(&diagnostics)
---
::error title=Ruff (invalid-syntax),file=syntax_errors.py,line=1,col=15,endLine=2,endColumn=1::syntax_errors.py:1:15: invalid-syntax: Expected one or more symbol names after import
::error title=Ruff (invalid-syntax),file=syntax_errors.py,line=3,col=12,endLine=4,endColumn=1::syntax_errors.py:3:12: invalid-syntax: Expected ')', found newline
::error title=ty (invalid-syntax),file=syntax_errors.py,line=1,col=15,endLine=2,endColumn=1::syntax_errors.py:1:15: invalid-syntax: Expected one or more symbol names after import
::error title=ty (invalid-syntax),file=syntax_errors.py,line=3,col=12,endLine=4,endColumn=1::syntax_errors.py:3:12: invalid-syntax: Expected ')', found newline


@@ -459,6 +459,12 @@ impl File {
self.source_type(db).is_stub()
}
/// Returns `true` if the file is an `__init__.py(i)`
pub fn is_init(self, db: &dyn Db) -> bool {
let path = self.path(db).as_str();
path.ends_with("__init__.py") || path.ends_with("__init__.pyi")
}
pub fn source_type(self, db: &dyn Db) -> PySourceType {
match self.path(db) {
FilePath::System(path) => path


@@ -20,6 +20,7 @@ ruff_notebook = { workspace = true }
ruff_macros = { workspace = true }
ruff_python_ast = { workspace = true, features = ["serde", "cache"] }
ruff_python_codegen = { workspace = true }
ruff_python_importer = { workspace = true }
ruff_python_index = { workspace = true }
ruff_python_literal = { workspace = true }
ruff_python_semantic = { workspace = true }


@@ -0,0 +1,24 @@
import builtins
from pathlib import Path
_file = "file.txt"
_x = ("r+", -1)
r_plus = "r+"
builtins.open(file=_file)
open(_file, "r+ ", - 1)
open(mode="wb", file=_file)
open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
open(_file, "r+", - 1, None, None, None, True, None)
open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
open(_file, f" {r_plus} ", - 1)
open(buffering=- 1, file=_file, encoding= "utf-8")
# Only diagnostic
open()
open(_file, *_x)
open(_file, "r+", unknown=True)
open(_file, "r+", closefd=False)
open(_file, "r+", None, None, None, None, None, None, None)


@@ -62,4 +62,4 @@ rename(
rename(file, "file_2.py", src_dir_fd=None, dst_dir_fd=None)
rename(file, "file_2.py", src_dir_fd=1)
rename(file, "file_2.py", src_dir_fd=1)


@@ -974,3 +974,39 @@ BANANA = 100
APPLE = 200
# end
# https://github.com/astral-sh/ruff/issues/19752
class foo:
async def recv(self, *, length=65536):
loop = asyncio.get_event_loop()
def callback():
loop.remove_reader(self._fd)
loop.add_reader(self._fd, callback)
# end
# E301
class Foo:
if True:
print("conditional")
def test():
pass
# end
# Test case for nested class scenario
class Bar:
def f():
x = 1
def g():
return 1
return 2
def f():
class Baz:
x = 1
def g():
return 1
return 2
# end


@@ -125,4 +125,13 @@ class D:
@dataclass
class E:
c: C = C()
c: C = C()
# https://github.com/astral-sh/ruff/issues/20266
@dataclass(frozen=True)
class C:
@dataclass(frozen=True)
class D:
foo: int = 1
d: D = D() # OK


@@ -134,7 +134,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
);
}
if checker.is_rule_enabled(Rule::FunctionCallInDataclassDefaultArgument) {
ruff::rules::function_call_in_dataclass_default(checker, class_def);
ruff::rules::function_call_in_dataclass_default(checker, class_def, scope_id);
}
if checker.is_rule_enabled(Rule::MutableClassDefault) {
ruff::rules::mutable_class_default(checker, class_def);


@@ -1051,7 +1051,6 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
Rule::OsStat,
Rule::OsPathJoin,
Rule::OsPathSplitext,
Rule::BuiltinOpen,
Rule::PyPath,
Rule::Glob,
Rule::OsListdir,
@@ -1135,6 +1134,9 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if checker.is_rule_enabled(Rule::OsSymlink) {
flake8_use_pathlib::rules::os_symlink(checker, call, segments);
}
if checker.is_rule_enabled(Rule::BuiltinOpen) {
flake8_use_pathlib::rules::builtin_open(checker, call, segments);
}
if checker.is_rule_enabled(Rule::PathConstructorCurrentDirectory) {
flake8_use_pathlib::rules::path_constructor_current_directory(
checker, call, segments,


@@ -1,7 +1,6 @@
use ruff_python_ast::helpers;
use ruff_python_ast::types::Node;
use ruff_python_ast::{self as ast, Expr, Stmt};
use ruff_python_semantic::ScopeKind;
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
@@ -809,17 +808,6 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pyflakes::rules::future_feature_not_defined(checker, alias);
}
} else if &alias.name == "*" {
// F406
if checker.is_rule_enabled(Rule::UndefinedLocalWithNestedImportStarUsage) {
if !matches!(checker.semantic.current_scope().kind, ScopeKind::Module) {
checker.report_diagnostic(
pyflakes::rules::UndefinedLocalWithNestedImportStarUsage {
name: helpers::format_import_from(level, module).to_string(),
},
stmt.range(),
);
}
}
// F403
checker.report_diagnostic_if_enabled(
pyflakes::rules::UndefinedLocalWithImportStar {


@@ -13,7 +13,7 @@ pub(crate) fn unresolved_references(checker: &Checker) {
for reference in checker.semantic.unresolved_references() {
if reference.is_wildcard_import() {
// F406
// F405
checker.report_diagnostic_if_enabled(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: reference.name(checker.source()).to_string(),


@@ -69,7 +69,8 @@ use crate::package::PackageRoot;
use crate::preview::is_undefined_export_in_dunder_init_enabled;
use crate::registry::Rule;
use crate::rules::pyflakes::rules::{
LateFutureImport, ReturnOutsideFunction, YieldOutsideFunction,
LateFutureImport, ReturnOutsideFunction, UndefinedLocalWithNestedImportStarUsage,
YieldOutsideFunction,
};
use crate::rules::pylint::rules::{
AwaitOutsideAsync, LoadBeforeGlobalDeclaration, YieldFromInAsyncFunction,
@@ -276,7 +277,7 @@ impl<'a> Checker<'a> {
locator,
stylist,
indexer,
importer: Importer::new(parsed, locator, stylist),
importer: Importer::new(parsed, locator.contents(), stylist),
semantic,
visit: deferred::Visit::default(),
analyze: deferred::Analyze::default(),
@@ -659,6 +660,14 @@ impl SemanticSyntaxContext for Checker<'_> {
self.report_diagnostic(YieldOutsideFunction::new(kind), error.range);
}
}
SemanticSyntaxErrorKind::NonModuleImportStar(name) => {
if self.is_rule_enabled(Rule::UndefinedLocalWithNestedImportStarUsage) {
self.report_diagnostic(
UndefinedLocalWithNestedImportStarUsage { name },
error.range,
);
}
}
SemanticSyntaxErrorKind::ReturnOutsideFunction => {
// F706
if self.is_rule_enabled(Rule::ReturnOutsideFunction) {
@@ -3266,6 +3275,11 @@ impl<'a> LintContext<'a> {
pub(crate) const fn settings(&self) -> &LinterSettings {
self.settings
}
#[cfg(any(feature = "test-rules", test))]
pub(crate) const fn source_file(&self) -> &SourceFile {
&self.source_file
}
}
/// An abstraction for mutating a diagnostic.
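The `NonModuleImportStar` case handled above corresponds to a restriction CPython itself enforces: a wildcard import is only legal at module scope. For example:

```python
import textwrap

src = textwrap.dedent(
    """
    def f():
        from os import *
    """
)

# CPython rejects a wildcard import inside a function body with a
# SyntaxError; the checker above surfaces the same condition as F406.
try:
    compile(src, "<example>", "exec")
    rejected = False
except SyntaxError:
    rejected = True
```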


@@ -944,7 +944,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Flake8UsePathlib, "120") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsPathDirname),
(Flake8UsePathlib, "121") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsPathSamefile),
(Flake8UsePathlib, "122") => (RuleGroup::Stable, rules::flake8_use_pathlib::violations::OsPathSplitext),
(Flake8UsePathlib, "123") => (RuleGroup::Stable, rules::flake8_use_pathlib::violations::BuiltinOpen),
(Flake8UsePathlib, "123") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::BuiltinOpen),
(Flake8UsePathlib, "124") => (RuleGroup::Stable, rules::flake8_use_pathlib::violations::PyPath),
(Flake8UsePathlib, "201") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::PathConstructorCurrentDirectory),
(Flake8UsePathlib, "202") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsPathGetsize),
@@ -1080,6 +1080,8 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Ruff, "950") => (RuleGroup::Stable, rules::ruff::rules::RedirectedToTestRule),
#[cfg(any(feature = "test-rules", test))]
(Ruff, "960") => (RuleGroup::Removed, rules::ruff::rules::RedirectedFromPrefixTestRule),
#[cfg(any(feature = "test-rules", test))]
(Ruff, "990") => (RuleGroup::Preview, rules::ruff::rules::PanicyTestRule),
// flake8-django


@@ -14,6 +14,7 @@ use unicode_normalization::UnicodeNormalization;
use ruff_python_ast::Stmt;
use ruff_python_ast::name::UnqualifiedName;
use ruff_python_codegen::Stylist;
use ruff_text_size::Ranged;
use crate::Locator;
use crate::cst::matchers::match_statement;
@@ -127,10 +128,10 @@ pub(crate) fn remove_imports<'a>(
pub(crate) fn retain_imports(
member_names: &[&str],
stmt: &Stmt,
locator: &Locator,
contents: &str,
stylist: &Stylist,
) -> Result<String> {
let module_text = locator.slice(stmt);
let module_text = &contents[stmt.range()];
let mut tree = match_statement(module_text)?;
let Statement::Simple(body) = &mut tree else {


@@ -6,10 +6,12 @@
use std::error::Error;
use anyhow::Result;
use libcst_native::{ImportAlias, Name as cstName, NameOrAttribute};
use libcst_native as cst;
use ruff_diagnostics::Edit;
use ruff_python_ast::{self as ast, Expr, ModModule, Stmt};
use ruff_python_codegen::Stylist;
use ruff_python_importer::Insertion;
use ruff_python_parser::{Parsed, Tokens};
use ruff_python_semantic::{
ImportedName, MemberNameImport, ModuleNameImport, NameImport, SemanticModel,
@@ -17,22 +19,17 @@ use ruff_python_semantic::{
use ruff_python_trivia::textwrap::indent;
use ruff_text_size::{Ranged, TextSize};
use crate::Edit;
use crate::Locator;
use crate::cst::matchers::{match_aliases, match_import_from, match_statement};
use crate::fix;
use crate::fix::codemods::CodegenStylist;
use crate::importer::insertion::Insertion;
mod insertion;
pub(crate) struct Importer<'a> {
/// The Python AST to which we are adding imports.
python_ast: &'a [Stmt],
/// The tokens representing the Python AST.
tokens: &'a Tokens,
/// The [`Locator`] for the Python AST.
locator: &'a Locator<'a>,
/// The source code text for `python_ast`.
source: &'a str,
/// The [`Stylist`] for the Python AST.
stylist: &'a Stylist<'a>,
/// The list of visited, top-level runtime imports in the Python AST.
@@ -44,13 +41,13 @@ pub(crate) struct Importer<'a> {
impl<'a> Importer<'a> {
pub(crate) fn new(
parsed: &'a Parsed<ModModule>,
locator: &'a Locator<'a>,
source: &'a str,
stylist: &'a Stylist<'a>,
) -> Self {
Self {
python_ast: parsed.suite(),
tokens: parsed.tokens(),
locator,
source,
stylist,
runtime_imports: Vec::default(),
type_checking_blocks: Vec::default(),
@@ -76,11 +73,10 @@ impl<'a> Importer<'a> {
let required_import = import.to_string();
if let Some(stmt) = self.preceding_import(at) {
// Insert after the last top-level import.
Insertion::end_of_statement(stmt, self.locator, self.stylist)
.into_edit(&required_import)
Insertion::end_of_statement(stmt, self.source, self.stylist).into_edit(&required_import)
} else {
// Insert at the start of the file.
Insertion::start_of_file(self.python_ast, self.locator, self.stylist)
Insertion::start_of_file(self.python_ast, self.source, self.stylist)
.into_edit(&required_import)
}
}
@@ -99,17 +95,17 @@ impl<'a> Importer<'a> {
let content = fix::codemods::retain_imports(
&import.names,
import.statement,
self.locator,
self.source,
self.stylist,
)?;
// Add the import to the top-level.
let insertion = if let Some(stmt) = self.preceding_import(at) {
// Insert after the last top-level import.
Insertion::end_of_statement(stmt, self.locator, self.stylist)
Insertion::end_of_statement(stmt, self.source, self.stylist)
} else {
// Insert at the start of the file.
Insertion::start_of_file(self.python_ast, self.locator, self.stylist)
Insertion::start_of_file(self.python_ast, self.source, self.stylist)
};
let add_import_edit = insertion.into_edit(&content);
@@ -131,7 +127,7 @@ impl<'a> Importer<'a> {
let content = fix::codemods::retain_imports(
&import.names,
import.statement,
self.locator,
self.source,
self.stylist,
)?;
@@ -155,7 +151,7 @@ impl<'a> Importer<'a> {
None
} else {
Some(Edit::range_replacement(
self.locator.slice(statement.range()).to_string(),
self.source[statement.range()].to_string(),
statement.range(),
))
}
@@ -186,7 +182,7 @@ impl<'a> Importer<'a> {
None
} else {
Some(Edit::range_replacement(
self.locator.slice(type_checking.range()).to_string(),
self.source[type_checking.range()].to_string(),
type_checking.range(),
))
};
@@ -362,7 +358,7 @@ impl<'a> Importer<'a> {
// By adding this no-op edit, we force the `unused-imports` fix to conflict with the
// `sys-exit-alias` fix, and thus will avoid applying both fixes in the same pass.
let import_edit = Edit::range_replacement(
self.locator.slice(imported_name.range()).to_string(),
self.source[imported_name.range()].to_string(),
imported_name.range(),
);
Ok(Some((import_edit, imported_name.into_name())))
@@ -469,11 +465,11 @@ impl<'a> Importer<'a> {
/// Add the given member to an existing `Stmt::ImportFrom` statement.
fn add_member(&self, stmt: &Stmt, member: &str) -> Result<Edit> {
let mut statement = match_statement(self.locator.slice(stmt))?;
let mut statement = match_statement(&self.source[stmt.range()])?;
let import_from = match_import_from(&mut statement)?;
let aliases = match_aliases(import_from)?;
aliases.push(ImportAlias {
name: NameOrAttribute::N(Box::new(cstName {
aliases.push(cst::ImportAlias {
name: cst::NameOrAttribute::N(Box::new(cst::Name {
value: member,
lpar: vec![],
rpar: vec![],
@@ -491,10 +487,10 @@ impl<'a> Importer<'a> {
fn add_type_checking_block(&self, content: &str, at: TextSize) -> Result<Edit> {
let insertion = if let Some(stmt) = self.preceding_import(at) {
// Insert after the last top-level import.
Insertion::end_of_statement(stmt, self.locator, self.stylist)
Insertion::end_of_statement(stmt, self.source, self.stylist)
} else {
// Insert at the start of the file.
Insertion::start_of_file(self.python_ast, self.locator, self.stylist)
Insertion::start_of_file(self.python_ast, self.source, self.stylist)
};
if insertion.is_inline() {
Err(anyhow::anyhow!(
@@ -507,7 +503,7 @@ impl<'a> Importer<'a> {
/// Add an import statement to an existing `TYPE_CHECKING` block.
fn add_to_type_checking_block(&self, content: &str, at: TextSize) -> Edit {
Insertion::start_of_block(at, self.locator, self.stylist, self.tokens).into_edit(content)
Insertion::start_of_block(at, self.source, self.stylist, self.tokens).into_edit(content)
}
/// Return the import statement that precedes the given position, if any.


@@ -319,6 +319,9 @@ pub fn check_path(
&context,
);
}
Rule::PanicyTestRule => {
test_rules::PanicyTestRule::diagnostic(locator, comment_ranges, &context);
}
_ => unreachable!("All test rules must have an implementation"),
}
}


@@ -223,3 +223,8 @@ pub(crate) const fn is_unnecessary_default_type_args_stubs_enabled(
pub(crate) const fn is_sim910_expanded_key_support_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}
// https://github.com/astral-sh/ruff/pull/20169
pub(crate) const fn is_fix_builtin_open_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}


@@ -205,3 +205,14 @@ pub(crate) fn has_unknown_keywords_or_starred_expr(
None => true,
})
}
/// Returns `true` if argument `name` is set to a value other than its `None` default.
pub(crate) fn is_argument_non_default(
arguments: &ast::Arguments,
name: &str,
position: usize,
) -> bool {
arguments
.find_argument_value(name, position)
.is_some_and(|expr| !expr.is_none_literal_expr())
}
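A Python analogue of the helper above (illustrative, not the linter's API): an argument counts as non-default only when it is actually supplied, by keyword or by position, and is not the literal `None`.

```python
def is_argument_non_default(args, kwargs, name, position):
    # Mirror find_argument_value: look the argument up by keyword first,
    # then by position.
    if name in kwargs:
        value = kwargs[name]
    elif position < len(args):
        value = args[position]
    else:
        # An absent argument keeps its default, so it is not "non-default".
        return False
    return value is not None
```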


@@ -59,6 +59,7 @@ mod tests {
#[test_case(Rule::PyPath, Path::new("py_path_1.py"))]
#[test_case(Rule::PyPath, Path::new("py_path_2.py"))]
#[test_case(Rule::BuiltinOpen, Path::new("PTH123.py"))]
#[test_case(Rule::PathConstructorCurrentDirectory, Path::new("PTH201.py"))]
#[test_case(Rule::OsPathGetsize, Path::new("PTH202.py"))]
#[test_case(Rule::OsPathGetsize, Path::new("PTH202_2.py"))]
@@ -123,6 +124,7 @@ mod tests {
Ok(())
}
#[test_case(Rule::BuiltinOpen, Path::new("PTH123.py"))]
#[test_case(Rule::PathConstructorCurrentDirectory, Path::new("PTH201.py"))]
#[test_case(Rule::OsPathGetsize, Path::new("PTH202.py"))]
#[test_case(Rule::OsPathGetsize, Path::new("PTH202_2.py"))]


@@ -0,0 +1,191 @@
use ruff_diagnostics::{Applicability, Edit, Fix};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::{ArgOrKeyword, Expr, ExprBooleanLiteral, ExprCall};
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::preview::is_fix_builtin_open_enabled;
use crate::rules::flake8_use_pathlib::helpers::{
has_unknown_keywords_or_starred_expr, is_argument_non_default, is_file_descriptor,
is_pathlib_path_call,
};
use crate::{FixAvailability, Violation};
/// ## What it does
/// Checks for uses of the `open()` builtin.
///
/// ## Why is this bad?
/// `pathlib` offers a high-level API for path manipulation. When possible,
/// using `Path` object methods such as `Path.open()` can improve readability
/// over the `open` builtin.
///
/// ## Examples
/// ```python
/// with open("f1.py", "wb") as fp:
/// ...
/// ```
///
/// Use instead:
/// ```python
/// from pathlib import Path
///
/// with Path("f1.py").open("wb") as fp:
/// ...
/// ```
///
/// ## Known issues
/// While using `pathlib` can improve the readability and type safety of your code,
/// it can be less performant than working directly with strings,
/// especially on older versions of Python.
///
/// ## Fix Safety
/// This rule's fix is marked as unsafe if the replacement would remove comments attached to the original expression.
///
/// ## References
/// - [Python documentation: `Path.open`](https://docs.python.org/3/library/pathlib.html#pathlib.Path.open)
/// - [Python documentation: `open`](https://docs.python.org/3/library/functions.html#open)
/// - [PEP 428 - The pathlib module - object-oriented filesystem paths](https://peps.python.org/pep-0428/)
/// - [Correspondence between `os` and `pathlib`](https://docs.python.org/3/library/pathlib.html#corresponding-tools)
/// - [Why you should be using pathlib](https://treyhunner.com/2018/12/why-you-should-be-using-pathlib/)
/// - [No really, pathlib is great](https://treyhunner.com/2019/01/no-really-pathlib-is-great/)
#[derive(ViolationMetadata)]
pub(crate) struct BuiltinOpen;
impl Violation for BuiltinOpen {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
#[derive_message_formats]
fn message(&self) -> String {
"`open()` should be replaced by `Path.open()`".to_string()
}
fn fix_title(&self) -> Option<String> {
Some("Replace with `Path.open()`".to_string())
}
}
// PTH123
pub(crate) fn builtin_open(checker: &Checker, call: &ExprCall, segments: &[&str]) {
// `closefd` and `opener` are not supported by pathlib, so check if they
// are set to non-default values.
// https://github.com/astral-sh/ruff/issues/7620
// Signature as of Python 3.11 (https://docs.python.org/3/library/functions.html#open):
// ```text
// builtins.open(
// file, 0
// mode='r', 1
// buffering=-1, 2
// encoding=None, 3
// errors=None, 4
// newline=None, 5
// closefd=True, 6 <= not supported
// opener=None 7 <= not supported
// )
// ```
// For `pathlib` (https://docs.python.org/3/library/pathlib.html#pathlib.Path.open):
// ```text
// Path.open(mode='r', buffering=-1, encoding=None, errors=None, newline=None)
// ```
let file_arg = call.arguments.find_argument_value("file", 0);
if call
.arguments
.find_argument_value("closefd", 6)
.is_some_and(|expr| {
!matches!(
expr,
Expr::BooleanLiteral(ExprBooleanLiteral { value: true, .. })
)
})
|| is_argument_non_default(&call.arguments, "opener", 7)
|| file_arg.is_some_and(|expr| is_file_descriptor(expr, checker.semantic()))
{
return;
}
if !matches!(segments, ["" | "builtins", "open"]) {
return;
}
let mut diagnostic = checker.report_diagnostic(BuiltinOpen, call.func.range());
if !is_fix_builtin_open_enabled(checker.settings()) {
return;
}
let Some(file) = file_arg else {
return;
};
if has_unknown_keywords_or_starred_expr(
&call.arguments,
&[
"file",
"mode",
"buffering",
"encoding",
"errors",
"newline",
"closefd",
"opener",
],
) {
return;
}
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import("pathlib", "Path"),
call.start(),
checker.semantic(),
)?;
let locator = checker.locator();
let file_code = locator.slice(file.range());
let args = |i: usize, arg: ArgOrKeyword| match arg {
ArgOrKeyword::Arg(expr) => {
if expr.range() == file.range() || i == 6 || i == 7 {
None
} else {
Some(locator.slice(expr.range()))
}
}
ArgOrKeyword::Keyword(kw) => match kw.arg.as_deref() {
Some("mode" | "buffering" | "encoding" | "errors" | "newline") => {
Some(locator.slice(kw))
}
_ => None,
},
};
let open_args = itertools::join(
call.arguments
.arguments_source_order()
.enumerate()
.filter_map(|(i, arg)| args(i, arg)),
", ",
);
let replacement = if is_pathlib_path_call(checker, file) {
format!("{file_code}.open({open_args})")
} else {
format!("{binding}({file_code}).open({open_args})")
};
let range = call.range();
let applicability = if checker.comment_ranges().intersects(range) {
Applicability::Unsafe
} else {
Applicability::Safe
};
Ok(Fix::applicable_edits(
Edit::range_replacement(replacement, range),
[import_edit],
applicability,
))
});
}
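A runtime sanity check (not part of the rule implementation) that the PTH123 rewrite above preserves behavior in the common case: writing through the builtin `open()` and reading back through the suggested `Path.open()` replacement.

```python
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as d:
    p = f"{d}/f1.txt"
    # Before the fix: the builtin open().
    with open(p, "w", encoding="utf-8") as fp:
        fp.write("data")
    # After the fix: the Path.open() replacement.
    with Path(p).open(encoding="utf-8") as fp:
        content = fp.read()
```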


@@ -1,3 +1,4 @@
pub(crate) use builtin_open::*;
pub(crate) use glob_rule::*;
pub(crate) use invalid_pathlib_with_suffix::*;
pub(crate) use os_chmod::*;
@@ -29,6 +30,7 @@ pub(crate) use os_unlink::*;
pub(crate) use path_constructor_current_directory::*;
pub(crate) use replaceable_by_pathlib::*;
mod builtin_open;
mod glob_rule;
mod invalid_pathlib_with_suffix;
mod os_chmod;


@@ -1,13 +1,11 @@
use ruff_diagnostics::{Edit, Fix};
use ruff_diagnostics::Applicability;
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::ExprCall;
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::preview::is_fix_os_path_abspath_enabled;
use crate::rules::flake8_use_pathlib::helpers::{
has_unknown_keywords_or_starred_expr, is_pathlib_path_call,
check_os_pathlib_single_arg_calls, has_unknown_keywords_or_starred_expr,
};
use crate::{FixAvailability, Violation};
@@ -75,43 +73,17 @@ pub(crate) fn os_path_abspath(checker: &Checker, call: &ExprCall, segments: &[&s
return;
}
if call.arguments.len() != 1 {
return;
}
let Some(arg) = call.arguments.find_argument_value("path", 0) else {
return;
};
let arg_code = checker.locator().slice(arg.range());
let range = call.range();
let mut diagnostic = checker.report_diagnostic(OsPathAbspath, call.func.range());
if has_unknown_keywords_or_starred_expr(&call.arguments, &["path"]) {
return;
}
if !is_fix_os_path_abspath_enabled(checker.settings()) {
return;
}
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import("pathlib", "Path"),
call.start(),
checker.semantic(),
)?;
let replacement = if is_pathlib_path_call(checker, arg) {
format!("{arg_code}.resolve()")
} else {
format!("{binding}({arg_code}).resolve()")
};
Ok(Fix::unsafe_edits(
Edit::range_replacement(replacement, range),
[import_edit],
))
});
check_os_pathlib_single_arg_calls(
checker,
call,
"resolve()",
"path",
is_fix_os_path_abspath_enabled(checker.settings()),
OsPathAbspath,
Some(Applicability::Unsafe),
);
}


@@ -1,3 +1,4 @@
use ruff_diagnostics::Applicability;
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::ExprCall;
@@ -35,7 +36,10 @@ use crate::{FixAvailability, Violation};
/// especially on older versions of Python.
///
/// ## Fix Safety
/// This rule's fix is marked as unsafe if the replacement would remove comments attached to the original expression.
/// This rule's fix is always marked as unsafe because the behaviors of
/// `os.path.expanduser` and `Path.expanduser` differ when a user's home
/// directory can't be resolved: `os.path.expanduser` returns the
/// input unchanged, while `Path.expanduser` raises `RuntimeError`.
///
/// ## References
/// - [Python documentation: `Path.expanduser`](https://docs.python.org/3/library/pathlib.html#pathlib.Path.expanduser)
@@ -71,6 +75,6 @@ pub(crate) fn os_path_expanduser(checker: &Checker, call: &ExprCall, segments: &
"path",
is_fix_os_path_expanduser_enabled(checker.settings()),
OsPathExpanduser,
None,
Some(Applicability::Unsafe),
);
}
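The divergence described in the updated Fix Safety note can be observed directly on POSIX systems (sketch; the user name below is deliberately bogus so the home directory cannot be resolved):

```python
import os
from pathlib import Path

# A nonexistent user name, so the home directory lookup fails.
bogus = "~no_such_user_for_this_example"

# os.path.expanduser returns the input unchanged...
unchanged = os.path.expanduser(bogus)

# ...while Path.expanduser raises RuntimeError (POSIX behavior).
try:
    Path(bogus).expanduser()
    raised = False
except RuntimeError:
    raised = True
```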


@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, Expr, ExprBooleanLiteral, ExprCall};
use ruff_python_ast::{Expr, ExprCall};
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
@@ -7,7 +7,7 @@ use crate::rules::flake8_use_pathlib::helpers::{
};
use crate::rules::flake8_use_pathlib::{
rules::Glob,
violations::{BuiltinOpen, Joiner, OsListdir, OsPathJoin, OsPathSplitext, OsStat, PyPath},
violations::{Joiner, OsListdir, OsPathJoin, OsPathSplitext, OsStat, PyPath},
};
pub(crate) fn replaceable_by_pathlib(checker: &Checker, call: &ExprCall) {
@@ -60,42 +60,6 @@ pub(crate) fn replaceable_by_pathlib(checker: &Checker, call: &ExprCall) {
),
// PTH122
["os", "path", "splitext"] => checker.report_diagnostic_if_enabled(OsPathSplitext, range),
// PTH123
["" | "builtins", "open"] => {
// `closefd` and `opener` are not supported by pathlib, so check if they
// are set to non-default values.
// https://github.com/astral-sh/ruff/issues/7620
// Signature as of Python 3.11 (https://docs.python.org/3/library/functions.html#open):
// ```text
// 0 1 2 3 4 5
// open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None,
// 6 7
// closefd=True, opener=None)
// ^^^^ ^^^^
// ```
// For `pathlib` (https://docs.python.org/3/library/pathlib.html#pathlib.Path.open):
// ```text
// Path.open(mode='r', buffering=-1, encoding=None, errors=None, newline=None)
// ```
if call
.arguments
.find_argument_value("closefd", 6)
.is_some_and(|expr| {
!matches!(
expr,
Expr::BooleanLiteral(ExprBooleanLiteral { value: true, .. })
)
})
|| is_argument_non_default(&call.arguments, "opener", 7)
|| call
.arguments
.find_argument_value("file", 0)
.is_some_and(|expr| is_file_descriptor(expr, checker.semantic()))
{
return;
}
checker.report_diagnostic_if_enabled(BuiltinOpen, range)
}
// PTH124
["py", "path", "local"] => checker.report_diagnostic_if_enabled(PyPath, range),
// PTH207
@@ -151,10 +115,3 @@ pub(crate) fn replaceable_by_pathlib(checker: &Checker, call: &ExprCall) {
_ => return,
};
}
/// Returns `true` if argument `name` is set to a value other than its `None` default.
fn is_argument_non_default(arguments: &ast::Arguments, name: &str, position: usize) -> bool {
arguments
.find_argument_value(name, position)
.is_some_and(|expr| !expr.is_none_literal_expr())
}


@@ -0,0 +1,143 @@
---
source: crates/ruff_linter/src/rules/flake8_use_pathlib/mod.rs
---
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:8:1
|
6 | r_plus = "r+"
7 |
8 | builtins.open(file=_file)
| ^^^^^^^^^^^^^
9 |
10 | open(_file, "r+ ", - 1)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:10:1
|
8 | builtins.open(file=_file)
9 |
10 | open(_file, "r+ ", - 1)
| ^^^^
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:11:1
|
10 | open(_file, "r+ ", - 1)
11 | open(mode="wb", file=_file)
| ^^^^
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:12:1
|
10 | open(_file, "r+ ", - 1)
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
| ^^^^
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:13:1
|
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
| ^^^^
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:14:1
|
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
| ^^^^
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:15:1
|
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
| ^^^^
16 | open(_file, f" {r_plus} ", - 1)
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:16:1
|
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
| ^^^^
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:17:1
|
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
| ^^^^
18 |
19 | # Only diagnostic
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:20:1
|
19 | # Only diagnostic
20 | open()
| ^^^^
21 | open(_file, *_x)
22 | open(_file, "r+", unknown=True)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:21:1
|
19 | # Only diagnostic
20 | open()
21 | open(_file, *_x)
| ^^^^
22 | open(_file, "r+", unknown=True)
23 | open(_file, "r+", closefd=False)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:22:1
|
20 | open()
21 | open(_file, *_x)
22 | open(_file, "r+", unknown=True)
| ^^^^
23 | open(_file, "r+", closefd=False)
24 | open(_file, "r+", None, None, None, None, None, None, None)
|
help: Replace with `Path.open()`

View File

@@ -305,6 +305,7 @@ PTH123 `open()` should be replaced by `Path.open()`
33 | fp.read()
34 | open(p).close()
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> full_name.py:34:1
@@ -316,6 +317,7 @@ PTH123 `open()` should be replaced by `Path.open()`
35 | os.getcwdb(p)
36 | os.path.join(p, *q)
|
help: Replace with `Path.open()`
PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
--> full_name.py:35:1
@@ -360,6 +362,7 @@ PTH123 `open()` should be replaced by `Path.open()`
47 | open(p, 'r', - 1, None, None, None, True, None)
48 | open(p, 'r', - 1, None, None, None, False, opener)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> full_name.py:47:1
@@ -370,6 +373,7 @@ PTH123 `open()` should be replaced by `Path.open()`
| ^^^^
48 | open(p, 'r', - 1, None, None, None, False, opener)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> full_name.py:65:1
@@ -381,6 +385,7 @@ PTH123 `open()` should be replaced by `Path.open()`
66 | byte_str = b"bar"
67 | open(byte_str)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> full_name.py:67:1
@@ -392,6 +397,7 @@ PTH123 `open()` should be replaced by `Path.open()`
68 |
69 | def bytes_str_func() -> bytes:
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> full_name.py:71:1
@@ -403,6 +409,7 @@ PTH123 `open()` should be replaced by `Path.open()`
72 |
73 | # https://github.com/astral-sh/ruff/issues/17693
|
help: Replace with `Path.open()`
PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
--> full_name.py:108:1

View File

@@ -305,6 +305,7 @@ PTH123 `open()` should be replaced by `Path.open()`
35 | fp.read()
36 | open(p).close()
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> import_from.py:36:1
@@ -314,6 +315,7 @@ PTH123 `open()` should be replaced by `Path.open()`
36 | open(p).close()
| ^^^^
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> import_from.py:43:10
@@ -323,6 +325,7 @@ PTH123 `open()` should be replaced by `Path.open()`
43 | with open(p) as _: ... # Error
| ^^^^
|
help: Replace with `Path.open()`
PTH104 `os.rename()` should be replaced by `Path.rename()`
--> import_from.py:53:1

View File

@@ -0,0 +1,215 @@
---
source: crates/ruff_linter/src/rules/flake8_use_pathlib/mod.rs
---
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:8:1
|
6 | r_plus = "r+"
7 |
8 | builtins.open(file=_file)
| ^^^^^^^^^^^^^
9 |
10 | open(_file, "r+ ", - 1)
|
help: Replace with `Path.open()`
5 | _x = ("r+", -1)
6 | r_plus = "r+"
7 |
- builtins.open(file=_file)
8 + Path(_file).open()
9 |
10 | open(_file, "r+ ", - 1)
11 | open(mode="wb", file=_file)
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:10:1
|
8 | builtins.open(file=_file)
9 |
10 | open(_file, "r+ ", - 1)
| ^^^^
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
|
help: Replace with `Path.open()`
7 |
8 | builtins.open(file=_file)
9 |
- open(_file, "r+ ", - 1)
10 + Path(_file).open("r+ ", - 1)
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:11:1
|
10 | open(_file, "r+ ", - 1)
11 | open(mode="wb", file=_file)
| ^^^^
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
|
help: Replace with `Path.open()`
8 | builtins.open(file=_file)
9 |
10 | open(_file, "r+ ", - 1)
- open(mode="wb", file=_file)
11 + Path(_file).open(mode="wb")
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:12:1
|
10 | open(_file, "r+ ", - 1)
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
| ^^^^
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
|
help: Replace with `Path.open()`
9 |
10 | open(_file, "r+ ", - 1)
11 | open(mode="wb", file=_file)
- open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
12 + Path(_file).open(mode="r+", buffering=-1, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:13:1
|
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
| ^^^^
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
|
help: Replace with `Path.open()`
10 | open(_file, "r+ ", - 1)
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
- open(_file, "r+", - 1, None, None, None, True, None)
13 + Path(_file).open("r+", - 1, None, None, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:14:1
|
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
| ^^^^
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
|
help: Replace with `Path.open()`
11 | open(mode="wb", file=_file)
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
- open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
14 + Path(_file).open("r+", -1, None, None, None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:15:1
|
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
| ^^^^
16 | open(_file, f" {r_plus} ", - 1)
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
|
help: Replace with `Path.open()`
12 | open(mode="r+", buffering=-1, file=_file, encoding="utf-8")
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
- open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
15 + Path(_file).open(mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
18 |
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:16:1
|
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
| ^^^^
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
|
help: Replace with `Path.open()`
13 | open(_file, "r+", - 1, None, None, None, True, None)
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
- open(_file, f" {r_plus} ", - 1)
16 + Path(_file).open(f" {r_plus} ", - 1)
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
18 |
19 | # Only diagnostic
PTH123 [*] `open()` should be replaced by `Path.open()`
--> PTH123.py:17:1
|
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
17 | open(buffering=- 1, file=_file, encoding= "utf-8")
| ^^^^
18 |
19 | # Only diagnostic
|
help: Replace with `Path.open()`
14 | open(_file, "r+", -1, None, None, None, closefd=True, opener=None)
15 | open(_file, mode="r+", buffering=-1, encoding=None, errors=None, newline=None)
16 | open(_file, f" {r_plus} ", - 1)
- open(buffering=- 1, file=_file, encoding= "utf-8")
17 + Path(_file).open(buffering=- 1, encoding= "utf-8")
18 |
19 | # Only diagnostic
20 | open()
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:20:1
|
19 | # Only diagnostic
20 | open()
| ^^^^
21 | open(_file, *_x)
22 | open(_file, "r+", unknown=True)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:21:1
|
19 | # Only diagnostic
20 | open()
21 | open(_file, *_x)
| ^^^^
22 | open(_file, "r+", unknown=True)
23 | open(_file, "r+", closefd=False)
|
help: Replace with `Path.open()`
PTH123 `open()` should be replaced by `Path.open()`
--> PTH123.py:22:1
|
20 | open()
21 | open(_file, *_x)
22 | open(_file, "r+", unknown=True)
| ^^^^
23 | open(_file, "r+", closefd=False)
24 | open(_file, "r+", None, None, None, None, None, None, None)
|
help: Replace with `Path.open()`
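As a sanity check on the rewrites shown in this snapshot: when `closefd` and `opener` are left at their defaults, the before and after forms behave identically. A minimal sketch using a temporary file:

```python
import os
import tempfile
from pathlib import Path

# PTH123's fix rewrites `open(file, ...)` as `Path(file).open(...)`;
# for the default `closefd`/`opener` case the two are interchangeable.
fd, name = tempfile.mkstemp()
os.close(fd)
with open(name, "w") as fp:
    fp.write("hello")
with Path(name).open() as fp:
    content = fp.read()
os.unlink(name)
```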

View File

@@ -260,6 +260,7 @@ help: Replace with `Path(...).expanduser()`
20 | bbb = os.path.isdir(p)
21 | bbbb = os.path.isfile(p)
22 | bbbbb = os.path.islink(p)
note: This is an unsafe fix and may change runtime behavior
PTH112 [*] `os.path.isdir()` should be replaced by `Path.is_dir()`
--> full_name.py:19:7
@@ -519,7 +520,7 @@ PTH122 `os.path.splitext()` should be replaced by `Path.suffix`, `Path.stem`, an
33 | fp.read()
|
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> full_name.py:32:6
|
30 | os.path.samefile(p)
@@ -529,8 +530,24 @@ PTH123 `open()` should be replaced by `Path.open()`
33 | fp.read()
34 | open(p).close()
|
help: Replace with `Path.open()`
1 | import os
2 | import os.path
3 + import pathlib
4 |
5 | p = "/foo"
6 | q = "bar"
--------------------------------------------------------------------------------
30 | os.path.dirname(p)
31 | os.path.samefile(p)
32 | os.path.splitext(p)
- with open(p) as fp:
33 + with pathlib.Path(p).open() as fp:
34 | fp.read()
35 | open(p).close()
36 | os.getcwdb(p)
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> full_name.py:34:1
|
32 | with open(p) as fp:
@@ -540,6 +557,22 @@ PTH123 `open()` should be replaced by `Path.open()`
35 | os.getcwdb(p)
36 | os.path.join(p, *q)
|
help: Replace with `Path.open()`
1 | import os
2 | import os.path
3 + import pathlib
4 |
5 | p = "/foo"
6 | q = "bar"
--------------------------------------------------------------------------------
32 | os.path.splitext(p)
33 | with open(p) as fp:
34 | fp.read()
- open(p).close()
35 + pathlib.Path(p).open().close()
36 | os.getcwdb(p)
37 | os.path.join(p, *q)
38 | os.sep.join(p, *q)
PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
--> full_name.py:35:1
@@ -574,7 +607,7 @@ PTH118 `os.sep.join()` should be replaced by `Path.joinpath()`
39 | # https://github.com/astral-sh/ruff/issues/7620
|
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> full_name.py:46:1
|
44 | open(p, closefd=False)
@@ -584,8 +617,24 @@ PTH123 `open()` should be replaced by `Path.open()`
47 | open(p, 'r', - 1, None, None, None, True, None)
48 | open(p, 'r', - 1, None, None, None, False, opener)
|
help: Replace with `Path.open()`
1 | import os
2 | import os.path
3 + import pathlib
4 |
5 | p = "/foo"
6 | q = "bar"
--------------------------------------------------------------------------------
44 |
45 | open(p, closefd=False)
46 | open(p, opener=opener)
- open(p, mode='r', buffering=-1, encoding=None, errors=None, newline=None, closefd=True, opener=None)
47 + pathlib.Path(p).open(mode='r', buffering=-1, encoding=None, errors=None, newline=None)
48 | open(p, 'r', - 1, None, None, None, True, None)
49 | open(p, 'r', - 1, None, None, None, False, opener)
50 |
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> full_name.py:47:1
|
45 | open(p, opener=opener)
@@ -594,8 +643,24 @@ PTH123 `open()` should be replaced by `Path.open()`
| ^^^^
48 | open(p, 'r', - 1, None, None, None, False, opener)
|
help: Replace with `Path.open()`
1 | import os
2 | import os.path
3 + import pathlib
4 |
5 | p = "/foo"
6 | q = "bar"
--------------------------------------------------------------------------------
45 | open(p, closefd=False)
46 | open(p, opener=opener)
47 | open(p, mode='r', buffering=-1, encoding=None, errors=None, newline=None, closefd=True, opener=None)
- open(p, 'r', - 1, None, None, None, True, None)
48 + pathlib.Path(p).open('r', - 1, None, None, None)
49 | open(p, 'r', - 1, None, None, None, False, opener)
50 |
51 | # Cannot be upgraded `pathlib.Open` does not support fds
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> full_name.py:65:1
|
63 | open(f())
@@ -605,8 +670,24 @@ PTH123 `open()` should be replaced by `Path.open()`
66 | byte_str = b"bar"
67 | open(byte_str)
|
help: Replace with `Path.open()`
1 | import os
2 | import os.path
3 + import pathlib
4 |
5 | p = "/foo"
6 | q = "bar"
--------------------------------------------------------------------------------
63 | return 1
64 | open(f())
65 |
- open(b"foo")
66 + pathlib.Path(b"foo").open()
67 | byte_str = b"bar"
68 | open(byte_str)
69 |
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> full_name.py:67:1
|
65 | open(b"foo")
@@ -616,8 +697,24 @@ PTH123 `open()` should be replaced by `Path.open()`
68 |
69 | def bytes_str_func() -> bytes:
|
help: Replace with `Path.open()`
1 | import os
2 | import os.path
3 + import pathlib
4 |
5 | p = "/foo"
6 | q = "bar"
--------------------------------------------------------------------------------
65 |
66 | open(b"foo")
67 | byte_str = b"bar"
- open(byte_str)
68 + pathlib.Path(byte_str).open()
69 |
70 | def bytes_str_func() -> bytes:
71 | return b"foo"
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> full_name.py:71:1
|
69 | def bytes_str_func() -> bytes:
@@ -627,6 +724,22 @@ PTH123 `open()` should be replaced by `Path.open()`
72 |
73 | # https://github.com/astral-sh/ruff/issues/17693
|
help: Replace with `Path.open()`
1 | import os
2 | import os.path
3 + import pathlib
4 |
5 | p = "/foo"
6 | q = "bar"
--------------------------------------------------------------------------------
69 |
70 | def bytes_str_func() -> bytes:
71 | return b"foo"
- open(bytes_str_func())
72 + pathlib.Path(bytes_str_func()).open()
73 |
74 | # https://github.com/astral-sh/ruff/issues/17693
75 | os.stat(1)
PTH109 [*] `os.getcwd()` should be replaced by `Path.cwd()`
--> full_name.py:108:1

View File

@@ -260,6 +260,7 @@ help: Replace with `Path(...).expanduser()`
20 | bbb = foo_p.isdir(p)
21 | bbbb = foo_p.isfile(p)
22 | bbbbb = foo_p.islink(p)
note: This is an unsafe fix and may change runtime behavior
PTH112 [*] `os.path.isdir()` should be replaced by `Path.is_dir()`
--> import_as.py:19:7

View File

@@ -268,6 +268,7 @@ help: Replace with `Path(...).expanduser()`
22 | bbb = isdir(p)
23 | bbbb = isfile(p)
24 | bbbbb = islink(p)
note: This is an unsafe fix and may change runtime behavior
PTH112 [*] `os.path.isdir()` should be replaced by `Path.is_dir()`
--> import_from.py:21:7
@@ -534,7 +535,7 @@ PTH122 `os.path.splitext()` should be replaced by `Path.suffix`, `Path.stem`, an
35 | fp.read()
|
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> import_from.py:34:6
|
32 | samefile(p)
@@ -544,8 +545,25 @@ PTH123 `open()` should be replaced by `Path.open()`
35 | fp.read()
36 | open(p).close()
|
help: Replace with `Path.open()`
2 | from os import remove, unlink, getcwd, readlink, stat
3 | from os.path import abspath, exists, expanduser, isdir, isfile, islink
4 | from os.path import isabs, join, basename, dirname, samefile, splitext
5 + import pathlib
6 |
7 | p = "/foo"
8 | q = "bar"
--------------------------------------------------------------------------------
32 | dirname(p)
33 | samefile(p)
34 | splitext(p)
- with open(p) as fp:
35 + with pathlib.Path(p).open() as fp:
36 | fp.read()
37 | open(p).close()
38 |
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> import_from.py:36:1
|
34 | with open(p) as fp:
@@ -553,8 +571,25 @@ PTH123 `open()` should be replaced by `Path.open()`
36 | open(p).close()
| ^^^^
|
help: Replace with `Path.open()`
2 | from os import remove, unlink, getcwd, readlink, stat
3 | from os.path import abspath, exists, expanduser, isdir, isfile, islink
4 | from os.path import isabs, join, basename, dirname, samefile, splitext
5 + import pathlib
6 |
7 | p = "/foo"
8 | q = "bar"
--------------------------------------------------------------------------------
34 | splitext(p)
35 | with open(p) as fp:
36 | fp.read()
- open(p).close()
37 + pathlib.Path(p).open().close()
38 |
39 |
40 | # https://github.com/astral-sh/ruff/issues/15442
PTH123 `open()` should be replaced by `Path.open()`
PTH123 [*] `open()` should be replaced by `Path.open()`
--> import_from.py:43:10
|
41 | from builtins import open
@@ -562,6 +597,23 @@ PTH123 `open()` should be replaced by `Path.open()`
43 | with open(p) as _: ... # Error
| ^^^^
|
help: Replace with `Path.open()`
2 | from os import remove, unlink, getcwd, readlink, stat
3 | from os.path import abspath, exists, expanduser, isdir, isfile, islink
4 | from os.path import isabs, join, basename, dirname, samefile, splitext
5 + import pathlib
6 |
7 | p = "/foo"
8 | q = "bar"
--------------------------------------------------------------------------------
41 | def _():
42 | from builtins import open
43 |
- with open(p) as _: ... # Error
44 + with pathlib.Path(p).open() as _: ... # Error
45 |
46 |
47 | def _():
PTH104 [*] `os.rename()` should be replaced by `Path.rename()`
--> import_from.py:53:1

View File

@@ -268,6 +268,7 @@ help: Replace with `Path(...).expanduser()`
27 | bbb = xisdir(p)
28 | bbbb = xisfile(p)
29 | bbbbb = xislink(p)
note: This is an unsafe fix and may change runtime behavior
PTH112 [*] `os.path.isdir()` should be replaced by `Path.is_dir()`
--> import_from_as.py:26:7

View File

@@ -174,50 +174,6 @@ impl Violation for OsPathSplitext {
}
}
/// ## What it does
/// Checks for uses of the `open()` builtin.
///
/// ## Why is this bad?
/// `pathlib` offers a high-level API for path manipulation. When possible,
/// using `Path` object methods such as `Path.open()` can improve readability
/// over the `open` builtin.
///
/// ## Examples
/// ```python
/// with open("f1.py", "wb") as fp:
/// ...
/// ```
///
/// Use instead:
/// ```python
/// from pathlib import Path
///
/// with Path("f1.py").open("wb") as fp:
/// ...
/// ```
///
/// ## Known issues
/// While using `pathlib` can improve the readability and type safety of your code,
/// it can be less performant than working directly with strings,
/// especially on older versions of Python.
///
/// ## References
/// - [Python documentation: `Path.open`](https://docs.python.org/3/library/pathlib.html#pathlib.Path.open)
/// - [Python documentation: `open`](https://docs.python.org/3/library/functions.html#open)
/// - [PEP 428 The pathlib module object-oriented filesystem paths](https://peps.python.org/pep-0428/)
/// - [Correspondence between `os` and `pathlib`](https://docs.python.org/3/library/pathlib.html#corresponding-tools)
/// - [Why you should be using pathlib](https://treyhunner.com/2018/12/why-you-should-be-using-pathlib/)
/// - [No really, pathlib is great](https://treyhunner.com/2019/01/no-really-pathlib-is-great/)
#[derive(ViolationMetadata)]
pub(crate) struct BuiltinOpen;
impl Violation for BuiltinOpen {
#[derive_message_formats]
fn message(&self) -> String {
"`open()` should be replaced by `Path.open()`".to_string()
}
}
/// ## What it does
/// Checks for uses of the `py.path` library.
///

View File

@@ -125,7 +125,8 @@ fn add_required_import(
TextRange::default(),
);
diagnostic.set_fix(Fix::safe_edit(
Importer::new(parsed, locator, stylist).add_import(required_import, TextSize::default()),
Importer::new(parsed, locator.contents(), stylist)
.add_import(required_import, TextSize::default()),
));
}

View File

@@ -836,7 +836,10 @@ impl<'a, 'b> BlankLinesChecker<'a, 'b> {
// Allow groups of one-liners.
&& !(state.follows.is_any_def() && line.last_token != TokenKind::Colon)
&& !state.follows.follows_def_with_dummy_body()
// Only for class scope: we must be inside a class block
&& matches!(state.class_status, Status::Inside(_))
// But NOT inside a function body; nested defs inside methods are handled by E306
&& matches!(state.fn_status, Status::Outside | Status::CommentAfter(_))
// The class/parent method's docstring can directly precede the def.
// Allow following a decorator (if there is an error it will be triggered on the first decorator).
&& !matches!(state.follows, Follows::Docstring | Follows::Decorator)

View File

@@ -94,3 +94,22 @@ help: Add missing blank line
527 | def bar(self, x: int | str) -> int | str:
528 | return x
529 | # end
E301 [*] Expected 1 blank line, found 0
--> E30.py:993:9
|
991 | if True:
992 | print("conditional")
993 | def test():
| ^^^
994 | pass
995 | # end
|
help: Add missing blank line
990 | class Foo:
991 | if True:
992 | print("conditional")
993 +
994 | def test():
995 | pass
996 | # end
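The corrected form that the E301 fix produces can be sketched as follows; the class and method names mirror the fixture above and are otherwise illustrative:

```python
# E301: inside a class body, a `def` that follows other statements needs
# one blank line above it. The fix inserts that blank line.
class Foo:
    if True:
        print("conditional")

    def test(self):  # blank line above satisfies E301
        return "ok"
```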

View File

@@ -208,3 +208,60 @@ help: Add missing blank line
926 | async def b():
927 | pass
928 | # end
E306 [*] Expected 1 blank line before a nested definition, found 0
--> E30.py:983:9
|
981 | async def recv(self, *, length=65536):
982 | loop = asyncio.get_event_loop()
983 | def callback():
| ^^^
984 | loop.remove_reader(self._fd)
985 | loop.add_reader(self._fd, callback)
|
help: Add missing blank line
980 | class foo:
981 | async def recv(self, *, length=65536):
982 | loop = asyncio.get_event_loop()
983 +
984 | def callback():
985 | loop.remove_reader(self._fd)
986 | loop.add_reader(self._fd, callback)
E306 [*] Expected 1 blank line before a nested definition, found 0
--> E30.py:1002:9
|
1000 | def f():
1001 | x = 1
1002 | def g():
| ^^^
1003 | return 1
1004 | return 2
|
help: Add missing blank line
999 | class Bar:
1000 | def f():
1001 | x = 1
1002 +
1003 | def g():
1004 | return 1
1005 | return 2
E306 [*] Expected 1 blank line before a nested definition, found 0
--> E30.py:1009:13
|
1007 | class Baz:
1008 | x = 1
1009 | def g():
| ^^^
1010 | return 1
1011 | return 2
|
help: Add missing blank line
1006 | def f():
1007 | class Baz:
1008 | x = 1
1009 +
1010 | def g():
1011 | return 1
1012 | return 2
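A minimal runnable illustration of what E306 enforces, with the blank line the fix inserts already in place (names are illustrative):

```python
# E306: a nested definition that follows other statements in the enclosing
# function body needs one blank line above it.
def f():
    x = 1

    def g():  # blank line above satisfies E306
        return x + 1

    return g()
```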

View File

@@ -17,6 +17,7 @@ use crate::Locator;
use crate::checkers::ast::Checker;
use crate::cst::matchers::{match_import, match_import_from, match_statement};
use crate::fix::codemods::CodegenStylist;
use crate::rules::pyupgrade::rules::is_import_required_by_isort;
use crate::{AlwaysFixableViolation, Edit, Fix};
#[derive(Debug, PartialEq, Eq, Copy, Clone)]
@@ -299,7 +300,13 @@ pub(crate) fn deprecated_mock_import(checker: &Checker, stmt: &Stmt) {
// Add a `Diagnostic` for each `mock` import.
for name in names {
if &name.name == "mock" || &name.name == "mock.mock" {
if (&name.name == "mock" || &name.name == "mock.mock")
&& !is_import_required_by_isort(
&checker.settings().isort.required_imports,
stmt.into(),
name,
)
{
let mut diagnostic = checker.report_diagnostic(
DeprecatedMockImport {
reference_type: MockReference::Import,
@@ -319,6 +326,7 @@ pub(crate) fn deprecated_mock_import(checker: &Checker, stmt: &Stmt) {
Stmt::ImportFrom(ast::StmtImportFrom {
module: Some(module),
level,
names,
..
}) => {
if *level > 0 {
@@ -326,6 +334,17 @@ pub(crate) fn deprecated_mock_import(checker: &Checker, stmt: &Stmt) {
}
if module == "mock" {
if names.iter().any(|alias| {
alias.name.as_str() == "mock"
&& is_import_required_by_isort(
&checker.settings().isort.required_imports,
stmt.into(),
alias,
)
}) {
return;
}
let mut diagnostic = checker.report_diagnostic(
DeprecatedMockImport {
reference_type: MockReference::Import,

View File

@@ -2,10 +2,10 @@ use ruff_python_ast::{self as ast, Expr, Stmt};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::name::{QualifiedName, UnqualifiedName};
use ruff_python_semantic::SemanticModel;
use ruff_python_semantic::analyze::typing::{
is_immutable_annotation, is_immutable_func, is_immutable_newtype_call,
};
use ruff_python_semantic::{ScopeId, SemanticModel};
use ruff_text_size::Ranged;
use crate::Violation;
@@ -77,7 +77,11 @@ impl Violation for FunctionCallInDataclassDefaultArgument {
}
/// RUF009
pub(crate) fn function_call_in_dataclass_default(checker: &Checker, class_def: &ast::StmtClassDef) {
pub(crate) fn function_call_in_dataclass_default(
checker: &Checker,
class_def: &ast::StmtClassDef,
scope_id: ScopeId,
) {
let semantic = checker.semantic();
let Some((dataclass_kind, _)) = dataclass_kind(class_def, semantic) else {
@@ -144,7 +148,7 @@ pub(crate) fn function_call_in_dataclass_default(checker: &Checker, class_def: &
|| func.as_name_expr().is_some_and(|name| {
is_immutable_newtype_call(name, checker.semantic(), &extend_immutable_calls)
})
|| is_frozen_dataclass_instantiation(func, semantic)
|| is_frozen_dataclass_instantiation(func, semantic, scope_id)
{
continue;
}
@@ -165,16 +169,22 @@ fn any_annotated(class_body: &[Stmt]) -> bool {
/// Checks whether the passed expression is an instantiation of a class;
/// retrieves the ``StmtClassDef`` and verifies that it is a frozen dataclass.
fn is_frozen_dataclass_instantiation(func: &Expr, semantic: &SemanticModel) -> bool {
semantic.lookup_attribute(func).is_some_and(|id| {
let binding = &semantic.binding(id);
let Some(Stmt::ClassDef(class_def)) = binding.statement(semantic) else {
return false;
};
fn is_frozen_dataclass_instantiation(
func: &Expr,
semantic: &SemanticModel,
scope_id: ScopeId,
) -> bool {
semantic
.lookup_attribute_in_scope(func, scope_id)
.is_some_and(|id| {
let binding = &semantic.binding(id);
let Some(Stmt::ClassDef(class_def)) = binding.statement(semantic) else {
return false;
};
let Some((_, dataclass_decorator)) = dataclass_kind(class_def, semantic) else {
return false;
};
is_frozen_dataclass(dataclass_decorator, semantic)
})
let Some((_, dataclass_decorator)) = dataclass_kind(class_def, semantic) else {
return false;
};
is_frozen_dataclass(dataclass_decorator, semantic)
})
}
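The frozen-dataclass exemption above can be illustrated from the Python side; `Point` and `Config` are hypothetical names, not from the test suite:

```python
from dataclasses import dataclass, field


# RUF009 flags function calls in dataclass field defaults because the
# resulting object is shared across instances. A frozen dataclass instance
# is immutable, so sharing it is harmless and the call is exempt.
@dataclass(frozen=True)
class Point:
    x: int = 0


@dataclass
class Config:
    origin: Point = Point()  # exempt: frozen dataclass instantiation
    tags: list = field(default_factory=list)  # mutable defaults need a factory


c = Config()
```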

View File

@@ -46,6 +46,7 @@ pub(crate) const TEST_RULES: &[Rule] = &[
Rule::RedirectedFromTestRule,
Rule::RedirectedToTestRule,
Rule::RedirectedFromPrefixTestRule,
Rule::PanicyTestRule,
];
pub(crate) trait TestRule {
@@ -477,3 +478,39 @@ impl TestRule for RedirectedFromPrefixTestRule {
);
}
}
/// # What it does
/// Fake rule for testing panics.
///
/// # Why is this bad?
/// Panics are bad.
///
/// # Example
/// ```python
/// foo
/// ```
///
/// Use instead:
/// ```python
/// bar
/// ```
#[derive(ViolationMetadata)]
pub(crate) struct PanicyTestRule;
impl Violation for PanicyTestRule {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::None;
#[derive_message_formats]
fn message(&self) -> String {
"If you see this, maybe panic!".to_string()
}
}
impl TestRule for PanicyTestRule {
fn diagnostic(_locator: &Locator, _comment_ranges: &CommentRanges, context: &LintContext) {
assert!(
!context.source_file().name().ends_with("panic.py"),
"This is a fake panic for testing."
);
}
}

View File

@@ -1,5 +1,6 @@
//! Detect code style from Python source code.
use std::borrow::Cow;
use std::cell::OnceCell;
use std::ops::Deref;
@@ -10,34 +11,43 @@ use ruff_text_size::Ranged;
#[derive(Debug, Clone)]
pub struct Stylist<'a> {
source: &'a str,
source: Cow<'a, str>,
indentation: Indentation,
quote: Quote,
line_ending: OnceCell<LineEnding>,
}
impl<'a> Stylist<'a> {
pub fn indentation(&'a self) -> &'a Indentation {
pub fn indentation(&self) -> &Indentation {
&self.indentation
}
pub fn quote(&'a self) -> Quote {
pub fn quote(&self) -> Quote {
self.quote
}
pub fn line_ending(&'a self) -> LineEnding {
pub fn line_ending(&self) -> LineEnding {
*self.line_ending.get_or_init(|| {
find_newline(self.source)
find_newline(&self.source)
.map(|(_, ending)| ending)
.unwrap_or_default()
})
}
pub fn into_owned(self) -> Stylist<'static> {
Stylist {
source: Cow::Owned(self.source.into_owned()),
indentation: self.indentation,
quote: self.quote,
line_ending: self.line_ending,
}
}
pub fn from_tokens(tokens: &Tokens, source: &'a str) -> Self {
let indentation = detect_indentation(tokens, source);
Self {
source,
source: Cow::Borrowed(source),
indentation,
quote: detect_quote(tokens),
line_ending: OnceCell::default(),

View File

@@ -0,0 +1,30 @@
[package]
name = "ruff_python_importer"
version = "0.0.0"
publish = false
authors = { workspace = true }
edition = { workspace = true }
rust-version = { workspace = true }
homepage = { workspace = true }
documentation = { workspace = true }
repository = { workspace = true }
license = { workspace = true }
[dependencies]
ruff_diagnostics = { workspace = true }
ruff_python_ast = { workspace = true }
ruff_python_codegen = { workspace = true }
ruff_python_parser = { workspace = true }
ruff_python_trivia = { workspace = true }
ruff_source_file = { workspace = true, features = ["serde"] }
ruff_text_size = { workspace = true }
anyhow = { workspace = true }
[dev-dependencies]
insta = { workspace = true }
[features]
[lints]
workspace = true

View File

@@ -1,6 +1,8 @@
//! Insert statements into Python code.
use std::ops::Add;
use ruff_diagnostics::Edit;
use ruff_python_ast::Stmt;
use ruff_python_ast::helpers::is_docstring_stmt;
use ruff_python_codegen::Stylist;
@@ -10,9 +12,6 @@ use ruff_python_trivia::{PythonWhitespace, textwrap::indent};
use ruff_source_file::{LineRanges, UniversalNewlineIterator};
use ruff_text_size::{Ranged, TextSize};
use crate::Edit;
use crate::Locator;
#[derive(Debug, Clone, PartialEq, Eq)]
pub(super) enum Placement<'a> {
/// The content will be inserted inline with the existing code (i.e., within semicolon-delimited
@@ -25,7 +24,7 @@ pub(super) enum Placement<'a> {
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub(super) struct Insertion<'a> {
pub struct Insertion<'a> {
/// The content to add before the insertion.
prefix: &'a str,
/// The location at which to insert.
@@ -50,33 +49,31 @@ impl<'a> Insertion<'a> {
///
/// The insertion returned will begin at the start of the `import os` statement, and will
/// include a trailing newline.
pub(super) fn start_of_file(
body: &[Stmt],
locator: &Locator,
stylist: &Stylist,
) -> Insertion<'static> {
pub fn start_of_file(body: &[Stmt], contents: &str, stylist: &Stylist) -> Insertion<'static> {
// Skip over any docstrings.
let mut location = if let Some(mut location) = match_docstring_end(body) {
// If the first token after the docstring is a semicolon, insert after the semicolon as
// an inline statement.
if let Some(offset) = match_semicolon(locator.after(location)) {
if let Some(offset) = match_semicolon(&contents[location.to_usize()..]) {
return Insertion::inline(" ", location.add(offset).add(TextSize::of(';')), ";");
}
// While the first token after the docstring is a continuation character (i.e. "\"), advance
// additional rows to prevent inserting in the same logical line.
while match_continuation(locator.after(location)).is_some() {
location = locator.full_line_end(location);
while match_continuation(&contents[location.to_usize()..]).is_some() {
location = contents.full_line_end(location);
}
// Otherwise, advance to the next row.
locator.full_line_end(location)
contents.full_line_end(location)
} else {
locator.bom_start_offset()
contents.bom_start_offset()
};
// Skip over commented lines, with whitespace separation.
for line in UniversalNewlineIterator::with_offset(locator.after(location), location) {
for line in
UniversalNewlineIterator::with_offset(&contents[location.to_usize()..], location)
{
let trimmed_line = line.trim_whitespace_start();
if trimmed_line.is_empty() {
continue;
@@ -111,17 +108,13 @@ impl<'a> Insertion<'a> {
/// in this case is the line after `import math`, and will include a trailing newline.
///
/// The statement itself is assumed to be at the top-level of the module.
pub(super) fn end_of_statement(
stmt: &Stmt,
locator: &Locator,
stylist: &Stylist,
) -> Insertion<'static> {
pub fn end_of_statement(stmt: &Stmt, contents: &str, stylist: &Stylist) -> Insertion<'static> {
let location = stmt.end();
if let Some(offset) = match_semicolon(locator.after(location)) {
if let Some(offset) = match_semicolon(&contents[location.to_usize()..]) {
// If the first token after the statement is a semicolon, insert after the semicolon as
// an inline statement.
Insertion::inline(" ", location.add(offset).add(TextSize::of(';')), ";")
} else if match_continuation(locator.after(location)).is_some() {
} else if match_continuation(&contents[location.to_usize()..]).is_some() {
// If the first token after the statement is a continuation, insert after the statement
// with a semicolon.
Insertion::inline("; ", location, "")
@@ -129,12 +122,63 @@ impl<'a> Insertion<'a> {
// Otherwise, insert on the next line.
Insertion::own_line(
"",
locator.full_line_end(location),
contents.full_line_end(location),
stylist.line_ending().as_str(),
)
}
}
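The docstring-skipping step of `start_of_file` above can be sketched in Python using the standard `ast` module. This is a simplified illustration only: it handles the leading-docstring case and ignores the semicolon, continuation, and leading-comment handling that the Rust implementation covers.

```python
import ast


def import_insert_line(source: str) -> int:
    """Return the 1-based line after which a new import should be placed,
    skipping a leading module docstring. Returns 0 to insert at the top."""
    body = ast.parse(source).body
    if (
        body
        and isinstance(body[0], ast.Expr)
        and isinstance(body[0].value, ast.Constant)
        and isinstance(body[0].value.value, str)
    ):
        # The module starts with a docstring: insert after it.
        return body[0].end_lineno
    return 0


print(import_insert_line('"""Hello, world!"""\n\nx = 1\n'))  # 1
print(import_insert_line("x = 1\n"))  # 0
```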
/// Create an [`Insertion`] to insert an additional member to import
/// into a `from <module> import member1, member2, ...` statement.
///
/// For example, given the following code:
///
/// ```python
/// """Hello, world!"""
///
/// from collections import Counter
///
///
/// def foo():
/// pass
/// ```
///
/// The insertion returned will begin after `Counter` but before the
/// newline terminator. Callers can then call [`Insertion::into_edit`]
/// with the additional member to add. A comma delimiter is handled
/// automatically.
///
/// The statement itself is assumed to be at the top-level of the module.
///
/// This returns `None` when `stmt` isn't a `from ... import ...`
/// statement.
pub fn existing_import(stmt: &Stmt, tokens: &Tokens) -> Option<Insertion<'static>> {
let Stmt::ImportFrom(ref import_from) = *stmt else {
return None;
};
if let Some(at) = import_from.names.last().map(Ranged::end) {
return Some(Insertion::inline(", ", at, ""));
}
// Our AST can deal with partial `from ... import`
// statements, so we might not have any members
// yet. In this case, we don't need the comma.
//
// ... however, unless we can be certain that
// inserting this name leads to a valid AST, we
// give up.
let at = import_from.end();
if !matches!(
tokens
.before(at)
.last()
.map(ruff_python_parser::Token::kind),
Some(TokenKind::Import)
) {
return None;
}
Some(Insertion::inline(" ", at, ""))
}
/// Create an [`Insertion`] to insert (e.g.) an import statement at the start of a given
/// block, along with a prefix and suffix to use for the insertion.
///
@@ -149,9 +193,9 @@ impl<'a> Insertion<'a> {
/// include a trailing newline.
///
/// The block itself is assumed to be at the top-level of the module.
pub(super) fn start_of_block(
pub fn start_of_block(
mut location: TextSize,
locator: &Locator<'a>,
contents: &'a str,
stylist: &Stylist,
tokens: &Tokens,
) -> Insertion<'a> {
@@ -204,7 +248,7 @@ impl<'a> Insertion<'a> {
"",
token.start(),
stylist.line_ending().as_str(),
locator.slice(token),
&contents[token.range()],
);
}
_ => {
@@ -220,7 +264,7 @@ impl<'a> Insertion<'a> {
}
/// Convert this [`Insertion`] into an [`Edit`] that inserts the given content.
pub(super) fn into_edit(self, content: &str) -> Edit {
pub fn into_edit(self, content: &str) -> Edit {
let Insertion {
prefix,
location,
@@ -240,7 +284,7 @@ impl<'a> Insertion<'a> {
}
/// Returns `true` if this [`Insertion`] is inline.
pub(super) fn is_inline(&self) -> bool {
pub fn is_inline(&self) -> bool {
matches!(self.placement, Placement::Inline)
}
@@ -321,9 +365,7 @@ mod tests {
use ruff_python_codegen::Stylist;
use ruff_python_parser::parse_module;
use ruff_source_file::LineEnding;
use ruff_text_size::TextSize;
use crate::Locator;
use ruff_text_size::{Ranged, TextSize};
use super::Insertion;
@@ -331,9 +373,8 @@ mod tests {
fn start_of_file() -> Result<()> {
fn insert(contents: &str) -> Result<Insertion<'_>> {
let parsed = parse_module(contents)?;
let locator = Locator::new(contents);
let stylist = Stylist::from_tokens(parsed.tokens(), locator.contents());
Ok(Insertion::start_of_file(parsed.suite(), &locator, &stylist))
let stylist = Stylist::from_tokens(parsed.tokens(), contents);
Ok(Insertion::start_of_file(parsed.suite(), contents, &stylist))
}
let contents = "";
@@ -463,9 +504,8 @@ x = 1
fn start_of_block() {
fn insert(contents: &str, offset: TextSize) -> Insertion<'_> {
let parsed = parse_module(contents).unwrap();
let locator = Locator::new(contents);
let stylist = Stylist::from_tokens(parsed.tokens(), locator.contents());
Insertion::start_of_block(offset, &locator, &stylist, parsed.tokens())
let stylist = Stylist::from_tokens(parsed.tokens(), contents);
Insertion::start_of_block(offset, contents, &stylist, parsed.tokens())
}
let contents = "if True: pass";
@@ -484,4 +524,286 @@ if True:
Insertion::indented("", TextSize::from(9), "\n", " ")
);
}
#[test]
fn existing_import_works() {
fn snapshot(content: &str, member: &str) -> String {
let parsed = parse_module(content).unwrap();
let edit = Insertion::existing_import(parsed.suite().first().unwrap(), parsed.tokens())
.unwrap()
.into_edit(member);
let insert_text = edit.content().expect("edit should be non-empty");
let mut content = content.to_string();
content.replace_range(edit.range().to_std_range(), insert_text);
content
}
let source = r#"
from collections import Counter
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import Counter, defaultdict
",
);
let source = r#"
from collections import Counter, OrderedDict
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import Counter, OrderedDict, defaultdict
",
);
let source = r#"
from collections import (Counter)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@"from collections import (Counter, defaultdict)",
);
let source = r#"
from collections import (Counter, OrderedDict)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@"from collections import (Counter, OrderedDict, defaultdict)",
);
let source = r#"
from collections import (Counter,)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@"from collections import (Counter, defaultdict,)",
);
let source = r#"
from collections import (Counter, OrderedDict,)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@"from collections import (Counter, OrderedDict, defaultdict,)",
);
let source = r#"
from collections import (
Counter
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter, defaultdict
)
",
);
let source = r#"
from collections import (
Counter,
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter, defaultdict,
)
",
);
let source = r#"
from collections import (
Counter,
OrderedDict
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter,
OrderedDict, defaultdict
)
",
);
let source = r#"
from collections import (
Counter,
OrderedDict,
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter,
OrderedDict, defaultdict,
)
",
);
let source = r#"
from collections import \
Counter
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import \
Counter, defaultdict
",
);
let source = r#"
from collections import \
Counter, OrderedDict
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import \
Counter, OrderedDict, defaultdict
",
);
let source = r#"
from collections import \
Counter, \
OrderedDict
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import \
Counter, \
OrderedDict, defaultdict
",
);
/*
from collections import (
Collector # comment
)
from collections import (
Collector, # comment
)
from collections import (
Collector # comment
,
)
from collections import (
Collector
# comment
,
)
*/
let source = r#"
from collections import (
Counter # comment
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter, defaultdict # comment
)
",
);
let source = r#"
from collections import (
Counter, # comment
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter, defaultdict, # comment
)
",
);
let source = r#"
from collections import (
Counter # comment
,
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter, defaultdict # comment
,
)
",
);
let source = r#"
from collections import (
Counter
# comment
,
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
Counter, defaultdict
# comment
,
)
",
);
let source = r#"
from collections import (
# comment 1
Counter # comment 2
# comment 3
)
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@r"
from collections import (
# comment 1
Counter, defaultdict # comment 2
# comment 3
)
",
);
let source = r#"
from collections import Counter # comment
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@"from collections import Counter, defaultdict # comment",
);
let source = r#"
from collections import Counter, OrderedDict # comment
"#;
insta::assert_snapshot!(
snapshot(source, "defaultdict"),
@"from collections import Counter, OrderedDict, defaultdict # comment",
);
}
}

View File

@@ -0,0 +1,7 @@
/*!
Low-level helpers for manipulating Python import statements.
*/
pub use self::insertion::Insertion;
mod insertion;

View File

@@ -0,0 +1,8 @@
def f1():
from module import *
class C:
from module import *
def f2():
from ..module import *
def f3():
from module import *, *

View File

@@ -0,0 +1 @@
from module import *

View File

@@ -3,16 +3,16 @@
//! This checker is not responsible for traversing the AST itself. Instead, its
//! [`SemanticSyntaxChecker::visit_stmt`] and [`SemanticSyntaxChecker::visit_expr`] methods should
//! be called in a parent `Visitor`'s `visit_stmt` and `visit_expr` methods, respectively.
use std::fmt::Display;
use ruff_python_ast::{
self as ast, Expr, ExprContext, IrrefutablePatternKind, Pattern, PythonVersion, Stmt, StmtExpr,
StmtImportFrom,
comparable::ComparableExpr,
helpers,
visitor::{Visitor, walk_expr},
};
use ruff_text_size::{Ranged, TextRange, TextSize};
use rustc_hash::{FxBuildHasher, FxHashSet};
use std::fmt::Display;
#[derive(Debug, Default)]
pub struct SemanticSyntaxChecker {
@@ -58,10 +58,40 @@ impl SemanticSyntaxChecker {
fn check_stmt<Ctx: SemanticSyntaxContext>(&mut self, stmt: &ast::Stmt, ctx: &Ctx) {
match stmt {
Stmt::ImportFrom(StmtImportFrom { range, module, .. }) => {
Stmt::ImportFrom(StmtImportFrom {
range,
module,
level,
names,
..
}) => {
if self.seen_futures_boundary && matches!(module.as_deref(), Some("__future__")) {
Self::add_error(ctx, SemanticSyntaxErrorKind::LateFutureImport, *range);
}
for alias in names {
if alias.name.as_str() == "*" && !ctx.in_module_scope() {
// test_err import_from_star
// def f1():
// from module import *
// class C:
// from module import *
// def f2():
// from ..module import *
// def f3():
// from module import *, *
// test_ok import_from_star
// from module import *
Self::add_error(
ctx,
SemanticSyntaxErrorKind::NonModuleImportStar(
helpers::format_import_from(*level, module.as_deref()).to_string(),
),
*range,
);
break;
}
}
}
Stmt::Match(match_stmt) => {
Self::irrefutable_match_case(match_stmt, ctx);
@@ -1002,6 +1032,9 @@ impl Display for SemanticSyntaxError {
SemanticSyntaxErrorKind::YieldFromInAsyncFunction => {
f.write_str("`yield from` statement in async function; use `async for` instead")
}
SemanticSyntaxErrorKind::NonModuleImportStar(name) => {
write!(f, "`from {name} import *` only allowed at module level")
}
}
}
}
@@ -1362,6 +1395,9 @@ pub enum SemanticSyntaxErrorKind {
/// Represents the use of `yield from` inside an asynchronous function.
YieldFromInAsyncFunction,
/// Represents the use of `from <module> import *` outside module scope.
NonModuleImportStar(String),
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, get_size2::GetSize)]
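The new `NonModuleImportStar` semantic error mirrors a restriction CPython itself enforces at compile time: a star import inside a function body is a `SyntaxError`, while the same statement at module level compiles fine.

```python
def rejects_star_import(source: str) -> bool:
    """Return True if CPython refuses to compile `source`."""
    try:
        compile(source, "<test>", "exec")
    except SyntaxError:
        return True
    return False


print(rejects_star_import("def f():\n    from os import *\n"))  # True
print(rejects_star_import("from os import *\n"))  # False
```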

View File

@@ -0,0 +1,271 @@
---
source: crates/ruff_python_parser/tests/fixtures.rs
input_file: crates/ruff_python_parser/resources/inline/err/import_from_star.py
---
## AST
```
Module(
ModModule {
node_index: NodeIndex(None),
range: 0..144,
body: [
FunctionDef(
StmtFunctionDef {
node_index: NodeIndex(None),
range: 0..34,
is_async: false,
decorator_list: [],
name: Identifier {
id: Name("f1"),
range: 4..6,
node_index: NodeIndex(None),
},
type_params: None,
parameters: Parameters {
range: 6..8,
node_index: NodeIndex(None),
posonlyargs: [],
args: [],
vararg: None,
kwonlyargs: [],
kwarg: None,
},
returns: None,
body: [
ImportFrom(
StmtImportFrom {
node_index: NodeIndex(None),
range: 14..34,
module: Some(
Identifier {
id: Name("module"),
range: 19..25,
node_index: NodeIndex(None),
},
),
names: [
Alias {
range: 33..34,
node_index: NodeIndex(None),
name: Identifier {
id: Name("*"),
range: 33..34,
node_index: NodeIndex(None),
},
asname: None,
},
],
level: 0,
},
),
],
},
),
ClassDef(
StmtClassDef {
node_index: NodeIndex(None),
range: 35..68,
decorator_list: [],
name: Identifier {
id: Name("C"),
range: 41..42,
node_index: NodeIndex(None),
},
type_params: None,
arguments: None,
body: [
ImportFrom(
StmtImportFrom {
node_index: NodeIndex(None),
range: 48..68,
module: Some(
Identifier {
id: Name("module"),
range: 53..59,
node_index: NodeIndex(None),
},
),
names: [
Alias {
range: 67..68,
node_index: NodeIndex(None),
name: Identifier {
id: Name("*"),
range: 67..68,
node_index: NodeIndex(None),
},
asname: None,
},
],
level: 0,
},
),
],
},
),
FunctionDef(
StmtFunctionDef {
node_index: NodeIndex(None),
range: 69..105,
is_async: false,
decorator_list: [],
name: Identifier {
id: Name("f2"),
range: 73..75,
node_index: NodeIndex(None),
},
type_params: None,
parameters: Parameters {
range: 75..77,
node_index: NodeIndex(None),
posonlyargs: [],
args: [],
vararg: None,
kwonlyargs: [],
kwarg: None,
},
returns: None,
body: [
ImportFrom(
StmtImportFrom {
node_index: NodeIndex(None),
range: 83..105,
module: Some(
Identifier {
id: Name("module"),
range: 90..96,
node_index: NodeIndex(None),
},
),
names: [
Alias {
range: 104..105,
node_index: NodeIndex(None),
name: Identifier {
id: Name("*"),
range: 104..105,
node_index: NodeIndex(None),
},
asname: None,
},
],
level: 2,
},
),
],
},
),
FunctionDef(
StmtFunctionDef {
node_index: NodeIndex(None),
range: 106..143,
is_async: false,
decorator_list: [],
name: Identifier {
id: Name("f3"),
range: 110..112,
node_index: NodeIndex(None),
},
type_params: None,
parameters: Parameters {
range: 112..114,
node_index: NodeIndex(None),
posonlyargs: [],
args: [],
vararg: None,
kwonlyargs: [],
kwarg: None,
},
returns: None,
body: [
ImportFrom(
StmtImportFrom {
node_index: NodeIndex(None),
range: 120..143,
module: Some(
Identifier {
id: Name("module"),
range: 125..131,
node_index: NodeIndex(None),
},
),
names: [
Alias {
range: 139..140,
node_index: NodeIndex(None),
name: Identifier {
id: Name("*"),
range: 139..140,
node_index: NodeIndex(None),
},
asname: None,
},
Alias {
range: 142..143,
node_index: NodeIndex(None),
name: Identifier {
id: Name("*"),
range: 142..143,
node_index: NodeIndex(None),
},
asname: None,
},
],
level: 0,
},
),
],
},
),
],
},
)
```
## Errors
|
6 | from ..module import *
7 | def f3():
8 | from module import *, *
| ^^^^ Syntax Error: Star import must be the only import
|
## Semantic Syntax Errors
|
1 | def f1():
2 | from module import *
| ^^^^^^^^^^^^^^^^^^^^ Syntax Error: `from module import *` only allowed at module level
3 | class C:
4 | from module import *
|
|
2 | from module import *
3 | class C:
4 | from module import *
| ^^^^^^^^^^^^^^^^^^^^ Syntax Error: `from module import *` only allowed at module level
5 | def f2():
6 | from ..module import *
|
|
4 | from module import *
5 | def f2():
6 | from ..module import *
| ^^^^^^^^^^^^^^^^^^^^^^ Syntax Error: `from ..module import *` only allowed at module level
7 | def f3():
8 | from module import *, *
|
|
6 | from ..module import *
7 | def f3():
8 | from module import *, *
| ^^^^^^^^^^^^^^^^^^^^^^^ Syntax Error: `from module import *` only allowed at module level
|

View File

@@ -0,0 +1,42 @@
---
source: crates/ruff_python_parser/tests/fixtures.rs
input_file: crates/ruff_python_parser/resources/inline/ok/import_from_star.py
---
## AST
```
Module(
ModModule {
node_index: NodeIndex(None),
range: 0..21,
body: [
ImportFrom(
StmtImportFrom {
node_index: NodeIndex(None),
range: 0..20,
module: Some(
Identifier {
id: Name("module"),
range: 5..11,
node_index: NodeIndex(None),
},
),
names: [
Alias {
range: 19..20,
node_index: NodeIndex(None),
name: Identifier {
id: Name("*"),
range: 19..20,
node_index: NodeIndex(None),
},
asname: None,
},
],
level: 0,
},
),
],
},
)
```

View File

@@ -859,11 +859,17 @@ impl<'a> SemanticModel<'a> {
/// associated with `Class`, then the `BindingKind::FunctionDefinition` associated with
/// `Class.method`.
pub fn lookup_attribute(&self, value: &Expr) -> Option<BindingId> {
self.lookup_attribute_in_scope(value, self.scope_id)
}
/// Look up a qualified attribute in the given scope.
pub fn lookup_attribute_in_scope(&self, value: &Expr, scope_id: ScopeId) -> Option<BindingId> {
let unqualified_name = UnqualifiedName::from_expr(value)?;
// Find the symbol in the current scope.
let (symbol, attribute) = unqualified_name.segments().split_first()?;
let mut binding_id = self.lookup_symbol(symbol)?;
let mut binding_id =
self.lookup_symbol_in_scope(symbol, scope_id, self.in_forward_reference())?;
// Recursively resolve class attributes, e.g., `foo.bar.baz` in.
let mut tail = attribute;

View File

@@ -425,6 +425,22 @@ impl TextRange {
}
}
/// Conversion methods.
impl TextRange {
/// A convenience routine for converting this range to a
/// standard library range that satisfies the `RangeBounds`
/// trait.
///
/// This is also available as a `From` trait implementation,
/// but this method avoids the need to specify types to help
/// inference.
#[inline]
#[must_use]
pub fn to_std_range(&self) -> Range<usize> {
(*self).into()
}
}
impl Index<TextRange> for str {
type Output = str;
#[inline]

View File

@@ -203,7 +203,7 @@ impl Workspace {
// Extract the `# noqa` and `# isort: skip` directives from the source.
let directives = directives::extract_directives(
parsed.tokens(),
directives::Flags::empty(),
directives::Flags::from_settings(&self.settings.linter),
&locator,
&indexer,
);

View File

@@ -3860,6 +3860,13 @@ pub struct AnalyzeOptions {
/// This setting is only relevant when [`detect-string-imports`](#detect-string-imports) is enabled.
/// For example, if this is set to `2`, then only strings with at least two dots (e.g., `"path.to.module"`)
/// would be considered valid imports.
#[option(
default = "2",
value_type = "usize",
example = r#"
string-imports-min-dots = 2
"#
)]
pub string_imports_min_dots: Option<usize>,
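The dot-count heuristic described above can be sketched in a few lines. This is an approximation of the documented behavior, not the actual implementation: a string literal counts as a module reference only if it has at least `min_dots` dots and every segment looks like an identifier.

```python
def is_string_import(value: str, min_dots: int = 2) -> bool:
    """Heuristic: does `value` look like a dotted module path?"""
    parts = value.split(".")
    has_enough_dots = len(parts) - 1 >= min_dots
    return has_enough_dots and all(p.isidentifier() for p in parts)


print(is_string_import("path.to.module"))  # True
print(is_string_import("hello.world"))  # False (only one dot)
```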
/// A map from file path to the list of Python or non-Python file paths or globs that should be
/// considered dependencies of that file, regardless of whether relevant imports are detected.

crates/ty/docs/cli.md generated
View File

@@ -63,6 +63,7 @@ over all configuration files.</p>
<li><code>full</code>: Print diagnostics verbosely, with context and helpful hints (default)</li>
<li><code>concise</code>: Print diagnostics concisely, one per line</li>
<li><code>gitlab</code>: Print diagnostics in the JSON format expected by GitLab Code Quality reports</li>
<li><code>github</code>: Print diagnostics in the format used by GitHub Actions workflow error annotations</li>
</ul></dd><dt id="ty-check--project"><a href="#ty-check--project"><code>--project</code></a> <i>project</i></dt><dd><p>Run the command within the given project directory.</p>
<p>All <code>pyproject.toml</code> files will be discovered by walking up the directory tree from the given project directory, as will the project's virtual environment (<code>.venv</code>) unless the <code>venv-path</code> option is set.</p>
<p>Other command-line arguments (such as relative paths) will be resolved relative to the current working directory.</p>

View File

@@ -324,6 +324,9 @@ pub enum OutputFormat {
/// Print diagnostics in the JSON format expected by GitLab Code Quality reports.
#[value(name = "gitlab")]
Gitlab,
#[value(name = "github")]
/// Print diagnostics in the format used by GitHub Actions workflow error annotations.
Github,
}
impl From<OutputFormat> for ty_project::metadata::options::OutputFormat {
@@ -332,6 +335,7 @@ impl From<OutputFormat> for ty_project::metadata::options::OutputFormat {
OutputFormat::Full => Self::Full,
OutputFormat::Concise => Self::Concise,
OutputFormat::Gitlab => Self::Gitlab,
OutputFormat::Github => Self::Github,
}
}
}

View File

@@ -683,6 +683,30 @@ fn gitlab_diagnostics() -> anyhow::Result<()> {
Ok(())
}
#[test]
fn github_diagnostics() -> anyhow::Result<()> {
let case = CliTest::with_file(
"test.py",
r#"
print(x) # [unresolved-reference]
print(4[1]) # [non-subscriptable]
"#,
)?;
assert_cmd_snapshot!(case.command().arg("--output-format=github").arg("--warn").arg("unresolved-reference"), @r"
success: false
exit_code: 1
----- stdout -----
::warning title=ty (unresolved-reference),file=test.py,line=2,col=7,endLine=2,endColumn=8::test.py:2:7: unresolved-reference: Name `x` used when not defined
::error title=ty (non-subscriptable),file=test.py,line=3,col=7,endLine=3,endColumn=8::test.py:3:7: non-subscriptable: Cannot subscript object of type `Literal[4]` with no `__getitem__` method
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
Ok(())
}
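The snapshot above shows GitHub Actions workflow-command annotations. Their shape can be reproduced with a small formatter; the function and parameter names here are illustrative, but the output format matches the `--output-format=github` lines in the test.

```python
def github_annotation(level, rule, path, line, col, end_line, end_col, message):
    """Format a GitHub Actions annotation: ::level key=value,...::message"""
    return (
        f"::{level} title=ty ({rule}),file={path},line={line},col={col},"
        f"endLine={end_line},endColumn={end_col}::{message}"
    )


print(github_annotation(
    "warning", "unresolved-reference", "test.py", 2, 7, 2, 8,
    "test.py:2:7: unresolved-reference: Name `x` used when not defined",
))
```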
/// This tests the diagnostic format for revealed type.
///
/// This test was introduced because changes were made to

View File

@@ -1912,7 +1912,7 @@ fn submodule_cache_invalidation_created() -> anyhow::Result<()> {
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@"foo",
@"bar.foo",
);
std::fs::write(case.project_path("bar/wazoo.py").as_std_path(), "")?;
@@ -1922,8 +1922,8 @@ fn submodule_cache_invalidation_created() -> anyhow::Result<()> {
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@r"
foo
wazoo
bar.foo
bar.wazoo
",
);
@@ -1944,8 +1944,8 @@ fn submodule_cache_invalidation_deleted() -> anyhow::Result<()> {
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@r"
foo
wazoo
bar.foo
bar.wazoo
",
);
@@ -1955,7 +1955,7 @@ fn submodule_cache_invalidation_deleted() -> anyhow::Result<()> {
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@"foo",
@"bar.foo",
);
Ok(())
@@ -1969,7 +1969,7 @@ fn submodule_cache_invalidation_created_then_deleted() -> anyhow::Result<()> {
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@"foo",
@"bar.foo",
);
std::fs::write(case.project_path("bar/wazoo.py").as_std_path(), "")?;
@@ -1982,7 +1982,7 @@ fn submodule_cache_invalidation_created_then_deleted() -> anyhow::Result<()> {
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@"foo",
@"bar.foo",
);
Ok(())
@@ -1997,7 +1997,7 @@ fn submodule_cache_invalidation_after_pyproject_created() -> anyhow::Result<()>
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@"foo",
@"bar.foo",
);
case.update_options(Options::default())?;
@@ -2009,8 +2009,8 @@ fn submodule_cache_invalidation_after_pyproject_created() -> anyhow::Result<()>
insta::assert_snapshot!(
case.sorted_submodule_names("bar").join("\n"),
@r"
foo
wazoo
bar.foo
bar.wazoo
",
);

View File

@@ -13,9 +13,12 @@ license = { workspace = true }
[dependencies]
bitflags = { workspace = true }
ruff_db = { workspace = true }
ruff_diagnostics = { workspace = true }
ruff_index = { workspace = true }
ruff_memory_usage = { workspace = true }
ruff_python_ast = { workspace = true }
ruff_python_codegen = { workspace = true }
ruff_python_importer = { workspace = true }
ruff_python_parser = { workspace = true }
ruff_python_trivia = { workspace = true }
ruff_source_file = { workspace = true }

View File

@@ -1,6 +1,6 @@
use ruff_db::files::File;
use ty_project::Db;
use ty_python_semantic::all_modules;
use ty_python_semantic::{Module, all_modules};
use crate::symbols::{QueryPattern, SymbolInfo, symbols_for_file_global_only};
@@ -8,7 +8,7 @@ use crate::symbols::{QueryPattern, SymbolInfo, symbols_for_file_global_only};
///
/// Returns symbols from all files in the workspace and dependencies, filtered
/// by the query.
pub fn all_symbols(db: &dyn Db, query: &str) -> Vec<AllSymbolInfo> {
pub fn all_symbols<'db>(db: &'db dyn Db, query: &str) -> Vec<AllSymbolInfo<'db>> {
// If the query is empty, return immediately to avoid expensive file scanning
if query.is_empty() {
return Vec::new();
@@ -36,6 +36,7 @@ pub fn all_symbols(db: &dyn Db, query: &str) -> Vec<AllSymbolInfo> {
// but this works pretty well as it is.
results.lock().unwrap().push(AllSymbolInfo {
symbol: symbol.to_owned(),
module,
file,
});
}
@@ -56,10 +57,15 @@ pub fn all_symbols(db: &dyn Db, query: &str) -> Vec<AllSymbolInfo> {
/// A symbol found in the workspace and dependencies, including the
/// file it was found in.
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct AllSymbolInfo {
/// The symbol information
pub struct AllSymbolInfo<'db> {
/// The symbol information.
pub symbol: SymbolInfo<'static>,
/// The file containing the symbol
/// The module containing the symbol.
pub module: Module<'db>,
/// The file containing the symbol.
///
/// This `File` is guaranteed to be the same
/// as the `File` underlying `module`.
pub file: File,
}
@@ -148,17 +154,17 @@ ABCDEFGHIJKLMNOP = 'https://api.example.com'
}
}
struct AllSymbolDiagnostic {
symbol_info: AllSymbolInfo,
struct AllSymbolDiagnostic<'db> {
symbol_info: AllSymbolInfo<'db>,
}
impl AllSymbolDiagnostic {
fn new(symbol_info: AllSymbolInfo) -> Self {
impl<'db> AllSymbolDiagnostic<'db> {
fn new(symbol_info: AllSymbolInfo<'db>) -> Self {
Self { symbol_info }
}
}
impl IntoDiagnostic for AllSymbolDiagnostic {
impl IntoDiagnostic for AllSymbolDiagnostic<'_> {
fn into_diagnostic(self) -> Diagnostic {
let symbol_kind_str = self.symbol_info.symbol.kind.to_string();

View File

@@ -2,16 +2,198 @@ use std::cmp::Ordering;
use ruff_db::files::File;
use ruff_db::parsed::{ParsedModuleRef, parsed_module};
use ruff_db::source::source_text;
use ruff_diagnostics::Edit;
use ruff_python_ast as ast;
use ruff_python_ast::name::Name;
use ruff_python_codegen::Stylist;
use ruff_python_parser::{Token, TokenAt, TokenKind};
use ruff_text_size::{Ranged, TextRange, TextSize};
use ty_python_semantic::{Completion, NameKind, SemanticModel};
use ty_python_semantic::{
Completion as SemanticCompletion, ModuleName, NameKind, SemanticModel,
types::{CycleDetector, Type},
};
use crate::docstring::Docstring;
use crate::find_node::covering_node;
use crate::goto::DefinitionsOrTargets;
use crate::importer::{ImportRequest, Importer};
use crate::{Db, all_symbols};
#[derive(Clone, Debug)]
pub struct Completion<'db> {
/// The label shown to the user for this suggestion.
pub name: Name,
/// The text that should be inserted at the cursor
/// when the completion is selected.
///
/// When this is not set, `name` is used.
pub insert: Option<Box<str>>,
/// The type of this completion, if available.
///
/// Generally speaking, this is always available
/// *unless* this was a completion corresponding to
/// an unimported symbol. In that case, computing the
/// type of all such symbols could be quite expensive.
pub ty: Option<Type<'db>>,
/// The "kind" of this completion.
///
/// When this is set, it takes priority over any kind
/// inferred from `ty`.
///
/// Usually this is set when `ty` is `None`, since it
/// may be cheaper to compute at scale (e.g., for
/// unimported symbol completions).
///
/// Callers should use [`Completion::kind`] to get the
/// kind, which will take type information into account
/// if this kind is not present.
pub kind: Option<CompletionKind>,
/// The name of the module that this completion comes from.
///
/// This is generally only present when this is a completion
/// suggestion for an unimported symbol.
pub module_name: Option<&'db ModuleName>,
/// An import statement to insert (or ensure is already
/// present) when this completion is selected.
pub import: Option<Edit>,
/// Whether this suggestion came from builtins or not.
///
/// At time of writing (2025-06-26), this information
/// doesn't make it into the LSP response. Instead, we
/// use it mainly in tests so that we can write less
/// noisy tests.
pub builtin: bool,
/// The documentation associated with this item, if
/// available.
pub documentation: Option<Docstring>,
}
impl<'db> Completion<'db> {
fn from_semantic_completion(
db: &'db dyn Db,
semantic: SemanticCompletion<'db>,
) -> Completion<'db> {
let definition = semantic
.ty
.and_then(|ty| DefinitionsOrTargets::from_ty(db, ty));
let documentation = definition.and_then(|def| def.docstring(db));
Completion {
name: semantic.name,
insert: None,
ty: semantic.ty,
kind: None,
module_name: None,
import: None,
builtin: semantic.builtin,
documentation,
}
}
/// Returns the "kind" of this completion.
///
/// This is meant to be a very general classification of this completion.
/// Typically, this is communicated from the LSP server to a client, and
/// the client uses this information to help improve the UX (perhaps by
/// assigning an icon of some kind to the completion).
pub fn kind(&self, db: &'db dyn Db) -> Option<CompletionKind> {
type CompletionKindVisitor<'db> =
CycleDetector<CompletionKind, Type<'db>, Option<CompletionKind>>;
fn imp<'db>(
db: &'db dyn Db,
ty: Type<'db>,
visitor: &CompletionKindVisitor<'db>,
) -> Option<CompletionKind> {
Some(match ty {
Type::FunctionLiteral(_)
| Type::DataclassDecorator(_)
| Type::WrapperDescriptor(_)
| Type::DataclassTransformer(_)
| Type::Callable(_) => CompletionKind::Function,
Type::BoundMethod(_) | Type::KnownBoundMethod(_) => CompletionKind::Method,
Type::ModuleLiteral(_) => CompletionKind::Module,
Type::ClassLiteral(_) | Type::GenericAlias(_) | Type::SubclassOf(_) => {
CompletionKind::Class
}
// This is a little weird for "struct." I'm mostly interpreting
// "struct" here as a more general "object." ---AG
Type::NominalInstance(_)
| Type::PropertyInstance(_)
| Type::BoundSuper(_)
| Type::TypedDict(_) => CompletionKind::Struct,
Type::IntLiteral(_)
| Type::BooleanLiteral(_)
| Type::TypeIs(_)
| Type::StringLiteral(_)
| Type::LiteralString
| Type::BytesLiteral(_) => CompletionKind::Value,
Type::EnumLiteral(_) => CompletionKind::Enum,
Type::ProtocolInstance(_) => CompletionKind::Interface,
Type::NonInferableTypeVar(_) | Type::TypeVar(_) => CompletionKind::TypeParameter,
Type::Union(union) => union
.elements(db)
.iter()
.find_map(|&ty| imp(db, ty, visitor))?,
Type::Intersection(intersection) => intersection
.iter_positive(db)
.find_map(|ty| imp(db, ty, visitor))?,
Type::Dynamic(_)
| Type::Never
| Type::SpecialForm(_)
| Type::KnownInstance(_)
| Type::AlwaysTruthy
| Type::AlwaysFalsy => return None,
Type::TypeAlias(alias) => {
visitor.visit(ty, || imp(db, alias.value_type(db), visitor))?
}
})
}
self.kind.or_else(|| {
self.ty
.and_then(|ty| imp(db, ty, &CompletionKindVisitor::default()))
})
}
}
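The `Type::TypeAlias` arm in `kind` above routes through a `CycleDetector` so a recursive type alias cannot cause infinite recursion. A minimal standalone sketch of that pattern (toy `CycleDetector` with `u32` ids, not ty's real API):

```rust
use std::cell::RefCell;
use std::collections::HashSet;

#[derive(Default)]
struct CycleDetector {
    // Ids currently being visited on the call stack.
    seen: RefCell<HashSet<u32>>,
}

impl CycleDetector {
    fn visit<T>(&self, id: u32, f: impl FnOnce() -> Option<T>) -> Option<T> {
        if !self.seen.borrow_mut().insert(id) {
            // Already visiting `id`: break the cycle with a fallback.
            return None;
        }
        let result = f();
        self.seen.borrow_mut().remove(&id);
        result
    }
}

fn resolve(id: u32, d: &CycleDetector) -> Option<&'static str> {
    // A self-referential "alias": visiting `id` recurses into `id`.
    d.visit(id, || resolve(id, d))
}

fn main() {
    let detector = CycleDetector::default();
    assert_eq!(resolve(0, &detector), None);
    assert_eq!(detector.visit(1, || Some("leaf")), Some("leaf"));
}
```

The real visitor is keyed on `Type` values rather than integers, but the shape is the same: insert before recursing, remove after, and return the fallback on re-entry.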
/// The "kind" of a completion.
///
/// This is taken directly from the LSP completion specification:
/// <https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#completionItemKind>
///
/// The idea here is that [`Completion::kind`] defines the mapping to this from
/// `Type` (and possibly other information), which might be interesting and
/// contentious. Then the outer edges map this to the LSP types, which is
/// expected to be mundane and boring.
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
pub enum CompletionKind {
Text,
Method,
Function,
Constructor,
Field,
Variable,
Class,
Interface,
Module,
Property,
Unit,
Value,
Enum,
Keyword,
Snippet,
Color,
File,
Reference,
Folder,
EnumMember,
Constant,
Struct,
Event,
Operator,
TypeParameter,
}
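`Completion::kind` maps a `Type` to one of these variants; for unions it takes the first element whose kind resolves, and dynamic types yield no kind at all. A standalone sketch of that resolution order (hypothetical toy `Ty`/`Kind` enums, not ty's real types):

```rust
#[derive(Copy, Clone, Debug, PartialEq)]
enum Kind {
    Function,
    Class,
}

enum Ty {
    Function,
    Class,
    Dynamic, // e.g. `Unknown`: contributes no kind
    Union(Vec<Ty>),
}

fn kind(ty: &Ty) -> Option<Kind> {
    match ty {
        Ty::Function => Some(Kind::Function),
        Ty::Class => Some(Kind::Class),
        Ty::Dynamic => None,
        // A union's kind is the kind of its first element that resolves.
        Ty::Union(elems) => elems.iter().find_map(kind),
    }
}

fn main() {
    let u = Ty::Union(vec![Ty::Dynamic, Ty::Class, Ty::Function]);
    assert_eq!(kind(&u), Some(Kind::Class));
    assert_eq!(kind(&Ty::Dynamic), None);
}
```

Picking the first resolvable element means `Unknown | int`-style unions still get a useful kind instead of none.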
#[derive(Clone, Debug, Default)]
pub struct CompletionSettings {
pub auto_import: bool,
@@ -22,7 +204,7 @@ pub fn completion<'db>(
settings: &CompletionSettings,
file: File,
offset: TextSize,
) -> Vec<DetailedCompletion<'db>> {
) -> Vec<Completion<'db>> {
let parsed = parsed_module(db, file).load(db);
let Some(target_token) = CompletionTargetTokens::find(&parsed, offset) else {
@@ -33,64 +215,79 @@ pub fn completion<'db>(
};
let model = SemanticModel::new(db, file);
let mut completions = match target {
CompletionTargetAst::ObjectDot { expr } => model.attribute_completions(expr),
let (semantic_completions, scoped) = match target {
CompletionTargetAst::ObjectDot { expr } => (model.attribute_completions(expr), None),
CompletionTargetAst::ObjectDotInImport { import, name } => {
model.import_submodule_completions(import, name)
(model.import_submodule_completions(import, name), None)
}
CompletionTargetAst::ObjectDotInImportFrom { import } => {
model.from_import_submodule_completions(import)
(model.from_import_submodule_completions(import), None)
}
CompletionTargetAst::ImportFrom { import, name } => {
model.from_import_completions(import, name)
(model.from_import_completions(import, name), None)
}
CompletionTargetAst::Import { .. } | CompletionTargetAst::ImportViaFrom { .. } => {
model.import_completions()
(model.import_completions(), None)
}
CompletionTargetAst::Scoped { node, typed } => {
let mut completions = model.scoped_completions(node);
if settings.auto_import {
if let Some(typed) = typed {
for symbol in all_symbols(db, typed) {
completions.push(Completion {
name: ast::name::Name::new(&symbol.symbol.name),
ty: None,
kind: symbol.symbol.kind.to_completion_kind(),
builtin: false,
});
}
}
}
completions
CompletionTargetAst::Scoped(scoped) => {
(model.scoped_completions(scoped.node), Some(scoped))
}
};
completions.sort_by(compare_suggestions);
completions.dedup_by(|c1, c2| c1.name == c2.name);
completions
let mut completions: Vec<Completion<'_>> = semantic_completions
.into_iter()
.map(|completion| {
let definition = completion
.ty
.and_then(|ty| DefinitionsOrTargets::from_ty(db, ty));
let documentation = definition.and_then(|def| def.docstring(db));
DetailedCompletion {
inner: completion,
documentation,
}
})
.collect()
.map(|c| Completion::from_semantic_completion(db, c))
.collect();
if settings.auto_import {
if let Some(scoped) = scoped {
add_unimported_completions(db, file, &parsed, scoped, &mut completions);
}
}
completions.sort_by(compare_suggestions);
completions.dedup_by(|c1, c2| (&c1.name, c1.module_name) == (&c2.name, c2.module_name));
completions
}
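The final `sort_by`/`dedup_by` step above depends on `dedup_by` only removing *consecutive* duplicates, which is why sorting must happen first; deduplicating on the `(name, module_name)` pair keeps the same symbol name when it is offered from two different modules. A standalone sketch with toy `(name, module)` tuples (not the real `Completion` struct):

```rust
fn dedup_completions(mut items: Vec<(String, Option<String>)>) -> Vec<(String, Option<String>)> {
    items.sort();
    // `dedup_by` only collapses adjacent equal pairs, so the sort is required.
    items.dedup_by(|a, b| (&a.0, &a.1) == (&b.0, &b.1));
    items
}

fn main() {
    let items = vec![
        ("path".to_string(), Some("os".to_string())),
        ("path".to_string(), Some("sys".to_string())),
        ("path".to_string(), Some("os".to_string())),
    ];
    // `path` from `os` and `path` from `sys` are distinct suggestions;
    // the duplicate `os` entry is dropped.
    assert_eq!(dedup_completions(items).len(), 2);
}
```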
#[derive(Clone, Debug)]
pub struct DetailedCompletion<'db> {
pub inner: Completion<'db>,
pub documentation: Option<Docstring>,
}
/// Adds completions for symbols that are not in scope.
///
/// `scoped` should be information about the identified scope
/// in which the cursor is currently placed.
///
/// The completions returned will, when selected, auto-insert
/// import statements into `file`.
fn add_unimported_completions<'db>(
db: &'db dyn Db,
file: File,
parsed: &ParsedModuleRef,
scoped: ScopedTarget<'_>,
completions: &mut Vec<Completion<'db>>,
) {
let Some(typed) = scoped.typed else {
return;
};
let source = source_text(db, file);
let stylist = Stylist::from_tokens(parsed.tokens(), source.as_str());
let importer = Importer::new(db, &stylist, file, source.as_str(), parsed);
let members = importer.members_in_scope_at(scoped.node, scoped.node.start());
impl<'db> std::ops::Deref for DetailedCompletion<'db> {
type Target = Completion<'db>;
fn deref(&self) -> &Self::Target {
&self.inner
for symbol in all_symbols(db, typed) {
let request =
ImportRequest::import_from(symbol.module.name(db).as_str(), &symbol.symbol.name);
// FIXME: `all_symbols` doesn't account for wildcard imports.
// Since we're looking at every module, this is probably
// "fine," but it might mean that we import a symbol from the
// "wrong" module.
let import_action = importer.import(request, &members);
completions.push(Completion {
name: ast::name::Name::new(&symbol.symbol.name),
insert: Some(import_action.symbol_text().into()),
ty: None,
kind: symbol.symbol.kind.to_completion_kind(),
module_name: Some(symbol.module.name(db)),
import: import_action.import().cloned(),
builtin: false,
documentation: None,
});
}
}
@@ -295,15 +492,15 @@ impl<'t> CompletionTargetTokens<'t> {
}
_ => None,
};
Some(CompletionTargetAst::Scoped { node, typed })
Some(CompletionTargetAst::Scoped(ScopedTarget { node, typed }))
}
CompletionTargetTokens::Unknown => {
let range = TextRange::empty(offset);
let covering_node = covering_node(parsed.syntax().into(), range);
Some(CompletionTargetAst::Scoped {
Some(CompletionTargetAst::Scoped(ScopedTarget {
node: covering_node.node(),
typed: None,
})
}))
}
}
}
@@ -356,16 +553,19 @@ enum CompletionTargetAst<'t> {
},
/// A scoped scenario, where we want to list all items available in
/// the narrowest scope containing the given AST node.
Scoped {
/// The node with the smallest range that fully covers
/// the token under the cursor.
node: ast::AnyNodeRef<'t>,
/// The text that has been typed so far, if available.
///
/// When not `None`, the typed text is guaranteed to be
/// non-empty.
typed: Option<&'t str>,
},
Scoped(ScopedTarget<'t>),
}
#[derive(Clone, Copy, Debug)]
struct ScopedTarget<'t> {
/// The node with the smallest range that fully covers
/// the token under the cursor.
node: ast::AnyNodeRef<'t>,
/// The text that has been typed so far, if available.
///
/// When not `None`, the typed text is guaranteed to be
/// non-empty.
typed: Option<&'t str>,
}
/// Returns a suffix of `tokens` corresponding to the `kinds` given.
@@ -546,12 +746,10 @@ mod tests {
use insta::assert_snapshot;
use ruff_python_parser::{Mode, ParseOptions, TokenKind, Tokens};
use crate::completion::{DetailedCompletion, completion};
use crate::completion::{Completion, completion};
use crate::tests::{CursorTest, cursor_test};
use super::{CompletionSettings, token_suffix_by_kinds};
use ty_python_semantic::CompletionKind;
use super::{CompletionKind, CompletionSettings, token_suffix_by_kinds};
#[test]
fn token_suffixes_match() {
@@ -3140,14 +3338,14 @@ from os.<CURSOR>
)
}
fn completions_if(&self, predicate: impl Fn(&DetailedCompletion) -> bool) -> String {
fn completions_if(&self, predicate: impl Fn(&Completion) -> bool) -> String {
self.completions_if_snapshot(predicate, |c| c.name.as_str().to_string())
}
fn completions_if_snapshot(
&self,
predicate: impl Fn(&DetailedCompletion) -> bool,
snapshot: impl Fn(&DetailedCompletion) -> String,
predicate: impl Fn(&Completion) -> bool,
snapshot: impl Fn(&Completion) -> String,
) -> String {
let settings = CompletionSettings::default();
let completions = completion(&self.db, &settings, self.cursor.file, self.cursor.offset);


@@ -409,13 +409,13 @@ f(**kwargs<CURSOR>)
assert_snapshot!(test.goto_type_definition(), @r#"
info[goto-type-definition]: Type definition
--> stdlib/builtins.pyi:2917:7
--> stdlib/builtins.pyi:2916:7
|
2916 | @disjoint_base
2917 | class dict(MutableMapping[_KT, _VT]):
2915 | @disjoint_base
2916 | class dict(MutableMapping[_KT, _VT]):
| ^^^^
2918 | """dict() -> new empty dictionary
2919 | dict(mapping) -> new dictionary initialized from a mapping object's
2917 | """dict() -> new empty dictionary
2918 | dict(mapping) -> new dictionary initialized from a mapping object's
|
info: Source
--> main.py:6:5

File diff suppressed because it is too large.


@@ -14,6 +14,7 @@ mod goto_definition;
mod goto_references;
mod goto_type_definition;
mod hover;
mod importer;
mod inlay_hints;
mod markup;
mod references;
@@ -26,7 +27,7 @@ mod symbols;
mod workspace_symbols;
pub use all_symbols::{AllSymbolInfo, all_symbols};
pub use completion::{CompletionSettings, completion};
pub use completion::{Completion, CompletionKind, CompletionSettings, completion};
pub use doc_highlights::document_highlights;
pub use document_symbols::document_symbols;
pub use goto::{goto_declaration, goto_definition, goto_type_definition};
@@ -294,7 +295,10 @@ mod tests {
use ruff_db::Db;
use ruff_db::diagnostic::{Diagnostic, DiagnosticFormat, DisplayDiagnosticConfig};
use ruff_db::files::{File, FileRootKind, system_path_to_file};
use ruff_db::parsed::{ParsedModuleRef, parsed_module};
use ruff_db::source::{SourceText, source_text};
use ruff_db::system::{DbWithWritableSystem, SystemPath, SystemPathBuf};
use ruff_python_codegen::Stylist;
use ruff_python_trivia::textwrap::dedent;
use ruff_text_size::TextSize;
use ty_project::ProjectMetadata;
@@ -350,11 +354,17 @@ mod tests {
}
}
/// The file and offset into that file containing
/// a `<CURSOR>` marker.
/// The file and offset into that file where a `<CURSOR>` marker
/// is located.
///
/// (Along with other information about that file, such as the
/// parsed AST.)
pub(super) struct Cursor {
pub(super) file: File,
pub(super) offset: TextSize,
pub(super) parsed: ParsedModuleRef,
pub(super) source: SourceText,
pub(super) stylist: Stylist<'static>,
}
#[derive(Default)]
@@ -371,8 +381,20 @@ mod tests {
SystemPathBuf::from("/"),
));
let mut cursor: Option<Cursor> = None;
let search_paths = SearchPathSettings::new(vec![SystemPathBuf::from("/")])
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings");
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths,
},
);
let mut cursor: Option<Cursor> = None;
for &Source {
ref path,
ref contents,
@@ -405,25 +427,22 @@ mod tests {
cursor.is_none(),
"found more than one source that contains `<CURSOR>`"
);
cursor = Some(Cursor { file, offset });
let source = source_text(&db, file);
let parsed = parsed_module(&db, file).load(&db);
let stylist =
Stylist::from_tokens(parsed.tokens(), source.as_str()).into_owned();
cursor = Some(Cursor {
file,
offset,
parsed,
source,
stylist,
});
}
}
let search_paths = SearchPathSettings::new(vec![SystemPathBuf::from("/")])
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings");
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths,
},
);
let mut insta_settings = insta::Settings::clone_current();
insta_settings.add_filter(r#"\\(\w\w|\s|\.|")"#, "/$1");
insta_settings.add_filter(r#"\\(\w\w|\.|")"#, "/$1");
// Filter out TODO types because they are different between debug and release builds.
insta_settings.add_filter(r"@Todo\(.+\)", "@Todo");


@@ -13,7 +13,8 @@ use ruff_python_ast::visitor::source_order::{self, SourceOrderVisitor};
use ruff_python_ast::{Expr, Stmt};
use ruff_text_size::{Ranged, TextRange};
use ty_project::Db;
use ty_python_semantic::CompletionKind;
use crate::completion::CompletionKind;
/// A compiled query pattern used for searching symbols.
///


@@ -1077,6 +1077,10 @@ pub enum OutputFormat {
///
/// [Code Quality]: https://docs.gitlab.com/ci/testing/code_quality/#code-quality-report-format
Gitlab,
/// Print diagnostics in the format used by [GitHub Actions] workflow error annotations.
///
/// [GitHub Actions]: https://docs.github.com/en/actions/reference/workflows-and-actions/workflow-commands#setting-an-error-message
Github,
}
impl OutputFormat {
@@ -1096,6 +1100,7 @@ impl From<OutputFormat> for DiagnosticFormat {
OutputFormat::Full => Self::Full,
OutputFormat::Concise => Self::Concise,
OutputFormat::Gitlab => Self::Gitlab,
OutputFormat::Github => Self::Github,
}
}
}


@@ -30,9 +30,7 @@ class Shape:
def nested_func_without_enclosing_binding(self):
def inner(x: Self):
# TODO: revealed: Self@nested_func_without_enclosing_binding
# (The outer method binds an implicit `Self`)
reveal_type(x) # revealed: Self@inner
reveal_type(x) # revealed: Self@nested_func_without_enclosing_binding
inner(self)
def implicit_self(self) -> Self:
@@ -239,4 +237,36 @@ reveal_type(D().instance_method)
reveal_type(D.class_method)
```
In nested functions, `Self` binds to the enclosing method. So in the following example, the `Self`
in `C.b` is bound at `C.f`.
```py
from typing import Self
from ty_extensions import generic_context
class C[T]():
def f(self: Self):
def b(x: Self):
reveal_type(x) # revealed: Self@f
reveal_type(generic_context(b)) # revealed: None
reveal_type(generic_context(C.f)) # revealed: tuple[Self@f]
```
Even if the `Self` annotation first appears in the nested function, it is the enclosing method that
binds `Self`.
```py
from typing import Self
from ty_extensions import generic_context
class C:
def f(self: "C"):
def b(x: Self):
reveal_type(x) # revealed: Self@f
reveal_type(generic_context(b)) # revealed: None
reveal_type(generic_context(C.f)) # revealed: None
```
[self attribute]: https://typing.python.org/en/latest/spec/generics.html#use-in-attribute-annotations

Some files were not shown because too many files have changed in this diff.