Compare commits


7 Commits

Author SHA1 Message Date
Micha Reiser
2cd25ef641 Ruff 0.11.0 (#16723)
## Summary

Follow-up release for Ruff v0.10 that includes the following two
changes, which were intended to ship in v0.10 but slipped out of that release:

* Changes to how the Python version is inferred when a `target-version`
is not specified (#16319)
* `blanket-noqa` (`PGH004`): Also detect blanket file-level `noqa`
comments (and not just line-level comments).

## Test plan

I verified that the binary built on this branch respects the
`requires-python` setting
([logs](https://www.diffchecker.com/qyJWYi6W/), left: v0.10, right:
v0.11)
2025-03-14 13:57:56 +01:00
David Peter
a22d206db2 [red-knot] Preliminary tests for typing.Final (#15917)
## Summary

WIP.

Adds some preliminary tests for `typing.Final`.

## Test Plan

New MD tests
2025-03-14 12:30:13 +01:00
cake-monotone
270318c2e0 [red-knot] fix: improve type inference for binary ops on tuples (#16725)
## Summary

This PR includes minor improvements to binary operation inference,
specifically for tuple concatenation.

### Before

```py
reveal_type((1, 2) + (3, 4))  # revealed: @Todo(return type of decorated function)
# If TODO is ignored, the revealed type would be `tuple[1|2|3|4, ...]`
```

The `builtins.tuple` type stub defines `__add__`, but it appears to
cover only homogeneous tuples. This limitation is not ideal for many
use cases.
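For reference, the runtime behavior that the improved inference models is plain concatenation, which keeps every element of both operands:

```python
# Runtime tuple concatenation preserves the elements of both operands,
# regardless of whether the tuples are homogeneous.
left = (1, 2)
right = (3, 4)
combined = left + right
print(combined)  # (1, 2, 3, 4)
```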

### After

```py
reveal_type((1, 2) + (3, 4))  # revealed: tuple[Literal[1], Literal[2], Literal[3], Literal[4]]
```

## Test Plan

### Added
- `mdtest/binary/tuples.md`

### Affected
- `mdtest/slots.md` (a test has been moved out of the `False negatives`
block)
2025-03-14 12:29:57 +01:00
David Peter
d03b12e711 [red-knot] Assignments to attributes (#16705)
## Summary

This changeset adds proper support for assignments to attributes:
```py
obj.attr = value
```

In particular, the following new features are now available:

* We previously didn't raise any errors if you tried to assign to a
non-existing attribute `attr`. This is now fixed.
* If `type(obj).attr` is a data descriptor, we now call its `__set__`
method instead of trying to assign to the load-context type of
`obj.attr`, which can be different for data descriptors.
* An initial attempt was made to support unions and intersections, as
well as possibly-unbound situations. Some TODOs remain in the tests,
but they only affect edge cases. Nested diagnostics would be one way
to help solve the remaining cases.
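The `__set__` behavior described above can be sketched at runtime (the `Validated`/`Obj` names are hypothetical, not from this changeset):

```python
class Validated:
    """A minimal data descriptor: assignments are routed through __set__."""

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, instance, owner=None):
        if instance is None:
            return self
        return instance.__dict__[self.name]

    def __set__(self, instance, value):
        # Called for `obj.attr = value`; the stored value can differ
        # from the load-context type (here we coerce to int).
        instance.__dict__[self.name] = int(value)

class Obj:
    attr = Validated()

o = Obj()
o.attr = "42"    # routed through Validated.__set__, not a plain store
print(o.attr)    # 42 (an int, not the assigned str)
```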

## Follow ups

The following things are planned as follow-ups:

- Write a test suite with snapshot diagnostics for various attribute
assignment errors
- Improve the diagnostics. An easy improvement would be to highlight the
right hand side of the assignment as a secondary span (with the rhs type
as additional information). Some other ideas are mentioned in TODO
comments in this PR.
- Improve the union/intersection/possible-unboundness handling
- Add support for calling custom `__setattr__` methods (see new false
positive in the ecosystem results)

## Ecosystem changes

Some changes are related to assignments to attributes on objects with a
custom `__setattr__` method (see above). Since we previously did not
detect missing attributes in store context at all, these diagnostics are new.
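The pattern behind these new false positives can be sketched as follows (`Flexible` is a hypothetical example, not from the ecosystem projects):

```python
class Flexible:
    # A custom __setattr__ accepts attribute names that are never
    # declared anywhere on the class.
    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)

f = Flexible()
f.undeclared = 1  # valid at runtime; a declaration-based check flags it
print(f.undeclared)  # 1
```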

The other changes are related to properties. We previously used their
read-context type to check the assignment. That results in confusing error
messages, as we often see assignments to `self.property` and then
think that those are instance attributes *and* descriptors, leading to
union types. Now we properly look them up on the meta type, see the
decorated function, and try to overwrite it with the new value (as we
don't understand decorators yet). Long story short: the errors are still
confusing; we need to understand decorators to make them go away.
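A minimal sketch of the `self.property` pattern described above (a hypothetical class, not one of the ecosystem projects):

```python
class Config:
    def __init__(self):
        # At runtime this calls the `verbose` setter below, because the
        # property object on the class is a data descriptor.
        self.verbose = False

    @property
    def verbose(self):
        return self._verbose

    @verbose.setter
    def verbose(self, value):
        self._verbose = bool(value)

c = Config()
print(c.verbose)  # False
```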

## Test Plan

New Markdown tests
2025-03-14 12:15:41 +01:00
Micha Reiser
14c5ed5d7d [pygrep-hooks]: Detect file-level suppressions comments without rul… (#16720)
## Summary

I accidentally dropped this commit from the Ruff 0.10 release. See
https://github.com/astral-sh/ruff/pull/16699
2025-03-14 09:37:16 +01:00
Micha Reiser
595565015b Fallback to requires-python in certain cases when target-version is not found (#16721)
## Summary

Restores https://github.com/astral-sh/ruff/pull/16319 after it got
dropped from the 0.10 release branch :(

---------

Co-authored-by: dylwil3 <dylwil3@gmail.com>
2025-03-14 09:36:51 +01:00
Brent Westbrook
2382fe1f25 [syntax-errors] Tuple unpacking in for statement iterator clause before Python 3.9 (#16558)
Summary
--

This PR reuses a slightly modified version of the
`check_tuple_unpacking` method added for detecting unpacking in `return`
and `yield` statements to detect the same issue in the iterator clause
of `for` loops.

I ran into the same issue with a bare `for x in *rest: ...` example
(invalid even on Python 3.13) and added it as a comment on
https://github.com/astral-sh/ruff/issues/16520.

I considered just making this an additional `StarTupleKind` variant as
well, but this change was in a different version of Python, so I kept it
separate.
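The construct in question — a bare starred tuple in the iterator clause, a syntax error before Python 3.9 — looks like this (`a` and `b` are hypothetical names):

```python
a = [1, 2]
b = [3, 4]

collected = []
# A bare starred tuple in the iterator clause of a `for` statement:
# a syntax error before Python 3.9, valid on 3.9 and later.
for x in *a, *b:
    collected.append(x)

print(collected)  # [1, 2, 3, 4]
```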

Test Plan
--

New inline tests.
2025-03-13 15:55:17 -04:00
41 changed files with 4374 additions and 278 deletions


@@ -1,6 +1,8 @@
# Breaking Changes
## 0.10.0
## 0.11.0
This is a follow-up to release 0.10.0. Because of a mistake in the release process, the `requires-python` inference changes were not included in that release. Ruff 0.11.0 now includes this change as well as the stabilization of the preview behavior for `PGH004`.
- **Changes to how the Python version is inferred when a `target-version` is not specified** ([#16319](https://github.com/astral-sh/ruff/pull/16319))
@@ -23,6 +25,13 @@
search for the closest `pyproject.toml` in the parent directories and use its
`requires-python` setting.
## 0.10.0
- **Changes to how the Python version is inferred when a `target-version` is not specified** ([#16319](https://github.com/astral-sh/ruff/pull/16319))
Because of a mistake in the release process, the `requires-python` inference changes are not included in this release and instead shipped as part of 0.11.0.
You can find a description of this change in the 0.11.0 section.
- **Updated `TYPE_CHECKING` behavior** ([#16669](https://github.com/astral-sh/ruff/pull/16669))
Previously, Ruff only recognized typechecking blocks that tested the `typing.TYPE_CHECKING` symbol. Now, Ruff recognizes any local variable named `TYPE_CHECKING`. This release also removes support for the legacy `if 0:` and `if False:` typechecking checks. Use a local `TYPE_CHECKING` variable instead.
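A sketch of the newly recognized pattern — any local variable named `TYPE_CHECKING`, not only the `typing` import:

```python
# Ruff now treats any local variable named TYPE_CHECKING as a
# type-checking flag, not only `from typing import TYPE_CHECKING`.
TYPE_CHECKING = False
if TYPE_CHECKING:
    # Only evaluated by static type checkers, never at runtime.
    from collections.abc import Sequence

def first(items: "Sequence[int]") -> int:
    # The string annotation avoids a runtime NameError for Sequence.
    return items[0]

print(first([10, 20]))  # 10
```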


@@ -1,13 +1,11 @@
# Changelog
## 0.10.0
## 0.11.0
Check out the [blog post](https://astral.sh/blog/ruff-v0.10.0) for a migration guide and overview of the changes!
This is a follow-up to release 0.10.0. Because of a mistake in the release process, the `requires-python` inference changes were not included in that release. Ruff 0.11.0 now includes this change as well as the stabilization of the preview behavior for `PGH004`.
### Breaking changes
See also, the "Remapped rules" section which may result in disabled rules.
- **Changes to how the Python version is inferred when a `target-version` is not specified** ([#16319](https://github.com/astral-sh/ruff/pull/16319))
In previous versions of Ruff, you could specify your Python version with:
@@ -29,6 +27,29 @@ See also, the "Remapped rules" section which may result in disabled rules.
search for the closest `pyproject.toml` in the parent directories and use its
`requires-python` setting.
### Stabilization
The following behaviors have been stabilized:
- [`blanket-noqa`](https://docs.astral.sh/ruff/rules/blanket-noqa/) (`PGH004`): Also detect blanket file-level `noqa` comments (and not just line-level comments).
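For illustration, a sketch of the comment forms involved (the parenthesized annotations are explanatory, not real diagnostics):

```python
# ruff: noqa
# ^ file-level blanket suppression with no rule codes: detected by the
#   stabilized PGH004 behavior

x = 1  # noqa          (line-level blanket suppression: already detected)
y = 2  # noqa: E501    (suppression with an explicit rule code: not flagged)
print(x + y)  # 3
```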
### Preview features
- [syntax-errors] Tuple unpacking in `for` statement iterator clause before Python 3.9 ([#16558](https://github.com/astral-sh/ruff/pull/16558))
## 0.10.0
Check out the [blog post](https://astral.sh/blog/ruff-v0.10.0) for a migration guide and overview of the changes!
### Breaking changes
See also, the "Remapped rules" section which may result in disabled rules.
- **Changes to how the Python version is inferred when a `target-version` is not specified** ([#16319](https://github.com/astral-sh/ruff/pull/16319))
Because of a mistake in the release process, the `requires-python` inference changes are not included in this release and instead shipped as part of 0.11.0.
You can find a description of this change in the 0.11.0 section.
- **Updated `TYPE_CHECKING` behavior** ([#16669](https://github.com/astral-sh/ruff/pull/16669))
Previously, Ruff only recognized typechecking blocks that tested the `typing.TYPE_CHECKING` symbol. Now, Ruff recognizes any local variable named `TYPE_CHECKING`. This release also removes support for the legacy `if 0:` and `if False:` typechecking checks. Use a local `TYPE_CHECKING` variable instead.
@@ -86,7 +107,6 @@ The following behaviors have been stabilized:
- [`bad-staticmethod-argument`](https://docs.astral.sh/ruff/rules/bad-staticmethod-argument/) (`PLW0211`) [`invalid-first-argument-name-for-class-method`](https://docs.astral.sh/ruff/rules/invalid-first-argument-name-for-class-method/) (`N804`): `__new__` methods are now no longer flagged by `invalid-first-argument-name-for-class-method` (`N804`) but instead by `bad-staticmethod-argument` (`PLW0211`)
- [`bad-str-strip-call`](https://docs.astral.sh/ruff/rules/bad-str-strip-call/) (`PLE1310`): The rule now applies to objects which are known to have type `str` or `bytes`.
- [`blanket-noqa`](https://docs.astral.sh/ruff/rules/blanket-noqa/) (`PGH004`): Also detect blanket file-level `noqa` comments (and not just line-level comments).
- [`custom-type-var-for-self`](https://docs.astral.sh/ruff/rules/custom-type-var-for-self/) (`PYI019`): More accurate detection of custom `TypeVars` replaceable by `Self`. The range of the diagnostic is now the full function header rather than just the return annotation.
- [`invalid-argument-name`](https://docs.astral.sh/ruff/rules/invalid-argument-name/) (`N803`): Ignore argument names of functions decorated with `typing.override`
- [`invalid-envvar-default`](https://docs.astral.sh/ruff/rules/invalid-envvar-default/) (`PLW1508`): Detect default value arguments to `os.environ.get` with invalid type.

Cargo.lock

@@ -2659,7 +2659,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.10.0"
version = "0.11.0"
dependencies = [
"anyhow",
"argfile",
@@ -2894,7 +2894,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.10.0"
version = "0.11.0"
dependencies = [
"aho-corasick",
"anyhow",
@@ -3192,6 +3192,7 @@ dependencies = [
"thiserror 2.0.12",
"toml",
"tracing",
"tracing-log",
"tracing-subscriber",
]
@@ -3216,7 +3217,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.10.0"
version = "0.11.0"
dependencies = [
"console_error_panic_hook",
"console_log",


@@ -154,6 +154,7 @@ toml = { version = "0.8.11" }
tracing = { version = "0.1.40" }
tracing-flame = { version = "0.2.0" }
tracing-indicatif = { version = "0.3.6" }
tracing-log = { version = "0.2.0" }
tracing-subscriber = { version = "0.3.18", default-features = false, features = [
"env-filter",
"fmt",


@@ -149,8 +149,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.10.0/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.10.0/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.11.0/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.11.0/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -183,7 +183,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.10.0
rev: v0.11.0
hooks:
# Run the linter.
- id: ruff


@@ -818,40 +818,74 @@ def _(flag: bool):
if flag:
class C1:
x = 1
y: int = 1
else:
class C1:
x = 2
y: int | str = "b"
reveal_type(C1.x) # revealed: Unknown | Literal[1, 2]
reveal_type(C1.y) # revealed: int | str
C1.y = 100
# error: [invalid-assignment] "Object of type `Literal["problematic"]` is not assignable to attribute `y` on type `Literal[C1, C1]`"
C1.y = "problematic"
class C2:
if flag:
x = 3
y: int = 3
else:
x = 4
y: int | str = "d"
reveal_type(C2.x) # revealed: Unknown | Literal[3, 4]
reveal_type(C2.y) # revealed: int | str
C2.y = 100
# error: [invalid-assignment] "Object of type `None` is not assignable to attribute `y` of type `int | str`"
C2.y = None
# TODO: should be an error, needs more sophisticated union handling in `validate_attribute_assignment`
C2.y = "problematic"
if flag:
class Meta3(type):
x = 5
y: int = 5
else:
class Meta3(type):
x = 6
y: int | str = "f"
class C3(metaclass=Meta3): ...
reveal_type(C3.x) # revealed: Unknown | Literal[5, 6]
reveal_type(C3.y) # revealed: int | str
C3.y = 100
# error: [invalid-assignment] "Object of type `None` is not assignable to attribute `y` of type `int | str`"
C3.y = None
# TODO: should be an error, needs more sophisticated union handling in `validate_attribute_assignment`
C3.y = "problematic"
class Meta4(type):
if flag:
x = 7
y: int = 7
else:
x = 8
y: int | str = "h"
class C4(metaclass=Meta4): ...
reveal_type(C4.x) # revealed: Unknown | Literal[7, 8]
reveal_type(C4.y) # revealed: int | str
C4.y = 100
# error: [invalid-assignment] "Object of type `None` is not assignable to attribute `y` of type `int | str`"
C4.y = None
# TODO: should be an error, needs more sophisticated union handling in `validate_attribute_assignment`
C4.y = "problematic"
```
## Unions with possibly unbound paths
@@ -875,8 +909,14 @@ def _(flag1: bool, flag2: bool):
# error: [possibly-unbound-attribute] "Attribute `x` on type `Literal[C1, C2, C3]` is possibly unbound"
reveal_type(C.x) # revealed: Unknown | Literal[1, 3]
# error: [invalid-assignment] "Object of type `Literal[100]` is not assignable to attribute `x` on type `Literal[C1, C2, C3]`"
C.x = 100
# error: [possibly-unbound-attribute] "Attribute `x` on type `C1 | C2 | C3` is possibly unbound"
reveal_type(C().x) # revealed: Unknown | Literal[1, 3]
# error: [invalid-assignment] "Object of type `Literal[100]` is not assignable to attribute `x` on type `C1 | C2 | C3`"
C().x = 100
```
### Possibly-unbound within a class
@@ -901,10 +941,16 @@ def _(flag: bool, flag1: bool, flag2: bool):
# error: [possibly-unbound-attribute] "Attribute `x` on type `Literal[C1, C2, C3]` is possibly unbound"
reveal_type(C.x) # revealed: Unknown | Literal[1, 2, 3]
# error: [possibly-unbound-attribute]
C.x = 100
# Note: we might want to consider ignoring possibly-unbound diagnostics for instance attributes eventually,
# see the "Possibly unbound/undeclared instance attribute" section below.
# error: [possibly-unbound-attribute] "Attribute `x` on type `C1 | C2 | C3` is possibly unbound"
reveal_type(C().x) # revealed: Unknown | Literal[1, 2, 3]
# error: [possibly-unbound-attribute]
C().x = 100
```
### Possibly-unbound within gradual types
@@ -922,6 +968,9 @@ def _(flag: bool):
x: int
reveal_type(Derived().x) # revealed: int | Any
Derived().x = 1
Derived().x = "a"
```
### Attribute possibly unbound on a subclass but not on a superclass
@@ -936,8 +985,10 @@ def _(flag: bool):
x = 2
reveal_type(Bar.x) # revealed: Unknown | Literal[2, 1]
Bar.x = 3
reveal_type(Bar().x) # revealed: Unknown | Literal[2, 1]
Bar().x = 3
```
### Attribute possibly unbound on a subclass and on a superclass
@@ -955,8 +1006,14 @@ def _(flag: bool):
# error: [possibly-unbound-attribute]
reveal_type(Bar.x) # revealed: Unknown | Literal[2, 1]
# error: [possibly-unbound-attribute]
Bar.x = 3
# error: [possibly-unbound-attribute]
reveal_type(Bar().x) # revealed: Unknown | Literal[2, 1]
# error: [possibly-unbound-attribute]
Bar().x = 3
```
### Possibly unbound/undeclared instance attribute
@@ -975,6 +1032,9 @@ def _(flag: bool):
# error: [possibly-unbound-attribute]
reveal_type(Foo().x) # revealed: int | Unknown
# error: [possibly-unbound-attribute]
Foo().x = 1
```
#### Possibly unbound
@@ -989,6 +1049,9 @@ def _(flag: bool):
# Emitting a diagnostic in a case like this is not something we support, and it's unclear
# if we ever will (or want to)
reveal_type(Foo().x) # revealed: Unknown | Literal[1]
# Same here
Foo().x = 2
```
### Unions with all paths unbound
@@ -1003,6 +1066,11 @@ def _(flag: bool):
# error: [unresolved-attribute] "Type `Literal[C1, C2]` has no attribute `x`"
reveal_type(C.x) # revealed: Unknown
# TODO: This should ideally be a `unresolved-attribute` error. We need better union
# handling in `validate_attribute_assignment` for this.
# error: [invalid-assignment] "Object of type `Literal[1]` is not assignable to attribute `x` on type `Literal[C1, C2]`"
C.x = 1
```
## Inherited class attributes
@@ -1017,6 +1085,8 @@ class B(A): ...
class C(B): ...
reveal_type(C.X) # revealed: Unknown | Literal["foo"]
C.X = "bar"
```
### Multiple inheritance
@@ -1040,6 +1110,8 @@ reveal_type(A.__mro__)
# `E` is earlier in the MRO than `F`, so we should use the type of `E.X`
reveal_type(A.X) # revealed: Unknown | Literal[42]
A.X = 100
```
## Intersections of attributes
@@ -1057,9 +1129,13 @@ class B: ...
def _(a_and_b: Intersection[A, B]):
reveal_type(a_and_b.x) # revealed: int
a_and_b.x = 2
# Same for class objects
def _(a_and_b: Intersection[type[A], type[B]]):
reveal_type(a_and_b.x) # revealed: int
a_and_b.x = 2
```
### Attribute available on both elements
@@ -1069,6 +1145,7 @@ from knot_extensions import Intersection
class P: ...
class Q: ...
class R(P, Q): ...
class A:
x: P = P()
@@ -1078,10 +1155,12 @@ class B:
def _(a_and_b: Intersection[A, B]):
reveal_type(a_and_b.x) # revealed: P & Q
a_and_b.x = R()
# Same for class objects
def _(a_and_b: Intersection[type[A], type[B]]):
reveal_type(a_and_b.x) # revealed: P & Q
a_and_b.x = R()
```
### Possible unboundness
@@ -1091,6 +1170,7 @@ from knot_extensions import Intersection
class P: ...
class Q: ...
class R(P, Q): ...
def _(flag: bool):
class A1:
@@ -1102,11 +1182,17 @@ def _(flag: bool):
def inner1(a_and_b: Intersection[A1, B1]):
# error: [possibly-unbound-attribute]
reveal_type(a_and_b.x) # revealed: P
# error: [possibly-unbound-attribute]
a_and_b.x = R()
# Same for class objects
def inner1_class(a_and_b: Intersection[type[A1], type[B1]]):
# error: [possibly-unbound-attribute]
reveal_type(a_and_b.x) # revealed: P
# error: [possibly-unbound-attribute]
a_and_b.x = R()
class A2:
if flag:
x: P = P()
@@ -1116,6 +1202,11 @@ def _(flag: bool):
def inner2(a_and_b: Intersection[A2, B1]):
reveal_type(a_and_b.x) # revealed: P & Q
# TODO: this should not be an error, we need better intersection
# handling in `validate_attribute_assignment` for this
# error: [possibly-unbound-attribute]
a_and_b.x = R()
# Same for class objects
def inner2_class(a_and_b: Intersection[type[A2], type[B1]]):
reveal_type(a_and_b.x) # revealed: P & Q
@@ -1131,21 +1222,33 @@ def _(flag: bool):
def inner3(a_and_b: Intersection[A3, B3]):
# error: [possibly-unbound-attribute]
reveal_type(a_and_b.x) # revealed: P & Q
# error: [possibly-unbound-attribute]
a_and_b.x = R()
# Same for class objects
def inner3_class(a_and_b: Intersection[type[A3], type[B3]]):
# error: [possibly-unbound-attribute]
reveal_type(a_and_b.x) # revealed: P & Q
# error: [possibly-unbound-attribute]
a_and_b.x = R()
class A4: ...
class B4: ...
def inner4(a_and_b: Intersection[A4, B4]):
# error: [unresolved-attribute]
reveal_type(a_and_b.x) # revealed: Unknown
# error: [invalid-assignment]
a_and_b.x = R()
# Same for class objects
def inner4_class(a_and_b: Intersection[type[A4], type[B4]]):
# error: [unresolved-attribute]
reveal_type(a_and_b.x) # revealed: Unknown
# error: [invalid-assignment]
a_and_b.x = R()
```
### Intersection of implicit instance attributes


@@ -0,0 +1,22 @@
# Binary operations on tuples
## Concatenation for heterogeneous tuples
```py
reveal_type((1, 2) + (3, 4)) # revealed: tuple[Literal[1], Literal[2], Literal[3], Literal[4]]
reveal_type(() + (1, 2)) # revealed: tuple[Literal[1], Literal[2]]
reveal_type((1, 2) + ()) # revealed: tuple[Literal[1], Literal[2]]
reveal_type(() + ()) # revealed: tuple[()]
def _(x: tuple[int, str], y: tuple[None, tuple[int]]):
reveal_type(x + y) # revealed: tuple[int, str, None, tuple[int]]
reveal_type(y + x) # revealed: tuple[None, tuple[int], int, str]
```
## Concatenation for homogeneous tuples
```py
def _(x: tuple[int, ...], y: tuple[str, ...]):
reveal_type(x + y) # revealed: @Todo(full tuple[...] support)
reveal_type(x + (1, 2)) # revealed: @Todo(full tuple[...] support)
```


@@ -32,14 +32,22 @@ reveal_type(c.ten) # revealed: Literal[10]
reveal_type(C.ten) # revealed: Literal[10]
# These are fine:
# This is fine:
c.ten = 10
C.ten = 10
# error: [invalid-assignment] "Object of type `Literal[11]` is not assignable to attribute `ten` of type `Literal[10]`"
# error: [invalid-assignment] "Invalid assignment to data descriptor attribute `ten` on type `C` with custom `__set__` method"
c.ten = 11
```
# error: [invalid-assignment] "Object of type `Literal[11]` is not assignable to attribute `ten` of type `Literal[10]`"
When assigning to the `ten` attribute from the class object, we get an error. The descriptor
protocol is *not* triggered in this case. Since the attribute is declared as `Ten` in the class
body, we do not allow these assignments, preventing users from accidentally overwriting the data
descriptor, which is what would happen at runtime:
```py
# error: [invalid-assignment] "Object of type `Literal[10]` is not assignable to attribute `ten` of type `Ten`"
C.ten = 10
# error: [invalid-assignment] "Object of type `Literal[11]` is not assignable to attribute `ten` of type `Ten`"
C.ten = 11
```
@@ -66,13 +74,11 @@ c = C()
reveal_type(c.flexible_int) # revealed: int | None
c.flexible_int = 42 # okay
# TODO: This should not be an error
# error: [invalid-assignment]
c.flexible_int = "42" # also okay!
reveal_type(c.flexible_int) # revealed: int | None
# TODO: This should be an error
# error: [invalid-assignment] "Invalid assignment to data descriptor attribute `flexible_int` on type `C` with custom `__set__` method"
c.flexible_int = None # not okay
reveal_type(c.flexible_int) # revealed: int | None
@@ -167,19 +173,24 @@ def f1(flag: bool):
self.attr = "normal"
reveal_type(C1().attr) # revealed: Unknown | Literal["data", "normal"]
# Assigning to the attribute also causes no `possibly-unbound` diagnostic:
C1().attr = 1
```
We never treat implicit instance attributes as definitely bound, so we fall back to the non-data
descriptor here:
```py
def f2(flag: bool):
class C2:
def f(self):
self.attr = "normal"
attr = NonDataDescriptor()
class C2:
def f(self):
self.attr = "normal"
attr = NonDataDescriptor()
reveal_type(C2().attr) # revealed: Unknown | Literal["non-data", "normal"]
reveal_type(C2().attr) # revealed: Unknown | Literal["non-data", "normal"]
# Assignments always go to the instance attribute in this case
C2().attr = 1
```
### Descriptors only work when used as class variables
@@ -198,6 +209,12 @@ class C:
self.ten: Ten = Ten()
reveal_type(C().ten) # revealed: Ten
C().ten = Ten()
# The instance attribute is declared as `Ten`, so this is an
# error: [invalid-assignment] "Object of type `Literal[10]` is not assignable to attribute `ten` of type `Ten`"
C().ten = 10
```
## Descriptor protocol for class objects
@@ -219,7 +236,7 @@ class DataDescriptor:
def __get__(self, instance: object, owner: type | None = None) -> Literal["data"]:
return "data"
def __set__(self, instance: object, value: str) -> None:
def __set__(self, instance: object, value: int) -> None:
pass
class NonDataDescriptor:
@@ -246,7 +263,28 @@ reveal_type(C1.class_data_descriptor) # revealed: Literal["data"]
reveal_type(C1.class_non_data_descriptor) # revealed: Literal["non-data"]
```
Next, we demonstrate that a *metaclass data descriptor* takes precedence over all class-level
Assignments to class object attributes only trigger the descriptor protocol if the data descriptor
is on the metaclass:
```py
C1.meta_data_descriptor = 1
# error: [invalid-assignment] "Invalid assignment to data descriptor attribute `meta_data_descriptor` on type `Literal[C1]` with custom `__set__` method"
C1.meta_data_descriptor = "invalid"
```
When writing to a class-level data descriptor from the class object itself, the descriptor protocol
is *not* triggered (this is in contrast to what happens when you read class-level descriptor
attributes!). So the following assignment does not call `__set__`. At runtime, the assignment would
overwrite the data descriptor, but the attribute is declared as `DataDescriptor` in the class body,
so we do not allow this:
```py
# error: [invalid-assignment] "Object of type `Literal[1]` is not assignable to attribute `class_data_descriptor` of type `DataDescriptor`"
C1.class_data_descriptor = 1
```
We now demonstrate that a *metaclass data descriptor* takes precedence over all class-level
attributes:
```py
@@ -267,6 +305,14 @@ class C2(metaclass=Meta2):
reveal_type(C2.meta_data_descriptor1) # revealed: Literal["data"]
reveal_type(C2.meta_data_descriptor2) # revealed: Literal["data"]
C2.meta_data_descriptor1 = 1
C2.meta_data_descriptor2 = 1
# error: [invalid-assignment]
C2.meta_data_descriptor1 = "invalid"
# error: [invalid-assignment]
C2.meta_data_descriptor2 = "invalid"
```
On the other hand, normal metaclass attributes and metaclass non-data descriptors are shadowed by
@@ -321,6 +367,16 @@ def _(flag: bool):
reveal_type(C5.meta_data_descriptor1) # revealed: Literal["data", "value on class"]
# error: [possibly-unbound-attribute]
reveal_type(C5.meta_data_descriptor2) # revealed: Literal["data"]
# TODO: We currently emit two diagnostics here, corresponding to the two states of `flag`. The diagnostics are not
# wrong, but they could be subsumed under a higher-level diagnostic.
# error: [invalid-assignment] "Invalid assignment to data descriptor attribute `meta_data_descriptor1` on type `Literal[C5]` with custom `__set__` method"
# error: [invalid-assignment] "Object of type `None` is not assignable to attribute `meta_data_descriptor1` of type `Literal["value on class"]`"
C5.meta_data_descriptor1 = None
# error: [possibly-unbound-attribute]
C5.meta_data_descriptor2 = 1
```
When a class-level attribute is possibly unbound, we union its (descriptor protocol) type with the
@@ -373,6 +429,11 @@ def _(flag: bool):
reveal_type(C7.union_of_metaclass_data_descriptor_and_attribute) # revealed: Literal["data", 2]
reveal_type(C7.union_of_class_attributes) # revealed: Literal[1, 2]
reveal_type(C7.union_of_class_data_descriptor_and_attribute) # revealed: Literal["data", 2]
C7.union_of_metaclass_attributes = 2 if flag else 1
C7.union_of_metaclass_data_descriptor_and_attribute = 2 if flag else 100
C7.union_of_class_attributes = 2 if flag else 1
C7.union_of_class_data_descriptor_and_attribute = 2 if flag else DataDescriptor()
```
## Descriptors distinguishing between class and instance access
@@ -469,7 +530,7 @@ c.name = "new"
c.name = None
# TODO: this should be an error, but with a proper error message
# error: [invalid-assignment] "Object of type `Literal[42]` is not assignable to attribute `name` of type `<bound method `name` of `C`>`"
# error: [invalid-assignment] "Implicit shadowing of function `name`; annotate to make it explicit if this is intentional"
c.name = 42
```


@@ -113,6 +113,24 @@ class D(B, A): ... # fine
class E(B, C, A): ... # fine
```
## Post-hoc modifications
```py
class A:
__slots__ = ()
__slots__ += ("a", "b")
reveal_type(A.__slots__) # revealed: tuple[Literal["a"], Literal["b"]]
class B:
__slots__ = ("c", "d")
class C(
A, # error: [incompatible-slots]
B, # error: [incompatible-slots]
): ...
```
## False negatives
### Possibly unbound
@@ -160,22 +178,6 @@ class B:
class C(A, B): ...
```
### Post-hoc modifications
```py
class A:
__slots__ = ()
__slots__ += ("a", "b")
reveal_type(A.__slots__) # revealed: @Todo(return type of decorated function)
class B:
__slots__ = ("c", "d")
# False negative: [incompatible-slots]
class C(A, B): ...
```
### Built-ins with implicit layouts
```py


@@ -0,0 +1,87 @@
# `typing.Final`
[`typing.Final`] is a type qualifier that is used to indicate that a symbol may not be reassigned in
any scope. Final names declared in class scopes cannot be overridden in subclasses.
## Basic
`mod.py`:
```py
from typing import Final, Annotated
FINAL_A: int = 1
FINAL_B: Annotated[Final[int], "the annotation for FINAL_B"] = 1
FINAL_C: Final[Annotated[int, "the annotation for FINAL_C"]] = 1
FINAL_D: Final = 1
FINAL_E: "Final[int]" = 1
reveal_type(FINAL_A) # revealed: Literal[1]
reveal_type(FINAL_B) # revealed: Literal[1]
reveal_type(FINAL_C) # revealed: Literal[1]
reveal_type(FINAL_D) # revealed: Literal[1]
reveal_type(FINAL_E) # revealed: Literal[1]
# TODO: All of these should be errors:
FINAL_A = 2
FINAL_B = 2
FINAL_C = 2
FINAL_D = 2
FINAL_E = 2
```
Public types:
```py
from mod import FINAL_A, FINAL_B, FINAL_C, FINAL_D, FINAL_E
# TODO: All of these should be Literal[1]
reveal_type(FINAL_A) # revealed: int
reveal_type(FINAL_B) # revealed: int
reveal_type(FINAL_C) # revealed: int
reveal_type(FINAL_D) # revealed: Unknown
reveal_type(FINAL_E) # revealed: int
```
## Too many arguments
```py
from typing import Final
class C:
# error: [invalid-type-form] "Type qualifier `typing.Final` expects exactly one type parameter"
x: Final[int, str] = 1
```
## Illegal `Final` in type expression
```py
from typing import Final
class C:
# error: [invalid-type-form] "Type qualifier `typing.Final` is not allowed in type expressions (only in annotation expressions)"
x: Final | int
# error: [invalid-type-form] "Type qualifier `typing.Final` is not allowed in type expressions (only in annotation expressions)"
y: int | Final[str]
```
## No assignment
```py
from typing import Final
DECLARED_THEN_BOUND: Final[int]
DECLARED_THEN_BOUND = 1
```
## No assignment for bare `Final`
```py
from typing import Final
# TODO: This should be an error
NO_RHS: Final
```
[`typing.Final`]: https://docs.python.org/3/library/typing.html#typing.Final


@@ -1169,6 +1169,22 @@ pub(super) fn report_possibly_unresolved_reference(
);
}
pub(super) fn report_possibly_unbound_attribute(
context: &InferContext,
target: &ast::ExprAttribute,
attribute: &str,
object_ty: Type,
) {
context.report_lint(
&POSSIBLY_UNBOUND_ATTRIBUTE,
target,
format_args!(
"Attribute `{attribute}` on type `{}` is possibly unbound",
object_ty.display(context.db()),
),
);
}
pub(super) fn report_unresolved_reference(context: &InferContext, expr_name_node: &ast::ExprName) {
let ast::ExprName { id, .. } = expr_name_node;


@@ -65,13 +65,14 @@ use crate::types::call::{Argument, CallArguments, UnionCallError};
use crate::types::diagnostic::{
report_implicit_return_type, report_invalid_arguments_to_annotated,
report_invalid_arguments_to_callable, report_invalid_assignment,
report_invalid_attribute_assignment, report_invalid_return_type, report_unresolved_module,
TypeCheckDiagnostics, CALL_NON_CALLABLE, CALL_POSSIBLY_UNBOUND_METHOD,
CONFLICTING_DECLARATIONS, CONFLICTING_METACLASS, CYCLIC_CLASS_DEFINITION, DIVISION_BY_ZERO,
DUPLICATE_BASE, INCONSISTENT_MRO, INVALID_ATTRIBUTE_ACCESS, INVALID_BASE, INVALID_DECLARATION,
INVALID_PARAMETER_DEFAULT, INVALID_TYPE_FORM, INVALID_TYPE_VARIABLE_CONSTRAINTS,
POSSIBLY_UNBOUND_ATTRIBUTE, POSSIBLY_UNBOUND_IMPORT, UNDEFINED_REVEAL, UNRESOLVED_ATTRIBUTE,
UNRESOLVED_IMPORT, UNSUPPORTED_OPERATOR,
report_invalid_attribute_assignment, report_invalid_return_type,
report_possibly_unbound_attribute, report_unresolved_module, TypeCheckDiagnostics,
CALL_NON_CALLABLE, CALL_POSSIBLY_UNBOUND_METHOD, CONFLICTING_DECLARATIONS,
CONFLICTING_METACLASS, CYCLIC_CLASS_DEFINITION, DIVISION_BY_ZERO, DUPLICATE_BASE,
INCONSISTENT_MRO, INVALID_ASSIGNMENT, INVALID_ATTRIBUTE_ACCESS, INVALID_BASE,
INVALID_DECLARATION, INVALID_PARAMETER_DEFAULT, INVALID_TYPE_FORM,
INVALID_TYPE_VARIABLE_CONSTRAINTS, POSSIBLY_UNBOUND_IMPORT, UNDEFINED_REVEAL,
UNRESOLVED_ATTRIBUTE, UNRESOLVED_IMPORT, UNSUPPORTED_OPERATOR,
};
use crate::types::mro::MroErrorKind;
use crate::types::unpacker::{UnpackResult, Unpacker};
@@ -2168,6 +2169,353 @@ impl<'db> TypeInferenceBuilder<'db> {
self.infer_target_impl(target, assigned_ty);
}
/// Make sure that the attribute assignment `obj.attribute = value` is valid.
///
/// `target` is the node for the left-hand side, `object_ty` is the type of `obj`, `attribute` is
/// the name of the attribute being assigned, and `value_ty` is the type of the right-hand side of
/// the assignment. If the assignment is invalid, emit diagnostics.
fn validate_attribute_assignment(
&mut self,
target: &ast::ExprAttribute,
object_ty: Type<'db>,
attribute: &str,
value_ty: Type<'db>,
emit_diagnostics: bool,
) -> bool {
let db = self.db();
let ensure_assignable_to = |attr_ty| -> bool {
let assignable = value_ty.is_assignable_to(db, attr_ty);
if !assignable && emit_diagnostics {
report_invalid_attribute_assignment(
&self.context,
target.into(),
attr_ty,
value_ty,
attribute,
);
}
assignable
};
match object_ty {
Type::Union(union) => {
if union.elements(self.db()).iter().all(|elem| {
self.validate_attribute_assignment(target, *elem, attribute, value_ty, false)
}) {
true
} else {
// TODO: This is not a very helpful error message, as it does not include the underlying reason
// why the assignment is invalid. This would be a good use case for nested diagnostics.
if emit_diagnostics {
self.context.report_lint(&INVALID_ASSIGNMENT, target, format_args!(
"Object of type `{}` is not assignable to attribute `{attribute}` on type `{}`",
value_ty.display(self.db()),
object_ty.display(self.db()),
));
}
false
}
}
Type::Intersection(intersection) => {
// TODO: Handle negative intersection elements
if intersection.positive(db).iter().any(|elem| {
self.validate_attribute_assignment(target, *elem, attribute, value_ty, false)
}) {
true
} else {
if emit_diagnostics {
// TODO: same here, see above
self.context.report_lint(&INVALID_ASSIGNMENT, target, format_args!(
"Object of type `{}` is not assignable to attribute `{attribute}` on type `{}`",
value_ty.display(self.db()),
object_ty.display(self.db()),
));
}
false
}
}
Type::Dynamic(..) | Type::Never => true,
Type::Instance(..)
| Type::BooleanLiteral(..)
| Type::IntLiteral(..)
| Type::StringLiteral(..)
| Type::BytesLiteral(..)
| Type::LiteralString
| Type::SliceLiteral(..)
| Type::Tuple(..)
| Type::KnownInstance(..)
| Type::FunctionLiteral(..)
| Type::Callable(..)
| Type::AlwaysTruthy
| Type::AlwaysFalsy => match object_ty.class_member(db, attribute.into()) {
meta_attr @ SymbolAndQualifiers { .. } if meta_attr.is_class_var() => {
if emit_diagnostics {
self.context.report_lint(
&INVALID_ATTRIBUTE_ACCESS,
target,
format_args!(
"Cannot assign to ClassVar `{attribute}` from an instance of type `{ty}`",
ty = object_ty.display(self.db()),
),
);
}
false
}
SymbolAndQualifiers {
symbol: Symbol::Type(meta_attr_ty, meta_attr_boundness),
qualifiers: _,
} => {
let assignable_to_meta_attr = if let Symbol::Type(meta_dunder_set, _) =
meta_attr_ty.class_member(db, "__set__".into()).symbol
{
let successful_call = meta_dunder_set
.try_call(
db,
&CallArguments::positional([meta_attr_ty, object_ty, value_ty]),
)
.is_ok();
if !successful_call && emit_diagnostics {
// TODO: Here, it would be nice to emit an additional diagnostic that explains why the call failed
self.context.report_lint(
&INVALID_ASSIGNMENT,
target,
format_args!(
"Invalid assignment to data descriptor attribute `{attribute}` on type `{}` with custom `__set__` method",
object_ty.display(db)
),
);
}
successful_call
} else {
ensure_assignable_to(meta_attr_ty)
};
let assignable_to_instance_attribute =
if meta_attr_boundness == Boundness::PossiblyUnbound {
let (assignable, boundness) =
if let Symbol::Type(instance_attr_ty, instance_attr_boundness) =
object_ty.instance_member(db, attribute).symbol
{
(
ensure_assignable_to(instance_attr_ty),
instance_attr_boundness,
)
} else {
(true, Boundness::PossiblyUnbound)
};
if boundness == Boundness::PossiblyUnbound {
report_possibly_unbound_attribute(
&self.context,
target,
attribute,
object_ty,
);
}
assignable
} else {
true
};
assignable_to_meta_attr && assignable_to_instance_attribute
}
SymbolAndQualifiers {
symbol: Symbol::Unbound,
..
} => {
if let Symbol::Type(instance_attr_ty, instance_attr_boundness) =
object_ty.instance_member(db, attribute).symbol
{
if instance_attr_boundness == Boundness::PossiblyUnbound {
report_possibly_unbound_attribute(
&self.context,
target,
attribute,
object_ty,
);
}
ensure_assignable_to(instance_attr_ty)
} else {
if emit_diagnostics {
self.context.report_lint(
&UNRESOLVED_ATTRIBUTE,
target,
format_args!(
"Unresolved attribute `{}` on type `{}`.",
attribute,
object_ty.display(db)
),
);
}
false
}
}
},
Type::ClassLiteral(..) | Type::SubclassOf(..) => {
match object_ty.class_member(db, attribute.into()) {
SymbolAndQualifiers {
symbol: Symbol::Type(meta_attr_ty, meta_attr_boundness),
qualifiers: _,
} => {
let assignable_to_meta_attr = if let Symbol::Type(meta_dunder_set, _) =
meta_attr_ty.class_member(db, "__set__".into()).symbol
{
let successful_call = meta_dunder_set
.try_call(
db,
&CallArguments::positional([meta_attr_ty, object_ty, value_ty]),
)
.is_ok();
if !successful_call && emit_diagnostics {
// TODO: Here, it would be nice to emit an additional diagnostic that explains why the call failed
self.context.report_lint(
&INVALID_ASSIGNMENT,
target,
format_args!(
"Invalid assignment to data descriptor attribute `{attribute}` on type `{}` with custom `__set__` method",
object_ty.display(db)
),
);
}
successful_call
} else {
ensure_assignable_to(meta_attr_ty)
};
let assignable_to_class_attr = if meta_attr_boundness
== Boundness::PossiblyUnbound
{
let (assignable, boundness) = if let Symbol::Type(
class_attr_ty,
class_attr_boundness,
) = object_ty
.find_name_in_mro(db, attribute)
.expect("called on Type::ClassLiteral or Type::SubclassOf")
.symbol
{
(ensure_assignable_to(class_attr_ty), class_attr_boundness)
} else {
(true, Boundness::PossiblyUnbound)
};
if boundness == Boundness::PossiblyUnbound {
report_possibly_unbound_attribute(
&self.context,
target,
attribute,
object_ty,
);
}
assignable
} else {
true
};
assignable_to_meta_attr && assignable_to_class_attr
}
SymbolAndQualifiers {
symbol: Symbol::Unbound,
..
} => {
if let Symbol::Type(class_attr_ty, class_attr_boundness) = object_ty
.find_name_in_mro(db, attribute)
.expect("called on Type::ClassLiteral or Type::SubclassOf")
.symbol
{
if class_attr_boundness == Boundness::PossiblyUnbound {
report_possibly_unbound_attribute(
&self.context,
target,
attribute,
object_ty,
);
}
ensure_assignable_to(class_attr_ty)
} else {
let attribute_is_bound_on_instance =
object_ty.to_instance(self.db()).is_some_and(|instance| {
!instance
.instance_member(self.db(), attribute)
.symbol
.is_unbound()
});
// Attribute is declared or bound on instance. Forbid access from the class object
if emit_diagnostics {
if attribute_is_bound_on_instance {
self.context.report_lint(
&INVALID_ATTRIBUTE_ACCESS,
target,
format_args!(
"Cannot assign to instance attribute `{attribute}` from the class object `{ty}`",
ty = object_ty.display(self.db()),
));
} else {
self.context.report_lint(
&UNRESOLVED_ATTRIBUTE,
target,
format_args!(
"Unresolved attribute `{}` on type `{}`.",
attribute,
object_ty.display(db)
),
);
}
}
false
}
}
}
}
Type::ModuleLiteral(module) => {
if let Symbol::Type(attr_ty, _) = module.static_member(db, attribute) {
let assignable = value_ty.is_assignable_to(db, attr_ty);
if !assignable {
report_invalid_attribute_assignment(
&self.context,
target.into(),
attr_ty,
value_ty,
attribute,
);
}
false
} else {
self.context.report_lint(
&UNRESOLVED_ATTRIBUTE,
target,
format_args!(
"Unresolved attribute `{}` on type `{}`.",
attribute,
object_ty.display(db)
),
);
false
}
}
}
}
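The `__set__` lookup in `validate_attribute_assignment` above mirrors Python's data-descriptor protocol: when the class-level attribute defines `__set__`, assignment is routed through it, which is why the checker tries the call with `(descriptor, object, value)`. A runnable sketch of that runtime behavior (all names here are hypothetical, for illustration only):

```python
class Positive:
    """A data descriptor: defining __set__ makes instance assignment go through it."""

    def __set_name__(self, owner, name):
        self.slot = "_" + name

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, self.slot)

    def __set__(self, obj, value):
        # The descriptor, not the instance dict, decides whether the
        # assignment is accepted.
        if value <= 0:
            raise ValueError("must be positive")
        setattr(obj, self.slot, value)


class Account:
    balance = Positive()


acct = Account()
acct.balance = 10      # routed through Positive.__set__
print(acct.balance)    # 10
```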
fn infer_target_impl(&mut self, target: &ast::Expr, assigned_ty: Option<Type<'db>>) {
match target {
ast::Expr::Name(name) => self.infer_definition(name),
@@ -2185,25 +2533,25 @@ impl<'db> TypeInferenceBuilder<'db> {
}
}
ast::Expr::Attribute(
lhs_expr @ ast::ExprAttribute {
attr_expr @ ast::ExprAttribute {
value: object,
ctx: ExprContext::Store,
attr,
..
},
) => {
let attribute_expr_ty = self.infer_attribute_expression(lhs_expr);
self.store_expression_type(target, attribute_expr_ty);
self.store_expression_type(target, Type::Never);
let object_ty = self.infer_expression(object);
if let Some(assigned_ty) = assigned_ty {
if !assigned_ty.is_assignable_to(self.db(), attribute_expr_ty) {
report_invalid_attribute_assignment(
&self.context,
target.into(),
attribute_expr_ty,
assigned_ty,
attr.as_str(),
);
}
self.validate_attribute_assignment(
attr_expr,
object_ty,
attr.id(),
assigned_ty,
true,
);
}
}
_ => {
@@ -3988,15 +4336,13 @@ impl<'db> TypeInferenceBuilder<'db> {
Type::unknown().into()
}
LookupError::PossiblyUnbound(type_when_bound) => {
self.context.report_lint(
&POSSIBLY_UNBOUND_ATTRIBUTE,
report_possibly_unbound_attribute(
&self.context,
attribute,
format_args!(
"Attribute `{}` on type `{}` is possibly unbound",
attr.id,
value_type.display(db),
),
&attr.id,
value_type,
);
type_when_bound
}
}).inner_type()
@@ -4005,73 +4351,14 @@ impl<'db> TypeInferenceBuilder<'db> {
fn infer_attribute_expression(&mut self, attribute: &ast::ExprAttribute) -> Type<'db> {
let ast::ExprAttribute {
value,
attr,
attr: _,
range: _,
ctx,
} = attribute;
match ctx {
ExprContext::Load => self.infer_attribute_load(attribute),
ExprContext::Store => {
let value_ty = self.infer_expression(value);
let symbol = match value_ty {
Type::Instance(_) => {
let instance_member = value_ty.member(self.db(), &attr.id);
if instance_member.is_class_var() {
self.context.report_lint(
&INVALID_ATTRIBUTE_ACCESS,
attribute,
format_args!(
"Cannot assign to ClassVar `{attr}` from an instance of type `{ty}`",
ty = value_ty.display(self.db()),
),
);
}
instance_member.symbol
}
Type::ClassLiteral(_) | Type::SubclassOf(_) => {
let class_member = value_ty.member(self.db(), &attr.id).symbol;
if class_member.is_unbound() {
let class = match value_ty {
Type::ClassLiteral(class) => Some(class.class()),
Type::SubclassOf(subclass_of @ SubclassOfType { .. }) => {
match subclass_of.subclass_of() {
ClassBase::Class(class) => Some(class),
ClassBase::Dynamic(_) => unreachable!("Attribute lookup on a dynamic `SubclassOf` type should always return a bound symbol"),
}
}
_ => None,
};
if let Some(class) = class {
let instance_member = class.instance_member(self.db(), attr).symbol;
// Attribute is declared or bound on instance. Forbid access from the class object
if !instance_member.is_unbound() {
self.context.report_lint(
&INVALID_ATTRIBUTE_ACCESS,
attribute,
format_args!(
"Cannot assign to instance attribute `{attr}` from the class object `{ty}`",
ty = value_ty.display(self.db()),
));
}
}
}
class_member
}
_ => value_ty.member(self.db(), &attr.id).symbol,
};
// TODO: The unbound-case might also yield a diagnostic, but we can not activate
// this yet until we understand implicit instance attributes (those that are not
// defined in the class body), as there would be too many false positives.
symbol.ignore_possibly_unbound().unwrap_or(Type::unknown())
}
ExprContext::Del => {
ExprContext::Store | ExprContext::Del => {
self.infer_expression(value);
Type::Never
}
@@ -4346,6 +4633,20 @@ impl<'db> TypeInferenceBuilder<'db> {
self.infer_binary_expression_type(left, Type::IntLiteral(i64::from(bool_value)), op)
}
(Type::Tuple(lhs), Type::Tuple(rhs), ast::Operator::Add) => {
// Note: this only works on heterogeneous tuples.
let lhs_elements = lhs.elements(self.db());
let rhs_elements = rhs.elements(self.db());
Some(TupleType::from_elements(
self.db(),
lhs_elements
.iter()
.copied()
.chain(rhs_elements.iter().copied()),
))
}
// We've handled all of the special cases that we support for literals, so we need to
// fall back on looking for dunder methods on one of the operand types.
(
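The new match arm mirrors runtime tuple concatenation, where element order and arity are preserved exactly; that is what allows the checker to infer a precise heterogeneous tuple type instead of falling back to `tuple[int, ...]`. A runtime sketch:

```python
lhs = (1, 2)
rhs = (3, 4)

# Concatenation preserves the elements of both operands in order.
result = lhs + rhs
print(result)  # (1, 2, 3, 4)

# With the new inference, a checker can type this as
# tuple[Literal[1], Literal[2], Literal[3], Literal[4]].
```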


@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.10.0"
version = "0.11.0"
publish = true
authors = { workspace = true }
edition = { workspace = true }


@@ -153,7 +153,11 @@ pub fn run(
}));
}
set_up_logging(global_options.log_level())?;
// Don't set up logging for the server command, as it has its own logging setup
// and setting the global logger can only be done once.
if !matches!(command, Command::Server { .. }) {
set_up_logging(global_options.log_level())?;
}
match command {
Command::Version { output_format } => {


@@ -5,12 +5,14 @@ use log::debug;
use path_absolutize::path_dedot;
use ruff_workspace::configuration::Configuration;
use ruff_workspace::pyproject;
use ruff_workspace::pyproject::{self, find_fallback_target_version};
use ruff_workspace::resolver::{
resolve_root_settings, ConfigurationTransformer, PyprojectConfig, PyprojectDiscoveryStrategy,
Relativity,
resolve_root_settings, ConfigurationOrigin, ConfigurationTransformer, PyprojectConfig,
PyprojectDiscoveryStrategy,
};
use ruff_python_ast as ast;
use crate::args::ConfigArguments;
/// Resolve the relevant settings strategy and defaults for the current
@@ -35,7 +37,11 @@ pub fn resolve(
// `pyproject.toml` for _all_ configuration, and resolve paths relative to the
// current working directory. (This matches ESLint's behavior.)
if let Some(pyproject) = config_arguments.config_file() {
let settings = resolve_root_settings(pyproject, Relativity::Cwd, config_arguments)?;
let settings = resolve_root_settings(
pyproject,
config_arguments,
ConfigurationOrigin::UserSpecified,
)?;
debug!(
"Using user-specified configuration file at: {}",
pyproject.display()
@@ -61,7 +67,8 @@ pub fn resolve(
"Using configuration file (via parent) at: {}",
pyproject.display()
);
let settings = resolve_root_settings(&pyproject, Relativity::Parent, config_arguments)?;
let settings =
resolve_root_settings(&pyproject, config_arguments, ConfigurationOrigin::Ancestor)?;
return Ok(PyprojectConfig::new(
PyprojectDiscoveryStrategy::Hierarchical,
settings,
@@ -74,11 +81,35 @@ pub fn resolve(
// end up the "closest" `pyproject.toml` file for every Python file later on, so
// these act as the "default" settings.)
if let Some(pyproject) = pyproject::find_user_settings_toml() {
struct FallbackTransformer<'a> {
arguments: &'a ConfigArguments,
}
impl ConfigurationTransformer for FallbackTransformer<'_> {
fn transform(&self, mut configuration: Configuration) -> Configuration {
// The `requires-python` constraint from the `pyproject.toml` takes precedence
// over the `target-version` from the user configuration.
let fallback = find_fallback_target_version(&*path_dedot::CWD);
if let Some(fallback) = fallback {
debug!("Derived `target-version` from found `requires-python`: {fallback:?}");
configuration.target_version = Some(fallback.into());
}
self.arguments.transform(configuration)
}
}
debug!(
"Using configuration file (via cwd) at: {}",
pyproject.display()
);
let settings = resolve_root_settings(&pyproject, Relativity::Cwd, config_arguments)?;
let settings = resolve_root_settings(
&pyproject,
&FallbackTransformer {
arguments: config_arguments,
},
ConfigurationOrigin::UserSettings,
)?;
return Ok(PyprojectConfig::new(
PyprojectDiscoveryStrategy::Hierarchical,
settings,
@@ -91,7 +122,24 @@ pub fn resolve(
// "closest" `pyproject.toml` file for every Python file later on, so these act
// as the "default" settings.)
debug!("Using Ruff default settings");
let config = config_arguments.transform(Configuration::default());
let mut config = config_arguments.transform(Configuration::default());
if config.target_version.is_none() {
// If we have arrived here we know that there was no `pyproject.toml`
// containing a `[tool.ruff]` section found in an ancestral directory.
// (This is an implicit requirement in the function
// `pyproject::find_settings_toml`.)
// However, there may be a `pyproject.toml` with a `requires-python`
// specified, and that is what we look for in this step.
let fallback = find_fallback_target_version(
stdin_filename
.as_ref()
.unwrap_or(&path_dedot::CWD.as_path()),
);
if let Some(version) = fallback {
debug!("Derived `target-version` from found `requires-python`: {version:?}");
}
config.target_version = fallback.map(ast::PythonVersion::from);
}
let settings = config.into_settings(&path_dedot::CWD)?;
Ok(PyprojectConfig::new(
PyprojectDiscoveryStrategy::Hierarchical,

File diff suppressed because it is too large


@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.10.0"
version = "0.11.0"
publish = false
authors = { workspace = true }
edition = { workspace = true }


@@ -224,7 +224,6 @@ pub(crate) fn check_noqa(
&noqa_directives,
locator,
&file_noqa_directives,
settings.preview,
);
}


@@ -10,7 +10,6 @@ mod tests {
use crate::registry::Rule;
use crate::settings::types::PreviewMode;
use crate::test::test_path;
use crate::{assert_messages, settings};
@@ -30,22 +29,4 @@ mod tests {
assert_messages!(snapshot, diagnostics);
Ok(())
}
#[test_case(Rule::BlanketNOQA, Path::new("PGH004_2.py"))]
fn preview_rules(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!(
"preview__{}_{}",
rule_code.noqa_code(),
path.to_string_lossy()
);
let diagnostics = test_path(
Path::new("pygrep_hooks").join(path).as_path(),
&settings::LinterSettings {
preview: PreviewMode::Enabled,
..settings::LinterSettings::for_rule(rule_code)
},
)?;
assert_messages!(snapshot, diagnostics);
Ok(())
}
}


@@ -4,7 +4,6 @@ use ruff_python_trivia::Cursor;
use ruff_text_size::{Ranged, TextRange};
use crate::noqa::{self, Directive, FileNoqaDirectives, NoqaDirectives};
use crate::settings::types::PreviewMode;
use crate::Locator;
/// ## What it does
@@ -18,9 +17,6 @@ use crate::Locator;
/// maintain, as the annotation does not clarify which diagnostics are intended
/// to be suppressed.
///
/// In [preview], this rule also checks for blanket file-level annotations (e.g.,
/// `# ruff: noqa`, as opposed to `# ruff: noqa: F401`).
///
/// ## Example
/// ```python
/// from .base import * # noqa
@@ -41,8 +37,6 @@ use crate::Locator;
///
/// ## References
/// - [Ruff documentation](https://docs.astral.sh/ruff/configuration/#error-suppression)
///
/// [preview]: https://docs.astral.sh/ruff/preview/
#[derive(ViolationMetadata)]
pub(crate) struct BlanketNOQA {
missing_colon: bool,
@@ -84,19 +78,16 @@ pub(crate) fn blanket_noqa(
noqa_directives: &NoqaDirectives,
locator: &Locator,
file_noqa_directives: &FileNoqaDirectives,
preview: PreviewMode,
) {
if preview.is_enabled() {
for line in file_noqa_directives.lines() {
if let Directive::All(_) = line.parsed_file_exemption {
diagnostics.push(Diagnostic::new(
BlanketNOQA {
missing_colon: false,
file_exemption: true,
},
line.range(),
));
}
for line in file_noqa_directives.lines() {
if let Directive::All(_) = line.parsed_file_exemption {
diagnostics.push(Diagnostic::new(
BlanketNOQA {
missing_colon: false,
file_exemption: true,
},
line.range(),
));
}
}
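The rule stabilized above distinguishes blanket suppressions from code-specific ones. A small runnable Python illustration of the comment forms involved (the imported modules are arbitrary; at runtime these are ordinary comments):

```python
import math  # noqa: F401 -- specific rule code: preferred
import os    # noqa -- blanket line-level suppression: flagged by PGH004

print(math.pi > 3)  # True
```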


@@ -1,6 +1,5 @@
---
source: crates/ruff_linter/src/rules/pygrep_hooks/mod.rs
snapshot_kind: text
---
PGH004_2.py:1:1: PGH004 Use specific rule codes when using `noqa`
|
@@ -9,3 +8,18 @@ PGH004_2.py:1:1: PGH004 Use specific rule codes when using `noqa`
2 | # ruff : noqa
3 | # ruff: noqa: F401
|
PGH004_2.py:2:1: PGH004 Use specific rule codes when using `ruff: noqa`
|
1 | # noqa
2 | # ruff : noqa
| ^^^^^^^^^^^^^ PGH004
3 | # ruff: noqa: F401
|
PGH004_2.py:6:1: PGH004 Use specific rule codes when using `ruff: noqa`
|
6 | # flake8: noqa
| ^^^^^^^^^^^^^^ PGH004
7 | import math as m
|


@@ -1,26 +0,0 @@
---
source: crates/ruff_linter/src/rules/pygrep_hooks/mod.rs
snapshot_kind: text
---
PGH004_2.py:1:1: PGH004 Use specific rule codes when using `noqa`
|
1 | # noqa
| ^^^^^^ PGH004
2 | # ruff : noqa
3 | # ruff: noqa: F401
|
PGH004_2.py:2:1: PGH004 Use specific rule codes when using `ruff: noqa`
|
1 | # noqa
2 | # ruff : noqa
| ^^^^^^^^^^^^^ PGH004
3 | # ruff: noqa: F401
|
PGH004_2.py:6:1: PGH004 Use specific rule codes when using `ruff: noqa`
|
6 | # flake8: noqa
| ^^^^^^^^^^^^^^ PGH004
7 | import math as m
|


@@ -0,0 +1,4 @@
# parse_options: {"target-version": "3.8"}
for x in *a, b: ...
for x in a, *b: ...
for x in *a, *b: ...


@@ -0,0 +1,4 @@
# parse_options: {"target-version": "3.8"}
for x in (*a, b): ...
for x in ( a, *b): ...
for x in (*a, *b): ...


@@ -0,0 +1,4 @@
# parse_options: {"target-version": "3.9"}
for x in *a, b: ...
for x in a, *b: ...
for x in *a, *b: ...


@@ -616,6 +616,33 @@ pub enum UnsupportedSyntaxErrorKind {
TypeParameterList,
TypeAliasStatement,
TypeParamDefault,
/// Represents the use of tuple unpacking in a `for` statement iterator clause before Python
/// 3.9.
///
/// ## Examples
///
/// Like [`UnsupportedSyntaxErrorKind::StarTuple`] in `return` and `yield` statements, prior to
/// Python 3.9, tuple unpacking in the iterator clause of a `for` statement required
/// parentheses:
///
/// ```python
/// # valid on Python 3.8 and earlier
/// for i in (*a, *b): ...
/// ```
///
/// Omitting the parentheses was invalid:
///
/// ```python
/// for i in *a, *b: ... # SyntaxError
/// ```
///
/// This was changed as part of the [PEG parser rewrite] included in Python 3.9 but not
/// documented directly until the [Python 3.11 release].
///
/// [PEG parser rewrite]: https://peps.python.org/pep-0617/
/// [Python 3.11 release]: https://docs.python.org/3/whatsnew/3.11.html#other-language-changes
UnparenthesizedUnpackInFor,
}
impl Display for UnsupportedSyntaxError {
@@ -642,6 +669,9 @@ impl Display for UnsupportedSyntaxError {
UnsupportedSyntaxErrorKind::TypeParamDefault => {
"Cannot set default type for a type parameter"
}
UnsupportedSyntaxErrorKind::UnparenthesizedUnpackInFor => {
"Cannot use iterable unpacking in `for` statements"
}
};
write!(
@@ -687,6 +717,9 @@ impl UnsupportedSyntaxErrorKind {
UnsupportedSyntaxErrorKind::TypeParameterList => Change::Added(PythonVersion::PY312),
UnsupportedSyntaxErrorKind::TypeAliasStatement => Change::Added(PythonVersion::PY312),
UnsupportedSyntaxErrorKind::TypeParamDefault => Change::Added(PythonVersion::PY313),
UnsupportedSyntaxErrorKind::UnparenthesizedUnpackInFor => {
Change::Added(PythonVersion::PY39)
}
}
}
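As a runtime check of the version gate above: on Python 3.9 and newer, an unparenthesized starred tuple is valid as a `for` iterator (on 3.8 and earlier the same line is a `SyntaxError`). A minimal sketch, assuming it runs on Python 3.9+:

```python
a, b = (1, 2), (3, 4)

collected = []
for x in *a, *b:  # SyntaxError on Python 3.8 and earlier
    collected.append(x)

print(collected)  # [1, 2, 3, 4]
```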


@@ -2130,7 +2130,10 @@ impl<'src> Parser<'src> {
// rest = (4, 5, 6)
// def g(): yield 1, 2, 3, *rest
// def h(): yield 1, (yield 2, *rest), 3
self.check_tuple_unpacking(&parsed_expr, StarTupleKind::Yield);
self.check_tuple_unpacking(
&parsed_expr,
UnsupportedSyntaxErrorKind::StarTuple(StarTupleKind::Yield),
);
Box::new(parsed_expr.expr)
});


@@ -406,7 +406,10 @@ impl<'src> Parser<'src> {
// # parse_options: {"target-version": "3.7"}
// rest = (4, 5, 6)
// def f(): return 1, 2, 3, *rest
self.check_tuple_unpacking(&parsed_expr, StarTupleKind::Return);
self.check_tuple_unpacking(
&parsed_expr,
UnsupportedSyntaxErrorKind::StarTuple(StarTupleKind::Return),
);
Box::new(parsed_expr.expr)
});
@@ -420,10 +423,12 @@ impl<'src> Parser<'src> {
/// Report [`UnsupportedSyntaxError`]s for each starred element in `expr` if it is an
/// unparenthesized tuple.
///
/// This method can be used to check for tuple unpacking in return and yield statements, which
/// are only allowed in Python 3.8 and later: <https://github.com/python/cpython/issues/76298>.
pub(crate) fn check_tuple_unpacking(&mut self, expr: &Expr, kind: StarTupleKind) {
let kind = UnsupportedSyntaxErrorKind::StarTuple(kind);
/// This method can be used to check for tuple unpacking in `return`, `yield`, and `for`
/// statements, which are only allowed after [Python 3.8] and [Python 3.9], respectively.
///
/// [Python 3.8]: https://github.com/python/cpython/issues/76298
/// [Python 3.9]: https://github.com/python/cpython/issues/90881
pub(super) fn check_tuple_unpacking(&mut self, expr: &Expr, kind: UnsupportedSyntaxErrorKind) {
if kind.is_supported(self.options.target_version) {
return;
}
@@ -1732,6 +1737,28 @@ impl<'src> Parser<'src> {
// for target in x := 1: ...
let iter = self.parse_expression_list(ExpressionContext::starred_bitwise_or());
// test_ok for_iter_unpack_py39
// # parse_options: {"target-version": "3.9"}
// for x in *a, b: ...
// for x in a, *b: ...
// for x in *a, *b: ...
// test_ok for_iter_unpack_py38
// # parse_options: {"target-version": "3.8"}
// for x in (*a, b): ...
// for x in ( a, *b): ...
// for x in (*a, *b): ...
// test_err for_iter_unpack_py38
// # parse_options: {"target-version": "3.8"}
// for x in *a, b: ...
// for x in a, *b: ...
// for x in *a, *b: ...
self.check_tuple_unpacking(
&iter,
UnsupportedSyntaxErrorKind::UnparenthesizedUnpackInFor,
);
self.expect(TokenKind::Colon);
let body = self.parse_body(Clause::For);


@@ -0,0 +1,220 @@
---
source: crates/ruff_python_parser/tests/fixtures.rs
input_file: crates/ruff_python_parser/resources/inline/err/for_iter_unpack_py38.py
---
## AST
```
Module(
ModModule {
range: 0..106,
body: [
For(
StmtFor {
range: 43..63,
is_async: false,
target: Name(
ExprName {
range: 47..48,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 52..58,
elts: [
Starred(
ExprStarred {
range: 52..54,
value: Name(
ExprName {
range: 53..54,
id: Name("a"),
ctx: Load,
},
),
ctx: Load,
},
),
Name(
ExprName {
range: 57..58,
id: Name("b"),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: false,
},
),
body: [
Expr(
StmtExpr {
range: 60..63,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 60..63,
},
),
},
),
],
orelse: [],
},
),
For(
StmtFor {
range: 64..84,
is_async: false,
target: Name(
ExprName {
range: 68..69,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 74..79,
elts: [
Name(
ExprName {
range: 74..75,
id: Name("a"),
ctx: Load,
},
),
Starred(
ExprStarred {
range: 77..79,
value: Name(
ExprName {
range: 78..79,
id: Name("b"),
ctx: Load,
},
),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: false,
},
),
body: [
Expr(
StmtExpr {
range: 81..84,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 81..84,
},
),
},
),
],
orelse: [],
},
),
For(
StmtFor {
range: 85..105,
is_async: false,
target: Name(
ExprName {
range: 89..90,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 94..100,
elts: [
Starred(
ExprStarred {
range: 94..96,
value: Name(
ExprName {
range: 95..96,
id: Name("a"),
ctx: Load,
},
),
ctx: Load,
},
),
Starred(
ExprStarred {
range: 98..100,
value: Name(
ExprName {
range: 99..100,
id: Name("b"),
ctx: Load,
},
),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: false,
},
),
body: [
Expr(
StmtExpr {
range: 102..105,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 102..105,
},
),
},
),
],
orelse: [],
},
),
],
},
)
```
## Unsupported Syntax Errors
|
1 | # parse_options: {"target-version": "3.8"}
2 | for x in *a, b: ...
| ^^ Syntax Error: Cannot use iterable unpacking in `for` statements on Python 3.8 (syntax was added in Python 3.9)
3 | for x in a, *b: ...
4 | for x in *a, *b: ...
|
|
1 | # parse_options: {"target-version": "3.8"}
2 | for x in *a, b: ...
3 | for x in a, *b: ...
| ^^ Syntax Error: Cannot use iterable unpacking in `for` statements on Python 3.8 (syntax was added in Python 3.9)
4 | for x in *a, *b: ...
|
|
2 | for x in *a, b: ...
3 | for x in a, *b: ...
4 | for x in *a, *b: ...
| ^^ Syntax Error: Cannot use iterable unpacking in `for` statements on Python 3.8 (syntax was added in Python 3.9)
|
|
2 | for x in *a, b: ...
3 | for x in a, *b: ...
4 | for x in *a, *b: ...
| ^^ Syntax Error: Cannot use iterable unpacking in `for` statements on Python 3.8 (syntax was added in Python 3.9)
|


@@ -0,0 +1,186 @@
---
source: crates/ruff_python_parser/tests/fixtures.rs
input_file: crates/ruff_python_parser/resources/inline/ok/for_iter_unpack_py38.py
---
## AST
```
Module(
ModModule {
range: 0..112,
body: [
For(
StmtFor {
range: 43..65,
is_async: false,
target: Name(
ExprName {
range: 47..48,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 52..60,
elts: [
Starred(
ExprStarred {
range: 53..55,
value: Name(
ExprName {
range: 54..55,
id: Name("a"),
ctx: Load,
},
),
ctx: Load,
},
),
Name(
ExprName {
range: 58..59,
id: Name("b"),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: true,
},
),
body: [
Expr(
StmtExpr {
range: 62..65,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 62..65,
},
),
},
),
],
orelse: [],
},
),
For(
StmtFor {
range: 66..88,
is_async: false,
target: Name(
ExprName {
range: 70..71,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 75..83,
elts: [
Name(
ExprName {
range: 77..78,
id: Name("a"),
ctx: Load,
},
),
Starred(
ExprStarred {
range: 80..82,
value: Name(
ExprName {
range: 81..82,
id: Name("b"),
ctx: Load,
},
),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: true,
},
),
body: [
Expr(
StmtExpr {
range: 85..88,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 85..88,
},
),
},
),
],
orelse: [],
},
),
For(
StmtFor {
range: 89..111,
is_async: false,
target: Name(
ExprName {
range: 93..94,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 98..106,
elts: [
Starred(
ExprStarred {
range: 99..101,
value: Name(
ExprName {
range: 100..101,
id: Name("a"),
ctx: Load,
},
),
ctx: Load,
},
),
Starred(
ExprStarred {
range: 103..105,
value: Name(
ExprName {
range: 104..105,
id: Name("b"),
ctx: Load,
},
),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: true,
},
),
body: [
Expr(
StmtExpr {
range: 108..111,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 108..111,
},
),
},
),
],
orelse: [],
},
),
],
},
)
```


@@ -0,0 +1,186 @@
---
source: crates/ruff_python_parser/tests/fixtures.rs
input_file: crates/ruff_python_parser/resources/inline/ok/for_iter_unpack_py39.py
---
## AST
```
Module(
ModModule {
range: 0..106,
body: [
For(
StmtFor {
range: 43..63,
is_async: false,
target: Name(
ExprName {
range: 47..48,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 52..58,
elts: [
Starred(
ExprStarred {
range: 52..54,
value: Name(
ExprName {
range: 53..54,
id: Name("a"),
ctx: Load,
},
),
ctx: Load,
},
),
Name(
ExprName {
range: 57..58,
id: Name("b"),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: false,
},
),
body: [
Expr(
StmtExpr {
range: 60..63,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 60..63,
},
),
},
),
],
orelse: [],
},
),
For(
StmtFor {
range: 64..84,
is_async: false,
target: Name(
ExprName {
range: 68..69,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 74..79,
elts: [
Name(
ExprName {
range: 74..75,
id: Name("a"),
ctx: Load,
},
),
Starred(
ExprStarred {
range: 77..79,
value: Name(
ExprName {
range: 78..79,
id: Name("b"),
ctx: Load,
},
),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: false,
},
),
body: [
Expr(
StmtExpr {
range: 81..84,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 81..84,
},
),
},
),
],
orelse: [],
},
),
For(
StmtFor {
range: 85..105,
is_async: false,
target: Name(
ExprName {
range: 89..90,
id: Name("x"),
ctx: Store,
},
),
iter: Tuple(
ExprTuple {
range: 94..100,
elts: [
Starred(
ExprStarred {
range: 94..96,
value: Name(
ExprName {
range: 95..96,
id: Name("a"),
ctx: Load,
},
),
ctx: Load,
},
),
Starred(
ExprStarred {
range: 98..100,
value: Name(
ExprName {
range: 99..100,
id: Name("b"),
ctx: Load,
},
),
ctx: Load,
},
),
],
ctx: Load,
parenthesized: false,
},
),
body: [
Expr(
StmtExpr {
range: 102..105,
value: EllipsisLiteral(
ExprEllipsisLiteral {
range: 102..105,
},
),
},
),
],
orelse: [],
},
),
],
},
)
```


@@ -40,6 +40,7 @@ shellexpand = { workspace = true }
thiserror = { workspace = true }
toml = { workspace = true }
tracing = { workspace = true }
tracing-log = { workspace = true }
tracing-subscriber = { workspace = true, features = ["chrono"] }
[dev-dependencies]


@@ -64,6 +64,8 @@ pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&std::path::Pat
     tracing::subscriber::set_global_default(subscriber)
         .expect("should be able to set global default subscriber");
+
+    tracing_log::LogTracer::init().unwrap();
 }

 /// The log level for the server as provided by the client during initialization.


@@ -9,12 +9,13 @@ use ignore::{WalkBuilder, WalkState};
 use ruff_linter::settings::types::GlobPath;
 use ruff_linter::{settings::types::FilePattern, settings::types::PreviewMode};
+use ruff_workspace::pyproject::find_fallback_target_version;
 use ruff_workspace::resolver::match_exclusion;
 use ruff_workspace::Settings;
 use ruff_workspace::{
     configuration::{Configuration, FormatConfiguration, LintConfiguration, RuleSelection},
     pyproject::{find_user_settings_toml, settings_toml},
-    resolver::{ConfigurationTransformer, Relativity},
+    resolver::ConfigurationTransformer,
 };

 use crate::session::settings::{
@@ -64,12 +65,36 @@ impl RuffSettings {
     /// In the absence of a valid configuration file, it gracefully falls back to
     /// editor-only settings.
     pub(crate) fn fallback(editor_settings: &ResolvedEditorSettings, root: &Path) -> RuffSettings {
+        struct FallbackTransformer<'a> {
+            inner: EditorConfigurationTransformer<'a>,
+        }
+
+        impl ConfigurationTransformer for FallbackTransformer<'_> {
+            fn transform(&self, mut configuration: Configuration) -> Configuration {
+                let fallback = find_fallback_target_version(self.inner.1);
+                if let Some(fallback) = fallback {
+                    tracing::debug!(
+                        "Derived `target-version` from found `requires-python`: {fallback:?}"
+                    );
+                    configuration.target_version = Some(fallback.into());
+                }
+
+                self.inner.transform(configuration)
+            }
+        }
+
         find_user_settings_toml()
             .and_then(|user_settings| {
                 tracing::debug!(
                     "Loading settings from user configuration file: `{}`",
                     user_settings.display()
                 );
                 ruff_workspace::resolver::resolve_root_settings(
                     &user_settings,
-                    Relativity::Cwd,
-                    &EditorConfigurationTransformer(editor_settings, root),
+                    &FallbackTransformer {
+                        inner: EditorConfigurationTransformer(editor_settings, root),
+                    },
+                    ruff_workspace::resolver::ConfigurationOrigin::UserSettings,
                 )
                 .ok()
                 .map(|settings| RuffSettings {
@@ -77,21 +102,45 @@ impl RuffSettings {
                     settings,
                 })
             })
-            .unwrap_or_else(|| Self::editor_only(editor_settings, root))
+            .unwrap_or_else(|| {
+                let fallback = find_fallback_target_version(root);
+                if let Some(fallback) = fallback {
+                    tracing::debug!(
+                        "Derived `target-version` from found `requires-python` for fallback configuration: {fallback:?}"
+                    );
+                }
+                let configuration = Configuration {
+                    target_version: fallback.map(Into::into),
+                    ..Configuration::default()
+                };
+                Self::with_editor_settings(editor_settings, root, configuration).expect(
+                    "editor configuration should merge successfully with default configuration",
+                )
+            })
     }

     /// Constructs [`RuffSettings`] by merging the editor-defined settings with the
     /// default configuration.
     fn editor_only(editor_settings: &ResolvedEditorSettings, root: &Path) -> RuffSettings {
-        let settings = EditorConfigurationTransformer(editor_settings, root)
-            .transform(Configuration::default())
-            .into_settings(root)
-            .expect("editor configuration should merge successfully with default configuration");
+        Self::with_editor_settings(editor_settings, root, Configuration::default())
+            .expect("editor configuration should merge successfully with default configuration")
+    }

-        RuffSettings {
+    /// Merges the `configuration` with the editor defined settings.
+    fn with_editor_settings(
+        editor_settings: &ResolvedEditorSettings,
+        root: &Path,
+        configuration: Configuration,
+    ) -> anyhow::Result<RuffSettings> {
+        let settings = EditorConfigurationTransformer(editor_settings, root)
+            .transform(configuration)
+            .into_settings(root)?;
+
+        Ok(RuffSettings {
             path: None,
             settings,
-        }
+        })
     }
 }
@@ -140,10 +189,11 @@ impl RuffSettingsIndex {
             Ok(Some(pyproject)) => {
                 match ruff_workspace::resolver::resolve_root_settings(
                     &pyproject,
-                    Relativity::Parent,
                     &EditorConfigurationTransformer(editor_settings, root),
+                    ruff_workspace::resolver::ConfigurationOrigin::Ancestor,
                 ) {
                     Ok(settings) => {
                         tracing::debug!("Loaded settings from: `{}`", pyproject.display());
                         respect_gitignore = Some(settings.file_resolver.respect_gitignore);
                         index.insert(
@@ -264,10 +314,15 @@ impl RuffSettingsIndex {
                     Ok(Some(pyproject)) => {
                         match ruff_workspace::resolver::resolve_root_settings(
                             &pyproject,
-                            Relativity::Parent,
                             &EditorConfigurationTransformer(editor_settings, root),
+                            ruff_workspace::resolver::ConfigurationOrigin::Ancestor,
                         ) {
                             Ok(settings) => {
                                 tracing::debug!(
                                     "Loaded settings from: `{}` for `{}`",
                                     pyproject.display(),
                                     directory.display()
                                 );
                                 index.write().unwrap().insert(
                                     directory,
                                     Arc::new(RuffSettings {
@@ -437,8 +492,8 @@ impl ConfigurationTransformer for EditorConfigurationTransformer<'_> {
 fn open_configuration_file(config_path: &Path) -> crate::Result<Configuration> {
     ruff_workspace::resolver::resolve_configuration(
         config_path,
-        Relativity::Cwd,
         &IdentityTransformer,
+        ruff_workspace::resolver::ConfigurationOrigin::UserSpecified,
     )
 }
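The `FallbackTransformer` in the diff above decorates the editor transformer so that a `target-version` derived from `requires-python` is patched into the configuration before the editor settings are applied. A minimal standalone sketch of that wrapper pattern follows; `Configuration`, `Identity`, and the `(u8, u8)` version tuple are simplified stand-ins for illustration, not ruff's actual types.

```rust
// Illustrative sketch of the wrapper pattern used by `FallbackTransformer`:
// decorate an inner transformer and patch a derived `target_version` into the
// configuration before delegating. Types are simplified stand-ins.

#[derive(Debug, Default)]
struct Configuration {
    target_version: Option<(u8, u8)>, // e.g. (3, 9) for Python 3.9
}

trait ConfigurationTransformer {
    fn transform(&self, configuration: Configuration) -> Configuration;
}

/// Stand-in for the editor transformer: passes the configuration through.
struct Identity;

impl ConfigurationTransformer for Identity {
    fn transform(&self, configuration: Configuration) -> Configuration {
        configuration
    }
}

/// Wraps an inner transformer; applies a fallback `target_version`
/// (e.g. one derived from `requires-python`) before delegating.
struct FallbackTransformer<T> {
    inner: T,
    fallback: Option<(u8, u8)>,
}

impl<T: ConfigurationTransformer> ConfigurationTransformer for FallbackTransformer<T> {
    fn transform(&self, mut configuration: Configuration) -> Configuration {
        if let Some(fallback) = self.fallback {
            configuration.target_version = Some(fallback);
        }
        self.inner.transform(configuration)
    }
}

fn main() {
    let transformer = FallbackTransformer {
        inner: Identity,
        fallback: Some((3, 9)),
    };
    let config = transformer.transform(Configuration::default());
    assert_eq!(config.target_version, Some((3, 9)));
}
```

Because the wrapper delegates to the inner transformer last, explicit editor settings can still override the derived version, which mirrors the precedence the change is after.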


@@ -1,6 +1,6 @@
 [package]
 name = "ruff_wasm"
-version = "0.10.0"
+version = "0.11.0"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }


@@ -42,18 +42,18 @@ impl Pyproject {
 /// Parse a `ruff.toml` file.
 fn parse_ruff_toml<P: AsRef<Path>>(path: P) -> Result<Options> {
-    let contents = std::fs::read_to_string(path.as_ref())
-        .with_context(|| format!("Failed to read {}", path.as_ref().display()))?;
-    toml::from_str(&contents)
-        .with_context(|| format!("Failed to parse {}", path.as_ref().display()))
+    let path = path.as_ref();
+    let contents = std::fs::read_to_string(path)
+        .with_context(|| format!("Failed to read {}", path.display()))?;
+    toml::from_str(&contents).with_context(|| format!("Failed to parse {}", path.display()))
 }

 /// Parse a `pyproject.toml` file.
 fn parse_pyproject_toml<P: AsRef<Path>>(path: P) -> Result<Pyproject> {
-    let contents = std::fs::read_to_string(path.as_ref())
-        .with_context(|| format!("Failed to read {}", path.as_ref().display()))?;
-    toml::from_str(&contents)
-        .with_context(|| format!("Failed to parse {}", path.as_ref().display()))
+    let path = path.as_ref();
+    let contents = std::fs::read_to_string(path)
+        .with_context(|| format!("Failed to read {}", path.display()))?;
+    toml::from_str(&contents).with_context(|| format!("Failed to parse {}", path.display()))
 }

 /// Return `true` if a `pyproject.toml` contains a `[tool.ruff]` section.
@@ -65,20 +65,21 @@ pub fn ruff_enabled<P: AsRef<Path>>(path: P) -> Result<bool> {
 /// Return the path to the `pyproject.toml` or `ruff.toml` file in a given
 /// directory.
 pub fn settings_toml<P: AsRef<Path>>(path: P) -> Result<Option<PathBuf>> {
+    let path = path.as_ref();
     // Check for `.ruff.toml`.
-    let ruff_toml = path.as_ref().join(".ruff.toml");
+    let ruff_toml = path.join(".ruff.toml");
     if ruff_toml.is_file() {
         return Ok(Some(ruff_toml));
     }

     // Check for `ruff.toml`.
-    let ruff_toml = path.as_ref().join("ruff.toml");
+    let ruff_toml = path.join("ruff.toml");
     if ruff_toml.is_file() {
         return Ok(Some(ruff_toml));
     }

     // Check for `pyproject.toml`.
-    let pyproject_toml = path.as_ref().join("pyproject.toml");
+    let pyproject_toml = path.join("pyproject.toml");
     if pyproject_toml.is_file() && ruff_enabled(&pyproject_toml)? {
         return Ok(Some(pyproject_toml));
     }
@@ -97,6 +98,17 @@ pub fn find_settings_toml<P: AsRef<Path>>(path: P) -> Result<Option<PathBuf>> {
     Ok(None)
 }

+/// Derive target version from `required-version` in `pyproject.toml`, if
+/// such a file exists in an ancestor directory.
+pub fn find_fallback_target_version<P: AsRef<Path>>(path: P) -> Option<PythonVersion> {
+    for directory in path.as_ref().ancestors() {
+        if let Some(fallback) = get_fallback_target_version(directory) {
+            return Some(fallback);
+        }
+    }
+    None
+}
+
 /// Find the path to the user-specific `pyproject.toml` or `ruff.toml`, if it
 /// exists.
 #[cfg(not(target_arch = "wasm32"))]
@@ -141,9 +153,13 @@ pub fn find_user_settings_toml() -> Option<PathBuf> {
 }

 /// Load `Options` from a `pyproject.toml` or `ruff.toml` file.
-pub(super) fn load_options<P: AsRef<Path>>(path: P) -> Result<Options> {
-    if path.as_ref().ends_with("pyproject.toml") {
-        let pyproject = parse_pyproject_toml(&path)?;
+pub(super) fn load_options<P: AsRef<Path>>(
+    path: P,
+    version_strategy: &TargetVersionStrategy,
+) -> Result<Options> {
+    let path = path.as_ref();
+    if path.ends_with("pyproject.toml") {
+        let pyproject = parse_pyproject_toml(path)?;
         let mut ruff = pyproject
             .tool
             .and_then(|tool| tool.ruff)
@@ -157,16 +173,55 @@ pub(super) fn load_options<P: AsRef<Path>>(path: P) -> Result<Options> {
         }

         Ok(ruff)
     } else {
-        let ruff = parse_ruff_toml(path);
-        if let Ok(ruff) = &ruff {
+        let mut ruff = parse_ruff_toml(path);
+        if let Ok(ref mut ruff) = ruff {
             if ruff.target_version.is_none() {
-                debug!("`project.requires_python` in `pyproject.toml` will not be used to set `target_version` when using `ruff.toml`.");
+                debug!("No `target-version` found in `ruff.toml`");
+                match version_strategy {
+                    TargetVersionStrategy::UseDefault => {}
+                    TargetVersionStrategy::RequiresPythonFallback => {
+                        if let Some(dir) = path.parent() {
+                            let fallback = get_fallback_target_version(dir);
+                            if let Some(version) = fallback {
+                                debug!("Derived `target-version` from `requires-python` in `pyproject.toml`: {version:?}");
+                            } else {
+                                debug!("No `pyproject.toml` with `requires-python` in same directory; `target-version` unspecified");
+                            }
+                            ruff.target_version = fallback;
+                        }
+                    }
+                }
             }
         }
         ruff
     }
 }

+/// Extract `target-version` from `pyproject.toml` in the given directory
+/// if the file exists and has `requires-python`.
+fn get_fallback_target_version(dir: &Path) -> Option<PythonVersion> {
+    let pyproject_path = dir.join("pyproject.toml");
+    if !pyproject_path.exists() {
+        return None;
+    }
+    let parsed_pyproject = parse_pyproject_toml(&pyproject_path);
+    let pyproject = match parsed_pyproject {
+        Ok(pyproject) => pyproject,
+        Err(err) => {
+            debug!("Failed to find fallback `target-version` due to: {}", err);
+            return None;
+        }
+    };
+    if let Some(project) = pyproject.project {
+        if let Some(requires_python) = project.requires_python {
+            return get_minimum_supported_version(&requires_python);
+        }
+    }
+    None
+}
+
 /// Infer the minimum supported [`PythonVersion`] from a `requires-python` specifier.
 fn get_minimum_supported_version(requires_version: &VersionSpecifiers) -> Option<PythonVersion> {
     /// Truncate a version to its major and minor components.
@@ -199,6 +254,15 @@ fn get_minimum_supported_version(requires_version: &VersionSpecifiers) -> Option
     PythonVersion::iter().find(|version| Version::from(*version) == minimum_version)
 }

+/// Strategy for handling missing `target-version` in configuration.
+#[derive(Debug)]
+pub(super) enum TargetVersionStrategy {
+    /// Use default `target-version`
+    UseDefault,
+    /// Derive from `requires-python` if available
+    RequiresPythonFallback,
+}
+
 #[cfg(test)]
 mod tests {
     use std::fs;
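The `get_minimum_supported_version` helper in the diff above truncates the lower bound of a `requires-python` specifier to its major and minor components. A rough self-contained sketch of that inference follows, operating on plain strings instead of `VersionSpecifiers`; the function name `minimum_python_version` and the `>=`/`==` parsing are illustrative simplifications, not ruff's actual implementation.

```rust
// Hypothetical sketch: derive a minimum (major, minor) Python version from a
// `requires-python` specifier such as ">=3.9" or ">=3.10, <4". Only `>=` and
// `==` clauses are considered here; ruff uses a full version-specifier parser.

/// Return the (major, minor) lower bound implied by the specifier, if any.
fn minimum_python_version(requires_python: &str) -> Option<(u8, u8)> {
    requires_python
        .split(',')
        .filter_map(|clause| {
            let clause = clause.trim();
            // Only lower-bound-like clauses contribute a minimum version.
            let version = clause
                .strip_prefix(">=")
                .or_else(|| clause.strip_prefix("=="))?;
            let mut parts = version.trim().split('.');
            let major: u8 = parts.next()?.parse().ok()?;
            let minor: u8 = parts.next()?.parse().ok()?;
            Some((major, minor))
        })
        .min()
}

fn main() {
    assert_eq!(minimum_python_version(">=3.9"), Some((3, 9)));
    assert_eq!(minimum_python_version(">=3.10, <4"), Some((3, 10)));
    // An upper bound alone gives no minimum version to fall back to.
    assert_eq!(minimum_python_version("<4.0"), None);
}
```

This mirrors why the change only ever *lowers ambiguity* for discovered configurations: when no usable lower bound exists, the fallback stays `None` and the default `target-version` applies.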


@@ -23,7 +23,7 @@ use ruff_linter::package::PackageRoot;
 use ruff_linter::packaging::is_package;

 use crate::configuration::Configuration;
-use crate::pyproject::settings_toml;
+use crate::pyproject::{settings_toml, TargetVersionStrategy};
 use crate::settings::Settings;
 use crate::{pyproject, FileResolverSettings};
@@ -301,9 +301,10 @@ pub trait ConfigurationTransformer {
 // resolving the "default" configuration).
 pub fn resolve_configuration(
     pyproject: &Path,
-    relativity: Relativity,
     transformer: &dyn ConfigurationTransformer,
+    origin: ConfigurationOrigin,
 ) -> Result<Configuration> {
+    let relativity = Relativity::from(origin);
     let mut configurations = indexmap::IndexMap::new();
     let mut next = Some(fs::normalize_path(pyproject));
     while let Some(path) = next {
@@ -319,7 +320,19 @@ pub fn resolve_configuration(
         }

         // Resolve the current path.
-        let options = pyproject::load_options(&path).with_context(|| {
+        let version_strategy =
+            if configurations.is_empty() && matches!(origin, ConfigurationOrigin::Ancestor) {
+                // For configurations that are discovered by
+                // walking back from a file, we will attempt to
+                // infer the `target-version` if it is missing
+                TargetVersionStrategy::RequiresPythonFallback
+            } else {
+                // In all other cases (e.g. for configurations
+                // inherited via `extend`, or user-level settings)
+                // we do not attempt to infer a missing `target-version`
+                TargetVersionStrategy::UseDefault
+            };
+        let options = pyproject::load_options(&path, &version_strategy).with_context(|| {
             if configurations.is_empty() {
                 format!(
                     "Failed to load configuration `{path}`",
@@ -368,10 +381,12 @@ pub fn resolve_configuration(
 /// `pyproject.toml`.
 fn resolve_scoped_settings<'a>(
     pyproject: &'a Path,
-    relativity: Relativity,
     transformer: &dyn ConfigurationTransformer,
+    origin: ConfigurationOrigin,
 ) -> Result<(&'a Path, Settings)> {
-    let configuration = resolve_configuration(pyproject, relativity, transformer)?;
+    let relativity = Relativity::from(origin);
+    let configuration = resolve_configuration(pyproject, transformer, origin)?;
     let project_root = relativity.resolve(pyproject);
     let settings = configuration.into_settings(project_root)?;
     Ok((project_root, settings))
@@ -381,13 +396,37 @@ fn resolve_scoped_settings<'a>(
 /// configuration with the given [`ConfigurationTransformer`].
 pub fn resolve_root_settings(
     pyproject: &Path,
-    relativity: Relativity,
     transformer: &dyn ConfigurationTransformer,
+    origin: ConfigurationOrigin,
 ) -> Result<Settings> {
-    let (_project_root, settings) = resolve_scoped_settings(pyproject, relativity, transformer)?;
+    let (_project_root, settings) = resolve_scoped_settings(pyproject, transformer, origin)?;
     Ok(settings)
 }

+#[derive(Debug, Clone, Copy)]
+/// How the configuration is provided.
+pub enum ConfigurationOrigin {
+    /// Origin is unknown to the caller
+    Unknown,
+    /// User specified path to specific configuration file
+    UserSpecified,
+    /// User-level configuration (e.g. in `~/.config/ruff/pyproject.toml`)
+    UserSettings,
+    /// In parent or higher ancestor directory of path
+    Ancestor,
+}
+
+impl From<ConfigurationOrigin> for Relativity {
+    fn from(value: ConfigurationOrigin) -> Self {
+        match value {
+            ConfigurationOrigin::Unknown => Self::Parent,
+            ConfigurationOrigin::UserSpecified => Self::Cwd,
+            ConfigurationOrigin::UserSettings => Self::Cwd,
+            ConfigurationOrigin::Ancestor => Self::Parent,
+        }
+    }
+}
+
 /// Find all Python (`.py`, `.pyi` and `.ipynb` files) in a set of paths.
 pub fn python_files_in_path<'a>(
     paths: &[PathBuf],
@@ -411,8 +450,11 @@ pub fn python_files_in_path<'a>(
         for ancestor in path.ancestors() {
             if seen.insert(ancestor) {
                 if let Some(pyproject) = settings_toml(ancestor)? {
-                    let (root, settings) =
-                        resolve_scoped_settings(&pyproject, Relativity::Parent, transformer)?;
+                    let (root, settings) = resolve_scoped_settings(
+                        &pyproject,
+                        transformer,
+                        ConfigurationOrigin::Ancestor,
+                    )?;
                     resolver.add(root, settings);
                     // We found the closest configuration.
                     break;
@@ -564,8 +606,8 @@ impl ParallelVisitor for PythonFilesVisitor<'_, '_> {
             match settings_toml(entry.path()) {
                 Ok(Some(pyproject)) => match resolve_scoped_settings(
                     &pyproject,
-                    Relativity::Parent,
                     self.transformer,
+                    ConfigurationOrigin::Ancestor,
                 ) {
                     Ok((root, settings)) => {
                         self.global.resolver.write().unwrap().add(root, settings);
@@ -699,7 +741,7 @@ pub fn python_file_at_path(
     for ancestor in path.ancestors() {
         if let Some(pyproject) = settings_toml(ancestor)? {
             let (root, settings) =
-                resolve_scoped_settings(&pyproject, Relativity::Parent, transformer)?;
+                resolve_scoped_settings(&pyproject, transformer, ConfigurationOrigin::Unknown)?;
             resolver.add(root, settings);
             break;
         }
@@ -883,7 +925,7 @@ mod tests {
     use crate::pyproject::find_settings_toml;
     use crate::resolver::{
         is_file_excluded, match_exclusion, python_files_in_path, resolve_root_settings,
-        ConfigurationTransformer, PyprojectConfig, PyprojectDiscoveryStrategy, Relativity,
+        ConfigurationOrigin, ConfigurationTransformer, PyprojectConfig, PyprojectDiscoveryStrategy,
         ResolvedFile, Resolver,
     };
     use crate::settings::Settings;
@@ -904,8 +946,8 @@ mod tests {
             PyprojectDiscoveryStrategy::Hierarchical,
             resolve_root_settings(
                 &find_settings_toml(&package_root)?.unwrap(),
-                Relativity::Parent,
                 &NoOpTransformer,
+                ConfigurationOrigin::Ancestor,
             )?,
             None,
         );


@@ -80,7 +80,7 @@ You can add the following configuration to `.gitlab-ci.yml` to run a `ruff forma
   stage: build
   interruptible: true
   image:
-    name: ghcr.io/astral-sh/ruff:0.10.0-alpine
+    name: ghcr.io/astral-sh/ruff:0.11.0-alpine
   before_script:
     - cd $CI_PROJECT_DIR
     - ruff --version
@@ -106,7 +106,7 @@ Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-c
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.10.0
+  rev: v0.11.0
   hooks:
     # Run the linter.
     - id: ruff
@@ -119,7 +119,7 @@ To enable lint fixes, add the `--fix` argument to the lint hook:
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.10.0
+  rev: v0.11.0
   hooks:
     # Run the linter.
     - id: ruff
@@ -133,7 +133,7 @@ To avoid running on Jupyter Notebooks, remove `jupyter` from the list of allowed
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.10.0
+  rev: v0.11.0
   hooks:
     # Run the linter.
     - id: ruff


@@ -365,7 +365,7 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.10.0
+  rev: v0.11.0
   hooks:
     # Run the linter.
     - id: ruff


@@ -4,7 +4,7 @@ build-backend = "maturin"

 [project]
 name = "ruff"
-version = "0.10.0"
+version = "0.11.0"
 description = "An extremely fast Python linter and code formatter, written in Rust."
 authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
 readme = "README.md"


@@ -1,6 +1,6 @@
 [project]
 name = "scripts"
-version = "0.10.0"
+version = "0.11.0"
 description = ""
 authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]