Compare commits

..

18 Commits

Author SHA1 Message Date
David Peter
0d159b2f38 [ty] Provide completions at the end of the file 2025-10-17 22:33:45 +02:00
Alex Waygood
8ca2b5555d Dogfood ty on py-fuzzer in CI (#20946) 2025-10-17 20:30:17 +01:00
David Peter
6d2cf3475f Only add the actual schema in schemastore PRs (#20947)
Same as https://github.com/astral-sh/ty/pull/1391:

> Last time I ran this script, due to what I assume was a `npm` version
mismatch, the `package-lock.json` file was updated while running `npm
install` in the `schemastore`. Due to the use of `git commit -a`, it was
accidentally included in the commit for the semi-automated schemastore
PR. The solution here is to only add the actual file that we want to
commit.
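In shell terms, the fix amounts to the following (file names and messages are illustrative, not the actual release script):

```shell
set -eu
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email "ci@example.com"
git config user.name "ci"

# Simulate the failure mode: running `npm install` dirties
# package-lock.json alongside the schema we actually want to update.
echo '{}' > ty.schema.json
echo 'lock-v1' > package-lock.json
git add -A
git commit -qm "initial"
echo '{"updated": true}' > ty.schema.json
echo 'lock-v2' > package-lock.json   # unwanted side-effect change

# Before: `git commit -a -m ...` would sweep in package-lock.json too.
# After: stage only the file we mean to commit.
git add ty.schema.json
git commit -qm "Update ty schema"

# Only the schema lands in the new commit; package-lock.json stays dirty.
git diff-tree --no-commit-id --name-only -r HEAD
```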
2025-10-17 21:14:04 +02:00
Shunsuke Shibayama
e4384fc212 [ty] impl VarianceInferable for KnownInstanceType (#20924)
## Summary

Derived from #20900

Implement `VarianceInferable` for `KnownInstanceType` (especially for
`KnownInstanceType::TypeAliasType`).

The variance of a type alias matches its value type. In normal usage,
type aliases are expanded to value types, so the variance of a type
alias can be obtained without implementing this. However, for example,
if we want to display the variance when hovering over a type alias, we
need to be able to obtain the variance of the type alias itself (cf.
#20900).
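As a rough illustration of the behavior being modeled (the aliases below are hypothetical, not from this PR): an alias carries the variance of its value type, so an alias of covariant `Iterable` permits element-type narrowing where an alias of invariant `list` would not.

```python
from typing import Iterable, List

# Hypothetical aliases; under this model, each alias has the variance
# of its value type (Iterable: covariant, list: invariant).
ReadOnly = Iterable   # ReadOnly[T] is covariant in T
Mutable = List        # Mutable[T] is invariant in T

def first(xs: ReadOnly[object]) -> object:
    # Passing a Mutable[int] here is fine: covariance lets the element
    # type narrow from object to int.
    return next(iter(xs))

ints: Mutable[int] = [1, 2, 3]
print(first(ints))

# The invariant alias does not allow the same narrowing; a type checker
# would flag this hypothetical call:
# def clear(xs: Mutable[object]) -> None: ...
# clear(ints)  # error: Mutable[int] is not assignable to Mutable[object]
```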

## Test Plan

I couldn't come up with a way to test this in mdtest, so I'm testing it
in a test submodule at the end of `types.rs`.
I also added a test to `mdtest/generics/pep695/variance.md`, but it
passes without the changes in this PR.
2025-10-17 21:12:19 +02:00
David Peter
6e7ff07065 [ty] Provide completions on TypeVars (#20943)
## Summary

closes https://github.com/astral-sh/ty/issues/1370

## Test Plan

New snapshot tests
2025-10-17 20:05:20 +02:00
Alex Waygood
c7e2bfd759 [ty] continue and break statements outside loops are syntax errors (#20944)
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-10-17 17:13:40 +00:00
Alex Waygood
c424007645 Update usage instructions and lockfile for py-fuzzer script (#20940) 2025-10-17 15:57:17 +01:00
Brent Westbrook
0115fd3757 Avoid reusing nested, interpolated quotes before Python 3.12 (#20930)
## Summary

Fixes #20774 by tracking whether an `InterpolatedStringState` element is
nested inside of another interpolated element. This feels like kind of a
naive fix, so I'm open to other ideas. But it resolves the problem in
the issue and clears up the syntax error in the black compatibility
test, without affecting many other cases.

The other affected case is actually interesting too because the
[input](96b156303b/crates/ruff_python_formatter/resources/test/fixtures/ruff/expression/fstring.py (L707))
is invalid, but the previous quote selection fixed the invalid syntax:

```pycon
Python 3.11.13 (main, Sep  2 2025, 14:20:25) [Clang 20.1.4 ] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> f'{1: abcd "{'aa'}" }'  # input
  File "<stdin>", line 1
    f'{1: abcd "{'aa'}" }'
                  ^^
SyntaxError: f-string: expecting '}'
>>> f'{1: abcd "{"aa"}" }'  # old output
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: Invalid format specifier ' abcd "aa" ' for object of type 'int'
>>> f'{1: abcd "{'aa'}" }'  # new output
  File "<stdin>", line 1
    f'{1: abcd "{'aa'}" }'
                  ^^
SyntaxError: f-string: expecting '}'
```

We now preserve the invalid syntax in the input.

Unfortunately, this also seems to be another edge case I didn't consider
in https://github.com/astral-sh/ruff/pull/20867 because we don't flag
this as a syntax error as of 0.14.1:

<details><summary>Shell output</summary>
<p>

```
> uvx ruff@0.14.0 check --ignore ALL --target-version py311 - <<EOF
f'{1: abcd "{'aa'}" }'
EOF
invalid-syntax: Cannot reuse outer quote character in f-strings on Python 3.11 (syntax was added in Python 3.12)
 --> -:1:14
  |
1 | f'{1: abcd "{'aa'}" }'
  |              ^
  |

Found 1 error.
> uvx ruff@0.14.1 check --ignore ALL --target-version py311 - <<EOF
f'{1: abcd "{'aa'}" }'
EOF
All checks passed!
> uvx python@3.11 -m ast <<EOF
f'{1: abcd "{'aa'}" }'
EOF
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/brent/.local/share/uv/python/cpython-3.11.13-linux-x86_64-gnu/lib/python3.11/ast.py", line 1752, in <module>
    main()
  File "/home/brent/.local/share/uv/python/cpython-3.11.13-linux-x86_64-gnu/lib/python3.11/ast.py", line 1748, in main
    tree = parse(source, args.infile.name, args.mode, type_comments=args.no_type_comments)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/brent/.local/share/uv/python/cpython-3.11.13-linux-x86_64-gnu/lib/python3.11/ast.py", line 50, in parse
    return compile(source, filename, mode, flags,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<stdin>", line 1
    f'{1: abcd "{'aa'}" }'
                  ^^
SyntaxError: f-string: expecting '}'
```

</p>
</details> 


I assumed that was the same `ParseError` as the one caused by
`f"{1:""}"`, but this is a nested interpolation inside of the format
spec.

## Test Plan

New test copied from the black compatibility test. I guess this is a
duplicate now; I started working on this branch before the new black
tests were imported, so I could delete the separate test in our fixtures
if that's preferable.
2025-10-17 08:49:16 -04:00
David Peter
cfbd42c22a [ty] Support dataclass_transform for base class models (#20783)
## Summary

Support `dataclass_transform` when used on a (base) class.

## Typing conformance

* The changes in `dataclasses_transform_class.py` look good, just a few
mistakes due to missing `alias` support.
* I didn't look closely at the changes in
`dataclasses_transform_converter.py` since we don't support `converter`
yet.

## Ecosystem impact

The impact looks huge, but it's concentrated on a single project (ibis).
Their setup looks more or less like this:

* the real `Annotatable`:
d7083c2c96/ibis/common/grounds.py (L100-L101)
* the real `DataType`:
d7083c2c96/ibis/expr/datatypes/core.py (L161-L179)
* the real `Array`:
d7083c2c96/ibis/expr/datatypes/core.py (L1003-L1006)


```py
from typing import dataclass_transform

@dataclass_transform()
class Annotatable:
    pass

class DataType(Annotatable):
    nullable: bool = True

class Array[T](DataType):
    value_type: T
```

They expect something like `Array([1, 2])` to work, but ty, pyright,
mypy, and pyrefly would all expect there to be a first argument for the
`nullable` field on `DataType`. I don't really understand on what
grounds they expect the `nullable` field to be excluded from the
signature, but this seems to be the main reason for the new diagnostics
here. Not sure if related, but it looks like their typing setup is not
really complete
(https://github.com/ibis-project/ibis/issues/6844#issuecomment-1868274770,
this thread also mentions `dataclass_transform`).

## Test Plan

Update pre-existing tests.
2025-10-17 14:04:31 +02:00
Mark Z. Ding
fc3b341529 [ty] Truncate Literal type display in some situations (#20928) 2025-10-17 11:50:58 +00:00
Alex Waygood
baaa8dad3a [ty] Re-enable fuzzer seeds that are no longer slow (#20937) 2025-10-17 12:29:13 +01:00
Micha Reiser
a21cde8a5a [ty] Fix playground crash for very large files (#20934) 2025-10-17 09:15:33 +02:00
Aria Desires
64edfb6ef6 [ty] add legacy namespace package support (#20897)
Detect legacy namespace packages and treat them like namespace packages
when looking them up as the *parent* of the module we're interested in.
In all other cases treat them like a regular package.
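For context, a legacy namespace package is a regular package whose `__init__.py` extends its own `__path__`, typically via `pkgutil.extend_path`. A runnable sketch (the package name `ns` is illustrative):

```python
import os
import sys
import tempfile

# Two sys.path roots that each contain a portion of the package `ns`.
root_a = tempfile.mkdtemp()
root_b = tempfile.mkdtemp()
os.makedirs(os.path.join(root_a, "ns"))
os.makedirs(os.path.join(root_b, "ns"))

# root_a/ns is a *regular* package, but its __init__.py turns it into a
# legacy namespace package by extending __path__.
with open(os.path.join(root_a, "ns", "__init__.py"), "w") as f:
    f.write("__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n")

# root_b/ns carries a submodule but no __init__.py of its own.
with open(os.path.join(root_b, "ns", "part_b.py"), "w") as f:
    f.write("VALUE = 42\n")

sys.path[:0] = [root_a, root_b]
from ns import part_b  # resolved through the extended __path__

print(part_b.VALUE)
```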

(This PR is coauthored by @MichaReiser in a shared coding session)

Fixes https://github.com/astral-sh/ty/issues/838

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-10-17 03:16:37 +00:00
Ibraheem Ahmed
96b156303b [ty] Prefer declared type for invariant collection literals (#20927)
## Summary

Prefer the declared type for collection literals, e.g.,
```py
x: list[Any] = [1, "2", (3,)]
reveal_type(x)  # list[Any]
```

This solves a large part of https://github.com/astral-sh/ty/issues/136
for invariant generics, where respecting the declared type is a lot more
important. It also means that annotating dict literals with `dict[_,
Any]` is a way out of https://github.com/astral-sh/ty/issues/1248.
2025-10-16 16:11:28 -04:00
Douglas Creager
b0e10a9777 [ty] Don't track inferability via different Type variants (#20677)
We have to track whether a typevar appears in a position where it's
inferable or not. In a non-inferable position (in the body of the
generic class or function that binds it), assignability must hold for
every possible specialization of the typevar. In an inferable position,
it only needs to hold for _some_ specialization.
https://github.com/astral-sh/ruff/pull/20093 is working on using
constraint sets to model assignability of typevars, and the constraint
sets that we produce will be the same for inferable vs non-inferable
typevars; what changes is what we _compare_ that constraint set to. (For
a non-inferable typevar, the constraint set must equal the set of valid
specializations; for an inferable typevar, it must not be `never`.)
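A minimal sketch of the inferable vs. non-inferable distinction (illustrative only, not ty's implementation):

```python
from typing import TypeVar

T = TypeVar("T")

def identity(x: T) -> T:
    # Non-inferable position: inside the body, assignability must hold
    # for *every* specialization of T, so a checker rejects this:
    # y: int = x  # error: T is not assignable to int for all T
    return x

# Inferable position: the call site fixes T = int, so the assignment
# only has to hold for *some* (here, the inferred) specialization.
n: int = identity(42)
print(n)
```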

When I first added support for tracking inferable vs non-inferable
typevars, it seemed like it would be easiest to have separate `Type`
variants for each. The alternative (which lines up with the Δ set in
[POPL15](https://doi.org/10.1145/2676726.2676991)) would be to
explicitly plumb through a list of inferable typevars through our type
property methods. That seemed cumbersome.

In retrospect, that was the wrong decision. We've had to jump through
hoops to translate types between the inferable and non-inferable
variants, which has been quite brittle. Combined with the original point
above, that much of the assignability logic will become more identical
between inferable and non-inferable, there is less justification for the
two `Type` variants. And plumbing an extra `inferable` parameter through
all of these methods turns out to not be as bad as I anticipated.

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-10-16 15:59:46 -04:00
Ibraheem Ahmed
25023cc0ea [ty] Use declared variable types as bidirectional type context (#20796)
## Summary

Use the declared type of variables as type context for the RHS of assignment expressions, e.g.,
```py
x: list[int | str]
x = [1]
reveal_type(x)  # revealed: list[int | str]
```
2025-10-16 15:40:39 -04:00
Ibraheem Ahmed
1ade4f2081 [ty] Avoid unnecessarily widening generic specializations (#20875)
## Summary

Ignore the type context when specializing a generic call if it leads to
an unnecessarily wide return type. For example, [the example mentioned
here](https://github.com/astral-sh/ruff/pull/20796#issuecomment-3403319536)
works as expected after this change:
```py
def id[T](x: T) -> T:
    return x

def _(i: int):
    x: int | None = id(i)
    y: int | None = i
    reveal_type(x)  # revealed: int
    reveal_type(y)  # revealed: int
```

I also extended our usage of `filter_disjoint_elements` to tuple
and typed-dict inference, which resolves
https://github.com/astral-sh/ty/issues/1266.
2025-10-16 19:17:37 +00:00
David Peter
8dad58de37 [ty] Support dataclass-transform field_specifiers (#20888)
## Summary

Add support for the `field_specifiers` parameter on
`dataclass_transform` decorator calls.

closes https://github.com/astral-sh/ty/issues/1068
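For reference, the shape of a function-based `field_specifiers` setup looks roughly like the following (mirroring the style of the mdtest examples in this PR; the `fancy_field`/`model` names are illustrative, and the `ImportError` fallback is only a runtime shim for Pythons before 3.11):

```python
from typing import Any, Callable, Optional, TypeVar

_T = TypeVar("_T")

try:
    from typing import dataclass_transform  # Python 3.11+
except ImportError:
    # Runtime no-op stand-in; the real decorator is pure metadata for
    # type checkers anyway.
    def dataclass_transform(**_kwargs: Any) -> Callable[[_T], _T]:
        return lambda obj: obj

def fancy_field(*, init: bool = True, kw_only: bool = False) -> Any:
    """Illustrative field specifier; a checker reads its keyword args."""
    return None

@dataclass_transform(field_specifiers=(fancy_field,))
def model(cls: type) -> type:
    # A real transformer would synthesize __init__ at runtime; for the
    # checker, the decorator metadata above is what matters.
    return cls

@model
class Person:
    id: int = fancy_field(init=False)               # excluded from __init__
    name: str = fancy_field()                       # regular parameter
    age: Optional[int] = fancy_field(kw_only=True)  # keyword-only
```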

## Conformance test results

All true positives ✔️ 

## Ecosystem analysis

* `trio`: this is the kind of change that I would expect from this PR.
The code makes use of a dataclass `Outcome` with a `_unwrapped: bool =
attr.ib(default=False, eq=False, init=False)` field that is excluded
from the `__init__` signature, so we now see a bunch of
constructor-call-related errors going away.
* `home-assistant/core`: They have a `domain: str = attr.ib(init=False,
repr=False)` field and then use
  ```py
    @domain.default
    def _domain_default(self) -> str:
        # …
  ```
This accesses the `default` attribute on `dataclasses.Field[…]` with a
type of `default: _T | Literal[_MISSING_TYPE.MISSING]`, so we get those
"Object of type `_MISSING_TYPE` is not callable" errors. I don't really
understand how that is supposed to work. Even if `_MISSING_TYPE` were
absent from that union, what does this try to call? pyright also
issues an error and it doesn't seem to work at runtime? So this looks
like a true positive?
* `attrs`: Similar here. There are some new diagnostics on code that
tries to access `.validator` on a field. This *does* work at runtime,
but I'm not sure how that is supposed to type-check (without a [custom
plugin](2c6c395935/mypy/plugins/attrs.py (L575-L602))).
pyright errors on this as well.
* A handful of new false positives because we don't support `alias` yet

## Test Plan

Updated tests.
2025-10-16 20:49:11 +02:00
47 changed files with 2273 additions and 894 deletions

View File

@@ -277,7 +277,8 @@ jobs:
run: cargo test -p ty_python_semantic --test mdtest || true
- name: "Run tests"
run: cargo insta test --all-features --unreferenced reject --test-runner nextest
# Dogfood ty on py-fuzzer
- run: uv run --project=./python/py-fuzzer cargo run -p ty check --project=./python/py-fuzzer
# Check for broken links in the documentation.
- run: cargo doc --all --no-deps
env:
@@ -498,9 +499,10 @@ jobs:
chmod +x "${DOWNLOAD_PATH}/ruff"
(
uvx \
uv run \
--python="${PYTHON_VERSION}" \
--from=./python/py-fuzzer \
--project=./python/py-fuzzer \
--locked \
fuzz \
--test-executable="${DOWNLOAD_PATH}/ruff" \
--bin=ruff \
@@ -518,6 +520,7 @@ jobs:
with:
persist-credentials: false
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: "Install Rust toolchain"
run: rustup component add rustfmt
# Run all code generation scripts, and verify that the current output is
@@ -532,6 +535,11 @@ jobs:
./scripts/add_plugin.py test --url https://pypi.org/project/-test/0.1.0/ --prefix TST
./scripts/add_rule.py --name FirstRule --prefix TST --code 001 --linter test
- run: cargo check
# Lint/format/type-check py-fuzzer
# (dogfooding with ty is done in a separate job)
- run: uv run --directory=./python/py-fuzzer mypy
- run: uv run --directory=./python/py-fuzzer ruff format --check
- run: uv run --directory=./python/py-fuzzer ruff check
ecosystem:
name: "ecosystem"
@@ -666,7 +674,7 @@ jobs:
- determine_changes
# Only runs on pull requests, since that is the only way we can find the base version for comparison.
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && github.event_name == 'pull_request' && (needs.determine_changes.outputs.ty == 'true' || needs.determine_changes.outputs.py-fuzzer == 'true') }}
timeout-minutes: 20
timeout-minutes: 5
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
@@ -694,9 +702,10 @@ jobs:
chmod +x "${PWD}/ty" "${NEW_TY}/ty"
(
uvx \
uv run \
--python="${PYTHON_VERSION}" \
--from=./python/py-fuzzer \
--project=./python/py-fuzzer \
--locked \
fuzz \
--test-executable="${NEW_TY}/ty" \
--baseline-executable="${PWD}/ty" \

View File

@@ -48,9 +48,10 @@ jobs:
run: |
# shellcheck disable=SC2046
(
uvx \
--python=3.12 \
--from=./python/py-fuzzer \
uv run \
--python=3.13 \
--project=./python/py-fuzzer \
--locked \
fuzz \
--test-executable=target/debug/ruff \
--bin=ruff \

View File

@@ -748,3 +748,7 @@ print(f"{ # Tuple with multiple elements that doesn't fit on a single line gets
# Regression tests for https://github.com/astral-sh/ruff/issues/15536
print(f"{ {}, 1, }")
# The inner quotes should not be changed to double quotes before Python 3.12
f"{f'''{'nested'} inner'''} outer"

View File

@@ -144,6 +144,12 @@ pub(crate) enum InterpolatedStringState {
///
/// The containing `FStringContext` is the surrounding f-string context.
InsideInterpolatedElement(InterpolatedStringContext),
/// The formatter is inside more than one nested f-string, such as in `nested` in:
///
/// ```py
/// f"{f'''{'nested'} inner'''} outer"
/// ```
NestedInterpolatedElement(InterpolatedStringContext),
/// The formatter is outside an f-string.
#[default]
Outside,
@@ -152,12 +158,18 @@ pub(crate) enum InterpolatedStringState {
impl InterpolatedStringState {
pub(crate) fn can_contain_line_breaks(self) -> Option<bool> {
match self {
InterpolatedStringState::InsideInterpolatedElement(context) => {
InterpolatedStringState::InsideInterpolatedElement(context)
| InterpolatedStringState::NestedInterpolatedElement(context) => {
Some(context.is_multiline())
}
InterpolatedStringState::Outside => None,
}
}
/// Returns `true` if the interpolated string state is [`NestedInterpolatedElement`].
pub(crate) fn is_nested(self) -> bool {
matches!(self, Self::NestedInterpolatedElement(..))
}
}
/// The position of a top-level statement in the module.

View File

@@ -181,10 +181,16 @@ impl Format<PyFormatContext<'_>> for FormatInterpolatedElement<'_> {
let item = format_with(|f: &mut PyFormatter| {
// Update the context to be inside the f-string expression element.
let f = &mut WithInterpolatedStringState::new(
InterpolatedStringState::InsideInterpolatedElement(self.context),
f,
);
let state = match f.context().interpolated_string_state() {
InterpolatedStringState::InsideInterpolatedElement(_)
| InterpolatedStringState::NestedInterpolatedElement(_) => {
InterpolatedStringState::NestedInterpolatedElement(self.context)
}
InterpolatedStringState::Outside => {
InterpolatedStringState::InsideInterpolatedElement(self.context)
}
};
let f = &mut WithInterpolatedStringState::new(state, f);
write!(f, [bracket_spacing, expression.format()])?;

View File

@@ -46,8 +46,15 @@ impl<'a, 'src> StringNormalizer<'a, 'src> {
.unwrap_or(self.context.options().quote_style());
let supports_pep_701 = self.context.options().target_version().supports_pep_701();
// Preserve the existing quote style for nested interpolations more than one layer deep, if
// PEP 701 isn't supported.
if !supports_pep_701 && self.context.interpolated_string_state().is_nested() {
return QuoteStyle::Preserve;
}
// For f-strings and t-strings, prefer alternating the quotes unless the outer string is triple-quoted and the inner isn't.
if let InterpolatedStringState::InsideInterpolatedElement(parent_context) =
if let InterpolatedStringState::InsideInterpolatedElement(parent_context)
| InterpolatedStringState::NestedInterpolatedElement(parent_context) =
self.context.interpolated_string_state()
{
let parent_flags = parent_context.flags();

View File

@@ -28,12 +28,11 @@ but none started with prefix {parentdir_prefix}"
f'{{NOT \'a\' "formatted" "value"}}'
f"some f-string with {a} {few():.2f} {formatted.values!r}"
-f'some f-string with {a} {few(""):.2f} {formatted.values!r}'
-f"{f'''{'nested'} inner'''} outer"
+f"some f-string with {a} {few(''):.2f} {formatted.values!r}"
f"{f'''{'nested'} inner'''} outer"
-f"\"{f'{nested} inner'}\" outer"
-f"space between opening braces: { {a for a in (1, 2, 3)}}"
-f'Hello \'{tricky + "example"}\''
+f"some f-string with {a} {few(''):.2f} {formatted.values!r}"
+f"{f'''{"nested"} inner'''} outer"
+f'"{f"{nested} inner"}" outer'
+f"space between opening braces: { {a for a in (1, 2, 3)} }"
+f"Hello '{tricky + 'example'}'"
@@ -49,7 +48,7 @@ f"{{NOT a formatted value}}"
f'{{NOT \'a\' "formatted" "value"}}'
f"some f-string with {a} {few():.2f} {formatted.values!r}"
f"some f-string with {a} {few(''):.2f} {formatted.values!r}"
f"{f'''{"nested"} inner'''} outer"
f"{f'''{'nested'} inner'''} outer"
f'"{f"{nested} inner"}" outer'
f"space between opening braces: { {a for a in (1, 2, 3)} }"
f"Hello '{tricky + 'example'}'"
@@ -72,17 +71,3 @@ f'Hello \'{tricky + "example"}\''
f"Tried directories {str(rootdirs)} \
but none started with prefix {parentdir_prefix}"
```
## New Unsupported Syntax Errors
error[invalid-syntax]: Cannot reuse outer quote character in f-strings on Python 3.10 (syntax was added in Python 3.12)
--> fstring.py:6:9
|
4 | f"some f-string with {a} {few():.2f} {formatted.values!r}"
5 | f"some f-string with {a} {few(''):.2f} {formatted.values!r}"
6 | f"{f'''{"nested"} inner'''} outer"
| ^
7 | f'"{f"{nested} inner"}" outer'
8 | f"space between opening braces: { {a for a in (1, 2, 3)} }"
|
warning: Only accept new syntax errors if they are also present in the input. The formatter should not introduce syntax errors.

View File

@@ -754,6 +754,10 @@ print(f"{ # Tuple with multiple elements that doesn't fit on a single line gets
# Regression tests for https://github.com/astral-sh/ruff/issues/15536
print(f"{ {}, 1, }")
# The inner quotes should not be changed to double quotes before Python 3.12
f"{f'''{'nested'} inner'''} outer"
```
## Outputs
@@ -1532,7 +1536,7 @@ f'{f"""other " """}'
f'{1: hy "user"}'
f'{1:hy "user"}'
f'{1: abcd "{1}" }'
f'{1: abcd "{"aa"}" }'
f'{1: abcd "{'aa'}" }'
f'{1=: "abcd {'aa'}}'
f"{x:a{z:hy \"user\"}} '''"
@@ -1581,6 +1585,10 @@ print(
# Regression tests for https://github.com/astral-sh/ruff/issues/15536
print(f"{ {}, 1 }")
# The inner quotes should not be changed to double quotes before Python 3.12
f"{f'''{'nested'} inner'''} outer"
```
@@ -2359,7 +2367,7 @@ f'{f"""other " """}'
f'{1: hy "user"}'
f'{1:hy "user"}'
f'{1: abcd "{1}" }'
f'{1: abcd "{"aa"}" }'
f'{1: abcd "{'aa'}" }'
f'{1=: "abcd {'aa'}}'
f"{x:a{z:hy \"user\"}} '''"
@@ -2408,6 +2416,10 @@ print(
# Regression tests for https://github.com/astral-sh/ruff/issues/15536
print(f"{ {}, 1 }")
# The inner quotes should not be changed to double quotes before Python 3.12
f"{f'''{'nested'} inner'''} outer"
```

View File

@@ -62,7 +62,7 @@ To run the fuzzer, execute the following command
(requires [`uv`](https://github.com/astral-sh/uv) to be installed):
```sh
uvx --from ./python/py-fuzzer fuzz
uv run --project=./python/py-fuzzer fuzz
```
Refer to the [py-fuzzer](https://github.com/astral-sh/ruff/blob/main/python/py-fuzzer/fuzz.py)

View File

@@ -131,7 +131,7 @@ impl<'db> Completion<'db> {
| Type::BytesLiteral(_) => CompletionKind::Value,
Type::EnumLiteral(_) => CompletionKind::Enum,
Type::ProtocolInstance(_) => CompletionKind::Interface,
Type::NonInferableTypeVar(_) | Type::TypeVar(_) => CompletionKind::TypeParameter,
Type::TypeVar(_) => CompletionKind::TypeParameter,
Type::Union(union) => union
.elements(db)
.iter()
@@ -409,7 +409,7 @@ impl<'t> CompletionTargetTokens<'t> {
TokenAt::Single(tok) => tok.end(),
TokenAt::Between(_, tok) => tok.start(),
};
let before = tokens_start_before(parsed.tokens(), offset);
let before = strip_eof_newline(tokens_start_before(parsed.tokens(), offset));
Some(
// Our strategy when it comes to `object.attribute` here is
// to look for the `.` and then take the token immediately
@@ -627,6 +627,19 @@ fn tokens_start_before(tokens: &Tokens, offset: TextSize) -> &[Token] {
&tokens[..idx]
}
/// Strip the trailing `Newline` token that the parser automatically adds at the
/// end of the file.
fn strip_eof_newline(tokens: &[Token]) -> &[Token] {
if tokens
.last()
.is_some_and(|t| t.kind() == TokenKind::Newline)
{
&tokens[..tokens.len() - 1]
} else {
tokens
}
}
/// Returns a suffix of `tokens` corresponding to the `kinds` given.
///
/// If a suffix of `tokens` with the given `kinds` could not be found,
@@ -800,7 +813,7 @@ fn find_typed_text(
offset: TextSize,
) -> Option<String> {
let source = source_text(db, file);
let tokens = tokens_start_before(parsed.tokens(), offset);
let tokens = strip_eof_newline(tokens_start_before(parsed.tokens(), offset));
let last = tokens.last()?;
if !matches!(last.kind(), TokenKind::Name) {
return None;
@@ -3896,6 +3909,70 @@ print(t'''{Foo} and Foo.zqzq<CURSOR>
assert_snapshot!(test.completions_without_builtins(), @"<No completions found>");
}
#[test]
fn typevar_with_upper_bound() {
let test = cursor_test(
"\
def f[T: str](msg: T):
msg.<CURSOR>
",
);
test.assert_completions_include("upper");
test.assert_completions_include("capitalize");
}
#[test]
fn typevar_with_constraints() {
// Test TypeVar with constraints
let test = cursor_test(
"\
from typing import TypeVar
class A:
only_on_a: int
on_a_and_b: str
class B:
only_on_b: float
on_a_and_b: str
T = TypeVar('T', A, B)
def f(x: T):
x.<CURSOR>
",
);
test.assert_completions_include("on_a_and_b");
test.assert_completions_do_not_include("only_on_a");
test.assert_completions_do_not_include("only_on_b");
}
#[test]
fn typevar_without_bounds_or_constraints() {
let test = cursor_test(
"\
def f[T](x: T):
x.<CURSOR>
",
);
test.assert_completions_include("__repr__");
}
#[test]
fn attribute_at_eof_without_trailing_newline() {
// Regression test for https://github.com/astral-sh/ty/issues/1392
// This test ensures completions work when the cursor is at the end
// of the file with no trailing newline.
let test = cursor_test("def f(msg: str):\n msg.<CURSOR>");
test.assert_completions_include("upper");
test.assert_completions_include("capitalize");
let test = cursor_test("def f(msg: str):\n msg.u<CURSOR>");
test.assert_completions_include("upper");
test.assert_completions_do_not_include("capitalize");
}
// NOTE: The methods below are getting somewhat ridiculous.
// We should refactor this by converting to using a builder
// to set different modes. ---AG

View File

@@ -336,9 +336,7 @@ impl<'db> SemanticTokenVisitor<'db> {
match ty {
Type::ClassLiteral(_) => (SemanticTokenType::Class, modifiers),
Type::NonInferableTypeVar(_) | Type::TypeVar(_) => {
(SemanticTokenType::TypeParameter, modifiers)
}
Type::TypeVar(_) => (SemanticTokenType::TypeParameter, modifiers),
Type::FunctionLiteral(_) => {
// Check if this is a method based on current scope
if self.in_class_scope {

View File

@@ -114,7 +114,7 @@ h: list[list[int]] = [[], [42]]
reveal_type(h) # revealed: list[list[int]]
i: list[typing.Any] = [1, 2, "3", ([4],)]
reveal_type(i) # revealed: list[Any | int | str | tuple[list[Unknown | int]]]
reveal_type(i) # revealed: list[Any]
j: list[tuple[str | int, ...]] = [(1, 2), ("foo", "bar"), ()]
reveal_type(j) # revealed: list[tuple[str | int, ...]]
@@ -123,7 +123,7 @@ k: list[tuple[list[int], ...]] = [([],), ([1, 2], [3, 4]), ([5], [6], [7])]
reveal_type(k) # revealed: list[tuple[list[int], ...]]
l: tuple[list[int], *tuple[list[typing.Any], ...], list[str]] = ([1, 2, 3], [4, 5, 6], [7, 8, 9], ["10", "11", "12"])
reveal_type(l) # revealed: tuple[list[int], list[Any | int], list[Any | int], list[str]]
reveal_type(l) # revealed: tuple[list[int], list[Any], list[Any], list[str]]
type IntList = list[int]
@@ -144,6 +144,12 @@ reveal_type(q) # revealed: dict[int | str, int]
r: dict[int | str, int | str] = {1: 1, 2: 2, 3: 3}
reveal_type(r) # revealed: dict[int | str, int | str]
s: dict[int | str, int | str]
s = {1: 1, 2: 2, 3: 3}
reveal_type(s) # revealed: dict[int | str, int | str]
(s := {1: 1, 2: 2, 3: 3})
reveal_type(s) # revealed: dict[int | str, int | str]
```
## Optional collection literal annotations are understood
@@ -181,7 +187,7 @@ h: list[list[int]] | None = [[], [42]]
reveal_type(h) # revealed: list[list[int]]
i: list[typing.Any] | None = [1, 2, "3", ([4],)]
reveal_type(i) # revealed: list[Any | int | str | tuple[list[Unknown | int]]]
reveal_type(i) # revealed: list[Any]
j: list[tuple[str | int, ...]] | None = [(1, 2), ("foo", "bar"), ()]
reveal_type(j) # revealed: list[tuple[str | int, ...]]
@@ -190,8 +196,7 @@ k: list[tuple[list[int], ...]] | None = [([],), ([1, 2], [3, 4]), ([5], [6], [7]
reveal_type(k) # revealed: list[tuple[list[int], ...]]
l: tuple[list[int], *tuple[list[typing.Any], ...], list[str]] | None = ([1, 2, 3], [4, 5, 6], [7, 8, 9], ["10", "11", "12"])
# TODO: this should be `tuple[list[int], list[Any | int], list[Any | int], list[str]]`
reveal_type(l) # revealed: tuple[list[Unknown | int], list[Unknown | int], list[Unknown | int], list[Unknown | str]]
reveal_type(l) # revealed: tuple[list[int], list[Any], list[Any], list[str]]
type IntList = list[int]
@@ -277,7 +282,7 @@ reveal_type(k) # revealed: list[Literal[1, 2, 3]]
type Y[T] = list[T]
l: Y[Y[Literal[1]]] = [[1]]
reveal_type(l) # revealed: list[list[Literal[1]]]
reveal_type(l) # revealed: list[Y[Literal[1]]]
m: list[tuple[Literal[1], Literal[2], Literal[3]]] = [(1, 2, 3)]
reveal_type(m) # revealed: list[tuple[Literal[1], Literal[2], Literal[3]]]
@@ -297,6 +302,12 @@ reveal_type(q) # revealed: list[int]
r: list[Literal[1, 2, 3, 4]] = [1, 2]
reveal_type(r) # revealed: list[Literal[1, 2, 3, 4]]
s: list[Literal[1]]
s = [1]
reveal_type(s) # revealed: list[Literal[1]]
(s := [1])
reveal_type(s) # revealed: list[Literal[1]]
```
## PEP-604 annotations are supported
@@ -416,13 +427,14 @@ a = f("a")
reveal_type(a) # revealed: list[Literal["a"]]
b: list[int | Literal["a"]] = f("a")
reveal_type(b) # revealed: list[int | Literal["a"]]
reveal_type(b) # revealed: list[Literal["a"] | int]
c: list[int | str] = f("a")
reveal_type(c) # revealed: list[int | str]
reveal_type(c) # revealed: list[str | int]
d: list[int | tuple[int, int]] = f((1, 2))
reveal_type(d) # revealed: list[int | tuple[int, int]]
# TODO: We could avoid reordering the union elements here.
reveal_type(d) # revealed: list[tuple[int, int] | int]
e: list[int] = f(True)
reveal_type(e) # revealed: list[int]
@@ -437,8 +449,49 @@ def f2[T: int](x: T) -> T:
return x
i: int = f2(True)
reveal_type(i) # revealed: int
reveal_type(i) # revealed: Literal[True]
j: int | str = f2(True)
reveal_type(j) # revealed: Literal[True]
```
Types are not widened unnecessarily:
```py
def id[T](x: T) -> T:
return x
def lst[T](x: T) -> list[T]:
return [x]
def _(i: int):
a: int | None = i
b: int | None = id(i)
c: int | str | None = id(i)
reveal_type(a) # revealed: int
reveal_type(b) # revealed: int
reveal_type(c) # revealed: int
a: list[int | None] | None = [i]
b: list[int | None] | None = id([i])
c: list[int | None] | int | None = id([i])
reveal_type(a) # revealed: list[int | None]
# TODO: these should reveal `list[int | None]`
# we currently do not use the call expression annotation as type context for argument inference
reveal_type(b) # revealed: list[Unknown | int]
reveal_type(c) # revealed: list[Unknown | int]
a: list[int | None] | None = [i]
b: list[int | None] | None = lst(i)
c: list[int | None] | int | None = lst(i)
reveal_type(a) # revealed: list[int | None]
reveal_type(b) # revealed: list[int | None]
reveal_type(c) # revealed: list[int | None]
a: list | None = []
b: list | None = id([])
c: list | int | None = id([])
reveal_type(a) # revealed: list[Unknown]
reveal_type(b) # revealed: list[Unknown]
reveal_type(c) # revealed: list[Unknown]
```

View File

@@ -122,9 +122,6 @@ class CustomerModel(ModelBase):
id: int
name: str
# TODO: this is not supported yet
# error: [unknown-argument]
# error: [unknown-argument]
CustomerModel(id=1, name="Test")
```
@@ -216,11 +213,7 @@ class OrderedModelBase: ...
class TestWithBase(OrderedModelBase):
inner: int
# TODO: No errors here, should reveal `bool`
# error: [too-many-positional-arguments]
# error: [too-many-positional-arguments]
# error: [unsupported-operator]
reveal_type(TestWithBase(1) < TestWithBase(2)) # revealed: Unknown
reveal_type(TestWithBase(1) < TestWithBase(2)) # revealed: bool
```
### `kw_only_default`
@@ -277,8 +270,7 @@ class ModelBase: ...
class TestBase(ModelBase):
name: str
# TODO: This should be `(self: TestBase, *, name: str) -> None`
reveal_type(TestBase.__init__) # revealed: def __init__(self) -> None
reveal_type(TestBase.__init__) # revealed: (self: TestBase, *, name: str) -> None
```
### `frozen_default`
@@ -333,12 +325,9 @@ class ModelBase: ...
class TestMeta(ModelBase):
name: str
# TODO: no error here
# error: [unknown-argument]
t = TestMeta(name="test")
# TODO: this should be an `invalid-assignment` error
t.name = "new"
t.name = "new" # error: [invalid-assignment]
```
### Combining parameters
@@ -437,19 +426,15 @@ class DefaultFrozenModel:
class Frozen(DefaultFrozenModel):
name: str
# TODO: no error here
# error: [unknown-argument]
f = Frozen(name="test")
# TODO: this should be an `invalid-assignment` error
f.name = "new"
f.name = "new" # error: [invalid-assignment]
class Mutable(DefaultFrozenModel, frozen=False):
name: str
# TODO: no error here
# error: [unknown-argument]
m = Mutable(name="test")
m.name = "new" # No error
# TODO: This should not be an error
m.name = "new" # error: [invalid-assignment]
```
## `field_specifiers`
@@ -461,7 +446,7 @@ The [`typing.dataclass_transform`] specification also allows classes (such as `d
to be listed in `field_specifiers`, but it is currently unclear how this should work, and other type
checkers do not seem to support this either.
### Basic example
### For function-based transformers
```py
from typing_extensions import dataclass_transform, Any
@@ -478,11 +463,8 @@ class Person:
name: str = fancy_field()
age: int | None = fancy_field(kw_only=True)
# TODO: Should be `(self: Person, name: str, *, age: int | None) -> None`
reveal_type(Person.__init__) # revealed: (self: Person, id: int = Any, name: str = Any, age: int | None = Any) -> None
reveal_type(Person.__init__) # revealed: (self: Person, name: str, *, age: int | None) -> None
# TODO: No error here
# error: [invalid-argument-type]
alice = Person("Alice", age=30)
reveal_type(alice.id) # revealed: int
@@ -490,6 +472,141 @@ reveal_type(alice.name) # revealed: str
reveal_type(alice.age) # revealed: int | None
```
### For metaclass-based transformers
```py
from typing_extensions import dataclass_transform, Any
def fancy_field(*, init: bool = True, kw_only: bool = False) -> Any: ...
@dataclass_transform(field_specifiers=(fancy_field,))
class FancyMeta(type):
def __new__(cls, name, bases, namespace):
...
return super().__new__(cls, name, bases, namespace)
class FancyBase(metaclass=FancyMeta): ...
class Person(FancyBase):
id: int = fancy_field(init=False)
name: str = fancy_field()
age: int | None = fancy_field(kw_only=True)
reveal_type(Person.__init__) # revealed: (self: Person, name: str, *, age: int | None) -> None
alice = Person("Alice", age=30)
reveal_type(alice.id) # revealed: int
reveal_type(alice.name) # revealed: str
reveal_type(alice.age) # revealed: int | None
```
### For base-class-based transformers
```py
from typing_extensions import dataclass_transform, Any
def fancy_field(*, init: bool = True, kw_only: bool = False) -> Any: ...
@dataclass_transform(field_specifiers=(fancy_field,))
class FancyBase:
def __init_subclass__(cls):
...
super().__init_subclass__()
class Person(FancyBase):
id: int = fancy_field(init=False)
name: str = fancy_field()
age: int | None = fancy_field(kw_only=True)
reveal_type(Person.__init__) # revealed: (self: Person, name: str, *, age: int | None) -> None
alice = Person("Alice", age=30)
reveal_type(alice.id) # revealed: int
reveal_type(alice.name) # revealed: str
reveal_type(alice.age) # revealed: int | None
```
### With default arguments
Field specifiers can have default arguments that should be respected:
```py
from typing_extensions import dataclass_transform, Any
def fancy_field(*, init: bool = False) -> Any: ...
@dataclass_transform(field_specifiers=(fancy_field,))
def fancy_model[T](cls: type[T]) -> type[T]:
...
return cls
@fancy_model
class Person:
id: int = fancy_field()
name: str = fancy_field(init=True)
reveal_type(Person.__init__) # revealed: (self: Person, name: str) -> None
Person(name="Alice")
```
### With overloaded field specifiers
```py
from typing_extensions import dataclass_transform, overload, Any
@overload
def fancy_field(*, init: bool = True) -> Any: ...
@overload
def fancy_field(*, kw_only: bool = False) -> Any: ...
def fancy_field(*, init: bool = True, kw_only: bool = False) -> Any: ...
@dataclass_transform(field_specifiers=(fancy_field,))
def fancy_model[T](cls: type[T]) -> type[T]:
...
return cls
@fancy_model
class Person:
id: int = fancy_field(init=False)
name: str = fancy_field()
age: int | None = fancy_field(kw_only=True)
reveal_type(Person.__init__) # revealed: (self: Person, name: str, *, age: int | None) -> None
```
### Nested dataclass-transformers
Make sure that models are only affected by the field specifiers of their own transformer:
```py
from typing_extensions import dataclass_transform, Any
from dataclasses import field
def outer_field(*, init: bool = True, kw_only: bool = False) -> Any: ...
@dataclass_transform(field_specifiers=(outer_field,))
def outer_model[T](cls: type[T]) -> type[T]:
# ...
return cls
def inner_field(*, init: bool = True, kw_only: bool = False) -> Any: ...
@dataclass_transform(field_specifiers=(inner_field,))
def inner_model[T](cls: type[T]) -> type[T]:
# ...
return cls
@outer_model
class Outer:
@inner_model
class Inner:
inner_a: int = inner_field(init=False)
inner_b: str = outer_field(init=False)
outer_a: int = outer_field(init=False)
outer_b: str = inner_field(init=False)
reveal_type(Outer.__init__) # revealed: (self: Outer, outer_b: str = Any) -> None
reveal_type(Outer.Inner.__init__) # revealed: (self: Inner, inner_b: str = Any) -> None
```
## Overloaded dataclass-like decorators
In the case of an overloaded decorator, the `dataclass_transform` decorator can be applied to the


@@ -11,7 +11,7 @@ class Member:
role: str = field(default="user")
tag: str | None = field(default=None, init=False)
# revealed: (self: Member, name: str, role: str = str) -> None
# revealed: (self: Member, name: str, role: str = Literal["user"]) -> None
reveal_type(Member.__init__)
alice = Member(name="Alice", role="admin")
@@ -37,7 +37,7 @@ class Data:
content: list[int] = field(default_factory=list)
timestamp: datetime = field(default_factory=datetime.now, init=False)
# revealed: (self: Data, content: list[int] = list[int]) -> None
# revealed: (self: Data, content: list[int] = Unknown) -> None
reveal_type(Data.__init__)
data = Data([1, 2, 3])
@@ -64,7 +64,7 @@ class Person:
role: str = field(default="user", kw_only=True)
# TODO: this would ideally show a default value of `None` for `age`
# revealed: (self: Person, name: str, *, age: int | None = int | None, role: str = str) -> None
# revealed: (self: Person, name: str, *, age: int | None = None, role: str = Literal["user"]) -> None
reveal_type(Person.__init__)
alice = Person(role="admin", name="Alice")


@@ -354,3 +354,25 @@ def f():
x = 1
global x # error: [invalid-syntax] "name `x` is used prior to global declaration"
```
## `break` and `continue` outside a loop
<!-- snapshot-diagnostics -->
```py
break # error: [invalid-syntax]
continue # error: [invalid-syntax]
for x in range(42):
break # fine
continue # fine
def _():
break # error: [invalid-syntax]
continue # error: [invalid-syntax]
class Fine:
# this is invalid syntax despite it being in an eager-nested scope!
break # error: [invalid-syntax]
continue # error: [invalid-syntax]
```
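These diagnostics mirror CPython's own compile-time checks, which reject `break` and `continue` outside a loop before any code runs. A minimal sketch (the `compiles` helper is illustrative, not part of the test suite):

```python
import textwrap

def compiles(src: str) -> bool:
    """Return True if CPython accepts the snippet at compile time."""
    try:
        compile(textwrap.dedent(src), "<snippet>", "exec")
        return True
    except SyntaxError:
        return False

assert not compiles("break")                # `break` outside loop
assert not compiles("continue")             # `continue` outside loop
assert compiles("for x in range(42):\n    break")
assert not compiles("def f():\n    break")  # a function body starts a new scope
assert not compiles("class C:\n    break")  # so does a class body, despite eager nesting
```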


@@ -138,3 +138,27 @@ def _(n: int):
# error: [unknown-argument]
y = f("foo", name="bar", unknown="quux")
```
### Truncation for long unions and literals
This test demonstrates a call where the expected type is a large mixed union. The diagnostic must
therefore truncate the long expected union type to avoid overwhelming output.
```py
from typing import Literal, Union
class A: ...
class B: ...
class C: ...
class D: ...
class E: ...
class F: ...
def f1(x: Union[Literal[1, 2, 3, 4, 5, 6, 7, 8], A, B, C, D, E, F]) -> int:
return 0
def _(n: int):
x = n
# error: [invalid-argument-type]
f1(x)
```


@@ -790,6 +790,65 @@ static_assert(not is_assignable_to(C[B], C[A]))
static_assert(not is_assignable_to(C[A], C[B]))
```
## Type aliases
The variance of a type alias matches the variance of its value type (the RHS type).
```py
from ty_extensions import static_assert, is_subtype_of
from typing import Literal
class Covariant[T]:
def get(self) -> T:
raise ValueError
type CovariantLiteral1 = Covariant[Literal[1]]
type CovariantInt = Covariant[int]
type MyCovariant[T] = Covariant[T]
static_assert(is_subtype_of(CovariantLiteral1, CovariantInt))
static_assert(is_subtype_of(MyCovariant[Literal[1]], MyCovariant[int]))
class Contravariant[T]:
def set(self, value: T):
pass
type ContravariantLiteral1 = Contravariant[Literal[1]]
type ContravariantInt = Contravariant[int]
type MyContravariant[T] = Contravariant[T]
static_assert(is_subtype_of(ContravariantInt, ContravariantLiteral1))
static_assert(is_subtype_of(MyContravariant[int], MyContravariant[Literal[1]]))
class Invariant[T]:
def get(self) -> T:
raise ValueError
def set(self, value: T):
pass
type InvariantLiteral1 = Invariant[Literal[1]]
type InvariantInt = Invariant[int]
type MyInvariant[T] = Invariant[T]
static_assert(not is_subtype_of(InvariantInt, InvariantLiteral1))
static_assert(not is_subtype_of(InvariantLiteral1, InvariantInt))
static_assert(not is_subtype_of(MyInvariant[Literal[1]], MyInvariant[int]))
static_assert(not is_subtype_of(MyInvariant[int], MyInvariant[Literal[1]]))
class Bivariant[T]:
pass
type BivariantLiteral1 = Bivariant[Literal[1]]
type BivariantInt = Bivariant[int]
type MyBivariant[T] = Bivariant[T]
static_assert(is_subtype_of(BivariantInt, BivariantLiteral1))
static_assert(is_subtype_of(BivariantLiteral1, BivariantInt))
static_assert(is_subtype_of(MyBivariant[Literal[1]], MyBivariant[int]))
static_assert(is_subtype_of(MyBivariant[int], MyBivariant[Literal[1]]))
```
## Inheriting from generic classes with inferred variance
When inheriting from a generic class with our type variable substituted in, we count its occurrences


@@ -0,0 +1,221 @@
# Legacy namespace packages
## `__import__("pkgutil").extend_path`
```toml
[environment]
extra-paths = ["/airflow-core/src", "/providers/amazon/src/"]
```
`/airflow-core/src/airflow/__init__.py`:
```py
__path__ = __import__("pkgutil").extend_path(__path__, __name__)
__version__ = "3.2.0"
```
`/providers/amazon/src/airflow/__init__.py`:
```py
__path__ = __import__("pkgutil").extend_path(__path__, __name__)
```
`/providers/amazon/src/airflow/providers/__init__.py`:
```py
__path__ = __import__("pkgutil").extend_path(__path__, __name__)
```
`/providers/amazon/src/airflow/providers/amazon/__init__.py`:
```py
__version__ = "9.15.0"
```
`test.py`:
```py
from airflow import __version__ as airflow_version
from airflow.providers.amazon import __version__ as amazon_provider_version
reveal_type(airflow_version) # revealed: Literal["3.2.0"]
reveal_type(amazon_provider_version) # revealed: Literal["9.15.0"]
```
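At runtime, `extend_path` scans `sys.path` for further directories with the package's name and appends them to `__path__`, which is what merges the two `airflow` trees above into a single namespace. A self-contained sketch of that behavior, using a throwaway `ns` package in temporary directories (not the layout above):

```python
import importlib
import os
import sys
import tempfile

# Two independent roots, each shipping a copy of package `ns` that uses the
# legacy idiom, plus one distinct submodule per copy.
roots = []
for sub in ("a", "b"):
    root = tempfile.mkdtemp()
    pkg = os.path.join(root, "ns")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write('__path__ = __import__("pkgutil").extend_path(__path__, __name__)\n')
    with open(os.path.join(pkg, sub + ".py"), "w") as f:
        f.write(f"value = {sub!r}\n")
    roots.append(root)

sys.path[:0] = roots
importlib.invalidate_caches()

from ns.a import value as a_value  # resolved in the first root
from ns.b import value as b_value  # resolved in the second root, same namespace

assert (a_value, b_value) == ("a", "b")
```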
## `pkgutil.extend_path`
```toml
[environment]
extra-paths = ["/airflow-core/src", "/providers/amazon/src/"]
```
`/airflow-core/src/airflow/__init__.py`:
```py
import pkgutil
__path__ = pkgutil.extend_path(__path__, __name__)
__version__ = "3.2.0"
```
`/providers/amazon/src/airflow/__init__.py`:
```py
import pkgutil
__path__ = pkgutil.extend_path(__path__, __name__)
```
`/providers/amazon/src/airflow/providers/__init__.py`:
```py
import pkgutil
__path__ = pkgutil.extend_path(__path__, __name__)
```
`/providers/amazon/src/airflow/providers/amazon/__init__.py`:
```py
__version__ = "9.15.0"
```
`test.py`:
```py
from airflow import __version__ as airflow_version
from airflow.providers.amazon import __version__ as amazon_provider_version
reveal_type(airflow_version) # revealed: Literal["3.2.0"]
reveal_type(amazon_provider_version) # revealed: Literal["9.15.0"]
```
## `extend_path` with keyword arguments
```toml
[environment]
extra-paths = ["/airflow-core/src", "/providers/amazon/src/"]
```
`/airflow-core/src/airflow/__init__.py`:
```py
import pkgutil
__path__ = pkgutil.extend_path(name=__name__, path=__path__)
__version__ = "3.2.0"
```
`/providers/amazon/src/airflow/__init__.py`:
```py
import pkgutil
__path__ = pkgutil.extend_path(name=__name__, path=__path__)
```
`/providers/amazon/src/airflow/providers/__init__.py`:
```py
import pkgutil
__path__ = pkgutil.extend_path(name=__name__, path=__path__)
```
`/providers/amazon/src/airflow/providers/amazon/__init__.py`:
```py
__version__ = "9.15.0"
```
`test.py`:
```py
from airflow import __version__ as airflow_version
from airflow.providers.amazon import __version__ as amazon_provider_version
reveal_type(airflow_version) # revealed: Literal["3.2.0"]
reveal_type(amazon_provider_version) # revealed: Literal["9.15.0"]
```
## incorrect `__import__` arguments
```toml
[environment]
extra-paths = ["/airflow-core/src", "/providers/amazon/src/"]
```
`/airflow-core/src/airflow/__init__.py`:
```py
__path__ = __import__("not_pkgutil").extend_path(__path__, __name__)
__version__ = "3.2.0"
```
`/providers/amazon/src/airflow/__init__.py`:
```py
__path__ = __import__("not_pkgutil").extend_path(__path__, __name__)
```
`/providers/amazon/src/airflow/providers/__init__.py`:
```py
__path__ = __import__("not_pkgutil").extend_path(__path__, __name__)
```
`/providers/amazon/src/airflow/providers/amazon/__init__.py`:
```py
__version__ = "9.15.0"
```
`test.py`:
```py
from airflow.providers.amazon import __version__ as amazon_provider_version # error: [unresolved-import]
from airflow import __version__ as airflow_version
reveal_type(airflow_version) # revealed: Literal["3.2.0"]
```
## incorrect `extend_path` arguments
```toml
[environment]
extra-paths = ["/airflow-core/src", "/providers/amazon/src/"]
```
`/airflow-core/src/airflow/__init__.py`:
```py
__path__ = __import__("pkgutil").extend_path(__path__, "other_module")
__version__ = "3.2.0"
```
`/providers/amazon/src/airflow/__init__.py`:
```py
__path__ = __import__("pkgutil").extend_path(__path__, "other_module")
```
`/providers/amazon/src/airflow/providers/__init__.py`:
```py
__path__ = __import__("pkgutil").extend_path(__path__, "other_module")
```
`/providers/amazon/src/airflow/providers/amazon/__init__.py`:
```py
__version__ = "9.15.0"
```
`test.py`:
```py
from airflow.providers.amazon import __version__ as amazon_provider_version # error: [unresolved-import]
from airflow import __version__ as airflow_version
reveal_type(airflow_version) # revealed: Literal["3.2.0"]
```
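The detection these tests exercise is purely syntactic. A hypothetical Python stand-in for it (`is_legacy_namespace_init` is illustrative only; the real implementation lives in the Rust module resolver, and unlike this sketch it also validates the call arguments) could look like:

```python
import ast

def is_legacy_namespace_init(source: str) -> bool:
    """Detect `__path__ = pkgutil.extend_path(...)` or the `__import__` variant."""
    for stmt in ast.parse(source).body:
        # Only consider top-level assignments of the form `__path__ = <call>`.
        if not (isinstance(stmt, ast.Assign)
                and len(stmt.targets) == 1
                and isinstance(stmt.targets[0], ast.Name)
                and stmt.targets[0].id == "__path__"):
            continue
        call = stmt.value
        if not (isinstance(call, ast.Call)
                and isinstance(call.func, ast.Attribute)
                and call.func.attr == "extend_path"):
            continue
        base = call.func.value
        # `pkgutil.extend_path(...)`
        if isinstance(base, ast.Name) and base.id == "pkgutil":
            return True
        # `__import__("pkgutil").extend_path(...)`
        if (isinstance(base, ast.Call)
                and isinstance(base.func, ast.Name)
                and base.func.id == "__import__"
                and base.args
                and isinstance(base.args[0], ast.Constant)
                and base.args[0].value == "pkgutil"):
            return True
    return False

assert is_legacy_namespace_init(
    '__path__ = __import__("pkgutil").extend_path(__path__, __name__)'
)
assert not is_legacy_namespace_init('__version__ = "1.0"')
```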


@@ -310,13 +310,13 @@ no longer valid in the inner lazy scope.
def f(l: list[str | None]):
if l[0] is not None:
def _():
reveal_type(l[0]) # revealed: str | None | Unknown
reveal_type(l[0]) # revealed: str | None
l = [None]
def f(l: list[str | None]):
l[0] = "a"
def _():
reveal_type(l[0]) # revealed: str | None | Unknown
reveal_type(l[0]) # revealed: str | None
l = [None]
def f(l: list[str | None]):


@@ -0,0 +1,107 @@
---
source: crates/ty_test/src/lib.rs
expression: snapshot
---
---
mdtest name: semantic_syntax_errors.md - Semantic syntax error diagnostics - `break` and `continue` outside a loop
mdtest path: crates/ty_python_semantic/resources/mdtest/diagnostics/semantic_syntax_errors.md
---
# Python source files
## mdtest_snippet.py
```
1 | break # error: [invalid-syntax]
2 | continue # error: [invalid-syntax]
3 |
4 | for x in range(42):
5 | break # fine
6 | continue # fine
7 |
8 | def _():
9 | break # error: [invalid-syntax]
10 | continue # error: [invalid-syntax]
11 |
12 | class Fine:
13 | # this is invalid syntax despite it being in an eager-nested scope!
14 | break # error: [invalid-syntax]
15 | continue # error: [invalid-syntax]
```
# Diagnostics
```
error[invalid-syntax]: `break` outside loop
--> src/mdtest_snippet.py:1:1
|
1 | break # error: [invalid-syntax]
| ^^^^^
2 | continue # error: [invalid-syntax]
|
```
```
error[invalid-syntax]: `continue` outside loop
--> src/mdtest_snippet.py:2:1
|
1 | break # error: [invalid-syntax]
2 | continue # error: [invalid-syntax]
| ^^^^^^^^
3 |
4 | for x in range(42):
|
```
```
error[invalid-syntax]: `break` outside loop
--> src/mdtest_snippet.py:9:9
|
8 | def _():
9 | break # error: [invalid-syntax]
| ^^^^^
10 | continue # error: [invalid-syntax]
|
```
```
error[invalid-syntax]: `continue` outside loop
--> src/mdtest_snippet.py:10:9
|
8 | def _():
9 | break # error: [invalid-syntax]
10 | continue # error: [invalid-syntax]
| ^^^^^^^^
11 |
12 | class Fine:
|
```
```
error[invalid-syntax]: `break` outside loop
--> src/mdtest_snippet.py:14:9
|
12 | class Fine:
13 | # this is invalid syntax despite it being in an eager-nested scope!
14 | break # error: [invalid-syntax]
| ^^^^^
15 | continue # error: [invalid-syntax]
|
```
```
error[invalid-syntax]: `continue` outside loop
--> src/mdtest_snippet.py:15:9
|
13 | # this is invalid syntax despite it being in an eager-nested scope!
14 | break # error: [invalid-syntax]
15 | continue # error: [invalid-syntax]
| ^^^^^^^^
|
```


@@ -0,0 +1,56 @@
---
source: crates/ty_test/src/lib.rs
assertion_line: 427
expression: snapshot
---
---
mdtest name: union_call.md - Calling a union of function types - Try to cover all possible reasons - Truncation for long unions and literals
mdtest path: crates/ty_python_semantic/resources/mdtest/diagnostics/union_call.md
---
# Python source files
## mdtest_snippet.py
```
1 | from typing import Literal, Union
2 |
3 | class A: ...
4 | class B: ...
5 | class C: ...
6 | class D: ...
7 | class E: ...
8 | class F: ...
9 |
10 | def f1(x: Union[Literal[1, 2, 3, 4, 5, 6, 7, 8], A, B, C, D, E, F]) -> int:
11 | return 0
12 |
13 | def _(n: int):
14 | x = n
15 | # error: [invalid-argument-type]
16 | f1(x)
```
# Diagnostics
```
error[invalid-argument-type]: Argument to function `f1` is incorrect
--> src/mdtest_snippet.py:16:8
|
14 | x = n
15 | # error: [invalid-argument-type]
16 | f1(x)
| ^ Expected `Literal[1, 2, 3, 4, 5, ... omitted 3 literals] | A | B | ... omitted 4 union elements`, found `int`
|
info: Function defined here
--> src/mdtest_snippet.py:10:5
|
8 | class F: ...
9 |
10 | def f1(x: Union[Literal[1, 2, 3, 4, 5, 6, 7, 8], A, B, C, D, E, F]) -> int:
| ^^ ----------------------------------------------------------- Parameter declared here
11 | return 0
|
info: rule `invalid-argument-type` is enabled by default
```


@@ -233,10 +233,12 @@ Person({"name": "Alice"})
# error: [missing-typed-dict-key] "Missing required key 'age' in TypedDict `Person` constructor"
accepts_person({"name": "Alice"})
# TODO: this should be an error, similar to the above
house.owner = {"name": "Alice"}
a_person: Person
# TODO: this should be an error, similar to the above
# error: [missing-typed-dict-key] "Missing required key 'age' in TypedDict `Person` constructor"
a_person = {"name": "Alice"}
```
@@ -254,9 +256,12 @@ Person({"name": None, "age": 30})
accepts_person({"name": None, "age": 30})
# TODO: this should be an error, similar to the above
house.owner = {"name": None, "age": 30}
a_person: Person
# TODO: this should be an error, similar to the above
# error: [invalid-argument-type] "Invalid argument to key "name" with declared type `str` on TypedDict `Person`: value of type `None`"
a_person = {"name": None, "age": 30}
# error: [invalid-argument-type] "Invalid argument to key "name" with declared type `str` on TypedDict `Person`: value of type `None`"
(a_person := {"name": None, "age": 30})
```
All of these have an extra field that is not defined in the `TypedDict`:
@@ -273,9 +278,12 @@ Person({"name": "Alice", "age": 30, "extra": True})
accepts_person({"name": "Alice", "age": 30, "extra": True})
# TODO: this should be an error
house.owner = {"name": "Alice", "age": 30, "extra": True}
# TODO: this should be an error
a_person: Person
# error: [invalid-key] "Invalid key access on TypedDict `Person`: Unknown key "extra""
a_person = {"name": "Alice", "age": 30, "extra": True}
# error: [invalid-key] "Invalid key access on TypedDict `Person`: Unknown key "extra""
(a_person := {"name": "Alice", "age": 30, "extra": True})
```
## Type ignore compatibility issues
@@ -907,7 +915,7 @@ grandchild: Node = {"name": "grandchild", "parent": child}
nested: Node = {"name": "n1", "parent": {"name": "n2", "parent": {"name": "n3", "parent": None}}}
# TODO: this should be an error (invalid type for `name` in innermost node)
# error: [invalid-argument-type] "Invalid argument to key "name" with declared type `str` on TypedDict `Node`: value of type `Literal[3]`"
nested_invalid: Node = {"name": "n1", "parent": {"name": "n2", "parent": {"name": 3, "parent": None}}}
```


@@ -19,7 +19,10 @@ use rustc_hash::{FxBuildHasher, FxHashSet};
use ruff_db::files::{File, FilePath, FileRootKind};
use ruff_db::system::{DirectoryEntry, System, SystemPath, SystemPathBuf};
use ruff_db::vendored::VendoredFileSystem;
use ruff_python_ast::{PySourceType, PythonVersion};
use ruff_python_ast::{
self as ast, PySourceType, PythonVersion,
visitor::{Visitor, walk_body},
};
use crate::db::Db;
use crate::module_name::ModuleName;
@@ -1002,7 +1005,12 @@ where
let is_regular_package = package_path.is_regular_package(resolver_state);
if is_regular_package {
in_namespace_package = false;
// This is the only place where we need to consider the existence of legacy namespace
// packages, as we are explicitly searching for the *parent* package of the module
// we actually want. Here, such a package should be treated as a PEP-420 ("modern")
// namespace package. In all other contexts it acts like a normal package and needs
// no special handling.
in_namespace_package = is_legacy_namespace_package(&package_path, resolver_state);
} else if package_path.is_directory(resolver_state)
// Pure modules hide namespace packages with the same name
&& resolve_file_module(&package_path, resolver_state).is_none()
@@ -1039,6 +1047,62 @@ where
})
}
/// Determines whether a package is a legacy namespace package.
///
/// Before PEP 420 introduced implicit namespace packages, the ecosystem developed
/// its own form of namespace packages. These legacy namespace packages continue to persist
/// in modern codebases because they work with ancient Pythons and if it ain't broke, don't fix it.
///
/// A legacy namespace package is distinguished by having an `__init__.py` that contains an
/// expression to the effect of:
///
/// ```python
/// __path__ = __import__("pkgutil").extend_path(__path__, __name__)
/// ```
///
/// The resulting package simultaneously has properties of both regular packages and namespace ones:
///
/// * Like regular packages, `__init__.py` is defined and can contain items other than submodules
/// * Like implicit namespace packages, multiple copies of the package may exist with different
/// submodules, and they will be merged into one namespace at runtime by the interpreter
///
/// Now, you may rightly wonder: "What if the `__init__.py` files have different contents?"
/// The apparent official answer is: "Don't do that!"
/// And the reality is: "Of course people do that!"
///
/// In practice we think it's fine to use the first one we find on the search paths, just
/// like with regular packages. To the extent that the different copies "need" to have the same
/// contents, they all "need" to have the legacy namespace idiom (we do nothing to enforce that,
/// we will just get confused if you mess it up).
fn is_legacy_namespace_package(
package_path: &ModulePath,
resolver_state: &ResolverContext,
) -> bool {
// Just an optimization, the stdlib and typeshed are never legacy namespace packages
if package_path.search_path().is_standard_library() {
return false;
}
let mut package_path = package_path.clone();
package_path.push("__init__");
let Some(init) = resolve_file_module(&package_path, resolver_state) else {
return false;
};
// This is all syntax-only analysis so it *could* be fooled but it's really unlikely.
//
// The benefit of being syntax-only is speed and avoiding circular dependencies
// between module resolution and semantic analysis.
//
// The downside is if you write slightly different syntax we will fail to detect the idiom,
// but hey, this is better than nothing!
let parsed = ruff_db::parsed::parsed_module(resolver_state.db, init);
let mut visitor = LegacyNamespacePackageVisitor::default();
visitor.visit_body(parsed.load(resolver_state.db).suite());
visitor.is_legacy_namespace_package
}
#[derive(Debug)]
struct ResolvedPackage {
path: ModulePath,
@@ -1148,6 +1212,124 @@ impl fmt::Display for RelaxedModuleName {
}
}
/// Detects if a module contains a statement of the form:
/// ```python
/// __path__ = pkgutil.extend_path(__path__, __name__)
/// ```
/// or
/// ```python
/// __path__ = __import__("pkgutil").extend_path(__path__, __name__)
/// ```
#[derive(Default)]
struct LegacyNamespacePackageVisitor {
is_legacy_namespace_package: bool,
in_body: bool,
}
impl Visitor<'_> for LegacyNamespacePackageVisitor {
fn visit_body(&mut self, body: &[ruff_python_ast::Stmt]) {
if self.is_legacy_namespace_package {
return;
}
// Don't traverse into nested bodies.
if self.in_body {
return;
}
self.in_body = true;
walk_body(self, body);
}
fn visit_stmt(&mut self, stmt: &ast::Stmt) {
if self.is_legacy_namespace_package {
return;
}
let ast::Stmt::Assign(ast::StmtAssign { value, targets, .. }) = stmt else {
return;
};
let [ast::Expr::Name(maybe_path)] = &**targets else {
return;
};
if &*maybe_path.id != "__path__" {
return;
}
let ast::Expr::Call(ast::ExprCall {
func: extend_func,
arguments: extend_arguments,
..
}) = &**value
else {
return;
};
let ast::Expr::Attribute(ast::ExprAttribute {
value: maybe_pkg_util,
attr: maybe_extend_path,
..
}) = &**extend_func
else {
return;
};
// Match if the left side of the attribute access is either `__import__("pkgutil")` or `pkgutil`
match &**maybe_pkg_util {
// __import__("pkgutil").extend_path(__path__, __name__)
ast::Expr::Call(ruff_python_ast::ExprCall {
func: maybe_import,
arguments: import_arguments,
..
}) => {
let ast::Expr::Name(maybe_import) = &**maybe_import else {
return;
};
if maybe_import.id() != "__import__" {
return;
}
let Some(ast::Expr::StringLiteral(name)) =
import_arguments.find_argument_value("name", 0)
else {
return;
};
if name.value.to_str() != "pkgutil" {
return;
}
}
// "pkgutil.extend_path(__path__, __name__)"
ast::Expr::Name(name) => {
if name.id() != "pkgutil" {
return;
}
}
_ => {
return;
}
}
// Test that this is an `extend_path(__path__, __name__)` call
if maybe_extend_path != "extend_path" {
return;
}
let Some(ast::Expr::Name(path)) = extend_arguments.find_argument_value("path", 0) else {
return;
};
let Some(ast::Expr::Name(name)) = extend_arguments.find_argument_value("name", 1) else {
return;
};
self.is_legacy_namespace_package = path.id() == "__path__" && name.id() == "__name__";
}
}
#[cfg(test)]
mod tests {
#![expect(


@@ -2787,7 +2787,7 @@ impl SemanticSyntaxContext for SemanticIndexBuilder<'_, '_> {
}
fn in_loop_context(&self) -> bool {
true
self.current_scope_info().current_loop.is_some()
}
}

File diff suppressed because it is too large


@@ -343,7 +343,7 @@ impl<'db> BoundSuperType<'db> {
Type::TypeAlias(alias) => {
return delegate_with_error_mapped(alias.value_type(db), None);
}
Type::TypeVar(type_var) | Type::NonInferableTypeVar(type_var) => {
Type::TypeVar(type_var) => {
let type_var = type_var.typevar(db);
return match type_var.bound_or_constraints(db) {
Some(TypeVarBoundOrConstraints::UpperBound(bound)) => {


@@ -1062,8 +1062,7 @@ impl<'db> InnerIntersectionBuilder<'db> {
let mut positive_to_remove = SmallVec::<[usize; 1]>::new();
for (typevar_index, ty) in self.positive.iter().enumerate() {
let (Type::NonInferableTypeVar(bound_typevar) | Type::TypeVar(bound_typevar)) = ty
else {
let Type::TypeVar(bound_typevar) = ty else {
continue;
};
let Some(TypeVarBoundOrConstraints::Constraints(constraints)) =


@@ -24,7 +24,8 @@ use crate::types::diagnostic::{
};
use crate::types::enums::is_enum_class;
use crate::types::function::{
DataclassTransformerParams, FunctionDecorators, FunctionType, KnownFunction, OverloadLiteral,
DataclassTransformerFlags, DataclassTransformerParams, FunctionDecorators, FunctionType,
KnownFunction, OverloadLiteral,
};
use crate::types::generics::{
InferableTypeVars, Specialization, SpecializationBuilder, SpecializationError,
@@ -32,9 +33,9 @@ use crate::types::generics::{
use crate::types::signatures::{Parameter, ParameterForm, ParameterKind, Parameters};
use crate::types::tuple::{TupleLength, TupleType};
use crate::types::{
BoundMethodType, ClassLiteral, DataclassParams, FieldInstance, KnownBoundMethodType,
KnownClass, KnownInstanceType, MemberLookupPolicy, PropertyInstanceType, SpecialFormType,
TrackedConstraintSet, TypeAliasType, TypeContext, UnionBuilder, UnionType,
BoundMethodType, ClassLiteral, DataclassFlags, DataclassParams, FieldInstance,
KnownBoundMethodType, KnownClass, KnownInstanceType, MemberLookupPolicy, PropertyInstanceType,
SpecialFormType, TrackedConstraintSet, TypeAliasType, TypeContext, UnionBuilder, UnionType,
WrapperDescriptorKind, enums, ide_support, infer_isolated_expression, todo_type,
};
use ruff_db::diagnostic::{Annotation, Diagnostic, SubDiagnostic, SubDiagnosticSeverity};
@@ -135,6 +136,7 @@ impl<'db> Bindings<'db> {
db: &'db dyn Db,
argument_types: &CallArguments<'_, 'db>,
call_expression_tcx: &TypeContext<'db>,
dataclass_field_specifiers: &[Type<'db>],
) -> Result<Self, CallError<'db>> {
for element in &mut self.elements {
if let Some(mut updated_argument_forms) =
@@ -147,7 +149,7 @@ impl<'db> Bindings<'db> {
}
}
self.evaluate_known_cases(db);
self.evaluate_known_cases(db, dataclass_field_specifiers);
// In order of precedence:
//
@@ -269,7 +271,7 @@ impl<'db> Bindings<'db> {
/// Evaluates the return type of certain known callables, where we have special-case logic to
/// determine the return type in a way that isn't directly expressible in the type system.
fn evaluate_known_cases(&mut self, db: &'db dyn Db) {
fn evaluate_known_cases(&mut self, db: &'db dyn Db, dataclass_field_specifiers: &[Type<'db>]) {
let to_bool = |ty: &Option<Type<'_>>, default: bool| -> bool {
if let Some(Type::BooleanLiteral(value)) = ty {
*value
@@ -596,6 +598,70 @@ impl<'db> Bindings<'db> {
}
}
function @ Type::FunctionLiteral(function_type)
if dataclass_field_specifiers.contains(&function)
|| function_type.is_known(db, KnownFunction::Field) =>
{
let has_default_value = overload
.parameter_type_by_name("default", false)
.is_ok_and(|ty| ty.is_some())
|| overload
.parameter_type_by_name("default_factory", false)
.is_ok_and(|ty| ty.is_some())
|| overload
.parameter_type_by_name("factory", false)
.is_ok_and(|ty| ty.is_some());
let init = overload
.parameter_type_by_name("init", true)
.unwrap_or(None);
let kw_only = overload
.parameter_type_by_name("kw_only", true)
.unwrap_or(None);
// `dataclasses.field` and field-specifier functions of commonly used
// libraries like `pydantic`, `attrs`, and `SQLAlchemy` all return
// the default type for the field (or `Any`) instead of an actual `Field`
// instance, even if this is not what happens at runtime (see also below).
// We still make use of this fact and pretend that all field specifiers
// return the type of the default value:
let default_ty = if has_default_value {
Some(overload.return_ty)
} else {
None
};
let init = init
.map(|init| !init.bool(db).is_always_false())
.unwrap_or(true);
let kw_only = if Program::get(db).python_version(db) >= PythonVersion::PY310
{
match kw_only {
// We are more conservative here when turning the type for `kw_only`
// into a bool, because a field specifier in a stub might use
// `kw_only: bool = ...` and the truthiness of `...` is always true.
// This is different from `init` above because we may need to fall back
// to `kw_only_default`, whereas `init_default` does not exist.
Some(Type::BooleanLiteral(yes)) => Some(yes),
_ => None,
}
} else {
None
};
// `typeshed` pretends that `dataclasses.field()` returns the type of the
// default value directly. At runtime, however, this function returns an
// instance of `dataclasses.Field`. We also model it this way and return
// a known-instance type with information about the field. The drawback
// of this approach is that we need to pretend that instances of `Field`
// are assignable to `T` if the default type of the field is assignable
// to `T`. Otherwise, we would error on `name: str = field(default="")`.
overload.set_return_type(Type::KnownInstance(KnownInstanceType::Field(
FieldInstance::new(db, default_ty, init, kw_only),
)));
}
Type::FunctionLiteral(function_type) => match function_type.known(db) {
Some(KnownFunction::IsEquivalentTo) => {
if let [Some(ty_a), Some(ty_b)] = overload.parameter_types() {
@@ -871,43 +937,45 @@ impl<'db> Bindings<'db> {
weakref_slot,
] = overload.parameter_types()
{
let mut params = DataclassParams::empty();
let mut flags = DataclassFlags::empty();
if to_bool(init, true) {
params |= DataclassParams::INIT;
flags |= DataclassFlags::INIT;
}
if to_bool(repr, true) {
params |= DataclassParams::REPR;
flags |= DataclassFlags::REPR;
}
if to_bool(eq, true) {
params |= DataclassParams::EQ;
flags |= DataclassFlags::EQ;
}
if to_bool(order, false) {
params |= DataclassParams::ORDER;
flags |= DataclassFlags::ORDER;
}
if to_bool(unsafe_hash, false) {
params |= DataclassParams::UNSAFE_HASH;
flags |= DataclassFlags::UNSAFE_HASH;
}
if to_bool(frozen, false) {
params |= DataclassParams::FROZEN;
flags |= DataclassFlags::FROZEN;
}
if to_bool(match_args, true) {
params |= DataclassParams::MATCH_ARGS;
flags |= DataclassFlags::MATCH_ARGS;
}
if to_bool(kw_only, false) {
if Program::get(db).python_version(db) >= PythonVersion::PY310 {
params |= DataclassParams::KW_ONLY;
flags |= DataclassFlags::KW_ONLY;
} else {
// TODO: emit diagnostic
}
}
if to_bool(slots, false) {
params |= DataclassParams::SLOTS;
flags |= DataclassFlags::SLOTS;
}
if to_bool(weakref_slot, false) {
params |= DataclassParams::WEAKREF_SLOT;
flags |= DataclassFlags::WEAKREF_SLOT;
}
let params = DataclassParams::from_flags(db, flags);
overload.set_return_type(Type::DataclassDecorator(params));
}
@@ -915,7 +983,7 @@ impl<'db> Bindings<'db> {
if let [Some(Type::ClassLiteral(class_literal))] =
overload.parameter_types()
{
let params = DataclassParams::default();
let params = DataclassParams::default_params(db);
overload.set_return_type(Type::from(ClassLiteral::new(
db,
class_literal.name(db),
@@ -938,82 +1006,39 @@ impl<'db> Bindings<'db> {
_kwargs,
] = overload.parameter_types()
{
let mut params = DataclassTransformerParams::empty();
let mut flags = DataclassTransformerFlags::empty();
if to_bool(eq_default, true) {
params |= DataclassTransformerParams::EQ_DEFAULT;
flags |= DataclassTransformerFlags::EQ_DEFAULT;
}
if to_bool(order_default, false) {
params |= DataclassTransformerParams::ORDER_DEFAULT;
flags |= DataclassTransformerFlags::ORDER_DEFAULT;
}
if to_bool(kw_only_default, false) {
params |= DataclassTransformerParams::KW_ONLY_DEFAULT;
flags |= DataclassTransformerFlags::KW_ONLY_DEFAULT;
}
if to_bool(frozen_default, false) {
params |= DataclassTransformerParams::FROZEN_DEFAULT;
flags |= DataclassTransformerFlags::FROZEN_DEFAULT;
}
if let Some(field_specifiers_type) = field_specifiers {
// For now, we'll do a simple check: if field_specifiers is not
// None/empty, we assume it might contain dataclasses.field
// TODO: Implement proper parsing to check for
// dataclasses.field/Field specifically
if !field_specifiers_type.is_none(db) {
params |= DataclassTransformerParams::FIELD_SPECIFIERS;
}
}
let field_specifiers: Box<[Type<'db>]> = field_specifiers
.map(|tuple_type| {
tuple_type
.exact_tuple_instance_spec(db)
.iter()
.flat_map(|tuple_spec| tuple_spec.fixed_elements())
.copied()
.collect()
})
.unwrap_or_default();
let params =
DataclassTransformerParams::new(db, flags, field_specifiers);
overload.set_return_type(Type::DataclassTransformer(params));
}
}
Some(KnownFunction::Field) => {
let default =
overload.parameter_type_by_name("default").unwrap_or(None);
let default_factory = overload
.parameter_type_by_name("default_factory")
.unwrap_or(None);
let init = overload.parameter_type_by_name("init").unwrap_or(None);
let kw_only =
overload.parameter_type_by_name("kw_only").unwrap_or(None);
// `dataclasses.field` and field-specifier functions of commonly used
// libraries like `pydantic`, `attrs`, and `SQLAlchemy` all return
// the default type for the field (or `Any`) instead of an actual `Field`
// instance, even if this is not what happens at runtime (see also below).
// We still make use of this fact and pretend that all field specifiers
// return the type of the default value:
let default_ty = if default.is_some() || default_factory.is_some() {
Some(overload.return_ty)
} else {
None
};
let init = init
.map(|init| !init.bool(db).is_always_false())
.unwrap_or(true);
let kw_only =
if Program::get(db).python_version(db) >= PythonVersion::PY310 {
kw_only.map(|kw_only| !kw_only.bool(db).is_always_false())
} else {
None
};
// `typeshed` pretends that `dataclasses.field()` returns the type of the
// default value directly. At runtime, however, this function returns an
// instance of `dataclasses.Field`. We also model it this way and return
// a known-instance type with information about the field. The drawback
// of this approach is that we need to pretend that instances of `Field`
// are assignable to `T` if the default type of the field is assignable
// to `T`. Otherwise, we would error on `name: str = field(default="")`.
overload.set_return_type(Type::KnownInstance(
KnownInstanceType::Field(FieldInstance::new(
db, default_ty, init, kw_only,
)),
));
}
_ => {
// Ideally, either the implementation or exactly one of the overloads
// of the function can have the dataclass_transform decorator applied.
@@ -1030,36 +1055,41 @@ impl<'db> Bindings<'db> {
// the argument type and overwrite the corresponding flag in `dataclass_params` after
// constructing them from the `dataclass_transformer`-parameter defaults.
let mut dataclass_params =
DataclassParams::from(params);
let dataclass_params =
DataclassParams::from_transformer_params(
db, params,
);
let mut flags = dataclass_params.flags(db);
if let Ok(Some(Type::BooleanLiteral(order))) =
overload.parameter_type_by_name("order")
overload.parameter_type_by_name("order", false)
{
dataclass_params.set(DataclassParams::ORDER, order);
flags.set(DataclassFlags::ORDER, order);
}
if let Ok(Some(Type::BooleanLiteral(eq))) =
overload.parameter_type_by_name("eq")
overload.parameter_type_by_name("eq", false)
{
dataclass_params.set(DataclassParams::EQ, eq);
flags.set(DataclassFlags::EQ, eq);
}
if let Ok(Some(Type::BooleanLiteral(kw_only))) =
overload.parameter_type_by_name("kw_only")
overload.parameter_type_by_name("kw_only", false)
{
dataclass_params
.set(DataclassParams::KW_ONLY, kw_only);
flags.set(DataclassFlags::KW_ONLY, kw_only);
}
if let Ok(Some(Type::BooleanLiteral(frozen))) =
overload.parameter_type_by_name("frozen")
overload.parameter_type_by_name("frozen", false)
{
dataclass_params
.set(DataclassParams::FROZEN, frozen);
flags.set(DataclassFlags::FROZEN, frozen);
}
Type::DataclassDecorator(dataclass_params)
Type::DataclassDecorator(DataclassParams::new(
db,
flags,
dataclass_params.field_specifiers(db),
))
},
)
})
@@ -2494,6 +2524,7 @@ struct ArgumentTypeChecker<'a, 'db> {
argument_matches: &'a [MatchedArgument<'db>],
parameter_tys: &'a mut [Option<Type<'db>>],
call_expression_tcx: &'a TypeContext<'db>,
return_ty: Type<'db>,
errors: &'a mut Vec<BindingError<'db>>,
inferable_typevars: InferableTypeVars<'db, 'db>,
@@ -2501,6 +2532,7 @@ struct ArgumentTypeChecker<'a, 'db> {
}
impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
#[expect(clippy::too_many_arguments)]
fn new(
db: &'db dyn Db,
signature: &'a Signature<'db>,
@@ -2508,6 +2540,7 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
argument_matches: &'a [MatchedArgument<'db>],
parameter_tys: &'a mut [Option<Type<'db>>],
call_expression_tcx: &'a TypeContext<'db>,
return_ty: Type<'db>,
errors: &'a mut Vec<BindingError<'db>>,
) -> Self {
Self {
@@ -2517,6 +2550,7 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
argument_matches,
parameter_tys,
call_expression_tcx,
return_ty,
errors,
inferable_typevars: InferableTypeVars::None,
specialization: None,
@@ -2555,28 +2589,9 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
return;
};
// TODO: Use the list of inferable typevars from the generic context of the callable.
self.inferable_typevars = generic_context.inferable_typevars(self.db);
let mut builder = SpecializationBuilder::new(self.db, self.inferable_typevars);
// Note that we infer the annotated type _before_ the arguments if this call is part of
// an annotated assignment, to more closely match the order of any unions written in the type
// annotation.
if let Some(return_ty) = self.signature.return_ty
&& let Some(call_expression_tcx) = self.call_expression_tcx.annotation
{
match call_expression_tcx {
// A type variable is not a useful type-context for expression inference, and applying it
// to the return type can lead to confusing unions in nested generic calls.
Type::TypeVar(_) => {}
_ => {
// Ignore any specialization errors here, because the type context is only used as a hint
// to infer a more assignable return type.
let _ = builder.infer(return_ty, call_expression_tcx);
}
}
}
let parameters = self.signature.parameters();
for (argument_index, adjusted_argument_index, _, argument_type) in
self.enumerate_argument_types()
@@ -2601,7 +2616,41 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
}
}
self.specialization = Some(builder.build(generic_context, *self.call_expression_tcx));
// Build the specialization first without inferring the type context.
let isolated_specialization = builder.build(generic_context, *self.call_expression_tcx);
let isolated_return_ty = self
.return_ty
.apply_specialization(self.db, isolated_specialization);
let mut try_infer_tcx = || {
let return_ty = self.signature.return_ty?;
let call_expression_tcx = self.call_expression_tcx.annotation?;
// A type variable is not a useful type-context for expression inference, and applying it
// to the return type can lead to confusing unions in nested generic calls.
if call_expression_tcx.is_type_var() {
return None;
}
// If the return type is already assignable to the annotated type, we can ignore the
// type context and prefer the narrower inferred type.
if isolated_return_ty.is_assignable_to(self.db, call_expression_tcx) {
return None;
}
// TODO: Ideally we would infer the annotated type _before_ the arguments if this call is part of an
// annotated assignment, to more closely match the order of any unions written in the type annotation.
builder.infer(return_ty, call_expression_tcx).ok()?;
// Otherwise, build the specialization again after inferring the type context.
let specialization = builder.build(generic_context, *self.call_expression_tcx);
let return_ty = return_ty.apply_specialization(self.db, specialization);
Some((Some(specialization), return_ty))
};
(self.specialization, self.return_ty) =
try_infer_tcx().unwrap_or((Some(isolated_specialization), isolated_return_ty));
}
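The control flow above — build an isolated specialization from the arguments alone, keep it if the resulting return type is already assignable to the annotation, and only otherwise re-infer with the type context — can be modeled with a toy in which "types" are sets of value kinds and assignability is the subset relation. All names here (`solve_return_type` and its arguments) are hypothetical:

```python
def solve_return_type(isolated_return, annotation, infer_with_context):
    """Toy model: types are sets; `a <= b` (subset) plays the role of
    "a is assignable to b"."""
    if annotation is None:
        # No type context at all: use the argument-only inference.
        return isolated_return
    if isolated_return <= annotation:
        # Already assignable: prefer the narrower, argument-only result.
        return isolated_return
    # Otherwise, let the annotation participate in inference.
    return infer_with_context(annotation)

# Inferred {"int"} is already a subset of the context {"int", "None"}: keep it.
print(sorted(solve_return_type({"int"}, {"int", "None"}, lambda a: a)))
# Inferred {"str"} is not assignable: fall back to context-driven inference.
print(sorted(solve_return_type({"str"}, {"int", "None"}, lambda a: a)))
```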
fn check_argument_type(
@@ -2796,8 +2845,14 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
}
}
fn finish(self) -> (InferableTypeVars<'db, 'db>, Option<Specialization<'db>>) {
(self.inferable_typevars, self.specialization)
fn finish(
self,
) -> (
InferableTypeVars<'db, 'db>,
Option<Specialization<'db>>,
Type<'db>,
) {
(self.inferable_typevars, self.specialization, self.return_ty)
}
}
@@ -2843,6 +2898,7 @@ impl<'db> MatchedArgument<'db> {
}
/// Indicates that a parameter of the given name was not found.
#[derive(Debug, Clone, Copy)]
pub(crate) struct UnknownParameterNameError;
/// Binding information for one of the overloads of a callable.
@@ -2954,18 +3010,16 @@ impl<'db> Binding<'db> {
&self.argument_matches,
&mut self.parameter_tys,
call_expression_tcx,
self.return_ty,
&mut self.errors,
);
// If this overload is generic, first see if we can infer a specialization of the function
// from the arguments that were passed in.
checker.infer_specialization();
checker.check_argument_types();
(self.inferable_typevars, self.specialization) = checker.finish();
if let Some(specialization) = self.specialization {
self.return_ty = self.return_ty.apply_specialization(db, specialization);
}
(self.inferable_typevars, self.specialization, self.return_ty) = checker.finish();
}
pub(crate) fn set_return_type(&mut self, return_ty: Type<'db>) {
@@ -2993,15 +3047,24 @@ impl<'db> Binding<'db> {
pub(crate) fn parameter_type_by_name(
&self,
parameter_name: &str,
fallback_to_default: bool,
) -> Result<Option<Type<'db>>, UnknownParameterNameError> {
let index = self
.signature
.parameters()
let parameters = self.signature.parameters();
let index = parameters
.keyword_by_name(parameter_name)
.map(|(i, _)| i)
.ok_or(UnknownParameterNameError)?;
Ok(self.parameter_tys[index])
let parameter_ty = self.parameter_tys[index];
if parameter_ty.is_some() {
Ok(parameter_ty)
} else if fallback_to_default {
Ok(parameters[index].default_type())
} else {
Ok(None)
}
}
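The `fallback_to_default` parameter added here has a natural Python analogue via `inspect.signature`: prefer the explicitly bound argument, and only fall back to the parameter's declared default when asked. A sketch with hypothetical names (`parameter_value_by_name` is not a real API):

```python
import inspect

def parameter_value_by_name(func, bound_args, name, fallback_to_default=False):
    """Return the explicitly passed value for `name`, optionally falling back
    to the parameter's declared default."""
    params = inspect.signature(func).parameters
    if name not in params:
        raise KeyError(name)  # analogous to UnknownParameterNameError
    if name in bound_args:
        return bound_args[name]
    if fallback_to_default:
        default = params[name].default
        if default is not inspect.Parameter.empty:
            return default
    return None

def field(*, default=None, kw_only=False):
    ...

print(parameter_value_by_name(field, {}, "kw_only", fallback_to_default=True))  # False
print(parameter_value_by_name(field, {"kw_only": True}, "kw_only"))             # True
```

This is exactly the distinction the `dataclasses.field` handling needs: a stub's `kw_only: bool = ...` default should only be consulted when the caller did not pass the argument.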
pub(crate) fn arguments_for_parameter<'a>(


@@ -32,12 +32,12 @@ use crate::types::tuple::{TupleSpec, TupleType};
use crate::types::typed_dict::typed_dict_params_from_class_def;
use crate::types::visitor::{NonAtomicType, TypeKind, TypeVisitor, walk_non_atomic_type};
use crate::types::{
ApplyTypeMappingVisitor, Binding, BoundSuperType, CallableType, DataclassParams,
DeprecatedInstance, FindLegacyTypeVarsVisitor, HasRelationToVisitor, IsDisjointVisitor,
IsEquivalentVisitor, KnownInstanceType, ManualPEP695TypeAliasType, MaterializationKind,
NormalizedVisitor, PropertyInstanceType, StringLiteralType, TypeAliasType, TypeContext,
TypeMapping, TypeRelation, TypedDictParams, UnionBuilder, VarianceInferable, declaration_type,
determine_upper_bound, infer_definition_types,
ApplyTypeMappingVisitor, Binding, BoundSuperType, CallableType, DataclassFlags,
DataclassParams, DeprecatedInstance, FindLegacyTypeVarsVisitor, HasRelationToVisitor,
IsDisjointVisitor, IsEquivalentVisitor, KnownInstanceType, ManualPEP695TypeAliasType,
MaterializationKind, NormalizedVisitor, PropertyInstanceType, StringLiteralType, TypeAliasType,
TypeContext, TypeMapping, TypeRelation, TypedDictParams, UnionBuilder, VarianceInferable,
declaration_type, determine_upper_bound, infer_definition_types,
};
use crate::{
Db, FxIndexMap, FxIndexSet, FxOrderSet, Program,
@@ -163,7 +163,7 @@ fn try_metaclass_cycle_recover<'db>(
_count: u32,
_self: ClassLiteral<'db>,
) -> salsa::CycleRecoveryAction<
Result<(Type<'db>, Option<DataclassTransformerParams>), MetaclassError<'db>>,
Result<(Type<'db>, Option<DataclassTransformerParams<'db>>), MetaclassError<'db>>,
> {
salsa::CycleRecoveryAction::Iterate
}
@@ -172,7 +172,7 @@ fn try_metaclass_cycle_recover<'db>(
fn try_metaclass_cycle_initial<'db>(
_db: &'db dyn Db,
_self_: ClassLiteral<'db>,
) -> Result<(Type<'db>, Option<DataclassTransformerParams>), MetaclassError<'db>> {
) -> Result<(Type<'db>, Option<DataclassTransformerParams<'db>>), MetaclassError<'db>> {
Err(MetaclassError {
kind: MetaclassErrorKind::Cycle,
})
@@ -180,17 +180,21 @@ fn try_metaclass_cycle_initial<'db>(
/// A category of classes with code generation capabilities (with synthesized methods).
#[derive(Clone, Copy, Debug, PartialEq, salsa::Update, get_size2::GetSize)]
pub(crate) enum CodeGeneratorKind {
pub(crate) enum CodeGeneratorKind<'db> {
/// Classes decorated with `@dataclass` or similar dataclass-like decorators
DataclassLike(Option<DataclassTransformerParams>),
DataclassLike(Option<DataclassTransformerParams<'db>>),
/// Classes inheriting from `typing.NamedTuple`
NamedTuple,
/// Classes inheriting from `typing.TypedDict`
TypedDict,
}
impl CodeGeneratorKind {
pub(crate) fn from_class(db: &dyn Db, class: ClassLiteral<'_>) -> Option<Self> {
impl<'db> CodeGeneratorKind<'db> {
pub(crate) fn from_class(
db: &'db dyn Db,
class: ClassLiteral<'db>,
specialization: Option<Specialization<'db>>,
) -> Option<Self> {
#[salsa::tracked(
cycle_fn=code_generator_of_class_recover,
cycle_initial=code_generator_of_class_initial,
@@ -199,11 +203,20 @@ impl CodeGeneratorKind {
fn code_generator_of_class<'db>(
db: &'db dyn Db,
class: ClassLiteral<'db>,
) -> Option<CodeGeneratorKind> {
specialization: Option<Specialization<'db>>,
) -> Option<CodeGeneratorKind<'db>> {
if class.dataclass_params(db).is_some() {
Some(CodeGeneratorKind::DataclassLike(None))
} else if let Ok((_, Some(transformer_params))) = class.try_metaclass(db) {
Some(CodeGeneratorKind::DataclassLike(Some(transformer_params)))
} else if let Some(transformer_params) =
class.iter_mro(db, specialization).skip(1).find_map(|base| {
base.into_class().and_then(|class| {
class.class_literal(db).0.dataclass_transformer_params(db)
})
})
{
Some(CodeGeneratorKind::DataclassLike(Some(transformer_params)))
} else if class
.explicit_bases(db)
.contains(&Type::SpecialForm(SpecialFormType::NamedTuple))
@@ -216,34 +229,51 @@ impl CodeGeneratorKind {
}
}
fn code_generator_of_class_initial(
_db: &dyn Db,
_class: ClassLiteral<'_>,
) -> Option<CodeGeneratorKind> {
fn code_generator_of_class_initial<'db>(
_db: &'db dyn Db,
_class: ClassLiteral<'db>,
_specialization: Option<Specialization<'db>>,
) -> Option<CodeGeneratorKind<'db>> {
None
}
#[expect(clippy::ref_option, clippy::trivially_copy_pass_by_ref)]
fn code_generator_of_class_recover(
_db: &dyn Db,
_value: &Option<CodeGeneratorKind>,
#[expect(clippy::ref_option)]
fn code_generator_of_class_recover<'db>(
_db: &'db dyn Db,
_value: &Option<CodeGeneratorKind<'db>>,
_count: u32,
_class: ClassLiteral<'_>,
) -> salsa::CycleRecoveryAction<Option<CodeGeneratorKind>> {
_class: ClassLiteral<'db>,
_specialization: Option<Specialization<'db>>,
) -> salsa::CycleRecoveryAction<Option<CodeGeneratorKind<'db>>> {
salsa::CycleRecoveryAction::Iterate
}
code_generator_of_class(db, class)
code_generator_of_class(db, class, specialization)
}
pub(super) fn matches(self, db: &dyn Db, class: ClassLiteral<'_>) -> bool {
pub(super) fn matches(
self,
db: &'db dyn Db,
class: ClassLiteral<'db>,
specialization: Option<Specialization<'db>>,
) -> bool {
matches!(
(CodeGeneratorKind::from_class(db, class), self),
(
CodeGeneratorKind::from_class(db, class, specialization),
self
),
(Some(Self::DataclassLike(_)), Self::DataclassLike(_))
| (Some(Self::NamedTuple), Self::NamedTuple)
| (Some(Self::TypedDict), Self::TypedDict)
)
}
pub(super) fn dataclass_transformer_params(self) -> Option<DataclassTransformerParams<'db>> {
match self {
Self::DataclassLike(params) => params,
Self::NamedTuple | Self::TypedDict => None,
}
}
}
/// A specialization of a generic class with a particular assignment of types to typevars.
@@ -1387,8 +1417,8 @@ pub struct ClassLiteral<'db> {
/// If this class is deprecated, this holds the deprecation message.
pub(crate) deprecated: Option<DeprecatedInstance<'db>>,
pub(crate) dataclass_params: Option<DataclassParams>,
pub(crate) dataclass_transformer_params: Option<DataclassTransformerParams>,
pub(crate) dataclass_params: Option<DataclassParams<'db>>,
pub(crate) dataclass_transformer_params: Option<DataclassTransformerParams<'db>>,
}
// The Salsa heap is tracked separately.
@@ -1627,15 +1657,10 @@ impl<'db> ClassLiteral<'db> {
})
}
/// Returns a specialization of this class where each typevar is mapped to itself. The second
/// parameter can be `Type::TypeVar` or `Type::NonInferableTypeVar`, depending on the use case.
pub(crate) fn identity_specialization(
self,
db: &'db dyn Db,
typevar_to_type: &impl Fn(BoundTypeVarInstance<'db>) -> Type<'db>,
) -> ClassType<'db> {
/// Returns a specialization of this class where each typevar is mapped to itself.
pub(crate) fn identity_specialization(self, db: &'db dyn Db) -> ClassType<'db> {
self.apply_specialization(db, |generic_context| {
generic_context.identity_specialization(db, typevar_to_type)
generic_context.identity_specialization(db)
})
}
@@ -1909,7 +1934,7 @@ impl<'db> ClassLiteral<'db> {
pub(super) fn try_metaclass(
self,
db: &'db dyn Db,
) -> Result<(Type<'db>, Option<DataclassTransformerParams>), MetaclassError<'db>> {
) -> Result<(Type<'db>, Option<DataclassTransformerParams<'db>>), MetaclassError<'db>> {
tracing::trace!("ClassLiteral::try_metaclass: {}", self.name(db));
// Identify the class's own metaclass (or take the first base class's metaclass).
@@ -2205,7 +2230,7 @@ impl<'db> ClassLiteral<'db> {
};
}
if CodeGeneratorKind::NamedTuple.matches(db, self) {
if CodeGeneratorKind::NamedTuple.matches(db, self, specialization) {
if let Some(field) = self
.own_fields(db, specialization, CodeGeneratorKind::NamedTuple)
.get(name)
@@ -2267,18 +2292,21 @@ impl<'db> ClassLiteral<'db> {
) -> Option<Type<'db>> {
let dataclass_params = self.dataclass_params(db);
let field_policy = CodeGeneratorKind::from_class(db, self)?;
let field_policy = CodeGeneratorKind::from_class(db, self, specialization)?;
let transformer_params =
if let CodeGeneratorKind::DataclassLike(Some(transformer_params)) = field_policy {
Some(DataclassParams::from(transformer_params))
Some(DataclassParams::from_transformer_params(
db,
transformer_params,
))
} else {
None
};
let has_dataclass_param = |param| {
dataclass_params.is_some_and(|params| params.contains(param))
|| transformer_params.is_some_and(|params| params.contains(param))
dataclass_params.is_some_and(|params| params.flags(db).contains(param))
|| transformer_params.is_some_and(|params| params.flags(db).contains(param))
};
let instance_ty =
@@ -2357,7 +2385,7 @@ impl<'db> ClassLiteral<'db> {
}
let is_kw_only = name == "__replace__"
|| kw_only.unwrap_or(has_dataclass_param(DataclassParams::KW_ONLY));
|| kw_only.unwrap_or(has_dataclass_param(DataclassFlags::KW_ONLY));
let mut parameter = if is_kw_only {
Parameter::keyword_only(field_name)
@@ -2395,7 +2423,7 @@ impl<'db> ClassLiteral<'db> {
match (field_policy, name) {
(CodeGeneratorKind::DataclassLike(_), "__init__") => {
if !has_dataclass_param(DataclassParams::INIT) {
if !has_dataclass_param(DataclassFlags::INIT) {
return None;
}
@@ -2410,7 +2438,7 @@ impl<'db> ClassLiteral<'db> {
signature_from_fields(vec![cls_parameter], Some(Type::none(db)))
}
(CodeGeneratorKind::DataclassLike(_), "__lt__" | "__le__" | "__gt__" | "__ge__") => {
if !has_dataclass_param(DataclassParams::ORDER) {
if !has_dataclass_param(DataclassFlags::ORDER) {
return None;
}
@@ -2461,7 +2489,7 @@ impl<'db> ClassLiteral<'db> {
signature_from_fields(vec![self_parameter], Some(instance_ty))
}
(CodeGeneratorKind::DataclassLike(_), "__setattr__") => {
if has_dataclass_param(DataclassParams::FROZEN) {
if has_dataclass_param(DataclassFlags::FROZEN) {
let signature = Signature::new(
Parameters::new([
Parameter::positional_or_keyword(Name::new_static("self"))
@@ -2477,7 +2505,7 @@ impl<'db> ClassLiteral<'db> {
None
}
(CodeGeneratorKind::DataclassLike(_), "__slots__") => {
has_dataclass_param(DataclassParams::SLOTS).then(|| {
has_dataclass_param(DataclassFlags::SLOTS).then(|| {
let fields = self.fields(db, specialization, field_policy);
let slots = fields.keys().map(|name| Type::string_literal(db, name));
Type::heterogeneous_tuple(db, slots)
@@ -2810,7 +2838,7 @@ impl<'db> ClassLiteral<'db> {
.filter_map(|superclass| {
if let Some(class) = superclass.into_class() {
let (class_literal, specialization) = class.class_literal(db);
if field_policy.matches(db, class_literal) {
if field_policy.matches(db, class_literal, specialization) {
Some((class_literal, specialization))
} else {
None
@@ -2901,7 +2929,7 @@ impl<'db> ClassLiteral<'db> {
default_ty = field.default_type(db);
if self
.dataclass_params(db)
.map(|params| params.contains(DataclassParams::NO_FIELD_SPECIFIERS))
.map(|params| params.field_specifiers(db).is_empty())
.unwrap_or(false)
{
// This happens when constructing a `dataclass` with a `dataclass_transform`
@@ -3625,7 +3653,7 @@ impl<'db> VarianceInferable<'db> for ClassLiteral<'db> {
.map(|class| class.variance_of(db, typevar));
let default_attribute_variance = {
let is_namedtuple = CodeGeneratorKind::NamedTuple.matches(db, self);
let is_namedtuple = CodeGeneratorKind::NamedTuple.matches(db, self, None);
// Python 3.13 introduced a synthesized `__replace__` method on dataclasses which uses
// their field types in contravariant position, thus meaning a frozen dataclass must
// still be invariant in its field types. Other synthesized methods on dataclasses are
@@ -3635,7 +3663,7 @@ impl<'db> VarianceInferable<'db> for ClassLiteral<'db> {
let is_frozen_dataclass = Program::get(db).python_version(db) <= PythonVersion::PY312
&& self
.dataclass_params(db)
.is_some_and(|params| params.contains(DataclassParams::FROZEN));
.is_some_and(|params| params.flags(db).contains(DataclassFlags::FROZEN));
if is_namedtuple || is_frozen_dataclass {
TypeVarVariance::Covariant
} else {


@@ -155,7 +155,6 @@ impl<'db> ClassBase<'db> {
| Type::StringLiteral(_)
| Type::LiteralString
| Type::ModuleLiteral(_)
| Type::NonInferableTypeVar(_)
| Type::TypeVar(_)
| Type::BoundSuper(_)
| Type::ProtocolInstance(_)


@@ -38,7 +38,7 @@ pub struct DisplaySettings<'db> {
/// Class names that should be displayed fully qualified
/// (e.g., `module.ClassName` instead of just `ClassName`)
pub qualified: Rc<FxHashMap<&'db str, QualificationLevel>>,
/// Whether long unions are displayed in full
/// Whether long unions and literals are displayed in full
pub preserve_full_unions: bool,
}
@@ -560,9 +560,7 @@ impl Display for DisplayRepresentation<'_> {
.display_with(self.db, self.settings.clone()),
literal_name = enum_literal.name(self.db)
),
Type::NonInferableTypeVar(bound_typevar) | Type::TypeVar(bound_typevar) => {
bound_typevar.identity(self.db).display(self.db).fmt(f)
}
Type::TypeVar(bound_typevar) => bound_typevar.identity(self.db).display(self.db).fmt(f),
Type::AlwaysTruthy => f.write_str("AlwaysTruthy"),
Type::AlwaysFalsy => f.write_str("AlwaysFalsy"),
Type::BoundSuper(bound_super) => {
@@ -593,7 +591,15 @@ impl Display for DisplayRepresentation<'_> {
.0
.display_with(self.db, self.settings.clone())
.fmt(f),
Type::TypeAlias(alias) => f.write_str(alias.name(self.db)),
Type::TypeAlias(alias) => {
f.write_str(alias.name(self.db))?;
match alias.specialization(self.db) {
None => Ok(()),
Some(specialization) => specialization
.display_short(self.db, TupleSpecialization::No, self.settings.clone())
.fmt(f),
}
}
}
}
}
@@ -1322,6 +1328,44 @@ impl Display for DisplayParameter<'_> {
}
}
#[derive(Debug, Copy, Clone)]
struct TruncationPolicy {
max: usize,
max_when_elided: usize,
}
impl TruncationPolicy {
fn display_limit(self, total: usize, preserve_full: bool) -> usize {
if preserve_full {
return total;
}
let limit = if total > self.max {
self.max_when_elided
} else {
self.max
};
limit.min(total)
}
}
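A hedged Python sketch of the `display_limit` logic above: once the element count exceeds `max`, only `max_when_elided` entries are shown, so the "... omitted N ..." note always replaces at least two entries rather than just one.

```python
def display_limit(total, max_, max_when_elided, preserve_full=False):
    # Mirrors TruncationPolicy::display_limit: show everything when asked to
    # preserve the full union/literal; otherwise drop to the elided limit as
    # soon as the total exceeds `max_`.
    if preserve_full:
        return total
    limit = max_when_elided if total > max_ else max_
    return min(limit, total)

# Union policy: max=5, max_when_elided=3
print(display_limit(5, 5, 3))  # 5 -> all five elements shown
print(display_limit(6, 5, 3))  # 3 -> "... omitted 3 union elements"
print(display_limit(2, 5, 3))  # 2
```

With the literal policy (`max=7`, `max_when_elided=5`), the same function governs `Literal[...]` groups.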
#[derive(Debug)]
struct DisplayOmitted {
count: usize,
singular: &'static str,
plural: &'static str,
}
impl Display for DisplayOmitted {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
let noun = if self.count == 1 {
self.singular
} else {
self.plural
};
write!(f, "... omitted {} {}", self.count, noun)
}
}
impl<'db> UnionType<'db> {
fn display_with(
&'db self,
@@ -1342,8 +1386,10 @@ struct DisplayUnionType<'db> {
settings: DisplaySettings<'db>,
}
const MAX_DISPLAYED_UNION_ITEMS: usize = 5;
const MAX_DISPLAYED_UNION_ITEMS_WHEN_ELIDED: usize = 3;
const UNION_POLICY: TruncationPolicy = TruncationPolicy {
max: 5,
max_when_elided: 3,
};
impl Display for DisplayUnionType<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
@@ -1373,16 +1419,8 @@ impl Display for DisplayUnionType<'_> {
let mut join = f.join(" | ");
let display_limit = if self.settings.preserve_full_unions {
total_entries
} else {
let limit = if total_entries > MAX_DISPLAYED_UNION_ITEMS {
MAX_DISPLAYED_UNION_ITEMS_WHEN_ELIDED
} else {
MAX_DISPLAYED_UNION_ITEMS
};
limit.min(total_entries)
};
let display_limit =
UNION_POLICY.display_limit(total_entries, self.settings.preserve_full_unions);
let mut condensed_types = Some(condensed_types);
let mut displayed_entries = 0usize;
@@ -1414,8 +1452,10 @@ impl Display for DisplayUnionType<'_> {
if !self.settings.preserve_full_unions {
let omitted_entries = total_entries.saturating_sub(displayed_entries);
if omitted_entries > 0 {
join.entry(&DisplayUnionOmitted {
join.entry(&DisplayOmitted {
count: omitted_entries,
singular: "union element",
plural: "union elements",
});
}
}
@@ -1431,38 +1471,45 @@ impl fmt::Debug for DisplayUnionType<'_> {
Display::fmt(self, f)
}
}
struct DisplayUnionOmitted {
count: usize,
}
impl Display for DisplayUnionOmitted {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
let plural = if self.count == 1 {
"element"
} else {
"elements"
};
write!(f, "... omitted {} union {}", self.count, plural)
}
}
struct DisplayLiteralGroup<'db> {
literals: Vec<Type<'db>>,
db: &'db dyn Db,
settings: DisplaySettings<'db>,
}
const LITERAL_POLICY: TruncationPolicy = TruncationPolicy {
max: 7,
max_when_elided: 5,
};
impl Display for DisplayLiteralGroup<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
f.write_str("Literal[")?;
f.join(", ")
.entries(
self.literals
.iter()
.map(|ty| ty.representation(self.db, self.settings.singleline())),
)
.finish()?;
let total_entries = self.literals.len();
let display_limit =
LITERAL_POLICY.display_limit(total_entries, self.settings.preserve_full_unions);
let mut join = f.join(", ");
for lit in self.literals.iter().take(display_limit) {
let rep = lit.representation(self.db, self.settings.singleline());
join.entry(&rep);
}
if !self.settings.preserve_full_unions {
let omitted_entries = total_entries.saturating_sub(display_limit);
if omitted_entries > 0 {
join.entry(&DisplayOmitted {
count: omitted_entries,
singular: "literal",
plural: "literals",
});
}
}
join.finish()?;
f.write_str("]")
}
}


@@ -152,24 +152,36 @@ bitflags! {
/// arguments that were passed in. For the precise meaning of the fields, see [1].
///
/// [1]: https://docs.python.org/3/library/typing.html#typing.dataclass_transform
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, salsa::Update)]
pub struct DataclassTransformerParams: u8 {
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord, salsa::Update)]
pub struct DataclassTransformerFlags: u8 {
const EQ_DEFAULT = 1 << 0;
const ORDER_DEFAULT = 1 << 1;
const KW_ONLY_DEFAULT = 1 << 2;
const FROZEN_DEFAULT = 1 << 3;
const FIELD_SPECIFIERS = 1 << 4;
}
}
impl get_size2::GetSize for DataclassTransformerParams {}
impl get_size2::GetSize for DataclassTransformerFlags {}
impl Default for DataclassTransformerParams {
impl Default for DataclassTransformerFlags {
fn default() -> Self {
Self::EQ_DEFAULT
}
}
/// Metadata for a dataclass-transformer. Stored inside a `Type::DataclassTransformer(…)`
/// instance that we use as the return type for `dataclass_transform(…)` calls.
#[salsa::interned(debug, heap_size=ruff_memory_usage::heap_size)]
#[derive(PartialOrd, Ord)]
pub struct DataclassTransformerParams<'db> {
pub flags: DataclassTransformerFlags,
#[returns(deref)]
pub field_specifiers: Box<[Type<'db>]>,
}
impl get_size2::GetSize for DataclassTransformerParams<'_> {}
/// Representation of a function definition in the AST: either a non-generic function, or a generic
/// function that has not been specialized.
///
@@ -201,7 +213,7 @@ pub struct OverloadLiteral<'db> {
/// The arguments to `dataclass_transformer`, if this function was annotated
/// with `@dataclass_transformer(...)`.
pub(crate) dataclass_transformer_params: Option<DataclassTransformerParams>,
pub(crate) dataclass_transformer_params: Option<DataclassTransformerParams<'db>>,
}
// The Salsa heap is tracked separately.
@@ -212,7 +224,7 @@ impl<'db> OverloadLiteral<'db> {
fn with_dataclass_transformer_params(
self,
db: &'db dyn Db,
params: DataclassTransformerParams,
params: DataclassTransformerParams<'db>,
) -> Self {
Self::new(
db,
@@ -353,28 +365,12 @@ impl<'db> OverloadLiteral<'db> {
if function_node.is_async && !is_generator {
signature = signature.wrap_coroutine_return_type(db);
}
signature = signature.mark_typevars_inferable(db);
let pep695_ctx = function_node.type_params.as_ref().map(|type_params| {
GenericContext::from_type_params(db, index, self.definition(db), type_params)
});
let legacy_ctx = GenericContext::from_function_params(
db,
self.definition(db),
signature.parameters(),
signature.return_ty,
);
// We need to update `signature.generic_context` here,
// because type variables in `GenericContext::variables` are still non-inferable.
signature.generic_context =
GenericContext::merge_pep695_and_legacy(db, pep695_ctx, legacy_ctx);
signature
}
/// Typed internally-visible "raw" signature for this function.
/// That is, type variables in parameter types and the return type remain non-inferable,
/// and the return types of async functions are not wrapped in `CoroutineType[...]`.
/// That is, the return types of async functions are not wrapped in `CoroutineType[...]`.
///
/// ## Warning
///
@@ -740,7 +736,7 @@ impl<'db> FunctionType<'db> {
pub(crate) fn with_dataclass_transformer_params(
self,
db: &'db dyn Db,
params: DataclassTransformerParams,
params: DataclassTransformerParams<'db>,
) -> Self {
// A decorator only applies to the specific overload that it is attached to, not to all
// previous overloads.
@@ -1121,7 +1117,6 @@ fn is_instance_truthiness<'db>(
| Type::PropertyInstance(..)
| Type::AlwaysTruthy
| Type::AlwaysFalsy
| Type::NonInferableTypeVar(..)
| Type::TypeVar(..)
| Type::BoundSuper(..)
| Type::TypeIs(..)
@@ -1717,11 +1712,7 @@ impl KnownFunction {
}
KnownFunction::RangeConstraint => {
let [
Some(lower),
Some(Type::NonInferableTypeVar(typevar)),
Some(upper),
] = parameter_types
let [Some(lower), Some(Type::TypeVar(typevar)), Some(upper)] = parameter_types
else {
return;
};
@@ -1734,11 +1725,7 @@ impl KnownFunction {
}
KnownFunction::NegatedRangeConstraint => {
let [
Some(lower),
Some(Type::NonInferableTypeVar(typevar)),
Some(upper),
] = parameter_types
let [Some(lower), Some(Type::TypeVar(typevar)), Some(upper)] = parameter_types
else {
return;
};


@@ -1,8 +1,9 @@
use std::marker::PhantomData;
use std::cell::RefCell;
use std::fmt::Display;
use itertools::Itertools;
use ruff_python_ast as ast;
use rustc_hash::FxHashMap;
use rustc_hash::{FxHashMap, FxHashSet};
use crate::semantic_index::definition::Definition;
use crate::semantic_index::scope::{FileScopeId, NodeWithScopeKind, ScopeId};
@@ -14,14 +15,16 @@ use crate::types::infer::infer_definition_types;
use crate::types::instance::{Protocol, ProtocolInstanceType};
use crate::types::signatures::{Parameter, Parameters, Signature};
use crate::types::tuple::{TupleSpec, TupleType, walk_tuple_type};
use crate::types::visitor::{NonAtomicType, TypeKind, TypeVisitor, walk_non_atomic_type};
use crate::types::{
ApplyTypeMappingVisitor, BoundTypeVarIdentity, BoundTypeVarInstance, ClassLiteral,
FindLegacyTypeVarsVisitor, HasRelationToVisitor, IsDisjointVisitor, IsEquivalentVisitor,
KnownClass, KnownInstanceType, MaterializationKind, NormalizedVisitor, Type, TypeContext,
TypeMapping, TypeRelation, TypeVarBoundOrConstraints, TypeVarIdentity, TypeVarInstance,
TypeVarKind, TypeVarVariance, UnionType, binding_type, declaration_type,
walk_bound_type_var_type,
};
use crate::{Db, FxOrderMap, FxOrderSet};
use crate::{Db, FxIndexSet, FxOrderMap, FxOrderSet};
/// Returns an iterator of any generic context introduced by the given scope or any enclosing
/// scope.
@@ -106,7 +109,6 @@ pub(crate) fn typing_self<'db>(
scope_id: ScopeId,
typevar_binding_context: Option<Definition<'db>>,
class: ClassLiteral<'db>,
typevar_to_type: &impl Fn(BoundTypeVarInstance<'db>) -> Type<'db>,
) -> Option<Type<'db>> {
let index = semantic_index(db, scope_id.file(db));
@@ -118,7 +120,7 @@ pub(crate) fn typing_self<'db>(
);
let bounds = TypeVarBoundOrConstraints::UpperBound(Type::instance(
db,
class.identity_specialization(db, typevar_to_type),
class.identity_specialization(db),
));
let typevar = TypeVarInstance::new(
db,
@@ -138,17 +140,70 @@ pub(crate) fn typing_self<'db>(
typevar_binding_context,
typevar,
)
.map(typevar_to_type)
.map(Type::TypeVar)
}
#[derive(Clone, Copy, Debug)]
pub(crate) enum InferableTypeVars<'a, 'db> {
None,
// TODO: This variant isn't used, and only exists so that we can include the 'a and 'db in the
// type definition. They will be used soon when we start creating real InferableTypeVars
// instances.
#[expect(unused)]
Unused(PhantomData<&'a &'db ()>),
One(&'a FxHashSet<BoundTypeVarIdentity<'db>>),
Two(
&'a InferableTypeVars<'a, 'db>,
&'a InferableTypeVars<'a, 'db>,
),
}
impl<'db> BoundTypeVarInstance<'db> {
pub(crate) fn is_inferable(
self,
db: &'db dyn Db,
inferable: InferableTypeVars<'_, 'db>,
) -> bool {
match inferable {
InferableTypeVars::None => false,
InferableTypeVars::One(typevars) => typevars.contains(&self.identity(db)),
InferableTypeVars::Two(left, right) => {
self.is_inferable(db, *left) || self.is_inferable(db, *right)
}
}
}
}
impl<'a, 'db> InferableTypeVars<'a, 'db> {
pub(crate) fn merge(&'a self, other: Option<&'a InferableTypeVars<'a, 'db>>) -> Self {
match other {
Some(other) => InferableTypeVars::Two(self, other),
None => *self,
}
}
// Keep this around for debugging purposes
#[expect(dead_code)]
pub(crate) fn display(&self, db: &'db dyn Db) -> impl Display {
fn find_typevars<'db>(
result: &mut FxHashSet<BoundTypeVarIdentity<'db>>,
inferable: &InferableTypeVars<'_, 'db>,
) {
match inferable {
InferableTypeVars::None => {}
InferableTypeVars::One(typevars) => result.extend(typevars.iter().copied()),
InferableTypeVars::Two(left, right) => {
find_typevars(result, left);
find_typevars(result, right);
}
}
}
let mut typevars = FxHashSet::default();
find_typevars(&mut typevars, self);
format!(
"[{}]",
typevars
.into_iter()
.map(|identity| identity.display(db))
.format(", ")
)
}
}
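The `Two` variant above lets two inferable-typevar contexts be merged without allocating a new set: `merge` just pairs up borrowed references, and `is_inferable` walks the resulting tree. A minimal stdlib-only sketch of that shape (simplified names and `u32` ids standing in for `BoundTypeVarIdentity`):

```rust
use std::collections::HashSet;

// Simplified model of `InferableTypeVars`: a context is empty, a borrowed set
// of typevar ids, or a pair of nested contexts merged by reference.
enum Inferable<'a> {
    None,
    One(&'a HashSet<u32>),
    Two(&'a Inferable<'a>, &'a Inferable<'a>),
}

impl<'a> Inferable<'a> {
    // Membership is a recursive walk: a typevar is inferable if any leaf set
    // in the tree of merged contexts contains it.
    fn is_inferable(&self, id: u32) -> bool {
        match self {
            Inferable::None => false,
            Inferable::One(set) => set.contains(&id),
            Inferable::Two(left, right) => left.is_inferable(id) || right.is_inferable(id),
        }
    }

    // Mirrors `InferableTypeVars::merge`: pair up with another context if any.
    fn merge(&'a self, other: Option<&'a Inferable<'a>>) -> Inferable<'a> {
        match other {
            Some(other) => Inferable::Two(self, other),
            None => Inferable::None,
        }
    }
}
```

Note that the real `merge` returns `*self` when `other` is `None` (it is `Copy`); the sketch returns `None` there only to keep the toy type borrow-only.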
#[derive(Copy, Clone, Debug, Eq, Hash, PartialEq, get_size2::GetSize)]
@@ -255,6 +310,66 @@ impl<'db> GenericContext<'db> {
)
}
pub(crate) fn inferable_typevars(self, db: &'db dyn Db) -> InferableTypeVars<'db, 'db> {
#[derive(Default)]
struct CollectTypeVars<'db> {
typevars: RefCell<FxHashSet<BoundTypeVarIdentity<'db>>>,
seen_types: RefCell<FxIndexSet<NonAtomicType<'db>>>,
}
impl<'db> TypeVisitor<'db> for CollectTypeVars<'db> {
fn should_visit_lazy_type_attributes(&self) -> bool {
true
}
fn visit_bound_type_var_type(
&self,
db: &'db dyn Db,
bound_typevar: BoundTypeVarInstance<'db>,
) {
self.typevars
.borrow_mut()
.insert(bound_typevar.identity(db));
walk_bound_type_var_type(db, bound_typevar, self);
}
fn visit_type(&self, db: &'db dyn Db, ty: Type<'db>) {
match TypeKind::from(ty) {
TypeKind::Atomic => {}
TypeKind::NonAtomic(non_atomic_type) => {
if !self.seen_types.borrow_mut().insert(non_atomic_type) {
// If we have already seen this type, we can skip it.
return;
}
walk_non_atomic_type(db, non_atomic_type, self);
}
}
}
}
#[salsa::tracked(
returns(ref),
cycle_fn=inferable_typevars_cycle_recover,
cycle_initial=inferable_typevars_cycle_initial,
heap_size=ruff_memory_usage::heap_size,
)]
fn inferable_typevars_inner<'db>(
db: &'db dyn Db,
generic_context: GenericContext<'db>,
) -> FxHashSet<BoundTypeVarIdentity<'db>> {
let visitor = CollectTypeVars::default();
for bound_typevar in generic_context.variables(db) {
visitor.visit_bound_type_var_type(db, bound_typevar);
}
visitor.typevars.into_inner()
}
// This ensures that salsa caches the FxHashSet, not the InferableTypeVars that wraps it.
// (That way InferableTypeVars can contain references, and doesn't need to impl
// salsa::Update.)
InferableTypeVars::One(inferable_typevars_inner(db, self))
}
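The `CollectTypeVars` visitor above relies on its `seen_types` set for termination and performance: without it, shared non-atomic subterms would be re-walked, and cyclic types would recurse forever. A standalone sketch of the same dedup pattern on a toy type representation (not the actual `Type` machinery):

```rust
use std::cell::RefCell;
use std::collections::HashSet;

// Toy recursive type with shared structure.
#[derive(Clone, PartialEq, Eq, Hash)]
enum Ty {
    Var(u32),
    Int,
    List(Box<Ty>),
    Union(Vec<Ty>),
}

// Collects every typevar id, skipping composites it has already walked,
// analogous to `CollectTypeVars` with its `typevars` and `seen_types` fields.
#[derive(Default)]
struct Collect {
    vars: RefCell<HashSet<u32>>,
    seen: RefCell<HashSet<Ty>>,
}

impl Collect {
    fn visit(&self, ty: &Ty) {
        match ty {
            Ty::Var(v) => {
                self.vars.borrow_mut().insert(*v);
            }
            Ty::Int => {}
            composite => {
                // If we have already seen this composite type, skip it.
                if !self.seen.borrow_mut().insert(composite.clone()) {
                    return;
                }
                match composite {
                    Ty::List(inner) => self.visit(inner),
                    Ty::Union(elems) => elems.iter().for_each(|e| self.visit(e)),
                    _ => unreachable!(),
                }
            }
        }
    }
}
```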
pub(crate) fn variables(
self,
db: &'db dyn Db,
@@ -410,14 +525,8 @@ impl<'db> GenericContext<'db> {
}
/// Returns a specialization of this generic context where each typevar is mapped to itself.
/// The second parameter can be `Type::TypeVar` or `Type::NonInferableTypeVar`, depending on
/// the use case.
pub(crate) fn identity_specialization(
self,
db: &'db dyn Db,
typevar_to_type: &impl Fn(BoundTypeVarInstance<'db>) -> Type<'db>,
) -> Specialization<'db> {
let types = self.variables(db).map(typevar_to_type).collect();
pub(crate) fn identity_specialization(self, db: &'db dyn Db) -> Specialization<'db> {
let types = self.variables(db).map(Type::TypeVar).collect();
self.specialize(db, types)
}
@@ -543,6 +652,22 @@ impl<'db> GenericContext<'db> {
}
}
fn inferable_typevars_cycle_recover<'db>(
_db: &'db dyn Db,
_value: &FxHashSet<BoundTypeVarIdentity<'db>>,
_count: u32,
_self: GenericContext<'db>,
) -> salsa::CycleRecoveryAction<FxHashSet<BoundTypeVarIdentity<'db>>> {
salsa::CycleRecoveryAction::Iterate
}
fn inferable_typevars_cycle_initial<'db>(
_db: &'db dyn Db,
_self: GenericContext<'db>,
) -> FxHashSet<BoundTypeVarIdentity<'db>> {
FxHashSet::default()
}
#[derive(Debug, Copy, Clone, PartialEq, Eq)]
pub(super) enum LegacyGenericBase {
Generic,
@@ -1229,6 +1354,7 @@ impl<'db> SpecializationBuilder<'db> {
let tcx = tcx_specialization.and_then(|specialization| {
specialization.get(self.db, variable.bound_typevar)
});
ty = ty.map(|ty| ty.promote_literals(self.db, TypeContext::new(tcx)));
}
@@ -1251,7 +1377,7 @@ impl<'db> SpecializationBuilder<'db> {
pub(crate) fn infer(
&mut self,
formal: Type<'db>,
mut actual: Type<'db>,
actual: Type<'db>,
) -> Result<(), SpecializationError<'db>> {
if formal == actual {
return Ok(());
@@ -1282,9 +1408,11 @@ impl<'db> SpecializationBuilder<'db> {
return Ok(());
}
// For example, if `formal` is `list[T]` and `actual` is `list[int] | None`, we want to specialize `T` to `int`.
// So, here we remove the union elements that are not related to `formal`.
actual = actual.filter_disjoint_elements(self.db, formal, self.inferable);
// Remove the union elements that are not related to `formal`.
//
// For example, if `formal` is `list[T]` and `actual` is `list[int] | None`, we want to specialize `T`
// to `int`.
let actual = actual.filter_disjoint_elements(self.db, formal, self.inferable);
match (formal, actual) {
// TODO: We haven't implemented a full unification solver yet. If typevars appear in
@@ -1354,7 +1482,9 @@ impl<'db> SpecializationBuilder<'db> {
}
}
(Type::TypeVar(bound_typevar), ty) | (ty, Type::TypeVar(bound_typevar)) => {
(Type::TypeVar(bound_typevar), ty) | (ty, Type::TypeVar(bound_typevar))
if bound_typevar.is_inferable(self.db, self.inferable) =>
{
match bound_typevar.typevar(self.db).bound_or_constraints(self.db) {
Some(TypeVarBoundOrConstraints::UpperBound(bound)) => {
if !ty


@@ -14,7 +14,7 @@ use crate::types::call::{CallArguments, MatchedArgument};
use crate::types::signatures::Signature;
use crate::types::{
ClassBase, ClassLiteral, DynamicType, KnownClass, KnownInstanceType, Type,
class::CodeGeneratorKind,
TypeVarBoundOrConstraints, class::CodeGeneratorKind,
};
use crate::{Db, HasType, NameKind, SemanticModel};
use ruff_db::files::{File, FileRange};
@@ -122,7 +122,7 @@ impl<'db> AllMembers<'db> {
self.extend_with_instance_members(db, ty, class_literal);
// If this is a NamedTuple instance, include members from NamedTupleFallback
if CodeGeneratorKind::NamedTuple.matches(db, class_literal) {
if CodeGeneratorKind::NamedTuple.matches(db, class_literal, None) {
self.extend_with_type(db, KnownClass::NamedTupleFallback.to_class_literal(db));
}
}
@@ -142,7 +142,7 @@ impl<'db> AllMembers<'db> {
Type::ClassLiteral(class_literal) => {
self.extend_with_class_members(db, ty, class_literal);
if CodeGeneratorKind::NamedTuple.matches(db, class_literal) {
if CodeGeneratorKind::NamedTuple.matches(db, class_literal, None) {
self.extend_with_type(db, KnownClass::NamedTupleFallback.to_class_literal(db));
}
@@ -153,7 +153,7 @@ impl<'db> AllMembers<'db> {
Type::GenericAlias(generic_alias) => {
let class_literal = generic_alias.origin(db);
if CodeGeneratorKind::NamedTuple.matches(db, class_literal) {
if CodeGeneratorKind::NamedTuple.matches(db, class_literal, None) {
self.extend_with_type(db, KnownClass::NamedTupleFallback.to_class_literal(db));
}
self.extend_with_class_members(db, ty, class_literal);
@@ -164,7 +164,7 @@ impl<'db> AllMembers<'db> {
let class_literal = class_type.class_literal(db).0;
self.extend_with_class_members(db, ty, class_literal);
if CodeGeneratorKind::NamedTuple.matches(db, class_literal) {
if CodeGeneratorKind::NamedTuple.matches(db, class_literal, None) {
self.extend_with_type(
db,
KnownClass::NamedTupleFallback.to_class_literal(db),
@@ -177,6 +177,29 @@ impl<'db> AllMembers<'db> {
Type::TypeAlias(alias) => self.extend_with_type(db, alias.value_type(db)),
Type::TypeVar(bound_typevar) => {
match bound_typevar.typevar(db).bound_or_constraints(db) {
None => {
self.extend_with_type(db, Type::object());
}
Some(TypeVarBoundOrConstraints::UpperBound(bound)) => {
self.extend_with_type(db, bound);
}
Some(TypeVarBoundOrConstraints::Constraints(constraints)) => {
self.members.extend(
constraints
.elements(db)
.iter()
.map(|ty| AllMembers::of(db, *ty).members)
.reduce(|acc, members| {
acc.intersection(&members).cloned().collect()
})
.unwrap_or_default(),
);
}
}
}
Type::IntLiteral(_)
| Type::BooleanLiteral(_)
| Type::StringLiteral(_)
@@ -194,8 +217,6 @@ impl<'db> AllMembers<'db> {
| Type::ProtocolInstance(_)
| Type::SpecialForm(_)
| Type::KnownInstance(_)
| Type::NonInferableTypeVar(_)
| Type::TypeVar(_)
| Type::BoundSuper(_)
| Type::TypeIs(_) => match ty.to_meta_type(db) {
Type::ClassLiteral(class_literal) => {

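For a constrained typevar (e.g. `TypeVar("T", str, bytes)`), the completion logic above offers only members available on every constraint, by reducing the per-constraint member sets with set intersection. A minimal sketch of that reduce step, with string member names standing in for the real `Member` values:

```rust
use std::collections::HashSet;

// A member is offered for a constrained typevar only if every constraint
// provides it; an empty constraint list yields no members.
fn common_members(constraint_members: &[HashSet<&'static str>]) -> HashSet<&'static str> {
    constraint_members
        .iter()
        .cloned()
        .reduce(|acc, members| acc.intersection(&members).cloned().collect())
        .unwrap_or_default()
}
```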

@@ -391,7 +391,7 @@ impl<'db> TypeContext<'db> {
.and_then(|ty| ty.known_specialization(db, known_class))
}
pub(crate) fn map_annotation(self, f: impl FnOnce(Type<'db>) -> Type<'db>) -> Self {
pub(crate) fn map(self, f: impl FnOnce(Type<'db>) -> Type<'db>) -> Self {
Self {
annotation: self.annotation.map(f),
}


@@ -9,6 +9,7 @@ use ruff_python_ast::{self as ast, AnyNodeRef, ExprContext, PythonVersion};
use ruff_python_stdlib::builtins::version_builtin_was_added;
use ruff_text_size::{Ranged, TextRange};
use rustc_hash::{FxHashMap, FxHashSet};
use smallvec::SmallVec;
use super::{
CycleRecovery, DefinitionInference, DefinitionInferenceExtra, ExpressionInference,
@@ -152,6 +153,12 @@ type BinaryComparisonVisitor<'db> = CycleDetector<
Result<Type<'db>, CompareUnsupportedError<'db>>,
>;
/// We currently store one dataclass field specifier inline, because that covers standard
/// dataclasses. attrs uses 2 specifiers, pydantic and strawberry use 3, and SQLAlchemy
/// uses 7. We could store more inline if this turns out to be a performance problem;
/// for now, we optimize for memory usage.
const NUM_FIELD_SPECIFIERS_INLINE: usize = 1;
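The `SmallVec<[Type<'db>; NUM_FIELD_SPECIFIERS_INLINE]>` choice keeps the common single-specifier case allocation-free, spilling to the heap only for frameworks with more specifiers. The underlying idea can be sketched with a stdlib-only small-buffer enum (a drastically simplified stand-in for the `smallvec` crate):

```rust
// One element lives inline; more than one spills to a heap-allocated Vec.
enum Small<T> {
    Empty,
    Inline(T),
    Heap(Vec<T>),
}

impl<T> Small<T> {
    fn push(&mut self, value: T) {
        *self = match std::mem::replace(self, Small::Empty) {
            Small::Empty => Small::Inline(value),
            // First spill: move the inline element into a Vec.
            Small::Inline(first) => Small::Heap(vec![first, value]),
            Small::Heap(mut vec) => {
                vec.push(value);
                Small::Heap(vec)
            }
        };
    }

    fn len(&self) -> usize {
        match self {
            Small::Empty => 0,
            Small::Inline(_) => 1,
            Small::Heap(vec) => vec.len(),
        }
    }
}
```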
/// Builder to infer all types in a region.
///
/// A builder is used by creating it with [`new()`](TypeInferenceBuilder::new), and then calling
@@ -277,6 +284,10 @@ pub(super) struct TypeInferenceBuilder<'db, 'ast> {
/// `true` if all places in this expression are definitely bound
all_definitely_bound: bool,
/// A list of `dataclass_transform` field specifiers that are "active" (when inferring
/// the right hand side of an annotated assignment in a class that is a dataclass).
dataclass_field_specifiers: SmallVec<[Type<'db>; NUM_FIELD_SPECIFIERS_INLINE]>,
}
impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
@@ -312,6 +323,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
undecorated_type: None,
cycle_recovery: None,
all_definitely_bound: true,
dataclass_field_specifiers: SmallVec::new(),
}
}
@@ -565,7 +577,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
continue;
}
let is_named_tuple = CodeGeneratorKind::NamedTuple.matches(self.db(), class);
let is_named_tuple = CodeGeneratorKind::NamedTuple.matches(self.db(), class, None);
// (2) If it's a `NamedTuple` class, check that no field without a default value
// appears after a field with a default value.
@@ -886,7 +898,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// (7) Check that a dataclass does not have more than one `KW_ONLY`.
if let Some(field_policy @ CodeGeneratorKind::DataclassLike(_)) =
CodeGeneratorKind::from_class(self.db(), class)
CodeGeneratorKind::from_class(self.db(), class, None)
{
let specialization = None;
@@ -1333,7 +1345,16 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
true
}
fn add_binding(&mut self, node: AnyNodeRef, binding: Definition<'db>, ty: Type<'db>) {
/// Add a binding for the given definition.
///
/// Returns the result of the `infer_value_ty` closure, which is called with the declared type
/// as type context.
fn add_binding(
&mut self,
node: AnyNodeRef,
binding: Definition<'db>,
infer_value_ty: impl FnOnce(&mut Self, TypeContext<'db>) -> Type<'db>,
) -> Type<'db> {
/// Arbitrary `__getitem__`/`__setitem__` methods on a class do not
/// necessarily guarantee that the passed-in value for `__setitem__` is stored and
/// can be retrieved unmodified via `__getitem__`. Therefore, we currently only
@@ -1378,7 +1399,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let file_scope_id = binding.file_scope(db);
let place_table = self.index.place_table(file_scope_id);
let use_def = self.index.use_def_map(file_scope_id);
let mut bound_ty = ty;
let global_use_def_map = self.index.use_def_map(FileScopeId::global());
let place_id = binding.place(self.db());
@@ -1489,12 +1509,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
qualifiers,
} = place_and_quals;
let unwrap_declared_ty = || {
resolved_place
.ignore_possibly_undefined()
.unwrap_or(Type::unknown())
};
// If the place is unbound and it's an attribute or subscript place, fall back to normal
// attribute/subscript inference on the root type.
let declared_ty =
@@ -1506,9 +1520,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
value_type.member(db, attr).place
{
// TODO: also consider qualifiers on the attribute
ty
Some(ty)
} else {
unwrap_declared_ty()
None
}
} else if let AnyNodeRef::ExprSubscript(
subscript @ ast::ExprSubscript {
@@ -1518,13 +1532,19 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
{
let value_ty = self.infer_expression(value, TypeContext::default());
let slice_ty = self.infer_expression(slice, TypeContext::default());
self.infer_subscript_expression_types(subscript, value_ty, slice_ty, *ctx)
Some(self.infer_subscript_expression_types(subscript, value_ty, slice_ty, *ctx))
} else {
unwrap_declared_ty()
None
}
} else {
unwrap_declared_ty()
};
None
}
.or_else(|| resolved_place.ignore_possibly_undefined());
let inferred_ty = infer_value_ty(self, TypeContext::new(declared_ty));
let declared_ty = declared_ty.unwrap_or(Type::unknown());
let mut bound_ty = inferred_ty;
if qualifiers.contains(TypeQualifiers::FINAL) {
let mut previous_bindings = use_def.bindings_at_definition(binding);
@@ -1580,7 +1600,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
if !bound_ty.is_assignable_to(db, declared_ty) {
report_invalid_assignment(&self.context, node, binding, declared_ty, bound_ty);
// allow declarations to override inference in case of invalid assignment
// Allow declarations to override inference in case of invalid assignment.
bound_ty = declared_ty;
}
// In the following cases, the bound type may not be the same as the RHS value type.
@@ -1608,6 +1629,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
self.bindings.insert(binding, bound_ty);
inferred_ty
}
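The refactor above inverts control: instead of the caller inferring the value type up front, `add_binding` first resolves the declared type, hands it to the closure as inference context, and finally clamps the bound type to the declaration on an invalid assignment. A toy model of that shape (names and the `Ty` enum are illustrative, not the real types):

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
enum Ty {
    Unknown,
    Int,
    Str,
}

struct Builder {
    bindings: Vec<(&'static str, Ty)>,
}

impl Builder {
    // The closure receives the declared type as context and returns the
    // inferred value type; the binding records the (possibly clamped) type,
    // while the inferred type is returned to the caller.
    fn add_binding(
        &mut self,
        name: &'static str,
        declared: Option<Ty>,
        infer_value_ty: impl FnOnce(&mut Self, Option<Ty>) -> Ty,
    ) -> Ty {
        let inferred = infer_value_ty(self, declared);
        let declared = declared.unwrap_or(Ty::Unknown);
        // Allow declarations to override inference on invalid assignment.
        let assignable = declared == Ty::Unknown || inferred == declared;
        let bound = if assignable { inferred } else { declared };
        self.bindings.push((name, bound));
        inferred
    }
}
```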
/// Returns `true` if `symbol_id` should be looked up in the global scope, skipping intervening
@@ -2473,7 +2496,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
} else {
Type::unknown()
};
self.add_binding(parameter.into(), definition, ty);
self.add_binding(parameter.into(), definition, |_, _| ty);
}
}
@@ -2503,11 +2527,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
&DeclaredAndInferredType::are_the_same_type(ty),
);
} else {
self.add_binding(
parameter.into(),
definition,
Type::homogeneous_tuple(self.db(), Type::unknown()),
);
self.add_binding(parameter.into(), definition, |builder, _| {
Type::homogeneous_tuple(builder.db(), Type::unknown())
});
}
}
@@ -2535,14 +2557,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
&DeclaredAndInferredType::are_the_same_type(ty),
);
} else {
self.add_binding(
parameter.into(),
definition,
self.add_binding(parameter.into(), definition, |builder, _| {
KnownClass::Dict.to_specialized_instance(
self.db(),
[KnownClass::Str.to_instance(self.db()), Type::unknown()],
),
);
builder.db(),
[KnownClass::Str.to_instance(builder.db()), Type::unknown()],
)
});
}
}
@@ -2574,7 +2594,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
.as_function_literal()
.is_some_and(|function| function.is_known(self.db(), KnownFunction::Dataclass))
{
dataclass_params = Some(DataclassParams::default());
dataclass_params = Some(DataclassParams::default_params(self.db()));
continue;
}
@@ -2595,11 +2615,14 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// overload, or an overload and the implementation both. Nevertheless, this is not
// allowed. We do not try to treat the offenders intelligently -- just use the
// params of the last seen usage of `@dataclass_transform`
let params = f
let transformer_params = f
.iter_overloads_and_implementation(self.db())
.find_map(|overload| overload.dataclass_transformer_params(self.db()));
if let Some(params) = params {
dataclass_params = Some(params.into());
if let Some(transformer_params) = transformer_params {
dataclass_params = Some(DataclassParams::from_transformer_params(
self.db(),
transformer_params,
));
continue;
}
}
@@ -2813,12 +2836,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
for item in items {
let target = item.optional_vars.as_deref();
if let Some(target) = target {
self.infer_target(target, &item.context_expr, |builder, context_expr| {
self.infer_target(target, &item.context_expr, |builder| {
// TODO: `infer_with_statement_definition` reports a diagnostic if `ctx_manager_ty` isn't a context manager
// but only if the target is a name. We should report a diagnostic here if the target isn't a name:
// `with not_context_manager as a.x: ...`
builder
.infer_standalone_expression(context_expr, TypeContext::default())
.infer_standalone_expression(&item.context_expr, TypeContext::default())
.enter(builder.db())
});
} else {
@@ -2858,7 +2881,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
};
self.store_expression_type(target, target_ty);
self.add_binding(target.into(), definition, target_ty);
self.add_binding(target.into(), definition, |_, _| target_ty);
}
/// Infers the type of a context expression (`with expr`) and returns the target's type
@@ -2990,7 +3013,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
self.add_binding(
except_handler_definition.node(self.module()).into(),
definition,
symbol_ty,
|_, _| symbol_ty,
);
}
@@ -3159,11 +3182,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// against the subject expression type (which we can query via `infer_expression_types`)
// and extract the type at the `index` position if the pattern matches. This will be
// similar to the logic in `self.infer_assignment_definition`.
self.add_binding(
pattern.into(),
definition,
todo_type!("`match` pattern definition types"),
);
self.add_binding(pattern.into(), definition, |_, _| {
todo_type!("`match` pattern definition types")
});
}
fn infer_match_pattern(&mut self, pattern: &ast::Pattern) {
@@ -3284,8 +3305,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
} = assignment;
for target in targets {
self.infer_target(target, value, |builder, value_expr| {
builder.infer_standalone_expression(value_expr, TypeContext::default())
self.infer_target(target, value, |builder| {
builder.infer_standalone_expression(value, TypeContext::default())
});
}
}
@@ -3301,11 +3322,11 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
/// `target`.
fn infer_target<F>(&mut self, target: &ast::Expr, value: &ast::Expr, infer_value_expr: F)
where
F: Fn(&mut TypeInferenceBuilder<'db, '_>, &ast::Expr) -> Type<'db>,
F: Fn(&mut Self) -> Type<'db>,
{
let assigned_ty = match target {
ast::Expr::Name(_) => None,
_ => Some(infer_value_expr(self, value)),
_ => Some(infer_value_expr(self)),
};
self.infer_target_impl(target, value, assigned_ty);
}
@@ -3596,7 +3617,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
| Type::WrapperDescriptor(_)
| Type::DataclassDecorator(_)
| Type::DataclassTransformer(_)
| Type::NonInferableTypeVar(..)
| Type::TypeVar(..)
| Type::AlwaysTruthy
| Type::AlwaysFalsy
@@ -4054,6 +4074,21 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
assignment: &AssignmentDefinitionKind<'db>,
definition: Definition<'db>,
) {
let target = assignment.target(self.module());
self.add_binding(target.into(), definition, |builder, tcx| {
let target_ty = builder.infer_assignment_definition_impl(assignment, definition, tcx);
builder.store_expression_type(target, target_ty);
target_ty
});
}
fn infer_assignment_definition_impl(
&mut self,
assignment: &AssignmentDefinitionKind<'db>,
definition: Definition<'db>,
tcx: TypeContext<'db>,
) -> Type<'db> {
let value = assignment.value(self.module());
let target = assignment.target(self.module());
@@ -4069,7 +4104,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
unpacked.expression_type(target)
}
TargetKind::Single => {
let tcx = TypeContext::default();
let value_ty = if let Some(standalone_expression) = self.index.try_expression(value)
{
self.infer_standalone_expression_impl(value, standalone_expression, tcx)
@@ -4094,6 +4128,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
} else {
self.infer_call_expression_impl(call_expr, callable_type, tcx)
};
self.store_expression_type(value, ty);
ty
} else {
@@ -4125,8 +4160,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
target_ty = Type::SpecialForm(special_form);
}
self.store_expression_type(target, target_ty);
self.add_binding(target.into(), definition, target_ty);
target_ty
}
fn infer_legacy_typevar(
@@ -4518,10 +4552,42 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
debug_assert!(PlaceExpr::try_from_expr(target).is_some());
if let Some(value) = value {
fn field_specifiers<'db>(
db: &'db dyn Db,
index: &'db SemanticIndex<'db>,
scope: ScopeId<'db>,
) -> Option<SmallVec<[Type<'db>; NUM_FIELD_SPECIFIERS_INLINE]>> {
let enclosing_scope = index.scope(scope.file_scope_id(db));
let class_node = enclosing_scope.node().as_class()?;
let class_definition = index.expect_single_definition(class_node);
let class_literal = infer_definition_types(db, class_definition)
.declaration_type(class_definition)
.inner_type()
.as_class_literal()?;
class_literal
.dataclass_params(db)
.map(|params| SmallVec::from(params.field_specifiers(db)))
.or_else(|| {
Some(SmallVec::from(
CodeGeneratorKind::from_class(db, class_literal, None)?
.dataclass_transformer_params()?
.field_specifiers(db),
))
})
}
if let Some(specifiers) = field_specifiers(self.db(), self.index, self.scope()) {
self.dataclass_field_specifiers = specifiers;
}
let inferred_ty = self.infer_maybe_standalone_expression(
value,
TypeContext::new(Some(declared.inner_type())),
);
self.dataclass_field_specifiers.clear();
let inferred_ty = if target
.as_name_expr()
.is_some_and(|name| &name.id == "TYPE_CHECKING")
@@ -4631,7 +4697,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
definition: Definition<'db>,
) {
let target_ty = self.infer_augment_assignment(assignment);
self.add_binding(assignment.into(), definition, target_ty);
self.add_binding(assignment.into(), definition, |_, _| target_ty);
}
fn infer_augment_assignment(&mut self, assignment: &ast::StmtAugAssign) -> Type<'db> {
@@ -4682,12 +4748,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
is_async: _,
} = for_statement;
self.infer_target(target, iter, |builder, iter_expr| {
self.infer_target(target, iter, |builder| {
// TODO: `infer_for_statement_definition` reports a diagnostic if `iter_ty` isn't iterable
// but only if the target is a name. We should report a diagnostic here if the target isn't a name:
// `for a.x in not_iterable: ...`
builder
.infer_standalone_expression(iter_expr, TypeContext::default())
.infer_standalone_expression(iter, TypeContext::default())
.iterate(builder.db())
.homogeneous_element_type(builder.db())
});
@@ -4731,7 +4797,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
};
self.store_expression_type(target, loop_var_value_type);
self.add_binding(target.into(), definition, loop_var_value_type);
self.add_binding(target.into(), definition, |_, _| loop_var_value_type);
}
fn infer_while_statement(&mut self, while_statement: &ast::StmtWhile) {
@@ -5446,6 +5512,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
// Retrieve the parameter type for the current argument in a given overload and its binding.
let db = self.db();
let parameter_type = |overload: &Binding<'db>, binding: &CallableBinding<'db>| {
let argument_index = if binding.bound_type.is_some() {
argument_index + 1
@@ -5458,7 +5525,36 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
return None;
};
overload.signature.parameters()[*parameter_index].annotated_type()
let parameter_type =
overload.signature.parameters()[*parameter_index].annotated_type()?;
// TODO: For now, skip any parameter annotations that mention any typevars. There
// are two issues:
//
// First, if we include those typevars in the type context that we use to infer the
// corresponding argument type, the typevars might end up appearing in the inferred
// argument type as well. As part of analyzing this call, we're going to (try to)
// infer a specialization of those typevars, and would need to substitute those
// typevars in the inferred argument type. We can't do that easily at the moment,
// since specialization inference occurs _after_ we've inferred argument types, and
// we can't _update_ an expression's inferred type after the fact.
//
// Second, certain kinds of arguments themselves have typevars that we need to
// infer specializations for. (For instance, passing the result of _another_ call
// to the argument of _this_ call, where both are calls to generic functions.) In
// that case, we want to "tie together" the typevars of the two calls so that we
// can infer their specializations at the same time — or at least, for the
// specialization of one to influence the specialization of the other. It's not yet
// clear how we're going to do that. (We might have to start inferring constraint
// sets for each expression, instead of simple types?)
//
// Regardless, for now, the expedient "solution" is to not perform bidi type
// checking for these kinds of parameters.
if parameter_type.has_typevar(db) {
return None;
}
Some(parameter_type)
};
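The `has_typevar` guard above is an occurs-check: if the annotated parameter type mentions any typevar anywhere inside it, the parameter is not used as bidirectional inference context. A standalone sketch of such a check on a toy type representation:

```rust
// Toy occurs-check mirroring `has_typevar`: walk the type and report whether
// a typevar appears anywhere inside it.
enum Ty {
    Var(u32),
    Int,
    List(Box<Ty>),
    Callable(Vec<Ty>, Box<Ty>),
}

fn has_typevar(ty: &Ty) -> bool {
    match ty {
        Ty::Var(_) => true,
        Ty::Int => false,
        Ty::List(inner) => has_typevar(inner),
        Ty::Callable(params, ret) => params.iter().any(has_typevar) || has_typevar(ret),
    }
}
```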
// If there is only a single binding and overload, we can infer the argument directly with
@@ -5843,6 +5939,20 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
parenthesized: _,
} = tuple;
// Remove any union elements that are unrelated to the tuple type.
let tcx = tcx.map(|annotation| {
let inferable = KnownClass::Tuple
.try_to_class_literal(self.db())
.and_then(|class| class.generic_context(self.db()))
.map(|generic_context| generic_context.inferable_typevars(self.db()))
.unwrap_or(InferableTypeVars::None);
annotation.filter_disjoint_elements(
self.db(),
KnownClass::Tuple.to_instance(self.db()),
inferable,
)
});
let annotated_tuple = tcx
.known_specialization(self.db(), KnownClass::Tuple)
.and_then(|specialization| {
@@ -5908,7 +6018,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
} = dict;
// Validate `TypedDict` dictionary literal assignments.
if let Some(typed_dict) = tcx.annotation.and_then(Type::as_typed_dict)
if let Some(tcx) = tcx.annotation
&& let Some(typed_dict) = tcx
.filter_union(self.db(), Type::is_typed_dict)
.as_typed_dict()
&& let Some(ty) = self.infer_typed_dict_expression(dict, typed_dict)
{
return ty;
@@ -5975,10 +6088,14 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let elt_tys = |collection_class: KnownClass| {
let class_literal = collection_class.try_to_class_literal(self.db())?;
let generic_context = class_literal.generic_context(self.db())?;
Some((class_literal, generic_context.variables(self.db())))
Some((
class_literal,
generic_context,
generic_context.variables(self.db()),
))
};
let Some((class_literal, elt_tys)) = elt_tys(collection_class) else {
let Some((class_literal, generic_context, elt_tys)) = elt_tys(collection_class) else {
// Infer the element types without type context, and fall back to unknown for
// custom typesheds.
for elt in elts.flatten().flatten() {
@@ -5988,12 +6105,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
return None;
};
// TODO: Use the list of inferable typevars from the generic context of the collection
// class.
let inferable = InferableTypeVars::None;
let tcx = tcx.map_annotation(|annotation| {
// Remove any union elements of `annotation` that are not related to `collection_ty`.
// e.g. `annotation: list[int] | None => list[int]` if `collection_ty: list`
let inferable = generic_context.inferable_typevars(self.db());
// Remove any union elements that are unrelated to the collection type.
//
// For example, we only want the `list[int]` from `annotation: list[int] | None` if
// `collection_ty` is `list`.
let tcx = tcx.map(|annotation| {
let collection_ty = collection_class.to_instance(self.db());
annotation.filter_disjoint_elements(self.db(), collection_ty, inferable)
});
@@ -6061,6 +6179,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let inferred_elt_ty = self.get_or_infer_expression(elt, elt_tcx);
// Simplify the inference based on the declared type of the element.
if let Some(elt_tcx) = elt_tcx.annotation {
if inferred_elt_ty.is_assignable_to(self.db(), elt_tcx) {
continue;
}
}
// Convert any element literals to their promoted type form to avoid excessively large
// unions for large nested list literals, which the constraint solver struggles with.
let inferred_elt_ty = inferred_elt_ty.promote_literals(self.db(), elt_tcx);
@@ -6069,7 +6194,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
}
let class_type = class_literal.apply_specialization(self.db(), |generic_context| {
let class_type = class_literal.apply_specialization(self.db(), |_| {
builder.build(generic_context, TypeContext::default())
});
@@ -6226,23 +6351,24 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
is_async: _,
} = comprehension;
self.infer_target(target, iter, |builder, iter_expr| {
self.infer_target(target, iter, |builder| {
// TODO: `infer_comprehension_definition` reports a diagnostic if `iter_ty` isn't iterable
// but only if the target is a name. We should report a diagnostic here if the target isn't a name:
// `[... for a.x in not_iterable]`
if is_first {
infer_same_file_expression_type(
builder.db(),
builder.index.expression(iter_expr),
builder.index.expression(iter),
TypeContext::default(),
builder.module(),
)
} else {
builder.infer_standalone_expression(iter_expr, TypeContext::default())
builder.infer_standalone_expression(iter, TypeContext::default())
}
.iterate(builder.db())
.homogeneous_element_type(builder.db())
});
for expr in ifs {
self.infer_standalone_expression(expr, TypeContext::default());
}
@@ -6300,7 +6426,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
};
self.expressions.insert(target.into(), target_type);
self.add_binding(target.into(), definition, target_type);
self.add_binding(target.into(), definition, |_, _| target_type);
}
fn infer_named_expression(&mut self, named: &ast::ExprNamed) -> Type<'db> {
@@ -6330,12 +6456,11 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
value,
} = named;
let value_ty = self.infer_expression(value, TypeContext::default());
self.infer_expression(target, TypeContext::default());
self.add_binding(named.into(), definition, value_ty);
value_ty
self.add_binding(named.into(), definition, |builder, tcx| {
builder.infer_expression(value, tcx)
})
}
fn infer_if_expression(&mut self, if_expression: &ast::ExprIf) -> Type<'db> {
@@ -6650,7 +6775,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
}
let mut bindings = match bindings.check_types(self.db(), &call_arguments, &tcx) {
let mut bindings = match bindings.check_types(
self.db(),
&call_arguments,
&tcx,
&self.dataclass_field_specifiers[..],
) {
Ok(bindings) => bindings,
Err(CallError(_, bindings)) => {
bindings.report_diagnostics(&self.context, call_expression.into());
@@ -7642,7 +7772,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
| Type::BytesLiteral(_)
| Type::EnumLiteral(_)
| Type::BoundSuper(_)
| Type::NonInferableTypeVar(_)
| Type::TypeVar(_)
| Type::TypeIs(_)
| Type::TypedDict(_),
@@ -8029,7 +8158,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
| Type::BytesLiteral(_)
| Type::EnumLiteral(_)
| Type::BoundSuper(_)
| Type::NonInferableTypeVar(_)
| Type::TypeVar(_)
| Type::TypeIs(_)
| Type::TypedDict(_),
@@ -8059,7 +8187,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
| Type::BytesLiteral(_)
| Type::EnumLiteral(_)
| Type::BoundSuper(_)
| Type::NonInferableTypeVar(_)
| Type::TypeVar(_)
| Type::TypeIs(_)
| Type::TypedDict(_),
@@ -8479,8 +8606,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// - `[ast::CompOp::Is]`: return `false` if unequal, `bool` if equal
// - `[ast::CompOp::IsNot]`: return `true` if unequal, `bool` if equal
let db = self.db();
let try_dunder = |inference: &mut TypeInferenceBuilder<'db, '_>,
policy: MemberLookupPolicy| {
let try_dunder = |inference: &mut Self, policy: MemberLookupPolicy| {
let rich_comparison = |op| inference.infer_rich_comparison(left, right, op, policy);
let membership_test_comparison = |op, range: TextRange| {
inference.infer_membership_test_comparison(left, right, op, range)
@@ -9238,8 +9364,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let binding = Binding::single(value_ty, generic_context.signature(self.db()));
let bindings = match Bindings::from(binding)
.match_parameters(self.db(), &call_argument_types)
.check_types(self.db(), &call_argument_types, &TypeContext::default())
{
.check_types(
self.db(),
&call_argument_types,
&TypeContext::default(),
&self.dataclass_field_specifiers[..],
) {
Ok(bindings) => bindings,
Err(CallError(_, bindings)) => {
bindings.report_diagnostics(&self.context, subscript.into());
@@ -9771,6 +9901,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
deferred,
cycle_recovery,
all_definitely_bound,
dataclass_field_specifiers: _,
// Ignored; only relevant to definition regions
undecorated_type: _,
@@ -9837,8 +9968,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
deferred,
cycle_recovery,
undecorated_type,
all_definitely_bound: _,
// builder only state
dataclass_field_specifiers: _,
all_definitely_bound: _,
typevar_binding_context: _,
deferred_state: _,
multi_inference_state: _,
@@ -9905,12 +10037,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
deferred: _,
bindings: _,
declarations: _,
all_definitely_bound: _,
// Ignored; only relevant to definition regions
undecorated_type: _,
// Builder only state
dataclass_field_specifiers: _,
all_definitely_bound: _,
typevar_binding_context: _,
deferred_state: _,
multi_inference_state: _,


@@ -207,15 +207,16 @@ impl ClassInfoConstraintFunction {
Type::Union(union) => {
union.try_map(db, |element| self.generate_constraint(db, *element))
}
Type::NonInferableTypeVar(bound_typevar) => match bound_typevar
.typevar(db)
.bound_or_constraints(db)?
{
TypeVarBoundOrConstraints::UpperBound(bound) => self.generate_constraint(db, bound),
TypeVarBoundOrConstraints::Constraints(constraints) => {
self.generate_constraint(db, Type::Union(constraints))
Type::TypeVar(bound_typevar) => {
match bound_typevar.typevar(db).bound_or_constraints(db)? {
TypeVarBoundOrConstraints::UpperBound(bound) => {
self.generate_constraint(db, bound)
}
TypeVarBoundOrConstraints::Constraints(constraints) => {
self.generate_constraint(db, Type::Union(constraints))
}
}
},
}
// It's not valid to use a generic alias as the second argument to `isinstance()` or `issubclass()`,
// e.g. `isinstance(x, list[int])` fails at runtime.
@@ -251,7 +252,6 @@ impl ClassInfoConstraintFunction {
| Type::IntLiteral(_)
| Type::KnownInstance(_)
| Type::TypeIs(_)
| Type::TypeVar(_)
| Type::WrapperDescriptor(_)
| Type::DataclassTransformer(_)
| Type::TypedDict(_) => None,


@@ -28,10 +28,9 @@ use crate::types::generics::{
};
use crate::types::infer::nearest_enclosing_class;
use crate::types::{
ApplyTypeMappingVisitor, BindingContext, BoundTypeVarInstance, ClassLiteral,
FindLegacyTypeVarsVisitor, HasRelationToVisitor, IsDisjointVisitor, IsEquivalentVisitor,
KnownClass, MaterializationKind, NormalizedVisitor, TypeContext, TypeMapping, TypeRelation,
VarianceInferable, todo_type,
ApplyTypeMappingVisitor, BoundTypeVarInstance, ClassLiteral, FindLegacyTypeVarsVisitor,
HasRelationToVisitor, IsDisjointVisitor, IsEquivalentVisitor, KnownClass, MaterializationKind,
NormalizedVisitor, TypeContext, TypeMapping, TypeRelation, VarianceInferable, todo_type,
};
use crate::{Db, FxOrderSet};
use ruff_python_ast::{self as ast, name::Name};
@@ -446,19 +445,6 @@ impl<'db> Signature<'db> {
}
}
pub(super) fn mark_typevars_inferable(self, db: &'db dyn Db) -> Self {
if let Some(definition) = self.definition {
self.apply_type_mapping_impl(
db,
&TypeMapping::MarkTypeVarsInferable(Some(BindingContext::Definition(definition))),
TypeContext::default(),
&ApplyTypeMappingVisitor::default(),
)
} else {
self
}
}
pub(super) fn wrap_coroutine_return_type(self, db: &'db dyn Db) -> Self {
let return_ty = self.return_ty.map(|return_ty| {
KnownClass::CoroutineType
@@ -617,6 +603,15 @@ impl<'db> Signature<'db> {
inferable: InferableTypeVars<'_, 'db>,
visitor: &IsEquivalentVisitor<'db>,
) -> ConstraintSet<'db> {
// The typevars in self and other should also be considered inferable when checking whether
// two signatures are equivalent.
let self_inferable =
(self.generic_context).map(|generic_context| generic_context.inferable_typevars(db));
let other_inferable =
(other.generic_context).map(|generic_context| generic_context.inferable_typevars(db));
let inferable = inferable.merge(self_inferable.as_ref());
let inferable = inferable.merge(other_inferable.as_ref());
let mut result = ConstraintSet::from(true);
let mut check_types = |self_type: Option<Type<'db>>, other_type: Option<Type<'db>>| {
let self_type = self_type.unwrap_or(Type::unknown());
@@ -767,6 +762,15 @@ impl<'db> Signature<'db> {
}
}
// The typevars in self and other should also be considered inferable when checking whether
// two signatures are equivalent.
let self_inferable =
(self.generic_context).map(|generic_context| generic_context.inferable_typevars(db));
let other_inferable =
(other.generic_context).map(|generic_context| generic_context.inferable_typevars(db));
let inferable = inferable.merge(self_inferable.as_ref());
let inferable = inferable.merge(other_inferable.as_ref());
let mut result = ConstraintSet::from(true);
let mut check_types = |type1: Option<Type<'db>>, type2: Option<Type<'db>>| {
let type1 = type1.unwrap_or(Type::unknown());
@@ -1301,19 +1305,8 @@ impl<'db> Parameters<'db> {
let class = nearest_enclosing_class(db, index, scope_id).unwrap();
Some(
// It looks like unnecessary work here that we create the implicit Self
// annotation using non-inferable typevars and then immediately apply
// `MarkTypeVarsInferable` to it. However, this is currently necessary to
// ensure that implicit-Self and explicit Self annotations are both treated
// the same. Marking type vars inferable will cause reification of lazy
// typevar defaults/bounds/constraints; this needs to happen for both
// implicit and explicit Self so they remain the "same" typevar.
typing_self(db, scope_id, typevar_binding_context, class, &Type::NonInferableTypeVar)
.expect("We should always find the surrounding class for an implicit self: Self annotation").apply_type_mapping(
db,
&TypeMapping::MarkTypeVarsInferable(None),
TypeContext::default()
)
typing_self(db, scope_id, typevar_binding_context, class)
.expect("We should always find the surrounding class for an implicit self: Self annotation"),
)
} else {
// For methods of non-generic classes that are not otherwise generic (e.g. return `Self` or


@@ -83,15 +83,11 @@ pub(super) fn union_or_intersection_elements_ordering<'db>(
(Type::WrapperDescriptor(_), _) => Ordering::Less,
(_, Type::WrapperDescriptor(_)) => Ordering::Greater,
(Type::DataclassDecorator(left), Type::DataclassDecorator(right)) => {
left.bits().cmp(&right.bits())
}
(Type::DataclassDecorator(left), Type::DataclassDecorator(right)) => left.cmp(right),
(Type::DataclassDecorator(_), _) => Ordering::Less,
(_, Type::DataclassDecorator(_)) => Ordering::Greater,
(Type::DataclassTransformer(left), Type::DataclassTransformer(right)) => {
left.bits().cmp(&right.bits())
}
(Type::DataclassTransformer(left), Type::DataclassTransformer(right)) => left.cmp(right),
(Type::DataclassTransformer(_), _) => Ordering::Less,
(_, Type::DataclassTransformer(_)) => Ordering::Greater,
@@ -141,10 +137,6 @@ pub(super) fn union_or_intersection_elements_ordering<'db>(
(Type::ProtocolInstance(_), _) => Ordering::Less,
(_, Type::ProtocolInstance(_)) => Ordering::Greater,
(Type::NonInferableTypeVar(left), Type::NonInferableTypeVar(right)) => left.cmp(right),
(Type::NonInferableTypeVar(_), _) => Ordering::Less,
(_, Type::NonInferableTypeVar(_)) => Ordering::Greater,
(Type::TypeVar(left), Type::TypeVar(right)) => left.cmp(right),
(Type::TypeVar(_), _) => Ordering::Less,
(_, Type::TypeVar(_)) => Ordering::Greater,


@@ -122,7 +122,6 @@ pub(super) enum NonAtomicType<'db> {
NominalInstance(NominalInstanceType<'db>),
PropertyInstance(PropertyInstanceType<'db>),
TypeIs(TypeIsType<'db>),
NonInferableTypeVar(BoundTypeVarInstance<'db>),
TypeVar(BoundTypeVarInstance<'db>),
ProtocolInstance(ProtocolInstanceType<'db>),
TypedDict(TypedDictType<'db>),
@@ -186,9 +185,6 @@ impl<'db> From<Type<'db>> for TypeKind<'db> {
Type::PropertyInstance(property) => {
TypeKind::NonAtomic(NonAtomicType::PropertyInstance(property))
}
Type::NonInferableTypeVar(bound_typevar) => {
TypeKind::NonAtomic(NonAtomicType::NonInferableTypeVar(bound_typevar))
}
Type::TypeVar(bound_typevar) => {
TypeKind::NonAtomic(NonAtomicType::TypeVar(bound_typevar))
}
@@ -228,9 +224,6 @@ pub(super) fn walk_non_atomic_type<'db, V: TypeVisitor<'db> + ?Sized>(
visitor.visit_property_instance_type(db, property);
}
NonAtomicType::TypeIs(type_is) => visitor.visit_typeis_type(db, type_is),
NonAtomicType::NonInferableTypeVar(bound_typevar) => {
visitor.visit_bound_type_var_type(db, bound_typevar);
}
NonAtomicType::TypeVar(bound_typevar) => {
visitor.visit_bound_type_var_type(db, bound_typevar);
}


@@ -79,6 +79,15 @@ export function persistLocal({
settingsSource: string;
pythonSource: string;
}) {
const totalLength = settingsSource.length + pythonSource.length;
// Don't persist large files to local storage because they can exceed the local storage quota.
// The number here is picked fairly arbitrarily. Also note that JS strings are UTF-16,
// so the limit rejects anything over roughly 1 MB (UTF-16 uses 2 bytes per character).
if (totalLength > 500_000) {
return;
}
localStorage.setItem(
"source",
JSON.stringify([settingsSource, pythonSource]),


@@ -40,6 +40,18 @@ export async function restore(): Promise<Workspace | null> {
}
export function persistLocal(workspace: Workspace) {
let totalLength = 0;
for (const fileContent of Object.values(workspace.files)) {
totalLength += fileContent.length;
// Don't persist large files to local storage because they can exceed the local storage quota.
// The number here is picked fairly arbitrarily. Also note that JS strings are UTF-16,
// so the limit rejects anything over roughly 1 MB (UTF-16 uses 2 bytes per character).
if (totalLength > 500_000) {
return;
}
}
localStorage.setItem("workspace", JSON.stringify(workspace));
}
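The size guard above can be factored into a small standalone helper. A minimal sketch, assuming names like `shouldPersist` and `MAX_PERSISTED_CHARS` that are illustrative and not part of this diff:

```typescript
// JS strings are UTF-16, so `length` counts 2-byte code units:
// a 500_000-unit budget corresponds to roughly 1 MB of storage.
const MAX_PERSISTED_CHARS = 500_000;

function shouldPersist(contents: string[]): boolean {
  let total = 0;
  for (const content of contents) {
    total += content.length;
    // Bail out early: writing past the quota would make localStorage.setItem throw.
    if (total > MAX_PERSISTED_CHARS) {
      return false;
    }
  }
  return true;
}
```

A caller could then guard either `persistLocal` variant with one check, e.g. `if (shouldPersist(Object.values(workspace.files))) { ... }`.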


@@ -3,6 +3,6 @@
A fuzzer script to run Ruff executables on randomly generated
(but syntactically valid) Python source-code files.
Run `uvx --from ./python/py-fuzzer fuzz -h` from the repository root
Run `uv run --project=./python/py-fuzzer fuzz -h` from the repository root
for more information and example invocations
(requires [`uv`](https://github.com/astral-sh/uv) to be installed).


@@ -4,24 +4,21 @@ Python source-code files.
This script can be installed into a virtual environment using
`uv pip install -e ./python/py-fuzzer` from the Ruff repository root,
or can be run using `uvx --from ./python/py-fuzzer fuzz`
or can be run using `uv run --project=./python/py-fuzzer fuzz`
(in which case the virtual environment does not need to be activated).
Note that using `uv run --project` rather than `uvx --from` means that
uv will respect the script's lockfile.
Example invocations of the script using `uv`:
- Run the fuzzer on Ruff's parser using seeds 0, 1, 2, 78 and 93 to generate the code:
`uvx --from ./python/py-fuzzer fuzz --bin ruff 0-2 78 93`
`uv run --project=./python/py-fuzzer fuzz --bin ruff 0-2 78 93`
- Run the fuzzer concurrently using seeds in range 0-10 inclusive,
but only reporting bugs that are new on your branch:
`uvx --from ./python/py-fuzzer fuzz --bin ruff 0-10 --only-new-bugs`
`uv run --project=./python/py-fuzzer fuzz --bin ruff 0-10 --only-new-bugs`
- Run the fuzzer concurrently on 10,000 different Python source-code files,
using a random selection of seeds, and only print a summary at the end
(the `shuf` command is Unix-specific):
`uvx --from ./python/py-fuzzer fuzz --bin ruff $(shuf -i 0-1000000 -n 10000) --quiet`
If you make local modifications to this script, you'll need to run the above
with `--reinstall` to get your changes reflected in the uv-cached installed
package. Alternatively, if iterating quickly on changes, you can add
`--with-editable ./python/py-fuzzer`.
`uv run --project=./python/py-fuzzer fuzz --bin ruff $(shuf -i 0-1000000 -n 10000) --quiet`
"""
from __future__ import annotations
@@ -142,7 +139,7 @@ class FuzzResult:
case Executable.TY:
panic_message = f"The following code triggers a {new}ty panic:"
case _ as unreachable:
assert_never(unreachable)
assert_never(unreachable) # ty: ignore[type-assertion-failure]
print(colored(panic_message, "red"))
print()
@@ -152,16 +149,13 @@ class FuzzResult:
def fuzz_code(seed: Seed, args: ResolvedCliArgs) -> FuzzResult:
"""Return a `FuzzResult` instance describing the fuzzing result from this seed."""
# TODO debug slowness of these seeds
skip_check = seed in {32, 56, 208}
code = generate_random_code(seed)
bug_found = False
minimizer_callback: Callable[[str], bool] | None = None
if args.baseline_executable_path is None:
only_new_bugs = False
if not skip_check and contains_bug(
if contains_bug(
code, executable=args.executable, executable_path=args.test_executable_path
):
bug_found = True
@@ -172,7 +166,7 @@ def fuzz_code(seed: Seed, args: ResolvedCliArgs) -> FuzzResult:
)
else:
only_new_bugs = True
if not skip_check and contains_new_bug(
if contains_new_bug(
code,
executable=args.executable,
test_executable_path=args.test_executable_path,

View File

@@ -32,6 +32,10 @@ warn_unreachable = true
local_partial_types = true
enable_error_code = "ignore-without-code,redundant-expr,truthy-bool"
[tool.ty.rules]
possibly-unresolved-reference = "error"
unused-ignore-comment = "error"
[tool.ruff]
fix = true
preview = true

python/py-fuzzer/uv.lock generated

@@ -1,58 +1,76 @@
version = 1
revision = 1
revision = 3
requires-python = ">=3.12"
[[package]]
name = "markdown-it-py"
version = "3.0.0"
version = "4.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mdurl" },
]
sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596 }
sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528 },
{ url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" },
]
[[package]]
name = "mdurl"
version = "0.1.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729 }
sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979 },
{ url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
]
[[package]]
name = "mypy"
version = "1.13.0"
version = "1.18.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mypy-extensions" },
{ name = "pathspec" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e8/21/7e9e523537991d145ab8a0a2fd98548d67646dc2aaaf6091c31ad883e7c1/mypy-1.13.0.tar.gz", hash = "sha256:0291a61b6fbf3e6673e3405cfcc0e7650bebc7939659fdca2702958038bd835e", size = 3152532 }
sdist = { url = "https://files.pythonhosted.org/packages/c0/77/8f0d0001ffad290cef2f7f216f96c814866248a0b92a722365ed54648e7e/mypy-1.18.2.tar.gz", hash = "sha256:06a398102a5f203d7477b2923dda3634c36727fa5c237d8f859ef90c42a9924b", size = 3448846, upload-time = "2025-09-19T00:11:10.519Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/fb/31/c526a7bd2e5c710ae47717c7a5f53f616db6d9097caf48ad650581e81748/mypy-1.13.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:5c7051a3461ae84dfb5dd15eff5094640c61c5f22257c8b766794e6dd85e72d5", size = 11077900 },
{ url = "https://files.pythonhosted.org/packages/83/67/b7419c6b503679d10bd26fc67529bc6a1f7a5f220bbb9f292dc10d33352f/mypy-1.13.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:39bb21c69a5d6342f4ce526e4584bc5c197fd20a60d14a8624d8743fffb9472e", size = 10074818 },
{ url = "https://files.pythonhosted.org/packages/ba/07/37d67048786ae84e6612575e173d713c9a05d0ae495dde1e68d972207d98/mypy-1.13.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:164f28cb9d6367439031f4c81e84d3ccaa1e19232d9d05d37cb0bd880d3f93c2", size = 12589275 },
{ url = "https://files.pythonhosted.org/packages/1f/17/b1018c6bb3e9f1ce3956722b3bf91bff86c1cefccca71cec05eae49d6d41/mypy-1.13.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a4c1bfcdbce96ff5d96fc9b08e3831acb30dc44ab02671eca5953eadad07d6d0", size = 13037783 },
{ url = "https://files.pythonhosted.org/packages/cb/32/cd540755579e54a88099aee0287086d996f5a24281a673f78a0e14dba150/mypy-1.13.0-cp312-cp312-win_amd64.whl", hash = "sha256:a0affb3a79a256b4183ba09811e3577c5163ed06685e4d4b46429a271ba174d2", size = 9726197 },
{ url = "https://files.pythonhosted.org/packages/11/bb/ab4cfdc562cad80418f077d8be9b4491ee4fb257440da951b85cbb0a639e/mypy-1.13.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a7b44178c9760ce1a43f544e595d35ed61ac2c3de306599fa59b38a6048e1aa7", size = 11069721 },
{ url = "https://files.pythonhosted.org/packages/59/3b/a393b1607cb749ea2c621def5ba8c58308ff05e30d9dbdc7c15028bca111/mypy-1.13.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5d5092efb8516d08440e36626f0153b5006d4088c1d663d88bf79625af3d1d62", size = 10063996 },
{ url = "https://files.pythonhosted.org/packages/d1/1f/6b76be289a5a521bb1caedc1f08e76ff17ab59061007f201a8a18cc514d1/mypy-1.13.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:de2904956dac40ced10931ac967ae63c5089bd498542194b436eb097a9f77bc8", size = 12584043 },
{ url = "https://files.pythonhosted.org/packages/a6/83/5a85c9a5976c6f96e3a5a7591aa28b4a6ca3a07e9e5ba0cec090c8b596d6/mypy-1.13.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:7bfd8836970d33c2105562650656b6846149374dc8ed77d98424b40b09340ba7", size = 13036996 },
{ url = "https://files.pythonhosted.org/packages/b4/59/c39a6f752f1f893fccbcf1bdd2aca67c79c842402b5283563d006a67cf76/mypy-1.13.0-cp313-cp313-win_amd64.whl", hash = "sha256:9f73dba9ec77acb86457a8fc04b5239822df0c14a082564737833d2963677dbc", size = 9737709 },
{ url = "https://files.pythonhosted.org/packages/3b/86/72ce7f57431d87a7ff17d442f521146a6585019eb8f4f31b7c02801f78ad/mypy-1.13.0-py3-none-any.whl", hash = "sha256:9c250883f9fd81d212e0952c92dbfcc96fc237f4b7c92f56ac81fd48460b3e5a", size = 2647043 },
{ url = "https://files.pythonhosted.org/packages/07/06/dfdd2bc60c66611dd8335f463818514733bc763e4760dee289dcc33df709/mypy-1.18.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:33eca32dd124b29400c31d7cf784e795b050ace0e1f91b8dc035672725617e34", size = 12908273, upload-time = "2025-09-19T00:10:58.321Z" },
{ url = "https://files.pythonhosted.org/packages/81/14/6a9de6d13a122d5608e1a04130724caf9170333ac5a924e10f670687d3eb/mypy-1.18.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a3c47adf30d65e89b2dcd2fa32f3aeb5e94ca970d2c15fcb25e297871c8e4764", size = 11920910, upload-time = "2025-09-19T00:10:20.043Z" },
{ url = "https://files.pythonhosted.org/packages/5f/a9/b29de53e42f18e8cc547e38daa9dfa132ffdc64f7250e353f5c8cdd44bee/mypy-1.18.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d6c838e831a062f5f29d11c9057c6009f60cb294fea33a98422688181fe2893", size = 12465585, upload-time = "2025-09-19T00:10:33.005Z" },
{ url = "https://files.pythonhosted.org/packages/77/ae/6c3d2c7c61ff21f2bee938c917616c92ebf852f015fb55917fd6e2811db2/mypy-1.18.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:01199871b6110a2ce984bde85acd481232d17413868c9807e95c1b0739a58914", size = 13348562, upload-time = "2025-09-19T00:10:11.51Z" },
{ url = "https://files.pythonhosted.org/packages/4d/31/aec68ab3b4aebdf8f36d191b0685d99faa899ab990753ca0fee60fb99511/mypy-1.18.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a2afc0fa0b0e91b4599ddfe0f91e2c26c2b5a5ab263737e998d6817874c5f7c8", size = 13533296, upload-time = "2025-09-19T00:10:06.568Z" },
{ url = "https://files.pythonhosted.org/packages/9f/83/abcb3ad9478fca3ebeb6a5358bb0b22c95ea42b43b7789c7fb1297ca44f4/mypy-1.18.2-cp312-cp312-win_amd64.whl", hash = "sha256:d8068d0afe682c7c4897c0f7ce84ea77f6de953262b12d07038f4d296d547074", size = 9828828, upload-time = "2025-09-19T00:10:28.203Z" },
{ url = "https://files.pythonhosted.org/packages/5f/04/7f462e6fbba87a72bc8097b93f6842499c428a6ff0c81dd46948d175afe8/mypy-1.18.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:07b8b0f580ca6d289e69209ec9d3911b4a26e5abfde32228a288eb79df129fcc", size = 12898728, upload-time = "2025-09-19T00:10:01.33Z" },
{ url = "https://files.pythonhosted.org/packages/99/5b/61ed4efb64f1871b41fd0b82d29a64640f3516078f6c7905b68ab1ad8b13/mypy-1.18.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:ed4482847168439651d3feee5833ccedbf6657e964572706a2adb1f7fa4dfe2e", size = 11910758, upload-time = "2025-09-19T00:10:42.607Z" },
{ url = "https://files.pythonhosted.org/packages/3c/46/d297d4b683cc89a6e4108c4250a6a6b717f5fa96e1a30a7944a6da44da35/mypy-1.18.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c3ad2afadd1e9fea5cf99a45a822346971ede8685cc581ed9cd4d42eaf940986", size = 12475342, upload-time = "2025-09-19T00:11:00.371Z" },
{ url = "https://files.pythonhosted.org/packages/83/45/4798f4d00df13eae3bfdf726c9244bcb495ab5bd588c0eed93a2f2dd67f3/mypy-1.18.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a431a6f1ef14cf8c144c6b14793a23ec4eae3db28277c358136e79d7d062f62d", size = 13338709, upload-time = "2025-09-19T00:11:03.358Z" },
{ url = "https://files.pythonhosted.org/packages/d7/09/479f7358d9625172521a87a9271ddd2441e1dab16a09708f056e97007207/mypy-1.18.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7ab28cc197f1dd77a67e1c6f35cd1f8e8b73ed2217e4fc005f9e6a504e46e7ba", size = 13529806, upload-time = "2025-09-19T00:10:26.073Z" },
{ url = "https://files.pythonhosted.org/packages/71/cf/ac0f2c7e9d0ea3c75cd99dff7aec1c9df4a1376537cb90e4c882267ee7e9/mypy-1.18.2-cp313-cp313-win_amd64.whl", hash = "sha256:0e2785a84b34a72ba55fb5daf079a1003a34c05b22238da94fcae2bbe46f3544", size = 9833262, upload-time = "2025-09-19T00:10:40.035Z" },
{ url = "https://files.pythonhosted.org/packages/5a/0c/7d5300883da16f0063ae53996358758b2a2df2a09c72a5061fa79a1f5006/mypy-1.18.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:62f0e1e988ad41c2a110edde6c398383a889d95b36b3e60bcf155f5164c4fdce", size = 12893775, upload-time = "2025-09-19T00:10:03.814Z" },
{ url = "https://files.pythonhosted.org/packages/50/df/2cffbf25737bdb236f60c973edf62e3e7b4ee1c25b6878629e88e2cde967/mypy-1.18.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:8795a039bab805ff0c1dfdb8cd3344642c2b99b8e439d057aba30850b8d3423d", size = 11936852, upload-time = "2025-09-19T00:10:51.631Z" },
{ url = "https://files.pythonhosted.org/packages/be/50/34059de13dd269227fb4a03be1faee6e2a4b04a2051c82ac0a0b5a773c9a/mypy-1.18.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6ca1e64b24a700ab5ce10133f7ccd956a04715463d30498e64ea8715236f9c9c", size = 12480242, upload-time = "2025-09-19T00:11:07.955Z" },
{ url = "https://files.pythonhosted.org/packages/5b/11/040983fad5132d85914c874a2836252bbc57832065548885b5bb5b0d4359/mypy-1.18.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d924eef3795cc89fecf6bedc6ed32b33ac13e8321344f6ddbf8ee89f706c05cb", size = 13326683, upload-time = "2025-09-19T00:09:55.572Z" },
{ url = "https://files.pythonhosted.org/packages/e9/ba/89b2901dd77414dd7a8c8729985832a5735053be15b744c18e4586e506ef/mypy-1.18.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:20c02215a080e3a2be3aa50506c67242df1c151eaba0dcbc1e4e557922a26075", size = 13514749, upload-time = "2025-09-19T00:10:44.827Z" },
{ url = "https://files.pythonhosted.org/packages/25/bc/cc98767cffd6b2928ba680f3e5bc969c4152bf7c2d83f92f5a504b92b0eb/mypy-1.18.2-cp314-cp314-win_amd64.whl", hash = "sha256:749b5f83198f1ca64345603118a6f01a4e99ad4bf9d103ddc5a3200cc4614adf", size = 9982959, upload-time = "2025-09-19T00:10:37.344Z" },
{ url = "https://files.pythonhosted.org/packages/87/e3/be76d87158ebafa0309946c4a73831974d4d6ab4f4ef40c3b53a385a66fd/mypy-1.18.2-py3-none-any.whl", hash = "sha256:22a1748707dd62b58d2ae53562ffc4d7f8bcc727e8ac7cbc69c053ddc874d47e", size = 2352367, upload-time = "2025-09-19T00:10:15.489Z" },
]
[[package]]
name = "mypy-extensions"
version = "1.0.0"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433 }
sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695 },
{ url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" },
]
[[package]]
name = "pathspec"
version = "0.12.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" },
]
[[package]]
[[package]]
name = "pygments"
version = "2.19.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
]
[[package]]
name = "pysource-codegen"
version = "0.7.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/7d/a8/2bbac069fef2361eaef2c23706fc0de051034c14cc1c4ab3d88b095cfe5f/pysource_codegen-0.7.1.tar.gz", hash = "sha256:1a72d29591a9732fa9a66ee4976307081e56dd8991231b19a0bddf87a31d4c8e", size = 71073, upload-time = "2025-08-29T19:27:58.964Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/63/a5/c851b7fac516d7ffb8943991c3ac46d7d29808b773d270d8b6b820cb2e51/pysource_codegen-0.7.1-py3-none-any.whl", hash = "sha256:099e00d587a59babacaff902ad0b6d5b2f7344648c1e0baa981b30cc11a5c363", size = 20671, upload-time = "2025-08-29T19:27:57.756Z" },
]
[[package]]
name = "pysource-minimize"
version = "0.10.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/03/e6/14136d4868c3ea2e46d7f83faef26d07fd9231b7bd3fc811b6b6f8688cf2/pysource_minimize-0.10.1.tar.gz", hash = "sha256:05000b5174207d10dbb6da1a67a6f3a6f61d295efa17e3c74283f0d9ece6401c", size = 9254623, upload-time = "2025-08-30T14:41:16.989Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f5/de/847456f142124b242933a1cea17aea7041d297f1ac99b411c5dee69ac250/pysource_minimize-0.10.1-py3-none-any.whl", hash = "sha256:69b41fd2f0c1840ca281ec788c6ffb4e79680b06c7cc0e7c4d9a75321654b709", size = 18631, upload-time = "2025-08-30T14:41:15.588Z" },
]
[[package]]
name = "rich"
version = "14.2.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markdown-it-py" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/fb/d2/8920e102050a0de7bfabeb4c4614a49248cf8d5d7a8d01885fbb24dc767a/rich-14.2.0.tar.gz", hash = "sha256:73ff50c7c0c1c77c8243079283f4edb376f0f6442433aecb8ce7e6d0b92d1fe4", size = 219990, upload-time = "2025-10-09T14:16:53.064Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/25/7a/b0178788f8dc6cafce37a212c99565fa1fe7872c70c6c9c1e1a372d9d88f/rich-14.2.0-py3-none-any.whl", hash = "sha256:76bc51fe2e57d2b1be1f96c524b890b816e334ab4c1e45888799bfaab0021edd", size = 243393, upload-time = "2025-10-09T14:16:51.245Z" },
]
[[package]]
name = "rich-argparse"
version = "1.7.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "rich" },
]
sdist = { url = "https://files.pythonhosted.org/packages/71/a6/34460d81e5534f6d2fc8e8d91ff99a5835fdca53578eac89e4f37b3a7c6d/rich_argparse-1.7.1.tar.gz", hash = "sha256:d7a493cde94043e41ea68fb43a74405fa178de981bf7b800f7a3bd02ac5c27be", size = 38094, upload-time = "2025-05-25T20:20:35.335Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/31/f6/5fc0574af5379606ffd57a4b68ed88f9b415eb222047fe023aefcc00a648/rich_argparse-1.7.1-py3-none-any.whl", hash = "sha256:a8650b42e4a4ff72127837632fba6b7da40784842f08d7395eb67a9cbd7b4bf9", size = 25357, upload-time = "2025-05-25T20:20:33.793Z" },
]
[[package]]
name = "ruff"
version = "0.14.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/9e/58/6ca66896635352812de66f71cdf9ff86b3a4f79071ca5730088c0cd0fc8d/ruff-0.14.1.tar.gz", hash = "sha256:1dd86253060c4772867c61791588627320abcb6ed1577a90ef432ee319729b69", size = 5513429, upload-time = "2025-10-16T18:05:41.766Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8d/39/9cc5ab181478d7a18adc1c1e051a84ee02bec94eb9bdfd35643d7c74ca31/ruff-0.14.1-py3-none-linux_armv6l.whl", hash = "sha256:083bfc1f30f4a391ae09c6f4f99d83074416b471775b59288956f5bc18e82f8b", size = 12445415, upload-time = "2025-10-16T18:04:48.227Z" },
{ url = "https://files.pythonhosted.org/packages/ef/2e/1226961855ccd697255988f5a2474890ac7c5863b080b15bd038df820818/ruff-0.14.1-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f6fa757cd717f791009f7669fefb09121cc5f7d9bd0ef211371fad68c2b8b224", size = 12784267, upload-time = "2025-10-16T18:04:52.515Z" },
{ url = "https://files.pythonhosted.org/packages/c1/ea/fd9e95863124ed159cd0667ec98449ae461de94acda7101f1acb6066da00/ruff-0.14.1-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d6191903d39ac156921398e9c86b7354d15e3c93772e7dbf26c9fcae59ceccd5", size = 11781872, upload-time = "2025-10-16T18:04:55.396Z" },
{ url = "https://files.pythonhosted.org/packages/1e/5a/e890f7338ff537dba4589a5e02c51baa63020acfb7c8cbbaea4831562c96/ruff-0.14.1-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed04f0e04f7a4587244e5c9d7df50e6b5bf2705d75059f409a6421c593a35896", size = 12226558, upload-time = "2025-10-16T18:04:58.166Z" },
{ url = "https://files.pythonhosted.org/packages/a6/7a/8ab5c3377f5bf31e167b73651841217542bcc7aa1c19e83030835cc25204/ruff-0.14.1-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5c9e6cf6cd4acae0febbce29497accd3632fe2025c0c583c8b87e8dbdeae5f61", size = 12187898, upload-time = "2025-10-16T18:05:01.455Z" },
{ url = "https://files.pythonhosted.org/packages/48/8d/ba7c33aa55406955fc124e62c8259791c3d42e3075a71710fdff9375134f/ruff-0.14.1-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a6fa2458527794ecdfbe45f654e42c61f2503a230545a91af839653a0a93dbc6", size = 12939168, upload-time = "2025-10-16T18:05:04.397Z" },
{ url = "https://files.pythonhosted.org/packages/b4/c2/70783f612b50f66d083380e68cbd1696739d88e9b4f6164230375532c637/ruff-0.14.1-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:39f1c392244e338b21d42ab29b8a6392a722c5090032eb49bb4d6defcdb34345", size = 14386942, upload-time = "2025-10-16T18:05:07.102Z" },
{ url = "https://files.pythonhosted.org/packages/48/44/cd7abb9c776b66d332119d67f96acf15830d120f5b884598a36d9d3f4d83/ruff-0.14.1-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7382fa12a26cce1f95070ce450946bec357727aaa428983036362579eadcc5cf", size = 13990622, upload-time = "2025-10-16T18:05:09.882Z" },
{ url = "https://files.pythonhosted.org/packages/eb/56/4259b696db12ac152fe472764b4f78bbdd9b477afd9bc3a6d53c01300b37/ruff-0.14.1-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd0bf2be3ae8521e1093a487c4aa3b455882f139787770698530d28ed3fbb37c", size = 13431143, upload-time = "2025-10-16T18:05:13.46Z" },
{ url = "https://files.pythonhosted.org/packages/e0/35/266a80d0eb97bd224b3265b9437bd89dde0dcf4faf299db1212e81824e7e/ruff-0.14.1-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cabcaa9ccf8089fb4fdb78d17cc0e28241520f50f4c2e88cb6261ed083d85151", size = 13132844, upload-time = "2025-10-16T18:05:16.1Z" },
{ url = "https://files.pythonhosted.org/packages/65/6e/d31ce218acc11a8d91ef208e002a31acf315061a85132f94f3df7a252b18/ruff-0.14.1-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:747d583400f6125ec11a4c14d1c8474bf75d8b419ad22a111a537ec1a952d192", size = 13401241, upload-time = "2025-10-16T18:05:19.395Z" },
{ url = "https://files.pythonhosted.org/packages/9f/b5/dbc4221bf0b03774b3b2f0d47f39e848d30664157c15b965a14d890637d2/ruff-0.14.1-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:5a6e74c0efd78515a1d13acbfe6c90f0f5bd822aa56b4a6d43a9ffb2ae6e56cd", size = 12132476, upload-time = "2025-10-16T18:05:22.163Z" },
{ url = "https://files.pythonhosted.org/packages/98/4b/ac99194e790ccd092d6a8b5f341f34b6e597d698e3077c032c502d75ea84/ruff-0.14.1-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:0ea6a864d2fb41a4b6d5b456ed164302a0d96f4daac630aeba829abfb059d020", size = 12139749, upload-time = "2025-10-16T18:05:25.162Z" },
{ url = "https://files.pythonhosted.org/packages/47/26/7df917462c3bb5004e6fdfcc505a49e90bcd8a34c54a051953118c00b53a/ruff-0.14.1-py3-none-musllinux_1_2_i686.whl", hash = "sha256:0826b8764f94229604fa255918d1cc45e583e38c21c203248b0bfc9a0e930be5", size = 12544758, upload-time = "2025-10-16T18:05:28.018Z" },
{ url = "https://files.pythonhosted.org/packages/64/d0/81e7f0648e9764ad9b51dd4be5e5dac3fcfff9602428ccbae288a39c2c22/ruff-0.14.1-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:cbc52160465913a1a3f424c81c62ac8096b6a491468e7d872cb9444a860bc33d", size = 13221811, upload-time = "2025-10-16T18:05:30.707Z" },
{ url = "https://files.pythonhosted.org/packages/c3/07/3c45562c67933cc35f6d5df4ca77dabbcd88fddaca0d6b8371693d29fd56/ruff-0.14.1-py3-none-win32.whl", hash = "sha256:e037ea374aaaff4103240ae79168c0945ae3d5ae8db190603de3b4012bd1def6", size = 12319467, upload-time = "2025-10-16T18:05:33.261Z" },
{ url = "https://files.pythonhosted.org/packages/02/88/0ee4ca507d4aa05f67e292d2e5eb0b3e358fbcfe527554a2eda9ac422d6b/ruff-0.14.1-py3-none-win_amd64.whl", hash = "sha256:59d599cdff9c7f925a017f6f2c256c908b094e55967f93f2821b1439928746a1", size = 13401123, upload-time = "2025-10-16T18:05:35.984Z" },
{ url = "https://files.pythonhosted.org/packages/b8/81/4b6387be7014858d924b843530e1b2a8e531846807516e9bea2ee0936bf7/ruff-0.14.1-py3-none-win_arm64.whl", hash = "sha256:e3b443c4c9f16ae850906b8d0a707b2a4c16f8d2f0a7fe65c475c5886665ce44", size = 12436636, upload-time = "2025-10-16T18:05:38.995Z" },
]
[[package]]
name = "termcolor"
version = "3.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ca/6c/3d75c196ac07ac8749600b60b03f4f6094d54e132c4d94ebac6ee0e0add0/termcolor-3.1.0.tar.gz", hash = "sha256:6a6dd7fbee581909eeec6a756cff1d7f7c376063b14e4a298dc4980309e55970", size = 14324, upload-time = "2025-04-30T11:37:53.791Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4f/bd/de8d508070629b6d84a30d01d57e4a65c69aa7f5abe7560b8fad3b50ea59/termcolor-3.1.0-py3-none-any.whl", hash = "sha256:591dd26b5c2ce03b9e43f391264626557873ce1d379019786f99b0c2bee140aa", size = 7684, upload-time = "2025-04-30T11:37:52.382Z" },
]
[[package]]
name = "typing-extensions"
version = "4.15.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
]


@@ -117,11 +117,11 @@ def update_schemastore(
f"This updates ruff's JSON schema to [{current_sha}]({commit_url})"
)
# https://stackoverflow.com/a/22909204/3549270
check_call(["git", "add", (src / RUFF_JSON).as_posix()], cwd=schemastore_path)
check_call(
[
"git",
"commit",
"-m",
"Update ruff's JSON schema",
"-m",