Compare commits

...

19 Commits

Author SHA1 Message Date
Alex Waygood
0cecc22d8b [ty] Rename extra-paths to non-environment-paths, and --extra-search-path to --non-environment-search-path 2025-10-06 15:00:07 +01:00
Nikolas Hearp
1c5666ce5d [RUF051] Ignore if else/elif block is present (#20705)
## Summary

Fixes #20700

Previously, applying a fix for this rule could delete `else` and `elif`
blocks. The rule now does not trigger when an `else` or `elif` branch is
detected, so it only flags cases where the fix is safe.
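
A minimal illustration of the pattern the rule now leaves alone (a sketch, assuming the rule's fix rewrites the check-and-delete into a single `dict.pop` call, which would silently drop any `else` logic):

```python
d = {"k": 1}
k = "k"

# Rewriting this to `d.pop(k, None)` would lose the `else` branch,
# so the rule no longer triggers when an else/elif is present.
if k in d:
    del d[k]
else:
    d[k] = 0
```
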
2025-10-06 08:02:27 -05:00
Alex Waygood
42b297bf44 [ty] Improve documentation for extra-paths and python config settings (#20717)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-10-06 12:20:00 +00:00
Alex Waygood
80b337669f [ty] Add --venv as an alias to --python (#20718) 2025-10-06 13:03:05 +01:00
Alex Waygood
1ce57edf33 [ty] Enforce that typing_extensions must come from a stdlib search path (#20715) 2025-10-06 11:43:34 +00:00
renovate[bot]
1f8a74b5c6 Update astral-sh/setup-uv action to v6.8.0 (#20710)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-06 08:37:03 +02:00
renovate[bot]
c895b29f23 Update dependency ruff to v0.13.3 (#20707)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-06 08:26:22 +02:00
renovate[bot]
a9f4852956 Update docker/login-action action to v3.6.0 (#20712)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-06 08:26:10 +02:00
renovate[bot]
c20f489484 Update dependency uuid to v13 (#20714)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-06 08:25:48 +02:00
renovate[bot]
6ae7e7ba6b Update Swatinem/rust-cache action to v2.8.1 (#20708)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-06 08:25:00 +02:00
renovate[bot]
c05b172266 Update taiki-e/install-action action to v2.62.21 (#20713)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-06 08:23:24 +02:00
renovate[bot]
10d23bb5e1 Update CodSpeedHQ/action action to v4.1.0 (#20711) 2025-10-06 07:00:11 +01:00
Dan Parizher
2d44ad2f8f [ty] Fix playground crashes when accessing vendored files with leading slashes (#20661) 2025-10-04 12:40:37 +01:00
Ibraheem Ahmed
2ce3aba458 [ty] Use annotated parameters as type context (#20635)
## Summary

Use the type annotation of function parameters as bidirectional type
context when inferring the argument expression. For example, the
following example now type-checks:

```py
class TD(TypedDict):
    x: int

def f(_: TD): ...

f({ "x": 1 })
```

Part of https://github.com/astral-sh/ty/issues/168.
2025-10-03 17:14:51 -04:00
Douglas Creager
b83ac5e234 [ty] Clean up inherited generic contexts (#20647)
We add an `inherited_generic_context` to the constructors of a generic
class. That lets us infer specializations of the class when invoking the
constructor. The constructor might itself be generic, in which case we
have to merge the list of typevars that we are willing to infer in the
constructor call.

Before we did that by tracking the two (and their specializations)
separately, with distinct `Option` fields/parameters. This PR updates
our call binding logic such that any given function call has _one_
optional generic context that we're willing to infer a specialization
for. If needed, we use the existing `GenericContext::merge` method to
create a new combined generic context for when the class and constructor
are both generic. This simplifies the call binding code considerably,
without making the constructor call logic any more complex.

We also have a heuristic that promotes literals in the specialized types
of a generic class, but not in the specialized types of the function
itself. To handle this, we now track a `should_promote_literals` property
within `GenericContext`. Moreover, we track it separately for each
typevar, rather than as a single property for the generic context as a
whole, so that we can correctly merge the generic context of a
constructor method (where the option should be `false`) with the
inherited generic context of its containing class (where the option
should be `true`).
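
The merge described above can be sketched in Python (a toy model; the names `TypeVarEntry` and `merge` are illustrative, not ty's actual API):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TypeVarEntry:
    name: str
    promote_literals: bool  # tracked per typevar, not per context


def merge(class_ctx, fn_ctx):
    # Combine the class's inherited typevars with the constructor's own,
    # deduplicating by name while preserving each entry's flag.
    seen = {e.name for e in class_ctx}
    return class_ctx + [e for e in fn_ctx if e.name not in seen]


merged = merge(
    [TypeVarEntry("T", promote_literals=True)],   # from the generic class
    [TypeVarEntry("U", promote_literals=False)],  # constructor's own typevar
)
# merged preserves both entries along with their per-typevar flags
```
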

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-10-03 13:55:43 -04:00
Alex Waygood
c91b457044 [ty] Introduce TypeRelation::Redundancy (#20602)
## Summary

The union `T | U` can be validly simplified to `U` iff:
1. `T` is a subtype of `U` OR
2. `T` is equivalent to `U` OR
3. `U` is a union and contains a type that is equivalent to `T` OR
4. `T` is an intersection and contains a type that is equivalent to `U`

(In practice, the only situation in which 2, 3 or 4 would be true when
(1) was not true would be if `T` or `U` is a dynamic type.)

Currently we achieve these simplifications in the union builder by doing
something along the lines of `t.is_subtype_of(db, u) ||
t.is_equivalent_to_(db, u) ||
t.into_intersection().is_some_and(|intersection|
intersection.positive(db).contains(&u)) ||
u.into_union().is_some_and(|union| union.elements(db).contains(&t))`.
But this is both slow and misses some cases (it doesn't simplify the
union `Any | (Unknown & ~None)` to `Any`, for example). We can improve
the consistency and performance of our union simplifications by adding a
third type relation that sits in between `TypeRelation::Subtyping` and
`TypeRelation::Assignability`: `TypeRelation::UnionSimplification`.

This change leads to simpler, more user-friendly types due to the more
consistent simplification. It also led to a pretty huge performance
improvement!
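
As a rough sketch of the union-builder idea (a toy model over string type names; `is_redundant` stands in for the new relation and is not ty's API):

```python
def simplify_union(elements, is_redundant):
    # Keep an element only if no already-kept element subsumes it,
    # and drop any kept element that the new element subsumes.
    kept = []
    for t in elements:
        if any(is_redundant(t, u) for u in kept):
            continue  # t adds nothing to the union
        kept = [u for u in kept if not is_redundant(u, t)]
        kept.append(t)
    return kept


# Toy lattice: bool <: int <: object
subtypes = {("bool", "int"), ("bool", "object"), ("int", "object")}


def is_sub(a, b):
    return a == b or (a, b) in subtypes


simplify_union(["bool", "int", "object"], is_sub)  # -> ["object"]
```
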

## Test Plan

Existing tests, plus some new ones.
2025-10-03 18:35:30 +01:00
Igor Drokin
673167a565 [flake8-bugbear] Include certain guaranteed-mutable expressions: tuples, generators, and assignment expressions (B006) (#20024)
## Summary
Resolves #20004

The implementation now supports guaranteed-mutable expressions in the
following cases:
- Tuple literals with mutable elements (supporting deep nesting)
- Generator expressions
- Named expressions (walrus operator) containing mutable components

Preserves original formatting for assignment value:

```python
# Test case
def f5(x=([1, ])):
    print(x)
```
```python
# Fix before
def f5(x=(None)):
    if x is None:
        x = [1]
    print(x)
```
```python
# Fix after 
def f5(x=None):
    if x is None:
        x = ([1, ])
    print(x)
```
The expanded detection and the new fixes are gated behind preview.
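
Why a tuple wrapping a mutable value counts as guaranteed-mutable can be seen in plain Python: the default is evaluated once, so the inner list leaks state across calls (a hand-written illustration, not a snippet from the rule's fixtures):

```python
def f(item, state=([],)):  # tuple default containing a mutable list
    state[0].append(item)
    return list(state[0])


f(1)  # [1]
f(2)  # [1, 2] -- the inner list persists across calls
```
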

## Test Plan
- Added B006_9.py with a bunch of test cases
- Generated snapshots

---------

Co-authored-by: Igor Drokin <drokinii1017@gmail.com>
Co-authored-by: dylwil3 <dylwil3@gmail.com>
2025-10-03 09:29:36 -05:00
Dan Parizher
805d179dc0 [flake8-comprehensions] Clarify fix safety documentation (C413) (#20640)
## Summary

Fixes #20632
2025-10-03 09:23:57 -05:00
Daniel Kongsgaard
f73ead11cb [ty] improve base conda distinction from child conda (#20675)

## Summary

#19990 didn't completely fix the base vs. child conda environment
distinction, since its detection logic assumed slightly different behavior
from what I usually see in conda. E.g., I see something like the following:
```
(didn't yet activate conda, but base is active)
➜ printenv | grep CONDA
CONDA_PYTHON_EXE=/opt/anaconda3/bin/python
CONDA_PREFIX=/opt/anaconda3
CONDA_DEFAULT_ENV=base
CONDA_EXE=/opt/anaconda3/bin/conda
CONDA_SHLVL=1
CONDA_PROMPT_MODIFIER=(base)

(activating conda)
➜ conda activate test

(test is an active conda environment)
❯ printenv | grep CONDA
CONDA_PREFIX=/opt/anaconda3/envs/test
CONDA_PYTHON_EXE=/opt/anaconda3/bin/python
CONDA_SHLVL=2
CONDA_PREFIX_1=/opt/anaconda3
CONDA_DEFAULT_ENV=test
CONDA_PROMPT_MODIFIER=(test)
CONDA_EXE=/opt/anaconda3/bin/conda
```

But the current behavior checks `CONDA_DEFAULT_ENV ==
basename(CONDA_PREFIX)` to identify the base environment, when it is
actually the child environment where this equality holds.

This pull request fixes that and updates the tests correspondingly.
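
The check described above amounts to something like the following (a simplified sketch using the env-var values from the `printenv` output, not ty's actual code):

```python
import os.path


def is_activated_child_env(conda_prefix: str, default_env: str) -> bool:
    # For an activated child environment, CONDA_DEFAULT_ENV matches the
    # last path component of CONDA_PREFIX; for base, CONDA_DEFAULT_ENV is
    # the literal "base" while CONDA_PREFIX points at the install root.
    return os.path.basename(conda_prefix) == default_env


is_activated_child_env("/opt/anaconda3/envs/test", "test")  # True
is_activated_child_env("/opt/anaconda3", "base")            # False
```
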

## Test Plan

I updated the existing tests with the new behavior. Let me know if you
want more tests. Note: It shouldn't be necessary to test for the case
where we have `conda/envs/base`, since one should not be able to create
such an environment (one with the name of `CONDA_DEFAULT_ENV`).

---------

Co-authored-by: Aria Desires <aria.desires@gmail.com>
2025-10-03 13:56:06 +00:00
78 changed files with 2750 additions and 795 deletions

View File

@@ -40,7 +40,7 @@ jobs:
- uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
- uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -131,7 +131,7 @@ jobs:
type=pep440,pattern={{ version }},value=${{ fromJson(inputs.plan).announcement_tag }}
type=pep440,pattern={{ major }}.{{ minor }},value=${{ fromJson(inputs.plan).announcement_tag }}
- uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
- uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -169,7 +169,7 @@ jobs:
steps:
- uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
- uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -276,7 +276,7 @@ jobs:
type=pep440,pattern={{ version }},value=${{ fromJson(inputs.plan).announcement_tag }}
type=pep440,pattern={{ major }}.{{ minor }},value=${{ fromJson(inputs.plan).announcement_tag }}
- uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
- uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}

View File

@@ -225,7 +225,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: |
rustup component add clippy
@@ -245,21 +245,21 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@725a8794d15fc7563f59595bd9556495c0564878 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-insta
- name: "Install uv"
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
with:
enable-cache: "true"
- name: ty mdtests (GitHub annotations)
@@ -307,21 +307,21 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@725a8794d15fc7563f59595bd9556495c0564878 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-insta
- name: "Install uv"
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
with:
enable-cache: "true"
- name: "Run tests"
@@ -340,15 +340,15 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-nextest
- name: "Install uv"
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
with:
enable-cache: "true"
- name: "Run tests"
@@ -371,7 +371,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
@@ -400,7 +400,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
@@ -423,7 +423,7 @@ jobs:
with:
file: "Cargo.toml"
field: "workspace.package.rust-version"
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
env:
MSRV: ${{ steps.msrv.outputs.value }}
@@ -446,7 +446,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
with:
workspaces: "fuzz -> target"
- name: "Install Rust toolchain"
@@ -472,7 +472,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
name: Download Ruff binary to test
id: download-cached-binary
@@ -506,7 +506,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: rustup component add rustfmt
# Run all code generation scripts, and verify that the current output is
@@ -673,7 +673,7 @@ jobs:
branch: ${{ github.event.pull_request.base.ref }}
workflow: "ci.yaml"
check_artifacts: true
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: Fuzz
env:
FORCE_COLOR: 1
@@ -720,7 +720,7 @@ jobs:
with:
python-version: ${{ env.PYTHON_VERSION }}
architecture: x64
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build wheels"
@@ -743,8 +743,8 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 22
@@ -777,7 +777,7 @@ jobs:
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: "3.13"
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Add SSH key"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
uses: webfactory/ssh-agent@a6f90b1f127823b31d4d4a8d96047790581349bd # v0.9.1
@@ -786,7 +786,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: Install uv
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uv pip install -r docs/requirements-insiders.txt --system
@@ -816,7 +816,7 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Rust toolchain"
run: rustup show
- name: "Run checks"
@@ -886,7 +886,7 @@ jobs:
persist-credentials: false
- name: "Install Rust toolchain"
run: rustup target add wasm32-unknown-unknown
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: 22
@@ -920,14 +920,14 @@ jobs:
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-codspeed
@@ -935,7 +935,7 @@ jobs:
run: cargo codspeed build --features "codspeed,instrumented" --no-default-features -p ruff_benchmark --bench formatter --bench lexer --bench linter --bench parser
- name: "Run benchmarks"
uses: CodSpeedHQ/action@653fdc30e6c40ffd9739e40c8a0576f4f4523ca1 # v4.0.1
uses: CodSpeedHQ/action@3959e9e296ef25296e93e32afcc97196f966e57f # v4.1.0
with:
mode: instrumentation
run: cargo codspeed run
@@ -955,14 +955,14 @@ jobs:
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-codspeed
@@ -970,7 +970,7 @@ jobs:
run: cargo codspeed build --features "codspeed,instrumented" --no-default-features -p ruff_benchmark --bench ty
- name: "Run benchmarks"
uses: CodSpeedHQ/action@653fdc30e6c40ffd9739e40c8a0576f4f4523ca1 # v4.0.1
uses: CodSpeedHQ/action@3959e9e296ef25296e93e32afcc97196f966e57f # v4.1.0
with:
mode: instrumentation
run: cargo codspeed run
@@ -989,14 +989,14 @@ jobs:
with:
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@67cc679904bee382389bf22082124fa963c6f6bd # v2.61.3
uses: taiki-e/install-action@522492a8c115f1b6d4d318581f09638e9442547b # v2.62.21
with:
tool: cargo-codspeed
@@ -1004,7 +1004,7 @@ jobs:
run: cargo codspeed build --features "codspeed,walltime" --no-default-features -p ruff_benchmark
- name: "Run benchmarks"
uses: CodSpeedHQ/action@653fdc30e6c40ffd9739e40c8a0576f4f4523ca1 # v4.0.1
uses: CodSpeedHQ/action@3959e9e296ef25296e93e32afcc97196f966e57f # v4.1.0
env:
# enabling walltime flamegraphs adds ~6 minutes to the CI time, and they don't
# appear to provide much useful insight for our walltime benchmarks right now

View File

@@ -34,12 +34,12 @@ jobs:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@725a8794d15fc7563f59595bd9556495c0564878 # v1
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: Build ruff
# A debug build means the script runs slower once it gets started,
# but this is outweighed by the fact that a release build takes *much* longer to compile in CI

View File

@@ -39,9 +39,9 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
with:
workspaces: "ruff"
@@ -82,9 +82,9 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
with:
workspaces: "ruff"

View File

@@ -68,7 +68,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}

View File

@@ -22,7 +22,7 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
pattern: wheels-*

View File

@@ -65,7 +65,7 @@ jobs:
run: |
git config --global user.name typeshedbot
git config --global user.email '<>'
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: Sync typeshed stubs
run: |
rm -rf "ruff/${VENDORED_TYPESHED}"
@@ -117,7 +117,7 @@ jobs:
with:
persist-credentials: true
ref: ${{ env.UPSTREAM_BRANCH}}
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: Setup git
run: |
git config --global user.name typeshedbot
@@ -155,7 +155,7 @@ jobs:
with:
persist-credentials: true
ref: ${{ env.UPSTREAM_BRANCH}}
- uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
- uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: Setup git
run: |
git config --global user.name typeshedbot

View File

@@ -33,9 +33,9 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
with:
workspaces: "ruff"

View File

@@ -29,9 +29,9 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@b75a909f75acd358c2196fb9a5f1299a9a8868a4 # v6.7.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
with:
workspaces: "ruff"

View File

@@ -45,7 +45,7 @@ jobs:
path: typing
persist-credentials: false
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: Swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 # v2.8.1
with:
workspaces: "ruff"

View File

@@ -117,7 +117,7 @@ static COLOUR_SCIENCE: std::sync::LazyLock<Benchmark<'static>> = std::sync::Lazy
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY310,
},
500,
600,
)
});

View File

@@ -0,0 +1,26 @@
def f1(x=([],)):
print(x)
def f2(x=(x for x in "x")):
print(x)
def f3(x=((x for x in "x"),)):
print(x)
def f4(x=(z := [1, ])):
print(x)
def f5(x=([1, ])):
print(x)
def w1(x=(1,)):
print(x)
def w2(x=(z := 3)):
print(x)

View File

@@ -128,3 +128,15 @@ if f"0" in d: # f-string
if k in a.d: # Attribute dict
del a.d[k]
if k in d: # else statement
del d[k]
else:
pass
if k in d: # elif and else statements
del d[k]
elif 0 in d:
del d[0]
else:
pass

View File

@@ -245,3 +245,17 @@ pub(crate) const fn is_a003_class_scope_shadowing_expansion_enabled(
pub(crate) const fn is_refined_submodule_import_match_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}
// github.com/astral-sh/ruff/issues/20004
pub(crate) const fn is_b006_check_guaranteed_mutable_expr_enabled(
settings: &LinterSettings,
) -> bool {
settings.preview.is_enabled()
}
// github.com/astral-sh/ruff/issues/20004
pub(crate) const fn is_b006_unsafe_fix_preserve_assignment_expr_enabled(
settings: &LinterSettings,
) -> bool {
settings.preview.is_enabled()
}

View File

@@ -47,6 +47,7 @@ mod tests {
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_6.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_7.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_8.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_9.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_B008.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_1.pyi"))]
#[test_case(Rule::NoExplicitStacklevel, Path::new("B028.py"))]
@@ -83,6 +84,17 @@ mod tests {
}
#[test_case(Rule::MapWithoutExplicitStrict, Path::new("B912.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_1.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_2.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_3.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_4.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_5.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_6.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_7.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_8.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_9.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_B008.py"))]
#[test_case(Rule::MutableArgumentDefault, Path::new("B006_1.pyi"))]
fn preview_rules(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!(
"preview__{}_{}",

View File

@@ -3,9 +3,8 @@ use std::fmt::Write;
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::helpers::is_docstring_stmt;
use ruff_python_ast::name::QualifiedName;
use ruff_python_ast::{self as ast, Expr, Parameter};
use ruff_python_codegen::{Generator, Stylist};
use ruff_python_index::Indexer;
use ruff_python_ast::parenthesize::parenthesized_range;
use ruff_python_ast::{self as ast, Expr, ParameterWithDefault};
use ruff_python_semantic::SemanticModel;
use ruff_python_semantic::analyze::function_type::is_stub;
use ruff_python_semantic::analyze::typing::{is_immutable_annotation, is_mutable_expr};
@@ -13,8 +12,11 @@ use ruff_python_trivia::{indentation_at_offset, textwrap};
use ruff_source_file::LineRanges;
use ruff_text_size::Ranged;
use crate::Locator;
use crate::checkers::ast::Checker;
use crate::preview::{
is_b006_check_guaranteed_mutable_expr_enabled,
is_b006_unsafe_fix_preserve_assignment_expr_enabled,
};
use crate::{Edit, Fix, FixAvailability, Violation};
/// ## What it does
@@ -111,8 +113,12 @@ pub(crate) fn mutable_argument_default(checker: &Checker, function_def: &ast::St
.iter()
.map(|target| QualifiedName::from_dotted_name(target))
.collect();
if is_mutable_expr(default, checker.semantic())
let is_mut_expr = if is_b006_check_guaranteed_mutable_expr_enabled(checker.settings()) {
is_guaranteed_mutable_expr(default, checker.semantic())
} else {
is_mutable_expr(default, checker.semantic())
};
if is_mut_expr
&& !parameter.annotation().is_some_and(|expr| {
is_immutable_annotation(expr, checker.semantic(), extend_immutable_calls.as_slice())
})
@@ -120,35 +126,37 @@ pub(crate) fn mutable_argument_default(checker: &Checker, function_def: &ast::St
let mut diagnostic = checker.report_diagnostic(MutableArgumentDefault, default.range());
// If the function body is on the same line as the function def, do not fix
if let Some(fix) = move_initialization(
function_def,
&parameter.parameter,
default,
checker.semantic(),
checker.locator(),
checker.stylist(),
checker.indexer(),
checker.generator(),
) {
if let Some(fix) = move_initialization(function_def, parameter, default, checker) {
diagnostic.set_fix(fix);
}
}
}
}
/// Returns `true` if the expression is guaranteed to create a mutable object.
fn is_guaranteed_mutable_expr(expr: &Expr, semantic: &SemanticModel) -> bool {
match expr {
Expr::Generator(_) => true,
Expr::Tuple(ast::ExprTuple { elts, .. }) => {
elts.iter().any(|e| is_guaranteed_mutable_expr(e, semantic))
}
Expr::Named(ast::ExprNamed { value, .. }) => is_guaranteed_mutable_expr(value, semantic),
_ => is_mutable_expr(expr, semantic),
}
}
/// Generate a [`Fix`] to move a mutable argument default initialization
/// into the function body.
#[expect(clippy::too_many_arguments)]
fn move_initialization(
function_def: &ast::StmtFunctionDef,
parameter: &Parameter,
parameter: &ParameterWithDefault,
default: &Expr,
semantic: &SemanticModel,
locator: &Locator,
stylist: &Stylist,
indexer: &Indexer,
generator: Generator,
checker: &Checker,
) -> Option<Fix> {
let indexer = checker.indexer();
let locator = checker.locator();
let stylist = checker.stylist();
let mut body = function_def.body.iter().peekable();
// Avoid attempting to fix single-line functions.
@@ -157,25 +165,58 @@ fn move_initialization(
return None;
}
let range = match parenthesized_range(
default.into(),
parameter.into(),
checker.comment_ranges(),
checker.source(),
) {
Some(range) => range,
None => default.range(),
};
// Set the default argument value to `None`.
let default_edit = Edit::range_replacement("None".to_string(), default.range());
let default_edit = Edit::range_replacement("None".to_string(), range);
// If the function is a stub, this is the only necessary edit.
if is_stub(function_def, semantic) {
if is_stub(function_def, checker.semantic()) {
return Some(Fix::unsafe_edit(default_edit));
}
// Add an `if`, to set the argument to its original value if still `None`.
let mut content = String::new();
let _ = write!(&mut content, "if {} is None:", parameter.name());
let _ = write!(&mut content, "if {} is None:", parameter.parameter.name());
content.push_str(stylist.line_ending().as_str());
content.push_str(stylist.indentation());
let _ = write!(
&mut content,
"{} = {}",
parameter.name(),
generator.expr(default)
);
if is_b006_unsafe_fix_preserve_assignment_expr_enabled(checker.settings()) {
let annotation = if let Some(ann) = parameter.annotation() {
format!(": {}", locator.slice(ann))
} else {
String::new()
};
let _ = write!(
&mut content,
"{}{} = {}",
parameter.parameter.name(),
annotation,
locator.slice(
parenthesized_range(
default.into(),
parameter.into(),
checker.comment_ranges(),
checker.source()
)
.unwrap_or(default.range())
)
);
} else {
let _ = write!(
&mut content,
"{} = {}",
parameter.name(),
checker.generator().expr(default)
);
}
content.push_str(stylist.line_ending().as_str());
// Determine the indentation depth of the function body.
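As a sketch of what this annotation-preserving branch emits (function and parameter names are hypothetical; the shape matches the snapshots in this diff), the fix replaces the default with `None` and re-applies both the annotation and the original default source text inside the body:

```python
# Hypothetical before: def g(value: dict[str, str] = {}): ...
def g(value: dict[str, str] = None):  # default replaced with None
    if value is None:
        value: dict[str, str] = {}  # annotation and original literal preserved
    return value

print(g())  # each call now gets a fresh dict
```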


@@ -0,0 +1,22 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_9.py:17:11
|
17 | def f5(x=([1, ])):
| ^^^^^
18 | print(x)
|
help: Replace with `None`; initialize within function
14 | print(x)
15 |
16 |
- def f5(x=([1, ])):
17 + def f5(x=None):
18 + if x is None:
19 + x = [1]
20 | print(x)
21 |
22 |
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,21 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_1.py:3:22
|
1 | # Docstring followed by a newline
2 |
3 | def foobar(foor, bar={}):
| ^^
4 | """
5 | """
|
help: Replace with `None`; initialize within function
1 | # Docstring followed by a newline
2 |
- def foobar(foor, bar={}):
3 + def foobar(foor, bar=None):
4 | """
5 | """
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---


@@ -0,0 +1,22 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_2.py:4:22
|
2 | # Regression test for https://github.com/astral-sh/ruff/issues/7155
3 |
4 | def foobar(foor, bar={}):
| ^^
5 | """
6 | """
|
help: Replace with `None`; initialize within function
1 | # Docstring followed by whitespace with no newline
2 | # Regression test for https://github.com/astral-sh/ruff/issues/7155
3 |
- def foobar(foor, bar={}):
4 + def foobar(foor, bar=None):
5 | """
6 | """
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,20 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_3.py:4:22
|
4 | def foobar(foor, bar={}):
| ^^
5 | """
6 | """
|
help: Replace with `None`; initialize within function
1 | # Docstring with no newline
2 |
3 |
- def foobar(foor, bar={}):
4 + def foobar(foor, bar=None):
5 | """
6 | """
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,22 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_4.py:7:26
|
6 | class FormFeedIndent:
7 | def __init__(self, a=[]):
| ^^
8 | print(a)
|
help: Replace with `None`; initialize within function
4 |
5 |
6 | class FormFeedIndent:
- def __init__(self, a=[]):
7 + def __init__(self, a=None):
8 + if a is None:
9 + a = []
10 | print(a)
11 |
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,314 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:5:49
|
5 | def import_module_wrong(value: dict[str, str] = {}):
| ^^
6 | import os
|
help: Replace with `None`; initialize within function
2 | # https://github.com/astral-sh/ruff/issues/7616
3 |
4 |
- def import_module_wrong(value: dict[str, str] = {}):
5 + def import_module_wrong(value: dict[str, str] = None):
6 | import os
7 + if value is None:
8 + value: dict[str, str] = {}
9 |
10 |
11 | def import_module_with_values_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:9:61
|
9 | def import_module_with_values_wrong(value: dict[str, str] = {}):
| ^^
10 | import os
|
help: Replace with `None`; initialize within function
6 | import os
7 |
8 |
- def import_module_with_values_wrong(value: dict[str, str] = {}):
9 + def import_module_with_values_wrong(value: dict[str, str] = None):
10 | import os
11 |
12 + if value is None:
13 + value: dict[str, str] = {}
14 | return 2
15 |
16 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:15:50
|
15 | def import_modules_wrong(value: dict[str, str] = {}):
| ^^
16 | import os
17 | import sys
|
help: Replace with `None`; initialize within function
12 | return 2
13 |
14 |
- def import_modules_wrong(value: dict[str, str] = {}):
15 + def import_modules_wrong(value: dict[str, str] = None):
16 | import os
17 | import sys
18 | import itertools
19 + if value is None:
20 + value: dict[str, str] = {}
21 |
22 |
23 | def from_import_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:21:54
|
21 | def from_import_module_wrong(value: dict[str, str] = {}):
| ^^
22 | from os import path
|
help: Replace with `None`; initialize within function
18 | import itertools
19 |
20 |
- def from_import_module_wrong(value: dict[str, str] = {}):
21 + def from_import_module_wrong(value: dict[str, str] = None):
22 | from os import path
23 + if value is None:
24 + value: dict[str, str] = {}
25 |
26 |
27 | def from_imports_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:25:55
|
25 | def from_imports_module_wrong(value: dict[str, str] = {}):
| ^^
26 | from os import path
27 | from sys import version_info
|
help: Replace with `None`; initialize within function
22 | from os import path
23 |
24 |
- def from_imports_module_wrong(value: dict[str, str] = {}):
25 + def from_imports_module_wrong(value: dict[str, str] = None):
26 | from os import path
27 | from sys import version_info
28 + if value is None:
29 + value: dict[str, str] = {}
30 |
31 |
32 | def import_and_from_imports_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:30:66
|
30 | def import_and_from_imports_module_wrong(value: dict[str, str] = {}):
| ^^
31 | import os
32 | from sys import version_info
|
help: Replace with `None`; initialize within function
27 | from sys import version_info
28 |
29 |
- def import_and_from_imports_module_wrong(value: dict[str, str] = {}):
30 + def import_and_from_imports_module_wrong(value: dict[str, str] = None):
31 | import os
32 | from sys import version_info
33 + if value is None:
34 + value: dict[str, str] = {}
35 |
36 |
37 | def import_docstring_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:35:59
|
35 | def import_docstring_module_wrong(value: dict[str, str] = {}):
| ^^
36 | """Docstring"""
37 | import os
|
help: Replace with `None`; initialize within function
32 | from sys import version_info
33 |
34 |
- def import_docstring_module_wrong(value: dict[str, str] = {}):
35 + def import_docstring_module_wrong(value: dict[str, str] = None):
36 | """Docstring"""
37 | import os
38 + if value is None:
39 + value: dict[str, str] = {}
40 |
41 |
42 | def import_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:40:49
|
40 | def import_module_wrong(value: dict[str, str] = {}):
| ^^
41 | """Docstring"""
42 | import os; import sys
|
help: Replace with `None`; initialize within function
37 | import os
38 |
39 |
- def import_module_wrong(value: dict[str, str] = {}):
40 + def import_module_wrong(value: dict[str, str] = None):
41 | """Docstring"""
42 | import os; import sys
43 + if value is None:
44 + value: dict[str, str] = {}
45 |
46 |
47 | def import_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:45:49
|
45 | def import_module_wrong(value: dict[str, str] = {}):
| ^^
46 | """Docstring"""
47 | import os; import sys; x = 1
|
help: Replace with `None`; initialize within function
42 | import os; import sys
43 |
44 |
- def import_module_wrong(value: dict[str, str] = {}):
45 + def import_module_wrong(value: dict[str, str] = None):
46 | """Docstring"""
47 + if value is None:
48 + value: dict[str, str] = {}
49 | import os; import sys; x = 1
50 |
51 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:50:49
|
50 | def import_module_wrong(value: dict[str, str] = {}):
| ^^
51 | """Docstring"""
52 | import os; import sys
|
help: Replace with `None`; initialize within function
47 | import os; import sys; x = 1
48 |
49 |
- def import_module_wrong(value: dict[str, str] = {}):
50 + def import_module_wrong(value: dict[str, str] = None):
51 | """Docstring"""
52 | import os; import sys
53 + if value is None:
54 + value: dict[str, str] = {}
55 |
56 |
57 | def import_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:55:49
|
55 | def import_module_wrong(value: dict[str, str] = {}):
| ^^
56 | import os; import sys
|
help: Replace with `None`; initialize within function
52 | import os; import sys
53 |
54 |
- def import_module_wrong(value: dict[str, str] = {}):
55 + def import_module_wrong(value: dict[str, str] = None):
56 | import os; import sys
57 + if value is None:
58 + value: dict[str, str] = {}
59 |
60 |
61 | def import_module_wrong(value: dict[str, str] = {}):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:59:49
|
59 | def import_module_wrong(value: dict[str, str] = {}):
| ^^
60 | import os; import sys; x = 1
|
help: Replace with `None`; initialize within function
56 | import os; import sys
57 |
58 |
- def import_module_wrong(value: dict[str, str] = {}):
59 + def import_module_wrong(value: dict[str, str] = None):
60 + if value is None:
61 + value: dict[str, str] = {}
62 | import os; import sys; x = 1
63 |
64 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_5.py:63:49
|
63 | def import_module_wrong(value: dict[str, str] = {}):
| ^^
64 | import os; import sys
|
help: Replace with `None`; initialize within function
60 | import os; import sys; x = 1
61 |
62 |
- def import_module_wrong(value: dict[str, str] = {}):
63 + def import_module_wrong(value: dict[str, str] = None):
64 | import os; import sys
65 + if value is None:
66 + value: dict[str, str] = {}
67 |
68 |
69 | def import_module_wrong(value: dict[str, str] = {}): import os
note: This is an unsafe fix and may change runtime behavior
B006 Do not use mutable data structures for argument defaults
--> B006_5.py:67:49
|
67 | def import_module_wrong(value: dict[str, str] = {}): import os
| ^^
|
help: Replace with `None`; initialize within function
B006 Do not use mutable data structures for argument defaults
--> B006_5.py:70:49
|
70 | def import_module_wrong(value: dict[str, str] = {}): import os; import sys
| ^^
|
help: Replace with `None`; initialize within function
B006 Do not use mutable data structures for argument defaults
--> B006_5.py:73:49
|
73 | def import_module_wrong(value: dict[str, str] = {}): \
| ^^
74 | import os
|
help: Replace with `None`; initialize within function


@@ -0,0 +1,23 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_6.py:4:22
|
2 | # Same as B006_2.py, but import instead of docstring
3 |
4 | def foobar(foor, bar={}):
| ^^
5 | import os
|
help: Replace with `None`; initialize within function
1 | # Import followed by whitespace with no newline
2 | # Same as B006_2.py, but import instead of docstring
3 |
- def foobar(foor, bar={}):
- import os
4 + def foobar(foor, bar=None):
5 + import os
6 + if bar is None:
7 + bar = {}
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,23 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_7.py:4:22
|
2 | # Same as B006_3.py, but import instead of docstring
3 |
4 | def foobar(foor, bar={}):
| ^^
5 | import os
|
help: Replace with `None`; initialize within function
1 | # Import with no newline
2 | # Same as B006_3.py, but import instead of docstring
3 |
- def foobar(foor, bar={}):
- import os
4 + def foobar(foor, bar=None):
5 + import os
6 + if bar is None:
7 + bar = {}
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,92 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_8.py:1:19
|
1 | def foo(a: list = []):
| ^^
2 | raise NotImplementedError("")
|
help: Replace with `None`; initialize within function
- def foo(a: list = []):
1 + def foo(a: list = None):
2 | raise NotImplementedError("")
3 |
4 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_8.py:5:19
|
5 | def bar(a: dict = {}):
| ^^
6 | """ This one also has a docstring"""
7 | raise NotImplementedError("and has some text in here")
|
help: Replace with `None`; initialize within function
2 | raise NotImplementedError("")
3 |
4 |
- def bar(a: dict = {}):
5 + def bar(a: dict = None):
6 | """ This one also has a docstring"""
7 | raise NotImplementedError("and has some text in here")
8 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_8.py:10:19
|
10 | def baz(a: list = []):
| ^^
11 | """This one raises a different exception"""
12 | raise IndexError()
|
help: Replace with `None`; initialize within function
7 | raise NotImplementedError("and has some text in here")
8 |
9 |
- def baz(a: list = []):
10 + def baz(a: list = None):
11 | """This one raises a different exception"""
12 + if a is None:
13 + a: list = []
14 | raise IndexError()
15 |
16 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_8.py:15:19
|
15 | def qux(a: list = []):
| ^^
16 | raise NotImplementedError
|
help: Replace with `None`; initialize within function
12 | raise IndexError()
13 |
14 |
- def qux(a: list = []):
15 + def qux(a: list = None):
16 | raise NotImplementedError
17 |
18 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_8.py:19:20
|
19 | def quux(a: list = []):
| ^^
20 | raise NotImplemented
|
help: Replace with `None`; initialize within function
16 | raise NotImplementedError
17 |
18 |
- def quux(a: list = []):
19 + def quux(a: list = None):
20 | raise NotImplemented
note: This is an unsafe fix and may change runtime behavior

View File

@@ -0,0 +1,99 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_9.py:1:10
|
1 | def f1(x=([],)):
| ^^^^^
2 | print(x)
|
help: Replace with `None`; initialize within function
- def f1(x=([],)):
1 + def f1(x=None):
2 + if x is None:
3 + x = ([],)
4 | print(x)
5 |
6 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_9.py:5:10
|
5 | def f2(x=(x for x in "x")):
| ^^^^^^^^^^^^^^^^
6 | print(x)
|
help: Replace with `None`; initialize within function
2 | print(x)
3 |
4 |
- def f2(x=(x for x in "x")):
5 + def f2(x=None):
6 + if x is None:
7 + x = (x for x in "x")
8 | print(x)
9 |
10 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_9.py:9:10
|
9 | def f3(x=((x for x in "x"),)):
| ^^^^^^^^^^^^^^^^^^^
10 | print(x)
|
help: Replace with `None`; initialize within function
6 | print(x)
7 |
8 |
- def f3(x=((x for x in "x"),)):
9 + def f3(x=None):
10 + if x is None:
11 + x = ((x for x in "x"),)
12 | print(x)
13 |
14 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_9.py:13:11
|
13 | def f4(x=(z := [1, ])):
| ^^^^^^^^^^
14 | print(x)
|
help: Replace with `None`; initialize within function
10 | print(x)
11 |
12 |
- def f4(x=(z := [1, ])):
13 + def f4(x=None):
14 + if x is None:
15 + x = (z := [1, ])
16 | print(x)
17 |
18 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_9.py:17:11
|
17 | def f5(x=([1, ])):
| ^^^^^
18 | print(x)
|
help: Replace with `None`; initialize within function
14 | print(x)
15 |
16 |
- def f5(x=([1, ])):
17 + def f5(x=None):
18 + if x is None:
19 + x = ([1, ])
20 | print(x)
21 |
22 |
note: This is an unsafe fix and may change runtime behavior


@@ -0,0 +1,462 @@
---
source: crates/ruff_linter/src/rules/flake8_bugbear/mod.rs
---
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:63:25
|
63 | def this_is_wrong(value=[1, 2, 3]):
| ^^^^^^^^^
64 | ...
|
help: Replace with `None`; initialize within function
60 | # Flag mutable literals/comprehensions
61 |
62 |
- def this_is_wrong(value=[1, 2, 3]):
63 + def this_is_wrong(value=None):
64 | ...
65 |
66 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:67:30
|
67 | def this_is_also_wrong(value={}):
| ^^
68 | ...
|
help: Replace with `None`; initialize within function
64 | ...
65 |
66 |
- def this_is_also_wrong(value={}):
67 + def this_is_also_wrong(value=None):
68 | ...
69 |
70 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:73:52
|
71 | class Foo:
72 | @staticmethod
73 | def this_is_also_wrong_and_more_indented(value={}):
| ^^
74 | pass
|
help: Replace with `None`; initialize within function
70 |
71 | class Foo:
72 | @staticmethod
- def this_is_also_wrong_and_more_indented(value={}):
73 + def this_is_also_wrong_and_more_indented(value=None):
74 | pass
75 |
76 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:77:31
|
77 | def multiline_arg_wrong(value={
| _______________________________^
78 | |
79 | | }):
| |_^
80 | ...
|
help: Replace with `None`; initialize within function
74 | pass
75 |
76 |
- def multiline_arg_wrong(value={
-
- }):
77 + def multiline_arg_wrong(value=None):
78 | ...
79 |
80 | def single_line_func_wrong(value = {}): ...
note: This is an unsafe fix and may change runtime behavior
B006 Do not use mutable data structures for argument defaults
--> B006_B008.py:82:36
|
80 | ...
81 |
82 | def single_line_func_wrong(value = {}): ...
| ^^
|
help: Replace with `None`; initialize within function
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:85:20
|
85 | def and_this(value=set()):
| ^^^^^
86 | ...
|
help: Replace with `None`; initialize within function
82 | def single_line_func_wrong(value = {}): ...
83 |
84 |
- def and_this(value=set()):
85 + def and_this(value=None):
86 | ...
87 |
88 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:89:20
|
89 | def this_too(value=collections.OrderedDict()):
| ^^^^^^^^^^^^^^^^^^^^^^^^^
90 | ...
|
help: Replace with `None`; initialize within function
86 | ...
87 |
88 |
- def this_too(value=collections.OrderedDict()):
89 + def this_too(value=None):
90 | ...
91 |
92 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:93:32
|
93 | async def async_this_too(value=collections.defaultdict()):
| ^^^^^^^^^^^^^^^^^^^^^^^^^
94 | ...
|
help: Replace with `None`; initialize within function
90 | ...
91 |
92 |
- async def async_this_too(value=collections.defaultdict()):
93 + async def async_this_too(value=None):
94 | ...
95 |
96 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:97:26
|
97 | def dont_forget_me(value=collections.deque()):
| ^^^^^^^^^^^^^^^^^^^
98 | ...
|
help: Replace with `None`; initialize within function
94 | ...
95 |
96 |
- def dont_forget_me(value=collections.deque()):
97 + def dont_forget_me(value=None):
98 | ...
99 |
100 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:102:46
|
101 | # N.B. we're also flagging the function call in the comprehension
102 | def list_comprehension_also_not_okay(default=[i**2 for i in range(3)]):
| ^^^^^^^^^^^^^^^^^^^^^^^^
103 | pass
|
help: Replace with `None`; initialize within function
99 |
100 |
101 | # N.B. we're also flagging the function call in the comprehension
- def list_comprehension_also_not_okay(default=[i**2 for i in range(3)]):
102 + def list_comprehension_also_not_okay(default=None):
103 | pass
104 |
105 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:106:46
|
106 | def dict_comprehension_also_not_okay(default={i: i**2 for i in range(3)}):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^
107 | pass
|
help: Replace with `None`; initialize within function
103 | pass
104 |
105 |
- def dict_comprehension_also_not_okay(default={i: i**2 for i in range(3)}):
106 + def dict_comprehension_also_not_okay(default=None):
107 | pass
108 |
109 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:110:45
|
110 | def set_comprehension_also_not_okay(default={i**2 for i in range(3)}):
| ^^^^^^^^^^^^^^^^^^^^^^^^
111 | pass
|
help: Replace with `None`; initialize within function
107 | pass
108 |
109 |
- def set_comprehension_also_not_okay(default={i**2 for i in range(3)}):
110 + def set_comprehension_also_not_okay(default=None):
111 | pass
112 |
113 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:114:33
|
114 | def kwonlyargs_mutable(*, value=[]):
| ^^
115 | ...
|
help: Replace with `None`; initialize within function
111 | pass
112 |
113 |
- def kwonlyargs_mutable(*, value=[]):
114 + def kwonlyargs_mutable(*, value=None):
115 | ...
116 |
117 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:239:20
|
237 | # B006 and B008
238 | # We should handle arbitrary nesting of these B008.
239 | def nested_combo(a=[float(3), dt.datetime.now()]):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
240 | pass
|
help: Replace with `None`; initialize within function
236 |
237 | # B006 and B008
238 | # We should handle arbitrary nesting of these B008.
- def nested_combo(a=[float(3), dt.datetime.now()]):
239 + def nested_combo(a=None):
240 | pass
241 |
242 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:276:27
|
275 | def mutable_annotations(
276 | a: list[int] | None = [],
| ^^
277 | b: Optional[Dict[int, int]] = {},
278 | c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
|
help: Replace with `None`; initialize within function
273 |
274 |
275 | def mutable_annotations(
- a: list[int] | None = [],
276 + a: list[int] | None = None,
277 | b: Optional[Dict[int, int]] = {},
278 | c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
279 | d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:277:35
|
275 | def mutable_annotations(
276 | a: list[int] | None = [],
277 | b: Optional[Dict[int, int]] = {},
| ^^
278 | c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
279 | d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
|
help: Replace with `None`; initialize within function
274 |
275 | def mutable_annotations(
276 | a: list[int] | None = [],
- b: Optional[Dict[int, int]] = {},
277 + b: Optional[Dict[int, int]] = None,
278 | c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
279 | d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
280 | ):
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:278:62
|
276 | a: list[int] | None = [],
277 | b: Optional[Dict[int, int]] = {},
278 | c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
| ^^^^^
279 | d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
280 | ):
|
help: Replace with `None`; initialize within function
275 | def mutable_annotations(
276 | a: list[int] | None = [],
277 | b: Optional[Dict[int, int]] = {},
- c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
278 + c: Annotated[Union[Set[str], abc.Sized], "annotation"] = None,
279 | d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
280 | ):
281 | pass
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:279:80
|
277 | b: Optional[Dict[int, int]] = {},
278 | c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
279 | d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
| ^^^^^
280 | ):
281 | pass
|
help: Replace with `None`; initialize within function
276 | a: list[int] | None = [],
277 | b: Optional[Dict[int, int]] = {},
278 | c: Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
- d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = set(),
279 + d: typing_extensions.Annotated[Union[Set[str], abc.Sized], "annotation"] = None,
280 | ):
281 | pass
282 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:284:52
|
284 | def single_line_func_wrong(value: dict[str, str] = {}):
| ^^
285 | """Docstring"""
|
help: Replace with `None`; initialize within function
281 | pass
282 |
283 |
- def single_line_func_wrong(value: dict[str, str] = {}):
284 + def single_line_func_wrong(value: dict[str, str] = None):
285 | """Docstring"""
286 |
287 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:288:52
|
288 | def single_line_func_wrong(value: dict[str, str] = {}):
| ^^
289 | """Docstring"""
290 | ...
|
help: Replace with `None`; initialize within function
285 | """Docstring"""
286 |
287 |
- def single_line_func_wrong(value: dict[str, str] = {}):
288 + def single_line_func_wrong(value: dict[str, str] = None):
289 | """Docstring"""
290 | ...
291 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:293:52
|
293 | def single_line_func_wrong(value: dict[str, str] = {}):
| ^^
294 | """Docstring"""; ...
|
help: Replace with `None`; initialize within function
290 | ...
291 |
292 |
- def single_line_func_wrong(value: dict[str, str] = {}):
293 + def single_line_func_wrong(value: dict[str, str] = None):
294 | """Docstring"""; ...
295 |
296 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:297:52
|
297 | def single_line_func_wrong(value: dict[str, str] = {}):
| ^^
298 | """Docstring"""; \
299 | ...
|
help: Replace with `None`; initialize within function
294 | """Docstring"""; ...
295 |
296 |
- def single_line_func_wrong(value: dict[str, str] = {}):
297 + def single_line_func_wrong(value: dict[str, str] = None):
298 | """Docstring"""; \
299 | ...
300 |
note: This is an unsafe fix and may change runtime behavior
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:302:52
|
302 | def single_line_func_wrong(value: dict[str, str] = {
| ____________________________________________________^
303 | | # This is a comment
304 | | }):
| |_^
305 | """Docstring"""
|
help: Replace with `None`; initialize within function
299 | ...
300 |
301 |
- def single_line_func_wrong(value: dict[str, str] = {
- # This is a comment
- }):
302 + def single_line_func_wrong(value: dict[str, str] = None):
303 | """Docstring"""
304 |
305 |
note: This is an unsafe fix and may change runtime behavior
B006 Do not use mutable data structures for argument defaults
--> B006_B008.py:308:52
|
308 | def single_line_func_wrong(value: dict[str, str] = {}) \
| ^^
309 | : \
310 | """Docstring"""
|
help: Replace with `None`; initialize within function
B006 [*] Do not use mutable data structures for argument defaults
--> B006_B008.py:313:52
|
313 | def single_line_func_wrong(value: dict[str, str] = {}):
| ^^
314 | """Docstring without newline"""
|
help: Replace with `None`; initialize within function
310 | """Docstring"""
311 |
312 |
- def single_line_func_wrong(value: dict[str, str] = {}):
313 + def single_line_func_wrong(value: dict[str, str] = None):
314 | """Docstring without newline"""
note: This is an unsafe fix and may change runtime behavior


@@ -32,12 +32,15 @@ use crate::rules::flake8_comprehensions::fixes;
/// ```
///
/// ## Fix safety
/// This rule's fix is marked as unsafe, as `reversed()` and `reverse=True` will
/// yield different results in the event of custom sort keys or equality
/// functions. Specifically, `reversed()` will reverse the order of the
/// collection, while `sorted()` with `reverse=True` will perform a stable
/// This rule's fix is marked as unsafe for `reversed()` cases, as `reversed()`
/// and `reverse=True` will yield different results in the event of custom sort
/// keys or equality functions. Specifically, `reversed()` will reverse the order
/// of the collection, while `sorted()` with `reverse=True` will perform a stable
/// reverse sort, which will preserve the order of elements that compare as
/// equal.
///
/// The fix is marked as safe for `list()` cases, as removing `list()` around
/// `sorted()` does not change the behavior.
#[derive(ViolationMetadata)]
pub(crate) struct UnnecessaryCallAroundSorted {
func: UnnecessaryFunction,
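The safety distinction in the doc comment above is easy to reproduce in plain Python: with a custom sort key, `reversed()` flips the order of tied elements, while `reverse=True` performs a stable reverse sort and preserves it.

```python
pairs = [(1, "a"), (1, "b"), (2, "c")]
key = lambda pair: pair[0]  # ties: (1, "a") and (1, "b") compare equal

stable = sorted(pairs, key=key, reverse=True)     # stable reverse sort
flipped = list(reversed(sorted(pairs, key=key)))  # also reverses the ties
```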


@@ -45,6 +45,10 @@ impl AlwaysFixableViolation for IfKeyInDictDel {
/// RUF051
pub(crate) fn if_key_in_dict_del(checker: &Checker, stmt: &StmtIf) {
if !stmt.elif_else_clauses.is_empty() {
return;
}
let [Stmt::Delete(delete)] = &stmt.body[..] else {
return;
};
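A minimal illustration of the RUF051 pattern and of why an `else`/`elif` branch makes the rewrite unsafe (illustrative snippet, not the rule's fixture):

```python
d = {"a": 1, "b": 2}

# The pattern RUF051 flags: a containment check guarding a lone `del`.
if "a" in d:
    del d["a"]

# The suggested fix expresses the same thing in one call:
d.pop("b", None)

# With an `else` (or `elif`) branch, rewriting to `pop` would silently
# drop the branch, so the rule no longer fires on statements like this:
if "missing" in d:
    del d["missing"]
else:
    fallback = True
```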


@@ -1,7 +1,7 @@
use std::sync::{LazyLock, Mutex};
use get_size2::{GetSize, StandardTracker};
use ordermap::OrderSet;
use ordermap::{OrderMap, OrderSet};
/// Returns the memory usage of the provided object, using a global tracker to avoid
/// double-counting shared objects.
@@ -18,3 +18,11 @@ pub fn heap_size<T: GetSize>(value: &T) -> usize {
pub fn order_set_heap_size<T: GetSize, S>(set: &OrderSet<T, S>) -> usize {
(set.capacity() * T::get_stack_size()) + set.iter().map(heap_size).sum::<usize>()
}
/// An implementation of [`GetSize::get_heap_size`] for [`OrderMap`].
pub fn order_map_heap_size<K: GetSize, V: GetSize, S>(map: &OrderMap<K, V, S>) -> usize {
(map.capacity() * (K::get_stack_size() + V::get_stack_size()))
+ (map.iter())
.map(|(k, v)| heap_size(k) + heap_size(v))
.sum::<usize>()
}
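The double-counting guard that the global tracker provides can be sketched in Python with a `seen` set (an illustrative analogue, not the `get_size2` API):

```python
import sys

def deep_size(obj, seen=None):
    """Rough recursive size estimate. The `seen` set plays the same role as
    the global tracker above: shared objects are counted only once."""
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_size(k, seen) + deep_size(v, seen) for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_size(item, seen) for item in obj)
    return size
```

Two references to the same list are cheaper than two independent copies, because the shared list is only counted on first visit.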

crates/ty/docs/cli.md (generated)

@@ -54,9 +54,10 @@ over all configuration files.</p>
</dd><dt id="ty-check--exclude"><a href="#ty-check--exclude"><code>--exclude</code></a> <i>exclude</i></dt><dd><p>Glob patterns for files to exclude from type checking.</p>
<p>Uses gitignore-style syntax to exclude files and directories from type checking. Supports patterns like <code>tests/</code>, <code>*.tmp</code>, <code>**/__pycache__/**</code>.</p>
</dd><dt id="ty-check--exit-zero"><a href="#ty-check--exit-zero"><code>--exit-zero</code></a></dt><dd><p>Always use exit code 0, even when there are error-level diagnostics</p>
</dd><dt id="ty-check--extra-search-path"><a href="#ty-check--extra-search-path"><code>--extra-search-path</code></a> <i>path</i></dt><dd><p>Additional path to use as a module-resolution source (can be passed multiple times)</p>
</dd><dt id="ty-check--help"><a href="#ty-check--help"><code>--help</code></a>, <code>-h</code></dt><dd><p>Print help (see a summary with '-h')</p>
</dd><dt id="ty-check--ignore"><a href="#ty-check--ignore"><code>--ignore</code></a> <i>rule</i></dt><dd><p>Disables the rule. Can be specified multiple times.</p>
</dd><dt id="ty-check--non-environment-search-path"><a href="#ty-check--non-environment-search-path"><code>--non-environment-search-path</code></a> <i>path</i></dt><dd><p>Additional path to use as a module-resolution source (can be passed multiple times).</p>
<p>This is an advanced option that should usually only be used for first-party or third-party modules that are not installed into your Python environment in a conventional way. Use <code>--python</code> to point ty to your Python environment if it is in an unusual location.</p>
</dd><dt id="ty-check--output-format"><a href="#ty-check--output-format"><code>--output-format</code></a> <i>output-format</i></dt><dd><p>The format to use for printing diagnostic messages</p>
<p>Possible values:</p>
<ul>
@@ -67,11 +68,10 @@ over all configuration files.</p>
</ul></dd><dt id="ty-check--project"><a href="#ty-check--project"><code>--project</code></a> <i>project</i></dt><dd><p>Run the command within the given project directory.</p>
<p>All <code>pyproject.toml</code> files will be discovered by walking up the directory tree from the given project directory, as will the project's virtual environment (<code>.venv</code>) unless the <code>venv-path</code> option is set.</p>
<p>Other command-line arguments (such as relative paths) will be resolved relative to the current working directory.</p>
</dd><dt id="ty-check--python"><a href="#ty-check--python"><code>--python</code></a> <i>path</i></dt><dd><p>Path to the Python environment.</p>
<p>ty uses the Python environment to resolve type information and third-party dependencies.</p>
<p>If not specified, ty will attempt to infer it from the <code>VIRTUAL_ENV</code> or <code>CONDA_PREFIX</code> environment variables, or discover a <code>.venv</code> directory in the project root or working directory.</p>
<p>If a path to a Python interpreter is provided, e.g., <code>.venv/bin/python3</code>, ty will attempt to find an environment two directories up from the interpreter's path, e.g., <code>.venv</code>. At this time, ty does not invoke the interpreter to determine the location of the environment. This means that ty will not resolve dynamic executables such as a shim.</p>
<p>ty will search in the resolved environment's <code>site-packages</code> directories for type information and third-party imports.</p>
</dd><dt id="ty-check--python"><a href="#ty-check--python"><code>--python</code></a>, <code>--venv</code> <i>path</i></dt><dd><p>Path to your project's Python environment or interpreter.</p>
<p>ty uses your Python environment to resolve third-party imports in your code.</p>
<p>If you're using a project management tool such as uv or you have an activated Conda or virtual environment, you should not generally need to specify this option.</p>
<p>This option can be used to point to virtual or system Python environments.</p>
</dd><dt id="ty-check--python-platform"><a href="#ty-check--python-platform"><code>--python-platform</code></a>, <code>--platform</code> <i>platform</i></dt><dd><p>Target platform to assume when resolving types.</p>
<p>This is used to specialize the type of <code>sys.platform</code> and will affect the visibility of platform-specific functions and attributes. If the value is set to <code>all</code>, no assumptions are made about the target platform. If unspecified, the current system's platform will be used.</p>
</dd><dt id="ty-check--python-version"><a href="#ty-check--python-version"><code>--python-version</code></a>, <code>--target-version</code> <i>version</i></dt><dd><p>Python version to assume when resolving types.</p>
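The interpreter-path resolution mentioned above (an environment found two directories up from the interpreter, without invoking it) can be sketched as a purely lexical operation; the helper name is hypothetical and POSIX paths are assumed:

```python
from pathlib import PurePosixPath

def environment_from_interpreter(python: str) -> str:
    """Illustrative only: resolve the environment two directories up from an
    interpreter path, lexically, without invoking the interpreter (so dynamic
    executables such as shims are not resolved)."""
    return str(PurePosixPath(python).parent.parent)
```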


@@ -30,11 +30,16 @@ division-by-zero = "ignore"
## `environment`
### `extra-paths`
### `non-environment-paths`
List of user-provided paths that should take first priority in the module resolution.
Examples in other type checkers are mypy's `MYPYPATH` environment variable,
or pyright's `stubPath` configuration setting.
User-provided paths that should take first priority in module resolution.
This is an advanced option that should usually only be used for first-party or third-party
modules that are not installed into your Python environment in a conventional way.
Use the `python` option to specify the location of your Python environment.
This option is similar to mypy's `MYPYPATH` environment variable and pyright's `stubPath`
configuration setting.
**Default value**: `[]`
@@ -44,19 +49,28 @@ or pyright's `stubPath` configuration setting.
```toml
[tool.ty.environment]
extra-paths = ["./shared/my-search-path"]
non-environment-paths = ["./shared/my-search-path"]
```
---
### `python`
Path to the Python installation from which ty resolves type information and third-party dependencies.
Path to your project's Python environment or interpreter.
ty will search in the path's `site-packages` directories for type information and
third-party imports.
ty uses the `site-packages` directory of your project's Python environment
to resolve third-party (and, in some cases, first-party) imports in your code.
This option is commonly used to specify the path to a virtual environment.
If you're using a project management tool such as uv, you should not generally need
to specify this option, as commands such as `uv run` will set the `VIRTUAL_ENV`
environment variable to point to your project's virtual environment. ty can also infer
the location of your environment from an activated Conda environment, and will look for
a `.venv` directory in the project root if none of the above apply.
Passing a path to a Python executable is supported, but passing a path to a dynamic executable
(such as a shim) is not currently supported.
This option can be used to point to virtual or system Python environments.
**Default value**: `null`
@@ -66,7 +80,7 @@ This option is commonly used to specify the path to a virtual environment.
```toml
[tool.ty.environment]
python = "./.venv"
python = "./custom-venv-location/.venv"
```
---


@@ -35,11 +35,11 @@ ty also reads the following externally defined environment variables:
### `CONDA_DEFAULT_ENV`
Used to determine if an active Conda environment is the base environment or not.
Used to determine the name of the active Conda environment.
### `CONDA_PREFIX`
Used to detect an activated Conda environment location.
Used to detect the path of an active Conda environment.
If both `VIRTUAL_ENV` and `CONDA_PREFIX` are present, `VIRTUAL_ENV` will be preferred.
### `PYTHONPATH`
@@ -64,3 +64,7 @@ Used to detect an activated virtual environment.
Path to user-level configuration directory on Unix systems.
### `_CONDA_ROOT`
Used to determine the root install path of Conda.


@@ -51,22 +51,15 @@ pub(crate) struct CheckCommand {
#[arg(long, value_name = "PROJECT")]
pub(crate) project: Option<SystemPathBuf>,
/// Path to the Python environment.
/// Path to your project's Python environment or interpreter.
///
/// ty uses the Python environment to resolve type information and third-party dependencies.
/// ty uses your Python environment to resolve third-party imports in your code.
///
/// If not specified, ty will attempt to infer it from the `VIRTUAL_ENV` or `CONDA_PREFIX`
/// environment variables, or discover a `.venv` directory in the project root or working
/// directory.
/// If you're using a project management tool such as uv or you have an activated Conda or virtual
/// environment, you should not generally need to specify this option.
///
/// If a path to a Python interpreter is provided, e.g., `.venv/bin/python3`, ty will attempt to
/// find an environment two directories up from the interpreter's path, e.g., `.venv`. At this
/// time, ty does not invoke the interpreter to determine the location of the environment. This
/// means that ty will not resolve dynamic executables such as a shim.
///
/// ty will search in the resolved environment's `site-packages` directories for type
/// information and third-party imports.
#[arg(long, value_name = "PATH")]
/// This option can be used to point to virtual or system Python environments.
#[arg(long, value_name = "PATH", alias = "venv")]
pub(crate) python: Option<SystemPathBuf>,
/// Custom directory to use for stdlib typeshed stubs.
@@ -74,8 +67,12 @@ pub(crate) struct CheckCommand {
pub(crate) typeshed: Option<SystemPathBuf>,
/// Additional path to use as a module-resolution source (can be passed multiple times).
///
/// This is an advanced option that should usually only be used for first-party or third-party
/// modules that are not installed into your Python environment in a conventional way.
/// Use `--python` to point ty to your Python environment if it is in an unusual location.
#[arg(long, value_name = "PATH")]
pub(crate) extra_search_path: Option<Vec<SystemPathBuf>>,
pub(crate) non_environment_search_path: Option<Vec<SystemPathBuf>>,
/// Python version to assume when resolving types.
///
@@ -187,12 +184,14 @@ impl CheckCommand {
.map(|platform| RangedValue::cli(platform.into())),
python: self.python.map(RelativePathBuf::cli),
typeshed: self.typeshed.map(RelativePathBuf::cli),
extra_paths: self.extra_search_path.map(|extra_search_paths| {
extra_search_paths
.into_iter()
.map(RelativePathBuf::cli)
.collect()
}),
non_environment_paths: self.non_environment_search_path.map(
|non_environment_paths| {
non_environment_paths
.into_iter()
.map(RelativePathBuf::cli)
.collect()
},
),
..EnvironmentOptions::default()
}),
terminal: Some(TerminalOptions {


@@ -275,7 +275,7 @@ fn cli_arguments_are_relative_to_the_current_directory() -> anyhow::Result<()> {
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
assert_cmd_snapshot!(case.command().current_dir(case.root().join("child")).arg("--extra-search-path").arg("../libs"), @r"
assert_cmd_snapshot!(case.command().current_dir(case.root().join("child")).arg("--non-environment-search-path").arg("../libs"), @r"
success: true
exit_code: 0
----- stdout -----
@@ -290,7 +290,7 @@ fn cli_arguments_are_relative_to_the_current_directory() -> anyhow::Result<()> {
/// Paths specified in a configuration file are relative to the project root.
///
/// We test this by adding `libs` (as a relative path) to the extra search path in the configuration and run
/// We test this by adding `libs` (as a relative path) to `non-environment-paths` in the configuration and run
/// the CLI from a subdirectory.
///
/// Project layout:
@@ -309,7 +309,7 @@ fn paths_in_configuration_files_are_relative_to_the_project_root() -> anyhow::Re
r#"
[tool.ty.environment]
python-version = "3.11"
extra-paths = ["libs"]
non-environment-paths = ["libs"]
"#,
),
(


@@ -943,9 +943,10 @@ fn defaults_to_a_new_python_version() -> anyhow::Result<()> {
/// The `site-packages` directory is used by ty to resolve external imports.
/// ty performs the following checks, in order, to discover the `site-packages` directory:
/// 1) If `VIRTUAL_ENV` environment variable is set
/// 2) If `CONDA_PREFIX` environment variable is set (and .filename != `CONDA_DEFAULT_ENV`)
/// 2) If `CONDA_PREFIX` environment variable is set (and .filename == `CONDA_DEFAULT_ENV`)
/// 3) If a `.venv` directory exists at the project root
/// 4) If `CONDA_PREFIX` environment variable is set (and .filename == `CONDA_DEFAULT_ENV`)
/// 4) If `CONDA_PREFIX` environment variable is set (and .filename != `CONDA_DEFAULT_ENV`)
/// or if `_CONDA_ROOT` is set (and `_CONDA_ROOT` == `CONDA_PREFIX`)
///
/// This test (and the next one) aims to validate the logic around these cases.
///
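The four-step discovery order described above can be sketched as follows (a hypothetical, simplified Python function; it ignores `_CONDA_ROOT` and real filesystem checks):

```python
from pathlib import PurePosixPath

def discover_environment(env, has_project_venv):
    """Illustrative sketch of the environment discovery order."""
    # 1) an explicitly activated virtual environment always wins
    if env.get("VIRTUAL_ENV"):
        return env["VIRTUAL_ENV"]
    conda = env.get("CONDA_PREFIX")
    # 2) an activated, named Conda environment (the prefix's basename
    #    matches CONDA_DEFAULT_ENV)
    if conda and PurePosixPath(conda).name == env.get("CONDA_DEFAULT_ENV"):
        return conda
    # 3) a `.venv` directory at the project root
    if has_project_venv:
        return ".venv"
    # 4) otherwise fall back to a base Conda environment, if any
    return conda
```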
@@ -986,15 +987,14 @@ fn defaults_to_a_new_python_version() -> anyhow::Result<()> {
/// │ └── site-packages
/// │ └── package1
/// │ └── __init__.py
/// ├── conda-env
/// │ └── lib
/// │ └── python3.13
/// │ └── site-packages
/// │ └── package1
/// │ └── __init__.py
/// └── conda
/// ├── lib
/// │ └── python3.13
/// │ └── site-packages
/// │ └── package1
/// │ └── __init__.py
/// └── envs
/// └── base
/// └── conda-env
/// └── lib
/// └── python3.13
/// └── site-packages
@@ -1006,15 +1006,15 @@ fn defaults_to_a_new_python_version() -> anyhow::Result<()> {
#[test]
fn check_venv_resolution_with_working_venv() -> anyhow::Result<()> {
let child_conda_package1_path = if cfg!(windows) {
"conda-env/Lib/site-packages/package1/__init__.py"
"conda/envs/conda-env/Lib/site-packages/package1/__init__.py"
} else {
"conda-env/lib/python3.13/site-packages/package1/__init__.py"
"conda/envs/conda-env/lib/python3.13/site-packages/package1/__init__.py"
};
let base_conda_package1_path = if cfg!(windows) {
"conda/envs/base/Lib/site-packages/package1/__init__.py"
"conda/Lib/site-packages/package1/__init__.py"
} else {
"conda/envs/base/lib/python3.13/site-packages/package1/__init__.py"
"conda/lib/python3.13/site-packages/package1/__init__.py"
};
let working_venv_package1_path = if cfg!(windows) {
@@ -1136,7 +1136,7 @@ home = ./
// run with CONDA_PREFIX set, should find the child conda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda-env")), @r"
.env("CONDA_PREFIX", case.root().join("conda/envs/conda-env")), @r"
success: false
exit_code: 1
----- stdout -----
@@ -1157,61 +1157,10 @@ home = ./
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV set (unequal), should find child conda
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV set (unequal), should find working venv
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda-env"))
.env("CONDA_DEFAULT_ENV", "base"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `ChildConda`
--> test.py:3:22
|
2 | from package1 import ActiveVenv
3 | from package1 import ChildConda
| ^^^^^^^^^^
4 | from package1 import WorkingVenv
5 | from package1 import BaseConda
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV (unequal) and VIRTUAL_ENV set,
// should find child active venv
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda-env"))
.env("CONDA_DEFAULT_ENV", "base")
.env("VIRTUAL_ENV", case.root().join("myvenv")), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `ActiveVenv`
--> test.py:2:22
|
2 | from package1 import ActiveVenv
| ^^^^^^^^^^
3 | from package1 import ChildConda
4 | from package1 import WorkingVenv
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV (equal!) set, should find working venv
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda/envs/base"))
.env("CONDA_PREFIX", case.root().join("conda"))
.env("CONDA_DEFAULT_ENV", "base"), @r"
success: false
exit_code: 1
@@ -1233,6 +1182,106 @@ home = ./
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV (unequal) and VIRTUAL_ENV set,
// should find child active venv
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda"))
.env("CONDA_DEFAULT_ENV", "base")
.env("VIRTUAL_ENV", case.root().join("myvenv")), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `ActiveVenv`
--> test.py:2:22
|
2 | from package1 import ActiveVenv
| ^^^^^^^^^^
3 | from package1 import ChildConda
4 | from package1 import WorkingVenv
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV (equal!) set, should find ChildConda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda/envs/conda-env"))
.env("CONDA_DEFAULT_ENV", "conda-env"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `ChildConda`
--> test.py:3:22
|
2 | from package1 import ActiveVenv
3 | from package1 import ChildConda
| ^^^^^^^^^^
4 | from package1 import WorkingVenv
5 | from package1 import BaseConda
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with _CONDA_ROOT and CONDA_PREFIX (unequal!) set, should find ChildConda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda/envs/conda-env"))
.env("_CONDA_ROOT", "conda"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `ChildConda`
--> test.py:3:22
|
2 | from package1 import ActiveVenv
3 | from package1 import ChildConda
| ^^^^^^^^^^
4 | from package1 import WorkingVenv
5 | from package1 import BaseConda
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with _CONDA_ROOT and CONDA_PREFIX (equal!) set, should find BaseConda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda"))
.env("_CONDA_ROOT", "conda"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `BaseConda`
--> test.py:5:22
|
3 | from package1 import ChildConda
4 | from package1 import WorkingVenv
5 | from package1 import BaseConda
| ^^^^^^^^^
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
Ok(())
}
@@ -1242,15 +1291,15 @@ home = ./
#[test]
fn check_venv_resolution_without_working_venv() -> anyhow::Result<()> {
let child_conda_package1_path = if cfg!(windows) {
"conda-env/Lib/site-packages/package1/__init__.py"
"conda/envs/conda-env/Lib/site-packages/package1/__init__.py"
} else {
"conda-env/lib/python3.13/site-packages/package1/__init__.py"
"conda/envs/conda-env/lib/python3.13/site-packages/package1/__init__.py"
};
let base_conda_package1_path = if cfg!(windows) {
"conda/envs/base/Lib/site-packages/package1/__init__.py"
"conda/Lib/site-packages/package1/__init__.py"
} else {
"conda/envs/base/lib/python3.13/site-packages/package1/__init__.py"
"conda/lib/python3.13/site-packages/package1/__init__.py"
};
let active_venv_package1_path = if cfg!(windows) {
@@ -1398,7 +1447,7 @@ home = ./
// run with CONDA_PREFIX set, should find the child conda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda-env")), @r"
.env("CONDA_PREFIX", case.root().join("conda/envs/conda-env")), @r"
success: false
exit_code: 1
----- stdout -----
@@ -1419,22 +1468,21 @@ home = ./
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV set (unequal), should find child conda
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV set (unequal), should find base conda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda-env"))
.env("CONDA_PREFIX", case.root().join("conda"))
.env("CONDA_DEFAULT_ENV", "base"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `ChildConda`
--> test.py:3:22
error[unresolved-import]: Module `package1` has no member `BaseConda`
--> test.py:5:22
|
2 | from package1 import ActiveVenv
3 | from package1 import ChildConda
| ^^^^^^^^^^
4 | from package1 import WorkingVenv
5 | from package1 import BaseConda
| ^^^^^^^^^
|
info: rule `unresolved-import` is enabled by default
@@ -1448,7 +1496,7 @@ home = ./
// should find child active venv
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda-env"))
.env("CONDA_PREFIX", case.root().join("conda"))
.env("CONDA_DEFAULT_ENV", "base")
.env("VIRTUAL_ENV", case.root().join("myvenv")), @r"
success: false
@@ -1470,10 +1518,10 @@ home = ./
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV (equal!) set, should find base conda
// run with CONDA_PREFIX and CONDA_DEFAULT_ENV (unequal!) set, should find base conda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda/envs/base"))
.env("CONDA_PREFIX", case.root().join("conda"))
.env("CONDA_DEFAULT_ENV", "base"), @r"
success: false
exit_code: 1
@@ -1494,6 +1542,55 @@ home = ./
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with _CONDA_ROOT and CONDA_PREFIX (unequal!) set, should find ChildConda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda/envs/conda-env"))
.env("_CONDA_ROOT", "conda"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `ChildConda`
--> test.py:3:22
|
2 | from package1 import ActiveVenv
3 | from package1 import ChildConda
| ^^^^^^^^^^
4 | from package1 import WorkingVenv
5 | from package1 import BaseConda
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
// run with _CONDA_ROOT and CONDA_PREFIX (equal!) set, should find BaseConda
assert_cmd_snapshot!(case.command()
.current_dir(case.root().join("project"))
.env("CONDA_PREFIX", case.root().join("conda"))
.env("_CONDA_ROOT", "conda"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-import]: Module `package1` has no member `BaseConda`
--> test.py:5:22
|
3 | from package1 import ChildConda
4 | from package1 import WorkingVenv
5 | from package1 import BaseConda
| ^^^^^^^^^
|
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.
");
Ok(())
}


@@ -422,7 +422,7 @@ where
// We need a chance to create the directories here.
if let Some(environment) = project.options().environment.as_ref() {
for path in environment
.extra_paths
.non_environment_paths
.as_deref()
.unwrap_or_default()
.iter()
@@ -542,7 +542,7 @@ fn new_non_project_file() -> anyhow::Result<()> {
context.write_project_file("bar.py", "")?;
context.set_options(Options {
environment: Some(EnvironmentOptions {
extra_paths: Some(vec![RelativePathBuf::cli(
non_environment_paths: Some(vec![RelativePathBuf::cli(
context.join_root_path("site_packages"),
)]),
..EnvironmentOptions::default()
@@ -986,7 +986,7 @@ fn search_path() -> anyhow::Result<()> {
context.set_options(Options {
environment: Some(EnvironmentOptions {
extra_paths: Some(vec![RelativePathBuf::cli(
non_environment_paths: Some(vec![RelativePathBuf::cli(
context.join_root_path("site_packages"),
)]),
..EnvironmentOptions::default()
@@ -1027,7 +1027,7 @@ fn add_search_path() -> anyhow::Result<()> {
// Register site-packages as a search path.
case.update_options(Options {
environment: Some(EnvironmentOptions {
extra_paths: Some(vec![RelativePathBuf::cli("site_packages")]),
non_environment_paths: Some(vec![RelativePathBuf::cli("site_packages")]),
..EnvironmentOptions::default()
}),
..Options::default()
@@ -1051,7 +1051,7 @@ fn remove_search_path() -> anyhow::Result<()> {
context.write_project_file("bar.py", "import sub.a")?;
context.set_options(Options {
environment: Some(EnvironmentOptions {
extra_paths: Some(vec![RelativePathBuf::cli(
non_environment_paths: Some(vec![RelativePathBuf::cli(
context.join_root_path("site_packages"),
)]),
..EnvironmentOptions::default()
@@ -1579,7 +1579,7 @@ mod unix {
context.set_options(Options {
environment: Some(EnvironmentOptions {
extra_paths: Some(vec![RelativePathBuf::cli(
non_environment_paths: Some(vec![RelativePathBuf::cli(
".venv/lib/python3.12/site-packages",
)]),
python_version: Some(RangedValue::cli(PythonVersion::PY312)),


@@ -32,25 +32,25 @@ use ty_python_semantic::PythonPlatform;
///
/// The main downside of this approach is that the ordering can be surprising in cases
/// where the option has "first match" semantics rather than "last match wins" semantics.
/// One such example is `extra-paths` where the semantics is given by Python:
/// One such example is `non-environment-paths`, where the semantics are given by Python:
/// the module on the first matching search path wins.
///
/// ```toml
/// [environment]
/// extra-paths = ["b", "c"]
/// non-environment-paths = ["b", "c"]
/// ```
///
/// ```bash
/// ty --extra-paths a
/// ty --non-environment-search-path=a
/// ```
///
/// That's why a user might expect that this configuration results in `["a", "b", "c"]`,
/// because the CLI has higher precedence. However, the current implementation results in a
/// resolved extra search path of `["b", "c", "a"]`, which means `a` will be tried last.
/// resolved `non-environment-paths` of `["b", "c", "a"]`, which means `a` will be tried last.
///
/// There's an argument here that the user should be able to specify the order of the paths,
/// because only then is the user in full control of where to insert the path when specifying `extra-paths`
/// in multiple sources.
/// because only then is the user in full control of where to insert the path when specifying
/// `non-environment-paths` in multiple sources.
///
/// ## Macro
/// You can automatically derive `Combine` for structs with named fields by using `derive(ruff_macros::Combine)`.
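The "first match wins" ordering discussed above can be demonstrated with a tiny resolver sketch (hypothetical helper; `files` stands in for the filesystem):

```python
def resolve_module(module, search_paths, files):
    """Return the first matching candidate, mirroring how Python itself
    walks `sys.path`: earlier search paths shadow later ones."""
    for root in search_paths:
        candidate = f"{root}/{module}.py"
        if candidate in files:
            return candidate
    return None
```

With `non-environment-paths = ["b", "c"]` in configuration and `--non-environment-search-path=a` on the CLI, the resolved order is `["b", "c", "a"]`, so a module present in both `c` and `a` resolves from `c`, not `a`.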


@@ -301,18 +301,19 @@ impl Options {
};
// collect the existing site packages
let mut extra_paths: Vec<SystemPathBuf> = environment
.extra_paths
let mut non_environment_paths: Vec<SystemPathBuf> = environment
.non_environment_paths
.as_deref()
.unwrap_or_default()
.iter()
.map(|path| path.absolute(project_root, system))
.collect();
// read all the paths off the PYTHONPATH environment variable, check
// they exist as a directory, and add them to the vec of extra_paths
// as they should be checked before site-packages just like python
// interpreter does
// Read all the paths off the `PYTHONPATH` environment variable, check
// each exists as a directory, and add each to the vec of `non_environment_paths`.
// This emulates the way `PYTHONPATH` entries will appear in `sys.path` prior to
// `site-packages` (the "environment search path") from a virtual environment at
// runtime.
if let Ok(python_path) = system.env_var(EnvVars::PYTHONPATH) {
for path in std::env::split_paths(python_path.as_str()) {
let path = match SystemPathBuf::from_path_buf(path) {
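The `PYTHONPATH` handling above can be sketched in Python (illustrative; `is_dir` is injected to stand in for the filesystem check):

```python
import os

def pythonpath_entries(environ, is_dir):
    """Split `PYTHONPATH` on the platform path separator and keep only the
    entries that exist as directories, preserving their order."""
    raw = environ.get("PYTHONPATH", "")
    return [entry for entry in raw.split(os.pathsep) if entry and is_dir(entry)]
```

The surviving entries are appended to `non_environment_paths`, so they are searched before the environment's `site-packages`, as they would be in `sys.path` at runtime.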
@@ -336,15 +337,15 @@ impl Options {
}
tracing::debug!(
"Adding `{abspath}` from the `PYTHONPATH` environment variable to `extra_paths`"
"Adding `{abspath}` from the `PYTHONPATH` environment variable to `non_environment_paths`"
);
extra_paths.push(abspath);
non_environment_paths.push(abspath);
}
}
let settings = SearchPathSettings {
extra_paths,
non_environment_paths,
src_roots,
custom_typeshed: environment
.typeshed
@@ -550,18 +551,23 @@ pub struct EnvironmentOptions {
)]
pub python_platform: Option<RangedValue<PythonPlatform>>,
/// List of user-provided paths that should take first priority in the module resolution.
/// Examples in other type checkers are mypy's `MYPYPATH` environment variable,
/// or pyright's `stubPath` configuration setting.
/// User-provided paths that should take first priority in module resolution.
///
/// This is an advanced option that should usually only be used for first-party or third-party
/// modules that are not installed into your Python environment in a conventional way.
/// Use the `python` option to specify the location of your Python environment.
///
/// This option is similar to mypy's `MYPYPATH` environment variable and pyright's `stubPath`
/// configuration setting.
#[serde(skip_serializing_if = "Option::is_none")]
#[option(
default = r#"[]"#,
value_type = "list[str]",
example = r#"
extra-paths = ["./shared/my-search-path"]
non-environment-paths = ["./shared/my-search-path"]
"#
)]
pub extra_paths: Option<Vec<RelativePathBuf>>,
pub non_environment_paths: Option<Vec<RelativePathBuf>>,
/// Optional path to a "typeshed" directory on disk for us to use for standard-library types.
    /// If this is not provided, we will fall back to our vendored typeshed stubs for the stdlib,
@@ -576,18 +582,27 @@ pub struct EnvironmentOptions {
)]
pub typeshed: Option<RelativePathBuf>,
/// Path to the Python installation from which ty resolves type information and third-party dependencies.
/// Path to your project's Python environment or interpreter.
///
/// ty will search in the path's `site-packages` directories for type information and
/// third-party imports.
/// ty uses the `site-packages` directory of your project's Python environment
/// to resolve third-party (and, in some cases, first-party) imports in your code.
///
/// This option is commonly used to specify the path to a virtual environment.
/// If you're using a project management tool such as uv, you should not generally need
/// to specify this option, as commands such as `uv run` will set the `VIRTUAL_ENV`
/// environment variable to point to your project's virtual environment. ty can also infer
/// the location of your environment from an activated Conda environment, and will look for
/// a `.venv` directory in the project root if none of the above apply.
///
/// Passing a path to a Python executable is supported, but passing a path to a dynamic executable
/// (such as a shim) is not currently supported.
///
/// This option can be used to point to virtual or system Python environments.
#[serde(skip_serializing_if = "Option::is_none")]
#[option(
default = r#"null"#,
value_type = "str",
example = r#"
python = "./.venv"
python = "./custom-venv-location/.venv"
"#
)]
pub python: Option<RelativePathBuf>,
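
The `PYTHONPATH` handling earlier in this file mirrors how CPython builds `sys.path`: the variable is split on the platform's path-list separator and the entries are kept in order, ahead of `site-packages`. A minimal Python sketch of that splitting (the `/opt/stubs` and `/srv/shared` entries are hypothetical values, not paths from this repository):

```python
import os

# Hypothetical PYTHONPATH value; the real value comes from the environment.
python_path = os.pathsep.join(["/opt/stubs", "/srv/shared"])

# Entries are separated by os.pathsep (":" on Unix, ";" on Windows) and
# keep their order, since earlier entries take priority on sys.path.
entries = [p for p in python_path.split(os.pathsep) if p]
print(entries)  # ['/opt/stubs', '/srv/shared']
```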


@@ -25,7 +25,7 @@ pub enum ValueSource {
File(Arc<SystemPathBuf>),
/// The value comes from a CLI argument, while it's left open if specified using a short argument,
/// long argument (`--extra-paths`) or `--config key=value`.
/// long argument (`--non-environment-search-path`) or `--config key=value`.
Cli,
/// The value comes from an LSP client configuration.


@@ -1662,3 +1662,67 @@ def _(arg: tuple[A | B, Any]):
reveal_type(f(arg)) # revealed: Unknown
reveal_type(f(*(arg,))) # revealed: Unknown
```
## Bidirectional Type Inference
```toml
[environment]
python-version = "3.12"
```
Type inference accounts for parameter type annotations across all overloads.
```py
from typing import TypedDict, overload
class T(TypedDict):
x: int
@overload
def f(a: list[T], b: int) -> int: ...
@overload
def f(a: list[dict[str, int]], b: str) -> str: ...
def f(a: list[dict[str, int]] | list[T], b: int | str) -> int | str:
return 1
def int_or_str() -> int | str:
return 1
x = f([{"x": 1}], int_or_str())
reveal_type(x) # revealed: int | str
# TODO: error: [no-matching-overload] "No overload of function `f` matches arguments"
# we currently incorrectly consider `list[dict[str, int]]` a subtype of `list[T]`
f([{"y": 1}], int_or_str())
```
Non-matching overloads do not produce diagnostics:
```py
from typing import TypedDict, overload
class T(TypedDict):
x: int
@overload
def f(a: T, b: int) -> int: ...
@overload
def f(a: dict[str, int], b: str) -> str: ...
def f(a: T | dict[str, int], b: int | str) -> int | str:
return 1
x = f({"y": 1}, "a")
reveal_type(x) # revealed: str
```
```py
from typing import SupportsRound, overload
@overload
def takes_str_or_float(x: str): ...
@overload
def takes_str_or_float(x: float): ...
def takes_str_or_float(x: float | str): ...
takes_str_or_float(round(1.0))
```


@@ -251,3 +251,59 @@ from ty_extensions import Intersection, Not
def _(x: Union[Intersection[Any, Not[int]], Intersection[Any, Not[int]]]):
reveal_type(x) # revealed: Any & ~int
```
## Bidirectional Type Inference
```toml
[environment]
python-version = "3.12"
```
Type inference accounts for parameter type annotations across all signatures in a union.
```py
from typing import TypedDict, overload
class T(TypedDict):
x: int
def _(flag: bool):
if flag:
def f(x: T) -> int:
return 1
else:
def f(x: dict[str, int]) -> int:
return 1
x = f({"x": 1})
reveal_type(x) # revealed: int
# TODO: error: [invalid-argument-type] "Argument to function `f` is incorrect: Expected `T`, found `dict[str, int]`"
# we currently consider `TypedDict` instances to be subtypes of `dict`
f({"y": 1})
```
Diagnostics unrelated to the type-context are only reported once:
```py
def f[T](x: T) -> list[T]:
return [x]
def a(x: list[bool], y: list[bool]): ...
def b(x: list[int], y: list[int]): ...
def c(x: list[int], y: list[int]): ...
def _(x: int):
if x == 0:
y = a
elif x == 1:
y = b
else:
y = c
if x == 0:
z = True
y(f(True), [True])
# error: [possibly-unresolved-reference] "Name `z` used when possibly not defined"
y(f(True), [z])
```


@@ -10,6 +10,7 @@ from typing_extensions import assert_type
def _(x: int):
assert_type(x, int) # fine
assert_type(x, str) # error: [type-assertion-failure]
assert_type(assert_type(x, int), int)
```
## Narrowing


@@ -48,7 +48,7 @@ search paths if the file name's casing does not match.
```toml
[environment]
extra-paths = ["/search-1", "/search-2"]
non-environment-paths = ["/search-1", "/search-2"]
```
`/search-1/A.py`:


@@ -40,7 +40,7 @@ neither.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/py.typed`:
@@ -99,7 +99,7 @@ Omitting the partial `py.typed`, we see "impl" now cannot be found.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/__init__.pyi`:
@@ -152,7 +152,7 @@ Including a blank py.typed we still don't conclude it's partial.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/py.typed`:
@@ -210,7 +210,7 @@ reveal_type(Fake().fake) # revealed: Unknown
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/py.typed`:
@@ -280,7 +280,7 @@ reveal_type(Fake().fake) # revealed: Unknown
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/py.typed`:
@@ -355,7 +355,7 @@ reveal_type(Fake().fake) # revealed: Unknown
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/py.typed`:
@@ -433,7 +433,7 @@ This is a regression test for <https://github.com/astral-sh/ty/issues/520>.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/parent-stubs/foo/both.pyi`:


@@ -7,7 +7,7 @@ Stub packages are packages named `<package>-stubs` that provide typing stubs for
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/__init__.pyi`:
@@ -38,7 +38,7 @@ The regular package isn't required for type checking.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs/__init__.pyi`:
@@ -63,7 +63,7 @@ A module named `<module>-stubs` isn't a stub package.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/foo-stubs.pyi`:
@@ -88,7 +88,7 @@ A namespace package with multiple stub packages spread over multiple search path
```toml
[environment]
extra-paths = ["/stubs1", "/stubs2", "/packages"]
non-environment-paths = ["/stubs1", "/stubs2", "/packages"]
```
`/stubs1/shapes-stubs/polygons/pentagon.pyi`:
@@ -136,7 +136,7 @@ should stop after the first non-namespace stub package. This matches Pyright's b
```toml
[environment]
extra-paths = ["/stubs1", "/stubs2", "/packages"]
non-environment-paths = ["/stubs1", "/stubs2", "/packages"]
```
`/stubs1/shapes-stubs/__init__.pyi`:
@@ -196,7 +196,7 @@ fewer lookups.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/shapes-stubs/polygons/pentagon.pyi`:
@@ -254,7 +254,7 @@ to be an enforced convention. At least, Pyright is fine with the following.
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/shapes-stubs/__init__.py`:
@@ -291,7 +291,7 @@ Regression test for <https://github.com/astral-sh/ty/issues/408>
```toml
[environment]
extra-paths = ["/packages"]
non-environment-paths = ["/packages"]
```
`/packages/yaml-stubs/__init__.pyi`:


@@ -17,6 +17,7 @@ mdtest path: crates/ty_python_semantic/resources/mdtest/directives/assert_type.m
3 | def _(x: int):
4 | assert_type(x, int) # fine
5 | assert_type(x, str) # error: [type-assertion-failure]
6 | assert_type(assert_type(x, int), int)
```
# Diagnostics
@@ -31,6 +32,7 @@ error[type-assertion-failure]: Argument does not have asserted type `str`
| ^^^^^^^^^^^^-^^^^^^
| |
| Inferred type of argument is `int`
6 | assert_type(assert_type(x, int), int)
|
info: `str` and `int` are not equivalent types
info: rule `type-assertion-failure` is enabled by default


@@ -138,6 +138,11 @@ static_assert(is_equivalent_to(Any, Any | Intersection[Any, str]))
static_assert(is_equivalent_to(Any, Intersection[str, Any] | Any))
static_assert(is_equivalent_to(Any, Any | Intersection[Any, Not[None]]))
static_assert(is_equivalent_to(Any, Intersection[Not[None], Any] | Any))
static_assert(is_equivalent_to(Any, Unknown | Intersection[Unknown, str]))
static_assert(is_equivalent_to(Any, Intersection[str, Unknown] | Unknown))
static_assert(is_equivalent_to(Any, Unknown | Intersection[Unknown, Not[None]]))
static_assert(is_equivalent_to(Any, Intersection[Not[None], Unknown] | Unknown))
```
## Tuples


@@ -152,7 +152,7 @@ Person(name="Alice")
# error: [missing-typed-dict-key] "Missing required key 'age' in TypedDict `Person` constructor"
Person({"name": "Alice"})
# TODO: this should be an error, similar to the above
# error: [missing-typed-dict-key] "Missing required key 'age' in TypedDict `Person` constructor"
accepts_person({"name": "Alice"})
# TODO: this should be an error, similar to the above
house.owner = {"name": "Alice"}
@@ -171,7 +171,7 @@ Person(name=None, age=30)
# error: [invalid-argument-type] "Invalid argument to key "name" with declared type `str` on TypedDict `Person`: value of type `None`"
Person({"name": None, "age": 30})
# TODO: this should be an error, similar to the above
# error: [invalid-argument-type] "Invalid argument to key "name" with declared type `str` on TypedDict `Person`: value of type `None`"
accepts_person({"name": None, "age": 30})
# TODO: this should be an error, similar to the above
house.owner = {"name": None, "age": 30}
@@ -190,7 +190,7 @@ Person(name="Alice", age=30, extra=True)
# error: [invalid-key] "Invalid key access on TypedDict `Person`: Unknown key "extra""
Person({"name": "Alice", "age": 30, "extra": True})
# TODO: this should be an error
# error: [invalid-key] "Invalid key access on TypedDict `Person`: Unknown key "extra""
accepts_person({"name": "Alice", "age": 30, "extra": True})
# TODO: this should be an error
house.owner = {"name": "Alice", "age": 30, "extra": True}


@@ -306,3 +306,74 @@ def _(c: BC, d: BD):
reveal_type(c) # revealed: Literal[b""]
reveal_type(d) # revealed: Literal[b""]
```
## Unions of tuples
A union of a fixed-length tuple and a variable-length tuple must be collapsed to the variable-length
element, never to the fixed-length element (`tuple[()] | tuple[Any, ...]` -> `tuple[Any, ...]`, not
`tuple[()]`).
```py
from typing import Any
def f(
a: tuple[()] | tuple[int, ...],
b: tuple[int, ...] | tuple[()],
c: tuple[int] | tuple[str, ...],
d: tuple[str, ...] | tuple[int],
e: tuple[()] | tuple[Any, ...],
f: tuple[Any, ...] | tuple[()],
g: tuple[Any, ...] | tuple[Any | str, ...],
h: tuple[Any | str, ...] | tuple[Any, ...],
):
reveal_type(a) # revealed: tuple[int, ...]
reveal_type(b) # revealed: tuple[int, ...]
reveal_type(c) # revealed: tuple[int] | tuple[str, ...]
reveal_type(d) # revealed: tuple[str, ...] | tuple[int]
reveal_type(e) # revealed: tuple[Any, ...]
reveal_type(f) # revealed: tuple[Any, ...]
reveal_type(g) # revealed: tuple[Any | str, ...]
reveal_type(h) # revealed: tuple[Any | str, ...]
```
## Unions of other generic containers
```toml
[environment]
python-version = "3.12"
```
```py
from typing import Any
class Bivariant[T]: ...
class Covariant[T]:
def get(self) -> T:
raise NotImplementedError
class Contravariant[T]:
def receive(self, input: T) -> None: ...
class Invariant[T]:
mutable_attribute: T
def _(
a: Bivariant[Any] | Bivariant[Any | str],
b: Bivariant[Any | str] | Bivariant[Any],
c: Covariant[Any] | Covariant[Any | str],
d: Covariant[Any | str] | Covariant[Any],
e: Contravariant[Any | str] | Contravariant[Any],
f: Contravariant[Any] | Contravariant[Any | str],
g: Invariant[Any] | Invariant[Any | str],
h: Invariant[Any | str] | Invariant[Any],
):
reveal_type(a) # revealed: Bivariant[Any]
reveal_type(b) # revealed: Bivariant[Any | str]
reveal_type(c) # revealed: Covariant[Any | str]
reveal_type(d) # revealed: Covariant[Any | str]
reveal_type(e) # revealed: Contravariant[Any]
reveal_type(f) # revealed: Contravariant[Any]
reveal_type(g) # revealed: Invariant[Any] | Invariant[Any | str]
reveal_type(h) # revealed: Invariant[Any | str] | Invariant[Any]
```


@@ -93,7 +93,7 @@ impl ModulePath {
relative_path,
} = self;
match &*search_path.0 {
SearchPathInner::Extra(search_path)
SearchPathInner::NonEnvironment(search_path)
| SearchPathInner::FirstParty(search_path)
| SearchPathInner::SitePackages(search_path)
| SearchPathInner::Editable(search_path)
@@ -131,7 +131,7 @@ impl ModulePath {
} = self;
match &*search_path.0 {
SearchPathInner::Extra(search_path)
SearchPathInner::NonEnvironment(search_path)
| SearchPathInner::FirstParty(search_path)
| SearchPathInner::SitePackages(search_path)
| SearchPathInner::Editable(search_path) => {
@@ -199,7 +199,7 @@ impl ModulePath {
relative_path,
} = self;
match &*search_path.0 {
SearchPathInner::Extra(search_path)
SearchPathInner::NonEnvironment(search_path)
| SearchPathInner::FirstParty(search_path)
| SearchPathInner::SitePackages(search_path)
| SearchPathInner::Editable(search_path) => Some(search_path.join(relative_path)),
@@ -219,7 +219,7 @@ impl ModulePath {
relative_path,
} = self;
match &*search_path.0 {
SearchPathInner::Extra(search_path)
SearchPathInner::NonEnvironment(search_path)
| SearchPathInner::FirstParty(search_path)
| SearchPathInner::SitePackages(search_path)
| SearchPathInner::Editable(search_path) => {
@@ -451,7 +451,7 @@ type SearchPathResult<T> = Result<T, SearchPathValidationError>;
#[derive(Debug, Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
enum SearchPathInner {
Extra(SystemPathBuf),
NonEnvironment(SystemPathBuf),
FirstParty(SystemPathBuf),
StandardLibraryCustom(SystemPathBuf),
StandardLibraryVendored(VendoredPathBuf),
@@ -464,7 +464,7 @@ enum SearchPathInner {
/// that can be used to locate Python modules.
///
/// The different kinds of search paths are:
/// - "Extra" search paths: these go at the start of the module resolution order
/// - "Non-environment" search paths: these go at the start of the module resolution order
/// - First-party search paths: the user code that we are directly invoked on.
/// - Standard-library search paths: these come in three different forms:
/// - Custom standard-library search paths: paths provided by the user
@@ -499,9 +499,12 @@ impl SearchPath {
}
}
/// Create a new "Extra" search path
pub(crate) fn extra(system: &dyn System, root: SystemPathBuf) -> SearchPathResult<Self> {
Ok(Self(Arc::new(SearchPathInner::Extra(
/// Create a new non-environment search path
pub(crate) fn non_environment(
system: &dyn System,
root: SystemPathBuf,
) -> SearchPathResult<Self> {
Ok(Self(Arc::new(SearchPathInner::NonEnvironment(
Self::directory_path(system, root)?,
))))
}
@@ -616,7 +619,7 @@ impl SearchPath {
}
match &*self.0 {
SearchPathInner::Extra(search_path)
SearchPathInner::NonEnvironment(search_path)
| SearchPathInner::FirstParty(search_path)
| SearchPathInner::StandardLibraryCustom(search_path)
| SearchPathInner::StandardLibraryReal(search_path)
@@ -643,7 +646,7 @@ impl SearchPath {
}
match &*self.0 {
SearchPathInner::Extra(_)
SearchPathInner::NonEnvironment(_)
| SearchPathInner::FirstParty(_)
| SearchPathInner::StandardLibraryCustom(_)
| SearchPathInner::StandardLibraryReal(_)
@@ -662,7 +665,7 @@ impl SearchPath {
#[must_use]
pub(super) fn as_path(&self) -> SystemOrVendoredPathRef<'_> {
match *self.0 {
SearchPathInner::Extra(ref path)
SearchPathInner::NonEnvironment(ref path)
| SearchPathInner::FirstParty(ref path)
| SearchPathInner::StandardLibraryCustom(ref path)
| SearchPathInner::StandardLibraryReal(ref path)
@@ -691,7 +694,7 @@ impl SearchPath {
#[must_use]
pub(crate) fn debug_kind(&self) -> &'static str {
match *self.0 {
SearchPathInner::Extra(_) => "extra",
SearchPathInner::NonEnvironment(_) => "extra",
SearchPathInner::FirstParty(_) => "first-party",
SearchPathInner::StandardLibraryCustom(_) => "std-custom",
SearchPathInner::StandardLibraryReal(_) => "std-real",
@@ -706,8 +709,8 @@ impl SearchPath {
#[must_use]
pub(crate) fn describe_kind(&self) -> &'static str {
match *self.0 {
SearchPathInner::Extra(_) => {
"extra search path specified on the CLI or in your config file"
SearchPathInner::NonEnvironment(_) => {
"non-environment search path specified on the CLI or in your config file"
}
SearchPathInner::FirstParty(_) => "first-party code",
SearchPathInner::StandardLibraryCustom(_) => {
@@ -772,7 +775,7 @@ impl PartialEq<SearchPath> for VendoredPathBuf {
impl fmt::Display for SearchPath {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match &*self.0 {
SearchPathInner::Extra(system_path_buf)
SearchPathInner::NonEnvironment(system_path_buf)
| SearchPathInner::FirstParty(system_path_buf)
| SearchPathInner::SitePackages(system_path_buf)
| SearchPathInner::Editable(system_path_buf)
@@ -1073,7 +1076,7 @@ mod tests {
fn relativize_non_stdlib_path_errors() {
let TestCase { db, src, .. } = TestCaseBuilder::new().build();
let root = SearchPath::extra(db.system(), src.clone()).unwrap();
let root = SearchPath::non_environment(db.system(), src.clone()).unwrap();
// Must have a `.py` extension, a `.pyi` extension, or no extension:
let bad_absolute_path = src.join("x.rs");
assert_eq!(root.relativize_system_path(&bad_absolute_path), None);


@@ -228,7 +228,7 @@ impl SearchPaths {
}
let SearchPathSettings {
extra_paths,
non_environment_paths,
src_roots,
custom_typeshed: typeshed,
site_packages_paths,
@@ -237,11 +237,11 @@ impl SearchPaths {
let mut static_paths = vec![];
for path in extra_paths {
for path in non_environment_paths {
let path = canonicalize(path, system);
tracing::debug!("Adding extra search-path `{path}`");
tracing::debug!("Adding non-environment search path `{path}`");
static_paths.push(SearchPath::extra(system, path)?);
static_paths.push(SearchPath::non_environment(system, path)?);
}
for src_root in src_roots {
@@ -666,11 +666,15 @@ struct ModuleNameIngredient<'db> {
/// Returns `true` if the module name refers to a standard library module which can't be shadowed
/// by a first-party module.
///
/// This includes "builtin" modules, which can never be shadowed at runtime either, as well as the
/// `types` module, which tends to be imported early in Python startup, so can't be consistently
/// shadowed, and is important to type checking.
/// This includes "builtin" modules, which can never be shadowed at runtime either, as well as
/// certain other modules that are involved in an import cycle with `builtins` (`types`,
/// `typing_extensions`, etc.). This latter set of modules cannot be allowed to be shadowed by
/// first-party or "extra-path" modules, or we risk panics in unexpected places due to being
/// unable to resolve builtin symbols. This behaviour is similar to that of other type
/// checkers such as mypy: <https://github.com/python/mypy/blob/3807423e9d98e678bf16b13ec8b4f909fe181908/mypy/build.py#L104-L117>
pub(super) fn is_non_shadowable(minor_version: u8, module_name: &str) -> bool {
module_name == "types" || ruff_python_stdlib::sys::is_builtin_module(minor_version, module_name)
matches!(module_name, "types" | "typing_extensions")
|| ruff_python_stdlib::sys::is_builtin_module(minor_version, module_name)
}
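
The non-shadowable rule described in the doc comment above can be sketched in Python using `sys.builtin_module_names` (an approximation: the real implementation consults a per-Python-version table of builtin modules):

```python
import sys

def is_non_shadowable(module_name: str) -> bool:
    # Builtin modules can never be shadowed at runtime; `types` and
    # `typing_extensions` are additionally protected because they sit in
    # an import cycle with `builtins`.
    return (
        module_name in ("types", "typing_extensions")
        or module_name in sys.builtin_module_names
    )

print(is_non_shadowable("typing_extensions"))  # True
print(is_non_shadowable("json"))  # False (a pure-Python stdlib module)
```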
/// Given a module name and a list of search paths in which to lookup modules,


@@ -110,7 +110,7 @@ pub enum PythonVersionSource {
InstallationDirectoryLayout { site_packages_parent_dir: Box<str> },
/// The value comes from a CLI argument, while it's left open if specified using a short argument,
/// long argument (`--extra-paths`) or `--config key=value`.
/// long argument (`--non-environment-search-path`) or `--config key=value`.
Cli,
/// The value comes from the Python VS Code extension (the selected interpreter).
@@ -163,10 +163,10 @@ impl Default for PythonVersionWithSource {
/// Configures the search paths for module resolution.
#[derive(Eq, PartialEq, Debug, Clone)]
pub struct SearchPathSettings {
/// List of user-provided paths that should take first priority in the module resolution.
/// Examples in other type checkers are mypy's MYPYPATH environment variable,
/// or pyright's stubPath configuration setting.
pub extra_paths: Vec<SystemPathBuf>,
/// List of user-provided paths that should take first priority in module resolution.
/// Examples in other type checkers are mypy's `MYPYPATH` environment variable,
/// or pyright's `stubPath` configuration setting.
pub non_environment_paths: Vec<SystemPathBuf>,
/// The root of the project, used for finding first-party modules.
pub src_roots: Vec<SystemPathBuf>,
@@ -197,7 +197,7 @@ impl SearchPathSettings {
pub fn empty() -> Self {
SearchPathSettings {
src_roots: vec![],
extra_paths: vec![],
non_environment_paths: vec![],
custom_typeshed: None,
site_packages_paths: vec![],
real_stdlib_path: None,


@@ -614,28 +614,44 @@ pub(crate) enum CondaEnvironmentKind {
impl CondaEnvironmentKind {
/// Compute the kind of `CONDA_PREFIX` we have.
///
/// When the base environment is used, `CONDA_DEFAULT_ENV` will be set to a name, i.e., `base` or
/// `root` which does not match the prefix, e.g. `/usr/local` instead of
/// `/usr/local/conda/envs/<name>`.
/// The base environment is typically stored in a location matching the `_CONDA_ROOT` path.
///
/// Additionally, when the base environment is active, `CONDA_DEFAULT_ENV` will be set to a
/// name, e.g., `base`, which does not match the `CONDA_PREFIX`, e.g., `/usr/local` instead of
    /// `/usr/local/conda/envs/<name>`. Note that the name `CONDA_DEFAULT_ENV` is misleading: it's
    /// the active environment name, not a constant base environment name.
fn from_prefix_path(system: &dyn System, path: &SystemPath) -> Self {
// If we cannot read `CONDA_DEFAULT_ENV`, there's no way to know if the base environment
let Ok(default_env) = system.env_var(EnvVars::CONDA_DEFAULT_ENV) else {
return CondaEnvironmentKind::Child;
};
// These are the expected names for the base environment
if default_env != "base" && default_env != "root" {
return CondaEnvironmentKind::Child;
// If `_CONDA_ROOT` is set and matches `CONDA_PREFIX`, it's the base environment.
if let Ok(conda_root) = system.env_var(EnvVars::CONDA_ROOT) {
if path.as_str() == conda_root {
return Self::Base;
}
}
let Some(name) = path.file_name() else {
return CondaEnvironmentKind::Child;
// Next, we'll use a heuristic based on `CONDA_DEFAULT_ENV`
let Ok(current_env) = system.env_var(EnvVars::CONDA_DEFAULT_ENV) else {
return Self::Child;
};
if name == default_env {
CondaEnvironmentKind::Base
// If the environment name is "base" or "root", treat it as a base environment
//
        // These are the expected names for the base environment. This special-casing is
        // retained for backwards compatibility, but we should remove it in a future breaking release.
if current_env == "base" || current_env == "root" {
return Self::Base;
}
// For other environment names, use the path-based logic
let Some(name) = path.file_name() else {
return Self::Child;
};
// If the environment is in a directory matching the name of the environment, it's not
// usually a base environment.
if name == current_env {
Self::Child
} else {
CondaEnvironmentKind::Child
Self::Base
}
}
}
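
The base-vs-child heuristic above can be sketched in Python. The environment variable names (`_CONDA_ROOT`, `CONDA_DEFAULT_ENV`) follow the comments in the code; the return values `"base"`/`"child"` stand in for the Rust enum variants:

```python
import os

def conda_environment_kind(prefix: str) -> str:
    # If the prefix matches the conda root, it's the base environment.
    if os.environ.get("_CONDA_ROOT") == prefix:
        return "base"
    # Without an active-environment name there is nothing more to go on.
    current_env = os.environ.get("CONDA_DEFAULT_ENV")
    if current_env is None:
        return "child"
    # "base" and "root" are the conventional base-environment names.
    if current_env in ("base", "root"):
        return "base"
    # A prefix whose final component matches the active name is usually a
    # named child environment under `envs/`.
    return "child" if os.path.basename(prefix) == current_env else "base"
```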


@@ -1050,13 +1050,6 @@ impl<'db> Type<'db> {
}
}
pub(crate) const fn into_intersection(self) -> Option<IntersectionType<'db>> {
match self {
Type::Intersection(intersection_type) => Some(intersection_type),
_ => None,
}
}
#[cfg(test)]
#[track_caller]
pub(crate) fn expect_union(self) -> UnionType<'db> {
@@ -1420,43 +1413,9 @@ impl<'db> Type<'db> {
}
}
/// Return true if this type is a [subtype of] type `target`.
/// Return true if this type is a subtype of type `target`.
///
/// For fully static types, this means that the set of objects represented by `self` is a
/// subset of the objects represented by `target`.
///
/// For gradual types, it means that the union of all possible sets of values represented by
/// `self` (the "top materialization" of `self`) is a subtype of the intersection of all
/// possible sets of values represented by `target` (the "bottom materialization" of
/// `target`). In other words, for all possible pairs of materializations `self'` and
/// `target'`, `self'` is always a subtype of `target'`.
///
/// Note that this latter expansion of the subtyping relation to non-fully-static types is not
/// described in the typing spec, but the primary use of the subtyping relation is for
/// simplifying unions and intersections, and this expansion to gradual types is sound and
/// allows us to better simplify many unions and intersections. This definition does mean the
/// subtyping relation is not reflexive for non-fully-static types (e.g. `Any` is not a subtype
/// of `Any`).
///
/// [subtype of]: https://typing.python.org/en/latest/spec/concepts.html#subtype-supertype-and-type-equivalence
///
/// There would be an even more general definition of subtyping for gradual types, allowing a
/// type `S` to be a subtype of a type `T` if the top materialization of `S` (`S+`) is a
/// subtype of `T+`, and the bottom materialization of `S` (`S-`) is a subtype of `T-`. This
/// definition is attractive in that it would restore reflexivity of subtyping for all types,
/// and would mean that gradual equivalence of `S` and `T` could be defined simply as `S <: T
/// && T <: S`. It would also be sound, in that simplifying unions or intersections according
/// to this definition of subtyping would still result in an equivalent type.
///
/// Unfortunately using this definition would break transitivity of subtyping when both nominal
/// and structural types are involved, because Liskov enforcement for nominal types is based on
/// assignability, so we can have class `A` with method `def meth(self) -> Any` and a subclass
/// `B(A)` with method `def meth(self) -> int`. In this case, `A` would be a subtype of a
/// protocol `P` with method `def meth(self) -> Any`, but `B` would not be a subtype of `P`,
/// and yet `B` is (by nominal subtyping) a subtype of `A`, so we would have `B <: A` and `A <:
/// P`, but not `B <: P`. Losing transitivity of subtyping is not tenable (it makes union and
/// intersection simplification dependent on the order in which elements are added), so we do
/// not use this more general definition of subtyping.
/// See [`TypeRelation::Subtyping`] for more details.
pub(crate) fn is_subtype_of(self, db: &'db dyn Db, target: Type<'db>) -> bool {
self.when_subtype_of(db, target).is_always_satisfied()
}
@@ -1465,9 +1424,9 @@ impl<'db> Type<'db> {
self.has_relation_to(db, target, TypeRelation::Subtyping)
}
/// Return true if this type is [assignable to] type `target`.
/// Return true if this type is assignable to type `target`.
///
/// [assignable to]: https://typing.python.org/en/latest/spec/concepts.html#the-assignable-to-or-consistent-subtyping-relation
/// See [`TypeRelation::Assignability`] for more details.
pub(crate) fn is_assignable_to(self, db: &'db dyn Db, target: Type<'db>) -> bool {
self.when_assignable_to(db, target).is_always_satisfied()
}
@@ -1476,6 +1435,14 @@ impl<'db> Type<'db> {
self.has_relation_to(db, target, TypeRelation::Assignability)
}
/// Return `true` if it would be redundant to add `self` to a union that already contains `other`.
///
/// See [`TypeRelation::Redundancy`] for more details.
pub(crate) fn is_redundant_with(self, db: &'db dyn Db, other: Type<'db>) -> bool {
self.has_relation_to(db, other, TypeRelation::Redundancy)
.is_always_satisfied()
}
fn has_relation_to(
self,
db: &'db dyn Db,
@@ -1497,7 +1464,7 @@ impl<'db> Type<'db> {
//
// Note that we could do a full equivalence check here, but that would be both expensive
// and unnecessary. This early return is only an optimisation.
if (relation.is_assignability() || self.subtyping_is_always_reflexive()) && self == target {
if (!relation.is_subtyping() || self.subtyping_is_always_reflexive()) && self == target {
return ConstraintSet::from(true);
}
@@ -1514,9 +1481,11 @@ impl<'db> Type<'db> {
// It is a subtype of all other types.
(Type::Never, _) => ConstraintSet::from(true),
// Dynamic is only a subtype of `object` and only a supertype of `Never`; both were
// handled above. It's always assignable, though.
(Type::Dynamic(_), _) | (_, Type::Dynamic(_)) => {
// In some specific situations, `Any`/`Unknown`/`@Todo` can be simplified out of unions and intersections,
// but this is not true for divergent types (and moving this case any lower down appears to cause
// "too many cycle iterations" panics).
(Type::Dynamic(DynamicType::Divergent(_)), _)
| (_, Type::Dynamic(DynamicType::Divergent(_))) => {
ConstraintSet::from(relation.is_assignability())
}
@@ -1541,10 +1510,33 @@ impl<'db> Type<'db> {
.has_relation_to_impl(db, right, relation, visitor)
}
(Type::TypedDict(_), _) | (_, Type::TypedDict(_)) => {
// TODO: Implement assignability and subtyping for TypedDict
ConstraintSet::from(relation.is_assignability())
}
// Dynamic is only a subtype of `object` and only a supertype of `Never`; both were
// handled above. It's always assignable, though.
//
// Union simplification sits in between subtyping and assignability. `Any <: T` only
// holds true if `T` is also a dynamic type or a union that contains a dynamic type.
// Similarly, `T <: Any` only holds true if `T` is a dynamic type or an intersection
// that contains a dynamic type.
(Type::Dynamic(_), _) => ConstraintSet::from(match relation {
TypeRelation::Subtyping => false,
TypeRelation::Assignability => true,
TypeRelation::Redundancy => match target {
Type::Dynamic(_) => true,
Type::Union(union) => union.elements(db).iter().any(Type::is_dynamic),
_ => false,
},
}),
(_, Type::Dynamic(_)) => ConstraintSet::from(match relation {
TypeRelation::Subtyping => false,
TypeRelation::Assignability => true,
TypeRelation::Redundancy => match self {
Type::Dynamic(_) => true,
Type::Intersection(intersection) => {
intersection.positive(db).iter().any(Type::is_dynamic)
}
_ => false,
},
}),
// In general, a TypeVar `T` is not a subtype of a type `S` unless one of the two conditions is satisfied:
// 1. `T` is a bound TypeVar and `T`'s upper bound is a subtype of `S`.
@@ -1692,6 +1684,11 @@ impl<'db> Type<'db> {
// TODO: Infer specializations here
(Type::TypeVar(_), _) | (_, Type::TypeVar(_)) => ConstraintSet::from(false),
(Type::TypedDict(_), _) | (_, Type::TypedDict(_)) => {
// TODO: Implement assignability and subtyping for TypedDict
ConstraintSet::from(relation.is_assignability())
}
// Note that the definition of `Type::AlwaysFalsy` depends on the return value of `__bool__`.
// If `__bool__` always returns True or False, it can be treated as a subtype of `AlwaysTruthy` or `AlwaysFalsy`, respectively.
(left, Type::AlwaysFalsy) => ConstraintSet::from(left.bool(db).is_always_false()),
@@ -4197,20 +4194,26 @@ impl<'db> Type<'db> {
.into()
}
Some(KnownFunction::AssertType) => Binding::single(
self,
Signature::new(
Parameters::new([
Parameter::positional_only(Some(Name::new_static("value")))
.with_annotated_type(Type::any()),
Parameter::positional_only(Some(Name::new_static("type")))
.type_form()
.with_annotated_type(Type::any()),
]),
Some(Type::none(db)),
),
)
.into(),
Some(KnownFunction::AssertType) => {
let val_ty =
BoundTypeVarInstance::synthetic(db, "T", TypeVarVariance::Invariant);
Binding::single(
self,
Signature::new_generic(
Some(GenericContext::from_typevar_instances(db, [val_ty])),
Parameters::new([
Parameter::positional_only(Some(Name::new_static("value")))
.with_annotated_type(Type::TypeVar(val_ty)),
Parameter::positional_only(Some(Name::new_static("type")))
.type_form()
.with_annotated_type(Type::any()),
]),
Some(Type::TypeVar(val_ty)),
),
)
.into()
}
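The rewritten binding makes `assert_type` generic over its first parameter, so a call expression now evaluates to the argument's own type rather than `None`. This mirrors the runtime behaviour of `typing.assert_type`, which returns its first argument unchanged. A minimal sketch (the `except` fallback is a hand-rolled stand-in for interpreters older than 3.11, not part of the diff):

```python
try:
    from typing import assert_type  # available since Python 3.11
except ImportError:  # older interpreters: a minimal runtime stand-in
    def assert_type(val, typ):
        return val

# At runtime, assert_type is a no-op that returns its first argument,
# so the call expression carries the type of `value`, not `None`.
x = assert_type(1 + 1, int)
assert x == 2
assert assert_type("ok", str) == "ok"
```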
Some(KnownFunction::AssertNever) => {
Binding::single(
@@ -5459,28 +5462,19 @@ impl<'db> Type<'db> {
}
}
let new_specialization = new_call_outcome
.and_then(Result::ok)
.as_ref()
.and_then(Bindings::single_element)
.into_iter()
.flat_map(CallableBinding::matching_overloads)
.next()
.and_then(|(_, binding)| binding.inherited_specialization())
.filter(|specialization| {
Some(specialization.generic_context(db)) == generic_context
});
let init_specialization = init_call_outcome
.and_then(Result::ok)
.as_ref()
.and_then(Bindings::single_element)
.into_iter()
.flat_map(CallableBinding::matching_overloads)
.next()
.and_then(|(_, binding)| binding.inherited_specialization())
.filter(|specialization| {
Some(specialization.generic_context(db)) == generic_context
});
let specialize_constructor = |outcome: Option<Bindings<'db>>| {
let (_, binding) = outcome
.as_ref()?
.single_element()?
.matching_overloads()
.next()?;
binding.specialization()?.restrict(db, generic_context?)
};
let new_specialization =
specialize_constructor(new_call_outcome.and_then(Result::ok));
let init_specialization =
specialize_constructor(init_call_outcome.and_then(Result::ok));
let specialization =
combine_specializations(db, new_specialization, init_specialization);
let specialized = specialization
@@ -6771,13 +6765,11 @@ impl<'db> TypeMapping<'_, 'db> {
db,
context
.variables(db)
.iter()
.filter(|var| !var.typevar(db).is_self(db))
.copied(),
.filter(|var| !var.typevar(db).is_self(db)),
),
TypeMapping::ReplaceSelf { new_upper_bound } => GenericContext::from_typevar_instances(
db,
context.variables(db).iter().map(|typevar| {
context.variables(db).map(|typevar| {
if typevar.typevar(db).is_self(db) {
BoundTypeVarInstance::synthetic_self(
db,
@@ -6785,7 +6777,7 @@ impl<'db> TypeMapping<'_, 'db> {
typevar.binding_context(db),
)
} else {
*typevar
typevar
}
}),
),
@@ -8993,16 +8985,138 @@ impl<'db> ConstructorCallError<'db> {
}
}
/// A non-exhaustive enumeration of relations that can exist between types.
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub(crate) enum TypeRelation {
/// The "subtyping" relation.
///
/// A [fully static] type `B` is a subtype of a fully static type `A` if and only if
/// the set of possible runtime values represented by `B` is a subset of the set
/// of possible runtime values represented by `A`.
///
/// For a pair of types `C` and `D` that may or may not be fully static,
/// `D` can be said to be a subtype of `C` if every possible fully static
/// [materialization] of `D` is a subtype of every possible fully static
/// materialization of `C`. Another way of saying this is that `D` will be a
/// subtype of `C` if and only if the union of all possible sets of values
/// represented by `D` (the "top materialization" of `D`) is a subtype of the
/// intersection of all possible sets of values represented by `C` (the "bottom
/// materialization" of `C`). More concisely: `D <: C` iff `Top[D] <: Bottom[C]`.
///
/// For example, `list[Any]` can be said to be a subtype of `Sequence[object]`,
/// because every possible fully static materialization of `list[Any]` (`list[int]`,
/// `list[str]`, `list[bytes | bool]`, `list[SupportsIndex]`, etc.) would be
/// considered a subtype of `Sequence[object]`.
///
    /// Note that this expansion of the subtyping relation to non-fully-static types is
    /// not described in the typing spec, but it is sound and consistent with the
    /// principles laid out there. This definition does mean that the subtyping relation
    /// is not reflexive for non-fully-static types (e.g. `Any` is not a subtype of `Any`).
///
/// [fully static]: https://typing.python.org/en/latest/spec/glossary.html#term-fully-static-type
/// [materialization]: https://typing.python.org/en/latest/spec/glossary.html#term-materialize
Subtyping,
/// The "assignability" relation.
///
/// The assignability relation between two types `A` and `B` dictates whether a
/// type checker should emit an error when a value of type `B` is assigned to a
/// variable declared as having type `A`.
///
/// For a pair of [fully static] types `A` and `B`, the assignability relation
/// between `A` and `B` is the same as the subtyping relation.
///
    /// Between a pair of types `C` and `D` where either `C` or `D` is not fully static, the
/// assignability relation may be more permissive than the subtyping relation. `D`
/// can be said to be assignable to `C` if *some* possible fully static [materialization]
/// of `D` is a subtype of *some* possible fully static materialization of `C`.
/// Another way of saying this is that `D` will be assignable to `C` if and only if the
/// intersection of all possible sets of values represented by `D` (the "bottom
/// materialization" of `D`) is a subtype of the union of all possible sets of values
/// represented by `C` (the "top materialization" of `C`).
    /// More concisely: `D` is assignable to `C` iff `Bottom[D] <: Top[C]`.
///
/// For example, `Any` is not a subtype of `int`, because there are possible
/// materializations of `Any` (e.g., `str`) that are not subtypes of `int`.
/// `Any` is *assignable* to `int`, however, as there are *some* possible materializations
/// of `Any` (such as `int` itself!) that *are* subtypes of `int`. `Any` cannot even
/// be considered a subtype of itself, as two separate uses of `Any` in the same scope
/// might materialize to different types between which there would exist no subtyping
/// relation; nor is `Any` a subtype of `int | Any`, for the same reason. Nonetheless,
/// `Any` is assignable to both `Any` and `int | Any`.
///
/// While `Any` can materialize to anything, the presence of `Any` in a type does not
/// necessarily make it assignable to everything. For example, `list[Any]` is not
/// assignable to `int`, because there are no possible fully static types we could
/// substitute for `Any` in this type that would make it a subtype of `int`. For the
/// same reason, a union such as `str | Any` is not assignable to `int`.
///
/// [fully static]: https://typing.python.org/en/latest/spec/glossary.html#term-fully-static-type
/// [materialization]: https://typing.python.org/en/latest/spec/glossary.html#term-materialize
Assignability,
/// The "redundancy" relation.
///
/// The redundancy relation dictates whether the union `A | B` can be safely simplified
/// to the type `A` without downstream consequences on ty's inference of types elsewhere.
///
/// For a pair of [fully static] types `A` and `B`, the redundancy relation between `A`
/// and `B` is the same as the subtyping relation.
///
    /// Between a pair of types `C` and `D` where either `C` or `D` is not fully static, the
/// redundancy relation sits in between the subtyping relation and the assignability relation.
/// `D` can be said to be redundant in a union with `C` if the top materialization of the type
/// `C | D` is equivalent to the top materialization of `C`, *and* the bottom materialization
/// of `C | D` is equivalent to the bottom materialization of `C`.
    /// More concisely: `D` is redundant with `C` iff `Top[C | D] == Top[C]` AND `Bottom[C | D] == Bottom[C]`.
///
/// Practically speaking, in most respects the redundancy relation is the same as the subtyping
/// relation. It is redundant to add `bool` to a union that includes `int`, because `bool` is a
/// subtype of `int`, so inference of attribute access or binary expressions on the union
/// `int | bool` would always produce a type that represents the same set of possible sets of
/// runtime values as if ty had inferred the attribute access or binary expression on `int`
/// alone.
///
    /// The redundancy relation differs from the subtyping relation in that it permits a
    /// number of union simplifications that the subtyping relation alone does not
    /// strictly allow. For example, it is safe to avoid adding
/// `Any` to a union that already includes `Any`, because `Any` already represents an
/// unknown set of possible sets of runtime values that can materialize to any type in a
/// gradual, permissive way. Inferring attribute access or binary expressions over
/// `Any | Any` could never conceivably yield a type that represents a different set of
/// possible sets of runtime values to inferring the same expression over `Any` alone;
    /// although `Any` is not a subtype of `Any`, the top materialization of both `Any` and
    /// `Any | Any` is `object`, and the bottom materialization of both types is `Never`.
///
/// The same principle also applies to intersections that include `Any` being added to
/// unions that include `Any`: for any type `A`, although naively distributing
/// type-inference operations over `(Any & A) | Any` could produce types that have
/// different displays to `Any`, `(Any & A) | Any` nonetheless has the same top
/// materialization as `Any` and the same bottom materialization as `Any`, and thus it is
/// redundant to add `Any & A` to a union that already includes `Any`.
///
/// Union simplification cannot use the assignability relation, meanwhile, as it is
/// trivial to produce examples of cases where adding a type `B` to a union that includes
/// `A` would impact downstream type inference, even where `B` is assignable to `A`. For
/// example, `int` is assignable to `Any`, but attribute access over the union `int | Any`
/// will yield very different results to attribute access over `Any` alone. The top
/// materialization of `Any` and `int | Any` may be the same type (`object`), but the
/// two differ in their bottom materializations (`Never` and `int`, respectively).
///
/// [fully static]: https://typing.python.org/en/latest/spec/glossary.html#term-fully-static-type
    /// [materialization]: https://typing.python.org/en/latest/spec/glossary.html#term-materialize
Redundancy,
}
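The docstrings above define all three relations in terms of top and bottom materializations. They can be sanity-checked with a toy model in which a fully static type is a set of runtime values and `Any` is a gradual marker. This is an illustrative sketch only — the `OBJECT`/`NEVER`/`INT` names and the set encoding are invented for the example, not ty's implementation:

```python
# Toy universe: a fully static type is a frozenset of values; Any is gradual.
OBJECT = frozenset({1, 2, "a", "b"})  # every value in this toy world
NEVER = frozenset()
INT = frozenset({1, 2})
ANY = "Any"  # gradual marker

def top(ty):
    """Union of all possible materializations of `ty`."""
    return OBJECT if ty == ANY else ty

def bottom(ty):
    """Intersection of all possible materializations of `ty`."""
    return NEVER if ty == ANY else ty

def union_top_bottom(c, d):
    # Top and Bottom both distribute over unions in this toy model.
    return top(c) | top(d), bottom(c) | bottom(d)

def is_subtype(d, c):
    # D <: C iff Top[D] <: Bottom[C]
    return top(d) <= bottom(c)

def is_assignable(d, c):
    # D is assignable to C iff Bottom[D] <: Top[C]
    return bottom(d) <= top(c)

def is_redundant(d, c):
    # D is redundant with C iff Top[C | D] == Top[C] and Bottom[C | D] == Bottom[C]
    t, b = union_top_bottom(c, d)
    return t == top(c) and b == bottom(c)

assert is_subtype(INT, OBJECT)     # ordinary subtyping still holds
assert not is_subtype(ANY, ANY)    # subtyping is not reflexive for Any
assert is_assignable(ANY, INT)     # ...but Any is assignable to int
assert is_redundant(ANY, ANY)      # Any | Any simplifies to Any
assert not is_redundant(INT, ANY)  # int | Any must NOT drop int:
# Top[int | Any] == Top[Any] == object, but the bottoms differ (int vs Never).
```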
impl TypeRelation {
pub(crate) const fn is_assignability(self) -> bool {
matches!(self, TypeRelation::Assignability)
}
pub(crate) const fn is_subtyping(self) -> bool {
matches!(self, TypeRelation::Subtyping)
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, get_size2::GetSize)]

View File

@@ -502,18 +502,9 @@ impl<'db> UnionBuilder<'db> {
}
if should_simplify_full && !matches!(element_type, Type::TypeAlias(_)) {
if ty.is_equivalent_to(self.db, element_type)
|| ty.is_subtype_of(self.db, element_type)
|| ty.into_intersection().is_some_and(|intersection| {
intersection.positive(self.db).contains(&element_type)
})
{
if ty.is_redundant_with(self.db, element_type) {
return;
} else if element_type.is_subtype_of(self.db, ty)
|| element_type
.into_intersection()
.is_some_and(|intersection| intersection.positive(self.db).contains(&ty))
{
} else if element_type.is_redundant_with(self.db, ty) {
to_remove.push(index);
} else if ty_negated.is_subtype_of(self.db, element_type) {
// We add `ty` to the union. We just checked that `~ty` is a subtype of an
@@ -930,13 +921,11 @@ impl<'db> InnerIntersectionBuilder<'db> {
let mut to_remove = SmallVec::<[usize; 1]>::new();
for (index, existing_positive) in self.positive.iter().enumerate() {
// S & T = S if S <: T
if existing_positive.is_subtype_of(db, new_positive)
|| existing_positive.is_equivalent_to(db, new_positive)
{
if existing_positive.is_redundant_with(db, new_positive) {
return;
}
// same rule, reverse order
if new_positive.is_subtype_of(db, *existing_positive) {
if new_positive.is_redundant_with(db, *existing_positive) {
to_remove.push(index);
}
// A & B = Never if A and B are disjoint
@@ -1027,9 +1016,7 @@ impl<'db> InnerIntersectionBuilder<'db> {
let mut to_remove = SmallVec::<[usize; 1]>::new();
for (index, existing_negative) in self.negative.iter().enumerate() {
// ~S & ~T = ~T if S <: T
if existing_negative.is_subtype_of(db, new_negative)
|| existing_negative.is_equivalent_to(db, new_negative)
{
if existing_negative.is_redundant_with(db, new_negative) {
to_remove.push(index);
}
// same rule, reverse order
@@ -1090,7 +1077,11 @@ impl<'db> InnerIntersectionBuilder<'db> {
// don't need to worry about finding any particular constraint more than once.
let constraints = constraints.elements(db);
let mut positive_constraint_count = 0;
for positive in &self.positive {
for (i, positive) in self.positive.iter().enumerate() {
if i == typevar_index {
continue;
}
// This linear search should be fine as long as we don't encounter typevars with
// thousands of constraints.
positive_constraint_count += constraints

View File

@@ -32,11 +32,11 @@ use crate::types::tuple::{TupleLength, TupleType};
use crate::types::{
BoundMethodType, ClassLiteral, DataclassParams, FieldInstance, KnownBoundMethodType,
KnownClass, KnownInstanceType, MemberLookupPolicy, PropertyInstanceType, SpecialFormType,
TrackedConstraintSet, TypeAliasType, TypeContext, TypeMapping, UnionBuilder, UnionType,
WrapperDescriptorKind, enums, ide_support, todo_type,
TrackedConstraintSet, TypeAliasType, TypeContext, UnionBuilder, UnionType,
WrapperDescriptorKind, enums, ide_support, infer_isolated_expression, todo_type,
};
use ruff_db::diagnostic::{Annotation, Diagnostic, SubDiagnostic, SubDiagnosticSeverity};
use ruff_python_ast::{self as ast, PythonVersion};
use ruff_python_ast::{self as ast, ArgOrKeyword, PythonVersion};
/// Binding information for a possible union of callables. At a call site, the arguments must be
/// compatible with _all_ of the types in the union for the call to be valid.
@@ -1701,10 +1701,6 @@ impl<'db> CallableBinding<'db> {
parameter_type =
parameter_type.apply_specialization(db, specialization);
}
if let Some(inherited_specialization) = overload.inherited_specialization {
parameter_type =
parameter_type.apply_specialization(db, inherited_specialization);
}
union_parameter_types[parameter_index.saturating_sub(skipped_parameters)]
.add_in_place(parameter_type);
}
@@ -1780,7 +1776,7 @@ impl<'db> CallableBinding<'db> {
}
/// Returns the index of the matching overload in the form of [`MatchingOverloadIndex`].
fn matching_overload_index(&self) -> MatchingOverloadIndex {
pub(crate) fn matching_overload_index(&self) -> MatchingOverloadIndex {
let mut matching_overloads = self.matching_overloads();
match matching_overloads.next() {
None => MatchingOverloadIndex::None,
@@ -1798,8 +1794,15 @@ impl<'db> CallableBinding<'db> {
}
}
/// Returns all overloads for this call binding, including overloads that did not match.
pub(crate) fn overloads(&self) -> &[Binding<'db>] {
self.overloads.as_slice()
}
/// Returns an iterator over all the overloads that matched for this call binding.
pub(crate) fn matching_overloads(&self) -> impl Iterator<Item = (usize, &Binding<'db>)> {
pub(crate) fn matching_overloads(
&self,
) -> impl Iterator<Item = (usize, &Binding<'db>)> + Clone {
self.overloads
.iter()
.enumerate()
@@ -1983,7 +1986,7 @@ impl<'db> CallableBinding<'db> {
for overload in overloads.iter().take(MAXIMUM_OVERLOADS) {
diag.info(format_args!(
" {}",
overload.signature(context.db(), None).display(context.db())
overload.signature(context.db()).display(context.db())
));
}
if overloads.len() > MAXIMUM_OVERLOADS {
@@ -2030,7 +2033,7 @@ enum OverloadCallReturnType<'db> {
}
#[derive(Debug)]
enum MatchingOverloadIndex {
pub(crate) enum MatchingOverloadIndex {
/// No matching overloads found.
None,
@@ -2444,7 +2447,6 @@ struct ArgumentTypeChecker<'a, 'db> {
errors: &'a mut Vec<BindingError<'db>>,
specialization: Option<Specialization<'db>>,
inherited_specialization: Option<Specialization<'db>>,
}
impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
@@ -2466,7 +2468,6 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
call_expression_tcx,
errors,
specialization: None,
inherited_specialization: None,
}
}
@@ -2498,9 +2499,7 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
}
fn infer_specialization(&mut self) {
if self.signature.generic_context.is_none()
&& self.signature.inherited_generic_context.is_none()
{
if self.signature.generic_context.is_none() {
return;
}
@@ -2512,9 +2511,17 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
if let Some(return_ty) = self.signature.return_ty
&& let Some(call_expression_tcx) = self.call_expression_tcx.annotation
{
// Ignore any specialization errors here, because the type context is only used to
// optionally widen the return type.
let _ = builder.infer(return_ty, call_expression_tcx);
match call_expression_tcx {
// A type variable is not a useful type-context for expression inference, and applying it
// to the return type can lead to confusing unions in nested generic calls.
Type::TypeVar(_) => {}
_ => {
// Ignore any specialization errors here, because the type context is only used as a hint
// to infer a more assignable return type.
let _ = builder.infer(return_ty, call_expression_tcx);
}
}
}
let parameters = self.signature.parameters();
@@ -2542,14 +2549,6 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
}
self.specialization = self.signature.generic_context.map(|gc| builder.build(gc));
self.inherited_specialization = self.signature.inherited_generic_context.map(|gc| {
// The inherited generic context is used when inferring the specialization of a generic
// class from a constructor call. In this case (only), we promote any typevars that are
// inferred as a literal to the corresponding instance type.
builder
.build(gc)
.apply_type_mapping(self.db, &TypeMapping::PromoteLiterals)
});
}
fn check_argument_type(
@@ -2566,11 +2565,6 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
argument_type = argument_type.apply_specialization(self.db, specialization);
expected_ty = expected_ty.apply_specialization(self.db, specialization);
}
if let Some(inherited_specialization) = self.inherited_specialization {
argument_type =
argument_type.apply_specialization(self.db, inherited_specialization);
expected_ty = expected_ty.apply_specialization(self.db, inherited_specialization);
}
// This is one of the few places where we want to check if there's _any_ specialization
// where assignability holds; normally we want to check that assignability holds for
// _all_ specializations.
@@ -2742,8 +2736,8 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
}
}
fn finish(self) -> (Option<Specialization<'db>>, Option<Specialization<'db>>) {
(self.specialization, self.inherited_specialization)
fn finish(self) -> Option<Specialization<'db>> {
self.specialization
}
}
@@ -2807,10 +2801,6 @@ pub(crate) struct Binding<'db> {
/// The specialization that was inferred from the argument types, if the callable is generic.
specialization: Option<Specialization<'db>>,
/// The specialization that was inferred for a class method's containing generic class, if it
/// is being used to infer a specialization for the class.
inherited_specialization: Option<Specialization<'db>>,
/// Information about which parameter(s) each argument was matched with, in argument source
/// order.
argument_matches: Box<[MatchedArgument<'db>]>,
@@ -2835,7 +2825,6 @@ impl<'db> Binding<'db> {
signature_type,
return_ty: Type::unknown(),
specialization: None,
inherited_specialization: None,
argument_matches: Box::from([]),
variadic_argument_matched_to_variadic_parameter: false,
parameter_tys: Box::from([]),
@@ -2906,15 +2895,10 @@ impl<'db> Binding<'db> {
checker.infer_specialization();
checker.check_argument_types();
(self.specialization, self.inherited_specialization) = checker.finish();
self.specialization = checker.finish();
if let Some(specialization) = self.specialization {
self.return_ty = self.return_ty.apply_specialization(db, specialization);
}
if let Some(inherited_specialization) = self.inherited_specialization {
self.return_ty = self
.return_ty
.apply_specialization(db, inherited_specialization);
}
}
pub(crate) fn set_return_type(&mut self, return_ty: Type<'db>) {
@@ -2925,8 +2909,8 @@ impl<'db> Binding<'db> {
self.return_ty
}
pub(crate) fn inherited_specialization(&self) -> Option<Specialization<'db>> {
self.inherited_specialization
pub(crate) fn specialization(&self) -> Option<Specialization<'db>> {
self.specialization
}
/// Returns the bound types for each parameter, in parameter source order, or `None` if no
@@ -2988,7 +2972,6 @@ impl<'db> Binding<'db> {
BindingSnapshot {
return_ty: self.return_ty,
specialization: self.specialization,
inherited_specialization: self.inherited_specialization,
argument_matches: self.argument_matches.clone(),
parameter_tys: self.parameter_tys.clone(),
errors: self.errors.clone(),
@@ -2999,7 +2982,6 @@ impl<'db> Binding<'db> {
let BindingSnapshot {
return_ty,
specialization,
inherited_specialization,
argument_matches,
parameter_tys,
errors,
@@ -3007,7 +2989,6 @@ impl<'db> Binding<'db> {
self.return_ty = return_ty;
self.specialization = specialization;
self.inherited_specialization = inherited_specialization;
self.argument_matches = argument_matches;
self.parameter_tys = parameter_tys;
self.errors = errors;
@@ -3027,7 +3008,6 @@ impl<'db> Binding<'db> {
fn reset(&mut self) {
self.return_ty = Type::unknown();
self.specialization = None;
self.inherited_specialization = None;
self.argument_matches = Box::from([]);
self.parameter_tys = Box::from([]);
self.errors.clear();
@@ -3038,7 +3018,6 @@ impl<'db> Binding<'db> {
struct BindingSnapshot<'db> {
return_ty: Type<'db>,
specialization: Option<Specialization<'db>>,
inherited_specialization: Option<Specialization<'db>>,
argument_matches: Box<[MatchedArgument<'db>]>,
parameter_tys: Box<[Option<Type<'db>>]>,
errors: Vec<BindingError<'db>>,
@@ -3078,7 +3057,6 @@ impl<'db> CallableBindingSnapshot<'db> {
// ... and update the snapshot with the current state of the binding.
snapshot.return_ty = binding.return_ty;
snapshot.specialization = binding.specialization;
snapshot.inherited_specialization = binding.inherited_specialization;
snapshot
.argument_matches
.clone_from(&binding.argument_matches);
@@ -3326,6 +3304,23 @@ impl<'db> BindingError<'db> {
return;
};
// Re-infer the argument type of call expressions, ignoring the type context for more
// precise error messages.
let provided_ty = match Self::get_argument_node(node, *argument_index) {
None => *provided_ty,
// Ignore starred arguments, as those are difficult to re-infer.
Some(
ast::ArgOrKeyword::Arg(ast::Expr::Starred(_))
| ast::ArgOrKeyword::Keyword(ast::Keyword { arg: None, .. }),
) => *provided_ty,
Some(
ast::ArgOrKeyword::Arg(value)
| ast::ArgOrKeyword::Keyword(ast::Keyword { value, .. }),
) => infer_isolated_expression(context.db(), context.scope(), value),
};
let provided_ty_display = provided_ty.display(context.db());
let expected_ty_display = expected_ty.display(context.db());
@@ -3373,7 +3368,7 @@ impl<'db> BindingError<'db> {
}
diag.info(format_args!(
" {}",
overload.signature(context.db(), None).display(context.db())
overload.signature(context.db()).display(context.db())
));
}
if overloads.len() > MAXIMUM_OVERLOADS {
@@ -3661,22 +3656,29 @@ impl<'db> BindingError<'db> {
}
}
fn get_node(node: ast::AnyNodeRef, argument_index: Option<usize>) -> ast::AnyNodeRef {
fn get_node(node: ast::AnyNodeRef<'_>, argument_index: Option<usize>) -> ast::AnyNodeRef<'_> {
// If we have a Call node and an argument index, report the diagnostic on the correct
// argument node; otherwise, report it on the entire provided node.
match Self::get_argument_node(node, argument_index) {
Some(ast::ArgOrKeyword::Arg(expr)) => expr.into(),
Some(ast::ArgOrKeyword::Keyword(expr)) => expr.into(),
None => node,
}
}
fn get_argument_node(
node: ast::AnyNodeRef<'_>,
argument_index: Option<usize>,
) -> Option<ArgOrKeyword<'_>> {
match (node, argument_index) {
(ast::AnyNodeRef::ExprCall(call_node), Some(argument_index)) => {
match call_node
(ast::AnyNodeRef::ExprCall(call_node), Some(argument_index)) => Some(
call_node
.arguments
.arguments_source_order()
.nth(argument_index)
.expect("argument index should not be out of range")
{
ast::ArgOrKeyword::Arg(expr) => expr.into(),
ast::ArgOrKeyword::Keyword(keyword) => keyword.into(),
}
}
_ => node,
.expect("argument index should not be out of range"),
),
_ => None,
}
}
}

View File

@@ -324,7 +324,6 @@ impl<'db> VarianceInferable<'db> for GenericAlias<'db> {
specialization
.generic_context(db)
.variables(db)
.iter()
.zip(specialization.types(db))
.map(|(generic_typevar, ty)| {
if let Some(explicit_variance) =
@@ -346,7 +345,7 @@ impl<'db> VarianceInferable<'db> for GenericAlias<'db> {
let typevar_variance_in_substituted_type = ty.variance_of(db, typevar);
origin
.with_polarity(typevar_variance_in_substituted_type)
.variance_of(db, *generic_typevar)
.variance_of(db, generic_typevar)
}
}),
)
@@ -551,7 +550,9 @@ impl<'db> ClassType<'db> {
self.iter_mro(db).when_any(db, |base| {
match base {
ClassBase::Dynamic(_) => match relation {
TypeRelation::Subtyping => ConstraintSet::from(other.is_object(db)),
TypeRelation::Subtyping | TypeRelation::Redundancy => {
ConstraintSet::from(other.is_object(db))
}
TypeRelation::Assignability => ConstraintSet::from(!other.is_final(db)),
},
@@ -1011,8 +1012,7 @@ impl<'db> ClassType<'db> {
let synthesized_dunder = CallableType::function_like(
db,
Signature::new(parameters, None)
.with_inherited_generic_context(inherited_generic_context),
Signature::new_generic(inherited_generic_context, parameters, None),
);
Place::bound(synthesized_dunder).into()
@@ -1452,6 +1452,16 @@ impl<'db> ClassLiteral<'db> {
)
}
/// Returns the generic context that should be inherited by any constructor methods of this
/// class.
///
/// When inferring a specialization of the class's generic context from a constructor call, we
/// promote any typevars that are inferred as a literal to the corresponding instance type.
fn inherited_generic_context(self, db: &'db dyn Db) -> Option<GenericContext<'db>> {
self.generic_context(db)
.map(|generic_context| generic_context.promote_literals(db))
}
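The literal promotion performed by `inherited_generic_context` — widening a typevar that was inferred as a literal to the corresponding instance type, so that a constructor call specializes a generic class to e.g. `int` rather than `Literal[1]` — can be sketched as follows. The `Literal` wrapper and `promote_literal` helper here are invented for illustration; they are not ty's API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Literal:
    """Toy stand-in for a literal type such as Literal[1]."""
    value: object

def promote_literal(ty):
    """Widen a literal type to its instance type; leave other types alone."""
    if isinstance(ty, Literal):
        return type(ty.value)  # e.g. Literal(1) -> int
    return ty

# A constructor call like Box(1) should specialize to Box[int],
# not Box[Literal[1]]:
assert promote_literal(Literal(1)) is int
assert promote_literal(Literal("a")) is str
assert promote_literal(int) is int  # non-literals pass through unchanged
```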
fn file(self, db: &dyn Db) -> File {
self.body_scope(db).file(db)
}
@@ -1994,7 +2004,7 @@ impl<'db> ClassLiteral<'db> {
lookup_result = lookup_result.or_else(|lookup_error| {
lookup_error.or_fall_back_to(
db,
class.own_class_member(db, self.generic_context(db), name),
class.own_class_member(db, self.inherited_generic_context(db), name),
)
});
}
@@ -2244,8 +2254,14 @@ impl<'db> ClassLiteral<'db> {
// so that the keyword-only parameters appear after positional parameters.
parameters.sort_by_key(Parameter::is_keyword_only);
let mut signature = Signature::new(Parameters::new(parameters), return_ty);
signature.inherited_generic_context = self.generic_context(db);
let signature = match name {
"__new__" | "__init__" => Signature::new_generic(
self.inherited_generic_context(db),
Parameters::new(parameters),
return_ty,
),
_ => Signature::new(Parameters::new(parameters), return_ty),
};
Some(CallableType::function_like(db, signature))
};
@@ -2293,7 +2309,7 @@ impl<'db> ClassLiteral<'db> {
KnownClass::NamedTupleFallback
.to_class_literal(db)
.into_class_literal()?
.own_class_member(db, self.generic_context(db), None, name)
.own_class_member(db, self.inherited_generic_context(db), None, name)
.place
.ignore_possibly_unbound()
.map(|ty| {
@@ -5419,7 +5435,7 @@ enum SlotsKind {
impl SlotsKind {
fn from(db: &dyn Db, base: ClassLiteral) -> Self {
let Place::Type(slots_ty, bound) = base
.own_class_member(db, base.generic_context(db), None, "__slots__")
.own_class_member(db, base.inherited_generic_context(db), None, "__slots__")
.place
else {
return Self::NotSpecified;

View File

@@ -40,6 +40,7 @@ pub(crate) struct InferContext<'db, 'ast> {
module: &'ast ParsedModuleRef,
diagnostics: std::cell::RefCell<TypeCheckDiagnostics>,
no_type_check: InNoTypeCheck,
multi_inference: bool,
bomb: DebugDropBomb,
}
@@ -50,6 +51,7 @@ impl<'db, 'ast> InferContext<'db, 'ast> {
scope,
module,
file: scope.file(db),
multi_inference: false,
diagnostics: std::cell::RefCell::new(TypeCheckDiagnostics::default()),
no_type_check: InNoTypeCheck::default(),
bomb: DebugDropBomb::new(
@@ -156,6 +158,18 @@ impl<'db, 'ast> InferContext<'db, 'ast> {
DiagnosticGuardBuilder::new(self, id, severity)
}
/// Returns `true` if the current expression is being inferred for a second
/// (or subsequent) time, with a potentially different bidirectional type
/// context.
pub(super) fn is_in_multi_inference(&self) -> bool {
self.multi_inference
}
/// Set the multi-inference state, returning the previous value.
pub(super) fn set_multi_inference(&mut self, multi_inference: bool) -> bool {
std::mem::replace(&mut self.multi_inference, multi_inference)
}
pub(super) fn set_in_no_type_check(&mut self, no_type_check: InNoTypeCheck) {
self.no_type_check = no_type_check;
}
@@ -410,6 +424,11 @@ impl<'db, 'ctx> LintDiagnosticGuardBuilder<'db, 'ctx> {
if ctx.is_in_no_type_check() {
return None;
}
// If this lint is being reported as part of multi-inference of a given expression,
// silence it to avoid duplicated diagnostics.
if ctx.is_in_multi_inference() {
return None;
}
let id = DiagnosticId::Lint(lint.name());
let suppressions = suppressions(ctx.db(), ctx.file());
@@ -575,6 +594,11 @@ impl<'db, 'ctx> DiagnosticGuardBuilder<'db, 'ctx> {
if !ctx.db.should_check_file(ctx.file) {
return None;
}
// If this lint is being reported as part of multi-inference of a given expression,
// silence it to avoid duplicated diagnostics.
if ctx.is_in_multi_inference() {
return None;
}
Some(DiagnosticGuardBuilder { ctx, id, severity })
}

View File

@@ -1975,7 +1975,7 @@ pub(super) fn report_invalid_assignment<'db>(
if let DefinitionKind::AnnotatedAssignment(annotated_assignment) = definition.kind(context.db())
&& let Some(value) = annotated_assignment.value(context.module())
{
// Re-infer the RHS of the annotated assignment, ignoring the type context, for more precise
// Re-infer the RHS of the annotated assignment, ignoring the type context for more precise
// error messages.
source_ty = infer_isolated_expression(context.db(), definition.scope(context.db()), value);
}

View File

@@ -654,7 +654,7 @@ pub(crate) struct DisplayOverloadLiteral<'db> {
impl Display for DisplayOverloadLiteral<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
let signature = self.literal.signature(self.db, None);
let signature = self.literal.signature(self.db);
let type_parameters = DisplayOptionalGenericContext {
generic_context: signature.generic_context.as_ref(),
db: self.db,
@@ -832,7 +832,6 @@ impl Display for DisplayGenericContext<'_> {
let variables = self.generic_context.variables(self.db);
let non_implicit_variables: Vec<_> = variables
.iter()
.filter(|bound_typevar| !bound_typevar.typevar(self.db).is_self(self.db))
.collect();
@@ -852,6 +851,10 @@ impl Display for DisplayGenericContext<'_> {
}
impl<'db> Specialization<'db> {
pub fn display(&'db self, db: &'db dyn Db) -> DisplaySpecialization<'db> {
self.display_short(db, TupleSpecialization::No, DisplaySettings::default())
}
/// Renders the specialization as it would appear in a subscript expression, e.g. `[int, str]`.
pub fn display_short(
&'db self,

View File

@@ -72,7 +72,7 @@ use crate::types::diagnostic::{
report_bad_argument_to_get_protocol_members, report_bad_argument_to_protocol_interface,
report_runtime_check_against_non_runtime_checkable_protocol,
};
use crate::types::generics::{GenericContext, walk_generic_context};
use crate::types::generics::GenericContext;
use crate::types::narrow::ClassInfoConstraintFunction;
use crate::types::signatures::{CallableSignature, Signature};
use crate::types::visitor::any_over_type;
@@ -338,11 +338,7 @@ impl<'db> OverloadLiteral<'db> {
/// calling query is not in the same file as this function is defined in, then this will create
/// a cross-module dependency directly on the full AST which will lead to cache
/// over-invalidation.
pub(crate) fn signature(
self,
db: &'db dyn Db,
inherited_generic_context: Option<GenericContext<'db>>,
) -> Signature<'db> {
pub(crate) fn signature(self, db: &'db dyn Db) -> Signature<'db> {
/// `self` or `cls` can be implicitly positional-only if:
/// - It is a method AND
/// - No parameters in the method use PEP-570 syntax AND
@@ -420,7 +416,6 @@ impl<'db> OverloadLiteral<'db> {
Signature::from_function(
db,
generic_context,
inherited_generic_context,
definition,
function_stmt_node,
is_generator,
@@ -484,58 +479,13 @@ impl<'db> OverloadLiteral<'db> {
#[derive(PartialOrd, Ord)]
pub struct FunctionLiteral<'db> {
pub(crate) last_definition: OverloadLiteral<'db>,
/// The inherited generic context, if this function is a constructor method (`__new__` or
/// `__init__`) being used to infer the specialization of its generic class. If any of the
/// method's overloads are themselves generic, this is in addition to those per-overload
/// generic contexts (which are created lazily in [`OverloadLiteral::signature`]).
///
/// If the function is not a constructor method, this field will always be `None`.
///
/// If the function is a constructor method, we will end up creating two `FunctionLiteral`
/// instances for it. The first is created in [`TypeInferenceBuilder`][infer] when we encounter
/// the function definition during type inference. At this point, we don't yet know if the
/// function is a constructor method, so we create a `FunctionLiteral` with `None` for this
/// field.
///
/// If at some point we encounter a call expression that invokes the containing class's
/// constructor, we will create a _new_ `FunctionLiteral` instance for the function, with this
/// field [updated][] to contain the containing class's generic context.
///
/// [infer]: crate::types::infer::TypeInferenceBuilder::infer_function_definition
/// [updated]: crate::types::class::ClassLiteral::own_class_member
inherited_generic_context: Option<GenericContext<'db>>,
}
// The Salsa heap is tracked separately.
impl get_size2::GetSize for FunctionLiteral<'_> {}
fn walk_function_literal<'db, V: super::visitor::TypeVisitor<'db> + ?Sized>(
db: &'db dyn Db,
function: FunctionLiteral<'db>,
visitor: &V,
) {
if let Some(context) = function.inherited_generic_context(db) {
walk_generic_context(db, context, visitor);
}
}
#[salsa::tracked]
impl<'db> FunctionLiteral<'db> {
fn with_inherited_generic_context(
self,
db: &'db dyn Db,
inherited_generic_context: GenericContext<'db>,
) -> Self {
// A function cannot inherit more than one generic context from its containing class.
debug_assert!(self.inherited_generic_context(db).is_none());
Self::new(
db,
self.last_definition(db),
Some(inherited_generic_context),
)
}
fn name(self, db: &'db dyn Db) -> &'db ast::name::Name {
// All of the overloads of a function literal should have the same name.
self.last_definition(db).name(db)
@@ -626,21 +576,14 @@ impl<'db> FunctionLiteral<'db> {
fn signature(self, db: &'db dyn Db) -> CallableSignature<'db> {
// We only include an implementation (i.e. a definition not decorated with `@overload`) if
// it's the only definition.
let inherited_generic_context = self.inherited_generic_context(db);
let (overloads, implementation) = self.overloads_and_implementation(db);
if let Some(implementation) = implementation {
if overloads.is_empty() {
return CallableSignature::single(
implementation.signature(db, inherited_generic_context),
);
return CallableSignature::single(implementation.signature(db));
}
}
CallableSignature::from_overloads(
overloads
.iter()
.map(|overload| overload.signature(db, inherited_generic_context)),
)
CallableSignature::from_overloads(overloads.iter().map(|overload| overload.signature(db)))
}
/// Typed externally-visible signature of the last overload or implementation of this function.
@@ -652,16 +595,7 @@ impl<'db> FunctionLiteral<'db> {
/// a cross-module dependency directly on the full AST which will lead to cache
/// over-invalidation.
fn last_definition_signature(self, db: &'db dyn Db) -> Signature<'db> {
let inherited_generic_context = self.inherited_generic_context(db);
self.last_definition(db)
.signature(db, inherited_generic_context)
}
fn normalized_impl(self, db: &'db dyn Db, visitor: &NormalizedVisitor<'db>) -> Self {
let context = self
.inherited_generic_context(db)
.map(|ctx| ctx.normalized_impl(db, visitor));
Self::new(db, self.last_definition(db), context)
self.last_definition(db).signature(db)
}
}
@@ -695,7 +629,6 @@ pub(super) fn walk_function_type<'db, V: super::visitor::TypeVisitor<'db> + ?Siz
function: FunctionType<'db>,
visitor: &V,
) {
walk_function_literal(db, function.literal(db), visitor);
if let Some(callable_signature) = function.updated_signature(db) {
for signature in &callable_signature.overloads {
walk_signature(db, signature, visitor);
@@ -713,23 +646,18 @@ impl<'db> FunctionType<'db> {
db: &'db dyn Db,
inherited_generic_context: GenericContext<'db>,
) -> Self {
let literal = self
.literal(db)
let updated_signature = self
.signature(db)
.with_inherited_generic_context(db, inherited_generic_context);
let updated_last_definition_signature = self
.last_definition_signature(db)
.clone()
.with_inherited_generic_context(db, inherited_generic_context);
let updated_signature = self.updated_signature(db).map(|signature| {
signature.with_inherited_generic_context(Some(inherited_generic_context))
});
let updated_last_definition_signature =
self.updated_last_definition_signature(db).map(|signature| {
signature
.clone()
.with_inherited_generic_context(Some(inherited_generic_context))
});
Self::new(
db,
literal,
updated_signature,
updated_last_definition_signature,
self.literal(db),
Some(updated_signature),
Some(updated_last_definition_signature),
)
}
@@ -764,8 +692,7 @@ impl<'db> FunctionType<'db> {
let last_definition = literal
.last_definition(db)
.with_dataclass_transformer_params(db, params);
let literal =
FunctionLiteral::new(db, last_definition, literal.inherited_generic_context(db));
let literal = FunctionLiteral::new(db, last_definition);
Self::new(db, literal, None, None)
}
@@ -969,7 +896,9 @@ impl<'db> FunctionType<'db> {
_visitor: &HasRelationToVisitor<'db>,
) -> ConstraintSet<'db> {
match relation {
TypeRelation::Subtyping => ConstraintSet::from(self.is_subtype_of(db, other)),
TypeRelation::Subtyping | TypeRelation::Redundancy => {
ConstraintSet::from(self.is_subtype_of(db, other))
}
TypeRelation::Assignability => ConstraintSet::from(self.is_assignable_to(db, other)),
}
}
@@ -1034,7 +963,7 @@ impl<'db> FunctionType<'db> {
}
pub(crate) fn normalized_impl(self, db: &'db dyn Db, visitor: &NormalizedVisitor<'db>) -> Self {
let literal = self.literal(db).normalized_impl(db, visitor);
let literal = self.literal(db);
let updated_signature = self
.updated_signature(db)
.map(|signature| signature.normalized_impl(db, visitor));

View File

@@ -19,7 +19,7 @@ use crate::types::{
NormalizedVisitor, Type, TypeMapping, TypeRelation, TypeVarBoundOrConstraints, TypeVarInstance,
TypeVarKind, TypeVarVariance, UnionType, binding_type, declaration_type,
};
use crate::{Db, FxOrderSet};
use crate::{Db, FxOrderMap, FxOrderSet};
/// Returns an iterator of any generic context introduced by the given scope or any enclosing
/// scope.
@@ -137,19 +137,28 @@ pub(crate) fn typing_self<'db>(
.map(typevar_to_type)
}
#[derive(Copy, Clone, Debug, Default, Eq, Hash, PartialEq, get_size2::GetSize)]
pub struct GenericContextTypeVarOptions {
should_promote_literals: bool,
}
impl GenericContextTypeVarOptions {
fn promote_literals(mut self) -> Self {
self.should_promote_literals = true;
self
}
}
/// A list of formal type variables for a generic function, class, or type alias.
///
/// TODO: Handle nested generic contexts better, with actual parent links to the lexically
/// containing context.
///
/// # Ordering
/// Ordering is based on the context's salsa-assigned id and not on its values.
/// The id may change between runs, or when the context was garbage collected and recreated.
#[salsa::interned(debug, heap_size=GenericContext::heap_size)]
#[salsa::interned(debug, constructor=new_internal, heap_size=GenericContext::heap_size)]
#[derive(PartialOrd, Ord)]
pub struct GenericContext<'db> {
#[returns(ref)]
pub(crate) variables: FxOrderSet<BoundTypeVarInstance<'db>>,
variables_inner: FxOrderMap<BoundTypeVarInstance<'db>, GenericContextTypeVarOptions>,
}
pub(super) fn walk_generic_context<'db, V: super::visitor::TypeVisitor<'db> + ?Sized>(
@@ -158,7 +167,7 @@ pub(super) fn walk_generic_context<'db, V: super::visitor::TypeVisitor<'db> + ?S
visitor: &V,
) {
for bound_typevar in context.variables(db) {
visitor.visit_bound_type_var_type(db, *bound_typevar);
visitor.visit_bound_type_var_type(db, bound_typevar);
}
}
@@ -166,6 +175,13 @@ pub(super) fn walk_generic_context<'db, V: super::visitor::TypeVisitor<'db> + ?S
impl get_size2::GetSize for GenericContext<'_> {}
impl<'db> GenericContext<'db> {
fn from_variables(
db: &'db dyn Db,
variables: impl IntoIterator<Item = (BoundTypeVarInstance<'db>, GenericContextTypeVarOptions)>,
) -> Self {
Self::new_internal(db, variables.into_iter().collect::<FxOrderMap<_, _>>())
}
/// Creates a generic context from a list of PEP-695 type parameters.
pub(crate) fn from_type_params(
db: &'db dyn Db,
@@ -185,21 +201,44 @@ impl<'db> GenericContext<'db> {
db: &'db dyn Db,
type_params: impl IntoIterator<Item = BoundTypeVarInstance<'db>>,
) -> Self {
Self::new(db, type_params.into_iter().collect::<FxOrderSet<_>>())
Self::from_variables(
db,
type_params
.into_iter()
.map(|bound_typevar| (bound_typevar, GenericContextTypeVarOptions::default())),
)
}
/// Returns a copy of this generic context where we will promote literal types in any inferred
/// specializations.
pub(crate) fn promote_literals(self, db: &'db dyn Db) -> Self {
Self::from_variables(
db,
self.variables_inner(db)
.iter()
.map(|(bound_typevar, options)| (*bound_typevar, options.promote_literals())),
)
}
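The set-to-map change above can be sketched in isolation: `variables_inner` becomes an insertion-ordered map from each type variable to its per-variable options, and `promote_literals` returns a copy with the flag set everywhere. The `TypeVar` type and the `Vec`-backed ordered map below are simplified stand-ins for the salsa-interned `BoundTypeVarInstance` and `FxOrderMap`:

```rust
// Toy model of the new GenericContext storage: keys keep insertion
// order, and each key carries GenericContextTypeVarOptions-like data.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
struct TypeVar(&'static str);

#[derive(Clone, Copy, Debug, Default, PartialEq, Eq)]
struct TypeVarOptions {
    should_promote_literals: bool,
}

#[derive(Debug, Default)]
struct GenericContext {
    variables_inner: Vec<(TypeVar, TypeVarOptions)>,
}

impl GenericContext {
    fn from_typevars(vars: impl IntoIterator<Item = TypeVar>) -> Self {
        Self {
            variables_inner: vars
                .into_iter()
                .map(|v| (v, TypeVarOptions::default()))
                .collect(),
        }
    }

    /// Mirrors `GenericContext::variables`: keys only, insertion order.
    fn variables(&self) -> impl Iterator<Item = TypeVar> + '_ {
        self.variables_inner.iter().map(|(v, _)| *v)
    }

    /// Mirrors `promote_literals`: same variables, options flipped.
    fn promote_literals(&self) -> Self {
        Self {
            variables_inner: self
                .variables_inner
                .iter()
                .map(|&(v, _)| (v, TypeVarOptions { should_promote_literals: true }))
                .collect(),
        }
    }

    fn len(&self) -> usize {
        self.variables_inner.len()
    }
}

fn main() {
    let ctx = GenericContext::from_typevars([TypeVar("T"), TypeVar("U")]);
    assert_eq!(ctx.len(), 2);
    assert_eq!(ctx.variables().collect::<Vec<_>>(), [TypeVar("T"), TypeVar("U")]);
    let promoted = ctx.promote_literals();
    assert!(promoted.variables_inner.iter().all(|(_, o)| o.should_promote_literals));
}
```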
/// Merge this generic context with another, returning a new generic context that
/// contains type variables from both contexts.
pub(crate) fn merge(self, db: &'db dyn Db, other: Self) -> Self {
Self::from_typevar_instances(
Self::from_variables(
db,
self.variables(db)
self.variables_inner(db)
.iter()
.chain(other.variables(db).iter())
.copied(),
.chain(other.variables_inner(db).iter())
.map(|(bound_typevar, options)| (*bound_typevar, *options)),
)
}
pub(crate) fn variables(
self,
db: &'db dyn Db,
) -> impl ExactSizeIterator<Item = BoundTypeVarInstance<'db>> + Clone {
self.variables_inner(db).keys().copied()
}
fn variable_from_type_param(
db: &'db dyn Db,
index: &'db SemanticIndex<'db>,
@@ -247,7 +286,7 @@ impl<'db> GenericContext<'db> {
if variables.is_empty() {
return None;
}
Some(Self::new(db, variables))
Some(Self::from_typevar_instances(db, variables))
}
/// Creates a generic context from the legacy `TypeVar`s that appear in class's base class
@@ -263,18 +302,17 @@ impl<'db> GenericContext<'db> {
if variables.is_empty() {
return None;
}
Some(Self::new(db, variables))
Some(Self::from_typevar_instances(db, variables))
}
pub(crate) fn len(self, db: &'db dyn Db) -> usize {
self.variables(db).len()
self.variables_inner(db).len()
}
pub(crate) fn signature(self, db: &'db dyn Db) -> Signature<'db> {
let parameters = Parameters::new(
self.variables(db)
.iter()
.map(|typevar| Self::parameter_from_typevar(db, *typevar)),
.map(|typevar| Self::parameter_from_typevar(db, typevar)),
);
Signature::new(parameters, None)
}
@@ -309,8 +347,7 @@ impl<'db> GenericContext<'db> {
db: &'db dyn Db,
known_class: Option<KnownClass>,
) -> Specialization<'db> {
let partial =
self.specialize_partial(db, std::iter::repeat_n(None, self.variables(db).len()));
let partial = self.specialize_partial(db, std::iter::repeat_n(None, self.len(db)));
if known_class == Some(KnownClass::Tuple) {
Specialization::new(
db,
@@ -332,31 +369,24 @@ impl<'db> GenericContext<'db> {
db: &'db dyn Db,
typevar_to_type: &impl Fn(BoundTypeVarInstance<'db>) -> Type<'db>,
) -> Specialization<'db> {
let types = self
.variables(db)
.iter()
.map(|typevar| typevar_to_type(*typevar))
.collect();
let types = self.variables(db).map(typevar_to_type).collect();
self.specialize(db, types)
}
pub(crate) fn unknown_specialization(self, db: &'db dyn Db) -> Specialization<'db> {
let types = vec![Type::unknown(); self.variables(db).len()];
let types = vec![Type::unknown(); self.len(db)];
self.specialize(db, types.into())
}
/// Returns a tuple type of the typevars introduced by this generic context.
pub(crate) fn as_tuple(self, db: &'db dyn Db) -> Type<'db> {
Type::heterogeneous_tuple(
db,
self.variables(db)
.iter()
.map(|typevar| Type::TypeVar(*typevar)),
)
Type::heterogeneous_tuple(db, self.variables(db).map(Type::TypeVar))
}
pub(crate) fn is_subset_of(self, db: &'db dyn Db, other: GenericContext<'db>) -> bool {
self.variables(db).is_subset(other.variables(db))
let other_variables = other.variables_inner(db);
self.variables(db)
.all(|bound_typevar| other_variables.contains_key(&bound_typevar))
}
pub(crate) fn binds_typevar(
@@ -365,9 +395,7 @@ impl<'db> GenericContext<'db> {
typevar: TypeVarInstance<'db>,
) -> Option<BoundTypeVarInstance<'db>> {
self.variables(db)
.iter()
.find(|self_bound_typevar| self_bound_typevar.typevar(db) == typevar)
.copied()
}
/// Creates a specialization of this generic context. Panics if the length of `types` does not
@@ -379,7 +407,7 @@ impl<'db> GenericContext<'db> {
db: &'db dyn Db,
types: Box<[Type<'db>]>,
) -> Specialization<'db> {
assert!(self.variables(db).len() == types.len());
assert!(self.len(db) == types.len());
Specialization::new(db, self, types, None, None)
}
@@ -403,7 +431,7 @@ impl<'db> GenericContext<'db> {
{
let types = types.into_iter();
let variables = self.variables(db);
assert!(variables.len() == types.len());
assert!(self.len(db) == types.len());
// Typevars can have other typevars as their default values, e.g.
//
@@ -442,14 +470,15 @@ impl<'db> GenericContext<'db> {
pub(crate) fn normalized_impl(self, db: &'db dyn Db, visitor: &NormalizedVisitor<'db>) -> Self {
let variables = self
.variables(db)
.iter()
.map(|bound_typevar| bound_typevar.normalized_impl(db, visitor));
Self::from_typevar_instances(db, variables)
}
fn heap_size((variables,): &(FxOrderSet<BoundTypeVarInstance<'db>>,)) -> usize {
ruff_memory_usage::order_set_heap_size(variables)
fn heap_size(
(variables,): &(FxOrderMap<BoundTypeVarInstance<'db>, GenericContextTypeVarOptions>,),
) -> usize {
ruff_memory_usage::order_map_heap_size(variables)
}
}
@@ -620,22 +649,26 @@ fn has_relation_in_invariant_position<'db>(
base_type.has_relation_to_impl(db, *derived_type, relation, visitor)
}),
// For gradual types, A <: B (subtyping) is defined as Top[A] <: Bottom[B]
(None, Some(base_mat), TypeRelation::Subtyping) => is_subtype_in_invariant_position(
db,
derived_type,
MaterializationKind::Top,
base_type,
base_mat,
visitor,
),
(Some(derived_mat), None, TypeRelation::Subtyping) => is_subtype_in_invariant_position(
db,
derived_type,
derived_mat,
base_type,
MaterializationKind::Bottom,
visitor,
),
(None, Some(base_mat), TypeRelation::Subtyping | TypeRelation::Redundancy) => {
is_subtype_in_invariant_position(
db,
derived_type,
MaterializationKind::Top,
base_type,
base_mat,
visitor,
)
}
(Some(derived_mat), None, TypeRelation::Subtyping | TypeRelation::Redundancy) => {
is_subtype_in_invariant_position(
db,
derived_type,
derived_mat,
base_type,
MaterializationKind::Bottom,
visitor,
)
}
// And A <~ B (assignability) is Bottom[A] <: Top[B]
(None, Some(base_mat), TypeRelation::Assignability) => is_subtype_in_invariant_position(
db,
@@ -657,6 +690,31 @@ fn has_relation_in_invariant_position<'db>(
}
impl<'db> Specialization<'db> {
/// Restricts this specialization to only include the typevars in a generic context. If the
/// specialization does not include all of those typevars, returns `None`.
pub(crate) fn restrict(
self,
db: &'db dyn Db,
generic_context: GenericContext<'db>,
) -> Option<Self> {
let self_variables = self.generic_context(db).variables_inner(db);
let self_types = self.types(db);
let restricted_variables = generic_context.variables(db);
let restricted_types: Option<Box<[_]>> = restricted_variables
.map(|variable| {
let index = self_variables.get_index_of(&variable)?;
self_types.get(index).copied()
})
.collect();
Some(Self::new(
db,
generic_context,
restricted_types?,
self.materialization_kind(db),
None,
))
}
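The lookup-by-index logic in `restrict` can be sketched with plain slices: for each typevar of the smaller context, find its position in the original specialization and pick the type at that index, failing as a whole if any typevar is missing. Names and types here are simplified stand-ins, not the real interned structs:

```rust
// Sketch of Specialization::restrict over plain slices. Returns the
// restricted types in the order of `restricted_to`, or None if any
// typevar is absent from the original specialization.
fn restrict(
    variables: &[&'static str],
    types: &[&'static str],
    restricted_to: &[&'static str],
) -> Option<Vec<&'static str>> {
    restricted_to
        .iter()
        .map(|var| {
            // `?` inside the closure aborts just this element; `collect`
            // over Options aborts the whole restriction.
            let index = variables.iter().position(|v| v == var)?;
            types.get(index).copied()
        })
        .collect()
}

fn main() {
    let vars = ["T", "U", "V"];
    let tys = ["int", "str", "bytes"];
    // Restricting to a subset picks the matching types in the new order.
    assert_eq!(restrict(&vars, &tys, &["V", "T"]), Some(vec!["bytes", "int"]));
    // A typevar not present in the specialization yields None.
    assert_eq!(restrict(&vars, &tys, &["W"]), None);
}
```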
/// Returns the tuple spec for a specialization of the `tuple` class.
pub(crate) fn tuple(self, db: &'db dyn Db) -> Option<&'db TupleSpec<'db>> {
self.tuple_inner(db).map(|tuple_type| tuple_type.tuple(db))
@@ -671,7 +729,7 @@ impl<'db> Specialization<'db> {
) -> Option<Type<'db>> {
let index = self
.generic_context(db)
.variables(db)
.variables_inner(db)
.get_index_of(&bound_typevar)?;
self.types(db).get(index).copied()
}
@@ -809,7 +867,6 @@ impl<'db> Specialization<'db> {
let types: Box<[_]> = self
.generic_context(db)
.variables(db)
.into_iter()
.zip(self.types(db))
.map(|(bound_typevar, vartype)| {
match bound_typevar.variance(db) {
@@ -878,7 +935,7 @@ impl<'db> Specialization<'db> {
let other_materialization_kind = other.materialization_kind(db);
let mut result = ConstraintSet::from(true);
for ((bound_typevar, self_type), other_type) in (generic_context.variables(db).into_iter())
for ((bound_typevar, self_type), other_type) in (generic_context.variables(db))
.zip(self.types(db))
.zip(other.types(db))
{
@@ -929,7 +986,7 @@ impl<'db> Specialization<'db> {
}
let mut result = ConstraintSet::from(true);
for ((bound_typevar, self_type), other_type) in (generic_context.variables(db).into_iter())
for ((bound_typevar, self_type), other_type) in (generic_context.variables(db))
.zip(self.types(db))
.zip(other.types(db))
{
@@ -1001,7 +1058,7 @@ impl<'db> PartialSpecialization<'_, 'db> {
) -> Option<Type<'db>> {
let index = self
.generic_context
.variables(db)
.variables_inner(db)
.get_index_of(&bound_typevar)?;
self.types.get(index).copied()
}
@@ -1023,10 +1080,18 @@ impl<'db> SpecializationBuilder<'db> {
}
pub(crate) fn build(&mut self, generic_context: GenericContext<'db>) -> Specialization<'db> {
let types = generic_context
.variables(self.db)
.iter()
.map(|variable| self.types.get(variable).copied());
let types = (generic_context.variables_inner(self.db).iter()).map(|(variable, options)| {
let mut ty = self.types.get(variable).copied();
// When inferring a specialization for a generic class typevar from a constructor call,
// promote any typevars that are inferred as a literal to the corresponding instance
// type.
if options.should_promote_literals {
ty = ty.map(|ty| ty.promote_literals(self.db));
}
ty
});
// TODO Infer the tuple spec for a tuple type
generic_context.specialize_partial(self.db, types)
}
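The per-typevar promotion inside `build` can be sketched with a toy type enum: typevars flagged with `should_promote_literals` widen an inferred literal to its instance type, while unflagged ones keep the literal. The `Ty` enum below is an illustrative stand-in for the real `Type`:

```rust
// Sketch of literal promotion during specialization building: a
// Literal[1]-style type widens to `int` only when the typevar's
// options request it (e.g. inference from a constructor call).
#[derive(Clone, Debug, PartialEq, Eq)]
enum Ty {
    IntLiteral(i64),
    StrLiteral(&'static str),
    Int,
    Str,
}

impl Ty {
    fn promote_literals(self) -> Ty {
        match self {
            Ty::IntLiteral(_) => Ty::Int,
            Ty::StrLiteral(_) => Ty::Str,
            other => other,
        }
    }
}

/// Each pair is (inferred type, should_promote_literals).
fn build(inferred: &[(Ty, bool)]) -> Vec<Ty> {
    inferred
        .iter()
        .map(|(ty, should_promote_literals)| {
            if *should_promote_literals {
                ty.clone().promote_literals()
            } else {
                ty.clone()
            }
        })
        .collect()
}

fn main() {
    let spec = build(&[(Ty::IntLiteral(1), true), (Ty::StrLiteral("x"), false)]);
    assert_eq!(spec, vec![Ty::Int, Ty::StrLiteral("x")]);
}
```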

View File

@@ -1,4 +1,4 @@
use std::iter;
use std::{iter, mem};
use itertools::{Either, Itertools};
use ruff_db::diagnostic::{Annotation, DiagnosticId, Severity};
@@ -44,6 +44,7 @@ use crate::semantic_index::symbol::{ScopedSymbolId, Symbol};
use crate::semantic_index::{
ApplicableConstraints, EnclosingSnapshotResult, SemanticIndex, place_table,
};
use crate::types::call::bind::MatchingOverloadIndex;
use crate::types::call::{Binding, Bindings, CallArguments, CallError, CallErrorKind};
use crate::types::class::{CodeGeneratorKind, FieldKind, MetaclassErrorKind, MethodDecorator};
use crate::types::context::{InNoTypeCheck, InferContext};
@@ -88,13 +89,13 @@ use crate::types::typed_dict::{
};
use crate::types::visitor::any_over_type;
use crate::types::{
BoundTypeVarInstance, CallDunderError, CallableType, ClassLiteral, ClassType, DataclassParams,
CallDunderError, CallableBinding, CallableType, ClassLiteral, ClassType, DataclassParams,
DynamicType, IntersectionBuilder, IntersectionType, KnownClass, KnownInstanceType,
MemberLookupPolicy, MetaclassCandidate, PEP695TypeAliasType, Parameter, ParameterForm,
Parameters, SpecialFormType, SubclassOfType, TrackedConstraintSet, Truthiness, Type,
TypeAliasType, TypeAndQualifiers, TypeContext, TypeQualifiers,
TypeVarBoundOrConstraintsEvaluation, TypeVarDefaultEvaluation, TypeVarInstance, TypeVarKind,
UnionBuilder, UnionType, binding_type, todo_type,
TypedDictType, UnionBuilder, UnionType, binding_type, todo_type,
};
use crate::types::{ClassBase, add_inferred_python_version_hint_to_diagnostic};
use crate::unpack::{EvaluationMode, UnpackPosition};
@@ -258,6 +259,8 @@ pub(super) struct TypeInferenceBuilder<'db, 'ast> {
/// is a stub file but we're still in a non-deferred region.
deferred_state: DeferredExpressionState,
multi_inference_state: MultiInferenceState,
/// For function definitions, the undecorated type of the function.
undecorated_type: Option<Type<'db>>,
@@ -288,10 +291,11 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
context: InferContext::new(db, scope, module),
index,
region,
scope,
return_types_and_ranges: vec![],
called_functions: FxHashSet::default(),
deferred_state: DeferredExpressionState::None,
scope,
multi_inference_state: MultiInferenceState::Panic,
expressions: FxHashMap::default(),
bindings: VecMap::default(),
declarations: VecMap::default(),
@@ -2141,10 +2145,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
deprecated,
dataclass_transformer_params,
);
let inherited_generic_context = None;
let function_literal =
FunctionLiteral::new(self.db(), overload_literal, inherited_generic_context);
let function_literal = FunctionLiteral::new(self.db(), overload_literal);
let mut inferred_ty =
Type::FunctionLiteral(FunctionType::new(self.db(), function_literal, None, None));
@@ -4915,6 +4916,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
self.infer_expression(expression, TypeContext::default())
}
/// Infer the argument types for a single binding.
fn infer_argument_types<'a>(
&mut self,
ast_arguments: &ast::Arguments,
@@ -4924,22 +4926,155 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
debug_assert!(
ast_arguments.len() == arguments.len() && arguments.len() == argument_forms.len()
);
let iter = (arguments.iter_mut())
.zip(argument_forms.iter().copied())
.zip(ast_arguments.arguments_source_order());
for (((_, argument_type), form), arg_or_keyword) in iter {
let argument = match arg_or_keyword {
// We already inferred the type of splatted arguments.
let iter = itertools::izip!(
arguments.iter_mut(),
argument_forms.iter().copied(),
ast_arguments.arguments_source_order()
);
for ((_, argument_type), argument_form, ast_argument) in iter {
let argument = match ast_argument {
// Splatted arguments are inferred before parameter matching to
// determine their length.
ast::ArgOrKeyword::Arg(ast::Expr::Starred(_))
| ast::ArgOrKeyword::Keyword(ast::Keyword { arg: None, .. }) => continue,
ast::ArgOrKeyword::Arg(arg) => arg,
ast::ArgOrKeyword::Keyword(ast::Keyword { value, .. }) => value,
};
let ty = self.infer_argument_type(argument, form, TypeContext::default());
let ty = self.infer_argument_type(argument, argument_form, TypeContext::default());
*argument_type = Some(ty);
}
}
/// Infer the argument types for multiple potential bindings and overloads.
fn infer_all_argument_types<'a>(
&mut self,
ast_arguments: &ast::Arguments,
arguments: &mut CallArguments<'a, 'db>,
bindings: &Bindings<'db>,
) {
debug_assert!(
ast_arguments.len() == arguments.len()
&& arguments.len() == bindings.argument_forms().len()
);
let iter = itertools::izip!(
0..,
arguments.iter_mut(),
bindings.argument_forms().iter().copied(),
ast_arguments.arguments_source_order()
);
let overloads_with_binding = bindings
.into_iter()
.filter_map(|binding| {
match binding.matching_overload_index() {
MatchingOverloadIndex::Single(_) | MatchingOverloadIndex::Multiple(_) => {
let overloads = binding
.matching_overloads()
.map(move |(_, overload)| (overload, binding));
Some(Either::Right(overloads))
}
// If there is a single overload that does not match, we still infer the argument
// types for better diagnostics.
MatchingOverloadIndex::None => match binding.overloads() {
[overload] => Some(Either::Left(std::iter::once((overload, binding)))),
_ => None,
},
}
})
.flatten();
for (argument_index, (_, argument_type), argument_form, ast_argument) in iter {
let ast_argument = match ast_argument {
// Splatted arguments are inferred before parameter matching to
// determine their length.
//
// TODO: Re-infer splatted arguments with their type context.
ast::ArgOrKeyword::Arg(ast::Expr::Starred(_))
| ast::ArgOrKeyword::Keyword(ast::Keyword { arg: None, .. }) => continue,
ast::ArgOrKeyword::Arg(arg) => arg,
ast::ArgOrKeyword::Keyword(ast::Keyword { value, .. }) => value,
};
// Type-form arguments are inferred without type context, so we can infer the argument type directly.
if let Some(ParameterForm::Type) = argument_form {
*argument_type = Some(self.infer_type_expression(ast_argument));
continue;
}
// Retrieve the parameter type for the current argument in a given overload and its binding.
let parameter_type = |overload: &Binding<'db>, binding: &CallableBinding<'db>| {
let argument_index = if binding.bound_type.is_some() {
argument_index + 1
} else {
argument_index
};
let argument_matches = &overload.argument_matches()[argument_index];
let [parameter_index] = argument_matches.parameters.as_slice() else {
return None;
};
overload.signature.parameters()[*parameter_index].annotated_type()
};
// If there is only a single binding and overload, we can infer the argument directly with
// the unique parameter type annotation.
if let Ok((overload, binding)) = overloads_with_binding.clone().exactly_one() {
self.infer_expression_impl(
ast_argument,
TypeContext::new(parameter_type(overload, binding)),
);
} else {
// Otherwise, each type is a valid independent inference of the given argument, and we may
// require different permutations of argument types to correctly perform argument expansion
// during overload evaluation, so we take the intersection of all the types we inferred for
// each argument.
//
// Note that this applies to all nested expressions within each argument.
let old_multi_inference_state = mem::replace(
&mut self.multi_inference_state,
MultiInferenceState::Intersect,
);
// We perform inference once without any type context, emitting any diagnostics that are unrelated
// to bidirectional type inference.
self.infer_expression_impl(ast_argument, TypeContext::default());
// We then silence any diagnostics emitted during multi-inference, as the type context is only
// used as a hint to infer a more assignable argument type, and should not lead to diagnostics
// for non-matching overloads.
let was_in_multi_inference = self.context.set_multi_inference(true);
// Infer the type of each argument once with each distinct parameter type as type context.
let parameter_types = overloads_with_binding
.clone()
.filter_map(|(overload, binding)| parameter_type(overload, binding))
.collect::<FxHashSet<_>>();
for parameter_type in parameter_types {
self.infer_expression_impl(
ast_argument,
TypeContext::new(Some(parameter_type)),
);
}
// Restore the multi-inference state.
self.multi_inference_state = old_multi_inference_state;
self.context.set_multi_inference(was_in_multi_inference);
}
*argument_type = self.try_expression_type(ast_argument);
}
}
fn infer_argument_type(
&mut self,
ast_argument: &ast::Expr,
@@ -4960,6 +5095,15 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
expression.map(|expr| self.infer_expression(expr, tcx))
}
fn get_or_infer_expression(
&mut self,
expression: &ast::Expr,
tcx: TypeContext<'db>,
) -> Type<'db> {
self.try_expression_type(expression)
.unwrap_or_else(|| self.infer_expression(expression, tcx))
}
#[track_caller]
fn infer_expression(&mut self, expression: &ast::Expr, tcx: TypeContext<'db>) -> Type<'db> {
debug_assert!(
@@ -5020,6 +5164,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
types.expression_type(expression)
}
/// Infer the type of an expression.
fn infer_expression_impl(
&mut self,
expression: &ast::Expr,
@@ -5055,7 +5200,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
ast::Expr::Compare(compare) => self.infer_compare_expression(compare),
ast::Expr::Subscript(subscript) => self.infer_subscript_expression(subscript),
ast::Expr::Slice(slice) => self.infer_slice_expression(slice),
ast::Expr::Named(named) => self.infer_named_expression(named),
ast::Expr::If(if_expression) => self.infer_if_expression(if_expression),
ast::Expr::Lambda(lambda_expression) => self.infer_lambda_expression(lambda_expression),
ast::Expr::Call(call_expression) => self.infer_call_expression(call_expression, tcx),
@@ -5063,6 +5207,16 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
ast::Expr::Yield(yield_expression) => self.infer_yield_expression(yield_expression),
ast::Expr::YieldFrom(yield_from) => self.infer_yield_from_expression(yield_from),
ast::Expr::Await(await_expression) => self.infer_await_expression(await_expression),
ast::Expr::Named(named) => {
// Definitions must be unique, so we bypass multi-inference for named expressions.
if !self.multi_inference_state.is_panic()
&& let Some(ty) = self.expressions.get(&expression.into())
{
return *ty;
}
self.infer_named_expression(named)
}
ast::Expr::IpyEscapeCommand(_) => {
todo_type!("Ipy escape command support")
}
@@ -5072,6 +5226,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
ty
}
fn store_expression_type(&mut self, expression: &ast::Expr, ty: Type<'db>) {
if self.deferred_state.in_string_annotation() {
// Avoid storing the type of expressions that are part of a string annotation because
@@ -5079,8 +5234,24 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// on the string expression itself that represents the annotation.
return;
}
let previous = self.expressions.insert(expression.into(), ty);
assert_eq!(previous, None);
let db = self.db();
match self.multi_inference_state {
MultiInferenceState::Panic => {
let previous = self.expressions.insert(expression.into(), ty);
assert_eq!(previous, None);
}
MultiInferenceState::Intersect => {
self.expressions
.entry(expression.into())
.and_modify(|current| {
*current = IntersectionType::from_elements(db, [*current, ty]);
})
.or_insert(ty);
}
}
}
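The two `MultiInferenceState` branches above can be sketched with a plain map: `Panic` asserts each expression is inferred exactly once, while `Intersect` folds repeated inferences together. Here the "intersection" is modeled as simply accumulating candidate types in a `Vec`, standing in for `IntersectionType::from_elements`:

```rust
use std::collections::HashMap;

// Sketch of store_expression_type under both multi-inference states.
#[derive(Clone, Copy)]
enum MultiInferenceState {
    Panic,
    Intersect,
}

fn store(
    exprs: &mut HashMap<u32, Vec<&'static str>>,
    state: MultiInferenceState,
    id: u32,
    ty: &'static str,
) {
    match state {
        MultiInferenceState::Panic => {
            // Single-shot inference: a second store for the same
            // expression is a bug.
            let previous = exprs.insert(id, vec![ty]);
            assert!(previous.is_none(), "expression inferred twice");
        }
        MultiInferenceState::Intersect => {
            // Fold this inference into any previously stored one.
            exprs
                .entry(id)
                .and_modify(|current| current.push(ty))
                .or_insert_with(|| vec![ty]);
        }
    }
}

fn main() {
    let mut exprs = HashMap::new();
    store(&mut exprs, MultiInferenceState::Intersect, 0, "list[int]");
    store(&mut exprs, MultiInferenceState::Intersect, 0, "list[str]");
    // Both candidate inferences are retained for the same expression.
    assert_eq!(exprs[&0], vec!["list[int]", "list[str]"]);
    store(&mut exprs, MultiInferenceState::Panic, 1, "int");
    assert_eq!(exprs[&1], vec!["int"]);
}
```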
fn infer_number_literal_expression(&mut self, literal: &ast::ExprNumberLiteral) -> Type<'db> {
@@ -5301,31 +5472,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
} = dict;
// Validate `TypedDict` dictionary literal assignments.
if let Some(typed_dict) = tcx.annotation.and_then(Type::into_typed_dict) {
let typed_dict_items = typed_dict.items(self.db());
for item in items {
self.infer_optional_expression(item.key.as_ref(), TypeContext::default());
if let Some(ast::Expr::StringLiteral(ref key)) = item.key
&& let Some(key) = key.as_single_part_string()
&& let Some(field) = typed_dict_items.get(key.as_str())
{
self.infer_expression(&item.value, TypeContext::new(Some(field.declared_ty)));
} else {
self.infer_expression(&item.value, TypeContext::default());
}
}
validate_typed_dict_dict_literal(
&self.context,
typed_dict,
dict,
dict.into(),
|expr| self.expression_type(expr),
);
return Type::TypedDict(typed_dict);
if let Some(typed_dict) = tcx.annotation.and_then(Type::into_typed_dict)
&& let Some(ty) = self.infer_typed_dict_expression(dict, typed_dict)
{
return ty;
}
// Avoid false positives for the functional `TypedDict` form, which is currently
@@ -5346,6 +5496,39 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
})
}
fn infer_typed_dict_expression(
&mut self,
dict: &ast::ExprDict,
typed_dict: TypedDictType<'db>,
) -> Option<Type<'db>> {
let ast::ExprDict {
range: _,
node_index: _,
items,
} = dict;
let typed_dict_items = typed_dict.items(self.db());
for item in items {
self.infer_optional_expression(item.key.as_ref(), TypeContext::default());
if let Some(ast::Expr::StringLiteral(ref key)) = item.key
&& let Some(key) = key.as_single_part_string()
&& let Some(field) = typed_dict_items.get(key.as_str())
{
self.infer_expression(&item.value, TypeContext::new(Some(field.declared_ty)));
} else {
self.infer_expression(&item.value, TypeContext::default());
}
}
validate_typed_dict_dict_literal(&self.context, typed_dict, dict, dict.into(), |expr| {
self.expression_type(expr)
})
.ok()
.map(|_| Type::TypedDict(typed_dict))
}
// Infer the type of a collection literal expression.
fn infer_collection_literal<'expr, const N: usize>(
&mut self,
@@ -5354,16 +5537,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
collection_class: KnownClass,
) -> Option<Type<'db>> {
// Extract the type variable `T` from `list[T]` in typeshed.
fn elt_tys(
collection_class: KnownClass,
db: &dyn Db,
) -> Option<(ClassLiteral<'_>, &FxOrderSet<BoundTypeVarInstance<'_>>)> {
let class_literal = collection_class.try_to_class_literal(db)?;
let generic_context = class_literal.generic_context(db)?;
Some((class_literal, generic_context.variables(db)))
}
let elt_tys = |collection_class: KnownClass| {
let class_literal = collection_class.try_to_class_literal(self.db())?;
let generic_context = class_literal.generic_context(self.db())?;
Some((class_literal, generic_context.variables(self.db())))
};
let (class_literal, elt_tys) = elt_tys(collection_class, self.db()).unwrap_or_else(|| {
let (class_literal, elt_tys) = elt_tys(collection_class).unwrap_or_else(|| {
let name = collection_class.name(self.db());
panic!("Typeshed should always have a `{name}` class in `builtins.pyi`")
});
@@ -5382,9 +5562,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// Note that we infer the annotated type _before_ the elements, to more closely match the
// order of any unions as written in the type annotation.
Some(annotated_elt_tys) => {
for (elt_ty, annotated_elt_ty) in iter::zip(elt_tys, annotated_elt_tys) {
for (elt_ty, annotated_elt_ty) in iter::zip(elt_tys.clone(), annotated_elt_tys) {
builder
.infer(Type::TypeVar(*elt_ty), *annotated_elt_ty)
.infer(Type::TypeVar(elt_ty), *annotated_elt_ty)
.ok()?;
}
}
@@ -5392,10 +5572,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// If a valid type annotation was not provided, avoid restricting the type of the collection
// by unioning the inferred type with `Unknown`.
None => {
for elt_ty in elt_tys {
builder
.infer(Type::TypeVar(*elt_ty), Type::unknown())
.ok()?;
for elt_ty in elt_tys.clone() {
builder.infer(Type::TypeVar(elt_ty), Type::unknown()).ok()?;
}
}
}
@@ -5408,17 +5586,17 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
for elts in elts {
// An unpacking expression for a dictionary.
if let &[None, Some(value)] = elts.as_slice() {
let inferred_value_ty = self.infer_expression(value, TypeContext::default());
let inferred_value_ty = self.get_or_infer_expression(value, TypeContext::default());
// Merge the inferred type of the nested dictionary.
if let Some(specialization) =
inferred_value_ty.known_specialization(KnownClass::Dict, self.db())
{
for (elt_ty, inferred_elt_ty) in
iter::zip(elt_tys, specialization.types(self.db()))
iter::zip(elt_tys.clone(), specialization.types(self.db()))
{
builder
.infer(Type::TypeVar(*elt_ty), *inferred_elt_ty)
.infer(Type::TypeVar(elt_ty), *inferred_elt_ty)
.ok()?;
}
}
@@ -5427,18 +5605,17 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
// The inferred type of each element acts as an additional constraint on `T`.
for (elt, elt_ty, elt_tcx) in itertools::izip!(elts, elt_tys, elt_tcxs.clone()) {
let Some(inferred_elt_ty) = self.infer_optional_expression(elt, elt_tcx) else {
continue;
};
for (elt, elt_ty, elt_tcx) in itertools::izip!(elts, elt_tys.clone(), elt_tcxs.clone())
{
let Some(elt) = elt else { continue };
let inferred_elt_ty = self.get_or_infer_expression(elt, elt_tcx);
// Convert any element literals to their promoted type form to avoid excessively large
// unions for large nested list literals, which the constraint solver struggles with.
let inferred_elt_ty = inferred_elt_ty.promote_literals(self.db());
builder
.infer(Type::TypeVar(*elt_ty), inferred_elt_ty)
.ok()?;
builder.infer(Type::TypeVar(elt_ty), inferred_elt_ty).ok()?;
}
}
@@ -5977,7 +6154,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let bindings = callable_type
.bindings(self.db())
.match_parameters(self.db(), &call_arguments);
self.infer_argument_types(arguments, &mut call_arguments, bindings.argument_forms());
self.infer_all_argument_types(arguments, &mut call_arguments, &bindings);
// Validate `TypedDict` constructor calls after argument type inference
if let Some(class_literal) = callable_type.into_class_literal() {
@@ -9012,7 +9189,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
})
.collect();
typevars.map(|typevars| GenericContext::new(self.db(), typevars))
typevars.map(|typevars| GenericContext::from_typevar_instances(self.db(), typevars))
}
fn infer_slice_expression(&mut self, slice: &ast::ExprSlice) -> Type<'db> {
@@ -9097,6 +9274,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// builder only state
typevar_binding_context: _,
deferred_state: _,
multi_inference_state: _,
called_functions: _,
index: _,
region: _,
@@ -9159,6 +9337,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// builder only state
typevar_binding_context: _,
deferred_state: _,
multi_inference_state: _,
called_functions: _,
index: _,
region: _,
@@ -9230,6 +9409,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// Builder only state
typevar_binding_context: _,
deferred_state: _,
multi_inference_state: _,
called_functions: _,
index: _,
region: _,
@@ -9275,6 +9455,26 @@ impl GenericContextError {
}
}
/// Dictates the behavior when an expression is inferred multiple times.
#[derive(Default, Debug, Clone, Copy)]
enum MultiInferenceState {
/// Panic if the expression has already been inferred.
#[default]
Panic,
/// Store the intersection of all types inferred for the expression.
Intersect,
}
impl MultiInferenceState {
fn is_panic(self) -> bool {
match self {
MultiInferenceState::Panic => true,
MultiInferenceState::Intersect => false,
}
}
}
/// The deferred state of a specific expression in an inference region.
#[derive(Default, Debug, Clone, Copy)]
enum DeferredExpressionState {
@@ -9548,7 +9748,7 @@ impl<K, V> Default for VecMap<K, V> {
/// Set based on a `Vec`. It doesn't enforce
/// uniqueness on insertion. Instead, it relies on the caller
/// that elements are uniuqe. For example, the way we visit definitions
/// that elements are unique. For example, the way we visit definitions
/// in the `TypeInference` builder make already implicitly guarantees that each definition
/// is only visited once.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
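The new `MultiInferenceState` in this file switches expression recording between a panic-on-duplicate assertion and an intersection merge via the `HashMap` entry API. A minimal, self-contained sketch of that dispatch (using a plain `HashMap<u32, String>` and string labels in place of ty's interned types, which are assumptions for illustration):

```rust
use std::collections::HashMap;

// Simplified model of the two multi-inference strategies: `Panic`
// asserts an expression is only inferred once, while `Intersect`
// merges repeated results for the same expression.
#[derive(Clone, Copy)]
enum MultiInferenceState {
    Panic,
    Intersect,
}

fn record_type(
    state: MultiInferenceState,
    types: &mut HashMap<u32, String>,
    expr_id: u32,
    ty: &str,
) {
    match state {
        MultiInferenceState::Panic => {
            // Inserting twice for the same expression is a bug in this mode.
            let previous = types.insert(expr_id, ty.to_string());
            assert!(previous.is_none(), "expression inferred twice");
        }
        MultiInferenceState::Intersect => {
            // Merge with any previously inferred type, or store the first one.
            types
                .entry(expr_id)
                .and_modify(|current| *current = format!("{current} & {ty}"))
                .or_insert_with(|| ty.to_string());
        }
    }
}

fn main() {
    let mut types = HashMap::new();
    record_type(MultiInferenceState::Intersect, &mut types, 1, "int");
    record_type(MultiInferenceState::Intersect, &mut types, 1, "bool");
    assert_eq!(types[&1], "int & bool");
}
```

The `entry().and_modify().or_insert_with()` chain is what lets the `Intersect` arm avoid a separate lookup before deciding whether to merge or insert.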


@@ -102,12 +102,13 @@ impl<'db> CallableSignature<'db> {
pub(crate) fn with_inherited_generic_context(
&self,
inherited_generic_context: Option<GenericContext<'db>>,
db: &'db dyn Db,
inherited_generic_context: GenericContext<'db>,
) -> Self {
Self::from_overloads(self.overloads.iter().map(|signature| {
signature
.clone()
.with_inherited_generic_context(inherited_generic_context)
.with_inherited_generic_context(db, inherited_generic_context)
}))
}
@@ -301,11 +302,6 @@ pub struct Signature<'db> {
/// The generic context for this overload, if it is generic.
pub(crate) generic_context: Option<GenericContext<'db>>,
/// The inherited generic context, if this function is a class method being used to infer the
/// specialization of its generic class. If the method is itself generic, this is in addition
/// to its own generic context.
pub(crate) inherited_generic_context: Option<GenericContext<'db>>,
/// The original definition associated with this function, if available.
/// This is useful for locating and extracting docstring information for the signature.
pub(crate) definition: Option<Definition<'db>>,
@@ -332,9 +328,6 @@ pub(super) fn walk_signature<'db, V: super::visitor::TypeVisitor<'db> + ?Sized>(
if let Some(generic_context) = &signature.generic_context {
walk_generic_context(db, *generic_context, visitor);
}
if let Some(inherited_generic_context) = &signature.inherited_generic_context {
walk_generic_context(db, *inherited_generic_context, visitor);
}
// By default we usually don't visit the type of the default value,
// as it isn't relevant to most things
for parameter in &signature.parameters {
@@ -351,7 +344,6 @@ impl<'db> Signature<'db> {
pub(crate) fn new(parameters: Parameters<'db>, return_ty: Option<Type<'db>>) -> Self {
Self {
generic_context: None,
inherited_generic_context: None,
definition: None,
parameters,
return_ty,
@@ -365,7 +357,6 @@ impl<'db> Signature<'db> {
) -> Self {
Self {
generic_context,
inherited_generic_context: None,
definition: None,
parameters,
return_ty,
@@ -376,7 +367,6 @@ impl<'db> Signature<'db> {
pub(crate) fn dynamic(signature_type: Type<'db>) -> Self {
Signature {
generic_context: None,
inherited_generic_context: None,
definition: None,
parameters: Parameters::gradual_form(),
return_ty: Some(signature_type),
@@ -389,7 +379,6 @@ impl<'db> Signature<'db> {
let signature_type = todo_type!(reason);
Signature {
generic_context: None,
inherited_generic_context: None,
definition: None,
parameters: Parameters::todo(),
return_ty: Some(signature_type),
@@ -400,7 +389,6 @@ impl<'db> Signature<'db> {
pub(super) fn from_function(
db: &'db dyn Db,
generic_context: Option<GenericContext<'db>>,
inherited_generic_context: Option<GenericContext<'db>>,
definition: Definition<'db>,
function_node: &ast::StmtFunctionDef,
is_generator: bool,
@@ -434,7 +422,6 @@ impl<'db> Signature<'db> {
(Some(legacy_ctx), Some(ctx)) => {
if legacy_ctx
.variables(db)
.iter()
.exactly_one()
.is_ok_and(|bound_typevar| bound_typevar.typevar(db).is_self(db))
{
@@ -449,7 +436,6 @@ impl<'db> Signature<'db> {
Self {
generic_context: full_generic_context,
inherited_generic_context,
definition: Some(definition),
parameters,
return_ty,
@@ -468,9 +454,17 @@ impl<'db> Signature<'db> {
pub(crate) fn with_inherited_generic_context(
mut self,
inherited_generic_context: Option<GenericContext<'db>>,
db: &'db dyn Db,
inherited_generic_context: GenericContext<'db>,
) -> Self {
self.inherited_generic_context = inherited_generic_context;
match self.generic_context.as_mut() {
Some(generic_context) => {
*generic_context = generic_context.merge(db, inherited_generic_context);
}
None => {
self.generic_context = Some(inherited_generic_context);
}
}
self
}
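The hunk above replaces the stored `inherited_generic_context` field with an eager merge into `generic_context`. A hypothetical sketch of that merge-or-set shape, using `Vec<&str>` as a stand-in for `GenericContext` (an assumption for illustration, not ty's real API):

```rust
// Merge-or-set: an inherited context is merged into an existing
// generic context, or adopted wholesale when there is none.
fn with_inherited(
    current: Option<Vec<&'static str>>,
    inherited: Vec<&'static str>,
) -> Vec<&'static str> {
    match current {
        // The method's own typevars come first, then the inherited ones.
        Some(mut ctx) => {
            ctx.extend(inherited);
            ctx
        }
        // No own context: the inherited context becomes the context.
        None => inherited,
    }
}

fn main() {
    assert_eq!(with_inherited(Some(vec!["T"]), vec!["S"]), vec!["T", "S"]);
    assert_eq!(with_inherited(None, vec!["S"]), vec!["S"]);
}
```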
@@ -483,9 +477,6 @@ impl<'db> Signature<'db> {
generic_context: self
.generic_context
.map(|ctx| ctx.normalized_impl(db, visitor)),
inherited_generic_context: self
.inherited_generic_context
.map(|ctx| ctx.normalized_impl(db, visitor)),
// Discard the definition when normalizing, so that two equivalent signatures
// with different `Definition`s share the same Salsa ID when normalized
definition: None,
@@ -516,7 +507,6 @@ impl<'db> Signature<'db> {
generic_context: self
.generic_context
.map(|context| type_mapping.update_signature_generic_context(db, context)),
inherited_generic_context: self.inherited_generic_context,
definition: self.definition,
parameters: self
.parameters
@@ -571,7 +561,6 @@ impl<'db> Signature<'db> {
}
Self {
generic_context: self.generic_context,
inherited_generic_context: self.inherited_generic_context,
definition: self.definition,
parameters,
return_ty,
@@ -1236,10 +1225,7 @@ impl<'db> Parameters<'db> {
let method_has_self_in_generic_context =
method.signature(db).overloads.iter().any(|s| {
s.generic_context.is_some_and(|context| {
context
.variables(db)
.iter()
.any(|v| v.typevar(db).is_self(db))
context.variables(db).any(|v| v.typevar(db).is_self(db))
})
});
@@ -1882,7 +1868,7 @@ mod tests {
.literal(&db)
.last_definition(&db);
let sig = func.signature(&db, None);
let sig = func.signature(&db);
assert!(sig.return_ty.is_none());
assert_params(&sig, &[]);
@@ -1907,7 +1893,7 @@ mod tests {
.literal(&db)
.last_definition(&db);
let sig = func.signature(&db, None);
let sig = func.signature(&db);
assert_eq!(sig.return_ty.unwrap().display(&db).to_string(), "bytes");
assert_params(
@@ -1959,7 +1945,7 @@ mod tests {
.literal(&db)
.last_definition(&db);
let sig = func.signature(&db, None);
let sig = func.signature(&db);
let [
Parameter {
@@ -1997,7 +1983,7 @@ mod tests {
.literal(&db)
.last_definition(&db);
let sig = func.signature(&db, None);
let sig = func.signature(&db);
let [
Parameter {
@@ -2035,7 +2021,7 @@ mod tests {
.literal(&db)
.last_definition(&db);
let sig = func.signature(&db, None);
let sig = func.signature(&db);
let [
Parameter {
@@ -2079,7 +2065,7 @@ mod tests {
.literal(&db)
.last_definition(&db);
let sig = func.signature(&db, None);
let sig = func.signature(&db);
let [
Parameter {
@@ -2116,7 +2102,7 @@ mod tests {
let func = get_function_f(&db, "/src/a.py");
let overload = func.literal(&db).last_definition(&db);
let expected_sig = overload.signature(&db, None);
let expected_sig = overload.signature(&db);
// With no decorators, internal and external signature are the same
assert_eq!(


@@ -138,7 +138,7 @@ impl<'db> SubclassOfType<'db> {
) -> ConstraintSet<'db> {
match (self.subclass_of, other.subclass_of) {
(SubclassOfInner::Dynamic(_), SubclassOfInner::Dynamic(_)) => {
ConstraintSet::from(relation.is_assignability())
ConstraintSet::from(!relation.is_subtyping())
}
(SubclassOfInner::Dynamic(_), SubclassOfInner::Class(other_class)) => {
ConstraintSet::from(other_class.is_object(db) || relation.is_assignability())


@@ -757,8 +757,7 @@ impl<'db> VariableLengthTuple<Type<'db>> {
// (or any other dynamic type), then the `...` is the _gradual choice_ of all
// possible lengths. This means that `tuple[Any, ...]` can match any tuple of any
// length.
if relation == TypeRelation::Subtyping || !matches!(self.variable, Type::Dynamic(_))
{
if !relation.is_assignability() || !self.variable.is_dynamic() {
return ConstraintSet::from(false);
}
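The two hunks above rewrite relation checks into their negated counterparts (`relation.is_assignability()` becomes `!relation.is_subtyping()`, and `relation == TypeRelation::Subtyping` becomes `!relation.is_assignability()`). Assuming `TypeRelation` has exactly the two variants shown here (a simplification of ty's actual type), each pair of forms coincides:

```rust
// Hypothetical two-variant model: with exactly these variants,
// `!relation.is_subtyping()` equals `relation.is_assignability()`,
// which is what makes the rewrites above behavior-preserving.
#[derive(Debug, Clone, Copy, PartialEq)]
enum TypeRelation {
    Subtyping,
    Assignability,
}

impl TypeRelation {
    fn is_subtyping(self) -> bool {
        matches!(self, TypeRelation::Subtyping)
    }

    fn is_assignability(self) -> bool {
        matches!(self, TypeRelation::Assignability)
    }
}

fn main() {
    // Exhaustively check the equivalence over both variants.
    for relation in [TypeRelation::Subtyping, TypeRelation::Assignability] {
        assert_eq!(!relation.is_subtyping(), relation.is_assignability());
    }
}
```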


@@ -132,7 +132,8 @@ impl TypedDictAssignmentKind {
}
/// Validates assignment of a value to a specific key on a `TypedDict`.
/// Returns true if the assignment is valid, false otherwise.
///
/// Returns true if the assignment is valid, or false otherwise.
#[allow(clippy::too_many_arguments)]
pub(super) fn validate_typed_dict_key_assignment<'db, 'ast>(
context: &InferContext<'db, 'ast>,
@@ -157,6 +158,7 @@ pub(super) fn validate_typed_dict_key_assignment<'db, 'ast>(
Type::string_literal(db, key),
&items,
);
return false;
};
@@ -240,13 +242,16 @@ pub(super) fn validate_typed_dict_key_assignment<'db, 'ast>(
}
/// Validates that all required keys are provided in a `TypedDict` construction.
///
/// Reports errors for any keys that are required but not provided.
///
/// Returns true if the assignment is valid, or false otherwise.
pub(super) fn validate_typed_dict_required_keys<'db, 'ast>(
context: &InferContext<'db, 'ast>,
typed_dict: TypedDictType<'db>,
provided_keys: &OrderSet<&str>,
error_node: AnyNodeRef<'ast>,
) {
) -> bool {
let db = context.db();
let items = typed_dict.items(db);
@@ -255,7 +260,12 @@ pub(super) fn validate_typed_dict_required_keys<'db, 'ast>(
.filter_map(|(key_name, field)| field.is_required().then_some(key_name.as_str()))
.collect();
for missing_key in required_keys.difference(provided_keys) {
let missing_keys = required_keys.difference(provided_keys);
let mut has_missing_key = false;
for missing_key in missing_keys {
has_missing_key = true;
report_missing_typed_dict_key(
context,
error_node,
@@ -263,6 +273,8 @@ pub(super) fn validate_typed_dict_required_keys<'db, 'ast>(
missing_key,
);
}
!has_missing_key
}
pub(super) fn validate_typed_dict_constructor<'db, 'ast>(
@@ -373,7 +385,7 @@ fn validate_from_keywords<'db, 'ast>(
provided_keys
}
/// Validates a `TypedDict` dictionary literal assignment
/// Validates a `TypedDict` dictionary literal assignment,
/// e.g. `person: Person = {"name": "Alice", "age": 30}`
pub(super) fn validate_typed_dict_dict_literal<'db, 'ast>(
context: &InferContext<'db, 'ast>,
@@ -381,7 +393,8 @@ pub(super) fn validate_typed_dict_dict_literal<'db, 'ast>(
dict_expr: &'ast ast::ExprDict,
error_node: AnyNodeRef<'ast>,
expression_type_fn: impl Fn(&ast::Expr) -> Type<'db>,
) -> OrderSet<&'ast str> {
) -> Result<OrderSet<&'ast str>, OrderSet<&'ast str>> {
let mut valid = true;
let mut provided_keys = OrderSet::new();
// Validate each key-value pair in the dictionary literal
@@ -392,7 +405,8 @@ pub(super) fn validate_typed_dict_dict_literal<'db, 'ast>(
provided_keys.insert(key_str);
let value_type = expression_type_fn(&item.value);
validate_typed_dict_key_assignment(
valid &= validate_typed_dict_key_assignment(
context,
typed_dict,
key_str,
@@ -406,7 +420,11 @@ pub(super) fn validate_typed_dict_dict_literal<'db, 'ast>(
}
}
validate_typed_dict_required_keys(context, typed_dict, &provided_keys, error_node);
valid &= validate_typed_dict_required_keys(context, typed_dict, &provided_keys, error_node);
provided_keys
if valid {
Ok(provided_keys)
} else {
Err(provided_keys)
}
}
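The changes above make `validate_typed_dict_dict_literal` return `Result<OrderSet, OrderSet>`, carrying the provided keys in both arms so callers can distinguish valid from invalid literals without losing the key set. A simplified sketch of that contract, using `BTreeSet<&str>` in place of `OrderSet` (an assumption for illustration):

```rust
use std::collections::BTreeSet;

// Missing required keys invalidate the literal; the provided keys are
// returned either way, wrapped in `Ok` or `Err` to signal validity.
fn validate_required_keys(
    required: &BTreeSet<&'static str>,
    provided: BTreeSet<&'static str>,
) -> Result<BTreeSet<&'static str>, BTreeSet<&'static str>> {
    // Any required key absent from `provided` makes the literal invalid
    // (the real code also reports a diagnostic per missing key).
    let has_missing_key = required.difference(&provided).next().is_some();
    if has_missing_key {
        Err(provided)
    } else {
        Ok(provided)
    }
}

fn main() {
    let required = BTreeSet::from(["name", "age"]);
    assert!(validate_required_keys(&required, BTreeSet::from(["name", "age"])).is_ok());
    assert!(validate_required_keys(&required, BTreeSet::from(["name"])).is_err());
}
```

Returning the same payload in both arms lets a caller that only cares about the keys write `.ok()` or unwrap either arm, while callers that care about validity can branch on the `Result`.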


@@ -48,13 +48,16 @@ impl EnvVars {
/// (e.g. colons on Unix or semicolons on Windows).
pub const PYTHONPATH: &'static str = "PYTHONPATH";
/// Used to determine if an active Conda environment is the base environment or not.
/// Used to determine the name of the active Conda environment.
pub const CONDA_DEFAULT_ENV: &'static str = "CONDA_DEFAULT_ENV";
/// Used to detect an activated Conda environment location.
/// Used to detect the path of an active Conda environment.
/// If both `VIRTUAL_ENV` and `CONDA_PREFIX` are present, `VIRTUAL_ENV` will be preferred.
pub const CONDA_PREFIX: &'static str = "CONDA_PREFIX";
/// Used to determine the root install path of Conda.
pub const CONDA_ROOT: &'static str = "_CONDA_ROOT";
/// Filter which tests to run in mdtest.
///
/// Only tests whose names contain this filter string will be executed.


@@ -44,8 +44,8 @@ impl MarkdownTestConfig {
self.environment.as_ref()?.typeshed.as_deref()
}
pub(crate) fn extra_paths(&self) -> Option<&[SystemPathBuf]> {
self.environment.as_ref()?.extra_paths.as_deref()
pub(crate) fn non_environment_paths(&self) -> Option<&[SystemPathBuf]> {
self.environment.as_ref()?.non_environment_paths.as_deref()
}
pub(crate) fn python(&self) -> Option<&SystemPath> {
@@ -75,7 +75,7 @@ pub(crate) struct Environment {
pub(crate) typeshed: Option<SystemPathBuf>,
/// Additional search paths to consider when resolving modules.
pub(crate) extra_paths: Option<Vec<SystemPathBuf>>,
pub(crate) non_environment_paths: Option<Vec<SystemPathBuf>>,
/// Path to the Python environment.
///


@@ -303,7 +303,10 @@ fn run_test(
.unwrap_or(PythonPlatform::Identifier("linux".to_string())),
search_paths: SearchPathSettings {
src_roots: vec![src_path],
extra_paths: configuration.extra_paths().unwrap_or_default().to_vec(),
non_environment_paths: configuration
.non_environment_paths()
.unwrap_or_default()
.to_vec(),
custom_typeshed: custom_typeshed_path.map(SystemPath::to_path_buf),
site_packages_paths,
real_stdlib_path: None,


@@ -1,5 +1,5 @@
PyYAML==6.0.3
ruff==0.13.2
ruff==0.13.3
mkdocs==1.6.1
mkdocs-material @ git+ssh://git@github.com/astral-sh/mkdocs-material-insiders.git@39da7a5e761410349e9a1b8abf593b0cdd5453ff
mkdocs-redirects==1.2.2


@@ -1,5 +1,5 @@
PyYAML==6.0.3
ruff==0.13.2
ruff==0.13.3
mkdocs==1.6.1
mkdocs-material==9.5.38
mkdocs-redirects==1.2.2


@@ -10,7 +10,7 @@
"dependencies": {
"@miniflare/kv": "^2.14.0",
"@miniflare/storage-memory": "^2.14.0",
"uuid": "^11.0.0"
"uuid": "^13.0.0"
},
"devDependencies": {
"@cloudflare/workers-types": "^4.20230801.0",
@@ -138,7 +138,8 @@
"resolved": "https://registry.npmjs.org/@cloudflare/workers-types/-/workers-types-4.20250726.0.tgz",
"integrity": "sha512-NtM1yVBKJFX4LgSoZkVU0EDhWWvSb1vt6REO+uMYZRgx1HAfQz9GDN6bBB0B+fm2ZIxzt6FzlDbmrXpGJ2M/4Q==",
"dev": true,
"license": "MIT OR Apache-2.0"
"license": "MIT OR Apache-2.0",
"peer": true
},
"node_modules/@cspotcode/source-map-support": {
"version": "0.8.1",
@@ -1694,6 +1695,7 @@
"integrity": "sha512-B06u0wXkEd+o5gOCMl/ZHl5cfpYbDZKAT+HWTL+Hws6jWu7dCiqBBXXXzMFcFVJb8D4ytAnYmxJA83uwOQRSsg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"defu": "^6.1.4",
"exsolve": "^1.0.4",
@@ -1703,16 +1705,16 @@
}
},
"node_modules/uuid": {
"version": "11.1.0",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-11.1.0.tgz",
"integrity": "sha512-0/A9rDy9P7cJ+8w1c9WD9V//9Wj15Ce2MPz8Ri6032usz+NfePxx5AcN3bN+r6ZL6jEo066/yNYB3tn4pQEx+A==",
"version": "13.0.0",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-13.0.0.tgz",
"integrity": "sha512-XQegIaBTVUjSHliKqcnFqYypAd4S+WCYt5NIeRs6w/UAry7z8Y9j5ZwRRL4kzq9U3sD6v+85er9FvkEaBpji2w==",
"funding": [
"https://github.com/sponsors/broofa",
"https://github.com/sponsors/ctavan"
],
"license": "MIT",
"bin": {
"uuid": "dist/esm/bin/uuid"
"uuid": "dist-node/bin/uuid"
}
},
"node_modules/validate-npm-package-name": {
@@ -1747,6 +1749,7 @@
"dev": true,
"hasInstallScript": true,
"license": "Apache-2.0",
"peer": true,
"bin": {
"workerd": "bin/workerd"
},


@@ -15,6 +15,6 @@
"dependencies": {
"@miniflare/kv": "^2.14.0",
"@miniflare/storage-memory": "^2.14.0",
"uuid": "^11.0.0"
"uuid": "^13.0.0"
}
}


@@ -100,6 +100,10 @@ export default function Playground() {
setError("File names cannot start with '/'.");
return;
}
if (newName.startsWith("vendored:")) {
setError("File names cannot start with 'vendored:'.");
return;
}
const handle = files.handles[file];
let newHandle: FileHandle | null = null;

ty.schema.json generated

@@ -61,8 +61,8 @@
"EnvironmentOptions": {
"type": "object",
"properties": {
"extra-paths": {
"description": "List of user-provided paths that should take first priority in the module resolution. Examples in other type checkers are mypy's `MYPYPATH` environment variable, or pyright's `stubPath` configuration setting.",
"non-environment-paths": {
"description": "User-provided paths that should take first priority in module resolution.\n\nThis is an advanced option that should usually only be used for first-party or third-party modules that are not installed into your Python environment in a conventional way. Use the `python` option to specify the location of your Python environment.\n\nThis option is similar to mypy's `MYPYPATH` environment variable and pyright's `stubPath` configuration setting.",
"type": [
"array",
"null"
@@ -72,7 +72,7 @@
}
},
"python": {
"description": "Path to the Python installation from which ty resolves type information and third-party dependencies.\n\nty will search in the path's `site-packages` directories for type information and third-party imports.\n\nThis option is commonly used to specify the path to a virtual environment.",
"description": "Path to your project's Python environment or interpreter.\n\nty uses the `site-packages` directory of your project's Python environment to resolve third-party (and, in some cases, first-party) imports in your code.\n\nIf you're using a project management tool such as uv, you should not generally need to specify this option, as commands such as `uv run` will set the `VIRTUAL_ENV` environment variable to point to your project's virtual environment. ty can also infer the location of your environment from an activated Conda environment, and will look for a `.venv` directory in the project root if none of the above apply.\n\nPassing a path to a Python executable is supported, but passing a path to a dynamic executable (such as a shim) is not currently supported.\n\nThis option can be used to point to virtual or system Python environments.",
"type": [
"string",
"null"