Compare commits


59 Commits

Author SHA1 Message Date
David Peter
ad4945aed0 SQLAlchemy investigation 2025-12-08 15:30:58 +01:00
David Peter
4686111681 [ty] More SQLAlchemy test updates (#21846)
Minor updates to the SQLAlchemy test suite. I verified all expected
results using pyright.
2025-12-08 15:22:55 +01:00
Micha Reiser
4364ffbdd3 [ty] Don't create a related diagnostic for the primary annotation of sub-diagnostics (#21845) 2025-12-08 14:22:11 +00:00
Charlie Marsh
b845e81c4a Use memchr for computing line indexes (#21838)
## Summary

Some benchmarks with Claude's help:

| File | Size | Baseline | Optimized | Speedup |
|---------------------|-------|----------------------|----------------------|---------|
| numpy/globals.py | 3 KB | 1.48 µs (1.95 GiB/s) | 740 ns (3.89 GiB/s) | 2.0x |
| unicode/pypinyin.py | 4 KB | 2.04 µs (2.01 GiB/s) | 1.18 µs (3.49 GiB/s) | 1.7x |
| pydantic/types.py | 26 KB | 13.1 µs (1.90 GiB/s) | 5.88 µs (4.23 GiB/s) | 2.2x |
| numpy/ctypeslib.py | 17 KB | 8.45 µs (1.92 GiB/s) | 3.94 µs (4.13 GiB/s) | 2.1x |
| large/dataset.py | 41 KB | 21.6 µs (1.84 GiB/s) | 11.2 µs (3.55 GiB/s) | 1.9x |

I think that I originally thought we _had_ to iterate
character-by-character here because we needed to do the ASCII check, but
the ASCII check can be vectorized by LLVM (and the "search for newlines"
can be done with `memchr`).
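
As a rough illustration of the approach (the real implementation is Rust using `memchr`; this sketch uses Python, where `bytes.find` plays the role of a fast single-byte search):

```python
# Sketch only: compute line-start offsets by searching raw bytes for b"\n"
# instead of iterating character-by-character. `bytes.find` stands in for
# `memchr` here; the function name is illustrative, not ruff's actual API.
def line_starts(source: bytes) -> list[int]:
    starts = [0]
    pos = source.find(b"\n")
    while pos != -1:
        starts.append(pos + 1)  # the next line begins right after the newline
        pos = source.find(b"\n", pos + 1)
    return starts

print(line_starts(b"a\nbc\n"))  # [0, 2, 5]
```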
2025-12-08 08:50:51 -05:00
David Peter
c99e10eedc [ty] Increase SQLAlchemy test coverage (#21843)
## Summary

Increase our SQLAlchemy test coverage to make sure we understand
`Session.scalar`, `Session.scalars`, `Session.execute` (and their async
equivalents), as well as `Result.tuples`, `Result.one_or_none`,
`Row._tuple`.
2025-12-08 14:36:13 +01:00
Dhruv Manilawala
a364195335 [ty] Avoid diagnostic when typing_extensions.ParamSpec uses default parameter (#21839)
## Summary

fixes: https://github.com/astral-sh/ty/issues/1798

## Test Plan

Add mdtest.
2025-12-08 12:34:30 +00:00
David Peter
dfd6ed0524 [ty] mdtests with external dependencies (#20904)
## Summary

This PR adds the possibility to write mdtests that specify external
dependencies in a `project` section of TOML blocks. For example, here is
a test that makes sure that we understand Pydantic's dataclass-transform
setup:

````markdown
```toml
[environment]
python-version = "3.12"
python-platform = "linux"

[project]
dependencies = ["pydantic==2.12.2"]
```

```py
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str

user = User(id=1, name="Alice")
reveal_type(user.id)  # revealed: int
reveal_type(user.name)  # revealed: str

# error: [missing-argument] "No argument provided for required parameter `name`"
invalid_user = User(id=2)
```
````

## How?

Using the `python-version` and the `dependencies` fields from the
Markdown section, we generate a `pyproject.toml` file, write it to a
temporary directory, and use `uv sync` to install the dependencies into
a virtual environment. We then copy the Python source files from that
venv's `site-packages` folder to a corresponding directory structure in
the in-memory filesystem. Finally, we configure the search paths
accordingly, and run the mdtest as usual.
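
The first step can be sketched as follows (the function name and generated layout are hypothetical, not the actual mdtest code):

```python
# Sketch only: turn an mdtest's `python-version` and `dependencies` fields
# into a pyproject.toml suitable for `uv sync`. Names here are illustrative.
def make_pyproject(python_version: str, dependencies: list[str]) -> str:
    deps = ", ".join(f'"{dep}"' for dep in dependencies)
    return (
        "[project]\n"
        'name = "mdtest-env"\n'
        'version = "0.0.0"\n'
        f'requires-python = "=={python_version}.*"\n'
        f"dependencies = [{deps}]\n"
    )

print(make_pyproject("3.12", ["pydantic==2.12.2"]))
```

The runner would then write this to a temporary directory, run `uv sync`, and copy the resulting `site-packages` sources into the in-memory filesystem.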

I fully understand that there are valid concerns here:
* Doesn't this require network access? (yes, it does)
* Is this fast enough? (`uv` caching makes this almost unnoticeable,
actually)
* Is this deterministic? ~~(probably not, package resolution can depend
on the platform you're on)~~ (yes, hopefully)

For this reason, this first version is opt-in, locally. ~~We don't even
run these tests in CI (even though they worked fine in a previous
iteration of this PR).~~ You need to set `MDTEST_EXTERNAL=1`, or use the
new `-e/--enable-external` command line option of the `mdtest.py`
runner. For example:
```bash
# Skip mdtests with external dependencies (default):
uv run crates/ty_python_semantic/mdtest.py

# Run all mdtests, including those with external dependencies:
uv run crates/ty_python_semantic/mdtest.py -e

# Only run the `pydantic` tests. Use `-e` to make sure it is not skipped:
uv run crates/ty_python_semantic/mdtest.py -e pydantic
```

## Why?

I believe that this can be a useful addition to our testing strategy,
which lies somewhere between ecosystem tests and normal mdtests.
Ecosystem tests cover much more code, but they have the disadvantage
that we only see second- or third-order effects via diagnostic diffs. If
we unexpectedly gain or lose type coverage somewhere, we might not even
notice (assuming the gradual guarantee holds, and ecosystem code is
mostly correct). Another disadvantage of ecosystem checks is that they
only test checked-in code that is usually correct. However, we also want
to test what happens on wrong code, like the code that is momentarily
written in an editor, before fixing it. On the other end of the spectrum
we have normal mdtests, which have the disadvantage that they do not
reflect the reality of complex real-world code. We experience this
whenever we're surprised by an ecosystem report on a PR.

That said, these tests should not be seen as a replacement for either of
these things. For example, we should still strive to write detailed
self-contained mdtests for user-reported issues. But we might use this
new layer for regression tests, or simply as a debugging tool. It can
also serve as a tool to document our support for popular third-party
libraries.

## Test Plan

* I've been locally using this for a couple of weeks now.
* `uv run crates/ty_python_semantic/mdtest.py -e`
2025-12-08 11:44:20 +01:00
Dhruv Manilawala
ac882f7e63 [ty] Handle various invalid explicit specializations for ParamSpec (#21821)
## Summary

fixes: https://github.com/astral-sh/ty/issues/1788

## Test Plan

Add new mdtests.

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-12-08 05:20:41 +00:00
Alex Waygood
857fd4f683 [ty] Add test case for fixed panic (#21832) 2025-12-07 15:58:11 +00:00
Charlie Marsh
285d6410d3 [ty] Avoid double-analyzing tuple in Final subscript (#21828)
## Summary

As-is, a single-element tuple gets destructured via:

```rust
let arguments = if let ast::Expr::Tuple(tuple) = slice {
    &*tuple.elts
} else {
    std::slice::from_ref(slice)
};
```

But then, because it's a single element, we call
`infer_annotation_expression_impl`, passing in the tuple, rather than
the first element.

Closes https://github.com/astral-sh/ty/issues/1793.
Closes https://github.com/astral-sh/ty/issues/1768.

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-07 14:27:14 +00:00
Prakhar Pratyush
cbff09b9af [flake8-bandit] Fix false positive when using non-standard CSafeLoader path (S506). (#21830) 2025-12-07 11:40:46 +01:00
Louis Maddox
6e0e49eda8 Add minimal-size build profile (#21826)
This PR adds the same `minimal-size` profile that the `uv` repo workspace has:

```toml
# Profile to build a minimally sized binary for uv-build
[profile.minimal-size]
inherits = "release"
opt-level = "z"
# This will still show a panic message, we only skip the unwind
panic = "abort"
codegen-units = 1
```
but removes its `panic = "abort"` setting

- As discussed in #21825

Compared to the binaries pre-built via `uv tool install`, this builds a 35%
smaller `ruff` binary and a 24% smaller `ty` binary
(as measured
[here](https://github.com/lmmx/just-pre-commit/blob/master/refresh_binaries.sh))
2025-12-06 13:19:04 -05:00
Charlie Marsh
ef45c97dab [ty] Allow tuple[Any, ...] to assign to tuple[int, *tuple[int, ...]] (#21803)
## Summary

Closes https://github.com/astral-sh/ty/issues/1750.
2025-12-05 19:04:23 +00:00
Micha Reiser
9714c589e1 [ty] Support renaming import aliases (#21792) 2025-12-05 19:12:13 +01:00
Micha Reiser
b2fb421ddd [ty] Add redeclaration LSP tests (#21812) 2025-12-05 18:02:34 +00:00
Shunsuke Shibayama
2f05ffa2c8 [ty] more detailed description of "Size limit on unions of literals" in mdtest (#21804) 2025-12-05 17:34:39 +00:00
Dhruv Manilawala
b623189560 [ty] Complete support for ParamSpec (#21445)
## Summary

Closes: https://github.com/astral-sh/ty/issues/157

This PR adds support for the following capabilities involving a
`ParamSpec` type variable:
- Representing `P.args` and `P.kwargs` in the type system
- Matching against a callable containing `P` to create a type mapping
- Specializing `P` against the stored parameters

The value of a `ParamSpec` type variable is represented using
`CallableType` with a `CallableTypeKind::ParamSpecValue` variant. This
`CallableTypeKind` is expanded from the existing `is_function_like`
boolean flag; an `enum` is used because these variants are mutually
exclusive.

For context, an initial iteration attempted to expand `Specialization`
with a `TypeOrParameters` enum, representing that a type variable can
specialize to either a `Type` or a `Parameters`. But that increased the
complexity of the code, since all downstream usages would need to handle
both variants appropriately. Additionally, we would also have needed to
establish an invariant that a regular type variable always maps to a
`Type` while a `ParamSpec` type variable always maps to a `Parameters`.

I've intentionally left out checking and raising diagnostics when the
`ParamSpec` type variable and its components are not used correctly, to
avoid increasing the scope; that can easily be done as a follow-up. This
would also include the scoping rules, which I don't think regular type
variables implement either.

## Test Plan

Add new mdtest cases and update existing test cases.

Ran this branch on pyx, no new diagnostics.

### Ecosystem analysis

There's a case involving an annotated assignment like:
```py
type CustomType[P] = Callable[...]

def value[**P](...): ...

def another[**P](...):
	target: CustomType[P] = value
```
The type of `value` is a callable with a `ParamSpec` bound to `value`;
`CustomType` is a type alias for a callable, and the `P` used in its
specialization is bound to `another`. Now, ty infers the type of `target`
to be the same as `value` and does not use the declared type
`CustomType[P]`. [This is the
assignment](0980b9d9ab/src/async_utils/gen_transform.py (L108))
that I'm referring to, which then leads to errors in downstream usage.
Pyright and mypy do seem to use the declared type.

There are multiple diagnostics in `dd-trace-py` that require support
for `cls`.

I'm seeing a `Divergent` type for an example like the following, which ~~I'm
not sure why, I'll look into it tomorrow~~ is because of a cycle, as mentioned
in https://github.com/astral-sh/ty/issues/1729#issuecomment-3612279974:
```py
from typing import Callable

def decorator[**P](c: Callable[P, int]) -> Callable[P, str]: ...

@decorator
def func(a: int) -> int: ...

# ((a: int) -> str) | ((a: Divergent) -> str)
reveal_type(func)
```

I ~~need to look into why the parameters are not being specialized
through multiple decorators in the following code~~ think this is also
because of the cycle mentioned in
https://github.com/astral-sh/ty/issues/1729#issuecomment-3612279974 and
the fact that we don't support `staticmethod` properly:
```py
from contextlib import contextmanager

class Foo:
    @staticmethod
    @contextmanager
    def method(x: int):
        yield

foo = Foo()
# ty: Revealed type: `() -> _GeneratorContextManager[Unknown, None, None]` [revealed-type]
reveal_type(foo.method)
```

There are some issues related to `Protocol`s that are generic over a
`ParamSpec` in `starlette`, which might be related to
https://github.com/astral-sh/ty/issues/1635, but I'm not sure. Here's a
minimal example to reproduce:

<details><summary>Code snippet:</summary>
<p>

```py
from collections.abc import Awaitable, Callable, MutableMapping
from typing import Any, Callable, ParamSpec, Protocol

P = ParamSpec("P")

Scope = MutableMapping[str, Any]
Message = MutableMapping[str, Any]
Receive = Callable[[], Awaitable[Message]]
Send = Callable[[Message], Awaitable[None]]

ASGIApp = Callable[[Scope, Receive, Send], Awaitable[None]]

_Scope = Any
_Receive = Callable[[], Awaitable[Any]]
_Send = Callable[[Any], Awaitable[None]]

# Since `starlette.types.ASGIApp` type differs from `ASGIApplication` from `asgiref`
# we need to define a more permissive version of ASGIApp that doesn't cause type errors.
_ASGIApp = Callable[[_Scope, _Receive, _Send], Awaitable[None]]


class _MiddlewareFactory(Protocol[P]):
    def __call__(
        self, app: _ASGIApp, *args: P.args, **kwargs: P.kwargs
    ) -> _ASGIApp: ...


class Middleware:
    def __init__(
        self, factory: _MiddlewareFactory[P], *args: P.args, **kwargs: P.kwargs
    ) -> None:
        self.factory = factory
        self.args = args
        self.kwargs = kwargs


class ServerErrorMiddleware:
    def __init__(
        self,
        app: ASGIApp,
        value: int | None = None,
        flag: bool = False,
    ) -> None:
        self.app = app
        self.value = value
        self.flag = flag

    async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None: ...


# ty: Argument to bound method `__init__` is incorrect: Expected `_MiddlewareFactory[(...)]`, found `<class 'ServerErrorMiddleware'>` [invalid-argument-type]
Middleware(ServerErrorMiddleware, value=500, flag=True)
```

</p>
</details> 

### Conformance analysis

> ```diff
> -constructors_callable.py:36:13: info[revealed-type] Revealed type: `(...) -> Unknown`
> +constructors_callable.py:36:13: info[revealed-type] Revealed type: `(x: int) -> Unknown`
> ```

Requires return type inference i.e.,
https://github.com/astral-sh/ruff/pull/21551

> ```diff
> +constructors_callable.py:194:16: error[invalid-argument-type] Argument is incorrect: Expected `list[T@__init__]`, found `list[Unknown | str]`
> +constructors_callable.py:194:22: error[invalid-argument-type] Argument is incorrect: Expected `list[T@__init__]`, found `list[Unknown | str]`
> +constructors_callable.py:195:4: error[invalid-argument-type] Argument is incorrect: Expected `list[T@__init__]`, found `list[Unknown | int]`
> +constructors_callable.py:195:9: error[invalid-argument-type] Argument is incorrect: Expected `list[T@__init__]`, found `list[Unknown | str]`
> ```

I might need to look into why this is happening...

> ```diff
> +generics_defaults.py:79:1: error[type-assertion-failure] Type `type[Class_ParamSpec[(str, int, /)]]` does not match asserted type `<class 'Class_ParamSpec'>`
> ```

which is on the following code
```py
DefaultP = ParamSpec("DefaultP", default=[str, int])

class Class_ParamSpec(Generic[DefaultP]): ...

assert_type(Class_ParamSpec, type[Class_ParamSpec[str, int]])
```

It's occurring because there's no equivalence relationship defined
between `ClassLiteral` and `KnownInstanceType::TypeGenericAlias` which
is what these types are.

Everything else looks good to me!
2025-12-05 22:00:06 +05:30
Micha Reiser
f29436ca9e [ty] Update benchmark dependencies (#21815) 2025-12-05 17:23:18 +01:00
Douglas Creager
e42cdf8495 [ty] Carry generic context through when converting class into Callable (#21798)
When converting a class (whether specialized or not) into a `Callable`
type, we should carry through any generic context that the constructor
has. This includes both the generic context of the class itself (if it's
generic) and of the constructor methods (if they are separately
generic).

To help test this, this also updates the `generic_context` extension
function to work on `Callable` types and unions; and adds a new
`into_callable` extension function that works just like
`CallableTypeOf`, but on value forms instead of type forms.

Pulled this out of #21551 for separate review.
2025-12-05 08:57:21 -05:00
Alex Waygood
71a7a03ad4 [ty] Add more tests for renamings (#21810) 2025-12-05 12:41:31 +00:00
Alex Waygood
48f7f42784 [ty] Minor improvements to assert_type diagnostics (#21811) 2025-12-05 12:33:30 +00:00
Micha Reiser
3deb7e1b90 [ty] Add some attribute/method renaming test cases (#21809) 2025-12-05 11:56:28 +01:00
mahiro
5df8a959f5 Update mkdocs-material to 9.7.0 (Insiders now free) (#21797) 2025-12-05 08:53:08 +01:00
Dhruv Manilawala
6f03afe318 Remove unused whitespaces in test cases (#21806)
These aren't used in the tests themselves. There are more instances of
them in other files, but those require code changes, so I've left them as
they are.
2025-12-05 12:51:40 +05:30
Shunsuke Shibayama
1951f1bbb8 [ty] fix panic when instantiating a type variable with invalid constraints (#21663) 2025-12-04 18:48:38 -08:00
Shunsuke Shibayama
10de342991 [ty] fix build failure caused by conflicts between #21683 and #21800 (#21802) 2025-12-04 18:20:24 -08:00
Shunsuke Shibayama
3511b7a06b [ty] do nothing with store_expression_type if inner_expression_inference_state is Get (#21718)
## Summary

Fixes https://github.com/astral-sh/ty/issues/1688

## Test Plan

N/A
2025-12-04 18:05:41 -08:00
Shunsuke Shibayama
f3e5713d90 [ty] increase the limit on the number of elements in a non-recursively defined literal union (#21683)
## Summary

Closes https://github.com/astral-sh/ty/issues/957

As explained in https://github.com/astral-sh/ty/issues/957, literal
union types for recursively defined values can be widened early to
speed up the convergence of fixed-point iterations.
This PR achieves this by embedding a marker in `UnionType` that
distinguishes whether a value is recursively defined.

This also allows us to identify values that are not recursively
defined, so I've increased the limit on the number of elements in a
literal union type for such values.

Edit: while this PR doesn't provide the significant performance
improvement initially hoped for, it does have the benefit of allowing
the number of elements in a literal union to be raised above the salsa
limit, and indeed mypy_primer results revealed that a literal union of
220 elements was actually being used.
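
As a hedged illustration of the two cases the marker distinguishes (the types in comments follow the PR's description of ty's model; the snippet itself is plain Python):

```python
# Not recursively defined: nothing feeds back into the value, so a large
# literal union (mypy_primer surfaced a real 220-element one) can be kept
# precisely now that the limit is raised.
codes = tuple(range(220))

# Recursively defined: the new value depends on the old one, so fixed-point
# iteration over Literal[0], Literal[0, 1], ... would keep growing; the
# marker lets ty widen such unions early (here, to `int`).
n = 0
while n < 3:
    n = n + 1

print(len(codes), n)  # 220 3
```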

## Test Plan

`call/union.md` has been updated
2025-12-04 18:01:48 -08:00
Carl Meyer
a9de6b5c3e [ty] normalize typevar bounds/constraints in cycles (#21800)
Fixes https://github.com/astral-sh/ty/issues/1587

## Summary

Perform cycle normalization on typevar bounds and constraints (similar
to how it was already done for typevar defaults) in order to ensure
convergence in cyclic cases.

There might be another fix here that could avoid the cycle in many more
cases, where we don't eagerly evaluate typevar bounds/constraints on
explicit specialization, but just accept the given specialization and
later evaluate to see whether we need to emit a diagnostic on it. But
the current fix here is sufficient to solve the problem and matches the
patterns we use to ensure cycle convergence elsewhere, so it seems good
for now; left a TODO for the other idea.

This fix is sufficient to make us not panic, but not sufficient to get
the semantics fully correct; see the TODOs in the tests. I have ideas
for fixing that as well, but it seems worth at least getting this in to
fix the panic.

## Test Plan

Test that previously panicked now does not.

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-12-04 15:17:57 -08:00
Andrew Gallant
06415b1877 [ty] Update completion eval to include modules
Our parsing and confirming of symbol names is highly suspect, but
I think it's fine for now.
2025-12-04 17:37:37 -05:00
Andrew Gallant
518d11b33f [ty] Add modules to auto-import
This makes auto-import include modules in suggestions.

In this initial implementation, we permit this to include submodules as
well. This is in contrast to what we do in `import ...` completions.
It's easy to change this behavior, but I think it'd be interesting to
run with this for now to see how well it works.
2025-12-04 17:37:37 -05:00
Andrew Gallant
da94b99248 [ty] Add support for module-only import requests
The existing importer functionality always required
an import request with a module and a member in that
module. But we want to be able to insert import statements
for a module itself and not any members in the module.

This is basically changing `member: &str` to an
`Option<&str>` and fixing the fallout in a way that
makes sense for module-only imports.
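
What the `Option<&str>` change enables can be sketched in Python (names are hypothetical, not the actual importer API):

```python
from typing import Optional

# With the member now optional, an import request can target a module itself.
def render_import(module: str, member: Optional[str]) -> str:
    if member is None:
        return f"import {module}"  # module-only request
    return f"from {module} import {member}"

print(render_import("collections", None))     # import collections
print(render_import("collections", "deque"))  # from collections import deque
```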
2025-12-04 17:37:37 -05:00
Andrew Gallant
3c2cf49f60 [ty] Refactor auto-import symbol info
This just encapsulates the representation so that
we can make changes to it more easily.
2025-12-04 17:37:37 -05:00
Andrew Gallant
fdcb5a7e73 [ty] Clarify the use of SymbolKind in auto-import 2025-12-04 13:21:26 -05:00
Andrew Gallant
6a025d1925 [ty] Redact ranking of completions from e2e LSP tests
I think changes to this value are generally noise. It's hard to tell
what it means and it isn't especially actionable. We already have an
eval running in CI for completion ranking, so I don't think it's
terribly important to care about ranking here in e2e tests _generally_.
2025-12-04 13:21:26 -05:00
Andrew Gallant
f054e7edf8 [ty] Tweaks tests to use clearer language
A completion lacking a module reference doesn't necessarily mean that
the symbol is defined within the current module. I believe the intent
here is that it means that no import is required to use it.
2025-12-04 13:21:26 -05:00
Andrew Gallant
e154efa229 [ty] Update evaluation results
These are all improvements here with one slight regression on
`reveal_type` ranking. The previous completions offered were:

```
$ cargo r -q -p ty_completion_eval show-one ty-extensions-lower-stdlib
ENOTRECOVERABLE (module: errno)
REG_WHOLE_HIVE_VOLATILE (module: winreg)
SQLITE_NOTICE_RECOVER_WAL (module: _sqlite3)
SupportsGetItemViewable (module: _typeshed)
removeHandler (module: unittest.signals)
reveal_mro (module: ty_extensions)
reveal_protocol_interface (module: ty_extensions)
reveal_type (module: typing) (*, 8/10)
_remove_original_values (module: _osx_support)
_remove_universal_flags (module: _osx_support)
-----
found 10 completions
```

And now they are:

```
$ cargo r -q -p ty_completion_eval show-one ty-extensions-lower-stdlib
ENOTRECOVERABLE (module: errno)
REG_WHOLE_HIVE_VOLATILE (module: winreg)
SQLITE_NOTICE_RECOVER_WAL (module: sqlite3)
SQLITE_NOTICE_RECOVER_WAL (module: sqlite3.dbapi2)
removeHandler (module: unittest)
removeHandler (module: unittest.signals)
reveal_mro (module: ty_extensions)
reveal_protocol_interface (module: ty_extensions)
reveal_type (module: typing) (*, 9/9)
-----
found 9 completions
```

Some completions were removed (because they are now considered
unexported) and some were added (likely due to better re-export support).

This particular case probably warrants more special attention anyway.
So I think this is fine. (It's only a one-ranking regression.)
2025-12-04 13:21:26 -05:00
Andrew Gallant
32f400a457 [ty] Make auto-import ignore symbols in modules starting with a _
This applies recursively. So if *any* component of a module name starts
with a `_`, then symbols from that module are excluded from auto-import.

The exception is when it's a module within first party code. Then we
want to include it in auto-import.
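
The rule can be sketched as a predicate (a hypothetical Python rendering; the actual logic lives in ty's Rust code):

```python
# Exclude a module's symbols from auto-import if *any* dotted component
# starts with "_", unless the module is first-party.
def offer_in_auto_import(module: str, first_party: bool) -> bool:
    if first_party:
        return True
    return not any(part.startswith("_") for part in module.split("."))

print(offer_in_auto_import("os.path", first_party=False))             # True
print(offer_in_auto_import("_osx_support", first_party=False))        # False
print(offer_in_auto_import("pkg._internal.util", first_party=False))  # False
print(offer_in_auto_import("_mypkg.helpers", first_party=True))       # True
```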
2025-12-04 13:21:26 -05:00
Andrew Gallant
2a38395bc8 [ty] Add some tests for re-exports and __all__ to completions
Note that the `Deprecated` symbols from `importlib.metadata` are no
longer offered because 1) `importlib.metadata` defined `__all__` and 2)
the `Deprecated` symbols aren't in it. These seem to not be a part of
its public API according to the docs, so this seems right to me.
2025-12-04 13:21:26 -05:00
Andrew Gallant
8c72b296c9 [ty] Add support for re-exports and __all__ to auto-import
This commit (mostly) re-implements the support for `__all__` in
ty-proper, but inside the auto-import AST scanner.

When `__all__` isn't present in a module, we fall back to conventions to
determine whether a symbol is exported or not:
https://docs.python.org/3/library/index.html

However, in keeping with current practice for non-auto-import
completions, we continue to provide sunder and dunder names as
re-exports.

When `__all__` is present, we respect it strictly. That is, a symbol is
exported *if and only if* it's in `__all__`. This is somewhat stricter
than pylance seemingly is. I felt like it was a good idea to start here,
and we can relax it based on user demand (perhaps through a setting).
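
The export decision described above can be sketched as follows (hypothetical Python; ty's exact sunder/dunder handling may differ):

```python
from typing import List, Optional

def is_exported(name: str, module_all: Optional[List[str]]) -> bool:
    if module_all is not None:
        # Strict: exported if and only if the name appears in __all__.
        return name in module_all
    # Convention fallback: hide underscore-private names, but keep offering
    # sunder (_x_) and dunder (__x__) names as re-exports.
    if name.startswith("_"):
        return name.endswith("_")
    return True

print(is_exported("Deprecated", ["version", "metadata"]))  # False: not in __all__
print(is_exported("_helper", None))                        # False: private by convention
print(is_exported("__version__", None))                    # True: dunder still offered
```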
2025-12-04 13:21:26 -05:00
Andrew Gallant
086f1e0b89 [ty] Skip over expressions in auto-import AST scanning 2025-12-04 13:21:26 -05:00
Andrew Gallant
5da45f8ec7 [ty] Simplify auto-import AST visitor slightly and add tests
This simplifies the existing visitor by DRYing it up slightly.
We also add tests for the existing functionality. In particular,
we want to add support for re-export conventions, and that
warrants more careful testing.
2025-12-04 13:21:26 -05:00
Andrew Gallant
62f20b1e86 [ty] Re-arrange imports in symbol extraction
I like using a qualified `ast::` prefix for things from
`ruff_python_ast`, so switch over to that convention.
2025-12-04 13:21:26 -05:00
Aria Desires
cccb0bbaa4 [ty] Add tests for implicit submodule references (#21793)
## Summary

I realized we don't really test `DefinitionKind::ImportFromSubmodule` in
the IDE at all, so here's a bunch of them, just recording our current
behaviour.

## Test Plan

*stares at the camera*
2025-12-04 15:46:23 +00:00
Brent Westbrook
9d4f1c6ae2 Bump 0.14.8 (#21791) 2025-12-04 09:45:53 -05:00
Micha Reiser
326025d45f [ty] Always register rename provider if client doesn't support dynamic registration (#21789) 2025-12-04 14:40:16 +01:00
Micha Reiser
3aefe85b32 [ty] Ensure rename CursorTest calls can_rename before renaming (#21790) 2025-12-04 14:19:48 +01:00
Dhruv Manilawala
b8ecc83a54 Fix clippy errors on main (#21788)
https://github.com/astral-sh/ruff/actions/runs/19922070773/job/57112827024#step:5:62
2025-12-04 16:20:37 +05:30
Aria Desires
6491932757 [ty] Fix crash when hovering an unknown string annotation (#21782)
## Summary

I have no idea what I'm doing with the fix (all the interesting stuff is
in the second commit).

The basic problem is the compiler emits the diagnostic:

```
x: "foobar"
    ^^^^^^
```

The suppression code-action hands the end of this range to
`Tokens::after`, which panics when given an offset that falls in the
middle of a token.

Fixes https://github.com/astral-sh/ty/issues/1748

## Test Plan

Many tests added (only the e2e test matters).
2025-12-04 09:11:40 +01:00
Micha Reiser
a9f2bb41bd [ty] Don't send publish diagnostics for clients supporting pull diagnostics (#21772) 2025-12-04 08:12:04 +01:00
Aria Desires
e2b72fbf99 [ty] cleanup test path (#21781)
Fixes
https://github.com/astral-sh/ruff/pull/21745#discussion_r2586552295
2025-12-03 21:54:50 +00:00
Alex Waygood
14fce0d440 [ty] Improve the display of various special-form types (#21775) 2025-12-03 21:19:59 +00:00
Alex Waygood
8ebecb2a88 [ty] Add subdiagnostic hint if the user wrote X = Any rather than X: Any (#21777) 2025-12-03 20:42:21 +00:00
Aria Desires
45ac30a4d7 [ty] Teach ty the meaning of desperation (try ancestor pyproject.tomls as search-paths if module resolution fails) (#21745)
## Summary

This makes an importing file a required argument to module resolution,
and if the fast-path cached query fails to resolve the module, take the
slow-path uncached (could be cached if we want)
`desperately_resolve_module` which will walk up from the importing file
until it finds a `pyproject.toml` (arbitrary decision, we could try
every ancestor directory), at which point it takes one last desperate
attempt to use that directory as a search-path. We do not continue
walking up once we've found a `pyproject.toml` (arbitrary decision, we
could keep going up).
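
The walk-up can be sketched like this (a hypothetical Python rendering of the idea; the real `desperately_resolve_module` is Rust and performs module resolution, not just path discovery):

```python
import tempfile
from pathlib import Path
from typing import Optional

def desperate_search_path(importing_file: Path) -> Optional[Path]:
    # Walk up from the importing file; the first ancestor that contains a
    # pyproject.toml becomes the last-ditch search path. We stop at the
    # first hit rather than continuing further up.
    for ancestor in importing_file.parents:
        if (ancestor / "pyproject.toml").is_file():
            return ancestor
    return None

# Demo: a workspace member with its own pyproject.toml.
with tempfile.TemporaryDirectory() as root:
    member = Path(root) / "workspace" / "member"
    (member / "src").mkdir(parents=True)
    (member / "pyproject.toml").write_text("[project]\nname = 'member'\n")
    found = desperate_search_path(member / "src" / "app.py")
    print(found == member)  # True
```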

Running locally, this fixes every broken-for-workspace-reasons import in
pyx's workspace!

* Fixes https://github.com/astral-sh/ty/issues/1539
* Improves https://github.com/astral-sh/ty/issues/839

## Test Plan

The workspace tests see a huge improvement on most absolute imports.
2025-12-03 15:04:36 -05:00
Alex Waygood
0280949000 [ty] fix panic when attempting to infer the variance of a PEP-695 class that depends on a recursive type aliases and also somehow protocols (#21778)
Fixes https://github.com/astral-sh/ty/issues/1716.

## Test plan

I added a corpus snippet that causes us to panic on `main` (I tested by
running `cargo run -p ty_python_semantic --test=corpus` without the fix
applied).
2025-12-03 19:01:42 +00:00
Bhuminjay Soni
c722f498fe [flake8-bugbear] Catch yield expressions within other statements (B901) (#21200)
## Summary

This PR re-implements [return-in-generator
(B901)](https://docs.astral.sh/ruff/rules/return-in-generator/#return-in-generator-b901)
for async generators as a semantic syntax error. This is not a syntax
error for sync generators, so we'll need to preserve both the lint rule
and the syntax error in this case.

It also updates B901 and the new implementation to catch cases where the
generator's `yield` or `yield from` expression is part of another
statement, as in:

```py
def foo():
    return (yield)
```

These were previously not caught because we only looked for
`Stmt::Expr(Expr::Yield)` in `visit_stmt` instead of visiting `yield`
expressions directly. I think this modification is within the spirit of
the rule and safe to try out since the rule is in preview.

## Test Plan

I have written tests as directed in #17412

---------

Signed-off-by: 11happy <soni5happy@gmail.com>
Signed-off-by: 11happy <bhuminjaysoni@gmail.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-12-03 12:05:15 -05:00
David Peter
1f4f8d9950 [ty] Fix flow of associated member states during star imports (#21776)
## Summary

Star-imports can not just affect the state of symbols that they pull in,
they can also affect the state of members that are associated with those
symbols. For example, if `obj.attr` was previously narrowed from `int |
None` to `int`, and a star-import now overwrites `obj`, then the
narrowing on `obj.attr` should be "reset".

This PR keeps track of the state of associated members during star
imports and properly models the flow of their corresponding state
through the control flow structure that we artificially create for
star-imports.

See [this
comment](https://github.com/astral-sh/ty/issues/1355#issuecomment-3607125005)
for an explanation why this caused ty to see certain `asyncio` symbols
as not being accessible on Python 3.14.

closes https://github.com/astral-sh/ty/issues/1355

## Ecosystem impact

```diff
async-utils (https://github.com/mikeshardmind/async-utils)
- src/async_utils/bg_loop.py:115:31: error[invalid-argument-type] Argument to bound method `set_task_factory` is incorrect: Expected `_TaskFactory | None`, found `def eager_task_factory[_T_co](loop: AbstractEventLoop | None, coro: Coroutine[Any, Any, _T_co@eager_task_factory], *, name: str | None = None, context: Context | None = None) -> Task[_T_co@eager_task_factory]`
- Found 30 diagnostics
+ Found 29 diagnostics

mitmproxy (https://github.com/mitmproxy/mitmproxy)
+ mitmproxy/utils/asyncio_utils.py:96:60: warning[unused-ignore-comment] Unused blanket `type: ignore` directive
- test/conftest.py:37:31: error[invalid-argument-type] Argument to bound method `set_task_factory` is incorrect: Expected `_TaskFactory | None`, found `def eager_task_factory[_T_co](loop: AbstractEventLoop | None, coro: Coroutine[Any, Any, _T_co@eager_task_factory], *, name: str | None = None, context: Context | None = None) -> Task[_T_co@eager_task_factory]`
```

All of these seem to be correct: we now infer a different type for
`asyncio` symbols that are imported from different `sys.version_info`
branches (where we previously failed to recognize some of those branches
as statically true/false).

```diff
dd-trace-py (https://github.com/DataDog/dd-trace-py)
- ddtrace/contrib/internal/asyncio/patch.py:39:12: error[invalid-argument-type] Argument to function `unwrap` is incorrect: Expected `WrappedFunction`, found `def create_task[_T](self, coro: Coroutine[Any, Any, _T@create_task] | Generator[Any, None, _T@create_task], *, name: object = None) -> Task[_T@create_task]`
+ ddtrace/contrib/internal/asyncio/patch.py:39:12: error[invalid-argument-type] Argument to function `unwrap` is incorrect: Expected `WrappedFunction`, found `def create_task[_T](self, coro: Generator[Any, None, _T@create_task] | Coroutine[Any, Any, _T@create_task], *, name: object = None) -> Task[_T@create_task]`
```

A similar cause, but here it only changes the diagnostic message: the union members appear in a different order.
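At runtime the two union spellings are interchangeable; a quick stdlib check (unrelated to the ddtrace code itself) shows that reordered unions compare equal even though member order is preserved:

```python
from collections.abc import Coroutine, Generator
from typing import Union, get_args

OldOrder = Union[Coroutine, Generator]
NewOrder = Union[Generator, Coroutine]

# get_args() preserves member order, but equality ignores it.
print(get_args(OldOrder))       # order as written
print(OldOrder == NewOrder)     # True
```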

## Test Plan

Added a regression test
2025-12-03 17:52:31 +01:00
William Woodruff
4488e9d47d Revert "Enable PEP 740 attestations when publishing to PyPI" (#21768) 2025-12-03 11:07:29 -05:00
github-actions[bot]
b08f0b2caa [ty] Sync vendored typeshed stubs (#21715)
Co-authored-by: typeshedbot <>
Co-authored-by: David Peter <mail@david-peter.de>
2025-12-03 15:49:51 +00:00
209 changed files with 18827 additions and 8258 deletions

View File

@@ -75,14 +75,6 @@
matchManagers: ["cargo"],
enabled: false,
},
{
// `mkdocs-material` requires a manual update to keep the version in sync
// with `mkdocs-material-insider`.
// See: https://squidfunk.github.io/mkdocs-material/insiders/upgrade/
matchManagers: ["pip_requirements"],
matchPackageNames: ["mkdocs-material"],
enabled: false,
},
{
groupName: "pre-commit dependencies",
matchManagers: ["pre-commit"],

View File

@@ -24,6 +24,8 @@ env:
PACKAGE_NAME: ruff
PYTHON_VERSION: "3.14"
NEXTEST_PROFILE: ci
# Enable mdtests that require external dependencies
MDTEST_EXTERNAL: "1"
jobs:
determine_changes:
@@ -779,8 +781,6 @@ jobs:
name: "mkdocs"
runs-on: ubuntu-latest
timeout-minutes: 10
env:
MKDOCS_INSIDERS_SSH_KEY_EXISTS: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY != '' }}
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
@@ -788,11 +788,6 @@ jobs:
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: "Add SSH key"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
uses: webfactory/ssh-agent@a6f90b1f127823b31d4d4a8d96047790581349bd # v0.9.1
with:
ssh-private-key: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY }}
- name: "Install Rust toolchain"
run: rustup show
- name: Install uv
@@ -800,11 +795,7 @@ jobs:
with:
python-version: 3.13
activate-environment: true
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uv pip install -r docs/requirements-insiders.txt
- name: "Install dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS != 'true' }}
run: uv pip install -r docs/requirements.txt
- name: "Update README File"
run: python scripts/transform_readme.py --target mkdocs
@@ -812,12 +803,8 @@ jobs:
run: python scripts/generate_mkdocs.py
- name: "Check docs formatting"
run: python scripts/check_docs_formatted.py
- name: "Build Insiders docs"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: mkdocs build --strict -f mkdocs.insiders.yml
- name: "Build docs"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS != 'true' }}
run: mkdocs build --strict -f mkdocs.public.yml
run: mkdocs build --strict -f mkdocs.yml
check-formatter-instability-and-black-similarity:
name: "formatter instabilities and black similarity"

View File

@@ -20,8 +20,6 @@ on:
jobs:
mkdocs:
runs-on: ubuntu-latest
env:
MKDOCS_INSIDERS_SSH_KEY_EXISTS: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY != '' }}
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
@@ -59,23 +57,12 @@ jobs:
echo "branch_name=update-docs-$branch_display_name-$timestamp" >> "$GITHUB_ENV"
echo "timestamp=$timestamp" >> "$GITHUB_ENV"
- name: "Add SSH key"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
uses: webfactory/ssh-agent@a6f90b1f127823b31d4d4a8d96047790581349bd # v0.9.1
with:
ssh-private-key: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY }}
- name: "Install Rust toolchain"
run: rustup show
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: pip install -r docs/requirements-insiders.txt
- name: "Install dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS != 'true' }}
run: pip install -r docs/requirements.txt
- name: "Copy README File"
@@ -83,13 +70,8 @@ jobs:
python scripts/transform_readme.py --target mkdocs
python scripts/generate_mkdocs.py
- name: "Build Insiders docs"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: mkdocs build --strict -f mkdocs.insiders.yml
- name: "Build docs"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS != 'true' }}
run: mkdocs build --strict -f mkdocs.public.yml
run: mkdocs build --strict -f mkdocs.yml
- name: "Clone docs repo"
run: git clone https://${{ secrets.ASTRAL_DOCS_PAT }}@github.com/astral-sh/docs.git astral-docs

View File

@@ -18,7 +18,8 @@ jobs:
environment:
name: release
permissions:
id-token: write # For PyPI's trusted publishing + PEP 740 attestations
# For PyPI's trusted publishing.
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
@@ -27,8 +28,5 @@ jobs:
pattern: wheels-*
path: wheels
merge-multiple: true
- uses: astral-sh/attest-action@2c727738cea36d6c97dd85eb133ea0e0e8fe754b # v0.0.4
with:
paths: wheels/*
- name: Publish to PyPi
run: uv publish -v wheels/*

View File

@@ -1,5 +1,34 @@
# Changelog
## 0.14.8
Released on 2025-12-04.
### Preview features
- \[`flake8-bugbear`\] Catch `yield` expressions within other statements (`B901`) ([#21200](https://github.com/astral-sh/ruff/pull/21200))
- \[`flake8-use-pathlib`\] Mark fixes unsafe for return type changes (`PTH104`, `PTH105`, `PTH109`, `PTH115`) ([#21440](https://github.com/astral-sh/ruff/pull/21440))
### Bug fixes
- Fix syntax error false positives for `await` outside functions ([#21763](https://github.com/astral-sh/ruff/pull/21763))
- \[`flake8-simplify`\] Fix truthiness assumption for non-iterable arguments in tuple/list/set calls (`SIM222`, `SIM223`) ([#21479](https://github.com/astral-sh/ruff/pull/21479))
### Documentation
- Suggest using `--output-file` option in GitLab integration ([#21706](https://github.com/astral-sh/ruff/pull/21706))
### Other changes
- [syntax-error] Default type parameter followed by non-default type parameter ([#21657](https://github.com/astral-sh/ruff/pull/21657))
### Contributors
- [@kieran-ryan](https://github.com/kieran-ryan)
- [@11happy](https://github.com/11happy)
- [@danparizher](https://github.com/danparizher)
- [@ntBre](https://github.com/ntBre)
## 0.14.7
Released on 2025-11-28.

View File

@@ -331,13 +331,6 @@ you addressed them.
## MkDocs
> [!NOTE]
>
> The documentation uses Material for MkDocs Insiders, which is closed-source software.
> This means only members of the Astral organization can preview the documentation exactly as it
> will appear in production.
> Outside contributors can still preview the documentation, but there will be some differences. Consult [the Material for MkDocs documentation](https://squidfunk.github.io/mkdocs-material/insiders/benefits/#features) for which features are exclusively available in the insiders version.
To preview any changes to the documentation locally:
1. Install the [Rust toolchain](https://www.rust-lang.org/tools/install).
@@ -351,11 +344,7 @@ To preview any changes to the documentation locally:
1. Run the development server with:
```shell
# For contributors.
uvx --with-requirements docs/requirements.txt -- mkdocs serve -f mkdocs.public.yml
# For members of the Astral org, which has access to MkDocs Insiders via sponsorship.
uvx --with-requirements docs/requirements-insiders.txt -- mkdocs serve -f mkdocs.insiders.yml
uvx --with-requirements docs/requirements.txt -- mkdocs serve -f mkdocs.yml
```
The documentation should then be available locally at

Cargo.lock generated
View File

@@ -2859,7 +2859,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.14.7"
version = "0.14.8"
dependencies = [
"anyhow",
"argfile",
@@ -3117,7 +3117,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.14.7"
version = "0.14.8"
dependencies = [
"aho-corasick",
"anyhow",
@@ -3473,7 +3473,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.14.7"
version = "0.14.8"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -4557,6 +4557,7 @@ dependencies = [
"anyhow",
"camino",
"colored 3.0.0",
"dunce",
"insta",
"memchr",
"path-slash",

View File

@@ -272,6 +272,12 @@ large_stack_arrays = "allow"
lto = "fat"
codegen-units = 16
# Profile to build a minimally sized binary for ruff/ty
[profile.minimal-size]
inherits = "release"
opt-level = "z"
codegen-units = 1
# Some crates don't change as much but benefit more from
# more expensive optimization passes, so we selectively
# decrease codegen-units in some cases.

View File

@@ -147,8 +147,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.14.7/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.14.7/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.14.8/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.14.8/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -181,7 +181,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.14.7
rev: v0.14.8
hooks:
# Run the linter.
- id: ruff-check

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.14.7"
version = "0.14.8"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -888,6 +888,10 @@ impl Annotation {
pub fn hide_snippet(&mut self, yes: bool) {
self.hide_snippet = yes;
}
pub fn is_primary(&self) -> bool {
self.is_primary
}
}
/// Tags that can be associated with an annotation.

View File

@@ -667,6 +667,13 @@ impl Deref for SystemPathBuf {
}
}
impl AsRef<Path> for SystemPathBuf {
#[inline]
fn as_ref(&self) -> &Path {
self.0.as_std_path()
}
}
impl<P: AsRef<SystemPath>> FromIterator<P> for SystemPathBuf {
fn from_iter<I: IntoIterator<Item = P>>(iter: I) -> Self {
let mut buf = SystemPathBuf::new();

View File

@@ -49,7 +49,7 @@ impl ModuleImports {
// Resolve the imports.
let mut resolved_imports = ModuleImports::default();
for import in imports {
for resolved in Resolver::new(db).resolve(import) {
for resolved in Resolver::new(db, path).resolve(import) {
if let Some(path) = resolved.as_system_path() {
resolved_imports.insert(path.to_path_buf());
}

View File

@@ -1,5 +1,9 @@
use ruff_db::files::FilePath;
use ty_python_semantic::{ModuleName, resolve_module, resolve_real_module};
use ruff_db::files::{File, FilePath, system_path_to_file};
use ruff_db::system::SystemPath;
use ty_python_semantic::{
ModuleName, resolve_module, resolve_module_confident, resolve_real_module,
resolve_real_module_confident,
};
use crate::ModuleDb;
use crate::collector::CollectedImport;
@@ -7,12 +11,15 @@ use crate::collector::CollectedImport;
/// Collect all imports for a given Python file.
pub(crate) struct Resolver<'a> {
db: &'a ModuleDb,
file: Option<File>,
}
impl<'a> Resolver<'a> {
/// Initialize a [`Resolver`] with a given [`ModuleDb`].
pub(crate) fn new(db: &'a ModuleDb) -> Self {
Self { db }
pub(crate) fn new(db: &'a ModuleDb, path: &SystemPath) -> Self {
// If we know the importing file we can potentially resolve more imports
let file = system_path_to_file(db, path).ok();
Self { db, file }
}
/// Resolve the [`CollectedImport`] into a [`FilePath`].
@@ -70,13 +77,21 @@ impl<'a> Resolver<'a> {
/// Resolves a module name to a module.
pub(crate) fn resolve_module(&self, module_name: &ModuleName) -> Option<&'a FilePath> {
let module = resolve_module(self.db, module_name)?;
let module = if let Some(file) = self.file {
resolve_module(self.db, file, module_name)?
} else {
resolve_module_confident(self.db, module_name)?
};
Some(module.file(self.db)?.path(self.db))
}
/// Resolves a module name to a module (stubs not allowed).
fn resolve_real_module(&self, module_name: &ModuleName) -> Option<&'a FilePath> {
let module = resolve_real_module(self.db, module_name)?;
let module = if let Some(file) = self.file {
resolve_real_module(self.db, file, module_name)?
} else {
resolve_real_module_confident(self.db, module_name)?
};
Some(module.file(self.db)?.path(self.db))
}
}

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.14.7"
version = "0.14.8"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -28,9 +28,11 @@ yaml.load("{}", SafeLoader)
yaml.load("{}", yaml.SafeLoader)
yaml.load("{}", CSafeLoader)
yaml.load("{}", yaml.CSafeLoader)
yaml.load("{}", yaml.cyaml.CSafeLoader)
yaml.load("{}", NewSafeLoader)
yaml.load("{}", Loader=SafeLoader)
yaml.load("{}", Loader=yaml.SafeLoader)
yaml.load("{}", Loader=CSafeLoader)
yaml.load("{}", Loader=yaml.CSafeLoader)
yaml.load("{}", Loader=yaml.cyaml.CSafeLoader)
yaml.load("{}", Loader=NewSafeLoader)

View File

@@ -52,16 +52,16 @@ def not_broken5():
yield inner()
def not_broken6():
def broken3():
return (yield from [])
def not_broken7():
def broken4():
x = yield from []
return x
def not_broken8():
def broken5():
x = None
def inner(ex):
@@ -76,3 +76,13 @@ class NotBroken9(object):
def __await__(self):
yield from function()
return 42
async def broken6():
yield 1
return foo()
async def broken7():
yield 1
return [1, 2, 3]

View File

@@ -0,0 +1,24 @@
async def gen():
yield 1
return 42
def gen(): # B901 but not a syntax error - not an async generator
yield 1
return 42
async def gen(): # ok - no value in return
yield 1
return
async def gen():
yield 1
return foo()
async def gen():
yield 1
return [1, 2, 3]
async def gen():
if True:
yield 1
return 10

View File

@@ -69,6 +69,7 @@ use crate::noqa::NoqaMapping;
use crate::package::PackageRoot;
use crate::preview::is_undefined_export_in_dunder_init_enabled;
use crate::registry::Rule;
use crate::rules::flake8_bugbear::rules::ReturnInGenerator;
use crate::rules::pyflakes::rules::{
LateFutureImport, MultipleStarredExpressions, ReturnOutsideFunction,
UndefinedLocalWithNestedImportStarUsage, YieldOutsideFunction,
@@ -729,6 +730,12 @@ impl SemanticSyntaxContext for Checker<'_> {
self.report_diagnostic(NonlocalWithoutBinding { name }, error.range);
}
}
SemanticSyntaxErrorKind::ReturnInGenerator => {
// B901
if self.is_rule_enabled(Rule::ReturnInGenerator) {
self.report_diagnostic(ReturnInGenerator, error.range);
}
}
SemanticSyntaxErrorKind::ReboundComprehensionVariable
| SemanticSyntaxErrorKind::DuplicateTypeParameter
| SemanticSyntaxErrorKind::MultipleCaseAssignment(_)

View File

@@ -1043,6 +1043,7 @@ mod tests {
Rule::YieldFromInAsyncFunction,
Path::new("yield_from_in_async_function.py")
)]
#[test_case(Rule::ReturnInGenerator, Path::new("return_in_generator.py"))]
fn test_syntax_errors(rule: Rule, path: &Path) -> Result<()> {
let snapshot = path.to_string_lossy().to_string();
let path = Path::new("resources/test/fixtures/syntax_errors").join(path);

View File

@@ -75,6 +75,7 @@ pub(crate) fn unsafe_yaml_load(checker: &Checker, call: &ast::ExprCall) {
qualified_name.segments(),
["yaml", "SafeLoader" | "CSafeLoader"]
| ["yaml", "loader", "SafeLoader" | "CSafeLoader"]
| ["yaml", "cyaml", "CSafeLoader"]
)
})
{

View File

@@ -1,6 +1,5 @@
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::statement_visitor;
use ruff_python_ast::statement_visitor::StatementVisitor;
use ruff_python_ast::visitor::{Visitor, walk_expr, walk_stmt};
use ruff_python_ast::{self as ast, Expr, Stmt, StmtFunctionDef};
use ruff_text_size::TextRange;
@@ -96,6 +95,11 @@ pub(crate) fn return_in_generator(checker: &Checker, function_def: &StmtFunction
return;
}
// Async functions are flagged by the `ReturnInGenerator` semantic syntax error.
if function_def.is_async {
return;
}
let mut visitor = ReturnInGeneratorVisitor::default();
visitor.visit_body(&function_def.body);
@@ -112,15 +116,9 @@ struct ReturnInGeneratorVisitor {
has_yield: bool,
}
impl StatementVisitor<'_> for ReturnInGeneratorVisitor {
impl Visitor<'_> for ReturnInGeneratorVisitor {
fn visit_stmt(&mut self, stmt: &Stmt) {
match stmt {
Stmt::Expr(ast::StmtExpr { value, .. }) => match **value {
Expr::Yield(_) | Expr::YieldFrom(_) => {
self.has_yield = true;
}
_ => {}
},
Stmt::FunctionDef(_) => {
// Do not recurse into nested functions; they're evaluated separately.
}
@@ -130,8 +128,19 @@ impl StatementVisitor<'_> for ReturnInGeneratorVisitor {
node_index: _,
}) => {
self.return_ = Some(*range);
walk_stmt(self, stmt);
}
_ => statement_visitor::walk_stmt(self, stmt),
_ => walk_stmt(self, stmt),
}
}
fn visit_expr(&mut self, expr: &Expr) {
match expr {
Expr::Lambda(_) => {}
Expr::Yield(_) | Expr::YieldFrom(_) => {
self.has_yield = true;
}
_ => walk_expr(self, expr),
}
}
}

View File

@@ -21,3 +21,46 @@ B901 Using `yield` and `return {value}` in a generator function can lead to conf
37 |
38 | yield from not_broken()
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> B901.py:56:5
|
55 | def broken3():
56 | return (yield from [])
| ^^^^^^^^^^^^^^^^^^^^^^
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> B901.py:61:5
|
59 | def broken4():
60 | x = yield from []
61 | return x
| ^^^^^^^^
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> B901.py:72:5
|
71 | inner((yield from []))
72 | return x
| ^^^^^^^^
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> B901.py:83:5
|
81 | async def broken6():
82 | yield 1
83 | return foo()
| ^^^^^^^^^^^^
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> B901.py:88:5
|
86 | async def broken7():
87 | yield 1
88 | return [1, 2, 3]
| ^^^^^^^^^^^^^^^^
|

View File

@@ -0,0 +1,55 @@
---
source: crates/ruff_linter/src/linter.rs
---
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> resources/test/fixtures/syntax_errors/return_in_generator.py:3:5
|
1 | async def gen():
2 | yield 1
3 | return 42
| ^^^^^^^^^
4 |
5 | def gen(): # B901 but not a syntax error - not an async generator
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> resources/test/fixtures/syntax_errors/return_in_generator.py:7:5
|
5 | def gen(): # B901 but not a syntax error - not an async generator
6 | yield 1
7 | return 42
| ^^^^^^^^^
8 |
9 | async def gen(): # ok - no value in return
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> resources/test/fixtures/syntax_errors/return_in_generator.py:15:5
|
13 | async def gen():
14 | yield 1
15 | return foo()
| ^^^^^^^^^^^^
16 |
17 | async def gen():
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> resources/test/fixtures/syntax_errors/return_in_generator.py:19:5
|
17 | async def gen():
18 | yield 1
19 | return [1, 2, 3]
| ^^^^^^^^^^^^^^^^
20 |
21 | async def gen():
|
B901 Using `yield` and `return {value}` in a generator function can lead to confusing behavior
--> resources/test/fixtures/syntax_errors/return_in_generator.py:24:5
|
22 | if True:
23 | yield 1
24 | return 10
| ^^^^^^^^^
|

View File

@@ -3,12 +3,13 @@
//! This checker is not responsible for traversing the AST itself. Instead, its
//! [`SemanticSyntaxChecker::visit_stmt`] and [`SemanticSyntaxChecker::visit_expr`] methods should
//! be called in a parent `Visitor`'s `visit_stmt` and `visit_expr` methods, respectively.
use ruff_python_ast::{
self as ast, Expr, ExprContext, IrrefutablePatternKind, Pattern, PythonVersion, Stmt, StmtExpr,
StmtImportFrom,
StmtFunctionDef, StmtImportFrom,
comparable::ComparableExpr,
helpers,
visitor::{Visitor, walk_expr},
visitor::{Visitor, walk_expr, walk_stmt},
};
use ruff_text_size::{Ranged, TextRange, TextSize};
use rustc_hash::{FxBuildHasher, FxHashSet};
@@ -739,7 +740,21 @@ impl SemanticSyntaxChecker {
self.seen_futures_boundary = true;
}
}
Stmt::FunctionDef(_) => {
Stmt::FunctionDef(StmtFunctionDef { is_async, body, .. }) => {
if *is_async {
let mut visitor = ReturnVisitor::default();
visitor.visit_body(body);
if visitor.has_yield {
if let Some(return_range) = visitor.return_range {
Self::add_error(
ctx,
SemanticSyntaxErrorKind::ReturnInGenerator,
return_range,
);
}
}
}
self.seen_futures_boundary = true;
}
_ => {
@@ -1213,6 +1228,9 @@ impl Display for SemanticSyntaxError {
SemanticSyntaxErrorKind::NonlocalWithoutBinding(name) => {
write!(f, "no binding for nonlocal `{name}` found")
}
SemanticSyntaxErrorKind::ReturnInGenerator => {
write!(f, "`return` with value in async generator")
}
}
}
}
@@ -1619,6 +1637,9 @@ pub enum SemanticSyntaxErrorKind {
/// Represents a default type parameter followed by a non-default type parameter.
TypeParameterDefaultOrder(String),
/// Represents a `return` statement with a value in an asynchronous generator.
ReturnInGenerator,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, get_size2::GetSize)]
@@ -1735,6 +1756,40 @@ impl Visitor<'_> for ReboundComprehensionVisitor<'_> {
}
}
#[derive(Default)]
struct ReturnVisitor {
return_range: Option<TextRange>,
has_yield: bool,
}
impl Visitor<'_> for ReturnVisitor {
fn visit_stmt(&mut self, stmt: &Stmt) {
match stmt {
// Do not recurse into nested functions; they're evaluated separately.
Stmt::FunctionDef(_) | Stmt::ClassDef(_) => {}
Stmt::Return(ast::StmtReturn {
value: Some(_),
range,
..
}) => {
self.return_range = Some(*range);
walk_stmt(self, stmt);
}
_ => walk_stmt(self, stmt),
}
}
fn visit_expr(&mut self, expr: &Expr) {
match expr {
Expr::Lambda(_) => {}
Expr::Yield(_) | Expr::YieldFrom(_) => {
self.has_yield = true;
}
_ => walk_expr(self, expr),
}
}
}
struct MatchPatternVisitor<'a, Ctx> {
names: FxHashSet<&'a ast::name::Name>,
ctx: &'a Ctx,

View File

@@ -33,26 +33,29 @@ impl LineIndex {
line_starts.push(TextSize::default());
let bytes = text.as_bytes();
let mut utf8 = false;
assert!(u32::try_from(bytes.len()).is_ok());
for (i, byte) in bytes.iter().enumerate() {
utf8 |= !byte.is_ascii();
match byte {
// Only track one line break for `\r\n`.
b'\r' if bytes.get(i + 1) == Some(&b'\n') => continue,
b'\n' | b'\r' => {
// SAFETY: Assertion above guarantees `i <= u32::MAX`
#[expect(clippy::cast_possible_truncation)]
line_starts.push(TextSize::from(i as u32) + TextSize::from(1));
}
_ => {}
for i in memchr::memchr2_iter(b'\n', b'\r', bytes) {
// Skip `\r` in `\r\n` sequences (only count the `\n`).
if bytes[i] == b'\r' && bytes.get(i + 1) == Some(&b'\n') {
continue;
}
// SAFETY: Assertion above guarantees `i <= u32::MAX`
#[expect(clippy::cast_possible_truncation)]
line_starts.push(TextSize::from(i as u32) + TextSize::from(1));
}
let kind = if utf8 {
// Determine whether the source text is ASCII.
//
// Empirically, this simple loop is auto-vectorized by LLVM and benchmarks faster than both
// `str::is_ascii()` and hand-written SIMD.
let mut has_non_ascii = false;
for byte in bytes {
has_non_ascii |= !byte.is_ascii();
}
let kind = if has_non_ascii {
IndexKind::Utf8
} else {
IndexKind::Ascii

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_wasm"
version = "0.14.7"
version = "0.14.8"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -15,7 +15,7 @@ use ty_project::metadata::pyproject::{PyProject, Tool};
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
use ty_project::watch::{ChangeEvent, ProjectWatcher, directory_watcher};
use ty_project::{Db, ProjectDatabase, ProjectMetadata};
use ty_python_semantic::{Module, ModuleName, PythonPlatform, resolve_module};
use ty_python_semantic::{Module, ModuleName, PythonPlatform, resolve_module_confident};
struct TestCase {
db: ProjectDatabase,
@@ -232,7 +232,8 @@ impl TestCase {
}
fn module<'c>(&'c self, name: &str) -> Module<'c> {
resolve_module(self.db(), &ModuleName::new(name).unwrap()).expect("module to be present")
resolve_module_confident(self.db(), &ModuleName::new(name).unwrap())
.expect("module to be present")
}
fn sorted_submodule_names(&self, parent_module_name: &str) -> Vec<String> {
@@ -811,7 +812,8 @@ fn directory_moved_to_project() -> anyhow::Result<()> {
.with_context(|| "Failed to create __init__.py")?;
std::fs::write(a_original_path.as_std_path(), "").with_context(|| "Failed to create a.py")?;
let sub_a_module = resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap());
let sub_a_module =
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap());
assert_eq!(sub_a_module, None);
case.assert_indexed_project_files([bar]);
@@ -832,7 +834,9 @@ fn directory_moved_to_project() -> anyhow::Result<()> {
.expect("a.py to exist");
// `import sub.a` should now resolve
assert!(resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some()
);
case.assert_indexed_project_files([bar, init_file, a_file]);
@@ -848,7 +852,9 @@ fn directory_moved_to_trash() -> anyhow::Result<()> {
])?;
let bar = case.system_file(case.project_path("bar.py")).unwrap();
assert!(resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some()
);
let sub_path = case.project_path("sub");
let init_file = case
@@ -870,7 +876,9 @@ fn directory_moved_to_trash() -> anyhow::Result<()> {
case.apply_changes(changes, None);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_none());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_none()
);
assert!(!init_file.exists(case.db()));
assert!(!a_file.exists(case.db()));
@@ -890,8 +898,12 @@ fn directory_renamed() -> anyhow::Result<()> {
let bar = case.system_file(case.project_path("bar.py")).unwrap();
assert!(resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some());
assert!(resolve_module(case.db(), &ModuleName::new_static("foo.baz").unwrap()).is_none());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some()
);
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("foo.baz").unwrap()).is_none()
);
let sub_path = case.project_path("sub");
let sub_init = case
@@ -915,9 +927,13 @@ fn directory_renamed() -> anyhow::Result<()> {
case.apply_changes(changes, None);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_none());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_none()
);
// `import foo.baz` should now resolve
assert!(resolve_module(case.db(), &ModuleName::new_static("foo.baz").unwrap()).is_some());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("foo.baz").unwrap()).is_some()
);
// The old paths are no longer tracked
assert!(!sub_init.exists(case.db()));
@@ -950,7 +966,9 @@ fn directory_deleted() -> anyhow::Result<()> {
let bar = case.system_file(case.project_path("bar.py")).unwrap();
assert!(resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_some()
);
let sub_path = case.project_path("sub");
@@ -970,7 +988,9 @@ fn directory_deleted() -> anyhow::Result<()> {
case.apply_changes(changes, None);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_none());
assert!(
resolve_module_confident(case.db(), &ModuleName::new_static("sub.a").unwrap()).is_none()
);
assert!(!init_file.exists(case.db()));
assert!(!a_file.exists(case.db()));
@@ -999,7 +1019,7 @@ fn search_path() -> anyhow::Result<()> {
let site_packages = case.root_path().join("site_packages");
assert_eq!(
resolve_module(case.db(), &ModuleName::new("a").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new("a").unwrap()),
None
);
@@ -1009,7 +1029,7 @@ fn search_path() -> anyhow::Result<()> {
case.apply_changes(changes, None);
assert!(resolve_module(case.db(), &ModuleName::new_static("a").unwrap()).is_some());
assert!(resolve_module_confident(case.db(), &ModuleName::new_static("a").unwrap()).is_some());
case.assert_indexed_project_files([case.system_file(case.project_path("bar.py")).unwrap()]);
Ok(())
@@ -1022,7 +1042,7 @@ fn add_search_path() -> anyhow::Result<()> {
let site_packages = case.project_path("site_packages");
std::fs::create_dir_all(site_packages.as_std_path())?;
assert!(resolve_module(case.db(), &ModuleName::new_static("a").unwrap()).is_none());
assert!(resolve_module_confident(case.db(), &ModuleName::new_static("a").unwrap()).is_none());
// Register site-packages as a search path.
case.update_options(Options {
@@ -1040,7 +1060,7 @@ fn add_search_path() -> anyhow::Result<()> {
case.apply_changes(changes, None);
assert!(resolve_module(case.db(), &ModuleName::new_static("a").unwrap()).is_some());
assert!(resolve_module_confident(case.db(), &ModuleName::new_static("a").unwrap()).is_some());
Ok(())
}
@@ -1172,7 +1192,7 @@ fn changed_versions_file() -> anyhow::Result<()> {
// Unset the custom typeshed directory.
assert_eq!(
resolve_module(case.db(), &ModuleName::new("os").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new("os").unwrap()),
None
);
@@ -1187,7 +1207,7 @@ fn changed_versions_file() -> anyhow::Result<()> {
case.apply_changes(changes, None);
assert!(resolve_module(case.db(), &ModuleName::new("os").unwrap()).is_some());
assert!(resolve_module_confident(case.db(), &ModuleName::new("os").unwrap()).is_some());
Ok(())
}
@@ -1410,7 +1430,7 @@ mod unix {
Ok(())
})?;
let baz = resolve_module(case.db(), &ModuleName::new_static("bar.baz").unwrap())
let baz = resolve_module_confident(case.db(), &ModuleName::new_static("bar.baz").unwrap())
.expect("Expected bar.baz to exist in site-packages.");
let baz_project = case.project_path("bar/baz.py");
let baz_file = baz.file(case.db()).unwrap();
@@ -1486,7 +1506,7 @@ mod unix {
Ok(())
})?;
let baz = resolve_module(case.db(), &ModuleName::new_static("bar.baz").unwrap())
let baz = resolve_module_confident(case.db(), &ModuleName::new_static("bar.baz").unwrap())
.expect("Expected bar.baz to exist in site-packages.");
let baz_file = baz.file(case.db()).unwrap();
let bar_baz = case.project_path("bar/baz.py");
@@ -1591,7 +1611,7 @@ mod unix {
Ok(())
})?;
let baz = resolve_module(case.db(), &ModuleName::new_static("bar.baz").unwrap())
let baz = resolve_module_confident(case.db(), &ModuleName::new_static("bar.baz").unwrap())
.expect("Expected bar.baz to exist in site-packages.");
let baz_site_packages_path =
case.project_path(".venv/lib/python3.12/site-packages/bar/baz.py");
@@ -1854,11 +1874,11 @@ fn rename_files_casing_only() -> anyhow::Result<()> {
let mut case = setup([("lib.py", "class Foo: ...")])?;
assert!(
resolve_module(case.db(), &ModuleName::new("lib").unwrap()).is_some(),
resolve_module_confident(case.db(), &ModuleName::new("lib").unwrap()).is_some(),
"Expected `lib` module to exist."
);
assert_eq!(
resolve_module(case.db(), &ModuleName::new("Lib").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new("Lib").unwrap()),
None,
"Expected `Lib` module not to exist"
);
@@ -1891,13 +1911,13 @@ fn rename_files_casing_only() -> anyhow::Result<()> {
// Resolving `lib` should now fail but `Lib` should now succeed
assert_eq!(
resolve_module(case.db(), &ModuleName::new("lib").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new("lib").unwrap()),
None,
"Expected `lib` module to no longer exist."
);
assert!(
resolve_module(case.db(), &ModuleName::new("Lib").unwrap()).is_some(),
resolve_module_confident(case.db(), &ModuleName::new("Lib").unwrap()).is_some(),
"Expected `Lib` module to exist"
);

View File

@@ -1,4 +1,7 @@
name,file,index,rank
auto-import-includes-modules,main.py,0,1
auto-import-includes-modules,main.py,1,7
auto-import-includes-modules,main.py,2,1
auto-import-skips-current-module,main.py,0,1
fstring-completions,main.py,0,1
higher-level-symbols-preferred,main.py,0,
@@ -11,9 +14,9 @@ import-deprioritizes-type_check_only,main.py,2,1
import-deprioritizes-type_check_only,main.py,3,2
import-deprioritizes-type_check_only,main.py,4,3
import-keyword-completion,main.py,0,1
internal-typeshed-hidden,main.py,0,4
internal-typeshed-hidden,main.py,0,2
none-completion,main.py,0,2
numpy-array,main.py,0,
numpy-array,main.py,0,159
numpy-array,main.py,1,1
object-attr-instance-methods,main.py,0,1
object-attr-instance-methods,main.py,1,1
@@ -23,6 +26,6 @@ scope-existing-over-new-import,main.py,0,1
scope-prioritize-closer,main.py,0,2
scope-simple-long-identifier,main.py,0,1
tstring-completions,main.py,0,1
ty-extensions-lower-stdlib,main.py,0,8
ty-extensions-lower-stdlib,main.py,0,9
type-var-typing-over-ast,main.py,0,3
type-var-typing-over-ast,main.py,1,275
type-var-typing-over-ast,main.py,1,251


@@ -506,9 +506,21 @@ struct CompletionAnswer {
impl CompletionAnswer {
/// Returns true when this answer matches the completion given.
fn matches(&self, completion: &Completion) -> bool {
if let Some(ref qualified) = completion.qualified {
if qualified.as_str() == self.qualified() {
return true;
}
}
self.symbol == completion.name.as_str()
&& self.module.as_deref() == completion.module_name.map(ModuleName::as_str)
}
fn qualified(&self) -> String {
self.module
.as_ref()
.map(|module| format!("{module}.{}", self.symbol))
.unwrap_or_else(|| self.symbol.clone())
}
}
/// Copy the Python project from `src_dir` to `dst_dir`.


@@ -0,0 +1,2 @@
[settings]
auto-import = true


@@ -0,0 +1,3 @@
multiprocess<CURSOR: multiprocessing>
collect<CURSOR: collections>
collabc<CURSOR: collections.abc>


@@ -0,0 +1,5 @@
[project]
name = "test"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []


@@ -0,0 +1,8 @@
version = 1
revision = 3
requires-python = ">=3.13"
[[package]]
name = "test"
version = "0.1.0"
source = { virtual = "." }


@@ -2,7 +2,10 @@ use ruff_db::files::File;
use ty_project::Db;
use ty_python_semantic::{Module, ModuleName, all_modules, resolve_real_shadowable_module};
use crate::symbols::{QueryPattern, SymbolInfo, symbols_for_file_global_only};
use crate::{
SymbolKind,
symbols::{QueryPattern, SymbolInfo, symbols_for_file_global_only},
};
/// Get all symbols matching the query string.
///
@@ -20,7 +23,7 @@ pub fn all_symbols<'db>(
let typing_extensions = ModuleName::new("typing_extensions").unwrap();
let is_typing_extensions_available = importing_from.is_stub(db)
|| resolve_real_shadowable_module(db, &typing_extensions).is_some();
|| resolve_real_shadowable_module(db, importing_from, &typing_extensions).is_some();
let results = std::sync::Mutex::new(Vec::new());
{
@@ -36,18 +39,39 @@ pub fn all_symbols<'db>(
let Some(file) = module.file(&*db) else {
continue;
};
// By convention, modules starting with an underscore
// are generally considered unexported. However, we
// should consider first party modules fair game.
//
// Note that we apply this recursively. e.g.,
// `numpy._core.multiarray` is considered private
// because it's a child of `_core`.
if module.name(&*db).components().any(|c| c.starts_with('_'))
&& module
.search_path(&*db)
.is_none_or(|sp| !sp.is_first_party())
{
continue;
}
// TODO: also make it available in `TYPE_CHECKING` blocks
// (we'd need https://github.com/astral-sh/ty/issues/1553 to do this well)
if !is_typing_extensions_available && module.name(&*db) == &typing_extensions {
continue;
}
s.spawn(move |_| {
if query.is_match_symbol_name(module.name(&*db)) {
results.lock().unwrap().push(AllSymbolInfo {
symbol: None,
module,
file,
});
}
for (_, symbol) in symbols_for_file_global_only(&*db, file).search(query) {
// It seems like we could do better here than
// locking `results` for every single symbol,
// but this works pretty well as it is.
results.lock().unwrap().push(AllSymbolInfo {
symbol: symbol.to_owned(),
symbol: Some(symbol.to_owned()),
module,
file,
});
@@ -59,8 +83,16 @@ pub fn all_symbols<'db>(
let mut results = results.into_inner().unwrap();
results.sort_by(|s1, s2| {
let key1 = (&s1.symbol.name, s1.file.path(db).as_str());
let key2 = (&s2.symbol.name, s2.file.path(db).as_str());
let key1 = (
s1.name_in_file()
.unwrap_or_else(|| s1.module().name(db).as_str()),
s1.file.path(db).as_str(),
);
let key2 = (
s2.name_in_file()
.unwrap_or_else(|| s2.module().name(db).as_str()),
s2.file.path(db).as_str(),
);
key1.cmp(&key2)
});
results
@@ -71,14 +103,53 @@ pub fn all_symbols<'db>(
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct AllSymbolInfo<'db> {
/// The symbol information.
pub symbol: SymbolInfo<'static>,
///
/// When absent, this implies the symbol is the module itself.
symbol: Option<SymbolInfo<'static>>,
/// The module containing the symbol.
pub module: Module<'db>,
module: Module<'db>,
/// The file containing the symbol.
///
/// This `File` is guaranteed to be the same
/// as the `File` underlying `module`.
pub file: File,
file: File,
}
impl<'db> AllSymbolInfo<'db> {
/// Returns the name of this symbol as it exists in a file.
///
/// When absent, there is no concrete symbol defined in
/// any module. Instead, this represents importing a module.
/// In this case, if the caller needs a symbol name, they
/// should use `AllSymbolInfo::module().name()`.
pub fn name_in_file(&self) -> Option<&str> {
self.symbol.as_ref().map(|symbol| &*symbol.name)
}
/// Returns the "kind" of this symbol.
///
/// The kind of a symbol in the context of auto-import is
/// determined on a best effort basis. It may be imprecise
/// in some cases, e.g., reporting a module as a variable.
pub fn kind(&self) -> SymbolKind {
self.symbol
.as_ref()
.map(|symbol| symbol.kind)
.unwrap_or(SymbolKind::Module)
}
/// Returns the module this symbol is exported from.
pub fn module(&self) -> Module<'db> {
self.module
}
/// Returns the `File` corresponding to the module.
///
/// This is always equivalent to
/// `AllSymbolInfo::module().file().unwrap()`.
pub fn file(&self) -> File {
self.file
}
}
#[cfg(test)]
@@ -162,25 +233,31 @@ ABCDEFGHIJKLMNOP = 'https://api.example.com'
return "No symbols found".to_string();
}
self.render_diagnostics(symbols.into_iter().map(AllSymbolDiagnostic::new))
self.render_diagnostics(symbols.into_iter().map(|symbol_info| AllSymbolDiagnostic {
db: &self.db,
symbol_info,
}))
}
}
struct AllSymbolDiagnostic<'db> {
db: &'db dyn Db,
symbol_info: AllSymbolInfo<'db>,
}
impl<'db> AllSymbolDiagnostic<'db> {
fn new(symbol_info: AllSymbolInfo<'db>) -> Self {
Self { symbol_info }
}
}
impl IntoDiagnostic for AllSymbolDiagnostic<'_> {
fn into_diagnostic(self) -> Diagnostic {
let symbol_kind_str = self.symbol_info.symbol.kind.to_string();
let symbol_kind_str = self.symbol_info.kind().to_string();
let info_text = format!("{} {}", symbol_kind_str, self.symbol_info.symbol.name);
let info_text = format!(
"{} {}",
symbol_kind_str,
self.symbol_info.name_in_file().unwrap_or_else(|| self
.symbol_info
.module()
.name(self.db)
.as_str())
);
let sub = SubDiagnostic::new(SubDiagnosticSeverity::Info, info_text);
@@ -189,9 +266,12 @@ ABCDEFGHIJKLMNOP = 'https://api.example.com'
Severity::Info,
"AllSymbolInfo".to_string(),
);
main.annotate(Annotation::primary(
Span::from(self.symbol_info.file).with_range(self.symbol_info.symbol.name_range),
));
let mut span = Span::from(self.symbol_info.file());
if let Some(ref symbol) = self.symbol_info.symbol {
span = span.with_range(symbol.name_range);
}
main.annotate(Annotation::primary(span));
main.sub(sub);
main


@@ -74,7 +74,7 @@ impl<'db> Completions<'db> {
.into_iter()
.filter_map(|item| {
Some(ImportEdit {
label: format!("import {}.{}", item.module_name?, item.name),
label: format!("import {}", item.qualified?),
edit: item.import?,
})
})
@@ -160,6 +160,10 @@ impl<'db> Extend<Completion<'db>> for Completions<'db> {
pub struct Completion<'db> {
/// The label shown to the user for this suggestion.
pub name: Name,
/// The fully qualified name, when available.
///
/// This is only set when `module_name` is available.
pub qualified: Option<Name>,
/// The text that should be inserted at the cursor
/// when the completion is selected.
///
@@ -225,6 +229,7 @@ impl<'db> Completion<'db> {
let is_type_check_only = semantic.is_type_check_only(db);
Completion {
name: semantic.name,
qualified: None,
insert: None,
ty: semantic.ty,
kind: None,
@@ -306,6 +311,7 @@ impl<'db> Completion<'db> {
fn keyword(name: &str) -> Self {
Completion {
name: name.into(),
qualified: None,
insert: None,
ty: None,
kind: Some(CompletionKind::Keyword),
@@ -321,6 +327,7 @@ impl<'db> Completion<'db> {
fn value_keyword(name: &str, ty: Type<'db>) -> Completion<'db> {
Completion {
name: name.into(),
qualified: None,
insert: None,
ty: Some(ty),
kind: Some(CompletionKind::Keyword),
@@ -537,12 +544,22 @@ fn add_unimported_completions<'db>(
let members = importer.members_in_scope_at(scoped.node, scoped.node.start());
for symbol in all_symbols(db, file, &completions.query) {
if symbol.module.file(db) == Some(file) || symbol.module.is_known(db, KnownModule::Builtins)
{
if symbol.file() == file || symbol.module().is_known(db, KnownModule::Builtins) {
continue;
}
let request = create_import_request(symbol.module.name(db), &symbol.symbol.name);
let module_name = symbol.module().name(db);
let (name, qualified, request) = symbol
.name_in_file()
.map(|name| {
let qualified = format!("{module_name}.{name}");
(name, qualified, create_import_request(module_name, name))
})
.unwrap_or_else(|| {
let name = module_name.as_str();
let qualified = name.to_string();
(name, qualified, ImportRequest::module(name))
});
// FIXME: `all_symbols` doesn't account for wildcard imports.
// Since we're looking at every module, this is probably
// "fine," but it might mean that we import a symbol from the
@@ -551,11 +568,12 @@ fn add_unimported_completions<'db>(
// N.B. We use `add` here because `all_symbols` already
// takes our query into account.
completions.force_add(Completion {
name: ast::name::Name::new(&symbol.symbol.name),
name: ast::name::Name::new(name),
qualified: Some(ast::name::Name::new(qualified)),
insert: Some(import_action.symbol_text().into()),
ty: None,
kind: symbol.symbol.kind.to_completion_kind(),
module_name: Some(symbol.module.name(db)),
kind: symbol.kind().to_completion_kind(),
module_name: Some(module_name),
import: import_action.import().cloned(),
builtin: false,
// TODO: `is_type_check_only` requires inferring the type of the symbol
@@ -4350,7 +4368,7 @@ from os.<CURSOR>
.build()
.snapshot();
assert_snapshot!(snapshot, @r"
Kadabra :: Literal[1] :: Current module
Kadabra :: Literal[1] :: <no import required>
AbraKadabra :: Unavailable :: package
");
}
@@ -5534,7 +5552,7 @@ def foo(param: s<CURSOR>)
// Even though long_namea is alphabetically before long_nameb,
// long_nameb is currently imported and should be preferred.
assert_snapshot!(snapshot, @r"
long_nameb :: Literal[1] :: Current module
long_nameb :: Literal[1] :: <no import required>
long_namea :: Unavailable :: foo
");
}
@@ -5804,7 +5822,7 @@ from .imp<CURSOR>
#[test]
fn typing_extensions_excluded_from_import() {
let builder = completion_test_builder("from typing<CURSOR>").module_names();
assert_snapshot!(builder.build().snapshot(), @"typing :: Current module");
assert_snapshot!(builder.build().snapshot(), @"typing :: <no import required>");
}
#[test]
@@ -5812,13 +5830,7 @@ from .imp<CURSOR>
let builder = completion_test_builder("deprecated<CURSOR>")
.auto_import()
.module_names();
assert_snapshot!(builder.build().snapshot(), @r"
Deprecated :: importlib.metadata
DeprecatedList :: importlib.metadata
DeprecatedNonAbstract :: importlib.metadata
DeprecatedTuple :: importlib.metadata
deprecated :: warnings
");
assert_snapshot!(builder.build().snapshot(), @"deprecated :: warnings");
}
#[test]
@@ -5829,8 +5841,8 @@ from .imp<CURSOR>
.completion_test_builder()
.module_names();
assert_snapshot!(builder.build().snapshot(), @r"
typing :: Current module
typing_extensions :: Current module
typing :: <no import required>
typing_extensions :: <no import required>
");
}
@@ -5843,10 +5855,6 @@ from .imp<CURSOR>
.auto_import()
.module_names();
assert_snapshot!(builder.build().snapshot(), @r"
Deprecated :: importlib.metadata
DeprecatedList :: importlib.metadata
DeprecatedNonAbstract :: importlib.metadata
DeprecatedTuple :: importlib.metadata
deprecated :: typing_extensions
deprecated :: warnings
");
@@ -5859,8 +5867,8 @@ from .imp<CURSOR>
.completion_test_builder()
.module_names();
assert_snapshot!(builder.build().snapshot(), @r"
typing :: Current module
typing_extensions :: Current module
typing :: <no import required>
typing_extensions :: <no import required>
");
}
@@ -5872,15 +5880,284 @@ from .imp<CURSOR>
.auto_import()
.module_names();
assert_snapshot!(builder.build().snapshot(), @r"
Deprecated :: importlib.metadata
DeprecatedList :: importlib.metadata
DeprecatedNonAbstract :: importlib.metadata
DeprecatedTuple :: importlib.metadata
deprecated :: typing_extensions
deprecated :: warnings
");
}
#[test]
fn reexport_simple_import_noauto() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
import foo
foo.ZQ<CURSOR>
"#,
)
.source("foo.py", r#"from bar import ZQZQ"#)
.source("bar.py", r#"ZQZQ = 1"#)
.completion_test_builder()
.module_names()
.build()
.snapshot();
assert_snapshot!(snapshot, @"ZQZQ :: <no import required>");
}
#[test]
fn reexport_simple_import_auto() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
ZQ<CURSOR>
"#,
)
.source("foo.py", r#"from bar import ZQZQ"#)
.source("bar.py", r#"ZQZQ = 1"#)
.completion_test_builder()
.auto_import()
.module_names()
.build()
.snapshot();
// We're specifically looking for `ZQZQ` in `bar`
// here but *not* in `foo`. Namely, in `foo`,
// `ZQZQ` is a "regular" import that is not by
// convention considered a re-export.
assert_snapshot!(snapshot, @"ZQZQ :: bar");
}
#[test]
fn reexport_redundant_convention_import_noauto() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
import foo
foo.ZQ<CURSOR>
"#,
)
.source("foo.py", r#"from bar import ZQZQ as ZQZQ"#)
.source("bar.py", r#"ZQZQ = 1"#)
.completion_test_builder()
.module_names()
.build()
.snapshot();
assert_snapshot!(snapshot, @"ZQZQ :: <no import required>");
}
#[test]
fn reexport_redundant_convention_import_auto() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
ZQ<CURSOR>
"#,
)
.source("foo.py", r#"from bar import ZQZQ as ZQZQ"#)
.source("bar.py", r#"ZQZQ = 1"#)
.completion_test_builder()
.auto_import()
.module_names()
.build()
.snapshot();
assert_snapshot!(snapshot, @r"
ZQZQ :: bar
ZQZQ :: foo
");
}
#[test]
fn auto_import_respects_all() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
ZQ<CURSOR>
"#,
)
.source(
"bar.py",
r#"
ZQZQ1 = 1
ZQZQ2 = 1
__all__ = ['ZQZQ1']
"#,
)
.completion_test_builder()
.auto_import()
.module_names()
.build()
.snapshot();
// We specifically do not want `ZQZQ2` here, since
// it is not part of `__all__`.
assert_snapshot!(snapshot, @r"
ZQZQ1 :: bar
");
}
// This test confirms current behavior (as of 2025-12-04), but
// it's not consistent with auto-import. That is, it doesn't
// strictly respect `__all__` on `bar`, but perhaps it should.
//
// See: https://github.com/astral-sh/ty/issues/1757
#[test]
fn object_attr_ignores_all() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
import bar
bar.ZQ<CURSOR>
"#,
)
.source(
"bar.py",
r#"
ZQZQ1 = 1
ZQZQ2 = 1
__all__ = ['ZQZQ1']
"#,
)
.completion_test_builder()
.auto_import()
.module_names()
.build()
.snapshot();
// Note that `ZQZQ2` *is* included here, even though
// it is not part of `__all__`.
assert_snapshot!(snapshot, @r"
ZQZQ1 :: <no import required>
ZQZQ2 :: <no import required>
");
}
#[test]
fn auto_import_ignores_modules_with_leading_underscore() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
Quitter<CURSOR>
"#,
)
.completion_test_builder()
.auto_import()
.module_names()
.build()
.snapshot();
// There is a `Quitter` in `_sitebuiltins` in the standard
// library. But this is skipped by auto-import because it's
// 1) not first party and 2) starts with an `_`.
assert_snapshot!(snapshot, @"<No completions found>");
}
#[test]
fn auto_import_includes_modules_with_leading_underscore_in_first_party() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
ZQ<CURSOR>
"#,
)
.source(
"bar.py",
r#"
ZQZQ1 = 1
"#,
)
.source(
"_foo.py",
r#"
ZQZQ1 = 1
"#,
)
.completion_test_builder()
.auto_import()
.module_names()
.build()
.snapshot();
assert_snapshot!(snapshot, @r"
ZQZQ1 :: _foo
ZQZQ1 :: bar
");
}
#[test]
fn auto_import_includes_stdlib_modules_as_suggestions() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
multiprocess<CURSOR>
"#,
)
.completion_test_builder()
.auto_import()
.build()
.snapshot();
assert_snapshot!(snapshot, @r"
multiprocessing
multiprocessing.connection
multiprocessing.context
multiprocessing.dummy
multiprocessing.dummy.connection
multiprocessing.forkserver
multiprocessing.heap
multiprocessing.managers
multiprocessing.pool
multiprocessing.popen_fork
multiprocessing.popen_forkserver
multiprocessing.popen_spawn_posix
multiprocessing.popen_spawn_win32
multiprocessing.process
multiprocessing.queues
multiprocessing.reduction
multiprocessing.resource_sharer
multiprocessing.resource_tracker
multiprocessing.shared_memory
multiprocessing.sharedctypes
multiprocessing.spawn
multiprocessing.synchronize
multiprocessing.util
");
}
#[test]
fn auto_import_includes_first_party_modules_as_suggestions() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
zqzqzq<CURSOR>
"#,
)
.source("zqzqzqzqzq.py", "")
.completion_test_builder()
.auto_import()
.build()
.snapshot();
assert_snapshot!(snapshot, @"zqzqzqzqzq");
}
#[test]
fn auto_import_includes_sub_modules_as_suggestions() {
let snapshot = CursorTest::builder()
.source(
"main.py",
r#"
collabc<CURSOR>
"#,
)
.completion_test_builder()
.auto_import()
.build()
.snapshot();
assert_snapshot!(snapshot, @"collections.abc");
}
/// A way to create a simple single-file (named `main.py`) completion test
/// builder.
///
@@ -6055,7 +6332,7 @@ from .imp<CURSOR>
let module_name = c
.module_name
.map(ModuleName::as_str)
.unwrap_or("Current module");
.unwrap_or("<no import required>");
snapshot = format!("{snapshot} :: {module_name}");
}
snapshot


@@ -230,10 +230,58 @@ calc = Calculator()
"
def test():
# Cursor on a position with no symbol
<CURSOR>
<CURSOR>
",
);
assert_snapshot!(test.document_highlights(), @"No highlights found");
}
// TODO: Should only highlight the last use and the last declaration
#[test]
fn redeclarations() {
let test = CursorTest::builder()
.source(
"main.py",
r#"
a: str = "test"
a: int = 10
print(a<CURSOR>)
"#,
)
.build();
assert_snapshot!(test.document_highlights(), @r#"
info[document_highlights]: Highlight 1 (Write)
--> main.py:2:1
|
2 | a: str = "test"
| ^
3 |
4 | a: int = 10
|
info[document_highlights]: Highlight 2 (Write)
--> main.py:4:1
|
2 | a: str = "test"
3 |
4 | a: int = 10
| ^
5 |
6 | print(a)
|
info[document_highlights]: Highlight 3 (Read)
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
|
"#);
}
}


@@ -824,12 +824,12 @@ mod tests {
Check out this great example code::
x_y = "hello"
if len(x_y) > 4:
print(x_y)
else:
print("too short :(")
print("done")
You love to see it.
@@ -862,12 +862,12 @@ mod tests {
Check out this great example code ::
x_y = "hello"
if len(x_y) > 4:
print(x_y)
else:
print("too short :(")
print("done")
You love to see it.
@@ -901,12 +901,12 @@ mod tests {
::
x_y = "hello"
if len(x_y) > 4:
print(x_y)
else:
print("too short :(")
print("done")
You love to see it.
@@ -939,12 +939,12 @@ mod tests {
let docstring = r#"
Check out this great example code::
x_y = "hello"
if len(x_y) > 4:
print(x_y)
else:
print("too short :(")
print("done")
You love to see it.
"#;
@@ -975,12 +975,12 @@ mod tests {
Check out this great example code::
x_y = "hello"
if len(x_y) > 4:
print(x_y)
else:
print("too short :(")
print("done")"#;
let docstring = Docstring::new(docstring.to_owned());


@@ -898,6 +898,42 @@ cls = MyClass
assert_snapshot!(test.references(), @"No references found");
}
#[test]
fn references_string_annotation_recursive() {
let test = cursor_test(
r#"
ab: "a<CURSOR>b"
"#,
);
assert_snapshot!(test.references(), @r#"
info[references]: Reference 1
--> main.py:2:1
|
2 | ab: "ab"
| ^^
|
info[references]: Reference 2
--> main.py:2:6
|
2 | ab: "ab"
| ^^
|
"#);
}
#[test]
fn references_string_annotation_unknown() {
let test = cursor_test(
r#"
x: "foo<CURSOR>bar"
"#,
);
assert_snapshot!(test.references(), @"No references found");
}
#[test]
fn references_match_name_stmt() {
let test = cursor_test(
@@ -1870,4 +1906,259 @@ func<CURSOR>_alias()
|
");
}
#[test]
fn references_submodule_import_from_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>pkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// TODO(submodule-imports): this should light up both instances of `subpkg`
assert_snapshot!(test.references(), @r"
info[references]: Reference 1
--> mypackage/__init__.py:4:5
|
2 | from .subpkg.submod import val
3 |
4 | x = subpkg
| ^^^^^^
|
");
}
#[test]
fn references_submodule_import_from_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg.submod import val
x = subpkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// TODO(submodule-imports): this should light up both instances of `subpkg`
assert_snapshot!(test.references(), @"No references found");
}
#[test]
fn references_submodule_import_from_wrong_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>mod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// Returning no references is actually correct (or it should only see itself)
assert_snapshot!(test.references(), @"No references found");
}
#[test]
fn references_submodule_import_from_wrong_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.sub<CURSOR>mod import val
x = submod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// Returning no references is actually correct (or it should only see itself)
assert_snapshot!(test.references(), @"No references found");
}
#[test]
fn references_submodule_import_from_confusing_shadowed_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg import subpkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// Returning no references is actually correct (or it should only see itself)
assert_snapshot!(test.references(), @"No references found");
}
#[test]
fn references_submodule_import_from_confusing_real_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import sub<CURSOR>pkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
assert_snapshot!(test.references(), @r"
info[references]: Reference 1
--> mypackage/__init__.py:2:21
|
2 | from .subpkg import subpkg
| ^^^^^^
3 |
4 | x = subpkg
|
info[references]: Reference 2
--> mypackage/__init__.py:4:5
|
2 | from .subpkg import subpkg
3 |
4 | x = subpkg
| ^^^^^^
|
info[references]: Reference 3
--> mypackage/subpkg/__init__.py:2:1
|
2 | subpkg: int = 10
| ^^^^^^
|
");
}
#[test]
fn references_submodule_import_from_confusing_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import subpkg
x = sub<CURSOR>pkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// TODO: this should also highlight the RHS subpkg in the import
assert_snapshot!(test.references(), @r"
info[references]: Reference 1
--> mypackage/__init__.py:4:5
|
2 | from .subpkg import subpkg
3 |
4 | x = subpkg
| ^^^^^^
|
");
}
// TODO: Should only return references to the last declaration
#[test]
fn declarations() {
let test = CursorTest::builder()
.source(
"main.py",
r#"
a: str = "test"
a: int = 10
print(a<CURSOR>)
"#,
)
.build();
assert_snapshot!(test.references(), @r#"
info[references]: Reference 1
--> main.py:2:1
|
2 | a: str = "test"
| ^
3 |
4 | a: int = 10
|
info[references]: Reference 2
--> main.py:4:1
|
2 | a: str = "test"
3 |
4 | a: int = 10
| ^
5 |
6 | print(a)
|
info[references]: Reference 3
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
|
"#);
}
}


@@ -73,19 +73,29 @@ pub(crate) enum GotoTarget<'a> {
/// ```
ImportModuleAlias {
alias: &'a ast::Alias,
asname: &'a ast::Identifier,
},
/// In an import statement, the name under which the symbol is exported
/// in the imported file.
///
/// ```py
/// from foo import bar as baz
/// ^^^
/// ```
ImportExportedName {
alias: &'a ast::Alias,
import_from: &'a ast::StmtImportFrom,
},
/// Import alias in a `from ... import` statement
/// ```py
/// from foo import bar as baz
/// ^^^
/// from foo import bar as baz
/// ^^^
/// ```
ImportSymbolAlias {
alias: &'a ast::Alias,
range: TextRange,
import_from: &'a ast::StmtImportFrom,
asname: &'a ast::Identifier,
},
/// Go to on the exception handler variable
@@ -290,8 +300,9 @@ impl GotoTarget<'_> {
GotoTarget::FunctionDef(function) => function.inferred_type(model),
GotoTarget::ClassDef(class) => class.inferred_type(model),
GotoTarget::Parameter(parameter) => parameter.inferred_type(model),
GotoTarget::ImportSymbolAlias { alias, .. } => alias.inferred_type(model),
GotoTarget::ImportModuleAlias { alias } => alias.inferred_type(model),
GotoTarget::ImportSymbolAlias { alias, .. }
| GotoTarget::ImportModuleAlias { alias, .. }
| GotoTarget::ImportExportedName { alias, .. } => alias.inferred_type(model),
GotoTarget::ExceptVariable(except) => except.inferred_type(model),
GotoTarget::KeywordArgument { keyword, .. } => keyword.value.inferred_type(model),
// When asking the type of a callable, usually you want the callable itself?
@@ -378,7 +389,9 @@ impl GotoTarget<'_> {
alias_resolution: ImportAliasResolution,
) -> Option<Definitions<'db>> {
let definitions = match self {
GotoTarget::Expression(expression) => definitions_for_expression(model, *expression),
GotoTarget::Expression(expression) => {
definitions_for_expression(model, *expression, alias_resolution)
}
// For already-defined symbols, they are their own definitions
GotoTarget::FunctionDef(function) => Some(vec![ResolvedDefinition::Definition(
function.definition(model),
@@ -393,22 +406,21 @@ impl GotoTarget<'_> {
)]),
// For import aliases (offset within 'y' or 'z' in "from x import y as z")
GotoTarget::ImportSymbolAlias {
alias, import_from, ..
} => {
if let Some(asname) = alias.asname.as_ref()
&& alias_resolution == ImportAliasResolution::PreserveAliases
{
Some(definitions_for_name(model, asname.as_str(), asname.into()))
} else {
let symbol_name = alias.name.as_str();
Some(definitions_for_imported_symbol(
model,
import_from,
symbol_name,
alias_resolution,
))
}
GotoTarget::ImportSymbolAlias { asname, .. } => Some(definitions_for_name(
model,
asname.as_str(),
AnyNodeRef::from(*asname),
alias_resolution,
)),
GotoTarget::ImportExportedName { alias, import_from } => {
let symbol_name = alias.name.as_str();
Some(definitions_for_imported_symbol(
model,
import_from,
symbol_name,
alias_resolution,
))
}
GotoTarget::ImportModuleComponent {
@@ -423,15 +435,12 @@ impl GotoTarget<'_> {
}
// Handle import aliases (offset within 'z' in "import x.y as z")
GotoTarget::ImportModuleAlias { alias } => {
if let Some(asname) = alias.asname.as_ref()
&& alias_resolution == ImportAliasResolution::PreserveAliases
{
Some(definitions_for_name(model, asname.as_str(), asname.into()))
} else {
definitions_for_module(model, Some(alias.name.as_str()), 0)
}
}
GotoTarget::ImportModuleAlias { asname, .. } => Some(definitions_for_name(
model,
asname.as_str(),
AnyNodeRef::from(*asname),
alias_resolution,
)),
// Handle keyword arguments in call expressions
GotoTarget::KeywordArgument {
@@ -454,12 +463,22 @@ impl GotoTarget<'_> {
// because they're not expressions
GotoTarget::PatternMatchRest(pattern_mapping) => {
pattern_mapping.rest.as_ref().map(|name| {
definitions_for_name(model, name.as_str(), AnyNodeRef::Identifier(name))
definitions_for_name(
model,
name.as_str(),
AnyNodeRef::Identifier(name),
alias_resolution,
)
})
}
GotoTarget::PatternMatchAsName(pattern_as) => pattern_as.name.as_ref().map(|name| {
definitions_for_name(model, name.as_str(), AnyNodeRef::Identifier(name))
definitions_for_name(
model,
name.as_str(),
AnyNodeRef::Identifier(name),
alias_resolution,
)
}),
GotoTarget::PatternKeywordArgument(pattern_keyword) => {
@@ -468,12 +487,18 @@ impl GotoTarget<'_> {
model,
name.as_str(),
AnyNodeRef::Identifier(name),
alias_resolution,
))
}
GotoTarget::PatternMatchStarName(pattern_star) => {
pattern_star.name.as_ref().map(|name| {
definitions_for_name(model, name.as_str(), AnyNodeRef::Identifier(name))
definitions_for_name(
model,
name.as_str(),
AnyNodeRef::Identifier(name),
alias_resolution,
)
})
}
@@ -481,9 +506,18 @@ impl GotoTarget<'_> {
//
// Prefer the function impl over the callable so that its docstrings win if defined.
GotoTarget::Call { callable, call } => {
let mut definitions = definitions_for_callable(model, call);
let mut definitions = Vec::new();
// We prefer the specific overload for hover, go-to-def etc. However,
// `definitions_for_callable` always resolves import aliases. That's why we
// skip it in cases where import alias resolution is turned off (rename, highlight references).
if alias_resolution == ImportAliasResolution::ResolveAliases {
definitions.extend(definitions_for_callable(model, call));
}
let expr_definitions =
definitions_for_expression(model, *callable).unwrap_or_default();
definitions_for_expression(model, *callable, alias_resolution)
.unwrap_or_default();
definitions.extend(expr_definitions);
if definitions.is_empty() {
@@ -517,7 +551,7 @@ impl GotoTarget<'_> {
let subexpr = covering_node(subast.syntax().into(), *subrange)
.node()
.as_expr_ref()?;
definitions_for_expression(&submodel, subexpr)
definitions_for_expression(&submodel, subexpr, alias_resolution)
}
// nonlocal and global are essentially loads, but again they're statements,
@@ -527,6 +561,7 @@ impl GotoTarget<'_> {
model,
identifier.as_str(),
AnyNodeRef::Identifier(identifier),
alias_resolution,
))
}
@@ -537,6 +572,7 @@ impl GotoTarget<'_> {
model,
name.as_str(),
AnyNodeRef::Identifier(name),
alias_resolution,
))
}
@@ -546,6 +582,7 @@ impl GotoTarget<'_> {
model,
name.as_str(),
AnyNodeRef::Identifier(name),
alias_resolution,
))
}
@@ -555,6 +592,7 @@ impl GotoTarget<'_> {
model,
name.as_str(),
AnyNodeRef::Identifier(name),
alias_resolution,
))
}
};
@@ -580,12 +618,9 @@ impl GotoTarget<'_> {
GotoTarget::FunctionDef(function) => Some(Cow::Borrowed(function.name.as_str())),
GotoTarget::ClassDef(class) => Some(Cow::Borrowed(class.name.as_str())),
GotoTarget::Parameter(parameter) => Some(Cow::Borrowed(parameter.name.as_str())),
GotoTarget::ImportSymbolAlias { alias, .. } => {
if let Some(asname) = &alias.asname {
Some(Cow::Borrowed(asname.as_str()))
} else {
Some(Cow::Borrowed(alias.name.as_str()))
}
GotoTarget::ImportSymbolAlias { asname, .. } => Some(Cow::Borrowed(asname.as_str())),
GotoTarget::ImportExportedName { alias, .. } => {
Some(Cow::Borrowed(alias.name.as_str()))
}
GotoTarget::ImportModuleComponent {
module_name,
@@ -599,13 +634,7 @@ impl GotoTarget<'_> {
Some(Cow::Borrowed(module_name))
}
}
GotoTarget::ImportModuleAlias { alias } => {
if let Some(asname) = &alias.asname {
Some(Cow::Borrowed(asname.as_str()))
} else {
Some(Cow::Borrowed(alias.name.as_str()))
}
}
GotoTarget::ImportModuleAlias { asname, .. } => Some(Cow::Borrowed(asname.as_str())),
GotoTarget::ExceptVariable(except) => {
Some(Cow::Borrowed(except.name.as_ref()?.as_str()))
}
@@ -667,7 +696,7 @@ impl GotoTarget<'_> {
// Is the offset within the alias name (asname) part?
if let Some(asname) = &alias.asname {
if asname.range.contains_inclusive(offset) {
return Some(GotoTarget::ImportModuleAlias { alias });
return Some(GotoTarget::ImportModuleAlias { alias, asname });
}
}
@@ -699,21 +728,13 @@ impl GotoTarget<'_> {
// Is the offset within the alias name (asname) part?
if let Some(asname) = &alias.asname {
if asname.range.contains_inclusive(offset) {
return Some(GotoTarget::ImportSymbolAlias {
alias,
range: asname.range,
import_from,
});
return Some(GotoTarget::ImportSymbolAlias { alias, asname });
}
}
// Is the offset in the original name part?
if alias.name.range.contains_inclusive(offset) {
return Some(GotoTarget::ImportSymbolAlias {
alias,
range: alias.name.range,
import_from,
});
return Some(GotoTarget::ImportExportedName { alias, import_from });
}
None
@@ -893,12 +914,13 @@ impl Ranged for GotoTarget<'_> {
GotoTarget::FunctionDef(function) => function.name.range,
GotoTarget::ClassDef(class) => class.name.range,
GotoTarget::Parameter(parameter) => parameter.name.range,
GotoTarget::ImportSymbolAlias { range, .. } => *range,
GotoTarget::ImportSymbolAlias { asname, .. } => asname.range,
Self::ImportExportedName { alias, .. } => alias.name.range,
GotoTarget::ImportModuleComponent {
component_range, ..
} => *component_range,
GotoTarget::StringAnnotationSubexpr { subrange, .. } => *subrange,
GotoTarget::ImportModuleAlias { alias } => alias.asname.as_ref().unwrap().range,
GotoTarget::ImportModuleAlias { asname, .. } => asname.range,
GotoTarget::ExceptVariable(except) => except.name.as_ref().unwrap().range,
GotoTarget::KeywordArgument { keyword, .. } => keyword.arg.as_ref().unwrap().range,
GotoTarget::PatternMatchRest(rest) => rest.rest.as_ref().unwrap().range,
@@ -955,12 +977,14 @@ fn convert_resolved_definitions_to_targets<'db>(
fn definitions_for_expression<'db>(
model: &SemanticModel<'db>,
expression: ruff_python_ast::ExprRef<'_>,
alias_resolution: ImportAliasResolution,
) -> Option<Vec<ResolvedDefinition<'db>>> {
match expression {
ast::ExprRef::Name(name) => Some(definitions_for_name(
model,
name.id.as_str(),
expression.into(),
alias_resolution,
)),
ast::ExprRef::Attribute(attribute) => Some(ty_python_semantic::definitions_for_attribute(
model, attribute,


@@ -273,7 +273,7 @@ mod tests {
r#"
class A:
    x = 1

    def method(self):
        def inner():
            return <CURSOR>x # Should NOT find class variable x
@@ -1073,6 +1073,41 @@ def another_helper(path):
assert_snapshot!(test.goto_declaration(), @"No goto target found");
}
#[test]
fn goto_declaration_string_annotation_recursive() {
let test = cursor_test(
r#"
ab: "a<CURSOR>b"
"#,
);
assert_snapshot!(test.goto_declaration(), @r#"
info[goto-declaration]: Declaration
--> main.py:2:1
|
2 | ab: "ab"
| ^^
|
info: Source
--> main.py:2:6
|
2 | ab: "ab"
| ^^
|
"#);
}
#[test]
fn goto_declaration_string_annotation_unknown() {
let test = cursor_test(
r#"
x: "foo<CURSOR>bar"
"#,
);
assert_snapshot!(test.goto_declaration(), @"No goto target found");
}
#[test]
fn goto_declaration_nested_instance_attribute() {
let test = cursor_test(
@@ -1220,12 +1255,12 @@ x: i<CURSOR>nt = 42
r#"
def outer():
    x = "outer_value"

    def inner():
        nonlocal x
        x = "modified"
        return x<CURSOR> # Should find the nonlocal x declaration in outer scope

    return inner
"#,
);
@@ -1260,12 +1295,12 @@ def outer():
r#"
def outer():
    xy = "outer_value"

    def inner():
        nonlocal x<CURSOR>y
        xy = "modified"
        return x # Should find the nonlocal x declaration in outer scope

    return inner
"#,
);
@@ -1601,7 +1636,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, button=a<CURSOR>b):
@@ -1640,7 +1675,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, button=ab):
@@ -1678,7 +1713,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Cl<CURSOR>ick(x, button=ab):
@@ -1716,7 +1751,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, but<CURSOR>ton=ab):
@@ -1884,7 +1919,7 @@ def function():
class C:
    def __init__(self):
        self._value = 0

    @property
    def value(self):
        return self._value
@@ -1994,7 +2029,7 @@ def function():
r#"
class MyClass:
    ClassType = int

    def generic_method[T](self, value: Class<CURSOR>Type) -> T:
        return value
"#,
@@ -2567,6 +2602,378 @@ def ab(a: int, *, c: int): ...
");
}
#[test]
fn goto_declaration_submodule_import_from_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>pkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// TODO(submodule-imports): this should only highlight `subpkg` in the import statement
// This happens because DefinitionKind::ImportFromSubmodule claims the entire ImportFrom node,
// which is correct but unhelpful. Unfortunately even if it only claimed the LHS identifier it
// would highlight `subpkg.submod` which is strictly better but still isn't what we want.
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> mypackage/__init__.py:2:1
|
2 | from .subpkg.submod import val
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
3 |
4 | x = subpkg
|
info: Source
--> mypackage/__init__.py:4:5
|
2 | from .subpkg.submod import val
3 |
4 | x = subpkg
| ^^^^^^
|
");
}
#[test]
fn goto_declaration_submodule_import_from_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg.submod import val
x = subpkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// TODO(submodule-imports): I don't *think* this is what we want...?
// It's a bit confusing because this symbol is essentially the LHS *and* RHS of
// `subpkg = mypackage.subpkg`. As in, it's both defining a local `subpkg` and
// loading the module `mypackage.subpkg`, so it's understandable to get confused!
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> mypackage/subpkg/__init__.py:1:1
|
|
info: Source
--> mypackage/__init__.py:2:7
|
2 | from .subpkg.submod import val
| ^^^^^^
3 |
4 | x = subpkg
|
");
}
#[test]
fn goto_declaration_submodule_import_from_wrong_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>mod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// No result is correct!
assert_snapshot!(test.goto_declaration(), @"No goto target found");
}
#[test]
fn goto_declaration_submodule_import_from_wrong_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.sub<CURSOR>mod import val
x = submod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// Going to the submod module is correct!
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> mypackage/subpkg/submod.py:1:1
|
1 |
| ^
2 | val: int = 0
|
info: Source
--> mypackage/__init__.py:2:14
|
2 | from .subpkg.submod import val
| ^^^^^^
3 |
4 | x = submod
|
");
}
#[test]
fn goto_declaration_submodule_import_from_confusing_shadowed_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg import subpkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// Going to the subpkg module is correct!
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> mypackage/subpkg/__init__.py:1:1
|
1 |
| ^
2 | subpkg: int = 10
|
info: Source
--> mypackage/__init__.py:2:7
|
2 | from .subpkg import subpkg
| ^^^^^^
3 |
4 | x = subpkg
|
");
}
#[test]
fn goto_declaration_submodule_import_from_confusing_real_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import sub<CURSOR>pkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// Going to the subpkg `int` is correct!
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> mypackage/subpkg/__init__.py:2:1
|
2 | subpkg: int = 10
| ^^^^^^
|
info: Source
--> mypackage/__init__.py:2:21
|
2 | from .subpkg import subpkg
| ^^^^^^
3 |
4 | x = subpkg
|
");
}
#[test]
fn goto_declaration_submodule_import_from_confusing_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import subpkg
x = sub<CURSOR>pkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// TODO(submodule-imports): Ok this one is FASCINATING and it's kinda right but confusing!
//
// So there's 3 relevant definitions here:
//
// * `subpkg: int = 10` in the other file is in fact the original definition
//
// * the LHS `subpkg` in the import is an instance of `subpkg = ...`
// because it's a `DefinitionKind::ImportFromSubmodule`.
// This is the span that covers the entire import.
//
// * the RHS `subpkg` in the import is a second instance of `subpkg = ...`
// that *immediately* overwrites the `ImportFromSubmodule`'s definition.
// This span seemingly doesn't appear at all!? Is it getting hidden by the LHS span?
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> mypackage/__init__.py:2:1
|
2 | from .subpkg import subpkg
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
3 |
4 | x = subpkg
|
info: Source
--> mypackage/__init__.py:4:5
|
2 | from .subpkg import subpkg
3 |
4 | x = subpkg
| ^^^^^^
|
info[goto-declaration]: Declaration
--> mypackage/subpkg/__init__.py:2:1
|
2 | subpkg: int = 10
| ^^^^^^
|
info: Source
--> mypackage/__init__.py:4:5
|
2 | from .subpkg import subpkg
3 |
4 | x = subpkg
| ^^^^^^
|
");
}
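The shadowing described in the comment above is observable at runtime: `from .subpkg import subpkg` first binds `subpkg` to the submodule object, then immediately rebinds it to the `int` attribute defined in the submodule. A minimal runnable sketch, reproducing the test's fixture layout in a temporary directory (the on-disk paths are illustrative):

```python
import os
import sys
import tempfile

# Recreate the fixture layout from the test above in a temp directory.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "mypackage", "subpkg"))
with open(os.path.join(root, "mypackage", "__init__.py"), "w") as f:
    f.write("from .subpkg import subpkg\nx = subpkg\n")
with open(os.path.join(root, "mypackage", "subpkg", "__init__.py"), "w") as f:
    f.write("subpkg: int = 10\n")

sys.path.insert(0, root)
import mypackage

# Importing `mypackage.subpkg` briefly sets `mypackage.subpkg` to the
# module object; the `from ... import` then rebinds it to the `int`.
print(mypackage.x)       # 10
print(mypackage.subpkg)  # 10, not <module 'mypackage.subpkg'>
```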
// TODO: Should only return `a: int`
#[test]
fn redeclarations() {
let test = CursorTest::builder()
.source(
"main.py",
r#"
a: str = "test"
a: int = 10
print(a<CURSOR>)
a: bool = True
"#,
)
.build();
assert_snapshot!(test.goto_declaration(), @r#"
info[goto-declaration]: Declaration
--> main.py:2:1
|
2 | a: str = "test"
| ^
3 |
4 | a: int = 10
|
info: Source
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
7 |
8 | a: bool = True
|
info[goto-declaration]: Declaration
--> main.py:4:1
|
2 | a: str = "test"
3 |
4 | a: int = 10
| ^
5 |
6 | print(a)
|
info: Source
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
7 |
8 | a: bool = True
|
info[goto-declaration]: Declaration
--> main.py:8:1
|
6 | print(a)
7 |
8 | a: bool = True
| ^
|
info: Source
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
7 |
8 | a: bool = True
|
"#);
}
impl CursorTest {
fn goto_declaration(&self) -> String {
let Some(targets) = goto_declaration(&self.db, self.cursor.file, self.cursor.offset)


@@ -1714,6 +1714,86 @@ Traceb<CURSOR>ackType
assert_snapshot!(test.goto_definition(), @"No goto target found");
}
// TODO: Should only list `a: int`
#[test]
fn redeclarations() {
let test = CursorTest::builder()
.source(
"main.py",
r#"
a: str = "test"
a: int = 10
print(a<CURSOR>)
a: bool = True
"#,
)
.build();
assert_snapshot!(test.goto_definition(), @r#"
info[goto-definition]: Definition
--> main.py:2:1
|
2 | a: str = "test"
| ^
3 |
4 | a: int = 10
|
info: Source
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
7 |
8 | a: bool = True
|
info[goto-definition]: Definition
--> main.py:4:1
|
2 | a: str = "test"
3 |
4 | a: int = 10
| ^
5 |
6 | print(a)
|
info: Source
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
7 |
8 | a: bool = True
|
info[goto-definition]: Definition
--> main.py:8:1
|
6 | print(a)
7 |
8 | a: bool = True
| ^
|
info: Source
--> main.py:6:7
|
4 | a: int = 10
5 |
6 | print(a)
| ^
7 |
8 | a: bool = True
|
"#);
}
impl CursorTest {
fn goto_definition(&self) -> String {
let Some(targets) = goto_definition(&self.db, self.cursor.file, self.cursor.offset)


@@ -145,14 +145,14 @@ mod tests {
assert_snapshot!(test.goto_type_definition(), @r"
info[goto-type-definition]: Type definition
--> stdlib/typing.pyi:770:1
--> stdlib/typing.pyi:781:1
|
768 | def __class_getitem__(cls, args: TypeVar | tuple[TypeVar, ...]) -> _Final: ...
769 |
770 | Generic: type[_Generic]
779 | def __class_getitem__(cls, args: TypeVar | tuple[TypeVar, ...]) -> _Final: ...
780 |
781 | Generic: type[_Generic]
| ^^^^^^^
771 |
772 | class _ProtocolMeta(ABCMeta):
782 |
783 | class _ProtocolMeta(ABCMeta):
|
info: Source
--> main.py:4:1
@@ -964,6 +964,60 @@ mod tests {
assert_snapshot!(test.goto_type_definition(), @"No goto target found");
}
#[test]
fn goto_type_string_annotation_recursive() {
let test = cursor_test(
r#"
ab: "a<CURSOR>b"
"#,
);
assert_snapshot!(test.goto_type_definition(), @r#"
info[goto-type-definition]: Type definition
--> stdlib/ty_extensions.pyi:20:1
|
19 | # Types
20 | Unknown = object()
| ^^^^^^^
21 | AlwaysTruthy = object()
22 | AlwaysFalsy = object()
|
info: Source
--> main.py:2:6
|
2 | ab: "ab"
| ^^
|
"#);
}
#[test]
fn goto_type_string_annotation_unknown() {
let test = cursor_test(
r#"
x: "foo<CURSOR>bar"
"#,
);
assert_snapshot!(test.goto_type_definition(), @r#"
info[goto-type-definition]: Type definition
--> stdlib/ty_extensions.pyi:20:1
|
19 | # Types
20 | Unknown = object()
| ^^^^^^^
21 | AlwaysTruthy = object()
22 | AlwaysFalsy = object()
|
info: Source
--> main.py:2:5
|
2 | x: "foobar"
| ^^^^^^
|
"#);
}
#[test]
fn goto_type_match_name_stmt() {
let test = cursor_test(
@@ -1057,7 +1111,7 @@ mod tests {
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, button=a<CURSOR>b):
@@ -1077,7 +1131,7 @@ mod tests {
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, button=ab):
@@ -1097,7 +1151,7 @@ mod tests {
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Cl<CURSOR>ick(x, button=ab):
@@ -1135,7 +1189,7 @@ mod tests {
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, but<CURSOR>ton=ab):
@@ -1344,12 +1398,12 @@ f(**kwargs<CURSOR>)
r#"
def outer():
    x = "outer_value"

    def inner():
        nonlocal x
        x = "modified"
        return x<CURSOR> # Should find the nonlocal x declaration in outer scope

    return inner
"#,
);
@@ -1384,12 +1438,12 @@ def outer():
r#"
def outer():
    xy = "outer_value"

    def inner():
        nonlocal x<CURSOR>y
        xy = "modified"
        return x # Should find the nonlocal x declaration in outer scope

    return inner
"#,
);
@@ -1618,6 +1672,283 @@ def function():
"#);
}
#[test]
fn goto_type_submodule_import_from_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>pkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// The module is the correct type definition
assert_snapshot!(test.goto_type_definition(), @r"
info[goto-type-definition]: Type definition
--> mypackage/subpkg/__init__.py:1:1
|
|
info: Source
--> mypackage/__init__.py:4:5
|
2 | from .subpkg.submod import val
3 |
4 | x = subpkg
| ^^^^^^
|
");
}
#[test]
fn goto_type_submodule_import_from_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg.submod import val
x = subpkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// The module is the correct type definition
assert_snapshot!(test.goto_type_definition(), @r"
info[goto-type-definition]: Type definition
--> mypackage/subpkg/__init__.py:1:1
|
|
info: Source
--> mypackage/__init__.py:2:7
|
2 | from .subpkg.submod import val
| ^^^^^^
3 |
4 | x = subpkg
|
");
}
#[test]
fn goto_type_submodule_import_from_wrong_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>mod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// Unknown is correct, `submod` is not in scope
assert_snapshot!(test.goto_type_definition(), @r"
info[goto-type-definition]: Type definition
--> stdlib/ty_extensions.pyi:20:1
|
19 | # Types
20 | Unknown = object()
| ^^^^^^^
21 | AlwaysTruthy = object()
22 | AlwaysFalsy = object()
|
info: Source
--> mypackage/__init__.py:4:5
|
2 | from .subpkg.submod import val
3 |
4 | x = submod
| ^^^^^^
|
");
}
#[test]
fn goto_type_submodule_import_from_wrong_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.sub<CURSOR>mod import val
x = submod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// The module is correct
assert_snapshot!(test.goto_type_definition(), @r"
info[goto-type-definition]: Type definition
--> mypackage/subpkg/submod.py:1:1
|
1 | /
2 | | val: int = 0
| |_____________^
|
info: Source
--> mypackage/__init__.py:2:14
|
2 | from .subpkg.submod import val
| ^^^^^^
3 |
4 | x = submod
|
");
}
#[test]
fn goto_type_submodule_import_from_confusing_shadowed_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg import subpkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// The module is correct
assert_snapshot!(test.goto_type_definition(), @r"
info[goto-type-definition]: Type definition
--> mypackage/subpkg/__init__.py:1:1
|
1 | /
2 | | subpkg: int = 10
| |_________________^
|
info: Source
--> mypackage/__init__.py:2:7
|
2 | from .subpkg import subpkg
| ^^^^^^
3 |
4 | x = subpkg
|
");
}
#[test]
fn goto_type_submodule_import_from_confusing_real_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import sub<CURSOR>pkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// `int` is correct
assert_snapshot!(test.goto_type_definition(), @r#"
info[goto-type-definition]: Type definition
--> stdlib/builtins.pyi:348:7
|
347 | @disjoint_base
348 | class int:
| ^^^
349 | """int([x]) -> integer
350 | int(x, base=10) -> integer
|
info: Source
--> mypackage/__init__.py:2:21
|
2 | from .subpkg import subpkg
| ^^^^^^
3 |
4 | x = subpkg
|
"#);
}
#[test]
fn goto_type_submodule_import_from_confusing_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import subpkg
x = sub<CURSOR>pkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// `int` is correct
assert_snapshot!(test.goto_type_definition(), @r#"
info[goto-type-definition]: Type definition
--> stdlib/builtins.pyi:348:7
|
347 | @disjoint_base
348 | class int:
| ^^^
349 | """int([x]) -> integer
350 | int(x, base=10) -> integer
|
info: Source
--> mypackage/__init__.py:4:5
|
2 | from .subpkg import subpkg
3 |
4 | x = subpkg
| ^^^^^^
|
"#);
}
impl CursorTest {
fn goto_type_definition(&self) -> String {
let Some(targets) =


@@ -1089,6 +1089,60 @@ mod tests {
assert_snapshot!(test.hover(), @"Hover provided no content");
}
#[test]
fn hover_string_annotation_recursive() {
let test = cursor_test(
r#"
ab: "a<CURSOR>b"
"#,
);
assert_snapshot!(test.hover(), @r#"
Unknown
---------------------------------------------
```python
Unknown
```
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:6
|
2 | ab: "ab"
| ^-
| ||
| |Cursor offset
| source
|
"#);
}
#[test]
fn hover_string_annotation_unknown() {
let test = cursor_test(
r#"
x: "foo<CURSOR>bar"
"#,
);
assert_snapshot!(test.hover(), @r#"
Unknown
---------------------------------------------
```python
Unknown
```
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | x: "foobar"
| ^^^-^^
| | |
| | Cursor offset
| source
|
"#);
}
#[test]
fn hover_overload_type_disambiguated1() {
let test = CursorTest::builder()
@@ -1654,12 +1708,12 @@ def ab(a: int, *, c: int):
r#"
def outer():
    x = "outer_value"

    def inner():
        nonlocal x
        x = "modified"
        return x<CURSOR> # Should find the nonlocal x declaration in outer scope

    return inner
"#,
);
@@ -1693,12 +1747,12 @@ def outer():
r#"
def outer():
    xy = "outer_value"

    def inner():
        nonlocal x<CURSOR>y
        xy = "modified"
        return x # Should find the nonlocal x declaration in outer scope

    return inner
"#,
);
@@ -1906,7 +1960,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, button=a<CURSOR>b):
@@ -1926,7 +1980,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, button=ab):
@@ -1964,7 +2018,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Cl<CURSOR>ick(x, button=ab):
@@ -2003,7 +2057,7 @@ def function():
    def __init__(self, pos, btn):
        self.position: int = pos
        self.button: str = btn

def my_func(event: Click):
    match event:
        case Click(x, but<CURSOR>ton=ab):
@@ -2089,15 +2143,13 @@ def function():
"#,
);
// TODO: This should just be `**AB@Alias2 (<variance>)`
// https://github.com/astral-sh/ty/issues/1581
assert_snapshot!(test.hover(), @r"
(
...
) -> tuple[typing.ParamSpec]
(**AB@Alias2) -> tuple[AB@Alias2]
---------------------------------------------
```python
(
...
) -> tuple[typing.ParamSpec]
(**AB@Alias2) -> tuple[AB@Alias2]
```
---------------------------------------------
info[hover]: Hovered content is
@@ -2238,12 +2290,12 @@ def function():
"#,
);
// TODO: This should be `P@Alias (<variance>)`
// TODO: Should this be contravariant instead?
assert_snapshot!(test.hover(), @r"
typing.ParamSpec
P@Alias (bivariant)
---------------------------------------------
```python
typing.ParamSpec
P@Alias (bivariant)
```
---------------------------------------------
info[hover]: Hovered content is
@@ -3267,6 +3319,297 @@ def function():
");
}
#[test]
fn hover_submodule_import_from_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>pkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// The module is correct
assert_snapshot!(test.hover(), @r"
<module 'mypackage.subpkg'>
---------------------------------------------
```python
<module 'mypackage.subpkg'>
```
---------------------------------------------
info[hover]: Hovered content is
--> mypackage/__init__.py:4:5
|
2 | from .subpkg.submod import val
3 |
4 | x = subpkg
| ^^^-^^
| | |
| | Cursor offset
| source
|
");
}
#[test]
fn hover_submodule_import_from_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg.submod import val
x = subpkg
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// The module is correct
assert_snapshot!(test.hover(), @r"
<module 'mypackage.subpkg'>
---------------------------------------------
```python
<module 'mypackage.subpkg'>
```
---------------------------------------------
info[hover]: Hovered content is
--> mypackage/__init__.py:2:7
|
2 | from .subpkg.submod import val
| ^^^-^^
| | |
| | Cursor offset
| source
3 |
4 | x = subpkg
|
");
}
#[test]
fn hover_submodule_import_from_wrong_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.submod import val
x = sub<CURSOR>mod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// Unknown is correct
assert_snapshot!(test.hover(), @r"
Unknown
---------------------------------------------
```python
Unknown
```
---------------------------------------------
info[hover]: Hovered content is
--> mypackage/__init__.py:4:5
|
2 | from .subpkg.submod import val
3 |
4 | x = submod
| ^^^-^^
| | |
| | Cursor offset
| source
|
");
}
#[test]
fn hover_submodule_import_from_wrong_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg.sub<CURSOR>mod import val
x = submod
"#,
)
.source("mypackage/subpkg/__init__.py", r#""#)
.source(
"mypackage/subpkg/submod.py",
r#"
val: int = 0
"#,
)
.build();
// The submodule is correct
assert_snapshot!(test.hover(), @r"
<module 'mypackage.subpkg.submod'>
---------------------------------------------
```python
<module 'mypackage.subpkg.submod'>
```
---------------------------------------------
info[hover]: Hovered content is
--> mypackage/__init__.py:2:14
|
2 | from .subpkg.submod import val
| ^^^-^^
| | |
| | Cursor offset
| source
3 |
4 | x = submod
|
");
}
#[test]
fn hover_submodule_import_from_confusing_shadowed_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .sub<CURSOR>pkg import subpkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// The module is correct
assert_snapshot!(test.hover(), @r"
<module 'mypackage.subpkg'>
---------------------------------------------
```python
<module 'mypackage.subpkg'>
```
---------------------------------------------
info[hover]: Hovered content is
--> mypackage/__init__.py:2:7
|
2 | from .subpkg import subpkg
| ^^^-^^
| | |
| | Cursor offset
| source
3 |
4 | x = subpkg
|
");
}
#[test]
fn hover_submodule_import_from_confusing_real_def() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import sub<CURSOR>pkg
x = subpkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// int is correct
assert_snapshot!(test.hover(), @r"
int
---------------------------------------------
```python
int
```
---------------------------------------------
info[hover]: Hovered content is
--> mypackage/__init__.py:2:21
|
2 | from .subpkg import subpkg
| ^^^-^^
| | |
| | Cursor offset
| source
3 |
4 | x = subpkg
|
");
}
#[test]
fn hover_submodule_import_from_confusing_use() {
let test = CursorTest::builder()
.source(
"mypackage/__init__.py",
r#"
from .subpkg import subpkg
x = sub<CURSOR>pkg
"#,
)
.source(
"mypackage/subpkg/__init__.py",
r#"
subpkg: int = 10
"#,
)
.build();
// int is correct
assert_snapshot!(test.hover(), @r"
int
---------------------------------------------
```python
int
```
---------------------------------------------
info[hover]: Hovered content is
--> mypackage/__init__.py:4:5
|
2 | from .subpkg import subpkg
3 |
4 | x = subpkg
| ^^^-^^
| | |
| | Cursor offset
| source
|
");
}
impl CursorTest {
fn hover(&self) -> String {
use std::fmt::Write;


@@ -145,7 +145,7 @@ impl<'a> Importer<'a> {
members: &MembersInScope,
) -> ImportAction {
let request = request.avoid_conflicts(self.db, self.file, members);
let mut symbol_text: Box<str> = request.member.into();
let mut symbol_text: Box<str> = request.member.unwrap_or(request.module).into();
let Some(response) = self.find(&request, members.at) else {
let insertion = if let Some(future) = self.find_last_future_import(members.at) {
Insertion::end_of_statement(future.stmt, self.source, self.stylist)
@@ -157,14 +157,27 @@ impl<'a> Importer<'a> {
Insertion::start_of_file(self.parsed.suite(), self.source, self.stylist, range)
};
let import = insertion.into_edit(&request.to_string());
if matches!(request.style, ImportStyle::Import) {
symbol_text = format!("{}.{}", request.module, request.member).into();
if let Some(member) = request.member
&& matches!(request.style, ImportStyle::Import)
{
symbol_text = format!("{}.{}", request.module, member).into();
}
return ImportAction {
import: Some(import),
symbol_text,
};
};
// When we just have a request to import a module (and not
// any members from that module), then the only way we can be
// here is if we found a pre-existing import that definitively
// satisfies the request. So we're done.
let Some(member) = request.member else {
return ImportAction {
import: None,
symbol_text,
};
};
match response.kind {
ImportResponseKind::Unqualified { ast, alias } => {
let member = alias.asname.as_ref().unwrap_or(&alias.name).as_str();
@@ -189,13 +202,10 @@ impl<'a> Importer<'a> {
let import = if let Some(insertion) =
Insertion::existing_import(response.import.stmt, self.tokens)
{
insertion.into_edit(request.member)
insertion.into_edit(member)
} else {
Insertion::end_of_statement(response.import.stmt, self.source, self.stylist)
.into_edit(&format!(
"from {} import {}",
request.module, request.member
))
.into_edit(&format!("from {} import {member}", request.module))
};
ImportAction {
import: Some(import),
@@ -481,6 +491,17 @@ impl<'ast> AstImportKind<'ast> {
Some(ImportResponseKind::Qualified { ast, alias })
}
AstImportKind::ImportFrom(ast) => {
// If the request is for a module itself, then we
// assume that it can never be satisfied by a
// `from ... import ...` statement. For example, a
// request for `collections.abc` needs an
// `import collections.abc`. Now, there could be a
// `from collections import abc`, and we could
// plausibly consider that a match and return a
// symbol text of `abc`. But it's not clear if that's
// the right choice or not.
let member = request.member?;
if request.force_style && !matches!(request.style, ImportStyle::ImportFrom) {
return None;
}
@@ -492,9 +513,7 @@ impl<'ast> AstImportKind<'ast> {
let kind = ast
.names
.iter()
.find(|alias| {
alias.name.as_str() == "*" || alias.name.as_str() == request.member
})
.find(|alias| alias.name.as_str() == "*" || alias.name.as_str() == member)
.map(|alias| ImportResponseKind::Unqualified { ast, alias })
.unwrap_or_else(|| ImportResponseKind::Partial(ast));
Some(kind)
@@ -510,7 +529,10 @@ pub(crate) struct ImportRequest<'a> {
/// `foo`, in `from foo import bar`).
module: &'a str,
/// The member to import (e.g., `bar`, in `from foo import bar`).
member: &'a str,
///
/// When `member` is absent, then this request reflects an import
/// of the module itself. i.e., `import module`.
member: Option<&'a str>,
/// The preferred style to use when importing the symbol (e.g.,
/// `import foo` or `from foo import bar`).
///
@@ -532,7 +554,7 @@ impl<'a> ImportRequest<'a> {
pub(crate) fn import(module: &'a str, member: &'a str) -> Self {
Self {
module,
member,
member: Some(member),
style: ImportStyle::Import,
force_style: false,
}
@@ -545,12 +567,26 @@ impl<'a> ImportRequest<'a> {
pub(crate) fn import_from(module: &'a str, member: &'a str) -> Self {
Self {
module,
member,
member: Some(member),
style: ImportStyle::ImportFrom,
force_style: false,
}
}
/// Create a new [`ImportRequest`] for bringing the given module
/// into scope.
///
/// This is for just importing the module itself, always via an
/// `import` statement.
pub(crate) fn module(module: &'a str) -> Self {
Self {
module,
member: None,
style: ImportStyle::Import,
force_style: false,
}
}
/// Causes this request to become a command. This will force the
/// requested import style, even if another style would be more
/// appropriate generally.
@@ -565,7 +601,13 @@ impl<'a> ImportRequest<'a> {
/// of an import conflict are minimized (although not always reduced
/// to zero).
fn avoid_conflicts(self, db: &dyn Db, importing_file: File, members: &MembersInScope) -> Self {
-match (members.map.get(self.module), members.map.get(self.member)) {
+let Some(member) = self.member else {
+    return Self {
+        style: ImportStyle::Import,
+        ..self
+    };
+};
+match (members.map.get(self.module), members.map.get(member)) {
// Neither symbol exists, so we can just proceed as
// normal.
(None, None) => self,
@@ -630,7 +672,10 @@ impl std::fmt::Display for ImportRequest<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match self.style {
ImportStyle::Import => write!(f, "import {}", self.module),
-ImportStyle::ImportFrom => write!(f, "from {} import {}", self.module, self.member),
+ImportStyle::ImportFrom => match self.member {
+    None => write!(f, "import {}", self.module),
+    Some(member) => write!(f, "from {} import {member}", self.module),
+},
}
}
}
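The rendering rules in the `Display` impl above can be sketched as a small Python model (the class and field names here are hypothetical stand-ins for illustration, not the actual Rust API):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImportRequestModel:
    """Toy model of an import request: a module, an optional member,
    and a preferred import style."""
    module: str
    member: Optional[str] = None
    prefer_from: bool = False

    def render(self) -> str:
        # A module-only request always renders as a plain `import`,
        # mirroring the `None` arm of the match above.
        if self.member is None or not self.prefer_from:
            return f"import {self.module}"
        return f"from {self.module} import {self.member}"
```

For example, `ImportRequestModel("collections.abc").render()` yields `import collections.abc`, while `ImportRequestModel("foo", "bar", True).render()` yields `from foo import bar`.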
@@ -843,6 +888,10 @@ mod tests {
self.add(ImportRequest::import_from(module, member))
}
fn module(&self, module: &str) -> String {
self.add(ImportRequest::module(module))
}
fn add(&self, request: ImportRequest<'_>) -> String {
let node = covering_node(
self.cursor.parsed.syntax().into(),
@@ -2156,4 +2205,73 @@ except ImportError:
(bar.MAGIC)
");
}
#[test]
fn import_module_blank() {
let test = cursor_test(
"\
<CURSOR>
",
);
assert_snapshot!(
test.module("collections"), @r"
import collections
collections
");
}
#[test]
fn import_module_exists() {
let test = cursor_test(
"\
import collections
<CURSOR>
",
);
assert_snapshot!(
test.module("collections"), @r"
import collections
collections
");
}
#[test]
fn import_module_from_exists() {
let test = cursor_test(
"\
from collections import defaultdict
<CURSOR>
",
);
assert_snapshot!(
test.module("collections"), @r"
import collections
from collections import defaultdict
collections
");
}
// This test is working as intended. That is,
// `abc` is already in scope, so requesting an
// import for `collections.abc` could feasibly
// reuse the import and rewrite the symbol text
// to just `abc`. But for now it seems better
// to respect what has been written and add the
// `import collections.abc`. This behavior could
// plausibly be changed.
#[test]
fn import_module_from_via_member_exists() {
let test = cursor_test(
"\
from collections import abc
<CURSOR>
",
);
assert_snapshot!(
test.module("collections.abc"), @r"
import collections.abc
from collections import abc
collections.abc
");
}
}


@@ -2012,7 +2012,7 @@ mod tests {
def __init__(self, pos, btn):
self.position: int = pos
self.button: str = btn
def my_func(event: Click):
match event:
case Click(x, button=ab):
@@ -6428,11 +6428,11 @@ mod tests {
a = Literal['a', 'b', 'c']",
);
-assert_snapshot!(test.inlay_hints(), @r"
+assert_snapshot!(test.inlay_hints(), @r#"
 from typing import Literal
-a[: <typing.Literal special form>] = Literal['a', 'b', 'c']
-");
+a[: <special form 'Literal["a", "b", "c"]'>] = Literal['a', 'b', 'c']
+"#);
}
struct InlayHintLocationDiagnostic {


@@ -37,6 +37,38 @@ pub enum ReferencesMode {
DocumentHighlights,
}
impl ReferencesMode {
pub(super) fn to_import_alias_resolution(self) -> ImportAliasResolution {
match self {
// Resolve import aliases for find references:
// ```py
// from warnings import deprecated as my_deprecated
//
// @my_deprecated
// def foo
// ```
//
// When finding references on `my_deprecated`, we want to find all usages of `deprecated` across the entire
// project.
Self::References | Self::ReferencesSkipDeclaration => {
ImportAliasResolution::ResolveAliases
}
// For rename, don't resolve import aliases.
//
// ```py
// from warnings import deprecated as my_deprecated
//
// @my_deprecated
// def foo
// ```
// When renaming `my_deprecated`, only rename the alias, but not the original definition in `warnings`.
Self::Rename | Self::RenameMultiFile | Self::DocumentHighlights => {
ImportAliasResolution::PreserveAliases
}
}
}
}
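The two alias-resolution behaviors described in the comments above can be illustrated with a toy model (the mapping and function names are purely illustrative, not the actual ty API):

```python
# Maps a local alias to the fully qualified original definition.
ALIASES = {"my_deprecated": "warnings.deprecated"}

def resolve_target(symbol: str, mode: str) -> str:
    # Find-references follows the alias back to the original definition,
    # so usages of `deprecated` across the project are found. Rename and
    # document highlights deliberately stop at the alias itself, so only
    # the alias is touched.
    if mode in ("references", "references_skip_declaration"):
        return ALIASES.get(symbol, symbol)
    return symbol
```

Under this model, `resolve_target("my_deprecated", "references")` resolves through the alias, while `resolve_target("my_deprecated", "rename")` leaves it untouched.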
/// Find all references to a symbol at the given position.
/// Search for references across all files in the project.
pub(crate) fn references(
@@ -45,12 +77,9 @@ pub(crate) fn references(
goto_target: &GotoTarget,
mode: ReferencesMode,
) -> Option<Vec<ReferenceTarget>> {
-// Get the definitions for the symbol at the cursor position
-// When finding references, do not resolve any local aliases.
let model = SemanticModel::new(db, file);
let target_definitions = goto_target
-.get_definition_targets(&model, ImportAliasResolution::PreserveAliases)?
+.get_definition_targets(&model, mode.to_import_alias_resolution())?
.declaration_targets(db)?;
// Extract the target text from the goto target for fast comparison
@@ -318,7 +347,7 @@ impl LocalReferencesFinder<'_> {
{
// Get the definitions for this goto target
if let Some(current_definitions) = goto_target
-.get_definition_targets(self.model, ImportAliasResolution::PreserveAliases)
+.get_definition_targets(self.model, self.mode.to_import_alias_resolution())
.and_then(|definitions| definitions.declaration_targets(self.model.db()))
{
// Check if any of the current definitions match our target definitions

File diff suppressed because it is too large


@@ -259,7 +259,11 @@ impl<'db> SemanticTokenVisitor<'db> {
fn classify_name(&self, name: &ast::ExprName) -> (SemanticTokenType, SemanticTokenModifier) {
// First try to classify the token based on its definition kind.
-let definition = definition_for_name(self.model, name);
+let definition = definition_for_name(
+    self.model,
+    name,
+    ty_python_semantic::ImportAliasResolution::ResolveAliases,
+);
if let Some(definition) = definition {
let name_str = name.id.as_str();

File diff suppressed because it is too large


@@ -37,14 +37,16 @@ class MDTestRunner:
mdtest_executable: Path | None
console: Console
filters: list[str]
enable_external: bool
-def __init__(self, filters: list[str] | None = None) -> None:
+def __init__(self, filters: list[str] | None, enable_external: bool) -> None:
self.mdtest_executable = None
self.console = Console()
self.filters = [
f.removesuffix(".md").replace("/", "_").replace("-", "_")
for f in (filters or [])
]
self.enable_external = enable_external
def _run_cargo_test(self, *, message_format: Literal["human", "json"]) -> str:
return subprocess.check_output(
@@ -120,6 +122,7 @@ class MDTestRunner:
CLICOLOR_FORCE="1",
INSTA_FORCE_PASS="1",
INSTA_OUTPUT="none",
MDTEST_EXTERNAL="1" if self.enable_external else "0",
),
capture_output=capture_output,
text=True,
@@ -266,11 +269,19 @@ def main() -> None:
nargs="*",
help="Partial paths or mangled names, e.g., 'loops/for.md' or 'loops_for'",
)
parser.add_argument(
"--enable-external",
"-e",
action="store_true",
help="Enable tests with external dependencies",
)
args = parser.parse_args()
try:
-runner = MDTestRunner(filters=args.filters)
+runner = MDTestRunner(
+    filters=args.filters, enable_external=args.enable_external
+)
runner.watch()
except KeyboardInterrupt:
print()


@@ -0,0 +1,14 @@
from typing import Protocol
class A(Protocol):
@property
def f(self): ...
type Recursive = int | tuple[Recursive, ...]
class B[T: A]: ...
class C[T: A](A):
x: tuple[Recursive, ...]
class D(B[C]): ...


@@ -0,0 +1,4 @@
from __future__ import annotations
class MyClass:
type: type = str


@@ -0,0 +1,6 @@
# This is a regression test for `store_expression_type`.
# ref: https://github.com/astral-sh/ty/issues/1688
x: int
type x[T] = x[T, U]


@@ -0,0 +1,6 @@
class C[T: (A, B)]:
def f(foo: T):
try:
pass
except foo:
pass


@@ -169,13 +169,13 @@ def f(x: Any[int]):
`Any` cannot be called (this leads to a `TypeError` at runtime):
```py
-Any() # error: [call-non-callable] "Object of type `typing.Any` is not callable"
+Any() # error: [call-non-callable] "Object of type `<special form 'typing.Any'>` is not callable"
```
`Any` also cannot be used as a metaclass (under the hood, this leads to an implicit call to `Any`):
```py
-class F(metaclass=Any): ... # error: [invalid-metaclass] "Metaclass type `typing.Any` is not callable"
+class F(metaclass=Any): ... # error: [invalid-metaclass] "Metaclass type `<special form 'typing.Any'>` is not callable"
```
And `Any` cannot be used in `isinstance()` checks:

View File

@@ -307,12 +307,10 @@ Using a `ParamSpec` in a `Callable` annotation:
from typing_extensions import Callable
def _[**P1](c: Callable[P1, int]):
# TODO: Should reveal `ParamSpecArgs` and `ParamSpecKwargs`
-reveal_type(P1.args) # revealed: @Todo(ParamSpecArgs / ParamSpecKwargs)
-reveal_type(P1.kwargs) # revealed: @Todo(ParamSpecArgs / ParamSpecKwargs)
+reveal_type(P1.args) # revealed: P1@_.args
+reveal_type(P1.kwargs) # revealed: P1@_.kwargs
# TODO: Signature should be (**P1) -> int
-reveal_type(c) # revealed: (...) -> int
+reveal_type(c) # revealed: (**P1@_) -> int
```
And, using the legacy syntax:
@@ -322,9 +320,8 @@ from typing_extensions import ParamSpec
P2 = ParamSpec("P2")
# TODO: argument list should not be `...` (requires `ParamSpec` support)
def _(c: Callable[P2, int]):
-reveal_type(c) # revealed: (...) -> int
+reveal_type(c) # revealed: (**P2@_) -> int
```
## Using `typing.Unpack`

View File

@@ -59,7 +59,7 @@ python-version = "3.11"
```py
from typing import Never
-reveal_type(Never) # revealed: typing.Never
+reveal_type(Never) # revealed: <special form 'typing.Never'>
```
### Python 3.10


@@ -18,9 +18,8 @@ def f(*args: Unpack[Ts]) -> tuple[Unpack[Ts]]:
def g() -> TypeGuard[int]: ...
def i(callback: Callable[Concatenate[int, P], R_co], *args: P.args, **kwargs: P.kwargs) -> R_co:
# TODO: Should reveal a type representing `P.args` and `P.kwargs`
-reveal_type(args) # revealed: tuple[@Todo(ParamSpecArgs / ParamSpecKwargs), ...]
-reveal_type(kwargs) # revealed: dict[str, @Todo(ParamSpecArgs / ParamSpecKwargs)]
+reveal_type(args) # revealed: P@i.args
+reveal_type(kwargs) # revealed: P@i.kwargs
return callback(42, *args, **kwargs)
class Foo:
@@ -65,8 +64,9 @@ def _(
reveal_type(c) # revealed: Unknown
reveal_type(d) # revealed: Unknown
# error: [invalid-type-form] "Variable of type `ParamSpec` is not allowed in a type expression"
def foo(a_: e) -> None:
-reveal_type(a_) # revealed: @Todo(Support for `typing.ParamSpec`)
+reveal_type(a_) # revealed: Unknown
```
## Inheritance


@@ -13,7 +13,7 @@ python-version = "3.10"
class A: ...
class B: ...
-reveal_type(A | B) # revealed: types.UnionType
+reveal_type(A | B) # revealed: <types.UnionType special form 'A | B'>
```
## Union of two classes (prior to 3.10)
@@ -43,14 +43,14 @@ class A: ...
class B: ...
def _(sub_a: type[A], sub_b: type[B]):
-reveal_type(A | sub_b) # revealed: types.UnionType
-reveal_type(sub_a | B) # revealed: types.UnionType
-reveal_type(sub_a | sub_b) # revealed: types.UnionType
+reveal_type(A | sub_b) # revealed: <types.UnionType special form>
+reveal_type(sub_a | B) # revealed: <types.UnionType special form>
+reveal_type(sub_a | sub_b) # revealed: <types.UnionType special form>
class C[T]: ...
class D[T]: ...
-reveal_type(C | D) # revealed: types.UnionType
+reveal_type(C | D) # revealed: <types.UnionType special form 'C[Unknown] | D[Unknown]'>
-reveal_type(C[int] | D[str]) # revealed: types.UnionType
+reveal_type(C[int] | D[str]) # revealed: <types.UnionType special form 'C[int] | D[str]'>
```


@@ -227,17 +227,56 @@ def _(literals_2: Literal[0, 1], b: bool, flag: bool):
literals_16 = 4 * literals_4 + literals_4 # Literal[0, 1, .., 15]
literals_64 = 4 * literals_16 + literals_4 # Literal[0, 1, .., 63]
literals_128 = 2 * literals_64 + literals_2 # Literal[0, 1, .., 127]
literals_256 = 2 * literals_128 + literals_2 # Literal[0, 1, .., 255]
# Going beyond the MAX_UNION_LITERALS limit (currently 200):
literals_256 = 16 * literals_16 + literals_16
reveal_type(literals_256) # revealed: int
# Going beyond the MAX_NON_RECURSIVE_UNION_LITERALS limit (currently 256):
reveal_type(literals_256 if flag else 256) # revealed: int
# Going beyond the limit when another type is already part of the union
bool_and_literals_128 = b if flag else literals_128 # bool | Literal[0, 1, ..., 127]
literals_128_shifted = literals_128 + 128 # Literal[128, 129, ..., 255]
literals_256_shifted = literals_256 + 256 # Literal[256, 257, ..., 511]
# Now union the two:
reveal_type(bool_and_literals_128 if flag else literals_128_shifted) # revealed: int
two = bool_and_literals_128 if flag else literals_128_shifted
# revealed: bool | Literal[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255]
reveal_type(two)
reveal_type(two if flag else literals_256_shifted) # revealed: int
```
Recursively defined literal union types are widened earlier than non-recursively defined types for
faster convergence.
```py
class RecursiveAttr:
def __init__(self):
self.i = 0
def update(self):
self.i = self.i + 1
reveal_type(RecursiveAttr().i) # revealed: Unknown | int
# Here are some recursive but saturating examples. Because it's difficult to statically determine whether literal unions saturate or diverge,
# we widen them early, even though they may actually be convergent.
class RecursiveAttr2:
def __init__(self):
self.i = 0
def update(self):
self.i = (self.i + 1) % 9
reveal_type(RecursiveAttr2().i) # revealed: Unknown | Literal[0, 1, 2, 3, 4, 5, 6, 7, 8]
class RecursiveAttr3:
def __init__(self):
self.i = 0
def update(self):
self.i = (self.i + 1) % 10
# Going beyond the MAX_RECURSIVE_UNION_LITERALS limit:
reveal_type(RecursiveAttr3().i) # revealed: Unknown | int
```
## Simplifying gradually-equivalent types


@@ -603,12 +603,14 @@ super(object, object()).__class__
# Not all objects valid in a class's bases list are valid as the first argument to `super()`.
# For example, it's valid to inherit from `typing.ChainMap`, but it's not valid as the first argument to `super()`.
#
-# error: [invalid-super-argument] "`typing.ChainMap` is not a valid class"
+# error: [invalid-super-argument] "`<special form 'typing.ChainMap'>` is not a valid class"
reveal_type(super(typing.ChainMap, collections.ChainMap())) # revealed: Unknown
# Meanwhile, it's not valid to inherit from unsubscripted `typing.Generic`,
# but it *is* valid as the first argument to `super()`.
-reveal_type(super(typing.Generic, typing.SupportsInt)) # revealed: <super: typing.Generic, <class 'SupportsInt'>>
+#
+# revealed: <super: <special form 'typing.Generic'>, <class 'SupportsInt'>>
+reveal_type(super(typing.Generic, typing.SupportsInt))
def _(x: type[typing.Any], y: typing.Any):
reveal_type(super(x, y)) # revealed: <super: Any, Any>


@@ -0,0 +1,26 @@
# Diagnostics for invalid attribute access on special forms
<!-- snapshot-diagnostics -->
```py
from typing_extensions import Any, Final, LiteralString, Self
X = Any
class Foo:
X: Final = LiteralString
a: int
b: Self
class Bar:
def __init__(self):
self.y: Final = LiteralString
X.foo # error: [unresolved-attribute]
X.aaaaooooooo # error: [unresolved-attribute]
Foo.X.startswith # error: [unresolved-attribute]
Foo.Bar().y.startswith # error: [unresolved-attribute]
# TODO: false positive (just testing the diagnostic in the meantime)
Foo().b.a # error: [unresolved-attribute]
```


@@ -7,10 +7,11 @@
```py
from typing_extensions import assert_type
-def _(x: int):
+def _(x: int, y: bool):
assert_type(x, int) # fine
assert_type(x, str) # error: [type-assertion-failure]
assert_type(assert_type(x, int), int)
assert_type(y, int) # error: [type-assertion-failure]
```
## Narrowing


@@ -0,0 +1,4 @@
# mdtests with external dependencies
This directory contains mdtests that make use of external packages. See the mdtest `README.md` for
more information.
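As a sketch, such a test file pairs a `[project]` dependency pin with regular mdtest code blocks; the package and version below are only an example:

```toml
[environment]
python-version = "3.13"

[project]
dependencies = ["attrs==25.4.0"]
```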


@@ -0,0 +1,78 @@
# attrs
```toml
[environment]
python-version = "3.13"
python-platform = "linux"
[project]
dependencies = ["attrs==25.4.0"]
```
## Basic class (`attr`)
```py
import attr
@attr.s
class User:
id: int = attr.ib()
name: str = attr.ib()
user = User(id=1, name="John Doe")
reveal_type(user.id) # revealed: int
reveal_type(user.name) # revealed: str
```
## Basic class (`define`)
```py
from attrs import define, field
@define
class User:
id: int = field()
internal_name: str = field(alias="name")
user = User(id=1, name="John Doe")
reveal_type(user.id) # revealed: int
reveal_type(user.internal_name) # revealed: str
```
## Usage of `field` parameters
```py
from attrs import define, field
@define
class Product:
id: int = field(init=False)
name: str = field()
price_cent: int = field(kw_only=True)
reveal_type(Product.__init__) # revealed: (self: Product, name: str, *, price_cent: int) -> None
```
## Dedicated support for the `default` decorator?
We currently do not support this:
```py
from attrs import define, field
@define
class Person:
id: int = field()
name: str = field()
# error: [call-non-callable] "Object of type `_MISSING_TYPE` is not callable"
@id.default
def _default_id(self) -> int:
raise NotImplementedError
# error: [missing-argument] "No argument provided for required parameter `id`"
person = Person(name="Alice")
reveal_type(person.id) # revealed: int
reveal_type(person.name) # revealed: str
```


@@ -0,0 +1,23 @@
# numpy
```toml
[environment]
python-version = "3.13"
python-platform = "linux"
[project]
dependencies = ["numpy==2.3.0"]
```
## Basic usage
```py
import numpy as np
xs = np.array([1, 2, 3])
reveal_type(xs) # revealed: ndarray[tuple[Any, ...], dtype[Any]]
xs = np.array([1.0, 2.0, 3.0], dtype=np.float64)
# TODO: should be `ndarray[tuple[Any, ...], dtype[float64]]`
reveal_type(xs) # revealed: ndarray[tuple[Any, ...], dtype[Unknown]]
```


@@ -0,0 +1,48 @@
# Pydantic
```toml
[environment]
python-version = "3.12"
python-platform = "linux"
[project]
dependencies = ["pydantic==2.12.2"]
```
## Basic model
```py
from pydantic import BaseModel
class User(BaseModel):
id: int
name: str
reveal_type(User.__init__) # revealed: (self: User, *, id: int, name: str) -> None
user = User(id=1, name="John Doe")
reveal_type(user.id) # revealed: int
reveal_type(user.name) # revealed: str
# error: [missing-argument] "No argument provided for required parameter `name`"
invalid_user = User(id=2)
```
## Usage of `Field`
```py
from pydantic import BaseModel, Field
class Product(BaseModel):
id: int = Field(init=False)
name: str = Field(..., kw_only=False, min_length=1)
internal_price_cent: int = Field(..., gt=0, alias="price_cent")
reveal_type(Product.__init__) # revealed: (self: Product, name: str = Any, *, price_cent: int = Any) -> None
product = Product("Laptop", price_cent=999_00)
reveal_type(product.id) # revealed: int
reveal_type(product.name) # revealed: str
reveal_type(product.internal_price_cent) # revealed: int
```


@@ -0,0 +1,27 @@
# pytest
```toml
[environment]
python-version = "3.13"
python-platform = "linux"
[project]
dependencies = ["pytest==9.0.1"]
```
## `pytest.fail`
Make sure that we recognize `pytest.fail` calls as terminal:
```py
import pytest
def some_runtime_condition() -> bool:
return True
def test_something():
if not some_runtime_condition():
pytest.fail("Runtime condition failed")
no_error_here_this_is_unreachable
```


@@ -0,0 +1,354 @@
# SQLAlchemy
```toml
[environment]
python-version = "3.13"
python-platform = "linux"
[project]
dependencies = ["SQLAlchemy==2.0.44"]
```
## ORM Model
This test makes sure that ty understands SQLAlchemy's `dataclass_transform` setup:
```py
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column
class Base(DeclarativeBase):
pass
class User(Base):
__tablename__ = "user"
id: Mapped[int] = mapped_column(primary_key=True, init=False)
internal_name: Mapped[str] = mapped_column(alias="name")
user = User(name="John Doe")
reveal_type(user.id) # revealed: int
reveal_type(user.internal_name) # revealed: str
```
Unfortunately, SQLAlchemy overrides `__init__` and explicitly accepts all combinations of keyword
arguments. This is why we currently cannot flag invalid constructor calls:
```py
reveal_type(User.__init__) # revealed: def __init__(self, **kw: Any) -> Unknown
# TODO: this should ideally be an error
invalid_user = User(invalid_arg=42)
```
## Basic query example
First, set up a `Session`:
```py
from sqlalchemy import select, Integer, Text, Boolean
from sqlalchemy.orm import Session
from sqlalchemy.orm import DeclarativeBase
from sqlalchemy.orm import Mapped, mapped_column
from sqlalchemy import create_engine
engine = create_engine("sqlite:///example.db")
session = Session(engine)
```
And define a simple model:
```py
class Base(DeclarativeBase):
pass
class User(Base):
__tablename__ = "users"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
name: Mapped[str] = mapped_column(Text)
is_admin: Mapped[bool] = mapped_column(Boolean, default=False)
```
Finally, we can execute queries:
```py
stmt = select(User)
reveal_type(stmt) # revealed: Select[tuple[User]]
users = session.scalars(stmt).all()
reveal_type(users) # revealed: Sequence[User]
for row in session.execute(stmt):
reveal_type(row) # revealed: Row[tuple[User]]
stmt = select(User).where(User.name == "Alice")
alice1 = session.scalars(stmt).first()
reveal_type(alice1) # revealed: User | None
alice2 = session.scalar(stmt)
reveal_type(alice2) # revealed: User | None
result = session.execute(stmt)
row = result.one_or_none()
assert row is not None
(alice3,) = row._tuple()
reveal_type(alice3) # revealed: User
```
This also works with more complex queries:
```py
stmt = select(User).where(User.is_admin == True).order_by(User.name).limit(10)
admin_users = session.scalars(stmt).all()
reveal_type(admin_users) # revealed: Sequence[User]
```
We can also specify particular columns to select:
```py
stmt = select(User.id, User.name)
# TODO: should be `Select[tuple[int, str]]`
reveal_type(stmt) # revealed: Select[tuple[Unknown, Unknown]]
ids_and_names = session.execute(stmt).all()
# TODO: should be `Sequence[Row[tuple[int, str]]]`
reveal_type(ids_and_names) # revealed: Sequence[Row[tuple[Unknown, Unknown]]]
for row in session.execute(stmt):
# TODO: should be `Row[tuple[int, str]]`
reveal_type(row) # revealed: Row[tuple[Unknown, Unknown]]
for user_id, name in session.execute(stmt).tuples():
# TODO: should be `int`
reveal_type(user_id) # revealed: Unknown
# TODO: should be `str`
reveal_type(name) # revealed: Unknown
result = session.execute(stmt)
row = result.one_or_none()
assert row is not None
(user_id, name) = row._tuple()
# TODO: should be `int`
reveal_type(user_id) # revealed: Unknown
# TODO: should be `str`
reveal_type(name) # revealed: Unknown
stmt = select(User.id).where(User.name == "Alice")
# TODO: should be `Select[tuple[int]]`
reveal_type(stmt) # revealed: Select[tuple[Unknown]]
alice_id = session.scalars(stmt).first()
# TODO: should be `int | None`
reveal_type(alice_id) # revealed: Unknown | None
alice_id = session.scalar(stmt)
# TODO: should be `int | None`
reveal_type(alice_id) # revealed: Unknown | None
```
Using the legacy `query` API also works:
```py
users_legacy = session.query(User).all()
reveal_type(users_legacy) # revealed: list[User]
query = session.query(User)
reveal_type(query) # revealed: Query[User]
reveal_type(query.all()) # revealed: list[User]
for row in query:
reveal_type(row) # revealed: User
```
And similarly when specifying particular columns:
```py
query = session.query(User.id, User.name)
# TODO: should be `RowReturningQuery[tuple[int, str]]`
reveal_type(query) # revealed: RowReturningQuery[tuple[Unknown, Unknown]]
# TODO: should be `list[Row[tuple[int, str]]]`
reveal_type(query.all()) # revealed: list[Row[tuple[Unknown, Unknown]]]
for row in query:
# TODO: should be `Row[tuple[int, str]]`
reveal_type(row) # revealed: Row[tuple[Unknown, Unknown]]
```
## Async API
The async API is supported as well:
```py
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, Integer, Text
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column
class Base(DeclarativeBase):
pass
class User(Base):
__tablename__ = "users"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
name: Mapped[str] = mapped_column(Text)
async def test_async(session: AsyncSession):
stmt = select(User).where(User.name == "Alice")
alice = await session.scalar(stmt)
reveal_type(alice) # revealed: User | None
stmt = select(User.id, User.name)
result = await session.execute(stmt)
for user_id, name in result.tuples():
# TODO: should be `int`
reveal_type(user_id) # revealed: Unknown
# TODO: should be `str`
reveal_type(name) # revealed: Unknown
```
## What is it that we do not support yet?
Basic setup:
```py
from datetime import datetime
from sqlalchemy import select, Integer, Text, Boolean, DateTime
from sqlalchemy.orm import Session
from sqlalchemy.orm import DeclarativeBase
from sqlalchemy.orm import Mapped, mapped_column
from sqlalchemy import create_engine
engine = create_engine("sqlite:///example.db")
session = Session(engine)
class Base(DeclarativeBase):
pass
class User(Base):
__tablename__ = "users"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
name: Mapped[str] = mapped_column(Text)
is_admin: Mapped[bool] = mapped_column(Boolean, default=False)
```
Why do we see `Unknown`s for `select(User.id, User.name)` here?
```py
stmt = select(User.id, User.name)
# TODO: should be `Select[tuple[int, str]]`
reveal_type(stmt) # revealed: Select[tuple[Unknown, Unknown]]
```
The types of the arguments seem correct:
```py
reveal_type(User.id) # revealed: InstrumentedAttribute[int]
reveal_type(User.name) # revealed: InstrumentedAttribute[str]
```
The two-parameter overload of `select` has the signature
`def select(__ent0: _TCCA[_T0], __ent1: _TCCA[_T1], /) -> Select[_T0, _T1]: ...`,
where `_TCCA` is an alias for `_TypedColumnClauseArgument`:
```py
from sqlalchemy.sql._typing import _TypedColumnClauseArgument
# revealed: <types.UnionType special form 'TypedColumnsClauseRole[_T@_TypedColumnClauseArgument] | SQLCoreOperations[_T@_TypedColumnClauseArgument] | type[_T@_TypedColumnClauseArgument]'>
reveal_type(_TypedColumnClauseArgument)
```
If we use that generic type alias in a type expression, we can properly specialize it:
```py
def _(
col: _TypedColumnClauseArgument[int],
) -> None:
reveal_type(col) # revealed: TypedColumnsClauseRole[int] | SQLCoreOperations[int] | type[int]
```
Next, verify that we can assign `User.id` to a fully specialized version of
`_TypedColumnClauseArgument`:
```py
user_id_as_tcca: _TypedColumnClauseArgument[int] = User.id
```
If we use the generic version of `_TypedColumnClauseArgument` without specialization, we get
`Unknown`:
```py
def extract_t_from_tcca[T](col: _TypedColumnClauseArgument[T]) -> T:
raise NotImplementedError
reveal_type(extract_t_from_tcca(User.id)) # revealed: Unknown
```
However, if we use just the relevant union element of `_TypedColumnClauseArgument`
(`SQLCoreOperations`), it works as expected:
```py
from sqlalchemy.sql.elements import SQLCoreOperations
def extract_t_from_sco[T](col: SQLCoreOperations[T]) -> T:
raise NotImplementedError
reveal_type(extract_t_from_sco(User.id)) # revealed: int
reveal_type(extract_t_from_sco(User.name)) # revealed: str
```
I reported this as <https://github.com/astral-sh/ty/issues/1772>.
Now assume we were able to solve for `T` here: the statement would then have the type
`Select[tuple[int, str]]`. Can we use that type and proceed with it? It looks like this works:
```py
from sqlalchemy.sql.selectable import Select
def _(stmt: Select[tuple[int, str]]) -> None:
for row in session.execute(stmt):
reveal_type(row) # revealed: Row[tuple[int, str]]
```
What about the `_tuple` calls? This seems to work:
```py
def _(stmt: Select[tuple[int, str]]) -> None:
result = session.execute(stmt)
reveal_type(result) # revealed: Result[tuple[int, str]]
user = result.one_or_none()
reveal_type(user) # revealed: Row[tuple[int, str]] | None
if not user:
return
reveal_type(user) # revealed: Row[tuple[int, str]] & ~AlwaysFalsy
reveal_type(user._tuple()) # revealed: tuple[int, str]
```
What about `.tuples()`? That seems to work as well:
```py
def _(stmt: Select[tuple[int, str]]) -> None:
for user_id, name in session.execute(stmt).tuples():
reveal_type(user_id) # revealed: int
reveal_type(name) # revealed: str
```
What about the `.scalar` calls? Those seem to work too:
```py
def _(stmt: Select[tuple[int]]) -> None:
user_id = session.scalar(stmt)
reveal_type(user_id) # revealed: int | None
reveal_type(session.scalars(stmt).first()) # revealed: int | None
```


@@ -0,0 +1,30 @@
# SQLModel
```toml
[environment]
python-version = "3.13"
python-platform = "linux"
[project]
dependencies = ["sqlmodel==0.0.27"]
```
## Basic model
```py
from sqlmodel import SQLModel
class User(SQLModel):
id: int
name: str
user = User(id=1, name="John Doe")
reveal_type(user.id) # revealed: int
reveal_type(user.name) # revealed: str
# TODO: this should not mention `__pydantic_self__`, and should have proper parameters defined by the fields
reveal_type(User.__init__) # revealed: def __init__(__pydantic_self__, **data: Any) -> None
# TODO: this should be an error
User()
```


@@ -0,0 +1,27 @@
# Strawberry GraphQL
```toml
[environment]
python-version = "3.13"
python-platform = "linux"
[project]
dependencies = ["strawberry-graphql==0.283.3"]
```
## Basic model
```py
import strawberry
@strawberry.type
class User:
id: int
role: str = strawberry.field(default="user")
reveal_type(User.__init__) # revealed: (self: User, *, id: int, role: str = Any) -> None
user = User(id=1)
reveal_type(user.id) # revealed: int
reveal_type(user.role) # revealed: str
```


@@ -80,7 +80,7 @@ class Foo(Protocol):
def f[T](self, v: T) -> T: ...
t = (Protocol, int)
-reveal_type(t[0]) # revealed: typing.Protocol
+reveal_type(t[0]) # revealed: <special form 'typing.Protocol'>
class Lorem(t[0]):
def f(self) -> int: ...


@@ -301,6 +301,7 @@ consistent with each other.
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
@@ -308,6 +309,11 @@ class C(Generic[T]):
def __new__(cls, x: T) -> "C[T]":
return object.__new__(cls)
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -318,12 +324,18 @@ wrong_innards: C[int] = C("five")
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
class C(Generic[T]):
def __init__(self, x: T) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -334,6 +346,7 @@ wrong_innards: C[int] = C("five")
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
@@ -343,6 +356,11 @@ class C(Generic[T]):
def __init__(self, x: T) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -353,6 +371,7 @@ wrong_innards: C[int] = C("five")
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
@@ -362,6 +381,11 @@ class C(Generic[T]):
def __init__(self, x: T) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -373,6 +397,11 @@ class D(Generic[T]):
def __init__(self, *args, **kwargs) -> None: ...
# revealed: ty_extensions.GenericContext[T@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D(1)) # revealed: D[int]
# error: [invalid-assignment] "Object of type `D[int | str]` is not assignable to `D[int]`"
@@ -386,6 +415,7 @@ to specialize the class.
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
U = TypeVar("U")
@@ -398,6 +428,11 @@ class C(Generic[T, U]):
class D(C[V, int]):
def __init__(self, x: V) -> None: ...
# revealed: ty_extensions.GenericContext[V@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[V@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D(1)) # revealed: D[int]
```
@@ -405,6 +440,7 @@ reveal_type(D(1)) # revealed: D[int]
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
U = TypeVar("U")
@@ -415,6 +451,11 @@ class C(Generic[T, U]):
class D(C[T, U]):
pass
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(C(1, "str")) # revealed: C[int, str]
reveal_type(D(1, "str")) # revealed: D[int, str]
```
@@ -425,6 +466,7 @@ This is a specific example of the above, since it was reported specifically by a
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
U = TypeVar("U")
@@ -432,6 +474,11 @@ U = TypeVar("U")
class D(dict[T, U]):
pass
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D(key=1)) # revealed: D[str, int]
```
@@ -443,12 +490,18 @@ context. But from the user's point of view, this is another example of the above
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
U = TypeVar("U")
class C(tuple[T, U]): ...
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C((1, 2))) # revealed: C[int, int]
```
@@ -480,6 +533,7 @@ def func8(t1: tuple[complex, list[int]], t2: tuple[int, *tuple[str, ...]], t3: t
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
S = TypeVar("S")
T = TypeVar("T")
@@ -487,6 +541,11 @@ T = TypeVar("T")
class C(Generic[T]):
def __init__(self, x: T, y: S) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C, S@__init__]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1, 1)) # revealed: C[int]
reveal_type(C(1, "string")) # revealed: C[int]
reveal_type(C(1, True)) # revealed: C[int]
@@ -499,6 +558,7 @@ wrong_innards: C[int] = C("five", 1)
```py
from typing_extensions import overload, Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
U = TypeVar("U")
@@ -514,6 +574,11 @@ class C(Generic[T]):
def __init__(self, x: int) -> None: ...
def __init__(self, x: str | bytes | int) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C("string")) # revealed: C[str]
reveal_type(C(b"bytes")) # revealed: C[bytes]
reveal_type(C(12)) # revealed: C[Unknown]
@@ -541,6 +606,11 @@ class D(Generic[T, U]):
def __init__(self, t: T, u: U) -> None: ...
def __init__(self, *args) -> None: ...
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D("string")) # revealed: D[str, str]
reveal_type(D(1)) # revealed: D[str, int]
reveal_type(D(1, "string")) # revealed: D[int, str]
@@ -551,6 +621,7 @@ reveal_type(D(1, "string")) # revealed: D[int, str]
```py
from dataclasses import dataclass
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
@@ -558,6 +629,11 @@ T = TypeVar("T")
class A(Generic[T]):
x: T
# revealed: ty_extensions.GenericContext[T@A]
reveal_type(generic_context(A))
# revealed: ty_extensions.GenericContext[T@A]
reveal_type(generic_context(into_callable(A)))
reveal_type(A(x=1)) # revealed: A[int]
```
@@ -565,17 +641,28 @@ reveal_type(A(x=1)) # revealed: A[int]
```py
from typing_extensions import Generic, TypeVar
from ty_extensions import generic_context, into_callable
T = TypeVar("T")
U = TypeVar("U", default=T)
class C(Generic[T, U]): ...
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C()) # revealed: C[Unknown, Unknown]
class D(Generic[T, U]):
def __init__(self) -> None: ...
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D()) # revealed: D[Unknown, Unknown]
```


@@ -102,6 +102,38 @@ Other values are invalid.
P4 = ParamSpec("P4", default=int)
```
### `default` parameter in `typing_extensions.ParamSpec`
```toml
[environment]
python-version = "3.12"
```
On Python 3.12 and earlier, the `default` parameter to `ParamSpec` is only available via
`typing_extensions`.
```py
from typing import ParamSpec
from typing_extensions import ParamSpec as ExtParamSpec
# This shouldn't emit a diagnostic
P1 = ExtParamSpec("P1", default=[int, str])
# But, this should
# error: [invalid-paramspec] "The `default` parameter of `typing.ParamSpec` was added in Python 3.13"
P2 = ParamSpec("P2", default=[int, str])
```
And, it allows the same set of values as `typing.ParamSpec`.
```py
P3 = ExtParamSpec("P3", default=...)
P4 = ExtParamSpec("P4", default=P3)
# error: [invalid-paramspec]
P5 = ExtParamSpec("P5", default=int)
```
### Forward references in stub files
Stubs natively support forward references, so patterns that would raise `NameError` at runtime are
@@ -115,3 +147,297 @@ P = ParamSpec("P", default=[A, B])
class A: ...
class B: ...
```
## Validating `ParamSpec` usage
In type annotations, `ParamSpec` is only valid as the first element to `Callable`, the final element
to `Concatenate`, or as a type parameter to `Protocol` or `Generic`.
```py
from typing import ParamSpec, Callable, Concatenate, Protocol, Generic
P = ParamSpec("P")
class ValidProtocol(Protocol[P]):
def method(self, c: Callable[P, int]) -> None: ...
class ValidGeneric(Generic[P]):
def method(self, c: Callable[P, int]) -> None: ...
def valid(
a1: Callable[P, int],
a2: Callable[Concatenate[int, P], int],
) -> None: ...
def invalid(
# TODO: error
a1: P,
# TODO: error
a2: list[P],
# TODO: error
a3: Callable[[P], int],
# TODO: error
a4: Callable[..., P],
# TODO: error
a5: Callable[Concatenate[P, ...], int],
) -> None: ...
```
## Validating `P.args` and `P.kwargs` usage
The components of `ParamSpec`, i.e. `P.args` and `P.kwargs`, are only valid when used as the
annotated types of `*args` and `**kwargs`, respectively.
```py
from typing import Generic, Callable, ParamSpec
P = ParamSpec("P")
def foo1(c: Callable[P, int]) -> None:
def nested1(*args: P.args, **kwargs: P.kwargs) -> None: ...
def nested2(
# error: [invalid-type-form] "`P.kwargs` is valid only in `**kwargs` annotation: Did you mean `P.args`?"
*args: P.kwargs,
# error: [invalid-type-form] "`P.args` is valid only in `*args` annotation: Did you mean `P.kwargs`?"
**kwargs: P.args,
) -> None: ...
# TODO: error
def nested3(*args: P.args) -> None: ...
# TODO: error
def nested4(**kwargs: P.kwargs) -> None: ...
# TODO: error
def nested5(*args: P.args, x: int, **kwargs: P.kwargs) -> None: ...
# TODO: error
def bar1(*args: P.args, **kwargs: P.kwargs) -> None:
pass
class Foo1:
# TODO: error
def method(self, *args: P.args, **kwargs: P.kwargs) -> None: ...
```
And, they need to be used together.
```py
def foo2(c: Callable[P, int]) -> None:
# TODO: error
def nested1(*args: P.args) -> None: ...
# TODO: error
def nested2(**kwargs: P.kwargs) -> None: ...
class Foo2:
# TODO: error
args: P.args
# TODO: error
kwargs: P.kwargs
```
These parameters do not need to be named `args` and `kwargs`; what matters is the type annotated
on the respective variadic parameter.
```py
class Foo3(Generic[P]):
def method1(self, *paramspec_args: P.args, **paramspec_kwargs: P.kwargs) -> None: ...
def method2(
self,
# error: [invalid-type-form] "`P.kwargs` is valid only in `**kwargs` annotation: Did you mean `P.args`?"
*paramspec_args: P.kwargs,
# error: [invalid-type-form] "`P.args` is valid only in `*args` annotation: Did you mean `P.kwargs`?"
**paramspec_kwargs: P.args,
) -> None: ...
```
## Specializing generic classes explicitly
```py
from typing import Any, Generic, ParamSpec, Callable, TypeVar
P1 = ParamSpec("P1")
P2 = ParamSpec("P2")
T1 = TypeVar("T1")
class OnlyParamSpec(Generic[P1]):
attr: Callable[P1, None]
class TwoParamSpec(Generic[P1, P2]):
attr1: Callable[P1, None]
attr2: Callable[P2, None]
class TypeVarAndParamSpec(Generic[T1, P1]):
attr: Callable[P1, T1]
```
Explicit specialization of a generic class involving `ParamSpec` is done by providing either a list
of types, `...`, or another in-scope `ParamSpec`.
```py
reveal_type(OnlyParamSpec[[]]().attr) # revealed: () -> None
reveal_type(OnlyParamSpec[[int, str]]().attr) # revealed: (int, str, /) -> None
reveal_type(OnlyParamSpec[...]().attr) # revealed: (...) -> None
def func(c: Callable[P2, None]):
reveal_type(OnlyParamSpec[P2]().attr) # revealed: (**P2@func) -> None
# TODO: error: paramspec is unbound
reveal_type(OnlyParamSpec[P2]().attr) # revealed: (...) -> None
# error: [invalid-type-arguments] "No type argument provided for required type variable `P1` of class `OnlyParamSpec`"
reveal_type(OnlyParamSpec[()]().attr) # revealed: (...) -> None
```
An explicit tuple expression (unlike an implicit one that omits the parentheses) is also accepted
when the `ParamSpec` is the only type variable. This isn't recommended; it is mainly a fallout of
the parenthesized form having the same AST as the one without parentheses. Both mypy and Pyright
also allow this.
```py
reveal_type(OnlyParamSpec[(int, str)]().attr) # revealed: (int, str, /) -> None
```
<!-- blacken-docs:off -->
```py
# error: [invalid-syntax]
reveal_type(OnlyParamSpec[]().attr) # revealed: (...) -> None
```
<!-- blacken-docs:on -->
The square brackets can be omitted when `ParamSpec` is the only type variable:
```py
reveal_type(OnlyParamSpec[int, str]().attr) # revealed: (int, str, /) -> None
reveal_type(OnlyParamSpec[int,]().attr) # revealed: (int, /) -> None
# Even when there is only one element
reveal_type(OnlyParamSpec[Any]().attr) # revealed: (Any, /) -> None
reveal_type(OnlyParamSpec[object]().attr) # revealed: (object, /) -> None
reveal_type(OnlyParamSpec[int]().attr) # revealed: (int, /) -> None
```
But, they cannot be omitted when there are multiple type variables.
```py
reveal_type(TypeVarAndParamSpec[int, []]().attr) # revealed: () -> int
reveal_type(TypeVarAndParamSpec[int, [int, str]]().attr) # revealed: (int, str, /) -> int
reveal_type(TypeVarAndParamSpec[int, [str]]().attr) # revealed: (str, /) -> int
reveal_type(TypeVarAndParamSpec[int, ...]().attr) # revealed: (...) -> int
# TODO: We could still specialize for `T1` as the type is valid which would reveal `(...) -> int`
# TODO: error: paramspec is unbound
reveal_type(TypeVarAndParamSpec[int, P2]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, int]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, ()]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, (int, str)]().attr) # revealed: (...) -> Unknown
```
Nor can they be omitted when there is more than one `ParamSpec`.
```py
p = TwoParamSpec[[int, str], [int]]()
reveal_type(p.attr1) # revealed: (int, str, /) -> None
reveal_type(p.attr2) # revealed: (int, /) -> None
# error: [invalid-type-arguments]
# error: [invalid-type-arguments]
TwoParamSpec[int, str]
```
Specializing a `ParamSpec` type variable with `typing.Any` isn't explicitly allowed by the spec,
but both mypy and Pyright allow it, and there are usages of it in the wild, e.g.
`staticmethod[Any, Any]`.
```py
reveal_type(TypeVarAndParamSpec[int, Any]().attr) # revealed: (...) -> int
```
## Specialization when defaults are involved
```toml
[environment]
python-version = "3.13"
```
```py
from typing import Any, Generic, ParamSpec, Callable, TypeVar
P = ParamSpec("P")
PList = ParamSpec("PList", default=[int, str])
PEllipsis = ParamSpec("PEllipsis", default=...)
PAnother = ParamSpec("PAnother", default=P)
PAnotherWithDefault = ParamSpec("PAnotherWithDefault", default=PList)
```
```py
class ParamSpecWithDefault1(Generic[PList]):
attr: Callable[PList, None]
reveal_type(ParamSpecWithDefault1().attr) # revealed: (int, str, /) -> None
reveal_type(ParamSpecWithDefault1[[int]]().attr) # revealed: (int, /) -> None
```
```py
class ParamSpecWithDefault2(Generic[PEllipsis]):
attr: Callable[PEllipsis, None]
reveal_type(ParamSpecWithDefault2().attr) # revealed: (...) -> None
reveal_type(ParamSpecWithDefault2[[int, str]]().attr) # revealed: (int, str, /) -> None
```
```py
class ParamSpecWithDefault3(Generic[P, PAnother]):
attr1: Callable[P, None]
attr2: Callable[PAnother, None]
# `P` hasn't been specialized, so it defaults to the gradual form `Unknown`
p1 = ParamSpecWithDefault3()
reveal_type(p1.attr1) # revealed: (...) -> None
reveal_type(p1.attr2) # revealed: (...) -> None
p2 = ParamSpecWithDefault3[[int, str]]()
reveal_type(p2.attr1) # revealed: (int, str, /) -> None
reveal_type(p2.attr2) # revealed: (int, str, /) -> None
p3 = ParamSpecWithDefault3[[int], [str]]()
reveal_type(p3.attr1) # revealed: (int, /) -> None
reveal_type(p3.attr2) # revealed: (str, /) -> None
class ParamSpecWithDefault4(Generic[PList, PAnotherWithDefault]):
attr1: Callable[PList, None]
attr2: Callable[PAnotherWithDefault, None]
p1 = ParamSpecWithDefault4()
reveal_type(p1.attr1) # revealed: (int, str, /) -> None
reveal_type(p1.attr2) # revealed: (int, str, /) -> None
p2 = ParamSpecWithDefault4[[int]]()
reveal_type(p2.attr1) # revealed: (int, /) -> None
reveal_type(p2.attr2) # revealed: (int, /) -> None
p3 = ParamSpecWithDefault4[[int], [str]]()
reveal_type(p3.attr1) # revealed: (int, /) -> None
reveal_type(p3.attr2) # revealed: (str, /) -> None
# TODO: error
# Out-of-order type variables: the default of `PAnother` is `P`, which appears after it
class ParamSpecWithDefault5(Generic[PAnother, P]):
attr: Callable[PAnother, None]
# TODO: error
# The default of `PAnother` is `P` (another `ParamSpec`), which is not in scope
class ParamSpecWithDefault6(Generic[PAnother]):
attr: Callable[PAnother, None]
```
## Semantics
To avoid duplication, the semantics of `ParamSpec` are described in
[the PEP 695 `ParamSpec` document](./../pep695/paramspec.md), unless there is behavior specific to
the legacy `ParamSpec` implementation.


@@ -25,11 +25,11 @@ reveal_type(generic_context(SingleTypevar))
# revealed: ty_extensions.GenericContext[T@MultipleTypevars, S@MultipleTypevars]
reveal_type(generic_context(MultipleTypevars))
# TODO: support `ParamSpec`/`TypeVarTuple` properly
# (these should include the `ParamSpec`s and `TypeVarTuple`s in their generic contexts)
# revealed: ty_extensions.GenericContext[]
# TODO: support `TypeVarTuple` properly
# (these should include the `TypeVarTuple`s in their generic contexts)
# revealed: ty_extensions.GenericContext[P@SingleParamSpec]
reveal_type(generic_context(SingleParamSpec))
# revealed: ty_extensions.GenericContext[T@TypeVarAndParamSpec]
# revealed: ty_extensions.GenericContext[T@TypeVarAndParamSpec, P@TypeVarAndParamSpec]
reveal_type(generic_context(TypeVarAndParamSpec))
# revealed: ty_extensions.GenericContext[]
reveal_type(generic_context(SingleTypeVarTuple))
@@ -62,7 +62,7 @@ The specialization must match the generic types:
```py
# error: [invalid-type-arguments] "Too many type arguments: expected 1, got 2"
reveal_type(C[int, int]) # revealed: C[Unknown]
reveal_type(C[int, int]) # revealed: <type alias 'C[Unknown]'>
```
And non-generic types cannot be specialized:
@@ -85,19 +85,19 @@ type BoundedByUnion[T: int | str] = ...
class IntSubclass(int): ...
reveal_type(Bounded[int]) # revealed: Bounded[int]
reveal_type(Bounded[IntSubclass]) # revealed: Bounded[IntSubclass]
reveal_type(Bounded[int]) # revealed: <type alias 'Bounded[int]'>
reveal_type(Bounded[IntSubclass]) # revealed: <type alias 'Bounded[IntSubclass]'>
# error: [invalid-type-arguments] "Type `str` is not assignable to upper bound `int` of type variable `T@Bounded`"
reveal_type(Bounded[str]) # revealed: Bounded[Unknown]
reveal_type(Bounded[str]) # revealed: <type alias 'Bounded[Unknown]'>
# error: [invalid-type-arguments] "Type `int | str` is not assignable to upper bound `int` of type variable `T@Bounded`"
reveal_type(Bounded[int | str]) # revealed: Bounded[Unknown]
reveal_type(Bounded[int | str]) # revealed: <type alias 'Bounded[Unknown]'>
reveal_type(BoundedByUnion[int]) # revealed: BoundedByUnion[int]
reveal_type(BoundedByUnion[IntSubclass]) # revealed: BoundedByUnion[IntSubclass]
reveal_type(BoundedByUnion[str]) # revealed: BoundedByUnion[str]
reveal_type(BoundedByUnion[int | str]) # revealed: BoundedByUnion[int | str]
reveal_type(BoundedByUnion[int]) # revealed: <type alias 'BoundedByUnion[int]'>
reveal_type(BoundedByUnion[IntSubclass]) # revealed: <type alias 'BoundedByUnion[IntSubclass]'>
reveal_type(BoundedByUnion[str]) # revealed: <type alias 'BoundedByUnion[str]'>
reveal_type(BoundedByUnion[int | str]) # revealed: <type alias 'BoundedByUnion[int | str]'>
```
If the type variable is constrained, the specialized type must satisfy those constraints:
@@ -105,20 +105,20 @@ If the type variable is constrained, the specialized type must satisfy those con
```py
type Constrained[T: (int, str)] = ...
reveal_type(Constrained[int]) # revealed: Constrained[int]
reveal_type(Constrained[int]) # revealed: <type alias 'Constrained[int]'>
# TODO: error: [invalid-argument-type]
# TODO: revealed: Constrained[Unknown]
reveal_type(Constrained[IntSubclass]) # revealed: Constrained[IntSubclass]
reveal_type(Constrained[IntSubclass]) # revealed: <type alias 'Constrained[IntSubclass]'>
reveal_type(Constrained[str]) # revealed: Constrained[str]
reveal_type(Constrained[str]) # revealed: <type alias 'Constrained[str]'>
# TODO: error: [invalid-argument-type]
# TODO: revealed: Unknown
reveal_type(Constrained[int | str]) # revealed: Constrained[int | str]
reveal_type(Constrained[int | str]) # revealed: <type alias 'Constrained[int | str]'>
# error: [invalid-type-arguments] "Type `object` does not satisfy constraints `int`, `str` of type variable `T@Constrained`"
reveal_type(Constrained[object]) # revealed: Constrained[Unknown]
reveal_type(Constrained[object]) # revealed: <type alias 'Constrained[Unknown]'>
```
If the type variable has a default, it can be omitted:
@@ -126,8 +126,8 @@ If the type variable has a default, it can be omitted:
```py
type WithDefault[T, U = int] = ...
reveal_type(WithDefault[str, str]) # revealed: WithDefault[str, str]
reveal_type(WithDefault[str]) # revealed: WithDefault[str, int]
reveal_type(WithDefault[str, str]) # revealed: <type alias 'WithDefault[str, str]'>
reveal_type(WithDefault[str]) # revealed: <type alias 'WithDefault[str, int]'>
```
If the type alias is not specialized explicitly, it is implicitly specialized to `Unknown`:


@@ -25,11 +25,11 @@ reveal_type(generic_context(SingleTypevar))
# revealed: ty_extensions.GenericContext[T@MultipleTypevars, S@MultipleTypevars]
reveal_type(generic_context(MultipleTypevars))
# TODO: support `ParamSpec`/`TypeVarTuple` properly
# (these should include the `ParamSpec`s and `TypeVarTuple`s in their generic contexts)
# revealed: ty_extensions.GenericContext[]
# TODO: support `TypeVarTuple` properly
# (these should include the `TypeVarTuple`s in their generic contexts)
# revealed: ty_extensions.GenericContext[P@SingleParamSpec]
reveal_type(generic_context(SingleParamSpec))
# revealed: ty_extensions.GenericContext[T@TypeVarAndParamSpec]
# revealed: ty_extensions.GenericContext[T@TypeVarAndParamSpec, P@TypeVarAndParamSpec]
reveal_type(generic_context(TypeVarAndParamSpec))
# revealed: ty_extensions.GenericContext[]
reveal_type(generic_context(SingleTypeVarTuple))
@@ -264,12 +264,19 @@ signatures don't count towards variance).
### `__new__` only
```py
from ty_extensions import generic_context, into_callable
class C[T]:
x: T
def __new__(cls, x: T) -> "C[T]":
return object.__new__(cls)
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -279,11 +286,18 @@ wrong_innards: C[int] = C("five")
### `__init__` only
```py
from ty_extensions import generic_context, into_callable
class C[T]:
x: T
def __init__(self, x: T) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -293,6 +307,8 @@ wrong_innards: C[int] = C("five")
### Identical `__new__` and `__init__` signatures
```py
from ty_extensions import generic_context, into_callable
class C[T]:
x: T
@@ -301,6 +317,11 @@ class C[T]:
def __init__(self, x: T) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -310,6 +331,8 @@ wrong_innards: C[int] = C("five")
### Compatible `__new__` and `__init__` signatures
```py
from ty_extensions import generic_context, into_callable
class C[T]:
x: T
@@ -318,6 +341,11 @@ class C[T]:
def __init__(self, x: T) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1)) # revealed: C[int]
# error: [invalid-assignment] "Object of type `C[int | str]` is not assignable to `C[int]`"
@@ -331,6 +359,11 @@ class D[T]:
def __init__(self, *args, **kwargs) -> None: ...
# revealed: ty_extensions.GenericContext[T@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D(1)) # revealed: D[int]
# error: [invalid-assignment] "Object of type `D[int | str]` is not assignable to `D[int]`"
@@ -343,6 +376,8 @@ If either method comes from a generic base class, we don't currently use its inf
to specialize the class.
```py
from ty_extensions import generic_context, into_callable
class C[T, U]:
def __new__(cls, *args, **kwargs) -> "C[T, U]":
return object.__new__(cls)
@@ -350,18 +385,30 @@ class C[T, U]:
class D[V](C[V, int]):
def __init__(self, x: V) -> None: ...
# revealed: ty_extensions.GenericContext[V@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[V@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D(1)) # revealed: D[Literal[1]]
```
### Generic class inherits `__init__` from generic base class
```py
from ty_extensions import generic_context, into_callable
class C[T, U]:
def __init__(self, t: T, u: U) -> None: ...
class D[T, U](C[T, U]):
pass
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(C(1, "str")) # revealed: C[Literal[1], Literal["str"]]
reveal_type(D(1, "str")) # revealed: D[Literal[1], Literal["str"]]
```
@@ -371,9 +418,16 @@ reveal_type(D(1, "str")) # revealed: D[Literal[1], Literal["str"]]
This is a specific example of the above, since it was reported specifically by a user.
```py
from ty_extensions import generic_context, into_callable
class D[T, U](dict[T, U]):
pass
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D(key=1)) # revealed: D[str, int]
```
@@ -384,8 +438,15 @@ for `tuple`, so we use a different mechanism to make sure it has the right inher
context. But from the user's point of view, this is another example of the above.)
```py
from ty_extensions import generic_context, into_callable
class C[T, U](tuple[T, U]): ...
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C((1, 2))) # revealed: C[Literal[1], Literal[2]]
```
@@ -409,11 +470,18 @@ def func8(t1: tuple[complex, list[int]], t2: tuple[int, *tuple[str, ...]], t3: t
### `__init__` is itself generic
```py
from ty_extensions import generic_context, into_callable
class C[T]:
x: T
def __init__[S](self, x: T, y: S) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C, S@__init__]
reveal_type(generic_context(into_callable(C)))
reveal_type(C(1, 1)) # revealed: C[int]
reveal_type(C(1, "string")) # revealed: C[int]
reveal_type(C(1, True)) # revealed: C[int]
@@ -427,6 +495,7 @@ wrong_innards: C[int] = C("five", 1)
```py
from __future__ import annotations
from typing import overload
from ty_extensions import generic_context, into_callable
class C[T]:
# we need to use the type variable or else the class is bivariant in T, and
@@ -443,6 +512,11 @@ class C[T]:
def __init__(self, x: int) -> None: ...
def __init__(self, x: str | bytes | int) -> None: ...
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C("string")) # revealed: C[str]
reveal_type(C(b"bytes")) # revealed: C[bytes]
reveal_type(C(12)) # revealed: C[Unknown]
@@ -470,6 +544,11 @@ class D[T, U]:
def __init__(self, t: T, u: U) -> None: ...
def __init__(self, *args) -> None: ...
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D("string")) # revealed: D[str, Literal["string"]]
reveal_type(D(1)) # revealed: D[str, Literal[1]]
reveal_type(D(1, "string")) # revealed: D[Literal[1], Literal["string"]]
@@ -479,24 +558,42 @@ reveal_type(D(1, "string")) # revealed: D[Literal[1], Literal["string"]]
```py
from dataclasses import dataclass
from ty_extensions import generic_context, into_callable
@dataclass
class A[T]:
x: T
# revealed: ty_extensions.GenericContext[T@A]
reveal_type(generic_context(A))
# revealed: ty_extensions.GenericContext[T@A]
reveal_type(generic_context(into_callable(A)))
reveal_type(A(x=1)) # revealed: A[int]
```
### Class typevar has another typevar as a default
```py
from ty_extensions import generic_context, into_callable
class C[T, U = T]: ...
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(C))
# revealed: ty_extensions.GenericContext[T@C, U@C]
reveal_type(generic_context(into_callable(C)))
reveal_type(C()) # revealed: C[Unknown, Unknown]
class D[T, U = T]:
def __init__(self) -> None: ...
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(D))
# revealed: ty_extensions.GenericContext[T@D, U@D]
reveal_type(generic_context(into_callable(D)))
reveal_type(D()) # revealed: D[Unknown, Unknown]
```


@@ -62,3 +62,614 @@ Other values are invalid.
def foo[**P = int]() -> None:
pass
```
## Validating `ParamSpec` usage
`ParamSpec` is only valid as the first element to `Callable` or the final element to `Concatenate`.
```py
from typing import ParamSpec, Callable, Concatenate
def valid[**P](
a1: Callable[P, int],
a2: Callable[Concatenate[int, P], int],
) -> None: ...
def invalid[**P](
# TODO: error
a1: P,
# TODO: error
a2: list[P],
# TODO: error
a3: Callable[[P], int],
# TODO: error
a4: Callable[..., P],
# TODO: error
a5: Callable[Concatenate[P, ...], int],
) -> None: ...
```
## Validating `P.args` and `P.kwargs` usage
The components of `ParamSpec`, i.e. `P.args` and `P.kwargs`, are only valid when used as the
annotated types of `*args` and `**kwargs`, respectively.
```py
from typing import Callable
def foo[**P](c: Callable[P, int]) -> None:
def nested1(*args: P.args, **kwargs: P.kwargs) -> None: ...
# error: [invalid-type-form] "`P.kwargs` is valid only in `**kwargs` annotation: Did you mean `P.args`?"
# error: [invalid-type-form] "`P.args` is valid only in `*args` annotation: Did you mean `P.kwargs`?"
def nested2(*args: P.kwargs, **kwargs: P.args) -> None: ...
# TODO: error
def nested3(*args: P.args) -> None: ...
# TODO: error
def nested4(**kwargs: P.kwargs) -> None: ...
# TODO: error
def nested5(*args: P.args, x: int, **kwargs: P.kwargs) -> None: ...
```
And, they need to be used together.
```py
def foo[**P](c: Callable[P, int]) -> None:
# TODO: error
def nested1(*args: P.args) -> None: ...
# TODO: error
def nested2(**kwargs: P.kwargs) -> None: ...
class Foo[**P]:
# TODO: error
args: P.args
# TODO: error
kwargs: P.kwargs
```
These parameters do not need to be named `args` and `kwargs`; what matters is the type annotated
on the respective variadic parameter.
```py
class Foo3[**P]:
def method1(self, *paramspec_args: P.args, **paramspec_kwargs: P.kwargs) -> None: ...
def method2(
self,
# error: [invalid-type-form] "`P.kwargs` is valid only in `**kwargs` annotation: Did you mean `P.args`?"
*paramspec_args: P.kwargs,
# error: [invalid-type-form] "`P.args` is valid only in `*args` annotation: Did you mean `P.kwargs`?"
**paramspec_kwargs: P.args,
) -> None: ...
```
Annotating an instance attribute with them isn't allowed either:
```py
class Foo4[**P]:
def __init__(self, fn: Callable[P, int], *args: P.args, **kwargs: P.kwargs) -> None:
self.fn = fn
# TODO: error
self.args: P.args = args
# TODO: error
self.kwargs: P.kwargs = kwargs
```
## Semantics of `P.args` and `P.kwargs`
Inside the function, `args` and `kwargs` have the types `P.args` and `P.kwargs` respectively,
rather than `tuple[P.args, ...]` and `dict[str, P.kwargs]`.
### Passing `*args` and `**kwargs` to a callable
```py
from typing import Callable
def f[**P](func: Callable[P, int]) -> Callable[P, None]:
def wrapper(*args: P.args, **kwargs: P.kwargs) -> None:
reveal_type(args) # revealed: P@f.args
reveal_type(kwargs) # revealed: P@f.kwargs
reveal_type(func(*args, **kwargs)) # revealed: int
# error: [invalid-argument-type] "Argument is incorrect: Expected `P@f.args`, found `P@f.kwargs`"
# error: [invalid-argument-type] "Argument is incorrect: Expected `P@f.kwargs`, found `P@f.args`"
reveal_type(func(*kwargs, **args)) # revealed: int
# error: [invalid-argument-type] "Argument is incorrect: Expected `P@f.args`, found `P@f.kwargs`"
reveal_type(func(args, kwargs)) # revealed: int
# Both parameters are required
# TODO: error
reveal_type(func()) # revealed: int
reveal_type(func(*args)) # revealed: int
reveal_type(func(**kwargs)) # revealed: int
return wrapper
```
### Operations on `P.args` and `P.kwargs`
`P.args` and `P.kwargs` behave like a `tuple` and a `dict`, respectively. Internally, they are
represented as type variables with upper bounds of `tuple[object, ...]` and `Top[dict[str, Any]]`,
respectively.
```py
from typing import Callable, Any
def f[**P](func: Callable[P, int], *args: P.args, **kwargs: P.kwargs) -> None:
reveal_type(args + ("extra",)) # revealed: tuple[object, ...]
reveal_type(args + (1, 2, 3)) # revealed: tuple[object, ...]
reveal_type(args[0]) # revealed: object
reveal_type("key" in kwargs) # revealed: bool
reveal_type(kwargs.get("key")) # revealed: object
reveal_type(kwargs["key"]) # revealed: object
```
## Specializing generic classes explicitly
```py
from typing import Any, Callable, ParamSpec
class OnlyParamSpec[**P1]:
attr: Callable[P1, None]
class TwoParamSpec[**P1, **P2]:
attr1: Callable[P1, None]
attr2: Callable[P2, None]
class TypeVarAndParamSpec[T1, **P1]:
attr: Callable[P1, T1]
```
Explicit specialization of a generic class involving `ParamSpec` is done by providing either a list
of types, `...`, or another in-scope `ParamSpec`.
```py
reveal_type(OnlyParamSpec[[]]().attr) # revealed: () -> None
reveal_type(OnlyParamSpec[[int, str]]().attr) # revealed: (int, str, /) -> None
reveal_type(OnlyParamSpec[...]().attr) # revealed: (...) -> None
def func[**P2](c: Callable[P2, None]):
reveal_type(OnlyParamSpec[P2]().attr) # revealed: (**P2@func) -> None
P2 = ParamSpec("P2")
# TODO: error: paramspec is unbound
reveal_type(OnlyParamSpec[P2]().attr) # revealed: (...) -> None
# error: [invalid-type-arguments] "No type argument provided for required type variable `P1` of class `OnlyParamSpec`"
reveal_type(OnlyParamSpec[()]().attr) # revealed: (...) -> None
```
An explicit tuple expression (unlike an implicit one that omits the parentheses) is also accepted
when the `ParamSpec` is the only type variable. This isn't recommended, though; it's mainly a
fallout of the parenthesized form having the same AST as the one without parentheses. Both mypy and
Pyright also allow this.
```py
reveal_type(OnlyParamSpec[(int, str)]().attr) # revealed: (int, str, /) -> None
```
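The AST equivalence mentioned above can be observed directly; `ast.dump` (which omits position
attributes by default) produces identical output for both subscript forms:

```py
import ast

# Both subscript forms parse to a `Subscript` whose slice is the same
# `Tuple` node; the parentheses are not recorded in the AST.
with_parens = ast.dump(ast.parse("X[(int, str)]", mode="eval"))
without_parens = ast.dump(ast.parse("X[int, str]", mode="eval"))
assert with_parens == without_parens
```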
<!-- blacken-docs:off -->
```py
# error: [invalid-syntax]
reveal_type(OnlyParamSpec[]().attr) # revealed: (...) -> None
```
<!-- blacken-docs:on -->
The square brackets can be omitted when `ParamSpec` is the only type variable:
```py
reveal_type(OnlyParamSpec[int, str]().attr) # revealed: (int, str, /) -> None
reveal_type(OnlyParamSpec[int,]().attr) # revealed: (int, /) -> None
# Even when there is only one element
reveal_type(OnlyParamSpec[Any]().attr) # revealed: (Any, /) -> None
reveal_type(OnlyParamSpec[object]().attr) # revealed: (object, /) -> None
reveal_type(OnlyParamSpec[int]().attr) # revealed: (int, /) -> None
```
But, they cannot be omitted when there are multiple type variables.
```py
reveal_type(TypeVarAndParamSpec[int, []]().attr) # revealed: () -> int
reveal_type(TypeVarAndParamSpec[int, [int, str]]().attr) # revealed: (int, str, /) -> int
reveal_type(TypeVarAndParamSpec[int, [str]]().attr) # revealed: (str, /) -> int
reveal_type(TypeVarAndParamSpec[int, ...]().attr) # revealed: (...) -> int
# TODO: error: paramspec is unbound
reveal_type(TypeVarAndParamSpec[int, P2]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, int]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, ()]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, (int, str)]().attr) # revealed: (...) -> Unknown
```
Nor can they be omitted when there is more than one `ParamSpec`.
```py
p = TwoParamSpec[[int, str], [int]]()
reveal_type(p.attr1) # revealed: (int, str, /) -> None
reveal_type(p.attr2) # revealed: (int, /) -> None
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be either a list of types, `ParamSpec`, `Concatenate`, or `...`"
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be either a list of types, `ParamSpec`, `Concatenate`, or `...`"
TwoParamSpec[int, str]
```
Specializing a `ParamSpec` type variable with `typing.Any` isn't explicitly allowed by the spec,
but both mypy and Pyright accept it, and there are usages in the wild, e.g.,
`staticmethod[Any, Any]`.
```py
reveal_type(TypeVarAndParamSpec[int, Any]().attr) # revealed: (...) -> int
```
## Specialization when defaults are involved
```py
from typing import Callable, ParamSpec
class ParamSpecWithDefault1[**P1 = [int, str]]:
attr: Callable[P1, None]
reveal_type(ParamSpecWithDefault1().attr) # revealed: (int, str, /) -> None
reveal_type(ParamSpecWithDefault1[int]().attr) # revealed: (int, /) -> None
```
```py
class ParamSpecWithDefault2[**P1 = ...]:
attr: Callable[P1, None]
reveal_type(ParamSpecWithDefault2().attr) # revealed: (...) -> None
reveal_type(ParamSpecWithDefault2[int, str]().attr) # revealed: (int, str, /) -> None
```
```py
class ParamSpecWithDefault3[**P1, **P2 = P1]:
attr1: Callable[P1, None]
attr2: Callable[P2, None]
# `P1` hasn't been specialized, so it defaults to the gradual form `...`
p1 = ParamSpecWithDefault3()
reveal_type(p1.attr1) # revealed: (...) -> None
reveal_type(p1.attr2) # revealed: (...) -> None
p2 = ParamSpecWithDefault3[[int, str]]()
reveal_type(p2.attr1) # revealed: (int, str, /) -> None
reveal_type(p2.attr2) # revealed: (int, str, /) -> None
p3 = ParamSpecWithDefault3[[int], [str]]()
reveal_type(p3.attr1) # revealed: (int, /) -> None
reveal_type(p3.attr2) # revealed: (str, /) -> None
class ParamSpecWithDefault4[**P1 = [int, str], **P2 = P1]:
attr1: Callable[P1, None]
attr2: Callable[P2, None]
p1 = ParamSpecWithDefault4()
reveal_type(p1.attr1) # revealed: (int, str, /) -> None
reveal_type(p1.attr2) # revealed: (int, str, /) -> None
p2 = ParamSpecWithDefault4[[int]]()
reveal_type(p2.attr1) # revealed: (int, /) -> None
reveal_type(p2.attr2) # revealed: (int, /) -> None
p3 = ParamSpecWithDefault4[[int], [str]]()
reveal_type(p3.attr1) # revealed: (int, /) -> None
reveal_type(p3.attr2) # revealed: (str, /) -> None
P2 = ParamSpec("P2")
# TODO: error: paramspec is out of scope
class ParamSpecWithDefault5[**P1 = P2]:
attr: Callable[P1, None]
```
## Semantics
Most of these test cases are adopted from the
[typing documentation on `ParamSpec` semantics](https://typing.python.org/en/latest/spec/generics.html#semantics).
### Return type change using `ParamSpec` once
```py
from typing import Callable
def converter[**P](func: Callable[P, int]) -> Callable[P, bool]:
def wrapper(*args: P.args, **kwargs: P.kwargs) -> bool:
func(*args, **kwargs)
return True
return wrapper
def f1(x: int, y: str) -> int:
return 1
# This should preserve all the information about the parameters of `f1`
f2 = converter(f1)
reveal_type(f2) # revealed: (x: int, y: str) -> bool
reveal_type(f1(1, "a")) # revealed: int
reveal_type(f2(1, "a")) # revealed: bool
# As it preserves the parameter kinds, the following should work as well
reveal_type(f2(1, y="a")) # revealed: bool
reveal_type(f2(x=1, y="a")) # revealed: bool
reveal_type(f2(y="a", x=1)) # revealed: bool
# error: [missing-argument] "No argument provided for required parameter `y`"
f2(1)
# error: [invalid-argument-type] "Argument is incorrect: Expected `int`, found `Literal["a"]`"
f2("a", "b")
```
The `converter` function acts as a decorator here:
```py
@converter
def f3(x: int, y: str) -> int:
return 1
# TODO: This should reveal `(x: int, y: str) -> bool` but there's a cycle: https://github.com/astral-sh/ty/issues/1729
reveal_type(f3) # revealed: ((x: int, y: str) -> bool) | ((x: Divergent, y: Divergent) -> bool)
reveal_type(f3(1, "a")) # revealed: bool
reveal_type(f3(x=1, y="a")) # revealed: bool
reveal_type(f3(1, y="a")) # revealed: bool
reveal_type(f3(y="a", x=1)) # revealed: bool
# TODO: There should only be one error but the type of `f3` is a union: https://github.com/astral-sh/ty/issues/1729
# error: [missing-argument] "No argument provided for required parameter `y`"
# error: [missing-argument] "No argument provided for required parameter `y`"
f3(1)
# error: [invalid-argument-type] "Argument is incorrect: Expected `int`, found `Literal["a"]`"
f3("a", "b")
```
### Return type change using the same `ParamSpec` multiple times
```py
from typing import Callable
def multiple[**P](func1: Callable[P, int], func2: Callable[P, int]) -> Callable[P, bool]:
def wrapper(*args: P.args, **kwargs: P.kwargs) -> bool:
func1(*args, **kwargs)
func2(*args, **kwargs)
return True
return wrapper
```
As per the spec,
> A user may include the same `ParamSpec` multiple times in the arguments of the same function, to
> indicate a dependency between multiple arguments. In these cases a type checker may choose to
> solve to a common behavioral supertype (i.e. a set of parameters for which all of the valid calls
> are valid in both of the subtypes), but is not obligated to do so.
TODO: Currently, we don't do this
```py
def xy(x: int, y: str) -> int:
return 1
def yx(y: int, x: str) -> int:
return 2
reveal_type(multiple(xy, xy)) # revealed: (x: int, y: str) -> bool
# The common supertype is `(int, str, /)`, which converts the positional-or-keyword parameters
# into positional-only parameters because the types at each position are the same.
# TODO: This shouldn't error
# error: [invalid-argument-type]
reveal_type(multiple(xy, yx)) # revealed: (x: int, y: str) -> bool
def keyword_only_with_default_1(*, x: int = 42) -> int:
return 1
def keyword_only_with_default_2(*, y: int = 42) -> int:
return 2
# The common supertype of two functions whose keyword-only parameters all have defaults
# would be an empty parameter list, i.e., `()`
# TODO: This shouldn't error
# error: [invalid-argument-type]
# revealed: (*, x: int = Literal[42]) -> bool
reveal_type(multiple(keyword_only_with_default_1, keyword_only_with_default_2))
def keyword_only1(*, x: int) -> int:
return 1
def keyword_only2(*, y: int) -> int:
return 2
# On the other hand, combining two functions with only keyword-only parameters does not have a
# common supertype, so it should result in an error.
# error: [invalid-argument-type] "Argument to function `multiple` is incorrect: Expected `(*, x: int) -> int`, found `def keyword_only2(*, y: int) -> int`"
reveal_type(multiple(keyword_only1, keyword_only2)) # revealed: (*, x: int) -> bool
```
### Constructors of user-defined generic class on `ParamSpec`
```py
from typing import Callable
class C[**P]:
f: Callable[P, int]
def __init__(self, f: Callable[P, int]) -> None:
self.f = f
def f(x: int, y: str) -> bool:
return True
c = C(f)
reveal_type(c.f) # revealed: (x: int, y: str) -> int
```
### `ParamSpec` in prepended positional parameters
> If one of these prepended positional parameters contains a free `ParamSpec`, we consider that
> variable in scope for the purposes of extracting the components of that `ParamSpec`.
```py
from typing import Callable
def foo1[**P1](func: Callable[P1, int], *args: P1.args, **kwargs: P1.kwargs) -> int:
return func(*args, **kwargs)
def foo1_with_extra_arg[**P1](func: Callable[P1, int], extra: str, *args: P1.args, **kwargs: P1.kwargs) -> int:
return func(*args, **kwargs)
def foo2[**P2](func: Callable[P2, int], *args: P2.args, **kwargs: P2.kwargs) -> None:
foo1(func, *args, **kwargs)
# error: [invalid-argument-type] "Argument to function `foo1` is incorrect: Expected `P2@foo2.args`, found `Literal[1]`"
foo1(func, 1, *args, **kwargs)
# error: [invalid-argument-type] "Argument to function `foo1_with_extra_arg` is incorrect: Expected `str`, found `P2@foo2.args`"
foo1_with_extra_arg(func, *args, **kwargs)
foo1_with_extra_arg(func, "extra", *args, **kwargs)
```
Here, the first argument to `foo1` can specialize `P1` to the parameters of the callable passed in,
and that specialization is then used to type the `ParamSpec` components in `*args` and `**kwargs`.
```py
def f1(x: int, y: str) -> int:
return 1
foo1(f1, 1, "a")
foo1(f1, x=1, y="a")
foo1(f1, 1, y="a")
# error: [missing-argument] "No arguments provided for required parameters `x`, `y` of function `foo1`"
foo1(f1)
# error: [missing-argument] "No argument provided for required parameter `y` of function `foo1`"
foo1(f1, 1)
# error: [invalid-argument-type] "Argument to function `foo1` is incorrect: Expected `str`, found `Literal[2]`"
foo1(f1, 1, 2)
# error: [too-many-positional-arguments] "Too many positional arguments to function `foo1`: expected 2, got 3"
foo1(f1, 1, "a", "b")
# error: [missing-argument] "No argument provided for required parameter `y` of function `foo1`"
# error: [unknown-argument] "Argument `z` does not match any known parameter of function `foo1`"
foo1(f1, x=1, z="a")
```
### Specializing `ParamSpec` with another `ParamSpec`
```py
class Foo[**P]:
def __init__(self, *args: P.args, **kwargs: P.kwargs) -> None:
self.args = args
self.kwargs = kwargs
def bar[**P](foo: Foo[P]) -> None:
reveal_type(foo) # revealed: Foo[P@bar]
reveal_type(foo.args) # revealed: Unknown | P@bar.args
reveal_type(foo.kwargs) # revealed: Unknown | P@bar.kwargs
```
ty checks whether the argument after `**` is a mapping type, but since instance attributes are
unioned with `Unknown`, it shouldn't error here.
```py
from typing import Callable
def baz[**P](fn: Callable[P, None], foo: Foo[P]) -> None:
fn(*foo.args, **foo.kwargs)
```
The `Unknown` can be eliminated by annotating these attributes with `Final`:
```py
from typing import Final
class FooWithFinal[**P]:
def __init__(self, *args: P.args, **kwargs: P.kwargs) -> None:
self.args: Final = args
self.kwargs: Final = kwargs
def with_final[**P](foo: FooWithFinal[P]) -> None:
reveal_type(foo) # revealed: FooWithFinal[P@with_final]
reveal_type(foo.args) # revealed: P@with_final.args
reveal_type(foo.kwargs) # revealed: P@with_final.kwargs
```
### Specializing `Self` when `ParamSpec` is involved
```py
class Foo[**P]:
def method(self, *args: P.args, **kwargs: P.kwargs) -> str:
return "hello"
foo = Foo[int, str]()
reveal_type(foo) # revealed: Foo[(int, str, /)]
reveal_type(foo.method) # revealed: bound method Foo[(int, str, /)].method(int, str, /) -> str
reveal_type(foo.method(1, "a")) # revealed: str
```
### Overloads
`overloaded.pyi`:
```pyi
from typing import overload
@overload
def int_int(x: int) -> int: ...
@overload
def int_int(x: str) -> int: ...
@overload
def int_str(x: int) -> int: ...
@overload
def int_str(x: str) -> str: ...
@overload
def str_str(x: int) -> str: ...
@overload
def str_str(x: str) -> str: ...
```
```py
from typing import Callable
from overloaded import int_int, int_str, str_str
def change_return_type[**P](f: Callable[P, int]) -> Callable[P, str]:
def nested(*args: P.args, **kwargs: P.kwargs) -> str:
return str(f(*args, **kwargs))
return nested
def with_parameters[**P](f: Callable[P, int], *args: P.args, **kwargs: P.kwargs) -> Callable[P, str]:
def nested(*args: P.args, **kwargs: P.kwargs) -> str:
return str(f(*args, **kwargs))
return nested
reveal_type(change_return_type(int_int)) # revealed: Overload[(x: int) -> str, (x: str) -> str]
# TODO: This shouldn't error and should pick the first overload because of the return type
# error: [invalid-argument-type]
reveal_type(change_return_type(int_str)) # revealed: Overload[(x: int) -> str, (x: str) -> str]
# error: [invalid-argument-type]
reveal_type(change_return_type(str_str)) # revealed: Overload[(x: int) -> str, (x: str) -> str]
# TODO: Both of these shouldn't raise an error
# error: [invalid-argument-type]
reveal_type(with_parameters(int_int, 1)) # revealed: Overload[(x: int) -> str, (x: str) -> str]
# error: [invalid-argument-type]
reveal_type(with_parameters(int_int, "a")) # revealed: Overload[(x: int) -> str, (x: str) -> str]
```


@@ -77,44 +77,44 @@ IntOrTypeVar = int | T
TypeVarOrNone = T | None
NoneOrTypeVar = None | T
reveal_type(IntOrStr) # revealed: types.UnionType
reveal_type(IntOrStrOrBytes1) # revealed: types.UnionType
reveal_type(IntOrStrOrBytes2) # revealed: types.UnionType
reveal_type(IntOrStrOrBytes3) # revealed: types.UnionType
reveal_type(IntOrStrOrBytes4) # revealed: types.UnionType
reveal_type(IntOrStrOrBytes5) # revealed: types.UnionType
reveal_type(IntOrStrOrBytes6) # revealed: types.UnionType
reveal_type(BytesOrIntOrStr) # revealed: types.UnionType
reveal_type(IntOrNone) # revealed: types.UnionType
reveal_type(NoneOrInt) # revealed: types.UnionType
reveal_type(IntOrStrOrNone) # revealed: types.UnionType
reveal_type(NoneOrIntOrStr) # revealed: types.UnionType
reveal_type(IntOrAny) # revealed: types.UnionType
reveal_type(AnyOrInt) # revealed: types.UnionType
reveal_type(NoneOrAny) # revealed: types.UnionType
reveal_type(AnyOrNone) # revealed: types.UnionType
reveal_type(NeverOrAny) # revealed: types.UnionType
reveal_type(AnyOrNever) # revealed: types.UnionType
reveal_type(UnknownOrInt) # revealed: types.UnionType
reveal_type(IntOrUnknown) # revealed: types.UnionType
reveal_type(StrOrZero) # revealed: types.UnionType
reveal_type(ZeroOrStr) # revealed: types.UnionType
reveal_type(IntOrLiteralString) # revealed: types.UnionType
reveal_type(LiteralStringOrInt) # revealed: types.UnionType
reveal_type(NoneOrTuple) # revealed: types.UnionType
reveal_type(TupleOrNone) # revealed: types.UnionType
reveal_type(IntOrAnnotated) # revealed: types.UnionType
reveal_type(AnnotatedOrInt) # revealed: types.UnionType
reveal_type(IntOrOptional) # revealed: types.UnionType
reveal_type(OptionalOrInt) # revealed: types.UnionType
reveal_type(IntOrTypeOfStr) # revealed: types.UnionType
reveal_type(TypeOfStrOrInt) # revealed: types.UnionType
reveal_type(IntOrCallable) # revealed: types.UnionType
reveal_type(CallableOrInt) # revealed: types.UnionType
reveal_type(TypeVarOrInt) # revealed: types.UnionType
reveal_type(IntOrTypeVar) # revealed: types.UnionType
reveal_type(TypeVarOrNone) # revealed: types.UnionType
reveal_type(NoneOrTypeVar) # revealed: types.UnionType
reveal_type(IntOrStr) # revealed: <types.UnionType special form 'int | str'>
reveal_type(IntOrStrOrBytes1) # revealed: <types.UnionType special form 'int | str | bytes'>
reveal_type(IntOrStrOrBytes2) # revealed: <types.UnionType special form 'int | str | bytes'>
reveal_type(IntOrStrOrBytes3) # revealed: <types.UnionType special form 'int | str | bytes'>
reveal_type(IntOrStrOrBytes4) # revealed: <types.UnionType special form 'int | str | bytes'>
reveal_type(IntOrStrOrBytes5) # revealed: <types.UnionType special form 'int | str | bytes'>
reveal_type(IntOrStrOrBytes6) # revealed: <types.UnionType special form 'int | str | bytes'>
reveal_type(BytesOrIntOrStr) # revealed: <types.UnionType special form 'bytes | int | str'>
reveal_type(IntOrNone) # revealed: <types.UnionType special form 'int | None'>
reveal_type(NoneOrInt) # revealed: <types.UnionType special form 'None | int'>
reveal_type(IntOrStrOrNone) # revealed: <types.UnionType special form 'int | str | None'>
reveal_type(NoneOrIntOrStr) # revealed: <types.UnionType special form 'None | int | str'>
reveal_type(IntOrAny) # revealed: <types.UnionType special form 'int | Any'>
reveal_type(AnyOrInt) # revealed: <types.UnionType special form 'Any | int'>
reveal_type(NoneOrAny) # revealed: <types.UnionType special form 'None | Any'>
reveal_type(AnyOrNone) # revealed: <types.UnionType special form 'Any | None'>
reveal_type(NeverOrAny) # revealed: <types.UnionType special form 'Any'>
reveal_type(AnyOrNever) # revealed: <types.UnionType special form 'Any'>
reveal_type(UnknownOrInt) # revealed: <types.UnionType special form 'Unknown | int'>
reveal_type(IntOrUnknown) # revealed: <types.UnionType special form 'int | Unknown'>
reveal_type(StrOrZero) # revealed: <types.UnionType special form 'str | Literal[0]'>
reveal_type(ZeroOrStr) # revealed: <types.UnionType special form 'Literal[0] | str'>
reveal_type(IntOrLiteralString) # revealed: <types.UnionType special form 'int | LiteralString'>
reveal_type(LiteralStringOrInt) # revealed: <types.UnionType special form 'LiteralString | int'>
reveal_type(NoneOrTuple) # revealed: <types.UnionType special form 'None | tuple[int, str]'>
reveal_type(TupleOrNone) # revealed: <types.UnionType special form 'tuple[int, str] | None'>
reveal_type(IntOrAnnotated) # revealed: <types.UnionType special form 'int | str'>
reveal_type(AnnotatedOrInt) # revealed: <types.UnionType special form 'str | int'>
reveal_type(IntOrOptional) # revealed: <types.UnionType special form 'int | str | None'>
reveal_type(OptionalOrInt) # revealed: <types.UnionType special form 'str | None | int'>
reveal_type(IntOrTypeOfStr) # revealed: <types.UnionType special form 'int | type[str]'>
reveal_type(TypeOfStrOrInt) # revealed: <types.UnionType special form 'type[str] | int'>
reveal_type(IntOrCallable) # revealed: <types.UnionType special form 'int | ((str, /) -> bytes)'>
reveal_type(CallableOrInt) # revealed: <types.UnionType special form '((str, /) -> bytes) | int'>
reveal_type(TypeVarOrInt) # revealed: <types.UnionType special form 'T@TypeVarOrInt | int'>
reveal_type(IntOrTypeVar) # revealed: <types.UnionType special form 'int | T@IntOrTypeVar'>
reveal_type(TypeVarOrNone) # revealed: <types.UnionType special form 'T@TypeVarOrNone | None'>
reveal_type(NoneOrTypeVar) # revealed: <types.UnionType special form 'None | T@NoneOrTypeVar'>
def _(
int_or_str: IntOrStr,
@@ -295,7 +295,7 @@ X = Foo | Bar
# In an ideal world, perhaps we would respect `Meta.__or__` here and reveal `str`?
# But we still need to record what the elements are, since (according to the typing spec)
# `X` is still a valid type alias
reveal_type(X) # revealed: types.UnionType
reveal_type(X) # revealed: <types.UnionType special form 'Foo | Bar'>
def f(obj: X):
reveal_type(obj) # revealed: Foo | Bar
@@ -391,16 +391,17 @@ MyOptional = T | None
reveal_type(MyList) # revealed: <class 'list[T@MyList]'>
reveal_type(MyDict) # revealed: <class 'dict[T@MyDict, U@MyDict]'>
reveal_type(MyType) # revealed: GenericAlias
reveal_type(MyType) # revealed: <special form 'type[T@MyType]'>
reveal_type(IntAndType) # revealed: <class 'tuple[int, T@IntAndType]'>
reveal_type(Pair) # revealed: <class 'tuple[T@Pair, T@Pair]'>
reveal_type(Sum) # revealed: <class 'tuple[T@Sum, U@Sum]'>
reveal_type(ListOrTuple) # revealed: types.UnionType
reveal_type(ListOrTupleLegacy) # revealed: types.UnionType
reveal_type(MyCallable) # revealed: @Todo(Callable[..] specialized with ParamSpec)
reveal_type(AnnotatedType) # revealed: <typing.Annotated special form>
reveal_type(ListOrTuple) # revealed: <types.UnionType special form 'list[T@ListOrTuple] | tuple[T@ListOrTuple, ...]'>
# revealed: <types.UnionType special form 'list[T@ListOrTupleLegacy] | tuple[T@ListOrTupleLegacy, ...]'>
reveal_type(ListOrTupleLegacy)
reveal_type(MyCallable) # revealed: <typing.Callable special form '(**P@MyCallable) -> T@MyCallable'>
reveal_type(AnnotatedType) # revealed: <special form 'typing.Annotated[T@AnnotatedType, <metadata>]'>
reveal_type(TransparentAlias) # revealed: typing.TypeVar
reveal_type(MyOptional) # revealed: types.UnionType
reveal_type(MyOptional) # revealed: <types.UnionType special form 'T@MyOptional | None'>
def _(
list_of_ints: MyList[int],
@@ -424,8 +425,7 @@ def _(
reveal_type(int_and_bytes) # revealed: tuple[int, bytes]
reveal_type(list_or_tuple) # revealed: list[int] | tuple[int, ...]
reveal_type(list_or_tuple_legacy) # revealed: list[int] | tuple[int, ...]
# TODO: This should be `(str, bytes) -> int`
reveal_type(my_callable) # revealed: @Todo(Callable[..] specialized with ParamSpec)
reveal_type(my_callable) # revealed: (str, bytes, /) -> int
reveal_type(annotated_int) # revealed: int
reveal_type(transparent_alias) # revealed: int
reveal_type(optional_int) # revealed: int | None
@@ -456,13 +456,13 @@ AnnotatedInt = AnnotatedType[int]
SubclassOfInt = MyType[int]
CallableIntToStr = MyCallable[[int], str]
reveal_type(IntsOrNone) # revealed: types.UnionType
reveal_type(IntsOrStrs) # revealed: types.UnionType
reveal_type(IntsOrNone) # revealed: <types.UnionType special form 'list[int] | None'>
reveal_type(IntsOrStrs) # revealed: <types.UnionType special form 'tuple[int, int] | tuple[str, str]'>
reveal_type(ListOfPairs) # revealed: <class 'list[tuple[str, str]]'>
reveal_type(ListOrTupleOfInts) # revealed: types.UnionType
reveal_type(AnnotatedInt) # revealed: <typing.Annotated special form>
reveal_type(SubclassOfInt) # revealed: GenericAlias
reveal_type(CallableIntToStr) # revealed: @Todo(Callable[..] specialized with ParamSpec)
reveal_type(ListOrTupleOfInts) # revealed: <types.UnionType special form 'list[int] | tuple[int, ...]'>
reveal_type(AnnotatedInt) # revealed: <special form 'typing.Annotated[int, <metadata>]'>
reveal_type(SubclassOfInt) # revealed: <special form 'type[int]'>
reveal_type(CallableIntToStr) # revealed: <typing.Callable special form '(int, /) -> str'>
def _(
ints_or_none: IntsOrNone,
@@ -479,8 +479,7 @@ def _(
reveal_type(list_or_tuple_of_ints) # revealed: list[int] | tuple[int, ...]
reveal_type(annotated_int) # revealed: int
reveal_type(subclass_of_int) # revealed: type[int]
# TODO: This should be `(int, /) -> str`
reveal_type(callable_int_to_str) # revealed: @Todo(Callable[..] specialized with ParamSpec)
reveal_type(callable_int_to_str) # revealed: (int, /) -> str
```
A generic implicit type alias can also be used in another generic implicit type alias:
@@ -495,8 +494,8 @@ MyOtherType = MyType[T]
TypeOrList = MyType[B] | MyList[B]
reveal_type(MyOtherList) # revealed: <class 'list[T@MyOtherList]'>
reveal_type(MyOtherType) # revealed: GenericAlias
reveal_type(TypeOrList) # revealed: types.UnionType
reveal_type(MyOtherType) # revealed: <special form 'type[T@MyOtherType]'>
reveal_type(TypeOrList) # revealed: <types.UnionType special form 'type[B@TypeOrList] | list[B@TypeOrList]'>
def _(
list_of_ints: MyOtherList[int],
@@ -533,8 +532,7 @@ def _(
reveal_type(unknown_and_unknown) # revealed: tuple[Unknown, Unknown]
reveal_type(list_or_tuple) # revealed: list[Unknown] | tuple[Unknown, ...]
reveal_type(list_or_tuple_legacy) # revealed: list[Unknown] | tuple[Unknown, ...]
# TODO: should be (...) -> Unknown
reveal_type(my_callable) # revealed: @Todo(Callable[..] specialized with ParamSpec)
reveal_type(my_callable) # revealed: (...) -> Unknown
reveal_type(annotated_unknown) # revealed: Unknown
reveal_type(optional_unknown) # revealed: Unknown | None
```
@@ -898,7 +896,7 @@ from typing import Optional
MyOptionalInt = Optional[int]
reveal_type(MyOptionalInt) # revealed: types.UnionType
reveal_type(MyOptionalInt) # revealed: <types.UnionType special form 'int | None'>
def _(optional_int: MyOptionalInt):
reveal_type(optional_int) # revealed: int | None
@@ -931,9 +929,9 @@ MyLiteralString = LiteralString
MyNoReturn = NoReturn
MyNever = Never
reveal_type(MyLiteralString) # revealed: typing.LiteralString
reveal_type(MyNoReturn) # revealed: typing.NoReturn
reveal_type(MyNever) # revealed: typing.Never
reveal_type(MyLiteralString) # revealed: <special form 'typing.LiteralString'>
reveal_type(MyNoReturn) # revealed: <special form 'typing.NoReturn'>
reveal_type(MyNever) # revealed: <special form 'typing.Never'>
def _(
ls: MyLiteralString,
@@ -986,8 +984,8 @@ from typing import Union
IntOrStr = Union[int, str]
IntOrStrOrBytes = Union[int, Union[str, bytes]]
reveal_type(IntOrStr) # revealed: types.UnionType
reveal_type(IntOrStrOrBytes) # revealed: types.UnionType
reveal_type(IntOrStr) # revealed: <types.UnionType special form 'int | str'>
reveal_type(IntOrStrOrBytes) # revealed: <types.UnionType special form 'int | str | bytes'>
def _(
int_or_str: IntOrStr,
@@ -1015,7 +1013,7 @@ An empty `typing.Union` leads to a `TypeError` at runtime, so we emit an error.
# error: [invalid-type-form] "`typing.Union` requires at least one type argument"
EmptyUnion = Union[()]
reveal_type(EmptyUnion) # revealed: types.UnionType
reveal_type(EmptyUnion) # revealed: <types.UnionType special form 'Never'>
def _(empty: EmptyUnion):
reveal_type(empty) # revealed: Never
@@ -1060,14 +1058,14 @@ SubclassOfG = type[G]
SubclassOfGInt = type[G[int]]
SubclassOfP = type[P]
reveal_type(SubclassOfA) # revealed: GenericAlias
reveal_type(SubclassOfAny) # revealed: GenericAlias
reveal_type(SubclassOfAOrB1) # revealed: GenericAlias
reveal_type(SubclassOfAOrB2) # revealed: types.UnionType
reveal_type(SubclassOfAOrB3) # revealed: types.UnionType
reveal_type(SubclassOfG) # revealed: GenericAlias
reveal_type(SubclassOfGInt) # revealed: GenericAlias
reveal_type(SubclassOfP) # revealed: GenericAlias
reveal_type(SubclassOfA) # revealed: <special form 'type[A]'>
reveal_type(SubclassOfAny) # revealed: <special form 'type[Any]'>
reveal_type(SubclassOfAOrB1) # revealed: <special form 'type[A | B]'>
reveal_type(SubclassOfAOrB2) # revealed: <types.UnionType special form 'type[A] | type[B]'>
reveal_type(SubclassOfAOrB3) # revealed: <types.UnionType special form 'type[A] | type[B]'>
reveal_type(SubclassOfG) # revealed: <special form 'type[G[Unknown]]'>
reveal_type(SubclassOfGInt) # revealed: <special form 'type[G[int]]'>
reveal_type(SubclassOfP) # revealed: <special form 'type[P]'>
def _(
subclass_of_a: SubclassOfA,
@@ -1148,14 +1146,14 @@ SubclassOfG = Type[G]
SubclassOfGInt = Type[G[int]]
SubclassOfP = Type[P]
reveal_type(SubclassOfA) # revealed: GenericAlias
reveal_type(SubclassOfAny) # revealed: GenericAlias
reveal_type(SubclassOfAOrB1) # revealed: GenericAlias
reveal_type(SubclassOfAOrB2) # revealed: types.UnionType
reveal_type(SubclassOfAOrB3) # revealed: types.UnionType
reveal_type(SubclassOfG) # revealed: GenericAlias
reveal_type(SubclassOfGInt) # revealed: GenericAlias
reveal_type(SubclassOfP) # revealed: GenericAlias
reveal_type(SubclassOfA) # revealed: <special form 'type[A]'>
reveal_type(SubclassOfAny) # revealed: <special form 'type[Any]'>
reveal_type(SubclassOfAOrB1) # revealed: <special form 'type[A | B]'>
reveal_type(SubclassOfAOrB2) # revealed: <types.UnionType special form 'type[A] | type[B]'>
reveal_type(SubclassOfAOrB3) # revealed: <types.UnionType special form 'type[A] | type[B]'>
reveal_type(SubclassOfG) # revealed: <special form 'type[G[Unknown]]'>
reveal_type(SubclassOfGInt) # revealed: <special form 'type[G[int]]'>
reveal_type(SubclassOfP) # revealed: <special form 'type[P]'>
def _(
subclass_of_a: SubclassOfA,
@@ -1270,25 +1268,25 @@ DefaultDictOrNone = DefaultDict[str, int] | None
DequeOrNone = Deque[str] | None
OrderedDictOrNone = OrderedDict[str, int] | None
reveal_type(NoneOrList) # revealed: types.UnionType
reveal_type(NoneOrSet) # revealed: types.UnionType
reveal_type(NoneOrDict) # revealed: types.UnionType
reveal_type(NoneOrFrozenSet) # revealed: types.UnionType
reveal_type(NoneOrChainMap) # revealed: types.UnionType
reveal_type(NoneOrCounter) # revealed: types.UnionType
reveal_type(NoneOrDefaultDict) # revealed: types.UnionType
reveal_type(NoneOrDeque) # revealed: types.UnionType
reveal_type(NoneOrOrderedDict) # revealed: types.UnionType
reveal_type(NoneOrList) # revealed: <types.UnionType special form 'None | list[str]'>
reveal_type(NoneOrSet) # revealed: <types.UnionType special form 'None | set[str]'>
reveal_type(NoneOrDict) # revealed: <types.UnionType special form 'None | dict[str, int]'>
reveal_type(NoneOrFrozenSet) # revealed: <types.UnionType special form 'None | frozenset[str]'>
reveal_type(NoneOrChainMap) # revealed: <types.UnionType special form 'None | ChainMap[str, int]'>
reveal_type(NoneOrCounter) # revealed: <types.UnionType special form 'None | Counter[str]'>
reveal_type(NoneOrDefaultDict) # revealed: <types.UnionType special form 'None | defaultdict[str, int]'>
reveal_type(NoneOrDeque) # revealed: <types.UnionType special form 'None | deque[str]'>
reveal_type(NoneOrOrderedDict) # revealed: <types.UnionType special form 'None | OrderedDict[str, int]'>
reveal_type(ListOrNone) # revealed: types.UnionType
reveal_type(SetOrNone) # revealed: types.UnionType
reveal_type(DictOrNone) # revealed: types.UnionType
reveal_type(FrozenSetOrNone) # revealed: types.UnionType
reveal_type(ChainMapOrNone) # revealed: types.UnionType
reveal_type(CounterOrNone) # revealed: types.UnionType
reveal_type(DefaultDictOrNone) # revealed: types.UnionType
reveal_type(DequeOrNone) # revealed: types.UnionType
reveal_type(OrderedDictOrNone) # revealed: types.UnionType
reveal_type(ListOrNone) # revealed: <types.UnionType special form 'list[int] | None'>
reveal_type(SetOrNone) # revealed: <types.UnionType special form 'set[int] | None'>
reveal_type(DictOrNone) # revealed: <types.UnionType special form 'dict[str, int] | None'>
reveal_type(FrozenSetOrNone) # revealed: <types.UnionType special form 'frozenset[int] | None'>
reveal_type(ChainMapOrNone) # revealed: <types.UnionType special form 'ChainMap[str, int] | None'>
reveal_type(CounterOrNone) # revealed: <types.UnionType special form 'Counter[str] | None'>
reveal_type(DefaultDictOrNone) # revealed: <types.UnionType special form 'defaultdict[str, int] | None'>
reveal_type(DequeOrNone) # revealed: <types.UnionType special form 'deque[str] | None'>
reveal_type(OrderedDictOrNone) # revealed: <types.UnionType special form 'OrderedDict[str, int] | None'>
def _(
none_or_list: NoneOrList,
@@ -1381,9 +1379,9 @@ CallableNoArgs = Callable[[], None]
BasicCallable = Callable[[int, str], bytes]
GradualCallable = Callable[..., str]
reveal_type(CallableNoArgs) # revealed: GenericAlias
reveal_type(BasicCallable) # revealed: GenericAlias
reveal_type(GradualCallable) # revealed: GenericAlias
reveal_type(CallableNoArgs) # revealed: <typing.Callable special form '() -> None'>
reveal_type(BasicCallable) # revealed: <typing.Callable special form '(int, str, /) -> bytes'>
reveal_type(GradualCallable) # revealed: <typing.Callable special form '(...) -> str'>
def _(
callable_no_args: CallableNoArgs,
@@ -1415,8 +1413,8 @@ InvalidCallable1 = Callable[[int]]
# error: [invalid-type-form] "The first argument to `Callable` must be either a list of types, ParamSpec, Concatenate, or `...`"
InvalidCallable2 = Callable[int, str]
reveal_type(InvalidCallable1) # revealed: GenericAlias
reveal_type(InvalidCallable2) # revealed: GenericAlias
reveal_type(InvalidCallable1) # revealed: <typing.Callable special form '(...) -> Unknown'>
reveal_type(InvalidCallable2) # revealed: <typing.Callable special form '(...) -> Unknown'>
def _(invalid_callable1: InvalidCallable1, invalid_callable2: InvalidCallable2):
reveal_type(invalid_callable1) # revealed: (...) -> Unknown


@@ -53,8 +53,8 @@ in `import os.path as os.path` the `os.path` is not a valid identifier.
```py
from b import Any, Literal, foo
reveal_type(Any) # revealed: typing.Any
reveal_type(Literal) # revealed: typing.Literal
reveal_type(Any) # revealed: <special form 'typing.Any'>
reveal_type(Literal) # revealed: <special form 'typing.Literal'>
reveal_type(foo) # revealed: <module 'foo'>
```
@@ -132,7 +132,7 @@ reveal_type(Any) # revealed: Unknown
```pyi
from typing import Any
reveal_type(Any) # revealed: typing.Any
reveal_type(Any) # revealed: <special form 'typing.Any'>
```
## Nested mixed re-export and not
@@ -169,7 +169,7 @@ reveal_type(Any) # revealed: Unknown
```pyi
from typing import Any
reveal_type(Any) # revealed: typing.Any
reveal_type(Any) # revealed: <special form 'typing.Any'>
```
## Exported as different name


@@ -22,10 +22,10 @@ python = "/.venv"
`/.venv/pyvenv.cfg`:
```cfg
home = /doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin
home = /do/re/mi//cpython-3.13.2-macos-aarch64-none/bin
```
`/doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin/python`:
`/do/re/mi//cpython-3.13.2-macos-aarch64-none/bin/python`:
```text
```
@@ -54,11 +54,11 @@ python = "/.venv"
`/.venv/pyvenv.cfg`:
```cfg
home = /doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin
home = /do/re/mi//cpython-3.13.2-macos-aarch64-none/bin
version = wut
```
`/doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin/python`:
`/do/re/mi//cpython-3.13.2-macos-aarch64-none/bin/python`:
```text
```
@@ -87,11 +87,11 @@ python = "/.venv"
`/.venv/pyvenv.cfg`:
```cfg
home = /doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin
home = /do/re/mi//cpython-3.13.2-macos-aarch64-none/bin
version_info = no-really-wut
```
`/doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin/python`:
`/do/re/mi//cpython-3.13.2-macos-aarch64-none/bin/python`:
```text
```
@@ -132,7 +132,7 @@ python = "/.venv"
`/.venv/pyvenv.cfg`:
```cfg
home = /doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin
home = /do/re/mi//cpython-3.13.2-macos-aarch64-none/bin
implementation = CPython
uv = 0.7.6
version_info = 3.13.2
@@ -141,7 +141,7 @@ prompt = ruff
extends-environment = /.other-environment
```
`/doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin/python`:
`/do/re/mi//cpython-3.13.2-macos-aarch64-none/bin/python`:
```text
```
@@ -182,12 +182,12 @@ python = "/.venv"
`/.venv/pyvenv.cfg`:
```cfg
home = /doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin
home = /do/re/mi//cpython-3.13.2-macos-aarch64-none/bin
version_info = 3.13
command = /.pyenv/versions/3.13.3/bin/python3.13 -m venv --without-pip --prompt="python-default/3.13.3" /somewhere-else/python/virtualenvs/python-default/3.13.3
```
`/doo/doo/wop/cpython-3.13.2-macos-aarch64-none/bin/python`:
`/do/re/mi//cpython-3.13.2-macos-aarch64-none/bin/python`:
```text
```


@@ -1336,6 +1336,69 @@ reveal_type(g) # revealed: Unknown
reveal_type(h) # revealed: Unknown
```
## Star-imports can affect member states
If a star-import pulls in a symbol that was previously defined in the importing module (e.g. `obj`),
it can affect the state of associated member expressions (e.g. `obj.attr` or `obj[0]`). In the test
below, note how the types of the corresponding attribute expressions change after the star import
affects the object:
`common.py`:
```py
class C:
attr: int | None
```
`exporter.py`:
```py
from common import C
def flag() -> bool:
return True
should_be_imported: C = C()
if flag():
might_be_imported: C = C()
if False:
should_not_be_imported: C = C()
```
`main.py`:
```py
from common import C
should_be_imported = C()
might_be_imported = C()
should_not_be_imported = C()
# We start with the plain attribute types:
reveal_type(should_be_imported.attr) # revealed: int | None
reveal_type(might_be_imported.attr) # revealed: int | None
reveal_type(should_not_be_imported.attr) # revealed: int | None
# Now we narrow the types by assignment:
should_be_imported.attr = 1
might_be_imported.attr = 1
should_not_be_imported.attr = 1
reveal_type(should_be_imported.attr) # revealed: Literal[1]
reveal_type(might_be_imported.attr) # revealed: Literal[1]
reveal_type(should_not_be_imported.attr) # revealed: Literal[1]
# This star import adds bindings for `should_be_imported` and `might_be_imported`:
from exporter import *
# As expected, narrowing is "reset" for the first two variables, but not for the third:
reveal_type(should_be_imported.attr) # revealed: int | None
reveal_type(might_be_imported.attr) # revealed: int | None
reveal_type(should_not_be_imported.attr) # revealed: Literal[1]
```
## Cyclic star imports
Believe it or not, this code does *not* raise an exception at runtime!
@@ -1374,7 +1437,7 @@ are present due to `*` imports.
import collections.abc
reveal_type(collections.abc.Sequence) # revealed: <class 'Sequence'>
reveal_type(collections.abc.Callable) # revealed: typing.Callable
reveal_type(collections.abc.Callable) # revealed: <special form 'typing.Callable'>
reveal_type(collections.abc.Set) # revealed: <class 'AbstractSet'>
```


@@ -6,6 +6,15 @@ python file in some random workspace, and so we need to be more tolerant of situ
fly in a published package, cases where we're not configured as well as we'd like, or cases where
two projects in a monorepo have conflicting definitions (but we want to analyze both at once).
In practice these tests cover what we call "desperate module resolution" which, when an import
fails, results in us walking up the ancestor directories of the importing file and trying those as
"desperate search-paths".
Currently, desperate search-paths are restricted to subdirectories of the first-party search-path
(the directory you're running `ty` in), and we only consider one desperate search-path: the
closest ancestor directory containing a `pyproject.toml`. In the future we may want to try every
ancestor `pyproject.toml` or every ancestor directory.
## Invalid Names
While you can't syntactically refer to a module with an invalid name (i.e. one with a `-`, or that
@@ -18,9 +27,10 @@ strings and does in fact allow syntactically invalid module names.
### Current File Is Invalid Module Name
Relative and absolute imports should resolve fine in a file that isn't a valid module name.
Relative and absolute imports should resolve fine in a file that isn't a valid module name (in this
case, it could be imported via `importlib.import_module`).
`my-main.py`:
`tests/my-mod.py`:
```py
# TODO: there should be no errors in this file
@@ -37,13 +47,13 @@ reveal_type(mod2.y) # revealed: Unknown
reveal_type(mod3.z) # revealed: int
```
`mod1.py`:
`tests/mod1.py`:
```py
x: int = 1
```
`mod2.py`:
`tests/mod2.py`:
```py
y: int = 2
@@ -57,13 +67,16 @@ z: int = 2
### Current Directory Is Invalid Module Name
Relative and absolute imports should resolve fine in a dir that isn't a valid module name.
If python files are rooted in a directory with an invalid module name and they relatively import
each other, there's probably no coherent explanation for what's going on, and it's fine that the
relative imports don't resolve (but maybe we could provide some good diagnostics).
`my-tests/main.py`:
This is a case that sufficient desperation might "accidentally" make work, so it's included here as
a canary in the coal mine.
`my-tests/mymod.py`:
```py
# TODO: there should be no errors in this file
# error: [unresolved-import]
from .mod1 import x
@@ -94,46 +107,97 @@ y: int = 2
z: int = 2
```
### Current Directory Is Invalid Package Name
### Ancestor Directory Is Invalid Module Name
Relative and absolute imports should resolve fine in a dir that isn't a valid package name, even if
it contains an `__init__.py`:
Relative and absolute imports *could* resolve fine in the first-party search-path, even if one of
the ancestor dirs has an invalid module name. That is, in this case we will be inclined to compute
module names like `my-proj.tests.mymod`, but it could be that in practice the user always runs
this code rooted in the `my-proj` directory.
`my-tests/__init__.py`:
This case is hard for us to detect and handle in a principled way, but two more extreme kinds of
desperation could cover it:
- try every ancestor as a desperate search-path
- try the closest ancestor with an invalid module name as a desperate search-path
The second one is a bit messed up because it could result in someone getting a worse experience
simply because a directory happened to *not* be invalid as a module name (`myproj` or `my_proj`).
`my-proj/tests/mymod.py`:
```py
```
`my-tests/main.py`:
```py
# TODO: there should be no errors in this file
# TODO: it would be *nice* if there were no errors in this file
# error: [unresolved-import]
from .mod1 import x
# error: [unresolved-import]
from . import mod2
# error: [unresolved-import]
import mod3
reveal_type(x) # revealed: Unknown
reveal_type(mod2.y) # revealed: Unknown
reveal_type(mod3.z) # revealed: int
reveal_type(mod3.z) # revealed: Unknown
```
`my-tests/mod1.py`:
`my-proj/tests/mod1.py`:
```py
x: int = 1
```
`my-tests/mod2.py`:
`my-proj/tests/mod2.py`:
```py
y: int = 2
```
`mod3.py`:
`my-proj/mod3.py`:
```py
z: int = 2
```
### Ancestor Directory Above `pyproject.toml` Is Invalid
Like the previous tests, but with a `pyproject.toml` existing between the invalid name and the
python files. This is an "easier" case, since we can use the `pyproject.toml` as a hint about
what's going on.
`my-proj/pyproject.toml`:
```text
name = "my_proj"
version = "0.1.0"
```
`my-proj/tests/main.py`:
```py
from .mod1 import x
from . import mod2
import mod3
reveal_type(x) # revealed: int
reveal_type(mod2.y) # revealed: int
reveal_type(mod3.z) # revealed: int
```
`my-proj/tests/mod1.py`:
```py
x: int = 1
```
`my-proj/tests/mod2.py`:
```py
y: int = 2
```
`my-proj/mod3.py`:
```py
z: int = 2
@@ -141,7 +205,7 @@ z: int = 2
## Multiple Projects
It's common for a monorepo to define many separate projects that may or may not depend on eachother
It's common for a monorepo to define many separate projects that may or may not depend on each other
and are stitched together with a package manager like `uv` or `poetry`, often as editables. In this
case, especially when running as an LSP, we want to be able to analyze all of the projects at once,
allowing us to reuse results between projects, without getting confused about things that only make
@@ -150,7 +214,7 @@ sense when analyzing the project separately.
The following tests will feature two projects, `a` and `b` where the "real" packages are found under
`src/` subdirectories (and we've been configured to understand that), but each project also contains
other python files in their roots or subdirectories that contains python files which relatively
import eachother and also absolutely import the main package of the project. All of these imports
import each other and also absolutely import the main package of the project. All of these imports
*should* resolve.
Often the fact that there is both an `a` and `b` project seemingly won't matter, but many possible
@@ -164,13 +228,36 @@ following examples include them in case they help.
Here we have fairly typical situation where there are two projects `aproj` and `bproj` where the
"real" packages are found under `src/` subdirectories, but each project also contains a `tests/`
directory that contains python files which relatively import eachother and also absolutely import
directory that contains python files which relatively import each other and also absolutely import
the package they test. All of these imports *should* resolve.
```toml
[environment]
# This is similar to what we would compute for installed editables
extra-paths = ["aproj/src/", "bproj/src/"]
# Setup a venv with editables for aproj/src/ and bproj/src/
python = "/.venv"
```
`/.venv/pyvenv.cfg`:
```cfg
home = /do/re/mi//cpython-3.13.2-macos-aarch64-none/bin
```
`/do/re/mi//cpython-3.13.2-macos-aarch64-none/bin/python`:
```text
```
`/.venv/<path-to-site-packages>/a.pth`:
```pth
aproj/src/
```
`/.venv/<path-to-site-packages>/b.pth`:
```pth
bproj/src/
```
`aproj/tests/test1.py`:
@@ -239,16 +326,60 @@ version = "0.1.0"
y: str = "20"
```
### Tests Directory With Ambiguous Project Directories
### Tests Directory With Ambiguous Project Directories Via Editables
The same situation as the previous test but instead of the project `a` being in a directory `aproj`
to disambiguate, we now need to avoid getting confused about whether `a/` or `a/src/a/` is the
package `a` while still resolving imports.
Unfortunately this is quite a difficult circle to square, as `a/` is a namespace package of `a` and
`a/src/a/` is a regular package of `a`. **This is a very bad situation you're not supposed to ever
create, and we are now very sensitive to precise search-path ordering.**
Here the use of editables means that `a/` has higher priority than `a/src/a/`.
Somehow this results in `a/tests/test1.py` being able to resolve `.setup` but not `.`.
My best guess is that in this state we can resolve regular modules in `a/tests/` but not namespace
packages because we have some extra validation for namespace packages conflicted by regular
packages, but that validation isn't applied when we successfully resolve a submodule of the
namespace package.
In this case, we find that `a/tests/test1.py` matches on the first-party path as `a.tests.test1`
and is syntactically valid. We then resolve `a.tests.test1`, and because the namespace package
(`/a/`) comes first, we succeed. We then syntactically compute `.` to be `a.tests`.
When we go to look up `a.tests.setup`, whatever grace allowed `a.tests.test1` to resolve still
works, so it resolves too. However, when we try to resolve `a.tests` on its own, some additional
validation rejects the namespace package conflicting with the regular package.
```toml
[environment]
# This is similar to what we would compute for installed editables
extra-paths = ["a/src/", "b/src/"]
# Setup a venv with editables for a/src/ and b/src/
python = "/.venv"
```
`/.venv/pyvenv.cfg`:
```cfg
home = /do/re/mi//cpython-3.13.2-macos-aarch64-none/bin
```
`/do/re/mi//cpython-3.13.2-macos-aarch64-none/bin/python`:
```text
```
`/.venv/<path-to-site-packages>/a.pth`:
```pth
a/src/
```
`/.venv/<path-to-site-packages>/b.pth`:
```pth
b/src/
```
`a/tests/test1.py`:
@@ -256,7 +387,6 @@ extra-paths = ["a/src/", "b/src/"]
```py
# TODO: there should be no errors in this file.
# error: [unresolved-import]
from .setup import x
# error: [unresolved-import]
@@ -264,7 +394,7 @@ from . import setup
from a import y
import a
reveal_type(x) # revealed: Unknown
reveal_type(x) # revealed: int
reveal_type(setup.x) # revealed: Unknown
reveal_type(y) # revealed: int
reveal_type(a.y) # revealed: int
@@ -294,7 +424,6 @@ y: int = 10
```py
# TODO: there should be no errors in this file
# error: [unresolved-import]
from .setup import x
# error: [unresolved-import]
@@ -302,7 +431,7 @@ from . import setup
from b import y
import b
reveal_type(x) # revealed: Unknown
reveal_type(x) # revealed: str
reveal_type(setup.x) # revealed: Unknown
reveal_type(y) # revealed: str
reveal_type(b.y) # revealed: str
@@ -327,10 +456,15 @@ version = "0.1.0"
y: str = "20"
```
### Tests Package With Ambiguous Project Directories
### Tests Directory With Ambiguous Project Directories Via `extra-paths`
The same situation as the previous test but `tests/__init__.py` is also defined, in case that
complicates the situation.
The same situation as the previous test but instead of using editables we use `extra-paths` which
have higher priority than the first-party search-path. Thus, `/a/src/a/` is always seen before
`/a/`.
In this case everything works well because the namespace package `a.tests` (`a/tests/`) is
completely hidden by the regular package `a` (`a/src/a/`) and so we immediately enter desperate
resolution and use the now-unambiguous namespace package `tests`.
```toml
[environment]
@@ -340,27 +474,17 @@ extra-paths = ["a/src/", "b/src/"]
`a/tests/test1.py`:
```py
# TODO: there should be no errors in this file.
# error: [unresolved-import]
from .setup import x
# error: [unresolved-import]
from . import setup
from a import y
import a
reveal_type(x) # revealed: Unknown
reveal_type(setup.x) # revealed: Unknown
reveal_type(x) # revealed: int
reveal_type(setup.x) # revealed: int
reveal_type(y) # revealed: int
reveal_type(a.y) # revealed: int
```
`a/tests/__init__.py`:
```py
```
`a/tests/setup.py`:
```py
@@ -383,27 +507,17 @@ y: int = 10
`b/tests/test1.py`:
```py
# TODO: there should be no errors in this file
# error: [unresolved-import]
from .setup import x
# error: [unresolved-import]
from . import setup
from b import y
import b
reveal_type(x) # revealed: Unknown
reveal_type(setup.x) # revealed: Unknown
reveal_type(x) # revealed: str
reveal_type(setup.x) # revealed: str
reveal_type(y) # revealed: str
reveal_type(b.y) # revealed: str
```
`b/tests/__init__.py`:
```py
```
`b/tests/setup.py`:
```py
@@ -431,21 +545,16 @@ that `import main` and expect that to work.
`a/tests/test1.py`:
```py
# TODO: there should be no errors in this file.
from .setup import x
from . import setup
# error: [unresolved-import]
from main import y
# error: [unresolved-import]
import main
reveal_type(x) # revealed: int
reveal_type(setup.x) # revealed: int
reveal_type(y) # revealed: Unknown
reveal_type(main.y) # revealed: Unknown
reveal_type(y) # revealed: int
reveal_type(main.y) # revealed: int
```
`a/tests/setup.py`:
@@ -470,113 +579,16 @@ y: int = 10
`b/tests/test1.py`:
```py
# TODO: there should be no errors in this file
from .setup import x
from . import setup
# error: [unresolved-import]
from main import y
# error: [unresolved-import]
import main
reveal_type(x) # revealed: str
reveal_type(setup.x) # revealed: str
reveal_type(y) # revealed: Unknown
reveal_type(main.y) # revealed: Unknown
```
`b/tests/setup.py`:
```py
x: str = "2"
```
`b/pyproject.toml`:
```text
name = "a"
version = "0.1.0"
```
`b/main.py`:
```py
y: str = "20"
```
### Tests Package Absolute Importing `main.py`
The same as the previous case but `tests/__init__.py` exists in case that causes different issues.
`a/tests/test1.py`:
```py
# TODO: there should be no errors in this file.
from .setup import x
from . import setup
# error: [unresolved-import]
from main import y
# error: [unresolved-import]
import main
reveal_type(x) # revealed: int
reveal_type(setup.x) # revealed: int
reveal_type(y) # revealed: Unknown
reveal_type(main.y) # revealed: Unknown
```
`a/tests/__init__.py`:
```py
```
`a/tests/setup.py`:
```py
x: int = 1
```
`a/pyproject.toml`:
```text
name = "a"
version = "0.1.0"
```
`a/main.py`:
```py
y: int = 10
```
`b/tests/test1.py`:
```py
# TODO: there should be no errors in this file
from .setup import x
from . import setup
# error: [unresolved-import]
from main import y
# error: [unresolved-import]
import main
reveal_type(x) # revealed: str
reveal_type(setup.x) # revealed: str
reveal_type(y) # revealed: Unknown
reveal_type(main.y) # revealed: Unknown
```
`b/tests/__init__.py`:
```py
reveal_type(y) # revealed: str
reveal_type(main.y) # revealed: str
```
`b/tests/setup.py`:
@@ -606,16 +618,11 @@ imports it.
`a/main.py`:
```py
# TODO: there should be no errors in this file.
# error: [unresolved-import]
from utils import x
# error: [unresolved-import]
import utils
reveal_type(x) # revealed: Unknown
reveal_type(utils.x) # revealed: Unknown
reveal_type(x) # revealed: int
reveal_type(utils.x) # revealed: int
```
`a/utils/__init__.py`:
@@ -634,16 +641,11 @@ version = "0.1.0"
`b/main.py`:
```py
# TODO: there should be no errors in this file.
# error: [unresolved-import]
from utils import x
# error: [unresolved-import]
import utils
reveal_type(x) # revealed: Unknown
reveal_type(utils.x) # revealed: Unknown
reveal_type(x) # revealed: str
reveal_type(utils.x) # revealed: str
```
`b/utils/__init__.py`:


@@ -218,8 +218,8 @@ class E(A[int]):
def method(self, x: object) -> None: ... # fine
class F[T](A[T]):
# TODO: we should emit `invalid-method-override` on this:
# `str` is not necessarily a supertype of `T`!
# error: [invalid-method-override]
def method(self, x: str) -> None: ...
class G(A[int]):


@@ -301,7 +301,7 @@ class B: ...
EitherOr = A | B
# error: [invalid-base] "Invalid class base with type `types.UnionType`"
# error: [invalid-base] "Invalid class base with type `<types.UnionType special form 'A | B'>`"
class Foo(EitherOr): ...
```


@@ -156,7 +156,7 @@ from typing import Union
IntOrStr = Union[int, str]
reveal_type(IntOrStr) # revealed: types.UnionType
reveal_type(IntOrStr) # revealed: <types.UnionType special form 'int | str'>
def _(x: int | str | bytes | memoryview | range):
if isinstance(x, IntOrStr):


@@ -209,7 +209,7 @@ from typing import Union
IntOrStr = Union[int, str]
reveal_type(IntOrStr) # revealed: types.UnionType
reveal_type(IntOrStr) # revealed: <types.UnionType special form 'int | str'>
def f(x: type[int | str | bytes | range]):
if issubclass(x, IntOrStr):


@@ -113,7 +113,7 @@ MyList: TypeAlias = list[T]
ListOrSet: TypeAlias = list[T] | set[T]
reveal_type(MyList) # revealed: <class 'list[T]'>
reveal_type(ListOrSet) # revealed: types.UnionType
reveal_type(ListOrSet) # revealed: <types.UnionType special form 'list[T] | set[T]'>
def _(list_of_int: MyList[int], list_or_set_of_str: ListOrSet[str]):
reveal_type(list_of_int) # revealed: list[int]
@@ -293,7 +293,7 @@ def _(rec: RecursiveHomogeneousTuple):
reveal_type(rec) # revealed: tuple[Divergent, ...]
ClassInfo: TypeAlias = type | UnionType | tuple["ClassInfo", ...]
reveal_type(ClassInfo) # revealed: types.UnionType
reveal_type(ClassInfo) # revealed: <types.UnionType special form 'type | UnionType | tuple[Divergent, ...]'>
def my_isinstance(obj: object, classinfo: ClassInfo) -> bool:
# TODO should be `type | UnionType | tuple[ClassInfo, ...]`


@@ -3184,14 +3184,9 @@ from ty_extensions import reveal_protocol_interface
reveal_protocol_interface(Foo)
```
## Known panics
## Protocols generic over TypeVars bound to forward references
### Protocols generic over TypeVars bound to forward references
This test currently panics because the `ClassLiteral::explicit_bases` query fails to converge. See
issue <https://github.com/astral-sh/ty/issues/1587>.
<!-- expect-panic: execute: too many cycle iterations -->
Protocols can have TypeVars with forward reference bounds that form cycles.
```py
from typing import Any, Protocol, TypeVar
@@ -3209,6 +3204,19 @@ class A2(Protocol[T2]):
class B1(A1[T3], Protocol[T3]): ...
class B2(A2[T4], Protocol[T4]): ...
# TODO should just be `B2[Any]`
reveal_type(T3.__bound__) # revealed: B2[Any] | @Todo(specialized non-generic class)
# TODO error: [invalid-type-arguments]
def f(x: B1[int]):
pass
reveal_type(T4.__bound__) # revealed: B1[Any]
# error: [invalid-type-arguments]
def g(x: B2[int]):
pass
```
## TODO


@@ -0,0 +1,19 @@
# `ParamSpec` regression on 3.9
```toml
[environment]
python-version = "3.9"
```
This used to panic when run on Python 3.9 because `ParamSpec` was introduced in Python 3.10 and the
diagnostic message for `invalid-exception-caught` expects to construct `typing.ParamSpec`.
```py
# error: [invalid-syntax]
def foo[**P]() -> None:
try:
pass
# error: [invalid-exception-caught] "Invalid object caught in an exception handler: Object has type `typing.ParamSpec`"
except P:
pass
```


@@ -14,10 +14,11 @@ mdtest path: crates/ty_python_semantic/resources/mdtest/directives/assert_type.m
```
1 | from typing_extensions import assert_type
2 |
3 | def _(x: int):
3 | def _(x: int, y: bool):
4 | assert_type(x, int) # fine
5 | assert_type(x, str) # error: [type-assertion-failure]
6 | assert_type(assert_type(x, int), int)
7 | assert_type(y, int) # error: [type-assertion-failure]
```
# Diagnostics
@@ -26,15 +27,32 @@ mdtest path: crates/ty_python_semantic/resources/mdtest/directives/assert_type.m
error[type-assertion-failure]: Argument does not have asserted type `str`
--> src/mdtest_snippet.py:5:5
|
3 | def _(x: int):
3 | def _(x: int, y: bool):
4 | assert_type(x, int) # fine
5 | assert_type(x, str) # error: [type-assertion-failure]
| ^^^^^^^^^^^^-^^^^^^
| |
| Inferred type of argument is `int`
| Inferred type is `int`
6 | assert_type(assert_type(x, int), int)
7 | assert_type(y, int) # error: [type-assertion-failure]
|
info: `str` and `int` are not equivalent types
info: rule `type-assertion-failure` is enabled by default
```
```
error[type-assertion-failure]: Argument does not have asserted type `int`
--> src/mdtest_snippet.py:7:5
|
5 | assert_type(x, str) # error: [type-assertion-failure]
6 | assert_type(assert_type(x, int), int)
7 | assert_type(y, int) # error: [type-assertion-failure]
| ^^^^^^^^^^^^-^^^^^^
| |
| Inferred type is `bool`
|
info: `bool` is a subtype of `int`, but they are not equivalent
info: rule `type-assertion-failure` is enabled by default
```


@@ -91,14 +91,14 @@ error[missing-argument]: No argument provided for required parameter `arg` of bo
7 | from typing_extensions import deprecated
|
info: Parameter declared here
--> stdlib/typing_extensions.pyi:1000:28
--> stdlib/typing_extensions.pyi:1001:28
|
998 | stacklevel: int
999 | def __init__(self, message: LiteralString, /, *, category: type[Warning] | None = ..., stacklevel: int = 1) -> None: ...
1000 | def __call__(self, arg: _T, /) -> _T: ...
999 | stacklevel: int
1000 | def __init__(self, message: LiteralString, /, *, category: type[Warning] | None = ..., stacklevel: int = 1) -> None: ...
1001 | def __call__(self, arg: _T, /) -> _T: ...
| ^^^^^^^
1001 |
1002 | @final
1002 |
1003 | @final
|
info: rule `missing-argument` is enabled by default


@@ -63,7 +63,7 @@ error[invalid-argument-type]: Invalid second argument to `isinstance`
10 | # error: [invalid-argument-type]
|
info: A `UnionType` instance can only be used as the second argument to `isinstance` if all elements are class objects
info: Elements `<typing.Literal special form>` and `<class 'list[int]'>` in the union are not class objects
info: Elements `<special form 'Literal[42]'>` and `<class 'list[int]'>` in the union are not class objects
info: rule `invalid-argument-type` is enabled by default
```
@@ -82,7 +82,7 @@ error[invalid-argument-type]: Invalid second argument to `isinstance`
13 | else:
|
info: A `UnionType` instance can only be used as the second argument to `isinstance` if all elements are class objects
info: Element `typing.Any` in the union, and 2 more elements, are not class objects
info: Element `<special form 'typing.Any'>` in the union, and 2 more elements, are not class objects
info: rule `invalid-argument-type` is enabled by default
```


@@ -24,7 +24,7 @@ mdtest path: crates/ty_python_semantic/resources/mdtest/mro.md
# Diagnostics
```
error[inconsistent-mro]: Cannot create a consistent method resolution order (MRO) for class `Baz` with bases list `[typing.Protocol[T], <class 'Foo'>, <class 'Bar[T@Baz]'>]`
error[inconsistent-mro]: Cannot create a consistent method resolution order (MRO) for class `Baz` with bases list `[<special form 'typing.Protocol[T]'>, <class 'Foo'>, <class 'Bar[T@Baz]'>]`
--> src/mdtest_snippet.py:7:1
|
5 | class Foo(Protocol): ...


@@ -42,7 +42,7 @@ mdtest path: crates/ty_python_semantic/resources/mdtest/protocols.md
# Diagnostics
```
error[call-non-callable]: Object of type `typing.Protocol` is not callable
error[call-non-callable]: Object of type `<special form 'typing.Protocol'>` is not callable
--> src/mdtest_snippet.py:4:13
|
3 | # error: [call-non-callable]


@@ -0,0 +1,114 @@
---
source: crates/ty_test/src/lib.rs
expression: snapshot
---
---
mdtest name: special_form_attributes.md - Diagnostics for invalid attribute access on special forms
mdtest path: crates/ty_python_semantic/resources/mdtest/diagnostics/special_form_attributes.md
---
# Python source files
## mdtest_snippet.py
```
1 | from typing_extensions import Any, Final, LiteralString, Self
2 |
3 | X = Any
4 |
5 | class Foo:
6 | X: Final = LiteralString
7 | a: int
8 | b: Self
9 |
10 | class Bar:
11 | def __init__(self):
12 | self.y: Final = LiteralString
13 |
14 | X.foo # error: [unresolved-attribute]
15 | X.aaaaooooooo # error: [unresolved-attribute]
16 | Foo.X.startswith # error: [unresolved-attribute]
17 | Foo.Bar().y.startswith # error: [unresolved-attribute]
18 |
19 | # TODO: false positive (just testing the diagnostic in the meantime)
20 | Foo().b.a # error: [unresolved-attribute]
```
# Diagnostics
```
error[unresolved-attribute]: Special form `typing.Any` has no attribute `foo`
--> src/mdtest_snippet.py:14:1
|
12 | self.y: Final = LiteralString
13 |
14 | X.foo # error: [unresolved-attribute]
| ^^^^^
15 | X.aaaaooooooo # error: [unresolved-attribute]
16 | Foo.X.startswith # error: [unresolved-attribute]
|
help: Objects with type `Any` have a `foo` attribute, but the symbol `typing.Any` does not itself inhabit the type `Any`
help: This error may indicate that `X` was defined as `X = typing.Any` when `X: typing.Any` was intended
info: rule `unresolved-attribute` is enabled by default
```
```
error[unresolved-attribute]: Special form `typing.Any` has no attribute `aaaaooooooo`
--> src/mdtest_snippet.py:15:1
|
14 | X.foo # error: [unresolved-attribute]
15 | X.aaaaooooooo # error: [unresolved-attribute]
| ^^^^^^^^^^^^^
16 | Foo.X.startswith # error: [unresolved-attribute]
17 | Foo.Bar().y.startswith # error: [unresolved-attribute]
|
help: Objects with type `Any` have an `aaaaooooooo` attribute, but the symbol `typing.Any` does not itself inhabit the type `Any`
help: This error may indicate that `X` was defined as `X = typing.Any` when `X: typing.Any` was intended
info: rule `unresolved-attribute` is enabled by default
```
```
error[unresolved-attribute]: Special form `typing.LiteralString` has no attribute `startswith`
--> src/mdtest_snippet.py:16:1
|
14 | X.foo # error: [unresolved-attribute]
15 | X.aaaaooooooo # error: [unresolved-attribute]
16 | Foo.X.startswith # error: [unresolved-attribute]
| ^^^^^^^^^^^^^^^^
17 | Foo.Bar().y.startswith # error: [unresolved-attribute]
|
help: Objects with type `LiteralString` have a `startswith` attribute, but the symbol `typing.LiteralString` does not itself inhabit the type `LiteralString`
help: This error may indicate that `Foo.X` was defined as `Foo.X = typing.LiteralString` when `Foo.X: typing.LiteralString` was intended
info: rule `unresolved-attribute` is enabled by default
```
```
error[unresolved-attribute]: Special form `typing.LiteralString` has no attribute `startswith`
--> src/mdtest_snippet.py:17:1
|
15 | X.aaaaooooooo # error: [unresolved-attribute]
16 | Foo.X.startswith # error: [unresolved-attribute]
17 | Foo.Bar().y.startswith # error: [unresolved-attribute]
| ^^^^^^^^^^^^^^^^^^^^^^
18 |
19 | # TODO: false positive (just testing the diagnostic in the meantime)
|
help: Objects with type `LiteralString` have a `startswith` attribute, but the symbol `typing.LiteralString` does not itself inhabit the type `LiteralString`
info: rule `unresolved-attribute` is enabled by default
```
```
error[unresolved-attribute]: Special form `typing.Self` has no attribute `a`
--> src/mdtest_snippet.py:20:1
|
19 | # TODO: false positive (just testing the diagnostic in the meantime)
20 | Foo().b.a # error: [unresolved-attribute]
| ^^^^^^^^^
|
info: rule `unresolved-attribute` is enabled by default
```
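The `X = Any` vs `x: Any` distinction that the help text points at is visible at runtime as well. A minimal sketch using only the standard `typing` module (the `AttributeError` here is ordinary Python behavior that happens to mirror ty's `unresolved-attribute` diagnostic, not ty itself):

```python
from typing import Any

# `X = Any` binds X to the special form object itself; only an
# annotation like `x: Any` declares a value of type Any.
X = Any

try:
    X.foo  # the special form object has no such attribute
    raised = False
except AttributeError:
    raised = True

assert raised
```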

View File

@@ -537,6 +537,9 @@ static_assert(is_assignable_to(tuple[Any, ...], tuple[Any, Any]))
static_assert(is_assignable_to(tuple[Any, ...], tuple[int, ...]))
static_assert(is_assignable_to(tuple[Any, ...], tuple[int]))
static_assert(is_assignable_to(tuple[Any, ...], tuple[int, int]))
static_assert(is_assignable_to(tuple[Any, ...], tuple[int, *tuple[int, ...]]))
static_assert(is_assignable_to(tuple[Any, ...], tuple[*tuple[int, ...], int]))
static_assert(is_assignable_to(tuple[Any, ...], tuple[int, *tuple[int, ...], int]))
```
This also applies when `tuple[Any, ...]` is unpacked into a mixed tuple.
@@ -560,6 +563,10 @@ static_assert(is_assignable_to(tuple[*tuple[Any, ...], int], tuple[int, ...]))
static_assert(is_assignable_to(tuple[*tuple[Any, ...], int], tuple[int]))
static_assert(is_assignable_to(tuple[*tuple[Any, ...], int], tuple[int, int]))
# `*tuple[Any, ...]` can materialize to a tuple of any length as a special case,
# so this passes:
static_assert(is_assignable_to(tuple[*tuple[Any, ...], Any], tuple[*tuple[Any, ...], Any, Any]))
static_assert(is_assignable_to(tuple[int, *tuple[Any, ...], int], tuple[int, *tuple[Any, ...], int]))
static_assert(is_assignable_to(tuple[int, *tuple[Any, ...], int], tuple[Any, ...]))
static_assert(not is_assignable_to(tuple[int, *tuple[Any, ...], int], tuple[Any]))
@@ -580,6 +587,9 @@ static_assert(not is_assignable_to(tuple[int, ...], tuple[Any, Any]))
static_assert(is_assignable_to(tuple[int, ...], tuple[int, ...]))
static_assert(not is_assignable_to(tuple[int, ...], tuple[int]))
static_assert(not is_assignable_to(tuple[int, ...], tuple[int, int]))
static_assert(not is_assignable_to(tuple[int, ...], tuple[int, *tuple[int, ...]]))
static_assert(not is_assignable_to(tuple[int, ...], tuple[*tuple[int, ...], int]))
static_assert(not is_assignable_to(tuple[int, ...], tuple[int, *tuple[int, ...], int]))
static_assert(is_assignable_to(tuple[int, *tuple[int, ...]], tuple[int, *tuple[Any, ...]]))
static_assert(is_assignable_to(tuple[int, *tuple[int, ...]], tuple[Any, ...]))
@@ -1344,6 +1354,38 @@ static_assert(not is_assignable_to(TypeGuard[Unknown], str)) # error: [static-a
static_assert(not is_assignable_to(TypeIs[Any], str))
```
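One way to internalize the homogeneous-tuple rules above: `tuple[T, ...]` fits a fixed arity only when `T` is a gradual type that can materialize to any length. A toy model of that single rule (my own sketch with hypothetical helper names, not ty's implementation):

```python
from typing import Any

def elem_assignable(src, dst):
    # Toy single-type assignability: Any is assignable in both directions,
    # otherwise fall back to nominal subclassing.
    if src is Any or dst is Any:
        return True
    return isinstance(src, type) and isinstance(dst, type) and issubclass(src, dst)

def homogeneous_to_fixed(elem, targets):
    """Is tuple[elem, ...] assignable to tuple[*targets]?

    A variable-length tuple fits a fixed arity only if its element type
    is gradual (here: Any); a concrete element type like int cannot
    guarantee the required length."""
    return elem is Any and all(elem_assignable(elem, t) for t in targets)

assert homogeneous_to_fixed(Any, [int, int])      # tuple[Any, ...] -> tuple[int, int]
assert not homogeneous_to_fixed(int, [int, int])  # tuple[int, ...] -/-> tuple[int, int]
assert not homogeneous_to_fixed(int, [int])       # nor to tuple[int]
```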
## `ParamSpec`
```py
from ty_extensions import TypeOf, static_assert, is_assignable_to, Unknown
from typing import ParamSpec, Mapping, Callable, Any
P = ParamSpec("P")
def f(func: Callable[P, int], *args: P.args, **kwargs: P.kwargs) -> None:
static_assert(is_assignable_to(TypeOf[args], tuple[Any, ...]))
static_assert(is_assignable_to(TypeOf[args], tuple[object, ...]))
static_assert(is_assignable_to(TypeOf[args], tuple[Unknown, ...]))
static_assert(not is_assignable_to(TypeOf[args], tuple[int, ...]))
static_assert(not is_assignable_to(TypeOf[args], tuple[int, str]))
static_assert(not is_assignable_to(tuple[Any, ...], TypeOf[args]))
static_assert(not is_assignable_to(tuple[object, ...], TypeOf[args]))
static_assert(not is_assignable_to(tuple[Unknown, ...], TypeOf[args]))
static_assert(is_assignable_to(TypeOf[kwargs], dict[str, Any]))
static_assert(is_assignable_to(TypeOf[kwargs], dict[str, Unknown]))
static_assert(not is_assignable_to(TypeOf[kwargs], dict[str, object]))
static_assert(not is_assignable_to(TypeOf[kwargs], dict[str, int]))
static_assert(is_assignable_to(TypeOf[kwargs], Mapping[str, Any]))
static_assert(is_assignable_to(TypeOf[kwargs], Mapping[str, object]))
static_assert(is_assignable_to(TypeOf[kwargs], Mapping[str, Unknown]))
static_assert(not is_assignable_to(dict[str, Any], TypeOf[kwargs]))
static_assert(not is_assignable_to(dict[str, object], TypeOf[kwargs]))
static_assert(not is_assignable_to(dict[str, Unknown], TypeOf[kwargs]))
```
[gradual form]: https://typing.python.org/en/latest/spec/glossary.html#term-gradual-form
[gradual tuple]: https://typing.python.org/en/latest/spec/tuples.html#tuple-type-form
[typing documentation]: https://typing.python.org/en/latest/spec/concepts.html#the-assignable-to-or-consistent-subtyping-relation

View File

@@ -101,6 +101,37 @@ class C:
x: ClassVar[int, str] = 1
```
## Trailing comma creates a tuple
A trailing comma in a subscript creates a single-element tuple. We need to handle this gracefully
and emit a proper error rather than crashing (see
[ty#1793](https://github.com/astral-sh/ty/issues/1793)).
```py
from typing import ClassVar
class C:
# error: [invalid-type-form] "Tuple literals are not allowed in this context in a type expression: Did you mean `tuple[()]`?"
x: ClassVar[(),]
# error: [invalid-attribute-access] "Cannot assign to ClassVar `x` from an instance of type `C`"
C().x = 42
reveal_type(C.x) # revealed: Unknown
```
A trailing comma after a single valid argument, by contrast, is harmless; the subscript is still
treated as one argument (see [ty#1768](https://github.com/astral-sh/ty/issues/1768)):
```py
from typing import ClassVar
class D:
# A trailing comma here doesn't change the meaning; it's still one argument.
a: ClassVar[int,] = 1
reveal_type(D.a) # revealed: int
```
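The underlying runtime semantics: Python always delivers a trailing-comma subscript to `__getitem__` as a tuple; what differs between the two cases above is the single element inside it. A quick illustration with a hypothetical `Echo` class (my own helper, not part of ty or `typing`) that just returns whatever the subscript passes in:

```python
class Echo:
    def __getitem__(self, item):
        return item

e = Echo()
assert e[int] is int      # bare name: the object itself
assert e[int,] == (int,)  # trailing comma: a one-element tuple
assert e[(),] == ((),)    # tuple literal + trailing comma: a tuple containing ()
```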
## Illegal `ClassVar` in type expression
```py

View File

@@ -340,6 +340,22 @@ class C:
x: Final[int, str] = 1
```
### Trailing comma creates a tuple
A trailing comma in a subscript creates a single-element tuple. We need to handle this gracefully
and emit a proper error rather than crashing (see
[ty#1793](https://github.com/astral-sh/ty/issues/1793)).
```py
from typing import Final
# error: [invalid-type-form] "Tuple literals are not allowed in this context in a type expression: Did you mean `tuple[()]`?"
x: Final[(),] = 42
# error: [invalid-assignment] "Reassignment of `Final` symbol `x` is not allowed"
x = 56
```
### Illegal `Final` in type expression
```py

View File

@@ -112,6 +112,25 @@ class Wrong:
x: InitVar[int, str] # error: [invalid-type-form] "Type qualifier `InitVar` expected exactly 1 argument, got 2"
```
A trailing comma in a subscript creates a single-element tuple. We need to handle this gracefully
and emit a proper error rather than crashing (see
[ty#1793](https://github.com/astral-sh/ty/issues/1793)).
```py
from dataclasses import InitVar, dataclass
@dataclass
class AlsoWrong:
# error: [invalid-type-form] "Tuple literals are not allowed in this context in a type expression: Did you mean `tuple[()]`?"
x: InitVar[(),]
# revealed: (self: AlsoWrong, x: Unknown) -> None
reveal_type(AlsoWrong.__init__)
# error: [unresolved-attribute]
reveal_type(AlsoWrong(42).x) # revealed: Unknown
```
A bare `InitVar` is not allowed according to the [type annotation grammar]:
```py

Some files were not shown because too many files have changed in this diff.