Compare commits

...

76 Commits

Author SHA1 Message Date
Micha Reiser
0ad2e09430 [ty] Split out ty_python_types crate 2025-12-25 17:54:44 +01:00
Simon Lamon
dd3a985109 [ty] Spell out "method resolution order" in unsupported-base subdiagnostic (#22194) 2025-12-25 10:26:44 +00:00
Micha Reiser
12f5ea51e3 [ty] Invert dependencies of ty_combine and ty_python_semantic (#22191) 2025-12-25 10:06:06 +01:00
Alex Waygood
f9afcc400c [ty] Improve diagnostic when a user tries to access a function attribute on a Callable type (#22182)
## Summary

Other type checkers allow you to access all `FunctionType` attributes on
any object with a `Callable` type. ty does not, because this is
demonstrably unsound, but this is often a source of confusion for users.
And there were lots of diagnostics in the ecosystem report for
https://github.com/astral-sh/ruff/pull/22145 that were complaining that
"Object of type `(...) -> Unknown` has no attribute `__name__`", for
example.

The discrepancy between what ty does here and what other type checkers
do is discussed a bit in https://github.com/astral-sh/ty/issues/1495.
You can see that there have been lots of issues closed as duplicates of
that issue; we should probably also add an FAQ entry for it.

Anyway, this PR adds a subdiagnostic to help users out when they hit
this diagnostic. Unfortunately something I did meant that rustfmt
increased the indentation of the whole of this huge closure, so this PR
is best reviewed with the "No whitespace" option selected for viewing
the diff.
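The unsoundness is easy to demonstrate at runtime with a minimal sketch (the `Adder` class is illustrative, not from the PR):

```python
from typing import Callable

class Adder:
    """A callable object that satisfies `Callable[[int], int]`."""
    def __call__(self, x: int) -> int:
        return x + 1

f: Callable[[int], int] = Adder()
assert f(1) == 2

# Other type checkers would accept `f.__name__`, but instances of an
# ordinary class have no `__name__` attribute, so the access fails at
# runtime -- which is why ty rejects it:
try:
    f.__name__
except AttributeError:
    pass
```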

## Test Plan

Snapshot added
2025-12-24 15:47:11 -05:00
Alex Waygood
768c5a2285 [ty] Use Type::string_literal() more (#22184) 2025-12-24 20:06:57 +00:00
Alex Waygood
139149f87b [ty] Improve diagnostic when callable is used in a type expression instead of collections.abc.Callable or typing.Callable (#22180) 2025-12-24 19:18:51 +00:00
Charlie Marsh
2de4464e92 [ty] Fix implementation of Top[Callable[..., object]] (#22145)
## Summary

Add a proper representation for the `Callable` top type, and use it to
get `callable()` narrowing right.

Closes https://github.com/astral-sh/ty/issues/1426.
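A sketch of the narrowing this enables (function name is illustrative):

```python
def describe(x: object) -> str:
    if callable(x):
        # ty now narrows `x` to the top `Callable` type here: calling it
        # is permitted, though the signature is unknown
        return "callable"
    # in the negative branch, callables are excluded from `x`
    return "not callable"

assert describe(len) == "callable"
assert describe(42) == "not callable"
```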

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-12-24 12:49:09 -05:00
Micha Reiser
ded4d4bbe9 [ty] Fix classification of module in import x as y (#22175) 2025-12-24 18:25:29 +01:00
Micha Reiser
eef403f6cf Revert "[ty] Fix completion in decorator without class or function definition" (#22176) 2025-12-24 17:27:21 +01:00
Micha Reiser
adef89eb7c [ty] Fix completion in decorator without class or function definition 2025-12-24 16:57:36 +01:00
Micha Reiser
e3498121b4 [ty] Fix module resolution on network drives (#22173) 2025-12-24 16:36:18 +01:00
Brent Westbrook
5e7fc9a4e1 Render the entire diagnostic message in all output formats (#22164)
Summary
--

This PR fixes https://github.com/astral-sh/ty/issues/2186 by replacing
uses of
`Diagnostic::body` with [`Diagnostic::concise_message`][d]. The initial
report
was only about ty's GitHub and GitLab output formats, but I think it
makes sense
to prefer `concise_message` in the other output formats too. Ruff
currently only
sets the primary message on its diagnostics, which is why this has no
effect on
Ruff, and ty currently only supports the GitHub and GitLab formats that
used
`body`, but removing `body` should help to avoid this problem as Ruff
starts to
use more diagnostic features or ty gets new output formats.

Test Plan
--

Updated existing GitLab and GitHub CLI tests to have `reveal_type`
diagnostics

[d]:
https://github.com/astral-sh/ruff/blob/395bf106ab/crates/ruff_db/src/diagnostic/mod.rs#L185
2025-12-24 10:28:24 -05:00
Alex Waygood
3c5956e93d [ty] Include the specialization of a generic TypedDict as part of its display (#22174)
## Summary

This is the easy bit of https://github.com/astral-sh/ty/issues/2190

## Test Plan

mdtests updated
2025-12-24 14:39:46 +00:00
Charlie Marsh
81f34fbc8e [ty] Store un-widened type in Place (#22093)
## Summary

See: https://github.com/astral-sh/ruff/pull/22025#discussion_r2632724156
2025-12-23 23:19:57 -05:00
Charlie Marsh
184f487c84 [ty] Add a dedicated diagnostic for TypedDict deletions (#22123)
## Summary

Provides a message like:

```
  error[invalid-argument-type]: Cannot delete required key "name" from TypedDict `Movie`
    --> test.py:15:7
     |
  15 | del m["name"]
     |       ^^^^^^
     |
  info: Field defined here
   --> test.py:4:5
    |
  4 |     name: str
    |     --------- `name` declared as required here; consider making it `NotRequired`
    |
  info: Only keys marked as `NotRequired` (or in a TypedDict with `total=False`) can be deleted
```
2025-12-24 03:49:42 +00:00
Charlie Marsh
969c8a547e [ty] Synthesize __delitem__ for TypedDict to allow deleting non-required keys (#22122)
## Summary

TypedDict now synthesizes a proper `__delitem__` method that...

- ...allows deletion of `NotRequired` keys and keys in `total=False`
TypedDicts.
- ...rejects deletion of required keys (synthesizes `__delitem__(k:
Never)`).
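A minimal sketch of the resulting behavior (the `Movie` TypedDict is illustrative):

```python
from typing import TypedDict

class MovieBase(TypedDict):
    name: str

class Movie(MovieBase, total=False):
    year: int

m: Movie = {"name": "Blade Runner", "year": 1982}
del m["year"]    # accepted: `year` comes from a `total=False` TypedDict
# del m["name"]  # rejected by ty: `name` is a required key
```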
2025-12-24 03:39:54 +00:00
Charlie Marsh
acdda78189 [ty] Fix @staticmethod combined with other decorators incorrectly binding self (#22128)
## Summary

We already had `CallableTypeKind::ClassMethodLike` to track callables
that behave like `classmethods` (always bind the first argument). This
PR adds the symmetric `CallableTypeKind::StaticMethodLike` for callables
that behave like `staticmethods` (never bind `self`).

Closes https://github.com/astral-sh/ty/issues/2114.
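A sketch of the pattern this fixes (the `log_calls` decorator is illustrative):

```python
import functools

def log_calls(func):
    """An ordinary wrapping decorator."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

class C:
    @staticmethod
    @log_calls
    def add(a: int, b: int) -> int:
        return a + b

# With the fix, ty no longer binds `a` as if it were `self`:
assert C.add(1, 2) == 3
assert C().add(1, 2) == 3
```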
2025-12-24 03:35:09 +00:00
Charlie Marsh
c28c1f534d [ty] Check __delitem__ instead of __getitem__ for del x[k] (#22121)
## Summary

Previously, `del x[k]` incorrectly required the object to have a
`__getitem__` method. This was wrong because deletion only needs
`__delitem__`, which is independent of `__getitem__`.

Closes https://github.com/astral-sh/ty/issues/1799.
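A minimal sketch of a class that now checks cleanly (names are illustrative):

```python
class Registry:
    """Supports `del r[k]` but not `r[k]`."""
    def __init__(self) -> None:
        self._items = {"a": 1}

    def __delitem__(self, key: str) -> None:
        del self._items[key]

r = Registry()
del r["a"]  # only `__delitem__` is needed; ty no longer demands `__getitem__`
```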
2025-12-24 03:34:20 +00:00
Charlie Marsh
b723917463 [ty] Support tuple narrowing based on member checks (#22167)
## Summary

Closes https://github.com/astral-sh/ty/issues/2179.
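A sketch of the kind of member check this covers (function name is illustrative):

```python
def classify(value: "int | str | None") -> str:
    if value in (1, 2, 3):
        # ty can now narrow `value` based on membership in the tuple,
        # excluding types that are disjoint from every element
        return f"small int: {value}"
    return "other"

assert classify(2) == "small int: 2"
assert classify(None) == "other"
```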
2025-12-23 20:15:50 -05:00
Charlie Marsh
5decf94644 [ty] Synthesize a _fields attribute for NamedTuples (#22163)
## Summary

Closes #2176.
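A minimal sketch of the attribute this synthesizes:

```python
from typing import NamedTuple

class Point(NamedTuple):
    x: int
    y: int

# `_fields` is now synthesized (as a tuple of field names), so this
# access type-checks instead of being reported as an unknown attribute
assert Point._fields == ("x", "y")
```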
2025-12-23 21:38:41 +00:00
Charlie Marsh
89a55dd09f [ty] Synthesize a _replace method for NamedTuples (#22153)
## Summary

Closes https://github.com/astral-sh/ty/issues/2170.
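A minimal sketch of the method this synthesizes:

```python
from typing import NamedTuple

class Point(NamedTuple):
    x: int
    y: int

p = Point(1, 2)
q = p._replace(y=5)  # ty now knows the keyword-only signature and return type
assert q == Point(1, 5)
assert p == Point(1, 2)  # `_replace` returns a new instance
```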
2025-12-23 16:33:55 -05:00
Shunsuke Shibayama
8710a8c4ac [ty] don't expand type aliases in implicit tuple aliases (#22015)
## Summary

This PR fixes https://github.com/astral-sh/ty/issues/1848.

```python
T = tuple[int, 'U']

class C(set['U']):
    pass

type U = T | C
```

The fixed-point iteration did not converge because the types stored in the
implicit tuple type alias `Specialization` changed on each iteration.

```
1st: <class 'tuple[int, C]'>
2nd: <class 'tuple[int, tuple[int, C] | C]'>
3rd: <class 'tuple[int, tuple[int, tuple[int, C] | C] | C]'>
...
```

And this was because `UnionType::from_elements` was used when creating
union types for tuple operations, which causes type aliases inside to be
expanded.
This PR replaces these with `UnionType::from_elements_leave_aliases`.

## Test Plan

New corpus test
2025-12-23 13:22:54 -08:00
Jack O'Connor
e245c1d76e [ty] narrow tagged unions of TypedDict (#22104)
Identify and narrow cases like this:

```py
class Foo(TypedDict):
    tag: Literal["foo"]

class Bar(TypedDict):
    tag: Literal["bar"]

def _(union: Foo | Bar):
    if union["tag"] == "foo":
        reveal_type(union)  # Foo
```

Fixes part of https://github.com/astral-sh/ty/issues/1479.

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-12-23 19:30:08 +00:00
Charlie Marsh
4c175fa0e1 [ty] Bind self with instance in __get__ (#22155)
## Summary

See: https://github.com/astral-sh/ruff/pull/22153/changes#r2641788438.
2025-12-23 11:25:58 -08:00
Micha Reiser
ed64c4d943 [ty] Abort printing diagnostics when pressing Ctrl+C (#22083) 2025-12-23 18:03:58 +01:00
Matthew Mckee
f1e6c9c3a0 [ty] Use markdown for completions documentation (#21752) 2025-12-23 17:07:53 +01:00
Wizzerinus | Alex K.
d9fe996e64 [ty] Support custom builtins (#22021) 2025-12-23 13:48:14 +00:00
Matthew Mckee
5ea30c4c53 Show both ty.toml and pyproject.toml examples in configuration reference (#22144)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-12-23 09:49:44 +01:00
Micha Reiser
ccc9132f73 [ty] Use ModuleName::new_static in more places (#22156) 2025-12-23 09:24:49 +01:00
Ibraheem Ahmed
65021fcee9 [ty] Support type inference between protocol instances (#22120) 2025-12-23 09:24:01 +01:00
Micha Reiser
aa21b70a8b [ty] Fix path of instrumented benchmark binary (#22157) 2025-12-23 09:12:18 +01:00
Vincent Ging Ho Yim
22ce0c8a51 Decrease Markdown heading level (#22152) 2025-12-23 07:44:45 +00:00
Micha Reiser
d6a7c9b4ed [ty] Add respect-type-ignore-comments configuration option (#22137) 2025-12-23 08:36:51 +01:00
Charlie Marsh
4745d15fff [ty] Respect debug text interpolation in f-strings (#22151)
## Summary

Per @carljm's comment, we just fall back to `str`.

Closes https://github.com/astral-sh/ty/issues/2151.
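A minimal sketch of the `=` debug interpolation this handles:

```python
x = 42
# `=` debug interpolation embeds the source text of the expression; ty now
# falls back to `str` for the interpolated part instead of mis-inferring it
s = f"{x=}"
assert s == "x=42"
```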
2025-12-22 20:21:28 -05:00
Shunsuke Shibayama
06db474f20 [ty] stabilize union-type ordering in fixed-point iteration (#22070)
## Summary

This PR fixes https://github.com/astral-sh/ty/issues/2085.

Based on the reported code, the panicking MRE is:

```python
class Test:
    def __init__(self, x: int):
        self.left = x
        self.right = x
    def method(self):
        self.left, self.right = self.right, self.left
        if self.right:
            self.right = self.right
```

The type inference (`implicit_attribute_inner`) for `self.right`
proceeds as follows:

```
0: Divergent(Id(6c07))
1: Unknown | int | (Divergent(Id(1c00)) & ~AlwaysFalsy)
2: Unknown | int | (Divergent(Id(6c07)) & ~AlwaysFalsy) | (Divergent(Id(1c00)) & ~AlwaysFalsy)
3: Unknown | int | (Divergent(Id(1c00)) & ~AlwaysFalsy) | (Divergent(Id(6c07)) & ~AlwaysFalsy)
4: Unknown | int | (Divergent(Id(6c07)) & ~AlwaysFalsy) | (Divergent(Id(1c00)) & ~AlwaysFalsy)
...
```

The problem is that the order of union types is not stable between
cycles. To solve this, when unioning the previous union type with the
current union type, we should use the previous type as the base and add
only the elements that are new in this cycle (in the implementation before
this PR, this union order was reversed).

## Test Plan

New corpus test
2025-12-22 16:16:03 -08:00
Charlie Marsh
664686bdbc [ty] Exclude parameterized tuple types from narrowing when disjoint from comparison values (#22129)
## Summary

IIUC, tuples with a known structure (`tuple_spec`) use the standard
tuple `__eq__` which only returns `True` for other tuples, so they can
be safely excluded when disjoint from string literals or other non-tuple
types.

Closes https://github.com/astral-sh/ty/issues/2140.
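A sketch of the comparison this affects (function name is illustrative):

```python
def handle(value: "tuple[int, str] | str"):
    if value == "done":
        # a `tuple[int, str]` can never compare equal to a string, since
        # the standard `tuple.__eq__` only returns True for other tuples,
        # so ty narrows `value` to `str` here
        return "finished"
    return value

assert handle("done") == "finished"
assert handle((3, "x")) == (3, "x")
```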
2025-12-22 20:44:49 +00:00
William Woodruff
4a937543b9 Ecosystem report: publish site via astral-sh-bot (#22142) 2025-12-22 12:57:49 -05:00
Micha Reiser
ec034fc359 [ty] Add new diagnosticMode: off (#22073) 2025-12-22 16:46:02 +01:00
Micha Reiser
29d7f22c1f [ty] Add ty.configuration and ty.configurationFile options (#22053) 2025-12-22 16:13:20 +01:00
Micha Reiser
8fc4349fd3 [ty] Split suppression.rs into multiple smaller modules (#22141) 2025-12-22 16:08:56 +01:00
Matthew Mckee
816b19c4a6 [ty] Rename set_invalid_syntax to set_invalid_type_annotation (#22140) 2025-12-22 14:29:03 +00:00
Peter Law
87406b43ea Fix iter example in unsafe fixes doc (#22118)
## Summary

This appears to have been a copy/paste error from the list example: the
subscript is not present in the original next/iter example, only in the
variant where the error case is shown. While in this specific example the
subscript actually has no effect (the `StopIteration` is raised first), it
does make the example slightly confusing.

Consider the following variations, first the example from the docs
unchanged and second the same code but not hitting the intended error
case (due to using a non-empty collection):
```console
$ python3 -c 'next(iter(range(0)))[0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
StopIteration

$ python3 -c 'next(iter(range(1)))[0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
TypeError: 'int' object is not subscriptable
```

## Test Plan

Not directly tested, however see inline snippets above.
2025-12-22 09:25:28 -05:00
Matthew Mckee
422e99ea70 [ty] Add inlay hint request time log (#22138) 2025-12-22 15:08:49 +01:00
Micha Reiser
ea6730f546 [ty] Speed-up instrumented benchmarks (#22133) 2025-12-22 15:06:22 +01:00
Matthew Mckee
a46835c224 [ty] Set flag to avoid type[T@f] being inserted when you double-click on the inlay (#22139) 2025-12-22 14:00:45 +00:00
Micha Reiser
884e83591e [ty] Update salsa (#22072) 2025-12-22 13:22:54 +01:00
Alex Waygood
6b3dd28e63 [ty] Make a server snapshot less painful to update (#22132) 2025-12-22 12:13:58 +00:00
Harutaka Kawamura
572f57aa3c Fix GitHub Actions output format for multi-line diagnostics (#22108) 2025-12-22 10:08:07 +01:00
Micha Reiser
ed423e0ae2 [ty] Speedup ty-walltime benchmarks (#22126)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-12-22 08:44:17 +01:00
Charlie Marsh
fee4e2d72a [ty] Distribute type[] over unions (#22115)
## Summary

Closes https://github.com/astral-sh/ty/issues/2121.
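A minimal sketch of the distribution (class names are illustrative):

```python
class A: ...
class B: ...

def make(cls: "type[A | B]") -> object:
    # `type[A | B]` is now treated as `type[A] | type[B]`, so passing
    # either class object is accepted and calling it is well-typed
    return cls()

assert isinstance(make(A), A)
assert isinstance(make(B), B)
```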
2025-12-21 18:45:29 -05:00
Will Duke
b6e84eca16 [ty] Document invalid-syntax-in-forward-annotation and escape-character-in-forward-annotation (#22130)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-12-21 19:35:44 +00:00
Micha Reiser
b4c2825afd [ty] Move module resolver code into its own crate (#22106) 2025-12-21 11:00:34 +00:00
Ibraheem Ahmed
ad41728204 [ty] Avoid temporarily storing invalid multi-inference attempts (#22103)
## Summary

I missed this in https://github.com/astral-sh/ruff/pull/22062. This
avoids exponential runtime in the following snippet:

```py
class X1: ...
class X2: ...
class X3: ...
class X4: ...
class X5: ...
class X6: ...
...

def f(
    x:
        list[X1 | None]
      | list[X2 | None]
      | list[X3 | None]
      | list[X4 | None]
      | list[X5 | None]
      | list[X6 | None]
      ...
):
    ...

def g[T](x: T) -> list[T]:
    return [x]

def id[T](x: T) -> T:
    return x

f(id(id(id(id(g(X64()))))))
```

Eventually I want to refactor our multi-inference infrastructure (which
is currently very brittle) to handle this implicitly, but this is a
temporary performance fix until that happens.
2025-12-20 08:20:53 -08:00
Hugo
3ec63b964c [ty] Add support for dict(...) calls in typed dict contexts (#22113)
## Summary

fixes https://github.com/astral-sh/ty/issues/2127
- handle `dict(...)` calls in TypedDict context with bidirectional
inference
- validate keys/values using the existing TypedDict constructor
implementation
- mdtest: add 1 positive test, 1 negative test for invalid coverage
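A minimal sketch of the newly supported pattern (the `Movie` TypedDict is illustrative):

```python
from typing import TypedDict

class Movie(TypedDict):
    name: str
    year: int

# Previously only dict displays (`{...}`) were checked against the target
# TypedDict; `dict(...)` calls are now validated bidirectionally too
m: Movie = dict(name="Dune", year=2021)
# bad: Movie = dict(name="Dune", year="2021")  # ty now flags the value type
```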

## Test Plan

```sh
cargo clippy --workspace --all-targets --all-features -- -D warnings  # Rust linting
cargo test  # Rust testing
uvx pre-commit run --all-files --show-diff-on-failure  # Rust and Python formatting, Markdown and Python linting, etc.
```
fully green
2025-12-20 07:59:03 -08:00
Matthew Mckee
f9a0e1e3f6 [ty] Fix panic introduced in #22076 (#22112)
## Summary

Was looking over #22076 and this looked wrong: it introduced a panic.

## Test Plan

Before running:

```bash
cargo run -p ty check test.py --force-exclude --no-progress 
```

would result in a panic

```text
thread 'main' (162713) panicked at crates/ty/src/args.rs:459:17:
internal error: entered unreachable code: Clap should make this impossible
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```

Now it does not.
2025-12-20 07:53:45 -08:00
William Woodruff
ef4507be96 Remove in-workflow deployment to Cloudflare Pages (#22098) 2025-12-20 10:21:47 -05:00
Alex Waygood
3398ab23a9 Update comments in sync_typeshed.yaml workflow (#22109) 2025-12-20 12:19:04 +00:00
Micha Reiser
5b475b45aa [ty] Add --force-exclude option (#22076) 2025-12-20 10:03:41 +01:00
Ibraheem Ahmed
2a959ef3f2 [ty] Avoid narrowing on non-generic calls (#22102)
## Summary

Resolves https://github.com/astral-sh/ty/issues/2026.
2025-12-19 23:18:07 -05:00
Ibraheem Ahmed
674d3902c6 [ty] Only prefer declared types in non-covariant positions (#22068)
## Summary

The following snippet currently errors because we widen the inferred
type, even though `X` is covariant over `T`. If `T` was contravariant or
invariant, this would be fine, as it would lead to an assignability
error anyways.

```python
class X[T]:
    def __init__(self: X[None]): ...

    def pop(self) -> T:
        raise NotImplementedError

# error: Argument to bound method `__init__` is incorrect: Expected `X[None]`, found `X[int | None]`
x: X[int | None] = X()
```

There are some cases where it is still helpful to prefer covariant
declared types, but this error seems hard to fix otherwise, and makes
our heuristics more consistent overall.
2025-12-19 17:27:31 -05:00
Alex Waygood
1a18ada931 [ty] Run mypy_primer and typing-conformance workflows on fewer PRs (#22096)
## Summary

Fixes https://github.com/astral-sh/ty/issues/2090. Quoting my rationale
from that issue:

> A PR that only touches code in [one of these crates] should never have
any impact on memory usage or diagnostics produced. And the comments
from the bot just lead to additional notifications which is annoying.

I _think_ I've got the syntax right here. The
[docs](https://docs.github.com/en/actions/reference/workflows-and-actions/workflow-syntax)
say:

>  The order that you define paths patterns matters:
>
> - A matching negative pattern (prefixed with !) after a positive match
will exclude the path.
> - A matching positive pattern after a negative match will include the
path again.

## Test Plan

No idea? Merge it and see?
2025-12-19 19:31:20 +00:00
Alex Waygood
dde0d0af68 [ty] List rules in alphabetical order in the reference docs (#22097)
## Summary

Fixes https://github.com/astral-sh/ty/issues/1885.

It wasn't obvious to me that there was a deliberate order to the way
these rules were listed in our reference docs -- it looked like it was
_nearly_ alphabetical, but not quite. I think it's simpler if we just
list them in alphabetical order.

## Test Plan

The output from running `cargo dev generate-all` (committed as part of
this PR) looks correct!
2025-12-19 19:11:05 +00:00
Chris Bachhuber
b342f60b40 Update T201 suggestion to not use root logger to satisfy LOG015 (#22059)
## Summary

Currently, the proposed fix for https://docs.astral.sh/ruff/rules/print/
violates https://docs.astral.sh/ruff/rules/root-logger-call/. Thus,
let's change the proposal to make LOG015 happy as well.

## Test Plan

Test manually in a project that has both T201 and LOG015 enabled and run
them over the previous and proposed code. Is there continuous testing of
the code snippets from the docs?
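The updated suggestion can be sketched as (function and message are illustrative):

```python
import logging

# A named module-level logger satisfies LOG015 (no root-logger calls),
# and replacing `print(...)` with it satisfies T201
logger = logging.getLogger(__name__)

def report(status: str) -> None:
    logger.info("status: %s", status)  # instead of: print(f"status: {status}")
```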
2025-12-19 11:08:12 -08:00
Alex Waygood
2151c3d351 [ty] Document that several rules are disabled by default because of the number of false positives they produce (#22095) 2025-12-19 18:45:18 +00:00
Aria Desires
cdb7a9fb33 [ty] Classify docstrings in semantic tokens (syntax highlighting) (#22031)
## Summary

* Related to, but does not handle
https://github.com/astral-sh/ty/issues/2021

## Test Plan

I also added some snapshot tests for future work on non-standard
attribute docstrings (didn't want to highlight them if we don't
recognize them elsewhere).
2025-12-19 13:36:01 -05:00
github-actions[bot]
df1552b9a4 [ty] Sync vendored typeshed stubs (#22091)
Co-authored-by: typeshedbot <>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-12-19 18:23:09 +00:00
Alex Waygood
77de3df150 [ty] Fix sync-typeshed workflow timeouts (#22090) 2025-12-19 18:09:08 +00:00
Charlie Marsh
0f18a08a0a [ty] Respect intersections in iterations (#21965)
## Summary

This PR implements the strategy described in
https://github.com/astral-sh/ty/issues/1871: we iterate over the
positive types, resolve them, then intersect the results.
2025-12-19 12:36:37 -05:00
William Woodruff
b63b3c13fb Upload full ecosystem report as a GitHub Actions artifact (#22086) 2025-12-19 11:41:21 -05:00
Ibraheem Ahmed
270d755621 [ty] Avoid storing invalid multi-inference attempts (#22062)
## Summary

This should make revealed types a little nicer, as well as avoid
confusing the constraint solver in some cases (which were showing up in
https://github.com/astral-sh/ruff/pull/21930).
2025-12-19 15:38:47 +00:00
Micha Reiser
9809405b05 [ty] Only clear output between two successful checks (#22078) 2025-12-19 15:16:54 +01:00
Alex Waygood
28cdbb18c5 [ty] Minor followups to #22048 (#22082) 2025-12-19 13:58:59 +00:00
Alex Waygood
f9d1a282fb [ty] Collect mdtest failures as part of the assertion message rather than printing them to the terminal immediately (#22081) 2025-12-19 13:53:31 +00:00
David Peter
0bac023cd2 [ty] Lockfiles for mdtests with external dependencies (#22077)
## Summary

Add lockfiles for all mdtests which make use of external dependencies.
When running tests normally, we use this lockfile when creating the
temporary venv using `uv sync --locked`. A new
`MDTEST_UPGRADE_LOCKFILES` environment variable is used to switch to a
mode in which those lockfiles can be updated or regenerated. When using
the Python mdtest runner, this environment variable is automatically set
(because we use this command while developing, not to simulate exactly
what happens in CI). A command-line flag is provided to opt out of this.

## Test Plan

### Using the mdtest runner

#### Adding a new test (no lockfile yet)

* Removed `attrs.lock` to simulate this
* Ran `uv run crates/ty_python_semantic/mdtest.py -e external/`. The
lockfile is generated and the test succeeds.

#### Upgrading/downgrading a dependency

* Changed pydantic requirement from `pydantic==2.12.2` to
`pydantic==2.12.5` (also tested with `2.12.0`)
* Ran `uv run crates/ty_python_semantic/mdtest.py -e external/`. The
lockfile is updated and the test succeeds.

### Using cargo

#### Adding a new test (no lockfile yet)

* Removed `attrs.lock` to simulate this
* Ran `MDTEST_EXTERNAL=1 cargo test -p ty_python_semantic --test mdtest
mdtest__external` "naively", which outputs:
> Failed to setup in-memory virtual environment with dependencies:
Lockfile not found at
'/home/shark/ruff/crates/ty_python_semantic/resources/mdtest/external/attrs.lock'.
Run with `MDTEST_UPGRADE_LOCKFILES=1` to generate it.
* Ran `MDTEST_UPGRADE_LOCKFILES=1 MDTEST_EXTERNAL=1 cargo test -p
ty_python_semantic --test mdtest mdtest__external`. The lockfile is
updated and the test succeeds.

#### Upgrading/downgrading a dependency

* Changed pydantic requirement from `pydantic==2.12.2` to
`pydantic==2.12.5` (also tested with `2.12.0`)
* Ran `MDTEST_EXTERNAL=1 cargo test -p ty_python_semantic --test mdtest
mdtest__external` "naively", which outputs a similar error message as
above.
* Ran the command suggested in the error message (`MDTEST_EXTERNAL=1
MDTEST_UPGRADE_LOCKFILES=1 cargo test -p ty_python_semantic --test
mdtest mdtest__external`). The lockfile is updated and the test
succeeds.
2025-12-19 14:29:52 +01:00
Micha Reiser
30efab8138 [ty] Only print dashed line for failing tests (#22080) 2025-12-19 14:20:03 +01:00
RasmusNygren
58d25129aa [ty] Visit class arguments in source order for semantic tokens (#22063)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-12-19 13:19:49 +00:00
1148 changed files with 12759 additions and 6114 deletions

View File

@@ -0,0 +1,67 @@
{
"permissions": {
"allow": [
"Bash(cargo build:*)",
"Read(//home/micha/astral/discord.py/discord/**)",
"Bash(source:*)",
"Bash(./target/profiling/ty check:*)",
"Bash(tee:*)",
"Read(//home/micha/astral/discord.py/.venv/**)",
"Read(//home/micha/astral/discord.py/**)",
"Bash(perf record:*)",
"Bash(perf report:*)",
"Bash(time:*)",
"Bash(../ruff/target/profiling/ty check discord/audit_logs.py -vv)",
"Bash(sed:*)",
"Read(//home/micha/git/TypeScript/**)",
"Bash(cargo test:*)",
"Bash(MDTEST_TEST_FILTER='union_types.md - Union types - Unions of tuples' cargo test -p ty_python_semantic --test mdtest -- mdtest__union_types)",
"Bash(timeout 10 cargo test --package ty_python_semantic --lib types::tests::divergent_type)",
"Bash(timeout 30 cargo test:*)",
"Bash(git stash:*)",
"Bash(timeout 60 time:*)",
"Bash(for i in 1 2 3 4 5)",
"Bash(do echo \"Run $i:\")",
"Bash(done)",
"Bash(for i in 1 2 3)",
"Bash(cargo fuzz run:*)",
"Bash(timeout 60 cargo fuzz run -s none ty_check_invalid_syntax -- -timeout=1)",
"Bash(for:*)",
"Bash(do echo \"=== Checking $crash ===\")",
"Bash(uvx ty@latest check \"$crash\")",
"Bash(do echo \"=== $crash ===\")",
"Bash(while read crash)",
"Bash(cargo fuzz cmin:*)",
"Bash(cargo +nightly fuzz cmin:*)",
"Bash(timeout 120 cargo fuzz run -s none ty_check_invalid_syntax -- -timeout=1)",
"Bash(awk:*)",
"Bash(while read file)",
"Bash(cat:*)",
"Bash(uvx ty@latest:*)",
"Bash(do cp \"$crash\" /tmp/isolated_crash.py)",
"Bash(echo \"=== $crash ===\")",
"Bash(do echo \"=== test$i.py ===\")",
"Bash(do echo \"=== Testing $crash ===\")",
"Bash(tree:*)",
"Bash(cut:*)",
"Bash(grep:*)",
"Bash(ls:*)",
"Bash(xargs basename:*)",
"Bash(wc:*)",
"Bash(find:*)",
"Bash({} ;)",
"Bash(git checkout:*)",
"Bash(do)",
"Bash(if ! grep -q \"use crate::types\" \"$f\")",
"Bash(! grep -q \"crate::types::\" \"$f\")",
"Bash(then)",
"Bash(else)",
"Bash(fi)",
"Bash(1)",
"Bash(__NEW_LINE__ echo \"Done\")",
"Bash(rm:*)"
],
"deny": [],
"ask": []
}
}

.gitattributes vendored
View File

@@ -22,6 +22,7 @@ crates/ruff_linter/resources/test/fixtures/pyupgrade/UP018_CR.py text eol=cr
crates/ruff_linter/resources/test/fixtures/pyupgrade/UP018_LF.py text eol=lf
crates/ruff_python_parser/resources/inline linguist-generated=true
crates/ty_python_semantic/resources/mdtest/external/*.lock linguist-generated=true
ruff.schema.json -diff linguist-generated=true text=auto eol=lf
ty.schema.json -diff linguist-generated=true text=auto eol=lf

View File

@@ -4,10 +4,12 @@
self-hosted-runner:
# Various runners we use that aren't recognized out-of-the-box by actionlint:
labels:
- depot-ubuntu-24.04-4
- depot-ubuntu-latest-8
- depot-ubuntu-22.04-16
- depot-ubuntu-22.04-32
- depot-windows-2022-16
- depot-ubuntu-22.04-arm-4
- github-windows-2025-x86_64-8
- github-windows-2025-x86_64-16
- codspeed-macro

View File

@@ -952,7 +952,7 @@ jobs:
tool: cargo-codspeed
- name: "Build benchmarks"
run: cargo codspeed build --features "codspeed,instrumented" --profile profiling --no-default-features -p ruff_benchmark --bench formatter --bench lexer --bench linter --bench parser
run: cargo codspeed build --features "codspeed,ruff_instrumented" --profile profiling --no-default-features -p ruff_benchmark --bench formatter --bench lexer --bench linter --bench parser
- name: "Run benchmarks"
uses: CodSpeedHQ/action@346a2d8a8d9d38909abd0bc3d23f773110f076ad # v4.4.1
@@ -960,9 +960,9 @@ jobs:
mode: simulation
run: cargo codspeed run
benchmarks-instrumented-ty:
name: "benchmarks instrumented (ty)"
runs-on: ubuntu-24.04
benchmarks-instrumented-ty-build:
name: "benchmarks instrumented ty (build)"
runs-on: depot-ubuntu-24.04-4
needs: determine_changes
if: |
github.repository == 'astral-sh/ruff' &&
@@ -971,9 +971,83 @@ jobs:
needs.determine_changes.outputs.ty == 'true'
)
timeout-minutes: 20
steps:
- name: "Checkout Branch"
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@3575e532701a5fc614b0c842e4119af4cc5fd16d # v2.62.60
with:
tool: cargo-codspeed
- name: "Build benchmarks"
run: cargo codspeed build -m instrumentation --features "codspeed,ty_instrumented" --profile profiling --no-default-features -p ruff_benchmark --bench ty
- name: "Upload benchmark binary"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: benchmarks-instrumented-ty-binary
path: target/codspeed/simulation/ruff_benchmark
retention-days: 1
benchmarks-instrumented-ty-run:
name: "benchmarks instrumented ty (${{ matrix.benchmark }})"
runs-on: ubuntu-24.04
needs: benchmarks-instrumented-ty-build
timeout-minutes: 20
permissions:
contents: read # required for actions/checkout
id-token: write # required for OIDC authentication with CodSpeed
strategy:
fail-fast: false
matrix:
benchmark:
- "check_file|micro|anyio"
- "attrs|hydra|datetype"
steps:
- name: "Checkout Branch"
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
- name: "Install codspeed"
uses: taiki-e/install-action@3575e532701a5fc614b0c842e4119af4cc5fd16d # v2.62.60
with:
tool: cargo-codspeed
- name: "Download benchmark binary"
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v4.3.0
with:
name: benchmarks-instrumented-ty-binary
path: target/codspeed/simulation/ruff_benchmark
- name: "Restore binary permissions"
run: chmod +x target/codspeed/simulation/ruff_benchmark/ty
- name: "Run benchmarks"
uses: CodSpeedHQ/action@346a2d8a8d9d38909abd0bc3d23f773110f076ad # v4.4.1
with:
mode: simulation
run: cargo codspeed run --bench ty "${{ matrix.benchmark }}"
benchmarks-walltime-build:
name: "benchmarks walltime (build)"
# We only run this job if `github.repository == 'astral-sh/ruff'`,
# so hardcoding depot here is fine
runs-on: depot-ubuntu-22.04-arm-4
needs: determine_changes
if: ${{ github.repository == 'astral-sh/ruff' && !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.ty == 'true' || github.ref == 'refs/heads/main') }}
timeout-minutes: 20
steps:
- name: "Checkout Branch"
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
@@ -994,49 +1068,51 @@ jobs:
tool: cargo-codspeed
- name: "Build benchmarks"
run: cargo codspeed build --features "codspeed,instrumented" --profile profiling --no-default-features -p ruff_benchmark --bench ty
run: cargo codspeed build -m walltime --features "codspeed,ty_walltime" --profile profiling --no-default-features -p ruff_benchmark
- name: "Run benchmarks"
uses: CodSpeedHQ/action@346a2d8a8d9d38909abd0bc3d23f773110f076ad # v4.4.1
- name: "Upload benchmark binary"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
mode: simulation
run: cargo codspeed run
name: benchmarks-walltime-binary
path: target/codspeed/walltime/ruff_benchmark
retention-days: 1
benchmarks-walltime:
name: "benchmarks walltime (${{ matrix.benchmarks }})"
benchmarks-walltime-run:
name: "benchmarks walltime (${{ matrix.benchmark }})"
runs-on: codspeed-macro
needs: determine_changes
if: ${{ github.repository == 'astral-sh/ruff' && !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.ty == 'true' || github.ref == 'refs/heads/main') }}
needs: benchmarks-walltime-build
timeout-minutes: 20
permissions:
contents: read # required for actions/checkout
id-token: write # required for OIDC authentication with CodSpeed
strategy:
matrix:
benchmarks:
- "medium|multithreaded"
- "small|large"
benchmark:
- colour_science
- "pandas|tanjun|altair"
- "static_frame|sympy"
- "pydantic|multithreaded|freqtrade"
steps:
- name: "Checkout Branch"
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@3575e532701a5fc614b0c842e4119af4cc5fd16d # v2.62.60
with:
tool: cargo-codspeed
- name: "Build benchmarks"
run: cargo codspeed build --features "codspeed,walltime" --profile profiling --no-default-features -p ruff_benchmark
- name: "Download benchmark binary"
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: benchmarks-walltime-binary
path: target/codspeed/walltime/ruff_benchmark
- name: "Restore binary permissions"
run: chmod +x target/codspeed/walltime/ruff_benchmark/ty_walltime
- name: "Run benchmarks"
uses: CodSpeedHQ/action@346a2d8a8d9d38909abd0bc3d23f773110f076ad # v4.4.1
@@ -1047,4 +1123,4 @@ jobs:
CODSPEED_PERF_ENABLED: false
with:
mode: walltime
run: cargo codspeed run --bench ty_walltime "${{ matrix.benchmarks }}"
run: cargo codspeed run --bench ty_walltime -m walltime "${{ matrix.benchmark }}"


@@ -62,7 +62,7 @@ jobs:
name: Create an issue if the daily fuzz surfaced any bugs
runs-on: ubuntu-latest
needs: fuzz
if: ${{ github.repository == 'astral-sh/ruff' && always() && github.event_name == 'schedule' && needs.fuzz.result == 'failure' }}
if: ${{ github.repository == 'astral-sh/ruff' && always() && github.event_name == 'schedule' && needs.fuzz.result != 'success' }}
permissions:
issues: write
steps:


@@ -6,6 +6,11 @@ on:
pull_request:
paths:
- "crates/ty*/**"
- "!crates/ty_ide/**"
- "!crates/ty_server/**"
- "!crates/ty_test/**"
- "!crates/ty_completion_eval/**"
- "!crates/ty_wasm/**"
- "crates/ruff_db"
- "crates/ruff_python_ast"
- "crates/ruff_python_parser"


@@ -16,8 +16,7 @@ name: Sync typeshed
# 3. Once the Windows worker is done, a MacOS worker:
# a. Checks out the branch created by the Linux worker
# b. Syncs all docstrings available on MacOS that are not available on Linux or Windows
# c. Attempts to update any snapshots that might have changed
# (this sub-step is allowed to fail)
# c. Formats the code again
# d. Commits the changes and pushes them to the same upstream branch
# e. Creates a PR against the `main` branch using the branch all three workers have pushed to
# 4. If any of steps 1-3 failed, an issue is created in the `astral-sh/ruff` repository
@@ -198,42 +197,6 @@ jobs:
run: |
rm "${VENDORED_TYPESHED}/pyproject.toml"
git commit -am "Remove pyproject.toml file"
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
- name: "Install Rust toolchain"
if: ${{ success() }}
run: rustup show
- name: "Install mold"
if: ${{ success() }}
uses: rui314/setup-mold@725a8794d15fc7563f59595bd9556495c0564878 # v1
- name: "Install cargo nextest"
if: ${{ success() }}
uses: taiki-e/install-action@3575e532701a5fc614b0c842e4119af4cc5fd16d # v2.62.60
with:
tool: cargo-nextest
- name: "Install cargo insta"
if: ${{ success() }}
uses: taiki-e/install-action@3575e532701a5fc614b0c842e4119af4cc5fd16d # v2.62.60
with:
tool: cargo-insta
- name: Update snapshots
if: ${{ success() }}
run: |
cargo r \
--profile=profiling \
-p ty_completion_eval \
-- all --tasks ./crates/ty_completion_eval/completion-evaluation-tasks.csv
# The `cargo insta` docs indicate that `--unreferenced=delete` might be a good option,
# but from local testing it appears to just revert all changes made by `cargo insta test --accept`.
#
# If there were only snapshot-related failures, `cargo insta test --accept` will have exit code 0,
# but if there were also other mdtest failures (for example), it will return a nonzero exit code.
# We don't care about other tests failing here, we just want snapshots updated where possible,
# so we use `|| true` here to ignore the exit code.
cargo insta test --accept --color=always --all-features --test-runner=nextest || true
- name: Commit snapshot changes
if: ${{ success() }}
run: git commit -am "Update snapshots" || echo "No snapshot changes to commit"
- name: Push changes upstream and create a PR
if: ${{ success() }}
run: |
@@ -245,7 +208,7 @@ jobs:
name: Create an issue if the typeshed sync failed
runs-on: ubuntu-latest
needs: [sync, docstrings-windows, docstrings-macos-and-pr]
if: ${{ github.repository == 'astral-sh/ruff' && always() && github.event_name == 'schedule' && (needs.sync.result == 'failure' || needs.docstrings-windows.result == 'failure' || needs.docstrings-macos-and-pr.result == 'failure') }}
if: ${{ github.repository == 'astral-sh/ruff' && always() && github.event_name == 'schedule' && (needs.sync.result != 'success' || needs.docstrings-windows.result != 'success' || needs.docstrings-macos-and-pr.result != 'success') }}
permissions:
issues: write
steps:


@@ -17,7 +17,6 @@ env:
RUSTUP_MAX_RETRIES: 10
RUST_BACKTRACE: 1
REF_NAME: ${{ github.ref_name }}
CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
jobs:
ty-ecosystem-analyzer:
@@ -112,22 +111,13 @@ jobs:
cat diff-statistics.md >> "$GITHUB_STEP_SUMMARY"
- name: "Deploy to Cloudflare Pages"
if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
id: deploy
uses: cloudflare/wrangler-action@da0e0dfe58b7a431659754fdf3f186c529afbe65 # v3.14.1
# NOTE: astral-sh-bot uses this artifact to post comments on PRs.
# Make sure to update the bot if you rename the artifact.
- name: "Upload full report"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
apiToken: ${{ secrets.CF_API_TOKEN }}
accountId: ${{ secrets.CF_ACCOUNT_ID }}
command: pages deploy dist --project-name=ty-ecosystem --branch ${{ github.head_ref }} --commit-hash ${GITHUB_SHA}
- name: "Append deployment URL"
if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
env:
DEPLOYMENT_URL: ${{ steps.deploy.outputs.pages-deployment-alias-url }}
run: |
echo >> comment.md
echo "**[Full report with detailed diff]($DEPLOYMENT_URL/diff)** ([timing results]($DEPLOYMENT_URL/timing))" >> comment.md
name: full-report
path: dist/
# NOTE: astral-sh-bot uses this artifact to post comments on PRs.
# Make sure to update the bot if you rename the artifact.


@@ -14,7 +14,6 @@ env:
CARGO_TERM_COLOR: always
RUSTUP_MAX_RETRIES: 10
RUST_BACKTRACE: 1
CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
jobs:
ty-ecosystem-report:
@@ -31,12 +30,12 @@ jobs:
- name: Install the latest version of uv
uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
with:
enable-cache: true # zizmor: ignore[cache-poisoning] acceptable risk for CloudFlare pages artifact
enable-cache: true
- uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
with:
workspaces: "ruff"
lookup-only: false # zizmor: ignore[cache-poisoning] acceptable risk for CloudFlare pages artifact
lookup-only: false
- name: Install Rust toolchain
run: rustup show
@@ -70,11 +69,10 @@ jobs:
ecosystem-diagnostics.json \
--output dist/index.html
- name: "Deploy to Cloudflare Pages"
if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
id: deploy
uses: cloudflare/wrangler-action@da0e0dfe58b7a431659754fdf3f186c529afbe65 # v3.14.1
# NOTE: astral-sh-bot uses this artifact to publish the ecosystem report.
# Make sure to update the bot if you rename the artifact.
- name: "Upload ecosystem report"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
apiToken: ${{ secrets.CF_API_TOKEN }}
accountId: ${{ secrets.CF_ACCOUNT_ID }}
command: pages deploy dist --project-name=ty-ecosystem --branch main --commit-hash ${GITHUB_SHA}
name: full-report
path: dist/


@@ -6,6 +6,11 @@ on:
pull_request:
paths:
- "crates/ty*/**"
- "!crates/ty_ide/**"
- "!crates/ty_server/**"
- "!crates/ty_test/**"
- "!crates/ty_completion_eval/**"
- "!crates/ty_wasm/**"
- "crates/ruff_db"
- "crates/ruff_python_ast"
- "crates/ruff_python_parser"


@@ -5,7 +5,7 @@ exclude: |
.github/workflows/release.yml|
crates/ty_vendored/vendor/.*|
crates/ty_project/resources/.*|
crates/ty_python_semantic/resources/corpus/.*|
crates/ty_python_types/resources/corpus/.*|
crates/ty/docs/(configuration|rules|cli|environment).md|
crates/ruff_benchmark/resources/.*|
crates/ruff_linter/resources/.*|

Cargo.lock generated

@@ -3083,6 +3083,7 @@ dependencies = [
"ty",
"ty_project",
"ty_python_semantic",
"ty_python_types",
"ty_static",
"url",
]
@@ -3129,7 +3130,9 @@ dependencies = [
"salsa",
"schemars",
"serde",
"ty_module_resolver",
"ty_python_semantic",
"ty_python_types",
"zip",
]
@@ -3619,8 +3622,8 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.24.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=55e5e7d32fa3fc189276f35bb04c9438f9aedbd1#55e5e7d32fa3fc189276f35bb04c9438f9aedbd1"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=ce80691fa0b87dc2fd2235a26544e63e5e43d8d3#ce80691fa0b87dc2fd2235a26544e63e5e43d8d3"
dependencies = [
"boxcar",
"compact_str",
@@ -3644,13 +3647,13 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.24.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=55e5e7d32fa3fc189276f35bb04c9438f9aedbd1#55e5e7d32fa3fc189276f35bb04c9438f9aedbd1"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=ce80691fa0b87dc2fd2235a26544e63e5e43d8d3#ce80691fa0b87dc2fd2235a26544e63e5e43d8d3"
[[package]]
name = "salsa-macros"
version = "0.24.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=55e5e7d32fa3fc189276f35bb04c9438f9aedbd1#55e5e7d32fa3fc189276f35bb04c9438f9aedbd1"
version = "0.25.2"
source = "git+https://github.com/salsa-rs/salsa.git?rev=ce80691fa0b87dc2fd2235a26544e63e5e43d8d3#ce80691fa0b87dc2fd2235a26544e63e5e43d8d3"
dependencies = [
"proc-macro2",
"quote",
@@ -4375,6 +4378,7 @@ dependencies = [
"tracing-flame",
"tracing-subscriber",
"ty_combine",
"ty_module_resolver",
"ty_project",
"ty_python_semantic",
"ty_server",
@@ -4389,7 +4393,6 @@ dependencies = [
"ordermap",
"ruff_db",
"ruff_python_ast",
"ty_python_semantic",
]
[[package]]
@@ -4407,8 +4410,8 @@ dependencies = [
"tempfile",
"toml",
"ty_ide",
"ty_module_resolver",
"ty_project",
"ty_python_semantic",
"walkdir",
]
@@ -4438,8 +4441,33 @@ dependencies = [
"salsa",
"smallvec",
"tracing",
"ty_module_resolver",
"ty_project",
"ty_python_semantic",
"ty_python_types",
"ty_vendored",
]
[[package]]
name = "ty_module_resolver"
version = "0.0.0"
dependencies = [
"anyhow",
"camino",
"compact_str",
"get-size2",
"insta",
"ruff_db",
"ruff_memory_usage",
"ruff_python_ast",
"ruff_python_stdlib",
"rustc-hash",
"salsa",
"strum",
"strum_macros",
"tempfile",
"thiserror 2.0.17",
"tracing",
"ty_vendored",
]
@@ -4477,7 +4505,9 @@ dependencies = [
"toml",
"tracing",
"ty_combine",
"ty_module_resolver",
"ty_python_semantic",
"ty_python_types",
"ty_static",
"ty_vendored",
]
@@ -4491,19 +4521,12 @@ dependencies = [
"bitvec",
"camino",
"colored 3.0.0",
"compact_str",
"datatest-stable",
"drop_bomb",
"get-size2",
"glob",
"hashbrown 0.16.1",
"indexmap",
"indoc",
"insta",
"itertools 0.14.0",
"memchr",
"ordermap",
"pretty_assertions",
"quickcheck",
"quickcheck_macros",
"ruff_annotate_snippets",
@@ -4513,9 +4536,7 @@ dependencies = [
"ruff_macros",
"ruff_memory_usage",
"ruff_python_ast",
"ruff_python_literal",
"ruff_python_parser",
"ruff_python_stdlib",
"ruff_python_trivia",
"ruff_source_file",
"ruff_text_size",
@@ -4529,10 +4550,57 @@ dependencies = [
"strsim",
"strum",
"strum_macros",
"tempfile",
"test-case",
"thiserror 2.0.17",
"tracing",
"ty_combine",
"ty_module_resolver",
"ty_static",
"ty_vendored",
]
[[package]]
name = "ty_python_types"
version = "0.0.0"
dependencies = [
"anyhow",
"bitflags 2.10.0",
"camino",
"compact_str",
"datatest-stable",
"drop_bomb",
"get-size2",
"glob",
"indexmap",
"indoc",
"insta",
"itertools 0.14.0",
"memchr",
"ordermap",
"pretty_assertions",
"quickcheck",
"quickcheck_macros",
"ruff_db",
"ruff_diagnostics",
"ruff_macros",
"ruff_memory_usage",
"ruff_python_ast",
"ruff_python_literal",
"ruff_python_parser",
"ruff_python_stdlib",
"ruff_source_file",
"ruff_text_size",
"rustc-hash",
"salsa",
"schemars",
"serde",
"serde_json",
"smallvec",
"static_assertions",
"strum",
"strum_macros",
"test-case",
"tracing",
"ty_module_resolver",
"ty_python_semantic",
"ty_static",
"ty_test",
@@ -4572,6 +4640,7 @@ dependencies = [
"tracing-subscriber",
"ty_combine",
"ty_ide",
"ty_module_resolver",
"ty_project",
"ty_python_semantic",
]
@@ -4612,7 +4681,9 @@ dependencies = [
"thiserror 2.0.17",
"toml",
"tracing",
"ty_module_resolver",
"ty_python_semantic",
"ty_python_types",
"ty_static",
"ty_vendored",
]


@@ -45,8 +45,10 @@ ty = { path = "crates/ty" }
ty_combine = { path = "crates/ty_combine" }
ty_completion_eval = { path = "crates/ty_completion_eval" }
ty_ide = { path = "crates/ty_ide" }
ty_module_resolver = { path = "crates/ty_module_resolver" }
ty_project = { path = "crates/ty_project", default-features = false }
ty_python_semantic = { path = "crates/ty_python_semantic" }
ty_python_types = { path = "crates/ty_python_types" }
ty_server = { path = "crates/ty_server" }
ty_static = { path = "crates/ty_static" }
ty_test = { path = "crates/ty_test" }
@@ -146,7 +148,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "55e5e7d32fa3fc189276f35bb04c9438f9aedbd1", default-features = false, features = [
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "ce80691fa0b87dc2fd2235a26544e63e5e43d8d3", default-features = false, features = [
"compact_str",
"macros",
"salsa_unstable",


@@ -14,6 +14,6 @@ info:
success: false
exit_code: 1
----- stdout -----
::error title=Ruff (unformatted),file=[TMP]/input.py,line=1,col=1,endLine=2,endColumn=1::input.py:1:1: unformatted: File would be reformatted
::error title=Ruff (unformatted),file=[TMP]/input.py,line=1,endLine=2::input.py:1:1: unformatted: File would be reformatted
----- stderr -----


@@ -19,32 +19,32 @@ doctest = false
[[bench]]
name = "linter"
harness = false
required-features = ["instrumented"]
required-features = ["ruff_instrumented"]
[[bench]]
name = "lexer"
harness = false
required-features = ["instrumented"]
required-features = ["ruff_instrumented"]
[[bench]]
name = "parser"
harness = false
required-features = ["instrumented"]
required-features = ["ruff_instrumented"]
[[bench]]
name = "formatter"
harness = false
required-features = ["instrumented"]
required-features = ["ruff_instrumented"]
[[bench]]
name = "ty"
harness = false
required-features = ["instrumented"]
required-features = ["ty_instrumented"]
[[bench]]
name = "ty_walltime"
harness = false
required-features = ["walltime"]
required-features = ["ty_walltime"]
[dependencies]
ruff_db = { workspace = true, features = ["testing"] }
@@ -67,25 +67,32 @@ tracing = { workspace = true }
workspace = true
[features]
default = ["instrumented", "walltime"]
# Enables the benchmark that should only run with codspeed's instrumented runner
instrumented = [
default = ["ty_instrumented", "ty_walltime", "ruff_instrumented"]
# Enables the ruff instrumented benchmarks
ruff_instrumented = [
"criterion",
"ruff_linter",
"ruff_python_formatter",
"ruff_python_parser",
"ruff_python_trivia",
"mimalloc",
"tikv-jemallocator",
]
# Enables the ty instrumented benchmarks
ty_instrumented = [
"criterion",
"ty_project",
"ruff_python_trivia",
]
codspeed = ["codspeed-criterion-compat"]
# Enables benchmark that should only run with codspeed's walltime runner.
walltime = ["ruff_db/os", "ty_project", "divan"]
# Enables the ty_walltime benchmarks
ty_walltime = ["ruff_db/os", "ty_project", "divan"]
[target.'cfg(target_os = "windows")'.dev-dependencies]
mimalloc = { workspace = true }
[target.'cfg(target_os = "windows")'.dependencies]
mimalloc = { workspace = true, optional = true }
[target.'cfg(all(not(target_os = "windows"), not(target_os = "openbsd"), any(target_arch = "x86_64", target_arch = "aarch64", target_arch = "powerpc64", target_arch = "riscv64")))'.dev-dependencies]
tikv-jemallocator = { workspace = true }
[target.'cfg(all(not(target_os = "windows"), not(target_os = "openbsd"), any(target_arch = "x86_64", target_arch = "aarch64", target_arch = "powerpc64", target_arch = "riscv64")))'.dependencies]
tikv-jemallocator = { workspace = true, optional = true }
[dev-dependencies]
rustc-hash = { workspace = true }


@@ -194,7 +194,7 @@ static SYMPY: Benchmark = Benchmark::new(
max_dep_date: "2025-06-17",
python_version: PythonVersion::PY312,
},
13100,
13106,
);
static TANJUN: Benchmark = Benchmark::new(
@@ -235,30 +235,55 @@ fn run_single_threaded(bencher: Bencher, benchmark: &Benchmark) {
});
}
#[bench(args=[&ALTAIR, &FREQTRADE, &TANJUN], sample_size=2, sample_count=3)]
fn small(bencher: Bencher, benchmark: &Benchmark) {
run_single_threaded(bencher, benchmark);
#[bench(sample_size = 2, sample_count = 3)]
fn altair(bencher: Bencher) {
run_single_threaded(bencher, &ALTAIR);
}
#[bench(args=[&COLOUR_SCIENCE, &PANDAS, &STATIC_FRAME], sample_size=1, sample_count=3)]
fn medium(bencher: Bencher, benchmark: &Benchmark) {
run_single_threaded(bencher, benchmark);
#[bench(sample_size = 2, sample_count = 3)]
fn freqtrade(bencher: Bencher) {
run_single_threaded(bencher, &FREQTRADE);
}
#[bench(args=[&SYMPY, &PYDANTIC], sample_size=1, sample_count=2)]
fn large(bencher: Bencher, benchmark: &Benchmark) {
run_single_threaded(bencher, benchmark);
#[bench(sample_size = 2, sample_count = 3)]
fn tanjun(bencher: Bencher) {
run_single_threaded(bencher, &TANJUN);
}
#[bench(args=[&ALTAIR], sample_size=3, sample_count=8)]
fn multithreaded(bencher: Bencher, benchmark: &Benchmark) {
#[bench(sample_size = 2, sample_count = 3)]
fn pydantic(bencher: Bencher) {
run_single_threaded(bencher, &PYDANTIC);
}
#[bench(sample_size = 1, sample_count = 3)]
fn static_frame(bencher: Bencher) {
run_single_threaded(bencher, &STATIC_FRAME);
}
#[bench(sample_size = 1, sample_count = 2)]
fn colour_science(bencher: Bencher) {
run_single_threaded(bencher, &COLOUR_SCIENCE);
}
#[bench(sample_size = 1, sample_count = 2)]
fn pandas(bencher: Bencher) {
run_single_threaded(bencher, &PANDAS);
}
#[bench(sample_size = 1, sample_count = 2)]
fn sympy(bencher: Bencher) {
run_single_threaded(bencher, &SYMPY);
}
#[bench(sample_size = 3, sample_count = 8)]
fn multithreaded(bencher: Bencher) {
let thread_pool = ThreadPoolBuilder::new().build().unwrap();
bencher
.with_inputs(|| benchmark.setup_iteration())
.with_inputs(|| ALTAIR.setup_iteration())
.bench_local_values(|db| {
thread_pool.install(|| {
check_project(&db, benchmark.project.name, benchmark.max_diagnostics);
check_project(&db, ALTAIR.project.name, ALTAIR.max_diagnostics);
db
})
});


@@ -1,6 +1,6 @@
use std::path::PathBuf;
#[cfg(feature = "instrumented")]
#[cfg(any(feature = "ty_instrumented", feature = "ruff_instrumented"))]
pub mod criterion;
pub mod real_world_projects;


@@ -0,0 +1,51 @@
use std::sync::Arc;
use std::sync::atomic::AtomicBool;
/// Signals a [`CancellationToken`] that it should be canceled.
#[derive(Debug, Clone)]
pub struct CancellationTokenSource {
cancelled: Arc<AtomicBool>,
}
impl Default for CancellationTokenSource {
fn default() -> Self {
Self::new()
}
}
impl CancellationTokenSource {
pub fn new() -> Self {
Self {
cancelled: Arc::new(AtomicBool::new(false)),
}
}
pub fn is_cancellation_requested(&self) -> bool {
self.cancelled.load(std::sync::atomic::Ordering::Relaxed)
}
/// Creates a new token that uses this source.
pub fn token(&self) -> CancellationToken {
CancellationToken {
cancelled: self.cancelled.clone(),
}
}
/// Requests cancellation for operations using this token.
pub fn cancel(&self) {
self.cancelled
.store(true, std::sync::atomic::Ordering::Relaxed);
}
}
/// A token that signals whether an operation should be canceled.
#[derive(Debug, Clone)]
pub struct CancellationToken {
cancelled: Arc<AtomicBool>,
}
impl CancellationToken {
pub fn is_cancelled(&self) -> bool {
self.cancelled.load(std::sync::atomic::Ordering::Relaxed)
}
}
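The `CancellationTokenSource`/`CancellationToken` pair above follows the usual source/token split: the source requests cancellation, and any number of cloned tokens observe it. A minimal usage sketch (the trimmed-down types mirror the diff, but the `render_diagnostics` loop is illustrative only, not part of this PR):

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};

// Trimmed-down copies of the `ruff_db::cancellation` types, for illustration.
#[derive(Clone, Default)]
struct CancellationTokenSource {
    cancelled: Arc<AtomicBool>,
}

impl CancellationTokenSource {
    fn token(&self) -> CancellationToken {
        CancellationToken {
            cancelled: self.cancelled.clone(),
        }
    }

    fn cancel(&self) {
        self.cancelled.store(true, Ordering::Relaxed);
    }
}

#[derive(Clone)]
struct CancellationToken {
    cancelled: Arc<AtomicBool>,
}

impl CancellationToken {
    fn is_cancelled(&self) -> bool {
        self.cancelled.load(Ordering::Relaxed)
    }
}

// Render up to `total` diagnostics, stopping early once cancellation is requested.
fn render_diagnostics(source: &CancellationTokenSource, total: usize) -> usize {
    let token = source.token();
    let mut rendered = 0;
    for i in 0..total {
        if token.is_cancelled() {
            break;
        }
        rendered += 1;
        // In practice `cancel()` would be called from another thread (e.g. an
        // editor dismissing the diagnostics view); here we trigger it inline.
        if i == 2 {
            source.cancel();
        }
    }
    rendered
}

fn main() {
    let source = CancellationTokenSource::default();
    // Cancellation fires after the third diagnostic, so only 3 are rendered.
    println!("{}", render_diagnostics(&source, 10));
}
```

This is the same check-and-bail pattern the `ConciseRenderer` and `FullRenderer` diffs below use via `config.is_canceled()`.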


@@ -1,4 +1,4 @@
use std::{fmt::Formatter, path::Path, sync::Arc};
use std::{borrow::Cow, fmt::Formatter, path::Path, sync::Arc};
use ruff_diagnostics::{Applicability, Fix};
use ruff_source_file::{LineColumn, SourceCode, SourceFile};
@@ -11,6 +11,7 @@ pub use self::render::{
ceil_char_boundary,
github::{DisplayGithubDiagnostics, GithubRenderer},
};
use crate::cancellation::CancellationToken;
use crate::{Db, files::File};
mod render;
@@ -410,11 +411,6 @@ impl Diagnostic {
self.id().is_invalid_syntax()
}
/// Returns the message body to display to the user.
pub fn body(&self) -> &str {
self.primary_message()
}
/// Returns the message of the first sub-diagnostic with a `Help` severity.
///
/// Note that this is used as the fix title/suggestion for some of Ruff's output formats, but in
@@ -1312,6 +1308,8 @@ pub struct DisplayDiagnosticConfig {
show_fix_diff: bool,
/// The lowest applicability that should be shown when reporting diagnostics.
fix_applicability: Applicability,
cancellation_token: Option<CancellationToken>,
}
impl DisplayDiagnosticConfig {
@@ -1385,6 +1383,20 @@ impl DisplayDiagnosticConfig {
pub fn fix_applicability(&self) -> Applicability {
self.fix_applicability
}
pub fn with_cancellation_token(
mut self,
token: Option<CancellationToken>,
) -> DisplayDiagnosticConfig {
self.cancellation_token = token;
self
}
pub fn is_canceled(&self) -> bool {
self.cancellation_token
.as_ref()
.is_some_and(|token| token.is_cancelled())
}
}
impl Default for DisplayDiagnosticConfig {
@@ -1398,6 +1410,7 @@ impl Default for DisplayDiagnosticConfig {
show_fix_status: false,
show_fix_diff: false,
fix_applicability: Applicability::Safe,
cancellation_token: None,
}
}
}
@@ -1474,6 +1487,15 @@ pub enum ConciseMessage<'a> {
Custom(&'a str),
}
impl<'a> ConciseMessage<'a> {
pub fn to_str(&self) -> Cow<'a, str> {
match self {
ConciseMessage::MainDiagnostic(s) | ConciseMessage::Custom(s) => Cow::Borrowed(s),
ConciseMessage::Both { .. } => Cow::Owned(self.to_string()),
}
}
}
impl std::fmt::Display for ConciseMessage<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match *self {
@@ -1490,6 +1512,16 @@ impl std::fmt::Display for ConciseMessage<'_> {
}
}
#[cfg(feature = "serde")]
impl serde::Serialize for ConciseMessage<'_> {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serializer.collect_str(self)
}
}
/// A diagnostic message string.
///
/// This is, for all intents and purposes, equivalent to a `Box<str>`.
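The new `ConciseMessage::to_str` returns `Cow<'a, str>` so that the common single-message cases stay borrowed and only the combined `Both` case allocates (by formatting through `Display`). A standalone sketch of that pattern, using a simplified stand-in enum rather than the real `ruff_db` type:

```rust
use std::borrow::Cow;
use std::fmt;

// Simplified stand-in for ruff_db's `ConciseMessage`.
enum Message<'a> {
    Single(&'a str),
    Both { main: &'a str, annotation: &'a str },
}

impl fmt::Display for Message<'_> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Message::Single(s) => f.write_str(s),
            Message::Both { main, annotation } => write!(f, "{main}: {annotation}"),
        }
    }
}

impl<'a> Message<'a> {
    // Borrow when possible; allocate only when the message must be assembled.
    fn to_str(&self) -> Cow<'a, str> {
        match self {
            Message::Single(s) => Cow::Borrowed(*s),
            Message::Both { .. } => Cow::Owned(self.to_string()),
        }
    }
}

fn main() {
    let single = Message::Single("File would be reformatted");
    // The single-message case avoids an allocation entirely.
    assert!(matches!(single.to_str(), Cow::Borrowed(_)));

    let both = Message::Both {
        main: "unformatted",
        annotation: "run the formatter",
    };
    assert_eq!(both.to_str(), "unformatted: run the formatter");
    println!("ok");
}
```

This is why call sites like the JUnit renderer in this PR switch from `diagnostic.body()` to `diagnostic.concise_message().to_str()`: they get a `&str`-like value without forcing an allocation in the common case.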


@@ -52,7 +52,7 @@ impl AzureRenderer<'_> {
f,
"code={code};]{body}",
code = diag.secondary_code_or_id(),
body = diag.body(),
body = diag.concise_message(),
)?;
}


@@ -28,6 +28,10 @@ impl<'a> ConciseRenderer<'a> {
let sep = fmt_styled(":", stylesheet.separator);
for diag in diagnostics {
if self.config.is_canceled() {
return Ok(());
}
if let Some(span) = diag.primary_span() {
write!(
f,


@@ -53,6 +53,10 @@ impl<'a> FullRenderer<'a> {
.hyperlink(stylesheet.hyperlink);
for diag in diagnostics {
if self.config.is_canceled() {
return Ok(());
}
let resolved = Resolved::new(self.resolver, diag, self.config);
let renderable = resolved.to_renderable(self.config.context);
for diag in renderable.diagnostics.iter() {


@@ -49,14 +49,26 @@ impl<'a> GithubRenderer<'a> {
}
.unwrap_or_default();
write!(
f,
",line={row},col={column},endLine={end_row},endColumn={end_column}::",
row = start_location.line,
column = start_location.column,
end_row = end_location.line,
end_column = end_location.column,
)?;
// GitHub Actions workflow commands have constraints on error annotations:
// - `col` and `endColumn` cannot be set if `line` and `endLine` are different
// See: https://github.com/astral-sh/ruff/issues/22074
if start_location.line == end_location.line {
write!(
f,
",line={row},col={column},endLine={end_row},endColumn={end_column}::",
row = start_location.line,
column = start_location.column,
end_row = end_location.line,
end_column = end_location.column,
)?;
} else {
write!(
f,
",line={row},endLine={end_row}::",
row = start_location.line,
end_row = end_location.line,
)?;
}
write!(
f,
@@ -75,7 +87,7 @@ impl<'a> GithubRenderer<'a> {
write!(f, "{id}:", id = diagnostic.id())?;
}
writeln!(f, " {}", diagnostic.body())?;
writeln!(f, " {}", diagnostic.concise_message())?;
}
Ok(())


@@ -98,7 +98,7 @@ impl Serialize for SerializedMessages<'_> {
}
fingerprints.insert(message_fingerprint);
let description = diagnostic.body();
let description = diagnostic.concise_message();
let check_name = diagnostic.secondary_code_or_id();
let severity = match diagnostic.severity() {
Severity::Info => "info",


@@ -6,7 +6,7 @@ use ruff_notebook::NotebookIndex;
use ruff_source_file::{LineColumn, OneIndexed};
use ruff_text_size::Ranged;
use crate::diagnostic::{Diagnostic, DiagnosticSource, DisplayDiagnosticConfig};
use crate::diagnostic::{ConciseMessage, Diagnostic, DiagnosticSource, DisplayDiagnosticConfig};
use super::FileResolver;
@@ -101,7 +101,7 @@ pub(super) fn diagnostic_to_json<'a>(
JsonDiagnostic {
code: diagnostic.secondary_code_or_id(),
url: diagnostic.documentation_url(),
message: diagnostic.body(),
message: diagnostic.concise_message(),
fix,
cell: notebook_cell_index,
location: start_location.map(JsonLocation::from),
@@ -113,7 +113,7 @@ pub(super) fn diagnostic_to_json<'a>(
JsonDiagnostic {
code: diagnostic.secondary_code_or_id(),
url: diagnostic.documentation_url(),
message: diagnostic.body(),
message: diagnostic.concise_message(),
fix,
cell: notebook_cell_index,
location: Some(start_location.unwrap_or_default().into()),
@@ -226,7 +226,7 @@ pub(crate) struct JsonDiagnostic<'a> {
filename: Option<&'a str>,
fix: Option<JsonFix<'a>>,
location: Option<JsonLocation>,
message: &'a str,
message: ConciseMessage<'a>,
noqa_row: Option<OneIndexed>,
url: Option<&'a str>,
}


@@ -56,17 +56,17 @@ impl<'a> JunitRenderer<'a> {
start_location: location,
} = diagnostic;
let mut status = TestCaseStatus::non_success(NonSuccessKind::Failure);
status.set_message(diagnostic.body());
status.set_message(diagnostic.concise_message().to_str());
if let Some(location) = location {
status.set_description(format!(
"line {row}, col {col}, {body}",
row = location.line,
col = location.column,
body = diagnostic.body()
body = diagnostic.concise_message()
));
} else {
status.set_description(diagnostic.body());
status.set_description(diagnostic.concise_message().to_str());
}
let code = diagnostic


@@ -55,7 +55,7 @@ impl PylintRenderer<'_> {
f,
"{path}:{row}: [{code}] {body}",
path = filename,
body = diagnostic.body()
body = diagnostic.concise_message()
)?;
}


@@ -5,7 +5,7 @@ use ruff_diagnostics::{Edit, Fix};
use ruff_source_file::{LineColumn, SourceCode};
use ruff_text_size::Ranged;
use crate::diagnostic::Diagnostic;
use crate::diagnostic::{ConciseMessage, Diagnostic};
use super::FileResolver;
@@ -76,7 +76,7 @@ fn diagnostic_to_rdjson<'a>(
let edits = diagnostic.fix().map(Fix::edits).unwrap_or_default();
RdjsonDiagnostic {
message: diagnostic.body(),
message: diagnostic.concise_message(),
location,
code: RdjsonCode {
value: diagnostic
@@ -155,7 +155,7 @@ struct RdjsonDiagnostic<'a> {
code: RdjsonCode<'a>,
#[serde(skip_serializing_if = "Option::is_none")]
location: Option<RdjsonLocation<'a>>,
message: &'a str,
message: ConciseMessage<'a>,
#[serde(skip_serializing_if = "Vec::is_empty")]
suggestions: Vec<RdjsonSuggestion<'a>>,
}


@@ -2,5 +2,5 @@
source: crates/ruff_db/src/diagnostic/render/github.rs
expression: env.render_diagnostics(&diagnostics)
---
::error title=ty (invalid-syntax),file=/syntax_errors.py,line=1,col=15,endLine=2,endColumn=1::syntax_errors.py:1:15: invalid-syntax: Expected one or more symbol names after import
::error title=ty (invalid-syntax),file=/syntax_errors.py,line=3,col=12,endLine=4,endColumn=1::syntax_errors.py:3:12: invalid-syntax: Expected ')', found newline
::error title=ty (invalid-syntax),file=/syntax_errors.py,line=1,endLine=2::syntax_errors.py:1:15: invalid-syntax: Expected one or more symbol names after import
::error title=ty (invalid-syntax),file=/syntax_errors.py,line=3,endLine=4::syntax_errors.py:3:12: invalid-syntax: Expected ')', found newline


@@ -12,6 +12,7 @@ use std::hash::BuildHasherDefault;
use std::num::NonZeroUsize;
use ty_static::EnvVars;
pub mod cancellation;
pub mod diagnostic;
pub mod display;
pub mod file_revision;


@@ -275,16 +275,16 @@ impl OsSystem {
/// instead of at least one system call for each component between `path` and `prefix`.
///
/// However, using `canonicalize` to resolve the path's casing doesn't work in two cases:
/// * if `path` is a symlink because `canonicalize` then returns the symlink's target and not the symlink's source path.
/// * on Windows: If `path` is a mapped network drive because `canonicalize` then returns the UNC path
/// (e.g. `Z:\` is mapped to `\\server\share` and `canonicalize` then returns `\\?\UNC\server\share`).
/// * if `path` is a symlink, `canonicalize` returns the symlink's target and not the symlink's source path.
/// * on Windows: If `path` is a mapped network drive, `canonicalize` returns the UNC path
/// (e.g. `Z:\` is mapped to `\\server\share` and `canonicalize` returns `\\?\UNC\server\share`).
///
/// Symlinks and mapped network drives should be rare enough that this fast path is worth trying first,
/// even if it comes at a cost for those rare use cases.
fn path_exists_case_sensitive_fast(&self, path: &SystemPath) -> Option<bool> {
// This is a more forgiving version of `dunce::simplified` that removes all `\\?\` prefixes on Windows.
// We use this more forgiving version because we don't intend on using either path for anything other than comparison
// and the prefix is only relevant when passing the path to other programs and its longer than 200 something
// and the prefix is only relevant when passing the path to other programs and it's longer than 200 something
// characters.
fn simplify_ignore_verbatim(path: &SystemPath) -> &SystemPath {
if cfg!(windows) {
@@ -298,9 +298,7 @@ impl OsSystem {
}
}
let simplified = simplify_ignore_verbatim(path);
let Ok(canonicalized) = simplified.as_std_path().canonicalize() else {
let Ok(canonicalized) = path.as_std_path().canonicalize() else {
// The path doesn't exist or can't be accessed; treat it as nonexistent.
return Some(false);
};
@@ -309,12 +307,13 @@ impl OsSystem {
// The original path is valid UTF8 but the canonicalized path isn't. This definitely suggests
// that a symlink is involved. Fall back to the slow path.
tracing::debug!(
"Falling back to the slow case-sensitive path existence check because the canonicalized path of `{simplified}` is not valid UTF-8"
"Falling back to the slow case-sensitive path existence check because the canonicalized path of `{path}` is not valid UTF-8"
);
return None;
};
let simplified_canonicalized = simplify_ignore_verbatim(&canonicalized);
let simplified = simplify_ignore_verbatim(path);
// Test if the paths differ by anything other than casing. If so, that suggests that
// `path` pointed to a symlink (or some other non-reversible path normalization happened).


@@ -14,6 +14,7 @@ license = { workspace = true }
ty = { workspace = true }
ty_project = { workspace = true, features = ["schemars"] }
ty_python_semantic = { workspace = true }
ty_python_types = { workspace = true }
ty_static = { workspace = true }
ruff = { workspace = true }
ruff_formatter = { workspace = true }


@@ -1,11 +1,13 @@
//! Generate a Markdown-compatible listing of configuration options for `pyproject.toml`.
use std::borrow::Cow;
use std::{fmt::Write, path::PathBuf};
use anyhow::bail;
use itertools::Itertools;
use pretty_assertions::StrComparison;
use std::{fmt::Write, path::PathBuf};
use ruff_options_metadata::{OptionField, OptionSet, OptionsMetadata, Visit};
use ruff_python_trivia::textwrap;
use ty_project::metadata::Options;
use crate::{
@@ -165,62 +167,69 @@ fn emit_field(output: &mut String, name: &str, field: &OptionField, parents: &[S
let _ = writeln!(output, "**Default value**: `{}`", field.default);
output.push('\n');
let _ = writeln!(output, "**Type**: `{}`", field.value_type);
output.push('\n');
output.push_str("**Example usage**:\n\n");
output.push_str(&format_example(
"pyproject.toml",
&format_header(
field.scope,
field.example,
parents,
ConfigurationFile::PyprojectToml,
),
field.example,
));
output.push('\n');
}
fn format_example(title: &str, header: &str, content: &str) -> String {
if header.is_empty() {
format!("```toml title=\"{title}\"\n{content}\n```\n",)
} else {
format!("```toml title=\"{title}\"\n{header}\n{content}\n```\n",)
for configuration_file in [ConfigurationFile::PyprojectToml, ConfigurationFile::TyToml] {
let (header, example) =
format_snippet(field.scope, field.example, parents, configuration_file);
output.push_str(&format_tab(configuration_file.name(), &header, &example));
output.push('\n');
}
}
fn format_tab(tab_name: &str, header: &str, content: &str) -> String {
let header = if header.is_empty() {
String::new()
} else {
format!("\n {header}")
};
format!(
"=== \"{}\"\n\n ```toml{}\n{}\n ```\n",
tab_name,
header,
textwrap::indent(content, " ")
)
}
/// Format the TOML header for the example usage for a given option.
///
/// For example: `[tool.ruff.format]` or `[tool.ruff.lint.isort]`.
fn format_header(
/// For example: `[tool.ty.rules]`.
fn format_snippet<'a>(
scope: Option<&str>,
example: &str,
example: &'a str,
parents: &[Set],
configuration: ConfigurationFile,
) -> String {
let tool_parent = match configuration {
ConfigurationFile::PyprojectToml => Some("tool.ty"),
ConfigurationFile::TyToml => None,
};
) -> (String, Cow<'a, str>) {
let mut example = Cow::Borrowed(example);
let header = tool_parent
let header = configuration
.parent_table()
.into_iter()
.chain(parents.iter().filter_map(|parent| parent.name()))
.chain(scope)
.join(".");
// Rewrite examples starting with `[tool.ty]` or `[[tool.ty]]` to their `ty.toml` equivalent.
if matches!(configuration, ConfigurationFile::TyToml) {
example = example.replace("[tool.ty.", "[").into();
}
// Ex) `[[tool.ty.xx]]`
if example.starts_with(&format!("[[{header}")) {
return String::new();
return (String::new(), example);
}
// Ex) `[tool.ty.rules]`
if example.starts_with(&format!("[{header}")) {
return String::new();
return (String::new(), example);
}
if header.is_empty() {
String::new()
(String::new(), example)
} else {
format!("[{header}]")
(format!("[{header}]"), example)
}
}
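`format_snippet` above builds the TOML table header by chaining an optional parent table (`tool.ty` for `pyproject.toml`, nothing for `ty.toml`), any intermediate tables, and an optional scope, then joining with dots. A standard-library-only sketch of that chaining (function and parameter names here are hypothetical):

```rust
/// Build a TOML header like `[tool.ty.rules]` or `[rules]` from optional parts.
/// Mirrors the `configuration.parent_table() … .chain(scope).join(".")` logic in
/// the diff, using only `Iterator::chain` over `Option`s and slices.
fn toml_header(parent_table: Option<&str>, parents: &[&str], scope: Option<&str>) -> String {
    let parts: Vec<&str> = parent_table
        .into_iter()
        .chain(parents.iter().copied())
        .chain(scope)
        .collect();
    if parts.is_empty() {
        String::new()
    } else {
        format!("[{}]", parts.join("."))
    }
}

fn main() {
    // pyproject.toml: everything lives under the `tool.ty` table.
    assert_eq!(toml_header(Some("tool.ty"), &[], Some("rules")), "[tool.ty.rules]");
    // ty.toml: no parent table, so the scope becomes the top-level table.
    assert_eq!(toml_header(None, &[], Some("rules")), "[rules]");
    // No parts at all yields no header.
    assert_eq!(toml_header(None, &[], None), "");
    println!("ok");
}
```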
@@ -243,10 +252,25 @@ impl Visit for CollectOptionsVisitor {
#[derive(Debug, Copy, Clone)]
enum ConfigurationFile {
PyprojectToml,
#[expect(dead_code)]
TyToml,
}
impl ConfigurationFile {
const fn name(self) -> &'static str {
match self {
Self::PyprojectToml => "pyproject.toml",
Self::TyToml => "ty.toml",
}
}
const fn parent_table(self) -> Option<&'static str> {
match self {
Self::PyprojectToml => Some("tool.ty"),
Self::TyToml => None,
}
}
}
#[cfg(test)]
mod tests {
use anyhow::Result;


@@ -52,7 +52,7 @@ pub(crate) fn main(args: &Args) -> Result<()> {
}
fn generate_markdown() -> String {
let registry = ty_python_semantic::default_lint_registry();
let registry = ty_python_types::default_lint_registry();
let mut output = String::new();
@@ -63,12 +63,7 @@ fn generate_markdown() -> String {
let _ = writeln!(&mut output, "# Rules\n");
let mut lints: Vec<_> = registry.lints().iter().collect();
lints.sort_by(|a, b| {
a.default_level()
.cmp(&b.default_level())
.reverse()
.then_with(|| a.name().cmp(&b.name()))
});
lints.sort_by_key(|a| a.name());
for lint in lints {
let _ = writeln!(&mut output, "## `{rule_name}`\n", rule_name = lint.name());


@@ -16,7 +16,9 @@ ruff_linter = { workspace = true }
ruff_macros = { workspace = true }
ruff_python_ast = { workspace = true }
ruff_python_parser = { workspace = true }
ty_module_resolver = { workspace = true }
ty_python_semantic = { workspace = true }
ty_python_types = { workspace = true }
anyhow = { workspace = true }
clap = { workspace = true, optional = true }


@@ -3,7 +3,7 @@ use ruff_python_ast::visitor::source_order::{
SourceOrderVisitor, walk_expr, walk_module, walk_stmt,
};
use ruff_python_ast::{self as ast, Expr, Mod, Stmt};
use ty_python_semantic::ModuleName;
use ty_module_resolver::ModuleName;
/// Collect all imports for a given Python file.
#[derive(Default, Debug)]


@@ -7,11 +7,13 @@ use ruff_db::files::{File, Files};
use ruff_db::system::{OsSystem, System, SystemPathBuf};
use ruff_db::vendored::{VendoredFileSystem, VendoredFileSystemBuilder};
use ruff_python_ast::PythonVersion;
use ty_module_resolver::{SearchPathSettings, SearchPaths};
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{
Db, Program, ProgramSettings, PythonEnvironment, PythonPlatform, PythonVersionSource,
PythonVersionWithSource, SearchPathSettings, SysPrefixPathOrigin, default_lint_registry,
AnalysisSettings, Db, Program, ProgramSettings, PythonEnvironment, PythonPlatform,
PythonVersionSource, PythonVersionWithSource, SysPrefixPathOrigin,
};
use ty_python_types::default_lint_registry;
static EMPTY_VENDORED: std::sync::LazyLock<VendoredFileSystem> = std::sync::LazyLock::new(|| {
let mut builder = VendoredFileSystemBuilder::new(CompressionMethod::Stored);
@@ -26,6 +28,7 @@ pub struct ModuleDb {
files: Files,
system: OsSystem,
rule_selection: Arc<RuleSelection>,
analysis_settings: Arc<AnalysisSettings>,
}
impl ModuleDb {
@@ -85,6 +88,13 @@ impl SourceDb for ModuleDb {
}
}
#[salsa::db]
impl ty_module_resolver::Db for ModuleDb {
fn search_paths(&self) -> &SearchPaths {
Program::get(self).search_paths(self)
}
}
#[salsa::db]
impl Db for ModuleDb {
fn should_check_file(&self, file: File) -> bool {
@@ -102,6 +112,10 @@ impl Db for ModuleDb {
fn verbose(&self) -> bool {
false
}
fn analysis_settings(&self) -> &AnalysisSettings {
&self.analysis_settings
}
}
#[salsa::db]


@@ -1,6 +1,6 @@
use ruff_db::files::{File, FilePath, system_path_to_file};
use ruff_db::system::SystemPath;
use ty_python_semantic::{
use ty_module_resolver::{
ModuleName, resolve_module, resolve_module_confident, resolve_real_module,
resolve_real_module_confident,
};


@@ -197,7 +197,7 @@ impl Display for RuleCodeAndBody<'_> {
f,
"{fix}{body}",
fix = format_args!("[{}] ", "*".cyan()),
body = self.message.body(),
body = self.message.concise_message(),
);
}
}
@@ -208,14 +208,14 @@ impl Display for RuleCodeAndBody<'_> {
f,
"{code} {body}",
code = code.red().bold(),
body = self.message.body(),
body = self.message.concise_message(),
)
} else {
write!(
f,
"{code}: {body}",
code = self.message.id().as_str().red().bold(),
body = self.message.body(),
body = self.message.concise_message(),
)
}
}


@@ -334,7 +334,7 @@ impl<'a> SarifResult<'a> {
rule_id: RuleCode::from(diagnostic),
level: "error".to_string(),
message: SarifMessage {
text: diagnostic.body().to_string(),
text: diagnostic.concise_message().to_string(),
},
fixes: Self::fix(diagnostic, &uri).into_iter().collect(),
locations: vec![SarifLocation {


@@ -37,10 +37,11 @@ use crate::{Fix, FixAvailability, Violation};
/// import logging
///
/// logging.basicConfig(level=logging.INFO)
/// logger = logging.getLogger(__name__)
///
///
/// def sum_less_than_four(a, b):
/// logging.debug("Calling sum_less_than_four")
/// logger.debug("Calling sum_less_than_four")
/// return a + b < 4
/// ```
///


@@ -243,7 +243,7 @@ fn to_lsp_diagnostic(
) -> (usize, lsp_types::Diagnostic) {
let diagnostic_range = diagnostic.range().unwrap_or_default();
let name = diagnostic.name();
let body = diagnostic.body().to_string();
let body = diagnostic.concise_message().to_string();
let fix = diagnostic.fix();
let suggestion = diagnostic.first_help_text();
let code = diagnostic.secondary_code();


@@ -241,7 +241,7 @@ impl Workspace {
let range = msg.range().unwrap_or_default();
ExpandedMessage {
code: msg.secondary_code_or_id().to_string(),
message: msg.body().to_string(),
message: msg.concise_message().to_string(),
start_location: source_code
.source_location(range.start(), self.position_encoding)
.into(),


@@ -42,6 +42,7 @@ wild = { workspace = true }
[dev-dependencies]
ruff_db = { workspace = true, features = ["testing"] }
ruff_python_trivia = { workspace = true }
ty_module_resolver = { workspace = true }
dunce = { workspace = true }
insta = { workspace = true, features = ["filters"] }

crates/ty/docs/cli.md generated

@@ -56,6 +56,7 @@ over all configuration files.</p>
</dd><dt id="ty-check--exit-zero"><a href="#ty-check--exit-zero"><code>--exit-zero</code></a></dt><dd><p>Always use exit code 0, even when there are error-level diagnostics</p>
</dd><dt id="ty-check--extra-search-path"><a href="#ty-check--extra-search-path"><code>--extra-search-path</code></a> <i>path</i></dt><dd><p>Additional path to use as a module-resolution source (can be passed multiple times).</p>
<p>This is an advanced option that should usually only be used for first-party or third-party modules that are not installed into your Python environment in a conventional way. Use <code>--python</code> to point ty to your Python environment if it is in an unusual location.</p>
</dd><dt id="ty-check--force-exclude"><a href="#ty-check--force-exclude"><code>--force-exclude</code></a></dt><dd><p>Enforce exclusions, even for paths passed to ty directly on the command-line. Use <code>--no-force-exclude</code> to disable</p>
</dd><dt id="ty-check--help"><a href="#ty-check--help"><code>--help</code></a>, <code>-h</code></dt><dd><p>Print help (see a summary with '-h')</p>
</dd><dt id="ty-check--ignore"><a href="#ty-check--ignore"><code>--ignore</code></a> <i>rule</i></dt><dd><p>Disables the rule. Can be specified multiple times.</p>
</dd><dt id="ty-check--no-progress"><a href="#ty-check--no-progress"><code>--no-progress</code></a></dt><dd><p>Hide all progress outputs.</p>


@@ -20,11 +20,59 @@ Valid severities are:
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.rules]
possibly-unresolved-reference = "warn"
division-by-zero = "ignore"
```
=== "pyproject.toml"
```toml
[tool.ty.rules]
possibly-unresolved-reference = "warn"
division-by-zero = "ignore"
```
=== "ty.toml"
```toml
[rules]
possibly-unresolved-reference = "warn"
division-by-zero = "ignore"
```
---
## `analysis`
### `respect-type-ignore-comments`
Whether ty should respect `type: ignore` comments.
When set to `false`, `type: ignore` comments are treated like any other normal
comment and can't be used to suppress ty errors (you have to use `ty: ignore` instead).
Setting this option can be useful when using ty alongside other type checkers or when
you prefer using `ty: ignore` over `type: ignore`.
Defaults to `true`.
**Default value**: `true`
**Type**: `bool`
**Example usage**:
=== "pyproject.toml"
```toml
[tool.ty.analysis]
# Disable support for `type: ignore` comments
respect-type-ignore-comments = false
```
=== "ty.toml"
```toml
[analysis]
# Disable support for `type: ignore` comments
respect-type-ignore-comments = false
```
---
@@ -47,10 +95,19 @@ configuration setting.
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.environment]
extra-paths = ["./shared/my-search-path"]
```
=== "pyproject.toml"
```toml
[tool.ty.environment]
extra-paths = ["./shared/my-search-path"]
```
=== "ty.toml"
```toml
[environment]
extra-paths = ["./shared/my-search-path"]
```
---
@@ -78,10 +135,19 @@ This option can be used to point to virtual or system Python environments.
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.environment]
python = "./custom-venv-location/.venv"
```
=== "pyproject.toml"
```toml
[tool.ty.environment]
python = "./custom-venv-location/.venv"
```
=== "ty.toml"
```toml
[environment]
python = "./custom-venv-location/.venv"
```
---
@@ -105,11 +171,21 @@ If no platform is specified, ty will use the current platform:
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.environment]
# Tailor type stubs and conditionalized type definitions to windows.
python-platform = "win32"
```
=== "pyproject.toml"
```toml
[tool.ty.environment]
# Tailor type stubs and conditionalized type definitions to windows.
python-platform = "win32"
```
=== "ty.toml"
```toml
[environment]
# Tailor type stubs and conditionalized type definitions to windows.
python-platform = "win32"
```
---
@@ -139,10 +215,19 @@ to reflect the differing contents of the standard library across Python versions
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.environment]
python-version = "3.12"
```
=== "pyproject.toml"
```toml
[tool.ty.environment]
python-version = "3.12"
```
=== "ty.toml"
```toml
[environment]
python-version = "3.12"
```
---
@@ -167,11 +252,21 @@ it will also be included in the first party search path.
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.environment]
# Multiple directories (priority order)
root = ["./src", "./lib", "./vendor"]
```
=== "pyproject.toml"
```toml
[tool.ty.environment]
# Multiple directories (priority order)
root = ["./src", "./lib", "./vendor"]
```
=== "ty.toml"
```toml
[environment]
# Multiple directories (priority order)
root = ["./src", "./lib", "./vendor"]
```
---
@@ -187,10 +282,19 @@ bundled as a zip file in the binary
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.environment]
typeshed = "/path/to/custom/typeshed"
```
=== "pyproject.toml"
```toml
[tool.ty.environment]
typeshed = "/path/to/custom/typeshed"
```
=== "ty.toml"
```toml
[environment]
typeshed = "/path/to/custom/typeshed"
```
---
@@ -240,15 +344,29 @@ If not specified, defaults to `[]` (excludes no files).
**Example usage**:
```toml title="pyproject.toml"
[[tool.ty.overrides]]
exclude = [
"generated",
"*.proto",
"tests/fixtures/**",
"!tests/fixtures/important.py" # Include this one file
]
```
=== "pyproject.toml"
```toml
[[tool.ty.overrides]]
exclude = [
"generated",
"*.proto",
"tests/fixtures/**",
"!tests/fixtures/important.py" # Include this one file
]
```
=== "ty.toml"
```toml
[[overrides]]
exclude = [
"generated",
"*.proto",
"tests/fixtures/**",
"!tests/fixtures/important.py" # Include this one file
]
```
---
@@ -268,13 +386,25 @@ If not specified, defaults to `["**"]` (matches all files).
**Example usage**:
```toml title="pyproject.toml"
[[tool.ty.overrides]]
include = [
"src",
"tests",
]
```
=== "pyproject.toml"
```toml
[[tool.ty.overrides]]
include = [
"src",
"tests",
]
```
=== "ty.toml"
```toml
[[overrides]]
include = [
"src",
"tests",
]
```
---
@@ -292,13 +422,25 @@ severity levels or disable them entirely.
**Example usage**:
```toml title="pyproject.toml"
[[tool.ty.overrides]]
include = ["src"]
=== "pyproject.toml"
[tool.ty.overrides.rules]
possibly-unresolved-reference = "ignore"
```
```toml
[[tool.ty.overrides]]
include = ["src"]
[tool.ty.overrides.rules]
possibly-unresolved-reference = "ignore"
```
=== "ty.toml"
```toml
[[overrides]]
include = ["src"]
[overrides.rules]
possibly-unresolved-reference = "ignore"
```
---
@@ -358,15 +500,29 @@ to re-include `dist` use `exclude = ["!dist"]`
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.src]
exclude = [
"generated",
"*.proto",
"tests/fixtures/**",
"!tests/fixtures/important.py" # Include this one file
]
```
=== "pyproject.toml"
```toml
[tool.ty.src]
exclude = [
"generated",
"*.proto",
"tests/fixtures/**",
"!tests/fixtures/important.py" # Include this one file
]
```
=== "ty.toml"
```toml
[src]
exclude = [
"generated",
"*.proto",
"tests/fixtures/**",
"!tests/fixtures/important.py" # Include this one file
]
```
---
@@ -399,13 +555,25 @@ matches `<project_root>/src` and not `<project_root>/test/src`).
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.src]
include = [
"src",
"tests",
]
```
=== "pyproject.toml"
```toml
[tool.ty.src]
include = [
"src",
"tests",
]
```
=== "ty.toml"
```toml
[src]
include = [
"src",
"tests",
]
```
---
@@ -421,10 +589,19 @@ Enabled by default.
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.src]
respect-ignore-files = false
```
=== "pyproject.toml"
```toml
[tool.ty.src]
respect-ignore-files = false
```
=== "ty.toml"
```toml
[src]
respect-ignore-files = false
```
---
@@ -450,10 +627,19 @@ it will also be included in the first party search path.
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.src]
root = "./app"
```
=== "pyproject.toml"
```toml
[tool.ty.src]
root = "./app"
```
=== "ty.toml"
```toml
[src]
root = "./app"
```
---
@@ -471,11 +657,21 @@ Defaults to `false`.
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.terminal]
# Error if ty emits any warning-level diagnostics.
error-on-warning = true
```
=== "pyproject.toml"
```toml
[tool.ty.terminal]
# Error if ty emits any warning-level diagnostics.
error-on-warning = true
```
=== "ty.toml"
```toml
[terminal]
# Error if ty emits any warning-level diagnostics.
error-on-warning = true
```
---
@@ -491,10 +687,19 @@ Defaults to `full`.
**Example usage**:
```toml title="pyproject.toml"
[tool.ty.terminal]
output-format = "concise"
```
=== "pyproject.toml"
```toml
[tool.ty.terminal]
output-format = "concise"
```
=== "ty.toml"
```toml
[terminal]
output-format = "concise"
```
---

crates/ty/docs/rules.md generated

File diff suppressed because it is too large.

@@ -154,6 +154,17 @@ pub(crate) struct CheckCommand {
#[clap(long, overrides_with("respect_ignore_files"), hide = true)]
no_respect_ignore_files: bool,
/// Enforce exclusions, even for paths passed to ty directly on the command-line.
/// Use `--no-force-exclude` to disable.
#[arg(
long,
overrides_with("no_force_exclude"),
help_heading = "File selection"
)]
force_exclude: bool,
#[clap(long, overrides_with("force_exclude"), hide = true)]
no_force_exclude: bool,
/// Glob patterns for files to exclude from type checking.
///
/// Uses gitignore-style syntax to exclude files and directories from type checking.
@@ -178,6 +189,10 @@ pub(crate) struct CheckCommand {
}
impl CheckCommand {
pub(crate) fn force_exclude(&self) -> bool {
resolve_bool_arg(self.force_exclude, self.no_force_exclude).unwrap_or_default()
}
pub(crate) fn into_options(self) -> Options {
let rules = if self.rules.is_empty() {
None
@@ -435,3 +450,12 @@ impl ConfigsArg {
self.0
}
}
fn resolve_bool_arg(yes: bool, no: bool) -> Option<bool> {
match (yes, no) {
(true, false) => Some(true),
(false, true) => Some(false),
(false, false) => None,
(..) => unreachable!("Clap should make this impossible"),
}
}
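The `resolve_bool_arg` helper above implements the common clap `--flag`/`--no-flag` pattern, where `overrides_with` guarantees the two flags are mutually exclusive. Its truth table can be checked in isolation (the function body is re-declared verbatim from the diff so the snippet runs standalone):

```rust
/// Resolve a pair of mutually exclusive boolean CLI flags into an `Option`:
/// `Some(true)` for `--flag`, `Some(false)` for `--no-flag`, `None` if neither
/// was passed (letting configuration-file defaults apply).
fn resolve_bool_arg(yes: bool, no: bool) -> Option<bool> {
    match (yes, no) {
        (true, false) => Some(true),
        (false, true) => Some(false),
        (false, false) => None,
        (..) => unreachable!("Clap should make this impossible"),
    }
}

fn main() {
    assert_eq!(resolve_bool_arg(true, false), Some(true)); // --force-exclude
    assert_eq!(resolve_bool_arg(false, true), Some(false)); // --no-force-exclude
    assert_eq!(resolve_bool_arg(false, false), None); // neither flag passed
    println!("ok");
}
```

Returning `Option<bool>` rather than `bool` is what lets `CheckCommand::force_exclude` fall back via `unwrap_or_default()` when neither flag is given.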


@@ -10,9 +10,9 @@ use ty_static::EnvVars;
use std::fmt::Write;
use std::process::{ExitCode, Termination};
use std::sync::Mutex;
use anyhow::Result;
use std::sync::Mutex;
use crate::args::{CheckCommand, Command, TerminalColor};
use crate::logging::{VerbosityLevel, setup_tracing};
@@ -22,6 +22,7 @@ use clap::{CommandFactory, Parser};
use colored::Colorize;
use crossbeam::channel as crossbeam_channel;
use rayon::ThreadPoolBuilder;
use ruff_db::cancellation::{CancellationToken, CancellationTokenSource};
use ruff_db::diagnostic::{
Diagnostic, DiagnosticId, DisplayDiagnosticConfig, DisplayDiagnostics, Severity,
};
@@ -118,9 +119,12 @@ fn run_check(args: CheckCommand) -> anyhow::Result<ExitStatus> {
.config_file
.as_ref()
.map(|path| SystemPath::absolute(path, &cwd));
let force_exclude = args.force_exclude();
let mut project_metadata = match &config_file {
Some(config_file) => ProjectMetadata::from_config_file(config_file.clone(), &system)?,
Some(config_file) => {
ProjectMetadata::from_config_file(config_file.clone(), &project_path, &system)?
}
None => ProjectMetadata::discover(&project_path, &system)?,
};
@@ -130,11 +134,13 @@ fn run_check(args: CheckCommand) -> anyhow::Result<ExitStatus> {
project_metadata.apply_overrides(&project_options_overrides);
let mut db = ProjectDatabase::new(project_metadata, system)?;
let project = db.project();
project.set_verbose(&mut db, verbosity >= VerbosityLevel::Verbose);
project.set_force_exclude(&mut db, force_exclude);
db.project()
.set_verbose(&mut db, verbosity >= VerbosityLevel::Verbose);
if !check_paths.is_empty() {
db.project().set_included_paths(&mut db, check_paths);
project.set_included_paths(&mut db, check_paths);
}
let (main_loop, main_loop_cancellation_token) =
@@ -222,6 +228,11 @@ struct MainLoop {
printer: Printer,
project_options_overrides: ProjectOptionsOverrides,
/// Cancellation token that gets set by Ctrl+C.
/// Used for long-running operations on the main thread. Operations on background threads
/// use Salsa's cancellation mechanism.
cancellation_token: CancellationToken,
}
impl MainLoop {
@@ -231,6 +242,9 @@ impl MainLoop {
) -> (Self, MainLoopCancellationToken) {
let (sender, receiver) = crossbeam_channel::bounded(10);
let cancellation_token_source = CancellationTokenSource::new();
let cancellation_token = cancellation_token_source.token();
(
Self {
sender: sender.clone(),
@@ -238,8 +252,12 @@ impl MainLoop {
watcher: None,
project_options_overrides,
printer,
cancellation_token,
},
MainLoopCancellationToken {
sender,
source: cancellation_token_source,
},
MainLoopCancellationToken { sender },
)
}
@@ -273,9 +291,6 @@ impl MainLoop {
let mut revision = 0u64;
while let Ok(message) = self.receiver.recv() {
if self.watcher.is_some() {
Printer::clear_screen()?;
}
match message {
MainLoopMessage::CheckWorkspace => {
let db = db.clone();
@@ -314,6 +329,7 @@ impl MainLoop {
let display_config = DisplayDiagnosticConfig::default()
.format(terminal_settings.output_format.into())
.color(colored::control::SHOULD_COLORIZE.should_colorize())
.with_cancellation_token(Some(self.cancellation_token.clone()))
.show_fix_diff(true);
if check_revision == revision {
@@ -357,19 +373,21 @@ impl MainLoop {
)?;
}
if is_human_readable {
writeln!(
self.printer.stream_for_failure_summary(),
"Found {} diagnostic{}",
diagnostics_count,
if diagnostics_count > 1 { "s" } else { "" }
)?;
}
if !self.cancellation_token.is_cancelled() {
if is_human_readable {
writeln!(
self.printer.stream_for_failure_summary(),
"Found {} diagnostic{}",
diagnostics_count,
if diagnostics_count > 1 { "s" } else { "" }
)?;
}
if exit_status.is_internal_error() {
tracing::warn!(
"A fatal error occurred while checking some files. Not all project files were analyzed. See the diagnostics list above for details."
);
if exit_status.is_internal_error() {
tracing::warn!(
"A fatal error occurred while checking some files. Not all project files were analyzed. See the diagnostics list above for details."
);
}
}
if self.watcher.is_none() {
@@ -384,12 +402,15 @@ impl MainLoop {
}
MainLoopMessage::ApplyChanges(changes) => {
Printer::clear_screen()?;
revision += 1;
// Automatically cancels any pending queries and waits for them to complete.
db.apply_changes(changes, Some(&self.project_options_overrides));
if let Some(watcher) = self.watcher.as_mut() {
watcher.update(db);
}
self.sender.send(MainLoopMessage::CheckWorkspace).unwrap();
}
MainLoopMessage::Exit => {
@@ -493,10 +514,12 @@ impl ty_project::ProgressReporter for IndicatifReporter {
#[derive(Debug)]
struct MainLoopCancellationToken {
sender: crossbeam_channel::Sender<MainLoopMessage>,
source: CancellationTokenSource,
}
impl MainLoopCancellationToken {
fn stop(self) {
self.source.cancel();
self.sender.send(MainLoopMessage::Exit).unwrap();
}
}


@@ -0,0 +1,43 @@
use insta_cmd::assert_cmd_snapshot;
use crate::CliTest;
/// ty ignores `type: ignore` comments when setting `respect-type-ignore-comments=false`
#[test]
fn respect_type_ignore_comments_is_turned_off() -> anyhow::Result<()> {
let case = CliTest::with_file(
"test.py",
r#"
y = a + 5 # type: ignore
"#,
)?;
// Assert that there's an `unresolved-reference` diagnostic (error).
assert_cmd_snapshot!(case.command(), @r"
success: true
exit_code: 0
----- stdout -----
All checks passed!
----- stderr -----
");
assert_cmd_snapshot!(case.command().arg("--config").arg("analysis.respect-type-ignore-comments=false"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-reference]: Name `a` used when not defined
--> test.py:2:5
|
2 | y = a + 5 # type: ignore
| ^
|
info: rule `unresolved-reference` is enabled by default
Found 1 diagnostic
----- stderr -----
");
Ok(())
}


@@ -133,7 +133,7 @@ fn cli_config_args_invalid_option() -> anyhow::Result<()> {
|
1 | bad-option=true
| ^^^^^^^^^^
unknown field `bad-option`, expected one of `environment`, `src`, `rules`, `terminal`, `overrides`
unknown field `bad-option`, expected one of `environment`, `src`, `rules`, `terminal`, `analysis`, `overrides`
Usage: ty <COMMAND>


@@ -589,6 +589,128 @@ fn explicit_path_overrides_exclude() -> anyhow::Result<()> {
Ok(())
}
/// Test behavior when explicitly checking a path that matches an exclude pattern and `--force-exclude` is provided
#[test]
fn explicit_path_overrides_exclude_force_exclude() -> anyhow::Result<()> {
let case = CliTest::with_files([
(
"src/main.py",
r#"
print(undefined_var) # error: unresolved-reference
"#,
),
(
"tests/generated.py",
r#"
print(dist_undefined_var) # error: unresolved-reference
"#,
),
(
"dist/other.py",
r#"
print(other_undefined_var) # error: unresolved-reference
"#,
),
(
"ty.toml",
r#"
[src]
exclude = ["tests/generated.py"]
"#,
),
])?;
// Explicitly checking a file in an excluded directory should still check that file
assert_cmd_snapshot!(case.command().arg("tests/generated.py").arg("src/main.py"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-reference]: Name `undefined_var` used when not defined
--> src/main.py:2:7
|
2 | print(undefined_var) # error: unresolved-reference
| ^^^^^^^^^^^^^
|
info: rule `unresolved-reference` is enabled by default
error[unresolved-reference]: Name `dist_undefined_var` used when not defined
--> tests/generated.py:2:7
|
2 | print(dist_undefined_var) # error: unresolved-reference
| ^^^^^^^^^^^^^^^^^^
|
info: rule `unresolved-reference` is enabled by default
Found 2 diagnostics
----- stderr -----
");
// Except when `--force-exclude` is set.
assert_cmd_snapshot!(case.command().arg("tests/generated.py").arg("src/main.py").arg("--force-exclude"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-reference]: Name `undefined_var` used when not defined
--> src/main.py:2:7
|
2 | print(undefined_var) # error: unresolved-reference
| ^^^^^^^^^^^^^
|
info: rule `unresolved-reference` is enabled by default
Found 1 diagnostic
----- stderr -----
");
// Explicitly checking the entire excluded directory should check all files in it
assert_cmd_snapshot!(case.command().arg("dist/").arg("src/main.py"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-reference]: Name `other_undefined_var` used when not defined
--> dist/other.py:2:7
|
2 | print(other_undefined_var) # error: unresolved-reference
| ^^^^^^^^^^^^^^^^^^^
|
info: rule `unresolved-reference` is enabled by default
error[unresolved-reference]: Name `undefined_var` used when not defined
--> src/main.py:2:7
|
2 | print(undefined_var) # error: unresolved-reference
| ^^^^^^^^^^^^^
|
info: rule `unresolved-reference` is enabled by default
Found 2 diagnostics
----- stderr -----
");
// Except when using `--force-exclude`
assert_cmd_snapshot!(case.command().arg("dist/").arg("src/main.py").arg("--force-exclude"), @r"
success: false
exit_code: 1
----- stdout -----
error[unresolved-reference]: Name `undefined_var` used when not defined
--> src/main.py:2:7
|
2 | print(undefined_var) # error: unresolved-reference
| ^^^^^^^^^^^^^
|
info: rule `unresolved-reference` is enabled by default
Found 1 diagnostic
----- stderr -----
");
Ok(())
}
#[test]
fn cli_and_configuration_exclude() -> anyhow::Result<()> {
let case = CliTest::with_files([


@@ -1,3 +1,4 @@
mod analysis_options;
mod config_option;
mod exit_code;
mod file_selection;
@@ -658,6 +659,8 @@ fn gitlab_diagnostics() -> anyhow::Result<()> {
r#"
print(x) # [unresolved-reference]
print(4[1]) # [non-subscriptable]
from typing_extensions import reveal_type
reveal_type('str'.lower()) # [revealed-type]
"#,
)?;
@@ -708,6 +711,25 @@ fn gitlab_diagnostics() -> anyhow::Result<()> {
}
}
}
},
{
"check_name": "revealed-type",
"description": "revealed-type: Revealed type: `LiteralString`",
"severity": "info",
"fingerprint": "[FINGERPRINT]",
"location": {
"path": "test.py",
"positions": {
"begin": {
"line": 5,
"column": 13
},
"end": {
"line": 5,
"column": 26
}
}
}
}
]
----- stderr -----
@@ -723,6 +745,8 @@ fn github_diagnostics() -> anyhow::Result<()> {
r#"
print(x) # [unresolved-reference]
print(4[1]) # [non-subscriptable]
from typing_extensions import reveal_type
reveal_type('str'.lower()) # [revealed-type]
"#,
)?;
@@ -732,6 +756,7 @@ fn github_diagnostics() -> anyhow::Result<()> {
----- stdout -----
::warning title=ty (unresolved-reference),file=<temp_dir>/test.py,line=2,col=7,endLine=2,endColumn=8::test.py:2:7: unresolved-reference: Name `x` used when not defined
::error title=ty (non-subscriptable),file=<temp_dir>/test.py,line=3,col=7,endLine=3,endColumn=11::test.py:3:7: non-subscriptable: Cannot subscript object of type `Literal[4]` with no `__getitem__` method
::notice title=ty (revealed-type),file=<temp_dir>/test.py,line=5,col=13,endLine=5,endColumn=26::test.py:5:13: revealed-type: Revealed type: `LiteralString`
----- stderr -----
");


@@ -10,12 +10,13 @@ use ruff_db::system::{
OsSystem, System, SystemPath, SystemPathBuf, UserConfigDirectoryOverrideGuard, file_time_now,
};
use ruff_python_ast::PythonVersion;
use ty_module_resolver::{Module, ModuleName, resolve_module_confident};
use ty_project::metadata::options::{EnvironmentOptions, Options, ProjectOptionsOverrides};
use ty_project::metadata::pyproject::{PyProject, Tool};
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
use ty_project::watch::{ChangeEvent, ProjectWatcher, directory_watcher};
use ty_project::{Db, ProjectDatabase, ProjectMetadata};
use ty_python_semantic::{Module, ModuleName, PythonPlatform, resolve_module_confident};
use ty_python_semantic::PythonPlatform;
struct TestCase {
db: ProjectDatabase,
@@ -1019,7 +1020,7 @@ fn search_path() -> anyhow::Result<()> {
let site_packages = case.root_path().join("site_packages");
assert_eq!(
resolve_module_confident(case.db(), &ModuleName::new("a").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new_static("a").unwrap()),
None
);
@@ -1192,7 +1193,7 @@ fn changed_versions_file() -> anyhow::Result<()> {
// Unset the custom typeshed directory.
assert_eq!(
resolve_module_confident(case.db(), &ModuleName::new("os").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new_static("os").unwrap()),
None
);
@@ -1207,7 +1208,7 @@ fn changed_versions_file() -> anyhow::Result<()> {
case.apply_changes(changes, None);
assert!(resolve_module_confident(case.db(), &ModuleName::new("os").unwrap()).is_some());
assert!(resolve_module_confident(case.db(), &ModuleName::new_static("os").unwrap()).is_some());
Ok(())
}
@@ -1874,11 +1875,11 @@ fn rename_files_casing_only() -> anyhow::Result<()> {
let mut case = setup([("lib.py", "class Foo: ...")])?;
assert!(
resolve_module_confident(case.db(), &ModuleName::new("lib").unwrap()).is_some(),
resolve_module_confident(case.db(), &ModuleName::new_static("lib").unwrap()).is_some(),
"Expected `lib` module to exist."
);
assert_eq!(
resolve_module_confident(case.db(), &ModuleName::new("Lib").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new_static("Lib").unwrap()),
None,
"Expected `Lib` module not to exist"
);
@@ -1911,13 +1912,13 @@ fn rename_files_casing_only() -> anyhow::Result<()> {
// Resolving `lib` should now fail but `Lib` should now succeed
assert_eq!(
resolve_module_confident(case.db(), &ModuleName::new("lib").unwrap()),
resolve_module_confident(case.db(), &ModuleName::new_static("lib").unwrap()),
None,
"Expected `lib` module to no longer exist."
);
assert!(
resolve_module_confident(case.db(), &ModuleName::new("Lib").unwrap()).is_some(),
resolve_module_confident(case.db(), &ModuleName::new_static("Lib").unwrap()).is_some(),
"Expected `Lib` module to exist"
);

View File

@@ -12,7 +12,6 @@ license.workspace = true
[dependencies]
ruff_db = { workspace = true }
ruff_python_ast = { workspace = true }
ty_python_semantic = { workspace = true }
ordermap = { workspace = true }

View File

@@ -8,7 +8,6 @@ use std::{collections::HashMap, hash::BuildHasher};
use ordermap::OrderMap;
use ruff_db::system::SystemPathBuf;
use ruff_python_ast::PythonVersion;
use ty_python_semantic::PythonPlatform;
/// Combine two values, preferring the values in `self`.
///
@@ -145,7 +144,6 @@ macro_rules! impl_noop_combine {
}
impl_noop_combine!(SystemPathBuf);
impl_noop_combine!(PythonPlatform);
impl_noop_combine!(PythonVersion);
// std types

View File

@@ -15,8 +15,8 @@ ruff_db = { workspace = true, features = ["os"] }
ruff_text_size = { workspace = true }
ty_ide = { workspace = true }
ty_module_resolver = { workspace = true }
ty_project = { workspace = true }
ty_python_semantic = { workspace = true }
anyhow = { workspace = true }
bstr = { workspace = true }

View File

@@ -28,4 +28,4 @@ scope-simple-long-identifier,main.py,0,1
tstring-completions,main.py,0,1
ty-extensions-lower-stdlib,main.py,0,9
type-var-typing-over-ast,main.py,0,3
type-var-typing-over-ast,main.py,1,251
type-var-typing-over-ast,main.py,1,253

View File

@@ -15,11 +15,11 @@ use regex::bytes::Regex;
use ruff_db::files::system_path_to_file;
use ruff_db::system::{OsSystem, SystemPath, SystemPathBuf};
use ty_ide::Completion;
use ty_module_resolver::ModuleName;
use ty_project::metadata::Options;
use ty_project::metadata::options::EnvironmentOptions;
use ty_project::metadata::value::RelativePathBuf;
use ty_project::{ProjectDatabase, ProjectMetadata};
use ty_python_semantic::ModuleName;
#[derive(Debug, clap::Parser)]
#[command(

View File

@@ -22,7 +22,9 @@ ruff_python_importer = { workspace = true }
ruff_python_trivia = { workspace = true }
ruff_source_file = { workspace = true }
ruff_text_size = { workspace = true }
ty_module_resolver = { workspace = true }
ty_python_semantic = { workspace = true }
ty_python_types = { workspace = true }
ty_project = { workspace = true, features = ["testing"] }
ty_vendored = { workspace = true }

View File

@@ -1,6 +1,6 @@
use ruff_db::files::File;
use ty_module_resolver::{Module, ModuleName, all_modules, resolve_real_shadowable_module};
use ty_project::Db;
use ty_python_semantic::{Module, ModuleName, all_modules, resolve_real_shadowable_module};
use crate::{
SymbolKind,
@@ -21,7 +21,7 @@ pub fn all_symbols<'db>(
return Vec::new();
}
let typing_extensions = ModuleName::new("typing_extensions").unwrap();
let typing_extensions = ModuleName::new_static("typing_extensions").unwrap();
let is_typing_extensions_available = importing_from.is_stub(db)
|| resolve_real_shadowable_module(db, importing_from, &typing_extensions).is_some();

View File

@@ -7,7 +7,7 @@ use ruff_text_size::TextRange;
use ty_project::Db;
use ty_python_semantic::create_suppression_fix;
use ty_python_semantic::lint::LintId;
use ty_python_semantic::types::{UNDEFINED_REVEAL, UNRESOLVED_REFERENCE};
use ty_python_types::types::{UNDEFINED_REVEAL, UNRESOLVED_REFERENCE};
/// A `QuickFix` Code Action
#[derive(Debug, Clone)]
@@ -87,10 +87,8 @@ mod tests {
use ruff_python_trivia::textwrap::dedent;
use ruff_text_size::{TextRange, TextSize};
use ty_project::ProjectMetadata;
use ty_python_semantic::{
lint::LintMetadata,
types::{UNDEFINED_REVEAL, UNRESOLVED_REFERENCE},
};
use ty_python_semantic::lint::LintMetadata;
use ty_python_types::types::{UNDEFINED_REVEAL, UNRESOLVED_REFERENCE};
#[test]
fn add_ignore() {

View File

@@ -11,11 +11,9 @@ use ruff_python_ast::{self as ast, AnyNodeRef};
use ruff_python_codegen::Stylist;
use ruff_text_size::{Ranged, TextRange, TextSize};
use rustc_hash::FxHashSet;
use ty_python_semantic::types::UnionType;
use ty_python_semantic::{
Completion as SemanticCompletion, KnownModule, ModuleName, NameKind, SemanticModel,
types::{CycleDetector, KnownClass, Type},
};
use ty_module_resolver::{KnownModule, ModuleName};
use ty_python_types::types::{CycleDetector, KnownClass, Type, UnionType};
use ty_python_types::{Completion as SemanticCompletion, NameKind, SemanticModel};
use crate::docstring::Docstring;
use crate::goto::Definitions;
@@ -2106,7 +2104,7 @@ mod tests {
use insta::assert_snapshot;
use ruff_python_ast::token::{TokenKind, Tokens};
use ruff_python_parser::{Mode, ParseOptions};
use ty_python_semantic::ModuleName;
use ty_module_resolver::ModuleName;
use crate::completion::{Completion, completion};
use crate::tests::{CursorTest, CursorTestBuilder};

View File

@@ -3,7 +3,7 @@ use crate::references::{ReferencesMode, references};
use crate::{Db, ReferenceTarget};
use ruff_db::files::File;
use ruff_text_size::TextSize;
use ty_python_semantic::SemanticModel;
use ty_python_types::SemanticModel;
/// Find all document highlights for a symbol at the given position.
/// Document highlights are limited to the current file only.

View File

@@ -3,7 +3,7 @@ use crate::references::{ReferencesMode, references};
use crate::{Db, ReferenceTarget};
use ruff_db::files::File;
use ruff_text_size::TextSize;
use ty_python_semantic::SemanticModel;
use ty_python_types::SemanticModel;
/// Find all references to a symbol at the given position.
/// Search for references across all files in the project.

View File

@@ -12,12 +12,12 @@ use ruff_python_ast::token::{TokenKind, Tokens};
use ruff_python_ast::{self as ast, AnyNodeRef};
use ruff_text_size::{Ranged, TextRange, TextSize};
use ty_python_semantic::ResolvedDefinition;
use ty_python_semantic::types::Type;
use ty_python_semantic::types::ide_support::{
use ty_python_types::ResolvedDefinition;
use ty_python_types::types::Type;
use ty_python_types::types::ide_support::{
call_signature_details, call_type_simplified_by_overloads, definitions_for_keyword_argument,
};
use ty_python_semantic::{
use ty_python_types::{
HasDefinition, HasType, ImportAliasResolution, SemanticModel, definitions_for_imported_symbol,
definitions_for_name,
};
@@ -228,15 +228,15 @@ impl<'db> Definitions<'db> {
pub(crate) fn from_ty(db: &'db dyn crate::Db, ty: Type<'db>) -> Option<Self> {
let ty_def = ty.definition(db)?;
let resolved = match ty_def {
ty_python_semantic::types::TypeDefinition::Module(module) => {
ty_python_types::types::TypeDefinition::Module(module) => {
ResolvedDefinition::Module(module.file(db)?)
}
ty_python_semantic::types::TypeDefinition::Class(definition)
| ty_python_semantic::types::TypeDefinition::Function(definition)
| ty_python_semantic::types::TypeDefinition::TypeVar(definition)
| ty_python_semantic::types::TypeDefinition::TypeAlias(definition)
| ty_python_semantic::types::TypeDefinition::SpecialForm(definition)
| ty_python_semantic::types::TypeDefinition::NewType(definition) => {
ty_python_types::types::TypeDefinition::Class(definition)
| ty_python_types::types::TypeDefinition::Function(definition)
| ty_python_types::types::TypeDefinition::TypeVar(definition)
| ty_python_types::types::TypeDefinition::TypeAlias(definition)
| ty_python_types::types::TypeDefinition::SpecialForm(definition)
| ty_python_types::types::TypeDefinition::NewType(definition) => {
ResolvedDefinition::Definition(definition)
}
};
@@ -338,11 +338,11 @@ impl GotoTarget<'_> {
}
}
GotoTarget::BinOp { expression, .. } => {
let (_, ty) = ty_python_semantic::definitions_for_bin_op(model, expression)?;
let (_, ty) = ty_python_types::definitions_for_bin_op(model, expression)?;
Some(ty)
}
GotoTarget::UnaryOp { expression, .. } => {
let (_, ty) = ty_python_semantic::definitions_for_unary_op(model, expression)?;
let (_, ty) = ty_python_types::definitions_for_unary_op(model, expression)?;
Some(ty)
}
// TODO: Support identifier targets
@@ -526,15 +526,14 @@ impl GotoTarget<'_> {
}
GotoTarget::BinOp { expression, .. } => {
let (definitions, _) =
ty_python_semantic::definitions_for_bin_op(model, expression)?;
let (definitions, _) = ty_python_types::definitions_for_bin_op(model, expression)?;
Some(definitions)
}
GotoTarget::UnaryOp { expression, .. } => {
let (definitions, _) =
ty_python_semantic::definitions_for_unary_op(model, expression)?;
ty_python_types::definitions_for_unary_op(model, expression)?;
Some(definitions)
}
@@ -939,12 +938,12 @@ impl Ranged for GotoTarget<'_> {
/// Converts a collection of `ResolvedDefinition` items into `NavigationTarget` items.
fn convert_resolved_definitions_to_targets<'db>(
db: &'db dyn ty_python_semantic::Db,
definitions: Vec<ty_python_semantic::ResolvedDefinition<'db>>,
definitions: Vec<ty_python_types::ResolvedDefinition<'db>>,
) -> Vec<crate::NavigationTarget> {
definitions
.into_iter()
.map(|resolved_definition| match resolved_definition {
ty_python_semantic::ResolvedDefinition::Definition(definition) => {
ty_python_types::ResolvedDefinition::Definition(definition) => {
// Get the parsed module for range calculation
let definition_file = definition.file(db);
let module = ruff_db::parsed::parsed_module(db, definition_file).load(db);
@@ -959,11 +958,11 @@ fn convert_resolved_definitions_to_targets<'db>(
full_range: full_range.range(),
}
}
ty_python_semantic::ResolvedDefinition::Module(file) => {
ty_python_types::ResolvedDefinition::Module(file) => {
// For modules, navigate to the start of the file
crate::NavigationTarget::new(file, TextRange::default())
}
ty_python_semantic::ResolvedDefinition::FileWithRange(file_range) => {
ty_python_types::ResolvedDefinition::FileWithRange(file_range) => {
// For file ranges, navigate to the specific range within the file
crate::NavigationTarget::from(file_range)
}
@@ -984,9 +983,9 @@ fn definitions_for_expression<'db>(
expression.into(),
alias_resolution,
)),
ast::ExprRef::Attribute(attribute) => Some(ty_python_semantic::definitions_for_attribute(
model, attribute,
)),
ast::ExprRef::Attribute(attribute) => {
Some(ty_python_types::definitions_for_attribute(model, attribute))
}
_ => None,
}
}
@@ -1007,7 +1006,7 @@ fn definitions_for_callable<'db>(
fn definitions_to_navigation_targets<'db>(
db: &dyn ty_python_semantic::Db,
stub_mapper: Option<&StubMapper<'db>>,
mut definitions: Vec<ty_python_semantic::ResolvedDefinition<'db>>,
mut definitions: Vec<ty_python_types::ResolvedDefinition<'db>>,
) -> Option<crate::NavigationTargets> {
if let Some(mapper) = stub_mapper {
definitions = mapper.map_definitions(definitions);

View File

@@ -3,7 +3,7 @@ use crate::{Db, NavigationTargets, RangedValue};
use ruff_db::files::{File, FileRange};
use ruff_db::parsed::parsed_module;
use ruff_text_size::{Ranged, TextSize};
use ty_python_semantic::{ImportAliasResolution, SemanticModel};
use ty_python_types::{ImportAliasResolution, SemanticModel};
/// Navigate to the declaration of a symbol.
///

View File

@@ -3,7 +3,7 @@ use crate::{Db, NavigationTargets, RangedValue};
use ruff_db::files::{File, FileRange};
use ruff_db::parsed::parsed_module;
use ruff_text_size::{Ranged, TextSize};
use ty_python_semantic::{ImportAliasResolution, SemanticModel};
use ty_python_types::{ImportAliasResolution, SemanticModel};
/// Navigate to the definition of a symbol.
///

View File

@@ -3,7 +3,7 @@ use crate::{Db, HasNavigationTargets, NavigationTargets, RangedValue};
use ruff_db::files::{File, FileRange};
use ruff_db::parsed::parsed_module;
use ruff_text_size::{Ranged, TextSize};
use ty_python_semantic::SemanticModel;
use ty_python_types::SemanticModel;
pub fn goto_type_definition(
db: &dyn Db,
@@ -1339,13 +1339,13 @@ f(**kwargs<CURSOR>)
| ^^^^^^ Clicking here
|
info: Found 1 type definition
--> stdlib/builtins.pyi:2920:7
--> stdlib/builtins.pyi:2947:7
|
2919 | @disjoint_base
2920 | class dict(MutableMapping[_KT, _VT]):
2946 | @disjoint_base
2947 | class dict(MutableMapping[_KT, _VT]):
| ----
2921 | """dict() -> new empty dictionary
2922 | dict(mapping) -> new dictionary initialized from a mapping object's
2948 | """dict() -> new empty dictionary
2949 | dict(mapping) -> new dictionary initialized from a mapping object's
|
"#);
}

View File

@@ -6,8 +6,8 @@ use ruff_db::parsed::parsed_module;
use ruff_text_size::{Ranged, TextSize};
use std::fmt;
use std::fmt::Formatter;
use ty_python_semantic::types::{KnownInstanceType, Type, TypeVarVariance};
use ty_python_semantic::{DisplaySettings, SemanticModel};
use ty_python_types::types::{KnownInstanceType, Type, TypeVarVariance};
use ty_python_types::{DisplaySettings, SemanticModel};
pub fn hover(db: &dyn Db, file: File, offset: TextSize) -> Option<RangedValue<Hover<'_>>> {
let parsed = parsed_module(db, file).load(db);
@@ -23,7 +23,7 @@ pub fn hover(db: &dyn Db, file: File, offset: TextSize) -> Option<RangedValue<Ho
let docs = goto_target
.get_definition_targets(
&model,
ty_python_semantic::ImportAliasResolution::ResolveAliases,
ty_python_types::ImportAliasResolution::ResolveAliases,
)
.and_then(|definitions| definitions.docstring(db))
.map(HoverContent::Docstring);
@@ -2956,6 +2956,295 @@ def function():
assert_snapshot!(test.hover(), @"Hover provided no content");
}
#[test]
fn hover_func_with_concat_docstring() {
let test = cursor_test(
r#"
def a<CURSOR>b():
"""wow cool docs""" """and docs"""
return
"#,
);
assert_snapshot!(test.hover(), @r#"
def ab() -> Unknown
---------------------------------------------
wow cool docsand docs
---------------------------------------------
```python
def ab() -> Unknown
```
---
wow cool docsand docs
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | def ab():
| ^-
| ||
| |Cursor offset
| source
3 | """wow cool docs""" """and docs"""
4 | return
|
"#);
}
#[test]
fn hover_func_with_plus_docstring() {
let test = cursor_test(
r#"
def a<CURSOR>b():
"""wow cool docs""" + """and docs"""
return
"#,
);
assert_snapshot!(test.hover(), @r#"
def ab() -> Unknown
---------------------------------------------
```python
def ab() -> Unknown
```
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | def ab():
| ^-
| ||
| |Cursor offset
| source
3 | """wow cool docs""" + """and docs"""
4 | return
|
"#);
}
#[test]
fn hover_func_with_slash_docstring() {
let test = cursor_test(
r#"
def a<CURSOR>b():
"""wow cool docs""" \
"""and docs"""
return
"#,
);
assert_snapshot!(test.hover(), @r#"
def ab() -> Unknown
---------------------------------------------
wow cool docsand docs
---------------------------------------------
```python
def ab() -> Unknown
```
---
wow cool docsand docs
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | def ab():
| ^-
| ||
| |Cursor offset
| source
3 | """wow cool docs""" \
4 | """and docs"""
|
"#);
}
#[test]
fn hover_func_with_sameline_commented_docstring() {
let test = cursor_test(
r#"
def a<CURSOR>b():
"""wow cool docs""" # and a comment
"""and docs""" # that shouldn't be included
return
"#,
);
assert_snapshot!(test.hover(), @r#"
def ab() -> Unknown
---------------------------------------------
wow cool docs
---------------------------------------------
```python
def ab() -> Unknown
```
---
wow cool docs
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | def ab():
| ^-
| ||
| |Cursor offset
| source
3 | """wow cool docs""" # and a comment
4 | """and docs""" # that shouldn't be included
|
"#);
}
#[test]
fn hover_func_with_nextline_commented_docstring() {
let test = cursor_test(
r#"
def a<CURSOR>b():
"""wow cool docs"""
# and a comment that shouldn't be included
"""and docs"""
return
"#,
);
assert_snapshot!(test.hover(), @r#"
def ab() -> Unknown
---------------------------------------------
wow cool docs
---------------------------------------------
```python
def ab() -> Unknown
```
---
wow cool docs
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | def ab():
| ^-
| ||
| |Cursor offset
| source
3 | """wow cool docs"""
4 | # and a comment that shouldn't be included
|
"#);
}
#[test]
fn hover_func_with_parens_docstring() {
let test = cursor_test(
r#"
def a<CURSOR>b():
(
"""wow cool docs"""
"""and docs"""
)
return
"#,
);
assert_snapshot!(test.hover(), @r#"
def ab() -> Unknown
---------------------------------------------
wow cool docsand docs
---------------------------------------------
```python
def ab() -> Unknown
```
---
wow cool docsand docs
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | def ab():
| ^-
| ||
| |Cursor offset
| source
3 | (
4 | """wow cool docs"""
|
"#);
}
#[test]
fn hover_func_with_nextline_commented_parens_docstring() {
let test = cursor_test(
r#"
def a<CURSOR>b():
(
"""wow cool docs"""
# and a comment that shouldn't be included
"""and docs"""
)
return
"#,
);
assert_snapshot!(test.hover(), @r#"
def ab() -> Unknown
---------------------------------------------
wow cool docsand docs
---------------------------------------------
```python
def ab() -> Unknown
```
---
wow cool docsand docs
---------------------------------------------
info[hover]: Hovered content is
--> main.py:2:5
|
2 | def ab():
| ^-
| ||
| |Cursor offset
| source
3 | (
4 | """wow cool docs"""
|
"#);
}
#[test]
fn hover_attribute_docstring_spill() {
let test = cursor_test(
r#"
if True:
a<CURSOR>b = 1
"this shouldn't be a docstring but also it doesn't matter much"
"#,
);
assert_snapshot!(test.hover(), @r#"
Literal[1]
---------------------------------------------
```python
Literal[1]
```
---------------------------------------------
info[hover]: Hovered content is
--> main.py:3:5
|
2 | if True:
3 | ab = 1
| ^-
| ||
| |Cursor offset
| source
4 | "this shouldn't be a docstring but also it doesn't matter much"
|
"#);
}
#[test]
fn hover_class_typevar_variance() {
let test = cursor_test(

View File

@@ -29,10 +29,11 @@ use ruff_python_ast::visitor::source_order::{SourceOrderVisitor, TraversalSignal
use ruff_python_codegen::Stylist;
use ruff_python_importer::Insertion;
use ruff_text_size::{Ranged, TextRange, TextSize};
use ty_module_resolver::ModuleName;
use ty_project::Db;
use ty_python_semantic::semantic_index::definition::DefinitionKind;
use ty_python_semantic::types::Type;
use ty_python_semantic::{MemberDefinition, ModuleName, SemanticModel};
use ty_python_types::types::Type;
use ty_python_types::{MemberDefinition, SemanticModel};
pub(crate) struct Importer<'a> {
/// The ty Salsa database.
@@ -880,11 +881,10 @@ mod tests {
use ruff_python_codegen::Stylist;
use ruff_python_trivia::textwrap::dedent;
use ruff_text_size::TextSize;
use ty_module_resolver::SearchPathSettings;
use ty_project::ProjectMetadata;
use ty_python_semantic::{
Program, ProgramSettings, PythonPlatform, PythonVersionWithSource, SearchPathSettings,
SemanticModel,
};
use ty_python_semantic::{Program, ProgramSettings, PythonPlatform, PythonVersionWithSource};
use ty_python_types::SemanticModel;
use super::*;

View File

@@ -6,9 +6,9 @@ use ruff_db::parsed::parsed_module;
use ruff_python_ast::visitor::source_order::{self, SourceOrderVisitor, TraversalSignal};
use ruff_python_ast::{AnyNodeRef, ArgOrKeyword, Expr, ExprUnaryOp, Stmt, UnaryOp};
use ruff_text_size::{Ranged, TextRange, TextSize};
use ty_python_semantic::types::ide_support::inlay_hint_call_argument_details;
use ty_python_semantic::types::{Type, TypeDetail};
use ty_python_semantic::{HasType, SemanticModel};
use ty_python_types::types::ide_support::inlay_hint_call_argument_details;
use ty_python_types::types::{Type, TypeDetail};
use ty_python_types::{HasType, SemanticModel};
#[derive(Debug, Clone)]
pub struct InlayHint {
@@ -1241,12 +1241,12 @@ mod tests {
---------------------------------------------
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2695:7
--> stdlib/builtins.pyi:2722:7
|
2694 | @disjoint_base
2695 | class tuple(Sequence[_T_co]):
2721 | @disjoint_base
2722 | class tuple(Sequence[_T_co]):
| ^^^^^
2696 | """Built-in immutable sequence.
2723 | """Built-in immutable sequence.
|
info: Source
--> main2.py:8:5
@@ -1335,12 +1335,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2695:7
--> stdlib/builtins.pyi:2722:7
|
2694 | @disjoint_base
2695 | class tuple(Sequence[_T_co]):
2721 | @disjoint_base
2722 | class tuple(Sequence[_T_co]):
| ^^^^^
2696 | """Built-in immutable sequence.
2723 | """Built-in immutable sequence.
|
info: Source
--> main2.py:9:5
@@ -1391,12 +1391,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2695:7
--> stdlib/builtins.pyi:2722:7
|
2694 | @disjoint_base
2695 | class tuple(Sequence[_T_co]):
2721 | @disjoint_base
2722 | class tuple(Sequence[_T_co]):
| ^^^^^
2696 | """Built-in immutable sequence.
2723 | """Built-in immutable sequence.
|
info: Source
--> main2.py:10:5
@@ -2217,12 +2217,12 @@ mod tests {
---------------------------------------------
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:2:5
@@ -2270,12 +2270,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:3:5
@@ -2325,12 +2325,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:4:5
@@ -2364,13 +2364,13 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2591:7
--> stdlib/builtins.pyi:2618:7
|
2590 | @final
2591 | class bool(int):
2617 | @final
2618 | class bool(int):
| ^^^^
2592 | """Returns True when the argument is true, False otherwise.
2593 | The builtins True and False are the only two instances of the class bool.
2619 | """Returns True when the argument is true, False otherwise.
2620 | The builtins True and False are the only two instances of the class bool.
|
info: Source
--> main2.py:4:20
@@ -2384,12 +2384,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:5:5
@@ -2443,12 +2443,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:6:5
@@ -2502,12 +2502,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:7:5
@@ -2561,12 +2561,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:8:5
@@ -2620,12 +2620,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:9:5
@@ -2678,12 +2678,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:10:5
@@ -2737,12 +2737,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:11:5
@@ -2811,12 +2811,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:12:5
@@ -2925,12 +2925,12 @@ mod tests {
---------------------------------------------
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2695:7
--> stdlib/builtins.pyi:2722:7
|
2694 | @disjoint_base
2695 | class tuple(Sequence[_T_co]):
2721 | @disjoint_base
2722 | class tuple(Sequence[_T_co]):
| ^^^^^
2696 | """Built-in immutable sequence.
2723 | """Built-in immutable sequence.
|
info: Source
--> main2.py:7:5
@@ -3092,12 +3092,12 @@ mod tests {
---------------------------------------------
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:4:18
@@ -3110,12 +3110,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2695:7
--> stdlib/builtins.pyi:2722:7
|
2694 | @disjoint_base
2695 | class tuple(Sequence[_T_co]):
2721 | @disjoint_base
2722 | class tuple(Sequence[_T_co]):
| ^^^^^
2696 | """Built-in immutable sequence.
2723 | """Built-in immutable sequence.
|
info: Source
--> main2.py:5:18
@@ -3248,12 +3248,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2695:7
--> stdlib/builtins.pyi:2722:7
|
2694 | @disjoint_base
2695 | class tuple(Sequence[_T_co]):
2721 | @disjoint_base
2722 | class tuple(Sequence[_T_co]):
| ^^^^^
2696 | """Built-in immutable sequence.
2723 | """Built-in immutable sequence.
|
info: Source
--> main2.py:8:5
@@ -4250,12 +4250,12 @@ mod tests {
foo([x=]y[0])
---------------------------------------------
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:3:5
@@ -4303,12 +4303,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:4:5
@@ -5609,12 +5609,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:3:14
@@ -6046,13 +6046,13 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2591:7
--> stdlib/builtins.pyi:2618:7
|
2590 | @final
2591 | class bool(int):
2617 | @final
2618 | class bool(int):
| ^^^^
2592 | """Returns True when the argument is true, False otherwise.
2593 | The builtins True and False are the only two instances of the class bool.
2619 | """Returns True when the argument is true, False otherwise.
2620 | The builtins True and False are the only two instances of the class bool.
|
info: Source
--> main2.py:4:25
@@ -6100,12 +6100,12 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/builtins.pyi:2802:7
--> stdlib/builtins.pyi:2829:7
|
2801 | @disjoint_base
2802 | class list(MutableSequence[_T]):
2828 | @disjoint_base
2829 | class list(MutableSequence[_T]):
| ^^^^
2803 | """Built-in mutable sequence.
2830 | """Built-in mutable sequence.
|
info: Source
--> main2.py:4:49
@@ -6614,13 +6614,6 @@ mod tests {
3 | y[: type[T@f]] = x
| ^^^
|
---------------------------------------------
info[inlay-hint-edit]: File after edits
info: Source
def f[T](x: type[T]):
y: type[T@f] = x
"#);
}
@@ -6739,14 +6732,14 @@ mod tests {
A = TypeAliasType([name=]'A', [value=]str)
---------------------------------------------
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/typing.pyi:2032:26
--> stdlib/typing.pyi:2037:26
|
2030 | """
2031 |
2032 | def __new__(cls, name: str, value: Any, *, type_params: tuple[_TypeParameter, ...] = ()) -> Self: ...
2035 | """
2036 |
2037 | def __new__(cls, name: str, value: Any, *, type_params: tuple[_TypeParameter, ...] = ()) -> Self: ...
| ^^^^
2033 | @property
2034 | def __value__(self) -> Any: ... # AnnotationForm
2038 | @property
2039 | def __value__(self) -> Any: ... # AnnotationForm
|
info: Source
--> main2.py:3:20
@@ -6757,14 +6750,14 @@ mod tests {
|
info[inlay-hint-location]: Inlay Hint Target
--> stdlib/typing.pyi:2032:37
--> stdlib/typing.pyi:2037:37
|
2030 | """
2031 |
2032 | def __new__(cls, name: str, value: Any, *, type_params: tuple[_TypeParameter, ...] = ()) -> Self: ...
2035 | """
2036 |
2037 | def __new__(cls, name: str, value: Any, *, type_params: tuple[_TypeParameter, ...] = ()) -> Self: ...
| ^^^^^
2033 | @property
2034 | def __value__(self) -> Any: ... # AnnotationForm
2038 | @property
2039 | def __value__(self) -> Any: ... # AnnotationForm
|
info: Source
--> main2.py:3:32

View File

@@ -57,7 +57,7 @@ use ruff_text_size::{Ranged, TextRange};
use rustc_hash::FxHashSet;
use std::ops::{Deref, DerefMut};
use ty_project::Db;
use ty_python_semantic::types::{Type, TypeDefinition};
use ty_python_types::types::{Type, TypeDefinition};
/// Information associated with a text range.
#[derive(Debug, Copy, Clone, Eq, PartialEq, Hash)]

View File

@@ -20,7 +20,7 @@ use ruff_python_ast::{
visitor::source_order::{SourceOrderVisitor, TraversalSignal},
};
use ruff_text_size::{Ranged, TextRange};
use ty_python_semantic::{ImportAliasResolution, SemanticModel};
use ty_python_types::{ImportAliasResolution, SemanticModel};
/// Mode for references search behavior
#[derive(Debug, Clone, Copy, PartialEq, Eq)]

View File

@@ -3,7 +3,7 @@ use crate::references::{ReferencesMode, references};
use crate::{Db, ReferenceTarget};
use ruff_db::files::File;
use ruff_text_size::{Ranged, TextSize};
use ty_python_semantic::SemanticModel;
use ty_python_types::SemanticModel;
/// Returns the range of the symbol if it can be renamed, None if not.
pub fn can_rename(db: &dyn Db, file: File, offset: TextSize) -> Option<ruff_text_size::TextRange> {

File diff suppressed because it is too large


@@ -15,13 +15,13 @@ use ruff_python_ast::find_node::covering_node;
use ruff_python_ast::token::TokenKind;
use ruff_python_ast::{self as ast, AnyNodeRef};
use ruff_text_size::{Ranged, TextRange, TextSize};
use ty_python_semantic::ResolvedDefinition;
use ty_python_semantic::SemanticModel;
use ty_python_semantic::semantic_index::definition::Definition;
use ty_python_semantic::types::ide_support::{
use ty_python_types::ResolvedDefinition;
use ty_python_types::SemanticModel;
use ty_python_types::types::ide_support::{
CallSignatureDetails, call_signature_details, find_active_signature_from_details,
};
use ty_python_semantic::types::{ParameterKind, Type};
use ty_python_types::types::{ParameterKind, Type};
// TODO: We may want to add special-case handling for calls to constructors
// so the class docstring is used in place of (or in addition to) any docstring


@@ -1,6 +1,6 @@
use itertools::Either;
use ruff_db::system::SystemPathBuf;
use ty_python_semantic::{ResolvedDefinition, map_stub_definition};
use ty_python_types::{ResolvedDefinition, map_stub_definition};
use crate::cached_vendored_root;


@@ -14,8 +14,8 @@ use ruff_python_ast::name::{Name, UnqualifiedName};
use ruff_python_ast::visitor::source_order::{self, SourceOrderVisitor};
use ruff_text_size::{Ranged, TextRange};
use rustc_hash::{FxHashMap, FxHashSet};
use ty_module_resolver::{ModuleName, resolve_module};
use ty_project::Db;
use ty_python_semantic::{ModuleName, resolve_module};
use crate::completion::CompletionKind;


@@ -0,0 +1,38 @@
[package]
name = "ty_module_resolver"
version = "0.0.0"
publish = false
authors = { workspace = true }
edition = { workspace = true }
rust-version = { workspace = true }
homepage = { workspace = true }
documentation = { workspace = true }
repository = { workspace = true }
license = { workspace = true }
[dependencies]
ruff_db = { workspace = true }
ruff_memory_usage = { workspace = true }
ruff_python_ast = { workspace = true, features = ["salsa"] }
ruff_python_stdlib = { workspace = true }
camino = { workspace = true }
compact_str = { workspace = true }
get-size2 = { workspace = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
strum = { workspace = true }
strum_macros = { workspace = true }
[dev-dependencies]
ruff_db = { workspace = true, features = ["testing", "os"] }
ty_vendored = { workspace = true }
anyhow = { workspace = true }
insta = { workspace = true, features = ["filters"] }
tempfile = { workspace = true }
[lints]
workspace = true


(Binary image file changed; 36 KiB before and after.)


@@ -0,0 +1,124 @@
use ruff_db::Db as SourceDb;
use crate::resolve::SearchPaths;
#[salsa::db]
pub trait Db: SourceDb {
/// Returns the search paths for module resolution.
fn search_paths(&self) -> &SearchPaths;
}
#[cfg(test)]
pub(crate) mod tests {
use std::sync::{Arc, Mutex};
use ruff_db::Db as SourceDb;
use ruff_db::files::Files;
use ruff_db::system::{DbWithTestSystem, TestSystem};
use ruff_db::vendored::VendoredFileSystem;
use ruff_python_ast::PythonVersion;
use super::Db;
use crate::resolve::SearchPaths;
type Events = Arc<Mutex<Vec<salsa::Event>>>;
#[salsa::db]
#[derive(Clone)]
pub(crate) struct TestDb {
storage: salsa::Storage<Self>,
files: Files,
system: TestSystem,
vendored: VendoredFileSystem,
search_paths: Arc<SearchPaths>,
python_version: PythonVersion,
events: Events,
}
impl TestDb {
pub(crate) fn new() -> Self {
let events = Events::default();
Self {
storage: salsa::Storage::new(Some(Box::new({
let events = events.clone();
move |event| {
tracing::trace!("event: {event:?}");
let mut events = events.lock().unwrap();
events.push(event);
}
}))),
system: TestSystem::default(),
vendored: ty_vendored::file_system().clone(),
files: Files::default(),
search_paths: Arc::new(SearchPaths::empty(ty_vendored::file_system())),
python_version: PythonVersion::default(),
events,
}
}
pub(crate) fn with_search_paths(mut self, search_paths: SearchPaths) -> Self {
self.set_search_paths(search_paths);
self
}
pub(crate) fn with_python_version(mut self, python_version: PythonVersion) -> Self {
self.python_version = python_version;
self
}
pub(crate) fn set_search_paths(&mut self, search_paths: SearchPaths) {
search_paths.try_register_static_roots(self);
self.search_paths = Arc::new(search_paths);
}
/// Takes the salsa events.
pub(crate) fn take_salsa_events(&mut self) -> Vec<salsa::Event> {
let mut events = self.events.lock().unwrap();
std::mem::take(&mut *events)
}
/// Clears the salsa events.
pub(crate) fn clear_salsa_events(&mut self) {
self.take_salsa_events();
}
}
impl DbWithTestSystem for TestDb {
fn test_system(&self) -> &TestSystem {
&self.system
}
fn test_system_mut(&mut self) -> &mut TestSystem {
&mut self.system
}
}
#[salsa::db]
impl SourceDb for TestDb {
fn vendored(&self) -> &VendoredFileSystem {
&self.vendored
}
fn system(&self) -> &dyn ruff_db::system::System {
&self.system
}
fn files(&self) -> &Files {
&self.files
}
fn python_version(&self) -> PythonVersion {
self.python_version
}
}
#[salsa::db]
impl Db for TestDb {
fn search_paths(&self) -> &SearchPaths {
&self.search_paths
}
}
#[salsa::db]
impl salsa::Database for TestDb {}
}
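The `TestDb` above layers consuming `with_*` builder methods over a plain setter (`with_search_paths` delegates to `set_search_paths`). A minimal, dependency-free sketch of that pattern — with a hypothetical `TestDbSketch` standing in for the real salsa-backed struct — looks like:

```rust
// Sketch of the consuming-builder pattern used by `TestDb` above.
// Salsa storage and `SearchPaths` are replaced with plain fields for illustration.
#[derive(Debug, Default)]
struct TestDbSketch {
    python_version: (u8, u8),
    search_paths: Vec<String>,
}

impl TestDbSketch {
    fn new() -> Self {
        Self::default()
    }

    // Consuming builder: takes `mut self`, mutates, and returns `self` so calls chain.
    fn with_python_version(mut self, version: (u8, u8)) -> Self {
        self.python_version = version;
        self
    }

    // Builder method reusing the plain setter, mirroring `with_search_paths`.
    fn with_search_paths(mut self, paths: Vec<String>) -> Self {
        self.set_search_paths(paths);
        self
    }

    fn set_search_paths(&mut self, paths: Vec<String>) {
        self.search_paths = paths;
    }
}

fn main() {
    let db = TestDbSketch::new()
        .with_python_version((3, 8))
        .with_search_paths(vec!["src".to_string()]);
    assert_eq!(db.python_version, (3, 8));
    assert_eq!(db.search_paths, vec!["src".to_string()]);
    println!("ok");
}
```

Keeping the setter separate lets tests reconfigure an already-constructed database, which is exactly how the symlink tests later in this diff call `db.set_search_paths(...)` after `TestDb::new()`.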


@@ -1,24 +1,31 @@
use std::iter::FusedIterator;
pub use list::{all_modules, list_modules};
pub use module::KnownModule;
pub use module::Module;
pub use path::{SearchPath, SearchPathValidationError};
pub use resolver::SearchPaths;
pub(crate) use resolver::file_to_module;
pub use resolver::{
resolve_module, resolve_module_confident, resolve_real_module, resolve_real_module_confident,
resolve_real_shadowable_module,
};
use ruff_db::system::SystemPath;
use crate::Db;
pub(crate) use resolver::{ModuleResolveMode, SearchPathIterator, search_paths};
pub use db::Db;
pub use module::KnownModule;
pub use module::Module;
pub use module_name::{ModuleName, ModuleNameResolutionError};
pub use path::{SearchPath, SearchPathError};
pub use resolve::{
SearchPaths, file_to_module, resolve_module, resolve_module_confident, resolve_real_module,
resolve_real_module_confident, resolve_real_shadowable_module,
};
pub use settings::{MisconfigurationMode, SearchPathSettings, SearchPathSettingsError};
pub use typeshed::{
PyVersionRange, TypeshedVersions, TypeshedVersionsParseError, vendored_typeshed_versions,
};
pub use list::{all_modules, list_modules};
pub use resolve::{ModuleResolveMode, SearchPathIterator, search_paths};
mod db;
mod list;
mod module;
mod module_name;
mod path;
mod resolver;
mod resolve;
mod settings;
mod typeshed;
#[cfg(test)]


@@ -3,12 +3,10 @@ use std::collections::btree_map::{BTreeMap, Entry};
use ruff_python_ast::PythonVersion;
use crate::db::Db;
use crate::module::{Module, ModuleKind};
use crate::module_name::ModuleName;
use crate::program::Program;
use super::module::{Module, ModuleKind};
use super::path::{ModulePath, SearchPath, SystemOrVendoredPathRef};
use super::resolver::{ModuleResolveMode, ResolverContext, resolve_file_module, search_paths};
use crate::path::{ModulePath, SearchPath, SystemOrVendoredPathRef};
use crate::resolve::{ModuleResolveMode, ResolverContext, resolve_file_module, search_paths};
/// List all available modules, including all sub-modules, sorted in lexicographic order.
pub fn all_modules(db: &dyn Db) -> Vec<Module<'_>> {
@@ -100,7 +98,6 @@ fn list_modules_in<'db>(
/// in the same directory).
struct Lister<'db> {
db: &'db dyn Db,
program: Program,
search_path: &'db SearchPath,
modules: BTreeMap<&'db ModuleName, Module<'db>>,
}
@@ -109,10 +106,8 @@ impl<'db> Lister<'db> {
/// Create new state that can accumulate modules from a list
/// of file paths.
fn new(db: &'db dyn Db, search_path: &'db SearchPath) -> Lister<'db> {
let program = Program::get(db);
Lister {
db,
program,
search_path,
modules: BTreeMap::new(),
}
@@ -314,7 +309,7 @@ impl<'db> Lister<'db> {
/// Returns the Python version we want to perform module resolution
/// with.
fn python_version(&self) -> PythonVersion {
self.program.python_version(self.db)
self.db.python_version()
}
/// Constructs a resolver context for use with some APIs that require it.
@@ -382,20 +377,17 @@ mod tests {
use camino::{Utf8Component, Utf8Path};
use ruff_db::Db as _;
use ruff_db::files::{File, FilePath, FileRootKind};
use ruff_db::system::{
DbWithTestSystem, DbWithWritableSystem, OsSystem, SystemPath, SystemPathBuf,
};
use ruff_db::system::{DbWithTestSystem, DbWithWritableSystem, SystemPath, SystemPathBuf};
use ruff_db::testing::assert_function_query_was_not_run;
use ruff_python_ast::PythonVersion;
use crate::db::{Db, tests::TestDb};
use crate::module_resolver::module::Module;
use crate::module_resolver::resolver::{
use crate::module::Module;
use crate::resolve::{
ModuleResolveMode, ModuleResolveModeIngredient, dynamic_resolution_paths,
};
use crate::module_resolver::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use crate::program::{Program, ProgramSettings, SearchPathSettings};
use crate::{PythonPlatform, PythonVersionSource, PythonVersionWithSource};
use crate::settings::SearchPathSettings;
use crate::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use super::list_modules;
@@ -940,7 +932,7 @@ mod tests {
fn symlink() -> anyhow::Result<()> {
use anyhow::Context;
let mut db = TestDb::new();
let mut db = TestDb::new().with_python_version(PythonVersion::PY38);
let temp_dir = tempfile::TempDir::with_prefix("PREFIX-SENTINEL")?;
let root = temp_dir
@@ -948,7 +940,7 @@ mod tests {
.canonicalize()
.context("Failed to canonicalize temp dir")?;
let root = SystemPath::from_std_path(&root).unwrap();
db.use_system(OsSystem::new(root));
db.use_system(ruff_db::system::OsSystem::new(root));
let src = root.join("src");
let site_packages = root.join("site-packages");
@@ -965,26 +957,21 @@ mod tests {
std::fs::write(foo.as_std_path(), "")?;
std::os::unix::fs::symlink(foo.as_std_path(), bar.as_std_path())?;
db.files().try_add_root(&db, &src, FileRootKind::Project);
let settings = SearchPathSettings {
src_roots: vec![src.clone()],
custom_typeshed: Some(custom_typeshed),
site_packages_paths: vec![site_packages],
..SearchPathSettings::empty()
};
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource {
version: PythonVersion::PY38,
source: PythonVersionSource::default(),
},
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings {
custom_typeshed: Some(custom_typeshed),
site_packages_paths: vec![site_packages],
..SearchPathSettings::new(vec![src])
}
db.set_search_paths(
settings
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
},
);
db.files().try_add_root(&db, &src, FileRootKind::Project);
// From the original test in the "resolve this module"
// implementation, this test seems to symlink a Python module
// and assert that they are treated as two distinct modules.
@@ -1479,18 +1466,15 @@ not_a_directory
db.files()
.try_add_root(&db, SystemPath::new("/src"), FileRootKind::Project);
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings {
site_packages_paths: vec![venv_site_packages],
..SearchPathSettings::new(vec![src.to_path_buf()])
}
let settings = SearchPathSettings {
site_packages_paths: vec![venv_site_packages],
..SearchPathSettings::new(vec![src.to_path_buf()])
};
db.set_search_paths(
settings
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
},
);
insta::assert_debug_snapshot!(
@@ -1533,18 +1517,15 @@ not_a_directory
db.files()
.try_add_root(&db, SystemPath::new("/src"), FileRootKind::Project);
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings {
site_packages_paths: vec![venv_site_packages, system_site_packages],
..SearchPathSettings::new(vec![SystemPathBuf::from("/src")])
}
let settings = SearchPathSettings {
site_packages_paths: vec![venv_site_packages, system_site_packages],
..SearchPathSettings::new(vec![SystemPathBuf::from("/src")])
};
db.set_search_paths(
settings
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
},
);
// The editable installs discovered from the `.pth` file in the
@@ -1585,7 +1566,7 @@ not_a_directory
#[test]
#[cfg(unix)]
fn case_sensitive_resolution_with_symlinked_directory() -> anyhow::Result<()> {
use anyhow::Context;
use anyhow::Context as _;
let temp_dir = tempfile::TempDir::with_prefix("PREFIX-SENTINEL")?;
let root = SystemPathBuf::from_path_buf(
@@ -1602,7 +1583,7 @@ not_a_directory
let a_package_target = root.join("a-package");
let a_src = src.join("a");
db.use_system(OsSystem::new(&root));
db.use_system(ruff_db::system::OsSystem::new(&root));
db.write_file(
a_package_target.join("__init__.py"),
@@ -1621,16 +1602,11 @@ not_a_directory
db.files().try_add_root(&db, &root, FileRootKind::Project);
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings::new(vec![src])
.to_search_paths(db.system(), db.vendored())
.expect("valid search path settings"),
},
);
let settings = SearchPathSettings::new(vec![src]);
let search_paths = settings
.to_search_paths(db.system(), db.vendored())
.expect("valid search path settings");
db.set_search_paths(search_paths);
insta::with_settings!({
// Temporary directories often have random chars in them, so
@@ -1662,18 +1638,14 @@ not_a_directory
db.files()
.try_add_root(&db, &project_directory, FileRootKind::Project);
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings {
site_packages_paths: vec![site_packages],
..SearchPathSettings::new(vec![project_directory])
}
let settings = SearchPathSettings {
site_packages_paths: vec![site_packages],
..SearchPathSettings::new(vec![project_directory])
};
db.set_search_paths(
settings
.to_search_paths(db.system(), db.vendored())
.unwrap(),
},
);
insta::assert_debug_snapshot!(
@@ -1820,16 +1792,11 @@ not_a_directory
db.files()
.try_add_root(&db, SystemPath::new("/"), FileRootKind::Project);
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings::new(vec![project_directory])
.to_search_paths(db.system(), db.vendored())
.unwrap(),
},
);
let settings = SearchPathSettings::new(vec![project_directory]);
let search_paths = settings
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings");
db.set_search_paths(search_paths);
insta::assert_debug_snapshot!(
list_snapshot_filter(&db, |m| m.name(&db).as_str() == "foo"),


@@ -7,10 +7,9 @@ use ruff_db::vendored::VendoredPath;
use salsa::Database;
use salsa::plumbing::AsId;
use super::path::SearchPath;
use crate::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::path::SystemOrVendoredPathRef;
use crate::path::{SearchPath, SystemOrVendoredPathRef};
/// Representation of a Python module.
#[derive(Clone, Copy, Eq, Hash, PartialEq, salsa::Supertype, salsa::Update)]
@@ -330,16 +329,14 @@ pub enum KnownModule {
TyExtensions,
#[strum(serialize = "importlib")]
ImportLib,
#[cfg(test)]
#[strum(serialize = "unittest.mock")]
UnittestMock,
#[cfg(test)]
Uuid,
Warnings,
}
impl KnownModule {
pub(crate) const fn as_str(self) -> &'static str {
pub const fn as_str(self) -> &'static str {
match self {
Self::Builtins => "builtins",
Self::Enum => "enum",
@@ -360,23 +357,18 @@ impl KnownModule {
Self::TyExtensions => "ty_extensions",
Self::ImportLib => "importlib",
Self::Warnings => "warnings",
#[cfg(test)]
Self::UnittestMock => "unittest.mock",
#[cfg(test)]
Self::Uuid => "uuid",
Self::Templatelib => "string.templatelib",
}
}
pub(crate) fn name(self) -> ModuleName {
pub fn name(self) -> ModuleName {
ModuleName::new_static(self.as_str())
.unwrap_or_else(|| panic!("{self} should be a valid module name!"))
}
pub(crate) fn try_from_search_path_and_name(
search_path: &SearchPath,
name: &ModuleName,
) -> Option<Self> {
fn try_from_search_path_and_name(search_path: &SearchPath, name: &ModuleName) -> Option<Self> {
if search_path.is_standard_library() {
Self::from_str(name.as_str()).ok()
} else {
@@ -384,23 +376,23 @@ impl KnownModule {
}
}
pub(crate) const fn is_builtins(self) -> bool {
pub const fn is_builtins(self) -> bool {
matches!(self, Self::Builtins)
}
pub(crate) const fn is_typing(self) -> bool {
pub const fn is_typing(self) -> bool {
matches!(self, Self::Typing)
}
pub(crate) const fn is_ty_extensions(self) -> bool {
pub const fn is_ty_extensions(self) -> bool {
matches!(self, Self::TyExtensions)
}
pub(crate) const fn is_inspect(self) -> bool {
pub const fn is_inspect(self) -> bool {
matches!(self, Self::Inspect)
}
pub(crate) const fn is_importlib(self) -> bool {
pub const fn is_importlib(self) -> bool {
matches!(self, Self::ImportLib)
}
}
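`KnownModule` pairs a `const fn as_str` (enum to string) with a strum-derived `FromStr` (string to enum, used by `try_from_search_path_and_name` via `Self::from_str(...)`). A dependency-free sketch of that round-trip, using a hypothetical three-variant enum rather than the real `KnownModule`:

```rust
use std::str::FromStr;

// Hand-written stand-in for the strum-derived mapping on `KnownModule`.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum KnownModuleSketch {
    Builtins,
    Typing,
    ImportLib,
}

impl KnownModuleSketch {
    // Enum -> canonical module-name string.
    const fn as_str(self) -> &'static str {
        match self {
            Self::Builtins => "builtins",
            Self::Typing => "typing",
            Self::ImportLib => "importlib",
        }
    }
}

impl FromStr for KnownModuleSketch {
    type Err = ();

    // String -> enum; unknown names fail, matching `from_str(...).ok()` above.
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "builtins" => Ok(Self::Builtins),
            "typing" => Ok(Self::Typing),
            "importlib" => Ok(Self::ImportLib),
            _ => Err(()),
        }
    }
}

fn main() {
    assert_eq!(KnownModuleSketch::Typing.as_str(), "typing");
    assert_eq!("importlib".parse(), Ok(KnownModuleSketch::ImportLib));
    assert!("not_a_module".parse::<KnownModuleSketch>().is_err());
    println!("ok");
}
```

Strum generates the `FromStr` half from the `#[strum(serialize = "...")]` attributes, which is why the diff only needs to maintain the `as_str` match by hand.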


@@ -8,7 +8,8 @@ use ruff_db::files::File;
use ruff_python_ast as ast;
use ruff_python_stdlib::identifiers::is_identifier;
use crate::{db::Db, module_resolver::file_to_module};
use crate::db::Db;
use crate::resolve::file_to_module;
/// A module name, e.g. `foo.bar`.
///
@@ -47,7 +48,7 @@ impl ModuleName {
/// ## Examples
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// assert_eq!(ModuleName::new_static("foo.bar").as_deref(), Some("foo.bar"));
/// assert_eq!(ModuleName::new_static(""), None);
@@ -73,7 +74,7 @@ impl ModuleName {
/// # Examples
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// assert_eq!(ModuleName::new_static("foo.bar.baz").unwrap().components().collect::<Vec<_>>(), vec!["foo", "bar", "baz"]);
/// ```
@@ -87,7 +88,7 @@ impl ModuleName {
/// # Examples
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// assert_eq!(ModuleName::new_static("foo.bar").unwrap().parent(), Some(ModuleName::new_static("foo").unwrap()));
/// assert_eq!(ModuleName::new_static("foo.bar.baz").unwrap().parent(), Some(ModuleName::new_static("foo.bar").unwrap()));
@@ -106,7 +107,7 @@ impl ModuleName {
/// # Examples
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// assert!(ModuleName::new_static("foo.bar").unwrap().starts_with(&ModuleName::new_static("foo").unwrap()));
///
@@ -142,7 +143,7 @@ impl ModuleName {
/// module name:
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// let this = ModuleName::new_static("importlib.resources").unwrap();
/// let parent = ModuleName::new_static("importlib").unwrap();
@@ -156,7 +157,7 @@ impl ModuleName {
/// This shows some cases where it isn't a parent:
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// let this = ModuleName::new_static("importliblib.resources").unwrap();
/// let parent = ModuleName::new_static("importlib").unwrap();
@@ -199,7 +200,7 @@ impl ModuleName {
/// # Examples
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// assert_eq!(&*ModuleName::from_components(["a"]).unwrap(), "a");
/// assert_eq!(&*ModuleName::from_components(["a", "b"]).unwrap(), "a.b");
@@ -240,7 +241,7 @@ impl ModuleName {
/// # Examples
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// let mut module_name = ModuleName::new_static("foo").unwrap();
/// module_name.extend(&ModuleName::new_static("bar").unwrap());
@@ -258,7 +259,7 @@ impl ModuleName {
/// # Examples
///
/// ```
/// use ty_python_semantic::ModuleName;
/// use ty_module_resolver::ModuleName;
///
/// assert_eq!(
/// ModuleName::new_static("foo.bar.baz").unwrap().ancestors().collect::<Vec<_>>(),
@@ -314,7 +315,7 @@ impl ModuleName {
/// Computes the absolute module name for the package this file belongs to.
///
/// i.e. this resolves `.`
pub(crate) fn package_for_file(
pub fn package_for_file(
db: &dyn Db,
importing_file: File,
) -> Result<Self, ModuleNameResolutionError> {
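The doctests above (only their `use` paths changed in this diff) pin down `ModuleName`'s dotted-name semantics: empty names are rejected, `parent` drops the last component, and ancestors walk toward the root. A self-contained sketch reproducing those behaviors — a toy `DottedName` over a plain `String`, with validation and interning omitted, and with strict ancestors (excluding the name itself) as this sketch's own choice:

```rust
// Toy dotted module name mirroring the `ModuleName` doctests above.
#[derive(Debug, Clone, PartialEq, Eq)]
struct DottedName(String);

impl DottedName {
    // Reject the empty string, like `ModuleName::new_static("")` -> None.
    fn new(name: &str) -> Option<Self> {
        if name.is_empty() {
            None
        } else {
            Some(Self(name.to_string()))
        }
    }

    // "foo.bar.baz" -> "foo.bar"; a top-level name has no parent.
    fn parent(&self) -> Option<Self> {
        let (head, _) = self.0.rsplit_once('.')?;
        Some(Self(head.to_string()))
    }

    // Strict ancestors, nearest first: "foo.bar.baz" -> ["foo.bar", "foo"].
    fn ancestors(&self) -> Vec<Self> {
        let mut out = Vec::new();
        let mut cur = self.clone();
        while let Some(parent) = cur.parent() {
            out.push(parent.clone());
            cur = parent;
        }
        out
    }
}

fn main() {
    let name = DottedName::new("foo.bar.baz").unwrap();
    assert_eq!(name.parent(), DottedName::new("foo.bar"));
    assert_eq!(
        name.ancestors(),
        vec![
            DottedName::new("foo.bar").unwrap(),
            DottedName::new("foo").unwrap(),
        ]
    );
    assert_eq!(DottedName::new(""), None);
    println!("ok");
}
```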


@@ -8,11 +8,10 @@ use ruff_db::files::{File, FileError, FilePath, system_path_to_file, vendored_pa
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ruff_db::vendored::{VendoredPath, VendoredPathBuf};
use super::typeshed::{TypeshedVersionsParseError, TypeshedVersionsQueryResult, typeshed_versions};
use crate::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::resolver::{PyTyped, ResolverContext};
use crate::site_packages::SitePackagesDiscoveryError;
use crate::resolve::{PyTyped, ResolverContext};
use crate::typeshed::{TypeshedVersionsQueryResult, typeshed_versions};
/// A path that points to a Python module.
///
@@ -397,13 +396,8 @@ fn query_stdlib_version(
typeshed_versions(*db).query_module(&module_name, *python_version)
}
/// Enumeration describing the various ways in which validation of a search path might fail.
///
/// If validation fails for a search path derived from the user settings,
/// a message must be displayed to the user,
/// as type checking cannot be done reliably in these circumstances.
#[derive(Debug, thiserror::Error)]
pub enum SearchPathValidationError {
pub enum SearchPathError {
/// The path provided by the user was not a directory
#[error("{0} does not point to a directory")]
NotADirectory(SystemPathBuf),
@@ -413,41 +407,9 @@ pub enum SearchPathValidationError {
/// (This is only relevant for stdlib search paths.)
#[error("The directory at {0} has no `stdlib/` subdirectory")]
NoStdlibSubdirectory(SystemPathBuf),
/// The typeshed path provided by the user is a directory,
/// but `stdlib/VERSIONS` could not be read.
/// (This is only relevant for stdlib search paths.)
#[error("Failed to read the custom typeshed versions file '{path}'")]
FailedToReadVersionsFile {
path: SystemPathBuf,
#[source]
error: std::io::Error,
},
/// The path provided by the user is a directory,
/// and a `stdlib/VERSIONS` file exists, but it fails to parse.
/// (This is only relevant for stdlib search paths.)
#[error(transparent)]
VersionsParseError(TypeshedVersionsParseError),
/// Failed to discover the site-packages for the configured virtual environment.
#[error("Failed to discover the site-packages directory")]
SitePackagesDiscovery(#[source] SitePackagesDiscoveryError),
}
impl From<TypeshedVersionsParseError> for SearchPathValidationError {
fn from(value: TypeshedVersionsParseError) -> Self {
Self::VersionsParseError(value)
}
}
impl From<SitePackagesDiscoveryError> for SearchPathValidationError {
fn from(value: SitePackagesDiscoveryError) -> Self {
Self::SitePackagesDiscovery(value)
}
}
type SearchPathResult<T> = Result<T, SearchPathValidationError>;
type SearchPathResult<T> = Result<T, SearchPathError>;
#[derive(Debug, Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
enum SearchPathInner {
@@ -495,7 +457,7 @@ impl SearchPath {
if system.is_directory(&root) {
Ok(root)
} else {
Err(SearchPathValidationError::NotADirectory(root))
Err(SearchPathError::NotADirectory(root))
}
}
@@ -519,17 +481,15 @@ impl SearchPath {
typeshed: &SystemPath,
) -> SearchPathResult<Self> {
if !system.is_directory(typeshed) {
return Err(SearchPathValidationError::NotADirectory(
typeshed.to_path_buf(),
));
return Err(SearchPathError::NotADirectory(typeshed.to_path_buf()));
}
let stdlib =
Self::directory_path(system, typeshed.join("stdlib")).map_err(|err| match err {
SearchPathValidationError::NotADirectory(_) => {
SearchPathValidationError::NoStdlibSubdirectory(typeshed.to_path_buf())
SearchPathError::NotADirectory(_) => {
SearchPathError::NoStdlibSubdirectory(typeshed.to_path_buf())
}
err => err,
SearchPathError::NoStdlibSubdirectory(_) => err,
})?;
Ok(Self(Arc::new(SearchPathInner::StandardLibraryCustom(
@@ -585,7 +545,7 @@ impl SearchPath {
/// Does this search path point to the standard library?
#[must_use]
pub(crate) fn is_standard_library(&self) -> bool {
pub fn is_standard_library(&self) -> bool {
matches!(
&*self.0,
SearchPathInner::StandardLibraryCustom(_)
@@ -680,7 +640,7 @@ impl SearchPath {
}
#[must_use]
pub(crate) fn as_system_path(&self) -> Option<&SystemPath> {
pub fn as_system_path(&self) -> Option<&SystemPath> {
self.as_path().as_system_path()
}
@@ -709,7 +669,7 @@ impl SearchPath {
/// Returns a string suitable for describing what kind of search path this is
/// in user-facing diagnostics.
#[must_use]
pub(crate) fn describe_kind(&self) -> &'static str {
pub fn describe_kind(&self) -> &'static str {
match *self.0 {
SearchPathInner::Extra(_) => {
"extra search path specified on the CLI or in your config file"
@@ -869,8 +829,8 @@ mod tests {
use ruff_python_ast::PythonVersion;
use crate::db::tests::TestDb;
use crate::module_resolver::resolver::ModuleResolveMode;
use crate::module_resolver::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use crate::resolve::ModuleResolveMode;
use crate::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use super::*;


@@ -48,13 +48,11 @@ use ruff_python_ast::{
};
use crate::db::Db;
use crate::module::{Module, ModuleKind};
use crate::module_name::ModuleName;
use crate::module_resolver::typeshed::{TypeshedVersions, vendored_typeshed_versions};
use crate::program::MisconfigurationMode;
use crate::{Program, SearchPathSettings};
use super::module::{Module, ModuleKind};
use super::path::{ModulePath, SearchPath, SearchPathValidationError, SystemOrVendoredPathRef};
use crate::path::{ModulePath, SearchPath, SystemOrVendoredPathRef};
use crate::typeshed::{TypeshedVersions, vendored_typeshed_versions};
use crate::{MisconfigurationMode, SearchPathSettings, SearchPathSettingsError};
/// Resolves a module name to a module.
pub fn resolve_module<'db>(
@@ -137,7 +135,7 @@ pub fn resolve_real_shadowable_module<'db>(
/// Which files should be visible when doing a module query
#[derive(Debug, Copy, Clone, PartialEq, Eq, PartialOrd, Ord, Hash, get_size2::GetSize)]
#[allow(clippy::enum_variant_names)]
pub(crate) enum ModuleResolveMode {
pub enum ModuleResolveMode {
/// Stubs are allowed to appear.
///
/// This is the "normal" mode almost everything uses, as type checkers are in fact supposed
@@ -326,7 +324,7 @@ pub(crate) fn path_to_module<'db>(db: &'db dyn Db, path: &FilePath) -> Option<Mo
/// This intuition is particularly useful for understanding why it's correct that we pass
/// the file itself as `importing_file` to various subroutines.
#[salsa::tracked(heap_size=ruff_memory_usage::heap_size)]
pub(crate) fn file_to_module(db: &dyn Db, file: File) -> Option<Module<'_>> {
pub fn file_to_module(db: &dyn Db, file: File) -> Option<Module<'_>> {
let _span = tracing::trace_span!("file_to_module", ?file).entered();
let path = SystemOrVendoredPathRef::try_from_file(db, file)?;
@@ -392,8 +390,8 @@ fn file_to_module_impl<'db, 'a>(
None
}
pub(crate) fn search_paths(db: &dyn Db, resolve_mode: ModuleResolveMode) -> SearchPathIterator<'_> {
Program::get(db).search_paths(db).iter(db, resolve_mode)
pub fn search_paths(db: &dyn Db, resolve_mode: ModuleResolveMode) -> SearchPathIterator<'_> {
db.search_paths().iter(db, resolve_mode)
}
/// Get the search-paths for desperate resolution of absolute imports in this file.
@@ -554,11 +552,11 @@ impl SearchPaths {
/// This method also implements the typing spec's [module resolution order].
///
/// [module resolution order]: https://typing.python.org/en/latest/spec/distributing.html#import-resolution-ordering
pub(crate) fn from_settings(
pub fn from_settings(
settings: &SearchPathSettings,
system: &dyn System,
vendored: &VendoredFileSystem,
) -> Result<Self, SearchPathValidationError> {
) -> Result<Self, SearchPathSettingsError> {
fn canonicalize(path: &SystemPath, system: &dyn System) -> SystemPathBuf {
system
.canonicalize_path(path)
@@ -586,7 +584,7 @@ impl SearchPaths {
if *misconfiguration_mode == MisconfigurationMode::UseDefault {
tracing::debug!("Skipping invalid extra search-path: {err}");
} else {
return Err(err);
return Err(err.into());
}
}
}
@@ -600,7 +598,7 @@ impl SearchPaths {
if *misconfiguration_mode == MisconfigurationMode::UseDefault {
tracing::debug!("Skipping invalid first-party search-path: {err}");
} else {
return Err(err);
return Err(err.into());
}
}
}
@@ -614,12 +612,10 @@ impl SearchPaths {
let results = system
.read_to_string(&versions_path)
.map_err(
|error| SearchPathValidationError::FailedToReadVersionsFile {
path: versions_path,
error,
},
)
.map_err(|error| SearchPathSettingsError::FailedToReadVersionsFile {
path: versions_path,
error,
})
.and_then(|versions_content| Ok(versions_content.parse()?))
.and_then(|parsed| Ok((parsed, SearchPath::custom_stdlib(system, &typeshed)?)));
@@ -653,7 +649,7 @@ impl SearchPaths {
tracing::debug!("Skipping invalid real-stdlib search-path: {err}");
None
} else {
return Err(err);
return Err(err.into());
}
}
}
@@ -669,9 +665,9 @@ impl SearchPaths {
Ok(path) => site_packages.push(path),
Err(err) => {
if settings.misconfiguration_mode == MisconfigurationMode::UseDefault {
tracing::debug!("Skipping invalid real-stdlib search-path: {err}");
tracing::debug!("Skipping invalid site-packages search-path: {err}");
} else {
return Err(err);
return Err(err.into());
}
}
}
@@ -709,13 +705,11 @@ impl SearchPaths {
// preserve this behaviour to avoid getting into the weeds of corner cases.)
let stdlib_path_is_shadowed = stdlib_path
.as_system_path()
.map(|path| seen_paths.contains(path))
.unwrap_or(false);
.is_some_and(|path| seen_paths.contains(path));
let real_stdlib_path_is_shadowed = real_stdlib_path
.as_ref()
.and_then(SearchPath::as_system_path)
.map(|path| seen_paths.contains(path))
.unwrap_or(false);
.is_some_and(|path| seen_paths.contains(path));
let stdlib_path = if stdlib_path_is_shadowed {
None
@@ -737,8 +731,21 @@ impl SearchPaths {
})
}
/// Returns a new `SearchPaths` with no search paths configured.
///
/// This is primarily useful for testing.
pub fn empty(vendored: &VendoredFileSystem) -> Self {
Self {
static_paths: vec![],
stdlib_path: Some(SearchPath::vendored_stdlib()),
real_stdlib_path: None,
site_packages: vec![],
typeshed_versions: vendored_typeshed_versions(vendored),
}
}
/// Registers the file roots for all non-dynamically discovered search paths that aren't first-party.
pub(crate) fn try_register_static_roots(&self, db: &dyn Db) {
pub fn try_register_static_roots(&self, db: &dyn Db) {
let files = db.files();
for path in self
.static_paths
@@ -779,13 +786,13 @@ impl SearchPaths {
}
}
pub(crate) fn custom_stdlib(&self) -> Option<&SystemPath> {
pub fn custom_stdlib(&self) -> Option<&SystemPath> {
self.stdlib_path
.as_ref()
.and_then(SearchPath::as_system_path)
}
pub(crate) fn typeshed_versions(&self) -> &TypeshedVersions {
pub fn typeshed_versions(&self) -> &TypeshedVersions {
&self.typeshed_versions
}
}
@@ -811,7 +818,7 @@ pub(crate) fn dynamic_resolution_paths<'db>(
site_packages,
typeshed_versions: _,
real_stdlib_path,
} = Program::get(db).search_paths(db);
} = db.search_paths();
let mut dynamic_paths = Vec::new();
@@ -932,7 +939,7 @@ pub(crate) fn dynamic_resolution_paths<'db>(
/// are only calculated lazily.
///
/// [`sys.path` at runtime]: https://docs.python.org/3/library/site.html#module-site
pub(crate) struct SearchPathIterator<'db> {
pub struct SearchPathIterator<'db> {
db: &'db dyn Db,
static_paths: std::slice::Iter<'db, SearchPath>,
stdlib_path: Option<&'db SearchPath>,
@@ -1101,8 +1108,7 @@ fn resolve_name_impl<'a>(
mode: ModuleResolveMode,
search_paths: impl Iterator<Item = &'a SearchPath>,
) -> Option<ResolvedName> {
let program = Program::get(db);
let python_version = program.python_version(db);
let python_version = db.python_version();
let resolver_state = ResolverContext::new(db, python_version, mode);
let is_non_shadowable = mode.is_non_shadowable(python_version.minor, name.as_str());
@@ -1740,10 +1746,9 @@ mod tests {
use ruff_python_ast::PythonVersion;
use crate::db::tests::TestDb;
use crate::module::ModuleKind;
use crate::module_name::ModuleName;
use crate::module_resolver::module::ModuleKind;
use crate::module_resolver::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use crate::{ProgramSettings, PythonPlatform, PythonVersionWithSource};
use crate::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use super::*;
@@ -2267,15 +2272,11 @@ mod tests {
#[cfg(target_family = "unix")]
fn symlink() -> anyhow::Result<()> {
use anyhow::Context;
use crate::{
PythonPlatform, PythonVersionSource, PythonVersionWithSource, program::Program,
};
use ruff_db::system::{OsSystem, SystemPath};
use crate::db::tests::TestDb;
let mut db = TestDb::new();
let mut db = TestDb::new().with_python_version(PythonVersion::PY38);
let temp_dir = tempfile::tempdir()?;
let root = temp_dir
@@ -2300,22 +2301,15 @@ mod tests {
std::fs::write(foo.as_std_path(), "")?;
std::os::unix::fs::symlink(foo.as_std_path(), bar.as_std_path())?;
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource {
version: PythonVersion::PY38,
source: PythonVersionSource::default(),
},
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings {
custom_typeshed: Some(custom_typeshed),
site_packages_paths: vec![site_packages],
..SearchPathSettings::new(vec![src.clone()])
}
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
},
db.set_search_paths(
SearchPathSettings {
src_roots: vec![src.clone()],
custom_typeshed: Some(custom_typeshed),
site_packages_paths: vec![site_packages],
..SearchPathSettings::empty()
}
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
);
let foo_module =
@@ -2848,18 +2842,13 @@ not_a_directory
])
.unwrap();
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings {
site_packages_paths: vec![venv_site_packages, system_site_packages],
..SearchPathSettings::new(vec![SystemPathBuf::from("/src")])
}
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
},
db.set_search_paths(
SearchPathSettings {
site_packages_paths: vec![venv_site_packages, system_site_packages],
..SearchPathSettings::new(vec![SystemPathBuf::from("/src")])
}
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
);
// The editable installs discovered from the `.pth` file in the first `site-packages` directory
@@ -2924,15 +2913,10 @@ not_a_directory
std::os::unix::fs::symlink(a_package_target.as_std_path(), a_src.as_std_path())
.context("Failed to symlink `src/a` to `a-package`")?;
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings::new(vec![src])
.to_search_paths(db.system(), db.vendored())
.expect("valid search path settings"),
},
db.set_search_paths(
SearchPathSettings::new(vec![src])
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings"),
);
// Now try to resolve the module `A` (note the capital `A` instead of `a`).
@@ -2964,19 +2948,14 @@ not_a_directory
let mut db = TestDb::new();
db.write_file(&installed_foo_module, "").unwrap();
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource::default(),
python_platform: PythonPlatform::default(),
search_paths: SearchPathSettings {
site_packages_paths: vec![site_packages.clone()],
..SearchPathSettings::new(vec![project_directory])
}
.to_search_paths(db.system(), db.vendored())
.unwrap(),
},
);
let search_paths = SearchPathSettings {
src_roots: vec![project_directory],
site_packages_paths: vec![site_packages.clone()],
..SearchPathSettings::empty()
}
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings");
db.set_search_paths(search_paths);
let foo_module_file = File::new(&db, FilePath::System(installed_foo_module));
let module = file_to_module(&db, foo_module_file).unwrap();


@@ -0,0 +1,105 @@
//! Search path configuration settings.
use ruff_db::system::{System, SystemPathBuf};
use ruff_db::vendored::VendoredFileSystem;
use crate::path::SearchPathError;
use crate::resolve::SearchPaths;
use crate::typeshed::TypeshedVersionsParseError;
/// How to handle apparent misconfiguration
#[derive(PartialEq, Eq, Debug, Copy, Clone, Default, get_size2::GetSize)]
pub enum MisconfigurationMode {
/// Settings Failure Is Not An Error.
///
/// This is used by the default database, which we are incentivized to make infallible,
/// while still trying to "do our best" to set things up properly where we can.
UseDefault,
/// Settings Failure Is An Error.
#[default]
Fail,
}
/// Configures the search paths for module resolution.
#[derive(Eq, PartialEq, Debug, Clone)]
pub struct SearchPathSettings {
/// List of user-provided paths that should take first priority in module resolution.
/// Examples in other type checkers include mypy's `MYPYPATH` environment variable
/// and pyright's `stubPath` configuration setting.
pub extra_paths: Vec<SystemPathBuf>,
/// The root of the project, used for finding first-party modules.
pub src_roots: Vec<SystemPathBuf>,
/// Optional path to a "custom typeshed" directory on disk for us to use for standard-library types.
/// If this is not provided, we will fall back to our vendored typeshed stubs for the stdlib,
/// bundled as a zip file in the binary.
pub custom_typeshed: Option<SystemPathBuf>,
/// List of site packages paths to use.
pub site_packages_paths: Vec<SystemPathBuf>,
/// Optional path to the real stdlib on the system, rather than some instance of typeshed.
///
/// We should ideally only ever use this for things like goto-definition,
/// where typeshed isn't the right answer.
pub real_stdlib_path: Option<SystemPathBuf>,
/// How to handle apparent misconfiguration
pub misconfiguration_mode: MisconfigurationMode,
}
impl SearchPathSettings {
pub fn new(src_roots: Vec<SystemPathBuf>) -> Self {
Self {
src_roots,
..SearchPathSettings::empty()
}
}
pub fn empty() -> Self {
SearchPathSettings {
src_roots: vec![],
extra_paths: vec![],
custom_typeshed: None,
site_packages_paths: vec![],
real_stdlib_path: None,
misconfiguration_mode: MisconfigurationMode::Fail,
}
}
pub fn to_search_paths(
&self,
system: &dyn System,
vendored: &VendoredFileSystem,
) -> Result<SearchPaths, SearchPathSettingsError> {
SearchPaths::from_settings(self, system, vendored)
}
}
/// Enumeration describing the various ways in which validation of the search paths options might fail.
///
/// If validation fails for a search path derived from the user settings,
/// a message must be displayed to the user,
/// as type checking cannot be done reliably in these circumstances.
#[derive(Debug, thiserror::Error)]
pub enum SearchPathSettingsError {
#[error(transparent)]
InvalidSearchPath(#[from] SearchPathError),
/// The typeshed path provided by the user is a directory,
/// but `stdlib/VERSIONS` could not be read.
/// (This is only relevant for stdlib search paths.)
#[error("Failed to read the custom typeshed versions file '{path}'")]
FailedToReadVersionsFile {
path: SystemPathBuf,
#[source]
error: std::io::Error,
},
/// The path provided by the user is a directory,
/// and a `stdlib/VERSIONS` file exists, but it fails to parse.
/// (This is only relevant for stdlib search paths.)
#[error(transparent)]
VersionsParseError(#[from] TypeshedVersionsParseError),
}
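The `SearchPathSettings::new`/`empty` pair above builds on Rust's struct-update syntax: `new` overrides one field and inherits the rest from a fully-defaulted baseline. A minimal self-contained sketch of that pattern (the `Settings` struct and its fields here are illustrative stand-ins, not the real ty types):

```rust
// Illustrative stand-in for a settings struct built via struct-update syntax.
#[derive(Debug, PartialEq)]
struct Settings {
    src_roots: Vec<String>,
    extra_paths: Vec<String>,
    custom_typeshed: Option<String>,
}

impl Settings {
    // A fully-defaulted baseline, analogous to `SearchPathSettings::empty`.
    fn empty() -> Self {
        Settings {
            src_roots: vec![],
            extra_paths: vec![],
            custom_typeshed: None,
        }
    }

    // Override one field, inherit the rest, analogous to `SearchPathSettings::new`.
    fn new(src_roots: Vec<String>) -> Self {
        Settings {
            src_roots,
            ..Settings::empty()
        }
    }
}

fn main() {
    let s = Settings::new(vec!["/src".to_string()]);
    assert_eq!(s.src_roots, vec!["/src".to_string()]);
    assert!(s.extra_paths.is_empty());
    assert!(s.custom_typeshed.is_none());
}
```

One benefit of routing `new` through `empty` is that adding a field to the struct only requires touching the one baseline constructor.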


@@ -1,4 +1,4 @@
use ruff_db::Db;
use ruff_db::Db as _;
use ruff_db::files::FileRootKind;
use ruff_db::system::{
DbWithTestSystem as _, DbWithWritableSystem as _, SystemPath, SystemPathBuf,
@@ -7,8 +7,7 @@ use ruff_db::vendored::VendoredPathBuf;
use ruff_python_ast::PythonVersion;
use crate::db::tests::TestDb;
use crate::program::{Program, SearchPathSettings};
use crate::{ProgramSettings, PythonPlatform, PythonVersionSource, PythonVersionWithSource};
use crate::settings::SearchPathSettings;
/// A test case for the module resolver.
///
@@ -20,7 +19,7 @@ pub(crate) struct TestCase<T> {
pub(crate) stdlib: T,
// Most test cases only ever need a single `site-packages` directory,
// so this is a single directory instead of a `Vec` of directories,
// like it is in `ruff_db::Program`.
// like it is in `SearchPaths`.
pub(crate) site_packages: SystemPathBuf,
pub(crate) python_version: PythonVersion,
}
@@ -105,7 +104,6 @@ pub(crate) struct UnspecifiedTypeshed;
pub(crate) struct TestCaseBuilder<T> {
typeshed_option: T,
python_version: PythonVersion,
python_platform: PythonPlatform,
first_party_files: Vec<FileSpec>,
site_packages_files: Vec<FileSpec>,
// Additional file roots (beyond site_packages, src and stdlib)
@@ -166,7 +164,6 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
Self {
typeshed_option: UnspecifiedTypeshed,
python_version: PythonVersion::default(),
python_platform: PythonPlatform::default(),
first_party_files: vec![],
site_packages_files: vec![],
roots: vec![],
@@ -178,7 +175,6 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
let TestCaseBuilder {
typeshed_option: _,
python_version,
python_platform,
first_party_files,
site_packages_files,
roots,
@@ -186,7 +182,6 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
TestCaseBuilder {
typeshed_option: VendoredTypeshed,
python_version,
python_platform,
first_party_files,
site_packages_files,
roots,
@@ -201,7 +196,6 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
let TestCaseBuilder {
typeshed_option: _,
python_version,
python_platform,
first_party_files,
site_packages_files,
roots,
@@ -210,7 +204,6 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
TestCaseBuilder {
typeshed_option: typeshed,
python_version,
python_platform,
first_party_files,
site_packages_files,
roots,
@@ -241,18 +234,29 @@ impl TestCaseBuilder<MockedTypeshed> {
let TestCaseBuilder {
typeshed_option,
python_version,
python_platform,
first_party_files,
site_packages_files,
roots,
} = self;
let mut db = TestDb::new();
let mut db = TestDb::new().with_python_version(python_version);
let site_packages =
Self::write_mock_directory(&mut db, "/site-packages", site_packages_files);
let src = Self::write_mock_directory(&mut db, "/src", first_party_files);
let typeshed = Self::build_typeshed_mock(&mut db, &typeshed_option);
let stdlib = typeshed.join("stdlib");
let search_paths = SearchPathSettings {
src_roots: vec![src.clone()],
custom_typeshed: Some(typeshed),
site_packages_paths: vec![site_packages.clone()],
..SearchPathSettings::empty()
}
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings");
db = db.with_search_paths(search_paths);
// This root is needed for correct Salsa tracking.
// Namely, a `SearchPath` is treated as an input, and
@@ -262,36 +266,19 @@ impl TestCaseBuilder<MockedTypeshed> {
// here, they won't get added.
//
// Roots for other search paths are added as part of
// search path initialization in `Program::from_settings`,
// search path initialization in `SearchPaths::from_settings`,
// and any remaining are added below.
db.files()
.try_add_root(&db, SystemPath::new("/src"), FileRootKind::Project);
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource {
version: python_version,
source: PythonVersionSource::default(),
},
python_platform,
search_paths: SearchPathSettings {
custom_typeshed: Some(typeshed.clone()),
site_packages_paths: vec![site_packages.clone()],
..SearchPathSettings::new(vec![src.clone()])
}
.to_search_paths(db.system(), db.vendored())
.expect("valid search path settings"),
},
);
let stdlib = typeshed.join("stdlib");
db.files()
.try_add_root(&db, &stdlib, FileRootKind::LibrarySearchPath);
for root in &roots {
db.files()
.try_add_root(&db, root, FileRootKind::LibrarySearchPath);
}
TestCase {
db,
src,
@@ -324,42 +311,34 @@ impl TestCaseBuilder<VendoredTypeshed> {
let TestCaseBuilder {
typeshed_option: VendoredTypeshed,
python_version,
python_platform,
first_party_files,
site_packages_files,
roots,
} = self;
let mut db = TestDb::new();
let mut db = TestDb::new().with_python_version(python_version);
let site_packages =
Self::write_mock_directory(&mut db, "/site-packages", site_packages_files);
let src = Self::write_mock_directory(&mut db, "/src", first_party_files);
let search_paths = SearchPathSettings {
src_roots: vec![src.clone()],
site_packages_paths: vec![site_packages.clone()],
..SearchPathSettings::empty()
}
.to_search_paths(db.system(), db.vendored())
.expect("Valid search path settings");
db = db.with_search_paths(search_paths);
db.files()
.try_add_root(&db, SystemPath::new("/src"), FileRootKind::Project);
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource {
version: python_version,
source: PythonVersionSource::default(),
},
python_platform,
search_paths: SearchPathSettings {
site_packages_paths: vec![site_packages.clone()],
..SearchPathSettings::new(vec![src.clone()])
}
.to_search_paths(db.system(), db.vendored())
.expect("valid search path settings"),
},
);
for root in &roots {
db.files()
.try_add_root(&db, root, FileRootKind::LibrarySearchPath);
}
TestCase {
db,
src,


@@ -8,13 +8,10 @@ use ruff_db::vendored::VendoredFileSystem;
use ruff_python_ast::{PythonVersion, PythonVersionDeserializationError};
use rustc_hash::FxHashMap;
use crate::Program;
use crate::db::Db;
use crate::module_name::ModuleName;
pub(in crate::module_resolver) fn vendored_typeshed_versions(
vendored: &VendoredFileSystem,
) -> TypeshedVersions {
pub fn vendored_typeshed_versions(vendored: &VendoredFileSystem) -> TypeshedVersions {
TypeshedVersions::from_str(
&vendored
.read_to_string("stdlib/VERSIONS")
@@ -24,7 +21,7 @@ pub(in crate::module_resolver) fn vendored_typeshed_versions(
}
pub(crate) fn typeshed_versions(db: &dyn Db) -> &TypeshedVersions {
Program::get(db).search_paths(db).typeshed_versions()
db.search_paths().typeshed_versions()
}
#[derive(Debug, PartialEq, Eq, Clone)]
@@ -61,7 +58,7 @@ impl std::error::Error for TypeshedVersionsParseError {
}
#[derive(Debug, PartialEq, Eq, Clone, thiserror::Error)]
pub(crate) enum TypeshedVersionsParseErrorKind {
pub enum TypeshedVersionsParseErrorKind {
#[error("File has too many lines ({0}); maximum allowed is {max_allowed}", max_allowed = NonZeroU16::MAX)]
TooManyLines(NonZeroUsize),
#[error("Expected every non-comment line to have exactly one colon")]
@@ -75,16 +72,16 @@ pub(crate) enum TypeshedVersionsParseErrorKind {
}
#[derive(Clone, Debug, PartialEq, Eq, get_size2::GetSize)]
pub(crate) struct TypeshedVersions(FxHashMap<ModuleName, PyVersionRange>);
pub struct TypeshedVersions(FxHashMap<ModuleName, PyVersionRange>);
impl TypeshedVersions {
#[must_use]
pub(crate) fn exact(&self, module_name: &ModuleName) -> Option<&PyVersionRange> {
pub fn exact(&self, module_name: &ModuleName) -> Option<&PyVersionRange> {
self.0.get(module_name)
}
#[must_use]
pub(in crate::module_resolver) fn query_module(
pub(crate) fn query_module(
&self,
module: &ModuleName,
python_version: PythonVersion,
@@ -231,14 +228,14 @@ impl fmt::Display for TypeshedVersions {
}
#[derive(Debug, Clone, Eq, PartialEq, Hash, get_size2::GetSize)]
pub(crate) enum PyVersionRange {
pub enum PyVersionRange {
AvailableFrom(RangeFrom<PythonVersion>),
AvailableWithin(RangeInclusive<PythonVersion>),
}
impl PyVersionRange {
#[must_use]
pub(crate) fn contains(&self, version: PythonVersion) -> bool {
pub fn contains(&self, version: PythonVersion) -> bool {
match self {
Self::AvailableFrom(inner) => inner.contains(&version),
Self::AvailableWithin(inner) => inner.contains(&version),
@@ -246,7 +243,7 @@ impl PyVersionRange {
}
/// Display the version range in a way that is suitable for rendering in user-facing diagnostics.
pub(crate) fn diagnostic_display(&self) -> impl std::fmt::Display {
pub fn diagnostic_display(&self) -> impl std::fmt::Display {
struct DiagnosticDisplay<'a>(&'a PyVersionRange);
impl fmt::Display for DiagnosticDisplay<'_> {


@@ -21,7 +21,9 @@ ruff_python_ast = { workspace = true, features = ["serde"] }
ruff_python_formatter = { workspace = true, optional = true }
ruff_text_size = { workspace = true }
ty_combine = { workspace = true }
ty_module_resolver = { workspace = true }
ty_python_semantic = { workspace = true, features = ["serde"] }
ty_python_types = { workspace = true }
ty_static = { workspace = true }
ty_vendored = { workspace = true }


@@ -14,8 +14,9 @@ use ruff_db::files::{File, Files};
use ruff_db::system::System;
use ruff_db::vendored::VendoredFileSystem;
use salsa::{Database, Event, Setter};
use ty_module_resolver::SearchPaths;
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{Db as SemanticDb, Program};
use ty_python_semantic::{AnalysisSettings, Db as SemanticDb, Program};
mod changes;
@@ -446,6 +447,13 @@ impl SalsaMemoryDump {
}
}
#[salsa::db]
impl ty_module_resolver::Db for ProjectDatabase {
fn search_paths(&self) -> &SearchPaths {
Program::get(self).search_paths(self)
}
}
#[salsa::db]
impl SemanticDb for ProjectDatabase {
fn should_check_file(&self, file: File) -> bool {
@@ -459,7 +467,11 @@ impl SemanticDb for ProjectDatabase {
}
fn lint_registry(&self) -> &LintRegistry {
ty_python_semantic::default_lint_registry()
ty_python_types::default_lint_registry()
}
fn analysis_settings(&self) -> &AnalysisSettings {
self.project().settings(self).analysis()
}
fn verbose(&self) -> bool {
@@ -523,9 +535,10 @@ pub(crate) mod tests {
use ruff_db::files::{FileRootKind, Files};
use ruff_db::system::{DbWithTestSystem, System, TestSystem};
use ruff_db::vendored::VendoredFileSystem;
use ty_module_resolver::SearchPathSettings;
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{
Program, ProgramSettings, PythonPlatform, PythonVersionWithSource, SearchPathSettings,
AnalysisSettings, Program, ProgramSettings, PythonPlatform, PythonVersionWithSource,
};
use crate::db::Db;
@@ -627,6 +640,13 @@ pub(crate) mod tests {
}
}
#[salsa::db]
impl ty_module_resolver::Db for TestDb {
fn search_paths(&self) -> &ty_module_resolver::SearchPaths {
Program::get(self).search_paths(self)
}
}
#[salsa::db]
impl ty_python_semantic::Db for TestDb {
fn should_check_file(&self, file: ruff_db::files::File) -> bool {
@@ -638,7 +658,11 @@ pub(crate) mod tests {
}
fn lint_registry(&self) -> &LintRegistry {
ty_python_semantic::default_lint_registry()
ty_python_types::default_lint_registry()
}
fn analysis_settings(&self) -> &AnalysisSettings {
self.project().settings(self).analysis()
}
fn verbose(&self) -> bool {


@@ -245,7 +245,9 @@ impl ProjectDatabase {
if result.project_changed {
let new_project_metadata = match config_file_override {
Some(config_file) => ProjectMetadata::from_config_file(config_file, self.system()),
Some(config_file) => {
ProjectMetadata::from_config_file(config_file, &project_root, self.system())
}
None => ProjectMetadata::discover(&project_root, self.system()),
};
match new_project_metadata {


@@ -29,7 +29,7 @@ use std::sync::Arc;
use thiserror::Error;
use ty_python_semantic::add_inferred_python_version_hint_to_diagnostic;
use ty_python_semantic::lint::RuleSelection;
use ty_python_semantic::types::check_types;
use ty_python_types::types::check_types;
mod db;
mod files;
@@ -115,6 +115,10 @@ pub struct Project {
#[default]
verbose_flag: bool,
/// Whether to enforce exclusion rules even for files explicitly passed to ty on the command line.
#[default]
force_exclude_flag: bool,
}
/// A progress reporter.
@@ -380,6 +384,16 @@ impl Project {
self.verbose_flag(db)
}
pub fn set_force_exclude(self, db: &mut dyn Db, force: bool) {
if self.force_exclude_flag(db) != force {
self.set_force_exclude_flag(db).to(force);
}
}
pub fn force_exclude(self, db: &dyn Db) -> bool {
self.force_exclude_flag(db)
}
/// Returns the paths that should be checked.
///
/// The default is to check the entire project in which case this method returns
@@ -755,7 +769,7 @@ mod tests {
use ruff_db::system::{DbWithTestSystem, DbWithWritableSystem as _, SystemPath, SystemPathBuf};
use ruff_db::testing::assert_function_query_was_not_run;
use ruff_python_ast::name::Name;
use ty_python_semantic::types::check_types;
use ty_python_types::types::check_types;
#[test]
fn check_file_skips_type_checking_when_file_cant_be_read() -> ruff_db::system::Result<()> {


@@ -56,6 +56,7 @@ impl ProjectMetadata {
pub fn from_config_file(
path: SystemPathBuf,
root: &SystemPath,
system: &dyn System,
) -> Result<Self, ProjectMetadataError> {
tracing::debug!("Using overridden configuration file at '{path}'");
@@ -70,8 +71,8 @@ impl ProjectMetadata {
let options = config_file.into_options();
Ok(Self {
name: Name::new(system.current_directory().file_name().unwrap_or("root")),
root: system.current_directory().to_path_buf(),
name: Name::new(root.file_name().unwrap_or("root")),
root: root.to_path_buf(),
options,
extra_configuration_paths: vec![path],
misconfiguration_mode: MisconfigurationMode::Fail,


@@ -28,11 +28,12 @@ use std::ops::Deref;
use std::sync::Arc;
use thiserror::Error;
use ty_combine::Combine;
use ty_module_resolver::{SearchPathSettings, SearchPathSettingsError, SearchPaths};
use ty_python_semantic::lint::{Level, LintSource, RuleSelection};
use ty_python_semantic::{
MisconfigurationMode, ProgramSettings, PythonEnvironment, PythonPlatform,
PythonVersionFileSource, PythonVersionSource, PythonVersionWithSource, SearchPathSettings,
SearchPathValidationError, SearchPaths, SitePackagesPaths, SysPrefixPathOrigin,
AnalysisSettings, MisconfigurationMode, ProgramSettings, PythonEnvironment, PythonPlatform,
PythonVersionFileSource, PythonVersionSource, PythonVersionWithSource, SitePackagesPaths,
SysPrefixPathOrigin,
};
use ty_static::EnvVars;
@@ -86,6 +87,10 @@ pub struct Options {
#[option_group]
pub terminal: Option<TerminalOptions>,
#[serde(skip_serializing_if = "Option::is_none")]
#[option_group]
pub analysis: Option<AnalysisOptions>,
/// Override configurations for specific file patterns.
///
/// Each override specifies include/exclude patterns and rule configurations
@@ -258,7 +263,7 @@ impl Options {
system: &dyn System,
vendored: &VendoredFileSystem,
misconfiguration_mode: MisconfigurationMode,
) -> Result<SearchPaths, SearchPathValidationError> {
) -> Result<SearchPaths, SearchPathSettingsError> {
let environment = self.environment.or_default();
let src = self.src.or_default();
@@ -434,6 +439,8 @@ impl Options {
color: colored::control::SHOULD_COLORIZE.should_colorize(),
})?;
let analysis = self.analysis.or_default().to_settings();
let overrides = self
.to_overrides_settings(db, project_root, &mut diagnostics)
.map_err(|err| ToSettingsError {
@@ -446,6 +453,7 @@ impl Options {
rules: Arc::new(rules),
terminal,
src,
analysis,
overrides,
};
@@ -875,10 +883,7 @@ impl Rules {
let lint_source = match source {
ValueSource::File(_) => LintSource::File,
ValueSource::Cli => LintSource::Cli,
ValueSource::Editor => {
unreachable!("Can't configure rules from the user's editor")
}
ValueSource::Editor => LintSource::Editor,
};
if let Ok(severity) = Severity::try_from(**level) {
selection.enable(lint, severity, lint_source);
@@ -1014,7 +1019,12 @@ fn build_include_filter(
SubDiagnosticSeverity::Info,
"The pattern was specified on the CLI",
)),
ValueSource::Editor => unreachable!("Can't configure includes from the user's editor"),
ValueSource::Editor => {
diagnostic.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
"The pattern was specified in the editor settings.",
))
}
}
})?;
}
@@ -1097,9 +1107,10 @@ fn build_exclude_filter(
SubDiagnosticSeverity::Info,
"The pattern was specified on the CLI",
)),
ValueSource::Editor => unreachable!(
"Can't configure excludes from the user's editor"
)
ValueSource::Editor => diagnostic.sub(SubDiagnostic::new(
SubDiagnosticSeverity::Info,
"The pattern was specified in the editor settings",
))
}
})?;
}
@@ -1250,6 +1261,55 @@ pub struct TerminalOptions {
pub error_on_warning: Option<bool>,
}
#[derive(
Debug,
Default,
Clone,
Eq,
PartialEq,
Combine,
Serialize,
Deserialize,
OptionsMetadata,
get_size2::GetSize,
)]
#[serde(rename_all = "kebab-case", deny_unknown_fields)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct AnalysisOptions {
/// Whether ty should respect `type: ignore` comments.
///
/// When set to `false`, `type: ignore` comments are treated like any other normal
/// comment and can't be used to suppress ty errors (you have to use `ty: ignore` instead).
///
/// Setting this option can be useful when using ty alongside other type checkers or when
/// you prefer using `ty: ignore` over `type: ignore`.
///
/// Defaults to `true`.
#[option(
default = r#"true"#,
value_type = "bool",
example = r#"
# Disable support for `type: ignore` comments
respect-type-ignore-comments = false
"#
)]
respect_type_ignore_comments: Option<bool>,
}
impl AnalysisOptions {
fn to_settings(&self) -> AnalysisSettings {
let AnalysisSettings {
respect_type_ignore_comments: respect_type_ignore_default,
} = AnalysisSettings::default();
AnalysisSettings {
respect_type_ignore_comments: self
.respect_type_ignore_comments
.unwrap_or(respect_type_ignore_default),
}
}
}
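The `to_settings` conversion above uses a pattern worth noting: it destructures `AnalysisSettings::default()` to obtain the default for each field, then fills each unset `Option` with `unwrap_or`. Because the destructuring is exhaustive, adding a field to the settings struct later makes this function fail to compile until it is updated. A self-contained sketch of the same pattern (struct definitions here are illustrative stand-ins):

```rust
// Settings with concrete defaults.
#[derive(Debug, PartialEq)]
struct AnalysisSettings {
    respect_type_ignore_comments: bool,
}

impl Default for AnalysisSettings {
    fn default() -> Self {
        AnalysisSettings {
            respect_type_ignore_comments: true,
        }
    }
}

// User-facing options, where `None` means "not configured".
#[derive(Default)]
struct AnalysisOptions {
    respect_type_ignore_comments: Option<bool>,
}

impl AnalysisOptions {
    fn to_settings(&self) -> AnalysisSettings {
        // Exhaustively destructure the default so a newly added settings
        // field becomes a compile error here rather than a silent miss.
        let AnalysisSettings {
            respect_type_ignore_comments: respect_default,
        } = AnalysisSettings::default();

        AnalysisSettings {
            respect_type_ignore_comments: self
                .respect_type_ignore_comments
                .unwrap_or(respect_default),
        }
    }
}

fn main() {
    // Unconfigured options fall back to the default (`true`).
    assert!(AnalysisOptions::default().to_settings().respect_type_ignore_comments);
    // An explicit `Some(false)` overrides it.
    let opts = AnalysisOptions {
        respect_type_ignore_comments: Some(false),
    };
    assert!(!opts.to_settings().respect_type_ignore_comments);
}
```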
/// Configuration override that applies to specific files based on glob patterns.
///
/// An override allows you to apply different rule configurations to specific
@@ -1606,7 +1666,7 @@ mod schema {
fn json_schema(generator: &mut schemars::SchemaGenerator) -> schemars::Schema {
use serde_json::{Map, Value};
let registry = ty_python_semantic::default_lint_registry();
let registry = ty_python_types::default_lint_registry();
let level_schema = generator.subschema_for::<super::Level>();
let properties: Map<String, Value> = registry


@@ -2,6 +2,7 @@ use std::sync::Arc;
use ruff_db::files::File;
use ty_combine::Combine;
use ty_python_semantic::AnalysisSettings;
use ty_python_semantic::lint::RuleSelection;
use crate::metadata::options::{InnerOverrideOptions, OutputFormat};
@@ -25,6 +26,7 @@ pub struct Settings {
pub(super) rules: Arc<RuleSelection>,
pub(super) terminal: TerminalSettings,
pub(super) src: SrcSettings,
pub(super) analysis: AnalysisSettings,
/// Settings for configuration overrides that apply to specific file patterns.
///
@@ -54,6 +56,10 @@ impl Settings {
pub fn overrides(&self) -> &[Override] {
&self.overrides
}
pub fn analysis(&self) -> &AnalysisSettings {
&self.analysis
}
}
#[derive(Debug, Clone, PartialEq, Eq, Default, get_size2::GetSize)]


@@ -111,6 +111,8 @@ pub(crate) struct ProjectFilesWalker<'a> {
walker: WalkDirectoryBuilder,
filter: ProjectFilesFilter<'a>,
force_exclude: bool,
}
impl<'a> ProjectFilesWalker<'a> {
@@ -158,7 +160,11 @@ impl<'a> ProjectFilesWalker<'a> {
walker = walker.add(path);
}
Some(Self { walker, filter })
Some(Self {
walker,
filter,
force_exclude: db.project().force_exclude(db),
})
}
/// Walks the project paths and collects the paths of all files that
@@ -179,7 +185,7 @@ impl<'a> ProjectFilesWalker<'a> {
// Skip excluded directories unless they were explicitly passed to the walker
// (which is the case for `ty check <paths>`).
if entry.file_type().is_directory() {
if entry.depth() > 0 {
if entry.depth() > 0 || self.force_exclude {
let directory_included = filter
.is_directory_included(entry.path(), GlobFilterCheckMode::TopDown);
return match directory_included {
@@ -218,7 +224,7 @@ impl<'a> ProjectFilesWalker<'a> {
// For all files, except the ones that were explicitly passed to the walker (CLI),
// check if they're included in the project.
if entry.depth() > 0 {
if entry.depth() > 0 || self.force_exclude {
match filter
.is_file_included(entry.path(), GlobFilterCheckMode::TopDown)
{
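The `entry.depth() > 0 || self.force_exclude` change above means exclusion filters now also apply to top-level entries (depth 0, i.e. paths passed explicitly on the command line) whenever force-exclude is enabled. A self-contained sketch of that gate (the function name is illustrative, not part of the diff):

```rust
// Whether an exclusion filter should be consulted for a walked entry.
//
// depth == 0 means the entry was passed explicitly (e.g. `ty check <path>`);
// such entries normally bypass excludes unless `force_exclude` is set.
fn should_apply_excludes(depth: usize, force_exclude: bool) -> bool {
    depth > 0 || force_exclude
}

fn main() {
    // Nested entries are always filtered.
    assert!(should_apply_excludes(2, false));
    // Explicitly passed paths bypass excludes by default...
    assert!(!should_apply_excludes(0, false));
    // ...but are filtered when force-exclude is enabled.
    assert!(should_apply_excludes(0, true));
}
```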
