Compare commits

...

34 Commits

Author SHA1 Message Date
Zanie Blue
40b4aa28f9 Zizmor 2025-07-03 07:23:11 -05:00
Zanie Blue
ea4bf00c23 Revert "Only run the relevant test"
This reverts commit 82dc27f2680b8085280136848ffe2ee1d2952a4e.
2025-07-03 07:01:28 -05:00
Zanie Blue
7f4aa4b3fb Update for Depot 2025-07-03 07:01:28 -05:00
Zanie Blue
34c98361ae Do not set TMP 2025-07-03 07:01:28 -05:00
Zanie Blue
38bb96a6c2 Remove log variables 2025-07-03 07:01:28 -05:00
Zanie Blue
a014d55455 Remove fuzz corpus hack 2025-07-03 07:01:27 -05:00
Zanie Blue
306f6f17a9 Enable more logs 2025-07-03 07:01:27 -05:00
Zanie Blue
b233888f00 Only run the relevant test 2025-07-03 07:01:27 -05:00
Zanie Blue
540cbd9085 Add debug logs? 2025-07-03 07:01:27 -05:00
Zanie Blue
0112f7f0e4 Use a dev drive for testing on Windows 2025-07-03 07:01:27 -05:00
GiGaGon
d0f0577ac7 [flake8-pyi] Make example error out-of-the-box (PYI014, PYI015) (#19097) 2025-07-03 12:54:35 +01:00
Dhruv Manilawala
dc56c33618 [ty] Initial support for workspace diagnostics (#18939)
## Summary

This PR adds initial support for workspace diagnostics in the ty server.

Reference spec:
https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_diagnostic

This is currently implemented via the **pull diagnostics method**, which
was added in the current version (3.17); the server advertises it via
the `diagnosticProvider.workspaceDiagnostics` server capability.

**Note:** This might be a bit confusing, but workspace diagnostics are
not for a single workspace but for all the workspaces that the server
handles, i.e., the ones the server received during initialization.
Currently, the ty server doesn't support multiple workspaces, so this
capability is limited to providing diagnostics for a single workspace
(the first one, if the client provided multiple).

A new `ty.diagnosticMode` server setting is added which can be either
`workspace` (for workspace diagnostics) or `openFilesOnly` (the default,
for checking only open files). This is the same as the
`python.analysis.diagnosticMode` setting that Pyright / Pylance use. In
the future, we could fall back to the value under the `python.*`
namespace so that users don't have to set the value multiple times.
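
As a rough illustration of how the setting ties into the capability: the field names below come from the LSP 3.17 `DiagnosticOptions` type, but the way ty wires the setting into its advertised capabilities is an assumption here, sketched in Python for brevity.

```python
# Sketch: map a diagnosticMode setting onto the LSP 3.17
# `diagnosticProvider` (DiagnosticOptions) capability. Field names follow
# the spec; the wiring itself is illustrative, not ty's actual code.

def server_capabilities(diagnostic_mode: str) -> dict:
    """Build a simplified DiagnosticOptions for the given mode."""
    return {
        "diagnosticProvider": {
            # Pull diagnostics for open documents are always available.
            "interFileDependencies": True,
            # Workspace-wide pull diagnostics only when opted in.
            "workspaceDiagnostics": diagnostic_mode == "workspace",
        }
    }
```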

Part of: astral-sh/ty#81

## Test Plan

This capability was introduced in the current LSP version (~3 years
ago), and the way it's implemented varies a bit between clients. I've
provided notes on what I noticed and what we'd need to do on our side
to further improve the experience.

### VS Code

VS Code sends a `workspace/diagnostic` request every ~2 seconds:

```
[Trace - 12:12:32 PM] Sending request 'workspace/diagnostic - (403)'.
[Trace - 12:12:32 PM] Received response 'workspace/diagnostic - (403)' in 2ms.
[Trace - 12:12:34 PM] Sending request 'workspace/diagnostic - (404)'.
[Trace - 12:12:34 PM] Received response 'workspace/diagnostic - (404)' in 2ms.
[Trace - 12:12:36 PM] Sending request 'workspace/diagnostic - (405)'.
[Trace - 12:12:36 PM] Received response 'workspace/diagnostic - (405)' in 2ms.
[Trace - 12:12:38 PM] Sending request 'workspace/diagnostic - (406)'.
[Trace - 12:12:38 PM] Received response 'workspace/diagnostic - (406)' in 3ms.
[Trace - 12:12:40 PM] Sending request 'workspace/diagnostic - (407)'.
[Trace - 12:12:40 PM] Received response 'workspace/diagnostic - (407)' in 2ms.
...
```

I couldn't really find any resource that explains this behavior, but it
does mean that we'd need to implement the caching layer via previous
result ids sooner. This will allow the server to avoid sending all the
diagnostics on every request and instead respond that the diagnostics
haven't changed. This could possibly be achieved by using the salsa ID.
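
A sketch of that previous-result-id handshake (the report shapes follow the LSP 3.17 `WorkspaceDiagnosticReport`; using a salsa revision as the result id is the idea floated above, not something implemented here):

```python
def workspace_diagnostic_report(files: dict, previous_result_ids: dict) -> dict:
    """Answer a workspace/diagnostic request, reusing unchanged results.

    `files` maps a URI to (current_result_id, diagnostics); the result id
    stands in for something cheap to compare, e.g. a salsa revision.
    """
    items = []
    for uri, (result_id, diagnostics) in files.items():
        if previous_result_ids.get(uri) == result_id:
            # Nothing changed since the client's last pull: answer with a
            # stub instead of re-sending every diagnostic.
            items.append({"kind": "unchanged", "uri": uri, "resultId": result_id})
        else:
            items.append(
                {"kind": "full", "uri": uri, "resultId": result_id, "items": diagnostics}
            )
    return {"items": items}
```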

If we switch from workspace diagnostics to open-files diagnostics, the
server only sends diagnostics via the `textDocument/diagnostic`
endpoint. Here, when a document containing a diagnostic is closed, the
server sends a publish diagnostics notification with an empty list of
diagnostics to clear the diagnostics for that document. The issue is
that VS Code doesn't seem to clear the diagnostics in this case even
though it receives the notification. (I'm going to open an issue on the
VS Code side for this today.)
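
For reference, the clearing notification described above is just `textDocument/publishDiagnostics` with an empty diagnostics list (method and parameter names come from the LSP spec; the helper itself is illustrative):

```python
def clear_diagnostics_notification(uri: str) -> dict:
    """Notification a server sends to clear a document's diagnostics."""
    return {
        "method": "textDocument/publishDiagnostics",
        # An empty list tells the client to drop all diagnostics for `uri`.
        "params": {"uri": uri, "diagnostics": []},
    }
```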


https://github.com/user-attachments/assets/b0c0833d-386c-49f5-8a15-0ac9133e15ed

### Zed

Zed's implementation works by refreshing the workspace diagnostics
whenever the content of a document changes. This seems like a very
reasonable behavior, and I was a bit surprised that VS Code doesn't use
this heuristic.


https://github.com/user-attachments/assets/71c7b546-7970-434a-9ba0-4fa620647f6c

### Neovim

Neovim only recently added support for workspace diagnostics
(https://github.com/neovim/neovim/pull/34262, merged ~3 weeks ago) so
it's only available on nightly versions.

The initial support is limited and requires fetching the workspace
diagnostics manually, as demonstrated in the video. It doesn't support
refreshing the workspace diagnostics either, so that would need to be
done manually as well. I'm assuming these are just temporary
limitations that will be addressed before the stable release.


https://github.com/user-attachments/assets/25b4a0e5-9833-4877-88ad-279904fffaf9
2025-07-03 11:04:54 +00:00
Dhruv Manilawala
a95c18a8e1 [ty] Add background request task support (#19041)
## Summary

This PR adds a new trait to support running a request in the background.

Currently, there exists a `BackgroundDocumentRequestHandler` trait which
is similar but is scoped to a specific document (a file, in an editor
context). The new trait, `BackgroundRequestHandler`, is tied to neither
a specific document nor a specific project; it operates on the entire
workspace.

This is added to support running workspace wide requests like computing
the [workspace
diagnostics](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_diagnostic)
or [workspace
symbols](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#workspace_symbol).

**Note:** There's a slight difference in what a "workspace" means to the
server versus ty. Currently, there's a 1-1 relationship between
a workspace in an editor and the project database corresponding to that
workspace in ty, but this could change in the future when Micha adds
support for multiple workspaces or multi-root workspaces.

The data required by the request handler (based on implementing
workspace diagnostics) is the list of databases (`ProjectDatabase`)
corresponding to the projects in the workspace and the index (`Index`)
that contains the open documents. The `WorkspaceSnapshot` represents
this and is passed to the handler, similar to `DocumentSnapshot`.
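
A minimal sketch of the data described above (the names mirror the PR's description; the Python rendering is purely illustrative of the Rust types):

```python
from dataclasses import dataclass, field

@dataclass
class WorkspaceSnapshot:
    """Illustrative stand-in for the snapshot handed to a
    `BackgroundRequestHandler`: every project database plus the index of
    open documents, rather than a single document's state."""

    project_databases: list  # one `ProjectDatabase` per project
    open_documents: dict = field(default_factory=dict)  # the `Index`

# ty currently handles a single workspace, hence a single database here.
snapshot = WorkspaceSnapshot(project_databases=["db-for-only-project"])
```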

## Test Plan

This is used to implement workspace diagnostics, which is where it is
tested.
2025-07-03 11:01:10 +00:00
David Peter
e212dc2e8e [ty] Restructure/move dataclass tests (#19117)
Before adding even more dataclass-related files, let's organize them
in a separate folder.
2025-07-03 10:36:14 +00:00
Aria Desires
c4f2eec865 [ty] Remove last vestiges of std::path from ty_server (#19088)
Fixes https://github.com/astral-sh/ty/issues/603
2025-07-03 15:18:30 +05:30
Zanie Blue
9fc04d6bf0 Use "python" for markdown code fences in on-hover content (#19082)
Instead of "text".

Closes https://github.com/astral-sh/ty/issues/749

We may not want this because the type display implementations are not
guaranteed to be valid Python; however, unless clients highlight
invalid syntax, this seems like a better interim value than
"text"? I'm not the expert though. See
https://github.com/astral-sh/ty/issues/749#issuecomment-3026201114 for
prior commentary.

edit: Going back further to
https://github.com/astral-sh/ruff/pull/17057#discussion_r2028151621 for
prior context, it turns out they _do_ highlight invalid syntax in red,
which is quite unfortunate and probably a blocker here.
2025-07-03 10:50:34 +05:30
Matthew Mckee
352b896c89 [ty] Add subtyping between SubclassOf and CallableType (#19026)
## Summary

Part of https://github.com/astral-sh/ty/issues/129

There were previously some false positives here.

## Test Plan

Updated `is_subtype_of.md` and `is_assignable_to.md`
2025-07-02 19:22:31 -07:00
GiGaGon
321575e48f [flake8-pyi] Make example error out-of-the-box (PYI042) (#19101)

## Summary


Part of #18972

This PR makes [snake-case-type-alias
(PYI042)](https://docs.astral.sh/ruff/rules/snake-case-type-alias/#snake-case-type-alias-pyi042)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/8fafec81-2228-4ffe-81e8-1989b724cb47)
```py
type_alias_name: TypeAlias = int
```

[New example](https://play.ruff.rs/b396746c-e6d2-423c-bc13-01a533bb0747)
```py
from typing import TypeAlias

type_alias_name: TypeAlias = int
```

Imports were also added to the "use instead" section.

## Test Plan


N/A, no functionality/tests affected
2025-07-02 22:31:15 +01:00
GiGaGon
066018859f [pyflakes] Fix backslash in docs (F621) (#19098)

## Summary


This fixes the docs for [expressions-in-star-assignment
(F621)](https://docs.astral.sh/ruff/rules/expressions-in-star-assignment/#expressions-in-star-assignment-f621)
having a backslash `\` before the left shifts `<<`. I'm not sure why
this happened in the first place, as the docstring looks fine, but
putting the `<<` inside a code block fixes it. I was not able to track
down the source of the issue either. The only other rule with a `<<` is
[missing-whitespace-around-bitwise-or-shift-operator
(E227)](https://docs.astral.sh/ruff/rules/missing-whitespace-around-bitwise-or-shift-operator/#missing-whitespace-around-bitwise-or-shift-operator-e227),
which already has it in a code block.

Old docs page:

![image](https://github.com/user-attachments/assets/993106c6-5d83-4aed-836b-e252f5b64916)
> In Python 3, no more than 1 \\<< 8 assignments are allowed before a
starred expression, and no more than 1 \\<< 24 expressions are allowed
after a starred expression.

New docs page:

![image](https://github.com/user-attachments/assets/3b40b35f-f39e-49f1-8b2e-262dda4085b4)
> In Python 3, no more than `1 << 8` assignments are allowed before a
starred expression, and no more than `1 << 24` expressions are allowed
after a starred expression.
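
For reference, the two limits quoted above work out as follows (presumably because CPython packs both counts into a single bytecode operand for extended unpacking, though that rationale is my assumption):

```python
# The limits quoted in the F621 docs, evaluated.
before_limit = 1 << 8   # max assignments before the starred expression
after_limit = 1 << 24   # max expressions after the starred expression
print(before_limit, after_limit)
```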

## Test Plan


N/A, no tests/functionality affected.
2025-07-02 15:00:33 -04:00
David Peter
f76d3f87cf [ty] Allow declared-only class-level attributes to be accessed on the class (#19071)
## Summary

Allow declared-only class-level attributes to be accessed on the class:
```py
class C:
    attr: int

C.attr  # this is now allowed
``` 

closes https://github.com/astral-sh/ty/issues/384
closes https://github.com/astral-sh/ty/issues/553

## Ecosystem analysis


* We see many removed `unresolved-attribute` false-positives for code
that makes use of sqlalchemy, as expected (see changes for `prefect`)
* We see many removed `call-non-callable` false-positives for uses of
`pytest.skip` and similar, as expected
* Most new diagnostics seem to be related to cases like the following,
where we previously inferred `int` for `Derived().x`, but now we infer
`int | None`. I think this should be a
conflicting-declarations/bad-override error anyway? The new behavior may
even be preferred here?
  ```py
  class Base:
      x: int | None
  
  
  class Derived(Base):
      def __init__(self):
          self.x: int = 1
  ```
2025-07-02 18:03:56 +02:00
Micha Reiser
5f426b9f8b [ty] Remove ScopedExpressionId (#19019)
## Summary

The motivation of `ScopedExpressionId` was that we have an expression
identifier that's local to a scope and, therefore, unlikely to change if
a user makes changes in another scope. A local identifier like this has
the advantage that query results may remain unchanged even if other
parts of the file change, which in turn allows Salsa to short-circuit
dependent queries.

However, I noticed that we aren't using `ScopedExpressionId` in a place
where it's important that the identifier is local. Its main use is
inside `infer`, which we always run for the entire file. The one
exception to this is `Unpack`, but unpack runs as part of `infer`.

Edit: The above isn't entirely correct. We used `ScopedExpressionId` in
`TypeInference`, which is a query result. Now, using `ExpressionNodeKey`
does mean that a change to the AST invalidates most if not all
`TypeInference` results of a single file. Salsa then has to run all
dependent queries to see if they're affected by this change, even if the
change was local to another scope.

If this locality proves to be important, I suggest that we create two
queries on top of `TypeInference`: one that returns the expression map
(mainly used in the linter and type inference) and a second that
returns all remaining fields. This should give us a similar optimization
at a much lower cost.

I also considered removing `ScopedUseId`, but I believe that one is
still useful because using `ExpressionNodeKey` for it instead would mean
that all `UseDefMap`s change when a single AST node changes. Whether
this is important is difficult to assess; I'm simply not familiar
enough with the `UseDefMap`. If locality doesn't matter for the
`UseDefMap`, then a similar change could be made, and `bindings_by_use`
could become an `FxHashMap<UseId, Bindings>` where `UseId` is a
thin wrapper around `NodeKey`.

Closes https://github.com/astral-sh/ty/issues/721
2025-07-02 17:57:32 +02:00
GiGaGon
37ba185c04 [flake8-pyi] Make example error out-of-the-box (PYI059) (#19080)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-07-02 16:49:54 +01:00
David Peter
93413d3631 [ty] Update docs links (#19092)
Point everything to the new documentation at https://docs.astral.sh/ty/
2025-07-02 17:34:56 +02:00
Zanie Blue
efd9b75352 Avoid reformatting comments in rules reference documentation (#19093)
closes https://github.com/astral-sh/ty/issues/754
2025-07-02 17:16:44 +02:00
David Peter
4cf56d7ad4 [ty] Fix lint summary wording (#19091) 2025-07-02 16:32:11 +02:00
David Peter
4e4e428a95 [ty] Fix link in generate_ty_rules (#19090) 2025-07-02 14:21:32 +00:00
Zanie Blue
522fd4462e Fix header levels in generated settings reference (#19089)
The headers were one level too deep for child items, and the top-level
`rules` header was way off.
2025-07-02 16:01:23 +02:00
David Peter
e599c9d0d3 [ty] Adapt generate_ty_rules for MkDocs (#19087)
## Summary

Adapts the Markdown for the rules-reference documentation page for
MkDocs.
2025-07-02 16:01:10 +02:00
Micha Reiser
e9b5ea71b3 Update Salsa (#19020)
## Summary

This PR updates Salsa to pull in Ibraheem's multithreading improvements (https://github.com/salsa-rs/salsa/pull/921).

## Performance

A small regression for single-threaded benchmarks is expected because
papaya is slightly slower than a `Mutex<FxHashMap>` in the uncontested
case (~10%). However, this shouldn't matter as much in practice because:

1. Salsa has a fast path when only one DB instance is used, which is the
common case in production. This fast path is not impacted by the changes,
but our benchmarks measure the slow paths (because we use multiple
db instances).
2. Fixing the 10x slowdown for the contested (multithreaded) case
outweighs the downside of a ~10% perf regression for single-threaded
use cases, especially considering that ty is heavily multithreaded.

## Test Plan

`cargo test`
2025-07-02 09:55:37 -04:00
Ibraheem Ahmed
ebc70a4002 [ty] Support LSP go-to with vendored typeshed stubs (#19057)
## Summary

Extracts the vendored typeshed stubs lazily and caches them on the local
filesystem to support go-to in the LSP.

Resolves https://github.com/astral-sh/ty/issues/77.
2025-07-02 07:58:58 -04:00
Micha Reiser
f7fc8fb084 [ty] Request configuration from client (#18984)
## Summary

This PR makes the necessary changes to the server so that it can request
configurations from the client using the `configuration` request.
This PR doesn't make use of the request yet; it only sets up the
foundation (mainly the coordination between client and server)
so that future PRs can pull specific settings.

I plan to use this for pulling the Python environment from the Python
extension.

Deno does something very similar to this.

## Test Plan

Tested that diagnostics are still shown.
2025-07-02 14:31:41 +05:30
GiGaGon
cdf91b8b74 [flake8-pyi] Make example error out-of-the-box (PYI062) (#19079)

## Summary


Part of #18972

This PR makes [duplicate-literal-member
(PYI062)](https://docs.astral.sh/ruff/rules/duplicate-literal-member/#duplicate-literal-member-pyi062)'s
example error out-of-the-box

[Old example](https://play.ruff.rs/6b00b41c-c1c5-4421-873d-fc2a143e7337)
```py
foo: Literal["a", "b", "a"]
```

[New example](https://play.ruff.rs/1aea839b-9ae8-4848-bb83-2637e1a68ce4)
```py
from typing import Literal

foo: Literal["a", "b", "a"]
```

Imports were also added to the "use instead" section.

## Test Plan


N/A, no functionality/tests affected
2025-07-02 08:21:39 +01:00
Dhruv Manilawala
d1e705738e [ty] Log target names at trace level (#19084)
Follow-up to https://github.com/astral-sh/ruff/pull/19083: at trace
level, also log target names like
`ty_python_semantic::module_resolver::resolver`, as in
`2025-07-02 10:12:20.188697000 DEBUG
ty_python_semantic::module_resolver::resolver: Adding first-party search
path '/Users/dhruv/playground/ty_server'`.
2025-07-02 04:49:36 +00:00
Dhruv Manilawala
c3d9b21db5 [ty] Use better datetime format for server logs (#19083)
This PR improves the timestamp format for ty server logs to match Ruff's.

Ref: https://github.com/astral-sh/ruff/pull/16389
2025-07-02 04:39:12 +00:00
85 changed files with 2106 additions and 1431 deletions


@@ -321,14 +321,30 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- name: Setup Dev Drive
run: ${{ github.workspace }}/.github/workflows/setup-dev-drive.ps1
# actions/checkout does not let us clone into anywhere outside `github.workspace`, so we have to copy the clone
- name: Copy Git Repo to Dev Drive
env:
RUFF_WORKSPACE: ${{ env.RUFF_WORKSPACE }}
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${env:RUFF_WORKSPACE}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:
workspaces: ${{ env.RUFF_WORKSPACE }}
- name: "Install Rust toolchain"
working-directory: ${{ env.RUFF_WORKSPACE }}
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@d12e869b89167df346dd0ff65da342d1fb1202fb # v2.53.2
with:
tool: cargo-nextest
- name: "Run tests"
working-directory: ${{ env.RUFF_WORKSPACE }}
shell: bash
env:
NEXTEST_PROFILE: "ci"

.github/workflows/setup-dev-drive.ps1 (vendored, new file, 93 lines)

@@ -0,0 +1,93 @@
# Configures a drive for testing in CI.
#
# When using standard GitHub Actions runners, a `D:` drive is present and has
# performance characteristics similar to or better than a ReFS dev drive.
# Sometimes using a larger runner is still more performant (e.g., when running
# the test suite) and we need to create a dev drive. This script automatically
# configures the appropriate drive.
#
# When using GitHub Actions' "larger runners", the `D:` drive is not present and
# we create a DevDrive mount on `C:`. This is purported to be more performant
# than a ReFS drive, though we did not see a change when we switched over.
#
# When using Depot runners, the underlying infrastructure is EC2, which does not
# support Hyper-V. The `New-VHD` cmdlet only works with Hyper-V, but we can
# create a ReFS drive using `diskpart` and `format` directly. We cannot use a
# DevDrive, as that also requires Hyper-V. The Depot runners use `D:` already,
# so we must check whether it's a Depot runner first, and we use `V:` as the
# target instead.
if ($env:DEPOT_RUNNER -eq "1") {
Write-Output "DEPOT_RUNNER detected, setting up custom dev drive..."
# Create VHD and configure drive using diskpart
$vhdPath = "C:\ruff_dev_drive.vhdx"
@"
create vdisk file="$vhdPath" maximum=20480 type=expandable
attach vdisk
create partition primary
active
assign letter=V
"@ | diskpart
# Format the drive as ReFS
format V: /fs:ReFS /q /y
$Drive = "V:"
Write-Output "Custom dev drive created at $Drive"
} elseif (Test-Path "D:\") {
# Note `Get-PSDrive` is not sufficient because the drive letter is assigned.
Write-Output "Using existing drive at D:"
$Drive = "D:"
} else {
# The size (20 GB) is chosen empirically to be large enough for our
# workflows; larger drives can take longer to set up.
$Volume = New-VHD -Path C:/ruff_dev_drive.vhdx -SizeBytes 20GB |
Mount-VHD -Passthru |
Initialize-Disk -Passthru |
New-Partition -AssignDriveLetter -UseMaximumSize |
Format-Volume -DevDrive -Confirm:$false -Force
$Drive = "$($Volume.DriveLetter):"
# Set the drive as trusted
# See https://learn.microsoft.com/en-us/windows/dev-drive/#how-do-i-designate-a-dev-drive-as-trusted
fsutil devdrv trust $Drive
# Disable antivirus filtering on dev drives
# See https://learn.microsoft.com/en-us/windows/dev-drive/#how-do-i-configure-additional-filters-on-dev-drive
fsutil devdrv enable /disallowAv
# Remount so the changes take effect
Dismount-VHD -Path C:/ruff_dev_drive.vhdx
Mount-VHD -Path C:/ruff_dev_drive.vhdx
# Show some debug information
Write-Output $Volume
fsutil devdrv query $Drive
Write-Output "Using Dev Drive at $Volume"
}
$Tmp = "$($Drive)\ruff-tmp"
# Create the directory ahead of time in an attempt to avoid race-conditions
New-Item $Tmp -ItemType Directory
# Move Cargo to the dev drive
New-Item -Path "$($Drive)/.cargo/bin" -ItemType Directory -Force
if (Test-Path "C:/Users/runneradmin/.cargo") {
Copy-Item -Path "C:/Users/runneradmin/.cargo/*" -Destination "$($Drive)/.cargo/" -Recurse -Force
}
Write-Output `
"DEV_DRIVE=$($Drive)" `
"TMP=$($Tmp)" `
"TEMP=$($Tmp)" `
"UV_INTERNAL__TEST_DIR=$($Tmp)" `
"RUSTUP_HOME=$($Drive)/.rustup" `
"CARGO_HOME=$($Drive)/.cargo" `
"RUFF_WORKSPACE=$($Drive)/ruff" `
"PATH=$($Drive)/.cargo/bin;$env:PATH" `
>> $env:GITHUB_ENV

Cargo.lock (generated, 35 lines changed)

@@ -2136,6 +2136,16 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]]
name = "papaya"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f92dd0b07c53a0a0c764db2ace8c541dc47320dad97c2200c2a637ab9dd2328f"
dependencies = [
"equivalent",
"seize",
]
[[package]]
name = "parking_lot"
version = "0.12.3"
@@ -3411,8 +3421,8 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa?rev=0666e2018bc35376b1ac4f98906f2d04d11e5fe4#0666e2018bc35376b1ac4f98906f2d04d11e5fe4"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
dependencies = [
"boxcar",
"compact_str",
@@ -3422,6 +3432,7 @@ dependencies = [
"hashlink",
"indexmap",
"intrusive-collections",
"papaya",
"parking_lot",
"portable-atomic",
"rayon",
@@ -3435,13 +3446,13 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa?rev=0666e2018bc35376b1ac4f98906f2d04d11e5fe4#0666e2018bc35376b1ac4f98906f2d04d11e5fe4"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
[[package]]
name = "salsa-macros"
version = "0.22.0"
source = "git+https://github.com/salsa-rs/salsa?rev=0666e2018bc35376b1ac4f98906f2d04d11e5fe4#0666e2018bc35376b1ac4f98906f2d04d11e5fe4"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
dependencies = [
"proc-macro2",
"quote",
@@ -3494,6 +3505,16 @@ version = "4.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1c107b6f4780854c8b126e228ea8869f4d7b71260f962fefb57b996b8959ba6b"
[[package]]
name = "seize"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e4b8d813387d566f627f3ea1b914c068aac94c40ae27ec43f5f33bde65abefe7"
dependencies = [
"libc",
"windows-sys 0.52.0",
]
[[package]]
name = "serde"
version = "1.0.219"
@@ -4266,6 +4287,7 @@ dependencies = [
"ty_ide",
"ty_project",
"ty_python_semantic",
"ty_vendored",
]
[[package]]
@@ -4305,6 +4327,7 @@ version = "0.0.0"
dependencies = [
"path-slash",
"ruff_db",
"static_assertions",
"walkdir",
"zip",
]


@@ -137,7 +137,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "0666e2018bc35376b1ac4f98906f2d04d11e5fe4" }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "fc00eba89e5dcaa5edba51c41aa5f309b5cb126b" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }


@@ -263,12 +263,23 @@ impl Files {
impl fmt::Debug for Files {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut map = f.debug_map();
if f.alternate() {
let mut map = f.debug_map();
for entry in self.inner.system_by_path.iter() {
map.entry(entry.key(), entry.value());
for entry in self.inner.system_by_path.iter() {
map.entry(entry.key(), entry.value());
}
map.finish()
} else {
f.debug_struct("Files")
.field("system_by_path", &self.inner.system_by_path.len())
.field(
"system_virtual_by_path",
&self.inner.system_virtual_by_path.len(),
)
.field("vendored_by_path", &self.inner.vendored_by_path.len())
.finish()
}
map.finish()
}
}


@@ -124,6 +124,11 @@ pub trait System: Debug {
/// Returns `None` if no such convention exists for the system.
fn user_config_directory(&self) -> Option<SystemPathBuf>;
/// Returns the directory path where cached files are stored.
///
/// Returns `None` if no such convention exists for the system.
fn cache_dir(&self) -> Option<SystemPathBuf>;
/// Iterate over the contents of the directory at `path`.
///
/// The returned iterator must have the following properties:
@@ -186,6 +191,9 @@ pub trait System: Debug {
Err(std::env::VarError::NotPresent)
}
/// Returns a handle to a [`WritableSystem`] if this system is writeable.
fn as_writable(&self) -> Option<&dyn WritableSystem>;
fn as_any(&self) -> &dyn std::any::Any;
fn as_any_mut(&mut self) -> &mut dyn std::any::Any;
@@ -226,11 +234,52 @@ impl fmt::Display for CaseSensitivity {
/// System trait for non-readonly systems.
pub trait WritableSystem: System {
/// Creates a file at the given path.
///
/// Returns an error if the file already exists.
fn create_new_file(&self, path: &SystemPath) -> Result<()>;
/// Writes the given content to the file at the given path.
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()>;
/// Creates a directory at `path` as well as any intermediate directories.
fn create_directory_all(&self, path: &SystemPath) -> Result<()>;
/// Reads the provided file from the system cache, or creates the file if necessary.
///
/// Returns `Ok(None)` if the system does not expose a suitable cache directory.
fn get_or_cache(
&self,
path: &SystemPath,
read_contents: &dyn Fn() -> Result<String>,
) -> Result<Option<SystemPathBuf>> {
let Some(cache_dir) = self.cache_dir() else {
return Ok(None);
};
let cache_path = cache_dir.join(path);
// The file has already been cached.
if self.is_file(&cache_path) {
return Ok(Some(cache_path));
}
// Read the file contents.
let contents = read_contents()?;
// Create the parent directory.
self.create_directory_all(cache_path.parent().unwrap())?;
// Create and write to the file on the system.
//
// Note that `create_new_file` will fail if the file has already been created. This
// ensures that only one thread/process ever attempts to write to it to avoid corrupting
// the cache.
self.create_new_file(&cache_path)?;
self.write_file(&cache_path, &contents)?;
Ok(Some(cache_path))
}
}
#[derive(Clone, Debug, Eq, PartialEq)]


@@ -1,4 +1,5 @@
use std::collections::BTreeMap;
use std::collections::{BTreeMap, btree_map};
use std::io;
use std::iter::FusedIterator;
use std::sync::{Arc, RwLock, RwLockWriteGuard};
@@ -153,6 +154,26 @@ impl MemoryFileSystem {
virtual_files.contains_key(&path.to_path_buf())
}
pub(crate) fn create_new_file(&self, path: &SystemPath) -> Result<()> {
let normalized = self.normalize_path(path);
let mut by_path = self.inner.by_path.write().unwrap();
match by_path.entry(normalized) {
btree_map::Entry::Vacant(entry) => {
entry.insert(Entry::File(File {
content: String::new(),
last_modified: file_time_now(),
}));
Ok(())
}
btree_map::Entry::Occupied(_) => Err(io::Error::new(
io::ErrorKind::AlreadyExists,
"File already exists",
)),
}
}
/// Stores a new file in the file system.
///
/// The operation overrides the content for an existing file with the same normalized `path`.
@@ -278,14 +299,14 @@ impl MemoryFileSystem {
let normalized = fs.normalize_path(path);
match by_path.entry(normalized) {
std::collections::btree_map::Entry::Occupied(entry) => match entry.get() {
btree_map::Entry::Occupied(entry) => match entry.get() {
Entry::File(_) => {
entry.remove();
Ok(())
}
Entry::Directory(_) => Err(is_a_directory()),
},
std::collections::btree_map::Entry::Vacant(_) => Err(not_found()),
btree_map::Entry::Vacant(_) => Err(not_found()),
}
}
@@ -345,14 +366,14 @@ impl MemoryFileSystem {
}
match by_path.entry(normalized.clone()) {
std::collections::btree_map::Entry::Occupied(entry) => match entry.get() {
btree_map::Entry::Occupied(entry) => match entry.get() {
Entry::Directory(_) => {
entry.remove();
Ok(())
}
Entry::File(_) => Err(not_a_directory()),
},
std::collections::btree_map::Entry::Vacant(_) => Err(not_found()),
btree_map::Entry::Vacant(_) => Err(not_found()),
}
}


@@ -160,6 +160,39 @@ impl System for OsSystem {
None
}
/// Returns an absolute cache directory on the system.
///
/// On Linux and macOS, uses `$XDG_CACHE_HOME/ty` or `.cache/ty`.
/// On Windows, uses `C:\Users\User\AppData\Local\ty\cache`.
#[cfg(not(target_arch = "wasm32"))]
fn cache_dir(&self) -> Option<SystemPathBuf> {
use etcetera::BaseStrategy as _;
let cache_dir = etcetera::base_strategy::choose_base_strategy()
.ok()
.map(|dirs| dirs.cache_dir().join("ty"))
.map(|cache_dir| {
if cfg!(windows) {
// On Windows, we append `cache` to the LocalAppData directory, i.e., prefer
// `C:\Users\User\AppData\Local\ty\cache` over `C:\Users\User\AppData\Local\ty`.
cache_dir.join("cache")
} else {
cache_dir
}
})
.and_then(|path| SystemPathBuf::from_path_buf(path).ok())
.unwrap_or_else(|| SystemPathBuf::from(".ty_cache"));
Some(cache_dir)
}
// TODO: Remove this feature gating once `ruff_wasm` no longer indirectly depends on `ruff_db` with the
// `os` feature enabled (via `ruff_workspace` -> `ruff_graph` -> `ruff_db`).
#[cfg(target_arch = "wasm32")]
fn cache_dir(&self) -> Option<SystemPathBuf> {
None
}
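The lookup strategy can be sketched in Python. This is a hypothetical re-implementation for illustration, assuming the XDG and Windows Known Folder conventions that the `etcetera` crate follows (the `env`/`platform`/`home` parameters are stand-ins, not part of the original API):

```python
import os
from pathlib import Path

def ty_cache_dir(env: dict, platform: str, home: str) -> Path:
    # XDG_CACHE_HOME (or ~/.cache) on Unix-likes; on Windows, the
    # LocalAppData directory plus a trailing `cache` component.
    if platform == "win32":
        base = Path(env.get("LOCALAPPDATA", os.path.join(home, "AppData", "Local")))
        return base / "ty" / "cache"
    base = Path(env.get("XDG_CACHE_HOME", os.path.join(home, ".cache")))
    return base / "ty"
```

The extra `cache` component on Windows exists because LocalAppData holds both caches and other per-app state, whereas `$XDG_CACHE_HOME` is already cache-specific.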
/// Creates a builder to recursively walk `path`.
///
/// The walker ignores files according to [`ignore::WalkBuilder::standard_filters`]
@@ -192,6 +225,10 @@ impl System for OsSystem {
})
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn Any {
self
}
@@ -310,6 +347,10 @@ impl OsSystem {
}
impl WritableSystem for OsSystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
std::fs::File::create_new(path).map(drop)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
std::fs::write(path.as_std_path(), content)
}


@@ -102,6 +102,10 @@ impl System for TestSystem {
self.system().user_config_directory()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
self.system().cache_dir()
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -123,6 +127,10 @@ impl System for TestSystem {
self.system().glob(pattern)
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
@@ -149,6 +157,10 @@ impl Default for TestSystem {
}
impl WritableSystem for TestSystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
self.system().create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.system().write_file(path, content)
}
@@ -335,6 +347,10 @@ impl System for InMemorySystem {
self.user_config_directory.lock().unwrap().clone()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
None
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -357,6 +373,10 @@ impl System for InMemorySystem {
Ok(Box::new(iterator))
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
@@ -377,6 +397,10 @@ impl System for InMemorySystem {
}
impl WritableSystem for InMemorySystem {
fn create_new_file(&self, path: &SystemPath) -> Result<()> {
self.memory_fs.create_new_file(path)
}
fn write_file(&self, path: &SystemPath, content: &str) -> Result<()> {
self.memory_fs.write_file(path, content)
}


@@ -4,12 +4,12 @@ use std::fmt::{self, Debug};
use std::io::{self, Read, Write};
use std::sync::{Arc, Mutex, MutexGuard};
use crate::file_revision::FileRevision;
use zip::result::ZipResult;
use zip::write::FileOptions;
use zip::{CompressionMethod, ZipArchive, ZipWriter, read::ZipFile};
pub use self::path::{VendoredPath, VendoredPathBuf};
use crate::file_revision::FileRevision;
mod path;


@@ -114,6 +114,7 @@ fn generate_set(output: &mut String, set: Set, parents: &mut Vec<Set>) {
parents.pop();
}
#[derive(Debug)]
enum Set {
Toplevel(OptionSet),
Named { name: String, set: OptionSet },
@@ -136,7 +137,7 @@ impl Set {
}
fn emit_field(output: &mut String, name: &str, field: &OptionField, parents: &[Set]) {
let header_level = if parents.is_empty() { "###" } else { "####" };
let header_level = "#".repeat(parents.len() + 1);
let _ = writeln!(output, "{header_level} `{name}`");


@@ -73,12 +73,20 @@ fn generate_markdown() -> String {
for lint in lints {
let _ = writeln!(&mut output, "## `{rule_name}`\n", rule_name = lint.name());
// Increase the header-level by one
// Reformat headers as bold text
let mut in_code_fence = false;
let documentation = lint
.documentation_lines()
.map(|line| {
if line.starts_with('#') {
Cow::Owned(format!("#{line}"))
// Toggle the code fence state if we encounter a boundary
if line.starts_with("```") {
in_code_fence = !in_code_fence;
}
if !in_code_fence && line.starts_with('#') {
Cow::Owned(format!(
"**{line}**\n",
line = line.trim_start_matches('#').trim_start()
))
} else {
Cow::Borrowed(line)
}
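The fence tracking above matters because a leading `#` inside a fenced code block is a comment, not a markdown header; without it, Python comments in the lint documentation would be reformatted as bold text. The transformation can be sketched as:

```python
def bold_headers(lines):
    # Rewrite markdown headers as bold text, but leave lines inside
    # fenced code blocks untouched.
    in_fence = False
    out = []
    for line in lines:
        # Toggle the fence state at each ``` boundary.
        if line.startswith("```"):
            in_fence = not in_fence
        if not in_fence and line.startswith("#"):
            out.append(f"**{line.lstrip('#').lstrip()}**\n")
        else:
            out.append(line)
    return out
```

Note that a fence-opening line itself is never treated as a header, since the state is toggled before the header check.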
@@ -87,21 +95,15 @@ fn generate_markdown() -> String {
let _ = writeln!(
&mut output,
r#"**Default level**: {level}
<details>
<summary>{summary}</summary>
r#"<small>
Default level: [`{level}`](../rules.md#rule-levels "This lint has a default level of '{level}'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20{encoded_name}) ·
[View source](https://github.com/astral-sh/ruff/blob/main/{file}#L{line})
</small>
{documentation}
### Links
* [Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20{encoded_name})
* [View source](https://github.com/astral-sh/ruff/blob/main/{file}#L{line})
</details>
"#,
level = lint.default_level(),
// GitHub doesn't support markdown in `summary` headers
summary = replace_inline_code(lint.summary()),
encoded_name = url::form_urlencoded::byte_serialize(lint.name().as_str().as_bytes())
.collect::<String>(),
file = url::form_urlencoded::byte_serialize(lint.file().replace('\\', "/").as_bytes())
@@ -113,25 +115,6 @@ fn generate_markdown() -> String {
output
}
/// Replaces inline code blocks (`code`) with `<code>code</code>`
fn replace_inline_code(input: &str) -> String {
let mut output = String::new();
let mut parts = input.split('`');
while let Some(before) = parts.next() {
if let Some(between) = parts.next() {
output.push_str(before);
output.push_str("<code>");
output.push_str(between);
output.push_str("</code>");
} else {
output.push_str(before);
}
}
output
}
#[cfg(test)]
mod tests {
use anyhow::Result;


@@ -19,11 +19,15 @@ use crate::{AlwaysFixableViolation, Applicability, Edit, Fix};
///
/// ## Example
/// ```python
/// from typing import Literal
///
/// foo: Literal["a", "b", "a"]
/// ```
///
/// Use instead:
/// ```python
/// from typing import Literal
///
/// foo: Literal["a", "b"]
/// ```
///


@@ -21,27 +21,47 @@ use crate::{Fix, FixAvailability, Violation};
///
/// For example:
/// ```python
/// from collections.abc import Container, Iterable, Sized
/// from typing import Generic, TypeVar
///
///
/// T = TypeVar("T")
/// K = TypeVar("K")
/// V = TypeVar("V")
///
///
/// class LinkedList(Generic[T], Sized):
/// def push(self, item: T) -> None:
/// self._items.append(item)
///
///
/// class MyMapping(
/// Generic[K, V],
/// Iterable[Tuple[K, V]],
/// Container[Tuple[K, V]],
/// Iterable[tuple[K, V]],
/// Container[tuple[K, V]],
/// ):
/// ...
/// ```
///
/// Use instead:
/// ```python
/// from collections.abc import Container, Iterable, Sized
/// from typing import Generic, TypeVar
///
///
/// T = TypeVar("T")
/// K = TypeVar("K")
/// V = TypeVar("V")
///
///
/// class LinkedList(Sized, Generic[T]):
/// def push(self, item: T) -> None:
/// self._items.append(item)
///
///
/// class MyMapping(
/// Iterable[Tuple[K, V]],
/// Container[Tuple[K, V]],
/// Iterable[tuple[K, V]],
/// Container[tuple[K, V]],
/// Generic[K, V],
/// ):
/// ...


@@ -75,7 +75,7 @@ impl AlwaysFixableViolation for TypedArgumentDefaultInStub {
/// ## Example
///
/// ```pyi
/// def foo(arg=[]) -> None: ...
/// def foo(arg=bar()) -> None: ...
/// ```
///
/// Use instead:
@@ -120,7 +120,7 @@ impl AlwaysFixableViolation for ArgumentDefaultInStub {
///
/// ## Example
/// ```pyi
/// foo: str = "..."
/// foo: str = bar()
/// ```
///
/// Use instead:


@@ -14,11 +14,15 @@ use crate::checkers::ast::Checker;
///
/// ## Example
/// ```pyi
/// from typing import TypeAlias
///
/// type_alias_name: TypeAlias = int
/// ```
///
/// Use instead:
/// ```pyi
/// from typing import TypeAlias
///
/// TypeAliasName: TypeAlias = int
/// ```
#[derive(ViolationMetadata)]


@@ -11,8 +11,8 @@ use crate::{Violation, checkers::ast::Checker};
/// ## Why is this bad?
/// In assignment statements, starred expressions can be used to unpack iterables.
///
/// In Python 3, no more than 1 << 8 assignments are allowed before a starred
/// expression, and no more than 1 << 24 expressions are allowed after a starred
/// In Python 3, no more than `1 << 8` assignments are allowed before a starred
/// expression, and no more than `1 << 24` expressions are allowed after a starred
/// expression.
///
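These limits come from how CPython packs the before-star and after-star target counts into the `UNPACK_EX` instruction's operand; a quick demonstration of both ordinary starred unpacking and the compile-time limit:

```python
# Ordinary starred unpacking: the starred name collects the middle.
first, *middle, last = range(5)

# Exceeding the limit on targets before the star (1 << 8) is rejected
# at compile time with a SyntaxError.
too_many = ", ".join(f"t{i}" for i in range(300)) + ", *rest = range(301)"
try:
    compile(too_many, "<demo>", "exec")
    limit_enforced = False
except SyntaxError:
    limit_enforced = True
```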
/// ## References


@@ -3,7 +3,7 @@ name = "ty"
version = "0.0.0"
# required for correct pypi metadata
homepage = "https://github.com/astral-sh/ty/"
documentation = "https://github.com/astral-sh/ty/"
documentation = "https://docs.astral.sh/ty/"
# Releases occur in this other repository!
repository = "https://github.com/astral-sh/ty/"
edition.workspace = true


@@ -1,7 +1,7 @@
<!-- WARNING: This file is auto-generated (cargo dev generate-all). Update the doc comments on the 'Options' struct in 'crates/ty_project/src/metadata/options.rs' if you want to change anything here. -->
# Configuration
#### `rules`
## `rules`
Configures the enabled rules and their severity.
@@ -30,7 +30,7 @@ division-by-zero = "ignore"
## `environment`
#### `extra-paths`
### `extra-paths`
List of user-provided paths that should take first priority in the module resolution.
Examples in other type checkers are mypy's `MYPYPATH` environment variable,
@@ -49,7 +49,7 @@ extra-paths = ["~/shared/my-search-path"]
---
#### `python`
### `python`
Path to the Python installation from which ty resolves type information and third-party dependencies.
@@ -71,7 +71,7 @@ python = "./.venv"
---
#### `python-platform`
### `python-platform`
Specifies the target platform that will be used to analyze the source code.
If specified, ty will understand conditions based on comparisons with `sys.platform`, such
@@ -99,7 +99,7 @@ python-platform = "win32"
---
#### `python-version`
### `python-version`
Specifies the version of Python that will be used to analyze the source code.
The version should be specified as a string in the format `M.m` where `M` is the major version
@@ -132,7 +132,7 @@ python-version = "3.12"
---
#### `root`
### `root`
The root paths of the project, used for finding first-party modules.
@@ -161,7 +161,7 @@ root = ["./src", "./lib", "./vendor"]
---
#### `typeshed`
### `typeshed`
Optional path to a "typeshed" directory on disk for us to use for standard-library types.
If this is not provided, we will fallback to our vendored typeshed stubs for the stdlib,
@@ -213,7 +213,7 @@ possibly-unresolved-reference = "ignore"
```
#### `exclude`
### `exclude`
A list of file and directory patterns to exclude from this override.
@@ -240,7 +240,7 @@ exclude = [
---
#### `include`
### `include`
A list of file and directory patterns to include for this override.
@@ -266,7 +266,7 @@ include = [
---
#### `rules`
### `rules`
Rule overrides for files matching the include/exclude patterns.
@@ -292,7 +292,7 @@ possibly-unresolved-reference = "ignore"
## `src`
#### `exclude`
### `exclude`
A list of file and directory patterns to exclude from type checking.
@@ -357,7 +357,7 @@ exclude = [
---
#### `include`
### `include`
A list of files and directories to check. The `include` option
follows a similar syntax to `.gitignore` but reversed:
@@ -396,7 +396,7 @@ include = [
---
#### `respect-ignore-files`
### `respect-ignore-files`
Whether to automatically exclude files that are ignored by `.ignore`,
`.gitignore`, `.git/info/exclude`, and global `gitignore` files.
@@ -415,7 +415,7 @@ respect-ignore-files = false
---
#### `root`
### `root`
> [!WARN] "Deprecated"
> This option has been deprecated. Use `environment.root` instead.
@@ -446,7 +446,7 @@ root = "./app"
## `terminal`
#### `error-on-warning`
### `error-on-warning`
Use exit code 1 if there are any warning-level diagnostics.
@@ -466,7 +466,7 @@ error-on-warning = true
---
#### `output-format`
### `output-format`
The format to use for printing diagnostic messages.

crates/ty/docs/rules.md (generated, 1427 changed lines): diff suppressed because it is too large.


@@ -186,7 +186,7 @@ fn cli_arguments_are_relative_to_the_current_directory() -> anyhow::Result<()> {
3 |
4 | stat = add(10, 15)
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
@@ -412,7 +412,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
3 |
4 | print(z)
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
error[unresolved-import]: Cannot resolve imported module `does_not_exist`
@@ -421,7 +421,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
2 | import does_not_exist # error: unresolved-import
| ^^^^^^^^^^^^^^
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
Found 2 diagnostics
@@ -447,7 +447,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
3 |
4 | print(z)
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
error[unresolved-import]: Cannot resolve imported module `does_not_exist`
@@ -456,7 +456,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
2 | import does_not_exist # error: unresolved-import
| ^^^^^^^^^^^^^^
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
Found 2 diagnostics


@@ -333,7 +333,7 @@ import bar",
| ^^^
2 | import bar
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
@@ -909,7 +909,7 @@ fn check_conda_prefix_var_to_resolve_path() -> anyhow::Result<()> {
2 | import package1
| ^^^^^^^^
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic
@@ -1206,7 +1206,7 @@ fn default_root_tests_package() -> anyhow::Result<()> {
4 |
5 | print(f"{foo} {bar}")
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
Found 1 diagnostic


@@ -101,7 +101,7 @@ fn cli_rule_severity() -> anyhow::Result<()> {
3 |
4 | y = 4 / 0
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
error[unresolved-reference]: Name `prin` used when not defined
@@ -141,7 +141,7 @@ fn cli_rule_severity() -> anyhow::Result<()> {
3 |
4 | y = 4 / 0
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` was selected on the command line
warning[division-by-zero]: Cannot divide object of type `Literal[4]` by zero


@@ -919,6 +919,9 @@ fn directory_renamed() -> anyhow::Result<()> {
#[test]
fn directory_deleted() -> anyhow::Result<()> {
use ruff_db::testing::setup_logging;
let _logging = setup_logging();
let mut case = setup([
("bar.py", "import sub.a"),
("sub/__init__.py", ""),


@@ -118,7 +118,7 @@ impl fmt::Display for DisplayHoverContent<'_, '_> {
match self.content {
HoverContent::Type(ty) => self
.kind
.fenced_code_block(ty.display(self.db), "text")
.fenced_code_block(ty.display(self.db), "python")
.fmt(f),
}
}
@@ -148,7 +148,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
Literal[10]
---------------------------------------------
```text
```python
Literal[10]
```
---------------------------------------------
@@ -184,7 +184,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
int
---------------------------------------------
```text
```python
int
```
---------------------------------------------
@@ -214,7 +214,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
def foo(a, b) -> Unknown
---------------------------------------------
```text
```python
def foo(a, b) -> Unknown
```
---------------------------------------------
@@ -243,7 +243,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
bool
---------------------------------------------
```text
```python
bool
```
---------------------------------------------
@@ -274,7 +274,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
Literal[123]
---------------------------------------------
```text
```python
Literal[123]
```
---------------------------------------------
@@ -312,7 +312,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
(def foo(a, b) -> Unknown) | (def bar(a, b) -> Unknown)
---------------------------------------------
```text
```python
(def foo(a, b) -> Unknown) | (def bar(a, b) -> Unknown)
```
---------------------------------------------
@@ -344,7 +344,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
<module 'lib'>
---------------------------------------------
```text
```python
<module 'lib'>
```
---------------------------------------------
@@ -373,7 +373,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
T
---------------------------------------------
```text
```python
T
```
---------------------------------------------
@@ -399,7 +399,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
@Todo
---------------------------------------------
```text
```python
@Todo
```
---------------------------------------------
@@ -425,7 +425,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
@Todo
---------------------------------------------
```text
```python
@Todo
```
---------------------------------------------
@@ -451,7 +451,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
Literal[1]
---------------------------------------------
```text
```python
Literal[1]
```
---------------------------------------------
@@ -482,7 +482,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
Literal[1]
---------------------------------------------
```text
```python
Literal[1]
```
---------------------------------------------
@@ -512,7 +512,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
Literal[2]
---------------------------------------------
```text
```python
Literal[2]
```
---------------------------------------------
@@ -545,7 +545,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
Unknown | Literal[1]
---------------------------------------------
```text
```python
Unknown | Literal[1]
```
---------------------------------------------
@@ -574,7 +574,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
int
---------------------------------------------
```text
```python
int
```
---------------------------------------------
@@ -602,7 +602,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
Literal[1]
---------------------------------------------
```text
```python
Literal[1]
```
---------------------------------------------
@@ -631,7 +631,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
int
---------------------------------------------
```text
```python
int
```
---------------------------------------------
@@ -661,7 +661,7 @@ mod tests {
assert_snapshot!(test.hover(), @r"
str
---------------------------------------------
```text
```python
str
```
---------------------------------------------


@@ -1,3 +1,4 @@
use std::fmt::Formatter;
use std::panic::{AssertUnwindSafe, RefUnwindSafe};
use std::sync::Arc;
use std::{cmp, fmt};
@@ -82,15 +83,20 @@ impl ProjectDatabase {
/// Checks all open files in the project and its dependencies.
pub fn check(&self) -> Vec<Diagnostic> {
let mut reporter = DummyReporter;
let reporter = AssertUnwindSafe(&mut reporter as &mut dyn Reporter);
self.project().check(self, reporter)
self.check_with_mode(CheckMode::OpenFiles)
}
/// Checks all open files in the project and its dependencies, using the given reporter.
pub fn check_with_reporter(&self, reporter: &mut dyn Reporter) -> Vec<Diagnostic> {
let reporter = AssertUnwindSafe(reporter);
self.project().check(self, reporter)
self.project().check(self, CheckMode::OpenFiles, reporter)
}
/// Check the project with the given mode.
pub fn check_with_mode(&self, mode: CheckMode) -> Vec<Diagnostic> {
let mut reporter = DummyReporter;
let reporter = AssertUnwindSafe(&mut reporter as &mut dyn Reporter);
self.project().check(self, mode, reporter)
}
#[tracing::instrument(level = "debug", skip(self))]
@@ -146,6 +152,27 @@ impl ProjectDatabase {
}
}
impl std::fmt::Debug for ProjectDatabase {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
f.debug_struct("ProjectDatabase")
.field("project", &self.project)
.field("files", &self.files)
.field("system", &self.system)
.finish_non_exhaustive()
}
}
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum CheckMode {
/// Checks only the open files in the project.
OpenFiles,
/// Checks all files in the project, ignoring the open file set.
///
/// This includes virtual files, such as those created by the language server.
AllFiles,
}
/// Stores memory usage information.
pub struct SalsaMemoryDump {
total_fields: usize,


@@ -1,7 +1,7 @@
use crate::glob::{GlobFilterCheckMode, IncludeResult};
use crate::metadata::options::{OptionDiagnostic, ToSettingsError};
use crate::walk::{ProjectFilesFilter, ProjectFilesWalker};
pub use db::{Db, ProjectDatabase, SalsaMemoryDump};
pub use db::{CheckMode, Db, ProjectDatabase, SalsaMemoryDump};
use files::{Index, Indexed, IndexedFiles};
use metadata::settings::Settings;
pub use metadata::{ProjectMetadata, ProjectMetadataError};
@@ -55,6 +55,7 @@ pub fn default_lints_registry() -> LintRegistry {
it remains the same project. That's why the program is a narrowed view of the project,
holding on only to the most fundamental settings required for checking.
#[salsa::input]
#[derive(Debug)]
pub struct Project {
/// The files that are open in the project.
///
@@ -213,6 +214,7 @@ impl Project {
pub(crate) fn check(
self,
db: &ProjectDatabase,
mode: CheckMode,
mut reporter: AssertUnwindSafe<&mut dyn Reporter>,
) -> Vec<Diagnostic> {
let project_span = tracing::debug_span!("Project::check");
@@ -227,7 +229,11 @@ impl Project {
.map(OptionDiagnostic::to_diagnostic),
);
let files = ProjectFiles::new(db, self);
let files = match mode {
CheckMode::OpenFiles => ProjectFiles::new(db, self),
// TODO: Consider open virtual files as well
CheckMode::AllFiles => ProjectFiles::Indexed(self.files(db)),
};
reporter.set_files(files.len());
diagnostics.extend(


@@ -87,13 +87,8 @@ c_instance = C()
reveal_type(c_instance.declared_and_bound) # revealed: str | None
# Note that both mypy and pyright show no error in this case! So we may reconsider this in
# the future, if it turns out to produce too many false positives. We currently emit:
# error: [unresolved-attribute] "Attribute `declared_and_bound` can only be accessed on instances, not on the class object `<class 'C'>` itself."
reveal_type(C.declared_and_bound) # revealed: Unknown
reveal_type(C.declared_and_bound) # revealed: str | None
# Same as above. Mypy and pyright do not show an error here.
# error: [invalid-attribute-access] "Cannot assign to instance attribute `declared_and_bound` from the class object `<class 'C'>`"
C.declared_and_bound = "overwritten on class"
# error: [invalid-assignment] "Object of type `Literal[1]` is not assignable to attribute `declared_and_bound` of type `str | None`"
@@ -102,8 +97,11 @@ c_instance.declared_and_bound = 1
#### Variable declared in class body and not bound anywhere
If a variable is declared in the class body but not bound anywhere, we still consider it a pure
instance variable and allow access to it via instances.
If a variable is declared in the class body but not bound anywhere, we consider it to be accessible
on instances and the class itself. It would be more consistent to treat this as a pure instance
variable (and require the attribute to be annotated with `ClassVar` if it should be accessible on
the class as well), but other type checkers allow this as well. This is also heavily relied on in
the Python ecosystem:
```py
class C:
@@ -113,11 +111,8 @@ c_instance = C()
reveal_type(c_instance.only_declared) # revealed: str
# Mypy and pyright do not show an error here. We treat this as a pure instance variable.
# error: [unresolved-attribute] "Attribute `only_declared` can only be accessed on instances, not on the class object `<class 'C'>` itself."
reveal_type(C.only_declared) # revealed: Unknown
reveal_type(C.only_declared) # revealed: str
# error: [invalid-attribute-access] "Cannot assign to instance attribute `only_declared` from the class object `<class 'C'>`"
C.only_declared = "overwritten on class"
```
@@ -1235,6 +1230,16 @@ def _(flag: bool):
reveal_type(Derived().x) # revealed: int | Any
Derived().x = 1
# TODO
# The following assignment currently fails, because we first check if "a" is assignable to the
# attribute on the meta-type of `Derived`, i.e. `<class 'Derived'>`. When accessing the class
# member `x` on `Derived`, we only see the `x: int` declaration and do not union it with the
# type of the base class attribute `x: Any`. This could potentially be improved. Note that we
# see a type of `int | Any` above because we have the full union handling of possibly-unbound
# *instance* attributes.
# error: [invalid-assignment] "Object of type `Literal["a"]` is not assignable to attribute `x` of type `int`"
Derived().x = "a"
```
@@ -1299,10 +1304,8 @@ def _(flag: bool):
if flag:
self.x = 1
# error: [possibly-unbound-attribute]
reveal_type(Foo().x) # revealed: int | Unknown
# error: [possibly-unbound-attribute]
Foo().x = 1
```


@@ -120,8 +120,7 @@ def _(flag: bool):
### Dunder methods as class-level annotations with no value
Class-level annotations with no value assigned are considered instance-only, and aren't available as
dunder methods:
Class-level annotations with no value assigned are considered to be accessible on the class:
```py
from typing import Callable
@@ -129,10 +128,8 @@ from typing import Callable
class C:
__call__: Callable[..., None]
# error: [call-non-callable]
C()()
# error: [invalid-assignment]
_: Callable[..., None] = C()
```


@@ -810,21 +810,6 @@ D(1) # OK
D() # error: [missing-argument]
```
### Accessing instance attributes on the class itself
Just like for normal classes, accessing instance attributes on the class itself is not allowed:
```py
from dataclasses import dataclass
@dataclass
class C:
x: int
# error: [unresolved-attribute] "Attribute `x` can only be accessed on instances, not on the class object `<class 'C'>` itself."
C.x
```
### Return type of `dataclass(...)`
A call like `dataclass(order=True)` returns a callable itself, which is then used as the decorator.


@@ -533,7 +533,13 @@ class FooSubclassOfAny:
x: SubclassOfAny
static_assert(not is_subtype_of(FooSubclassOfAny, HasX))
static_assert(not is_assignable_to(FooSubclassOfAny, HasX))
# `FooSubclassOfAny` is assignable to `HasX` for the following reason. The `x` attribute on `FooSubclassOfAny`
# is accessible on the class itself. When accessing `x` on an instance, the descriptor protocol is invoked, and
# `__get__` is looked up on `SubclassOfAny`. Every member access on `SubclassOfAny` yields `Any`, so `__get__` is
# also available, and calling `Any` also yields `Any`. Thus, accessing `x` on an instance of `FooSubclassOfAny`
# yields `Any`, which is assignable to `int` and vice versa.
static_assert(is_assignable_to(FooSubclassOfAny, HasX))
class FooWithY(Foo):
y: int
@@ -1586,11 +1592,7 @@ def g(a: Truthy, b: FalsyFoo, c: FalsyFooSubclass):
reveal_type(bool(c)) # revealed: Literal[False]
```
It is not sufficient for a protocol to have a callable `__bool__` instance member that returns
`Literal[True]` for it to be considered always truthy. Dunder methods are looked up on the class
rather than the instance. If a protocol `X` has an instance-attribute `__bool__` member, it is
unknowable whether that attribute can be accessed on the type of an object that satisfies `X`'s
interface:
The same works with a class-level declaration of `__bool__`:
```py
from typing import Callable
@@ -1599,7 +1601,7 @@ class InstanceAttrBool(Protocol):
__bool__: Callable[[], Literal[True]]
def h(obj: InstanceAttrBool):
reveal_type(bool(obj)) # revealed: bool
reveal_type(bool(obj)) # revealed: Literal[True]
```
## Callable protocols
@@ -1832,7 +1834,8 @@ def _(r: Recursive):
reveal_type(r.direct) # revealed: Recursive
reveal_type(r.union) # revealed: None | Recursive
reveal_type(r.intersection1) # revealed: C & Recursive
reveal_type(r.intersection2) # revealed: C & ~Recursive
# revealed: @Todo(map_with_boundness: intersections with negative contributions) | (C & ~Recursive)
reveal_type(r.intersection2)
reveal_type(r.t) # revealed: tuple[int, tuple[str, Recursive]]
reveal_type(r.callable1) # revealed: (int, /) -> Recursive
reveal_type(r.callable2) # revealed: (Recursive, /) -> int


@@ -26,7 +26,7 @@ error[unresolved-import]: Cannot resolve imported module `does_not_exist`
2 | from does_not_exist import foo, bar, baz
| ^^^^^^^^^^^^^^
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
```


@@ -24,7 +24,7 @@ error[unresolved-import]: Cannot resolve imported module `zqzqzqzqzqzqzq`
1 | import zqzqzqzqzqzqzq # error: [unresolved-import] "Cannot resolve imported module `zqzqzqzqzqzqzq`"
| ^^^^^^^^^^^^^^
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
```


@@ -36,7 +36,7 @@ error[unresolved-import]: Cannot resolve imported module `a.foo`
3 |
4 | # Topmost component unresolvable:
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
```
@@ -49,7 +49,7 @@ error[unresolved-import]: Cannot resolve imported module `b.foo`
5 | import b.foo # error: [unresolved-import] "Cannot resolve imported module `b.foo`"
| ^^^^^
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
```

View File

@@ -4,7 +4,7 @@ expression: snapshot
---
---
mdtest name: dataclasses.md - Dataclasses - `dataclasses.KW_ONLY`
mdtest path: crates/ty_python_semantic/resources/mdtest/dataclasses.md
mdtest path: crates/ty_python_semantic/resources/mdtest/dataclasses/dataclasses.md
---
# Python source files

View File

@@ -28,7 +28,7 @@ error[unresolved-import]: Cannot resolve imported module `does_not_exist`
2 |
3 | x = does_not_exist.foo
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
```

View File

@@ -28,7 +28,7 @@ error[unresolved-import]: Cannot resolve imported module `does_not_exist`
2 |
3 | stat = add(10, 15)
|
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: rule `unresolved-import` is enabled by default
```

View File

@@ -1064,6 +1064,37 @@ static_assert(not is_assignable_to(A, Callable[[int], int]))
reveal_type(A()(1)) # revealed: str
```
### Subclass of
#### Type of a class with constructor methods
```py
from typing import Callable
from ty_extensions import static_assert, is_assignable_to
class A:
def __init__(self, x: int) -> None: ...
class B:
def __new__(cls, x: str) -> "B":
return super().__new__(cls)
static_assert(is_assignable_to(type[A], Callable[[int], A]))
static_assert(not is_assignable_to(type[A], Callable[[str], A]))
static_assert(is_assignable_to(type[B], Callable[[str], B]))
static_assert(not is_assignable_to(type[B], Callable[[int], B]))
```
#### Type with no generic parameters
```py
from typing import Callable, Any
from ty_extensions import static_assert, is_assignable_to
static_assert(is_assignable_to(type, Callable[..., Any]))
```
## Generics
### Assignability of generic types parameterized by gradual types
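The constructor-based callability asserted by the new `type[A]` / `type[B]` tests also has a runtime counterpart: calling a class object goes through `type.__call__`, which dispatches to `__new__` and `__init__`. A minimal runtime sketch (the `static_assert` / `is_assignable_to` checks are type-checker-only; only the class shapes below mirror the test, everything else is illustrative):

```python
from typing import Callable


class A:
    def __init__(self, x: int) -> None:
        self.x = x


class B:
    def __new__(cls, x: str) -> "B":
        # Overriding __new__ makes (str) -> B the effective call signature;
        # object.__init__ tolerates the extra argument because __new__ is overridden.
        return super().__new__(cls)


# type[A] acts like Callable[[int], A]: calling A forwards to __init__'s parameters.
a = A(1)

# type[B] acts like Callable[[str], B]: __new__ defines the signature.
b = B("s")

# A plain callable annotation accepts either class object.
make_a: Callable[[int], A] = A
make_b: Callable[[str], B] = B
```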

View File

@@ -1752,6 +1752,28 @@ static_assert(not is_subtype_of(TypeOf[F], Callable[[], str]))
static_assert(not is_subtype_of(TypeOf[F], Callable[[int], F]))
```
### Subclass of
#### Type of a class with constructor methods
```py
from typing import Callable
from ty_extensions import TypeOf, static_assert, is_subtype_of
class A:
def __init__(self, x: int) -> None: ...
class B:
def __new__(cls, x: str) -> "B":
return super().__new__(cls)
static_assert(is_subtype_of(type[A], Callable[[int], A]))
static_assert(not is_subtype_of(type[A], Callable[[str], A]))
static_assert(is_subtype_of(type[B], Callable[[str], B]))
static_assert(not is_subtype_of(type[B], Callable[[int], B]))
```
### Bound methods
```py

View File

@@ -64,24 +64,6 @@ c = C()
c.a = 2
```
and similarly here:
```py
class Base:
a: ClassVar[int] = 1
class Derived(Base):
if flag():
a: int
reveal_type(Derived.a) # revealed: int
d = Derived()
# error: [invalid-attribute-access]
d.a = 2
```
## Too many arguments
```py

View File

@@ -6,9 +6,7 @@ use ruff_python_ast::name::Name;
use ruff_python_ast::statement_visitor::{StatementVisitor, walk_stmt};
use ruff_python_ast::{self as ast};
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::place::ScopeId;
use crate::semantic_index::{SemanticIndex, global_scope, semantic_index};
use crate::semantic_index::{SemanticIndex, semantic_index};
use crate::types::{Truthiness, Type, infer_expression_types};
use crate::{Db, ModuleName, resolve_module};
@@ -44,11 +42,6 @@ struct DunderAllNamesCollector<'db> {
db: &'db dyn Db,
file: File,
/// The scope in which the `__all__` names are being collected from.
///
/// This is always going to be the global scope of the module.
scope: ScopeId<'db>,
/// The semantic index for the module.
index: &'db SemanticIndex<'db>,
@@ -68,7 +61,6 @@ impl<'db> DunderAllNamesCollector<'db> {
Self {
db,
file,
scope: global_scope(db, file),
index,
origin: None,
invalid: false,
@@ -190,8 +182,7 @@ impl<'db> DunderAllNamesCollector<'db> {
///
/// This function panics if `expr` was not marked as a standalone expression during semantic indexing.
fn standalone_expression_type(&self, expr: &ast::Expr) -> Type<'db> {
infer_expression_types(self.db, self.index.expression(expr))
.expression_type(expr.scoped_expression_id(self.db, self.scope))
infer_expression_types(self.db, self.index.expression(expr)).expression_type(expr)
}
/// Evaluate the given expression and return its truthiness.

View File

@@ -235,29 +235,28 @@ pub(crate) fn class_symbol<'db>(
) -> PlaceAndQualifiers<'db> {
place_table(db, scope)
.place_id_by_name(name)
.map(|symbol| {
let symbol_and_quals = place_by_id(
.map(|place| {
let place_and_quals = place_by_id(
db,
scope,
symbol,
place,
RequiresExplicitReExport::No,
ConsideredDefinitions::EndOfScope,
);
if symbol_and_quals.is_class_var() {
// For declared class vars we do not need to check if they have bindings,
// we just trust the declaration.
return symbol_and_quals;
if !place_and_quals.place.is_unbound() {
// Trust the declared type if we see a class-level declaration
return place_and_quals;
}
if let PlaceAndQualifiers {
place: Place::Type(ty, _),
qualifiers,
} = symbol_and_quals
} = place_and_quals
{
// Otherwise, we need to check if the symbol has bindings
let use_def = use_def_map(db, scope);
let bindings = use_def.end_of_scope_bindings(symbol);
let bindings = use_def.end_of_scope_bindings(place);
let inferred = place_from_bindings_impl(db, bindings, RequiresExplicitReExport::No);
// TODO: we should not need to calculate the inferred type a second time. This is a temporary

View File

@@ -26,20 +26,11 @@ use crate::semantic_index::semantic_index;
/// ```
#[derive(Debug, salsa::Update, get_size2::GetSize)]
pub(crate) struct AstIds {
/// Maps expressions to their expression id.
expressions_map: FxHashMap<ExpressionNodeKey, ScopedExpressionId>,
/// Maps expressions which "use" a place (that is, [`ast::ExprName`], [`ast::ExprAttribute`] or [`ast::ExprSubscript`]) to a use id.
uses_map: FxHashMap<ExpressionNodeKey, ScopedUseId>,
}
impl AstIds {
fn expression_id(&self, key: impl Into<ExpressionNodeKey>) -> ScopedExpressionId {
let key = &key.into();
*self.expressions_map.get(key).unwrap_or_else(|| {
panic!("Could not find expression ID for {key:?}");
})
}
fn use_id(&self, key: impl Into<ExpressionNodeKey>) -> ScopedUseId {
self.uses_map[&key.into()]
}
@@ -94,90 +85,12 @@ impl HasScopedUseId for ast::ExprRef<'_> {
}
}
/// Uniquely identifies an [`ast::Expr`] in a [`crate::semantic_index::place::FileScopeId`].
#[newtype_index]
#[derive(salsa::Update, get_size2::GetSize)]
pub struct ScopedExpressionId;
pub trait HasScopedExpressionId {
/// Returns the ID that uniquely identifies the node in `scope`.
fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId;
}
impl<T: HasScopedExpressionId> HasScopedExpressionId for Box<T> {
fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId {
self.as_ref().scoped_expression_id(db, scope)
}
}
macro_rules! impl_has_scoped_expression_id {
($ty: ty) => {
impl HasScopedExpressionId for $ty {
fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId {
let expression_ref = ExprRef::from(self);
expression_ref.scoped_expression_id(db, scope)
}
}
};
}
impl_has_scoped_expression_id!(ast::ExprBoolOp);
impl_has_scoped_expression_id!(ast::ExprName);
impl_has_scoped_expression_id!(ast::ExprBinOp);
impl_has_scoped_expression_id!(ast::ExprUnaryOp);
impl_has_scoped_expression_id!(ast::ExprLambda);
impl_has_scoped_expression_id!(ast::ExprIf);
impl_has_scoped_expression_id!(ast::ExprDict);
impl_has_scoped_expression_id!(ast::ExprSet);
impl_has_scoped_expression_id!(ast::ExprListComp);
impl_has_scoped_expression_id!(ast::ExprSetComp);
impl_has_scoped_expression_id!(ast::ExprDictComp);
impl_has_scoped_expression_id!(ast::ExprGenerator);
impl_has_scoped_expression_id!(ast::ExprAwait);
impl_has_scoped_expression_id!(ast::ExprYield);
impl_has_scoped_expression_id!(ast::ExprYieldFrom);
impl_has_scoped_expression_id!(ast::ExprCompare);
impl_has_scoped_expression_id!(ast::ExprCall);
impl_has_scoped_expression_id!(ast::ExprFString);
impl_has_scoped_expression_id!(ast::ExprStringLiteral);
impl_has_scoped_expression_id!(ast::ExprBytesLiteral);
impl_has_scoped_expression_id!(ast::ExprNumberLiteral);
impl_has_scoped_expression_id!(ast::ExprBooleanLiteral);
impl_has_scoped_expression_id!(ast::ExprNoneLiteral);
impl_has_scoped_expression_id!(ast::ExprEllipsisLiteral);
impl_has_scoped_expression_id!(ast::ExprAttribute);
impl_has_scoped_expression_id!(ast::ExprSubscript);
impl_has_scoped_expression_id!(ast::ExprStarred);
impl_has_scoped_expression_id!(ast::ExprNamed);
impl_has_scoped_expression_id!(ast::ExprList);
impl_has_scoped_expression_id!(ast::ExprTuple);
impl_has_scoped_expression_id!(ast::ExprSlice);
impl_has_scoped_expression_id!(ast::ExprIpyEscapeCommand);
impl_has_scoped_expression_id!(ast::Expr);
impl HasScopedExpressionId for ast::ExprRef<'_> {
fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId {
let ast_ids = ast_ids(db, scope);
ast_ids.expression_id(*self)
}
}
#[derive(Debug, Default)]
pub(super) struct AstIdsBuilder {
expressions_map: FxHashMap<ExpressionNodeKey, ScopedExpressionId>,
uses_map: FxHashMap<ExpressionNodeKey, ScopedUseId>,
}
impl AstIdsBuilder {
/// Adds `expr` to the expression ids map and returns its id.
pub(super) fn record_expression(&mut self, expr: &ast::Expr) -> ScopedExpressionId {
let expression_id = self.expressions_map.len().into();
self.expressions_map.insert(expr.into(), expression_id);
expression_id
}
/// Adds `expr` to the use ids map and returns its id.
pub(super) fn record_use(&mut self, expr: impl Into<ExpressionNodeKey>) -> ScopedUseId {
let use_id = self.uses_map.len().into();
@@ -188,11 +101,9 @@ impl AstIdsBuilder {
}
pub(super) fn finish(mut self) -> AstIds {
self.expressions_map.shrink_to_fit();
self.uses_map.shrink_to_fit();
AstIds {
expressions_map: self.expressions_map,
uses_map: self.uses_map,
}
}
@@ -219,6 +130,12 @@ pub(crate) mod node_key {
}
}
impl From<&ast::ExprCall> for ExpressionNodeKey {
fn from(value: &ast::ExprCall) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::Identifier> for ExpressionNodeKey {
fn from(value: &ast::Identifier) -> Self {
Self(NodeKey::from_node(value))

View File

@@ -1918,7 +1918,6 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
self.scopes_by_expression
.insert(expr.into(), self.current_scope());
self.current_ast_ids().record_expression(expr);
let node_key = NodeKey::from_node(expr);

View File

@@ -7,7 +7,6 @@ use ruff_source_file::LineIndex;
use crate::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::{KnownModule, Module, resolve_module};
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::place::FileScopeId;
use crate::semantic_index::semantic_index;
use crate::types::ide_support::all_declarations_and_bindings;
@@ -159,8 +158,7 @@ impl HasType for ast::ExprRef<'_> {
let file_scope = index.expression_scope_id(*self);
let scope = file_scope.to_scope_id(model.db, model.file);
let expression_id = self.scoped_expression_id(model.db, scope);
infer_scope_types(model.db, scope).expression_type(expression_id)
infer_scope_types(model.db, scope).expression_type(*self)
}
}

View File

@@ -33,7 +33,6 @@ pub(crate) use self::subclass_of::{SubclassOfInner, SubclassOfType};
use crate::module_name::ModuleName;
use crate::module_resolver::{KnownModule, resolve_module};
use crate::place::{Boundness, Place, PlaceAndQualifiers, imported_symbol};
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::definition::Definition;
use crate::semantic_index::place::{ScopeId, ScopedPlaceId};
use crate::semantic_index::{imported_modules, place_table, semantic_index};
@@ -143,18 +142,17 @@ fn definition_expression_type<'db>(
let index = semantic_index(db, file);
let file_scope = index.expression_scope_id(expression);
let scope = file_scope.to_scope_id(db, file);
let expr_id = expression.scoped_expression_id(db, scope);
if scope == definition.scope(db) {
// expression is in the definition scope
let inference = infer_definition_types(db, definition);
if let Some(ty) = inference.try_expression_type(expr_id) {
if let Some(ty) = inference.try_expression_type(expression) {
ty
} else {
infer_deferred_types(db, definition).expression_type(expr_id)
infer_deferred_types(db, definition).expression_type(expression)
}
} else {
// expression is in a type-params sub-scope
infer_scope_types(db, scope).expression_type(expr_id)
infer_scope_types(db, scope).expression_type(expression)
}
}
@@ -1561,6 +1559,16 @@ impl<'db> Type<'db> {
.into_callable(db)
.has_relation_to(db, target, relation),
// TODO: This is unsound so in future we can consider an opt-in option to disable it.
(Type::SubclassOf(subclass_of_ty), Type::Callable(_))
if subclass_of_ty.subclass_of().into_class().is_some() =>
{
let class = subclass_of_ty.subclass_of().into_class().unwrap();
class
.into_callable(db)
.has_relation_to(db, target, relation)
}
// `Literal[str]` is a subtype of `type` because the `str` class object is an instance of its metaclass `type`.
// `Literal[abc.ABC]` is a subtype of `abc.ABCMeta` because the `abc.ABC` class object
// is an instance of its metaclass `abc.ABCMeta`.

View File

@@ -32,7 +32,6 @@ use crate::{
known_module_symbol, place_from_bindings, place_from_declarations,
},
semantic_index::{
ast_ids::HasScopedExpressionId,
attribute_assignments,
definition::{DefinitionKind, TargetKind},
place::ScopeId,
@@ -1861,10 +1860,8 @@ impl<'db> ClassLiteral<'db> {
// [.., self.name, ..] = <value>
let unpacked = infer_unpack_types(db, unpack);
let target_ast_id = assign
.target(&module)
.scoped_expression_id(db, method_scope);
let inferred_ty = unpacked.expression_type(target_ast_id);
let inferred_ty = unpacked.expression_type(assign.target(&module));
union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
}
@@ -1890,10 +1887,8 @@ impl<'db> ClassLiteral<'db> {
// for .., self.name, .. in <iterable>:
let unpacked = infer_unpack_types(db, unpack);
let target_ast_id = for_stmt
.target(&module)
.scoped_expression_id(db, method_scope);
let inferred_ty = unpacked.expression_type(target_ast_id);
let inferred_ty =
unpacked.expression_type(for_stmt.target(&module));
union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
}
@@ -1921,10 +1916,8 @@ impl<'db> ClassLiteral<'db> {
// with <context_manager> as .., self.name, ..:
let unpacked = infer_unpack_types(db, unpack);
let target_ast_id = with_item
.target(&module)
.scoped_expression_id(db, method_scope);
let inferred_ty = unpacked.expression_type(target_ast_id);
let inferred_ty =
unpacked.expression_type(with_item.target(&module));
union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
}
@@ -1951,10 +1944,9 @@ impl<'db> ClassLiteral<'db> {
// [... for .., self.name, .. in <iterable>]
let unpacked = infer_unpack_types(db, unpack);
let target_ast_id = comprehension
.target(&module)
.scoped_expression_id(db, unpack.target_scope(db));
let inferred_ty = unpacked.expression_type(target_ast_id);
let inferred_ty =
unpacked.expression_type(comprehension.target(&module));
union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
}

View File

@@ -307,7 +307,7 @@ declare_lint! {
/// d: bytes
/// ```
pub(crate) static DUPLICATE_KW_ONLY = {
summary: "detects dataclass definitions with more than once usages of `KW_ONLY`",
summary: "detects dataclass definitions with more than one usage of `KW_ONLY`",
status: LintStatus::preview("1.0.0"),
default_level: Level::Error,
}

View File

@@ -45,6 +45,21 @@ use rustc_hash::{FxHashMap, FxHashSet};
use salsa;
use salsa::plumbing::AsId;
use super::context::{InNoTypeCheck, InferContext};
use super::diagnostic::{
INVALID_METACLASS, INVALID_OVERLOAD, INVALID_PROTOCOL, SUBCLASS_OF_FINAL_CLASS,
hint_if_stdlib_submodule_exists_on_other_versions, report_attempted_protocol_instantiation,
report_duplicate_bases, report_index_out_of_bounds, report_invalid_exception_caught,
report_invalid_exception_cause, report_invalid_exception_raised,
report_invalid_or_unsupported_base, report_invalid_type_checking_constant,
report_non_subscriptable, report_possibly_unresolved_reference, report_slice_step_size_zero,
};
use super::generics::LegacyGenericBase;
use super::string_annotation::{
BYTE_STRING_TYPE_ANNOTATION, FSTRING_TYPE_ANNOTATION, parse_string_annotation,
};
use super::subclass_of::SubclassOfInner;
use super::{ClassBase, NominalInstanceType, add_inferred_python_version_hint_to_diagnostic};
use crate::module_name::{ModuleName, ModuleNameResolutionError};
use crate::module_resolver::resolve_module;
use crate::node_key::NodeKey;
@@ -54,9 +69,8 @@ use crate::place::{
module_type_implicit_global_declaration, module_type_implicit_global_symbol, place,
place_from_bindings, place_from_declarations, typing_extensions_symbol,
};
use crate::semantic_index::ast_ids::{
HasScopedExpressionId, HasScopedUseId, ScopedExpressionId, ScopedUseId,
};
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::ast_ids::{HasScopedUseId, ScopedUseId};
use crate::semantic_index::definition::{
AnnotatedAssignmentDefinitionKind, AssignmentDefinitionKind, ComprehensionDefinitionKind,
Definition, DefinitionKind, DefinitionNodeKey, DefinitionState, ExceptHandlerDefinitionKind,
@@ -110,22 +124,6 @@ use crate::util::diagnostics::format_enumeration;
use crate::util::subscript::{PyIndex, PySlice};
use crate::{Db, FxOrderSet, Program};
use super::context::{InNoTypeCheck, InferContext};
use super::diagnostic::{
INVALID_METACLASS, INVALID_OVERLOAD, INVALID_PROTOCOL, SUBCLASS_OF_FINAL_CLASS,
hint_if_stdlib_submodule_exists_on_other_versions, report_attempted_protocol_instantiation,
report_duplicate_bases, report_index_out_of_bounds, report_invalid_exception_caught,
report_invalid_exception_cause, report_invalid_exception_raised,
report_invalid_or_unsupported_base, report_invalid_type_checking_constant,
report_non_subscriptable, report_possibly_unresolved_reference, report_slice_step_size_zero,
};
use super::generics::LegacyGenericBase;
use super::string_annotation::{
BYTE_STRING_TYPE_ANNOTATION, FSTRING_TYPE_ANNOTATION, parse_string_annotation,
};
use super::subclass_of::SubclassOfInner;
use super::{ClassBase, NominalInstanceType, add_inferred_python_version_hint_to_diagnostic};
/// Infer all types for a [`ScopeId`], including all definitions and expressions in that scope.
/// Use when checking a scope, or needing to provide a type for an arbitrary expression in the
/// scope.
@@ -281,12 +279,7 @@ pub(super) fn infer_same_file_expression_type<'db>(
parsed: &ParsedModuleRef,
) -> Type<'db> {
let inference = infer_expression_types(db, expression);
let scope = expression.scope(db);
inference.expression_type(
expression
.node_ref(db, parsed)
.scoped_expression_id(db, scope),
)
inference.expression_type(expression.node_ref(db, parsed))
}
/// Infers the type of an expression where the expression might come from another file.
@@ -337,7 +330,7 @@ pub(super) fn infer_unpack_types<'db>(db: &'db dyn Db, unpack: Unpack<'db>) -> U
let _span = tracing::trace_span!("infer_unpack_types", range=?unpack.range(db, &module), ?file)
.entered();
let mut unpacker = Unpacker::new(db, unpack.target_scope(db), unpack.value_scope(db), &module);
let mut unpacker = Unpacker::new(db, unpack.target_scope(db), &module);
unpacker.unpack(unpack.target(db, &module), unpack.value(db));
unpacker.finish()
}
@@ -417,7 +410,7 @@ struct TypeAndRange<'db> {
#[derive(Debug, Eq, PartialEq, salsa::Update, get_size2::GetSize)]
pub(crate) struct TypeInference<'db> {
/// The types of every expression in this region.
expressions: FxHashMap<ScopedExpressionId, Type<'db>>,
expressions: FxHashMap<ExpressionNodeKey, Type<'db>>,
/// The types of every binding in this region.
bindings: FxHashMap<Definition<'db>, Type<'db>>,
@@ -466,7 +459,7 @@ impl<'db> TypeInference<'db> {
}
#[track_caller]
pub(crate) fn expression_type(&self, expression: ScopedExpressionId) -> Type<'db> {
pub(crate) fn expression_type(&self, expression: impl Into<ExpressionNodeKey>) -> Type<'db> {
self.try_expression_type(expression).expect(
"Failed to retrieve the inferred type for an `ast::Expr` node \
passed to `TypeInference::expression_type()`. The `TypeInferenceBuilder` \
@@ -475,9 +468,12 @@ impl<'db> TypeInference<'db> {
)
}
pub(crate) fn try_expression_type(&self, expression: ScopedExpressionId) -> Option<Type<'db>> {
pub(crate) fn try_expression_type(
&self,
expression: impl Into<ExpressionNodeKey>,
) -> Option<Type<'db>> {
self.expressions
.get(&expression)
.get(&expression.into())
.copied()
.or(self.cycle_fallback_type)
}
@@ -738,13 +734,11 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
/// this node.
#[track_caller]
fn expression_type(&self, expr: &ast::Expr) -> Type<'db> {
self.types
.expression_type(expr.scoped_expression_id(self.db(), self.scope()))
self.types.expression_type(expr)
}
fn try_expression_type(&self, expr: &ast::Expr) -> Option<Type<'db>> {
self.types
.try_expression_type(expr.scoped_expression_id(self.db(), self.scope()))
self.types.try_expression_type(expr)
}
/// Get the type of an expression from any scope in the same file.
@@ -762,12 +756,11 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
fn file_expression_type(&self, expression: &ast::Expr) -> Type<'db> {
let file_scope = self.index.expression_scope_id(expression);
let expr_scope = file_scope.to_scope_id(self.db(), self.file());
let expr_id = expression.scoped_expression_id(self.db(), expr_scope);
match self.region {
InferenceRegion::Scope(scope) if scope == expr_scope => {
self.expression_type(expression)
}
_ => infer_scope_types(self.db(), expr_scope).expression_type(expr_id),
_ => infer_scope_types(self.db(), expr_scope).expression_type(expression),
}
}
@@ -1954,13 +1947,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
function: &'a ast::StmtFunctionDef,
) -> impl Iterator<Item = Type<'db>> + 'a {
let definition = self.index.expect_single_definition(function);
let scope = definition.scope(self.db());
let definition_types = infer_definition_types(self.db(), definition);
function.decorator_list.iter().map(move |decorator| {
definition_types
.expression_type(decorator.expression.scoped_expression_id(self.db(), scope))
})
function
.decorator_list
.iter()
.map(move |decorator| definition_types.expression_type(&decorator.expression))
}
/// Returns `true` if the current scope is the function body scope of a function overload (that
@@ -2759,11 +2752,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
match with_item.target_kind() {
TargetKind::Sequence(unpack_position, unpack) => {
let unpacked = infer_unpack_types(self.db(), unpack);
let target_ast_id = target.scoped_expression_id(self.db(), self.scope());
if unpack_position == UnpackPosition::First {
self.context.extend(unpacked.diagnostics());
}
unpacked.expression_type(target_ast_id)
unpacked.expression_type(target)
}
TargetKind::Single => {
let context_expr_ty = self.infer_standalone_expression(context_expr);
@@ -3757,8 +3749,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
self.context.extend(unpacked.diagnostics());
}
let target_ast_id = target.scoped_expression_id(self.db(), self.scope());
unpacked.expression_type(target_ast_id)
unpacked.expression_type(target)
}
TargetKind::Single => {
let value_ty = self.infer_standalone_expression(value);
@@ -3816,10 +3807,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// But here we explicitly overwrite the type for the overall `self.attr` node with
// the annotated type. We do not use `store_expression_type` here, because it checks
// that no type has been stored for the expression before.
let expr_id = target.scoped_expression_id(self.db(), self.scope());
self.types
.expressions
.insert(expr_id, annotated.inner_type());
.insert((&**target).into(), annotated.inner_type());
}
}
@@ -4077,8 +4067,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
if unpack_position == UnpackPosition::First {
self.context.extend(unpacked.diagnostics());
}
let target_ast_id = target.scoped_expression_id(self.db(), self.scope());
unpacked.expression_type(target_ast_id)
unpacked.expression_type(target)
}
TargetKind::Single => {
let iterable_type = self.infer_standalone_expression(iterable);
@@ -4172,7 +4162,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
diagnostic.info(
"make sure your Python environment is properly configured: \
https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment",
https://docs.astral.sh/ty/modules/#python-environment",
);
}
}
@@ -4628,7 +4618,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// the result from `types` directly because we might be in cycle recovery where
// `types.cycle_fallback_type` is `Some(fallback_ty)`, which we can retrieve by
// using `expression_type` on `types`:
types.expression_type(expression.scoped_expression_id(self.db(), self.scope()))
types.expression_type(expression)
}
fn infer_expression_impl(&mut self, expression: &ast::Expr) -> Type<'db> {
@@ -4680,15 +4670,14 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
ty
}
fn store_expression_type(&mut self, expression: &impl HasScopedExpressionId, ty: Type<'db>) {
fn store_expression_type(&mut self, expression: &ast::Expr, ty: Type<'db>) {
if self.deferred_state.in_string_annotation() {
// Avoid storing the type of expressions that are part of a string annotation because
// the expression ids don't exist in the semantic index. Instead, we'll store the type
// on the string expression itself that represents the annotation.
return;
}
let expr_id = expression.scoped_expression_id(self.db(), self.scope());
let previous = self.types.expressions.insert(expr_id, ty);
let previous = self.types.expressions.insert(expression.into(), ty);
assert_eq!(previous, None);
}
@@ -5093,20 +5082,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// because `ScopedExpressionId`s are only meaningful within their own scope, so
// we'd add types for random wrong expressions in the current scope
if comprehension.is_first() && target.is_name_expr() {
let lookup_scope = self
.index
.parent_scope_id(self.scope().file_scope_id(self.db()))
.expect("A comprehension should never be the top-level scope")
.to_scope_id(self.db(), self.file());
result.expression_type(iterable.scoped_expression_id(self.db(), lookup_scope))
result.expression_type(iterable)
} else {
let scope = self.types.scope;
self.types.scope = result.scope;
self.extend(result);
self.types.scope = scope;
result.expression_type(
iterable.scoped_expression_id(self.db(), expression.scope(self.db())),
)
result.expression_type(iterable)
}
};
@@ -5121,9 +5103,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
if unpack_position == UnpackPosition::First {
self.context.extend(unpacked.diagnostics());
}
let target_ast_id =
target.scoped_expression_id(self.db(), unpack.target_scope(self.db()));
unpacked.expression_type(target_ast_id)
unpacked.expression_type(target)
}
TargetKind::Single => {
let iterable_type = infer_iterable_type();
@@ -5135,10 +5116,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
};
self.types.expressions.insert(
target.scoped_expression_id(self.db(), self.scope()),
target_type,
);
self.types.expressions.insert(target.into(), target_type);
self.add_binding(target.into(), definition, target_type);
}

View File

@@ -1,5 +1,4 @@
use crate::Db;
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::expression::Expression;
use crate::semantic_index::place::{PlaceExpr, PlaceTable, ScopeId, ScopedPlaceId};
use crate::semantic_index::place_table;
@@ -687,7 +686,7 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
// and that requires cross-symbol constraints, which we don't support yet.
return None;
}
let scope = self.scope();
let inference = infer_expression_types(self.db, expression);
let comparator_tuples = std::iter::once(&**left)
@@ -698,10 +697,8 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
let mut last_rhs_ty: Option<Type> = None;
for (op, (left, right)) in std::iter::zip(&**ops, comparator_tuples) {
let lhs_ty = last_rhs_ty.unwrap_or_else(|| {
inference.expression_type(left.scoped_expression_id(self.db, scope))
});
let rhs_ty = inference.expression_type(right.scoped_expression_id(self.db, scope));
let lhs_ty = last_rhs_ty.unwrap_or_else(|| inference.expression_type(left));
let rhs_ty = inference.expression_type(right);
last_rhs_ty = Some(rhs_ty);
match left {
@@ -756,8 +753,7 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
continue;
}
let callable_type =
inference.expression_type(callable.scoped_expression_id(self.db, scope));
let callable_type = inference.expression_type(&**callable);
if callable_type
.into_class_literal()
@@ -782,11 +778,9 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
expression: Expression<'db>,
is_positive: bool,
) -> Option<NarrowingConstraints<'db>> {
let scope = self.scope();
let inference = infer_expression_types(self.db, expression);
let callable_ty =
inference.expression_type(expr_call.func.scoped_expression_id(self.db, scope));
let callable_ty = inference.expression_type(&*expr_call.func);
// TODO: add support for PEP 604 union types on the right hand side of `isinstance`
// and `issubclass`, for example `isinstance(x, str | (int | float))`.
@@ -797,8 +791,7 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
None | Some(KnownFunction::RevealType)
) =>
{
let return_ty =
inference.expression_type(expr_call.scoped_expression_id(self.db, scope));
let return_ty = inference.expression_type(expr_call);
let (guarded_ty, place) = match return_ty {
// TODO: TypeGuard
@@ -824,7 +817,7 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
if function == KnownFunction::HasAttr {
let attr = inference
.expression_type(second_arg.scoped_expression_id(self.db, scope))
.expression_type(second_arg)
.into_string_literal()?
.value(self.db);
@@ -847,8 +840,7 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
let function = function.into_classinfo_constraint_function()?;
let class_info_ty =
inference.expression_type(second_arg.scoped_expression_id(self.db, scope));
let class_info_ty = inference.expression_type(second_arg);
function
.generate_constraint(self.db, class_info_ty)
@@ -939,15 +931,12 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
is_positive: bool,
) -> Option<NarrowingConstraints<'db>> {
let inference = infer_expression_types(self.db, expression);
let scope = self.scope();
let mut sub_constraints = expr_bool_op
.values
.iter()
// filter out arms with statically known truthiness
.filter(|expr| {
inference
.expression_type(expr.scoped_expression_id(self.db, scope))
.bool(self.db)
inference.expression_type(*expr).bool(self.db)
!= match expr_bool_op.op {
BoolOp::And => Truthiness::AlwaysTrue,
BoolOp::Or => Truthiness::AlwaysFalse,

View File

@@ -6,7 +6,7 @@ use rustc_hash::FxHashMap;
use ruff_python_ast::{self as ast, AnyNodeRef};
use crate::Db;
use crate::semantic_index::ast_ids::{HasScopedExpressionId, ScopedExpressionId};
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::place::ScopeId;
use crate::types::tuple::{ResizeTupleError, Tuple, TupleLength, TupleUnpacker};
use crate::types::{Type, TypeCheckDiagnostics, infer_expression_types};
@@ -18,23 +18,18 @@ use super::diagnostic::INVALID_ASSIGNMENT;
/// Unpacks the value expression type to their respective targets.
pub(crate) struct Unpacker<'db, 'ast> {
context: InferContext<'db, 'ast>,
target_scope: ScopeId<'db>,
value_scope: ScopeId<'db>,
targets: FxHashMap<ScopedExpressionId, Type<'db>>,
targets: FxHashMap<ExpressionNodeKey, Type<'db>>,
}
impl<'db, 'ast> Unpacker<'db, 'ast> {
pub(crate) fn new(
db: &'db dyn Db,
target_scope: ScopeId<'db>,
value_scope: ScopeId<'db>,
module: &'ast ParsedModuleRef,
) -> Self {
Self {
context: InferContext::new(db, target_scope, module),
targets: FxHashMap::default(),
target_scope,
value_scope,
}
}
@@ -53,9 +48,8 @@ impl<'db, 'ast> Unpacker<'db, 'ast> {
"Unpacking target must be a list or tuple expression"
);
let value_type = infer_expression_types(self.db(), value.expression()).expression_type(
value.scoped_expression_id(self.db(), self.value_scope, self.module()),
);
let value_type = infer_expression_types(self.db(), value.expression())
.expression_type(value.expression().node_ref(self.db(), self.module()));
let value_type = match value.kind() {
UnpackKind::Assign => {
@@ -103,10 +97,7 @@ impl<'db, 'ast> Unpacker<'db, 'ast> {
) {
match target {
ast::Expr::Name(_) | ast::Expr::Attribute(_) | ast::Expr::Subscript(_) => {
self.targets.insert(
target.scoped_expression_id(self.db(), self.target_scope),
value_ty,
);
self.targets.insert(target.into(), value_ty);
}
ast::Expr::Starred(ast::ExprStarred { value, .. }) => {
self.unpack_inner(value, value_expr, value_ty);
@@ -208,7 +199,7 @@ impl<'db, 'ast> Unpacker<'db, 'ast> {
#[derive(Debug, Default, PartialEq, Eq, salsa::Update, get_size2::GetSize)]
pub(crate) struct UnpackResult<'db> {
targets: FxHashMap<ScopedExpressionId, Type<'db>>,
targets: FxHashMap<ExpressionNodeKey, Type<'db>>,
diagnostics: TypeCheckDiagnostics,
/// The fallback type for missing expressions.
@@ -226,16 +217,19 @@ impl<'db> UnpackResult<'db> {
May panic if a scoped expression ID is passed in that does not correspond to a sub-expression of the target.
#[track_caller]
pub(crate) fn expression_type(&self, expr_id: ScopedExpressionId) -> Type<'db> {
pub(crate) fn expression_type(&self, expr_id: impl Into<ExpressionNodeKey>) -> Type<'db> {
self.try_expression_type(expr_id).expect(
"expression should belong to this `UnpackResult` and \
`Unpacker` should have inferred a type for it",
)
}
pub(crate) fn try_expression_type(&self, expr_id: ScopedExpressionId) -> Option<Type<'db>> {
pub(crate) fn try_expression_type(
&self,
expr: impl Into<ExpressionNodeKey>,
) -> Option<Type<'db>> {
self.targets
.get(&expr_id)
.get(&expr.into())
.copied()
.or(self.cycle_fallback_type)
}
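The hunk above migrates `UnpackResult` lookups from `ScopedExpressionId` keys to `impl Into<ExpressionNodeKey>`, so callers can pass either a key or anything convertible into one. A minimal sketch of that accessor pattern, with hypothetical `NodeKey` and string-typed stand-ins rather than the real ty types:

```rust
use std::collections::HashMap;

// Hypothetical stand-in for `ExpressionNodeKey`.
#[derive(Debug, Clone, Copy, Hash, PartialEq, Eq)]
struct NodeKey(u32);

// Anything convertible into the key type can be used for lookups directly.
impl From<u32> for NodeKey {
    fn from(id: u32) -> Self {
        NodeKey(id)
    }
}

struct UnpackResult {
    targets: HashMap<NodeKey, &'static str>,
    // Mirrors the diff's `cycle_fallback_type`: used when no entry exists.
    cycle_fallback: Option<&'static str>,
}

impl UnpackResult {
    // Accepting `impl Into<NodeKey>` mirrors the diff's
    // `try_expression_type(expr: impl Into<ExpressionNodeKey>)`.
    fn try_expression_type(&self, key: impl Into<NodeKey>) -> Option<&'static str> {
        self.targets.get(&key.into()).copied().or(self.cycle_fallback)
    }
}

fn main() {
    let mut targets = HashMap::new();
    targets.insert(NodeKey(1), "int");
    let result = UnpackResult {
        targets,
        cycle_fallback: Some("Unknown"),
    };
    // A raw id and a constructed key both work as lookup arguments.
    assert_eq!(result.try_expression_type(1u32), Some("int"));
    assert_eq!(result.try_expression_type(NodeKey(9)), Some("Unknown"));
    println!("ok");
}
```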


@@ -5,7 +5,6 @@ use ruff_text_size::{Ranged, TextRange};
use crate::Db;
use crate::ast_node_ref::AstNodeRef;
use crate::semantic_index::ast_ids::{HasScopedExpressionId, ScopedExpressionId};
use crate::semantic_index::expression::Expression;
use crate::semantic_index::place::{FileScopeId, ScopeId};
@@ -58,16 +57,6 @@ impl<'db> Unpack<'db> {
self._target(db).node(parsed)
}
/// Returns the scope to which the unpack value expression belongs.
///
/// The scopes to which the target and value expressions belong are usually the same,
/// except in generator expressions and comprehensions (list/dict/set), where the value
/// expression of the first generator is evaluated in the outer scope, while those in
/// subsequent generators are evaluated in the comprehension scope.
pub(crate) fn value_scope(self, db: &'db dyn Db) -> ScopeId<'db> {
self.value_file_scope(db).to_scope_id(db, self.file(db))
}
/// Returns the scope to which the unpack target expression belongs.
pub(crate) fn target_scope(self, db: &'db dyn Db) -> ScopeId<'db> {
self.target_file_scope(db).to_scope_id(db, self.file(db))
@@ -98,18 +87,6 @@ impl<'db> UnpackValue<'db> {
self.expression
}
/// Returns the [`ScopedExpressionId`] of the underlying expression.
pub(crate) fn scoped_expression_id(
self,
db: &'db dyn Db,
scope: ScopeId<'db>,
module: &ParsedModuleRef,
) -> ScopedExpressionId {
self.expression()
.node_ref(db, module)
.scoped_expression_id(db, scope)
}
/// Returns the expression as an [`AnyNodeRef`].
pub(crate) fn as_any_node_ref<'ast>(
self,


@@ -19,6 +19,7 @@ ruff_text_size = { workspace = true }
ty_ide = { workspace = true }
ty_project = { workspace = true }
ty_python_semantic = { workspace = true }
ty_vendored = { workspace = true }
anyhow = { workspace = true }
crossbeam = { workspace = true }
@@ -31,7 +32,7 @@ serde = { workspace = true }
serde_json = { workspace = true }
shellexpand = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true }
tracing-subscriber = { workspace = true, features = ["chrono"] }
[dev-dependencies]


@@ -4,24 +4,24 @@
//! are written to `stderr` by default, which should appear in the logs for most LSP clients. A
//! `logFile` path can also be specified in the settings, and output will be directed there
//! instead.
use core::str;
use serde::Deserialize;
use std::{path::PathBuf, str::FromStr, sync::Arc};
use tracing::level_filters::LevelFilter;
use tracing_subscriber::{
Layer,
fmt::{time::Uptime, writer::BoxMakeWriter},
layer::SubscriberExt,
};
use std::sync::Arc;
pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&std::path::Path>) {
use ruff_db::system::{SystemPath, SystemPathBuf};
use serde::Deserialize;
use tracing::level_filters::LevelFilter;
use tracing_subscriber::Layer;
use tracing_subscriber::fmt::time::ChronoLocal;
use tracing_subscriber::fmt::writer::BoxMakeWriter;
use tracing_subscriber::layer::SubscriberExt;
pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&SystemPath>) {
let log_file = log_file
.map(|path| {
// this expands `logFile` so that tildes and environment variables
// are replaced with their values, if possible.
if let Some(expanded) = shellexpand::full(&path.to_string_lossy())
if let Some(expanded) = shellexpand::full(&path.to_string())
.ok()
.and_then(|path| PathBuf::from_str(&path).ok())
.map(|path| SystemPathBuf::from(&*path))
{
expanded
} else {
@@ -32,14 +32,11 @@ pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&std::path::Pat
std::fs::OpenOptions::new()
.create(true)
.append(true)
.open(&path)
.open(path.as_std_path())
.map_err(|err| {
#[expect(clippy::print_stderr)]
{
eprintln!(
"Failed to open file at {} for logging: {err}",
path.display()
);
eprintln!("Failed to open file at {path} for logging: {err}");
}
})
.ok()
@@ -49,10 +46,12 @@ pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&std::path::Pat
Some(file) => BoxMakeWriter::new(Arc::new(file)),
None => BoxMakeWriter::new(std::io::stderr),
};
let is_trace_level = log_level == LogLevel::Trace;
let subscriber = tracing_subscriber::Registry::default().with(
tracing_subscriber::fmt::layer()
.with_timer(Uptime::default())
.with_thread_names(true)
.with_timer(ChronoLocal::new("%Y-%m-%d %H:%M:%S.%f".to_string()))
.with_thread_names(is_trace_level)
.with_target(is_trace_level)
.with_ansi(false)
.with_writer(logger)
.with_filter(LogLevelFilter { filter: log_level }),
@@ -108,7 +107,7 @@ impl<S> tracing_subscriber::layer::Filter<S> for LogLevelFilter {
meta.level() <= &filter
}
fn max_level_hint(&self) -> Option<tracing::level_filters::LevelFilter> {
fn max_level_hint(&self) -> Option<LevelFilter> {
Some(LevelFilter::from_level(self.filter.trace_level()))
}
}


@@ -136,7 +136,7 @@ impl Server {
&client_capabilities,
position_encoding,
global_options,
&workspaces,
workspaces,
)?,
client_capabilities,
})
@@ -173,6 +173,7 @@ impl Server {
diagnostic_provider: Some(DiagnosticServerCapabilities::Options(DiagnosticOptions {
identifier: Some(crate::DIAGNOSTIC_NAME.into()),
inter_file_dependencies: true,
workspace_diagnostics: true,
..Default::default()
})),
text_document_sync: Some(TextDocumentSyncCapability::Options(
@@ -227,12 +228,10 @@ impl ServerPanicHookHandler {
writeln!(stderr, "{panic_info}\n{backtrace}").ok();
if let Some(client) = hook_client.upgrade() {
client
.show_message(
"The ty language server exited with a panic. See the logs for more details.",
MessageType::ERROR,
)
.ok();
client.show_message(
"The ty language server exited with a panic. See the logs for more details.",
MessageType::ERROR,
);
}
}));


@@ -28,23 +28,28 @@ pub(super) fn request(req: server::Request) -> Task {
let id = req.id.clone();
match req.method.as_str() {
requests::DocumentDiagnosticRequestHandler::METHOD => background_request_task::<
requests::DocumentDiagnosticRequestHandler::METHOD => background_document_request_task::<
requests::DocumentDiagnosticRequestHandler,
>(
req, BackgroundSchedule::Worker
),
requests::GotoTypeDefinitionRequestHandler::METHOD => background_request_task::<
requests::WorkspaceDiagnosticRequestHandler::METHOD => background_request_task::<
requests::WorkspaceDiagnosticRequestHandler,
>(
req, BackgroundSchedule::Worker
),
requests::GotoTypeDefinitionRequestHandler::METHOD => background_document_request_task::<
requests::GotoTypeDefinitionRequestHandler,
>(
req, BackgroundSchedule::Worker
),
requests::HoverRequestHandler::METHOD => background_request_task::<
requests::HoverRequestHandler::METHOD => background_document_request_task::<
requests::HoverRequestHandler,
>(req, BackgroundSchedule::Worker),
requests::InlayHintRequestHandler::METHOD => background_request_task::<
requests::InlayHintRequestHandler::METHOD => background_document_request_task::<
requests::InlayHintRequestHandler,
>(req, BackgroundSchedule::Worker),
requests::CompletionRequestHandler::METHOD => background_request_task::<
requests::CompletionRequestHandler::METHOD => background_document_request_task::<
requests::CompletionRequestHandler,
>(
req, BackgroundSchedule::LatencySensitive
@@ -135,7 +140,51 @@ where
}))
}
fn background_request_task<R: traits::BackgroundDocumentRequestHandler>(
fn background_request_task<R: traits::BackgroundRequestHandler>(
req: server::Request,
schedule: BackgroundSchedule,
) -> Result<Task>
where
<<R as RequestHandler>::RequestType as Request>::Params: UnwindSafe,
{
let retry = R::RETRY_ON_CANCELLATION.then(|| req.clone());
let (id, params) = cast_request::<R>(req)?;
Ok(Task::background(schedule, move |session: &Session| {
let cancellation_token = session
.request_queue()
.incoming()
.cancellation_token(&id)
.expect("request should have been tested for cancellation before scheduling");
let snapshot = session.take_workspace_snapshot();
Box::new(move |client| {
let _span = tracing::debug_span!("request", %id, method = R::METHOD).entered();
// Check again whether the request was cancelled while it was waiting to be
// scheduled on the background task and, if so, return early
if cancellation_token.is_cancelled() {
tracing::trace!(
"Ignoring request id={id} method={} because it was cancelled",
R::METHOD
);
// We don't need to send a response here because the `cancel` notification
// handler already responded with a message.
return;
}
let result = ruff_db::panic::catch_unwind(|| R::run(snapshot, client, params));
if let Some(response) = request_result_to_response::<R>(&id, client, result, retry) {
respond::<R>(&id, response, client);
}
})
}))
}
fn background_document_request_task<R: traits::BackgroundDocumentRequestHandler>(
req: server::Request,
schedule: BackgroundSchedule,
) -> Result<Task>
@@ -160,7 +209,7 @@ where
};
let db = match &path {
AnySystemPath::System(path) => match session.project_db_for_path(path.as_std_path()) {
AnySystemPath::System(path) => match session.project_db_for_path(path) {
Some(db) => db.clone(),
None => session.default_project_db().clone(),
},
@@ -168,7 +217,7 @@ where
};
let Some(snapshot) = session.take_snapshot(url) else {
tracing::warn!("Ignoring request because snapshot for path `{path:?}` doesn't exist.");
tracing::warn!("Ignoring request because snapshot for path `{path:?}` doesn't exist");
return Box::new(|_| {});
};
@@ -209,7 +258,7 @@ fn request_result_to_response<R>(
request: Option<lsp_server::Request>,
) -> Option<Result<<<R as RequestHandler>::RequestType as Request>::Result>>
where
R: traits::BackgroundDocumentRequestHandler,
R: traits::RetriableRequestHandler,
{
match result {
Ok(response) => Some(response),
@@ -224,17 +273,14 @@ where
request.id,
request.method
);
if client.retry(request).is_ok() {
return None;
}
client.retry(request);
} else {
tracing::trace!(
"request id={} was cancelled by salsa, sending content modified",
id
);
respond_silent_error(id.clone(), client, R::salsa_cancellation_error());
}
tracing::trace!(
"request id={} was cancelled by salsa, sending content modified",
id
);
respond_silent_error(id.clone(), client, R::salsa_cancellation_error());
None
} else {
Some(Err(Error {
@@ -343,17 +389,13 @@ fn respond<Req>(
tracing::error!("An error occurred with request ID {id}: {err}");
client.show_error_message("ty encountered a problem. Check the logs for more details.");
}
if let Err(err) = client.respond(id, result) {
tracing::error!("Failed to send response: {err}");
}
client.respond(id, result);
}
/// Sends back an error response to the client using a [`Client`] without showing a warning
/// to the user.
fn respond_silent_error(id: RequestId, client: &Client, error: lsp_server::ResponseError) {
if let Err(err) = client.respond_err(id, error) {
tracing::error!("Failed to send response: {err}");
}
client.respond_err(id, error);
}
/// Tries to cast a serialized request from the server into


@@ -1,4 +1,3 @@
use lsp_server::ErrorCode;
use lsp_types::notification::PublishDiagnostics;
use lsp_types::{
CodeDescription, Diagnostic, DiagnosticRelatedInformation, DiagnosticSeverity, DiagnosticTag,
@@ -46,20 +45,17 @@ impl Diagnostics {
/// This is done by notifying the client with an empty list of diagnostics for the document.
/// For notebook cells, this clears diagnostics for the specific cell.
/// For other document types, this clears diagnostics for the main document.
pub(super) fn clear_diagnostics(key: &DocumentKey, client: &Client) -> Result<()> {
pub(super) fn clear_diagnostics(key: &DocumentKey, client: &Client) {
let Some(uri) = key.to_url() else {
// If we can't convert to URL, we can't clear diagnostics
return Ok(());
return;
};
client
.send_notification::<PublishDiagnostics>(PublishDiagnosticsParams {
uri,
diagnostics: vec![],
version: None,
})
.with_failure_code(ErrorCode::InternalError)?;
Ok(())
client.send_notification::<PublishDiagnostics>(PublishDiagnosticsParams {
uri,
diagnostics: vec![],
version: None,
});
}
/// Publishes the diagnostics for the given document snapshot using the [publish diagnostics
@@ -96,22 +92,20 @@ pub(super) fn publish_diagnostics(
// Sends a notification to the client with the diagnostics for the document.
let publish_diagnostics_notification = |uri: Url, diagnostics: Vec<Diagnostic>| {
client
.send_notification::<PublishDiagnostics>(PublishDiagnosticsParams {
uri,
diagnostics,
version: Some(snapshot.query().version()),
})
.with_failure_code(lsp_server::ErrorCode::InternalError)
client.send_notification::<PublishDiagnostics>(PublishDiagnosticsParams {
uri,
diagnostics,
version: Some(snapshot.query().version()),
});
};
match diagnostics {
Diagnostics::TextDocument(diagnostics) => {
publish_diagnostics_notification(url, diagnostics)?;
publish_diagnostics_notification(url, diagnostics);
}
Diagnostics::NotebookDocument(cell_diagnostics) => {
for (cell_url, diagnostics) in cell_diagnostics {
publish_diagnostics_notification(cell_url, diagnostics)?;
publish_diagnostics_notification(cell_url, diagnostics);
}
}
}
@@ -172,7 +166,7 @@ pub(super) fn compute_diagnostics(
/// Converts the tool specific [`Diagnostic`][ruff_db::diagnostic::Diagnostic] to an LSP
/// [`Diagnostic`].
fn to_lsp_diagnostic(
pub(super) fn to_lsp_diagnostic(
db: &dyn Db,
diagnostic: &ruff_db::diagnostic::Diagnostic,
encoding: PositionEncoding,


@@ -20,7 +20,7 @@ impl SyncNotificationHandler for CancelNotificationHandler {
lsp_types::NumberOrString::String(id) => id.into(),
};
let _ = client.cancel(session, id);
client.cancel(session, id);
Ok(())
}


@@ -39,7 +39,7 @@ impl SyncNotificationHandler for DidChangeTextDocumentHandler {
match key.path() {
AnySystemPath::System(path) => {
let db = match session.project_db_for_path_mut(path.as_std_path()) {
let db = match session.project_db_for_path_mut(path) {
Some(db) => db,
None => session.default_project_db_mut(),
};


@@ -1,5 +1,4 @@
use crate::server::Result;
use crate::server::api::LSPResult;
use crate::server::api::diagnostics::publish_diagnostics;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::session::Session;
@@ -45,7 +44,7 @@ impl SyncNotificationHandler for DidChangeWatchedFiles {
}
};
let Some(db) = session.project_db_for_path(system_path.as_std_path()) else {
let Some(db) = session.project_db_for_path(&system_path) else {
tracing::trace!(
"Ignoring change event for `{system_path}` because it's not in any workspace"
);
@@ -103,13 +102,11 @@ impl SyncNotificationHandler for DidChangeWatchedFiles {
if project_changed {
if client_capabilities.diagnostics_refresh {
client
.send_request::<types::request::WorkspaceDiagnosticRefresh>(
session,
(),
|_, ()| {},
)
.with_failure_code(lsp_server::ErrorCode::InternalError)?;
client.send_request::<types::request::WorkspaceDiagnosticRefresh>(
session,
(),
|_, ()| {},
);
} else {
for key in session.text_document_keys() {
publish_diagnostics(session, &key, client)?;
@@ -120,9 +117,7 @@ impl SyncNotificationHandler for DidChangeWatchedFiles {
}
if client_capabilities.inlay_refresh {
client
.send_request::<types::request::InlayHintRefreshRequest>(session, (), |_, ()| {})
.with_failure_code(lsp_server::ErrorCode::InternalError)?;
client.send_request::<types::request::InlayHintRefreshRequest>(session, (), |_, ()| {});
}
Ok(())


@@ -41,6 +41,13 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
);
}
clear_diagnostics(&key, client)
if !session.global_settings().diagnostic_mode().is_workspace() {
// The server needs to clear the diagnostics regardless of whether the client supports
// pull diagnostics or not. This is because the client only has the capability to fetch
// the diagnostics but does not automatically clear them when a document is closed.
clear_diagnostics(&key, client);
}
Ok(())
}
}


@@ -41,7 +41,7 @@ impl SyncNotificationHandler for DidOpenTextDocumentHandler {
match key.path() {
AnySystemPath::System(system_path) => {
let db = match session.project_db_for_path_mut(system_path.as_std_path()) {
let db = match session.project_db_for_path_mut(system_path) {
Some(db) => db,
None => session.default_project_db_mut(),
};


@@ -40,7 +40,7 @@ impl SyncNotificationHandler for DidOpenNotebookHandler {
match &path {
AnySystemPath::System(system_path) => {
let db = match session.project_db_for_path_mut(system_path.as_std_path()) {
let db = match session.project_db_for_path_mut(system_path) {
Some(db) => db,
None => session.default_project_db_mut(),
};


@@ -4,6 +4,7 @@ mod goto_type_definition;
mod hover;
mod inlay_hints;
mod shutdown;
mod workspace_diagnostic;
pub(super) use completion::CompletionRequestHandler;
pub(super) use diagnostic::DocumentDiagnosticRequestHandler;
@@ -11,3 +12,4 @@ pub(super) use goto_type_definition::GotoTypeDefinitionRequestHandler;
pub(super) use hover::HoverRequestHandler;
pub(super) use inlay_hints::InlayHintRequestHandler;
pub(super) use shutdown::ShutdownHandler;
pub(super) use workspace_diagnostic::WorkspaceDiagnosticRequestHandler;


@@ -8,7 +8,9 @@ use ty_project::ProjectDatabase;
use crate::DocumentSnapshot;
use crate::document::PositionExt;
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::client::Client;
pub(crate) struct CompletionRequestHandler;
@@ -18,8 +20,6 @@ impl RequestHandler for CompletionRequestHandler {
}
impl BackgroundDocumentRequestHandler for CompletionRequestHandler {
const RETRY_ON_CANCELLATION: bool = true;
fn document_url(params: &CompletionParams) -> Cow<Url> {
Cow::Borrowed(&params.text_document_position.text_document.uri)
}
@@ -65,3 +65,7 @@ impl BackgroundDocumentRequestHandler for CompletionRequestHandler {
Ok(Some(response))
}
}
impl RetriableRequestHandler for CompletionRequestHandler {
const RETRY_ON_CANCELLATION: bool = true;
}


@@ -8,7 +8,9 @@ use lsp_types::{
use crate::server::Result;
use crate::server::api::diagnostics::{Diagnostics, compute_diagnostics};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use ty_project::ProjectDatabase;
@@ -43,7 +45,9 @@ impl BackgroundDocumentRequestHandler for DocumentDiagnosticRequestHandler {
}),
))
}
}
impl RetriableRequestHandler for DocumentDiagnosticRequestHandler {
fn salsa_cancellation_error() -> lsp_server::ResponseError {
lsp_server::ResponseError {
code: lsp_server::ErrorCode::ServerCancelled as i32,


@@ -8,7 +8,9 @@ use ty_project::ProjectDatabase;
use crate::DocumentSnapshot;
use crate::document::{PositionExt, ToLink};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::client::Client;
pub(crate) struct GotoTypeDefinitionRequestHandler;
@@ -70,3 +72,5 @@ impl BackgroundDocumentRequestHandler for GotoTypeDefinitionRequestHandler {
}
}
}
impl RetriableRequestHandler for GotoTypeDefinitionRequestHandler {}


@@ -2,7 +2,9 @@ use std::borrow::Cow;
use crate::DocumentSnapshot;
use crate::document::{PositionExt, ToRangeExt};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::client::Client;
use lsp_types::request::HoverRequest;
use lsp_types::{HoverContents, HoverParams, MarkupContent, Url};
@@ -73,3 +75,5 @@ impl BackgroundDocumentRequestHandler for HoverRequestHandler {
}))
}
}
impl RetriableRequestHandler for HoverRequestHandler {}


@@ -2,7 +2,9 @@ use std::borrow::Cow;
use crate::DocumentSnapshot;
use crate::document::{RangeExt, TextSizeExt};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::client::Client;
use lsp_types::request::InlayHintRequest;
use lsp_types::{InlayHintParams, Url};
@@ -64,3 +66,5 @@ impl BackgroundDocumentRequestHandler for InlayHintRequestHandler {
Ok(Some(inlay_hints))
}
}
impl RetriableRequestHandler for InlayHintRequestHandler {}


@@ -0,0 +1,108 @@
use lsp_types::request::WorkspaceDiagnosticRequest;
use lsp_types::{
FullDocumentDiagnosticReport, Url, WorkspaceDiagnosticParams, WorkspaceDiagnosticReport,
WorkspaceDiagnosticReportResult, WorkspaceDocumentDiagnosticReport,
WorkspaceFullDocumentDiagnosticReport,
};
use rustc_hash::FxHashMap;
use ty_project::CheckMode;
use crate::server::Result;
use crate::server::api::diagnostics::to_lsp_diagnostic;
use crate::server::api::traits::{
BackgroundRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::WorkspaceSnapshot;
use crate::session::client::Client;
use crate::system::file_to_url;
pub(crate) struct WorkspaceDiagnosticRequestHandler;
impl RequestHandler for WorkspaceDiagnosticRequestHandler {
type RequestType = WorkspaceDiagnosticRequest;
}
impl BackgroundRequestHandler for WorkspaceDiagnosticRequestHandler {
fn run(
snapshot: WorkspaceSnapshot,
_client: &Client,
_params: WorkspaceDiagnosticParams,
) -> Result<WorkspaceDiagnosticReportResult> {
let index = snapshot.index();
if !index.global_settings().diagnostic_mode().is_workspace() {
tracing::debug!("Workspace diagnostics is disabled; returning empty report");
return Ok(WorkspaceDiagnosticReportResult::Report(
WorkspaceDiagnosticReport { items: vec![] },
));
}
let mut items = Vec::new();
for db in snapshot.projects() {
let diagnostics = db.check_with_mode(CheckMode::AllFiles);
// Group diagnostics by URL
let mut diagnostics_by_url: FxHashMap<Url, Vec<_>> = FxHashMap::default();
for diagnostic in diagnostics {
if let Some(span) = diagnostic.primary_span() {
let file = span.expect_ty_file();
let Some(url) = file_to_url(db, file) else {
tracing::debug!("Failed to convert file to URL at {}", file.path(db));
continue;
};
diagnostics_by_url.entry(url).or_default().push(diagnostic);
}
}
items.reserve(diagnostics_by_url.len());
// Convert to workspace diagnostic report format
for (url, file_diagnostics) in diagnostics_by_url {
let version = index
.key_from_url(url.clone())
.ok()
.and_then(|key| index.make_document_ref(&key))
.map(|doc| i64::from(doc.version()));
// Convert diagnostics to LSP format
let lsp_diagnostics = file_diagnostics
.into_iter()
.map(|diagnostic| {
to_lsp_diagnostic(db, &diagnostic, snapshot.position_encoding())
})
.collect::<Vec<_>>();
items.push(WorkspaceDocumentDiagnosticReport::Full(
WorkspaceFullDocumentDiagnosticReport {
uri: url,
version,
full_document_diagnostic_report: FullDocumentDiagnosticReport {
// TODO: We don't implement result ID caching yet
result_id: None,
items: lsp_diagnostics,
},
},
));
}
}
Ok(WorkspaceDiagnosticReportResult::Report(
WorkspaceDiagnosticReport { items },
))
}
}
impl RetriableRequestHandler for WorkspaceDiagnosticRequestHandler {
fn salsa_cancellation_error() -> lsp_server::ResponseError {
lsp_server::ResponseError {
code: lsp_server::ErrorCode::ServerCancelled as i32,
message: "server cancelled the request".to_owned(),
data: serde_json::to_value(lsp_types::DiagnosticServerCancellationData {
retrigger_request: true,
})
.ok(),
}
}
}


@@ -1,7 +1,7 @@
//! A stateful LSP implementation that calls into the ty API.
use crate::session::client::Client;
use crate::session::{DocumentSnapshot, Session};
use crate::session::{DocumentSnapshot, Session, WorkspaceSnapshot};
use lsp_types::notification::Notification as LSPNotification;
use lsp_types::request::Request;
@@ -25,11 +25,24 @@ pub(super) trait SyncRequestHandler: RequestHandler {
) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;
}
/// A request handler that can be run on a background thread.
pub(super) trait BackgroundDocumentRequestHandler: RequestHandler {
/// Whether this request should be retried if it was cancelled due to a modification to the Salsa database.
pub(super) trait RetriableRequestHandler: RequestHandler {
/// Whether this request can be retried if it was cancelled because the Salsa database was modified.
const RETRY_ON_CANCELLATION: bool = false;
/// The error to return if the request was cancelled due to a modification to the Salsa database.
fn salsa_cancellation_error() -> lsp_server::ResponseError {
lsp_server::ResponseError {
code: lsp_server::ErrorCode::ContentModified as i32,
message: "content modified".to_string(),
data: None,
}
}
}
/// A request handler that can be run on a background thread.
///
/// This handler is specific to requests that operate on a single document.
pub(super) trait BackgroundDocumentRequestHandler: RetriableRequestHandler {
fn document_url(
params: &<<Self as RequestHandler>::RequestType as Request>::Params,
) -> std::borrow::Cow<lsp_types::Url>;
@@ -40,14 +53,15 @@ pub(super) trait BackgroundDocumentRequestHandler: RequestHandler {
client: &Client,
params: <<Self as RequestHandler>::RequestType as Request>::Params,
) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;
}
fn salsa_cancellation_error() -> lsp_server::ResponseError {
lsp_server::ResponseError {
code: lsp_server::ErrorCode::ContentModified as i32,
message: "content modified".to_string(),
data: None,
}
}
/// A request handler that can be run on a background thread.
pub(super) trait BackgroundRequestHandler: RetriableRequestHandler {
fn run(
snapshot: WorkspaceSnapshot,
client: &Client,
params: <<Self as RequestHandler>::RequestType as Request>::Params,
) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;
}
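The trait split above factors the retry and cancellation defaults into a `RetriableRequestHandler` supertrait shared by both document-scoped and workspace-scoped handlers, each of which only overrides what it needs. A stripped-down sketch of the same shape (hypothetical names and payload types, not the actual ty traits):

```rust
// Shared retry/cancellation behavior with overridable defaults.
trait RetriableHandler {
    const RETRY_ON_CANCELLATION: bool = false;

    fn cancellation_message() -> &'static str {
        "content modified"
    }
}

// Handlers operating on a single document.
trait DocumentHandler: RetriableHandler {
    fn run_for_document(doc: &str) -> String;
}

// Handlers operating on the whole workspace.
trait WorkspaceHandler: RetriableHandler {
    fn run_for_workspace(files: &[&str]) -> usize;
}

struct Completion;
impl RetriableHandler for Completion {
    // Completions are safe to re-run after a cancelled edit.
    const RETRY_ON_CANCELLATION: bool = true;
}
impl DocumentHandler for Completion {
    fn run_for_document(doc: &str) -> String {
        format!("completions for {doc}")
    }
}

struct WorkspaceDiagnostics;
impl RetriableHandler for WorkspaceDiagnostics {
    // Overrides only the cancellation message, keeping the retry default.
    fn cancellation_message() -> &'static str {
        "server cancelled the request"
    }
}
impl WorkspaceHandler for WorkspaceDiagnostics {
    fn run_for_workspace(files: &[&str]) -> usize {
        files.len()
    }
}

fn main() {
    assert!(Completion::RETRY_ON_CANCELLATION);
    assert!(!WorkspaceDiagnostics::RETRY_ON_CANCELLATION);
    assert_eq!(Completion::cancellation_message(), "content modified");
    assert_eq!(
        WorkspaceDiagnostics::cancellation_message(),
        "server cancelled the request"
    );
    println!("ok");
}
```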
/// A supertrait for any server notification handler.


@@ -1,11 +1,15 @@
use crate::server::schedule::Scheduler;
use crate::server::{Server, api};
use crate::session::ClientOptions;
use crate::session::client::Client;
use anyhow::anyhow;
use crossbeam::select;
use lsp_server::Message;
use lsp_types::notification::Notification;
use lsp_types::{DidChangeWatchedFilesRegistrationOptions, FileSystemWatcher};
use lsp_types::{
ConfigurationParams, DidChangeWatchedFilesRegistrationOptions, FileSystemWatcher, Url,
};
use serde_json::Value;
pub(crate) type MainLoopSender = crossbeam::channel::Sender<Event>;
pub(crate) type MainLoopReceiver = crossbeam::channel::Receiver<Event>;
@@ -26,6 +30,10 @@ impl Server {
match next_event {
Event::Message(msg) => {
let Some(msg) = self.session.should_defer_message(msg) else {
continue;
};
let client = Client::new(
self.main_loop_sender.clone(),
self.connection.sender.clone(),
@@ -49,7 +57,7 @@ impl Server {
message: "Shutdown already requested".to_owned(),
data: None,
},
)?;
);
continue;
}
@@ -130,6 +138,9 @@ impl Server {
);
}
}
Action::InitializeWorkspaces(workspaces_with_options) => {
self.session.initialize_workspaces(workspaces_with_options);
}
},
}
}
@@ -140,7 +151,25 @@ impl Server {
/// Waits for the next message from the client or action.
///
/// Returns `Ok(None)` if the client connection is closed.
fn next_event(&self) -> Result<Option<Event>, crossbeam::channel::RecvError> {
fn next_event(&mut self) -> Result<Option<Event>, crossbeam::channel::RecvError> {
// We can't queue those into the main loop because that could result in reordering if
// the `select` below picks a client message first.
if let Some(deferred) = self.session.take_deferred_messages() {
match &deferred {
Message::Request(req) => {
tracing::debug!("Processing deferred request `{}`", req.method);
}
Message::Notification(notification) => {
tracing::debug!("Processing deferred notification `{}`", notification.method);
}
Message::Response(response) => {
tracing::debug!("Processing deferred response `{}`", response.id);
}
}
return Ok(Some(Event::Message(deferred)));
}
select!(
recv(self.connection.receiver) -> msg => {
// Ignore disconnect errors, they're handled by the main loop (it will exit).
@@ -151,6 +180,47 @@ impl Server {
}
fn initialize(&mut self, client: &Client) {
let urls = self
.session
.workspaces()
.urls()
.cloned()
.collect::<Vec<_>>();
let items = urls
.iter()
.map(|root| lsp_types::ConfigurationItem {
scope_uri: Some(root.clone()),
section: Some("ty".to_string()),
})
.collect();
tracing::debug!("Requesting workspace configuration for workspaces");
client
.send_request::<lsp_types::request::WorkspaceConfiguration>(
&self.session,
ConfigurationParams { items },
|client, result: Vec<Value>| {
tracing::debug!("Received workspace configurations, initializing workspaces");
assert_eq!(result.len(), urls.len());
let workspaces_with_options: Vec<_> = urls
.into_iter()
.zip(result)
.map(|(url, value)| {
let options: ClientOptions = serde_json::from_value(value).unwrap_or_else(|err| {
tracing::warn!("Failed to deserialize workspace options for {url}: {err}. Using default options.");
ClientOptions::default()
});
(url, options)
})
.collect();
client.queue_action(Action::InitializeWorkspaces(workspaces_with_options));
},
);
let fs_watcher = self
.client_capabilities
.workspace
@@ -206,17 +276,13 @@ impl Server {
tracing::info!("File watcher successfully registered");
};
if let Err(err) = client.send_request::<lsp_types::request::RegisterCapability>(
client.send_request::<lsp_types::request::RegisterCapability>(
&self.session,
lsp_types::RegistrationParams {
registrations: vec![registration],
},
response_handler,
) {
tracing::error!(
"An error occurred when trying to register the configuration file watcher: {err}"
);
}
);
} else {
tracing::warn!("The client does not support file system watching.");
}
@@ -231,6 +297,10 @@ pub(crate) enum Action {
/// Retry a request that previously failed due to a salsa cancellation.
RetryRequest(lsp_server::Request),
/// Initialize the workspace after the server received
/// the options from the client.
InitializeWorkspaces(Vec<(Url, ClientOptions)>),
}
#[derive(Debug)]


@@ -83,9 +83,7 @@ impl Task {
R: Serialize + Send + 'static,
{
Self::sync(move |_, client| {
if let Err(err) = client.respond(&id, result) {
tracing::error!("Unable to send immediate response: {err}");
}
client.respond(&id, result);
})
}


@@ -1,22 +1,24 @@
//! Data model, state management, and configuration resolution.
use std::collections::BTreeMap;
use std::collections::{BTreeMap, VecDeque};
use std::ops::{Deref, DerefMut};
use std::path::{Path, PathBuf};
use std::panic::AssertUnwindSafe;
use std::sync::Arc;
use anyhow::anyhow;
use anyhow::{Context, anyhow};
use lsp_server::Message;
use lsp_types::{ClientCapabilities, TextDocumentContentChangeEvent, Url};
use options::GlobalOptions;
use ruff_db::Db;
use ruff_db::files::{File, system_path_to_file};
use ruff_db::system::SystemPath;
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ty_project::metadata::Options;
use ty_project::{ProjectDatabase, ProjectMetadata};
pub(crate) use self::capabilities::ResolvedClientCapabilities;
pub use self::index::DocumentQuery;
pub(crate) use self::options::{AllOptions, ClientOptions};
use self::settings::ClientSettings;
pub(crate) use self::settings::ClientSettings;
use crate::document::{DocumentKey, DocumentVersion, NotebookDocument};
use crate::session::request_queue::RequestQueue;
use crate::system::{AnySystemPath, LSPSystem};
@@ -40,8 +42,13 @@ pub struct Session {
/// [`index_mut`]: Session::index_mut
index: Option<Arc<index::Index>>,
/// Maps workspace folders to their respective project databases.
projects_by_workspace_folder: BTreeMap<PathBuf, ProjectDatabase>,
/// Maps workspace folders to their respective workspace.
workspaces: Workspaces,
/// The projects across all workspaces.
projects: BTreeMap<SystemPathBuf, ProjectDatabase>,
default_project: ProjectDatabase,
/// The global position encoding, negotiated during LSP initialization.
position_encoding: PositionEncoding,
@@ -54,6 +61,8 @@ pub struct Session {
/// Has the client requested the server to shutdown.
shutdown_requested: bool,
deferred_messages: VecDeque<Message>,
}
impl Session {
@@ -61,32 +70,33 @@ impl Session {
client_capabilities: &ClientCapabilities,
position_encoding: PositionEncoding,
global_options: GlobalOptions,
workspace_folders: &[(Url, ClientOptions)],
workspace_folders: Vec<(Url, ClientOptions)>,
) -> crate::Result<Self> {
let mut workspaces = BTreeMap::new();
let index = Arc::new(index::Index::new(global_options.into_settings()));
// TODO: Consider workspace settings
for (url, _) in workspace_folders {
let path = url
.to_file_path()
.map_err(|()| anyhow!("Workspace URL is not a file or directory: {:?}", url))?;
let system_path = SystemPath::from_std_path(&path)
.ok_or_else(|| anyhow!("Workspace path is not a valid UTF-8 path: {:?}", path))?;
let system = LSPSystem::new(index.clone());
// TODO(dhruvmanila): Get the values from the client settings
let mut metadata = ProjectMetadata::discover(system_path, &system)?;
metadata.apply_configuration_files(&system)?;
// TODO(micha): Handle the case where the program settings are incorrect more gracefully.
workspaces.insert(path, ProjectDatabase::new(metadata, system)?);
let mut workspaces = Workspaces::default();
for (url, options) in workspace_folders {
workspaces.register(url, options)?;
}
let default_project = {
let system = LSPSystem::new(index.clone());
let metadata = ProjectMetadata::from_options(
Options::default(),
system.current_directory().to_path_buf(),
None,
)
.unwrap();
ProjectDatabase::new(metadata, system).unwrap()
};
Ok(Self {
position_encoding,
projects_by_workspace_folder: workspaces,
workspaces,
deferred_messages: VecDeque::new(),
index: Some(index),
default_project,
projects: BTreeMap::new(),
resolved_client_capabilities: Arc::new(ResolvedClientCapabilities::new(
client_capabilities,
)),
@@ -111,6 +121,52 @@ impl Session {
self.shutdown_requested = requested;
}
/// The LSP specification doesn't allow configuration requests during initialization,
/// but we need access to the configuration to resolve the settings, which in turn are
/// required to create the project databases. This will become more important in the
/// future when we support persistent caching: it's then crucial that we have the
/// correct settings to select the right cache.
///
/// We work around this by queueing up all messages that arrive between the `initialized` notification
/// and the completion of workspace initialization (which waits for the client's configuration response).
///
/// This queuing is only necessary when registering *new* workspaces. Changes to configurations
/// don't need to go through the same process because we can update the existing
/// database in place.
///
/// See <https://github.com/Microsoft/language-server-protocol/issues/567#issuecomment-2085131917>
pub(crate) fn should_defer_message(&mut self, message: Message) -> Option<Message> {
if self.workspaces.all_initialized() {
Some(message)
} else {
match &message {
Message::Request(request) => {
tracing::debug!(
"Deferring `{}` request until all workspaces are initialized",
request.method
);
}
Message::Response(_) => {
// We still want to get client responses even during workspace initialization.
return Some(message);
}
Message::Notification(notification) => {
tracing::debug!(
"Deferring `{}` notification until all workspaces are initialized",
notification.method
);
}
}
self.deferred_messages.push_back(message);
None
}
}
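The deferral scheme above can be sketched with std-only types. This is an illustrative model, not the actual ty types: messages that arrive before the workspaces finish initializing are queued in a `VecDeque` and drained in FIFO order once initialization completes.

```rust
use std::collections::VecDeque;

/// Hypothetical stand-in for the session's deferral state.
struct Deferral {
    initialized: bool,
    queue: VecDeque<String>,
}

impl Deferral {
    /// Returns the message if it should be handled now, or queues it
    /// for later (mirroring `should_defer_message`).
    fn accept(&mut self, msg: String) -> Option<String> {
        if self.initialized {
            Some(msg)
        } else {
            self.queue.push_back(msg);
            None
        }
    }

    /// Drains one deferred message, but only once initialization is
    /// done (mirroring `take_deferred_messages`).
    fn take_deferred(&mut self) -> Option<String> {
        if self.initialized {
            self.queue.pop_front()
        } else {
            None
        }
    }
}

fn main() {
    let mut d = Deferral { initialized: false, queue: VecDeque::new() };
    // Arrives before initialization: queued, not handled.
    assert_eq!(d.accept("textDocument/didOpen".to_string()), None);
    assert_eq!(d.take_deferred(), None);
    d.initialized = true;
    // After initialization the queued message is replayed in order.
    assert_eq!(d.take_deferred(), Some("textDocument/didOpen".to_string()));
    println!("ok");
}
```

Responses would bypass the queue entirely (as the real code does), since the client's configuration response is itself what completes initialization.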
pub(crate) fn workspaces(&self) -> &Workspaces {
&self.workspaces
}
// TODO(dhruvmanila): Ideally, we should have a single method for `workspace_db_for_path_mut`
// and `default_workspace_db_mut` but the borrow checker doesn't allow that.
// https://github.com/astral-sh/ruff/pull/13041#discussion_r1726725437
@@ -119,14 +175,17 @@ impl Session {
/// or the default project if no project is found for the path.
pub(crate) fn project_db_or_default(&self, path: &AnySystemPath) -> &ProjectDatabase {
path.as_system()
.and_then(|path| self.project_db_for_path(path.as_std_path()))
.and_then(|path| self.project_db_for_path(path))
.unwrap_or_else(|| self.default_project_db())
}
/// Returns a reference to the project's [`ProjectDatabase`] corresponding to the given path, if
/// any.
pub(crate) fn project_db_for_path(&self, path: impl AsRef<Path>) -> Option<&ProjectDatabase> {
self.projects_by_workspace_folder
pub(crate) fn project_db_for_path(
&self,
path: impl AsRef<SystemPath>,
) -> Option<&ProjectDatabase> {
self.projects
.range(..=path.as_ref().to_path_buf())
.next_back()
.map(|(_, db)| db)
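The lookup above relies on a `BTreeMap` ordered by project root: `range(..=path).next_back()` yields the last registered root that sorts at or before the path, i.e. the closest ancestor. A std-only sketch of the idea (illustrative names, not the actual ty API; the `starts_with` guard is an assumption added here, whereas the real code relies on its own path handling):

```rust
use std::collections::BTreeMap;

/// Returns the project for the closest registered root that is a
/// prefix of `path`, using `range(..=path).next_back()`.
fn closest_root<'a>(
    roots: &'a BTreeMap<String, &'static str>,
    path: &str,
) -> Option<&'static str> {
    roots
        .range(..=path.to_string())
        .next_back()
        // Reject unrelated roots that merely sort earlier than `path`.
        .filter(|(root, _)| path.starts_with(root.as_str()))
        .map(|(_, project)| *project)
}

fn main() {
    let mut roots = BTreeMap::new();
    roots.insert("/repo/a".to_string(), "project-a");
    roots.insert("/repo/b".to_string(), "project-b");
    // A file inside `/repo/b` resolves to that project...
    assert_eq!(closest_root(&roots, "/repo/b/src/lib.rs"), Some("project-b"));
    // ...and a path outside every root resolves to nothing,
    // which is where the default project takes over.
    assert_eq!(closest_root(&roots, "/elsewhere/x.rs"), None);
    println!("ok");
}
```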
@@ -136,9 +195,9 @@ impl Session {
/// path, if any.
pub(crate) fn project_db_for_path_mut(
&mut self,
path: impl AsRef<Path>,
path: impl AsRef<SystemPath>,
) -> Option<&mut ProjectDatabase> {
self.projects_by_workspace_folder
self.projects
.range_mut(..=path.as_ref().to_path_buf())
.next_back()
.map(|(_, db)| db)
@@ -147,23 +206,86 @@ impl Session {
/// Returns a reference to the default project [`ProjectDatabase`]. The default project is the
/// minimum root path in the project map.
pub(crate) fn default_project_db(&self) -> &ProjectDatabase {
// SAFETY: Currently, ty only supports a single project.
self.projects_by_workspace_folder.values().next().unwrap()
&self.default_project
}
/// Returns a mutable reference to the default project [`ProjectDatabase`].
pub(crate) fn default_project_db_mut(&mut self) -> &mut ProjectDatabase {
// SAFETY: Currently, ty only supports a single project.
self.projects_by_workspace_folder
&mut self.default_project
}
fn projects_mut(&mut self) -> impl Iterator<Item = &'_ mut ProjectDatabase> + '_ {
self.projects
.values_mut()
.next()
.unwrap()
.chain(std::iter::once(&mut self.default_project))
}
pub(crate) fn key_from_url(&self, url: Url) -> crate::Result<DocumentKey> {
self.index().key_from_url(url)
}
pub(crate) fn take_workspace_snapshot(&self) -> WorkspaceSnapshot {
WorkspaceSnapshot {
projects: AssertUnwindSafe(self.projects.values().cloned().collect()),
index: self.index.clone().unwrap(),
position_encoding: self.position_encoding,
}
}
pub(crate) fn initialize_workspaces(&mut self, workspace_settings: Vec<(Url, ClientOptions)>) {
assert!(!self.workspaces.all_initialized());
for (url, options) in workspace_settings {
let Some(workspace) = self.workspaces.initialize(&url, options) else {
continue;
};
// For now, create one project database per workspace.
// In the future, index the workspace directories to find all projects
// and create a project database for each.
let system = LSPSystem::new(self.index.as_ref().unwrap().clone());
let system_path = workspace.root();
let root = system_path.to_path_buf();
let project = ProjectMetadata::discover(&root, &system)
.context("Failed to find project configuration")
.and_then(|mut metadata| {
// TODO(dhruvmanila): Merge the client options with the project metadata options.
metadata
.apply_configuration_files(&system)
.context("Failed to apply configuration files")?;
ProjectDatabase::new(metadata, system)
.context("Failed to create project database")
});
// TODO(micha): Handle the case where the program settings are incorrect more gracefully.
// The easiest is to ignore those projects but to show a message to the user that we do so.
// Ignoring the projects has the effect that we'll use the default project for those files.
// The only challenge with this is that we need to register the project when the configuration
// becomes valid again. But that's a case we need to handle anyway for good mono repository support.
match project {
Ok(project) => {
self.projects.insert(root, project);
}
Err(err) => {
tracing::warn!("Failed to create project database for `{root}`: {err}");
}
}
}
assert!(
self.workspaces.all_initialized(),
"All workspaces should be initialized after calling `initialize_workspaces`"
);
}
pub(crate) fn take_deferred_messages(&mut self) -> Option<Message> {
if self.workspaces.all_initialized() {
self.deferred_messages.pop_front()
} else {
None
}
}
/// Creates a document snapshot with the URL referencing the document to snapshot.
///
/// Returns `None` if the url can't be converted to a document key or if the document isn't open.
@@ -240,7 +362,7 @@ impl Session {
fn index_mut(&mut self) -> MutIndexGuard {
let index = self.index.take().unwrap();
for db in self.projects_by_workspace_folder.values_mut() {
for db in self.projects_mut() {
// Remove the `index` from each database. This drops the count of `Arc<Index>` down to 1
db.system_mut()
.as_any_mut()
@@ -250,17 +372,21 @@ impl Session {
}
// There should now be exactly one reference to index which is self.index.
let index = Arc::into_inner(index);
let index = Arc::into_inner(index).unwrap();
MutIndexGuard {
session: self,
index,
index: Some(index),
}
}
pub(crate) fn client_capabilities(&self) -> &ResolvedClientCapabilities {
&self.resolved_client_capabilities
}
pub(crate) fn global_settings(&self) -> Arc<ClientSettings> {
self.index().global_settings()
}
}
/// A guard that holds the only reference to the index and allows modifying it.
@@ -289,7 +415,7 @@ impl Drop for MutIndexGuard<'_> {
fn drop(&mut self) {
if let Some(index) = self.index.take() {
let index = Arc::new(index);
for db in self.session.projects_by_workspace_folder.values_mut() {
for db in self.session.projects_mut() {
db.system_mut()
.as_any_mut()
.downcast_mut::<LSPSystem>()
@@ -339,3 +465,97 @@ impl DocumentSnapshot {
}
}
}
/// An immutable snapshot of the current state of [`Session`].
pub(crate) struct WorkspaceSnapshot {
projects: AssertUnwindSafe<Vec<ProjectDatabase>>,
index: Arc<index::Index>,
position_encoding: PositionEncoding,
}
impl WorkspaceSnapshot {
pub(crate) fn projects(&self) -> &[ProjectDatabase] {
&self.projects
}
pub(crate) fn index(&self) -> &index::Index {
&self.index
}
pub(crate) fn position_encoding(&self) -> PositionEncoding {
self.position_encoding
}
}
#[derive(Debug, Default)]
pub(crate) struct Workspaces {
workspaces: BTreeMap<Url, Workspace>,
uninitialized: usize,
}
impl Workspaces {
pub(crate) fn register(&mut self, url: Url, options: ClientOptions) -> anyhow::Result<()> {
let path = url
.to_file_path()
.map_err(|()| anyhow!("Workspace URL is not a file or directory: {url:?}"))?;
// Realistically this can't fail because the path came from a `Url`.
let system_path = SystemPathBuf::from_path_buf(path)
.map_err(|_| anyhow!("Workspace URL is not a valid UTF-8 path"))?;
self.workspaces.insert(
url,
Workspace {
options,
root: system_path,
},
);
self.uninitialized += 1;
Ok(())
}
pub(crate) fn initialize(
&mut self,
url: &Url,
options: ClientOptions,
) -> Option<&mut Workspace> {
if let Some(workspace) = self.workspaces.get_mut(url) {
workspace.options = options;
self.uninitialized -= 1;
Some(workspace)
} else {
None
}
}
pub(crate) fn urls(&self) -> impl Iterator<Item = &Url> + '_ {
self.workspaces.keys()
}
pub(crate) fn all_initialized(&self) -> bool {
self.uninitialized == 0
}
}
impl<'a> IntoIterator for &'a Workspaces {
type Item = (&'a Url, &'a Workspace);
type IntoIter = std::collections::btree_map::Iter<'a, Url, Workspace>;
fn into_iter(self) -> Self::IntoIter {
self.workspaces.iter()
}
}
#[derive(Debug)]
pub(crate) struct Workspace {
root: SystemPathBuf,
options: ClientOptions,
}
impl Workspace {
pub(crate) fn root(&self) -> &SystemPath {
&self.root
}
}


@@ -1,7 +1,6 @@
use crate::Session;
use crate::server::{Action, ConnectionSender};
use crate::server::{Event, MainLoopSender};
use anyhow::{Context, anyhow};
use lsp_server::{ErrorCode, Message, Notification, RequestId, ResponseError};
use serde_json::Value;
use std::any::TypeId;
@@ -45,8 +44,7 @@ impl Client {
session: &Session,
params: R::Params,
response_handler: impl FnOnce(&Client, R::Result) + Send + 'static,
) -> crate::Result<()>
where
) where
R: lsp_types::request::Request,
{
let response_handler = Box::new(move |client: &Client, response: lsp_server::Response| {
@@ -95,60 +93,64 @@ impl Client {
.outgoing()
.register(response_handler);
self.client_sender
if let Err(err) = self
.client_sender
.send(Message::Request(lsp_server::Request {
id,
method: R::METHOD.to_string(),
params: serde_json::to_value(params).context("Failed to serialize params")?,
params: serde_json::to_value(params).expect("Params to be serializable"),
}))
.with_context(|| {
format!("Failed to send request method={method}", method = R::METHOD)
})?;
Ok(())
{
tracing::error!(
"Failed to send request `{}` because the client sender is closed: {err}",
R::METHOD
);
}
}
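The change above converts the client's send paths from `Result`-returning to fire-and-forget: a send can only fail when the client side of the channel is gone, and no caller can do anything useful with that error, so it is logged instead of propagated. A std-only `mpsc` sketch of the pattern (hypothetical helper; the server uses `tracing::error!` and `lsp_server` types):

```rust
use std::sync::mpsc;

/// Sends a message and logs on failure instead of returning `Err`.
fn send_or_log(sender: &mpsc::Sender<String>, msg: String, method: &str) {
    if let Err(err) = sender.send(msg) {
        // Stand-in for `tracing::error!` in the real server.
        eprintln!("Failed to send `{method}` because the client sender is closed: {err}");
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    send_or_log(&tx, "initialized".to_string(), "initialized");
    assert_eq!(rx.recv().unwrap(), "initialized");

    // Receiver dropped: the send fails, but the caller just logs
    // and carries on rather than bubbling an error up.
    drop(rx);
    send_or_log(&tx, "late".to_string(), "shutdown");
    println!("ok");
}
```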
/// Sends a notification to the client.
pub(crate) fn send_notification<N>(&self, params: N::Params) -> crate::Result<()>
pub(crate) fn send_notification<N>(&self, params: N::Params)
where
N: lsp_types::notification::Notification,
{
let method = N::METHOD.to_string();
self.client_sender
.send(lsp_server::Message::Notification(Notification::new(
method, params,
)))
.map_err(|error| {
anyhow!(
"Failed to send notification (method={method}): {error}",
method = N::METHOD
)
})
if let Err(err) =
self.client_sender
.send(lsp_server::Message::Notification(Notification::new(
method, params,
)))
{
tracing::error!(
"Failed to send notification `{}` because the client sender is closed: {err}",
N::METHOD
);
}
}
/// Sends a notification without any parameters to the client.
///
/// This is useful for notifications that don't require any data.
#[expect(dead_code)]
pub(crate) fn send_notification_no_params(&self, method: &str) -> crate::Result<()> {
self.client_sender
.send(lsp_server::Message::Notification(Notification::new(
method.to_string(),
Value::Null,
)))
.map_err(|error| anyhow!("Failed to send notification (method={method}): {error}",))
pub(crate) fn send_notification_no_params(&self, method: &str) {
if let Err(err) =
self.client_sender
.send(lsp_server::Message::Notification(Notification::new(
method.to_string(),
Value::Null,
)))
{
tracing::error!(
"Failed to send notification `{method}` because the client sender is closed: {err}",
);
}
}
/// Sends a response to the client for a given request ID.
///
/// The response isn't sent immediately. Instead, it's queued up in the main loop
/// and checked for cancellation (each request must have exactly one response).
pub(crate) fn respond<R>(
&self,
id: &RequestId,
result: crate::server::Result<R>,
) -> crate::Result<()>
pub(crate) fn respond<R>(&self, id: &RequestId, result: crate::server::Result<R>)
where
R: serde::Serialize,
{
@@ -161,17 +163,13 @@ impl Client {
self.main_loop_sender
.send(Event::Action(Action::SendResponse(response)))
.map_err(|error| anyhow!("Failed to send response for request {id}: {error}"))
.unwrap();
}
/// Sends an error response to the client for a given request ID.
///
/// The response isn't sent immediately. Instead, it's queued up in the main loop.
pub(crate) fn respond_err(
&self,
id: RequestId,
error: lsp_server::ResponseError,
) -> crate::Result<()> {
pub(crate) fn respond_err(&self, id: RequestId, error: lsp_server::ResponseError) {
let response = lsp_server::Response {
id,
result: None,
@@ -180,23 +178,19 @@ impl Client {
self.main_loop_sender
.send(Event::Action(Action::SendResponse(response)))
.map_err(|error| anyhow!("Failed to send response: {error}"))
.unwrap();
}
/// Shows a message to the user.
///
/// This opens a pop up in VS Code showing `message`.
pub(crate) fn show_message(
&self,
message: impl Display,
message_type: lsp_types::MessageType,
) -> crate::Result<()> {
pub(crate) fn show_message(&self, message: impl Display, message_type: lsp_types::MessageType) {
self.send_notification::<lsp_types::notification::ShowMessage>(
lsp_types::ShowMessageParams {
typ: message_type,
message: message.to_string(),
},
)
);
}
/// Sends a request to display a warning to the client with a formatted message. The warning is
@@ -204,11 +198,7 @@ impl Client {
///
/// Logs an error if the message could not be sent.
pub(crate) fn show_warning_message(&self, message: impl Display) {
let result = self.show_message(message, lsp_types::MessageType::WARNING);
if let Err(err) = result {
tracing::error!("Failed to send warning message to the client: {err}");
}
self.show_message(message, lsp_types::MessageType::WARNING);
}
/// Sends a request to display an error to the client with a formatted message. The error is
@@ -216,23 +206,23 @@ impl Client {
///
/// Logs an error if the message could not be sent.
pub(crate) fn show_error_message(&self, message: impl Display) {
let result = self.show_message(message, lsp_types::MessageType::ERROR);
if let Err(err) = result {
tracing::error!("Failed to send error message to the client: {err}");
}
self.show_message(message, lsp_types::MessageType::ERROR);
}
/// Re-queues this request after a salsa cancellation for a retry.
///
/// The main loop will skip the retry if the client cancelled the request in the meantime.
pub(crate) fn retry(&self, request: lsp_server::Request) -> crate::Result<()> {
pub(crate) fn retry(&self, request: lsp_server::Request) {
self.main_loop_sender
.send(Event::Action(Action::RetryRequest(request)))
.map_err(|error| anyhow!("Failed to send retry request: {error}"))
.unwrap();
}
pub(crate) fn cancel(&self, session: &mut Session, id: RequestId) -> crate::Result<()> {
pub(crate) fn queue_action(&self, action: Action) {
self.main_loop_sender.send(Event::Action(action)).unwrap();
}
pub(crate) fn cancel(&self, session: &mut Session, id: RequestId) {
let method_name = session.request_queue_mut().incoming_mut().cancel(&id);
if let Some(method_name) = method_name {
@@ -245,14 +235,18 @@ impl Client {
// Use `client_sender` here instead of `respond_err` because
// `respond_err` filters out responses for canceled requests (which we just did!).
self.client_sender
if let Err(err) = self
.client_sender
.send(Message::Response(lsp_server::Response {
id,
result: None,
error: Some(error),
}))?;
}))
{
tracing::error!(
"Failed to send cancellation response for request `{method_name}` because the client sender is closed: {err}",
);
}
}
Ok(())
}
}


@@ -1,6 +1,5 @@
use std::path::PathBuf;
use lsp_types::Url;
use ruff_db::system::SystemPathBuf;
use rustc_hash::FxHashMap;
use serde::Deserialize;
@@ -24,14 +23,7 @@ pub(crate) struct GlobalOptions {
impl GlobalOptions {
pub(crate) fn into_settings(self) -> ClientSettings {
ClientSettings {
disable_language_services: self
.client
.python
.and_then(|python| python.ty)
.and_then(|ty| ty.disable_language_services)
.unwrap_or_default(),
}
self.client.into_settings()
}
}
@@ -54,6 +46,40 @@ pub(crate) struct ClientOptions {
/// Settings under the `python.*` namespace in VS Code that are useful for the ty language
/// server.
python: Option<Python>,
/// Diagnostic mode for the language server.
diagnostic_mode: Option<DiagnosticMode>,
}
/// Diagnostic mode for the language server.
#[derive(Clone, Copy, Debug, Default, Deserialize)]
#[cfg_attr(test, derive(PartialEq, Eq))]
#[serde(rename_all = "camelCase")]
pub(crate) enum DiagnosticMode {
/// Check only currently open files.
#[default]
OpenFilesOnly,
/// Check all files in the workspace.
Workspace,
}
impl DiagnosticMode {
pub(crate) fn is_workspace(self) -> bool {
matches!(self, DiagnosticMode::Workspace)
}
}
impl ClientOptions {
/// Returns the client settings that are relevant to the language server.
pub(crate) fn into_settings(self) -> ClientSettings {
ClientSettings {
disable_language_services: self
.python
.and_then(|python| python.ty)
.and_then(|ty| ty.disable_language_services)
.unwrap_or_default(),
diagnostic_mode: self.diagnostic_mode.unwrap_or_default(),
}
}
}
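The new `ty.diagnosticMode` setting accepts the camelCase values `"openFilesOnly"` and `"workspace"`, with a missing or unknown value falling back to the default via `unwrap_or_default()`. A sketch of that resolution with hand-rolled parsing (the real code deserializes through serde with `rename_all = "camelCase"`):

```rust
#[derive(Clone, Copy, Debug, Default, PartialEq, Eq)]
enum DiagnosticMode {
    /// Check only currently open files.
    #[default]
    OpenFilesOnly,
    /// Check all files in the workspace.
    Workspace,
}

/// Resolves a raw client value, falling back to the default
/// (mirroring `self.diagnostic_mode.unwrap_or_default()`).
fn resolve_mode(raw: Option<&str>) -> DiagnosticMode {
    match raw {
        Some("workspace") => DiagnosticMode::Workspace,
        Some("openFilesOnly") => DiagnosticMode::OpenFilesOnly,
        _ => DiagnosticMode::default(),
    }
}

fn main() {
    assert_eq!(resolve_mode(Some("workspace")), DiagnosticMode::Workspace);
    assert_eq!(resolve_mode(Some("openFilesOnly")), DiagnosticMode::OpenFilesOnly);
    // Missing setting: open-files-only is the default.
    assert_eq!(resolve_mode(None), DiagnosticMode::OpenFilesOnly);
    println!("ok");
}
```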
// TODO(dhruvmanila): We need to mirror the "python.*" namespace on the server side but ideally it
@@ -83,7 +109,7 @@ pub(crate) struct TracingOptions {
pub(crate) log_level: Option<LogLevel>,
/// Path to the log file - tildes and environment variables are supported.
pub(crate) log_file: Option<PathBuf>,
pub(crate) log_file: Option<SystemPathBuf>,
}
/// This is the exact schema for initialization options sent in by the client during


@@ -1,3 +1,5 @@
use super::options::DiagnosticMode;
/// Resolved client settings for a specific document. These settings are meant to be
/// used directly by the server, and are *not* a 1:1 representation with how the client
/// sends them.
@@ -5,10 +7,15 @@
#[cfg_attr(test, derive(PartialEq, Eq))]
pub(crate) struct ClientSettings {
pub(super) disable_language_services: bool,
pub(super) diagnostic_mode: DiagnosticMode,
}
impl ClientSettings {
pub(crate) fn is_language_services_disabled(&self) -> bool {
self.disable_language_services
}
pub(crate) fn diagnostic_mode(&self) -> DiagnosticMode {
self.diagnostic_mode
}
}


@@ -8,7 +8,7 @@ use ruff_db::files::{File, FilePath};
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
use ruff_db::system::{
CaseSensitivity, DirectoryEntry, FileType, GlobError, Metadata, OsSystem, PatternError, Result,
System, SystemPath, SystemPathBuf, SystemVirtualPath, SystemVirtualPathBuf,
System, SystemPath, SystemPathBuf, SystemVirtualPath, SystemVirtualPathBuf, WritableSystem,
};
use ruff_notebook::{Notebook, NotebookError};
use ty_python_semantic::Db;
@@ -17,13 +17,29 @@ use crate::DocumentQuery;
use crate::document::DocumentKey;
use crate::session::index::Index;
/// Returns a [`Url`] for the given [`File`].
pub(crate) fn file_to_url(db: &dyn Db, file: File) -> Option<Url> {
match file.path(db) {
FilePath::System(system) => Url::from_file_path(system.as_std_path()).ok(),
FilePath::SystemVirtual(path) => Url::parse(path.as_str()).ok(),
// TODO: Not yet supported, consider an approach similar to Sorbet's custom paths
// https://sorbet.org/docs/sorbet-uris
FilePath::Vendored(_) => None,
FilePath::Vendored(path) => {
let writable = db.system().as_writable()?;
let system_path = SystemPathBuf::from(format!(
"vendored/typeshed/{}/{}",
// The vendored files are uniquely identified by the source commit.
ty_vendored::SOURCE_COMMIT,
path.as_str()
));
// Extract the vendored file onto the system.
let system_path = writable
.get_or_cache(&system_path, &|| db.vendored().read_to_string(path))
.ok()
.flatten()?;
Url::from_file_path(system_path.as_std_path()).ok()
}
}
}
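The `Vendored` arm above extracts a vendored typeshed file to the system on first access, keyed under a path that embeds `ty_vendored::SOURCE_COMMIT` so copies from older builds are never served. A sketch of that extract-on-demand scheme using an in-memory map as a hypothetical stand-in for the writable system's `get_or_cache`:

```rust
use std::collections::HashMap;

/// Hypothetical cache standing in for the writable system.
struct VendoredCache {
    source_commit: &'static str,
    extracted: HashMap<String, String>, // cache path -> contents
}

impl VendoredCache {
    /// Returns the cached copy, extracting via `read` only on a miss.
    fn get_or_cache(
        &mut self,
        vendored_path: &str,
        read: impl FnOnce() -> String,
    ) -> String {
        // The source commit in the key uniquely identifies this
        // typeshed snapshot, invalidating stale extractions.
        let key = format!(
            "vendored/typeshed/{}/{}",
            self.source_commit, vendored_path
        );
        self.extracted.entry(key).or_insert_with(read).clone()
    }
}

fn main() {
    let mut cache = VendoredCache {
        source_commit: "0123456789abcdef0123456789abcdef01234567",
        extracted: HashMap::new(),
    };
    let first = cache.get_or_cache("stdlib/os.pyi", || "contents".to_string());
    // Second lookup hits the cache; the reader closure isn't called.
    let second = cache.get_or_cache("stdlib/os.pyi", || unreachable!());
    assert_eq!(first, second);
    println!("ok");
}
```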
@@ -224,6 +240,10 @@ impl System for LSPSystem {
self.os_system.user_config_directory()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
self.os_system.cache_dir()
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -245,6 +265,10 @@ impl System for LSPSystem {
self.os_system.glob(pattern)
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
self.os_system.as_writable()
}
fn as_any(&self) -> &dyn Any {
self
}


@@ -226,6 +226,10 @@ impl System for MdtestSystem {
self.as_system().user_config_directory()
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
self.as_system().cache_dir()
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -253,6 +257,10 @@ impl System for MdtestSystem {
.glob(self.normalize_path(SystemPath::new(pattern)).as_str())
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
Some(self)
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
@@ -263,6 +271,10 @@ impl System for MdtestSystem {
}
impl WritableSystem for MdtestSystem {
fn create_new_file(&self, path: &SystemPath) -> ruff_db::system::Result<()> {
self.as_system().create_new_file(&self.normalize_path(path))
}
fn write_file(&self, path: &SystemPath, content: &str) -> ruff_db::system::Result<()> {
self.as_system()
.write_file(&self.normalize_path(path), content)


@@ -12,6 +12,7 @@ license = { workspace = true }
[dependencies]
ruff_db = { workspace = true }
static_assertions = { workspace = true }
zip = { workspace = true }
[build-dependencies]


@@ -1,6 +1,12 @@
use ruff_db::vendored::VendoredFileSystem;
use std::sync::LazyLock;
/// The source commit of the vendored typeshed.
pub const SOURCE_COMMIT: &str =
include_str!("../../../crates/ty_vendored/vendor/typeshed/source_commit.txt").trim_ascii_end();
static_assertions::const_assert_eq!(SOURCE_COMMIT.len(), 40);
// The file path here is hardcoded in this crate's `build.rs` script.
// Luckily this crate will fail to build if this file isn't available at build time.
static TYPESHED_ZIP_BYTES: &[u8] = include_bytes!(concat!(env!("OUT_DIR"), "/zipped_typeshed.zip"));


@@ -8,7 +8,7 @@ use ruff_db::source::{line_index, source_text};
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
use ruff_db::system::{
CaseSensitivity, DirectoryEntry, GlobError, MemoryFileSystem, Metadata, PatternError, System,
SystemPath, SystemPathBuf, SystemVirtualPath,
SystemPath, SystemPathBuf, SystemVirtualPath, WritableSystem,
};
use ruff_notebook::Notebook;
use ruff_python_formatter::formatted_file;
@@ -695,6 +695,10 @@ impl System for WasmSystem {
None
}
fn cache_dir(&self) -> Option<SystemPathBuf> {
None
}
fn read_directory<'a>(
&'a self,
path: &SystemPath,
@@ -715,6 +719,10 @@ impl System for WasmSystem {
Ok(Box::new(self.fs.glob(pattern)?))
}
fn as_writable(&self) -> Option<&dyn WritableSystem> {
None
}
fn as_any(&self) -> &dyn Any {
self
}


@@ -30,7 +30,7 @@ ty_python_semantic = { path = "../crates/ty_python_semantic" }
ty_vendored = { path = "../crates/ty_vendored" }
libfuzzer-sys = { git = "https://github.com/rust-fuzz/libfuzzer", default-features = false }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "0666e2018bc35376b1ac4f98906f2d04d11e5fe4" }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "fc00eba89e5dcaa5edba51c41aa5f309b5cb126b" }
similar = { version = "2.5.0" }
tracing = { version = "0.1.40" }

ty.schema.json (generated)

@@ -352,7 +352,7 @@
]
},
"duplicate-kw-only": {
"title": "detects dataclass definitions with more than once usages of `KW_ONLY`",
"title": "detects dataclass definitions with more than one usage of `KW_ONLY`",
"description": "## What it does\nChecks for dataclass definitions with more than one field\nannotated with `KW_ONLY`.\n\n## Why is this bad?\n`dataclasses.KW_ONLY` is a special marker used to\nemulate the `*` syntax in normal signatures.\nIt can only be used once per dataclass.\n\nAttempting to annotate two different fields with\nit will lead to a runtime error.\n\n## Examples\n```python\nfrom dataclasses import dataclass, KW_ONLY\n\n@dataclass\nclass A: # Crash at runtime\n b: int\n _1: KW_ONLY\n c: str\n _2: KW_ONLY\n d: bytes\n```",
"default": "error",
"oneOf": [