Compare commits


57 Commits

Author SHA1 Message Date
Dhruv Manilawala
ee258caed7 Bump version to 0.6.3 (#13152) 2024-08-29 20:29:33 +05:30
Aditya Pal
b4d9d26020 Update faq.md to highlight changes to src (#13145)
This attempts to close https://github.com/astral-sh/ruff/issues/13134

## Summary

Documentation change to address
https://github.com/astral-sh/ruff/issues/13134

## Test Plan

Markdown changes were previewed.
2024-08-29 11:57:53 +00:00
Steve C
a99832088a [ruff] - extend comment deletions for unused-noqa (RUF100) (#13105)
## Summary

Extends deletions for RUF100, deleting trailing text from noqa
directives while preserving any comments that follow on the same line.

In cases where the fix deletes a comment up to another comment on the
same line, the whitespace between them is now included in the autofix
shown in the diagnostic as well. Leading whitespace before the removed
comment is not, though.
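
A hypothetical before/after sketch of the behavior described above (rule
code and comments invented for illustration):

```python
# Before: `# noqa: F401` suppresses nothing here, so RUF100 flags it and
# the fix deletes the directive together with its trailing text...
x = 1  # noqa: F401 stale justification  # TODO: keep this comment

# ...after: the unrelated comment on the same line is preserved.
x = 1  # TODO: keep this comment
```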

Fixes #12251 

## Test Plan

`cargo test`
2024-08-29 10:50:16 +05:30
Carl Meyer
770ef2ab27 [red-knot] support deferred evaluation of type expressions (#13131)
Prototype deferred evaluation of type expressions by deferring
evaluation of class bases in a stub file. This allows self-referential
class definitions, such as the definition of `str` in typeshed (which
inherits from `Sequence[str]`).
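
A minimal sketch of the pattern this enables, loosely modeled on
typeshed's `builtins.pyi`:

```python
# In a stub (`.pyi`) file, class bases are evaluated lazily, so a class
# may reference itself in its own bases list:
from typing import Sequence

class str(Sequence[str]): ...  # `str` occurs in its own bases
```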

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2024-08-28 11:41:01 -07:00
Alex Waygood
c6023c03a2 [red-knot] Add docs on using RAYON_NUM_THREADS for better logging (#13140)
Follow-up to #13049. We check files concurrently now; to get readable
logs, you probably want to switch that off.
2024-08-28 17:14:56 +01:00
Adam Kuhn
df694ca1c1 [FastAPI] Avoid introducing invalid syntax in fix for fast-api-non-annotated-dependency (FAST002) (#13133) 2024-08-28 15:29:00 +00:00
Calum Young
2e75cfbfe7 Format PYI examples in docs as .pyi-file snippets (#13116) 2024-08-28 13:20:40 +01:00
Alex Waygood
cfafaa7637 [red-knot] Remove very noisy tracing call when resolving ImportFrom statements (#13136) 2024-08-28 10:05:00 +00:00
Jonathan Plasse
3e9c7adeee Replace crates by dependi for VS Code Dev Container (#13125)
## Summary
The `crates` extension now recommends migrating to `dependi`:
![Screenshot from 2024-08-27
19-33-39](https://github.com/user-attachments/assets/c8f6480e-a07c-41a5-8bd0-d808c5a987a0)

## Test Plan
Opening the dev container correctly installs `dependi`.
2024-08-28 09:53:27 +05:30
Chris Krycho
81cd438d88 red-knot: infer and display ellipsis type (#13124)
## Summary

Just what it says on the tin: adds basic `EllipsisType` inference
whenever `...` appears in the AST.

## Test Plan

Test that `x = ...` produces exactly what we would expect.
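
A minimal illustration of what this covers:

```python
x = ...  # `x` is inferred as `EllipsisType` wherever `...` appears
```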

---------

Co-authored-by: Carl Meyer <carl@oddbird.net>
2024-08-27 20:52:53 +01:00
Dylan
483748c188 [flake8-implicit-str-concat] Normalize octals before merging concatenated strings in single-line-implicit-string-concatenation (ISC001) (#13118)
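A minimal illustration of the hazard, relying only on standard Python
escape semantics (example invented for illustration):

```python
# Naively merging these implicitly concatenated literals into "\00" would
# change the meaning: Python reads "\00" as a single octal escape (one
# NUL character), whereas the original is NUL followed by the digit "0".
s = "\0" "0"

# Normalizing the octal escape first (e.g. to "\x00") keeps the meaning,
# since "\x" consumes exactly two hex digits:
s = "\x000"
```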
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2024-08-27 18:53:27 +01:00
Calum Young
eb3dc37faa Add note about how Ruff handles PYI files wrt target version (#13111)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2024-08-27 17:28:22 +00:00
Chris Krycho
aba1802828 red-knot: infer multiplication for strings and integers (#13117)
## Summary

The resulting type when multiplying a string literal by an integer
literal is one of two types:

- `StringLiteral`, including the fully expanded string value, in the case
of a reasonably small result (arbitrarily bounded here to 4096 bytes,
roughly a page on many operating systems).
- `LiteralString`, matching Pyright etc., for strings larger than that.
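
An illustrative sketch of the two outcomes, using the 4096-byte bound
described above (displayed type names are hypothetical):

```python
reveal_type("ab" * 3)      # small result: `StringLiteral` with value "ababab"
reveal_type("x" * 10_000)  # result exceeds the bound: `LiteralString`
```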

Additionally:

- Switch to using `Box<str>` instead of `String` for the internal value
of `StringLiteral`, saving some non-trivial byte overhead (and keeping
the total number of allocations the same).
- Be clearer and more accurate about which types we ought to defer to in
`StringLiteral` and `LiteralString` member lookup.

## Test Plan

Added a test case covering multiplication times integers: positive,
negative, zero, and in and out of bounds.

---------

Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
Co-authored-by: Carl Meyer <carl@astral.sh>
2024-08-27 09:00:36 -07:00
Tom Kuson
96b42b0c8f [DOC201] Permit explicit None in functions that only return None (#13064)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2024-08-27 16:00:18 +00:00
Niels Wouda
e6d0c4a65d Add time, tzinfo, and timezone as immutable function calls (#13109) 2024-08-27 15:51:32 +01:00
Calum Young
4e1b289a67 Disable E741 in stub files (#13119)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2024-08-27 15:02:14 +01:00
Alex Waygood
a5ef124201 [red-knot] Improve the accuracy of the unresolved-import check (#13055) 2024-08-27 14:17:22 +01:00
Chris Krycho
390bb43276 red-knot: flatten match expression in infer_binary_expression (#13115)
## Summary

This fixes the outstanding TODO and makes it easier to work with new
cases. (Tidy first, *then* implement, basically!)

## Test Plan

After making this change all the existing tests still pass. A classic
refactor win. 🎉
2024-08-26 12:34:07 -07:00
Chris Krycho
fe8b15291f red-knot: implement unary minus on integer literals (#13114)
## Summary

Add support for the first unary operator: negating integer literals. The
resulting type is another integer literal, with the value being the
negated value of the operand. All other types continue to return
`Type::Unknown` for the present, but this is designed to make it easy to
extend with other combinations of operator and operand.
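
A small sketch of the intended inference (display names are illustrative):

```python
reveal_type(-5)   # an integer literal type with value -5
reveal_type(--5)  # double negation folds back to the literal 5
```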

Contributes to #12701.

## Test Plan

Add tests with basic negation, including of very large integers and
double negation.
2024-08-26 12:08:18 -07:00
Dhruv Manilawala
c8e01d7c53 Update dependency in insider requirements.txt (#13112) 2024-08-26 19:02:27 +00:00
Chris Krycho
c4d628cc4c red-knot: infer string literal types (#13113)
## Summary

Introduce a `StringLiteralType` with corresponding `Display` type and a
relatively basic test that the resulting representation is as expected.

Note: we currently always allocate for `StringLiteral` types. This may
end up being a perf issue later, at which point we may want to look at
other ways of representing `value` here, i.e. with some kind of smarter
string structure which can reuse types. That is most likely to show up
with e.g. concatenation.
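
An illustrative sketch (the displayed form is hypothetical):

```python
a = "hello"            # inferred as the string literal type "hello"
b = 'single' "double"  # single- and double-quoted parts, concatenated
reveal_type(b)         # a string literal type with value "singledouble"
```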

Contributes to #12701.

## Test Plan

Added a test for individual strings with both single and double quotes
as well as concatenated strings with both forms.
2024-08-26 11:42:34 -07:00
Calum Young
ab3648c4c5 Format docs with ruff formatter (#13087)
## Summary

Now that Ruff provides a formatter, there is no need to rely on Black to
check that the docs are formatted correctly in
`check_docs_formatted.py`. This PR swaps out Black for the Ruff
formatter and resolves the inconsistencies between the two.

This PR will be a precursor to another PR
([branch](https://github.com/calumy/ruff/tree/format-pyi-in-docs)),
updating the `check_docs_formatted.py` script to check for pyi files,
fixing #11568.

## Test Plan

- CI to check that the docs are formatted correctly using the updated
script.
2024-08-26 21:25:10 +05:30
Teodoro Freund
a822fd6642 Fixed benchmarking section in Contributing guide (#13107)
## Summary

Noticed an incorrect tip in the Contributing guide: `cargo
benchmark lexer` wouldn't run any benches.
Probably a missed update in #9535

It may make sense to remove the `cargo benchmark` command from the guide
altogether, but that's up to the maintainers.
2024-08-26 18:49:01 +05:30
Calum Young
f8f2e2a442 Add anchor tags to README headers (#13083)
## Summary

This pull request adds anchor tags to the elements referenced in the
table of contents section of the readme used on
[PyPI](https://pypi.org/project/ruff/) as an attempt to fix #7257. This
update follows [this
suggestion](https://github.com/pypa/readme_renderer/issues/169#issuecomment-808577486)
to add anchor tags (with no spaces) after the title that is to be linked
to.

## Test Plan

- This has been tested on GitHub to check that the additional tags do
not interfere with how the README is rendered; see:
https://github.com/calumy/ruff/blob/add-links-to-pypi-docs/README.md
- MkDocs pages were generated using the `generate_mkdocs.py` script; however,
as the added tags fall after the comment `<!-- End section: Overview
-->`, they are excluded and so will not change how the docs are rendered.
- I was unable to verify how PyPI renders this change; any suggestions
would be appreciated, and I can follow up on this. Hopefully, the four
thumbs-up/heart reactions on [this
comment](https://github.com/pypa/readme_renderer/issues/169#issuecomment-808577486)
and [this
suggestion](https://github.com/pypa/readme_renderer/issues/169#issuecomment-1765616890)
all suggest that this approach should work.
2024-08-26 12:42:35 +05:30
Steve C
0b5828a1e8 [flake8-simplify] - extend open-file-with-context-handler to work with dbm.sqlite3 (SIM115) (#13104)
## Summary

Adds the upcoming `dbm.sqlite3` module to the rule that suggests using
context managers when opening files.

See: https://docs.python.org/3.13/library/dbm.html#module-dbm.sqlite3
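
A sketch of what the extended rule flags, assuming Python 3.13+ (where
`dbm.sqlite3` lands) and that its objects support the context-manager
protocol like the other `dbm` backends:

```python
import dbm.sqlite3

db = dbm.sqlite3.open("example.db", "c")  # SIM115: open via a context manager
db.close()

with dbm.sqlite3.open("example.db", "c") as db:  # preferred
    db[b"key"] = b"value"
```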

## Test Plan

`cargo test`
2024-08-26 08:11:03 +01:00
Steve C
5af48337a5 [pylint] - fix incorrect starred expression replacement for nested-min-max (PLW3301) (#13089)
## Summary

Moves the min/max detection up, and fixes #13088 
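
An illustrative sketch of the distinction the fix restores (cases
invented; see #13088 for the actual reproducer):

```python
xs = [3, 1, 2]

# A nested call that really is `min` can be flattened:
min(1, min(*xs))  # suggested fix: `min(1, *xs)`

# A starred call to anything else is not a nested min/max and must not
# be rewritten into one:
min(1, *sorted(xs))  # left untouched
```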

## Test Plan

`cargo test`
2024-08-26 10:01:38 +05:30
renovate[bot]
39ad6b9472 Update tj-actions/changed-files action to v45 (#13102) 2024-08-25 22:11:24 -04:00
renovate[bot]
41dec93cd2 Update dependency monaco-editor to ^0.51.0 (#13101) 2024-08-25 22:11:15 -04:00
renovate[bot]
aee2caa733 Update NPM Development dependencies (#13100) 2024-08-25 22:11:07 -04:00
renovate[bot]
fe5544e137 Update dependency react-resizable-panels to v2.1.1 (#13098) 2024-08-25 22:11:01 -04:00
renovate[bot]
14c014a48b Update Rust crate syn to v2.0.76 (#13097) 2024-08-25 22:10:57 -04:00
renovate[bot]
ecd0597d6b Update Rust crate serde_json to v1.0.127 (#13096) 2024-08-25 22:10:50 -04:00
renovate[bot]
202271fba6 Update Rust crate serde to v1.0.209 (#13095) 2024-08-25 22:10:45 -04:00
renovate[bot]
4bdb0b4f86 Update Rust crate quote to v1.0.37 (#13094) 2024-08-25 22:10:38 -04:00
renovate[bot]
2286f916c1 Update Rust crate libc to v0.2.158 (#13093) 2024-08-25 22:10:32 -04:00
renovate[bot]
1e4c944251 Update pre-commit dependencies (#13099)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-08-26 01:49:41 +01:00
Calum Young
f50f8732e9 [flake8-pytest-style] Improve help message for pytest-incorrect-mark-parentheses-style (PT023) (#13092) 2024-08-26 01:37:57 +01:00
Micha Reiser
ecab04e338 Basic concurrent checking (#13049) 2024-08-24 09:53:27 +01:00
Dylan
8c09496b07 [red-knot] Resolve function annotations before adding function symbol (#13084)
This PR has the `SemanticIndexBuilder` visit function definition
annotations before adding the function symbol/name to the builder.

For example, the following snippet no longer causes a panic:

```python
def bool(x) -> bool:
    return True
```

Note: This fix changes the ordering of the global symbol table.

Closes #13069
2024-08-23 19:31:36 -07:00
Alex Waygood
d19fd1b91c [red-knot] Add symbols for for loop variables (#13075)
## Summary

This PR adds symbols introduced by `for` loops to red-knot:
- `x` in `for x in range(10): pass`
- `x` and `y` in `for x, y in d.items(): pass`
- `a`, `b`, `c` and `d` in `for [((a,), b), (c, d)] in foo: pass`

## Test Plan

Several tests added, and the assertion in the benchmarks has been
updated.

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2024-08-23 23:40:27 +01:00
Dhruv Manilawala
99df859e20 Include all required keys for Zed settings (#13082)
## Summary

Closes: #13081
2024-08-23 16:18:05 +00:00
Micha Reiser
2d5fe9a6d3 Fix dark theme on initial page load (#13077) 2024-08-23 12:53:43 +00:00
jesse
1f2cb09853 [async-function-with-timeout] Disable check for asyncio before Python 3.11 (ASYNC109) (#13023)
Co-authored-by: Micha Reiser <micha@reiser.io>
2024-08-23 08:02:53 +00:00
Dhruv Manilawala
cfe25ab465 [red-knot] Support untitled files in the server (#13044)
## Summary

This PR adds support for untitled files in the red knot server.

## Test Plan

https://github.com/user-attachments/assets/57fa5db6-e1ad-4694-ae5f-c47a21eaa82b
2024-08-23 12:47:35 +05:30
Dhruv Manilawala
551ed2706b [red-knot] Simplify virtual file support (#13043)
## Summary

This PR simplifies the virtual file support in the red knot core,
specifically:

* Update `File::add_virtual_file` method to `File::virtual_file` which
will always create a new virtual file and override the existing entry in
the lookup table
* Add `VirtualFile` which is a wrapper around `File` and provides
methods to increment the file revision / close the virtual file
* Add a new `File::try_virtual_file` to lookup the `VirtualFile` from
`Files`
* Add `File::sync_virtual_path` which takes in the `SystemVirtualPath`,
looks up the `VirtualFile` for it and calls the `sync` method to
increment the file revision
* Removes the `virtual_path_metadata` method on the `System` trait

## Test Plan

- [x] Make sure the existing red knot tests pass
- [x] Updated code works well with the LSP
2024-08-23 07:04:15 +00:00
Dhruv Manilawala
21c5606793 [red-knot] Support textDocument/didChange notification (#13042)
## Summary

This PR adds support for `textDocument/didChange` notification.

There seems to be a bug (probably in Salsa) where it panics with:
```
2024-08-22 15:33:38.802 [info] panicked at /Users/dhruv/.cargo/git/checkouts/salsa-61760caba2b17ca5/f608ff8/src/tracked_struct.rs:377:9:
two concurrent writers to Id(4800), should not be possible
```

## Test Plan


https://github.com/user-attachments/assets/81055feb-ba8e-4acf-ad2f-94084a3efead
2024-08-23 06:58:54 +00:00
Dhruv Manilawala
c73a7bb929 [red-knot] Support files outside of any workspace (#13041)
## Summary

This PR adds basic support for files outside of any workspace in the red
knot server.

This also limits the red knot server to only work in a single workspace.
The server will not start if there are multiple workspaces.

## Test Plan

https://github.com/user-attachments/assets/de601387-0ad5-433c-9d2c-7b6ae5137654
2024-08-23 06:51:48 +00:00
Micha Reiser
4f6accb5c6 Add basic red knot benchmark (#13026)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2024-08-23 08:22:42 +02:00
Micha Reiser
1ca14e4335 Move collection of parse errors to check_file (#13059) 2024-08-23 08:22:12 +02:00
Teodoro Freund
b9c8113a8a Added bytes type and some inference (#13061)
## Summary

This PR adds the `bytes` type to red-knot:
- Added the `bytes` type
- Added support for bytes literals
- Support for the `+` operator
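
A minimal sketch of what is now inferred (display names are illustrative):

```python
x = b"hello"        # a bytes-literal type
y = x + b", world"  # `+` on bytes literals infers the concatenated literal
```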

Improves on #12701 

Big TODO on supporting and normalizing r-prefixed bytestrings
(`rb"hello\n"`)

## Test Plan

Added a test for bytes literals, concatenation, and corner values.
2024-08-22 13:27:15 -07:00
Dylan
2edd32aa31 [red-knot] SemanticIndexBuilder visits value before target in named expressions (#13053)
The `SemanticIndexBuilder` was causing a cycle in a salsa query by
attempting to resolve the target before the value in a named expression
(e.g. `x := x+1`). This PR swaps the order, avoiding a panic.

Closes #13012.
2024-08-22 07:59:13 -07:00
Dhruv Manilawala
02c4373a49 Bump version to 0.6.2 (#13056) 2024-08-22 18:59:27 +05:30
Steve C
d37e2e5d33 [flake8-simplify] Extend open-file-with-context-handler to work with other standard-library IO modules (SIM115) (#12959)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2024-08-22 14:18:55 +01:00
Dhruv Manilawala
d1d067896c [red-knot] Remove notebook support from the server (#13040)
## Summary

This PR removes notebook sync support from server capabilities because
it isn't tested; it'll be added back once we actually add full support
for notebooks.
2024-08-22 14:55:46 +05:30
olp-cs
93f9023ea3 Add hyperfine installation instructions; update hyperfine code samples (#13034)
## Summary

When following the step-by-step instructions to run the benchmarks in
`CONTRIBUTING.md`, I encountered two errors:

**Error 1:**
`bash: hyperfine: command not found`
    
**Solution**: I updated the instructions to include the step of
installing the benchmark tool.

**Error 2:**
```shell
$ ./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ 
error: `ruff <path>` has been removed. Use `ruff check <path>` instead.
```
**Solution**: I added `check`.

## Test Plan

I tested it by running the benchmark-related commands in a new workspace
within GitHub Codespaces.
2024-08-22 09:05:09 +05:30
Dhruv Manilawala
8144a11f98 [red-knot] Add definition for with items (#12920)
## Summary

This PR adds symbols and definitions introduced by `with` statements.

The symbols and definitions are introduced for each with item. The type
inference is updated to call the definition region type inference
instead.
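
A minimal illustration: each item below gets its own symbol and
definition, and inference for the names goes through the definition
region:

```python
with open("a.txt") as f, open("b.txt") as g:
    reveal_type(f)  # resolved via the definition for the first item
    reveal_type(g)  # ...and via the second item's definition
```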

## Test Plan

Add test case to check for symbol table and definitions.
2024-08-22 08:00:19 +05:30
Micha Reiser
dce87c21fd Eagerly validate typeshed versions (#12786) 2024-08-21 15:49:53 +00:00
175 changed files with 5484 additions and 1538 deletions

View File

@@ -20,7 +20,7 @@
"extensions": [
"ms-python.python",
"rust-lang.rust-analyzer",
"serayuzgur.crates",
"fill-labs.dependi",
"tamasfe.even-better-toml",
"Swellaby.vscode-rust-test-adapter",
"charliermarsh.ruff"

View File

@@ -37,7 +37,7 @@ jobs:
with:
fetch-depth: 0
- uses: tj-actions/changed-files@v44
- uses: tj-actions/changed-files@v45
id: changed
with:
files_yaml: |

View File

@@ -45,7 +45,7 @@ repos:
)$
- repo: https://github.com/crate-ci/typos
rev: v1.23.6
rev: v1.24.1
hooks:
- id: typos
@@ -59,7 +59,7 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.6.1
rev: v0.6.2
hooks:
- id: ruff-format
- id: ruff

View File

@@ -1,5 +1,61 @@
# Changelog
## 0.6.3
### Preview features
- \[`flake8-simplify`\] Extend `open-file-with-context-handler` to work with `dbm.sqlite3` (`SIM115`) ([#13104](https://github.com/astral-sh/ruff/pull/13104))
- \[`pycodestyle`\] Disable `E741` in stub files (`.pyi`) ([#13119](https://github.com/astral-sh/ruff/pull/13119))
- \[`pydoclint`\] Avoid `DOC201` on explicit returns in functions that only return `None` ([#13064](https://github.com/astral-sh/ruff/pull/13064))
### Rule changes
- \[`flake8-async`\] Disable check for `asyncio` before Python 3.11 (`ASYNC109`) ([#13023](https://github.com/astral-sh/ruff/pull/13023))
### Bug fixes
- \[`FastAPI`\] Avoid introducing invalid syntax in fix for `fast-api-non-annotated-dependency` (`FAST002`) ([#13133](https://github.com/astral-sh/ruff/pull/13133))
- \[`flake8-implicit-str-concat`\] Normalize octals before merging concatenated strings in `single-line-implicit-string-concatenation` (`ISC001`) ([#13118](https://github.com/astral-sh/ruff/pull/13118))
- \[`flake8-pytest-style`\] Improve help message for `pytest-incorrect-mark-parentheses-style` (`PT023`) ([#13092](https://github.com/astral-sh/ruff/pull/13092))
- \[`pylint`\] Avoid autofix for calls that aren't `min` or `max` as starred expression (`PLW3301`) ([#13089](https://github.com/astral-sh/ruff/pull/13089))
- \[`ruff`\] Add `datetime.time`, `datetime.tzinfo`, and `datetime.timezone` as immutable function calls (`RUF009`) ([#13109](https://github.com/astral-sh/ruff/pull/13109))
- \[`ruff`\] Extend comment deletion for `RUF100` to include trailing text from `noqa` directives while preserving any following comments on the same line ([#13105](https://github.com/astral-sh/ruff/pull/13105))
- Fix dark theme on initial page load for the Ruff playground ([#13077](https://github.com/astral-sh/ruff/pull/13077))
## 0.6.2
### Preview features
- \[`flake8-simplify`\] Extend `open-file-with-context-handler` to work with other standard-library IO modules (`SIM115`) ([#12959](https://github.com/astral-sh/ruff/pull/12959))
- \[`ruff`\] Avoid `unused-async` for functions with FastAPI route decorator (`RUF029`) ([#12938](https://github.com/astral-sh/ruff/pull/12938))
- \[`ruff`\] Ignore `fstring-missing-syntax` (`RUF027`) for `fastAPI` paths ([#12939](https://github.com/astral-sh/ruff/pull/12939))
- \[`ruff`\] Implement check for Decimal called with a float literal (RUF032) ([#12909](https://github.com/astral-sh/ruff/pull/12909))
### Rule changes
- \[`flake8-bugbear`\] Update diagnostic message when expression is at the end of function (`B015`) ([#12944](https://github.com/astral-sh/ruff/pull/12944))
- \[`flake8-pyi`\] Skip type annotations in `string-or-bytes-too-long` (`PYI053`) ([#13002](https://github.com/astral-sh/ruff/pull/13002))
- \[`flake8-type-checking`\] Always recognise relative imports as first-party ([#12994](https://github.com/astral-sh/ruff/pull/12994))
- \[`flake8-unused-arguments`\] Ignore unused arguments on stub functions (`ARG001`) ([#12966](https://github.com/astral-sh/ruff/pull/12966))
- \[`pylint`\] Ignore augmented assignment for `self-cls-assignment` (`PLW0642`) ([#12957](https://github.com/astral-sh/ruff/pull/12957))
### Server
- Show full context in error log messages ([#13029](https://github.com/astral-sh/ruff/pull/13029))
### Bug fixes
- \[`pep8-naming`\] Don't flag `from` imports following conventional import names (`N817`) ([#12946](https://github.com/astral-sh/ruff/pull/12946))
- \[`pylint`\] - Allow `__new__` methods to have `cls` as their first argument even if decorated with `@staticmethod` for `bad-staticmethod-argument` (`PLW0211`) ([#12958](https://github.com/astral-sh/ruff/pull/12958))
### Documentation
- Add `hyperfine` installation instructions; update `hyperfine` code samples ([#13034](https://github.com/astral-sh/ruff/pull/13034))
- Expand note to use Ruff with other language server in Kate ([#12806](https://github.com/astral-sh/ruff/pull/12806))
- Update example for `PT001` as per the new default behavior ([#13019](https://github.com/astral-sh/ruff/pull/13019))
- \[`perflint`\] Improve docs for `try-except-in-loop` (`PERF203`) ([#12947](https://github.com/astral-sh/ruff/pull/12947))
- \[`pydocstyle`\] Add reference to `lint.pydocstyle.ignore-decorators` setting to rule docs ([#12996](https://github.com/astral-sh/ruff/pull/12996))
## 0.6.1
This is a hotfix release to address an issue with `ruff-pre-commit`. In v0.6,

View File

@@ -397,12 +397,18 @@ which makes it a good target for benchmarking.
git clone --branch 3.10 https://github.com/python/cpython.git crates/ruff_linter/resources/test/cpython
```
Install `hyperfine`:
```shell
cargo install hyperfine
```
To benchmark the release build:
```shell
cargo build --release && hyperfine --warmup 10 \
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache -e" \
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ -e"
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ --no-cache -e" \
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ -e"
Benchmark 1: ./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache
Time (mean ± σ): 293.8 ms ± 3.2 ms [User: 2384.6 ms, System: 90.3 ms]
@@ -421,7 +427,7 @@ To benchmark against the ecosystem's existing tools:
```shell
hyperfine --ignore-failure --warmup 5 \
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache" \
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ --no-cache" \
"pyflakes crates/ruff_linter/resources/test/cpython" \
"autoflake --recursive --expand-star-imports --remove-all-unused-imports --remove-unused-variables --remove-duplicate-keys resources/test/cpython" \
"pycodestyle crates/ruff_linter/resources/test/cpython" \
@@ -467,7 +473,7 @@ To benchmark a subset of rules, e.g. `LineTooLong` and `DocLineTooLong`:
```shell
cargo build --release && hyperfine --warmup 10 \
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache -e --select W505,E501"
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ --no-cache -e --select W505,E501"
```
You can run `poetry install` from `./scripts/benchmarks` to create a working environment for the
@@ -524,6 +530,8 @@ You can run the benchmarks with
cargo benchmark
```
`cargo benchmark` is an alias for `cargo bench -p ruff_benchmark --bench linter --bench formatter --`
#### Benchmark-driven Development
Ruff uses [Criterion.rs](https://bheisler.github.io/criterion.rs/book/) for benchmarks. You can use
@@ -562,7 +570,7 @@ cargo install critcmp
#### Tips
- Use `cargo bench -p ruff_benchmark <filter>` to only run specific benchmarks. For example: `cargo benchmark lexer`
- Use `cargo bench -p ruff_benchmark <filter>` to only run specific benchmarks. For example: `cargo bench -p ruff_benchmark lexer`
to only run the lexer benchmarks.
- Use `cargo bench -p ruff_benchmark -- --quiet` for a more cleaned up output (without statistical relevance)
- Use `cargo bench -p ruff_benchmark -- --quick` to get faster results (more prone to noise)

Cargo.lock generated
View File

@@ -1256,9 +1256,9 @@ checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646"
[[package]]
name = "libc"
version = "0.2.157"
version = "0.2.158"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "374af5f94e54fa97cf75e945cce8a6b201e88a1a07e688b47dfd2a59c66dbd86"
checksum = "d8adc4bb1803a324070e64a98ae98f38934d91957a99cfb3a43dcbc01bc56439"
[[package]]
name = "libcst"
@@ -1827,9 +1827,9 @@ dependencies = [
[[package]]
name = "quote"
version = "1.0.36"
version = "1.0.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0fa76aaf39101c457836aec0ce2316dbdc3ab723cdda1c6bd4e6ad4208acaca7"
checksum = "b5b9d34b8991d19d98081b46eacdd8eb58c6f2b201139f7c5f643cc155a633af"
dependencies = [
"proc-macro2",
]
@@ -1926,6 +1926,7 @@ dependencies = [
"ruff_db",
"ruff_index",
"ruff_python_ast",
"ruff_python_literal",
"ruff_python_parser",
"ruff_python_stdlib",
"ruff_source_file",
@@ -1935,6 +1936,7 @@ dependencies = [
"smallvec",
"static_assertions",
"tempfile",
"thiserror",
"tracing",
"walkdir",
"zip",
@@ -1950,7 +1952,6 @@ dependencies = [
"libc",
"lsp-server",
"lsp-types",
"red_knot_python_semantic",
"red_knot_workspace",
"ruff_db",
"ruff_notebook",
@@ -1988,6 +1989,7 @@ dependencies = [
"anyhow",
"crossbeam",
"notify",
"rayon",
"red_knot_python_semantic",
"ruff_cache",
"ruff_db",
@@ -1995,7 +1997,6 @@ dependencies = [
"ruff_text_size",
"rustc-hash 2.0.0",
"salsa",
"thiserror",
"tracing",
]
@@ -2089,7 +2090,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.6.1"
version = "0.6.3"
dependencies = [
"anyhow",
"argfile",
@@ -2147,6 +2148,7 @@ dependencies = [
"criterion",
"mimalloc",
"once_cell",
"rayon",
"red_knot_python_semantic",
"red_knot_workspace",
"ruff_db",
@@ -2281,7 +2283,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.6.1"
version = "0.6.3"
dependencies = [
"aho-corasick",
"annotate-snippets 0.9.2",
@@ -2601,7 +2603,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.6.1"
version = "0.6.3"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -2828,9 +2830,9 @@ checksum = "1c107b6f4780854c8b126e228ea8869f4d7b71260f962fefb57b996b8959ba6b"
[[package]]
name = "serde"
version = "1.0.208"
version = "1.0.209"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cff085d2cb684faa248efb494c39b68e522822ac0de72ccf08109abde717cfb2"
checksum = "99fce0ffe7310761ca6bf9faf5115afbc19688edd00171d81b1bb1b116c63e09"
dependencies = [
"serde_derive",
]
@@ -2848,9 +2850,9 @@ dependencies = [
[[package]]
name = "serde_derive"
version = "1.0.208"
version = "1.0.209"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24008e81ff7613ed8e5ba0cfaf24e2c2f1e5b8a0495711e44fcd4882fca62bcf"
checksum = "a5831b979fd7b5439637af1752d535ff49f4860c0f341d1baeb6faf0f4242170"
dependencies = [
"proc-macro2",
"quote",
@@ -2870,9 +2872,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.125"
version = "1.0.127"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "83c8e735a073ccf5be70aa8066aa984eaf2fa000db6c8d0100ae605b366d31ed"
checksum = "8043c06d9f82bd7271361ed64f415fe5e12a77fdb52e573e7f06a516dea329ad"
dependencies = [
"itoa",
"memchr",
@@ -3031,9 +3033,9 @@ checksum = "81cdd64d312baedb58e21336b31bc043b77e01cc99033ce76ef539f78e965ebc"
[[package]]
name = "syn"
version = "2.0.75"
version = "2.0.76"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6af063034fc1935ede7be0122941bafa9bacb949334d090b77ca98b5817c7d9"
checksum = "578e081a14e0cefc3279b0472138c513f37b41a08d5a3cca9b6e4e8ceb6cd525"
dependencies = [
"proc-macro2",
"quote",

View File

@@ -110,7 +110,7 @@ For more, see the [documentation](https://docs.astral.sh/ruff/).
1. [Who's Using Ruff?](#whos-using-ruff)
1. [License](#license)
## Getting Started
## Getting Started<a id="getting-started"></a>
For more, see the [documentation](https://docs.astral.sh/ruff/).
@@ -136,8 +136,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.6.1/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.6.1/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.6.3/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.6.3/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -170,7 +170,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.6.1
rev: v0.6.3
hooks:
# Run the linter.
- id: ruff
@@ -195,7 +195,7 @@ jobs:
- uses: chartboost/ruff-action@v1
```
### Configuration
### Configuration<a id="configuration"></a>
Ruff can be configured through a `pyproject.toml`, `ruff.toml`, or `.ruff.toml` file (see:
[_Configuration_](https://docs.astral.sh/ruff/configuration/), or [_Settings_](https://docs.astral.sh/ruff/settings/)
@@ -291,7 +291,7 @@ features that may change prior to stabilization.
See `ruff help` for more on Ruff's top-level commands, or `ruff help check` and `ruff help format`
for more on the linting and formatting commands, respectively.
## Rules
## Rules<a id="rules"></a>
<!-- Begin section: Rules -->
@@ -367,21 +367,21 @@ quality tools, including:
For a complete enumeration of the supported rules, see [_Rules_](https://docs.astral.sh/ruff/rules/).
## Contributing
## Contributing<a id="contributing"></a>
Contributions are welcome and highly appreciated. To get started, check out the
[**contributing guidelines**](https://docs.astral.sh/ruff/contributing/).
You can also join us on [**Discord**](https://discord.com/invite/astral-sh).
## Support
## Support<a id="support"></a>
Having trouble? Check out the existing issues on [**GitHub**](https://github.com/astral-sh/ruff/issues),
or feel free to [**open a new one**](https://github.com/astral-sh/ruff/issues/new).
You can also ask for help on [**Discord**](https://discord.com/invite/astral-sh).
## Acknowledgements
## Acknowledgements<a id="acknowledgements"></a>
Ruff's linter draws on both the APIs and implementation details of many other
tools in the Python ecosystem, especially [Flake8](https://github.com/PyCQA/flake8), [Pyflakes](https://github.com/PyCQA/pyflakes),
@@ -405,7 +405,7 @@ Ruff is the beneficiary of a large number of [contributors](https://github.com/a
Ruff is released under the MIT license.
## Who's Using Ruff?
## Who's Using Ruff?<a id="whos-using-ruff"></a>
Ruff is used by a number of major open-source projects and companies, including:
@@ -524,7 +524,7 @@ If you're using Ruff, consider adding the Ruff badge to your project's `README.m
<a href="https://github.com/astral-sh/ruff"><img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json" alt="Ruff" style="max-width:100%;"></a>
```
## License
## License<a id="license"></a>
This repository is licensed under the [MIT License](https://github.com/astral-sh/ruff/blob/main/LICENSE)

View File

@@ -13,12 +13,17 @@ The CLI supports different verbosity levels.
- `-vv` activates `debug!` and timestamps: This should be enough information to get to the bottom of bug reports. When you're processing many packages or files, you'll get pages and pages of output, but each line is linked to a specific action or state change.
- `-vvv` activates `trace!` (only in debug builds) and shows tracing-spans: At this level, you're logging everything. Most of this is wasted, it's really slow, we dump e.g. the entire resolution graph. Only useful to developers, and you almost certainly want to use `RED_KNOT_LOG` to filter it down to the area you're investigating.
## `RED_KNOT_LOG`
## Better logging with `RED_KNOT_LOG` and `RAYON_NUM_THREADS`
By default, the CLI shows messages from the `ruff` and `red_knot` crates. Tracing messages from other crates are not shown.
The `RED_KNOT_LOG` environment variable allows you to customize which messages are shown by specifying one
or more [filter directives](https://docs.rs/tracing-subscriber/latest/tracing_subscriber/filter/struct.EnvFilter.html#directives).
The `RAYON_NUM_THREADS` environment variable, meanwhile, can be used to control the level of concurrency red-knot uses.
By default, red-knot will attempt to parallelize its work so that multiple files are checked simultaneously,
but this can result in confusing log output where messages from different threads are intertwined.
To switch off concurrency entirely and have more readable logs, use `RAYON_NUM_THREADS=1`.
### Examples
#### Show all debug messages

View File

@@ -7,12 +7,12 @@ use colored::Colorize;
use crossbeam::channel as crossbeam_channel;
use salsa::plumbing::ZalsaDatabase;
use red_knot_python_semantic::{ProgramSettings, SearchPathSettings};
use red_knot_python_semantic::SitePackages;
use red_knot_server::run_server;
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::site_packages::VirtualEnvironment;
use red_knot_workspace::watch;
use red_knot_workspace::watch::WorkspaceWatcher;
use red_knot_workspace::workspace::settings::Configuration;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::system::{OsSystem, System, SystemPath, SystemPathBuf};
use target_version::TargetVersion;
@@ -65,15 +65,14 @@ to resolve type information for the project's third-party dependencies.",
value_name = "PATH",
help = "Additional path to use as a module-resolution source (can be passed multiple times)"
)]
extra_search_path: Vec<SystemPathBuf>,
extra_search_path: Option<Vec<SystemPathBuf>>,
#[arg(
long,
help = "Python version to assume when resolving types",
default_value_t = TargetVersion::default(),
value_name="VERSION")
]
target_version: TargetVersion,
value_name = "VERSION"
)]
target_version: Option<TargetVersion>,
#[clap(flatten)]
verbosity: Verbosity,
@@ -86,6 +85,36 @@ to resolve type information for the project's third-party dependencies.",
watch: bool,
}
impl Args {
fn to_configuration(&self, cli_cwd: &SystemPath) -> Configuration {
let mut configuration = Configuration::default();
if let Some(target_version) = self.target_version {
configuration.target_version = Some(target_version.into());
}
if let Some(venv_path) = &self.venv_path {
configuration.search_paths.site_packages = Some(SitePackages::Derived {
venv_path: SystemPath::absolute(venv_path, cli_cwd),
});
}
if let Some(custom_typeshed_dir) = &self.custom_typeshed_dir {
configuration.search_paths.custom_typeshed =
Some(SystemPath::absolute(custom_typeshed_dir, cli_cwd));
}
if let Some(extra_search_paths) = &self.extra_search_path {
configuration.search_paths.extra_paths = extra_search_paths
.iter()
.map(|path| Some(SystemPath::absolute(path, cli_cwd)))
.collect();
}
configuration
}
}
#[derive(Debug, clap::Subcommand)]
pub enum Command {
/// Start the language server
@@ -115,22 +144,13 @@ pub fn main() -> ExitStatus {
}
fn run() -> anyhow::Result<ExitStatus> {
let Args {
command,
current_directory,
custom_typeshed_dir,
extra_search_path: extra_paths,
venv_path,
target_version,
verbosity,
watch,
} = Args::parse_from(std::env::args().collect::<Vec<_>>());
let args = Args::parse_from(std::env::args().collect::<Vec<_>>());
if matches!(command, Some(Command::Server)) {
if matches!(args.command, Some(Command::Server)) {
return run_server().map(|()| ExitStatus::Success);
}
let verbosity = verbosity.level();
let verbosity = args.verbosity.level();
countme::enable(verbosity.is_trace());
let _guard = setup_tracing(verbosity)?;
@@ -146,10 +166,12 @@ fn run() -> anyhow::Result<ExitStatus> {
})?
};
let cwd = current_directory
let cwd = args
.current_directory
.as_ref()
.map(|cwd| {
if cwd.as_std_path().is_dir() {
Ok(SystemPath::absolute(&cwd, &cli_base_path))
Ok(SystemPath::absolute(cwd, &cli_base_path))
} else {
Err(anyhow!(
"Provided current-directory path '{cwd}' is not a directory."
@@ -160,33 +182,18 @@ fn run() -> anyhow::Result<ExitStatus> {
.unwrap_or_else(|| cli_base_path.clone());
let system = OsSystem::new(cwd.clone());
let workspace_metadata = WorkspaceMetadata::from_path(system.current_directory(), &system)?;
// TODO: Verify the remaining search path settings eagerly.
let site_packages = venv_path
.map(|path| {
VirtualEnvironment::new(path, &OsSystem::new(cli_base_path))
.and_then(|venv| venv.site_packages_directories(&system))
})
.transpose()?
.unwrap_or_default();
// TODO: Respect the settings from the workspace metadata. when resolving the program settings.
let program_settings = ProgramSettings {
target_version: target_version.into(),
search_paths: SearchPathSettings {
extra_paths,
src_root: workspace_metadata.root().to_path_buf(),
custom_typeshed: custom_typeshed_dir,
site_packages,
},
};
let cli_configuration = args.to_configuration(&cwd);
let workspace_metadata = WorkspaceMetadata::from_path(
system.current_directory(),
&system,
Some(cli_configuration.clone()),
)?;
// TODO: Use the `program_settings` to compute the key for the database's persistent
// cache and load the cache if it exists.
let mut db = RootDatabase::new(workspace_metadata, program_settings, system)?;
let mut db = RootDatabase::new(workspace_metadata, system)?;
let (main_loop, main_loop_cancellation_token) = MainLoop::new();
let (main_loop, main_loop_cancellation_token) = MainLoop::new(cli_configuration);
// Listen to Ctrl+C and abort the watch mode.
let main_loop_cancellation_token = Mutex::new(Some(main_loop_cancellation_token));
@@ -198,7 +205,7 @@ fn run() -> anyhow::Result<ExitStatus> {
}
})?;
let exit_status = if watch {
let exit_status = if args.watch {
main_loop.watch(&mut db)?
} else {
main_loop.run(&mut db)
@@ -238,10 +245,12 @@ struct MainLoop {
/// The file system watcher, if running in watch mode.
watcher: Option<WorkspaceWatcher>,
cli_configuration: Configuration,
}
impl MainLoop {
fn new() -> (Self, MainLoopCancellationToken) {
fn new(cli_configuration: Configuration) -> (Self, MainLoopCancellationToken) {
let (sender, receiver) = crossbeam_channel::bounded(10);
(
@@ -249,6 +258,7 @@ impl MainLoop {
sender: sender.clone(),
receiver,
watcher: None,
cli_configuration,
},
MainLoopCancellationToken { sender },
)
@@ -331,7 +341,7 @@ impl MainLoop {
MainLoopMessage::ApplyChanges(changes) => {
revision += 1;
// Automatically cancels any pending queries and waits for them to complete.
db.apply_changes(changes);
db.apply_changes(changes, Some(&self.cli_configuration));
if let Some(watcher) = self.watcher.as_mut() {
watcher.update(db);
}

View File

@@ -5,12 +5,11 @@ use std::time::Duration;
use anyhow::{anyhow, Context};
use red_knot_python_semantic::{
resolve_module, ModuleName, Program, ProgramSettings, PythonVersion, SearchPathSettings,
};
use red_knot_python_semantic::{resolve_module, ModuleName, Program, PythonVersion, SitePackages};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::watch;
use red_knot_workspace::watch::{directory_watcher, WorkspaceWatcher};
use red_knot_workspace::workspace::settings::{Configuration, SearchPathConfiguration};
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File, FileError};
use ruff_db::source::source_text;
@@ -25,7 +24,7 @@ struct TestCase {
/// We need to hold on to it in the test case or the temp files get deleted.
_temp_dir: tempfile::TempDir,
root_dir: SystemPathBuf,
search_path_settings: SearchPathSettings,
configuration: Configuration,
}
impl TestCase {
@@ -41,10 +40,6 @@ impl TestCase {
&self.db
}
fn db_mut(&mut self) -> &mut RootDatabase {
&mut self.db
}
fn stop_watch(&mut self) -> Vec<watch::ChangeEvent> {
self.try_stop_watch(Duration::from_secs(10))
.expect("Expected watch changes but observed none.")
@@ -105,16 +100,20 @@ impl TestCase {
Some(all_events)
}
fn apply_changes(&mut self, changes: Vec<watch::ChangeEvent>) {
self.db.apply_changes(changes, Some(&self.configuration));
}
fn update_search_path_settings(
&mut self,
f: impl FnOnce(&SearchPathSettings) -> SearchPathSettings,
configuration: SearchPathConfiguration,
) -> anyhow::Result<()> {
let program = Program::get(self.db());
let new_settings = f(&self.search_path_settings);
self.configuration.search_paths = configuration.clone();
let new_settings = configuration.into_settings(self.db.workspace().root(&self.db));
program.update_search_paths(&mut self.db, new_settings.clone())?;
self.search_path_settings = new_settings;
program.update_search_paths(&mut self.db, &new_settings)?;
if let Some(watcher) = &mut self.watcher {
watcher.update(&self.db);
@@ -179,17 +178,14 @@ fn setup<F>(setup_files: F) -> anyhow::Result<TestCase>
where
F: SetupFiles,
{
setup_with_search_paths(setup_files, |_root, workspace_path| SearchPathSettings {
extra_paths: vec![],
src_root: workspace_path.to_path_buf(),
custom_typeshed: None,
site_packages: vec![],
setup_with_search_paths(setup_files, |_root, _workspace_path| {
SearchPathConfiguration::default()
})
}
fn setup_with_search_paths<F>(
setup_files: F,
create_search_paths: impl FnOnce(&SystemPath, &SystemPath) -> SearchPathSettings,
create_search_paths: impl FnOnce(&SystemPath, &SystemPath) -> SearchPathConfiguration,
) -> anyhow::Result<TestCase>
where
F: SetupFiles,
@@ -221,25 +217,34 @@ where
let system = OsSystem::new(&workspace_path);
let workspace = WorkspaceMetadata::from_path(&workspace_path, &system)?;
let search_path_settings = create_search_paths(&root_path, workspace.root());
let search_paths = create_search_paths(&root_path, &workspace_path);
for path in search_path_settings
for path in search_paths
.extra_paths
.iter()
.chain(search_path_settings.site_packages.iter())
.chain(search_path_settings.custom_typeshed.iter())
.flatten()
.chain(search_paths.custom_typeshed.iter())
.chain(search_paths.site_packages.iter().flat_map(|site_packages| {
if let SitePackages::Known(path) = site_packages {
path.as_slice()
} else {
&[]
}
}))
{
std::fs::create_dir_all(path.as_std_path())
.with_context(|| format!("Failed to create search path '{path}'"))?;
}
let settings = ProgramSettings {
target_version: PythonVersion::default(),
search_paths: search_path_settings.clone(),
let configuration = Configuration {
target_version: Some(PythonVersion::PY312),
search_paths,
};
let db = RootDatabase::new(workspace, settings, system)?;
let workspace =
WorkspaceMetadata::from_path(&workspace_path, &system, Some(configuration.clone()))?;
let db = RootDatabase::new(workspace, system)?;
let (sender, receiver) = crossbeam::channel::unbounded();
let watcher = directory_watcher(move |events| sender.send(events).unwrap())
@@ -254,7 +259,7 @@ where
watcher: Some(watcher),
_temp_dir: temp_dir,
root_dir: root_path,
search_path_settings,
configuration,
};
// Sometimes the file watcher reports changes for events that happened before the watcher was started.
@@ -307,7 +312,7 @@ fn new_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
let foo = case.system_file(&foo_path).expect("foo.py to exist.");
@@ -330,7 +335,7 @@ fn new_ignored_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert!(case.system_file(&foo_path).is_ok());
assert_eq!(&case.collect_package_files(&bar_path), &[bar_file]);
@@ -354,7 +359,7 @@ fn changed_file() -> anyhow::Result<()> {
assert!(!changes.is_empty());
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert_eq!(source_text(case.db(), foo).as_str(), "print('Version 2')");
assert_eq!(&case.collect_package_files(&foo_path), &[foo]);
@@ -377,7 +382,7 @@ fn deleted_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert!(!foo.exists(case.db()));
assert_eq!(&case.collect_package_files(&foo_path), &[] as &[File]);
@@ -409,7 +414,7 @@ fn move_file_to_trash() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert!(!foo.exists(case.db()));
assert_eq!(&case.collect_package_files(&foo_path), &[] as &[File]);
@@ -441,7 +446,7 @@ fn move_file_to_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
let foo_in_workspace = case.system_file(&foo_in_workspace_path)?;
@@ -469,7 +474,7 @@ fn rename_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert!(!foo.exists(case.db()));
@@ -510,7 +515,7 @@ fn directory_moved_to_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
let init_file = case
.system_file(sub_new_path.join("__init__.py"))
@@ -561,7 +566,7 @@ fn directory_moved_to_trash() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("sub.a").unwrap()).is_none());
@@ -615,7 +620,7 @@ fn directory_renamed() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("sub.a").unwrap()).is_none());
@@ -680,7 +685,7 @@ fn directory_deleted() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("sub.a").unwrap()).is_none());
@@ -694,15 +699,13 @@ fn directory_deleted() -> anyhow::Result<()> {
#[test]
fn search_path() -> anyhow::Result<()> {
let mut case =
setup_with_search_paths([("bar.py", "import sub.a")], |root_path, workspace_path| {
SearchPathSettings {
extra_paths: vec![],
src_root: workspace_path.to_path_buf(),
custom_typeshed: None,
site_packages: vec![root_path.join("site_packages")],
}
})?;
let mut case = setup_with_search_paths(
[("bar.py", "import sub.a")],
|root_path, _workspace_path| SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![root_path.join("site_packages")])),
..SearchPathConfiguration::default()
},
)?;
let site_packages = case.root_path().join("site_packages");
@@ -715,7 +718,7 @@ fn search_path() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("a").unwrap()).is_some());
assert_eq!(
@@ -736,9 +739,9 @@ fn add_search_path() -> anyhow::Result<()> {
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("a").unwrap()).is_none());
// Register site-packages as a search path.
case.update_search_path_settings(|settings| SearchPathSettings {
site_packages: vec![site_packages.clone()],
..settings.clone()
case.update_search_path_settings(SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![site_packages.clone()])),
..SearchPathConfiguration::default()
})
.expect("Search path settings to be valid");
@@ -746,7 +749,7 @@ fn add_search_path() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("a").unwrap()).is_some());
@@ -755,21 +758,19 @@ fn add_search_path() -> anyhow::Result<()> {
#[test]
fn remove_search_path() -> anyhow::Result<()> {
let mut case =
setup_with_search_paths([("bar.py", "import sub.a")], |root_path, workspace_path| {
SearchPathSettings {
extra_paths: vec![],
src_root: workspace_path.to_path_buf(),
custom_typeshed: None,
site_packages: vec![root_path.join("site_packages")],
}
})?;
let mut case = setup_with_search_paths(
[("bar.py", "import sub.a")],
|root_path, _workspace_path| SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![root_path.join("site_packages")])),
..SearchPathConfiguration::default()
},
)?;
// Remove site packages from the search path settings.
let site_packages = case.root_path().join("site_packages");
case.update_search_path_settings(|settings| SearchPathSettings {
site_packages: vec![],
..settings.clone()
case.update_search_path_settings(SearchPathConfiguration {
site_packages: None,
..SearchPathConfiguration::default()
})
.expect("Search path settings to be valid");
@@ -782,6 +783,48 @@ fn remove_search_path() -> anyhow::Result<()> {
Ok(())
}
#[test]
fn changed_versions_file() -> anyhow::Result<()> {
let mut case = setup_with_search_paths(
|root_path: &SystemPath, workspace_path: &SystemPath| {
std::fs::write(workspace_path.join("bar.py").as_std_path(), "import sub.a")?;
std::fs::create_dir_all(root_path.join("typeshed/stdlib").as_std_path())?;
std::fs::write(root_path.join("typeshed/stdlib/VERSIONS").as_std_path(), "")?;
std::fs::write(
root_path.join("typeshed/stdlib/os.pyi").as_std_path(),
"# not important",
)?;
Ok(())
},
|root_path, _workspace_path| SearchPathConfiguration {
custom_typeshed: Some(root_path.join("typeshed")),
..SearchPathConfiguration::default()
},
)?;
// Unset the custom typeshed directory.
assert_eq!(
resolve_module(case.db(), ModuleName::new("os").unwrap()),
None
);
std::fs::write(
case.root_path()
.join("typeshed/stdlib/VERSIONS")
.as_std_path(),
"os: 3.0-",
)?;
let changes = case.stop_watch();
case.apply_changes(changes);
assert!(resolve_module(case.db(), ModuleName::new("os").unwrap()).is_some());
Ok(())
}
/// Watch a workspace that contains two files where one file is a hardlink to another.
///
/// Setup:
@@ -828,7 +871,7 @@ fn hard_links_in_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert_eq!(source_text(case.db(), foo).as_str(), "print('Version 2')");
@@ -899,7 +942,7 @@ fn hard_links_to_target_outside_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert_eq!(source_text(case.db(), bar).as_str(), "print('Version 2')");
@@ -938,7 +981,7 @@ mod unix {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert_eq!(
foo.permissions(case.db()),
@@ -1023,7 +1066,7 @@ mod unix {
let changes = case.take_watch_changes();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert_eq!(
source_text(case.db(), baz.file()).as_str(),
@@ -1036,7 +1079,7 @@ mod unix {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert_eq!(
source_text(case.db(), baz.file()).as_str(),
@@ -1107,7 +1150,7 @@ mod unix {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
// The file watcher is guaranteed to emit one event for the changed file, but it isn't specified
// if the event is emitted for the "original" or linked path because both paths are watched.
@@ -1176,11 +1219,11 @@ mod unix {
Ok(())
},
|_root, workspace| SearchPathSettings {
extra_paths: vec![],
src_root: workspace.to_path_buf(),
custom_typeshed: None,
site_packages: vec![workspace.join(".venv/lib/python3.12/site-packages")],
|_root, workspace| SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![
workspace.join(".venv/lib/python3.12/site-packages")
])),
..SearchPathConfiguration::default()
},
)?;
@@ -1215,7 +1258,7 @@ mod unix {
let changes = case.stop_watch();
case.db_mut().apply_changes(changes);
case.apply_changes(changes);
assert_eq!(
source_text(case.db(), baz_original_file).as_str(),

View File

@@ -17,6 +17,7 @@ ruff_python_ast = { workspace = true }
ruff_python_stdlib = { workspace = true }
ruff_source_file = { workspace = true }
ruff_text_size = { workspace = true }
ruff_python_literal = { workspace = true }
anyhow = { workspace = true }
bitflags = { workspace = true }
@@ -26,6 +27,7 @@ countme = { workspace = true }
once_cell = { workspace = true }
ordermap = { workspace = true }
salsa = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
rustc-hash = { workspace = true }
hashbrown = { workspace = true }

View File

@@ -5,7 +5,7 @@ use rustc_hash::FxHasher;
pub use db::Db;
pub use module_name::ModuleName;
pub use module_resolver::{resolve_module, system_module_search_paths, vendored_typeshed_stubs};
pub use program::{Program, ProgramSettings, SearchPathSettings};
pub use program::{Program, ProgramSettings, SearchPathSettings, SitePackages};
pub use python_version::PythonVersion;
pub use semantic_model::{HasTy, SemanticModel};
@@ -19,6 +19,7 @@ mod program;
mod python_version;
pub mod semantic_index;
mod semantic_model;
pub(crate) mod site_packages;
pub mod types;
type FxOrderSet<V> = ordermap::set::OrderSet<V, BuildHasherDefault<FxHasher>>;

View File

@@ -13,7 +13,6 @@ use resolver::SearchPathIterator;
mod module;
mod path;
mod resolver;
mod state;
mod typeshed;
#[cfg(test)]

View File

@@ -9,11 +9,11 @@ use ruff_db::files::{system_path_to_file, vendored_path_to_file, File, FileError
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ruff_db::vendored::{VendoredPath, VendoredPathBuf};
use super::typeshed::{typeshed_versions, TypeshedVersionsParseError, TypeshedVersionsQueryResult};
use crate::db::Db;
use crate::module_name::ModuleName;
use super::state::ResolverState;
use super::typeshed::{TypeshedVersionsParseError, TypeshedVersionsQueryResult};
use crate::module_resolver::resolver::ResolverContext;
use crate::site_packages::SitePackagesDiscoveryError;
/// A path that points to a Python module.
///
@@ -60,7 +60,7 @@ impl ModulePath {
}
#[must_use]
pub(crate) fn is_directory(&self, resolver: &ResolverState) -> bool {
pub(super) fn is_directory(&self, resolver: &ResolverContext) -> bool {
let ModulePath {
search_path,
relative_path,
@@ -74,7 +74,7 @@ impl ModulePath {
== Err(FileError::IsADirectory)
}
SearchPathInner::StandardLibraryCustom(stdlib_root) => {
match query_stdlib_version(Some(stdlib_root), relative_path, resolver) {
match query_stdlib_version(relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => {
@@ -84,7 +84,7 @@ impl ModulePath {
}
}
SearchPathInner::StandardLibraryVendored(stdlib_root) => {
match query_stdlib_version(None, relative_path, resolver) {
match query_stdlib_version(relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => resolver
@@ -96,7 +96,7 @@ impl ModulePath {
}
#[must_use]
pub(crate) fn is_regular_package(&self, resolver: &ResolverState) -> bool {
pub(super) fn is_regular_package(&self, resolver: &ResolverContext) -> bool {
let ModulePath {
search_path,
relative_path,
@@ -113,7 +113,7 @@ impl ModulePath {
.is_ok()
}
SearchPathInner::StandardLibraryCustom(search_path) => {
match query_stdlib_version(Some(search_path), relative_path, resolver) {
match query_stdlib_version(relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => system_path_to_file(
@@ -124,7 +124,7 @@ impl ModulePath {
}
}
SearchPathInner::StandardLibraryVendored(search_path) => {
match query_stdlib_version(None, relative_path, resolver) {
match query_stdlib_version(relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => resolver
@@ -136,7 +136,7 @@ impl ModulePath {
}
#[must_use]
pub(crate) fn to_file(&self, resolver: &ResolverState) -> Option<File> {
pub(super) fn to_file(&self, resolver: &ResolverContext) -> Option<File> {
let db = resolver.db.upcast();
let ModulePath {
search_path,
@@ -150,7 +150,7 @@ impl ModulePath {
system_path_to_file(db, search_path.join(relative_path)).ok()
}
SearchPathInner::StandardLibraryCustom(stdlib_root) => {
match query_stdlib_version(Some(stdlib_root), relative_path, resolver) {
match query_stdlib_version(relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => None,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => {
@@ -159,7 +159,7 @@ impl ModulePath {
}
}
SearchPathInner::StandardLibraryVendored(stdlib_root) => {
match query_stdlib_version(None, relative_path, resolver) {
match query_stdlib_version(relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => None,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => {
@@ -273,19 +273,15 @@ fn stdlib_path_to_module_name(relative_path: &Utf8Path) -> Option<ModuleName> {
#[must_use]
fn query_stdlib_version(
custom_stdlib_root: Option<&SystemPath>,
relative_path: &Utf8Path,
resolver: &ResolverState,
context: &ResolverContext,
) -> TypeshedVersionsQueryResult {
let Some(module_name) = stdlib_path_to_module_name(relative_path) else {
return TypeshedVersionsQueryResult::DoesNotExist;
};
let ResolverState {
db,
typeshed_versions,
target_version,
} = resolver;
typeshed_versions.query_module(*db, &module_name, custom_stdlib_root, *target_version)
let ResolverContext { db, target_version } = context;
typeshed_versions(*db).query_module(&module_name, *target_version)
}
/// Enumeration describing the various ways in which validation of a search path might fail.
@@ -293,7 +289,7 @@ fn query_stdlib_version(
/// If validation fails for a search path derived from the user settings,
/// a message must be displayed to the user,
/// as type checking cannot be done reliably in these circumstances.
#[derive(Debug, PartialEq, Eq)]
#[derive(Debug)]
pub(crate) enum SearchPathValidationError {
/// The path provided by the user was not a directory
NotADirectory(SystemPathBuf),
@@ -304,18 +300,20 @@ pub(crate) enum SearchPathValidationError {
NoStdlibSubdirectory(SystemPathBuf),
/// The typeshed path provided by the user is a directory,
/// but no `stdlib/VERSIONS` file exists.
/// but `stdlib/VERSIONS` could not be read.
/// (This is only relevant for stdlib search paths.)
NoVersionsFile(SystemPathBuf),
/// `stdlib/VERSIONS` is a directory.
/// (This is only relevant for stdlib search paths.)
VersionsIsADirectory(SystemPathBuf),
FailedToReadVersionsFile {
path: SystemPathBuf,
error: std::io::Error,
},
/// The path provided by the user is a directory,
/// and a `stdlib/VERSIONS` file exists, but it fails to parse.
/// (This is only relevant for stdlib search paths.)
VersionsParseError(TypeshedVersionsParseError),
/// Failed to discover the site-packages for the configured virtual environment.
SitePackagesDiscovery(SitePackagesDiscoveryError),
}
impl fmt::Display for SearchPathValidationError {
@@ -325,9 +323,16 @@ impl fmt::Display for SearchPathValidationError {
Self::NoStdlibSubdirectory(path) => {
write!(f, "The directory at {path} has no `stdlib/` subdirectory")
}
Self::NoVersionsFile(path) => write!(f, "Expected a file at {path}/stdlib/VERSIONS"),
Self::VersionsIsADirectory(path) => write!(f, "{path}/stdlib/VERSIONS is a directory."),
Self::FailedToReadVersionsFile { path, error } => {
write!(
f,
"Failed to read the custom typeshed versions file '{path}': {error}"
)
}
Self::VersionsParseError(underlying_error) => underlying_error.fmt(f),
SearchPathValidationError::SitePackagesDiscovery(error) => {
write!(f, "Failed to discover the site-packages directory: {error}")
}
}
}
}
@@ -342,6 +347,18 @@ impl std::error::Error for SearchPathValidationError {
}
}
impl From<TypeshedVersionsParseError> for SearchPathValidationError {
fn from(value: TypeshedVersionsParseError) -> Self {
Self::VersionsParseError(value)
}
}
impl From<SitePackagesDiscoveryError> for SearchPathValidationError {
fn from(value: SitePackagesDiscoveryError) -> Self {
Self::SitePackagesDiscovery(value)
}
}
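
The two `From` impls above are what allow `SearchPaths::from_settings` (later in this diff) to use `?` on results carrying a `TypeshedVersionsParseError` or a `SitePackagesDiscoveryError` and have them converted into `SearchPathValidationError` automatically. A minimal, self-contained sketch of the same pattern, using hypothetical error types rather than the crate's own:

#[derive(Debug)]
struct ParseError(String);

#[derive(Debug)]
enum ValidationError {
    Parse(ParseError),
}

impl From<ParseError> for ValidationError {
    fn from(value: ParseError) -> Self {
        Self::Parse(value)
    }
}

fn parse(input: &str) -> Result<u32, ParseError> {
    input.trim().parse().map_err(|_| ParseError(input.to_string()))
}

// `?` converts the `ParseError` into a `ValidationError` via the `From` impl,
// so `validate` never has to mention the conversion explicitly.
fn validate(input: &str) -> Result<u32, ValidationError> {
    Ok(parse(input)?)
}

fn main() {
    assert!(matches!(validate("42"), Ok(42)));
    assert!(validate("oops").is_err());
}
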
type SearchPathResult<T> = Result<T, SearchPathValidationError>;
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
@@ -384,11 +401,10 @@ pub(crate) struct SearchPath(Arc<SearchPathInner>);
impl SearchPath {
fn directory_path(system: &dyn System, root: SystemPathBuf) -> SearchPathResult<SystemPathBuf> {
let canonicalized = system.canonicalize_path(&root).unwrap_or(root);
if system.is_directory(&canonicalized) {
Ok(canonicalized)
if system.is_directory(&root) {
Ok(root)
} else {
Err(SearchPathValidationError::NotADirectory(canonicalized))
Err(SearchPathValidationError::NotADirectory(root))
}
}
@@ -407,32 +423,22 @@ impl SearchPath {
}
/// Create a new standard-library search path pointing to a custom directory on disk
pub(crate) fn custom_stdlib(db: &dyn Db, typeshed: SystemPathBuf) -> SearchPathResult<Self> {
pub(crate) fn custom_stdlib(db: &dyn Db, typeshed: &SystemPath) -> SearchPathResult<Self> {
let system = db.system();
if !system.is_directory(&typeshed) {
if !system.is_directory(typeshed) {
return Err(SearchPathValidationError::NotADirectory(
typeshed.to_path_buf(),
));
}
let stdlib =
Self::directory_path(system, typeshed.join("stdlib")).map_err(|err| match err {
SearchPathValidationError::NotADirectory(path) => {
SearchPathValidationError::NoStdlibSubdirectory(path)
SearchPathValidationError::NotADirectory(_) => {
SearchPathValidationError::NoStdlibSubdirectory(typeshed.to_path_buf())
}
err => err,
})?;
let typeshed_versions =
system_path_to_file(db.upcast(), stdlib.join("VERSIONS")).map_err(|err| match err {
FileError::NotFound => SearchPathValidationError::NoVersionsFile(typeshed),
FileError::IsADirectory => {
SearchPathValidationError::VersionsIsADirectory(typeshed)
}
})?;
super::typeshed::parse_typeshed_versions(db, typeshed_versions)
.as_ref()
.map_err(|validation_error| {
SearchPathValidationError::VersionsParseError(validation_error.clone())
})?;
Ok(Self(Arc::new(SearchPathInner::StandardLibraryCustom(
stdlib,
))))
@@ -623,11 +629,11 @@ mod tests {
use ruff_db::Db;
use crate::db::tests::TestDb;
use super::*;
use crate::module_resolver::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use crate::python_version::PythonVersion;
use super::*;
impl ModulePath {
#[must_use]
fn join(&self, component: &str) -> ModulePath {
@@ -638,15 +644,6 @@ mod tests {
}
impl SearchPath {
#[must_use]
pub(crate) fn is_stdlib_search_path(&self) -> bool {
matches!(
&*self.0,
SearchPathInner::StandardLibraryCustom(_)
| SearchPathInner::StandardLibraryVendored(_)
)
}
fn join(&self, component: &str) -> ModulePath {
self.to_module_path().join(component)
}
@@ -661,7 +658,7 @@ mod tests {
.build();
assert_eq!(
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
.unwrap()
.to_module_path()
.with_py_extension(),
@@ -669,7 +666,7 @@ mod tests {
);
assert_eq!(
&SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
&SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
.unwrap()
.join("foo")
.with_pyi_extension(),
@@ -780,7 +777,7 @@ mod tests {
let TestCase { db, stdlib, .. } = TestCaseBuilder::new()
.with_custom_typeshed(MockedTypeshed::default())
.build();
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
.unwrap()
.to_module_path()
.push("bar.py");
@@ -792,7 +789,7 @@ mod tests {
let TestCase { db, stdlib, .. } = TestCaseBuilder::new()
.with_custom_typeshed(MockedTypeshed::default())
.build();
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
.unwrap()
.to_module_path()
.push("bar.rs");
@@ -824,7 +821,7 @@ mod tests {
.with_custom_typeshed(MockedTypeshed::default())
.build();
let root = SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf()).unwrap();
let root = SearchPath::custom_stdlib(&db, stdlib.parent().unwrap()).unwrap();
// Must have a `.pyi` extension or no extension:
let bad_absolute_path = SystemPath::new("foo/stdlib/x.py");
@@ -872,8 +869,7 @@ mod tests {
.with_custom_typeshed(typeshed)
.with_target_version(target_version)
.build();
let stdlib =
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf()).unwrap();
let stdlib = SearchPath::custom_stdlib(&db, stdlib.parent().unwrap()).unwrap();
(db, stdlib)
}
@@ -898,7 +894,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let asyncio_regular_package = stdlib_path.join("asyncio");
assert!(asyncio_regular_package.is_directory(&resolver));
@@ -926,7 +922,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let xml_namespace_package = stdlib_path.join("xml");
assert!(xml_namespace_package.is_directory(&resolver));
@@ -948,7 +944,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let functools_module = stdlib_path.join("functools.pyi");
assert!(functools_module.to_file(&resolver).is_some());
@@ -964,7 +960,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let collections_regular_package = stdlib_path.join("collections");
assert_eq!(collections_regular_package.to_file(&resolver), None);
@@ -980,7 +976,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let importlib_namespace_package = stdlib_path.join("importlib");
assert_eq!(importlib_namespace_package.to_file(&resolver), None);
@@ -1001,7 +997,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let non_existent = stdlib_path.join("doesnt_even_exist");
assert_eq!(non_existent.to_file(&resolver), None);
@@ -1029,7 +1025,7 @@ mod tests {
};
let (db, stdlib_path) = py39_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY39);
let resolver = ResolverContext::new(&db, PythonVersion::PY39);
// Since we've set the target version to Py39,
// `collections` should now exist as a directory, according to VERSIONS...
@@ -1058,7 +1054,7 @@ mod tests {
};
let (db, stdlib_path) = py39_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY39);
let resolver = ResolverContext::new(&db, PythonVersion::PY39);
// The `importlib` directory now also exists
let importlib_namespace_package = stdlib_path.join("importlib");
@@ -1082,7 +1078,7 @@ mod tests {
};
let (db, stdlib_path) = py39_typeshed_test_case(TYPESHED);
let resolver = ResolverState::new(&db, PythonVersion::PY39);
let resolver = ResolverContext::new(&db, PythonVersion::PY39);
// The `xml` package no longer exists on py39:
let xml_namespace_package = stdlib_path.join("xml");

View File

@@ -1,19 +1,19 @@
use rustc_hash::{FxBuildHasher, FxHashSet};
use std::borrow::Cow;
use std::iter::FusedIterator;
use rustc_hash::{FxBuildHasher, FxHashSet};
use std::ops::Deref;
use ruff_db::files::{File, FilePath, FileRootKind};
use ruff_db::system::{DirectoryEntry, SystemPath, SystemPathBuf};
use ruff_db::vendored::VendoredPath;
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::{Program, SearchPathSettings};
use ruff_db::system::{DirectoryEntry, System, SystemPath, SystemPathBuf};
use ruff_db::vendored::{VendoredFileSystem, VendoredPath};
use super::module::{Module, ModuleKind};
use super::path::{ModulePath, SearchPath, SearchPathValidationError};
use super::state::ResolverState;
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::typeshed::{vendored_typeshed_versions, TypeshedVersions};
use crate::site_packages::VirtualEnvironment;
use crate::{Program, PythonVersion, SearchPathSettings, SitePackages};
/// Resolves a module name to a module.
pub fn resolve_module(db: &dyn Db, module_name: ModuleName) -> Option<Module> {
@@ -122,7 +122,7 @@ pub(crate) fn search_paths(db: &dyn Db) -> SearchPathIterator {
Program::get(db).search_paths(db).iter(db)
}
#[derive(Debug, PartialEq, Eq, Default)]
#[derive(Debug, PartialEq, Eq)]
pub(crate) struct SearchPaths {
/// Search paths that have been statically determined purely from reading Ruff's configuration settings.
/// These shouldn't ever change unless the config settings themselves change.
@@ -135,6 +135,8 @@ pub(crate) struct SearchPaths {
/// in terms of module-resolution priority until we've discovered the editable installs
/// for the first `site-packages` path
site_packages: Vec<SearchPath>,
typeshed_versions: ResolvedTypeshedVersions,
}
impl SearchPaths {
@@ -146,8 +148,14 @@ impl SearchPaths {
/// [module resolution order]: https://typing.readthedocs.io/en/latest/spec/distributing.html#import-resolution-ordering
pub(crate) fn from_settings(
db: &dyn Db,
settings: SearchPathSettings,
settings: &SearchPathSettings,
) -> Result<Self, SearchPathValidationError> {
fn canonicalize(path: &SystemPath, system: &dyn System) -> SystemPathBuf {
system
.canonicalize_path(path)
.unwrap_or_else(|_| path.to_path_buf())
}
let SearchPathSettings {
extra_paths,
src_root,
@@ -161,45 +169,65 @@ impl SearchPaths {
let mut static_paths = vec![];
for path in extra_paths {
tracing::debug!("Adding static extra search-path '{path}'");
let path = canonicalize(path, system);
files.try_add_root(db.upcast(), &path, FileRootKind::LibrarySearchPath);
tracing::debug!("Adding extra search-path '{path}'");
let search_path = SearchPath::extra(system, path)?;
files.try_add_root(
db.upcast(),
search_path.as_system_path().unwrap(),
FileRootKind::LibrarySearchPath,
);
static_paths.push(search_path);
static_paths.push(SearchPath::extra(system, path)?);
}
tracing::debug!("Adding first-party search path '{src_root}'");
static_paths.push(SearchPath::first_party(system, src_root)?);
static_paths.push(SearchPath::first_party(system, src_root.to_path_buf())?);
static_paths.push(if let Some(custom_typeshed) = custom_typeshed {
let (typeshed_versions, stdlib_path) = if let Some(custom_typeshed) = custom_typeshed {
let custom_typeshed = canonicalize(custom_typeshed, system);
tracing::debug!("Adding custom-stdlib search path '{custom_typeshed}'");
let search_path = SearchPath::custom_stdlib(db, custom_typeshed)?;
files.try_add_root(
db.upcast(),
search_path.as_system_path().unwrap(),
&custom_typeshed,
FileRootKind::LibrarySearchPath,
);
search_path
let versions_path = custom_typeshed.join("stdlib/VERSIONS");
let versions_content = system.read_to_string(&versions_path).map_err(|error| {
SearchPathValidationError::FailedToReadVersionsFile {
path: versions_path,
error,
}
})?;
let parsed: TypeshedVersions = versions_content.parse()?;
let search_path = SearchPath::custom_stdlib(db, &custom_typeshed)?;
(ResolvedTypeshedVersions::Custom(parsed), search_path)
} else {
SearchPath::vendored_stdlib()
});
tracing::debug!("Using vendored stdlib");
(
ResolvedTypeshedVersions::Vendored(vendored_typeshed_versions()),
SearchPath::vendored_stdlib(),
)
};
static_paths.push(stdlib_path);
let site_packages_paths = match site_packages_paths {
SitePackages::Derived { venv_path } => VirtualEnvironment::new(venv_path, system)
.and_then(|venv| venv.site_packages_directories(system))?,
SitePackages::Known(paths) => paths
.iter()
.map(|path| canonicalize(path, system))
.collect(),
};
let mut site_packages: Vec<_> = Vec::with_capacity(site_packages_paths.len());
for path in site_packages_paths {
tracing::debug!("Adding site-packages search path '{path}'");
let search_path = SearchPath::site_packages(system, path)?;
files.try_add_root(
db.upcast(),
search_path.as_system_path().unwrap(),
FileRootKind::LibrarySearchPath,
);
site_packages.push(search_path);
files.try_add_root(db.upcast(), &path, FileRootKind::LibrarySearchPath);
site_packages.push(SearchPath::site_packages(system, path)?);
}
// TODO vendor typeshed's third-party stubs as well as the stdlib and fall back to them as a final step
@@ -224,16 +252,48 @@ impl SearchPaths {
Ok(SearchPaths {
static_paths,
site_packages,
typeshed_versions,
})
}
pub(crate) fn iter<'a>(&'a self, db: &'a dyn Db) -> SearchPathIterator<'a> {
pub(super) fn iter<'a>(&'a self, db: &'a dyn Db) -> SearchPathIterator<'a> {
SearchPathIterator {
db,
static_paths: self.static_paths.iter(),
dynamic_paths: None,
}
}
pub(crate) fn custom_stdlib(&self) -> Option<&SystemPath> {
self.static_paths.iter().find_map(|search_path| {
if search_path.is_standard_library() {
search_path.as_system_path()
} else {
None
}
})
}
pub(super) fn typeshed_versions(&self) -> &TypeshedVersions {
&self.typeshed_versions
}
}
#[derive(Debug, PartialEq, Eq)]
enum ResolvedTypeshedVersions {
Vendored(&'static TypeshedVersions),
Custom(TypeshedVersions),
}
impl Deref for ResolvedTypeshedVersions {
type Target = TypeshedVersions;
fn deref(&self) -> &Self::Target {
match self {
ResolvedTypeshedVersions::Vendored(versions) => versions,
ResolvedTypeshedVersions::Custom(versions) => versions,
}
}
}
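
`ResolvedTypeshedVersions` needs to hand out a `&TypeshedVersions` whether the table is the `&'static` vendored one or a freshly parsed, owned one; the `Deref` impl above erases that difference for callers. A self-contained sketch of the same borrowed-or-owned pattern with placeholder types:

use std::ops::Deref;

#[derive(Debug)]
struct Table {
    entries: &'static [&'static str],
}

static VENDORED: Table = Table {
    entries: &["asyncio", "collections"],
};

enum Resolved {
    Vendored(&'static Table),
    Custom(Table),
}

impl Deref for Resolved {
    type Target = Table;

    fn deref(&self) -> &Table {
        match self {
            Resolved::Vendored(table) => table,
            Resolved::Custom(table) => table,
        }
    }
}

fn main() {
    let vendored = Resolved::Vendored(&VENDORED);
    let custom = Resolved::Custom(Table { entries: &["numpy"] });
    // Both variants expose `Table`'s fields through auto-deref.
    assert_eq!(vendored.entries.len(), 2);
    assert_eq!(custom.entries.len(), 1);
}
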
/// Collect all dynamic search paths. For each `site-packages` path:
@@ -251,6 +311,7 @@ pub(crate) fn dynamic_resolution_paths(db: &dyn Db) -> Vec<SearchPath> {
let SearchPaths {
static_paths,
site_packages,
typeshed_versions: _,
} = Program::get(db).search_paths(db);
let mut dynamic_paths = Vec::new();
@@ -315,12 +376,16 @@ pub(crate) fn dynamic_resolution_paths(db: &dyn Db) -> Vec<SearchPath> {
let installations = all_pth_files.iter().flat_map(PthFile::items);
for installation in installations {
let installation = system
.canonicalize_path(&installation)
.unwrap_or(installation);
if existing_paths.insert(Cow::Owned(installation.clone())) {
match SearchPath::editable(system, installation) {
match SearchPath::editable(system, installation.clone()) {
Ok(search_path) => {
tracing::debug!(
"Adding editable installation to module resolution path {path}",
path = search_path.as_system_path().unwrap()
path = installation
);
dynamic_paths.push(search_path);
}
@@ -482,7 +547,7 @@ struct ModuleNameIngredient<'db> {
fn resolve_name(db: &dyn Db, name: &ModuleName) -> Option<(SearchPath, File, ModuleKind)> {
let program = Program::get(db);
let target_version = program.target_version(db);
let resolver_state = ResolverState::new(db, target_version);
let resolver_state = ResolverContext::new(db, target_version);
let is_builtin_module =
ruff_python_stdlib::sys::is_builtin_module(target_version.minor, name.as_str());
@@ -545,7 +610,7 @@ fn resolve_name(db: &dyn Db, name: &ModuleName) -> Option<(SearchPath, File, Mod
fn resolve_package<'a, 'db, I>(
module_search_path: &SearchPath,
components: I,
resolver_state: &ResolverState<'db>,
resolver_state: &ResolverContext<'db>,
) -> Result<ResolvedPackage, PackageKind>
where
I: Iterator<Item = &'a str>,
@@ -627,6 +692,21 @@ impl PackageKind {
}
}
pub(super) struct ResolverContext<'db> {
pub(super) db: &'db dyn Db,
pub(super) target_version: PythonVersion,
}
impl<'db> ResolverContext<'db> {
pub(super) fn new(db: &'db dyn Db, target_version: PythonVersion) -> Self {
Self { db, target_version }
}
pub(super) fn vendored(&self) -> &VendoredFileSystem {
self.db.vendored()
}
}
#[cfg(test)]
mod tests {
use ruff_db::files::{system_path_to_file, File, FilePath};
@@ -781,7 +861,7 @@ mod tests {
"Search path for {module_name} was unexpectedly {search_path:?}"
);
assert!(
search_path.is_stdlib_search_path(),
search_path.is_standard_library(),
"Expected a stdlib search path, but got {search_path:?}"
);
}
@@ -877,7 +957,7 @@ mod tests {
"Search path for {module_name} was unexpectedly {search_path:?}"
);
assert!(
search_path.is_stdlib_search_path(),
search_path.is_standard_library(),
"Expected a stdlib search path, but got {search_path:?}"
);
}
@@ -1194,13 +1274,13 @@ mod tests {
Program::from_settings(
&db,
ProgramSettings {
&ProgramSettings {
target_version: PythonVersion::PY38,
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: src.clone(),
custom_typeshed: Some(custom_typeshed.clone()),
site_packages: vec![site_packages],
site_packages: SitePackages::Known(vec![site_packages]),
},
},
)
@@ -1699,13 +1779,16 @@ not_a_directory
Program::from_settings(
&db,
ProgramSettings {
&ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: SystemPathBuf::from("/src"),
custom_typeshed: None,
site_packages: vec![venv_site_packages, system_site_packages],
site_packages: SitePackages::Known(vec![
venv_site_packages,
system_site_packages,
]),
},
},
)

View File

@@ -1,25 +0,0 @@
use ruff_db::vendored::VendoredFileSystem;
use super::typeshed::LazyTypeshedVersions;
use crate::db::Db;
use crate::python_version::PythonVersion;
pub(crate) struct ResolverState<'db> {
pub(crate) db: &'db dyn Db,
pub(crate) typeshed_versions: LazyTypeshedVersions<'db>,
pub(crate) target_version: PythonVersion,
}
impl<'db> ResolverState<'db> {
pub(crate) fn new(db: &'db dyn Db, target_version: PythonVersion) -> Self {
Self {
db,
typeshed_versions: LazyTypeshedVersions::new(),
target_version,
}
}
pub(crate) fn vendored(&self) -> &VendoredFileSystem {
self.db.vendored()
}
}

View File

@@ -4,7 +4,7 @@ use ruff_db::vendored::VendoredPathBuf;
use crate::db::tests::TestDb;
use crate::program::{Program, SearchPathSettings};
use crate::python_version::PythonVersion;
use crate::ProgramSettings;
use crate::{ProgramSettings, SitePackages};
/// A test case for the module resolver.
///
@@ -179,6 +179,7 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
first_party_files,
site_packages_files,
} = self;
TestCaseBuilder {
typeshed_option: typeshed,
target_version,
@@ -195,6 +196,7 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
site_packages,
target_version,
} = self.with_custom_typeshed(MockedTypeshed::default()).build();
TestCase {
db,
src,
@@ -223,13 +225,13 @@ impl TestCaseBuilder<MockedTypeshed> {
Program::from_settings(
&db,
ProgramSettings {
&ProgramSettings {
target_version,
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: src.clone(),
custom_typeshed: Some(typeshed.clone()),
site_packages: vec![site_packages.clone()],
site_packages: SitePackages::Known(vec![site_packages.clone()]),
},
},
)
@@ -279,13 +281,11 @@ impl TestCaseBuilder<VendoredTypeshed> {
Program::from_settings(
&db,
ProgramSettings {
&ProgramSettings {
target_version,
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: src.clone(),
custom_typeshed: None,
site_packages: vec![site_packages.clone()],
site_packages: SitePackages::Known(vec![site_packages.clone()]),
..SearchPathSettings::new(src.clone())
},
},
)

View File

@@ -1,6 +1,6 @@
pub use self::vendored::vendored_typeshed_stubs;
pub(super) use self::versions::{
parse_typeshed_versions, LazyTypeshedVersions, TypeshedVersionsParseError,
typeshed_versions, vendored_typeshed_versions, TypeshedVersions, TypeshedVersionsParseError,
TypeshedVersionsQueryResult,
};

View File

@@ -1,4 +1,3 @@
use std::cell::OnceCell;
use std::collections::BTreeMap;
use std::fmt;
use std::num::{NonZeroU16, NonZeroUsize};
@@ -6,78 +5,12 @@ use std::ops::{RangeFrom, RangeInclusive};
use std::str::FromStr;
use once_cell::sync::Lazy;
use ruff_db::system::SystemPath;
use rustc_hash::FxHashMap;
use ruff_db::files::{system_path_to_file, File};
use super::vendored::vendored_typeshed_stubs;
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::python_version::PythonVersion;
#[derive(Debug)]
pub(crate) struct LazyTypeshedVersions<'db>(OnceCell<&'db TypeshedVersions>);
impl<'db> LazyTypeshedVersions<'db> {
#[must_use]
pub(crate) fn new() -> Self {
Self(OnceCell::new())
}
/// Query whether a module exists at runtime in the stdlib on a certain Python version.
///
/// Simply probing whether a file exists in typeshed is insufficient for this question,
/// as a module in the stdlib may have been added in Python 3.10, but the typeshed stub
/// will still be available (either in a custom typeshed dir or in our vendored copy)
/// even if the user specified Python 3.8 as the target version.
///
/// For top-level modules and packages, the VERSIONS file can always provide an unambiguous answer
/// as to whether the module exists on the specified target version. However, VERSIONS does not
/// provide comprehensive information on all submodules, meaning that this method sometimes
/// returns [`TypeshedVersionsQueryResult::MaybeExists`].
/// See [`TypeshedVersionsQueryResult`] for more details.
#[must_use]
pub(crate) fn query_module(
&self,
db: &'db dyn Db,
module: &ModuleName,
stdlib_root: Option<&SystemPath>,
target_version: PythonVersion,
) -> TypeshedVersionsQueryResult {
let versions = self.0.get_or_init(|| {
let versions_path = if let Some(system_path) = stdlib_root {
system_path.join("VERSIONS")
} else {
return &VENDORED_VERSIONS;
};
let Ok(versions_file) = system_path_to_file(db.upcast(), &versions_path) else {
todo!(
"Still need to figure out how to handle VERSIONS files being deleted \
from custom typeshed directories! Expected a file to exist at {versions_path}"
)
};
// TODO(Alex/Micha): If VERSIONS is invalid,
// this should invalidate not just the specific module resolution we're currently attempting,
// but all type inference that depends on any standard-library types.
// Unwrapping here is not correct...
parse_typeshed_versions(db, versions_file).as_ref().unwrap()
});
versions.query_module(module, target_version)
}
}
#[salsa::tracked(return_ref)]
pub(crate) fn parse_typeshed_versions(
db: &dyn Db,
versions_file: File,
) -> Result<TypeshedVersions, TypeshedVersionsParseError> {
// TODO: Handle IO errors
let file_content = versions_file
.read_to_string(db.upcast())
.unwrap_or_default();
file_content.parse()
}
use crate::{Program, PythonVersion};
static VENDORED_VERSIONS: Lazy<TypeshedVersions> = Lazy::new(|| {
TypeshedVersions::from_str(
@@ -88,6 +21,14 @@ static VENDORED_VERSIONS: Lazy<TypeshedVersions> = Lazy::new(|| {
.unwrap()
});
pub(crate) fn vendored_typeshed_versions() -> &'static TypeshedVersions {
&VENDORED_VERSIONS
}
pub(crate) fn typeshed_versions(db: &dyn Db) -> &TypeshedVersions {
Program::get(db).search_paths(db).typeshed_versions()
}
#[derive(Debug, PartialEq, Eq, Clone)]
pub(crate) struct TypeshedVersionsParseError {
line_number: Option<NonZeroU16>,
@@ -174,7 +115,7 @@ impl TypeshedVersions {
}
#[must_use]
fn query_module(
pub(in crate::module_resolver) fn query_module(
&self,
module: &ModuleName,
target_version: PythonVersion,
@@ -204,7 +145,7 @@ impl TypeshedVersions {
}
}
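
`query_module` can only give a definitive answer for names that VERSIONS actually lists; submodules often have no entry, which is why the result type declared just below is three-valued rather than a plain `bool`. A self-contained sketch of how a caller branches on it, mirroring the `match` arms in `ModulePath::is_directory`/`to_file` earlier in this diff (`probe_filesystem` is a hypothetical stand-in for the real file probes):

#[derive(Debug, Copy, Clone, PartialEq, Eq)]
enum TypeshedVersionsQueryResult {
    /// VERSIONS proves the module is absent on the target version.
    DoesNotExist,
    /// VERSIONS proves the module is present on the target version.
    Exists,
    /// VERSIONS has no entry for this submodule; fall back to probing.
    MaybeExists,
}

fn probe_filesystem(_module: &str) -> bool {
    // Hypothetical stand-in for the `system_path_to_file`/vendored-stub
    // probes performed by `ModulePath::to_file` above.
    true
}

fn module_exists(query: TypeshedVersionsQueryResult, module: &str) -> bool {
    match query {
        TypeshedVersionsQueryResult::DoesNotExist => false,
        TypeshedVersionsQueryResult::Exists
        | TypeshedVersionsQueryResult::MaybeExists => probe_filesystem(module),
    }
}

fn main() {
    assert!(!module_exists(TypeshedVersionsQueryResult::DoesNotExist, "xml"));
    assert!(module_exists(TypeshedVersionsQueryResult::Exists, "functools"));
}
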
/// Possible answers [`LazyTypeshedVersions::query_module()`] could give to the question:
/// Possible answers [`TypeshedVersions::query_module()`] could give to the question:
/// "Does this module exist in the stdlib at runtime on a certain target version?"
#[derive(Debug, Copy, PartialEq, Eq, Clone, Hash)]
pub(crate) enum TypeshedVersionsQueryResult {

View File

@@ -3,7 +3,7 @@ use anyhow::Context;
use salsa::Durability;
use salsa::Setter;
use ruff_db::system::SystemPathBuf;
use ruff_db::system::{SystemPath, SystemPathBuf};
use crate::module_resolver::SearchPaths;
use crate::Db;
@@ -12,13 +12,12 @@ use crate::Db;
pub struct Program {
pub target_version: PythonVersion,
#[default]
#[return_ref]
pub(crate) search_paths: SearchPaths,
}
impl Program {
pub fn from_settings(db: &dyn Db, settings: ProgramSettings) -> anyhow::Result<Self> {
pub fn from_settings(db: &dyn Db, settings: &ProgramSettings) -> anyhow::Result<Self> {
let ProgramSettings {
target_version,
search_paths,
@@ -29,16 +28,15 @@ impl Program {
let search_paths = SearchPaths::from_settings(db, search_paths)
.with_context(|| "Invalid search path settings")?;
Ok(Program::builder(settings.target_version)
Ok(Program::builder(settings.target_version, search_paths)
.durability(Durability::HIGH)
.search_paths(search_paths)
.new(db))
}
pub fn update_search_paths(
&self,
self,
db: &mut dyn Db,
search_path_settings: SearchPathSettings,
search_path_settings: &SearchPathSettings,
) -> anyhow::Result<()> {
let search_paths = SearchPaths::from_settings(db, search_path_settings)?;
@@ -49,16 +47,20 @@ impl Program {
Ok(())
}
pub fn custom_stdlib_search_path(self, db: &dyn Db) -> Option<&SystemPath> {
self.search_paths(db).custom_stdlib()
}
}
#[derive(Debug, Eq, PartialEq)]
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct ProgramSettings {
pub target_version: PythonVersion,
pub search_paths: SearchPathSettings,
}
/// Configures the search paths for module resolution.
#[derive(Eq, PartialEq, Debug, Clone, Default)]
#[derive(Eq, PartialEq, Debug, Clone)]
pub struct SearchPathSettings {
/// List of user-provided paths that should take first priority in the module resolution.
/// Examples in other type checkers are mypy's MYPYPATH environment variable,
@@ -74,5 +76,25 @@ pub struct SearchPathSettings {
pub custom_typeshed: Option<SystemPathBuf>,
/// The path to the user's `site-packages` directory, where third-party packages from ``PyPI`` are installed.
pub site_packages: Vec<SystemPathBuf>,
pub site_packages: SitePackages,
}
impl SearchPathSettings {
pub fn new(src_root: SystemPathBuf) -> Self {
Self {
src_root,
extra_paths: vec![],
custom_typeshed: None,
site_packages: SitePackages::Known(vec![]),
}
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
pub enum SitePackages {
Derived {
venv_path: SystemPathBuf,
},
/// Resolved site packages paths
Known(Vec<SystemPathBuf>),
}
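
With this enum, callers either pass pre-resolved `site-packages` directories or just a virtual-environment path and let `SearchPaths::from_settings` discover the directories via `VirtualEnvironment`. A sketch of constructing settings both ways, assuming the `SearchPathSettings` and `SitePackages` definitions exactly as they appear in this diff (the paths are illustrative only):

use ruff_db::system::SystemPathBuf;

fn example_settings() -> (SearchPathSettings, SearchPathSettings) {
    // Hand the resolver pre-resolved site-packages directories:
    let known = SearchPathSettings {
        site_packages: SitePackages::Known(vec![SystemPathBuf::from(
            "/.venv/lib/python3.12/site-packages",
        )]),
        ..SearchPathSettings::new(SystemPathBuf::from("/src"))
    };

    // Or let the resolver derive them from a virtual environment:
    let derived = SearchPathSettings {
        site_packages: SitePackages::Derived {
            venv_path: SystemPathBuf::from("/.venv"),
        },
        ..SearchPathSettings::new(SystemPathBuf::from("/src"))
    };

    (known, derived)
}
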

View File

@@ -575,7 +575,7 @@ def f(a: str, /, b: str, c: int = 1, *args, d: int = 2, **kwargs):
let index = semantic_index(&db, file);
let global_table = symbol_table(&db, global_scope(&db, file));
assert_eq!(names(&global_table), vec!["f", "str", "int"]);
assert_eq!(names(&global_table), vec!["str", "int", "f"]);
let [(function_scope_id, _function_scope)] = index
.child_scopes(FileScopeId::global())
@@ -790,6 +790,56 @@ def f(a: str, /, b: str, c: int = 1, *args, d: int = 2, **kwargs):
assert_eq!(names(&inner_comprehension_symbol_table), vec!["x"]);
}
#[test]
fn with_item_definition() {
let TestCase { db, file } = test_case(
"
with item1 as x, item2 as y:
pass
",
);
let index = semantic_index(&db, file);
let global_table = index.symbol_table(FileScopeId::global());
assert_eq!(names(&global_table), vec!["item1", "x", "item2", "y"]);
let use_def = index.use_def_map(FileScopeId::global());
for name in ["x", "y"] {
let Some(definition) = use_def.first_public_definition(
global_table.symbol_id_by_name(name).expect("symbol exists"),
) else {
panic!("Expected with item definition for {name}");
};
assert!(matches!(definition.node(&db), DefinitionKind::WithItem(_)));
}
}
#[test]
fn with_item_unpacked_definition() {
let TestCase { db, file } = test_case(
"
with context() as (x, y):
pass
",
);
let index = semantic_index(&db, file);
let global_table = index.symbol_table(FileScopeId::global());
assert_eq!(names(&global_table), vec!["context", "x", "y"]);
let use_def = index.use_def_map(FileScopeId::global());
for name in ["x", "y"] {
let Some(definition) = use_def.first_public_definition(
global_table.symbol_id_by_name(name).expect("symbol exists"),
) else {
panic!("Expected with item definition for {name}");
};
assert!(matches!(definition.node(&db), DefinitionKind::WithItem(_)));
}
}
#[test]
fn dupes() {
let TestCase { db, file } = test_case(
@@ -1045,4 +1095,56 @@ match subject:
vec!["subject", "a", "b", "c", "d", "f", "e", "h", "g", "Foo", "i", "j", "k", "l"]
);
}
#[test]
fn for_loops_single_assignment() {
let TestCase { db, file } = test_case("for x in a: pass");
let scope = global_scope(&db, file);
let global_table = symbol_table(&db, scope);
assert_eq!(&names(&global_table), &["a", "x"]);
let use_def = use_def_map(&db, scope);
let definition = use_def
.first_public_definition(global_table.symbol_id_by_name("x").unwrap())
.unwrap();
assert!(matches!(definition.node(&db), DefinitionKind::For(_)));
}
#[test]
fn for_loops_simple_unpacking() {
let TestCase { db, file } = test_case("for (x, y) in a: pass");
let scope = global_scope(&db, file);
let global_table = symbol_table(&db, scope);
assert_eq!(&names(&global_table), &["a", "x", "y"]);
let use_def = use_def_map(&db, scope);
let x_definition = use_def
.first_public_definition(global_table.symbol_id_by_name("x").unwrap())
.unwrap();
let y_definition = use_def
.first_public_definition(global_table.symbol_id_by_name("y").unwrap())
.unwrap();
assert!(matches!(x_definition.node(&db), DefinitionKind::For(_)));
assert!(matches!(y_definition.node(&db), DefinitionKind::For(_)));
}
#[test]
fn for_loops_complex_unpacking() {
let TestCase { db, file } = test_case("for [((a,) b), (c, d)] in e: pass");
let scope = global_scope(&db, file);
let global_table = symbol_table(&db, scope);
assert_eq!(&names(&global_table), &["e", "a", "b", "c", "d"]);
let use_def = use_def_map(&db, scope);
let definition = use_def
.first_public_definition(global_table.symbol_id_by_name("a").unwrap())
.unwrap();
assert!(matches!(definition.node(&db), DefinitionKind::For(_)));
}
}

View File

@@ -15,7 +15,7 @@ use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::ast_ids::AstIdsBuilder;
use crate::semantic_index::definition::{
AssignmentDefinitionNodeRef, ComprehensionDefinitionNodeRef, Definition, DefinitionNodeKey,
DefinitionNodeRef, ImportFromDefinitionNodeRef,
DefinitionNodeRef, ForStmtDefinitionNodeRef, ImportFromDefinitionNodeRef,
};
use crate::semantic_index::expression::Expression;
use crate::semantic_index::symbol::{
@@ -26,6 +26,8 @@ use crate::semantic_index::use_def::{FlowSnapshot, UseDefMapBuilder};
use crate::semantic_index::SemanticIndex;
use crate::Db;
use super::definition::WithItemDefinitionNodeRef;
pub(super) struct SemanticIndexBuilder<'db> {
// Builder state
db: &'db dyn Db,
@@ -390,20 +392,6 @@ where
self.visit_decorator(decorator);
}
let symbol = self
.add_or_update_symbol(function_def.name.id.clone(), SymbolFlags::IS_DEFINED);
self.add_definition(symbol, function_def);
// The default value of the parameters needs to be evaluated in the
// enclosing scope.
for default in function_def
.parameters
.iter_non_variadic_params()
.filter_map(|param| param.default.as_deref())
{
self.visit_expr(default);
}
self.with_type_params(
NodeWithScopeRef::FunctionTypeParameters(function_def),
function_def.type_params.as_deref(),
@@ -424,6 +412,21 @@ where
builder.pop_scope()
},
);
// The default value of the parameters needs to be evaluated in the
// enclosing scope.
for default in function_def
.parameters
.iter_non_variadic_params()
.filter_map(|param| param.default.as_deref())
{
self.visit_expr(default);
}
// The symbol for the function name itself has to be evaluated
// at the end to match the runtime evaluation of parameter defaults
// and return-type annotations.
let symbol = self
.add_or_update_symbol(function_def.name.id.clone(), SymbolFlags::IS_DEFINED);
self.add_definition(symbol, function_def);
}
ast::Stmt::ClassDef(class) => {
for decorator in &class.decorator_list {
@@ -561,9 +564,42 @@ where
self.flow_merge(break_state);
}
}
ast::Stmt::With(ast::StmtWith { items, body, .. }) => {
for item in items {
self.visit_expr(&item.context_expr);
if let Some(optional_vars) = item.optional_vars.as_deref() {
self.add_standalone_expression(&item.context_expr);
self.current_assignment = Some(item.into());
self.visit_expr(optional_vars);
self.current_assignment = None;
}
}
self.visit_body(body);
}
ast::Stmt::Break(_) => {
self.loop_break_states.push(self.flow_snapshot());
}
ast::Stmt::For(
for_stmt @ ast::StmtFor {
range: _,
is_async: _,
target,
iter,
body,
orelse,
},
) => {
// TODO add control flow similar to `ast::Stmt::While` above
self.add_standalone_expression(iter);
self.visit_expr(iter);
debug_assert!(self.current_assignment.is_none());
self.current_assignment = Some(for_stmt.into());
self.visit_expr(target);
self.current_assignment = None;
self.visit_body(body);
self.visit_body(orelse);
}
_ => {
walk_stmt(self, stmt);
}
@@ -610,6 +646,15 @@ where
Some(CurrentAssignment::AugAssign(aug_assign)) => {
self.add_definition(symbol, aug_assign);
}
Some(CurrentAssignment::For(node)) => {
self.add_definition(
symbol,
ForStmtDefinitionNodeRef {
iterable: &node.iter,
target: name_node,
},
);
}
Some(CurrentAssignment::Named(named)) => {
// TODO(dhruvmanila): If the current scope is a comprehension, then the
// named expression is implicitly nonlocal. This is yet to be
@@ -622,6 +667,15 @@ where
ComprehensionDefinitionNodeRef { node, first },
);
}
Some(CurrentAssignment::WithItem(with_item)) => {
self.add_definition(
symbol,
WithItemDefinitionNodeRef {
node: with_item,
target: name_node,
},
);
}
None => {}
}
}
@@ -635,11 +689,11 @@ where
}
ast::Expr::Named(node) => {
debug_assert!(self.current_assignment.is_none());
self.current_assignment = Some(node.into());
// TODO walrus in comprehensions is implicitly nonlocal
self.visit_expr(&node.value);
self.current_assignment = Some(node.into());
self.visit_expr(&node.target);
self.current_assignment = None;
self.visit_expr(&node.value);
}
ast::Expr::Lambda(lambda) => {
if let Some(parameters) = &lambda.parameters {
@@ -773,11 +827,13 @@ enum CurrentAssignment<'a> {
Assign(&'a ast::StmtAssign),
AnnAssign(&'a ast::StmtAnnAssign),
AugAssign(&'a ast::StmtAugAssign),
For(&'a ast::StmtFor),
Named(&'a ast::ExprNamed),
Comprehension {
node: &'a ast::Comprehension,
first: bool,
},
WithItem(&'a ast::WithItem),
}
impl<'a> From<&'a ast::StmtAssign> for CurrentAssignment<'a> {
@@ -798,8 +854,20 @@ impl<'a> From<&'a ast::StmtAugAssign> for CurrentAssignment<'a> {
}
}
impl<'a> From<&'a ast::StmtFor> for CurrentAssignment<'a> {
fn from(value: &'a ast::StmtFor) -> Self {
Self::For(value)
}
}
impl<'a> From<&'a ast::ExprNamed> for CurrentAssignment<'a> {
fn from(value: &'a ast::ExprNamed) -> Self {
Self::Named(value)
}
}
impl<'a> From<&'a ast::WithItem> for CurrentAssignment<'a> {
fn from(value: &'a ast::WithItem) -> Self {
Self::WithItem(value)
}
}
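
The `From` impls above let the builder write `self.current_assignment = Some(node.into())` uniformly for every definition-producing node. The protocol is always the same: record the enclosing assignment, visit the target so each `ExprName` encountered can attach a definition of the right kind, then clear the state. A self-contained sketch of that set-visit-clear pattern (a toy visitor, not the real `SemanticIndexBuilder`):

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum CurrentAssignment {
    For,
    WithItem,
}

#[derive(Default)]
struct Builder {
    current_assignment: Option<CurrentAssignment>,
    definitions: Vec<(String, CurrentAssignment)>,
}

impl Builder {
    fn visit_target(&mut self, name: &str) {
        // Every name visited while `current_assignment` is set becomes
        // a definition of the corresponding kind.
        if let Some(kind) = self.current_assignment {
            self.definitions.push((name.to_string(), kind));
        }
    }

    fn visit_for(&mut self, targets: &[&str]) {
        debug_assert!(self.current_assignment.is_none());
        self.current_assignment = Some(CurrentAssignment::For);
        for target in targets {
            self.visit_target(target);
        }
        self.current_assignment = None;
    }
}

fn main() {
    let mut builder = Builder::default();
    builder.visit_for(&["x", "y"]);
    assert_eq!(builder.definitions.len(), 2);
}
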

View File

@@ -39,6 +39,7 @@ impl<'db> Definition<'db> {
pub(crate) enum DefinitionNodeRef<'a> {
Import(&'a ast::Alias),
ImportFrom(ImportFromDefinitionNodeRef<'a>),
For(ForStmtDefinitionNodeRef<'a>),
Function(&'a ast::StmtFunctionDef),
Class(&'a ast::StmtClassDef),
NamedExpression(&'a ast::ExprNamed),
@@ -47,6 +48,7 @@ pub(crate) enum DefinitionNodeRef<'a> {
AugmentedAssignment(&'a ast::StmtAugAssign),
Comprehension(ComprehensionDefinitionNodeRef<'a>),
Parameter(ast::AnyParameterRef<'a>),
WithItem(WithItemDefinitionNodeRef<'a>),
}
impl<'a> From<&'a ast::StmtFunctionDef> for DefinitionNodeRef<'a> {
@@ -91,12 +93,24 @@ impl<'a> From<ImportFromDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
}
}
impl<'a> From<ForStmtDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(value: ForStmtDefinitionNodeRef<'a>) -> Self {
Self::For(value)
}
}
impl<'a> From<AssignmentDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(node_ref: AssignmentDefinitionNodeRef<'a>) -> Self {
Self::Assignment(node_ref)
}
}
impl<'a> From<WithItemDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(node_ref: WithItemDefinitionNodeRef<'a>) -> Self {
Self::WithItem(node_ref)
}
}
impl<'a> From<ComprehensionDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(node: ComprehensionDefinitionNodeRef<'a>) -> Self {
Self::Comprehension(node)
@@ -121,6 +135,18 @@ pub(crate) struct AssignmentDefinitionNodeRef<'a> {
pub(crate) target: &'a ast::ExprName,
}
#[derive(Copy, Clone, Debug)]
pub(crate) struct WithItemDefinitionNodeRef<'a> {
pub(crate) node: &'a ast::WithItem,
pub(crate) target: &'a ast::ExprName,
}
#[derive(Copy, Clone, Debug)]
pub(crate) struct ForStmtDefinitionNodeRef<'a> {
pub(crate) iterable: &'a ast::Expr,
pub(crate) target: &'a ast::ExprName,
}
#[derive(Copy, Clone, Debug)]
pub(crate) struct ComprehensionDefinitionNodeRef<'a> {
pub(crate) node: &'a ast::Comprehension,
@@ -161,6 +187,12 @@ impl DefinitionNodeRef<'_> {
DefinitionNodeRef::AugmentedAssignment(augmented_assignment) => {
DefinitionKind::AugmentedAssignment(AstNodeRef::new(parsed, augmented_assignment))
}
DefinitionNodeRef::For(ForStmtDefinitionNodeRef { iterable, target }) => {
DefinitionKind::For(ForStmtDefinitionKind {
iterable: AstNodeRef::new(parsed.clone(), iterable),
target: AstNodeRef::new(parsed, target),
})
}
DefinitionNodeRef::Comprehension(ComprehensionDefinitionNodeRef { node, first }) => {
DefinitionKind::Comprehension(ComprehensionDefinitionKind {
node: AstNodeRef::new(parsed, node),
@@ -175,6 +207,12 @@ impl DefinitionNodeRef<'_> {
DefinitionKind::ParameterWithDefault(AstNodeRef::new(parsed, parameter))
}
},
DefinitionNodeRef::WithItem(WithItemDefinitionNodeRef { node, target }) => {
DefinitionKind::WithItem(WithItemDefinitionKind {
node: AstNodeRef::new(parsed.clone(), node),
target: AstNodeRef::new(parsed, target),
})
}
}
}
@@ -193,11 +231,16 @@ impl DefinitionNodeRef<'_> {
}) => target.into(),
Self::AnnotatedAssignment(node) => node.into(),
Self::AugmentedAssignment(node) => node.into(),
Self::For(ForStmtDefinitionNodeRef {
iterable: _,
target,
}) => target.into(),
Self::Comprehension(ComprehensionDefinitionNodeRef { node, first: _ }) => node.into(),
Self::Parameter(node) => match node {
ast::AnyParameterRef::Variadic(parameter) => parameter.into(),
ast::AnyParameterRef::NonVariadic(parameter) => parameter.into(),
},
Self::WithItem(WithItemDefinitionNodeRef { node: _, target }) => target.into(),
}
}
}
@@ -212,9 +255,11 @@ pub enum DefinitionKind {
Assignment(AssignmentDefinitionKind),
AnnotatedAssignment(AstNodeRef<ast::StmtAnnAssign>),
AugmentedAssignment(AstNodeRef<ast::StmtAugAssign>),
For(ForStmtDefinitionKind),
Comprehension(ComprehensionDefinitionKind),
Parameter(AstNodeRef<ast::Parameter>),
ParameterWithDefault(AstNodeRef<ast::ParameterWithDefault>),
WithItem(WithItemDefinitionKind),
}
#[derive(Clone, Debug)]
@@ -250,7 +295,6 @@ impl ImportFromDefinitionKind {
}
#[derive(Clone, Debug)]
#[allow(dead_code)]
pub struct AssignmentDefinitionKind {
assignment: AstNodeRef<ast::StmtAssign>,
target: AstNodeRef<ast::ExprName>,
@@ -266,6 +310,38 @@ impl AssignmentDefinitionKind {
}
}
#[derive(Clone, Debug)]
pub struct WithItemDefinitionKind {
node: AstNodeRef<ast::WithItem>,
target: AstNodeRef<ast::ExprName>,
}
impl WithItemDefinitionKind {
pub(crate) fn node(&self) -> &ast::WithItem {
self.node.node()
}
pub(crate) fn target(&self) -> &ast::ExprName {
self.target.node()
}
}
#[derive(Clone, Debug)]
pub struct ForStmtDefinitionKind {
iterable: AstNodeRef<ast::Expr>,
target: AstNodeRef<ast::ExprName>,
}
impl ForStmtDefinitionKind {
pub(crate) fn iterable(&self) -> &ast::Expr {
self.iterable.node()
}
pub(crate) fn target(&self) -> &ast::ExprName {
self.target.node()
}
}
#[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)]
pub(crate) struct DefinitionNodeKey(NodeKey);
@@ -311,6 +387,12 @@ impl From<&ast::StmtAugAssign> for DefinitionNodeKey {
}
}
impl From<&ast::StmtFor> for DefinitionNodeKey {
fn from(value: &ast::StmtFor) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::Comprehension> for DefinitionNodeKey {
fn from(node: &ast::Comprehension) -> Self {
Self(NodeKey::from_node(node))

View File

@@ -184,14 +184,9 @@ mod tests {
Program::from_settings(
&db,
ProgramSettings {
&ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: SystemPathBuf::from("/src"),
site_packages: vec![],
custom_typeshed: None,
},
search_paths: SearchPathSettings::new(SystemPathBuf::from("/src")),
},
)?;

View File

@@ -13,9 +13,10 @@ use std::io;
use std::num::NonZeroUsize;
use std::ops::Deref;
use red_knot_python_semantic::PythonVersion;
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use crate::PythonVersion;
type SitePackagesDiscoveryResult<T> = Result<T, SitePackagesDiscoveryError>;
/// Abstraction for a Python virtual environment.
@@ -24,7 +25,7 @@ type SitePackagesDiscoveryResult<T> = Result<T, SitePackagesDiscoveryError>;
/// The format of this file is not defined anywhere, and exactly which keys are present
/// depends on the tool that was used to create the virtual environment.
#[derive(Debug)]
pub struct VirtualEnvironment {
pub(crate) struct VirtualEnvironment {
venv_path: SysPrefixPath,
base_executable_home_path: PythonHomePath,
include_system_site_packages: bool,
@@ -41,7 +42,7 @@ pub struct VirtualEnvironment {
}
impl VirtualEnvironment {
pub fn new(
pub(crate) fn new(
path: impl AsRef<SystemPath>,
system: &dyn System,
) -> SitePackagesDiscoveryResult<Self> {
@@ -157,7 +158,7 @@ impl VirtualEnvironment {
/// Return a list of `site-packages` directories that are available from this virtual environment
///
/// See the documentation for `site_packages_directory_from_sys_prefix` for more details.
pub fn site_packages_directories(
pub(crate) fn site_packages_directories(
&self,
system: &dyn System,
) -> SitePackagesDiscoveryResult<Vec<SystemPathBuf>> {
@@ -204,7 +205,7 @@ System site-packages will not be used for module resolution.",
}
#[derive(Debug, thiserror::Error)]
pub enum SitePackagesDiscoveryError {
pub(crate) enum SitePackagesDiscoveryError {
#[error("Invalid --venv-path argument: {0} could not be canonicalized")]
VenvDirCanonicalizationError(SystemPathBuf, #[source] io::Error),
#[error("Invalid --venv-path argument: {0} does not point to a directory on disk")]
@@ -221,7 +222,7 @@ pub enum SitePackagesDiscoveryError {
/// The various ways in which parsing a `pyvenv.cfg` file could fail
#[derive(Debug)]
pub enum PyvenvCfgParseErrorKind {
pub(crate) enum PyvenvCfgParseErrorKind {
TooManyEquals { line_number: NonZeroUsize },
MalformedKeyValuePair { line_number: NonZeroUsize },
NoHomeKey,
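
Because the `pyvenv.cfg` format is unspecified, parsing has to be defensive, and the error kinds above each correspond to a concrete way a line can be malformed. A self-contained toy parser for a single `key = value` line that produces `TooManyEquals`/`MalformedKeyValuePair`-style failures (illustrative only, not the crate's actual parsing logic):

#[derive(Debug, PartialEq, Eq)]
enum CfgLineError {
    TooManyEquals,
    MalformedKeyValuePair,
}

/// Parse one `key = value` line of a hypothetical `pyvenv.cfg` file.
fn parse_line(line: &str) -> Result<(&str, &str), CfgLineError> {
    let mut parts = line.split('=');
    let (Some(key), Some(value)) = (parts.next(), parts.next()) else {
        return Err(CfgLineError::MalformedKeyValuePair);
    };
    // A third `=`-separated segment means the line is ambiguous.
    if parts.next().is_some() {
        return Err(CfgLineError::TooManyEquals);
    }
    let (key, value) = (key.trim(), value.trim());
    if key.is_empty() || value.is_empty() {
        return Err(CfgLineError::MalformedKeyValuePair);
    }
    Ok((key, value))
}

fn main() {
    assert_eq!(parse_line("home = /usr/bin"), Ok(("home", "/usr/bin")));
    assert_eq!(parse_line("a = b = c"), Err(CfgLineError::TooManyEquals));
    assert_eq!(parse_line("no-separator"), Err(CfgLineError::MalformedKeyValuePair));
}
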
@@ -370,7 +371,7 @@ fn site_packages_directory_from_sys_prefix(
///
/// [`sys.prefix`]: https://docs.python.org/3/library/sys.html#sys.prefix
#[derive(Debug, PartialEq, Eq, Clone)]
pub struct SysPrefixPath(SystemPathBuf);
pub(crate) struct SysPrefixPath(SystemPathBuf);
impl SysPrefixPath {
fn new(

View File

@@ -1,8 +1,9 @@
use ruff_db::files::File;
use ruff_python_ast::name::Name;
use ruff_python_ast as ast;
use crate::builtins::builtins_scope;
use crate::semantic_index::definition::Definition;
use crate::semantic_index::ast_ids::HasScopedAstId;
use crate::semantic_index::definition::{Definition, DefinitionKind};
use crate::semantic_index::symbol::{ScopeId, ScopedSymbolId};
use crate::semantic_index::{
global_scope, semantic_index, symbol_table, use_def_map, DefinitionWithConstraints,
@@ -14,7 +15,8 @@ use crate::{Db, FxOrderSet};
pub(crate) use self::builder::{IntersectionBuilder, UnionBuilder};
pub(crate) use self::diagnostic::TypeCheckDiagnostics;
pub(crate) use self::infer::{
infer_definition_types, infer_expression_types, infer_scope_types, TypeInference,
infer_deferred_types, infer_definition_types, infer_expression_types, infer_scope_types,
TypeInference,
};
mod builder;
@@ -88,6 +90,24 @@ pub(crate) fn definition_ty<'db>(db: &'db dyn Db, definition: Definition<'db>) -
inference.definition_ty(definition)
}
/// Infer the type of a (possibly deferred) sub-expression of a [`Definition`].
///
/// ## Panics
/// If the given expression is not a sub-expression of the given [`Definition`].
pub(crate) fn definition_expression_ty<'db>(
db: &'db dyn Db,
definition: Definition<'db>,
expression: &ast::Expr,
) -> Type<'db> {
let expr_id = expression.scoped_ast_id(db, definition.scope(db));
let inference = infer_definition_types(db, definition);
if let Some(ty) = inference.try_expression_ty(expr_id) {
ty
} else {
infer_deferred_types(db, definition).expression_ty(expr_id)
}
}
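
`definition_expression_ty` encodes a two-phase lookup: most sub-expressions are typed eagerly along with their definition, while deferred ones (such as class bases in stub files, consumed by `ClassType::bases` below) live in a separate deferred-inference result. A minimal sketch of the same try-eager-then-deferred fallback, with toy maps standing in for the salsa queries:

use std::collections::HashMap;

struct Inference {
    eager: HashMap<u32, &'static str>,
    deferred: HashMap<u32, &'static str>,
}

impl Inference {
    /// Mirrors `try_expression_ty`: `None` means "not typed in this phase".
    fn try_expression_ty(&self, id: u32) -> Option<&'static str> {
        self.eager.get(&id).copied()
    }

    fn expression_ty(&self, id: u32) -> &'static str {
        // Consult the eager results first, then fall back to the
        // deferred-inference results, as `definition_expression_ty` does.
        self.try_expression_ty(id)
            .or_else(|| self.deferred.get(&id).copied())
            .expect("expression belongs to this definition")
    }
}

fn main() {
    let inference = Inference {
        eager: HashMap::from([(0, "int")]),
        deferred: HashMap::from([(1, "Sequence[str]")]),
    };
    assert_eq!(inference.expression_ty(0), "int");
    assert_eq!(inference.expression_ty(1), "Sequence[str]");
}
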
/// Infer the combined type of an array of [`Definition`]s, plus one optional "unbound type".
///
/// Will return a union if there is more than one definition, or at least one plus an unbound
@@ -181,6 +201,13 @@ pub enum Type<'db> {
IntLiteral(i64),
/// A boolean literal, either `True` or `False`.
BooleanLiteral(bool),
/// A string literal
StringLiteral(StringLiteralType<'db>),
/// A string known to originate only from literal values, but whose value is not known (unlike
/// `StringLiteral` above).
LiteralString,
/// A bytes literal
BytesLiteral(BytesLiteralType<'db>),
// TODO protocols, callable types, overloads, generics, type vars
}
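
The split between `StringLiteral` and `LiteralString` matters for operations on literals: when the result is still provably a specific string it stays `StringLiteral`, and when only "made of literal content" is provable, or the exact value is not worth materializing, it widens to `LiteralString`. A self-contained sketch of one plausible widening policy (the size bound here is an assumption of the sketch, not a documented constant):

const MAX_MATERIALIZED_LEN: usize = 4096; // assumed bound for this sketch

#[derive(Debug, PartialEq, Eq)]
enum Ty {
    StringLiteral(String),
    LiteralString,
}

/// Infer the type of `literal * count` without always building the string.
fn repeat_literal(literal: &str, count: usize) -> Ty {
    match literal.len().checked_mul(count) {
        Some(len) if len <= MAX_MATERIALIZED_LEN => {
            Ty::StringLiteral(literal.repeat(count))
        }
        // Still known to consist only of literal content,
        // but the exact value is no longer tracked.
        _ => Ty::LiteralString,
    }
}

fn main() {
    assert_eq!(repeat_literal("ab", 2), Ty::StringLiteral("abab".into()));
    assert_eq!(repeat_literal("ab", 1_000_000), Ty::LiteralString);
}
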
@@ -236,7 +263,7 @@ impl<'db> Type<'db> {
/// us to explicitly consider whether to handle an error or propagate
/// it up the call stack.
#[must_use]
pub fn member(&self, db: &'db dyn Db, name: &Name) -> Type<'db> {
pub fn member(&self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
match self {
Type::Any => Type::Any,
Type::Never => {
@@ -276,6 +303,20 @@ impl<'db> Type<'db> {
Type::Unknown
}
Type::BooleanLiteral(_) => Type::Unknown,
Type::StringLiteral(_) => {
// TODO defer to `typing.LiteralString`/`builtins.str` methods
// from typeshed's stubs
Type::Unknown
}
Type::LiteralString => {
// TODO defer to `typing.LiteralString`/`builtins.str` methods
// from typeshed's stubs
Type::Unknown
}
Type::BytesLiteral(_) => {
// TODO defer to Type::Instance(<bytes from typeshed>).member
Type::Unknown
}
}
}
@@ -293,7 +334,7 @@ impl<'db> Type<'db> {
#[salsa::interned]
pub struct FunctionType<'db> {
/// name of the function at definition
pub name: Name,
pub name: ast::name::Name,
/// types of all decorators on this function
decorators: Vec<Type<'db>>,
@@ -308,19 +349,33 @@ impl<'db> FunctionType<'db> {
#[salsa::interned]
pub struct ClassType<'db> {
/// Name of the class at definition
pub name: Name,
pub name: ast::name::Name,
/// Types of all class bases
bases: Vec<Type<'db>>,
definition: Definition<'db>,
body_scope: ScopeId<'db>,
}
impl<'db> ClassType<'db> {
/// Return an iterator over the types of this class's bases.
///
/// # Panics:
/// If `definition` is not a `DefinitionKind::Class`.
pub fn bases(&self, db: &'db dyn Db) -> impl Iterator<Item = Type<'db>> {
let definition = self.definition(db);
let DefinitionKind::Class(class_stmt_node) = definition.node(db) else {
panic!("Class type definition must have DefinitionKind::Class");
};
class_stmt_node
.bases()
.iter()
.map(move |base_expr| definition_expression_ty(db, definition, base_expr))
}
/// Returns the class member of this class named `name`.
///
/// The member resolves to a member of the class itself or any of its bases.
pub fn class_member(self, db: &'db dyn Db, name: &Name) -> Type<'db> {
pub fn class_member(self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
let member = self.own_class_member(db, name);
if !member.is_unbound() {
return member;
@@ -330,12 +385,12 @@ impl<'db> ClassType<'db> {
}
/// Returns the inferred type of the class member named `name`.
pub fn own_class_member(self, db: &'db dyn Db, name: &Name) -> Type<'db> {
pub fn own_class_member(self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
let scope = self.body_scope(db);
symbol_ty_by_name(db, scope, name)
}
pub fn inherited_class_member(self, db: &'db dyn Db, name: &Name) -> Type<'db> {
pub fn inherited_class_member(self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
for base in self.bases(db) {
let member = base.member(db, name);
if !member.is_unbound() {
@@ -372,6 +427,18 @@ pub struct IntersectionType<'db> {
negative: FxOrderSet<Type<'db>>,
}
#[salsa::interned]
pub struct StringLiteralType<'db> {
#[return_ref]
value: Box<str>,
}
#[salsa::interned]
pub struct BytesLiteralType<'db> {
#[return_ref]
value: Box<[u8]>,
}
#[cfg(test)]
mod tests {
use anyhow::Context;
@@ -392,14 +459,9 @@ mod tests {
Program::from_settings(
&db,
ProgramSettings {
&ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings {
extra_paths: Vec::new(),
src_root: SystemPathBuf::from("/src"),
site_packages: vec![],
custom_typeshed: None,
},
search_paths: SearchPathSettings::new(SystemPathBuf::from("/src")),
},
)
.expect("Valid search path settings");
@@ -425,7 +487,7 @@ mod tests {
let foo = system_path_to_file(&db, "src/foo.py").context("Failed to resolve foo.py")?;
let diagnostics = super::check_types(&db, foo);
assert_diagnostic_messages(&diagnostics, &["Import 'bar' could not be resolved."]);
assert_diagnostic_messages(&diagnostics, &["Cannot resolve import 'bar'."]);
Ok(())
}
@@ -438,7 +500,7 @@ mod tests {
.unwrap();
let foo = system_path_to_file(&db, "src/foo.py").unwrap();
let diagnostics = super::check_types(&db, foo);
assert_diagnostic_messages(&diagnostics, &["Import 'bar' could not be resolved."]);
assert_diagnostic_messages(&diagnostics, &["Cannot resolve import 'bar'."]);
}
#[test]
@@ -450,15 +512,9 @@ mod tests {
let b_file = system_path_to_file(&db, "/src/b.py").unwrap();
let b_file_diagnostics = super::check_types(&db, b_file);
assert_diagnostic_messages(
&b_file_diagnostics,
&["Could not resolve import of 'thing' from 'a'"],
);
assert_diagnostic_messages(&b_file_diagnostics, &["Module 'a' has no member 'thing'"]);
}
#[ignore = "\
A spurious second 'Unresolved import' diagnostic message is emitted on `b.py`, \
despite the symbol existing in the symbol table for `a.py`"]
#[test]
fn resolved_import_of_symbol_from_unresolved_import() {
let mut db = setup_db();
@@ -471,10 +527,7 @@ despite the symbol existing in the symbol table for `a.py`"]
let a_file = system_path_to_file(&db, "/src/a.py").unwrap();
let a_file_diagnostics = super::check_types(&db, a_file);
assert_diagnostic_messages(
&a_file_diagnostics,
&["Import 'foo' could not be resolved."],
);
assert_diagnostic_messages(&a_file_diagnostics, &["Cannot resolve import 'foo'."]);
// Importing the unresolved import into a second first-party file should not trigger
// an additional "unresolved import" violation

View File

@@ -2,6 +2,9 @@
use std::fmt::{Display, Formatter};
use ruff_python_ast::str::Quote;
use ruff_python_literal::escape::AsciiEscape;
use crate::types::{IntersectionType, Type, UnionType};
use crate::Db;
@@ -38,6 +41,20 @@ impl Display for DisplayType<'_> {
Type::BooleanLiteral(boolean) => {
write!(f, "Literal[{}]", if *boolean { "True" } else { "False" })
}
Type::StringLiteral(string) => write!(
f,
r#"Literal["{}"]"#,
string.value(self.db).replace('"', r#"\""#)
),
Type::LiteralString => write!(f, "LiteralString"),
Type::BytesLiteral(bytes) => {
let escape =
AsciiEscape::with_preferred_quote(bytes.value(self.db).as_ref(), Quote::Double);
f.write_str("Literal[")?;
escape.bytes_repr().write(f)?;
f.write_str("]")
}
}
}
}
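
The string case above renders the literal in double quotes and escapes any embedded `"` so the output remains valid `Literal[...]` syntax. A self-contained sketch of that rendering, written as a plain function instead of a `Display` impl:

/// Render a string literal the way the `Display` impl above does:
/// wrap it in `Literal["..."]` and escape embedded double quotes.
fn display_string_literal(value: &str) -> String {
    format!(r#"Literal["{}"]"#, value.replace('"', r#"\""#))
}

fn main() {
    assert_eq!(display_string_literal("hello"), r#"Literal["hello"]"#);
    assert_eq!(
        display_string_literal(r#"say "hi""#),
        r#"Literal["say \"hi\""]"#
    );
}
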

File diff suppressed because it is too large

View File

@@ -11,7 +11,6 @@ repository = { workspace = true }
license = { workspace = true }
[dependencies]
red_knot_python_semantic = { workspace = true }
red_knot_workspace = { workspace = true }
ruff_db = { workspace = true }
ruff_notebook = { workspace = true }

View File

@@ -3,11 +3,11 @@
use std::num::NonZeroUsize;
use std::panic::PanicInfo;
use lsp_server as lsp;
use lsp_types as types;
use lsp_server::Message;
use lsp_types::{
ClientCapabilities, DiagnosticOptions, NotebookCellSelector, NotebookDocumentSyncOptions,
NotebookSelector, TextDocumentSyncCapability, TextDocumentSyncOptions,
ClientCapabilities, DiagnosticOptions, DiagnosticServerCapabilities, MessageType,
ServerCapabilities, TextDocumentSyncCapability, TextDocumentSyncKind, TextDocumentSyncOptions,
Url,
};
use self::connection::{Connection, ConnectionInitializer};
@@ -74,7 +74,7 @@ impl Server {
init_params.client_info.as_ref(),
);
let mut workspace_for_url = |url: lsp_types::Url| {
let mut workspace_for_url = |url: Url| {
let Some(workspace_settings) = workspace_settings.as_mut() else {
return (url, ClientSettings::default());
};
@@ -93,13 +93,18 @@ impl Server {
}).collect())
.or_else(|| {
tracing::warn!("No workspace(s) were provided during initialization. Using the current working directory as a default workspace...");
let uri = types::Url::from_file_path(std::env::current_dir().ok()?).ok()?;
let uri = Url::from_file_path(std::env::current_dir().ok()?).ok()?;
Some(vec![workspace_for_url(uri)])
})
.ok_or_else(|| {
anyhow::anyhow!("Failed to get the current working directory while creating a default workspace.")
})?;
if workspaces.len() > 1 {
// TODO(dhruvmanila): Support multi-root workspaces
anyhow::bail!("Multi-root workspaces are not supported yet");
}
Ok(Self {
connection,
worker_threads,
@@ -149,7 +154,7 @@ impl Server {
try_show_message(
"The Ruff language server exited with a panic. See the logs for more details."
.to_string(),
lsp_types::MessageType::ERROR,
MessageType::ERROR,
)
.ok();
}));
@@ -182,9 +187,9 @@ impl Server {
break;
}
let task = match msg {
lsp::Message::Request(req) => api::request(req),
lsp::Message::Notification(notification) => api::notification(notification),
lsp::Message::Response(response) => scheduler.response(response),
Message::Request(req) => api::request(req),
Message::Notification(notification) => api::notification(notification),
Message::Response(response) => scheduler.response(response),
};
scheduler.dispatch(task);
}
@@ -206,28 +211,17 @@ impl Server {
.unwrap_or_default()
}
fn server_capabilities(position_encoding: PositionEncoding) -> types::ServerCapabilities {
types::ServerCapabilities {
fn server_capabilities(position_encoding: PositionEncoding) -> ServerCapabilities {
ServerCapabilities {
position_encoding: Some(position_encoding.into()),
diagnostic_provider: Some(types::DiagnosticServerCapabilities::Options(
DiagnosticOptions {
identifier: Some(crate::DIAGNOSTIC_NAME.into()),
..Default::default()
},
)),
notebook_document_sync: Some(types::OneOf::Left(NotebookDocumentSyncOptions {
save: Some(false),
notebook_selector: [NotebookSelector::ByCells {
notebook: None,
cells: vec![NotebookCellSelector {
language: "python".to_string(),
}],
}]
.to_vec(),
diagnostic_provider: Some(DiagnosticServerCapabilities::Options(DiagnosticOptions {
identifier: Some(crate::DIAGNOSTIC_NAME.into()),
..Default::default()
})),
text_document_sync: Some(TextDocumentSyncCapability::Options(
TextDocumentSyncOptions {
open_close: Some(true),
change: Some(TextDocumentSyncKind::INCREMENTAL),
..Default::default()
},
)),

View File

@@ -1,13 +1,15 @@
use crate::{server::schedule::Task, session::Session, system::url_to_system_path};
use lsp_server as server;
use crate::server::schedule::Task;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
mod diagnostics;
mod notifications;
mod requests;
mod traits;
use notifications as notification;
use red_knot_workspace::db::RootDatabase;
use requests as request;
use self::traits::{NotificationHandler, RequestHandler};
@@ -43,6 +45,7 @@ pub(super) fn notification<'a>(notif: server::Notification) -> Task<'a> {
match notif.method.as_str() {
notification::DidCloseTextDocumentHandler::METHOD => local_notification_task::<notification::DidCloseTextDocumentHandler>(notif),
notification::DidOpenTextDocumentHandler::METHOD => local_notification_task::<notification::DidOpenTextDocumentHandler>(notif),
notification::DidChangeTextDocumentHandler::METHOD => local_notification_task::<notification::DidChangeTextDocumentHandler>(notif),
notification::DidOpenNotebookHandler::METHOD => {
local_notification_task::<notification::DidOpenNotebookHandler>(notif)
}
@@ -82,12 +85,18 @@ fn background_request_task<'a, R: traits::BackgroundDocumentRequestHandler>(
Ok(Task::background(schedule, move |session: &Session| {
let url = R::document_url(&params).into_owned();
let Ok(path) = url_to_system_path(&url) else {
let Ok(path) = url_to_any_system_path(&url) else {
return Box::new(|_, _| {});
};
let db = session
.workspace_db_for_path(path.as_std_path())
.map(RootDatabase::snapshot);
let db = match path {
AnySystemPath::System(path) => {
match session.workspace_db_for_path(path.as_std_path()) {
Some(db) => db.snapshot(),
None => session.default_workspace_db().snapshot(),
}
}
AnySystemPath::SystemVirtual(_) => session.default_workspace_db().snapshot(),
};
let Some(snapshot) = session.take_snapshot(url) else {
return Box::new(|_, _| {});

View File

@@ -1,9 +1,11 @@
mod did_change;
mod did_close;
mod did_close_notebook;
mod did_open;
mod did_open_notebook;
mod set_trace;
pub(super) use did_change::DidChangeTextDocumentHandler;
pub(super) use did_close::DidCloseTextDocumentHandler;
pub(super) use did_close_notebook::DidCloseNotebookHandler;
pub(super) use did_open::DidOpenTextDocumentHandler;

View File

@@ -0,0 +1,55 @@
use lsp_server::ErrorCode;
use lsp_types::notification::DidChangeTextDocument;
use lsp_types::DidChangeTextDocumentParams;
use red_knot_workspace::watch::ChangeEvent;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
pub(crate) struct DidChangeTextDocumentHandler;
impl NotificationHandler for DidChangeTextDocumentHandler {
type NotificationType = DidChangeTextDocument;
}
impl SyncNotificationHandler for DidChangeTextDocumentHandler {
fn run(
session: &mut Session,
_notifier: Notifier,
_requester: &mut Requester,
params: DidChangeTextDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_any_system_path(&params.text_document.uri) else {
return Ok(());
};
let key = session.key_from_url(params.text_document.uri);
session
.update_text_document(&key, params.content_changes, params.text_document.version)
.with_failure_code(ErrorCode::InternalError)?;
match path {
AnySystemPath::System(path) => {
let db = match session.workspace_db_for_path_mut(path.as_std_path()) {
Some(db) => db,
None => session.default_workspace_db_mut(),
};
db.apply_changes(vec![ChangeEvent::file_content_changed(path)], None);
}
AnySystemPath::SystemVirtual(virtual_path) => {
let db = session.default_workspace_db_mut();
db.apply_changes(vec![ChangeEvent::ChangedVirtual(virtual_path)], None);
}
}
// TODO(dhruvmanila): Publish diagnostics if the client doesn't support pull diagnostics
Ok(())
}
}

View File

@@ -1,8 +1,7 @@
use lsp_server::ErrorCode;
use lsp_types::notification::DidCloseTextDocument;
use lsp_types::DidCloseTextDocumentParams;
use ruff_db::files::File;
use red_knot_workspace::watch::ChangeEvent;
use crate::server::api::diagnostics::clear_diagnostics;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
@@ -10,7 +9,7 @@ use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::url_to_system_path;
use crate::system::{url_to_any_system_path, AnySystemPath};
pub(crate) struct DidCloseTextDocumentHandler;
@@ -25,7 +24,7 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
_requester: &mut Requester,
params: DidCloseTextDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_system_path(&params.text_document.uri) else {
let Ok(path) = url_to_any_system_path(&params.text_document.uri) else {
return Ok(());
};
@@ -34,8 +33,9 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
.close_document(&key)
.with_failure_code(ErrorCode::InternalError)?;
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
File::sync_path(db, &path);
if let AnySystemPath::SystemVirtual(virtual_path) = path {
let db = session.default_workspace_db_mut();
db.apply_changes(vec![ChangeEvent::DeletedVirtual(virtual_path)], None);
}
clear_diagnostics(key.url(), &notifier)?;

View File

@@ -1,14 +1,14 @@
use lsp_types::notification::DidCloseNotebookDocument;
use lsp_types::DidCloseNotebookDocumentParams;
use ruff_db::files::File;
use red_knot_workspace::watch::ChangeEvent;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::url_to_system_path;
use crate::system::{url_to_any_system_path, AnySystemPath};
pub(crate) struct DidCloseNotebookHandler;
@@ -23,7 +23,7 @@ impl SyncNotificationHandler for DidCloseNotebookHandler {
_requester: &mut Requester,
params: DidCloseNotebookDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_system_path(&params.notebook_document.uri) else {
let Ok(path) = url_to_any_system_path(&params.notebook_document.uri) else {
return Ok(());
};
@@ -32,8 +32,9 @@ impl SyncNotificationHandler for DidCloseNotebookHandler {
.close_document(&key)
.with_failure_code(lsp_server::ErrorCode::InternalError)?;
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
File::sync_path(db, &path);
if let AnySystemPath::SystemVirtual(virtual_path) = path {
let db = session.default_workspace_db_mut();
db.apply_changes(vec![ChangeEvent::DeletedVirtual(virtual_path)], None);
}
Ok(())

View File

@@ -1,13 +1,14 @@
use lsp_types::notification::DidOpenTextDocument;
use lsp_types::DidOpenTextDocumentParams;
use ruff_db::files::system_path_to_file;
use red_knot_workspace::watch::ChangeEvent;
use ruff_db::Db;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::url_to_system_path;
use crate::system::{url_to_any_system_path, AnySystemPath};
use crate::TextDocument;
pub(crate) struct DidOpenTextDocumentHandler;
@@ -23,17 +24,25 @@ impl SyncNotificationHandler for DidOpenTextDocumentHandler {
_requester: &mut Requester,
params: DidOpenTextDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_system_path(&params.text_document.uri) else {
let Ok(path) = url_to_any_system_path(&params.text_document.uri) else {
return Ok(());
};
let document = TextDocument::new(params.text_document.text, params.text_document.version);
session.open_text_document(params.text_document.uri, document);
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
// TODO(dhruvmanila): Store the `file` in `DocumentController`
let file = system_path_to_file(db, &path).unwrap();
file.sync(db);
match path {
AnySystemPath::System(path) => {
let db = match session.workspace_db_for_path_mut(path.as_std_path()) {
Some(db) => db,
None => session.default_workspace_db_mut(),
};
db.apply_changes(vec![ChangeEvent::Opened(path)], None);
}
AnySystemPath::SystemVirtual(virtual_path) => {
let db = session.default_workspace_db_mut();
db.files().virtual_file(db, &virtual_path);
}
}
// TODO(dhruvmanila): Publish diagnostics if the client doesn't support pull diagnostics

View File

@@ -2,7 +2,8 @@ use lsp_server::ErrorCode;
use lsp_types::notification::DidOpenNotebookDocument;
use lsp_types::DidOpenNotebookDocumentParams;
use ruff_db::files::system_path_to_file;
use red_knot_workspace::watch::ChangeEvent;
use ruff_db::Db;
use crate::edit::NotebookDocument;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
@@ -10,7 +11,7 @@ use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::url_to_system_path;
use crate::system::{url_to_any_system_path, AnySystemPath};
pub(crate) struct DidOpenNotebookHandler;
@@ -25,7 +26,7 @@ impl SyncNotificationHandler for DidOpenNotebookHandler {
_requester: &mut Requester,
params: DidOpenNotebookDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_system_path(&params.notebook_document.uri) else {
let Ok(path) = url_to_any_system_path(&params.notebook_document.uri) else {
return Ok(());
};
@@ -38,10 +39,18 @@ impl SyncNotificationHandler for DidOpenNotebookHandler {
.with_failure_code(ErrorCode::InternalError)?;
session.open_notebook_document(params.notebook_document.uri.clone(), notebook);
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
// TODO(dhruvmanila): Store the `file` in `DocumentController`
let file = system_path_to_file(db, &path).unwrap();
file.sync(db);
match path {
AnySystemPath::System(path) => {
let db = match session.workspace_db_for_path_mut(path.as_std_path()) {
Some(db) => db,
None => session.default_workspace_db_mut(),
};
db.apply_changes(vec![ChangeEvent::Opened(path)], None);
}
AnySystemPath::SystemVirtual(virtual_path) => {
let db = session.default_workspace_db_mut();
db.files().virtual_file(db, &virtual_path);
}
}
// TODO(dhruvmanila): Publish diagnostics if the client doesn't support pull diagnostics

View File

@@ -26,13 +26,11 @@ impl BackgroundDocumentRequestHandler for DocumentDiagnosticRequestHandler {
fn run_with_snapshot(
snapshot: DocumentSnapshot,
db: Option<RootDatabase>,
db: RootDatabase,
_notifier: Notifier,
_params: DocumentDiagnosticParams,
) -> Result<DocumentDiagnosticReportResult> {
let diagnostics = db
.map(|db| compute_diagnostics(&snapshot, &db))
.unwrap_or_default();
let diagnostics = compute_diagnostics(&snapshot, &db);
Ok(DocumentDiagnosticReportResult::Report(
DocumentDiagnosticReport::Full(RelatedFullDocumentDiagnosticReport {
@@ -48,10 +46,19 @@ impl BackgroundDocumentRequestHandler for DocumentDiagnosticRequestHandler {
fn compute_diagnostics(snapshot: &DocumentSnapshot, db: &RootDatabase) -> Vec<Diagnostic> {
let Some(file) = snapshot.file(db) else {
tracing::info!(
"No file found for snapshot for '{}'",
snapshot.query().file_url()
);
return vec![];
};
let Ok(diagnostics) = db.check_file(file) else {
return vec![];
let diagnostics = match db.check_file(file) {
Ok(diagnostics) => diagnostics,
Err(cancelled) => {
tracing::info!("Diagnostics computation {cancelled}");
return vec![];
}
};
diagnostics
@@ -65,12 +72,12 @@ fn to_lsp_diagnostic(message: &str) -> Diagnostic {
let words = message.split(':').collect::<Vec<_>>();
let (range, message) = match words.as_slice() {
[_filename, line, column, message] => {
let line = line.parse::<u32>().unwrap_or_default();
[_, _, line, column, message] | [_, line, column, message] => {
let line = line.parse::<u32>().unwrap_or_default().saturating_sub(1);
let column = column.parse::<u32>().unwrap_or_default();
(
Range::new(
Position::new(line.saturating_sub(1), column.saturating_sub(1)),
Position::new(line, column.saturating_sub(1)),
Position::new(line, column),
),
message.trim(),
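As an aside on the widened match arm above: splitting the diagnostic message on `:` yields either four or five segments, and the five-segment case is presumably a path with a leading drive letter. A minimal sketch with illustrative messages:

    let unix = "/src/a.py:1:8: Cannot resolve import 'foo'";
    assert_eq!(unix.split(':').count(), 4); // [path, line, column, message]

    // Assumption: the extra leading segment comes from a Windows drive letter.
    let windows = r"C:\src\a.py:1:8: Cannot resolve import 'foo'";
    assert_eq!(windows.split(':').count(), 5); // [drive, path, line, column, message]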

View File

@@ -34,7 +34,7 @@ pub(super) trait BackgroundDocumentRequestHandler: RequestHandler {
fn run_with_snapshot(
snapshot: DocumentSnapshot,
db: Option<RootDatabase>,
db: RootDatabase,
notifier: Notifier,
params: <<Self as RequestHandler>::RequestType as Request>::Params,
) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;

View File

@@ -6,16 +6,16 @@ use std::path::{Path, PathBuf};
use std::sync::Arc;
use anyhow::anyhow;
use lsp_types::{ClientCapabilities, Url};
use lsp_types::{ClientCapabilities, TextDocumentContentChangeEvent, Url};
use red_knot_python_semantic::{ProgramSettings, PythonVersion, SearchPathSettings};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File};
use ruff_db::system::SystemPath;
use ruff_db::Db;
use crate::edit::{DocumentKey, NotebookDocument};
use crate::system::{url_to_system_path, LSPSystem};
use crate::edit::{DocumentKey, DocumentVersion, NotebookDocument};
use crate::system::{url_to_any_system_path, AnySystemPath, LSPSystem};
use crate::{PositionEncoding, TextDocument};
pub(crate) use self::capabilities::ResolvedClientCapabilities;
@@ -67,19 +67,10 @@ impl Session {
.ok_or_else(|| anyhow!("Workspace path is not a valid UTF-8 path: {:?}", path))?;
let system = LSPSystem::new(index.clone());
let metadata = WorkspaceMetadata::from_path(system_path, &system)?;
// TODO(dhruvmanila): Get the values from the client settings
let program_settings = ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: system_path.to_path_buf(),
site_packages: vec![],
custom_typeshed: None,
},
};
let metadata = WorkspaceMetadata::from_path(system_path, &system, None)?;
// TODO(micha): Handle the case where the program settings are incorrect more gracefully.
workspaces.insert(path, RootDatabase::new(metadata, program_settings, system)?);
workspaces.insert(path, RootDatabase::new(metadata, system)?);
}
Ok(Self {
@@ -92,6 +83,12 @@ impl Session {
})
}
// TODO(dhruvmanila): Ideally, we should have a single method for `workspace_db_for_path_mut`
// and `default_workspace_db_mut` but the borrow checker doesn't allow that.
// https://github.com/astral-sh/ruff/pull/13041#discussion_r1726725437
/// Returns a reference to the workspace [`RootDatabase`] corresponding to the given path, if
/// any.
pub(crate) fn workspace_db_for_path(&self, path: impl AsRef<Path>) -> Option<&RootDatabase> {
self.workspaces
.range(..=path.as_ref().to_path_buf())
@@ -99,6 +96,8 @@ impl Session {
.map(|(_, db)| db)
}
/// Returns a mutable reference to the workspace [`RootDatabase`] corresponding to the given
/// path, if any.
pub(crate) fn workspace_db_for_path_mut(
&mut self,
path: impl AsRef<Path>,
@@ -109,6 +108,19 @@ impl Session {
.map(|(_, db)| db)
}
/// Returns a reference to the default workspace [`RootDatabase`]. The default workspace is the
/// minimum root path in the workspace map.
pub(crate) fn default_workspace_db(&self) -> &RootDatabase {
// SAFETY: Currently, red knot only supports a single workspace.
self.workspaces.values().next().unwrap()
}
/// Returns a mutable reference to the default workspace [`RootDatabase`].
pub(crate) fn default_workspace_db_mut(&mut self) -> &mut RootDatabase {
// SAFETY: Currently, red knot only supports a single workspace.
self.workspaces.values_mut().next().unwrap()
}
pub fn key_from_url(&self, url: Url) -> DocumentKey {
self.index().key_from_url(url)
}
@@ -135,6 +147,20 @@ impl Session {
self.index_mut().open_text_document(url, document);
}
/// Updates a text document at the associated `key`.
///
/// The document key must point to a text document, or this will throw an error.
pub(crate) fn update_text_document(
&mut self,
key: &DocumentKey,
content_changes: Vec<TextDocumentContentChangeEvent>,
new_version: DocumentVersion,
) -> crate::Result<()> {
let position_encoding = self.position_encoding;
self.index_mut()
.update_text_document(key, content_changes, new_version, position_encoding)
}
/// De-registers a document, specified by its key.
/// Calling this multiple times for the same document is a logic error.
pub(crate) fn close_document(&mut self, key: &DocumentKey) -> crate::Result<()> {
@@ -221,6 +247,7 @@ impl Drop for MutIndexGuard<'_> {
/// An immutable snapshot of `Session` that references
/// a specific document.
#[derive(Debug)]
pub struct DocumentSnapshot {
resolved_client_capabilities: Arc<ResolvedClientCapabilities>,
document_ref: index::DocumentQuery,
@@ -241,7 +268,12 @@ impl DocumentSnapshot {
}
pub(crate) fn file(&self, db: &RootDatabase) -> Option<File> {
let path = url_to_system_path(self.document_ref.file_url()).ok()?;
system_path_to_file(db, path).ok()
match url_to_any_system_path(self.document_ref.file_url()).ok()? {
AnySystemPath::System(path) => system_path_to_file(db, path).ok(),
AnySystemPath::SystemVirtual(virtual_path) => db
.files()
.try_virtual_file(&virtual_path)
.map(|virtual_file| virtual_file.file()),
}
}
}

View File

@@ -8,27 +8,40 @@ use ruff_db::file_revision::FileRevision;
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
use ruff_db::system::{
DirectoryEntry, FileType, Metadata, OsSystem, Result, System, SystemPath, SystemPathBuf,
SystemVirtualPath,
SystemVirtualPath, SystemVirtualPathBuf,
};
use ruff_notebook::{Notebook, NotebookError};
use crate::session::index::Index;
use crate::DocumentQuery;
/// Converts the given [`Url`] to a [`SystemPathBuf`].
/// Converts the given [`Url`] to an [`AnySystemPath`].
///
/// If the URL scheme is `file`, then the path is converted to a [`SystemPathBuf`]. Otherwise, the
/// URL is converted to a [`SystemVirtualPathBuf`].
///
/// This fails in the following cases:
/// * The URL scheme is not `file`.
/// * The URL cannot be converted to a file path (refer to [`Url::to_file_path`]).
/// * The URL is not a valid UTF-8 string.
pub(crate) fn url_to_system_path(url: &Url) -> std::result::Result<SystemPathBuf, ()> {
pub(crate) fn url_to_any_system_path(url: &Url) -> std::result::Result<AnySystemPath, ()> {
if url.scheme() == "file" {
Ok(SystemPathBuf::from_path_buf(url.to_file_path()?).map_err(|_| ())?)
Ok(AnySystemPath::System(
SystemPathBuf::from_path_buf(url.to_file_path()?).map_err(|_| ())?,
))
} else {
Err(())
Ok(AnySystemPath::SystemVirtual(
SystemVirtualPath::new(url.as_str()).to_path_buf(),
))
}
}
/// Represents either a [`SystemPath`] or a [`SystemVirtualPath`].
#[derive(Debug)]
pub(crate) enum AnySystemPath {
System(SystemPathBuf),
SystemVirtual(SystemVirtualPathBuf),
}
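A minimal sketch of how `url_to_any_system_path` behaves for the two URL kinds (the concrete URLs are illustrative; `untitled:` is the scheme editors commonly use for unsaved documents):

    let on_disk = Url::parse("file:///src/main.py").unwrap();
    assert!(matches!(
        url_to_any_system_path(&on_disk),
        Ok(AnySystemPath::System(_))
    ));

    // Any non-`file` scheme becomes a virtual path keyed by the full URL.
    let unsaved = Url::parse("untitled:Untitled-1").unwrap();
    assert!(matches!(
        url_to_any_system_path(&unsaved),
        Ok(AnySystemPath::SystemVirtual(_))
    ));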
#[derive(Debug)]
pub(crate) struct LSPSystem {
/// A read-only copy of the index where the server stores all the open documents and settings.
@@ -144,19 +157,6 @@ impl System for LSPSystem {
}
}
fn virtual_path_metadata(&self, path: &SystemVirtualPath) -> Result<Metadata> {
// Virtual paths only exist in the LSP system, so we don't need to check the OS system.
let document = self
.system_virtual_path_to_document_ref(path)?
.ok_or_else(|| virtual_path_not_found(path))?;
Ok(Metadata::new(
document_revision(&document),
None,
FileType::File,
))
}
fn read_virtual_path_to_string(&self, path: &SystemVirtualPath) -> Result<String> {
let document = self
.system_virtual_path_to_document_ref(path)?

View File

@@ -3,8 +3,8 @@ use std::any::Any;
use js_sys::Error;
use wasm_bindgen::prelude::*;
use red_knot_python_semantic::{ProgramSettings, SearchPathSettings};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::workspace::settings::Configuration;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File};
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
@@ -41,16 +41,17 @@ impl Workspace {
#[wasm_bindgen(constructor)]
pub fn new(root: &str, settings: &Settings) -> Result<Workspace, Error> {
let system = WasmSystem::new(SystemPath::new(root));
let workspace =
WorkspaceMetadata::from_path(SystemPath::new(root), &system).map_err(into_error)?;
let workspace = WorkspaceMetadata::from_path(
SystemPath::new(root),
&system,
Some(Configuration {
target_version: Some(settings.target_version.into()),
..Configuration::default()
}),
)
.map_err(into_error)?;
let program_settings = ProgramSettings {
target_version: settings.target_version.into(),
search_paths: SearchPathSettings::default(),
};
let db =
RootDatabase::new(workspace, program_settings, system.clone()).map_err(into_error)?;
let db = RootDatabase::new(workspace, system.clone()).map_err(into_error)?;
Ok(Self { db, system })
}
@@ -233,13 +234,6 @@ impl System for WasmSystem {
Notebook::from_source_code(&content)
}
fn virtual_path_metadata(
&self,
_path: &SystemVirtualPath,
) -> ruff_db::system::Result<Metadata> {
Err(not_found())
}
fn read_virtual_path_to_string(
&self,
_path: &SystemVirtualPath,

View File

@@ -11,14 +11,14 @@ fn check() {
};
let mut workspace = Workspace::new("/", &settings).expect("Workspace to be created");
let test = workspace
workspace
.open_file("test.py", "import random22\n")
.expect("File to be opened");
let result = workspace.check_file(&test).expect("Check to succeed");
let result = workspace.check().expect("Check to succeed");
assert_eq!(
result,
vec!["/test.py:1:8: Import 'random22' could not be resolved.",]
vec!["/test.py:1:8: Cannot resolve import 'random22'."]
);
}

View File

@@ -22,13 +22,13 @@ ruff_text_size = { workspace = true }
anyhow = { workspace = true }
crossbeam = { workspace = true }
notify = { workspace = true }
rayon = { workspace = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
[dev-dependencies]
ruff_db = { workspace = true, features = ["testing"]}
ruff_db = { workspace = true, features = ["testing"] }
[lints]
workspace = true

View File

@@ -0,0 +1,2 @@
x = 0
(x := x + 1)

View File

@@ -0,0 +1,3 @@
x = 0
if x := x + 1:
pass

View File

@@ -0,0 +1,11 @@
def bool(x) -> bool:
return True
class MyClass: ...
def MyClass() -> MyClass: ...
def x(self) -> x: ...

View File

@@ -0,0 +1,2 @@
def bool(x=bool):
return x

View File

@@ -0,0 +1,3 @@
match x:
case [1, 0] if x := x[:0]:
y = 1

View File

@@ -1,15 +1,14 @@
use std::panic::RefUnwindSafe;
use std::sync::Arc;
use red_knot_python_semantic::{
vendored_typeshed_stubs, Db as SemanticDb, Program, ProgramSettings,
};
use salsa::plumbing::ZalsaDatabase;
use salsa::{Cancelled, Event};
use red_knot_python_semantic::{vendored_typeshed_stubs, Db as SemanticDb, Program};
use ruff_db::files::{File, Files};
use ruff_db::system::System;
use ruff_db::vendored::VendoredFileSystem;
use ruff_db::{Db as SourceDb, Upcast};
use salsa::plumbing::ZalsaDatabase;
use salsa::{Cancelled, Event};
use crate::workspace::{check_file, Workspace, WorkspaceMetadata};
@@ -27,11 +26,7 @@ pub struct RootDatabase {
}
impl RootDatabase {
pub fn new<S>(
workspace: WorkspaceMetadata,
settings: ProgramSettings,
system: S,
) -> anyhow::Result<Self>
pub fn new<S>(workspace: WorkspaceMetadata, system: S) -> anyhow::Result<Self>
where
S: System + 'static + Send + Sync + RefUnwindSafe,
{
@@ -42,11 +37,11 @@ impl RootDatabase {
system: Arc::new(system),
};
let workspace = Workspace::from_metadata(&db, workspace);
// Initialize the `Program` singleton
Program::from_settings(&db, settings)?;
Program::from_settings(&db, workspace.settings().program())?;
db.workspace = Some(Workspace::from_metadata(&db, workspace));
db.workspace = Some(workspace);
Ok(db)
}
@@ -61,6 +56,8 @@ impl RootDatabase {
}
pub fn check_file(&self, file: File) -> Result<Vec<String>, Cancelled> {
let _span = tracing::debug_span!("check_file", file=%file.path(self)).entered();
self.with_db(|db| check_file(db, file))
}
@@ -160,9 +157,10 @@ impl Db for RootDatabase {}
#[cfg(test)]
pub(crate) mod tests {
use salsa::Event;
use std::sync::Arc;
use salsa::Event;
use red_knot_python_semantic::{vendored_typeshed_stubs, Db as SemanticDb};
use ruff_db::files::Files;
use ruff_db::system::{DbWithTestSystem, System, TestSystem};

View File

@@ -1,22 +1,33 @@
use rustc_hash::FxHashSet;
use red_knot_python_semantic::Program;
use ruff_db::files::{system_path_to_file, File, Files};
use ruff_db::system::walk_directory::WalkState;
use ruff_db::system::SystemPath;
use ruff_db::Db;
use rustc_hash::FxHashSet;
use crate::db::RootDatabase;
use crate::watch;
use crate::watch::{CreatedKind, DeletedKind};
use crate::workspace::settings::Configuration;
use crate::workspace::WorkspaceMetadata;
impl RootDatabase {
#[tracing::instrument(level = "debug", skip(self, changes))]
pub fn apply_changes(&mut self, changes: Vec<watch::ChangeEvent>) {
#[tracing::instrument(level = "debug", skip(self, changes, base_configuration))]
pub fn apply_changes(
&mut self,
changes: Vec<watch::ChangeEvent>,
base_configuration: Option<&Configuration>,
) {
let workspace = self.workspace();
let workspace_path = workspace.root(self).to_path_buf();
let program = Program::get(self);
let custom_stdlib_versions_path = program
.custom_stdlib_search_path(self)
.map(|path| path.join("VERSIONS"));
let mut workspace_change = false;
// Changes to a custom stdlib path's VERSIONS
let mut custom_stdlib_change = false;
// Packages that need reloading
let mut changed_packages = FxHashSet::default();
// Paths that were added
@@ -39,7 +50,7 @@ impl RootDatabase {
};
for change in changes {
if let Some(path) = change.path() {
if let Some(path) = change.system_path() {
if matches!(
path.file_name(),
Some(".gitignore" | ".ignore" | "ruff.toml" | ".ruff.toml" | "pyproject.toml")
@@ -54,10 +65,15 @@ impl RootDatabase {
continue;
}
if Some(path) == custom_stdlib_versions_path.as_deref() {
custom_stdlib_change = true;
}
}
match change {
watch::ChangeEvent::Changed { path, kind: _ } => sync_path(self, &path),
watch::ChangeEvent::Changed { path, kind: _ }
| watch::ChangeEvent::Opened(path) => sync_path(self, &path),
watch::ChangeEvent::Created { kind, path } => {
match kind {
@@ -100,7 +116,13 @@ impl RootDatabase {
} else {
sync_recursively(self, &path);
// TODO: Remove after converting `package.files()` to a salsa query.
if custom_stdlib_versions_path
.as_ref()
.is_some_and(|versions_path| versions_path.starts_with(&path))
{
custom_stdlib_change = true;
}
if let Some(package) = workspace.package(self, &path) {
changed_packages.insert(package);
} else {
@@ -109,6 +131,17 @@ impl RootDatabase {
}
}
watch::ChangeEvent::CreatedVirtual(path)
| watch::ChangeEvent::ChangedVirtual(path) => {
File::sync_virtual_path(self, &path);
}
watch::ChangeEvent::DeletedVirtual(path) => {
if let Some(virtual_file) = self.files().try_virtual_file(&path) {
virtual_file.close(self);
}
}
watch::ChangeEvent::Rescan => {
workspace_change = true;
Files::sync_all(self);
@@ -118,7 +151,11 @@ impl RootDatabase {
}
if workspace_change {
match WorkspaceMetadata::from_path(&workspace_path, self.system()) {
match WorkspaceMetadata::from_path(
&workspace_path,
self.system(),
base_configuration.cloned(),
) {
Ok(metadata) => {
tracing::debug!("Reloading workspace after structural change.");
// TODO: Handle changes in the program settings.
@@ -130,6 +167,11 @@ impl RootDatabase {
}
return;
} else if custom_stdlib_change {
let search_paths = workspace.search_path_settings(self).clone();
if let Err(error) = program.update_search_paths(self, &search_paths) {
tracing::error!("Failed to set the new search paths: {error}");
}
}
let mut added_paths = added_paths.into_iter().filter(|path| {

View File

@@ -1,5 +1,4 @@
pub mod db;
pub mod lint;
pub mod site_packages;
pub mod watch;
pub mod workspace;

View File

@@ -7,7 +7,7 @@ use red_knot_python_semantic::types::Type;
use red_knot_python_semantic::{HasTy, ModuleName, SemanticModel};
use ruff_db::files::File;
use ruff_db::parsed::{parsed_module, ParsedModule};
use ruff_db::source::{line_index, source_text, SourceText};
use ruff_db::source::{source_text, SourceText};
use ruff_python_ast as ast;
use ruff_python_ast::visitor::{walk_expr, walk_stmt, Visitor};
use ruff_text_size::{Ranged, TextSize};
@@ -48,19 +48,6 @@ pub(crate) fn lint_syntax(db: &dyn Db, file_id: File) -> Vec<String> {
};
visitor.visit_body(&ast.body);
diagnostics = visitor.diagnostics;
} else {
let path = file_id.path(db);
let line_index = line_index(db.upcast(), file_id);
diagnostics.extend(parsed.errors().iter().map(|err| {
let source_location = line_index.source_location(err.location.start(), source.as_str());
format!(
"{}:{}:{}: {}",
path.as_str(),
source_location.row,
source_location.column,
err,
)
}));
}
diagnostics
@@ -286,14 +273,9 @@ mod tests {
Program::from_settings(
&db,
ProgramSettings {
&ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings {
extra_paths: Vec::new(),
src_root,
site_packages: vec![],
custom_typeshed: None,
},
search_paths: SearchPathSettings::new(src_root),
},
)
.expect("Valid program settings");

View File

@@ -1,4 +1,4 @@
use ruff_db::system::{SystemPath, SystemPathBuf};
use ruff_db::system::{SystemPath, SystemPathBuf, SystemVirtualPathBuf};
pub use watcher::{directory_watcher, EventHandler, Watcher};
pub use workspace_watcher::WorkspaceWatcher;
@@ -20,6 +20,9 @@ mod workspace_watcher;
/// event instead of emitting an event for each file or subdirectory in that path.
#[derive(Debug, PartialEq, Eq)]
pub enum ChangeEvent {
/// The file corresponding to the given path was opened in an editor.
Opened(SystemPathBuf),
/// A new path was created
Created {
path: SystemPathBuf,
@@ -38,6 +41,15 @@ pub enum ChangeEvent {
kind: DeletedKind,
},
/// A new virtual path was created.
CreatedVirtual(SystemVirtualPathBuf),
/// The content of a virtual path was changed.
ChangedVirtual(SystemVirtualPathBuf),
/// A virtual path was deleted.
DeletedVirtual(SystemVirtualPathBuf),
/// The file watcher failed to observe some changes and is now out of sync with the file system.
///
/// This can happen if many files are changed at once. The consumer should rescan all files to catch up
@@ -46,16 +58,27 @@ pub enum ChangeEvent {
}
impl ChangeEvent {
pub fn file_name(&self) -> Option<&str> {
self.path().and_then(|path| path.file_name())
/// Creates a new [`Changed`] event for the file content at the given path.
///
/// [`Changed`]: ChangeEvent::Changed
pub fn file_content_changed(path: SystemPathBuf) -> ChangeEvent {
ChangeEvent::Changed {
path,
kind: ChangedKind::FileContent,
}
}
pub fn path(&self) -> Option<&SystemPath> {
pub fn file_name(&self) -> Option<&str> {
self.system_path().and_then(|path| path.file_name())
}
pub fn system_path(&self) -> Option<&SystemPath> {
match self {
ChangeEvent::Created { path, .. }
ChangeEvent::Opened(path)
| ChangeEvent::Created { path, .. }
| ChangeEvent::Changed { path, .. }
| ChangeEvent::Deleted { path, .. } => Some(path),
ChangeEvent::Rescan => None,
_ => None,
}
}
}
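A small usage sketch for the new constructor (the path is hypothetical); this mirrors what the LSP `didChange` handler does with an editor edit:

    let event = ChangeEvent::file_content_changed(SystemPathBuf::from("/src/app.py"));
    assert_eq!(event.file_name(), Some("app.py"));
    // db.apply_changes(vec![event], None);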

View File

@@ -5,6 +5,8 @@ use salsa::{Durability, Setter as _};
pub use metadata::{PackageMetadata, WorkspaceMetadata};
use red_knot_python_semantic::types::check_types;
use red_knot_python_semantic::SearchPathSettings;
use ruff_db::parsed::parsed_module;
use ruff_db::source::{line_index, source_text, SourceDiagnostic};
use ruff_db::{
files::{system_path_to_file, File},
@@ -13,7 +15,8 @@ use ruff_db::{
use ruff_python_ast::{name::Name, PySourceType};
use ruff_text_size::Ranged;
use crate::workspace::files::{Index, Indexed, PackageFiles};
use crate::db::RootDatabase;
use crate::workspace::files::{Index, Indexed, IndexedIter, PackageFiles};
use crate::{
db::Db,
lint::{lint_semantic, lint_syntax},
@@ -21,6 +24,7 @@ use crate::{
mod files;
mod metadata;
pub mod settings;
/// The project workspace as a Salsa ingredient.
///
@@ -81,6 +85,10 @@ pub struct Workspace {
/// The (first-party) packages in this workspace.
#[return_ref]
package_tree: BTreeMap<SystemPathBuf, Package>,
/// The unresolved search path configuration.
#[return_ref]
pub search_path_settings: SearchPathSettings,
}
/// A first-party package in a workspace.
@@ -109,10 +117,14 @@ impl Workspace {
packages.insert(package.root.clone(), Package::from_metadata(db, package));
}
Workspace::builder(metadata.root, packages)
.durability(Durability::MEDIUM)
.open_fileset_durability(Durability::LOW)
.new(db)
Workspace::builder(
metadata.root,
packages,
metadata.settings.program.search_paths,
)
.durability(Durability::MEDIUM)
.open_fileset_durability(Durability::LOW)
.new(db)
}
pub fn root(self, db: &dyn Db) -> &SystemPath {
@@ -143,6 +155,11 @@ impl Workspace {
new_packages.insert(path, package);
}
if &metadata.settings.program.search_paths != self.search_path_settings(db) {
self.set_search_path_settings(db)
.to(metadata.settings.program.search_paths);
}
self.set_package_tree(db).to(new_packages);
}
@@ -174,23 +191,35 @@ impl Workspace {
}
/// Checks all open files in the workspace and its dependencies.
#[tracing::instrument(level = "debug", skip_all)]
pub fn check(self, db: &dyn Db) -> Vec<String> {
pub fn check(self, db: &RootDatabase) -> Vec<String> {
let workspace_span = tracing::debug_span!("check_workspace");
let _span = workspace_span.enter();
tracing::debug!("Checking workspace");
let files = WorkspaceFiles::new(db, self);
let result = Arc::new(std::sync::Mutex::new(Vec::new()));
let inner_result = Arc::clone(&result);
let mut result = Vec::new();
let db = db.snapshot();
let workspace_span = workspace_span.clone();
if let Some(open_files) = self.open_files(db) {
for file in open_files {
result.extend_from_slice(&check_file(db, *file));
rayon::scope(move |scope| {
for file in &files {
let result = inner_result.clone();
let db = db.snapshot();
let workspace_span = workspace_span.clone();
scope.spawn(move |_| {
let check_file_span = tracing::debug_span!(parent: &workspace_span, "check_file", file=%file.path(&db));
let _entered = check_file_span.entered();
let file_diagnostics = check_file(&db, file);
result.lock().unwrap().extend(file_diagnostics);
});
}
} else {
for package in self.packages(db) {
result.extend(package.check(db));
}
}
});
result
Arc::into_inner(result).unwrap().into_inner().unwrap()
}
/// Opens a file in the workspace.
@@ -308,19 +337,6 @@ impl Package {
index.insert(file);
}
#[tracing::instrument(level = "debug", skip(db))]
pub(crate) fn check(self, db: &dyn Db) -> Vec<String> {
tracing::debug!("Checking package '{}'", self.root(db));
let mut result = Vec::new();
for file in &self.files(db) {
let diagnostics = check_file(db, file);
result.extend_from_slice(&diagnostics);
}
result
}
/// Returns the files belonging to this package.
pub fn files(self, db: &dyn Db) -> Indexed<'_> {
let files = self.file_set(db);
@@ -331,11 +347,7 @@ impl Package {
tracing::debug_span!("index_package_files", package = %self.name(db)).entered();
let files = discover_package_files(db, self.root(db));
tracing::info!(
"Indexed {} files for package '{}'",
files.len(),
self.name(db)
);
tracing::info!("Found {} files in package '{}'", files.len(), self.name(db));
vacant.set(files)
}
Index::Indexed(indexed) => indexed,
@@ -372,9 +384,7 @@ impl Package {
#[salsa::tracked]
pub(super) fn check_file(db: &dyn Db, file: File) -> Vec<String> {
let path = file.path(db);
let _span = tracing::debug_span!("check_file", file=%path).entered();
tracing::debug!("Checking file '{path}'");
tracing::debug!("Checking file '{path}'", path = file.path(db));
let mut diagnostics = Vec::new();
@@ -393,6 +403,17 @@ pub(super) fn check_file(db: &dyn Db, file: File) -> Vec<String> {
return diagnostics;
}
let parsed = parsed_module(db.upcast(), file);
if !parsed.errors().is_empty() {
let path = file.path(db);
let line_index = line_index(db.upcast(), file);
diagnostics.extend(parsed.errors().iter().map(|err| {
let source_location = line_index.source_location(err.location.start(), source.as_str());
format!("{path}:{source_location}: {message}", message = err.error)
}));
}
for diagnostic in check_types(db.upcast(), file) {
let index = line_index(db.upcast(), diagnostic.file());
let location = index.source_location(diagnostic.start(), source.as_str());
@@ -451,6 +472,73 @@ fn discover_package_files(db: &dyn Db, path: &SystemPath) -> FxHashSet<File> {
files
}
#[derive(Debug)]
enum WorkspaceFiles<'a> {
OpenFiles(&'a FxHashSet<File>),
PackageFiles(Vec<Indexed<'a>>),
}
impl<'a> WorkspaceFiles<'a> {
fn new(db: &'a dyn Db, workspace: Workspace) -> Self {
if let Some(open_files) = workspace.open_files(db) {
WorkspaceFiles::OpenFiles(open_files)
} else {
WorkspaceFiles::PackageFiles(
workspace
.packages(db)
.map(|package| package.files(db))
.collect(),
)
}
}
}
impl<'a> IntoIterator for &'a WorkspaceFiles<'a> {
type Item = File;
type IntoIter = WorkspaceFilesIter<'a>;
fn into_iter(self) -> Self::IntoIter {
match self {
WorkspaceFiles::OpenFiles(files) => WorkspaceFilesIter::OpenFiles(files.iter()),
WorkspaceFiles::PackageFiles(package_files) => {
let mut package_files = package_files.iter();
WorkspaceFilesIter::PackageFiles {
current: package_files.next().map(IntoIterator::into_iter),
package_files,
}
}
}
}
}
enum WorkspaceFilesIter<'db> {
OpenFiles(std::collections::hash_set::Iter<'db, File>),
PackageFiles {
package_files: std::slice::Iter<'db, Indexed<'db>>,
current: Option<IndexedIter<'db>>,
},
}
impl Iterator for WorkspaceFilesIter<'_> {
type Item = File;
fn next(&mut self) -> Option<Self::Item> {
match self {
WorkspaceFilesIter::OpenFiles(files) => files.next().copied(),
WorkspaceFilesIter::PackageFiles {
package_files,
current,
} => loop {
if let Some(file) = current.as_mut().and_then(Iterator::next) {
return Some(file);
}
*current = Some(package_files.next()?.into_iter());
},
}
}
}
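A sketch of how `Workspace::check` consumes these types, given a `db: &dyn Db` and a `workspace`: with open files present only those are iterated, otherwise every indexed package file is visited, all behind the same `File` iterator:

    let files = WorkspaceFiles::new(db, workspace);
    for file in &files {
        let _diagnostics = check_file(db, file);
    }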
#[cfg(test)]
mod tests {
use ruff_db::files::system_path_to_file;

View File

@@ -158,9 +158,11 @@ impl Deref for Indexed<'_> {
}
}
pub(super) type IndexedIter<'a> = std::iter::Copied<std::collections::hash_set::Iter<'a, File>>;
impl<'a> IntoIterator for &'a Indexed<'_> {
type Item = File;
type IntoIter = std::iter::Copied<std::collections::hash_set::Iter<'a, File>>;
type IntoIter = IndexedIter<'a>;
fn into_iter(self) -> Self::IntoIter {
self.files.iter().copied()

View File

@@ -1,3 +1,4 @@
use crate::workspace::settings::{Configuration, WorkspaceSettings};
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ruff_python_ast::name::Name;
@@ -7,6 +8,8 @@ pub struct WorkspaceMetadata {
/// The (first-party) packages in this workspace.
pub(super) packages: Vec<PackageMetadata>,
pub(super) settings: WorkspaceSettings,
}
/// A first-party package in a workspace.
@@ -21,7 +24,11 @@ pub struct PackageMetadata {
impl WorkspaceMetadata {
/// Discovers the closest workspace at `path` and returns its metadata.
pub fn from_path(path: &SystemPath, system: &dyn System) -> anyhow::Result<WorkspaceMetadata> {
pub fn from_path(
path: &SystemPath,
system: &dyn System,
base_configuration: Option<Configuration>,
) -> anyhow::Result<WorkspaceMetadata> {
assert!(
system.is_directory(path),
"Workspace root path must be a directory"
@@ -38,9 +45,20 @@ impl WorkspaceMetadata {
root: root.clone(),
};
// TODO: Load the configuration from disk.
let mut configuration = Configuration::default();
if let Some(base_configuration) = base_configuration {
configuration.extend(base_configuration);
}
// TODO: Respect the package configurations when resolving settings (e.g. for the target version).
let settings = configuration.into_workspace_settings(&root);
let workspace = WorkspaceMetadata {
root,
packages: vec![package],
settings,
};
Ok(workspace)
@@ -53,6 +71,10 @@ impl WorkspaceMetadata {
pub fn packages(&self) -> &[PackageMetadata] {
&self.packages
}
pub fn settings(&self) -> &WorkspaceSettings {
&self.settings
}
}
impl PackageMetadata {

View File

@@ -0,0 +1,89 @@
use red_knot_python_semantic::{ProgramSettings, PythonVersion, SearchPathSettings, SitePackages};
use ruff_db::system::{SystemPath, SystemPathBuf};
/// The resolved configurations.
///
/// The main difference from [`Configuration`] is that default values are filled in.
#[derive(Debug, Clone)]
pub struct WorkspaceSettings {
pub(super) program: ProgramSettings,
}
impl WorkspaceSettings {
pub fn program(&self) -> &ProgramSettings {
&self.program
}
}
/// The configuration for the workspace or a package.
#[derive(Debug, Default, Clone)]
pub struct Configuration {
pub target_version: Option<PythonVersion>,
pub search_paths: SearchPathConfiguration,
}
impl Configuration {
/// Extends this configuration by using the values from `with` for all values that are absent in `self`.
pub fn extend(&mut self, with: Configuration) {
self.target_version = self.target_version.or(with.target_version);
self.search_paths.extend(with.search_paths);
}
pub fn into_workspace_settings(self, workspace_root: &SystemPath) -> WorkspaceSettings {
WorkspaceSettings {
program: ProgramSettings {
target_version: self.target_version.unwrap_or_default(),
search_paths: self.search_paths.into_settings(workspace_root),
},
}
}
}
#[derive(Debug, Default, Clone, Eq, PartialEq)]
pub struct SearchPathConfiguration {
/// List of user-provided paths that should take first priority in the module resolution.
/// Examples in other type checkers are mypy's MYPYPATH environment variable,
/// or pyright's stubPath configuration setting.
pub extra_paths: Option<Vec<SystemPathBuf>>,
/// The root of the workspace, used for finding first-party modules.
pub src_root: Option<SystemPathBuf>,
/// Optional path to a "custom typeshed" directory on disk for us to use for standard-library types.
/// If this is not provided, we will fall back to our vendored typeshed stubs for the stdlib,
/// bundled as a zip file in the binary.
pub custom_typeshed: Option<SystemPathBuf>,
/// The path to the user's `site-packages` directory, where third-party packages from `PyPI` are installed.
pub site_packages: Option<SitePackages>,
}
impl SearchPathConfiguration {
pub fn into_settings(self, workspace_root: &SystemPath) -> SearchPathSettings {
let site_packages = self.site_packages.unwrap_or(SitePackages::Known(vec![]));
SearchPathSettings {
extra_paths: self.extra_paths.unwrap_or_default(),
src_root: self
.src_root
.unwrap_or_else(|| workspace_root.to_path_buf()),
custom_typeshed: self.custom_typeshed,
site_packages,
}
}
pub fn extend(&mut self, with: SearchPathConfiguration) {
if let Some(extra_paths) = with.extra_paths {
self.extra_paths.get_or_insert(extra_paths);
}
if let Some(src_root) = with.src_root {
self.src_root.get_or_insert(src_root);
}
if let Some(custom_typeshed) = with.custom_typeshed {
self.custom_typeshed.get_or_insert(custom_typeshed);
}
if let Some(site_packages) = with.site_packages {
self.site_packages.get_or_insert(site_packages);
}
}
}
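A minimal sketch of the layering `extend` enables, assuming a base (e.g. CLI-provided) configuration should only fill in fields the workspace configuration left unset:

    let mut configuration = Configuration::default(); // e.g. loaded from disk; nothing set yet
    configuration.extend(Configuration {
        target_version: Some(PythonVersion::PY312),
        ..Configuration::default()
    });

    let settings = configuration.into_workspace_settings(SystemPath::new("/workspace"));
    assert_eq!(settings.program().target_version, PythonVersion::PY312);
    // `src_root` falls back to the workspace root because it was never configured.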

View File

@@ -1,6 +1,7 @@
use red_knot_python_semantic::{
HasTy, ProgramSettings, PythonVersion, SearchPathSettings, SemanticModel,
};
use std::fs;
use std::path::PathBuf;
use red_knot_python_semantic::{HasTy, SemanticModel};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File};
@@ -9,23 +10,11 @@ use ruff_db::system::{OsSystem, SystemPath, SystemPathBuf};
use ruff_python_ast::visitor::source_order;
use ruff_python_ast::visitor::source_order::SourceOrderVisitor;
use ruff_python_ast::{Alias, Expr, Parameter, ParameterWithDefault, Stmt};
use std::fs;
use std::path::PathBuf;
fn setup_db(workspace_root: SystemPathBuf) -> anyhow::Result<RootDatabase> {
let system = OsSystem::new(&workspace_root);
let workspace = WorkspaceMetadata::from_path(&workspace_root, &system)?;
let search_paths = SearchPathSettings {
extra_paths: vec![],
src_root: workspace_root,
custom_typeshed: None,
site_packages: vec![],
};
let settings = ProgramSettings {
target_version: PythonVersion::default(),
search_paths,
};
RootDatabase::new(workspace, settings, system)
fn setup_db(workspace_root: &SystemPath) -> anyhow::Result<RootDatabase> {
let system = OsSystem::new(workspace_root);
let workspace = WorkspaceMetadata::from_path(workspace_root, &system, None)?;
RootDatabase::new(workspace, system)
}
/// Test that all snippets in testcorpus can be checked without panic
@@ -33,8 +22,9 @@ fn setup_db(workspace_root: SystemPathBuf) -> anyhow::Result<RootDatabase> {
#[allow(clippy::print_stdout)]
fn corpus_no_panic() -> anyhow::Result<()> {
let corpus = PathBuf::from(env!("CARGO_MANIFEST_DIR")).join("resources/test/corpus");
let system_corpus = SystemPath::from_std_path(&corpus).expect("corpus path to be UTF8");
let db = setup_db(system_corpus.to_path_buf())?;
let system_corpus =
SystemPathBuf::from_path_buf(corpus.clone()).expect("corpus path to be UTF8");
let db = setup_db(&system_corpus)?;
for path in fs::read_dir(&corpus).expect("corpus to be a directory") {
let path = path.expect("path to not be an error").path();

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.6.1"
version = "0.6.3"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -37,13 +37,14 @@ name = "red_knot"
harness = false
[dependencies]
codspeed-criterion-compat = { workspace = true, default-features = false, optional = true }
criterion = { workspace = true, default-features = false }
once_cell = { workspace = true }
rayon = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
url = { workspace = true }
ureq = { workspace = true }
criterion = { workspace = true, default-features = false }
codspeed-criterion-compat = { workspace = true, default-features = false, optional = true }
[dev-dependencies]
ruff_db = { workspace = true }

View File

@@ -1,8 +1,10 @@
#![allow(clippy::disallowed_names)]
use red_knot_python_semantic::{ProgramSettings, PythonVersion, SearchPathSettings};
use rayon::ThreadPoolBuilder;
use red_knot_python_semantic::PythonVersion;
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::watch::{ChangeEvent, ChangedKind};
use red_knot_workspace::workspace::settings::Configuration;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_benchmark::criterion::{criterion_group, criterion_main, BatchSize, Criterion};
use ruff_benchmark::TestFile;
@@ -19,18 +21,9 @@ struct Case {
const TOMLLIB_312_URL: &str = "https://raw.githubusercontent.com/python/cpython/8e8a4baf652f6e1cee7acde9d78c4b6154539748/Lib/tomllib";
// This first "unresolved import" is because we don't understand `*` imports yet.
// The following "unresolved import" violations are because we can't distinguish currently from
// "Symbol exists in the module but its type is unknown" and
// "Symbol does not exist in the module"
// The "unresolved import" is because we don't understand `*` imports yet.
static EXPECTED_DIAGNOSTICS: &[&str] = &[
"/src/tomllib/_parser.py:7:29: Could not resolve import of 'Iterable' from 'collections.abc'",
"/src/tomllib/_parser.py:10:20: Could not resolve import of 'Any' from 'typing'",
"/src/tomllib/_parser.py:13:5: Could not resolve import of 'RE_DATETIME' from '._re'",
"/src/tomllib/_parser.py:14:5: Could not resolve import of 'RE_LOCALTIME' from '._re'",
"/src/tomllib/_parser.py:15:5: Could not resolve import of 'RE_NUMBER' from '._re'",
"/src/tomllib/_parser.py:20:21: Could not resolve import of 'Key' from '._types'",
"/src/tomllib/_parser.py:20:26: Could not resolve import of 'ParseFloat' from '._types'",
"/src/tomllib/_parser.py:7:29: Module 'collections.abc' has no member 'Iterable'",
"Line 69 is too long (89 characters)",
"Use double quotes for strings",
"Use double quotes for strings",
@@ -39,23 +32,8 @@ static EXPECTED_DIAGNOSTICS: &[&str] = &[
"Use double quotes for strings",
"Use double quotes for strings",
"Use double quotes for strings",
"/src/tomllib/_parser.py:153:22: Name 'key' used when not defined.",
"/src/tomllib/_parser.py:153:27: Name 'flag' used when not defined.",
"/src/tomllib/_parser.py:159:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:161:25: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:168:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:169:22: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:170:25: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:180:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:182:31: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:206:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:207:22: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:208:25: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:330:32: Name 'header' used when not defined.",
"/src/tomllib/_parser.py:330:41: Name 'key' used when not defined.",
"/src/tomllib/_parser.py:333:26: Name 'cont_key' used when not defined.",
"/src/tomllib/_parser.py:334:71: Name 'cont_key' used when not defined.",
"/src/tomllib/_parser.py:337:31: Name 'cont_key' used when not defined.",
"/src/tomllib/_parser.py:628:75: Name 'e' used when not defined.",
"/src/tomllib/_parser.py:686:23: Name 'parse_float' used when not defined.",
];
@@ -86,18 +64,17 @@ fn setup_case() -> Case {
.unwrap();
let src_root = SystemPath::new("/src");
let metadata = WorkspaceMetadata::from_path(src_root, &system).unwrap();
let settings = ProgramSettings {
target_version: PythonVersion::PY312,
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: src_root.to_path_buf(),
site_packages: vec![],
custom_typeshed: None,
},
};
let metadata = WorkspaceMetadata::from_path(
src_root,
&system,
Some(Configuration {
target_version: Some(PythonVersion::PY312),
..Configuration::default()
}),
)
.unwrap();
let mut db = RootDatabase::new(metadata, settings, system).unwrap();
let mut db = RootDatabase::new(metadata, system).unwrap();
let parser = system_path_to_file(&db, parser_path).unwrap();
db.workspace().open_file(&mut db, parser);
@@ -112,7 +89,25 @@ fn setup_case() -> Case {
}
}
static RAYON_INITIALIZED: std::sync::Once = std::sync::Once::new();
fn setup_rayon() {
// Initialize the rayon thread pool outside the benchmark because it has a significant cost.
// We limit the thread pool to a single thread (the current one) because we're focused on
// where red knot spends time and less on how well the code runs concurrently.
// We might want to add a benchmark focusing on concurrency to detect congestion in the future.
RAYON_INITIALIZED.call_once(|| {
ThreadPoolBuilder::new()
.num_threads(1)
.use_current_thread()
.build_global()
.unwrap();
});
}
fn benchmark_incremental(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("red_knot_check_file[incremental]", |b| {
b.iter_batched_ref(
|| {
@@ -131,10 +126,13 @@ fn benchmark_incremental(criterion: &mut Criterion) {
|case| {
let Case { db, .. } = case;
db.apply_changes(vec![ChangeEvent::Changed {
path: case.re_path.to_path_buf(),
kind: ChangedKind::FileContent,
}]);
db.apply_changes(
vec![ChangeEvent::Changed {
path: case.re_path.to_path_buf(),
kind: ChangedKind::FileContent,
}],
None,
);
let result = db.check().unwrap();
@@ -146,6 +144,8 @@ fn benchmark_incremental(criterion: &mut Criterion) {
}
fn benchmark_cold(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("red_knot_check_file[cold]", |b| {
b.iter_batched_ref(
setup_case,

View File

@@ -8,11 +8,12 @@ use salsa::{Durability, Setter};
pub use file_root::{FileRoot, FileRootKind};
pub use path::FilePath;
use ruff_notebook::{Notebook, NotebookError};
use ruff_python_ast::PySourceType;
use crate::file_revision::FileRevision;
use crate::files::file_root::FileRoots;
use crate::files::private::FileStatus;
use crate::system::{Metadata, SystemPath, SystemPathBuf, SystemVirtualPath, SystemVirtualPathBuf};
use crate::system::{SystemPath, SystemPathBuf, SystemVirtualPath, SystemVirtualPathBuf};
use crate::vendored::{VendoredPath, VendoredPathBuf};
use crate::{vendored, Db, FxDashMap};
@@ -60,8 +61,8 @@ struct FilesInner {
/// so that queries that depend on the existence of a file are re-executed when the file is created.
system_by_path: FxDashMap<SystemPathBuf, File>,
/// Lookup table that maps [`SystemVirtualPathBuf`]s to salsa interned [`File`] instances.
system_virtual_by_path: FxDashMap<SystemVirtualPathBuf, File>,
/// Lookup table that maps [`SystemVirtualPathBuf`]s to [`VirtualFile`] instances.
system_virtual_by_path: FxDashMap<SystemVirtualPathBuf, VirtualFile>,
/// Lookup table that maps vendored files to the salsa [`File`] ingredients.
vendored_by_path: FxDashMap<VendoredPathBuf, File>,
@@ -147,31 +148,31 @@ impl Files {
Ok(file)
}
/// Looks up a virtual file by its `path`.
/// Creates a new virtual file at the given path and stores it for future lookups.
///
/// For a non-existing file, creates a new salsa [`File`] ingredient and stores it for future lookups.
///
/// The operation fails if the system fails to provide metadata for the path.
pub fn add_virtual_file(&self, db: &dyn Db, path: &SystemVirtualPath) -> Option<File> {
let file = match self.inner.system_virtual_by_path.entry(path.to_path_buf()) {
Entry::Occupied(entry) => *entry.get(),
Entry::Vacant(entry) => {
let metadata = db.system().virtual_path_metadata(path).ok()?;
/// This will always create a new file, overwriting any existing file at `path` in the internal
/// storage.
pub fn virtual_file(&self, db: &dyn Db, path: &SystemVirtualPath) -> VirtualFile {
tracing::trace!("Adding virtual file {}", path);
let virtual_file = VirtualFile(
File::builder(FilePath::SystemVirtual(path.to_path_buf()))
.status(FileStatus::Exists)
.revision(FileRevision::zero())
.permissions(None)
.new(db),
);
self.inner
.system_virtual_by_path
.insert(path.to_path_buf(), virtual_file);
virtual_file
}
tracing::trace!("Adding virtual file '{}'", path);
let file = File::builder(FilePath::SystemVirtual(path.to_path_buf()))
.revision(metadata.revision())
.permissions(metadata.permissions())
.new(db);
entry.insert(file);
file
}
};
Some(file)
/// Tries to look up a virtual file by its path. Returns `None` if no such file exists yet.
pub fn try_virtual_file(&self, path: &SystemVirtualPath) -> Option<VirtualFile> {
self.inner
.system_virtual_by_path
.get(&path.to_path_buf())
.map(|entry| *entry.value())
}
/// Looks up the closest root for `path`. Returns `None` if `path` isn't enclosed by any source root.
@@ -318,6 +319,9 @@ impl File {
}
FilePath::Vendored(vendored) => db.vendored().read_to_string(vendored),
FilePath::SystemVirtual(system_virtual) => {
// Add a dependency on the revision to ensure the operation gets re-executed when the file changes.
let _ = self.revision(db);
db.system().read_virtual_path_to_string(system_virtual)
}
}
@@ -342,6 +346,9 @@ impl File {
"Reading a notebook from the vendored file system is not supported.",
))),
FilePath::SystemVirtual(system_virtual) => {
// Add a dependency on the revision to ensure the operation gets re-executed when the file changes.
let _ = self.revision(db);
db.system().read_virtual_path_to_notebook(system_virtual)
}
}
@@ -354,6 +361,13 @@ impl File {
Self::sync_system_path(db, &absolute, None);
}
/// Increments the revision for the virtual file at `path`.
pub fn sync_virtual_path(db: &mut dyn Db, path: &SystemVirtualPath) {
if let Some(virtual_file) = db.files().try_virtual_file(path) {
virtual_file.sync(db);
}
}
/// Syncs the [`File`]'s state with the state of the file on the system.
pub fn sync(self, db: &mut dyn Db) {
let path = self.path(db).clone();
@@ -366,29 +380,20 @@ impl File {
FilePath::Vendored(_) => {
// Readonly, can never be out of date.
}
FilePath::SystemVirtual(system_virtual) => {
Self::sync_system_virtual_path(db, &system_virtual, self);
FilePath::SystemVirtual(_) => {
VirtualFile(self).sync(db);
}
}
}
/// Private method providing the implementation for [`Self::sync_path`] and [`Self::sync`] for
/// system paths.
fn sync_system_path(db: &mut dyn Db, path: &SystemPath, file: Option<File>) {
let Some(file) = file.or_else(|| db.files().try_system(db, path)) else {
return;
};
let metadata = db.system().path_metadata(path);
Self::sync_impl(db, metadata, file);
}
fn sync_system_virtual_path(db: &mut dyn Db, path: &SystemVirtualPath, file: File) {
let metadata = db.system().virtual_path_metadata(path);
Self::sync_impl(db, metadata, file);
}
/// Private method providing the implementation for [`Self::sync_system_path`] and
/// [`Self::sync_system_virtual_path`].
fn sync_impl(db: &mut dyn Db, metadata: crate::system::Result<Metadata>, file: File) {
let (status, revision, permission) = match metadata {
let (status, revision, permission) = match db.system().path_metadata(path) {
Ok(metadata) if metadata.file_type().is_file() => (
FileStatus::Exists,
metadata.revision(),
@@ -420,6 +425,42 @@ impl File {
pub fn exists(self, db: &dyn Db) -> bool {
self.status(db) == FileStatus::Exists
}
/// Returns `true` if the file should be analyzed as a type stub.
pub fn is_stub(self, db: &dyn Db) -> bool {
self.path(db)
.extension()
.is_some_and(|extension| PySourceType::from_extension(extension).is_stub())
}
}
/// A virtual file that doesn't exist on the file system.
///
/// This is a wrapper around a [`File`] that provides additional methods to interact with a virtual
/// file.
#[derive(Copy, Clone)]
pub struct VirtualFile(File);
impl VirtualFile {
/// Returns the underlying [`File`].
pub fn file(&self) -> File {
self.0
}
/// Increments the revision of the underlying [`File`].
fn sync(&self, db: &mut dyn Db) {
let file = self.0;
tracing::debug!("Updating the revision of '{}'", file.path(db));
let current_revision = file.revision(db);
file.set_revision(db)
.to(FileRevision::new(current_revision.as_u128() + 1));
}
/// Closes the virtual file.
pub fn close(&self, db: &mut dyn Db) {
tracing::debug!("Closing virtual file '{}'", self.0.path(db));
self.0.set_status(db).to(FileStatus::NotFound);
}
}
// The types in here need to be public because they're salsa ingredients but we
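
A self-contained model of the `VirtualFile` lifecycle sketched above (stand-in types, not the real salsa ingredients): `sync` only bumps the revision so dependent queries re-execute, and `close` flips the status to not-found rather than removing the entry.

```rust
#[derive(Debug, PartialEq)]
enum FileStatus {
    Exists,
    NotFound,
}

struct VirtualFile {
    revision: u128,
    status: FileStatus,
}

impl VirtualFile {
    fn new() -> Self {
        Self {
            revision: 0,
            status: FileStatus::Exists,
        }
    }

    /// Increments the revision: "content changed, re-read me".
    fn sync(&mut self) {
        self.revision += 1;
    }

    /// Marks the file as gone, e.g. the client closed an untitled buffer.
    fn close(&mut self) {
        self.status = FileStatus::NotFound;
    }
}

fn main() {
    let mut file = VirtualFile::new();
    file.sync();
    assert_eq!(file.revision, 1);
    file.close();
    assert_eq!(file.status, FileStatus::NotFound);
}
```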

View File

@@ -121,9 +121,9 @@ mod tests {
db.write_virtual_file(path, "x = 10");
let file = db.files().add_virtual_file(&db, path).unwrap();
let virtual_file = db.files().virtual_file(&db, path);
let parsed = parsed_module(&db, file);
let parsed = parsed_module(&db, virtual_file.file());
assert!(parsed.is_valid());
@@ -137,9 +137,9 @@ mod tests {
db.write_virtual_file(path, "%timeit a = b");
let file = db.files().add_virtual_file(&db, path).unwrap();
let virtual_file = db.files().virtual_file(&db, path);
let parsed = parsed_module(&db, file);
let parsed = parsed_module(&db, virtual_file.file());
assert!(parsed.is_valid());

View File

@@ -63,9 +63,6 @@ pub trait System: Debug {
/// representation fall-back to deserializing the notebook from a string.
fn read_to_notebook(&self, path: &SystemPath) -> std::result::Result<Notebook, NotebookError>;
/// Reads the metadata of the virtual file at `path`.
fn virtual_path_metadata(&self, path: &SystemVirtualPath) -> Result<Metadata>;
/// Reads the content of the virtual file at `path` into a [`String`].
fn read_virtual_path_to_string(&self, path: &SystemVirtualPath) -> Result<String>;

View File

@@ -136,22 +136,6 @@ impl MemoryFileSystem {
ruff_notebook::Notebook::from_source_code(&content)
}
pub(crate) fn virtual_path_metadata(
&self,
path: impl AsRef<SystemVirtualPath>,
) -> Result<Metadata> {
let virtual_files = self.inner.virtual_files.read().unwrap();
let file = virtual_files
.get(&path.as_ref().to_path_buf())
.ok_or_else(not_found)?;
Ok(Metadata {
revision: file.last_modified.into(),
permissions: Some(MemoryFileSystem::PERMISSION),
file_type: FileType::File,
})
}
pub(crate) fn read_virtual_path_to_string(
&self,
path: impl AsRef<SystemVirtualPath>,

View File

@@ -77,10 +77,6 @@ impl System for OsSystem {
Notebook::from_path(path.as_std_path())
}
fn virtual_path_metadata(&self, _path: &SystemVirtualPath) -> Result<Metadata> {
Err(not_found())
}
fn read_virtual_path_to_string(&self, _path: &SystemVirtualPath) -> Result<String> {
Err(not_found())
}

View File

@@ -69,13 +69,6 @@ impl System for TestSystem {
}
}
fn virtual_path_metadata(&self, path: &SystemVirtualPath) -> Result<Metadata> {
match &self.inner {
TestSystemInner::Stub(fs) => fs.virtual_path_metadata(path),
TestSystemInner::System(system) => system.virtual_path_metadata(path),
}
}
fn read_virtual_path_to_string(&self, path: &SystemVirtualPath) -> Result<String> {
match &self.inner {
TestSystemInner::Stub(fs) => fs.read_virtual_path_to_string(path),

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.6.1"
version = "0.6.3"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -17,7 +17,7 @@ app = FastAPI()
router = APIRouter()
# Errors
# Fixable errors
@app.get("/items/")
def get_items(
@@ -40,6 +40,34 @@ def do_stuff(
# do stuff
pass
@app.get("/users/")
def get_users(
skip: int,
limit: int,
current_user: User = Depends(get_current_user),
):
pass
@app.get("/users/")
def get_users(
current_user: User = Depends(get_current_user),
skip: int = 0,
limit: int = 10,
):
pass
# Non fixable errors
@app.get("/users/")
def get_users(
skip: int = 0,
limit: int = 10,
current_user: User = Depends(get_current_user),
):
pass
# Unchanged

View File

@@ -79,3 +79,15 @@ _ = f"a {f"first"
+ f"second"} d"
_ = f"a {f"first {f"middle"}"
+ f"second"} d"
# See https://github.com/astral-sh/ruff/issues/12936
_ = "\12""0" # fix should be "\0120"
_ = "\\12""0" # fix should be "\\120"
_ = "\\\12""0" # fix should be "\\\0120"
_ = "\12 0""0" # fix should be "\12 00"
_ = r"\12"r"0" # fix should be r"\120"
_ = "\12 and more""0" # fix should be "\12 and more0"
_ = "\8""0" # fix should be "\80"
_ = "\12""8" # fix should be "\128"
_ = "\12""foo" # fix should be "\12foo"
_ = "\12" "" # fix should be "\12"

View File

@@ -47,7 +47,6 @@ with contextlib.ExitStack() as exit_stack:
open("filename", "w").close()
pathlib.Path("filename").open("w").close()
# OK (custom context manager)
class MyFile:
def __init__(self, filename: str):
@@ -58,3 +57,202 @@ class MyFile:
def __exit__(self, exc_type, exc_val, exc_tb):
self.file.close()
import tempfile
import tarfile
from tarfile import TarFile
import zipfile
import io
import codecs
import bz2
import gzip
import dbm
import dbm.gnu
import dbm.ndbm
import dbm.dumb
import lzma
import shelve
import tokenize
import wave
import fileinput
f = tempfile.NamedTemporaryFile()
f = tempfile.TemporaryFile()
f = tempfile.SpooledTemporaryFile()
f = tarfile.open("foo.tar")
f = TarFile("foo.tar").open()
f = tarfile.TarFile("foo.tar").open()
f = tarfile.TarFile().open()
f = zipfile.ZipFile("foo.zip").open("foo.txt")
f = io.open("foo.txt")
f = io.open_code("foo.txt")
f = codecs.open("foo.txt")
f = bz2.open("foo.txt")
f = gzip.open("foo.txt")
f = dbm.open("foo.db")
f = dbm.gnu.open("foo.db")
f = dbm.ndbm.open("foo.db")
f = dbm.dumb.open("foo.db")
f = lzma.open("foo.xz")
f = lzma.LZMAFile("foo.xz")
f = shelve.open("foo.db")
f = tokenize.open("foo.py")
f = wave.open("foo.wav")
f = tarfile.TarFile.taropen("foo.tar")
f = fileinput.input("foo.txt")
f = fileinput.FileInput("foo.txt")
with contextlib.suppress(Exception):
# The following line is for example's sake.
# For some f's above, this would raise an error (since it'd be f.readline() etc.)
data = f.read()
f.close()
# OK
with tempfile.TemporaryFile() as f:
data = f.read()
# OK
with tarfile.open("foo.tar") as f:
data = f.add("foo.txt")
# OK
with tarfile.TarFile("foo.tar") as f:
data = f.add("foo.txt")
# OK
with tarfile.TarFile("foo.tar").open() as f:
data = f.add("foo.txt")
# OK
with zipfile.ZipFile("foo.zip") as f:
data = f.read("foo.txt")
# OK
with zipfile.ZipFile("foo.zip").open("foo.txt") as f:
data = f.read()
# OK
with zipfile.ZipFile("foo.zip") as zf:
with zf.open("foo.txt") as f:
data = f.read()
# OK
with io.open("foo.txt") as f:
data = f.read()
# OK
with io.open_code("foo.txt") as f:
data = f.read()
# OK
with codecs.open("foo.txt") as f:
data = f.read()
# OK
with bz2.open("foo.txt") as f:
data = f.read()
# OK
with gzip.open("foo.txt") as f:
data = f.read()
# OK
with dbm.open("foo.db") as f:
data = f.get("foo")
# OK
with dbm.gnu.open("foo.db") as f:
data = f.get("foo")
# OK
with dbm.ndbm.open("foo.db") as f:
data = f.get("foo")
# OK
with dbm.dumb.open("foo.db") as f:
data = f.get("foo")
# OK
with lzma.open("foo.xz") as f:
data = f.read()
# OK
with lzma.LZMAFile("foo.xz") as f:
data = f.read()
# OK
with shelve.open("foo.db") as f:
data = f["foo"]
# OK
with tokenize.open("foo.py") as f:
data = f.read()
# OK
with wave.open("foo.wav") as f:
data = f.readframes(1024)
# OK
with tarfile.TarFile.taropen("foo.tar") as f:
data = f.add("foo.txt")
# OK
with fileinput.input("foo.txt") as f:
data = f.readline()
# OK
with fileinput.FileInput("foo.txt") as f:
data = f.readline()
# OK (quick one-liner to clear file contents)
tempfile.NamedTemporaryFile().close()
tempfile.TemporaryFile().close()
tempfile.SpooledTemporaryFile().close()
tarfile.open("foo.tar").close()
tarfile.TarFile("foo.tar").close()
tarfile.TarFile("foo.tar").open().close()
tarfile.TarFile.open("foo.tar").close()
zipfile.ZipFile("foo.zip").close()
zipfile.ZipFile("foo.zip").open("foo.txt").close()
io.open("foo.txt").close()
io.open_code("foo.txt").close()
codecs.open("foo.txt").close()
bz2.open("foo.txt").close()
gzip.open("foo.txt").close()
dbm.open("foo.db").close()
dbm.gnu.open("foo.db").close()
dbm.ndbm.open("foo.db").close()
dbm.dumb.open("foo.db").close()
lzma.open("foo.xz").close()
lzma.LZMAFile("foo.xz").close()
shelve.open("foo.db").close()
tokenize.open("foo.py").close()
wave.open("foo.wav").close()
tarfile.TarFile.taropen("foo.tar").close()
fileinput.input("foo.txt").close()
fileinput.FileInput("foo.txt").close()
def aliased():
from shelve import open as open_shelf
x = open_shelf("foo.dbm")
x.close()
from tarfile import TarFile as TF
f = TF("foo").open()
f.close()
import dbm.sqlite3
# OK
with dbm.sqlite3.open("foo.db") as f:
print(f.keys())
# OK
dbm.sqlite3.open("foo.db").close()
# SIM115
f = dbm.sqlite3.open("foo.db")
f.close()

View File

@@ -0,0 +1,75 @@
from contextlib import contextmanager
l = 0
I = 0
O = 0
l: int = 0
a, l = 0, 1
[a, l] = 0, 1
a, *l = 0, 1, 2
a = l = 0
o = 0
i = 0
for l in range(3):
pass
for a, l in zip(range(3), range(3)):
pass
def f1():
global l
l = 0
def f2():
l = 0
def f3():
nonlocal l
l = 1
f3()
return l
def f4(l, /, I):
return l, I, O
def f5(l=0, *, I=1):
return l, I
def f6(*l, **I):
return l, I
@contextmanager
def ctx1():
yield 0
with ctx1() as l:
pass
@contextmanager
def ctx2():
yield 0, 1
with ctx2() as (a, l):
pass
try:
pass
except ValueError as l:
pass
if (l := 5) > 0:
pass

View File

@@ -119,3 +119,91 @@ class A(metaclass=abc.abcmeta):
def f(self):
"""Lorem ipsum."""
return True
# OK - implicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return
print(obj)
# OK - explicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return None
print(obj)
# OK - explicit None early return w/o useful type annotations
def foo(obj):
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return None
print(obj)
# OK - multiple explicit None early returns
def foo(obj: object) -> None:
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return None
if obj == "None":
return
if obj == 0:
return None
print(obj)
# DOC201 - non-early return explicit None
def foo(x: int) -> int | None:
"""A very helpful docstring.
Args:
x (int): An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - non-early return explicit None w/o useful type annotations
def foo(x):
"""A very helpful docstring.
Args:
x (int): An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - only returns None, but return annotation is not None
def foo(s: str) -> str | None:
"""A very helpful docstring.
Args:
s (str): A string.
"""
return None

View File

@@ -85,3 +85,105 @@ class A(metaclass=abc.abcmeta):
def f(self):
"""Lorem ipsum."""
return True
# OK - implicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return
print(obj)
# OK - explicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return None
print(obj)
# OK - explicit None early return w/o useful type annotations
def foo(obj):
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return None
print(obj)
# OK - multiple explicit None early returns
def foo(obj: object) -> None:
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return None
if obj == "None":
return
if obj == 0:
return None
print(obj)
# DOC201 - non-early return explicit None
def foo(x: int) -> int | None:
"""A very helpful docstring.
Parameters
----------
x : int
An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - non-early return explicit None w/o useful type annotations
def foo(x):
"""A very helpful docstring.
Parameters
----------
x : int
An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - only returns None, but return annotation is not None
def foo(s: str) -> str | None:
"""A very helpful docstring.
Parameters
----------
s : str
A string.
"""
return None

View File

@@ -42,3 +42,16 @@ max(1, max(*a))
import builtins
builtins.min(1, min(2, 3))
# PLW3301
max_word_len = max(
max(len(word) for word in "blah blah blah".split(" ")),
len("Done!"),
)
# OK
max_word_len = max(
*(len(word) for word in "blah blah blah".split(" ")),
len("Done!"),
)

View File

@@ -9,3 +9,13 @@ dictionary = {
#import os # noqa: E501
def f():
data = 1
# line below should autofix to `return data # fmt: skip`
return data # noqa: RET504 # fmt: skip
def f():
data = 1
# line below should autofix to `return data`
return data # noqa: RET504 - intentionally incorrect noqa, will be removed

View File

@@ -70,11 +70,11 @@ pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &mut Check
}
if let Some(name) = name {
if checker.enabled(Rule::AmbiguousVariableName) {
if let Some(diagnostic) =
pycodestyle::rules::ambiguous_variable_name(name.as_str(), name.range())
{
checker.diagnostics.push(diagnostic);
}
pycodestyle::rules::ambiguous_variable_name(
checker,
name.as_str(),
name.range(),
);
}
if checker.enabled(Rule::BuiltinVariableShadowing) {
flake8_builtins::rules::builtin_variable_shadowing(checker, name, name.range());

View File

@@ -259,11 +259,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
}
if checker.enabled(Rule::AmbiguousVariableName) {
if let Some(diagnostic) =
pycodestyle::rules::ambiguous_variable_name(id, expr.range())
{
checker.diagnostics.push(diagnostic);
}
pycodestyle::rules::ambiguous_variable_name(checker, id, expr.range());
}
if !checker.semantic.current_scope().kind.is_class() {
if checker.enabled(Rule::BuiltinVariableShadowing) {
@@ -883,7 +879,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_simplify::rules::use_capital_environment_variables(checker, expr);
}
if checker.enabled(Rule::OpenFileWithContextHandler) {
flake8_simplify::rules::open_file_with_context_handler(checker, func);
flake8_simplify::rules::open_file_with_context_handler(checker, call);
}
if checker.enabled(Rule::DictGetWithNoneDefault) {
flake8_simplify::rules::dict_get_with_none_default(checker, expr);

View File

@@ -8,11 +8,11 @@ use crate::rules::{flake8_builtins, pep8_naming, pycodestyle};
/// Run lint rules over a [`Parameter`] syntax node.
pub(crate) fn parameter(parameter: &Parameter, checker: &mut Checker) {
if checker.enabled(Rule::AmbiguousVariableName) {
if let Some(diagnostic) =
pycodestyle::rules::ambiguous_variable_name(&parameter.name, parameter.name.range())
{
checker.diagnostics.push(diagnostic);
}
pycodestyle::rules::ambiguous_variable_name(
checker,
&parameter.name,
parameter.name.range(),
);
}
if checker.enabled(Rule::InvalidArgumentName) {
if let Some(diagnostic) = pep8_naming::rules::invalid_argument_name(

View File

@@ -24,16 +24,16 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pylint::rules::global_at_module_level(checker, stmt);
}
if checker.enabled(Rule::AmbiguousVariableName) {
checker.diagnostics.extend(names.iter().filter_map(|name| {
pycodestyle::rules::ambiguous_variable_name(name, name.range())
}));
for name in names {
pycodestyle::rules::ambiguous_variable_name(checker, name, name.range());
}
}
}
Stmt::Nonlocal(nonlocal @ ast::StmtNonlocal { names, range: _ }) => {
if checker.enabled(Rule::AmbiguousVariableName) {
checker.diagnostics.extend(names.iter().filter_map(|name| {
pycodestyle::rules::ambiguous_variable_name(name, name.range())
}));
for name in names {
pycodestyle::rules::ambiguous_variable_name(checker, name, name.range());
}
}
if checker.enabled(Rule::NonlocalWithoutBinding) {
if !checker.semantic.scope_id.is_global() {

View File

@@ -118,10 +118,10 @@ pub(crate) fn check_noqa(
match &line.directive {
Directive::All(directive) => {
if line.matches.is_empty() {
let edit = delete_comment(directive.range(), locator);
let mut diagnostic =
Diagnostic::new(UnusedNOQA { codes: None }, directive.range());
diagnostic
.set_fix(Fix::safe_edit(delete_comment(directive.range(), locator)));
diagnostic.set_fix(Fix::safe_edit(edit));
diagnostics.push(diagnostic);
}
@@ -172,6 +172,14 @@ pub(crate) fn check_noqa(
&& unknown_codes.is_empty()
&& unmatched_codes.is_empty())
{
let edit = if valid_codes.is_empty() {
delete_comment(directive.range(), locator)
} else {
Edit::range_replacement(
format!("# noqa: {}", valid_codes.join(", ")),
directive.range(),
)
};
let mut diagnostic = Diagnostic::new(
UnusedNOQA {
codes: Some(UnusedCodes {
@@ -195,17 +203,7 @@ pub(crate) fn check_noqa(
},
directive.range(),
);
if valid_codes.is_empty() {
diagnostic.set_fix(Fix::safe_edit(delete_comment(
directive.range(),
locator,
)));
} else {
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
format!("# noqa: {}", valid_codes.join(", ")),
directive.range(),
)));
}
diagnostic.set_fix(Fix::safe_edit(edit));
diagnostics.push(diagnostic);
}
}

View File

@@ -99,11 +99,8 @@ pub(crate) fn delete_comment(range: TextRange, locator: &Locator) -> Edit {
}
// Ex) `x = 1 # noqa here`
else {
// Replace `# noqa here` with `# here`.
Edit::range_replacement(
"# ".to_string(),
TextRange::new(range.start(), range.end() + trailing_space_len),
)
// Remove `# noqa here` and whitespace
Edit::deletion(range.start() - leading_space_len, line_range.end())
}
}
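
A rough model of the new deletion behaviour in the `else` branch above, using plain string offsets instead of the real `Locator`/`TextRange` types (`delete_noqa` is a hypothetical stand-in; the real code also handles own-line comments and preserves comments that follow on the same line):

```rust
/// Hypothetical model: drop everything from the `# noqa` directive to the end
/// of the line, including the whitespace that precedes the `#`.
fn delete_noqa(line: &str, noqa_start: usize) -> String {
    line[..noqa_start].trim_end().to_string()
}

fn main() {
    let line = "x = 1  # noqa here";
    let noqa_start = line.find('#').unwrap();
    // The old behaviour rewrote this to "x = 1  # here"; the new edit deletes
    // the directive and its trailing text outright.
    assert_eq!(delete_noqa(line, noqa_start), "x = 1");
}
```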

View File

@@ -1,4 +1,4 @@
use ruff_diagnostics::{AlwaysFixableViolation, Diagnostic, Edit, Fix};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast as ast;
use ruff_python_ast::helpers::map_callable;
@@ -59,14 +59,16 @@ use crate::settings::types::PythonVersion;
#[violation]
pub struct FastApiNonAnnotatedDependency;
impl AlwaysFixableViolation for FastApiNonAnnotatedDependency {
impl Violation for FastApiNonAnnotatedDependency {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
#[derive_message_formats]
fn message(&self) -> String {
format!("FastAPI dependency without `Annotated`")
}
fn fix_title(&self) -> String {
"Replace with `Annotated`".to_string()
fn fix_title(&self) -> Option<String> {
Some("Replace with `Annotated`".to_string())
}
}
@@ -75,64 +77,95 @@ pub(crate) fn fastapi_non_annotated_dependency(
checker: &mut Checker,
function_def: &ast::StmtFunctionDef,
) {
if !checker.semantic().seen_module(Modules::FASTAPI) {
return;
}
if !is_fastapi_route(function_def, checker.semantic()) {
if !checker.semantic().seen_module(Modules::FASTAPI)
|| !is_fastapi_route(function_def, checker.semantic())
{
return;
}
let mut updatable_count = 0;
let mut has_non_updatable_default = false;
let total_params = function_def.parameters.args.len();
for parameter in &function_def.parameters.args {
if let (Some(annotation), Some(default)) =
(&parameter.parameter.annotation, &parameter.default)
{
if checker
.semantic()
.resolve_qualified_name(map_callable(default))
.is_some_and(|qualified_name| {
matches!(
qualified_name.segments(),
[
"fastapi",
"Query"
| "Path"
| "Body"
| "Cookie"
| "Header"
| "File"
| "Form"
| "Depends"
| "Security"
]
)
})
{
let mut diagnostic =
Diagnostic::new(FastApiNonAnnotatedDependency, parameter.range);
let needs_update = matches!(
(&parameter.parameter.annotation, &parameter.default),
(Some(_annotation), Some(default)) if is_fastapi_dependency(checker, default)
);
diagnostic.try_set_fix(|| {
let module = if checker.settings.target_version >= PythonVersion::Py39 {
"typing"
} else {
"typing_extensions"
};
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, "Annotated"),
function_def.start(),
checker.semantic(),
)?;
let content = format!(
"{}: {}[{}, {}]",
parameter.parameter.name.id,
binding,
checker.locator().slice(annotation.range()),
checker.locator().slice(default.range())
);
let parameter_edit = Edit::range_replacement(content, parameter.range());
Ok(Fix::unsafe_edits(import_edit, [parameter_edit]))
});
checker.diagnostics.push(diagnostic);
}
if needs_update {
updatable_count += 1;
// Determine if it's safe to update this parameter:
// - if all parameters are updatable, it's safe.
// - if we've encountered a non-updatable parameter with a default value, it's no longer
// safe. (https://github.com/astral-sh/ruff/issues/12982)
let safe_to_update = updatable_count == total_params || !has_non_updatable_default;
create_diagnostic(checker, parameter, safe_to_update);
} else if parameter.default.is_some() {
has_non_updatable_default = true;
}
}
}
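
A minimal, self-contained model of the fix-safety rule in the loop above (`Param` and `fixable` are hypothetical stand-ins for the checker types): a `Depends(...)`-style parameter is only rewritten when no non-updatable defaulted parameter precedes it, since stripping its default would otherwise leave a non-default parameter after a defaulted one, which is invalid Python syntax.

```rust
#[derive(Clone, Copy)]
enum Param {
    Dependency, // `x: T = Depends(...)`: updatable
    Defaulted,  // `x: T = 0`: not updatable, has a default
    Plain,      // `x: T`: no default
}

/// For each parameter, report whether a fix would be offered.
fn fixable(params: &[Param]) -> Vec<bool> {
    let total = params.len();
    let mut updatable = 0;
    let mut has_non_updatable_default = false;
    let mut out = Vec::new();
    for &p in params {
        match p {
            Param::Dependency => {
                updatable += 1;
                out.push(updatable == total || !has_non_updatable_default);
            }
            Param::Defaulted => {
                has_non_updatable_default = true;
                out.push(false);
            }
            Param::Plain => out.push(false),
        }
    }
    out
}

fn main() {
    // `skip: int, limit: int, current_user = Depends(...)`: fixable.
    assert_eq!(
        fixable(&[Param::Plain, Param::Plain, Param::Dependency]),
        [false, false, true]
    );
    // `skip: int = 0, limit: int = 10, current_user = Depends(...)`: not fixable.
    assert_eq!(
        fixable(&[Param::Defaulted, Param::Defaulted, Param::Dependency]),
        [false, false, false]
    );
}
```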
fn is_fastapi_dependency(checker: &Checker, expr: &ast::Expr) -> bool {
checker
.semantic()
.resolve_qualified_name(map_callable(expr))
.is_some_and(|qualified_name| {
matches!(
qualified_name.segments(),
[
"fastapi",
"Query"
| "Path"
| "Body"
| "Cookie"
| "Header"
| "File"
| "Form"
| "Depends"
| "Security"
]
)
})
}
fn create_diagnostic(
checker: &mut Checker,
parameter: &ast::ParameterWithDefault,
safe_to_update: bool,
) {
let mut diagnostic = Diagnostic::new(FastApiNonAnnotatedDependency, parameter.range);
if safe_to_update {
if let (Some(annotation), Some(default)) =
(&parameter.parameter.annotation, &parameter.default)
{
diagnostic.try_set_fix(|| {
let module = if checker.settings.target_version >= PythonVersion::Py39 {
"typing"
} else {
"typing_extensions"
};
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, "Annotated"),
parameter.range.start(),
checker.semantic(),
)?;
let content = format!(
"{}: {}[{}, {}]",
parameter.parameter.name.id,
binding,
checker.locator().slice(annotation.range()),
checker.locator().slice(default.range())
);
let parameter_edit = Edit::range_replacement(content, parameter.range);
Ok(Fix::unsafe_edits(import_edit, [parameter_edit]))
});
}
} else {
diagnostic.fix = None;
}
checker.diagnostics.push(diagnostic);
}

View File

@@ -261,3 +261,72 @@ FAST002.py:38:5: FAST002 [*] FastAPI dependency without `Annotated`
39 40 | ):
40 41 | # do stuff
41 42 | pass
FAST002.py:47:5: FAST002 [*] FastAPI dependency without `Annotated`
|
45 | skip: int,
46 | limit: int,
47 | current_user: User = Depends(get_current_user),
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ FAST002
48 | ):
49 | pass
|
= help: Replace with `Annotated`
Unsafe fix
12 12 | Security,
13 13 | )
14 14 | from pydantic import BaseModel
15 |+from typing import Annotated
15 16 |
16 17 | app = FastAPI()
17 18 | router = APIRouter()
--------------------------------------------------------------------------------
44 45 | def get_users(
45 46 | skip: int,
46 47 | limit: int,
47 |- current_user: User = Depends(get_current_user),
48 |+ current_user: Annotated[User, Depends(get_current_user)],
48 49 | ):
49 50 | pass
50 51 |
FAST002.py:53:5: FAST002 [*] FastAPI dependency without `Annotated`
|
51 | @app.get("/users/")
52 | def get_users(
53 | current_user: User = Depends(get_current_user),
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ FAST002
54 | skip: int = 0,
55 | limit: int = 10,
|
= help: Replace with `Annotated`
Unsafe fix
12 12 | Security,
13 13 | )
14 14 | from pydantic import BaseModel
15 |+from typing import Annotated
15 16 |
16 17 | app = FastAPI()
17 18 | router = APIRouter()
--------------------------------------------------------------------------------
50 51 |
51 52 | @app.get("/users/")
52 53 | def get_users(
53 |- current_user: User = Depends(get_current_user),
54 |+ current_user: Annotated[User, Depends(get_current_user)],
54 55 | skip: int = 0,
55 56 | limit: int = 10,
56 57 | ):
FAST002.py:67:5: FAST002 FastAPI dependency without `Annotated`
|
65 | skip: int = 0,
66 | limit: int = 10,
67 | current_user: User = Depends(get_current_user),
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ FAST002
68 | ):
69 | pass
|
= help: Replace with `Annotated`

View File

@@ -11,6 +11,7 @@ mod tests {
use crate::assert_messages;
use crate::registry::Rule;
use crate::settings::types::PythonVersion;
use crate::settings::LinterSettings;
use crate::test::test_path;
@@ -36,4 +37,18 @@ mod tests {
assert_messages!(snapshot, diagnostics);
Ok(())
}
#[test_case(Path::new("ASYNC109_0.py"); "asyncio")]
#[test_case(Path::new("ASYNC109_1.py"); "trio")]
fn async109_python_310_or_older(path: &Path) -> Result<()> {
let diagnostics = test_path(
Path::new("flake8_async").join(path),
&LinterSettings {
target_version: PythonVersion::Py310,
..LinterSettings::for_rule(Rule::AsyncFunctionWithTimeout)
},
)?;
assert_messages!(path.file_name().unwrap().to_str().unwrap(), diagnostics);
Ok(())
}
}

View File

@@ -6,6 +6,7 @@ use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
use crate::rules::flake8_async::helpers::AsyncModule;
use crate::settings::types::PythonVersion;
/// ## What it does
/// Checks for `async` functions with a `timeout` argument.
@@ -86,6 +87,11 @@ pub(crate) fn async_function_with_timeout(
AsyncModule::AsyncIo
};
// The `asyncio.timeout` feature was first introduced in Python 3.11
if module == AsyncModule::AsyncIo && checker.settings.target_version < PythonVersion::Py311 {
return;
}
checker.diagnostics.push(Diagnostic::new(
AsyncFunctionWithTimeout { module },
timeout.range(),

View File

@@ -0,0 +1,18 @@
---
source: crates/ruff_linter/src/rules/flake8_async/mod.rs
---
ASYNC109_0.py:8:16: ASYNC109 Async function definition with a `timeout` parameter
|
8 | async def func(timeout):
| ^^^^^^^ ASYNC109
9 | ...
|
= help: Use `trio.fail_after` instead
ASYNC109_0.py:12:16: ASYNC109 Async function definition with a `timeout` parameter
|
12 | async def func(timeout=10):
| ^^^^^^^^^^ ASYNC109
13 | ...
|
= help: Use `trio.fail_after` instead

View File

@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/flake8_async/mod.rs
---

View File

@@ -1,3 +1,5 @@
use std::borrow::Cow;
use itertools::Itertools;
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
@@ -172,9 +174,16 @@ fn concatenate_strings(a_range: TextRange, b_range: TextRange, locator: &Locator
return None;
}
let a_body = &a_text[a_leading_quote.len()..a_text.len() - a_trailing_quote.len()];
let mut a_body =
Cow::Borrowed(&a_text[a_leading_quote.len()..a_text.len() - a_trailing_quote.len()]);
let b_body = &b_text[b_leading_quote.len()..b_text.len() - b_trailing_quote.len()];
if a_leading_quote.find(['r', 'R']).is_none()
&& matches!(b_body.bytes().next(), Some(b'0'..=b'7'))
{
normalize_ending_octal(&mut a_body);
}
let concatenation = format!("{a_leading_quote}{a_body}{b_body}{a_trailing_quote}");
let range = TextRange::new(a_range.start(), b_range.end());
@@ -183,3 +192,39 @@ fn concatenate_strings(a_range: TextRange, b_range: TextRange, locator: &Locator
range,
)))
}
/// Pads an octal escape at the end of the string to three digits, if necessary.
fn normalize_ending_octal(text: &mut Cow<'_, str>) {
// Early return for short strings
if text.len() < 2 {
return;
}
let mut rev_bytes = text.bytes().rev();
if let Some(last_byte @ b'0'..=b'7') = rev_bytes.next() {
// "\y" -> "\00y"
if has_odd_consecutive_backslashes(&mut rev_bytes.clone()) {
let prefix = &text[..text.len() - 2];
*text = Cow::Owned(format!("{prefix}\\00{}", last_byte as char));
}
// "\xy" -> "\0xy"
else if let Some(penultimate_byte @ b'0'..=b'7') = rev_bytes.next() {
if has_odd_consecutive_backslashes(&mut rev_bytes.clone()) {
let prefix = &text[..text.len() - 3];
*text = Cow::Owned(format!(
"{prefix}\\0{}{}",
penultimate_byte as char, last_byte as char
));
}
}
}
}
fn has_odd_consecutive_backslashes(mut itr: impl Iterator<Item = u8>) -> bool {
let mut odd_backslashes = false;
while let Some(b'\\') = itr.next() {
odd_backslashes = !odd_backslashes;
}
odd_backslashes
}
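
The padding logic above, lifted into a self-contained program so the fixture expectations can be checked directly (for example, concatenating `"\12"` and `"0"` must yield `"\0120"`, not `"\120"`, because `\120` is a different octal escape):

```rust
use std::borrow::Cow;

fn has_odd_consecutive_backslashes(mut itr: impl Iterator<Item = u8>) -> bool {
    let mut odd_backslashes = false;
    while let Some(b'\\') = itr.next() {
        odd_backslashes = !odd_backslashes;
    }
    odd_backslashes
}

/// Pads an octal escape at the end of the string to three digits, if necessary.
fn normalize_ending_octal(text: &mut Cow<'_, str>) {
    if text.len() < 2 {
        return;
    }
    let mut rev_bytes = text.bytes().rev();
    if let Some(last_byte @ b'0'..=b'7') = rev_bytes.next() {
        // "\y" -> "\00y"
        if has_odd_consecutive_backslashes(&mut rev_bytes.clone()) {
            let prefix = &text[..text.len() - 2];
            *text = Cow::Owned(format!("{prefix}\\00{}", last_byte as char));
        }
        // "\xy" -> "\0xy"
        else if let Some(penultimate_byte @ b'0'..=b'7') = rev_bytes.next() {
            if has_odd_consecutive_backslashes(&mut rev_bytes.clone()) {
                let prefix = &text[..text.len() - 3];
                *text = Cow::Owned(format!(
                    "{prefix}\\0{}{}",
                    penultimate_byte as char, last_byte as char
                ));
            }
        }
    }
}

fn main() {
    // Each pair is the source text of the left-hand literal body and its
    // expected normalization before the right-hand "0" is appended.
    for (body, expected) in [
        (r"\12", r"\012"),    // trailing octal escape gets padded
        (r"\\12", r"\\12"),   // escaped backslash, not an octal escape: unchanged
        (r"\\\12", r"\\\012"),
        (r"\12 0", r"\12 0"), // octal escape not at the end: unchanged
    ] {
        let mut text = Cow::Borrowed(body);
        normalize_ending_octal(&mut text);
        assert_eq!(text, expected);
    }
}
```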

View File

@@ -296,4 +296,202 @@ ISC.py:73:20: ISC001 [*] Implicitly concatenated string literals on one line
75 75 | f"def"} g"
76 76 |
ISC.py:84:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
| ^^^^^^^^ ISC001
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
|
= help: Combine string literals
Safe fix
81 81 | + f"second"} d"
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 |-_ = "\12""0" # fix should be "\0120"
84 |+_ = "\0120" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
ISC.py:85:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
| ^^^^^^^^^ ISC001
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
|
= help: Combine string literals
Safe fix
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 |-_ = "\\12""0" # fix should be "\\120"
85 |+_ = "\\120" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
ISC.py:86:5: ISC001 [*] Implicitly concatenated string literals on one line
|
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
| ^^^^^^^^^^ ISC001
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
|
= help: Combine string literals
Safe fix
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 |-_ = "\\\12""0" # fix should be "\\\0120"
86 |+_ = "\\\0120" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
ISC.py:87:5: ISC001 [*] Implicitly concatenated string literals on one line
|
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
| ^^^^^^^^^^ ISC001
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
|
= help: Combine string literals
Safe fix
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 |-_ = "\12 0""0" # fix should be "\12 00"
87 |+_ = "\12 00" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
ISC.py:88:5: ISC001 [*] Implicitly concatenated string literals on one line
|
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
| ^^^^^^^^^^ ISC001
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
|
= help: Combine string literals
Safe fix
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 |-_ = r"\12"r"0" # fix should be r"\120"
88 |+_ = r"\120" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
ISC.py:89:5: ISC001 [*] Implicitly concatenated string literals on one line
|
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
| ^^^^^^^^^^^^^^^^^ ISC001
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
|
= help: Combine string literals
Safe fix
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 |-_ = "\12 and more""0" # fix should be "\12 and more0"
89 |+_ = "\12 and more0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
ISC.py:90:5: ISC001 [*] Implicitly concatenated string literals on one line
|
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
| ^^^^^^^ ISC001
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
|
= help: Combine string literals
Safe fix
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 |-_ = "\8""0" # fix should be "\80"
90 |+_ = "\80" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:91:5: ISC001 [*] Implicitly concatenated string literals on one line
|
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
| ^^^^^^^^ ISC001
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 |-_ = "\12""8" # fix should be "\128"
91 |+_ = "\128" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:92:5: ISC001 [*] Implicitly concatenated string literals on one line
|
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
| ^^^^^^^^^^ ISC001
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 |-_ = "\12""foo" # fix should be "\12foo"
92 |+_ = "\12foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:93:5: ISC001 [*] Implicitly concatenated string literals on one line
|
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
| ^^^^^^^^ ISC001
|
= help: Combine string literals
Safe fix
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 |-_ = "\12" "" # fix should be "\12"
93 |+_ = "\12" # fix should be "\12"

View File

@@ -50,6 +50,6 @@ ISC.py:80:10: ISC003 Explicitly concatenated string should be implicitly concate
| __________^
81 | | + f"second"} d"
| |_______________^ ISC003
82 |
83 | # See https://github.com/astral-sh/ruff/issues/12936
|

View File

@@ -296,4 +296,202 @@ ISC.py:73:20: ISC001 [*] Implicitly concatenated string literals on one line
75 75 | f"def"} g"
76 76 |
ISC.py:84:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
| ^^^^^^^^ ISC001
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
|
= help: Combine string literals
Safe fix
81 81 | + f"second"} d"
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 |-_ = "\12""0" # fix should be "\0120"
84 |+_ = "\0120" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
ISC.py:85:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
| ^^^^^^^^^ ISC001
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
|
= help: Combine string literals
Safe fix
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 |-_ = "\\12""0" # fix should be "\\120"
85 |+_ = "\\120" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
ISC.py:86:5: ISC001 [*] Implicitly concatenated string literals on one line
|
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
| ^^^^^^^^^^ ISC001
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
|
= help: Combine string literals
Safe fix
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 |-_ = "\\\12""0" # fix should be "\\\0120"
86 |+_ = "\\\0120" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
ISC.py:87:5: ISC001 [*] Implicitly concatenated string literals on one line
|
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
| ^^^^^^^^^^ ISC001
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
|
= help: Combine string literals
Safe fix
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 |-_ = "\12 0""0" # fix should be "\12 00"
87 |+_ = "\12 00" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
ISC.py:88:5: ISC001 [*] Implicitly concatenated string literals on one line
|
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
| ^^^^^^^^^^ ISC001
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
|
= help: Combine string literals
Safe fix
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 |-_ = r"\12"r"0" # fix should be r"\120"
88 |+_ = r"\120" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
ISC.py:89:5: ISC001 [*] Implicitly concatenated string literals on one line
|
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
| ^^^^^^^^^^^^^^^^^ ISC001
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
|
= help: Combine string literals
Safe fix
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 |-_ = "\12 and more""0" # fix should be "\12 and more0"
89 |+_ = "\12 and more0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
ISC.py:90:5: ISC001 [*] Implicitly concatenated string literals on one line
|
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
| ^^^^^^^ ISC001
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
|
= help: Combine string literals
Safe fix
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 |-_ = "\8""0" # fix should be "\80"
90 |+_ = "\80" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:91:5: ISC001 [*] Implicitly concatenated string literals on one line
|
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
| ^^^^^^^^ ISC001
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 |-_ = "\12""8" # fix should be "\128"
91 |+_ = "\128" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:92:5: ISC001 [*] Implicitly concatenated string literals on one line
|
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
| ^^^^^^^^^^ ISC001
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 |-_ = "\12""foo" # fix should be "\12foo"
92 |+_ = "\12foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:93:5: ISC001 [*] Implicitly concatenated string literals on one line
|
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
| ^^^^^^^^ ISC001
|
= help: Combine string literals
Safe fix
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 |-_ = "\12" "" # fix should be "\12"
93 |+_ = "\12" # fix should be "\12"

View File

@@ -27,14 +27,14 @@ use crate::checkers::ast::Checker;
///
/// ## Example
///
/// ```python
/// ```pyi
/// class Foo:
/// def __eq__(self, obj: typing.Any) -> bool: ...
/// ```
///
/// Use instead:
///
/// ```python
/// ```pyi
/// class Foo:
/// def __eq__(self, obj: object) -> bool: ...
/// ```

View File

@@ -34,19 +34,17 @@ use crate::registry::Rule;
/// ```
///
/// ## Example
/// ```python
/// ```pyi
/// import sys
///
/// if sys.version_info > (3, 8):
/// ...
/// if sys.version_info > (3, 8): ...
/// ```
///
/// Use instead:
/// ```python
/// ```pyi
/// import sys
///
/// if sys.version_info >= (3, 9):
/// ...
/// if sys.version_info >= (3, 9): ...
/// ```
#[violation]
pub struct BadVersionInfoComparison;
@@ -70,27 +68,23 @@ impl Violation for BadVersionInfoComparison {
///
/// ## Example
///
/// ```python
/// ```pyi
/// import sys
///
/// if sys.version_info < (3, 10):
///
/// def read_data(x, *, preserve_order=True): ...
///
/// else:
///
/// def read_data(x): ...
/// ```
///
/// Use instead:
///
/// ```python
/// ```pyi
/// if sys.version_info >= (3, 10):
///
/// def read_data(x): ...
///
/// else:
///
/// def read_data(x, *, preserve_order=True): ...
/// ```
#[violation]

View File

@@ -21,17 +21,16 @@ use crate::checkers::ast::Checker;
/// precisely.
///
/// ## Example
/// ```python
/// ```pyi
/// from collections import namedtuple
///
/// person = namedtuple("Person", ["name", "age"])
/// ```
///
/// Use instead:
/// ```python
/// ```pyi
/// from typing import NamedTuple
///
///
/// class Person(NamedTuple):
/// name: str
/// age: int

View File

@@ -20,27 +20,24 @@ use crate::checkers::ast::Checker;
///
/// ## Example
///
/// ```python
/// ```pyi
/// from typing import TypeAlias
///
/// a = b = int
///
///
/// class Klass: ...
///
///
/// Klass.X: TypeAlias = int
/// ```
///
/// Use instead:
///
/// ```python
/// ```pyi
/// from typing import TypeAlias
///
/// a: TypeAlias = int
/// b: TypeAlias = int
///
///
/// class Klass:
/// X: TypeAlias = int
/// ```

View File

@@ -16,19 +16,17 @@ use crate::checkers::ast::Checker;
/// analyze your code.
///
/// ## Example
/// ```python
/// ```pyi
/// import sys
///
/// if (3, 10) <= sys.version_info < (3, 12):
/// ...
/// if (3, 10) <= sys.version_info < (3, 12): ...
/// ```
///
/// Use instead:
/// ```python
/// ```pyi
/// import sys
///
/// if sys.version_info >= (3, 10) and sys.version_info < (3, 12):
/// ...
/// if sys.version_info >= (3, 10) and sys.version_info < (3, 12): ...
/// ```
///
/// ## References

View File

@@ -25,27 +25,22 @@ use crate::checkers::ast::Checker;
///
/// ## Example
///
/// ```python
/// ```pyi
/// class Foo:
/// def __new__(cls: type[_S], *args: str, **kwargs: int) -> _S: ...
///
/// def foo(self: _S, arg: bytes) -> _S: ...
///
/// @classmethod
/// def bar(cls: type[_S], arg: int) -> _S: ...
/// ```
///
/// Use instead:
///
/// ```python
/// ```pyi
/// from typing import Self
///
///
/// class Foo:
/// def __new__(cls, *args: str, **kwargs: int) -> Self: ...
///
/// def foo(self, arg: bytes) -> Self: ...
///
/// @classmethod
/// def bar(cls, arg: int) -> Self: ...
/// ```

View File

@@ -15,7 +15,7 @@ use crate::checkers::ast::Checker;
///
/// ## Example
///
/// ```python
/// ```pyi
/// def func(param: int) -> str:
/// """This is a docstring."""
/// ...
@@ -23,7 +23,7 @@ use crate::checkers::ast::Checker;
///
/// Use instead:
///
/// ```python
/// ```pyi
/// def func(param: int) -> str: ...
/// ```
#[violation]

Some files were not shown because too many files have changed in this diff.