Compare commits


119 Commits

Author SHA1 Message Date
Micha Reiser
da824ba316 Release Ruff 0.5.6 (#12629)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2024-08-02 17:35:14 +02:00
Micha Reiser
012198a1b0 Enable notebooks by default in preview mode (#12621) 2024-08-02 13:36:53 +00:00
Alex Waygood
fbab04fbe1 [red-knot] Allow multiple site-packages search paths (#12609) 2024-08-02 13:33:19 +00:00
Dhruv Manilawala
9aa43d5f91 Separate red_knot into CLI and red_knot_workspace crates (#12623)
## Summary

This PR separates the current `red_knot` crate into two crates:
1. `red_knot` - This will be similar to the `ruff` crate, it'll act as
the CLI crate
2. `red_knot_workspace` - This includes everything except for the CLI
functionality from the existing `red_knot` crate

Note that the code related to the file watcher is in
`red_knot_workspace` for now, but it might need to be extracted out in
the future.

The main motivation for this change is so that we can have a `red_knot
server` command. This makes it easier to test the server out without
making any changes in the VS Code extension. All we need to do is specify
the `red_knot` executable path in the `ruff.path` extension setting.

## Test Plan

- `cargo build`
- `cargo clippy --workspace --all-targets --all-features`
- `cargo shear --fix`
2024-08-02 11:24:36 +00:00
Micha Reiser
966563c79b Add tests for hard and soft links (#12590) 2024-08-02 10:14:28 +00:00
Micha Reiser
27edadec29 Make server panic hook more error resilient (#12610) 2024-08-02 12:10:06 +02:00
InSync
2e2b1b460f Fix a typo in docs/editors/settings.md (#12614)
Diff:

```diff
-- `false: Same as`off\`
+- `false`: Same as `off`
```
2024-08-01 11:23:55 -05:00
Charlie Marsh
a3e67abf4c Add newlines before comments in E305 (#12606)
## Summary

There's still a problem here. Given:

```python
class Class():
    pass

    # comment

    # another comment
a = 1
```

We only add one newline before `a = 1` on the first pass, because
`max_precedling_blank_lines` is 1... We then add the second newline on
the second pass, so it ends up in the right state, but the logic is
clearly wonky.
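
For reference, a hedged sketch of the end state the rule should reach in a
single pass (assuming E305's requirement of two blank lines before a
top-level statement that follows a class or function definition):

```python
class Class():
    pass

    # comment

    # another comment


a = 1
```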

Closes https://github.com/astral-sh/ruff/issues/11508.
2024-07-31 23:11:00 -04:00
Carl Meyer
ee0518e8f7 [red-knot] implement attribute of union (#12601)
I hit this `todo!` trying to run type inference over some real modules.
Since it's a one-liner to implement it, I just did that rather than
changing to `Type::Unknown`.
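
As a rough Python-level illustration of what "attribute of union" means here
(an assumed sketch, not a test case from the PR):

```python
class A:
    attr: int = 0


class B:
    attr: str = ""


def f(flag: bool) -> int | str:
    x = A() if flag else B()  # the inferred type of `x` is the union `A | B`
    return x.attr  # so `x.attr` is the union of the member types: `int | str`
```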
2024-07-31 19:45:24 -07:00
Charlie Marsh
d774a3bd48 Avoid unused async when context manager includes TaskGroup (#12605)
## Summary

Closes https://github.com/astral-sh/ruff/issues/12354.
2024-08-01 02:12:43 +00:00
Charlie Marsh
7e6b19048e Don't attach comments with mismatched indents (#12604)
## Summary

Given:

```python
def test_update():
    pass
    # comment
def test_clientmodel():
    pass
```

We don't want `# comment` to be attached to `def test_clientmodel()`.
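
A sketch of the intended behavior, assuming the comment should stay with the
body of `test_update` when blank lines are added between the functions:

```python
def test_update():
    pass
    # comment


def test_clientmodel():
    pass
```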

Closes https://github.com/astral-sh/ruff/issues/12589.
2024-07-31 22:09:05 -04:00
Charlie Marsh
8e383b9587 Respect start index in unnecessary-list-index-lookup (#12603)
## Summary

Closes https://github.com/astral-sh/ruff/issues/12594.
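
For context, a hedged sketch of the pattern this fix concerns: with a
non-zero `start`, the `enumerate` counter no longer lines up with the list
index, so rewriting `sequence[i]` to the loop variable would change behavior.

```python
letters = ["a", "b", "c"]

for i, letter in enumerate(letters):
    print(letters[i])  # always equal to `letter`; the rule may suggest `letter`

for i, letter in enumerate(letters, start=1):
    print(letters[i - 1])  # here `letters[i]` would *not* equal `letter`,
                           # so the rule must account for the non-zero start
```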
2024-08-01 01:21:15 +00:00
github-actions[bot]
3f49ab126f Sync vendored typeshed stubs (#12602) 2024-08-01 01:44:56 +01:00
Chris Krycho
c1bc7f4dee Remove ecosystem_ci flag from Ruff CLI (#12596)
## Summary

@zanieb noticed while we were discussing #12595 that this flag is now
unnecessary, so remove it and the flags which reference it.

## Test Plan

Question for maintainers: is there a test to add *or* remove here? (I’ve
opened this as a draft PR with that in view!)
2024-07-31 11:40:03 -05:00
Bowen Liang
a44d579f21 Add Dify to Ruff users (#12593)
## Summary

- Add the popular LLM Ops project Dify to the user list in the README, as
Dify adopted Ruff for linting in Feb 2024 in
https://github.com/langgenius/dify/pull/2366
2024-07-31 08:56:52 -04:00
Alex Waygood
a3900d2b0b [pyflakes] Fix preview-mode bugs in F401 when attempting to autofix unused first-party submodule imports in an __init__.py file (#12569) 2024-07-31 13:34:30 +01:00
Alex Waygood
83b1c48a93 Make setting and retrieving pydocstyle settings less tedious (#12582) 2024-07-31 10:39:33 +01:00
Micha Reiser
138e70bd5c Upgrade to Rust 1.80 (#12586) 2024-07-30 19:18:08 +00:00
Eero Vaher
ee103ffb25 Fix an argument name in B905 description (#12588)
The description of `zip-without-explicit-strict` erroneously mentions a
non-existing `check` argument for `zip()`.
2024-07-30 14:40:56 -04:00
Micha Reiser
18f87b9497 Flaky file watching tests, add debug assertions (#12587) 2024-07-30 18:09:55 +00:00
Micha Reiser
adc8d4e1e7 File watch events: Add dynamic wait period before writing new changes (#12585) 2024-07-30 19:18:43 +02:00
Alex Waygood
90db361199 Consider more stdlib decorators to be property-like (#12583) 2024-07-30 17:18:23 +00:00
Alex Waygood
4738135801 Improve consistency between linter rules in determining whether a function is property (#12581) 2024-07-30 17:42:04 +01:00
Micha Reiser
264cd750e9 Add delay between updating a file (#12576) 2024-07-30 18:31:29 +02:00
Alex Waygood
7a4419a2a5 Improve handling of metaclasses in various linter rules (#12579) 2024-07-30 14:48:36 +01:00
Alex Waygood
ac1666d6e2 Remove several incorrect uses of map_callable() (#12580) 2024-07-30 14:30:25 +01:00
epenet
459c85ba27 [flake8-return] Exempt cached properties and other property-like decorators from explicit return rule (RET501) (#12563)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2024-07-30 11:06:28 +00:00
Alex Waygood
aaa56eb0bd Fix NFKC normalization bug when removing unused imports (#12571) 2024-07-30 09:54:35 +00:00
Dhruv Manilawala
f3c14a4276 Keep track of deleted cell for reorder change request (#12575)
## Summary

This PR fixes a bug where the server wouldn't retain the cell content in
case of a reorder change request.

As mentioned in
https://github.com/astral-sh/ruff/issues/12573#issuecomment-2257819298,
this change request is modeled as (a) remove these cell URIs and (b) add
these cell URIs. The cell content isn't provided. But, the way we've
modeled the `NotebookCell` (it contains the underlying `TextDocument`),
we need to keep track of the deleted cells to get the content.

This is not an ideal solution and a better long term solution would be
to model it as per the spec but that is a big structural change and will
affect multiple parts of the server. Modeling as per the spec would also
avoid bugs like https://github.com/astral-sh/ruff/pull/11864. For
context, that model would add complexity per
https://github.com/astral-sh/ruff/pull/11206#discussion_r1600165481.

fixes: #12573

## Test Plan

This video shows the behavior before and after the bug fix:


https://github.com/user-attachments/assets/2fcad4b5-f9af-4776-8640-4cd1fa16e325
2024-07-30 09:51:26 +00:00
Alex Waygood
3169d408fa [red-knot] Fix typos in the module resolver (#12574) 2024-07-30 09:38:38 +00:00
Micha Reiser
a2286c8e47 Set Durability to 'HIGH' for most inputs and third-party libraries (#12566) 2024-07-30 09:03:59 +00:00
Piotr Osiewicz
fb9f566f56 Use $/logTrace for server trace logs in Zed and VS Code (#12564)
## Summary

This pull request adds support for logging via `$/logTrace` RPC
messages. It also enables that code path when the client is the Zed editor
or VS Code (as there's no way for us to generically tell whether a client
prefers `$/logTrace` over stderr).

Related to: #12523

## Test Plan

I've built Ruff from this branch and tested it manually with Zed.

---------

Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2024-07-30 08:32:20 +05:30
Micha Reiser
381bd1ff4a Delete left over debug statement (#12567) 2024-07-29 16:16:12 +02:00
Micha Reiser
2f54d05d97 Remove salsa::report_untracked_read when finding the dynamic module resolution paths (#12509) 2024-07-29 09:31:29 +00:00
Micha Reiser
e18b4e42d3 [red-knot] Upgrade to the *new* *new* salsa (#12406) 2024-07-29 07:21:24 +00:00
Dhruv Manilawala
9495331a5f Recommend client config for trace setting in Neovim (#12562) 2024-07-29 06:14:34 +00:00
renovate[bot]
e1076db7d0 Update CodSpeedHQ/action action to v3 (#12559) 2024-07-29 07:37:02 +02:00
renovate[bot]
1986c9e8e2 Update NPM Development dependencies (#12556) 2024-07-28 22:17:44 -04:00
renovate[bot]
d7e80dc955 Update pre-commit dependencies (#12555) 2024-07-28 22:17:34 -04:00
renovate[bot]
87d09f77cd Update Rust crate imperative to v1.0.6 (#12552) 2024-07-28 22:17:28 -04:00
renovate[bot]
bd37ef13b8 Update Rust crate bstr to v1.10.0 (#12557) 2024-07-28 22:17:11 -04:00
renovate[bot]
ec23c974db Update Rust crate toml to v0.8.16 (#12554) 2024-07-28 22:17:01 -04:00
renovate[bot]
122e5ab428 Update Rust crate serde_json to v1.0.121 (#12553) 2024-07-28 22:16:55 -04:00
renovate[bot]
2f2149aca8 Update Rust crate env_logger to v0.11.5 (#12550) 2024-07-28 22:16:49 -04:00
renovate[bot]
9d5c31e7da Update Rust crate imara-diff to v0.1.7 (#12551) 2024-07-28 22:16:42 -04:00
renovate[bot]
25f3ad6238 Update Rust crate clap to v4.5.11 (#12549) 2024-07-28 22:16:36 -04:00
renovate[bot]
79926329a4 Update Rust crate argfile to v0.2.1 (#12548) 2024-07-28 22:16:31 -04:00
Aleksei Latyshev
9cdc578dd9 [flake8-builtins] Implement import, lambda, and module shadowing (#12546)
## Summary

Extend `flake8-builtins` to cover imports, lambda arguments, and modules, to be
consistent with the original checker,
[flake8_builtins](https://github.com/gforcada/flake8-builtins/blob/main/flake8_builtins.py).
A short illustrative sketch follows the list below.

closes #12540 

## Details

- Implement builtin-import-shadowing (A004)
- Stop tracking import shadowing in builtin-variable-shadowing (A001)
in preview mode.
- Implement builtin-lambda-argument-shadowing (A005)
- Implement builtin-module-shadowing (A006)
  - Add new option `linter.flake8_builtins.builtins_allowed_modules`
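
For illustration, a hedged sketch of the shadowing patterns the new rules
target (rule codes as listed above; module shadowing concerns the module's
own file name, e.g. a first-party `logging.py`):

```python
# builtin-import-shadowing: the bound name `id` shadows the builtin `id()`
from uuid import uuid4 as id

# builtin-lambda-argument-shadowing: the parameter `list` shadows the builtin `list`
first = lambda list: list[0]
```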

## Test Plan

cargo test
2024-07-29 01:42:42 +00:00
Charlie Marsh
665c75f7ab Add document for executable determination (#12547)
Closes https://github.com/astral-sh/ruff/issues/12505.
2024-07-28 16:23:00 -04:00
Micha Reiser
f37b39d6cc Allow downloading ecosystem results from forks (#12544) 2024-07-27 19:57:19 +02:00
Charlie Marsh
e18c45c310 Avoid marking required imports as unused (#12537)
## Summary

If an import is marked as "required", we should never flag it as unused.
In practice, this is rare, since required imports are typically used for
`__future__` annotations, which are always considered "used".
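
As a hedged illustration: if, say, `import logging` were listed in isort's
`required-imports` setting, this fix means unused-import must leave it alone
even when the module never references it directly.

```python
# Configured as a required import for the project; it must not be reported
# or removed as unused, even though nothing below references `logging`.
import logging


def handler(event):
    return event
```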

Closes https://github.com/astral-sh/ruff/issues/12458.
2024-07-26 14:23:43 -04:00
Charlie Marsh
d930052de8 Move required import parsing out of lint rule (#12536)
## Summary

Instead, make it part of the serialization and deserialization itself.
This makes it _much_ easier to reuse when solving
https://github.com/astral-sh/ruff/issues/12458.
2024-07-26 13:35:45 -04:00
Sigurd Spieckermann
7ad4df9e9f Complete FBT002 example with Enum argument (#12525)
## Summary

I've completed the `FBT002` rule example with an `Enum` argument to show the
full usage in this case.
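
A rough sketch of the pattern in question, assuming `FBT002` flags a boolean
default value on a positional parameter and the `Enum` form is the suggested
alternative (names are illustrative):

```python
import math
from enum import Enum


# Flagged: boolean default value on a positional parameter
def round_number(value: float, up: bool = True) -> float:
    return math.ceil(value) if up else math.floor(value)


class RoundingMode(Enum):
    UP = "up"
    DOWN = "down"


# Preferred: an Enum makes the call site self-documenting
def round_number_enum(value: float, mode: RoundingMode = RoundingMode.UP) -> float:
    return math.ceil(value) if mode is RoundingMode.UP else math.floor(value)
```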
2024-07-26 11:50:19 -04:00
Charlie Marsh
425761e960 Use colon rather than dot formatting for integer-only types (#12534)
## Summary

Closes https://github.com/astral-sh/ruff/issues/12421.
2024-07-26 15:48:19 +00:00
Carl Meyer
4b69271809 [red-knot] resolve int/list/dict/set/tuple to builtin type (#12521)
Now that we have builtins available, resolve some simple cases to the
right builtin type.

We should also adjust the display for types to include their module
name; that's not done yet here.
2024-07-26 08:21:31 -07:00
Micha Reiser
bf23d38a21 Remove unnecessary clone in workspace API (#12529) 2024-07-26 17:19:05 +02:00
Charlie Marsh
49f51583fa Always allow explicit multi-line concatenations when implicit are banned (#12532)
## Summary

Closes https://github.com/astral-sh/ruff/issues/11582.
2024-07-26 10:36:35 -04:00
Charlie Marsh
1fe4a5faed Avoid recommending __slots__ for classes that inherit from more than namedtuple (#12531)
## Summary

Closes https://github.com/astral-sh/ruff/issues/11887.
2024-07-26 14:24:40 +00:00
Charlie Marsh
998bfe0847 Avoid recommending no-argument super in slots=True dataclasses (#12530)
## Summary

Closes https://github.com/astral-sh/ruff/issues/12506.
2024-07-26 10:09:51 -04:00
Dhruv Manilawala
6f4db8675b [red-knot] Add support for untitled files (#12492)
## Summary

This PR adds support for untitled files in the Red Knot project.

Refer to the [design
discussion](https://github.com/astral-sh/ruff/discussions/12336) for
more details.

### Changes
* The `parsed_module` always assumes that the `SystemVirtual` path is of
`PySourceType::Python`.
* For the module resolver, as suggested, I went ahead and added a new
`SystemOrVendoredPath` enum and renamed `FilePathRef` to
`SystemOrVendoredPathRef` (happy to consider better names here).
* The `file_to_module` query would return if it's a
`FilePath::SystemVirtual` variant because a virtual file doesn't belong
to any module.
* The sync implementation for the system virtual path is basically the
same as that of system path except that it uses the
`virtual_path_metadata`. The reason for this is that the system
(language server) would provide the metadata on whether it still exists
or not and if it exists, the corresponding metadata.

For point (1), VS Code would use `Untitled-1` for Python files and
`Untitled-1.ipynb` for Jupyter Notebooks. We could use this distinction
to determine whether the source type is `Python` or `Ipynb`.

## Test Plan

Added test cases in #12526
2024-07-26 18:13:31 +05:30
Micha Reiser
71f7aa4971 Remove criterion/codspeed compat layer (#12524) 2024-07-26 12:22:16 +02:00
Auguste Lalande
9f72f474e6 [pydoclint] Add docstring-missing-returns amd docstring-extraneous-returns (DOC201, DOC202) (#12485)
Co-authored-by: Micha Reiser <micha@reiser.io>
2024-07-26 06:36:00 +00:00
Carl Meyer
10c993e21a [red-knot] remove wrong __init__.py from file-watching tests (#12519) 2024-07-26 07:14:01 +01:00
Carl Meyer
2d3914296d [red-knot] handle all syntax without panic (#12499)
Extend red-knot type inference to cover all syntax, so that inferring
types for a scope gives all expressions a type. This means we can run
the red-knot semantic lint on all Python code without panics. It also
means we can infer types for `builtins.pyi` without panics.

To keep things simple, this PR intentionally doesn't add any new type
inference capabilities: the expanded coverage is all achieved with
`Type::Unknown`. But this puts the skeleton in place for adding better
inference of all these language features.

I also had to add basic Salsa cycle recovery (with just `Type::Unknown`
for now), because some `builtins.pyi` definitions are cyclic.

To test this, I added a comprehensive corpus of test snippets sourced
from Cinder under [MIT
license](https://github.com/facebookincubator/cinder/blob/cinder/3.10/cinderx/LICENSE),
which matches Ruff's license. I also added to this corpus some
additional snippets for newer language features: all the
`27_func_generic_*` and `73_class_generic_*` files, as well as
`20_lambda_default_arg.py`, and added a test which runs semantic-lint
over all these files. (The test doesn't assert the test-corpus files are
lint-free; just that they are able to lint without a panic.)
2024-07-25 17:38:08 -07:00
Charlie Marsh
7571da8778 Preserve trailing inline comments on import-from statements (#12498)
## Summary

Right now, in the isort comment model, there's nowhere for trailing
comments on the _statement_ to go, as in:

```python
from mylib import (
    MyClient,
    MyMgmtClient,
)  # some comment
```

If the comment is on the _alias_, we do preserve it, because we attach
it to the alias, as in:

```python
from mylib import (
    MyClient,
    MyMgmtClient,  # some comment
)
```

Similarly, if the comment is trailing on an import statement
(non-`from`), we again attach it to the alias, because it can't be
parenthesized, as in:

```python
import foo  # some comment
```

This PR adds logic to track and preserve those trailing comments.

We also no longer drop several other comments, like:

```python
from mylib import (
    # some comment
    MyClient
)
```

Closes https://github.com/astral-sh/ruff/issues/12487.
2024-07-25 17:46:58 -04:00
Alex Waygood
2ceac5f868 [red-knot] Rename some methods in the module resolver (#12517) 2024-07-25 19:28:48 +00:00
Alex Waygood
5ce80827d2 [red-knot] Refactor path.rs in the module resolver (#12494) 2024-07-25 19:29:28 +01:00
Dhruv Manilawala
e047b9685a Use docs bot email for docs publish (#12511)
Ref: https://github.com/astral-sh/uv/pull/5369
2024-07-25 21:50:00 +05:30
Dhruv Manilawala
fc16d8d04d Bump version to 0.5.5 (#12510) 2024-07-25 20:17:01 +05:30
Uriya Harpeness
175e5d7b88 Add missing traceback line in f-string-in-exception docstring. (#12508)
## Summary

Add missing traceback line in `f-string-in-exception` docstring.

Solves https://github.com/astral-sh/ruff/issues/12504.
2024-07-25 10:22:05 -04:00
Dhruv Manilawala
c03f257ed7 Add note about the breaking change in nvim-lspconfig (#12507)
Refer https://github.com/astral-sh/ruff/issues/12408
2024-07-25 14:01:16 +00:00
Dhruv Manilawala
6bbb4a28c2 Add setup docs for Zed editor (#12501)
## Summary

This PR adds the setup documentation for using Ruff with the Zed editor.

Closes: #12388
2024-07-25 13:09:17 +00:00
Micha Reiser
2ce3e3ae60 Fix the search path tests on MacOS (#12503) 2024-07-25 08:21:38 +02:00
Charlie Marsh
2a64cccb61 Avoid applying ignore-names to self and cls function names (#12497)
## Summary

Closes https://github.com/astral-sh/ruff/issues/12465.
2024-07-24 18:08:23 -04:00
Alex Waygood
928ffd6650 Ignore NPY201 inside except blocks for compatibility with older numpy versions (#12490) 2024-07-24 20:03:23 +00:00
Alex Waygood
e52be0951a [red-knot] Improve validation for search paths (#12376) 2024-07-24 15:02:25 +01:00
Dylan
889073578e [flake8-bugbear] Allow singleton tuples with starred expressions in B013 (#12484) 2024-07-24 15:19:30 +02:00
Micha Reiser
eac965ecaf [red-knot] Watch search paths (#12407) 2024-07-24 07:38:50 +00:00
Auguste Lalande
8659f2f4ea [pydoclint] Fix documentation for DOC501 (#12483)
## Summary

The doc was written backwards, my bad.
2024-07-24 00:08:53 -04:00
Alex Waygood
c1b292a0dc Refactor NPY201 (#12479) 2024-07-23 18:24:20 +01:00
Micha Reiser
3af6ccb720 Fix Ord of cmp_fix (#12471) 2024-07-23 15:14:22 +02:00
Micha Reiser
f0fc6a95fe [red-knot] Lazy package file discovery (#12452)
Co-authored-by: Carl Meyer <carl@astral.sh>
2024-07-23 08:47:15 +00:00
Mateusz Sokół
f96a3c71ff Fix NumPy 2.0 rule for np.alltrue and np.sometrue (#12473)
Co-authored-by: Micha Reiser <micha@reiser.io>
2024-07-23 08:34:43 +00:00
Micha Reiser
b9b7deff17 Implement upcast_mut for new TestDb (#12470) 2024-07-23 07:11:00 +00:00
Micha Reiser
40d9324f5a [red-knot] Improved file watching (#12382) 2024-07-23 08:18:59 +02:00
Pathompong Kwangtong
a9f8bd59b2 Add Eglot setup guide for Emacs editor (#12426)
## Summary

The purpose of this change is to explain how to use Ruff as a language
server in Eglot with automatic formatting, because I struggled to set it up
with Eglot myself. I searched online and found that some other people
struggle with it too. (See [this reddit
post](https://www.reddit.com/r/emacs/comments/118mo6w/eglot_automatic_formatting/)
and
https://github.com/astral-sh/ruff-lsp/issues/19#issuecomment-1435138828)


## Test Plan

I've used this setting myself, and I will continue to maintain this part as
long as I use it.

---------

Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2024-07-23 10:50:51 +05:30
Piotr Osiewicz
143e172431 Do not bail code action resolution when a quick fix is requested (#12462)
## Summary

When working on improving Ruff integration with Zed I noticed that it
errors out when we try to resolve a code action of a `QUICKFIX` kind;
apparently, per @dhruvmanila we shouldn't need to resolve it, as the
edit is provided in the initial response for the code action. However,
it's possible for the `resolve` call to fill out other fields (such as
`command`).
AFAICT Helix also tries to resolve the code actions unconditionally (as
in, when either `edit` or `command` is absent); so does VSC. They can
still apply the quickfixes though, as they do not error out on a failed
call to resolve code actions - Zed does. Following suit on Zed's side
does not cut it though, as we still get a log request from Ruff for that
failure (which is surfaced in the UI).
There are also other language servers (such as
[rust-analyzer](c1c9e10f72/crates/rust-analyzer/src/handlers/request.rs (L1257)))
that fill out both `command` and `edit` fields as a part of code action
resolution.

This PR makes the resolve calls for quickfix actions return the input
value.

## Test Plan

N/A
2024-07-23 10:30:03 +05:30
Auguste Lalande
b2d3a05ee4 [flake8-async] Fix references in documentation not displaying (#12467)
## Summary

Fix references in the documentation of several `ASYNC` rules that were not displaying

## Test Plan

Validated that the documentation now displays correctly
2024-07-22 19:38:13 -04:00
Josh Cannon
ef1ca0dd38 Fix bad markdown in CONTRIBUTING.md (#12466)
See
https://github.com/astral-sh/ruff/blob/main/CONTRIBUTING.md#import-categorization
2024-07-23 00:03:30 +01:00
Carl Meyer
c7b13bb8fc [red-knot] add cycle-free while-loop control flow (#12413)
Add support for while-loop control flow.

This doesn't yet include general support for terminals and reachability;
that is wider than just while loops and belongs in its own PR.

This also doesn't yet add support for cyclic definitions in loops; that
comes with enough of its own complexity in Salsa that I want to handle
it separately.
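
A minimal Python sketch (an assumption about the kind of flow being modeled,
not a red-knot test case) of why while-loop control flow matters for
inference:

```python
def f(cond: bool) -> int | str:
    x: int | str = 1
    while cond:
        x = "a"  # after the loop, `x` may be the pre-loop `int` or the
        cond = False  # in-loop `str`, depending on whether the body ran
    return x
```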
2024-07-22 14:27:33 -07:00
Carl Meyer
dbbe3526ef [red-knot] add while-loop to benchmark (#12464)
So we can get some signal from the benchmark result on
https://github.com/astral-sh/ruff/pull/12413
2024-07-22 14:16:56 -07:00
Carl Meyer
f22c8ab811 [red-knot] add maybe-undefined lint rule (#12414)
Add a lint rule to detect if a name is definitely or possibly undefined
at a given usage.

If I create the file `undef/main.py` with contents:

```python
x = int
def foo():
    z
    return x
if flag:
    y = x
y
```

And then run `cargo run --bin red_knot -- --current-directory
../ruff-examples/undef`, I get the output:

```
Name 'z' used when not defined.
Name 'flag' used when not defined.
Name 'y' used when possibly not defined.
```

If I modify the file to add `y = 0` at the top, red-knot re-checks it
and I get the new output:

```
Name 'z' used when not defined.
Name 'flag' used when not defined.
```

Note that `int` is not flagged, since it's a builtin, and `return x` in
the function scope is not flagged, since it refers to the global `x`.
2024-07-22 13:53:59 -07:00
Alex Waygood
2a8f95c437 [red-knot] Use a distinct type for module search paths in the module resolver (#12379) 2024-07-22 19:44:27 +00:00
Dhruv Manilawala
ea2d51c2bb Add note to include notebook files for native server (#12449)
## Summary

Similar to https://github.com/astral-sh/ruff-vscode/pull/547 but for the
online docs.

Refer to https://github.com/astral-sh/ruff-vscode/issues/546

## Preview

<img width="1728" alt="Screenshot 2024-07-22 at 14 51 40"
src="https://github.com/user-attachments/assets/39014278-c868-45b0-9058-42858a060fd8">
2024-07-22 21:40:30 +05:30
Micha Reiser
ed238e0c76 Fix incorrect placement of leading function comment with type params (#12447) 2024-07-22 14:17:00 +02:00
Micha Reiser
3ace12943e Ignore more open ai notebooks for now (#12448) 2024-07-22 14:16:48 +02:00
Dhruv Manilawala
978909fcf4 Raise syntax error for unparenthesized generator expr in multi-argument call (#12445)
## Summary

This PR fixes a bug to raise a syntax error when an unparenthesized
generator expression is used as an argument to a call when there are
more than one argument.

For reference, the grammar is:
```
primary:
    | ...
    | primary genexp 
    | primary '(' [arguments] ')' 
    | ...

genexp:
    | '(' ( assignment_expression | expression !':=') for_if_clauses ')' 
```

The `genexp` requires parentheses, as shown in the grammar. So, a call
expression is either a primary followed by a generator expression or a
primary followed by a list of arguments. In the former case, no extra
parentheses are needed because the generator expression provides them, while
in the latter case the parentheses enclose the argument list, which means a
generator expression used among those arguments requires its own
parentheses.
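
In other words (a hedged Python illustration of the accepted and rejected
forms, using a throwaway `total` function):

```python
def total(*args):
    return args


# Accepted: a sole, unparenthesized generator argument
sum(x for x in range(10))

# Accepted: the generator carries its own parentheses among other arguments
total(1, 2, (x for x in range(5)), 6)

# Rejected, as in the CPython output under "Test Plan" below:
#   total(1, 2, x for x in range(5), 6)
```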

This was discovered in https://github.com/astral-sh/ruff/issues/12420.

## Test Plan

Add test cases for valid and invalid syntax.

Make sure that the parser from CPython also raises this at the parsing
step:
```console
$ python3.13 -m ast parser/_.py
  File "parser/_.py", line 1
    total(1, 2, x for x in range(5), 6)
                ^^^^^^^^^^^^^^^^^^^
SyntaxError: Generator expression must be parenthesized

$ python3.13 -m ast parser/_.py
  File "parser/_.py", line 1
    sum(x for x in range(10), 10)
        ^^^^^^^^^^^^^^^^^^^^
SyntaxError: Generator expression must be parenthesized
```
2024-07-22 14:44:20 +05:30
Dhruv Manilawala
f8735e1ee8 Remove unused dependencies, sync existing versions (#12446)
## Summary

This PR removes unused dependencies from `fuzz` crate and syncs the
`similar` crate to the workspace version. This will help in resolve
https://github.com/astral-sh/ruff/pull/12442.

## Test Plan

Build the fuzz crate:

For Mac (it requires the nightly build):
```
cargo +nightly fuzz build
```
2024-07-22 10:49:05 +05:30
renovate[bot]
d70ceb6a56 Update Rust crate uuid to v1.10.0 (#12444) 2024-07-21 21:50:53 -04:00
renovate[bot]
fc7d9e95b8 Update Rust crate tracing-tree to 0.4.0 (#12443) 2024-07-21 21:50:46 -04:00
renovate[bot]
b578fca9cb Update NPM Development dependencies (#12441) 2024-07-21 21:50:32 -04:00
renovate[bot]
8d3146c2b2 Update pre-commit hook astral-sh/ruff-pre-commit to v0.5.4 (#12440) 2024-07-21 21:50:27 -04:00
renovate[bot]
fa5c841154 Update Rust crate thiserror to v1.0.63 (#12437) 2024-07-21 21:49:42 -04:00
renovate[bot]
f8fcbc19d9 Update dependency react-resizable-panels to v2.0.22 (#12439) 2024-07-21 21:49:33 -04:00
renovate[bot]
97fdd48208 Update Rust crate toml to v0.8.15 (#12438) 2024-07-21 21:49:23 -04:00
renovate[bot]
731ed2e40b Update Rust crate syn to v2.0.72 (#12436) 2024-07-21 21:49:16 -04:00
Auguste Lalande
3a742c17f8 [pydoclint] Fix DOC501 panic #12428 (#12435)
## Summary

Fix the panic reported in #12428, where a string would sometimes get split
in the middle of a multi-byte character (not at a character boundary). The
new approach bypasses the need to split the string.

This does not guarantee the correct formatting of the docstring, but
neither did the previous implementation.

Resolves #12428 

## Test Plan

Test case added to fixture
2024-07-21 19:30:06 +00:00
TomerBin
053243635c [fastapi] Implement FAST001 (fastapi-redundant-response-model) and FAST002 (fastapi-non-annotated-dependency) (#11579)
## Summary

Implements Ruff-specific rules for FastAPI routes, along with their autofixes.
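
A hedged sketch of the two patterns (route and model names are illustrative,
not taken from the PR): `FAST001` targets a `response_model` argument that
merely repeats the return annotation, and `FAST002` targets dependencies
declared as default values instead of via `Annotated`.

```python
from fastapi import Depends, FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    name: str


def common_params(q: str | None = None) -> dict:
    return {"q": q}


# FAST001: `response_model=Item` duplicates the `-> Item` return annotation
@app.get("/items/latest", response_model=Item)
async def latest_item() -> Item:
    return Item(name="example")


# FAST002: prefer `params: Annotated[dict, Depends(common_params)]` over a
# `Depends` default value
@app.get("/items/")
async def list_items(params: dict = Depends(common_params)) -> list[Item]:
    return [Item(name=str(params))]
```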

## Test Plan

`cargo test` / `cargo insta review`
2024-07-21 18:28:10 +00:00
Ivan Carvalho
82355712c3 Add IBM to Who is Using ruff (#12433)

## Summary

Just updating the README to reflect that IBM has been using ruff for a
year already: https://github.com/Qiskit/qiskit/pull/10116.
2024-07-21 11:17:24 -05:00
Auguste Lalande
4bc73dd87e [pydoclint] Implement docstring-missing-exception and docstring-extraneous-exception (DOC501, DOC502) (#11471)
## Summary

These are the first rules implemented as part of #458, but I plan to
implement more.

Specifically, this implements `docstring-missing-exception` which checks
for raised exceptions not documented in the docstring, and
`docstring-extraneous-exception` which checks for exceptions in the
docstring not present in the body.
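
A small Google-style sketch of what the two rules compare (an assumed
example, not taken from the fixtures):

```python
def divide(a: float, b: float) -> float:
    """Divide ``a`` by ``b``.

    Raises:
        ValueError: Documented here but never raised in the body (DOC502).
    """
    if b == 0:
        # Raised but missing from the "Raises" section above (DOC501).
        raise ZeroDivisionError("b must be nonzero")
    return a / b
```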

## Test Plan

Test fixtures added for both google and numpy style.
2024-07-20 19:41:51 +00:00
T-256
53b84ab054 Cleanup redundant spaces from changelog (#12424) 2024-07-20 17:46:15 +00:00
Charlie Marsh
3664f85f45 Bump version to v0.5.4 (#12423) 2024-07-20 17:28:13 +00:00
Charlie Marsh
2c1926beeb Insert parentheses for multi-argument generators (#12422)
## Summary

Closes https://github.com/astral-sh/ruff/issues/12420.
2024-07-20 16:41:55 +00:00
Charlie Marsh
4bcc96ae51 Avoid shadowing diagnostics for @override methods (#12415)
Closes https://github.com/astral-sh/ruff/issues/12412.
2024-07-19 21:32:33 -04:00
FishAlchemist
c0a2b49bac Fix the Github link error for Neovim in the setup for editors in the docs. (#12410)
## Summary

Fix the GitHub link for Neovim in the editor setup docs.

## Test Plan
Clicked the Neovim GitHub link with MkDocs running locally.
2024-07-19 16:24:12 -04:00
Sashko
ca22248628 Update docs Settings output-format default (#12409)
## Update docs Settings output-format default

Fixes https://github.com/astral-sh/ruff/issues/12350

## Test Plan

Run all automation mentioned here
fe04f2b09d/CONTRIBUTING.md (development)

Manually verified changes in the generated MkDocs site.

Co-authored-by: Oleksandr Zavertniev <oleksandr.zavertniev@yellowbrick.com>
2024-07-19 17:51:46 +00:00
Alex Waygood
d8cf8ac2ef [red-knot] Resolve symbols from builtins.pyi in the stdlib if they cannot be found in other scopes (#12390)
Co-authored-by: Carl Meyer <carl@astral.sh>
2024-07-19 17:44:56 +01:00
Carl Meyer
1c7b84059e [red-knot] fix incremental benchmark (#12400)
We should write `BAR_CODE` to `bar.py`, not to `foo.py`.
2024-07-19 08:32:37 -07:00
Carl Meyer
f82bb67555 [red-knot] trace file when inferring types (#12401)
When poring over traces, the ones that just include a definition or
symbol or expression ID aren't very useful, because you don't know which
file it comes from. This adds that information to the trace.

I guess the downside here is that if calling `.file(db)` on a
scope/definition/expression would execute other traced code, it would be
marked as outside the span? I don't think that's a concern, because I
don't think a simple field access on a tracked struct should ever
execute our code. If I'm wrong and this is a problem, it seems like the
tracing crate has this feature where you can record a field as
`tracing::field::Empty` and then fill in its value later with
`span.record(...)`, but when I tried this it wasn't working for me, not
sure why.

I think there's a lot more we can do to make our tracing output more
useful for debugging (e.g. record an event whenever a
definition/symbol/expression/use id is created with the details of that
definition/symbol/expression/use), this is just dipping my toes in the
water.
2024-07-19 07:13:51 -07:00
707 changed files with 16806 additions and 5069 deletions

View File

@@ -616,10 +616,10 @@ jobs:
- uses: Swatinem/rust-cache@v2
- name: "Build benchmarks"
run: cargo codspeed build --features codspeed -p ruff_benchmark
run: cargo codspeed build -p ruff_benchmark
- name: "Run benchmarks"
uses: CodSpeedHQ/action@v2
uses: CodSpeedHQ/action@v3
with:
run: cargo codspeed run
token: ${{ secrets.CODSPEED_TOKEN }}

View File

@@ -23,6 +23,7 @@ jobs:
name: pr-number
run_id: ${{ github.event.workflow_run.id || github.event.inputs.workflow_run_id }}
if_no_artifact_found: ignore
allow_forks: true
- name: Parse pull request number
id: pr-number
@@ -43,6 +44,7 @@ jobs:
path: pr/ecosystem
workflow_conclusion: completed
if_no_artifact_found: ignore
allow_forks: true
- name: Generate comment content
id: generate-comment

View File

@@ -104,8 +104,8 @@ jobs:
run: |
branch_name="${{ env.branch_name }}"
git config user.name "$GITHUB_ACTOR"
git config user.email "$GITHUB_ACTOR@users.noreply.github.com"
git config user.name "astral-docs-bot"
git config user.email "176161322+astral-docs-bot@users.noreply.github.com"
git checkout -b $branch_name
git add site/ruff

View File

@@ -9,7 +9,8 @@ exclude: |
crates/ruff_python_formatter/resources/.*|
crates/ruff_python_formatter/tests/snapshots/.*|
crates/ruff_python_resolver/resources/.*|
crates/ruff_python_resolver/tests/snapshots/.*
crates/ruff_python_resolver/tests/snapshots/.*|
crates/red_knot_workspace/resources/.*
)$
repos:
@@ -42,7 +43,7 @@ repos:
)$
- repo: https://github.com/crate-ci/typos
rev: v1.23.2
rev: v1.23.5
hooks:
- id: typos
@@ -56,18 +57,13 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.5.2
rev: v0.5.5
hooks:
- id: ruff-format
- id: ruff
args: [--fix, --exit-non-zero-on-fix]
types_or: [python, pyi]
require_serial: true
exclude: |
(?x)^(
crates/ruff_linter/resources/.*|
crates/ruff_python_formatter/resources/.*
)$
# Prettier
- repo: https://github.com/pre-commit/mirrors-prettier

View File

@@ -1,5 +1,107 @@
# Changelog
## 0.5.6
Ruff 0.5.6 automatically enables linting and formatting of notebooks in *preview mode*.
You can opt-out of this behavior by adding `*.ipynb` to the `extend-exclude` setting.
```toml
[tool.ruff]
extend-exclude = ["*.ipynb"]
```
### Preview features
- Enable notebooks by default in preview mode ([#12621](https://github.com/astral-sh/ruff/pull/12621))
- \[`flake8-builtins`\] Implement import, lambda, and module shadowing ([#12546](https://github.com/astral-sh/ruff/pull/12546))
- \[`pydoclint`\] Add `docstring-missing-returns` (`DOC201`) and `docstring-extraneous-returns` (`DOC202`) ([#12485](https://github.com/astral-sh/ruff/pull/12485))
### Rule changes
- \[`flake8-return`\] Exempt cached properties and other property-like decorators from explicit return rule (`RET501`) ([#12563](https://github.com/astral-sh/ruff/pull/12563))
### Server
- Make server panic hook more error resilient ([#12610](https://github.com/astral-sh/ruff/pull/12610))
- Use `$/logTrace` for server trace logs in Zed and VS Code ([#12564](https://github.com/astral-sh/ruff/pull/12564))
- Keep track of deleted cells for reorder change request ([#12575](https://github.com/astral-sh/ruff/pull/12575))
### Configuration
- \[`flake8-implicit-str-concat`\] Always allow explicit multi-line concatenations when implicit concatenations are banned ([#12532](https://github.com/astral-sh/ruff/pull/12532))
### Bug fixes
- \[`flake8-async`\] Avoid flagging `asyncio.timeout`s as unused when the context manager includes `asyncio.TaskGroup` ([#12605](https://github.com/astral-sh/ruff/pull/12605))
- \[`flake8-slots`\] Avoid recommending `__slots__` for classes that inherit from more than `namedtuple` ([#12531](https://github.com/astral-sh/ruff/pull/12531))
- \[`isort`\] Avoid marking required imports as unused ([#12537](https://github.com/astral-sh/ruff/pull/12537))
- \[`isort`\] Preserve trailing inline comments on import-from statements ([#12498](https://github.com/astral-sh/ruff/pull/12498))
- \[`pycodestyle`\] Add newlines before comments (`E305`) ([#12606](https://github.com/astral-sh/ruff/pull/12606))
- \[`pycodestyle`\] Don't attach comments with mismatched indents ([#12604](https://github.com/astral-sh/ruff/pull/12604))
- \[`pyflakes`\] Fix preview-mode bugs in `F401` when attempting to autofix unused first-party submodule imports in an `__init__.py` file ([#12569](https://github.com/astral-sh/ruff/pull/12569))
- \[`pylint`\] Respect start index in `unnecessary-list-index-lookup` ([#12603](https://github.com/astral-sh/ruff/pull/12603))
- \[`pyupgrade`\] Avoid recommending no-argument super in `slots=True` dataclasses ([#12530](https://github.com/astral-sh/ruff/pull/12530))
- \[`pyupgrade`\] Use colon rather than dot formatting for integer-only types ([#12534](https://github.com/astral-sh/ruff/pull/12534))
- Fix NFKC normalization bug when removing unused imports ([#12571](https://github.com/astral-sh/ruff/pull/12571))
### Other changes
- Consider more stdlib decorators to be property-like ([#12583](https://github.com/astral-sh/ruff/pull/12583))
- Improve handling of metaclasses in various linter rules ([#12579](https://github.com/astral-sh/ruff/pull/12579))
- Improve consistency between linter rules in determining whether a function is a property ([#12581](https://github.com/astral-sh/ruff/pull/12581))
## 0.5.5
### Preview features
- \[`fastapi`\] Implement `fastapi-redundant-response-model` (`FAST001`) and `fastapi-non-annotated-dependency`(`FAST002`) ([#11579](https://github.com/astral-sh/ruff/pull/11579))
- \[`pydoclint`\] Implement `docstring-missing-exception` (`DOC501`) and `docstring-extraneous-exception` (`DOC502`) ([#11471](https://github.com/astral-sh/ruff/pull/11471))
### Rule changes
- \[`numpy`\] Fix NumPy 2.0 rule for `np.alltrue` and `np.sometrue` ([#12473](https://github.com/astral-sh/ruff/pull/12473))
- \[`numpy`\] Ignore `NPY201` inside `except` blocks for compatibility with older numpy versions ([#12490](https://github.com/astral-sh/ruff/pull/12490))
- \[`pep8-naming`\] Avoid applying `ignore-names` to `self` and `cls` function names (`N804`, `N805`) ([#12497](https://github.com/astral-sh/ruff/pull/12497))
### Formatter
- Fix incorrect placement of leading function comment with type params ([#12447](https://github.com/astral-sh/ruff/pull/12447))
### Server
- Do not bail code action resolution when a quick fix is requested ([#12462](https://github.com/astral-sh/ruff/pull/12462))
### Bug fixes
- Fix `Ord` implementation of `cmp_fix` ([#12471](https://github.com/astral-sh/ruff/pull/12471))
- Raise syntax error for unparenthesized generator expression in multi-argument call ([#12445](https://github.com/astral-sh/ruff/pull/12445))
- \[`pydoclint`\] Fix panic in `DOC501` reported in [#12428](https://github.com/astral-sh/ruff/pull/12428) ([#12435](https://github.com/astral-sh/ruff/pull/12435))
- \[`flake8-bugbear`\] Allow singleton tuples with starred expressions in `B013` ([#12484](https://github.com/astral-sh/ruff/pull/12484))
### Documentation
- Add Eglot setup guide for Emacs editor ([#12426](https://github.com/astral-sh/ruff/pull/12426))
- Add note about the breaking change in `nvim-lspconfig` ([#12507](https://github.com/astral-sh/ruff/pull/12507))
- Add note to include notebook files for native server ([#12449](https://github.com/astral-sh/ruff/pull/12449))
- Add setup docs for Zed editor ([#12501](https://github.com/astral-sh/ruff/pull/12501))
## 0.5.4
### Rule changes
- \[`ruff`\] Rename `RUF007` to `zip-instead-of-pairwise` ([#12399](https://github.com/astral-sh/ruff/pull/12399))
### Bug fixes
- \[`flake8-builtins`\] Avoid shadowing diagnostics for `@override` methods ([#12415](https://github.com/astral-sh/ruff/pull/12415))
- \[`flake8-comprehensions`\] Insert parentheses for multi-argument generators ([#12422](https://github.com/astral-sh/ruff/pull/12422))
- \[`pydocstyle`\] Handle escaped docstrings within docstring (`D301`) ([#12192](https://github.com/astral-sh/ruff/pull/12192))
### Documentation
- Fix GitHub link to Neovim setup ([#12410](https://github.com/astral-sh/ruff/pull/12410))
- Fix `output-format` default in settings reference ([#12409](https://github.com/astral-sh/ruff/pull/12409))
## 0.5.3
**Ruff 0.5.3 marks the stable release of the Ruff language server and introduces revamped

View File

@@ -905,7 +905,7 @@ There are three ways in which an import can be categorized as "first-party":
package (e.g., `from foo import bar` or `import foo.bar`), they'll be classified as first-party
automatically. This check is as simple as comparing the first segment of the current file's
module path to the first segment of the import.
1. **Source roots**: Ruff supports a `[src](https://docs.astral.sh/ruff/settings/#src)` setting, which
1. **Source roots**: Ruff supports a [`src`](https://docs.astral.sh/ruff/settings/#src) setting, which
sets the directories to scan when identifying first-party imports. The algorithm is
straightforward: given an import, like `import foo`, iterate over the directories enumerated in
the `src` setting and, for each directory, check for the existence of a subdirectory `foo` or a

246
Cargo.lock generated
View File

@@ -141,9 +141,9 @@ checksum = "69f7f8c3906b62b754cd5326047894316021dcfe5a194c8ea52bdd94934a3457"
[[package]]
name = "argfile"
version = "0.2.0"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b7c5c8e418080ef8aa932039d12eda7b6f5043baf48f1523c166fbc32d004534"
checksum = "0a1cc0ba69de57db40674c66f7cf2caee3981ddef084388482c95c0e2133e5e8"
dependencies = [
"fs-err",
"os_str_bytes",
@@ -189,10 +189,22 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b048fb63fd8b5923fc5aa7b340d8e156aec7ec02f0c78fa8a6ddc2613f6f71de"
[[package]]
name = "bstr"
version = "1.9.1"
name = "boomphf"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "05efc5cfd9110c8416e471df0e96702d58690178e206e61b7173706673c93706"
checksum = "617e2d952880a00583ddb9237ac3965732e8df6a92a8e7bcc054100ec467ec3b"
dependencies = [
"crossbeam-utils",
"log",
"rayon",
"wyhash",
]
[[package]]
name = "bstr"
version = "1.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "40723b8fb387abc38f4f4a37c09073622e41dd12327033091ef8950659e6dc0c"
dependencies = [
"memchr",
"regex-automata 0.4.6",
@@ -314,9 +326,9 @@ dependencies = [
[[package]]
name = "clap"
version = "4.5.9"
version = "4.5.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "64acc1846d54c1fe936a78dc189c34e28d3f5afc348403f28ecf53660b9b8462"
checksum = "35723e6a11662c2afb578bcf0b88bf6ea8e21282a953428f240574fcc3a2b5b3"
dependencies = [
"clap_builder",
"clap_derive",
@@ -324,9 +336,9 @@ dependencies = [
[[package]]
name = "clap_builder"
version = "4.5.9"
version = "4.5.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6fb8393d67ba2e7bfaf28a23458e4e2b543cc73a99595511eb207fdb8aede942"
checksum = "49eb96cbfa7cfa35017b7cd548c75b14c3118c98b423041d70562665e07fb0fa"
dependencies = [
"anstream",
"anstyle",
@@ -367,9 +379,9 @@ dependencies = [
[[package]]
name = "clap_derive"
version = "4.5.8"
version = "4.5.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2bac35c6dafb060fd4d275d9a4ffae97917c13a6327903a8be2153cd964f7085"
checksum = "5d029b67f89d30bbb547c89fd5161293c0aec155fc691d7924b64550662db93e"
dependencies = [
"heck",
"proc-macro2",
@@ -759,9 +771,9 @@ dependencies = [
[[package]]
name = "env_logger"
version = "0.11.3"
version = "0.11.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38b35839ba51819680ba087cd351788c9a3c476841207e0b8cee0b04722343b9"
checksum = "e13fa619b91fb2381732789fc5de83b45675e882f66623b7d8cb4f643017018d"
dependencies = [
"anstream",
"anstyle",
@@ -930,9 +942,9 @@ dependencies = [
[[package]]
name = "hashlink"
version = "0.8.4"
version = "0.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e8094feaf31ff591f651a2664fb9cfd92bba7a60ce3197265e9482ebe753c8f7"
checksum = "6ba4ff7128dee98c7dc9794b6a411377e1404dba1c97deb8d1a55297bd25d8af"
dependencies = [
"hashbrown",
]
@@ -1021,9 +1033,9 @@ dependencies = [
[[package]]
name = "imara-diff"
version = "0.1.6"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "af13c8ceb376860ff0c6a66d83a8cdd4ecd9e464da24621bbffcd02b49619434"
checksum = "fc9da1a252bd44cd341657203722352efc9bc0c847d06ea6d2dc1cd1135e0a01"
dependencies = [
"ahash",
"hashbrown",
@@ -1031,9 +1043,9 @@ dependencies = [
[[package]]
name = "imperative"
version = "1.0.5"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8b70798296d538cdaa6d652941fcc795963f8b9878b9e300c9fab7a522bd2fc0"
checksum = "29a1f6526af721f9aec9ceed7ab8ebfca47f3399d08b80056c2acca3fcb694a9"
dependencies = [
"phf",
"rust-stemmers",
@@ -1526,10 +1538,84 @@ dependencies = [
]
[[package]]
name = "os_str_bytes"
version = "6.6.1"
name = "orx-concurrent-ordered-bag"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e2355d85b9a3786f481747ced0e0ff2ba35213a1f9bd406ed906554d7af805a1"
checksum = "9aa866e2be4aa03927eddb481e7c479d5109fe3121324fb7db6d97f91adf9876"
dependencies = [
"orx-fixed-vec",
"orx-pinned-concurrent-col",
"orx-pinned-vec",
"orx-pseudo-default",
"orx-split-vec",
]
[[package]]
name = "orx-concurrent-vec"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c5912426ffb660f8b61e8f0812a1d07400803cd5513969d2c7af4d69602ba8a1"
dependencies = [
"orx-concurrent-ordered-bag",
"orx-fixed-vec",
"orx-pinned-concurrent-col",
"orx-pinned-vec",
"orx-pseudo-default",
"orx-split-vec",
]
[[package]]
name = "orx-fixed-vec"
version = "3.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7f69466c7c1fc2e1f00b58e39059b78c438b9fad144d1937ef177ecfc413e997"
dependencies = [
"orx-pinned-vec",
"orx-pseudo-default",
]
[[package]]
name = "orx-pinned-concurrent-col"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fdbcb1fa05dc1676f1c9cf19f443b3d2d2ca5835911477d22fa77cad8b79208d"
dependencies = [
"orx-fixed-vec",
"orx-pinned-vec",
"orx-pseudo-default",
"orx-split-vec",
]
[[package]]
name = "orx-pinned-vec"
version = "3.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c1071baf586de45722668234bddf56c52c1ece6a6153d16541bbb0505f0ac055"
dependencies = [
"orx-pseudo-default",
]
[[package]]
name = "orx-pseudo-default"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a2f627c439e723fa78e410a0faba89047a8a47d0dc013da5c0e05806e8a6cddb"
[[package]]
name = "orx-split-vec"
version = "3.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "52b9dbfa8c7069ae73a890870d3aa9097a897d616751d3d0278f2b42d5214730"
dependencies = [
"orx-pinned-vec",
"orx-pseudo-default",
]
[[package]]
name = "os_str_bytes"
version = "7.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7ac44c994af577c799b1b4bd80dc214701e349873ad894d6cdf96f4f7526e0b9"
dependencies = [
"memchr",
]
@@ -1858,14 +1944,13 @@ dependencies = [
"countme",
"crossbeam",
"ctrlc",
"notify",
"filetime",
"rayon",
"red_knot_module_resolver",
"red_knot_python_semantic",
"red_knot_workspace",
"ruff_db",
"ruff_python_ast",
"rustc-hash 2.0.0",
"salsa",
"tempfile",
"tracing",
"tracing-subscriber",
"tracing-tree",
@@ -1897,6 +1982,7 @@ version = "0.0.0"
dependencies = [
"anyhow",
"bitflags 2.6.0",
"countme",
"hashbrown",
"ordermap",
"red_knot_module_resolver",
@@ -1904,13 +1990,28 @@ dependencies = [
"ruff_index",
"ruff_python_ast",
"ruff_python_parser",
"ruff_python_trivia",
"ruff_text_size",
"rustc-hash 2.0.0",
"salsa",
"tracing",
]
[[package]]
name = "red_knot_workspace"
version = "0.0.0"
dependencies = [
"anyhow",
"crossbeam",
"notify",
"red_knot_module_resolver",
"red_knot_python_semantic",
"ruff_db",
"ruff_python_ast",
"rustc-hash 2.0.0",
"salsa",
"tracing",
]
[[package]]
name = "redox_syscall"
version = "0.4.1"
@@ -1992,7 +2093,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.5.3"
version = "0.5.6"
dependencies = [
"anyhow",
"argfile",
@@ -2047,10 +2148,9 @@ name = "ruff_benchmark"
version = "0.0.0"
dependencies = [
"codspeed-criterion-compat",
"criterion",
"mimalloc",
"once_cell",
"red_knot",
"red_knot_workspace",
"ruff_db",
"ruff_linter",
"ruff_python_ast",
@@ -2087,10 +2187,13 @@ dependencies = [
"filetime",
"ignore",
"insta",
"matchit",
"path-slash",
"ruff_cache",
"ruff_notebook",
"ruff_python_ast",
"ruff_python_parser",
"ruff_python_trivia",
"ruff_source_file",
"ruff_text_size",
"rustc-hash 2.0.0",
@@ -2176,7 +2279,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.5.3"
version = "0.5.6"
dependencies = [
"aho-corasick",
"annotate-snippets 0.9.2",
@@ -2230,6 +2333,7 @@ dependencies = [
"thiserror",
"toml",
"typed-arena",
"unicode-normalization",
"unicode-width",
"unicode_names2",
"url",
@@ -2399,13 +2503,17 @@ version = "0.0.0"
dependencies = [
"bitflags 2.6.0",
"is-macro",
"ruff_cache",
"ruff_index",
"ruff_macros",
"ruff_python_ast",
"ruff_python_parser",
"ruff_python_stdlib",
"ruff_source_file",
"ruff_text_size",
"rustc-hash 2.0.0",
"schemars",
"serde",
]
[[package]]
@@ -2491,7 +2599,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.5.3"
version = "0.5.6"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -2538,6 +2646,7 @@ dependencies = [
"ruff_macros",
"ruff_python_ast",
"ruff_python_formatter",
"ruff_python_semantic",
"ruff_source_file",
"rustc-hash 2.0.0",
"schemars",
@@ -2630,25 +2739,34 @@ checksum = "e86697c916019a8588c99b5fac3cead74ec0b4b819707a682fd4d23fa0ce1ba1"
[[package]]
name = "salsa"
version = "0.18.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=a1bf3a613f451af7fc0a59411c56abc47fe8e8e1#a1bf3a613f451af7fc0a59411c56abc47fe8e8e1"
source = "git+https://github.com/MichaReiser/salsa.git?rev=0cae5c52a3240172ef0be5c9d19e63448c53397c#0cae5c52a3240172ef0be5c9d19e63448c53397c"
dependencies = [
"arc-swap",
"boomphf",
"crossbeam",
"dashmap 5.5.3",
"dashmap 6.0.1",
"hashlink",
"indexmap",
"log",
"orx-concurrent-vec",
"parking_lot",
"rustc-hash 1.1.0",
"rustc-hash 2.0.0",
"salsa-macro-rules",
"salsa-macros",
"smallvec",
"tracing",
]
[[package]]
name = "salsa-macro-rules"
version = "0.1.0"
source = "git+https://github.com/MichaReiser/salsa.git?rev=0cae5c52a3240172ef0be5c9d19e63448c53397c#0cae5c52a3240172ef0be5c9d19e63448c53397c"
[[package]]
name = "salsa-macros"
version = "0.18.0"
source = "git+https://github.com/salsa-rs/salsa.git?rev=a1bf3a613f451af7fc0a59411c56abc47fe8e8e1#a1bf3a613f451af7fc0a59411c56abc47fe8e8e1"
source = "git+https://github.com/MichaReiser/salsa.git?rev=0cae5c52a3240172ef0be5c9d19e63448c53397c#0cae5c52a3240172ef0be5c9d19e63448c53397c"
dependencies = [
"heck",
"proc-macro2",
"quote",
"syn",
@@ -2750,11 +2868,12 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.120"
version = "1.0.121"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4e0d21c9a8cae1235ad58a00c11cb40d4b1e5c784f1ef2c537876ed6ffd8b7c5"
checksum = "4ab380d7d9f22ef3f21ad3e6c1ebe8e4fc7a2000ccba2e4d71fc96f15b2cb609"
dependencies = [
"itoa",
"memchr",
"ryu",
"serde",
]
@@ -2772,9 +2891,9 @@ dependencies = [
[[package]]
name = "serde_spanned"
version = "0.6.6"
version = "0.6.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "79e674e01f999af37c49f70a6ede167a8a60b2503e56c5599532a65baa5969a0"
checksum = "eb5b1b31579f3811bf615c144393417496f152e12ac8b7663bf664f4a815306d"
dependencies = [
"serde",
]
@@ -2910,9 +3029,9 @@ checksum = "81cdd64d312baedb58e21336b31bc043b77e01cc99033ce76ef539f78e965ebc"
[[package]]
name = "syn"
version = "2.0.71"
version = "2.0.72"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b146dcf730474b4bcd16c311627b31ede9ab149045db4d6088b3becaea046462"
checksum = "dc4b9b9bf2add8093d3f2c0204471e951b2285580335de42f9d2534f3ae7a8af"
dependencies = [
"proc-macro2",
"quote",
@@ -3000,18 +3119,18 @@ dependencies = [
[[package]]
name = "thiserror"
version = "1.0.62"
version = "1.0.63"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f2675633b1499176c2dff06b0856a27976a8f9d436737b4cf4f312d4d91d8bbb"
checksum = "c0342370b38b6a11b6cc11d6a805569958d54cfa061a29969c3b5ce2ea405724"
dependencies = [
"thiserror-impl",
]
[[package]]
name = "thiserror-impl"
version = "1.0.62"
version = "1.0.63"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d20468752b09f49e909e55a5d338caa8bedf615594e9d80bc4c565d30faf798c"
checksum = "a4558b58466b9ad7ca0f102865eccc95938dca1a74a856f2b57b6629050da261"
dependencies = [
"proc-macro2",
"quote",
@@ -3075,9 +3194,9 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "toml"
version = "0.8.14"
version = "0.8.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6f49eb2ab21d2f26bd6db7bf383edc527a7ebaee412d17af4d40fdccd442f335"
checksum = "81967dd0dd2c1ab0bc3468bd7caecc32b8a4aa47d0c8c695d8c2b2108168d62c"
dependencies = [
"serde",
"serde_spanned",
@@ -3087,18 +3206,18 @@ dependencies = [
[[package]]
name = "toml_datetime"
version = "0.6.6"
version = "0.6.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4badfd56924ae69bcc9039335b2e017639ce3f9b001c393c1b2d1ef846ce2cbf"
checksum = "f8fb9f64314842840f1d940ac544da178732128f1c78c21772e876579e0da1db"
dependencies = [
"serde",
]
[[package]]
name = "toml_edit"
version = "0.22.14"
version = "0.22.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f21c7aaf97f1bd9ca9d4f9e73b0a6c74bd5afef56f2bc931943a6e1c37e04e38"
checksum = "8d9f8729f5aea9562aac1cc0441f5d6de3cff1ee0c5d67293eeca5eb36ee7c16"
dependencies = [
"indexmap",
"serde",
@@ -3183,9 +3302,9 @@ dependencies = [
[[package]]
name = "tracing-tree"
version = "0.3.1"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b56c62d2c80033cb36fae448730a2f2ef99410fe3ecbffc916681a32f6807dbe"
checksum = "f459ca79f1b0d5f71c54ddfde6debfc59c8b6eeb46808ae492077f739dc7b49c"
dependencies = [
"nu-ansi-term 0.50.0",
"tracing-core",
@@ -3338,9 +3457,9 @@ checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]]
name = "uuid"
version = "1.9.1"
version = "1.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5de17fd2f7da591098415cff336e12965a28061ddace43b59cb3c430179c9439"
checksum = "81dfa00651efa65069b0b6b651f4aaa31ba9e3c3ce0137aaad053604ee7e0314"
dependencies = [
"getrandom",
"rand",
@@ -3350,9 +3469,9 @@ dependencies = [
[[package]]
name = "uuid-macro-internal"
version = "1.9.1"
version = "1.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a3ff64d5cde1e2cb5268bdb497235b6bd255ba8244f910dbc3574e59593de68c"
checksum = "ee1cd046f83ea2c4e920d6ee9f7c3537ef928d75dce5d84a87c2c5d6b3999a3a"
dependencies = [
"proc-macro2",
"quote",
@@ -3745,6 +3864,15 @@ version = "0.0.19"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d135d17ab770252ad95e9a872d365cf3090e3be864a34ab46f48555993efc904"
[[package]]
name = "wyhash"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "baf6e163c25e3fac820b4b453185ea2dea3b6a3e0a721d4d23d75bd33734c295"
dependencies = [
"rand_core",
]
[[package]]
name = "yansi"
version = "0.5.1"

View File

@@ -4,7 +4,7 @@ resolver = "2"
[workspace.package]
edition = "2021"
rust-version = "1.75"
rust-version = "1.76"
homepage = "https://docs.astral.sh/ruff"
documentation = "https://docs.astral.sh/ruff"
repository = "https://github.com/astral-sh/ruff"
@@ -35,9 +35,9 @@ ruff_source_file = { path = "crates/ruff_source_file" }
ruff_text_size = { path = "crates/ruff_text_size" }
ruff_workspace = { path = "crates/ruff_workspace" }
red_knot = { path = "crates/red_knot" }
red_knot_module_resolver = { path = "crates/red_knot_module_resolver" }
red_knot_python_semantic = { path = "crates/red_knot_python_semantic" }
red_knot_workspace = { path = "crates/red_knot_workspace" }
aho-corasick = { version = "1.1.3" }
annotate-snippets = { version = "0.9.2", features = ["color"] }
@@ -58,7 +58,6 @@ console_error_panic_hook = { version = "0.1.7" }
console_log = { version = "1.0.0" }
countme = { version = "3.0.1" }
compact_str = "0.8.0"
criterion = { version = "0.5.1", default-features = false }
crossbeam = { version = "0.8.4" }
dashmap = { version = "6.0.1" }
drop_bomb = { version = "0.1.5" }
@@ -108,7 +107,7 @@ rand = { version = "0.8.5" }
rayon = { version = "1.10.0" }
regex = { version = "1.10.2" }
rustc-hash = { version = "2.0.0" }
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "a1bf3a613f451af7fc0a59411c56abc47fe8e8e1" }
salsa = { git = "https://github.com/MichaReiser/salsa.git", rev = "0cae5c52a3240172ef0be5c9d19e63448c53397c" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }
@@ -133,7 +132,7 @@ toml = { version = "0.8.11" }
tracing = { version = "0.1.40" }
tracing-indicatif = { version = "0.3.6" }
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }
tracing-tree = { version = "0.3.0" }
tracing-tree = { version = "0.4.0" }
typed-arena = { version = "2.0.2" }
unic-ucd-category = { version = "0.9" }
unicode-ident = { version = "1.0.12" }
@@ -157,6 +156,7 @@ zip = { version = "0.6.6", default-features = false, features = ["zstd"] }
[workspace.lints.rust]
unsafe_code = "warn"
unreachable_pub = "warn"
unexpected_cfgs = { level = "warn", check-cfg = ['cfg(fuzzing)'] }
[workspace.lints.clippy]
pedantic = { level = "warn", priority = -2 }

25
LICENSE
View File

@@ -1371,3 +1371,28 @@ are:
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
- pydoclint, licensed as follows:
"""
MIT License
Copyright (c) 2023 jsh9
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""


@@ -136,8 +136,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.5.3/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.5.3/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.5.6/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.5.6/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -170,7 +170,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.5.3
rev: v0.5.6
hooks:
# Run the linter.
- id: ruff
@@ -424,6 +424,7 @@ Ruff is used by a number of major open-source projects and companies, including:
- [Dagger](https://github.com/dagger/dagger)
- [Dagster](https://github.com/dagster-io/dagster)
- Databricks ([MLflow](https://github.com/mlflow/mlflow))
- [Dify](https://github.com/langgenius/dify)
- [FastAPI](https://github.com/tiangolo/fastapi)
- [Godot](https://github.com/godotengine/godot)
- [Gradio](https://github.com/gradio-app/gradio)
@@ -434,6 +435,7 @@ Ruff is used by a number of major open-source projects and companies, including:
- Hugging Face ([Transformers](https://github.com/huggingface/transformers),
[Datasets](https://github.com/huggingface/datasets),
[Diffusers](https://github.com/huggingface/diffusers))
- IBM ([Qiskit](https://github.com/Qiskit/qiskit))
- ING Bank ([popmon](https://github.com/ing-bank/popmon), [probatus](https://github.com/ing-bank/probatus))
- [Ibis](https://github.com/ibis-project/ibis)
- [ivy](https://github.com/unifyai/ivy)


@@ -10,4 +10,12 @@ doc-valid-idents = [
"SCREAMING_SNAKE_CASE",
"SQLAlchemy",
"StackOverflow",
"PyCharm",
]
ignore-interior-mutability = [
# Interned is read-only. The wrapped `Rc` never gets updated.
"ruff_formatter::format_element::Interned",
# The expression is read-only.
"ruff_python_ast::hashable::HashableExpr",
]


@@ -13,24 +13,25 @@ license.workspace = true
[dependencies]
red_knot_module_resolver = { workspace = true }
red_knot_python_semantic = { workspace = true }
red_knot_workspace = { workspace = true }
ruff_db = { workspace = true, features = ["os", "cache"] }
ruff_python_ast = { workspace = true }
anyhow = { workspace = true }
clap = { workspace = true, features = ["wrap_help"] }
countme = { workspace = true, features = ["enable"] }
crossbeam = { workspace = true }
ctrlc = { version = "3.4.4" }
notify = { workspace = true }
rayon = { workspace = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true }
tracing-tree = { workspace = true }
[dev-dependencies]
filetime = { workspace = true }
tempfile = { workspace = true }
[lints]
workspace = true


@@ -2,7 +2,6 @@ use std::sync::Mutex;
use clap::Parser;
use crossbeam::channel as crossbeam_channel;
use salsa::ParallelDatabase;
use tracing::subscriber::Interest;
use tracing::{Level, Metadata};
use tracing_subscriber::filter::LevelFilter;
@@ -10,10 +9,10 @@ use tracing_subscriber::layer::{Context, Filter, SubscriberExt};
use tracing_subscriber::{Layer, Registry};
use tracing_tree::time::Uptime;
use red_knot::db::RootDatabase;
use red_knot::watch::FileWatcher;
use red_knot::watch::FileWatcherChange;
use red_knot::workspace::WorkspaceMetadata;
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::watch;
use red_knot_workspace::watch::WorkspaceWatcher;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::program::{ProgramSettings, SearchPathSettings};
use ruff_db::system::{OsSystem, System, SystemPathBuf};
@@ -57,6 +56,13 @@ struct Args {
#[clap(flatten)]
verbosity: Verbosity,
#[arg(
long,
help = "Run in watch mode by re-running whenever files change",
short = 'W'
)]
watch: bool,
}
#[allow(
@@ -72,6 +78,7 @@ pub fn main() -> anyhow::Result<()> {
extra_search_path: extra_paths,
target_version,
verbosity,
watch,
} = Args::parse_from(std::env::args().collect::<Vec<_>>());
let verbosity = verbosity.level();
@@ -97,13 +104,13 @@ pub fn main() -> anyhow::Result<()> {
extra_paths,
workspace_root: workspace_metadata.root().to_path_buf(),
custom_typeshed: custom_typeshed_dir,
site_packages: None,
site_packages: vec![],
},
};
// TODO: Use the `program_settings` to compute the key for the database's persistent
// cache and load the cache if it exists.
let mut db = RootDatabase::new(workspace_metadata, program_settings, system);
let db = RootDatabase::new(workspace_metadata, program_settings, system);
let (main_loop, main_loop_cancellation_token) = MainLoop::new(verbosity);
@@ -117,125 +124,124 @@ pub fn main() -> anyhow::Result<()> {
}
})?;
let file_changes_notifier = main_loop.file_changes_notifier();
let mut db = salsa::Handle::new(db);
if watch {
main_loop.watch(&mut db)?;
} else {
main_loop.run(&mut db);
};
// Watch for file changes and re-trigger the analysis.
let mut file_watcher = FileWatcher::new(move |changes| {
file_changes_notifier.notify(changes);
})?;
file_watcher.watch_folder(db.workspace().root(&db).as_std_path())?;
main_loop.run(&mut db);
println!("{}", countme::get_all());
std::mem::forget(db);
Ok(())
}
struct MainLoop {
verbosity: Option<VerbosityLevel>,
orchestrator: crossbeam_channel::Sender<OrchestratorMessage>,
/// Sender that can be used to send messages to the main loop.
sender: crossbeam_channel::Sender<MainLoopMessage>,
/// Receiver for the messages sent **to** the main loop.
receiver: crossbeam_channel::Receiver<MainLoopMessage>,
/// The file system watcher, if running in watch mode.
watcher: Option<WorkspaceWatcher>,
verbosity: Option<VerbosityLevel>,
}
impl MainLoop {
fn new(verbosity: Option<VerbosityLevel>) -> (Self, MainLoopCancellationToken) {
let (orchestrator_sender, orchestrator_receiver) = crossbeam_channel::bounded(1);
let (main_loop_sender, main_loop_receiver) = crossbeam_channel::bounded(1);
let mut orchestrator = Orchestrator {
receiver: orchestrator_receiver,
main_loop: main_loop_sender.clone(),
revision: 0,
};
std::thread::spawn(move || {
orchestrator.run();
});
let (sender, receiver) = crossbeam_channel::bounded(10);
(
Self {
sender: sender.clone(),
receiver,
watcher: None,
verbosity,
orchestrator: orchestrator_sender,
receiver: main_loop_receiver,
},
MainLoopCancellationToken {
sender: main_loop_sender,
},
MainLoopCancellationToken { sender },
)
}
fn file_changes_notifier(&self) -> FileChangesNotifier {
FileChangesNotifier {
sender: self.orchestrator.clone(),
}
fn watch(mut self, db: &mut salsa::Handle<RootDatabase>) -> anyhow::Result<()> {
let sender = self.sender.clone();
let watcher = watch::directory_watcher(move |event| {
sender.send(MainLoopMessage::ApplyChanges(event)).unwrap();
})?;
self.watcher = Some(WorkspaceWatcher::new(watcher, db));
self.run(db);
Ok(())
}
#[allow(clippy::print_stderr)]
fn run(self, db: &mut RootDatabase) {
self.orchestrator.send(OrchestratorMessage::Run).unwrap();
fn run(mut self, db: &mut salsa::Handle<RootDatabase>) {
// Schedule the first check.
self.sender.send(MainLoopMessage::CheckWorkspace).unwrap();
let mut revision = 0usize;
for message in &self.receiver {
while let Ok(message) = self.receiver.recv() {
tracing::trace!("Main Loop: Tick");
match message {
MainLoopMessage::CheckWorkspace { revision } => {
let db = db.snapshot();
let orchestrator = self.orchestrator.clone();
MainLoopMessage::CheckWorkspace => {
let db = db.clone();
let sender = self.sender.clone();
// Spawn a new task that checks the workspace. This needs to be done in a separate thread
// to prevent blocking the main loop here.
rayon::spawn(move || {
if let Ok(result) = db.check() {
orchestrator
.send(OrchestratorMessage::CheckCompleted {
diagnostics: result,
revision,
})
.unwrap();
// Send the result back to the main loop for printing.
sender
.send(MainLoopMessage::CheckCompleted { result, revision })
.ok();
}
});
}
MainLoopMessage::ApplyChanges(changes) => {
// Automatically cancels any pending queries and waits for them to complete.
db.apply_changes(changes);
}
MainLoopMessage::CheckCompleted(diagnostics) => {
eprintln!("{}", diagnostics.join("\n"));
if self.verbosity == Some(VerbosityLevel::Trace) {
eprintln!("{}", countme::get_all());
MainLoopMessage::CheckCompleted {
result,
revision: check_revision,
} => {
if check_revision == revision {
eprintln!("{}", result.join("\n"));
if self.verbosity == Some(VerbosityLevel::Trace) {
eprintln!("{}", countme::get_all());
}
}
if self.watcher.is_none() {
return self.exit();
}
}
MainLoopMessage::ApplyChanges(changes) => {
revision += 1;
// Automatically cancels any pending queries and waits for them to complete.
db.get_mut().apply_changes(changes);
if let Some(watcher) = self.watcher.as_mut() {
watcher.update(db);
}
self.sender.send(MainLoopMessage::CheckWorkspace).unwrap();
}
MainLoopMessage::Exit => {
if self.verbosity == Some(VerbosityLevel::Trace) {
eprintln!("{}", countme::get_all());
}
return;
return self.exit();
}
}
}
self.exit();
}
}
impl Drop for MainLoop {
fn drop(&mut self) {
self.orchestrator
.send(OrchestratorMessage::Shutdown)
.unwrap();
}
}
#[derive(Debug, Clone)]
struct FileChangesNotifier {
sender: crossbeam_channel::Sender<OrchestratorMessage>,
}
impl FileChangesNotifier {
fn notify(&self, changes: Vec<FileWatcherChange>) {
self.sender
.send(OrchestratorMessage::FileChanges(changes))
.unwrap();
#[allow(clippy::print_stderr, clippy::unused_self)]
fn exit(self) {
if self.verbosity == Some(VerbosityLevel::Trace) {
eprintln!("Exit");
eprintln!("{}", countme::get_all());
}
}
}
@@ -250,115 +256,16 @@ impl MainLoopCancellationToken {
}
}
struct Orchestrator {
/// Sends messages to the main loop.
main_loop: crossbeam_channel::Sender<MainLoopMessage>,
/// Receives messages from the main loop.
receiver: crossbeam_channel::Receiver<OrchestratorMessage>,
revision: usize,
}
impl Orchestrator {
#[allow(clippy::print_stderr)]
fn run(&mut self) {
while let Ok(message) = self.receiver.recv() {
match message {
OrchestratorMessage::Run => {
self.main_loop
.send(MainLoopMessage::CheckWorkspace {
revision: self.revision,
})
.unwrap();
}
OrchestratorMessage::CheckCompleted {
diagnostics,
revision,
} => {
// Only take the diagnostics if they are for the latest revision.
if self.revision == revision {
self.main_loop
.send(MainLoopMessage::CheckCompleted(diagnostics))
.unwrap();
} else {
tracing::debug!("Discarding diagnostics for outdated revision {revision} (current: {}).", self.revision);
}
}
OrchestratorMessage::FileChanges(changes) => {
// Request cancellation, but wait until all analysis tasks have completed to
// avoid stale messages in the next main loop.
self.revision += 1;
self.debounce_changes(changes);
}
OrchestratorMessage::Shutdown => {
return self.shutdown();
}
}
}
}
fn debounce_changes(&self, mut changes: Vec<FileWatcherChange>) {
loop {
// Consume possibly incoming file change messages before running a new analysis, but don't wait for more than 100ms.
crossbeam_channel::select! {
recv(self.receiver) -> message => {
match message {
Ok(OrchestratorMessage::Shutdown) => {
return self.shutdown();
}
Ok(OrchestratorMessage::FileChanges(file_changes)) => {
changes.extend(file_changes);
}
Ok(OrchestratorMessage::CheckCompleted { .. })=> {
// disregard any outdated completion message.
}
Ok(OrchestratorMessage::Run) => unreachable!("The orchestrator is already running."),
Err(_) => {
// There are no more senders, no point in waiting for more messages
return;
}
}
},
default(std::time::Duration::from_millis(10)) => {
// No more file changes after 10 ms, send the changes and schedule a new analysis
self.main_loop.send(MainLoopMessage::ApplyChanges(changes)).unwrap();
self.main_loop.send(MainLoopMessage::CheckWorkspace { revision: self.revision}).unwrap();
return;
}
}
}
}
#[allow(clippy::unused_self)]
fn shutdown(&self) {
tracing::trace!("Shutting down orchestrator.");
}
}
/// Message sent from the orchestrator to the main loop.
#[derive(Debug)]
enum MainLoopMessage {
CheckWorkspace { revision: usize },
CheckCompleted(Vec<String>),
ApplyChanges(Vec<FileWatcherChange>),
Exit,
}
#[derive(Debug)]
enum OrchestratorMessage {
Run,
Shutdown,
CheckWorkspace,
CheckCompleted {
diagnostics: Vec<String>,
result: Vec<String>,
revision: usize,
},
FileChanges(Vec<FileWatcherChange>),
ApplyChanges(Vec<watch::ChangeEvent>),
Exit,
}
fn setup_tracing(verbosity: Option<VerbosityLevel>) {
@@ -392,6 +299,9 @@ impl LoggingFilter {
fn is_enabled(&self, meta: &Metadata<'_>) -> bool {
let filter = if meta.target().starts_with("red_knot") || meta.target().starts_with("ruff") {
self.trace_level
} else if meta.target().starts_with("salsa") && self.trace_level <= Level::INFO {
// Salsa emits very verbose query traces with level info. Let's not show these to the user.
Level::WARN
} else {
Level::INFO
};
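
The reworked main loop above tags every check with a revision counter so that results computed before the latest batch of file changes are discarded. The following is a minimal, self-contained sketch of that guard, not the actual implementation: it uses `std::sync::mpsc` instead of the crossbeam channels used in the real code, and `Vec<String>` stands in for both the diagnostics and the change events.

```rust
use std::sync::mpsc;

enum MainLoopMessage {
    CheckCompleted { result: Vec<String>, revision: usize },
    ApplyChanges(Vec<String>),
    Exit,
}

fn main_loop(receiver: mpsc::Receiver<MainLoopMessage>) {
    let mut revision = 0usize;
    while let Ok(message) = receiver.recv() {
        match message {
            MainLoopMessage::CheckCompleted { result, revision: check_revision } => {
                // Only report diagnostics computed for the latest revision;
                // anything older raced with a file change and is dropped.
                if check_revision == revision {
                    eprintln!("{}", result.join("\n"));
                }
            }
            MainLoopMessage::ApplyChanges(_changes) => {
                // A new batch of file changes invalidates any in-flight check:
                // bump the revision so its eventual `CheckCompleted` is ignored,
                // then (in the real loop) a fresh workspace check is scheduled.
                revision += 1;
            }
            MainLoopMessage::Exit => return,
        }
    }
}
```

This is also why the separate orchestrator thread removed below is no longer needed: the revision bookkeeping now lives directly in the main loop.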


@@ -1,111 +0,0 @@
use std::path::Path;
use anyhow::Context;
use notify::event::{CreateKind, ModifyKind, RemoveKind};
use notify::{recommended_watcher, Event, EventKind, RecommendedWatcher, RecursiveMode, Watcher};
use ruff_db::system::{SystemPath, SystemPathBuf};
pub struct FileWatcher {
watcher: RecommendedWatcher,
}
pub trait EventHandler: Send + 'static {
fn handle(&self, changes: Vec<FileWatcherChange>);
}
impl<F> EventHandler for F
where
F: Fn(Vec<FileWatcherChange>) + Send + 'static,
{
fn handle(&self, changes: Vec<FileWatcherChange>) {
let f = self;
f(changes);
}
}
impl FileWatcher {
pub fn new<E>(handler: E) -> anyhow::Result<Self>
where
E: EventHandler,
{
Self::from_handler(Box::new(handler))
}
fn from_handler(handler: Box<dyn EventHandler>) -> anyhow::Result<Self> {
let watcher = recommended_watcher(move |event: notify::Result<Event>| {
match event {
Ok(event) => {
// TODO verify that this handles all events correctly
let change_kind = match event.kind {
EventKind::Create(CreateKind::File) => FileChangeKind::Created,
EventKind::Modify(ModifyKind::Name(notify::event::RenameMode::From)) => {
FileChangeKind::Deleted
}
EventKind::Modify(ModifyKind::Name(notify::event::RenameMode::To)) => {
FileChangeKind::Created
}
EventKind::Modify(ModifyKind::Name(notify::event::RenameMode::Any)) => {
// TODO Introduce a better catch all event for cases that we don't understand.
FileChangeKind::Created
}
EventKind::Modify(ModifyKind::Name(notify::event::RenameMode::Both)) => {
todo!("Handle both create and delete event.");
}
EventKind::Modify(_) => FileChangeKind::Modified,
EventKind::Remove(RemoveKind::File) => FileChangeKind::Deleted,
_ => {
return;
}
};
let mut changes = Vec::new();
for path in event.paths {
if let Some(fs_path) = SystemPath::from_std_path(&path) {
changes
.push(FileWatcherChange::new(fs_path.to_path_buf(), change_kind));
}
}
if !changes.is_empty() {
handler.handle(changes);
}
}
// TODO proper error handling
Err(err) => {
panic!("Error: {err}");
}
}
})
.context("Failed to create file watcher.")?;
Ok(Self { watcher })
}
pub fn watch_folder(&mut self, path: &Path) -> anyhow::Result<()> {
self.watcher.watch(path, RecursiveMode::Recursive)?;
Ok(())
}
}
#[derive(Clone, Debug)]
pub struct FileWatcherChange {
pub path: SystemPathBuf,
#[allow(unused)]
pub kind: FileChangeKind,
}
impl FileWatcherChange {
pub fn new(path: SystemPathBuf, kind: FileChangeKind) -> Self {
Self { path, kind }
}
}
#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub enum FileChangeKind {
Created,
Modified,
Deleted,
}

File diff suppressed because it is too large.


@@ -1,29 +1,12 @@
use ruff_db::Upcast;
use crate::resolver::{
editable_install_resolution_paths, file_to_module, internal::ModuleNameIngredient,
module_resolution_settings, resolve_module_query,
};
use crate::typeshed::parse_typeshed_versions;
#[salsa::jar(db=Db)]
pub struct Jar(
ModuleNameIngredient<'_>,
module_resolution_settings,
editable_install_resolution_paths,
resolve_module_query,
file_to_module,
parse_typeshed_versions,
);
pub trait Db: salsa::DbWithJar<Jar> + ruff_db::Db + Upcast<dyn ruff_db::Db> {}
#[salsa::db]
pub trait Db: ruff_db::Db + Upcast<dyn ruff_db::Db> {}
#[cfg(test)]
pub(crate) mod tests {
use std::sync;
use salsa::DebugWithDb;
use ruff_db::files::Files;
use ruff_db::system::{DbWithTestSystem, TestSystem};
use ruff_db::vendored::VendoredFileSystem;
@@ -32,7 +15,7 @@ pub(crate) mod tests {
use super::*;
#[salsa::db(Jar, ruff_db::Jar)]
#[salsa::db]
pub(crate) struct TestDb {
storage: salsa::Storage<Self>,
system: TestSystem,
@@ -46,7 +29,7 @@ pub(crate) mod tests {
Self {
storage: salsa::Storage::default(),
system: TestSystem::default(),
vendored: vendored_typeshed_stubs().snapshot(),
vendored: vendored_typeshed_stubs().clone(),
events: sync::Arc::default(),
files: Files::default(),
}
@@ -76,8 +59,12 @@ pub(crate) mod tests {
fn upcast(&self) -> &(dyn ruff_db::Db + 'static) {
self
}
fn upcast_mut(&mut self) -> &mut (dyn ruff_db::Db + 'static) {
self
}
}
#[salsa::db]
impl ruff_db::Db for TestDb {
fn vendored(&self) -> &VendoredFileSystem {
&self.vendored
@@ -92,6 +79,7 @@ pub(crate) mod tests {
}
}
#[salsa::db]
impl Db for TestDb {}
impl DbWithTestSystem for TestDb {
@@ -104,23 +92,14 @@ pub(crate) mod tests {
}
}
#[salsa::db]
impl salsa::Database for TestDb {
fn salsa_event(&self, event: salsa::Event) {
tracing::trace!("event: {:?}", event.debug(self));
let mut events = self.events.lock().unwrap();
events.push(event);
}
}
impl salsa::ParallelDatabase for TestDb {
fn snapshot(&self) -> salsa::Snapshot<Self> {
salsa::Snapshot::new(Self {
storage: self.storage.snapshot(),
system: self.system.snapshot(),
vendored: self.vendored.snapshot(),
files: self.files.snapshot(),
events: self.events.clone(),
})
self.attach(|_| {
tracing::trace!("event: {event:?}");
let mut events = self.events.lock().unwrap();
events.push(event);
});
}
}
}
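
The hunk above also adds an `upcast_mut` requirement alongside `upcast`. The sketch below is a std-only illustration of that pattern with made-up names (it does not use the real `ruff_db` traits): a concrete database hands out both shared and mutable views of itself as its base trait object, which is what the more specific `Db` traits rely on.

```rust
trait BaseDb {
    fn file_count(&self) -> usize;
}

trait Upcast<T: ?Sized> {
    fn upcast(&self) -> &T;
    fn upcast_mut(&mut self) -> &mut T;
}

struct Database {
    files: Vec<String>,
}

impl BaseDb for Database {
    fn file_count(&self) -> usize {
        self.files.len()
    }
}

impl Upcast<dyn BaseDb> for Database {
    // Shared view as the base trait object.
    fn upcast(&self) -> &(dyn BaseDb + 'static) {
        self
    }
    // Mutable view, newly required by the diff above.
    fn upcast_mut(&mut self) -> &mut (dyn BaseDb + 'static) {
        self
    }
}
```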


@@ -1,3 +1,16 @@
use std::iter::FusedIterator;
pub use db::Db;
pub use module::{Module, ModuleKind};
pub use module_name::ModuleName;
pub use resolver::resolve_module;
use ruff_db::system::SystemPath;
pub use typeshed::{
vendored_typeshed_stubs, TypeshedVersionsParseError, TypeshedVersionsParseErrorKind,
};
use crate::resolver::{module_resolution_settings, SearchPathIterator};
mod db;
mod module;
mod module_name;
@@ -9,10 +22,29 @@ mod typeshed;
#[cfg(test)]
mod testing;
pub use db::{Db, Jar};
pub use module::{Module, ModuleKind};
pub use module_name::ModuleName;
pub use resolver::resolve_module;
pub use typeshed::{
vendored_typeshed_stubs, TypeshedVersionsParseError, TypeshedVersionsParseErrorKind,
};
/// Returns an iterator over all search paths pointing to a system path
pub fn system_module_search_paths(db: &dyn Db) -> SystemModuleSearchPathsIter {
SystemModuleSearchPathsIter {
inner: module_resolution_settings(db).search_paths(db),
}
}
pub struct SystemModuleSearchPathsIter<'db> {
inner: SearchPathIterator<'db>,
}
impl<'db> Iterator for SystemModuleSearchPathsIter<'db> {
type Item = &'db SystemPath;
fn next(&mut self) -> Option<Self::Item> {
loop {
let next = self.inner.next()?;
if let Some(system_path) = next.as_system_path() {
return Some(system_path);
}
}
}
}
impl FusedIterator for SystemModuleSearchPathsIter<'_> {}
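
A hypothetical caller of the newly exported iterator could look like the sketch below; the function name is made up, and it assumes a database implementing this crate's `Db` trait has already been constructed.

```rust
use red_knot_module_resolver::{system_module_search_paths, Db};

/// Collect the on-disk module search paths as owned `std::path::PathBuf`s.
fn collect_system_search_paths(db: &dyn Db) -> Vec<std::path::PathBuf> {
    // Vendored (zipped typeshed) search paths are skipped by the iterator,
    // so only paths backed by a real system directory are returned.
    system_module_search_paths(db)
        .map(|path| path.as_std_path().to_path_buf())
        .collect()
}
```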


@@ -3,9 +3,8 @@ use std::sync::Arc;
use ruff_db::files::File;
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::path::{ModuleResolutionPathBuf, ModuleResolutionPathRef};
use crate::path::SearchPath;
/// Representation of a Python module.
#[derive(Clone, PartialEq, Eq)]
@@ -17,7 +16,7 @@ impl Module {
pub(crate) fn new(
name: ModuleName,
kind: ModuleKind,
search_path: Arc<ModuleResolutionPathBuf>,
search_path: SearchPath,
file: File,
) -> Self {
Self {
@@ -41,8 +40,8 @@ impl Module {
}
/// The search path from which the module was resolved.
pub(crate) fn search_path(&self) -> ModuleResolutionPathRef {
ModuleResolutionPathRef::from(&*self.inner.search_path)
pub(crate) fn search_path(&self) -> &SearchPath {
&self.inner.search_path
}
/// Determine whether this module is a single-file module or a package
@@ -62,22 +61,11 @@ impl std::fmt::Debug for Module {
}
}
impl salsa::DebugWithDb<dyn Db> for Module {
fn fmt(&self, f: &mut Formatter<'_>, db: &dyn Db) -> std::fmt::Result {
f.debug_struct("Module")
.field("name", &self.name())
.field("kind", &self.kind())
.field("file", &self.file().debug(db.upcast()))
.field("search_path", &self.search_path())
.finish()
}
}
#[derive(PartialEq, Eq)]
struct ModuleInner {
name: ModuleName,
kind: ModuleKind,
search_path: Arc<ModuleResolutionPathBuf>,
search_path: SearchPath,
file: File,
}

File diff suppressed because it is too large.


@@ -1,24 +1,22 @@
use std::borrow::Cow;
use std::iter::FusedIterator;
use std::sync::Arc;
use rustc_hash::{FxBuildHasher, FxHashSet};
use ruff_db::files::{File, FilePath};
use once_cell::sync::Lazy;
use ruff_db::files::{File, FilePath, FileRootKind};
use ruff_db::program::{Program, SearchPathSettings, TargetVersion};
use ruff_db::system::{DirectoryEntry, System, SystemPath, SystemPathBuf};
use ruff_db::vendored::VendoredPath;
use rustc_hash::{FxBuildHasher, FxHashSet};
use crate::db::Db;
use crate::module::{Module, ModuleKind};
use crate::module_name::ModuleName;
use crate::path::ModuleResolutionPathBuf;
use crate::path::{ModulePath, SearchPath, SearchPathValidationError};
use crate::state::ResolverState;
type SearchPathRoot = Arc<ModuleResolutionPathBuf>;
/// Resolves a module name to a module.
pub fn resolve_module(db: &dyn Db, module_name: ModuleName) -> Option<Module> {
let interned_name = internal::ModuleNameIngredient::new(db, module_name);
let interned_name = ModuleNameIngredient::new(db, module_name);
resolve_module_query(db, interned_name)
}
@@ -30,11 +28,10 @@ pub fn resolve_module(db: &dyn Db, module_name: ModuleName) -> Option<Module> {
#[salsa::tracked]
pub(crate) fn resolve_module_query<'db>(
db: &'db dyn Db,
module_name: internal::ModuleNameIngredient<'db>,
module_name: ModuleNameIngredient<'db>,
) -> Option<Module> {
let _span = tracing::trace_span!("resolve_module", ?module_name).entered();
let name = module_name.name(db);
let _span = tracing::trace_span!("resolve_module", %name).entered();
let (search_path, module_file, kind) = resolve_name(db, name)?;
@@ -60,6 +57,12 @@ pub(crate) fn path_to_module(db: &dyn Db, path: &FilePath) -> Option<Module> {
file_to_module(db, file)
}
#[derive(Debug, Clone, Copy)]
enum SystemOrVendoredPathRef<'a> {
System(&'a SystemPath),
Vendored(&'a VendoredPath),
}
/// Resolves the module for the file with the given id.
///
/// Returns `None` if the file is not a module locatable via any of the known search paths.
@@ -67,7 +70,11 @@ pub(crate) fn path_to_module(db: &dyn Db, path: &FilePath) -> Option<Module> {
pub(crate) fn file_to_module(db: &dyn Db, file: File) -> Option<Module> {
let _span = tracing::trace_span!("file_to_module", ?file).entered();
let path = file.path(db.upcast());
let path = match file.path(db.upcast()) {
FilePath::System(system) => SystemOrVendoredPathRef::System(system),
FilePath::Vendored(vendored) => SystemOrVendoredPathRef::Vendored(vendored),
FilePath::SystemVirtual(_) => return None,
};
let settings = module_resolution_settings(db);
@@ -75,7 +82,11 @@ pub(crate) fn file_to_module(db: &dyn Db, file: File) -> Option<Module> {
let module_name = loop {
let candidate = search_paths.next()?;
if let Some(relative_path) = candidate.relativize_path(path) {
let relative_path = match path {
SystemOrVendoredPathRef::System(path) => candidate.relativize_system_path(path),
SystemOrVendoredPathRef::Vendored(path) => candidate.relativize_vendored_path(path),
};
if let Some(relative_path) = relative_path {
break relative_path.to_module_name()?;
}
};
@@ -105,16 +116,10 @@ pub(crate) fn file_to_module(db: &dyn Db, file: File) -> Option<Module> {
///
/// This method also implements the typing spec's [module resolution order].
///
/// TODO(Alex): this method does multiple `.unwrap()` calls when it should really return an error.
/// Each `.unwrap()` call is a point where we're validating a setting that the user would pass
/// and transforming it into an internal representation for a validated path.
/// Rather than panicking if a path fails to validate, we should display an error message to the user
/// and exit the process with a nonzero exit code.
/// This validation should probably be done outside of Salsa?
///
/// [module resolution order]: https://typing.readthedocs.io/en/latest/spec/distributing.html#import-resolution-ordering
#[salsa::tracked(return_ref)]
pub(crate) fn module_resolution_settings(db: &dyn Db) -> ModuleResolutionSettings {
fn try_resolve_module_resolution_settings(
db: &dyn Db,
) -> Result<ModuleResolutionSettings, SearchPathValidationError> {
let program = Program::get(db.upcast());
let SearchPathSettings {
@@ -132,45 +137,29 @@ pub(crate) fn module_resolution_settings(db: &dyn Db) -> ModuleResolutionSetting
tracing::info!("extra search paths: {extra_paths:?}");
}
let current_directory = db.system().current_directory();
let system = db.system();
let files = db.files();
let mut static_search_paths: Vec<_> = extra_paths
.iter()
.map(|fs_path| {
Arc::new(
ModuleResolutionPathBuf::extra(SystemPath::absolute(fs_path, current_directory))
.unwrap(),
)
})
.collect();
let mut static_search_paths = vec![];
static_search_paths.push(Arc::new(
ModuleResolutionPathBuf::first_party(SystemPath::absolute(
workspace_root,
current_directory,
))
.unwrap(),
));
static_search_paths.push(Arc::new(custom_typeshed.as_ref().map_or_else(
ModuleResolutionPathBuf::vendored_stdlib,
|custom| {
ModuleResolutionPathBuf::stdlib_from_custom_typeshed_root(&SystemPath::absolute(
custom,
current_directory,
))
.unwrap()
},
)));
if let Some(path) = site_packages {
let site_packages_root = Arc::new(
ModuleResolutionPathBuf::site_packages(SystemPath::absolute(path, current_directory))
.unwrap(),
);
static_search_paths.push(site_packages_root);
for path in extra_paths {
files.try_add_root(db.upcast(), path, FileRootKind::LibrarySearchPath);
static_search_paths.push(SearchPath::extra(system, path.clone())?);
}
static_search_paths.push(SearchPath::first_party(system, workspace_root.clone())?);
static_search_paths.push(if let Some(custom_typeshed) = custom_typeshed.as_ref() {
files.try_add_root(
db.upcast(),
custom_typeshed,
FileRootKind::LibrarySearchPath,
);
SearchPath::custom_stdlib(db, custom_typeshed.clone())?
} else {
SearchPath::vendored_stdlib()
});
// TODO vendor typeshed's third-party stubs as well as the stdlib and fallback to them as a final step
let target_version = program.target_version(db.upcast());
@@ -193,41 +182,68 @@ pub(crate) fn module_resolution_settings(db: &dyn Db) -> ModuleResolutionSetting
}
});
ModuleResolutionSettings {
Ok(ModuleResolutionSettings {
target_version,
static_search_paths,
}
site_packages_paths: site_packages.to_owned(),
})
}
/// Collect all dynamic search paths:
/// search paths listed in `.pth` files in the `site-packages` directory
/// due to editable installations of third-party packages.
#[salsa::tracked(return_ref)]
pub(crate) fn editable_install_resolution_paths(db: &dyn Db) -> Vec<Arc<ModuleResolutionPathBuf>> {
// This query needs to be re-executed each time a `.pth` file
// is added, modified or removed from the `site-packages` directory.
// However, we don't use Salsa queries to read the source text of `.pth` files;
// we use the APIs on the `System` trait directly. As such, for now we simply ask
// Salsa to recompute this query on each new revision.
//
// TODO: add some kind of watcher for the `site-packages` directory that looks
// for `site-packages/*.pth` files being added/modified/removed; get rid of this.
// When doing so, also make the test
// `deleting_pth_file_on_which_module_resolution_depends_invalidates_cache()`
// more principled!
db.report_untracked_read();
pub(crate) fn module_resolution_settings(db: &dyn Db) -> ModuleResolutionSettings {
// TODO proper error handling if this returns an error:
try_resolve_module_resolution_settings(db).unwrap()
}
let static_search_paths = &module_resolution_settings(db).static_search_paths;
let site_packages = static_search_paths
/// Collect all dynamic search paths. For each `site-packages` path:
/// - Collect that `site-packages` path
/// - Collect any search paths listed in `.pth` files in that `site-packages` directory
/// due to editable installations of third-party packages.
///
/// The editable-install search paths for the first `site-packages` directory
/// should come between the two `site-packages` directories when it comes to
/// module-resolution priority.
#[salsa::tracked(return_ref)]
pub(crate) fn dynamic_resolution_paths(db: &dyn Db) -> Vec<SearchPath> {
let ModuleResolutionSettings {
target_version: _,
static_search_paths,
site_packages_paths,
} = module_resolution_settings(db);
let mut dynamic_paths = Vec::new();
if site_packages_paths.is_empty() {
return dynamic_paths;
}
let mut existing_paths: FxHashSet<_> = static_search_paths
.iter()
.find(|path| path.is_site_packages());
.filter_map(|path| path.as_system_path())
.map(Cow::Borrowed)
.collect();
let mut dynamic_paths = Vec::default();
let files = db.files();
let system = db.system();
if let Some(site_packages) = site_packages {
let site_packages = site_packages
.as_system_path()
.expect("Expected site-packages never to be a VendoredPath!");
for site_packages_dir in site_packages_paths {
if !existing_paths.insert(Cow::Borrowed(site_packages_dir)) {
continue;
}
let site_packages_root = files.try_add_root(
db.upcast(),
site_packages_dir,
FileRootKind::LibrarySearchPath,
);
// This query needs to be re-executed each time a `.pth` file
// is added, modified or removed from the `site-packages` directory.
// However, we don't use Salsa queries to read the source text of `.pth` files;
// we use the APIs on the `System` trait directly. As such, add a dependency on the
// site-packages directory's revision.
site_packages_root.revision(db.upcast());
dynamic_paths
.push(SearchPath::site_packages(system, site_packages_dir.to_owned()).unwrap());
// As well as modules installed directly into `site-packages`,
// the directory may also contain `.pth` files.
@@ -235,8 +251,8 @@ pub(crate) fn editable_install_resolution_paths(db: &dyn Db) -> Vec<Arc<ModuleRe
// containing a (relative or absolute) path.
// Each of these paths may point to an editable install of a package,
// so should be considered an additional search path.
let Ok(pth_file_iterator) = PthFileIterator::new(db, site_packages) else {
return dynamic_paths;
let Ok(pth_file_iterator) = PthFileIterator::new(db, site_packages_dir) else {
continue;
};
// The Python documentation specifies that `.pth` files in `site-packages`
@@ -245,20 +261,12 @@ pub(crate) fn editable_install_resolution_paths(db: &dyn Db) -> Vec<Arc<ModuleRe
let mut all_pth_files: Vec<PthFile> = pth_file_iterator.collect();
all_pth_files.sort_by(|a, b| a.path.cmp(&b.path));
let mut existing_paths: FxHashSet<_> = static_search_paths
.iter()
.filter_map(|path| path.as_system_path())
.map(Cow::Borrowed)
.collect();
dynamic_paths.reserve(all_pth_files.len());
for pth_file in &all_pth_files {
for installation in pth_file.editable_installations() {
if existing_paths.insert(Cow::Owned(
installation.as_system_path().unwrap().to_path_buf(),
)) {
dynamic_paths.push(Arc::new(installation));
dynamic_paths.push(installation);
}
}
}
@@ -274,14 +282,14 @@ pub(crate) fn editable_install_resolution_paths(db: &dyn Db) -> Vec<Arc<ModuleRe
/// are only calculated lazily.
///
/// [`sys.path` at runtime]: https://docs.python.org/3/library/site.html#module-site
struct SearchPathIterator<'db> {
pub(crate) struct SearchPathIterator<'db> {
db: &'db dyn Db,
static_paths: std::slice::Iter<'db, SearchPathRoot>,
dynamic_paths: Option<std::slice::Iter<'db, SearchPathRoot>>,
static_paths: std::slice::Iter<'db, SearchPath>,
dynamic_paths: Option<std::slice::Iter<'db, SearchPath>>,
}
impl<'db> Iterator for SearchPathIterator<'db> {
type Item = &'db SearchPathRoot;
type Item = &'db SearchPath;
fn next(&mut self) -> Option<Self::Item> {
let SearchPathIterator {
@@ -292,7 +300,7 @@ impl<'db> Iterator for SearchPathIterator<'db> {
static_paths.next().or_else(|| {
dynamic_paths
.get_or_insert_with(|| editable_install_resolution_paths(*db).iter())
.get_or_insert_with(|| dynamic_resolution_paths(*db).iter())
.next()
})
}
@@ -313,7 +321,7 @@ struct PthFile<'db> {
impl<'db> PthFile<'db> {
/// Yield paths in this `.pth` file that appear to represent editable installations,
/// and should therefore be added as module-resolution search paths.
fn editable_installations(&'db self) -> impl Iterator<Item = ModuleResolutionPathBuf> + 'db {
fn editable_installations(&'db self) -> impl Iterator<Item = SearchPath> + 'db {
let PthFile {
system,
path: _,
@@ -335,7 +343,7 @@ impl<'db> PthFile<'db> {
return None;
}
let possible_editable_install = SystemPath::absolute(line, site_packages);
ModuleResolutionPathBuf::editable_installation_root(*system, possible_editable_install)
SearchPath::editable(*system, possible_editable_install).ok()
})
}
}
@@ -402,12 +410,18 @@ impl<'db> Iterator for PthFileIterator<'db> {
#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct ModuleResolutionSettings {
target_version: TargetVersion,
/// Search paths that have been statically determined purely from reading Ruff's configuration settings.
/// These shouldn't ever change unless the config settings themselves change.
///
/// Note that `site-packages` *is included* as a search path in this sequence,
/// but it is also stored separately so that we're able to find editable installs later.
static_search_paths: Vec<SearchPathRoot>,
static_search_paths: Vec<SearchPath>,
/// site-packages paths are not included in the above field:
/// if there are multiple site-packages paths, editable installations can appear
/// *between* the site-packages paths on `sys.path` at runtime.
/// That means we can't know where a second or third `site-packages` path should sit
/// in terms of module-resolution priority until we've discovered the editable installs
/// for the first `site-packages` path
site_packages_paths: Vec<SystemPathBuf>,
}
impl ModuleResolutionSettings {
@@ -415,7 +429,7 @@ impl ModuleResolutionSettings {
self.target_version
}
fn search_paths<'db>(&'db self, db: &'db dyn Db) -> SearchPathIterator<'db> {
pub(crate) fn search_paths<'db>(&'db self, db: &'db dyn Db) -> SearchPathIterator<'db> {
SearchPathIterator {
db,
static_paths: self.static_search_paths.iter(),
@@ -424,34 +438,73 @@ impl ModuleResolutionSettings {
}
}
// The singleton methods generated by salsa are all `pub` instead of `pub(crate)` which triggers
// `unreachable_pub`. Work around this by creating a module and allow `unreachable_pub` for it.
// Salsa also generates uses to `_db` variables for `interned` which triggers `clippy::used_underscore_binding`. Suppress that too
// TODO(micha): Contribute a fix for this upstream where the singleton methods have the same visibility as the struct.
#[allow(unreachable_pub, clippy::used_underscore_binding)]
pub(crate) mod internal {
use crate::module_name::ModuleName;
/// A thin wrapper around `ModuleName` to make it a Salsa ingredient.
///
/// This is needed because Salsa requires that all query arguments are salsa ingredients.
#[salsa::interned]
pub(crate) struct ModuleNameIngredient<'db> {
#[return_ref]
pub(super) name: ModuleName,
}
/// A thin wrapper around `ModuleName` to make it a Salsa ingredient.
///
/// This is needed because Salsa requires that all query arguments are salsa ingredients.
#[salsa::interned]
struct ModuleNameIngredient<'db> {
#[return_ref]
pub(super) name: ModuleName,
}
/// Modules that are builtin to the Python interpreter itself.
///
/// When these module names are imported, standard module resolution is bypassed:
/// the module name always resolves to the stdlib module,
/// even if there's a module of the same name in the workspace root
/// (which would normally result in the stdlib module being overridden).
///
/// TODO(Alex): write a script to generate this list,
/// similar to what we do in `crates/ruff_python_stdlib/src/sys.rs`
static BUILTIN_MODULES: Lazy<FxHashSet<&str>> = Lazy::new(|| {
const BUILTIN_MODULE_NAMES: &[&str] = &[
"_abc",
"_ast",
"_codecs",
"_collections",
"_functools",
"_imp",
"_io",
"_locale",
"_operator",
"_signal",
"_sre",
"_stat",
"_string",
"_symtable",
"_thread",
"_tokenize",
"_tracemalloc",
"_typing",
"_warnings",
"_weakref",
"atexit",
"builtins",
"errno",
"faulthandler",
"gc",
"itertools",
"marshal",
"posix",
"pwd",
"sys",
"time",
];
BUILTIN_MODULE_NAMES.iter().copied().collect()
});
/// Given a module name and a list of search paths in which to lookup modules,
/// attempt to resolve the module name
fn resolve_name(
db: &dyn Db,
name: &ModuleName,
) -> Option<(Arc<ModuleResolutionPathBuf>, File, ModuleKind)> {
fn resolve_name(db: &dyn Db, name: &ModuleName) -> Option<(SearchPath, File, ModuleKind)> {
let resolver_settings = module_resolution_settings(db);
let resolver_state = ResolverState::new(db, resolver_settings.target_version());
let is_builtin_module = BUILTIN_MODULES.contains(&name.as_str());
for search_path in resolver_settings.search_paths(db) {
if is_builtin_module && !search_path.is_standard_library() {
continue;
}
let mut components = name.components();
let module_name = components.next_back()?;
@@ -462,7 +515,7 @@ fn resolve_name(
package_path.push(module_name);
// Must be a `__init__.pyi` or `__init__.py` or it isn't a package.
let kind = if package_path.is_directory(search_path, &resolver_state) {
let kind = if package_path.is_directory(&resolver_state) {
package_path.push("__init__");
ModuleKind::Package
} else {
@@ -470,16 +523,13 @@ fn resolve_name(
};
// TODO Implement full https://peps.python.org/pep-0561/#type-checker-module-resolution-order resolution
if let Some(stub) = package_path
.with_pyi_extension()
.to_file(search_path, &resolver_state)
{
if let Some(stub) = package_path.with_pyi_extension().to_file(&resolver_state) {
return Some((search_path.clone(), stub, kind));
}
if let Some(module) = package_path
.with_py_extension()
.and_then(|path| path.to_file(search_path, &resolver_state))
.and_then(|path| path.to_file(&resolver_state))
{
return Some((search_path.clone(), module, kind));
}
@@ -503,14 +553,14 @@ fn resolve_name(
}
fn resolve_package<'a, 'db, I>(
module_search_path: &ModuleResolutionPathBuf,
module_search_path: &SearchPath,
components: I,
resolver_state: &ResolverState<'db>,
) -> Result<ResolvedPackage, PackageKind>
where
I: Iterator<Item = &'a str>,
{
let mut package_path = module_search_path.clone();
let mut package_path = module_search_path.to_module_path();
// `true` if inside a folder that is a namespace package (has no `__init__.py`).
// Namespace packages are special because they can be spread across multiple search paths.
@@ -524,12 +574,11 @@ where
for folder in components {
package_path.push(folder);
let is_regular_package =
package_path.is_regular_package(module_search_path, resolver_state);
let is_regular_package = package_path.is_regular_package(resolver_state);
if is_regular_package {
in_namespace_package = false;
} else if package_path.is_directory(module_search_path, resolver_state) {
} else if package_path.is_directory(resolver_state) {
// A directory without an `__init__.py` is a namespace package, continue with the next folder.
in_namespace_package = true;
} else if in_namespace_package {
@@ -562,7 +611,7 @@ where
#[derive(Debug)]
struct ResolvedPackage {
path: ModuleResolutionPathBuf,
path: ModulePath,
kind: PackageKind,
}
@@ -590,10 +639,11 @@ impl PackageKind {
#[cfg(test)]
mod tests {
use internal::ModuleNameIngredient;
use ruff_db::files::{system_path_to_file, File, FilePath};
use ruff_db::system::{DbWithTestSystem, OsSystem, SystemPath};
use ruff_db::testing::assert_function_query_was_not_run;
use ruff_db::system::DbWithTestSystem;
use ruff_db::testing::{
assert_const_function_query_was_not_run, assert_function_query_was_not_run,
};
use ruff_db::Db;
use crate::db::tests::TestDb;
@@ -618,7 +668,7 @@ mod tests {
);
assert_eq!("foo", foo_module.name());
assert_eq!(&src, &foo_module.search_path());
assert_eq!(&src, foo_module.search_path());
assert_eq!(ModuleKind::Module, foo_module.kind());
let expected_foo_path = src.join("foo.py");
@@ -629,6 +679,40 @@ mod tests {
);
}
#[test]
fn builtins_vendored() {
let TestCase { db, stdlib, .. } = TestCaseBuilder::new()
.with_vendored_typeshed()
.with_src_files(&[("builtins.py", "FOOOO = 42")])
.build();
let builtins_module_name = ModuleName::new_static("builtins").unwrap();
let builtins = resolve_module(&db, builtins_module_name).expect("builtins to resolve");
assert_eq!(builtins.file().path(&db), &stdlib.join("builtins.pyi"));
}
#[test]
fn builtins_custom() {
const TYPESHED: MockedTypeshed = MockedTypeshed {
stdlib_files: &[("builtins.pyi", "def min(a, b): ...")],
versions: "builtins: 3.8-",
};
const SRC: &[FileSpec] = &[("builtins.py", "FOOOO = 42")];
let TestCase { db, stdlib, .. } = TestCaseBuilder::new()
.with_src_files(SRC)
.with_custom_typeshed(TYPESHED)
.with_target_version(TargetVersion::Py38)
.build();
let builtins_module_name = ModuleName::new_static("builtins").unwrap();
let builtins = resolve_module(&db, builtins_module_name).expect("builtins to resolve");
assert_eq!(builtins.file().path(&db), &stdlib.join("builtins.pyi"));
}
#[test]
fn stdlib() {
const TYPESHED: MockedTypeshed = MockedTypeshed {
@@ -649,7 +733,7 @@ mod tests {
resolve_module(&db, functools_module_name).as_ref()
);
assert_eq!(&stdlib, &functools_module.search_path().to_path_buf());
assert_eq!(&stdlib, functools_module.search_path());
assert_eq!(ModuleKind::Module, functools_module.kind());
let expected_functools_path = stdlib.join("functools.pyi");
@@ -701,7 +785,7 @@ mod tests {
});
let search_path = resolved_module.search_path();
assert_eq!(
&stdlib, &search_path,
&stdlib, search_path,
"Search path for {module_name} was unexpectedly {search_path:?}"
);
assert!(
@@ -797,7 +881,7 @@ mod tests {
});
let search_path = resolved_module.search_path();
assert_eq!(
&stdlib, &search_path,
&stdlib, search_path,
"Search path for {module_name} was unexpectedly {search_path:?}"
);
assert!(
@@ -856,7 +940,7 @@ mod tests {
Some(&functools_module),
resolve_module(&db, functools_module_name).as_ref()
);
assert_eq!(&src, &functools_module.search_path());
assert_eq!(&src, functools_module.search_path());
assert_eq!(ModuleKind::Module, functools_module.kind());
assert_eq!(&src.join("functools.py"), functools_module.file().path(&db));
@@ -877,7 +961,7 @@ mod tests {
let pydoc_data_topics = resolve_module(&db, pydoc_data_topics_name).unwrap();
assert_eq!("pydoc_data.topics", pydoc_data_topics.name());
assert_eq!(pydoc_data_topics.search_path(), stdlib);
assert_eq!(pydoc_data_topics.search_path(), &stdlib);
assert_eq!(
pydoc_data_topics.file().path(&db),
&stdlib.join("pydoc_data/topics.pyi")
@@ -894,7 +978,7 @@ mod tests {
let foo_module = resolve_module(&db, ModuleName::new_static("foo").unwrap()).unwrap();
assert_eq!("foo", foo_module.name());
assert_eq!(&src, &foo_module.search_path());
assert_eq!(&src, foo_module.search_path());
assert_eq!(&foo_path, foo_module.file().path(&db));
assert_eq!(
@@ -921,7 +1005,7 @@ mod tests {
let foo_module = resolve_module(&db, ModuleName::new_static("foo").unwrap()).unwrap();
let foo_init_path = src.join("foo/__init__.py");
assert_eq!(&src, &foo_module.search_path());
assert_eq!(&src, foo_module.search_path());
assert_eq!(&foo_init_path, foo_module.file().path(&db));
assert_eq!(ModuleKind::Package, foo_module.kind());
@@ -944,7 +1028,7 @@ mod tests {
let foo = resolve_module(&db, ModuleName::new_static("foo").unwrap()).unwrap();
let foo_stub = src.join("foo.pyi");
assert_eq!(&src, &foo.search_path());
assert_eq!(&src, foo.search_path());
assert_eq!(&foo_stub, foo.file().path(&db));
assert_eq!(Some(foo), path_to_module(&db, &FilePath::System(foo_stub)));
@@ -968,7 +1052,7 @@ mod tests {
resolve_module(&db, ModuleName::new_static("foo.bar.baz").unwrap()).unwrap();
let baz_path = src.join("foo/bar/baz.py");
assert_eq!(&src, &baz_module.search_path());
assert_eq!(&src, baz_module.search_path());
assert_eq!(&baz_path, baz_module.file().path(&db));
assert_eq!(
@@ -1068,7 +1152,7 @@ mod tests {
let foo_module = resolve_module(&db, ModuleName::new_static("foo").unwrap()).unwrap();
let foo_src_path = src.join("foo.py");
assert_eq!(&src, &foo_module.search_path());
assert_eq!(&src, foo_module.search_path());
assert_eq!(&foo_src_path, foo_module.file().path(&db));
assert_eq!(
Some(foo_module),
@@ -1084,7 +1168,9 @@ mod tests {
#[test]
#[cfg(target_family = "unix")]
fn symlink() -> anyhow::Result<()> {
use crate::db::tests::TestDb;
use ruff_db::program::Program;
use ruff_db::system::{OsSystem, SystemPath};
let mut db = TestDb::new();
@@ -1101,7 +1187,8 @@ mod tests {
std::fs::create_dir_all(src.as_std_path())?;
std::fs::create_dir_all(site_packages.as_std_path())?;
std::fs::create_dir_all(custom_typeshed.as_std_path())?;
std::fs::create_dir_all(custom_typeshed.join("stdlib").as_std_path())?;
std::fs::File::create(custom_typeshed.join("stdlib/VERSIONS").as_std_path())?;
std::fs::write(foo.as_std_path(), "")?;
std::os::unix::fs::symlink(foo.as_std_path(), bar.as_std_path())?;
@@ -1110,7 +1197,7 @@ mod tests {
extra_paths: vec![],
workspace_root: src.clone(),
custom_typeshed: Some(custom_typeshed.clone()),
site_packages: Some(site_packages.clone()),
site_packages: vec![site_packages],
};
Program::new(&db, TargetVersion::Py38, search_paths);
@@ -1120,12 +1207,12 @@ mod tests {
assert_ne!(foo_module, bar_module);
assert_eq!(&src, &foo_module.search_path());
assert_eq!(&src, foo_module.search_path());
assert_eq!(&foo, foo_module.file().path(&db));
// `foo` and `bar` shouldn't resolve to the same file
assert_eq!(&src, &bar_module.search_path());
assert_eq!(&src, bar_module.search_path());
assert_eq!(&bar, bar_module.file().path(&db));
assert_eq!(&foo, foo_module.file().path(&db));
@@ -1160,7 +1247,7 @@ mod tests {
// Delete `bar.py`
db.memory_file_system().remove_file(&bar_path).unwrap();
bar.touch(&mut db);
bar.sync(&mut db);
// Re-query the foo module. The foo module should still be cached because `bar.py` isn't relevant
// for resolving `foo`.
@@ -1212,7 +1299,8 @@ mod tests {
db.memory_file_system().remove_file(&foo_init_path)?;
db.memory_file_system()
.remove_directory(foo_init_path.parent().unwrap())?;
File::touch_path(&mut db, &foo_init_path);
File::sync_path(&mut db, &foo_init_path);
File::sync_path(&mut db, foo_init_path.parent().unwrap());
let foo_module = resolve_module(&db, foo_module_name).expect("Foo module to resolve");
assert_eq!(&src.join("foo.py"), foo_module.file().path(&db));
@@ -1241,9 +1329,9 @@ mod tests {
let stdlib_functools_path = stdlib.join("functools.pyi");
let functools_module = resolve_module(&db, functools_module_name.clone()).unwrap();
assert_eq!(functools_module.search_path(), stdlib);
assert_eq!(functools_module.search_path(), &stdlib);
assert_eq!(
Some(functools_module.file()),
Ok(functools_module.file()),
system_path_to_file(&db, &stdlib_functools_path)
);
@@ -1255,15 +1343,15 @@ mod tests {
.unwrap();
let functools_module = resolve_module(&db, functools_module_name.clone()).unwrap();
let events = db.take_salsa_events();
assert_function_query_was_not_run::<resolve_module_query, _, _>(
assert_function_query_was_not_run(
&db,
|res| &res.function,
&ModuleNameIngredient::new(&db, functools_module_name.clone()),
resolve_module_query,
ModuleNameIngredient::new(&db, functools_module_name.clone()),
&events,
);
assert_eq!(functools_module.search_path(), stdlib);
assert_eq!(functools_module.search_path(), &stdlib);
assert_eq!(
Some(functools_module.file()),
Ok(functools_module.file()),
system_path_to_file(&db, &stdlib_functools_path)
);
}
@@ -1287,9 +1375,9 @@ mod tests {
let functools_module_name = ModuleName::new_static("functools").unwrap();
let functools_module = resolve_module(&db, functools_module_name.clone()).unwrap();
assert_eq!(functools_module.search_path(), stdlib);
assert_eq!(functools_module.search_path(), &stdlib);
assert_eq!(
Some(functools_module.file()),
Ok(functools_module.file()),
system_path_to_file(&db, stdlib.join("functools.pyi"))
);
@@ -1298,9 +1386,9 @@ mod tests {
let src_functools_path = src.join("functools.py");
db.write_file(&src_functools_path, "FOO: int").unwrap();
let functools_module = resolve_module(&db, functools_module_name.clone()).unwrap();
assert_eq!(functools_module.search_path(), src);
assert_eq!(functools_module.search_path(), &src);
assert_eq!(
Some(functools_module.file()),
Ok(functools_module.file()),
system_path_to_file(&db, &src_functools_path)
);
}
@@ -1329,9 +1417,9 @@ mod tests {
let src_functools_path = src.join("functools.py");
let functools_module = resolve_module(&db, functools_module_name.clone()).unwrap();
assert_eq!(functools_module.search_path(), src);
assert_eq!(functools_module.search_path(), &src);
assert_eq!(
Some(functools_module.file()),
Ok(functools_module.file()),
system_path_to_file(&db, &src_functools_path)
);
@@ -1340,11 +1428,11 @@ mod tests {
db.memory_file_system()
.remove_file(&src_functools_path)
.unwrap();
File::touch_path(&mut db, &src_functools_path);
File::sync_path(&mut db, &src_functools_path);
let functools_module = resolve_module(&db, functools_module_name.clone()).unwrap();
assert_eq!(functools_module.search_path(), stdlib);
assert_eq!(functools_module.search_path(), &stdlib);
assert_eq!(
Some(functools_module.file()),
Ok(functools_module.file()),
system_path_to_file(&db, stdlib.join("functools.pyi"))
);
}
@@ -1507,12 +1595,7 @@ not_a_directory
&FilePath::system("/y/src/bar.py")
);
let events = db.take_salsa_events();
assert_function_query_was_not_run::<editable_install_resolution_paths, _, _>(
&db,
|res| &res.function,
&(),
&events,
);
assert_const_function_query_was_not_run(&db, dynamic_resolution_paths, &events);
}
#[test]
@@ -1541,18 +1624,7 @@ not_a_directory
.remove_file(site_packages.join("_foo.pth"))
.unwrap();
// Why are we touching a random file in the path that's been editably installed,
// rather than the `.pth` file, when the `.pth` file is the one that has been deleted?
// It's because the `.pth` file isn't directly tracked as a dependency by Salsa
// currently (we don't use `system_path_to_file()` to get the file, and we don't use
// `source_text()` to read the source of the file). Instead of using these APIs which
// would automatically add the existence and contents of the file as a Salsa-tracked
// dependency, we use `.report_untracked_read()` to force Salsa to re-parse all
// `.pth` files on each new "revision". Making a random modification to a tracked
// Salsa file forces a new revision.
//
// TODO: get rid of the `.report_untracked_read()` call...
File::touch_path(&mut db, SystemPath::new("/x/src/foo.py"));
File::sync_path(&mut db, &site_packages.join("_foo.pth"));
assert_eq!(resolve_module(&db, foo_module_name.clone()), None);
}
@@ -1580,8 +1652,8 @@ not_a_directory
.remove_file(src_path.join("foo.py"))
.unwrap();
db.memory_file_system().remove_directory(&src_path).unwrap();
File::touch_path(&mut db, &src_path.join("foo.py"));
File::touch_path(&mut db, &src_path);
File::sync_path(&mut db, &src_path.join("foo.py"));
File::sync_path(&mut db, &src_path);
assert_eq!(resolve_module(&db, foo_module_name.clone()), None);
}
@@ -1592,15 +1664,62 @@ not_a_directory
.with_site_packages_files(&[("_foo.pth", "/src")])
.build();
let search_paths: Vec<&SearchPathRoot> =
let search_paths: Vec<&SearchPath> =
module_resolution_settings(&db).search_paths(&db).collect();
assert!(search_paths.contains(&&Arc::new(
ModuleResolutionPathBuf::first_party("/src").unwrap()
)));
assert!(search_paths.contains(
&&SearchPath::first_party(db.system(), SystemPathBuf::from("/src")).unwrap()
));
assert!(!search_paths
.contains(&&SearchPath::editable(db.system(), SystemPathBuf::from("/src")).unwrap()));
}
assert!(!search_paths.contains(&&Arc::new(
ModuleResolutionPathBuf::editable_installation_root(db.system(), "/src").unwrap()
)));
#[test]
fn multiple_site_packages_with_editables() {
let mut db = TestDb::new();
let venv_site_packages = SystemPathBuf::from("/venv-site-packages");
let site_packages_pth = venv_site_packages.join("foo.pth");
let system_site_packages = SystemPathBuf::from("/system-site-packages");
let editable_install_location = SystemPathBuf::from("/x/y/a.py");
let system_site_packages_location = system_site_packages.join("a.py");
db.memory_file_system()
.create_directory_all("/src")
.unwrap();
db.write_files([
(&site_packages_pth, "/x/y"),
(&editable_install_location, ""),
(&system_site_packages_location, ""),
])
.unwrap();
Program::new(
&db,
TargetVersion::default(),
SearchPathSettings {
extra_paths: vec![],
workspace_root: SystemPathBuf::from("/src"),
custom_typeshed: None,
site_packages: vec![venv_site_packages, system_site_packages],
},
);
// The editable installs discovered from the `.pth` file in the first `site-packages` directory
// take precedence over the second `site-packages` directory...
let a_module_name = ModuleName::new_static("a").unwrap();
let a_module = resolve_module(&db, a_module_name.clone()).unwrap();
assert_eq!(a_module.file().path(&db), &editable_install_location);
db.memory_file_system()
.remove_file(&site_packages_pth)
.unwrap();
File::sync_path(&mut db, &site_packages_pth);
// ...But now that the `.pth` file in the first `site-packages` directory has been deleted,
// the editable install no longer exists, so the module now resolves to the file in the
// second `site-packages` directory
let a_module = resolve_module(&db, a_module_name).unwrap();
assert_eq!(a_module.file().path(&db), &system_site_packages_location);
}
}
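
The two resolver changes above, reading editable-install paths out of `.pth` files and interleaving them with multiple `site-packages` directories, can be condensed into a standalone sketch. This is an illustrative, std-only approximation rather than the crate's own code (the helper names `editable_install_candidates` and `dynamic_search_order` are hypothetical); it follows the `.pth` semantics of Python's `site` module, where blank lines, `#` comments, and executable `import` lines are skipped and every other line is a path resolved against the `site-packages` directory.

```rust
use std::path::{Path, PathBuf};

/// Extract candidate editable-install paths from the contents of one `.pth` file.
fn editable_install_candidates(pth_contents: &str, site_packages: &Path) -> Vec<PathBuf> {
    pth_contents
        .lines()
        .map(str::trim)
        .filter(|line| {
            !line.is_empty()
                && !line.starts_with('#')
                && !line.starts_with("import ")
                && !line.starts_with("import\t")
        })
        // `Path::join` keeps absolute lines as-is and resolves relative lines
        // against the `site-packages` directory.
        .map(|line| site_packages.join(line))
        .collect()
}

/// Order the dynamic search paths: each `site-packages` directory is followed by
/// the editable installs discovered from its own `.pth` files, so editables found
/// in the first directory outrank the second `site-packages` directory.
fn dynamic_search_order(
    site_packages_dirs: &[PathBuf],
    editables_for: impl Fn(&Path) -> Vec<PathBuf>,
) -> Vec<PathBuf> {
    let mut order = Vec::new();
    for dir in site_packages_dirs {
        order.push(dir.clone());
        order.extend(editables_for(dir));
    }
    order
}
```

In the crate itself these steps additionally deduplicate against the statically configured search paths, and each `site-packages` directory is registered as a file root so that Salsa re-runs the query when a `.pth` file is added, modified, or removed.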


@@ -1,5 +1,4 @@
use ruff_db::program::TargetVersion;
use ruff_db::system::System;
use ruff_db::vendored::VendoredFileSystem;
use crate::db::Db;
@@ -20,10 +19,6 @@ impl<'db> ResolverState<'db> {
}
}
pub(crate) fn system(&self) -> &dyn System {
self.db.system()
}
pub(crate) fn vendored(&self) -> &VendoredFileSystem {
self.db.vendored()
}


@@ -12,6 +12,9 @@ pub(crate) struct TestCase<T> {
pub(crate) db: TestDb,
pub(crate) src: SystemPathBuf,
pub(crate) stdlib: T,
// Most test cases only ever need a single `site-packages` directory,
// so this is a single directory instead of a `Vec` of directories,
// like it is in `ruff_db::Program`.
pub(crate) site_packages: SystemPathBuf,
pub(crate) target_version: TargetVersion,
}
@@ -125,6 +128,8 @@ impl<T> TestCaseBuilder<T> {
files: impl IntoIterator<Item = FileSpec>,
) -> SystemPathBuf {
let root = location.as_ref().to_path_buf();
// Make sure to create the directory even if the list of files is empty:
db.memory_file_system().create_directory_all(&root).unwrap();
db.write_files(
files
.into_iter()
@@ -221,7 +226,7 @@ impl TestCaseBuilder<MockedTypeshed> {
extra_paths: vec![],
workspace_root: src.clone(),
custom_typeshed: Some(typeshed.clone()),
site_packages: Some(site_packages.clone()),
site_packages: vec![site_packages.clone()],
},
);
@@ -274,7 +279,7 @@ impl TestCaseBuilder<VendoredTypeshed> {
extra_paths: vec![],
workspace_root: src.clone(),
custom_typeshed: None,
site_packages: Some(site_packages.clone()),
site_packages: vec![site_packages.clone()],
},
);


@@ -52,7 +52,7 @@ impl<'db> LazyTypeshedVersions<'db> {
} else {
return &VENDORED_VERSIONS;
};
let Some(versions_file) = system_path_to_file(db.upcast(), &versions_path) else {
let Ok(versions_file) = system_path_to_file(db.upcast(), &versions_path) else {
todo!(
"Still need to figure out how to handle VERSIONS files being deleted \
from custom typeshed directories! Expected a file to exist at {versions_path}"


@@ -1 +1 @@
f863db6bc5242348ceaa6a3bca4e59aa9e62faaa
4ef2d66663fc080fefa379e6ae5fc45d4f8b54eb

View File

@@ -1,18 +1,18 @@
import sys
from _typeshed import SupportsWrite
from collections.abc import Iterable, Iterator
from typing import Any, Final, Literal
from typing import Any, Final
from typing_extensions import TypeAlias
__version__: Final[str]
QUOTE_ALL: Literal[1]
QUOTE_MINIMAL: Literal[0]
QUOTE_NONE: Literal[3]
QUOTE_NONNUMERIC: Literal[2]
QUOTE_ALL: Final = 1
QUOTE_MINIMAL: Final = 0
QUOTE_NONE: Final = 3
QUOTE_NONNUMERIC: Final = 2
if sys.version_info >= (3, 12):
QUOTE_STRINGS: Literal[4]
QUOTE_NOTNULL: Literal[5]
QUOTE_STRINGS: Final = 4
QUOTE_NOTNULL: Final = 5
# Ideally this would be `QUOTE_ALL | QUOTE_MINIMAL | QUOTE_NONE | QUOTE_NONNUMERIC`
# However, using literals in situations like these can cause false-positives (see #7258)
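The change above is the pattern repeated throughout the rest of these stubs: a bare `Final = <value>` assignment lets type checkers infer the literal type while also marking the name as non-reassignable, instead of spelling the value as a `Literal[...]` annotation with no assignment. A minimal sketch of the two spellings (illustrative names, not taken from typeshed):

from typing import Final, Literal

QUOTE_ALL_OLD: Literal[1]  # old stub style: declares the type only; reassigning 1 is still allowed
QUOTE_ALL_NEW: Final = 1   # new stub style: checkers infer Literal[1] and reject any reassignment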

View File

@@ -71,7 +71,7 @@ class _CData(metaclass=_CDataMeta):
@classmethod
def from_address(cls, address: int) -> Self: ...
@classmethod
def from_param(cls, obj: Any) -> Self | _CArgObject: ...
def from_param(cls, value: Any, /) -> Self | _CArgObject: ...
@classmethod
def in_dll(cls, library: CDLL, name: str) -> Self: ...
def __buffer__(self, flags: int, /) -> memoryview: ...

View File

@@ -368,11 +368,7 @@ def tparm(
) -> bytes: ...
def typeahead(fd: int, /) -> None: ...
def unctrl(ch: _ChType, /) -> bytes: ...
if sys.version_info < (3, 12) or sys.platform != "darwin":
# The support for macos was dropped in 3.12
def unget_wch(ch: int | str, /) -> None: ...
def unget_wch(ch: int | str, /) -> None: ...
def ungetch(ch: _ChType, /) -> None: ...
def ungetmouse(id: int, x: int, y: int, z: int, bstate: int, /) -> None: ...
def update_lines_cols() -> None: ...
@@ -447,13 +443,10 @@ class _CursesWindow:
def getch(self) -> int: ...
@overload
def getch(self, y: int, x: int) -> int: ...
if sys.version_info < (3, 12) or sys.platform != "darwin":
# The support for macos was dropped in 3.12
@overload
def get_wch(self) -> int | str: ...
@overload
def get_wch(self, y: int, x: int) -> int | str: ...
@overload
def get_wch(self) -> int | str: ...
@overload
def get_wch(self, y: int, x: int) -> int | str: ...
@overload
def getkey(self) -> str: ...
@overload

View File

@@ -17,20 +17,20 @@ class DecimalTuple(NamedTuple):
digits: tuple[int, ...]
exponent: int | Literal["n", "N", "F"]
ROUND_DOWN: str
ROUND_HALF_UP: str
ROUND_HALF_EVEN: str
ROUND_CEILING: str
ROUND_FLOOR: str
ROUND_UP: str
ROUND_HALF_DOWN: str
ROUND_05UP: str
HAVE_CONTEXTVAR: bool
HAVE_THREADS: bool
MAX_EMAX: int
MAX_PREC: int
MIN_EMIN: int
MIN_ETINY: int
ROUND_DOWN: Final[str]
ROUND_HALF_UP: Final[str]
ROUND_HALF_EVEN: Final[str]
ROUND_CEILING: Final[str]
ROUND_FLOOR: Final[str]
ROUND_UP: Final[str]
ROUND_HALF_DOWN: Final[str]
ROUND_05UP: Final[str]
HAVE_CONTEXTVAR: Final[bool]
HAVE_THREADS: Final[bool]
MAX_EMAX: Final[int]
MAX_PREC: Final[int]
MIN_EMIN: Final[int]
MIN_ETINY: Final[int]
class DecimalException(ArithmeticError): ...
class Clamped(DecimalException): ...

View File

@@ -1,5 +1,5 @@
from _typeshed import structseq
from typing import Final, Literal, SupportsIndex, final
from typing import Any, Final, Literal, SupportsIndex, final
from typing_extensions import Buffer, Self
class ChannelError(RuntimeError): ...
@@ -72,13 +72,15 @@ class ChannelInfo(structseq[int], tuple[bool, bool, bool, int, int, int, int, in
@property
def send_released(self) -> bool: ...
def create() -> ChannelID: ...
def create(unboundop: Literal[1, 2, 3]) -> ChannelID: ...
def destroy(cid: SupportsIndex) -> None: ...
def list_all() -> list[ChannelID]: ...
def list_interpreters(cid: SupportsIndex, *, send: bool) -> list[int]: ...
def send(cid: SupportsIndex, obj: object, *, blocking: bool = True, timeout: float | None = None) -> None: ...
def send_buffer(cid: SupportsIndex, obj: Buffer, *, blocking: bool = True, timeout: float | None = None) -> None: ...
def recv(cid: SupportsIndex, default: object = ...) -> object: ...
def recv(cid: SupportsIndex, default: object = ...) -> tuple[Any, Literal[1, 2, 3]]: ...
def close(cid: SupportsIndex, *, send: bool = False, recv: bool = False) -> None: ...
def get_count(cid: SupportsIndex) -> int: ...
def get_info(cid: SupportsIndex) -> ChannelInfo: ...
def get_channel_defaults(cid: SupportsIndex) -> Literal[1, 2, 3]: ...
def release(cid: SupportsIndex, *, send: bool = False, recv: bool = False, force: bool = False) -> None: ...

View File

@@ -1,5 +1,5 @@
from collections.abc import Iterable, Sequence
from typing import TypeVar
from typing import Final, TypeVar
_T = TypeVar("_T")
_K = TypeVar("_K")
@@ -7,15 +7,15 @@ _V = TypeVar("_V")
__all__ = ["compiler_fixup", "customize_config_vars", "customize_compiler", "get_platform_osx"]
_UNIVERSAL_CONFIG_VARS: tuple[str, ...] # undocumented
_COMPILER_CONFIG_VARS: tuple[str, ...] # undocumented
_INITPRE: str # undocumented
_UNIVERSAL_CONFIG_VARS: Final[tuple[str, ...]] # undocumented
_COMPILER_CONFIG_VARS: Final[tuple[str, ...]] # undocumented
_INITPRE: Final[str] # undocumented
def _find_executable(executable: str, path: str | None = None) -> str | None: ... # undocumented
def _read_output(commandstring: str, capture_stderr: bool = False) -> str | None: ... # undocumented
def _find_build_tool(toolname: str) -> str: ... # undocumented
_SYSTEM_VERSION: str | None # undocumented
_SYSTEM_VERSION: Final[str | None] # undocumented
def _get_system_version() -> str: ... # undocumented
def _remove_original_values(_config_vars: dict[str, str]) -> None: ... # undocumented

View File

@@ -1,30 +1,30 @@
import sys
from typing import Literal
from typing import Final
SF_APPEND: Literal[0x00040000]
SF_ARCHIVED: Literal[0x00010000]
SF_IMMUTABLE: Literal[0x00020000]
SF_NOUNLINK: Literal[0x00100000]
SF_SNAPSHOT: Literal[0x00200000]
SF_APPEND: Final = 0x00040000
SF_ARCHIVED: Final = 0x00010000
SF_IMMUTABLE: Final = 0x00020000
SF_NOUNLINK: Final = 0x00100000
SF_SNAPSHOT: Final = 0x00200000
ST_MODE: Literal[0]
ST_INO: Literal[1]
ST_DEV: Literal[2]
ST_NLINK: Literal[3]
ST_UID: Literal[4]
ST_GID: Literal[5]
ST_SIZE: Literal[6]
ST_ATIME: Literal[7]
ST_MTIME: Literal[8]
ST_CTIME: Literal[9]
ST_MODE: Final = 0
ST_INO: Final = 1
ST_DEV: Final = 2
ST_NLINK: Final = 3
ST_UID: Final = 4
ST_GID: Final = 5
ST_SIZE: Final = 6
ST_ATIME: Final = 7
ST_MTIME: Final = 8
ST_CTIME: Final = 9
S_IFIFO: Literal[0o010000]
S_IFLNK: Literal[0o120000]
S_IFREG: Literal[0o100000]
S_IFSOCK: Literal[0o140000]
S_IFBLK: Literal[0o060000]
S_IFCHR: Literal[0o020000]
S_IFDIR: Literal[0o040000]
S_IFIFO: Final = 0o010000
S_IFLNK: Final = 0o120000
S_IFREG: Final = 0o100000
S_IFSOCK: Final = 0o140000
S_IFBLK: Final = 0o060000
S_IFCHR: Final = 0o020000
S_IFDIR: Final = 0o040000
# These are 0 on systems that don't support the specific kind of file.
# Example: Linux doesn't support door files, so S_IFDOOR is 0 on linux.
@@ -32,37 +32,37 @@ S_IFDOOR: int
S_IFPORT: int
S_IFWHT: int
S_ISUID: Literal[0o4000]
S_ISGID: Literal[0o2000]
S_ISVTX: Literal[0o1000]
S_ISUID: Final = 0o4000
S_ISGID: Final = 0o2000
S_ISVTX: Final = 0o1000
S_IRWXU: Literal[0o0700]
S_IRUSR: Literal[0o0400]
S_IWUSR: Literal[0o0200]
S_IXUSR: Literal[0o0100]
S_IRWXU: Final = 0o0700
S_IRUSR: Final = 0o0400
S_IWUSR: Final = 0o0200
S_IXUSR: Final = 0o0100
S_IRWXG: Literal[0o0070]
S_IRGRP: Literal[0o0040]
S_IWGRP: Literal[0o0020]
S_IXGRP: Literal[0o0010]
S_IRWXG: Final = 0o0070
S_IRGRP: Final = 0o0040
S_IWGRP: Final = 0o0020
S_IXGRP: Final = 0o0010
S_IRWXO: Literal[0o0007]
S_IROTH: Literal[0o0004]
S_IWOTH: Literal[0o0002]
S_IXOTH: Literal[0o0001]
S_IRWXO: Final = 0o0007
S_IROTH: Final = 0o0004
S_IWOTH: Final = 0o0002
S_IXOTH: Final = 0o0001
S_ENFMT: Literal[0o2000]
S_IREAD: Literal[0o0400]
S_IWRITE: Literal[0o0200]
S_IEXEC: Literal[0o0100]
S_ENFMT: Final = 0o2000
S_IREAD: Final = 0o0400
S_IWRITE: Final = 0o0200
S_IEXEC: Final = 0o0100
UF_APPEND: Literal[0x00000004]
UF_COMPRESSED: Literal[0x00000020] # OS X 10.6+ only
UF_HIDDEN: Literal[0x00008000] # OS X 10.5+ only
UF_IMMUTABLE: Literal[0x00000002]
UF_NODUMP: Literal[0x00000001]
UF_NOUNLINK: Literal[0x00000010]
UF_OPAQUE: Literal[0x00000008]
UF_APPEND: Final = 0x00000004
UF_COMPRESSED: Final = 0x00000020 # OS X 10.6+ only
UF_HIDDEN: Final = 0x00008000 # OS X 10.5+ only
UF_IMMUTABLE: Final = 0x00000002
UF_NODUMP: Final = 0x00000001
UF_NOUNLINK: Final = 0x00000010
UF_OPAQUE: Final = 0x00000008
def S_IMODE(mode: int, /) -> int: ...
def S_IFMT(mode: int, /) -> int: ...
@@ -84,34 +84,36 @@ if sys.platform == "win32":
IO_REPARSE_TAG_APPEXECLINK: int
if sys.platform == "win32":
FILE_ATTRIBUTE_ARCHIVE: Literal[32]
FILE_ATTRIBUTE_COMPRESSED: Literal[2048]
FILE_ATTRIBUTE_DEVICE: Literal[64]
FILE_ATTRIBUTE_DIRECTORY: Literal[16]
FILE_ATTRIBUTE_ENCRYPTED: Literal[16384]
FILE_ATTRIBUTE_HIDDEN: Literal[2]
FILE_ATTRIBUTE_INTEGRITY_STREAM: Literal[32768]
FILE_ATTRIBUTE_NORMAL: Literal[128]
FILE_ATTRIBUTE_NOT_CONTENT_INDEXED: Literal[8192]
FILE_ATTRIBUTE_NO_SCRUB_DATA: Literal[131072]
FILE_ATTRIBUTE_OFFLINE: Literal[4096]
FILE_ATTRIBUTE_READONLY: Literal[1]
FILE_ATTRIBUTE_REPARSE_POINT: Literal[1024]
FILE_ATTRIBUTE_SPARSE_FILE: Literal[512]
FILE_ATTRIBUTE_SYSTEM: Literal[4]
FILE_ATTRIBUTE_TEMPORARY: Literal[256]
FILE_ATTRIBUTE_VIRTUAL: Literal[65536]
FILE_ATTRIBUTE_ARCHIVE: Final = 32
FILE_ATTRIBUTE_COMPRESSED: Final = 2048
FILE_ATTRIBUTE_DEVICE: Final = 64
FILE_ATTRIBUTE_DIRECTORY: Final = 16
FILE_ATTRIBUTE_ENCRYPTED: Final = 16384
FILE_ATTRIBUTE_HIDDEN: Final = 2
FILE_ATTRIBUTE_INTEGRITY_STREAM: Final = 32768
FILE_ATTRIBUTE_NORMAL: Final = 128
FILE_ATTRIBUTE_NOT_CONTENT_INDEXED: Final = 8192
FILE_ATTRIBUTE_NO_SCRUB_DATA: Final = 131072
FILE_ATTRIBUTE_OFFLINE: Final = 4096
FILE_ATTRIBUTE_READONLY: Final = 1
FILE_ATTRIBUTE_REPARSE_POINT: Final = 1024
FILE_ATTRIBUTE_SPARSE_FILE: Final = 512
FILE_ATTRIBUTE_SYSTEM: Final = 4
FILE_ATTRIBUTE_TEMPORARY: Final = 256
FILE_ATTRIBUTE_VIRTUAL: Final = 65536
if sys.version_info >= (3, 13):
SF_SETTABLE: Literal[0x3FFF0000]
# Varies by platform.
SF_SETTABLE: Final[int]
# https://github.com/python/cpython/issues/114081#issuecomment-2119017790
# SF_RESTRICTED: Literal[0x00080000]
SF_FIRMLINK: Literal[0x00800000]
SF_DATALESS: Literal[0x40000000]
SF_FIRMLINK: Final = 0x00800000
SF_DATALESS: Final = 0x40000000
SF_SUPPORTED: Literal[0x9F0000]
SF_SYNTHETIC: Literal[0xC0000000]
if sys.platform == "darwin":
SF_SUPPORTED: Final = 0x9F0000
SF_SYNTHETIC: Final = 0xC0000000
UF_TRACKED: Literal[0x00000040]
UF_DATAVAULT: Literal[0x00000080]
UF_SETTABLE: Literal[0x0000FFFF]
UF_TRACKED: Final = 0x00000040
UF_DATAVAULT: Final = 0x00000080
UF_SETTABLE: Final = 0x0000FFFF

View File

@@ -1,6 +1,6 @@
import sys
from collections.abc import Callable
from typing import Any, ClassVar, Literal, final
from typing import Any, ClassVar, Final, final
from typing_extensions import TypeAlias
# _tkinter is meant to be only used internally by tkinter, but some tkinter
@@ -95,16 +95,16 @@ class TkappType:
def settrace(self, func: _TkinterTraceFunc | None, /) -> None: ...
# These should be kept in sync with tkinter.tix constants, except ALL_EVENTS which doesn't match TCL_ALL_EVENTS
ALL_EVENTS: Literal[-3]
FILE_EVENTS: Literal[8]
IDLE_EVENTS: Literal[32]
TIMER_EVENTS: Literal[16]
WINDOW_EVENTS: Literal[4]
ALL_EVENTS: Final = -3
FILE_EVENTS: Final = 8
IDLE_EVENTS: Final = 32
TIMER_EVENTS: Final = 16
WINDOW_EVENTS: Final = 4
DONT_WAIT: Literal[2]
EXCEPTION: Literal[8]
READABLE: Literal[2]
WRITABLE: Literal[4]
DONT_WAIT: Final = 2
EXCEPTION: Final = 8
READABLE: Final = 2
WRITABLE: Final = 4
TCL_VERSION: str
TK_VERSION: str

View File

@@ -1,117 +1,117 @@
import sys
from _typeshed import ReadableBuffer
from collections.abc import Sequence
from typing import Any, Literal, NoReturn, final, overload
from typing import Any, Final, Literal, NoReturn, final, overload
if sys.platform == "win32":
ABOVE_NORMAL_PRIORITY_CLASS: Literal[0x8000]
BELOW_NORMAL_PRIORITY_CLASS: Literal[0x4000]
ABOVE_NORMAL_PRIORITY_CLASS: Final = 0x8000
BELOW_NORMAL_PRIORITY_CLASS: Final = 0x4000
CREATE_BREAKAWAY_FROM_JOB: Literal[0x1000000]
CREATE_DEFAULT_ERROR_MODE: Literal[0x4000000]
CREATE_NO_WINDOW: Literal[0x8000000]
CREATE_NEW_CONSOLE: Literal[0x10]
CREATE_NEW_PROCESS_GROUP: Literal[0x200]
CREATE_BREAKAWAY_FROM_JOB: Final = 0x1000000
CREATE_DEFAULT_ERROR_MODE: Final = 0x4000000
CREATE_NO_WINDOW: Final = 0x8000000
CREATE_NEW_CONSOLE: Final = 0x10
CREATE_NEW_PROCESS_GROUP: Final = 0x200
DETACHED_PROCESS: Literal[8]
DUPLICATE_CLOSE_SOURCE: Literal[1]
DUPLICATE_SAME_ACCESS: Literal[2]
DETACHED_PROCESS: Final = 8
DUPLICATE_CLOSE_SOURCE: Final = 1
DUPLICATE_SAME_ACCESS: Final = 2
ERROR_ALREADY_EXISTS: Literal[183]
ERROR_BROKEN_PIPE: Literal[109]
ERROR_IO_PENDING: Literal[997]
ERROR_MORE_DATA: Literal[234]
ERROR_NETNAME_DELETED: Literal[64]
ERROR_NO_DATA: Literal[232]
ERROR_NO_SYSTEM_RESOURCES: Literal[1450]
ERROR_OPERATION_ABORTED: Literal[995]
ERROR_PIPE_BUSY: Literal[231]
ERROR_PIPE_CONNECTED: Literal[535]
ERROR_SEM_TIMEOUT: Literal[121]
ERROR_ALREADY_EXISTS: Final = 183
ERROR_BROKEN_PIPE: Final = 109
ERROR_IO_PENDING: Final = 997
ERROR_MORE_DATA: Final = 234
ERROR_NETNAME_DELETED: Final = 64
ERROR_NO_DATA: Final = 232
ERROR_NO_SYSTEM_RESOURCES: Final = 1450
ERROR_OPERATION_ABORTED: Final = 995
ERROR_PIPE_BUSY: Final = 231
ERROR_PIPE_CONNECTED: Final = 535
ERROR_SEM_TIMEOUT: Final = 121
FILE_FLAG_FIRST_PIPE_INSTANCE: Literal[0x80000]
FILE_FLAG_OVERLAPPED: Literal[0x40000000]
FILE_FLAG_FIRST_PIPE_INSTANCE: Final = 0x80000
FILE_FLAG_OVERLAPPED: Final = 0x40000000
FILE_GENERIC_READ: Literal[1179785]
FILE_GENERIC_WRITE: Literal[1179926]
FILE_GENERIC_READ: Final = 1179785
FILE_GENERIC_WRITE: Final = 1179926
FILE_MAP_ALL_ACCESS: Literal[983071]
FILE_MAP_COPY: Literal[1]
FILE_MAP_EXECUTE: Literal[32]
FILE_MAP_READ: Literal[4]
FILE_MAP_WRITE: Literal[2]
FILE_MAP_ALL_ACCESS: Final = 983071
FILE_MAP_COPY: Final = 1
FILE_MAP_EXECUTE: Final = 32
FILE_MAP_READ: Final = 4
FILE_MAP_WRITE: Final = 2
FILE_TYPE_CHAR: Literal[2]
FILE_TYPE_DISK: Literal[1]
FILE_TYPE_PIPE: Literal[3]
FILE_TYPE_REMOTE: Literal[32768]
FILE_TYPE_UNKNOWN: Literal[0]
FILE_TYPE_CHAR: Final = 2
FILE_TYPE_DISK: Final = 1
FILE_TYPE_PIPE: Final = 3
FILE_TYPE_REMOTE: Final = 32768
FILE_TYPE_UNKNOWN: Final = 0
GENERIC_READ: Literal[0x80000000]
GENERIC_WRITE: Literal[0x40000000]
HIGH_PRIORITY_CLASS: Literal[0x80]
INFINITE: Literal[0xFFFFFFFF]
GENERIC_READ: Final = 0x80000000
GENERIC_WRITE: Final = 0x40000000
HIGH_PRIORITY_CLASS: Final = 0x80
INFINITE: Final = 0xFFFFFFFF
# Ignore the Flake8 error -- flake8-pyi assumes
# most numbers this long will be implementation details,
# but here we can see that it's a power of 2
INVALID_HANDLE_VALUE: Literal[0xFFFFFFFFFFFFFFFF] # noqa: Y054
IDLE_PRIORITY_CLASS: Literal[0x40]
NORMAL_PRIORITY_CLASS: Literal[0x20]
REALTIME_PRIORITY_CLASS: Literal[0x100]
NMPWAIT_WAIT_FOREVER: Literal[0xFFFFFFFF]
INVALID_HANDLE_VALUE: Final = 0xFFFFFFFFFFFFFFFF # noqa: Y054
IDLE_PRIORITY_CLASS: Final = 0x40
NORMAL_PRIORITY_CLASS: Final = 0x20
REALTIME_PRIORITY_CLASS: Final = 0x100
NMPWAIT_WAIT_FOREVER: Final = 0xFFFFFFFF
MEM_COMMIT: Literal[0x1000]
MEM_FREE: Literal[0x10000]
MEM_IMAGE: Literal[0x1000000]
MEM_MAPPED: Literal[0x40000]
MEM_PRIVATE: Literal[0x20000]
MEM_RESERVE: Literal[0x2000]
MEM_COMMIT: Final = 0x1000
MEM_FREE: Final = 0x10000
MEM_IMAGE: Final = 0x1000000
MEM_MAPPED: Final = 0x40000
MEM_PRIVATE: Final = 0x20000
MEM_RESERVE: Final = 0x2000
NULL: Literal[0]
OPEN_EXISTING: Literal[3]
NULL: Final = 0
OPEN_EXISTING: Final = 3
PIPE_ACCESS_DUPLEX: Literal[3]
PIPE_ACCESS_INBOUND: Literal[1]
PIPE_READMODE_MESSAGE: Literal[2]
PIPE_TYPE_MESSAGE: Literal[4]
PIPE_UNLIMITED_INSTANCES: Literal[255]
PIPE_WAIT: Literal[0]
PIPE_ACCESS_DUPLEX: Final = 3
PIPE_ACCESS_INBOUND: Final = 1
PIPE_READMODE_MESSAGE: Final = 2
PIPE_TYPE_MESSAGE: Final = 4
PIPE_UNLIMITED_INSTANCES: Final = 255
PIPE_WAIT: Final = 0
PAGE_EXECUTE: Literal[0x10]
PAGE_EXECUTE_READ: Literal[0x20]
PAGE_EXECUTE_READWRITE: Literal[0x40]
PAGE_EXECUTE_WRITECOPY: Literal[0x80]
PAGE_GUARD: Literal[0x100]
PAGE_NOACCESS: Literal[0x1]
PAGE_NOCACHE: Literal[0x200]
PAGE_READONLY: Literal[0x2]
PAGE_READWRITE: Literal[0x4]
PAGE_WRITECOMBINE: Literal[0x400]
PAGE_WRITECOPY: Literal[0x8]
PAGE_EXECUTE: Final = 0x10
PAGE_EXECUTE_READ: Final = 0x20
PAGE_EXECUTE_READWRITE: Final = 0x40
PAGE_EXECUTE_WRITECOPY: Final = 0x80
PAGE_GUARD: Final = 0x100
PAGE_NOACCESS: Final = 0x1
PAGE_NOCACHE: Final = 0x200
PAGE_READONLY: Final = 0x2
PAGE_READWRITE: Final = 0x4
PAGE_WRITECOMBINE: Final = 0x400
PAGE_WRITECOPY: Final = 0x8
PROCESS_ALL_ACCESS: Literal[0x1FFFFF]
PROCESS_DUP_HANDLE: Literal[0x40]
PROCESS_ALL_ACCESS: Final = 0x1FFFFF
PROCESS_DUP_HANDLE: Final = 0x40
SEC_COMMIT: Literal[0x8000000]
SEC_IMAGE: Literal[0x1000000]
SEC_LARGE_PAGES: Literal[0x80000000]
SEC_NOCACHE: Literal[0x10000000]
SEC_RESERVE: Literal[0x4000000]
SEC_WRITECOMBINE: Literal[0x40000000]
SEC_COMMIT: Final = 0x8000000
SEC_IMAGE: Final = 0x1000000
SEC_LARGE_PAGES: Final = 0x80000000
SEC_NOCACHE: Final = 0x10000000
SEC_RESERVE: Final = 0x4000000
SEC_WRITECOMBINE: Final = 0x40000000
STARTF_USESHOWWINDOW: Literal[0x1]
STARTF_USESTDHANDLES: Literal[0x100]
STARTF_USESHOWWINDOW: Final = 0x1
STARTF_USESTDHANDLES: Final = 0x100
STD_ERROR_HANDLE: Literal[0xFFFFFFF4]
STD_OUTPUT_HANDLE: Literal[0xFFFFFFF5]
STD_INPUT_HANDLE: Literal[0xFFFFFFF6]
STD_ERROR_HANDLE: Final = 0xFFFFFFF4
STD_OUTPUT_HANDLE: Final = 0xFFFFFFF5
STD_INPUT_HANDLE: Final = 0xFFFFFFF6
STILL_ACTIVE: Literal[259]
SW_HIDE: Literal[0]
SYNCHRONIZE: Literal[0x100000]
WAIT_ABANDONED_0: Literal[128]
WAIT_OBJECT_0: Literal[0]
WAIT_TIMEOUT: Literal[258]
STILL_ACTIVE: Final = 259
SW_HIDE: Final = 0
SYNCHRONIZE: Final = 0x100000
WAIT_ABANDONED_0: Final = 128
WAIT_OBJECT_0: Final = 0
WAIT_TIMEOUT: Final = 258
if sys.version_info >= (3, 10):
LOCALE_NAME_INVARIANT: str
@@ -131,32 +131,32 @@ if sys.platform == "win32":
LCMAP_UPPERCASE: int
if sys.version_info >= (3, 12):
COPYFILE2_CALLBACK_CHUNK_STARTED: Literal[1]
COPYFILE2_CALLBACK_CHUNK_FINISHED: Literal[2]
COPYFILE2_CALLBACK_STREAM_STARTED: Literal[3]
COPYFILE2_CALLBACK_STREAM_FINISHED: Literal[4]
COPYFILE2_CALLBACK_POLL_CONTINUE: Literal[5]
COPYFILE2_CALLBACK_ERROR: Literal[6]
COPYFILE2_CALLBACK_CHUNK_STARTED: Final = 1
COPYFILE2_CALLBACK_CHUNK_FINISHED: Final = 2
COPYFILE2_CALLBACK_STREAM_STARTED: Final = 3
COPYFILE2_CALLBACK_STREAM_FINISHED: Final = 4
COPYFILE2_CALLBACK_POLL_CONTINUE: Final = 5
COPYFILE2_CALLBACK_ERROR: Final = 6
COPYFILE2_PROGRESS_CONTINUE: Literal[0]
COPYFILE2_PROGRESS_CANCEL: Literal[1]
COPYFILE2_PROGRESS_STOP: Literal[2]
COPYFILE2_PROGRESS_QUIET: Literal[3]
COPYFILE2_PROGRESS_PAUSE: Literal[4]
COPYFILE2_PROGRESS_CONTINUE: Final = 0
COPYFILE2_PROGRESS_CANCEL: Final = 1
COPYFILE2_PROGRESS_STOP: Final = 2
COPYFILE2_PROGRESS_QUIET: Final = 3
COPYFILE2_PROGRESS_PAUSE: Final = 4
COPY_FILE_FAIL_IF_EXISTS: Literal[0x1]
COPY_FILE_RESTARTABLE: Literal[0x2]
COPY_FILE_OPEN_SOURCE_FOR_WRITE: Literal[0x4]
COPY_FILE_ALLOW_DECRYPTED_DESTINATION: Literal[0x8]
COPY_FILE_COPY_SYMLINK: Literal[0x800]
COPY_FILE_NO_BUFFERING: Literal[0x1000]
COPY_FILE_REQUEST_SECURITY_PRIVILEGES: Literal[0x2000]
COPY_FILE_RESUME_FROM_PAUSE: Literal[0x4000]
COPY_FILE_NO_OFFLOAD: Literal[0x40000]
COPY_FILE_REQUEST_COMPRESSED_TRAFFIC: Literal[0x10000000]
COPY_FILE_FAIL_IF_EXISTS: Final = 0x1
COPY_FILE_RESTARTABLE: Final = 0x2
COPY_FILE_OPEN_SOURCE_FOR_WRITE: Final = 0x4
COPY_FILE_ALLOW_DECRYPTED_DESTINATION: Final = 0x8
COPY_FILE_COPY_SYMLINK: Final = 0x800
COPY_FILE_NO_BUFFERING: Final = 0x1000
COPY_FILE_REQUEST_SECURITY_PRIVILEGES: Final = 0x2000
COPY_FILE_RESUME_FROM_PAUSE: Final = 0x4000
COPY_FILE_NO_OFFLOAD: Final = 0x40000
COPY_FILE_REQUEST_COMPRESSED_TRAFFIC: Final = 0x10000000
ERROR_ACCESS_DENIED: Literal[5]
ERROR_PRIVILEGE_NOT_HELD: Literal[1314]
ERROR_ACCESS_DENIED: Final = 5
ERROR_PRIVILEGE_NOT_HELD: Final = 1314
def CloseHandle(handle: int, /) -> None: ...
@overload

View File

@@ -2,7 +2,7 @@ import sys
from _typeshed import sentinel
from collections.abc import Callable, Generator, Iterable, Sequence
from re import Pattern
from typing import IO, Any, Generic, Literal, NewType, NoReturn, Protocol, TypeVar, overload
from typing import IO, Any, Final, Generic, NewType, NoReturn, Protocol, TypeVar, overload
from typing_extensions import Self, TypeAlias, deprecated
__all__ = [
@@ -43,15 +43,15 @@ _ActionStr: TypeAlias = str
# callers that don't use a literal argument
_NArgsStr: TypeAlias = str
ONE_OR_MORE: Literal["+"]
OPTIONAL: Literal["?"]
PARSER: Literal["A..."]
REMAINDER: Literal["..."]
ONE_OR_MORE: Final = "+"
OPTIONAL: Final = "?"
PARSER: Final = "A..."
REMAINDER: Final = "..."
_SUPPRESS_T = NewType("_SUPPRESS_T", str)
SUPPRESS: _SUPPRESS_T | str # not using Literal because argparse sometimes compares SUPPRESS with is
# the | str is there so that foo = argparse.SUPPRESS; foo = "test" checks out in mypy
ZERO_OR_MORE: Literal["*"]
_UNRECOGNIZED_ARGS_ATTR: str # undocumented
ZERO_OR_MORE: Final = "*"
_UNRECOGNIZED_ARGS_ATTR: Final[str] # undocumented
class ArgumentError(Exception):
argument_name: str | None

View File

@@ -1,6 +1,6 @@
from collections.abc import Callable, Sequence
from contextvars import Context
from typing import Any, Literal
from typing import Any, Final
from . import futures
@@ -11,9 +11,9 @@ __all__ = ()
# That's why the import order is reversed.
from .futures import isfuture as isfuture
_PENDING: Literal["PENDING"] # undocumented
_CANCELLED: Literal["CANCELLED"] # undocumented
_FINISHED: Literal["FINISHED"] # undocumented
_PENDING: Final = "PENDING" # undocumented
_CANCELLED: Final = "CANCELLED" # undocumented
_FINISHED: Final = "FINISHED" # undocumented
def _format_callbacks(cb: Sequence[tuple[Callable[[futures.Future[Any]], None], Context]]) -> str: ... # undocumented
def _future_repr_info(future: futures.Future[Any]) -> list[str]: ... # undocumented

View File

@@ -1,18 +1,18 @@
import enum
import sys
from typing import Literal
from typing import Final
LOG_THRESHOLD_FOR_CONNLOST_WRITES: Literal[5]
ACCEPT_RETRY_DELAY: Literal[1]
DEBUG_STACK_DEPTH: Literal[10]
LOG_THRESHOLD_FOR_CONNLOST_WRITES: Final = 5
ACCEPT_RETRY_DELAY: Final = 1
DEBUG_STACK_DEPTH: Final = 10
SSL_HANDSHAKE_TIMEOUT: float
SENDFILE_FALLBACK_READBUFFER_SIZE: Literal[262144]
SENDFILE_FALLBACK_READBUFFER_SIZE: Final = 262144
if sys.version_info >= (3, 11):
SSL_SHUTDOWN_TIMEOUT: float
FLOW_CONTROL_HIGH_WATER_SSL_READ: Literal[256]
FLOW_CONTROL_HIGH_WATER_SSL_WRITE: Literal[512]
FLOW_CONTROL_HIGH_WATER_SSL_READ: Final = 256
FLOW_CONTROL_HIGH_WATER_SSL_WRITE: Final = 512
if sys.version_info >= (3, 12):
THREAD_JOIN_TIMEOUT: Literal[300]
THREAD_JOIN_TIMEOUT: Final = 300
class _SendfileMode(enum.Enum):
UNSUPPORTED = 1

View File

@@ -3,7 +3,7 @@ import sys
from collections import deque
from collections.abc import Callable
from enum import Enum
from typing import Any, ClassVar, Literal
from typing import Any, ClassVar, Final, Literal
from typing_extensions import TypeAlias
from . import constants, events, futures, protocols, transports
@@ -29,10 +29,10 @@ if sys.version_info >= (3, 11):
def add_flowcontrol_defaults(high: int | None, low: int | None, kb: int) -> tuple[int, int]: ...
else:
_UNWRAPPED: Literal["UNWRAPPED"]
_DO_HANDSHAKE: Literal["DO_HANDSHAKE"]
_WRAPPED: Literal["WRAPPED"]
_SHUTDOWN: Literal["SHUTDOWN"]
_UNWRAPPED: Final = "UNWRAPPED"
_DO_HANDSHAKE: Final = "DO_HANDSHAKE"
_WRAPPED: Final = "WRAPPED"
_SHUTDOWN: Final = "SHUTDOWN"
if sys.version_info < (3, 11):
class _SSLPipe:

View File

@@ -429,7 +429,11 @@ class Task(Future[_T_co]): # type: ignore[type-var] # pyright: ignore[reportIn
self, coro: _TaskCompatibleCoro[_T_co], *, loop: AbstractEventLoop = ..., name: str | None = ...
) -> None: ...
def get_coro(self) -> _TaskCompatibleCoro[_T_co]: ...
if sys.version_info >= (3, 12):
def get_coro(self) -> _TaskCompatibleCoro[_T_co] | None: ...
else:
def get_coro(self) -> _TaskCompatibleCoro[_T_co]: ...
def get_name(self) -> str: ...
def set_name(self, value: object, /) -> None: ...
if sys.version_info >= (3, 12):

View File

@@ -2,7 +2,7 @@ import socket
import sys
from _typeshed import Incomplete, ReadableBuffer, WriteableBuffer
from collections.abc import Callable
from typing import IO, Any, ClassVar, Literal, NoReturn
from typing import IO, Any, ClassVar, Final, NoReturn
from . import events, futures, proactor_events, selector_events, streams, windows_utils
@@ -28,10 +28,10 @@ if sys.platform == "win32":
"WindowsProactorEventLoopPolicy",
)
NULL: Literal[0]
INFINITE: Literal[0xFFFFFFFF]
ERROR_CONNECTION_REFUSED: Literal[1225]
ERROR_CONNECTION_ABORTED: Literal[1236]
NULL: Final = 0
INFINITE: Final = 0xFFFFFFFF
ERROR_CONNECTION_REFUSED: Final = 1225
ERROR_CONNECTION_ABORTED: Final = 1236
CONNECT_PIPE_INIT_DELAY: float
CONNECT_PIPE_MAX_DELAY: float

View File

@@ -2,13 +2,13 @@ import subprocess
import sys
from collections.abc import Callable
from types import TracebackType
from typing import Any, AnyStr, Literal
from typing import Any, AnyStr, Final
from typing_extensions import Self
if sys.platform == "win32":
__all__ = ("pipe", "Popen", "PIPE", "PipeHandle")
BUFSIZE: Literal[8192]
BUFSIZE: Final = 8192
PIPE = subprocess.PIPE
STDOUT = subprocess.STDOUT
def pipe(*, duplex: bool = False, overlapped: tuple[bool, bool] = (True, True), bufsize: int = 8192) -> tuple[int, int]: ...

View File

@@ -2,7 +2,7 @@ import sys
from _typeshed import ExcInfo, TraceFunction, Unused
from collections.abc import Callable, Iterable, Mapping
from types import CodeType, FrameType, TracebackType
from typing import IO, Any, Literal, SupportsInt, TypeVar
from typing import IO, Any, Final, SupportsInt, TypeVar
from typing_extensions import ParamSpec
__all__ = ["BdbQuit", "Bdb", "Breakpoint"]
@@ -10,7 +10,10 @@ __all__ = ["BdbQuit", "Bdb", "Breakpoint"]
_T = TypeVar("_T")
_P = ParamSpec("_P")
GENERATOR_AND_COROUTINE_FLAGS: Literal[672]
# A union of code-object flags at runtime.
# The exact values of code-object flags are implementation details,
# so we don't include the value of this constant in the stubs.
GENERATOR_AND_COROUTINE_FLAGS: Final[int]
class BdbQuit(Exception): ...
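The replaced value is easy to reconstruct: 672 is simply the OR of the three generator/coroutine code-object flags, which is exactly why the new stub leaves it as an unspecified int. A quick check against the values shown in the inspect stub further down in this diff:

from inspect import CO_ASYNC_GENERATOR, CO_COROUTINE, CO_GENERATOR

# 32 | 128 | 512 == 672 on current CPython, but the stubs no longer promise the exact value.
assert CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR == 672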

View File

@@ -1,14 +1,14 @@
from _typeshed import SizedBuffer
from typing import IO, Any, Literal
from typing import IO, Any, Final
from typing_extensions import TypeAlias
__all__ = ["binhex", "hexbin", "Error"]
class Error(Exception): ...
REASONABLY_LARGE: Literal[32768]
LINELEN: Literal[64]
RUNCHAR: Literal[b"\x90"]
REASONABLY_LARGE: Final = 32768
LINELEN: Final = 64
RUNCHAR: Final = b"\x90"
class FInfo:
Type: str

View File

@@ -1868,6 +1868,7 @@ class BaseException:
__suppress_context__: bool
__traceback__: TracebackType | None
def __init__(self, *args: object) -> None: ...
def __new__(cls, *args: Any, **kwds: Any) -> Self: ...
def __setstate__(self, state: dict[str, Any] | None, /) -> None: ...
def with_traceback(self, tb: TracebackType | None, /) -> Self: ...
if sys.version_info >= (3, 11):

View File

@@ -1,9 +1,9 @@
from collections.abc import Callable
from typing import IO, Any, Literal
from typing import IO, Any, Final
__all__ = ["Cmd"]
PROMPT: Literal["(Cmd) "]
PROMPT: Final = "(Cmd) "
IDENTCHARS: str # Too big to be `Literal`
class Cmd:

View File

@@ -3,7 +3,7 @@ from _codecs import *
from _typeshed import ReadableBuffer
from abc import abstractmethod
from collections.abc import Callable, Generator, Iterable
from typing import Any, BinaryIO, Literal, Protocol, TextIO
from typing import Any, BinaryIO, Final, Literal, Protocol, TextIO
from typing_extensions import Self
__all__ = [
@@ -53,10 +53,10 @@ __all__ = [
"lookup_error",
]
BOM32_BE: Literal[b"\xfe\xff"]
BOM32_LE: Literal[b"\xff\xfe"]
BOM64_BE: Literal[b"\x00\x00\xfe\xff"]
BOM64_LE: Literal[b"\xff\xfe\x00\x00"]
BOM32_BE: Final = b"\xfe\xff"
BOM32_LE: Final = b"\xff\xfe"
BOM64_BE: Final = b"\x00\x00\xfe\xff"
BOM64_LE: Final = b"\xff\xfe\x00\x00"
class _WritableStream(Protocol):
def write(self, data: bytes, /) -> object: ...
@@ -135,23 +135,23 @@ def EncodedFile(file: _Stream, data_encoding: str, file_encoding: str | None = N
def iterencode(iterator: Iterable[str], encoding: str, errors: str = "strict") -> Generator[bytes, None, None]: ...
def iterdecode(iterator: Iterable[bytes], encoding: str, errors: str = "strict") -> Generator[str, None, None]: ...
BOM: Literal[b"\xff\xfe", b"\xfe\xff"] # depends on `sys.byteorder`
BOM_BE: Literal[b"\xfe\xff"]
BOM_LE: Literal[b"\xff\xfe"]
BOM_UTF8: Literal[b"\xef\xbb\xbf"]
BOM_UTF16: Literal[b"\xff\xfe", b"\xfe\xff"] # depends on `sys.byteorder`
BOM_UTF16_BE: Literal[b"\xfe\xff"]
BOM_UTF16_LE: Literal[b"\xff\xfe"]
BOM_UTF32: Literal[b"\xff\xfe\x00\x00", b"\x00\x00\xfe\xff"] # depends on `sys.byteorder`
BOM_UTF32_BE: Literal[b"\x00\x00\xfe\xff"]
BOM_UTF32_LE: Literal[b"\xff\xfe\x00\x00"]
BOM: Final[Literal[b"\xff\xfe", b"\xfe\xff"]] # depends on `sys.byteorder`
BOM_BE: Final = b"\xfe\xff"
BOM_LE: Final = b"\xff\xfe"
BOM_UTF8: Final = b"\xef\xbb\xbf"
BOM_UTF16: Final[Literal[b"\xff\xfe", b"\xfe\xff"]] # depends on `sys.byteorder`
BOM_UTF16_BE: Final = b"\xfe\xff"
BOM_UTF16_LE: Final = b"\xff\xfe"
BOM_UTF32: Final[Literal[b"\xff\xfe\x00\x00", b"\x00\x00\xfe\xff"]] # depends on `sys.byteorder`
BOM_UTF32_BE: Final = b"\x00\x00\xfe\xff"
BOM_UTF32_LE: Final = b"\xff\xfe\x00\x00"
def strict_errors(exception: UnicodeError) -> tuple[str | bytes, int]: ...
def replace_errors(exception: UnicodeError) -> tuple[str | bytes, int]: ...
def ignore_errors(exception: UnicodeError) -> tuple[str | bytes, int]: ...
def xmlcharrefreplace_errors(exception: UnicodeError) -> tuple[str | bytes, int]: ...
def backslashreplace_errors(exception: UnicodeError) -> tuple[str | bytes, int]: ...
def namereplace_errors(exception: UnicodeError) -> tuple[str | bytes, int]: ...
def strict_errors(exception: UnicodeError, /) -> tuple[str | bytes, int]: ...
def replace_errors(exception: UnicodeError, /) -> tuple[str | bytes, int]: ...
def ignore_errors(exception: UnicodeError, /) -> tuple[str | bytes, int]: ...
def xmlcharrefreplace_errors(exception: UnicodeError, /) -> tuple[str | bytes, int]: ...
def backslashreplace_errors(exception: UnicodeError, /) -> tuple[str | bytes, int]: ...
def namereplace_errors(exception: UnicodeError, /) -> tuple[str | bytes, int]: ...
class Codec:
# These are sort of @abstractmethod but sort of not.
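As a side note on the "depends on `sys.byteorder`" comments above: the byte-order-dependent constants collapse to the little- or big-endian mark of the running interpreter. A small check, assuming nothing beyond the standard library:

import codecs
import sys

expected = codecs.BOM_UTF16_LE if sys.byteorder == "little" else codecs.BOM_UTF16_BE
assert codecs.BOM == codecs.BOM_UTF16 == expected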

View File

@@ -4,20 +4,20 @@ from _typeshed import Unused
from collections.abc import Callable, Collection, Iterable, Iterator
from logging import Logger
from types import TracebackType
from typing import Any, Generic, Literal, NamedTuple, Protocol, TypeVar
from typing import Any, Final, Generic, NamedTuple, Protocol, TypeVar
from typing_extensions import ParamSpec, Self
if sys.version_info >= (3, 9):
from types import GenericAlias
FIRST_COMPLETED: Literal["FIRST_COMPLETED"]
FIRST_EXCEPTION: Literal["FIRST_EXCEPTION"]
ALL_COMPLETED: Literal["ALL_COMPLETED"]
PENDING: Literal["PENDING"]
RUNNING: Literal["RUNNING"]
CANCELLED: Literal["CANCELLED"]
CANCELLED_AND_NOTIFIED: Literal["CANCELLED_AND_NOTIFIED"]
FINISHED: Literal["FINISHED"]
FIRST_COMPLETED: Final = "FIRST_COMPLETED"
FIRST_EXCEPTION: Final = "FIRST_EXCEPTION"
ALL_COMPLETED: Final = "ALL_COMPLETED"
PENDING: Final = "PENDING"
RUNNING: Final = "RUNNING"
CANCELLED: Final = "CANCELLED"
CANCELLED_AND_NOTIFIED: Final = "CANCELLED_AND_NOTIFIED"
FINISHED: Final = "FINISHED"
_FUTURE_STATES: list[str]
_STATE_TO_DESCRIPTION_MAP: dict[str, str]
LOGGER: Logger

View File

@@ -2,7 +2,7 @@ import sys
from _typeshed import StrOrBytesPath, SupportsWrite
from collections.abc import Callable, ItemsView, Iterable, Iterator, Mapping, MutableMapping, Sequence
from re import Pattern
from typing import Any, ClassVar, Literal, TypeVar, overload
from typing import Any, ClassVar, Final, Literal, TypeVar, overload
from typing_extensions import TypeAlias
if sys.version_info >= (3, 13):
@@ -83,8 +83,8 @@ _ConverterCallback: TypeAlias = Callable[[str], Any]
_ConvertersMap: TypeAlias = dict[str, _ConverterCallback]
_T = TypeVar("_T")
DEFAULTSECT: Literal["DEFAULT"]
MAX_INTERPOLATION_DEPTH: Literal[10]
DEFAULTSECT: Final = "DEFAULT"
MAX_INTERPOLATION_DEPTH: Final = 10
class Interpolation:
def before_get(self, parser: _Parser, section: str, option: str, value: str, defaults: _Section) -> str: ...

View File

@@ -1,8 +1,16 @@
from typing import Any, TypeVar
import sys
from typing import Any, Protocol, TypeVar
from typing_extensions import ParamSpec, Self
__all__ = ["Error", "copy", "deepcopy"]
_T = TypeVar("_T")
_SR = TypeVar("_SR", bound=_SupportsReplace[Any])
_P = ParamSpec("_P")
class _SupportsReplace(Protocol[_P]):
# In reality doesn't support args, but there's no other great way to express this.
def __replace__(self, *args: _P.args, **kwargs: _P.kwargs) -> Self: ...
# None in CPython but non-None in Jython
PyStringMap: Any
@@ -11,6 +19,10 @@ PyStringMap: Any
def deepcopy(x: _T, memo: dict[int, Any] | None = None, _nil: Any = []) -> _T: ...
def copy(x: _T) -> _T: ...
if sys.version_info >= (3, 13):
__all__ += ["replace"]
def replace(obj: _SR, /, **changes: Any) -> _SR: ...
class Error(Exception): ...
error = Error
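The `_SupportsReplace` protocol and the new `replace` export model Python 3.13's `copy.replace`, which simply delegates to the object's `__replace__` method. A small illustration using a dataclass (dataclasses generate `__replace__` on 3.13+); the class is hypothetical, not part of the diff:

import copy
import sys
from dataclasses import dataclass

@dataclass(frozen=True)
class Point:
    x: int
    y: int

if sys.version_info >= (3, 13):
    assert copy.replace(Point(1, 2), y=5) == Point(1, 5)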

View File

@@ -1,7 +1,7 @@
import sys
from abc import abstractmethod
from time import struct_time
from typing import ClassVar, Literal, NamedTuple, NoReturn, SupportsIndex, final, overload
from typing import ClassVar, Final, NamedTuple, NoReturn, SupportsIndex, final, overload
from typing_extensions import Self, TypeAlias, deprecated
if sys.version_info >= (3, 11):
@@ -9,8 +9,8 @@ if sys.version_info >= (3, 11):
elif sys.version_info >= (3, 9):
__all__ = ("date", "datetime", "time", "timedelta", "timezone", "tzinfo", "MINYEAR", "MAXYEAR")
MINYEAR: Literal[1]
MAXYEAR: Literal[9999]
MINYEAR: Final = 1
MAXYEAR: Final = 9999
class tzinfo:
@abstractmethod

View File

@@ -1,10 +1,11 @@
from _typeshed import BytesPath, StrPath
from _typeshed import BytesPath, StrPath, Unused
from collections.abc import Callable, Iterable
from distutils.file_util import _BytesPathT, _StrPathT
from typing import Any, Literal, overload
from typing_extensions import TypeAlias
from typing import Literal, overload
from typing_extensions import TypeAlias, TypeVarTuple, Unpack
_Macro: TypeAlias = tuple[str] | tuple[str, str | None]
_Ts = TypeVarTuple("_Ts")
def gen_lib_options(
compiler: CCompiler, library_dirs: list[str], runtime_library_dirs: list[str], libraries: list[str]
@@ -161,7 +162,9 @@ class CCompiler:
def shared_object_filename(self, basename: str, strip_dir: Literal[0, False] = 0, output_dir: StrPath = "") -> str: ...
@overload
def shared_object_filename(self, basename: StrPath, strip_dir: Literal[1, True], output_dir: StrPath = "") -> str: ...
def execute(self, func: Callable[..., object], args: tuple[Any, ...], msg: str | None = None, level: int = 1) -> None: ...
def execute(
self, func: Callable[[Unpack[_Ts]], Unused], args: tuple[Unpack[_Ts]], msg: str | None = None, level: int = 1
) -> None: ...
def spawn(self, cmd: list[str]) -> None: ...
def mkpath(self, name: str, mode: int = 0o777) -> None: ...
@overload

View File

@@ -3,7 +3,11 @@ from abc import abstractmethod
from collections.abc import Callable, Iterable
from distutils.dist import Distribution
from distutils.file_util import _BytesPathT, _StrPathT
from typing import Any, ClassVar, Literal, overload
from typing import Any, ClassVar, Literal, TypeVar, overload
from typing_extensions import TypeVarTuple, Unpack
_CommandT = TypeVar("_CommandT", bound=Command)
_Ts = TypeVarTuple("_Ts")
class Command:
distribution: Distribution
@@ -19,17 +23,22 @@ class Command:
def announce(self, msg: str, level: int = 1) -> None: ...
def debug_print(self, msg: str) -> None: ...
def ensure_string(self, option: str, default: str | None = None) -> None: ...
def ensure_string_list(self, option: str | list[str]) -> None: ...
def ensure_string_list(self, option: str) -> None: ...
def ensure_filename(self, option: str) -> None: ...
def ensure_dirname(self, option: str) -> None: ...
def get_command_name(self) -> str: ...
def set_undefined_options(self, src_cmd: str, *option_pairs: tuple[str, str]) -> None: ...
def get_finalized_command(self, command: str, create: bool | Literal[0, 1] = 1) -> Command: ...
def reinitialize_command(self, command: Command | str, reinit_subcommands: bool | Literal[0, 1] = 0) -> Command: ...
@overload
def reinitialize_command(self, command: str, reinit_subcommands: bool | Literal[0, 1] = 0) -> Command: ...
@overload
def reinitialize_command(self, command: _CommandT, reinit_subcommands: bool | Literal[0, 1] = 0) -> _CommandT: ...
def run_command(self, command: str) -> None: ...
def get_sub_commands(self) -> list[str]: ...
def warn(self, msg: str) -> None: ...
def execute(self, func: Callable[..., object], args: Iterable[Any], msg: str | None = None, level: int = 1) -> None: ...
def execute(
self, func: Callable[[Unpack[_Ts]], Unused], args: tuple[Unpack[_Ts]], msg: str | None = None, level: int = 1
) -> None: ...
def mkpath(self, name: str, mode: int = 0o777) -> None: ...
@overload
def copy_file(
@@ -89,8 +98,8 @@ class Command:
self,
infiles: str | list[str] | tuple[str, ...],
outfile: StrOrBytesPath,
func: Callable[..., object],
args: list[Any],
func: Callable[[Unpack[_Ts]], Unused],
args: tuple[Unpack[_Ts]],
exec_msg: str | None = None,
skip_msg: str | None = None,
level: Unused = 1,

View File

@@ -1,4 +1,6 @@
from typing import Any
from _typeshed import Unused
from collections.abc import Callable
from typing import Any, ClassVar
from ..cmd import Command
@@ -6,13 +8,13 @@ def show_formats() -> None: ...
class bdist(Command):
description: str
user_options: Any
boolean_options: Any
help_options: Any
no_format_option: Any
default_format: Any
format_commands: Any
format_command: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
help_options: ClassVar[list[tuple[str, str | None, str, Callable[[], Unused]]]]
no_format_option: ClassVar[tuple[str, ...]]
default_format: ClassVar[dict[str, str]]
format_commands: ClassVar[list[str]]
format_command: ClassVar[dict[str, tuple[str, str]]]
bdist_base: Any
plat_name: Any
formats: Any

View File

@@ -1,12 +1,12 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
class bdist_dumb(Command):
description: str
user_options: Any
boolean_options: Any
default_format: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
default_format: ClassVar[dict[str, str]]
bdist_dir: Any
plat_name: Any
format: Any

View File

@@ -1,5 +1,5 @@
import sys
from typing import Any, Literal
from typing import Any, ClassVar, Literal
from ..cmd import Command
@@ -16,8 +16,8 @@ if sys.platform == "win32":
class bdist_msi(Command):
description: str
user_options: Any
boolean_options: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
all_versions: Any
other_version: str
if sys.version_info >= (3, 9):

View File

@@ -1,12 +1,12 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
class bdist_rpm(Command):
description: str
user_options: Any
boolean_options: Any
negative_opt: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
negative_opt: ClassVar[dict[str, str]]
bdist_base: Any
rpm_base: Any
dist_dir: Any

View File

@@ -1,10 +1,10 @@
from _typeshed import StrOrBytesPath
from distutils.cmd import Command
from typing import Any, ClassVar
from typing import ClassVar
class bdist_wininst(Command):
description: ClassVar[str]
user_options: ClassVar[list[tuple[Any, ...]]]
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
def initialize_options(self) -> None: ...

View File

@@ -1,3 +1,4 @@
from _typeshed import Unused
from collections.abc import Callable
from typing import Any, ClassVar
@@ -7,9 +8,9 @@ def show_compilers() -> None: ...
class build(Command):
description: str
user_options: Any
boolean_options: Any
help_options: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
help_options: ClassVar[list[tuple[str, str | None, str, Callable[[], Unused]]]]
build_base: str
build_purelib: Any
build_platlib: Any

View File

@@ -1,4 +1,6 @@
from typing import Any
from _typeshed import Unused
from collections.abc import Callable
from typing import Any, ClassVar
from ..cmd import Command
@@ -6,9 +8,9 @@ def show_compilers() -> None: ...
class build_clib(Command):
description: str
user_options: Any
boolean_options: Any
help_options: Any
user_options: ClassVar[list[tuple[str, str, str]]]
boolean_options: ClassVar[list[str]]
help_options: ClassVar[list[tuple[str, str | None, str, Callable[[], Unused]]]]
build_clib: Any
build_temp: Any
libraries: Any

View File

@@ -1,4 +1,6 @@
from typing import Any
from _typeshed import Unused
from collections.abc import Callable
from typing import Any, ClassVar
from ..cmd import Command
@@ -9,9 +11,9 @@ def show_compilers() -> None: ...
class build_ext(Command):
description: str
sep_by: Any
user_options: Any
boolean_options: Any
help_options: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
help_options: ClassVar[list[tuple[str, str | None, str, Callable[[], Unused]]]]
extensions: Any
build_lib: Any
plat_name: Any

View File

@@ -1,13 +1,13 @@
from typing import Any, Literal
from typing import Any, ClassVar, Literal
from ..cmd import Command
from ..util import Mixin2to3 as Mixin2to3
class build_py(Command):
description: str
user_options: Any
boolean_options: Any
negative_opt: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
negative_opt: ClassVar[dict[str, str]]
build_lib: Any
py_modules: Any
package: Any

View File

@@ -1,4 +1,4 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
from ..util import Mixin2to3 as Mixin2to3
@@ -7,8 +7,8 @@ first_line_re: Any
class build_scripts(Command):
description: str
user_options: Any
boolean_options: Any
user_options: ClassVar[list[tuple[str, str, str]]]
boolean_options: ClassVar[list[str]]
build_dir: Any
scripts: Any
force: Any

View File

@@ -1,4 +1,4 @@
from typing import Any, Literal
from typing import Any, ClassVar, Literal
from typing_extensions import TypeAlias
from ..cmd import Command
@@ -26,8 +26,8 @@ HAS_DOCUTILS: bool
class check(Command):
description: str
user_options: Any
boolean_options: Any
user_options: ClassVar[list[tuple[str, str, str]]]
boolean_options: ClassVar[list[str]]
restructuredtext: int
metadata: int
strict: int

View File

@@ -1,11 +1,11 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
class clean(Command):
description: str
user_options: Any
boolean_options: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
build_base: Any
build_lib: Any
build_temp: Any

View File

@@ -1,7 +1,7 @@
from _typeshed import StrOrBytesPath
from collections.abc import Sequence
from re import Pattern
from typing import Any, Literal
from typing import Any, ClassVar, Literal
from ..ccompiler import CCompiler
from ..cmd import Command
@@ -11,7 +11,7 @@ LANG_EXT: dict[str, str]
class config(Command):
description: str
# Tuple is full name, short name, description
user_options: Sequence[tuple[str, str | None, str]]
user_options: ClassVar[list[tuple[str, str | None, str]]]
compiler: str | CCompiler
cc: str | None
include_dirs: Sequence[str] | None

View File

@@ -9,9 +9,9 @@ INSTALL_SCHEMES: dict[str, dict[Any, Any]]
class install(Command):
description: str
user_options: Any
boolean_options: Any
negative_opt: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
negative_opt: ClassVar[dict[str, str]]
prefix: str | None
exec_prefix: Any
home: str | None

View File

@@ -1,11 +1,11 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
class install_data(Command):
description: str
user_options: Any
boolean_options: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
install_dir: Any
outfiles: Any
root: Any

View File

@@ -4,7 +4,7 @@ from ..cmd import Command
class install_egg_info(Command):
description: ClassVar[str]
user_options: ClassVar[list[tuple[str, str | None, str]]]
user_options: ClassVar[list[tuple[str, str, str]]]
install_dir: Any
def initialize_options(self) -> None: ...
target: Any

View File

@@ -1,11 +1,11 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
class install_headers(Command):
description: str
user_options: Any
boolean_options: Any
user_options: ClassVar[list[tuple[str, str, str]]]
boolean_options: ClassVar[list[str]]
install_dir: Any
force: int
outfiles: Any

View File

@@ -1,4 +1,4 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
@@ -6,9 +6,9 @@ PYTHON_SOURCE_EXTENSION: str
class install_lib(Command):
description: str
user_options: Any
boolean_options: Any
negative_opt: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
negative_opt: ClassVar[dict[str, str]]
install_dir: Any
build_dir: Any
force: int

View File

@@ -1,11 +1,11 @@
from typing import Any
from typing import Any, ClassVar
from ..cmd import Command
class install_scripts(Command):
description: str
user_options: Any
boolean_options: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
install_dir: Any
force: int
build_dir: Any

View File

@@ -1,3 +1,4 @@
from _typeshed import Unused
from collections.abc import Callable
from typing import Any, ClassVar
@@ -8,13 +9,13 @@ def show_formats() -> None: ...
class sdist(Command):
description: str
def checking_metadata(self): ...
user_options: Any
boolean_options: Any
help_options: Any
negative_opt: Any
user_options: ClassVar[list[tuple[str, str | None, str]]]
boolean_options: ClassVar[list[str]]
help_options: ClassVar[list[tuple[str, str | None, str, Callable[[], Unused]]]]
negative_opt: ClassVar[dict[str, str]]
# Any to work around variance issues
sub_commands: ClassVar[list[tuple[str, Callable[[Any], bool] | None]]]
READMES: Any
READMES: ClassVar[tuple[str, ...]]
template: Any
manifest: Any
use_defaults: int

View File

@@ -1,8 +1,8 @@
from _typeshed import Incomplete, StrOrBytesPath, StrPath, SupportsWrite
from collections.abc import Iterable, Mapping
from collections.abc import Iterable, MutableMapping
from distutils.cmd import Command
from re import Pattern
from typing import IO, Any, ClassVar, Literal, TypeVar, overload
from typing import IO, ClassVar, Literal, TypeVar, overload
from typing_extensions import TypeAlias
command_re: Pattern[str]
@@ -60,7 +60,7 @@ class DistributionMetadata:
class Distribution:
cmdclass: dict[str, type[Command]]
metadata: DistributionMetadata
def __init__(self, attrs: Mapping[str, Any] | None = None) -> None: ...
def __init__(self, attrs: MutableMapping[str, Incomplete] | None = None) -> None: ...
def get_option_dict(self, command: str) -> dict[str, tuple[str, str]]: ...
def parse_config_files(self, filenames: Iterable[str] | None = None) -> None: ...
@overload

View File

@@ -1,6 +1,9 @@
from _typeshed import StrPath, Unused
from collections.abc import Callable, Container, Iterable, Mapping
from typing import Any, Literal
from typing_extensions import TypeVarTuple, Unpack
_Ts = TypeVarTuple("_Ts")
def get_host_platform() -> str: ...
def get_platform() -> str: ...
@@ -10,8 +13,8 @@ def check_environ() -> None: ...
def subst_vars(s: str, local_vars: Mapping[str, str]) -> None: ...
def split_quoted(s: str) -> list[str]: ...
def execute(
func: Callable[..., object],
args: tuple[Any, ...],
func: Callable[[Unpack[_Ts]], Unused],
args: tuple[Unpack[_Ts]],
msg: str | None = None,
verbose: bool | Literal[0, 1] = 0,
dry_run: bool | Literal[0, 1] = 0,
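The `TypeVarTuple` signature above ties the `args` tuple to the parameter list of `func`, so passing a mismatched pair becomes a type error rather than an `Any`-typed call. A hedged sketch of the same pattern outside distutils (Python 3.11+; older versions import from `typing_extensions`):

from collections.abc import Callable
from typing import TypeVarTuple, Unpack

_Ts = TypeVarTuple("_Ts")

def execute(func: Callable[[Unpack[_Ts]], object], args: tuple[Unpack[_Ts]]) -> None:
    func(*args)

execute(lambda a, b: a + b, (1, 2))  # OK: (int, int) matches the two-parameter callable
# execute(lambda a: a, (1, 2))       # rejected by a type checker: arity mismatch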

View File

@@ -1,12 +1,12 @@
from collections.abc import Callable, Iterator
from email.message import Message
from typing import overload
from typing import Final, overload
__all__ = ["Charset", "add_alias", "add_charset", "add_codec"]
QP: int # undocumented
BASE64: int # undocumented
SHORTEST: int # undocumented
QP: Final[int] # undocumented
BASE64: Final[int] # undocumented
SHORTEST: Final[int] # undocumented
class Charset:
input_charset: str

View File

@@ -1,6 +1,6 @@
import sys
from _typeshed import FileDescriptorLike, ReadOnlyBuffer, WriteableBuffer
from typing import Any, Literal, overload
from typing import Any, Final, Literal, overload
from typing_extensions import Buffer
if sys.platform != "win32":
@@ -44,9 +44,10 @@ if sys.platform != "win32":
F_SEAL_SHRINK: int
F_SEAL_WRITE: int
if sys.version_info >= (3, 9):
F_OFD_GETLK: int
F_OFD_SETLK: int
F_OFD_SETLKW: int
F_OFD_GETLK: Final[int]
F_OFD_SETLK: Final[int]
F_OFD_SETLKW: Final[int]
if sys.version_info >= (3, 10):
F_GETPIPE_SZ: int
F_SETPIPE_SZ: int
@@ -105,6 +106,36 @@ if sys.platform != "win32":
FICLONE: int
FICLONERANGE: int
if sys.version_info >= (3, 13) and sys.platform == "linux":
F_OWNER_TID: Final = 0
F_OWNER_PID: Final = 1
F_OWNER_PGRP: Final = 2
F_SETOWN_EX: Final = 15
F_GETOWN_EX: Final = 16
F_SEAL_FUTURE_WRITE: Final = 16
F_GET_RW_HINT: Final = 1035
F_SET_RW_HINT: Final = 1036
F_GET_FILE_RW_HINT: Final = 1037
F_SET_FILE_RW_HINT: Final = 1038
RWH_WRITE_LIFE_NOT_SET: Final = 0
RWH_WRITE_LIFE_NONE: Final = 1
RWH_WRITE_LIFE_SHORT: Final = 2
RWH_WRITE_LIFE_MEDIUM: Final = 3
RWH_WRITE_LIFE_LONG: Final = 4
RWH_WRITE_LIFE_EXTREME: Final = 5
if sys.version_info >= (3, 11) and sys.platform == "darwin":
F_OFD_SETLK: Final = 90
F_OFD_SETLKW: Final = 91
F_OFD_GETLK: Final = 92
if sys.version_info >= (3, 13) and sys.platform != "linux":
# OSx and NetBSD
F_GETNOSIGPIPE: Final[int]
F_SETNOSIGPIPE: Final[int]
# OSx and FreeBSD
F_RDAHEAD: Final[int]
@overload
def fcntl(fd: FileDescriptorLike, cmd: int, arg: int = 0, /) -> int: ...
@overload

View File

@@ -1,7 +1,7 @@
import sys
from _typeshed import GenericPath, StrOrBytesPath
from collections.abc import Callable, Iterable, Sequence
from typing import Any, AnyStr, Generic, Literal
from typing import Any, AnyStr, Final, Generic, Literal
if sys.version_info >= (3, 9):
from types import GenericAlias
@@ -9,7 +9,7 @@ if sys.version_info >= (3, 9):
__all__ = ["clear_cache", "cmp", "dircmp", "cmpfiles", "DEFAULT_IGNORES"]
DEFAULT_IGNORES: list[str]
BUFSIZE: Literal[8192]
BUFSIZE: Final = 8192
def cmp(f1: StrOrBytesPath, f2: StrOrBytesPath, shallow: bool | Literal[0, 1] = True) -> bool: ...
def cmpfiles(

View File

@@ -4,16 +4,16 @@ from collections.abc import Callable, Iterable, Iterator
from socket import socket
from ssl import SSLContext
from types import TracebackType
from typing import Any, Literal, TextIO
from typing import Any, Final, Literal, TextIO
from typing_extensions import Self
__all__ = ["FTP", "error_reply", "error_temp", "error_perm", "error_proto", "all_errors", "FTP_TLS"]
MSG_OOB: Literal[1]
FTP_PORT: Literal[21]
MAXLINE: Literal[8192]
CRLF: Literal["\r\n"]
B_CRLF: Literal[b"\r\n"]
MSG_OOB: Final = 1
FTP_PORT: Final = 21
MAXLINE: Final = 8192
CRLF: Final = "\r\n"
B_CRLF: Final = b"\r\n"
class Error(Exception): ...
class error_reply(Error): ...

View File

@@ -1,13 +1,13 @@
import sys
from collections.abc import Callable
from typing import Any, Literal
from typing import Any, Final, Literal
from typing_extensions import TypeAlias
DEBUG_COLLECTABLE: Literal[2]
DEBUG_LEAK: Literal[38]
DEBUG_SAVEALL: Literal[32]
DEBUG_STATS: Literal[1]
DEBUG_UNCOLLECTABLE: Literal[4]
DEBUG_COLLECTABLE: Final = 2
DEBUG_LEAK: Final = 38
DEBUG_SAVEALL: Final = 32
DEBUG_STATS: Final = 1
DEBUG_UNCOLLECTABLE: Final = 4
_CallbackType: TypeAlias = Callable[[Literal["start", "stop"], dict[str, int]], object]
@@ -34,4 +34,4 @@ if sys.version_info >= (3, 9):
def isenabled() -> bool: ...
def set_debug(flags: int, /) -> None: ...
def set_threshold(threshold0: int, threshold1: int = ..., threshold2: int = ...) -> None: ...
def set_threshold(threshold0: int, threshold1: int = ..., threshold2: int = ..., /) -> None: ...

View File

@@ -3,7 +3,7 @@ import sys
import zlib
from _typeshed import ReadableBuffer, SizedBuffer, StrOrBytesPath
from io import FileIO
from typing import Literal, Protocol, TextIO, overload
from typing import Final, Literal, Protocol, TextIO, overload
from typing_extensions import TypeAlias
__all__ = ["BadGzipFile", "GzipFile", "open", "compress", "decompress"]
@@ -12,14 +12,14 @@ _ReadBinaryMode: TypeAlias = Literal["r", "rb"]
_WriteBinaryMode: TypeAlias = Literal["a", "ab", "w", "wb", "x", "xb"]
_OpenTextMode: TypeAlias = Literal["rt", "at", "wt", "xt"]
READ: object # undocumented
WRITE: object # undocumented
READ: Final[object] # undocumented
WRITE: Final[object] # undocumented
FTEXT: int # actually Literal[1] # undocumented
FHCRC: int # actually Literal[2] # undocumented
FEXTRA: int # actually Literal[4] # undocumented
FNAME: int # actually Literal[8] # undocumented
FCOMMENT: int # actually Literal[16] # undocumented
FTEXT: Final[int] # actually Literal[1] # undocumented
FHCRC: Final[int] # actually Literal[2] # undocumented
FEXTRA: Final[int] # actually Literal[4] # undocumented
FNAME: Final[int] # actually Literal[8] # undocumented
FCOMMENT: Final[int] # actually Literal[16] # undocumented
class _ReadableFileobj(Protocol):
def read(self, n: int, /) -> bytes: ...

View File

@@ -42,7 +42,7 @@ class CookieJar(Iterable[Cookie]):
def __len__(self) -> int: ...
class FileCookieJar(CookieJar):
filename: str
filename: str | None
delayload: bool
def __init__(self, filename: StrPath | None = None, delayload: bool = False, policy: CookiePolicy | None = None) -> None: ...
def save(self, filename: str | None = None, ignore_discard: bool = False, ignore_expires: bool = False) -> None: ...

View File

@@ -25,7 +25,7 @@ from types import (
TracebackType,
WrapperDescriptorType,
)
from typing import Any, ClassVar, Literal, NamedTuple, Protocol, TypeVar, overload
from typing import Any, ClassVar, Final, Literal, NamedTuple, Protocol, TypeVar, overload
from typing_extensions import ParamSpec, Self, TypeAlias, TypeGuard, TypeIs
if sys.version_info >= (3, 11):
@@ -161,17 +161,17 @@ class BlockFinder:
last: int
def tokeneater(self, type: int, token: str, srowcol: tuple[int, int], erowcol: tuple[int, int], line: str) -> None: ...
CO_OPTIMIZED: Literal[1]
CO_NEWLOCALS: Literal[2]
CO_VARARGS: Literal[4]
CO_VARKEYWORDS: Literal[8]
CO_NESTED: Literal[16]
CO_GENERATOR: Literal[32]
CO_NOFREE: Literal[64]
CO_COROUTINE: Literal[128]
CO_ITERABLE_COROUTINE: Literal[256]
CO_ASYNC_GENERATOR: Literal[512]
TPFLAGS_IS_ABSTRACT: Literal[1048576]
CO_OPTIMIZED: Final = 1
CO_NEWLOCALS: Final = 2
CO_VARARGS: Final = 4
CO_VARKEYWORDS: Final = 8
CO_NESTED: Final = 16
CO_GENERATOR: Final = 32
CO_NOFREE: Final = 64
CO_COROUTINE: Final = 128
CO_ITERABLE_COROUTINE: Final = 256
CO_ASYNC_GENERATOR: Final = 512
TPFLAGS_IS_ABSTRACT: Final = 1048576
modulesbyfile: dict[str, Any]
@@ -364,10 +364,10 @@ class _ParameterKind(enum.IntEnum):
def description(self) -> str: ...
if sys.version_info >= (3, 12):
AGEN_CREATED: Literal["AGEN_CREATED"]
AGEN_RUNNING: Literal["AGEN_RUNNING"]
AGEN_SUSPENDED: Literal["AGEN_SUSPENDED"]
AGEN_CLOSED: Literal["AGEN_CLOSED"]
AGEN_CREATED: Final = "AGEN_CREATED"
AGEN_RUNNING: Final = "AGEN_RUNNING"
AGEN_SUSPENDED: Final = "AGEN_SUSPENDED"
AGEN_CLOSED: Final = "AGEN_CLOSED"
def getasyncgenstate(
agen: AsyncGenerator[Any, Any]
@@ -584,19 +584,19 @@ def getattr_static(obj: object, attr: str, default: Any | None = ...) -> Any: ..
# Current State of Generators and Coroutines
#
GEN_CREATED: Literal["GEN_CREATED"]
GEN_RUNNING: Literal["GEN_RUNNING"]
GEN_SUSPENDED: Literal["GEN_SUSPENDED"]
GEN_CLOSED: Literal["GEN_CLOSED"]
GEN_CREATED: Final = "GEN_CREATED"
GEN_RUNNING: Final = "GEN_RUNNING"
GEN_SUSPENDED: Final = "GEN_SUSPENDED"
GEN_CLOSED: Final = "GEN_CLOSED"
def getgeneratorstate(
generator: Generator[Any, Any, Any]
) -> Literal["GEN_CREATED", "GEN_RUNNING", "GEN_SUSPENDED", "GEN_CLOSED"]: ...
CORO_CREATED: Literal["CORO_CREATED"]
CORO_RUNNING: Literal["CORO_RUNNING"]
CORO_SUSPENDED: Literal["CORO_SUSPENDED"]
CORO_CLOSED: Literal["CORO_CLOSED"]
CORO_CREATED: Final = "CORO_CREATED"
CORO_RUNNING: Final = "CORO_RUNNING"
CORO_SUSPENDED: Final = "CORO_SUSPENDED"
CORO_CLOSED: Final = "CORO_CLOSED"
def getcoroutinestate(
coroutine: Coroutine[Any, Any, Any]

View File

@@ -6,7 +6,7 @@ from _typeshed import FileDescriptorOrPath, ReadableBuffer, WriteableBuffer
from collections.abc import Callable, Iterable, Iterator
from os import _Opener
from types import TracebackType
from typing import IO, Any, BinaryIO, Generic, Literal, Protocol, TextIO, TypeVar, overload, type_check_only
from typing import IO, Any, BinaryIO, Final, Generic, Literal, Protocol, TextIO, TypeVar, overload, type_check_only
from typing_extensions import Self
__all__ = [
@@ -36,11 +36,11 @@ if sys.version_info >= (3, 11):
_T = TypeVar("_T")
DEFAULT_BUFFER_SIZE: Literal[8192]
DEFAULT_BUFFER_SIZE: Final = 8192
SEEK_SET: Literal[0]
SEEK_CUR: Literal[1]
SEEK_END: Literal[2]
SEEK_SET: Final = 0
SEEK_CUR: Final = 1
SEEK_END: Final = 2
open = builtins.open
@@ -168,7 +168,7 @@ class _WrappedBuffer(Protocol):
def writable(self) -> bool: ...
def truncate(self, size: int, /) -> int: ...
def fileno(self) -> int: ...
def isatty(self) -> int: ...
def isatty(self) -> bool: ...
# Optional: Only needs to be present if seekable() returns True.
# def seek(self, offset: Literal[0], whence: Literal[2]) -> int: ...
# def tell(self) -> int: ...
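
For `io`, the seek and buffer constants now carry their values directly (and the same hunk fixes `_WrappedBuffer.isatty` to return `bool`). A small usage sketch, assuming the standard `SEEK_SET`/`SEEK_CUR`/`SEEK_END` semantics of 0/1/2:

```python
import io
from typing import Literal

def rewind(f: io.BufferedReader, whence: Literal[0, 1, 2] = io.SEEK_SET) -> int:
    # io.SEEK_SET is `Final = 0` in the stub, so it satisfies the Literal[0, 1, 2] parameter
    return f.seek(0, whence)

with open(__file__, "rb") as fh:
    print(rewind(fh), io.DEFAULT_BUFFER_SIZE)  # DEFAULT_BUFFER_SIZE is Final = 8192
```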

View File

@@ -1,11 +1,11 @@
import sys
from collections.abc import Iterable, Iterator
from typing import Any, Generic, Literal, SupportsInt, TypeVar, overload
from typing import Any, Final, Generic, Literal, SupportsInt, TypeVar, overload
from typing_extensions import Self, TypeAlias
# Undocumented length constants
IPV4LENGTH: Literal[32]
IPV6LENGTH: Literal[128]
IPV4LENGTH: Final = 32
IPV6LENGTH: Final = 128
_A = TypeVar("_A", IPv4Address, IPv6Address)
_N = TypeVar("_N", IPv4Network, IPv6Network)

View File

@@ -1,8 +1,8 @@
from typing import ClassVar, Literal
from typing import ClassVar, Final, Literal
from ..fixer_base import BaseFix
NAMES: dict[str, str]
NAMES: Final[dict[str, str]]
class FixAsserts(BaseFix):
BM_compatible: ClassVar[Literal[False]]

View File

@@ -1,9 +1,9 @@
from typing import ClassVar, Literal
from typing import ClassVar, Final, Literal
from .. import fixer_base
CMP: str
TYPE: str
CMP: Final[str]
TYPE: Final[str]
class FixIdioms(fixer_base.BaseFix):
BM_compatible: ClassVar[Literal[False]]

View File

@@ -1,11 +1,11 @@
from _typeshed import StrPath
from collections.abc import Generator
from typing import ClassVar, Literal
from typing import ClassVar, Final, Literal
from .. import fixer_base
from ..pytree import Node
MAPPING: dict[str, str]
MAPPING: Final[dict[str, str]]
def alternates(members): ...
def build_pattern(mapping=...) -> Generator[str, None, None]: ...

View File

@@ -1,6 +1,8 @@
from typing import Final
from . import fix_imports
MAPPING: dict[str, str]
MAPPING: Final[dict[str, str]]
class FixImports2(fix_imports.FixImports):
mapping = MAPPING

View File

@@ -1,8 +1,8 @@
from typing import ClassVar, Literal
from typing import ClassVar, Final, Literal
from .. import fixer_base
MAP: dict[str, str]
MAP: Final[dict[str, str]]
class FixMethodattrs(fixer_base.BaseFix):
BM_compatible: ClassVar[Literal[True]]

View File

@@ -1,10 +1,10 @@
from collections.abc import Generator
from typing import ClassVar, Literal
from typing import ClassVar, Final, Literal
from .. import fixer_base
MAPPING: dict[str, dict[str, str]]
LOOKUP: dict[tuple[str, str], str]
MAPPING: Final[dict[str, dict[str, str]]]
LOOKUP: Final[dict[tuple[str, str], str]]
def alternates(members): ...
def build_pattern() -> Generator[str, None, None]: ...

View File

@@ -1,9 +1,9 @@
from collections.abc import Generator
from typing import Literal
from typing import Final, Literal
from .fix_imports import FixImports
MAPPING: dict[str, list[tuple[Literal["urllib.request", "urllib.parse", "urllib.error"], list[str]]]]
MAPPING: Final[dict[str, list[tuple[Literal["urllib.request", "urllib.parse", "urllib.error"], list[str]]]]]
def build_pattern() -> Generator[str, None, None]: ...
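
Where a constant's value is a container (or otherwise not worth spelling out in a stub), the explicit `Final[T]` annotation is used instead of a bare `Final = ...` assignment. A sketch of the distinction, with illustrative names:

```python
from typing import Final

MAX_PARTS: Final = 3                              # value known: inferred as Literal[3]
NAME_MAP: Final[dict[str, str]] = {"old": "new"}  # container: only the type is pinned

# NAME_MAP = {}            # rebinding a Final name would be flagged by a checker
NAME_MAP["extra"] = "x"    # allowed: Final prevents rebinding, not mutation of the value
```

That is why `MAPPING`, `LOOKUP`, and the other lib2to3 fixer tables become `Final[dict[...]]` rather than literal assignments.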

View File

@@ -1,65 +1,67 @@
ENDMARKER: int
NAME: int
NUMBER: int
STRING: int
NEWLINE: int
INDENT: int
DEDENT: int
LPAR: int
RPAR: int
LSQB: int
RSQB: int
COLON: int
COMMA: int
SEMI: int
PLUS: int
MINUS: int
STAR: int
SLASH: int
VBAR: int
AMPER: int
LESS: int
GREATER: int
EQUAL: int
DOT: int
PERCENT: int
BACKQUOTE: int
LBRACE: int
RBRACE: int
EQEQUAL: int
NOTEQUAL: int
LESSEQUAL: int
GREATEREQUAL: int
TILDE: int
CIRCUMFLEX: int
LEFTSHIFT: int
RIGHTSHIFT: int
DOUBLESTAR: int
PLUSEQUAL: int
MINEQUAL: int
STAREQUAL: int
SLASHEQUAL: int
PERCENTEQUAL: int
AMPEREQUAL: int
VBAREQUAL: int
CIRCUMFLEXEQUAL: int
LEFTSHIFTEQUAL: int
RIGHTSHIFTEQUAL: int
DOUBLESTAREQUAL: int
DOUBLESLASH: int
DOUBLESLASHEQUAL: int
OP: int
COMMENT: int
NL: int
RARROW: int
AT: int
ATEQUAL: int
AWAIT: int
ASYNC: int
ERRORTOKEN: int
COLONEQUAL: int
N_TOKENS: int
NT_OFFSET: int
from typing import Final
ENDMARKER: Final[int]
NAME: Final[int]
NUMBER: Final[int]
STRING: Final[int]
NEWLINE: Final[int]
INDENT: Final[int]
DEDENT: Final[int]
LPAR: Final[int]
RPAR: Final[int]
LSQB: Final[int]
RSQB: Final[int]
COLON: Final[int]
COMMA: Final[int]
SEMI: Final[int]
PLUS: Final[int]
MINUS: Final[int]
STAR: Final[int]
SLASH: Final[int]
VBAR: Final[int]
AMPER: Final[int]
LESS: Final[int]
GREATER: Final[int]
EQUAL: Final[int]
DOT: Final[int]
PERCENT: Final[int]
BACKQUOTE: Final[int]
LBRACE: Final[int]
RBRACE: Final[int]
EQEQUAL: Final[int]
NOTEQUAL: Final[int]
LESSEQUAL: Final[int]
GREATEREQUAL: Final[int]
TILDE: Final[int]
CIRCUMFLEX: Final[int]
LEFTSHIFT: Final[int]
RIGHTSHIFT: Final[int]
DOUBLESTAR: Final[int]
PLUSEQUAL: Final[int]
MINEQUAL: Final[int]
STAREQUAL: Final[int]
SLASHEQUAL: Final[int]
PERCENTEQUAL: Final[int]
AMPEREQUAL: Final[int]
VBAREQUAL: Final[int]
CIRCUMFLEXEQUAL: Final[int]
LEFTSHIFTEQUAL: Final[int]
RIGHTSHIFTEQUAL: Final[int]
DOUBLESTAREQUAL: Final[int]
DOUBLESLASH: Final[int]
DOUBLESLASHEQUAL: Final[int]
OP: Final[int]
COMMENT: Final[int]
NL: Final[int]
RARROW: Final[int]
AT: Final[int]
ATEQUAL: Final[int]
AWAIT: Final[int]
ASYNC: Final[int]
ERRORTOKEN: Final[int]
COLONEQUAL: Final[int]
N_TOKENS: Final[int]
NT_OFFSET: Final[int]
tok_name: dict[int, str]
def ISTERMINAL(x: int) -> bool: ...
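
The `pgen2.token` stub gets the same treatment for its token codes, but because the numeric values are generated rather than stable API, they stay as `Final[int]` with the value left open. The stdlib `token` module is used below purely for illustration of that annotated form:

```python
from typing import Final
import token

NAME_CODE: Final[int] = token.NAME  # annotated form: the type is int, the exact value is not pinned

def is_name(tok_type: int) -> bool:
    return tok_type == NAME_CODE

print(is_name(token.NAME), is_name(token.NUMBER))  # True False
```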

View File

@@ -7,7 +7,7 @@ from re import Pattern
from string import Template
from time import struct_time
from types import FrameType, TracebackType
from typing import Any, ClassVar, Generic, Literal, Protocol, TextIO, TypeVar, overload
from typing import Any, ClassVar, Final, Generic, Literal, Protocol, TextIO, TypeVar, overload
from typing_extensions import Self, TypeAlias, deprecated
if sys.version_info >= (3, 11):
@@ -236,14 +236,14 @@ class Logger(Filterer):
def hasHandlers(self) -> bool: ...
def callHandlers(self, record: LogRecord) -> None: ... # undocumented
CRITICAL: int
FATAL: int
ERROR: int
WARNING: int
WARN: int
INFO: int
DEBUG: int
NOTSET: int
CRITICAL: Final = 50
FATAL: Final = CRITICAL
ERROR: Final = 40
WARNING: Final = 30
WARN: Final = WARNING
INFO: Final = 20
DEBUG: Final = 10
NOTSET: Final = 0
class Handler(Filterer):
level: int # undocumented
@@ -684,6 +684,6 @@ class StrFormatStyle(PercentStyle): # undocumented
class StringTemplateStyle(PercentStyle): # undocumented
_tpl: Template
_STYLES: dict[str, tuple[PercentStyle, str]]
_STYLES: Final[dict[str, tuple[PercentStyle, str]]]
BASIC_FORMAT: str
BASIC_FORMAT: Final[str]
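
For `logging`, the level constants now carry their well-known numeric values, and the legacy aliases are expressed as `Final` assignments to the canonical names (`FATAL = CRITICAL`, `WARN = WARNING`). These match the runtime module:

```python
import logging

assert logging.CRITICAL == 50 and logging.FATAL == logging.CRITICAL
assert logging.WARNING == 30 and logging.WARN == logging.WARNING
assert logging.INFO == 20 and logging.DEBUG == 10 and logging.NOTSET == 0

logging.basicConfig(level=logging.DEBUG)
logging.getLogger(__name__).debug("level constants are unchanged at runtime")
```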

View File

@@ -4,14 +4,14 @@ from collections.abc import Callable, Hashable, Iterable, Sequence
from configparser import RawConfigParser
from re import Pattern
from threading import Thread
from typing import IO, Any, Literal, SupportsIndex, TypedDict, overload
from typing import IO, Any, Final, Literal, SupportsIndex, TypedDict, overload
from typing_extensions import Required, TypeAlias
from . import Filter, Filterer, Formatter, Handler, Logger, _FilterType, _FormatStyle, _Level
DEFAULT_LOGGING_CONFIG_PORT: int
RESET_ERROR: int # undocumented
IDENTIFIER: Pattern[str] # undocumented
RESET_ERROR: Final[int] # undocumented
IDENTIFIER: Final[Pattern[str]] # undocumented
if sys.version_info >= (3, 11):
class _RootLoggerConfiguration(TypedDict, total=False):

View File

@@ -8,16 +8,16 @@ from logging import FileHandler, Handler, LogRecord
from re import Pattern
from socket import SocketKind, socket
from threading import Thread
from typing import Any, ClassVar, Protocol, TypeVar
from typing import Any, ClassVar, Final, Protocol, TypeVar
_T = TypeVar("_T")
DEFAULT_TCP_LOGGING_PORT: int
DEFAULT_UDP_LOGGING_PORT: int
DEFAULT_HTTP_LOGGING_PORT: int
DEFAULT_SOAP_LOGGING_PORT: int
SYSLOG_UDP_PORT: int
SYSLOG_TCP_PORT: int
DEFAULT_TCP_LOGGING_PORT: Final[int]
DEFAULT_UDP_LOGGING_PORT: Final[int]
DEFAULT_HTTP_LOGGING_PORT: Final[int]
DEFAULT_SOAP_LOGGING_PORT: Final[int]
SYSLOG_UDP_PORT: Final[int]
SYSLOG_TCP_PORT: Final[int]
class WatchedFileHandler(FileHandler):
dev: int # undocumented

View File

@@ -1,7 +1,7 @@
from _compression import BaseStream
from _typeshed import ReadableBuffer, StrOrBytesPath
from collections.abc import Mapping, Sequence
from typing import IO, Any, Literal, TextIO, final, overload
from typing import IO, Any, Final, Literal, TextIO, final, overload
from typing_extensions import Self, TypeAlias
__all__ = [
@@ -50,33 +50,33 @@ _PathOrFile: TypeAlias = StrOrBytesPath | IO[bytes]
_FilterChain: TypeAlias = Sequence[Mapping[str, Any]]
FORMAT_AUTO: Literal[0]
FORMAT_XZ: Literal[1]
FORMAT_ALONE: Literal[2]
FORMAT_RAW: Literal[3]
CHECK_NONE: Literal[0]
CHECK_CRC32: Literal[1]
CHECK_CRC64: Literal[4]
CHECK_SHA256: Literal[10]
CHECK_ID_MAX: Literal[15]
CHECK_UNKNOWN: Literal[16]
FORMAT_AUTO: Final = 0
FORMAT_XZ: Final = 1
FORMAT_ALONE: Final = 2
FORMAT_RAW: Final = 3
CHECK_NONE: Final = 0
CHECK_CRC32: Final = 1
CHECK_CRC64: Final = 4
CHECK_SHA256: Final = 10
CHECK_ID_MAX: Final = 15
CHECK_UNKNOWN: Final = 16
FILTER_LZMA1: int # v big number
FILTER_LZMA2: Literal[33]
FILTER_DELTA: Literal[3]
FILTER_X86: Literal[4]
FILTER_IA64: Literal[6]
FILTER_ARM: Literal[7]
FILTER_ARMTHUMB: Literal[8]
FILTER_SPARC: Literal[9]
FILTER_POWERPC: Literal[5]
MF_HC3: Literal[3]
MF_HC4: Literal[4]
MF_BT2: Literal[18]
MF_BT3: Literal[19]
MF_BT4: Literal[20]
MODE_FAST: Literal[1]
MODE_NORMAL: Literal[2]
PRESET_DEFAULT: Literal[6]
FILTER_LZMA2: Final = 33
FILTER_DELTA: Final = 3
FILTER_X86: Final = 4
FILTER_IA64: Final = 6
FILTER_ARM: Final = 7
FILTER_ARMTHUMB: Final = 8
FILTER_SPARC: Final = 9
FILTER_POWERPC: Final = 5
MF_HC3: Final = 3
MF_HC4: Final = 4
MF_BT2: Final = 18
MF_BT3: Final = 19
MF_BT4: Final = 20
MODE_FAST: Final = 1
MODE_NORMAL: Final = 2
PRESET_DEFAULT: Final = 6
PRESET_EXTREME: int # v big number
# from _lzma.c

View File

@@ -118,4 +118,16 @@ if sys.version_info >= (3, 13) and sys.platform != "win32":
MAP_32BIT: Final = 32768
if sys.version_info >= (3, 13) and sys.platform == "darwin":
MAP_NORESERVE: Final = 64
MAP_NOEXTEND: Final = 256
MAP_HASSEMAPHORE: Final = 512
MAP_NOCACHE: Final = 1024
MAP_JIT: Final = 2048
MAP_RESILIENT_CODESIGN: Final = 8192
MAP_RESILIENT_MEDIA: Final = 16384
MAP_TRANSLATED_ALLOW_EXECUTE: Final = 131072
MAP_UNIX03: Final = 262144
MAP_TPRO: Final = 524288
if sys.version_info >= (3, 13) and sys.platform == "linux":
MAP_NORESERVE: Final = 16384
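
The new darwin-only `MAP_*` flags are gated on both platform and Python version in the stub, so portable code should guard on the same conditions before touching them (a sketch; it falls back to no extra flag elsewhere):

```python
import mmap
import sys

extra = 0
if sys.platform == "darwin" and sys.version_info >= (3, 13):
    extra = mmap.MAP_JIT  # Final = 2048 per the stub
print(f"extra mmap flag: {extra:#x}")
```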

View File

@@ -1,15 +1,15 @@
import sys
from collections.abc import Container, Iterable, Iterator, Sequence
from types import CodeType
from typing import IO, Any
from typing import IO, Any, Final
if sys.version_info < (3, 11):
LOAD_CONST: int # undocumented
IMPORT_NAME: int # undocumented
STORE_NAME: int # undocumented
STORE_GLOBAL: int # undocumented
STORE_OPS: tuple[int, int] # undocumented
EXTENDED_ARG: int # undocumented
LOAD_CONST: Final[int] # undocumented
IMPORT_NAME: Final[int] # undocumented
STORE_NAME: Final[int] # undocumented
STORE_GLOBAL: Final[int] # undocumented
STORE_OPS: Final[tuple[int, int]] # undocumented
EXTENDED_ARG: Final[int] # undocumented
packagePathMap: dict[str, list[str]] # undocumented

View File

@@ -1,14 +1,14 @@
import sys
from typing import Final, Literal
from typing import Final
# This module is only available on Windows
if sys.platform == "win32":
CRT_ASSEMBLY_VERSION: Final[str]
LK_UNLCK: Literal[0]
LK_LOCK: Literal[1]
LK_NBLCK: Literal[2]
LK_RLCK: Literal[3]
LK_NBRLCK: Literal[4]
LK_UNLCK: Final = 0
LK_LOCK: Final = 1
LK_NBLCK: Final = 2
LK_RLCK: Final = 3
LK_NBRLCK: Final = 4
SEM_FAILCRITICALERRORS: int
SEM_NOALIGNMENTFAULTEXCEPT: int
SEM_NOGPFAULTERRORBOX: int

View File

@@ -1,12 +1,12 @@
from _typeshed import FileDescriptorLike, Unused
from collections.abc import Sequence
from struct import Struct
from typing import Any
from typing import Any, Final
__all__ = ["ensure_running", "get_inherited_fds", "connect_to_new_process", "set_forkserver_preload"]
MAXFDS_TO_SEND: int
SIGNED_STRUCT: Struct
MAXFDS_TO_SEND: Final = 256
SIGNED_STRUCT: Final[Struct]
class ForkServer:
def set_forkserver_preload(self, modules_names: list[str]) -> None: ...

View File

@@ -1,7 +1,7 @@
import sys
from collections.abc import Callable, Iterable, Iterator, Mapping
from types import TracebackType
from typing import Any, Generic, Literal, TypeVar
from typing import Any, Final, Generic, TypeVar
from typing_extensions import Self
if sys.version_info >= (3, 9):
@@ -97,7 +97,7 @@ class ThreadPool(Pool):
) -> None: ...
# undocumented
INIT: Literal["INIT"]
RUN: Literal["RUN"]
CLOSE: Literal["CLOSE"]
TERMINATE: Literal["TERMINATE"]
INIT: Final = "INIT"
RUN: Final = "RUN"
CLOSE: Final = "CLOSE"
TERMINATE: Final = "TERMINATE"

Some files were not shown because too many files have changed in this diff.