Compare commits

..

19 Commits

Author SHA1 Message Date
Brent Westbrook
b0bdf0334e Bump 0.13.2 (#20576) 2025-09-25 10:37:46 -04:00
Brent Westbrook
7331d393c5 Update rooster to 0.1.0 (#20575)
Summary
--

This reduces the page size of GraphQL queries
(https://github.com/zanieb/rooster/pull/85), hopefully helping with some
of the 502s we've been hitting.

Test Plan
--

I ran the release script, and it succeeded after failing several times
on the old rooster version.
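The mechanism can be sketched as a generic cursor-pagination loop (illustrative Python, not rooster's actual code; `fake_fetch` is a hypothetical stand-in for the GraphQL request): a smaller page size trades more round trips for smaller responses, which is what helps avoid gateway timeouts.

```python
def paginate(fetch, page_size):
    """Collect all items from a cursor-paginated API."""
    items, cursor = [], None
    while True:
        page = fetch(cursor=cursor, first=page_size)
        items.extend(page["nodes"])
        if not page["pageInfo"]["hasNextPage"]:
            return items
        cursor = page["pageInfo"]["endCursor"]

def fake_fetch(cursor, first):
    """Stand-in for a GraphQL request; serves 10 items in pages of `first`."""
    data = list(range(10))
    start = int(cursor) if cursor else 0
    nodes = data[start:start + first]
    end = start + len(nodes)
    return {
        "nodes": nodes,
        "pageInfo": {"hasNextPage": end < len(data), "endCursor": str(end)},
    }

# Smaller pages, same total result, just more (cheaper) requests.
assert paginate(fake_fetch, 3) == paginate(fake_fetch, 10) == list(range(10))
```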
2025-09-25 10:11:21 -04:00
David Peter
529e5fa6c2 [ty] Ecosystem analyzer: timing report (#20571)
## Summary

Generate a timing diff across the whole ecosystem and deploy it to
CloudFlare pages. The timing information is collected already, we just
need to create and upload the HTML report.

The timing results are just based on a single run. No statistical
analysis across multiple runs or similar is performed. This means that
results can be noisy, as can be seen on this PR, where we see slowdowns
up to 1.26× and speedups down to 0.89×, even though the change should be
neutral. Across all projects, these random events cancel out and we see
an average factor of 1.01×. So I think this feature can still be
interesting, given that it comes "for free". We just need to keep in
mind that it will be noisy, and shouldn't read too much into these
results.
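The cancellation effect can be simulated (synthetic data, not actual ecosystem timings): if per-project ratios from a neutral change are pure multiplicative noise, individual projects swing noticeably while the aggregate stays close to 1.0×.

```python
import math
import random

random.seed(0)
# Simulated per-project timing ratios for a neutral change: pure noise.
ratios = [math.exp(random.gauss(0, 0.08)) for _ in range(100)]

# Individual projects can show apparent slowdowns or speedups...
assert max(ratios) > 1.05 and min(ratios) < 0.95
# ...but the geometric mean across all projects stays near neutral.
geo_mean = math.exp(sum(map(math.log, ratios)) / len(ratios))
assert 0.95 < geo_mean < 1.05
```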

## Test Plan

CI run on this PR (see the new *timing results* link).
2025-09-25 14:14:20 +02:00
David Peter
efbb80f747 [ty] Remove hack in protocol satisfiability check (#20568)
## Summary

This removes a hack in the protocol satisfiability check that was
previously needed to work around missing assignability-modeling of
inferable type variables. Assignability of type variables is not
implemented fully, but some recent changes allow us to remove that hack
with limited impact on the ecosystem (and the test suite). The change in
the typing conformance test is favorable.

## Test Plan

* Adapted Markdown tests
* Made sure that this change works in combination with
https://github.com/astral-sh/ruff/pull/20517
2025-09-25 13:35:47 +02:00
Micha Reiser
9f3cffc65c Add 'Finding ways to help' to CONTRIBUTING.md (#20567)
Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
2025-09-25 12:46:41 +02:00
Alex Waygood
21be94ac33 [ty] Explicitly test assignability/subtyping between unions of nominal types and protocols with method members (#20557) 2025-09-25 09:21:29 +00:00
Alex Waygood
b7d5dc98c1 [ty] Add tests for interactions of @classmethod, @staticmethod, and protocol method members (#20555) 2025-09-25 10:14:53 +01:00
Dhruv Manilawala
e1bb74b25a [ty] Match variadic argument to variadic parameter (#20511)
## Summary

Closes: https://github.com/astral-sh/ty/issues/1236

This PR fixes a bug where the variadic argument wouldn't match against
the variadic parameter in certain scenarios.

This was happening because I didn't realize that the `all_elements`
iterator wouldn't keep returning the variable element (which is correct
behavior; I just wasn't aware of it at the time).

I don't think we can use the `resize` method here because we don't know
how many parameters this variadic argument is matching against as this
is where the actual parameter matching occurs.
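The runtime shape being modeled can be illustrated in plain Python (not ty's matching code): when a variadic argument is splatted into a call, a fixed prefix maps to the positional parameters and the variable-length remainder must all match the variadic parameter, so the number of parameters it covers isn't known up front.

```python
def f(x: int, *rest: str) -> int:
    # `x` consumes the first element; `rest` absorbs a tail of unknown length.
    return x + len(rest)

short = (1, "a")
longer = (1, "a", "b", "c")
# The same variadic argument shape can cover different numbers of parameters.
assert f(*short) == 2
assert f(*longer) == 4
```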

## Test Plan

Expand test cases to consider a few more combinations of arguments and
parameters which are variadic.
2025-09-25 07:51:56 +00:00
Aria Desires
edeb45804e [ty] fallback to resolve_real_module in file_to_module (#20461)
This is a naive(?) implementation of the approach @MichaReiser
originally suggested to me in https://github.com/astral-sh/ty/issues/869

Fixes https://github.com/astral-sh/ty/issues/869
Fixes https://github.com/astral-sh/ty/issues/1195
2025-09-24 21:15:35 -04:00
Ibraheem Ahmed
bea92c8229 [ty] More precise type inference for dictionary literals (#20523)
## Summary

Extends https://github.com/astral-sh/ruff/pull/20360 to dictionary
literals. This also improves our `TypeDict` support by passing through
nested type context.
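A sketch of the kind of code this helps (hypothetical TypedDicts): with nested type context passed through, the inner dict literal can be checked against `Point` instead of being inferred as a plain `dict`.

```python
from typing import TypedDict

class Point(TypedDict):
    x: int
    y: int

class Shape(TypedDict):
    name: str
    origin: Point

# The nested literal {"x": 0, "y": 0} receives `Point` as its type context,
# so its keys and value types can be checked precisely.
shape: Shape = {"name": "unit", "origin": {"x": 0, "y": 0}}
assert shape["origin"]["x"] == 0
```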
2025-09-24 18:12:00 -04:00
Ed Cuss
f2cc2f604f [flake8-pyi] Avoid syntax error from conflict with PIE790 (PYI021) (#20010)

## Summary

First contribution, so please let me know if I've made a mistake
anywhere. This aims to fix #19982: it adds an isolation level to
PYI021, in the same style as the PIE790 rule.

fixes: #19982
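A minimal illustration of the conflict (hypothetical stub; rule interplay paraphrased from the issue): PYI021 wants the docstring gone and PIE790 wants the `...` gone, and applying both fixes in one pass would leave the function with no body at all, which is invalid syntax. Isolating the fixes avoids that.

```python
import ast

# A stub body like this can trigger both rules at once:
stub = (
    "def f() -> int:\n"
    '    """Docstring in a stub."""  # PYI021: docstrings are discouraged in stubs\n'
    "    ...  # PIE790: `...` is unnecessary alongside a docstring\n"
)
ast.parse(stub)  # valid as written

# If both fixes were applied together, the body would be emptied:
broken = "def f() -> int:\n"
try:
    ast.parse(broken)
    raised = False
except SyntaxError:
    raised = True
assert raised
```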

## Test Plan

I added a case to the PYI021.pyi file where the two rules are both
present, since there wasn't an existing case with them interacting,
using the minimal reproducible example that @ntBre created on the issue
(I may have gotten the `# ERROR` markings wrong, so please let me know
how to fix them if I did).

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-09-24 21:26:59 +00:00
Dan Parizher
c361e2f759 [flake8-bandit] Clarify the supported hashing functions (S324) (#20534)
## Summary

Fixes #16572

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-09-24 20:10:23 +00:00
Alex Waygood
0e83af0b80 Bump mypy_primer pin (#20558) 2025-09-24 19:45:47 +00:00
Bhuminjay Soni
e6073d0cca [syntax-errors]: multiple-starred-expressions (F622) (#20243)

## Summary

This PR implements
https://docs.astral.sh/ruff/rules/multiple-starred-expressions/ as a
semantic syntax error.
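CPython itself rejects this construct at compile time, which is the behavior the semantic syntax error mirrors:

```python
# Two starred targets in one assignment are ambiguous and rejected:
try:
    compile("*a, *b = [1, 2, 3]", "<example>", "exec")
    raised = False
except SyntaxError:
    raised = True
assert raised

# A single starred target is fine:
namespace = {}
exec("*a, b = [1, 2, 3]", namespace)
assert namespace["a"] == [1, 2] and namespace["b"] == 3
```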

## Test Plan

I have added inline tests as directed in #17412.

---------

Signed-off-by: 11happy <soni5happy@gmail.com>
Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
2025-09-24 19:32:55 +00:00
Brent Westbrook
73b4b1ed17 [ty] Make FileResolver::path return a full path (#20550)
## Summary

Fixes https://github.com/astral-sh/ty/issues/1242

From finding references with the LSP, `FileResolver::path` is only
called once, in `UnifiedFile::path`, so I went through those references,
and it looked safe to make this change in every case. Most of the
references are in the various output formats, where we inherited the
absolute vs relative path decision from Ruff. Two other uses are as
fallbacks if converting a relativized path to a string fails. Finally,
we use the path for sorting and in `UnifiedFile::relative_path`.

## Test Plan

Existing tests, with snapshots updated to show absolute paths (in the
`TestDb` this just added a `/` in front of the file names). I also
updated the GitLab CLI test to set the `CI_PROJECT_DIR` environment
variable and ran a test in GitLab CI:

<img width="613" height="114" alt="image"
src="https://github.com/user-attachments/assets/8ab81dba-54fd-4a24-9110-77ef89293cff"
/>
2025-09-24 13:16:51 -04:00
Amethyst Reese
83f80effec include .pyw files by default when linting and formatting (#20458)
- Adds test cases exercising file selection by extension with
`--preview` enabled and disabled.
- Adds `INCLUDE_PREVIEW` with file patterns including `*.pyw`.
- In global preview mode, default configuration selects patterns from
`INCLUDE_PREVIEW`.
- Manually tested ruff server with local vscode for both formatting and
linting of a `.pyw` file.

Closes https://github.com/astral-sh/ruff/issues/13246
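The effect can be sketched with stdlib glob matching (the pattern list mirrors the default include set plus `*.pyw` shown in the settings snapshot; this is illustrative, not Ruff's resolver):

```python
from fnmatch import fnmatch

# Assumed preview include patterns: the defaults plus `*.pyw`.
INCLUDE_PREVIEW = ["*.py", "*.pyi", "*.pyw", "*.ipynb", "**/pyproject.toml"]

def is_included(name):
    return any(fnmatch(name, pattern) for pattern in INCLUDE_PREVIEW)

assert is_included("gui_app.pyw")     # now linted/formatted in preview
assert is_included("module.py")
assert not is_included("module.pyx")  # Cython sources remain excluded
```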
2025-09-24 08:39:30 -07:00
David Peter
fcc76bb7b2 [ty] Todo-types for os.fdopen, NamedTemporaryFile, and Path.open (#20549)
## Summary

This applies the trick that we use for `builtins.open` to similar
functions that have the same problem. The reason is that the problem
would otherwise become even more pronounced once we add understanding of
the implicit type of `self` parameters, because then something like
`(base_path / "test.bin").open("rb")` also leads to a wrong return type
and can result in false positives.
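A runtime sketch of why the mode string matters (illustrative; the actual change substitutes a Todo-type rather than modeling this): the object returned by `Path.open` depends on the mode, so a checker that always assumed text mode would flag valid binary reads.

```python
import io
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as d:
    p = Path(d) / "test.bin"
    p.write_bytes(b"\x00\x01")

    # Binary mode returns a bytes stream, not a text wrapper.
    with p.open("rb") as f:
        binary_stream = type(f)
        data = f.read()

    # Text mode (the default) returns a str stream.
    with (Path(d) / "test.txt").open("w") as f:
        text_stream = type(f)
        f.write("hi")

assert data == b"\x00\x01"
assert issubclass(binary_stream, io.BufferedIOBase)
assert issubclass(text_stream, io.TextIOBase)
```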

## Test Plan

New Markdown tests
2025-09-24 15:43:58 +02:00
David Peter
eea87e24e3 [ty] Update ecosystem-analyzer version for weekly report (#20546)
## Summary

This needs to be updated so we don't fail on "missing" projects
(https://github.com/astral-sh/ruff/actions/runs/17966905581)

## Test Plan

Workflow run: https://github.com/astral-sh/ruff/actions/runs/17973053536
2025-09-24 12:02:58 +02:00
Dan Parizher
3e1e02e9b6 Fix non‑BMP code point handling in quick‑fixes and markers (#20526)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-09-24 10:08:00 +02:00
80 changed files with 2531 additions and 510 deletions

View File

@@ -95,6 +95,14 @@ jobs:
--new-name "$REF_NAME" \
--output diff-statistics.md
ecosystem-analyzer \
generate-timing-diff \
diagnostics-old.json \
diagnostics-new.json \
--old-name "main (merge base)" \
--new-name "$REF_NAME" \
--output-html dist/timing.html
echo '## `ecosystem-analyzer` results' > comment.md
echo >> comment.md
cat diff-statistics.md >> comment.md
@@ -118,7 +126,7 @@ jobs:
DEPLOYMENT_URL: ${{ steps.deploy.outputs.pages-deployment-alias-url }}
run: |
echo >> comment.md
echo "**[Full report with detailed diff]($DEPLOYMENT_URL/diff)**" >> comment.md
echo "**[Full report with detailed diff]($DEPLOYMENT_URL/diff)** ([timing results]($DEPLOYMENT_URL/timing))" >> comment.md
- name: Upload comment
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
@@ -137,3 +145,9 @@ jobs:
with:
name: diff.html
path: dist/diff.html
- name: Upload timing diff
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: timing.html
path: dist/timing.html

View File

@@ -1,5 +1,53 @@
# Changelog
## 0.13.2
Released on 2025-09-25.
### Preview features
- \[`flake8-async`\] Implement `blocking-path-method` (`ASYNC240`) ([#20264](https://github.com/astral-sh/ruff/pull/20264))
- \[`flake8-bugbear`\] Implement `map-without-explicit-strict` (`B912`) ([#20429](https://github.com/astral-sh/ruff/pull/20429))
- \[`flake8-builtins`\] Detect class-scope builtin shadowing in decorators, default args, and attribute initializers (`A003`) ([#20178](https://github.com/astral-sh/ruff/pull/20178))
- \[`ruff`\] Implement `logging-eager-conversion` (`RUF065`) ([#19942](https://github.com/astral-sh/ruff/pull/19942))
- Include `.pyw` files by default when linting and formatting ([#20458](https://github.com/astral-sh/ruff/pull/20458))
### Bug fixes
- Deduplicate input paths ([#20105](https://github.com/astral-sh/ruff/pull/20105))
- \[`flake8-comprehensions`\] Preserve trailing commas for single-element lists (`C409`) ([#19571](https://github.com/astral-sh/ruff/pull/19571))
- \[`flake8-pyi`\] Avoid syntax error from conflict with `PIE790` (`PYI021`) ([#20010](https://github.com/astral-sh/ruff/pull/20010))
- \[`flake8-simplify`\] Correct fix for positive `maxsplit` without separator (`SIM905`) ([#20056](https://github.com/astral-sh/ruff/pull/20056))
- \[`pyupgrade`\] Fix `UP008` not to apply when `__class__` is a local variable ([#20497](https://github.com/astral-sh/ruff/pull/20497))
- \[`ruff`\] Fix `B004` to skip invalid `hasattr`/`getattr` calls ([#20486](https://github.com/astral-sh/ruff/pull/20486))
- \[`ruff`\] Replace `-nan` with `nan` when using the value to construct a `Decimal` (`FURB164` ) ([#20391](https://github.com/astral-sh/ruff/pull/20391))
### Documentation
- Add 'Finding ways to help' to CONTRIBUTING.md ([#20567](https://github.com/astral-sh/ruff/pull/20567))
- Update import path to `ruff-wasm-web` ([#20539](https://github.com/astral-sh/ruff/pull/20539))
- \[`flake8-bandit`\] Clarify the supported hashing functions (`S324`) ([#20534](https://github.com/astral-sh/ruff/pull/20534))
### Other changes
- \[`playground`\] Allow hover quick fixes to appear for overlapping diagnostics ([#20527](https://github.com/astral-sh/ruff/pull/20527))
- \[`playground`\] Fix nonBMP code point handling in quick fixes and markers ([#20526](https://github.com/astral-sh/ruff/pull/20526))
### Contributors
- [@BurntSushi](https://github.com/BurntSushi)
- [@mtshiba](https://github.com/mtshiba)
- [@second-ed](https://github.com/second-ed)
- [@danparizher](https://github.com/danparizher)
- [@ShikChen](https://github.com/ShikChen)
- [@PieterCK](https://github.com/PieterCK)
- [@GDYendell](https://github.com/GDYendell)
- [@RazerM](https://github.com/RazerM)
- [@TaKO8Ki](https://github.com/TaKO8Ki)
- [@amyreese](https://github.com/amyreese)
- [@ntBre](https://github.com/ntBre)
- [@MichaReiser](https://github.com/MichaReiser)
## 0.13.1
Released on 2025-09-18.

View File

@@ -7,21 +7,35 @@ Welcome! We're happy to have you here. Thank you in advance for your contributio
> This guide is for Ruff. If you're looking to contribute to ty, please see [the ty contributing
> guide](https://github.com/astral-sh/ruff/blob/main/crates/ty/CONTRIBUTING.md).
## The Basics
## Finding ways to help
Ruff welcomes contributions in the form of pull requests.
We label issues that would be good for a first time contributor as
[`good first issue`](https://github.com/astral-sh/ruff/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22).
These usually do not require significant experience with Rust or the Ruff code base.
For small changes (e.g., bug fixes), feel free to submit a PR.
We label issues that we think are a good opportunity for subsequent contributions as
[`help wanted`](https://github.com/astral-sh/ruff/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22).
These require varying levels of experience with Rust and Ruff. Often, we want to accomplish these
tasks but do not have the resources to do so ourselves.
For larger changes (e.g., new lint rules, new functionality, new configuration options), consider
creating an [**issue**](https://github.com/astral-sh/ruff/issues) outlining your proposed change.
You can also join us on [Discord](https://discord.com/invite/astral-sh) to discuss your idea with the
community. We've labeled [beginner-friendly tasks](https://github.com/astral-sh/ruff/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22)
in the issue tracker, along with [bugs](https://github.com/astral-sh/ruff/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
and [improvements](https://github.com/astral-sh/ruff/issues?q=is%3Aissue+is%3Aopen+label%3Aaccepted)
that are ready for contributions.
You don't need our permission to start on an issue we have labeled as appropriate for community
contribution as described above. However, it's a good idea to indicate that you are going to work on
an issue to avoid concurrent attempts to solve the same problem.
If you have suggestions on how we might improve the contributing documentation, [let us know](https://github.com/astral-sh/ruff/discussions/5693)!
Please check in with us before starting work on an issue that has not been labeled as appropriate
for community contribution. We're happy to receive contributions for other issues, but it's
important to make sure we have consensus on the solution to the problem first.
Outside of issues with the labels above, issues labeled as
[`bug`](https://github.com/astral-sh/ruff/issues?q=is%3Aopen+is%3Aissue+label%3A%22bug%22) are the
best candidates for contribution. In contrast, issues labeled with `needs-decision` or
`needs-design` are _not_ good candidates for contribution. Please do not open pull requests for
issues with these labels.
Please do not open pull requests for new features without prior discussion. While we appreciate
exploration of new features, we will often close these pull requests immediately. Adding a
new feature to Ruff creates a long-term maintenance burden and requires strong consensus from the
Ruff team before it is appropriate to begin work on an implementation.
### Prerequisites

6
Cargo.lock generated
View File

@@ -2738,7 +2738,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.13.1"
version = "0.13.2"
dependencies = [
"anyhow",
"argfile",
@@ -2994,7 +2994,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.13.1"
version = "0.13.2"
dependencies = [
"aho-corasick",
"anyhow",
@@ -3348,7 +3348,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.13.1"
version = "0.13.2"
dependencies = [
"console_error_panic_hook",
"console_log",

View File

@@ -148,8 +148,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.13.1/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.13.1/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.13.2/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.13.2/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -182,7 +182,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.13.1
rev: v0.13.2
hooks:
# Run the linter.
- id: ruff-check

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.13.1"
version = "0.13.2"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -2476,6 +2476,319 @@ requires-python = ">= 3.11"
Ok(())
}
/// ```
/// tmp
/// ├── pyproject.toml #<--- no `[tool.ruff]`
/// └── test.py
/// ```
#[test]
fn requires_python_no_tool_preview_enabled() -> Result<()> {
let tempdir = TempDir::new()?;
let project_dir = dunce::canonicalize(tempdir.path())?;
let ruff_toml = tempdir.path().join("pyproject.toml");
fs::write(
&ruff_toml,
r#"[project]
requires-python = ">= 3.11"
"#,
)?;
let testpy = tempdir.path().join("test.py");
fs::write(
&testpy,
r#"from typing import Union;foo: Union[int, str] = 1"#,
)?;
insta::with_settings!({
filters => vec![(tempdir_filter(&project_dir).as_str(), "[TMP]/")]
}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(STDIN_BASE_OPTIONS)
.arg("--preview")
.arg("--show-settings")
.args(["--select","UP007"])
.arg("test.py")
.arg("-")
.current_dir(project_dir)
, @r#"
success: true
exit_code: 0
----- stdout -----
Resolved settings for: "[TMP]/test.py"
# General Settings
cache_dir = "[TMP]/.ruff_cache"
fix = false
fix_only = false
output_format = concise
show_fixes = false
unsafe_fixes = hint
# File Resolver Settings
file_resolver.exclude = [
".bzr",
".direnv",
".eggs",
".git",
".git-rewrite",
".hg",
".ipynb_checkpoints",
".mypy_cache",
".nox",
".pants.d",
".pyenv",
".pytest_cache",
".pytype",
".ruff_cache",
".svn",
".tox",
".venv",
".vscode",
"__pypackages__",
"_build",
"buck-out",
"dist",
"node_modules",
"site-packages",
"venv",
]
file_resolver.extend_exclude = []
file_resolver.force_exclude = false
file_resolver.include = [
"*.py",
"*.pyi",
"*.pyw",
"*.ipynb",
"**/pyproject.toml",
]
file_resolver.extend_include = []
file_resolver.respect_gitignore = true
file_resolver.project_root = "[TMP]/"
# Linter Settings
linter.exclude = []
linter.project_root = "[TMP]/"
linter.rules.enabled = [
non-pep604-annotation-union (UP007),
]
linter.rules.should_fix = [
non-pep604-annotation-union (UP007),
]
linter.per_file_ignores = {}
linter.safety_table.forced_safe = []
linter.safety_table.forced_unsafe = []
linter.unresolved_target_version = 3.11
linter.per_file_target_version = {}
linter.preview = enabled
linter.explicit_preview_rules = false
linter.extension = ExtensionMapping({})
linter.allowed_confusables = []
linter.builtins = []
linter.dummy_variable_rgx = ^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$
linter.external = []
linter.ignore_init_module_imports = true
linter.logger_objects = []
linter.namespace_packages = []
linter.src = [
"[TMP]/",
"[TMP]/src",
]
linter.tab_size = 4
linter.line_length = 88
linter.task_tags = [
TODO,
FIXME,
XXX,
]
linter.typing_modules = []
linter.typing_extensions = true
# Linter Plugins
linter.flake8_annotations.mypy_init_return = false
linter.flake8_annotations.suppress_dummy_args = false
linter.flake8_annotations.suppress_none_returning = false
linter.flake8_annotations.allow_star_arg_any = false
linter.flake8_annotations.ignore_fully_untyped = false
linter.flake8_bandit.hardcoded_tmp_directory = [
/tmp,
/var/tmp,
/dev/shm,
]
linter.flake8_bandit.check_typed_exception = false
linter.flake8_bandit.extend_markup_names = []
linter.flake8_bandit.allowed_markup_calls = []
linter.flake8_bugbear.extend_immutable_calls = []
linter.flake8_builtins.allowed_modules = []
linter.flake8_builtins.ignorelist = []
linter.flake8_builtins.strict_checking = false
linter.flake8_comprehensions.allow_dict_calls_with_keyword_arguments = false
linter.flake8_copyright.notice_rgx = (?i)Copyright\s+((?:\(C\)|©)\s+)?\d{4}((-|,\s)\d{4})*
linter.flake8_copyright.author = none
linter.flake8_copyright.min_file_size = 0
linter.flake8_errmsg.max_string_length = 0
linter.flake8_gettext.functions_names = [
_,
gettext,
ngettext,
]
linter.flake8_implicit_str_concat.allow_multiline = true
linter.flake8_import_conventions.aliases = {
altair = alt,
holoviews = hv,
matplotlib = mpl,
matplotlib.pyplot = plt,
networkx = nx,
numpy = np,
numpy.typing = npt,
pandas = pd,
panel = pn,
plotly.express = px,
polars = pl,
pyarrow = pa,
seaborn = sns,
tensorflow = tf,
tkinter = tk,
xml.etree.ElementTree = ET,
}
linter.flake8_import_conventions.banned_aliases = {}
linter.flake8_import_conventions.banned_from = []
linter.flake8_pytest_style.fixture_parentheses = false
linter.flake8_pytest_style.parametrize_names_type = tuple
linter.flake8_pytest_style.parametrize_values_type = list
linter.flake8_pytest_style.parametrize_values_row_type = tuple
linter.flake8_pytest_style.raises_require_match_for = [
BaseException,
Exception,
ValueError,
OSError,
IOError,
EnvironmentError,
socket.error,
]
linter.flake8_pytest_style.raises_extend_require_match_for = []
linter.flake8_pytest_style.mark_parentheses = false
linter.flake8_quotes.inline_quotes = double
linter.flake8_quotes.multiline_quotes = double
linter.flake8_quotes.docstring_quotes = double
linter.flake8_quotes.avoid_escape = true
linter.flake8_self.ignore_names = [
_make,
_asdict,
_replace,
_fields,
_field_defaults,
_name_,
_value_,
]
linter.flake8_tidy_imports.ban_relative_imports = "parents"
linter.flake8_tidy_imports.banned_api = {}
linter.flake8_tidy_imports.banned_module_level_imports = []
linter.flake8_type_checking.strict = false
linter.flake8_type_checking.exempt_modules = [
typing,
typing_extensions,
]
linter.flake8_type_checking.runtime_required_base_classes = []
linter.flake8_type_checking.runtime_required_decorators = []
linter.flake8_type_checking.quote_annotations = false
linter.flake8_unused_arguments.ignore_variadic_names = false
linter.isort.required_imports = []
linter.isort.combine_as_imports = false
linter.isort.force_single_line = false
linter.isort.force_sort_within_sections = false
linter.isort.detect_same_package = true
linter.isort.case_sensitive = false
linter.isort.force_wrap_aliases = false
linter.isort.force_to_top = []
linter.isort.known_modules = {}
linter.isort.order_by_type = true
linter.isort.relative_imports_order = furthest_to_closest
linter.isort.single_line_exclusions = []
linter.isort.split_on_trailing_comma = true
linter.isort.classes = []
linter.isort.constants = []
linter.isort.variables = []
linter.isort.no_lines_before = []
linter.isort.lines_after_imports = -1
linter.isort.lines_between_types = 0
linter.isort.forced_separate = []
linter.isort.section_order = [
known { type = future },
known { type = standard_library },
known { type = third_party },
known { type = first_party },
known { type = local_folder },
]
linter.isort.default_section = known { type = third_party }
linter.isort.no_sections = false
linter.isort.from_first = false
linter.isort.length_sort = false
linter.isort.length_sort_straight = false
linter.mccabe.max_complexity = 10
linter.pep8_naming.ignore_names = [
setUp,
tearDown,
setUpClass,
tearDownClass,
setUpModule,
tearDownModule,
asyncSetUp,
asyncTearDown,
setUpTestData,
failureException,
longMessage,
maxDiff,
]
linter.pep8_naming.classmethod_decorators = []
linter.pep8_naming.staticmethod_decorators = []
linter.pycodestyle.max_line_length = 88
linter.pycodestyle.max_doc_length = none
linter.pycodestyle.ignore_overlong_task_comments = false
linter.pyflakes.extend_generics = []
linter.pyflakes.allowed_unused_imports = []
linter.pylint.allow_magic_value_types = [
str,
bytes,
]
linter.pylint.allow_dunder_method_names = []
linter.pylint.max_args = 5
linter.pylint.max_positional_args = 5
linter.pylint.max_returns = 6
linter.pylint.max_bool_expr = 5
linter.pylint.max_branches = 12
linter.pylint.max_statements = 50
linter.pylint.max_public_methods = 20
linter.pylint.max_locals = 15
linter.pyupgrade.keep_runtime_typing = false
linter.ruff.parenthesize_tuple_in_subscript = false
# Formatter Settings
formatter.exclude = []
formatter.unresolved_target_version = 3.11
formatter.per_file_target_version = {}
formatter.preview = enabled
formatter.line_width = 88
formatter.line_ending = auto
formatter.indent_style = space
formatter.indent_width = 4
formatter.quote_style = double
formatter.magic_trailing_comma = respect
formatter.docstring_code_format = disabled
formatter.docstring_code_line_width = dynamic
# Analyze Settings
analyze.exclude = []
analyze.preview = enabled
analyze.target_version = 3.11
analyze.string_imports = disabled
analyze.extension = ExtensionMapping({})
analyze.include_dependencies = {}
----- stderr -----
"#);
});
Ok(())
}
/// ```
/// tmp
/// ├── pyproject.toml #<--- no `[tool.ruff]`
@@ -6045,3 +6358,200 @@ fn rule_panic_mixed_results_full() -> Result<()> {
});
Ok(())
}
/// Test that the same rule fires across all supported extensions, but not on unsupported files
#[test]
fn supported_file_extensions() -> Result<()> {
let tempdir = TempDir::new()?;
let inner_dir = tempdir.path().join("src");
fs::create_dir(&inner_dir)?;
// Create files of various types
// text file
fs::write(inner_dir.join("thing.txt"), b"hello world\n")?;
// regular python
fs::write(
inner_dir.join("thing.py"),
b"import os\nprint('hello world')\n",
)?;
// python typestub
fs::write(
inner_dir.join("thing.pyi"),
b"import os\nclass foo:\n ...\n",
)?;
// windows gui
fs::write(
inner_dir.join("thing.pyw"),
b"import os\nprint('hello world')\n",
)?;
// cython
fs::write(
inner_dir.join("thing.pyx"),
b"import os\ncdef int add(int a, int b):\n return a + b\n",
)?;
// notebook
fs::write(
inner_dir.join("thing.ipynb"),
r#"
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"id": "ad6f36d9-4b7d-4562-8d00-f15a0f1fbb6d",
"metadata": {},
"outputs": [],
"source": [
"import os"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.0"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
"#,
)?;
insta::with_settings!({
filters => vec![
(tempdir_filter(&tempdir).as_str(), "[TMP]/"),
(r"\\", r"/"),
]
}, {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--select", "F401", "--output-format=concise", "--no-cache"])
.args([inner_dir]),
@r"
success: false
exit_code: 1
----- stdout -----
[TMP]/src/thing.ipynb:cell 1:1:8: F401 [*] `os` imported but unused
[TMP]/src/thing.py:1:8: F401 [*] `os` imported but unused
[TMP]/src/thing.pyi:1:8: F401 [*] `os` imported but unused
Found 3 errors.
[*] 3 fixable with the `--fix` option.
----- stderr -----
");
});
Ok(())
}
/// Test that the same rule fires across all supported extensions, but not on unsupported files
#[test]
fn supported_file_extensions_preview_enabled() -> Result<()> {
let tempdir = TempDir::new()?;
let inner_dir = tempdir.path().join("src");
fs::create_dir(&inner_dir)?;
// Create files of various types
// text file
fs::write(inner_dir.join("thing.txt"), b"hello world\n")?;
// regular python
fs::write(
inner_dir.join("thing.py"),
b"import os\nprint('hello world')\n",
)?;
// python typestub
fs::write(
inner_dir.join("thing.pyi"),
b"import os\nclass foo:\n ...\n",
)?;
// windows gui
fs::write(
inner_dir.join("thing.pyw"),
b"import os\nprint('hello world')\n",
)?;
// cython
fs::write(
inner_dir.join("thing.pyx"),
b"import os\ncdef int add(int a, int b):\n return a + b\n",
)?;
// notebook
fs::write(
inner_dir.join("thing.ipynb"),
r#"
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"id": "ad6f36d9-4b7d-4562-8d00-f15a0f1fbb6d",
"metadata": {},
"outputs": [],
"source": [
"import os"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.0"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
"#,
)?;
insta::with_settings!({
filters => vec![
(tempdir_filter(&tempdir).as_str(), "[TMP]/"),
(r"\\", r"/"),
]
}, {
assert_cmd_snapshot!(
Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--select", "F401", "--preview", "--output-format=concise", "--no-cache"])
.args([inner_dir]),
@r"
success: false
exit_code: 1
----- stdout -----
[TMP]/src/thing.ipynb:cell 1:1:8: F401 [*] `os` imported but unused
[TMP]/src/thing.py:1:8: F401 [*] `os` imported but unused
[TMP]/src/thing.pyi:1:8: F401 [*] `os` imported but unused
[TMP]/src/thing.pyw:1:8: F401 [*] `os` imported but unused
Found 4 errors.
[*] 4 fixable with the `--fix` option.
----- stderr -----
");
});
Ok(())
}

View File

@@ -15,7 +15,6 @@ use crate::{
Db,
files::File,
source::{SourceText, line_index, source_text},
system::SystemPath,
};
use super::{
@@ -800,7 +799,7 @@ where
T: Db,
{
fn path(&self, file: File) -> &str {
relativize_path(self.system().current_directory(), file.path(self).as_str())
file.path(self).as_str()
}
fn input(&self, file: File) -> Input {
@@ -836,7 +835,7 @@ where
impl FileResolver for &dyn Db {
fn path(&self, file: File) -> &str {
relativize_path(self.system().current_directory(), file.path(*self).as_str())
file.path(*self).as_str()
}
fn input(&self, file: File) -> Input {
@@ -955,14 +954,6 @@ fn context_after(
line
}
/// Convert an absolute path to be relative to the current working directory.
fn relativize_path<'p>(cwd: &SystemPath, path: &'p str) -> &'p str {
if let Ok(path) = SystemPath::new(path).strip_prefix(cwd) {
return path.as_str();
}
path
}
/// Given some source code and annotation ranges, this routine replaces
/// unprintable characters with printable representations of them.
///

View File

@@ -2,6 +2,6 @@
source: crates/ruff_db/src/diagnostic/render/azure.rs
expression: env.render_diagnostics(&diagnostics)
---
##vso[task.logissue type=error;sourcepath=fib.py;linenumber=1;columnnumber=8;code=F401;]`os` imported but unused
##vso[task.logissue type=error;sourcepath=fib.py;linenumber=6;columnnumber=5;code=F841;]Local variable `x` is assigned to but never used
##vso[task.logissue type=error;sourcepath=undef.py;linenumber=1;columnnumber=4;code=F821;]Undefined name `a`
##vso[task.logissue type=error;sourcepath=/fib.py;linenumber=1;columnnumber=8;code=F401;]`os` imported but unused
##vso[task.logissue type=error;sourcepath=/fib.py;linenumber=6;columnnumber=5;code=F841;]Local variable `x` is assigned to but never used
##vso[task.logissue type=error;sourcepath=/undef.py;linenumber=1;columnnumber=4;code=F821;]Undefined name `a`

View File

@@ -2,5 +2,5 @@
source: crates/ruff_db/src/diagnostic/render/azure.rs
expression: env.render_diagnostics(&diagnostics)
---
##vso[task.logissue type=error;sourcepath=syntax_errors.py;linenumber=1;columnnumber=15;code=invalid-syntax;]Expected one or more symbol names after import
##vso[task.logissue type=error;sourcepath=syntax_errors.py;linenumber=3;columnnumber=12;code=invalid-syntax;]Expected ')', found newline
##vso[task.logissue type=error;sourcepath=/syntax_errors.py;linenumber=1;columnnumber=15;code=invalid-syntax;]Expected one or more symbol names after import
##vso[task.logissue type=error;sourcepath=/syntax_errors.py;linenumber=3;columnnumber=12;code=invalid-syntax;]Expected ')', found newline

View File

@@ -2,6 +2,6 @@
source: crates/ruff_db/src/diagnostic/render/github.rs
expression: env.render_diagnostics(&diagnostics)
---
::error title=ty (F401),file=fib.py,line=1,col=8,endLine=1,endColumn=10::fib.py:1:8: F401 `os` imported but unused
::error title=ty (F841),file=fib.py,line=6,col=5,endLine=6,endColumn=6::fib.py:6:5: F841 Local variable `x` is assigned to but never used
::error title=ty (F821),file=undef.py,line=1,col=4,endLine=1,endColumn=5::undef.py:1:4: F821 Undefined name `a`
::error title=ty (F401),file=/fib.py,line=1,col=8,endLine=1,endColumn=10::fib.py:1:8: F401 `os` imported but unused
::error title=ty (F841),file=/fib.py,line=6,col=5,endLine=6,endColumn=6::fib.py:6:5: F841 Local variable `x` is assigned to but never used
::error title=ty (F821),file=/undef.py,line=1,col=4,endLine=1,endColumn=5::undef.py:1:4: F821 Undefined name `a`

View File

@@ -2,5 +2,5 @@
source: crates/ruff_db/src/diagnostic/render/github.rs
expression: env.render_diagnostics(&diagnostics)
---
::error title=ty (invalid-syntax),file=syntax_errors.py,line=1,col=15,endLine=2,endColumn=1::syntax_errors.py:1:15: invalid-syntax: Expected one or more symbol names after import
::error title=ty (invalid-syntax),file=syntax_errors.py,line=3,col=12,endLine=4,endColumn=1::syntax_errors.py:3:12: invalid-syntax: Expected ')', found newline
::error title=ty (invalid-syntax),file=/syntax_errors.py,line=1,col=15,endLine=2,endColumn=1::syntax_errors.py:1:15: invalid-syntax: Expected one or more symbol names after import
::error title=ty (invalid-syntax),file=/syntax_errors.py,line=3,col=12,endLine=4,endColumn=1::syntax_errors.py:3:12: invalid-syntax: Expected ')', found newline

View File

@@ -10,7 +10,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 10,
"row": 2
},
"filename": "notebook.ipynb",
"filename": "/notebook.ipynb",
"fix": {
"applicability": "safe",
"edits": [
@@ -43,7 +43,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 12,
"row": 2
},
"filename": "notebook.ipynb",
"filename": "/notebook.ipynb",
"fix": {
"applicability": "safe",
"edits": [
@@ -76,7 +76,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 6,
"row": 4
},
"filename": "notebook.ipynb",
"filename": "/notebook.ipynb",
"fix": {
"applicability": "unsafe",
"edits": [

View File

@@ -10,7 +10,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 10,
"row": 1
},
"filename": "fib.py",
"filename": "/fib.py",
"fix": {
"applicability": "unsafe",
"edits": [
@@ -43,7 +43,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 6,
"row": 6
},
"filename": "fib.py",
"filename": "/fib.py",
"fix": {
"applicability": "unsafe",
"edits": [
@@ -76,7 +76,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 5,
"row": 1
},
"filename": "undef.py",
"filename": "/undef.py",
"fix": null,
"location": {
"column": 4,

View File

@@ -10,7 +10,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 1,
"row": 2
},
"filename": "syntax_errors.py",
"filename": "/syntax_errors.py",
"fix": null,
"location": {
"column": 15,
@@ -27,7 +27,7 @@ expression: env.render_diagnostics(&diagnostics)
"column": 1,
"row": 4
},
"filename": "syntax_errors.py",
"filename": "/syntax_errors.py",
"fix": null,
"location": {
"column": 12,

View File

@@ -2,6 +2,6 @@
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":1,"code":"F401","end_location":{"column":10,"row":2},"filename":"notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":10,"row":2},"location":{"column":1,"row":2}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":2},"message":"`os` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":2,"code":"F401","end_location":{"column":12,"row":2},"filename":"notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":1,"row":3},"location":{"column":1,"row":2}}],"message":"Remove unused import: `math`"},"location":{"column":8,"row":2},"message":"`math` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":3,"code":"F841","end_location":{"column":6,"row":4},"filename":"notebook.ipynb","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":5},"location":{"column":1,"row":4}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":4},"message":"Local variable `x` is assigned to but never used","noqa_row":4,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}
{"cell":1,"code":"F401","end_location":{"column":10,"row":2},"filename":"/notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":10,"row":2},"location":{"column":1,"row":2}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":2},"message":"`os` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":2,"code":"F401","end_location":{"column":12,"row":2},"filename":"/notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":1,"row":3},"location":{"column":1,"row":2}}],"message":"Remove unused import: `math`"},"location":{"column":8,"row":2},"message":"`math` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":3,"code":"F841","end_location":{"column":6,"row":4},"filename":"/notebook.ipynb","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":5},"location":{"column":1,"row":4}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":4},"message":"Local variable `x` is assigned to but never used","noqa_row":4,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}

View File

@@ -2,6 +2,6 @@
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":null,"code":"F401","end_location":{"column":10,"row":1},"filename":"fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":2},"location":{"column":1,"row":1}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":1},"message":"`os` imported but unused","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":null,"code":"F841","end_location":{"column":6,"row":6},"filename":"fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":10,"row":6},"location":{"column":5,"row":6}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":6},"message":"Local variable `x` is assigned to but never used","noqa_row":6,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}
{"cell":null,"code":"F821","end_location":{"column":5,"row":1},"filename":"undef.py","fix":null,"location":{"column":4,"row":1},"message":"Undefined name `a`","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/undefined-name"}
{"cell":null,"code":"F401","end_location":{"column":10,"row":1},"filename":"/fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":2},"location":{"column":1,"row":1}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":1},"message":"`os` imported but unused","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":null,"code":"F841","end_location":{"column":6,"row":6},"filename":"/fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":10,"row":6},"location":{"column":5,"row":6}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":6},"message":"Local variable `x` is assigned to but never used","noqa_row":6,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}
{"cell":null,"code":"F821","end_location":{"column":5,"row":1},"filename":"/undef.py","fix":null,"location":{"column":4,"row":1},"message":"Undefined name `a`","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/undefined-name"}

View File

@@ -2,5 +2,5 @@
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":null,"code":"invalid-syntax","end_location":{"column":1,"row":2},"filename":"syntax_errors.py","fix":null,"location":{"column":15,"row":1},"message":"Expected one or more symbol names after import","noqa_row":null,"url":null}
{"cell":null,"code":"invalid-syntax","end_location":{"column":1,"row":4},"filename":"syntax_errors.py","fix":null,"location":{"column":12,"row":3},"message":"Expected ')', found newline","noqa_row":null,"url":null}
{"cell":null,"code":"invalid-syntax","end_location":{"column":1,"row":2},"filename":"/syntax_errors.py","fix":null,"location":{"column":15,"row":1},"message":"Expected one or more symbol names after import","noqa_row":null,"url":null}
{"cell":null,"code":"invalid-syntax","end_location":{"column":1,"row":4},"filename":"/syntax_errors.py","fix":null,"location":{"column":12,"row":3},"message":"Expected ')', found newline","noqa_row":null,"url":null}

View File

@@ -4,16 +4,16 @@ expression: env.render_diagnostics(&diagnostics)
---
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff" tests="3" failures="3" errors="0">
<testsuite name="fib.py" tests="2" disabled="0" errors="0" failures="2" package="org.ruff">
<testcase name="org.ruff.F401" classname="fib" line="1" column="8">
<testsuite name="/fib.py" tests="2" disabled="0" errors="0" failures="2" package="org.ruff">
<testcase name="org.ruff.F401" classname="/fib" line="1" column="8">
<failure message="`os` imported but unused">line 1, col 8, `os` imported but unused</failure>
</testcase>
<testcase name="org.ruff.F841" classname="fib" line="6" column="5">
<testcase name="org.ruff.F841" classname="/fib" line="6" column="5">
<failure message="Local variable `x` is assigned to but never used">line 6, col 5, Local variable `x` is assigned to but never used</failure>
</testcase>
</testsuite>
<testsuite name="undef.py" tests="1" disabled="0" errors="0" failures="1" package="org.ruff">
<testcase name="org.ruff.F821" classname="undef" line="1" column="4">
<testsuite name="/undef.py" tests="1" disabled="0" errors="0" failures="1" package="org.ruff">
<testcase name="org.ruff.F821" classname="/undef" line="1" column="4">
<failure message="Undefined name `a`">line 1, col 4, Undefined name `a`</failure>
</testcase>
</testsuite>

View File

@@ -4,11 +4,11 @@ expression: env.render_diagnostics(&diagnostics)
---
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff" tests="2" failures="2" errors="0">
<testsuite name="syntax_errors.py" tests="2" disabled="0" errors="0" failures="2" package="org.ruff">
<testcase name="org.ruff.invalid-syntax" classname="syntax_errors" line="1" column="15">
<testsuite name="/syntax_errors.py" tests="2" disabled="0" errors="0" failures="2" package="org.ruff">
<testcase name="org.ruff.invalid-syntax" classname="/syntax_errors" line="1" column="15">
<failure message="Expected one or more symbol names after import">line 1, col 15, Expected one or more symbol names after import</failure>
</testcase>
<testcase name="org.ruff.invalid-syntax" classname="syntax_errors" line="3" column="12">
<testcase name="org.ruff.invalid-syntax" classname="/syntax_errors" line="3" column="12">
<failure message="Expected &apos;)&apos;, found newline">line 3, col 12, Expected &apos;)&apos;, found newline</failure>
</testcase>
</testsuite>

View File

@@ -10,7 +10,7 @@ expression: env.render_diagnostics(&diagnostics)
"value": "F401"
},
"location": {
"path": "fib.py",
"path": "/fib.py",
"range": {
"end": {
"column": 10,
@@ -45,7 +45,7 @@ expression: env.render_diagnostics(&diagnostics)
"value": "F841"
},
"location": {
"path": "fib.py",
"path": "/fib.py",
"range": {
"end": {
"column": 6,
@@ -80,7 +80,7 @@ expression: env.render_diagnostics(&diagnostics)
"value": "F821"
},
"location": {
"path": "undef.py",
"path": "/undef.py",
"range": {
"end": {
"column": 5,

View File

@@ -9,7 +9,7 @@ expression: env.render_diagnostics(&diagnostics)
"value": "invalid-syntax"
},
"location": {
"path": "syntax_errors.py",
"path": "/syntax_errors.py",
"range": {
"end": {
"column": 1,
@@ -28,7 +28,7 @@ expression: env.render_diagnostics(&diagnostics)
"value": "invalid-syntax"
},
"location": {
"path": "syntax_errors.py",
"path": "/syntax_errors.py",
"range": {
"end": {
"column": 1,

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.13.1"
version = "0.13.2"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -0,0 +1,18 @@
from contextlib import nullcontext
def check_isolation_level(mode: int) -> None:
"""Will report both, but only fix the first.""" # ERROR PYI021
... # ERROR PIE790
with nullcontext():
"""Should not report."""
# add something that's not a pass here
# so we don't get a remove-unnecessary-pass error
x = 0
if True:
"""Should not report."""
# same as above
y = 1

View File

@@ -145,9 +145,6 @@ pub(crate) fn definitions(checker: &mut Checker) {
}
// flake8-pyi
if enforce_stubs {
flake8_pyi::rules::docstring_in_stubs(checker, definition, docstring);
}
if enforce_stubs_and_runtime {
flake8_pyi::rules::iter_method_return_iterable(checker, definition);
}

View File

@@ -212,13 +212,10 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if ctx.is_store() {
let check_too_many_expressions =
checker.is_rule_enabled(Rule::ExpressionsInStarAssignment);
let check_two_starred_expressions =
checker.is_rule_enabled(Rule::MultipleStarredExpressions);
pyflakes::rules::starred_expressions(
checker,
elts,
check_too_many_expressions,
check_two_starred_expressions,
expr.range(),
);
}

View File

@@ -2,14 +2,17 @@ use ruff_python_ast::Stmt;
use crate::checkers::ast::Checker;
use crate::codes::Rule;
use crate::rules::flake8_pie;
use crate::rules::refurb;
use crate::rules::{flake8_pie, flake8_pyi};
/// Run lint rules over a suite of [`Stmt`] syntax nodes.
pub(crate) fn suite(suite: &[Stmt], checker: &Checker) {
if checker.is_rule_enabled(Rule::UnnecessaryPlaceholder) {
flake8_pie::rules::unnecessary_placeholder(checker, suite);
}
if checker.source_type.is_stub() && checker.is_rule_enabled(Rule::DocstringInStub) {
flake8_pyi::rules::docstring_in_stubs(checker, suite);
}
if checker.is_rule_enabled(Rule::RepeatedGlobal) {
refurb::rules::repeated_global(checker, suite);
}

View File

@@ -69,8 +69,8 @@ use crate::package::PackageRoot;
use crate::preview::is_undefined_export_in_dunder_init_enabled;
use crate::registry::Rule;
use crate::rules::pyflakes::rules::{
LateFutureImport, ReturnOutsideFunction, UndefinedLocalWithNestedImportStarUsage,
YieldOutsideFunction,
LateFutureImport, MultipleStarredExpressions, ReturnOutsideFunction,
UndefinedLocalWithNestedImportStarUsage, YieldOutsideFunction,
};
use crate::rules::pylint::rules::{
AwaitOutsideAsync, LoadBeforeGlobalDeclaration, YieldFromInAsyncFunction,
@@ -87,7 +87,7 @@ mod deferred;
/// State representing whether a docstring is expected or not for the next statement.
#[derive(Debug, Copy, Clone, PartialEq)]
enum DocstringState {
pub(crate) enum DocstringState {
/// The next statement is expected to be a docstring, but not necessarily so.
///
/// For example, in the following code:
@@ -128,7 +128,7 @@ impl DocstringState {
/// The kind of an expected docstring.
#[derive(Debug, Copy, Clone, PartialEq)]
enum ExpectedDocstringKind {
pub(crate) enum ExpectedDocstringKind {
/// A module-level docstring.
///
/// For example,
@@ -603,6 +603,11 @@ impl<'a> Checker<'a> {
pub(crate) const fn context(&self) -> &'a LintContext<'a> {
self.context
}
/// Return the current [`DocstringState`].
pub(crate) fn docstring_state(&self) -> DocstringState {
self.docstring_state
}
}
pub(crate) struct TypingImporter<'a, 'b> {
@@ -685,6 +690,12 @@ impl SemanticSyntaxContext for Checker<'_> {
self.report_diagnostic(YieldFromInAsyncFunction, error.range);
}
}
SemanticSyntaxErrorKind::MultipleStarredExpressions => {
// F622
if self.is_rule_enabled(Rule::MultipleStarredExpressions) {
self.report_diagnostic(MultipleStarredExpressions, error.range);
}
}
SemanticSyntaxErrorKind::ReboundComprehensionVariable
| SemanticSyntaxErrorKind::DuplicateTypeParameter
| SemanticSyntaxErrorKind::MultipleCaseAssignment(_)

View File

@@ -23,6 +23,17 @@ use crate::rules::flake8_bandit::helpers::string_literal;
/// Avoid using weak or broken cryptographic hash functions in security
/// contexts. Instead, use a known secure hash function such as SHA256.
///
/// Note: This rule targets the following weak algorithm names in `hashlib`:
/// `md4`, `md5`, `sha`, and `sha1`. It also flags uses of `crypt.crypt` and
/// `crypt.mksalt` when configured with `METHOD_CRYPT`, `METHOD_MD5`, or
/// `METHOD_BLOWFISH`.
///
/// It does not attempt to lint OpenSSL- or platform-specific aliases and OIDs
/// (for example: `"sha-1"`, `"ssl3-sha1"`, `"ssl3-md5"`, or
/// `"1.3.14.3.2.26"`), nor variations with trailing spaces, as the set of
/// accepted aliases depends on the underlying OpenSSL version and varies across
/// platforms and Python builds.
///
/// ## Example
/// ```python
/// import hashlib

View File

@@ -71,6 +71,8 @@ pub(crate) fn implicit_namespace_package(
if package.is_none()
// Ignore non-`.py` files, which don't require an `__init__.py`.
&& PySourceType::try_from_path(path).is_some_and(PySourceType::is_py_file)
// Ignore `.pyw` files, which also map to `PySourceType::Python` but aren't importable namespaces.
&& path.extension().is_some_and(|ext| ext == "py")
// Ignore any files that are direct children of the project root.
&& path
.parent()

View File

@@ -192,4 +192,17 @@ mod tests {
assert_diagnostics!(snapshot, diagnostics);
Ok(())
}
#[test_case(Path::new("PYI021_1.pyi"))]
fn pyi021_pie790_isolation_check(path: &Path) -> Result<()> {
let diagnostics = test_path(
Path::new("flake8_pyi").join(path).as_path(),
&settings::LinterSettings::for_rules([
Rule::DocstringInStub,
Rule::UnnecessaryPlaceholder,
]),
)?;
assert_diagnostics!(diagnostics);
Ok(())
}
}

View File

@@ -1,9 +1,9 @@
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::ExprStringLiteral;
use ruff_python_semantic::Definition;
use ruff_python_ast::{ExprStringLiteral, Stmt};
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
use crate::checkers::ast::{Checker, DocstringState, ExpectedDocstringKind};
use crate::docstrings::extraction::docstring_from;
use crate::{AlwaysFixableViolation, Edit, Fix};
/// ## What it does
@@ -41,26 +41,34 @@ impl AlwaysFixableViolation for DocstringInStub {
}
/// PYI021
pub(crate) fn docstring_in_stubs(
checker: &Checker,
definition: &Definition,
docstring: Option<&ExprStringLiteral>,
) {
pub(crate) fn docstring_in_stubs(checker: &Checker, body: &[Stmt]) {
if !matches!(
checker.docstring_state(),
DocstringState::Expected(
ExpectedDocstringKind::Module
| ExpectedDocstringKind::Class
| ExpectedDocstringKind::Function
)
) {
return;
}
let docstring = docstring_from(body);
let Some(docstring_range) = docstring.map(ExprStringLiteral::range) else {
return;
};
let statements = match definition {
Definition::Module(module) => module.python_ast,
Definition::Member(member) => member.body(),
};
let edit = if statements.len() == 1 {
let edit = if body.len() == 1 {
Edit::range_replacement("...".to_string(), docstring_range)
} else {
Edit::range_deletion(docstring_range)
};
let mut diagnostic = checker.report_diagnostic(DocstringInStub, docstring_range);
diagnostic.set_fix(Fix::unsafe_edit(edit));
let isolation_level = Checker::isolation(checker.semantic().current_statement_id());
let fix = Fix::unsafe_edit(edit).isolate(isolation_level);
checker
.report_diagnostic(DocstringInStub, docstring_range)
.set_fix(fix);
}

View File

@@ -0,0 +1,39 @@
---
source: crates/ruff_linter/src/rules/flake8_pyi/mod.rs
---
PYI021 [*] Docstrings should not be included in stubs
--> PYI021_1.pyi:5:5
|
4 | def check_isolation_level(mode: int) -> None:
5 | """Will report both, but only fix the first.""" # ERROR PYI021
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
6 | ... # ERROR PIE790
|
help: Remove docstring
2 |
3 |
4 | def check_isolation_level(mode: int) -> None:
- """Will report both, but only fix the first.""" # ERROR PYI021
5 + # ERROR PYI021
6 | ... # ERROR PIE790
7 |
8 |
note: This is an unsafe fix and may change runtime behavior
PIE790 [*] Unnecessary `...` literal
--> PYI021_1.pyi:6:5
|
4 | def check_isolation_level(mode: int) -> None:
5 | """Will report both, but only fix the first.""" # ERROR PYI021
6 | ... # ERROR PIE790
| ^^^
|
help: Remove unnecessary `...`
3 |
4 | def check_isolation_level(mode: int) -> None:
5 | """Will report both, but only fix the first.""" # ERROR PYI021
- ... # ERROR PIE790
6 + # ERROR PIE790
7 |
8 |
9 | with nullcontext():

View File

@@ -53,27 +53,14 @@ impl Violation for MultipleStarredExpressions {
}
}
/// F621, F622
/// F621
pub(crate) fn starred_expressions(
checker: &Checker,
elts: &[Expr],
check_too_many_expressions: bool,
check_two_starred_expressions: bool,
location: TextRange,
) {
let mut has_starred: bool = false;
let mut starred_index: Option<usize> = None;
for (index, elt) in elts.iter().enumerate() {
if elt.is_starred_expr() {
if has_starred && check_two_starred_expressions {
checker.report_diagnostic(MultipleStarredExpressions, location);
return;
}
has_starred = true;
starred_index = Some(index);
}
}
let starred_index: Option<usize> = elts.iter().position(Expr::is_starred_expr);
if check_too_many_expressions {
if let Some(starred_index) = starred_index {
if starred_index >= 1 << 8 || elts.len() - starred_index > 1 << 24 {

View File

@@ -78,7 +78,9 @@ pub enum TomlSourceType {
#[derive(Clone, Copy, Debug, Default, PartialEq, Eq, Hash)]
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
pub enum PySourceType {
/// The source is a Python file (`.py`).
/// The source is a Python file (`.py`, `.pyw`).
/// Note: `.pyw` files contain Python code, but do not represent importable namespaces.
/// Consider adding a separate source type later if combining the two causes issues.
#[default]
Python,
/// The source is a Python stub file (`.pyi`).
@@ -100,6 +102,7 @@ impl PySourceType {
let ty = match extension {
"py" => Self::Python,
"pyi" => Self::Stub,
"pyw" => Self::Python,
"ipynb" => Self::Ipynb,
_ => return None,
};

View File

@@ -0,0 +1,5 @@
(*a, *b) = (1, 2)
[*a, *b] = (1, 2)
(*a, *b, c) = (1, 2, 3)
[*a, *b, c] = (1, 2, 3)
(*a, *b, (*c, *d)) = (1, 2)

View File

@@ -0,0 +1,2 @@
(*a, b) = (1, 2)
(*_, normed), *_ = [(1,), 2]

View File

@@ -389,6 +389,40 @@ impl SemanticSyntaxChecker {
}
}
fn multiple_star_expression<Ctx: SemanticSyntaxContext>(
ctx: &Ctx,
expr_ctx: ExprContext,
elts: &[Expr],
range: TextRange,
) {
if expr_ctx.is_store() {
let mut has_starred = false;
for elt in elts {
if elt.is_starred_expr() {
if has_starred {
// test_err multiple_starred_assignment_target
// (*a, *b) = (1, 2)
// [*a, *b] = (1, 2)
// (*a, *b, c) = (1, 2, 3)
// [*a, *b, c] = (1, 2, 3)
// (*a, *b, (*c, *d)) = (1, 2)
// test_ok multiple_starred_assignment_target
// (*a, b) = (1, 2)
// (*_, normed), *_ = [(1,), 2]
Self::add_error(
ctx,
SemanticSyntaxErrorKind::MultipleStarredExpressions,
range,
);
return;
}
has_starred = true;
}
}
}
}
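The scan above flags the second starred element in a Store-context target list. Because CPython's parser accepts that construct grammatically (the rejection happens later, during compilation), the same detection can be mirrored over `ast` nodes; a sketch:

```python
import ast

def has_multiple_starred_targets(src: str) -> bool:
    # Scan Store-context tuple/list targets and flag a second Starred
    # element, mirroring the semantic check. ast.parse succeeds here
    # because CPython raises the error at compile time, not parse time.
    for node in ast.walk(ast.parse(src)):
        if isinstance(node, (ast.Tuple, ast.List)) and isinstance(node.ctx, ast.Store):
            if sum(isinstance(e, ast.Starred) for e in node.elts) > 1:
                return True
    return False

print(has_multiple_starred_targets("(*a, *b) = (1, 2)"))  # True
print(has_multiple_starred_targets("(*a, b) = (1, 2)"))   # False
```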
/// Check for [`SemanticSyntaxErrorKind::WriteToDebug`] in `stmt`.
fn debug_shadowing<Ctx: SemanticSyntaxContext>(stmt: &ast::Stmt, ctx: &Ctx) {
match stmt {
@@ -754,6 +788,20 @@ impl SemanticSyntaxChecker {
Self::yield_outside_function(ctx, expr, YieldOutsideFunctionKind::Await);
Self::await_outside_async_function(ctx, expr, AwaitOutsideAsyncFunctionKind::Await);
}
Expr::Tuple(ast::ExprTuple {
elts,
ctx: expr_ctx,
range,
..
})
| Expr::List(ast::ExprList {
elts,
ctx: expr_ctx,
range,
..
}) => {
Self::multiple_star_expression(ctx, *expr_ctx, elts, *range);
}
Expr::Lambda(ast::ExprLambda {
parameters: Some(parameters),
..
@@ -1035,6 +1083,9 @@ impl Display for SemanticSyntaxError {
SemanticSyntaxErrorKind::NonModuleImportStar(name) => {
write!(f, "`from {name} import *` only allowed at module level")
}
SemanticSyntaxErrorKind::MultipleStarredExpressions => {
write!(f, "Two starred expressions in assignment")
}
}
}
}
@@ -1398,6 +1449,13 @@ pub enum SemanticSyntaxErrorKind {
/// Represents the use of `from <module> import *` outside module scope.
NonModuleImportStar(String),
/// Represents the use of more than one starred expression in an assignment.
///
/// Python only allows a single starred target when unpacking values on the
/// left-hand side of an assignment. Using multiple starred expressions makes
/// the statement invalid and results in a `SyntaxError`.
MultipleStarredExpressions,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, get_size2::GetSize)]
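The doc comment on `MultipleStarredExpressions` can be illustrated directly: one starred target unpacks fine, while a second starred target in the same target list is rejected by CPython at compile time. A minimal sketch:

```python
# Valid: a single starred target absorbs the remaining values.
*head, last = [1, 2, 3]
assert head == [1, 2]
assert last == 3

# Invalid: two starred targets in one target list raise a SyntaxError
# (CPython reports this during compilation to bytecode).
try:
    compile("(*a, *b) = (1, 2)", "<example>", "exec")
except SyntaxError:
    print("rejected")
```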

View File

@@ -0,0 +1,520 @@
---
source: crates/ruff_python_parser/tests/fixtures.rs
input_file: crates/ruff_python_parser/resources/inline/err/multiple_starred_assignment_target.py
---
## AST
```
Module(
ModModule {
node_index: NodeIndex(None),
range: 0..112,
body: [
Assign(
StmtAssign {
node_index: NodeIndex(None),
range: 0..17,
targets: [
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 0..8,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 1..3,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 2..3,
id: Name("a"),
ctx: Store,
},
),
ctx: Store,
},
),
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 5..7,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 6..7,
id: Name("b"),
ctx: Store,
},
),
ctx: Store,
},
),
],
ctx: Store,
parenthesized: true,
},
),
],
value: Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 11..17,
elts: [
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 12..13,
value: Int(
1,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 15..16,
value: Int(
2,
),
},
),
],
ctx: Load,
parenthesized: true,
},
),
},
),
Assign(
StmtAssign {
node_index: NodeIndex(None),
range: 18..35,
targets: [
List(
ExprList {
node_index: NodeIndex(None),
range: 18..26,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 19..21,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 20..21,
id: Name("a"),
ctx: Store,
},
),
ctx: Store,
},
),
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 23..25,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 24..25,
id: Name("b"),
ctx: Store,
},
),
ctx: Store,
},
),
],
ctx: Store,
},
),
],
value: Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 29..35,
elts: [
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 30..31,
value: Int(
1,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 33..34,
value: Int(
2,
),
},
),
],
ctx: Load,
parenthesized: true,
},
),
},
),
Assign(
StmtAssign {
node_index: NodeIndex(None),
range: 36..59,
targets: [
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 36..47,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 37..39,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 38..39,
id: Name("a"),
ctx: Store,
},
),
ctx: Store,
},
),
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 41..43,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 42..43,
id: Name("b"),
ctx: Store,
},
),
ctx: Store,
},
),
Name(
ExprName {
node_index: NodeIndex(None),
range: 45..46,
id: Name("c"),
ctx: Store,
},
),
],
ctx: Store,
parenthesized: true,
},
),
],
value: Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 50..59,
elts: [
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 51..52,
value: Int(
1,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 54..55,
value: Int(
2,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 57..58,
value: Int(
3,
),
},
),
],
ctx: Load,
parenthesized: true,
},
),
},
),
Assign(
StmtAssign {
node_index: NodeIndex(None),
range: 60..83,
targets: [
List(
ExprList {
node_index: NodeIndex(None),
range: 60..71,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 61..63,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 62..63,
id: Name("a"),
ctx: Store,
},
),
ctx: Store,
},
),
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 65..67,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 66..67,
id: Name("b"),
ctx: Store,
},
),
ctx: Store,
},
),
Name(
ExprName {
node_index: NodeIndex(None),
range: 69..70,
id: Name("c"),
ctx: Store,
},
),
],
ctx: Store,
},
),
],
value: Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 74..83,
elts: [
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 75..76,
value: Int(
1,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 78..79,
value: Int(
2,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 81..82,
value: Int(
3,
),
},
),
],
ctx: Load,
parenthesized: true,
},
),
},
),
Assign(
StmtAssign {
node_index: NodeIndex(None),
range: 84..111,
targets: [
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 84..102,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 85..87,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 86..87,
id: Name("a"),
ctx: Store,
},
),
ctx: Store,
},
),
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 89..91,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 90..91,
id: Name("b"),
ctx: Store,
},
),
ctx: Store,
},
),
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 93..101,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 94..96,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 95..96,
id: Name("c"),
ctx: Store,
},
),
ctx: Store,
},
),
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 98..100,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 99..100,
id: Name("d"),
ctx: Store,
},
),
ctx: Store,
},
),
],
ctx: Store,
parenthesized: true,
},
),
],
ctx: Store,
parenthesized: true,
},
),
],
value: Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 105..111,
elts: [
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 106..107,
value: Int(
1,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 109..110,
value: Int(
2,
),
},
),
],
ctx: Load,
parenthesized: true,
},
),
},
),
],
},
)
```
## Semantic Syntax Errors
|
1 | (*a, *b) = (1, 2)
| ^^^^^^^^ Syntax Error: Two starred expressions in assignment
2 | [*a, *b] = (1, 2)
3 | (*a, *b, c) = (1, 2, 3)
|
|
1 | (*a, *b) = (1, 2)
2 | [*a, *b] = (1, 2)
| ^^^^^^^^ Syntax Error: Two starred expressions in assignment
3 | (*a, *b, c) = (1, 2, 3)
4 | [*a, *b, c] = (1, 2, 3)
|
|
1 | (*a, *b) = (1, 2)
2 | [*a, *b] = (1, 2)
3 | (*a, *b, c) = (1, 2, 3)
| ^^^^^^^^^^^ Syntax Error: Two starred expressions in assignment
4 | [*a, *b, c] = (1, 2, 3)
5 | (*a, *b, (*c, *d)) = (1, 2)
|
|
2 | [*a, *b] = (1, 2)
3 | (*a, *b, c) = (1, 2, 3)
4 | [*a, *b, c] = (1, 2, 3)
| ^^^^^^^^^^^ Syntax Error: Two starred expressions in assignment
5 | (*a, *b, (*c, *d)) = (1, 2)
|
|
3 | (*a, *b, c) = (1, 2, 3)
4 | [*a, *b, c] = (1, 2, 3)
5 | (*a, *b, (*c, *d)) = (1, 2)
| ^^^^^^^^^^^^^^^^^^ Syntax Error: Two starred expressions in assignment
|
|
3 | (*a, *b, c) = (1, 2, 3)
4 | [*a, *b, c] = (1, 2, 3)
5 | (*a, *b, (*c, *d)) = (1, 2)
| ^^^^^^^^ Syntax Error: Two starred expressions in assignment
|

View File

@@ -0,0 +1,188 @@
---
source: crates/ruff_python_parser/tests/fixtures.rs
input_file: crates/ruff_python_parser/resources/inline/ok/multiple_starred_assignment_target.py
---
## AST
```
Module(
ModModule {
node_index: NodeIndex(None),
range: 0..46,
body: [
Assign(
StmtAssign {
node_index: NodeIndex(None),
range: 0..16,
targets: [
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 0..7,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 1..3,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 2..3,
id: Name("a"),
ctx: Store,
},
),
ctx: Store,
},
),
Name(
ExprName {
node_index: NodeIndex(None),
range: 5..6,
id: Name("b"),
ctx: Store,
},
),
],
ctx: Store,
parenthesized: true,
},
),
],
value: Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 10..16,
elts: [
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 11..12,
value: Int(
1,
),
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 14..15,
value: Int(
2,
),
},
),
],
ctx: Load,
parenthesized: true,
},
),
},
),
Assign(
StmtAssign {
node_index: NodeIndex(None),
range: 17..45,
targets: [
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 17..33,
elts: [
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 17..29,
elts: [
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 18..20,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 19..20,
id: Name("_"),
ctx: Store,
},
),
ctx: Store,
},
),
Name(
ExprName {
node_index: NodeIndex(None),
range: 22..28,
id: Name("normed"),
ctx: Store,
},
),
],
ctx: Store,
parenthesized: true,
},
),
Starred(
ExprStarred {
node_index: NodeIndex(None),
range: 31..33,
value: Name(
ExprName {
node_index: NodeIndex(None),
range: 32..33,
id: Name("_"),
ctx: Store,
},
),
ctx: Store,
},
),
],
ctx: Store,
parenthesized: false,
},
),
],
value: List(
ExprList {
node_index: NodeIndex(None),
range: 36..45,
elts: [
Tuple(
ExprTuple {
node_index: NodeIndex(None),
range: 37..41,
elts: [
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 38..39,
value: Int(
1,
),
},
),
],
ctx: Load,
parenthesized: true,
},
),
NumberLiteral(
ExprNumberLiteral {
node_index: NodeIndex(None),
range: 43..44,
value: Int(
2,
),
},
),
],
ctx: Load,
},
),
},
),
],
},
)
```


@@ -1,6 +1,6 @@
[package]
name = "ruff_wasm"
version = "0.13.1"
version = "0.13.2"
publish = false
authors = { workspace = true }
edition = { workspace = true }


@@ -19,7 +19,7 @@ There are multiple versions for the different wasm-pack targets. See [here](http
This example uses the wasm-pack web target and is known to work with Vite.
```ts
import init, { Workspace, type Diagnostic } from '@astral-sh/ruff-wasm-web';
import init, { Workspace, type Diagnostic, PositionEncoding } from '@astral-sh/ruff-wasm-web';
const exampleDocument = `print('hello'); print("world")`
@@ -42,7 +42,7 @@ const workspace = new Workspace({
'F'
],
},
});
}, PositionEncoding.UTF16);
// Will contain 1 diagnostic code for E702: Multiple statements on one line
const diagnostics: Diagnostic[] = workspace.check(exampleDocument);


@@ -19,7 +19,7 @@ use ruff_python_formatter::{PyFormatContext, QuoteStyle, format_module_ast, pret
use ruff_python_index::Indexer;
use ruff_python_parser::{Mode, ParseOptions, Parsed, parse, parse_unchecked};
use ruff_python_trivia::CommentRanges;
use ruff_source_file::{LineColumn, OneIndexed};
use ruff_source_file::{OneIndexed, PositionEncoding as SourcePositionEncoding, SourceLocation};
use ruff_text_size::Ranged;
use ruff_workspace::Settings;
use ruff_workspace::configuration::Configuration;
@@ -117,6 +117,7 @@ pub fn run() {
#[wasm_bindgen]
pub struct Workspace {
settings: Settings,
position_encoding: SourcePositionEncoding,
}
#[wasm_bindgen]
@@ -126,7 +127,7 @@ impl Workspace {
}
#[wasm_bindgen(constructor)]
pub fn new(options: JsValue) -> Result<Workspace, Error> {
pub fn new(options: JsValue, position_encoding: PositionEncoding) -> Result<Workspace, Error> {
let options: Options = serde_wasm_bindgen::from_value(options).map_err(into_error)?;
let configuration =
Configuration::from_options(options, Some(Path::new(".")), Path::new("."))
@@ -135,7 +136,10 @@ impl Workspace {
.into_settings(Path::new("."))
.map_err(into_error)?;
Ok(Workspace { settings })
Ok(Workspace {
settings,
position_encoding: position_encoding.into(),
})
}
#[wasm_bindgen(js_name = defaultSettings)]
@@ -228,27 +232,34 @@ impl Workspace {
let messages: Vec<ExpandedMessage> = diagnostics
.into_iter()
.map(|msg| ExpandedMessage {
code: msg.secondary_code_or_id().to_string(),
message: msg.body().to_string(),
start_location: source_code
.line_column(msg.range().unwrap_or_default().start())
.into(),
end_location: source_code
.line_column(msg.range().unwrap_or_default().end())
.into(),
fix: msg.fix().map(|fix| ExpandedFix {
message: msg.first_help_text().map(ToString::to_string),
edits: fix
.edits()
.iter()
.map(|edit| ExpandedEdit {
location: source_code.line_column(edit.start()).into(),
end_location: source_code.line_column(edit.end()).into(),
content: edit.content().map(ToString::to_string),
})
.collect(),
}),
.map(|msg| {
let range = msg.range().unwrap_or_default();
ExpandedMessage {
code: msg.secondary_code_or_id().to_string(),
message: msg.body().to_string(),
start_location: source_code
.source_location(range.start(), self.position_encoding)
.into(),
end_location: source_code
.source_location(range.end(), self.position_encoding)
.into(),
fix: msg.fix().map(|fix| ExpandedFix {
message: msg.first_help_text().map(ToString::to_string),
edits: fix
.edits()
.iter()
.map(|edit| ExpandedEdit {
location: source_code
.source_location(edit.start(), self.position_encoding)
.into(),
end_location: source_code
.source_location(edit.end(), self.position_encoding)
.into(),
content: edit.content().map(ToString::to_string),
})
.collect(),
}),
}
})
.collect();
@@ -331,14 +342,37 @@ impl<'a> ParsedModule<'a> {
#[derive(Serialize, Deserialize, Eq, PartialEq, Debug)]
pub struct Location {
pub row: OneIndexed,
/// The character offset from the start of the line.
///
/// The semantics of the offset depend on the [`PositionEncoding`] used when creating
/// the [`Workspace`].
pub column: OneIndexed,
}
impl From<LineColumn> for Location {
fn from(value: LineColumn) -> Self {
impl From<SourceLocation> for Location {
fn from(value: SourceLocation) -> Self {
Self {
row: value.line,
column: value.column,
column: value.character_offset,
}
}
}
#[derive(Default, Copy, Clone)]
#[wasm_bindgen]
pub enum PositionEncoding {
#[default]
Utf8,
Utf16,
Utf32,
}
impl From<PositionEncoding> for SourcePositionEncoding {
fn from(value: PositionEncoding) -> Self {
match value {
PositionEncoding::Utf8 => Self::Utf8,
PositionEncoding::Utf16 => Self::Utf16,
PositionEncoding::Utf32 => Self::Utf32,
}
}
}
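
To see why the new `position_encoding` parameter matters, here is a minimal Python sketch (the helper name is hypothetical, not part of ruff) of how the same character lands on a different column offset under each encoding:

```python
def column_offset(line: str, char_index: int, encoding: str) -> int:
    """Offset of `char_index` within `line`, counted in the encoding's code units."""
    prefix = line[:char_index]
    if encoding == "utf8":
        return len(prefix.encode("utf-8"))
    if encoding == "utf16":
        return len(prefix.encode("utf-16-le")) // 2  # 2 bytes per UTF-16 code unit
    if encoding == "utf32":
        return len(prefix)  # one code point per UTF-32 code unit
    raise ValueError(encoding)

line = "x = '🐍'; y"
col = line.index("y")
# The emoji occupies 4 UTF-8 bytes, 2 UTF-16 code units, and 1 UTF-32 code unit,
# so the column reported for `y` depends on the chosen encoding:
print(column_offset(line, col, "utf8"))   # 12
print(column_offset(line, col, "utf16"))  # 10
print(column_offset(line, col, "utf32"))  # 9
```

Editors that speak LSP typically expect UTF-16 offsets, which is why the README example above passes `PositionEncoding.UTF16`.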


@@ -4,12 +4,15 @@ use wasm_bindgen_test::wasm_bindgen_test;
use ruff_linter::registry::Rule;
use ruff_source_file::OneIndexed;
use ruff_wasm::{ExpandedMessage, Location, Workspace};
use ruff_wasm::{ExpandedMessage, Location, PositionEncoding, Workspace};
macro_rules! check {
($source:expr, $config:expr, $expected:expr) => {{
let config = js_sys::JSON::parse($config).unwrap();
match Workspace::new(config).unwrap().check($source) {
match Workspace::new(config, PositionEncoding::Utf8)
.unwrap()
.check($source)
{
Ok(output) => {
let result: Vec<ExpandedMessage> = serde_wasm_bindgen::from_value(output).unwrap();
assert_eq!(result, $expected);


@@ -55,7 +55,8 @@ use crate::options::{
PydoclintOptions, PydocstyleOptions, PyflakesOptions, PylintOptions, RuffOptions,
};
use crate::settings::{
EXCLUDE, FileResolverSettings, FormatterSettings, INCLUDE, LineEnding, Settings,
EXCLUDE, FileResolverSettings, FormatterSettings, INCLUDE, INCLUDE_PREVIEW, LineEnding,
Settings,
};
#[derive(Clone, Debug, Default)]
@@ -274,9 +275,14 @@ impl Configuration {
extend_exclude: FilePatternSet::try_from_iter(self.extend_exclude)?,
extend_include: FilePatternSet::try_from_iter(self.extend_include)?,
force_exclude: self.force_exclude.unwrap_or(false),
include: FilePatternSet::try_from_iter(
self.include.unwrap_or_else(|| INCLUDE.to_vec()),
)?,
include: match global_preview {
PreviewMode::Disabled => FilePatternSet::try_from_iter(
self.include.unwrap_or_else(|| INCLUDE.to_vec()),
)?,
PreviewMode::Enabled => FilePatternSet::try_from_iter(
self.include.unwrap_or_else(|| INCLUDE_PREVIEW.to_vec()),
)?,
},
respect_gitignore: self.respect_gitignore.unwrap_or(true),
project_root: project_root.to_path_buf(),
},


@@ -251,7 +251,7 @@ pub struct Options {
///
/// For more information on the glob syntax, refer to the [`globset` documentation](https://docs.rs/globset/latest/globset/#syntax).
#[option(
default = r#"["*.py", "*.pyi", "*.ipynb", "**/pyproject.toml"]"#,
default = r#"["*.py", "*.pyi", "*.pyw", "*.ipynb", "**/pyproject.toml"]"#,
value_type = "list[str]",
example = r#"
include = ["*.py"]


@@ -427,7 +427,7 @@ impl From<ConfigurationOrigin> for Relativity {
}
}
/// Find all Python (`.py`, `.pyi` and `.ipynb` files) in a set of paths.
/// Find all Python (`.py`, `.pyi`, `.pyw`, and `.ipynb` files) in a set of paths.
pub fn python_files_in_path<'a>(
paths: &[PathBuf],
pyproject_config: &'a PyprojectConfig,


@@ -144,6 +144,13 @@ pub(crate) static INCLUDE: &[FilePattern] = &[
FilePattern::Builtin("*.ipynb"),
FilePattern::Builtin("**/pyproject.toml"),
];
pub(crate) static INCLUDE_PREVIEW: &[FilePattern] = &[
FilePattern::Builtin("*.py"),
FilePattern::Builtin("*.pyi"),
FilePattern::Builtin("*.pyw"),
FilePattern::Builtin("*.ipynb"),
FilePattern::Builtin("**/pyproject.toml"),
];
impl FileResolverSettings {
fn new(project_root: &Path) -> Self {
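
The effect of the new preview pattern set can be approximated with Python's `fnmatch` (a rough analogue only; ruff actually uses the `globset` crate for matching):

```python
import fnmatch

# Mirrors the INCLUDE_PREVIEW patterns added above.
INCLUDE_PREVIEW = ["*.py", "*.pyi", "*.pyw", "*.ipynb", "**/pyproject.toml"]

def is_included(filename: str) -> bool:
    # A file is picked up if it matches any builtin include pattern.
    return any(fnmatch.fnmatch(filename, pattern) for pattern in INCLUDE_PREVIEW)

print(is_included("gui.pyw"))    # True under preview; the stable set lacks *.pyw
print(is_included("notes.txt"))  # False
```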


@@ -31,13 +31,7 @@ mypy_primer \
```
This will show the diagnostics diff for the `black` project between the `main` branch and your `my/feature` branch. To run the
diff for all projects we currently enable in CI, run
```sh
SELECTOR="$(paste -s -d'|' "crates/ty_python_semantic/resources/primer/good.txt" | sed -e 's@(@\\(@g' -e 's@)@\\)@g')"
```
and then use `--project-selector "$SELECTOR"`.
diff for all projects we currently enable in CI, use `--project-selector "/($(paste -s -d'|' crates/ty_python_semantic/resources/primer/good.txt))\$"`.
You can also take a look at the [full list of ecosystem projects]. Note that some of them might still need a `ty_paths` configuration
option to work correctly.
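
The selector that the inline shell expression builds can be sketched in Python (assuming `good.txt` holds one project name per line, as the `paste` pipeline does):

```python
def build_selector(project_names: list[str]) -> str:
    # Mirrors `paste -s -d'|'` plus the surrounding `/(...)$` anchor,
    # yielding a regex that matches any of the listed project URLs.
    return "/(" + "|".join(project_names) + ")$"

print(build_selector(["black", "mypy"]))  # /(black|mypy)$
```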


@@ -632,7 +632,8 @@ fn gitlab_diagnostics() -> anyhow::Result<()> {
settings.add_filter(r#"("fingerprint": ")[a-z0-9]+(",)"#, "$1[FINGERPRINT]$2");
let _s = settings.bind_to_scope();
assert_cmd_snapshot!(case.command().arg("--output-format=gitlab").arg("--warn").arg("unresolved-reference"), @r#"
assert_cmd_snapshot!(case.command().arg("--output-format=gitlab").arg("--warn").arg("unresolved-reference")
.env("CI_PROJECT_DIR", case.project_dir), @r#"
success: false
exit_code: 1
----- stdout -----
@@ -697,8 +698,8 @@ fn github_diagnostics() -> anyhow::Result<()> {
success: false
exit_code: 1
----- stdout -----
::warning title=ty (unresolved-reference),file=test.py,line=2,col=7,endLine=2,endColumn=8::test.py:2:7: unresolved-reference: Name `x` used when not defined
::error title=ty (non-subscriptable),file=test.py,line=3,col=7,endLine=3,endColumn=8::test.py:3:7: non-subscriptable: Cannot subscript object of type `Literal[4]` with no `__getitem__` method
::warning title=ty (unresolved-reference),file=<temp_dir>/test.py,line=2,col=7,endLine=2,endColumn=8::test.py:2:7: unresolved-reference: Name `x` used when not defined
::error title=ty (non-subscriptable),file=<temp_dir>/test.py,line=3,col=7,endLine=3,endColumn=8::test.py:3:7: non-subscriptable: Cannot subscript object of type `Literal[4]` with no `__getitem__` method
----- stderr -----
WARN ty is pre-release software and not ready for production use. Expect to encounter bugs, missing features, and fatal errors.


@@ -139,6 +139,15 @@ reveal_type(n) # revealed: list[Literal[1, 2, 3]]
# error: [invalid-assignment] "Object of type `list[Unknown | str]` is not assignable to `list[LiteralString]`"
o: list[typing.LiteralString] = ["a", "b", "c"]
reveal_type(o) # revealed: list[LiteralString]
p: dict[int, int] = {}
reveal_type(p) # revealed: dict[int, int]
q: dict[int | str, int] = {1: 1, 2: 2, 3: 3}
reveal_type(q) # revealed: dict[int | str, int]
r: dict[int | str, int | str] = {1: 1, 2: 2, 3: 3}
reveal_type(r) # revealed: dict[int | str, int | str]
```
## Incorrect collection literal assignments are complained about


@@ -57,7 +57,7 @@ type("Foo", Base, {})
# error: [invalid-argument-type] "Argument to class `type` is incorrect: Expected `tuple[type, ...]`, found `tuple[Literal[1], Literal[2]]`"
type("Foo", (1, 2), {})
# TODO: this should be an error
# error: [invalid-argument-type] "Argument to class `type` is incorrect: Expected `dict[str, Any]`, found `dict[Unknown | bytes, Unknown | int]`"
type("Foo", (Base,), {b"attr": 1})
```
@@ -162,22 +162,3 @@ def _(x: A | B, y: list[int]):
reveal_type(x) # revealed: B & ~A
reveal_type(isinstance(x, B)) # revealed: Literal[True]
```
## Calls to `open()`
We do not fully understand typeshed's overloads for `open()` yet, due to missing support for PEP-613
type aliases. However, we also do not emit false-positive diagnostics on common calls to `open()`:
```py
import pickle
reveal_type(open("")) # revealed: TextIOWrapper[_WrappedBuffer]
reveal_type(open("", "r")) # revealed: TextIOWrapper[_WrappedBuffer]
reveal_type(open("", "rb")) # revealed: @Todo(`builtins.open` return type)
with open("foo.pickle", "rb") as f:
x = pickle.load(f) # fine
def _(mode: str):
reveal_type(open("", mode)) # revealed: @Todo(`builtins.open` return type)
```


@@ -642,6 +642,96 @@ def f(*args: int) -> int:
reveal_type(f("foo")) # revealed: int
```
### Variadic argument, variadic parameter
```toml
[environment]
python-version = "3.11"
```
```py
def f(*args: int) -> int:
return 1
def _(args: list[str]) -> None:
# error: [invalid-argument-type] "Argument to function `f` is incorrect: Expected `int`, found `str`"
reveal_type(f(*args)) # revealed: int
```
Consider a few different shapes of tuple for the splatted argument:
```py
def f1(*args: str): ...
def _(
args1: tuple[str, ...],
args2: tuple[str, *tuple[str, ...]],
args3: tuple[str, *tuple[str, ...], str],
args4: tuple[int, *tuple[str, ...]],
args5: tuple[int, *tuple[str, ...], str],
args6: tuple[*tuple[str, ...], str],
args7: tuple[*tuple[str, ...], int],
args8: tuple[int, *tuple[str, ...], int],
args9: tuple[str, *tuple[str, ...], int],
args10: tuple[str, *tuple[int, ...], str],
):
f1(*args1)
f1(*args2)
f1(*args3)
f1(*args4) # error: [invalid-argument-type]
f1(*args5) # error: [invalid-argument-type]
f1(*args6)
f1(*args7) # error: [invalid-argument-type]
# There are two errors here because the tuple type of `args8` has two fixed
# elements, both of which are `int`
# error: [invalid-argument-type]
# error: [invalid-argument-type]
f1(*args8)
f1(*args9) # error: [invalid-argument-type]
f1(*args10) # error: [invalid-argument-type]
```
### Mixed argument and parameter containing variadic
```toml
[environment]
python-version = "3.11"
```
```py
def f(x: int, *args: str) -> int:
return 1
def _(
args1: list[int],
args2: tuple[int],
args3: tuple[int, int],
args4: tuple[int, ...],
args5: tuple[int, *tuple[str, ...]],
args6: tuple[int, int, *tuple[str, ...]],
) -> None:
# error: [invalid-argument-type] "Argument to function `f` is incorrect: Expected `str`, found `int`"
reveal_type(f(*args1)) # revealed: int
# This shouldn't raise an error because the unpacking doesn't match the variadic parameter.
reveal_type(f(*args2)) # revealed: int
    # But this one should, because the second tuple element is not assignable.
# error: [invalid-argument-type] "Argument to function `f` is incorrect: Expected `str`, found `int`"
reveal_type(f(*args3)) # revealed: int
# error: [invalid-argument-type] "Argument to function `f` is incorrect: Expected `str`, found `int`"
reveal_type(f(*args4)) # revealed: int
# The first element of the tuple matches the required argument;
# all subsequent elements match the variadic argument
reveal_type(f(*args5)) # revealed: int
# error: [invalid-argument-type] "Argument to function `f` is incorrect: Expected `str`, found `int`"
reveal_type(f(*args6)) # revealed: int
```
### Keyword argument, positional-or-keyword parameter
```py


@@ -0,0 +1,68 @@
# Calls to `open()`
## `builtins.open`
We do not fully understand typeshed's overloads for `open()` yet, due to missing support for PEP-613
type aliases. However, we also do not emit false-positive diagnostics on common calls to `open()`:
```py
import pickle
reveal_type(open("")) # revealed: TextIOWrapper[_WrappedBuffer]
reveal_type(open("", "r")) # revealed: TextIOWrapper[_WrappedBuffer]
reveal_type(open("", "rb")) # revealed: @Todo(`builtins.open` return type)
with open("foo.pickle", "rb") as f:
x = pickle.load(f) # fine
def _(mode: str):
reveal_type(open("", mode)) # revealed: @Todo(`builtins.open` return type)
```
## `os.fdopen`
The same is true for `os.fdopen()`:
```py
import pickle
import os
reveal_type(os.fdopen(0)) # revealed: TextIOWrapper[_WrappedBuffer]
reveal_type(os.fdopen(0, "r")) # revealed: TextIOWrapper[_WrappedBuffer]
reveal_type(os.fdopen(0, "rb")) # revealed: @Todo(`os.fdopen` return type)
with os.fdopen(0, "rb") as f:
x = pickle.load(f) # fine
```
## `Path.open`
And similarly for `Path.open()`:
```py
from pathlib import Path
import pickle
reveal_type(Path("").open()) # revealed: @Todo(`Path.open` return type)
reveal_type(Path("").open("r")) # revealed: @Todo(`Path.open` return type)
reveal_type(Path("").open("rb")) # revealed: @Todo(`Path.open` return type)
with Path("foo.pickle").open("rb") as f:
x = pickle.load(f) # fine
```
## `NamedTemporaryFile`
And similarly for `tempfile.NamedTemporaryFile()`:
```py
from tempfile import NamedTemporaryFile
import pickle
reveal_type(NamedTemporaryFile()) # revealed: _TemporaryFileWrapper[bytes]
reveal_type(NamedTemporaryFile("r")) # revealed: _TemporaryFileWrapper[str]
reveal_type(NamedTemporaryFile("rb")) # revealed: @Todo(`tempfile.NamedTemporaryFile` return type)
with NamedTemporaryFile("rb") as f:
x = pickle.load(f) # fine
```


@@ -36,6 +36,38 @@ from .foo import X
reveal_type(X) # revealed: int
```
## Simple With Stub and Implementation
This is a regression test for an issue with relative imports in implementation files when a stub is
also defined.
`package/__init__.py`:
```py
```
`package/foo.py`:
```py
X: int = 42
```
`package/bar.py`:
```py
from .foo import X
reveal_type(X) # revealed: int
```
`package/bar.pyi`:
```pyi
from .foo import X
reveal_type(X) # revealed: int
```
## Dotted
`package/__init__.py`:


@@ -3,7 +3,49 @@
## Empty dictionary
```py
reveal_type({}) # revealed: dict[@Todo(dict literal key type), @Todo(dict literal value type)]
reveal_type({}) # revealed: dict[Unknown, Unknown]
```
## Basic dict
```py
reveal_type({1: 1, 2: 1}) # revealed: dict[Unknown | int, Unknown | int]
```
## Dict of tuples
```py
reveal_type({1: (1, 2), 2: (3, 4)}) # revealed: dict[Unknown | int, Unknown | tuple[int, int]]
```
## Unpacked dict
```py
a = {"a": 1, "b": 2}
b = {"c": 3, "d": 4}
d = {**a, **b}
reveal_type(d) # revealed: dict[Unknown | str, Unknown | int]
```
## Dict of functions
```py
def a(_: int) -> int:
return 0
def b(_: int) -> int:
return 1
x = {1: a, 2: b}
reveal_type(x) # revealed: dict[Unknown | int, Unknown | ((_: int) -> int)]
```
## Mixed dict
```py
# revealed: dict[Unknown | str, Unknown | int | tuple[int, int] | tuple[int, int, int]]
reveal_type({"a": 1, "b": (1, 2), "c": (1, 2, 3)})
```
## Dict comprehensions


@@ -206,8 +206,7 @@ dd: defaultdict[int, int] = defaultdict(int)
dd[0] = 0
cm: ChainMap[int, int] = ChainMap({1: 1}, {0: 0})
cm[0] = 0
# TODO: should be ChainMap[int, int]
reveal_type(cm) # revealed: ChainMap[@Todo(dict literal key type), @Todo(dict literal value type)]
reveal_type(cm) # revealed: ChainMap[Unknown | int, Unknown | int]
reveal_type(l[0]) # revealed: Literal[0]
reveal_type(d[0]) # revealed: Literal[0]


@@ -1761,7 +1761,7 @@ class `T` has a method `m` which is assignable to the `Callable` supertype of th
```py
from typing import Protocol
from ty_extensions import is_subtype_of, static_assert
from ty_extensions import is_subtype_of, is_assignable_to, static_assert
class P(Protocol):
def m(self, x: int, /) -> None: ...
@@ -1769,16 +1769,43 @@ class P(Protocol):
class NominalSubtype:
def m(self, y: int) -> None: ...
class NominalSubtype2:
def m(self, *args: object) -> None: ...
class NotSubtype:
def m(self, x: int) -> int:
return 42
class NominalWithClassMethod:
@classmethod
def m(cls, x: int) -> None: ...
class NominalWithStaticMethod:
@staticmethod
def m(_, x: int) -> None: ...
class DefinitelyNotSubtype:
m = None
static_assert(is_subtype_of(NominalSubtype, P))
static_assert(not is_subtype_of(DefinitelyNotSubtype, P))
static_assert(not is_subtype_of(NotSubtype, P))
static_assert(is_subtype_of(NominalSubtype2, P))
static_assert(is_subtype_of(NominalSubtype | NominalSubtype2, P))
static_assert(not is_assignable_to(DefinitelyNotSubtype, P))
static_assert(not is_assignable_to(NotSubtype, P))
static_assert(not is_assignable_to(NominalSubtype | NotSubtype, P))
static_assert(not is_assignable_to(NominalSubtype2 | DefinitelyNotSubtype, P))
# `m` has the correct signature when accessed on instances of `NominalWithClassMethod`,
# but not when accessed on the class object `NominalWithClassMethod` itself
#
# TODO: these should pass
static_assert(not is_assignable_to(NominalWithClassMethod, P)) # error: [static-assert-error]
static_assert(not is_assignable_to(NominalSubtype | NominalWithClassMethod, P)) # error: [static-assert-error]
# Conversely, `m` has the correct signature when accessed on the class object
# `NominalWithStaticMethod`, but not when accessed on instances of `NominalWithStaticMethod`
static_assert(not is_assignable_to(NominalWithStaticMethod, P))
static_assert(not is_assignable_to(NominalSubtype | NominalWithStaticMethod, P))
```
A callable instance attribute is not sufficient for a type to satisfy a protocol with a method
@@ -1901,21 +1928,23 @@ from typing_extensions import TypeVar, Self, Protocol
from ty_extensions import is_equivalent_to, static_assert, is_assignable_to, is_subtype_of
class NewStyleClassScoped[T](Protocol):
def method(self, input: T) -> None: ...
def method(self: Self, input: T) -> None: ...
S = TypeVar("S")
class LegacyClassScoped(Protocol[S]):
def method(self, input: S) -> None: ...
def method(self: Self, input: S) -> None: ...
static_assert(is_equivalent_to(NewStyleClassScoped, LegacyClassScoped))
static_assert(is_equivalent_to(NewStyleClassScoped[int], LegacyClassScoped[int]))
# TODO: these should pass
static_assert(is_equivalent_to(NewStyleClassScoped, LegacyClassScoped)) # error: [static-assert-error]
static_assert(is_equivalent_to(NewStyleClassScoped[int], LegacyClassScoped[int])) # error: [static-assert-error]
class NominalGeneric[T]:
def method(self, input: T) -> None: ...
def _[T](x: T) -> T:
static_assert(is_equivalent_to(NewStyleClassScoped[T], LegacyClassScoped[T]))
# TODO: should pass
static_assert(is_equivalent_to(NewStyleClassScoped[T], LegacyClassScoped[T])) # error: [static-assert-error]
static_assert(is_subtype_of(NominalGeneric[T], NewStyleClassScoped[T]))
static_assert(is_subtype_of(NominalGeneric[T], LegacyClassScoped[T]))
return x
@@ -1989,17 +2018,27 @@ class NominalReturningSelfNotGeneric:
# TODO: should pass
static_assert(is_equivalent_to(LegacyFunctionScoped, NewStyleFunctionScoped)) # error: [static-assert-error]
static_assert(is_subtype_of(NominalNewStyle, NewStyleFunctionScoped))
static_assert(is_subtype_of(NominalNewStyle, LegacyFunctionScoped))
static_assert(is_assignable_to(NominalNewStyle, NewStyleFunctionScoped))
static_assert(is_assignable_to(NominalNewStyle, LegacyFunctionScoped))
# TODO: should pass
static_assert(is_subtype_of(NominalNewStyle, NewStyleFunctionScoped)) # error: [static-assert-error]
# TODO: should pass
static_assert(is_subtype_of(NominalNewStyle, LegacyFunctionScoped)) # error: [static-assert-error]
static_assert(not is_assignable_to(NominalNewStyle, UsesSelf))
static_assert(is_subtype_of(NominalLegacy, NewStyleFunctionScoped))
static_assert(is_subtype_of(NominalLegacy, LegacyFunctionScoped))
static_assert(is_assignable_to(NominalLegacy, NewStyleFunctionScoped))
static_assert(is_assignable_to(NominalLegacy, LegacyFunctionScoped))
# TODO: should pass
static_assert(is_subtype_of(NominalLegacy, NewStyleFunctionScoped)) # error: [static-assert-error]
# TODO: should pass
static_assert(is_subtype_of(NominalLegacy, LegacyFunctionScoped)) # error: [static-assert-error]
static_assert(not is_assignable_to(NominalLegacy, UsesSelf))
static_assert(not is_assignable_to(NominalWithSelf, NewStyleFunctionScoped))
static_assert(not is_assignable_to(NominalWithSelf, LegacyFunctionScoped))
static_assert(is_subtype_of(NominalWithSelf, UsesSelf))
static_assert(is_assignable_to(NominalWithSelf, UsesSelf))
# TODO: should pass
static_assert(is_subtype_of(NominalWithSelf, UsesSelf)) # error: [static-assert-error]
# TODO: these should pass
static_assert(not is_assignable_to(NominalNotGeneric, NewStyleFunctionScoped)) # error: [static-assert-error]
@@ -2008,8 +2047,117 @@ static_assert(not is_assignable_to(NominalNotGeneric, UsesSelf))
static_assert(not is_assignable_to(NominalReturningSelfNotGeneric, NewStyleFunctionScoped))
static_assert(not is_assignable_to(NominalReturningSelfNotGeneric, LegacyFunctionScoped))
# TODO: should pass
static_assert(not is_assignable_to(NominalReturningSelfNotGeneric, UsesSelf)) # error: [static-assert-error]
# These test cases are taken from the typing conformance suite:
class ShapeProtocolImplicitSelf(Protocol):
def set_scale(self, scale: float) -> Self: ...
class ShapeProtocolExplicitSelf(Protocol):
def set_scale(self: Self, scale: float) -> Self: ...
class BadReturnType:
def set_scale(self, scale: float) -> int:
return 42
static_assert(not is_assignable_to(BadReturnType, ShapeProtocolImplicitSelf))
static_assert(not is_assignable_to(BadReturnType, ShapeProtocolExplicitSelf))
```
## Subtyping of protocols with `@classmethod` or `@staticmethod` members
The typing spec states that protocols may have `@classmethod` or `@staticmethod` method members.
However, as of 2025/09/24, the spec does not elaborate on how these members should behave with
regard to subtyping and assignability (nor are there any tests in the typing conformance suite).
Ty's behaviour is therefore derived from first principles and the
[mypy test suite](https://github.com/python/mypy/blob/354bea6352ee7a38b05e2f42c874e7d1f7bf557a/test-data/unit/check-protocols.test#L1231-L1263).
A protocol `P` with a `@classmethod` method member `x` can only be satisfied by a nominal type `N`
if `N.x` is a callable object that evaluates to the same type whether it is accessed on inhabitants
of `N` or inhabitants of `type[N]`, *and* the signature of `N.x` is equivalent to the signature of
`P.x` after the descriptor protocol has been invoked on `P.x`:
```py
from typing import Protocol
from ty_extensions import static_assert, is_subtype_of, is_assignable_to, is_equivalent_to, is_disjoint_from
class PClassMethod(Protocol):
@classmethod
def x(cls, val: int) -> str: ...
class PStaticMethod(Protocol):
@staticmethod
def x(val: int) -> str: ...
class NNotCallable:
x = None
class NInstanceMethod:
def x(self, val: int) -> str:
return "foo"
class NClassMethodGood:
@classmethod
def x(cls, val: int) -> str:
return "foo"
class NClassMethodBad:
@classmethod
def x(cls, val: str) -> int:
return 42
class NStaticMethodGood:
@staticmethod
def x(val: int) -> str:
return "foo"
class NStaticMethodBad:
@staticmethod
def x(cls, val: int) -> str:
return "foo"
# `PClassMethod.x` and `PStaticMethod.x` evaluate to callable types with equivalent signatures
# whether you access them on the protocol class or instances of the protocol.
# That means that they are equivalent protocols!
static_assert(is_equivalent_to(PClassMethod, PStaticMethod))
# TODO: these should all pass
static_assert(not is_assignable_to(NNotCallable, PClassMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NNotCallable, PStaticMethod)) # error: [static-assert-error]
static_assert(is_disjoint_from(NNotCallable, PClassMethod)) # error: [static-assert-error]
static_assert(is_disjoint_from(NNotCallable, PStaticMethod)) # error: [static-assert-error]
# `NInstanceMethod.x` has the correct type when accessed on an instance of
# `NInstanceMethod`, but not when accessed on the class object itself
#
# TODO: these should pass
static_assert(not is_assignable_to(NInstanceMethod, PClassMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NInstanceMethod, PStaticMethod)) # error: [static-assert-error]
# A nominal type with a `@staticmethod` can satisfy a protocol with a `@classmethod`
# if the staticmethod duck-types the same as the classmethod member, both when
# accessed on the class and when accessed on an instance of the class. The same
# also applies for a nominal type with a `@classmethod` and a protocol with a
# `@staticmethod` member.
static_assert(is_assignable_to(NClassMethodGood, PClassMethod))
static_assert(is_assignable_to(NClassMethodGood, PStaticMethod))
# TODO: these should all pass:
static_assert(is_subtype_of(NClassMethodGood, PClassMethod)) # error: [static-assert-error]
static_assert(is_subtype_of(NClassMethodGood, PStaticMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NClassMethodBad, PClassMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NClassMethodBad, PStaticMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NClassMethodGood | NClassMethodBad, PClassMethod)) # error: [static-assert-error]
static_assert(is_assignable_to(NStaticMethodGood, PClassMethod))
static_assert(is_assignable_to(NStaticMethodGood, PStaticMethod))
# TODO: these should all pass:
static_assert(is_subtype_of(NStaticMethodGood, PClassMethod)) # error: [static-assert-error]
static_assert(is_subtype_of(NStaticMethodGood, PStaticMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NStaticMethodBad, PClassMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NStaticMethodBad, PStaticMethod)) # error: [static-assert-error]
static_assert(not is_assignable_to(NStaticMethodGood | NStaticMethodBad, PStaticMethod)) # error: [static-assert-error]
```
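
The equivalence of `PClassMethod` and `PStaticMethod` asserted above can be observed in plain CPython, independent of ty's model: once the descriptor protocol binds `cls`, the classmethod and the staticmethod expose the same call shape on both the class and its instances:

```python
import inspect

class WithClassMethod:
    @classmethod
    def x(cls, val: int) -> str:
        return "foo"

class WithStaticMethod:
    @staticmethod
    def x(val: int) -> str:
        return "foo"

# After attribute access runs the descriptor protocol, all four lookups
# present the identical signature `(val: int) -> str`:
for attr in (WithClassMethod.x, WithClassMethod().x,
             WithStaticMethod.x, WithStaticMethod().x):
    print(inspect.signature(attr))  # (val: int) -> str
```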
## Equivalence of protocols with method or property members
@@ -2846,7 +2994,6 @@ Add tests for:
- Protocols with instance-method members, including:
- Protocols with methods that have parameters or the return type unannotated
- Protocols with methods that have parameters or the return type annotated with `Any`
- Protocols with `@classmethod` and `@staticmethod`
- Assignability of non-instance types to protocols with instance-method members (e.g. a
class-literal type can be a subtype of `Sized` if its metaclass has a `__len__` method)
- Protocols with methods that have annotated `self` parameters.


@@ -85,6 +85,34 @@ alice["extra"] = True
bob["extra"] = True
```
## Nested `TypedDict`
Nested `TypedDict` fields are also supported.
```py
from typing import TypedDict
class Inner(TypedDict):
name: str
age: int | None
class Person(TypedDict):
inner: Inner
```
```py
alice: Person = {"inner": {"name": "Alice", "age": 30}}
reveal_type(alice["inner"]["name"]) # revealed: str
reveal_type(alice["inner"]["age"]) # revealed: int | None
# error: [invalid-key] "Invalid key access on TypedDict `Inner`: Unknown key "non_existing""
reveal_type(alice["inner"]["non_existing"]) # revealed: Unknown
# error: [invalid-key] "Invalid key access on TypedDict `Inner`: Unknown key "extra""
alice: Person = {"inner": {"name": "Alice", "age": 30, "extra": 1}}
```
## Validation of `TypedDict` construction
```py


@@ -33,9 +33,7 @@ colour
com2ann
comtypes
core
CPython (Argument Clinic)
CPython (cases_generator)
CPython (peg_generator)
cpython
cwltool
dacite
dd-trace-py


@@ -318,6 +318,9 @@ pub enum KnownModule {
TypingExtensions,
Typing,
Sys,
Os,
Tempfile,
Pathlib,
Abc,
Dataclasses,
Collections,
@@ -347,6 +350,9 @@ impl KnownModule {
Self::Typeshed => "_typeshed",
Self::TypingExtensions => "typing_extensions",
Self::Sys => "sys",
Self::Os => "os",
Self::Tempfile => "tempfile",
Self::Pathlib => "pathlib",
Self::Abc => "abc",
Self::Dataclasses => "dataclasses",
Self::Collections => "collections",

View File

@@ -19,7 +19,7 @@ use rustc_hash::{FxBuildHasher, FxHashSet};
use ruff_db::files::{File, FilePath, FileRootKind};
use ruff_db::system::{DirectoryEntry, System, SystemPath, SystemPathBuf};
use ruff_db::vendored::VendoredFileSystem;
use ruff_python_ast::PythonVersion;
use ruff_python_ast::{PySourceType, PythonVersion};
use crate::db::Db;
use crate::module_name::ModuleName;
@@ -155,17 +155,27 @@ pub(crate) fn file_to_module(db: &dyn Db, file: File) -> Option<Module<'_>> {
let module_file = module.file(db)?;
if file.path(db) == module_file.path(db) {
Some(module)
} else {
// This path is for a module with the same name but with a different precedence. For example:
// ```
// src/foo.py
// src/foo/__init__.py
// ```
// The module name of `src/foo.py` is `foo`, but the module loaded by Python is `src/foo/__init__.py`.
// That means we need to ignore `src/foo.py` even though it resolves to the same module name.
None
return Some(module);
} else if file.source_type(db) == PySourceType::Python
&& module_file.source_type(db) == PySourceType::Stub
{
// If a .py and .pyi are both defined, the .pyi will be the one returned by `resolve_module().file`,
// which would make us erroneously believe the `.py` is *not* also this module (breaking things
// like relative imports). So here we try `resolve_real_module().file` to cover both cases.
let module = resolve_real_module(db, &module_name)?;
let module_file = module.file(db)?;
if file.path(db) == module_file.path(db) {
return Some(module);
}
}
// This path is for a module with the same name but with a different precedence. For example:
// ```
// src/foo.py
// src/foo/__init__.py
// ```
// The module name of `src/foo.py` is `foo`, but the module loaded by Python is `src/foo/__init__.py`.
// That means we need to ignore `src/foo.py` even though it resolves to the same module name.
None
}
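The precedence the comment describes can be observed at runtime: when a path entry contains both `foo.py` and `foo/__init__.py`, CPython's path finder resolves the package. A minimal sketch using a hypothetical temporary layout:

```python
import importlib
import os
import sys
import tempfile

# Hypothetical layout reproducing the comment above:
#   src/foo.py            (plain module)
#   src/foo/__init__.py   (package)
src = tempfile.mkdtemp()
with open(os.path.join(src, "foo.py"), "w") as f:
    f.write("origin = 'module'\n")
os.makedirs(os.path.join(src, "foo"))
with open(os.path.join(src, "foo", "__init__.py"), "w") as f:
    f.write("origin = 'package'\n")

sys.path.insert(0, src)
foo = importlib.import_module("foo")

# The path finder checks for a package directory before a `.py` file,
# so `src/foo.py` is shadowed by `src/foo/__init__.py`.
assert foo.origin == "package"
```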
pub(crate) fn search_paths(db: &dyn Db, resolve_mode: ModuleResolveMode) -> SearchPathIterator<'_> {
@@ -1576,6 +1586,7 @@ mod tests {
let TestCase { db, src, .. } = TestCaseBuilder::new().with_src_files(SRC).build();
let foo = resolve_module(&db, &ModuleName::new_static("foo").unwrap()).unwrap();
let foo_real = resolve_real_module(&db, &ModuleName::new_static("foo").unwrap()).unwrap();
let foo_stub = src.join("foo.pyi");
assert_eq!(&src, foo.search_path(&db).unwrap());
@@ -1583,9 +1594,10 @@ mod tests {
assert_eq!(Some(foo), path_to_module(&db, &FilePath::System(foo_stub)));
assert_eq!(
None,
Some(foo_real),
path_to_module(&db, &FilePath::System(src.join("foo.py")))
);
assert!(foo_real != foo);
}
#[test]

View File

@@ -849,6 +849,28 @@ impl<'db> Type<'db> {
matches!(self, Type::Dynamic(_))
}
// If the type is a specialized instance of the given `KnownClass`, returns the specialization.
pub(crate) fn known_specialization(
self,
known_class: KnownClass,
db: &'db dyn Db,
) -> Option<Specialization<'db>> {
let class_type = match self {
Type::NominalInstance(instance) => instance,
Type::TypeAlias(alias) => alias.value_type(db).into_nominal_instance()?,
_ => return None,
}
.class(db);
if !class_type.is_known(db, known_class) {
return None;
}
class_type
.into_generic_alias()
.map(|generic_alias| generic_alias.specialization(db))
}
/// Returns the top materialization (or upper bound materialization) of this type, which is the
/// most general form of the type that is fully static.
#[must_use]
@@ -3447,6 +3469,11 @@ impl<'db> Type<'db> {
Type::KnownBoundMethod(KnownBoundMethodType::StrStartswith(literal)),
)
.into(),
Type::NominalInstance(instance)
if instance.has_known_class(db, KnownClass::Path) && name == "open" =>
{
Place::bound(Type::KnownBoundMethod(KnownBoundMethodType::PathOpen)).into()
}
Type::ClassLiteral(class)
if name == "__get__" && class.is_known(db, KnownClass::FunctionType) =>
@@ -6209,7 +6236,7 @@ impl<'db> Type<'db> {
| Type::AlwaysTruthy
| Type::AlwaysFalsy
| Type::WrapperDescriptor(_)
| Type::KnownBoundMethod(KnownBoundMethodType::StrStartswith(_))
| Type::KnownBoundMethod(KnownBoundMethodType::StrStartswith(_) | KnownBoundMethodType::PathOpen)
| Type::DataclassDecorator(_)
| Type::DataclassTransformer(_)
// A non-generic class never needs to be specialized. A generic class is specialized
@@ -6354,7 +6381,9 @@ impl<'db> Type<'db> {
| Type::AlwaysTruthy
| Type::AlwaysFalsy
| Type::WrapperDescriptor(_)
| Type::KnownBoundMethod(KnownBoundMethodType::StrStartswith(_))
| Type::KnownBoundMethod(
KnownBoundMethodType::StrStartswith(_) | KnownBoundMethodType::PathOpen,
)
| Type::DataclassDecorator(_)
| Type::DataclassTransformer(_)
| Type::ModuleLiteral(_)
@@ -9435,6 +9464,8 @@ pub enum KnownBoundMethodType<'db> {
/// this allows us to understand statically known branches for common tests such as
/// `if sys.platform.startswith("freebsd")`.
StrStartswith(StringLiteralType<'db>),
/// Method wrapper for `Path.open`.
PathOpen,
}
pub(super) fn walk_method_wrapper_type<'db, V: visitor::TypeVisitor<'db> + ?Sized>(
@@ -9458,6 +9489,7 @@ pub(super) fn walk_method_wrapper_type<'db, V: visitor::TypeVisitor<'db> + ?Size
KnownBoundMethodType::StrStartswith(string_literal) => {
visitor.visit_type(db, Type::StringLiteral(string_literal));
}
KnownBoundMethodType::PathOpen => {}
}
}
@@ -9493,17 +9525,23 @@ impl<'db> KnownBoundMethodType<'db> {
ConstraintSet::from(self == other)
}
(KnownBoundMethodType::PathOpen, KnownBoundMethodType::PathOpen) => {
ConstraintSet::from(true)
}
(
KnownBoundMethodType::FunctionTypeDunderGet(_)
| KnownBoundMethodType::FunctionTypeDunderCall(_)
| KnownBoundMethodType::PropertyDunderGet(_)
| KnownBoundMethodType::PropertyDunderSet(_)
| KnownBoundMethodType::StrStartswith(_),
| KnownBoundMethodType::StrStartswith(_)
| KnownBoundMethodType::PathOpen,
KnownBoundMethodType::FunctionTypeDunderGet(_)
| KnownBoundMethodType::FunctionTypeDunderCall(_)
| KnownBoundMethodType::PropertyDunderGet(_)
| KnownBoundMethodType::PropertyDunderSet(_)
| KnownBoundMethodType::StrStartswith(_),
| KnownBoundMethodType::StrStartswith(_)
| KnownBoundMethodType::PathOpen,
) => ConstraintSet::from(false),
}
}
@@ -9538,17 +9576,23 @@ impl<'db> KnownBoundMethodType<'db> {
ConstraintSet::from(self == other)
}
(KnownBoundMethodType::PathOpen, KnownBoundMethodType::PathOpen) => {
ConstraintSet::from(true)
}
(
KnownBoundMethodType::FunctionTypeDunderGet(_)
| KnownBoundMethodType::FunctionTypeDunderCall(_)
| KnownBoundMethodType::PropertyDunderGet(_)
| KnownBoundMethodType::PropertyDunderSet(_)
| KnownBoundMethodType::StrStartswith(_),
| KnownBoundMethodType::StrStartswith(_)
| KnownBoundMethodType::PathOpen,
KnownBoundMethodType::FunctionTypeDunderGet(_)
| KnownBoundMethodType::FunctionTypeDunderCall(_)
| KnownBoundMethodType::PropertyDunderGet(_)
| KnownBoundMethodType::PropertyDunderSet(_)
| KnownBoundMethodType::StrStartswith(_),
| KnownBoundMethodType::StrStartswith(_)
| KnownBoundMethodType::PathOpen,
) => ConstraintSet::from(false),
}
}
@@ -9567,7 +9611,7 @@ impl<'db> KnownBoundMethodType<'db> {
KnownBoundMethodType::PropertyDunderSet(property) => {
KnownBoundMethodType::PropertyDunderSet(property.normalized_impl(db, visitor))
}
KnownBoundMethodType::StrStartswith(_) => self,
KnownBoundMethodType::StrStartswith(_) | KnownBoundMethodType::PathOpen => self,
}
}
@@ -9579,6 +9623,7 @@ impl<'db> KnownBoundMethodType<'db> {
| KnownBoundMethodType::PropertyDunderGet(_)
| KnownBoundMethodType::PropertyDunderSet(_) => KnownClass::MethodWrapperType,
KnownBoundMethodType::StrStartswith(_) => KnownClass::BuiltinFunctionType,
KnownBoundMethodType::PathOpen => KnownClass::MethodType,
}
}
@@ -9675,6 +9720,9 @@ impl<'db> KnownBoundMethodType<'db> {
Some(KnownClass::Bool.to_instance(db)),
)))
}
KnownBoundMethodType::PathOpen => {
Either::Right(std::iter::once(Signature::todo("`Path.open` return type")))
}
}
}
}

View File

@@ -6,7 +6,7 @@ use ruff_python_ast as ast;
use crate::Db;
use crate::types::KnownClass;
use crate::types::enums::{enum_member_literals, enum_metadata};
use crate::types::tuple::{Tuple, TupleLength, TupleType};
use crate::types::tuple::{Tuple, TupleType};
use super::Type;
@@ -17,7 +17,7 @@ pub(crate) enum Argument<'a> {
/// A positional argument.
Positional,
/// A starred positional argument (e.g. `*args`) containing the specified number of elements.
Variadic(TupleLength),
Variadic,
/// A keyword argument (e.g. `a=1`).
Keyword(&'a str),
/// The double-starred keywords argument (e.g. `**kwargs`).
@@ -41,7 +41,6 @@ impl<'a, 'db> CallArguments<'a, 'db> {
/// type of each splatted argument, so that we can determine its length. All other arguments
/// will remain uninitialized as `Unknown`.
pub(crate) fn from_arguments(
db: &'db dyn Db,
arguments: &'a ast::Arguments,
mut infer_argument_type: impl FnMut(Option<&ast::Expr>, &ast::Expr) -> Type<'db>,
) -> Self {
@@ -51,11 +50,7 @@ impl<'a, 'db> CallArguments<'a, 'db> {
ast::ArgOrKeyword::Arg(arg) => match arg {
ast::Expr::Starred(ast::ExprStarred { value, .. }) => {
let ty = infer_argument_type(Some(arg), value);
let length = ty
.try_iterate(db)
.map(|tuple| tuple.len())
.unwrap_or(TupleLength::unknown());
(Argument::Variadic(length), Some(ty))
(Argument::Variadic, Some(ty))
}
_ => (Argument::Positional, None),
},
@@ -203,25 +198,10 @@ impl<'a, 'db> CallArguments<'a, 'db> {
for subtype in &expanded_types {
let mut new_expanded_types = pre_expanded_types.to_vec();
new_expanded_types[index] = Some(*subtype);
// Update the arguments list to handle variadic argument expansion
let mut new_arguments = self.arguments.clone();
if let Argument::Variadic(_) = self.arguments[index] {
// If the argument corresponding to this type is variadic, we need to
// update the tuple length because expanding could change the length.
// For example, in `tuple[int] | tuple[int, int]`, the length of the
// first type is 1, while the length of the second type is 2.
if let Some(expanded_type) = new_expanded_types[index] {
let length = expanded_type
.try_iterate(db)
.map(|tuple| tuple.len())
.unwrap_or(TupleLength::unknown());
new_arguments[index] = Argument::Variadic(length);
}
}
expanded_arguments
.push(CallArguments::new(new_arguments, new_expanded_types));
expanded_arguments.push(CallArguments::new(
self.arguments.clone(),
new_expanded_types,
));
}
}

View File

@@ -2135,24 +2135,36 @@ impl<'a, 'db> ArgumentMatcher<'a, 'db> {
Ok(())
}
/// Match a variadic argument to the remaining positional, standard or variadic parameters.
fn match_variadic(
&mut self,
db: &'db dyn Db,
argument_index: usize,
argument: Argument<'a>,
argument_type: Option<Type<'db>>,
length: TupleLength,
) -> Result<(), ()> {
let tuple = argument_type.map(|ty| ty.iterate(db));
let mut argument_types = match tuple.as_ref() {
Some(tuple) => Either::Left(tuple.all_elements().copied()),
None => Either::Right(std::iter::empty()),
let (mut argument_types, length, variable_element) = match tuple.as_ref() {
Some(tuple) => (
Either::Left(tuple.all_elements().copied()),
tuple.len(),
tuple.variable_element().copied(),
),
None => (
Either::Right(std::iter::empty()),
TupleLength::unknown(),
None,
),
};
// We must be able to match up the fixed-length portion of the argument with positional
// parameters, so we pass on any errors that occur.
for _ in 0..length.minimum() {
self.match_positional(argument_index, argument, argument_types.next())?;
self.match_positional(
argument_index,
argument,
argument_types.next().or(variable_element),
)?;
}
// If the tuple is variable-length, we assume that it will soak up all remaining positional
@@ -2163,7 +2175,24 @@ impl<'a, 'db> ArgumentMatcher<'a, 'db> {
.get_positional(self.next_positional)
.is_some()
{
self.match_positional(argument_index, argument, argument_types.next())?;
self.match_positional(
argument_index,
argument,
argument_types.next().or(variable_element),
)?;
}
}
// Finally, if there is a variadic parameter, we can match any of the remaining unpacked
// argument types to it, but only if there is at least one remaining argument type. This is
// because a variadic parameter is optional, so if this were done unconditionally, ty could
// emit a spurious "too many arguments" error.
if self.parameters.variadic().is_some() {
if let Some(argument_type) = argument_types.next().or(variable_element) {
self.match_positional(argument_index, argument, Some(argument_type))?;
for argument_type in argument_types {
self.match_positional(argument_index, argument, Some(argument_type))?;
}
}
}
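The matching above mirrors how Python itself binds a splatted tuple: the fixed-length prefix fills positional parameters, and any remainder is soaked up by a variadic parameter. A small sketch using `inspect.signature` (the function `f` is hypothetical):

```python
import inspect

def f(a, b, *rest):
    return a, b, rest

sig = inspect.signature(f)

# The first two unpacked elements bind to the positional parameters
# `a` and `b`; the remaining elements are soaked up by `*rest`.
bound = sig.bind(*(1, 2, 3, 4))
assert bound.arguments["a"] == 1
assert bound.arguments["b"] == 2
assert bound.arguments["rest"] == (3, 4)
```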
@@ -2433,11 +2462,10 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
self.enumerate_argument_types()
{
match argument {
Argument::Variadic(_) => self.check_variadic_argument_type(
Argument::Variadic => self.check_variadic_argument_type(
argument_index,
adjusted_argument_index,
argument,
argument_type,
),
Argument::Keywords => self.check_keyword_variadic_argument_type(
argument_index,
@@ -2465,37 +2493,15 @@ impl<'a, 'db> ArgumentTypeChecker<'a, 'db> {
argument_index: usize,
adjusted_argument_index: Option<usize>,
argument: Argument<'a>,
argument_type: Type<'db>,
) {
// If the argument is splatted, convert its type into a tuple describing the splatted
// elements. For tuples, we don't have to do anything! For other types, we treat it as
// an iterator, and create a homogeneous tuple of its output type, since we don't know
// how many elements the iterator will produce.
let argument_types = argument_type.iterate(self.db);
// Resize the tuple of argument types to line up with the number of parameters this
// argument was matched against. If parameter matching succeeded, then we can (TODO:
// should be able to, see above) guarantee that all of the required elements of the
// splatted tuple will have been matched with a parameter. But if parameter matching
// failed, there might be more required elements. That means we can't use
// TupleLength::Fixed below, because we would otherwise get a "too many values" error
// when parameter matching failed.
let desired_size =
TupleLength::Variable(self.argument_matches[argument_index].parameters.len(), 0);
let argument_types = argument_types
.resize(self.db, desired_size)
.expect("argument type should be consistent with its arity");
// Check the types by zipping through the splatted argument types and their matched
// parameters.
for (argument_type, parameter_index) in
(argument_types.all_elements()).zip(&self.argument_matches[argument_index].parameters)
for (parameter_index, variadic_argument_type) in
self.argument_matches[argument_index].iter()
{
self.check_argument_type(
adjusted_argument_index,
argument,
*argument_type,
*parameter_index,
variadic_argument_type.unwrap_or_else(Type::unknown),
parameter_index,
);
}
}
@@ -2711,9 +2717,8 @@ impl<'db> Binding<'db> {
Argument::Keyword(name) => {
let _ = matcher.match_keyword(argument_index, argument, None, name);
}
Argument::Variadic(length) => {
let _ =
matcher.match_variadic(db, argument_index, argument, argument_type, length);
Argument::Variadic => {
let _ = matcher.match_variadic(db, argument_index, argument, argument_type);
}
Argument::Keywords => {
keywords_arguments.push((argument_index, argument_type));

View File

@@ -3721,6 +3721,8 @@ pub enum KnownClass {
TypedDictFallback,
// string.templatelib
Template,
// pathlib
Path,
// ty_extensions
ConstraintSet,
}
@@ -3767,7 +3769,8 @@ impl KnownClass {
| Self::MethodWrapperType
| Self::CoroutineType
| Self::BuiltinFunctionType
| Self::Template => Some(Truthiness::AlwaysTrue),
| Self::Template
| Self::Path => Some(Truthiness::AlwaysTrue),
Self::NoneType => Some(Truthiness::AlwaysFalse),
@@ -3909,7 +3912,8 @@ impl KnownClass {
| KnownClass::TypedDictFallback
| KnownClass::BuiltinFunctionType
| KnownClass::ProtocolMeta
| KnownClass::Template => false,
| KnownClass::Template
| KnownClass::Path => false,
}
}
@@ -3990,7 +3994,8 @@ impl KnownClass {
| KnownClass::TypedDictFallback
| KnownClass::BuiltinFunctionType
| KnownClass::ProtocolMeta
| KnownClass::Template => false,
| KnownClass::Template
| KnownClass::Path => false,
}
}
@@ -4070,7 +4075,8 @@ impl KnownClass {
| KnownClass::ConstraintSet
| KnownClass::BuiltinFunctionType
| KnownClass::ProtocolMeta
| KnownClass::Template => false,
| KnownClass::Template
| KnownClass::Path => false,
}
}
@@ -4163,7 +4169,8 @@ impl KnownClass {
| Self::TypedDictFallback
| Self::BuiltinFunctionType
| Self::ProtocolMeta
| Self::Template => false,
| Self::Template
| Self::Path => false,
}
}
@@ -4263,6 +4270,7 @@ impl KnownClass {
Self::ConstraintSet => "ConstraintSet",
Self::TypedDictFallback => "TypedDictFallback",
Self::Template => "Template",
Self::Path => "Path",
Self::ProtocolMeta => "_ProtocolMeta",
}
}
@@ -4534,6 +4542,7 @@ impl KnownClass {
Self::NamedTupleFallback | Self::TypedDictFallback => KnownModule::TypeCheckerInternals,
Self::NamedTupleLike | Self::ConstraintSet => KnownModule::TyExtensions,
Self::Template => KnownModule::Templatelib,
Self::Path => KnownModule::Pathlib,
}
}
@@ -4616,7 +4625,8 @@ impl KnownClass {
| Self::TypedDictFallback
| Self::BuiltinFunctionType
| Self::ProtocolMeta
| Self::Template => Some(false),
| Self::Template
| Self::Path => Some(false),
Self::Tuple => None,
}
@@ -4702,7 +4712,8 @@ impl KnownClass {
| Self::TypedDictFallback
| Self::BuiltinFunctionType
| Self::ProtocolMeta
| Self::Template => false,
| Self::Template
| Self::Path => false,
}
}
@@ -4798,6 +4809,7 @@ impl KnownClass {
"ConstraintSet" => Self::ConstraintSet,
"TypedDictFallback" => Self::TypedDictFallback,
"Template" => Self::Template,
"Path" => Self::Path,
"_ProtocolMeta" => Self::ProtocolMeta,
_ => return None,
};
@@ -4869,7 +4881,8 @@ impl KnownClass {
| Self::ConstraintSet
| Self::Awaitable
| Self::Generator
| Self::Template => module == self.canonical_module(db),
| Self::Template
| Self::Path => module == self.canonical_module(db),
Self::NoneType => matches!(module, KnownModule::Typeshed | KnownModule::Types),
Self::SpecialForm
| Self::TypeVar

View File

@@ -387,6 +387,9 @@ impl Display for DisplayRepresentation<'_> {
Type::KnownBoundMethod(KnownBoundMethodType::StrStartswith(_)) => {
f.write_str("<method-wrapper `startswith` of `str` object>")
}
Type::KnownBoundMethod(KnownBoundMethodType::PathOpen) => {
f.write_str("bound method `Path.open`")
}
Type::WrapperDescriptor(kind) => {
let (method, object) = match kind {
WrapperDescriptorKind::FunctionTypeDunderGet => ("__get__", "function"),

View File

@@ -1120,6 +1120,70 @@ fn is_instance_truthiness<'db>(
}
}
/// Returns true if the type passed as `mode` would require us to pick a non-trivial overload of
/// `builtins.open` / `os.fdopen` / `Path.open`.
fn is_mode_with_nontrivial_return_type<'db>(db: &'db dyn Db, mode: Type<'db>) -> bool {
// Return true for any mode that doesn't match typeshed's
// `OpenTextMode` type alias (<https://github.com/python/typeshed/blob/6937a9b193bfc2f0696452d58aad96d7627aa29a/stdlib/_typeshed/__init__.pyi#L220>).
mode.into_string_literal().is_none_or(|mode| {
!matches!(
mode.value(db),
"r+" | "+r"
| "rt+"
| "r+t"
| "+rt"
| "tr+"
| "t+r"
| "+tr"
| "w+"
| "+w"
| "wt+"
| "w+t"
| "+wt"
| "tw+"
| "t+w"
| "+tw"
| "a+"
| "+a"
| "at+"
| "a+t"
| "+at"
| "ta+"
| "t+a"
| "+ta"
| "x+"
| "+x"
| "xt+"
| "x+t"
| "+xt"
| "tx+"
| "t+x"
| "+tx"
| "w"
| "wt"
| "tw"
| "a"
| "at"
| "ta"
| "x"
| "xt"
| "tx"
| "r"
| "rt"
| "tr"
| "U"
| "rU"
| "Ur"
| "rtU"
| "rUt"
| "Urt"
| "trU"
| "tUr"
| "Utr"
)
})
}
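For context, the modes listed above are exactly those where `builtins.open` returns a text handle; any other mode selects a different overload with a different return type. A small runtime illustration (using a hypothetical temp file):

```python
import io
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

# A text mode ("r", "w", "a", "x", optionally combined with "t" and
# "+") matches typeshed's `OpenTextMode` alias, and `open` returns a
# `TextIOWrapper` -- the "trivial" first overload.
with open(path, "w") as f:
    f.write("data")
with open(path, "r") as f:
    text_handle_type = type(f)

# Any other mode (here: binary read) selects a different overload with
# a different return type, which is why the mode string decides
# whether the special-cased return type is needed.
with open(path, "rb") as f:
    binary_handle_type = type(f)

os.unlink(path)
assert text_handle_type is io.TextIOWrapper
assert binary_handle_type is io.BufferedReader
```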
fn signature_cycle_recover<'db>(
_db: &'db dyn Db,
_value: &CallableSignature<'db>,
@@ -1188,8 +1252,16 @@ pub enum KnownFunction {
DunderImport,
/// `importlib.import_module`, which returns the submodule.
ImportModule,
/// `builtins.open`
Open,
/// `os.fdopen`
Fdopen,
/// `tempfile.NamedTemporaryFile`
#[strum(serialize = "NamedTemporaryFile")]
NamedTemporaryFile,
/// `typing(_extensions).final`
Final,
/// `typing(_extensions).disjoint_base`
@@ -1308,6 +1380,12 @@ impl KnownFunction {
Self::AbstractMethod => {
matches!(module, KnownModule::Abc)
}
Self::Fdopen => {
matches!(module, KnownModule::Os)
}
Self::NamedTemporaryFile => {
matches!(module, KnownModule::Tempfile)
}
Self::Dataclass | Self::Field => {
matches!(module, KnownModule::Dataclasses)
}
@@ -1656,72 +1734,33 @@ impl KnownFunction {
}
KnownFunction::Open => {
// Temporary special-casing for `builtins.open` to avoid an excessive number of false positives
// in lieu of proper support for PEP-613 type aliases.
if let [_, Some(mode), ..] = parameter_types {
// Infer `Todo` for any argument that doesn't match typeshed's
// `OpenTextMode` type alias (<https://github.com/python/typeshed/blob/6937a9b193bfc2f0696452d58aad96d7627aa29a/stdlib/_typeshed/__init__.pyi#L220>).
// Without this special-casing, we'd just always select the first overload in our current state,
// which leads to lots of false positives.
if mode.into_string_literal().is_none_or(|mode| {
!matches!(
mode.value(db),
"r+" | "+r"
| "rt+"
| "r+t"
| "+rt"
| "tr+"
| "t+r"
| "+tr"
| "w+"
| "+w"
| "wt+"
| "w+t"
| "+wt"
| "tw+"
| "t+w"
| "+tw"
| "a+"
| "+a"
| "at+"
| "a+t"
| "+at"
| "ta+"
| "t+a"
| "+ta"
| "x+"
| "+x"
| "xt+"
| "x+t"
| "+xt"
| "tx+"
| "t+x"
| "+tx"
| "w"
| "wt"
| "tw"
| "a"
| "at"
| "ta"
| "x"
| "xt"
| "tx"
| "r"
| "rt"
| "tr"
| "U"
| "rU"
| "Ur"
| "rtU"
| "rUt"
| "Urt"
| "trU"
| "tUr"
| "Utr"
)
}) {
overload.set_return_type(todo_type!("`builtins.open` return type"));
}
// TODO: Temporary special-casing for `builtins.open` to avoid an excessive number of
// false positives in lieu of proper support for PEP-613 type aliases.
if let [_, Some(mode), ..] = parameter_types
&& is_mode_with_nontrivial_return_type(db, *mode)
{
overload.set_return_type(todo_type!("`builtins.open` return type"));
}
}
KnownFunction::Fdopen => {
// TODO: Temporary special-casing for `os.fdopen` to avoid an excessive number of
// false positives in lieu of proper support for PEP-613 type aliases.
if let [_, Some(mode), ..] = parameter_types
&& is_mode_with_nontrivial_return_type(db, *mode)
{
overload.set_return_type(todo_type!("`os.fdopen` return type"));
}
}
KnownFunction::NamedTemporaryFile => {
// TODO: Temporary special-casing for `tempfile.NamedTemporaryFile` to avoid an excessive number of
// false positives in lieu of proper support for PEP-613 type aliases.
if let [Some(mode), ..] = parameter_types
&& is_mode_with_nontrivial_return_type(db, *mode)
{
overload
.set_return_type(todo_type!("`tempfile.NamedTemporaryFile` return type"));
}
}
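As a sketch of why the `mode` argument matters for `tempfile.NamedTemporaryFile` as well: its default mode is `"w+b"` (unlike `open`), and the mode string drives the same overload selection between text and binary I/O.

```python
import tempfile

# Passing a text mode such as "w+" makes the returned wrapper do
# `str` I/O, matching the "trivial" text overload.
with tempfile.NamedTemporaryFile("w+") as f:
    f.write("hello")
    f.seek(0)
    text_result = f.read()

# In the default binary mode ("w+b"), reads and writes use `bytes`,
# which corresponds to a different overload with a different return type.
with tempfile.NamedTemporaryFile() as f:
    f.write(b"hello")
    f.seek(0)
    binary_result = f.read()

assert text_result == "hello"
assert binary_result == b"hello"
```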
@@ -1756,6 +1795,10 @@ pub(crate) mod tests {
KnownFunction::AbstractMethod => KnownModule::Abc,
KnownFunction::Fdopen => KnownModule::Os,
KnownFunction::NamedTemporaryFile => KnownModule::Tempfile,
KnownFunction::Dataclass | KnownFunction::Field => KnownModule::Dataclasses,
KnownFunction::GetattrStatic => KnownModule::Inspect,

View File

@@ -874,7 +874,7 @@ pub fn call_signature_details<'db>(
// Use into_callable to handle all the complex type conversions
if let Some(callable_type) = func_type.into_callable(db) {
let call_arguments =
CallArguments::from_arguments(db, &call_expr.arguments, |_, splatted_value| {
CallArguments::from_arguments(&call_expr.arguments, |_, splatted_value| {
splatted_value.inferred_type(model)
});
let bindings = callable_type

View File

@@ -386,20 +386,8 @@ impl<'db> TypeContext<'db> {
known_class: KnownClass,
db: &'db dyn Db,
) -> Option<Specialization<'db>> {
let class_type = match self.annotation? {
Type::NominalInstance(instance) => instance,
Type::TypeAlias(alias) => alias.value_type(db).into_nominal_instance()?,
_ => return None,
}
.class(db);
if !class_type.is_known(db, known_class) {
return None;
}
class_type
.into_generic_alias()
.map(|generic_alias| generic_alias.specialization(db))
self.annotation
.and_then(|ty| ty.known_specialization(known_class, db))
}
}

View File

@@ -1,4 +1,6 @@
use itertools::Itertools;
use std::iter;
use itertools::{Either, Itertools};
use ruff_db::diagnostic::{Annotation, DiagnosticId, Severity};
use ruff_db::files::File;
use ruff_db::parsed::ParsedModuleRef;
@@ -86,13 +88,13 @@ use crate::types::typed_dict::{
};
use crate::types::visitor::any_over_type;
use crate::types::{
CallDunderError, CallableType, ClassLiteral, ClassType, DataclassParams, DynamicType,
IntersectionBuilder, IntersectionType, KnownClass, KnownInstanceType, MemberLookupPolicy,
MetaclassCandidate, PEP695TypeAliasType, Parameter, ParameterForm, Parameters, SpecialFormType,
SubclassOfType, TrackedConstraintSet, Truthiness, Type, TypeAliasType, TypeAndQualifiers,
TypeContext, TypeMapping, TypeQualifiers, TypeVarBoundOrConstraintsEvaluation,
TypeVarDefaultEvaluation, TypeVarInstance, TypeVarKind, UnionBuilder, UnionType, binding_type,
todo_type,
BoundTypeVarInstance, CallDunderError, CallableType, ClassLiteral, ClassType, DataclassParams,
DynamicType, IntersectionBuilder, IntersectionType, KnownClass, KnownInstanceType,
MemberLookupPolicy, MetaclassCandidate, PEP695TypeAliasType, Parameter, ParameterForm,
Parameters, SpecialFormType, SubclassOfType, TrackedConstraintSet, Truthiness, Type,
TypeAliasType, TypeAndQualifiers, TypeContext, TypeMapping, TypeQualifiers,
TypeVarBoundOrConstraintsEvaluation, TypeVarDefaultEvaluation, TypeVarInstance, TypeVarKind,
UnionBuilder, UnionType, binding_type, todo_type,
};
use crate::types::{ClassBase, add_inferred_python_version_hint_to_diagnostic};
use crate::unpack::{EvaluationMode, UnpackPosition};
@@ -1731,7 +1733,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let previous_deferred_state =
std::mem::replace(&mut self.deferred_state, in_stub.into());
let mut call_arguments =
CallArguments::from_arguments(self.db(), arguments, |argument, splatted_value| {
CallArguments::from_arguments(arguments, |argument, splatted_value| {
let ty = self.infer_expression(splatted_value, TypeContext::default());
if let Some(argument) = argument {
self.store_expression_type(argument, ty);
@@ -4110,7 +4112,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
value,
TypeContext::new(Some(declared.inner_type())),
);
let mut inferred_ty = if target
let inferred_ty = if target
.as_name_expr()
.is_some_and(|name| &name.id == "TYPE_CHECKING")
{
@@ -4121,24 +4123,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
inferred_ty
};
// Validate `TypedDict` dictionary literal assignments
if let Some(typed_dict) = declared.inner_type().into_typed_dict() {
if let Some(dict_expr) = value.as_dict_expr() {
validate_typed_dict_dict_literal(
&self.context,
typed_dict,
dict_expr,
target.into(),
|expr| self.expression_type(expr),
);
// Override the inferred type of the dict literal to be the `TypedDict` type
// This ensures that the dict literal gets the correct type for key access
let typed_dict_type = Type::TypedDict(typed_dict);
inferred_ty = typed_dict_type;
}
}
self.add_declaration_with_binding(
target.into(),
definition,
@@ -5290,6 +5274,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
ctx: _,
} = list;
let elts = elts.iter().map(|elt| [Some(elt)]);
self.infer_collection_literal(elts, tcx, KnownClass::List)
.unwrap_or_else(|| {
KnownClass::List.to_specialized_instance(self.db(), [Type::unknown()])
@@ -5303,95 +5288,167 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
elts,
} = set;
let elts = elts.iter().map(|elt| [Some(elt)]);
self.infer_collection_literal(elts, tcx, KnownClass::Set)
.unwrap_or_else(|| {
KnownClass::Set.to_specialized_instance(self.db(), [Type::unknown()])
})
}
// Infer the type of a collection literal expression.
fn infer_collection_literal(
&mut self,
elts: &[ast::Expr],
tcx: TypeContext<'db>,
collection_class: KnownClass,
) -> Option<Type<'db>> {
// Extract the type variable `T` from `list[T]` in typeshed.
fn elts_ty(
collection_class: KnownClass,
db: &dyn Db,
) -> Option<(ClassLiteral<'_>, Type<'_>)> {
let class_literal = collection_class.try_to_class_literal(db)?;
let generic_context = class_literal.generic_context(db)?;
let variables = generic_context.variables(db);
let elts_ty = variables.iter().exactly_one().ok()?;
Some((class_literal, Type::TypeVar(*elts_ty)))
}
let annotated_elts_ty = tcx
.known_specialization(collection_class, self.db())
.and_then(|specialization| specialization.types(self.db()).iter().exactly_one().ok())
.copied();
let (class_literal, elts_ty) = elts_ty(collection_class, self.db()).unwrap_or_else(|| {
let name = collection_class.name(self.db());
panic!("Typeshed should always have a `{name}` class in `builtins.pyi` with a single type variable")
});
// Create a set of constraints to infer a precise type for `T`.
let mut builder = SpecializationBuilder::new(self.db());
match annotated_elts_ty {
// The annotated type acts as a constraint for `T`.
//
// Note that we infer the annotated type _before_ the elements, to closer match the order
// of any unions written in the type annotation.
Some(annotated_elts_ty) => {
builder.infer(elts_ty, annotated_elts_ty).ok()?;
}
// If a valid type annotation was not provided, avoid restricting the type of the collection
// by unioning the inferred type with `Unknown`.
None => builder.infer(elts_ty, Type::unknown()).ok()?,
}
// The inferred type of each element acts as an additional constraint on `T`.
for elt in elts {
let inferred_elt_ty = self.infer_expression(elt, TypeContext::new(annotated_elts_ty));
// Convert any element literals to their promoted type form to avoid excessively large
// unions for large nested list literals, which the constraint solver struggles with.
let inferred_elt_ty =
inferred_elt_ty.apply_type_mapping(self.db(), &TypeMapping::PromoteLiterals);
builder.infer(elts_ty, inferred_elt_ty).ok()?;
}
let class_type = class_literal
.apply_specialization(self.db(), |generic_context| builder.build(generic_context));
Type::from(class_type).to_instance(self.db())
}
fn infer_dict_expression(&mut self, dict: &ast::ExprDict, _tcx: TypeContext<'db>) -> Type<'db> {
fn infer_dict_expression(&mut self, dict: &ast::ExprDict, tcx: TypeContext<'db>) -> Type<'db> {
let ast::ExprDict {
range: _,
node_index: _,
items,
} = dict;
// TODO: Use the type context for more precise inference.
for item in items {
self.infer_optional_expression(item.key.as_ref(), TypeContext::default());
self.infer_expression(&item.value, TypeContext::default());
// Validate `TypedDict` dictionary literal assignments.
if let Some(typed_dict) = tcx.annotation.and_then(Type::into_typed_dict) {
let typed_dict_items = typed_dict.items(self.db());
for item in items {
self.infer_optional_expression(item.key.as_ref(), TypeContext::default());
if let Some(ast::Expr::StringLiteral(ref key)) = item.key
&& let Some(key) = key.as_single_part_string()
&& let Some(field) = typed_dict_items.get(key.as_str())
{
self.infer_expression(&item.value, TypeContext::new(Some(field.declared_ty)));
} else {
self.infer_expression(&item.value, TypeContext::default());
}
}
validate_typed_dict_dict_literal(
&self.context,
typed_dict,
dict,
dict.into(),
|expr| self.expression_type(expr),
);
return Type::TypedDict(typed_dict);
}
KnownClass::Dict.to_specialized_instance(
self.db(),
[
todo_type!("dict literal key type"),
todo_type!("dict literal value type"),
],
)
// Avoid false positives for the functional `TypedDict` form, which is currently
// unsupported.
if let Some(Type::Dynamic(DynamicType::Todo(_))) = tcx.annotation {
return KnownClass::Dict
.to_specialized_instance(self.db(), [Type::unknown(), Type::unknown()]);
}
let items = items
.iter()
.map(|item| [item.key.as_ref(), Some(&item.value)]);
self.infer_collection_literal(items, tcx, KnownClass::Dict)
.unwrap_or_else(|| {
KnownClass::Dict
.to_specialized_instance(self.db(), [Type::unknown(), Type::unknown()])
})
}
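The `TypedDict` branch above gives each value expression the declared type of its matching key as inference context; a minimal sketch of the behavior being modeled:

```python
from typing import Optional, TypedDict

class Person(TypedDict):
    name: str
    age: Optional[int]

# With `Person` as the type context, the value for each string-literal
# key is inferred against that key's declared type: "age" is checked
# against `int | None`, so `None` is accepted without a false positive.
alice: Person = {"name": "Alice", "age": None}

assert alice["name"] == "Alice"
assert alice["age"] is None
```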
// Infer the type of a collection literal expression.
fn infer_collection_literal<'expr, const N: usize>(
&mut self,
elts: impl Iterator<Item = [Option<&'expr ast::Expr>; N]>,
tcx: TypeContext<'db>,
collection_class: KnownClass,
) -> Option<Type<'db>> {
// Extract the type variable `T` from `list[T]` in typeshed.
fn elt_tys(
collection_class: KnownClass,
db: &dyn Db,
) -> Option<(ClassLiteral<'_>, &FxOrderSet<BoundTypeVarInstance<'_>>)> {
let class_literal = collection_class.try_to_class_literal(db)?;
let generic_context = class_literal.generic_context(db)?;
Some((class_literal, generic_context.variables(db)))
}
let (class_literal, elt_tys) = elt_tys(collection_class, self.db()).unwrap_or_else(|| {
let name = collection_class.name(self.db());
panic!("Typeshed should always have a `{name}` class in `builtins.pyi`")
});
// Extract the annotated type of `T`, if provided.
let annotated_elt_tys = tcx
.known_specialization(collection_class, self.db())
.map(|specialization| specialization.types(self.db()));
// Create a set of constraints to infer a precise type for `T`.
let mut builder = SpecializationBuilder::new(self.db());
match annotated_elt_tys {
// The annotated type acts as a constraint for `T`.
//
// Note that we infer the annotated type _before_ the elements, to more closely match the
// order of any unions as written in the type annotation.
Some(annotated_elt_tys) => {
for (elt_ty, annotated_elt_ty) in iter::zip(elt_tys, annotated_elt_tys) {
builder
.infer(Type::TypeVar(*elt_ty), *annotated_elt_ty)
.ok()?;
}
}
// If a valid type annotation was not provided, avoid restricting the type of the collection
// by unioning the inferred type with `Unknown`.
None => {
for elt_ty in elt_tys {
builder
.infer(Type::TypeVar(*elt_ty), Type::unknown())
.ok()?;
}
}
}
let elt_tcxs = match annotated_elt_tys {
None => Either::Left(iter::repeat(TypeContext::default())),
Some(tys) => Either::Right(tys.iter().map(|ty| TypeContext::new(Some(*ty)))),
};
for elts in elts {
// An unpacking expression for a dictionary.
if let &[None, Some(value)] = elts.as_slice() {
let inferred_value_ty = self.infer_expression(value, TypeContext::default());
// Merge the inferred type of the nested dictionary.
if let Some(specialization) =
inferred_value_ty.known_specialization(KnownClass::Dict, self.db())
{
for (elt_ty, inferred_elt_ty) in
iter::zip(elt_tys, specialization.types(self.db()))
{
builder
.infer(Type::TypeVar(*elt_ty), *inferred_elt_ty)
.ok()?;
}
}
continue;
}
// The inferred type of each element acts as an additional constraint on `T`.
for (elt, elt_ty, elt_tcx) in itertools::izip!(elts, elt_tys, elt_tcxs.clone()) {
let Some(inferred_elt_ty) = self.infer_optional_expression(elt, elt_tcx) else {
continue;
};
// Convert any element literals to their promoted type form to avoid excessively large
// unions for large nested list literals, which the constraint solver struggles with.
let inferred_elt_ty =
inferred_elt_ty.apply_type_mapping(self.db(), &TypeMapping::PromoteLiterals);
builder
.infer(Type::TypeVar(*elt_ty), inferred_elt_ty)
.ok()?;
}
}
let class_type = class_literal
.apply_specialization(self.db(), |generic_context| builder.build(generic_context));
Type::from(class_type).to_instance(self.db())
}
/// Infer the type of the `iter` expression of the first comprehension.
@@ -5774,7 +5831,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// arguments after matching them to parameters, but before checking that the argument types
// are assignable to any parameter annotations.
let mut call_arguments =
-    CallArguments::from_arguments(self.db(), arguments, |argument, splatted_value| {
+    CallArguments::from_arguments(arguments, |argument, splatted_value| {
let ty = self.infer_expression(splatted_value, TypeContext::default());
if let Some(argument) = argument {
self.store_expression_type(argument, ty);

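The `TypedDict` branch in `infer_dict_expression` above infers each dict-literal value against the declared type of its field, instead of falling back to a generic `dict`. A minimal Python illustration of the checked behavior (the `Movie` class is a hypothetical example, not from the test suite):

```python
from typing import TypedDict

class Movie(TypedDict):
    title: str
    year: int

# Under the `Movie` annotation, each value is checked against its declared
# field type (`title: str`, `year: int`).
m: Movie = {"title": "Alien", "year": 1979}
assert m["year"] == 1979
```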

@@ -23,7 +23,6 @@ use crate::{
diagnostic::report_undeclared_protocol_member,
signatures::{Parameter, Parameters},
todo_type,
-visitor::any_over_type,
},
};
@@ -571,24 +570,7 @@ impl<'a, 'db> ProtocolMember<'a, 'db> {
attribute_type
};
-let proto_member_as_bound_method = method.bind_self(db);
-if any_over_type(
-    db,
-    proto_member_as_bound_method,
-    &|t| matches!(t, Type::TypeVar(_)),
-    true,
-) {
-    // TODO: proper validation for generic methods on protocols
-    return ConstraintSet::from(true);
-}
-attribute_type.has_relation_to_impl(
-    db,
-    proto_member_as_bound_method,
-    relation,
-    visitor,
-)
+attribute_type.has_relation_to_impl(db, method.bind_self(db), relation, visitor)
}
// TODO: consider the types of the attribute on `other` for property members
ProtocolMemberKind::Property(_) => ConstraintSet::from(matches!(

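For context, the `bind_self` call above models ordinary Python method binding: accessing a function through an instance produces a bound method whose `self` is already filled in, and that bound signature is what the protocol check compares against. A minimal sketch (the `Greeter` class is hypothetical):

```python
class Greeter:
    def greet(self, name: str) -> str:
        return f"hello, {name}"

# `self` is bound at attribute access; the result takes one fewer parameter.
bound = Greeter().greet
assert bound("world") == "hello, world"
```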

@@ -970,6 +970,14 @@ impl<T> Tuple<T> {
FixedLengthTuple::from_elements(elements).into()
}
+/// Returns the variable-length element of this tuple, if it has one.
+pub(crate) fn variable_element(&self) -> Option<&T> {
+    match self {
+        Tuple::Fixed(_) => None,
+        Tuple::Variable(tuple) => Some(&tuple.variable),
+    }
+}
/// Returns an iterator of all of the fixed-length element types of this tuple.
pub(crate) fn fixed_elements(&self) -> impl Iterator<Item = &T> + '_ {
match self {

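The new `variable_element` helper distinguishes fixed-length tuples (`tuple[int, str]`) from variable-length ones (`tuple[int, *tuple[str, ...]]`). A rough Python analogue of the two variants (these classes are illustrative stand-ins, not ty's actual representation):

```python
from dataclasses import dataclass, field

@dataclass
class FixedTuple:
    elements: list            # e.g. the element types of tuple[int, str]

@dataclass
class VariableTuple:
    prefix: list
    variable: object          # the repeated element in tuple[int, *tuple[str, ...]]
    suffix: list = field(default_factory=list)

def variable_element(t):
    # Mirrors the Rust helper: only variable-length tuples have one.
    return t.variable if isinstance(t, VariableTuple) else None

assert variable_element(FixedTuple([int, str])) is None
assert variable_element(VariableTuple([int], str)) is str
```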

@@ -80,7 +80,7 @@ You can add the following configuration to `.gitlab-ci.yml` to run a `ruff forma
stage: build
interruptible: true
image:
-name: ghcr.io/astral-sh/ruff:0.13.1-alpine
+name: ghcr.io/astral-sh/ruff:0.13.2-alpine
before_script:
- cd $CI_PROJECT_DIR
- ruff --version
@@ -106,7 +106,7 @@ Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-c
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
-rev: v0.13.1
+rev: v0.13.2
hooks:
# Run the linter.
- id: ruff-check
@@ -119,7 +119,7 @@ To enable lint fixes, add the `--fix` argument to the lint hook:
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
-rev: v0.13.1
+rev: v0.13.2
hooks:
# Run the linter.
- id: ruff-check
@@ -133,7 +133,7 @@ To avoid running on Jupyter Notebooks, remove `jupyter` from the list of allowed
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
-rev: v0.13.1
+rev: v0.13.2
hooks:
# Run the linter.
- id: ruff-check


@@ -369,7 +369,7 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
-rev: v0.13.1
+rev: v0.13.2
hooks:
# Run the linter.
- id: ruff


@@ -6,7 +6,7 @@ import {
useState,
} from "react";
import { Panel, PanelGroup } from "react-resizable-panels";
-import { Diagnostic, Workspace } from "ruff_wasm";
+import { Diagnostic, Workspace, PositionEncoding } from "ruff_wasm";
import {
ErrorMessage,
Theme,
@@ -173,7 +173,7 @@ export default function Editor({
try {
const config = JSON.parse(settingsSource);
-const workspace = new Workspace(config);
+const workspace = new Workspace(config, PositionEncoding.Utf16);
const diagnostics = workspace.check(pythonSource);
let secondary: SecondaryPanelResult = null;


@@ -4,7 +4,7 @@ build-backend = "maturin"
[project]
name = "ruff"
-version = "0.13.1"
+version = "0.13.2"
description = "An extremely fast Python linter and code formatter, written in Rust."
authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
readme = "README.md"


@@ -1,6 +1,6 @@
[project]
name = "scripts"
-version = "0.13.1"
+version = "0.13.2"
description = ""
authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]


@@ -5,9 +5,7 @@ echo "Enabling mypy primer specific configuration overloads (see .github/mypy-pr
mkdir -p ~/.config/ty
cp .github/mypy-primer-ty.toml ~/.config/ty/ty.toml
# Join the lines (project names) of the file into a large regex
# using `|` and escape parentheses.
-PRIMER_SELECTOR="$(paste -s -d'|' "${PRIMER_SELECTOR}" | sed -e 's@(@\\(@g' -e 's@)@\\)@g')"
+PRIMER_SELECTOR="$(paste -s -d'|' "${PRIMER_SELECTOR}")"
echo "new commit"
git rev-list --format=%s --max-count=1 "${GITHUB_SHA}"
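The `paste -s -d'|'` invocation above joins the project names in the selector file into a single regex alternation; with this change, names are no longer sed-escaped first. An equivalent sketch in Python (the project names are made up):

```python
# Stand-in for `paste -s -d'|' selector_file`: join the lines into one
# alternation pattern for mypy_primer's --project-selector.
projects = ["attrs", "packaging", "werkzeug"]  # hypothetical selector file lines
selector = "|".join(projects)
assert selector == "attrs|packaging|werkzeug"
```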
@@ -22,7 +20,7 @@ cd ..
echo "Project selector: ${PRIMER_SELECTOR}"
# Allow the exit code to be 0 or 1, only fail for actual mypy_primer crashes/bugs
uvx \
-    --from="git+https://github.com/hauntsaninja/mypy_primer@0d20fff78b67f11f4dcbeb3d9b1c645b7198db5e" \
+    --from="git+https://github.com/hauntsaninja/mypy_primer@0ee3d6330addd5270a50759a6b6f13b2627a423b" \
mypy_primer \
--repo ruff \
--type-checker ty \


@@ -12,7 +12,7 @@ project_root="$(dirname "$script_root")"
echo "Updating metadata with rooster..."
cd "$project_root"
uvx --python 3.12 --isolated -- \
-rooster@0.0.10a1 release "$@"
+rooster@0.1.0 release "$@"
echo "Updating lockfile..."
cargo update -p ruff