Compare commits: `0.12.2` ... `zb/filter-` (3 commits)

| Author | SHA1 | Date |
|---|---|---|
| | fc3b1ed523 | |
| | 992bc61185 | |
| | cb602bf66c | |
**CHANGELOG.md** (56 changes)
````diff
@@ -1,61 +1,5 @@
 # Changelog

-## 0.12.2
-
-### Preview features
-
-- [`flake8-pyi`] Expand `Optional[A]` to `A | None` (`PYI016`) ([#18572](https://github.com/astral-sh/ruff/pull/18572))
-- [`pyupgrade`] Mark `UP008` fix safe if no comments are in range ([#18683](https://github.com/astral-sh/ruff/pull/18683))
-
-### Bug fixes
-
-- [`flake8-comprehensions`] Fix `C420` to prepend whitespace when needed ([#18616](https://github.com/astral-sh/ruff/pull/18616))
-- [`perflint`] Fix `PERF403` panic on attribute or subscription loop variable ([#19042](https://github.com/astral-sh/ruff/pull/19042))
-- [`pydocstyle`] Fix `D413` infinite loop for parenthesized docstring ([#18930](https://github.com/astral-sh/ruff/pull/18930))
-- [`pylint`] Fix `PLW0108` autofix introducing a syntax error when the lambda's body contains an assignment expression ([#18678](https://github.com/astral-sh/ruff/pull/18678))
-- [`refurb`] Fix false positive on empty tuples (`FURB168`) ([#19058](https://github.com/astral-sh/ruff/pull/19058))
-- [`ruff`] Allow more `field` calls from `attrs` (`RUF009`) ([#19021](https://github.com/astral-sh/ruff/pull/19021))
-- [`ruff`] Fix syntax error introduced for an empty string followed by a u-prefixed string (`UP025`) ([#18899](https://github.com/astral-sh/ruff/pull/18899))
-
-### Rule changes
-
-- [`flake8-executable`] Allow `uvx` in shebang line (`EXE003`) ([#18967](https://github.com/astral-sh/ruff/pull/18967))
-- [`pandas`] Avoid flagging `PD002` if `pandas` is not imported ([#18963](https://github.com/astral-sh/ruff/pull/18963))
-- [`pyupgrade`] Avoid PEP-604 unions with `typing.NamedTuple` (`UP007`, `UP045`) ([#18682](https://github.com/astral-sh/ruff/pull/18682))
-
-### Documentation
-
-- Document link between `import-outside-top-level (PLC0415)` and `lint.flake8-tidy-imports.banned-module-level-imports` ([#18733](https://github.com/astral-sh/ruff/pull/18733))
-- Fix description of the `format.skip-magic-trailing-comma` example ([#19095](https://github.com/astral-sh/ruff/pull/19095))
-- [`airflow`] Make `AIR302` example error out-of-the-box ([#18988](https://github.com/astral-sh/ruff/pull/18988))
-- [`airflow`] Make `AIR312` example error out-of-the-box ([#18989](https://github.com/astral-sh/ruff/pull/18989))
-- [`flake8-annotations`] Make `ANN401` example error out-of-the-box ([#18974](https://github.com/astral-sh/ruff/pull/18974))
-- [`flake8-async`] Make `ASYNC100` example error out-of-the-box ([#18993](https://github.com/astral-sh/ruff/pull/18993))
-- [`flake8-async`] Make `ASYNC105` example error out-of-the-box ([#19002](https://github.com/astral-sh/ruff/pull/19002))
-- [`flake8-async`] Make `ASYNC110` example error out-of-the-box ([#18975](https://github.com/astral-sh/ruff/pull/18975))
-- [`flake8-async`] Make `ASYNC210` example error out-of-the-box ([#18977](https://github.com/astral-sh/ruff/pull/18977))
-- [`flake8-async`] Make `ASYNC220`, `ASYNC221`, and `ASYNC222` examples error out-of-the-box ([#18978](https://github.com/astral-sh/ruff/pull/18978))
-- [`flake8-async`] Make `ASYNC251` example error out-of-the-box ([#18990](https://github.com/astral-sh/ruff/pull/18990))
-- [`flake8-bandit`] Make `S201` example error out-of-the-box ([#19017](https://github.com/astral-sh/ruff/pull/19017))
-- [`flake8-bandit`] Make `S604` and `S609` examples error out-of-the-box ([#19049](https://github.com/astral-sh/ruff/pull/19049))
-- [`flake8-bugbear`] Make `B028` example error out-of-the-box ([#19054](https://github.com/astral-sh/ruff/pull/19054))
-- [`flake8-bugbear`] Make `B911` example error out-of-the-box ([#19051](https://github.com/astral-sh/ruff/pull/19051))
-- [`flake8-datetimez`] Make `DTZ011` example error out-of-the-box ([#19055](https://github.com/astral-sh/ruff/pull/19055))
-- [`flake8-datetimez`] Make `DTZ901` example error out-of-the-box ([#19056](https://github.com/astral-sh/ruff/pull/19056))
-- [`flake8-pyi`] Make `PYI032` example error out-of-the-box ([#19061](https://github.com/astral-sh/ruff/pull/19061))
-- [`flake8-pyi`] Make example error out-of-the-box (`PYI014`, `PYI015`) ([#19097](https://github.com/astral-sh/ruff/pull/19097))
-- [`flake8-pyi`] Make example error out-of-the-box (`PYI042`) ([#19101](https://github.com/astral-sh/ruff/pull/19101))
-- [`flake8-pyi`] Make example error out-of-the-box (`PYI059`) ([#19080](https://github.com/astral-sh/ruff/pull/19080))
-- [`flake8-pyi`] Make example error out-of-the-box (`PYI062`) ([#19079](https://github.com/astral-sh/ruff/pull/19079))
-- [`flake8-pytest-style`] Make example error out-of-the-box (`PT023`) ([#19104](https://github.com/astral-sh/ruff/pull/19104))
-- [`flake8-pytest-style`] Make example error out-of-the-box (`PT030`) ([#19105](https://github.com/astral-sh/ruff/pull/19105))
-- [`flake8-quotes`] Make example error out-of-the-box (`Q003`) ([#19106](https://github.com/astral-sh/ruff/pull/19106))
-- [`flake8-simplify`] Make example error out-of-the-box (`SIM110`) ([#19113](https://github.com/astral-sh/ruff/pull/19113))
-- [`flake8-simplify`] Make example error out-of-the-box (`SIM113`) ([#19109](https://github.com/astral-sh/ruff/pull/19109))
-- [`flake8-simplify`] Make example error out-of-the-box (`SIM401`) ([#19110](https://github.com/astral-sh/ruff/pull/19110))
-- [`pyflakes`] Fix backslash in docs (`F621`) ([#19098](https://github.com/astral-sh/ruff/pull/19098))
-- [`pylint`] Fix `PLC0415` example ([#18970](https://github.com/astral-sh/ruff/pull/18970))
-
 ## 0.12.1

 ### Preview features
````
**Cargo.lock** (6 changes, generated)
````diff
@@ -2724,7 +2724,7 @@ dependencies = [

 [[package]]
 name = "ruff"
-version = "0.12.2"
+version = "0.12.1"
 dependencies = [
  "anyhow",
  "argfile",
@@ -2970,7 +2970,7 @@ dependencies = [

 [[package]]
 name = "ruff_linter"
-version = "0.12.2"
+version = "0.12.1"
 dependencies = [
  "aho-corasick",
  "anyhow",
@@ -3302,7 +3302,7 @@ dependencies = [

 [[package]]
 name = "ruff_wasm"
-version = "0.12.2"
+version = "0.12.1"
 dependencies = [
  "console_error_panic_hook",
  "console_log",
````
````diff
@@ -148,8 +148,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
 powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"

 # For a specific version.
-curl -LsSf https://astral.sh/ruff/0.12.2/install.sh | sh
-powershell -c "irm https://astral.sh/ruff/0.12.2/install.ps1 | iex"
+curl -LsSf https://astral.sh/ruff/0.12.1/install.sh | sh
+powershell -c "irm https://astral.sh/ruff/0.12.1/install.ps1 | iex"
 ```

 You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
````
````diff
@@ -182,7 +182,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.12.2
+  rev: v0.12.1
   hooks:
     # Run the linter.
     - id: ruff-check
````
````diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff"
-version = "0.12.2"
+version = "0.12.1"
 publish = true
 authors = { workspace = true }
 edition = { workspace = true }
````
````diff
@@ -242,20 +242,19 @@ fn large(bencher: Bencher, benchmark: &Benchmark) {
     run_single_threaded(bencher, benchmark);
 }

-// Currently disabled because the benchmark is too noisy (± 10%) to give useful feedback.
-// #[bench(args=[&*PYDANTIC], sample_size=3, sample_count=3)]
-// fn multithreaded(bencher: Bencher, benchmark: &Benchmark) {
-//     let thread_pool = ThreadPoolBuilder::new().build().unwrap();
+#[bench(args=[&*PYDANTIC], sample_size=3, sample_count=3)]
+fn multithreaded(bencher: Bencher, benchmark: &Benchmark) {
+    let thread_pool = ThreadPoolBuilder::new().build().unwrap();

-//     bencher
-//         .with_inputs(|| benchmark.setup_iteration())
-//         .bench_local_values(|db| {
-//             thread_pool.install(|| {
-//                 check_project(&db, benchmark.max_diagnostics);
-//                 db
-//             })
-//         });
-// }
+    bencher
+        .with_inputs(|| benchmark.setup_iteration())
+        .bench_local_values(|db| {
+            thread_pool.install(|| {
+                check_project(&db, benchmark.max_diagnostics);
+                db
+            })
+        });
+}

 fn main() {
     ThreadPoolBuilder::new()
````
````diff
@@ -73,16 +73,11 @@ fn generate_markdown() -> String {
     for lint in lints {
         let _ = writeln!(&mut output, "## `{rule_name}`\n", rule_name = lint.name());

-        // Reformat headers as bold text
-        let mut in_code_fence = false;
+        // Increase the header-level by one
         let documentation = lint
             .documentation_lines()
             .map(|line| {
-                // Toggle the code fence state if we encounter a boundary
-                if line.starts_with("```") {
-                    in_code_fence = !in_code_fence;
-                }
-                if !in_code_fence && line.starts_with('#') {
+                if line.starts_with('#') {
                     Cow::Owned(format!(
                         "**{line}**\n",
                         line = line.trim_start_matches('#').trim_start()
````
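The fence-tracking logic removed in the hunk above is small enough to illustrate outside Rust. A minimal Python sketch (the function name is illustrative, not from the Ruff codebase) that bolds markdown headers while leaving `#` lines inside code fences untouched:

```python
def headers_to_bold(lines):
    """Convert markdown '#' headers to bold text, skipping code-fence interiors."""
    in_code_fence = False
    out = []
    for line in lines:
        # Toggle the fence state whenever we cross a ``` boundary.
        if line.startswith("```"):
            in_code_fence = not in_code_fence
        if not in_code_fence and line.startswith("#"):
            out.append(f"**{line.lstrip('#').lstrip()}**")
        else:
            out.append(line)
    return out

doc = ["## Example", "```python", "# a comment, not a header", "```"]
print(headers_to_bold(doc))
```

Without the `in_code_fence` guard, the `# a comment` line inside the fence would be rewritten to bold text, which is exactly the corruption visible in the `crates/ty/docs/rules.md` hunks later in this diff.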
````diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_linter"
-version = "0.12.2"
+version = "0.12.1"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
````
````diff
@@ -125,20 +125,3 @@ class J:
 class K:
     f: F = F()
     g: G = G()
-
-
-# Regression test for https://github.com/astral-sh/ruff/issues/19014
-# These are all valid field calls and should not cause diagnostics.
-@attr.define
-class TestAttrField:
-    attr_field_factory: list[int] = attr.field(factory=list)
-    attr_field_default: list[int] = attr.field(default=attr.Factory(list))
-    attr_factory: list[int] = attr.Factory(list)
-    attr_ib: list[int] = attr.ib(factory=list)
-    attr_attr: list[int] = attr.attr(factory=list)
-    attr_attrib: list[int] = attr.attrib(factory=list)
-
-
-@attr.attributes
-class TestAttrAttributes:
-    x: list[int] = list()  # RUF009
````
````diff
@@ -21,47 +21,27 @@ use crate::{Fix, FixAvailability, Violation};
 ///
 /// For example:
 /// ```python
 /// from collections.abc import Container, Iterable, Sized
 /// from typing import Generic, TypeVar
 ///
 /// T = TypeVar("T")
 /// K = TypeVar("K")
 /// V = TypeVar("V")
 ///
 /// class LinkedList(Generic[T], Sized):
 ///     def push(self, item: T) -> None:
 ///         self._items.append(item)
 ///
 /// class MyMapping(
 ///     Generic[K, V],
-///     Iterable[tuple[K, V]],
-///     Container[tuple[K, V]],
+///     Iterable[Tuple[K, V]],
+///     Container[Tuple[K, V]],
 /// ):
 ///     ...
 /// ```
 ///
 /// Use instead:
 /// ```python
 /// from collections.abc import Container, Iterable, Sized
 /// from typing import Generic, TypeVar
 ///
 /// T = TypeVar("T")
 /// K = TypeVar("K")
 /// V = TypeVar("V")
 ///
 /// class LinkedList(Sized, Generic[T]):
 ///     def push(self, item: T) -> None:
 ///         self._items.append(item)
 ///
 /// class MyMapping(
-///     Iterable[tuple[K, V]],
-///     Container[tuple[K, V]],
+///     Iterable[Tuple[K, V]],
+///     Container[Tuple[K, V]],
 ///     Generic[K, V],
 /// ):
 ///     ...
````
````diff
@@ -75,7 +75,7 @@ impl AlwaysFixableViolation for TypedArgumentDefaultInStub {
 /// ## Example
 ///
 /// ```pyi
-/// def foo(arg=bar()) -> None: ...
+/// def foo(arg=[]) -> None: ...
 /// ```
 ///
 /// Use instead:
````
````diff
@@ -120,7 +120,7 @@ impl AlwaysFixableViolation for ArgumentDefaultInStub {
 ///
 /// ## Example
 /// ```pyi
-/// foo: str = bar()
+/// foo: str = "..."
 /// ```
 ///
 /// Use instead:
````
````diff
@@ -14,15 +14,11 @@ use crate::checkers::ast::Checker;
 ///
 /// ## Example
 /// ```pyi
-/// from typing import TypeAlias
-///
 /// type_alias_name: TypeAlias = int
 /// ```
 ///
 /// Use instead:
 /// ```pyi
-/// from typing import TypeAlias
-///
 /// TypeAliasName: TypeAlias = int
 /// ```
 #[derive(ViolationMetadata)]
````
````diff
@@ -31,7 +31,7 @@ use crate::rules::flake8_pytest_style::helpers::{Parentheses, get_mark_decorators};
 /// import pytest
 ///
-/// @pytest.mark.foo()
+/// @pytest.mark.foo
 /// def test_something(): ...
 /// ```
 ///
@@ -41,7 +41,7 @@ use crate::rules::flake8_pytest_style::helpers::{Parentheses, get_mark_decorators};
 /// import pytest
 ///
-/// @pytest.mark.foo
+/// @pytest.mark.foo()
 /// def test_something(): ...
 /// ```
 ///
````
````diff
@@ -76,11 +76,11 @@ impl Violation for PytestWarnsWithMultipleStatements {
 ///
 /// def test_foo():
-///     with pytest.warns(Warning):
+///     with pytest.warns(RuntimeWarning):
 ///         ...
 ///
 ///     # empty string is also an error
-///     with pytest.warns(Warning, match=""):
+///     with pytest.warns(RuntimeWarning, match=""):
 ///         ...
 /// ```
 ///
@@ -90,7 +90,7 @@ impl Violation for PytestWarnsWithMultipleStatements {
 ///
 /// def test_foo():
-///     with pytest.warns(Warning, match="expected message"):
+///     with pytest.warns(RuntimeWarning, match="expected message"):
 ///         ...
 /// ```
 ///
````
````diff
@@ -19,12 +19,12 @@ use crate::{AlwaysFixableViolation, Edit, Fix};
 ///
 /// ## Example
 /// ```python
-/// foo = "bar\"s"
+/// foo = 'bar\'s'
 /// ```
 ///
 /// Use instead:
 /// ```python
-/// foo = 'bar"s'
+/// foo = "bar's"
 /// ```
 ///
 /// ## Formatter compatibility
````
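The Q003 example above is easy to verify at runtime: swapping the outer quote character produces the same string without the escape. A small runnable sketch:

```python
# Escaped quote inside matching outer quotes (the pattern Q003 flags)...
escaped = "bar\"s"
# ...and the equivalent form with the outer quote swapped, needing no escape.
swapped = 'bar"s'
assert escaped == swapped
print(swapped)
```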
````diff
@@ -20,7 +20,6 @@ use crate::checkers::ast::Checker;
 /// ## Example
 /// ```python
 /// fruits = ["apple", "banana", "cherry"]
-/// i = 0
 /// for fruit in fruits:
 ///     print(f"{i + 1}. {fruit}")
 ///     i += 1
````
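The SIM113 example above pairs naturally with the `enumerate()` rewrite the rule suggests. A runnable sketch (the list literal is taken from the doc example; the rest is illustrative):

```python
fruits = ["apple", "banana", "cherry"]

# Manual counter alongside a for loop (the pattern SIM113 flags).
manual = []
i = 0
for fruit in fruits:
    manual.append(f"{i + 1}. {fruit}")
    i += 1

# The enumerate() form the rule suggests, with start=1 replacing `i + 1`.
with_enumerate = [f"{n}. {fruit}" for n, fruit in enumerate(fruits, start=1)]
assert manual == with_enumerate
```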
````diff
@@ -27,7 +27,6 @@ use crate::{Edit, Fix, FixAvailability, Violation};
 ///
 /// ## Example
 /// ```python
-/// foo = {}
 /// if "bar" in foo:
 ///     value = foo["bar"]
 /// else:
@@ -36,7 +35,6 @@ use crate::{Edit, Fix, FixAvailability, Violation};
 ///
 /// Use instead:
 /// ```python
-/// foo = {}
 /// value = foo.get("bar", 0)
 /// ```
 ///
````
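The SIM401 example above can be checked directly: the `if`/`else` membership test and `dict.get` with a default are equivalent. A minimal sketch (dictionary contents are illustrative):

```python
foo = {"bar": 42}

# Explicit membership check (the pattern SIM401 flags).
if "bar" in foo:
    value = foo["bar"]
else:
    value = 0

# The .get() form the rule suggests, for present and missing keys alike.
assert value == foo.get("bar", 0)
assert foo.get("missing", 0) == 0
```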
````diff
@@ -23,17 +23,15 @@ use crate::{Edit, Fix, FixAvailability, Violation};
 ///
 /// ## Example
 /// ```python
-/// def foo():
-///     for item in iterable:
-///         if predicate(item):
-///             return True
-///     return False
+/// for item in iterable:
+///     if predicate(item):
+///         return True
+/// return False
 /// ```
 ///
 /// Use instead:
 /// ```python
-/// def foo():
-///     return any(predicate(item) for item in iterable)
+/// return any(predicate(item) for item in iterable)
 /// ```
 ///
 /// ## Fix safety
````
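The SIM110 rewrite shown above is behavior-preserving, which a quick sketch can confirm (function names and the predicate are illustrative, not from the Ruff docs):

```python
def has_negative_loop(numbers):
    # Loop form (the pattern SIM110 flags).
    for n in numbers:
        if n < 0:
            return True
    return False

def has_negative_any(numbers):
    # The any() form the rule suggests.
    return any(n < 0 for n in numbers)

# Both forms agree on hits, misses, and the empty iterable.
for data in ([1, 2, 3], [1, -2, 3], []):
    assert has_negative_loop(data) == has_negative_any(data)
```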
````diff
@@ -11,8 +11,8 @@ use crate::{Violation, checkers::ast::Checker};
 /// ## Why is this bad?
 /// In assignment statements, starred expressions can be used to unpack iterables.
 ///
-/// In Python 3, no more than `1 << 8` assignments are allowed before a starred
-/// expression, and no more than `1 << 24` expressions are allowed after a starred
+/// In Python 3, no more than 1 << 8 assignments are allowed before a starred
+/// expression, and no more than 1 << 24 expressions are allowed after a starred
 /// expression.
 ///
 /// ## References
````
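The starred-unpacking feature the F621 doc refers to looks like this in ordinary use (the limits of `1 << 8` targets before and `1 << 24` after the star only matter in pathological code):

```python
# A starred target absorbs whatever the fixed targets don't claim.
first, *middle, last = [1, 2, 3, 4, 5]
assert first == 1
assert middle == [2, 3, 4]
assert last == 5
```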
````diff
@@ -44,13 +44,6 @@ use crate::{
 /// print(platform.python_version())
 /// ```
 ///
-/// ## See also
-/// This rule will ignore import statements configured in
-/// [`lint.flake8-tidy-imports.banned-module-level-imports`][banned-module-level-imports]
-/// if the rule [`banned-module-level-imports`][TID253] is enabled.
-///
-/// [banned-module-level-imports]: https://docs.astral.sh/ruff/settings/#lint_flake8-tidy-imports_banned-module-level-imports
-/// [TID253]: https://docs.astral.sh/ruff/rules/banned-module-level-imports/
 /// [PEP 8]: https://peps.python.org/pep-0008/#imports
 #[derive(ViolationMetadata)]
 pub(crate) struct ImportOutsideTopLevel;
````
````diff
@@ -25,16 +25,14 @@ fn is_stdlib_dataclass_field(func: &Expr, semantic: &SemanticModel) -> bool {
         .is_some_and(|qualified_name| matches!(qualified_name.segments(), ["dataclasses", "field"]))
 }

-/// Returns `true` if the given [`Expr`] is a call to an `attrs` field function.
+/// Returns `true` if the given [`Expr`] is a call to `attr.ib()` or `attrs.field()`.
 fn is_attrs_field(func: &Expr, semantic: &SemanticModel) -> bool {
     semantic
         .resolve_qualified_name(func)
         .is_some_and(|qualified_name| {
             matches!(
                 qualified_name.segments(),
-                ["attrs", "field" | "Factory"]
-                    // See https://github.com/python-attrs/attrs/blob/main/src/attr/__init__.py#L33
-                    | ["attr", "ib" | "attr" | "attrib" | "field" | "Factory"]
+                ["attrs", "field" | "Factory"] | ["attr", "ib"]
             )
         })
 }
@@ -122,8 +120,7 @@ pub(super) fn dataclass_kind<'a>(

     match qualified_name.segments() {
         ["attrs" | "attr", func @ ("define" | "frozen" | "mutable")]
-        // See https://github.com/python-attrs/attrs/blob/main/src/attr/__init__.py#L32
-        | ["attr", func @ ("s" | "attributes" | "attrs")] => {
+        | ["attr", func @ ("s" | "attrs")] => {
             // `.define`, `.frozen` and `.mutable` all default `auto_attribs` to `None`,
             // whereas `@attr.s` implicitly sets `auto_attribs=False`.
             // https://www.attrs.org/en/stable/api.html#attrs.define
````
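The distinction RUF009 draws in the hunk above (field factories are fine, arbitrary calls in defaults are not) has a stdlib analogue. A `dataclasses` sketch, standing in for the `attrs` API so it runs without third-party packages:

```python
from dataclasses import dataclass, field

@dataclass
class Config:
    # A default_factory runs once per instance, so each instance
    # gets its own fresh list rather than a shared one.
    tags: list = field(default_factory=list)

a, b = Config(), Config()
a.tags.append(1)
assert a.tags == [1]
assert b.tags == []  # instances do not share state
```

This is why `attr.field(factory=list)` and friends are allowed while a bare `list()` in a default position is flagged: the former defers the call to instance creation.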
````diff
@@ -98,11 +98,3 @@ RUF009_attrs.py:127:12: RUF009 Do not perform function call `G` in dataclass def
 127 | g: G = G()
     |        ^^^ RUF009
     |
-
-RUF009_attrs.py:144:20: RUF009 Do not perform function call `list` in dataclass defaults
-    |
-142 | @attr.attributes
-143 | class TestAttrAttributes:
-144 |     x: list[int] = list()  # RUF009
-    |                    ^^^^^^ RUF009
-    |
````
````diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_wasm"
-version = "0.12.2"
+version = "0.12.1"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
````
````diff
@@ -1996,8 +1996,7 @@ pub struct Flake8TidyImportsOptions {

     /// List of specific modules that may not be imported at module level, and should instead be
     /// imported lazily (e.g., within a function definition, or an `if TYPE_CHECKING:`
-    /// block, or some other nested context). This also affects the rule `import-outside-top-level`
-    /// if `banned-module-level-imports` is enabled.
+    /// block, or some other nested context).
     #[option(
         default = r#"[]"#,
         value_type = r#"list[str]"#,
````
````diff
@@ -3587,7 +3586,7 @@ pub struct FormatOptions {
     /// Setting `skip-magic-trailing-comma = true` changes the formatting to:
     ///
     /// ```python
-    /// # The arguments are collapsed to a single line because the trailing comma is ignored
+    /// # The arguments remain on separate lines because of the trailing comma after `b`
    /// def test(a, b):
    ///     pass
    /// ```
````
````diff
@@ -3,7 +3,7 @@ name = "ty"
 version = "0.0.0"
 # required for correct pypi metadata
 homepage = "https://github.com/astral-sh/ty/"
-documentation = "https://docs.astral.sh/ty/"
+documentation = "https://github.com/astral-sh/ty/"
 # Releases occur in this other repository!
 repository = "https://github.com/astral-sh/ty/"
 edition.workspace = true
````
**crates/ty/docs/rules.md** (39 changes, generated)
````diff
@@ -138,7 +138,8 @@ class M2(type): ...
 class A(metaclass=M1): ...
 class B(metaclass=M2): ...

-# TypeError: metaclass conflict
+**TypeError: metaclass conflict**
+
 class C(A, B): ...
 ```
````
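The metaclass conflict shown in that doc example is real runtime behavior and easy to reproduce (class names mirror the doc example; the `try`/`except` wrapper is added for illustration):

```python
class M1(type): ...
class M2(type): ...

class A(metaclass=M1): ...
class B(metaclass=M2): ...

# Combining bases whose metaclasses are unrelated raises at class-creation time.
try:
    class C(A, B): ...
except TypeError as exc:
    print(f"TypeError: {exc}")
```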
````diff
@@ -165,7 +166,8 @@ inherits from itself.
 **Examples**

 ```python
-# foo.pyi
+**foo.pyi**
+
 class A(B): ...
 class B(A): ...
 ```
@@ -193,7 +195,8 @@ Class definitions with duplicate bases raise `TypeError` at runtime.
 ```python
 class A: ...

-# TypeError: duplicate base class
+**TypeError: duplicate base class**
+
 class B(A, A): ...
 ```
@@ -323,7 +326,8 @@ Classes with an inconsistent MRO will raise a `TypeError` at runtime.
 class A: ...
 class B(A): ...

-# TypeError: Cannot create a consistent method resolution order
+**TypeError: Cannot create a consistent method resolution order**
+
 class C(A, B): ...
 ```
@@ -393,7 +397,8 @@ class A:
 class B:
     __slots__ = ("a", "b")  # Even if the values are the same

-# TypeError: multiple bases have instance lay-out conflict
+**TypeError: multiple bases have instance lay-out conflict**
+
 class C(A, B): ...
 ```
@@ -415,7 +420,8 @@ class B:
 class C:
     __slots__ = ("a", "b")

-# fine
+**fine**
+
 class D(A, B, C): ...
 ```
@@ -565,7 +571,8 @@ Such a statement will raise `TypeError` at runtime.
 **Examples**

 ```python
-# TypeError: 'int' object does not support the context manager protocol
+**TypeError: 'int' object does not support the context manager protocol**
+
 with 1:
     print(2)
 ```
@@ -662,7 +669,8 @@ from typing import Generic, TypeVar
 T = TypeVar("T")  # okay

-# error: class uses both PEP-695 syntax and legacy syntax
+**error: class uses both PEP-695 syntax and legacy syntax**
+
 class C[U](Generic[T]): ...
 ```
@@ -695,7 +703,8 @@ T = TypeVar("T")  # okay
 Q = TypeVar("S")  # error: TypeVar name must match the variable it's assigned to
 T = TypeVar("T")  # error: TypeVars should not be redefined

-# error: TypeVar must be immediately assigned to a variable
+**error: TypeVar must be immediately assigned to a variable**
+
 def f(t: TypeVar("U")): ...
 ```
@@ -727,7 +736,8 @@ as `type.__new__`.
 ```python
 def f(): ...

-# TypeError: f() takes 0 positional arguments but 3 were given
+**TypeError: f() takes 0 positional arguments but 3 were given**
+
 class B(metaclass=f): ...
 ```
@@ -1135,7 +1145,8 @@ T = TypeVar('T', str)  # invalid constrained TypeVar
 Use instead:
 ```python
 T = TypeVar('T', str, int)  # valid constrained TypeVar
-# or
+**or**
+
 T = TypeVar('T', bound=str)  # valid bound TypeVar
 ```
@@ -1726,13 +1737,15 @@ or `ImportError` at runtime.
 **Examples**

 ```python
-# module.py
+**module.py**
+
 import datetime

 if datetime.date.today().weekday() != 6:
     a = 1

-# main.py
+**main.py**
+
 from module import a  # ImportError: cannot import name 'a' from 'module'
 ```
````
````diff
@@ -186,7 +186,7 @@ fn cli_arguments_are_relative_to_the_current_directory() -> anyhow::Result<()> {
 3 |
 4 | stat = add(10, 15)
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 Found 1 diagnostic
@@ -412,7 +412,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
 3 |
 4 | print(z)
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 error[unresolved-import]: Cannot resolve imported module `does_not_exist`
@@ -421,7 +421,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
 2 | import does_not_exist  # error: unresolved-import
   | ^^^^^^^^^^^^^^
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 Found 2 diagnostics
@@ -447,7 +447,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
 3 |
 4 | print(z)
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 error[unresolved-import]: Cannot resolve imported module `does_not_exist`
@@ -456,7 +456,7 @@ fn check_specific_paths() -> anyhow::Result<()> {
 2 | import does_not_exist  # error: unresolved-import
   | ^^^^^^^^^^^^^^
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 Found 2 diagnostics
````
````diff
@@ -333,7 +333,7 @@ import bar",
   | ^^^
 2 | import bar
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 Found 1 diagnostic
@@ -909,7 +909,7 @@ fn check_conda_prefix_var_to_resolve_path() -> anyhow::Result<()> {
 2 | import package1
   | ^^^^^^^^
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 Found 1 diagnostic
@@ -1206,7 +1206,7 @@ fn default_root_tests_package() -> anyhow::Result<()> {
 4 |
 5 | print(f"{foo} {bar}")
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 Found 1 diagnostic
````
````diff
@@ -101,7 +101,7 @@ fn cli_rule_severity() -> anyhow::Result<()> {
 3 |
 4 | y = 4 / 0
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` is enabled by default

 error[unresolved-reference]: Name `prin` used when not defined
@@ -141,7 +141,7 @@ fn cli_rule_severity() -> anyhow::Result<()> {
 3 |
 4 | y = 4 / 0
   |
-info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
+info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
 info: rule `unresolved-import` was selected on the command line

 warning[division-by-zero]: Cannot divide object of type `Literal[4]` by zero
````
````diff
@@ -445,7 +445,6 @@ mod tests {
     test.assert_completions_do_not_include("_T");
     // Dunder attributes should not be stripped
     test.assert_completions_include("__annotations__");
-    // See `private_symbols_in_stub` for more comprehensive testing of private symbol filtering.
 }

 #[test]
@@ -510,112 +509,6 @@ re.<CURSOR>
     test.assert_completions_include("findall");
 }

-#[test]
-fn private_symbols_in_stub() {
-    let test = CursorTest::builder()
-        .source(
-            "package/__init__.pyi",
-            r#"\
-from typing import TypeAlias, Literal, TypeVar, ParamSpec, TypeVarTuple, Protocol
-
-public_name = 1
-_private_name = 1
-__mangled_name = 1
-__dunder_name__ = 1
-
-public_type_var = TypeVar("public_type_var")
-_private_type_var = TypeVar("_private_type_var")
-__mangled_type_var = TypeVar("__mangled_type_var")
-
-public_param_spec = ParamSpec("public_param_spec")
-_private_param_spec = ParamSpec("_private_param_spec")
-
-public_type_var_tuple = TypeVarTuple("public_type_var_tuple")
-_private_type_var_tuple = TypeVarTuple("_private_type_var_tuple")
-
-public_explicit_type_alias: TypeAlias = Literal[1]
-_private_explicit_type_alias: TypeAlias = Literal[1]
-
-class PublicProtocol(Protocol):
-    def method(self) -> None: ...
-
-class _PrivateProtocol(Protocol):
-    def method(self) -> None: ...
-"#,
-        )
-        .source("main.py", "import package; package.<CURSOR>")
-        .build();
-    test.assert_completions_include("public_name");
-    test.assert_completions_include("_private_name");
-    test.assert_completions_include("__mangled_name");
-    test.assert_completions_include("__dunder_name__");
-    test.assert_completions_include("public_type_var");
-    test.assert_completions_do_not_include("_private_type_var");
-    test.assert_completions_do_not_include("__mangled_type_var");
-    test.assert_completions_include("public_param_spec");
-    test.assert_completions_do_not_include("_private_param_spec");
-    test.assert_completions_include("public_type_var_tuple");
-    test.assert_completions_do_not_include("_private_type_var_tuple");
-    test.assert_completions_include("public_explicit_type_alias");
-    test.assert_completions_include("_private_explicit_type_alias");
-    test.assert_completions_include("PublicProtocol");
-    test.assert_completions_do_not_include("_PrivateProtocol");
-}
-
 /// Unlike [`private_symbols_in_stub`], this test doesn't use a `.pyi` file so all of the names
 /// are visible.
 #[test]
 fn private_symbols_in_module() {
     let test = CursorTest::builder()
         .source(
             "package/__init__.py",
             r#"\
 from typing import TypeAlias, Literal, TypeVar, ParamSpec, TypeVarTuple, Protocol

 public_name = 1
 _private_name = 1
 __mangled_name = 1
 __dunder_name__ = 1

 public_type_var = TypeVar("public_type_var")
 _private_type_var = TypeVar("_private_type_var")
 __mangled_type_var = TypeVar("__mangled_type_var")

 public_param_spec = ParamSpec("public_param_spec")
 _private_param_spec = ParamSpec("_private_param_spec")

 public_type_var_tuple = TypeVarTuple("public_type_var_tuple")
 _private_type_var_tuple = TypeVarTuple("_private_type_var_tuple")

 public_explicit_type_alias: TypeAlias = Literal[1]
 _private_explicit_type_alias: TypeAlias = Literal[1]

 class PublicProtocol(Protocol):
     def method(self) -> None: ...

 class _PrivateProtocol(Protocol):
     def method(self) -> None: ...
 "#,
         )
         .source("main.py", "import package; package.<CURSOR>")
         .build();
     test.assert_completions_include("public_name");
````
test.assert_completions_include("_private_name");
|
||||
test.assert_completions_include("__mangled_name");
|
||||
test.assert_completions_include("__dunder_name__");
|
||||
test.assert_completions_include("public_type_var");
|
||||
test.assert_completions_include("_private_type_var");
|
||||
test.assert_completions_include("__mangled_type_var");
|
||||
test.assert_completions_include("public_param_spec");
|
||||
test.assert_completions_include("_private_param_spec");
|
||||
test.assert_completions_include("public_type_var_tuple");
|
||||
test.assert_completions_include("_private_type_var_tuple");
|
||||
test.assert_completions_include("public_explicit_type_alias");
|
||||
test.assert_completions_include("_private_explicit_type_alias");
|
||||
test.assert_completions_include("PublicProtocol");
|
||||
test.assert_completions_include("_PrivateProtocol");
|
||||
}
|
||||
|
||||
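The two tests above assert that, in stub files, underscore-prefixed type-definition helpers (`TypeVar`, `ParamSpec`, `TypeVarTuple`, private `Protocol` classes) are hidden from completions while ordinary private names and dunder names stay visible. A minimal sketch of that filtering rule — the helper name and exact criteria here are illustrative assumptions, not the crate's actual API:

```python
def is_hidden_in_stub(name: str, is_type_definition: bool) -> bool:
    """Hypothetical sketch of the stub completion filter: hide
    underscore-prefixed TypeVar/ParamSpec/TypeVarTuple (and similar
    type-definition) symbols, but keep dunder names and plain
    private assignments visible."""
    if name.startswith("__") and name.endswith("__"):
        return False  # dunders like __annotations__ stay visible
    if is_type_definition:
        # _private_type_var, __mangled_type_var, _PrivateProtocol, ...
        return name.startswith("_")
    return False  # plain _private_name remains visible

assert not is_hidden_in_stub("public_name", False)
assert not is_hidden_in_stub("_private_name", False)
assert not is_hidden_in_stub("__dunder_name__", False)
assert is_hidden_in_stub("_private_type_var", True)
assert is_hidden_in_stub("__mangled_type_var", True)
```

In a non-stub module (the second test) none of this applies: every name is offered.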
    #[test]
    fn one_function_prefix() {
        let test = cursor_test(

@@ -118,7 +118,7 @@ impl fmt::Display for DisplayHoverContent<'_, '_> {
        match self.content {
            HoverContent::Type(ty) => self
                .kind
                .fenced_code_block(ty.display(self.db), "python")
                .fenced_code_block(ty.display(self.db), "text")
                .fmt(f),
        }
    }
@@ -148,7 +148,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        Literal[10]
        ---------------------------------------------
        ```python
        ```text
        Literal[10]
        ```
        ---------------------------------------------
@@ -184,7 +184,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        int
        ---------------------------------------------
        ```python
        ```text
        int
        ```
        ---------------------------------------------
@@ -214,7 +214,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        def foo(a, b) -> Unknown
        ---------------------------------------------
        ```python
        ```text
        def foo(a, b) -> Unknown
        ```
        ---------------------------------------------
@@ -243,7 +243,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        bool
        ---------------------------------------------
        ```python
        ```text
        bool
        ```
        ---------------------------------------------
@@ -274,7 +274,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        Literal[123]
        ---------------------------------------------
        ```python
        ```text
        Literal[123]
        ```
        ---------------------------------------------
@@ -312,7 +312,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        (def foo(a, b) -> Unknown) | (def bar(a, b) -> Unknown)
        ---------------------------------------------
        ```python
        ```text
        (def foo(a, b) -> Unknown) | (def bar(a, b) -> Unknown)
        ```
        ---------------------------------------------
@@ -344,7 +344,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        <module 'lib'>
        ---------------------------------------------
        ```python
        ```text
        <module 'lib'>
        ```
        ---------------------------------------------
@@ -373,7 +373,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        T
        ---------------------------------------------
        ```python
        ```text
        T
        ```
        ---------------------------------------------
@@ -399,7 +399,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        @Todo
        ---------------------------------------------
        ```python
        ```text
        @Todo
        ```
        ---------------------------------------------
@@ -425,7 +425,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        @Todo
        ---------------------------------------------
        ```python
        ```text
        @Todo
        ```
        ---------------------------------------------
@@ -451,7 +451,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        Literal[1]
        ---------------------------------------------
        ```python
        ```text
        Literal[1]
        ```
        ---------------------------------------------
@@ -482,7 +482,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        Literal[1]
        ---------------------------------------------
        ```python
        ```text
        Literal[1]
        ```
        ---------------------------------------------
@@ -512,7 +512,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        Literal[2]
        ---------------------------------------------
        ```python
        ```text
        Literal[2]
        ```
        ---------------------------------------------
@@ -545,7 +545,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        Unknown | Literal[1]
        ---------------------------------------------
        ```python
        ```text
        Unknown | Literal[1]
        ```
        ---------------------------------------------
@@ -574,7 +574,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        int
        ---------------------------------------------
        ```python
        ```text
        int
        ```
        ---------------------------------------------
@@ -602,7 +602,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        Literal[1]
        ---------------------------------------------
        ```python
        ```text
        Literal[1]
        ```
        ---------------------------------------------
@@ -631,7 +631,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        int
        ---------------------------------------------
        ```python
        ```text
        int
        ```
        ---------------------------------------------
@@ -661,7 +661,7 @@ mod tests {
        assert_snapshot!(test.hover(), @r"
        str
        ---------------------------------------------
        ```python
        ```text
        str
        ```
        ---------------------------------------------
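The hover hunks above swap the fence language used when a type is rendered to Markdown (from `python` to `text`), which is why each snapshot shows both fence variants. A small illustrative helper — not the actual Rust `fenced_code_block` implementation — shows the shape of the output:

```python
def fenced_code_block(content: str, language: str) -> str:
    # Wrap the hover text in a Markdown fence so clients render it
    # verbatim; the language tag controls syntax highlighting.
    return f"```{language}\n{content}\n```\n"

print(fenced_code_block("Literal[10]", "text"))
# prints a ```text fence containing Literal[10]
```

Using `text` instead of `python` avoids highlighting display-only type syntax (like `@Todo` or `<module 'lib'>`) as if it were Python code.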
@@ -83,20 +83,15 @@ impl ProjectDatabase {

    /// Checks all open files in the project and its dependencies.
    pub fn check(&self) -> Vec<Diagnostic> {
        self.check_with_mode(CheckMode::OpenFiles)
        let mut reporter = DummyReporter;
        let reporter = AssertUnwindSafe(&mut reporter as &mut dyn Reporter);
        self.project().check(self, reporter)
    }

    /// Checks all open files in the project and its dependencies, using the given reporter.
    pub fn check_with_reporter(&self, reporter: &mut dyn Reporter) -> Vec<Diagnostic> {
        let reporter = AssertUnwindSafe(reporter);
        self.project().check(self, CheckMode::OpenFiles, reporter)
    }

    /// Check the project with the given mode.
    pub fn check_with_mode(&self, mode: CheckMode) -> Vec<Diagnostic> {
        let mut reporter = DummyReporter;
        let reporter = AssertUnwindSafe(&mut reporter as &mut dyn Reporter);
        self.project().check(self, mode, reporter)
        self.project().check(self, reporter)
    }

    #[tracing::instrument(level = "debug", skip(self))]
@@ -162,17 +157,6 @@ impl std::fmt::Debug for ProjectDatabase {
    }
}

#[derive(Debug, Clone, Copy, PartialEq)]
pub enum CheckMode {
    /// Checks only the open files in the project.
    OpenFiles,

    /// Checks all files in the project, ignoring the open file set.
    ///
    /// This includes virtual files, such as those created by the language server.
    AllFiles,
}

/// Stores memory usage information.
pub struct SalsaMemoryDump {
    total_fields: usize,
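The hunk above removes the `CheckMode` enum, which selected between checking only the editor's open files and the whole project. An illustrative Python sketch of that dispatch (the names mirror the Rust code but are assumptions, not a real API):

```python
from enum import Enum, auto

class CheckMode(Enum):
    OPEN_FILES = auto()  # check only files currently open in the editor
    ALL_FILES = auto()   # check every indexed file in the project

def files_to_check(mode: CheckMode,
                   open_files: list[str],
                   all_files: list[str]) -> list[str]:
    # Mirrors the `match mode { ... }` in `Project::check`:
    # OpenFiles narrows the set, AllFiles uses the full index.
    if mode is CheckMode.OPEN_FILES:
        return open_files
    return all_files

assert files_to_check(CheckMode.OPEN_FILES, ["a.py"], ["a.py", "b.py"]) == ["a.py"]
assert files_to_check(CheckMode.ALL_FILES, ["a.py"], ["a.py", "b.py"]) == ["a.py", "b.py"]
```

With the enum removed, `check` always behaves like the open-files variant.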
@@ -1,7 +1,7 @@
use crate::glob::{GlobFilterCheckMode, IncludeResult};
use crate::metadata::options::{OptionDiagnostic, ToSettingsError};
use crate::walk::{ProjectFilesFilter, ProjectFilesWalker};
pub use db::{CheckMode, Db, ProjectDatabase, SalsaMemoryDump};
pub use db::{Db, ProjectDatabase, SalsaMemoryDump};
use files::{Index, Indexed, IndexedFiles};
use metadata::settings::Settings;
pub use metadata::{ProjectMetadata, ProjectMetadataError};
@@ -214,7 +214,6 @@ impl Project {
    pub(crate) fn check(
        self,
        db: &ProjectDatabase,
        mode: CheckMode,
        mut reporter: AssertUnwindSafe<&mut dyn Reporter>,
    ) -> Vec<Diagnostic> {
        let project_span = tracing::debug_span!("Project::check");
@@ -229,11 +228,7 @@ impl Project {
                .map(OptionDiagnostic::to_diagnostic),
        );

        let files = match mode {
            CheckMode::OpenFiles => ProjectFiles::new(db, self),
            // TODO: Consider open virtual files as well
            CheckMode::AllFiles => ProjectFiles::Indexed(self.files(db)),
        };
        let files = ProjectFiles::new(db, self);
        reporter.set_files(files.len());

        diagnostics.extend(
@@ -87,8 +87,13 @@ c_instance = C()

reveal_type(c_instance.declared_and_bound)  # revealed: str | None

reveal_type(C.declared_and_bound)  # revealed: str | None
# Note that both mypy and pyright show no error in this case! So we may reconsider this in
# the future, if it turns out to produce too many false positives. We currently emit:
# error: [unresolved-attribute] "Attribute `declared_and_bound` can only be accessed on instances, not on the class object `<class 'C'>` itself."
reveal_type(C.declared_and_bound)  # revealed: Unknown

# Same as above. Mypy and pyright do not show an error here.
# error: [invalid-attribute-access] "Cannot assign to instance attribute `declared_and_bound` from the class object `<class 'C'>`"
C.declared_and_bound = "overwritten on class"

# error: [invalid-assignment] "Object of type `Literal[1]` is not assignable to attribute `declared_and_bound` of type `str | None`"
@@ -97,11 +102,8 @@ c_instance.declared_and_bound = 1

#### Variable declared in class body and not bound anywhere

If a variable is declared in the class body but not bound anywhere, we consider it to be accessible
on instances and the class itself. It would be more consistent to treat this as a pure instance
variable (and require the attribute to be annotated with `ClassVar` if it should be accessible on
the class as well), but other type checkers allow this as well. This is also heavily relied on in
the Python ecosystem:
If a variable is declared in the class body but not bound anywhere, we still consider it a pure
instance variable and allow access to it via instances.

```py
class C:
@@ -111,8 +113,11 @@ c_instance = C()

reveal_type(c_instance.only_declared)  # revealed: str

reveal_type(C.only_declared)  # revealed: str
# Mypy and pyright do not show an error here. We treat this as a pure instance variable.
# error: [unresolved-attribute] "Attribute `only_declared` can only be accessed on instances, not on the class object `<class 'C'>` itself."
reveal_type(C.only_declared)  # revealed: Unknown

# error: [invalid-attribute-access] "Cannot assign to instance attribute `only_declared` from the class object `<class 'C'>`"
C.only_declared = "overwritten on class"
```

@@ -1230,16 +1235,6 @@ def _(flag: bool):
reveal_type(Derived().x)  # revealed: int | Any

Derived().x = 1

# TODO
# The following assignment currently fails, because we first check if "a" is assignable to the
# attribute on the meta-type of `Derived`, i.e. `<class 'Derived'>`. When accessing the class
# member `x` on `Derived`, we only see the `x: int` declaration and do not union it with the
# type of the base class attribute `x: Any`. This could potentially be improved. Note that we
# see a type of `int | Any` above because we have the full union handling of possibly-unbound
# *instance* attributes.

# error: [invalid-assignment] "Object of type `Literal["a"]` is not assignable to attribute `x` of type `int`"
Derived().x = "a"
```

@@ -1304,8 +1299,10 @@ def _(flag: bool):
        if flag:
            self.x = 1

# error: [possibly-unbound-attribute]
reveal_type(Foo().x)  # revealed: int | Unknown

# error: [possibly-unbound-attribute]
Foo().x = 1
```


@@ -120,7 +120,8 @@ def _(flag: bool):

### Dunder methods as class-level annotations with no value

Class-level annotations with no value assigned are considered to be accessible on the class:
Class-level annotations with no value assigned are considered instance-only, and aren't available as
dunder methods:

```py
from typing import Callable
@@ -128,8 +129,10 @@ from typing import Callable
class C:
    __call__: Callable[..., None]

# error: [call-non-callable]
C()()

# error: [invalid-assignment]
_: Callable[..., None] = C()
```

@@ -810,6 +810,21 @@ D(1)  # OK
D()  # error: [missing-argument]
```

### Accessing instance attributes on the class itself

Just like for normal classes, accessing instance attributes on the class itself is not allowed:

```py
from dataclasses import dataclass

@dataclass
class C:
    x: int

# error: [unresolved-attribute] "Attribute `x` can only be accessed on instances, not on the class object `<class 'C'>` itself."
C.x
```

### Return type of `dataclass(...)`

A call like `dataclass(order=True)` returns a callable itself, which is then used as the decorator.
@@ -533,13 +533,7 @@ class FooSubclassOfAny:
    x: SubclassOfAny

static_assert(not is_subtype_of(FooSubclassOfAny, HasX))

# `FooSubclassOfAny` is assignable to `HasX` for the following reason. The `x` attribute on `FooSubclassOfAny`
# is accessible on the class itself. When accessing `x` on an instance, the descriptor protocol is invoked, and
# `__get__` is looked up on `SubclassOfAny`. Every member access on `SubclassOfAny` yields `Any`, so `__get__` is
# also available, and calling `Any` also yields `Any`. Thus, accessing `x` on an instance of `FooSubclassOfAny`
# yields `Any`, which is assignable to `int` and vice versa.
static_assert(is_assignable_to(FooSubclassOfAny, HasX))
static_assert(not is_assignable_to(FooSubclassOfAny, HasX))

class FooWithY(Foo):
    y: int
@@ -1592,7 +1586,11 @@ def g(a: Truthy, b: FalsyFoo, c: FalsyFooSubclass):
    reveal_type(bool(c))  # revealed: Literal[False]
```

The same works with a class-level declaration of `__bool__`:
It is not sufficient for a protocol to have a callable `__bool__` instance member that returns
`Literal[True]` for it to be considered always truthy. Dunder methods are looked up on the class
rather than the instance. If a protocol `X` has an instance-attribute `__bool__` member, it is
unknowable whether that attribute can be accessed on the type of an object that satisfies `X`'s
interface:

```py
from typing import Callable
@@ -1601,7 +1599,7 @@ class InstanceAttrBool(Protocol):
    __bool__: Callable[[], Literal[True]]

def h(obj: InstanceAttrBool):
    reveal_type(bool(obj))  # revealed: Literal[True]
    reveal_type(bool(obj))  # revealed: bool
```

## Callable protocols

@@ -1834,8 +1832,7 @@ def _(r: Recursive):
    reveal_type(r.direct)  # revealed: Recursive
    reveal_type(r.union)  # revealed: None | Recursive
    reveal_type(r.intersection1)  # revealed: C & Recursive
    # revealed: @Todo(map_with_boundness: intersections with negative contributions) | (C & ~Recursive)
    reveal_type(r.intersection2)
    reveal_type(r.intersection2)  # revealed: C & ~Recursive
    reveal_type(r.t)  # revealed: tuple[int, tuple[str, Recursive]]
    reveal_type(r.callable1)  # revealed: (int, /) -> Recursive
    reveal_type(r.callable2)  # revealed: (Recursive, /) -> int
@@ -26,7 +26,7 @@ error[unresolved-import]: Cannot resolve imported module `does_not_exist`
2 | from does_not_exist import foo, bar, baz
  |      ^^^^^^^^^^^^^^
  |
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: rule `unresolved-import` is enabled by default

```

@@ -24,7 +24,7 @@ error[unresolved-import]: Cannot resolve imported module `zqzqzqzqzqzqzq`
1 | import zqzqzqzqzqzqzq  # error: [unresolved-import] "Cannot resolve imported module `zqzqzqzqzqzqzq`"
  |        ^^^^^^^^^^^^^^
  |
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: rule `unresolved-import` is enabled by default

```

@@ -36,7 +36,7 @@ error[unresolved-import]: Cannot resolve imported module `a.foo`
3 |
4 | # Topmost component unresolvable:
  |
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: rule `unresolved-import` is enabled by default

```
@@ -49,7 +49,7 @@ error[unresolved-import]: Cannot resolve imported module `b.foo`
5 | import b.foo  # error: [unresolved-import] "Cannot resolve imported module `b.foo`"
  |        ^^^^^
  |
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: rule `unresolved-import` is enabled by default

```

@@ -4,7 +4,7 @@ expression: snapshot
---
---
mdtest name: dataclasses.md - Dataclasses - `dataclasses.KW_ONLY`
mdtest path: crates/ty_python_semantic/resources/mdtest/dataclasses/dataclasses.md
mdtest path: crates/ty_python_semantic/resources/mdtest/dataclasses.md
---

# Python source files

@@ -28,7 +28,7 @@ error[unresolved-import]: Cannot resolve imported module `does_not_exist`
2 |
3 | x = does_not_exist.foo
  |
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: rule `unresolved-import` is enabled by default

```

@@ -28,7 +28,7 @@ error[unresolved-import]: Cannot resolve imported module `does_not_exist`
2 |
3 | stat = add(10, 15)
  |
info: make sure your Python environment is properly configured: https://docs.astral.sh/ty/modules/#python-environment
info: make sure your Python environment is properly configured: https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment
info: rule `unresolved-import` is enabled by default

```

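The `type[A]`-to-`Callable` assertions in the hunks that follow rest on the fact that calling a class invokes its `__init__`/`__new__` signature, so a class object can stand in for a callable with that signature. A runnable illustration:

```python
from typing import Callable

class A:
    def __init__(self, x: int) -> None:
        self.x = x

# A class object is usable where Callable[[int], A] is expected,
# because calling `A` forwards its arguments to `A.__init__`.
def apply(factory: Callable[[int], A], value: int) -> A:
    return factory(value)

assert apply(A, 41).x == 41
```

This is exactly why `is_assignable_to(type[A], Callable[[int], A])` holds while the `Callable[[str], A]` variant does not.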
@@ -1064,37 +1064,6 @@ static_assert(not is_assignable_to(A, Callable[[int], int]))
reveal_type(A()(1))  # revealed: str
```

### Subclass of

#### Type of a class with constructor methods

```py
from typing import Callable
from ty_extensions import static_assert, is_assignable_to

class A:
    def __init__(self, x: int) -> None: ...

class B:
    def __new__(cls, x: str) -> "B":
        return super().__new__(cls)

static_assert(is_assignable_to(type[A], Callable[[int], A]))
static_assert(not is_assignable_to(type[A], Callable[[str], A]))

static_assert(is_assignable_to(type[B], Callable[[str], B]))
static_assert(not is_assignable_to(type[B], Callable[[int], B]))
```

#### Type with no generic parameters

```py
from typing import Callable, Any
from ty_extensions import static_assert, is_assignable_to

static_assert(is_assignable_to(type, Callable[..., Any]))
```

## Generics

### Assignability of generic types parameterized by gradual types

@@ -1752,28 +1752,6 @@ static_assert(not is_subtype_of(TypeOf[F], Callable[[], str]))
static_assert(not is_subtype_of(TypeOf[F], Callable[[int], F]))
```

### Subclass of

#### Type of a class with constructor methods

```py
from typing import Callable
from ty_extensions import TypeOf, static_assert, is_subtype_of

class A:
    def __init__(self, x: int) -> None: ...

class B:
    def __new__(cls, x: str) -> "B":
        return super().__new__(cls)

static_assert(is_subtype_of(type[A], Callable[[int], A]))
static_assert(not is_subtype_of(type[A], Callable[[str], A]))

static_assert(is_subtype_of(type[B], Callable[[str], B]))
static_assert(not is_subtype_of(type[B], Callable[[int], B]))
```

### Bound methods

```py

@@ -64,6 +64,24 @@ c = C()
c.a = 2
```

and similarly here:

```py
class Base:
    a: ClassVar[int] = 1

class Derived(Base):
    if flag():
        a: int

reveal_type(Derived.a)  # revealed: int

d = Derived()

# error: [invalid-attribute-access]
d.a = 2
```

## Too many arguments

```py

@@ -6,7 +6,9 @@ use ruff_python_ast::name::Name;
use ruff_python_ast::statement_visitor::{StatementVisitor, walk_stmt};
use ruff_python_ast::{self as ast};

use crate::semantic_index::{SemanticIndex, semantic_index};
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::place::ScopeId;
use crate::semantic_index::{SemanticIndex, global_scope, semantic_index};
use crate::types::{Truthiness, Type, infer_expression_types};
use crate::{Db, ModuleName, resolve_module};

@@ -42,6 +44,11 @@ struct DunderAllNamesCollector<'db> {
    db: &'db dyn Db,
    file: File,

    /// The scope from which the `__all__` names are being collected.
    ///
    /// This is always going to be the global scope of the module.
    scope: ScopeId<'db>,

    /// The semantic index for the module.
    index: &'db SemanticIndex<'db>,

@@ -61,6 +68,7 @@ impl<'db> DunderAllNamesCollector<'db> {
        Self {
            db,
            file,
            scope: global_scope(db, file),
            index,
            origin: None,
            invalid: false,
@@ -182,7 +190,8 @@ impl<'db> DunderAllNamesCollector<'db> {
    ///
    /// This function panics if `expr` was not marked as a standalone expression during semantic indexing.
    fn standalone_expression_type(&self, expr: &ast::Expr) -> Type<'db> {
        infer_expression_types(self.db, self.index.expression(expr)).expression_type(expr)
        infer_expression_types(self.db, self.index.expression(expr))
            .expression_type(expr.scoped_expression_id(self.db, self.scope))
    }

    /// Evaluate the given expression and return its truthiness.
@@ -235,28 +235,29 @@ pub(crate) fn class_symbol<'db>(
) -> PlaceAndQualifiers<'db> {
    place_table(db, scope)
        .place_id_by_name(name)
        .map(|place| {
            let place_and_quals = place_by_id(
        .map(|symbol| {
            let symbol_and_quals = place_by_id(
                db,
                scope,
                place,
                symbol,
                RequiresExplicitReExport::No,
                ConsideredDefinitions::EndOfScope,
            );

            if !place_and_quals.place.is_unbound() {
                // Trust the declared type if we see a class-level declaration
                return place_and_quals;
            if symbol_and_quals.is_class_var() {
                // For declared class vars we do not need to check if they have bindings,
                // we just trust the declaration.
                return symbol_and_quals;
            }

            if let PlaceAndQualifiers {
                place: Place::Type(ty, _),
                qualifiers,
            } = place_and_quals
            } = symbol_and_quals
            {
                // Otherwise, we need to check if the symbol has bindings
                let use_def = use_def_map(db, scope);
                let bindings = use_def.end_of_scope_bindings(place);
                let bindings = use_def.end_of_scope_bindings(symbol);
                let inferred = place_from_bindings_impl(db, bindings, RequiresExplicitReExport::No);

                // TODO: we should not need to calculate the inferred type a second time. This is a temporary
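The hunks that follow reintroduce an `AstIds` structure that interns AST nodes into dense, per-scope IDs assigned in insertion order. A simplified Python sketch of the same interning idea (hypothetical names, not the crate's API):

```python
class AstIdsBuilder:
    """Sketch of dense-ID interning: each newly recorded expression
    gets the next integer ID, so IDs are contiguous and ordered by
    the traversal order in which expressions were visited."""

    def __init__(self) -> None:
        self.expressions_map: dict[str, int] = {}

    def record_expression(self, node_key: str) -> int:
        # The next ID is simply the current map size.
        expression_id = len(self.expressions_map)
        self.expressions_map[node_key] = expression_id
        return expression_id

builder = AstIdsBuilder()
assert builder.record_expression("expr:1+2") == 0
assert builder.record_expression("expr:x") == 1
assert builder.expressions_map["expr:1+2"] == 0
```

Dense IDs make lookups cheap and let downstream tables be indexed by plain vectors instead of hash maps keyed on AST nodes.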
@@ -26,11 +26,20 @@ use crate::semantic_index::semantic_index;
|
||||
/// ```
|
||||
#[derive(Debug, salsa::Update, get_size2::GetSize)]
|
||||
pub(crate) struct AstIds {
|
||||
/// Maps expressions to their expression id.
|
||||
expressions_map: FxHashMap<ExpressionNodeKey, ScopedExpressionId>,
|
||||
/// Maps expressions which "use" a place (that is, [`ast::ExprName`], [`ast::ExprAttribute`] or [`ast::ExprSubscript`]) to a use id.
|
||||
uses_map: FxHashMap<ExpressionNodeKey, ScopedUseId>,
|
||||
}
|
||||
|
||||
impl AstIds {
|
||||
fn expression_id(&self, key: impl Into<ExpressionNodeKey>) -> ScopedExpressionId {
|
||||
let key = &key.into();
|
||||
*self.expressions_map.get(key).unwrap_or_else(|| {
|
||||
panic!("Could not find expression ID for {key:?}");
|
||||
})
|
||||
}
|
||||
|
||||
fn use_id(&self, key: impl Into<ExpressionNodeKey>) -> ScopedUseId {
|
||||
self.uses_map[&key.into()]
|
||||
}
|
||||
@@ -85,12 +94,90 @@ impl HasScopedUseId for ast::ExprRef<'_> {
|
||||
}
|
||||
}
|
||||
|
||||
/// Uniquely identifies an [`ast::Expr`] in a [`crate::semantic_index::place::FileScopeId`].
|
||||
#[newtype_index]
|
||||
#[derive(salsa::Update, get_size2::GetSize)]
|
||||
pub struct ScopedExpressionId;
|
||||
|
||||
pub trait HasScopedExpressionId {
|
||||
/// Returns the ID that uniquely identifies the node in `scope`.
|
||||
fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId;
|
||||
}
|
||||
|
||||
impl<T: HasScopedExpressionId> HasScopedExpressionId for Box<T> {
|
||||
fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId {
|
||||
self.as_ref().scoped_expression_id(db, scope)
|
||||
}
|
||||
}
|
||||
|
||||
macro_rules! impl_has_scoped_expression_id {
|
||||
($ty: ty) => {
|
||||
impl HasScopedExpressionId for $ty {
|
||||
fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId {
|
||||
let expression_ref = ExprRef::from(self);
|
||||
expression_ref.scoped_expression_id(db, scope)
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
impl_has_scoped_expression_id!(ast::ExprBoolOp);
impl_has_scoped_expression_id!(ast::ExprName);
impl_has_scoped_expression_id!(ast::ExprBinOp);
impl_has_scoped_expression_id!(ast::ExprUnaryOp);
impl_has_scoped_expression_id!(ast::ExprLambda);
impl_has_scoped_expression_id!(ast::ExprIf);
impl_has_scoped_expression_id!(ast::ExprDict);
impl_has_scoped_expression_id!(ast::ExprSet);
impl_has_scoped_expression_id!(ast::ExprListComp);
impl_has_scoped_expression_id!(ast::ExprSetComp);
impl_has_scoped_expression_id!(ast::ExprDictComp);
impl_has_scoped_expression_id!(ast::ExprGenerator);
impl_has_scoped_expression_id!(ast::ExprAwait);
impl_has_scoped_expression_id!(ast::ExprYield);
impl_has_scoped_expression_id!(ast::ExprYieldFrom);
impl_has_scoped_expression_id!(ast::ExprCompare);
impl_has_scoped_expression_id!(ast::ExprCall);
impl_has_scoped_expression_id!(ast::ExprFString);
impl_has_scoped_expression_id!(ast::ExprStringLiteral);
impl_has_scoped_expression_id!(ast::ExprBytesLiteral);
impl_has_scoped_expression_id!(ast::ExprNumberLiteral);
impl_has_scoped_expression_id!(ast::ExprBooleanLiteral);
impl_has_scoped_expression_id!(ast::ExprNoneLiteral);
impl_has_scoped_expression_id!(ast::ExprEllipsisLiteral);
impl_has_scoped_expression_id!(ast::ExprAttribute);
impl_has_scoped_expression_id!(ast::ExprSubscript);
impl_has_scoped_expression_id!(ast::ExprStarred);
impl_has_scoped_expression_id!(ast::ExprNamed);
impl_has_scoped_expression_id!(ast::ExprList);
impl_has_scoped_expression_id!(ast::ExprTuple);
impl_has_scoped_expression_id!(ast::ExprSlice);
impl_has_scoped_expression_id!(ast::ExprIpyEscapeCommand);
impl_has_scoped_expression_id!(ast::Expr);

impl HasScopedExpressionId for ast::ExprRef<'_> {
    fn scoped_expression_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedExpressionId {
        let ast_ids = ast_ids(db, scope);
        ast_ids.expression_id(*self)
    }
}

#[derive(Debug, Default)]
pub(super) struct AstIdsBuilder {
    expressions_map: FxHashMap<ExpressionNodeKey, ScopedExpressionId>,
    uses_map: FxHashMap<ExpressionNodeKey, ScopedUseId>,
}

impl AstIdsBuilder {
    /// Adds `expr` to the expression ids map and returns its id.
    pub(super) fn record_expression(&mut self, expr: &ast::Expr) -> ScopedExpressionId {
        let expression_id = self.expressions_map.len().into();

        self.expressions_map.insert(expr.into(), expression_id);

        expression_id
    }

    /// Adds `expr` to the use ids map and returns its id.
    pub(super) fn record_use(&mut self, expr: impl Into<ExpressionNodeKey>) -> ScopedUseId {
        let use_id = self.uses_map.len().into();
@@ -101,9 +188,11 @@ impl AstIdsBuilder {
    }

    pub(super) fn finish(mut self) -> AstIds {
        self.expressions_map.shrink_to_fit();
        self.uses_map.shrink_to_fit();

        AstIds {
            expressions_map: self.expressions_map,
            uses_map: self.uses_map,
        }
    }
@@ -130,12 +219,6 @@ pub(crate) mod node_key {
        }
    }

    impl From<&ast::ExprCall> for ExpressionNodeKey {
        fn from(value: &ast::ExprCall) -> Self {
            Self(NodeKey::from_node(value))
        }
    }

    impl From<&ast::Identifier> for ExpressionNodeKey {
        fn from(value: &ast::Identifier) -> Self {
            Self(NodeKey::from_node(value))

@@ -1918,6 +1918,7 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {

        self.scopes_by_expression
            .insert(expr.into(), self.current_scope());
        self.current_ast_ids().record_expression(expr);

        let node_key = NodeKey::from_node(expr);

@@ -7,6 +7,7 @@ use ruff_source_file::LineIndex;
use crate::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::{KnownModule, Module, resolve_module};
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::place::FileScopeId;
use crate::semantic_index::semantic_index;
use crate::types::ide_support::all_declarations_and_bindings;
@@ -71,6 +72,8 @@ impl<'db> SemanticModel<'db> {
        let builtin = module.is_known(KnownModule::Builtins);
        crate::types::all_members(self.db, ty)
            .into_iter()
            // Filter out private members from `builtins`
            .filter(|name| !(builtin && matches!(NameKind::classify(name), NameKind::Sunder)))
            .map(|name| Completion { name, builtin })
            .collect()
    }
@@ -128,9 +131,9 @@ impl<'db> SemanticModel<'db> {
    }
}

/// A classification of symbol names.
/// A classification for completion names.
///
/// The ordering here is used for sorting completions.
/// The ordering here is used for sorting completions based only on name.
///
/// This sorts "normal" names first, then dunder names and finally
/// single-underscore names. This matches the order of the variants defined for
@@ -172,6 +175,9 @@ pub struct Completion {
    /// doesn't make it into the LSP response. Instead, we
    /// use it mainly in tests so that we can write less
    /// noisy tests.
    ///
    /// However, we do pre-filter private names from the
    /// builtin module before construction.
    pub builtin: bool,
}

@@ -189,7 +195,8 @@ impl HasType for ast::ExprRef<'_> {
        let file_scope = index.expression_scope_id(*self);
        let scope = file_scope.to_scope_id(model.db, model.file);

        infer_scope_types(model.db, scope).expression_type(*self)
        let expression_id = self.scoped_expression_id(model.db, scope);
        infer_scope_types(model.db, scope).expression_type(expression_id)
    }
}

@@ -33,6 +33,7 @@ pub(crate) use self::subclass_of::{SubclassOfInner, SubclassOfType};
use crate::module_name::ModuleName;
use crate::module_resolver::{KnownModule, resolve_module};
use crate::place::{Boundness, Place, PlaceAndQualifiers, imported_symbol};
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::definition::Definition;
use crate::semantic_index::place::{ScopeId, ScopedPlaceId};
use crate::semantic_index::{imported_modules, place_table, semantic_index};
@@ -142,17 +143,18 @@ fn definition_expression_type<'db>(
    let index = semantic_index(db, file);
    let file_scope = index.expression_scope_id(expression);
    let scope = file_scope.to_scope_id(db, file);
    let expr_id = expression.scoped_expression_id(db, scope);
    if scope == definition.scope(db) {
        // expression is in the definition scope
        let inference = infer_definition_types(db, definition);
        if let Some(ty) = inference.try_expression_type(expression) {
        if let Some(ty) = inference.try_expression_type(expr_id) {
            ty
        } else {
            infer_deferred_types(db, definition).expression_type(expression)
            infer_deferred_types(db, definition).expression_type(expr_id)
        }
    } else {
        // expression is in a type-params sub-scope
        infer_scope_types(db, scope).expression_type(expression)
        infer_scope_types(db, scope).expression_type(expr_id)
    }
}

@@ -1559,16 +1561,6 @@ impl<'db> Type<'db> {
                .into_callable(db)
                .has_relation_to(db, target, relation),

            // TODO: This is unsound so in future we can consider an opt-in option to disable it.
            (Type::SubclassOf(subclass_of_ty), Type::Callable(_))
                if subclass_of_ty.subclass_of().into_class().is_some() =>
            {
                let class = subclass_of_ty.subclass_of().into_class().unwrap();
                class
                    .into_callable(db)
                    .has_relation_to(db, target, relation)
            }

            // `Literal[str]` is a subtype of `type` because the `str` class object is an instance of its metaclass `type`.
            // `Literal[abc.ABC]` is a subtype of `abc.ABCMeta` because the `abc.ABC` class object
            // is an instance of its metaclass `abc.ABCMeta`.

@@ -32,6 +32,7 @@ use crate::{
        known_module_symbol, place_from_bindings, place_from_declarations,
    },
    semantic_index::{
        ast_ids::HasScopedExpressionId,
        attribute_assignments,
        definition::{DefinitionKind, TargetKind},
        place::ScopeId,
@@ -1860,8 +1861,10 @@ impl<'db> ClassLiteral<'db> {
                        // [.., self.name, ..] = <value>

                        let unpacked = infer_unpack_types(db, unpack);

                        let inferred_ty = unpacked.expression_type(assign.target(&module));
                        let target_ast_id = assign
                            .target(&module)
                            .scoped_expression_id(db, method_scope);
                        let inferred_ty = unpacked.expression_type(target_ast_id);

                        union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
                    }
@@ -1887,8 +1890,10 @@ impl<'db> ClassLiteral<'db> {
                        // for .., self.name, .. in <iterable>:

                        let unpacked = infer_unpack_types(db, unpack);
                        let inferred_ty =
                            unpacked.expression_type(for_stmt.target(&module));
                        let target_ast_id = for_stmt
                            .target(&module)
                            .scoped_expression_id(db, method_scope);
                        let inferred_ty = unpacked.expression_type(target_ast_id);

                        union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
                    }
@@ -1916,8 +1921,10 @@ impl<'db> ClassLiteral<'db> {
                        // with <context_manager> as .., self.name, ..:

                        let unpacked = infer_unpack_types(db, unpack);
                        let inferred_ty =
                            unpacked.expression_type(with_item.target(&module));
                        let target_ast_id = with_item
                            .target(&module)
                            .scoped_expression_id(db, method_scope);
                        let inferred_ty = unpacked.expression_type(target_ast_id);

                        union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
                    }
@@ -1944,9 +1951,10 @@ impl<'db> ClassLiteral<'db> {
                        // [... for .., self.name, .. in <iterable>]

                        let unpacked = infer_unpack_types(db, unpack);

                        let inferred_ty =
                            unpacked.expression_type(comprehension.target(&module));
                        let target_ast_id = comprehension
                            .target(&module)
                            .scoped_expression_id(db, unpack.target_scope(db));
                        let inferred_ty = unpacked.expression_type(target_ast_id);

                        union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
                    }

@@ -1,10 +1,10 @@
use crate::place::{Place, imported_symbol, place_from_bindings, place_from_declarations};
use crate::Db;
use crate::place::{imported_symbol, place_from_bindings, place_from_declarations};
use crate::semantic_index::place::ScopeId;
use crate::semantic_index::{
    attribute_scopes, global_scope, imported_modules, place_table, semantic_index, use_def_map,
};
use crate::types::{ClassBase, ClassLiteral, KnownClass, KnownInstanceType, Type};
use crate::{Db, NameKind};
use crate::types::{ClassBase, ClassLiteral, KnownClass, Type};
use ruff_python_ast::name::Name;
use rustc_hash::FxHashSet;

@@ -144,41 +144,13 @@ impl AllMembers {
                let Some(symbol_name) = place_table.place_expr(symbol_id).as_name() else {
                    continue;
                };
                let Place::Type(ty, _) = imported_symbol(db, file, symbol_name, None).place
                else {
                    continue;
                };

                // Filter private symbols from stubs if they appear to be internal types
                let is_stub_file = file.path(db).extension() == Some("pyi");
                let is_private_symbol = match NameKind::classify(symbol_name) {
                    NameKind::Dunder | NameKind::Normal => false,
                    NameKind::Sunder => true,
                };
                if is_private_symbol && is_stub_file {
                    match ty {
                        Type::NominalInstance(instance)
                            if matches!(
                                instance.class.known(db),
                                Some(
                                    KnownClass::TypeVar
                                        | KnownClass::TypeVarTuple
                                        | KnownClass::ParamSpec
                                )
                            ) =>
                        {
                            continue;
                        }
                        Type::ClassLiteral(class) if class.is_protocol(db) => continue,
                        Type::KnownInstance(
                            KnownInstanceType::TypeVar(_) | KnownInstanceType::TypeAliasType(_),
                        ) => continue,
                        _ => {}
                    }
                if !imported_symbol(db, file, symbol_name, None)
                    .place
                    .is_unbound()
                {
                    self.members
                        .insert(place_table.place_expr(symbol_id).expect_name().clone());
                }

                self.members
                    .insert(place_table.place_expr(symbol_id).expect_name().clone());
            }

            let module_name = module.name();

@@ -45,21 +45,6 @@ use rustc_hash::{FxHashMap, FxHashSet};
use salsa;
use salsa::plumbing::AsId;

use super::context::{InNoTypeCheck, InferContext};
use super::diagnostic::{
    INVALID_METACLASS, INVALID_OVERLOAD, INVALID_PROTOCOL, SUBCLASS_OF_FINAL_CLASS,
    hint_if_stdlib_submodule_exists_on_other_versions, report_attempted_protocol_instantiation,
    report_duplicate_bases, report_index_out_of_bounds, report_invalid_exception_caught,
    report_invalid_exception_cause, report_invalid_exception_raised,
    report_invalid_or_unsupported_base, report_invalid_type_checking_constant,
    report_non_subscriptable, report_possibly_unresolved_reference, report_slice_step_size_zero,
};
use super::generics::LegacyGenericBase;
use super::string_annotation::{
    BYTE_STRING_TYPE_ANNOTATION, FSTRING_TYPE_ANNOTATION, parse_string_annotation,
};
use super::subclass_of::SubclassOfInner;
use super::{ClassBase, NominalInstanceType, add_inferred_python_version_hint_to_diagnostic};
use crate::module_name::{ModuleName, ModuleNameResolutionError};
use crate::module_resolver::resolve_module;
use crate::node_key::NodeKey;
@@ -69,8 +54,9 @@ use crate::place::{
    module_type_implicit_global_declaration, module_type_implicit_global_symbol, place,
    place_from_bindings, place_from_declarations, typing_extensions_symbol,
};
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::ast_ids::{HasScopedUseId, ScopedUseId};
use crate::semantic_index::ast_ids::{
    HasScopedExpressionId, HasScopedUseId, ScopedExpressionId, ScopedUseId,
};
use crate::semantic_index::definition::{
    AnnotatedAssignmentDefinitionKind, AssignmentDefinitionKind, ComprehensionDefinitionKind,
    Definition, DefinitionKind, DefinitionNodeKey, DefinitionState, ExceptHandlerDefinitionKind,
@@ -124,6 +110,22 @@ use crate::util::diagnostics::format_enumeration;
use crate::util::subscript::{PyIndex, PySlice};
use crate::{Db, FxOrderSet, Program};

use super::context::{InNoTypeCheck, InferContext};
use super::diagnostic::{
    INVALID_METACLASS, INVALID_OVERLOAD, INVALID_PROTOCOL, SUBCLASS_OF_FINAL_CLASS,
    hint_if_stdlib_submodule_exists_on_other_versions, report_attempted_protocol_instantiation,
    report_duplicate_bases, report_index_out_of_bounds, report_invalid_exception_caught,
    report_invalid_exception_cause, report_invalid_exception_raised,
    report_invalid_or_unsupported_base, report_invalid_type_checking_constant,
    report_non_subscriptable, report_possibly_unresolved_reference, report_slice_step_size_zero,
};
use super::generics::LegacyGenericBase;
use super::string_annotation::{
    BYTE_STRING_TYPE_ANNOTATION, FSTRING_TYPE_ANNOTATION, parse_string_annotation,
};
use super::subclass_of::SubclassOfInner;
use super::{ClassBase, NominalInstanceType, add_inferred_python_version_hint_to_diagnostic};

/// Infer all types for a [`ScopeId`], including all definitions and expressions in that scope.
/// Use when checking a scope, or needing to provide a type for an arbitrary expression in the
/// scope.
@@ -279,7 +281,12 @@ pub(super) fn infer_same_file_expression_type<'db>(
    parsed: &ParsedModuleRef,
) -> Type<'db> {
    let inference = infer_expression_types(db, expression);
    inference.expression_type(expression.node_ref(db, parsed))
    let scope = expression.scope(db);
    inference.expression_type(
        expression
            .node_ref(db, parsed)
            .scoped_expression_id(db, scope),
    )
}

/// Infers the type of an expression where the expression might come from another file.
@@ -330,7 +337,7 @@ pub(super) fn infer_unpack_types<'db>(db: &'db dyn Db, unpack: Unpack<'db>) -> U
    let _span = tracing::trace_span!("infer_unpack_types", range=?unpack.range(db, &module), ?file)
        .entered();

    let mut unpacker = Unpacker::new(db, unpack.target_scope(db), &module);
    let mut unpacker = Unpacker::new(db, unpack.target_scope(db), unpack.value_scope(db), &module);
    unpacker.unpack(unpack.target(db, &module), unpack.value(db));
    unpacker.finish()
}
@@ -410,7 +417,7 @@ struct TypeAndRange<'db> {
#[derive(Debug, Eq, PartialEq, salsa::Update, get_size2::GetSize)]
pub(crate) struct TypeInference<'db> {
    /// The types of every expression in this region.
    expressions: FxHashMap<ExpressionNodeKey, Type<'db>>,
    expressions: FxHashMap<ScopedExpressionId, Type<'db>>,

    /// The types of every binding in this region.
    bindings: FxHashMap<Definition<'db>, Type<'db>>,
@@ -459,7 +466,7 @@ impl<'db> TypeInference<'db> {
    }

    #[track_caller]
    pub(crate) fn expression_type(&self, expression: impl Into<ExpressionNodeKey>) -> Type<'db> {
    pub(crate) fn expression_type(&self, expression: ScopedExpressionId) -> Type<'db> {
        self.try_expression_type(expression).expect(
            "Failed to retrieve the inferred type for an `ast::Expr` node \
            passed to `TypeInference::expression_type()`. The `TypeInferenceBuilder` \
@@ -468,12 +475,9 @@ impl<'db> TypeInference<'db> {
        )
    }

    pub(crate) fn try_expression_type(
        &self,
        expression: impl Into<ExpressionNodeKey>,
    ) -> Option<Type<'db>> {
    pub(crate) fn try_expression_type(&self, expression: ScopedExpressionId) -> Option<Type<'db>> {
        self.expressions
            .get(&expression.into())
            .get(&expression)
            .copied()
            .or(self.cycle_fallback_type)
    }
@@ -734,11 +738,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    /// this node.
    #[track_caller]
    fn expression_type(&self, expr: &ast::Expr) -> Type<'db> {
        self.types.expression_type(expr)
        self.types
            .expression_type(expr.scoped_expression_id(self.db(), self.scope()))
    }

    fn try_expression_type(&self, expr: &ast::Expr) -> Option<Type<'db>> {
        self.types.try_expression_type(expr)
        self.types
            .try_expression_type(expr.scoped_expression_id(self.db(), self.scope()))
    }

    /// Get the type of an expression from any scope in the same file.
@@ -756,11 +762,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    fn file_expression_type(&self, expression: &ast::Expr) -> Type<'db> {
        let file_scope = self.index.expression_scope_id(expression);
        let expr_scope = file_scope.to_scope_id(self.db(), self.file());
        let expr_id = expression.scoped_expression_id(self.db(), expr_scope);
        match self.region {
            InferenceRegion::Scope(scope) if scope == expr_scope => {
                self.expression_type(expression)
            }
            _ => infer_scope_types(self.db(), expr_scope).expression_type(expression),
            _ => infer_scope_types(self.db(), expr_scope).expression_type(expr_id),
        }
    }

@@ -1947,13 +1954,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
        function: &'a ast::StmtFunctionDef,
    ) -> impl Iterator<Item = Type<'db>> + 'a {
        let definition = self.index.expect_single_definition(function);

        let scope = definition.scope(self.db());
        let definition_types = infer_definition_types(self.db(), definition);

        function
            .decorator_list
            .iter()
            .map(move |decorator| definition_types.expression_type(&decorator.expression))
        function.decorator_list.iter().map(move |decorator| {
            definition_types
                .expression_type(decorator.expression.scoped_expression_id(self.db(), scope))
        })
    }

    /// Returns `true` if the current scope is the function body scope of a function overload (that
@@ -2752,10 +2759,11 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
        match with_item.target_kind() {
            TargetKind::Sequence(unpack_position, unpack) => {
                let unpacked = infer_unpack_types(self.db(), unpack);
                let target_ast_id = target.scoped_expression_id(self.db(), self.scope());
                if unpack_position == UnpackPosition::First {
                    self.context.extend(unpacked.diagnostics());
                }
                unpacked.expression_type(target)
                unpacked.expression_type(target_ast_id)
            }
            TargetKind::Single => {
                let context_expr_ty = self.infer_standalone_expression(context_expr);
@@ -3749,7 +3757,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
                    self.context.extend(unpacked.diagnostics());
                }

                unpacked.expression_type(target)
                let target_ast_id = target.scoped_expression_id(self.db(), self.scope());
                unpacked.expression_type(target_ast_id)
            }
            TargetKind::Single => {
                let value_ty = self.infer_standalone_expression(value);
@@ -3807,9 +3816,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
            // But here we explicitly overwrite the type for the overall `self.attr` node with
            // the annotated type. We do not use `store_expression_type` here, because it checks
            // that no type has been stored for the expression before.
            let expr_id = target.scoped_expression_id(self.db(), self.scope());
            self.types
                .expressions
                .insert((&**target).into(), annotated.inner_type());
                .insert(expr_id, annotated.inner_type());
        }
    }

@@ -4067,8 +4077,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
                if unpack_position == UnpackPosition::First {
                    self.context.extend(unpacked.diagnostics());
                }

                unpacked.expression_type(target)
                let target_ast_id = target.scoped_expression_id(self.db(), self.scope());
                unpacked.expression_type(target_ast_id)
            }
            TargetKind::Single => {
                let iterable_type = self.infer_standalone_expression(iterable);
@@ -4162,7 +4172,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {

        diagnostic.info(
            "make sure your Python environment is properly configured: \
            https://docs.astral.sh/ty/modules/#python-environment",
            https://github.com/astral-sh/ty/blob/main/docs/README.md#python-environment",
        );
    }
}
@@ -4618,7 +4628,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
        // the result from `types` directly because we might be in cycle recovery where
        // `types.cycle_fallback_type` is `Some(fallback_ty)`, which we can retrieve by
        // using `expression_type` on `types`:
        types.expression_type(expression)
        types.expression_type(expression.scoped_expression_id(self.db(), self.scope()))
    }

    fn infer_expression_impl(&mut self, expression: &ast::Expr) -> Type<'db> {
@@ -4670,14 +4680,15 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
        ty
    }

    fn store_expression_type(&mut self, expression: &ast::Expr, ty: Type<'db>) {
    fn store_expression_type(&mut self, expression: &impl HasScopedExpressionId, ty: Type<'db>) {
        if self.deferred_state.in_string_annotation() {
            // Avoid storing the type of expressions that are part of a string annotation because
            // the expression ids don't exist in the semantic index. Instead, we'll store the type
            // on the string expression itself that represents the annotation.
            return;
        }
        let previous = self.types.expressions.insert(expression.into(), ty);
        let expr_id = expression.scoped_expression_id(self.db(), self.scope());
        let previous = self.types.expressions.insert(expr_id, ty);
        assert_eq!(previous, None);
    }

@@ -5082,13 +5093,20 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
            // because `ScopedExpressionId`s are only meaningful within their own scope, so
            // we'd add types for random wrong expressions in the current scope
            if comprehension.is_first() && target.is_name_expr() {
                result.expression_type(iterable)
                let lookup_scope = self
                    .index
                    .parent_scope_id(self.scope().file_scope_id(self.db()))
                    .expect("A comprehension should never be the top-level scope")
                    .to_scope_id(self.db(), self.file());
                result.expression_type(iterable.scoped_expression_id(self.db(), lookup_scope))
            } else {
                let scope = self.types.scope;
                self.types.scope = result.scope;
                self.extend(result);
                self.types.scope = scope;
                result.expression_type(iterable)
                result.expression_type(
                    iterable.scoped_expression_id(self.db(), expression.scope(self.db())),
                )
            }
        };

@@ -5103,8 +5121,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
            if unpack_position == UnpackPosition::First {
                self.context.extend(unpacked.diagnostics());
            }

            unpacked.expression_type(target)
            let target_ast_id =
                target.scoped_expression_id(self.db(), unpack.target_scope(self.db()));
            unpacked.expression_type(target_ast_id)
        }
        TargetKind::Single => {
            let iterable_type = infer_iterable_type();
@@ -5116,7 +5135,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
            }
        };

        self.types.expressions.insert(target.into(), target_type);
        self.types.expressions.insert(
            target.scoped_expression_id(self.db(), self.scope()),
            target_type,
        );
        self.add_binding(target.into(), definition, target_type);
    }

@@ -1,4 +1,5 @@
use crate::Db;
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::expression::Expression;
use crate::semantic_index::place::{PlaceExpr, PlaceTable, ScopeId, ScopedPlaceId};
use crate::semantic_index::place_table;
@@ -686,7 +687,7 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
            // and that requires cross-symbol constraints, which we don't support yet.
            return None;
        }

        let scope = self.scope();
        let inference = infer_expression_types(self.db, expression);

        let comparator_tuples = std::iter::once(&**left)
@@ -697,8 +698,10 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
        let mut last_rhs_ty: Option<Type> = None;

        for (op, (left, right)) in std::iter::zip(&**ops, comparator_tuples) {
            let lhs_ty = last_rhs_ty.unwrap_or_else(|| inference.expression_type(left));
            let rhs_ty = inference.expression_type(right);
            let lhs_ty = last_rhs_ty.unwrap_or_else(|| {
                inference.expression_type(left.scoped_expression_id(self.db, scope))
            });
            let rhs_ty = inference.expression_type(right.scoped_expression_id(self.db, scope));
            last_rhs_ty = Some(rhs_ty);

            match left {
@@ -753,7 +756,8 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
                continue;
            }

            let callable_type = inference.expression_type(&**callable);
            let callable_type =
                inference.expression_type(callable.scoped_expression_id(self.db, scope));

            if callable_type
                .into_class_literal()
@@ -778,9 +782,11 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
        expression: Expression<'db>,
        is_positive: bool,
    ) -> Option<NarrowingConstraints<'db>> {
        let scope = self.scope();
        let inference = infer_expression_types(self.db, expression);

        let callable_ty = inference.expression_type(&*expr_call.func);
        let callable_ty =
            inference.expression_type(expr_call.func.scoped_expression_id(self.db, scope));

        // TODO: add support for PEP 604 union types on the right hand side of `isinstance`
        // and `issubclass`, for example `isinstance(x, str | (int | float))`.
@@ -791,7 +797,8 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
                None | Some(KnownFunction::RevealType)
            ) =>
            {
                let return_ty = inference.expression_type(expr_call);
                let return_ty =
                    inference.expression_type(expr_call.scoped_expression_id(self.db, scope));

                let (guarded_ty, place) = match return_ty {
                    // TODO: TypeGuard
@@ -817,7 +824,7 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {

        if function == KnownFunction::HasAttr {
            let attr = inference
                .expression_type(second_arg)
                .expression_type(second_arg.scoped_expression_id(self.db, scope))
                .into_string_literal()?
                .value(self.db);

@@ -840,7 +847,8 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {

            let function = function.into_classinfo_constraint_function()?;

            let class_info_ty = inference.expression_type(second_arg);
            let class_info_ty =
                inference.expression_type(second_arg.scoped_expression_id(self.db, scope));

            function
                .generate_constraint(self.db, class_info_ty)
@@ -931,12 +939,15 @@ impl<'db, 'ast> NarrowingConstraintsBuilder<'db, 'ast> {
        is_positive: bool,
    ) -> Option<NarrowingConstraints<'db>> {
        let inference = infer_expression_types(self.db, expression);
        let scope = self.scope();
        let mut sub_constraints = expr_bool_op
            .values
            .iter()
            // filter out arms with statically known truthiness
            .filter(|expr| {
                inference.expression_type(*expr).bool(self.db)
                inference
                    .expression_type(expr.scoped_expression_id(self.db, scope))
                    .bool(self.db)
                    != match expr_bool_op.op {
                        BoolOp::And => Truthiness::AlwaysTrue,
                        BoolOp::Or => Truthiness::AlwaysFalse,

@@ -6,7 +6,7 @@ use rustc_hash::FxHashMap;
use ruff_python_ast::{self as ast, AnyNodeRef};

use crate::Db;
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::ast_ids::{HasScopedExpressionId, ScopedExpressionId};
use crate::semantic_index::place::ScopeId;
use crate::types::tuple::{ResizeTupleError, Tuple, TupleLength, TupleUnpacker};
use crate::types::{Type, TypeCheckDiagnostics, infer_expression_types};
@@ -18,18 +18,23 @@ use super::diagnostic::INVALID_ASSIGNMENT;
/// Unpacks the value expression type to their respective targets.
pub(crate) struct Unpacker<'db, 'ast> {
    context: InferContext<'db, 'ast>,
    targets: FxHashMap<ExpressionNodeKey, Type<'db>>,
    target_scope: ScopeId<'db>,
    value_scope: ScopeId<'db>,
    targets: FxHashMap<ScopedExpressionId, Type<'db>>,
}

impl<'db, 'ast> Unpacker<'db, 'ast> {
    pub(crate) fn new(
        db: &'db dyn Db,
        target_scope: ScopeId<'db>,
        value_scope: ScopeId<'db>,
        module: &'ast ParsedModuleRef,
    ) -> Self {
        Self {
            context: InferContext::new(db, target_scope, module),
            targets: FxHashMap::default(),
            target_scope,
            value_scope,
        }
    }
@@ -48,8 +53,9 @@ impl<'db, 'ast> Unpacker<'db, 'ast> {
            "Unpacking target must be a list or tuple expression"
        );

        let value_type = infer_expression_types(self.db(), value.expression())
            .expression_type(value.expression().node_ref(self.db(), self.module()));
        let value_type = infer_expression_types(self.db(), value.expression()).expression_type(
            value.scoped_expression_id(self.db(), self.value_scope, self.module()),
        );

        let value_type = match value.kind() {
            UnpackKind::Assign => {
@@ -97,7 +103,10 @@ impl<'db, 'ast> Unpacker<'db, 'ast> {
        ) {
            match target {
                ast::Expr::Name(_) | ast::Expr::Attribute(_) | ast::Expr::Subscript(_) => {
                    self.targets.insert(target.into(), value_ty);
                    self.targets.insert(
                        target.scoped_expression_id(self.db(), self.target_scope),
                        value_ty,
                    );
                }
                ast::Expr::Starred(ast::ExprStarred { value, .. }) => {
                    self.unpack_inner(value, value_expr, value_ty);
@@ -199,7 +208,7 @@ impl<'db, 'ast> Unpacker<'db, 'ast> {

#[derive(Debug, Default, PartialEq, Eq, salsa::Update, get_size2::GetSize)]
pub(crate) struct UnpackResult<'db> {
    targets: FxHashMap<ExpressionNodeKey, Type<'db>>,
    targets: FxHashMap<ScopedExpressionId, Type<'db>>,
    diagnostics: TypeCheckDiagnostics,

    /// The fallback type for missing expressions.
@@ -217,19 +226,16 @@ impl<'db> UnpackResult<'db> {
    /// May panic if a scoped expression ID is passed in that does not correspond to a
    /// sub-expression of the target.
    #[track_caller]
    pub(crate) fn expression_type(&self, expr_id: impl Into<ExpressionNodeKey>) -> Type<'db> {
    pub(crate) fn expression_type(&self, expr_id: ScopedExpressionId) -> Type<'db> {
        self.try_expression_type(expr_id).expect(
            "expression should belong to this `UnpackResult` and \
            `Unpacker` should have inferred a type for it",
        )
    }

    pub(crate) fn try_expression_type(
        &self,
        expr: impl Into<ExpressionNodeKey>,
    ) -> Option<Type<'db>> {
    pub(crate) fn try_expression_type(&self, expr_id: ScopedExpressionId) -> Option<Type<'db>> {
        self.targets
            .get(&expr.into())
            .get(&expr_id)
            .copied()
            .or(self.cycle_fallback_type)
    }

@@ -5,6 +5,7 @@ use ruff_text_size::{Ranged, TextRange};

use crate::Db;
use crate::ast_node_ref::AstNodeRef;
use crate::semantic_index::ast_ids::{HasScopedExpressionId, ScopedExpressionId};
use crate::semantic_index::expression::Expression;
use crate::semantic_index::place::{FileScopeId, ScopeId};

@@ -57,6 +58,16 @@ impl<'db> Unpack<'db> {
        self._target(db).node(parsed)
    }

    /// Returns the scope in which the unpack value expression belongs.
    ///
    /// The scopes in which the target and value expressions belong are usually the same,
    /// except in generator expressions and comprehensions (list/dict/set), where the value
    /// expression of the first generator is evaluated in the outer scope, while the ones in
    /// the subsequent generators are evaluated in the comprehension scope.
    pub(crate) fn value_scope(self, db: &'db dyn Db) -> ScopeId<'db> {
        self.value_file_scope(db).to_scope_id(db, self.file(db))
    }

    /// Returns the scope in which the unpack target expression belongs.
    pub(crate) fn target_scope(self, db: &'db dyn Db) -> ScopeId<'db> {
        self.target_file_scope(db).to_scope_id(db, self.file(db))
@@ -87,6 +98,18 @@ impl<'db> UnpackValue<'db> {
        self.expression
    }

    /// Returns the [`ScopedExpressionId`] of the underlying expression.
    pub(crate) fn scoped_expression_id(
        self,
        db: &'db dyn Db,
        scope: ScopeId<'db>,
        module: &ParsedModuleRef,
    ) -> ScopedExpressionId {
        self.expression()
            .node_ref(db, module)
            .scoped_expression_id(db, scope)
    }

    /// Returns the expression as an [`AnyNodeRef`].
    pub(crate) fn as_any_node_ref<'ast>(
        self,

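The scoping rule described in the `value_scope` doc comment above can be observed directly from Python: the first generator's iterable is evaluated eagerly in the enclosing scope, while subsequent clauses run inside the implicit comprehension/generator scope. A small illustrative sketch (not part of this diff):

```python
# The first iterable of a generator expression is evaluated immediately,
# in the enclosing scope -- so an undefined name there fails right away.
eager_error = False
try:
    (x for x in undefined_outer)  # noqa: F821 -- intentionally undefined
except NameError:
    eager_error = True

# Later clauses are only evaluated inside the generator's own scope,
# once the generator is actually driven.
gen = (cell for row in [[1, 2], [3, 4]] for cell in row)
flattened = list(gen)
```

Here `eager_error` becomes `True` without ever iterating, which is exactly why the unpacker must infer the first value expression in the outer scope.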
@@ -4,9 +4,10 @@
//! are written to `stderr` by default, which should appear in the logs for most LSP clients. A
//! `logFile` path can also be specified in the settings, and output will be directed there
//! instead.
use std::path::{Path, PathBuf};
use std::str::FromStr;
use std::sync::Arc;

use ruff_db::system::{SystemPath, SystemPathBuf};
use serde::Deserialize;
use tracing::level_filters::LevelFilter;
use tracing_subscriber::Layer;
@@ -14,14 +15,14 @@ use tracing_subscriber::fmt::time::ChronoLocal;
use tracing_subscriber::fmt::writer::BoxMakeWriter;
use tracing_subscriber::layer::SubscriberExt;

pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&SystemPath>) {
pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&Path>) {
    let log_file = log_file
        .map(|path| {
            // this expands `logFile` so that tildes and environment variables
            // are replaced with their values, if possible.
            if let Some(expanded) = shellexpand::full(&path.to_string())
            if let Some(expanded) = shellexpand::full(&path.to_string_lossy())
                .ok()
                .map(|path| SystemPathBuf::from(&*path))
                .and_then(|path| PathBuf::from_str(&path).ok())
            {
                expanded
            } else {
@@ -32,11 +33,14 @@ pub(crate) fn init_logging(log_level: LogLevel, log_file: Option<&SystemPath>) {
            std::fs::OpenOptions::new()
                .create(true)
                .append(true)
                .open(path.as_std_path())
                .open(&path)
                .map_err(|err| {
                    #[expect(clippy::print_stderr)]
                    {
                        eprintln!("Failed to open file at {path} for logging: {err}");
                        eprintln!(
                            "Failed to open file at {} for logging: {err}",
                            path.display()
                        );
                    }
                })
                .ok()

@@ -173,7 +173,6 @@ impl Server {
            diagnostic_provider: Some(DiagnosticServerCapabilities::Options(DiagnosticOptions {
                identifier: Some(crate::DIAGNOSTIC_NAME.into()),
                inter_file_dependencies: true,
                workspace_diagnostics: true,
                ..Default::default()
            })),
            text_document_sync: Some(TextDocumentSyncCapability::Options(

@@ -28,28 +28,23 @@ pub(super) fn request(req: server::Request) -> Task {
    let id = req.id.clone();

    match req.method.as_str() {
        requests::DocumentDiagnosticRequestHandler::METHOD => background_document_request_task::<
        requests::DocumentDiagnosticRequestHandler::METHOD => background_request_task::<
            requests::DocumentDiagnosticRequestHandler,
        >(
            req, BackgroundSchedule::Worker
        ),
        requests::WorkspaceDiagnosticRequestHandler::METHOD => background_request_task::<
            requests::WorkspaceDiagnosticRequestHandler,
        >(
            req, BackgroundSchedule::Worker
        ),
        requests::GotoTypeDefinitionRequestHandler::METHOD => background_document_request_task::<
        requests::GotoTypeDefinitionRequestHandler::METHOD => background_request_task::<
            requests::GotoTypeDefinitionRequestHandler,
        >(
            req, BackgroundSchedule::Worker
        ),
        requests::HoverRequestHandler::METHOD => background_document_request_task::<
        requests::HoverRequestHandler::METHOD => background_request_task::<
            requests::HoverRequestHandler,
        >(req, BackgroundSchedule::Worker),
        requests::InlayHintRequestHandler::METHOD => background_document_request_task::<
        requests::InlayHintRequestHandler::METHOD => background_request_task::<
            requests::InlayHintRequestHandler,
        >(req, BackgroundSchedule::Worker),
        requests::CompletionRequestHandler::METHOD => background_document_request_task::<
        requests::CompletionRequestHandler::METHOD => background_request_task::<
            requests::CompletionRequestHandler,
        >(
            req, BackgroundSchedule::LatencySensitive
@@ -140,51 +135,7 @@ where
    }))
}

fn background_request_task<R: traits::BackgroundRequestHandler>(
    req: server::Request,
    schedule: BackgroundSchedule,
) -> Result<Task>
where
    <<R as RequestHandler>::RequestType as Request>::Params: UnwindSafe,
{
    let retry = R::RETRY_ON_CANCELLATION.then(|| req.clone());
    let (id, params) = cast_request::<R>(req)?;

    Ok(Task::background(schedule, move |session: &Session| {
        let cancellation_token = session
            .request_queue()
            .incoming()
            .cancellation_token(&id)
            .expect("request should have been tested for cancellation before scheduling");

        let snapshot = session.take_workspace_snapshot();

        Box::new(move |client| {
            let _span = tracing::debug_span!("request", %id, method = R::METHOD).entered();

            // Test again if the request was cancelled since it was scheduled on the background task
            // and, if so, return early
            if cancellation_token.is_cancelled() {
                tracing::trace!(
                    "Ignoring request id={id} method={} because it was cancelled",
                    R::METHOD
                );

                // We don't need to send a response here because the `cancel` notification
                // handler already responded with a message.
                return;
            }

            let result = ruff_db::panic::catch_unwind(|| R::run(snapshot, client, params));

            if let Some(response) = request_result_to_response::<R>(&id, client, result, retry) {
                respond::<R>(&id, response, client);
            }
        })
    }))
}

fn background_document_request_task<R: traits::BackgroundDocumentRequestHandler>(
fn background_request_task<R: traits::BackgroundDocumentRequestHandler>(
    req: server::Request,
    schedule: BackgroundSchedule,
) -> Result<Task>
@@ -217,7 +168,7 @@ where
    };

    let Some(snapshot) = session.take_snapshot(url) else {
        tracing::warn!("Ignoring request because snapshot for path `{path:?}` doesn't exist");
        tracing::warn!("Ignoring request because snapshot for path `{path:?}` doesn't exist.");
        return Box::new(|_| {});
    };

@@ -258,7 +209,7 @@ fn request_result_to_response<R>(
    request: Option<lsp_server::Request>,
) -> Option<Result<<<R as RequestHandler>::RequestType as Request>::Result>>
where
    R: traits::RetriableRequestHandler,
    R: traits::BackgroundDocumentRequestHandler,
{
    match result {
        Ok(response) => Some(response),

@@ -166,7 +166,7 @@ pub(super) fn compute_diagnostics(

/// Converts the tool specific [`Diagnostic`][ruff_db::diagnostic::Diagnostic] to an LSP
/// [`Diagnostic`].
pub(super) fn to_lsp_diagnostic(
fn to_lsp_diagnostic(
    db: &dyn Db,
    diagnostic: &ruff_db::diagnostic::Diagnostic,
    encoding: PositionEncoding,

@@ -41,12 +41,7 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
            );
        }

        if !session.global_settings().diagnostic_mode().is_workspace() {
            // The server needs to clear the diagnostics regardless of whether the client supports
            // pull diagnostics or not. This is because the client only has the capability to fetch
            // the diagnostics but does not automatically clear them when a document is closed.
            clear_diagnostics(&key, client);
        }
        clear_diagnostics(&key, client);

        Ok(())
    }

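The comment in the hunk above captures an LSP detail worth remembering: with pull diagnostics the client only *fetches* reports, it never un-publishes them, so the server must push an empty `textDocument/publishDiagnostics` notification when a document closes. A minimal sketch of such a clearing payload (illustrative only; the URI is made up):

```python
import json

# Publishing an empty `diagnostics` array is how an LSP server clears all
# previously reported diagnostics for a document.
clear_notification = {
    "jsonrpc": "2.0",
    "method": "textDocument/publishDiagnostics",
    "params": {
        "uri": "file:///tmp/example.py",  # hypothetical document URI
        "diagnostics": [],                # empty list == clear everything
    },
}

payload = json.dumps(clear_notification)
```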
@@ -4,7 +4,6 @@ mod goto_type_definition;
mod hover;
mod inlay_hints;
mod shutdown;
mod workspace_diagnostic;

pub(super) use completion::CompletionRequestHandler;
pub(super) use diagnostic::DocumentDiagnosticRequestHandler;
@@ -12,4 +11,3 @@ pub(super) use goto_type_definition::GotoTypeDefinitionRequestHandler;
pub(super) use hover::HoverRequestHandler;
pub(super) use inlay_hints::InlayHintRequestHandler;
pub(super) use shutdown::ShutdownHandler;
pub(super) use workspace_diagnostic::WorkspaceDiagnosticRequestHandler;

@@ -8,9 +8,7 @@ use ty_project::ProjectDatabase;

use crate::DocumentSnapshot;
use crate::document::PositionExt;
use crate::server::api::traits::{
    BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::session::client::Client;

pub(crate) struct CompletionRequestHandler;
@@ -20,6 +18,8 @@ impl RequestHandler for CompletionRequestHandler {
}

impl BackgroundDocumentRequestHandler for CompletionRequestHandler {
    const RETRY_ON_CANCELLATION: bool = true;

    fn document_url(params: &CompletionParams) -> Cow<Url> {
        Cow::Borrowed(&params.text_document_position.text_document.uri)
    }
@@ -65,7 +65,3 @@ impl BackgroundDocumentRequestHandler for CompletionRequestHandler {
        Ok(Some(response))
    }
}

impl RetriableRequestHandler for CompletionRequestHandler {
    const RETRY_ON_CANCELLATION: bool = true;
}

@@ -8,9 +8,7 @@ use lsp_types::{

use crate::server::Result;
use crate::server::api::diagnostics::{Diagnostics, compute_diagnostics};
use crate::server::api::traits::{
    BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use ty_project::ProjectDatabase;
@@ -45,9 +43,7 @@ impl BackgroundDocumentRequestHandler for DocumentDiagnosticRequestHandler {
            }),
        ))
    }
}

impl RetriableRequestHandler for DocumentDiagnosticRequestHandler {
    fn salsa_cancellation_error() -> lsp_server::ResponseError {
        lsp_server::ResponseError {
            code: lsp_server::ErrorCode::ServerCancelled as i32,

@@ -8,9 +8,7 @@ use ty_project::ProjectDatabase;

use crate::DocumentSnapshot;
use crate::document::{PositionExt, ToLink};
use crate::server::api::traits::{
    BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::session::client::Client;

pub(crate) struct GotoTypeDefinitionRequestHandler;
@@ -72,5 +70,3 @@ impl BackgroundDocumentRequestHandler for GotoTypeDefinitionRequestHandler {
        }
    }
}

impl RetriableRequestHandler for GotoTypeDefinitionRequestHandler {}

@@ -2,9 +2,7 @@ use std::borrow::Cow;

use crate::DocumentSnapshot;
use crate::document::{PositionExt, ToRangeExt};
use crate::server::api::traits::{
    BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::session::client::Client;
use lsp_types::request::HoverRequest;
use lsp_types::{HoverContents, HoverParams, MarkupContent, Url};
@@ -75,5 +73,3 @@ impl BackgroundDocumentRequestHandler for HoverRequestHandler {
        }))
    }
}

impl RetriableRequestHandler for HoverRequestHandler {}

@@ -2,9 +2,7 @@ use std::borrow::Cow;

use crate::DocumentSnapshot;
use crate::document::{RangeExt, TextSizeExt};
use crate::server::api::traits::{
    BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::server::api::traits::{BackgroundDocumentRequestHandler, RequestHandler};
use crate::session::client::Client;
use lsp_types::request::InlayHintRequest;
use lsp_types::{InlayHintParams, Url};
@@ -66,5 +64,3 @@ impl BackgroundDocumentRequestHandler for InlayHintRequestHandler {
        Ok(Some(inlay_hints))
    }
}

impl RetriableRequestHandler for InlayHintRequestHandler {}

@@ -1,108 +0,0 @@
use lsp_types::request::WorkspaceDiagnosticRequest;
use lsp_types::{
    FullDocumentDiagnosticReport, Url, WorkspaceDiagnosticParams, WorkspaceDiagnosticReport,
    WorkspaceDiagnosticReportResult, WorkspaceDocumentDiagnosticReport,
    WorkspaceFullDocumentDiagnosticReport,
};
use rustc_hash::FxHashMap;
use ty_project::CheckMode;

use crate::server::Result;
use crate::server::api::diagnostics::to_lsp_diagnostic;
use crate::server::api::traits::{
    BackgroundRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::WorkspaceSnapshot;
use crate::session::client::Client;
use crate::system::file_to_url;

pub(crate) struct WorkspaceDiagnosticRequestHandler;

impl RequestHandler for WorkspaceDiagnosticRequestHandler {
    type RequestType = WorkspaceDiagnosticRequest;
}

impl BackgroundRequestHandler for WorkspaceDiagnosticRequestHandler {
    fn run(
        snapshot: WorkspaceSnapshot,
        _client: &Client,
        _params: WorkspaceDiagnosticParams,
    ) -> Result<WorkspaceDiagnosticReportResult> {
        let index = snapshot.index();

        if !index.global_settings().diagnostic_mode().is_workspace() {
            tracing::debug!("Workspace diagnostics is disabled; returning empty report");
            return Ok(WorkspaceDiagnosticReportResult::Report(
                WorkspaceDiagnosticReport { items: vec![] },
            ));
        }

        let mut items = Vec::new();

        for db in snapshot.projects() {
            let diagnostics = db.check_with_mode(CheckMode::AllFiles);

            // Group diagnostics by URL
            let mut diagnostics_by_url: FxHashMap<Url, Vec<_>> = FxHashMap::default();

            for diagnostic in diagnostics {
                if let Some(span) = diagnostic.primary_span() {
                    let file = span.expect_ty_file();
                    let Some(url) = file_to_url(db, file) else {
                        tracing::debug!("Failed to convert file to URL at {}", file.path(db));
                        continue;
                    };
                    diagnostics_by_url.entry(url).or_default().push(diagnostic);
                }
            }

            items.reserve(diagnostics_by_url.len());

            // Convert to workspace diagnostic report format
            for (url, file_diagnostics) in diagnostics_by_url {
                let version = index
                    .key_from_url(url.clone())
                    .ok()
                    .and_then(|key| index.make_document_ref(&key))
                    .map(|doc| i64::from(doc.version()));

                // Convert diagnostics to LSP format
                let lsp_diagnostics = file_diagnostics
                    .into_iter()
                    .map(|diagnostic| {
                        to_lsp_diagnostic(db, &diagnostic, snapshot.position_encoding())
                    })
                    .collect::<Vec<_>>();

                items.push(WorkspaceDocumentDiagnosticReport::Full(
                    WorkspaceFullDocumentDiagnosticReport {
                        uri: url,
                        version,
                        full_document_diagnostic_report: FullDocumentDiagnosticReport {
                            // TODO: We don't implement result ID caching yet
                            result_id: None,
                            items: lsp_diagnostics,
                        },
                    },
                ));
            }
        }

        Ok(WorkspaceDiagnosticReportResult::Report(
            WorkspaceDiagnosticReport { items },
        ))
    }
}

impl RetriableRequestHandler for WorkspaceDiagnosticRequestHandler {
    fn salsa_cancellation_error() -> lsp_server::ResponseError {
        lsp_server::ResponseError {
            code: lsp_server::ErrorCode::ServerCancelled as i32,
            message: "server cancelled the request".to_owned(),
            data: serde_json::to_value(lsp_types::DiagnosticServerCancellationData {
                retrigger_request: true,
            })
            .ok(),
        }
    }
}

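The deleted handler above groups project diagnostics by file URL before converting each bucket into a per-file report. The grouping step, Rust's `diagnostics_by_url.entry(url).or_default().push(diagnostic)` pattern, has this shape; a sketch with invented sample data:

```python
from collections import defaultdict

# Invented sample diagnostics: (url, message) pairs from a project check.
diagnostics = [
    ("file:///proj/a.py", "unused import"),
    ("file:///proj/b.py", "undefined name"),
    ("file:///proj/a.py", "syntax error"),
]

# Mirrors `diagnostics_by_url.entry(url).or_default().push(diagnostic)`:
# the first time a URL is seen, an empty bucket is created on the fly.
diagnostics_by_url = defaultdict(list)
for url, message in diagnostics:
    diagnostics_by_url[url].append(message)

# One per-file report per URL, each carrying that file's diagnostics.
items = [{"uri": url, "items": msgs} for url, msgs in diagnostics_by_url.items()]
```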
@@ -1,7 +1,7 @@
//! A stateful LSP implementation that calls into the ty API.

use crate::session::client::Client;
use crate::session::{DocumentSnapshot, Session, WorkspaceSnapshot};
use crate::session::{DocumentSnapshot, Session};

use lsp_types::notification::Notification as LSPNotification;
use lsp_types::request::Request;
@@ -25,24 +25,11 @@ pub(super) trait SyncRequestHandler: RequestHandler {
    ) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;
}

pub(super) trait RetriableRequestHandler: RequestHandler {
    /// Whether this request can be cancelled if the Salsa database is modified.
/// A request handler that can be run on a background thread.
pub(super) trait BackgroundDocumentRequestHandler: RequestHandler {
    /// Whether this request can be retried if it was cancelled due to a modification to the Salsa database.
    const RETRY_ON_CANCELLATION: bool = false;

    /// The error to return if the request was cancelled due to a modification to the Salsa database.
    fn salsa_cancellation_error() -> lsp_server::ResponseError {
        lsp_server::ResponseError {
            code: lsp_server::ErrorCode::ContentModified as i32,
            message: "content modified".to_string(),
            data: None,
        }
    }
}

/// A request handler that can be run on a background thread.
///
/// This handler is specific to requests that operate on a single document.
pub(super) trait BackgroundDocumentRequestHandler: RetriableRequestHandler {
    fn document_url(
        params: &<<Self as RequestHandler>::RequestType as Request>::Params,
    ) -> std::borrow::Cow<lsp_types::Url>;
@@ -53,15 +40,14 @@ pub(super) trait BackgroundDocumentRequestHandler: RetriableRequestHandler {
        client: &Client,
        params: <<Self as RequestHandler>::RequestType as Request>::Params,
    ) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;
}

/// A request handler that can be run on a background thread.
pub(super) trait BackgroundRequestHandler: RetriableRequestHandler {
    fn run(
        snapshot: WorkspaceSnapshot,
        client: &Client,
        params: <<Self as RequestHandler>::RequestType as Request>::Params,
    ) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;
    fn salsa_cancellation_error() -> lsp_server::ResponseError {
        lsp_server::ResponseError {
            code: lsp_server::ErrorCode::ContentModified as i32,
            message: "content modified".to_string(),
            data: None,
        }
    }
}

/// A supertrait for any server notification handler.

@@ -2,7 +2,7 @@

use std::collections::{BTreeMap, VecDeque};
use std::ops::{Deref, DerefMut};
use std::panic::AssertUnwindSafe;
use std::path::{Path, PathBuf};
use std::sync::Arc;

use anyhow::{Context, anyhow};
@@ -224,14 +224,6 @@ impl Session {
        self.index().key_from_url(url)
    }

    pub(crate) fn take_workspace_snapshot(&self) -> WorkspaceSnapshot {
        WorkspaceSnapshot {
            projects: AssertUnwindSafe(self.projects.values().cloned().collect()),
            index: self.index.clone().unwrap(),
            position_encoding: self.position_encoding,
        }
    }

    pub(crate) fn initialize_workspaces(&mut self, workspace_settings: Vec<(Url, ClientOptions)>) {
        assert!(!self.workspaces.all_initialized());

@@ -243,7 +235,14 @@ impl Session {
            // In the future, index the workspace directories to find all projects
            // and create a project database for each.
            let system = LSPSystem::new(self.index.as_ref().unwrap().clone());
            let system_path = workspace.root();

            let Some(system_path) = SystemPath::from_std_path(workspace.root()) else {
                tracing::warn!(
                    "Ignoring workspace `{}` because its root contains non-UTF-8 characters",
                    workspace.root().display()
                );
                continue;
            };

            let root = system_path.to_path_buf();
            let project = ProjectMetadata::discover(&root, &system)
@@ -383,10 +382,6 @@ impl Session {
    pub(crate) fn client_capabilities(&self) -> &ResolvedClientCapabilities {
        &self.resolved_client_capabilities
    }

    pub(crate) fn global_settings(&self) -> Arc<ClientSettings> {
        self.index().global_settings()
    }
}

/// A guard that holds the only reference to the index and allows modifying it.
@@ -466,27 +461,6 @@ impl DocumentSnapshot {
    }
}

/// An immutable snapshot of the current state of [`Session`].
pub(crate) struct WorkspaceSnapshot {
    projects: AssertUnwindSafe<Vec<ProjectDatabase>>,
    index: Arc<index::Index>,
    position_encoding: PositionEncoding,
}

impl WorkspaceSnapshot {
    pub(crate) fn projects(&self) -> &[ProjectDatabase] {
        &self.projects
    }

    pub(crate) fn index(&self) -> &index::Index {
        &self.index
    }

    pub(crate) fn position_encoding(&self) -> PositionEncoding {
        self.position_encoding
    }
}

#[derive(Debug, Default)]
pub(crate) struct Workspaces {
    workspaces: BTreeMap<Url, Workspace>,
@@ -499,15 +473,11 @@ impl Workspaces {
            .to_file_path()
            .map_err(|()| anyhow!("Workspace URL is not a file or directory: {url:?}"))?;

        // Realistically I don't think this can fail because we got the path from a Url
        let system_path = SystemPathBuf::from_path_buf(path)
            .map_err(|_| anyhow!("Workspace URL is not valid UTF8"))?;

        self.workspaces.insert(
            url,
            Workspace {
                options,
                root: system_path,
                root: path,
            },
        );

@@ -550,12 +520,12 @@ impl<'a> IntoIterator for &'a Workspaces {

#[derive(Debug)]
pub(crate) struct Workspace {
    root: SystemPathBuf,
    root: PathBuf,
    options: ClientOptions,
}

impl Workspace {
    pub(crate) fn root(&self) -> &SystemPath {
    pub(crate) fn root(&self) -> &Path {
        &self.root
    }
}

@@ -1,5 +1,6 @@
use std::path::PathBuf;

use lsp_types::Url;
use ruff_db::system::SystemPathBuf;
use rustc_hash::FxHashMap;
use serde::Deserialize;

@@ -46,26 +47,6 @@ pub(crate) struct ClientOptions {
    /// Settings under the `python.*` namespace in VS Code that are useful for the ty language
    /// server.
    python: Option<Python>,
    /// Diagnostic mode for the language server.
    diagnostic_mode: Option<DiagnosticMode>,
}

/// Diagnostic mode for the language server.
#[derive(Clone, Copy, Debug, Default, Deserialize)]
#[cfg_attr(test, derive(PartialEq, Eq))]
#[serde(rename_all = "camelCase")]
pub(crate) enum DiagnosticMode {
    /// Check only currently open files.
    #[default]
    OpenFilesOnly,
    /// Check all files in the workspace.
    Workspace,
}

impl DiagnosticMode {
    pub(crate) fn is_workspace(self) -> bool {
        matches!(self, DiagnosticMode::Workspace)
    }
}

impl ClientOptions {
@@ -77,7 +58,6 @@ impl ClientOptions {
                .and_then(|python| python.ty)
                .and_then(|ty| ty.disable_language_services)
                .unwrap_or_default(),
            diagnostic_mode: self.diagnostic_mode.unwrap_or_default(),
        }
    }
}
@@ -109,7 +89,7 @@ pub(crate) struct TracingOptions {
    pub(crate) log_level: Option<LogLevel>,

    /// Path to the log file - tildes and environment variables are supported.
    pub(crate) log_file: Option<SystemPathBuf>,
    pub(crate) log_file: Option<PathBuf>,
}

/// This is the exact schema for initialization options sent in by the client during

@@ -1,5 +1,3 @@
use super::options::DiagnosticMode;

/// Resolved client settings for a specific document. These settings are meant to be
/// used directly by the server, and are *not* a 1:1 representation with how the client
/// sends them.
@@ -7,15 +5,10 @@ use super::options::DiagnosticMode;
#[cfg_attr(test, derive(PartialEq, Eq))]
pub(crate) struct ClientSettings {
    pub(super) disable_language_services: bool,
    pub(super) diagnostic_mode: DiagnosticMode,
}

impl ClientSettings {
    pub(crate) fn is_language_services_disabled(&self) -> bool {
        self.disable_language_services
    }

    pub(crate) fn diagnostic_mode(&self) -> DiagnosticMode {
        self.diagnostic_mode
    }
}

@@ -80,7 +80,7 @@ You can add the following configuration to `.gitlab-ci.yml` to run a `ruff forma
  stage: build
  interruptible: true
  image:
    name: ghcr.io/astral-sh/ruff:0.12.2-alpine
    name: ghcr.io/astral-sh/ruff:0.12.1-alpine
  before_script:
    - cd $CI_PROJECT_DIR
    - ruff --version
@@ -106,7 +106,7 @@ Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-c
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.12.2
  rev: v0.12.1
  hooks:
    # Run the linter.
    - id: ruff
@@ -119,7 +119,7 @@ To enable lint fixes, add the `--fix` argument to the lint hook:
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.12.2
  rev: v0.12.1
  hooks:
    # Run the linter.
    - id: ruff
@@ -133,7 +133,7 @@ To avoid running on Jupyter Notebooks, remove `jupyter` from the list of allowed
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.12.2
  rev: v0.12.1
  hooks:
    # Run the linter.
    - id: ruff

@@ -369,7 +369,7 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.12.2
  rev: v0.12.1
  hooks:
    # Run the linter.
    - id: ruff

@@ -4,7 +4,7 @@ build-backend = "maturin"

[project]
name = "ruff"
version = "0.12.2"
version = "0.12.1"
description = "An extremely fast Python linter and code formatter, written in Rust."
authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
readme = "README.md"

ruff.schema.json (generated, 4 changes)
@@ -1438,7 +1438,7 @@
      }
    },
    "banned-module-level-imports": {
      "description": "List of specific modules that may not be imported at module level, and should instead be imported lazily (e.g., within a function definition, or an `if TYPE_CHECKING:` block, or some other nested context). This also affects the rule `import-outside-top-level` if `banned-module-level-imports` is enabled.",
      "description": "List of specific modules that may not be imported at module level, and should instead be imported lazily (e.g., within a function definition, or an `if TYPE_CHECKING:` block, or some other nested context).",
      "type": [
        "array",
        "null"
@@ -1588,7 +1588,7 @@
      ]
    },
    "skip-magic-trailing-comma": {
      "description": "Ruff uses existing trailing commas as an indication that short lines should be left separate. If this option is set to `true`, the magic trailing comma is ignored.\n\nFor example, Ruff leaves the arguments separate even though collapsing the arguments to a single line doesn't exceed the line length if `skip-magic-trailing-comma = false`:\n\n```python # The arguments remain on separate lines because of the trailing comma after `b` def test( a, b, ): pass ```\n\nSetting `skip-magic-trailing-comma = true` changes the formatting to:\n\n```python # The arguments are collapsed to a single line because the trailing comma is ignored def test(a, b): pass ```",
      "description": "Ruff uses existing trailing commas as an indication that short lines should be left separate. If this option is set to `true`, the magic trailing comma is ignored.\n\nFor example, Ruff leaves the arguments separate even though collapsing the arguments to a single line doesn't exceed the line length if `skip-magic-trailing-comma = false`:\n\n```python # The arguments remain on separate lines because of the trailing comma after `b` def test( a, b, ): pass ```\n\nSetting `skip-magic-trailing-comma = true` changes the formatting to:\n\n```python # The arguments remain on separate lines because of the trailing comma after `b` def test(a, b): pass ```",
      "type": [
        "boolean",
        "null"

@@ -1,6 +1,6 @@
[project]
name = "scripts"
version = "0.12.2"
version = "0.12.1"
description = ""
authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]