Compare commits

1 commit

micha/thin...micha/stri

| Author | SHA1 | Date |
|---|---|---|
|  | 1bec5784c7 |  |

CHANGELOG.md — 31 changes
```diff
@@ -1,36 +1,5 @@
 # Changelog
 
-## 0.12.4
-
-### Preview features
-
-- \[`flake8-type-checking`, `pyupgrade`, `ruff`\] Add `from __future__ import annotations` when it would allow new fixes (`TC001`, `TC002`, `TC003`, `UP037`, `RUF013`) ([#19100](https://github.com/astral-sh/ruff/pull/19100))
-- \[`flake8-use-pathlib`\] Add autofix for `PTH109` ([#19245](https://github.com/astral-sh/ruff/pull/19245))
-- \[`pylint`\] Detect indirect `pathlib.Path` usages for `unspecified-encoding` (`PLW1514`) ([#19304](https://github.com/astral-sh/ruff/pull/19304))
-
-### Bug fixes
-
-- \[`flake8-bugbear`\] Fix `B017` false negatives for keyword exception arguments ([#19217](https://github.com/astral-sh/ruff/pull/19217))
-- \[`flake8-use-pathlib`\] Fix false negative on direct `Path()` instantiation (`PTH210`) ([#19388](https://github.com/astral-sh/ruff/pull/19388))
-- \[`flake8-django`\] Fix `DJ008` false positive for abstract models with type-annotated `abstract` field ([#19221](https://github.com/astral-sh/ruff/pull/19221))
-- \[`isort`\] Fix `I002` import insertion after docstring with multiple string statements ([#19222](https://github.com/astral-sh/ruff/pull/19222))
-- \[`isort`\] Treat form feed as valid whitespace before a semicolon ([#19343](https://github.com/astral-sh/ruff/pull/19343))
-- \[`pydoclint`\] Fix `SyntaxError` from fixes with line continuations (`D201`, `D202`) ([#19246](https://github.com/astral-sh/ruff/pull/19246))
-- \[`refurb`\] `FURB164` fix should validate arguments and should usually be marked unsafe ([#19136](https://github.com/astral-sh/ruff/pull/19136))
-
-### Rule changes
-
-- \[`flake8-use-pathlib`\] Skip single dots for `invalid-pathlib-with-suffix` (`PTH210`) on versions >= 3.14 ([#19331](https://github.com/astral-sh/ruff/pull/19331))
-- \[`pep8_naming`\] Avoid false positives on standard library functions with uppercase names (`N802`) ([#18907](https://github.com/astral-sh/ruff/pull/18907))
-- \[`pycodestyle`\] Handle brace escapes for t-strings in logical lines ([#19358](https://github.com/astral-sh/ruff/pull/19358))
-- \[`pylint`\] Extend invalid string character rules to include t-strings ([#19355](https://github.com/astral-sh/ruff/pull/19355))
-- \[`ruff`\] Allow `strict` kwarg when checking for `starmap-zip` (`RUF058`) in Python 3.14+ ([#19333](https://github.com/astral-sh/ruff/pull/19333))
-
-### Documentation
-
-- \[`flake8-type-checking`\] Make `TC010` docs example more realistic ([#19356](https://github.com/astral-sh/ruff/pull/19356))
-- Make more documentation examples error out-of-the-box ([#19288](https://github.com/astral-sh/ruff/pull/19288), [#19272](https://github.com/astral-sh/ruff/pull/19272), [#19291](https://github.com/astral-sh/ruff/pull/19291), [#19296](https://github.com/astral-sh/ruff/pull/19296), [#19292](https://github.com/astral-sh/ruff/pull/19292), [#19295](https://github.com/astral-sh/ruff/pull/19295), [#19297](https://github.com/astral-sh/ruff/pull/19297), [#19309](https://github.com/astral-sh/ruff/pull/19309))
-
 ## 0.12.3
 
 ### Preview features
```
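The removed PTH109 changelog entry above refers to an autofix that rewrites `os.getcwd()` to `Path.cwd()`. A minimal Python sketch of the before/after equivalence (variable names are illustrative, not from the rule itself):

```python
import os
from pathlib import Path

# Before the fix: PTH109 flags this call.
cwd_str = os.getcwd()

# After the (preview) autofix: the pathlib equivalent.
cwd_path = Path.cwd()

# Both name the same directory, but the result type changes
# from str to Path.
assert str(cwd_path) == cwd_str
```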
Cargo.lock — 8 changes (generated)

```diff
@@ -2711,7 +2711,7 @@ dependencies = [
 
 [[package]]
 name = "ruff"
-version = "0.12.4"
+version = "0.12.3"
 dependencies = [
  "anyhow",
  "argfile",
@@ -2962,7 +2962,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_linter"
-version = "0.12.4"
+version = "0.12.3"
 dependencies = [
  "aho-corasick",
  "anyhow",
@@ -3295,7 +3295,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_wasm"
-version = "0.12.4"
+version = "0.12.3"
 dependencies = [
  "console_error_panic_hook",
  "console_log",
@@ -4287,7 +4287,6 @@ dependencies = [
  "strum_macros",
  "tempfile",
  "test-case",
- "thin-vec",
  "thiserror 2.0.12",
  "tracing",
  "ty_python_semantic",
@@ -4301,7 +4300,6 @@ name = "ty_server"
 version = "0.0.0"
 dependencies = [
  "anyhow",
  "bitflags 2.9.1",
  "crossbeam",
  "jod-thread",
  "libc",
```

```diff
@@ -163,7 +163,6 @@ strum_macros = { version = "0.27.0" }
 syn = { version = "2.0.55" }
 tempfile = { version = "3.9.0" }
 test-case = { version = "3.3.1" }
-thin-vec = { version = "0.2.14" }
 thiserror = { version = "2.0.0" }
 tikv-jemallocator = { version = "0.6.0" }
 toml = { version = "0.9.0" }
```
````diff
@@ -148,8 +148,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
 powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
 
 # For a specific version.
-curl -LsSf https://astral.sh/ruff/0.12.4/install.sh | sh
-powershell -c "irm https://astral.sh/ruff/0.12.4/install.ps1 | iex"
+curl -LsSf https://astral.sh/ruff/0.12.3/install.sh | sh
+powershell -c "irm https://astral.sh/ruff/0.12.3/install.ps1 | iex"
 ```
 
 You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
````
````diff
@@ -182,7 +182,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.12.4
+  rev: v0.12.3
   hooks:
     # Run the linter.
     - id: ruff-check
````
```diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff"
-version = "0.12.4"
+version = "0.12.3"
 publish = true
 authors = { workspace = true }
 edition = { workspace = true }
```
```diff
@@ -5718,11 +5718,8 @@ match 42: # invalid-syntax
 
     let snapshot = format!("output_format_{output_format}");
 
-    let project_dir = dunce::canonicalize(tempdir.path())?;
-
     insta::with_settings!({
         filters => vec![
-            (tempdir_filter(&project_dir).as_str(), "[TMP]/"),
             (tempdir_filter(&tempdir).as_str(), "[TMP]/"),
             (r#""[^"]+\\?/?input.py"#, r#""[TMP]/input.py"#),
             (ruff_linter::VERSION, "[VERSION]"),
```
```diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_linter"
-version = "0.12.4"
+version = "0.12.3"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
```
```diff
@@ -104,6 +104,3 @@ os.chmod(x)
 os.replace("src", "dst", src_dir_fd=1, dst_dir_fd=2)
 os.replace("src", "dst", src_dir_fd=1)
 os.replace("src", "dst", dst_dir_fd=2)
-
-os.getcwd()
-os.getcwdb()
```
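The fixture lines above exercise `os.replace` with `src_dir_fd`/`dst_dir_fd`, keyword arguments that `pathlib` cannot express, so the related checks must distinguish the plain call from the fd-based variants. A runnable sketch of that distinction (file names are made up, and the fd branch is guarded because `dir_fd` support is platform-dependent):

```python
import os
import tempfile
from pathlib import Path

tmp = tempfile.mkdtemp()
src = Path(tmp) / "src.txt"
src.write_text("data")
dst = Path(tmp) / "dst.txt"

# The plain form has a direct pathlib counterpart: Path.replace.
src.replace(dst)  # same effect as os.replace(src, dst)
moved = dst.read_text()

# The dir_fd variants have no pathlib equivalent, so a pathlib-oriented
# rewrite has to leave them alone.
if os.replace in os.supports_dir_fd:
    fd = os.open(tmp, os.O_RDONLY)
    try:
        os.replace("dst.txt", "src.txt", src_dir_fd=fd, dst_dir_fd=fd)
    finally:
        os.close(fd)
```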
```diff
@@ -1044,6 +1044,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
                         Rule::OsMakedirs,
                         Rule::OsRename,
                         Rule::OsReplace,
+                        Rule::OsGetcwd,
                         Rule::OsStat,
                         Rule::OsPathJoin,
                         Rule::OsPathSamefile,
@@ -1109,9 +1110,6 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
         if checker.is_rule_enabled(Rule::OsReadlink) {
             flake8_use_pathlib::rules::os_readlink(checker, call, segments);
         }
-        if checker.is_rule_enabled(Rule::OsGetcwd) {
-            flake8_use_pathlib::rules::os_getcwd(checker, call, segments);
-        }
         if checker.is_rule_enabled(Rule::PathConstructorCurrentDirectory) {
             flake8_use_pathlib::rules::path_constructor_current_directory(
                 checker, call, segments,
```
```diff
@@ -928,7 +928,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
         (Flake8UsePathlib, "106") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsRmdir),
         (Flake8UsePathlib, "107") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsRemove),
         (Flake8UsePathlib, "108") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsUnlink),
-        (Flake8UsePathlib, "109") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsGetcwd),
+        (Flake8UsePathlib, "109") => (RuleGroup::Stable, rules::flake8_use_pathlib::violations::OsGetcwd),
         (Flake8UsePathlib, "110") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsPathExists),
         (Flake8UsePathlib, "111") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsPathExpanduser),
         (Flake8UsePathlib, "112") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::OsPathIsdir),
```
```diff
@@ -134,11 +134,6 @@ pub(crate) const fn is_fix_os_path_dirname_enabled(settings: &LinterSettings) ->
     settings.preview.is_enabled()
 }
 
-// https://github.com/astral-sh/ruff/pull/19245
-pub(crate) const fn is_fix_os_getcwd_enabled(settings: &LinterSettings) -> bool {
-    settings.preview.is_enabled()
-}
-
 // https://github.com/astral-sh/ruff/pull/11436
 // https://github.com/astral-sh/ruff/pull/11168
 pub(crate) const fn is_dunder_init_fix_unused_import_enabled(settings: &LinterSettings) -> bool {
```
```diff
@@ -1,6 +1,5 @@
 pub(crate) use glob_rule::*;
 pub(crate) use invalid_pathlib_with_suffix::*;
-pub(crate) use os_getcwd::*;
 pub(crate) use os_path_abspath::*;
 pub(crate) use os_path_basename::*;
 pub(crate) use os_path_dirname::*;
@@ -24,7 +23,6 @@ pub(crate) use replaceable_by_pathlib::*;
 
 mod glob_rule;
 mod invalid_pathlib_with_suffix;
-mod os_getcwd;
 mod os_path_abspath;
 mod os_path_basename;
 mod os_path_dirname;
```
```diff
@@ -1,100 +0,0 @@
-use crate::checkers::ast::Checker;
-use crate::importer::ImportRequest;
-use crate::preview::is_fix_os_getcwd_enabled;
-use crate::{FixAvailability, Violation};
-use ruff_diagnostics::{Applicability, Edit, Fix};
-use ruff_macros::{ViolationMetadata, derive_message_formats};
-use ruff_python_ast::ExprCall;
-use ruff_text_size::Ranged;
-
-/// ## What it does
-/// Checks for uses of `os.getcwd` and `os.getcwdb`.
-///
-/// ## Why is this bad?
-/// `pathlib` offers a high-level API for path manipulation, as compared to
-/// the lower-level API offered by `os`. When possible, using `Path` object
-/// methods such as `Path.cwd()` can improve readability over the `os`
-/// module's counterparts (e.g., `os.getcwd()`).
-///
-/// ## Examples
-/// ```python
-/// import os
-///
-/// cwd = os.getcwd()
-/// ```
-///
-/// Use instead:
-/// ```python
-/// from pathlib import Path
-///
-/// cwd = Path.cwd()
-/// ```
-///
-/// ## Known issues
-/// While using `pathlib` can improve the readability and type safety of your code,
-/// it can be less performant than the lower-level alternatives that work directly with strings,
-/// especially on older versions of Python.
-///
-/// ## Fix Safety
-/// This rule's fix is marked as unsafe if the replacement would remove comments attached to the original expression.
-///
-/// ## References
-/// - [Python documentation: `Path.cwd`](https://docs.python.org/3/library/pathlib.html#pathlib.Path.cwd)
-/// - [Python documentation: `os.getcwd`](https://docs.python.org/3/library/os.html#os.getcwd)
-/// - [Python documentation: `os.getcwdb`](https://docs.python.org/3/library/os.html#os.getcwdb)
-/// - [PEP 428 – The pathlib module – object-oriented filesystem paths](https://peps.python.org/pep-0428/)
-/// - [Correspondence between `os` and `pathlib`](https://docs.python.org/3/library/pathlib.html#correspondence-to-tools-in-the-os-module)
-/// - [Why you should be using pathlib](https://treyhunner.com/2018/12/why-you-should-be-using-pathlib/)
-/// - [No really, pathlib is great](https://treyhunner.com/2019/01/no-really-pathlib-is-great/)
-#[derive(ViolationMetadata)]
-pub(crate) struct OsGetcwd;
-
-impl Violation for OsGetcwd {
-    const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
-
-    #[derive_message_formats]
-    fn message(&self) -> String {
-        "`os.getcwd()` should be replaced by `Path.cwd()`".to_string()
-    }
-
-    fn fix_title(&self) -> Option<String> {
-        Some("Replace with `Path.cwd()`".to_string())
-    }
-}
-
-/// PTH109
-pub(crate) fn os_getcwd(checker: &Checker, call: &ExprCall, segments: &[&str]) {
-    if !matches!(segments, ["os", "getcwd" | "getcwdb"]) {
-        return;
-    }
-
-    let range = call.range();
-    let mut diagnostic = checker.report_diagnostic(OsGetcwd, call.func.range());
-
-    if !call.arguments.is_empty() {
-        return;
-    }
-
-    if is_fix_os_getcwd_enabled(checker.settings()) {
-        diagnostic.try_set_fix(|| {
-            let (import_edit, binding) = checker.importer().get_or_import_symbol(
-                &ImportRequest::import("pathlib", "Path"),
-                call.start(),
-                checker.semantic(),
-            )?;
-
-            let applicability = if checker.comment_ranges().intersects(range) {
-                Applicability::Unsafe
-            } else {
-                Applicability::Safe
-            };
-
-            let replacement = format!("{binding}.cwd()");
-
-            Ok(Fix::applicable_edits(
-                Edit::range_replacement(replacement, range),
-                [import_edit],
-                applicability,
-            ))
-        });
-    }
-}
```
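The deleted rule file above matches both `os.getcwd` and `os.getcwdb` (the `["os", "getcwd" | "getcwdb"]` pattern) and suggests `Path.cwd()` for either. A short sketch of what the calls return, which shows why the fix changes the result type:

```python
import os
from pathlib import Path

s = os.getcwd()    # current working directory as str
b = os.getcwdb()   # the same path as bytes

# The bytes form round-trips through the filesystem encoding,
# so decoding it recovers the str form.
assert os.fsdecode(b) == s

# Path.cwd(), the suggested replacement, yields a Path rather than
# str or bytes, so a fix for os.getcwdb() also changes the type.
assert Path.cwd() == Path(s)
```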
```diff
@@ -7,8 +7,8 @@ use crate::checkers::ast::Checker;
 use crate::rules::flake8_use_pathlib::helpers::is_keyword_only_argument_non_default;
 use crate::rules::flake8_use_pathlib::rules::Glob;
 use crate::rules::flake8_use_pathlib::violations::{
-    BuiltinOpen, Joiner, OsChmod, OsListdir, OsMakedirs, OsMkdir, OsPathJoin, OsPathSamefile,
-    OsPathSplitext, OsRename, OsReplace, OsStat, OsSymlink, PyPath,
+    BuiltinOpen, Joiner, OsChmod, OsGetcwd, OsListdir, OsMakedirs, OsMkdir, OsPathJoin,
+    OsPathSamefile, OsPathSplitext, OsRename, OsReplace, OsStat, OsSymlink, PyPath,
 };
 
 pub(crate) fn replaceable_by_pathlib(checker: &Checker, call: &ExprCall) {
@@ -83,6 +83,10 @@ pub(crate) fn replaceable_by_pathlib(checker: &Checker, call: &ExprCall) {
             }
             checker.report_diagnostic_if_enabled(OsReplace, range)
         }
+        // PTH109
+        ["os", "getcwd"] => checker.report_diagnostic_if_enabled(OsGetcwd, range),
+        ["os", "getcwdb"] => checker.report_diagnostic_if_enabled(OsGetcwd, range),
+
         // PTH116
         ["os", "stat"] => {
             // `dir_fd` is not supported by pathlib, so check if it's set to non-default values.
```
```diff
@@ -103,7 +103,6 @@ full_name.py:16:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 17 | b = os.path.exists(p)
 18 | bb = os.path.expanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 full_name.py:17:5: PTH110 `os.path.exists()` should be replaced by `Path.exists()`
    |
@@ -293,7 +292,6 @@ full_name.py:35:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 36 | os.path.join(p, *q)
 37 | os.sep.join(p, *q)
    |
-   = help: Replace with `Path.cwd()`
 
 full_name.py:36:1: PTH118 `os.path.join()` should be replaced by `Path.joinpath()`
    |
@@ -362,21 +360,3 @@ full_name.py:71:1: PTH123 `open()` should be replaced by `Path.open()`
 72 |
 73 | # https://github.com/astral-sh/ruff/issues/17693
    |
-
-full_name.py:108:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
-    |
-106 | os.replace("src", "dst", dst_dir_fd=2)
-107 |
-108 | os.getcwd()
-    | ^^^^^^^^^ PTH109
-109 | os.getcwdb()
-    |
-    = help: Replace with `Path.cwd()`
-
-full_name.py:109:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
-    |
-108 | os.getcwd()
-109 | os.getcwdb()
-    | ^^^^^^^^^^ PTH109
-    |
-    = help: Replace with `Path.cwd()`
@@ -103,7 +103,6 @@ import_as.py:16:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 17 | b = foo_p.exists(p)
 18 | bb = foo_p.expanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 import_as.py:17:5: PTH110 `os.path.exists()` should be replaced by `Path.exists()`
    |
@@ -103,7 +103,6 @@ import_from.py:18:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 19 | b = exists(p)
 20 | bb = expanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 import_from.py:19:5: PTH110 `os.path.exists()` should be replaced by `Path.exists()`
    |
@@ -103,7 +103,6 @@ import_from_as.py:23:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 24 | b = xexists(p)
 25 | bb = xexpanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 import_from_as.py:24:5: PTH110 `os.path.exists()` should be replaced by `Path.exists()`
    |
```
```diff
@@ -168,7 +168,6 @@ full_name.py:16:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 17 | b = os.path.exists(p)
 18 | bb = os.path.expanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 full_name.py:17:5: PTH110 [*] `os.path.exists()` should be replaced by `Path.exists()`
    |
@@ -511,7 +510,6 @@ full_name.py:35:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 36 | os.path.join(p, *q)
 37 | os.sep.join(p, *q)
    |
-   = help: Replace with `Path.cwd()`
 
 full_name.py:36:1: PTH118 `os.path.join()` should be replaced by `Path.joinpath()`
    |
@@ -580,50 +578,3 @@ full_name.py:71:1: PTH123 `open()` should be replaced by `Path.open()`
 72 |
 73 | # https://github.com/astral-sh/ruff/issues/17693
    |
-
-full_name.py:108:1: PTH109 [*] `os.getcwd()` should be replaced by `Path.cwd()`
-    |
-106 | os.replace("src", "dst", dst_dir_fd=2)
-107 |
-108 | os.getcwd()
-    | ^^^^^^^^^ PTH109
-109 | os.getcwdb()
-    |
-    = help: Replace with `Path.cwd()`
-
-ℹ Safe fix
-1 1 | import os
-2 2 | import os.path
-  3 |+import pathlib
-3 4 |
-4 5 | p = "/foo"
-5 6 | q = "bar"
---------------------------------------------------------------------------------
-105 106 | os.replace("src", "dst", src_dir_fd=1)
-106 107 | os.replace("src", "dst", dst_dir_fd=2)
-107 108 |
-108     |-os.getcwd()
-    109 |+pathlib.Path.cwd()
-109 110 | os.getcwdb()
-
-full_name.py:109:1: PTH109 [*] `os.getcwd()` should be replaced by `Path.cwd()`
-    |
-108 | os.getcwd()
-109 | os.getcwdb()
-    | ^^^^^^^^^^ PTH109
-    |
-    = help: Replace with `Path.cwd()`
-
-ℹ Safe fix
-1 1 | import os
-2 2 | import os.path
-  3 |+import pathlib
-3 4 |
-4 5 | p = "/foo"
-5 6 | q = "bar"
---------------------------------------------------------------------------------
-106 107 | os.replace("src", "dst", dst_dir_fd=2)
-107 108 |
-108 109 | os.getcwd()
-109     |-os.getcwdb()
-    110 |+pathlib.Path.cwd()
@@ -168,7 +168,6 @@ import_as.py:16:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 17 | b = foo_p.exists(p)
 18 | bb = foo_p.expanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 import_as.py:17:5: PTH110 [*] `os.path.exists()` should be replaced by `Path.exists()`
    |
@@ -172,7 +172,6 @@ import_from.py:18:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 19 | b = exists(p)
 20 | bb = expanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 import_from.py:19:5: PTH110 [*] `os.path.exists()` should be replaced by `Path.exists()`
    |
@@ -172,7 +172,6 @@ import_from_as.py:23:1: PTH109 `os.getcwd()` should be replaced by `Path.cwd()`
 24 | b = xexists(p)
 25 | bb = xexpanduser(p)
    |
-   = help: Replace with `Path.cwd()`
 
 import_from_as.py:24:5: PTH110 [*] `os.path.exists()` should be replaced by `Path.exists()`
    |
```
```diff
@@ -230,6 +230,52 @@ impl Violation for OsReplace {
     }
 }
 
+/// ## What it does
+/// Checks for uses of `os.getcwd` and `os.getcwdb`.
+///
+/// ## Why is this bad?
+/// `pathlib` offers a high-level API for path manipulation, as compared to
+/// the lower-level API offered by `os`. When possible, using `Path` object
+/// methods such as `Path.cwd()` can improve readability over the `os`
+/// module's counterparts (e.g., `os.getcwd()`).
+///
+/// ## Examples
+/// ```python
+/// import os
+///
+/// cwd = os.getcwd()
+/// ```
+///
+/// Use instead:
+/// ```python
+/// from pathlib import Path
+///
+/// cwd = Path.cwd()
+/// ```
+///
+/// ## Known issues
+/// While using `pathlib` can improve the readability and type safety of your code,
+/// it can be less performant than the lower-level alternatives that work directly with strings,
+/// especially on older versions of Python.
+///
+/// ## References
+/// - [Python documentation: `Path.cwd`](https://docs.python.org/3/library/pathlib.html#pathlib.Path.cwd)
+/// - [Python documentation: `os.getcwd`](https://docs.python.org/3/library/os.html#os.getcwd)
+/// - [Python documentation: `os.getcwdb`](https://docs.python.org/3/library/os.html#os.getcwdb)
+/// - [PEP 428 – The pathlib module – object-oriented filesystem paths](https://peps.python.org/pep-0428/)
+/// - [Correspondence between `os` and `pathlib`](https://docs.python.org/3/library/pathlib.html#correspondence-to-tools-in-the-os-module)
+/// - [Why you should be using pathlib](https://treyhunner.com/2018/12/why-you-should-be-using-pathlib/)
+/// - [No really, pathlib is great](https://treyhunner.com/2019/01/no-really-pathlib-is-great/)
+#[derive(ViolationMetadata)]
+pub(crate) struct OsGetcwd;
+
+impl Violation for OsGetcwd {
+    #[derive_message_formats]
+    fn message(&self) -> String {
+        "`os.getcwd()` should be replaced by `Path.cwd()`".to_string()
+    }
+}
+
 /// ## What it does
 /// Checks for uses of `os.stat`.
 ///
```
```diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_wasm"
-version = "0.12.4"
+version = "0.12.3"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
```
```diff
@@ -469,14 +469,6 @@ impl Project {
         self.set_file_set(db).to(IndexedFiles::lazy());
     }
-
-    /// Check if the project's settings have any issues
-    pub fn check_settings(&self, db: &dyn Db) -> Vec<Diagnostic> {
-        self.settings_diagnostics(db)
-            .iter()
-            .map(OptionDiagnostic::to_diagnostic)
-            .collect()
-    }
 }
 
 #[salsa::tracked(returns(deref), heap_size=get_size2::GetSize::get_heap_size)]
```
```diff
@@ -35,7 +35,6 @@ indexmap = { workspace = true }
 itertools = { workspace = true }
 ordermap = { workspace = true }
 salsa = { workspace = true, features = ["compact_str"] }
-thin-vec = { workspace = true }
 thiserror = { workspace = true }
 tracing = { workspace = true }
 rustc-hash = { workspace = true }
```
```diff
@@ -26,7 +26,6 @@ use crate::semantic_index::place::{
 };
 use crate::semantic_index::use_def::{EagerSnapshotKey, ScopedEagerSnapshotId, UseDefMap};
 use crate::semantic_model::HasTrackedScope;
-use crate::util::get_size::ThinVecSized;
 use crate::util::get_size::untracked_arc_size;
 
 pub mod ast_ids;
@@ -239,7 +238,7 @@ pub(crate) struct SemanticIndex<'db> {
     eager_snapshots: FxHashMap<EagerSnapshotKey, ScopedEagerSnapshotId>,
 
     /// List of all semantic syntax errors in this file.
-    semantic_syntax_errors: ThinVecSized<SemanticSyntaxError>,
+    semantic_syntax_errors: Vec<SemanticSyntaxError>,
 
     /// Set of all generator functions in this file.
     generator_functions: FxHashSet<FileScopeId>,
```
```diff
@@ -115,7 +115,7 @@ pub(super) struct SemanticIndexBuilder<'db, 'ast> {
     generator_functions: FxHashSet<FileScopeId>,
     eager_snapshots: FxHashMap<EagerSnapshotKey, ScopedEagerSnapshotId>,
     /// Errors collected by the `semantic_checker`.
-    semantic_syntax_errors: RefCell<thin_vec::ThinVec<SemanticSyntaxError>>,
+    semantic_syntax_errors: RefCell<Vec<SemanticSyntaxError>>,
 }
 
 impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
@@ -1063,7 +1063,7 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
             imported_modules: Arc::new(self.imported_modules),
             has_future_annotations: self.has_future_annotations,
             eager_snapshots: self.eager_snapshots,
-            semantic_syntax_errors: self.semantic_syntax_errors.into_inner().into(),
+            semantic_syntax_errors: self.semantic_syntax_errors.into_inner(),
             generator_functions: self.generator_functions,
         }
     }
```
```diff
@@ -10,13 +10,13 @@ use ruff_index::{IndexVec, newtype_index};
 use ruff_python_ast as ast;
 use ruff_python_ast::name::Name;
 use rustc_hash::FxHasher;
+use smallvec::{SmallVec, smallvec};
 
 use crate::Db;
 use crate::ast_node_ref::AstNodeRef;
 use crate::node_key::NodeKey;
 use crate::semantic_index::reachability_constraints::ScopedReachabilityConstraintId;
 use crate::semantic_index::{PlaceSet, SemanticIndex, semantic_index};
-use crate::util::get_size::ThinVecSized;
 
 #[derive(Debug, Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
 pub(crate) enum PlaceExprSubSegment {
@@ -41,7 +41,7 @@ impl PlaceExprSubSegment {
 #[derive(Eq, PartialEq, Debug, get_size2::GetSize)]
 pub struct PlaceExpr {
     root_name: Name,
-    sub_segments: ThinVecSized<PlaceExprSubSegment>,
+    sub_segments: SmallVec<[PlaceExprSubSegment; 1]>,
 }
 
 impl std::fmt::Display for PlaceExpr {
@@ -165,7 +165,7 @@ impl PlaceExpr {
     pub(crate) fn name(name: Name) -> Self {
         Self {
             root_name: name,
-            sub_segments: ThinVecSized::new(),
+            sub_segments: smallvec![],
         }
     }
@@ -230,9 +230,7 @@ impl std::fmt::Display for PlaceExprWithFlags {
 }
 
 impl PlaceExprWithFlags {
-    pub(crate) fn new(mut expr: PlaceExpr) -> Self {
-        expr.sub_segments.shrink_to_fit();
-
+    pub(crate) fn new(expr: PlaceExpr) -> Self {
         PlaceExprWithFlags {
             expr,
             flags: PlaceFlags::empty(),
```
```diff
@@ -71,12 +71,16 @@ impl ScopedDefinitionId {
     }
 }
 
+/// Can keep inline this many live bindings or declarations per place at a given time; more will
+/// go to heap.
+const INLINE_DEFINITIONS_PER_PLACE: usize = 4;
+
 /// Live declarations for a single place at some point in control flow, with their
 /// corresponding reachability constraints.
 #[derive(Clone, Debug, Default, PartialEq, Eq, salsa::Update, get_size2::GetSize)]
 pub(super) struct Declarations {
     /// A list of live declarations for this place, sorted by their `ScopedDefinitionId`
-    live_declarations: SmallVec<[LiveDeclaration; 2]>,
+    live_declarations: SmallVec<[LiveDeclaration; INLINE_DEFINITIONS_PER_PLACE]>,
 }
 
 /// One of the live declarations for a single place at some point in control flow.
@@ -195,7 +199,7 @@ pub(super) struct Bindings {
     /// "unbound" binding.
     unbound_narrowing_constraint: Option<ScopedNarrowingConstraint>,
     /// A list of live bindings for this place, sorted by their `ScopedDefinitionId`
-    live_bindings: SmallVec<[LiveBinding; 2]>,
+    live_bindings: SmallVec<[LiveBinding; INLINE_DEFINITIONS_PER_PLACE]>,
 }
 
 impl Bindings {
```
@@ -6,6 +6,7 @@ use std::slice::Iter;
|
||||
|
||||
use bitflags::bitflags;
|
||||
use call::{CallDunderError, CallError, CallErrorKind};
|
||||
use compact_str::{CompactString, ToCompactString};
|
||||
use context::InferContext;
|
||||
use diagnostic::{
|
||||
INVALID_CONTEXT_MANAGER, INVALID_SUPER_ARGUMENT, NOT_ITERABLE, POSSIBLY_UNBOUND_IMPLICIT_CALL,
|
||||
@@ -16,6 +17,7 @@ use ruff_db::files::File;
|
||||
use ruff_python_ast::name::Name;
|
||||
use ruff_python_ast::{self as ast, AnyNodeRef};
|
||||
use ruff_text_size::{Ranged, TextRange};
|
||||
use salsa::plumbing::interned::Lookup;
|
||||
use type_ordering::union_or_intersection_elements_ordering;
|
||||
|
||||
pub(crate) use self::builder::{IntersectionBuilder, UnionBuilder};
|
||||
@@ -93,7 +95,7 @@ mod definition;
|
||||
#[cfg(test)]
|
||||
mod property_tests;
|
||||
|
||||
pub fn check_types(db: &dyn Db, file: File) -> thin_vec::ThinVec<Diagnostic> {
|
||||
pub fn check_types(db: &dyn Db, file: File) -> Vec<Diagnostic> {
|
||||
let _span = tracing::trace_span!("check_types", ?file).entered();
|
||||
|
||||
tracing::debug!("Checking file '{path}'", path = file.path(db));
|
||||
@@ -938,7 +940,7 @@ impl<'db> Type<'db> {
|
||||
}
|
||||
|
||||
pub fn string_literal(db: &'db dyn Db, string: &str) -> Self {
|
||||
Self::StringLiteral(StringLiteralType::new(db, string))
|
||||
+        Self::StringLiteral(StringLiteralType::new(db, StringLiteralValue::new(string)))
    }

    pub fn bytes_literal(db: &'db dyn Db, bytes: &[u8]) -> Self {
@@ -5576,7 +5578,7 @@ impl<'db> Type<'db> {
            Type::SpecialForm(special_form) => Type::string_literal(db, special_form.repr()),
            Type::KnownInstance(known_instance) => Type::StringLiteral(StringLiteralType::new(
                db,
-                known_instance.repr(db).to_string().into_boxed_str(),
+                StringLiteralValue::from(known_instance.repr(db).to_compact_string()),
            )),
            // TODO: handle more complex types
            _ => KnownClass::Str.to_instance(db),
@@ -5598,7 +5600,7 @@ impl<'db> Type<'db> {
            Type::SpecialForm(special_form) => Type::string_literal(db, special_form.repr()),
            Type::KnownInstance(known_instance) => Type::StringLiteral(StringLiteralType::new(
                db,
-                known_instance.repr(db).to_string().into_boxed_str(),
+                StringLiteralValue::from(known_instance.repr(db).to_compact_string()),
            )),
            // TODO: handle more complex types
            _ => KnownClass::Str.to_instance(db),
@@ -8280,7 +8282,7 @@ impl<'db> IntersectionType<'db> {
#[derive(PartialOrd, Ord)]
pub struct StringLiteralType<'db> {
    #[returns(deref)]
-    value: Box<str>,
+    value: StringLiteralValue,
}

// The Salsa heap is tracked separately.
@@ -8297,7 +8299,41 @@ impl<'db> StringLiteralType<'db> {
    pub(crate) fn iter_each_char(self, db: &'db dyn Db) -> impl Iterator<Item = Self> {
        self.value(db)
            .chars()
-            .map(|c| StringLiteralType::new(db, c.to_string().into_boxed_str()))
+            .map(|c| StringLiteralType::new(db, StringLiteralValue::from_char(c)))
    }
}

+/// Newtype wrapper around `compact_str`'s `CompactString` so that `Lookup` can be implemented for it.
+#[derive(PartialEq, Eq, Debug, Clone, Hash, Ord, PartialOrd, get_size2::GetSize)]
+pub struct StringLiteralValue(CompactString);
+
+impl StringLiteralValue {
+    fn new(value: impl AsRef<str>) -> Self {
+        StringLiteralValue(CompactString::new(value.as_ref()))
+    }
+
+    fn from_char(c: char) -> Self {
+        StringLiteralValue(c.to_compact_string())
+    }
+}
+
+impl Lookup<StringLiteralValue> for &str {
+    fn into_owned(self) -> StringLiteralValue {
+        StringLiteralValue(CompactString::new(self))
+    }
+}
+
+impl std::ops::Deref for StringLiteralValue {
+    type Target = str;
+
+    fn deref(&self) -> &Self::Target {
+        &self.0
+    }
+}
+
+impl From<compact_str::CompactString> for StringLiteralValue {
+    fn from(value: compact_str::CompactString) -> Self {
+        StringLiteralValue(value)
+    }
+}
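The `StringLiteralValue` newtype above exists so that traits like `Lookup` can be implemented for the stored string type. As a rough stdlib-only sketch of the same newtype-plus-`Deref` pattern, using `String` in place of `CompactString` (the names here are illustrative, not from the diff):

```rust
use std::ops::Deref;

/// Newtype over the backing string so local traits can be implemented for it.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct LiteralValue(String);

impl LiteralValue {
    fn new(value: impl AsRef<str>) -> Self {
        LiteralValue(value.as_ref().to_owned())
    }

    fn from_char(c: char) -> Self {
        LiteralValue(c.to_string())
    }
}

// Deref to `str` lets callers use the wrapper anywhere a `&str` is expected.
impl Deref for LiteralValue {
    type Target = str;

    fn deref(&self) -> &str {
        &self.0
    }
}

impl From<String> for LiteralValue {
    fn from(value: String) -> Self {
        LiteralValue(value)
    }
}

fn main() {
    let v = LiteralValue::new("abc");
    assert_eq!(v.len(), 3); // `str::len` reached through Deref
    assert_eq!(&*LiteralValue::from_char('x'), "x");
    assert_eq!(&*LiteralValue::from(String::from("hi")), "hi");
}
```

The `Deref` impl is what keeps the change mechanical for callers: existing code that read the field as `&str` keeps compiling against the wrapper.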
@@ -18,7 +18,6 @@ use crate::types::string_annotation::{
use crate::types::tuple::TupleType;
use crate::types::{SpecialFormType, Type, protocol_class::ProtocolClassLiteral};
use crate::util::diagnostics::format_enumeration;
-use crate::util::get_size::ThinVecSized;
use crate::{Db, FxIndexMap, Module, ModuleName, Program, declare_lint};
use itertools::Itertools;
use ruff_db::diagnostic::{Annotation, Diagnostic, Severity, SubDiagnostic};
@@ -1615,7 +1614,7 @@ declare_lint! {
/// A collection of type check diagnostics.
#[derive(Default, Eq, PartialEq, get_size2::GetSize)]
pub struct TypeCheckDiagnostics {
-    diagnostics: ThinVecSized<Diagnostic>,
+    diagnostics: Vec<Diagnostic>,
    used_suppressions: FxHashSet<FileSuppressionId>,
}

@@ -1650,8 +1649,8 @@ impl TypeCheckDiagnostics {
        self.diagnostics.shrink_to_fit();
    }

-    pub(crate) fn into_vec(self) -> thin_vec::ThinVec<Diagnostic> {
-        self.diagnostics.into_inner()
+    pub(crate) fn into_vec(self) -> Vec<Diagnostic> {
+        self.diagnostics
    }

    pub fn iter(&self) -> std::slice::Iter<'_, Diagnostic> {
@@ -1667,7 +1666,7 @@ impl std::fmt::Debug for TypeCheckDiagnostics {

impl IntoIterator for TypeCheckDiagnostics {
    type Item = Diagnostic;
-    type IntoIter = thin_vec::IntoIter<Diagnostic>;
+    type IntoIter = std::vec::IntoIter<Diagnostic>;

    fn into_iter(self) -> Self::IntoIter {
        self.diagnostics.into_iter()
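The hunks above replace the `ThinVecSized<Diagnostic>` storage with a plain `Vec<Diagnostic>`, so `into_vec` and `IntoIterator` now delegate directly to the inner vector. A minimal stand-alone sketch of that wrapper shape (`String` stands in for the real `Diagnostic` type):

```rust
/// Sketch of a diagnostics collection delegating to an inner `Vec`,
/// mirroring the shape of `TypeCheckDiagnostics` after this change.
#[derive(Default, Debug)]
struct Diagnostics {
    items: Vec<String>, // stand-in for Vec<Diagnostic>
}

impl Diagnostics {
    fn push(&mut self, diagnostic: String) {
        self.items.push(diagnostic);
    }

    /// Consuming accessor: hands the caller the inner vector directly.
    fn into_vec(self) -> Vec<String> {
        self.items
    }
}

// Owned iteration is just the inner vector's iterator.
impl IntoIterator for Diagnostics {
    type Item = String;
    type IntoIter = std::vec::IntoIter<String>;

    fn into_iter(self) -> Self::IntoIter {
        self.items.into_iter()
    }
}

fn main() {
    let mut diagnostics = Diagnostics::default();
    diagnostics.push("unused import".to_string());
    let collected: Vec<String> = diagnostics.into_iter().collect();
    assert_eq!(collected, vec!["unused import".to_string()]);
}
```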
@@ -1,4 +1,3 @@
-use std::ops::{Deref, DerefMut};
use std::sync::Arc;

use get_size2::GetSize;
@@ -14,105 +13,3 @@ where
{
    T::get_heap_size(&**arc)
}
-
-#[derive(Clone, Hash, PartialEq, Eq)]
-pub(crate) struct ThinVecSized<T>(thin_vec::ThinVec<T>);
-
-impl<T> ThinVecSized<T> {
-    pub(crate) fn into_inner(self) -> thin_vec::ThinVec<T> {
-        self.0
-    }
-
-    pub(crate) fn new() -> Self {
-        Self(thin_vec::ThinVec::new())
-    }
-}
-
-impl<T> Default for ThinVecSized<T> {
-    fn default() -> Self {
-        Self::new()
-    }
-}
-
-#[allow(unsafe_code)]
-unsafe impl<T> salsa::Update for ThinVecSized<T>
-where
-    T: salsa::Update,
-{
-    unsafe fn maybe_update(old_pointer: *mut Self, new_value: Self) -> bool {
-        let old: &mut Self = unsafe { &mut *old_pointer };
-
-        unsafe { salsa::Update::maybe_update(&raw mut old.0, new_value.0) }
-    }
-}
-
-impl<T> std::fmt::Debug for ThinVecSized<T>
-where
-    T: std::fmt::Debug,
-{
-    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
-        self.0.fmt(f)
-    }
-}
-
-impl<T> From<thin_vec::ThinVec<T>> for ThinVecSized<T> {
-    fn from(vec: thin_vec::ThinVec<T>) -> Self {
-        Self(vec)
-    }
-}
-
-impl<T> Deref for ThinVecSized<T> {
-    type Target = thin_vec::ThinVec<T>;
-
-    fn deref(&self) -> &Self::Target {
-        &self.0
-    }
-}
-
-impl<T> DerefMut for ThinVecSized<T> {
-    fn deref_mut(&mut self) -> &mut Self::Target {
-        &mut self.0
-    }
-}
-
-impl<T> GetSize for ThinVecSized<T>
-where
-    T: GetSize,
-{
-    fn get_heap_size(&self) -> usize {
-        let mut total = 0;
-        for v in self {
-            total += GetSize::get_size(v);
-        }
-        let additional: usize = self.capacity() - self.len();
-        total += additional * T::get_stack_size();
-        total
-    }
-}
-
-impl<'a, T> IntoIterator for &'a ThinVecSized<T> {
-    type Item = &'a T;
-    type IntoIter = std::slice::Iter<'a, T>;
-
-    fn into_iter(self) -> Self::IntoIter {
-        self.0.iter()
-    }
-}
-
-impl<'a, T> IntoIterator for &'a mut ThinVecSized<T> {
-    type Item = &'a mut T;
-    type IntoIter = std::slice::IterMut<'a, T>;
-
-    fn into_iter(self) -> Self::IntoIter {
-        self.0.iter_mut()
-    }
-}
-
-impl<T> IntoIterator for ThinVecSized<T> {
-    type Item = T;
-    type IntoIter = thin_vec::IntoIter<T>;
-
-    fn into_iter(self) -> Self::IntoIter {
-        self.0.into_iter()
-    }
-}
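The removed `GetSize` impl above accounted for a vector's heap usage as the heap-plus-stack size of each live element plus the stack size of every spare (allocated but unused) slot. A stdlib-only sketch of that accounting for a `Vec<String>` (illustrative arithmetic, not the `get_size2` API):

```rust
/// Heap bytes attributable to a `Vec<String>`:
/// every allocated slot costs `size_of::<String>()` in the vector's buffer,
/// and each live element additionally owns its own string buffer.
fn vec_heap_size(v: &Vec<String>) -> usize {
    let mut total = 0;
    for s in v {
        // The element's slot in the vector's allocation, plus its own heap buffer.
        total += std::mem::size_of::<String>() + s.capacity();
    }
    // Spare capacity still occupies heap memory, one stack-size per unused slot.
    total += (v.capacity() - v.len()) * std::mem::size_of::<String>();
    total
}

fn main() {
    let mut v: Vec<String> = Vec::with_capacity(4);
    v.push(String::from("ab"));
    // Equivalent closed form: one slot per unit of capacity, plus element buffers.
    let expected = std::mem::size_of::<String>() * v.capacity() + v[0].capacity();
    assert_eq!(vec_heap_size(&v), expected);
}
```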
@@ -23,7 +23,6 @@ ty_python_semantic = { workspace = true }
ty_vendored = { workspace = true }

anyhow = { workspace = true }
-bitflags = { workspace = true }
crossbeam = { workspace = true }
jod-thread = { workspace = true }
lsp-server = { workspace = true }
@@ -21,8 +21,8 @@ mod schedule;

use crate::session::client::Client;
pub(crate) use api::Error;
-pub(crate) use api::publish_settings_diagnostics;
pub(crate) use main_loop::{Action, ConnectionSender, Event, MainLoopReceiver, MainLoopSender};

pub(crate) type Result<T> = std::result::Result<T, api::Error>;

pub(crate) struct Server {
@@ -17,7 +17,6 @@ mod traits;
use self::traits::{NotificationHandler, RequestHandler};
use super::{Result, schedule::BackgroundSchedule};
use crate::session::client::Client;
-pub(crate) use diagnostics::publish_settings_diagnostics;
use ruff_db::panic::PanicError;

/// Processes a request from the client to the server.
@@ -8,13 +8,11 @@ use rustc_hash::FxHashMap;
use ruff_db::diagnostic::{Annotation, Severity, SubDiagnostic};
use ruff_db::files::FileRange;
use ruff_db::source::{line_index, source_text};
-use ruff_db::system::SystemPathBuf;
use ty_project::{Db, ProjectDatabase};

use crate::document::{DocumentKey, FileRangeExt, ToRangeExt};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use crate::system::{AnySystemPath, file_to_url};
use crate::{PositionEncoding, Session};

/// Represents the diagnostics for a text document or a notebook document.
@@ -66,7 +64,7 @@ pub(super) fn clear_diagnostics(key: &DocumentKey, client: &Client) {
///
/// [publish diagnostics notification]: https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_publishDiagnostics
pub(super) fn publish_diagnostics(session: &Session, key: &DocumentKey, client: &Client) {
-    if session.client_capabilities().supports_pull_diagnostics() {
+    if session.client_capabilities().pull_diagnostics {
        return;
    }

@@ -111,82 +109,6 @@ pub(super) fn publish_diagnostics(session: &Session, key: &DocumentKey, client:
    }
}

-/// Publishes settings diagnostics for all the project at the given path
-/// using the [publish diagnostics notification].
-///
-/// [publish diagnostics notification]: https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_publishDiagnostics
-pub(crate) fn publish_settings_diagnostics(
-    session: &mut Session,
-    client: &Client,
-    path: SystemPathBuf,
-) {
-    // Don't publish settings diagnostics for workspace that are already doing full diagnostics.
-    //
-    // Note we DO NOT respect the fact that clients support pulls because these are
-    // files they *specifically* won't pull diagnostics from us for, because we don't
-    // claim to be an LSP for them.
-    let has_workspace_diagnostics = session
-        .workspaces()
-        .for_path(&path)
-        .map(|workspace| workspace.settings().diagnostic_mode().is_workspace())
-        .unwrap_or(false);
-    if has_workspace_diagnostics {
-        return;
-    }
-
-    let session_encoding = session.position_encoding();
-    let state = session.project_state_mut(&AnySystemPath::System(path));
-    let db = &state.db;
-    let project = db.project();
-    let settings_diagnostics = project.check_settings(db);
-
-    // We need to send diagnostics if we have non-empty ones, or we have ones to clear.
-    // These will both almost always be empty so this function will almost always be a no-op.
-    if settings_diagnostics.is_empty() && state.untracked_files_with_pushed_diagnostics.is_empty() {
-        return;
-    }
-
-    // Group diagnostics by URL
-    let mut diagnostics_by_url: FxHashMap<Url, Vec<_>> = FxHashMap::default();
-    for diagnostic in settings_diagnostics {
-        if let Some(span) = diagnostic.primary_span() {
-            let file = span.expect_ty_file();
-            let Some(url) = file_to_url(db, file) else {
-                tracing::debug!("Failed to convert file to URL at {}", file.path(db));
-                continue;
-            };
-            diagnostics_by_url.entry(url).or_default().push(diagnostic);
-        }
-    }
-
-    // Record the URLs we're sending non-empty diagnostics for, so we know to clear them
-    // the next time we publish settings diagnostics!
-    let old_untracked = std::mem::replace(
-        &mut state.untracked_files_with_pushed_diagnostics,
-        diagnostics_by_url.keys().cloned().collect(),
-    );
-
-    // Add empty diagnostics for any files that had diagnostics before but don't now.
-    // This will clear them (either the file is no longer relevant to us or fixed!)
-    for url in old_untracked {
-        diagnostics_by_url.entry(url).or_default();
-    }
-    // Send the settings diagnostics!
-    for (url, file_diagnostics) in diagnostics_by_url {
-        // Convert diagnostics to LSP format
-        let lsp_diagnostics = file_diagnostics
-            .into_iter()
-            .map(|diagnostic| to_lsp_diagnostic(db, &diagnostic, session_encoding))
-            .collect::<Vec<_>>();
-
-        client.send_notification::<PublishDiagnostics>(PublishDiagnosticsParams {
-            uri: url,
-            diagnostics: lsp_diagnostics,
-            version: None,
-        });
-    }
-}
-
pub(super) fn compute_diagnostics(
    db: &ProjectDatabase,
    snapshot: &DocumentSnapshot,
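The removed `publish_settings_diagnostics` clears stale push diagnostics by remembering which URLs it published last time and emitting an empty list for any URL that no longer has diagnostics. A stdlib sketch of that diffing step, with `String` keys standing in for `Url` and the diagnostic types:

```rust
use std::collections::HashMap;
use std::mem;

/// Sketch of the clear-then-publish bookkeeping: swap in the new URL set,
/// then add an empty (clearing) entry for every URL pushed last time that
/// has no diagnostics now.
fn publish(
    previously_pushed: &mut Vec<String>,
    mut by_url: HashMap<String, Vec<String>>,
) -> HashMap<String, Vec<String>> {
    // Record the URLs we're about to push so they can be cleared next time.
    let old = mem::replace(previously_pushed, by_url.keys().cloned().collect());
    // A URL pushed last time but absent now gets an empty entry, which clears it.
    for url in old {
        by_url.entry(url).or_default();
    }
    by_url
}

fn main() {
    let mut pushed = vec!["a.toml".to_string()];
    let mut current = HashMap::new();
    current.insert("b.toml".to_string(), vec!["typo in setting".to_string()]);

    let out = publish(&mut pushed, current);
    assert_eq!(out["a.toml"], Vec::<String>::new()); // cleared
    assert_eq!(out["b.toml"].len(), 1); // still published
    assert_eq!(pushed, vec!["b.toml".to_string()]); // remembered for next round
}
```

The `mem::replace` is what makes the bookkeeping self-maintaining: the "previously pushed" set is updated in the same step that reads it.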
@@ -1,5 +1,5 @@
use crate::server::Result;
-use crate::server::api::diagnostics::{publish_diagnostics, publish_settings_diagnostics};
+use crate::server::api::diagnostics::publish_diagnostics;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::session::Session;
use crate::session::client::Client;
@@ -88,8 +88,7 @@ impl SyncNotificationHandler for DidChangeWatchedFiles {
        for (root, changes) in events_by_db {
            tracing::debug!("Applying changes to `{root}`");

-            let result = session.apply_changes(&AnySystemPath::System(root.clone()), changes);
-            publish_settings_diagnostics(session, client, root);
+            let result = session.apply_changes(&AnySystemPath::System(root), changes);

            project_changed |= result.project_changed();
        }
@@ -97,7 +96,7 @@ impl SyncNotificationHandler for DidChangeWatchedFiles {
        let client_capabilities = session.client_capabilities();

        if project_changed {
-            if client_capabilities.supports_workspace_diagnostic_refresh() {
+            if client_capabilities.diagnostics_refresh {
                client.send_request::<types::request::WorkspaceDiagnosticRefresh>(
                    session,
                    (),
@@ -108,10 +107,11 @@ impl SyncNotificationHandler for DidChangeWatchedFiles {
                    publish_diagnostics(session, &key, client);
                }
            }

+            // TODO: always publish diagnostics for notebook files (since they don't use pull diagnostics)
        }

-        if client_capabilities.supports_inlay_hint_refresh() {
+        if client_capabilities.inlay_refresh {
            client.send_request::<types::request::InlayHintRefreshRequest>(session, (), |_, ()| {});
        }
@@ -52,7 +52,7 @@ impl BackgroundDocumentRequestHandler for GotoDeclarationRequestHandler {

        if snapshot
            .resolved_client_capabilities()
-            .supports_declaration_link()
+            .type_definition_link_support
        {
            let src = Some(ranged.range);
            let links: Vec<_> = ranged

@@ -52,7 +52,7 @@ impl BackgroundDocumentRequestHandler for GotoDefinitionRequestHandler {

        if snapshot
            .resolved_client_capabilities()
-            .supports_definition_link()
+            .type_definition_link_support
        {
            let src = Some(ranged.range);
            let links: Vec<_> = ranged

@@ -52,7 +52,7 @@ impl BackgroundDocumentRequestHandler for GotoTypeDefinitionRequestHandler {

        if snapshot
            .resolved_client_capabilities()
-            .supports_type_definition_link()
+            .type_definition_link_support
        {
            let src = Some(ranged.range);
            let links: Vec<_> = ranged

@@ -52,7 +52,7 @@ impl BackgroundDocumentRequestHandler for HoverRequestHandler {

        let (markup_kind, lsp_markup_kind) = if snapshot
            .resolved_client_capabilities()
-            .prefers_markdown_in_hover()
+            .hover_prefer_markdown
        {
            (MarkupKind::Markdown, lsp_types::MarkupKind::Markdown)
        } else {

@@ -41,7 +41,7 @@ impl BackgroundDocumentRequestHandler for SemanticTokensRequestHandler {
            snapshot.encoding(),
            snapshot
                .resolved_client_capabilities()
-                .supports_multiline_semantic_tokens(),
+                .semantic_tokens_multiline_support,
        );

        Ok(Some(SemanticTokensResult::Tokens(SemanticTokens {

@@ -51,7 +51,7 @@ impl BackgroundDocumentRequestHandler for SemanticTokensRangeRequestHandler {
            snapshot.encoding(),
            snapshot
                .resolved_client_capabilities()
-                .supports_multiline_semantic_tokens(),
+                .semantic_tokens_multiline_support,
        );

        Ok(Some(SemanticTokensRangeResult::Tokens(SemanticTokens {

@@ -71,7 +71,7 @@ impl BackgroundDocumentRequestHandler for SignatureHelpRequestHandler {
            .parameters
            .into_iter()
            .map(|param| {
-                let label = if resolved_capabilities.supports_signature_label_offset() {
+                let label = if resolved_capabilities.signature_label_offset_support {
                    // Find the parameter's offset in the signature label
                    if let Some(start) = sig.label.find(&param.label) {
                        let encoding = snapshot.encoding();
@@ -114,12 +114,11 @@ impl BackgroundDocumentRequestHandler for SignatureHelpRequestHandler {
                })
                .collect();

-            let active_parameter =
-                if resolved_capabilities.supports_signature_active_parameter() {
-                    sig.active_parameter.and_then(|p| u32::try_from(p).ok())
-                } else {
-                    None
-                };
+            let active_parameter = if resolved_capabilities.signature_active_parameter_support {
+                sig.active_parameter.and_then(|p| u32::try_from(p).ok())
+            } else {
+                None
+            };

            SignatureInformation {
                label: sig.label,
@@ -21,7 +21,6 @@ pub(crate) use self::index::DocumentQuery;
pub(crate) use self::options::{AllOptions, ClientOptions, DiagnosticMode};
pub(crate) use self::settings::ClientSettings;
use crate::document::{DocumentKey, DocumentVersion, NotebookDocument};
-use crate::server::publish_settings_diagnostics;
use crate::session::client::Client;
use crate::session::request_queue::RequestQueue;
use crate::system::{AnySystemPath, LSPSystem};
@@ -50,7 +49,7 @@ pub(crate) struct Session {
    workspaces: Workspaces,

    /// The projects across all workspaces.
-    projects: BTreeMap<SystemPathBuf, ProjectState>,
+    projects: BTreeMap<SystemPathBuf, ProjectDatabase>,

    /// The project to use for files outside any workspace. For example, if the user
    /// opens the project `<home>/my_project` in VS code but they then opens a Python file from their Desktop.
@@ -63,7 +62,7 @@ pub(crate) struct Session {
    position_encoding: PositionEncoding,

    /// Tracks what LSP features the client supports and doesn't support.
-    resolved_client_capabilities: ResolvedClientCapabilities,
+    resolved_client_capabilities: Arc<ResolvedClientCapabilities>,

    /// Tracks the pending requests between client and server.
    request_queue: RequestQueue,
@@ -74,25 +73,6 @@ pub(crate) struct Session {
    deferred_messages: VecDeque<Message>,
}

-/// LSP State for a Project
-pub(crate) struct ProjectState {
-    pub(crate) db: ProjectDatabase,
-    /// Files that we have outstanding otherwise-untracked pushed diagnostics for.
-    ///
-    /// In `CheckMode::OpenFiles` we still read some files that the client hasn't
-    /// told us to open. Notably settings files like `pyproject.toml`. In this
-    /// mode the client will never pull diagnostics for that file, and because
-    /// the file isn't formally "open" we also don't have a reliable signal to
-    /// refresh diagnostics for it either.
-    ///
-    /// However diagnostics for those files include things like "you typo'd your
-    /// configuration for the LSP itself", so it's really important that we tell
-    /// the user about them! So we remember which ones we have emitted diagnostics
-    /// for so that we can clear the diagnostics for all of them before we go
-    /// to update any of them.
-    pub(crate) untracked_files_with_pushed_diagnostics: Vec<Url>,
-}
-
impl Session {
    pub(crate) fn new(
        client_capabilities: &ClientCapabilities,
@@ -114,7 +94,9 @@ impl Session {
            index: Some(index),
            default_project: DefaultProject::new(),
            projects: BTreeMap::new(),
-            resolved_client_capabilities: ResolvedClientCapabilities::new(client_capabilities),
+            resolved_client_capabilities: Arc::new(ResolvedClientCapabilities::new(
+                client_capabilities,
+            )),
            request_queue: RequestQueue::new(),
            shutdown_requested: false,
        })
@@ -188,7 +170,17 @@ impl Session {
    ///
    /// If the path is a virtual path, it will return the first project database in the session.
    pub(crate) fn project_db(&self, path: &AnySystemPath) -> &ProjectDatabase {
-        &self.project_state(path).db
+        match path {
+            AnySystemPath::System(system_path) => self
+                .project_db_for_path(system_path)
+                .unwrap_or_else(|| self.default_project.get(self.index.as_ref())),
+            AnySystemPath::SystemVirtual(_virtual_path) => {
+                // TODO: Currently, ty only supports single workspace but we need to figure out
+                // which project should this virtual path belong to when there are multiple
+                // projects: https://github.com/astral-sh/ty/issues/794
+                self.projects.iter().next().map(|(_, db)| db).unwrap()
+            }
+        }
    }

    /// Returns a mutable reference to the project's [`ProjectDatabase`] in which the given `path`
@@ -198,7 +190,20 @@ impl Session {
    ///
    /// [`project_db`]: Session::project_db
    pub(crate) fn project_db_mut(&mut self, path: &AnySystemPath) -> &mut ProjectDatabase {
-        &mut self.project_state_mut(path).db
+        match path {
+            AnySystemPath::System(system_path) => self
+                .projects
+                .range_mut(..=system_path.to_path_buf())
+                .next_back()
+                .map(|(_, db)| db)
+                .unwrap_or_else(|| self.default_project.get_mut(self.index.as_ref())),
+            AnySystemPath::SystemVirtual(_virtual_path) => {
+                // TODO: Currently, ty only supports single workspace but we need to figure out
+                // which project should this virtual path belong to when there are multiple
+                // projects: https://github.com/astral-sh/ty/issues/794
+                self.projects.iter_mut().next().map(|(_, db)| db).unwrap()
+            }
+        }
    }

    /// Returns a reference to the project's [`ProjectDatabase`] corresponding to the given path, if
@@ -207,70 +212,10 @@ impl Session {
        &self,
        path: impl AsRef<SystemPath>,
    ) -> Option<&ProjectDatabase> {
-        self.project_state_for_path(path).map(|state| &state.db)
-    }
-
-    /// Returns a reference to the project's [`ProjectState`] in which the given `path` belongs.
-    ///
-    /// If the path is a system path, it will return the project database that is closest to the
-    /// given path, or the default project if no project is found for the path.
-    ///
-    /// If the path is a virtual path, it will return the first project database in the session.
-    pub(crate) fn project_state(&self, path: &AnySystemPath) -> &ProjectState {
-        match path {
-            AnySystemPath::System(system_path) => self
-                .project_state_for_path(system_path)
-                .unwrap_or_else(|| self.default_project.get(self.index.as_ref())),
-            AnySystemPath::SystemVirtual(_virtual_path) => {
-                // TODO: Currently, ty only supports single workspace but we need to figure out
-                // which project should this virtual path belong to when there are multiple
-                // projects: https://github.com/astral-sh/ty/issues/794
-                self.projects
-                    .iter()
-                    .next()
-                    .map(|(_, project)| project)
-                    .unwrap()
-            }
-        }
-    }
-
-    /// Returns a mutable reference to the project's [`ProjectState`] in which the given `path`
-    /// belongs.
-    ///
-    /// Refer to [`project_db`] for more details on how the project is selected.
-    ///
-    /// [`project_db`]: Session::project_db
-    pub(crate) fn project_state_mut(&mut self, path: &AnySystemPath) -> &mut ProjectState {
-        match path {
-            AnySystemPath::System(system_path) => self
-                .projects
-                .range_mut(..=system_path.to_path_buf())
-                .next_back()
-                .map(|(_, project)| project)
-                .unwrap_or_else(|| self.default_project.get_mut(self.index.as_ref())),
-            AnySystemPath::SystemVirtual(_virtual_path) => {
-                // TODO: Currently, ty only supports single workspace but we need to figure out
-                // which project should this virtual path belong to when there are multiple
-                // projects: https://github.com/astral-sh/ty/issues/794
-                self.projects
-                    .iter_mut()
-                    .next()
-                    .map(|(_, project)| project)
-                    .unwrap()
-            }
-        }
-    }
-
-    /// Returns a reference to the project's [`ProjectState`] corresponding to the given path, if
-    /// any.
-    pub(crate) fn project_state_for_path(
-        &self,
-        path: impl AsRef<SystemPath>,
-    ) -> Option<&ProjectState> {
        self.projects
            .range(..=path.as_ref().to_path_buf())
            .next_back()
-            .map(|(_, project)| project)
+            .map(|(_, db)| db)
    }

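The new `project_db_for_path` body relies on `BTreeMap::range(..=path).next_back()`: because project roots sort lexicographically, the greatest key at or before a file path is the candidate enclosing project root. A stand-alone sketch of that lookup (note: the nearest smaller key is only a candidate ancestor in general; this mirrors the diff's heuristic, and the names are illustrative):

```rust
use std::collections::BTreeMap;

/// Pick the project whose root is the greatest key that sorts at or
/// before `path` — the `range(..=path).next_back()` idiom from the diff.
fn project_for<'a>(projects: &'a BTreeMap<String, String>, path: &str) -> Option<&'a str> {
    projects
        .range(..=path.to_string())
        .next_back()
        .map(|(_, name)| name.as_str())
}

fn main() {
    let mut projects = BTreeMap::new();
    projects.insert("/repo/a".to_string(), "project-a".to_string());
    projects.insert("/repo/b".to_string(), "project-b".to_string());

    // "/repo/b/src/main.py" sorts after "/repo/b" and before any later root.
    assert_eq!(project_for(&projects, "/repo/b/src/main.py"), Some("project-b"));
    assert_eq!(project_for(&projects, "/repo/a/x.py"), Some("project-a"));
    // A path before every root has no candidate project.
    assert_eq!(project_for(&projects, "/other"), None);
}
```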
    pub(crate) fn apply_changes(
@@ -294,13 +239,6 @@ impl Session {
    ///
    /// This iterator will only yield the default project database if it has been used.
    fn projects_mut(&mut self) -> impl Iterator<Item = &'_ mut ProjectDatabase> + '_ {
-        self.project_states_mut().map(|project| &mut project.db)
-    }
-
-    /// Returns a mutable iterator over all projects that have been initialized to this point.
-    ///
-    /// This iterator will only yield the default project if it has been used.
-    pub(crate) fn project_states_mut(&mut self) -> impl Iterator<Item = &'_ mut ProjectState> + '_ {
        let default_project = self.default_project.try_get_mut();
        self.projects.values_mut().chain(default_project)
    }
@@ -346,8 +284,10 @@ impl Session {
            ProjectDatabase::new(metadata, system.clone())
        });

-        let (root, db) = match project {
-            Ok(db) => (root, db),
+        match project {
+            Ok(project) => {
+                self.projects.insert(root, project);
+            }
            Err(err) => {
                tracing::error!(
                    "Failed to create project for `{root}`: {err:#}. Falling back to default settings"
@@ -362,29 +302,16 @@ impl Session {
                    .context("Failed to convert default options to metadata")
                    .and_then(|metadata| ProjectDatabase::new(metadata, system))
                    .expect("Default configuration to be valid");
-                let default_root = db_with_default_settings
-                    .project()
-                    .root(&db_with_default_settings)
-                    .to_path_buf();
-
-                (default_root, db_with_default_settings)
+                self.projects.insert(
+                    db_with_default_settings
+                        .project()
+                        .root(&db_with_default_settings)
+                        .to_path_buf(),
+                    db_with_default_settings,
+                );
            }
        };

-        // Carry forward diagnostic state if any exists
-        let previous = self.projects.remove(&root);
-        let untracked = previous
-            .map(|state| state.untracked_files_with_pushed_diagnostics)
-            .unwrap_or_default();
-        self.projects.insert(
-            root.clone(),
-            ProjectState {
-                db,
-                untracked_files_with_pushed_diagnostics: untracked,
-            },
-        );
-
-        publish_settings_diagnostics(self, client, root);
    }
}

        assert!(
@@ -405,7 +332,7 @@ impl Session {
    pub(crate) fn take_document_snapshot(&self, url: Url) -> DocumentSnapshot {
        let index = self.index();
        DocumentSnapshot {
-            resolved_client_capabilities: self.resolved_client_capabilities,
+            resolved_client_capabilities: self.resolved_client_capabilities.clone(),
            client_settings: index.global_settings(),
            position_encoding: self.position_encoding,
            document_query_result: self
@@ -418,12 +345,7 @@ impl Session {
    /// Creates a snapshot of the current state of the [`Session`].
    pub(crate) fn take_session_snapshot(&self) -> SessionSnapshot {
        SessionSnapshot {
-            projects: self
-                .projects
-                .values()
-                .map(|project| &project.db)
-                .cloned()
-                .collect(),
+            projects: self.projects.values().cloned().collect(),
            index: self.index.clone().unwrap(),
            position_encoding: self.position_encoding,
        }
@@ -510,17 +432,13 @@ impl Session {
        }
    }

-    pub(crate) fn client_capabilities(&self) -> ResolvedClientCapabilities {
-        self.resolved_client_capabilities
+    pub(crate) fn client_capabilities(&self) -> &ResolvedClientCapabilities {
+        &self.resolved_client_capabilities
    }

    pub(crate) fn global_settings(&self) -> Arc<ClientSettings> {
        self.index().global_settings()
    }

    pub(crate) fn position_encoding(&self) -> PositionEncoding {
        self.position_encoding
    }
}

/// A guard that holds the only reference to the index and allows modifying it.
@@ -565,7 +483,7 @@ impl Drop for MutIndexGuard<'_> {
/// An immutable snapshot of [`Session`] that references a specific document.
#[derive(Debug)]
pub(crate) struct DocumentSnapshot {
-    resolved_client_capabilities: ResolvedClientCapabilities,
+    resolved_client_capabilities: Arc<ResolvedClientCapabilities>,
    client_settings: Arc<ClientSettings>,
    position_encoding: PositionEncoding,
    document_query_result: Result<DocumentQuery, DocumentQueryError>,
@@ -573,8 +491,8 @@ pub(crate) struct DocumentSnapshot {

impl DocumentSnapshot {
    /// Returns the resolved client capabilities that were captured during initialization.
-    pub(crate) fn resolved_client_capabilities(&self) -> ResolvedClientCapabilities {
-        self.resolved_client_capabilities
+    pub(crate) fn resolved_client_capabilities(&self) -> &ResolvedClientCapabilities {
+        &self.resolved_client_capabilities
    }

    /// Returns the position encoding that was negotiated during initialization.
@@ -738,14 +656,14 @@ impl Workspace {
/// We really want that to be the actual project database and not our fallback database.
/// 2. The logs when the server starts can be confusing if it once shows it uses Python X (for the default db)
/// but then has another log that it uses Python Y (for the actual project db).
-struct DefaultProject(std::sync::OnceLock<ProjectState>);
+struct DefaultProject(std::sync::OnceLock<ProjectDatabase>);

impl DefaultProject {
    pub(crate) fn new() -> Self {
        DefaultProject(std::sync::OnceLock::new())
    }

-    pub(crate) fn get(&self, index: Option<&Arc<Index>>) -> &ProjectState {
+    pub(crate) fn get(&self, index: Option<&Arc<Index>>) -> &ProjectDatabase {
        self.0.get_or_init(|| {
            tracing::info!("Initialize default project");

@@ -756,14 +674,11 @@ impl DefaultProject {
                None,
            )
            .unwrap();
-            ProjectState {
-                db: ProjectDatabase::new(metadata, system).unwrap(),
-                untracked_files_with_pushed_diagnostics: Vec::new(),
-            }
+            ProjectDatabase::new(metadata, system).unwrap()
        })
    }

-    pub(crate) fn get_mut(&mut self, index: Option<&Arc<Index>>) -> &mut ProjectState {
+    pub(crate) fn get_mut(&mut self, index: Option<&Arc<Index>>) -> &mut ProjectDatabase {
        let _ = self.get(index);

        // SAFETY: The `OnceLock` is guaranteed to be initialized at this point because
@@ -771,7 +686,7 @@ impl DefaultProject {
        self.0.get_mut().unwrap()
    }

-    pub(crate) fn try_get_mut(&mut self) -> Option<&mut ProjectState> {
+    pub(crate) fn try_get_mut(&mut self) -> Option<&mut ProjectDatabase> {
        self.0.get_mut()
    }
}
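`DefaultProject` above lazily initializes its value inside `get` and has `get_mut` call `get` first so that the subsequent `OnceLock::get_mut().unwrap()` cannot fail. A stdlib sketch of that pattern (the `Lazy` name and `String` payload are illustrative stand-ins):

```rust
use std::sync::OnceLock;

/// Sketch of the `DefaultProject` shape: a lazily created fallback value,
/// with `get_mut` leaning on `get` to guarantee initialization first.
struct Lazy(OnceLock<String>);

impl Lazy {
    fn new() -> Self {
        Lazy(OnceLock::new())
    }

    fn get(&self) -> &String {
        // Runs the closure at most once, on first access.
        self.0.get_or_init(|| String::from("default"))
    }

    fn get_mut(&mut self) -> &mut String {
        let _ = self.get(); // ensure the cell is initialized
        // Guaranteed to be `Some` because of the call above.
        self.0.get_mut().unwrap()
    }
}

fn main() {
    let mut lazy = Lazy::new();
    assert_eq!(lazy.get(), "default");
    lazy.get_mut().push_str("-project");
    assert_eq!(lazy.get(), "default-project");
}
```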
@@ -1,121 +1,87 @@
 use lsp_types::{ClientCapabilities, MarkupKind};

-bitflags::bitflags! {
-    /// Represents the resolved client capabilities for the language server.
-    ///
-    /// This tracks various capabilities that the client supports.
-    #[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
-    pub(crate) struct ResolvedClientCapabilities: u32 {
-        const WORKSPACE_DIAGNOSTIC_REFRESH = 1 << 0;
-        const INLAY_HINT_REFRESH = 1 << 1;
-        const PULL_DIAGNOSTICS = 1 << 2;
-        const TYPE_DEFINITION_LINK_SUPPORT = 1 << 3;
-        const DEFINITION_LINK_SUPPORT = 1 << 4;
-        const DECLARATION_LINK_SUPPORT = 1 << 5;
-        const PREFER_MARKDOWN_IN_HOVER = 1 << 6;
-        const MULTILINE_SEMANTIC_TOKENS = 1 << 7;
-        const SIGNATURE_LABEL_OFFSET_SUPPORT = 1 << 8;
-        const SIGNATURE_ACTIVE_PARAMETER_SUPPORT = 1 << 9;
-    }
-}
+/// Represents the resolved client capabilities for the language server.
+#[derive(Debug, Clone, PartialEq, Eq, Default)]
+#[expect(clippy::struct_excessive_bools)]
+pub(crate) struct ResolvedClientCapabilities {
+    pub(crate) code_action_deferred_edit_resolution: bool,
+    pub(crate) apply_edit: bool,
+    pub(crate) document_changes: bool,
+    pub(crate) diagnostics_refresh: bool,
+    pub(crate) inlay_refresh: bool,
+
+    /// Whether [pull diagnostics] is supported.
+    ///
+    /// [pull diagnostics]: https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_pullDiagnostics
+    pub(crate) pull_diagnostics: bool,
+
+    /// Whether `textDocument.typeDefinition.linkSupport` is `true`
+    pub(crate) type_definition_link_support: bool,
+
+    /// `true`, if the first markup kind in `textDocument.hover.contentFormat` is `Markdown`
+    pub(crate) hover_prefer_markdown: bool,
+
+    /// Whether the client supports multiline semantic tokens
+    pub(crate) semantic_tokens_multiline_support: bool,
+
+    /// Whether the client supports signature label offsets in signature help
+    pub(crate) signature_label_offset_support: bool,
+
+    /// Whether the client supports per-signature active parameter in signature help
+    pub(crate) signature_active_parameter_support: bool,
+}
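The removed representation packs each capability into one bit of a `u32`. A minimal, dependency-free sketch of the same idea follows; the type and flag names here are illustrative stand-ins, and the real code generates all of this boilerplate with the `bitflags` crate instead:

```rust
// Hand-rolled bitset mirroring what `bitflags!` generates (sketch only).
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
struct Capabilities(u32);

impl Capabilities {
    const PULL_DIAGNOSTICS: u32 = 1 << 2;
    const PREFER_MARKDOWN_IN_HOVER: u32 = 1 << 6;

    // Set a flag bit.
    fn insert(&mut self, flag: u32) {
        self.0 |= flag;
    }

    // Check whether all bits of `flag` are set.
    const fn contains(self, flag: u32) -> bool {
        self.0 & flag == flag
    }
}

fn main() {
    let mut caps = Capabilities::default();
    caps.insert(Capabilities::PULL_DIAGNOSTICS);
    assert!(caps.contains(Capabilities::PULL_DIAGNOSTICS));
    assert!(!caps.contains(Capabilities::PREFER_MARKDOWN_IN_HOVER));
    println!("ok");
}
```

The trade-off visible in the diff: the bitset is `Copy` and compact, but every capability needs a `const` plus an accessor method, whereas the plain-`bool` struct exposes fields directly at the cost of a `struct_excessive_bools` lint expectation.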
 impl ResolvedClientCapabilities {
-    /// Returns `true` if the client supports workspace diagnostic refresh.
-    pub(crate) const fn supports_workspace_diagnostic_refresh(self) -> bool {
-        self.contains(Self::WORKSPACE_DIAGNOSTIC_REFRESH)
-    }
-
-    /// Returns `true` if the client supports inlay hint refresh.
-    pub(crate) const fn supports_inlay_hint_refresh(self) -> bool {
-        self.contains(Self::INLAY_HINT_REFRESH)
-    }
-
-    /// Returns `true` if the client supports pull diagnostics.
-    pub(crate) const fn supports_pull_diagnostics(self) -> bool {
-        self.contains(Self::PULL_DIAGNOSTICS)
-    }
-
-    /// Returns `true` if the client supports definition links in goto type definition.
-    pub(crate) const fn supports_type_definition_link(self) -> bool {
-        self.contains(Self::TYPE_DEFINITION_LINK_SUPPORT)
-    }
-
-    /// Returns `true` if the client supports definition links in goto definition.
-    pub(crate) const fn supports_definition_link(self) -> bool {
-        self.contains(Self::DEFINITION_LINK_SUPPORT)
-    }
-
-    /// Returns `true` if the client supports definition links in goto declaration.
-    pub(crate) const fn supports_declaration_link(self) -> bool {
-        self.contains(Self::DECLARATION_LINK_SUPPORT)
-    }
-
-    /// Returns `true` if the client prefers markdown in hover responses.
-    pub(crate) const fn prefers_markdown_in_hover(self) -> bool {
-        self.contains(Self::PREFER_MARKDOWN_IN_HOVER)
-    }
-
-    /// Returns `true` if the client supports multiline semantic tokens.
-    pub(crate) const fn supports_multiline_semantic_tokens(self) -> bool {
-        self.contains(Self::MULTILINE_SEMANTIC_TOKENS)
-    }
-
-    /// Returns `true` if the client supports signature label offsets in signature help.
-    pub(crate) const fn supports_signature_label_offset(self) -> bool {
-        self.contains(Self::SIGNATURE_LABEL_OFFSET_SUPPORT)
-    }
-
-    /// Returns `true` if the client supports per-signature active parameter in signature help.
-    pub(crate) const fn supports_signature_active_parameter(self) -> bool {
-        self.contains(Self::SIGNATURE_ACTIVE_PARAMETER_SUPPORT)
-    }
-
     pub(super) fn new(client_capabilities: &ClientCapabilities) -> Self {
-        let mut flags = Self::empty();
+        let code_action_settings = client_capabilities
+            .text_document
+            .as_ref()
+            .and_then(|doc_settings| doc_settings.code_action.as_ref());
+        let code_action_data_support = code_action_settings
+            .and_then(|code_action_settings| code_action_settings.data_support)
+            .unwrap_or_default();
+        let code_action_edit_resolution = code_action_settings
+            .and_then(|code_action_settings| code_action_settings.resolve_support.as_ref())
+            .is_some_and(|resolve_support| resolve_support.properties.contains(&"edit".into()));

-        let workspace = client_capabilities.workspace.as_ref();
-        let text_document = client_capabilities.text_document.as_ref();
+        let apply_edit = client_capabilities
+            .workspace
+            .as_ref()
+            .and_then(|workspace| workspace.apply_edit)
+            .unwrap_or_default();

-        if workspace
-            .and_then(|workspace| workspace.diagnostics.as_ref()?.refresh_support)
-            .unwrap_or_default()
-        {
-            flags |= Self::WORKSPACE_DIAGNOSTIC_REFRESH;
-        }
+        let document_changes = client_capabilities
+            .workspace
+            .as_ref()
+            .and_then(|workspace| workspace.workspace_edit.as_ref()?.document_changes)
+            .unwrap_or_default();
+
+        let declaration_link_support = client_capabilities
+            .text_document
+            .as_ref()
+            .and_then(|document| document.type_definition?.link_support)
+            .unwrap_or_default();
+
+        let diagnostics_refresh = client_capabilities
+            .workspace
+            .as_ref()
+            .and_then(|workspace| workspace.diagnostics.as_ref()?.refresh_support)
+            .unwrap_or_default();

-        if workspace
-            .and_then(|workspace| workspace.inlay_hint.as_ref()?.refresh_support)
-            .unwrap_or_default()
-        {
-            flags |= Self::INLAY_HINT_REFRESH;
-        }
+        let inlay_refresh = client_capabilities
+            .workspace
+            .as_ref()
+            .and_then(|workspace| workspace.inlay_hint.as_ref()?.refresh_support)
+            .unwrap_or_default();

-        if text_document.is_some_and(|text_document| text_document.diagnostic.is_some()) {
-            flags |= Self::PULL_DIAGNOSTICS;
-        }
+        let pull_diagnostics = client_capabilities
+            .text_document
+            .as_ref()
+            .and_then(|text_document| text_document.diagnostic.as_ref())
+            .is_some();

-        if text_document
-            .and_then(|text_document| text_document.type_definition?.link_support)
-            .unwrap_or_default()
-        {
-            flags |= Self::TYPE_DEFINITION_LINK_SUPPORT;
-        }
-
-        if text_document
-            .and_then(|text_document| text_document.definition?.link_support)
-            .unwrap_or_default()
-        {
-            flags |= Self::DEFINITION_LINK_SUPPORT;
-        }
-
-        if text_document
-            .and_then(|text_document| text_document.declaration?.link_support)
-            .unwrap_or_default()
-        {
-            flags |= Self::DECLARATION_LINK_SUPPORT;
-        }
-
-        if text_document
+        let hover_prefer_markdown = client_capabilities
+            .text_document
+            .as_ref()
             .and_then(|text_document| {
                 Some(
                     text_document
@@ -126,24 +92,18 @@ impl ResolvedClientCapabilities {
                         .contains(&MarkupKind::Markdown),
                 )
             })
-            .unwrap_or_default()
-        {
-            flags |= Self::PREFER_MARKDOWN_IN_HOVER;
-        }
+            .unwrap_or_default();

-        if text_document
-            .and_then(|text_document| {
-                text_document
-                    .semantic_tokens
-                    .as_ref()?
-                    .multiline_token_support
-            })
-            .unwrap_or_default()
-        {
-            flags |= Self::MULTILINE_SEMANTIC_TOKENS;
-        }
+        let semantic_tokens_multiline_support = client_capabilities
+            .text_document
+            .as_ref()
+            .and_then(|doc| doc.semantic_tokens.as_ref())
+            .and_then(|semantic_tokens| semantic_tokens.multiline_token_support)
+            .unwrap_or(false);

-        if text_document
+        let signature_label_offset_support = client_capabilities
+            .text_document
+            .as_ref()
             .and_then(|text_document| {
                 text_document
                     .signature_help
@@ -154,12 +114,11 @@ impl ResolvedClientCapabilities {
                     .as_ref()?
                     .label_offset_support
             })
-            .unwrap_or_default()
-        {
-            flags |= Self::SIGNATURE_LABEL_OFFSET_SUPPORT;
-        }
+            .unwrap_or_default();

-        if text_document
+        let signature_active_parameter_support = client_capabilities
+            .text_document
+            .as_ref()
             .and_then(|text_document| {
                 text_document
                     .signature_help
@@ -168,11 +127,21 @@ impl ResolvedClientCapabilities {
                     .as_ref()?
                     .active_parameter_support
             })
-            .unwrap_or_default()
-        {
-            flags |= Self::SIGNATURE_ACTIVE_PARAMETER_SUPPORT;
-        }
+            .unwrap_or_default();

-        flags
+        Self {
+            code_action_deferred_edit_resolution: code_action_data_support
+                && code_action_edit_resolution,
+            apply_edit,
+            document_changes,
+            diagnostics_refresh,
+            inlay_refresh,
+            pull_diagnostics,
+            type_definition_link_support: declaration_link_support,
+            hover_prefer_markdown,
+            semantic_tokens_multiline_support,
+            signature_label_offset_support,
+            signature_active_parameter_support,
+        }
     }
 }
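Both sides of the `new` constructor above resolve deeply nested, all-optional capability fields with `and_then` chains and the `?` operator inside closures. A self-contained sketch of that style, using simplified stand-in structs rather than the real `lsp_types` types:

```rust
// Simplified stand-ins for nested, all-optional LSP capability structs.
#[derive(Default)]
struct DiagnosticCapability {
    refresh_support: Option<bool>,
}

#[derive(Default)]
struct WorkspaceCapabilities {
    diagnostics: Option<DiagnosticCapability>,
}

#[derive(Default)]
struct ClientCapabilities {
    workspace: Option<WorkspaceCapabilities>,
}

fn diagnostics_refresh(caps: &ClientCapabilities) -> bool {
    caps.workspace
        .as_ref()
        // `?` inside the closure short-circuits to `None` when any level is absent.
        .and_then(|workspace| workspace.diagnostics.as_ref()?.refresh_support)
        // A missing capability defaults to `false`.
        .unwrap_or_default()
}

fn main() {
    let absent = ClientCapabilities::default();
    assert!(!diagnostics_refresh(&absent));

    let present = ClientCapabilities {
        workspace: Some(WorkspaceCapabilities {
            diagnostics: Some(DiagnosticCapability {
                refresh_support: Some(true),
            }),
        }),
    };
    assert!(diagnostics_refresh(&present));
    println!("ok");
}
```

The same shape covers every field in the diff: the only variations are which level carries the terminal `Option` and whether the final step is `unwrap_or_default()`, `unwrap_or(false)`, or `is_some()`.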
@@ -80,7 +80,7 @@ You can add the following configuration to `.gitlab-ci.yml` to run a `ruff forma
   stage: build
   interruptible: true
   image:
-    name: ghcr.io/astral-sh/ruff:0.12.4-alpine
+    name: ghcr.io/astral-sh/ruff:0.12.3-alpine
   before_script:
     - cd $CI_PROJECT_DIR
     - ruff --version
@@ -106,7 +106,7 @@ Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-c
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.12.4
+  rev: v0.12.3
   hooks:
     # Run the linter.
     - id: ruff
@@ -119,7 +119,7 @@ To enable lint fixes, add the `--fix` argument to the lint hook:
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.12.4
+  rev: v0.12.3
   hooks:
     # Run the linter.
     - id: ruff
@@ -133,7 +133,7 @@ To avoid running on Jupyter Notebooks, remove `jupyter` from the list of allowed
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.12.4
+  rev: v0.12.3
   hooks:
     # Run the linter.
     - id: ruff
@@ -369,7 +369,7 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.12.4
+  rev: v0.12.3
   hooks:
     # Run the linter.
     - id: ruff
@@ -4,7 +4,7 @@ build-backend = "maturin"

 [project]
 name = "ruff"
-version = "0.12.4"
+version = "0.12.3"
 description = "An extremely fast Python linter and code formatter, written in Rust."
 authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
 readme = "README.md"
@@ -1,6 +1,6 @@
 [project]
 name = "scripts"
-version = "0.12.4"
+version = "0.12.3"
 description = ""
 authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]