Compare commits


1 Commit

Author: Micha Reiser
SHA1: 82f33db5e6
Message: Inline NodeKey construction and avoid AnyNodeRef
Date: 2024-08-21 19:04:02 +02:00
178 changed files with 2591 additions and 5498 deletions

View File

@@ -20,7 +20,7 @@
"extensions": [
"ms-python.python",
"rust-lang.rust-analyzer",
"fill-labs.dependi",
"serayuzgur.crates",
"tamasfe.even-better-toml",
"Swellaby.vscode-rust-test-adapter",
"charliermarsh.ruff"

View File

@@ -37,7 +37,7 @@ jobs:
with:
fetch-depth: 0
- uses: tj-actions/changed-files@v45
- uses: tj-actions/changed-files@v44
id: changed
with:
files_yaml: |

View File

@@ -45,7 +45,7 @@ repos:
)$
- repo: https://github.com/crate-ci/typos
rev: v1.24.1
rev: v1.23.6
hooks:
- id: typos
@@ -59,7 +59,7 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.6.2
rev: v0.6.1
hooks:
- id: ruff-format
- id: ruff

View File

@@ -1,61 +1,5 @@
# Changelog
## 0.6.3
### Preview features
- \[`flake8-simplify`\] Extend `open-file-with-context-handler` to work with `dbm.sqlite3` (`SIM115`) ([#13104](https://github.com/astral-sh/ruff/pull/13104))
- \[`pycodestyle`\] Disable `E741` in stub files (`.pyi`) ([#13119](https://github.com/astral-sh/ruff/pull/13119))
- \[`pydoclint`\] Avoid `DOC201` on explicit returns in functions that only return `None` ([#13064](https://github.com/astral-sh/ruff/pull/13064))
### Rule changes
- \[`flake8-async`\] Disable check for `asyncio` before Python 3.11 (`ASYNC109`) ([#13023](https://github.com/astral-sh/ruff/pull/13023))
### Bug fixes
- \[`FastAPI`\] Avoid introducing invalid syntax in fix for `fast-api-non-annotated-dependency` (`FAST002`) ([#13133](https://github.com/astral-sh/ruff/pull/13133))
- \[`flake8-implicit-str-concat`\] Normalize octals before merging concatenated strings in `single-line-implicit-string-concatenation` (`ISC001`) ([#13118](https://github.com/astral-sh/ruff/pull/13118))
- \[`flake8-pytest-style`\] Improve help message for `pytest-incorrect-mark-parentheses-style` (`PT023`) ([#13092](https://github.com/astral-sh/ruff/pull/13092))
- \[`pylint`\] Avoid autofix for calls that aren't `min` or `max` as starred expression (`PLW3301`) ([#13089](https://github.com/astral-sh/ruff/pull/13089))
- \[`ruff`\] Add `datetime.time`, `datetime.tzinfo`, and `datetime.timezone` as immutable function calls (`RUF009`) ([#13109](https://github.com/astral-sh/ruff/pull/13109))
- \[`ruff`\] Extend comment deletion for `RUF100` to include trailing text from `noqa` directives while preserving any following comments on the same line ([#13105](https://github.com/astral-sh/ruff/pull/13105))
- Fix dark theme on initial page load for the Ruff playground ([#13077](https://github.com/astral-sh/ruff/pull/13077))
## 0.6.2
### Preview features
- \[`flake8-simplify`\] Extend `open-file-with-context-handler` to work with other standard-library IO modules (`SIM115`) ([#12959](https://github.com/astral-sh/ruff/pull/12959))
- \[`ruff`\] Avoid `unused-async` for functions with FastAPI route decorator (`RUF029`) ([#12938](https://github.com/astral-sh/ruff/pull/12938))
- \[`ruff`\] Ignore `fstring-missing-syntax` (`RUF027`) for FastAPI paths ([#12939](https://github.com/astral-sh/ruff/pull/12939))
- \[`ruff`\] Implement check for `Decimal` called with a float literal (`RUF032`) ([#12909](https://github.com/astral-sh/ruff/pull/12909))
### Rule changes
- \[`flake8-bugbear`\] Update diagnostic message when expression is at the end of function (`B015`) ([#12944](https://github.com/astral-sh/ruff/pull/12944))
- \[`flake8-pyi`\] Skip type annotations in `string-or-bytes-too-long` (`PYI053`) ([#13002](https://github.com/astral-sh/ruff/pull/13002))
- \[`flake8-type-checking`\] Always recognise relative imports as first-party ([#12994](https://github.com/astral-sh/ruff/pull/12994))
- \[`flake8-unused-arguments`\] Ignore unused arguments on stub functions (`ARG001`) ([#12966](https://github.com/astral-sh/ruff/pull/12966))
- \[`pylint`\] Ignore augmented assignment for `self-cls-assignment` (`PLW0642`) ([#12957](https://github.com/astral-sh/ruff/pull/12957))
### Server
- Show full context in error log messages ([#13029](https://github.com/astral-sh/ruff/pull/13029))
### Bug fixes
- \[`pep8-naming`\] Don't flag `from` imports following conventional import names (`N817`) ([#12946](https://github.com/astral-sh/ruff/pull/12946))
- \[`pylint`\] Allow `__new__` methods to have `cls` as their first argument even if decorated with `@staticmethod` for `bad-staticmethod-argument` (`PLW0211`) ([#12958](https://github.com/astral-sh/ruff/pull/12958))
### Documentation
- Add `hyperfine` installation instructions; update `hyperfine` code samples ([#13034](https://github.com/astral-sh/ruff/pull/13034))
- Expand note to use Ruff with other language server in Kate ([#12806](https://github.com/astral-sh/ruff/pull/12806))
- Update example for `PT001` as per the new default behavior ([#13019](https://github.com/astral-sh/ruff/pull/13019))
- \[`perflint`\] Improve docs for `try-except-in-loop` (`PERF203`) ([#12947](https://github.com/astral-sh/ruff/pull/12947))
- \[`pydocstyle`\] Add reference to `lint.pydocstyle.ignore-decorators` setting to rule docs ([#12996](https://github.com/astral-sh/ruff/pull/12996))
## 0.6.1
This is a hotfix release to address an issue with `ruff-pre-commit`. In v0.6,

View File

@@ -397,18 +397,12 @@ which makes it a good target for benchmarking.
git clone --branch 3.10 https://github.com/python/cpython.git crates/ruff_linter/resources/test/cpython
```
Install `hyperfine`:
```shell
cargo install hyperfine
```
To benchmark the release build:
```shell
cargo build --release && hyperfine --warmup 10 \
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ --no-cache -e" \
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ -e"
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache -e" \
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ -e"
Benchmark 1: ./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache
Time (mean ± σ): 293.8 ms ± 3.2 ms [User: 2384.6 ms, System: 90.3 ms]
@@ -427,7 +421,7 @@ To benchmark against the ecosystem's existing tools:
```shell
hyperfine --ignore-failure --warmup 5 \
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ --no-cache" \
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache" \
"pyflakes crates/ruff_linter/resources/test/cpython" \
"autoflake --recursive --expand-star-imports --remove-all-unused-imports --remove-unused-variables --remove-duplicate-keys resources/test/cpython" \
"pycodestyle crates/ruff_linter/resources/test/cpython" \
@@ -473,7 +467,7 @@ To benchmark a subset of rules, e.g. `LineTooLong` and `DocLineTooLong`:
```shell
cargo build --release && hyperfine --warmup 10 \
"./target/release/ruff check ./crates/ruff_linter/resources/test/cpython/ --no-cache -e --select W505,E501"
"./target/release/ruff ./crates/ruff_linter/resources/test/cpython/ --no-cache -e --select W505,E501"
```
You can run `poetry install` from `./scripts/benchmarks` to create a working environment for the
@@ -530,8 +524,6 @@ You can run the benchmarks with
cargo benchmark
```
`cargo benchmark` is an alias for `cargo bench -p ruff_benchmark --bench linter --bench formatter --`
#### Benchmark-driven Development
Ruff uses [Criterion.rs](https://bheisler.github.io/criterion.rs/book/) for benchmarks. You can use
@@ -570,7 +562,7 @@ cargo install critcmp
#### Tips
- Use `cargo bench -p ruff_benchmark <filter>` to only run specific benchmarks. For example: `cargo bench -p ruff_benchmark lexer`
- Use `cargo bench -p ruff_benchmark <filter>` to only run specific benchmarks. For example: `cargo benchmark lexer`
to only run the lexer benchmarks.
- Use `cargo bench -p ruff_benchmark -- --quiet` for cleaner output (without the statistical details)
- Use `cargo bench -p ruff_benchmark -- --quick` to get faster results (more prone to noise)
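
As noted above, Ruff's benchmarks are written with [Criterion.rs](https://bheisler.github.io/criterion.rs/book/). For reference, below is a minimal, self-contained sketch of what a Criterion benchmark looks like; `parse_source` is a hypothetical stand-in for the code under measurement, not one of Ruff's actual benchmark targets:
```rust
use criterion::{black_box, criterion_group, criterion_main, Criterion};

// Hypothetical stand-in for the work being measured.
fn parse_source(source: &str) -> usize {
    source.lines().count()
}

fn lexer_benchmark(c: &mut Criterion) {
    let source = "import os\nprint(os.name)\n";
    c.bench_function("lexer", |b| {
        // `iter` runs the closure repeatedly and records the timings;
        // `black_box` keeps the compiler from optimizing the input away.
        b.iter(|| parse_source(black_box(source)));
    });
}

criterion_group!(benches, lexer_benchmark);
criterion_main!(benches);
```
A file like this, registered as a `[[bench]]` target with `harness = false`, is what `cargo bench -p ruff_benchmark <filter>` ultimately runs.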

Cargo.lock generated
View File

@@ -1256,9 +1256,9 @@ checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646"
[[package]]
name = "libc"
version = "0.2.158"
version = "0.2.157"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d8adc4bb1803a324070e64a98ae98f38934d91957a99cfb3a43dcbc01bc56439"
checksum = "374af5f94e54fa97cf75e945cce8a6b201e88a1a07e688b47dfd2a59c66dbd86"
[[package]]
name = "libcst"
@@ -1827,9 +1827,9 @@ dependencies = [
[[package]]
name = "quote"
version = "1.0.37"
version = "1.0.36"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b5b9d34b8991d19d98081b46eacdd8eb58c6f2b201139f7c5f643cc155a633af"
checksum = "0fa76aaf39101c457836aec0ce2316dbdc3ab723cdda1c6bd4e6ad4208acaca7"
dependencies = [
"proc-macro2",
]
@@ -1926,7 +1926,6 @@ dependencies = [
"ruff_db",
"ruff_index",
"ruff_python_ast",
"ruff_python_literal",
"ruff_python_parser",
"ruff_python_stdlib",
"ruff_source_file",
@@ -1936,7 +1935,6 @@ dependencies = [
"smallvec",
"static_assertions",
"tempfile",
"thiserror",
"tracing",
"walkdir",
"zip",
@@ -1952,6 +1950,7 @@ dependencies = [
"libc",
"lsp-server",
"lsp-types",
"red_knot_python_semantic",
"red_knot_workspace",
"ruff_db",
"ruff_notebook",
@@ -1989,7 +1988,6 @@ dependencies = [
"anyhow",
"crossbeam",
"notify",
"rayon",
"red_knot_python_semantic",
"ruff_cache",
"ruff_db",
@@ -1997,6 +1995,7 @@ dependencies = [
"ruff_text_size",
"rustc-hash 2.0.0",
"salsa",
"thiserror",
"tracing",
]
@@ -2090,7 +2089,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.6.3"
version = "0.6.1"
dependencies = [
"anyhow",
"argfile",
@@ -2148,7 +2147,6 @@ dependencies = [
"criterion",
"mimalloc",
"once_cell",
"rayon",
"red_knot_python_semantic",
"red_knot_workspace",
"ruff_db",
@@ -2283,7 +2281,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.6.3"
version = "0.6.1"
dependencies = [
"aho-corasick",
"annotate-snippets 0.9.2",
@@ -2603,7 +2601,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.6.3"
version = "0.6.1"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -2830,9 +2828,9 @@ checksum = "1c107b6f4780854c8b126e228ea8869f4d7b71260f962fefb57b996b8959ba6b"
[[package]]
name = "serde"
version = "1.0.209"
version = "1.0.208"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "99fce0ffe7310761ca6bf9faf5115afbc19688edd00171d81b1bb1b116c63e09"
checksum = "cff085d2cb684faa248efb494c39b68e522822ac0de72ccf08109abde717cfb2"
dependencies = [
"serde_derive",
]
@@ -2850,9 +2848,9 @@ dependencies = [
[[package]]
name = "serde_derive"
version = "1.0.209"
version = "1.0.208"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a5831b979fd7b5439637af1752d535ff49f4860c0f341d1baeb6faf0f4242170"
checksum = "24008e81ff7613ed8e5ba0cfaf24e2c2f1e5b8a0495711e44fcd4882fca62bcf"
dependencies = [
"proc-macro2",
"quote",
@@ -2872,9 +2870,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.127"
version = "1.0.125"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8043c06d9f82bd7271361ed64f415fe5e12a77fdb52e573e7f06a516dea329ad"
checksum = "83c8e735a073ccf5be70aa8066aa984eaf2fa000db6c8d0100ae605b366d31ed"
dependencies = [
"itoa",
"memchr",
@@ -3033,9 +3031,9 @@ checksum = "81cdd64d312baedb58e21336b31bc043b77e01cc99033ce76ef539f78e965ebc"
[[package]]
name = "syn"
version = "2.0.76"
version = "2.0.75"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "578e081a14e0cefc3279b0472138c513f37b41a08d5a3cca9b6e4e8ceb6cd525"
checksum = "f6af063034fc1935ede7be0122941bafa9bacb949334d090b77ca98b5817c7d9"
dependencies = [
"proc-macro2",
"quote",

View File

@@ -110,7 +110,7 @@ For more, see the [documentation](https://docs.astral.sh/ruff/).
1. [Who's Using Ruff?](#whos-using-ruff)
1. [License](#license)
## Getting Started<a id="getting-started"></a>
## Getting Started
For more, see the [documentation](https://docs.astral.sh/ruff/).
@@ -136,8 +136,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.6.3/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.6.3/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.6.1/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.6.1/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -170,7 +170,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.6.3
rev: v0.6.1
hooks:
# Run the linter.
- id: ruff
@@ -195,7 +195,7 @@ jobs:
- uses: chartboost/ruff-action@v1
```
### Configuration<a id="configuration"></a>
### Configuration
Ruff can be configured through a `pyproject.toml`, `ruff.toml`, or `.ruff.toml` file (see:
[_Configuration_](https://docs.astral.sh/ruff/configuration/), or [_Settings_](https://docs.astral.sh/ruff/settings/)
@@ -291,7 +291,7 @@ features that may change prior to stabilization.
See `ruff help` for more on Ruff's top-level commands, or `ruff help check` and `ruff help format`
for more on the linting and formatting commands, respectively.
## Rules<a id="rules"></a>
## Rules
<!-- Begin section: Rules -->
@@ -367,21 +367,21 @@ quality tools, including:
For a complete enumeration of the supported rules, see [_Rules_](https://docs.astral.sh/ruff/rules/).
## Contributing<a id="contributing"></a>
## Contributing
Contributions are welcome and highly appreciated. To get started, check out the
[**contributing guidelines**](https://docs.astral.sh/ruff/contributing/).
You can also join us on [**Discord**](https://discord.com/invite/astral-sh).
## Support<a id="support"></a>
## Support
Having trouble? Check out the existing issues on [**GitHub**](https://github.com/astral-sh/ruff/issues),
or feel free to [**open a new one**](https://github.com/astral-sh/ruff/issues/new).
You can also ask for help on [**Discord**](https://discord.com/invite/astral-sh).
## Acknowledgements<a id="acknowledgements"></a>
## Acknowledgements
Ruff's linter draws on both the APIs and implementation details of many other
tools in the Python ecosystem, especially [Flake8](https://github.com/PyCQA/flake8), [Pyflakes](https://github.com/PyCQA/pyflakes),
@@ -405,7 +405,7 @@ Ruff is the beneficiary of a large number of [contributors](https://github.com/a
Ruff is released under the MIT license.
## Who's Using Ruff?<a id="whos-using-ruff"></a>
## Who's Using Ruff?
Ruff is used by a number of major open-source projects and companies, including:
@@ -524,7 +524,7 @@ If you're using Ruff, consider adding the Ruff badge to your project's `README.m
<a href="https://github.com/astral-sh/ruff"><img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json" alt="Ruff" style="max-width:100%;"></a>
```
## License<a id="license"></a>
## License
This repository is licensed under the [MIT License](https://github.com/astral-sh/ruff/blob/main/LICENSE)

View File

@@ -13,17 +13,12 @@ The CLI supports different verbosity levels.
- `-vv` activates `debug!` and timestamps: This should be enough information to get to the bottom of bug reports. When you're processing many packages or files, you'll get pages and pages of output, but each line is linked to a specific action or state change.
- `-vvv` activates `trace!` (only in debug builds) and shows tracing spans: At this level, you're logging everything. Most of this output is noise and it's really slow; we dump e.g. the entire resolution graph. Only useful to developers, and you almost certainly want to use `RED_KNOT_LOG` to filter it down to the area you're investigating.
## Better logging with `RED_KNOT_LOG` and `RAYON_NUM_THREADS`
## `RED_KNOT_LOG`
By default, the CLI shows messages from the `ruff` and `red_knot` crates. Tracing messages from other crates are not shown.
The `RED_KNOT_LOG` environment variable allows you to customize which messages are shown by specifying one
or more [filter directives](https://docs.rs/tracing-subscriber/latest/tracing_subscriber/filter/struct.EnvFilter.html#directives).
The `RAYON_NUM_THREADS` environment variable, meanwhile, can be used to control the level of concurrency red-knot uses.
By default, red-knot will attempt to parallelize its work so that multiple files are checked simultaneously,
but this can result in confusing log output where messages from different threads are interleaved.
To switch off concurrency entirely and get more readable logs, use `RAYON_NUM_THREADS=1`.
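To illustrate how such a variable is typically consumed, here is a minimal sketch using `tracing-subscriber`'s `EnvFilter`; it shows the general mechanism, not red-knot's actual initialization code:
```rust
use tracing_subscriber::EnvFilter;

fn main() {
    // Read filter directives from RED_KNOT_LOG if it is set; otherwise
    // fall back to a default that mirrors the behavior described above
    // (only messages from the `ruff` and `red_knot` crates).
    let filter = EnvFilter::try_from_env("RED_KNOT_LOG")
        .unwrap_or_else(|_| EnvFilter::new("ruff=info,red_knot=info"));

    tracing_subscriber::fmt().with_env_filter(filter).init();

    tracing::info!("logging initialized");
}
```
Running the program with, for example, `RED_KNOT_LOG=red_knot=trace` would then enable trace-level messages for the `red_knot` crate only.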
### Examples
#### Show all debug messages

View File

@@ -7,12 +7,12 @@ use colored::Colorize;
use crossbeam::channel as crossbeam_channel;
use salsa::plumbing::ZalsaDatabase;
use red_knot_python_semantic::SitePackages;
use red_knot_python_semantic::{ProgramSettings, SearchPathSettings};
use red_knot_server::run_server;
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::site_packages::VirtualEnvironment;
use red_knot_workspace::watch;
use red_knot_workspace::watch::WorkspaceWatcher;
use red_knot_workspace::workspace::settings::Configuration;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::system::{OsSystem, System, SystemPath, SystemPathBuf};
use target_version::TargetVersion;
@@ -65,14 +65,15 @@ to resolve type information for the project's third-party dependencies.",
value_name = "PATH",
help = "Additional path to use as a module-resolution source (can be passed multiple times)"
)]
extra_search_path: Option<Vec<SystemPathBuf>>,
extra_search_path: Vec<SystemPathBuf>,
#[arg(
long,
help = "Python version to assume when resolving types",
value_name = "VERSION"
)]
target_version: Option<TargetVersion>,
default_value_t = TargetVersion::default(),
value_name="VERSION")
]
target_version: TargetVersion,
#[clap(flatten)]
verbosity: Verbosity,
@@ -85,36 +86,6 @@ to resolve type information for the project's third-party dependencies.",
watch: bool,
}
impl Args {
fn to_configuration(&self, cli_cwd: &SystemPath) -> Configuration {
let mut configuration = Configuration::default();
if let Some(target_version) = self.target_version {
configuration.target_version = Some(target_version.into());
}
if let Some(venv_path) = &self.venv_path {
configuration.search_paths.site_packages = Some(SitePackages::Derived {
venv_path: SystemPath::absolute(venv_path, cli_cwd),
});
}
if let Some(custom_typeshed_dir) = &self.custom_typeshed_dir {
configuration.search_paths.custom_typeshed =
Some(SystemPath::absolute(custom_typeshed_dir, cli_cwd));
}
if let Some(extra_search_paths) = &self.extra_search_path {
configuration.search_paths.extra_paths = extra_search_paths
.iter()
.map(|path| Some(SystemPath::absolute(path, cli_cwd)))
.collect();
}
configuration
}
}
#[derive(Debug, clap::Subcommand)]
pub enum Command {
/// Start the language server
@@ -144,13 +115,22 @@ pub fn main() -> ExitStatus {
}
fn run() -> anyhow::Result<ExitStatus> {
let args = Args::parse_from(std::env::args().collect::<Vec<_>>());
let Args {
command,
current_directory,
custom_typeshed_dir,
extra_search_path: extra_paths,
venv_path,
target_version,
verbosity,
watch,
} = Args::parse_from(std::env::args().collect::<Vec<_>>());
if matches!(args.command, Some(Command::Server)) {
if matches!(command, Some(Command::Server)) {
return run_server().map(|()| ExitStatus::Success);
}
let verbosity = args.verbosity.level();
let verbosity = verbosity.level();
countme::enable(verbosity.is_trace());
let _guard = setup_tracing(verbosity)?;
@@ -166,12 +146,10 @@ fn run() -> anyhow::Result<ExitStatus> {
})?
};
let cwd = args
.current_directory
.as_ref()
let cwd = current_directory
.map(|cwd| {
if cwd.as_std_path().is_dir() {
Ok(SystemPath::absolute(cwd, &cli_base_path))
Ok(SystemPath::absolute(&cwd, &cli_base_path))
} else {
Err(anyhow!(
"Provided current-directory path '{cwd}' is not a directory."
@@ -182,18 +160,33 @@ fn run() -> anyhow::Result<ExitStatus> {
.unwrap_or_else(|| cli_base_path.clone());
let system = OsSystem::new(cwd.clone());
let cli_configuration = args.to_configuration(&cwd);
let workspace_metadata = WorkspaceMetadata::from_path(
system.current_directory(),
&system,
Some(cli_configuration.clone()),
)?;
let workspace_metadata = WorkspaceMetadata::from_path(system.current_directory(), &system)?;
// TODO: Verify the remaining search path settings eagerly.
let site_packages = venv_path
.map(|path| {
VirtualEnvironment::new(path, &OsSystem::new(cli_base_path))
.and_then(|venv| venv.site_packages_directories(&system))
})
.transpose()?
.unwrap_or_default();
// TODO: Respect the settings from the workspace metadata when resolving the program settings.
let program_settings = ProgramSettings {
target_version: target_version.into(),
search_paths: SearchPathSettings {
extra_paths,
src_root: workspace_metadata.root().to_path_buf(),
custom_typeshed: custom_typeshed_dir,
site_packages,
},
};
// TODO: Use the `program_settings` to compute the key for the database's persistent
// cache and load the cache if it exists.
let mut db = RootDatabase::new(workspace_metadata, system)?;
let mut db = RootDatabase::new(workspace_metadata, program_settings, system)?;
let (main_loop, main_loop_cancellation_token) = MainLoop::new(cli_configuration);
let (main_loop, main_loop_cancellation_token) = MainLoop::new();
// Listen to Ctrl+C and abort the watch mode.
let main_loop_cancellation_token = Mutex::new(Some(main_loop_cancellation_token));
@@ -205,7 +198,7 @@ fn run() -> anyhow::Result<ExitStatus> {
}
})?;
let exit_status = if args.watch {
let exit_status = if watch {
main_loop.watch(&mut db)?
} else {
main_loop.run(&mut db)
@@ -245,12 +238,10 @@ struct MainLoop {
/// The file system watcher, if running in watch mode.
watcher: Option<WorkspaceWatcher>,
cli_configuration: Configuration,
}
impl MainLoop {
fn new(cli_configuration: Configuration) -> (Self, MainLoopCancellationToken) {
fn new() -> (Self, MainLoopCancellationToken) {
let (sender, receiver) = crossbeam_channel::bounded(10);
(
@@ -258,7 +249,6 @@ impl MainLoop {
sender: sender.clone(),
receiver,
watcher: None,
cli_configuration,
},
MainLoopCancellationToken { sender },
)
@@ -341,7 +331,7 @@ impl MainLoop {
MainLoopMessage::ApplyChanges(changes) => {
revision += 1;
// Automatically cancels any pending queries and waits for them to complete.
db.apply_changes(changes, Some(&self.cli_configuration));
db.apply_changes(changes);
if let Some(watcher) = self.watcher.as_mut() {
watcher.update(db);
}

View File

@@ -5,11 +5,12 @@ use std::time::Duration;
use anyhow::{anyhow, Context};
use red_knot_python_semantic::{resolve_module, ModuleName, Program, PythonVersion, SitePackages};
use red_knot_python_semantic::{
resolve_module, ModuleName, Program, ProgramSettings, PythonVersion, SearchPathSettings,
};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::watch;
use red_knot_workspace::watch::{directory_watcher, WorkspaceWatcher};
use red_knot_workspace::workspace::settings::{Configuration, SearchPathConfiguration};
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File, FileError};
use ruff_db::source::source_text;
@@ -24,7 +25,7 @@ struct TestCase {
/// We need to hold on to it in the test case or the temp files get deleted.
_temp_dir: tempfile::TempDir,
root_dir: SystemPathBuf,
configuration: Configuration,
search_path_settings: SearchPathSettings,
}
impl TestCase {
@@ -40,6 +41,10 @@ impl TestCase {
&self.db
}
fn db_mut(&mut self) -> &mut RootDatabase {
&mut self.db
}
fn stop_watch(&mut self) -> Vec<watch::ChangeEvent> {
self.try_stop_watch(Duration::from_secs(10))
.expect("Expected watch changes but observed none.")
@@ -100,20 +105,16 @@ impl TestCase {
Some(all_events)
}
fn apply_changes(&mut self, changes: Vec<watch::ChangeEvent>) {
self.db.apply_changes(changes, Some(&self.configuration));
}
fn update_search_path_settings(
&mut self,
configuration: SearchPathConfiguration,
f: impl FnOnce(&SearchPathSettings) -> SearchPathSettings,
) -> anyhow::Result<()> {
let program = Program::get(self.db());
self.configuration.search_paths = configuration.clone();
let new_settings = configuration.into_settings(self.db.workspace().root(&self.db));
let new_settings = f(&self.search_path_settings);
program.update_search_paths(&mut self.db, &new_settings)?;
program.update_search_paths(&mut self.db, new_settings.clone())?;
self.search_path_settings = new_settings;
if let Some(watcher) = &mut self.watcher {
watcher.update(&self.db);
@@ -178,14 +179,17 @@ fn setup<F>(setup_files: F) -> anyhow::Result<TestCase>
where
F: SetupFiles,
{
setup_with_search_paths(setup_files, |_root, _workspace_path| {
SearchPathConfiguration::default()
setup_with_search_paths(setup_files, |_root, workspace_path| SearchPathSettings {
extra_paths: vec![],
src_root: workspace_path.to_path_buf(),
custom_typeshed: None,
site_packages: vec![],
})
}
fn setup_with_search_paths<F>(
setup_files: F,
create_search_paths: impl FnOnce(&SystemPath, &SystemPath) -> SearchPathConfiguration,
create_search_paths: impl FnOnce(&SystemPath, &SystemPath) -> SearchPathSettings,
) -> anyhow::Result<TestCase>
where
F: SetupFiles,
@@ -217,34 +221,25 @@ where
let system = OsSystem::new(&workspace_path);
let search_paths = create_search_paths(&root_path, &workspace_path);
let workspace = WorkspaceMetadata::from_path(&workspace_path, &system)?;
let search_path_settings = create_search_paths(&root_path, workspace.root());
for path in search_paths
for path in search_path_settings
.extra_paths
.iter()
.flatten()
.chain(search_paths.custom_typeshed.iter())
.chain(search_paths.site_packages.iter().flat_map(|site_packages| {
if let SitePackages::Known(path) = site_packages {
path.as_slice()
} else {
&[]
}
}))
.chain(search_path_settings.site_packages.iter())
.chain(search_path_settings.custom_typeshed.iter())
{
std::fs::create_dir_all(path.as_std_path())
.with_context(|| format!("Failed to create search path '{path}'"))?;
}
let configuration = Configuration {
target_version: Some(PythonVersion::PY312),
search_paths,
let settings = ProgramSettings {
target_version: PythonVersion::default(),
search_paths: search_path_settings.clone(),
};
let workspace =
WorkspaceMetadata::from_path(&workspace_path, &system, Some(configuration.clone()))?;
let db = RootDatabase::new(workspace, system)?;
let db = RootDatabase::new(workspace, settings, system)?;
let (sender, receiver) = crossbeam::channel::unbounded();
let watcher = directory_watcher(move |events| sender.send(events).unwrap())
@@ -259,7 +254,7 @@ where
watcher: Some(watcher),
_temp_dir: temp_dir,
root_dir: root_path,
configuration,
search_path_settings,
};
// Sometimes the file watcher reports changes for events that happened before the watcher was started.
@@ -312,7 +307,7 @@ fn new_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
let foo = case.system_file(&foo_path).expect("foo.py to exist.");
@@ -335,7 +330,7 @@ fn new_ignored_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert!(case.system_file(&foo_path).is_ok());
assert_eq!(&case.collect_package_files(&bar_path), &[bar_file]);
@@ -359,7 +354,7 @@ fn changed_file() -> anyhow::Result<()> {
assert!(!changes.is_empty());
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert_eq!(source_text(case.db(), foo).as_str(), "print('Version 2')");
assert_eq!(&case.collect_package_files(&foo_path), &[foo]);
@@ -382,7 +377,7 @@ fn deleted_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert!(!foo.exists(case.db()));
assert_eq!(&case.collect_package_files(&foo_path), &[] as &[File]);
@@ -414,7 +409,7 @@ fn move_file_to_trash() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert!(!foo.exists(case.db()));
assert_eq!(&case.collect_package_files(&foo_path), &[] as &[File]);
@@ -446,7 +441,7 @@ fn move_file_to_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
let foo_in_workspace = case.system_file(&foo_in_workspace_path)?;
@@ -474,7 +469,7 @@ fn rename_file() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert!(!foo.exists(case.db()));
@@ -515,7 +510,7 @@ fn directory_moved_to_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
let init_file = case
.system_file(sub_new_path.join("__init__.py"))
@@ -566,7 +561,7 @@ fn directory_moved_to_trash() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("sub.a").unwrap()).is_none());
@@ -620,7 +615,7 @@ fn directory_renamed() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("sub.a").unwrap()).is_none());
@@ -685,7 +680,7 @@ fn directory_deleted() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
// `import sub.a` should no longer resolve
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("sub.a").unwrap()).is_none());
@@ -699,13 +694,15 @@ fn directory_deleted() -> anyhow::Result<()> {
#[test]
fn search_path() -> anyhow::Result<()> {
let mut case = setup_with_search_paths(
[("bar.py", "import sub.a")],
|root_path, _workspace_path| SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![root_path.join("site_packages")])),
..SearchPathConfiguration::default()
},
)?;
let mut case =
setup_with_search_paths([("bar.py", "import sub.a")], |root_path, workspace_path| {
SearchPathSettings {
extra_paths: vec![],
src_root: workspace_path.to_path_buf(),
custom_typeshed: None,
site_packages: vec![root_path.join("site_packages")],
}
})?;
let site_packages = case.root_path().join("site_packages");
@@ -718,7 +715,7 @@ fn search_path() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("a").unwrap()).is_some());
assert_eq!(
@@ -739,9 +736,9 @@ fn add_search_path() -> anyhow::Result<()> {
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("a").unwrap()).is_none());
// Register site-packages as a search path.
case.update_search_path_settings(SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![site_packages.clone()])),
..SearchPathConfiguration::default()
case.update_search_path_settings(|settings| SearchPathSettings {
site_packages: vec![site_packages.clone()],
..settings.clone()
})
.expect("Search path settings to be valid");
@@ -749,7 +746,7 @@ fn add_search_path() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert!(resolve_module(case.db().upcast(), ModuleName::new_static("a").unwrap()).is_some());
@@ -758,19 +755,21 @@ fn add_search_path() -> anyhow::Result<()> {
#[test]
fn remove_search_path() -> anyhow::Result<()> {
let mut case = setup_with_search_paths(
[("bar.py", "import sub.a")],
|root_path, _workspace_path| SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![root_path.join("site_packages")])),
..SearchPathConfiguration::default()
},
)?;
let mut case =
setup_with_search_paths([("bar.py", "import sub.a")], |root_path, workspace_path| {
SearchPathSettings {
extra_paths: vec![],
src_root: workspace_path.to_path_buf(),
custom_typeshed: None,
site_packages: vec![root_path.join("site_packages")],
}
})?;
// Remove site packages from the search path settings.
let site_packages = case.root_path().join("site_packages");
case.update_search_path_settings(SearchPathConfiguration {
site_packages: None,
..SearchPathConfiguration::default()
case.update_search_path_settings(|settings| SearchPathSettings {
site_packages: vec![],
..settings.clone()
})
.expect("Search path settings to be valid");
@@ -783,48 +782,6 @@ fn remove_search_path() -> anyhow::Result<()> {
Ok(())
}
#[test]
fn changed_versions_file() -> anyhow::Result<()> {
let mut case = setup_with_search_paths(
|root_path: &SystemPath, workspace_path: &SystemPath| {
std::fs::write(workspace_path.join("bar.py").as_std_path(), "import sub.a")?;
std::fs::create_dir_all(root_path.join("typeshed/stdlib").as_std_path())?;
std::fs::write(root_path.join("typeshed/stdlib/VERSIONS").as_std_path(), "")?;
std::fs::write(
root_path.join("typeshed/stdlib/os.pyi").as_std_path(),
"# not important",
)?;
Ok(())
},
|root_path, _workspace_path| SearchPathConfiguration {
custom_typeshed: Some(root_path.join("typeshed")),
..SearchPathConfiguration::default()
},
)?;
// Unset the custom typeshed directory.
assert_eq!(
resolve_module(case.db(), ModuleName::new("os").unwrap()),
None
);
std::fs::write(
case.root_path()
.join("typeshed/stdlib/VERSIONS")
.as_std_path(),
"os: 3.0-",
)?;
let changes = case.stop_watch();
case.apply_changes(changes);
assert!(resolve_module(case.db(), ModuleName::new("os").unwrap()).is_some());
Ok(())
}
/// Watch a workspace that contains two files where one file is a hardlink to another.
///
/// Setup:
@@ -871,7 +828,7 @@ fn hard_links_in_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert_eq!(source_text(case.db(), foo).as_str(), "print('Version 2')");
@@ -942,7 +899,7 @@ fn hard_links_to_target_outside_workspace() -> anyhow::Result<()> {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert_eq!(source_text(case.db(), bar).as_str(), "print('Version 2')");
@@ -981,7 +938,7 @@ mod unix {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert_eq!(
foo.permissions(case.db()),
@@ -1066,7 +1023,7 @@ mod unix {
let changes = case.take_watch_changes();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert_eq!(
source_text(case.db(), baz.file()).as_str(),
@@ -1079,7 +1036,7 @@ mod unix {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert_eq!(
source_text(case.db(), baz.file()).as_str(),
@@ -1150,7 +1107,7 @@ mod unix {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
// The file watcher is guaranteed to emit one event for the changed file, but it isn't specified
// whether the event is emitted for the "original" or the linked path because both paths are watched.
@@ -1219,11 +1176,11 @@ mod unix {
Ok(())
},
|_root, workspace| SearchPathConfiguration {
site_packages: Some(SitePackages::Known(vec![
workspace.join(".venv/lib/python3.12/site-packages")
])),
..SearchPathConfiguration::default()
|_root, workspace| SearchPathSettings {
extra_paths: vec![],
src_root: workspace.to_path_buf(),
custom_typeshed: None,
site_packages: vec![workspace.join(".venv/lib/python3.12/site-packages")],
},
)?;
@@ -1258,7 +1215,7 @@ mod unix {
let changes = case.stop_watch();
case.apply_changes(changes);
case.db_mut().apply_changes(changes);
assert_eq!(
source_text(case.db(), baz_original_file).as_str(),

View File

@@ -17,7 +17,6 @@ ruff_python_ast = { workspace = true }
ruff_python_stdlib = { workspace = true }
ruff_source_file = { workspace = true }
ruff_text_size = { workspace = true }
ruff_python_literal = { workspace = true }
anyhow = { workspace = true }
bitflags = { workspace = true }
@@ -27,7 +26,6 @@ countme = { workspace = true }
once_cell = { workspace = true }
ordermap = { workspace = true }
salsa = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
rustc-hash = { workspace = true }
hashbrown = { workspace = true }

View File

@@ -5,7 +5,7 @@ use rustc_hash::FxHasher;
pub use db::Db;
pub use module_name::ModuleName;
pub use module_resolver::{resolve_module, system_module_search_paths, vendored_typeshed_stubs};
pub use program::{Program, ProgramSettings, SearchPathSettings, SitePackages};
pub use program::{Program, ProgramSettings, SearchPathSettings};
pub use python_version::PythonVersion;
pub use semantic_model::{HasTy, SemanticModel};
@@ -19,7 +19,6 @@ mod program;
mod python_version;
pub mod semantic_index;
mod semantic_model;
pub(crate) mod site_packages;
pub mod types;
type FxOrderSet<V> = ordermap::set::OrderSet<V, BuildHasherDefault<FxHasher>>;

View File

@@ -13,6 +13,7 @@ use resolver::SearchPathIterator;
mod module;
mod path;
mod resolver;
mod state;
mod typeshed;
#[cfg(test)]

View File

@@ -9,11 +9,11 @@ use ruff_db::files::{system_path_to_file, vendored_path_to_file, File, FileError
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ruff_db::vendored::{VendoredPath, VendoredPathBuf};
use super::typeshed::{typeshed_versions, TypeshedVersionsParseError, TypeshedVersionsQueryResult};
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::resolver::ResolverContext;
use crate::site_packages::SitePackagesDiscoveryError;
use super::state::ResolverState;
use super::typeshed::{TypeshedVersionsParseError, TypeshedVersionsQueryResult};
/// A path that points to a Python module.
///
@@ -60,7 +60,7 @@ impl ModulePath {
}
#[must_use]
pub(super) fn is_directory(&self, resolver: &ResolverContext) -> bool {
pub(crate) fn is_directory(&self, resolver: &ResolverState) -> bool {
let ModulePath {
search_path,
relative_path,
@@ -74,7 +74,7 @@ impl ModulePath {
== Err(FileError::IsADirectory)
}
SearchPathInner::StandardLibraryCustom(stdlib_root) => {
match query_stdlib_version(relative_path, resolver) {
match query_stdlib_version(Some(stdlib_root), relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => {
@@ -84,7 +84,7 @@ impl ModulePath {
}
}
SearchPathInner::StandardLibraryVendored(stdlib_root) => {
match query_stdlib_version(relative_path, resolver) {
match query_stdlib_version(None, relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => resolver
@@ -96,7 +96,7 @@ impl ModulePath {
}
#[must_use]
pub(super) fn is_regular_package(&self, resolver: &ResolverContext) -> bool {
pub(crate) fn is_regular_package(&self, resolver: &ResolverState) -> bool {
let ModulePath {
search_path,
relative_path,
@@ -113,7 +113,7 @@ impl ModulePath {
.is_ok()
}
SearchPathInner::StandardLibraryCustom(search_path) => {
match query_stdlib_version(relative_path, resolver) {
match query_stdlib_version(Some(search_path), relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => system_path_to_file(
@@ -124,7 +124,7 @@ impl ModulePath {
}
}
SearchPathInner::StandardLibraryVendored(search_path) => {
match query_stdlib_version(relative_path, resolver) {
match query_stdlib_version(None, relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => false,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => resolver
@@ -136,7 +136,7 @@ impl ModulePath {
}
#[must_use]
pub(super) fn to_file(&self, resolver: &ResolverContext) -> Option<File> {
pub(crate) fn to_file(&self, resolver: &ResolverState) -> Option<File> {
let db = resolver.db.upcast();
let ModulePath {
search_path,
@@ -150,7 +150,7 @@ impl ModulePath {
system_path_to_file(db, search_path.join(relative_path)).ok()
}
SearchPathInner::StandardLibraryCustom(stdlib_root) => {
match query_stdlib_version(relative_path, resolver) {
match query_stdlib_version(Some(stdlib_root), relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => None,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => {
@@ -159,7 +159,7 @@ impl ModulePath {
}
}
SearchPathInner::StandardLibraryVendored(stdlib_root) => {
match query_stdlib_version(relative_path, resolver) {
match query_stdlib_version(None, relative_path, resolver) {
TypeshedVersionsQueryResult::DoesNotExist => None,
TypeshedVersionsQueryResult::Exists
| TypeshedVersionsQueryResult::MaybeExists => {
@@ -273,15 +273,19 @@ fn stdlib_path_to_module_name(relative_path: &Utf8Path) -> Option<ModuleName> {
#[must_use]
fn query_stdlib_version(
custom_stdlib_root: Option<&SystemPath>,
relative_path: &Utf8Path,
context: &ResolverContext,
resolver: &ResolverState,
) -> TypeshedVersionsQueryResult {
let Some(module_name) = stdlib_path_to_module_name(relative_path) else {
return TypeshedVersionsQueryResult::DoesNotExist;
};
let ResolverContext { db, target_version } = context;
typeshed_versions(*db).query_module(&module_name, *target_version)
let ResolverState {
db,
typeshed_versions,
target_version,
} = resolver;
typeshed_versions.query_module(*db, &module_name, custom_stdlib_root, *target_version)
}
/// Enumeration describing the various ways in which validation of a search path might fail.
@@ -289,7 +293,7 @@ fn query_stdlib_version(
/// If validation fails for a search path derived from the user settings,
/// a message must be displayed to the user,
/// as type checking cannot be done reliably in these circumstances.
#[derive(Debug)]
#[derive(Debug, PartialEq, Eq)]
pub(crate) enum SearchPathValidationError {
/// The path provided by the user was not a directory
NotADirectory(SystemPathBuf),
@@ -300,20 +304,18 @@ pub(crate) enum SearchPathValidationError {
NoStdlibSubdirectory(SystemPathBuf),
/// The typeshed path provided by the user is a directory,
/// but `stdlib/VERSIONS` could not be read.
/// but no `stdlib/VERSIONS` file exists.
/// (This is only relevant for stdlib search paths.)
FailedToReadVersionsFile {
path: SystemPathBuf,
error: std::io::Error,
},
NoVersionsFile(SystemPathBuf),
/// `stdlib/VERSIONS` is a directory.
/// (This is only relevant for stdlib search paths.)
VersionsIsADirectory(SystemPathBuf),
/// The path provided by the user is a directory,
/// and a `stdlib/VERSIONS` file exists, but it fails to parse.
/// (This is only relevant for stdlib search paths.)
VersionsParseError(TypeshedVersionsParseError),
/// Failed to discover the site-packages for the configured virtual environment.
SitePackagesDiscovery(SitePackagesDiscoveryError),
}
impl fmt::Display for SearchPathValidationError {
@@ -323,16 +325,9 @@ impl fmt::Display for SearchPathValidationError {
Self::NoStdlibSubdirectory(path) => {
write!(f, "The directory at {path} has no `stdlib/` subdirectory")
}
Self::FailedToReadVersionsFile { path, error } => {
write!(
f,
"Failed to read the custom typeshed versions file '{path}': {error}"
)
}
Self::NoVersionsFile(path) => write!(f, "Expected a file at {path}/stdlib/VERSIONS"),
Self::VersionsIsADirectory(path) => write!(f, "{path}/stdlib/VERSIONS is a directory."),
Self::VersionsParseError(underlying_error) => underlying_error.fmt(f),
SearchPathValidationError::SitePackagesDiscovery(error) => {
write!(f, "Failed to discover the site-packages directory: {error}")
}
}
}
}
@@ -347,18 +342,6 @@ impl std::error::Error for SearchPathValidationError {
}
}
impl From<TypeshedVersionsParseError> for SearchPathValidationError {
fn from(value: TypeshedVersionsParseError) -> Self {
Self::VersionsParseError(value)
}
}
impl From<SitePackagesDiscoveryError> for SearchPathValidationError {
fn from(value: SitePackagesDiscoveryError) -> Self {
Self::SitePackagesDiscovery(value)
}
}
type SearchPathResult<T> = Result<T, SearchPathValidationError>;
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
@@ -401,10 +384,11 @@ pub(crate) struct SearchPath(Arc<SearchPathInner>);
impl SearchPath {
fn directory_path(system: &dyn System, root: SystemPathBuf) -> SearchPathResult<SystemPathBuf> {
if system.is_directory(&root) {
Ok(root)
let canonicalized = system.canonicalize_path(&root).unwrap_or(root);
if system.is_directory(&canonicalized) {
Ok(canonicalized)
} else {
Err(SearchPathValidationError::NotADirectory(root))
Err(SearchPathValidationError::NotADirectory(canonicalized))
}
}
@@ -423,22 +407,32 @@ impl SearchPath {
}
/// Create a new standard-library search path pointing to a custom directory on disk
pub(crate) fn custom_stdlib(db: &dyn Db, typeshed: &SystemPath) -> SearchPathResult<Self> {
pub(crate) fn custom_stdlib(db: &dyn Db, typeshed: SystemPathBuf) -> SearchPathResult<Self> {
let system = db.system();
if !system.is_directory(typeshed) {
if !system.is_directory(&typeshed) {
return Err(SearchPathValidationError::NotADirectory(
typeshed.to_path_buf(),
));
}
let stdlib =
Self::directory_path(system, typeshed.join("stdlib")).map_err(|err| match err {
SearchPathValidationError::NotADirectory(_) => {
SearchPathValidationError::NoStdlibSubdirectory(typeshed.to_path_buf())
SearchPathValidationError::NotADirectory(path) => {
SearchPathValidationError::NoStdlibSubdirectory(path)
}
err => err,
})?;
let typeshed_versions =
system_path_to_file(db.upcast(), stdlib.join("VERSIONS")).map_err(|err| match err {
FileError::NotFound => SearchPathValidationError::NoVersionsFile(typeshed),
FileError::IsADirectory => {
SearchPathValidationError::VersionsIsADirectory(typeshed)
}
})?;
super::typeshed::parse_typeshed_versions(db, typeshed_versions)
.as_ref()
.map_err(|validation_error| {
SearchPathValidationError::VersionsParseError(validation_error.clone())
})?;
Ok(Self(Arc::new(SearchPathInner::StandardLibraryCustom(
stdlib,
))))
@@ -629,10 +623,10 @@ mod tests {
use ruff_db::Db;
use crate::db::tests::TestDb;
use crate::module_resolver::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use crate::python_version::PythonVersion;
use super::*;
use crate::module_resolver::testing::{FileSpec, MockedTypeshed, TestCase, TestCaseBuilder};
use crate::python_version::PythonVersion;
impl ModulePath {
#[must_use]
@@ -644,6 +638,15 @@ mod tests {
}
impl SearchPath {
#[must_use]
pub(crate) fn is_stdlib_search_path(&self) -> bool {
matches!(
&*self.0,
SearchPathInner::StandardLibraryCustom(_)
| SearchPathInner::StandardLibraryVendored(_)
)
}
fn join(&self, component: &str) -> ModulePath {
self.to_module_path().join(component)
}
@@ -658,7 +661,7 @@ mod tests {
.build();
assert_eq!(
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
.unwrap()
.to_module_path()
.with_py_extension(),
@@ -666,7 +669,7 @@ mod tests {
);
assert_eq!(
&SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
&SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
.unwrap()
.join("foo")
.with_pyi_extension(),
@@ -777,7 +780,7 @@ mod tests {
let TestCase { db, stdlib, .. } = TestCaseBuilder::new()
.with_custom_typeshed(MockedTypeshed::default())
.build();
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
.unwrap()
.to_module_path()
.push("bar.py");
@@ -789,7 +792,7 @@ mod tests {
let TestCase { db, stdlib, .. } = TestCaseBuilder::new()
.with_custom_typeshed(MockedTypeshed::default())
.build();
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap())
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf())
.unwrap()
.to_module_path()
.push("bar.rs");
@@ -821,7 +824,7 @@ mod tests {
.with_custom_typeshed(MockedTypeshed::default())
.build();
let root = SearchPath::custom_stdlib(&db, stdlib.parent().unwrap()).unwrap();
let root = SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf()).unwrap();
// Must have a `.pyi` extension or no extension:
let bad_absolute_path = SystemPath::new("foo/stdlib/x.py");
@@ -869,7 +872,8 @@ mod tests {
.with_custom_typeshed(typeshed)
.with_target_version(target_version)
.build();
let stdlib = SearchPath::custom_stdlib(&db, stdlib.parent().unwrap()).unwrap();
let stdlib =
SearchPath::custom_stdlib(&db, stdlib.parent().unwrap().to_path_buf()).unwrap();
(db, stdlib)
}
@@ -894,7 +898,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let asyncio_regular_package = stdlib_path.join("asyncio");
assert!(asyncio_regular_package.is_directory(&resolver));
@@ -922,7 +926,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let xml_namespace_package = stdlib_path.join("xml");
assert!(xml_namespace_package.is_directory(&resolver));
@@ -944,7 +948,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let functools_module = stdlib_path.join("functools.pyi");
assert!(functools_module.to_file(&resolver).is_some());
@@ -960,7 +964,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let collections_regular_package = stdlib_path.join("collections");
assert_eq!(collections_regular_package.to_file(&resolver), None);
@@ -976,7 +980,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let importlib_namespace_package = stdlib_path.join("importlib");
assert_eq!(importlib_namespace_package.to_file(&resolver), None);
@@ -997,7 +1001,7 @@ mod tests {
};
let (db, stdlib_path) = py38_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY38);
let resolver = ResolverState::new(&db, PythonVersion::PY38);
let non_existent = stdlib_path.join("doesnt_even_exist");
assert_eq!(non_existent.to_file(&resolver), None);
@@ -1025,7 +1029,7 @@ mod tests {
};
let (db, stdlib_path) = py39_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY39);
let resolver = ResolverState::new(&db, PythonVersion::PY39);
// Since we've set the target version to Py39,
// `collections` should now exist as a directory, according to VERSIONS...
@@ -1054,7 +1058,7 @@ mod tests {
};
let (db, stdlib_path) = py39_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY39);
let resolver = ResolverState::new(&db, PythonVersion::PY39);
// The `importlib` directory now also exists
let importlib_namespace_package = stdlib_path.join("importlib");
@@ -1078,7 +1082,7 @@ mod tests {
};
let (db, stdlib_path) = py39_typeshed_test_case(TYPESHED);
let resolver = ResolverContext::new(&db, PythonVersion::PY39);
let resolver = ResolverState::new(&db, PythonVersion::PY39);
// The `xml` package no longer exists on py39:
let xml_namespace_package = stdlib_path.join("xml");

View File

@@ -1,19 +1,19 @@
use rustc_hash::{FxBuildHasher, FxHashSet};
use std::borrow::Cow;
use std::iter::FusedIterator;
use std::ops::Deref;
use rustc_hash::{FxBuildHasher, FxHashSet};
use ruff_db::files::{File, FilePath, FileRootKind};
use ruff_db::system::{DirectoryEntry, System, SystemPath, SystemPathBuf};
use ruff_db::vendored::{VendoredFileSystem, VendoredPath};
use ruff_db::system::{DirectoryEntry, SystemPath, SystemPathBuf};
use ruff_db::vendored::VendoredPath;
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::{Program, SearchPathSettings};
use super::module::{Module, ModuleKind};
use super::path::{ModulePath, SearchPath, SearchPathValidationError};
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::typeshed::{vendored_typeshed_versions, TypeshedVersions};
use crate::site_packages::VirtualEnvironment;
use crate::{Program, PythonVersion, SearchPathSettings, SitePackages};
use super::state::ResolverState;
/// Resolves a module name to a module.
pub fn resolve_module(db: &dyn Db, module_name: ModuleName) -> Option<Module> {
@@ -122,7 +122,7 @@ pub(crate) fn search_paths(db: &dyn Db) -> SearchPathIterator {
Program::get(db).search_paths(db).iter(db)
}
#[derive(Debug, PartialEq, Eq)]
#[derive(Debug, PartialEq, Eq, Default)]
pub(crate) struct SearchPaths {
/// Search paths that have been statically determined purely from reading Ruff's configuration settings.
/// These shouldn't ever change unless the config settings themselves change.
@@ -135,8 +135,6 @@ pub(crate) struct SearchPaths {
/// in terms of module-resolution priority until we've discovered the editable installs
/// for the first `site-packages` path
site_packages: Vec<SearchPath>,
typeshed_versions: ResolvedTypeshedVersions,
}
impl SearchPaths {
@@ -148,14 +146,8 @@ impl SearchPaths {
/// [module resolution order]: https://typing.readthedocs.io/en/latest/spec/distributing.html#import-resolution-ordering
pub(crate) fn from_settings(
db: &dyn Db,
settings: &SearchPathSettings,
settings: SearchPathSettings,
) -> Result<Self, SearchPathValidationError> {
fn canonicalize(path: &SystemPath, system: &dyn System) -> SystemPathBuf {
system
.canonicalize_path(path)
.unwrap_or_else(|_| path.to_path_buf())
}
let SearchPathSettings {
extra_paths,
src_root,
@@ -169,65 +161,45 @@ impl SearchPaths {
let mut static_paths = vec![];
for path in extra_paths {
let path = canonicalize(path, system);
files.try_add_root(db.upcast(), &path, FileRootKind::LibrarySearchPath);
tracing::debug!("Adding extra search-path '{path}'");
tracing::debug!("Adding static extra search-path '{path}'");
static_paths.push(SearchPath::extra(system, path)?);
let search_path = SearchPath::extra(system, path)?;
files.try_add_root(
db.upcast(),
search_path.as_system_path().unwrap(),
FileRootKind::LibrarySearchPath,
);
static_paths.push(search_path);
}
tracing::debug!("Adding first-party search path '{src_root}'");
static_paths.push(SearchPath::first_party(system, src_root.to_path_buf())?);
static_paths.push(SearchPath::first_party(system, src_root)?);
let (typeshed_versions, stdlib_path) = if let Some(custom_typeshed) = custom_typeshed {
let custom_typeshed = canonicalize(custom_typeshed, system);
static_paths.push(if let Some(custom_typeshed) = custom_typeshed {
tracing::debug!("Adding custom-stdlib search path '{custom_typeshed}'");
let search_path = SearchPath::custom_stdlib(db, custom_typeshed)?;
files.try_add_root(
db.upcast(),
&custom_typeshed,
search_path.as_system_path().unwrap(),
FileRootKind::LibrarySearchPath,
);
let versions_path = custom_typeshed.join("stdlib/VERSIONS");
let versions_content = system.read_to_string(&versions_path).map_err(|error| {
SearchPathValidationError::FailedToReadVersionsFile {
path: versions_path,
error,
}
})?;
let parsed: TypeshedVersions = versions_content.parse()?;
let search_path = SearchPath::custom_stdlib(db, &custom_typeshed)?;
(ResolvedTypeshedVersions::Custom(parsed), search_path)
search_path
} else {
tracing::debug!("Using vendored stdlib");
(
ResolvedTypeshedVersions::Vendored(vendored_typeshed_versions()),
SearchPath::vendored_stdlib(),
)
};
static_paths.push(stdlib_path);
let site_packages_paths = match site_packages_paths {
SitePackages::Derived { venv_path } => VirtualEnvironment::new(venv_path, system)
.and_then(|venv| venv.site_packages_directories(system))?,
SitePackages::Known(paths) => paths
.iter()
.map(|path| canonicalize(path, system))
.collect(),
};
SearchPath::vendored_stdlib()
});
let mut site_packages: Vec<_> = Vec::with_capacity(site_packages_paths.len());
for path in site_packages_paths {
tracing::debug!("Adding site-packages search path '{path}'");
files.try_add_root(db.upcast(), &path, FileRootKind::LibrarySearchPath);
site_packages.push(SearchPath::site_packages(system, path)?);
let search_path = SearchPath::site_packages(system, path)?;
files.try_add_root(
db.upcast(),
search_path.as_system_path().unwrap(),
FileRootKind::LibrarySearchPath,
);
site_packages.push(search_path);
}
// TODO vendor typeshed's third-party stubs as well as the stdlib and fallback to them as a final step
@@ -252,48 +224,16 @@ impl SearchPaths {
Ok(SearchPaths {
static_paths,
site_packages,
typeshed_versions,
})
}
pub(super) fn iter<'a>(&'a self, db: &'a dyn Db) -> SearchPathIterator<'a> {
pub(crate) fn iter<'a>(&'a self, db: &'a dyn Db) -> SearchPathIterator<'a> {
SearchPathIterator {
db,
static_paths: self.static_paths.iter(),
dynamic_paths: None,
}
}
pub(crate) fn custom_stdlib(&self) -> Option<&SystemPath> {
self.static_paths.iter().find_map(|search_path| {
if search_path.is_standard_library() {
search_path.as_system_path()
} else {
None
}
})
}
pub(super) fn typeshed_versions(&self) -> &TypeshedVersions {
&self.typeshed_versions
}
}
#[derive(Debug, PartialEq, Eq)]
enum ResolvedTypeshedVersions {
Vendored(&'static TypeshedVersions),
Custom(TypeshedVersions),
}
impl Deref for ResolvedTypeshedVersions {
type Target = TypeshedVersions;
fn deref(&self) -> &Self::Target {
match self {
ResolvedTypeshedVersions::Vendored(versions) => versions,
ResolvedTypeshedVersions::Custom(versions) => versions,
}
}
}
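
The `ResolvedTypeshedVersions` enum above is a small either-borrowed-or-owned pattern: `Deref` lets callers treat the vendored `'static` copy and a freshly parsed custom copy uniformly. A minimal, self-contained sketch of the same shape (all names here are illustrative, not from this diff):

use std::ops::Deref;
use std::sync::OnceLock;

struct Versions(Vec<String>);

static VENDORED: OnceLock<Versions> = OnceLock::new();

enum Resolved {
    Vendored(&'static Versions),
    Custom(Versions),
}

impl Deref for Resolved {
    type Target = Versions;

    fn deref(&self) -> &Versions {
        match self {
            Resolved::Vendored(versions) => versions,
            Resolved::Custom(versions) => versions,
        }
    }
}

fn main() {
    let vendored = Resolved::Vendored(VENDORED.get_or_init(|| Versions(vec!["asyncio".into()])));
    let custom = Resolved::Custom(Versions(vec!["my_patch".into()]));
    // Callers deref to `&Versions` without caring which variant they hold.
    assert_eq!(vendored.0.len(), 1);
    assert_eq!(custom.0.len(), 1);
}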
/// Collect all dynamic search paths. For each `site-packages` path:
@@ -311,7 +251,6 @@ pub(crate) fn dynamic_resolution_paths(db: &dyn Db) -> Vec<SearchPath> {
let SearchPaths {
static_paths,
site_packages,
typeshed_versions: _,
} = Program::get(db).search_paths(db);
let mut dynamic_paths = Vec::new();
@@ -376,16 +315,12 @@ pub(crate) fn dynamic_resolution_paths(db: &dyn Db) -> Vec<SearchPath> {
let installations = all_pth_files.iter().flat_map(PthFile::items);
for installation in installations {
let installation = system
.canonicalize_path(&installation)
.unwrap_or(installation);
if existing_paths.insert(Cow::Owned(installation.clone())) {
match SearchPath::editable(system, installation.clone()) {
match SearchPath::editable(system, installation) {
Ok(search_path) => {
tracing::debug!(
"Adding editable installation to module resolution path {path}",
path = installation
path = search_path.as_system_path().unwrap()
);
dynamic_paths.push(search_path);
}
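
The loop above relies on `HashSet::insert` returning `false` for duplicates, so each editable install lands on the dynamic paths at most once even when several `.pth` files point at the same directory after canonicalization. A sketch of that dedup idiom with `Cow` keys, under simplified types:

use std::borrow::Cow;
use std::collections::HashSet;

fn main() {
    // Paths already on the search path can stay borrowed; new ones are owned.
    let static_paths = ["/src", "/site-packages"];
    let mut existing: HashSet<Cow<'_, str>> =
        static_paths.iter().copied().map(Cow::Borrowed).collect();

    for editable in ["/site-packages", "/checkouts/project", "/checkouts/project"] {
        // `insert` returns `false` for duplicates, so each path is added once.
        if existing.insert(Cow::Owned(editable.to_string())) {
            println!("adding editable search path {editable}");
        }
    }
}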
@@ -547,7 +482,7 @@ struct ModuleNameIngredient<'db> {
fn resolve_name(db: &dyn Db, name: &ModuleName) -> Option<(SearchPath, File, ModuleKind)> {
let program = Program::get(db);
let target_version = program.target_version(db);
let resolver_state = ResolverContext::new(db, target_version);
let resolver_state = ResolverState::new(db, target_version);
let is_builtin_module =
ruff_python_stdlib::sys::is_builtin_module(target_version.minor, name.as_str());
@@ -610,7 +545,7 @@ fn resolve_name(db: &dyn Db, name: &ModuleName) -> Option<(SearchPath, File, Mod
fn resolve_package<'a, 'db, I>(
module_search_path: &SearchPath,
components: I,
resolver_state: &ResolverContext<'db>,
resolver_state: &ResolverState<'db>,
) -> Result<ResolvedPackage, PackageKind>
where
I: Iterator<Item = &'a str>,
@@ -692,21 +627,6 @@ impl PackageKind {
}
}
pub(super) struct ResolverContext<'db> {
pub(super) db: &'db dyn Db,
pub(super) target_version: PythonVersion,
}
impl<'db> ResolverContext<'db> {
pub(super) fn new(db: &'db dyn Db, target_version: PythonVersion) -> Self {
Self { db, target_version }
}
pub(super) fn vendored(&self) -> &VendoredFileSystem {
self.db.vendored()
}
}
#[cfg(test)]
mod tests {
use ruff_db::files::{system_path_to_file, File, FilePath};
@@ -861,7 +781,7 @@ mod tests {
"Search path for {module_name} was unexpectedly {search_path:?}"
);
assert!(
search_path.is_standard_library(),
search_path.is_stdlib_search_path(),
"Expected a stdlib search path, but got {search_path:?}"
);
}
@@ -957,7 +877,7 @@ mod tests {
"Search path for {module_name} was unexpectedly {search_path:?}"
);
assert!(
search_path.is_standard_library(),
search_path.is_stdlib_search_path(),
"Expected a stdlib search path, but got {search_path:?}"
);
}
@@ -1274,13 +1194,13 @@ mod tests {
Program::from_settings(
&db,
&ProgramSettings {
ProgramSettings {
target_version: PythonVersion::PY38,
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: src.clone(),
custom_typeshed: Some(custom_typeshed.clone()),
site_packages: SitePackages::Known(vec![site_packages]),
site_packages: vec![site_packages],
},
},
)
@@ -1779,16 +1699,13 @@ not_a_directory
Program::from_settings(
&db,
&ProgramSettings {
ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: SystemPathBuf::from("/src"),
custom_typeshed: None,
site_packages: SitePackages::Known(vec![
venv_site_packages,
system_site_packages,
]),
site_packages: vec![venv_site_packages, system_site_packages],
},
},
)

View File

@@ -0,0 +1,25 @@
use ruff_db::vendored::VendoredFileSystem;
use super::typeshed::LazyTypeshedVersions;
use crate::db::Db;
use crate::python_version::PythonVersion;
pub(crate) struct ResolverState<'db> {
pub(crate) db: &'db dyn Db,
pub(crate) typeshed_versions: LazyTypeshedVersions<'db>,
pub(crate) target_version: PythonVersion,
}
impl<'db> ResolverState<'db> {
pub(crate) fn new(db: &'db dyn Db, target_version: PythonVersion) -> Self {
Self {
db,
typeshed_versions: LazyTypeshedVersions::new(),
target_version,
}
}
pub(crate) fn vendored(&self) -> &VendoredFileSystem {
self.db.vendored()
}
}

View File

@@ -4,7 +4,7 @@ use ruff_db::vendored::VendoredPathBuf;
use crate::db::tests::TestDb;
use crate::program::{Program, SearchPathSettings};
use crate::python_version::PythonVersion;
use crate::{ProgramSettings, SitePackages};
use crate::ProgramSettings;
/// A test case for the module resolver.
///
@@ -179,7 +179,6 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
first_party_files,
site_packages_files,
} = self;
TestCaseBuilder {
typeshed_option: typeshed,
target_version,
@@ -196,7 +195,6 @@ impl TestCaseBuilder<UnspecifiedTypeshed> {
site_packages,
target_version,
} = self.with_custom_typeshed(MockedTypeshed::default()).build();
TestCase {
db,
src,
@@ -225,13 +223,13 @@ impl TestCaseBuilder<MockedTypeshed> {
Program::from_settings(
&db,
&ProgramSettings {
ProgramSettings {
target_version,
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: src.clone(),
custom_typeshed: Some(typeshed.clone()),
site_packages: SitePackages::Known(vec![site_packages.clone()]),
site_packages: vec![site_packages.clone()],
},
},
)
@@ -281,11 +279,13 @@ impl TestCaseBuilder<VendoredTypeshed> {
Program::from_settings(
&db,
&ProgramSettings {
ProgramSettings {
target_version,
search_paths: SearchPathSettings {
site_packages: SitePackages::Known(vec![site_packages.clone()]),
..SearchPathSettings::new(src.clone())
extra_paths: vec![],
src_root: src.clone(),
custom_typeshed: None,
site_packages: vec![site_packages.clone()],
},
},
)

View File

@@ -1,6 +1,6 @@
pub use self::vendored::vendored_typeshed_stubs;
pub(super) use self::versions::{
typeshed_versions, vendored_typeshed_versions, TypeshedVersions, TypeshedVersionsParseError,
parse_typeshed_versions, LazyTypeshedVersions, TypeshedVersionsParseError,
TypeshedVersionsQueryResult,
};

View File

@@ -1,3 +1,4 @@
use std::cell::OnceCell;
use std::collections::BTreeMap;
use std::fmt;
use std::num::{NonZeroU16, NonZeroUsize};
@@ -5,12 +6,78 @@ use std::ops::{RangeFrom, RangeInclusive};
use std::str::FromStr;
use once_cell::sync::Lazy;
use ruff_db::system::SystemPath;
use rustc_hash::FxHashMap;
use ruff_db::files::{system_path_to_file, File};
use super::vendored::vendored_typeshed_stubs;
use crate::db::Db;
use crate::module_name::ModuleName;
use crate::{Program, PythonVersion};
use crate::python_version::PythonVersion;
#[derive(Debug)]
pub(crate) struct LazyTypeshedVersions<'db>(OnceCell<&'db TypeshedVersions>);
impl<'db> LazyTypeshedVersions<'db> {
#[must_use]
pub(crate) fn new() -> Self {
Self(OnceCell::new())
}
/// Query whether a module exists at runtime in the stdlib on a certain Python version.
///
/// Simply probing whether a file exists in typeshed is insufficient for this question,
/// as a module in the stdlib may have been added in Python 3.10, but the typeshed stub
/// will still be available (either in a custom typeshed dir or in our vendored copy)
/// even if the user specified Python 3.8 as the target version.
///
/// For top-level modules and packages, the VERSIONS file can always provide an unambiguous answer
/// as to whether the module exists on the specified target version. However, VERSIONS does not
/// provide comprehensive information on all submodules, meaning that this method sometimes
/// returns [`TypeshedVersionsQueryResult::MaybeExists`].
/// See [`TypeshedVersionsQueryResult`] for more details.
#[must_use]
pub(crate) fn query_module(
&self,
db: &'db dyn Db,
module: &ModuleName,
stdlib_root: Option<&SystemPath>,
target_version: PythonVersion,
) -> TypeshedVersionsQueryResult {
let versions = self.0.get_or_init(|| {
let versions_path = if let Some(system_path) = stdlib_root {
system_path.join("VERSIONS")
} else {
return &VENDORED_VERSIONS;
};
let Ok(versions_file) = system_path_to_file(db.upcast(), &versions_path) else {
todo!(
"Still need to figure out how to handle VERSIONS files being deleted \
from custom typeshed directories! Expected a file to exist at {versions_path}"
)
};
// TODO(Alex/Micha): If VERSIONS is invalid,
// this should invalidate not just the specific module resolution we're currently attempting,
// but all type inference that depends on any standard-library types.
// Unwrapping here is not correct...
parse_typeshed_versions(db, versions_file).as_ref().unwrap()
});
versions.query_module(module, target_version)
}
}
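
`LazyTypeshedVersions` defers reading and parsing `VERSIONS` until the first stdlib query, then caches the borrowed result in a `OnceCell` for the rest of the resolver's life. A stripped-down sketch of caching a borrowed value this way (the parse step is simulated with a `println!`):

use std::cell::OnceCell;

struct LazyVersions<'db> {
    cell: OnceCell<&'db Vec<String>>,
}

impl<'db> LazyVersions<'db> {
    fn new() -> Self {
        Self { cell: OnceCell::new() }
    }

    // The expensive step runs on the first query only; later queries reuse
    // the cached borrow.
    fn get(&self, parsed: &'db Vec<String>) -> &'db Vec<String> {
        *self.cell.get_or_init(|| {
            println!("reading and parsing VERSIONS");
            parsed
        })
    }
}

fn main() {
    let parsed = vec!["asyncio".to_string(), "tomllib".to_string()];
    let lazy = LazyVersions::new();
    assert_eq!(lazy.get(&parsed).len(), 2);
    assert_eq!(lazy.get(&parsed).len(), 2); // prints "reading..." only once
}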
#[salsa::tracked(return_ref)]
pub(crate) fn parse_typeshed_versions(
db: &dyn Db,
versions_file: File,
) -> Result<TypeshedVersions, TypeshedVersionsParseError> {
// TODO: Handle IO errors
let file_content = versions_file
.read_to_string(db.upcast())
.unwrap_or_default();
file_content.parse()
}
static VENDORED_VERSIONS: Lazy<TypeshedVersions> = Lazy::new(|| {
TypeshedVersions::from_str(
@@ -21,14 +88,6 @@ static VENDORED_VERSIONS: Lazy<TypeshedVersions> = Lazy::new(|| {
.unwrap()
});
pub(crate) fn vendored_typeshed_versions() -> &'static TypeshedVersions {
&VENDORED_VERSIONS
}
pub(crate) fn typeshed_versions(db: &dyn Db) -> &TypeshedVersions {
Program::get(db).search_paths(db).typeshed_versions()
}
#[derive(Debug, PartialEq, Eq, Clone)]
pub(crate) struct TypeshedVersionsParseError {
line_number: Option<NonZeroU16>,
@@ -115,7 +174,7 @@ impl TypeshedVersions {
}
#[must_use]
pub(in crate::module_resolver) fn query_module(
fn query_module(
&self,
module: &ModuleName,
target_version: PythonVersion,
@@ -145,7 +204,7 @@ impl TypeshedVersions {
}
}
/// Possible answers [`TypeshedVersions::query_module()`] could give to the question:
/// Possible answers [`LazyTypeshedVersions::query_module()`] could give to the question:
/// "Does this module exist in the stdlib at runtime on a certain target version?"
#[derive(Debug, Copy, PartialEq, Eq, Clone, Hash)]
pub(crate) enum TypeshedVersionsQueryResult {

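For intuition, a toy model of the query semantics described in the doc comment above: modules listed in `VERSIONS` get a definitive answer against the target version, while unlisted submodules can only be answered with "maybe". The data and version tuples here are invented for illustration:

use std::collections::HashMap;

#[derive(Debug, PartialEq)]
enum QueryResult {
    Exists,
    DoesNotExist,
    MaybeExists,
}

fn main() {
    // Invented data: module -> (major, minor) version the module was added in.
    let versions: HashMap<&str, (u8, u8)> =
        HashMap::from([("asyncio", (3, 4)), ("tomllib", (3, 11))]);
    let target_version = (3, 8);

    for module in ["asyncio", "tomllib", "asyncio.staggered"] {
        let answer = match versions.get(module) {
            Some(added) if *added <= target_version => QueryResult::Exists,
            Some(_) => QueryResult::DoesNotExist,
            // Submodules are not listed exhaustively, so absence alone is
            // never a definitive "no".
            None => QueryResult::MaybeExists,
        };
        println!("{module} on 3.8: {answer:?}");
    }
}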
View File

@@ -1,4 +1,4 @@
use ruff_python_ast::{AnyNodeRef, NodeKind};
use ruff_python_ast::{AnyNodeRef, AstNode, NodeKind};
use ruff_text_size::{Ranged, TextRange};
/// Compact key for a node for use in a hash map.
@@ -11,7 +11,19 @@ pub(super) struct NodeKey {
}
impl NodeKey {
pub(super) fn from_node<'a, N>(node: N) -> Self
#[inline]
pub(super) fn from_node<'a, N>(node: &N) -> Self
where
N: AstNode,
{
NodeKey {
kind: node.kind(),
range: node.range(),
}
}
#[inline]
pub(super) fn from_ref<'a, N>(node: N) -> Self
where
N: Into<AnyNodeRef<'a>>,
{

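This hunk is the heart of the commit: constructing a `NodeKey` directly from any `&N: AstNode` reads `kind()` and `range()` without first packing the node into an `AnyNodeRef` enum, which `from_ref` still does for callers that already hold one. A simplified sketch of the inlined path (trait and types are stand-ins):

#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
enum NodeKind {
    Expr,
    Stmt,
}

#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
struct NodeKey {
    kind: NodeKind,
    range: (u32, u32),
}

trait AstNode {
    fn kind(&self) -> NodeKind;
    fn range(&self) -> (u32, u32);
}

struct Expr {
    range: (u32, u32),
}

impl AstNode for Expr {
    fn kind(&self) -> NodeKind {
        NodeKind::Expr
    }

    fn range(&self) -> (u32, u32) {
        self.range
    }
}

impl NodeKey {
    // Monomorphized per concrete node type: the two fields are read
    // directly, and no enum wrapper is materialized along the way.
    #[inline]
    fn from_node<N: AstNode>(node: &N) -> Self {
        Self {
            kind: node.kind(),
            range: node.range(),
        }
    }
}

fn main() {
    let expr = Expr { range: (0, 4) };
    let key = NodeKey::from_node(&expr);
    assert_eq!(key, NodeKey { kind: NodeKind::Expr, range: (0, 4) });
}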
View File

@@ -3,7 +3,7 @@ use anyhow::Context;
use salsa::Durability;
use salsa::Setter;
use ruff_db::system::{SystemPath, SystemPathBuf};
use ruff_db::system::SystemPathBuf;
use crate::module_resolver::SearchPaths;
use crate::Db;
@@ -12,12 +12,13 @@ use crate::Db;
pub struct Program {
pub target_version: PythonVersion,
#[default]
#[return_ref]
pub(crate) search_paths: SearchPaths,
}
impl Program {
pub fn from_settings(db: &dyn Db, settings: &ProgramSettings) -> anyhow::Result<Self> {
pub fn from_settings(db: &dyn Db, settings: ProgramSettings) -> anyhow::Result<Self> {
let ProgramSettings {
target_version,
search_paths,
@@ -28,15 +29,16 @@ impl Program {
let search_paths = SearchPaths::from_settings(db, search_paths)
.with_context(|| "Invalid search path settings")?;
Ok(Program::builder(settings.target_version, search_paths)
Ok(Program::builder(settings.target_version)
.durability(Durability::HIGH)
.search_paths(search_paths)
.new(db))
}
pub fn update_search_paths(
self,
&self,
db: &mut dyn Db,
search_path_settings: &SearchPathSettings,
search_path_settings: SearchPathSettings,
) -> anyhow::Result<()> {
let search_paths = SearchPaths::from_settings(db, search_path_settings)?;
@@ -47,20 +49,16 @@ impl Program {
Ok(())
}
pub fn custom_stdlib_search_path(self, db: &dyn Db) -> Option<&SystemPath> {
self.search_paths(db).custom_stdlib()
}
}
#[derive(Clone, Debug, Eq, PartialEq)]
#[derive(Debug, Eq, PartialEq)]
pub struct ProgramSettings {
pub target_version: PythonVersion,
pub search_paths: SearchPathSettings,
}
/// Configures the search paths for module resolution.
#[derive(Eq, PartialEq, Debug, Clone)]
#[derive(Eq, PartialEq, Debug, Clone, Default)]
pub struct SearchPathSettings {
/// List of user-provided paths that should take first priority in the module resolution.
/// Examples in other type checkers are mypy's MYPYPATH environment variable,
@@ -76,25 +74,5 @@ pub struct SearchPathSettings {
pub custom_typeshed: Option<SystemPathBuf>,
/// The path to the user's `site-packages` directory, where third-party packages from ``PyPI`` are installed.
pub site_packages: SitePackages,
}
impl SearchPathSettings {
pub fn new(src_root: SystemPathBuf) -> Self {
Self {
src_root,
extra_paths: vec![],
custom_typeshed: None,
site_packages: SitePackages::Known(vec![]),
}
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
pub enum SitePackages {
Derived {
venv_path: SystemPathBuf,
},
/// Resolved site packages paths
Known(Vec<SystemPathBuf>),
pub site_packages: Vec<SystemPathBuf>,
}
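
`Program` stores already-validated `SearchPaths` rather than raw settings, so validation happens once at construction or update time and downstream queries never repeat it. A salsa-free sketch of that validate-once shape (error handling simplified; in the real code the update goes through a salsa setter, which invalidates dependent queries):

#[derive(Debug, PartialEq)]
struct SearchPathSettings {
    src_root: String,
}

#[derive(Debug, PartialEq)]
struct SearchPaths {
    static_paths: Vec<String>,
}

impl SearchPaths {
    fn from_settings(settings: SearchPathSettings) -> Result<Self, String> {
        if settings.src_root.is_empty() {
            return Err("src root must not be empty".to_string());
        }
        Ok(Self {
            static_paths: vec![settings.src_root],
        })
    }
}

struct Program {
    search_paths: SearchPaths,
}

impl Program {
    // Settings are consumed and validated exactly once.
    fn from_settings(settings: SearchPathSettings) -> Result<Self, String> {
        Ok(Self {
            search_paths: SearchPaths::from_settings(settings)?,
        })
    }

    fn update_search_paths(&mut self, settings: SearchPathSettings) -> Result<(), String> {
        let search_paths = SearchPaths::from_settings(settings)?;
        if self.search_paths != search_paths {
            self.search_paths = search_paths;
        }
        Ok(())
    }
}

fn main() -> Result<(), String> {
    let mut program = Program::from_settings(SearchPathSettings {
        src_root: "/src".to_string(),
    })?;
    program.update_search_paths(SearchPathSettings {
        src_root: "/app".to_string(),
    })?;
    assert_eq!(program.search_paths.static_paths, ["/app"]);
    Ok(())
}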

View File

@@ -575,7 +575,7 @@ def f(a: str, /, b: str, c: int = 1, *args, d: int = 2, **kwargs):
let index = semantic_index(&db, file);
let global_table = symbol_table(&db, global_scope(&db, file));
assert_eq!(names(&global_table), vec!["str", "int", "f"]);
assert_eq!(names(&global_table), vec!["f", "str", "int"]);
let [(function_scope_id, _function_scope)] = index
.child_scopes(FileScopeId::global())
@@ -790,56 +790,6 @@ def f(a: str, /, b: str, c: int = 1, *args, d: int = 2, **kwargs):
assert_eq!(names(&inner_comprehension_symbol_table), vec!["x"]);
}
#[test]
fn with_item_definition() {
let TestCase { db, file } = test_case(
"
with item1 as x, item2 as y:
pass
",
);
let index = semantic_index(&db, file);
let global_table = index.symbol_table(FileScopeId::global());
assert_eq!(names(&global_table), vec!["item1", "x", "item2", "y"]);
let use_def = index.use_def_map(FileScopeId::global());
for name in ["x", "y"] {
let Some(definition) = use_def.first_public_definition(
global_table.symbol_id_by_name(name).expect("symbol exists"),
) else {
panic!("Expected with item definition for {name}");
};
assert!(matches!(definition.node(&db), DefinitionKind::WithItem(_)));
}
}
#[test]
fn with_item_unpacked_definition() {
let TestCase { db, file } = test_case(
"
with context() as (x, y):
pass
",
);
let index = semantic_index(&db, file);
let global_table = index.symbol_table(FileScopeId::global());
assert_eq!(names(&global_table), vec!["context", "x", "y"]);
let use_def = index.use_def_map(FileScopeId::global());
for name in ["x", "y"] {
let Some(definition) = use_def.first_public_definition(
global_table.symbol_id_by_name(name).expect("symbol exists"),
) else {
panic!("Expected with item definition for {name}");
};
assert!(matches!(definition.node(&db), DefinitionKind::WithItem(_)));
}
}
#[test]
fn dupes() {
let TestCase { db, file } = test_case(
@@ -1095,56 +1045,4 @@ match subject:
vec!["subject", "a", "b", "c", "d", "f", "e", "h", "g", "Foo", "i", "j", "k", "l"]
);
}
#[test]
fn for_loops_single_assignment() {
let TestCase { db, file } = test_case("for x in a: pass");
let scope = global_scope(&db, file);
let global_table = symbol_table(&db, scope);
assert_eq!(&names(&global_table), &["a", "x"]);
let use_def = use_def_map(&db, scope);
let definition = use_def
.first_public_definition(global_table.symbol_id_by_name("x").unwrap())
.unwrap();
assert!(matches!(definition.node(&db), DefinitionKind::For(_)));
}
#[test]
fn for_loops_simple_unpacking() {
let TestCase { db, file } = test_case("for (x, y) in a: pass");
let scope = global_scope(&db, file);
let global_table = symbol_table(&db, scope);
assert_eq!(&names(&global_table), &["a", "x", "y"]);
let use_def = use_def_map(&db, scope);
let x_definition = use_def
.first_public_definition(global_table.symbol_id_by_name("x").unwrap())
.unwrap();
let y_definition = use_def
.first_public_definition(global_table.symbol_id_by_name("y").unwrap())
.unwrap();
assert!(matches!(x_definition.node(&db), DefinitionKind::For(_)));
assert!(matches!(y_definition.node(&db), DefinitionKind::For(_)));
}
#[test]
fn for_loops_complex_unpacking() {
let TestCase { db, file } = test_case("for [((a,) b), (c, d)] in e: pass");
let scope = global_scope(&db, file);
let global_table = symbol_table(&db, scope);
assert_eq!(&names(&global_table), &["e", "a", "b", "c", "d"]);
let use_def = use_def_map(&db, scope);
let definition = use_def
.first_public_definition(global_table.symbol_id_by_name("a").unwrap())
.unwrap();
assert!(matches!(definition.node(&db), DefinitionKind::For(_)));
}
}

View File

@@ -197,12 +197,14 @@ pub(crate) mod node_key {
pub(crate) struct ExpressionNodeKey(NodeKey);
impl From<ast::ExpressionRef<'_>> for ExpressionNodeKey {
#[inline]
fn from(value: ast::ExpressionRef<'_>) -> Self {
Self(NodeKey::from_node(value))
Self(NodeKey::from_ref(value))
}
}
impl From<&ast::Expr> for ExpressionNodeKey {
#[inline]
fn from(value: &ast::Expr) -> Self {
Self(NodeKey::from_node(value))
}

View File

@@ -15,7 +15,7 @@ use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::ast_ids::AstIdsBuilder;
use crate::semantic_index::definition::{
AssignmentDefinitionNodeRef, ComprehensionDefinitionNodeRef, Definition, DefinitionNodeKey,
DefinitionNodeRef, ForStmtDefinitionNodeRef, ImportFromDefinitionNodeRef,
DefinitionNodeRef, ImportFromDefinitionNodeRef,
};
use crate::semantic_index::expression::Expression;
use crate::semantic_index::symbol::{
@@ -26,8 +26,6 @@ use crate::semantic_index::use_def::{FlowSnapshot, UseDefMapBuilder};
use crate::semantic_index::SemanticIndex;
use crate::Db;
use super::definition::WithItemDefinitionNodeRef;
pub(super) struct SemanticIndexBuilder<'db> {
// Builder state
db: &'db dyn Db,
@@ -392,6 +390,20 @@ where
self.visit_decorator(decorator);
}
let symbol = self
.add_or_update_symbol(function_def.name.id.clone(), SymbolFlags::IS_DEFINED);
self.add_definition(symbol, function_def);
// The default value of the parameters needs to be evaluated in the
// enclosing scope.
for default in function_def
.parameters
.iter_non_variadic_params()
.filter_map(|param| param.default.as_deref())
{
self.visit_expr(default);
}
self.with_type_params(
NodeWithScopeRef::FunctionTypeParameters(function_def),
function_def.type_params.as_deref(),
@@ -412,21 +424,6 @@ where
builder.pop_scope()
},
);
// The default value of the parameters needs to be evaluated in the
// enclosing scope.
for default in function_def
.parameters
.iter_non_variadic_params()
.filter_map(|param| param.default.as_deref())
{
self.visit_expr(default);
}
// The symbol for the function name itself has to be evaluated
// at the end to match the runtime evaluation of parameter defaults
// and return-type annotations.
let symbol = self
.add_or_update_symbol(function_def.name.id.clone(), SymbolFlags::IS_DEFINED);
self.add_definition(symbol, function_def);
}
ast::Stmt::ClassDef(class) => {
for decorator in &class.decorator_list {
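The comments in this hunk pin down the intended evaluation order for `FunctionDef`: parameter defaults are evaluated eagerly in the enclosing scope, the body gets its own scope, and the function name itself is bound last, matching runtime semantics. A compressed sketch of that visitor ordering (types are stand-ins):

struct Builder {
    scopes: Vec<&'static str>,
    log: Vec<String>,
}

impl Builder {
    fn visit_function_def(&mut self, name: &str, defaults: &[&str]) {
        // 1. Defaults are evaluated eagerly, in the *enclosing* scope.
        for default in defaults {
            let scope = self.scopes.last().unwrap();
            self.log.push(format!("eval default `{default}` in `{scope}` scope"));
        }
        // 2. The body is visited inside its own, new scope.
        self.scopes.push("function");
        self.log.push("visit body".to_string());
        self.scopes.pop();
        // 3. The name is bound last, matching runtime semantics: in
        //    `def f(x=f): ...`, the default sees the *previous* `f`, if any.
        let scope = self.scopes.last().unwrap();
        self.log.push(format!("bind `{name}` in `{scope}` scope"));
    }
}

fn main() {
    let mut builder = Builder {
        scopes: vec!["module"],
        log: vec![],
    };
    builder.visit_function_def("f", &["x=1"]);
    for entry in &builder.log {
        println!("{entry}");
    }
}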
@@ -564,42 +561,9 @@ where
self.flow_merge(break_state);
}
}
ast::Stmt::With(ast::StmtWith { items, body, .. }) => {
for item in items {
self.visit_expr(&item.context_expr);
if let Some(optional_vars) = item.optional_vars.as_deref() {
self.add_standalone_expression(&item.context_expr);
self.current_assignment = Some(item.into());
self.visit_expr(optional_vars);
self.current_assignment = None;
}
}
self.visit_body(body);
}
ast::Stmt::Break(_) => {
self.loop_break_states.push(self.flow_snapshot());
}
ast::Stmt::For(
for_stmt @ ast::StmtFor {
range: _,
is_async: _,
target,
iter,
body,
orelse,
},
) => {
// TODO add control flow similar to `ast::Stmt::While` above
self.add_standalone_expression(iter);
self.visit_expr(iter);
debug_assert!(self.current_assignment.is_none());
self.current_assignment = Some(for_stmt.into());
self.visit_expr(target);
self.current_assignment = None;
self.visit_body(body);
self.visit_body(orelse);
}
_ => {
walk_stmt(self, stmt);
}
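Both the `With` and `For` arms use the same stash-and-clear protocol: set `current_assignment` before visiting the target expression so that the `Name` handler, which has no other context, can classify the binding, then reset it afterwards. A sketch of the protocol with invented node kinds:

#[derive(Copy, Clone, Debug)]
#[allow(dead_code)]
enum CurrentAssignment {
    For,
    WithItem,
}

struct Visitor {
    current_assignment: Option<CurrentAssignment>,
    definitions: Vec<(String, &'static str)>,
}

impl Visitor {
    fn visit_name(&mut self, name: &str) {
        // A `Name` node doesn't know why it is being visited; the stashed
        // context tells us which definition kind to record.
        let kind = match self.current_assignment {
            Some(CurrentAssignment::For) => "for target",
            Some(CurrentAssignment::WithItem) => "with target",
            None => "plain use",
        };
        self.definitions.push((name.to_string(), kind));
    }

    fn visit_for(&mut self, target: &str) {
        debug_assert!(self.current_assignment.is_none());
        self.current_assignment = Some(CurrentAssignment::For);
        self.visit_name(target);
        self.current_assignment = None;
    }
}

fn main() {
    let mut visitor = Visitor {
        current_assignment: None,
        definitions: vec![],
    };
    visitor.visit_for("x");
    visitor.visit_name("y");
    assert_eq!(
        visitor.definitions,
        [("x".to_string(), "for target"), ("y".to_string(), "plain use")]
    );
}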
@@ -646,15 +610,6 @@ where
Some(CurrentAssignment::AugAssign(aug_assign)) => {
self.add_definition(symbol, aug_assign);
}
Some(CurrentAssignment::For(node)) => {
self.add_definition(
symbol,
ForStmtDefinitionNodeRef {
iterable: &node.iter,
target: name_node,
},
);
}
Some(CurrentAssignment::Named(named)) => {
// TODO(dhruvmanila): If the current scope is a comprehension, then the
// named expression is implicitly nonlocal. This is yet to be
@@ -667,15 +622,6 @@ where
ComprehensionDefinitionNodeRef { node, first },
);
}
Some(CurrentAssignment::WithItem(with_item)) => {
self.add_definition(
symbol,
WithItemDefinitionNodeRef {
node: with_item,
target: name_node,
},
);
}
None => {}
}
}
@@ -689,11 +635,11 @@ where
}
ast::Expr::Named(node) => {
debug_assert!(self.current_assignment.is_none());
// TODO walrus in comprehensions is implicitly nonlocal
self.visit_expr(&node.value);
self.current_assignment = Some(node.into());
// TODO walrus in comprehensions is implicitly nonlocal
self.visit_expr(&node.target);
self.current_assignment = None;
self.visit_expr(&node.value);
}
ast::Expr::Lambda(lambda) => {
if let Some(parameters) = &lambda.parameters {
@@ -827,13 +773,11 @@ enum CurrentAssignment<'a> {
Assign(&'a ast::StmtAssign),
AnnAssign(&'a ast::StmtAnnAssign),
AugAssign(&'a ast::StmtAugAssign),
For(&'a ast::StmtFor),
Named(&'a ast::ExprNamed),
Comprehension {
node: &'a ast::Comprehension,
first: bool,
},
WithItem(&'a ast::WithItem),
}
impl<'a> From<&'a ast::StmtAssign> for CurrentAssignment<'a> {
@@ -854,20 +798,8 @@ impl<'a> From<&'a ast::StmtAugAssign> for CurrentAssignment<'a> {
}
}
impl<'a> From<&'a ast::StmtFor> for CurrentAssignment<'a> {
fn from(value: &'a ast::StmtFor) -> Self {
Self::For(value)
}
}
impl<'a> From<&'a ast::ExprNamed> for CurrentAssignment<'a> {
fn from(value: &'a ast::ExprNamed) -> Self {
Self::Named(value)
}
}
impl<'a> From<&'a ast::WithItem> for CurrentAssignment<'a> {
fn from(value: &'a ast::WithItem) -> Self {
Self::WithItem(value)
}
}

View File

@@ -39,7 +39,6 @@ impl<'db> Definition<'db> {
pub(crate) enum DefinitionNodeRef<'a> {
Import(&'a ast::Alias),
ImportFrom(ImportFromDefinitionNodeRef<'a>),
For(ForStmtDefinitionNodeRef<'a>),
Function(&'a ast::StmtFunctionDef),
Class(&'a ast::StmtClassDef),
NamedExpression(&'a ast::ExprNamed),
@@ -48,7 +47,6 @@ pub(crate) enum DefinitionNodeRef<'a> {
AugmentedAssignment(&'a ast::StmtAugAssign),
Comprehension(ComprehensionDefinitionNodeRef<'a>),
Parameter(ast::AnyParameterRef<'a>),
WithItem(WithItemDefinitionNodeRef<'a>),
}
impl<'a> From<&'a ast::StmtFunctionDef> for DefinitionNodeRef<'a> {
@@ -93,24 +91,12 @@ impl<'a> From<ImportFromDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
}
}
impl<'a> From<ForStmtDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(value: ForStmtDefinitionNodeRef<'a>) -> Self {
Self::For(value)
}
}
impl<'a> From<AssignmentDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(node_ref: AssignmentDefinitionNodeRef<'a>) -> Self {
Self::Assignment(node_ref)
}
}
impl<'a> From<WithItemDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(node_ref: WithItemDefinitionNodeRef<'a>) -> Self {
Self::WithItem(node_ref)
}
}
impl<'a> From<ComprehensionDefinitionNodeRef<'a>> for DefinitionNodeRef<'a> {
fn from(node: ComprehensionDefinitionNodeRef<'a>) -> Self {
Self::Comprehension(node)
@@ -135,18 +121,6 @@ pub(crate) struct AssignmentDefinitionNodeRef<'a> {
pub(crate) target: &'a ast::ExprName,
}
#[derive(Copy, Clone, Debug)]
pub(crate) struct WithItemDefinitionNodeRef<'a> {
pub(crate) node: &'a ast::WithItem,
pub(crate) target: &'a ast::ExprName,
}
#[derive(Copy, Clone, Debug)]
pub(crate) struct ForStmtDefinitionNodeRef<'a> {
pub(crate) iterable: &'a ast::Expr,
pub(crate) target: &'a ast::ExprName,
}
#[derive(Copy, Clone, Debug)]
pub(crate) struct ComprehensionDefinitionNodeRef<'a> {
pub(crate) node: &'a ast::Comprehension,
@@ -187,12 +161,6 @@ impl DefinitionNodeRef<'_> {
DefinitionNodeRef::AugmentedAssignment(augmented_assignment) => {
DefinitionKind::AugmentedAssignment(AstNodeRef::new(parsed, augmented_assignment))
}
DefinitionNodeRef::For(ForStmtDefinitionNodeRef { iterable, target }) => {
DefinitionKind::For(ForStmtDefinitionKind {
iterable: AstNodeRef::new(parsed.clone(), iterable),
target: AstNodeRef::new(parsed, target),
})
}
DefinitionNodeRef::Comprehension(ComprehensionDefinitionNodeRef { node, first }) => {
DefinitionKind::Comprehension(ComprehensionDefinitionKind {
node: AstNodeRef::new(parsed, node),
@@ -207,12 +175,6 @@ impl DefinitionNodeRef<'_> {
DefinitionKind::ParameterWithDefault(AstNodeRef::new(parsed, parameter))
}
},
DefinitionNodeRef::WithItem(WithItemDefinitionNodeRef { node, target }) => {
DefinitionKind::WithItem(WithItemDefinitionKind {
node: AstNodeRef::new(parsed.clone(), node),
target: AstNodeRef::new(parsed, target),
})
}
}
}
@@ -231,16 +193,11 @@ impl DefinitionNodeRef<'_> {
}) => target.into(),
Self::AnnotatedAssignment(node) => node.into(),
Self::AugmentedAssignment(node) => node.into(),
Self::For(ForStmtDefinitionNodeRef {
iterable: _,
target,
}) => target.into(),
Self::Comprehension(ComprehensionDefinitionNodeRef { node, first: _ }) => node.into(),
Self::Parameter(node) => match node {
ast::AnyParameterRef::Variadic(parameter) => parameter.into(),
ast::AnyParameterRef::NonVariadic(parameter) => parameter.into(),
},
Self::WithItem(WithItemDefinitionNodeRef { node: _, target }) => target.into(),
}
}
}
@@ -255,11 +212,9 @@ pub enum DefinitionKind {
Assignment(AssignmentDefinitionKind),
AnnotatedAssignment(AstNodeRef<ast::StmtAnnAssign>),
AugmentedAssignment(AstNodeRef<ast::StmtAugAssign>),
For(ForStmtDefinitionKind),
Comprehension(ComprehensionDefinitionKind),
Parameter(AstNodeRef<ast::Parameter>),
ParameterWithDefault(AstNodeRef<ast::ParameterWithDefault>),
WithItem(WithItemDefinitionKind),
}
#[derive(Clone, Debug)]
@@ -295,6 +250,7 @@ impl ImportFromDefinitionKind {
}
#[derive(Clone, Debug)]
#[allow(dead_code)]
pub struct AssignmentDefinitionKind {
assignment: AstNodeRef<ast::StmtAssign>,
target: AstNodeRef<ast::ExprName>,
@@ -310,38 +266,6 @@ impl AssignmentDefinitionKind {
}
}
#[derive(Clone, Debug)]
pub struct WithItemDefinitionKind {
node: AstNodeRef<ast::WithItem>,
target: AstNodeRef<ast::ExprName>,
}
impl WithItemDefinitionKind {
pub(crate) fn node(&self) -> &ast::WithItem {
self.node.node()
}
pub(crate) fn target(&self) -> &ast::ExprName {
self.target.node()
}
}
#[derive(Clone, Debug)]
pub struct ForStmtDefinitionKind {
iterable: AstNodeRef<ast::Expr>,
target: AstNodeRef<ast::ExprName>,
}
impl ForStmtDefinitionKind {
pub(crate) fn iterable(&self) -> &ast::Expr {
self.iterable.node()
}
pub(crate) fn target(&self) -> &ast::ExprName {
self.target.node()
}
}
#[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)]
pub(crate) struct DefinitionNodeKey(NodeKey);
@@ -387,12 +311,6 @@ impl From<&ast::StmtAugAssign> for DefinitionNodeKey {
}
}
impl From<&ast::StmtFor> for DefinitionNodeKey {
fn from(value: &ast::StmtFor) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::Comprehension> for DefinitionNodeKey {
fn from(node: &ast::Comprehension) -> Self {
Self(NodeKey::from_node(node))

View File

@@ -184,9 +184,14 @@ mod tests {
Program::from_settings(
&db,
&ProgramSettings {
ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings::new(SystemPathBuf::from("/src")),
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: SystemPathBuf::from("/src"),
site_packages: vec![],
custom_typeshed: None,
},
},
)?;

View File

@@ -1,9 +1,8 @@
use ruff_db::files::File;
use ruff_python_ast as ast;
use ruff_python_ast::name::Name;
use crate::builtins::builtins_scope;
use crate::semantic_index::ast_ids::HasScopedAstId;
use crate::semantic_index::definition::{Definition, DefinitionKind};
use crate::semantic_index::definition::Definition;
use crate::semantic_index::symbol::{ScopeId, ScopedSymbolId};
use crate::semantic_index::{
global_scope, semantic_index, symbol_table, use_def_map, DefinitionWithConstraints,
@@ -15,8 +14,7 @@ use crate::{Db, FxOrderSet};
pub(crate) use self::builder::{IntersectionBuilder, UnionBuilder};
pub(crate) use self::diagnostic::TypeCheckDiagnostics;
pub(crate) use self::infer::{
infer_deferred_types, infer_definition_types, infer_expression_types, infer_scope_types,
TypeInference,
infer_definition_types, infer_expression_types, infer_scope_types, TypeInference,
};
mod builder;
@@ -90,24 +88,6 @@ pub(crate) fn definition_ty<'db>(db: &'db dyn Db, definition: Definition<'db>) -
inference.definition_ty(definition)
}
/// Infer the type of a (possibly deferred) sub-expression of a [`Definition`].
///
/// ## Panics
/// If the given expression is not a sub-expression of the given [`Definition`].
pub(crate) fn definition_expression_ty<'db>(
db: &'db dyn Db,
definition: Definition<'db>,
expression: &ast::Expr,
) -> Type<'db> {
let expr_id = expression.scoped_ast_id(db, definition.scope(db));
let inference = infer_definition_types(db, definition);
if let Some(ty) = inference.try_expression_ty(expr_id) {
ty
} else {
infer_deferred_types(db, definition).expression_ty(expr_id)
}
}
/// Infer the combined type of an array of [`Definition`]s, plus one optional "unbound type".
///
/// Will return a union if there is more than one definition, or at least one plus an unbound
@@ -201,13 +181,6 @@ pub enum Type<'db> {
IntLiteral(i64),
/// A boolean literal, either `True` or `False`.
BooleanLiteral(bool),
/// A string literal
StringLiteral(StringLiteralType<'db>),
/// A string known to originate only from literal values, but whose value is not known (unlike
/// `StringLiteral` above).
LiteralString,
/// A bytes literal
BytesLiteral(BytesLiteralType<'db>),
// TODO protocols, callable types, overloads, generics, type vars
}
@@ -263,7 +236,7 @@ impl<'db> Type<'db> {
/// us to explicitly consider whether to handle an error or propagate
/// it up the call stack.
#[must_use]
pub fn member(&self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
pub fn member(&self, db: &'db dyn Db, name: &Name) -> Type<'db> {
match self {
Type::Any => Type::Any,
Type::Never => {
@@ -303,20 +276,6 @@ impl<'db> Type<'db> {
Type::Unknown
}
Type::BooleanLiteral(_) => Type::Unknown,
Type::StringLiteral(_) => {
// TODO defer to `typing.LiteralString`/`builtins.str` methods
// from typeshed's stubs
Type::Unknown
}
Type::LiteralString => {
// TODO defer to `typing.LiteralString`/`builtins.str` methods
// from typeshed's stubs
Type::Unknown
}
Type::BytesLiteral(_) => {
// TODO defer to Type::Instance(<bytes from typeshed>).member
Type::Unknown
}
}
}
@@ -334,7 +293,7 @@ impl<'db> Type<'db> {
#[salsa::interned]
pub struct FunctionType<'db> {
/// name of the function at definition
pub name: ast::name::Name,
pub name: Name,
/// types of all decorators on this function
decorators: Vec<Type<'db>>,
@@ -349,33 +308,19 @@ impl<'db> FunctionType<'db> {
#[salsa::interned]
pub struct ClassType<'db> {
/// Name of the class at definition
pub name: ast::name::Name,
pub name: Name,
definition: Definition<'db>,
/// Types of all class bases
bases: Vec<Type<'db>>,
body_scope: ScopeId<'db>,
}
impl<'db> ClassType<'db> {
/// Return an iterator over the types of this class's bases.
///
/// # Panics
/// If `definition` is not a `DefinitionKind::Class`.
pub fn bases(&self, db: &'db dyn Db) -> impl Iterator<Item = Type<'db>> {
let definition = self.definition(db);
let DefinitionKind::Class(class_stmt_node) = definition.node(db) else {
panic!("Class type definition must have DefinitionKind::Class");
};
class_stmt_node
.bases()
.iter()
.map(move |base_expr| definition_expression_ty(db, definition, base_expr))
}
/// Returns the class member of this class named `name`.
///
/// The member resolves to a member of the class itself or any of its bases.
pub fn class_member(self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
pub fn class_member(self, db: &'db dyn Db, name: &Name) -> Type<'db> {
let member = self.own_class_member(db, name);
if !member.is_unbound() {
return member;
@@ -385,12 +330,12 @@ impl<'db> ClassType<'db> {
}
/// Returns the inferred type of the class member named `name`.
pub fn own_class_member(self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
pub fn own_class_member(self, db: &'db dyn Db, name: &Name) -> Type<'db> {
let scope = self.body_scope(db);
symbol_ty_by_name(db, scope, name)
}
pub fn inherited_class_member(self, db: &'db dyn Db, name: &ast::name::Name) -> Type<'db> {
pub fn inherited_class_member(self, db: &'db dyn Db, name: &Name) -> Type<'db> {
for base in self.bases(db) {
let member = base.member(db, name);
if !member.is_unbound() {
@@ -427,18 +372,6 @@ pub struct IntersectionType<'db> {
negative: FxOrderSet<Type<'db>>,
}
#[salsa::interned]
pub struct StringLiteralType<'db> {
#[return_ref]
value: Box<str>,
}
#[salsa::interned]
pub struct BytesLiteralType<'db> {
#[return_ref]
value: Box<[u8]>,
}
#[cfg(test)]
mod tests {
use anyhow::Context;
@@ -459,9 +392,14 @@ mod tests {
Program::from_settings(
&db,
&ProgramSettings {
ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings::new(SystemPathBuf::from("/src")),
search_paths: SearchPathSettings {
extra_paths: Vec::new(),
src_root: SystemPathBuf::from("/src"),
site_packages: vec![],
custom_typeshed: None,
},
},
)
.expect("Valid search path settings");
@@ -487,7 +425,7 @@ mod tests {
let foo = system_path_to_file(&db, "src/foo.py").context("Failed to resolve foo.py")?;
let diagnostics = super::check_types(&db, foo);
assert_diagnostic_messages(&diagnostics, &["Cannot resolve import 'bar'."]);
assert_diagnostic_messages(&diagnostics, &["Import 'bar' could not be resolved."]);
Ok(())
}
@@ -500,7 +438,7 @@ mod tests {
.unwrap();
let foo = system_path_to_file(&db, "src/foo.py").unwrap();
let diagnostics = super::check_types(&db, foo);
assert_diagnostic_messages(&diagnostics, &["Cannot resolve import 'bar'."]);
assert_diagnostic_messages(&diagnostics, &["Import 'bar' could not be resolved."]);
}
#[test]
@@ -512,9 +450,15 @@ mod tests {
let b_file = system_path_to_file(&db, "/src/b.py").unwrap();
let b_file_diagnostics = super::check_types(&db, b_file);
assert_diagnostic_messages(&b_file_diagnostics, &["Module 'a' has no member 'thing'"]);
assert_diagnostic_messages(
&b_file_diagnostics,
&["Could not resolve import of 'thing' from 'a'"],
);
}
#[ignore = "\
A spurious second 'Unresolved import' diagnostic message is emitted on `b.py`, \
despite the symbol existing in the symbol table for `a.py`"]
#[test]
fn resolved_import_of_symbol_from_unresolved_import() {
let mut db = setup_db();
@@ -527,7 +471,10 @@ mod tests {
let a_file = system_path_to_file(&db, "/src/a.py").unwrap();
let a_file_diagnostics = super::check_types(&db, a_file);
assert_diagnostic_messages(&a_file_diagnostics, &["Cannot resolve import 'foo'."]);
assert_diagnostic_messages(
&a_file_diagnostics,
&["Import 'foo' could not be resolved."],
);
// Importing the unresolved import into a second first-party file should not trigger
// an additional "unresolved import" violation

View File

@@ -2,9 +2,6 @@
use std::fmt::{Display, Formatter};
use ruff_python_ast::str::Quote;
use ruff_python_literal::escape::AsciiEscape;
use crate::types::{IntersectionType, Type, UnionType};
use crate::Db;
@@ -41,20 +38,6 @@ impl Display for DisplayType<'_> {
Type::BooleanLiteral(boolean) => {
write!(f, "Literal[{}]", if *boolean { "True" } else { "False" })
}
Type::StringLiteral(string) => write!(
f,
r#"Literal["{}"]"#,
string.value(self.db).replace('"', r#"\""#)
),
Type::LiteralString => write!(f, "LiteralString"),
Type::BytesLiteral(bytes) => {
let escape =
AsciiEscape::with_preferred_quote(bytes.value(self.db).as_ref(), Quote::Double);
f.write_str("Literal[")?;
escape.bytes_repr().write(f)?;
f.write_str("]")
}
}
}
}

File diff suppressed because it is too large

View File

@@ -11,6 +11,7 @@ repository = { workspace = true }
license = { workspace = true }
[dependencies]
red_knot_python_semantic = { workspace = true }
red_knot_workspace = { workspace = true }
ruff_db = { workspace = true }
ruff_notebook = { workspace = true }

View File

@@ -3,11 +3,11 @@
use std::num::NonZeroUsize;
use std::panic::PanicInfo;
use lsp_server::Message;
use lsp_server as lsp;
use lsp_types as types;
use lsp_types::{
ClientCapabilities, DiagnosticOptions, DiagnosticServerCapabilities, MessageType,
ServerCapabilities, TextDocumentSyncCapability, TextDocumentSyncKind, TextDocumentSyncOptions,
Url,
ClientCapabilities, DiagnosticOptions, NotebookCellSelector, NotebookDocumentSyncOptions,
NotebookSelector, TextDocumentSyncCapability, TextDocumentSyncOptions,
};
use self::connection::{Connection, ConnectionInitializer};
@@ -74,7 +74,7 @@ impl Server {
init_params.client_info.as_ref(),
);
let mut workspace_for_url = |url: Url| {
let mut workspace_for_url = |url: lsp_types::Url| {
let Some(workspace_settings) = workspace_settings.as_mut() else {
return (url, ClientSettings::default());
};
@@ -93,18 +93,13 @@ impl Server {
}).collect())
.or_else(|| {
tracing::warn!("No workspace(s) were provided during initialization. Using the current working directory as a default workspace...");
let uri = Url::from_file_path(std::env::current_dir().ok()?).ok()?;
let uri = types::Url::from_file_path(std::env::current_dir().ok()?).ok()?;
Some(vec![workspace_for_url(uri)])
})
.ok_or_else(|| {
anyhow::anyhow!("Failed to get the current working directory while creating a default workspace.")
})?;
if workspaces.len() > 1 {
// TODO(dhruvmanila): Support multi-root workspaces
anyhow::bail!("Multi-root workspaces are not supported yet");
}
Ok(Self {
connection,
worker_threads,
@@ -154,7 +149,7 @@ impl Server {
try_show_message(
"The Ruff language server exited with a panic. See the logs for more details."
.to_string(),
MessageType::ERROR,
lsp_types::MessageType::ERROR,
)
.ok();
}));
@@ -187,9 +182,9 @@ impl Server {
break;
}
let task = match msg {
Message::Request(req) => api::request(req),
Message::Notification(notification) => api::notification(notification),
Message::Response(response) => scheduler.response(response),
lsp::Message::Request(req) => api::request(req),
lsp::Message::Notification(notification) => api::notification(notification),
lsp::Message::Response(response) => scheduler.response(response),
};
scheduler.dispatch(task);
}
@@ -211,17 +206,28 @@ impl Server {
.unwrap_or_default()
}
fn server_capabilities(position_encoding: PositionEncoding) -> ServerCapabilities {
ServerCapabilities {
fn server_capabilities(position_encoding: PositionEncoding) -> types::ServerCapabilities {
types::ServerCapabilities {
position_encoding: Some(position_encoding.into()),
diagnostic_provider: Some(DiagnosticServerCapabilities::Options(DiagnosticOptions {
identifier: Some(crate::DIAGNOSTIC_NAME.into()),
..Default::default()
diagnostic_provider: Some(types::DiagnosticServerCapabilities::Options(
DiagnosticOptions {
identifier: Some(crate::DIAGNOSTIC_NAME.into()),
..Default::default()
},
)),
notebook_document_sync: Some(types::OneOf::Left(NotebookDocumentSyncOptions {
save: Some(false),
notebook_selector: [NotebookSelector::ByCells {
notebook: None,
cells: vec![NotebookCellSelector {
language: "python".to_string(),
}],
}]
.to_vec(),
})),
text_document_sync: Some(TextDocumentSyncCapability::Options(
TextDocumentSyncOptions {
open_close: Some(true),
change: Some(TextDocumentSyncKind::INCREMENTAL),
..Default::default()
},
)),

View File

@@ -1,15 +1,13 @@
use crate::{server::schedule::Task, session::Session, system::url_to_system_path};
use lsp_server as server;
use crate::server::schedule::Task;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
mod diagnostics;
mod notifications;
mod requests;
mod traits;
use notifications as notification;
use red_knot_workspace::db::RootDatabase;
use requests as request;
use self::traits::{NotificationHandler, RequestHandler};
@@ -45,7 +43,6 @@ pub(super) fn notification<'a>(notif: server::Notification) -> Task<'a> {
match notif.method.as_str() {
notification::DidCloseTextDocumentHandler::METHOD => local_notification_task::<notification::DidCloseTextDocumentHandler>(notif),
notification::DidOpenTextDocumentHandler::METHOD => local_notification_task::<notification::DidOpenTextDocumentHandler>(notif),
notification::DidChangeTextDocumentHandler::METHOD => local_notification_task::<notification::DidChangeTextDocumentHandler>(notif),
notification::DidOpenNotebookHandler::METHOD => {
local_notification_task::<notification::DidOpenNotebookHandler>(notif)
}
@@ -85,18 +82,12 @@ fn background_request_task<'a, R: traits::BackgroundDocumentRequestHandler>(
Ok(Task::background(schedule, move |session: &Session| {
let url = R::document_url(&params).into_owned();
let Ok(path) = url_to_any_system_path(&url) else {
let Ok(path) = url_to_system_path(&url) else {
return Box::new(|_, _| {});
};
let db = match path {
AnySystemPath::System(path) => {
match session.workspace_db_for_path(path.as_std_path()) {
Some(db) => db.snapshot(),
None => session.default_workspace_db().snapshot(),
}
}
AnySystemPath::SystemVirtual(_) => session.default_workspace_db().snapshot(),
};
let db = session
.workspace_db_for_path(path.as_std_path())
.map(RootDatabase::snapshot);
let Some(snapshot) = session.take_snapshot(url) else {
return Box::new(|_, _| {});

View File

@@ -1,11 +1,9 @@
mod did_change;
mod did_close;
mod did_close_notebook;
mod did_open;
mod did_open_notebook;
mod set_trace;
pub(super) use did_change::DidChangeTextDocumentHandler;
pub(super) use did_close::DidCloseTextDocumentHandler;
pub(super) use did_close_notebook::DidCloseNotebookHandler;
pub(super) use did_open::DidOpenTextDocumentHandler;

View File

@@ -1,55 +0,0 @@
use lsp_server::ErrorCode;
use lsp_types::notification::DidChangeTextDocument;
use lsp_types::DidChangeTextDocumentParams;
use red_knot_workspace::watch::ChangeEvent;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
pub(crate) struct DidChangeTextDocumentHandler;
impl NotificationHandler for DidChangeTextDocumentHandler {
type NotificationType = DidChangeTextDocument;
}
impl SyncNotificationHandler for DidChangeTextDocumentHandler {
fn run(
session: &mut Session,
_notifier: Notifier,
_requester: &mut Requester,
params: DidChangeTextDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_any_system_path(&params.text_document.uri) else {
return Ok(());
};
let key = session.key_from_url(params.text_document.uri);
session
.update_text_document(&key, params.content_changes, params.text_document.version)
.with_failure_code(ErrorCode::InternalError)?;
match path {
AnySystemPath::System(path) => {
let db = match session.workspace_db_for_path_mut(path.as_std_path()) {
Some(db) => db,
None => session.default_workspace_db_mut(),
};
db.apply_changes(vec![ChangeEvent::file_content_changed(path)], None);
}
AnySystemPath::SystemVirtual(virtual_path) => {
let db = session.default_workspace_db_mut();
db.apply_changes(vec![ChangeEvent::ChangedVirtual(virtual_path)], None);
}
}
// TODO(dhruvmanila): Publish diagnostics if the client doesn't support pull diagnostics
Ok(())
}
}

View File

@@ -1,7 +1,8 @@
use lsp_server::ErrorCode;
use lsp_types::notification::DidCloseTextDocument;
use lsp_types::DidCloseTextDocumentParams;
use red_knot_workspace::watch::ChangeEvent;
use ruff_db::files::File;
use crate::server::api::diagnostics::clear_diagnostics;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
@@ -9,7 +10,7 @@ use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
use crate::system::url_to_system_path;
pub(crate) struct DidCloseTextDocumentHandler;
@@ -24,7 +25,7 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
_requester: &mut Requester,
params: DidCloseTextDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_any_system_path(&params.text_document.uri) else {
let Ok(path) = url_to_system_path(&params.text_document.uri) else {
return Ok(());
};
@@ -33,9 +34,8 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
.close_document(&key)
.with_failure_code(ErrorCode::InternalError)?;
if let AnySystemPath::SystemVirtual(virtual_path) = path {
let db = session.default_workspace_db_mut();
db.apply_changes(vec![ChangeEvent::DeletedVirtual(virtual_path)], None);
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
File::sync_path(db, &path);
}
clear_diagnostics(key.url(), &notifier)?;

View File

@@ -1,14 +1,14 @@
use lsp_types::notification::DidCloseNotebookDocument;
use lsp_types::DidCloseNotebookDocumentParams;
use red_knot_workspace::watch::ChangeEvent;
use ruff_db::files::File;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
use crate::system::url_to_system_path;
pub(crate) struct DidCloseNotebookHandler;
@@ -23,7 +23,7 @@ impl SyncNotificationHandler for DidCloseNotebookHandler {
_requester: &mut Requester,
params: DidCloseNotebookDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_any_system_path(&params.notebook_document.uri) else {
let Ok(path) = url_to_system_path(&params.notebook_document.uri) else {
return Ok(());
};
@@ -32,9 +32,8 @@ impl SyncNotificationHandler for DidCloseNotebookHandler {
.close_document(&key)
.with_failure_code(lsp_server::ErrorCode::InternalError)?;
if let AnySystemPath::SystemVirtual(virtual_path) = path {
let db = session.default_workspace_db_mut();
db.apply_changes(vec![ChangeEvent::DeletedVirtual(virtual_path)], None);
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
File::sync_path(db, &path);
}
Ok(())

View File

@@ -1,14 +1,13 @@
use lsp_types::notification::DidOpenTextDocument;
use lsp_types::DidOpenTextDocumentParams;
use red_knot_workspace::watch::ChangeEvent;
use ruff_db::Db;
use ruff_db::files::system_path_to_file;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
use crate::system::url_to_system_path;
use crate::TextDocument;
pub(crate) struct DidOpenTextDocumentHandler;
@@ -24,25 +23,17 @@ impl SyncNotificationHandler for DidOpenTextDocumentHandler {
_requester: &mut Requester,
params: DidOpenTextDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_any_system_path(&params.text_document.uri) else {
let Ok(path) = url_to_system_path(&params.text_document.uri) else {
return Ok(());
};
let document = TextDocument::new(params.text_document.text, params.text_document.version);
session.open_text_document(params.text_document.uri, document);
match path {
AnySystemPath::System(path) => {
let db = match session.workspace_db_for_path_mut(path.as_std_path()) {
Some(db) => db,
None => session.default_workspace_db_mut(),
};
db.apply_changes(vec![ChangeEvent::Opened(path)], None);
}
AnySystemPath::SystemVirtual(virtual_path) => {
let db = session.default_workspace_db_mut();
db.files().virtual_file(db, &virtual_path);
}
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
// TODO(dhruvmanila): Store the `file` in `DocumentController`
let file = system_path_to_file(db, &path).unwrap();
file.sync(db);
}
// TODO(dhruvmanila): Publish diagnostics if the client doesn't support pull diagnostics

View File

@@ -2,8 +2,7 @@ use lsp_server::ErrorCode;
use lsp_types::notification::DidOpenNotebookDocument;
use lsp_types::DidOpenNotebookDocumentParams;
use red_knot_workspace::watch::ChangeEvent;
use ruff_db::Db;
use ruff_db::files::system_path_to_file;
use crate::edit::NotebookDocument;
use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
@@ -11,7 +10,7 @@ use crate::server::api::LSPResult;
use crate::server::client::{Notifier, Requester};
use crate::server::Result;
use crate::session::Session;
use crate::system::{url_to_any_system_path, AnySystemPath};
use crate::system::url_to_system_path;
pub(crate) struct DidOpenNotebookHandler;
@@ -26,7 +25,7 @@ impl SyncNotificationHandler for DidOpenNotebookHandler {
_requester: &mut Requester,
params: DidOpenNotebookDocumentParams,
) -> Result<()> {
let Ok(path) = url_to_any_system_path(&params.notebook_document.uri) else {
let Ok(path) = url_to_system_path(&params.notebook_document.uri) else {
return Ok(());
};
@@ -39,18 +38,10 @@ impl SyncNotificationHandler for DidOpenNotebookHandler {
.with_failure_code(ErrorCode::InternalError)?;
session.open_notebook_document(params.notebook_document.uri.clone(), notebook);
match path {
AnySystemPath::System(path) => {
let db = match session.workspace_db_for_path_mut(path.as_std_path()) {
Some(db) => db,
None => session.default_workspace_db_mut(),
};
db.apply_changes(vec![ChangeEvent::Opened(path)], None);
}
AnySystemPath::SystemVirtual(virtual_path) => {
let db = session.default_workspace_db_mut();
db.files().virtual_file(db, &virtual_path);
}
if let Some(db) = session.workspace_db_for_path_mut(path.as_std_path()) {
// TODO(dhruvmanila): Store the `file` in `DocumentController`
let file = system_path_to_file(db, &path).unwrap();
file.sync(db);
}
// TODO(dhruvmanila): Publish diagnostics if the client doesn't support pull diagnostics

View File

@@ -26,11 +26,13 @@ impl BackgroundDocumentRequestHandler for DocumentDiagnosticRequestHandler {
fn run_with_snapshot(
snapshot: DocumentSnapshot,
db: RootDatabase,
db: Option<RootDatabase>,
_notifier: Notifier,
_params: DocumentDiagnosticParams,
) -> Result<DocumentDiagnosticReportResult> {
let diagnostics = compute_diagnostics(&snapshot, &db);
let diagnostics = db
.map(|db| compute_diagnostics(&snapshot, &db))
.unwrap_or_default();
Ok(DocumentDiagnosticReportResult::Report(
DocumentDiagnosticReport::Full(RelatedFullDocumentDiagnosticReport {
@@ -46,19 +48,10 @@ impl BackgroundDocumentRequestHandler for DocumentDiagnosticRequestHandler {
fn compute_diagnostics(snapshot: &DocumentSnapshot, db: &RootDatabase) -> Vec<Diagnostic> {
let Some(file) = snapshot.file(db) else {
tracing::info!(
"No file found for snapshot for '{}'",
snapshot.query().file_url()
);
return vec![];
};
let diagnostics = match db.check_file(file) {
Ok(diagnostics) => diagnostics,
Err(cancelled) => {
tracing::info!("Diagnostics computation {cancelled}");
return vec![];
}
let Ok(diagnostics) = db.check_file(file) else {
return vec![];
};
diagnostics
@@ -72,12 +65,12 @@ fn to_lsp_diagnostic(message: &str) -> Diagnostic {
let words = message.split(':').collect::<Vec<_>>();
let (range, message) = match words.as_slice() {
[_, _, line, column, message] | [_, line, column, message] => {
let line = line.parse::<u32>().unwrap_or_default().saturating_sub(1);
[_filename, line, column, message] => {
let line = line.parse::<u32>().unwrap_or_default();
let column = column.parse::<u32>().unwrap_or_default();
(
Range::new(
Position::new(line, column.saturating_sub(1)),
Position::new(line.saturating_sub(1), column.saturating_sub(1)),
Position::new(line, column),
),
message.trim(),

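The hunk above converts the checker's one-based line and column into the LSP's zero-based positions; `saturating_sub(1)` also keeps a reported `0` from underflowing. A sketch of the conversion, assuming the `<file>:<line>:<column>: <message>` shape:

fn to_zero_based(message: &str) -> Option<((u32, u32), &str)> {
    // Assumed input shape: "<file>:<line>:<column>: <message>", 1-based.
    let parts: Vec<&str> = message.split(':').collect();
    let [_file, line, column, text] = parts.as_slice() else {
        return None;
    };
    let line = line.trim().parse::<u32>().ok()?.saturating_sub(1);
    let column = column.trim().parse::<u32>().ok()?.saturating_sub(1);
    Some(((line, column), text.trim()))
}

fn main() {
    // LSP positions are zero-based, so line 3, column 1 becomes (2, 0).
    assert_eq!(
        to_zero_based("src/a.py:3:1: Cannot resolve import 'bar'."),
        Some(((2, 0), "Cannot resolve import 'bar'."))
    );
}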
View File

@@ -34,7 +34,7 @@ pub(super) trait BackgroundDocumentRequestHandler: RequestHandler {
fn run_with_snapshot(
snapshot: DocumentSnapshot,
db: RootDatabase,
db: Option<RootDatabase>,
notifier: Notifier,
params: <<Self as RequestHandler>::RequestType as Request>::Params,
) -> super::Result<<<Self as RequestHandler>::RequestType as Request>::Result>;

View File

@@ -6,16 +6,16 @@ use std::path::{Path, PathBuf};
use std::sync::Arc;
use anyhow::anyhow;
use lsp_types::{ClientCapabilities, TextDocumentContentChangeEvent, Url};
use lsp_types::{ClientCapabilities, Url};
use red_knot_python_semantic::{ProgramSettings, PythonVersion, SearchPathSettings};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File};
use ruff_db::system::SystemPath;
use ruff_db::Db;
use crate::edit::{DocumentKey, DocumentVersion, NotebookDocument};
use crate::system::{url_to_any_system_path, AnySystemPath, LSPSystem};
use crate::edit::{DocumentKey, NotebookDocument};
use crate::system::{url_to_system_path, LSPSystem};
use crate::{PositionEncoding, TextDocument};
pub(crate) use self::capabilities::ResolvedClientCapabilities;
@@ -67,10 +67,19 @@ impl Session {
.ok_or_else(|| anyhow!("Workspace path is not a valid UTF-8 path: {:?}", path))?;
let system = LSPSystem::new(index.clone());
let metadata = WorkspaceMetadata::from_path(system_path, &system)?;
// TODO(dhruvmanila): Get the values from the client settings
let metadata = WorkspaceMetadata::from_path(system_path, &system, None)?;
let program_settings = ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: system_path.to_path_buf(),
site_packages: vec![],
custom_typeshed: None,
},
};
// TODO(micha): Handle the case where the program settings are incorrect more gracefully.
workspaces.insert(path, RootDatabase::new(metadata, system)?);
workspaces.insert(path, RootDatabase::new(metadata, program_settings, system)?);
}
Ok(Self {
@@ -83,12 +92,6 @@ impl Session {
})
}
// TODO(dhruvmanila): Ideally, we should have a single method for `workspace_db_for_path_mut`
// and `default_workspace_db_mut` but the borrow checker doesn't allow that.
// https://github.com/astral-sh/ruff/pull/13041#discussion_r1726725437
/// Returns a reference to the workspace [`RootDatabase`] corresponding to the given path, if
/// any.
pub(crate) fn workspace_db_for_path(&self, path: impl AsRef<Path>) -> Option<&RootDatabase> {
self.workspaces
.range(..=path.as_ref().to_path_buf())
@@ -96,8 +99,6 @@ impl Session {
.map(|(_, db)| db)
}
/// Returns a mutable reference to the workspace [`RootDatabase`] corresponding to the given
/// path, if any.
pub(crate) fn workspace_db_for_path_mut(
&mut self,
path: impl AsRef<Path>,
@@ -108,19 +109,6 @@ impl Session {
.map(|(_, db)| db)
}
/// Returns a reference to the default workspace [`RootDatabase`]. The default workspace is the
/// minimum root path in the workspace map.
pub(crate) fn default_workspace_db(&self) -> &RootDatabase {
// SAFETY: Currently, red knot only supports a single workspace.
self.workspaces.values().next().unwrap()
}
/// Returns a mutable reference to the default workspace [`RootDatabase`].
pub(crate) fn default_workspace_db_mut(&mut self) -> &mut RootDatabase {
// SAFETY: Currently, red knot only supports a single workspace.
self.workspaces.values_mut().next().unwrap()
}
pub fn key_from_url(&self, url: Url) -> DocumentKey {
self.index().key_from_url(url)
}
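The lookup above exploits the `BTreeMap` ordering: `range(..=path).next_back()` returns the closest workspace root that sorts at or before the query path. A sketch of the idiom; the `starts_with` guard is added here for illustration, whereas the diffed code instead falls back to a default workspace:

use std::collections::BTreeMap;
use std::path::{Path, PathBuf};

fn workspace_for_path<'a>(
    workspaces: &'a BTreeMap<PathBuf, &'static str>,
    path: &Path,
) -> Option<&'a &'static str> {
    workspaces
        .range(..=path.to_path_buf())
        .next_back()
        // Reject a sibling root that merely sorts before the path.
        .filter(|(root, _)| path.starts_with(root))
        .map(|(_, db)| db)
}

fn main() {
    let mut workspaces = BTreeMap::new();
    workspaces.insert(PathBuf::from("/work/app"), "app-db");
    workspaces.insert(PathBuf::from("/work/lib"), "lib-db");

    let found = workspace_for_path(&workspaces, Path::new("/work/lib/src/main.py"));
    assert_eq!(found, Some(&"lib-db"));

    let missing = workspace_for_path(&workspaces, Path::new("/elsewhere/x.py"));
    assert_eq!(missing, None);
}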
@@ -147,20 +135,6 @@ impl Session {
self.index_mut().open_text_document(url, document);
}
/// Updates a text document at the associated `key`.
///
/// The document key must point to a text document, or this will throw an error.
pub(crate) fn update_text_document(
&mut self,
key: &DocumentKey,
content_changes: Vec<TextDocumentContentChangeEvent>,
new_version: DocumentVersion,
) -> crate::Result<()> {
let position_encoding = self.position_encoding;
self.index_mut()
.update_text_document(key, content_changes, new_version, position_encoding)
}
/// De-registers a document, specified by its key.
/// Calling this multiple times for the same document is a logic error.
pub(crate) fn close_document(&mut self, key: &DocumentKey) -> crate::Result<()> {
@@ -247,7 +221,6 @@ impl Drop for MutIndexGuard<'_> {
/// An immutable snapshot of `Session` that references
/// a specific document.
#[derive(Debug)]
pub struct DocumentSnapshot {
resolved_client_capabilities: Arc<ResolvedClientCapabilities>,
document_ref: index::DocumentQuery,
@@ -268,12 +241,7 @@ impl DocumentSnapshot {
}
pub(crate) fn file(&self, db: &RootDatabase) -> Option<File> {
match url_to_any_system_path(self.document_ref.file_url()).ok()? {
AnySystemPath::System(path) => system_path_to_file(db, path).ok(),
AnySystemPath::SystemVirtual(virtual_path) => db
.files()
.try_virtual_file(&virtual_path)
.map(|virtual_file| virtual_file.file()),
}
let path = url_to_system_path(self.document_ref.file_url()).ok()?;
system_path_to_file(db, path).ok()
}
}

View File

@@ -8,40 +8,27 @@ use ruff_db::file_revision::FileRevision;
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
use ruff_db::system::{
DirectoryEntry, FileType, Metadata, OsSystem, Result, System, SystemPath, SystemPathBuf,
SystemVirtualPath, SystemVirtualPathBuf,
SystemVirtualPath,
};
use ruff_notebook::{Notebook, NotebookError};
use crate::session::index::Index;
use crate::DocumentQuery;
/// Converts the given [`Url`] to an [`AnySystemPath`].
///
/// If the URL scheme is `file`, then the path is converted to a [`SystemPathBuf`]. Otherwise, the
/// URL is converted to a [`SystemVirtualPathBuf`].
/// Converts the given [`Url`] to a [`SystemPathBuf`].
///
/// This fails in the following cases:
/// * The URL scheme is not `file`.
/// * The URL cannot be converted to a file path (refer to [`Url::to_file_path`]).
/// * The URL is not a valid UTF-8 string.
pub(crate) fn url_to_any_system_path(url: &Url) -> std::result::Result<AnySystemPath, ()> {
pub(crate) fn url_to_system_path(url: &Url) -> std::result::Result<SystemPathBuf, ()> {
if url.scheme() == "file" {
Ok(AnySystemPath::System(
SystemPathBuf::from_path_buf(url.to_file_path()?).map_err(|_| ())?,
))
Ok(SystemPathBuf::from_path_buf(url.to_file_path()?).map_err(|_| ())?)
} else {
Ok(AnySystemPath::SystemVirtual(
SystemVirtualPath::new(url.as_str()).to_path_buf(),
))
Err(())
}
}
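
A usage sketch of the `AnySystemPath` flavor of this conversion (the richer of the two signatures shown here), assuming the `url` crate and a Unix-style host; the `untitled:` scheme is just one example of a non-`file` URL an editor might send:

```rust
use url::Url;

fn demo() -> Result<(), ()> {
    // `file://` URLs resolve to concrete paths on disk.
    let on_disk = Url::parse("file:///workspace/main.py").map_err(|_| ())?;
    assert!(matches!(
        url_to_any_system_path(&on_disk)?,
        AnySystemPath::System(_)
    ));

    // Everything else (e.g. an editor's unsaved buffer) becomes a virtual
    // path keyed by the full URL string.
    let unsaved = Url::parse("untitled:Untitled-1").map_err(|_| ())?;
    assert!(matches!(
        url_to_any_system_path(&unsaved)?,
        AnySystemPath::SystemVirtual(_)
    ));

    Ok(())
}
```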
/// Represents either a [`SystemPath`] or a [`SystemVirtualPath`].
#[derive(Debug)]
pub(crate) enum AnySystemPath {
System(SystemPathBuf),
SystemVirtual(SystemVirtualPathBuf),
}
#[derive(Debug)]
pub(crate) struct LSPSystem {
/// A read-only copy of the index where the server stores all the open documents and settings.
@@ -157,6 +144,19 @@ impl System for LSPSystem {
}
}
fn virtual_path_metadata(&self, path: &SystemVirtualPath) -> Result<Metadata> {
// Virtual paths only exist in the LSP system, so we don't need to check the OS system.
let document = self
.system_virtual_path_to_document_ref(path)?
.ok_or_else(|| virtual_path_not_found(path))?;
Ok(Metadata::new(
document_revision(&document),
None,
FileType::File,
))
}
fn read_virtual_path_to_string(&self, path: &SystemVirtualPath) -> Result<String> {
let document = self
.system_virtual_path_to_document_ref(path)?

View File

@@ -3,8 +3,8 @@ use std::any::Any;
use js_sys::Error;
use wasm_bindgen::prelude::*;
use red_knot_python_semantic::{ProgramSettings, SearchPathSettings};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::workspace::settings::Configuration;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File};
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
@@ -41,17 +41,16 @@ impl Workspace {
#[wasm_bindgen(constructor)]
pub fn new(root: &str, settings: &Settings) -> Result<Workspace, Error> {
let system = WasmSystem::new(SystemPath::new(root));
let workspace = WorkspaceMetadata::from_path(
SystemPath::new(root),
&system,
Some(Configuration {
target_version: Some(settings.target_version.into()),
..Configuration::default()
}),
)
.map_err(into_error)?;
let workspace =
WorkspaceMetadata::from_path(SystemPath::new(root), &system).map_err(into_error)?;
let db = RootDatabase::new(workspace, system.clone()).map_err(into_error)?;
let program_settings = ProgramSettings {
target_version: settings.target_version.into(),
search_paths: SearchPathSettings::default(),
};
let db =
RootDatabase::new(workspace, program_settings, system.clone()).map_err(into_error)?;
Ok(Self { db, system })
}
@@ -234,6 +233,13 @@ impl System for WasmSystem {
Notebook::from_source_code(&content)
}
fn virtual_path_metadata(
&self,
_path: &SystemVirtualPath,
) -> ruff_db::system::Result<Metadata> {
Err(not_found())
}
fn read_virtual_path_to_string(
&self,
_path: &SystemVirtualPath,

View File

@@ -11,14 +11,14 @@ fn check() {
};
let mut workspace = Workspace::new("/", &settings).expect("Workspace to be created");
workspace
let test = workspace
.open_file("test.py", "import random22\n")
.expect("File to be opened");
let result = workspace.check().expect("Check to succeed");
let result = workspace.check_file(&test).expect("Check to succeed");
assert_eq!(
result,
vec!["/test.py:1:8: Cannot resolve import 'random22'."]
vec!["/test.py:1:8: Import 'random22' could not be resolved.",]
);
}

View File

@@ -22,13 +22,13 @@ ruff_text_size = { workspace = true }
anyhow = { workspace = true }
crossbeam = { workspace = true }
notify = { workspace = true }
rayon = { workspace = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
[dev-dependencies]
ruff_db = { workspace = true, features = ["testing"] }
ruff_db = { workspace = true, features = ["testing"]}
[lints]
workspace = true

View File

@@ -1,2 +0,0 @@
x = 0
(x := x + 1)

View File

@@ -1,3 +0,0 @@
x = 0
if x := x + 1:
pass

View File

@@ -1,11 +0,0 @@
def bool(x) -> bool:
return True
class MyClass: ...
def MyClass() -> MyClass: ...
def x(self) -> x: ...

View File

@@ -1,2 +0,0 @@
def bool(x=bool):
return x

View File

@@ -1,3 +0,0 @@
match x:
case [1, 0] if x := x[:0]:
y = 1

View File

@@ -1,14 +1,15 @@
use std::panic::RefUnwindSafe;
use std::sync::Arc;
use salsa::plumbing::ZalsaDatabase;
use salsa::{Cancelled, Event};
use red_knot_python_semantic::{vendored_typeshed_stubs, Db as SemanticDb, Program};
use red_knot_python_semantic::{
vendored_typeshed_stubs, Db as SemanticDb, Program, ProgramSettings,
};
use ruff_db::files::{File, Files};
use ruff_db::system::System;
use ruff_db::vendored::VendoredFileSystem;
use ruff_db::{Db as SourceDb, Upcast};
use salsa::plumbing::ZalsaDatabase;
use salsa::{Cancelled, Event};
use crate::workspace::{check_file, Workspace, WorkspaceMetadata};
@@ -26,7 +27,11 @@ pub struct RootDatabase {
}
impl RootDatabase {
pub fn new<S>(workspace: WorkspaceMetadata, system: S) -> anyhow::Result<Self>
pub fn new<S>(
workspace: WorkspaceMetadata,
settings: ProgramSettings,
system: S,
) -> anyhow::Result<Self>
where
S: System + 'static + Send + Sync + RefUnwindSafe,
{
@@ -37,11 +42,11 @@ impl RootDatabase {
system: Arc::new(system),
};
let workspace = Workspace::from_metadata(&db, workspace);
// Initialize the `Program` singleton
Program::from_settings(&db, workspace.settings().program())?;
db.workspace = Some(Workspace::from_metadata(&db, workspace));
Program::from_settings(&db, settings)?;
db.workspace = Some(workspace);
Ok(db)
}
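
For orientation, here is a condensed sketch of the three-argument construction, mirroring the `setup_db` helpers that appear later in this diff (the `/project` path and default version are illustrative):

```rust
use red_knot_python_semantic::{ProgramSettings, PythonVersion, SearchPathSettings};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::system::{OsSystem, SystemPath};

fn open_project() -> anyhow::Result<RootDatabase> {
    let root = SystemPath::new("/project");
    let system = OsSystem::new(root);
    let workspace = WorkspaceMetadata::from_path(root, &system)?;

    // Settings are now passed in explicitly rather than read off the workspace.
    let settings = ProgramSettings {
        target_version: PythonVersion::default(),
        search_paths: SearchPathSettings {
            extra_paths: vec![],
            src_root: root.to_path_buf(),
            site_packages: vec![],
            custom_typeshed: None,
        },
    };

    RootDatabase::new(workspace, settings, system)
}
```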
@@ -56,8 +61,6 @@ impl RootDatabase {
}
pub fn check_file(&self, file: File) -> Result<Vec<String>, Cancelled> {
let _span = tracing::debug_span!("check_file", file=%file.path(self)).entered();
self.with_db(|db| check_file(db, file))
}
@@ -157,9 +160,8 @@ impl Db for RootDatabase {}
#[cfg(test)]
pub(crate) mod tests {
use std::sync::Arc;
use salsa::Event;
use std::sync::Arc;
use red_knot_python_semantic::{vendored_typeshed_stubs, Db as SemanticDb};
use ruff_db::files::Files;

View File

@@ -1,33 +1,22 @@
use red_knot_python_semantic::Program;
use rustc_hash::FxHashSet;
use ruff_db::files::{system_path_to_file, File, Files};
use ruff_db::system::walk_directory::WalkState;
use ruff_db::system::SystemPath;
use ruff_db::Db;
use rustc_hash::FxHashSet;
use crate::db::RootDatabase;
use crate::watch;
use crate::watch::{CreatedKind, DeletedKind};
use crate::workspace::settings::Configuration;
use crate::workspace::WorkspaceMetadata;
impl RootDatabase {
#[tracing::instrument(level = "debug", skip(self, changes, base_configuration))]
pub fn apply_changes(
&mut self,
changes: Vec<watch::ChangeEvent>,
base_configuration: Option<&Configuration>,
) {
#[tracing::instrument(level = "debug", skip(self, changes))]
pub fn apply_changes(&mut self, changes: Vec<watch::ChangeEvent>) {
let workspace = self.workspace();
let workspace_path = workspace.root(self).to_path_buf();
let program = Program::get(self);
let custom_stdlib_versions_path = program
.custom_stdlib_search_path(self)
.map(|path| path.join("VERSIONS"));
let mut workspace_change = false;
// Changes to a custom stdlib path's VERSIONS
let mut custom_stdlib_change = false;
// Packages that need reloading
let mut changed_packages = FxHashSet::default();
// Paths that were added
@@ -50,7 +39,7 @@ impl RootDatabase {
};
for change in changes {
if let Some(path) = change.system_path() {
if let Some(path) = change.path() {
if matches!(
path.file_name(),
Some(".gitignore" | ".ignore" | "ruff.toml" | ".ruff.toml" | "pyproject.toml")
@@ -65,15 +54,10 @@ impl RootDatabase {
continue;
}
if Some(path) == custom_stdlib_versions_path.as_deref() {
custom_stdlib_change = true;
}
}
match change {
watch::ChangeEvent::Changed { path, kind: _ }
| watch::ChangeEvent::Opened(path) => sync_path(self, &path),
watch::ChangeEvent::Changed { path, kind: _ } => sync_path(self, &path),
watch::ChangeEvent::Created { kind, path } => {
match kind {
@@ -116,13 +100,7 @@ impl RootDatabase {
} else {
sync_recursively(self, &path);
if custom_stdlib_versions_path
.as_ref()
.is_some_and(|versions_path| versions_path.starts_with(&path))
{
custom_stdlib_change = true;
}
// TODO: Remove after converting `package.files()` to a salsa query.
if let Some(package) = workspace.package(self, &path) {
changed_packages.insert(package);
} else {
@@ -131,17 +109,6 @@ impl RootDatabase {
}
}
watch::ChangeEvent::CreatedVirtual(path)
| watch::ChangeEvent::ChangedVirtual(path) => {
File::sync_virtual_path(self, &path);
}
watch::ChangeEvent::DeletedVirtual(path) => {
if let Some(virtual_file) = self.files().try_virtual_file(&path) {
virtual_file.close(self);
}
}
watch::ChangeEvent::Rescan => {
workspace_change = true;
Files::sync_all(self);
@@ -151,11 +118,7 @@ impl RootDatabase {
}
if workspace_change {
match WorkspaceMetadata::from_path(
&workspace_path,
self.system(),
base_configuration.cloned(),
) {
match WorkspaceMetadata::from_path(&workspace_path, self.system()) {
Ok(metadata) => {
tracing::debug!("Reloading workspace after structural change.");
// TODO: Handle changes in the program settings.
@@ -167,11 +130,6 @@ impl RootDatabase {
}
return;
} else if custom_stdlib_change {
let search_paths = workspace.search_path_settings(self).clone();
if let Err(error) = program.update_search_paths(self, &search_paths) {
tracing::error!("Failed to set the new search paths: {error}");
}
}
let mut added_paths = added_paths.into_iter().filter(|path| {

View File

@@ -1,4 +1,5 @@
pub mod db;
pub mod lint;
pub mod site_packages;
pub mod watch;
pub mod workspace;

View File

@@ -7,7 +7,7 @@ use red_knot_python_semantic::types::Type;
use red_knot_python_semantic::{HasTy, ModuleName, SemanticModel};
use ruff_db::files::File;
use ruff_db::parsed::{parsed_module, ParsedModule};
use ruff_db::source::{source_text, SourceText};
use ruff_db::source::{line_index, source_text, SourceText};
use ruff_python_ast as ast;
use ruff_python_ast::visitor::{walk_expr, walk_stmt, Visitor};
use ruff_text_size::{Ranged, TextSize};
@@ -48,6 +48,19 @@ pub(crate) fn lint_syntax(db: &dyn Db, file_id: File) -> Vec<String> {
};
visitor.visit_body(&ast.body);
diagnostics = visitor.diagnostics;
} else {
let path = file_id.path(db);
let line_index = line_index(db.upcast(), file_id);
diagnostics.extend(parsed.errors().iter().map(|err| {
let source_location = line_index.source_location(err.location.start(), source.as_str());
format!(
"{}:{}:{}: {}",
path.as_str(),
source_location.row,
source_location.column,
err,
)
}));
}
diagnostics
@@ -273,9 +286,14 @@ mod tests {
Program::from_settings(
&db,
&ProgramSettings {
ProgramSettings {
target_version: PythonVersion::default(),
search_paths: SearchPathSettings::new(src_root),
search_paths: SearchPathSettings {
extra_paths: Vec::new(),
src_root,
site_packages: vec![],
custom_typeshed: None,
},
},
)
.expect("Valid program settings");

View File

@@ -13,10 +13,9 @@ use std::io;
use std::num::NonZeroUsize;
use std::ops::Deref;
use red_knot_python_semantic::PythonVersion;
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use crate::PythonVersion;
type SitePackagesDiscoveryResult<T> = Result<T, SitePackagesDiscoveryError>;
/// Abstraction for a Python virtual environment.
@@ -25,7 +24,7 @@ type SitePackagesDiscoveryResult<T> = Result<T, SitePackagesDiscoveryError>;
/// The format of this file is not defined anywhere, and exactly which keys are present
/// depends on the tool that was used to create the virtual environment.
#[derive(Debug)]
pub(crate) struct VirtualEnvironment {
pub struct VirtualEnvironment {
venv_path: SysPrefixPath,
base_executable_home_path: PythonHomePath,
include_system_site_packages: bool,
@@ -42,7 +41,7 @@ pub(crate) struct VirtualEnvironment {
}
impl VirtualEnvironment {
pub(crate) fn new(
pub fn new(
path: impl AsRef<SystemPath>,
system: &dyn System,
) -> SitePackagesDiscoveryResult<Self> {
@@ -158,7 +157,7 @@ impl VirtualEnvironment {
/// Return a list of `site-packages` directories that are available from this virtual environment
///
/// See the documentation for `site_packages_dir_from_sys_prefix` for more details.
pub(crate) fn site_packages_directories(
pub fn site_packages_directories(
&self,
system: &dyn System,
) -> SitePackagesDiscoveryResult<Vec<SystemPathBuf>> {
@@ -205,7 +204,7 @@ System site-packages will not be used for module resolution.",
}
#[derive(Debug, thiserror::Error)]
pub(crate) enum SitePackagesDiscoveryError {
pub enum SitePackagesDiscoveryError {
#[error("Invalid --venv-path argument: {0} could not be canonicalized")]
VenvDirCanonicalizationError(SystemPathBuf, #[source] io::Error),
#[error("Invalid --venv-path argument: {0} does not point to a directory on disk")]
@@ -222,7 +221,7 @@ pub(crate) enum SitePackagesDiscoveryError {
/// The various ways in which parsing a `pyvenv.cfg` file could fail
#[derive(Debug)]
pub(crate) enum PyvenvCfgParseErrorKind {
pub enum PyvenvCfgParseErrorKind {
TooManyEquals { line_number: NonZeroUsize },
MalformedKeyValuePair { line_number: NonZeroUsize },
NoHomeKey,
@@ -371,7 +370,7 @@ fn site_packages_directory_from_sys_prefix(
///
/// [`sys.prefix`]: https://docs.python.org/3/library/sys.html#sys.prefix
#[derive(Debug, PartialEq, Eq, Clone)]
pub(crate) struct SysPrefixPath(SystemPathBuf);
pub struct SysPrefixPath(SystemPathBuf);
impl SysPrefixPath {
fn new(

View File

@@ -1,4 +1,4 @@
use ruff_db::system::{SystemPath, SystemPathBuf, SystemVirtualPathBuf};
use ruff_db::system::{SystemPath, SystemPathBuf};
pub use watcher::{directory_watcher, EventHandler, Watcher};
pub use workspace_watcher::WorkspaceWatcher;
@@ -20,9 +20,6 @@ mod workspace_watcher;
/// event instead of emitting an event for each file or subdirectory in that path.
#[derive(Debug, PartialEq, Eq)]
pub enum ChangeEvent {
/// The file corresponding to the given path was opened in an editor.
Opened(SystemPathBuf),
/// A new path was created
Created {
path: SystemPathBuf,
@@ -41,15 +38,6 @@ pub enum ChangeEvent {
kind: DeletedKind,
},
/// A new virtual path was created.
CreatedVirtual(SystemVirtualPathBuf),
/// The content of a virtual path was changed.
ChangedVirtual(SystemVirtualPathBuf),
/// A virtual path was deleted.
DeletedVirtual(SystemVirtualPathBuf),
/// The file watcher failed to observe some changes and now is out of sync with the file system.
///
/// This can happen if many files are changed at once. The consumer should rescan all files to catch up
@@ -58,27 +46,16 @@ pub enum ChangeEvent {
}
impl ChangeEvent {
/// Creates a new [`Changed`] event for the file content at the given path.
///
/// [`Changed`]: ChangeEvent::Changed
pub fn file_content_changed(path: SystemPathBuf) -> ChangeEvent {
ChangeEvent::Changed {
path,
kind: ChangedKind::FileContent,
}
}
pub fn file_name(&self) -> Option<&str> {
self.system_path().and_then(|path| path.file_name())
self.path().and_then(|path| path.file_name())
}
pub fn system_path(&self) -> Option<&SystemPath> {
pub fn path(&self) -> Option<&SystemPath> {
match self {
ChangeEvent::Opened(path)
| ChangeEvent::Created { path, .. }
ChangeEvent::Created { path, .. }
| ChangeEvent::Changed { path, .. }
| ChangeEvent::Deleted { path, .. } => Some(path),
_ => None,
ChangeEvent::Rescan => None,
}
}
}
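
Because this hunk interleaves both sides of the change, here is a sketch of a consumer written against the richer variant set (the one including `Opened` and the virtual-path events); the `log` vector stands in for real handling, and `Display` on the path types is assumed:

```rust
// Sketch: dispatching on the variant set that includes `Opened` and the
// virtual-path events, mirroring how `apply_changes` routes them.
fn handle(event: &ChangeEvent, log: &mut Vec<String>) {
    match event {
        ChangeEvent::Opened(path) | ChangeEvent::Changed { path, .. } => {
            log.push(format!("sync {path}"));
        }
        ChangeEvent::Created { path, .. } => log.push(format!("created {path}")),
        ChangeEvent::Deleted { path, .. } => log.push(format!("deleted {path}")),
        ChangeEvent::CreatedVirtual(path) | ChangeEvent::ChangedVirtual(path) => {
            log.push(format!("sync virtual {path}"));
        }
        ChangeEvent::DeletedVirtual(path) => log.push(format!("close virtual {path}")),
        ChangeEvent::Rescan => log.push("rescan everything".to_string()),
    }
}
```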

View File

@@ -5,8 +5,6 @@ use salsa::{Durability, Setter as _};
pub use metadata::{PackageMetadata, WorkspaceMetadata};
use red_knot_python_semantic::types::check_types;
use red_knot_python_semantic::SearchPathSettings;
use ruff_db::parsed::parsed_module;
use ruff_db::source::{line_index, source_text, SourceDiagnostic};
use ruff_db::{
files::{system_path_to_file, File},
@@ -15,8 +13,7 @@ use ruff_db::{
use ruff_python_ast::{name::Name, PySourceType};
use ruff_text_size::Ranged;
use crate::db::RootDatabase;
use crate::workspace::files::{Index, Indexed, IndexedIter, PackageFiles};
use crate::workspace::files::{Index, Indexed, PackageFiles};
use crate::{
db::Db,
lint::{lint_semantic, lint_syntax},
@@ -24,7 +21,6 @@ use crate::{
mod files;
mod metadata;
pub mod settings;
/// The project workspace as a Salsa ingredient.
///
@@ -85,10 +81,6 @@ pub struct Workspace {
/// The (first-party) packages in this workspace.
#[return_ref]
package_tree: BTreeMap<SystemPathBuf, Package>,
/// The unresolved search path configuration.
#[return_ref]
pub search_path_settings: SearchPathSettings,
}
/// A first-party package in a workspace.
@@ -117,14 +109,10 @@ impl Workspace {
packages.insert(package.root.clone(), Package::from_metadata(db, package));
}
Workspace::builder(
metadata.root,
packages,
metadata.settings.program.search_paths,
)
.durability(Durability::MEDIUM)
.open_fileset_durability(Durability::LOW)
.new(db)
Workspace::builder(metadata.root, packages)
.durability(Durability::MEDIUM)
.open_fileset_durability(Durability::LOW)
.new(db)
}
pub fn root(self, db: &dyn Db) -> &SystemPath {
@@ -155,11 +143,6 @@ impl Workspace {
new_packages.insert(path, package);
}
if &metadata.settings.program.search_paths != self.search_path_settings(db) {
self.set_search_path_settings(db)
.to(metadata.settings.program.search_paths);
}
self.set_package_tree(db).to(new_packages);
}
@@ -191,35 +174,23 @@ impl Workspace {
}
/// Checks all open files in the workspace and its dependencies.
pub fn check(self, db: &RootDatabase) -> Vec<String> {
let workspace_span = tracing::debug_span!("check_workspace");
let _span = workspace_span.enter();
#[tracing::instrument(level = "debug", skip_all)]
pub fn check(self, db: &dyn Db) -> Vec<String> {
tracing::debug!("Checking workspace");
let files = WorkspaceFiles::new(db, self);
let result = Arc::new(std::sync::Mutex::new(Vec::new()));
let inner_result = Arc::clone(&result);
let db = db.snapshot();
let workspace_span = workspace_span.clone();
let mut result = Vec::new();
rayon::scope(move |scope| {
for file in &files {
let result = inner_result.clone();
let db = db.snapshot();
let workspace_span = workspace_span.clone();
scope.spawn(move |_| {
let check_file_span = tracing::debug_span!(parent: &workspace_span, "check_file", file=%file.path(&db));
let _entered = check_file_span.entered();
let file_diagnostics = check_file(&db, file);
result.lock().unwrap().extend(file_diagnostics);
});
if let Some(open_files) = self.open_files(db) {
for file in open_files {
result.extend_from_slice(&check_file(db, *file));
}
});
} else {
for package in self.packages(db) {
result.extend(package.check(db));
}
}
Arc::into_inner(result).unwrap().into_inner().unwrap()
result
}
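
The `rayon` side of this hunk fans per-file checks out to worker tasks and funnels diagnostics through an `Arc<Mutex<Vec<_>>>`. A self-contained sketch of just that collection pattern (toy "diagnostics" in place of `check_file`):

```rust
use std::sync::{Arc, Mutex};

fn check_in_parallel(files: Vec<&'static str>) -> Vec<String> {
    let result = Arc::new(Mutex::new(Vec::new()));
    let inner_result = Arc::clone(&result);

    rayon::scope(move |scope| {
        for file in files {
            let result = inner_result.clone();
            scope.spawn(move |_| {
                // Stand-in for `check_file(&db, file)`.
                result.lock().unwrap().push(format!("checked {file}"));
            });
        }
    });

    // All workers are done once `scope` returns, so this unwrap is safe.
    Arc::into_inner(result).unwrap().into_inner().unwrap()
}
```

The per-iteration clones mirror the original, which hands each spawned task its own `Arc` handle and database snapshot; `Arc::into_inner` succeeds afterwards because `rayon::scope` only returns once every task has finished.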
/// Opens a file in the workspace.
@@ -337,6 +308,19 @@ impl Package {
index.insert(file);
}
#[tracing::instrument(level = "debug", skip(db))]
pub(crate) fn check(self, db: &dyn Db) -> Vec<String> {
tracing::debug!("Checking package '{}'", self.root(db));
let mut result = Vec::new();
for file in &self.files(db) {
let diagnostics = check_file(db, file);
result.extend_from_slice(&diagnostics);
}
result
}
/// Returns the files belonging to this package.
pub fn files(self, db: &dyn Db) -> Indexed<'_> {
let files = self.file_set(db);
@@ -347,7 +331,11 @@ impl Package {
tracing::debug_span!("index_package_files", package = %self.name(db)).entered();
let files = discover_package_files(db, self.root(db));
tracing::info!("Found {} files in package '{}'", files.len(), self.name(db));
tracing::info!(
"Indexed {} files for package '{}'",
files.len(),
self.name(db)
);
vacant.set(files)
}
Index::Indexed(indexed) => indexed,
@@ -384,7 +372,9 @@ impl Package {
#[salsa::tracked]
pub(super) fn check_file(db: &dyn Db, file: File) -> Vec<String> {
tracing::debug!("Checking file '{path}'", path = file.path(db));
let path = file.path(db);
let _span = tracing::debug_span!("check_file", file=%path).entered();
tracing::debug!("Checking file '{path}'");
let mut diagnostics = Vec::new();
@@ -403,17 +393,6 @@ pub(super) fn check_file(db: &dyn Db, file: File) -> Vec<String> {
return diagnostics;
}
let parsed = parsed_module(db.upcast(), file);
if !parsed.errors().is_empty() {
let path = file.path(db);
let line_index = line_index(db.upcast(), file);
diagnostics.extend(parsed.errors().iter().map(|err| {
let source_location = line_index.source_location(err.location.start(), source.as_str());
format!("{path}:{source_location}: {message}", message = err.error)
}));
}
for diagnostic in check_types(db.upcast(), file) {
let index = line_index(db.upcast(), diagnostic.file());
let location = index.source_location(diagnostic.start(), source.as_str());
@@ -472,73 +451,6 @@ fn discover_package_files(db: &dyn Db, path: &SystemPath) -> FxHashSet<File> {
files
}
#[derive(Debug)]
enum WorkspaceFiles<'a> {
OpenFiles(&'a FxHashSet<File>),
PackageFiles(Vec<Indexed<'a>>),
}
impl<'a> WorkspaceFiles<'a> {
fn new(db: &'a dyn Db, workspace: Workspace) -> Self {
if let Some(open_files) = workspace.open_files(db) {
WorkspaceFiles::OpenFiles(open_files)
} else {
WorkspaceFiles::PackageFiles(
workspace
.packages(db)
.map(|package| package.files(db))
.collect(),
)
}
}
}
impl<'a> IntoIterator for &'a WorkspaceFiles<'a> {
type Item = File;
type IntoIter = WorkspaceFilesIter<'a>;
fn into_iter(self) -> Self::IntoIter {
match self {
WorkspaceFiles::OpenFiles(files) => WorkspaceFilesIter::OpenFiles(files.iter()),
WorkspaceFiles::PackageFiles(package_files) => {
let mut package_files = package_files.iter();
WorkspaceFilesIter::PackageFiles {
current: package_files.next().map(IntoIterator::into_iter),
package_files,
}
}
}
}
}
enum WorkspaceFilesIter<'db> {
OpenFiles(std::collections::hash_set::Iter<'db, File>),
PackageFiles {
package_files: std::slice::Iter<'db, Indexed<'db>>,
current: Option<IndexedIter<'db>>,
},
}
impl Iterator for WorkspaceFilesIter<'_> {
type Item = File;
fn next(&mut self) -> Option<Self::Item> {
match self {
WorkspaceFilesIter::OpenFiles(files) => files.next().copied(),
WorkspaceFilesIter::PackageFiles {
package_files,
current,
} => loop {
if let Some(file) = current.as_mut().and_then(Iterator::next) {
return Some(file);
}
*current = Some(package_files.next()?.into_iter());
},
}
}
}
#[cfg(test)]
mod tests {
use ruff_db::files::system_path_to_file;

View File

@@ -158,11 +158,9 @@ impl Deref for Indexed<'_> {
}
}
pub(super) type IndexedIter<'a> = std::iter::Copied<std::collections::hash_set::Iter<'a, File>>;
impl<'a> IntoIterator for &'a Indexed<'_> {
type Item = File;
type IntoIter = IndexedIter<'a>;
type IntoIter = std::iter::Copied<std::collections::hash_set::Iter<'a, File>>;
fn into_iter(self) -> Self::IntoIter {
self.files.iter().copied()

View File

@@ -1,4 +1,3 @@
use crate::workspace::settings::{Configuration, WorkspaceSettings};
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ruff_python_ast::name::Name;
@@ -8,8 +7,6 @@ pub struct WorkspaceMetadata {
/// The (first-party) packages in this workspace.
pub(super) packages: Vec<PackageMetadata>,
pub(super) settings: WorkspaceSettings,
}
/// A first-party package in a workspace.
@@ -24,11 +21,7 @@ pub struct PackageMetadata {
impl WorkspaceMetadata {
/// Discovers the closest workspace at `path` and returns its metadata.
pub fn from_path(
path: &SystemPath,
system: &dyn System,
base_configuration: Option<Configuration>,
) -> anyhow::Result<WorkspaceMetadata> {
pub fn from_path(path: &SystemPath, system: &dyn System) -> anyhow::Result<WorkspaceMetadata> {
assert!(
system.is_directory(path),
"Workspace root path must be a directory"
@@ -45,20 +38,9 @@ impl WorkspaceMetadata {
root: root.clone(),
};
// TODO: Load the configuration from disk.
let mut configuration = Configuration::default();
if let Some(base_configuration) = base_configuration {
configuration.extend(base_configuration);
}
// TODO: Respect the package configurations when resolving settings (e.g. for the target version).
let settings = configuration.into_workspace_settings(&root);
let workspace = WorkspaceMetadata {
root,
packages: vec![package],
settings,
};
Ok(workspace)
@@ -71,10 +53,6 @@ impl WorkspaceMetadata {
pub fn packages(&self) -> &[PackageMetadata] {
&self.packages
}
pub fn settings(&self) -> &WorkspaceSettings {
&self.settings
}
}
impl PackageMetadata {

View File

@@ -1,89 +0,0 @@
use red_knot_python_semantic::{ProgramSettings, PythonVersion, SearchPathSettings, SitePackages};
use ruff_db::system::{SystemPath, SystemPathBuf};
/// The resolved configurations.
///
/// The main difference to [`Configuration`] is that default values are filled in.
#[derive(Debug, Clone)]
pub struct WorkspaceSettings {
pub(super) program: ProgramSettings,
}
impl WorkspaceSettings {
pub fn program(&self) -> &ProgramSettings {
&self.program
}
}
/// The configuration for the workspace or a package.
#[derive(Debug, Default, Clone)]
pub struct Configuration {
pub target_version: Option<PythonVersion>,
pub search_paths: SearchPathConfiguration,
}
impl Configuration {
/// Extends this configuration by using the values from `with` for all values that are absent in `self`.
pub fn extend(&mut self, with: Configuration) {
self.target_version = self.target_version.or(with.target_version);
self.search_paths.extend(with.search_paths);
}
pub fn into_workspace_settings(self, workspace_root: &SystemPath) -> WorkspaceSettings {
WorkspaceSettings {
program: ProgramSettings {
target_version: self.target_version.unwrap_or_default(),
search_paths: self.search_paths.into_settings(workspace_root),
},
}
}
}
#[derive(Debug, Default, Clone, Eq, PartialEq)]
pub struct SearchPathConfiguration {
/// List of user-provided paths that should take first priority in the module resolution.
/// Examples in other type checkers are mypy's MYPYPATH environment variable,
/// or pyright's stubPath configuration setting.
pub extra_paths: Option<Vec<SystemPathBuf>>,
/// The root of the workspace, used for finding first-party modules.
pub src_root: Option<SystemPathBuf>,
/// Optional path to a "custom typeshed" directory on disk for us to use for standard-library types.
/// If this is not provided, we will fall back to our vendored typeshed stubs for the stdlib,
/// bundled as a zip file in the binary.
pub custom_typeshed: Option<SystemPathBuf>,
/// The path to the user's `site-packages` directory, where third-party packages from ``PyPI`` are installed.
pub site_packages: Option<SitePackages>,
}
impl SearchPathConfiguration {
pub fn into_settings(self, workspace_root: &SystemPath) -> SearchPathSettings {
let site_packages = self.site_packages.unwrap_or(SitePackages::Known(vec![]));
SearchPathSettings {
extra_paths: self.extra_paths.unwrap_or_default(),
src_root: self
.src_root
.unwrap_or_else(|| workspace_root.to_path_buf()),
custom_typeshed: self.custom_typeshed,
site_packages,
}
}
pub fn extend(&mut self, with: SearchPathConfiguration) {
if let Some(extra_paths) = with.extra_paths {
self.extra_paths.get_or_insert(extra_paths);
}
if let Some(src_root) = with.src_root {
self.src_root.get_or_insert(src_root);
}
if let Some(custom_typeshed) = with.custom_typeshed {
self.custom_typeshed.get_or_insert(custom_typeshed);
}
if let Some(site_packages) = with.site_packages {
self.site_packages.get_or_insert(site_packages);
}
}
}
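
The layering rule in `extend` is "first value wins": anything already set on `self` survives, and the argument only fills gaps (`get_or_insert` keeps existing values). A hedged sketch of how a caller might layer a CLI-derived base configuration under a workspace configuration, using the types from this (deleted) file:

```rust
use red_knot_python_semantic::PythonVersion;
use ruff_db::system::SystemPath;

fn resolve(workspace_root: &SystemPath) -> WorkspaceSettings {
    // Hypothetical inputs: an (empty) workspace configuration and a base
    // configuration derived from CLI flags.
    let mut configuration = Configuration::default();
    let base = Configuration {
        target_version: Some(PythonVersion::PY312),
        ..Configuration::default()
    };

    // `extend` only fills values that are still unset on `configuration`.
    configuration.extend(base);

    // Remaining defaults (e.g. `src_root` falling back to the workspace
    // root) are applied during the conversion to resolved settings.
    configuration.into_workspace_settings(workspace_root)
}
```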

View File

@@ -1,7 +1,6 @@
use std::fs;
use std::path::PathBuf;
use red_knot_python_semantic::{HasTy, SemanticModel};
use red_knot_python_semantic::{
HasTy, ProgramSettings, PythonVersion, SearchPathSettings, SemanticModel,
};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_db::files::{system_path_to_file, File};
@@ -10,11 +9,23 @@ use ruff_db::system::{OsSystem, SystemPath, SystemPathBuf};
use ruff_python_ast::visitor::source_order;
use ruff_python_ast::visitor::source_order::SourceOrderVisitor;
use ruff_python_ast::{Alias, Expr, Parameter, ParameterWithDefault, Stmt};
use std::fs;
use std::path::PathBuf;
fn setup_db(workspace_root: &SystemPath) -> anyhow::Result<RootDatabase> {
let system = OsSystem::new(workspace_root);
let workspace = WorkspaceMetadata::from_path(workspace_root, &system, None)?;
RootDatabase::new(workspace, system)
fn setup_db(workspace_root: SystemPathBuf) -> anyhow::Result<RootDatabase> {
let system = OsSystem::new(&workspace_root);
let workspace = WorkspaceMetadata::from_path(&workspace_root, &system)?;
let search_paths = SearchPathSettings {
extra_paths: vec![],
src_root: workspace_root,
custom_typeshed: None,
site_packages: vec![],
};
let settings = ProgramSettings {
target_version: PythonVersion::default(),
search_paths,
};
RootDatabase::new(workspace, settings, system)
}
/// Test that all snippets in testcorpus can be checked without panic
@@ -22,9 +33,8 @@ fn setup_db(workspace_root: &SystemPath) -> anyhow::Result<RootDatabase> {
#[allow(clippy::print_stdout)]
fn corpus_no_panic() -> anyhow::Result<()> {
let corpus = PathBuf::from(env!("CARGO_MANIFEST_DIR")).join("resources/test/corpus");
let system_corpus =
SystemPathBuf::from_path_buf(corpus.clone()).expect("corpus path to be UTF8");
let db = setup_db(&system_corpus)?;
let system_corpus = SystemPath::from_std_path(&corpus).expect("corpus path to be UTF8");
let db = setup_db(system_corpus.to_path_buf())?;
for path in fs::read_dir(&corpus).expect("corpus to be a directory") {
let path = path.expect("path to not be an error").path();

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.6.3"
version = "0.6.1"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -37,14 +37,13 @@ name = "red_knot"
harness = false
[dependencies]
codspeed-criterion-compat = { workspace = true, default-features = false, optional = true }
criterion = { workspace = true, default-features = false }
once_cell = { workspace = true }
rayon = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
url = { workspace = true }
ureq = { workspace = true }
criterion = { workspace = true, default-features = false }
codspeed-criterion-compat = { workspace = true, default-features = false, optional = true }
[dev-dependencies]
ruff_db = { workspace = true }

View File

@@ -1,10 +1,8 @@
#![allow(clippy::disallowed_names)]
use rayon::ThreadPoolBuilder;
use red_knot_python_semantic::PythonVersion;
use red_knot_python_semantic::{ProgramSettings, PythonVersion, SearchPathSettings};
use red_knot_workspace::db::RootDatabase;
use red_knot_workspace::watch::{ChangeEvent, ChangedKind};
use red_knot_workspace::workspace::settings::Configuration;
use red_knot_workspace::workspace::WorkspaceMetadata;
use ruff_benchmark::criterion::{criterion_group, criterion_main, BatchSize, Criterion};
use ruff_benchmark::TestFile;
@@ -21,9 +19,18 @@ struct Case {
const TOMLLIB_312_URL: &str = "https://raw.githubusercontent.com/python/cpython/8e8a4baf652f6e1cee7acde9d78c4b6154539748/Lib/tomllib";
// The "unresolved import" is because we don't understand `*` imports yet.
// This first "unresolved import" is because we don't understand `*` imports yet.
// The following "unresolved import" violations are because we can't distinguish currently from
// "Symbol exists in the module but its type is unknown" and
// "Symbol does not exist in the module"
static EXPECTED_DIAGNOSTICS: &[&str] = &[
"/src/tomllib/_parser.py:7:29: Module 'collections.abc' has no member 'Iterable'",
"/src/tomllib/_parser.py:7:29: Could not resolve import of 'Iterable' from 'collections.abc'",
"/src/tomllib/_parser.py:10:20: Could not resolve import of 'Any' from 'typing'",
"/src/tomllib/_parser.py:13:5: Could not resolve import of 'RE_DATETIME' from '._re'",
"/src/tomllib/_parser.py:14:5: Could not resolve import of 'RE_LOCALTIME' from '._re'",
"/src/tomllib/_parser.py:15:5: Could not resolve import of 'RE_NUMBER' from '._re'",
"/src/tomllib/_parser.py:20:21: Could not resolve import of 'Key' from '._types'",
"/src/tomllib/_parser.py:20:26: Could not resolve import of 'ParseFloat' from '._types'",
"Line 69 is too long (89 characters)",
"Use double quotes for strings",
"Use double quotes for strings",
@@ -32,8 +39,23 @@ static EXPECTED_DIAGNOSTICS: &[&str] = &[
"Use double quotes for strings",
"Use double quotes for strings",
"Use double quotes for strings",
"/src/tomllib/_parser.py:153:22: Name 'key' used when not defined.",
"/src/tomllib/_parser.py:153:27: Name 'flag' used when not defined.",
"/src/tomllib/_parser.py:159:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:161:25: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:168:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:169:22: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:170:25: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:180:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:182:31: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:206:16: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:207:22: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:208:25: Name 'k' used when not defined.",
"/src/tomllib/_parser.py:330:32: Name 'header' used when not defined.",
"/src/tomllib/_parser.py:330:41: Name 'key' used when not defined.",
"/src/tomllib/_parser.py:333:26: Name 'cont_key' used when not defined.",
"/src/tomllib/_parser.py:334:71: Name 'cont_key' used when not defined.",
"/src/tomllib/_parser.py:337:31: Name 'cont_key' used when not defined.",
"/src/tomllib/_parser.py:628:75: Name 'e' used when not defined.",
"/src/tomllib/_parser.py:686:23: Name 'parse_float' used when not defined.",
];
@@ -64,17 +86,18 @@ fn setup_case() -> Case {
.unwrap();
let src_root = SystemPath::new("/src");
let metadata = WorkspaceMetadata::from_path(
src_root,
&system,
Some(Configuration {
target_version: Some(PythonVersion::PY312),
..Configuration::default()
}),
)
.unwrap();
let metadata = WorkspaceMetadata::from_path(src_root, &system).unwrap();
let settings = ProgramSettings {
target_version: PythonVersion::PY312,
search_paths: SearchPathSettings {
extra_paths: vec![],
src_root: src_root.to_path_buf(),
site_packages: vec![],
custom_typeshed: None,
},
};
let mut db = RootDatabase::new(metadata, system).unwrap();
let mut db = RootDatabase::new(metadata, settings, system).unwrap();
let parser = system_path_to_file(&db, parser_path).unwrap();
db.workspace().open_file(&mut db, parser);
@@ -89,25 +112,7 @@ fn setup_case() -> Case {
}
}
static RAYON_INITIALIZED: std::sync::Once = std::sync::Once::new();
fn setup_rayon() {
// Initialize the rayon thread pool outside the benchmark because it has a significant cost.
// We limit the thread pool to only one (the current thread) because we're focused on
// where red knot spends time and less about how well the code runs concurrently.
// We might want to add a benchmark focusing on concurrency to detect congestion in the future.
RAYON_INITIALIZED.call_once(|| {
ThreadPoolBuilder::new()
.num_threads(1)
.use_current_thread()
.build_global()
.unwrap();
});
}
fn benchmark_incremental(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("red_knot_check_file[incremental]", |b| {
b.iter_batched_ref(
|| {
@@ -126,13 +131,10 @@ fn benchmark_incremental(criterion: &mut Criterion) {
|case| {
let Case { db, .. } = case;
db.apply_changes(
vec![ChangeEvent::Changed {
path: case.re_path.to_path_buf(),
kind: ChangedKind::FileContent,
}],
None,
);
db.apply_changes(vec![ChangeEvent::Changed {
path: case.re_path.to_path_buf(),
kind: ChangedKind::FileContent,
}]);
let result = db.check().unwrap();
@@ -144,8 +146,6 @@ fn benchmark_incremental(criterion: &mut Criterion) {
}
fn benchmark_cold(criterion: &mut Criterion) {
setup_rayon();
criterion.bench_function("red_knot_check_file[cold]", |b| {
b.iter_batched_ref(
setup_case,

View File

@@ -8,12 +8,11 @@ use salsa::{Durability, Setter};
pub use file_root::{FileRoot, FileRootKind};
pub use path::FilePath;
use ruff_notebook::{Notebook, NotebookError};
use ruff_python_ast::PySourceType;
use crate::file_revision::FileRevision;
use crate::files::file_root::FileRoots;
use crate::files::private::FileStatus;
use crate::system::{SystemPath, SystemPathBuf, SystemVirtualPath, SystemVirtualPathBuf};
use crate::system::{Metadata, SystemPath, SystemPathBuf, SystemVirtualPath, SystemVirtualPathBuf};
use crate::vendored::{VendoredPath, VendoredPathBuf};
use crate::{vendored, Db, FxDashMap};
@@ -61,8 +60,8 @@ struct FilesInner {
/// so that queries that depend on the existence of a file are re-executed when the file is created.
system_by_path: FxDashMap<SystemPathBuf, File>,
/// Lookup table that maps [`SystemVirtualPathBuf`]s to [`VirtualFile`] instances.
system_virtual_by_path: FxDashMap<SystemVirtualPathBuf, VirtualFile>,
/// Lookup table that maps [`SystemVirtualPathBuf`]s to salsa interned [`File`] instances.
system_virtual_by_path: FxDashMap<SystemVirtualPathBuf, File>,
/// Lookup table that maps vendored files to the salsa [`File`] ingredients.
vendored_by_path: FxDashMap<VendoredPathBuf, File>,
@@ -148,31 +147,31 @@ impl Files {
Ok(file)
}
/// Create a new virtual file at the given path and store it for future lookups.
/// Looks up a virtual file by its `path`.
///
/// This will always create a new file, overwriting any existing file at `path` in the internal
/// storage.
pub fn virtual_file(&self, db: &dyn Db, path: &SystemVirtualPath) -> VirtualFile {
tracing::trace!("Adding virtual file {}", path);
let virtual_file = VirtualFile(
File::builder(FilePath::SystemVirtual(path.to_path_buf()))
.status(FileStatus::Exists)
.revision(FileRevision::zero())
.permissions(None)
.new(db),
);
self.inner
.system_virtual_by_path
.insert(path.to_path_buf(), virtual_file);
virtual_file
}
/// For a non-existing file, creates a new salsa [`File`] ingredient and stores it for future lookups.
///
/// The operation fails if the system fails to provide metadata for the path.
pub fn add_virtual_file(&self, db: &dyn Db, path: &SystemVirtualPath) -> Option<File> {
let file = match self.inner.system_virtual_by_path.entry(path.to_path_buf()) {
Entry::Occupied(entry) => *entry.get(),
Entry::Vacant(entry) => {
let metadata = db.system().virtual_path_metadata(path).ok()?;
/// Tries to look up a virtual file by its path. Returns `None` if no such file exists yet.
pub fn try_virtual_file(&self, path: &SystemVirtualPath) -> Option<VirtualFile> {
self.inner
.system_virtual_by_path
.get(&path.to_path_buf())
.map(|entry| *entry.value())
tracing::trace!("Adding virtual file '{}'", path);
let file = File::builder(FilePath::SystemVirtual(path.to_path_buf()))
.revision(metadata.revision())
.permissions(metadata.permissions())
.new(db);
entry.insert(file);
file
}
};
Some(file)
}
/// Looks up the closest root for `path`. Returns `None` if `path` isn't enclosed by any source root.
@@ -319,9 +318,6 @@ impl File {
}
FilePath::Vendored(vendored) => db.vendored().read_to_string(vendored),
FilePath::SystemVirtual(system_virtual) => {
// Add a dependency on the revision to ensure the operation gets re-executed when the file changes.
let _ = self.revision(db);
db.system().read_virtual_path_to_string(system_virtual)
}
}
@@ -346,9 +342,6 @@ impl File {
"Reading a notebook from the vendored file system is not supported.",
))),
FilePath::SystemVirtual(system_virtual) => {
// Add a dependency on the revision to ensure the operation gets re-executed when the file changes.
let _ = self.revision(db);
db.system().read_virtual_path_to_notebook(system_virtual)
}
}
@@ -361,13 +354,6 @@ impl File {
Self::sync_system_path(db, &absolute, None);
}
/// Increments the revision for the virtual file at `path`.
pub fn sync_virtual_path(db: &mut dyn Db, path: &SystemVirtualPath) {
if let Some(virtual_file) = db.files().try_virtual_file(path) {
virtual_file.sync(db);
}
}
/// Syncs the [`File`]'s state with the state of the file on the system.
pub fn sync(self, db: &mut dyn Db) {
let path = self.path(db).clone();
@@ -380,20 +366,29 @@ impl File {
FilePath::Vendored(_) => {
// Readonly, can never be out of date.
}
FilePath::SystemVirtual(_) => {
VirtualFile(self).sync(db);
FilePath::SystemVirtual(system_virtual) => {
Self::sync_system_virtual_path(db, &system_virtual, self);
}
}
}
/// Private method providing the implementation for [`Self::sync_path`] and [`Self::sync`] for
/// system paths.
fn sync_system_path(db: &mut dyn Db, path: &SystemPath, file: Option<File>) {
let Some(file) = file.or_else(|| db.files().try_system(db, path)) else {
return;
};
let metadata = db.system().path_metadata(path);
Self::sync_impl(db, metadata, file);
}
let (status, revision, permission) = match db.system().path_metadata(path) {
fn sync_system_virtual_path(db: &mut dyn Db, path: &SystemVirtualPath, file: File) {
let metadata = db.system().virtual_path_metadata(path);
Self::sync_impl(db, metadata, file);
}
/// Private method providing the implementation for [`Self::sync_system_path`] and
/// [`Self::sync_system_virtual_path`].
fn sync_impl(db: &mut dyn Db, metadata: crate::system::Result<Metadata>, file: File) {
let (status, revision, permission) = match metadata {
Ok(metadata) if metadata.file_type().is_file() => (
FileStatus::Exists,
metadata.revision(),
@@ -425,42 +420,6 @@ impl File {
pub fn exists(self, db: &dyn Db) -> bool {
self.status(db) == FileStatus::Exists
}
/// Returns `true` if the file should be analyzed as a type stub.
pub fn is_stub(self, db: &dyn Db) -> bool {
self.path(db)
.extension()
.is_some_and(|extension| PySourceType::from_extension(extension).is_stub())
}
}
/// A virtual file that doesn't exist on the file system.
///
/// This is a wrapper around a [`File`] that provides additional methods to interact with a virtual
/// file.
#[derive(Copy, Clone)]
pub struct VirtualFile(File);
impl VirtualFile {
/// Returns the underlying [`File`].
pub fn file(&self) -> File {
self.0
}
/// Increments the revision of the underlying [`File`].
fn sync(&self, db: &mut dyn Db) {
let file = self.0;
tracing::debug!("Updating the revision of '{}'", file.path(db));
let current_revision = file.revision(db);
file.set_revision(db)
.to(FileRevision::new(current_revision.as_u128() + 1));
}
/// Closes the virtual file.
pub fn close(&self, db: &mut dyn Db) {
tracing::debug!("Closing virtual file '{}'", self.0.path(db));
self.0.set_status(db).to(FileStatus::NotFound);
}
}
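
Pulling the pieces of this hunk together, a sketch of the virtual-file lifecycle on the `VirtualFile` side of the diff; the `untitled:` path is illustrative and the database is assumed to be set up elsewhere:

```rust
use ruff_db::files::File;
use ruff_db::system::SystemVirtualPath;

fn lifecycle(db: &mut dyn ruff_db::Db) {
    let path = SystemVirtualPath::new("untitled:Untitled-1");

    // Registering always creates a fresh `File` entry at `path`.
    let virtual_file = db.files().virtual_file(db, path);
    let _file: File = virtual_file.file();

    // An editor edit arrives: bump the revision so dependent queries
    // (source text, parsing, ...) re-execute.
    File::sync_virtual_path(db, path);

    // The buffer was closed: mark the file as gone.
    if let Some(virtual_file) = db.files().try_virtual_file(path) {
        virtual_file.close(db);
    }
}
```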
// The types in here need to be public because they're salsa ingredients but we

View File

@@ -121,9 +121,9 @@ mod tests {
db.write_virtual_file(path, "x = 10");
let virtual_file = db.files().virtual_file(&db, path);
let file = db.files().add_virtual_file(&db, path).unwrap();
let parsed = parsed_module(&db, virtual_file.file());
let parsed = parsed_module(&db, file);
assert!(parsed.is_valid());
@@ -137,9 +137,9 @@ mod tests {
db.write_virtual_file(path, "%timeit a = b");
let virtual_file = db.files().virtual_file(&db, path);
let file = db.files().add_virtual_file(&db, path).unwrap();
let parsed = parsed_module(&db, virtual_file.file());
let parsed = parsed_module(&db, file);
assert!(parsed.is_valid());

View File

@@ -63,6 +63,9 @@ pub trait System: Debug {
/// representation fall-back to deserializing the notebook from a string.
fn read_to_notebook(&self, path: &SystemPath) -> std::result::Result<Notebook, NotebookError>;
/// Reads the metadata of the virtual file at `path`.
fn virtual_path_metadata(&self, path: &SystemVirtualPath) -> Result<Metadata>;
/// Reads the content of the virtual file at `path` into a [`String`].
fn read_virtual_path_to_string(&self, path: &SystemVirtualPath) -> Result<String>;

View File

@@ -136,6 +136,22 @@ impl MemoryFileSystem {
ruff_notebook::Notebook::from_source_code(&content)
}
pub(crate) fn virtual_path_metadata(
&self,
path: impl AsRef<SystemVirtualPath>,
) -> Result<Metadata> {
let virtual_files = self.inner.virtual_files.read().unwrap();
let file = virtual_files
.get(&path.as_ref().to_path_buf())
.ok_or_else(not_found)?;
Ok(Metadata {
revision: file.last_modified.into(),
permissions: Some(MemoryFileSystem::PERMISSION),
file_type: FileType::File,
})
}
pub(crate) fn read_virtual_path_to_string(
&self,
path: impl AsRef<SystemVirtualPath>,

View File

@@ -77,6 +77,10 @@ impl System for OsSystem {
Notebook::from_path(path.as_std_path())
}
fn virtual_path_metadata(&self, _path: &SystemVirtualPath) -> Result<Metadata> {
Err(not_found())
}
fn read_virtual_path_to_string(&self, _path: &SystemVirtualPath) -> Result<String> {
Err(not_found())
}

View File

@@ -69,6 +69,13 @@ impl System for TestSystem {
}
}
fn virtual_path_metadata(&self, path: &SystemVirtualPath) -> Result<Metadata> {
match &self.inner {
TestSystemInner::Stub(fs) => fs.virtual_path_metadata(path),
TestSystemInner::System(system) => system.virtual_path_metadata(path),
}
}
fn read_virtual_path_to_string(&self, path: &SystemVirtualPath) -> Result<String> {
match &self.inner {
TestSystemInner::Stub(fs) => fs.read_virtual_path_to_string(path),

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.6.3"
version = "0.6.1"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -17,7 +17,7 @@ app = FastAPI()
router = APIRouter()
# Fixable errors
# Errors
@app.get("/items/")
def get_items(
@@ -40,34 +40,6 @@ def do_stuff(
# do stuff
pass
@app.get("/users/")
def get_users(
skip: int,
limit: int,
current_user: User = Depends(get_current_user),
):
pass
@app.get("/users/")
def get_users(
current_user: User = Depends(get_current_user),
skip: int = 0,
limit: int = 10,
):
pass
# Non fixable errors
@app.get("/users/")
def get_users(
skip: int = 0,
limit: int = 10,
current_user: User = Depends(get_current_user),
):
pass
# Unchanged

View File

@@ -79,15 +79,3 @@ _ = f"a {f"first"
+ f"second"} d"
_ = f"a {f"first {f"middle"}"
+ f"second"} d"
# See https://github.com/astral-sh/ruff/issues/12936
_ = "\12""0" # fix should be "\0120"
_ = "\\12""0" # fix should be "\\120"
_ = "\\\12""0" # fix should be "\\\0120"
_ = "\12 0""0" # fix should be "\12 00"
_ = r"\12"r"0" # fix should be r"\120"
_ = "\12 and more""0" # fix should be "\12 and more0"
_ = "\8""0" # fix should be "\80"
_ = "\12""8" # fix should be "\128"
_ = "\12""foo" # fix should be "\12foo"
_ = "\12" "" # fix should be "\12"

View File

@@ -47,6 +47,7 @@ with contextlib.ExitStack() as exit_stack:
open("filename", "w").close()
pathlib.Path("filename").open("w").close()
# OK (custom context manager)
class MyFile:
def __init__(self, filename: str):
@@ -57,202 +58,3 @@ class MyFile:
def __exit__(self, exc_type, exc_val, exc_tb):
self.file.close()
import tempfile
import tarfile
from tarfile import TarFile
import zipfile
import io
import codecs
import bz2
import gzip
import dbm
import dbm.gnu
import dbm.ndbm
import dbm.dumb
import lzma
import shelve
import tokenize
import wave
import fileinput
f = tempfile.NamedTemporaryFile()
f = tempfile.TemporaryFile()
f = tempfile.SpooledTemporaryFile()
f = tarfile.open("foo.tar")
f = TarFile("foo.tar").open()
f = tarfile.TarFile("foo.tar").open()
f = tarfile.TarFile().open()
f = zipfile.ZipFile("foo.zip").open("foo.txt")
f = io.open("foo.txt")
f = io.open_code("foo.txt")
f = codecs.open("foo.txt")
f = bz2.open("foo.txt")
f = gzip.open("foo.txt")
f = dbm.open("foo.db")
f = dbm.gnu.open("foo.db")
f = dbm.ndbm.open("foo.db")
f = dbm.dumb.open("foo.db")
f = lzma.open("foo.xz")
f = lzma.LZMAFile("foo.xz")
f = shelve.open("foo.db")
f = tokenize.open("foo.py")
f = wave.open("foo.wav")
f = tarfile.TarFile.taropen("foo.tar")
f = fileinput.input("foo.txt")
f = fileinput.FileInput("foo.txt")
with contextlib.suppress(Exception):
# The following line is for example's sake.
# For some f's above, this would raise an error (since it'd be f.readline() etc.)
data = f.read()
f.close()
# OK
with tempfile.TemporaryFile() as f:
data = f.read()
# OK
with tarfile.open("foo.tar") as f:
data = f.add("foo.txt")
# OK
with tarfile.TarFile("foo.tar") as f:
data = f.add("foo.txt")
# OK
with tarfile.TarFile("foo.tar").open() as f:
data = f.add("foo.txt")
# OK
with zipfile.ZipFile("foo.zip") as f:
data = f.read("foo.txt")
# OK
with zipfile.ZipFile("foo.zip").open("foo.txt") as f:
data = f.read()
# OK
with zipfile.ZipFile("foo.zip") as zf:
with zf.open("foo.txt") as f:
data = f.read()
# OK
with io.open("foo.txt") as f:
data = f.read()
# OK
with io.open_code("foo.txt") as f:
data = f.read()
# OK
with codecs.open("foo.txt") as f:
data = f.read()
# OK
with bz2.open("foo.txt") as f:
data = f.read()
# OK
with gzip.open("foo.txt") as f:
data = f.read()
# OK
with dbm.open("foo.db") as f:
data = f.get("foo")
# OK
with dbm.gnu.open("foo.db") as f:
data = f.get("foo")
# OK
with dbm.ndbm.open("foo.db") as f:
data = f.get("foo")
# OK
with dbm.dumb.open("foo.db") as f:
data = f.get("foo")
# OK
with lzma.open("foo.xz") as f:
data = f.read()
# OK
with lzma.LZMAFile("foo.xz") as f:
data = f.read()
# OK
with shelve.open("foo.db") as f:
data = f["foo"]
# OK
with tokenize.open("foo.py") as f:
data = f.read()
# OK
with wave.open("foo.wav") as f:
data = f.readframes(1024)
# OK
with tarfile.TarFile.taropen("foo.tar") as f:
data = f.add("foo.txt")
# OK
with fileinput.input("foo.txt") as f:
data = f.readline()
# OK
with fileinput.FileInput("foo.txt") as f:
data = f.readline()
# OK (quick one-liner to clear file contents)
tempfile.NamedTemporaryFile().close()
tempfile.TemporaryFile().close()
tempfile.SpooledTemporaryFile().close()
tarfile.open("foo.tar").close()
tarfile.TarFile("foo.tar").close()
tarfile.TarFile("foo.tar").open().close()
tarfile.TarFile.open("foo.tar").close()
zipfile.ZipFile("foo.zip").close()
zipfile.ZipFile("foo.zip").open("foo.txt").close()
io.open("foo.txt").close()
io.open_code("foo.txt").close()
codecs.open("foo.txt").close()
bz2.open("foo.txt").close()
gzip.open("foo.txt").close()
dbm.open("foo.db").close()
dbm.gnu.open("foo.db").close()
dbm.ndbm.open("foo.db").close()
dbm.dumb.open("foo.db").close()
lzma.open("foo.xz").close()
lzma.LZMAFile("foo.xz").close()
shelve.open("foo.db").close()
tokenize.open("foo.py").close()
wave.open("foo.wav").close()
tarfile.TarFile.taropen("foo.tar").close()
fileinput.input("foo.txt").close()
fileinput.FileInput("foo.txt").close()
def aliased():
from shelve import open as open_shelf
x = open_shelf("foo.dbm")
x.close()
from tarfile import TarFile as TF
f = TF("foo").open()
f.close()
import dbm.sqlite3
# OK
with dbm.sqlite3.open("foo.db") as f:
print(f.keys())
# OK
dbm.sqlite3.open("foo.db").close()
# SIM115
f = dbm.sqlite3.open("foo.db")
f.close()

View File

@@ -1,75 +0,0 @@
from contextlib import contextmanager
l = 0
I = 0
O = 0
l: int = 0
a, l = 0, 1
[a, l] = 0, 1
a, *l = 0, 1, 2
a = l = 0
o = 0
i = 0
for l in range(3):
pass
for a, l in zip(range(3), range(3)):
pass
def f1():
global l
l = 0
def f2():
l = 0
def f3():
nonlocal l
l = 1
f3()
return l
def f4(l, /, I):
return l, I, O
def f5(l=0, *, I=1):
return l, I
def f6(*l, **I):
return l, I
@contextmanager
def ctx1():
yield 0
with ctx1() as l:
pass
@contextmanager
def ctx2():
yield 0, 1
with ctx2() as (a, l):
pass
try:
pass
except ValueError as l:
pass
if (l := 5) > 0:
pass

View File

@@ -119,91 +119,3 @@ class A(metaclass=abc.abcmeta):
def f(self):
"""Lorem ipsum."""
return True
# OK - implicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return
print(obj)
# OK - explicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return None
print(obj)
# OK - explicit None early return w/o useful type annotations
def foo(obj):
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return None
print(obj)
# OK - multiple explicit None early returns
def foo(obj: object) -> None:
"""A very helpful docstring.
Args:
obj (object): An object.
"""
if obj is None:
return None
if obj == "None":
return
if obj == 0:
return None
print(obj)
# DOC201 - non-early return explicit None
def foo(x: int) -> int | None:
"""A very helpful docstring.
Args:
x (int): An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - non-early return explicit None w/o useful type annotations
def foo(x):
"""A very helpful docstring.
Args:
x (int): An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - only returns None, but return annotation is not None
def foo(s: str) -> str | None:
"""A very helpful docstring.
Args:
s (str): A string.
"""
return None

View File

@@ -85,105 +85,3 @@ class A(metaclass=abc.abcmeta):
def f(self):
"""Lorem ipsum."""
return True
# OK - implicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return
print(obj)
# OK - explicit None early return
def foo(obj: object) -> None:
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return None
print(obj)
# OK - explicit None early return w/o useful type annotations
def foo(obj):
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return None
print(obj)
# OK - multiple explicit None early returns
def foo(obj: object) -> None:
"""A very helpful docstring.
Parameters
----------
obj : object
An object.
"""
if obj is None:
return None
if obj == "None":
return
if obj == 0:
return None
print(obj)
# DOC201 - non-early return explicit None
def foo(x: int) -> int | None:
"""A very helpful docstring.
Parameters
----------
x : int
An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - non-early return explicit None w/o useful type annotations
def foo(x):
"""A very helpful docstring.
Parameters
----------
x : int
An integer.
"""
if x < 0:
return None
else:
return x
# DOC201 - only returns None, but return annotation is not None
def foo(s: str) -> str | None:
"""A very helpful docstring.
Parameters
----------
x : str
A string.
"""
return None

View File

@@ -42,16 +42,3 @@ max(1, max(*a))
import builtins
builtins.min(1, min(2, 3))
# PLW3301
max_word_len = max(
max(len(word) for word in "blah blah blah".split(" ")),
len("Done!"),
)
# OK
max_word_len = max(
*(len(word) for word in "blah blah blah".split(" ")),
len("Done!"),
)

View File

@@ -9,13 +9,3 @@ dictionary = {
#import os # noqa: E501
def f():
data = 1
# line below should autofix to `return data # fmt: skip`
return data # noqa: RET504 # fmt: skip
def f():
data = 1
# line below should autofix to `return data`
return data # noqa: RET504 - intentional incorrect noqa, will be removed

View File

@@ -70,11 +70,11 @@ pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &mut Check
}
if let Some(name) = name {
if checker.enabled(Rule::AmbiguousVariableName) {
pycodestyle::rules::ambiguous_variable_name(
checker,
name.as_str(),
name.range(),
);
if let Some(diagnostic) =
pycodestyle::rules::ambiguous_variable_name(name.as_str(), name.range())
{
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::BuiltinVariableShadowing) {
flake8_builtins::rules::builtin_variable_shadowing(checker, name, name.range());

View File

@@ -259,7 +259,11 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
}
if checker.enabled(Rule::AmbiguousVariableName) {
pycodestyle::rules::ambiguous_variable_name(checker, id, expr.range());
if let Some(diagnostic) =
pycodestyle::rules::ambiguous_variable_name(id, expr.range())
{
checker.diagnostics.push(diagnostic);
}
}
if !checker.semantic.current_scope().kind.is_class() {
if checker.enabled(Rule::BuiltinVariableShadowing) {
@@ -879,7 +883,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_simplify::rules::use_capital_environment_variables(checker, expr);
}
if checker.enabled(Rule::OpenFileWithContextHandler) {
flake8_simplify::rules::open_file_with_context_handler(checker, call);
flake8_simplify::rules::open_file_with_context_handler(checker, func);
}
if checker.enabled(Rule::DictGetWithNoneDefault) {
flake8_simplify::rules::dict_get_with_none_default(checker, expr);

View File

@@ -8,11 +8,11 @@ use crate::rules::{flake8_builtins, pep8_naming, pycodestyle};
/// Run lint rules over a [`Parameter`] syntax node.
pub(crate) fn parameter(parameter: &Parameter, checker: &mut Checker) {
if checker.enabled(Rule::AmbiguousVariableName) {
pycodestyle::rules::ambiguous_variable_name(
checker,
&parameter.name,
parameter.name.range(),
);
if let Some(diagnostic) =
pycodestyle::rules::ambiguous_variable_name(&parameter.name, parameter.name.range())
{
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::InvalidArgumentName) {
if let Some(diagnostic) = pep8_naming::rules::invalid_argument_name(

View File

@@ -24,16 +24,16 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pylint::rules::global_at_module_level(checker, stmt);
}
if checker.enabled(Rule::AmbiguousVariableName) {
for name in names {
pycodestyle::rules::ambiguous_variable_name(checker, name, name.range());
}
checker.diagnostics.extend(names.iter().filter_map(|name| {
pycodestyle::rules::ambiguous_variable_name(name, name.range())
}));
}
}
Stmt::Nonlocal(nonlocal @ ast::StmtNonlocal { names, range: _ }) => {
if checker.enabled(Rule::AmbiguousVariableName) {
for name in names {
pycodestyle::rules::ambiguous_variable_name(checker, name, name.range());
}
checker.diagnostics.extend(names.iter().filter_map(|name| {
pycodestyle::rules::ambiguous_variable_name(name, name.range())
}));
}
if checker.enabled(Rule::NonlocalWithoutBinding) {
if !checker.semantic.scope_id.is_global() {

View File

@@ -118,10 +118,10 @@ pub(crate) fn check_noqa(
match &line.directive {
Directive::All(directive) => {
if line.matches.is_empty() {
let edit = delete_comment(directive.range(), locator);
let mut diagnostic =
Diagnostic::new(UnusedNOQA { codes: None }, directive.range());
diagnostic.set_fix(Fix::safe_edit(edit));
diagnostic
.set_fix(Fix::safe_edit(delete_comment(directive.range(), locator)));
diagnostics.push(diagnostic);
}
@@ -172,14 +172,6 @@ pub(crate) fn check_noqa(
&& unknown_codes.is_empty()
&& unmatched_codes.is_empty())
{
let edit = if valid_codes.is_empty() {
delete_comment(directive.range(), locator)
} else {
Edit::range_replacement(
format!("# noqa: {}", valid_codes.join(", ")),
directive.range(),
)
};
let mut diagnostic = Diagnostic::new(
UnusedNOQA {
codes: Some(UnusedCodes {
@@ -203,7 +195,17 @@ pub(crate) fn check_noqa(
},
directive.range(),
);
diagnostic.set_fix(Fix::safe_edit(edit));
if valid_codes.is_empty() {
diagnostic.set_fix(Fix::safe_edit(delete_comment(
directive.range(),
locator,
)));
} else {
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
format!("# noqa: {}", valid_codes.join(", ")),
directive.range(),
)));
}
diagnostics.push(diagnostic);
}
}

View File

@@ -99,8 +99,11 @@ pub(crate) fn delete_comment(range: TextRange, locator: &Locator) -> Edit {
}
// Ex) `x = 1 # noqa here`
else {
// Remove `# noqa here` and whitespace
Edit::deletion(range.start() - leading_space_len, line_range.end())
// Replace `# noqa here` with `# here`.
Edit::range_replacement(
"# ".to_string(),
TextRange::new(range.start(), range.end() + trailing_space_len),
)
}
}
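For orientation, a sketch (input invented) of the two strategies this branch distinguishes when fixing an unused blanket `noqa` that has trailing text:

```python
# Input: an unused blanket `noqa` directive followed by trailing text.
x = 1  # noqa here

# Replacement strategy: rewrite the directive to a plain comment so the
# trailing text survives.
x = 1  # here

# Deletion strategy: drop the comment and its leading whitespace entirely.
x = 1
```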

View File

@@ -1,4 +1,4 @@
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use ruff_diagnostics::{AlwaysFixableViolation, Diagnostic, Edit, Fix};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast as ast;
use ruff_python_ast::helpers::map_callable;
@@ -59,16 +59,14 @@ use crate::settings::types::PythonVersion;
#[violation]
pub struct FastApiNonAnnotatedDependency;
impl Violation for FastApiNonAnnotatedDependency {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
impl AlwaysFixableViolation for FastApiNonAnnotatedDependency {
#[derive_message_formats]
fn message(&self) -> String {
format!("FastAPI dependency without `Annotated`")
}
fn fix_title(&self) -> Option<String> {
Some("Replace with `Annotated`".to_string())
fn fix_title(&self) -> String {
"Replace with `Annotated`".to_string()
}
}
@@ -77,95 +75,64 @@ pub(crate) fn fastapi_non_annotated_dependency(
checker: &mut Checker,
function_def: &ast::StmtFunctionDef,
) {
if !checker.semantic().seen_module(Modules::FASTAPI)
|| !is_fastapi_route(function_def, checker.semantic())
{
if !checker.semantic().seen_module(Modules::FASTAPI) {
return;
}
let mut updatable_count = 0;
let mut has_non_updatable_default = false;
let total_params = function_def.parameters.args.len();
for parameter in &function_def.parameters.args {
let needs_update = matches!(
(&parameter.parameter.annotation, &parameter.default),
(Some(_annotation), Some(default)) if is_fastapi_dependency(checker, default)
);
if needs_update {
updatable_count += 1;
// Determine if it's safe to update this parameter:
// - if all parameters are updatable, it's safe.
// - if we've encountered a non-updatable parameter with a default value, it's no longer
// safe. (https://github.com/astral-sh/ruff/issues/12982)
let safe_to_update = updatable_count == total_params || !has_non_updatable_default;
create_diagnostic(checker, parameter, safe_to_update);
} else if parameter.default.is_some() {
has_non_updatable_default = true;
}
if !is_fastapi_route(function_def, checker.semantic()) {
return;
}
}
fn is_fastapi_dependency(checker: &Checker, expr: &ast::Expr) -> bool {
checker
.semantic()
.resolve_qualified_name(map_callable(expr))
.is_some_and(|qualified_name| {
matches!(
qualified_name.segments(),
[
"fastapi",
"Query"
| "Path"
| "Body"
| "Cookie"
| "Header"
| "File"
| "Form"
| "Depends"
| "Security"
]
)
})
}
fn create_diagnostic(
checker: &mut Checker,
parameter: &ast::ParameterWithDefault,
safe_to_update: bool,
) {
let mut diagnostic = Diagnostic::new(FastApiNonAnnotatedDependency, parameter.range);
if safe_to_update {
for parameter in &function_def.parameters.args {
if let (Some(annotation), Some(default)) =
(&parameter.parameter.annotation, &parameter.default)
{
diagnostic.try_set_fix(|| {
let module = if checker.settings.target_version >= PythonVersion::Py39 {
"typing"
} else {
"typing_extensions"
};
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, "Annotated"),
parameter.range.start(),
checker.semantic(),
)?;
let content = format!(
"{}: {}[{}, {}]",
parameter.parameter.name.id,
binding,
checker.locator().slice(annotation.range()),
checker.locator().slice(default.range())
);
let parameter_edit = Edit::range_replacement(content, parameter.range);
Ok(Fix::unsafe_edits(import_edit, [parameter_edit]))
});
}
} else {
diagnostic.fix = None;
}
if checker
.semantic()
.resolve_qualified_name(map_callable(default))
.is_some_and(|qualified_name| {
matches!(
qualified_name.segments(),
[
"fastapi",
"Query"
| "Path"
| "Body"
| "Cookie"
| "Header"
| "File"
| "Form"
| "Depends"
| "Security"
]
)
})
{
let mut diagnostic =
Diagnostic::new(FastApiNonAnnotatedDependency, parameter.range);
checker.diagnostics.push(diagnostic);
diagnostic.try_set_fix(|| {
let module = if checker.settings.target_version >= PythonVersion::Py39 {
"typing"
} else {
"typing_extensions"
};
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, "Annotated"),
function_def.start(),
checker.semantic(),
)?;
let content = format!(
"{}: {}[{}, {}]",
parameter.parameter.name.id,
binding,
checker.locator().slice(annotation.range()),
checker.locator().slice(default.range())
);
let parameter_edit = Edit::range_replacement(content, parameter.range());
Ok(Fix::unsafe_edits(import_edit, [parameter_edit]))
});
checker.diagnostics.push(diagnostic);
}
}
}
}
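A sketch (route and names invented) of the ordering hazard that the `safe_to_update` comment in this hunk describes: converting a `Depends` default to `Annotated` removes the parameter's default value, which is invalid syntax if an earlier parameter keeps a plain default.

```python
from fastapi import Depends, FastAPI

app = FastAPI()


def get_current_user():  # hypothetical dependency
    ...


@app.get("/users/")
def get_users(
    skip: int = 0,  # plain default: not updatable
    current_user: dict = Depends(get_current_user),
):
    pass


# Fixing only `current_user` would yield
#     def get_users(skip: int = 0, current_user: Annotated[dict, Depends(...)]):
# a parameter without a default following one with a default, which is a
# SyntaxError, so no fix can be offered in that position.
```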

View File

@@ -261,72 +261,3 @@ FAST002.py:38:5: FAST002 [*] FastAPI dependency without `Annotated`
39 40 | ):
40 41 | # do stuff
41 42 | pass
FAST002.py:47:5: FAST002 [*] FastAPI dependency without `Annotated`
|
45 | skip: int,
46 | limit: int,
47 | current_user: User = Depends(get_current_user),
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ FAST002
48 | ):
49 | pass
|
= help: Replace with `Annotated`
Unsafe fix
12 12 | Security,
13 13 | )
14 14 | from pydantic import BaseModel
15 |+from typing import Annotated
15 16 |
16 17 | app = FastAPI()
17 18 | router = APIRouter()
--------------------------------------------------------------------------------
44 45 | def get_users(
45 46 | skip: int,
46 47 | limit: int,
47 |- current_user: User = Depends(get_current_user),
48 |+ current_user: Annotated[User, Depends(get_current_user)],
48 49 | ):
49 50 | pass
50 51 |
FAST002.py:53:5: FAST002 [*] FastAPI dependency without `Annotated`
|
51 | @app.get("/users/")
52 | def get_users(
53 | current_user: User = Depends(get_current_user),
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ FAST002
54 | skip: int = 0,
55 | limit: int = 10,
|
= help: Replace with `Annotated`
Unsafe fix
12 12 | Security,
13 13 | )
14 14 | from pydantic import BaseModel
15 |+from typing import Annotated
15 16 |
16 17 | app = FastAPI()
17 18 | router = APIRouter()
--------------------------------------------------------------------------------
50 51 |
51 52 | @app.get("/users/")
52 53 | def get_users(
53 |- current_user: User = Depends(get_current_user),
54 |+ current_user: Annotated[User, Depends(get_current_user)],
54 55 | skip: int = 0,
55 56 | limit: int = 10,
56 57 | ):
FAST002.py:67:5: FAST002 FastAPI dependency without `Annotated`
|
65 | skip: int = 0,
66 | limit: int = 10,
67 | current_user: User = Depends(get_current_user),
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ FAST002
68 | ):
69 | pass
|
= help: Replace with `Annotated`

View File

@@ -11,7 +11,6 @@ mod tests {
use crate::assert_messages;
use crate::registry::Rule;
use crate::settings::types::PythonVersion;
use crate::settings::LinterSettings;
use crate::test::test_path;
@@ -37,18 +36,4 @@ mod tests {
assert_messages!(snapshot, diagnostics);
Ok(())
}
#[test_case(Path::new("ASYNC109_0.py"); "asyncio")]
#[test_case(Path::new("ASYNC109_1.py"); "trio")]
fn async109_python_310_or_older(path: &Path) -> Result<()> {
let diagnostics = test_path(
Path::new("flake8_async").join(path),
&LinterSettings {
target_version: PythonVersion::Py310,
..LinterSettings::for_rule(Rule::AsyncFunctionWithTimeout)
},
)?;
assert_messages!(path.file_name().unwrap().to_str().unwrap(), diagnostics);
Ok(())
}
}

View File

@@ -6,7 +6,6 @@ use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
use crate::rules::flake8_async::helpers::AsyncModule;
use crate::settings::types::PythonVersion;
/// ## What it does
/// Checks for `async` functions with a `timeout` argument.
@@ -87,11 +86,6 @@ pub(crate) fn async_function_with_timeout(
AsyncModule::AsyncIo
};
// `asyncio.timeout` was first introduced in Python 3.11
if module == AsyncModule::AsyncIo && checker.settings.target_version < PythonVersion::Py311 {
return;
}
checker.diagnostics.push(Diagnostic::new(
AsyncFunctionWithTimeout { module },
timeout.range(),

View File

@@ -1,18 +0,0 @@
---
source: crates/ruff_linter/src/rules/flake8_async/mod.rs
---
ASYNC109_0.py:8:16: ASYNC109 Async function definition with a `timeout` parameter
|
8 | async def func(timeout):
| ^^^^^^^ ASYNC109
9 | ...
|
= help: Use `trio.fail_after` instead
ASYNC109_0.py:12:16: ASYNC109 Async function definition with a `timeout` parameter
|
12 | async def func(timeout=10):
| ^^^^^^^^^^ ASYNC109
13 | ...
|
= help: Use `trio.fail_after` instead

View File

@@ -1,4 +0,0 @@
---
source: crates/ruff_linter/src/rules/flake8_async/mod.rs
---

View File

@@ -1,5 +1,3 @@
use std::borrow::Cow;
use itertools::Itertools;
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
@@ -174,16 +172,9 @@ fn concatenate_strings(a_range: TextRange, b_range: TextRange, locator: &Locator
return None;
}
let mut a_body =
Cow::Borrowed(&a_text[a_leading_quote.len()..a_text.len() - a_trailing_quote.len()]);
let a_body = &a_text[a_leading_quote.len()..a_text.len() - a_trailing_quote.len()];
let b_body = &b_text[b_leading_quote.len()..b_text.len() - b_trailing_quote.len()];
if a_leading_quote.find(['r', 'R']).is_none()
&& matches!(b_body.bytes().next(), Some(b'0'..=b'7'))
{
normalize_ending_octal(&mut a_body);
}
let concatenation = format!("{a_leading_quote}{a_body}{b_body}{a_trailing_quote}");
let range = TextRange::new(a_range.start(), b_range.end());
@@ -192,39 +183,3 @@ fn concatenate_strings(a_range: TextRange, b_range: TextRange, locator: &Locator
range,
)))
}
/// Pads an octal escape at the end of the string
/// to three digits, if necessary.
fn normalize_ending_octal(text: &mut Cow<'_, str>) {
// Early return for short strings
if text.len() < 2 {
return;
}
let mut rev_bytes = text.bytes().rev();
if let Some(last_byte @ b'0'..=b'7') = rev_bytes.next() {
// "\y" -> "\00y"
if has_odd_consecutive_backslashes(&mut rev_bytes.clone()) {
let prefix = &text[..text.len() - 2];
*text = Cow::Owned(format!("{prefix}\\00{}", last_byte as char));
}
// "\xy" -> "\0xy"
else if let Some(penultimate_byte @ b'0'..=b'7') = rev_bytes.next() {
if has_odd_consecutive_backslashes(&mut rev_bytes.clone()) {
let prefix = &text[..text.len() - 3];
*text = Cow::Owned(format!(
"{prefix}\\0{}{}",
penultimate_byte as char, last_byte as char
));
}
}
}
}
fn has_odd_consecutive_backslashes(mut itr: impl Iterator<Item = u8>) -> bool {
let mut odd_backslashes = false;
while let Some(b'\\') = itr.next() {
odd_backslashes = !odd_backslashes;
}
odd_backslashes
}
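A short sketch of why `normalize_ending_octal` pads the escape: naively merging two literal bodies can lengthen a trailing octal escape and silently change the string's value.

```python
# "\12" is the octal escape for 0o12 == 10, i.e. a newline.
assert "\12" "0" == "\n0"  # implicit concatenation of two literals
assert "\0120" == "\n0"    # padded to three digits: value preserved
assert "\120" == "P"       # naive merge: 0o120 == 80, a different string
```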

View File

@@ -296,202 +296,4 @@ ISC.py:73:20: ISC001 [*] Implicitly concatenated string literals on one line
75 75 | f"def"} g"
76 76 |
ISC.py:84:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
| ^^^^^^^^ ISC001
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
|
= help: Combine string literals
Safe fix
81 81 | + f"second"} d"
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 |-_ = "\12""0" # fix should be "\0120"
84 |+_ = "\0120" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
ISC.py:85:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
| ^^^^^^^^^ ISC001
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
|
= help: Combine string literals
Safe fix
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 |-_ = "\\12""0" # fix should be "\\120"
85 |+_ = "\\120" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
ISC.py:86:5: ISC001 [*] Implicitly concatenated string literals on one line
|
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
| ^^^^^^^^^^ ISC001
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
|
= help: Combine string literals
Safe fix
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 |-_ = "\\\12""0" # fix should be "\\\0120"
86 |+_ = "\\\0120" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
ISC.py:87:5: ISC001 [*] Implicitly concatenated string literals on one line
|
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
| ^^^^^^^^^^ ISC001
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
|
= help: Combine string literals
Safe fix
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 |-_ = "\12 0""0" # fix should be "\12 00"
87 |+_ = "\12 00" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
ISC.py:88:5: ISC001 [*] Implicitly concatenated string literals on one line
|
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
| ^^^^^^^^^^ ISC001
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
|
= help: Combine string literals
Safe fix
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 |-_ = r"\12"r"0" # fix should be r"\120"
88 |+_ = r"\120" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
ISC.py:89:5: ISC001 [*] Implicitly concatenated string literals on one line
|
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
| ^^^^^^^^^^^^^^^^^ ISC001
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
|
= help: Combine string literals
Safe fix
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 |-_ = "\12 and more""0" # fix should be "\12 and more0"
89 |+_ = "\12 and more0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
ISC.py:90:5: ISC001 [*] Implicitly concatenated string literals on one line
|
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
| ^^^^^^^ ISC001
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
|
= help: Combine string literals
Safe fix
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 |-_ = "\8""0" # fix should be "\80"
90 |+_ = "\80" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:91:5: ISC001 [*] Implicitly concatenated string literals on one line
|
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
| ^^^^^^^^ ISC001
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 |-_ = "\12""8" # fix should be "\128"
91 |+_ = "\128" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:92:5: ISC001 [*] Implicitly concatenated string literals on one line
|
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
| ^^^^^^^^^^ ISC001
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 |-_ = "\12""foo" # fix should be "\12foo"
92 |+_ = "\12foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:93:5: ISC001 [*] Implicitly concatenated string literals on one line
|
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
| ^^^^^^^^ ISC001
|
= help: Combine string literals
Safe fix
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 |-_ = "\12" "" # fix should be "\12"
93 |+_ = "\12" # fix should be "\12"

View File

@@ -50,6 +50,6 @@ ISC.py:80:10: ISC003 Explicitly concatenated string should be implicitly concate
| __________^
81 | | + f"second"} d"
| |_______________^ ISC003
82 |
83 | # See https://github.com/astral-sh/ruff/issues/12936
|

View File

@@ -296,202 +296,4 @@ ISC.py:73:20: ISC001 [*] Implicitly concatenated string literals on one line
75 75 | f"def"} g"
76 76 |
ISC.py:84:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
| ^^^^^^^^ ISC001
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
|
= help: Combine string literals
Safe fix
81 81 | + f"second"} d"
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 |-_ = "\12""0" # fix should be "\0120"
84 |+_ = "\0120" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
ISC.py:85:5: ISC001 [*] Implicitly concatenated string literals on one line
|
83 | # See https://github.com/astral-sh/ruff/issues/12936
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
| ^^^^^^^^^ ISC001
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
|
= help: Combine string literals
Safe fix
82 82 |
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 |-_ = "\\12""0" # fix should be "\\120"
85 |+_ = "\\120" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
ISC.py:86:5: ISC001 [*] Implicitly concatenated string literals on one line
|
84 | _ = "\12""0" # fix should be "\0120"
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
| ^^^^^^^^^^ ISC001
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
|
= help: Combine string literals
Safe fix
83 83 | # See https://github.com/astral-sh/ruff/issues/12936
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 |-_ = "\\\12""0" # fix should be "\\\0120"
86 |+_ = "\\\0120" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
ISC.py:87:5: ISC001 [*] Implicitly concatenated string literals on one line
|
85 | _ = "\\12""0" # fix should be "\\120"
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
| ^^^^^^^^^^ ISC001
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
|
= help: Combine string literals
Safe fix
84 84 | _ = "\12""0" # fix should be "\0120"
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 |-_ = "\12 0""0" # fix should be "\12 00"
87 |+_ = "\12 00" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
ISC.py:88:5: ISC001 [*] Implicitly concatenated string literals on one line
|
86 | _ = "\\\12""0" # fix should be "\\\0120"
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
| ^^^^^^^^^^ ISC001
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
|
= help: Combine string literals
Safe fix
85 85 | _ = "\\12""0" # fix should be "\\120"
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 |-_ = r"\12"r"0" # fix should be r"\120"
88 |+_ = r"\120" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
ISC.py:89:5: ISC001 [*] Implicitly concatenated string literals on one line
|
87 | _ = "\12 0""0" # fix should be "\12 00"
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
| ^^^^^^^^^^^^^^^^^ ISC001
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
|
= help: Combine string literals
Safe fix
86 86 | _ = "\\\12""0" # fix should be "\\\0120"
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 |-_ = "\12 and more""0" # fix should be "\12 and more0"
89 |+_ = "\12 and more0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
ISC.py:90:5: ISC001 [*] Implicitly concatenated string literals on one line
|
88 | _ = r"\12"r"0" # fix should be r"\120"
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
| ^^^^^^^ ISC001
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
|
= help: Combine string literals
Safe fix
87 87 | _ = "\12 0""0" # fix should be "\12 00"
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 |-_ = "\8""0" # fix should be "\80"
90 |+_ = "\80" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:91:5: ISC001 [*] Implicitly concatenated string literals on one line
|
89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
| ^^^^^^^^ ISC001
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
88 88 | _ = r"\12"r"0" # fix should be r"\120"
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 |-_ = "\12""8" # fix should be "\128"
91 |+_ = "\128" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:92:5: ISC001 [*] Implicitly concatenated string literals on one line
|
90 | _ = "\8""0" # fix should be "\80"
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
| ^^^^^^^^^^ ISC001
93 | _ = "\12" "" # fix should be "\12"
|
= help: Combine string literals
Safe fix
89 89 | _ = "\12 and more""0" # fix should be "\12 and more0"
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 |-_ = "\12""foo" # fix should be "\12foo"
92 |+_ = "\12foo" # fix should be "\12foo"
93 93 | _ = "\12" "" # fix should be "\12"
ISC.py:93:5: ISC001 [*] Implicitly concatenated string literals on one line
|
91 | _ = "\12""8" # fix should be "\128"
92 | _ = "\12""foo" # fix should be "\12foo"
93 | _ = "\12" "" # fix should be "\12"
| ^^^^^^^^ ISC001
|
= help: Combine string literals
Safe fix
90 90 | _ = "\8""0" # fix should be "\80"
91 91 | _ = "\12""8" # fix should be "\128"
92 92 | _ = "\12""foo" # fix should be "\12foo"
93 |-_ = "\12" "" # fix should be "\12"
93 |+_ = "\12" # fix should be "\12"

View File

@@ -27,14 +27,14 @@ use crate::checkers::ast::Checker;
///
/// ## Example
///
/// ```pyi
/// ```python
/// class Foo:
/// def __eq__(self, obj: typing.Any) -> bool: ...
/// ```
///
/// Use instead:
///
/// ```pyi
/// ```python
/// class Foo:
/// def __eq__(self, obj: object) -> bool: ...
/// ```

View File

@@ -34,17 +34,19 @@ use crate::registry::Rule;
/// ```
///
/// ## Example
/// ```pyi
/// ```python
/// import sys
///
/// if sys.version_info > (3, 8): ...
/// if sys.version_info > (3, 8):
/// ...
/// ```
///
/// Use instead:
/// ```pyi
/// ```python
/// import sys
///
/// if sys.version_info >= (3, 9): ...
/// if sys.version_info >= (3, 9):
/// ...
/// ```
#[violation]
pub struct BadVersionInfoComparison;
@@ -68,23 +70,27 @@ impl Violation for BadVersionInfoComparison {
///
/// ## Example
///
/// ```pyi
/// ```python
/// import sys
///
/// if sys.version_info < (3, 10):
///
/// def read_data(x, *, preserve_order=True): ...
///
/// else:
///
/// def read_data(x): ...
/// ```
///
/// Use instead:
///
/// ```pyi
/// ```python
/// if sys.version_info >= (3, 10):
///
/// def read_data(x): ...
///
/// else:
///
/// def read_data(x, *, preserve_order=True): ...
/// ```
#[violation]

View File

@@ -21,16 +21,17 @@ use crate::checkers::ast::Checker;
/// precisely.
///
/// ## Example
/// ```pyi
/// ```python
/// from collections import namedtuple
///
/// person = namedtuple("Person", ["name", "age"])
/// ```
///
/// Use instead:
/// ```pyi
/// ```python
/// from typing import NamedTuple
///
///
/// class Person(NamedTuple):
/// name: str
/// age: int

View File

@@ -20,24 +20,27 @@ use crate::checkers::ast::Checker;
///
/// ## Example
///
/// ```pyi
/// ```python
/// from typing import TypeAlias
///
/// a = b = int
///
///
/// class Klass: ...
///
///
/// Klass.X: TypeAlias = int
/// ```
///
/// Use instead:
///
/// ```pyi
/// ```python
/// from typing import TypeAlias
///
/// a: TypeAlias = int
/// b: TypeAlias = int
///
///
/// class Klass:
/// X: TypeAlias = int
/// ```

View File

@@ -16,17 +16,19 @@ use crate::checkers::ast::Checker;
/// analyze your code.
///
/// ## Example
/// ```pyi
/// ```python
/// import sys
///
/// if (3, 10) <= sys.version_info < (3, 12): ...
/// if (3, 10) <= sys.version_info < (3, 12):
/// ...
/// ```
///
/// Use instead:
/// ```pyi
/// ```python
/// import sys
///
/// if sys.version_info >= (3, 10) and sys.version_info < (3, 12): ...
/// if sys.version_info >= (3, 10) and sys.version_info < (3, 12):
/// ...
/// ```
///
/// ## References

Some files were not shown because too many files have changed in this diff