Compare commits


18 Commits

All commits authored by Douglas Creager.

SHA1        Message                                                   Date
70a5373791  Include scope kind in scope_count metric, add bar graph   2025-02-07 11:39:01 -05:00
8e880c8f61  List comprehension                                        2025-02-07 10:18:30 -05:00
5e34d79cc1  Use default_missing_value instead of double-Option        2025-02-07 10:17:11 -05:00
b0beb93517  pre-commit                                                2025-02-06 17:36:55 -05:00
391917bc87  Describe before/after comparisons                         2025-02-06 17:28:40 -05:00
f587a89a3b  Optionally save output to file                            2025-02-06 17:24:41 -05:00
5a7650d5ee  Add README                                                2025-02-06 17:22:23 -05:00
ff5e65f6f5  Add histogram subcommand                                  2025-02-06 17:13:15 -05:00
88ef456757  Add initial plot script                                   2025-02-06 16:57:30 -05:00
1e1470073c  Clippy                                                    2025-02-06 14:24:07 -05:00
ca237345d9  Add --metrics flag                                        2025-02-06 14:22:34 -05:00
5ec0cb32f8  Include time since start of process                       2025-02-06 14:01:23 -05:00
e73374c146  Include executable name in metrics                        2025-02-06 13:56:46 -05:00
382349f85a  Better documentation                                      2025-02-06 13:46:48 -05:00
80efb01e5d  Hide the thread-local buffer better                       2025-02-06 13:46:48 -05:00
c8b2cc4e00  Include timestamp in JSON metrics                         2025-02-06 13:46:48 -05:00
25a84c7b18  Initial JSON metrics exporter                             2025-02-06 13:46:48 -05:00
1c71f9b8c4  Start recording some metrics                              2025-02-06 13:46:48 -05:00
638 changed files with 4290 additions and 5335 deletions


@@ -1,64 +1,5 @@
# Changelog
## 0.9.5
### Preview features
- Recognize all symbols named `TYPE_CHECKING` for `in_type_checking_block` ([#15719](https://github.com/astral-sh/ruff/pull/15719))
- \[`flake8-comprehensions`\] Handle builtins at top of file correctly for `unnecessary-dict-comprehension-for-iterable` (`C420`) ([#15837](https://github.com/astral-sh/ruff/pull/15837))
- \[`flake8-logging`\] `.exception()` and `exc_info=` outside exception handlers (`LOG004`, `LOG014`) ([#15799](https://github.com/astral-sh/ruff/pull/15799))
- \[`flake8-pyi`\] Fix incorrect behaviour of `custom-typevar-return-type` preview-mode autofix if `typing` was already imported (`PYI019`) ([#15853](https://github.com/astral-sh/ruff/pull/15853))
- \[`flake8-pyi`\] Fix more complex cases (`PYI019`) ([#15821](https://github.com/astral-sh/ruff/pull/15821))
- \[`flake8-pyi`\] Make `PYI019` autofixable for `.py` files in preview mode as well as stubs ([#15889](https://github.com/astral-sh/ruff/pull/15889))
- \[`flake8-pyi`\] Remove type parameter correctly when it is the last (`PYI019`) ([#15854](https://github.com/astral-sh/ruff/pull/15854))
- \[`pylint`\] Fix missing parens in unsafe fix for `unnecessary-dunder-call` (`PLC2801`) ([#15762](https://github.com/astral-sh/ruff/pull/15762))
- \[`pyupgrade`\] Better messages and diagnostic range (`UP015`) ([#15872](https://github.com/astral-sh/ruff/pull/15872))
- \[`pyupgrade`\] Rename private type parameters in PEP 695 generics (`UP049`) ([#15862](https://github.com/astral-sh/ruff/pull/15862))
- \[`refurb`\] Also report non-name expressions (`FURB169`) ([#15905](https://github.com/astral-sh/ruff/pull/15905))
- \[`refurb`\] Mark fix as unsafe if there are comments (`FURB171`) ([#15832](https://github.com/astral-sh/ruff/pull/15832))
- \[`ruff`\] Classes with mixed type variable style (`RUF053`) ([#15841](https://github.com/astral-sh/ruff/pull/15841))
- \[`airflow`\] `BashOperator` has been moved to `airflow.providers.standard.operators.bash.BashOperator` (`AIR302`) ([#15922](https://github.com/astral-sh/ruff/pull/15922))
- \[`flake8-pyi`\] Add autofix for unused-private-type-var (`PYI018`) ([#15999](https://github.com/astral-sh/ruff/pull/15999))
- \[`flake8-pyi`\] Significantly improve accuracy of `PYI019` if preview mode is enabled ([#15888](https://github.com/astral-sh/ruff/pull/15888))
### Rule changes
- Preserve triple quotes and prefixes for strings ([#15818](https://github.com/astral-sh/ruff/pull/15818))
- \[`flake8-comprehensions`\] Skip when `TypeError` present from too many (kw)args for `C410`,`C411`, and `C418` ([#15838](https://github.com/astral-sh/ruff/pull/15838))
- \[`flake8-pyi`\] Rename `PYI019` and improve its diagnostic message ([#15885](https://github.com/astral-sh/ruff/pull/15885))
- \[`pep8-naming`\] Ignore `@override` methods (`N803`) ([#15954](https://github.com/astral-sh/ruff/pull/15954))
- \[`pyupgrade`\] Reuse replacement logic from `UP046` and `UP047` to preserve more comments (`UP040`) ([#15840](https://github.com/astral-sh/ruff/pull/15840))
- \[`ruff`\] Analyze deferred annotations before enforcing `mutable-(data)class-default` and `function-call-in-dataclass-default-argument` (`RUF008`,`RUF009`,`RUF012`) ([#15921](https://github.com/astral-sh/ruff/pull/15921))
- \[`pycodestyle`\] Exempt `sys.path += ...` calls (`E402`) ([#15980](https://github.com/astral-sh/ruff/pull/15980))
### Configuration
- Config error only when `flake8-import-conventions` alias conflicts with `isort.required-imports` bound name ([#15918](https://github.com/astral-sh/ruff/pull/15918))
- Workaround Even Better TOML crash related to `allOf` ([#15992](https://github.com/astral-sh/ruff/pull/15992))
### Bug fixes
- \[`flake8-comprehensions`\] Unnecessary `list` comprehension (rewrite as a `set` comprehension) (`C403`) - Handle extraneous parentheses around list comprehension ([#15877](https://github.com/astral-sh/ruff/pull/15877))
- \[`flake8-comprehensions`\] Handle trailing comma in fixes for `unnecessary-generator-list/set` (`C400`,`C401`) ([#15929](https://github.com/astral-sh/ruff/pull/15929))
- \[`flake8-pyi`\] Fix several correctness issues with `custom-type-var-return-type` (`PYI019`) ([#15851](https://github.com/astral-sh/ruff/pull/15851))
- \[`pep8-naming`\] Consider any number of leading underscore for `N801` ([#15988](https://github.com/astral-sh/ruff/pull/15988))
- \[`pyflakes`\] Visit forward annotations in `TypeAliasType` as types (`F401`) ([#15829](https://github.com/astral-sh/ruff/pull/15829))
- \[`pylint`\] Correct min/max auto-fix and suggestion for (`PL1730`) ([#15930](https://github.com/astral-sh/ruff/pull/15930))
- \[`refurb`\] Handle unparenthesized tuples correctly (`FURB122`, `FURB142`) ([#15953](https://github.com/astral-sh/ruff/pull/15953))
- \[`refurb`\] Avoid `None | None` as well as better detection and fix (`FURB168`) ([#15779](https://github.com/astral-sh/ruff/pull/15779))
### Documentation
- Add deprecation warning for `ruff-lsp` related settings ([#15850](https://github.com/astral-sh/ruff/pull/15850))
- Docs (`linter.md`): clarify that Python files are always searched for in subdirectories ([#15882](https://github.com/astral-sh/ruff/pull/15882))
- Fix a typo in `non_pep695_generic_class.rs` ([#15946](https://github.com/astral-sh/ruff/pull/15946))
- Improve Docs: Pylint subcategories' codes ([#15909](https://github.com/astral-sh/ruff/pull/15909))
- Remove non-existing `lint.extendIgnore` editor setting ([#15844](https://github.com/astral-sh/ruff/pull/15844))
- Update black deviations ([#15928](https://github.com/astral-sh/ruff/pull/15928))
- Mention `UP049` in `UP046` and `UP047`, add `See also` section to `UP040` ([#15956](https://github.com/astral-sh/ruff/pull/15956))
- Add instance variable examples to `RUF012` ([#15982](https://github.com/astral-sh/ruff/pull/15982))
- Explain precedence for `ignore` and `select` config ([#15883](https://github.com/astral-sh/ruff/pull/15883))
## 0.9.4
### Preview features

Cargo.lock (generated)

@@ -471,7 +471,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "117725a109d387c937a1533ce01b450cbde6b88abceea8473c4d7a85853cda3c"
dependencies = [
"lazy_static",
"windows-sys 0.48.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -480,7 +480,7 @@ version = "3.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fde0e0ec90c9dfb3b4b1a0891a7dcd0e2bffde2f7efed5fe7c9bb00e5bfb915e"
dependencies = [
"windows-sys 0.48.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -855,6 +855,12 @@ version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34aa73646ffb006b8f5147f3dc182bd4bcb190227ce861fc4a4844bf8e3cb2c0"
[[package]]
name = "endian-type"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c34f04666d835ff5d62e058c3995147c06f42fe86ff053337632bca83e42702d"
[[package]]
name = "env_filter"
version = "0.1.3"
@@ -897,7 +903,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "33d852cb9b869c2a9b3df2f71a3074817f01e1844f839a144f5fcef059a4eb5d"
dependencies = [
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -1475,7 +1481,7 @@ checksum = "e19b23d53f35ce9f56aebc7d1bb4e6ac1e9c0db7ac85c8d1760c04379edced37"
dependencies = [
"hermit-abi 0.4.0",
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -1728,6 +1734,36 @@ version = "2.7.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "78ca9ab1a0babb1e7d5695e3530886289c18cf2f87ec19a575a0abdce112e3a3"
[[package]]
name = "metrics"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7a7deb012b3b2767169ff203fadb4c6b0b82b947512e5eb9e0b78c2e186ad9e3"
dependencies = [
"ahash",
"portable-atomic",
]
[[package]]
name = "metrics-util"
version = "0.19.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dbd4884b1dd24f7d6628274a2f5ae22465c337c5ba065ec9b6edccddf8acc673"
dependencies = [
"aho-corasick",
"crossbeam-epoch",
"crossbeam-utils",
"hashbrown 0.15.2",
"indexmap",
"metrics",
"ordered-float",
"quanta",
"radix_trie",
"rand 0.8.5",
"rand_xoshiro",
"sketches-ddsketch",
]
[[package]]
name = "mimalloc"
version = "0.1.43"
@@ -1789,6 +1825,15 @@ dependencies = [
"uuid",
]
[[package]]
name = "nibble_vec"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "77a5d83df9f36fe23f0c3648c6bbb8b0298bb5f1939c8f2704431371f4b84d43"
dependencies = [
"smallvec",
]
[[package]]
name = "nix"
version = "0.29.0"
@@ -1904,6 +1949,15 @@ version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "04744f49eae99ab78e0d5c0b603ab218f515ea8cfe5a456d7629ad883a3b6e7d"
[[package]]
name = "ordered-float"
version = "4.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7bb71e1b3fa6ca1c61f383464aaf2bb0e2f8e772a1f01d486832464de363b951"
dependencies = [
"num-traits",
]
[[package]]
name = "ordermap"
version = "0.5.5"
@@ -2255,6 +2309,21 @@ dependencies = [
"toml",
]
[[package]]
name = "quanta"
version = "0.12.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3bd1fe6824cea6538803de3ff1bc0cf3949024db3d43c9643024bfb33a807c0e"
dependencies = [
"crossbeam-utils",
"libc",
"once_cell",
"raw-cpuid",
"wasi 0.11.0+wasi-snapshot-preview1",
"web-sys",
"winapi",
]
[[package]]
name = "quick-junit"
version = "0.5.1"
@@ -2308,6 +2377,16 @@ dependencies = [
"proc-macro2",
]
[[package]]
name = "radix_trie"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c069c179fcdc6a2fe24d8d18305cf085fdbd4f922c041943e203685d6a1c58fd"
dependencies = [
"endian-type",
"nibble_vec",
]
[[package]]
name = "rand"
version = "0.8.5"
@@ -2369,6 +2448,24 @@ dependencies = [
"zerocopy 0.8.14",
]
[[package]]
name = "rand_xoshiro"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6f97cdb2a36ed4183de61b2f824cc45c9f1037f28afe0a322e9fff4c108b5aaa"
dependencies = [
"rand_core 0.6.4",
]
[[package]]
name = "raw-cpuid"
version = "11.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c6928fa44c097620b706542d428957635951bade7143269085389d42c8a4927e"
dependencies = [
"bitflags 2.8.0",
]
[[package]]
name = "rayon"
version = "1.10.0"
@@ -2403,12 +2500,14 @@ dependencies = [
"filetime",
"insta",
"insta-cmd",
"metrics",
"rayon",
"red_knot_project",
"red_knot_python_semantic",
"red_knot_server",
"regex",
"ruff_db",
"ruff_metrics",
"ruff_python_trivia",
"salsa",
"tempfile",
@@ -2439,7 +2538,6 @@ dependencies = [
"ruff_text_size",
"rustc-hash 2.1.0",
"salsa",
"schemars",
"serde",
"thiserror 2.0.11",
"toml",
@@ -2462,6 +2560,7 @@ dependencies = [
"insta",
"itertools 0.14.0",
"memchr",
"metrics",
"ordermap",
"quickcheck",
"quickcheck_macros",
@@ -2479,7 +2578,6 @@ dependencies = [
"ruff_text_size",
"rustc-hash 2.1.0",
"salsa",
"schemars",
"serde",
"smallvec",
"static_assertions",
@@ -2642,7 +2740,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.9.5"
version = "0.9.4"
dependencies = [
"anyhow",
"argfile",
@@ -2770,7 +2868,6 @@ dependencies = [
"ruff_text_size",
"rustc-hash 2.1.0",
"salsa",
"schemars",
"serde",
"tempfile",
"thiserror 2.0.11",
@@ -2795,7 +2892,6 @@ dependencies = [
"libcst",
"pretty_assertions",
"rayon",
"red_knot_project",
"regex",
"ruff",
"ruff_diagnostics",
@@ -2876,7 +2972,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.9.5"
version = "0.9.4"
dependencies = [
"aho-corasick",
"anyhow",
@@ -2947,6 +3043,15 @@ dependencies = [
"syn 2.0.98",
]
[[package]]
name = "ruff_metrics"
version = "0.0.0"
dependencies = [
"metrics",
"metrics-util",
"serde_json",
]
[[package]]
name = "ruff_notebook"
version = "0.0.0"
@@ -3194,7 +3299,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.9.5"
version = "0.9.4"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -3287,7 +3392,7 @@ dependencies = [
"errno",
"libc",
"linux-raw-sys",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -3537,6 +3642,12 @@ version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "56199f7ddabf13fe5074ce809e7d3f42b42ae711800501b5b16ea82ad029c39d"
[[package]]
name = "sketches-ddsketch"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c1e9a774a6c28142ac54bb25d25562e6bcf957493a184f15ad4eebccb23e410a"
[[package]]
name = "smallvec"
version = "1.13.2"
@@ -3665,7 +3776,7 @@ dependencies = [
"getrandom 0.3.1",
"once_cell",
"rustix",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -4429,7 +4540,7 @@ version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cf221c93e13a30d793f7645a0e7762c55d169dbb0a49671918a2319d289b10bb"
dependencies = [
"windows-sys 0.48.0",
"windows-sys 0.59.0",
]
[[package]]


@@ -22,6 +22,7 @@ ruff_graph = { path = "crates/ruff_graph" }
ruff_index = { path = "crates/ruff_index" }
ruff_linter = { path = "crates/ruff_linter" }
ruff_macros = { path = "crates/ruff_macros" }
ruff_metrics = { path = "crates/ruff_metrics" }
ruff_notebook = { path = "crates/ruff_notebook" }
ruff_python_ast = { path = "crates/ruff_python_ast" }
ruff_python_codegen = { path = "crates/ruff_python_codegen" }
@@ -105,6 +106,8 @@ lsp-types = { git = "https://github.com/astral-sh/lsp-types.git", rev = "3512a9f
] }
matchit = { version = "0.8.1" }
memchr = { version = "2.7.1" }
metrics = { version = "0.24.1" }
metrics-util = { version = "0.19.0" }
mimalloc = { version = "0.1.39" }
natord = { version = "1.0.9" }
notify = { version = "8.0.0" }


@@ -149,8 +149,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.9.5/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.9.5/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.9.4/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.9.4/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -183,7 +183,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.9.5
rev: v0.9.4
hooks:
# Run the linter.
- id: ruff
@@ -452,7 +452,6 @@ Ruff is used by a number of major open-source projects and companies, including:
- ING Bank ([popmon](https://github.com/ing-bank/popmon), [probatus](https://github.com/ing-bank/probatus))
- [Ibis](https://github.com/ibis-project/ibis)
- [ivy](https://github.com/unifyai/ivy)
- [JAX](https://github.com/jax-ml/jax)
- [Jupyter](https://github.com/jupyter-server/jupyter_server)
- [Kraken Tech](https://kraken.tech/)
- [LangChain](https://github.com/hwchase17/langchain)


@@ -16,6 +16,7 @@ red_knot_python_semantic = { workspace = true }
red_knot_project = { workspace = true, features = ["zstd"] }
red_knot_server = { workspace = true }
ruff_db = { workspace = true, features = ["os", "cache"] }
ruff_metrics = { workspace = true }
anyhow = { workspace = true }
chrono = { workspace = true }
@@ -24,6 +25,7 @@ colored = { workspace = true }
countme = { workspace = true, features = ["enable"] }
crossbeam = { workspace = true }
ctrlc = { version = "3.4.4" }
metrics = { workspace = true }
rayon = { workspace = true }
salsa = { workspace = true }
tracing = { workspace = true, features = ["release_max_level_debug"] }


@@ -63,6 +63,13 @@ pub(crate) struct CheckCommand {
#[clap(flatten)]
pub(crate) verbosity: Verbosity,
/// Whether to output metrics about type-checking performance. If you provide a path, metrics
/// will be written to that file. If you provide this option but don't provide a path, metrics
/// will be written to a file called `metrics.json` in the current directory. We will _append_
/// metrics to the file if it already exists.
#[arg(long, value_name = "PATH", default_missing_value="metrics.json", num_args=0..=1)]
pub(crate) metrics: Option<SystemPathBuf>,
#[clap(flatten)]
pub(crate) rules: RulesArg,
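The `default_missing_value = "metrics.json"` plus `num_args = 0..=1` combination above gives the flag three observable states without a nested `Option<Option<_>>`. A minimal std-only sketch of those semantics (this hand-rolled parser is illustrative only; the real parsing is done by clap):

```rust
use std::path::PathBuf;

/// Resolves `--metrics` from a raw argument list:
/// * flag absent           -> None (metrics disabled)
/// * `--metrics`           -> Some("metrics.json") (the default missing value)
/// * `--metrics out.json`  -> Some("out.json")
fn resolve_metrics(args: &[&str]) -> Option<PathBuf> {
    let pos = args.iter().position(|a| *a == "--metrics")?;
    // A following token that is not another flag is treated as the path;
    // otherwise fall back to the default missing value.
    match args.get(pos + 1) {
        Some(next) if !next.starts_with("--") => Some(PathBuf::from(*next)),
        _ => Some(PathBuf::from("metrics.json")),
    }
}

fn main() {
    assert_eq!(resolve_metrics(&["check"]), None);
    assert_eq!(
        resolve_metrics(&["check", "--metrics"]),
        Some(PathBuf::from("metrics.json"))
    );
    assert_eq!(
        resolve_metrics(&["check", "--metrics", "out.json"]),
        Some(PathBuf::from("out.json"))
    );
}
```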


@@ -2,8 +2,10 @@
use anyhow::Context;
use colored::Colorize;
use ruff_db::system::SystemPathBuf;
use ruff_metrics::JsonRecorder;
use std::fmt;
use std::fs::File;
use std::fs::{File, OpenOptions};
use std::io::BufWriter;
use tracing::{Event, Subscriber};
use tracing_subscriber::filter::LevelFilter;
@@ -252,3 +254,18 @@ where
writeln!(writer)
}
}
pub(crate) fn setup_metrics(dest: Option<&SystemPathBuf>) {
// If --metrics is not provided at all, don't collect any metrics.
let Some(dest) = dest else {
return;
};
let dest = OpenOptions::new()
.append(true)
.create(true)
.open(dest.as_std_path())
.expect("cannot open metrics file");
let recorder = JsonRecorder::new(dest);
metrics::set_global_recorder(recorder).expect("metrics recorder already registered");
}
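The `.append(true).create(true)` options used above are what make the documented "append if it already exists" behavior work: the file is created on the first run and later opens never truncate it. A std-only demonstration (file name is arbitrary for the demo):

```rust
use std::fs::{self, OpenOptions};
use std::io::Write;
use std::path::Path;

/// Appends one JSON line, creating the file if needed -- the same
/// open options as the metrics destination above.
fn append_metric_line(path: &Path, line: &str) -> std::io::Result<()> {
    let mut f = OpenOptions::new().append(true).create(true).open(path)?;
    writeln!(f, "{line}")
}

fn main() -> std::io::Result<()> {
    let path = std::env::temp_dir().join("metrics_append_demo.json");
    let _ = fs::remove_file(&path);
    append_metric_line(&path, "{\"run\":1}")?;
    append_metric_line(&path, "{\"run\":2}")?;
    // The second open appended rather than truncating: both records survive.
    assert_eq!(fs::read_to_string(&path)?, "{\"run\":1}\n{\"run\":2}\n");
    fs::remove_file(&path)
}
```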


@@ -5,7 +5,7 @@ use anyhow::Result;
use std::sync::Mutex;
use crate::args::{Args, CheckCommand, Command};
use crate::logging::setup_tracing;
use crate::logging::{setup_metrics, setup_tracing};
use anyhow::{anyhow, Context};
use clap::Parser;
use colored::Colorize;
@@ -68,6 +68,7 @@ fn run_check(args: CheckCommand) -> anyhow::Result<ExitStatus> {
let verbosity = args.verbosity.level();
countme::enable(verbosity.is_trace());
let _guard = setup_tracing(verbosity)?;
setup_metrics(args.metrics.as_ref());
// The base path to which all CLI arguments are relative to.
let cli_base_path = {


@@ -28,7 +28,6 @@ pep440_rs = { workspace = true }
rayon = { workspace = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
schemars = { workspace = true, optional = true }
serde = { workspace = true }
thiserror = { workspace = true }
toml = { workspace = true }
@@ -41,9 +40,8 @@ insta = { workspace = true, features = ["redactions", "ron"] }
[features]
default = ["zstd"]
deflate = ["red_knot_vendored/deflate"]
schemars = ["dep:schemars", "ruff_db/schemars", "red_knot_python_semantic/schemars"]
zstd = ["red_knot_vendored/zstd"]
deflate = ["red_knot_vendored/deflate"]
[lints]
workspace = true


@@ -18,16 +18,13 @@ use thiserror::Error;
/// The options for the project.
#[derive(Debug, Default, Clone, PartialEq, Eq, Combine, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case", deny_unknown_fields)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct Options {
/// Configures the type checking environment.
#[serde(skip_serializing_if = "Option::is_none")]
pub environment: Option<EnvironmentOptions>,
#[serde(skip_serializing_if = "Option::is_none")]
pub src: Option<SrcOptions>,
/// Configures the enabled lints and their severity.
#[serde(skip_serializing_if = "Option::is_none")]
pub rules: Option<Rules>,
}
@@ -180,22 +177,10 @@ impl Options {
#[derive(Debug, Default, Clone, Eq, PartialEq, Combine, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case", deny_unknown_fields)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct EnvironmentOptions {
/// Specifies the version of Python that will be used to execute the source code.
/// The version should be specified as a string in the format `M.m` where `M` is the major version
/// and `m` is the minor (e.g. "3.0" or "3.6").
/// If a version is provided, knot will generate errors if the source code makes use of language features
/// that are not supported in that version.
/// It will also tailor its use of type stub files, which conditionalizes type definitions based on the version.
#[serde(skip_serializing_if = "Option::is_none")]
pub python_version: Option<RangedValue<PythonVersion>>,
/// Specifies the target platform that will be used to execute the source code.
/// If specified, Red Knot will tailor its use of type stub files,
/// which conditionalize type definitions based on the platform.
///
/// If no platform is specified, knot will use `all` or the current platform in the LSP use case.
#[serde(skip_serializing_if = "Option::is_none")]
pub python_platform: Option<RangedValue<PythonPlatform>>,
@@ -219,7 +204,6 @@ pub struct EnvironmentOptions {
#[derive(Debug, Default, Clone, Eq, PartialEq, Combine, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case", deny_unknown_fields)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct SrcOptions {
/// The root of the project, used for finding first-party modules.
#[serde(skip_serializing_if = "Option::is_none")]
@@ -228,9 +212,7 @@ pub struct SrcOptions {
#[derive(Debug, Default, Clone, Eq, PartialEq, Combine, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case", transparent)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct Rules {
#[cfg_attr(feature = "schemars", schemars(with = "schema::Rules"))]
inner: FxHashMap<RangedValue<String>, RangedValue<Level>>,
}
@@ -244,69 +226,6 @@ impl FromIterator<(RangedValue<String>, RangedValue<Level>)> for Rules {
}
}
#[cfg(feature = "schemars")]
mod schema {
use crate::DEFAULT_LINT_REGISTRY;
use red_knot_python_semantic::lint::Level;
use schemars::gen::SchemaGenerator;
use schemars::schema::{
InstanceType, Metadata, ObjectValidation, Schema, SchemaObject, SubschemaValidation,
};
use schemars::JsonSchema;
pub(super) struct Rules;
impl JsonSchema for Rules {
fn schema_name() -> String {
"Rules".to_string()
}
fn json_schema(gen: &mut SchemaGenerator) -> Schema {
let registry = &*DEFAULT_LINT_REGISTRY;
let level_schema = gen.subschema_for::<Level>();
let properties: schemars::Map<String, Schema> = registry
.lints()
.iter()
.map(|lint| {
(
lint.name().to_string(),
Schema::Object(SchemaObject {
metadata: Some(Box::new(Metadata {
title: Some(lint.summary().to_string()),
description: Some(lint.documentation()),
deprecated: lint.status.is_deprecated(),
default: Some(lint.default_level.to_string().into()),
..Metadata::default()
})),
subschemas: Some(Box::new(SubschemaValidation {
one_of: Some(vec![level_schema.clone()]),
..Default::default()
})),
..Default::default()
}),
)
})
.collect();
Schema::Object(SchemaObject {
instance_type: Some(InstanceType::Object.into()),
object: Some(Box::new(ObjectValidation {
properties,
// Allow unknown rules: Red Knot will warn about them.
// It gives a better experience when using an older Red Knot version because
// the schema will not deny rules that have been removed in newer versions.
additional_properties: Some(Box::new(level_schema)),
..ObjectValidation::default()
})),
..Default::default()
})
}
}
}
#[derive(Error, Debug)]
pub enum KnotTomlError {
#[error(transparent)]


@@ -1,9 +1,8 @@
use crate::combine::Combine;
use crate::Db;
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ruff_macros::Combine;
use ruff_text_size::{TextRange, TextSize};
use serde::{Deserialize, Deserializer};
use serde::{Deserialize, Deserializer, Serialize, Serializer};
use std::cell::RefCell;
use std::cmp::Ordering;
use std::fmt;
@@ -71,19 +70,15 @@ impl Drop for ValueSourceGuard {
///
/// This ensures that two resolved configurations are identical even if the position of a value has changed
/// or if the values were loaded from different sources.
#[derive(Clone, serde::Serialize)]
#[serde(transparent)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
#[derive(Clone)]
pub struct RangedValue<T> {
value: T,
#[serde(skip)]
source: ValueSource,
/// The byte range of `value` in `source`.
///
/// Can be `None` because not all sources support a range.
/// For example, arguments provided on the CLI won't have a range attached.
#[serde(skip)]
range: Option<TextRange>,
}
@@ -271,6 +266,18 @@ where
}
}
impl<T> Serialize for RangedValue<T>
where
T: Serialize,
{
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
self.value.serialize(serializer)
}
}
/// A possibly relative path in a configuration file.
///
/// Relative paths in configuration files or from CLI options
@@ -279,19 +286,9 @@ where
/// * CLI: The path is relative to the current working directory
/// * Configuration file: The path is relative to the project's root.
#[derive(
Debug,
Clone,
serde::Serialize,
serde::Deserialize,
PartialEq,
Eq,
PartialOrd,
Ord,
Hash,
Combine,
Debug, Clone, serde::Serialize, serde::Deserialize, PartialEq, Eq, PartialOrd, Ord, Hash,
)]
#[serde(transparent)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct RelativePathBuf(RangedValue<SystemPathBuf>);
impl RelativePathBuf {
@@ -328,3 +325,13 @@ impl RelativePathBuf {
SystemPath::absolute(&self.0, relative_to)
}
}
impl Combine for RelativePathBuf {
fn combine(self, other: Self) -> Self {
Self(self.0.combine(other.0))
}
fn combine_with(&mut self, other: Self) {
self.0.combine_with(other.0);
}
}
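The hand-written `Serialize` impl in this file replaces the former `#[serde(transparent)]` derive: the wrapper serializes as exactly its inner value, and the `source`/`range` metadata never appears in the output. A std-only analog of that delegation pattern, using `Display` instead of serde and hypothetical types not from the codebase:

```rust
use std::fmt;

// Illustrative wrapper: `value` is the payload, `source` is provenance
// metadata that must not leak into the external representation.
struct Ranged<T> {
    value: T,
    source: &'static str,
}

impl<T: fmt::Display> fmt::Display for Ranged<T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Delegate to the inner value only, mirroring the transparent
        // `Serialize` impl above.
        self.value.fmt(f)
    }
}

fn main() {
    let v = Ranged { value: 42, source: "pyproject.toml" };
    assert_eq!(v.to_string(), "42"); // metadata is invisible
    assert_eq!(v.source, "pyproject.toml"); // but still carried alongside
}
```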


@@ -30,13 +30,13 @@ countme = { workspace = true }
drop_bomb = { workspace = true }
indexmap = { workspace = true }
itertools = { workspace = true }
metrics = { workspace = true }
ordermap = { workspace = true }
salsa = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
rustc-hash = { workspace = true }
hashbrown = { workspace = true }
schemars = { workspace = true, optional = true }
serde = { workspace = true, optional = true }
smallvec = { workspace = true }
static_assertions = { workspace = true }


@@ -232,36 +232,6 @@ reveal_type(c_instance.y) # revealed: int
reveal_type(c_instance.z) # revealed: int
```
#### Attributes defined in multi-target assignments
```py
class C:
def __init__(self) -> None:
self.a = self.b = 1
c_instance = C()
reveal_type(c_instance.a) # revealed: Unknown | Literal[1]
reveal_type(c_instance.b) # revealed: Unknown | Literal[1]
```
#### Augmented assignments
```py
class Weird:
def __iadd__(self, other: None) -> str:
return "a"
class C:
def __init__(self) -> None:
self.w = Weird()
self.w += None
# TODO: Mypy and pyright do not support this, but it would be great if we could
# infer `Unknown | str` or at least `Unknown | Weird | str` here.
reveal_type(C().w) # revealed: Unknown | Weird
```
#### Attributes defined in tuple unpackings
```py
@@ -283,24 +253,19 @@ reveal_type(c_instance.b1) # revealed: Unknown | Literal["a"]
reveal_type(c_instance.c1) # revealed: Unknown | int
reveal_type(c_instance.d1) # revealed: Unknown | str
reveal_type(c_instance.a2) # revealed: Unknown | Literal[1]
# TODO: This should be supported (no error; type should be: `Unknown | Literal[1]`)
# error: [unresolved-attribute]
reveal_type(c_instance.a2) # revealed: Unknown
reveal_type(c_instance.b2) # revealed: Unknown | Literal["a"]
# TODO: This should be supported (no error; type should be: `Unknown | Literal["a"]`)
# error: [unresolved-attribute]
reveal_type(c_instance.b2) # revealed: Unknown
reveal_type(c_instance.c2) # revealed: Unknown | int
reveal_type(c_instance.d2) # revealed: Unknown | str
```
#### Starred assignments
```py
class C:
def __init__(self) -> None:
self.a, *self.b = (1, 2, 3)
c_instance = C()
reveal_type(c_instance.a) # revealed: Unknown | Literal[1]
reveal_type(c_instance.b) # revealed: Unknown | @Todo(starred unpacking)
# TODO: Similar for these two (should be `Unknown | int` and `Unknown | str`, respectively)
# error: [unresolved-attribute]
reveal_type(c_instance.c2) # revealed: Unknown
# error: [unresolved-attribute]
reveal_type(c_instance.d2) # revealed: Unknown
```
#### Attributes defined in for-loop (unpacking)
@@ -322,8 +287,6 @@ class TupleIterable:
def __iter__(self) -> TupleIterator:
return TupleIterator()
class NonIterable: ...
class C:
def __init__(self):
for self.x in IntIterable():
@@ -332,54 +295,14 @@ class C:
for _, self.y in TupleIterable():
pass
# TODO: We should emit a diagnostic here
for self.z in NonIterable():
pass
# TODO: Pyright fully supports these, mypy detects the presence of the attributes,
# but infers type `Any` for both of them. We should infer `int` and `str` here:
reveal_type(C().x) # revealed: Unknown | int
reveal_type(C().y) # revealed: Unknown | str
```
#### Attributes defined in `with` statements
```py
class ContextManager:
def __enter__(self) -> int | None: ...
def __exit__(self, exc_type, exc_value, traceback) -> None: ...
class C:
def __init__(self) -> None:
with ContextManager() as self.x:
pass
c_instance = C()
# TODO: Should be `Unknown | int | None`
# error: [unresolved-attribute]
reveal_type(c_instance.x) # revealed: Unknown
```
reveal_type(C().x) # revealed: Unknown
#### Attributes defined in comprehensions
```py
class IntIterator:
def __next__(self) -> int:
return 1
class IntIterable:
def __iter__(self) -> IntIterator:
return IntIterator()
class C:
def __init__(self) -> None:
[... for self.a in IntIterable()]
c_instance = C()
# TODO: Should be `Unknown | int`
# error: [unresolved-attribute]
reveal_type(c_instance.a) # revealed: Unknown
reveal_type(C().y) # revealed: Unknown
```
#### Conditionally declared / bound attributes
@@ -520,15 +443,6 @@ class C:
reveal_type(C().x) # revealed: str
```
#### Diagnostics are reported for the right-hand side of attribute assignments
```py
class C:
def __init__(self) -> None:
# error: [too-many-positional-arguments]
self.x: int = len(1, 2, 3)
```
### Pure class variables (`ClassVar`)
#### Annotated with `ClassVar` type qualifier


@@ -219,11 +219,7 @@ import package
reveal_type(package.foo.X) # revealed: Unknown
```
## Relative imports at the top of a search path
Relative imports at the top of a search path result in a runtime error:
`ImportError: attempted relative import with no known parent package`. That's why Red Knot should
disallow them.
## In the src-root
`parser.py`:
@@ -234,5 +230,21 @@ X: int = 42
`__main__.py`:
```py
from .parser import X # error: [unresolved-import]
from .parser import X
reveal_type(X) # revealed: int
```
## Beyond the src-root
`parser.py`:
```py
X: int = 42
```
`__main__.py`:
```py
from ..parser import X # error: [unresolved-import]
```


@@ -132,27 +132,6 @@ static_assert(not is_disjoint_from(Intersection[X, Z], Y))
static_assert(not is_disjoint_from(Intersection[Y, Z], X))
```
## Negation / complement
The complement of a type `T` is disjoint from `T`. In fact, it is disjoint from every subtype of
`T`:
```py
from knot_extensions import Not, Intersection, is_disjoint_from, static_assert
class T: ...
class S(T): ...
static_assert(is_disjoint_from(Not[T], T))
static_assert(is_disjoint_from(Not[T], S))
static_assert(is_disjoint_from(Intersection[T, Any], Not[T]))
static_assert(is_disjoint_from(Not[T], Intersection[T, Any]))
static_assert(is_disjoint_from(Intersection[S, Any], Not[T]))
static_assert(is_disjoint_from(Not[T], Intersection[S, Any]))
```
## Special types
### `Never`
@@ -265,7 +244,7 @@ static_assert(not is_disjoint_from(TypeOf[f], object))
### `AlwaysTruthy` and `AlwaysFalsy`
```py
from knot_extensions import AlwaysFalsy, AlwaysTruthy, Intersection, Not, is_disjoint_from, static_assert
from knot_extensions import AlwaysFalsy, AlwaysTruthy, is_disjoint_from, static_assert
from typing import Literal
static_assert(is_disjoint_from(None, AlwaysTruthy))
@@ -277,14 +256,6 @@ static_assert(not is_disjoint_from(str, AlwaysTruthy))
static_assert(is_disjoint_from(Literal[1, 2], AlwaysFalsy))
static_assert(not is_disjoint_from(Literal[0, 1], AlwaysTruthy))
type Truthy = Not[AlwaysFalsy]
type Falsy = Not[AlwaysTruthy]
type AmbiguousTruthiness = Intersection[Truthy, Falsy]
static_assert(is_disjoint_from(AlwaysTruthy, AmbiguousTruthiness))
static_assert(is_disjoint_from(AlwaysFalsy, AmbiguousTruthiness))
```
### Instance types versus `type[T]` types

View File

@@ -1,8 +1,6 @@
use core::fmt;
use itertools::Itertools;
use ruff_db::diagnostic::{DiagnosticId, LintName, Severity};
use rustc_hash::FxHashMap;
use std::fmt::Formatter;
use std::hash::Hasher;
use thiserror::Error;
@@ -38,20 +36,13 @@ pub struct LintMetadata {
derive(serde::Serialize, serde::Deserialize),
serde(rename_all = "kebab-case")
)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub enum Level {
/// # Ignore
///
/// The lint is disabled and should not run.
Ignore,
/// # Warn
///
/// The lint is enabled and diagnostic should have a warning severity.
Warn,
/// # Error
///
/// The lint is enabled and diagnostics have an error severity.
Error,
}
@@ -70,16 +61,6 @@ impl Level {
}
}
impl fmt::Display for Level {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
match self {
Level::Ignore => f.write_str("ignore"),
Level::Warn => f.write_str("warn"),
Level::Error => f.write_str("error"),
}
}
}
impl TryFrom<Level> for Severity {
type Error = ();
@@ -103,11 +84,9 @@ impl LintMetadata {
/// Returns the documentation line by line with one leading space and all trailing whitespace removed.
pub fn documentation_lines(&self) -> impl Iterator<Item = &str> {
self.raw_documentation.lines().map(|line| {
line.strip_prefix(char::is_whitespace)
.unwrap_or(line)
.trim_end()
})
self.raw_documentation
.lines()
.map(|line| line.strip_prefix(' ').unwrap_or(line).trim_end())
}
/// Returns the documentation as a single string.
@@ -201,10 +180,6 @@ impl LintStatus {
pub const fn is_removed(&self) -> bool {
matches!(self, LintStatus::Removed { .. })
}
pub const fn is_deprecated(&self) -> bool {
matches!(self, LintStatus::Deprecated { .. })
}
}
/// Declares a lint rule with the given metadata.
@@ -248,7 +223,7 @@ macro_rules! declare_lint {
$vis static $name: $crate::lint::LintMetadata = $crate::lint::LintMetadata {
name: ruff_db::diagnostic::LintName::of(ruff_macros::kebab_case!($name)),
summary: $summary,
raw_documentation: concat!($($doc, '\n',)+),
raw_documentation: concat!($($doc,)+ "\n"),
status: $status,
file: file!(),
line: line!(),

View File

@@ -11,7 +11,6 @@ pub enum PythonPlatform {
/// Do not make any assumptions about the target platform.
#[default]
All,
/// Assume a specific target platform like `linux`, `darwin` or `win32`.
///
/// We use a string (instead of individual enum variants), as the set of possible platforms
@@ -29,77 +28,3 @@ impl Display for PythonPlatform {
}
}
}
#[cfg(feature = "schemars")]
mod schema {
use crate::PythonPlatform;
use schemars::_serde_json::Value;
use schemars::gen::SchemaGenerator;
use schemars::schema::{Metadata, Schema, SchemaObject, SubschemaValidation};
use schemars::JsonSchema;
impl JsonSchema for PythonPlatform {
fn schema_name() -> String {
"PythonPlatform".to_string()
}
fn json_schema(_gen: &mut SchemaGenerator) -> Schema {
Schema::Object(SchemaObject {
// Hard code some well known values, but allow any other string as well.
subschemas: Some(Box::new(SubschemaValidation {
any_of: Some(vec![
Schema::Object(SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
..SchemaObject::default()
}),
// Promote well-known values for better auto-completion.
// Using `const` over `enumValues` as recommended [here](https://github.com/SchemaStore/schemastore/blob/master/CONTRIBUTING.md#documenting-enums).
Schema::Object(SchemaObject {
const_value: Some(Value::String("all".to_string())),
metadata: Some(Box::new(Metadata {
description: Some(
"Do not make any assumptions about the target platform."
.to_string(),
),
..Metadata::default()
})),
..SchemaObject::default()
}),
Schema::Object(SchemaObject {
const_value: Some(Value::String("darwin".to_string())),
metadata: Some(Box::new(Metadata {
description: Some("Darwin".to_string()),
..Metadata::default()
})),
..SchemaObject::default()
}),
Schema::Object(SchemaObject {
const_value: Some(Value::String("linux".to_string())),
metadata: Some(Box::new(Metadata {
description: Some("Linux".to_string()),
..Metadata::default()
})),
..SchemaObject::default()
}),
Schema::Object(SchemaObject {
const_value: Some(Value::String("win32".to_string())),
metadata: Some(Box::new(Metadata {
description: Some("Windows".to_string()),
..Metadata::default()
})),
..SchemaObject::default()
}),
]),
..SubschemaValidation::default()
})),
..SchemaObject::default()
})
}
}
}

View File

@@ -31,20 +31,6 @@ impl PythonVersion {
minor: 13,
};
pub fn iter() -> impl Iterator<Item = PythonVersion> {
[
PythonVersion::PY37,
PythonVersion::PY38,
PythonVersion::PY39,
PythonVersion::PY310,
PythonVersion::PY311,
PythonVersion::PY312,
PythonVersion::PY313,
]
.iter()
.copied()
}
pub fn free_threaded_build_available(self) -> bool {
self >= PythonVersion::PY313
}
@@ -83,86 +69,40 @@ impl fmt::Display for PythonVersion {
}
#[cfg(feature = "serde")]
mod serde {
use crate::PythonVersion;
impl<'de> serde::Deserialize<'de> for PythonVersion {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
let as_str = String::deserialize(deserializer)?;
impl<'de> serde::Deserialize<'de> for PythonVersion {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
let as_str = String::deserialize(deserializer)?;
if let Some((major, minor)) = as_str.split_once('.') {
let major = major
.parse()
.map_err(|err| serde::de::Error::custom(format!("invalid major version: {err}")))?;
let minor = minor
.parse()
.map_err(|err| serde::de::Error::custom(format!("invalid minor version: {err}")))?;
if let Some((major, minor)) = as_str.split_once('.') {
let major = major.parse().map_err(|err| {
serde::de::Error::custom(format!("invalid major version: {err}"))
})?;
let minor = minor.parse().map_err(|err| {
serde::de::Error::custom(format!("invalid minor version: {err}"))
})?;
Ok((major, minor).into())
} else {
let major = as_str.parse().map_err(|err| {
serde::de::Error::custom(format!(
"invalid python-version: {err}, expected: `major.minor`"
))
})?;
Ok((major, minor).into())
} else {
let major = as_str.parse().map_err(|err| {
serde::de::Error::custom(format!(
"invalid python-version: {err}, expected: `major.minor`"
))
})?;
Ok((major, 0).into())
}
}
}
impl serde::Serialize for PythonVersion {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serializer.serialize_str(&self.to_string())
Ok((major, 0).into())
}
}
}
#[cfg(feature = "schemars")]
mod schemars {
use super::PythonVersion;
use schemars::schema::{Metadata, Schema, SchemaObject, SubschemaValidation};
use schemars::JsonSchema;
use schemars::_serde_json::Value;
impl JsonSchema for PythonVersion {
fn schema_name() -> String {
"PythonVersion".to_string()
}
fn json_schema(_gen: &mut schemars::gen::SchemaGenerator) -> Schema {
let sub_schemas = std::iter::once(Schema::Object(SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
string: Some(Box::new(schemars::schema::StringValidation {
pattern: Some(r"^\d+\.\d+$".to_string()),
..Default::default()
})),
..Default::default()
}))
.chain(Self::iter().map(|v| {
Schema::Object(SchemaObject {
const_value: Some(Value::String(v.to_string())),
metadata: Some(Box::new(Metadata {
description: Some(format!("Python {v}")),
..Metadata::default()
})),
..SchemaObject::default()
})
}));
Schema::Object(SchemaObject {
subschemas: Some(Box::new(SubschemaValidation {
any_of: Some(sub_schemas.collect()),
..Default::default()
})),
..SchemaObject::default()
})
}
#[cfg(feature = "serde")]
impl serde::Serialize for PythonVersion {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serializer.serialize_str(&self.to_string())
}
}

View File

@@ -1,7 +1,4 @@
use crate::{
semantic_index::{ast_ids::ScopedExpressionId, expression::Expression},
unpack::Unpack,
};
use crate::semantic_index::expression::Expression;
use ruff_python_ast::name::Name;
@@ -17,17 +14,6 @@ pub(crate) enum AttributeAssignment<'db> {
/// An attribute assignment without a type annotation, e.g. `self.x = <value>`.
Unannotated { value: Expression<'db> },
/// An attribute assignment where the right-hand side is an iterable, for example
/// `for self.x in <iterable>`.
Iterable { iterable: Expression<'db> },
/// An attribute assignment where the left-hand side is an unpacking expression,
/// e.g. `self.x, self.y = <value>`.
Unpack {
attribute_expression_id: ScopedExpressionId,
unpack: Unpack<'db>,
},
}
pub(crate) type AttributeAssignments<'db> = FxHashMap<Name, Vec<AttributeAssignment<'db>>>;

View File

@@ -6,9 +6,9 @@ use rustc_hash::{FxHashMap, FxHashSet};
use ruff_db::files::File;
use ruff_db::parsed::ParsedModule;
use ruff_index::IndexVec;
use ruff_python_ast as ast;
use ruff_python_ast::name::Name;
use ruff_python_ast::visitor::{walk_expr, walk_pattern, walk_stmt, Visitor};
use ruff_python_ast::{self as ast, ExprContext};
use crate::ast_node_ref::AstNodeRef;
use crate::module_name::ModuleName;
@@ -61,6 +61,9 @@ pub(super) struct SemanticIndexBuilder<'db> {
// Builder state
db: &'db dyn Db,
file: File,
// A shared clone of the path of the file being analyzed. We use this as a label for all of the
// metrics that we export, and this avoids cloning the path into a new string each time.
file_path: Arc<str>,
module: &'db ParsedModule,
scope_stack: Vec<ScopeInfo>,
/// The assignments we're currently visiting, with
@@ -95,9 +98,11 @@ pub(super) struct SemanticIndexBuilder<'db> {
impl<'db> SemanticIndexBuilder<'db> {
pub(super) fn new(db: &'db dyn Db, file: File, parsed: &'db ParsedModule) -> Self {
let file_path = Arc::from(file.path(db).as_str());
let mut builder = Self {
db,
file,
file_path,
module: parsed,
scope_stack: Vec::new(),
current_assignments: vec![],
@@ -186,6 +191,13 @@ impl<'db> SemanticIndexBuilder<'db> {
};
self.try_node_context_stack_manager.enter_nested_scope();
metrics::counter!(
"semantic_index.scope_count",
"file" => self.file_path.clone(),
"kind" => scope.kind().as_str(),
)
.increment(1);
let file_scope_id = self.scopes.push(scope);
self.symbol_tables.push(SymbolTableBuilder::default());
self.use_def_maps.push(UseDefMapBuilder::default());
@@ -1231,20 +1243,6 @@ where
unpack: None,
first: false,
}),
ast::Expr::Attribute(ast::ExprAttribute {
value: object,
attr,
..
}) => {
self.register_attribute_assignment(
object,
attr,
AttributeAssignment::Iterable {
iterable: iter_expr,
},
);
None
}
_ => None,
};
@@ -1473,7 +1471,7 @@ where
fn visit_expr(&mut self, expr: &'ast ast::Expr) {
self.scopes_by_expression
.insert(expr.into(), self.current_scope());
let expression_id = self.current_ast_ids().record_expression(expr);
self.current_ast_ids().record_expression(expr);
match expr {
ast::Expr::Name(name_node @ ast::ExprName { id, ctx, .. }) => {
@@ -1732,35 +1730,6 @@ where
self.simplify_visibility_constraints(pre_op);
}
ast::Expr::Attribute(ast::ExprAttribute {
value: object,
attr,
ctx: ExprContext::Store,
range: _,
}) => {
if let Some(
CurrentAssignment::Assign {
unpack: Some(unpack),
..
}
| CurrentAssignment::For {
unpack: Some(unpack),
..
},
) = self.current_assignment()
{
self.register_attribute_assignment(
object,
attr,
AttributeAssignment::Unpack {
attribute_expression_id: expression_id,
unpack,
},
);
}
walk_expr(self, expr);
}
_ => {
walk_expr(self, expr);
}

View File

@@ -213,6 +213,18 @@ impl ScopeKind {
pub const fn is_comprehension(self) -> bool {
matches!(self, ScopeKind::Comprehension)
}
pub const fn as_str(self) -> &'static str {
match self {
Self::Module => "Module",
Self::Annotation => "Annotation",
Self::Class => "Class",
Self::Function => "Function",
Self::Lambda => "Lambda",
Self::Comprehension => "Comprehension",
Self::TypeAlias => "TypeAlias",
}
}
}
/// Symbol table for a specific [`Scope`].

View File

@@ -40,7 +40,6 @@ use crate::types::call::{
};
use crate::types::class_base::ClassBase;
use crate::types::diagnostic::INVALID_TYPE_FORM;
use crate::types::infer::infer_unpack_types;
use crate::types::mro::{Mro, MroError, MroIterator};
use crate::types::narrow::narrowing_constraint;
use crate::{Db, FxOrderSet, Module, Program, PythonVersion};
@@ -1261,42 +1260,19 @@ impl<'db> Type<'db> {
.iter()
.all(|e| e.is_disjoint_from(db, other)),
(Type::Intersection(inter_left), Type::Intersection(inter_right)) => {
// We explicitly make this case a symmetric version of the case below, as there
// are some type pairs like `Any & T` and `~T` that would otherwise lead to
// non-symmetric results.
inter_left
(Type::Intersection(intersection), other)
| (other, Type::Intersection(intersection)) => {
if intersection
.positive(db)
.iter()
.any(|p| p.is_disjoint_from(db, other))
|| inter_right
.positive(db)
.iter()
.any(|p| p.is_disjoint_from(db, self))
|| inter_left.negative(db).iter().any(|n| {
other.is_subtype_of(db, *n)
&& self.is_fully_static(db)
&& other.is_fully_static(db)
})
|| inter_right.negative(db).iter().any(|n| {
self.is_subtype_of(db, *n)
&& self.is_fully_static(db)
&& other.is_fully_static(db)
})
}
(Type::Intersection(intersection), t) | (t, Type::Intersection(intersection)) => {
// TODO: There are certainly more cases that could be handled here. For example,
// it is possible that both A and B overlap with C, but the intersection A & B
// does not overlap with C.
intersection
.positive(db)
.iter()
.any(|p| p.is_disjoint_from(db, t))
|| intersection.negative(db).iter().any(|n| {
t.is_subtype_of(db, *n)
&& self.is_fully_static(db)
&& other.is_fully_static(db)
})
{
true
} else {
// TODO we can do better here. For example:
// X & ~Literal[1] is disjoint from Literal[1]
false
}
}
// any single-valued type is disjoint from another single-valued type
@@ -4255,32 +4231,6 @@ impl<'db> Class<'db> {
union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
}
AttributeAssignment::Iterable { iterable } => {
// We found an attribute assignment like:
//
// for self.name in <iterable>:
// TODO: Potential diagnostics resulting from the iterable are currently not reported.
let iterable_ty = infer_expression_type(db, *iterable);
let inferred_ty = iterable_ty.iterate(db).unwrap_without_diagnostic();
union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
}
AttributeAssignment::Unpack {
attribute_expression_id,
unpack,
} => {
// We found an unpacking assignment like:
//
// .., self.name, .. = <value>
// (.., self.name, ..) = <value>
// [.., self.name, ..] = <value>
let inferred_ty =
infer_unpack_types(db, *unpack).expression_type(*attribute_expression_id);
union_of_inferred_types = union_of_inferred_types.add(inferred_ty);
}
}
}

View File

@@ -200,7 +200,7 @@ pub(crate) fn infer_expression_types<'db>(
/// type of the variables involved in this unpacking along with any violations that are detected
/// during this unpacking.
#[salsa::tracked(return_ref)]
pub(super) fn infer_unpack_types<'db>(db: &'db dyn Db, unpack: Unpack<'db>) -> UnpackResult<'db> {
fn infer_unpack_types<'db>(db: &'db dyn Db, unpack: Unpack<'db>) -> UnpackResult<'db> {
let file = unpack.file(db);
let _span =
tracing::trace_span!("infer_unpack_types", range=?unpack.range(db), file=%file.path(db))
@@ -2085,7 +2085,7 @@ impl<'db> TypeInferenceBuilder<'db> {
}
let name_ast_id = name.scoped_expression_id(self.db(), self.scope());
unpacked.expression_type(name_ast_id)
unpacked.get(name_ast_id).unwrap_or(Type::unknown())
}
TargetKind::Name => {
if self.in_stub() && value.is_ellipsis_literal_expr() {
@@ -2356,7 +2356,7 @@ impl<'db> TypeInferenceBuilder<'db> {
self.context.extend(unpacked);
}
let name_ast_id = name.scoped_expression_id(self.db(), self.scope());
unpacked.expression_type(name_ast_id)
unpacked.get(name_ast_id).unwrap_or(Type::unknown())
}
TargetKind::Name => iterable_ty
.iterate(self.db())
@@ -2512,19 +2512,30 @@ impl<'db> TypeInferenceBuilder<'db> {
let module = file_to_module(self.db(), self.file())
.ok_or(ModuleNameResolutionError::UnknownCurrentModule)?;
let mut level = level.get();
if module.kind().is_package() {
level = level.saturating_sub(1);
}
let mut module_name = module
.name()
.ancestors()
.nth(level as usize)
.ok_or(ModuleNameResolutionError::TooManyDots)?;
let mut module_name = module.name().clone();
let tail = tail
.map(|tail| ModuleName::new(tail).ok_or(ModuleNameResolutionError::InvalidSyntax))
.transpose()?;
for remaining_dots in (0..level).rev() {
if let Some(parent) = module_name.parent() {
module_name = parent;
} else if remaining_dots == 0 {
// If we reached a search path root and this was the last dot, return the tail, if any.
// If there's no tail, then we have a relative import that's too deep.
return tail.ok_or(ModuleNameResolutionError::TooManyDots);
} else {
// We're at a search path root with more than one dot remaining, so the
// relative import is too deep.
return Err(ModuleNameResolutionError::TooManyDots);
}
}
if let Some(tail) = tail {
let tail = ModuleName::new(tail).ok_or(ModuleNameResolutionError::InvalidSyntax)?;
module_name.extend(&tail);
}

View File

@@ -72,9 +72,11 @@ impl<'db> Unpacker<'db> {
value_ty: Type<'db>,
) {
match target {
ast::Expr::Name(_) | ast::Expr::Attribute(_) => {
self.targets
.insert(target.scoped_expression_id(self.db(), self.scope), value_ty);
ast::Expr::Name(target_name) => {
self.targets.insert(
target_name.scoped_expression_id(self.db(), self.scope),
value_ty,
);
}
ast::Expr::Starred(ast::ExprStarred { value, .. }) => {
self.unpack_inner(value, value_expr, value_ty);
@@ -268,14 +270,8 @@ pub(crate) struct UnpackResult<'db> {
}
impl<'db> UnpackResult<'db> {
/// Returns the inferred type for a given sub-expression of the left-hand side target
/// of an unpacking assignment.
///
/// Panics if a scoped expression ID is passed in that does not correspond to a
/// sub-expression of the target.
#[track_caller]
pub(crate) fn expression_type(&self, expr_id: ScopedExpressionId) -> Type<'db> {
self.targets[&expr_id]
pub(crate) fn get(&self, expr_id: ScopedExpressionId) -> Option<Type<'db>> {
self.targets.get(&expr_id).copied()
}
}

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.9.5"
version = "0.9.4"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -3,9 +3,9 @@
#![cfg(not(target_family = "wasm"))]
use regex::escape;
use std::fs;
use std::process::Command;
use std::str;
use std::{fs, path::Path};
use anyhow::Result;
use assert_fs::fixture::{ChildPath, FileTouch, PathChild};
@@ -2236,149 +2236,3 @@ def func(t: _T) -> _T:
"
);
}
/// Test that we do not rename two different type parameters to the same name
/// in one execution of Ruff (autofixing this to `class Foo[T, T]: ...` would
/// introduce invalid syntax)
#[test]
fn type_parameter_rename_isolation() {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(STDIN_BASE_OPTIONS)
.args(["--select", "UP049"])
.args(["--stdin-filename", "test.py"])
.arg("--unsafe-fixes")
.arg("--fix")
.arg("--preview")
.arg("--target-version=py312")
.arg("-")
.pass_stdin(
r#"
class Foo[_T, __T]:
pass
"#
),
@r"
success: false
exit_code: 1
----- stdout -----
class Foo[T, __T]:
pass
----- stderr -----
test.py:2:14: UP049 Generic class uses private type parameters
Found 2 errors (1 fixed, 1 remaining).
"
);
}
#[test]
fn a005_module_shadowing_strict() -> Result<()> {
fn create_module(path: &Path) -> Result<()> {
fs::create_dir(path)?;
fs::File::create(path.join("__init__.py"))?;
Ok(())
}
// construct a directory tree with this structure:
// .
// ├── abc
// │   └── __init__.py
// ├── collections
// │   ├── __init__.py
// │   ├── abc
// │   │   └── __init__.py
// │   └── foobar
// │   └── __init__.py
// ├── foobar
// │   ├── __init__.py
// │   ├── abc
// │   │   └── __init__.py
// │   └── collections
// │   ├── __init__.py
// │   ├── abc
// │   │   └── __init__.py
// │   └── foobar
// │   └── __init__.py
// ├── ruff.toml
// └── urlparse
// └── __init__.py
let tempdir = TempDir::new()?;
let foobar = tempdir.path().join("foobar");
create_module(&foobar)?;
for base in [&tempdir.path().into(), &foobar] {
for dir in ["abc", "collections"] {
create_module(&base.join(dir))?;
}
create_module(&base.join("collections").join("abc"))?;
create_module(&base.join("collections").join("foobar"))?;
}
create_module(&tempdir.path().join("urlparse"))?;
// also create a ruff.toml to mark the project root
fs::File::create(tempdir.path().join("ruff.toml"))?;
insta::with_settings!({
filters => vec![(r"\\", "/")]
}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(STDIN_BASE_OPTIONS)
.args(["--select", "A005"])
.current_dir(tempdir.path()),
@r"
success: false
exit_code: 1
----- stdout -----
abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
collections/__init__.py:1:1: A005 Module `collections` shadows a Python standard-library module
collections/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
foobar/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
foobar/collections/__init__.py:1:1: A005 Module `collections` shadows a Python standard-library module
foobar/collections/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
Found 6 errors.
----- stderr -----
");
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(STDIN_BASE_OPTIONS)
.args(["--select", "A005"])
.current_dir(tempdir.path()),
@r"
success: false
exit_code: 1
----- stdout -----
abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
collections/__init__.py:1:1: A005 Module `collections` shadows a Python standard-library module
collections/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
foobar/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
foobar/collections/__init__.py:1:1: A005 Module `collections` shadows a Python standard-library module
foobar/collections/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
Found 6 errors.
----- stderr -----
");
// TODO(brent) Default should currently match the strict version, but after the next minor
// release it will match the non-strict version directly above
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(STDIN_BASE_OPTIONS)
.args(["--select", "A005"])
.current_dir(tempdir.path()),
@r"
success: false
exit_code: 1
----- stdout -----
abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
collections/__init__.py:1:1: A005 Module `collections` shadows a Python standard-library module
collections/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
foobar/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
foobar/collections/__init__.py:1:1: A005 Module `collections` shadows a Python standard-library module
foobar/collections/abc/__init__.py:1:1: A005 Module `abc` shadows a Python standard-library module
Found 6 errors.
----- stderr -----
");
});
Ok(())
}

View File

@@ -30,7 +30,6 @@ glob = { workspace = true }
ignore = { workspace = true, optional = true }
matchit = { workspace = true }
salsa = { workspace = true }
schemars = { workspace = true, optional = true }
serde = { workspace = true, optional = true }
path-slash = { workspace = true }
thiserror = { workspace = true }

View File

@@ -471,13 +471,7 @@ impl ToOwned for SystemPath {
/// The path is guaranteed to be valid UTF-8.
#[repr(transparent)]
#[derive(Eq, PartialEq, Clone, Hash, PartialOrd, Ord)]
#[cfg_attr(
feature = "serde",
derive(serde::Serialize, serde::Deserialize),
serde(transparent)
)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub struct SystemPathBuf(#[cfg_attr(feature = "schemars", schemars(with = "String"))] Utf8PathBuf);
pub struct SystemPathBuf(Utf8PathBuf);
impl SystemPathBuf {
pub fn new() -> Self {
@@ -664,6 +658,27 @@ impl ruff_cache::CacheKey for SystemPathBuf {
}
}
#[cfg(feature = "serde")]
impl serde::Serialize for SystemPath {
fn serialize<S: serde::Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
self.0.serialize(serializer)
}
}
#[cfg(feature = "serde")]
impl serde::Serialize for SystemPathBuf {
fn serialize<S: serde::Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
self.0.serialize(serializer)
}
}
#[cfg(feature = "serde")]
impl<'de> serde::Deserialize<'de> for SystemPathBuf {
fn deserialize<D: serde::Deserializer<'de>>(deserializer: D) -> Result<Self, D::Error> {
Utf8PathBuf::deserialize(deserializer).map(SystemPathBuf)
}
}
/// A slice of a virtual path on [`System`](super::System) (akin to [`str`]).
#[repr(transparent)]
pub struct SystemVirtualPath(str);

View File

@@ -11,7 +11,6 @@ repository = { workspace = true }
license = { workspace = true }
[dependencies]
red_knot_project = { workspace = true, features = ["schemars"] }
ruff = { workspace = true }
ruff_diagnostics = { workspace = true }
ruff_formatter = { workspace = true }

View File

@@ -2,7 +2,7 @@
use anyhow::Result;
use crate::{generate_cli_help, generate_docs, generate_json_schema, generate_knot_schema};
use crate::{generate_cli_help, generate_docs, generate_json_schema};
pub(crate) const REGENERATE_ALL_COMMAND: &str = "cargo dev generate-all";
@@ -33,7 +33,6 @@ impl Mode {
pub(crate) fn main(args: &Args) -> Result<()> {
generate_json_schema::main(&generate_json_schema::Args { mode: args.mode })?;
generate_knot_schema::main(&generate_knot_schema::Args { mode: args.mode })?;
generate_cli_help::main(&generate_cli_help::Args { mode: args.mode })?;
generate_docs::main(&generate_docs::Args {
dry_run: args.mode.is_dry_run(),

View File

@@ -1,72 +0,0 @@
#![allow(clippy::print_stdout, clippy::print_stderr)]
use std::fs;
use std::path::PathBuf;
use anyhow::{bail, Result};
use pretty_assertions::StrComparison;
use schemars::schema_for;
use crate::generate_all::{Mode, REGENERATE_ALL_COMMAND};
use crate::ROOT_DIR;
use red_knot_project::metadata::options::Options;
#[derive(clap::Args)]
pub(crate) struct Args {
/// Write the generated schema to stdout (rather than to `knot.schema.json`).
#[arg(long, default_value_t, value_enum)]
pub(crate) mode: Mode,
}
pub(crate) fn main(args: &Args) -> Result<()> {
let schema = schema_for!(Options);
let schema_string = serde_json::to_string_pretty(&schema).unwrap();
let filename = "knot.schema.json";
let schema_path = PathBuf::from(ROOT_DIR).join(filename);
match args.mode {
Mode::DryRun => {
println!("{schema_string}");
}
Mode::Check => {
let current = fs::read_to_string(schema_path)?;
if current == schema_string {
println!("Up-to-date: {filename}");
} else {
let comparison = StrComparison::new(&current, &schema_string);
bail!("{filename} changed, please run `{REGENERATE_ALL_COMMAND}`:\n{comparison}");
}
}
Mode::Write => {
let current = fs::read_to_string(&schema_path)?;
if current == schema_string {
println!("Up-to-date: {filename}");
} else {
println!("Updating: {filename}");
fs::write(schema_path, schema_string.as_bytes())?;
}
}
}
Ok(())
}
#[cfg(test)]
mod tests {
use anyhow::Result;
use std::env;
use crate::generate_all::Mode;
use super::{main, Args};
#[test]
fn test_generate_json_schema() -> Result<()> {
let mode = if env::var("KNOT_UPDATE_SCHEMA").as_deref() == Ok("1") {
Mode::Write
} else {
Mode::Check
};
main(&Args { mode })
}
}

View File

@@ -13,7 +13,6 @@ mod generate_all;
mod generate_cli_help;
mod generate_docs;
mod generate_json_schema;
mod generate_knot_schema;
mod generate_options;
mod generate_rules_table;
mod print_ast;
@@ -40,8 +39,6 @@ enum Command {
GenerateAll(generate_all::Args),
/// Generate JSON schema for the TOML configuration file.
GenerateJSONSchema(generate_json_schema::Args),
/// Generate JSON schema for the Red Knot TOML configuration file.
GenerateKnotSchema(generate_knot_schema::Args),
/// Generate a Markdown-compatible table of supported lint rules.
GenerateRulesTable,
/// Generate a Markdown-compatible listing of configuration options.
@@ -86,7 +83,6 @@ fn main() -> Result<ExitCode> {
match command {
Command::GenerateAll(args) => generate_all::main(&args)?,
Command::GenerateJSONSchema(args) => generate_json_schema::main(&args)?,
Command::GenerateKnotSchema(args) => generate_knot_schema::main(&args)?,
Command::GenerateRulesTable => println!("{}", generate_rules_table::generate()),
Command::GenerateOptions => println!("{}", generate_options::generate()),
Command::GenerateCliHelp(args) => generate_cli_help::main(&args)?,

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.9.5"
version = "0.9.4"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -75,10 +75,15 @@ from airflow.secrets.local_filesystem import LocalFilesystemBackend, load_connec
from airflow.security.permissions import RESOURCE_DATASET
from airflow.sensors.base_sensor_operator import BaseSensorOperator
from airflow.sensors.date_time_sensor import DateTimeSensor
from airflow.sensors.external_task import (
ExternalTaskSensorLink as ExternalTaskSensorLinkFromExternalTask,
)
from airflow.sensors.external_task_sensor import (
ExternalTaskMarker,
ExternalTaskSensor,
ExternalTaskSensorLink,
)
from airflow.sensors.external_task_sensor import (
ExternalTaskSensorLink as ExternalTaskSensorLinkFromExternalTaskSensor,
)
from airflow.sensors.time_delta_sensor import TimeDeltaSensor
from airflow.timetables.datasets import DatasetOrTimeSchedule
@@ -244,13 +249,11 @@ BaseSensorOperator()
DateTimeSensor()
# airflow.sensors.external_task
ExternalTaskSensorLink()
ExternalTaskMarker()
ExternalTaskSensor()
ExternalTaskSensorLinkFromExternalTask()
# airflow.sensors.external_task_sensor
ExternalTaskMarkerFromExternalTaskSensor()
ExternalTaskSensorFromExternalTaskSensor()
ExternalTaskMarker()
ExternalTaskSensor()
ExternalTaskSensorLinkFromExternalTaskSensor()
# airflow.sensors.time_delta_sensor

View File

@@ -1,28 +0,0 @@
##### https://github.com/astral-sh/ruff/issues/15809
### Errors
def overshadowed_list():
list = ...
list(map(lambda x: x, []))
### No errors
dict(map(lambda k: (k,), a))
dict(map(lambda k: (k, v, 0), a))
dict(map(lambda k: [k], a))
dict(map(lambda k: [k, v, 0], a))
dict(map(lambda k: {k, v}, a))
dict(map(lambda k: {k: 0, v: 1}, a))
a = [(1, 2), (3, 4)]
map(lambda x: [*x, 10], *a)
map(lambda x: [*x, 10], *a, *b)
map(lambda x: [*x, 10], a, *b)
map(lambda x: x + 10, (a := []))
list(map(lambda x: x + 10, (a := [])))
set(map(lambda x: x + 10, (a := [])))
dict(map(lambda x: (x, 10), (a := [])))

View File

@@ -70,32 +70,6 @@ foo({**foo, **{"bar": True}}) # PIE800
,
})
{
"data": [],
** # Foo
( # Comment
{ "a": b,
# Comment
}
) ,
c: 9,
}
# https://github.com/astral-sh/ruff/issues/15997
{"a": [], **{},}
{"a": [], **({}),}
{"a": [], **{}, 6: 3}
{"a": [], **({}), 6: 3}
{"a": [], **{
# Comment
}, 6: 3}
{"a": [], **({
# Comment
}), 6: 3}
{**foo, "bar": True } # OK

View File

@@ -2,8 +2,6 @@
# Positive cases
###
a_dict = {}
# SIM401 (pattern-1)
if key in a_dict:
var = a_dict[key]
@@ -28,8 +26,6 @@ if keys[idx] in a_dict:
else:
var = "default"
dicts = {"key": a_dict}
# SIM401 (complex expression in dict)
if key in dicts[idx]:
var = dicts[idx][key]
@@ -119,28 +115,6 @@ elif key in a_dict:
else:
vars[idx] = "default"
class NotADictionary:
def __init__(self):
self._dict = {}
def __getitem__(self, key):
return self._dict[key]
def __setitem__(self, key, value):
self._dict[key] = value
def __iter__(self):
return self._dict.__iter__()
not_dict = NotADictionary()
not_dict["key"] = "value"
# OK (type `NotADictionary` is not a known dictionary type)
if "key" in not_dict:
value = not_dict["key"]
else:
value = None
###
# Positive cases (preview)
###

View File

@@ -64,42 +64,10 @@ u''.strip('http://')
u''.lstrip('http://')
# PLE1310
b''.rstrip(b'http://')
b''.rstrip('http://')
# OK
''.strip('Hi')
# OK
''.strip()
### https://github.com/astral-sh/ruff/issues/15968
# Errors: Multiple backslashes
''.strip('\\b\\x09')
''.strip(r'\b\x09')
''.strip('\\\x5C')
# OK: Different types
b"".strip("//")
"".strip(b"//")
# OK: Escapes
'\\test'.strip('\\')
# OK: Extra/missing arguments
"".strip("//", foo)
b"".lstrip(b"//", foo = "bar")
"".rstrip()
# OK: Not literals
foo: str = ""; bar: bytes = b""
"".strip(foo)
b"".strip(bar)
# False negative
foo.rstrip("//")
bar.lstrip(b"//")
# OK: Not `.[lr]?strip`
"".mobius_strip("")

View File

@@ -113,18 +113,3 @@ PositiveList = TypeAliasType(
Annotated[T, Gt(0)], # preserved comment
], type_params=(T,)
)
T: TypeAlias = (
int
| str
)
T: TypeAlias = ( # comment0
# comment1
int # comment2
# comment3
| # comment4
# comment5
str # comment6
# comment7
) # comment8

View File

@@ -12,13 +12,3 @@ x: TypeAlias = tuple[
int, # preserved
float,
]
T: TypeAlias = ( # comment0
# comment1
int # comment2
# comment3
| # comment4
# comment5
str # comment6
# comment7
) # comment8

View File

@@ -54,52 +54,3 @@ def f[_](x: _) -> _: ...
def g[__](x: __) -> __: ...
def h[_T_](x: _T_) -> _T_: ...
def i[__T__](x: __T__) -> __T__: ...
# https://github.com/astral-sh/ruff/issues/16024
from typing import cast, Literal
class C[_0]: ...
class C[T, _T]: ...
class C[_T, T]: ...
class C[_T]:
v1 = cast(_T, ...)
v2 = cast('_T', ...)
v3 = cast("\u005fT", ...)
def _(self):
v1 = cast(_T, ...)
v2 = cast('_T', ...)
v3 = cast("\u005fT", ...)
class C[_T]:
v = cast('Literal[\'foo\'] | _T', ...)
## Name collision
class C[T]:
def f[_T](self): # No fix, collides with `T` from outer scope
v1 = cast(_T, ...)
v2 = cast('_T', ...)
# Unfixable as the new name collides with a variable visible from one of the inner scopes
class C[_T]:
T = 42
v1 = cast(_T, ...)
v2 = cast('_T', ...)
# Unfixable as the new name collides with a variable visible from one of the inner scopes
class C[_T]:
def f[T](self):
v1 = cast(_T, ...)
v2 = cast('_T', ...)

View File

@@ -176,22 +176,3 @@ class Node:
_seen.add(self)
for other in self.connected:
other.recurse(_seen=_seen)
def foo():
_dummy_var = 42
def bar():
dummy_var = 43
print(_dummy_var)
def foo():
# Unfixable because both possible candidates for the new name are shadowed
# in the scope of one of the references to the variable
_dummy_var = 42
def bar():
dummy_var = 43
dummy_var_ = 44
print(_dummy_var)

View File

@@ -9,7 +9,7 @@ use crate::rules::{
};
/// Run lint rules over the [`Binding`]s.
pub(crate) fn bindings(checker: &Checker) {
pub(crate) fn bindings(checker: &mut Checker) {
if !checker.any_enabled(&[
Rule::AssignmentInAssert,
Rule::InvalidAllFormat,
@@ -48,22 +48,22 @@ pub(crate) fn bindings(checker: &Checker) {
pyflakes::fixes::remove_exception_handler_assignment(binding, checker.locator)
.map(Fix::safe_edit)
});
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::InvalidAllFormat) {
if let Some(diagnostic) = pylint::rules::invalid_all_format(binding) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::InvalidAllObject) {
if let Some(diagnostic) = pylint::rules::invalid_all_object(binding) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::NonAsciiName) {
if let Some(diagnostic) = pylint::rules::non_ascii_name(binding, checker.locator) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::UnconventionalImportAlias) {
@@ -72,61 +72,61 @@ pub(crate) fn bindings(checker: &Checker) {
binding,
&checker.settings.flake8_import_conventions.aliases,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::UnaliasedCollectionsAbcSetImport) {
if let Some(diagnostic) =
flake8_pyi::rules::unaliased_collections_abc_set_import(checker, binding)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if !checker.source_type.is_stub() && checker.enabled(Rule::UnquotedTypeAlias) {
if let Some(diagnostics) =
flake8_type_checking::rules::unquoted_type_alias(checker, binding)
{
checker.report_diagnostics(diagnostics);
checker.diagnostics.extend(diagnostics);
}
}
if checker.enabled(Rule::UnsortedDunderSlots) {
if let Some(diagnostic) = ruff::rules::sort_dunder_slots(checker, binding) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::UsedDummyVariable) {
if let Some(diagnostic) = ruff::rules::used_dummy_variable(checker, binding, binding_id)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::AssignmentInAssert) {
if let Some(diagnostic) = ruff::rules::assignment_in_assert(checker, binding) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::PytestUnittestRaisesAssertion) {
if let Some(diagnostic) =
flake8_pytest_style::rules::unittest_raises_assertion_binding(checker, binding)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::ForLoopWrites) {
if let Some(diagnostic) = refurb::rules::for_loop_writes_binding(checker, binding) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CustomTypeVarForSelf) {
if let Some(diagnostic) =
flake8_pyi::rules::custom_type_var_instead_of_self(checker, binding)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::PrivateTypeParameter) {
if let Some(diagnostic) = pyupgrade::rules::private_type_parameter(checker, binding) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}

View File

@@ -5,7 +5,7 @@ use crate::codes::Rule;
use crate::rules::{flake8_simplify, pylint, refurb};
/// Run lint rules over a [`Comprehension`] syntax node.
pub(crate) fn comprehension(comprehension: &Comprehension, checker: &Checker) {
pub(crate) fn comprehension(comprehension: &Comprehension, checker: &mut Checker) {
if checker.enabled(Rule::InDictKeys) {
flake8_simplify::rules::key_in_dict_comprehension(checker, comprehension);
}

View File

@@ -13,7 +13,7 @@ use crate::rules::{
};
/// Run lint rules over all deferred scopes in the [`SemanticModel`].
pub(crate) fn deferred_scopes(checker: &Checker) {
pub(crate) fn deferred_scopes(checker: &mut Checker) {
if !checker.any_enabled(&[
Rule::AsyncioDanglingTask,
Rule::BadStaticmethodArgument,
@@ -85,11 +85,12 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
vec![]
};
let mut diagnostics: Vec<Diagnostic> = vec![];
for scope_id in checker.analyze.scopes.iter().rev().copied() {
let scope = &checker.semantic.scopes[scope_id];
if checker.enabled(Rule::UndefinedLocal) {
pyflakes::rules::undefined_local(checker, scope_id, scope);
pyflakes::rules::undefined_local(checker, scope_id, scope, &mut diagnostics);
}
if checker.enabled(Rule::GlobalVariableNotAssigned) {
@@ -111,7 +112,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
.map(|id| checker.semantic.reference(*id))
.all(ResolvedReference::is_load)
{
checker.report_diagnostic(Diagnostic::new(
diagnostics.push(Diagnostic::new(
pylint::rules::GlobalVariableNotAssigned {
name: (*name).to_string(),
},
@@ -145,7 +146,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
if scope.kind.is_generator() {
continue;
}
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pylint::rules::RedefinedArgumentFromLocal {
name: name.to_string(),
},
@@ -185,7 +186,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
continue;
}
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::ImportShadowedByLoopVar {
name: name.to_string(),
row: checker.compute_source_row(shadowed.start()),
@@ -346,7 +347,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
diagnostic.set_fix(fix.clone());
}
checker.report_diagnostic(diagnostic);
diagnostics.push(diagnostic);
}
}
}
@@ -355,47 +356,55 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
|| matches!(scope.kind, ScopeKind::Module | ScopeKind::Function(_))
{
if checker.enabled(Rule::UnusedPrivateTypeVar) {
flake8_pyi::rules::unused_private_type_var(checker, scope);
flake8_pyi::rules::unused_private_type_var(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateProtocol) {
flake8_pyi::rules::unused_private_protocol(checker, scope);
flake8_pyi::rules::unused_private_protocol(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateTypeAlias) {
flake8_pyi::rules::unused_private_type_alias(checker, scope);
flake8_pyi::rules::unused_private_type_alias(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateTypedDict) {
flake8_pyi::rules::unused_private_typed_dict(checker, scope);
flake8_pyi::rules::unused_private_typed_dict(checker, scope, &mut diagnostics);
}
}
if checker.enabled(Rule::AsyncioDanglingTask) {
ruff::rules::asyncio_dangling_binding(scope, checker);
ruff::rules::asyncio_dangling_binding(scope, &checker.semantic, &mut diagnostics);
}
if let Some(class_def) = scope.kind.as_class() {
if checker.enabled(Rule::BuiltinAttributeShadowing) {
flake8_builtins::rules::builtin_attribute_shadowing(
checker, scope_id, scope, class_def,
checker,
scope_id,
scope,
class_def,
&mut diagnostics,
);
}
if checker.enabled(Rule::FunctionCallInDataclassDefaultArgument) {
ruff::rules::function_call_in_dataclass_default(checker, class_def);
ruff::rules::function_call_in_dataclass_default(
checker,
class_def,
&mut diagnostics,
);
}
if checker.enabled(Rule::MutableClassDefault) {
ruff::rules::mutable_class_default(checker, class_def);
ruff::rules::mutable_class_default(checker, class_def, &mut diagnostics);
}
if checker.enabled(Rule::MutableDataclassDefault) {
ruff::rules::mutable_dataclass_default(checker, class_def);
ruff::rules::mutable_dataclass_default(checker, class_def, &mut diagnostics);
}
}
if matches!(scope.kind, ScopeKind::Function(_) | ScopeKind::Lambda(_)) {
if checker.enabled(Rule::UnusedVariable) {
pyflakes::rules::unused_variable(checker, scope);
pyflakes::rules::unused_variable(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedAnnotation) {
pyflakes::rules::unused_annotation(checker, scope);
pyflakes::rules::unused_annotation(checker, scope, &mut diagnostics);
}
if !checker.source_type.is_stub() {
@@ -406,7 +415,11 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
Rule::UnusedMethodArgument,
Rule::UnusedStaticMethodArgument,
]) {
flake8_unused_arguments::rules::unused_arguments(checker, scope);
flake8_unused_arguments::rules::unused_arguments(
checker,
scope,
&mut diagnostics,
);
}
}
}
@@ -415,7 +428,11 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
if !checker.source_type.is_stub()
&& checker.enabled(Rule::RuntimeImportInTypeCheckingBlock)
{
flake8_type_checking::rules::runtime_import_in_type_checking_block(checker, scope);
flake8_type_checking::rules::runtime_import_in_type_checking_block(
checker,
scope,
&mut diagnostics,
);
}
if enforce_typing_only_imports {
let runtime_imports: Vec<&Binding> = checker
@@ -430,45 +447,47 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
checker,
scope,
&runtime_imports,
&mut diagnostics,
);
}
if checker.enabled(Rule::UnusedImport) {
pyflakes::rules::unused_import(checker, scope);
pyflakes::rules::unused_import(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::ImportPrivateName) {
pylint::rules::import_private_name(checker, scope);
pylint::rules::import_private_name(checker, scope, &mut diagnostics);
}
}
if scope.kind.is_function() {
if checker.enabled(Rule::NoSelfUse) {
pylint::rules::no_self_use(checker, scope_id, scope);
pylint::rules::no_self_use(checker, scope_id, scope, &mut diagnostics);
}
if checker.enabled(Rule::TooManyLocals) {
pylint::rules::too_many_locals(checker, scope);
pylint::rules::too_many_locals(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::SingledispatchMethod) {
pylint::rules::singledispatch_method(checker, scope);
pylint::rules::singledispatch_method(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::SingledispatchmethodFunction) {
pylint::rules::singledispatchmethod_function(checker, scope);
pylint::rules::singledispatchmethod_function(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::BadStaticmethodArgument) {
pylint::rules::bad_staticmethod_argument(checker, scope);
pylint::rules::bad_staticmethod_argument(checker, scope, &mut diagnostics);
}
if checker.any_enabled(&[
Rule::InvalidFirstArgumentNameForClassMethod,
Rule::InvalidFirstArgumentNameForMethod,
]) {
pep8_naming::rules::invalid_first_argument_name(checker, scope);
pep8_naming::rules::invalid_first_argument_name(checker, scope, &mut diagnostics);
}
}
}
checker.diagnostics.extend(diagnostics);
}

View File

@@ -139,11 +139,13 @@ pub(crate) fn definitions(checker: &mut Checker) {
&checker.semantic,
)
}) {
checker.report_diagnostics(flake8_annotations::rules::definition(
checker,
definition,
*visibility,
));
checker
.diagnostics
.extend(flake8_annotations::rules::definition(
checker,
definition,
*visibility,
));
}
overloaded_name =
flake8_annotations::helpers::overloaded_name(definition, &checker.semantic);

View File

@@ -8,7 +8,7 @@ use crate::rules::{
};
/// Run lint rules over an [`ExceptHandler`] syntax node.
pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &Checker) {
pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &mut Checker) {
match except_handler {
ExceptHandler::ExceptHandler(ast::ExceptHandlerExceptHandler {
type_,
@@ -23,7 +23,7 @@ pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &Checker)
except_handler,
checker.locator,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::RaiseWithoutFromInsideExcept) {

View File

@@ -21,7 +21,7 @@ use crate::rules::{
use crate::settings::types::PythonVersion;
/// Run lint rules over an [`Expr`] syntax node.
pub(crate) fn expression(expr: &Expr, checker: &Checker) {
pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
match expr {
Expr::Subscript(subscript @ ast::ExprSubscript { value, slice, .. }) => {
// Ex) Optional[...], Union[...]
@@ -201,7 +201,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
check_two_starred_expressions,
expr.range(),
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}
@@ -515,7 +515,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
match pyflakes::format::FormatSummary::try_from(string_value.to_str()) {
Err(e) => {
if checker.enabled(Rule::StringDotFormatInvalidFormat) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::StringDotFormatInvalidFormat {
message: pyflakes::format::error_to_string(&e),
},
@@ -842,7 +842,13 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
flake8_comprehensions::rules::unnecessary_subscript_reversal(checker, call);
}
if checker.enabled(Rule::UnnecessaryMap) {
flake8_comprehensions::rules::unnecessary_map(checker, call);
flake8_comprehensions::rules::unnecessary_map(
checker,
expr,
checker.semantic.current_expression_parent(),
func,
args,
);
}
if checker.enabled(Rule::UnnecessaryComprehensionInCall) {
flake8_comprehensions::rules::unnecessary_comprehension_in_call(
@@ -906,7 +912,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
pylint::rules::bad_open_mode(checker, call);
}
if checker.enabled(Rule::BadStrStripCall) {
pylint::rules::bad_str_strip_call(checker, call);
pylint::rules::bad_str_strip_call(checker, func, args);
}
if checker.enabled(Rule::ShallowCopyEnviron) {
pylint::rules::shallow_copy_environ(checker, call);
@@ -925,7 +931,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
}
if checker.enabled(Rule::PytestPatchWithLambda) {
if let Some(diagnostic) = flake8_pytest_style::rules::patch_with_lambda(call) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.any_enabled(&[
@@ -1281,7 +1287,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
..
}) => {
if checker.enabled(Rule::PercentFormatUnsupportedFormatCharacter) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::PercentFormatUnsupportedFormatCharacter {
char: c,
},
@@ -1291,7 +1297,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
}
Err(e) => {
if checker.enabled(Rule::PercentFormatInvalidFormat) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::PercentFormatInvalidFormat {
message: e.to_string(),
},
@@ -1365,7 +1371,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
checker.locator,
checker.settings,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CollectionLiteralConcatenation) {

View File

@@ -5,7 +5,7 @@ use crate::codes::Rule;
use crate::rules::{flake8_bugbear, ruff};
/// Run lint rules over a module.
pub(crate) fn module(suite: &Suite, checker: &Checker) {
pub(crate) fn module(suite: &Suite, checker: &mut Checker) {
if checker.enabled(Rule::FStringDocstring) {
flake8_bugbear::rules::f_string_docstring(checker, suite);
}

View File

@@ -6,7 +6,7 @@ use crate::codes::Rule;
use crate::rules::{flake8_builtins, pycodestyle};
/// Run lint rules over a [`Parameter`] syntax node.
pub(crate) fn parameter(parameter: &Parameter, checker: &Checker) {
pub(crate) fn parameter(parameter: &Parameter, checker: &mut Checker) {
if checker.enabled(Rule::AmbiguousVariableName) {
pycodestyle::rules::ambiguous_variable_name(
checker,

View File

@@ -5,7 +5,7 @@ use crate::codes::Rule;
use crate::rules::{flake8_bugbear, flake8_pyi, ruff};
/// Run lint rules over a [`Parameters`] syntax node.
pub(crate) fn parameters(parameters: &Parameters, checker: &Checker) {
pub(crate) fn parameters(parameters: &Parameters, checker: &mut Checker) {
if checker.enabled(Rule::FunctionCallInDefaultArgument) {
flake8_bugbear::rules::function_call_in_argument_default(checker, parameters);
}

View File

@@ -39,7 +39,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if !checker.semantic.scope_id.is_global() {
for name in names {
if checker.semantic.nonlocal(name).is_none() {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pylint::rules::NonlocalWithoutBinding {
name: name.to_string(),
},
@@ -59,7 +59,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
&mut checker.semantic.current_statements().skip(1),
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}
@@ -69,7 +69,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
&mut checker.semantic.current_statements().skip(1),
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}
@@ -99,7 +99,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
if checker.enabled(Rule::AmbiguousFunctionName) {
if let Some(diagnostic) = pycodestyle::rules::ambiguous_function_name(name) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::InvalidBoolReturnType) {
@@ -128,7 +128,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.pep8_naming.ignore_names,
&checker.semantic,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.source_type.is_stub() {
@@ -187,7 +187,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
name,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::GlobalStatement) {
@@ -239,7 +239,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
body,
checker.settings.mccabe.max_complexity,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::HardcodedPasswordDefault) {
@@ -265,7 +265,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
body,
checker.settings.pylint.max_returns,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::TooManyBranches) {
@@ -274,7 +274,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
body,
checker.settings.pylint.max_branches,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::TooManyStatements) {
@@ -283,7 +283,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
body,
checker.settings.pylint.max_statements,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.any_enabled(&[
@@ -351,7 +351,9 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
#[cfg(any(feature = "test-rules", test))]
if checker.enabled(Rule::UnreachableCode) {
checker.report_diagnostics(pylint::rules::in_function(name, body));
checker
.diagnostics
.extend(pylint::rules::in_function(name, body));
}
if checker.enabled(Rule::ReimplementedOperator) {
refurb::rules::reimplemented_operator(checker, &function_def.into());
@@ -454,7 +456,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
if checker.enabled(Rule::AmbiguousClassName) {
if let Some(diagnostic) = pycodestyle::rules::ambiguous_class_name(name) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::InvalidClassName) {
@@ -463,7 +465,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
name,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::ErrorSuffixOnExceptionName) {
@@ -473,7 +475,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
name,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if !checker.source_type.is_stub() {
@@ -613,7 +615,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if let Some(diagnostic) =
flake8_debugger::rules::debugger_import(stmt, None, &alias.name)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::BannedApi) {
@@ -640,7 +642,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if let Some(diagnostic) =
pylint::rules::import_self(alias, checker.module.qualified_name())
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if let Some(asname) = &alias.asname {
@@ -655,7 +657,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::LowercaseImportedAsNonLowercase) {
@@ -668,7 +670,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsLowercase) {
@@ -681,7 +683,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsConstant) {
@@ -692,14 +694,14 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsAcronym) {
if let Some(diagnostic) = pep8_naming::rules::camelcase_imported_as_acronym(
name, asname, alias, stmt, checker,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}
@@ -713,7 +715,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.flake8_import_conventions.banned_aliases,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}
@@ -723,7 +725,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&alias.name,
alias.asname.as_deref(),
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::BuiltinImportShadowing) {
@@ -839,7 +841,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if let Some(diagnostic) =
flake8_pytest_style::rules::import_from(stmt, module, level)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.source_type.is_stub() {
@@ -854,7 +856,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
if checker.enabled(Rule::LateFutureImport) {
if checker.semantic.seen_futures_boundary() {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::LateFutureImport,
stmt.range(),
));
@@ -863,7 +865,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
} else if &alias.name == "*" {
if checker.enabled(Rule::UndefinedLocalWithNestedImportStarUsage) {
if !matches!(checker.semantic.current_scope().kind, ScopeKind::Module) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::UndefinedLocalWithNestedImportStarUsage {
name: helpers::format_import_from(level, module).to_string(),
},
@@ -872,7 +874,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
}
if checker.enabled(Rule::UndefinedLocalWithImportStar) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::UndefinedLocalWithImportStar {
name: helpers::format_import_from(level, module).to_string(),
},
@@ -889,14 +891,14 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker.module.qualified_name(),
checker.settings.flake8_tidy_imports.ban_relative_imports,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::Debugger) {
if let Some(diagnostic) =
flake8_debugger::rules::debugger_import(stmt, module, &alias.name)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::BannedImportAlias) {
@@ -911,7 +913,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.flake8_import_conventions.banned_aliases,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}
@@ -926,7 +928,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::LowercaseImportedAsNonLowercase) {
@@ -939,7 +941,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsLowercase) {
@@ -952,7 +954,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsConstant) {
@@ -963,7 +965,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsAcronym) {
@@ -974,7 +976,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
checker,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if !checker.source_type.is_stub() {
@@ -994,7 +996,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
names,
checker.module.qualified_name(),
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::BannedImportFrom) {
@@ -1003,7 +1005,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
&helpers::format_import_from(level, module),
&checker.settings.flake8_import_conventions.banned_from,
) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::ByteStringUsage) {
@@ -1172,7 +1174,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
if checker.any_enabled(&[Rule::BadVersionInfoComparison, Rule::BadVersionInfoOrder]) {
fn bad_version_info_comparison(
checker: &Checker,
checker: &mut Checker,
test: &Expr,
has_else_clause: bool,
) {
@@ -1219,7 +1221,9 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
) => {
if !checker.semantic.in_type_checking_block() {
if checker.enabled(Rule::Assert) {
checker.report_diagnostic(flake8_bandit::rules::assert_used(stmt));
checker
.diagnostics
.push(flake8_bandit::rules::assert_used(stmt));
}
}
if checker.enabled(Rule::AssertTuple) {
@@ -1436,7 +1440,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if let Some(diagnostic) =
pyflakes::rules::default_except_not_last(handlers, checker.locator)
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.any_enabled(&[
@@ -1533,7 +1537,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
if checker.enabled(Rule::PandasDfVariableName) {
if let Some(diagnostic) = pandas_vet::rules::assignment_to_df(targets) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker
@@ -1731,7 +1735,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if let Some(diagnostic) =
ruff::rules::asyncio_dangling_task(value, checker.semantic())
{
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::RepeatedAppend) {

View File

@@ -5,7 +5,7 @@ use crate::codes::Rule;
use crate::rules::{flake8_bandit, flake8_pyi, flake8_quotes, pycodestyle, ruff};
/// Run lint rules over a [`StringLike`] syntax node.
pub(crate) fn string_like(string_like: StringLike, checker: &Checker) {
pub(crate) fn string_like(string_like: StringLike, checker: &mut Checker) {
if checker.any_enabled(&[
Rule::AmbiguousUnicodeCharacterString,
Rule::AmbiguousUnicodeCharacterDocstring,

View File

@@ -6,7 +6,7 @@ use crate::rules::flake8_pie;
use crate::rules::refurb;
/// Run lint rules over a suite of [`Stmt`] syntax nodes.
pub(crate) fn suite(suite: &[Stmt], checker: &Checker) {
pub(crate) fn suite(suite: &[Stmt], checker: &mut Checker) {
if checker.enabled(Rule::UnnecessaryPlaceholder) {
flake8_pie::rules::unnecessary_placeholder(checker, suite);
}

View File

@@ -7,7 +7,7 @@ use crate::codes::Rule;
use crate::rules::pyflakes;
/// Run lint rules over all [`UnresolvedReference`] entities in the [`SemanticModel`].
pub(crate) fn unresolved_references(checker: &Checker) {
pub(crate) fn unresolved_references(checker: &mut Checker) {
if !checker.any_enabled(&[Rule::UndefinedLocalWithImportStarUsage, Rule::UndefinedName]) {
return;
}
@@ -15,7 +15,7 @@ pub(crate) fn unresolved_references(checker: &Checker) {
for reference in checker.semantic.unresolved_references() {
if reference.is_wildcard_import() {
if checker.enabled(Rule::UndefinedLocalWithImportStarUsage) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: reference.name(checker.source()).to_string(),
},
@@ -42,7 +42,7 @@ pub(crate) fn unresolved_references(checker: &Checker) {
let symbol_name = reference.name(checker.source());
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
pyflakes::rules::UndefinedName {
name: symbol_name.to_string(),
minor_version_builtin_added: version_builtin_was_added(symbol_name),

View File

@@ -9,6 +9,11 @@
//! parent scopes have been fully traversed. Individual rules may also perform internal traversals
//! of the AST.
//!
//! While the [`Checker`] is typically passed by mutable reference to the individual lint rule
//! implementations, most of its constituent components are intended to be treated immutably, with
//! the exception of the [`Diagnostic`] vector, which is intended to be mutated by the individual
//! lint rules. In the future, this should be formalized in the API.
//!
//! The individual [`Visitor`] implementations within the [`Checker`] typically proceed in four
//! steps:
//!
@@ -26,7 +31,7 @@ use std::path::Path;
use itertools::Itertools;
use log::debug;
use rustc_hash::{FxHashMap, FxHashSet};
use rustc_hash::FxHashMap;
use ruff_diagnostics::{Diagnostic, IsolationLevel};
use ruff_notebook::{CellOffsets, NotebookIndex};
@@ -216,9 +221,9 @@ pub(crate) struct Checker<'a> {
/// A set of deferred nodes to be analyzed after the AST traversal (e.g., `for` loops).
analyze: deferred::Analyze,
/// The cumulative set of diagnostics computed across all lint rules.
diagnostics: RefCell<Vec<Diagnostic>>,
pub(crate) diagnostics: Vec<Diagnostic>,
/// The list of names already seen by flake8-bugbear diagnostics, to avoid duplicate violations.
flake8_bugbear_seen: RefCell<FxHashSet<TextRange>>,
pub(crate) flake8_bugbear_seen: Vec<TextRange>,
/// The end offset of the last visited statement.
last_stmt_end: TextSize,
/// A state describing if a docstring is expected or not.
@@ -266,8 +271,8 @@ impl<'a> Checker<'a> {
semantic,
visit: deferred::Visit::default(),
analyze: deferred::Analyze::default(),
diagnostics: RefCell::default(),
flake8_bugbear_seen: RefCell::default(),
diagnostics: Vec::default(),
flake8_bugbear_seen: Vec::default(),
cell_offsets,
notebook_index,
last_stmt_end: TextSize::default(),
@@ -357,30 +362,6 @@ impl<'a> Checker<'a> {
self.indexer.comment_ranges()
}
/// Push a new [`Diagnostic`] to the collection in the [`Checker`]
pub(crate) fn report_diagnostic(&self, diagnostic: Diagnostic) {
let mut diagnostics = self.diagnostics.borrow_mut();
diagnostics.push(diagnostic);
}
/// Extend the collection of [`Diagnostic`] objects in the [`Checker`]
pub(crate) fn report_diagnostics<I>(&self, diagnostics: I)
where
I: IntoIterator<Item = Diagnostic>,
{
let mut checker_diagnostics = self.diagnostics.borrow_mut();
checker_diagnostics.extend(diagnostics);
}
/// Adds a [`TextRange`] to the set of ranges of variable names
/// flagged in `flake8-bugbear` violations so far.
///
/// Returns whether the value was newly inserted.
pub(crate) fn insert_flake8_bugbear_range(&self, range: TextRange) -> bool {
let mut ranges = self.flake8_bugbear_seen.borrow_mut();
ranges.insert(range)
}
/// Returns the [`Tokens`] for the parsed type annotation if the checker is in a typing context
/// or the parsed source code.
pub(crate) fn tokens(&self) -> &'a Tokens {
@@ -495,9 +476,9 @@ impl<'a> Checker<'a> {
}
/// Push `diagnostic` if the checker is not in a `@no_type_check` context.
pub(crate) fn report_type_diagnostic(&self, diagnostic: Diagnostic) {
pub(crate) fn push_type_diagnostic(&mut self, diagnostic: Diagnostic) {
if !self.semantic.in_no_type_check() {
self.report_diagnostic(diagnostic);
self.diagnostics.push(diagnostic);
}
}
}
@@ -2438,7 +2419,7 @@ impl<'a> Checker<'a> {
self.semantic.restore(snapshot);
if self.enabled(Rule::ForwardAnnotationSyntaxError) {
self.report_type_diagnostic(Diagnostic::new(
self.push_type_diagnostic(Diagnostic::new(
pyflakes::rules::ForwardAnnotationSyntaxError {
parse_error: parse_error.error.to_string(),
},
@@ -2580,7 +2561,7 @@ impl<'a> Checker<'a> {
} else {
if self.semantic.global_scope().uses_star_imports() {
if self.enabled(Rule::UndefinedLocalWithImportStarUsage) {
self.diagnostics.get_mut().push(
self.diagnostics.push(
Diagnostic::new(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: name.to_string(),
@@ -2595,7 +2576,7 @@ impl<'a> Checker<'a> {
if self.settings.preview.is_enabled()
|| !self.path.ends_with("__init__.py")
{
self.diagnostics.get_mut().push(
self.diagnostics.push(
Diagnostic::new(
pyflakes::rules::UndefinedExport {
name: name.to_string(),
@@ -2719,13 +2700,13 @@ pub(crate) fn check_ast(
analyze::deferred_lambdas(&mut checker);
analyze::deferred_for_loops(&mut checker);
analyze::definitions(&mut checker);
analyze::bindings(&checker);
analyze::unresolved_references(&checker);
analyze::bindings(&mut checker);
analyze::unresolved_references(&mut checker);
// Reset the scope to module-level, and check all consumed scopes.
checker.semantic.scope_id = ScopeId::global();
checker.analyze.scopes.push(ScopeId::global());
analyze::deferred_scopes(&checker);
analyze::deferred_scopes(&mut checker);
checker.diagnostics.take()
checker.diagnostics
}

View File

@@ -46,7 +46,12 @@ pub(crate) fn check_file_path(
// flake8-builtins
if settings.rules.enabled(Rule::StdlibModuleShadowing) {
if let Some(diagnostic) = stdlib_module_shadowing(path, settings) {
if let Some(diagnostic) = stdlib_module_shadowing(
path,
package,
&settings.flake8_builtins.builtins_allowed_modules,
settings.target_version,
) {
diagnostics.push(diagnostic);
}
}

View File

@@ -387,36 +387,25 @@ pub(crate) enum ShadowedKind {
}
impl ShadowedKind {
/// Determines the kind of shadowing or conflict for the proposed new name of a given [`Binding`].
/// Determines the kind of shadowing or conflict for a given variable name.
///
/// This function is useful for checking whether or not the `target` of a [`Renamer::rename`]
/// This function is useful for checking whether or not the `target` of a [`Rename::rename`]
/// will shadow another binding.
pub(crate) fn new(binding: &Binding, new_name: &str, checker: &Checker) -> ShadowedKind {
pub(crate) fn new(name: &str, checker: &Checker, scope_id: ScopeId) -> ShadowedKind {
// Check the kind in order of precedence
if is_keyword(new_name) {
if is_keyword(name) {
return ShadowedKind::Keyword;
}
if is_python_builtin(
new_name,
name,
checker.settings.target_version.minor(),
checker.source_type.is_ipynb(),
) {
return ShadowedKind::BuiltIn;
}
let semantic = checker.semantic();
if !semantic.is_available_in_scope(new_name, binding.scope) {
return ShadowedKind::Some;
}
if binding
.references()
.map(|reference_id| semantic.reference(reference_id).scope_id())
.dedup()
.any(|scope| !semantic.is_available_in_scope(new_name, scope))
{
if !checker.semantic().is_available_in_scope(name, scope_id) {
return ShadowedKind::Some;
}

View File

@@ -50,7 +50,7 @@ impl Violation for AirflowDagNoScheduleArgument {
}
/// AIR301
pub(crate) fn dag_no_schedule_argument(checker: &Checker, expr: &Expr) {
pub(crate) fn dag_no_schedule_argument(checker: &mut Checker, expr: &Expr) {
if !checker.semantic().seen_module(Modules::AIRFLOW) {
return;
}
@@ -86,5 +86,5 @@ pub(crate) fn dag_no_schedule_argument(checker: &Checker, expr: &Expr) {
// Produce a diagnostic when the `schedule` keyword argument is not found.
let diagnostic = Diagnostic::new(AirflowDagNoScheduleArgument, expr.range());
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}

View File

@@ -83,7 +83,7 @@ impl Violation for Airflow3MovedToProvider {
}
/// AIR303
pub(crate) fn moved_to_provider_in_3(checker: &Checker, expr: &Expr) {
pub(crate) fn moved_to_provider_in_3(checker: &mut Checker, expr: &Expr) {
if !checker.semantic().seen_module(Modules::AIRFLOW) {
return;
}
@@ -112,7 +112,7 @@ enum Replacement {
},
}
fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRange) {
fn check_names_moved_to_provider(checker: &mut Checker, expr: &Expr, ranged: TextRange) {
let Some(qualified_name) = checker.semantic().resolve_qualified_name(expr) else {
return;
};
@@ -1018,7 +1018,7 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
},
_ => return,
};
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3MovedToProvider {
deprecated: qualified_name.to_string(),
replacement,

View File

@@ -80,7 +80,7 @@ enum Replacement {
}
/// AIR302
pub(crate) fn airflow_3_removal_expr(checker: &Checker, expr: &Expr) {
pub(crate) fn airflow_3_removal_expr(checker: &mut Checker, expr: &Expr) {
if !checker.semantic().seen_module(Modules::AIRFLOW) {
return;
}
@@ -117,7 +117,10 @@ pub(crate) fn airflow_3_removal_expr(checker: &Checker, expr: &Expr) {
}
/// AIR302
pub(crate) fn airflow_3_removal_function_def(checker: &Checker, function_def: &StmtFunctionDef) {
pub(crate) fn airflow_3_removal_function_def(
checker: &mut Checker,
function_def: &StmtFunctionDef,
) {
if !checker.semantic().seen_module(Modules::AIRFLOW) {
return;
}
@@ -153,7 +156,7 @@ const REMOVED_CONTEXT_KEYS: [&str; 12] = [
/// # 'execution_date' is removed in Airflow 3.0
/// pass
/// ```
fn check_function_parameters(checker: &Checker, function_def: &StmtFunctionDef) {
fn check_function_parameters(checker: &mut Checker, function_def: &StmtFunctionDef) {
if !is_airflow_task(function_def, checker.semantic())
&& !is_execute_method_inherits_from_airflow_operator(function_def, checker.semantic())
{
@@ -163,7 +166,7 @@ fn check_function_parameters(checker: &Checker, function_def: &StmtFunctionDef)
for param in function_def.parameters.iter_non_variadic_params() {
let param_name = param.name();
if REMOVED_CONTEXT_KEYS.contains(&param_name.as_str()) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: param_name.to_string(),
replacement: Replacement::None,
@@ -183,25 +186,29 @@ fn check_function_parameters(checker: &Checker, function_def: &StmtFunctionDef)
///
/// DAG(schedule_interval="@daily")
/// ```
fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, arguments: &Arguments) {
fn check_call_arguments(
checker: &mut Checker,
qualified_name: &QualifiedName,
arguments: &Arguments,
) {
match qualified_name.segments() {
["airflow", .., "DAG" | "dag"] => {
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"schedule_interval",
Some("schedule"),
));
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"timetable",
Some("schedule"),
));
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"sla_miss_callback",
None,
));
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"fail_stop",
Some("fail_fast"),
@@ -210,7 +217,7 @@ fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, argum
_ => {
if is_airflow_auth_manager(qualified_name.segments()) {
if !arguments.is_empty() {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: String::from("appbuilder"),
replacement: Replacement::Message(
@@ -221,42 +228,44 @@ fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, argum
));
}
} else if is_airflow_task_handler(qualified_name.segments()) {
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"filename_template",
None,
));
} else if is_airflow_operator(qualified_name.segments()) {
checker.report_diagnostics(diagnostic_for_argument(arguments, "sla", None));
checker.report_diagnostics(diagnostic_for_argument(
checker
.diagnostics
.extend(diagnostic_for_argument(arguments, "sla", None));
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"task_concurrency",
Some("max_active_tis_per_dag"),
));
match qualified_name.segments() {
["airflow", .., "operators", "trigger_dagrun", "TriggerDagRunOperator"] => {
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"execution_date",
Some("logical_date"),
));
}
["airflow", .., "operators", "datetime", "BranchDateTimeOperator"] => {
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"use_task_execution_day",
Some("use_task_logical_date"),
));
}
["airflow", .., "operators", "weekday", "DayOfWeekSensor"] => {
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"use_task_execution_day",
Some("use_task_logical_date"),
));
}
["airflow", .., "operators", "weekday", "BranchDayOfWeekOperator"] => {
checker.report_diagnostics(diagnostic_for_argument(
checker.diagnostics.extend(diagnostic_for_argument(
arguments,
"use_task_execution_day",
Some("use_task_logical_date"),
@@ -279,7 +288,7 @@ fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, argum
/// info = DatasetLineageInfo()
/// info.dataset
/// ```
fn check_class_attribute(checker: &Checker, attribute_expr: &ExprAttribute) {
fn check_class_attribute(checker: &mut Checker, attribute_expr: &ExprAttribute) {
let ExprAttribute { value, attr, .. } = attribute_expr;
let Some(qualname) = typing::resolve_assignment(value, checker.semantic()) else {
@@ -303,7 +312,7 @@ fn check_class_attribute(checker: &Checker, attribute_expr: &ExprAttribute) {
};
if let Some(replacement) = replacement {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: attr.to_string(),
replacement,
@@ -341,7 +350,7 @@ fn check_class_attribute(checker: &Checker, attribute_expr: &ExprAttribute) {
/// def my_task(**context):
/// context.get("conf") # 'conf' is removed in Airflow 3.0
/// ```
fn check_context_key_usage_in_call(checker: &Checker, call_expr: &ExprCall) {
fn check_context_key_usage_in_call(checker: &mut Checker, call_expr: &ExprCall) {
if !in_airflow_task_function(checker.semantic()) {
return;
}
@@ -377,7 +386,7 @@ fn check_context_key_usage_in_call(checker: &Checker, call_expr: &ExprCall) {
continue;
};
if value == removed_key {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: removed_key.to_string(),
replacement: Replacement::None,
@@ -390,7 +399,7 @@ fn check_context_key_usage_in_call(checker: &Checker, call_expr: &ExprCall) {
/// Check if a subscript expression accesses a removed Airflow context variable.
/// If a removed key is found, push a corresponding diagnostic.
fn check_context_key_usage_in_subscript(checker: &Checker, subscript: &ExprSubscript) {
fn check_context_key_usage_in_subscript(checker: &mut Checker, subscript: &ExprSubscript) {
if !in_airflow_task_function(checker.semantic()) {
return;
}
@@ -418,7 +427,7 @@ fn check_context_key_usage_in_subscript(checker: &Checker, subscript: &ExprSubsc
}
if REMOVED_CONTEXT_KEYS.contains(&key.to_str()) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: key.to_string(),
replacement: Replacement::None,
@@ -454,7 +463,7 @@ fn is_kwarg_parameter(semantic: &SemanticModel, name: &ExprName) -> bool {
/// manager = DatasetManager()
/// manager.register_dataset_change()
/// ```
fn check_method(checker: &Checker, call_expr: &ExprCall) {
fn check_method(checker: &mut Checker, call_expr: &ExprCall) {
let Expr::Attribute(ExprAttribute { attr, value, .. }) = &*call_expr.func else {
return;
};
@@ -519,7 +528,7 @@ fn check_method(checker: &Checker, call_expr: &ExprCall) {
}
};
if let Some(replacement) = replacement {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: attr.to_string(),
replacement,
@@ -543,7 +552,7 @@ fn check_method(checker: &Checker, call_expr: &ExprCall) {
/// # Or, directly
/// SubDagOperator()
/// ```
fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
fn check_name(checker: &mut Checker, expr: &Expr, range: TextRange) {
let Some(qualified_name) = checker.semantic().resolve_qualified_name(expr) else {
return;
};
@@ -681,7 +690,16 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
["airflow", "operators", "branch_operator", "BaseBranchOperator"] => {
Replacement::Name("airflow.operators.branch.BaseBranchOperator")
}
["airflow", "operators", "dummy" | "dummy_operator", "EmptyOperator" | "DummyOperator"] => {
["airflow", "operators", "dummy", "EmptyOperator"] => {
Replacement::Name("airflow.operators.empty.EmptyOperator")
}
["airflow", "operators", "dummy", "DummyOperator"] => {
Replacement::Name("airflow.operators.empty.EmptyOperator")
}
["airflow", "operators", "dummy_operator", "EmptyOperator"] => {
Replacement::Name("airflow.operators.empty.EmptyOperator")
}
["airflow", "operators", "dummy_operator", "DummyOperator"] => {
Replacement::Name("airflow.operators.empty.EmptyOperator")
}
["airflow", "operators", "email_operator", "EmailOperator"] => {
@@ -710,21 +728,24 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
}
// airflow.sensors
["airflow", "sensors", "external_task", "ExternalTaskSensorLink"] => {
Replacement::Name("airflow.sensors.external_task.ExternalDagLink")
}
["airflow", "sensors", "base_sensor_operator", "BaseSensorOperator"] => {
Replacement::Name("airflow.sensors.base.BaseSensorOperator")
}
["airflow", "sensors", "date_time_sensor", "DateTimeSensor"] => {
Replacement::Name("airflow.sensors.date_time.DateTimeSensor")
}
["airflow", "sensors", "external_task" | "external_task_sensor", "ExternalTaskMarker"] => {
["airflow", "sensors", "external_task_sensor", "ExternalTaskMarker"] => {
Replacement::Name("airflow.sensors.external_task.ExternalTaskMarker")
}
["airflow", "sensors", "external_task" | "external_task_sensor", "ExternalTaskSensorLink"] => {
Replacement::Name("airflow.sensors.external_task.ExternalDagLink")
}
["airflow", "sensors", "external_task" | "external_task_sensor", "ExternalTaskSensor"] => {
["airflow", "sensors", "external_task_sensor", "ExternalTaskSensor"] => {
Replacement::Name("airflow.sensors.external_task.ExternalTaskSensor")
}
["airflow", "sensors", "external_task_sensor", "ExternalTaskSensorLink"] => {
Replacement::Name("airflow.sensors.external_task.ExternalDagLink")
}
["airflow", "sensors", "time_delta_sensor", "TimeDeltaSensor"] => {
Replacement::Name("airflow.sensors.time_delta.TimeDeltaSensor")
}
@@ -743,9 +764,10 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
["airflow", "utils", "dates", "days_ago"] => {
Replacement::Name("pendulum.today('UTC').add(days=-N, ...)")
}
["airflow", "utils", "dates", "parse_execution_date" | "round_time" | "scale_time_units" | "infer_time_unit"] => {
Replacement::None
}
["airflow", "utils", "dates", "parse_execution_date"] => Replacement::None,
["airflow", "utils", "dates", "round_time"] => Replacement::None,
["airflow", "utils", "dates", "scale_time_units"] => Replacement::None,
["airflow", "utils", "dates", "infer_time_unit"] => Replacement::None,
// airflow.utils.file
["airflow", "utils", "file", "TemporaryDirectory"] => Replacement::None,
@@ -762,10 +784,12 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
}
// airflow.utils.state
["airflow", "utils", "state", "SHUTDOWN" | "terminating_states"] => Replacement::None,
["airflow", "utils", "state", "SHUTDOWN"] => Replacement::None,
["airflow", "utils", "state", "terminating_states"] => Replacement::None,
// airflow.utils.trigger_rule
["airflow", "utils", "trigger_rule", "TriggerRule", "DUMMY" | "NONE_FAILED_OR_SKIPPED"] => {
["airflow", "utils", "trigger_rule", "TriggerRule", "DUMMY"] => Replacement::None,
["airflow", "utils", "trigger_rule", "TriggerRule", "NONE_FAILED_OR_SKIPPED"] => {
Replacement::None
}
@@ -867,7 +891,7 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
_ => return,
};
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: qualified_name.to_string(),
replacement,
@@ -888,7 +912,7 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
/// executors = "some.third.party.executor"
/// ```
fn check_airflow_plugin_extension(
checker: &Checker,
checker: &mut Checker,
expr: &Expr,
name: &str,
class_def: &StmtClassDef,
@@ -905,7 +929,7 @@ fn check_airflow_plugin_extension(
)
})
}) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Airflow3Removal {
deprecated: name.to_string(),
replacement: Replacement::Message(

View File

@@ -45,7 +45,7 @@ impl Violation for AirflowVariableNameTaskIdMismatch {
}
/// AIR001
pub(crate) fn variable_name_task_id(checker: &Checker, targets: &[Expr], value: &Expr) {
pub(crate) fn variable_name_task_id(checker: &mut Checker, targets: &[Expr], value: &Expr) {
if !checker.semantic().seen_module(Modules::AIRFLOW) {
return;
}
@@ -116,5 +116,5 @@ pub(crate) fn variable_name_task_id(checker: &Checker, targets: &[Expr], value:
},
target.range(),
);
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}

View File

@@ -88,7 +88,7 @@ impl Violation for FastApiNonAnnotatedDependency {
/// FAST002
pub(crate) fn fastapi_non_annotated_dependency(
checker: &Checker,
checker: &mut Checker,
function_def: &ast::StmtFunctionDef,
) {
if !checker.semantic().seen_module(Modules::FASTAPI)
@@ -219,7 +219,7 @@ impl<'a> DependencyCall<'a> {
/// necessary to determine this while generating the fix, thus the need to return an updated
/// `seen_default` here.
fn create_diagnostic(
checker: &Checker,
checker: &mut Checker,
parameter: &DependencyParameter,
dependency_call: Option<DependencyCall>,
mut seen_default: bool,
@@ -304,7 +304,7 @@ fn create_diagnostic(
}
diagnostic.try_set_optional_fix(|| fix);
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
seen_default
}

View File

@@ -74,7 +74,10 @@ impl AlwaysFixableViolation for FastApiRedundantResponseModel {
}
/// FAST001
pub(crate) fn fastapi_redundant_response_model(checker: &Checker, function_def: &StmtFunctionDef) {
pub(crate) fn fastapi_redundant_response_model(
checker: &mut Checker,
function_def: &StmtFunctionDef,
) {
if !checker.semantic().seen_module(Modules::FASTAPI) {
return;
}
@@ -95,7 +98,7 @@ pub(crate) fn fastapi_redundant_response_model(checker: &Checker, function_def:
)
.map(Fix::unsafe_edit)
});
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}

View File

@@ -105,7 +105,7 @@ impl Violation for FastApiUnusedPathParameter {
/// FAST003
pub(crate) fn fastapi_unused_path_parameter(
checker: &Checker,
checker: &mut Checker,
function_def: &ast::StmtFunctionDef,
) {
if !checker.semantic().seen_module(Modules::FASTAPI) {
@@ -163,6 +163,7 @@ pub(crate) fn fastapi_unused_path_parameter(
}
// Check if any of the path parameters are not in the function signature.
let mut diagnostics = vec![];
for (path_param, range) in path_params {
// Ignore invalid identifiers (e.g., `user-id`, as opposed to `user_id`)
if !is_identifier(path_param) {
@@ -202,8 +203,10 @@ pub(crate) fn fastapi_unused_path_parameter(
checker.locator().contents(),
)));
}
checker.report_diagnostic(diagnostic);
diagnostics.push(diagnostic);
}
checker.diagnostics.extend(diagnostics);
}
/// Returns an iterator over the non-positional-only, non-variadic parameters of a function.

View File

@@ -223,7 +223,7 @@ impl Violation for SysVersionCmpStr10 {
}
/// YTT103, YTT201, YTT203, YTT204, YTT302
pub(crate) fn compare(checker: &Checker, left: &Expr, ops: &[CmpOp], comparators: &[Expr]) {
pub(crate) fn compare(checker: &mut Checker, left: &Expr, ops: &[CmpOp], comparators: &[Expr]) {
match left {
Expr::Subscript(ast::ExprSubscript { value, slice, .. })
if is_sys(value, "version_info", checker.semantic()) =>
@@ -243,10 +243,9 @@ pub(crate) fn compare(checker: &Checker, left: &Expr, ops: &[CmpOp], comparators
) = (ops, comparators)
{
if *n == 3 && checker.enabled(Rule::SysVersionInfo0Eq3) {
checker.report_diagnostic(Diagnostic::new(
SysVersionInfo0Eq3,
left.range(),
));
checker
.diagnostics
.push(Diagnostic::new(SysVersionInfo0Eq3, left.range()));
}
}
} else if *i == 1 {
@@ -259,10 +258,9 @@ pub(crate) fn compare(checker: &Checker, left: &Expr, ops: &[CmpOp], comparators
) = (ops, comparators)
{
if checker.enabled(Rule::SysVersionInfo1CmpInt) {
checker.report_diagnostic(Diagnostic::new(
SysVersionInfo1CmpInt,
left.range(),
));
checker
.diagnostics
.push(Diagnostic::new(SysVersionInfo1CmpInt, left.range()));
}
}
}
@@ -281,10 +279,9 @@ pub(crate) fn compare(checker: &Checker, left: &Expr, ops: &[CmpOp], comparators
) = (ops, comparators)
{
if checker.enabled(Rule::SysVersionInfoMinorCmpInt) {
checker.report_diagnostic(Diagnostic::new(
SysVersionInfoMinorCmpInt,
left.range(),
));
checker
.diagnostics
.push(Diagnostic::new(SysVersionInfoMinorCmpInt, left.range()));
}
}
}
@@ -300,10 +297,14 @@ pub(crate) fn compare(checker: &Checker, left: &Expr, ops: &[CmpOp], comparators
{
if value.len() == 1 {
if checker.enabled(Rule::SysVersionCmpStr10) {
checker.report_diagnostic(Diagnostic::new(SysVersionCmpStr10, left.range()));
checker
.diagnostics
.push(Diagnostic::new(SysVersionCmpStr10, left.range()));
}
} else if checker.enabled(Rule::SysVersionCmpStr3) {
checker.report_diagnostic(Diagnostic::new(SysVersionCmpStr3, left.range()));
checker
.diagnostics
.push(Diagnostic::new(SysVersionCmpStr3, left.range()));
}
}
}

View File

@@ -46,7 +46,7 @@ impl Violation for SixPY3 {
}
/// YTT202
pub(crate) fn name_or_attribute(checker: &Checker, expr: &Expr) {
pub(crate) fn name_or_attribute(checker: &mut Checker, expr: &Expr) {
if !checker.semantic().seen_module(Modules::SIX) {
return;
}
@@ -56,6 +56,8 @@ pub(crate) fn name_or_attribute(checker: &Checker, expr: &Expr) {
.resolve_qualified_name(expr)
.is_some_and(|qualified_name| matches!(qualified_name.segments(), ["six", "PY3"]))
{
checker.report_diagnostic(Diagnostic::new(SixPY3, expr.range()));
checker
.diagnostics
.push(Diagnostic::new(SixPY3, expr.range()));
}
}

View File

@@ -168,7 +168,7 @@ impl Violation for SysVersionSlice1 {
}
/// YTT101, YTT102, YTT301, YTT303
pub(crate) fn subscript(checker: &Checker, value: &Expr, slice: &Expr) {
pub(crate) fn subscript(checker: &mut Checker, value: &Expr, slice: &Expr) {
if is_sys(value, "version", checker.semantic()) {
match slice {
Expr::Slice(ast::ExprSlice {
@@ -183,9 +183,13 @@ pub(crate) fn subscript(checker: &Checker, value: &Expr, slice: &Expr) {
}) = upper.as_ref()
{
if *i == 1 && checker.enabled(Rule::SysVersionSlice1) {
checker.report_diagnostic(Diagnostic::new(SysVersionSlice1, value.range()));
checker
.diagnostics
.push(Diagnostic::new(SysVersionSlice1, value.range()));
} else if *i == 3 && checker.enabled(Rule::SysVersionSlice3) {
checker.report_diagnostic(Diagnostic::new(SysVersionSlice3, value.range()));
checker
.diagnostics
.push(Diagnostic::new(SysVersionSlice3, value.range()));
}
}
}
@@ -195,9 +199,13 @@ pub(crate) fn subscript(checker: &Checker, value: &Expr, slice: &Expr) {
..
}) => {
if *i == 2 && checker.enabled(Rule::SysVersion2) {
checker.report_diagnostic(Diagnostic::new(SysVersion2, value.range()));
checker
.diagnostics
.push(Diagnostic::new(SysVersion2, value.range()));
} else if *i == 0 && checker.enabled(Rule::SysVersion0) {
checker.report_diagnostic(Diagnostic::new(SysVersion0, value.range()));
checker
.diagnostics
.push(Diagnostic::new(SysVersion0, value.range()));
}
}

View File

@@ -1,5 +1,6 @@
---
source: crates/ruff_linter/src/rules/flake8_annotations/mod.rs
snapshot_kind: text
---
auto_return_type.py:1:5: ANN201 [*] Missing return type annotation for public function `func`
|
@@ -96,7 +97,7 @@ auto_return_type.py:27:5: ANN201 [*] Missing return type annotation for public f
| ^^^^ ANN201
28 | return 1 or 2.5 if x > 0 else 1.5 or "str"
|
= help: Add return type annotation: `Union[str, float]`
= help: Add return type annotation: `Union[str | float]`
Unsafe fix
1 |+from typing import Union
@@ -108,7 +109,7 @@ auto_return_type.py:27:5: ANN201 [*] Missing return type annotation for public f
25 26 |
26 27 |
27 |-def func(x: int):
28 |+def func(x: int) -> Union[str, float]:
28 |+def func(x: int) -> Union[str | float]:
28 29 | return 1 or 2.5 if x > 0 else 1.5 or "str"
29 30 |
30 31 |
@@ -119,7 +120,7 @@ auto_return_type.py:31:5: ANN201 [*] Missing return type annotation for public f
| ^^^^ ANN201
32 | return 1 + 2.5 if x > 0 else 1.5 or "str"
|
= help: Add return type annotation: `Union[str, float]`
= help: Add return type annotation: `Union[str | float]`
Unsafe fix
1 |+from typing import Union
@@ -131,7 +132,7 @@ auto_return_type.py:31:5: ANN201 [*] Missing return type annotation for public f
29 30 |
30 31 |
31 |-def func(x: int):
32 |+def func(x: int) -> Union[str, float]:
32 |+def func(x: int) -> Union[str | float]:
32 33 | return 1 + 2.5 if x > 0 else 1.5 or "str"
33 34 |
34 35 |
@@ -203,7 +204,7 @@ auto_return_type.py:59:5: ANN201 [*] Missing return type annotation for public f
60 | if not x:
61 | return 1
|
= help: Add return type annotation: `Union[str, int, None]`
= help: Add return type annotation: `Union[str | int | None]`
Unsafe fix
1 |+from typing import Union
@@ -215,7 +216,7 @@ auto_return_type.py:59:5: ANN201 [*] Missing return type annotation for public f
57 58 |
58 59 |
59 |-def func(x: int):
60 |+def func(x: int) -> Union[str, int, None]:
60 |+def func(x: int) -> Union[str | int | None]:
60 61 | if not x:
61 62 | return 1
62 63 | elif x > 5:
@@ -293,7 +294,7 @@ auto_return_type.py:82:5: ANN201 [*] Missing return type annotation for public f
83 | match x:
84 | case [1, 2, 3]:
|
= help: Add return type annotation: `Union[str, int, None]`
= help: Add return type annotation: `Union[str | int | None]`
Unsafe fix
1 |+from typing import Union
@@ -305,7 +306,7 @@ auto_return_type.py:82:5: ANN201 [*] Missing return type annotation for public f
80 81 |
81 82 |
82 |-def func(x: int):
83 |+def func(x: int) -> Union[str, int, None]:
83 |+def func(x: int) -> Union[str | int | None]:
83 84 | match x:
84 85 | case [1, 2, 3]:
85 86 | return 1
@@ -852,7 +853,7 @@ auto_return_type.py:299:5: ANN201 [*] Missing return type annotation for public
300 | match x:
301 | case [1, 2, 3]:
|
= help: Add return type annotation: `Union[str, int]`
= help: Add return type annotation: `Union[str | int]`
Unsafe fix
214 214 | return 1
@@ -868,7 +869,7 @@ auto_return_type.py:299:5: ANN201 [*] Missing return type annotation for public
297 297 |
298 298 |
299 |-def func(x: int):
299 |+def func(x: int) -> Union[str, int]:
299 |+def func(x: int) -> Union[str | int]:
300 300 | match x:
301 301 | case [1, 2, 3]:
302 302 | return 1

View File

@@ -51,7 +51,7 @@ impl Violation for AsyncBusyWait {
}
/// ASYNC110
pub(crate) fn async_busy_wait(checker: &Checker, while_stmt: &ast::StmtWhile) {
pub(crate) fn async_busy_wait(checker: &mut Checker, while_stmt: &ast::StmtWhile) {
// The body should be a single `await` call.
let [stmt] = while_stmt.body.as_slice() else {
return;
@@ -74,7 +74,7 @@ pub(crate) fn async_busy_wait(checker: &Checker, while_stmt: &ast::StmtWhile) {
qualified_name.segments(),
["trio" | "anyio", "sleep" | "sleep_until"] | ["asyncio", "sleep"]
) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
AsyncBusyWait {
module: AsyncModule::try_from(&qualified_name).unwrap(),
},

View File

@@ -87,7 +87,10 @@ impl Violation for AsyncFunctionWithTimeout {
}
/// ASYNC109
pub(crate) fn async_function_with_timeout(checker: &Checker, function_def: &ast::StmtFunctionDef) {
pub(crate) fn async_function_with_timeout(
checker: &mut Checker,
function_def: &ast::StmtFunctionDef,
) {
// Detect `async` calls with a `timeout` argument.
if !function_def.is_async {
return;
@@ -112,7 +115,7 @@ pub(crate) fn async_function_with_timeout(checker: &Checker, function_def: &ast:
return;
}
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
AsyncFunctionWithTimeout { module },
timeout.range(),
));


@@ -51,7 +51,7 @@ impl AlwaysFixableViolation for AsyncZeroSleep {
}
/// ASYNC115
pub(crate) fn async_zero_sleep(checker: &Checker, call: &ExprCall) {
pub(crate) fn async_zero_sleep(checker: &mut Checker, call: &ExprCall) {
if !(checker.semantic().seen_module(Modules::TRIO)
|| checker.semantic().seen_module(Modules::ANYIO))
{
@@ -103,6 +103,6 @@ pub(crate) fn async_zero_sleep(checker: &Checker, call: &ExprCall) {
let arg_edit = Edit::range_replacement("()".to_string(), call.arguments.range());
Ok(Fix::safe_edits(import_edit, [reference_edit, arg_edit]))
});
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}


@@ -62,7 +62,7 @@ fn is_blocking_http_call(qualified_name: &QualifiedName) -> bool {
}
/// ASYNC210
pub(crate) fn blocking_http_call(checker: &Checker, call: &ExprCall) {
pub(crate) fn blocking_http_call(checker: &mut Checker, call: &ExprCall) {
if checker.semantic().in_async_context() {
if checker
.semantic()
@@ -70,7 +70,7 @@ pub(crate) fn blocking_http_call(checker: &Checker, call: &ExprCall) {
.as_ref()
.is_some_and(is_blocking_http_call)
{
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
BlockingHttpCallInAsyncFunction,
call.func.range(),
));


@@ -44,7 +44,7 @@ impl Violation for BlockingOpenCallInAsyncFunction {
}
/// ASYNC230
pub(crate) fn blocking_open_call(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn blocking_open_call(checker: &mut Checker, call: &ast::ExprCall) {
if !checker.semantic().in_async_context() {
return;
}
@@ -52,7 +52,7 @@ pub(crate) fn blocking_open_call(checker: &Checker, call: &ast::ExprCall) {
if is_open_call(&call.func, checker.semantic())
|| is_open_call_from_pathlib(call.func.as_ref(), checker.semantic())
{
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
BlockingOpenCallInAsyncFunction,
call.func.range(),
));


@@ -112,7 +112,7 @@ impl Violation for WaitForProcessInAsyncFunction {
}
/// ASYNC220, ASYNC221, ASYNC222
pub(crate) fn blocking_process_invocation(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn blocking_process_invocation(checker: &mut Checker, call: &ast::ExprCall) {
if !checker.semantic().in_async_context() {
return;
}
@@ -146,7 +146,7 @@ pub(crate) fn blocking_process_invocation(checker: &Checker, call: &ast::ExprCal
};
let diagnostic = Diagnostic::new::<DiagnosticKind>(diagnostic_kind, call.func.range());
if checker.enabled(diagnostic.kind.rule()) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}


@@ -43,7 +43,7 @@ fn is_blocking_sleep(qualified_name: &QualifiedName) -> bool {
}
/// ASYNC251
pub(crate) fn blocking_sleep(checker: &Checker, call: &ExprCall) {
pub(crate) fn blocking_sleep(checker: &mut Checker, call: &ExprCall) {
if checker.semantic().in_async_context() {
if checker
.semantic()
@@ -51,7 +51,7 @@ pub(crate) fn blocking_sleep(checker: &Checker, call: &ExprCall) {
.as_ref()
.is_some_and(is_blocking_sleep)
{
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
BlockingSleepInAsyncFunction,
call.func.range(),
));


@@ -53,7 +53,7 @@ impl Violation for CancelScopeNoCheckpoint {
/// ASYNC100
pub(crate) fn cancel_scope_no_checkpoint(
checker: &Checker,
checker: &mut Checker,
with_stmt: &StmtWith,
with_items: &[WithItem],
) {
@@ -98,7 +98,7 @@ pub(crate) fn cancel_scope_no_checkpoint(
return;
}
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
CancelScopeNoCheckpoint { method_name },
with_stmt.range,
));


@@ -56,7 +56,7 @@ impl Violation for LongSleepNotForever {
}
/// ASYNC116
pub(crate) fn long_sleep_not_forever(checker: &Checker, call: &ExprCall) {
pub(crate) fn long_sleep_not_forever(checker: &mut Checker, call: &ExprCall) {
if !(checker.semantic().seen_module(Modules::TRIO)
|| checker.semantic().seen_module(Modules::ANYIO))
{
@@ -127,5 +127,5 @@ pub(crate) fn long_sleep_not_forever(checker: &Checker, call: &ExprCall) {
let arg_edit = Edit::range_replacement("()".to_string(), call.arguments.range());
Ok(Fix::unsafe_edits(import_edit, [reference_edit, arg_edit]))
});
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}


@@ -51,7 +51,7 @@ impl Violation for TrioSyncCall {
}
/// ASYNC105
pub(crate) fn sync_call(checker: &Checker, call: &ExprCall) {
pub(crate) fn sync_call(checker: &mut Checker, call: &ExprCall) {
if !checker.semantic().seen_module(Modules::TRIO) {
return;
}
@@ -91,5 +91,5 @@ pub(crate) fn sync_call(checker: &Checker, call: &ExprCall) {
call.func.start(),
)));
}
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}


@@ -61,7 +61,7 @@ enum Reason {
}
/// S103
pub(crate) fn bad_file_permissions(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn bad_file_permissions(checker: &mut Checker, call: &ast::ExprCall) {
if !checker.semantic().seen_module(Modules::OS) {
return;
}
@@ -78,7 +78,7 @@ pub(crate) fn bad_file_permissions(checker: &Checker, call: &ast::ExprCall) {
// The mask is a valid integer value -- check for overly permissive permissions.
Ok(Some(mask)) => {
if (mask & WRITE_WORLD > 0) || (mask & EXECUTE_GROUP > 0) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
BadFilePermissions {
reason: Reason::Permissive(mask),
},
@@ -88,7 +88,7 @@ pub(crate) fn bad_file_permissions(checker: &Checker, call: &ast::ExprCall) {
}
// The mask is an invalid integer value (i.e., it's out of range).
Err(_) => {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
BadFilePermissions {
reason: Reason::Invalid,
},


@@ -44,7 +44,7 @@ impl Violation for DjangoExtra {
}
/// S610
pub(crate) fn django_extra(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn django_extra(checker: &mut Checker, call: &ast::ExprCall) {
let Expr::Attribute(ExprAttribute { attr, .. }) = call.func.as_ref() else {
return;
};
@@ -54,7 +54,9 @@ pub(crate) fn django_extra(checker: &Checker, call: &ast::ExprCall) {
}
if is_call_insecure(call) {
checker.report_diagnostic(Diagnostic::new(DjangoExtra, call.arguments.range()));
checker
.diagnostics
.push(Diagnostic::new(DjangoExtra, call.arguments.range()));
}
}


@@ -35,7 +35,7 @@ impl Violation for DjangoRawSql {
}
/// S611
pub(crate) fn django_raw_sql(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn django_raw_sql(checker: &mut Checker, call: &ast::ExprCall) {
if !checker.semantic().seen_module(Modules::DJANGO) {
return;
}
@@ -55,7 +55,9 @@ pub(crate) fn django_raw_sql(checker: &Checker, call: &ast::ExprCall) {
.find_argument_value("sql", 0)
.is_some_and(Expr::is_string_literal_expr)
{
checker.report_diagnostic(Diagnostic::new(DjangoRawSql, call.func.range()));
checker
.diagnostics
.push(Diagnostic::new(DjangoRawSql, call.func.range()));
}
}
}


@@ -32,8 +32,10 @@ impl Violation for ExecBuiltin {
}
/// S102
pub(crate) fn exec_used(checker: &Checker, func: &Expr) {
pub(crate) fn exec_used(checker: &mut Checker, func: &Expr) {
if checker.semantic().match_builtin_expr(func, "exec") {
checker.report_diagnostic(Diagnostic::new(ExecBuiltin, func.range()));
checker
.diagnostics
.push(Diagnostic::new(ExecBuiltin, func.range()));
}
}


@@ -47,7 +47,7 @@ impl Violation for FlaskDebugTrue {
}
/// S201
pub(crate) fn flask_debug_true(checker: &Checker, call: &ExprCall) {
pub(crate) fn flask_debug_true(checker: &mut Checker, call: &ExprCall) {
let Expr::Attribute(ExprAttribute { attr, value, .. }) = call.func.as_ref() else {
return;
};
@@ -67,6 +67,8 @@ pub(crate) fn flask_debug_true(checker: &Checker, call: &ExprCall) {
if typing::resolve_assignment(value, checker.semantic())
.is_some_and(|qualified_name| matches!(qualified_name.segments(), ["flask", "Flask"]))
{
checker.report_diagnostic(Diagnostic::new(FlaskDebugTrue, debug_argument.range()));
checker
.diagnostics
.push(Diagnostic::new(FlaskDebugTrue, debug_argument.range()));
}
}


@@ -37,12 +37,13 @@ impl Violation for HardcodedBindAllInterfaces {
}
/// S104
pub(crate) fn hardcoded_bind_all_interfaces(checker: &Checker, string: StringLike) {
pub(crate) fn hardcoded_bind_all_interfaces(checker: &mut Checker, string: StringLike) {
match string {
StringLike::String(ast::ExprStringLiteral { value, .. }) => {
if value == "0.0.0.0" {
checker
.report_diagnostic(Diagnostic::new(HardcodedBindAllInterfaces, string.range()));
.diagnostics
.push(Diagnostic::new(HardcodedBindAllInterfaces, string.range()));
}
}
StringLike::FString(ast::ExprFString { value, .. }) => {
@@ -50,16 +51,15 @@ pub(crate) fn hardcoded_bind_all_interfaces(checker: &Checker, string: StringLik
match part {
ast::FStringPart::Literal(literal) => {
if &**literal == "0.0.0.0" {
checker.report_diagnostic(Diagnostic::new(
HardcodedBindAllInterfaces,
literal.range(),
));
checker
.diagnostics
.push(Diagnostic::new(HardcodedBindAllInterfaces, literal.range()));
}
}
ast::FStringPart::FString(f_string) => {
for literal in f_string.elements.literals() {
if &**literal == "0.0.0.0" {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
HardcodedBindAllInterfaces,
literal.range(),
));


@@ -69,13 +69,13 @@ fn check_password_kwarg(parameter: &Parameter, default: &Expr) -> Option<Diagnos
}
/// S107
pub(crate) fn hardcoded_password_default(checker: &Checker, parameters: &Parameters) {
pub(crate) fn hardcoded_password_default(checker: &mut Checker, parameters: &Parameters) {
for parameter in parameters.iter_non_variadic_params() {
let Some(default) = parameter.default() else {
continue;
};
if let Some(diagnostic) = check_password_kwarg(&parameter.parameter, default) {
checker.report_diagnostic(diagnostic);
checker.diagnostics.push(diagnostic);
}
}
}


@@ -51,18 +51,20 @@ impl Violation for HardcodedPasswordFuncArg {
}
/// S106
pub(crate) fn hardcoded_password_func_arg(checker: &Checker, keywords: &[Keyword]) {
checker.report_diagnostics(keywords.iter().filter_map(|keyword| {
string_literal(&keyword.value).filter(|string| !string.is_empty())?;
let arg = keyword.arg.as_ref()?;
if !matches_password_name(arg) {
return None;
}
Some(Diagnostic::new(
HardcodedPasswordFuncArg {
name: arg.to_string(),
},
keyword.range(),
))
}));
pub(crate) fn hardcoded_password_func_arg(checker: &mut Checker, keywords: &[Keyword]) {
checker
.diagnostics
.extend(keywords.iter().filter_map(|keyword| {
string_literal(&keyword.value).filter(|string| !string.is_empty())?;
let arg = keyword.arg.as_ref()?;
if !matches_password_name(arg) {
return None;
}
Some(Diagnostic::new(
HardcodedPasswordFuncArg {
name: arg.to_string(),
},
keyword.range(),
))
}));
}


@@ -72,31 +72,37 @@ fn password_target(target: &Expr) -> Option<&str> {
/// S105
pub(crate) fn compare_to_hardcoded_password_string(
checker: &Checker,
checker: &mut Checker,
left: &Expr,
comparators: &[Expr],
) {
checker.report_diagnostics(comparators.iter().filter_map(|comp| {
string_literal(comp).filter(|string| !string.is_empty())?;
let name = password_target(left)?;
Some(Diagnostic::new(
HardcodedPasswordString {
name: name.to_string(),
},
comp.range(),
))
}));
checker
.diagnostics
.extend(comparators.iter().filter_map(|comp| {
string_literal(comp).filter(|string| !string.is_empty())?;
let name = password_target(left)?;
Some(Diagnostic::new(
HardcodedPasswordString {
name: name.to_string(),
},
comp.range(),
))
}));
}
/// S105
pub(crate) fn assign_hardcoded_password_string(checker: &Checker, value: &Expr, targets: &[Expr]) {
pub(crate) fn assign_hardcoded_password_string(
checker: &mut Checker,
value: &Expr,
targets: &[Expr],
) {
if string_literal(value)
.filter(|string| !string.is_empty())
.is_some()
{
for target in targets {
if let Some(name) = password_target(target) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
HardcodedPasswordString {
name: name.to_string(),
},


@@ -55,7 +55,7 @@ impl Violation for HardcodedSQLExpression {
}
/// S608
pub(crate) fn hardcoded_sql_expression(checker: &Checker, expr: &Expr) {
pub(crate) fn hardcoded_sql_expression(checker: &mut Checker, expr: &Expr) {
let content = match expr {
// "select * from table where val = " + "str" + ...
Expr::BinOp(ast::ExprBinOp {
@@ -105,7 +105,9 @@ pub(crate) fn hardcoded_sql_expression(checker: &Checker, expr: &Expr) {
};
if SQL_REGEX.is_match(&content) {
checker.report_diagnostic(Diagnostic::new(HardcodedSQLExpression, expr.range()));
checker
.diagnostics
.push(Diagnostic::new(HardcodedSQLExpression, expr.range()));
}
}


@@ -56,7 +56,7 @@ impl Violation for HardcodedTempFile {
}
/// S108
pub(crate) fn hardcoded_tmp_directory(checker: &Checker, string: StringLike) {
pub(crate) fn hardcoded_tmp_directory(checker: &mut Checker, string: StringLike) {
match string {
StringLike::String(ast::ExprStringLiteral { value, .. }) => {
check(checker, value.to_str(), string.range());
@@ -79,7 +79,7 @@ pub(crate) fn hardcoded_tmp_directory(checker: &Checker, string: StringLike) {
}
}
fn check(checker: &Checker, value: &str, range: TextRange) {
fn check(checker: &mut Checker, value: &str, range: TextRange) {
if !checker
.settings
.flake8_bandit
@@ -102,7 +102,7 @@ fn check(checker: &Checker, value: &str, range: TextRange) {
}
}
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
HardcodedTempFile {
string: value.to_string(),
},


@@ -64,7 +64,7 @@ impl Violation for HashlibInsecureHashFunction {
}
/// S324
pub(crate) fn hashlib_insecure_hash_functions(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn hashlib_insecure_hash_functions(checker: &mut Checker, call: &ast::ExprCall) {
if !checker
.semantic()
.seen_module(Modules::HASHLIB | Modules::CRYPT)
@@ -105,7 +105,7 @@ pub(crate) fn hashlib_insecure_hash_functions(checker: &Checker, call: &ast::Exp
}
fn detect_insecure_hashlib_calls(
checker: &Checker,
checker: &mut Checker,
call: &ast::ExprCall,
hashlib_call: HashlibCall,
) {
@@ -128,7 +128,7 @@ fn detect_insecure_hashlib_calls(
hash_func_name,
"md4" | "md5" | "sha" | "sha1" | "MD4" | "MD5" | "SHA" | "SHA1"
) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
HashlibInsecureHashFunction {
library: "hashlib".to_string(),
string: hash_func_name.to_string(),
@@ -138,7 +138,7 @@ fn detect_insecure_hashlib_calls(
}
}
HashlibCall::WeakHash(func_name) => {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
HashlibInsecureHashFunction {
library: "hashlib".to_string(),
string: (*func_name).to_string(),
@@ -149,7 +149,7 @@ fn detect_insecure_hashlib_calls(
}
}
fn detect_insecure_crypt_calls(checker: &Checker, call: &ast::ExprCall) {
fn detect_insecure_crypt_calls(checker: &mut Checker, call: &ast::ExprCall) {
let Some(method) = checker
.semantic()
.resolve_qualified_name(&call.func)
@@ -173,7 +173,7 @@ fn detect_insecure_crypt_calls(checker: &Checker, call: &ast::ExprCall) {
qualified_name.segments(),
["crypt", "METHOD_CRYPT" | "METHOD_MD5" | "METHOD_BLOWFISH"]
) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
HashlibInsecureHashFunction {
library: "crypt".to_string(),
string: qualified_name.to_string(),


@@ -56,7 +56,7 @@ impl Violation for Jinja2AutoescapeFalse {
}
/// S701
pub(crate) fn jinja2_autoescape_false(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn jinja2_autoescape_false(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_qualified_name(&call.func)
@@ -70,20 +70,20 @@ pub(crate) fn jinja2_autoescape_false(checker: &Checker, call: &ast::ExprCall) {
Expr::Call(ast::ExprCall { func, .. }) => {
if let Expr::Name(ast::ExprName { id, .. }) = func.as_ref() {
if id != "select_autoescape" {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Jinja2AutoescapeFalse { value: true },
keyword.range(),
));
}
}
}
_ => checker.report_diagnostic(Diagnostic::new(
_ => checker.diagnostics.push(Diagnostic::new(
Jinja2AutoescapeFalse { value: true },
keyword.range(),
)),
}
} else {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
Jinja2AutoescapeFalse { value: false },
call.func.range(),
));


@@ -35,7 +35,7 @@ impl Violation for LoggingConfigInsecureListen {
}
/// S612
pub(crate) fn logging_config_insecure_listen(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn logging_config_insecure_listen(checker: &mut Checker, call: &ast::ExprCall) {
if !checker.semantic().seen_module(Modules::LOGGING) {
return;
}
@@ -51,7 +51,7 @@ pub(crate) fn logging_config_insecure_listen(checker: &Checker, call: &ast::Expr
return;
}
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
LoggingConfigInsecureListen,
call.func.range(),
));


@@ -42,7 +42,7 @@ impl Violation for MakoTemplates {
}
/// S702
pub(crate) fn mako_templates(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn mako_templates(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_qualified_name(&call.func)
@@ -50,6 +50,8 @@ pub(crate) fn mako_templates(checker: &Checker, call: &ast::ExprCall) {
matches!(qualified_name.segments(), ["mako", "template", "Template"])
})
{
checker.report_diagnostic(Diagnostic::new(MakoTemplates, call.func.range()));
checker
.diagnostics
.push(Diagnostic::new(MakoTemplates, call.func.range()));
}
}


@@ -37,7 +37,7 @@ impl Violation for ParamikoCall {
}
/// S601
pub(crate) fn paramiko_call(checker: &Checker, func: &Expr) {
pub(crate) fn paramiko_call(checker: &mut Checker, func: &Expr) {
if checker
.semantic()
.resolve_qualified_name(func)
@@ -45,6 +45,8 @@ pub(crate) fn paramiko_call(checker: &Checker, func: &Expr) {
matches!(qualified_name.segments(), ["paramiko", "exec_command"])
})
{
checker.report_diagnostic(Diagnostic::new(ParamikoCall, func.range()));
checker
.diagnostics
.push(Diagnostic::new(ParamikoCall, func.range()));
}
}


@@ -46,7 +46,7 @@ impl Violation for RequestWithNoCertValidation {
}
/// S501
pub(crate) fn request_with_no_cert_validation(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn request_with_no_cert_validation(checker: &mut Checker, call: &ast::ExprCall) {
if let Some(target) = checker
.semantic()
.resolve_qualified_name(&call.func)
@@ -61,7 +61,7 @@ pub(crate) fn request_with_no_cert_validation(checker: &Checker, call: &ast::Exp
{
if let Some(keyword) = call.arguments.find_keyword("verify") {
if is_const_false(&keyword.value) {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
RequestWithNoCertValidation {
string: target.to_string(),
},


@@ -51,7 +51,7 @@ impl Violation for RequestWithoutTimeout {
}
/// S113
pub(crate) fn request_without_timeout(checker: &Checker, call: &ast::ExprCall) {
pub(crate) fn request_without_timeout(checker: &mut Checker, call: &ast::ExprCall) {
if let Some(module) = checker
.semantic()
.resolve_qualified_name(&call.func)
@@ -67,13 +67,13 @@ pub(crate) fn request_without_timeout(checker: &Checker, call: &ast::ExprCall) {
{
if let Some(keyword) = call.arguments.find_keyword("timeout") {
if keyword.value.is_none_literal_expr() {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
RequestWithoutTimeout { implicit: false, module: module.to_string() },
keyword.range(),
));
}
} else if module == "requests" {
checker.report_diagnostic(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
RequestWithoutTimeout { implicit: true, module: module.to_string() },
call.func.range(),
));

Some files were not shown because too many files have changed in this diff.