Compare commits
23 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 0a0e1926f2 | |
| | 4473c7e905 | |
| | 20930b3675 | |
| | fb1a638a96 | |
| | 16964409a8 | |
| | aacfc9ee0b | |
| | bd14f92898 | |
| | cc116b0192 | |
| | 0d27c0be27 | |
| | 60359c6adf | |
| | 77692e4b5f | |
| | 731f3a74a9 | |
| | 03275c9c98 | |
| | 8d99e317b8 | |
| | d4d67e3014 | |
| | e9a236f740 | |
| | ebb31dc29b | |
| | b9e92affb1 | |
| | 68fbd0f029 | |
| | 8d01efb571 | |
| | da5a25b421 | |
| | bfdab4ac94 | |
| | d1389894a4 | |
@@ -1,6 +1,6 @@
 repos:
 - repo: https://github.com/charliermarsh/ruff-pre-commit
-  rev: v0.0.208
+  rev: v0.0.210
   hooks:
     - id: ruff
@@ -56,31 +56,31 @@ prior to merging.
 
 There are four phases to adding a new lint rule:
 
-1. Define the rule in `src/checks.rs`.
+1. Define the rule in `src/registry.rs`.
 2. Define the _logic_ for triggering the rule in `src/checkers/ast.rs` (for AST-based checks),
    `src/checkers/tokens.rs` (for token-based checks), or `src/checkers/lines.rs` (for text-based checks).
 3. Add a test fixture.
 4. Update the generated files (documentation and generated code).
 
-To define the rule, open up `src/checks.rs`. You'll need to define both a `CheckCode` and
+To define the rule, open up `src/registry.rs`. You'll need to define both a `CheckCode` and
 `CheckKind`. As an example, you can grep for `E402` and `ModuleImportNotAtTopOfFile`, and follow the
 pattern implemented therein.
 
-To trigger the rule, you'll likely want to augment the logic in `src/check_ast.rs`, which defines
+To trigger the rule, you'll likely want to augment the logic in `src/checkers/ast.rs`, which defines
 the Python AST visitor, responsible for iterating over the abstract syntax tree and collecting
 lint-rule violations as it goes. If you need to inspect the AST, you can run
 `cargo +nightly dev print-ast` with a Python file. Grep for the `Check::new` invocations to
 understand how other, similar rules are implemented.
 
-To add a test fixture, create a file under `resources/test/fixtures`, named to match the `CheckCode`
-you defined earlier (e.g., `E402.py`). This file should contain a variety of violations and
-non-violations designed to evaluate and demonstrate the behavior of your lint rule.
+To add a test fixture, create a file under `resources/test/fixtures/[plugin-name]`, named to match
+the `CheckCode` you defined earlier (e.g., `E402.py`). This file should contain a variety of
+violations and non-violations designed to evaluate and demonstrate the behavior of your lint rule.
 
 Run `cargo +nightly dev generate-all` to generate the code for your new fixture. Then run Ruff
-locally with (e.g.) `cargo run resources/test/fixtures/E402.py --no-cache --select E402`.
+locally with (e.g.) `cargo run resources/test/fixtures/pycodestyle/E402.py --no-cache --select E402`.
 
 Once you're satisfied with the output, codify the behavior as a snapshot test by adding a new
-`test_case` macro in the relevant `src/[test-suite-name]/mod.rs` file. Then, run `cargo test`. Your
+`test_case` macro in the relevant `src/[plugin-name]/mod.rs` file. Then, run `cargo test`. Your
 test will fail, but you'll be prompted to follow-up with `cargo insta review`. Accept the generated
 snapshot, then commit the snapshot file alongside the rest of your changes.
@@ -88,21 +88,23 @@ Finally, regenerate the documentation and generated code with `cargo +nightly de
 
 ### Example: Adding a new configuration option
 
-Ruff's user-facing settings live in two places: first, the command-line options defined with
-[clap](https://docs.rs/clap/latest/clap/) via the `Cli` struct in `src/main.rs`; and second, the
-`Config` struct defined `src/pyproject.rs`, which is responsible for extracting user-defined
-settings from a `pyproject.toml` file.
+Ruff's user-facing settings live in a few different places.
 
-Ultimately, these two sources of configuration are merged into the `Settings` struct defined
-in `src/settings.rs`, which is then threaded through the codebase.
+First, the command-line options are defined via the `Cli` struct in `src/cli.rs`.
 
-To add a new configuration option, you'll likely want to _both_ add a CLI option to `src/main.rs`
-_and_ a `pyproject.toml` parameter to `src/pyproject.rs`. If you want to pattern-match against an
-existing example, grep for `dummy_variable_rgx`, which defines a regular expression to match against
-acceptable unused variables (e.g., `_`).
+Second, the `pyproject.toml` options are defined in `src/settings/options.rs` (via the `Options`
+struct), `src/settings/configuration.rs` (via the `Configuration` struct), and `src/settings/mod.rs`
+(via the `Settings` struct). These represent, respectively: the schema used to parse the
+`pyproject.toml` file; an internal, intermediate representation; and the final, internal
+representation used to power Ruff.
 
-If the new plugin's configuration should be cached between runs, you'll need to add it to the
-`Hash` implementation for `Settings` in `src/settings/mod.rs`.
+To add a new configuration option, you'll likely want to modify these latter few files (along with
+`cli.rs`, if appropriate). If you want to pattern-match against an existing example, grep for
+`dummy_variable_rgx`, which defines a regular expression to match against acceptable unused
+variables (e.g., `_`).
+
+Note that plugin-specific configuration options are defined in their own modules (e.g.,
+`src/flake8_unused_arguments/settings.rs`).
 
 You may also want to add the new configuration option to the `flake8-to-ruff` tool, which is
 responsible for converting `flake8` configuration files to Ruff's TOML format. This logic
Cargo.lock (generated) — 8 lines changed
@@ -744,7 +744,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
 
 [[package]]
 name = "flake8-to-ruff"
-version = "0.0.208-dev.0"
+version = "0.0.210-dev.0"
 dependencies = [
  "anyhow",
  "clap 4.0.32",

@@ -1873,7 +1873,7 @@ dependencies = [
 
 [[package]]
 name = "ruff"
-version = "0.0.208"
+version = "0.0.210"
 dependencies = [
  "annotate-snippets 0.9.1",
  "anyhow",

@@ -1941,7 +1941,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_dev"
-version = "0.0.208"
+version = "0.0.210"
 dependencies = [
  "anyhow",
  "clap 4.0.32",

@@ -1962,7 +1962,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_macros"
-version = "0.0.208"
+version = "0.0.210"
 dependencies = [
  "proc-macro2",
  "quote",
@@ -6,7 +6,7 @@ members = [
 
 [package]
 name = "ruff"
-version = "0.0.208"
+version = "0.0.210"
 authors = ["Charlie Marsh <charlie.r.marsh@gmail.com>"]
 edition = "2021"
 rust-version = "1.65.0"

@@ -51,7 +51,7 @@ path-absolutize = { version = "3.0.14", features = ["once_cell_cache", "use_unix
 quick-junit = { version = "0.3.2" }
 regex = { version = "1.6.0" }
 ropey = { version = "1.5.0", features = ["cr_lines", "simd"], default-features = false }
-ruff_macros = { version = "0.0.208", path = "ruff_macros" }
+ruff_macros = { version = "0.0.210", path = "ruff_macros" }
 rustc-hash = { version = "1.1.0" }
 rustpython-ast = { features = ["unparse"], git = "https://github.com/RustPython/RustPython.git", rev = "4d53c7cb27c0379adf8b51c4d3d0d2174f41d590" }
 rustpython-common = { git = "https://github.com/RustPython/RustPython.git", rev = "4d53c7cb27c0379adf8b51c4d3d0d2174f41d590" }
README.md — 85 lines changed
@@ -53,8 +53,10 @@ Ruff is extremely actively developed and used in major open-source projects like
 - [Hatch](https://github.com/pypa/hatch)
 - [Jupyter](https://github.com/jupyter-server/jupyter_server)
 - [Synapse](https://github.com/matrix-org/synapse)
-- [Ibis](https://github.com/ibis-project/ibis)
 - [Saleor](https://github.com/saleor/saleor)
+- [Polars](https://github.com/pola-rs/polars)
+- [Ibis](https://github.com/ibis-project/ibis)
+- [`pyca/cryptography`](https://github.com/pyca/cryptography)
 
 Read the [launch blog post](https://notes.crmarsh.com/python-tooling-could-be-much-much-faster).
@@ -176,7 +178,7 @@ Ruff also works with [pre-commit](https://pre-commit.com):
 
 ```yaml
 - repo: https://github.com/charliermarsh/ruff-pre-commit
   # Ruff version.
-  rev: 'v0.0.208'
+  rev: 'v0.0.210'
   hooks:
     - id: ruff
       # Respect `exclude` and `extend-exclude` settings.
@@ -277,7 +279,7 @@ alternative docstring formats. Enabling `ALL` without further configuration may
 behavior, especially for the `pydocstyle` plugin.
 
 As an alternative to `pyproject.toml`, Ruff will also respect a `ruff.toml` file, which implements
-an equivalent schema (though the `[tool.ruff]` hierarchy can be omitted). For example, the above
+an equivalent schema (though the `[tool.ruff]` hierarchy can be omitted). For example, the
 `pyproject.toml` described above would be represented via the following `ruff.toml`:
 
 ```toml
@@ -404,7 +406,7 @@ directory hierarchy is used for every individual file, with all paths in the `py
 
 There are a few exceptions to these rules:
 
-1. In locating the "closest" `pyproject.toml` file for a given path, Ruff ignore any
+1. In locating the "closest" `pyproject.toml` file for a given path, Ruff ignores any
    `pyproject.toml` files that lack a `[tool.ruff]` section.
 2. If a configuration file is passed directly via `--config`, those settings are used for across
    files. Any relative paths in that configuration file (like `exclude` globs or `src` paths) are
@@ -449,7 +451,7 @@ For example, `ruff /path/to/excluded/file.py` will always check `file.py`.
 ### Ignoring errors
 
 To omit a lint check entirely, add it to the "ignore" list via [`ignore`](#ignore) or
-[`extend-ignore`](#extend-ignore), either on the command-line or in your `project.toml` file.
+[`extend-ignore`](#extend-ignore), either on the command-line or in your `pyproject.toml` file.
 
 To ignore an error inline, Ruff uses a `noqa` system similar to [Flake8](https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html).
 To ignore an individual error, add `# noqa: {code}` to the end of the line, like so:
@@ -693,6 +695,7 @@ For more, see [pyupgrade](https://pypi.org/project/pyupgrade/3.2.0/) on PyPI.
 | UP025 | RewriteUnicodeLiteral | Remove unicode literals from strings | 🛠 |
 | UP026 | RewriteMockImport | `mock` is deprecated, use `unittest.mock` | 🛠 |
 | UP027 | RewriteListComprehension | Replace unpacked list comprehension with a generator expression | 🛠 |
+| UP028 | RewriteYieldFrom | Replace `yield` over `for` loop with `yield from` | 🛠 |
 
 ### pep8-naming (N)
@@ -919,7 +922,7 @@ For more, see [flake8-pytest-style](https://pypi.org/project/flake8-pytest-style
 | PT019 | FixtureParamWithoutValue | Fixture ... without value is injected as parameter, use @pytest.mark.usefixtures instead | |
 | PT020 | DeprecatedYieldFixture | `@pytest.yield_fixture` is deprecated, use `@pytest.fixture` | |
 | PT021 | FixtureFinalizerCallback | Use `yield` instead of `request.addfinalizer` | |
-| PT022 | UselessYieldFixture | No teardown in fixture ..., use `return` instead of `yield` | |
+| PT022 | UselessYieldFixture | No teardown in fixture ..., use `return` instead of `yield` | 🛠 |
 | PT023 | IncorrectMarkParenthesesStyle | Use `@pytest.mark....` over `@pytest.mark....()` | 🛠 |
 | PT024 | UnnecessaryAsyncioMarkOnFixture | `pytest.mark.asyncio` is unnecessary for fixtures | |
 | PT025 | ErroneousUseFixturesOnFixture | `pytest.mark.usefixtures` has no effect on fixtures | |
@@ -960,7 +963,7 @@ For more, see [flake8-simplify](https://pypi.org/project/flake8-simplify/0.19.3/
 | SIM118 | KeyInDict | Use `key in dict` instead of `key in dict.keys()` | 🛠 |
 | SIM222 | OrTrue | Use `True` instead of `... or True` | 🛠 |
 | SIM223 | AndFalse | Use `False` instead of `... and False` | 🛠 |
-| SIM300 | YodaConditions | Use `left == right` instead of `right == left (Yoda-conditions)` | |
+| SIM300 | YodaConditions | Use `left == right` instead of `right == left (Yoda-conditions)` | 🛠 |
 
 ### flake8-tidy-imports (TID)
@@ -1419,7 +1422,7 @@ Ruff can also replace [`isort`](https://pypi.org/project/isort/),
 [`pygrep-hooks`](https://github.com/pre-commit/pygrep-hooks) (3/10), and a subset of the rules
 implemented in [`pyupgrade`](https://pypi.org/project/pyupgrade/) (21/33).
 
-If you're looking to use Ruff, but rely on an unsupported Flake8 plugin, free to file an Issue.
+If you're looking to use Ruff, but rely on an unsupported Flake8 plugin, feel free to file an Issue.
 
 ### Do I need to install Rust to use Ruff?
@@ -1493,53 +1496,21 @@ For example, if you're coming from `flake8-docstrings`, and your originating con
 `--docstring-convention=numpy`, you'd instead set `convention = "numpy"` in your `pyproject.toml`,
 as above.
 
-Note that setting a `convention` is equivalent to adding that convention's specific set of codes to
-your `select`. For example, `convention = "numpy"` is equivalent to:
+Alongside `convention`, you'll want to explicitly enable the `D` error code class, like so:
 
 ```toml
 [tool.ruff]
-# Enable all `D` errors except `D107`, `D203`, `D212`, `D213`, `D402`, `D413`, `D415`, `D416`,
-# and `D417`.
 select = [
-    "D100",
-    "D101",
-    "D102",
-    "D103",
-    "D104",
-    "D105",
-    "D106",
-    "D200",
-    "D201",
-    "D202",
-    "D204",
-    "D205",
-    "D206",
-    "D207",
-    "D208",
-    "D209",
-    "D210",
-    "D211",
-    "D214",
-    "D215",
-    "D300",
-    "D301",
-    "D400",
-    "D403",
-    "D404",
-    "D405",
-    "D406",
-    "D407",
-    "D408",
-    "D409",
-    "D410",
-    "D411",
-    "D412",
-    "D414",
-    "D418",
-    "D419",
+    "D",
 ]
 
 [tool.ruff.pydocstyle]
 convention = "google"
 ```
 
+Setting a `convention` force-disables any rules that are incompatible with that convention, no
+matter how they're provided, which avoids accidental incompatibilities and simplifies configuration.
+
 ### How can I tell what settings Ruff is using to check my code?
 
 Run `ruff /path/to/code.py --show-settings` to view the resolved settings for a given file.
@@ -2851,6 +2822,24 @@ known-third-party = ["src"]
 
 ---
 
+#### [`order-by-type`](#order-by-type)
+
+Order imports by type, which is determined by case, in addition to
+alphabetically.
+
+**Default value**: `true`
+
+**Type**: `bool`
+
+**Example usage**:
+
+```toml
+[tool.ruff.isort]
+order-by-type = true
+```
+
+---
+
 #### [`single-line-exclusions`](#single-line-exclusions)
 
 One or more modules to exclude from the single line rule.
flake8_to_ruff/Cargo.lock (generated) — 4 lines changed
@@ -771,7 +771,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
 
 [[package]]
 name = "flake8_to_ruff"
-version = "0.0.208"
+version = "0.0.210"
 dependencies = [
  "anyhow",
  "clap",

@@ -1975,7 +1975,7 @@ dependencies = [
 
 [[package]]
 name = "ruff"
-version = "0.0.208"
+version = "0.0.210"
 dependencies = [
  "anyhow",
  "bincode",
@@ -1,6 +1,6 @@
 [package]
 name = "flake8-to-ruff"
-version = "0.0.208-dev.0"
+version = "0.0.210-dev.0"
 edition = "2021"
 
 [lib]
@@ -74,7 +74,7 @@ pub fn convert(
                 .as_ref()
                 .map(|value| BTreeSet::from_iter(parser::parse_prefix_codes(value)))
         })
-        .unwrap_or_else(|| plugin::resolve_select(flake8, &plugins));
+        .unwrap_or_else(|| plugin::resolve_select(&plugins));
     let mut ignore = flake8
         .get("ignore")
         .and_then(|value| {

@@ -291,13 +291,6 @@ pub fn convert(
         }
     }
 
-    // Set default `convention`.
-    if plugins.contains(&Plugin::Flake8Docstrings) {
-        if pydocstyle.convention.is_none() {
-            pydocstyle.convention = Some(Convention::Pep257);
-        }
-    }
-
     // Deduplicate and sort.
     options.select = Some(Vec::from_iter(select));
     options.ignore = Some(Vec::from_iter(ignore));

@@ -697,6 +690,7 @@ mod tests {
             required_version: None,
             respect_gitignore: None,
             select: Some(vec![
+                CheckCodePrefix::D,
                 CheckCodePrefix::E,
                 CheckCodePrefix::F,
                 CheckCodePrefix::W,
@@ -97,7 +97,7 @@ impl fmt::Debug for Plugin {
 }
 
 impl Plugin {
-    pub fn default(&self) -> CheckCodePrefix {
+    pub fn prefix(&self) -> CheckCodePrefix {
         match self {
             Plugin::Flake8Annotations => CheckCodePrefix::ANN,
             Plugin::Flake8Bandit => CheckCodePrefix::S,

@@ -125,48 +125,6 @@ impl Plugin {
             Plugin::Pyupgrade => CheckCodePrefix::UP,
         }
     }
-
-    pub fn select(&self, flake8: &HashMap<String, Option<String>>) -> Vec<CheckCodePrefix> {
-        match self {
-            Plugin::Flake8Annotations => vec![CheckCodePrefix::ANN],
-            Plugin::Flake8Bandit => vec![CheckCodePrefix::S],
-            Plugin::Flake8BlindExcept => vec![CheckCodePrefix::BLE],
-            Plugin::Flake8Bugbear => vec![CheckCodePrefix::B],
-            Plugin::Flake8Builtins => vec![CheckCodePrefix::A],
-            Plugin::Flake8Comprehensions => vec![CheckCodePrefix::C4],
-            Plugin::Flake8Datetimez => vec![CheckCodePrefix::DTZ],
-            Plugin::Flake8Debugger => vec![CheckCodePrefix::T1],
-            Plugin::Flake8Docstrings => {
-                // Use the user-provided docstring.
-                for key in ["docstring-convention", "docstring_convention"] {
-                    if let Some(Some(value)) = flake8.get(key) {
-                        match DocstringConvention::from_str(value) {
-                            Ok(convention) => return convention.select(),
-                            Err(e) => {
-                                eprintln!("Unexpected '{key}' value: {value} ({e}");
-                                return vec![];
-                            }
-                        }
-                    }
-                }
-                // Default to PEP257.
-                DocstringConvention::Pep257.select()
-            }
-            Plugin::Flake8Eradicate => vec![CheckCodePrefix::ERA],
-            Plugin::Flake8ErrMsg => vec![CheckCodePrefix::EM],
-            Plugin::Flake8ImplicitStrConcat => vec![CheckCodePrefix::ISC],
-            Plugin::Flake8Print => vec![CheckCodePrefix::T2],
-            Plugin::Flake8PytestStyle => vec![CheckCodePrefix::PT],
-            Plugin::Flake8Quotes => vec![CheckCodePrefix::Q],
-            Plugin::Flake8Return => vec![CheckCodePrefix::RET],
-            Plugin::Flake8Simplify => vec![CheckCodePrefix::SIM],
-            Plugin::Flake8TidyImports => vec![CheckCodePrefix::TID],
-            Plugin::McCabe => vec![CheckCodePrefix::C9],
-            Plugin::PandasVet => vec![CheckCodePrefix::PD],
-            Plugin::PEP8Naming => vec![CheckCodePrefix::N],
-            Plugin::Pyupgrade => vec![CheckCodePrefix::UP],
-        }
-    }
 }
 
 pub enum DocstringConvention {

@@ -190,23 +148,6 @@ impl FromStr for DocstringConvention {
     }
 }
 
-impl DocstringConvention {
-    fn select(&self) -> Vec<CheckCodePrefix> {
-        match self {
-            DocstringConvention::All => vec![CheckCodePrefix::D],
-            DocstringConvention::Pep257 => vec![
-                // Covered by the `convention` setting.
-            ],
-            DocstringConvention::Numpy => vec![
-                // Covered by the `convention` setting.
-            ],
-            DocstringConvention::Google => vec![
-                // Covered by the `convention` setting.
-            ],
-        }
-    }
-}
-
 /// Infer the enabled plugins based on user-provided options.
 ///
 /// For example, if the user specified a `mypy-init-return` setting, we should

@@ -356,7 +297,7 @@ pub fn infer_plugins_from_codes(codes: &BTreeSet<CheckCodePrefix>) -> Vec<Plugin
             if prefix
                 .codes()
                 .iter()
-                .any(|code| plugin.default().codes().contains(code))
+                .any(|code| plugin.prefix().codes().contains(code))
             {
                 return true;
             }

@@ -367,18 +308,9 @@ pub fn infer_plugins_from_codes(codes: &BTreeSet<CheckCodePrefix>) -> Vec<Plugin
 }
 
 /// Resolve the set of enabled `CheckCodePrefix` values for the given plugins.
-pub fn resolve_select(
-    flake8: &HashMap<String, Option<String>>,
-    plugins: &[Plugin],
-) -> BTreeSet<CheckCodePrefix> {
-    // Include default Pyflakes and pycodestyle checks.
+pub fn resolve_select(plugins: &[Plugin]) -> BTreeSet<CheckCodePrefix> {
     let mut select = BTreeSet::from([CheckCodePrefix::F, CheckCodePrefix::E, CheckCodePrefix::W]);
 
-    // Add prefix codes for every plugin.
-    for plugin in plugins {
-        select.extend(plugin.select(flake8));
-    }
-
+    select.extend(plugins.iter().map(Plugin::prefix));
     select
 }
@@ -4,7 +4,7 @@ build-backend = "maturin"
 
 [project]
 name = "ruff"
-version = "0.0.208"
+version = "0.0.210"
 description = "An extremely fast Python linter, written in Rust."
 authors = [
     { name = "Charlie Marsh", email = "charlie.r.marsh@gmail.com" },
resources/test/fixtures/flake8_pie/PIE790.py (vendored) — 5 lines changed
@@ -9,6 +9,11 @@ if foo:
     pass  # PIE790
 
 
+def multi_statement() -> None:
+    """This is a function."""
+    pass; print("hello")
+
+
 if foo:
     pass
 else:
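The fixture addition above exercises PIE790 (unnecessary `pass`). As a hedged sketch of the pattern the rule targets (the function names here are invented for illustration, not taken from the diff): a `pass` after a docstring is redundant, because the docstring alone is a valid function body.

```python
def with_redundant_pass() -> None:
    """This docstring already forms a complete function body."""
    pass  # would be flagged: the pass adds nothing


def without_pass() -> None:
    """Same function with the redundant pass removed."""


# Both definitions are valid and behave identically.
assert with_redundant_pass() is None
assert without_pass() is None
```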
@@ -11,6 +11,16 @@ def test_csv(param1, param2):
     ...
 
 
+@pytest.mark.parametrize(" param1, , param2 , ", [(1, 2), (3, 4)])
+def test_csv_with_whitespace(param1, param2):
+    ...
+
+
+@pytest.mark.parametrize("param1,param2", [(1, 2), (3, 4)])
+def test_csv_bad_quotes(param1, param2):
+    ...
+
+
 @pytest.mark.parametrize(("param1", "param2"), [(1, 2), (3, 4)])
 def test_tuple(param1, param2):
     ...

@@ -29,3 +39,13 @@ def test_list(param1, param2):
 @pytest.mark.parametrize(["param1"], [1, 2, 3])
 def test_list_one_elem(param1, param2):
     ...
+
+
+@pytest.mark.parametrize([some_expr, another_expr], [1, 2, 3])
+def test_list_expressions(param1, param2):
+    ...
+
+
+@pytest.mark.parametrize([some_expr, "param2"], [1, 2, 3])
+def test_list_mixed_expr_literal(param1, param2):
+    ...
@@ -1,12 +1,22 @@
-key in dict.keys()  # SIM118
+key in obj.keys()  # SIM118
 
-foo["bar"] in dict.keys()  # SIM118
+foo["bar"] in obj.keys()  # SIM118
 
-foo() in dict.keys()  # SIM118
+foo['bar'] in obj.keys()  # SIM118
 
-for key in dict.keys():  # SIM118
+foo() in obj.keys()  # SIM118
+
+for key in obj.keys():  # SIM118
     pass
 
-for key in list(dict.keys()):
+for key in list(obj.keys()):
     if some_property(key):
-        del dict[key]
+        del obj[key]
+
+[k for k in obj.keys()]  # SIM118
+
+{k for k in obj.keys()}  # SIM118
+
+{k: k for k in obj.keys()}  # SIM118
+
+(k for k in obj.keys())  # SIM118
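The SIM118 fixture above relies on the fact that membership tests consult a dict's keys directly. A small sketch confirming the equivalence the rule exploits (the dict here is illustrative):

```python
d = {"a": 1, "b": 2}

# `key in d` and `key in d.keys()` are equivalent for dicts; the former
# skips building a keys view and reads more directly.
assert ("a" in d) == ("a" in d.keys())
assert ("z" in d) == ("z" in d.keys())

# The same holds inside comprehensions, which the new fixture lines cover.
assert [k for k in d] == [k for k in d.keys()]
```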
@@ -1,5 +1,6 @@
 # Errors
 "yoda" == compare  # SIM300
+'yoda' == compare  # SIM300
 42 == age  # SIM300
 
 # OK
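For context on the SIM300 fixture: a Yoda condition puts the constant on the left of the comparison. The result is identical either way, which is why the rewrite (now autofixable, per the README change in this diff) is purely stylistic. A minimal sketch with illustrative values:

```python
compare = "yoda"
age = 42

# Yoda style (flagged) and the preferred style evaluate identically.
assert ("yoda" == compare) == (compare == "yoda")
assert (42 == age) == (age == 42)
```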
resources/test/fixtures/isort/inline_comments.py (vendored, new file) — 11 lines
@@ -0,0 +1,11 @@
+from a.prometheus.metrics import (  # type:ignore[attr-defined]
+    TERMINAL_CURRENTLY_RUNNING_TOTAL,
+)
+
+from b.prometheus.metrics import (
+    TERMINAL_CURRENTLY_RUNNING_TOTAL,  # type:ignore[attr-defined]
+)
+
+from c.prometheus.metrics import TERMINAL_CURRENTLY_RUNNING_TOTAL  # type:ignore[attr-defined]
+
+from d.prometheus.metrics import TERMINAL_CURRENTLY_RUNNING_TOTAL, OTHER_RUNNING_TOTAL  # type:ignore[attr-defined]
resources/test/fixtures/pydocstyle/D417.py (vendored) — 45 lines changed
@@ -1,5 +1,5 @@
 def f(x, y, z):
-    """Do f.
+    """Do something.
 
     Args:
         x: the value

@@ -12,7 +12,7 @@ def f(x, y, z):
 
 
 def f(x, y, z):
-    """Do f.
+    """Do something.
 
     Args:
         x:

@@ -25,7 +25,7 @@ def f(x, y, z):
 
 
 def f(x, y, z):
-    """Do f.
+    """Do something.
 
     Args:
         x:

@@ -37,7 +37,7 @@ def f(x, y, z):
 
 
 def f(x, y, z):
-    """Do f.
+    """Do something.
 
     Args:
         x: the value def

@@ -50,7 +50,7 @@ def f(x, y, z):
 
 
 def f(x, y, z):
-    """Do f.
+    """Do something.
 
     Args:
         x: the value

@@ -63,7 +63,7 @@ def f(x, y, z):
 
 
 def f(x, y, z):
-    """Do g.
+    """Do something.
 
     Args:
         x: the value

@@ -75,10 +75,41 @@ def f(x, y, z):
 
 
 def f(x, y, z):
-    """Do h.
+    """Do something.
 
     Args:
         x: the value
         z: A final argument
     """
     return x
+
+
+def f(x, *args, **kwargs):
+    """Do something.
+
+    Args:
+        x: the value
+        *args: variable arguments
+        **kwargs: keyword arguments
+    """
+    return x
+
+
+def f(x, *args, **kwargs):
+    """Do something.
+
+    Args:
+        *args: variable arguments
+        **kwargs: keyword arguments
+    """
+    return x
+
+
+def f(x, *args, **kwargs):
+    """Do something.
+
+    Args:
+        x: the value
+        **kwargs: keyword arguments
+    """
+    return x
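The D417 fixture above checks that every function argument is documented with a description. A hedged sketch of a Google-style docstring that would satisfy the rule (the function and argument names are illustrative, not from the diff):

```python
def scale(x, factor):
    """Scale a value.

    Args:
        x: the value to scale.
        factor: the multiplier applied to x.
    """
    return x * factor
```

Dropping `factor` from the `Args:` section (or leaving its description blank, as several fixture cases do) is the kind of omission D417 is meant to catch.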
resources/test/fixtures/pyflakes/F821_8.pyi (vendored, new file) — 3 lines
@@ -0,0 +1,3 @@
+class Pool:
+    @classmethod
+    def create(cls) -> Pool: ...
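The new `.pyi` fixture above annotates a method with the class currently being defined, which is legal in stub files because stub annotations are never evaluated. In a regular module, the equivalent pattern (a sketch, not part of the diff) needs postponed annotation evaluation to avoid a `NameError`:

```python
from __future__ import annotations  # defers evaluation, so `Pool` may appear in annotations before the class body completes


class Pool:
    @classmethod
    def create(cls) -> Pool:
        return cls()
```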
resources/test/fixtures/pyupgrade/UP018.py (vendored) — 2 lines changed
@@ -7,12 +7,14 @@ str("foo", **k)
 str("foo", encoding="UTF-8")
 str("foo"
     "bar")
+str(b"foo")
 bytes("foo", encoding="UTF-8")
 bytes(*a)
 bytes("foo", *a)
 bytes("foo", **a)
 bytes(b"foo"
       b"bar")
+bytes("foo")
 
 # These become string or byte literals
 str()
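The UP018 fixture separates calls that can safely become literals from those that cannot. A quick sketch of the equivalences behind the rewrites (only cases whose behavior is unambiguous are asserted here):

```python
# str()/bytes() around a matching literal are just the literal itself,
# so the call wrapper can be dropped.
assert str("foo") == "foo"
assert bytes(b"foo") == b"foo"
assert str() == ""
assert bytes() == b""

# A call with an encoding argument is a real conversion, not a literal,
# which is why such cases stay untouched in the fixture.
assert bytes("foo", encoding="UTF-8") == b"foo"
```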
resources/test/fixtures/pyupgrade/UP028_0.py (vendored, new file) — 74 lines
@@ -0,0 +1,74 @@
+def f():
+    for x in y:
+        yield x
+
+
+def g():
+    for x, y in z:
+        yield (x, y)
+
+
+def h():
+    for x in [1, 2, 3]:
+        yield x
+
+
+def i():
+    for x in {x for x in y}:
+        yield x
+
+
+def j():
+    for x in (1, 2, 3):
+        yield x
+
+
+def k():
+    for x, y in {3: "x", 6: "y"}:
+        yield x, y
+
+
+def f():  # Comment one\n'
+    # Comment two\n'
+    for x, y in {  # Comment three\n'
+        3: "x",  # Comment four\n'
+        # Comment five\n'
+        6: "y",  # Comment six\n'
+    }:  # Comment seven\n'
+        # Comment eight\n'
+        yield x, y  # Comment nine\n'
+        # Comment ten',
+
+
+def f():
+    for x, y in [{3: (3, [44, "long ss"]), 6: "y"}]:
+        yield x, y
+
+
+def f():
+    for x, y in z():
+        yield x, y
+
+
+def f():
+    def func():
+        # This comment is preserved\n'
+        for x, y in z():  # Comment one\n'
+            # Comment two\n'
+            yield x, y  # Comment three\n'
+            # Comment four\n'
+    # Comment\n'
+    def g():
+        print(3)
+
+
+def f():
+    for x in y:
+        yield x
+    for z in x:
+        yield z
+
+
+def f():
+    for x, y in z():
+        yield x, y
+    x = 1
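Each function in the new UP028 fixture above contains a loop that only re-yields its items, which is exactly what `yield from` expresses. A hedged sketch of the before/after equivalence (function names are illustrative):

```python
def manual(y):
    # The shape UP028 flags: a bare loop that only re-yields.
    for x in y:
        yield x


def rewritten(y):
    # The `yield from` form the rule rewrites to.
    yield from y


# Both generators produce the same sequence.
assert list(manual([1, 2, 3])) == list(rewritten([1, 2, 3])) == [1, 2, 3]
```

The companion fixture below collects the cases where the rewrite would change behavior (the loop variable is reused, reassigned, or the loop has an `else` clause), so those must not be touched.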
resources/test/fixtures/pyupgrade/UP028_1.py (vendored, new file) — 113 lines
@@ -0,0 +1,113 @@
+# These should NOT change
+def f():
+    for x in z:
+        yield
+
+
+def f():
+    for x in z:
+        yield y
+
+
+def f():
+    for x, y in z:
+        yield x
+
+
+def f():
+    for x, y in z:
+        yield y
+
+
+def f():
+    for a, b in z:
+        yield x, y
+
+
+def f():
+    for x, y in z:
+        yield y, x
+
+
+def f():
+    for x, y, c in z:
+        yield x, y
+
+
+def f():
+    for x in z:
+        x = 22
+        yield x
+
+
+def f():
+    for x in z:
+        yield x
+    else:
+        print("boom!")
+
+
+def f():
+    for x in range(5):
+        yield x
+    print(x)
+
+
+def f():
+    def g():
+        print(x)
+
+    for x in range(5):
+        yield x
+    g()
+
+
+def f():
+    def g():
+        def h():
+            print(x)
+
+        return h
+
+    for x in range(5):
+        yield x
+    g()()
+
+
+def f(x):
+    for x in y:
+        yield x
+    del x
+
+
+async def f():
+    for x in y:
+        yield x
+
+
+def f():
+    x = 1
+    print(x)
+    for x in y:
+        yield x
+
+
+def f():
+    for x in y:
+        yield x
+    print(x)
+
+
+def f():
+    for x in y:
+        yield x
+    z = lambda: x
+
+
+def f():
+    for x in y:
+        yield x
+
+    class C:
+        def __init__(self):
+            print(x)
@@ -969,6 +969,7 @@
         "UP025",
         "UP026",
         "UP027",
+        "UP028",
         "W",
         "W2",
         "W29",

@@ -1324,6 +1325,13 @@
             "type": "string"
           }
         },
+        "order-by-type": {
+          "description": "Order imports by type, which is determined by case, in addition to alphabetically.",
+          "type": [
+            "boolean",
+            "null"
+          ]
+        },
         "single-line-exclusions": {
           "description": "One or more modules to exclude from the single line rule.",
           "type": [
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_dev"
-version = "0.0.208"
+version = "0.0.210"
 edition = "2021"
 
 [dependencies]

@@ -1,6 +1,6 @@
 [package]
 name = "ruff_macros"
-version = "0.0.208"
+version = "0.0.210"
 edition = "2021"
 
 [lib]
@@ -6,7 +6,7 @@ use itertools::Itertools;
use log::error;
use nohash_hasher::IntMap;
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Located, Location};
use rustpython_ast::{Comprehension, Located, Location};
use rustpython_common::cformat::{CFormatError, CFormatErrorType};
use rustpython_parser::ast::{
    Arg, Arguments, Constant, Excepthandler, ExcepthandlerKind, Expr, ExprContext, ExprKind,

@@ -151,7 +151,7 @@ impl<'a> Checker<'a> {
            in_subscript: false,
            seen_import_boundary: false,
            futures_allowed: true,
            annotations_future_enabled: false,
            annotations_future_enabled: path.extension().map_or(false, |ext| ext == "pyi"),
            except_handlers: vec![],
            // Check-specific state.
            flake8_bugbear_seen: vec![],

@@ -1357,6 +1357,9 @@ where
                    body,
                    &Documentable::Function,
                );
                if self.settings.enabled.contains(&CheckCode::UP028) {
                    pyupgrade::plugins::rewrite_yield_from(self, stmt);
                }
                let scope = transition_scope(&self.visible_scope, stmt, &Documentable::Function);
                self.definitions
                    .push((definition, scope.visibility.clone()));

@@ -2922,6 +2925,17 @@ where
        }
    }

    fn visit_comprehension(&mut self, comprehension: &'b Comprehension) {
        if self.settings.enabled.contains(&CheckCode::SIM118) {
            flake8_simplify::plugins::key_in_dict_for(
                self,
                &comprehension.target,
                &comprehension.iter,
            );
        }
        visitor::walk_comprehension(self, comprehension);
    }

    fn visit_arguments(&mut self, arguments: &'b Arguments) {
        if self.settings.enabled.contains(&CheckCode::B006) {
            flake8_bugbear::plugins::mutable_argument_default(self, arguments);
@@ -1,3 +1,4 @@
use log::error;
use rustpython_ast::{Constant, Expr, ExprContext, ExprKind, Location, Stmt, StmtKind};

use crate::ast::types::Range;

@@ -53,13 +54,16 @@ pub fn assert_false(checker: &mut Checker, stmt: &Stmt, test: &Expr, msg: Option
            checker.style.line_ending(),
        );
        generator.unparse_stmt(&assertion_error(msg));
        if let Ok(content) = generator.generate() {
            check.amend(Fix::replacement(
                content,
                stmt.location,
                stmt.end_location.unwrap(),
            ));
        }
        match generator.generate() {
            Ok(content) => {
                check.amend(Fix::replacement(
                    content,
                    stmt.location,
                    stmt.end_location.unwrap(),
                ));
            }
            Err(e) => error!("Failed to rewrite `assert False`: {e}"),
        };
    }
    checker.add_check(check);
}
@@ -1,4 +1,5 @@
use itertools::Itertools;
use log::error;
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Excepthandler, ExcepthandlerKind, Expr, ExprContext, ExprKind, Location};

@@ -64,12 +65,15 @@ fn duplicate_handler_exceptions<'a>(
        } else {
            generator.unparse_expr(&type_pattern(unique_elts), 0);
        }
        if let Ok(content) = generator.generate() {
            check.amend(Fix::replacement(
                content,
                expr.location,
                expr.end_location.unwrap(),
            ));
        match generator.generate() {
            Ok(content) => {
                check.amend(Fix::replacement(
                    content,
                    expr.location,
                    expr.end_location.unwrap(),
                ));
            }
            Err(e) => error!("Failed to remove duplicate exceptions: {e}"),
        }
    }
    checker.add_check(check);
@@ -1,3 +1,4 @@
use log::error;
use rustpython_ast::{Constant, Expr, ExprContext, ExprKind, Location};

use crate::ast::types::Range;

@@ -52,12 +53,15 @@ pub fn getattr_with_constant(checker: &mut Checker, expr: &Expr, func: &Expr, ar
        checker.style.line_ending(),
    );
    generator.unparse_expr(&attribute(obj, value), 0);
    if let Ok(content) = generator.generate() {
        check.amend(Fix::replacement(
            content,
            expr.location,
            expr.end_location.unwrap(),
        ));
    match generator.generate() {
        Ok(content) => {
            check.amend(Fix::replacement(
                content,
                expr.location,
                expr.end_location.unwrap(),
            ));
        }
        Err(e) => error!("Failed to rewrite `getattr`: {e}"),
    }
    }
    checker.add_check(check);
@@ -1,3 +1,4 @@
use log::error;
use rustpython_ast::{Excepthandler, ExcepthandlerKind, ExprKind};

use crate::ast::types::Range;

@@ -29,12 +30,15 @@ pub fn redundant_tuple_in_exception_handler(checker: &mut Checker, handlers: &[E
        checker.style.line_ending(),
    );
    generator.unparse_expr(elt, 0);
    if let Ok(content) = generator.generate() {
        check.amend(Fix::replacement(
            content,
            type_.location,
            type_.end_location.unwrap(),
        ));
    match generator.generate() {
        Ok(content) => {
            check.amend(Fix::replacement(
                content,
                type_.location,
                type_.end_location.unwrap(),
            ));
        }
        Err(e) => error!("Failed to remove redundant tuple: {e}"),
    }
    }
    checker.add_check(check);
@@ -1,7 +1,9 @@
use log::error;
use rustc_hash::FxHashSet;
use rustpython_ast::{Constant, Expr, ExprKind, Stmt, StmtKind};

use crate::ast::types::Range;
use crate::autofix::helpers::delete_stmt;
use crate::autofix::Fix;
use crate::checkers::ast::Checker;
use crate::registry::{Check, CheckCode, CheckKind};

@@ -27,10 +29,14 @@ pub fn no_unnecessary_pass(checker: &mut Checker, body: &[Stmt]) {
            let mut check =
                Check::new(CheckKind::NoUnnecessaryPass, Range::from_located(pass_stmt));
            if checker.patch(&CheckCode::PIE790) {
                check.amend(Fix::deletion(
                    pass_stmt.location,
                    pass_stmt.end_location.unwrap(),
                ));
                match delete_stmt(pass_stmt, None, &[], checker.locator) {
                    Ok(fix) => {
                        check.amend(fix);
                    }
                    Err(e) => {
                        error!("Failed to delete `pass` statement: {}", e);
                    }
                }
            }
            checker.add_check(check);
        }
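The change above swaps a raw `Fix::deletion` of just the `pass` token for `delete_stmt`, which removes the whole statement (including its line) when that is safe. A hedged Python illustration of the kind of code PIE790 targets (the class name is made up for the example):

```python
class Placeholder:
    """A docstring already makes this body valid."""

    pass  # PIE790: this `pass` is unnecessary and can be deleted


# The class behaves identically without the `pass`; the docstring
# alone is a complete suite.
assert Placeholder.__doc__.startswith("A docstring")
```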
@@ -13,10 +13,10 @@ expression: checks
    content: ""
    location:
      row: 4
      column: 4
      column: 0
    end_location:
      row: 4
      column: 8
      row: 5
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:

@@ -29,138 +29,138 @@ expression: checks
    content: ""
    location:
      row: 9
      column: 4
      column: 0
    end_location:
      row: 9
      column: 8
      row: 10
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 16
    row: 14
    column: 4
  end_location:
    row: 16
    row: 14
    column: 8
  fix:
    content: ""
    location:
      row: 16
      row: 14
      column: 4
    end_location:
      row: 16
      column: 8
      row: 14
      column: 10
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 23
    row: 21
    column: 4
  end_location:
    row: 23
    row: 21
    column: 8
  fix:
    content: ""
    location:
      row: 23
      column: 4
      row: 21
      column: 0
    end_location:
      row: 23
      column: 8
      row: 22
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 30
    row: 28
    column: 4
  end_location:
    row: 30
    row: 28
    column: 8
  fix:
    content: ""
    location:
      row: 30
      column: 4
      row: 28
      column: 0
    end_location:
      row: 30
      column: 8
      row: 29
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 37
    row: 35
    column: 4
  end_location:
    row: 37
    row: 35
    column: 8
  fix:
    content: ""
    location:
      row: 37
      column: 4
      row: 35
      column: 0
    end_location:
      row: 37
      column: 8
      row: 36
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 45
    row: 42
    column: 4
  end_location:
    row: 45
    row: 42
    column: 8
  fix:
    content: ""
    location:
      row: 45
      column: 4
      row: 42
      column: 0
    end_location:
      row: 45
      column: 8
      row: 43
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 53
    row: 50
    column: 4
  end_location:
    row: 53
    row: 50
    column: 8
  fix:
    content: ""
    location:
      row: 53
      column: 4
      row: 50
      column: 0
    end_location:
      row: 53
      column: 8
      row: 51
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 60
    row: 58
    column: 4
  end_location:
    row: 60
    row: 58
    column: 8
  fix:
    content: ""
    location:
      row: 60
      column: 4
      row: 58
      column: 0
    end_location:
      row: 60
      column: 8
      row: 59
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 69
    row: 65
    column: 4
  end_location:
    row: 69
    row: 65
    column: 8
  fix:
    content: ""
    location:
      row: 69
      column: 4
      row: 65
      column: 0
    end_location:
      row: 69
      column: 8
      row: 66
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:

@@ -173,42 +173,42 @@ expression: checks
    content: ""
    location:
      row: 74
      column: 4
      column: 0
    end_location:
      row: 74
      column: 8
      row: 75
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 78
    row: 79
    column: 4
  end_location:
    row: 78
    row: 79
    column: 8
  fix:
    content: ""
    location:
      row: 78
      column: 4
      row: 79
      column: 0
    end_location:
      row: 78
      column: 8
      row: 80
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 82
    row: 83
    column: 4
  end_location:
    row: 82
    row: 83
    column: 8
  fix:
    content: ""
    location:
      row: 82
      column: 4
      row: 83
      column: 0
    end_location:
      row: 82
      column: 8
      row: 84
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:

@@ -221,25 +221,41 @@ expression: checks
    content: ""
    location:
      row: 87
      column: 4
      column: 0
    end_location:
      row: 87
      column: 8
      row: 88
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 91
    row: 92
    column: 4
  end_location:
    row: 91
    row: 92
    column: 8
  fix:
    content: ""
    location:
      row: 91
      column: 4
      row: 92
      column: 0
    end_location:
      row: 91
      column: 8
      row: 93
      column: 0
  parent: ~
- kind: NoUnnecessaryPass
  location:
    row: 96
    column: 4
  end_location:
    row: 96
    column: 8
  fix:
    content: ""
    location:
      row: 96
      column: 0
    end_location:
      row: 97
      column: 0
  parent: ~
@@ -1,4 +1,4 @@
use rustpython_ast::{Arguments, Expr, ExprKind, Stmt, StmtKind};
use rustpython_ast::{Arguments, Expr, ExprKind, Location, Stmt, StmtKind};

use super::helpers::{
    get_mark_decorators, get_mark_name, is_abstractmethod_decorator, is_pytest_fixture,

@@ -171,14 +171,25 @@ fn check_fixture_returns(checker: &mut Checker, func: &Stmt, func_name: &str, bo
    }

    if checker.settings.enabled.contains(&CheckCode::PT022) {
        if let Some(last_statement) = body.last() {
            if let StmtKind::Expr { value, .. } = &last_statement.node {
        if let Some(stmt) = body.last() {
            if let StmtKind::Expr { value, .. } = &stmt.node {
                if let ExprKind::Yield { .. } = value.node {
                    if visitor.yield_statements.len() == 1 {
                        checker.add_check(Check::new(
                        let mut check = Check::new(
                            CheckKind::UselessYieldFixture(func_name.to_string()),
                            Range::from_located(last_statement),
                        ));
                            Range::from_located(stmt),
                        );
                        if checker.patch(check.kind.code()) {
                            check.amend(Fix::replacement(
                                "return".to_string(),
                                stmt.location,
                                Location::new(
                                    stmt.location.row(),
                                    stmt.location.column() + "yield".len(),
                                ),
                            ));
                        }
                        checker.add_check(check);
                    }
                }
            }
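PT022's new autofix, shown above, replaces the final bare `yield` of a fixture that has no teardown code by patching exactly the span of the `yield` keyword with `return`. Roughly, in Python terms (an illustrative fixture, not one from the test suite):

```python
def fixture_before():
    resource = {"ready": True}
    yield resource  # PT022: nothing runs after this yield


def fixture_after():
    resource = {"ready": True}
    return resource  # suggested fix: return the value directly


# Both forms hand the same object to the test body.
assert next(fixture_before()) == fixture_after()
```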
@@ -1,11 +1,14 @@
use rustpython_ast::{Constant, Expr, ExprKind};
use log::error;
use rustpython_ast::{Constant, Expr, ExprContext, ExprKind};

use super::helpers::is_pytest_parametrize;
use crate::ast::helpers::create_expr;
use crate::ast::types::Range;
use crate::autofix::Fix;
use crate::checkers::ast::Checker;
use crate::flake8_pytest_style::types;
use crate::registry::{Check, CheckCode, CheckKind};
use crate::source_code_generator::SourceCodeGenerator;

fn get_parametrize_decorator<'a>(checker: &Checker, decorators: &'a [Expr]) -> Option<&'a Expr> {
    decorators

@@ -13,6 +16,59 @@ fn get_parametrize_decorator<'a>(checker: &Checker, decorators: &'a [Expr]) -> O
        .find(|decorator| is_pytest_parametrize(decorator, checker))
}

fn elts_to_csv(elts: &[Expr], checker: &Checker) -> Option<String> {
    let all_literals = elts.iter().all(|e| {
        matches!(
            e.node,
            ExprKind::Constant {
                value: Constant::Str(_),
                ..
            }
        )
    });

    if !all_literals {
        return None;
    }

    let mut generator = SourceCodeGenerator::new(
        checker.style.indentation(),
        checker.style.quote(),
        checker.style.line_ending(),
    );

    generator.unparse_expr(
        &create_expr(ExprKind::Constant {
            value: Constant::Str(elts.iter().fold(String::new(), |mut acc, elt| {
                if let ExprKind::Constant {
                    value: Constant::Str(ref s),
                    ..
                } = elt.node
                {
                    if !acc.is_empty() {
                        acc.push(',');
                    }
                    acc.push_str(s);
                }
                acc
            })),
            kind: None,
        }),
        0,
    );

    match generator.generate() {
        Ok(s) => Some(s),
        Err(e) => {
            error!(
                "Failed to generate CSV string from sequence of names: {}",
                e
            );
            None
        }
    }
}
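The diff mirrors pytest's own `argnames` normalization, quoted in a comment above as `[x.strip() for x in argnames.split(",") if x.strip()]`, and the new `elts_to_csv` helper inverts it by joining string literals with bare commas. In Python terms:

```python
# pytest's normalization of a CSV `argnames` string: split on commas,
# strip whitespace, and drop empty entries.
argnames = "param1, param2, "
names = [x.strip() for x in argnames.split(",") if x.strip()]
assert names == ["param1", "param2"]

# The CSV autofix goes the other way, joining literal names with commas.
assert ",".join(names) == "param1,param2"
```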
/// PT006
fn check_names(checker: &mut Checker, expr: &Expr) {
    let names_type = checker.settings.flake8_pytest_style.parametrize_names_type;

@@ -22,7 +78,19 @@ fn check_names(checker: &mut Checker, expr: &Expr) {
            value: Constant::Str(string),
            ..
        } => {
            let names = string.split(',').collect::<Vec<&str>>();
            // Match the following pytest code:
            // [x.strip() for x in argnames.split(",") if x.strip()]
            let names = string
                .split(',')
                .filter_map(|s| {
                    let trimmed = s.trim();
                    if trimmed.is_empty() {
                        None
                    } else {
                        Some(trimmed)
                    }
                })
                .collect::<Vec<&str>>();

            if names.len() > 1 {
                match names_type {

@@ -32,11 +100,39 @@ fn check_names(checker: &mut Checker, expr: &Expr) {
                        Range::from_located(expr),
                    );
                    if checker.patch(check.kind.code()) {
                        check.amend(Fix::replacement(
                            strings_to_python_tuple(&names),
                            expr.location,
                            expr.end_location.unwrap(),
                        ));
                        let mut generator = SourceCodeGenerator::new(
                            checker.style.indentation(),
                            checker.style.quote(),
                            checker.style.line_ending(),
                        );
                        generator.unparse_expr(
                            &create_expr(ExprKind::Tuple {
                                elts: names
                                    .iter()
                                    .map(|&name| {
                                        create_expr(ExprKind::Constant {
                                            value: Constant::Str(name.to_string()),
                                            kind: None,
                                        })
                                    })
                                    .collect(),
                                ctx: ExprContext::Load,
                            }),
                            1,
                        );
                        match generator.generate() {
                            Ok(content) => {
                                check.amend(Fix::replacement(
                                    content,
                                    expr.location,
                                    expr.end_location.unwrap(),
                                ));
                            }
                            Err(e) => error!(
                                "Failed to fix wrong name(s) type in \
                                 `@pytest.mark.parametrize`: {e}"
                            ),
                        };
                    }
                    checker.add_check(check);
                }

@@ -46,11 +142,39 @@ fn check_names(checker: &mut Checker, expr: &Expr) {
                        Range::from_located(expr),
                    );
                    if checker.patch(check.kind.code()) {
                        check.amend(Fix::replacement(
                            strings_to_python_list(&names),
                            expr.location,
                            expr.end_location.unwrap(),
                        ));
                        let mut generator = SourceCodeGenerator::new(
                            checker.style.indentation(),
                            checker.style.quote(),
                            checker.style.line_ending(),
                        );
                        generator.unparse_expr(
                            &create_expr(ExprKind::List {
                                elts: names
                                    .iter()
                                    .map(|&name| {
                                        create_expr(ExprKind::Constant {
                                            value: Constant::Str(name.to_string()),
                                            kind: None,
                                        })
                                    })
                                    .collect(),
                                ctx: ExprContext::Load,
                            }),
                            0,
                        );
                        match generator.generate() {
                            Ok(content) => {
                                check.amend(Fix::replacement(
                                    content,
                                    expr.location,
                                    expr.end_location.unwrap(),
                                ));
                            }
                            Err(e) => error!(
                                "Failed to fix wrong name(s) type in \
                                 `@pytest.mark.parametrize`: {e}"
                            ),
                        };
                    }
                    checker.add_check(check);
                }

@@ -63,11 +187,60 @@ fn check_names(checker: &mut Checker, expr: &Expr) {
                if let Some(first) = elts.first() {
                    handle_single_name(checker, expr, first);
                }
            } else if names_type != types::ParametrizeNameType::Tuple {
                checker.add_check(Check::new(
                    CheckKind::ParametrizeNamesWrongType(names_type),
                    Range::from_located(expr),
                ));
            } else {
                match names_type {
                    types::ParametrizeNameType::Tuple => {}
                    types::ParametrizeNameType::List => {
                        let mut check = Check::new(
                            CheckKind::ParametrizeNamesWrongType(names_type),
                            Range::from_located(expr),
                        );
                        if checker.patch(check.kind.code()) {
                            let mut generator = SourceCodeGenerator::new(
                                checker.style.indentation(),
                                checker.style.quote(),
                                checker.style.line_ending(),
                            );
                            generator.unparse_expr(
                                &create_expr(ExprKind::List {
                                    elts: elts.clone(),
                                    ctx: ExprContext::Load,
                                }),
                                0,
                            );
                            match generator.generate() {
                                Ok(content) => {
                                    check.amend(Fix::replacement(
                                        content,
                                        expr.location,
                                        expr.end_location.unwrap(),
                                    ));
                                }
                                Err(e) => error!(
                                    "Failed to fix wrong name(s) type in \
                                     `@pytest.mark.parametrize`: {e}"
                                ),
                            };
                        }
                        checker.add_check(check);
                    }
                    types::ParametrizeNameType::CSV => {
                        let mut check = Check::new(
                            CheckKind::ParametrizeNamesWrongType(names_type),
                            Range::from_located(expr),
                        );
                        if checker.patch(check.kind.code()) {
                            if let Some(content) = elts_to_csv(elts, checker) {
                                check.amend(Fix::replacement(
                                    content,
                                    expr.location,
                                    expr.end_location.unwrap(),
                                ));
                            }
                        }
                        checker.add_check(check);
                    }
                }
            };
        }
        ExprKind::List { elts, .. } => {

@@ -75,11 +248,60 @@ fn check_names(checker: &mut Checker, expr: &Expr) {
            if let Some(first) = elts.first() {
                handle_single_name(checker, expr, first);
            }
        } else if names_type != types::ParametrizeNameType::List {
            checker.add_check(Check::new(
                CheckKind::ParametrizeNamesWrongType(names_type),
                Range::from_located(expr),
            ));
        } else {
            match names_type {
                types::ParametrizeNameType::List => {}
                types::ParametrizeNameType::Tuple => {
                    let mut check = Check::new(
                        CheckKind::ParametrizeNamesWrongType(names_type),
                        Range::from_located(expr),
                    );
                    if checker.patch(check.kind.code()) {
                        let mut generator = SourceCodeGenerator::new(
                            checker.style.indentation(),
                            checker.style.quote(),
                            checker.style.line_ending(),
                        );
                        generator.unparse_expr(
                            &create_expr(ExprKind::Tuple {
                                elts: elts.clone(),
                                ctx: ExprContext::Load,
                            }),
                            1, // so tuple is generated with parentheses
                        );
                        match generator.generate() {
                            Ok(content) => {
                                check.amend(Fix::replacement(
                                    content,
                                    expr.location,
                                    expr.end_location.unwrap(),
                                ));
                            }
                            Err(e) => error!(
                                "Failed to fix wrong name(s) type in \
                                 `@pytest.mark.parametrize`: {e}"
                            ),
                        };
                    }
                    checker.add_check(check);
                }
                types::ParametrizeNameType::CSV => {
                    let mut check = Check::new(
                        CheckKind::ParametrizeNamesWrongType(names_type),
                        Range::from_located(expr),
                    );
                    if checker.patch(check.kind.code()) {
                        if let Some(content) = elts_to_csv(elts, checker) {
                            check.amend(Fix::replacement(
                                content,
                                expr.location,
                                expr.end_location.unwrap(),
                            ));
                        }
                    }
                    checker.add_check(check);
                }
            }
        };
    }
    _ => {}

@@ -123,42 +345,28 @@ fn handle_single_name(checker: &mut Checker, expr: &Expr, value: &Expr) {
        CheckKind::ParametrizeNamesWrongType(types::ParametrizeNameType::CSV),
        Range::from_located(expr),
    );
    if let ExprKind::Constant {
        value: Constant::Str(string),
        ..
    } = &value.node
    {
        if checker.patch(check.kind.code()) {
            check.amend(Fix::replacement(
                format!("\"{string}\""),
                expr.location,
                expr.end_location.unwrap(),
            ));
        }

    if checker.patch(check.kind.code()) {
        let mut generator = SourceCodeGenerator::new(
            checker.style.indentation(),
            checker.style.quote(),
            checker.style.line_ending(),
        );
        generator.unparse_expr(&create_expr(value.node.clone()), 0);
        match generator.generate() {
            Ok(content) => {
                check.amend(Fix::replacement(
                    content,
                    expr.location,
                    expr.end_location.unwrap(),
                ));
            }
            Err(e) => error!("Failed to fix wrong name(s) type in `@pytest.mark.parametrize`: {e}"),
        };
    }
    checker.add_check(check);
}

fn strings_to_python_tuple(strings: &[&str]) -> String {
    let result = strings
        .iter()
        .map(|s| format!("\"{s}\""))
        .collect::<Vec<String>>()
        .join(", ");

    format!("({result})")
}

fn strings_to_python_list(strings: &[&str]) -> String {
    let result = strings
        .iter()
        .map(|s| format!("\"{s}\""))
        .collect::<Vec<String>>()
        .join(", ");

    format!("[{result}]")
}

fn handle_value_rows(
    checker: &mut Checker,
    elts: &[Expr],

@@ -193,13 +401,14 @@ pub fn parametrize(checker: &mut Checker, decorators: &[Expr]) {
    if let Some(decorator) = decorator {
        if let ExprKind::Call { args, .. } = &decorator.node {
            if checker.settings.enabled.contains(&CheckCode::PT006) {
                let first = args.first().unwrap();
                check_names(checker, first);
                if let Some(arg) = args.get(0) {
                    check_names(checker, arg);
                }
            }

            if checker.settings.enabled.contains(&CheckCode::PT007) {
                let second = args.get(1).unwrap();
                check_values(checker, second);
                if let Some(arg) = args.get(1) {
                    check_values(checker, arg);
                }
            }
        }
    }
@@ -5,55 +5,89 @@ expression: checks
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 14
    row: 24
    column: 25
  end_location:
    row: 14
    row: 24
    column: 45
  fix: ~
  fix:
    content: "\"param1,param2\""
    location:
      row: 24
      column: 25
    end_location:
      row: 24
      column: 45
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 19
    row: 29
    column: 25
  end_location:
    row: 19
    row: 29
    column: 36
  fix:
    content: "\"param1\""
    location:
      row: 19
      row: 29
      column: 25
    end_location:
      row: 19
      row: 29
      column: 36
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 24
    row: 34
    column: 25
  end_location:
    row: 24
    row: 34
    column: 45
  fix:
    content: "\"param1,param2\""
    location:
      row: 34
      column: 25
    end_location:
      row: 34
      column: 45
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 39
    column: 25
  end_location:
    row: 39
    column: 35
  fix:
    content: "\"param1\""
    location:
      row: 39
      column: 25
    end_location:
      row: 39
      column: 35
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 44
    column: 25
  end_location:
    row: 44
    column: 50
  fix: ~
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 29
    row: 49
    column: 25
  end_location:
    row: 29
    column: 35
  fix:
    content: "\"param1\""
    location:
      row: 29
      column: 25
    end_location:
      row: 29
      column: 35
    row: 49
    column: 46
  fix: ~
  parent: ~
@@ -20,47 +20,122 @@ expression: checks
    column: 40
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
    ParametrizeNamesWrongType: tuple
  location:
    row: 14
    column: 25
  end_location:
    row: 14
    column: 56
  fix:
    content: "(\"param1\", \"param2\")"
    location:
      row: 14
      column: 25
    end_location:
      row: 14
      column: 56
  parent: ~
- kind:
    ParametrizeNamesWrongType: tuple
  location:
    row: 19
    column: 25
  end_location:
    row: 19
    column: 36
    column: 40
  fix:
    content: "\"param1\""
    content: "(\"param1\", \"param2\")"
    location:
      row: 19
      column: 25
    end_location:
      row: 19
      column: 40
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 29
    column: 25
  end_location:
    row: 29
    column: 36
  fix:
    content: "\"param1\""
    location:
      row: 29
      column: 25
    end_location:
      row: 29
      column: 36
  parent: ~
- kind:
    ParametrizeNamesWrongType: tuple
  location:
    row: 24
    row: 34
    column: 25
  end_location:
    row: 24
    row: 34
    column: 45
  fix: ~
  fix:
    content: "(\"param1\", \"param2\")"
    location:
      row: 34
      column: 25
    end_location:
      row: 34
      column: 45
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 29
    row: 39
    column: 25
  end_location:
    row: 29
    row: 39
    column: 35
  fix:
    content: "\"param1\""
    location:
      row: 29
      row: 39
      column: 25
    end_location:
      row: 29
      row: 39
      column: 35
  parent: ~
- kind:
    ParametrizeNamesWrongType: tuple
  location:
    row: 44
    column: 25
  end_location:
    row: 44
    column: 50
  fix:
    content: "(some_expr, another_expr)"
    location:
      row: 44
      column: 25
    end_location:
      row: 44
      column: 50
  parent: ~
- kind:
    ParametrizeNamesWrongType: tuple
  location:
    row: 49
    column: 25
  end_location:
    row: 49
    column: 46
  fix:
    content: "(some_expr, \"param2\")"
    location:
      row: 49
      column: 25
    end_location:
      row: 49
      column: 46
  parent: ~
@@ -26,41 +26,82 @@ expression: checks
    column: 25
  end_location:
    row: 14
    column: 45
  fix: ~
    column: 56
  fix:
    content: "[\"param1\", \"param2\"]"
    location:
      row: 14
      column: 25
    end_location:
      row: 14
      column: 56
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
    ParametrizeNamesWrongType: list
  location:
    row: 19
    column: 25
  end_location:
    row: 19
    column: 36
    column: 40
  fix:
    content: "\"param1\""
    content: "[\"param1\", \"param2\"]"
    location:
      row: 19
      column: 25
    end_location:
      row: 19
      column: 40
  parent: ~
- kind:
    ParametrizeNamesWrongType: list
  location:
    row: 24
    column: 25
  end_location:
    row: 24
    column: 45
  fix:
    content: "[\"param1\", \"param2\"]"
    location:
      row: 24
      column: 25
    end_location:
      row: 24
      column: 45
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 29
    column: 25
  end_location:
    row: 29
    column: 36
  fix:
    content: "\"param1\""
    location:
      row: 29
      column: 25
    end_location:
      row: 29
      column: 36
  parent: ~
- kind:
    ParametrizeNamesWrongType: csv
  location:
    row: 29
    row: 39
    column: 25
  end_location:
    row: 29
    row: 39
    column: 35
  fix:
    content: "\"param1\""
    location:
      row: 29
      row: 39
      column: 25
    end_location:
      row: 29
      row: 39
      column: 35
  parent: ~
@@ -10,6 +10,13 @@ expression: checks
  end_location:
    row: 17
    column: 18
  fix: ~
  fix:
    content: return
    location:
      row: 17
      column: 4
    end_location:
      row: 17
      column: 9
  parent: ~
@@ -1,11 +1,9 @@
use rustpython_ast::{Cmpop, Expr, ExprKind};

use crate::ast::helpers::create_expr;
use crate::ast::types::Range;
use crate::autofix::Fix;
use crate::checkers::ast::Checker;
use crate::registry::{Check, CheckKind};
use crate::source_code_generator::SourceCodeGenerator;

/// SIM118
fn key_in_dict(checker: &mut Checker, left: &Expr, right: &Expr, range: Range) {
@@ -27,25 +25,24 @@ fn key_in_dict(checker: &mut Checker, left: &Expr, right: &Expr, range: Range) {
        return;
    }

    // Slice exact content to preserve formatting.
    let left_content = checker
        .locator
        .slice_source_code_range(&Range::from_located(left));
    let value_content = checker
        .locator
        .slice_source_code_range(&Range::from_located(value));

    let mut check = Check::new(
        CheckKind::KeyInDict(left.to_string(), value.to_string()),
        CheckKind::KeyInDict(left_content.to_string(), value_content.to_string()),
        range,
    );
    if checker.patch(check.kind.code()) {
        let mut generator = SourceCodeGenerator::new(
            checker.style.indentation(),
            checker.style.quote(),
            checker.style.line_ending(),
        );
        generator.unparse_expr(&create_expr(value.node.clone()), 0);

        if let Ok(content) = generator.generate() {
            check.amend(Fix::replacement(
                content,
                right.location,
                right.end_location.unwrap(),
            ));
        }
        check.amend(Fix::replacement(
            value_content.to_string(),
            right.location,
            right.end_location.unwrap(),
        ));
    }
    checker.add_check(check);
}

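The SIM118 change above only alters how the check's message and fix are built (slicing the original source instead of re-generating it); the rule itself targets redundant `.keys()` membership tests. A minimal Python illustration of what the rule flags (variable names are hypothetical):

```python
d = {"key": 1}

# `in` on a dict already tests its keys, so `.keys()` is redundant.
flagged = "key" in d.keys()   # what SIM118 flags
fixed = "key" in d            # what the autofix produces

print(flagged == fixed)  # → True
```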
@@ -1,6 +1,7 @@
use rustpython_ast::{Cmpop, Expr, ExprKind};

use crate::ast::types::Range;
use crate::autofix::Fix;
use crate::checkers::ast::Checker;
use crate::registry::{Check, CheckKind};

@@ -31,10 +32,26 @@ pub fn yoda_conditions(
        return;
    }

    let check = Check::new(
        CheckKind::YodaConditions(left.to_string(), right.to_string()),
    // Slice exact content to preserve formatting.
    let left_content = checker
        .locator
        .slice_source_code_range(&Range::from_located(left));
    let right_content = checker
        .locator
        .slice_source_code_range(&Range::from_located(right));

    let mut check = Check::new(
        CheckKind::YodaConditions(left_content.to_string(), right_content.to_string()),
        Range::from_located(expr),
    );

    if checker.patch(check.kind.code()) {
        check.amend(Fix::replacement(
            format!("{right_content} == {left_content}"),
            left.location,
            right.end_location.unwrap(),
        ));
    }

    checker.add_check(check);
}

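The `YodaConditions` fix above now flips the comparison using exact source slices, so quoting is preserved. A sketch of the transformation in Python (names hypothetical):

```python
compare = "yoda"

# "Yoda condition": the constant reads first.
before = "yoda" == compare
# The autofix rewrites it as `compare == "yoda"`; semantics are unchanged.
after = compare == "yoda"

print(before == after)  # → True
```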
@@ -5,77 +5,172 @@ expression: checks
- kind:
    KeyInDict:
      - key
      - dict
      - obj
  location:
    row: 1
    column: 0
  end_location:
    row: 1
    column: 18
    column: 17
  fix:
    content: dict
    content: obj
    location:
      row: 1
      column: 7
    end_location:
      row: 1
      column: 18
      column: 17
  parent: ~
- kind:
    KeyInDict:
      - "foo['bar']"
      - dict
      - "foo[\"bar\"]"
      - obj
  location:
    row: 3
    column: 0
  end_location:
    row: 3
    column: 25
    column: 24
  fix:
    content: dict
    content: obj
    location:
      row: 3
      column: 14
    end_location:
      row: 3
      column: 25
      column: 24
  parent: ~
- kind:
    KeyInDict:
      - foo()
      - dict
      - "foo['bar']"
      - obj
  location:
    row: 5
    column: 0
  end_location:
    row: 5
    column: 20
    column: 24
  fix:
    content: dict
    content: obj
    location:
      row: 5
      column: 9
      column: 14
    end_location:
      row: 5
      column: 20
      column: 24
  parent: ~
- kind:
    KeyInDict:
      - foo()
      - obj
  location:
    row: 7
    column: 0
  end_location:
    row: 7
    column: 19
  fix:
    content: obj
    location:
      row: 7
      column: 9
    end_location:
      row: 7
      column: 19
  parent: ~
- kind:
    KeyInDict:
      - key
      - dict
      - obj
  location:
    row: 7
    row: 9
    column: 4
  end_location:
    row: 7
    column: 22
    row: 9
    column: 21
  fix:
    content: dict
    content: obj
    location:
      row: 7
      row: 9
      column: 11
    end_location:
      row: 7
      row: 9
      column: 21
  parent: ~
- kind:
    KeyInDict:
      - k
      - obj
  location:
    row: 16
    column: 7
  end_location:
    row: 16
    column: 22
  fix:
    content: obj
    location:
      row: 16
      column: 12
    end_location:
      row: 16
      column: 22
  parent: ~
- kind:
    KeyInDict:
      - k
      - obj
  location:
    row: 18
    column: 7
  end_location:
    row: 18
    column: 22
  fix:
    content: obj
    location:
      row: 18
      column: 12
    end_location:
      row: 18
      column: 22
  parent: ~
- kind:
    KeyInDict:
      - k
      - obj
  location:
    row: 20
    column: 10
  end_location:
    row: 20
    column: 25
  fix:
    content: obj
    location:
      row: 20
      column: 15
    end_location:
      row: 20
      column: 25
  parent: ~
- kind:
    KeyInDict:
      - k
      - obj
  location:
    row: 22
    column: 7
  end_location:
    row: 22
    column: 22
  fix:
    content: obj
    location:
      row: 22
      column: 12
    end_location:
      row: 22
      column: 22
  parent: ~

@@ -4,7 +4,7 @@ expression: checks
---
- kind:
    YodaConditions:
      - "'yoda'"
      - "\"yoda\""
      - compare
  location:
    row: 2
@@ -12,18 +12,51 @@ expression: checks
  end_location:
    row: 2
    column: 17
  fix: ~
  fix:
    content: "compare == \"yoda\""
    location:
      row: 2
      column: 0
    end_location:
      row: 2
      column: 17
  parent: ~
- kind:
    YodaConditions:
      - "'yoda'"
      - compare
  location:
    row: 3
    column: 0
  end_location:
    row: 3
    column: 17
  fix:
    content: "compare == 'yoda'"
    location:
      row: 3
      column: 0
    end_location:
      row: 3
      column: 17
  parent: ~
- kind:
    YodaConditions:
      - "42"
      - age
  location:
    row: 3
    row: 4
    column: 0
  end_location:
    row: 3
    row: 4
    column: 9
  fix: ~
  fix:
    content: age == 42
    location:
      row: 4
      column: 0
    end_location:
      row: 4
      column: 9
  parent: ~

@@ -108,11 +108,20 @@ fn annotate_imports<'a>(
        }

        // Find comments inline.
        // We associate inline comments with the import statement unless there's a
        // single member, and it's a single-line import (like `from foo
        // import bar  # noqa`).
        let mut inline = vec![];
        while let Some(comment) =
            comments_iter.next_if(|comment| comment.location.row() == import.location.row())
        if names.len() > 1
            || names
                .first()
                .map_or(false, |alias| alias.location.row() > import.location.row())
        {
            inline.push(comment);
            while let Some(comment) = comments_iter
                .next_if(|comment| comment.location.row() == import.location.row())
            {
                inline.push(comment);
            }
        }

        // Capture names.
@@ -206,11 +215,6 @@ fn normalize_imports(imports: Vec<AnnotatedImport>, combine_as_imports: bool) ->
            inline,
            trailing_comma,
        } => {
            let single_import = names.len() == 1;

            // If we're dealing with a multi-import block (i.e., a non-star, non-aliased
            // import), associate the comments with the first alias (best
            // effort).
            if let Some(alias) = names.first() {
                let entry = if alias.name == "*" {
                    block
@@ -240,29 +244,8 @@ fn normalize_imports(imports: Vec<AnnotatedImport>, combine_as_imports: bool) ->
                    entry.atop.push(comment.value);
                }

                // Associate inline comments with first alias if multiple names have been
                // imported, i.e., the comment applies to all names; otherwise, associate
                // with the alias.
                if single_import
                    && (alias.name != "*" && (alias.asname.is_none() || combine_as_imports))
                {
                    let entry = block
                        .import_from
                        .entry(ImportFromData { module, level })
                        .or_default()
                        .1
                        .entry(AliasData {
                            name: alias.name,
                            asname: alias.asname,
                        })
                        .or_default();
                    for comment in inline {
                        entry.inline.push(comment.value);
                    }
                } else {
                    for comment in inline {
                        entry.inline.push(comment.value);
                    }
                for comment in inline {
                    entry.inline.push(comment.value);
                }
            }

@@ -399,7 +382,7 @@ fn categorize_imports<'a>(
    block_by_type
}

fn sort_imports(block: ImportBlock) -> OrderedImportBlock {
fn sort_imports(block: ImportBlock, order_by_type: bool) -> OrderedImportBlock {
    let mut ordered = OrderedImportBlock::default();

    // Sort `StmtKind::Import`.
@@ -477,7 +460,9 @@ fn sort_imports(block: ImportBlock) -> OrderedImportBlock {
                locations,
                aliases
                    .into_iter()
                    .sorted_by(|(alias1, _), (alias2, _)| cmp_members(alias1, alias2))
                    .sorted_by(|(alias1, _), (alias2, _)| {
                        cmp_members(alias1, alias2, order_by_type)
                    })
                    .collect::<Vec<(AliasData, CommentSet)>>(),
            )
        })
@@ -488,7 +473,9 @@ fn sort_imports(block: ImportBlock) -> OrderedImportBlock {
                (None, None) => Ordering::Equal,
                (None, Some(_)) => Ordering::Less,
                (Some(_), None) => Ordering::Greater,
                (Some((alias1, _)), Some((alias2, _))) => cmp_members(alias1, alias2),
                (Some((alias1, _)), Some((alias2, _))) => {
                    cmp_members(alias1, alias2, order_by_type)
                }
            }
        })
    },
@@ -561,6 +548,7 @@ pub fn format_imports(
    split_on_trailing_comma: bool,
    force_single_line: bool,
    single_line_exclusions: &BTreeSet<String>,
    order_by_type: bool,
) -> String {
    let trailer = &block.trailer;
    let block = annotate_imports(&block.imports, comments, locator, split_on_trailing_comma);
@@ -583,7 +571,7 @@ pub fn format_imports(
    // Generate replacement source code.
    let mut is_first_block = true;
    for import_block in block_by_type.into_values() {
        let mut import_block = sort_imports(import_block);
        let mut import_block = sort_imports(import_block, order_by_type);
        if force_single_line {
            import_block = force_single_line_imports(import_block, single_line_exclusions);
        }
@@ -657,9 +645,14 @@ mod tests {
    #[test_case(Path::new("fit_line_length_comment.py"))]
    #[test_case(Path::new("force_wrap_aliases.py"))]
    #[test_case(Path::new("import_from_after_import.py"))]
    #[test_case(Path::new("inline_comments.py"))]
    #[test_case(Path::new("insert_empty_lines.py"))]
    #[test_case(Path::new("insert_empty_lines.pyi"))]
    #[test_case(Path::new("leading_prefix.py"))]
    #[test_case(Path::new("line_ending_cr.py"))]
    #[test_case(Path::new("line_ending_crlf.py"))]
    #[test_case(Path::new("line_ending_lf.py"))]
    #[test_case(Path::new("magic_trailing_comma.py"))]
    #[test_case(Path::new("natural_order.py"))]
    #[test_case(Path::new("no_reorder_within_section.py"))]
    #[test_case(Path::new("no_wrap_star.py"))]
@@ -679,10 +672,6 @@ mod tests {
    #[test_case(Path::new("split.py"))]
    #[test_case(Path::new("trailing_suffix.py"))]
    #[test_case(Path::new("type_comments.py"))]
    #[test_case(Path::new("magic_trailing_comma.py"))]
    #[test_case(Path::new("line_ending_lf.py"))]
    #[test_case(Path::new("line_ending_crlf.py"))]
    #[test_case(Path::new("line_ending_cr.py"))]
    fn default(path: &Path) -> Result<()> {
        let snapshot = format!("{}", path.to_string_lossy());
        let checks = test_path(
@@ -781,4 +770,25 @@ mod tests {
        insta::assert_yaml_snapshot!(snapshot, checks);
        Ok(())
    }

    #[test_case(Path::new("order_by_type.py"))]
    fn order_by_type(path: &Path) -> Result<()> {
        let snapshot = format!("order_by_type_false_{}", path.to_string_lossy());
        let mut checks = test_path(
            Path::new("./resources/test/fixtures/isort")
                .join(path)
                .as_path(),
            &Settings {
                isort: isort::settings::Settings {
                    order_by_type: false,
                    ..isort::settings::Settings::default()
                },
                src: vec![Path::new("resources/test/fixtures/isort").to_path_buf()],
                ..Settings::for_rule(CheckCode::I001)
            },
        )?;
        checks.sort_by_key(|check| check.location);
        insta::assert_yaml_snapshot!(snapshot, checks);
        Ok(())
    }
}

@@ -81,6 +81,7 @@ pub fn check_imports(
        settings.isort.split_on_trailing_comma,
        settings.isort.force_single_line,
        &settings.isort.single_line_exclusions,
        settings.isort.order_by_type,
    );

    // Expand the span the entire range, including leading and trailing space.

@@ -78,6 +78,16 @@ pub struct Options {
    ///
    /// See isort's [`split-on-trailing-comma`](https://pycqa.github.io/isort/docs/configuration/options.html#split-on-trailing-comma) option.
    pub split_on_trailing_comma: Option<bool>,
    #[option(
        default = r#"true"#,
        value_type = "bool",
        example = r#"
            order-by-type = true
        "#
    )]
    /// Order imports by type, which is determined by case, in addition to
    /// alphabetically.
    pub order_by_type: Option<bool>,
    #[option(
        default = r#"[]"#,
        value_type = "Vec<String>",
@@ -120,6 +130,7 @@ pub struct Settings {
    pub single_line_exclusions: BTreeSet<String>,
    pub known_first_party: BTreeSet<String>,
    pub known_third_party: BTreeSet<String>,
    pub order_by_type: bool,
    pub extra_standard_library: BTreeSet<String>,
}

@@ -130,6 +141,7 @@ impl Default for Settings {
            force_wrap_aliases: false,
            split_on_trailing_comma: true,
            force_single_line: false,
            order_by_type: true,
            single_line_exclusions: BTreeSet::new(),
            known_first_party: BTreeSet::new(),
            known_third_party: BTreeSet::new(),
@@ -145,6 +157,7 @@ impl From<Options> for Settings {
            force_wrap_aliases: options.force_wrap_aliases.unwrap_or(false),
            split_on_trailing_comma: options.split_on_trailing_comma.unwrap_or(true),
            force_single_line: options.force_single_line.unwrap_or(false),
            order_by_type: options.order_by_type.unwrap_or(true),
            single_line_exclusions: BTreeSet::from_iter(
                options.single_line_exclusions.unwrap_or_default(),
            ),
@@ -164,6 +177,7 @@ impl From<Settings> for Options {
            force_wrap_aliases: Some(settings.force_wrap_aliases),
            split_on_trailing_comma: Some(settings.split_on_trailing_comma),
            force_single_line: Some(settings.force_single_line),
            order_by_type: Some(settings.order_by_type),
            single_line_exclusions: Some(settings.single_line_exclusions.into_iter().collect()),
            known_first_party: Some(settings.known_first_party.into_iter().collect()),
            known_third_party: Some(settings.known_third_party.into_iter().collect()),

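The diff above wires the new `order_by_type` setting through the isort options. A hypothetical `pyproject.toml` fragment showing how a user might disable it, assuming ruff reads isort options from a `[tool.ruff.isort]` table as the `Options` struct and its `order-by-type = true` example suggest:

```toml
[tool.ruff.isort]
# Mirrors isort's order-by-type; defaults to true per the diff above.
order-by-type = false
```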
@@ -0,0 +1,21 @@
---
source: src/isort/mod.rs
expression: checks
---
- kind: UnsortedImports
  location:
    row: 1
    column: 0
  end_location:
    row: 12
    column: 0
  fix:
    content: "from a.prometheus.metrics import ( # type:ignore[attr-defined]\n TERMINAL_CURRENTLY_RUNNING_TOTAL,\n)\nfrom b.prometheus.metrics import (\n TERMINAL_CURRENTLY_RUNNING_TOTAL, # type:ignore[attr-defined]\n)\nfrom c.prometheus.metrics import (\n TERMINAL_CURRENTLY_RUNNING_TOTAL, # type:ignore[attr-defined]\n)\nfrom d.prometheus.metrics import ( # type:ignore[attr-defined]\n OTHER_RUNNING_TOTAL,\n TERMINAL_CURRENTLY_RUNNING_TOTAL,\n)\n"
    location:
      row: 1
      column: 0
    end_location:
      row: 12
      column: 0
  parent: ~

@@ -0,0 +1,21 @@
---
source: src/isort/mod.rs
expression: checks
---
- kind: UnsortedImports
  location:
    row: 1
    column: 0
  end_location:
    row: 13
    column: 0
  fix:
    content: "import glob\nimport os\nimport shutil\nimport tempfile\nimport time\nfrom subprocess import PIPE, Popen, STDOUT\n\nimport BAR\nimport bar\nimport FOO\nimport foo\nimport StringIO\nfrom module import Apple, BASIC, Class, CONSTANT, function\n"
    location:
      row: 1
      column: 0
    end_location:
      row: 13
      column: 0
  parent: ~

@@ -37,10 +37,14 @@ pub fn cmp_modules(alias1: &AliasData, alias2: &AliasData) -> Ordering {
}

/// Compare two member imports within `StmtKind::ImportFrom` blocks.
pub fn cmp_members(alias1: &AliasData, alias2: &AliasData) -> Ordering {
    prefix(alias1.name)
        .cmp(&prefix(alias2.name))
        .then_with(|| cmp_modules(alias1, alias2))
pub fn cmp_members(alias1: &AliasData, alias2: &AliasData, order_by_type: bool) -> Ordering {
    if order_by_type {
        prefix(alias1.name)
            .cmp(&prefix(alias2.name))
            .then_with(|| cmp_modules(alias1, alias2))
    } else {
        cmp_modules(alias1, alias2)
    }
}

/// Compare two relative import levels.

@@ -1413,7 +1413,7 @@ fn missing_args(checker: &mut Checker, docstring: &Docstring, docstrings_args: &

// See: `GOOGLE_ARGS_REGEX` in `pydocstyle/checker.py`.
static GOOGLE_ARGS_REGEX: Lazy<Regex> =
    Lazy::new(|| Regex::new(r"^\s*(\w+)\s*(\(.*?\))?\s*:.+").unwrap());
    Lazy::new(|| Regex::new(r"^\s*(\*?\*?\w+)\s*(\(.*?\))?\s*:.+").unwrap());

fn args_section(checker: &mut Checker, docstring: &Docstring, context: &SectionContext) {
    let mut matches = Vec::new();

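The widened `GOOGLE_ARGS_REGEX` above (`\w+` → `\*?\*?\w+`) also matches starred parameters in a Google-style `Args:` section. A quick check of the same pattern in Python (the sample docstring lines are made up):

```python
import re

# Same pattern as in the diff above; now accepts leading `*` / `**`.
GOOGLE_ARGS_REGEX = re.compile(r"^\s*(\*?\*?\w+)\s*(\(.*?\))?\s*:.+")

for line in [
    "    x (int): A number.",
    "    *args (list): Positional arguments.",
    "    **kwargs: Keyword arguments.",
]:
    match = GOOGLE_ARGS_REGEX.match(line)
    print(match.group(1) if match else None)
```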
@@ -4,7 +4,7 @@ use ruff_macros::ConfigurationOptions;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};

use crate::registry::CheckCode;
use crate::registry_gen::CheckCodePrefix;

#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize, Hash, JsonSchema)]
#[serde(deny_unknown_fields, rename_all = "kebab-case")]
@@ -18,153 +18,55 @@ pub enum Convention {
}

impl Convention {
    pub fn codes(&self) -> Vec<CheckCode> {
    pub fn codes(&self) -> &'static [CheckCodePrefix] {
        match self {
            Convention::Google => vec![
            Convention::Google => &[
                // All errors except D203, D204, D213, D215, D400, D401, D404, D406, D407, D408,
                // D409 and D413.
                CheckCode::D100,
                CheckCode::D101,
                CheckCode::D102,
                CheckCode::D103,
                CheckCode::D104,
                CheckCode::D105,
                CheckCode::D106,
                CheckCode::D107,
                CheckCode::D200,
                CheckCode::D201,
                CheckCode::D202,
                // CheckCode::D203,
                // CheckCode::D204,
                CheckCode::D205,
                CheckCode::D206,
                CheckCode::D207,
                CheckCode::D208,
                CheckCode::D209,
                CheckCode::D210,
                CheckCode::D211,
                CheckCode::D212,
                // CheckCode::D213,
                CheckCode::D214,
                // CheckCode::D215,
                CheckCode::D300,
                CheckCode::D301,
                // CheckCode::D400,
                CheckCode::D402,
                CheckCode::D403,
                // CheckCode::D404,
                CheckCode::D405,
                // CheckCode::D406,
                // CheckCode::D407,
                // CheckCode::D408,
                // CheckCode::D409,
                CheckCode::D410,
                CheckCode::D411,
                CheckCode::D412,
                // CheckCode::D413,
                CheckCode::D414,
                CheckCode::D415,
                CheckCode::D416,
                CheckCode::D417,
                CheckCode::D418,
                CheckCode::D419,
                CheckCodePrefix::D203,
                CheckCodePrefix::D204,
                CheckCodePrefix::D213,
                CheckCodePrefix::D215,
                CheckCodePrefix::D400,
                CheckCodePrefix::D404,
                CheckCodePrefix::D406,
                CheckCodePrefix::D407,
                CheckCodePrefix::D408,
                CheckCodePrefix::D409,
                CheckCodePrefix::D413,
            ],
            Convention::Numpy => vec![
            Convention::Numpy => &[
                // All errors except D107, D203, D212, D213, D402, D413, D415, D416, and D417.
                CheckCode::D100,
                CheckCode::D101,
                CheckCode::D102,
                CheckCode::D103,
                CheckCode::D104,
                CheckCode::D105,
                CheckCode::D106,
                // CheckCode::D107,
                CheckCode::D200,
                CheckCode::D201,
                CheckCode::D202,
                // CheckCode::D203,
                CheckCode::D204,
                CheckCode::D205,
                CheckCode::D206,
                CheckCode::D207,
                CheckCode::D208,
                CheckCode::D209,
                CheckCode::D210,
                CheckCode::D211,
                // CheckCode::D212,
                // CheckCode::D213,
                CheckCode::D214,
                CheckCode::D215,
                CheckCode::D300,
                CheckCode::D301,
                CheckCode::D400,
                // CheckCode::D402,
                CheckCode::D403,
                CheckCode::D404,
                CheckCode::D405,
                CheckCode::D406,
                CheckCode::D407,
                CheckCode::D408,
                CheckCode::D409,
                CheckCode::D410,
                CheckCode::D411,
                CheckCode::D412,
                // CheckCode::D413,
                CheckCode::D414,
                // CheckCode::D415,
                // CheckCode::D416,
                // CheckCode::D417,
                CheckCode::D418,
                CheckCode::D419,
                CheckCodePrefix::D107,
                CheckCodePrefix::D203,
                CheckCodePrefix::D212,
                CheckCodePrefix::D213,
                CheckCodePrefix::D402,
                CheckCodePrefix::D413,
                CheckCodePrefix::D415,
                CheckCodePrefix::D416,
                CheckCodePrefix::D417,
            ],
            Convention::Pep257 => vec![
            Convention::Pep257 => &[
                // All errors except D203, D212, D213, D214, D215, D404, D405, D406, D407, D408,
                // D409, D410, D411, D413, D415, D416 and D417.
                CheckCode::D100,
                CheckCode::D101,
                CheckCode::D102,
                CheckCode::D103,
                CheckCode::D104,
                CheckCode::D105,
                CheckCode::D106,
                CheckCode::D107,
                CheckCode::D200,
                CheckCode::D201,
                CheckCode::D202,
                // CheckCode::D203,
                CheckCode::D204,
                CheckCode::D205,
                CheckCode::D206,
                CheckCode::D207,
                CheckCode::D208,
                CheckCode::D209,
                CheckCode::D210,
                CheckCode::D211,
                // CheckCode::D212,
                // CheckCode::D213,
                // CheckCode::D214,
                // CheckCode::D215,
                CheckCode::D300,
                CheckCode::D301,
                CheckCode::D400,
                CheckCode::D402,
                CheckCode::D403,
                // CheckCode::D404,
                // CheckCode::D405,
                // CheckCode::D406,
                // CheckCode::D407,
                // CheckCode::D408,
                // CheckCode::D409,
                // CheckCode::D410,
                // CheckCode::D411,
                CheckCode::D412,
                // CheckCode::D413,
                CheckCode::D414,
                // CheckCode::D415,
                // CheckCode::D416,
                // CheckCode::D417,
                CheckCode::D418,
                CheckCode::D419,
                CheckCodePrefix::D203,
                CheckCodePrefix::D212,
                CheckCodePrefix::D213,
                CheckCodePrefix::D214,
                CheckCodePrefix::D215,
                CheckCodePrefix::D404,
                CheckCodePrefix::D405,
                CheckCodePrefix::D406,
                CheckCodePrefix::D407,
                CheckCodePrefix::D408,
                CheckCodePrefix::D409,
                CheckCodePrefix::D410,
                CheckCodePrefix::D411,
                CheckCodePrefix::D413,
                CheckCodePrefix::D415,
                CheckCodePrefix::D416,
                CheckCodePrefix::D417,
            ],
        }
    }

@@ -85,4 +85,26 @@ expression: checks
    column: 12
  fix: ~
  parent: ~
- kind:
    DocumentAllArguments:
      - x
  location:
    row: 98
    column: 0
  end_location:
    row: 105
    column: 12
  fix: ~
  parent: ~
- kind:
    DocumentAllArguments:
      - "*args"
  location:
    row: 108
    column: 0
  end_location:
    row: 115
    column: 12
  fix: ~
  parent: ~

@@ -37,4 +37,26 @@ expression: checks
    column: 12
  fix: ~
  parent: ~
- kind:
    DocumentAllArguments:
      - x
  location:
    row: 98
    column: 0
  end_location:
    row: 105
    column: 12
  fix: ~
  parent: ~
- kind:
    DocumentAllArguments:
      - "*args"
  location:
    row: 108
    column: 0
  end_location:
    row: 115
    column: 12
  fix: ~
  parent: ~

@@ -98,6 +98,7 @@ mod tests {
    #[test_case(CheckCode::F821, Path::new("F821_5.py"); "F821_5")]
    #[test_case(CheckCode::F821, Path::new("F821_6.py"); "F821_6")]
    #[test_case(CheckCode::F821, Path::new("F821_7.py"); "F821_7")]
    #[test_case(CheckCode::F821, Path::new("F821_8.pyi"); "F821_8")]
    #[test_case(CheckCode::F822, Path::new("F822.py"); "F822")]
    #[test_case(CheckCode::F823, Path::new("F823.py"); "F823")]
    #[test_case(CheckCode::F841, Path::new("F841_0.py"); "F841_0")]

@@ -1,5 +1,6 @@
use std::string::ToString;

use log::error;
use rustc_hash::FxHashSet;
use rustpython_ast::{Keyword, KeywordData};
use rustpython_parser::ast::{Constant, Expr, ExprKind};
@@ -115,9 +116,11 @@ pub(crate) fn percent_format_extra_named_arguments(
        location,
    );
    if checker.patch(check.kind.code()) {
        if let Ok(fix) = remove_unused_format_arguments_from_dict(&missing, right, checker.locator)
        {
            check.amend(fix);
        match remove_unused_format_arguments_from_dict(&missing, right, checker.locator) {
            Ok(fix) => {
                check.amend(fix);
            }
            Err(e) => error!("Failed to remove unused format arguments: {e}"),
        }
    }
    checker.add_check(check);
@@ -272,10 +275,12 @@ pub(crate) fn string_dot_format_extra_named_arguments(
        location,
    );
    if checker.patch(check.kind.code()) {
        if let Ok(fix) =
            remove_unused_keyword_arguments_from_format_call(&missing, location, checker.locator)
        match remove_unused_keyword_arguments_from_format_call(&missing, location, checker.locator)
        {
            check.amend(fix);
            Ok(fix) => {
                check.amend(fix);
            }
            Err(e) => error!("Failed to remove unused keyword arguments: {e}"),
        }
    }
    checker.add_check(check);

@@ -0,0 +1,6 @@
---
source: src/pyflakes/mod.rs
expression: checks
---
[]

@@ -48,6 +48,8 @@ mod tests {
    #[test_case(CheckCode::UP025, Path::new("UP025.py"); "UP025")]
    #[test_case(CheckCode::UP026, Path::new("UP026.py"); "UP026")]
    #[test_case(CheckCode::UP027, Path::new("UP027.py"); "UP027")]
    #[test_case(CheckCode::UP028, Path::new("UP028_0.py"); "UP028_0")]
    #[test_case(CheckCode::UP028, Path::new("UP028_1.py"); "UP028_1")]
    fn checks(check_code: CheckCode, path: &Path) -> Result<()> {
        let snapshot = format!("{}_{}", check_code.as_ref(), path.to_string_lossy());
        let checks = test_path(

@@ -193,8 +193,8 @@ pub fn convert_named_tuple_functional_to_class(
        return;
    };
    match match_defaults(keywords) {
        Ok(defaults) => {
            if let Ok(properties) = create_properties_from_args(args, defaults) {
        Ok(defaults) => match create_properties_from_args(args, defaults) {
            Ok(properties) => {
                let mut check = Check::new(
                    CheckKind::ConvertNamedTupleFunctionalToClass(typename.to_string()),
                    Range::from_located(stmt),
@@ -209,7 +209,8 @@ pub fn convert_named_tuple_functional_to_class(
                }
                checker.add_check(check);
            }
        }
        Err(err) => error!("Failed to create properties: {err}"),
        },
        Err(err) => error!("Failed to parse defaults: {err}"),
    }
}

@@ -12,6 +12,7 @@ pub use replace_universal_newlines::replace_universal_newlines;
pub use rewrite_c_element_tree::replace_c_element_tree;
pub use rewrite_mock_import::{rewrite_mock_attribute, rewrite_mock_import};
pub use rewrite_unicode_literal::rewrite_unicode_literal;
pub use rewrite_yield_from::rewrite_yield_from;
pub use super_call_with_parameters::super_call_with_parameters;
pub use type_of_primitive::type_of_primitive;
pub use typing_text_str_alias::typing_text_str_alias;
@@ -38,6 +39,7 @@ mod replace_universal_newlines;
mod rewrite_c_element_tree;
mod rewrite_mock_import;
mod rewrite_unicode_literal;
mod rewrite_yield_from;
mod super_call_with_parameters;
mod type_of_primitive;
mod typing_text_str_alias;

@@ -17,21 +17,31 @@ pub fn native_literals(
|
||||
) {
|
||||
let ExprKind::Name { id, .. } = &func.node else { return; };
|
||||
|
||||
if (id == "str" || id == "bytes")
|
||||
&& keywords.is_empty()
|
||||
&& args.len() <= 1
|
||||
&& checker.is_builtin(id)
|
||||
{
|
||||
if !keywords.is_empty() || args.len() > 1 {
|
||||
return;
|
||||
}
|
||||
|
||||
if (id == "str" || id == "bytes") && checker.is_builtin(id) {
|
||||
let Some(arg) = args.get(0) else {
|
||||
```diff
-        let literal_type = if id == "str" {
+        let mut check = Check::new(CheckKind::NativeLiterals(if id == "str" {
             LiteralType::Str
         } else {
             LiteralType::Bytes
-        };
-        let mut check = Check::new(CheckKind::NativeLiterals(literal_type), Range::from_located(expr));
+        }), Range::from_located(expr));
         if checker.patch(&CheckCode::UP018) {
             check.amend(Fix::replacement(
-                format!("{}\"\"", if id == "bytes" { "b" } else { "" }),
+                if id == "bytes" {
+                    let mut content = String::with_capacity(3);
+                    content.push('b');
+                    content.push(checker.style.quote().into());
+                    content.push(checker.style.quote().into());
+                    content
+                } else {
+                    let mut content = String::with_capacity(2);
+                    content.push(checker.style.quote().into());
+                    content.push(checker.style.quote().into());
+                    content
+                },
                 expr.location,
                 expr.end_location.unwrap(),
             ));
@@ -40,14 +50,31 @@ pub fn native_literals(
         return;
     };

-    let ExprKind::Constant { value, .. } = &arg.node else {
-        return;
-    };
-    let literal_type = match value {
-        Constant::Str { .. } => LiteralType::Str,
-        Constant::Bytes { .. } => LiteralType::Bytes,
-        _ => return,
-    };
+    // Look for `str("")`.
+    if id == "str"
+        && !matches!(
+            &arg.node,
+            ExprKind::Constant {
+                value: Constant::Str(_),
+                ..
+            },
+        )
+    {
+        return;
+    }
+
+    // Look for `bytes(b"")`
+    if id == "bytes"
+        && !matches!(
+            &arg.node,
+            ExprKind::Constant {
+                value: Constant::Bytes(_),
+                ..
+            },
+        )
+    {
+        return;
+    }

     // rust-python merges adjacent string/bytes literals into one node, but we can't
     // safely remove the outer call in this situation. We're following pyupgrade
@@ -65,7 +92,11 @@ pub fn native_literals(
     }

     let mut check = Check::new(
-        CheckKind::NativeLiterals(literal_type),
+        CheckKind::NativeLiterals(if id == "str" {
+            LiteralType::Str
+        } else {
+            LiteralType::Bytes
+        }),
         Range::from_located(expr),
     );
     if checker.patch(&CheckCode::UP018) {
```
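The diff above changes how the replacement literal is built (from the checker's configured quote style rather than a hard-coded `format!`), but the rule's observable behavior is unchanged: no-op `str()`/`bytes()` calls are rewritten to plain literals. A minimal Python sketch of the before/after equivalence the fix relies on:

```python
# UP018 ("native literals") rewrites no-op str()/bytes() calls into the
# literal itself. Each pair below is semantically equivalent; the autofix
# replaces the left-hand spelling with the right-hand one.
before_after = [
    ("str()", '""'),
    ('str("foo")', '"foo"'),
    ("bytes()", 'b""'),
    ('bytes(b"foo")', 'b"foo"'),
]

for before, after in before_after:
    # Evaluating both sides shows the rewrite preserves the value.
    assert eval(before) == eval(after)
```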
src/pyupgrade/plugins/rewrite_yield_from.rs (new file, 161 lines)

```rust
use rustc_hash::FxHashMap;
use rustpython_ast::{Expr, ExprContext, ExprKind, Stmt, StmtKind};

use crate::ast::types::{Range, RefEquality};
use crate::ast::visitor;
use crate::ast::visitor::Visitor;
use crate::autofix::Fix;
use crate::checkers::ast::Checker;
use crate::registry::{Check, CheckKind};

fn collect_names(expr: &Expr) -> Vec<&str> {
    match &expr.node {
        ExprKind::Name { id, .. } => vec![id],
        ExprKind::Tuple { elts, .. } => elts.iter().flat_map(collect_names).collect(),
        _ => vec![],
    }
}

#[derive(Debug)]
struct YieldFrom<'a> {
    stmt: &'a Stmt,
    body: &'a Stmt,
    iter: &'a Expr,
    names: Vec<&'a str>,
}

#[derive(Default)]
struct YieldFromVisitor<'a> {
    yields: Vec<YieldFrom<'a>>,
}

impl<'a> Visitor<'a> for YieldFromVisitor<'a> {
    fn visit_stmt(&mut self, stmt: &'a Stmt) {
        match &stmt.node {
            StmtKind::For {
                target,
                body,
                orelse,
                iter,
                ..
            } => {
                // If there is an else statement, don't rewrite.
                if !orelse.is_empty() {
                    return;
                }
                // If there's any logic besides a yield, don't rewrite.
                if body.len() != 1 {
                    return;
                }
                // If the body is not a yield, don't rewrite.
                let body = &body[0];
                if let StmtKind::Expr { value } = &body.node {
                    if let ExprKind::Yield { value: Some(value) } = &value.node {
                        let names = collect_names(target);
                        if names == collect_names(value) {
                            self.yields.push(YieldFrom {
                                stmt,
                                body,
                                iter,
                                names,
                            });
                        }
                    }
                }
            }
            StmtKind::FunctionDef { .. }
            | StmtKind::AsyncFunctionDef { .. }
            | StmtKind::ClassDef { .. } => {
                // Don't recurse into anything that defines a new scope.
            }
            _ => visitor::walk_stmt(self, stmt),
        }
    }

    fn visit_expr(&mut self, expr: &'a Expr) {
        match &expr.node {
            ExprKind::ListComp { .. }
            | ExprKind::SetComp { .. }
            | ExprKind::DictComp { .. }
            | ExprKind::GeneratorExp { .. }
            | ExprKind::Lambda { .. } => {
                // Don't recurse into anything that defines a new scope.
            }
            _ => visitor::walk_expr(self, expr),
        }
    }
}

#[derive(Default)]
struct ReferenceVisitor<'a> {
    parent: Option<&'a Stmt>,
    references: FxHashMap<RefEquality<'a, Stmt>, Vec<&'a str>>,
}

impl<'a> Visitor<'a> for ReferenceVisitor<'a> {
    fn visit_stmt(&mut self, stmt: &'a Stmt) {
        let prev_parent = self.parent;
        self.parent = Some(stmt);
        visitor::walk_stmt(self, stmt);
        self.parent = prev_parent;
    }

    fn visit_expr(&mut self, expr: &'a Expr) {
        match &expr.node {
            ExprKind::Name { id, ctx } => {
                if matches!(ctx, ExprContext::Load | ExprContext::Del) {
                    if let Some(parent) = self.parent {
                        self.references
                            .entry(RefEquality(parent))
                            .or_default()
                            .push(id);
                    }
                }
            }
            _ => visitor::walk_expr(self, expr),
        }
    }
}

/// UP028
pub fn rewrite_yield_from(checker: &mut Checker, stmt: &Stmt) {
    // Intentionally omit async functions.
    if let StmtKind::FunctionDef { body, .. } = &stmt.node {
        let yields = {
            let mut visitor = YieldFromVisitor::default();
            visitor.visit_body(body);
            visitor.yields
        };

        let references = {
            let mut visitor = ReferenceVisitor::default();
            visitor.visit_body(body);
            visitor.references
        };

        for item in yields {
            // If any of the bound names are used outside of the loop, don't rewrite.
            if references.iter().any(|(stmt, names)| {
                stmt != &RefEquality(item.stmt)
                    && stmt != &RefEquality(item.body)
                    && item.names.iter().any(|name| names.contains(name))
            }) {
                continue;
            }

            let mut check = Check::new(CheckKind::RewriteYieldFrom, Range::from_located(item.stmt));
            if checker.patch(check.kind.code()) {
                let contents = checker
                    .locator
                    .slice_source_code_range(&Range::from_located(item.iter));
                let contents = format!("yield from {contents}");
                check.amend(Fix::replacement(
                    contents,
                    item.stmt.location,
                    item.stmt.end_location.unwrap(),
                ));
            }
            checker.add_check(check);
        }
    }
}
```
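The plugin above only fires when the `for` target names exactly match the yielded value and none of the bound names escape the loop. The Python-level transformation it performs (UP028) can be sketched as:

```python
# Before: a for loop that only re-yields its target (flagged by UP028).
def yield_in_loop(iterable):
    for x in iterable:
        yield x

# After: the autofix replaces the whole loop with a single `yield from`.
def yield_from(iterable):
    yield from iterable

# Both generators produce identical sequences.
assert list(yield_in_loop([1, 2, 3])) == list(yield_from([1, 2, 3])) == [1, 2, 3]
```

The name-escape guard matters because `for x in y: yield x` leaves `x` bound after the loop, while `yield from y` binds nothing.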
```diff
@@ -1,3 +1,4 @@
+use log::error;
 use rustpython_ast::{Constant, Expr, ExprKind, Location, Operator};

 use crate::ast::helpers::{collect_call_paths, dealias_call_path};
@@ -71,13 +72,16 @@ pub fn use_pep604_annotation(checker: &mut Checker, expr: &Expr, value: &Expr, s
                 checker.style.line_ending(),
             );
             generator.unparse_expr(&optional(slice), 0);
-            if let Ok(content) = generator.generate() {
-                check.amend(Fix::replacement(
-                    content,
-                    expr.location,
-                    expr.end_location.unwrap(),
-                ));
-            }
+            match generator.generate() {
+                Ok(content) => {
+                    check.amend(Fix::replacement(
+                        content,
+                        expr.location,
+                        expr.end_location.unwrap(),
+                    ));
+                }
+                Err(e) => error!("Failed to rewrite PEP604 annotation: {e}"),
+            };
         }
         checker.add_check(check);
     } else if checker.match_typing_call_path(&call_path, "Union") {
@@ -94,12 +98,15 @@ pub fn use_pep604_annotation(checker: &mut Checker, expr: &Expr, value: &Expr, s
                 checker.style.line_ending(),
             );
             generator.unparse_expr(&union(elts), 0);
-            if let Ok(content) = generator.generate() {
-                check.amend(Fix::replacement(
-                    content,
-                    expr.location,
-                    expr.end_location.unwrap(),
-                ));
-            }
+            match generator.generate() {
+                Ok(content) => {
+                    check.amend(Fix::replacement(
+                        content,
+                        expr.location,
+                        expr.end_location.unwrap(),
+                    ));
+                }
+                Err(e) => error!("Failed to rewrite PEP604 annotation: {e}"),
+            }
         }
         _ => {
@@ -110,12 +117,15 @@ pub fn use_pep604_annotation(checker: &mut Checker, expr: &Expr, value: &Expr, s
                 checker.style.line_ending(),
             );
             generator.unparse_expr(slice, 0);
-            if let Ok(content) = generator.generate() {
-                check.amend(Fix::replacement(
-                    content,
-                    expr.location,
-                    expr.end_location.unwrap(),
-                ));
-            }
+            match generator.generate() {
+                Ok(content) => {
+                    check.amend(Fix::replacement(
+                        content,
+                        expr.location,
+                        expr.end_location.unwrap(),
+                    ));
+                }
+                Err(e) => error!("Failed to rewrite PEP604 annotation: {e}"),
+            }
         }
     }
```
```diff
@@ -5,103 +5,103 @@ expression: checks
 - kind:
     NativeLiterals: Str
   location:
-    row: 18
+    row: 20
     column: 0
   end_location:
-    row: 18
+    row: 20
     column: 5
   fix:
     content: "\"\""
     location:
-      row: 18
+      row: 20
      column: 0
    end_location:
-      row: 18
+      row: 20
      column: 5
  parent: ~
 - kind:
     NativeLiterals: Str
   location:
-    row: 19
+    row: 21
     column: 0
   end_location:
-    row: 19
+    row: 21
     column: 10
   fix:
     content: "\"foo\""
     location:
-      row: 19
+      row: 21
      column: 0
    end_location:
-      row: 19
+      row: 21
      column: 10
  parent: ~
 - kind:
     NativeLiterals: Str
   location:
-    row: 20
+    row: 22
     column: 0
   end_location:
-    row: 21
+    row: 23
     column: 7
   fix:
     content: "\"\"\"\nfoo\"\"\""
     location:
      row: 20
      column: 0
    end_location:
      row: 21
      column: 7
  parent: ~
 - kind:
     NativeLiterals: Bytes
   location:
    row: 22
    column: 0
  end_location:
    row: 22
    column: 7
  fix:
    content: "b\"\""
    location:
      row: 22
      column: 0
    end_location:
-      row: 22
+      row: 23
      column: 7
  parent: ~
 - kind:
     NativeLiterals: Bytes
   location:
    row: 23
    column: 0
  end_location:
    row: 23
    column: 13
  fix:
    content: "b\"foo\""
    location:
      row: 23
      column: 0
    end_location:
      row: 23
      column: 13
  parent: ~
 - kind:
     NativeLiterals: Bytes
   location:
    row: 24
    column: 0
  end_location:
-    row: 25
+    row: 24
    column: 7
  fix:
-    content: "b\"\"\"\nfoo\"\"\""
+    content: "b\"\""
    location:
      row: 24
      column: 0
    end_location:
      row: 24
      column: 7
  parent: ~
 - kind:
     NativeLiterals: Bytes
   location:
    row: 25
    column: 0
  end_location:
    row: 25
    column: 13
  fix:
    content: "b\"foo\""
    location:
      row: 25
      column: 0
    end_location:
      row: 25
      column: 13
  parent: ~
 - kind:
     NativeLiterals: Bytes
   location:
    row: 26
    column: 0
  end_location:
    row: 27
    column: 7
  fix:
    content: "b\"\"\"\nfoo\"\"\""
    location:
      row: 26
      column: 0
    end_location:
      row: 27
      column: 7
  parent: ~
```
```diff
@@ -0,0 +1,197 @@
---
source: src/pyupgrade/mod.rs
expression: checks
---
- kind: RewriteYieldFrom
  location:
    row: 2
    column: 4
  end_location:
    row: 3
    column: 15
  fix:
    content: yield from y
    location:
      row: 2
      column: 4
    end_location:
      row: 3
      column: 15
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 7
    column: 4
  end_location:
    row: 8
    column: 20
  fix:
    content: yield from z
    location:
      row: 7
      column: 4
    end_location:
      row: 8
      column: 20
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 12
    column: 4
  end_location:
    row: 13
    column: 15
  fix:
    content: "yield from [1, 2, 3]"
    location:
      row: 12
      column: 4
    end_location:
      row: 13
      column: 15
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 17
    column: 4
  end_location:
    row: 18
    column: 15
  fix:
    content: "yield from {x for x in y}"
    location:
      row: 17
      column: 4
    end_location:
      row: 18
      column: 15
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 22
    column: 4
  end_location:
    row: 23
    column: 15
  fix:
    content: "yield from (1, 2, 3)"
    location:
      row: 22
      column: 4
    end_location:
      row: 23
      column: 15
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 27
    column: 4
  end_location:
    row: 28
    column: 18
  fix:
    content: "yield from {3: \"x\", 6: \"y\"}"
    location:
      row: 27
      column: 4
    end_location:
      row: 28
      column: 18
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 33
    column: 4
  end_location:
    row: 39
    column: 18
  fix:
    content: "yield from { # Comment three\\n'\n 3: \"x\", # Comment four\\n'\n # Comment five\\n'\n 6: \"y\", # Comment six\\n'\n }"
    location:
      row: 33
      column: 4
    end_location:
      row: 39
      column: 18
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 44
    column: 4
  end_location:
    row: 45
    column: 18
  fix:
    content: "yield from [{3: (3, [44, \"long ss\"]), 6: \"y\"}]"
    location:
      row: 44
      column: 4
    end_location:
      row: 45
      column: 18
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 49
    column: 4
  end_location:
    row: 50
    column: 18
  fix:
    content: yield from z()
    location:
      row: 49
      column: 4
    end_location:
      row: 50
      column: 18
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 55
    column: 8
  end_location:
    row: 57
    column: 22
  fix:
    content: yield from z()
    location:
      row: 55
      column: 8
    end_location:
      row: 57
      column: 22
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 67
    column: 4
  end_location:
    row: 68
    column: 15
  fix:
    content: yield from x
    location:
      row: 67
      column: 4
    end_location:
      row: 68
      column: 15
  parent: ~
- kind: RewriteYieldFrom
  location:
    row: 72
    column: 4
  end_location:
    row: 73
    column: 18
  fix:
    content: yield from z()
    location:
      row: 72
      column: 4
    end_location:
      row: 73
      column: 18
  parent: ~
```
```diff
@@ -0,0 +1,6 @@
---
source: src/pyupgrade/mod.rs
expression: checks
---
[]
```
```diff
@@ -248,6 +248,7 @@ pub enum CheckCode {
     UP025,
     UP026,
     UP027,
+    UP028,
     // pydocstyle
     D100,
     D101,
@@ -975,6 +976,7 @@ pub enum CheckKind {
     RewriteUnicodeLiteral,
     RewriteMockImport(MockReference),
     RewriteListComprehension,
+    RewriteYieldFrom,
     // pydocstyle
     BlankLineAfterLastSection(String),
     BlankLineAfterSection(String),
@@ -1405,6 +1407,7 @@ impl CheckCode {
             CheckCode::UP025 => CheckKind::RewriteUnicodeLiteral,
             CheckCode::UP026 => CheckKind::RewriteMockImport(MockReference::Import),
             CheckCode::UP027 => CheckKind::RewriteListComprehension,
+            CheckCode::UP028 => CheckKind::RewriteYieldFrom,
             // pydocstyle
             CheckCode::D100 => CheckKind::PublicModule,
             CheckCode::D101 => CheckKind::PublicClass,
@@ -1933,6 +1936,7 @@ impl CheckCode {
             CheckCode::UP025 => CheckCategory::Pyupgrade,
             CheckCode::UP026 => CheckCategory::Pyupgrade,
             CheckCode::UP027 => CheckCategory::Pyupgrade,
+            CheckCode::UP028 => CheckCategory::Pyupgrade,
             // pycodestyle (warnings)
             CheckCode::W292 => CheckCategory::Pycodestyle,
             CheckCode::W605 => CheckCategory::Pycodestyle,
@@ -2173,6 +2177,7 @@ impl CheckKind {
            CheckKind::RewriteUnicodeLiteral => &CheckCode::UP025,
            CheckKind::RewriteMockImport(..) => &CheckCode::UP026,
            CheckKind::RewriteListComprehension => &CheckCode::UP027,
+            CheckKind::RewriteYieldFrom => &CheckCode::UP028,
            // pydocstyle
            CheckKind::BlankLineAfterLastSection(..) => &CheckCode::D413,
            CheckKind::BlankLineAfterSection(..) => &CheckCode::D410,
@@ -2975,6 +2980,9 @@ impl CheckKind {
            CheckKind::RewriteListComprehension => {
                "Replace unpacked list comprehension with a generator expression".to_string()
            }
+            CheckKind::RewriteYieldFrom => {
+                "Replace `yield` over `for` loop with `yield from`".to_string()
+            }
            // pydocstyle
            CheckKind::FitsOnOneLine => "One-line docstring should fit on one line".to_string(),
            CheckKind::BlankLineAfterSummary => {
@@ -3555,6 +3563,7 @@ impl CheckKind {
                | CheckKind::RewriteListComprehension
                | CheckKind::RewriteMockImport(..)
                | CheckKind::RewriteUnicodeLiteral
+                | CheckKind::RewriteYieldFrom
                | CheckKind::SectionNameEndsInColon(..)
                | CheckKind::SectionNotOverIndented(..)
                | CheckKind::SectionUnderlineAfterName(..)
@@ -3594,6 +3603,8 @@ impl CheckKind {
                | CheckKind::UselessImportAlias
                | CheckKind::UselessMetaclassType
                | CheckKind::UselessObjectInheritance(..)
                | CheckKind::UselessYieldFixture(..)
                | CheckKind::YodaConditions(..)
        )
    }
@@ -3689,6 +3700,7 @@ impl CheckKind {
            CheckKind::RewriteListComprehension => {
                Some("Replace with generator expression".to_string())
            }
+            CheckKind::RewriteYieldFrom => Some("Replace with `yield from`".to_string()),
            CheckKind::NewLineAfterSectionName(name) => {
                Some(format!("Add newline after \"{name}\""))
            }
@@ -3856,6 +3868,10 @@ impl CheckKind {
            CheckKind::UselessObjectInheritance(..) => {
                Some("Remove `object` inheritance".to_string())
            }
            CheckKind::UselessYieldFixture(..) => Some("Replace `yield` with `return`".to_string()),
            CheckKind::YodaConditions(left, right) => Some(format!(
                "Replace with `{left} == {right}` (Yoda-conditions)`"
            )),
            _ => None,
        }
    }
@@ -589,6 +589,7 @@ pub enum CheckCodePrefix {
    UP025,
    UP026,
    UP027,
+    UP028,
    W,
    W2,
    W29,
@@ -828,6 +829,7 @@ impl CheckCodePrefix {
            CheckCode::UP025,
            CheckCode::UP026,
            CheckCode::UP027,
+            CheckCode::UP028,
            CheckCode::D100,
            CheckCode::D101,
            CheckCode::D102,
@@ -2668,6 +2670,7 @@ impl CheckCodePrefix {
                CheckCode::UP025,
                CheckCode::UP026,
                CheckCode::UP027,
+                CheckCode::UP028,
            ]
        }
        CheckCodePrefix::U0 => {
@@ -2704,6 +2707,7 @@ impl CheckCodePrefix {
                CheckCode::UP025,
                CheckCode::UP026,
                CheckCode::UP027,
+                CheckCode::UP028,
            ]
        }
        CheckCodePrefix::U00 => {
@@ -2924,6 +2928,7 @@ impl CheckCodePrefix {
            CheckCode::UP025,
            CheckCode::UP026,
            CheckCode::UP027,
+            CheckCode::UP028,
        ],
        CheckCodePrefix::UP0 => vec![
            CheckCode::UP001,
@@ -2952,6 +2957,7 @@ impl CheckCodePrefix {
            CheckCode::UP025,
            CheckCode::UP026,
            CheckCode::UP027,
+            CheckCode::UP028,
        ],
        CheckCodePrefix::UP00 => vec![
            CheckCode::UP001,
@@ -3002,6 +3008,7 @@ impl CheckCodePrefix {
            CheckCode::UP025,
            CheckCode::UP026,
            CheckCode::UP027,
+            CheckCode::UP028,
        ],
        CheckCodePrefix::UP020 => vec![CheckCode::UP020],
        CheckCodePrefix::UP021 => vec![CheckCode::UP021],
@@ -3011,6 +3018,7 @@ impl CheckCodePrefix {
        CheckCodePrefix::UP025 => vec![CheckCode::UP025],
        CheckCodePrefix::UP026 => vec![CheckCode::UP026],
        CheckCodePrefix::UP027 => vec![CheckCode::UP027],
+        CheckCodePrefix::UP028 => vec![CheckCode::UP028],
        CheckCodePrefix::W => vec![CheckCode::W292, CheckCode::W605],
        CheckCodePrefix::W2 => vec![CheckCode::W292],
        CheckCodePrefix::W29 => vec![CheckCode::W292],
@@ -3631,6 +3639,7 @@ impl CheckCodePrefix {
        CheckCodePrefix::UP025 => SuffixLength::Three,
        CheckCodePrefix::UP026 => SuffixLength::Three,
        CheckCodePrefix::UP027 => SuffixLength::Three,
+        CheckCodePrefix::UP028 => SuffixLength::Three,
        CheckCodePrefix::W => SuffixLength::Zero,
        CheckCodePrefix::W2 => SuffixLength::One,
        CheckCodePrefix::W29 => SuffixLength::Two,
```
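The generated `CheckCodePrefix` tables above expand a prefix like `UP02` into every matching code, with `SuffixLength` recording how many digits the prefix carries. A minimal sketch of that expansion, over a hypothetical subset of codes (the real list is code-generated):

```python
# Hypothetical subset of registered codes, for illustration only.
CODES = ["UP025", "UP026", "UP027", "UP028", "W292", "W605"]

def expand(prefix: str) -> list[str]:
    """Expand a code prefix to every registered code it matches."""
    return [code for code in CODES if code.startswith(prefix)]

def suffix_length(prefix: str) -> int:
    """Number of digits after the category letters, e.g. UP028 -> 3, W -> 0."""
    return sum(ch.isdigit() for ch in prefix)

assert expand("UP02") == ["UP025", "UP026", "UP027", "UP028"]
assert expand("W") == ["W292", "W605"]
assert suffix_length("UP028") == 3
assert suffix_length("W") == 0
```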
```diff
@@ -3,11 +3,13 @@
 //! to external visibility or parsing.

 use std::hash::{Hash, Hasher};
+use std::iter;
 use std::path::{Path, PathBuf};

 use anyhow::{anyhow, Result};
 use colored::Colorize;
 use globset::{Glob, GlobMatcher, GlobSet};
+use itertools::Either::{Left, Right};
 use itertools::Itertools;
 use once_cell::sync::Lazy;
 use path_absolutize::path_dedot;
@@ -114,12 +116,6 @@ impl Settings {
                 .dummy_variable_rgx
                 .unwrap_or_else(|| DEFAULT_DUMMY_VARIABLE_RGX.clone()),
             enabled: validate_enabled(resolve_codes(
-                config
-                    .pydocstyle
-                    .as_ref()
-                    .and_then(|pydocstyle| pydocstyle.convention)
-                    .map(|convention| convention.codes())
-                    .unwrap_or_default(),
                 [CheckCodeSpec {
                     select: &config
                         .select
@@ -133,6 +129,22 @@ impl Settings {
                     .iter()
                     .zip(config.extend_ignore.iter())
                     .map(|(select, ignore)| CheckCodeSpec { select, ignore }),
                 )
+                .chain(
+                    // If a docstring convention is specified, force-disable any incompatible error
+                    // codes.
+                    if let Some(convention) = config
+                        .pydocstyle
+                        .as_ref()
+                        .and_then(|pydocstyle| pydocstyle.convention)
+                    {
+                        Left(iter::once(CheckCodeSpec {
+                            select: &[],
+                            ignore: convention.codes(),
+                        }))
+                    } else {
+                        Right(iter::empty())
+                    },
+                ),
             )),
             exclude: resolve_globset(config.exclude.unwrap_or_else(|| DEFAULT_EXCLUDE.clone()))?,
@@ -141,7 +153,6 @@ impl Settings {
             fix: config.fix.unwrap_or(false),
             fix_only: config.fix_only.unwrap_or(false),
             fixable: resolve_codes(
-                vec![],
                 [CheckCodeSpec {
                     select: &config.fixable.unwrap_or_else(|| CATEGORIES.to_vec()),
                     ignore: &config.unfixable.unwrap_or_default(),
@@ -344,6 +355,7 @@ impl Hash for Settings {
         self.flake8_bugbear.hash(state);
         self.flake8_errmsg.hash(state);
+        self.flake8_import_conventions.hash(state);
         self.flake8_pytest_style.hash(state);
         self.flake8_quotes.hash(state);
         self.flake8_tidy_imports.hash(state);
         self.flake8_unused_arguments.hash(state);
@@ -391,11 +403,8 @@ struct CheckCodeSpec<'a> {

 /// Given a set of selected and ignored prefixes, resolve the set of enabled
 /// error codes.
-fn resolve_codes<'a>(
-    baseline: Vec<CheckCode>,
-    specs: impl Iterator<Item = CheckCodeSpec<'a>>,
-) -> FxHashSet<CheckCode> {
-    let mut codes: FxHashSet<CheckCode> = FxHashSet::from_iter(baseline);
+fn resolve_codes<'a>(specs: impl Iterator<Item = CheckCodeSpec<'a>>) -> FxHashSet<CheckCode> {
+    let mut codes: FxHashSet<CheckCode> = FxHashSet::default();
     for spec in specs {
         for specificity in [
             SuffixLength::None,
@@ -448,7 +457,6 @@ mod tests {
     #[test]
     fn check_codes() {
         let actual = resolve_codes(
-            vec![],
             [CheckCodeSpec {
                 select: &[CheckCodePrefix::W],
                 ignore: &[],
@@ -459,7 +467,6 @@ mod tests {
         assert_eq!(actual, expected);

         let actual = resolve_codes(
-            vec![],
             [CheckCodeSpec {
                 select: &[CheckCodePrefix::W6],
                 ignore: &[],
@@ -470,7 +477,6 @@ mod tests {
         assert_eq!(actual, expected);

         let actual = resolve_codes(
-            vec![],
             [CheckCodeSpec {
                 select: &[CheckCodePrefix::W],
                 ignore: &[CheckCodePrefix::W292],
@@ -481,7 +487,6 @@ mod tests {
         assert_eq!(actual, expected);

         let actual = resolve_codes(
-            vec![],
             [CheckCodeSpec {
                 select: &[CheckCodePrefix::W605],
                 ignore: &[CheckCodePrefix::W605],
@@ -492,7 +497,6 @@ mod tests {
         assert_eq!(actual, expected);

         let actual = resolve_codes(
-            vec![],
             [
                 CheckCodeSpec {
                     select: &[CheckCodePrefix::W],
@@ -509,7 +513,6 @@ mod tests {
         assert_eq!(actual, expected);

         let actual = resolve_codes(
-            vec![],
             [
                 CheckCodeSpec {
                     select: &[CheckCodePrefix::W],
```
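`resolve_codes` above applies each spec's selects and ignores in order of increasing prefix specificity, so a more specific ignore (e.g. `W292`) can carve codes out of a broader select (`W`), and a later spec can re-enable what an earlier one disabled. A hedged Python sketch of that resolution order (not the real implementation; `CODES` is a stand-in for the registry):

```python
# Stand-in for the registered code list.
CODES = ["W292", "W605"]

def _specificity(prefix: str) -> int:
    """Digits in the prefix: W -> 0, W2 -> 1, W292 -> 3."""
    return sum(c.isdigit() for c in prefix)

def resolve_codes(specs):
    """specs: iterable of (select_prefixes, ignore_prefixes) pairs.

    Within each spec, prefixes are applied least-specific first, so a
    broad select (W) can be narrowed by a more specific ignore (W292).
    """
    codes = set()
    for select, ignore in specs:
        for specificity in range(4):
            for prefix in select:
                if _specificity(prefix) == specificity:
                    codes |= {c for c in CODES if c.startswith(prefix)}
            for prefix in ignore:
                if _specificity(prefix) == specificity:
                    codes -= {c for c in CODES if c.startswith(prefix)}
    return codes

assert resolve_codes([(["W"], [])]) == {"W292", "W605"}
assert resolve_codes([(["W"], ["W292"])]) == {"W605"}
assert resolve_codes([(["W605"], ["W605"])]) == set()
```

The removal of the `baseline` parameter in the diff means the pydocstyle-convention codes now arrive as just another chained spec instead of a pre-seeded set.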
```diff
@@ -1,5 +1,6 @@
 //! Detect code style from Python source code.

+use std::fmt;
 use std::ops::Deref;

 use once_cell::unsync::OnceCell;
@@ -69,6 +70,24 @@ impl From<&Quote> for vendor::str::Quote {
     }
 }

+impl fmt::Display for Quote {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> std::fmt::Result {
+        match self {
+            Quote::Single => write!(f, "\'"),
+            Quote::Double => write!(f, "\""),
+        }
+    }
+}
+
+impl From<&Quote> for char {
+    fn from(val: &Quote) -> Self {
+        match val {
+            Quote::Single => '\'',
+            Quote::Double => '"',
+        }
+    }
+}
+
 /// The indentation style used in Python source code.
 #[derive(Debug, PartialEq, Eq)]
 pub struct Indentation(String);
```