Compare commits

...

23 Commits

Author SHA1 Message Date
Carl Meyer
ab9b38c0f1 re-add para on CLICOLOR and CLICOLOR_FORCE 2024-04-05 16:07:54 -06:00
Carl Meyer
af12ee1763 Switch from colored to owo_colors/anstream 2024-04-04 17:55:43 -06:00
Alex Waygood
6b4fa17097 Rework docs for pydocstyle rules (#10754) 2024-04-03 22:34:00 +01:00
Carl Meyer
5e2482824c [flake8_comprehensions] add sum/min/max to unnecessary comprehension check (C419) (#10759)
Fixes #3259 

## Summary

Renames `UnnecessaryComprehensionAnyAll` to
`UnnecessaryComprehensionInCall` and extends the check to `sum`, `min`,
and `max`, in addition to `any` and `all`.

## Test Plan

Updated snapshot test.

Built docs locally and verified the docs for this rule still render
correctly.
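As a sketch of what the extended check now targets (values illustrative, not from the PR):

```python
# Flagged after this change: building a throwaway list just to reduce it.
values = [3, 1, 2]
total = sum([v * 2 for v in values])  # C419: unnecessary list comprehension
# Preferred: pass a generator expression straight to the call.
total_gen = sum(v * 2 for v in values)
assert total == total_gen == 12
```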
2024-04-03 14:44:33 -06:00
Carl Meyer
e0a8fb607a fix obsolete name in resolve_qualified_name docs (#10762)
`resolve_call_path` was renamed to `resolve_qualified_name` in
a6d892b1f4, but the doc block for the
function wasn't updated to match.
2024-04-03 20:13:10 +00:00
Jane Lewis
257964a8bc ruff server now supports source.fixAll source action (#10597)
## Summary

`ruff server` now has source action `source.fixAll` as an available code
action.

This also fixes https://github.com/astral-sh/ruff/issues/10593 in the
process of revising the code for quick fix code actions.

## Test Plan




https://github.com/astral-sh/ruff/assets/19577865/f4c07425-e68a-445f-a4ed-949c9197a6be
2024-04-03 16:22:17 +00:00
Boshen
d467aa78c2 Remove an unused dependency (#10747)
## Summary

Continuation of #10475, I improved [`cargo
shear`](https://github.com/Boshen/cargo-shear) even more.

We can put this in CI once I test it a bit more, given that [ignoring
false
positives](https://github.com/Boshen/cargo-shear?tab=readme-ov-file#ignore-false-positives)
has been implemented.

## Test Plan

`cargo check --all-features --all-targets`
2024-04-03 09:57:19 +01:00
renovate[bot]
6e1c061e5f chore(deps): update rust crate lsp-types to v0.95.1 (#10686)
[![Mend
Renovate](https://app.renovatebot.com/images/banner.svg)](https://renovatebot.com)

This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [lsp-types](https://togithub.com/gluon-lang/lsp-types) |
workspace.dependencies | patch | `0.95.0` -> `0.95.1` |

---

### Release Notes

<details>
<summary>gluon-lang/lsp-types (lsp-types)</summary>

###
[`v0.95.1`](https://togithub.com/gluon-lang/lsp-types/blob/HEAD/CHANGELOG.md#v0951-2024-03-18)

[Compare
Source](https://togithub.com/gluon-lang/lsp-types/compare/v0.95.0...v0.95.1)

##### v0.95.1 (2024-03-18)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "on Monday" (UTC), Automerge - At any
time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR has been generated by [Mend
Renovate](https://www.mend.io/free-developer-tools/renovate/). View
repository job log
[here](https://developer.mend.io/github/astral-sh/ruff).



Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-04-03 06:37:03 +00:00
Jane Lewis
9872f51293 Drop support for root_uri as an initialization parameter in ruff_server (#10743)
## Summary

Needed for https://github.com/astral-sh/ruff/pull/10686.

We no longer support `root_uri` as an initialization parameter, relying
solely on `workspace_folders` to find the working directories. This
means that the minimum supported LSP version is now `3.6`.

## Test Plan

When opening a folder in VS Code, you shouldn't see any errors in the
log which say `No workspace(s) were provided(...)`.
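A minimal sketch of the relevant `initialize` params (field names per the LSP spec; values illustrative):

```python
# workspaceFolders (LSP >= 3.6) is now the only source of working
# directories; rootUri is ignored even when a client still sends it.
initialize_params = {
    "processId": 1234,
    "capabilities": {},
    "rootUri": None,  # deprecated, no longer read
    "workspaceFolders": [
        {"uri": "file:///home/user/project", "name": "project"},
    ],
}
folders = initialize_params["workspaceFolders"] or []
assert [f["name"] for f in folders] == ["project"]
```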
2024-04-02 20:51:59 -07:00
Charlie Marsh
e54b591ec7 Use section name range for all section-related docstring diagnostics (#10740)
## Summary

We may not have had access to this in the past, but in short, if the
diagnostic is related to a specific section of a docstring, it seems
better to highlight the section (via the header) than the _entire_
docstring.

This should be completely compatible with existing `# noqa` since it's
always inside of a multi-line string anyway, and in such cases the `#
noqa` is always placed at the end of the multiline string.

Closes https://github.com/astral-sh/ruff/issues/10736.
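A sketch of why existing suppressions keep working (rule code and docstring are illustrative):

```python
def divide(a: float, b: float) -> float:
    # The suppression below sits on the line that closes the multi-line
    # string, so a diagnostic anchored to the section header is still
    # covered by it.
    """Divide a by b.

    Returns
    -------
        The quotient.
    """  # noqa: D406
    return a / b

assert divide(6, 3) == 2.0
```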
2024-04-02 22:10:04 -04:00
Carl Meyer
814b26f82e [flake8_comprehensions] update docs for unnecessary-comprehension-any-all (C419) (#10744)
Ref #3259; see in particular
https://github.com/astral-sh/ruff/issues/3259#issuecomment-2033127339

## Summary

Improve the accuracy of the docs for this lint rule/fix.

## Test Plan

Generated the docs locally and visited the page for this rule:

![Screenshot 2024-04-02 at 4 56
40 PM](https://github.com/astral-sh/ruff/assets/61586/64f25cf6-edfe-4682-ac8e-7e21b834f5f2)

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2024-04-02 17:42:55 -06:00
Carl Meyer
2a4084a2bb chore(docs): update ruff_linter crate name in CONTRIBUTING.md (#10745)
Reading through `CONTRIBUTING.md`, I happened to notice that it still
referred to the `ruff_linter` crate as the `ruff` crate. `ruff` is a
different crate, located in `crates/ruff`, which doesn't contain "the
vast majority of the code and all the lint rules."
2024-04-02 17:06:21 -06:00
Charlie Marsh
dff8f93457 [flake8-return] Ignore assignments to annotated variables in unnecessary-assign (#10741)
## Summary

Closes https://github.com/astral-sh/ruff/issues/10732.
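A sketch of the now-ignored pattern from the linked issue (simplified, with an added default so it runs standalone):

```python
from typing import Any

def services_of(config: dict[str, Any]) -> list[dict[str, int]]:
    # The explicit annotation signals intent, so unnecessary-assign
    # (RET504) no longer flags the assignment feeding this return.
    services: list[dict[str, int]] = []
    if "services" in config:
        services = config["services"]
    return services

assert services_of({"services": [{"n": 1}]}) == [{"n": 1}]
assert services_of({}) == []
```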
2024-04-02 16:18:05 -04:00
Aleksei Latyshev
859e3fc7fa [refurb] Implement int-on-sliced-str (FURB166) (#10650)
## Summary
Implement the `int-on-sliced-str` (FURB166) lint:
- #1348
- [original
lint](https://github.com/dosisod/refurb/blob/master/refurb/checks/builtin/use_int_base_zero.py)

## Test Plan
`cargo test`
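The rewrite the rule suggests, in one line (values illustrative):

```python
# Before: strip the base prefix manually and spell the base out.
before = int("0xFF"[2:], 16)
# After (FURB166's suggestion): keep the prefix and pass base 0 so
# int() infers the base from the literal's prefix.
after = int("0xFF", 0)
assert before == after == 255
```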
2024-04-02 19:29:42 +00:00
Aleksei Latyshev
0de23760ff [pylint] Don't recommend decorating staticmethods with @singledispatch (PLE1519, PLE1520) (#10637)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2024-04-02 16:47:31 +00:00
renovate[bot]
b90e6df5cc chore(deps): update rust crate env_logger to 0.11.0 (#10700)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-04-02 15:03:54 +02:00
Micha Reiser
0d20ec968f Build codspeed benchmarks by calling cargo directly (#10735) 2024-04-02 14:50:19 +02:00
Dhruv Manilawala
99dd3a8ab0 Implement as_str & Display for all operator enums (#10691)
## Summary

This PR adds the `as_str` implementation for all the operator methods.
It already exists for `CmpOp` which is being [used in the
linter](ffcd77860c/crates/ruff_linter/src/rules/flake8_simplify/rules/key_in_dict.rs (L117))
and it makes sense to implement it for the rest as well. This will also
be utilized in error messages for the new parser.
2024-04-02 10:34:36 +00:00
Dhruv Manilawala
eee2d5b915 Remove unused operator methods and impl (#10690)
## Summary

This PR removes unused operator methods and impl traits. There is
already the `is_macro::Is` implementation for all the operators and this
seems unnecessary.
2024-04-02 15:53:20 +05:30
Charlie Marsh
159bad73d5 Accept non-aliased (but correct) import in unconventional-import-alias (#10729)
## Summary

Given:

```toml
[tool.ruff.lint.flake8-import-conventions.aliases]
"django.conf.settings" = "settings"
```

We should accept `from django.conf import settings`.

Closes https://github.com/astral-sh/ruff/issues/10599.
2024-04-01 23:47:20 -04:00
Charlie Marsh
7b48443624 Respect Q00* ignores in flake8-quotes rules (#10728)
## Summary

We lost the per-rule ignores when these were migrated to the AST, so if
_any_ `Q` rule is enabled, they're now all enabled.

Closes https://github.com/astral-sh/ruff/issues/10724.

## Test Plan

Ran:

```shell
ruff check . --isolated --select Q --ignore Q000
ruff check . --isolated --select Q --ignore Q001
ruff check . --isolated --select Q --ignore Q002
ruff check . --isolated --select Q --ignore Q000,Q001
ruff check . --isolated --select Q --ignore Q000,Q002
ruff check . --isolated --select Q --ignore Q001,Q002
```

...against:

```python
'''
bad docsting
'''
a = 'single'
b = '''
bad multi line
'''
```
2024-04-02 03:21:12 +00:00
Charlie Marsh
d36f60999d Ignore annotated lambdas in class scopes (#10720)
## Summary

An annotated lambda assignment within a class scope is often
intentional. For example, within a dataclass or Pydantic model, these
are treated as fields rather than methods (and so can be passed values
in constructors).

I originally wrote this to special-case dataclasses and Pydantic
models... But was left feeling like we'd see more false positives here
for little gain (an annotated lambda within a `class` is likely
intentional?). Open to opinions, though.

Closes https://github.com/astral-sh/ruff/issues/10718.
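A sketch of the now-accepted pattern (names illustrative):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class FilterDataclass:
    # An annotated lambda in a class body acts as a field default here,
    # so E731 (lambda-assignment) now leaves it alone.
    filter: Callable[[str], bool] = lambda _: True

assert FilterDataclass().filter("anything") is True
```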
2024-04-01 15:44:45 -04:00
Charlie Marsh
67f0f615b2 Recursively resolve TypeDicts for N815 violations (#10719)
## Summary

Only works within a single file for now.

Closes https://github.com/astral-sh/ruff/issues/10671.
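A sketch of the single-file case this now handles (field names illustrative):

```python
from typing import TypedDict

class D(TypedDict):
    mixedCase: bool  # exempt: TypedDict fields may be mixedCase

class E(D):
    alsoMixedCase: int  # now exempt too: E resolves to a TypedDict via D

e: E = {"mixedCase": True, "alsoMixedCase": 1}
assert e["alsoMixedCase"] == 1
```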
2024-04-01 17:40:55 +00:00
115 changed files with 2317 additions and 2362 deletions

View File

@@ -525,8 +525,23 @@ jobs:
- uses: Swatinem/rust-cache@v2
# Codspeed comes with a very ancient cargo version (1.66) that resolves features flags differently than what we use now.
# This can result in build failures; see https://github.com/astral-sh/ruff/pull/10700.
# There's a pending codspeed PR to upgrade to a newer cargo version, but until that's merged, we need to use the workaround below.
# https://github.com/CodSpeedHQ/codspeed-rust/pull/31
# What we do is to call cargo build manually with the correct feature flags and RUSTC settings. We'll have to
# manually maintain the list of benchmarks to run with codspeed (the benefit is that we could detect which benchmarks to run and build based on the changes).
# This is inspired by https://github.com/oxc-project/oxc/blob/a0532adc654039a0c7ead7b35216dfa0b0cb8e8f/.github/workflows/benchmark.yml
- name: "Build benchmarks"
run: cargo codspeed build --features codspeed -p ruff_benchmark
env:
RUSTFLAGS: "-C debuginfo=2 -C strip=none -g --cfg codspeed"
shell: bash
# Build all benchmarks, copy the binary to the codspeed directory, remove any `*.d` files that might have been created.
run: |
cargo build --release -p ruff_benchmark --bench parser --bench linter --bench formatter --bench lexer --features=codspeed
mkdir -p ./target/codspeed/ruff_benchmark
cp ./target/release/deps/{lexer,parser,linter,formatter}* target/codspeed/ruff_benchmark/
rm -rf ./target/codspeed/ruff_benchmark/*.d
- name: "Run benchmarks"
uses: CodSpeedHQ/action@v2

View File

@@ -123,8 +123,8 @@ prior to merging.
Ruff is structured as a monorepo with a [flat crate structure](https://matklad.github.io/2021/08/22/large-rust-workspaces.html),
such that all crates are contained in a flat `crates` directory.
The vast majority of the code, including all lint rules, lives in the `ruff` crate (located at
`crates/ruff_linter`). As a contributor, that's the crate that'll be most relevant to you.
The vast majority of the code, including all lint rules, lives in the `ruff_linter` crate (located
at `crates/ruff_linter`). As a contributor, that's the crate that'll be most relevant to you.
At the time of writing, the repository includes the following crates:

50
Cargo.lock generated
View File

@@ -712,16 +712,26 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a357d28ed41a50f9c765dbfe56cbc04a64e53e5fc58ba79fbc34c10ef3df831f"
[[package]]
name = "env_logger"
version = "0.10.2"
name = "env_filter"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4cd405aab171cb85d6735e5c8d9db038c17d3ca007a4d2c25f337935c3d90580"
checksum = "a009aa4810eb158359dda09d0c87378e4bbb89b5a801f016885a4707ba24f7ea"
dependencies = [
"humantime",
"is-terminal",
"log",
"regex",
"termcolor",
]
[[package]]
name = "env_logger"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38b35839ba51819680ba087cd351788c9a3c476841207e0b8cee0b04722343b9"
dependencies = [
"anstream",
"anstyle",
"env_filter",
"humantime",
"log",
]
[[package]]
@@ -1347,9 +1357,9 @@ dependencies = [
[[package]]
name = "lsp-types"
version = "0.95.0"
version = "0.95.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "158c1911354ef73e8fe42da6b10c0484cb65c7f1007f28022e847706c1ab6984"
checksum = "8e34d33a8e9b006cd3fc4fe69a921affa097bae4bb65f76271f4644f9a334365"
dependencies = [
"bitflags 1.3.2",
"serde",
@@ -1550,6 +1560,12 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]]
name = "owo-colors"
version = "4.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "caff54706df99d2a78a5a4e3455ff45448d81ef1bb63c22cd14052ca0e993a3f"
[[package]]
name = "parking_lot"
version = "0.12.1"
@@ -1987,6 +2003,7 @@ dependencies = [
name = "ruff"
version = "0.3.5"
dependencies = [
"anstream",
"anyhow",
"argfile",
"bincode",
@@ -1996,7 +2013,6 @@ dependencies = [
"clap",
"clap_complete_command",
"clearscreen",
"colored",
"filetime",
"ignore",
"insta",
@@ -2007,6 +2023,7 @@ dependencies = [
"mimalloc",
"notify",
"num_cpus",
"owo-colors",
"path-absolutize",
"rayon",
"regex",
@@ -2151,11 +2168,11 @@ version = "0.3.5"
dependencies = [
"aho-corasick",
"annotate-snippets 0.9.2",
"anstream",
"anyhow",
"bitflags 2.5.0",
"chrono",
"clap",
"colored",
"fern",
"glob",
"globset",
@@ -2169,6 +2186,7 @@ dependencies = [
"memchr",
"natord",
"once_cell",
"owo-colors",
"path-absolutize",
"pathdiff",
"pep440_rs 0.5.0",
@@ -2327,7 +2345,6 @@ name = "ruff_python_parser"
version = "0.0.0"
dependencies = [
"anyhow",
"bitflags 2.5.0",
"bstr",
"insta",
"is-macro",
@@ -2484,7 +2501,6 @@ name = "ruff_workspace"
version = "0.0.0"
dependencies = [
"anyhow",
"colored",
"dirs 5.0.1",
"glob",
"globset",
@@ -2492,6 +2508,7 @@ dependencies = [
"is-macro",
"itertools 0.12.1",
"log",
"owo-colors",
"path-absolutize",
"pep440_rs 0.5.0",
"regex",
@@ -2900,15 +2917,6 @@ dependencies = [
"winapi",
]
[[package]]
name = "termcolor"
version = "1.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "06794f8f6c5c898b3275aebefa6b8a1cb24cd2c6c79397ab15774837a0bc5755"
dependencies = [
"winapi-util",
]
[[package]]
name = "terminal_size"
version = "0.3.0"

View File

@@ -14,6 +14,7 @@ license = "MIT"
[workspace.dependencies]
aho-corasick = { version = "1.1.3" }
annotate-snippets = { version = "0.9.2", features = ["color"] }
anstream = { version = "0.6.13" }
anyhow = { version = "1.0.80" }
argfile = { version = "0.1.6" }
bincode = { version = "1.3.3" }
@@ -25,7 +26,6 @@ clap = { version = "4.5.3", features = ["derive"] }
clap_complete_command = { version = "0.5.1" }
clearscreen = { version = "2.0.0" }
codspeed-criterion-compat = { version = "2.4.0", default-features = false }
colored = { version = "2.1.0" }
console_error_panic_hook = { version = "0.1.7" }
console_log = { version = "1.0.0" }
countme = { version = "3.0.1" }
@@ -33,7 +33,7 @@ criterion = { version = "0.5.1", default-features = false }
crossbeam = { version = "0.8.4" }
dirs = { version = "5.0.0" }
drop_bomb = { version = "0.1.5" }
env_logger = { version = "0.10.1" }
env_logger = { version = "0.11.0" }
fern = { version = "0.6.1" }
filetime = { version = "0.2.23" }
fs-err = { version = "2.11.0" }
@@ -65,6 +65,7 @@ natord = { version = "1.0.9" }
notify = { version = "6.1.1" }
num_cpus = { version = "1.16.0" }
once_cell = { version = "1.19.0" }
owo-colors = { version = "4.0.0" }
path-absolutize = { version = "3.1.1" }
pathdiff = { version = "0.2.1" }
pep440_rs = { version = "0.5.0", features = ["serde"] }

View File

@@ -25,6 +25,7 @@ ruff_source_file = { path = "../ruff_source_file" }
ruff_text_size = { path = "../ruff_text_size" }
ruff_workspace = { path = "../ruff_workspace" }
anstream = { workspace = true }
anyhow = { workspace = true }
argfile = { workspace = true }
bincode = { workspace = true }
@@ -34,7 +35,6 @@ chrono = { workspace = true }
clap = { workspace = true, features = ["derive", "env", "wrap_help"] }
clap_complete_command = { workspace = true }
clearscreen = { workspace = true }
colored = { workspace = true }
filetime = { workspace = true }
ignore = { workspace = true }
is-macro = { workspace = true }
@@ -42,6 +42,7 @@ itertools = { workspace = true }
log = { workspace = true }
notify = { workspace = true }
num_cpus = { workspace = true }
owo-colors = { workspace = true }
path-absolutize = { workspace = true, features = ["once_cell_cache"] }
rayon = { workspace = true }
regex = { workspace = true }
@@ -62,8 +63,6 @@ wild = { workspace = true }
[dev-dependencies]
# Enable test rules during development
ruff_linter = { path = "../ruff_linter", features = ["clap", "test-rules"] }
# Avoid writing colored snapshots when running tests from the terminal
colored = { workspace = true, features = ["no-color"] }
insta = { workspace = true, features = ["filters", "json"] }
insta-cmd = { workspace = true }
tempfile = { workspace = true }

View File

@@ -8,7 +8,7 @@ use std::sync::Arc;
use anyhow::bail;
use clap::builder::{TypedValueParser, ValueParserFactory};
use clap::{command, Parser};
use colored::Colorize;
use owo_colors::OwoColorize;
use path_absolutize::path_dedot;
use regex::Regex;
use rustc_hash::FxHashMap;
@@ -1053,7 +1053,6 @@ pub enum FormatRangeParseError {
impl std::fmt::Display for FormatRangeParseError {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
let tip = " tip:".bold().green();
match self {
FormatRangeParseError::StartGreaterThanEnd(start, end) => {
write!(
@@ -1062,7 +1061,8 @@ impl std::fmt::Display for FormatRangeParseError {
start_invalid=start.to_string().bold().yellow(),
end_invalid=end.to_string().bold().yellow(),
start=start.to_string().green().bold(),
end=end.to_string().green().bold()
end=end.to_string().green().bold(),
tip=" tip:".bold().green()
)
}
FormatRangeParseError::InvalidStart(inner) => inner.write(f, true),
@@ -1148,7 +1148,8 @@ pub enum LineColumnParseError {
impl LineColumnParseError {
fn write(&self, f: &mut std::fmt::Formatter, start_range: bool) -> std::fmt::Result {
let tip = "tip:".bold().green();
let tip = "tip:".bold();
let tip = tip.green();
let range = if start_range { "start" } else { "end" };

View File

@@ -4,9 +4,9 @@ use std::path::{Path, PathBuf};
use std::time::Instant;
use anyhow::Result;
use colored::Colorize;
use ignore::Error;
use log::{debug, error, warn};
use owo_colors::OwoColorize;
#[cfg(not(target_family = "wasm"))]
use rayon::prelude::*;
use rustc_hash::FxHashMap;
@@ -226,6 +226,7 @@ mod test {
use rustc_hash::FxHashMap;
use tempfile::TempDir;
use ruff_linter::colors;
use ruff_linter::message::{Emitter, EmitterContext, TextEmitter};
use ruff_linter::registry::Rule;
use ruff_linter::settings::types::UnsafeFixes;
@@ -280,7 +281,7 @@ mod test {
UnsafeFixes::Enabled,
)
.unwrap();
let mut output = Vec::new();
let mut output = colors::none(Vec::new());
TextEmitter::default()
.with_show_fix_status(true)
@@ -291,7 +292,7 @@ mod test {
)
.unwrap();
let messages = String::from_utf8(output).unwrap();
let messages = String::from_utf8(output.into_inner()).unwrap();
insta::with_settings!({
omit_expression => true,

View File

@@ -2,7 +2,7 @@ use std::fs::remove_dir_all;
use std::io::{self, BufWriter, Write};
use anyhow::Result;
use colored::Colorize;
use owo_colors::OwoColorize;
use path_absolutize::path_dedot;
use walkdir::WalkDir;

View File

@@ -6,9 +6,9 @@ use std::path::{Path, PathBuf};
use std::time::Instant;
use anyhow::Result;
use colored::Colorize;
use itertools::Itertools;
use log::{error, warn};
use owo_colors::OwoColorize;
use rayon::iter::Either::{Left, Right};
use rayon::iter::{IntoParallelRefIterator, ParallelIterator};
use rustc_hash::FxHashSet;
@@ -16,6 +16,7 @@ use thiserror::Error;
use tracing::debug;
use ruff_diagnostics::SourceMap;
use ruff_linter::colors;
use ruff_linter::fs;
use ruff_linter::logging::{DisplayParseError, LogLevel};
use ruff_linter::registry::Rule;
@@ -189,10 +190,10 @@ pub(crate) fn format(
match mode {
FormatMode::Write => {}
FormatMode::Check => {
results.write_changed(&mut stdout().lock())?;
results.write_changed(&mut colors::auto(stdout()).lock())?;
}
FormatMode::Diff => {
results.write_diff(&mut stdout().lock())?;
results.write_diff(&mut colors::auto(stdout()).lock())?;
}
}
@@ -200,9 +201,9 @@ pub(crate) fn format(
if config_arguments.log_level >= LogLevel::Default {
if mode.is_diff() {
// Allow piping the diff to e.g. a file by writing the summary to stderr
results.write_summary(&mut stderr().lock())?;
results.write_summary(&mut colors::auto(stderr()).lock())?;
} else {
results.write_summary(&mut stdout().lock())?;
results.write_summary(&mut colors::auto(stdout()).lock())?;
}
}

View File

@@ -4,6 +4,7 @@ use std::path::Path;
use anyhow::Result;
use log::error;
use ruff_linter::colors;
use ruff_linter::source_kind::SourceKind;
use ruff_python_ast::{PySourceType, SourceType};
use ruff_workspace::resolver::{match_exclusion, python_file_at_path, Resolver};
@@ -111,7 +112,7 @@ fn format_source_code(
match &formatted {
FormattedSource::Formatted(formatted) => match mode {
FormatMode::Write => {
let mut writer = stdout().lock();
let mut writer = colors::auto(stdout()).lock();
formatted
.write(&mut writer)
.map_err(|err| FormatCommandError::Write(path.map(Path::to_path_buf), err))?;
@@ -120,7 +121,7 @@ fn format_source_code(
FormatMode::Diff => {
use std::io::Write;
write!(
&mut stdout().lock(),
&mut colors::auto(stdout()).lock(),
"{}",
source_kind.diff(formatted, path).unwrap()
)
@@ -130,7 +131,7 @@ fn format_source_code(
FormattedSource::Unchanged => {
// Write to stdout regardless of whether the source was formatted
if mode.is_write() {
let mut writer = stdout().lock();
let mut writer = colors::auto(stdout()).lock();
source_kind
.write(&mut writer)
.map_err(|err| FormatCommandError::Write(path.map(Path::to_path_buf), err))?;

View File

@@ -8,11 +8,12 @@ use std::ops::{Add, AddAssign};
use std::path::Path;
use anyhow::{Context, Result};
use colored::Colorize;
use log::{debug, error, warn};
use owo_colors::OwoColorize;
use rustc_hash::FxHashMap;
use ruff_diagnostics::Diagnostic;
use ruff_linter::colors;
use ruff_linter::linter::{lint_fix, lint_only, FixTable, FixerResult, LinterResult, ParseSource};
use ruff_linter::logging::DisplayParseError;
use ruff_linter::message::Message;
@@ -444,7 +445,7 @@ pub(crate) fn lint_stdin(
// But only write a diff if it's non-empty.
if !fixed.is_empty() {
write!(
&mut io::stdout().lock(),
&mut colors::auto(io::stdout()).lock(),
"{}",
source_kind.diff(&transformed, path).unwrap()
)?;

View File

@@ -10,10 +10,11 @@ use std::sync::mpsc::channel;
use anyhow::Result;
use args::{GlobalConfigArgs, ServerCommand};
use clap::CommandFactory;
use colored::Colorize;
use log::warn;
use notify::{recommended_watcher, RecursiveMode, Watcher};
use owo_colors::OwoColorize;
use ruff_linter::colors;
use ruff_linter::logging::{set_up_logging, LogLevel};
use ruff_linter::settings::flags::FixMode;
use ruff_linter::settings::types::SerializationFormat;
@@ -145,10 +146,6 @@ pub fn run(
}));
}
// Enabled ANSI colors on Windows 10.
#[cfg(windows)]
assert!(colored::control::set_virtual_terminal(true).is_ok());
set_up_logging(global_options.log_level())?;
if let Some(deprecated_alias_warning) = deprecated_alias_warning {
@@ -225,13 +222,12 @@ pub fn check(args: CheckCommand, global_options: GlobalConfigArgs) -> Result<Exi
let mut writer: Box<dyn Write> = match cli.output_file {
Some(path) if !cli.watch => {
colored::control::set_override(false);
let file = File::create(path)?;
Box::new(BufWriter::new(file))
Box::new(BufWriter::new(colors::none(file)))
}
_ => Box::new(BufWriter::new(io::stdout())),
_ => Box::new(BufWriter::new(colors::auto(io::stdout()))),
};
let stderr_writer = Box::new(BufWriter::new(io::stderr()));
let stderr_writer = Box::new(BufWriter::new(colors::auto(io::stderr())));
let is_stdin = is_stdin(&cli.files, cli.stdin_filename.as_deref());
let files = resolve_default_files(cli.files, is_stdin);

View File

@@ -1,7 +1,8 @@
use std::process::ExitCode;
use anstream::eprintln;
use clap::{Parser, Subcommand};
use colored::Colorize;
use owo_colors::OwoColorize;
use ruff::args::{Args, Command};
use ruff::{run, ExitStatus};

View File

@@ -5,8 +5,8 @@ use std::io::Write;
use anyhow::Result;
use bitflags::bitflags;
use colored::Colorize;
use itertools::{iterate, Itertools};
use owo_colors::OwoColorize;
use serde::Serialize;
use ruff_linter::fs::relativize_path;

View File

@@ -30,11 +30,11 @@ ruff_text_size = { path = "../ruff_text_size" }
aho-corasick = { workspace = true }
annotate-snippets = { workspace = true, features = ["color"] }
anstream = { workspace = true }
anyhow = { workspace = true }
bitflags = { workspace = true }
chrono = { workspace = true }
clap = { workspace = true, features = ["derive", "string"], optional = true }
colored = { workspace = true }
fern = { workspace = true }
glob = { workspace = true }
globset = { workspace = true }
@@ -47,6 +47,7 @@ log = { workspace = true }
memchr = { workspace = true }
natord = { workspace = true }
once_cell = { workspace = true }
owo-colors = { workspace = true }
path-absolutize = { workspace = true, features = [
"once_cell_cache",
"use_unix_paths_on_wasm",
@@ -75,8 +76,6 @@ url = { workspace = true }
[dev-dependencies]
insta = { workspace = true }
test-case = { workspace = true }
# Disable colored output in tests
colored = { workspace = true, features = ["no-color"] }
[features]
default = []

View File

@@ -13,6 +13,10 @@ all(x.id for x in bar)
all(x.id for x in bar)
any(x.id for x in bar)
all((x.id for x in bar))
# we don't lint on these in stable yet
sum([x.val for x in bar])
min([x.val for x in bar])
max([x.val for x in bar])
async def f() -> bool:

View File

@@ -0,0 +1,8 @@
sum([x.val for x in bar])
min([x.val for x in bar])
max([x.val for x in bar])
# Ok
sum(x.val for x in bar)
min(x.val for x in bar)
max(x.val for x in bar)

View File

@@ -0,0 +1,3 @@
# no lint if shadowed
def all(x): pass
all([x.id for x in bar])

View File

@@ -21,6 +21,7 @@ def unconventional_aliases():
import tkinter as tkr
import networkx as nxy
def conventional_aliases():
import altair as alt
import matplotlib.pyplot as plt

View File

@@ -0,0 +1,10 @@
def no_alias():
from django.conf import settings
def conventional_alias():
from django.conf import settings as settings
def unconventional_alias():
from django.conf import settings as s

View File

@@ -0,0 +1,7 @@
"""This is a docstring."""
this_is_an_inline_string = "double quote string"
this_is_a_multiline_string = """
double quote string
"""

View File

@@ -406,3 +406,18 @@ def foo():
with contextlib.suppress(Exception):
y = 2
return y
# See: https://github.com/astral-sh/ruff/issues/10732
def func(a: dict[str, int]) -> list[dict[str, int]]:
services: list[dict[str, int]]
if "services" in a:
services = a["services"]
return services
# See: https://github.com/astral-sh/ruff/issues/10732
def func(a: dict[str, int]) -> list[dict[str, int]]:
if "services" in a:
services = a["services"]
return services

View File

@@ -21,3 +21,10 @@ class D(TypedDict):
mixedCase: bool
_mixedCase: list
mixed_Case: set
class E(D):
lower: int
CONSTANT: str
mixedCase: bool
_mixedCase: list
mixed_Case: set

View File

@@ -60,7 +60,7 @@ class Scope:
class Scope:
from typing import Callable
# E731
# OK
f: Callable[[int], int] = lambda x: 2 * x
@@ -147,3 +147,12 @@ def scope():
f = lambda: (
i := 1,
)
from dataclasses import dataclass
from typing import Callable
@dataclass
class FilterDataclass:
# OK
filter: Callable[[str], bool] = lambda _: True

View File

@@ -20,7 +20,7 @@ class Board:
def place(self, position):
pass
@singledispatch
@singledispatch # [singledispatch-method]
@staticmethod
def do(position):
pass

View File

@@ -17,7 +17,7 @@ class Board:
def move(self, position):
pass
@singledispatchmethod # [singledispatchmethod-function]
@singledispatchmethod # Ok
@staticmethod
def do(position):
pass

View File

@@ -0,0 +1,29 @@
# Errors
_ = int("0b1010"[2:], 2)
_ = int("0o777"[2:], 8)
_ = int("0xFFFF"[2:], 16)
b = "0b11"
_ = int(b[2:], 2)
_ = int("0xFFFF"[2:], base=16)
_ = int(b"0xFFFF"[2:], 16)
def get_str():
return "0xFFF"
_ = int(get_str()[2:], 16)
# OK
_ = int("0b1100", 0)
_ = int("123", 3)
_ = int("123", 10)
_ = int("0b1010"[3:], 2)
_ = int("0b1010"[:2], 2)
_ = int("12345"[2:])
_ = int("12345"[2:], xyz=1) # type: ignore

View File

@@ -223,14 +223,10 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
}
if checker.enabled(Rule::MixedCaseVariableInClassScope) {
if let ScopeKind::Class(ast::StmtClassDef { arguments, .. }) =
&checker.semantic.current_scope().kind
if let ScopeKind::Class(class_def) = &checker.semantic.current_scope().kind
{
pep8_naming::rules::mixed_case_variable_in_class_scope(
checker,
expr,
id,
arguments.as_deref(),
checker, expr, id, class_def,
);
}
}
@@ -709,8 +705,8 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
args,
);
}
if checker.enabled(Rule::UnnecessaryComprehensionAnyAll) {
flake8_comprehensions::rules::unnecessary_comprehension_any_all(
if checker.enabled(Rule::UnnecessaryComprehensionInCall) {
flake8_comprehensions::rules::unnecessary_comprehension_in_call(
checker, expr, func, args, keywords,
);
}
@@ -977,6 +973,9 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
if checker.enabled(Rule::UnnecessaryIterableAllocationForFirstElement) {
ruff::rules::unnecessary_iterable_allocation_for_first_element(checker, expr);
}
if checker.enabled(Rule::IntOnSlicedStr) {
refurb::rules::int_on_sliced_str(checker, call);
}
}
Expr::Dict(dict) => {
if checker.any_enabled(&[

View File

@@ -398,7 +398,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Flake8Comprehensions, "16") => (RuleGroup::Stable, rules::flake8_comprehensions::rules::UnnecessaryComprehension),
(Flake8Comprehensions, "17") => (RuleGroup::Stable, rules::flake8_comprehensions::rules::UnnecessaryMap),
(Flake8Comprehensions, "18") => (RuleGroup::Stable, rules::flake8_comprehensions::rules::UnnecessaryLiteralWithinDictCall),
(Flake8Comprehensions, "19") => (RuleGroup::Stable, rules::flake8_comprehensions::rules::UnnecessaryComprehensionAnyAll),
(Flake8Comprehensions, "19") => (RuleGroup::Stable, rules::flake8_comprehensions::rules::UnnecessaryComprehensionInCall),
// flake8-debugger
(Flake8Debugger, "0") => (RuleGroup::Stable, rules::flake8_debugger::rules::Debugger),
@@ -1053,6 +1053,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Refurb, "161") => (RuleGroup::Preview, rules::refurb::rules::BitCount),
(Refurb, "163") => (RuleGroup::Preview, rules::refurb::rules::RedundantLogBase),
(Refurb, "164") => (RuleGroup::Preview, rules::refurb::rules::UnnecessaryFromFloat),
(Refurb, "166") => (RuleGroup::Preview, rules::refurb::rules::IntOnSlicedStr),
(Refurb, "167") => (RuleGroup::Preview, rules::refurb::rules::RegexFlagAlias),
(Refurb, "168") => (RuleGroup::Preview, rules::refurb::rules::IsinstanceTypeNone),
(Refurb, "169") => (RuleGroup::Preview, rules::refurb::rules::TypeNoneComparison),

View File

@@ -0,0 +1,19 @@
use anstream::stream::RawStream;
use anstream::{AutoStream, ColorChoice};
pub fn none<S: RawStream>(stream: S) -> AutoStream<S> {
AutoStream::new(stream, ColorChoice::Never)
}
pub fn auto<S: RawStream>(stream: S) -> AutoStream<S> {
let choice = choice(&stream);
AutoStream::new(stream, choice)
}
pub fn choice<S: RawStream>(stream: &S) -> ColorChoice {
AutoStream::choice(stream)
}
pub fn enabled<S: RawStream>(stream: &S) -> bool {
choice(stream) != ColorChoice::Never
}


@@ -16,6 +16,7 @@ pub const VERSION: &str = env!("CARGO_PKG_VERSION");
mod checkers;
pub mod codes;
pub mod colors;
mod comments;
mod cst;
pub mod directives;


@@ -2,10 +2,11 @@ use std::borrow::Cow;
use std::ops::Deref;
use std::path::Path;
use anstream::eprintln;
use anyhow::{anyhow, Result};
use colored::Colorize;
use itertools::Itertools;
use log::error;
use owo_colors::OwoColorize;
use rustc_hash::FxHashMap;
use ruff_diagnostics::Diagnostic;


@@ -3,15 +3,16 @@ use std::path::{Path, PathBuf};
use std::sync::Mutex;
use anyhow::Result;
use colored::Colorize;
use fern;
use log::Level;
use once_cell::sync::Lazy;
use owo_colors::OwoColorize;
use ruff_python_parser::{ParseError, ParseErrorType};
use rustc_hash::FxHashSet;
use ruff_source_file::{LineIndex, OneIndexed, SourceCode, SourceLocation};
use crate::colors;
use crate::fs;
use crate::source_kind::SourceKind;
use ruff_notebook::Notebook;
@@ -22,7 +23,7 @@ pub static IDENTIFIERS: Lazy<Mutex<Vec<&'static str>>> = Lazy::new(Mutex::defaul
#[macro_export]
macro_rules! warn_user_once_by_id {
($id:expr, $($arg:tt)*) => {
use colored::Colorize;
use owo_colors::OwoColorize;
use log::warn;
if let Ok(mut states) = $crate::logging::IDENTIFIERS.lock() {
@@ -42,7 +43,7 @@ pub static MESSAGES: Lazy<Mutex<FxHashSet<String>>> = Lazy::new(Mutex::default);
#[macro_export]
macro_rules! warn_user_once_by_message {
($($arg:tt)*) => {
use colored::Colorize;
use owo_colors::OwoColorize;
use log::warn;
if let Ok(mut states) = $crate::logging::MESSAGES.lock() {
@@ -59,7 +60,7 @@ macro_rules! warn_user_once_by_message {
#[macro_export]
macro_rules! warn_user_once {
($($arg:tt)*) => {
use colored::Colorize;
use owo_colors::OwoColorize;
use log::warn;
static WARNED: std::sync::atomic::AtomicBool = std::sync::atomic::AtomicBool::new(false);
@@ -73,7 +74,7 @@ macro_rules! warn_user_once {
#[macro_export]
macro_rules! warn_user {
($($arg:tt)*) => {{
use colored::Colorize;
use owo_colors::OwoColorize;
use log::warn;
let message = format!("{}", format_args!($($arg)*));
@@ -152,7 +153,10 @@ pub fn set_up_logging(level: LogLevel) -> Result<()> {
})
.level(level.level_filter())
.level_for("globset", log::LevelFilter::Warn)
.chain(std::io::stderr())
.chain(fern::Output::writer(
Box::new(colors::auto(std::io::stderr())),
"\n",
))
.apply()?;
Ok(())
}


@@ -1,7 +1,7 @@
use std::fmt::{Display, Formatter};
use std::num::NonZeroUsize;
use colored::{Color, ColoredString, Colorize, Styles};
use owo_colors::{OwoColorize, Style};
use ruff_text_size::{Ranged, TextRange, TextSize};
use similar::{ChangeTag, TextDiff};
@@ -81,7 +81,7 @@ impl Display for Diff<'_> {
ChangeTag::Equal => " ",
};
let line_style = LineStyle::from(change.tag());
let line_style = diff_line_style(change.tag());
let old_index = change.old_index().map(OneIndexed::from_zero_indexed);
let new_index = change.new_index().map(OneIndexed::from_zero_indexed);
@@ -97,14 +97,14 @@ impl Display for Diff<'_> {
index: new_index,
width: digit_with
},
line_style.apply_to(sign).bold()
sign.style(line_style).bold()
)?;
for (emphasized, value) in change.iter_strings_lossy() {
if emphasized {
write!(f, "{}", line_style.apply_to(&value).underline().on_black())?;
write!(f, "{}", value.style(line_style).underline().on_black())?;
} else {
write!(f, "{}", line_style.apply_to(&value))?;
write!(f, "{}", value.style(line_style))?;
}
}
if change.missing_newline() {
@@ -118,52 +118,11 @@ impl Display for Diff<'_> {
}
}
struct LineStyle {
fgcolor: Option<Color>,
style: Option<Styles>,
}
impl LineStyle {
fn apply_to(&self, input: &str) -> ColoredString {
let mut colored = ColoredString::from(input);
if let Some(color) = self.fgcolor {
colored = colored.color(color);
}
if let Some(style) = self.style {
match style {
Styles::Clear => colored.clear(),
Styles::Bold => colored.bold(),
Styles::Dimmed => colored.dimmed(),
Styles::Underline => colored.underline(),
Styles::Reversed => colored.reversed(),
Styles::Italic => colored.italic(),
Styles::Blink => colored.blink(),
Styles::Hidden => colored.hidden(),
Styles::Strikethrough => colored.strikethrough(),
}
} else {
colored
}
}
}
impl From<ChangeTag> for LineStyle {
fn from(value: ChangeTag) -> Self {
match value {
ChangeTag::Equal => LineStyle {
fgcolor: None,
style: Some(Styles::Dimmed),
},
ChangeTag::Delete => LineStyle {
fgcolor: Some(Color::Red),
style: None,
},
ChangeTag::Insert => LineStyle {
fgcolor: Some(Color::Green),
style: None,
},
}
fn diff_line_style(change_tag: ChangeTag) -> Style {
match change_tag {
ChangeTag::Equal => Style::new().dimmed(),
ChangeTag::Delete => Style::new().red(),
ChangeTag::Insert => Style::new().green(),
}
}


@@ -2,7 +2,7 @@ use std::fmt::{Display, Formatter};
use std::io::Write;
use std::num::NonZeroUsize;
use colored::Colorize;
use owo_colors::OwoColorize;
use ruff_notebook::NotebookIndex;
use ruff_source_file::OneIndexed;


@@ -156,6 +156,7 @@ mod tests {
use ruff_source_file::{OneIndexed, SourceFileBuilder};
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::colors;
use crate::message::{Emitter, EmitterContext, Message};
pub(super) fn create_messages() -> Vec<Message> {
@@ -337,10 +338,10 @@ def foo():
) -> String {
let notebook_indexes = FxHashMap::default();
let context = EmitterContext::new(&notebook_indexes);
let mut output: Vec<u8> = Vec::new();
let mut output = colors::none(Vec::new());
emitter.emit(&mut output, messages, &context).unwrap();
String::from_utf8(output).expect("Output to be valid UTF-8")
String::from_utf8(output.into_inner()).expect("Output to be valid UTF-8")
}
pub(super) fn capture_emitter_notebook_output(
@@ -349,9 +350,9 @@ def foo():
notebook_indexes: &FxHashMap<String, NotebookIndex>,
) -> String {
let context = EmitterContext::new(notebook_indexes);
let mut output: Vec<u8> = Vec::new();
let mut output = colors::none(Vec::new());
emitter.emit(&mut output, messages, &context).unwrap();
String::from_utf8(output).expect("Output to be valid UTF-8")
String::from_utf8(output.into_inner()).expect("Output to be valid UTF-8")
}
}


@@ -5,12 +5,13 @@ use std::io::Write;
use annotate_snippets::display_list::{DisplayList, FormatOptions};
use annotate_snippets::snippet::{Annotation, AnnotationType, Slice, Snippet, SourceAnnotation};
use bitflags::bitflags;
use colored::Colorize;
use owo_colors::OwoColorize;
use ruff_notebook::NotebookIndex;
use ruff_source_file::{OneIndexed, SourceLocation};
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::colors;
use crate::fs::relativize_path;
use crate::line_width::{IndentWidth, LineWidthBuilder};
use crate::message::diff::Diff;
@@ -284,10 +285,7 @@ impl Display for MessageCodeFrame<'_> {
}],
footer,
opt: FormatOptions {
#[cfg(test)]
color: false,
#[cfg(not(test))]
color: colored::control::SHOULD_COLORIZE.should_colorize(),
color: colors::enabled(&std::io::stdout()),
..FormatOptions::default()
},
};


@@ -1,5 +1,5 @@
use colored::Colorize;
use log::warn;
use owo_colors::OwoColorize;
use pyproject_toml::PyProjectToml;
use ruff_text_size::{TextRange, TextSize};


@@ -793,7 +793,7 @@ pub(crate) fn fix_unnecessary_map(
}
/// (C419) Convert `[i for i in a]` into `i for i in a`
pub(crate) fn fix_unnecessary_comprehension_any_all(
pub(crate) fn fix_unnecessary_comprehension_in_call(
expr: &Expr,
locator: &Locator,
stylist: &Stylist,


@@ -12,13 +12,15 @@ mod tests {
use crate::assert_messages;
use crate::registry::Rule;
use crate::settings::types::PreviewMode;
use crate::settings::LinterSettings;
use crate::test::test_path;
#[test_case(Rule::UnnecessaryCallAroundSorted, Path::new("C413.py"))]
#[test_case(Rule::UnnecessaryCollectionCall, Path::new("C408.py"))]
#[test_case(Rule::UnnecessaryComprehension, Path::new("C416.py"))]
#[test_case(Rule::UnnecessaryComprehensionAnyAll, Path::new("C419.py"))]
#[test_case(Rule::UnnecessaryComprehensionInCall, Path::new("C419.py"))]
#[test_case(Rule::UnnecessaryComprehensionInCall, Path::new("C419_2.py"))]
#[test_case(Rule::UnnecessaryDoubleCastOrProcess, Path::new("C414.py"))]
#[test_case(Rule::UnnecessaryGeneratorDict, Path::new("C402.py"))]
#[test_case(Rule::UnnecessaryGeneratorList, Path::new("C400.py"))]
@@ -43,6 +45,24 @@ mod tests {
Ok(())
}
#[test_case(Rule::UnnecessaryComprehensionInCall, Path::new("C419_1.py"))]
fn preview_rules(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!(
"preview__{}_{}",
rule_code.noqa_code(),
path.to_string_lossy()
);
let diagnostics = test_path(
Path::new("flake8_comprehensions").join(path).as_path(),
&LinterSettings {
preview: PreviewMode::Enabled,
..LinterSettings::for_rule(rule_code)
},
)?;
assert_messages!(snapshot, diagnostics);
Ok(())
}
#[test_case(Rule::UnnecessaryCollectionCall, Path::new("C408.py"))]
fn allow_dict_calls_with_keyword_arguments(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!(


@@ -1,7 +1,7 @@
pub(crate) use unnecessary_call_around_sorted::*;
pub(crate) use unnecessary_collection_call::*;
pub(crate) use unnecessary_comprehension::*;
pub(crate) use unnecessary_comprehension_any_all::*;
pub(crate) use unnecessary_comprehension_in_call::*;
pub(crate) use unnecessary_double_cast_or_process::*;
pub(crate) use unnecessary_generator_dict::*;
pub(crate) use unnecessary_generator_list::*;
@@ -21,7 +21,7 @@ mod helpers;
mod unnecessary_call_around_sorted;
mod unnecessary_collection_call;
mod unnecessary_comprehension;
mod unnecessary_comprehension_any_all;
mod unnecessary_comprehension_in_call;
mod unnecessary_double_cast_or_process;
mod unnecessary_generator_dict;
mod unnecessary_generator_list;


@@ -11,15 +11,18 @@ use crate::checkers::ast::Checker;
use crate::rules::flake8_comprehensions::fixes;
/// ## What it does
/// Checks for unnecessary list comprehensions passed to `any` and `all`.
/// Checks for unnecessary list comprehensions passed to builtin functions that take an iterable.
///
/// ## Why is this bad?
/// `any` and `all` take any iterators, including generators. Converting a generator to a list
/// by way of a list comprehension is unnecessary and reduces performance due to the
/// overhead of creating the list.
/// Many builtin functions (this rule currently covers `any`, `all`, `min`, `max`, and `sum`) take
/// any iterable, including a generator. Constructing a temporary list via list comprehension is
/// unnecessary and wastes memory for large iterables.
///
/// For example, compare the performance of `all` with a list comprehension against that
/// of a generator (~40x faster here):
/// `any` and `all` can also short-circuit iteration, saving a lot of time. The unnecessary
/// comprehension forces a full iteration of the input iterable, giving up the benefits of
/// short-circuiting. For example, compare the performance of `all` with a list comprehension
/// against that of a generator in a case where an early short-circuit is possible (almost 40x
/// faster):
///
/// ```console
/// In [1]: %timeit all([i for i in range(1000)])
@@ -29,26 +32,41 @@ use crate::rules::flake8_comprehensions::fixes;
/// 212 ns ± 0.892 ns per loop (mean ± std. dev. of 7 runs, 1,000,000 loops each)
/// ```
///
/// This performance improvement is due to short-circuiting. If the entire iterable has to be
/// traversed, the comprehension version may even be a bit faster: list allocation overhead is not
/// necessarily greater than generator overhead.
///
/// Applying this rule simplifies the code and will usually save memory, but in the absence of
/// short-circuiting it may not improve performance. (It may even slightly regress performance,
/// though the difference will usually be small.)
///
/// ## Examples
/// ```python
/// any([x.id for x in bar])
/// all([x.id for x in bar])
/// sum([x.val for x in bar])
/// min([x.val for x in bar])
/// max([x.val for x in bar])
/// ```
///
/// Use instead:
/// ```python
/// any(x.id for x in bar)
/// all(x.id for x in bar)
/// sum(x.val for x in bar)
/// min(x.val for x in bar)
/// max(x.val for x in bar)
/// ```
///
/// ## Fix safety
/// This rule's fix is marked as unsafe, as it may occasionally drop comments
/// when rewriting the comprehension. In most cases, though, comments will be
/// preserved.
/// This rule's fix is marked as unsafe, as it can change the behavior of the code if the iteration
/// has side effects (due to laziness and short-circuiting). The fix may also drop comments when
/// rewriting some comprehensions.
///
#[violation]
pub struct UnnecessaryComprehensionAnyAll;
pub struct UnnecessaryComprehensionInCall;
impl Violation for UnnecessaryComprehensionAnyAll {
impl Violation for UnnecessaryComprehensionInCall {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
#[derive_message_formats]
@@ -62,7 +80,7 @@ impl Violation for UnnecessaryComprehensionAnyAll {
}
/// C419
pub(crate) fn unnecessary_comprehension_any_all(
pub(crate) fn unnecessary_comprehension_in_call(
checker: &mut Checker,
expr: &Expr,
func: &Expr,
@@ -72,10 +90,13 @@ pub(crate) fn unnecessary_comprehension_any_all(
if !keywords.is_empty() {
return;
}
let Expr::Name(ast::ExprName { id, .. }) = func else {
return;
};
if !matches!(id.as_str(), "all" | "any") {
if !(matches!(id.as_str(), "any" | "all")
|| (checker.settings.preview.is_enabled() && matches!(id.as_str(), "sum" | "min" | "max")))
{
return;
}
let [arg] = args else {
@@ -93,9 +114,9 @@ pub(crate) fn unnecessary_comprehension_any_all(
return;
}
let mut diagnostic = Diagnostic::new(UnnecessaryComprehensionAnyAll, arg.range());
let mut diagnostic = Diagnostic::new(UnnecessaryComprehensionInCall, arg.range());
diagnostic.try_set_fix(|| {
fixes::fix_unnecessary_comprehension_any_all(expr, checker.locator(), checker.stylist())
fixes::fix_unnecessary_comprehension_in_call(expr, checker.locator(), checker.stylist())
});
checker.diagnostics.push(diagnostic);
}
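The short-circuiting argument from the doc comment above can be demonstrated directly. A minimal sketch (the `is_positive` helper and counts are illustrative, not part of the rule):

```python
# Why C419 flags `any([...])`: the list comprehension evaluates every element
# before `any` even runs, while the generator form lets `any` stop at the
# first truthy result.
evaluated = []

def is_positive(n):
    evaluated.append(n)
    return n > 0

any([is_positive(n) for n in range(1000)])  # list built first: all 1000 calls
eager_calls = len(evaluated)

evaluated.clear()
any(is_positive(n) for n in range(1000))    # stops as soon as n == 1 is truthy
lazy_calls = len(evaluated)

print(eager_calls, lazy_calls)
```

With side-effecting predicates like this one, the fix also changes how many elements get evaluated, which is why the fix is marked unsafe.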


@@ -98,67 +98,65 @@ C419.py:9:5: C419 [*] Unnecessary list comprehension
11 11 | # OK
12 12 | all(x.id for x in bar)
C419.py:24:5: C419 [*] Unnecessary list comprehension
C419.py:28:5: C419 [*] Unnecessary list comprehension
|
22 | # Special comment handling
23 | any(
24 | [ # lbracket comment
26 | # Special comment handling
27 | any(
28 | [ # lbracket comment
| _____^
25 | | # second line comment
26 | | i.bit_count()
27 | | # random middle comment
28 | | for i in range(5) # rbracket comment
29 | | ] # rpar comment
29 | | # second line comment
30 | | i.bit_count()
31 | | # random middle comment
32 | | for i in range(5) # rbracket comment
33 | | ] # rpar comment
| |_____^ C419
30 | # trailing comment
31 | )
34 | # trailing comment
35 | )
|
= help: Remove unnecessary list comprehension
Unsafe fix
21 21 |
22 22 | # Special comment handling
23 23 | any(
24 |- [ # lbracket comment
25 |- # second line comment
26 |- i.bit_count()
24 |+ # lbracket comment
25 |+ # second line comment
26 |+ i.bit_count()
27 27 | # random middle comment
28 |- for i in range(5) # rbracket comment
29 |- ] # rpar comment
28 |+ for i in range(5) # rbracket comment # rpar comment
30 29 | # trailing comment
31 30 | )
32 31 |
25 25 |
26 26 | # Special comment handling
27 27 | any(
28 |- [ # lbracket comment
29 |- # second line comment
30 |- i.bit_count()
28 |+ # lbracket comment
29 |+ # second line comment
30 |+ i.bit_count()
31 31 | # random middle comment
32 |- for i in range(5) # rbracket comment
33 |- ] # rpar comment
32 |+ for i in range(5) # rbracket comment # rpar comment
34 33 | # trailing comment
35 34 | )
36 35 |
C419.py:35:5: C419 [*] Unnecessary list comprehension
C419.py:39:5: C419 [*] Unnecessary list comprehension
|
33 | # Weird case where the function call, opening bracket, and comment are all
34 | # on the same line.
35 | any([ # lbracket comment
37 | # Weird case where the function call, opening bracket, and comment are all
38 | # on the same line.
39 | any([ # lbracket comment
| _____^
36 | | # second line comment
37 | | i.bit_count() for i in range(5) # rbracket comment
38 | | ] # rpar comment
40 | | # second line comment
41 | | i.bit_count() for i in range(5) # rbracket comment
42 | | ] # rpar comment
| |_____^ C419
39 | )
43 | )
|
= help: Remove unnecessary list comprehension
Unsafe fix
32 32 |
33 33 | # Weird case where the function call, opening bracket, and comment are all
34 34 | # on the same line.
35 |-any([ # lbracket comment
36 |- # second line comment
37 |- i.bit_count() for i in range(5) # rbracket comment
38 |- ] # rpar comment
35 |+any(
36 |+# lbracket comment
37 |+# second line comment
38 |+i.bit_count() for i in range(5) # rbracket comment # rpar comment
39 39 | )
36 36 |
37 37 | # Weird case where the function call, opening bracket, and comment are all
38 38 | # on the same line.
39 |-any([ # lbracket comment
40 |- # second line comment
41 |- i.bit_count() for i in range(5) # rbracket comment
42 |- ] # rpar comment
39 |+any(
40 |+# lbracket comment
41 |+# second line comment
42 |+i.bit_count() for i in range(5) # rbracket comment # rpar comment
43 43 | )


@@ -0,0 +1,4 @@
---
source: crates/ruff_linter/src/rules/flake8_comprehensions/mod.rs
---


@@ -0,0 +1,55 @@
---
source: crates/ruff_linter/src/rules/flake8_comprehensions/mod.rs
---
C419_1.py:1:5: C419 [*] Unnecessary list comprehension
|
1 | sum([x.val for x in bar])
| ^^^^^^^^^^^^^^^^^^^^ C419
2 | min([x.val for x in bar])
3 | max([x.val for x in bar])
|
= help: Remove unnecessary list comprehension
Unsafe fix
1 |-sum([x.val for x in bar])
1 |+sum(x.val for x in bar)
2 2 | min([x.val for x in bar])
3 3 | max([x.val for x in bar])
4 4 |
C419_1.py:2:5: C419 [*] Unnecessary list comprehension
|
1 | sum([x.val for x in bar])
2 | min([x.val for x in bar])
| ^^^^^^^^^^^^^^^^^^^^ C419
3 | max([x.val for x in bar])
|
= help: Remove unnecessary list comprehension
Unsafe fix
1 1 | sum([x.val for x in bar])
2 |-min([x.val for x in bar])
2 |+min(x.val for x in bar)
3 3 | max([x.val for x in bar])
4 4 |
5 5 | # Ok
C419_1.py:3:5: C419 [*] Unnecessary list comprehension
|
1 | sum([x.val for x in bar])
2 | min([x.val for x in bar])
3 | max([x.val for x in bar])
| ^^^^^^^^^^^^^^^^^^^^ C419
4 |
5 | # Ok
|
= help: Remove unnecessary list comprehension
Unsafe fix
1 1 | sum([x.val for x in bar])
2 2 | min([x.val for x in bar])
3 |-max([x.val for x in bar])
3 |+max(x.val for x in bar)
4 4 |
5 5 | # Ok
6 6 | sum(x.val for x in bar)
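The fixes in this snapshot preserve results for side-effect-free comprehensions; a quick sanity check with a made-up `vals` list standing in for `bar`:

```python
# sum/min/max accept any iterable, so dropping the brackets removes the
# intermediate list allocation without changing the result.
vals = [3, 1, 4, 1, 5]
assert sum([v * 2 for v in vals]) == sum(v * 2 for v in vals) == 28
assert min([v * 2 for v in vals]) == min(v * 2 for v in vals) == 2
assert max([v * 2 for v in vals]) == max(v * 2 for v in vals) == 10
```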


@@ -27,7 +27,7 @@ mod tests {
#[test]
fn custom() -> Result<()> {
let mut aliases = super::settings::default_aliases();
let mut aliases = default_aliases();
aliases.extend(FxHashMap::from_iter([
("dask.array".to_string(), "da".to_string()),
("dask.dataframe".to_string(), "dd".to_string()),
@@ -126,7 +126,7 @@ mod tests {
#[test]
fn override_defaults() -> Result<()> {
let mut aliases = super::settings::default_aliases();
let mut aliases = default_aliases();
aliases.extend(FxHashMap::from_iter([(
"numpy".to_string(),
"nmp".to_string(),
@@ -149,7 +149,7 @@ mod tests {
#[test]
fn from_imports() -> Result<()> {
let mut aliases = super::settings::default_aliases();
let mut aliases = default_aliases();
aliases.extend(FxHashMap::from_iter([
("xml.dom.minidom".to_string(), "md".to_string()),
(
@@ -182,4 +182,26 @@ mod tests {
assert_messages!(diagnostics);
Ok(())
}
#[test]
fn same_name() -> Result<()> {
let mut aliases = default_aliases();
aliases.extend(FxHashMap::from_iter([(
"django.conf.settings".to_string(),
"settings".to_string(),
)]));
let diagnostics = test_path(
Path::new("flake8_import_conventions/same_name.py"),
&LinterSettings {
flake8_import_conventions: super::settings::Settings {
aliases,
banned_aliases: FxHashMap::default(),
banned_from: FxHashSet::default(),
},
..LinterSettings::for_rule(Rule::UnconventionalImportAlias)
},
)?;
assert_messages!(diagnostics);
Ok(())
}
}


@@ -66,7 +66,7 @@ pub(crate) fn unconventional_import_alias(
let expected_alias = conventions.get(qualified_name.as_str())?;
let name = binding.name(checker.locator());
if binding.is_alias() && name == expected_alias {
if name == expected_alias {
return None;
}
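A minimal Python model of the revised check (the helper name is hypothetical; the real logic is the Rust function above): with the `is_alias()` guard removed, the binding name is compared against the expected alias whether or not the import used `as`.

```python
def violates_convention(binding_name: str, expected_alias: str) -> bool:
    # Flag any binding whose name differs from the configured alias.
    return binding_name != expected_alias

# `from django.conf import settings` -> binding "settings": conforms
assert not violates_convention("settings", "settings")
# `from django.conf import settings as s` -> binding "s": flagged (ICN001)
assert violates_convention("s", "settings")
```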


@@ -256,7 +256,7 @@ defaults.py:21:23: ICN001 [*] `tkinter` should be imported as `tk`
21 |+ import tkinter as tk
22 22 | import networkx as nxy
23 23 |
24 24 | def conventional_aliases():
24 24 |
defaults.py:22:24: ICN001 [*] `networkx` should be imported as `nx`
|
@@ -264,8 +264,6 @@ defaults.py:22:24: ICN001 [*] `networkx` should be imported as `nx`
21 | import tkinter as tkr
22 | import networkx as nxy
| ^^^ ICN001
23 |
24 | def conventional_aliases():
|
= help: Alias `networkx` to `nx`
@@ -276,7 +274,5 @@ defaults.py:22:24: ICN001 [*] `networkx` should be imported as `nx`
22 |- import networkx as nxy
22 |+ import networkx as nx
23 23 |
24 24 | def conventional_aliases():
25 25 | import altair as alt
24 24 |
25 25 | def conventional_aliases():


@@ -0,0 +1,17 @@
---
source: crates/ruff_linter/src/rules/flake8_import_conventions/mod.rs
---
same_name.py:10:41: ICN001 [*] `django.conf.settings` should be imported as `settings`
|
9 | def unconventional_alias():
10 | from django.conf import settings as s
| ^ ICN001
|
= help: Alias `django.conf.settings` to `settings`
Unsafe fix
7 7 |
8 8 |
9 9 | def unconventional_alias():
10 |- from django.conf import settings as s
10 |+ from django.conf import settings as settings


@@ -195,4 +195,61 @@ mod tests {
assert_messages!(snapshot, diagnostics);
Ok(())
}
#[test_case(Path::new("doubles_all.py"))]
fn only_inline(path: &Path) -> Result<()> {
let snapshot = format!("only_inline_{}", path.to_string_lossy());
let diagnostics = test_path(
Path::new("flake8_quotes").join(path).as_path(),
&LinterSettings {
flake8_quotes: super::settings::Settings {
inline_quotes: Quote::Single,
multiline_quotes: Quote::Single,
docstring_quotes: Quote::Single,
avoid_escape: true,
},
..LinterSettings::for_rules(vec![Rule::BadQuotesInlineString])
},
)?;
assert_messages!(snapshot, diagnostics);
Ok(())
}
#[test_case(Path::new("doubles_all.py"))]
fn only_multiline(path: &Path) -> Result<()> {
let snapshot = format!("only_multiline_{}", path.to_string_lossy());
let diagnostics = test_path(
Path::new("flake8_quotes").join(path).as_path(),
&LinterSettings {
flake8_quotes: super::settings::Settings {
inline_quotes: Quote::Single,
multiline_quotes: Quote::Single,
docstring_quotes: Quote::Single,
avoid_escape: true,
},
..LinterSettings::for_rules(vec![Rule::BadQuotesMultilineString])
},
)?;
assert_messages!(snapshot, diagnostics);
Ok(())
}
#[test_case(Path::new("doubles_all.py"))]
fn only_docstring(path: &Path) -> Result<()> {
let snapshot = format!("only_docstring_{}", path.to_string_lossy());
let diagnostics = test_path(
Path::new("flake8_quotes").join(path).as_path(),
&LinterSettings {
flake8_quotes: super::settings::Settings {
inline_quotes: Quote::Single,
multiline_quotes: Quote::Single,
docstring_quotes: Quote::Single,
avoid_escape: true,
},
..LinterSettings::for_rules(vec![Rule::BadQuotesDocstring])
},
)?;
assert_messages!(snapshot, diagnostics);
Ok(())
}
}


@@ -5,6 +5,7 @@ use ruff_source_file::Locator;
use ruff_text_size::{Ranged, TextRange};
use crate::checkers::ast::Checker;
use crate::registry::Rule;
use super::super::settings::Quote;
@@ -334,6 +335,11 @@ fn strings(checker: &mut Checker, sequence: &[TextRange]) {
for (range, trivia) in sequence.iter().zip(trivia) {
if trivia.is_multiline {
// If multiline strings aren't enforced, ignore it.
if !checker.enabled(Rule::BadQuotesMultilineString) {
continue;
}
// If our string is or contains a known good string, ignore it.
if trivia
.raw_text
@@ -375,6 +381,11 @@ fn strings(checker: &mut Checker, sequence: &[TextRange]) {
// If we're not using the preferred type, only allow use to avoid escapes.
&& !relax_quote
{
// If inline strings aren't enforced, ignore it.
if !checker.enabled(Rule::BadQuotesInlineString) {
continue;
}
if trivia.has_empty_text()
&& text_ends_at_quote(locator, *range, quotes_settings.inline_quotes)
{
@@ -455,10 +466,14 @@ pub(crate) fn check_string_quotes(checker: &mut Checker, string_like: StringLike
};
if checker.semantic().in_docstring() {
for range in ranges {
docstring(checker, range);
if checker.enabled(Rule::BadQuotesDocstring) {
for range in ranges {
docstring(checker, range);
}
}
} else {
strings(checker, &ranges);
if checker.any_enabled(&[Rule::BadQuotesInlineString, Rule::BadQuotesMultilineString]) {
strings(checker, &ranges);
}
}
}
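For reference, the three rules gated by the new `checker.enabled` checks apply to distinct string contexts; a sketch:

```python
"""Module docstring: covered by Q002 (docstring quotes)."""

inline = "one line"  # covered by Q000 (inline quotes)
multiline = """
spans several lines
"""  # covered by Q001 (multiline quotes)
```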


@@ -0,0 +1,18 @@
---
source: crates/ruff_linter/src/rules/flake8_quotes/mod.rs
---
doubles_all.py:1:1: Q002 [*] Double quote docstring found but single quotes preferred
|
1 | """This is a docstring."""
| ^^^^^^^^^^^^^^^^^^^^^^^^^^ Q002
2 |
3 | this_is_an_inline_string = "double quote string"
|
= help: Replace double quotes docstring with single quotes
Safe fix
1 |-"""This is a docstring."""
1 |+'''This is a docstring.'''
2 2 |
3 3 | this_is_an_inline_string = "double quote string"
4 4 |


@@ -0,0 +1,22 @@
---
source: crates/ruff_linter/src/rules/flake8_quotes/mod.rs
---
doubles_all.py:3:28: Q000 [*] Double quotes found but single quotes preferred
|
1 | """This is a docstring."""
2 |
3 | this_is_an_inline_string = "double quote string"
| ^^^^^^^^^^^^^^^^^^^^^ Q000
4 |
5 | this_is_a_multiline_string = """
|
= help: Replace double quotes with single quotes
Safe fix
1 1 | """This is a docstring."""
2 2 |
3 |-this_is_an_inline_string = "double quote string"
3 |+this_is_an_inline_string = 'double quote string'
4 4 |
5 5 | this_is_a_multiline_string = """
6 6 | double quote string


@@ -0,0 +1,24 @@
---
source: crates/ruff_linter/src/rules/flake8_quotes/mod.rs
---
doubles_all.py:5:30: Q001 [*] Double quote multiline found but single quotes preferred
|
3 | this_is_an_inline_string = "double quote string"
4 |
5 | this_is_a_multiline_string = """
| ______________________________^
6 | | double quote string
7 | | """
| |___^ Q001
|
= help: Replace double multiline quotes with single quotes
Safe fix
2 2 |
3 3 | this_is_an_inline_string = "double quote string"
4 4 |
5 |-this_is_a_multiline_string = """
5 |+this_is_a_multiline_string = '''
6 6 | double quote string
7 |-"""
7 |+'''


@@ -567,6 +567,12 @@ fn unnecessary_assign(checker: &mut Checker, stack: &Stack) {
continue;
}
// Ignore variables that have an annotation defined elsewhere.
if stack.annotations.contains(assigned_id.as_str()) {
continue;
}
// Ignore `nonlocal` and `global` variables.
if stack.non_locals.contains(assigned_id.as_str()) {
continue;
}


@@ -241,4 +241,19 @@ RET504.py:400:12: RET504 [*] Unnecessary assignment to `y` before `return` state
402 401 |
403 402 | def foo():
RET504.py:423:16: RET504 [*] Unnecessary assignment to `services` before `return` statement
|
421 | if "services" in a:
422 | services = a["services"]
423 | return services
| ^^^^^^^^ RET504
|
= help: Remove unnecessary assignment
Unsafe fix
419 419 | # See: https://github.com/astral-sh/ruff/issues/10732
420 420 | def func(a: dict[str, int]) -> list[dict[str, int]]:
421 421 | if "services" in a:
422 |- services = a["services"]
423 |- return services
422 |+ return a["services"]


@@ -13,6 +13,21 @@ pub(super) struct Stack<'data> {
pub(super) elifs_elses: Vec<(&'data [Stmt], &'data ElifElseClause)>,
/// The non-local variables in the current function.
pub(super) non_locals: FxHashSet<&'data str>,
/// The annotated variables in the current function.
///
/// For example, consider:
/// ```python
/// x: int
///
/// if True:
/// x = foo()
/// return x
/// ```
///
/// In this case, the annotation on `x` is used to cast the return value
/// of `foo()` to an `int`. Removing the `x = foo()` statement would
/// change the return type of the function.
pub(super) annotations: FxHashSet<&'data str>,
/// Whether the current function is a generator.
pub(super) is_generator: bool,
/// The `assignment`-to-`return` statement pairs in the current function.
@@ -86,6 +101,14 @@ impl<'semantic, 'a> Visitor<'a> for ReturnVisitor<'semantic, 'a> {
.non_locals
.extend(names.iter().map(Identifier::as_str));
}
Stmt::AnnAssign(ast::StmtAnnAssign { target, value, .. }) => {
// Ex) `x: int`
if value.is_none() {
if let Expr::Name(name) = target.as_ref() {
self.stack.annotations.insert(name.id.as_str());
}
}
}
Stmt::Return(stmt_return) => {
// If the `return` statement is preceded by an `assignment` statement, then the
// `assignment` statement may be redundant.
@@ -163,19 +186,6 @@ impl<'semantic, 'a> Visitor<'a> for ReturnVisitor<'semantic, 'a> {
}
}
/// RET504
/// If the last statement is a `return` statement, and the second-to-last statement is a
/// `with` statement that suppresses an exception, then we should not analyze the `return`
/// statement for unnecessary assignments. Otherwise we will suggest removing the assignment
/// and the `with` statement, which would change the behavior of the code.
///
/// Example:
/// ```python
/// def foo(data):
/// with suppress(JSONDecoderError):
/// data = data.decode()
/// return data
/// Returns `true` if the [`With`] statement is known to have a conditional body. In other words:
/// if the [`With`] statement's body may or may not run.
///
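The new `annotations` tracking prevents a bad fix in cases like the following sketch (names are illustrative), where a bare annotation elsewhere in the function is load-bearing for type checkers:

```python
def parse_port(raw, use_default=False):
    port: int  # bare annotation: declares the type of `port` for type checkers
    if use_default:
        port = 8080
    else:
        port = int(raw)
    return port  # RET504 now skips this: inlining would orphan the annotation
```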


@@ -1,8 +1,8 @@
use itertools::Itertools;
use ruff_python_ast::name::UnqualifiedName;
use ruff_python_ast::{self as ast, Arguments, Expr, Stmt};
use ruff_python_semantic::SemanticModel;
use ruff_python_ast::{self as ast, Expr, Stmt};
use ruff_python_semantic::{analyze, SemanticModel};
use ruff_python_stdlib::str::{is_cased_lowercase, is_cased_uppercase};
pub(super) fn is_camelcase(name: &str) -> bool {
@@ -86,16 +86,13 @@ pub(super) fn is_type_alias_assignment(stmt: &Stmt, semantic: &SemanticModel) ->
}
/// Returns `true` if the statement is an assignment to a `TypedDict`.
pub(super) fn is_typed_dict_class(arguments: Option<&Arguments>, semantic: &SemanticModel) -> bool {
pub(super) fn is_typed_dict_class(class_def: &ast::StmtClassDef, semantic: &SemanticModel) -> bool {
if !semantic.seen_typing() {
return false;
}
arguments.is_some_and(|arguments| {
arguments
.args
.iter()
.any(|base| semantic.match_typing_expr(base, "TypedDict"))
analyze::class::any_qualified_name(class_def, semantic, &|qualified_name| {
semantic.match_typing_qualified_name(&qualified_name, "TypedDict")
})
}
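Switching from a direct scan of base-class expressions to `any_qualified_name` means indirect `TypedDict` bases can be recognized too; a sketch of the kind of case the qualified-name analysis is meant to cover:

```python
from typing import TypedDict

class Base(TypedDict):
    pass

class Movie(Base):  # TypedDict only indirectly, via `Base`
    Title: str  # mixed-case key mirrors external data; naming rules stay suppressed

m = Movie(Title="Up")
print(m["Title"])
```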


@@ -1,7 +1,6 @@
use ruff_python_ast::{Arguments, Expr};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::{self as ast, Expr};
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
@@ -55,7 +54,7 @@ pub(crate) fn mixed_case_variable_in_class_scope(
checker: &mut Checker,
expr: &Expr,
name: &str,
arguments: Option<&Arguments>,
class_def: &ast::StmtClassDef,
) {
if !helpers::is_mixed_case(name) {
return;
@@ -64,7 +63,7 @@ pub(crate) fn mixed_case_variable_in_class_scope(
let parent = checker.semantic().current_statement();
if helpers::is_named_tuple_assignment(parent, checker.semantic())
|| helpers::is_typed_dict_class(arguments, checker.semantic())
|| helpers::is_typed_dict_class(class_def, checker.semantic())
{
return;
}


@@ -1,3 +1,4 @@
/// Returns `true` if the name should be considered "ambiguous".
pub(super) fn is_ambiguous_name(name: &str) -> bool {
name == "l" || name == "I" || name == "O"
}


@@ -1,14 +1,13 @@
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::{
self as ast, Expr, Identifier, Parameter, ParameterWithDefault, Parameters, Stmt,
};
use ruff_text_size::{Ranged, TextRange};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_codegen::Generator;
use ruff_python_semantic::SemanticModel;
use ruff_python_trivia::{has_leading_content, has_trailing_content, leading_indentation};
use ruff_source_file::UniversalNewlines;
use ruff_text_size::{Ranged, TextRange};
use crate::checkers::ast::Checker;
@@ -106,12 +105,17 @@ pub(crate) fn lambda_assignment(
}
}
// If the assignment is in a class body, it might not be safe to replace it because the
// assignment might be carrying a type annotation that will be used by some package like
// dataclasses, which wouldn't consider the rewritten function definition to be
// equivalent. Even if it _doesn't_ have an annotation, rewriting safely would require
// making this a static method.
// See: https://github.com/astral-sh/ruff/issues/3046
// If the assignment is a class attribute (with an annotation), ignore it.
//
// This is most common for, e.g., dataclasses and Pydantic models. Those libraries will
// treat the lambda as an assignable field, and the use of a lambda is almost certainly
// intentional.
if annotation.is_some() && checker.semantic().current_scope().kind.is_class() {
return;
}
// Otherwise, if the assignment is in a class body, flag it, but use a display-only fix.
// Rewriting safely would require making this a static method.
//
// Similarly, if the lambda is shadowing a variable in the current scope,
// rewriting it as a function declaration may break type-checking.
@@ -179,6 +183,7 @@ fn extract_types(annotation: &Expr, semantic: &SemanticModel) -> Option<(Vec<Exp
Some((params, return_type))
}
/// Generate a function definition from a `lambda` expression.
fn function(
name: &str,
parameters: Option<&Parameters>,

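The lambda-assignment changes above distinguish annotated class attributes (ignored, since dataclass-style libraries rely on the lambda) from other assignments (flagged, with a display-only fix in class bodies). A minimal sketch of the rewrite E731 suggests, using a hypothetical `double` function for illustration:

```python
from typing import Callable

# Flagged by E731: a lambda bound to a name.
double_lambda: Callable[[int], int] = lambda x: 2 * x


# The suggested rewrite: an equivalent `def`, which gets a real
# name in tracebacks and carries its annotations directly.
def double(x: int) -> int:
    return 2 * x
```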

@@ -120,25 +120,6 @@ E731.py:57:5: E731 Do not assign a `lambda` expression, use a `def`
59 60 |
60 61 | class Scope:
E731.py:64:5: E731 Do not assign a `lambda` expression, use a `def`
|
63 | # E731
64 | f: Callable[[int], int] = lambda x: 2 * x
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E731
|
= help: Rewrite `f` as a `def`
Display-only fix
61 61 | from typing import Callable
62 62 |
63 63 | # E731
64 |- f: Callable[[int], int] = lambda x: 2 * x
64 |+ def f(x: int) -> int:
65 |+ return 2 * x
65 66 |
66 67 |
67 68 | def scope():
E731.py:73:9: E731 Do not assign a `lambda` expression, use a `def`
|
71 | x: Callable[[int], int]
@@ -383,5 +364,6 @@ E731.py:147:5: E731 [*] Do not assign a `lambda` expression, use a `def`
149 |- )
147 |+ def f():
148 |+ return (i := 1),
150 149 |
151 150 |
152 151 | from dataclasses import dataclass


@@ -2,6 +2,7 @@ use itertools::Itertools;
use once_cell::sync::Lazy;
use regex::Regex;
use rustc_hash::FxHashSet;
use std::ops::Add;
use ruff_diagnostics::{AlwaysFixableViolation, Violation};
use ruff_diagnostics::{Diagnostic, Edit, Fix};
@@ -10,8 +11,8 @@ use ruff_python_ast::docstrings::{clean_space, leading_space};
use ruff_python_ast::identifier::Identifier;
use ruff_python_ast::ParameterWithDefault;
use ruff_python_semantic::analyze::visibility::is_staticmethod;
use ruff_python_trivia::{textwrap::dedent, PythonWhitespace};
use ruff_source_file::NewlineWithTrailingNewline;
use ruff_python_trivia::{textwrap::dedent, Cursor};
use ruff_source_file::{Line, NewlineWithTrailingNewline};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
use crate::checkers::ast::Checker;
@@ -25,13 +26,15 @@ use crate::rules::pydocstyle::settings::Convention;
/// Checks for over-indented sections in docstrings.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces a consistent style for docstrings with multiple
/// sections.
///
/// Each section should use consistent indentation, with the section headers
/// matching the indentation of the docstring's opening quotes, and the
/// section bodies being indented one level further.
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body. The convention is that all sections should use
/// consistent indentation. In each section, the header should match the
/// indentation of the docstring's opening quotes, and the body should be
/// indented one level further.
///
/// ## Example
/// ```python
@@ -105,15 +108,20 @@ impl AlwaysFixableViolation for SectionNotOverIndented {
/// Checks for over-indented section underlines in docstrings.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces a consistent style for multiline numpy-style docstrings,
/// and helps prevent incorrect syntax in docstrings using reStructuredText.
///
/// Some docstring formats (like reStructuredText) use underlines to separate
/// section bodies from section headers.
/// Multiline numpy-style docstrings are typically composed of a summary line,
/// followed by a blank line, followed by a series of sections. Each section
/// has a section header and a section body, and there should be a series of
/// underline characters in the line following the header. The underline should
/// have the same indentation as the header.
///
/// Avoid over-indenting the section underlines, as this can cause syntax
/// errors in reStructuredText.
/// This rule enforces a consistent style for multiline numpy-style docstrings
/// with sections. If your docstring uses reStructuredText, the rule also
/// helps protect against incorrect reStructuredText syntax, which would cause
/// errors if you tried to use a tool such as Sphinx to generate documentation
/// from the docstring.
///
/// This rule is enabled when using the `numpy` convention, and disabled when
/// using the `google` or `pep257` conventions.
@@ -131,12 +139,12 @@ impl AlwaysFixableViolation for SectionNotOverIndented {
/// Time spent traveling.
///
/// Returns
/// -------
/// -------
/// float
/// Speed as distance divided by time.
///
/// Raises
/// ------
/// ------
/// FasterThanLightError
/// If speed is greater than the speed of light.
/// """
@@ -204,11 +212,12 @@ impl AlwaysFixableViolation for SectionUnderlineNotOverIndented {
/// letters.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// For stylistic consistency, all section headers in a docstring should be
/// capitalized.
///
/// Section headers should be capitalized, for consistency.
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections. Each section typically has
/// a header and a body.
///
/// ## Example
/// ```python
@@ -279,22 +288,24 @@ impl AlwaysFixableViolation for CapitalizeSectionName {
}
/// ## What it does
/// Checks that section headers in docstrings that are not followed by a
/// newline.
/// Checks for section headers in docstrings that are followed by non-newline
/// characters.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces a consistent style for multiline numpy-style docstrings.
///
/// Section headers should be followed by a newline, and not by another
/// character (like a colon), for consistency.
/// Multiline numpy-style docstrings are typically composed of a summary line,
/// followed by a blank line, followed by a series of sections. Each section
/// has a section header and a section body. The section header should be
/// followed by a newline, rather than by some other character (like a colon).
///
/// This rule is enabled when using the `numpy` convention, and disabled
/// when using the `google` or `pep257` conventions.
///
/// ## Example
/// ```python
/// # The `Parameters`, `Returns` and `Raises` section headers are all followed
/// # by a colon in this function's docstring:
/// def calculate_speed(distance: float, time: float) -> float:
/// """Calculate speed as distance divided by time.
///
@@ -379,12 +390,19 @@ impl AlwaysFixableViolation for NewLineAfterSectionName {
/// underlines.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces a consistent style for multiline numpy-style docstrings,
/// and helps prevent incorrect syntax in docstrings using reStructuredText.
///
/// Some docstring formats (like reStructuredText) use underlines to separate
/// section bodies from section headers.
/// Multiline numpy-style docstrings are typically composed of a summary line,
/// followed by a blank line, followed by a series of sections. Each section
/// has a section header and a section body, and the header should be followed
/// by a series of underline characters in the following line.
///
/// This rule enforces a consistent style for multiline numpy-style docstrings
/// with sections. If your docstring uses reStructuredText, the rule also
/// helps protect against incorrect reStructuredText syntax, which would cause
/// errors if you tried to use a tool such as Sphinx to generate documentation
/// from the docstring.
///
/// This rule is enabled when using the `numpy` convention, and disabled
/// when using the `google` or `pep257` conventions.
@@ -475,15 +493,19 @@ impl AlwaysFixableViolation for DashedUnderlineAfterSection {
/// immediately following the section name.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces a consistent style for multiline numpy-style docstrings,
/// and helps prevent incorrect syntax in docstrings using reStructuredText.
///
/// Some docstring formats (like reStructuredText) use underlines to separate
/// section bodies from section headers.
/// Multiline numpy-style docstrings are typically composed of a summary line,
/// followed by a blank line, followed by a series of sections. Each section
/// has a header and a body. There should be a series of underline characters
/// in the line immediately below the header.
///
/// When present, section underlines should be positioned on the line
/// immediately following the section header.
/// This rule enforces a consistent style for multiline numpy-style docstrings
/// with sections. If your docstring uses reStructuredText, the rule also
/// helps protect against incorrect reStructuredText syntax, which would cause
/// errors if you tried to use a tool such as Sphinx to generate documentation
/// from the docstring.
///
/// This rule is enabled when using the `numpy` convention, and disabled
/// when using the `google` or `pep257` conventions.
@@ -577,15 +599,20 @@ impl AlwaysFixableViolation for SectionUnderlineAfterName {
/// the corresponding section header.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces a consistent style for multiline numpy-style docstrings,
/// and helps prevent incorrect syntax in docstrings using reStructuredText.
///
/// Some docstring formats (like reStructuredText) use underlines to separate
/// section bodies from section headers.
/// Multiline numpy-style docstrings are typically composed of a summary line,
/// followed by a blank line, followed by a series of sections. Each section
/// has a section header and a section body, and there should be a series of
/// underline characters in the line following the header. The length of the
/// underline should exactly match the length of the section header.
///
/// When present, section underlines should match the length of the
/// corresponding section header.
/// This rule enforces a consistent style for multiline numpy-style docstrings
/// with sections. If your docstring uses reStructuredText, the rule also
/// helps protect against incorrect reStructuredText syntax, which would cause
/// errors if you tried to use a tool such as Sphinx to generate documentation
/// from the docstring.
///
/// This rule is enabled when using the `numpy` convention, and disabled
/// when using the `google` or `pep257` conventions.
@@ -676,13 +703,15 @@ impl AlwaysFixableViolation for SectionUnderlineMatchesSectionLength {
/// line.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
///
/// Docstring sections should be separated by a blank line, for consistency and
/// This rule enforces consistency in your docstrings, and helps ensure
/// compatibility with documentation tooling.
///
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body. If a multiline numpy-style or Google-style docstring
/// consists of multiple sections, each section should be separated by a single
/// blank line.
///
/// This rule is enabled when using the `numpy` and `google` conventions, and
/// disabled when using the `pep257` convention.
///
@@ -767,15 +796,15 @@ impl AlwaysFixableViolation for NoBlankLineAfterSection {
}
/// ## What it does
/// Checks for docstring sections that are separated by a blank line.
/// Checks for docstring sections that are not separated by a blank line.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces consistency in numpy-style and Google-style docstrings,
/// and helps ensure compatibility with documentation tooling.
///
/// Docstring sections should be separated by a blank line, for consistency and
/// compatibility with documentation tooling.
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body. Sections should be separated by a single blank line.
///
/// This rule is enabled when using the `numpy` and `google` conventions, and
/// disabled when using the `pep257` convention.
@@ -860,19 +889,18 @@ impl AlwaysFixableViolation for NoBlankLineBeforeSection {
}
/// ## What it does
/// Checks for missing blank lines after the last section of a multi-line
/// Checks for missing blank lines after the last section of a multiline
/// docstring.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// This rule enforces a consistent style for multiline docstrings.
///
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
///
/// In some projects, the last section of a docstring is followed by a blank line,
/// for consistency and compatibility.
///
/// This rule may not apply to all projects; its applicability is a matter of
/// convention. By default, this rule is disabled when using the `google`,
/// convention. By default, the rule is disabled when using the `google`,
/// `numpy`, and `pep257` conventions.
///
/// ## Example
@@ -956,15 +984,16 @@ impl AlwaysFixableViolation for BlankLineAfterLastSection {
}
/// ## What it does
/// Checks for docstrings that contain empty sections.
/// Checks for docstrings with empty sections.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// An empty section in a multiline docstring likely indicates an unfinished
/// or incomplete docstring.
///
/// Empty docstring sections are indicative of missing documentation. Empty
/// sections should either be removed or filled in with relevant documentation.
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body. Each section body should be non-empty; empty sections
/// should either have content added to them, or be removed entirely.
///
/// ## Example
/// ```python
@@ -1045,13 +1074,14 @@ impl Violation for EmptyDocstringSection {
/// Checks for docstring section headers that do not end with a colon.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// This rule enforces a consistent style for multiline Google-style
/// docstrings. If a multiline Google-style docstring consists of multiple
/// sections, each section header should end with a colon.
///
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
///
/// In a docstring, each section header should end with a colon, for
/// consistency.
///
/// This rule is enabled when using the `google` convention, and disabled when
/// using the `pep257` and `numpy` conventions.
///
@@ -1127,13 +1157,14 @@ impl AlwaysFixableViolation for SectionNameEndsInColon {
/// parameters in the function.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule helps prevent you from leaving Google-style docstrings unfinished
/// or incomplete. Multiline Google-style docstrings should describe all
/// parameters for the function they are documenting.
///
/// Function docstrings often include a section for function arguments, which
/// should include documentation for every argument. Undocumented arguments are
/// indicative of missing documentation.
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body. Function docstrings often include a section for
/// function arguments; this rule is concerned with that section only.
///
/// This rule is enabled when using the `google` convention, and disabled when
/// using the `pep257` and `numpy` conventions.
@@ -1209,16 +1240,16 @@ impl Violation for UndocumentedParam {
}
/// ## What it does
/// Checks for docstring sections that contain blank lines between the section
/// header and the section body.
/// Checks for docstring sections that contain blank lines between a section
/// header and a section body.
///
/// ## Why is this bad?
/// Multi-line docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body.
/// This rule enforces a consistent style for multiline docstrings.
///
/// Docstring sections should not contain blank lines between the section header
/// and the section body, for consistency.
/// Multiline docstrings are typically composed of a summary line, followed by
/// a blank line, followed by a series of sections, each with a section header
/// and a section body. There should be no blank lines between a section header
/// and a section body.
///
/// ## Example
/// ```python
@@ -1377,50 +1408,41 @@ fn blanks_and_section_underline(
}
if let Some(non_blank_line) = following_lines.next() {
let dash_line_found = is_dashed_underline(&non_blank_line);
if dash_line_found {
if let Some(dashed_line) = find_underline(&non_blank_line, '-') {
if blank_lines_after_header > 0 {
if checker.enabled(Rule::SectionUnderlineAfterName) {
let mut diagnostic = Diagnostic::new(
SectionUnderlineAfterName {
name: context.section_name().to_string(),
},
docstring.range(),
dashed_line,
);
let range = TextRange::new(context.following_range().start(), blank_lines_end);
// Delete any blank lines between the header and the underline.
diagnostic.set_fix(Fix::safe_edit(Edit::range_deletion(range)));
diagnostic.set_fix(Fix::safe_edit(Edit::deletion(
context.following_range().start(),
blank_lines_end,
)));
checker.diagnostics.push(diagnostic);
}
}
if non_blank_line
.trim()
.chars()
.filter(|char| *char == '-')
.count()
!= context.section_name().len()
{
if dashed_line.len().to_usize() != context.section_name().len() {
if checker.enabled(Rule::SectionUnderlineMatchesSectionLength) {
let mut diagnostic = Diagnostic::new(
SectionUnderlineMatchesSectionLength {
name: context.section_name().to_string(),
},
docstring.range(),
dashed_line,
);
// Replace the existing underline with a line of the appropriate length.
let content = format!(
"{}{}{}",
clean_space(docstring.indentation),
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
"-".repeat(context.section_name().len()),
checker.stylist().line_ending().as_str()
);
diagnostic.set_fix(Fix::safe_edit(Edit::replacement(
content,
blank_lines_end,
non_blank_line.full_end(),
dashed_line,
)));
checker.diagnostics.push(diagnostic);
}
}
@@ -1432,8 +1454,9 @@ fn blanks_and_section_underline(
SectionUnderlineNotOverIndented {
name: context.section_name().to_string(),
},
docstring.range(),
dashed_line,
);
// Replace the existing indentation with whitespace of the appropriate length.
let range = TextRange::at(
blank_lines_end,
@@ -1445,6 +1468,7 @@ fn blanks_and_section_underline(
} else {
Edit::range_replacement(contents, range)
}));
checker.diagnostics.push(diagnostic);
}
}
@@ -1467,7 +1491,7 @@ fn blanks_and_section_underline(
EmptyDocstringSection {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
));
}
} else if checker.enabled(Rule::BlankLinesBetweenHeaderAndContent) {
@@ -1475,7 +1499,7 @@ fn blanks_and_section_underline(
BlankLinesBetweenHeaderAndContent {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
);
// Delete any blank lines between the header and content.
diagnostic.set_fix(Fix::safe_edit(Edit::deletion(
@@ -1491,47 +1515,50 @@ fn blanks_and_section_underline(
EmptyDocstringSection {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
));
}
}
} else {
let equal_line_found = non_blank_line
.chars()
.all(|char| char.is_whitespace() || char == '=');
if checker.enabled(Rule::DashedUnderlineAfterSection) {
let mut diagnostic = Diagnostic::new(
DashedUnderlineAfterSection {
name: context.section_name().to_string(),
},
docstring.range(),
);
// Add a dashed line (of the appropriate length) under the section header.
let content = format!(
"{}{}{}",
checker.stylist().line_ending().as_str(),
clean_space(docstring.indentation),
"-".repeat(context.section_name().len()),
);
if equal_line_found
&& non_blank_line.trim_whitespace().len() == context.section_name().len()
{
if let Some(equal_line) = find_underline(&non_blank_line, '=') {
let mut diagnostic = Diagnostic::new(
DashedUnderlineAfterSection {
name: context.section_name().to_string(),
},
equal_line,
);
// If an existing underline is an equal sign line of the appropriate length,
// replace it with a dashed line.
diagnostic.set_fix(Fix::safe_edit(Edit::replacement(
content,
context.summary_range().end(),
non_blank_line.end(),
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
"-".repeat(context.section_name().len()),
equal_line,
)));
checker.diagnostics.push(diagnostic);
} else {
// Otherwise, insert a dashed line after the section header.
let mut diagnostic = Diagnostic::new(
DashedUnderlineAfterSection {
name: context.section_name().to_string(),
},
context.section_name_range(),
);
// Add a dashed line (of the appropriate length) under the section header.
let content = format!(
"{}{}{}",
checker.stylist().line_ending().as_str(),
clean_space(docstring.indentation),
"-".repeat(context.section_name().len()),
);
diagnostic.set_fix(Fix::safe_edit(Edit::insertion(
content,
context.summary_range().end(),
)));
checker.diagnostics.push(diagnostic);
}
checker.diagnostics.push(diagnostic);
}
if blank_lines_after_header > 0 {
if checker.enabled(Rule::BlankLinesBetweenHeaderAndContent) {
@@ -1539,7 +1566,7 @@ fn blanks_and_section_underline(
BlankLinesBetweenHeaderAndContent {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
);
let range = TextRange::new(context.following_range().start(), blank_lines_end);
// Delete any blank lines between the header and content.
@@ -1548,16 +1575,16 @@ fn blanks_and_section_underline(
}
}
}
}
// Nothing but blank lines after the section header.
else {
} else {
// Nothing but blank lines after the section header.
if checker.enabled(Rule::DashedUnderlineAfterSection) {
let mut diagnostic = Diagnostic::new(
DashedUnderlineAfterSection {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
);
// Add a dashed line (of the appropriate length) under the section header.
let content = format!(
"{}{}{}",
@@ -1565,11 +1592,11 @@ fn blanks_and_section_underline(
clean_space(docstring.indentation),
"-".repeat(context.section_name().len()),
);
diagnostic.set_fix(Fix::safe_edit(Edit::insertion(
content,
context.summary_range().end(),
)));
checker.diagnostics.push(diagnostic);
}
if checker.enabled(Rule::EmptyDocstringSection) {
@@ -1577,7 +1604,7 @@ fn blanks_and_section_underline(
EmptyDocstringSection {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
));
}
}
@@ -1592,15 +1619,15 @@ fn common_section(
if checker.enabled(Rule::CapitalizeSectionName) {
let capitalized_section_name = context.kind().as_str();
if context.section_name() != capitalized_section_name {
let section_range = context.section_name_range();
let mut diagnostic = Diagnostic::new(
CapitalizeSectionName {
name: context.section_name().to_string(),
},
docstring.range(),
section_range,
);
// Replace the section title with the capitalized variant. This requires
// locating the start and end of the section name.
let section_range = context.section_name_range();
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
capitalized_section_name.to_string(),
section_range,
@@ -1612,16 +1639,17 @@ fn common_section(
if checker.enabled(Rule::SectionNotOverIndented) {
let leading_space = leading_space(context.summary_line());
if leading_space.len() > docstring.indentation.len() {
let section_range = context.section_name_range();
let mut diagnostic = Diagnostic::new(
SectionNotOverIndented {
name: context.section_name().to_string(),
},
docstring.range(),
section_range,
);
// Replace the existing indentation with whitespace of the appropriate length.
let content = clean_space(docstring.indentation);
let fix_range = TextRange::at(context.start(), leading_space.text_len());
diagnostic.set_fix(Fix::safe_edit(if content.is_empty() {
Edit::range_deletion(fix_range)
} else {
@@ -1641,11 +1669,12 @@ fn common_section(
.take_while(|line| line.trim().is_empty())
.count();
if num_blank_lines < 2 {
let section_range = context.section_name_range();
let mut diagnostic = Diagnostic::new(
NoBlankLineAfterSection {
name: context.section_name().to_string(),
},
docstring.range(),
section_range,
);
// Add a newline at the beginning of the next section.
diagnostic.set_fix(Fix::safe_edit(Edit::insertion(
@@ -1682,11 +1711,12 @@ fn common_section(
context.end(),
);
let section_range = context.section_name_range();
let mut diagnostic = Diagnostic::new(
BlankLineAfterLastSection {
name: context.section_name().to_string(),
},
docstring.range(),
section_range,
);
diagnostic.set_fix(Fix::safe_edit(edit));
checker.diagnostics.push(diagnostic);
@@ -1699,11 +1729,12 @@ fn common_section(
.previous_line()
.is_some_and(|line| line.trim().is_empty())
{
let section_range = context.section_name_range();
let mut diagnostic = Diagnostic::new(
NoBlankLineBeforeSection {
name: context.section_name().to_string(),
},
docstring.range(),
section_range,
);
// Add a blank line before the section.
diagnostic.set_fix(Fix::safe_edit(Edit::insertion(
@@ -1800,10 +1831,11 @@ fn args_section(context: &SectionContext) -> FxHashSet<String> {
let leading_space = leading_space(first_line.as_str());
let relevant_lines = std::iter::once(first_line)
.chain(following_lines)
.map(|l| l.as_str())
.filter(|line| {
line.is_empty() || (line.starts_with(leading_space) && !is_dashed_underline(line))
line.is_empty()
|| (line.starts_with(leading_space) && find_underline(line, '-').is_none())
})
.map(|line| line.as_str())
.join("\n");
let args_content = dedent(&relevant_lines);
@@ -1897,7 +1929,7 @@ fn numpy_section(
NewLineAfterSectionName {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
);
let section_range = context.section_name_range();
diagnostic.set_fix(Fix::safe_edit(Edit::range_deletion(TextRange::at(
@@ -1931,7 +1963,7 @@ fn google_section(
SectionNameEndsInColon {
name: context.section_name().to_string(),
},
docstring.range(),
context.section_name_range(),
);
// Replace the suffix.
let section_name_range = context.section_name_range();
@@ -1991,7 +2023,35 @@ fn parse_google_sections(
}
}
fn is_dashed_underline(line: &str) -> bool {
let trimmed_line = line.trim();
!trimmed_line.is_empty() && trimmed_line.chars().all(|char| char == '-')
/// Returns the [`TextRange`] of the underline, if a line consists of only dashes.
fn find_underline(line: &Line, dash: char) -> Option<TextRange> {
let mut cursor = Cursor::new(line.as_str());
// Eat leading whitespace.
cursor.eat_while(char::is_whitespace);
// Determine the start of the dashes.
let offset = cursor.token_len();
// Consume the dashes.
cursor.start_token();
cursor.eat_while(|c| c == dash);
// Determine the end of the dashes.
let len = cursor.token_len();
// If there are no dashes, return None.
if len == TextSize::new(0) {
return None;
}
// Eat trailing whitespace.
cursor.eat_while(char::is_whitespace);
// If there are any characters after the dashes, return None.
if !cursor.is_eof() {
return None;
}
Some(TextRange::at(offset, len).add(line.start()))
}


@@ -1,23 +1,16 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
D214_module.py:1:1: D214 [*] Section is over-indented ("Returns")
|
1 | / """A module docstring with D214 violations
2 | |
3 | | Returns
4 | | -----
5 | | valid returns
6 | |
7 | | Args
8 | | -----
9 | | valid args
10 | | """
| |___^ D214
11 |
12 | import os
|
= help: Remove over-indentation from "Returns"
D214_module.py:3:5: D214 [*] Section is over-indented ("Returns")
|
1 | """A module docstring with D214 violations
2 |
3 | Returns
| ^^^^^^^ D214
4 | -----
5 | valid returns
|
= help: Remove over-indentation from "Returns"
Safe fix
1 1 | """A module docstring with D214 violations
@@ -28,23 +21,16 @@ D214_module.py:1:1: D214 [*] Section is over-indented ("Returns")
5 5 | valid returns
6 6 |
D214_module.py:1:1: D214 [*] Section is over-indented ("Args")
|
1 | / """A module docstring with D214 violations
2 | |
3 | | Returns
4 | | -----
5 | | valid returns
6 | |
7 | | Args
8 | | -----
9 | | valid args
10 | | """
| |___^ D214
11 |
12 | import os
|
= help: Remove over-indentation from "Args"
D214_module.py:7:5: D214 [*] Section is over-indented ("Args")
|
5 | valid returns
6 |
7 | Args
| ^^^^ D214
8 | -----
9 | valid args
|
= help: Remove over-indentation from "Args"
Safe fix
4 4 | -----
@@ -55,5 +41,3 @@ D214_module.py:1:1: D214 [*] Section is over-indented ("Args")
8 8 | -----
9 9 | valid args
10 10 | """
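The reworked D214 snapshots above now anchor the diagnostic at the over-indented section header itself rather than at the whole docstring. For reference, a hedged sketch of a numpy-style docstring whose sections are indented the way D214 and D215 expect (header and underline flush with the opening quotes, body one level deeper):

```python
def speed(distance: float, time: float) -> float:
    """Calculate speed as distance divided by time.

    Returns
    -------
    float
        Speed as distance divided by time.
    """
    return distance / time
```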


@@ -1,19 +1,14 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:144:5: D214 [*] Section is over-indented ("Returns")
sections.py:146:9: D214 [*] Section is over-indented ("Returns")
|
142 | @expect("D214: Section is over-indented ('Returns')")
143 | def section_overindented(): # noqa: D416
144 | """Toggle the gizmo.
| _____^
145 | |
146 | | Returns
147 | | -------
148 | | A value of some sort.
149 | |
150 | | """
| |_______^ D214
144 | """Toggle the gizmo.
145 |
146 | Returns
| ^^^^^^^ D214
147 | -------
148 | A value of some sort.
|
= help: Remove over-indentation from "Returns"
@@ -27,18 +22,13 @@ sections.py:144:5: D214 [*] Section is over-indented ("Returns")
148 148 | A value of some sort.
149 149 |
sections.py:558:5: D214 [*] Section is over-indented ("Returns")
sections.py:563:9: D214 [*] Section is over-indented ("Returns")
|
557 | def titlecase_sub_section_header():
558 | """Below, `Returns:` should be considered a section header.
| _____^
559 | |
560 | | Args:
561 | | Here's a note.
562 | |
563 | | Returns:
564 | | """
| |_______^ D214
561 | Here's a note.
562 |
563 | Returns:
| ^^^^^^^ D214
564 | """
|
= help: Remove over-indentation from "Returns"
@@ -50,6 +40,4 @@ sections.py:558:5: D214 [*] Section is over-indented ("Returns")
563 |+ Returns:
564 564 | """
565 565 |
566 566 |

View File

@@ -1,13 +1,13 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
D215.py:1:1: D215 [*] Section underline is over-indented ("TODO")
D215.py:3:5: D215 [*] Section underline is over-indented ("TODO")
|
1 | / """
2 | | TODO:
3 | | -
4 | | """
| |___^ D215
1 | """
2 | TODO:
3 | -
| ^ D215
4 | """
|
= help: Remove over-indentation from "TODO" underline
@@ -17,5 +17,3 @@ D215.py:1:1: D215 [*] Section underline is over-indented ("TODO")
3 |- -
3 |+
4 4 | """

View File

@@ -1,19 +1,12 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:156:5: D215 [*] Section underline is over-indented ("Returns")
sections.py:159:9: D215 [*] Section underline is over-indented ("Returns")
|
154 | @expect("D215: Section underline is over-indented (in section 'Returns')")
155 | def section_underline_overindented(): # noqa: D416
156 | """Toggle the gizmo.
| _____^
157 | |
158 | | Returns
159 | | -------
160 | | A value of some sort.
161 | |
162 | | """
| |_______^ D215
158 | Returns
159 | -------
| ^^^^^^^ D215
160 | A value of some sort.
|
= help: Remove over-indentation from "Returns" underline
@@ -27,17 +20,12 @@ sections.py:156:5: D215 [*] Section underline is over-indented ("Returns")
161 161 |
162 162 | """
sections.py:170:5: D215 [*] Section underline is over-indented ("Returns")
sections.py:173:9: D215 [*] Section underline is over-indented ("Returns")
|
168 | @expect("D414: Section has no content ('Returns')")
169 | def section_underline_overindented_and_contentless(): # noqa: D416
170 | """Toggle the gizmo.
| _____^
171 | |
172 | | Returns
173 | | -------
174 | | """
| |_______^ D215
172 | Returns
173 | -------
| ^^^^^^^ D215
174 | """
|
= help: Remove over-indentation from "Returns" underline
@@ -49,6 +37,4 @@ sections.py:170:5: D215 [*] Section underline is over-indented ("Returns")
173 |+ ------
174 174 | """
175 175 |
176 176 |

View File

@@ -1,19 +1,14 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:17:5: D405 [*] Section name should be properly capitalized ("returns")
sections.py:19:5: D405 [*] Section name should be properly capitalized ("returns")
|
15 | "('Returns', not 'returns')")
16 | def not_capitalized(): # noqa: D416
17 | """Toggle the gizmo.
| _____^
18 | |
19 | | returns
20 | | -------
21 | | A value of some sort.
22 | |
23 | | """
| |_______^ D405
17 | """Toggle the gizmo.
18 |
19 | returns
| ^^^^^^^ D405
20 | -------
21 | A value of some sort.
|
= help: Capitalize "returns"
@@ -27,27 +22,13 @@ sections.py:17:5: D405 [*] Section name should be properly capitalized ("returns
21 21 | A value of some sort.
22 22 |
sections.py:216:5: D405 [*] Section name should be properly capitalized ("Short summary")
sections.py:218:5: D405 [*] Section name should be properly capitalized ("Short summary")
|
214 | @expect("D407: Missing dashed underline after section ('Raises')")
215 | def multiple_sections(): # noqa: D416
216 | """Toggle the gizmo.
| _____^
217 | |
218 | | Short summary
219 | | -------------
220 | |
221 | | This is the function's description, which will also specify what it
222 | | returns.
223 | |
224 | | Returns
225 | | ------
226 | | Many many wonderful things.
227 | | Raises:
228 | | My attention.
229 | |
230 | | """
| |_______^ D405
216 | """Toggle the gizmo.
217 |
218 | Short summary
| ^^^^^^^^^^^^^ D405
219 | -------------
|
= help: Capitalize "Short summary"
@@ -60,5 +41,3 @@ sections.py:216:5: D405 [*] Section name should be properly capitalized ("Short
219 219 | -------------
220 220 |
221 221 | This is the function's description, which will also specify what it

View File

@@ -1,19 +1,14 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:30:5: D406 [*] Section name should end with a newline ("Returns")
sections.py:32:5: D406 [*] Section name should end with a newline ("Returns")
|
28 | "('Returns', not 'Returns:')")
29 | def superfluous_suffix(): # noqa: D416
30 | """Toggle the gizmo.
| _____^
31 | |
32 | | Returns:
33 | | -------
34 | | A value of some sort.
35 | |
36 | | """
| |_______^ D406
30 | """Toggle the gizmo.
31 |
32 | Returns:
| ^^^^^^^ D406
33 | -------
34 | A value of some sort.
|
= help: Add newline after "Returns"
@@ -27,27 +22,13 @@ sections.py:30:5: D406 [*] Section name should end with a newline ("Returns")
34 34 | A value of some sort.
35 35 |
sections.py:216:5: D406 [*] Section name should end with a newline ("Raises")
sections.py:227:5: D406 [*] Section name should end with a newline ("Raises")
|
214 | @expect("D407: Missing dashed underline after section ('Raises')")
215 | def multiple_sections(): # noqa: D416
216 | """Toggle the gizmo.
| _____^
217 | |
218 | | Short summary
219 | | -------------
220 | |
221 | | This is the function's description, which will also specify what it
222 | | returns.
223 | |
224 | | Returns
225 | | ------
226 | | Many many wonderful things.
227 | | Raises:
228 | | My attention.
229 | |
230 | | """
| |_______^ D406
225 | ------
226 | Many many wonderful things.
227 | Raises:
| ^^^^^^ D406
228 | My attention.
|
= help: Add newline after "Raises"
@@ -61,20 +42,14 @@ sections.py:216:5: D406 [*] Section name should end with a newline ("Raises")
229 229 |
230 230 | """
sections.py:588:5: D406 [*] Section name should end with a newline ("Parameters")
sections.py:590:5: D406 [*] Section name should end with a newline ("Parameters")
|
587 | def test_lowercase_sub_section_header_should_be_valid(parameters: list[str], value: int): # noqa: D213
588 | """Test that lower case subsection header is valid even if it has the same name as section kind.
| _____^
589 | |
590 | | Parameters:
591 | | ----------
592 | | parameters:
593 | | A list of string parameters
594 | | value:
595 | | Some value
596 | | """
| |_______^ D406
588 | """Test that lower case subsection header is valid even if it has the same name as section kind.
589 |
590 | Parameters:
| ^^^^^^^^^^ D406
591 | ----------
592 | parameters:
|
= help: Add newline after "Parameters"
@@ -87,5 +62,3 @@ sections.py:588:5: D406 [*] Section name should end with a newline ("Parameters"
591 591 | ----------
592 592 | parameters:
593 593 | A list of string parameters

View File

@@ -1,18 +1,13 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:42:5: D407 [*] Missing dashed underline after section ("Returns")
sections.py:44:5: D407 [*] Missing dashed underline after section ("Returns")
|
40 | @expect("D407: Missing dashed underline after section ('Returns')")
41 | def no_underline(): # noqa: D416
42 | """Toggle the gizmo.
| _____^
43 | |
44 | | Returns
45 | | A value of some sort.
46 | |
47 | | """
| |_______^ D407
42 | """Toggle the gizmo.
43 |
44 | Returns
| ^^^^^^^ D407
45 | A value of some sort.
|
= help: Add dashed line under "Returns"
@@ -25,17 +20,14 @@ sections.py:42:5: D407 [*] Missing dashed underline after section ("Returns")
46 47 |
47 48 | """
sections.py:54:5: D407 [*] Missing dashed underline after section ("Returns")
sections.py:56:5: D407 [*] Missing dashed underline after section ("Returns")
|
52 | @expect("D414: Section has no content ('Returns')")
53 | def no_underline_and_no_description(): # noqa: D416
54 | """Toggle the gizmo.
| _____^
55 | |
56 | | Returns
57 | |
58 | | """
| |_______^ D407
54 | """Toggle the gizmo.
55 |
56 | Returns
| ^^^^^^^ D407
57 |
58 | """
|
= help: Add dashed line under "Returns"
@@ -48,15 +40,12 @@ sections.py:54:5: D407 [*] Missing dashed underline after section ("Returns")
58 59 | """
59 60 |
sections.py:65:5: D407 [*] Missing dashed underline after section ("Returns")
sections.py:67:5: D407 [*] Missing dashed underline after section ("Returns")
|
63 | @expect("D414: Section has no content ('Returns')")
64 | def no_underline_and_no_newline(): # noqa: D416
65 | """Toggle the gizmo.
| _____^
66 | |
67 | | Returns"""
| |______________^ D407
65 | """Toggle the gizmo.
66 |
67 | Returns"""
| ^^^^^^^ D407
|
= help: Add dashed line under "Returns"
@@ -71,27 +60,13 @@ sections.py:65:5: D407 [*] Missing dashed underline after section ("Returns")
69 70 |
70 71 | @expect(_D213)
sections.py:216:5: D407 [*] Missing dashed underline after section ("Raises")
sections.py:227:5: D407 [*] Missing dashed underline after section ("Raises")
|
214 | @expect("D407: Missing dashed underline after section ('Raises')")
215 | def multiple_sections(): # noqa: D416
216 | """Toggle the gizmo.
| _____^
217 | |
218 | | Short summary
219 | | -------------
220 | |
221 | | This is the function's description, which will also specify what it
222 | | returns.
223 | |
224 | | Returns
225 | | ------
226 | | Many many wonderful things.
227 | | Raises:
228 | | My attention.
229 | |
230 | | """
| |_______^ D407
225 | ------
226 | Many many wonderful things.
227 | Raises:
| ^^^^^^ D407
228 | My attention.
|
= help: Add dashed line under "Raises"
@@ -104,23 +79,13 @@ sections.py:216:5: D407 [*] Missing dashed underline after section ("Raises")
229 230 |
230 231 | """
sections.py:261:5: D407 [*] Missing dashed underline after section ("Args")
sections.py:263:5: D407 [*] Missing dashed underline after section ("Args")
|
259 | @expect("D414: Section has no content ('Returns')")
260 | def valid_google_style_section(): # noqa: D406, D407
261 | """Toggle the gizmo.
| _____^
262 | |
263 | | Args:
264 | | note: A random string.
265 | |
266 | | Returns:
267 | |
268 | | Raises:
269 | | RandomError: A random error that occurs randomly.
270 | |
271 | | """
| |_______^ D407
261 | """Toggle the gizmo.
262 |
263 | Args:
| ^^^^ D407
264 | note: A random string.
|
= help: Add dashed line under "Args"
@@ -133,23 +98,14 @@ sections.py:261:5: D407 [*] Missing dashed underline after section ("Args")
265 266 |
266 267 | Returns:
sections.py:261:5: D407 [*] Missing dashed underline after section ("Returns")
sections.py:266:5: D407 [*] Missing dashed underline after section ("Returns")
|
259 | @expect("D414: Section has no content ('Returns')")
260 | def valid_google_style_section(): # noqa: D406, D407
261 | """Toggle the gizmo.
| _____^
262 | |
263 | | Args:
264 | | note: A random string.
265 | |
266 | | Returns:
267 | |
268 | | Raises:
269 | | RandomError: A random error that occurs randomly.
270 | |
271 | | """
| |_______^ D407
264 | note: A random string.
265 |
266 | Returns:
| ^^^^^^^ D407
267 |
268 | Raises:
|
= help: Add dashed line under "Returns"
@@ -162,23 +118,13 @@ sections.py:261:5: D407 [*] Missing dashed underline after section ("Returns")
268 269 | Raises:
269 270 | RandomError: A random error that occurs randomly.
sections.py:261:5: D407 [*] Missing dashed underline after section ("Raises")
sections.py:268:5: D407 [*] Missing dashed underline after section ("Raises")
|
259 | @expect("D414: Section has no content ('Returns')")
260 | def valid_google_style_section(): # noqa: D406, D407
261 | """Toggle the gizmo.
| _____^
262 | |
263 | | Args:
264 | | note: A random string.
265 | |
266 | | Returns:
267 | |
268 | | Raises:
269 | | RandomError: A random error that occurs randomly.
270 | |
271 | | """
| |_______^ D407
266 | Returns:
267 |
268 | Raises:
| ^^^^^^ D407
269 | RandomError: A random error that occurs randomly.
|
= help: Add dashed line under "Raises"
@@ -191,18 +137,13 @@ sections.py:261:5: D407 [*] Missing dashed underline after section ("Raises")
270 271 |
271 272 | """
sections.py:278:5: D407 [*] Missing dashed underline after section ("Args")
sections.py:280:5: D407 [*] Missing dashed underline after section ("Args")
|
276 | "('Args:', not 'Args')")
277 | def missing_colon_google_style_section(): # noqa: D406, D407
278 | """Toggle the gizmo.
| _____^
279 | |
280 | | Args
281 | | note: A random string.
282 | |
283 | | """
| |_______^ D407
278 | """Toggle the gizmo.
279 |
280 | Args
| ^^^^ D407
281 | note: A random string.
|
= help: Add dashed line under "Args"
@@ -215,21 +156,14 @@ sections.py:278:5: D407 [*] Missing dashed underline after section ("Args")
282 283 |
283 284 | """
sections.py:293:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:297:9: D407 [*] Missing dashed underline after section ("Args")
|
292 | def bar(y=2): # noqa: D207, D213, D406, D407
293 | """Nested function test for docstrings.
| _________^
294 | |
295 | | Will this work when referencing x?
296 | |
297 | | Args:
298 | | x: Test something
299 | | that is broken.
300 | |
301 | | """
| |___________^ D407
302 | print(x)
295 | Will this work when referencing x?
296 |
297 | Args:
| ^^^^ D407
298 | x: Test something
299 | that is broken.
|
= help: Add dashed line under "Args"
@@ -242,18 +176,13 @@ sections.py:293:9: D407 [*] Missing dashed underline after section ("Args")
299 300 | that is broken.
300 301 |
sections.py:310:5: D407 [*] Missing dashed underline after section ("Args")
sections.py:312:5: D407 [*] Missing dashed underline after section ("Args")
|
308 | "'test_missing_google_args' docstring)")
309 | def test_missing_google_args(x=1, y=2, _private=3): # noqa: D406, D407
310 | """Toggle the gizmo.
| _____^
311 | |
312 | | Args:
313 | | x (int): The greatest integer.
314 | |
315 | | """
| |_______^ D407
310 | """Toggle the gizmo.
311 |
312 | Args:
| ^^^^ D407
313 | x (int): The greatest integer.
|
= help: Add dashed line under "Args"
@@ -266,20 +195,14 @@ sections.py:310:5: D407 [*] Missing dashed underline after section ("Args")
314 315 |
315 316 | """
sections.py:322:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:324:9: D407 [*] Missing dashed underline after section ("Args")
|
321 | def test_method(self, test, another_test, _): # noqa: D213, D407
322 | """Test a valid args section.
| _________^
323 | |
324 | | Args:
325 | | test: A parameter.
326 | | another_test: Another parameter.
327 | |
328 | | """
| |___________^ D407
329 |
330 | @expect("D417: Missing argument descriptions in the docstring "
322 | """Test a valid args section.
323 |
324 | Args:
| ^^^^ D407
325 | test: A parameter.
326 | another_test: Another parameter.
|
= help: Add dashed line under "Args"
@@ -292,20 +215,13 @@ sections.py:322:9: D407 [*] Missing dashed underline after section ("Args")
326 327 | another_test: Another parameter.
327 328 |
sections.py:334:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:336:9: D407 [*] Missing dashed underline after section ("Args")
|
332 | "'test_missing_args' docstring)", arg_count=5)
333 | def test_missing_args(self, test, x, y, z=3, _private_arg=3): # noqa: D213, D407
334 | """Test a valid args section.
| _________^
335 | |
336 | | Args:
337 | | x: Another parameter.
338 | |
339 | | """
| |___________^ D407
340 |
341 | @classmethod
334 | """Test a valid args section.
335 |
336 | Args:
| ^^^^ D407
337 | x: Another parameter.
|
= help: Add dashed line under "Args"
@@ -318,21 +234,14 @@ sections.py:334:9: D407 [*] Missing dashed underline after section ("Args")
338 339 |
339 340 | """
sections.py:346:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:348:9: D407 [*] Missing dashed underline after section ("Args")
|
344 | "'test_missing_args_class_method' docstring)", arg_count=5)
345 | def test_missing_args_class_method(cls, test, x, y, _, z=3): # noqa: D213, D407
346 | """Test a valid args section.
| _________^
347 | |
348 | | Args:
349 | | x: Another parameter. The parameter below is missing description.
350 | | y:
351 | |
352 | | """
| |___________^ D407
353 |
354 | @staticmethod
346 | """Test a valid args section.
347 |
348 | Args:
| ^^^^ D407
349 | x: Another parameter. The parameter below is missing description.
350 | y:
|
= help: Add dashed line under "Args"
@@ -345,20 +254,13 @@ sections.py:346:9: D407 [*] Missing dashed underline after section ("Args")
350 351 | y:
351 352 |
sections.py:359:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:361:9: D407 [*] Missing dashed underline after section ("Args")
|
357 | "'test_missing_args_static_method' docstring)", arg_count=4)
358 | def test_missing_args_static_method(a, x, y, _test, z=3): # noqa: D213, D407
359 | """Test a valid args section.
| _________^
360 | |
361 | | Args:
362 | | x: Another parameter.
363 | |
364 | | """
| |___________^ D407
365 |
366 | @staticmethod
359 | """Test a valid args section.
360 |
361 | Args:
| ^^^^ D407
362 | x: Another parameter.
|
= help: Add dashed line under "Args"
@@ -371,20 +273,13 @@ sections.py:359:9: D407 [*] Missing dashed underline after section ("Args")
363 364 |
364 365 | """
sections.py:371:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:373:9: D407 [*] Missing dashed underline after section ("Args")
|
369 | "'test_missing_docstring' docstring)", arg_count=2)
370 | def test_missing_docstring(a, b): # noqa: D213, D407
371 | """Test a valid args section.
| _________^
372 | |
373 | | Args:
374 | | a:
375 | |
376 | | """
| |___________^ D407
377 |
378 | @staticmethod
371 | """Test a valid args section.
372 |
373 | Args:
| ^^^^ D407
374 | a:
|
= help: Add dashed line under "Args"
@@ -397,24 +292,14 @@ sections.py:371:9: D407 [*] Missing dashed underline after section ("Args")
375 376 |
376 377 | """
sections.py:380:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:382:9: D407 [*] Missing dashed underline after section ("Args")
|
378 | @staticmethod
379 | def test_hanging_indent(skip, verbose): # noqa: D213, D407
380 | """Do stuff.
| _________^
381 | |
382 | | Args:
383 | | skip (:attr:`.Skip`):
384 | | Lorem ipsum dolor sit amet, consectetur adipiscing elit.
385 | | Etiam at tellus a tellus faucibus maximus. Curabitur tellus
386 | | mauris, semper id vehicula ac, feugiat ut tortor.
387 | | verbose (bool):
388 | | If True, print out as much information as possible.
389 | | If False, print out concise "one-liner" information.
390 | |
391 | | """
| |___________^ D407
380 | """Do stuff.
381 |
382 | Args:
| ^^^^ D407
383 | skip (:attr:`.Skip`):
384 | Lorem ipsum dolor sit amet, consectetur adipiscing elit.
|
= help: Add dashed line under "Args"
@@ -427,20 +312,13 @@ sections.py:380:9: D407 [*] Missing dashed underline after section ("Args")
384 385 | Lorem ipsum dolor sit amet, consectetur adipiscing elit.
385 386 | Etiam at tellus a tellus faucibus maximus. Curabitur tellus
sections.py:499:9: D407 [*] Missing dashed underline after section ("Args")
sections.py:503:9: D407 [*] Missing dashed underline after section ("Args")
|
497 | "'test_incorrect_indent' docstring)", arg_count=3)
498 | def test_incorrect_indent(self, x=1, y=2): # noqa: D207, D213, D407
499 | """Reproducing issue #437.
| _________^
500 | |
501 | | Testing this incorrectly indented docstring.
502 | |
503 | | Args:
504 | | x: Test argument.
505 | |
506 | | """
| |___________^ D407
501 | Testing this incorrectly indented docstring.
502 |
503 | Args:
| ^^^^ D407
504 | x: Test argument.
|
= help: Add dashed line under "Args"
@@ -453,16 +331,12 @@ sections.py:499:9: D407 [*] Missing dashed underline after section ("Args")
505 506 |
506 507 | """
sections.py:519:5: D407 [*] Missing dashed underline after section ("Parameters")
sections.py:522:5: D407 [*] Missing dashed underline after section ("Parameters")
|
518 | def replace_equals_with_dash():
519 | """Equal length equals should be replaced with dashes.
| _____^
520 | |
521 | | Parameters
522 | | ==========
523 | | """
| |_______^ D407
521 | Parameters
522 | ==========
| ^^^^^^^^^^ D407
523 | """
|
= help: Add dashed line under "Parameters"
@@ -476,16 +350,12 @@ sections.py:519:5: D407 [*] Missing dashed underline after section ("Parameters"
524 524 |
525 525 |
sections.py:527:5: D407 [*] Missing dashed underline after section ("Parameters")
sections.py:530:5: D407 [*] Missing dashed underline after section ("Parameters")
|
526 | def replace_equals_with_dash2():
527 | """Here, the length of equals is not the same.
| _____^
528 | |
529 | | Parameters
530 | | ===========
531 | | """
| |_______^ D407
529 | Parameters
530 | ===========
| ^^^^^^^^^^^ D407
531 | """
|
= help: Add dashed line under "Parameters"
@@ -493,23 +363,19 @@ sections.py:527:5: D407 [*] Missing dashed underline after section ("Parameters"
527 527 | """Here, the length of equals is not the same.
528 528 |
529 529 | Parameters
530 |- ===========
530 |+ ----------
530 531 | ===========
531 532 | """
532 533 |
531 531 | """
532 532 |
533 533 |
sections.py:548:5: D407 [*] Missing dashed underline after section ("Args")
sections.py:550:5: D407 [*] Missing dashed underline after section ("Args")
|
547 | def lowercase_sub_section_header():
548 | """Below, `returns:` should _not_ be considered a section header.
| _____^
549 | |
550 | | Args:
551 | | Here's a note.
552 | |
553 | | returns:
554 | | """
| |_______^ D407
548 | """Below, `returns:` should _not_ be considered a section header.
549 |
550 | Args:
| ^^^^ D407
551 | Here's a note.
|
= help: Add dashed line under "Args"
@@ -522,18 +388,13 @@ sections.py:548:5: D407 [*] Missing dashed underline after section ("Args")
552 553 |
553 554 | returns:
sections.py:558:5: D407 [*] Missing dashed underline after section ("Args")
sections.py:560:5: D407 [*] Missing dashed underline after section ("Args")
|
557 | def titlecase_sub_section_header():
558 | """Below, `Returns:` should be considered a section header.
| _____^
559 | |
560 | | Args:
561 | | Here's a note.
562 | |
563 | | Returns:
564 | | """
| |_______^ D407
558 | """Below, `Returns:` should be considered a section header.
559 |
560 | Args:
| ^^^^ D407
561 | Here's a note.
|
= help: Add dashed line under "Args"
@@ -546,18 +407,13 @@ sections.py:558:5: D407 [*] Missing dashed underline after section ("Args")
562 563 |
563 564 | Returns:
sections.py:558:5: D407 [*] Missing dashed underline after section ("Returns")
sections.py:563:9: D407 [*] Missing dashed underline after section ("Returns")
|
557 | def titlecase_sub_section_header():
558 | """Below, `Returns:` should be considered a section header.
| _____^
559 | |
560 | | Args:
561 | | Here's a note.
562 | |
563 | | Returns:
564 | | """
| |_______^ D407
561 | Here's a note.
562 |
563 | Returns:
| ^^^^^^^ D407
564 | """
|
= help: Add dashed line under "Returns"
@@ -570,19 +426,14 @@ sections.py:558:5: D407 [*] Missing dashed underline after section ("Returns")
565 566 |
566 567 |
sections.py:600:4: D407 [*] Missing dashed underline after section ("Parameters")
sections.py:602:4: D407 [*] Missing dashed underline after section ("Parameters")
|
599 | def test_lowercase_sub_section_header_different_kind(returns: int):
600 | """Test that lower case subsection header is valid even if it is of a different kind.
| ____^
601 | |
602 | | Parameters
603 | | ------------------
604 | | returns:
605 | | some value
606 | |
607 | | """
| |______^ D407
600 | """Test that lower case subsection header is valid even if it is of a different kind.
601 |
602 | Parameters
| ^^^^^^^^^^ D407
603 | ------------------
604 | returns:
|
= help: Add dashed line under "Parameters"
@@ -594,5 +445,3 @@ sections.py:600:4: D407 [*] Missing dashed underline after section ("Parameters"
603 604 | ------------------
604 605 | returns:
605 606 | some value

View File

@@ -1,22 +1,15 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:94:5: D408 [*] Section underline should be in the line following the section's name ("Returns")
|
92 | "section's name ('Returns')")
93 | def blank_line_before_underline(): # noqa: D416
94 | """Toggle the gizmo.
| _____^
95 | |
96 | | Returns
97 | |
98 | | -------
99 | | A value of some sort.
100 | |
101 | | """
| |_______^ D408
|
= help: Add underline to "Returns"
sections.py:98:5: D408 [*] Section underline should be in the line following the section's name ("Returns")
|
96 | Returns
97 |
98 | -------
| ^^^^^^^ D408
99 | A value of some sort.
|
= help: Add underline to "Returns"
Safe fix
94 94 | """Toggle the gizmo.
@@ -25,6 +18,4 @@ sections.py:94:5: D408 [*] Section underline should be in the line following the
97 |-
98 97 | -------
99 98 | A value of some sort.
100 99 |

View File

@@ -1,19 +1,12 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:108:5: D409 [*] Section underline should match the length of its name ("Returns")
sections.py:111:5: D409 [*] Section underline should match the length of its name ("Returns")
|
106 | "(Expected 7 dashes in section 'Returns', got 2)")
107 | def bad_underline_length(): # noqa: D416
108 | """Toggle the gizmo.
| _____^
109 | |
110 | | Returns
111 | | --
112 | | A value of some sort.
113 | |
114 | | """
| |_______^ D409
110 | Returns
111 | --
| ^^ D409
112 | A value of some sort.
|
= help: Adjust underline length to match "Returns"
@@ -27,27 +20,13 @@ sections.py:108:5: D409 [*] Section underline should match the length of its nam
113 113 |
114 114 | """
sections.py:216:5: D409 [*] Section underline should match the length of its name ("Returns")
sections.py:225:5: D409 [*] Section underline should match the length of its name ("Returns")
|
214 | @expect("D407: Missing dashed underline after section ('Raises')")
215 | def multiple_sections(): # noqa: D416
216 | """Toggle the gizmo.
| _____^
217 | |
218 | | Short summary
219 | | -------------
220 | |
221 | | This is the function's description, which will also specify what it
222 | | returns.
223 | |
224 | | Returns
225 | | ------
226 | | Many many wonderful things.
227 | | Raises:
228 | | My attention.
229 | |
230 | | """
| |_______^ D409
224 | Returns
225 | ------
| ^^^^^^ D409
226 | Many many wonderful things.
227 | Raises:
|
= help: Adjust underline length to match "Returns"
@@ -61,28 +40,13 @@ sections.py:216:5: D409 [*] Section underline should match the length of its nam
227 227 | Raises:
228 228 | My attention.
sections.py:568:5: D409 [*] Section underline should match the length of its name ("Other Parameters")
sections.py:578:5: D409 [*] Section underline should match the length of its name ("Other Parameters")
|
567 | def test_method_should_be_correctly_capitalized(parameters: list[str], other_parameters: dict[str, str]): # noqa: D213
568 | """Test parameters and attributes sections are capitalized correctly.
| _____^
569 | |
570 | | Parameters
571 | | ----------
572 | | parameters:
573 | | A list of string parameters
574 | | other_parameters:
575 | | A dictionary of string attributes
576 | |
577 | | Other Parameters
578 | | ----------
579 | | other_parameters:
580 | | A dictionary of string attributes
581 | | parameters:
582 | | A list of string parameters
583 | |
584 | | """
| |_______^ D409
577 | Other Parameters
578 | ----------
| ^^^^^^^^^^ D409
579 | other_parameters:
580 | A dictionary of string attributes
|
= help: Adjust underline length to match "Other Parameters"
@@ -95,5 +59,3 @@ sections.py:568:5: D409 [*] Section underline should match the length of its nam
579 579 | other_parameters:
580 580 | A dictionary of string attributes
581 581 | parameters:

View File

@@ -1,27 +1,16 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
D410.py:2:5: D410 [*] Missing blank line after section ("Parameters")
|
1 | def f(a: int, b: int) -> int:
2 | """Showcase function.
| _____^
3 | |
4 | | Parameters
5 | | ----------
6 | | a : int
7 | | _description_
8 | | b : int
9 | | _description_
10 | | Returns
11 | | -------
12 | | int
13 | | _description
14 | | """
| |_______^ D410
15 | return b - a
|
= help: Add blank line after "Parameters"
D410.py:4:5: D410 [*] Missing blank line after section ("Parameters")
|
2 | """Showcase function.
3 |
4 | Parameters
| ^^^^^^^^^^ D410
5 | ----------
6 | a : int
|
= help: Add blank line after "Parameters"
Safe fix
7 7 | _description_
@@ -32,18 +21,14 @@ D410.py:2:5: D410 [*] Missing blank line after section ("Parameters")
11 12 | -------
12 13 | int
D410.py:19:5: D410 [*] Missing blank line after section ("Parameters")
D410.py:21:5: D410 [*] Missing blank line after section ("Parameters")
|
18 | def f() -> int:
19 | """Showcase function.
| _____^
20 | |
21 | | Parameters
22 | | ----------
23 | | Returns
24 | | -------
25 | | """
| |_______^ D410
19 | """Showcase function.
20 |
21 | Parameters
| ^^^^^^^^^^ D410
22 | ----------
23 | Returns
|
= help: Add blank line after "Parameters"
@@ -55,5 +40,3 @@ D410.py:19:5: D410 [*] Missing blank line after section ("Parameters")
23 24 | Returns
24 25 | -------
25 26 | """

View File

@@ -1,24 +1,14 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:76:5: D410 [*] Missing blank line after section ("Returns")
sections.py:78:5: D410 [*] Missing blank line after section ("Returns")
|
74 | @expect("D414: Section has no content ('Yields')")
75 | def consecutive_sections(): # noqa: D416
76 | """Toggle the gizmo.
| _____^
77 | |
78 | | Returns
79 | | -------
80 | | Yields
81 | | ------
82 | |
83 | | Raises
84 | | ------
85 | | Questions.
86 | |
87 | | """
| |_______^ D410
76 | """Toggle the gizmo.
77 |
78 | Returns
| ^^^^^^^ D410
79 | -------
80 | Yields
|
= help: Add blank line after "Returns"
@@ -31,27 +21,14 @@ sections.py:76:5: D410 [*] Missing blank line after section ("Returns")
81 82 | ------
82 83 |
sections.py:216:5: D410 [*] Missing blank line after section ("Returns")
sections.py:224:5: D410 [*] Missing blank line after section ("Returns")
|
214 | @expect("D407: Missing dashed underline after section ('Raises')")
215 | def multiple_sections(): # noqa: D416
216 | """Toggle the gizmo.
| _____^
217 | |
218 | | Short summary
219 | | -------------
220 | |
221 | | This is the function's description, which will also specify what it
222 | | returns.
223 | |
224 | | Returns
225 | | ------
226 | | Many many wonderful things.
227 | | Raises:
228 | | My attention.
229 | |
230 | | """
| |_______^ D410
222 | returns.
223 |
224 | Returns
| ^^^^^^^ D410
225 | ------
226 | Many many wonderful things.
|
= help: Add blank line after "Returns"
@@ -62,6 +39,4 @@ sections.py:216:5: D410 [*] Missing blank line after section ("Returns")
227 |+
227 228 | Raises:
228 229 | My attention.
229 230 |

View File

@@ -1,24 +1,13 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:76:5: D411 [*] Missing blank line before section ("Yields")
sections.py:80:5: D411 [*] Missing blank line before section ("Yields")
|
74 | @expect("D414: Section has no content ('Yields')")
75 | def consecutive_sections(): # noqa: D416
76 | """Toggle the gizmo.
| _____^
77 | |
78 | | Returns
79 | | -------
80 | | Yields
81 | | ------
82 | |
83 | | Raises
84 | | ------
85 | | Questions.
86 | |
87 | | """
| |_______^ D411
78 | Returns
79 | -------
80 | Yields
| ^^^^^^ D411
81 | ------
|
= help: Add blank line before "Yields"
@@ -31,20 +20,13 @@ sections.py:76:5: D411 [*] Missing blank line before section ("Yields")
81 82 | ------
82 83 |
sections.py:131:5: D411 [*] Missing blank line before section ("Returns")
sections.py:134:5: D411 [*] Missing blank line before section ("Returns")
|
129 | @expect("D411: Missing blank line before section ('Returns')")
130 | def no_blank_line_before_section(): # noqa: D416
131 | """Toggle the gizmo.
| _____^
132 | |
133 | | The function's description.
134 | | Returns
135 | | -------
136 | | A value of some sort.
137 | |
138 | | """
| |_______^ D411
133 | The function's description.
134 | Returns
| ^^^^^^^ D411
135 | -------
136 | A value of some sort.
|
= help: Add blank line before "Returns"
@@ -57,27 +39,13 @@ sections.py:131:5: D411 [*] Missing blank line before section ("Returns")
135 136 | -------
136 137 | A value of some sort.
sections.py:216:5: D411 [*] Missing blank line before section ("Raises")
sections.py:227:5: D411 [*] Missing blank line before section ("Raises")
|
214 | @expect("D407: Missing dashed underline after section ('Raises')")
215 | def multiple_sections(): # noqa: D416
216 | """Toggle the gizmo.
| _____^
217 | |
218 | | Short summary
219 | | -------------
220 | |
221 | | This is the function's description, which will also specify what it
222 | | returns.
223 | |
224 | | Returns
225 | | ------
226 | | Many many wonderful things.
227 | | Raises:
228 | | My attention.
229 | |
230 | | """
| |_______^ D411
225 | ------
226 | Many many wonderful things.
227 | Raises:
| ^^^^^^ D411
228 | My attention.
|
= help: Add blank line before "Raises"
@@ -88,6 +56,4 @@ sections.py:216:5: D411 [*] Missing blank line before section ("Raises")
227 |+
227 228 | Raises:
228 229 | My attention.
229 230 |
229 230 |

View File

@@ -1,27 +1,13 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:216:5: D412 [*] No blank lines allowed between a section header and its content ("Short summary")
sections.py:218:5: D412 [*] No blank lines allowed between a section header and its content ("Short summary")
|
214 | @expect("D407: Missing dashed underline after section ('Raises')")
215 | def multiple_sections(): # noqa: D416
216 | """Toggle the gizmo.
| _____^
217 | |
218 | | Short summary
219 | | -------------
220 | |
221 | | This is the function's description, which will also specify what it
222 | | returns.
223 | |
224 | | Returns
225 | | ------
226 | | Many many wonderful things.
227 | | Raises:
228 | | My attention.
229 | |
230 | | """
| |_______^ D412
216 | """Toggle the gizmo.
217 |
218 | Short summary
| ^^^^^^^^^^^^^ D412
219 | -------------
|
= help: Remove blank line(s)
@@ -32,6 +18,4 @@ sections.py:216:5: D412 [*] No blank lines allowed between a section header and
220 |-
221 220 | This is the function's description, which will also specify what it
222 221 | returns.
223 222 |
223 222 |

View File

@@ -1,18 +1,14 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
D413.py:1:1: D413 [*] Missing blank line after last section ("Returns")
D413.py:7:1: D413 [*] Missing blank line after last section ("Returns")
|
1 | / """Do something.
2 | |
3 | | Args:
4 | | x: the value
5 | | with a hanging indent
6 | |
7 | | Returns:
8 | | the value
9 | | """
| |___^ D413
5 | with a hanging indent
6 |
7 | Returns:
| ^^^^^^^ D413
8 | the value
9 | """
|
= help: Add blank line after "Returns"
@@ -25,20 +21,14 @@ D413.py:1:1: D413 [*] Missing blank line after last section ("Returns")
10 11 |
11 12 |
D413.py:13:5: D413 [*] Missing blank line after last section ("Returns")
D413.py:19:5: D413 [*] Missing blank line after last section ("Returns")
|
12 | def func():
13 | """Do something.
| _____^
14 | |
15 | | Args:
16 | | x: the value
17 | | with a hanging indent
18 | |
19 | | Returns:
20 | | the value
21 | | """
| |_______^ D413
17 | with a hanging indent
18 |
19 | Returns:
| ^^^^^^^ D413
20 | the value
21 | """
|
= help: Add blank line after "Returns"
@@ -51,19 +41,13 @@ D413.py:13:5: D413 [*] Missing blank line after last section ("Returns")
22 23 |
23 24 |
D413.py:52:5: D413 [*] Missing blank line after last section ("Returns")
D413.py:58:5: D413 [*] Missing blank line after last section ("Returns")
|
51 | def func():
52 | """Do something.
| _____^
53 | |
54 | | Args:
55 | | x: the value
56 | | with a hanging indent
57 | |
58 | | Returns:
59 | | the value"""
| |____________________^ D413
56 | with a hanging indent
57 |
58 | Returns:
| ^^^^^^^ D413
59 | the value"""
|
= help: Add blank line after "Returns"
@@ -79,20 +63,14 @@ D413.py:52:5: D413 [*] Missing blank line after last section ("Returns")
61 63 |
62 64 | def func():
D413.py:63:5: D413 [*] Missing blank line after last section ("Returns")
D413.py:69:5: D413 [*] Missing blank line after last section ("Returns")
|
62 | def func():
63 | """Do something.
| _____^
64 | |
65 | | Args:
66 | | x: the value
67 | | with a hanging indent
68 | |
69 | | Returns:
70 | | the value
71 | | """
| |___________^ D413
67 | with a hanging indent
68 |
69 | Returns:
| ^^^^^^^ D413
70 | the value
71 | """
|
= help: Add blank line after "Returns"

View File

@@ -1,15 +1,12 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:65:5: D413 [*] Missing blank line after last section ("Returns")
sections.py:67:5: D413 [*] Missing blank line after last section ("Returns")
|
63 | @expect("D414: Section has no content ('Returns')")
64 | def no_underline_and_no_newline(): # noqa: D416
65 | """Toggle the gizmo.
| _____^
66 | |
67 | | Returns"""
| |______________^ D413
65 | """Toggle the gizmo.
66 |
67 | Returns"""
| ^^^^^^^ D413
|
= help: Add blank line after "Returns"
@@ -25,18 +22,14 @@ sections.py:65:5: D413 [*] Missing blank line after last section ("Returns")
69 71 |
70 72 | @expect(_D213)
sections.py:120:5: D413 [*] Missing blank line after last section ("Returns")
sections.py:122:5: D413 [*] Missing blank line after last section ("Returns")
|
118 | @expect("D413: Missing blank line after last section ('Returns')")
119 | def no_blank_line_after_last_section(): # noqa: D416
120 | """Toggle the gizmo.
| _____^
121 | |
122 | | Returns
123 | | -------
124 | | A value of some sort.
125 | | """
| |_______^ D413
120 | """Toggle the gizmo.
121 |
122 | Returns
| ^^^^^^^ D413
123 | -------
124 | A value of some sort.
|
= help: Add blank line after "Returns"
@@ -49,17 +42,14 @@ sections.py:120:5: D413 [*] Missing blank line after last section ("Returns")
126 127 |
127 128 |
sections.py:170:5: D413 [*] Missing blank line after last section ("Returns")
sections.py:172:5: D413 [*] Missing blank line after last section ("Returns")
|
168 | @expect("D414: Section has no content ('Returns')")
169 | def section_underline_overindented_and_contentless(): # noqa: D416
170 | """Toggle the gizmo.
| _____^
171 | |
172 | | Returns
173 | | -------
174 | | """
| |_______^ D413
170 | """Toggle the gizmo.
171 |
172 | Returns
| ^^^^^^^ D413
173 | -------
174 | """
|
= help: Add blank line after "Returns"
@@ -72,16 +62,14 @@ sections.py:170:5: D413 [*] Missing blank line after last section ("Returns")
175 176 |
176 177 |
sections.py:519:5: D413 [*] Missing blank line after last section ("Parameters")
sections.py:521:5: D413 [*] Missing blank line after last section ("Parameters")
|
518 | def replace_equals_with_dash():
519 | """Equal length equals should be replaced with dashes.
| _____^
520 | |
521 | | Parameters
522 | | ==========
523 | | """
| |_______^ D413
519 | """Equal length equals should be replaced with dashes.
520 |
521 | Parameters
| ^^^^^^^^^^ D413
522 | ==========
523 | """
|
= help: Add blank line after "Parameters"
@@ -94,16 +82,14 @@ sections.py:519:5: D413 [*] Missing blank line after last section ("Parameters")
524 525 |
525 526 |
sections.py:527:5: D413 [*] Missing blank line after last section ("Parameters")
sections.py:529:5: D413 [*] Missing blank line after last section ("Parameters")
|
526 | def replace_equals_with_dash2():
527 | """Here, the length of equals is not the same.
| _____^
528 | |
529 | | Parameters
530 | | ===========
531 | | """
| |_______^ D413
527 | """Here, the length of equals is not the same.
528 |
529 | Parameters
| ^^^^^^^^^^ D413
530 | ===========
531 | """
|
= help: Add blank line after "Parameters"
@@ -116,18 +102,13 @@ sections.py:527:5: D413 [*] Missing blank line after last section ("Parameters")
532 533 |
533 534 |
sections.py:548:5: D413 [*] Missing blank line after last section ("Args")
sections.py:550:5: D413 [*] Missing blank line after last section ("Args")
|
547 | def lowercase_sub_section_header():
548 | """Below, `returns:` should _not_ be considered a section header.
| _____^
549 | |
550 | | Args:
551 | | Here's a note.
552 | |
553 | | returns:
554 | | """
| |_______^ D413
548 | """Below, `returns:` should _not_ be considered a section header.
549 |
550 | Args:
| ^^^^ D413
551 | Here's a note.
|
= help: Add blank line after "Args"
@@ -140,18 +121,13 @@ sections.py:548:5: D413 [*] Missing blank line after last section ("Args")
555 556 |
556 557 |
sections.py:558:5: D413 [*] Missing blank line after last section ("Returns")
sections.py:563:9: D413 [*] Missing blank line after last section ("Returns")
|
557 | def titlecase_sub_section_header():
558 | """Below, `Returns:` should be considered a section header.
| _____^
559 | |
560 | | Args:
561 | | Here's a note.
562 | |
563 | | Returns:
564 | | """
| |_______^ D413
561 | Here's a note.
562 |
563 | Returns:
| ^^^^^^^ D413
564 | """
|
= help: Add blank line after "Returns"
@@ -164,20 +140,14 @@ sections.py:558:5: D413 [*] Missing blank line after last section ("Returns")
565 566 |
566 567 |
sections.py:588:5: D413 [*] Missing blank line after last section ("Parameters")
sections.py:590:5: D413 [*] Missing blank line after last section ("Parameters")
|
587 | def test_lowercase_sub_section_header_should_be_valid(parameters: list[str], value: int): # noqa: D213
588 | """Test that lower case subsection header is valid even if it has the same name as section kind.
| _____^
589 | |
590 | | Parameters:
591 | | ----------
592 | | parameters:
593 | | A list of string parameters
594 | | value:
595 | | Some value
596 | | """
| |_______^ D413
588 | """Test that lower case subsection header is valid even if it has the same name as section kind.
589 |
590 | Parameters:
| ^^^^^^^^^^ D413
591 | ----------
592 | parameters:
|
= help: Add blank line after "Parameters"

View File

@@ -1,114 +1,68 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
---
sections.py:54:5: D414 Section has no content ("Returns")
sections.py:56:5: D414 Section has no content ("Returns")
|
52 | @expect("D414: Section has no content ('Returns')")
53 | def no_underline_and_no_description(): # noqa: D416
54 | """Toggle the gizmo.
| _____^
55 | |
56 | | Returns
57 | |
58 | | """
| |_______^ D414
54 | """Toggle the gizmo.
55 |
56 | Returns
| ^^^^^^^ D414
57 |
58 | """
|
sections.py:65:5: D414 Section has no content ("Returns")
sections.py:67:5: D414 Section has no content ("Returns")
|
63 | @expect("D414: Section has no content ('Returns')")
64 | def no_underline_and_no_newline(): # noqa: D416
65 | """Toggle the gizmo.
| _____^
66 | |
67 | | Returns"""
| |______________^ D414
65 | """Toggle the gizmo.
66 |
67 | Returns"""
| ^^^^^^^ D414
|
sections.py:76:5: D414 Section has no content ("Returns")
sections.py:78:5: D414 Section has no content ("Returns")
|
74 | @expect("D414: Section has no content ('Yields')")
75 | def consecutive_sections(): # noqa: D416
76 | """Toggle the gizmo.
| _____^
77 | |
78 | | Returns
79 | | -------
80 | | Yields
81 | | ------
82 | |
83 | | Raises
84 | | ------
85 | | Questions.
86 | |
87 | | """
| |_______^ D414
76 | """Toggle the gizmo.
77 |
78 | Returns
| ^^^^^^^ D414
79 | -------
80 | Yields
|
sections.py:76:5: D414 Section has no content ("Yields")
sections.py:80:5: D414 Section has no content ("Yields")
|
74 | @expect("D414: Section has no content ('Yields')")
75 | def consecutive_sections(): # noqa: D416
76 | """Toggle the gizmo.
| _____^
77 | |
78 | | Returns
79 | | -------
80 | | Yields
81 | | ------
82 | |
83 | | Raises
84 | | ------
85 | | Questions.
86 | |
87 | | """
| |_______^ D414
78 | Returns
79 | -------
80 | Yields
| ^^^^^^ D414
81 | ------
|
sections.py:170:5: D414 Section has no content ("Returns")
sections.py:172:5: D414 Section has no content ("Returns")
|
168 | @expect("D414: Section has no content ('Returns')")
169 | def section_underline_overindented_and_contentless(): # noqa: D416
170 | """Toggle the gizmo.
| _____^
171 | |
172 | | Returns
173 | | -------
174 | | """
| |_______^ D414
170 | """Toggle the gizmo.
171 |
172 | Returns
| ^^^^^^^ D414
173 | -------
174 | """
|
sections.py:261:5: D414 Section has no content ("Returns")
sections.py:266:5: D414 Section has no content ("Returns")
|
259 | @expect("D414: Section has no content ('Returns')")
260 | def valid_google_style_section(): # noqa: D406, D407
261 | """Toggle the gizmo.
| _____^
262 | |
263 | | Args:
264 | | note: A random string.
265 | |
266 | | Returns:
267 | |
268 | | Raises:
269 | | RandomError: A random error that occurs randomly.
270 | |
271 | | """
| |_______^ D414
264 | note: A random string.
265 |
266 | Returns:
| ^^^^^^^ D414
267 |
268 | Raises:
|
sections.py:558:5: D414 Section has no content ("Returns")
sections.py:563:9: D414 Section has no content ("Returns")
|
557 | def titlecase_sub_section_header():
558 | """Below, `Returns:` should be considered a section header.
| _____^
559 | |
560 | | Args:
561 | | Here's a note.
562 | |
563 | | Returns:
564 | | """
| |_______^ D414
561 | Here's a note.
562 |
563 | Returns:
| ^^^^^^^ D414
564 | """
|

View File

@@ -9,13 +9,13 @@ use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
/// ## What it does
/// Checks for `@singledispatch` decorators on class and instance methods.
/// Checks for methods decorated with `@singledispatch`.
///
/// ## Why is this bad?
/// The `@singledispatch` decorator is intended for use with functions, not methods.
///
/// Instead, use the `@singledispatchmethod` decorator, or migrate the method to a
/// standalone function or `@staticmethod`.
/// standalone function.
///
/// ## Example
/// ```python
@@ -88,7 +88,9 @@ pub(crate) fn singledispatch_method(
);
if !matches!(
type_,
function_type::FunctionType::Method | function_type::FunctionType::ClassMethod
function_type::FunctionType::Method
| function_type::FunctionType::ClassMethod
| function_type::FunctionType::StaticMethod
) {
return;
}
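The hunk above narrows the PLE1519 check so that static methods are also flagged. For contrast, plain `functools.singledispatch` on a free function is the supported use; a minimal illustrative sketch (the `describe` function and its registrations are hypothetical, not part of the diff):

```python
from functools import singledispatch


@singledispatch
def describe(value):
    # Fallback implementation for unregistered types.
    return "object"


@describe.register
def _(value: int) -> str:
    return "int"


@describe.register
def _(value: str) -> str:
    return "str"


# Dispatch happens on the runtime type of the first argument.
assert describe(3) == "int"
assert describe("hi") == "str"
assert describe(3.5) == "object"
```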

View File

@@ -9,12 +9,11 @@ use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
/// ## What it does
/// Checks for `@singledispatchmethod` decorators on functions or static
/// methods.
/// Checks for non-method functions decorated with `@singledispatchmethod`.
///
/// ## Why is this bad?
/// The `@singledispatchmethod` decorator is intended for use with class and
/// instance methods, not functions.
/// The `@singledispatchmethod` decorator is intended for use with methods, not
/// functions.
///
/// Instead, use the `@singledispatch` decorator.
///
@@ -85,10 +84,7 @@ pub(crate) fn singledispatchmethod_function(
&checker.settings.pep8_naming.classmethod_decorators,
&checker.settings.pep8_naming.staticmethod_decorators,
);
if !matches!(
type_,
function_type::FunctionType::Function | function_type::FunctionType::StaticMethod
) {
if !matches!(type_, function_type::FunctionType::Function) {
return;
}
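`@singledispatchmethod` is the method-side counterpart that the PLE1520 fix suggests swapping away from for free functions. A minimal sketch of why it belongs on methods (the `Formatter` class is hypothetical, for illustration only):

```python
from functools import singledispatchmethod


class Formatter:
    @singledispatchmethod
    def fmt(self, value):
        # Fallback: dispatch is on the type of `value`, not `self`.
        return repr(value)

    @fmt.register
    def _(self, value: int) -> str:
        return f"int:{value}"


f = Formatter()
assert f.fmt(3) == "int:3"
assert f.fmt([1, 2]) == "[1, 2]"
```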

View File

@@ -40,4 +40,25 @@ singledispatch_method.py:15:5: PLE1519 [*] `@singledispatch` decorator should no
15 |+ @singledispatchmethod # [singledispatch-method]
16 16 | def move(self, position):
17 17 | pass
18 18 |
18 18 |
singledispatch_method.py:23:5: PLE1519 [*] `@singledispatch` decorator should not be used on methods
|
21 | pass
22 |
23 | @singledispatch # [singledispatch-method]
| ^^^^^^^^^^^^^^^ PLE1519
24 | @staticmethod
25 | def do(position):
|
= help: Replace with `@singledispatchmethod`
Unsafe fix
20 20 | def place(self, position):
21 21 | pass
22 22 |
23 |- @singledispatch # [singledispatch-method]
23 |+ @singledispatchmethod # [singledispatch-method]
24 24 | @staticmethod
25 25 | def do(position):
26 26 | pass

View File

@@ -19,31 +19,4 @@ singledispatchmethod_function.py:4:1: PLE1520 [*] `@singledispatchmethod` decora
4 |+@singledispatch # [singledispatchmethod-function]
5 5 | def convert_position(position):
6 6 | pass
7 7 |
singledispatchmethod_function.py:20:5: PLE1520 [*] `@singledispatchmethod` decorator should not be used on non-method functions
|
18 | pass
19 |
20 | @singledispatchmethod # [singledispatchmethod-function]
| ^^^^^^^^^^^^^^^^^^^^^ PLE1520
21 | @staticmethod
22 | def do(position):
|
= help: Replace with `@singledispatch`
Unsafe fix
1 |-from functools import singledispatchmethod
1 |+from functools import singledispatchmethod, singledispatch
2 2 |
3 3 |
4 4 | @singledispatchmethod # [singledispatchmethod-function]
--------------------------------------------------------------------------------
17 17 | def move(self, position):
18 18 | pass
19 19 |
20 |- @singledispatchmethod # [singledispatchmethod-function]
20 |+ @singledispatch # [singledispatchmethod-function]
21 21 | @staticmethod
22 22 | def do(position):
23 23 | pass
7 7 |

View File

@@ -5,9 +5,9 @@ invalid_characters.py:15:6: PLE2510 [*] Invalid unescaped character backspace, u
|
13 | # (Pylint, "C3002") => Rule::UnnecessaryDirectLambdaCall,
14 | #foo = 'hi'
15 | b = ''
15 | b = ''
| PLE2510
16 | b = f''
16 | b = f''
|
= help: Replace with escape sequence
@@ -15,17 +15,17 @@ invalid_characters.py:15:6: PLE2510 [*] Invalid unescaped character backspace, u
12 12 | # (Pylint, "C0414") => Rule::UselessImportAlias,
13 13 | # (Pylint, "C3002") => Rule::UnnecessaryDirectLambdaCall,
14 14 | #foo = 'hi'
15 |-b = ''
15 |-b = ''
15 |+b = '\b'
16 16 | b = f''
16 16 | b = f''
17 17 |
18 18 | b_ok = '\\b'
invalid_characters.py:16:7: PLE2510 [*] Invalid unescaped character backspace, use "\b" instead
|
14 | #foo = 'hi'
15 | b = ''
16 | b = f''
15 | b = ''
16 | b = f''
| PLE2510
17 |
18 | b_ok = '\\b'
@@ -35,8 +35,8 @@ invalid_characters.py:16:7: PLE2510 [*] Invalid unescaped character backspace, u
Safe fix
13 13 | # (Pylint, "C3002") => Rule::UnnecessaryDirectLambdaCall,
14 14 | #foo = 'hi'
15 15 | b = ''
16 |-b = f''
15 15 | b = ''
16 |-b = f''
16 |+b = f'\b'
17 17 |
18 18 | b_ok = '\\b'
@@ -46,7 +46,7 @@ invalid_characters.py:55:21: PLE2510 [*] Invalid unescaped character backspace,
|
53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 |
55 | nested_fstrings = f'{f'{f''}'}'
55 | nested_fstrings = f'{f'{f''}'}'
| PLE2510
56 |
57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
@@ -57,10 +57,8 @@ invalid_characters.py:55:21: PLE2510 [*] Invalid unescaped character backspace,
52 52 | zwsp_after_multicharacter_grapheme_cluster = "ಫ್ರಾನ್ಸಿಸ್ಕೊ "
53 53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 54 |
55 |-nested_fstrings = f'{f'{f''}'}'
55 |+nested_fstrings = f'\b{f'{f''}'}'
55 |-nested_fstrings = f'{f'{f''}'}'
55 |+nested_fstrings = f'\b{f'{f''}'}'
56 56 |
57 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
58 58 | x = f"""}}ab"""
58 58 | x = f"""}}ab"""

View File

@@ -5,9 +5,9 @@ invalid_characters.py:24:12: PLE2512 [*] Invalid unescaped character SUB, use "\
|
22 | cr_ok = f'\\r'
23 |
24 | sub = 'sub '
24 | sub = 'sub '
| PLE2512
25 | sub = f'sub '
25 | sub = f'sub '
|
= help: Replace with escape sequence
@@ -15,16 +15,16 @@ invalid_characters.py:24:12: PLE2512 [*] Invalid unescaped character SUB, use "\
21 21 | cr_ok = '\\r'
22 22 | cr_ok = f'\\r'
23 23 |
24 |-sub = 'sub '
24 |-sub = 'sub '
24 |+sub = 'sub \x1A'
25 25 | sub = f'sub '
25 25 | sub = f'sub '
26 26 |
27 27 | sub_ok = '\x1a'
invalid_characters.py:25:13: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
|
24 | sub = 'sub '
25 | sub = f'sub '
24 | sub = 'sub '
25 | sub = f'sub '
| PLE2512
26 |
27 | sub_ok = '\x1a'
@@ -34,8 +34,8 @@ invalid_characters.py:25:13: PLE2512 [*] Invalid unescaped character SUB, use "\
Safe fix
22 22 | cr_ok = f'\\r'
23 23 |
24 24 | sub = 'sub '
25 |-sub = f'sub '
24 24 | sub = 'sub '
25 |-sub = f'sub '
25 |+sub = f'sub \x1A'
26 26 |
27 27 | sub_ok = '\x1a'
@@ -45,7 +45,7 @@ invalid_characters.py:55:25: PLE2512 [*] Invalid unescaped character SUB, use "\
|
53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 |
55 | nested_fstrings = f'{f'{f''}'}'
55 | nested_fstrings = f'{f'{f''}'}'
| PLE2512
56 |
57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
@@ -56,29 +56,27 @@ invalid_characters.py:55:25: PLE2512 [*] Invalid unescaped character SUB, use "\
52 52 | zwsp_after_multicharacter_grapheme_cluster = "ಫ್ರಾನ್ಸಿಸ್ಕೊ "
53 53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 54 |
55 |-nested_fstrings = f'{f'{f''}'}'
55 |+nested_fstrings = f'{f'\x1A{f''}'}'
55 |-nested_fstrings = f'{f'{f''}'}'
55 |+nested_fstrings = f'{f'\x1A{f''}'}'
56 56 |
57 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
58 58 | x = f"""}}ab"""
58 58 | x = f"""}}ab"""
invalid_characters.py:58:12: PLE2512 [*] Invalid unescaped character SUB, use "\x1A" instead
|
57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
58 | x = f"""}}ab"""
58 | x = f"""}}ab"""
| PLE2512
59 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998256
60 | x = f"""}}ab"""
60 | x = f"""}}ab"""
|
= help: Replace with escape sequence
Safe fix
55 55 | nested_fstrings = f'{f'{f''}'}'
55 55 | nested_fstrings = f'{f'{f''}'}'
56 56 |
57 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
58 |-x = f"""}}ab"""
58 |-x = f"""}}ab"""
58 |+x = f"""}}a\x1Ab"""
59 59 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998256
60 60 | x = f"""}}ab"""
60 60 | x = f"""}}ab"""

View File

@@ -5,27 +5,27 @@ invalid_characters.py:30:16: PLE2513 [*] Invalid unescaped character ESC, use "\
|
28 | sub_ok = f'\x1a'
29 |
30 | esc = 'esc esc '
| PLE2513
31 | esc = f'esc esc '
|
30 | esc = 'esc esc '
| PLE2513
31 | esc = f'esc esc '
|
= help: Replace with escape sequence
Safe fix
27 27 | sub_ok = '\x1a'
28 28 | sub_ok = f'\x1a'
29 29 |
30 |-esc = 'esc esc '
30 |-esc = 'esc esc '
30 |+esc = 'esc esc \x1B'
31 31 | esc = f'esc esc '
31 31 | esc = f'esc esc '
32 32 |
33 33 | esc_ok = '\x1b'
invalid_characters.py:31:17: PLE2513 [*] Invalid unescaped character ESC, use "\x1B" instead
|
30 | esc = 'esc esc '
31 | esc = f'esc esc '
| PLE2513
30 | esc = 'esc esc '
31 | esc = f'esc esc '
| PLE2513
32 |
33 | esc_ok = '\x1b'
|
@@ -34,8 +34,8 @@ invalid_characters.py:31:17: PLE2513 [*] Invalid unescaped character ESC, use "\
Safe fix
28 28 | sub_ok = f'\x1a'
29 29 |
30 30 | esc = 'esc esc '
31 |-esc = f'esc esc '
30 30 | esc = 'esc esc '
31 |-esc = f'esc esc '
31 |+esc = f'esc esc \x1B'
32 32 |
33 33 | esc_ok = '\x1b'
@@ -45,7 +45,7 @@ invalid_characters.py:55:29: PLE2513 [*] Invalid unescaped character ESC, use "\
|
53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 |
55 | nested_fstrings = f'{f'{f''}'}'
55 | nested_fstrings = f'{f'{f''}'}'
| PLE2513
56 |
57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
@@ -56,26 +56,24 @@ invalid_characters.py:55:29: PLE2513 [*] Invalid unescaped character ESC, use "\
52 52 | zwsp_after_multicharacter_grapheme_cluster = "ಫ್ರಾನ್ಸಿಸ್ಕೊ "
53 53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 54 |
55 |-nested_fstrings = f'{f'{f''}'}'
55 |+nested_fstrings = f'{f'{f'\x1B'}'}'
55 |-nested_fstrings = f'{f'{f''}'}'
55 |+nested_fstrings = f'{f'{f'\x1B'}'}'
56 56 |
57 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
58 58 | x = f"""}}ab"""
58 58 | x = f"""}}ab"""
invalid_characters.py:60:12: PLE2513 [*] Invalid unescaped character ESC, use "\x1B" instead
|
58 | x = f"""}}ab"""
58 | x = f"""}}ab"""
59 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998256
60 | x = f"""}}ab"""
60 | x = f"""}}ab"""
| PLE2513
|
= help: Replace with escape sequence
Safe fix
57 57 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998106
58 58 | x = f"""}}ab"""
58 58 | x = f"""}}ab"""
59 59 | # https://github.com/astral-sh/ruff/issues/7455#issuecomment-1741998256
60 |-x = f"""}}ab"""
60 |-x = f"""}}ab"""
60 |+x = f"""}}a\x1Bb"""

View File

@@ -100,7 +100,7 @@ invalid_characters.py:52:60: PLE2515 [*] Invalid unescaped character zero-width-
52 |+zwsp_after_multicharacter_grapheme_cluster = "ಫ್ರಾನ್ಸಿಸ್ಕೊ \u200b"
53 53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 54 |
55 55 | nested_fstrings = f'{f'{f''}'}'
55 55 | nested_fstrings = f'{f'{f''}'}'
invalid_characters.py:52:61: PLE2515 [*] Invalid unescaped character zero-width-space, use "\u200B" instead
|
@@ -120,7 +120,7 @@ invalid_characters.py:52:61: PLE2515 [*] Invalid unescaped character zero-width-
52 |+zwsp_after_multicharacter_grapheme_cluster = "ಫ್ರಾನ್ಸಿಸ್ಕೊ \u200b"
53 53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
54 54 |
55 55 | nested_fstrings = f'{f'{f''}'}'
55 55 | nested_fstrings = f'{f'{f''}'}'
invalid_characters.py:53:61: PLE2515 [*] Invalid unescaped character zero-width-space, use "\u200B" instead
|
@@ -129,7 +129,7 @@ invalid_characters.py:53:61: PLE2515 [*] Invalid unescaped character zero-width-
53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
| PLE2515
54 |
55 | nested_fstrings = f'{f'{f''}'}'
55 | nested_fstrings = f'{f'{f''}'}'
|
= help: Replace with escape sequence
@@ -140,7 +140,7 @@ invalid_characters.py:53:61: PLE2515 [*] Invalid unescaped character zero-width-
53 |-zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
53 |+zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ \u200b"
54 54 |
55 55 | nested_fstrings = f'{f'{f''}'}'
55 55 | nested_fstrings = f'{f'{f''}'}'
56 56 |
invalid_characters.py:53:62: PLE2515 [*] Invalid unescaped character zero-width-space, use "\u200B" instead
@@ -150,7 +150,7 @@ invalid_characters.py:53:62: PLE2515 [*] Invalid unescaped character zero-width-
53 | zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
| PLE2515
54 |
55 | nested_fstrings = f'{f'{f''}'}'
55 | nested_fstrings = f'{f'{f''}'}'
|
= help: Replace with escape sequence
@@ -161,7 +161,5 @@ invalid_characters.py:53:62: PLE2515 [*] Invalid unescaped character zero-width-
53 |-zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ "
53 |+zwsp_after_multicharacter_grapheme_cluster = f"ಫ್ರಾನ್ಸಿಸ್ಕೊ \u200b"
54 54 |
55 55 | nested_fstrings = f'{f'{f''}'}'
56 56 |
55 55 | nested_fstrings = f'{f'{f''}'}'
56 56 |

View File

@@ -32,6 +32,7 @@ mod tests {
#[test_case(Rule::ImplicitCwd, Path::new("FURB177.py"))]
#[test_case(Rule::SingleItemMembershipTest, Path::new("FURB171.py"))]
#[test_case(Rule::BitCount, Path::new("FURB161.py"))]
#[test_case(Rule::IntOnSlicedStr, Path::new("FURB166.py"))]
#[test_case(Rule::RegexFlagAlias, Path::new("FURB167.py"))]
#[test_case(Rule::IsinstanceTypeNone, Path::new("FURB168.py"))]
#[test_case(Rule::TypeNoneComparison, Path::new("FURB169.py"))]

View File

@@ -0,0 +1,134 @@
use ruff_diagnostics::{AlwaysFixableViolation, Diagnostic, Edit, Fix};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::{Expr, ExprCall, Identifier};
use ruff_text_size::Ranged;
use crate::checkers::ast::Checker;
/// ## What it does
/// Checks for uses of `int` with an explicit base in which a string expression
/// is stripped of its leading prefix (i.e., `0b`, `0o`, or `0x`).
///
/// ## Why is this bad?
/// Given an integer string with a prefix (e.g., `0xABC`), Python can automatically
/// determine the base of the integer by the prefix without needing to specify
/// it explicitly.
///
/// Instead of `int(num[2:], 16)`, use `int(num, 0)`, which will automatically
/// deduce the base based on the prefix.
///
/// ## Example
/// ```python
/// num = "0xABC"
///
/// if num.startswith("0b"):
/// i = int(num[2:], 2)
/// elif num.startswith("0o"):
/// i = int(num[2:], 8)
/// elif num.startswith("0x"):
/// i = int(num[2:], 16)
///
/// print(i)
/// ```
///
/// Use instead:
/// ```python
/// num = "0xABC"
///
/// i = int(num, 0)
///
/// print(i)
/// ```
///
/// ## Fix safety
/// The rule's fix is marked as unsafe, as Ruff cannot guarantee that the
/// argument to `int` will remain valid when its base is included in the
/// function call.
///
/// ## References
/// - [Python documentation: `int`](https://docs.python.org/3/library/functions.html#int)
#[violation]
pub struct IntOnSlicedStr {
base: u8,
}
impl AlwaysFixableViolation for IntOnSlicedStr {
#[derive_message_formats]
fn message(&self) -> String {
let IntOnSlicedStr { base } = self;
format!("Use of `int` with explicit `base={base}` after removing prefix")
}
fn fix_title(&self) -> String {
format!("Replace with `base=0`")
}
}
pub(crate) fn int_on_sliced_str(checker: &mut Checker, call: &ExprCall) {
// Verify that the function is `int`.
let Expr::Name(name) = call.func.as_ref() else {
return;
};
if name.id.as_str() != "int" {
return;
}
if !checker.semantic().is_builtin("int") {
return;
}
// There must be exactly two arguments (e.g., `int(num[2:], 16)`).
let (expression, base) = match (
call.arguments.args.as_ref(),
call.arguments.keywords.as_ref(),
) {
([expression], [base]) if base.arg.as_ref().map(Identifier::as_str) == Some("base") => {
(expression, &base.value)
}
([expression, base], []) => (expression, base),
_ => {
return;
}
};
// The base must be a number literal with a value of 2, 8, or 16.
let Some(base_u8) = base
.as_number_literal_expr()
.and_then(|base| base.value.as_int())
.and_then(ruff_python_ast::Int::as_u8)
else {
return;
};
if !matches!(base_u8, 2 | 8 | 16) {
return;
}
// Determine whether the expression is a slice of a string (e.g., `num[2:]`).
let Expr::Subscript(expr_subscript) = expression else {
return;
};
let Expr::Slice(expr_slice) = expr_subscript.slice.as_ref() else {
return;
};
if expr_slice.upper.is_some() || expr_slice.step.is_some() {
return;
}
if !expr_slice
.lower
.as_ref()
.and_then(|expr| expr.as_number_literal_expr())
.and_then(|expr| expr.value.as_int())
.is_some_and(|expr| expr.as_u8() == Some(2))
{
return;
}
let mut diagnostic = Diagnostic::new(IntOnSlicedStr { base: base_u8 }, call.range());
diagnostic.set_fix(Fix::unsafe_edits(
Edit::range_replacement(
checker.locator().slice(&*expr_subscript.value).to_string(),
expression.range(),
),
[Edit::range_replacement("0".to_string(), base.range())],
));
checker.diagnostics.push(diagnostic);
}
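The FURB166 fix rests on `int` with `base=0` inferring the base from the string's prefix, which the rule's doc-comment describes. That equivalence can be checked directly in Python (illustrative; the loop values mirror the test fixture above):

```python
# With base=0, int() reads the 0b/0o/0x prefix itself, so the manual
# [2:] slice plus an explicit base is unnecessary.
for text, base in [("0b1010", 2), ("0o777", 8), ("0xFFFF", 16)]:
    assert int(text[2:], base) == int(text, 0)

assert int("0xABC", 0) == 0xABC
```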

View File

@@ -5,6 +5,7 @@ pub(crate) use for_loop_set_mutations::*;
pub(crate) use hashlib_digest_hex::*;
pub(crate) use if_expr_min_max::*;
pub(crate) use implicit_cwd::*;
pub(crate) use int_on_sliced_str::*;
pub(crate) use isinstance_type_none::*;
pub(crate) use list_reverse_copy::*;
pub(crate) use math_constant::*;
@@ -31,6 +32,7 @@ mod for_loop_set_mutations;
mod hashlib_digest_hex;
mod if_expr_min_max;
mod implicit_cwd;
mod int_on_sliced_str;
mod isinstance_type_none;
mod list_reverse_copy;
mod math_constant;

View File

@@ -0,0 +1,141 @@
---
source: crates/ruff_linter/src/rules/refurb/mod.rs
---
FURB166.py:3:5: FURB166 [*] Use of `int` with explicit `base=2` after removing prefix
|
1 | # Errors
2 |
3 | _ = int("0b1010"[2:], 2)
| ^^^^^^^^^^^^^^^^^^^^ FURB166
4 | _ = int("0o777"[2:], 8)
5 | _ = int("0xFFFF"[2:], 16)
|
= help: Replace with `base=0`
Unsafe fix
1 1 | # Errors
2 2 |
3 |-_ = int("0b1010"[2:], 2)
3 |+_ = int("0b1010", 0)
4 4 | _ = int("0o777"[2:], 8)
5 5 | _ = int("0xFFFF"[2:], 16)
6 6 |
FURB166.py:4:5: FURB166 [*] Use of `int` with explicit `base=8` after removing prefix
|
3 | _ = int("0b1010"[2:], 2)
4 | _ = int("0o777"[2:], 8)
| ^^^^^^^^^^^^^^^^^^^ FURB166
5 | _ = int("0xFFFF"[2:], 16)
|
= help: Replace with `base=0`
Unsafe fix
1 1 | # Errors
2 2 |
3 3 | _ = int("0b1010"[2:], 2)
4 |-_ = int("0o777"[2:], 8)
4 |+_ = int("0o777", 0)
5 5 | _ = int("0xFFFF"[2:], 16)
6 6 |
7 7 | b = "0b11"
FURB166.py:5:5: FURB166 [*] Use of `int` with explicit `base=16` after removing prefix
|
3 | _ = int("0b1010"[2:], 2)
4 | _ = int("0o777"[2:], 8)
5 | _ = int("0xFFFF"[2:], 16)
| ^^^^^^^^^^^^^^^^^^^^^ FURB166
6 |
7 | b = "0b11"
|
= help: Replace with `base=0`
Unsafe fix
2 2 |
3 3 | _ = int("0b1010"[2:], 2)
4 4 | _ = int("0o777"[2:], 8)
5 |-_ = int("0xFFFF"[2:], 16)
5 |+_ = int("0xFFFF", 0)
6 6 |
7 7 | b = "0b11"
8 8 | _ = int(b[2:], 2)
FURB166.py:8:5: FURB166 [*] Use of `int` with explicit `base=2` after removing prefix
|
7 | b = "0b11"
8 | _ = int(b[2:], 2)
| ^^^^^^^^^^^^^ FURB166
9 |
10 | _ = int("0xFFFF"[2:], base=16)
|
= help: Replace with `base=0`
Unsafe fix
5 5 | _ = int("0xFFFF"[2:], 16)
6 6 |
7 7 | b = "0b11"
8 |-_ = int(b[2:], 2)
8 |+_ = int(b, 0)
9 9 |
10 10 | _ = int("0xFFFF"[2:], base=16)
11 11 |
FURB166.py:10:5: FURB166 [*] Use of `int` with explicit `base=16` after removing prefix
|
8 | _ = int(b[2:], 2)
9 |
10 | _ = int("0xFFFF"[2:], base=16)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^ FURB166
11 |
12 | _ = int(b"0xFFFF"[2:], 16)
|
= help: Replace with `base=0`
Unsafe fix
7 7 | b = "0b11"
8 8 | _ = int(b[2:], 2)
9 9 |
10 |-_ = int("0xFFFF"[2:], base=16)
10 |+_ = int("0xFFFF", base=0)
11 11 |
12 12 | _ = int(b"0xFFFF"[2:], 16)
13 13 |
FURB166.py:12:5: FURB166 [*] Use of `int` with explicit `base=16` after removing prefix
|
10 | _ = int("0xFFFF"[2:], base=16)
11 |
12 | _ = int(b"0xFFFF"[2:], 16)
| ^^^^^^^^^^^^^^^^^^^^^^ FURB166
|
= help: Replace with `base=0`
Unsafe fix
9 9 |
10 10 | _ = int("0xFFFF"[2:], base=16)
11 11 |
12 |-_ = int(b"0xFFFF"[2:], 16)
12 |+_ = int(b"0xFFFF", 0)
13 13 |
14 14 |
15 15 | def get_str():
FURB166.py:19:5: FURB166 [*] Use of `int` with explicit `base=16` after removing prefix
|
19 | _ = int(get_str()[2:], 16)
| ^^^^^^^^^^^^^^^^^^^^^^ FURB166
20 |
21 | # OK
|
= help: Replace with `base=0`
Unsafe fix
16 16 | return "0xFFF"
17 17 |
18 18 |
19 |-_ = int(get_str()[2:], 16)
19 |+_ = int(get_str(), 0)
20 20 |
21 21 | # OK
22 22 |
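The FURB166 fix shown in this snapshot relies on CPython's `int()` accepting `base=0`, which infers the base from the literal's prefix, so slicing the prefix off manually is unnecessary. A quick check of the equivalence the fix depends on:

```python
# With base=0, int() infers the base from the "0b"/"0o"/"0x" prefix,
# so int(s, 0) matches int(s[2:], explicit_base) for prefixed literals.
assert int("0b1010", 0) == int("0b1010"[2:], 2) == 10
assert int("0o777", 0) == int("0o777"[2:], 8) == 511
assert int("0xFFFF", 0) == int("0xFFFF"[2:], 16) == 65535
# Bytes literals work the same way, matching the b"0xFFFF" case above.
assert int(b"0xFFFF", 0) == 65535
```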


@@ -11,7 +11,7 @@ use ruff_diagnostics::SourceMap;
use ruff_notebook::{Cell, Notebook, NotebookError};
use ruff_python_ast::PySourceType;
use colored::Colorize;
use owo_colors::OwoColorize;
use crate::fs;


@@ -19,6 +19,7 @@ use ruff_python_trivia::textwrap::dedent;
use ruff_source_file::{Locator, SourceFileBuilder};
use ruff_text_size::Ranged;
use crate::colors;
use crate::directives;
use crate::fix::{fix_file, FixResult};
use crate::linter::{check_path, LinterResult, TokenSource};
@@ -293,7 +294,7 @@ pub(crate) fn print_jupyter_messages(
path: &Path,
notebook: &Notebook,
) -> String {
let mut output = Vec::new();
let mut output = colors::none(Vec::new());
TextEmitter::default()
.with_show_fix_status(true)
@@ -310,11 +311,11 @@ pub(crate) fn print_jupyter_messages(
)
.unwrap();
String::from_utf8(output).unwrap()
String::from_utf8(output.into_inner()).unwrap()
}
pub(crate) fn print_messages(messages: &[Message]) -> String {
let mut output = Vec::new();
let mut output = colors::none(Vec::new());
TextEmitter::default()
.with_show_fix_status(true)
@@ -328,7 +329,7 @@ pub(crate) fn print_messages(messages: &[Message]) -> String {
)
.unwrap();
String::from_utf8(output).unwrap()
String::from_utf8(output.into_inner()).unwrap()
}
#[macro_export]


@@ -2727,73 +2727,6 @@ pub enum ExprContext {
Store,
Del,
}
impl ExprContext {
#[inline]
pub const fn load(&self) -> Option<ExprContextLoad> {
match self {
ExprContext::Load => Some(ExprContextLoad),
_ => None,
}
}
#[inline]
pub const fn store(&self) -> Option<ExprContextStore> {
match self {
ExprContext::Store => Some(ExprContextStore),
_ => None,
}
}
#[inline]
pub const fn del(&self) -> Option<ExprContextDel> {
match self {
ExprContext::Del => Some(ExprContextDel),
_ => None,
}
}
}
pub struct ExprContextLoad;
impl From<ExprContextLoad> for ExprContext {
fn from(_: ExprContextLoad) -> Self {
ExprContext::Load
}
}
impl std::cmp::PartialEq<ExprContext> for ExprContextLoad {
#[inline]
fn eq(&self, other: &ExprContext) -> bool {
matches!(other, ExprContext::Load)
}
}
pub struct ExprContextStore;
impl From<ExprContextStore> for ExprContext {
fn from(_: ExprContextStore) -> Self {
ExprContext::Store
}
}
impl std::cmp::PartialEq<ExprContext> for ExprContextStore {
#[inline]
fn eq(&self, other: &ExprContext) -> bool {
matches!(other, ExprContext::Store)
}
}
pub struct ExprContextDel;
impl From<ExprContextDel> for ExprContext {
fn from(_: ExprContextDel) -> Self {
ExprContext::Del
}
}
impl std::cmp::PartialEq<ExprContext> for ExprContextDel {
#[inline]
fn eq(&self, other: &ExprContext) -> bool {
matches!(other, ExprContext::Del)
}
}
/// See also [boolop](https://docs.python.org/3/library/ast.html#ast.BoolOp)
#[derive(Clone, Debug, PartialEq, is_macro::Is, Copy, Hash, Eq)]
@@ -2801,49 +2734,19 @@ pub enum BoolOp {
And,
Or,
}
impl BoolOp {
#[inline]
pub const fn and(&self) -> Option<BoolOpAnd> {
pub const fn as_str(&self) -> &'static str {
match self {
BoolOp::And => Some(BoolOpAnd),
BoolOp::Or => None,
}
}
#[inline]
pub const fn or(&self) -> Option<BoolOpOr> {
match self {
BoolOp::Or => Some(BoolOpOr),
BoolOp::And => None,
BoolOp::And => "and",
BoolOp::Or => "or",
}
}
}
pub struct BoolOpAnd;
impl From<BoolOpAnd> for BoolOp {
fn from(_: BoolOpAnd) -> Self {
BoolOp::And
}
}
impl std::cmp::PartialEq<BoolOp> for BoolOpAnd {
#[inline]
fn eq(&self, other: &BoolOp) -> bool {
matches!(other, BoolOp::And)
}
}
pub struct BoolOpOr;
impl From<BoolOpOr> for BoolOp {
fn from(_: BoolOpOr) -> Self {
BoolOp::Or
}
}
impl std::cmp::PartialEq<BoolOp> for BoolOpOr {
#[inline]
fn eq(&self, other: &BoolOp) -> bool {
matches!(other, BoolOp::Or)
impl fmt::Display for BoolOp {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(self.as_str())
}
}
@@ -2864,291 +2767,30 @@ pub enum Operator {
BitAnd,
FloorDiv,
}
impl Operator {
#[inline]
pub const fn operator_add(&self) -> Option<OperatorAdd> {
pub const fn as_str(&self) -> &'static str {
match self {
Operator::Add => Some(OperatorAdd),
_ => None,
}
}
#[inline]
pub const fn operator_sub(&self) -> Option<OperatorSub> {
match self {
Operator::Sub => Some(OperatorSub),
_ => None,
}
}
#[inline]
pub const fn operator_mult(&self) -> Option<OperatorMult> {
match self {
Operator::Mult => Some(OperatorMult),
_ => None,
}
}
#[inline]
pub const fn operator_mat_mult(&self) -> Option<OperatorMatMult> {
match self {
Operator::MatMult => Some(OperatorMatMult),
_ => None,
}
}
#[inline]
pub const fn operator_div(&self) -> Option<OperatorDiv> {
match self {
Operator::Div => Some(OperatorDiv),
_ => None,
}
}
#[inline]
pub const fn operator_mod(&self) -> Option<OperatorMod> {
match self {
Operator::Mod => Some(OperatorMod),
_ => None,
}
}
#[inline]
pub const fn operator_pow(&self) -> Option<OperatorPow> {
match self {
Operator::Pow => Some(OperatorPow),
_ => None,
}
}
#[inline]
pub const fn operator_l_shift(&self) -> Option<OperatorLShift> {
match self {
Operator::LShift => Some(OperatorLShift),
_ => None,
}
}
#[inline]
pub const fn operator_r_shift(&self) -> Option<OperatorRShift> {
match self {
Operator::RShift => Some(OperatorRShift),
_ => None,
}
}
#[inline]
pub const fn operator_bit_or(&self) -> Option<OperatorBitOr> {
match self {
Operator::BitOr => Some(OperatorBitOr),
_ => None,
}
}
#[inline]
pub const fn operator_bit_xor(&self) -> Option<OperatorBitXor> {
match self {
Operator::BitXor => Some(OperatorBitXor),
_ => None,
}
}
#[inline]
pub const fn operator_bit_and(&self) -> Option<OperatorBitAnd> {
match self {
Operator::BitAnd => Some(OperatorBitAnd),
_ => None,
}
}
#[inline]
pub const fn operator_floor_div(&self) -> Option<OperatorFloorDiv> {
match self {
Operator::FloorDiv => Some(OperatorFloorDiv),
_ => None,
Operator::Add => "+",
Operator::Sub => "-",
Operator::Mult => "*",
Operator::MatMult => "@",
Operator::Div => "/",
Operator::Mod => "%",
Operator::Pow => "**",
Operator::LShift => "<<",
Operator::RShift => ">>",
Operator::BitOr => "|",
Operator::BitXor => "^",
Operator::BitAnd => "&",
Operator::FloorDiv => "//",
}
}
}
pub struct OperatorAdd;
impl From<OperatorAdd> for Operator {
fn from(_: OperatorAdd) -> Self {
Operator::Add
}
}
impl std::cmp::PartialEq<Operator> for OperatorAdd {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::Add)
}
}
pub struct OperatorSub;
impl From<OperatorSub> for Operator {
fn from(_: OperatorSub) -> Self {
Operator::Sub
}
}
impl std::cmp::PartialEq<Operator> for OperatorSub {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::Sub)
}
}
pub struct OperatorMult;
impl From<OperatorMult> for Operator {
fn from(_: OperatorMult) -> Self {
Operator::Mult
}
}
impl std::cmp::PartialEq<Operator> for OperatorMult {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::Mult)
}
}
pub struct OperatorMatMult;
impl From<OperatorMatMult> for Operator {
fn from(_: OperatorMatMult) -> Self {
Operator::MatMult
}
}
impl std::cmp::PartialEq<Operator> for OperatorMatMult {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::MatMult)
}
}
pub struct OperatorDiv;
impl From<OperatorDiv> for Operator {
fn from(_: OperatorDiv) -> Self {
Operator::Div
}
}
impl std::cmp::PartialEq<Operator> for OperatorDiv {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::Div)
}
}
pub struct OperatorMod;
impl From<OperatorMod> for Operator {
fn from(_: OperatorMod) -> Self {
Operator::Mod
}
}
impl std::cmp::PartialEq<Operator> for OperatorMod {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::Mod)
}
}
pub struct OperatorPow;
impl From<OperatorPow> for Operator {
fn from(_: OperatorPow) -> Self {
Operator::Pow
}
}
impl std::cmp::PartialEq<Operator> for OperatorPow {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::Pow)
}
}
pub struct OperatorLShift;
impl From<OperatorLShift> for Operator {
fn from(_: OperatorLShift) -> Self {
Operator::LShift
}
}
impl std::cmp::PartialEq<Operator> for OperatorLShift {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::LShift)
}
}
pub struct OperatorRShift;
impl From<OperatorRShift> for Operator {
fn from(_: OperatorRShift) -> Self {
Operator::RShift
}
}
impl std::cmp::PartialEq<Operator> for OperatorRShift {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::RShift)
}
}
pub struct OperatorBitOr;
impl From<OperatorBitOr> for Operator {
fn from(_: OperatorBitOr) -> Self {
Operator::BitOr
}
}
impl std::cmp::PartialEq<Operator> for OperatorBitOr {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::BitOr)
}
}
pub struct OperatorBitXor;
impl From<OperatorBitXor> for Operator {
fn from(_: OperatorBitXor) -> Self {
Operator::BitXor
}
}
impl std::cmp::PartialEq<Operator> for OperatorBitXor {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::BitXor)
}
}
pub struct OperatorBitAnd;
impl From<OperatorBitAnd> for Operator {
fn from(_: OperatorBitAnd) -> Self {
Operator::BitAnd
}
}
impl std::cmp::PartialEq<Operator> for OperatorBitAnd {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::BitAnd)
}
}
pub struct OperatorFloorDiv;
impl From<OperatorFloorDiv> for Operator {
fn from(_: OperatorFloorDiv) -> Self {
Operator::FloorDiv
}
}
impl std::cmp::PartialEq<Operator> for OperatorFloorDiv {
#[inline]
fn eq(&self, other: &Operator) -> bool {
matches!(other, Operator::FloorDiv)
impl fmt::Display for Operator {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(self.as_str())
}
}
@@ -3160,93 +2802,21 @@ pub enum UnaryOp {
UAdd,
USub,
}
impl UnaryOp {
#[inline]
pub const fn invert(&self) -> Option<UnaryOpInvert> {
pub const fn as_str(&self) -> &'static str {
match self {
UnaryOp::Invert => Some(UnaryOpInvert),
_ => None,
}
}
#[inline]
pub const fn not(&self) -> Option<UnaryOpNot> {
match self {
UnaryOp::Not => Some(UnaryOpNot),
_ => None,
}
}
#[inline]
pub const fn u_add(&self) -> Option<UnaryOpUAdd> {
match self {
UnaryOp::UAdd => Some(UnaryOpUAdd),
_ => None,
}
}
#[inline]
pub const fn u_sub(&self) -> Option<UnaryOpUSub> {
match self {
UnaryOp::USub => Some(UnaryOpUSub),
_ => None,
UnaryOp::Invert => "~",
UnaryOp::Not => "not",
UnaryOp::UAdd => "+",
UnaryOp::USub => "-",
}
}
}
pub struct UnaryOpInvert;
impl From<UnaryOpInvert> for UnaryOp {
fn from(_: UnaryOpInvert) -> Self {
UnaryOp::Invert
}
}
impl std::cmp::PartialEq<UnaryOp> for UnaryOpInvert {
#[inline]
fn eq(&self, other: &UnaryOp) -> bool {
matches!(other, UnaryOp::Invert)
}
}
pub struct UnaryOpNot;
impl From<UnaryOpNot> for UnaryOp {
fn from(_: UnaryOpNot) -> Self {
UnaryOp::Not
}
}
impl std::cmp::PartialEq<UnaryOp> for UnaryOpNot {
#[inline]
fn eq(&self, other: &UnaryOp) -> bool {
matches!(other, UnaryOp::Not)
}
}
pub struct UnaryOpUAdd;
impl From<UnaryOpUAdd> for UnaryOp {
fn from(_: UnaryOpUAdd) -> Self {
UnaryOp::UAdd
}
}
impl std::cmp::PartialEq<UnaryOp> for UnaryOpUAdd {
#[inline]
fn eq(&self, other: &UnaryOp) -> bool {
matches!(other, UnaryOp::UAdd)
}
}
pub struct UnaryOpUSub;
impl From<UnaryOpUSub> for UnaryOp {
fn from(_: UnaryOpUSub) -> Self {
UnaryOp::USub
}
}
impl std::cmp::PartialEq<UnaryOp> for UnaryOpUSub {
#[inline]
fn eq(&self, other: &UnaryOp) -> bool {
matches!(other, UnaryOp::USub)
impl fmt::Display for UnaryOp {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(self.as_str())
}
}
@@ -3264,225 +2834,27 @@ pub enum CmpOp {
In,
NotIn,
}
impl CmpOp {
#[inline]
pub const fn cmp_op_eq(&self) -> Option<CmpOpEq> {
pub const fn as_str(&self) -> &'static str {
match self {
CmpOp::Eq => Some(CmpOpEq),
_ => None,
}
}
#[inline]
pub const fn cmp_op_not_eq(&self) -> Option<CmpOpNotEq> {
match self {
CmpOp::NotEq => Some(CmpOpNotEq),
_ => None,
}
}
#[inline]
pub const fn cmp_op_lt(&self) -> Option<CmpOpLt> {
match self {
CmpOp::Lt => Some(CmpOpLt),
_ => None,
}
}
#[inline]
pub const fn cmp_op_lt_e(&self) -> Option<CmpOpLtE> {
match self {
CmpOp::LtE => Some(CmpOpLtE),
_ => None,
}
}
#[inline]
pub const fn cmp_op_gt(&self) -> Option<CmpOpGt> {
match self {
CmpOp::Gt => Some(CmpOpGt),
_ => None,
}
}
#[inline]
pub const fn cmp_op_gt_e(&self) -> Option<CmpOpGtE> {
match self {
CmpOp::GtE => Some(CmpOpGtE),
_ => None,
}
}
#[inline]
pub const fn cmp_op_is(&self) -> Option<CmpOpIs> {
match self {
CmpOp::Is => Some(CmpOpIs),
_ => None,
}
}
#[inline]
pub const fn cmp_op_is_not(&self) -> Option<CmpOpIsNot> {
match self {
CmpOp::IsNot => Some(CmpOpIsNot),
_ => None,
}
}
#[inline]
pub const fn cmp_op_in(&self) -> Option<CmpOpIn> {
match self {
CmpOp::In => Some(CmpOpIn),
_ => None,
}
}
#[inline]
pub const fn cmp_op_not_in(&self) -> Option<CmpOpNotIn> {
match self {
CmpOp::NotIn => Some(CmpOpNotIn),
_ => None,
CmpOp::Eq => "==",
CmpOp::NotEq => "!=",
CmpOp::Lt => "<",
CmpOp::LtE => "<=",
CmpOp::Gt => ">",
CmpOp::GtE => ">=",
CmpOp::Is => "is",
CmpOp::IsNot => "is not",
CmpOp::In => "in",
CmpOp::NotIn => "not in",
}
}
}
pub struct CmpOpEq;
impl From<CmpOpEq> for CmpOp {
fn from(_: CmpOpEq) -> Self {
CmpOp::Eq
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpEq {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::Eq)
}
}
pub struct CmpOpNotEq;
impl From<CmpOpNotEq> for CmpOp {
fn from(_: CmpOpNotEq) -> Self {
CmpOp::NotEq
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpNotEq {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::NotEq)
}
}
pub struct CmpOpLt;
impl From<CmpOpLt> for CmpOp {
fn from(_: CmpOpLt) -> Self {
CmpOp::Lt
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpLt {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::Lt)
}
}
pub struct CmpOpLtE;
impl From<CmpOpLtE> for CmpOp {
fn from(_: CmpOpLtE) -> Self {
CmpOp::LtE
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpLtE {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::LtE)
}
}
pub struct CmpOpGt;
impl From<CmpOpGt> for CmpOp {
fn from(_: CmpOpGt) -> Self {
CmpOp::Gt
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpGt {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::Gt)
}
}
pub struct CmpOpGtE;
impl From<CmpOpGtE> for CmpOp {
fn from(_: CmpOpGtE) -> Self {
CmpOp::GtE
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpGtE {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::GtE)
}
}
pub struct CmpOpIs;
impl From<CmpOpIs> for CmpOp {
fn from(_: CmpOpIs) -> Self {
CmpOp::Is
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpIs {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::Is)
}
}
pub struct CmpOpIsNot;
impl From<CmpOpIsNot> for CmpOp {
fn from(_: CmpOpIsNot) -> Self {
CmpOp::IsNot
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpIsNot {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::IsNot)
}
}
pub struct CmpOpIn;
impl From<CmpOpIn> for CmpOp {
fn from(_: CmpOpIn) -> Self {
CmpOp::In
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpIn {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::In)
}
}
pub struct CmpOpNotIn;
impl From<CmpOpNotIn> for CmpOp {
fn from(_: CmpOpNotIn) -> Self {
CmpOp::NotIn
}
}
impl std::cmp::PartialEq<CmpOp> for CmpOpNotIn {
#[inline]
fn eq(&self, other: &CmpOp) -> bool {
matches!(other, CmpOp::NotIn)
impl fmt::Display for CmpOp {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(self.as_str())
}
}
@@ -3991,23 +3363,6 @@ impl Deref for TypeParams {
pub type Suite = Vec<Stmt>;
impl CmpOp {
pub fn as_str(&self) -> &'static str {
match self {
CmpOp::Eq => "==",
CmpOp::NotEq => "!=",
CmpOp::Lt => "<",
CmpOp::LtE => "<=",
CmpOp::Gt => ">",
CmpOp::GtE => ">=",
CmpOp::Is => "is",
CmpOp::IsNot => "is not",
CmpOp::In => "in",
CmpOp::NotIn => "not in",
}
}
}
impl Parameters {
pub fn empty(range: TextRange) -> Self {
Self {


@@ -18,7 +18,6 @@ ruff_python_ast = { path = "../ruff_python_ast" }
ruff_text_size = { path = "../ruff_text_size" }
anyhow = { workspace = true }
bitflags = { workspace = true }
bstr = { workspace = true }
is-macro = { workspace = true }
itertools = { workspace = true }


@@ -680,7 +680,7 @@ impl<'a> SemanticModel<'a> {
/// print(python_version)
/// ```
///
/// ...then `resolve_call_path(${python_version})` will resolve to `sys.version_info`.
/// ...then `resolve_qualified_name(${python_version})` will resolve to `sys.version_info`.
pub fn resolve_qualified_name<'name, 'expr: 'name>(
&self,
value: &'expr Expr,


@@ -2,11 +2,13 @@
mod document;
mod range;
mod replacement;
pub use document::Document;
pub(crate) use document::DocumentVersion;
use lsp_types::PositionEncodingKind;
pub(crate) use range::{RangeExt, ToRangeExt};
pub(crate) use replacement::Replacement;
/// A convenient enumeration for supported text encodings. Can be converted to [`lsp_types::PositionEncodingKind`].
// Please maintain the order from least to greatest priority for the derived `Ord` impl.


@@ -0,0 +1,98 @@
use ruff_text_size::{TextLen, TextRange, TextSize};
pub(crate) struct Replacement {
pub(crate) source_range: TextRange,
pub(crate) modified_range: TextRange,
}
impl Replacement {
/// Creates a [`Replacement`] that describes the `source_range` of `source` to replace
/// with `modified` sliced by `modified_range`.
pub(crate) fn between(
source: &str,
source_line_starts: &[TextSize],
modified: &str,
modified_line_starts: &[TextSize],
) -> Self {
let mut source_start = TextSize::default();
let mut replaced_start = TextSize::default();
let mut source_end = source.text_len();
let mut replaced_end = modified.text_len();
let mut line_iter = source_line_starts
.iter()
.copied()
.zip(modified_line_starts.iter().copied());
for (source_line_start, modified_line_start) in line_iter.by_ref() {
if source_line_start != modified_line_start
|| source[TextRange::new(source_start, source_line_start)]
!= modified[TextRange::new(replaced_start, modified_line_start)]
{
break;
}
source_start = source_line_start;
replaced_start = modified_line_start;
}
let mut line_iter = line_iter.rev();
for (old_line_start, new_line_start) in line_iter.by_ref() {
if old_line_start <= source_start
|| new_line_start <= replaced_start
|| source[TextRange::new(old_line_start, source_end)]
!= modified[TextRange::new(new_line_start, replaced_end)]
{
break;
}
source_end = old_line_start;
replaced_end = new_line_start;
}
Replacement {
source_range: TextRange::new(source_start, source_end),
modified_range: TextRange::new(replaced_start, replaced_end),
}
}
}
#[cfg(test)]
mod tests {
use ruff_source_file::LineIndex;
use super::Replacement;
#[test]
fn find_replacement_range_works() {
let original = r#"
aaaa
bbbb
cccc
dddd
eeee
"#;
let original_index = LineIndex::from_source_text(original);
let new = r#"
bb
cccc
dd
"#;
let new_index = LineIndex::from_source_text(new);
let expected = r#"
bb
cccc
dd
"#;
let replacement = Replacement::between(
original,
original_index.line_starts(),
new,
new_index.line_starts(),
);
let mut test = original.to_string();
test.replace_range(
replacement.source_range.start().to_usize()..replacement.source_range.end().to_usize(),
&new[replacement.modified_range],
);
assert_eq!(expected, &test);
}
}
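`Replacement::between` works by trimming the longest common run of leading lines and the longest common run of trailing lines, leaving only the changed region to replace. A minimal line-based Python sketch of the same idea (`replacement_between` is a hypothetical helper for illustration, not Ruff's API):

```python
def replacement_between(source_lines: list[str], modified_lines: list[str]):
    """Return ((src_start, src_end), (mod_start, mod_end)) line ranges
    such that replacing source_lines[src_start:src_end] with
    modified_lines[mod_start:mod_end] turns source into modified."""
    # Trim the longest common prefix of lines.
    start = 0
    while (start < len(source_lines) and start < len(modified_lines)
           and source_lines[start] == modified_lines[start]):
        start += 1
    # Trim the longest common suffix, without crossing the prefix.
    src_end, mod_end = len(source_lines), len(modified_lines)
    while (src_end > start and mod_end > start
           and source_lines[src_end - 1] == modified_lines[mod_end - 1]):
        src_end -= 1
        mod_end -= 1
    return (start, src_end), (start, mod_end)
```

As in the Rust test above, diffing `["a", "b", "c", "d", "e"]` against `["a", "x", "d", "e"]` yields one minimal replacement: lines 1..3 of the source are replaced by line 1..2 of the modified text.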


@@ -0,0 +1,79 @@
use ruff_linter::{
linter::{FixerResult, LinterResult},
settings::{flags, types::UnsafeFixes, LinterSettings},
source_kind::SourceKind,
};
use ruff_python_ast::PySourceType;
use ruff_source_file::LineIndex;
use std::{borrow::Cow, path::Path};
use crate::{
edit::{Replacement, ToRangeExt},
PositionEncoding,
};
pub(crate) fn fix_all(
document: &crate::edit::Document,
linter_settings: &LinterSettings,
encoding: PositionEncoding,
) -> crate::Result<Vec<lsp_types::TextEdit>> {
let source = document.contents();
let source_type = PySourceType::default();
// TODO(jane): Support Jupyter Notebooks
let source_kind = SourceKind::Python(source.to_string());
// We need to iteratively apply all safe fixes onto a single file and then
// create a diff between the modified file and the original source to use as a single workspace
// edit.
// If we simply generated the diagnostics with `check_path` and then applied fixes individually,
// there's a possibility they could overlap or introduce new problems that need to be fixed,
// which is inconsistent with how `ruff check --fix` works.
let FixerResult {
transformed,
result: LinterResult { error, .. },
..
} = ruff_linter::linter::lint_fix(
Path::new("<filename>"),
None,
flags::Noqa::Enabled,
UnsafeFixes::Disabled,
linter_settings,
&source_kind,
source_type,
)?;
if let Some(error) = error {
// abort early if a parsing error occurred
return Err(anyhow::anyhow!(
"A parsing error occurred during `fix_all`: {error}"
));
}
// fast path: if `transformed` is still borrowed, no changes were made and we can return early
if let Cow::Borrowed(_) = transformed {
return Ok(vec![]);
}
let modified = transformed.source_code();
let modified_index = LineIndex::from_source_text(modified);
let source_index = document.index();
let Replacement {
source_range,
modified_range,
} = Replacement::between(
source,
source_index.line_starts(),
modified,
modified_index.line_starts(),
);
Ok(vec![lsp_types::TextEdit {
range: source_range.to_range(source, source_index, encoding),
new_text: modified[modified_range].to_owned(),
}])
}

Some files were not shown because too many files have changed in this diff Show More