Compare commits

73 commits: `tracing-in...charlie/PG`

| SHA1 |
|---|
| `68b228f9dd` |
| `49d596c29d` |
| `40f6456add` |
| `3e1dffab20` |
| `3336d23f48` |
| `2421805033` |
| `359f50e6dc` |
| `bccba5d73f` |
| `0bfdb15ecf` |
| `a902d14c31` |
| `728539291f` |
| `c2bd8af59a` |
| `c946bf157e` |
| `8ab2519717` |
| `c4d85d6fb6` |
| `70ea49bf72` |
| `8e255974bc` |
| `d358604464` |
| `8243db74fe` |
| `0346e781d4` |
| `b66bfa6570` |
| `28273eb00b` |
| `12acd191e1` |
| `64b929bc29` |
| `26ae0a6e8d` |
| `959338d39d` |
| `422ff82f4a` |
| `8d0a5e01bd` |
| `aae02cf275` |
| `7e2eba2592` |
| `22770fb4be` |
| `1880cceac1` |
| `0d1fb823d6` |
| `c907317199` |
| `916dd5b7fa` |
| `2cbe1733c8` |
| `1f6e1485f9` |
| `9b43162cc4` |
| `cc9e84c144` |
| `f4d50a2aec` |
| `0c030b5bf3` |
| `1b082ce67e` |
| `f936d319cc` |
| `85d8b6228f` |
| `7594dadc1d` |
| `de37fbfac9` |
| `4e2769a16c` |
| `75b5c314e3` |
| `450fb9b99a` |
| `6163c99551` |
| `3112202a5b` |
| `64ea00048b` |
| `067a4acd54` |
| `c88376f468` |
| `9b7c29853d` |
| `3e21d32b79` |
| `f9e3ea23ba` |
| `6856d0b44b` |
| `21539f1663` |
| `b9bb6bf780` |
| `d39eae2713` |
| `5d21b9c22e` |
| `ec2f229a45` |
| `34c1cb7d11` |
| `2ddea7c657` |
| `45eabdd2c3` |
| `675c86c175` |
| `1f8e2b8f14` |
| `58b3040342` |
| `6a9b8aede1` |
| `f48126ad00` |
| `11287f944f` |
| `a65efcf459` |
**`.github/workflows/docs.yaml`** (2 changes)

```diff
@@ -52,4 +52,4 @@ jobs:
           apiToken: ${{ secrets.CF_API_TOKEN }}
           accountId: ${{ secrets.CF_ACCOUNT_ID }}
           # `github.head_ref` is only set during pull requests and for manual runs or tags we use `main` to deploy to production
-          command: pages deploy site --project-name=ruff-docs --branch ${{ github.head_ref || 'main' }} --commit-hash ${GITHUB_SHA}
+          command: pages deploy site --project-name=astral-docs --branch ${{ github.head_ref || 'main' }} --commit-hash ${GITHUB_SHA}
```
**`.github/workflows/playground.yaml`** (2 changes)

```diff
@@ -44,4 +44,4 @@ jobs:
         with:
           apiToken: ${{ secrets.CF_API_TOKEN }}
           accountId: ${{ secrets.CF_ACCOUNT_ID }}
-          command: pages publish playground/dist --project-name=ruff --branch ${GITHUB_HEAD_REF} --commit-hash ${GITHUB_SHA}
+          command: pages deploy playground/dist --project-name=ruff-playground --branch ${GITHUB_HEAD_REF} --commit-hash ${GITHUB_SHA}
```
**`.github/workflows/release.yaml`** (30 changes)

```diff
@@ -419,18 +419,7 @@ jobs:
     steps:
       - uses: actions/checkout@v4
         with:
-          ref: ${{ inputs.sha }}
-      - name: Check tag consistency
-        run: |
-          version=$(grep "version = " pyproject.toml | sed -e 's/version = "\(.*\)"/\1/g')
-          if [ "${{ inputs.tag }}" != "${version}" ]; then
-            echo "The input tag does not match the version from pyproject.toml:" >&2
-            echo "${{ inputs.tag }}" >&2
-            echo "${version}" >&2
-            exit 1
-          else
-            echo "Releasing ${version}"
-          fi
+          ref: main # We checkout the main branch to check for the commit
+      - name: Check main branch
+        if: ${{ inputs.sha }}
+        run: |
@@ -440,17 +429,18 @@ jobs:
             echo "The specified sha is not on the main branch" >&2
             exit 1
           fi
-      - name: Check SHA consistency
-        if: ${{ inputs.sha }}
+      - name: Check tag consistency
         run: |
-          git_sha=$(git rev-parse HEAD)
-          if [ "${{ inputs.sha }}" != "${git_sha}" ]; then
-            echo "The specified sha does not match the git checkout" >&2
-            echo "${{ inputs.sha }}" >&2
-            echo "${git_sha}" >&2
+          # Switch to the commit we want to release
+          git checkout ${{ inputs.sha }}
+          version=$(grep "version = " pyproject.toml | sed -e 's/version = "\(.*\)"/\1/g')
+          if [ "${{ inputs.tag }}" != "${version}" ]; then
+            echo "The input tag does not match the version from pyproject.toml:" >&2
+            echo "${{ inputs.tag }}" >&2
+            echo "${version}" >&2
             exit 1
           else
-            echo "Releasing ${git_sha}"
+            echo "Releasing ${version}"
           fi

   upload-release:
```
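The tag-consistency step above extracts the `version = "..."` line from `pyproject.toml` and compares it against the workflow's input tag. The same check can be sketched in Python — an illustrative sketch only (the workflow does this with `grep`/`sed` in shell, and the file contents below are hypothetical):

```python
import re

def check_tag(pyproject: str, tag: str) -> str:
    """Mirror the workflow's check: pull the version string out of
    pyproject.toml text and require it to match the release tag."""
    match = re.search(r'version = "(.*)"', pyproject)
    if match is None or tag != match.group(1):
        # The workflow prints the mismatch to stderr and exits 1.
        raise SystemExit("The input tag does not match the version from pyproject.toml")
    return f"Releasing {match.group(1)}"

# Hypothetical pyproject.toml contents standing in for the real file.
contents = '[project]\nname = "example"\nversion = "0.0.290"\n'
assert check_tag(contents, "0.0.290") == "Releasing 0.0.290"
```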
```diff
@@ -299,4 +299,4 @@ default.
 `pyproject.toml` files are now resolved hierarchically, such that for each Python file, we find
 the first `pyproject.toml` file in its path, and use that to determine its lint settings.

-See the [documentation](https://beta.ruff.rs/docs/configuration/#python-file-discovery) for more.
+See the [documentation](https://docs.astral.sh/ruff/configuration/#python-file-discovery) for more.
```
```diff
@@ -719,8 +719,8 @@ Module {
 - `cargo dev generate-cli-help`, `cargo dev generate-docs` and `cargo dev generate-json-schema`:
   Update just `docs/configuration.md`, `docs/rules` and `ruff.schema.json` respectively.
 - `cargo dev generate-options`: Generate a markdown-compatible table of all `pyproject.toml`
-  options. Used for <https://beta.ruff.rs/docs/settings/>
-- `cargo dev generate-rules-table`: Generate a markdown-compatible table of all rules. Used for <https://beta.ruff.rs/docs/rules/>
+  options. Used for <https://docs.astral.sh/ruff/settings/>.
+- `cargo dev generate-rules-table`: Generate a markdown-compatible table of all rules. Used for <https://docs.astral.sh/ruff/rules/>.
 - `cargo dev round-trip <python file or jupyter notebook>`: Read a Python file or Jupyter Notebook,
   parse it, serialize the parsed representation and write it back. Used to check how good our
   representation is so that fixes don't rewrite irrelevant parts of a file.
@@ -778,7 +778,7 @@ To understand Ruff's import categorization system, we first need to define two c
 - "Package root": The top-most directory defining the Python package that includes a given Python
   file. To find the package root for a given Python file, traverse up its parent directories until
   you reach a parent directory that doesn't contain an `__init__.py` file (and isn't marked as
-  a [namespace package](https://beta.ruff.rs/docs/settings/#namespace-packages)); take the directory
+  a [namespace package](https://docs.astral.sh/ruff/settings/#namespace-packages)); take the directory
   just before that, i.e., the first directory in the package.

 For example, given:
@@ -867,7 +867,7 @@ There are three ways in which an import can be categorized as "first-party":
   package (e.g., `from foo import bar` or `import foo.bar`), they'll be classified as first-party
   automatically. This check is as simple as comparing the first segment of the current file's
   module path to the first segment of the import.
-1. **Source roots**: Ruff supports a `[src](https://beta.ruff.rs/docs/settings/#src)` setting, which
+1. **Source roots**: Ruff supports a `[src](https://docs.astral.sh/ruff/settings/#src)` setting, which
   sets the directories to scan when identifying first-party imports. The algorithm is
   straightforward: given an import, like `import foo`, iterate over the directories enumerated in
   the `src` setting and, for each directory, check for the existence of a subdirectory `foo` or a
```
**`Cargo.lock`** (178 changes, generated)

```diff
@@ -221,7 +221,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "4c2f7349907b712260e64b0afe2f84692af14a454be26187d9df565c7f69266a"
 dependencies = [
  "memchr",
- "regex-automata 0.3.7",
+ "regex-automata 0.3.8",
  "serde",
 ]
@@ -272,15 +272,14 @@ dependencies = [

 [[package]]
 name = "chrono"
-version = "0.4.28"
+version = "0.4.31"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "95ed24df0632f708f5f6d8082675bef2596f7084dee3dd55f632290bf35bfe0f"
+checksum = "7f2c685bad3eb3d45a01354cedb7d5faa66194d1d58ba6e267a8de788f79db38"
 dependencies = [
  "android-tzdata",
  "iana-time-zone",
  "js-sys",
  "num-traits",
- "time",
  "wasm-bindgen",
  "windows-targets 0.48.5",
 ]
@@ -314,20 +313,19 @@ dependencies = [

 [[package]]
 name = "clap"
-version = "4.4.1"
+version = "4.4.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7c8d502cbaec4595d2e7d5f61e318f05417bd2b66fdc3809498f0d3fdf0bea27"
+checksum = "84ed82781cea27b43c9b106a979fe450a13a31aab0500595fb3fc06616de08e6"
 dependencies = [
  "clap_builder",
  "clap_derive",
- "once_cell",
 ]

 [[package]]
 name = "clap_builder"
-version = "4.4.1"
+version = "4.4.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5891c7bc0edb3e1c2204fc5e94009affabeb1821c9e5fdc3959536c5c0bb984d"
+checksum = "2bb9faaa7c2ef94b2743a21f5a29e6f0010dff4caa69ac8e9d6cf8b6fa74da08"
 dependencies = [
  "anstream",
  "anstyle",
@@ -378,14 +376,14 @@ dependencies = [

 [[package]]
 name = "clap_derive"
-version = "4.4.0"
+version = "4.4.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c9fd1a5729c4548118d7d70ff234a44868d00489a4b6597b0b020918a0e91a1a"
+checksum = "0862016ff20d69b84ef8247369fabf5c008a7417002411897d40ee1f4532b873"
 dependencies = [
  "heck",
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -610,7 +608,7 @@ dependencies = [
  "proc-macro2",
  "quote",
  "strsim",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -621,7 +619,7 @@ checksum = "836a9bbc7ad63342d6d6e7b815ccab164bc77a2d95d84bc3117a8c0d5c98e2d5"
 dependencies = [
  "darling_core",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -783,6 +781,15 @@ version = "2.0.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "6999dc1837253364c2ebb0704ba97994bd874e8f195d665c50b7548f6ea92764"

+[[package]]
+name = "fern"
+version = "0.6.2"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "d9f0c14694cbd524c8720dd69b0e3179344f04ebb5f90f2e4a440c6ea3b2f1ee"
+dependencies = [
+ "log",
+]
+
 [[package]]
 name = "filetime"
 version = "0.2.22"
@@ -803,7 +810,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"

 [[package]]
 name = "flake8-to-ruff"
-version = "0.0.289"
+version = "0.0.290"
 dependencies = [
  "anyhow",
  "clap",
@@ -874,7 +881,7 @@ dependencies = [
  "cfg-if",
  "js-sys",
  "libc",
- "wasi 0.11.0+wasi-snapshot-preview1",
+ "wasi",
  "wasm-bindgen",
 ]
@@ -1042,9 +1049,9 @@ dependencies = [

 [[package]]
 name = "indoc"
-version = "2.0.3"
+version = "2.0.4"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "2c785eefb63ebd0e33416dfcb8d6da0bf27ce752843a45632a67bf10d4d4b5c4"
+checksum = "1e186cfbae8084e513daff4240b4797e342f988cecda4fb6c939150f96315fd8"

 [[package]]
 name = "inotify"
@@ -1113,7 +1120,7 @@ dependencies = [
  "pmutil 0.6.1",
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -1268,9 +1275,9 @@ dependencies = [

 [[package]]
 name = "libmimalloc-sys"
-version = "0.1.34"
+version = "0.1.35"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "25d058a81af0d1c22d7a1c948576bee6d673f7af3c0f35564abd6c81122f513d"
+checksum = "3979b5c37ece694f1f5e51e7ecc871fdb0f517ed04ee45f88d15d6d553cb9664"
 dependencies = [
  "cc",
  "libc",
@@ -1336,9 +1343,9 @@ dependencies = [

 [[package]]
 name = "mimalloc"
-version = "0.1.38"
+version = "0.1.39"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "972e5f23f6716f62665760b0f4cbf592576a80c7b879ba9beaafc0e558894127"
+checksum = "fa01922b5ea280a911e323e4d2fd24b7fe5cc4042e0d2cda3c40775cdc4bdc9c"
 dependencies = [
  "libmimalloc-sys",
 ]
@@ -1366,7 +1373,7 @@ checksum = "927a765cd3fc26206e66b296465fa9d3e5ab003e651c1b3c060e7956d96b19d2"
 dependencies = [
  "libc",
  "log",
- "wasi 0.11.0+wasi-snapshot-preview1",
+ "wasi",
  "windows-sys 0.48.0",
 ]
@@ -1411,20 +1418,21 @@ dependencies = [

 [[package]]
 name = "notify"
-version = "5.2.0"
+version = "6.1.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "729f63e1ca555a43fe3efa4f3efdf4801c479da85b432242a7b726f353c88486"
+checksum = "6205bd8bb1e454ad2e27422015fb5e4f2bcc7e08fa8f27058670d208324a4d2d"
 dependencies = [
- "bitflags 1.3.2",
+ "bitflags 2.4.0",
  "crossbeam-channel",
  "filetime",
  "fsevent-sys",
  "inotify",
  "kqueue",
  "libc",
+ "log",
  "mio",
  "walkdir",
- "windows-sys 0.45.0",
+ "windows-sys 0.48.0",
 ]
@@ -1712,7 +1720,7 @@ checksum = "52a40bc70c2c58040d2d8b167ba9a5ff59fc9dab7ad44771cfde3dcfde7a09c6"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -1797,9 +1805,9 @@ dependencies = [

 [[package]]
 name = "proc-macro2"
-version = "1.0.66"
+version = "1.0.67"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "18fb31db3f9bddb2ea821cde30a9f70117e3f119938b5ee630b7403aa6e2ead9"
+checksum = "3d433d9f1a3e8c1263d9456598b16fec66f4acc9a74dacffd35c7bb09b3a1328"
 dependencies = [
  "unicode-ident",
 ]
@@ -1932,13 +1940,13 @@ dependencies = [

 [[package]]
 name = "regex"
-version = "1.9.4"
+version = "1.9.5"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "12de2eff854e5fa4b1295edd650e227e9d8fb0c9e90b12e7f36d6a6811791a29"
+checksum = "697061221ea1b4a94a624f67d0ae2bfe4e22b8a17b6a192afb11046542cc8c47"
 dependencies = [
  "aho-corasick",
  "memchr",
- "regex-automata 0.3.7",
+ "regex-automata 0.3.8",
  "regex-syntax 0.7.5",
 ]
@@ -1953,9 +1961,9 @@ dependencies = [

 [[package]]
 name = "regex-automata"
-version = "0.3.7"
+version = "0.3.8"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "49530408a136e16e5b486e883fbb6ba058e8e4e8ae6621a77b048b314336e629"
+checksum = "c2f401f4955220693b56f8ec66ee9c78abffd8d1c4f23dc41a23839eb88f0795"
 dependencies = [
  "aho-corasick",
  "memchr",
@@ -2013,7 +2021,7 @@ dependencies = [

 [[package]]
 name = "ruff"
-version = "0.0.289"
+version = "0.0.290"
 dependencies = [
  "annotate-snippets 0.9.1",
  "anyhow",
@@ -2021,6 +2029,7 @@ dependencies = [
  "chrono",
  "clap",
  "colored",
+ "fern",
  "glob",
  "globset",
  "imperative",
@@ -2070,8 +2079,6 @@ dependencies = [
  "test-case",
  "thiserror",
  "toml",
- "tracing",
- "tracing-subscriber",
  "typed-arena",
  "unicode-width",
  "unicode_names2",
@@ -2112,7 +2119,7 @@ dependencies = [

 [[package]]
 name = "ruff_cli"
-version = "0.0.289"
+version = "0.0.290"
 dependencies = [
  "annotate-snippets 0.9.1",
  "anyhow",
@@ -2252,7 +2259,7 @@ dependencies = [
  "proc-macro2",
  "quote",
  "ruff_python_trivia",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -2423,12 +2430,11 @@ name = "ruff_python_trivia"
 version = "0.0.0"
 dependencies = [
  "insta",
- "memchr",
+ "itertools",
  "ruff_python_ast",
  "ruff_python_parser",
  "ruff_source_file",
  "ruff_text_size",
- "smallvec",
  "unicode-ident",
 ]
@@ -2485,6 +2491,7 @@ dependencies = [
  "ruff_python_formatter",
  "ruff_python_index",
  "ruff_python_parser",
+ "ruff_python_trivia",
  "ruff_source_file",
  "ruff_text_size",
  "ruff_workspace",
@@ -2666,9 +2673,9 @@ dependencies = [

 [[package]]
 name = "serde-wasm-bindgen"
-version = "0.5.0"
+version = "0.6.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f3b143e2833c57ab9ad3ea280d21fd34e285a42837aeb0ee301f4f41890fa00e"
+checksum = "30c9933e5689bd420dc6c87b7a1835701810cbc10cd86a26e4da45b73e6b1d78"
 dependencies = [
  "js-sys",
  "serde",
@@ -2683,7 +2690,7 @@ checksum = "4eca7ac642d82aa35b60049a6eccb4be6be75e599bd2e9adb5f875a737654af2"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -2699,9 +2706,9 @@ dependencies = [

 [[package]]
 name = "serde_json"
-version = "1.0.106"
+version = "1.0.107"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "2cc66a619ed80bf7a0f6b17dd063a84b88f6dea1813737cf469aef1d081142c2"
+checksum = "6b420ce6e3d8bd882e9b243c6eed35dbc9a6110c9769e74b584e0d68d1f20c65"
 dependencies = [
  "itoa",
  "ryu",
@@ -2745,7 +2752,7 @@ dependencies = [
  "darling",
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -2768,9 +2775,9 @@ dependencies = [

 [[package]]
 name = "shlex"
-version = "1.1.0"
+version = "1.2.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "43b2853a4d09f215c24cc5489c992ce46052d359b5109343cbafbf26bc62f8a3"
+checksum = "a7cee0529a6d40f580e7a5e6c495c8fbfe21b7b52795ed4bb5e62cdf92bc6380"

 [[package]]
 name = "similar"
@@ -2840,7 +2847,7 @@ dependencies = [
  "proc-macro2",
  "quote",
  "rustversion",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -2856,9 +2863,9 @@ dependencies = [

 [[package]]
 name = "syn"
-version = "2.0.29"
+version = "2.0.33"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c324c494eba9d92503e6f1ef2e6df781e78f6a7705a0202d9801b198807d518a"
+checksum = "9caece70c63bfba29ec2fed841a09851b14a235c60010fa4de58089b6c025668"
 dependencies = [
  "proc-macro2",
  "quote",
@@ -2928,57 +2935,57 @@ checksum = "3369f5ac52d5eb6ab48c6b4ffdc8efbcad6b89c765749064ba298f2c68a16a76"

 [[package]]
 name = "test-case"
-version = "3.1.0"
+version = "3.2.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "2a1d6e7bde536b0412f20765b76e921028059adfd1b90d8974d33fd3c91b25df"
+checksum = "c8f1e820b7f1d95a0cdbf97a5df9de10e1be731983ab943e56703ac1b8e9d425"
 dependencies = [
  "test-case-macros",
 ]

 [[package]]
 name = "test-case-core"
-version = "3.1.0"
+version = "3.2.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d10394d5d1e27794f772b6fc854c7e91a2dc26e2cbf807ad523370c2a59c0cee"
+checksum = "54c25e2cb8f5fcd7318157634e8838aa6f7e4715c96637f969fabaccd1ef5462"
 dependencies = [
  "cfg-if",
  "proc-macro-error",
  "proc-macro2",
  "quote",
- "syn 1.0.109",
+ "syn 2.0.33",
 ]

 [[package]]
 name = "test-case-macros"
-version = "3.1.0"
+version = "3.2.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "eeb9a44b1c6a54c1ba58b152797739dba2a83ca74e18168a68c980eb142f9404"
+checksum = "37cfd7bbc88a0104e304229fba519bdc45501a30b760fb72240342f1289ad257"
 dependencies = [
  "proc-macro-error",
  "proc-macro2",
  "quote",
- "syn 1.0.109",
+ "syn 2.0.33",
  "test-case-core",
 ]

 [[package]]
 name = "thiserror"
-version = "1.0.47"
+version = "1.0.48"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "97a802ec30afc17eee47b2855fc72e0c4cd62be9b4efe6591edde0ec5bd68d8f"
+checksum = "9d6d7a740b8a666a7e828dd00da9c0dc290dff53154ea77ac109281de90589b7"
 dependencies = [
  "thiserror-impl",
 ]

 [[package]]
 name = "thiserror-impl"
-version = "1.0.47"
+version = "1.0.48"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "6bb623b56e39ab7dcd4b1b98bb6c8f8d907ed255b18de254088016b27a8ee19b"
+checksum = "49922ecae66cc8a249b77e68d1d0623c1b2c514f0060c27cdc68bd62a1219d35"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -3011,17 +3018,6 @@ dependencies = [
  "tikv-jemalloc-sys",
 ]

-[[package]]
-name = "time"
-version = "0.1.45"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "1b797afad3f312d1c66a56d11d0316f916356d11bd158fbc6ca6389ff6bf805a"
-dependencies = [
- "libc",
- "wasi 0.10.0+wasi-snapshot-preview1",
- "winapi",
-]
-
 [[package]]
 name = "tiny-keccak"
 version = "2.0.2"
@@ -3058,9 +3054,9 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"

 [[package]]
 name = "toml"
-version = "0.7.6"
+version = "0.7.8"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c17e963a819c331dcacd7ab957d80bc2b9a9c1e71c804826d2f283dd65306542"
+checksum = "dd79e69d3b627db300ff956027cc6c3798cef26d22526befdfcd12feeb6d2257"
 dependencies = [
  "serde",
  "serde_spanned",
@@ -3079,9 +3075,9 @@ dependencies = [

 [[package]]
 name = "toml_edit"
-version = "0.19.14"
+version = "0.19.15"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f8123f27e969974a3dfba720fdb560be359f57b44302d280ba72e76a74480e8a"
+checksum = "1b5bb770da30e5cbfde35a2d7b9b8a2c4b8ef89548a7a6aeab5c9a576e3e7421"
 dependencies = [
  "indexmap",
  "serde",
@@ -3111,7 +3107,7 @@ checksum = "5f4f31f56159e98206da9efd823404b79b6ef3143b4a7ab76e67b1751b25a4ab"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -3221,9 +3217,9 @@ checksum = "92888ba5573ff080736b3648696b70cafad7d250551175acbaa4e0385b3e1460"

 [[package]]
 name = "unicode-ident"
-version = "1.0.11"
+version = "1.0.12"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "301abaae475aa91687eb82514b328ab47a211a533026cb25fc3e519b86adfc3c"
+checksum = "3354b9ac3fae1ff6755cb6db53683adb661634f67557942dea4facebec0fee4b"

 [[package]]
 name = "unicode-normalization"
@@ -3314,7 +3310,7 @@ checksum = "f7e1ba1f333bd65ce3c9f27de592fcbc256dafe3af2717f56d7c87761fbaccf4"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
 ]
@@ -3381,12 +3377,6 @@ dependencies = [
  "winapi-util",
 ]

-[[package]]
-name = "wasi"
-version = "0.10.0+wasi-snapshot-preview1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "1a143597ca7c7793eff794def352d41792a93c481eb1042423ff7ff72ba2c31f"
-
 [[package]]
 name = "wasi"
 version = "0.11.0+wasi-snapshot-preview1"
@@ -3414,7 +3404,7 @@ dependencies = [
  "once_cell",
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
  "wasm-bindgen-shared",
 ]
@@ -3448,7 +3438,7 @@ checksum = "54681b18a46765f095758388f2d0cf16eb8d4169b639ab575a8f5693af210c7b"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.29",
+ "syn 2.0.33",
  "wasm-bindgen-backend",
  "wasm-bindgen-shared",
 ]
```
**`Cargo.toml`** (24 changes)

```diff
@@ -5,8 +5,8 @@ resolver = "2"
 [workspace.package]
 edition = "2021"
 rust-version = "1.71"
-homepage = "https://beta.ruff.rs/docs"
-documentation = "https://beta.ruff.rs/docs"
+homepage = "https://docs.astral.sh/ruff"
+documentation = "https://docs.astral.sh/ruff"
 repository = "https://github.com/astral-sh/ruff"
 authors = ["Charlie Marsh <charlie.r.marsh@gmail.com>"]
 license = "MIT"
@@ -14,8 +14,8 @@ license = "MIT"
 [workspace.dependencies]
 anyhow = { version = "1.0.69" }
 bitflags = { version = "2.3.1" }
-chrono = { version = "0.4.23", default-features = false, features = ["clock"] }
-clap = { version = "4.1.8", features = ["derive"] }
+chrono = { version = "0.4.31", default-features = false, features = ["clock"] }
+clap = { version = "4.4.3", features = ["derive"] }
 colored = { version = "2.0.0" }
 filetime = { version = "0.2.20" }
 glob = { version = "0.3.1" }
@@ -30,27 +30,27 @@ num-bigint = { version = "0.4.3" }
 num-traits = { version = "0.2.15" }
 once_cell = { version = "1.17.1" }
 path-absolutize = { version = "3.1.1" }
-proc-macro2 = { version = "1.0.51" }
+proc-macro2 = { version = "1.0.67" }
 quote = { version = "1.0.23" }
-regex = { version = "1.7.1" }
+regex = { version = "1.9.5" }
 rustc-hash = { version = "1.1.0" }
 schemars = { version = "0.8.12" }
 serde = { version = "1.0.152", features = ["derive"] }
-serde_json = { version = "1.0.106" }
+serde_json = { version = "1.0.107" }
 shellexpand = { version = "3.0.0" }
 similar = { version = "2.2.1", features = ["inline"] }
 smallvec = { version = "1.10.0" }
 static_assertions = "1.1.0"
 strum = { version = "0.25.0", features = ["strum_macros"] }
 strum_macros = { version = "0.25.2" }
-syn = { version = "2.0.15" }
-test-case = { version = "3.0.0" }
-thiserror = { version = "1.0.43" }
-toml = { version = "0.7.2" }
+syn = { version = "2.0.33" }
+test-case = { version = "3.2.1" }
+thiserror = { version = "1.0.48" }
+toml = { version = "0.7.8" }
 tracing = "0.1.37"
 tracing-indicatif = "0.3.4"
 tracing-subscriber = { version = "0.3.17", features = ["env-filter"] }
-unicode-ident = "1.0.11"
+unicode-ident = "1.0.12"
 unicode-width = "0.1.10"
 uuid = { version = "1.4.1", features = ["v4", "fast-rng", "macro-diagnostics", "js"] }
 wsl = { version = "0.1.0" }
```
**`LICENSE`** (25 changes)

```diff
@@ -1224,6 +1224,31 @@ are:
   SOFTWARE.
   """

+- flake8-logging, licensed as follows:
+  """
+  MIT License
+
+  Copyright (c) 2023 Adam Johnson
+
+  Permission is hereby granted, free of charge, to any person obtaining a copy
+  of this software and associated documentation files (the "Software"), to deal
+  in the Software without restriction, including without limitation the rights
+  to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+  copies of the Software, and to permit persons to whom the Software is
+  furnished to do so, subject to the following conditions:
+
+  The above copyright notice and this permission notice shall be included in all
+  copies or substantial portions of the Software.
+
+  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+  IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+  FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+  AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+  OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+  SOFTWARE.
+  """
+
 - Pyright, licensed as follows:
   """
   MIT License
```
27
README.md
27
README.md
@@ -8,7 +8,7 @@
|
||||
[](https://pypi.python.org/pypi/ruff)
|
||||
[](https://github.com/astral-sh/ruff/actions)
|
||||
|
||||
[**Discord**](https://discord.gg/c9MhzV8aU5) | [**Docs**](https://beta.ruff.rs/docs/) | [**Playground**](https://play.ruff.rs/)
|
||||
[**Discord**](https://discord.gg/c9MhzV8aU5) | [**Docs**](https://docs.astral.sh/ruff/) | [**Playground**](https://play.ruff.rs/)
|
||||
|
||||
An extremely fast Python linter, written in Rust.
|
||||
|
||||
@@ -30,13 +30,13 @@ An extremely fast Python linter, written in Rust.
|
||||
- 🤝 Python 3.11 compatibility
|
||||
- 📦 Built-in caching, to avoid re-analyzing unchanged files
|
||||
- 🔧 Autofix support, for automatic error correction (e.g., automatically remove unused imports)
|
||||
- 📏 Over [600 built-in rules](https://beta.ruff.rs/docs/rules/)
|
||||
- ⚖️ [Near-parity](https://beta.ruff.rs/docs/faq/#how-does-ruff-compare-to-flake8) with the
|
||||
- 📏 Over [600 built-in rules](https://docs.astral.sh/ruff/rules/)
|
||||
- ⚖️ [Near-parity](https://docs.astral.sh/ruff/faq/#how-does-ruff-compare-to-flake8) with the
|
||||
built-in Flake8 rule set
|
||||
- 🔌 Native re-implementations of dozens of Flake8 plugins, like flake8-bugbear
|
||||
- ⌨️ First-party [editor integrations](https://beta.ruff.rs/docs/editor-integrations/) for
|
||||
- ⌨️ First-party [editor integrations](https://docs.astral.sh/ruff/editor-integrations/) for
|
||||
[VS Code](https://github.com/astral-sh/ruff-vscode) and [more](https://github.com/astral-sh/ruff-lsp)
|
||||
- 🌎 Monorepo-friendly, with [hierarchical and cascading configuration](https://beta.ruff.rs/docs/configuration/#pyprojecttoml-discovery)
|
||||
- 🌎 Monorepo-friendly, with [hierarchical and cascading configuration](https://docs.astral.sh/ruff/configuration/#pyprojecttoml-discovery)
|
||||
|
||||
Ruff aims to be orders of magnitude faster than alternative tools while integrating more
|
||||
functionality behind a single, common interface.
|
||||
@@ -98,7 +98,7 @@ developer of [Zulip](https://github.com/zulip/zulip):
|
||||
|
||||
## Table of Contents
|
||||
|
||||
For more, see the [documentation](https://beta.ruff.rs/docs/).
|
||||
For more, see the [documentation](https://docs.astral.sh/ruff/).
|
||||
|
||||
1. [Getting Started](#getting-started)
|
||||
1. [Configuration](#configuration)
|
||||
@@ -111,7 +111,7 @@ For more, see the [documentation](https://beta.ruff.rs/docs/).

 ## Getting Started

-For more, see the [documentation](https://beta.ruff.rs/docs/).
+For more, see the [documentation](https://docs.astral.sh/ruff/).

 ### Installation
@@ -122,7 +122,7 @@ pip install ruff
 ```

 You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
-and with [a variety of other package managers](https://beta.ruff.rs/docs/installation/).
+and with [a variety of other package managers](https://docs.astral.sh/ruff/installation/).

 ### Usage
@@ -140,7 +140,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com) hook:

 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.0.289
+  rev: v0.0.290
   hooks:
     - id: ruff
 ```
@@ -165,7 +165,7 @@ jobs:

 ### Configuration

 Ruff can be configured through a `pyproject.toml`, `ruff.toml`, or `.ruff.toml` file (see:
-[_Configuration_](https://beta.ruff.rs/docs/configuration/), or [_Settings_](https://beta.ruff.rs/docs/settings/)
+[_Configuration_](https://docs.astral.sh/ruff/configuration/), or [_Settings_](https://docs.astral.sh/ruff/settings/)
 for a complete list of all configuration options).

 If left unspecified, the default configuration is equivalent to:
@@ -238,7 +238,7 @@ isort, pyupgrade, and others. Regardless of the rule's origin, Ruff re-implement
 Rust as a first-party feature.

 By default, Ruff enables Flake8's `E` and `F` rules. Ruff supports all rules from the `F` category,
-and a [subset](https://beta.ruff.rs/docs/rules/#error-e) of the `E` category, omitting those
+and a [subset](https://docs.astral.sh/ruff/rules/#error-e) of the `E` category, omitting those
 stylistic rules made obsolete by the use of an autoformatter, like
 [Black](https://github.com/psf/black).
@@ -274,6 +274,7 @@ quality tools, including:
 - [flake8-gettext](https://pypi.org/project/flake8-gettext/)
 - [flake8-implicit-str-concat](https://pypi.org/project/flake8-implicit-str-concat/)
 - [flake8-import-conventions](https://github.com/joaopalmeiro/flake8-import-conventions)
+- [flake8-logging](https://pypi.org/project/flake8-logging/)
 - [flake8-logging-format](https://pypi.org/project/flake8-logging-format/)
 - [flake8-no-pep420](https://pypi.org/project/flake8-no-pep420)
 - [flake8-pie](https://pypi.org/project/flake8-pie/)
@@ -303,12 +304,12 @@ quality tools, including:
 - [tryceratops](https://pypi.org/project/tryceratops/)
 - [yesqa](https://pypi.org/project/yesqa/)

-For a complete enumeration of the supported rules, see [_Rules_](https://beta.ruff.rs/docs/rules/).
+For a complete enumeration of the supported rules, see [_Rules_](https://docs.astral.sh/ruff/rules/).

 ## Contributing

 Contributions are welcome and highly appreciated. To get started, check out the
-[**contributing guidelines**](https://beta.ruff.rs/docs/contributing/).
+[**contributing guidelines**](https://docs.astral.sh/ruff/contributing/).

 You can also join us on [**Discord**](https://discord.gg/c9MhzV8aU5).
@@ -1,6 +1,6 @@
 [package]
 name = "flake8-to-ruff"
-version = "0.0.289"
+version = "0.0.290"
 description = """
 Convert Flake8 configuration files to Ruff configuration files.
 """
@@ -86,7 +86,7 @@ flake8-to-ruff path/to/.flake8 --plugin flake8-builtins --plugin flake8-quotes
    configuration options that don't exist in Flake8.)
 1. Ruff will omit any rule codes that are unimplemented or unsupported by Ruff, including rule
    codes from unsupported plugins. (See the
-   [documentation](https://beta.ruff.rs/docs/faq/#how-does-ruff-compare-to-flake8) for the complete
+   [documentation](https://docs.astral.sh/ruff/faq/#how-does-ruff-compare-to-flake8) for the complete
    list of supported plugins.)

 ## License
@@ -1,6 +1,6 @@
 [package]
 name = "ruff"
-version = "0.0.289"
+version = "0.0.290"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
@@ -37,6 +37,7 @@ bitflags = { workspace = true }
 chrono = { workspace = true }
 clap = { workspace = true, features = ["derive", "string"], optional = true }
 colored = { workspace = true }
+fern = { version = "0.6.1" }
 glob = { workspace = true }
 globset = { workspace = true }
 imperative = { version = "1.0.4" }
@@ -70,8 +71,6 @@ strum = { workspace = true }
 strum_macros = { workspace = true }
 thiserror = { workspace = true }
 toml = { workspace = true }
-tracing = { workspace = true }
-tracing-subscriber = { workspace = true }
 typed-arena = { version = "2.0.2" }
 unicode-width = { workspace = true }
 unicode_names2 = { version = "0.6.0", git = "https://github.com/youknowone/unicode_names2.git", rev = "4ce16aa85cbcdd9cc830410f1a72ef9a235f2fde" }
@@ -53,3 +53,6 @@ setattr(foo, "__123abc__", None)
 setattr(foo, "abc123", None)
 setattr(foo, r"abc123", None)
 setattr(foo.bar, r"baz", None)
+
+# Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458885
+assert getattr(func, '_rpc')is True
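The new regression case exercises flake8-bugbear's B009 (`get-attr-with-constant`): `getattr` with a constant attribute name reduces to plain attribute access. A minimal sketch of the before/after, where the `Service` class is a hypothetical stand-in for the fixture's `func` object:

```python
class Service:
    """Hypothetical object used to illustrate the rewrite."""
    _rpc = True


func = Service()

# Flagged by B009: the attribute name is a constant string literal...
before = getattr(func, "_rpc")

# ...so the fix is direct attribute access, which is faster and clearer.
after = func._rpc

assert before is after is True
```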
@@ -19,3 +19,6 @@ print(f'Hello {dict((x,f(x)) for x in "abc")} World')

 # Regression test for: https://github.com/astral-sh/ruff/issues/7086
 dict((k,v)for k,v in d.iteritems() if k in only_args)
+
+# Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458940
+dict((*v, k) for k, v in enumerate(calendar.month_abbr))
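These fixtures target flake8-comprehensions' C402 (`unnecessary-generator-dict`): a `dict(...)` call over a generator of key/value pairs can be written as a dict comprehension. A minimal sketch of the rewrite:

```python
pairs = [("a", 1), ("b", 2)]

# Flagged by C402: building a dict from a generator of 2-tuples...
slow = dict((k, v) for k, v in pairs)

# ...is equivalent to, and clearer as, a dict comprehension.
fast = {k: v for k, v in pairs}

assert slow == fast == {"a": 1, "b": 2}
```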
@@ -7,6 +7,8 @@ d = {"a": 1, "b": 2, "c": 3}
 {i for i in x}
 {k: v for k, v in y}
 {k: v for k, v in d.items()}
+[(k, v) for k, v in d.items()]
+{k: (a, b) for k, (a, b) in d.items()}

 [i for i, in z]
 [i for i, j in y]
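These fixtures exercise flake8-comprehensions' C416 (`unnecessary-comprehension`): a comprehension that copies its iterable element-for-element can be replaced with the corresponding constructor call. A minimal sketch:

```python
d = {"a": 1, "b": 2, "c": 3}

# Flagged by C416: the comprehension adds nothing over a plain copy...
copy_slow = {k: v for k, v in d.items()}

# ...so the constructor form is preferred.
copy_fast = dict(d.items())

assert copy_slow == copy_fast == d
```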
crates/ruff/resources/test/fixtures/flake8_logging/LOG001.py (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
+import logging
+
+logging.Logger(__name__)
+logging.Logger()
+logging.getLogger(__name__)
crates/ruff/resources/test/fixtures/flake8_logging/LOG002.py (vendored, new file, 24 lines)
@@ -0,0 +1,24 @@
+import logging
+from logging import getLogger
+
+# Ok
+logging.getLogger(__name__)
+logging.getLogger(name=__name__)
+logging.getLogger("custom")
+logging.getLogger(name="custom")
+
+# LOG002
+getLogger(__file__)
+logging.getLogger(name=__file__)
+
+logging.getLogger(__cached__)
+getLogger(name=__cached__)
+
+
+# Override `logging.getLogger`
+class logging:
+    def getLogger(self):
+        pass
+
+
+logging.getLogger(__file__)
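LOG002 warns when `logging.getLogger` is passed `__file__` or `__cached__` instead of `__name__`. A short runnable sketch of why the dotted `__name__` form matters (the logger names `app`, `app.module`, and the path are illustrative):

```python
import logging

app = logging.getLogger("app")
mod = logging.getLogger("app.module")  # the getLogger(__name__) shape

# Dotted __name__-style names form a hierarchy: "app.module" propagates
# its records up to "app", so configuring "app" covers the whole package.
assert mod.parent is app

# A getLogger(__file__)-style name is a filesystem path; it gains no such
# ancestry and hangs directly off the root logger.
path_logger = logging.getLogger("/tmp/app/module")
assert path_logger.parent is logging.getLogger()
```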
crates/ruff/resources/test/fixtures/flake8_logging/LOG007.py (vendored, new file, 16 lines)
@@ -0,0 +1,16 @@
+import logging
+
+logger = logging.getLogger(__name__)
+
+logging.exception("foo")  # OK
+logging.exception("foo", exc_info=False)  # LOG007
+logging.exception("foo", exc_info=[])  # LOG007
+logger.exception("foo")  # OK
+logger.exception("foo", exc_info=False)  # LOG007
+logger.exception("foo", exc_info=[])  # LOG007
+
+
+from logging import exception
+
+exception("foo", exc_info=False)  # LOG007
+exception("foo", exc_info=True)  # OK
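LOG007 flags `.exception(...)` calls that suppress the traceback with a falsy `exc_info`. A runnable sketch of the difference (the stream capture is only for demonstration):

```python
import io
import logging

stream = io.StringIO()
logging.basicConfig(stream=stream, level=logging.ERROR, force=True)

try:
    1 / 0
except ZeroDivisionError:
    # .exception() exists to attach the active traceback to the record...
    logging.exception("boom")
    # ...so passing a falsy exc_info (LOG007) silently discards it, making
    # the call equivalent to a plain logging.error().
    logging.exception("boom again", exc_info=False)

output = stream.getvalue()
assert "ZeroDivisionError" in output   # the first call kept the traceback
assert output.count("Traceback") == 1  # the second call dropped it
```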
crates/ruff/resources/test/fixtures/flake8_logging/LOG009.py (vendored, new file, 9 lines)
@@ -0,0 +1,9 @@
+import logging
+
+logging.WARN  # LOG009
+logging.WARNING  # OK
+
+from logging import WARN, WARNING
+
+WARN  # LOG009
+WARNING  # OK
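LOG009 flags `logging.WARN`, an undocumented alias kept only for backwards compatibility; `logging.WARNING` is the documented name for the same numeric level:

```python
import logging

# WARN and WARNING are the same level object-for-object; only the
# WARNING spelling is documented, which is what LOG009 enforces.
assert logging.WARN == logging.WARNING == 30
```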
@@ -92,3 +92,9 @@ class Test(unittest.TestCase):

     def test_fail_if_equal(self):
         self.failIfEqual(1, 2)  # Error
+
+
+    # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722459517
+    (self.assertTrue(
+        "piAx_piAy_beta[r][x][y] = {17}".format(
+            self.model.piAx_piAy_beta[r][x][y])))
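The hunk above extends a unittest-style fixture (`failIfEqual` is a deprecated alias of `assertNotEqual`, and `assertTrue` is the target of flake8-pytest-style's PT009, which prefers plain `assert`). A minimal runnable sketch of the pattern, with `Test` and `test_sum` as hypothetical names:

```python
import io
import unittest


class Test(unittest.TestCase):
    def test_sum(self):
        # unittest-style assertion method, as in the fixture above...
        self.assertTrue(sum([1, 2]) == 3)
        # ...and the plain assert that PT009 prefers under pytest.
        assert sum([1, 2]) == 3


# Run the case programmatically so the sketch is self-checking.
result = unittest.TextTestRunner(stream=io.StringIO()).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(Test)
)
assert result.wasSuccessful()
```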
@@ -41,9 +41,13 @@ class Test(unittest.TestCase):
         assert True


-from typing import override
+from typing import override, overload


 @override
 def BAD_FUNC():
     pass


+@overload
+def BAD_FUNC():
+    pass
crates/ruff/resources/test/fixtures/perflint/PERF403.py (vendored, new file, 83 lines)
@@ -0,0 +1,83 @@
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = {}
+    for idx, name in enumerate(fruit):
+        result[idx] = name  # PERF403
+
+
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = {}
+    for idx, name in enumerate(fruit):
+        if idx % 2:
+            result[idx] = name  # PERF403
+
+
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = {}
+    for idx, name in enumerate(fruit):
+        if idx % 2:
+            result[idx] = name  # Ok (false negative: edge case where `else` is same as `if`)
+        else:
+            result[idx] = name
+
+
+def foo():
+    result = {}
+    fruit = ["apple", "pear", "orange"]
+    for idx, name in enumerate(fruit):
+        if idx % 2:
+            result[idx] = name  # PERF403
+
+
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = []
+    for idx, name in enumerate(fruit):
+        if idx % 2:
+            result[idx] = name  # OK (result is not a dictionary)
+        else:
+            result[idx] = name
+
+
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = {}
+    for idx, name in enumerate(fruit):
+        if idx % 2:
+            result[idx] = name  # OK (if/elif/else isn't replaceable)
+        elif idx % 3:
+            result[idx] = name
+        else:
+            result[idx] = name
+
+
+def foo():
+    result = {1: "banana"}
+    fruit = ["apple", "pear", "orange"]
+    for idx, name in enumerate(fruit):
+        if idx % 2:
+            result[idx] = name  # PERF403
+
+
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = {}
+    for idx, name in enumerate(fruit):
+        if idx in result:
+            result[idx] = name  # PERF403
+
+
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = {}
+    for name in fruit:
+        result[name] = name  # PERF403
+
+
+def foo():
+    fruit = ["apple", "pear", "orange"]
+    result = {}
+    for idx, name in enumerate(fruit):
+        result[name] = idx  # PERF403
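The new PERF403 rule (`manual-dict-comprehension`, registered later in this diff) flags loops that build a dictionary key-by-key; the suggested rewrite is a dict comprehension. A minimal sketch of the transformation for the simple `enumerate` case:

```python
fruit = ["apple", "pear", "orange"]

# The loop form flagged by PERF403...
result_loop = {}
for idx, name in enumerate(fruit):
    result_loop[idx] = name

# ...and the dict comprehension it suggests instead.
result_comp = {idx: name for idx, name in enumerate(fruit)}

assert result_loop == result_comp == {0: "apple", 1: "pear", 2: "orange"}
```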
@@ -37,3 +37,4 @@ assert (not foo) in bar
 assert {"x": not foo} in bar
 assert [42, not foo] in bar
 assert not (re.search(r"^.:\\Users\\[^\\]*\\Downloads\\.*") is None)
+assert not('name' in request)or not request['name']
@@ -1,3 +1,6 @@
+'''File starts with a tab
+	multiline string with tab in it'''
+
 #: W191
 if False:
 	print  # indented with 1 tab
@@ -658,3 +658,8 @@ class CommentAfterDocstring:
     "After this docstring there's a comment."  # priorities=1
     def sort_services(self):
         pass
+
+
+    def newline_after_closing_quote(self):
+        "We enforce a newline after the closing quote for a multi-line docstring \
+        but continuations shouldn't be considered multi-line"
@@ -1,9 +0,0 @@
-from ast import literal_eval
-
-eval("3 + 4")
-
-literal_eval({1: 2})
-
-
-def fn() -> None:
-    eval("3 + 4")
@@ -1,11 +0,0 @@
-def eval(content: str) -> None:
-    pass
-
-
-eval("3 + 4")
-
-literal_eval({1: 2})
-
-
-def fn() -> None:
-    eval("3 + 4")
@@ -1,7 +0,0 @@
-import logging
-import warnings
-from warnings import warn
-
-warnings.warn("this is ok")
-warn("by itself is also ok")
-logging.warning("this is fine")
@@ -1,5 +0,0 @@
-import logging
-from logging import warn
-
-logging.warn("this is not ok")
-warn("not ok")
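The removed fixtures above cover `logging.warn`, a deprecated alias of `logging.warning` flagged by pygrep-hooks' PGH002 (whose rule registration is also dropped later in this diff). A short sketch of the supported spelling:

```python
import io
import logging

stream = io.StringIO()
logging.basicConfig(stream=stream, level=logging.WARNING, force=True)

# logging.warning() is the documented API; logging.warn() is a deprecated
# alias that emits a DeprecationWarning on modern Pythons.
logging.warning("use warning(), not warn()")

assert "use warning(), not warn()" in stream.getvalue()
```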
@@ -49,6 +49,20 @@ class Apples:
     def __doc__(self):
         return "Docstring"

+    # Allow dunder methods recommended by attrs.
+    def __attrs_post_init__(self):
+        pass
+
+    def __attrs_pre_init__(self):
+        pass
+
+    def __attrs_init__(self):
+        pass
+
+    # Allow __html__, used by Jinja2 and Django.
+    def __html__(self):
+        pass
+

 def __foo_bar__():  # this is not checked by the [bad-dunder-name] rule
     ...
@@ -36,3 +36,8 @@ def isinstances():
     result = isinstance(var[6], int) or isinstance(var[7], int)
     result = isinstance(var[6], (float, int)) or False
     return result
+
+
+# Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722460483
+if(isinstance(self.k, int)) or (isinstance(self.k, float)):
+    ...
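The regression case above belongs to the duplicate-`isinstance` merge (SIM101 in Ruff's flake8-simplify port): consecutive `isinstance` checks on the same object joined with `or` fold into a single call with a tuple of types. A minimal sketch:

```python
x = 3.14

# The duplicated-isinstance pattern flagged by the rule...
verbose = isinstance(x, int) or isinstance(x, float)

# ...collapses into one call with a tuple of types.
merged = isinstance(x, (int, float))

assert verbose is merged is True
```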
@@ -10,6 +10,5 @@ type(arg)(" ")
# OK
y = x.dtype.type(0.0)

# OK
type = lambda *args, **kwargs: None
type("")
# Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722459841
assert isinstance(fullname, type("")is not True)
@@ -75,3 +75,8 @@ print("foo".encode())  # print(b"foo")
 (f"foo{bar}").encode(encoding="utf-8")
 ("unicode text©").encode("utf-8")
 ("unicode text©").encode(encoding="utf-8")
+
+
+# Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722459882
+def _match_ignore(line):
+    input=stdin and'\n'.encode()or None
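This fixture exercises pyupgrade's UP012 (`unnecessary-encode-utf8`): calling `.encode()` with the default or an explicit UTF-8 encoding on a plain string literal is equivalent to writing a bytes literal directly. A minimal sketch:

```python
# All three spellings produce the same bytes object; UP012 rewrites the
# first two into the literal form.
explicit = "foo".encode("utf-8")
implicit = "foo".encode()
literal = b"foo"

assert explicit == implicit == literal
```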
@@ -13,18 +13,19 @@ x: typing.TypeAlias = list[T]
 T = typing.TypeVar("T")
 x: typing.TypeAlias = list[T]

-# UP040 bounded generic (todo)
+# UP040 bounded generic
 T = typing.TypeVar("T", bound=int)
 x: typing.TypeAlias = list[T]

 # UP040 constrained generic
 T = typing.TypeVar("T", int, str)
 x: typing.TypeAlias = list[T]

-# UP040 contravariant generic (todo)
+# UP040 contravariant generic
 T = typing.TypeVar("T", contravariant=True)
 x: typing.TypeAlias = list[T]

-# UP040 covariant generic (todo)
+# UP040 covariant generic
 T = typing.TypeVar("T", covariant=True)
 x: typing.TypeAlias = list[T]
@@ -13,10 +13,10 @@ use crate::registry::Rule;
 use crate::rules::{
     flake8_2020, flake8_async, flake8_bandit, flake8_boolean_trap, flake8_bugbear, flake8_builtins,
     flake8_comprehensions, flake8_datetimez, flake8_debugger, flake8_django,
-    flake8_future_annotations, flake8_gettext, flake8_implicit_str_concat, flake8_logging_format,
-    flake8_pie, flake8_print, flake8_pyi, flake8_pytest_style, flake8_self, flake8_simplify,
-    flake8_tidy_imports, flake8_use_pathlib, flynt, numpy, pandas_vet, pep8_naming, pycodestyle,
-    pyflakes, pygrep_hooks, pylint, pyupgrade, refurb, ruff,
+    flake8_future_annotations, flake8_gettext, flake8_implicit_str_concat, flake8_logging,
+    flake8_logging_format, flake8_pie, flake8_print, flake8_pyi, flake8_pytest_style, flake8_self,
+    flake8_simplify, flake8_tidy_imports, flake8_use_pathlib, flynt, numpy, pandas_vet,
+    pep8_naming, pycodestyle, pyflakes, pygrep_hooks, pylint, pyupgrade, refurb, ruff,
 };
 use crate::settings::types::PythonVersion;
@@ -260,6 +260,9 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
     if checker.enabled(Rule::SixPY3) {
         flake8_2020::rules::name_or_attribute(checker, expr);
     }
+    if checker.enabled(Rule::UndocumentedWarn) {
+        flake8_logging::rules::undocumented_warn(checker, expr);
+    }
     if checker.enabled(Rule::LoadBeforeGlobalDeclaration) {
         pylint::rules::load_before_global_declaration(checker, id, expr);
     }
@@ -326,6 +329,9 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
             if checker.enabled(Rule::CollectionsNamedTuple) {
                 flake8_pyi::rules::collections_named_tuple(checker, expr);
             }
+            if checker.enabled(Rule::UndocumentedWarn) {
+                flake8_logging::rules::undocumented_warn(checker, expr);
+            }
             pandas_vet::rules::attr(checker, attribute);
         }
         Expr::Call(
@@ -730,12 +736,6 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
             if checker.enabled(Rule::CallDateFromtimestamp) {
                 flake8_datetimez::rules::call_date_fromtimestamp(checker, func, expr.range());
             }
-            if checker.enabled(Rule::Eval) {
-                pygrep_hooks::rules::no_eval(checker, func);
-            }
-            if checker.enabled(Rule::DeprecatedLogWarn) {
-                pygrep_hooks::rules::deprecated_log_warn(checker, func);
-            }
             if checker.enabled(Rule::UnnecessaryDirectLambdaCall) {
                 pylint::rules::unnecessary_direct_lambda_call(checker, expr, func);
             }
@@ -886,6 +886,15 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
             if checker.enabled(Rule::QuadraticListSummation) {
                 ruff::rules::quadratic_list_summation(checker, call);
             }
+            if checker.enabled(Rule::DirectLoggerInstantiation) {
+                flake8_logging::rules::direct_logger_instantiation(checker, call);
+            }
+            if checker.enabled(Rule::InvalidGetLoggerArgument) {
+                flake8_logging::rules::invalid_get_logger_argument(checker, call);
+            }
+            if checker.enabled(Rule::ExceptionWithoutExcInfo) {
+                flake8_logging::rules::exception_without_exc_info(checker, call);
+            }
         }
         Expr::Dict(ast::ExprDict {
             keys,
@@ -1184,7 +1184,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
                 iter,
                 orelse,
                 is_async,
-                ..
+                range: _,
             },
         ) => {
             if checker.any_enabled(&[
@@ -1218,6 +1218,9 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
             if checker.enabled(Rule::ManualListCopy) {
                 perflint::rules::manual_list_copy(checker, target, body);
             }
+            if checker.enabled(Rule::ManualDictComprehension) {
+                perflint::rules::manual_dict_comprehension(checker, target, body);
+            }
             if checker.enabled(Rule::UnnecessaryListCast) {
                 perflint::rules::unnecessary_list_cast(checker, iter);
             }
@@ -6,6 +6,7 @@ use itertools::Itertools;
 use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};

 use ruff_diagnostics::{Diagnostic, Edit, Fix};
+use ruff_python_trivia::CommentRanges;
 use ruff_source_file::Locator;

 use crate::noqa;
@@ -19,7 +20,7 @@ pub(crate) fn check_noqa(
     diagnostics: &mut Vec<Diagnostic>,
     path: &Path,
     locator: &Locator,
-    comment_ranges: &[TextRange],
+    comment_ranges: &CommentRanges,
     noqa_line_for: &NoqaMapping,
     analyze_directives: bool,
     settings: &Settings,
@@ -9,7 +9,7 @@ use crate::registry::Rule;
 use crate::rules::flake8_copyright::rules::missing_copyright_notice;
 use crate::rules::pycodestyle::rules::{
     doc_line_too_long, line_too_long, mixed_spaces_and_tabs, no_newline_at_end_of_file,
-    tab_indentation, trailing_whitespace,
+    trailing_whitespace,
 };
 use crate::rules::pylint;
 use crate::settings::Settings;
@@ -31,7 +31,6 @@ pub(crate) fn check_physical_lines(
     let enforce_trailing_whitespace = settings.rules.enabled(Rule::TrailingWhitespace);
     let enforce_blank_line_contains_whitespace =
         settings.rules.enabled(Rule::BlankLineWithWhitespace);
-    let enforce_tab_indentation = settings.rules.enabled(Rule::TabIndentation);
     let enforce_copyright_notice = settings.rules.enabled(Rule::MissingCopyrightNotice);

     let mut doc_lines_iter = doc_lines.iter().peekable();
@@ -69,12 +68,6 @@ pub(crate) fn check_physical_lines(
                 diagnostics.push(diagnostic);
             }
         }
-
-        if enforce_tab_indentation {
-            if let Some(diagnostic) = tab_indentation(&line, indexer) {
-                diagnostics.push(diagnostic);
-            }
-        }
     }

     if enforce_no_newline_at_end_of_file {
@@ -86,6 +86,10 @@ pub(crate) fn check_tokens(
         }
     }

+    if settings.rules.enabled(Rule::TabIndentation) {
+        pycodestyle::rules::tab_indentation(&mut diagnostics, tokens, locator, indexer);
+    }
+
    if settings.rules.any_enabled(&[
        Rule::InvalidCharacterBackspace,
        Rule::InvalidCharacterSub,
@@ -640,8 +640,6 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
         (Flake8Datetimez, "012") => (RuleGroup::Unspecified, rules::flake8_datetimez::rules::CallDateFromtimestamp),

         // pygrep-hooks
-        (PygrepHooks, "001") => (RuleGroup::Unspecified, rules::pygrep_hooks::rules::Eval),
-        (PygrepHooks, "002") => (RuleGroup::Unspecified, rules::pygrep_hooks::rules::DeprecatedLogWarn),
         (PygrepHooks, "003") => (RuleGroup::Unspecified, rules::pygrep_hooks::rules::BlanketTypeIgnore),
         (PygrepHooks, "004") => (RuleGroup::Unspecified, rules::pygrep_hooks::rules::BlanketNOQA),
         (PygrepHooks, "005") => (RuleGroup::Unspecified, rules::pygrep_hooks::rules::InvalidMockAccess),
@@ -898,6 +896,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
         (Perflint, "203") => (RuleGroup::Unspecified, rules::perflint::rules::TryExceptInLoop),
         (Perflint, "401") => (RuleGroup::Unspecified, rules::perflint::rules::ManualListComprehension),
         (Perflint, "402") => (RuleGroup::Unspecified, rules::perflint::rules::ManualListCopy),
+        (Perflint, "403") => (RuleGroup::Preview, rules::perflint::rules::ManualDictComprehension),

         // flake8-fixme
         (Flake8Fixme, "001") => (RuleGroup::Unspecified, rules::flake8_fixme::rules::LineContainsFixme),
@@ -919,6 +918,12 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
         (Refurb, "132") => (RuleGroup::Nursery, rules::refurb::rules::CheckAndRemoveFromSet),
         (Refurb, "145") => (RuleGroup::Preview, rules::refurb::rules::SliceCopy),

+        // flake8-logging
+        (Flake8Logging, "001") => (RuleGroup::Preview, rules::flake8_logging::rules::DirectLoggerInstantiation),
+        (Flake8Logging, "002") => (RuleGroup::Preview, rules::flake8_logging::rules::InvalidGetLoggerArgument),
+        (Flake8Logging, "007") => (RuleGroup::Preview, rules::flake8_logging::rules::ExceptionWithoutExcInfo),
+        (Flake8Logging, "009") => (RuleGroup::Preview, rules::flake8_logging::rules::UndocumentedWarn),
+
         _ => return None,
     })
 }
@@ -30,6 +30,11 @@ impl<'a> Docstring<'a> {
     pub(crate) fn leading_quote(&self) -> &'a str {
         &self.contents[TextRange::up_to(self.body_range.start())]
     }
+
+    pub(crate) fn triple_quoted(&self) -> bool {
+        let leading_quote = self.leading_quote();
+        leading_quote.ends_with("\"\"\"") || leading_quote.ends_with("'''")
+    }
 }

 impl Ranged for Docstring<'_> {
@@ -5,6 +5,8 @@
 //!
 //! [Ruff]: https://github.com/astral-sh/ruff

+#[cfg(feature = "clap")]
+pub use rule_selector::clap_completion::RuleSelectorParser;
 pub use rule_selector::RuleSelector;
 pub use rules::pycodestyle::rules::{IOError, SyntaxError};
@@ -36,9 +36,6 @@ use crate::settings::{flags, Settings};
 use crate::source_kind::SourceKind;
 use crate::{directives, fs};

-const CARGO_PKG_NAME: &str = env!("CARGO_PKG_NAME");
-const CARGO_PKG_REPOSITORY: &str = env!("CARGO_PKG_REPOSITORY");
-
 /// A [`Result`]-like type that returns both data and an error. Used to return
 /// diagnostics even in the face of parse errors, since many diagnostics can be
 /// generated without a full AST.
@@ -543,8 +540,9 @@ fn report_failed_to_converge_error(path: &Path, transformed: &str, diagnostics:
     let codes = collect_rule_codes(diagnostics.iter().map(|diagnostic| diagnostic.kind.rule()));
     if cfg!(debug_assertions) {
         eprintln!(
-            "{}: Failed to converge after {} iterations in `{}` with rule codes {}:---\n{}\n---",
+            "{}{} Failed to converge after {} iterations in `{}` with rule codes {}:---\n{}\n---",
             "debug error".red().bold(),
+            ":".bold(),
             MAX_ITERATIONS,
             fs::relativize_path(path),
             codes,
@@ -553,18 +551,17 @@ fn report_failed_to_converge_error(path: &Path, transformed: &str, diagnostics:
     } else {
         eprintln!(
             r#"
-{}: Failed to converge after {} iterations.
+{}{} Failed to converge after {} iterations.

-This indicates a bug in `{}`. If you could open an issue at:
+This indicates a bug in Ruff. If you could open an issue at:

-{}/issues/new?title=%5BInfinite%20loop%5D
+https://github.com/astral-sh/ruff/issues/new?title=%5BInfinite%20loop%5D

 ...quoting the contents of `{}`, the rule codes {}, along with the `pyproject.toml` settings and executed command, we'd be very appreciative!
 "#,
             "error".red().bold(),
+            ":".bold(),
             MAX_ITERATIONS,
-            CARGO_PKG_NAME,
-            CARGO_PKG_REPOSITORY,
             fs::relativize_path(path),
             codes
         );
@@ -581,8 +578,9 @@ fn report_autofix_syntax_error(
     let codes = collect_rule_codes(rules);
     if cfg!(debug_assertions) {
         eprintln!(
-            "{}: Autofix introduced a syntax error in `{}` with rule codes {}: {}\n---\n{}\n---",
+            "{}{} Autofix introduced a syntax error in `{}` with rule codes {}: {}\n---\n{}\n---",
             "error".red().bold(),
+            ":".bold(),
             fs::relativize_path(path),
             codes,
             error,
@@ -591,17 +589,16 @@ fn report_autofix_syntax_error(
     } else {
         eprintln!(
             r#"
-{}: Autofix introduced a syntax error. Reverting all changes.
+{}{} Autofix introduced a syntax error. Reverting all changes.

-This indicates a bug in `{}`. If you could open an issue at:
+This indicates a bug in Ruff. If you could open an issue at:

-{}/issues/new?title=%5BAutofix%20error%5D
+https://github.com/astral-sh/ruff/issues/new?title=%5BAutofix%20error%5D

 ...quoting the contents of `{}`, the rule codes {}, along with the `pyproject.toml` settings and executed command, we'd be very appreciative!
 "#,
             "error".red().bold(),
-            CARGO_PKG_NAME,
-            CARGO_PKG_REPOSITORY,
+            ":".bold(),
             fs::relativize_path(path),
             codes,
         );
@@ -4,17 +4,16 @@ use std::sync::Mutex;

 use anyhow::Result;
 use colored::Colorize;
+use fern;
+use log::Level;
 use once_cell::sync::Lazy;
-use tracing_subscriber::layer::SubscriberExt;
-use tracing_subscriber::util::SubscriberInitExt;
-use tracing_subscriber::EnvFilter;

-use ruff_notebook::Notebook;
 use ruff_python_parser::{ParseError, ParseErrorType};
+
 use ruff_source_file::{OneIndexed, SourceCode, SourceLocation};

 use crate::fs;
 use crate::source_kind::SourceKind;
+use ruff_notebook::Notebook;

 pub static WARNINGS: Lazy<Mutex<Vec<&'static str>>> = Lazy::new(Mutex::default);
@@ -91,27 +90,49 @@ pub enum LogLevel {

 impl LogLevel {
     #[allow(clippy::trivially_copy_pass_by_ref)]
-    const fn tracing_level(&self) -> tracing::Level {
+    const fn level_filter(&self) -> log::LevelFilter {
         match self {
-            LogLevel::Default => tracing::Level::INFO,
-            LogLevel::Verbose => tracing::Level::DEBUG,
-            LogLevel::Quiet => tracing::Level::WARN,
-            LogLevel::Silent => tracing::Level::ERROR,
+            LogLevel::Default => log::LevelFilter::Info,
+            LogLevel::Verbose => log::LevelFilter::Debug,
+            LogLevel::Quiet => log::LevelFilter::Off,
+            LogLevel::Silent => log::LevelFilter::Off,
         }
     }
 }

-/// Log level priorities: 1. `RUST_LOG=`, 2. explicit CLI log level, 3. default to info
 pub fn set_up_logging(level: &LogLevel) -> Result<()> {
-    let filter_layer = EnvFilter::try_from_default_env().unwrap_or_else(|_| {
-        EnvFilter::builder()
-            .with_default_directive(level.tracing_level().into())
-            .parse_lossy("")
-    });
-    tracing_subscriber::registry()
-        .with(filter_layer)
-        .with(tracing_subscriber::fmt::layer())
-        .init();
+    fern::Dispatch::new()
+        .format(|out, message, record| match record.level() {
+            Level::Error => {
+                out.finish(format_args!(
+                    "{}{} {}",
+                    "error".red().bold(),
+                    ":".bold(),
+                    message
+                ));
+            }
+            Level::Warn => {
+                out.finish(format_args!(
+                    "{}{} {}",
+                    "warning".yellow().bold(),
+                    ":".bold(),
+                    message
+                ));
+            }
+            Level::Info | Level::Debug | Level::Trace => {
+                out.finish(format_args!(
+                    "{}[{}][{}] {}",
+                    chrono::Local::now().format("[%Y-%m-%d][%H:%M:%S]"),
+                    record.target(),
+                    record.level(),
+                    message
+                ));
+            }
+        })
+        .level(level.level_filter())
+        .level_for("globset", log::LevelFilter::Warn)
+        .chain(std::io::stderr())
+        .apply()?;
     Ok(())
 }
@@ -33,7 +33,7 @@ expression: content
       },
       "message": "`os` imported but unused",
       "noqa_row": 1,
-      "url": "https://beta.ruff.rs/docs/rules/unused-import"
+      "url": "https://docs.astral.sh/ruff/rules/unused-import"
     },
     {
       "code": "F841",
@@ -65,7 +65,7 @@ expression: content
       },
       "message": "Local variable `x` is assigned to but never used",
       "noqa_row": 6,
-      "url": "https://beta.ruff.rs/docs/rules/unused-variable"
+      "url": "https://docs.astral.sh/ruff/rules/unused-variable"
     },
     {
       "code": "F821",
@@ -81,6 +81,6 @@ expression: content
       },
       "message": "Undefined name `a`",
       "noqa_row": 1,
-      "url": "https://beta.ruff.rs/docs/rules/undefined-name"
+      "url": "https://docs.astral.sh/ruff/rules/undefined-name"
     }
   ]
@@ -2,7 +2,7 @@
 source: crates/ruff/src/message/json_lines.rs
 expression: content
 ---
-{"code":"F401","end_location":{"column":10,"row":1},"filename":"fib.py","fix":{"applicability":"Suggested","edits":[{"content":"","end_location":{"column":1,"row":2},"location":{"column":1,"row":1}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":1},"message":"`os` imported but unused","noqa_row":1,"url":"https://beta.ruff.rs/docs/rules/unused-import"}
-{"code":"F841","end_location":{"column":6,"row":6},"filename":"fib.py","fix":{"applicability":"Suggested","edits":[{"content":"","end_location":{"column":10,"row":6},"location":{"column":5,"row":6}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":6},"message":"Local variable `x` is assigned to but never used","noqa_row":6,"url":"https://beta.ruff.rs/docs/rules/unused-variable"}
-{"code":"F821","end_location":{"column":5,"row":1},"filename":"undef.py","fix":null,"location":{"column":4,"row":1},"message":"Undefined name `a`","noqa_row":1,"url":"https://beta.ruff.rs/docs/rules/undefined-name"}
+{"code":"F401","end_location":{"column":10,"row":1},"filename":"fib.py","fix":{"applicability":"Suggested","edits":[{"content":"","end_location":{"column":1,"row":2},"location":{"column":1,"row":1}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":1},"message":"`os` imported but unused","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
+{"code":"F841","end_location":{"column":6,"row":6},"filename":"fib.py","fix":{"applicability":"Suggested","edits":[{"content":"","end_location":{"column":10,"row":6},"location":{"column":5,"row":6}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":6},"message":"Local variable `x` is assigned to but never used","noqa_row":6,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}
+{"code":"F821","end_location":{"column":5,"row":1},"filename":"undef.py","fix":null,"location":{"column":4,"row":1},"message":"Undefined name `a`","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/undefined-name"}
@@ -11,7 +11,7 @@ use log::warn;
 use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
 
 use ruff_diagnostics::Diagnostic;
-use ruff_python_trivia::indentation_at_offset;
+use ruff_python_trivia::{indentation_at_offset, CommentRanges};
 use ruff_source_file::{LineEnding, Locator};
 
 use crate::codes::NoqaCode;
@@ -234,7 +234,7 @@ impl FileExemption {
     /// globally ignored within the file.
     pub(crate) fn try_extract(
         contents: &str,
-        comment_ranges: &[TextRange],
+        comment_ranges: &CommentRanges,
         path: &Path,
         locator: &Locator,
     ) -> Option<Self> {
@@ -457,7 +457,7 @@ pub(crate) fn add_noqa(
     path: &Path,
     diagnostics: &[Diagnostic],
     locator: &Locator,
-    commented_lines: &[TextRange],
+    comment_ranges: &CommentRanges,
     noqa_line_for: &NoqaMapping,
     line_ending: LineEnding,
 ) -> Result<usize> {
@@ -465,7 +465,7 @@ pub(crate) fn add_noqa(
         path,
         diagnostics,
         locator,
-        commented_lines,
+        comment_ranges,
         noqa_line_for,
         line_ending,
     );
@@ -477,7 +477,7 @@ fn add_noqa_inner(
     path: &Path,
     diagnostics: &[Diagnostic],
     locator: &Locator,
-    commented_ranges: &[TextRange],
+    comment_ranges: &CommentRanges,
     noqa_line_for: &NoqaMapping,
     line_ending: LineEnding,
 ) -> (usize, String) {
@@ -487,8 +487,8 @@ fn add_noqa_inner(
 
-    // Whether the file is exempted from all checks.
+    // Codes that are globally exempted (within the current file).
-    let exemption = FileExemption::try_extract(locator.contents(), commented_ranges, path, locator);
-    let directives = NoqaDirectives::from_commented_ranges(commented_ranges, path, locator);
+    let exemption = FileExemption::try_extract(locator.contents(), comment_ranges, path, locator);
+    let directives = NoqaDirectives::from_commented_ranges(comment_ranges, path, locator);
 
     // Mark any non-ignored diagnostics.
     for diagnostic in diagnostics {
@@ -658,7 +658,7 @@ pub(crate) struct NoqaDirectives<'a> {
 
 impl<'a> NoqaDirectives<'a> {
     pub(crate) fn from_commented_ranges(
-        comment_ranges: &[TextRange],
+        comment_ranges: &CommentRanges,
         path: &Path,
         locator: &'a Locator<'a>,
     ) -> Self {
@@ -800,6 +800,7 @@ mod tests {
     use ruff_text_size::{TextRange, TextSize};
 
     use ruff_diagnostics::Diagnostic;
+    use ruff_python_trivia::CommentRanges;
     use ruff_source_file::{LineEnding, Locator};
 
     use crate::noqa::{add_noqa_inner, Directive, NoqaMapping, ParsedFileExemption};
@@ -997,7 +998,7 @@ mod tests {
             path,
             &[],
             &Locator::new(contents),
-            &[],
+            &CommentRanges::default(),
             &noqa_line_for,
             LineEnding::Lf,
         );
@@ -1017,7 +1018,7 @@ mod tests {
             path,
             &diagnostics,
             &Locator::new(contents),
-            &[],
+            &CommentRanges::default(),
             &noqa_line_for,
             LineEnding::Lf,
         );
@@ -1038,11 +1039,13 @@ mod tests {
         ];
         let contents = "x = 1 # noqa: E741\n";
         let noqa_line_for = NoqaMapping::default();
+        let comment_ranges =
+            CommentRanges::new(vec![TextRange::new(TextSize::from(7), TextSize::from(19))]);
         let (count, output) = add_noqa_inner(
             path,
             &diagnostics,
             &Locator::new(contents),
-            &[TextRange::new(TextSize::from(7), TextSize::from(19))],
+            &comment_ranges,
             &noqa_line_for,
             LineEnding::Lf,
         );
@@ -1063,11 +1066,13 @@ mod tests {
         ];
         let contents = "x = 1 # noqa";
         let noqa_line_for = NoqaMapping::default();
+        let comment_ranges =
+            CommentRanges::new(vec![TextRange::new(TextSize::from(7), TextSize::from(13))]);
         let (count, output) = add_noqa_inner(
             path,
             &diagnostics,
             &Locator::new(contents),
-            &[TextRange::new(TextSize::from(7), TextSize::from(13))],
+            &comment_ranges,
             &noqa_line_for,
             LineEnding::Lf,
         );

@@ -199,6 +199,9 @@ pub enum Linter {
     /// [refurb](https://pypi.org/project/refurb/)
     #[prefix = "FURB"]
     Refurb,
+    /// [flake8-logging](https://pypi.org/project/flake8-logging/)
+    #[prefix = "LOG"]
+    Flake8Logging,
     /// Ruff-specific rules
     #[prefix = "RUF"]
     Ruff,
@@ -248,7 +251,6 @@ impl Rule {
             | Rule::MissingCopyrightNotice
             | Rule::MissingNewlineAtEndOfFile
             | Rule::MixedSpacesAndTabs
-            | Rule::TabIndentation
             | Rule::TrailingWhitespace => LintSource::PhysicalLines,
             Rule::AmbiguousUnicodeCharacterComment
             | Rule::AmbiguousUnicodeCharacterDocstring
@@ -289,6 +291,7 @@ impl Rule {
             | Rule::ShebangNotExecutable
             | Rule::ShebangNotFirstLine
             | Rule::SingleLineImplicitStringConcatenation
+            | Rule::TabIndentation
             | Rule::TrailingCommaOnBareTuple
             | Rule::TypeCommentInStub
             | Rule::UselessSemicolon

@@ -98,5 +98,7 @@ static REDIRECTS: Lazy<HashMap<&'static str, &'static str>> = Lazy::new(|| {
         ("T002", "FIX002"),
         ("T003", "FIX003"),
         ("T004", "FIX004"),
+        ("PGH001", "S307"),
+        ("PGH002", "G010"),
     ])
 });

@@ -15,10 +15,8 @@ use crate::settings::types::PreviewMode;
 pub enum RuleSelector {
     /// Select all rules (includes rules in preview if enabled)
     All,
-    /// Category to select all rules in preview (includes legacy nursery rules)
-    Preview,
     /// Legacy category to select all rules in the "nursery" which predated preview mode
-    #[deprecated(note = "Use `RuleSelector::Preview` for new rules instead")]
+    #[deprecated(note = "The nursery was replaced with 'preview mode' which has no selector")]
     Nursery,
     /// Legacy category to select both the `mccabe` and `flake8-comprehensions` linters
     /// via a single selector.
@@ -54,7 +52,6 @@ impl FromStr for RuleSelector {
             "ALL" => Ok(Self::All),
             #[allow(deprecated)]
             "NURSERY" => Ok(Self::Nursery),
-            "PREVIEW" => Ok(Self::Preview),
             "C" => Ok(Self::C),
             "T" => Ok(Self::T),
             _ => {
@@ -121,7 +118,6 @@ impl RuleSelector {
             RuleSelector::All => ("", "ALL"),
             #[allow(deprecated)]
             RuleSelector::Nursery => ("", "NURSERY"),
-            RuleSelector::Preview => ("", "PREVIEW"),
             RuleSelector::C => ("", "C"),
             RuleSelector::T => ("", "T"),
             RuleSelector::Prefix { prefix, .. } | RuleSelector::Rule { prefix, .. } => {
@@ -185,9 +181,6 @@ impl RuleSelector {
             RuleSelector::Nursery => {
                 RuleSelectorIter::Nursery(Rule::iter().filter(Rule::is_nursery))
             }
-            RuleSelector::Preview => RuleSelectorIter::Nursery(
-                Rule::iter().filter(|rule| rule.is_preview() || rule.is_nursery()),
-            ),
             RuleSelector::C => RuleSelectorIter::Chain(
                 Linter::Flake8Comprehensions
                     .rules()
@@ -261,8 +254,9 @@ mod schema {
                 instance_type: Some(InstanceType::String.into()),
                 enum_values: Some(
                     [
-                        // Include the non-standard "ALL" selector.
+                        // Include the non-standard "ALL" and "NURSERY" selectors.
                         "ALL".to_string(),
+                        "NURSERY".to_string(),
                         // Include the legacy "C" and "T" selectors.
                         "C".to_string(),
                         "T".to_string(),
@@ -301,7 +295,6 @@ impl RuleSelector {
     pub fn specificity(&self) -> Specificity {
         match self {
             RuleSelector::All => Specificity::All,
-            RuleSelector::Preview => Specificity::All,
             #[allow(deprecated)]
             RuleSelector::Nursery => Specificity::All,
             RuleSelector::T => Specificity::LinterGroup,
@@ -343,13 +336,14 @@ pub enum Specificity {
 }
 
 #[cfg(feature = "clap")]
-mod clap_completion {
+pub mod clap_completion {
     use clap::builder::{PossibleValue, TypedValueParser, ValueParserFactory};
     use strum::IntoEnumIterator;
 
     use crate::{
         codes::RuleCodePrefix,
         registry::{Linter, RuleNamespace},
+        rule_selector::is_single_rule_selector,
         RuleSelector,
     };
@@ -369,17 +363,29 @@ mod clap_completion {
 
     fn parse_ref(
         &self,
-        _cmd: &clap::Command,
-        _arg: Option<&clap::Arg>,
+        cmd: &clap::Command,
+        arg: Option<&clap::Arg>,
         value: &std::ffi::OsStr,
     ) -> Result<Self::Value, clap::Error> {
         let value = value
             .to_str()
             .ok_or_else(|| clap::Error::new(clap::error::ErrorKind::InvalidUtf8))?;
 
-        value
-            .parse()
-            .map_err(|e| clap::Error::raw(clap::error::ErrorKind::InvalidValue, e))
+        value.parse().map_err(|_| {
+            let mut error =
+                clap::Error::new(clap::error::ErrorKind::ValueValidation).with_cmd(cmd);
+            if let Some(arg) = arg {
+                error.insert(
+                    clap::error::ContextKind::InvalidArg,
+                    clap::error::ContextValue::String(arg.to_string()),
+                );
+            }
+            error.insert(
+                clap::error::ContextKind::InvalidValue,
+                clap::error::ContextValue::String(value.to_string()),
+            );
+            error
+        })
     }
 
     fn possible_values(&self) -> Option<Box<dyn Iterator<Item = PossibleValue> + '_>> {
@@ -394,27 +400,34 @@ mod clap_completion {
                 RuleCodePrefix::iter()
                     // Filter out rule gated behind `#[cfg(feature = "unreachable-code")]`, which is
                     // off-by-default
                     .filter(|p| {
                         format!("{}{}", p.linter().common_prefix(), p.short_code())
                             != "RUF014"
                     .filter(|prefix| {
                         format!(
                             "{}{}",
                             prefix.linter().common_prefix(),
                             prefix.short_code()
                         ) != "RUF014"
                     })
                     .map(|p| {
                         let prefix = p.linter().common_prefix();
                         let code = p.short_code();
 
                         let mut rules_iter = p.rules();
                         let rule1 = rules_iter.next();
                         let rule2 = rules_iter.next();
 
                         let value = PossibleValue::new(format!("{prefix}{code}"));
 
                         if rule2.is_none() {
                             let rule1 = rule1.unwrap();
                             let name: &'static str = rule1.into();
                             value.help(name)
                         } else {
                             value
                     .filter_map(|prefix| {
                         // Ex) `UP`
                         if prefix.short_code().is_empty() {
                             let code = prefix.linter().common_prefix();
                             let name = prefix.linter().name();
                             return Some(PossibleValue::new(code).help(name));
                         }
 
                         // Ex) `UP004`
                         if is_single_rule_selector(&prefix) {
                             let rule = prefix.rules().next()?;
                             let code = format!(
                                 "{}{}",
                                 prefix.linter().common_prefix(),
                                 prefix.short_code()
                             );
                             let name: &'static str = rule.into();
                             return Some(PossibleValue::new(code).help(name));
                         }
 
                         None
                     }),
             ),
         ),

@@ -148,7 +148,7 @@ impl Violation for StartProcessWithNoShell {
 ///
 /// ## References
 /// - [Python documentation: `subprocess.Popen()`](https://docs.python.org/3/library/subprocess.html#subprocess.Popen)
 /// - [Common Weakness Enumeration: CWE-78](https://cwe.mitre.org/data/definitions/78.html)
 /// - [Common Weakness Enumeration: CWE-426](https://cwe.mitre.org/data/definitions/426.html)
 #[violation]
 pub struct StartProcessWithPartialPath;

@@ -1,3 +1,4 @@
+use crate::autofix::edits::pad;
 use ruff_diagnostics::{AlwaysAutofixableViolation, Diagnostic, Edit, Fix};
 use ruff_macros::{derive_message_formats, violation};
 use ruff_python_ast::{self as ast, Constant, Expr};
@@ -80,17 +81,21 @@ pub(crate) fn getattr_with_constant(
     let mut diagnostic = Diagnostic::new(GetAttrWithConstant, expr.range());
     if checker.patch(diagnostic.kind.rule()) {
         diagnostic.set_fix(Fix::suggested(Edit::range_replacement(
-            if matches!(
-                obj,
-                Expr::Name(_) | Expr::Attribute(_) | Expr::Subscript(_) | Expr::Call(_)
-            ) {
-                format!("{}.{}", checker.locator().slice(obj), value)
-            } else {
-                // Defensively parenthesize any other expressions. For example, attribute accesses
-                // on `int` literals must be parenthesized, e.g., `getattr(1, "real")` becomes
-                // `(1).real`. The same is true for named expressions and others.
-                format!("({}).{}", checker.locator().slice(obj), value)
-            },
+            pad(
+                if matches!(
+                    obj,
+                    Expr::Name(_) | Expr::Attribute(_) | Expr::Subscript(_) | Expr::Call(_)
+                ) {
+                    format!("{}.{}", checker.locator().slice(obj), value)
+                } else {
+                    // Defensively parenthesize any other expressions. For example, attribute accesses
+                    // on `int` literals must be parenthesized, e.g., `getattr(1, "real")` becomes
+                    // `(1).real`. The same is true for named expressions and others.
+                    format!("({}).{}", checker.locator().slice(obj), value)
+                },
+                expr.range(),
+                checker.locator(),
+            ),
             expr.range(),
         )));
     }

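The defensive parenthesization above exists because Python's grammar treats `1.` as the start of a float literal, so a plain attribute access on an int literal is a syntax error. A minimal illustration (the `Obj` class here is a hypothetical stand-in, not from the fixture files):

```python
# `getattr(1, "real")` can be rewritten as `(1).real`, but `1.real` would not
# parse: `1.` is read as the beginning of a float literal.
value = getattr(1, "real")
rewritten = (1).real
assert value == rewritten == 1

# Name-like expressions need no parentheses: `getattr(obj, "attr")` -> `obj.attr`.
class Obj:
    attr = "x"

assert getattr(Obj, "attr") == Obj.attr == "x"
```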
@@ -70,6 +70,7 @@ impl Violation for MutableArgumentDefault {
     fn message(&self) -> String {
         format!("Do not use mutable data structures for argument defaults")
     }
 
     fn autofix_title(&self) -> Option<String> {
         Some(format!("Replace with `None`; initialize within function"))
     }

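The fix title "Replace with `None`; initialize within function" refers to the standard workaround for Python's mutable-default pitfall, which can be sketched as (function names here are illustrative only):

```python
def append_bad(item, items=[]):  # mutable default: one list shared across calls
    items.append(item)
    return items

def append_good(item, items=None):  # the suggested fix: default to None
    if items is None:
        items = []  # fresh list on every call
    items.append(item)
    return items

assert append_bad(1) == [1]
assert append_bad(2) == [1, 2]   # surprise: the default list persisted
assert append_good(1) == [1]
assert append_good(2) == [2]     # no shared state
```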
@@ -17,8 +17,8 @@ use crate::checkers::ast::Checker;
 /// contains multiple characters, the reader may be misled into thinking that
 /// a prefix or suffix is being removed, rather than a set of characters.
 ///
-/// In Python 3.9 and later, you can use `str#removeprefix` and
-/// `str#removesuffix` to remove an exact prefix or suffix from a string,
+/// In Python 3.9 and later, you can use `str.removeprefix` and
+/// `str.removesuffix` to remove an exact prefix or suffix from a string,
 /// respectively, which should be preferred when possible.
 ///
 /// ## Example

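The distinction the docstring draws can be shown in two lines: `lstrip` treats its argument as a character *set* and strips greedily, while `removeprefix` (Python 3.9+) removes the exact prefix at most once:

```python
# lstrip("ban") strips every leading character in the set {'b', 'a', 'n'}:
assert "banana".lstrip("ban") == ""

# removeprefix("ban") removes the literal prefix "ban" exactly once:
assert "banana".removeprefix("ban") == "ana"

# ...and leaves the string unchanged when the prefix is absent:
assert "banana".removeprefix("xyz") == "banana"
```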
@@ -316,4 +316,19 @@ B009_B010.py:34:1: B009 [*] Do not call `getattr` with a constant attribute value. It is not any safer than normal property access.
37 37 | 
38 38 | # Valid setattr usage

B009_B010.py:58:8: B009 [*] Do not call `getattr` with a constant attribute value. It is not any safer than normal property access.
   |
57 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458885
58 | assert getattr(func, '_rpc')is True
   | ^^^^^^^^^^^^^^^^^^^^^ B009
   |
   = help: Replace `getattr` with attribute access

ℹ Suggested fix
55 55 | setattr(foo.bar, r"baz", None)
56 56 | 
57 57 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458885
58 |-assert getattr(func, '_rpc')is True
58 |+assert func._rpc is True

@@ -82,6 +82,7 @@ B009_B010.py:53:1: B010 [*] Do not call `setattr` with a constant attribute value. It is not any safer than normal property access.
53 |+foo.abc123 = None
54 54 | setattr(foo, r"abc123", None)
55 55 | setattr(foo.bar, r"baz", None)
56 56 | 

B009_B010.py:54:1: B010 [*] Do not call `setattr` with a constant attribute value. It is not any safer than normal property access.
   |
@@ -100,6 +101,8 @@ B009_B010.py:54:1: B010 [*] Do not call `setattr` with a constant attribute value. It is not any safer than normal property access.
54 |-setattr(foo, r"abc123", None)
54 |+foo.abc123 = None
55 55 | setattr(foo.bar, r"baz", None)
56 56 | 
57 57 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458885

B009_B010.py:55:1: B010 [*] Do not call `setattr` with a constant attribute value. It is not any safer than normal property access.
   |
@@ -107,6 +110,8 @@ B009_B010.py:55:1: B010 [*] Do not call `setattr` with a constant attribute value. It is not any safer than normal property access.
54 | setattr(foo, r"abc123", None)
55 | setattr(foo.bar, r"baz", None)
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ B010
56 | 
57 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458885
   |
   = help: Replace `setattr` with assignment

@@ -116,5 +121,8 @@ B009_B010.py:55:1: B010 [*] Do not call `setattr` with a constant attribute value. It is not any safer than normal property access.
54 54 | setattr(foo, r"abc123", None)
55 |-setattr(foo.bar, r"baz", None)
55 |+foo.bar.baz = None
56 56 | 
57 57 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458885
58 58 | assert getattr(func, '_rpc')is True

@@ -137,13 +137,13 @@ fn is_standard_library_override(
         return false;
     };
     match name {
-        // Ex) `Event#set`
+        // Ex) `Event.set`
         "set" => bases.iter().any(|base| {
             semantic
                 .resolve_call_path(base)
                 .is_some_and(|call_path| matches!(call_path.as_slice(), ["threading", "Event"]))
         }),
-        // Ex) `Filter#filter`
+        // Ex) `Filter.filter`
         "filter" => bases.iter().any(|base| {
             semantic
                 .resolve_call_path(base)

@@ -1,5 +1,6 @@
 use ruff_diagnostics::{AlwaysAutofixableViolation, Diagnostic, Fix};
 use ruff_macros::{derive_message_formats, violation};
+use ruff_python_ast::comparable::ComparableExpr;
 use ruff_python_ast::{self as ast, Comprehension, Expr};
 use ruff_text_size::Ranged;
 
@@ -86,28 +87,16 @@ pub(crate) fn unnecessary_dict_comprehension(
     if !generator.ifs.is_empty() || generator.is_async {
         return;
     }
-    let Some(key) = key.as_name_expr() else {
-        return;
-    };
-    let Some(value) = value.as_name_expr() else {
-        return;
-    };
     let Expr::Tuple(ast::ExprTuple { elts, .. }) = &generator.target else {
         return;
     };
     let [target_key, target_value] = elts.as_slice() else {
         return;
     };
-    let Some(target_key) = target_key.as_name_expr() else {
-        return;
-    };
-    let Some(target_value) = target_value.as_name_expr() else {
-        return;
-    };
-    if target_key.id != key.id {
+    if ComparableExpr::from(key) != ComparableExpr::from(target_key) {
         return;
     }
-    if target_value.id != value.id {
+    if ComparableExpr::from(value) != ComparableExpr::from(target_value) {
         return;
     }
     add_diagnostic(checker, expr);
@@ -126,13 +115,7 @@ pub(crate) fn unnecessary_list_set_comprehension(
     if !generator.ifs.is_empty() || generator.is_async {
         return;
     }
-    let Some(elt) = elt.as_name_expr() else {
-        return;
-    };
-    let Some(target) = generator.target.as_name_expr() else {
-        return;
-    };
-    if elt.id != target.id {
+    if ComparableExpr::from(elt) != ComparableExpr::from(&generator.target) {
         return;
     }
     add_diagnostic(checker, expr);

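For reference, the Python-level equivalence these "unnecessary comprehension" checks rely on: a comprehension that passes each element (or key/value pair) through unchanged is just a copy of the iterable, so the collection constructor can be used directly. A small sketch:

```python
d = {"a": 1, "b": 2}

# A dict comprehension that copies key/value pairs unchanged...
copied = {k: v for k, v in d.items()}
# ...is equivalent to passing the iterable straight to dict():
assert copied == dict(d.items()) == d

# Likewise for the tuple-target list form now matched via ComparableExpr:
pairs = [(k, v) for k, v in d.items()]
assert pairs == list(d.items())
```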
@@ -54,18 +54,23 @@ pub(crate) fn unnecessary_generator_dict(
     else {
         return;
     };
-    if let Expr::GeneratorExp(ast::ExprGeneratorExp { elt, .. }) = argument {
-        match elt.as_ref() {
-            Expr::Tuple(ast::ExprTuple { elts, .. }) if elts.len() == 2 => {
-                let mut diagnostic = Diagnostic::new(UnnecessaryGeneratorDict, expr.range());
-                if checker.patch(diagnostic.kind.rule()) {
-                    diagnostic.try_set_fix(|| {
-                        fixes::fix_unnecessary_generator_dict(expr, checker).map(Fix::suggested)
-                    });
-                }
-                checker.diagnostics.push(diagnostic);
-            }
-            _ => {}
-        }
+    let Expr::GeneratorExp(ast::ExprGeneratorExp { elt, .. }) = argument else {
+        return;
+    };
+    let Expr::Tuple(ast::ExprTuple { elts, .. }) = elt.as_ref() else {
+        return;
+    };
+    if elts.len() != 2 {
+        return;
+    }
+    if elts.iter().any(Expr::is_starred_expr) {
+        return;
+    }
+    let mut diagnostic = Diagnostic::new(UnnecessaryGeneratorDict, expr.range());
+    if checker.patch(diagnostic.kind.rule()) {
+        diagnostic.try_set_fix(|| {
+            fixes::fix_unnecessary_generator_dict(expr, checker).map(Fix::suggested)
+        });
+    }
+    checker.diagnostics.push(diagnostic);
 }

@@ -251,6 +251,8 @@ C402.py:21:1: C402 [*] Unnecessary generator (rewrite as a `dict` comprehension)
20 | # Regression test for: https://github.com/astral-sh/ruff/issues/7086
21 | dict((k,v)for k,v in d.iteritems() if k in only_args)
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ C402
22 | 
23 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458940
   |
   = help: Rewrite as a `dict` comprehension

@@ -260,5 +262,8 @@ C402.py:21:1: C402 [*] Unnecessary generator (rewrite as a `dict` comprehension)
20 20 | # Regression test for: https://github.com/astral-sh/ruff/issues/7086
21 |-dict((k,v)for k,v in d.iteritems() if k in only_args)
21 |+{k: v for k,v in d.iteritems() if k in only_args}
22 22 | 
23 23 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722458940
24 24 | dict((*v, k) for k, v in enumerate(calendar.month_abbr))

@@ -40,17 +40,18 @@ C416.py:7:1: C416 [*] Unnecessary `set` comprehension (rewrite using `set()`)
7 |+set(x)
8 8 | {k: v for k, v in y}
9 9 | {k: v for k, v in d.items()}
10 10 | 
10 10 | [(k, v) for k, v in d.items()]

C416.py:8:1: C416 [*] Unnecessary `dict` comprehension (rewrite using `dict()`)
   |
 6 | [i for i in x]
 7 | {i for i in x}
 8 | {k: v for k, v in y}
   | ^^^^^^^^^^^^^^^^^^^^ C416
 9 | {k: v for k, v in d.items()}
   |
   = help: Rewrite using `dict()`
   |
 6 | [i for i in x]
 7 | {i for i in x}
 8 | {k: v for k, v in y}
   | ^^^^^^^^^^^^^^^^^^^^ C416
 9 | {k: v for k, v in d.items()}
10 | [(k, v) for k, v in d.items()]
   |
   = help: Rewrite using `dict()`

ℹ Suggested fix
5 5 | 
@@ -59,8 +60,8 @@ C416.py:8:1: C416 [*] Unnecessary `dict` comprehension (rewrite using `dict()`)
8 |-{k: v for k, v in y}
8 |+dict(y)
9 9 | {k: v for k, v in d.items()}
10 10 | 
11 11 | [i for i, in z]
10 10 | [(k, v) for k, v in d.items()]
11 11 | {k: (a, b) for k, (a, b) in d.items()}

C416.py:9:1: C416 [*] Unnecessary `dict` comprehension (rewrite using `dict()`)
   |
@@ -68,8 +69,8 @@ C416.py:9:1: C416 [*] Unnecessary `dict` comprehension (rewrite using `dict()`)
 8 | {k: v for k, v in y}
 9 | {k: v for k, v in d.items()}
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ C416
10 | 
11 | [i for i, in z]
10 | [(k, v) for k, v in d.items()]
11 | {k: (a, b) for k, (a, b) in d.items()}
   |
   = help: Rewrite using `dict()`

@@ -79,23 +80,64 @@ C416.py:9:1: C416 [*] Unnecessary `dict` comprehension (rewrite using `dict()`)
8 8 | {k: v for k, v in y}
9 |-{k: v for k, v in d.items()}
9 |+dict(d.items())
10 10 | 
11 11 | [i for i, in z]
12 12 | [i for i, j in y]
10 10 | [(k, v) for k, v in d.items()]
11 11 | {k: (a, b) for k, (a, b) in d.items()}
12 12 | 

C416.py:22:70: C416 [*] Unnecessary `list` comprehension (rewrite using `list()`)
C416.py:10:1: C416 [*] Unnecessary `list` comprehension (rewrite using `list()`)
   |
21 | # Regression test for: https://github.com/astral-sh/ruff/issues/7196
22 | any(len(symbol_table.get_by_type(symbol_type)) > 0 for symbol_type in[t for t in SymbolType])
 8 | {k: v for k, v in y}
 9 | {k: v for k, v in d.items()}
10 | [(k, v) for k, v in d.items()]
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ C416
11 | {k: (a, b) for k, (a, b) in d.items()}
   |
   = help: Rewrite using `list()`

ℹ Suggested fix
7 7 | {i for i in x}
8 8 | {k: v for k, v in y}
9 9 | {k: v for k, v in d.items()}
10 |-[(k, v) for k, v in d.items()]
10 |+list(d.items())
11 11 | {k: (a, b) for k, (a, b) in d.items()}
12 12 | 
13 13 | [i for i, in z]

C416.py:11:1: C416 [*] Unnecessary `dict` comprehension (rewrite using `dict()`)
   |
 9 | {k: v for k, v in d.items()}
10 | [(k, v) for k, v in d.items()]
11 | {k: (a, b) for k, (a, b) in d.items()}
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ C416
12 | 
13 | [i for i, in z]
   |
   = help: Rewrite using `dict()`

ℹ Suggested fix
8 8 | {k: v for k, v in y}
9 9 | {k: v for k, v in d.items()}
10 10 | [(k, v) for k, v in d.items()]
11 |-{k: (a, b) for k, (a, b) in d.items()}
11 |+dict(d.items())
12 12 | 
13 13 | [i for i, in z]
14 14 | [i for i, j in y]

C416.py:24:70: C416 [*] Unnecessary `list` comprehension (rewrite using `list()`)
   |
23 | # Regression test for: https://github.com/astral-sh/ruff/issues/7196
24 | any(len(symbol_table.get_by_type(symbol_type)) > 0 for symbol_type in[t for t in SymbolType])
   | ^^^^^^^^^^^^^^^^^^^^^^^ C416
   |
   = help: Rewrite using `list()`

ℹ Suggested fix
19 19 | {k: v if v else None for k, v in y}
20 20 | 
21 21 | # Regression test for: https://github.com/astral-sh/ruff/issues/7196
22 |-any(len(symbol_table.get_by_type(symbol_type)) > 0 for symbol_type in[t for t in SymbolType])
22 |+any(len(symbol_table.get_by_type(symbol_type)) > 0 for symbol_type in list(SymbolType))
21 21 | {k: v if v else None for k, v in y}
22 22 | 
23 23 | # Regression test for: https://github.com/astral-sh/ruff/issues/7196
24 |-any(len(symbol_table.get_by_type(symbol_type)) > 0 for symbol_type in[t for t in SymbolType])
24 |+any(len(symbol_table.get_by_type(symbol_type)) > 0 for symbol_type in list(SymbolType))

29
crates/ruff/src/rules/flake8_logging/mod.rs
Normal file
@@ -0,0 +1,29 @@
//! Rules from [flake8-logging](https://pypi.org/project/flake8-logging/).
pub(crate) mod rules;

#[cfg(test)]
mod tests {
    use std::path::Path;

    use anyhow::Result;
    use test_case::test_case;

    use crate::assert_messages;
    use crate::registry::Rule;
    use crate::settings::Settings;
    use crate::test::test_path;

    #[test_case(Rule::DirectLoggerInstantiation, Path::new("LOG001.py"))]
    #[test_case(Rule::InvalidGetLoggerArgument, Path::new("LOG002.py"))]
    #[test_case(Rule::ExceptionWithoutExcInfo, Path::new("LOG007.py"))]
    #[test_case(Rule::UndocumentedWarn, Path::new("LOG009.py"))]
    fn rules(rule_code: Rule, path: &Path) -> Result<()> {
        let snapshot = format!("{}_{}", rule_code.noqa_code(), path.to_string_lossy());
        let diagnostics = test_path(
            Path::new("flake8_logging").join(path).as_path(),
            &Settings::for_rule(rule_code),
        )?;
        assert_messages!(snapshot, diagnostics);
        Ok(())
    }
}
@@ -0,0 +1,77 @@
use ruff_diagnostics::{AutofixKind, Diagnostic, Edit, Fix, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast as ast;
use ruff_text_size::Ranged;

use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::registry::AsRule;

/// ## What it does
/// Checks for direct instantiation of `logging.Logger`, as opposed to using
/// `logging.getLogger()`.
///
/// ## Why is this bad?
/// The [Logger Objects] documentation states that:
///
/// > Note that Loggers should NEVER be instantiated directly, but always
/// > through the module-level function `logging.getLogger(name)`.
///
/// If a logger is directly instantiated, it won't be added to the logger
/// tree, and will bypass all configuration. Messages logged to it will
/// only be sent to the "handler of last resort", skipping any filtering
/// or formatting.
///
/// ## Example
/// ```python
/// import logging
///
/// logger = logging.Logger(__name__)
/// ```
///
/// Use instead:
/// ```python
/// import logging
///
/// logger = logging.getLogger(__name__)
/// ```
///
/// [Logger Objects]: https://docs.python.org/3/library/logging.html#logger-objects
#[violation]
pub struct DirectLoggerInstantiation;

impl Violation for DirectLoggerInstantiation {
    const AUTOFIX: AutofixKind = AutofixKind::Sometimes;

    #[derive_message_formats]
    fn message(&self) -> String {
        format!("Use `logging.getLogger()` to instantiate loggers")
    }

    fn autofix_title(&self) -> Option<String> {
        Some(format!("Replace with `logging.getLogger()`"))
    }
}

/// LOG001
pub(crate) fn direct_logger_instantiation(checker: &mut Checker, call: &ast::ExprCall) {
    if checker
        .semantic()
        .resolve_call_path(call.func.as_ref())
        .is_some_and(|call_path| matches!(call_path.as_slice(), ["logging", "Logger"]))
    {
        let mut diagnostic = Diagnostic::new(DirectLoggerInstantiation, call.func.range());
        if checker.patch(diagnostic.kind.rule()) {
            diagnostic.try_set_fix(|| {
                let (import_edit, binding) = checker.importer().get_or_import_symbol(
                    &ImportRequest::import("logging", "getLogger"),
                    call.func.start(),
                    checker.semantic(),
                )?;
                let reference_edit = Edit::range_replacement(binding, call.func.range());
                Ok(Fix::suggested_edits(import_edit, [reference_edit]))
            });
        }
        checker.diagnostics.push(diagnostic);
    }
}
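The "won't be added to the logger tree" claim in the LOG001 docstring can be observed directly: `getLogger` registers the logger with the module-level manager and is idempotent, while a directly instantiated `Logger` is invisible to it. A small demonstration (logger names are arbitrary; `Logger.manager.loggerDict` is the stdlib's internal registry):

```python
import logging

# Directly instantiated logger (what LOG001 flags): not in the manager's registry.
direct = logging.Logger("direct_example")
assert "direct_example" not in logging.Logger.manager.loggerDict

# getLogger registers the logger and returns the same object on repeat calls.
via_get = logging.getLogger("via_get_example")
assert "via_get_example" in logging.Logger.manager.loggerDict
assert logging.getLogger("via_get_example") is via_get
```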
@@ -0,0 +1,65 @@
use ruff_python_ast::ExprCall;

use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::Truthiness;
use ruff_python_semantic::analyze::logging::is_logger_candidate;
use ruff_text_size::Ranged;

use crate::checkers::ast::Checker;

/// ## What it does
/// Checks for uses of `logging.exception()` with `exc_info` set to `False`.
///
/// ## Why is this bad?
/// The `logging.exception()` method captures the exception automatically, but
/// accepts an optional `exc_info` argument to override this behavior. Setting
/// `exc_info` to `False` disables the automatic capture of the exception and
/// stack trace.
///
/// Instead of setting `exc_info` to `False`, prefer `logging.error()`, which
/// has equivalent behavior to `logging.exception()` with `exc_info` set to
/// `False`, but is clearer in intent.
///
/// ## Example
/// ```python
/// logging.exception("...", exc_info=False)
/// ```
///
/// Use instead:
/// ```python
/// logging.error("...")
/// ```
#[violation]
pub struct ExceptionWithoutExcInfo;

impl Violation for ExceptionWithoutExcInfo {
    #[derive_message_formats]
    fn message(&self) -> String {
        format!("Use of `logging.exception` with falsy `exc_info`")
    }
}

/// LOG007
pub(crate) fn exception_without_exc_info(checker: &mut Checker, call: &ExprCall) {
    if !is_logger_candidate(
        call.func.as_ref(),
        checker.semantic(),
        &["exception".to_string()],
    ) {
        return;
    }

    if call
        .arguments
        .find_keyword("exc_info")
        .map(|keyword| &keyword.value)
        .is_some_and(|value| {
            Truthiness::from_expr(value, |id| checker.semantic().is_builtin(id)).is_falsey()
        })
    {
        checker
            .diagnostics
            .push(Diagnostic::new(ExceptionWithoutExcInfo, call.range()));
    }
}
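A quick Python sketch of the equivalence the rule's docs describe (logger name is arbitrary): `exception(..., exc_info=False)` suppresses the traceback, at which point it behaves like plain `error(...)`.

```python
import io
import logging

stream = io.StringIO()
logger = logging.getLogger("log007_demo")
logger.addHandler(logging.StreamHandler(stream))

try:
    1 / 0
except ZeroDivisionError:
    # With exc_info=False, exception() logs no traceback...
    logger.exception("failed", exc_info=False)
    # ...which is exactly what error() does by default.
    logger.error("failed")

output = stream.getvalue()
assert output == "failed\nfailed\n"
assert "Traceback" not in output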
@@ -0,0 +1,92 @@
use ruff_diagnostics::{AutofixKind, Diagnostic, Edit, Fix, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::{self as ast, Expr};
use ruff_text_size::Ranged;

use crate::checkers::ast::Checker;
use crate::registry::AsRule;

/// ## What it does
/// Checks for any usage of `__cached__` and `__file__` as an argument to
/// `logging.getLogger()`.
///
/// ## Why is this bad?
/// The [logging documentation] recommends this pattern:
///
/// ```python
/// logging.getLogger(__name__)
/// ```
///
/// Here, `__name__` is the fully qualified module name, such as `foo.bar`,
/// which is the intended format for logger names.
///
/// This rule detects probably-mistaken usage of similar module-level dunder constants:
///
/// * `__cached__` - the pathname of the module's compiled version, such as `foo/__pycache__/bar.cpython-311.pyc`.
/// * `__file__` - the pathname of the module, such as `foo/bar.py`.
///
/// ## Example
/// ```python
/// import logging
///
/// logger = logging.getLogger(__file__)
/// ```
///
/// Use instead:
/// ```python
/// import logging
///
/// logger = logging.getLogger(__name__)
/// ```
///
/// [logging documentation]: https://docs.python.org/3/library/logging.html#logger-objects
#[violation]
pub struct InvalidGetLoggerArgument;

impl Violation for InvalidGetLoggerArgument {
    const AUTOFIX: AutofixKind = AutofixKind::Sometimes;

    #[derive_message_formats]
    fn message(&self) -> String {
        format!("Use `__name__` with `logging.getLogger()`")
    }

    fn autofix_title(&self) -> Option<String> {
        Some(format!("Replace with `__name__`"))
    }
}

/// LOG002
pub(crate) fn invalid_get_logger_argument(checker: &mut Checker, call: &ast::ExprCall) {
    let Some(Expr::Name(expr @ ast::ExprName { id, .. })) = call.arguments.find_argument("name", 0)
    else {
        return;
    };

    if !matches!(id.as_ref(), "__file__" | "__cached__") {
        return;
    }

    if !checker.semantic().is_builtin(id) {
        return;
    }

    if !checker
        .semantic()
        .resolve_call_path(call.func.as_ref())
        .is_some_and(|call_path| matches!(call_path.as_slice(), ["logging", "getLogger"]))
    {
        return;
    }

    let mut diagnostic = Diagnostic::new(InvalidGetLoggerArgument, expr.range());
    if checker.patch(diagnostic.kind.rule()) {
        if checker.semantic().is_builtin("__name__") {
            diagnostic.set_fix(Fix::suggested(Edit::range_replacement(
                "__name__".to_string(),
                expr.range(),
            )));
        }
    }
    checker.diagnostics.push(diagnostic);
}
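To illustrate the distinction the rule's docs draw (a sketch using the stdlib `logging` module itself as the example module): `__name__` is a dotted module name that fits the logger hierarchy, while `__file__` is a filesystem path.

```python
import logging

# __name__ is the dotted module name -- the format logger hierarchies expect.
assert logging.__name__ == "logging"

# __file__ is a filesystem path, which makes a poor logger name.
assert logging.__file__.endswith(".py")

# Dotted names give you parent/child logger relationships for free.
parent = logging.getLogger("app")
child = logging.getLogger("app.db")
assert child.parent is parent
```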
9 crates/ruff/src/rules/flake8_logging/rules/mod.rs Normal file
@@ -0,0 +1,9 @@
pub(crate) use direct_logger_instantiation::*;
pub(crate) use exception_without_exc_info::*;
pub(crate) use invalid_get_logger_argument::*;
pub(crate) use undocumented_warn::*;

mod direct_logger_instantiation;
mod exception_without_exc_info;
mod invalid_get_logger_argument;
mod undocumented_warn;
@@ -0,0 +1,71 @@
use ruff_python_ast::Expr;

use ruff_diagnostics::{AutofixKind, Diagnostic, Edit, Fix, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_text_size::Ranged;

use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::registry::AsRule;

/// ## What it does
/// Checks for uses of `logging.WARN`.
///
/// ## Why is this bad?
/// The `logging.WARN` constant is an undocumented alias for `logging.WARNING`.
///
/// Although it's not explicitly deprecated, `logging.WARN` is not mentioned
/// in the `logging` documentation. Prefer `logging.WARNING` instead.
///
/// ## Example
/// ```python
/// import logging
///
///
/// logging.basicConfig(level=logging.WARN)
/// ```
///
/// Use instead:
/// ```python
/// import logging
///
///
/// logging.basicConfig(level=logging.WARNING)
/// ```
#[violation]
pub struct UndocumentedWarn;

impl Violation for UndocumentedWarn {
    const AUTOFIX: AutofixKind = AutofixKind::Sometimes;

    #[derive_message_formats]
    fn message(&self) -> String {
        format!("Use of undocumented `logging.WARN` constant")
    }

    fn autofix_title(&self) -> Option<String> {
        Some(format!("Replace `logging.WARN` with `logging.WARNING`"))
    }
}

/// LOG009
pub(crate) fn undocumented_warn(checker: &mut Checker, expr: &Expr) {
    if checker
        .semantic()
        .resolve_call_path(expr)
        .is_some_and(|call_path| matches!(call_path.as_slice(), ["logging", "WARN"]))
    {
        let mut diagnostic = Diagnostic::new(UndocumentedWarn, expr.range());
        if checker.patch(diagnostic.kind.rule()) {
            diagnostic.try_set_fix(|| {
                let (import_edit, binding) = checker.importer().get_or_import_symbol(
                    &ImportRequest::import("logging", "WARNING"),
                    expr.start(),
                    checker.semantic(),
                )?;
                let reference_edit = Edit::range_replacement(binding, expr.range());
                Ok(Fix::suggested_edits(import_edit, [reference_edit]))
            });
        }
        checker.diagnostics.push(diagnostic);
    }
}
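The alias relationship the rule relies on can be checked directly — both names resolve to the same numeric level, but only `WARNING` is the documented spelling:

```python
import logging

# WARN is an undocumented alias; both names refer to level 30.
assert logging.WARN == logging.WARNING == 30

# The canonical name for level 30 is "WARNING".
assert logging.getLevelName(30) == "WARNING"
```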
@@ -0,0 +1,40 @@
---
source: crates/ruff/src/rules/flake8_logging/mod.rs
---
LOG001.py:3:1: LOG001 [*] Use `logging.getLogger()` to instantiate loggers
  |
1 | import logging
2 |
3 | logging.Logger(__name__)
  | ^^^^^^^^^^^^^^ LOG001
4 | logging.Logger()
5 | logging.getLogger(__name__)
  |
  = help: Replace with `logging.getLogger()`

ℹ Suggested fix
1 1 | import logging
2 2 |
3   |-logging.Logger(__name__)
  3 |+logging.getLogger(__name__)
4 4 | logging.Logger()
5 5 | logging.getLogger(__name__)

LOG001.py:4:1: LOG001 [*] Use `logging.getLogger()` to instantiate loggers
  |
3 | logging.Logger(__name__)
4 | logging.Logger()
  | ^^^^^^^^^^^^^^ LOG001
5 | logging.getLogger(__name__)
  |
  = help: Replace with `logging.getLogger()`

ℹ Suggested fix
1 1 | import logging
2 2 |
3 3 | logging.Logger(__name__)
4   |-logging.Logger()
  4 |+logging.getLogger()
5 5 | logging.getLogger(__name__)
@@ -0,0 +1,82 @@
---
source: crates/ruff/src/rules/flake8_logging/mod.rs
---
LOG002.py:11:11: LOG002 [*] Use `__name__` with `logging.getLogger()`
   |
10 | # LOG002
11 | getLogger(__file__)
   |           ^^^^^^^^ LOG002
12 | logging.getLogger(name=__file__)
   |
   = help: Replace with `__name__`

ℹ Suggested fix
8  8  | logging.getLogger(name="custom")
9  9  |
10 10 | # LOG002
11    |-getLogger(__file__)
   11 |+getLogger(__name__)
12 12 | logging.getLogger(name=__file__)
13 13 |
14 14 | logging.getLogger(__cached__)

LOG002.py:12:24: LOG002 [*] Use `__name__` with `logging.getLogger()`
   |
10 | # LOG002
11 | getLogger(__file__)
12 | logging.getLogger(name=__file__)
   |                        ^^^^^^^^ LOG002
13 |
14 | logging.getLogger(__cached__)
   |
   = help: Replace with `__name__`

ℹ Suggested fix
9  9  |
10 10 | # LOG002
11 11 | getLogger(__file__)
12    |-logging.getLogger(name=__file__)
   12 |+logging.getLogger(name=__name__)
13 13 |
14 14 | logging.getLogger(__cached__)
15 15 | getLogger(name=__cached__)

LOG002.py:14:19: LOG002 [*] Use `__name__` with `logging.getLogger()`
   |
12 | logging.getLogger(name=__file__)
13 |
14 | logging.getLogger(__cached__)
   |                   ^^^^^^^^^^ LOG002
15 | getLogger(name=__cached__)
   |
   = help: Replace with `__name__`

ℹ Suggested fix
11 11 | getLogger(__file__)
12 12 | logging.getLogger(name=__file__)
13 13 |
14    |-logging.getLogger(__cached__)
   14 |+logging.getLogger(__name__)
15 15 | getLogger(name=__cached__)
16 16 |
17 17 |

LOG002.py:15:16: LOG002 [*] Use `__name__` with `logging.getLogger()`
   |
14 | logging.getLogger(__cached__)
15 | getLogger(name=__cached__)
   |                ^^^^^^^^^^ LOG002
   |
   = help: Replace with `__name__`

ℹ Suggested fix
12 12 | logging.getLogger(name=__file__)
13 13 |
14 14 | logging.getLogger(__cached__)
15    |-getLogger(name=__cached__)
   15 |+getLogger(name=__name__)
16 16 |
17 17 |
18 18 | # Override `logging.getLogger`
@@ -0,0 +1,40 @@
---
source: crates/ruff/src/rules/flake8_logging/mod.rs
---
LOG007.py:6:1: LOG007 Use of `logging.exception` with falsy `exc_info`
  |
5 | logging.exception("foo")  # OK
6 | logging.exception("foo", exc_info=False)  # LOG007
  | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ LOG007
7 | logging.exception("foo", exc_info=[])  # LOG007
8 | logger.exception("foo")  # OK
  |

LOG007.py:7:1: LOG007 Use of `logging.exception` with falsy `exc_info`
  |
5 | logging.exception("foo")  # OK
6 | logging.exception("foo", exc_info=False)  # LOG007
7 | logging.exception("foo", exc_info=[])  # LOG007
  | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ LOG007
8 | logger.exception("foo")  # OK
9 | logger.exception("foo", exc_info=False)  # LOG007
  |

LOG007.py:9:1: LOG007 Use of `logging.exception` with falsy `exc_info`
   |
 7 | logging.exception("foo", exc_info=[])  # LOG007
 8 | logger.exception("foo")  # OK
 9 | logger.exception("foo", exc_info=False)  # LOG007
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ LOG007
10 | logger.exception("foo", exc_info=[])  # LOG007
   |

LOG007.py:10:1: LOG007 Use of `logging.exception` with falsy `exc_info`
   |
 8 | logger.exception("foo")  # OK
 9 | logger.exception("foo", exc_info=False)  # LOG007
10 | logger.exception("foo", exc_info=[])  # LOG007
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ LOG007
   |
||||
@@ -0,0 +1,41 @@
---
source: crates/ruff/src/rules/flake8_logging/mod.rs
---
LOG009.py:3:1: LOG009 [*] Use of undocumented `logging.WARN` constant
  |
1 | import logging
2 |
3 | logging.WARN  # LOG009
  | ^^^^^^^^^^^^ LOG009
4 | logging.WARNING  # OK
  |
  = help: Replace `logging.WARN` with `logging.WARNING`

ℹ Suggested fix
1 1 | import logging
2 2 |
3   |-logging.WARN  # LOG009
  3 |+logging.WARNING  # LOG009
4 4 | logging.WARNING  # OK
5 5 |
6 6 | from logging import WARN, WARNING

LOG009.py:8:1: LOG009 [*] Use of undocumented `logging.WARN` constant
  |
6 | from logging import WARN, WARNING
7 |
8 | WARN  # LOG009
  | ^^^^ LOG009
9 | WARNING  # OK
  |
  = help: Replace `logging.WARN` with `logging.WARNING`

ℹ Suggested fix
5 5 |
6 6 | from logging import WARN, WARNING
7 7 |
8   |-WARN  # LOG009
  8 |+logging.WARNING  # LOG009
9 9 | WARNING  # OK
@@ -6,6 +6,26 @@ use ruff_text_size::Ranged;
 use crate::checkers::ast::Checker;
 use crate::registry::Rule;

+/// ## What it does
+/// Checks for `pass` statements in empty stub bodies.
+///
+/// ## Why is this bad?
+/// For consistency, empty stub bodies should contain `...` instead of `pass`.
+///
+/// Additionally, an ellipsis better conveys the intent of the stub body (that
+/// the body has been implemented, but has been intentionally left blank to
+/// document the interface).
+///
+/// ## Example
+/// ```python
+/// def foo(bar: int) -> list[int]:
+///     pass
+/// ```
+///
+/// Use instead:
+/// ```python
+/// def foo(bar: int) -> list[int]: ...
+/// ```
 #[violation]
 pub struct PassStatementStubBody;
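A minimal runtime sketch of the preferred form from the docs above: `...` (the `Ellipsis` singleton) is a valid expression statement, so the stub body fits on one line, and the function still just returns `None` when called.

```python
# A one-line stub body using an ellipsis, as the rule recommends.
def foo(bar: int) -> list[int]: ...

# The body does nothing; calling the stub returns None.
assert foo(1) is None

# `...` is the built-in Ellipsis singleton.
assert ... is Ellipsis
```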
@@ -10,6 +10,7 @@ use libcst_native::{
 use ruff_diagnostics::{AutofixKind, Diagnostic, Edit, Fix, Violation};
 use ruff_macros::{derive_message_formats, violation};
 use ruff_python_ast::helpers::Truthiness;
+use ruff_python_ast::parenthesize::parenthesized_range;
 use ruff_python_ast::visitor::Visitor;
 use ruff_python_ast::{
     self as ast, Arguments, BoolOp, ExceptHandler, Expr, Keyword, Stmt, UnaryOp,
@@ -293,7 +294,13 @@ pub(crate) fn unittest_assertion(
         if let Ok(stmt) = unittest_assert.generate_assert(args, keywords) {
             diagnostic.set_fix(Fix::suggested(Edit::range_replacement(
                 checker.generator().stmt(&stmt),
-                expr.range(),
+                parenthesized_range(
+                    expr.into(),
+                    checker.semantic().current_statement().into(),
+                    checker.indexer().comment_ranges(),
+                    checker.locator().contents(),
+                )
+                .unwrap_or(expr.range()),
             )));
         }
     }
@@ -9,6 +9,7 @@ use ruff_python_ast::node::AstNode;
 use ruff_python_ast::parenthesize::parenthesized_range;
 use ruff_python_ast::{self as ast, Arguments, Constant, Decorator, Expr, ExprContext};
 use ruff_python_codegen::Generator;
+use ruff_python_trivia::CommentRanges;
 use ruff_python_trivia::{SimpleTokenKind, SimpleTokenizer};
 use ruff_text_size::{Ranged, TextRange, TextSize};

@@ -298,12 +299,17 @@ fn elts_to_csv(elts: &[Expr], generator: Generator) -> Option<String> {
 fn get_parametrize_name_range(
     decorator: &Decorator,
     expr: &Expr,
+    comment_ranges: &CommentRanges,
     source: &str,
 ) -> Option<TextRange> {
-    decorator
-        .expression
-        .as_call_expr()
-        .and_then(|call| parenthesized_range(expr.into(), call.arguments.as_any_node_ref(), source))
+    decorator.expression.as_call_expr().and_then(|call| {
+        parenthesized_range(
+            expr.into(),
+            call.arguments.as_any_node_ref(),
+            comment_ranges,
+            source,
+        )
+    })
 }

 /// PT006
@@ -322,6 +328,7 @@ fn check_names(checker: &mut Checker, decorator: &Decorator, expr: &Expr) {
         let name_range = get_parametrize_name_range(
             decorator,
             expr,
+            checker.indexer().comment_ranges(),
             checker.locator().contents(),
         )
         .unwrap_or(expr.range());
@@ -356,6 +363,7 @@ fn check_names(checker: &mut Checker, decorator: &Decorator, expr: &Expr) {
         let name_range = get_parametrize_name_range(
             decorator,
             expr,
+            checker.indexer().comment_ranges(),
             checker.locator().contents(),
         )
         .unwrap_or(expr.range());
@@ -637,5 +637,27 @@ PT009.py:94:9: PT009 [*] Use a regular `assert` instead of unittest-style `failI
93 93 |     def test_fail_if_equal(self):
94    |-        self.failIfEqual(1, 2)  # Error
   94 |+        assert 1 != 2  # Error
95 95 |
96 96 |
97 97 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722459517

PT009.py:98:2: PT009 [*] Use a regular `assert` instead of unittest-style `assertTrue`
    |
 97 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722459517
 98 | (self.assertTrue(
    |  ^^^^^^^^^^^^^^^ PT009
 99 |     "piAx_piAy_beta[r][x][y] = {17}".format(
100 |         self.model.piAx_piAy_beta[r][x][y])))
    |
    = help: Replace `assertTrue(...)` with `assert ...`

ℹ Suggested fix
95  95 |
96  96 |
97  97 | # Regression test for: https://github.com/astral-sh/ruff/issues/7455#issuecomment-1722459517
98     |-(self.assertTrue(
99     |-    "piAx_piAy_beta[r][x][y] = {17}".format(
100    |-        self.model.piAx_piAy_beta[r][x][y])))
    98 |+assert "piAx_piAy_beta[r][x][y] = {17}".format(self.model.piAx_piAy_beta[r][x][y])
@@ -773,9 +773,14 @@ fn is_short_circuit(
             edit = Some(get_short_circuit_edit(
                 value,
                 TextRange::new(
-                    parenthesized_range(furthest.into(), expr.into(), checker.locator().contents())
-                        .unwrap_or(furthest.range())
-                        .start(),
+                    parenthesized_range(
+                        furthest.into(),
+                        expr.into(),
+                        checker.indexer().comment_ranges(),
+                        checker.locator().contents(),
+                    )
+                    .unwrap_or(furthest.range())
+                    .start(),
                     expr.end(),
                 ),
                 short_circuit_truthiness,
@@ -796,9 +801,14 @@ fn is_short_circuit(
             edit = Some(get_short_circuit_edit(
                 next_value,
                 TextRange::new(
-                    parenthesized_range(furthest.into(), expr.into(), checker.locator().contents())
-                        .unwrap_or(furthest.range())
-                        .start(),
+                    parenthesized_range(
+                        furthest.into(),
+                        expr.into(),
+                        checker.indexer().comment_ranges(),
+                        checker.locator().contents(),
+                    )
+                    .unwrap_or(furthest.range())
+                    .start(),
                     expr.end(),
                 ),
                 short_circuit_truthiness,
@@ -163,8 +163,13 @@ pub(crate) fn if_expr_with_true_false(
                 checker
                     .locator()
                     .slice(
-                        parenthesized_range(test.into(), expr.into(), checker.locator().contents())
-                            .unwrap_or(test.range()),
+                        parenthesized_range(
+                            test.into(),
+                            expr.into(),
+                            checker.indexer().comment_ranges(),
+                            checker.locator().contents(),
+                        )
+                        .unwrap_or(test.range()),
                     )
                     .to_string(),
                 expr.range(),
@@ -88,10 +88,20 @@ fn key_in_dict(
     }

     // Extract the exact range of the left and right expressions.
-    let left_range = parenthesized_range(left.into(), parent, checker.locator().contents())
-        .unwrap_or(left.range());
-    let right_range = parenthesized_range(right.into(), parent, checker.locator().contents())
-        .unwrap_or(right.range());
+    let left_range = parenthesized_range(
+        left.into(),
+        parent,
+        checker.indexer().comment_ranges(),
+        checker.locator().contents(),
+    )
+    .unwrap_or(left.range());
+    let right_range = parenthesized_range(
+        right.into(),
+        parent,
+        checker.indexer().comment_ranges(),
+        checker.locator().contents(),
+    )
+    .unwrap_or(right.range());

     let mut diagnostic = Diagnostic::new(
         InDictKeys {
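For reference, the Python pattern behind the `InDictKeys` rule whose fix machinery is touched here (a sketch, not part of the diff): membership tests on a dict already consult its keys, so the `.keys()` call is redundant.

```python
d = {"a": 1, "b": 2}

# `in` on a dict already tests keys; calling .keys() is redundant.
assert ("a" in d.keys()) == ("a" in d)
assert ("z" in d.keys()) == ("z" in d)
```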
@@ -12,10 +12,10 @@ use crate::registry::AsRule;
 use crate::rules::flynt::helpers;

 /// ## What it does
-/// Checks for `str#join` calls that can be replaced with f-strings.
+/// Checks for `str.join` calls that can be replaced with f-strings.
 ///
 /// ## Why is this bad?
-/// f-strings are more readable and generally preferred over `str#join` calls.
+/// f-strings are more readable and generally preferred over `str.join` calls.
 ///
 /// ## Example
 /// ```python
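A quick sketch of the equivalence the corrected doc wording describes — a static `str.join` over a short list of variables is equivalent to an f-string, which usually reads better:

```python
a, b = "hello", "world"

# The str.join form the rule flags...
joined = " ".join([a, b])

# ...and the f-string the fix would produce.
assert joined == f"{a} {b}"
```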
@@ -22,6 +22,7 @@ pub mod flake8_future_annotations;
 pub mod flake8_gettext;
 pub mod flake8_implicit_str_concat;
 pub mod flake8_import_conventions;
+pub mod flake8_logging;
 pub mod flake8_logging_format;
 pub mod flake8_no_pep420;
 pub mod flake8_pie;
@@ -3,6 +3,7 @@ use ruff_macros::{derive_message_formats, violation};
 use ruff_python_ast::helpers::is_const_true;
 use ruff_python_ast::parenthesize::parenthesized_range;
 use ruff_python_ast::{self as ast, Keyword, Stmt};
+use ruff_python_trivia::CommentRanges;
 use ruff_source_file::Locator;
 use ruff_text_size::Ranged;

@@ -85,6 +86,7 @@ pub(crate) fn inplace_argument(checker: &mut Checker, call: &ast::ExprCall) {
                     call,
                     keyword,
                     statement,
+                    checker.indexer().comment_ranges(),
                     checker.locator(),
                 ) {
                     diagnostic.set_fix(fix);
@@ -107,15 +109,21 @@ fn convert_inplace_argument_to_assignment(
     call: &ast::ExprCall,
     keyword: &Keyword,
     statement: &Stmt,
+    comment_ranges: &CommentRanges,
     locator: &Locator,
 ) -> Option<Fix> {
     // Add the assignment.
     let attr = call.func.as_attribute_expr()?;
     let insert_assignment = Edit::insertion(
         format!("{name} = ", name = locator.slice(attr.value.range())),
-        parenthesized_range(call.into(), statement.into(), locator.contents())
-            .unwrap_or(call.range())
-            .start(),
+        parenthesized_range(
+            call.into(),
+            statement.into(),
+            comment_ranges,
+            locator.contents(),
+        )
+        .unwrap_or(call.range())
+        .start(),
     );

     // Remove the `inplace` argument.
@@ -70,9 +70,12 @@ pub(crate) fn invalid_function_name(
         return None;
     }

-    // Ignore any functions that are explicitly `@override`. These are defined elsewhere,
-    // so if they're first-party, we'll flag them at the definition site.
-    if visibility::is_override(decorator_list, semantic) {
+    // Ignore any functions that are explicitly `@override` or `@overload`.
+    // These are defined elsewhere, so if they're first-party,
+    // we'll flag them at the definition site.
+    if visibility::is_override(decorator_list, semantic)
+        || visibility::is_overload(decorator_list, semantic)
+    {
         return None;
     }
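To illustrate the pattern the change above exempts (a hedged sketch; `Client`/`getValue` are made-up names): an `@overload`-decorated stub mirrors a signature defined elsewhere, so a non-conforming name on it shouldn't be re-flagged.

```python
from typing import overload

class Client:
    # The @overload-decorated stubs only describe the signatures; the
    # final, undecorated definition is what actually runs.
    @overload
    def getValue(self, key: int) -> int: ...
    @overload
    def getValue(self, key: str) -> str: ...
    def getValue(self, key):
        return key

assert Client().getValue(3) == 3
assert Client().getValue("x") == "x"
```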
@@ -19,6 +19,7 @@ mod tests {
     #[test_case(Rule::TryExceptInLoop, Path::new("PERF203.py"))]
     #[test_case(Rule::ManualListComprehension, Path::new("PERF401.py"))]
     #[test_case(Rule::ManualListCopy, Path::new("PERF402.py"))]
+    #[test_case(Rule::ManualDictComprehension, Path::new("PERF403.py"))]
     fn rules(rule_code: Rule, path: &Path) -> Result<()> {
         let snapshot = format!("{}_{}", rule_code.noqa_code(), path.to_string_lossy());
         let diagnostics = test_path(
@@ -0,0 +1,178 @@
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::comparable::ComparableExpr;
use ruff_python_ast::helpers::any_over_expr;
use ruff_python_ast::{self as ast, Expr, Stmt};
use ruff_python_semantic::analyze::typing::is_dict;
use ruff_python_semantic::Binding;

use crate::checkers::ast::Checker;

/// ## What it does
/// Checks for `for` loops that can be replaced by a dictionary comprehension.
///
/// ## Why is this bad?
/// When creating or extending a dictionary in a for-loop, prefer a dictionary
/// comprehension. Comprehensions are more readable and more performant.
///
/// For example, when comparing `{x: x for x in list(range(1000))}` to the `for`
/// loop version, the comprehension is ~10% faster on Python 3.11.
///
/// Note that, as with all `perflint` rules, this is only intended as a
/// micro-optimization, and will have a negligible impact on performance in
/// most cases.
///
/// ## Example
/// ```python
/// pairs = (("a", 1), ("b", 2))
/// result = {}
/// for x, y in pairs:
///     if y % 2:
///         result[x] = y
/// ```
///
/// Use instead:
/// ```python
/// pairs = (("a", 1), ("b", 2))
/// result = {x: y for x, y in pairs if y % 2}
/// ```
///
/// If you're appending to an existing dictionary, use the `update` method instead:
/// ```python
/// pairs = (("a", 1), ("b", 2))
/// result.update({x: y for x, y in pairs if y % 2})
/// ```
#[violation]
pub struct ManualDictComprehension;

impl Violation for ManualDictComprehension {
    #[derive_message_formats]
    fn message(&self) -> String {
        format!("Use a dictionary comprehension instead of a for-loop")
    }
}

/// PERF403
pub(crate) fn manual_dict_comprehension(checker: &mut Checker, target: &Expr, body: &[Stmt]) {
    let (stmt, if_test) = match body {
        // ```python
        // for idx, name in enumerate(names):
        //     if idx % 2 == 0:
        //         result[name] = idx
        // ```
        [Stmt::If(ast::StmtIf {
            body,
            elif_else_clauses,
            test,
            ..
        })] => {
            // TODO(charlie): If there's an `else` clause, verify that the `else` has the
            // same structure.
            if !elif_else_clauses.is_empty() {
                return;
            }
            let [stmt] = body.as_slice() else {
                return;
            };
            (stmt, Some(test))
        }
        // ```python
        // for idx, name in enumerate(names):
        //     result[name] = idx
        // ```
        [stmt] => (stmt, None),
        _ => return,
    };

    let Stmt::Assign(ast::StmtAssign {
        targets,
        value,
        range,
    }) = stmt
    else {
        return;
    };

    let [Expr::Subscript(ast::ExprSubscript {
        value: subscript_value,
        slice,
        ..
    })] = targets.as_slice()
    else {
        return;
    };

    match target {
        Expr::Tuple(ast::ExprTuple { elts, .. }) => {
            if !elts
                .iter()
                .any(|elt| ComparableExpr::from(slice) == ComparableExpr::from(elt))
            {
                return;
            }
            if !elts
                .iter()
                .any(|elt| ComparableExpr::from(value) == ComparableExpr::from(elt))
            {
                return;
            }
        }
        Expr::Name(_) => {
            if ComparableExpr::from(slice) != ComparableExpr::from(target) {
                return;
            }
            if ComparableExpr::from(value) != ComparableExpr::from(target) {
                return;
            }
        }
        _ => return,
    }

    // Exclude non-dictionary value.
    let Expr::Name(ast::ExprName {
        id: subscript_name, ..
    }) = subscript_value.as_ref()
    else {
        return;
    };
    let scope = checker.semantic().current_scope();
    let bindings: Vec<&Binding> = scope
        .get_all(subscript_name)
        .map(|binding_id| checker.semantic().binding(binding_id))
        .collect();

    let [binding] = bindings.as_slice() else {
        return;
    };

    if !is_dict(binding, checker.semantic()) {
        return;
    }

    // Avoid if the value is used in the conditional test, e.g.,
    //
    // ```python
    // for x in y:
    //     if x in filtered:
    //         filtered[x] = y
    // ```
    //
    // Converting this to a dictionary comprehension would raise a `NameError` as
    // `filtered` is not defined yet:
    //
    // ```python
    // filtered = {x: y for x in y if x in filtered}
    // ```
    if if_test.is_some_and(|test| {
        any_over_expr(test, &|expr| {
            expr.as_name_expr()
                .is_some_and(|expr| expr.id == *subscript_name)
        })
    }) {
        return;
    }

    checker
        .diagnostics
        .push(Diagnostic::new(ManualDictComprehension, *range));
}
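The rewrite PERF403 suggests can be checked end-to-end in Python — the for-loop form the rule flags and the comprehension from its docs produce identical dicts:

```python
pairs = (("a", 1), ("b", 2), ("c", 3))

# The for-loop form that PERF403 flags...
result = {}
for x, y in pairs:
    if y % 2:
        result[x] = y

# ...and the equivalent dictionary comprehension.
assert result == {x: y for x, y in pairs if y % 2}
assert result == {"a": 1, "c": 3}
```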
@@ -1,10 +1,12 @@
 pub(crate) use incorrect_dict_iterator::*;
+pub(crate) use manual_dict_comprehension::*;
 pub(crate) use manual_list_comprehension::*;
 pub(crate) use manual_list_copy::*;
 pub(crate) use try_except_in_loop::*;
 pub(crate) use unnecessary_list_cast::*;

 mod incorrect_dict_iterator;
+mod manual_dict_comprehension;
 mod manual_list_comprehension;
 mod manual_list_copy;
 mod try_except_in_loop;
@@ -0,0 +1,52 @@
---
source: crates/ruff/src/rules/perflint/mod.rs
---
PERF403.py:5:9: PERF403 Use a dictionary comprehension instead of a for-loop
|
3 | result = {}
4 | for idx, name in enumerate(fruit):
5 | result[idx] = name # PERF403
| ^^^^^^^^^^^^^^^^^^ PERF403
|

PERF403.py:13:13: PERF403 Use a dictionary comprehension instead of a for-loop
|
11 | for idx, name in enumerate(fruit):
12 | if idx % 2:
13 | result[idx] = name # PERF403
| ^^^^^^^^^^^^^^^^^^ PERF403
|

PERF403.py:31:13: PERF403 Use a dictionary comprehension instead of a for-loop
|
29 | for idx, name in enumerate(fruit):
30 | if idx % 2:
31 | result[idx] = name # PERF403
| ^^^^^^^^^^^^^^^^^^ PERF403
|

PERF403.py:61:13: PERF403 Use a dictionary comprehension instead of a for-loop
|
59 | for idx, name in enumerate(fruit):
60 | if idx % 2:
61 | result[idx] = name # PERF403
| ^^^^^^^^^^^^^^^^^^ PERF403
|

PERF403.py:76:9: PERF403 Use a dictionary comprehension instead of a for-loop
|
74 | result = {}
75 | for name in fruit:
76 | result[name] = name # PERF403
| ^^^^^^^^^^^^^^^^^^^ PERF403
|

PERF403.py:83:9: PERF403 Use a dictionary comprehension instead of a for-loop
|
81 | result = {}
82 | for idx, name in enumerate(fruit):
83 | result[name] = idx # PERF403
| ^^^^^^^^^^^^^^^^^^ PERF403
|
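For context, the pattern PERF403 flags in these snapshots can be illustrated with a small, hypothetical Python snippet (the `fruit` data is invented for illustration; this is a sketch of the rule's intent, not its implementation):

```python
fruit = ["apple", "banana", "cherry"]

# Flagged pattern: building a dict element-by-element inside a for-loop (PERF403).
result = {}
for idx, name in enumerate(fruit):
    result[idx] = name

# Suggested rewrite: a dict comprehension, built in a single expression.
rewritten = {idx: name for idx, name in enumerate(fruit)}

assert result == rewritten == {0: "apple", 1: "banana", 2: "cherry"}
```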
@@ -3,6 +3,7 @@ use unicode_width::UnicodeWidthStr;
use ruff_python_ast::node::AnyNodeRef;
use ruff_python_ast::parenthesize::parenthesized_range;
use ruff_python_ast::{CmpOp, Expr};
use ruff_python_trivia::CommentRanges;
use ruff_source_file::{Line, Locator};
use ruff_text_size::{Ranged, TextLen, TextRange};

@@ -17,6 +18,7 @@ pub(super) fn generate_comparison(
    ops: &[CmpOp],
    comparators: &[Expr],
    parent: AnyNodeRef,
    comment_ranges: &CommentRanges,
    locator: &Locator,
) -> String {
    let start = left.start();
@@ -24,9 +26,12 @@ pub(super) fn generate_comparison(
    let mut contents = String::with_capacity(usize::from(end - start));

    // Add the left side of the comparison.
    contents.push_str(locator.slice(
        parenthesized_range(left.into(), parent, locator.contents()).unwrap_or(left.range()),
    ));
    contents.push_str(
        locator.slice(
            parenthesized_range(left.into(), parent, comment_ranges, locator.contents())
                .unwrap_or(left.range()),
        ),
    );

    for (op, comparator) in ops.iter().zip(comparators) {
        // Add the operator.
@@ -46,8 +51,13 @@ pub(super) fn generate_comparison(
        // Add the right side of the comparison.
        contents.push_str(
            locator.slice(
                parenthesized_range(comparator.into(), parent, locator.contents())
                    .unwrap_or(comparator.range()),
                parenthesized_range(
                    comparator.into(),
                    parent,
                    comment_ranges,
                    locator.contents(),
                )
                .unwrap_or(comparator.range()),
            ),
        );
    }

@@ -283,6 +283,7 @@ pub(crate) fn literal_comparisons(checker: &mut Checker, compare: &ast::ExprComp
        &ops,
        &compare.comparators,
        compare.into(),
        checker.indexer().comment_ranges(),
        checker.locator(),
    );
    for diagnostic in &mut diagnostics {
@@ -1,3 +1,4 @@
use crate::autofix::edits::pad;
use ruff_diagnostics::{AlwaysAutofixableViolation, Diagnostic, Edit, Fix};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::{self as ast, CmpOp, Expr};
@@ -95,11 +96,16 @@ pub(crate) fn not_tests(checker: &mut Checker, unary_op: &ast::ExprUnaryOp) {
        let mut diagnostic = Diagnostic::new(NotInTest, unary_op.operand.range());
        if checker.patch(diagnostic.kind.rule()) {
            diagnostic.set_fix(Fix::automatic(Edit::range_replacement(
                generate_comparison(
                    left,
                    &[CmpOp::NotIn],
                    comparators,
                    unary_op.into(),
                pad(
                    generate_comparison(
                        left,
                        &[CmpOp::NotIn],
                        comparators,
                        unary_op.into(),
                        checker.indexer().comment_ranges(),
                        checker.locator(),
                    ),
                    unary_op.range(),
                    checker.locator(),
                ),
                unary_op.range(),
@@ -113,11 +119,16 @@ pub(crate) fn not_tests(checker: &mut Checker, unary_op: &ast::ExprUnaryOp) {
        let mut diagnostic = Diagnostic::new(NotIsTest, unary_op.operand.range());
        if checker.patch(diagnostic.kind.rule()) {
            diagnostic.set_fix(Fix::automatic(Edit::range_replacement(
                generate_comparison(
                    left,
                    &[CmpOp::IsNot],
                    comparators,
                    unary_op.into(),
                pad(
                    generate_comparison(
                        left,
                        &[CmpOp::IsNot],
                        comparators,
                        unary_op.into(),
                        checker.indexer().comment_ranges(),
                        checker.locator(),
                    ),
                    unary_op.range(),
                    checker.locator(),
                ),
                unary_op.range(),
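The fixes built here rewrite `not ... in ...` into `not in` form (E713) and `not ... is ...` into `is not` form (E714). A minimal Python illustration of the before/after semantics, using the example from the E713 snapshot (the `request` dict is a hypothetical stand-in):

```python
request = {"name": "ruff"}

# Flagged form (E713): negating a membership test with an outer `not`.
before = not ("name" in request) or not request["name"]

# Fixed form: the equivalent `not in` comparison.
after = "name" not in request or not request["name"]

# Both expressions evaluate identically; only the spelling changes.
assert before == after
```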
@@ -1,10 +1,11 @@
use ruff_text_size::{TextLen, TextRange, TextSize};

use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_index::Indexer;
use ruff_python_parser::lexer::LexResult;
use ruff_python_parser::Tok;
use ruff_python_trivia::leading_indentation;
use ruff_source_file::Line;
use ruff_source_file::Locator;
use ruff_text_size::{TextLen, TextRange, TextSize};

/// ## What it does
/// Checks for indentation that uses tabs.
@@ -37,17 +38,46 @@ impl Violation for TabIndentation {
}

/// W191
pub(crate) fn tab_indentation(line: &Line, indexer: &Indexer) -> Option<Diagnostic> {
    let indent = leading_indentation(line);
    if let Some(tab_index) = indent.find('\t') {
        // If the tab character is within a multi-line string, abort.
        let tab_offset = line.start() + TextSize::try_from(tab_index).unwrap();
        if indexer.triple_quoted_string_range(tab_offset).is_none() {
            return Some(Diagnostic::new(
                TabIndentation,
                TextRange::at(line.start(), indent.text_len()),
            ));
pub(crate) fn tab_indentation(
    diagnostics: &mut Vec<Diagnostic>,
    tokens: &[LexResult],
    locator: &Locator,
    indexer: &Indexer,
) {
    // Always check the first line for tab indentation as there's no newline
    // token before it.
    tab_indentation_at_line_start(diagnostics, locator, TextSize::default());

    for (tok, range) in tokens.iter().flatten() {
        if matches!(tok, Tok::Newline | Tok::NonLogicalNewline) {
            tab_indentation_at_line_start(diagnostics, locator, range.end());
        }
    }
    None

    // The lexer doesn't emit `Newline` / `NonLogicalNewline` for a line
    // continuation character (`\`), so we need to manually check for tab
    // indentation for lines that follow a line continuation character.
    for continuation_line in indexer.continuation_line_starts() {
        tab_indentation_at_line_start(
            diagnostics,
            locator,
            locator.full_line_end(*continuation_line),
        );
    }
}

/// Checks for indentation that uses tabs for a line starting at
/// the given [`TextSize`].
fn tab_indentation_at_line_start(
    diagnostics: &mut Vec<Diagnostic>,
    locator: &Locator,
    line_start: TextSize,
) {
    let indent = leading_indentation(locator.after(line_start));
    if indent.find('\t').is_some() {
        diagnostics.push(Diagnostic::new(
            TabIndentation,
            TextRange::at(line_start, indent.text_len()),
        ));
    }
}
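As a rough sketch of what the reworked W191 check detects: a line whose leading indentation contains a tab is flagged. The real implementation above is token-based, finds line starts via `Newline`/`NonLogicalNewline` tokens plus the indexer's continuation-line starts, and (in the old version) skipped tabs inside triple-quoted strings; this simplified, line-based Python version is only an approximation for illustration:

```python
def tab_indented_lines(source: str) -> list[int]:
    """Return 1-based numbers of lines whose leading whitespace contains a tab."""
    flagged = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        # Leading whitespace = everything before the first non-space character.
        indent = line[: len(line) - len(line.lstrip())]
        if "\t" in indent:
            flagged.append(lineno)
    return flagged

src = "if True:\n\tprint(1)\n    ok = 2\n"
assert tab_indented_lines(src) == [2]  # only the tab-indented line is flagged
```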
@@ -102,4 +102,20 @@ E713.py:14:9: E713 [*] Test for membership should be `not in`
16 16 |
17 17 | #: Okay

E713.py:40:12: E713 [*] Test for membership should be `not in`
|
38 | assert [42, not foo] in bar
39 | assert not (re.search(r"^.:\\Users\\[^\\]*\\Downloads\\.*") is None)
40 | assert not('name' in request)or not request['name']
| ^^^^^^^^^^^^^^^^^ E713
|
= help: Convert to `not in`

ℹ Fix
37 37 | assert {"x": not foo} in bar
38 38 | assert [42, not foo] in bar
39 39 | assert not (re.search(r"^.:\\Users\\[^\\]*\\Downloads\\.*") is None)
40 |-assert not('name' in request)or not request['name']
40 |+assert 'name' not in request or not request['name']
@@ -1,345 +1,352 @@
---
source: crates/ruff/src/rules/pycodestyle/mod.rs
---
W19.py:3:1: W191 Indentation contains tabs
W19.py:1:1: W191 Indentation contains tabs
|
1 | #: W191
2 | if False:
3 | print # indented with 1 tab
1 | '''File starts with a tab
| ^^^^ W191
4 | #:
2 | multiline string with tab in it'''
|

W19.py:9:1: W191 Indentation contains tabs
W19.py:6:1: W191 Indentation contains tabs
|
4 | #: W191
5 | if False:
6 | print # indented with 1 tab
| ^^^^ W191
7 | #:
|

W19.py:12:1: W191 Indentation contains tabs
|
7 | #: W191
8 | y = x == 2 \
9 | or x == 3
10 | #: W191
11 | y = x == 2 \
12 | or x == 3
| ^^^^ W191
10 | #: E101 W191 W504
11 | if (
13 | #: E101 W191 W504
14 | if (
|

W19.py:16:1: W191 Indentation contains tabs
W19.py:19:1: W191 Indentation contains tabs
|
14 | ) or
15 | y == 4):
16 | pass
17 | ) or
18 | y == 4):
19 | pass
| ^^^^ W191
17 | #: E101 W191
18 | if x == 2 \
20 | #: E101 W191
21 | if x == 2 \
|

W19.py:21:1: W191 Indentation contains tabs
W19.py:24:1: W191 Indentation contains tabs
|
19 | or y > 1 \
20 | or x == 3:
21 | pass
22 | or y > 1 \
23 | or x == 3:
24 | pass
| ^^^^ W191
22 | #: E101 W191
23 | if x == 2 \
25 | #: E101 W191
26 | if x == 2 \
|

W19.py:26:1: W191 Indentation contains tabs
W19.py:29:1: W191 Indentation contains tabs
|
24 | or y > 1 \
25 | or x == 3:
26 | pass
27 | or y > 1 \
28 | or x == 3:
29 | pass
| ^^^^ W191
27 | #:
30 | #:
|

W19.py:32:1: W191 Indentation contains tabs
W19.py:35:1: W191 Indentation contains tabs
|
30 | if (foo == bar and
31 | baz == bop):
32 | pass
33 | if (foo == bar and
34 | baz == bop):
35 | pass
| ^^^^ W191
33 | #: E101 W191 W504
34 | if (
36 | #: E101 W191 W504
37 | if (
|

W19.py:38:1: W191 Indentation contains tabs
W19.py:41:1: W191 Indentation contains tabs
|
36 | baz == bop
37 | ):
38 | pass
39 | baz == bop
40 | ):
41 | pass
| ^^^^ W191
39 | #:
42 | #:
|

W19.py:44:1: W191 Indentation contains tabs
W19.py:47:1: W191 Indentation contains tabs
|
42 | if start[1] > end_col and not (
43 | over_indent == 4 and indent_next):
44 | return (0, "E121 continuation line over-"
45 | if start[1] > end_col and not (
46 | over_indent == 4 and indent_next):
47 | return (0, "E121 continuation line over-"
| ^^^^ W191
45 | "indented for visual indent")
46 | #:
48 | "indented for visual indent")
49 | #:
|

W19.py:45:1: W191 Indentation contains tabs
W19.py:48:1: W191 Indentation contains tabs
|
43 | over_indent == 4 and indent_next):
44 | return (0, "E121 continuation line over-"
45 | "indented for visual indent")
46 | over_indent == 4 and indent_next):
47 | return (0, "E121 continuation line over-"
48 | "indented for visual indent")
| ^^^^^^^^^^^^ W191
46 | #:
49 | #:
|

W19.py:54:1: W191 Indentation contains tabs
W19.py:57:1: W191 Indentation contains tabs
|
52 | var_one, var_two, var_three,
53 | var_four):
54 | print(var_one)
55 | var_one, var_two, var_three,
56 | var_four):
57 | print(var_one)
| ^^^^ W191
55 | #: E101 W191 W504
56 | if ((row < 0 or self.moduleCount <= row or
|

W19.py:58:1: W191 Indentation contains tabs
|
56 | if ((row < 0 or self.moduleCount <= row or
57 | col < 0 or self.moduleCount <= col)):
58 | raise Exception("%s,%s - %s" % (row, col, self.moduleCount))
| ^^^^ W191
59 | #: E101 E101 E101 E101 W191 W191 W191 W191 W191 W191
60 | if bar:
58 | #: E101 W191 W504
59 | if ((row < 0 or self.moduleCount <= row or
|

W19.py:61:1: W191 Indentation contains tabs
|
59 | #: E101 E101 E101 E101 W191 W191 W191 W191 W191 W191
60 | if bar:
61 | return (
59 | if ((row < 0 or self.moduleCount <= row or
60 | col < 0 or self.moduleCount <= col)):
61 | raise Exception("%s,%s - %s" % (row, col, self.moduleCount))
| ^^^^ W191
62 | start, 'E121 lines starting with a '
63 | 'closing bracket should be indented '
|

W19.py:62:1: W191 Indentation contains tabs
|
60 | if bar:
61 | return (
62 | start, 'E121 lines starting with a '
| ^^^^^^^^ W191
63 | 'closing bracket should be indented '
64 | "to match that of the opening "
|

W19.py:63:1: W191 Indentation contains tabs
|
61 | return (
62 | start, 'E121 lines starting with a '
63 | 'closing bracket should be indented '
| ^^^^^^^^ W191
64 | "to match that of the opening "
65 | "bracket's line"
62 | #: E101 E101 E101 E101 W191 W191 W191 W191 W191 W191
63 | if bar:
|

W19.py:64:1: W191 Indentation contains tabs
|
62 | start, 'E121 lines starting with a '
63 | 'closing bracket should be indented '
64 | "to match that of the opening "
| ^^^^^^^^ W191
65 | "bracket's line"
66 | )
62 | #: E101 E101 E101 E101 W191 W191 W191 W191 W191 W191
63 | if bar:
64 | return (
| ^^^^ W191
65 | start, 'E121 lines starting with a '
66 | 'closing bracket should be indented '
|

W19.py:65:1: W191 Indentation contains tabs
|
63 | 'closing bracket should be indented '
64 | "to match that of the opening "
65 | "bracket's line"
63 | if bar:
64 | return (
65 | start, 'E121 lines starting with a '
| ^^^^^^^^ W191
66 | )
67 | #
66 | 'closing bracket should be indented '
67 | "to match that of the opening "
|

W19.py:66:1: W191 Indentation contains tabs
|
64 | "to match that of the opening "
65 | "bracket's line"
66 | )
| ^^^^ W191
67 | #
68 | #: E101 W191 W504
64 | return (
65 | start, 'E121 lines starting with a '
66 | 'closing bracket should be indented '
| ^^^^^^^^ W191
67 | "to match that of the opening "
68 | "bracket's line"
|

W19.py:73:1: W191 Indentation contains tabs
W19.py:67:1: W191 Indentation contains tabs
|
71 | foo.bar("bop")
72 | )):
73 | print "yes"
| ^^^^ W191
74 | #: E101 W191 W504
75 | # also ok, but starting to look like LISP
65 | start, 'E121 lines starting with a '
66 | 'closing bracket should be indented '
67 | "to match that of the opening "
| ^^^^^^^^ W191
68 | "bracket's line"
69 | )
|

W19.py:78:1: W191 Indentation contains tabs
W19.py:68:1: W191 Indentation contains tabs
|
76 | if ((foo.bar("baz") and
77 | foo.bar("bop"))):
78 | print "yes"
| ^^^^ W191
79 | #: E101 W191 W504
80 | if (a == 2 or
66 | 'closing bracket should be indented '
67 | "to match that of the opening "
68 | "bracket's line"
| ^^^^^^^^ W191
69 | )
70 | #
|

W19.py:83:1: W191 Indentation contains tabs
W19.py:69:1: W191 Indentation contains tabs
|
81 | b == "abc def ghi"
82 | "jkl mno"):
83 | return True
67 | "to match that of the opening "
68 | "bracket's line"
69 | )
| ^^^^ W191
84 | #: E101 W191 W504
85 | if (a == 2 or
70 | #
71 | #: E101 W191 W504
|

W19.py:88:1: W191 Indentation contains tabs
W19.py:76:1: W191 Indentation contains tabs
|
86 | b == """abc def ghi
87 | jkl mno"""):
88 | return True
74 | foo.bar("bop")
75 | )):
76 | print "yes"
| ^^^^ W191
89 | #: W191:2:1 W191:3:1 E101:3:2
90 | if length > options.max_line_length:
77 | #: E101 W191 W504
78 | # also ok, but starting to look like LISP
|

W19.py:81:1: W191 Indentation contains tabs
|
79 | if ((foo.bar("baz") and
80 | foo.bar("bop"))):
81 | print "yes"
| ^^^^ W191
82 | #: E101 W191 W504
83 | if (a == 2 or
|

W19.py:86:1: W191 Indentation contains tabs
|
84 | b == "abc def ghi"
85 | "jkl mno"):
86 | return True
| ^^^^ W191
87 | #: E101 W191 W504
88 | if (a == 2 or
|

W19.py:91:1: W191 Indentation contains tabs
|
89 | #: W191:2:1 W191:3:1 E101:3:2
90 | if length > options.max_line_length:
91 | return options.max_line_length, \
89 | b == """abc def ghi
90 | jkl mno"""):
91 | return True
| ^^^^ W191
92 | "E501 line too long (%d characters)" % length
92 | #: W191:2:1 W191:3:1 E101:3:2
93 | if length > options.max_line_length:
|

W19.py:92:1: W191 Indentation contains tabs
W19.py:94:1: W191 Indentation contains tabs
|
90 | if length > options.max_line_length:
91 | return options.max_line_length, \
92 | "E501 line too long (%d characters)" % length
92 | #: W191:2:1 W191:3:1 E101:3:2
93 | if length > options.max_line_length:
94 | return options.max_line_length, \
| ^^^^ W191
95 | "E501 line too long (%d characters)" % length
|

W19.py:95:1: W191 Indentation contains tabs
|
93 | if length > options.max_line_length:
94 | return options.max_line_length, \
95 | "E501 line too long (%d characters)" % length
| ^^^^^^^^ W191
|

W19.py:98:1: W191 Indentation contains tabs
W19.py:101:1: W191 Indentation contains tabs
|
96 | #: E101 W191 W191 W504
97 | if os.path.exists(os.path.join(path, PEP8_BIN)):
98 | cmd = ([os.path.join(path, PEP8_BIN)] +
99 | #: E101 W191 W191 W504
100 | if os.path.exists(os.path.join(path, PEP8_BIN)):
101 | cmd = ([os.path.join(path, PEP8_BIN)] +
| ^^^^ W191
99 | self._pep8_options(targetfile))
100 | #: W191 - okay
102 | self._pep8_options(targetfile))
103 | #: W191 - okay
|

W19.py:99:1: W191 Indentation contains tabs
W19.py:102:1: W191 Indentation contains tabs
|
97 | if os.path.exists(os.path.join(path, PEP8_BIN)):
98 | cmd = ([os.path.join(path, PEP8_BIN)] +
99 | self._pep8_options(targetfile))
100 | if os.path.exists(os.path.join(path, PEP8_BIN)):
101 | cmd = ([os.path.join(path, PEP8_BIN)] +
102 | self._pep8_options(targetfile))
| ^^^^^^^^^^^ W191
100 | #: W191 - okay
101 | '''
103 | #: W191 - okay
104 | '''
|

W19.py:125:1: W191 Indentation contains tabs
W19.py:128:1: W191 Indentation contains tabs
|
123 | if foo is None and bar is "bop" and \
124 | blah == 'yeah':
125 | blah = 'yeahnah'
126 | if foo is None and bar is "bop" and \
127 | blah == 'yeah':
128 | blah = 'yeahnah'
| ^^^^ W191
|

W19.py:131:1: W191 Indentation contains tabs
W19.py:134:1: W191 Indentation contains tabs
|
129 | #: W191 W191 W191
130 | if True:
131 | foo(
132 | #: W191 W191 W191
133 | if True:
134 | foo(
| ^^^^ W191
132 | 1,
133 | 2)
135 | 1,
136 | 2)
|

W19.py:132:1: W191 Indentation contains tabs
W19.py:135:1: W191 Indentation contains tabs
|
130 | if True:
131 | foo(
132 | 1,
133 | if True:
134 | foo(
135 | 1,
| ^^^^^^^^ W191
133 | 2)
134 | #: W191 W191 W191 W191 W191
|

W19.py:133:1: W191 Indentation contains tabs
|
131 | foo(
132 | 1,
133 | 2)
| ^^^^^^^^ W191
134 | #: W191 W191 W191 W191 W191
135 | def test_keys(self):
136 | 2)
137 | #: W191 W191 W191 W191 W191
|

W19.py:136:1: W191 Indentation contains tabs
|
134 | #: W191 W191 W191 W191 W191
135 | def test_keys(self):
136 | """areas.json - All regions are accounted for."""
| ^^^^ W191
137 | expected = set([
138 | u'Norrbotten',
|

W19.py:137:1: W191 Indentation contains tabs
|
135 | def test_keys(self):
136 | """areas.json - All regions are accounted for."""
137 | expected = set([
| ^^^^ W191
138 | u'Norrbotten',
139 | u'V\xe4sterbotten',
|

W19.py:138:1: W191 Indentation contains tabs
|
136 | """areas.json - All regions are accounted for."""
137 | expected = set([
138 | u'Norrbotten',
134 | foo(
135 | 1,
136 | 2)
| ^^^^^^^^ W191
139 | u'V\xe4sterbotten',
140 | ])
137 | #: W191 W191 W191 W191 W191
138 | def test_keys(self):
|

W19.py:139:1: W191 Indentation contains tabs
|
137 | expected = set([
138 | u'Norrbotten',
139 | u'V\xe4sterbotten',
| ^^^^^^^^ W191
140 | ])
141 | #: W191
137 | #: W191 W191 W191 W191 W191
138 | def test_keys(self):
139 | """areas.json - All regions are accounted for."""
| ^^^^ W191
140 | expected = set([
141 | u'Norrbotten',
|

W19.py:140:1: W191 Indentation contains tabs
|
138 | u'Norrbotten',
139 | u'V\xe4sterbotten',
140 | ])
138 | def test_keys(self):
139 | """areas.json - All regions are accounted for."""
140 | expected = set([
| ^^^^ W191
141 | #: W191
142 | x = [
141 | u'Norrbotten',
142 | u'V\xe4sterbotten',
|

W19.py:141:1: W191 Indentation contains tabs
|
139 | """areas.json - All regions are accounted for."""
140 | expected = set([
141 | u'Norrbotten',
| ^^^^^^^^ W191
142 | u'V\xe4sterbotten',
143 | ])
|

W19.py:142:1: W191 Indentation contains tabs
|
140 | expected = set([
141 | u'Norrbotten',
142 | u'V\xe4sterbotten',
| ^^^^^^^^ W191
143 | ])
144 | #: W191
|

W19.py:143:1: W191 Indentation contains tabs
|
141 | #: W191
142 | x = [
143 | 'abc'
141 | u'Norrbotten',
142 | u'V\xe4sterbotten',
143 | ])
| ^^^^ W191
144 | ]
145 | #: W191 - okay
144 | #: W191
145 | x = [
|

W19.py:146:1: W191 Indentation contains tabs
|
144 | #: W191
145 | x = [
146 | 'abc'
| ^^^^ W191
147 | ]
148 | #: W191 - okay
|
@@ -70,6 +70,10 @@ impl Violation for BlankLineAfterSummary {
pub(crate) fn blank_after_summary(checker: &mut Checker, docstring: &Docstring) {
    let body = docstring.body();

    if !docstring.triple_quoted() {
        return;
    }

    let mut lines_count: usize = 1;
    let mut blanks_count = 0;
    for line in body.trim().universal_newlines().skip(1) {
@@ -39,7 +39,7 @@ use crate::registry::{AsRule, Rule};
/// ## Options
/// - `pydocstyle.convention`
///
/// [D211]: https://beta.ruff.rs/docs/rules/blank-line-before-class
/// [D211]: https://docs.astral.sh/ruff/rules/blank-line-before-class
#[violation]
pub struct OneBlankLineBeforeClass;

@@ -136,7 +136,7 @@ impl AlwaysAutofixableViolation for OneBlankLineAfterClass {
/// ## Options
/// - `pydocstyle.convention`
///
/// [D203]: https://beta.ruff.rs/docs/rules/one-blank-line-before-class
/// [D203]: https://docs.astral.sh/ruff/rules/one-blank-line-before-class
#[violation]
pub struct BlankLineBeforeClass;

@@ -48,7 +48,7 @@ use crate::registry::{AsRule, Rule};
/// """
/// ```
///
/// [D213]: https://beta.ruff.rs/docs/rules/multi-line-summary-second-line
/// [D213]: https://docs.astral.sh/ruff/rules/multi-line-summary-second-line
#[violation]
pub struct MultiLineSummaryFirstLine;

@@ -102,7 +102,7 @@ impl AlwaysAutofixableViolation for MultiLineSummaryFirstLine {
/// """
/// ```
///
/// [D212]: https://beta.ruff.rs/docs/rules/multi-line-summary-first-line
/// [D212]: https://docs.astral.sh/ruff/rules/multi-line-summary-first-line
#[violation]
pub struct MultiLineSummarySecondLine;
@@ -63,6 +63,10 @@ pub(crate) fn newline_after_last_paragraph(checker: &mut Checker, docstring: &Do
    let contents = docstring.contents;
    let body = docstring.body();

    if !docstring.triple_quoted() {
        return;
    }

    let mut line_count = 0;
    for line in NewlineWithTrailingNewline::from(body.as_str()) {
        if !line.trim().is_empty() {

@@ -97,5 +97,6 @@ D.py:658:5: D204 [*] 1 blank line required after class docstring
659 |+
659 660 | def sort_services(self):
660 661 | pass
661 662 |
@@ -77,4 +77,13 @@ D.py:658:5: D300 Use triple double quotes `"""`
660 | pass
|

D.py:664:5: D300 Use triple double quotes `"""`
|
663 | def newline_after_closing_quote(self):
664 | "We enforce a newline after the closing quote for a multi-line docstring \
| _____^
665 | | but continuations shouldn't be considered multi-line"
| |_________________________________________________________^ D300
|

@@ -310,4 +310,21 @@ D.py:641:18: D400 [*] First line should end with a period
643 643 |
644 644 | def single_line_docstring_with_an_escaped_backslash():

D.py:664:5: D400 [*] First line should end with a period
|
663 | def newline_after_closing_quote(self):
664 | "We enforce a newline after the closing quote for a multi-line docstring \
| _____^
665 | | but continuations shouldn't be considered multi-line"
| |_________________________________________________________^ D400
|
= help: Add period

ℹ Suggested fix
662 662 |
663 663 | def newline_after_closing_quote(self):
664 664 | "We enforce a newline after the closing quote for a multi-line docstring \
665 |- but continuations shouldn't be considered multi-line"
665 |+ but continuations shouldn't be considered multi-line."

@@ -292,4 +292,21 @@ D.py:641:18: D415 [*] First line should end with a period, question mark, or exc
643 643 |
644 644 | def single_line_docstring_with_an_escaped_backslash():

D.py:664:5: D415 [*] First line should end with a period, question mark, or exclamation point
|
663 | def newline_after_closing_quote(self):
664 | "We enforce a newline after the closing quote for a multi-line docstring \
| _____^
665 | | but continuations shouldn't be considered multi-line"
| |_________________________________________________________^ D415
|
= help: Add closing punctuation

ℹ Suggested fix
662 662 |
663 663 | def newline_after_closing_quote(self):
664 664 | "We enforce a newline after the closing quote for a multi-line docstring \
665 |- but continuations shouldn't be considered multi-line"
665 |+ but continuations shouldn't be considered multi-line."
Some files were not shown because too many files have changed in this diff.