Compare commits

...

20 Commits

Author SHA1 Message Date
Charlie Marsh
931d41bff1 Revert "Bump version to 0.0.222"
This reverts commit 852aab5758.
2023-01-13 23:56:29 -05:00
Charlie Marsh
852aab5758 Bump version to 0.0.222 2023-01-13 23:50:08 -05:00
Charlie Marsh
27fe4873f2 Fix placement of update feature flag 2023-01-13 23:46:32 -05:00
Charlie Marsh
ee6c81d02a Bump version to 0.0.221 2023-01-13 23:33:15 -05:00
Charlie Marsh
59542344e2 Avoid unnecessary allocations for module names (#1863) 2023-01-13 23:26:34 -05:00
Martin Fischer
7b1ce72f86 Actually fix wasm-pack build command (#1862)
I initially attempted to run `wasm-pack build -p ruff` which gave the
error message:

Error: crate directory is missing a `Cargo.toml` file; is `-p` the wrong
directory?

I interpreted that as wasm-pack looking for the "ruff" directory because
I specified `-p ruff`; however, the actual wasm-pack build usage is:

    wasm-pack build [FLAGS] [OPTIONS] <path> <cargo-build-options>

And I was missing the `<path>` argument. So this actually wasn't a bug
in wasm-pack at all, just a confusing error message. And the symlink
hack I introduced in the previous commit didn't actually work ... I only
accidentally omitted the `-p` when testing (which ended up with `ruff`
being the `<path>` argument) ... CLIs are fun.
2023-01-13 23:20:20 -05:00
Martin Fischer
156e09536e Add workaround for wasm-pack bug to fix the playground CI (#1861)
Fixes #1860.
2023-01-13 22:46:51 -05:00
Charlie Marsh
22341c4ae4 Move repology down 2023-01-13 22:29:26 -05:00
Colin Delahunty
e4993bd7e2 Added ALE (#1857)
Fixes the sub-issue I brought up in #1829.
2023-01-13 21:39:51 -05:00
Martin Fischer
82aff5f9ec Split off ruff_cli crate from ruff library
This lets you test the ruff linters or use the ruff library
without having to compile the ~100 additional dependencies
that are needed by the CLI.

Because we set the following in the [workspace] section of Cargo.toml:

   default-members = [".", "ruff_cli"]

`cargo run` still runs the CLI and `cargo test` still tests
the code in src/ as well as the code in the new ruff_cli crate.
(But you can now also run `cargo test -p ruff` to only test the linters.)
2023-01-13 21:37:54 -05:00
Charlie Marsh
403a004e03 Refactor import-tracking to leverage existing AST bindings (#1856)
This PR refactors our import-tracking logic to leverage our existing
logic for tracking bindings. It's a significant simplification and a
significant improvement (as we can now track reassignments), and it
closes out a bunch of subtle bugs.

Though the AST tracks all bindings (e.g., when parsing `import os as
foo`, we bind the name `foo` to a `BindingKind::Importation` that points
to the `os` module), when I went to implement import tracking (e.g., to
ensure that if the user references `List`, it's actually `typing.List`),
I added a parallel system specifically for this use-case.

That was a mistake, for a few reasons:

1. It didn't track reassignments, so if you had `from typing import
List`, but `List` was later overridden, we'd still consider any
reference to `List` to be `typing.List`.
2. It required a bunch of extra logic, including complex logic to try
to optimize the lookups, since it's such a hot codepath.
3. There were a few bugs in the implementation that were just hard to
correct under the existing abstractions (e.g., if you did `from typing
import Optional as Foo`, then we'd treat any reference to `Foo` _or_
`Optional` as `typing.Optional`, even though, in that case, `Optional`
was really unbound).

The new implementation goes through our existing binding tracking: when
we get a reference, we find the appropriate binding given the current
scope stack, and normalize it back to its original target.

Closes #1690.
Closes #1790.
2023-01-13 20:39:54 -05:00
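The reassignment problem in point 1 above is easy to reproduce in plain Python. This sketch (not from the PR itself) shows why a one-shot import table goes stale once a name is rebound:

```python
import typing
from typing import List

# Immediately after the import, `List` resolves to `typing.List`.
assert List is typing.List

# Rebinding the name shadows the import. A tracker that records
# "List -> typing.List" once and never invalidates the entry would
# still report `typing.List` here; binding-based tracking sees the
# reassignment because it resolves names through the current scope.
List = "shadowed"
assert List == "shadowed"
assert List is not typing.List
```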
Charlie Marsh
0b92849996 Improve spacing preservation for C405 fixes (#1855)
We now preserve the spacing of the more common form:

```py
set((
    1,
))
```

Rather than the less common form:

```py
set(
    (1,)
)
```
2023-01-13 13:11:08 -05:00
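As context for the fix above: C405 flags `set()` calls over list/tuple literals and rewrites them as set literals. Both spellings are equivalent at runtime, so only the layout of the autofix is at stake here (a sketch, not one of Ruff's own test cases):

```python
# The call form that C405 flags...
before = set((
    1,
))

# ...and the set-literal form the autofix produces. The fix now keeps
# the outer spacing of the original call rather than the inner tuple's.
after = {
    1,
}

assert before == after == {1}
```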
Charlie Marsh
12440ede9c Remove non-magic trailing comma from tuple (#1854)
Closes #1821.
2023-01-13 12:56:42 -05:00
max0x53
fc3f722df5 Implement PLR0133 (ComparisonOfConstants) (#1841)
This PR adds [Pylint
`R0133`](https://pylint.pycqa.org/en/latest/user_guide/messages/refactor/comparison-of-constants.html)

Feel free to suggest changes and additions; I have tried to maintain
parity with the Pylint implementation
[`comparison_checker.py`](https://github.com/PyCQA/pylint/blob/main/pylint/checkers/base/comparison_checker.py#L247)

See #970
2023-01-13 12:14:35 -05:00
Maksudul Haque
84ef7a0171 [isort] Add classes Config Option (#1849)
ref https://github.com/charliermarsh/ruff/issues/1819
2023-01-13 12:13:01 -05:00
Nicola Soranzo
66b1d09362 Clarify that some flake8-bugbear opinionated rules are already implemented (#1847)
E.g. B904 and B905.
2023-01-13 11:49:05 -05:00
Maksudul Haque
3ae01db226 [flake8-bugbear] Fix False Positives for B024 & B027 (#1851)
closes https://github.com/charliermarsh/ruff/issues/1848
2023-01-13 11:46:17 -05:00
Charlie Marsh
048e5774e8 Use absolute paths for --stdin-filename matching (#1843)
Non-basename glob matches (e.g., for `--per-file-ignores`) assume that
the path has been converted to an absolute path. (We do this for
filenames as part of the directory traversal.) For filenames passed via
stdin, though, we're missing this conversion. So `--per-file-ignores`
that rely on the _basename_ worked as expected, but directory paths did
not.

Closes #1840.
2023-01-12 21:01:05 -05:00
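The basename-versus-directory behavior described above can be mimicked with the standard-library `fnmatch` module (Ruff itself matches globs via `globset`, and the `/project` paths here are hypothetical; this is only an analogy):

```python
import fnmatch
import os

# A per-file-ignores pattern, assumed to have been absolutized during
# config resolution.
pattern = "/project/src/*.py"

# A relative filename, as previously passed through for
# --stdin-filename, fails the absolute glob...
assert not fnmatch.fnmatch("src/bad.py", pattern)

# ...but matches once it is converted to an absolute path first.
absolute = os.path.normpath(os.path.join("/project", "src/bad.py"))
assert fnmatch.fnmatch(absolute, pattern)

# Basename-only patterns matched either way, which masked the bug.
assert fnmatch.fnmatch(os.path.basename("src/bad.py"), "bad.py")
```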
max0x53
b47e8e6770 Implement PLR2004 (MagicValueComparison) (#1828)
This PR adds [Pylint
`R2004`](https://pylint.pycqa.org/en/latest/user_guide/messages/refactor/magic-value-comparison.html#magic-value-comparison-r2004)

Feel free to suggest changes and additions; I have tried to maintain
parity with the Pylint implementation
[`magic_value.py`](https://github.com/PyCQA/pylint/blob/main/pylint/extensions/magic_value.py)

See #970
2023-01-12 19:44:18 -05:00
Jan Katins
ef17c82998 Document the way extend-ignore/select are applied (#1839)
Closes: https://github.com/charliermarsh/ruff/issues/1838
2023-01-12 19:44:03 -05:00
122 changed files with 2487 additions and 2089 deletions

View File

@@ -60,7 +60,7 @@ jobs:
target: wasm32-unknown-unknown
- uses: Swatinem/rust-cache@v1
- run: cargo clippy --workspace --all-targets --all-features -- -D warnings -W clippy::pedantic
- run: cargo clippy --workspace --target wasm32-unknown-unknown --all-features -- -D warnings -W clippy::pedantic
- run: cargo clippy -p ruff --target wasm32-unknown-unknown --all-features -- -D warnings -W clippy::pedantic
cargo-test:
name: "cargo test"
@@ -79,7 +79,7 @@ jobs:
run: |
cargo insta test --all --delete-unreferenced-snapshots
git diff --exit-code
- run: cargo test --package ruff --test black_compatibility_test -- --ignored
- run: cargo test --package ruff_cli --test black_compatibility_test -- --ignored
# TODO(charlie): Re-enable the `wasm-pack` tests.
# See: https://github.com/charliermarsh/ruff/issues/1425

View File

@@ -31,7 +31,7 @@ jobs:
- uses: jetli/wasm-pack-action@v0.4.0
- uses: jetli/wasm-bindgen-action@v0.2.0
- name: "Run wasm-pack"
run: wasm-pack build --target web --out-dir playground/src/pkg
run: wasm-pack build --target web --out-dir playground/src/pkg . -- -p ruff
- name: "Install Node dependencies"
run: npm ci
working-directory: playground

View File

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: v0.0.220
rev: v0.0.221
hooks:
- id: ruff

Cargo.lock generated
View File

@@ -735,7 +735,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
[[package]]
name = "flake8-to-ruff"
version = "0.0.220-dev.0"
version = "0.0.221-dev.0"
dependencies = [
"anyhow",
"clap 4.0.32",
@@ -1874,27 +1874,19 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.0.220"
version = "0.0.221"
dependencies = [
"annotate-snippets 0.9.1",
"anyhow",
"assert_cmd",
"atty",
"bincode",
"bitflags",
"cachedir",
"cfg-if 1.0.0",
"chrono",
"clap 4.0.32",
"clap_complete_command",
"clearscreen",
"colored",
"console_error_panic_hook",
"console_log",
"criterion",
"dirs 4.0.0",
"fern",
"filetime",
"getrandom 0.2.8",
"glob",
"globset",
@@ -1906,13 +1898,10 @@ dependencies = [
"log",
"natord",
"nohash-hasher",
"notify",
"num-bigint",
"num-traits",
"once_cell",
"path-absolutize",
"quick-junit",
"rayon",
"regex",
"ropey",
"ruff_macros",
@@ -1924,25 +1913,57 @@ dependencies = [
"semver",
"serde",
"serde-wasm-bindgen",
"serde_json",
"shellexpand",
"similar",
"strum",
"strum_macros",
"test-case",
"textwrap",
"titlecase",
"toml_edit",
"update-informer",
"ureq",
"walkdir",
"wasm-bindgen",
"wasm-bindgen-test",
]
[[package]]
name = "ruff_cli"
version = "0.0.221"
dependencies = [
"annotate-snippets 0.9.1",
"anyhow",
"assert_cmd",
"atty",
"bincode",
"cachedir",
"chrono",
"clap 4.0.32",
"clap_complete_command",
"clearscreen",
"colored",
"filetime",
"glob",
"ignore",
"itertools",
"log",
"notify",
"path-absolutize",
"quick-junit",
"rayon",
"regex",
"ruff",
"rustc-hash",
"serde",
"serde_json",
"similar",
"strum",
"textwrap",
"update-informer",
"ureq",
"walkdir",
]
[[package]]
name = "ruff_dev"
version = "0.0.220"
version = "0.0.221"
dependencies = [
"anyhow",
"clap 4.0.32",
@@ -1950,6 +1971,7 @@ dependencies = [
"libcst",
"once_cell",
"ruff",
"ruff_cli",
"rustpython-ast",
"rustpython-common",
"rustpython-parser",
@@ -1962,7 +1984,7 @@ dependencies = [
[[package]]
name = "ruff_macros"
version = "0.0.220"
version = "0.0.221"
dependencies = [
"once_cell",
"proc-macro2",

View File

@@ -2,11 +2,13 @@
members = [
"flake8_to_ruff",
"ruff_dev",
"ruff_cli",
]
default-members = [".", "ruff_cli"]
[package]
name = "ruff"
version = "0.0.220"
version = "0.0.221"
authors = ["Charlie Marsh <charlie.r.marsh@gmail.com>"]
edition = "2021"
rust-version = "1.65.0"
@@ -22,20 +24,14 @@ crate-type = ["cdylib", "rlib"]
doctest = false
[dependencies]
annotate-snippets = { version = "0.9.1", features = ["color"] }
anyhow = { version = "1.0.66" }
atty = { version = "0.2.14" }
bincode = { version = "1.3.3" }
bitflags = { version = "1.3.2" }
cachedir = { version = "0.3.0" }
cfg-if = { version = "1.0.0" }
chrono = { version = "0.4.21", default-features = false, features = ["clock"] }
clap = { version = "4.0.1", features = ["derive", "env"] }
clap_complete_command = { version = "0.4.0" }
colored = { version = "2.0.0" }
dirs = { version = "4.0.0" }
fern = { version = "0.6.1" }
filetime = { version = "0.2.17" }
glob = { version = "0.3.0" }
globset = { version = "0.4.9" }
ignore = { version = "0.4.18" }
@@ -44,15 +40,13 @@ libcst = { git = "https://github.com/charliermarsh/LibCST", rev = "f2f0b7a487a87
log = { version = "0.4.17" }
natord = { version = "1.0.9" }
nohash-hasher = { version = "0.2.0" }
notify = { version = "5.0.0" }
num-bigint = { version = "0.4.3" }
num-traits = "0.2.15"
once_cell = { version = "1.16.0" }
path-absolutize = { version = "3.0.14", features = ["once_cell_cache", "use_unix_paths_on_wasm"] }
quick-junit = { version = "0.3.2" }
regex = { version = "1.6.0" }
ropey = { version = "1.5.0", features = ["cr_lines", "simd"], default-features = false }
ruff_macros = { version = "0.0.220", path = "ruff_macros" }
ruff_macros = { version = "0.0.221", path = "ruff_macros" }
rustc-hash = { version = "1.1.0" }
rustpython-ast = { features = ["unparse"], git = "https://github.com/RustPython/RustPython.git", rev = "d532160333ffeb6dbeca2c2728c2391cd1e53b7f" }
rustpython-common = { git = "https://github.com/RustPython/RustPython.git", rev = "d532160333ffeb6dbeca2c2728c2391cd1e53b7f" }
@@ -60,20 +54,12 @@ rustpython-parser = { features = ["lalrpop"], git = "https://github.com/RustPyth
schemars = { version = "0.8.11" }
semver = { version = "1.0.16" }
serde = { version = "1.0.147", features = ["derive"] }
serde_json = { version = "1.0.87" }
shellexpand = { version = "3.0.0" }
similar = { version = "2.2.1" }
strum = { version = "0.24.1", features = ["strum_macros"] }
strum_macros = { version = "0.24.3" }
textwrap = { version = "0.16.0" }
titlecase = { version = "2.2.1" }
toml_edit = { version = "0.17.1", features = ["easy"] }
walkdir = { version = "2.3.2" }
[target.'cfg(not(target_family = "wasm"))'.dependencies]
clearscreen = { version = "2.0.0" }
rayon = { version = "1.5.3" }
update-informer = { version = "0.6.0", default-features = false, features = ["pypi"], optional = true }
# https://docs.rs/getrandom/0.2.7/getrandom/#webassembly-support
# For (future) wasm-pack support
@@ -88,17 +74,11 @@ wasm-bindgen = { version = "0.2.83" }
[dev-dependencies]
insta = { version = "1.19.1", features = ["yaml"] }
test-case = { version = "2.2.2" }
ureq = { version = "2.5.0", features = [] }
wasm-bindgen-test = { version = "0.3.33" }
[target.'cfg(not(target_family = "wasm"))'.dev-dependencies]
assert_cmd = { version = "2.0.4" }
criterion = { version = "0.4.0" }
[features]
default = ["update-informer"]
update-informer = ["dep:update-informer"]
[profile.release]
panic = "abort"
lto = "thin"

View File

@@ -141,8 +141,6 @@ Ruff is available as [`ruff`](https://pypi.org/project/ruff/) on PyPI:
pip install ruff
```
[![Packaging status](https://repology.org/badge/vertical-allrepos/ruff-python-linter.svg?exclude_unsupported=1)](https://repology.org/project/ruff-python-linter/versions)
For **macOS Homebrew** and **Linuxbrew** users, Ruff is also available as [`ruff`](https://formulae.brew.sh/formula/ruff) on Homebrew:
```shell
@@ -161,6 +159,8 @@ For **Arch Linux** users, Ruff is also available as [`ruff`](https://archlinux.o
pacman -S ruff
```
[![Packaging status](https://repology.org/badge/vertical-allrepos/ruff-python-linter.svg?exclude_unsupported=1)](https://repology.org/project/ruff-python-linter/versions)
### Usage
To run Ruff, try any of the following:
@@ -182,7 +182,7 @@ Ruff also works with [pre-commit](https://pre-commit.com):
```yaml
- repo: https://github.com/charliermarsh/ruff-pre-commit
# Ruff version.
rev: 'v0.0.220'
rev: 'v0.0.221'
hooks:
- id: ruff
# Respect `exclude` and `extend-exclude` settings.
@@ -1094,8 +1094,10 @@ For more, see [Pylint](https://pypi.org/project/pylint/2.15.7/) on PyPI.
| PLE1142 | AwaitOutsideAsync | `await` should be used within an async function | |
| PLR0206 | PropertyWithParameters | Cannot have defined parameters for properties | |
| PLR0402 | ConsiderUsingFromImport | Use `from ... import ...` in lieu of alias | |
| PLR0133 | ConstantComparison | Two constants compared in a comparison, consider replacing `0 == 0` | |
| PLR1701 | ConsiderMergingIsinstance | Merge these isinstance calls: `isinstance(..., (...))` | |
| PLR1722 | UseSysExit | Use `sys.exit()` instead of `exit` | 🛠 |
| PLR2004 | MagicValueComparison | Magic number used in comparison, consider replacing magic with a constant variable | |
| PLW0120 | UselessElseOnLoop | Else clause on loop without a break statement, remove the else and de-indent all the code inside it | |
| PLW0602 | GlobalVariableNotAssigned | Using global for `...` but no assignment is done | |
@@ -1254,6 +1256,18 @@ officially supported LSP server for Ruff.
However, Ruff is also available as part of the [coc-pyright](https://github.com/fannheyward/coc-pyright)
extension for `coc.nvim`.
<details>
<summary>With the <a href="https://github.com/dense-analysis/ale">ALE</a> plugin for (Neo)Vim.</summary>
```vim
let g:ale_linters = { "python": ["ruff"] }
let g:ale_fixers = {
\ "python": ["black", "ruff"],
\}
```
</details>
<details>
<summary>Ruff can also be integrated via <a href="https://github.com/neovim/nvim-lspconfig/blob/master/doc/server_configurations.md#efm"><code>efm</code></a> in just a <a href="https://github.com/JafarAbdi/myconfigs/blob/6f0b6b2450e92ec8fc50422928cd22005b919110/efm-langserver/config.yaml#L14-L20">few lines</a>.</summary>
<br>
@@ -1418,7 +1432,7 @@ Beyond the rule set, Ruff suffers from the following limitations vis-à-vis Flak
There are a few other minor incompatibilities between Ruff and the originating Flake8 plugins:
- Ruff doesn't implement the "opinionated" lint rules from `flake8-bugbear`.
- Ruff doesn't implement all the "opinionated" lint rules from `flake8-bugbear`.
- Depending on your project structure, Ruff and `isort` can differ in their detection of first-party
code. (This is often solved by modifying the `src` property, e.g., to `src = ["src"]`, if your
code is nested in a `src` directory.)
@@ -1908,6 +1922,12 @@ extend-exclude = ["tests", "src/bad.py"]
A list of rule codes or prefixes to ignore, in addition to those
specified by `ignore`.
Note that `extend-ignore` is applied after resolving rules from
`ignore`/`select` and a less specific rule in `extend-ignore`
would overwrite a more specific rule in `select`. It is
recommended to only use `extend-ignore` when extending a
`pyproject.toml` file via `extend`.
**Default value**: `[]`
**Type**: `Vec<RuleCodePrefix>`
@@ -1927,6 +1947,12 @@ extend-ignore = ["F841"]
A list of rule codes or prefixes to enable, in addition to those
specified by `select`.
Note that `extend-select` is applied after resolving rules from
`ignore`/`select` and a less specific rule in `extend-select`
would overwrite a more specific rule in `ignore`. It is
recommended to only use `extend-select` when extending a
`pyproject.toml` file via `extend`.
**Default value**: `[]`
**Type**: `Vec<RuleCodePrefix>`
@@ -2864,6 +2890,24 @@ ignore-variadic-names = true
### `isort`
#### [`classes`](#classes)
An override list of tokens to always recognize as a Class for
`order-by-type` regardless of casing.
**Default value**: `[]`
**Type**: `Vec<String>`
**Example usage**:
```toml
[tool.ruff.isort]
classes = ["SVC"]
```
---
#### [`combine-as-imports`](#combine-as-imports)
Combines as imports on the same line. See isort's [`combine-as-imports`](https://pycqa.github.io/isort/docs/configuration/options.html#combine-as-imports)

View File

@@ -771,7 +771,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
[[package]]
name = "flake8_to_ruff"
version = "0.0.220"
version = "0.0.221"
dependencies = [
"anyhow",
"clap",
@@ -1975,7 +1975,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.0.220"
version = "0.0.221"
dependencies = [
"anyhow",
"bincode",

View File

@@ -1,6 +1,6 @@
[package]
name = "flake8-to-ruff"
version = "0.0.220-dev.0"
version = "0.0.221-dev.0"
edition = "2021"
[lib]

View File

@@ -1,10 +1,13 @@
[build-system]
requires = ["maturin>=0.14,<0.15"]
requires = ["maturin>=0.14.10,<0.15"]
# We depend on >=0.14.10 because we specify name in
# [package.metadata.maturin] in ruff_cli/Cargo.toml.
build-backend = "maturin"
[project]
name = "ruff"
version = "0.0.220"
version = "0.0.221"
description = "An extremely fast Python linter, written in Rust."
authors = [
{ name = "Charlie Marsh", email = "charlie.r.marsh@gmail.com" },
@@ -35,6 +38,7 @@ urls = { repository = "https://github.com/charliermarsh/ruff" }
[tool.maturin]
bindings = "bin"
manifest-path = "ruff_cli/Cargo.toml"
python-source = "python"
strip = true

View File

@@ -0,0 +1,9 @@
class C:
from typing import overload
@overload
def f(self, x: int, y: int) -> None:
...
def f(self, x, y):
pass

View File

@@ -1,15 +1,16 @@
"""
Should emit:
B024 - on lines 17, 34, 52, 58, 69, 74, 79, 84, 89
B024 - on lines 18, 71, 82, 87, 92, 141
"""
import abc
import abc as notabc
from abc import ABC, ABCMeta
from abc import abstractmethod
from abc import abstractmethod, abstractproperty
from abc import abstractmethod as abstract
from abc import abstractmethod as abstractaoeuaoeuaoeu
from abc import abstractmethod as notabstract
from abc import abstractproperty as notabstract_property
import foo
@@ -49,12 +50,24 @@ class Base_6(ABC):
foo()
class Base_7(ABC): # error
class Base_7(ABC):
@notabstract
def method(self):
foo()
class Base_8(ABC):
@notabstract_property
def method(self):
foo()
class Base_9(ABC):
@abstractproperty
def method(self):
foo()
class MetaBase_1(metaclass=ABCMeta): # error
def method(self):
foo()

View File

@@ -1,11 +1,12 @@
"""
Should emit:
B027 - on lines 12, 15, 18, 22, 30
B027 - on lines 13, 16, 19, 23
"""
import abc
from abc import ABC
from abc import abstractmethod
from abc import abstractmethod, abstractproperty
from abc import abstractmethod as notabstract
from abc import abstractproperty as notabstract_property
class AbstractClass(ABC):
@@ -42,6 +43,18 @@ class AbstractClass(ABC):
def abstract_3(self):
...
@abc.abstractproperty
def abstract_4(self):
...
@abstractproperty
def abstract_5(self):
...
@notabstract_property
def abstract_6(self):
...
def body_1(self):
print("foo")
...

View File

@@ -1,5 +1,18 @@
s1 = set([1, 2])
s2 = set((1, 2))
s3 = set([])
s4 = set(())
s5 = set()
set([1, 2])
set((1, 2))
set([])
set(())
set()
set((1,))
set((
1,
))
set([
1,
])
set(
(1,)
)
set(
[1,]
)

View File

@@ -0,0 +1,4 @@
from sklearn.svm import func, SVC, CONST, Klass
from subprocess import N_CLASS, PIPE, Popen, STDOUT
from module import CLASS, Class, CONSTANT, function, BASIC, Apple
from torch.nn import SELU, AClass, A_CONSTANT

View File

@@ -9,7 +9,6 @@ from .background import BackgroundTasks
# F401 `datastructures.UploadFile` imported but unused
from .datastructures import UploadFile as FileUpload
# OK
import applications as applications

View File

@@ -5,4 +5,3 @@ from warnings import warn
warnings.warn("this is ok")
warn("by itself is also ok")
logging.warning("this is fine")
log.warning("this is ok")

View File

@@ -2,14 +2,4 @@ import logging
from logging import warn
logging.warn("this is not ok")
log.warn("this is also not ok")
warn("not ok")
def foo():
from logging import warn
def warn():
pass
warn("has been redefined, but we will still report it")

View File

@@ -0,0 +1,59 @@
"""Check that magic values are not used in comparisons"""
if 100 == 100: # [comparison-of-constants]
pass
if 1 == 3: # [comparison-of-constants]
pass
if 1 != 3: # [comparison-of-constants]
pass
x = 0
if 4 == 3 == x: # [comparison-of-constants]
pass
if x == 0: # correct
pass
y = 1
if x == y: # correct
pass
if 1 > 0: # [comparison-of-constants]
pass
if x > 0: # correct
pass
if 1 >= 0: # [comparison-of-constants]
pass
if x >= 0: # correct
pass
if 1 < 0: # [comparison-of-constants]
pass
if x < 0: # correct
pass
if 1 <= 0: # [comparison-of-constants]
pass
if x <= 0: # correct
pass
word = "hello"
if word == "": # correct
pass
if "hello" == "": # [comparison-of-constants]
pass
truthy = True
if truthy == True: # correct
pass
if True == False: # [comparison-of-constants]
pass

View File

@@ -0,0 +1,71 @@
"""Check that magic values are not used in comparisons"""
import cmath
user_input = 10
if 10 > user_input: # [magic-value-comparison]
pass
if 10 == 100: # [comparison-of-constants] R0133
pass
if 1 == 3: # [comparison-of-constants] R0133
pass
x = 0
if 4 == 3 == x: # [comparison-of-constants] R0133
pass
time_delta = 7224
ONE_HOUR = 3600
if time_delta > ONE_HOUR: # correct
pass
argc = 1
if argc != -1: # correct
pass
if argc != 0: # correct
pass
if argc != 1: # correct
pass
if argc != 2: # [magic-value-comparison]
pass
if __name__ == "__main__": # correct
pass
ADMIN_PASSWORD = "SUPERSECRET"
input_password = "password"
if input_password == "": # correct
pass
if input_password == ADMIN_PASSWORD: # correct
pass
if input_password == "Hunter2": # [magic-value-comparison]
pass
PI = 3.141592653589793238
pi_estimation = 3.14
if pi_estimation == 3.141592653589793238: # [magic-value-comparison]
pass
if pi_estimation == PI: # correct
pass
HELLO_WORLD = b"Hello, World!"
user_input = b"Hello, There!"
if user_input == b"something": # [magic-value-comparison]
pass
if user_input == HELLO_WORLD: # correct
pass

View File

@@ -67,7 +67,7 @@
}
},
"extend-ignore": {
"description": "A list of rule codes or prefixes to ignore, in addition to those specified by `ignore`.",
"description": "A list of rule codes or prefixes to ignore, in addition to those specified by `ignore`.\n\nNote that `extend-ignore` is applied after resolving rules from `ignore`/`select` and a less specific rule in `extend-ignore` would overwrite a more specific rule in `select`. It is recommended to only use `extend-ignore` when extending a `pyproject.toml` file via `extend`.",
"type": [
"array",
"null"
@@ -77,7 +77,7 @@
}
},
"extend-select": {
"description": "A list of rule codes or prefixes to enable, in addition to those specified by `select`.",
"description": "A list of rule codes or prefixes to enable, in addition to those specified by `select`.\n\nNote that `extend-select` is applied after resolving rules from `ignore`/`select` and a less specific rule in `extend-select` would overwrite a more specific rule in `ignore`. It is recommended to only use `extend-select` when extending a `pyproject.toml` file via `extend`.",
"type": [
"array",
"null"
@@ -755,6 +755,16 @@
"IsortOptions": {
"type": "object",
"properties": {
"classes": {
"description": "An override list of tokens to always recognize as a Class for `order-by-type` regardless of casing.",
"type": [
"array",
"null"
],
"items": {
"type": "string"
}
},
"combine-as-imports": {
"description": "Combines as imports on the same line. See isort's [`combine-as-imports`](https://pycqa.github.io/isort/docs/configuration/options.html#combine-as-imports) option.",
"type": [
@@ -1433,6 +1443,9 @@
"PLE1142",
"PLR",
"PLR0",
"PLR01",
"PLR013",
"PLR0133",
"PLR02",
"PLR020",
"PLR0206",
@@ -1445,6 +1458,10 @@
"PLR1701",
"PLR172",
"PLR1722",
"PLR2",
"PLR20",
"PLR200",
"PLR2004",
"PLW",
"PLW0",
"PLW01",

ruff_cli/Cargo.toml Normal file
View File

@@ -0,0 +1,62 @@
[package]
name = "ruff_cli"
version = "0.0.221"
authors = ["Charlie Marsh <charlie.r.marsh@gmail.com>"]
edition = "2021"
rust-version = "1.65.0"
documentation = "https://github.com/charliermarsh/ruff"
homepage = "https://github.com/charliermarsh/ruff"
repository = "https://github.com/charliermarsh/ruff"
readme = "../README.md"
license = "MIT"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[[bin]]
name = "ruff"
path = "src/main.rs"
doctest = false
[dependencies]
ruff = { path = ".." }
annotate-snippets = { version = "0.9.1", features = ["color"] }
anyhow = { version = "1.0.66" }
atty = { version = "0.2.14" }
bincode = { version = "1.3.3" }
cachedir = { version = "0.3.0" }
chrono = { version = "0.4.21", default-features = false, features = ["clock"] }
clap = { version = "4.0.1", features = ["derive", "env"] }
clap_complete_command = { version = "0.4.0" }
clearscreen = { version = "2.0.0" }
colored = { version = "2.0.0" }
filetime = { version = "0.2.17" }
glob = { version = "0.3.0" }
ignore = { version = "0.4.18" }
itertools = { version = "0.10.5" }
log = { version = "0.4.17" }
notify = { version = "5.0.0" }
path-absolutize = { version = "3.0.14", features = ["once_cell_cache"] }
quick-junit = { version = "0.3.2" }
rayon = { version = "1.5.3" }
regex = { version = "1.6.0" }
rustc-hash = { version = "1.1.0" }
serde = { version = "1.0.147", features = ["derive"] }
serde_json = { version = "1.0.87" }
similar = { version = "2.2.1" }
textwrap = { version = "0.16.0" }
update-informer = { version = "0.6.0", default-features = false, features = ["pypi"], optional = true }
walkdir = { version = "2.3.2" }
[dev-dependencies]
assert_cmd = { version = "2.0.4" }
strum = { version = "0.24.1" }
ureq = { version = "2.5.0", features = [] }
[features]
default = ["update-informer"]
update-informer = ["dep:update-informer"]
[package.metadata.maturin]
name = "ruff"
# Setting the name here is necessary for maturin to include the package in its builds.

ruff_cli/src/cache.rs Normal file
View File

@@ -0,0 +1,124 @@
use std::collections::hash_map::DefaultHasher;
use std::fs;
use std::hash::{Hash, Hasher};
use std::io::Write;
use std::path::Path;
use anyhow::Result;
use filetime::FileTime;
use log::error;
use path_absolutize::Absolutize;
use ruff::message::Message;
use ruff::settings::{flags, Settings};
use serde::{Deserialize, Serialize};
const CARGO_PKG_VERSION: &str = env!("CARGO_PKG_VERSION");
#[derive(Serialize, Deserialize)]
struct CacheMetadata {
mtime: i64,
}
#[derive(Serialize)]
struct CheckResultRef<'a> {
metadata: &'a CacheMetadata,
messages: &'a [Message],
}
#[derive(Deserialize)]
struct CheckResult {
metadata: CacheMetadata,
messages: Vec<Message>,
}
fn content_dir() -> &'static Path {
Path::new("content")
}
fn cache_key<P: AsRef<Path>>(path: P, settings: &Settings, autofix: flags::Autofix) -> u64 {
let mut hasher = DefaultHasher::new();
CARGO_PKG_VERSION.hash(&mut hasher);
path.as_ref().absolutize().unwrap().hash(&mut hasher);
settings.hash(&mut hasher);
autofix.hash(&mut hasher);
hasher.finish()
}
#[allow(dead_code)]
/// Initialize the cache at the specified `Path`.
pub fn init(path: &Path) -> Result<()> {
// Create the cache directories.
fs::create_dir_all(path.join(content_dir()))?;
// Add the CACHEDIR.TAG.
if !cachedir::is_tagged(path)? {
cachedir::add_tag(path)?;
}
// Add the .gitignore.
let gitignore_path = path.join(".gitignore");
if !gitignore_path.exists() {
let mut file = fs::File::create(gitignore_path)?;
file.write_all(b"*")?;
}
Ok(())
}
fn write_sync(cache_dir: &Path, key: u64, value: &[u8]) -> Result<(), std::io::Error> {
fs::write(
cache_dir.join(content_dir()).join(format!("{key:x}")),
value,
)
}
fn read_sync(cache_dir: &Path, key: u64) -> Result<Vec<u8>, std::io::Error> {
fs::read(cache_dir.join(content_dir()).join(format!("{key:x}")))
}
/// Get a value from the cache.
pub fn get<P: AsRef<Path>>(
path: P,
metadata: &fs::Metadata,
settings: &Settings,
autofix: flags::Autofix,
) -> Option<Vec<Message>> {
let encoded = read_sync(&settings.cache_dir, cache_key(path, settings, autofix)).ok()?;
let (mtime, messages) = match bincode::deserialize::<CheckResult>(&encoded[..]) {
Ok(CheckResult {
metadata: CacheMetadata { mtime },
messages,
}) => (mtime, messages),
Err(e) => {
error!("Failed to deserialize encoded cache entry: {e:?}");
return None;
}
};
if FileTime::from_last_modification_time(metadata).unix_seconds() != mtime {
return None;
}
Some(messages)
}
/// Set a value in the cache.
pub fn set<P: AsRef<Path>>(
path: P,
metadata: &fs::Metadata,
settings: &Settings,
autofix: flags::Autofix,
messages: &[Message],
) {
let check_result = CheckResultRef {
metadata: &CacheMetadata {
mtime: FileTime::from_last_modification_time(metadata).unix_seconds(),
},
messages,
};
if let Err(e) = write_sync(
&settings.cache_dir,
cache_key(path, settings, autofix),
&bincode::serialize(&check_result).unwrap(),
) {
error!("Failed to write to cache: {e:?}");
}
}

View File

@@ -2,18 +2,21 @@ use std::path::{Path, PathBuf};
use clap::{command, Parser};
use regex::Regex;
use rustc_hash::FxHashMap;
use crate::logging::LogLevel;
use crate::registry::{RuleCode, RuleCodePrefix};
use crate::resolver::ConfigProcessor;
use crate::settings::types::{
use ruff::logging::LogLevel;
use ruff::registry::{RuleCode, RuleCodePrefix};
use ruff::resolver::ConfigProcessor;
use ruff::settings::types::{
FilePattern, PatternPrefixPair, PerFileIgnore, PythonVersion, SerializationFormat,
};
use crate::{fs, mccabe};
use ruff::{fs, mccabe};
use rustc_hash::FxHashMap;
#[derive(Debug, Parser)]
#[command(author, about = "Ruff: An extremely fast Python linter.")]
#[command(
author,
name = "ruff",
about = "Ruff: An extremely fast Python linter."
)]
#[command(version)]
#[allow(clippy::struct_excessive_bools)]
pub struct Cli {
@@ -346,7 +349,7 @@ pub struct Overrides {
}
impl ConfigProcessor for &Overrides {
fn process_config(&self, config: &mut crate::settings::configuration::Configuration) {
fn process_config(&self, config: &mut ruff::settings::configuration::Configuration) {
if let Some(cache_dir) = &self.cache_dir {
config.cache_dir = Some(cache_dir.clone());
}

View File

@@ -11,22 +11,22 @@ use log::{debug, error};
use path_absolutize::path_dedot;
#[cfg(not(target_family = "wasm"))]
use rayon::prelude::*;
use rustpython_ast::Location;
use ruff::cache::CACHE_DIR_NAME;
use ruff::linter::add_noqa_to_path;
use ruff::logging::LogLevel;
use ruff::message::{Location, Message};
use ruff::registry::RuleCode;
use ruff::resolver::{FileDiscovery, PyprojectDiscovery};
use ruff::settings::flags;
use ruff::settings::types::SerializationFormat;
use ruff::{fix, fs, packaging, resolver, violations, warn_user_once};
use serde::Serialize;
use walkdir::WalkDir;
use crate::cache::CACHE_DIR_NAME;
use crate::cache;
use crate::cli::Overrides;
use crate::diagnostics::{lint_path, lint_stdin, Diagnostics};
use crate::iterators::par_iter;
use crate::linter::add_noqa_to_path;
use crate::logging::LogLevel;
use crate::message::Message;
use crate::registry::RuleCode;
use crate::resolver::{FileDiscovery, PyprojectDiscovery};
use crate::settings::flags;
use crate::settings::types::SerializationFormat;
use crate::{cache, fix, fs, packaging, resolver, violations, warn_user_once};
/// Run the linter over a collection of files.
pub fn run(
@@ -288,7 +288,7 @@ struct Explanation<'a> {
}
/// Explain a `RuleCode` to the user.
pub fn explain(code: &RuleCode, format: &SerializationFormat) -> Result<()> {
pub fn explain(code: &RuleCode, format: SerializationFormat) -> Result<()> {
match format {
SerializationFormat::Text | SerializationFormat::Grouped => {
println!(


@@ -7,12 +7,13 @@ use std::path::Path;
use anyhow::Result;
use log::debug;
use ruff::linter::{lint_fix, lint_only};
use ruff::message::Message;
use ruff::settings::{flags, Settings};
use ruff::{fix, fs};
use similar::TextDiff;
use crate::linter::{lint_fix, lint_only};
use crate::message::Message;
use crate::settings::{flags, Settings};
use crate::{cache, fix, fs};
use crate::cache;
#[derive(Debug, Default)]
pub struct Diagnostics {

ruff_cli/src/lib.rs (new file)

@@ -0,0 +1,6 @@
#![allow(clippy::must_use_candidate, dead_code)]
mod cli;
// used by ruff_dev::generate_cli_help
pub use cli::Cli;


@@ -1,25 +1,40 @@
#![allow(
clippy::match_same_arms,
clippy::missing_errors_doc,
clippy::module_name_repetitions,
clippy::too_many_lines
)]
#![forbid(unsafe_code)]
use std::io::{self};
use std::path::{Path, PathBuf};
use std::process::ExitCode;
use std::sync::mpsc::channel;
use ::ruff::cli::{extract_log_level, Cli, Overrides};
use ::ruff::logging::{set_up_logging, LogLevel};
use ::ruff::printer::{Printer, Violations};
use ::ruff::resolver::{
resolve_settings_with_processor, ConfigProcessor, FileDiscovery, PyprojectDiscovery, Relativity,
};
use ::ruff::settings::configuration::Configuration;
use ::ruff::settings::types::SerializationFormat;
use ::ruff::settings::{pyproject, Settings};
#[cfg(feature = "update-informer")]
use ::ruff::updates;
use ::ruff::{commands, fix, warn_user_once};
use ::ruff::{fix, fs, warn_user_once};
use anyhow::Result;
use clap::{CommandFactory, Parser};
use cli::{extract_log_level, Cli, Overrides};
use colored::Colorize;
use notify::{recommended_watcher, RecursiveMode, Watcher};
use path_absolutize::path_dedot;
use printer::{Printer, Violations};
mod cache;
mod cli;
mod commands;
mod diagnostics;
mod iterators;
mod printer;
#[cfg(all(feature = "update-informer"))]
pub mod updates;
/// Resolve the relevant settings strategy and defaults for the current
/// invocation.
@@ -72,7 +87,7 @@ fn resolve(
}
}
pub(crate) fn inner_main() -> Result<ExitCode> {
pub fn main() -> Result<ExitCode> {
// Extract command-line arguments.
let (cli, overrides) = Cli::parse().partition();
let log_level = extract_log_level(&cli);
@@ -130,7 +145,7 @@ pub(crate) fn inner_main() -> Result<ExitCode> {
};
if let Some(code) = cli.explain {
commands::explain(&code, &format)?;
commands::explain(&code, format)?;
return Ok(ExitCode::SUCCESS);
}
if cli.show_settings {
@@ -196,7 +211,7 @@ pub(crate) fn inner_main() -> Result<ExitCode> {
cache.into(),
fix::FixMode::None,
)?;
printer.write_continuously(&messages)?;
printer.write_continuously(&messages);
// Configure the file watcher.
let (tx, rx) = channel();
@@ -226,7 +241,7 @@ pub(crate) fn inner_main() -> Result<ExitCode> {
cache.into(),
fix::FixMode::None,
)?;
printer.write_continuously(&messages)?;
printer.write_continuously(&messages);
}
}
Err(err) => return Err(err.into()),
@@ -244,7 +259,7 @@ pub(crate) fn inner_main() -> Result<ExitCode> {
// Generate lint violations.
let diagnostics = if is_stdin {
commands::run_stdin(
cli.stdin_filename.as_deref(),
cli.stdin_filename.map(fs::normalize_path).as_deref(),
&pyproject_strategy,
&file_strategy,
&overrides,


@@ -6,17 +6,16 @@ use annotate_snippets::snippet::{Annotation, AnnotationType, Slice, Snippet, Sou
use anyhow::Result;
use colored::Colorize;
use itertools::iterate;
use rustpython_parser::ast::Location;
use ruff::fs::relativize_path;
use ruff::logging::LogLevel;
use ruff::message::{Location, Message};
use ruff::registry::RuleCode;
use ruff::settings::types::SerializationFormat;
use ruff::{fix, notify_user};
use serde::Serialize;
use serde_json::json;
use crate::diagnostics::Diagnostics;
use crate::fs::relativize_path;
use crate::logging::LogLevel;
use crate::message::Message;
use crate::registry::RuleCode;
use crate::settings::types::SerializationFormat;
use crate::{fix, notify_user};
/// Enum to control whether lint violations are shown to the user.
pub enum Violations {
@@ -282,9 +281,9 @@ impl<'a> Printer<'a> {
Ok(())
}
pub fn write_continuously(&self, diagnostics: &Diagnostics) -> Result<()> {
pub fn write_continuously(&self, diagnostics: &Diagnostics) {
if matches!(self.log_level, LogLevel::Silent) {
return Ok(());
return;
}
if self.log_level >= &LogLevel::Default {
@@ -302,10 +301,9 @@ impl<'a> Printer<'a> {
print_message(message);
}
}
Ok(())
}
#[allow(clippy::unused_self)]
pub fn clear_screen(&self) -> Result<()> {
#[cfg(not(target_family = "wasm"))]
clearscreen::clear()?;


@@ -9,7 +9,7 @@ use std::time::Duration;
use std::{fs, process, str};
use anyhow::{anyhow, Context, Result};
use assert_cmd::{crate_name, Command};
use assert_cmd::Command;
use itertools::Itertools;
use log::info;
use ruff::logging::{set_up_logging, LogLevel};
@@ -24,6 +24,8 @@ struct Blackd {
client: ureq::Agent,
}
const BIN_NAME: &str = "ruff";
impl Blackd {
pub fn new() -> Result<Self> {
// Get free TCP port to run on
@@ -104,7 +106,7 @@ fn run_test(path: &Path, blackd: &Blackd, ruff_args: &[&str]) -> Result<()> {
let input = fs::read(path)?;
// Step 1: Run `ruff` on the input.
let step_1 = &Command::cargo_bin(crate_name!())?
let step_1 = &Command::cargo_bin(BIN_NAME)?
.args(ruff_args)
.write_stdin(input)
.assert()
@@ -121,7 +123,7 @@ fn run_test(path: &Path, blackd: &Blackd, ruff_args: &[&str]) -> Result<()> {
let step_2_output = blackd.check(&step_1_output)?;
// Step 3: Re-run `ruff` on the input.
let step_3 = &Command::cargo_bin(crate_name!())?
let step_3 = &Command::cargo_bin(BIN_NAME)?
.args(ruff_args)
.write_stdin(step_2_output.clone())
.assert();


@@ -3,11 +3,14 @@
use std::str;
use anyhow::Result;
use assert_cmd::{crate_name, Command};
use assert_cmd::Command;
use path_absolutize::path_dedot;
const BIN_NAME: &str = "ruff";
#[test]
fn test_stdin_success() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
cmd.args(["-", "--format", "text"])
.write_stdin("")
.assert()
@@ -17,7 +20,7 @@ fn test_stdin_success() -> Result<()> {
#[test]
fn test_stdin_error() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
let output = cmd
.args(["-", "--format", "text"])
.write_stdin("import os\n")
@@ -33,7 +36,7 @@ fn test_stdin_error() -> Result<()> {
#[test]
fn test_stdin_filename() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
let output = cmd
.args(["-", "--format", "text", "--stdin-filename", "F401.py"])
.write_stdin("import os\n")
@@ -49,7 +52,7 @@ fn test_stdin_filename() -> Result<()> {
#[test]
fn test_stdin_json() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
let output = cmd
.args(["-", "--format", "json", "--stdin-filename", "F401.py"])
.write_stdin("import os\n")
@@ -57,41 +60,44 @@ fn test_stdin_json() -> Result<()> {
.failure();
assert_eq!(
str::from_utf8(&output.get_output().stdout)?,
r#"[
{
format!(
r#"[
{{
"code": "F401",
"message": "`os` imported but unused",
"fix": {
"fix": {{
"content": "",
"message": "Remove unused import: `os`",
"location": {
"location": {{
"row": 1,
"column": 0
},
"end_location": {
}},
"end_location": {{
"row": 2,
"column": 0
}
},
"location": {
}}
}},
"location": {{
"row": 1,
"column": 8
},
"end_location": {
}},
"end_location": {{
"row": 1,
"column": 10
},
"filename": "F401.py"
}
}},
"filename": "{}/F401.py"
}}
]
"#
"#,
path_dedot::CWD.to_str().unwrap()
)
);
Ok(())
}
#[test]
fn test_stdin_autofix() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
let output = cmd
.args(["-", "--format", "text", "--fix"])
.write_stdin("import os\nimport sys\n\nprint(sys.version)\n")
@@ -106,7 +112,7 @@ fn test_stdin_autofix() -> Result<()> {
#[test]
fn test_stdin_autofix_when_not_fixable_should_still_print_contents() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
let output = cmd
.args(["-", "--format", "text", "--fix"])
.write_stdin("import os\nimport sys\n\nif (1, 2):\n print(sys.version)\n")
@@ -121,7 +127,7 @@ fn test_stdin_autofix_when_not_fixable_should_still_print_contents() -> Result<(
#[test]
fn test_stdin_autofix_when_no_issues_should_still_print_contents() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
let output = cmd
.args(["-", "--format", "text", "--fix"])
.write_stdin("import sys\n\nprint(sys.version)\n")
@@ -136,7 +142,7 @@ fn test_stdin_autofix_when_no_issues_should_still_print_contents() -> Result<()>
#[test]
fn test_show_source() -> Result<()> {
let mut cmd = Command::cargo_bin(crate_name!())?;
let mut cmd = Command::cargo_bin(BIN_NAME)?;
let output = cmd
.args(["-", "--format", "text", "--show-source"])
.write_stdin("l = 1")


@@ -1,6 +1,6 @@
[package]
name = "ruff_dev"
version = "0.0.220"
version = "0.0.221"
edition = "2021"
[lib]
@@ -14,6 +14,7 @@ itertools = { version = "0.10.5" }
libcst = { git = "https://github.com/charliermarsh/LibCST", rev = "f2f0b7a487a8725d161fe8b3ed73a6758b21e177" }
once_cell = { version = "1.16.0" }
ruff = { path = ".." }
ruff_cli = { path = "../ruff_cli" }
rustpython-ast = { features = ["unparse"], git = "https://github.com/RustPython/RustPython.git", rev = "d532160333ffeb6dbeca2c2728c2391cd1e53b7f" }
rustpython-common = { git = "https://github.com/RustPython/RustPython.git", rev = "d532160333ffeb6dbeca2c2728c2391cd1e53b7f" }
rustpython-parser = { features = ["lalrpop"], git = "https://github.com/RustPython/RustPython.git", rev = "d532160333ffeb6dbeca2c2728c2391cd1e53b7f" }


@@ -2,7 +2,7 @@
use anyhow::Result;
use clap::{Args, CommandFactory};
use ruff::cli::Cli as MainCli;
use ruff_cli::Cli as MainCli;
use crate::utils::replace_readme_section;


@@ -1,6 +1,6 @@
[package]
name = "ruff_macros"
version = "0.0.220"
version = "0.0.221"
edition = "2021"
[lib]


@@ -1,10 +1,8 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::Expr;
use crate::ast::helpers::{
collect_call_paths, dealias_call_path, match_call_path, to_module_and_member,
};
use crate::ast::helpers::to_call_path;
use crate::ast::types::{Scope, ScopeKind};
use crate::checkers::ast::Checker;
const CLASS_METHODS: [&str; 3] = ["__new__", "__init_subclass__", "__class_getitem__"];
const METACLASS_BASES: [(&str, &str); 2] = [("", "type"), ("abc", "ABCMeta")];
@@ -18,11 +16,10 @@ pub enum FunctionType {
/// Classify a function based on its scope, name, and decorators.
pub fn classify(
checker: &Checker,
scope: &Scope,
name: &str,
decorator_list: &[Expr],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
classmethod_decorators: &[String],
staticmethod_decorators: &[String],
) -> FunctionType {
@@ -33,17 +30,18 @@ pub fn classify(
if CLASS_METHODS.contains(&name)
|| scope.bases.iter().any(|expr| {
// The class itself extends a known metaclass, so all methods are class methods.
let call_path = dealias_call_path(collect_call_paths(expr), import_aliases);
METACLASS_BASES
.iter()
.any(|(module, member)| match_call_path(&call_path, module, member, from_imports))
checker.resolve_call_path(expr).map_or(false, |call_path| {
METACLASS_BASES
.iter()
.any(|(module, member)| call_path == [*module, *member])
})
})
|| decorator_list.iter().any(|expr| {
// The method is decorated with a class method decorator (like `@classmethod`).
let call_path = dealias_call_path(collect_call_paths(expr), import_aliases);
classmethod_decorators.iter().any(|decorator| {
let (module, member) = to_module_and_member(decorator);
match_call_path(&call_path, module, member, from_imports)
checker.resolve_call_path(expr).map_or(false, |call_path| {
classmethod_decorators
.iter()
.any(|decorator| call_path == to_call_path(decorator))
})
})
{
@@ -51,10 +49,10 @@ pub fn classify(
} else if decorator_list.iter().any(|expr| {
// The method is decorated with a static method decorator (like
// `@staticmethod`).
let call_path = dealias_call_path(collect_call_paths(expr), import_aliases);
staticmethod_decorators.iter().any(|decorator| {
let (module, member) = to_module_and_member(decorator);
match_call_path(&call_path, module, member, from_imports)
checker.resolve_call_path(expr).map_or(false, |call_path| {
staticmethod_decorators
.iter()
.any(|decorator| call_path == to_call_path(decorator))
})
}) {
FunctionType::StaticMethod


@@ -12,6 +12,7 @@ use rustpython_parser::lexer::Tok;
use rustpython_parser::token::StringKind;
use crate::ast::types::{Binding, BindingKind, Range};
use crate::checkers::ast::Checker;
use crate::source_code::{Generator, Locator, Stylist};
/// Create an `Expr` with default location from an `ExprKind`.
@@ -54,150 +55,42 @@ fn collect_call_path_inner<'a>(expr: &'a Expr, parts: &mut Vec<&'a str>) {
}
}
/// Convert an `Expr` to its call path (like `List`, or `typing.List`).
pub fn compose_call_path(expr: &Expr) -> Option<String> {
let segments = collect_call_paths(expr);
if segments.is_empty() {
None
} else {
Some(segments.join("."))
}
}
/// Convert an `Expr` to its call path segments (like ["typing", "List"]).
pub fn collect_call_paths(expr: &Expr) -> Vec<&str> {
pub fn collect_call_path(expr: &Expr) -> Vec<&str> {
let mut segments = vec![];
collect_call_path_inner(expr, &mut segments);
segments
}
/// Rewrite any import aliases on a call path.
pub fn dealias_call_path<'a>(
call_path: Vec<&'a str>,
import_aliases: &FxHashMap<&str, &'a str>,
) -> Vec<&'a str> {
if let Some(head) = call_path.first() {
if let Some(origin) = import_aliases.get(head) {
let tail = &call_path[1..];
let mut call_path: Vec<&str> = vec![];
call_path.extend(origin.split('.'));
call_path.extend(tail);
call_path
} else {
call_path
}
/// Convert an `Expr` to its call path (like `List`, or `typing.List`).
pub fn compose_call_path(expr: &Expr) -> Option<String> {
let call_path = collect_call_path(expr);
if call_path.is_empty() {
None
} else {
call_path
Some(format_call_path(&call_path))
}
}
/// Return `true` if the `Expr` is a reference to `${module}.${target}`.
///
/// Useful for, e.g., ensuring that a `Union` reference represents
/// `typing.Union`.
pub fn match_module_member(
expr: &Expr,
module: &str,
member: &str,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
match_call_path(
&dealias_call_path(collect_call_paths(expr), import_aliases),
module,
member,
from_imports,
)
}
/// Return `true` if the `call_path` is a reference to `${module}.${target}`.
///
/// Optimized version of `match_module_member` for pre-computed call paths.
pub fn match_call_path(
call_path: &[&str],
module: &str,
member: &str,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
) -> bool {
// If we have no segments, we can't ever match.
let num_segments = call_path.len();
if num_segments == 0 {
return false;
}
// If the last segment doesn't match the member, we can't ever match.
if call_path[num_segments - 1] != member {
return false;
}
// We now only need the module path, so throw out the member name.
let call_path = &call_path[..num_segments - 1];
let num_segments = call_path.len();
// Case (1): It's a builtin (like `list`).
// Case (2a): We imported from the parent (`from typing.re import Match`,
// `Match`).
// Case (2b): We imported star from the parent (`from typing.re import *`,
// `Match`).
if num_segments == 0 {
module.is_empty()
|| from_imports.get(module).map_or(false, |imports| {
imports.contains(member) || imports.contains("*")
})
/// Format a call path for display.
pub fn format_call_path(call_path: &[&str]) -> String {
if call_path
.first()
.expect("Unable to format empty call path")
.is_empty()
{
call_path[1..].join(".")
} else {
let components: Vec<&str> = module.split('.').collect();
// Case (3a): it's a fully qualified call path (`import typing`,
// `typing.re.Match`). Case (3b): it's a fully qualified call path (`import
// typing.re`, `typing.re.Match`).
if components == call_path {
return true;
}
// Case (4): We imported from the grandparent (`from typing import re`,
// `re.Match`)
let num_matches = (0..components.len())
.take(num_segments)
.take_while(|i| components[components.len() - 1 - i] == call_path[num_segments - 1 - i])
.count();
if num_matches > 0 {
let cut = components.len() - num_matches;
// TODO(charlie): Rewrite to avoid this allocation.
let module = components[..cut].join(".");
let member = components[cut];
if from_imports
.get(&module.as_str())
.map_or(false, |imports| imports.contains(member))
{
return true;
}
}
false
call_path.join(".")
}
}
/// Return `true` if the `Expr` contains a reference to `${module}.${target}`.
pub fn contains_call_path(
expr: &Expr,
module: &str,
member: &str,
import_aliases: &FxHashMap<&str, &str>,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
) -> bool {
pub fn contains_call_path(checker: &Checker, expr: &Expr, target: &[&str]) -> bool {
any_over_expr(expr, &|expr| {
let call_path = collect_call_paths(expr);
if !call_path.is_empty() {
if match_call_path(
&dealias_call_path(call_path, import_aliases),
module,
member,
from_imports,
) {
return true;
}
}
false
checker
.resolve_call_path(expr)
.map_or(false, |call_path| call_path == target)
})
}
@@ -389,13 +282,13 @@ pub fn extract_handler_names(handlers: &[Excepthandler]) -> Vec<Vec<&str>> {
if let Some(type_) = type_ {
if let ExprKind::Tuple { elts, .. } = &type_.node {
for type_ in elts {
let call_path = collect_call_paths(type_);
let call_path = collect_call_path(type_);
if !call_path.is_empty() {
handler_names.push(call_path);
}
}
} else {
let call_path = collect_call_paths(type_);
let call_path = collect_call_path(type_);
if !call_path.is_empty() {
handler_names.push(call_path);
}
@@ -458,12 +351,37 @@ pub fn format_import_from(level: Option<&usize>, module: Option<&str>) -> String
module_name
}
/// Format the member reference name for a relative import.
pub fn format_import_from_member(
level: Option<&usize>,
module: Option<&str>,
member: &str,
) -> String {
let mut full_name = String::with_capacity(
level.map_or(0, |level| *level)
+ module.as_ref().map_or(0, |module| module.len())
+ 1
+ member.len(),
);
if let Some(level) = level {
for _ in 0..*level {
full_name.push('.');
}
}
if let Some(module) = module {
full_name.push_str(module);
full_name.push('.');
}
full_name.push_str(member);
full_name
}
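The new `format_import_from_member` helper above pre-sizes its `String` and then appends the relative-import dots, optional module, and member. A self-contained sketch of the same logic, with usage as assertions (names taken from the diff; the function is reproduced here standalone for illustration):

```rust
/// Build the full member name for a (possibly relative) `from` import,
/// mirroring the `format_import_from_member` helper added in the diff.
fn format_import_from_member(
    level: Option<&usize>,
    module: Option<&str>,
    member: &str,
) -> String {
    // Reserve capacity up front: one '.' per relative level, the module
    // and its trailing '.', and the member itself.
    let mut full_name = String::with_capacity(
        level.map_or(0, |level| *level)
            + module.as_ref().map_or(0, |module| module.len())
            + 1
            + member.len(),
    );
    if let Some(level) = level {
        for _ in 0..*level {
            full_name.push('.');
        }
    }
    if let Some(module) = module {
        full_name.push_str(module);
        full_name.push('.');
    }
    full_name.push_str(member);
    full_name
}

fn main() {
    // `from ..foo import bar` -> "..foo.bar"
    assert_eq!(
        format_import_from_member(Some(&2), Some("foo"), "bar"),
        "..foo.bar"
    );
    // `from . import bar` -> ".bar"
    assert_eq!(format_import_from_member(Some(&1), None, "bar"), ".bar");
}
```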
/// Split a target string (like `typing.List`) into (`typing`, `List`).
pub fn to_module_and_member(target: &str) -> (&str, &str) {
if let Some(index) = target.rfind('.') {
(&target[..index], &target[index + 1..])
pub fn to_call_path(target: &str) -> Vec<&str> {
if target.contains('.') {
target.split('.').collect()
} else {
("", target)
vec!["", target]
}
}
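The hunk above replaces `to_module_and_member` (which split `typing.List` into a `("typing", "List")` pair) with `to_call_path`, which returns the full segment vector and uses an empty leading segment to mark builtins. A standalone sketch of the new behavior:

```rust
/// Split a dotted target like `typing.List` into call-path segments,
/// as in the `to_call_path` helper shown in the diff.
fn to_call_path(target: &str) -> Vec<&str> {
    if target.contains('.') {
        target.split('.').collect()
    } else {
        // Builtins get an empty leading segment: `list` -> ["", "list"].
        vec!["", target]
    }
}

fn main() {
    assert_eq!(to_call_path("typing.List"), vec!["typing", "List"]);
    assert_eq!(to_call_path("list"), vec!["", "list"]);
}
```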
@@ -778,6 +696,7 @@ impl<'a> SimpleCallArgs<'a> {
}
/// Get the number of positional and keyword arguments used.
#[allow(clippy::len_without_is_empty)]
pub fn len(&self) -> usize {
self.args.len() + self.kwargs.len()
}
@@ -786,159 +705,13 @@ impl<'a> SimpleCallArgs<'a> {
#[cfg(test)]
mod tests {
use anyhow::Result;
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::Location;
use rustpython_parser::parser;
use crate::ast::helpers::{
else_range, identifier_range, match_module_member, match_trailing_content,
};
use crate::ast::helpers::{else_range, identifier_range, match_trailing_content};
use crate::ast::types::Range;
use crate::source_code::Locator;
#[test]
fn builtin() -> Result<()> {
let expr = parser::parse_expression("list", "<filename>")?;
assert!(match_module_member(
&expr,
"",
"list",
&FxHashMap::default(),
&FxHashMap::default(),
));
Ok(())
}
#[test]
fn fully_qualified() -> Result<()> {
let expr = parser::parse_expression("typing.re.Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::default(),
&FxHashMap::default(),
));
Ok(())
}
#[test]
fn unimported() -> Result<()> {
let expr = parser::parse_expression("Match", "<filename>")?;
assert!(!match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::default(),
&FxHashMap::default(),
));
let expr = parser::parse_expression("re.Match", "<filename>")?;
assert!(!match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::default(),
&FxHashMap::default(),
));
Ok(())
}
#[test]
fn from_star() -> Result<()> {
let expr = parser::parse_expression("Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::from_iter([("typing.re", FxHashSet::from_iter(["*"]))]),
&FxHashMap::default()
));
Ok(())
}
#[test]
fn from_parent() -> Result<()> {
let expr = parser::parse_expression("Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::from_iter([("typing.re", FxHashSet::from_iter(["Match"]))]),
&FxHashMap::default()
));
Ok(())
}
#[test]
fn from_grandparent() -> Result<()> {
let expr = parser::parse_expression("re.Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::from_iter([("typing", FxHashSet::from_iter(["re"]))]),
&FxHashMap::default()
));
let expr = parser::parse_expression("match.Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re.match",
"Match",
&FxHashMap::from_iter([("typing.re", FxHashSet::from_iter(["match"]))]),
&FxHashMap::default()
));
let expr = parser::parse_expression("re.match.Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re.match",
"Match",
&FxHashMap::from_iter([("typing", FxHashSet::from_iter(["re"]))]),
&FxHashMap::default()
));
Ok(())
}
#[test]
fn from_alias() -> Result<()> {
let expr = parser::parse_expression("IMatch", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::from_iter([("typing.re", FxHashSet::from_iter(["Match"]))]),
&FxHashMap::from_iter([("IMatch", "Match")]),
));
Ok(())
}
#[test]
fn from_aliased_parent() -> Result<()> {
let expr = parser::parse_expression("t.Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::default(),
&FxHashMap::from_iter([("t", "typing.re")]),
));
Ok(())
}
#[test]
fn from_aliased_grandparent() -> Result<()> {
let expr = parser::parse_expression("t.re.Match", "<filename>")?;
assert!(match_module_member(
&expr,
"typing.re",
"Match",
&FxHashMap::default(),
&FxHashMap::from_iter([("t", "typing")]),
));
Ok(())
}
#[test]
fn trailing_content() -> Result<()> {
let contents = "x = 1";


@@ -104,7 +104,7 @@ impl<'a> Scope<'a> {
}
#[derive(Clone, Debug)]
pub enum BindingKind {
pub enum BindingKind<'a> {
Annotation,
Argument,
Assignment,
@@ -118,14 +118,14 @@ pub enum BindingKind {
Export(Vec<String>),
FutureImportation,
StarImportation(Option<usize>, Option<String>),
Importation(String, String),
FromImportation(String, String),
SubmoduleImportation(String, String),
Importation(&'a str, &'a str),
FromImportation(&'a str, String),
SubmoduleImportation(&'a str, &'a str),
}
#[derive(Clone, Debug)]
pub struct Binding<'a> {
pub kind: BindingKind,
pub kind: BindingKind<'a>,
pub range: Range,
/// The statement in which the `Binding` was defined.
pub source: Option<RefEquality<'a, Stmt>>,
@@ -168,19 +168,26 @@ impl<'a> Binding<'a> {
pub fn redefines(&self, existing: &'a Binding) -> bool {
match &self.kind {
BindingKind::Importation(_, full_name) | BindingKind::FromImportation(_, full_name) => {
if let BindingKind::SubmoduleImportation(_, existing_full_name) = &existing.kind {
return full_name == existing_full_name;
BindingKind::Importation(.., full_name) => {
if let BindingKind::SubmoduleImportation(.., existing) = &existing.kind {
return full_name == existing;
}
}
BindingKind::SubmoduleImportation(_, full_name) => {
if let BindingKind::Importation(_, existing_full_name)
| BindingKind::FromImportation(_, existing_full_name)
| BindingKind::SubmoduleImportation(_, existing_full_name) = &existing.kind
{
return full_name == existing_full_name;
BindingKind::FromImportation(.., full_name) => {
if let BindingKind::SubmoduleImportation(.., existing) = &existing.kind {
return full_name == existing;
}
}
BindingKind::SubmoduleImportation(.., full_name) => match &existing.kind {
BindingKind::Importation(.., existing)
| BindingKind::SubmoduleImportation(.., existing) => {
return full_name == existing;
}
BindingKind::FromImportation(.., existing) => {
return full_name == existing;
}
_ => {}
},
BindingKind::Annotation => {
return false;
}


@@ -210,14 +210,17 @@ pub fn remove_unused_imports<'a>(
Some(SmallStatement::Import(import_body)) => (&mut import_body.names, None),
Some(SmallStatement::ImportFrom(import_body)) => {
if let ImportNames::Aliases(names) = &mut import_body.names {
(names, import_body.module.as_ref())
(
names,
Some((&import_body.relative, import_body.module.as_ref())),
)
} else if let ImportNames::Star(..) = &import_body.names {
// Special-case: if the import is a `from ... import *`, then we delete the
// entire statement.
let mut found_star = false;
for unused_import in unused_imports {
let full_name = match import_body.module.as_ref() {
Some(module_name) => format!("{}.*", compose_module_path(module_name),),
Some(module_name) => format!("{}.*", compose_module_path(module_name)),
None => "*".to_string(),
};
if unused_import == full_name {
@@ -246,11 +249,25 @@ pub fn remove_unused_imports<'a>(
for unused_import in unused_imports {
let alias_index = aliases.iter().position(|alias| {
let full_name = match import_module {
Some(module_name) => format!(
"{}.{}",
compose_module_path(module_name),
compose_module_path(&alias.name)
),
Some((relative, module)) => {
let module = module.map(compose_module_path);
let member = compose_module_path(&alias.name);
let mut full_name = String::with_capacity(
relative.len()
+ module.as_ref().map_or(0, std::string::String::len)
+ member.len()
+ 1,
);
for _ in 0..relative.len() {
full_name.push('.');
}
if let Some(module) = module {
full_name.push_str(&module);
full_name.push('.');
}
full_name.push_str(&member);
full_name
}
None => compose_module_path(&alias.name),
};
full_name == unused_import


@@ -1,134 +1,9 @@
#![cfg_attr(target_family = "wasm", allow(dead_code))]
use std::collections::hash_map::DefaultHasher;
use std::fs;
use std::hash::{Hash, Hasher};
use std::io::Write;
use std::path::{Path, PathBuf};
use anyhow::Result;
use filetime::FileTime;
use log::error;
use path_absolutize::Absolutize;
use serde::{Deserialize, Serialize};
use crate::message::Message;
use crate::settings::{flags, Settings};
pub const CACHE_DIR_NAME: &str = ".ruff_cache";
const CARGO_PKG_VERSION: &str = env!("CARGO_PKG_VERSION");
#[derive(Serialize, Deserialize)]
struct CacheMetadata {
mtime: i64,
}
#[derive(Serialize)]
struct CheckResultRef<'a> {
metadata: &'a CacheMetadata,
messages: &'a [Message],
}
#[derive(Deserialize)]
struct CheckResult {
metadata: CacheMetadata,
messages: Vec<Message>,
}
/// Return the cache directory for a given project root. Defers to the
/// `RUFF_CACHE_DIR` environment variable, if set.
pub fn cache_dir(project_root: &Path) -> PathBuf {
project_root.join(CACHE_DIR_NAME)
}
fn content_dir() -> &'static Path {
Path::new("content")
}
fn cache_key<P: AsRef<Path>>(path: P, settings: &Settings, autofix: flags::Autofix) -> u64 {
let mut hasher = DefaultHasher::new();
CARGO_PKG_VERSION.hash(&mut hasher);
path.as_ref().absolutize().unwrap().hash(&mut hasher);
settings.hash(&mut hasher);
autofix.hash(&mut hasher);
hasher.finish()
}
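The `cache_key` helper above folds the package version, the absolutized path, the `Settings`, and the autofix flag into one `u64`, so changing any of them invalidates the cached entry. A minimal sketch of the same keying scheme (simplified: the real helper hashes the full `Settings` struct and absolutizes the path; strings stand in here):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Combine everything that should invalidate a cache entry into one key.
fn cache_key(version: &str, path: &str, autofix: bool) -> u64 {
    let mut hasher = DefaultHasher::new();
    version.hash(&mut hasher);
    path.hash(&mut hasher);
    autofix.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let a = cache_key("0.0.221", "src/app.py", false);
    let b = cache_key("0.0.221", "src/app.py", true);
    // Toggling autofix changes the key, so stale entries are never reused.
    assert_ne!(a, b);
    // Identical inputs always produce the same key.
    assert_eq!(a, cache_key("0.0.221", "src/app.py", false));
}
```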
#[allow(dead_code)]
/// Initialize the cache at the specified `Path`.
pub fn init(path: &Path) -> Result<()> {
// Create the cache directories.
fs::create_dir_all(path.join(content_dir()))?;
// Add the CACHEDIR.TAG.
if !cachedir::is_tagged(path)? {
cachedir::add_tag(path)?;
}
// Add the .gitignore.
let gitignore_path = path.join(".gitignore");
if !gitignore_path.exists() {
let mut file = fs::File::create(gitignore_path)?;
file.write_all(b"*")?;
}
Ok(())
}
fn write_sync(cache_dir: &Path, key: u64, value: &[u8]) -> Result<(), std::io::Error> {
fs::write(
cache_dir.join(content_dir()).join(format!("{key:x}")),
value,
)
}
fn read_sync(cache_dir: &Path, key: u64) -> Result<Vec<u8>, std::io::Error> {
fs::read(cache_dir.join(content_dir()).join(format!("{key:x}")))
}
/// Get a value from the cache.
pub fn get<P: AsRef<Path>>(
path: P,
metadata: &fs::Metadata,
settings: &Settings,
autofix: flags::Autofix,
) -> Option<Vec<Message>> {
let encoded = read_sync(&settings.cache_dir, cache_key(path, settings, autofix)).ok()?;
let (mtime, messages) = match bincode::deserialize::<CheckResult>(&encoded[..]) {
Ok(CheckResult {
metadata: CacheMetadata { mtime },
messages,
}) => (mtime, messages),
Err(e) => {
error!("Failed to deserialize encoded cache entry: {e:?}");
return None;
}
};
if FileTime::from_last_modification_time(metadata).unix_seconds() != mtime {
return None;
}
Some(messages)
}
/// Set a value in the cache.
pub fn set<P: AsRef<Path>>(
path: P,
metadata: &fs::Metadata,
settings: &Settings,
autofix: flags::Autofix,
messages: &[Message],
) {
let check_result = CheckResultRef {
metadata: &CacheMetadata {
mtime: FileTime::from_last_modification_time(metadata).unix_seconds(),
},
messages,
};
if let Err(e) = write_sync(
&settings.cache_dir,
cache_key(path, settings, autofix),
&bincode::serialize(&check_result).unwrap(),
) {
error!("Failed to write to cache: {e:?}");
}
}


@@ -14,9 +14,7 @@ use rustpython_parser::ast::{
};
use rustpython_parser::parser;
use crate::ast::helpers::{
binding_range, collect_call_paths, dealias_call_path, extract_handler_names, match_call_path,
};
use crate::ast::helpers::{binding_range, collect_call_path, extract_handler_names};
use crate::ast::operations::extract_all_names;
use crate::ast::relocate::relocate_expr;
use crate::ast::types::{
@@ -63,13 +61,10 @@ pub struct Checker<'a> {
// Computed diagnostics.
pub(crate) diagnostics: Vec<Diagnostic>,
// Function and class definition tracking (e.g., for docstring enforcement).
definitions: Vec<(Definition<'a>, Visibility)>,
definitions: Vec<(Definition<'a>, Visibility, DeferralContext<'a>)>,
// Edit tracking.
// TODO(charlie): Instead of exposing deletions, wrap in a public API.
pub(crate) deletions: FxHashSet<RefEquality<'a, Stmt>>,
// Import tracking.
pub(crate) from_imports: FxHashMap<&'a str, FxHashSet<&'a str>>,
pub(crate) import_aliases: FxHashMap<&'a str, &'a str>,
// Retain all scopes and parent nodes, along with a stack of indexes to track which are active
// at various points in time.
pub(crate) parents: Vec<RefEquality<'a, Stmt>>,
@@ -123,8 +118,6 @@ impl<'a> Checker<'a> {
diagnostics: vec![],
definitions: vec![],
deletions: FxHashSet::default(),
from_imports: FxHashMap::default(),
import_aliases: FxHashMap::default(),
parents: vec![],
depths: FxHashMap::default(),
child_to_parent: FxHashMap::default(),
@@ -167,28 +160,28 @@ impl<'a> Checker<'a> {
/// Return `true` if the `Expr` is a reference to `typing.${target}`.
pub fn match_typing_expr(&self, expr: &Expr, target: &str) -> bool {
let call_path = dealias_call_path(collect_call_paths(expr), &self.import_aliases);
self.match_typing_call_path(&call_path, target)
self.resolve_call_path(expr).map_or(false, |call_path| {
self.match_typing_call_path(&call_path, target)
})
}
/// Return `true` if the call path is a reference to `typing.${target}`.
pub fn match_typing_call_path(&self, call_path: &[&str], target: &str) -> bool {
if match_call_path(call_path, "typing", target, &self.from_imports) {
if call_path == ["typing", target] {
return true;
}
if typing::TYPING_EXTENSIONS.contains(target) {
if match_call_path(call_path, "typing_extensions", target, &self.from_imports) {
if call_path == ["typing_extensions", target] {
return true;
}
}
if self
.settings
.typing_modules
.iter()
.any(|module| match_call_path(call_path, module, target, &self.from_imports))
{
if self.settings.typing_modules.iter().any(|module| {
let mut module = module.split('.').collect::<Vec<_>>();
module.push(target);
call_path == module.as_slice()
}) {
return true;
}
@@ -209,6 +202,54 @@ impl<'a> Checker<'a> {
})
}
pub fn resolve_call_path<'b>(&'a self, value: &'b Expr) -> Option<Vec<&'a str>>
where
'b: 'a,
{
let call_path = collect_call_path(value);
if let Some(head) = call_path.first() {
if let Some(binding) = self.find_binding(head) {
match &binding.kind {
BindingKind::Importation(.., name) => {
// Ignore relative imports.
if name.starts_with('.') {
return None;
}
let mut source_path: Vec<&str> = name.split('.').collect();
source_path.extend(call_path.iter().skip(1));
return Some(source_path);
}
BindingKind::SubmoduleImportation(name, ..) => {
// Ignore relative imports.
if name.starts_with('.') {
return None;
}
let mut source_path: Vec<&str> = name.split('.').collect();
source_path.extend(call_path.iter().skip(1));
return Some(source_path);
}
BindingKind::FromImportation(.., name) => {
// Ignore relative imports.
if name.starts_with('.') {
return None;
}
let mut source_path: Vec<&str> = name.split('.').collect();
source_path.extend(call_path.iter().skip(1));
return Some(source_path);
}
BindingKind::Builtin => {
let mut source_path: Vec<&str> = Vec::with_capacity(call_path.len() + 1);
source_path.push("");
source_path.extend(call_path);
return Some(source_path);
}
_ => {}
}
}
}
None
}
/// Return `true` if a `RuleCode` is disabled by a `noqa` directive.
pub fn is_ignored(&self, code: &RuleCode, lineno: usize) -> bool {
// TODO(charlie): `noqa` directives are mostly enforced in `check_lines.rs`.
@@ -415,13 +456,11 @@ where
if self.settings.enabled.contains(&RuleCode::N804) {
if let Some(diagnostic) =
pep8_naming::rules::invalid_first_argument_name_for_class_method(
self,
self.current_scope(),
name,
decorator_list,
args,
&self.from_imports,
&self.import_aliases,
&self.settings.pep8_naming,
)
{
self.diagnostics.push(diagnostic);
@@ -431,13 +470,11 @@ where
if self.settings.enabled.contains(&RuleCode::N805) {
if let Some(diagnostic) =
pep8_naming::rules::invalid_first_argument_name_for_method(
self,
self.current_scope(),
name,
decorator_list,
args,
&self.from_imports,
&self.import_aliases,
&self.settings.pep8_naming,
)
{
self.diagnostics.push(diagnostic);
@@ -706,10 +743,7 @@ where
self.add_binding(
name,
Binding {
kind: BindingKind::SubmoduleImportation(
name.to_string(),
full_name.to_string(),
),
kind: BindingKind::SubmoduleImportation(name, full_name),
used: None,
range: Range::from_located(alias),
source: Some(self.current_stmt().clone()),
@@ -728,10 +762,7 @@ where
self.add_binding(
name,
Binding {
kind: BindingKind::Importation(
name.to_string(),
full_name.to_string(),
),
kind: BindingKind::Importation(name, full_name),
// Treat explicit re-export as usage (e.g., `import applications
// as applications`).
used: if alias
@@ -788,12 +819,6 @@ where
}
if let Some(asname) = &alias.node.asname {
for alias in names {
if let Some(asname) = &alias.node.asname {
self.import_aliases.insert(asname, &alias.node.name);
}
}
let name = alias.node.name.split('.').last().unwrap();
if self.settings.enabled.contains(&RuleCode::N811) {
if let Some(diagnostic) =
@@ -890,25 +915,6 @@ where
module,
level,
} => {
// Track `import from` statements, to ensure that we can correctly attribute
// references like `from typing import Union`.
if self.settings.enabled.contains(&RuleCode::UP023) {
pyupgrade::rules::replace_c_element_tree(self, stmt);
}
if level.map(|level| level == 0).unwrap_or(true) {
if let Some(module) = module {
self.from_imports
.entry(module)
.or_insert_with(FxHashSet::default)
.extend(names.iter().map(|alias| alias.node.name.as_str()));
}
for alias in names {
if let Some(asname) = &alias.node.asname {
self.import_aliases.insert(asname, &alias.node.name);
}
}
}
if self.settings.enabled.contains(&RuleCode::E402) {
if self.seen_import_boundary && stmt.location.column() == 0 {
self.diagnostics.push(Diagnostic::new(
@@ -926,6 +932,9 @@ where
if self.settings.enabled.contains(&RuleCode::UP026) {
pyupgrade::rules::rewrite_mock_import(self, stmt);
}
if self.settings.enabled.contains(&RuleCode::UP023) {
pyupgrade::rules::replace_c_element_tree(self, stmt);
}
if self.settings.enabled.contains(&RuleCode::UP029) {
if let Some(module) = module.as_deref() {
pyupgrade::rules::unnecessary_builtin_import(self, stmt, module, names);
@@ -1057,15 +1066,16 @@ where
// be "foo.bar". Given `from foo import bar as baz`, `name` would be "baz"
// and `full_name` would be "foo.bar".
let name = alias.node.asname.as_ref().unwrap_or(&alias.node.name);
let full_name = match module {
None => alias.node.name.to_string(),
Some(parent) => format!("{parent}.{}", alias.node.name),
};
let full_name = helpers::format_import_from_member(
level.as_ref(),
module.as_deref(),
&alias.node.name,
);
let range = Range::from_located(alias);
self.add_binding(
name,
Binding {
kind: BindingKind::FromImportation(name.to_string(), full_name),
kind: BindingKind::FromImportation(name, full_name),
// Treat explicit re-export as usage (e.g., `from .applications
// import FastAPI as FastAPI`).
used: if alias
@@ -1453,8 +1463,11 @@ where
pyupgrade::rules::rewrite_yield_from(self, stmt);
}
let scope = transition_scope(&self.visible_scope, stmt, &Documentable::Function);
self.definitions
.push((definition, scope.visibility.clone()));
self.definitions.push((
definition,
scope.visibility.clone(),
(self.scope_stack.clone(), self.parents.clone()),
));
self.visible_scope = scope;
// If any global bindings don't already exist in the global scope, add it.
@@ -1511,8 +1524,11 @@ where
&Documentable::Class,
);
let scope = transition_scope(&self.visible_scope, stmt, &Documentable::Class);
self.definitions
.push((definition, scope.visibility.clone()));
self.definitions.push((
definition,
scope.visibility.clone(),
(self.scope_stack.clone(), self.parents.clone()),
));
self.visible_scope = scope;
// If any global bindings don't already exist in the global scope, add it.
@@ -1708,13 +1724,9 @@ where
&& !self.settings.pyupgrade.keep_runtime_typing
&& self.annotations_future_enabled
&& self.in_annotation))
&& typing::is_pep585_builtin(
expr,
&self.from_imports,
&self.import_aliases,
)
&& typing::is_pep585_builtin(self, expr)
{
pyupgrade::rules::use_pep585_annotation(self, expr, id);
pyupgrade::rules::use_pep585_annotation(self, expr);
}
self.handle_node_load(expr);
@@ -1752,9 +1764,9 @@ where
|| (self.settings.target_version >= PythonVersion::Py37
&& self.annotations_future_enabled
&& self.in_annotation))
&& typing::is_pep585_builtin(expr, &self.from_imports, &self.import_aliases)
&& typing::is_pep585_builtin(self, expr)
{
pyupgrade::rules::use_pep585_annotation(self, expr, attr);
pyupgrade::rules::use_pep585_annotation(self, expr);
}
if self.settings.enabled.contains(&RuleCode::UP016) {
@@ -1822,12 +1834,7 @@ where
}
if self.settings.enabled.contains(&RuleCode::TID251) {
flake8_tidy_imports::rules::banned_attribute_access(
self,
&dealias_call_path(collect_call_paths(expr), &self.import_aliases),
expr,
&self.settings.flake8_tidy_imports.banned_api,
);
flake8_tidy_imports::rules::banned_attribute_access(self, expr);
}
}
ExprKind::Call {
@@ -1976,96 +1983,36 @@ where
}
}
if self.settings.enabled.contains(&RuleCode::S103) {
if let Some(diagnostic) = flake8_bandit::rules::bad_file_permissions(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::bad_file_permissions(self, func, args, keywords);
}
if self.settings.enabled.contains(&RuleCode::S501) {
if let Some(diagnostic) = flake8_bandit::rules::request_with_no_cert_validation(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::request_with_no_cert_validation(
self, func, args, keywords,
);
}
if self.settings.enabled.contains(&RuleCode::S506) {
if let Some(diagnostic) = flake8_bandit::rules::unsafe_yaml_load(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::unsafe_yaml_load(self, func, args, keywords);
}
if self.settings.enabled.contains(&RuleCode::S508) {
if let Some(diagnostic) = flake8_bandit::rules::snmp_insecure_version(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::snmp_insecure_version(self, func, args, keywords);
}
if self.settings.enabled.contains(&RuleCode::S509) {
if let Some(diagnostic) = flake8_bandit::rules::snmp_weak_cryptography(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::snmp_weak_cryptography(self, func, args, keywords);
}
if self.settings.enabled.contains(&RuleCode::S701) {
if let Some(diagnostic) = flake8_bandit::rules::jinja2_autoescape_false(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::jinja2_autoescape_false(self, func, args, keywords);
}
if self.settings.enabled.contains(&RuleCode::S106) {
self.diagnostics
.extend(flake8_bandit::rules::hardcoded_password_func_arg(keywords));
}
if self.settings.enabled.contains(&RuleCode::S324) {
if let Some(diagnostic) = flake8_bandit::rules::hashlib_insecure_hash_functions(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::hashlib_insecure_hash_functions(
self, func, args, keywords,
);
}
if self.settings.enabled.contains(&RuleCode::S113) {
if let Some(diagnostic) = flake8_bandit::rules::request_without_timeout(
func,
args,
keywords,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_bandit::rules::request_without_timeout(self, func, args, keywords);
}
// flake8-comprehensions
@@ -2157,14 +2104,7 @@ where
// flake8-debugger
if self.settings.enabled.contains(&RuleCode::T100) {
if let Some(diagnostic) = flake8_debugger::rules::debugger_call(
expr,
func,
&self.from_imports,
&self.import_aliases,
) {
self.diagnostics.push(diagnostic);
}
flake8_debugger::rules::debugger_call(self, expr, func);
}
// pandas-vet
@@ -2191,7 +2131,7 @@ where
if let BindingKind::Importation(.., module) =
&binding.kind
{
module != "pandas"
module != &"pandas"
} else {
matches!(
binding.kind,
@@ -2590,6 +2530,14 @@ where
);
}
if self.settings.enabled.contains(&RuleCode::PLR0133) {
pylint::rules::constant_comparison(self, left, ops, comparators);
}
if self.settings.enabled.contains(&RuleCode::PLR2004) {
pylint::rules::magic_value_comparison(self, left, comparators);
}
if self.settings.enabled.contains(&RuleCode::SIM118) {
flake8_simplify::rules::key_in_dict_compare(self, expr, left, ops, comparators);
}
@@ -2739,15 +2687,19 @@ where
args,
keywords,
} => {
let call_path = dealias_call_path(collect_call_paths(func), &self.import_aliases);
if self.match_typing_call_path(&call_path, "ForwardRef") {
let call_path = self.resolve_call_path(func);
if call_path.as_ref().map_or(false, |call_path| {
self.match_typing_call_path(call_path, "ForwardRef")
}) {
self.visit_expr(func);
for expr in args {
self.in_type_definition = true;
self.visit_expr(expr);
self.in_type_definition = prev_in_type_definition;
}
} else if self.match_typing_call_path(&call_path, "cast") {
} else if call_path.as_ref().map_or(false, |call_path| {
self.match_typing_call_path(call_path, "cast")
}) {
self.visit_expr(func);
if !args.is_empty() {
self.in_type_definition = true;
@@ -2757,14 +2709,18 @@ where
for expr in args.iter().skip(1) {
self.visit_expr(expr);
}
} else if self.match_typing_call_path(&call_path, "NewType") {
} else if call_path.as_ref().map_or(false, |call_path| {
self.match_typing_call_path(call_path, "NewType")
}) {
self.visit_expr(func);
for expr in args.iter().skip(1) {
self.in_type_definition = true;
self.visit_expr(expr);
self.in_type_definition = prev_in_type_definition;
}
} else if self.match_typing_call_path(&call_path, "TypeVar") {
} else if call_path.as_ref().map_or(false, |call_path| {
self.match_typing_call_path(call_path, "TypeVar")
}) {
self.visit_expr(func);
for expr in args.iter().skip(1) {
self.in_type_definition = true;
@@ -2785,7 +2741,9 @@ where
}
}
}
} else if self.match_typing_call_path(&call_path, "NamedTuple") {
} else if call_path.as_ref().map_or(false, |call_path| {
self.match_typing_call_path(call_path, "NamedTuple")
}) {
self.visit_expr(func);
// Ex) NamedTuple("a", [("a", int)])
@@ -2821,7 +2779,9 @@ where
self.visit_expr(value);
self.in_type_definition = prev_in_type_definition;
}
} else if self.match_typing_call_path(&call_path, "TypedDict") {
} else if call_path.as_ref().map_or(false, |call_path| {
self.match_typing_call_path(call_path, "TypedDict")
}) {
self.visit_expr(func);
// Ex) TypedDict("a", {"a": int})
@@ -2847,12 +2807,11 @@ where
self.visit_expr(value);
self.in_type_definition = prev_in_type_definition;
}
} else if ["Arg", "DefaultArg", "NamedArg", "DefaultNamedArg"]
.iter()
.any(|target| {
match_call_path(&call_path, "mypy_extensions", target, &self.from_imports)
})
{
} else if call_path.as_ref().map_or(false, |call_path| {
["Arg", "DefaultArg", "NamedArg", "DefaultNamedArg"]
.iter()
.any(|target| *call_path == ["mypy_extensions", target])
}) {
self.visit_expr(func);
// Ex) DefaultNamedArg(bool | None, name="some_prop_name")
@@ -2886,13 +2845,7 @@ where
self.in_subscript = true;
visitor::walk_expr(self, expr);
} else {
match typing::match_annotated_subscript(
value,
&self.from_imports,
&self.import_aliases,
self.settings.typing_modules.iter().map(String::as_str),
|member| self.is_builtin(member),
) {
match typing::match_annotated_subscript(self, value) {
Some(subscript) => {
match subscript {
// Ex) Optional[int]
@@ -3419,23 +3372,37 @@ impl<'a> Checker<'a> {
// import pyarrow as pa
// import pyarrow.csv
// print(pa.csv.read_csv("test.csv"))
if let BindingKind::Importation(name, full_name)
| BindingKind::FromImportation(name, full_name)
| BindingKind::SubmoduleImportation(name, full_name) =
&self.bindings[*index].kind
{
let has_alias = full_name
.split('.')
.last()
.map(|segment| segment != name)
.unwrap_or_default();
if has_alias {
// Mark the sub-importation as used.
if let Some(index) = scope.values.get(full_name.as_str()) {
self.bindings[*index].used =
Some((scope_id, Range::from_located(expr)));
match &self.bindings[*index].kind {
BindingKind::Importation(name, full_name)
| BindingKind::SubmoduleImportation(name, full_name) => {
let has_alias = full_name
.split('.')
.last()
.map(|segment| &segment != name)
.unwrap_or_default();
if has_alias {
// Mark the sub-importation as used.
if let Some(index) = scope.values.get(full_name) {
self.bindings[*index].used =
Some((scope_id, Range::from_located(expr)));
}
}
}
BindingKind::FromImportation(name, full_name) => {
let has_alias = full_name
.split('.')
.last()
.map(|segment| &segment != name)
.unwrap_or_default();
if has_alias {
// Mark the sub-importation as used.
if let Some(index) = scope.values.get(full_name.as_str()) {
self.bindings[*index].used =
Some((scope_id, Range::from_located(expr)));
}
}
}
_ => {}
}
return;
@@ -3689,6 +3656,7 @@ impl<'a> Checker<'a> {
docstring,
},
self.visible_scope.visibility.clone(),
(self.scope_stack.clone(), self.parents.clone()),
));
docstring.is_some()
}
@@ -3960,9 +3928,12 @@ impl<'a> Checker<'a> {
{
let binding = &self.bindings[*index];
let (BindingKind::Importation(_, full_name)
| BindingKind::SubmoduleImportation(_, full_name)
| BindingKind::FromImportation(_, full_name)) = &binding.kind else { continue; };
let full_name = match &binding.kind {
BindingKind::Importation(.., full_name) => full_name,
BindingKind::FromImportation(.., full_name) => full_name.as_str(),
BindingKind::SubmoduleImportation(.., full_name) => full_name,
_ => continue,
};
// Skip used exports from `__all__`
if binding.used.is_some()
@@ -4142,7 +4113,10 @@ impl<'a> Checker<'a> {
let mut overloaded_name: Option<String> = None;
self.definitions.reverse();
while let Some((definition, visibility)) = self.definitions.pop() {
while let Some((definition, visibility, (scopes, parents))) = self.definitions.pop() {
self.scope_stack = scopes.clone();
self.parents = parents.clone();
// flake8-annotations
if enforce_annotations {
// TODO(charlie): This should be even stricter, in that an overload
@@ -4355,13 +4329,13 @@ pub fn check_ast(
let mut allocator = vec![];
checker.check_deferred_string_type_definitions(&mut allocator);
// Check docstrings.
checker.check_definitions();
// Reset the scope to module-level, and check all consumed scopes.
checker.scope_stack = vec![GLOBAL_SCOPE_INDEX];
checker.pop_scope();
checker.check_dead_scopes();
// Check docstrings.
checker.check_definitions();
checker.diagnostics
}
@@ -1,20 +1,15 @@
use num_bigint::BigInt;
use rustpython_ast::{Cmpop, Constant, Expr, ExprKind, Located};
use crate::ast::helpers::match_module_member;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::{Diagnostic, RuleCode};
use crate::violations;
fn is_sys(checker: &Checker, expr: &Expr, target: &str) -> bool {
match_module_member(
expr,
"sys",
target,
&checker.from_imports,
&checker.import_aliases,
)
checker
.resolve_call_path(expr)
.map_or(false, |path| path == ["sys", target])
}
/// YTT101, YTT102, YTT301, YTT303
@@ -187,13 +182,10 @@ pub fn compare(checker: &mut Checker, left: &Expr, ops: &[Cmpop], comparators: &
/// YTT202
pub fn name_or_attribute(checker: &mut Checker, expr: &Expr) {
if match_module_member(
expr,
"six",
"PY3",
&checker.from_imports,
&checker.import_aliases,
) {
if checker
.resolve_call_path(expr)
.map_or(false, |path| path == ["six", "PY3"])
{
checker.diagnostics.push(Diagnostic::new(
violations::SixPY3Referenced,
Range::from_located(expr),
@@ -145,4 +145,22 @@ mod tests {
insta::assert_yaml_snapshot!(diagnostics);
Ok(())
}
#[test]
fn allow_nested_overload() -> Result<()> {
let diagnostics = test_path(
Path::new("./resources/test/fixtures/flake8_annotations/allow_nested_overload.py"),
&Settings {
..Settings::for_rules(vec![
RuleCode::ANN201,
RuleCode::ANN202,
RuleCode::ANN204,
RuleCode::ANN205,
RuleCode::ANN206,
])
},
)?;
insta::assert_yaml_snapshot!(diagnostics);
Ok(())
}
}
@@ -0,0 +1,6 @@
---
source: src/flake8_annotations/mod.rs
expression: diagnostics
---
[]
@@ -1,10 +1,11 @@
use num_traits::ToPrimitive;
use once_cell::sync::Lazy;
use rustc_hash::{FxHashMap, FxHashSet};
use rustc_hash::FxHashMap;
use rustpython_ast::{Constant, Expr, ExprKind, Keyword, Operator};
use crate::ast::helpers::{compose_call_path, match_module_member, SimpleCallArgs};
use crate::ast::helpers::{compose_call_path, SimpleCallArgs};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
@@ -86,18 +87,20 @@ fn get_int_value(expr: &Expr) -> Option<u16> {
/// S103
pub fn bad_file_permissions(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
if match_module_member(func, "os", "chmod", from_imports, import_aliases) {
) {
if checker
.resolve_call_path(func)
.map_or(false, |call_path| call_path == ["os", "chmod"])
{
let call_args = SimpleCallArgs::new(args, keywords);
if let Some(mode_arg) = call_args.get_argument("mode", Some(1)) {
if let Some(int_value) = get_int_value(mode_arg) {
if (int_value & WRITE_WORLD > 0) || (int_value & EXECUTE_GROUP > 0) {
return Some(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
violations::BadFilePermissions(int_value),
Range::from_located(mode_arg),
));
@@ -105,5 +108,4 @@ pub fn bad_file_permissions(
}
}
}
None
}
@@ -1,8 +1,8 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Constant, Expr, ExprKind, Keyword};
use crate::ast::helpers::{match_module_member, SimpleCallArgs};
use crate::ast::helpers::SimpleCallArgs;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::flake8_bandit::helpers::string_literal;
use crate::registry::Diagnostic;
use crate::violations;
@@ -24,44 +24,45 @@ fn is_used_for_security(call_args: &SimpleCallArgs) -> bool {
/// S324
pub fn hashlib_insecure_hash_functions(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
if match_module_member(func, "hashlib", "new", from_imports, import_aliases) {
let call_args = SimpleCallArgs::new(args, keywords);
) {
if let Some(call_path) = checker.resolve_call_path(func) {
if call_path == ["hashlib", "new"] {
let call_args = SimpleCallArgs::new(args, keywords);
if !is_used_for_security(&call_args) {
return None;
}
if let Some(name_arg) = call_args.get_argument("name", Some(0)) {
let hash_func_name = string_literal(name_arg)?;
if WEAK_HASHES.contains(&hash_func_name.to_lowercase().as_str()) {
return Some(Diagnostic::new(
violations::HashlibInsecureHashFunction(hash_func_name.to_string()),
Range::from_located(name_arg),
));
if !is_used_for_security(&call_args) {
return;
}
}
} else {
for func_name in &WEAK_HASHES {
if match_module_member(func, "hashlib", func_name, from_imports, import_aliases) {
let call_args = SimpleCallArgs::new(args, keywords);
if !is_used_for_security(&call_args) {
return None;
if let Some(name_arg) = call_args.get_argument("name", Some(0)) {
if let Some(hash_func_name) = string_literal(name_arg) {
if WEAK_HASHES.contains(&hash_func_name.to_lowercase().as_str()) {
checker.diagnostics.push(Diagnostic::new(
violations::HashlibInsecureHashFunction(hash_func_name.to_string()),
Range::from_located(name_arg),
));
}
}
}
} else {
for func_name in &WEAK_HASHES {
if call_path == ["hashlib", func_name] {
let call_args = SimpleCallArgs::new(args, keywords);
return Some(Diagnostic::new(
violations::HashlibInsecureHashFunction((*func_name).to_string()),
Range::from_located(func),
));
if !is_used_for_security(&call_args) {
return;
}
checker.diagnostics.push(Diagnostic::new(
violations::HashlibInsecureHashFunction((*func_name).to_string()),
Range::from_located(func),
));
return;
}
}
}
}
None
}
@@ -1,26 +1,23 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, ExprKind, Keyword};
use rustpython_parser::ast::Constant;
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path, SimpleCallArgs};
use crate::ast::helpers::SimpleCallArgs;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
/// S701
pub fn jinja2_autoescape_false(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
if match_call_path(
&dealias_call_path(collect_call_paths(func), import_aliases),
"jinja2",
"Environment",
from_imports,
) {
) {
if checker
.resolve_call_path(func)
.map_or(false, |call_path| call_path == ["jinja2", "Environment"])
{
let call_args = SimpleCallArgs::new(args, keywords);
if let Some(autoescape_arg) = call_args.get_argument("autoescape", None) {
@@ -32,26 +29,23 @@ pub fn jinja2_autoescape_false(
ExprKind::Call { func, .. } => {
if let ExprKind::Name { id, .. } = &func.node {
if id.as_str() != "select_autoescape" {
return Some(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
violations::Jinja2AutoescapeFalse(true),
Range::from_located(autoescape_arg),
));
}
}
}
_ => {
return Some(Diagnostic::new(
violations::Jinja2AutoescapeFalse(true),
Range::from_located(autoescape_arg),
))
}
_ => checker.diagnostics.push(Diagnostic::new(
violations::Jinja2AutoescapeFalse(true),
Range::from_located(autoescape_arg),
)),
}
} else {
return Some(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
violations::Jinja2AutoescapeFalse(false),
Range::from_located(func),
));
}
}
None
}
@@ -1,9 +1,9 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, ExprKind, Keyword};
use rustpython_parser::ast::Constant;
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path, SimpleCallArgs};
use crate::ast::helpers::SimpleCallArgs;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
@@ -24,47 +24,46 @@ const HTTPX_METHODS: [&str; 11] = [
/// S501
pub fn request_with_no_cert_validation(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
let call_path = dealias_call_path(collect_call_paths(func), import_aliases);
let call_args = SimpleCallArgs::new(args, keywords);
for func_name in &REQUESTS_HTTP_VERBS {
if match_call_path(&call_path, "requests", func_name, from_imports) {
if let Some(verify_arg) = call_args.get_argument("verify", None) {
if let ExprKind::Constant {
value: Constant::Bool(false),
..
} = &verify_arg.node
{
return Some(Diagnostic::new(
violations::RequestWithNoCertValidation("requests".to_string()),
Range::from_located(verify_arg),
));
) {
if let Some(call_path) = checker.resolve_call_path(func) {
let call_args = SimpleCallArgs::new(args, keywords);
for func_name in &REQUESTS_HTTP_VERBS {
if call_path == ["requests", func_name] {
if let Some(verify_arg) = call_args.get_argument("verify", None) {
if let ExprKind::Constant {
value: Constant::Bool(false),
..
} = &verify_arg.node
{
checker.diagnostics.push(Diagnostic::new(
violations::RequestWithNoCertValidation("requests".to_string()),
Range::from_located(verify_arg),
));
}
}
return;
}
}
for func_name in &HTTPX_METHODS {
if call_path == ["httpx", func_name] {
if let Some(verify_arg) = call_args.get_argument("verify", None) {
if let ExprKind::Constant {
value: Constant::Bool(false),
..
} = &verify_arg.node
{
checker.diagnostics.push(Diagnostic::new(
violations::RequestWithNoCertValidation("httpx".to_string()),
Range::from_located(verify_arg),
));
}
}
return;
}
}
}
for func_name in &HTTPX_METHODS {
if match_call_path(&call_path, "httpx", func_name, from_imports) {
if let Some(verify_arg) = call_args.get_argument("verify", None) {
if let ExprKind::Constant {
value: Constant::Bool(false),
..
} = &verify_arg.node
{
return Some(Diagnostic::new(
violations::RequestWithNoCertValidation("httpx".to_string()),
Range::from_located(verify_arg),
));
}
}
}
}
None
}
@@ -1,9 +1,9 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, ExprKind, Keyword};
use rustpython_parser::ast::Constant;
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path, SimpleCallArgs};
use crate::ast::helpers::SimpleCallArgs;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
@@ -11,36 +11,35 @@ const HTTP_VERBS: [&str; 7] = ["get", "options", "head", "post", "put", "patch",
/// S113
pub fn request_without_timeout(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
let call_path = dealias_call_path(collect_call_paths(func), import_aliases);
for func_name in &HTTP_VERBS {
if match_call_path(&call_path, "requests", func_name, from_imports) {
let call_args = SimpleCallArgs::new(args, keywords);
if let Some(timeout_arg) = call_args.get_argument("timeout", None) {
if let Some(timeout) = match &timeout_arg.node {
ExprKind::Constant {
value: value @ Constant::None,
..
} => Some(value.to_string()),
_ => None,
} {
return Some(Diagnostic::new(
violations::RequestWithoutTimeout(Some(timeout)),
Range::from_located(timeout_arg),
));
}
} else {
return Some(Diagnostic::new(
violations::RequestWithoutTimeout(None),
Range::from_located(func),
) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
HTTP_VERBS
.iter()
.any(|func_name| call_path == ["requests", func_name])
}) {
let call_args = SimpleCallArgs::new(args, keywords);
if let Some(timeout_arg) = call_args.get_argument("timeout", None) {
if let Some(timeout) = match &timeout_arg.node {
ExprKind::Constant {
value: value @ Constant::None,
..
} => Some(value.to_string()),
_ => None,
} {
checker.diagnostics.push(Diagnostic::new(
violations::RequestWithoutTimeout(Some(timeout)),
Range::from_located(timeout_arg),
));
}
} else {
checker.diagnostics.push(Diagnostic::new(
violations::RequestWithoutTimeout(None),
Range::from_located(func),
));
}
}
None
}
@@ -1,26 +1,24 @@
use num_traits::{One, Zero};
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, ExprKind, Keyword};
use rustpython_parser::ast::Constant;
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path, SimpleCallArgs};
use crate::ast::helpers::SimpleCallArgs;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
/// S508
pub fn snmp_insecure_version(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
let call_path = dealias_call_path(collect_call_paths(func), import_aliases);
if match_call_path(&call_path, "pysnmp.hlapi", "CommunityData", from_imports) {
) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["pysnmp", "hlapi", "CommunityData"]
}) {
let call_args = SimpleCallArgs::new(args, keywords);
if let Some(mp_model_arg) = call_args.get_argument("mpModel", None) {
if let ExprKind::Constant {
value: Constant::Int(value),
@@ -28,7 +26,7 @@ pub fn snmp_insecure_version(
} = &mp_model_arg.node
{
if value.is_zero() || value.is_one() {
return Some(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
violations::SnmpInsecureVersion,
Range::from_located(mp_model_arg),
));
@@ -36,5 +34,4 @@ pub fn snmp_insecure_version(
}
}
}
None
}
@@ -1,30 +1,27 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, Keyword};
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path, SimpleCallArgs};
use crate::ast::helpers::SimpleCallArgs;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
/// S509
pub fn snmp_weak_cryptography(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
let call_path = dealias_call_path(collect_call_paths(func), import_aliases);
if match_call_path(&call_path, "pysnmp.hlapi", "UsmUserData", from_imports) {
) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["pysnmp", "hlapi", "UsmUserData"]
}) {
let call_args = SimpleCallArgs::new(args, keywords);
if call_args.len() < 3 {
return Some(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
violations::SnmpWeakCryptography,
Range::from_located(func),
));
}
}
None
}


@@ -1,51 +1,40 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, ExprKind, Keyword};
use crate::ast::helpers::{match_module_member, SimpleCallArgs};
use crate::ast::helpers::SimpleCallArgs;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
/// S506
pub fn unsafe_yaml_load(
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
if match_module_member(func, "yaml", "load", from_imports, import_aliases) {
pub fn unsafe_yaml_load(checker: &mut Checker, func: &Expr, args: &[Expr], keywords: &[Keyword]) {
if checker
.resolve_call_path(func)
.map_or(false, |call_path| call_path == ["yaml", "load"])
{
let call_args = SimpleCallArgs::new(args, keywords);
if let Some(loader_arg) = call_args.get_argument("Loader", Some(1)) {
if !match_module_member(
loader_arg,
"yaml",
"SafeLoader",
from_imports,
import_aliases,
) && !match_module_member(
loader_arg,
"yaml",
"CSafeLoader",
from_imports,
import_aliases,
) {
if !checker
.resolve_call_path(loader_arg)
.map_or(false, |call_path| {
call_path == ["yaml", "SafeLoader"] || call_path == ["yaml", "CSafeLoader"]
})
{
let loader = match &loader_arg.node {
ExprKind::Attribute { attr, .. } => Some(attr.to_string()),
ExprKind::Name { id, .. } => Some(id.to_string()),
_ => None,
};
return Some(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
violations::UnsafeYAMLLoad(loader),
Range::from_located(loader_arg),
));
}
} else {
return Some(Diagnostic::new(
checker.diagnostics.push(Diagnostic::new(
violations::UnsafeYAMLLoad(None),
Range::from_located(func),
));
}
}
None
}


@@ -1,34 +1,26 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Constant, Expr, ExprKind, Keyword, Stmt, StmtKind};
use crate::ast::helpers::match_module_member;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::{Diagnostic, RuleCode};
use crate::violations;
use crate::visibility::{is_abstract, is_overload};
fn is_abc_class(
bases: &[Expr],
keywords: &[Keyword],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
fn is_abc_class(checker: &Checker, bases: &[Expr], keywords: &[Keyword]) -> bool {
keywords.iter().any(|keyword| {
keyword
.node
.arg
.as_ref()
.map_or(false, |a| a == "metaclass")
&& match_module_member(
&keyword.node.value,
"abc",
"ABCMeta",
from_imports,
import_aliases,
)
}) || bases
.iter()
.any(|base| match_module_member(base, "abc", "ABC", from_imports, import_aliases))
.map_or(false, |arg| arg == "metaclass")
&& checker
.resolve_call_path(&keyword.node.value)
.map_or(false, |call_path| call_path == ["abc", "ABCMeta"])
}) || bases.iter().any(|base| {
checker
.resolve_call_path(base)
.map_or(false, |call_path| call_path == ["abc", "ABC"])
})
}
fn is_empty_body(body: &[Stmt]) -> bool {
@@ -44,22 +36,6 @@ fn is_empty_body(body: &[Stmt]) -> bool {
})
}
fn is_abstractmethod(
expr: &Expr,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
match_module_member(expr, "abc", "abstractmethod", from_imports, import_aliases)
}
fn is_overload(
expr: &Expr,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
match_module_member(expr, "typing", "overload", from_imports, import_aliases)
}
pub fn abstract_base_class(
checker: &mut Checker,
stmt: &Stmt,
@@ -71,12 +47,7 @@ pub fn abstract_base_class(
if bases.len() + keywords.len() != 1 {
return;
}
if !is_abc_class(
bases,
keywords,
&checker.from_imports,
&checker.import_aliases,
) {
if !is_abc_class(checker, bases, keywords) {
return;
}
@@ -102,22 +73,14 @@ pub fn abstract_base_class(
continue;
};
let has_abstract_decorator = decorator_list
.iter()
.any(|d| is_abstractmethod(d, &checker.from_imports, &checker.import_aliases));
let has_abstract_decorator = is_abstract(checker, decorator_list);
has_abstract_method |= has_abstract_decorator;
if !checker.settings.enabled.contains(&RuleCode::B027) {
continue;
}
if !has_abstract_decorator
&& is_empty_body(body)
&& !decorator_list
.iter()
.any(|d| is_overload(d, &checker.from_imports, &checker.import_aliases))
{
if !has_abstract_decorator && is_empty_body(body) && !is_overload(checker, decorator_list) {
checker.diagnostics.push(Diagnostic::new(
violations::EmptyMethodWithoutAbstractDecorator(name.to_string()),
Range::from_located(stmt),


@@ -1,6 +1,5 @@
use rustpython_ast::{ExprKind, Stmt, Withitem};
use crate::ast::helpers::match_module_member;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
@@ -24,13 +23,10 @@ pub fn assert_raises_exception(checker: &mut Checker, stmt: &Stmt, items: &[With
if !matches!(&func.node, ExprKind::Attribute { attr, .. } if attr == "assertRaises") {
return;
}
if !match_module_member(
args.first().unwrap(),
"",
"Exception",
&checker.from_imports,
&checker.import_aliases,
) {
if !checker
.resolve_call_path(args.first().unwrap())
.map_or(false, |call_path| call_path == ["", "Exception"])
{
return;
}


@@ -1,15 +1,14 @@
use rustpython_ast::{Expr, ExprKind};
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path};
use crate::ast::types::{Range, ScopeKind};
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
fn is_cache_func(checker: &Checker, expr: &Expr) -> bool {
let call_path = dealias_call_path(collect_call_paths(expr), &checker.import_aliases);
match_call_path(&call_path, "functools", "lru_cache", &checker.from_imports)
|| match_call_path(&call_path, "functools", "cache", &checker.from_imports)
checker.resolve_call_path(expr).map_or(false, |call_path| {
call_path == ["functools", "lru_cache"] || call_path == ["functools", "cache"]
})
}
/// B019


@@ -30,7 +30,7 @@ fn duplicate_handler_exceptions<'a>(
let mut duplicates: FxHashSet<Vec<&str>> = FxHashSet::default();
let mut unique_elts: Vec<&Expr> = Vec::default();
for type_ in elts {
let call_path = helpers::collect_call_paths(type_);
let call_path = helpers::collect_call_path(type_);
if !call_path.is_empty() {
if seen.contains_key(&call_path) {
duplicates.insert(call_path);
@@ -83,7 +83,7 @@ pub fn duplicate_exceptions(checker: &mut Checker, handlers: &[Excepthandler]) {
};
match &type_.node {
ExprKind::Attribute { .. } | ExprKind::Name { .. } => {
let call_path = helpers::collect_call_paths(type_);
let call_path = helpers::collect_call_path(type_);
if !call_path.is_empty() {
if seen.contains(&call_path) {
duplicates.entry(call_path).or_default().push(type_);


@@ -1,9 +1,6 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Arguments, Constant, Expr, ExprKind};
use crate::ast::helpers::{
collect_call_paths, compose_call_path, dealias_call_path, match_call_path, to_module_and_member,
};
use crate::ast::helpers::{compose_call_path, to_call_path};
use crate::ast::types::Range;
use crate::ast::visitor;
use crate::ast::visitor::Visitor;
@@ -12,34 +9,29 @@ use crate::flake8_bugbear::rules::mutable_argument_default::is_mutable_func;
use crate::registry::{Diagnostic, DiagnosticKind};
use crate::violations;
const IMMUTABLE_FUNCS: [(&str, &str); 7] = [
("", "tuple"),
("", "frozenset"),
("operator", "attrgetter"),
("operator", "itemgetter"),
("operator", "methodcaller"),
("types", "MappingProxyType"),
("re", "compile"),
const IMMUTABLE_FUNCS: &[&[&str]] = &[
&["", "tuple"],
&["", "frozenset"],
&["operator", "attrgetter"],
&["operator", "itemgetter"],
&["operator", "methodcaller"],
&["types", "MappingProxyType"],
&["re", "compile"],
];
fn is_immutable_func(
expr: &Expr,
extend_immutable_calls: &[(&str, &str)],
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
let call_path = dealias_call_path(collect_call_paths(expr), import_aliases);
IMMUTABLE_FUNCS
.iter()
.chain(extend_immutable_calls)
.any(|(module, member)| match_call_path(&call_path, module, member, from_imports))
fn is_immutable_func(checker: &Checker, expr: &Expr, extend_immutable_calls: &[Vec<&str>]) -> bool {
checker.resolve_call_path(expr).map_or(false, |call_path| {
IMMUTABLE_FUNCS.iter().any(|target| call_path == *target)
|| extend_immutable_calls
.iter()
.any(|target| call_path == *target)
})
}
struct ArgumentDefaultVisitor<'a> {
checker: &'a Checker<'a>,
diagnostics: Vec<(DiagnosticKind, Range)>,
extend_immutable_calls: &'a [(&'a str, &'a str)],
from_imports: &'a FxHashMap<&'a str, FxHashSet<&'a str>>,
import_aliases: &'a FxHashMap<&'a str, &'a str>,
extend_immutable_calls: Vec<Vec<&'a str>>,
}
impl<'a, 'b> Visitor<'b> for ArgumentDefaultVisitor<'b>
@@ -49,13 +41,8 @@ where
fn visit_expr(&mut self, expr: &'b Expr) {
match &expr.node {
ExprKind::Call { func, args, .. } => {
if !is_mutable_func(func, self.from_imports, self.import_aliases)
&& !is_immutable_func(
func,
self.extend_immutable_calls,
self.from_imports,
self.import_aliases,
)
if !is_mutable_func(self.checker, func)
&& !is_immutable_func(self.checker, func, &self.extend_immutable_calls)
&& !is_nan_or_infinity(func, args)
{
self.diagnostics.push((
@@ -97,27 +84,29 @@ fn is_nan_or_infinity(expr: &Expr, args: &[Expr]) -> bool {
/// B008
pub fn function_call_argument_default(checker: &mut Checker, arguments: &Arguments) {
// Map immutable calls to (module, member) format.
let extend_immutable_cells: Vec<(&str, &str)> = checker
let extend_immutable_calls: Vec<Vec<&str>> = checker
.settings
.flake8_bugbear
.extend_immutable_calls
.iter()
.map(|target| to_module_and_member(target))
.map(|target| to_call_path(target))
.collect();
let mut visitor = ArgumentDefaultVisitor {
diagnostics: vec![],
extend_immutable_calls: &extend_immutable_cells,
from_imports: &checker.from_imports,
import_aliases: &checker.import_aliases,
let diagnostics = {
let mut visitor = ArgumentDefaultVisitor {
checker,
diagnostics: vec![],
extend_immutable_calls,
};
for expr in arguments
.defaults
.iter()
.chain(arguments.kw_defaults.iter())
{
visitor.visit_expr(expr);
}
visitor.diagnostics
};
for expr in arguments
.defaults
.iter()
.chain(arguments.kw_defaults.iter())
{
visitor.visit_expr(expr);
}
for (check, range) in visitor.diagnostics {
for (check, range) in diagnostics {
checker.diagnostics.push(Diagnostic::new(check, range));
}
}


@@ -1,79 +1,68 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Arguments, Constant, Expr, ExprKind, Operator};
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
const MUTABLE_FUNCS: &[(&str, &str)] = &[
("", "dict"),
("", "list"),
("", "set"),
("collections", "Counter"),
("collections", "OrderedDict"),
("collections", "defaultdict"),
("collections", "deque"),
const MUTABLE_FUNCS: &[&[&str]] = &[
&["", "dict"],
&["", "list"],
&["", "set"],
&["collections", "Counter"],
&["collections", "OrderedDict"],
&["collections", "defaultdict"],
&["collections", "deque"],
];
const IMMUTABLE_TYPES: &[(&str, &str)] = &[
("", "bool"),
("", "bytes"),
("", "complex"),
("", "float"),
("", "frozenset"),
("", "int"),
("", "object"),
("", "range"),
("", "str"),
("collections.abc", "Sized"),
("typing", "LiteralString"),
("typing", "Sized"),
const IMMUTABLE_TYPES: &[&[&str]] = &[
&["", "bool"],
&["", "bytes"],
&["", "complex"],
&["", "float"],
&["", "frozenset"],
&["", "int"],
&["", "object"],
&["", "range"],
&["", "str"],
&["collections", "abc", "Sized"],
&["typing", "LiteralString"],
&["typing", "Sized"],
];
const IMMUTABLE_GENERIC_TYPES: &[(&str, &str)] = &[
("", "tuple"),
("collections.abc", "ByteString"),
("collections.abc", "Collection"),
("collections.abc", "Container"),
("collections.abc", "Iterable"),
("collections.abc", "Mapping"),
("collections.abc", "Reversible"),
("collections.abc", "Sequence"),
("collections.abc", "Set"),
("typing", "AbstractSet"),
("typing", "ByteString"),
("typing", "Callable"),
("typing", "Collection"),
("typing", "Container"),
("typing", "FrozenSet"),
("typing", "Iterable"),
("typing", "Literal"),
("typing", "Mapping"),
("typing", "Never"),
("typing", "NoReturn"),
("typing", "Reversible"),
("typing", "Sequence"),
("typing", "Tuple"),
const IMMUTABLE_GENERIC_TYPES: &[&[&str]] = &[
&["", "tuple"],
&["collections", "abc", "ByteString"],
&["collections", "abc", "Collection"],
&["collections", "abc", "Container"],
&["collections", "abc", "Iterable"],
&["collections", "abc", "Mapping"],
&["collections", "abc", "Reversible"],
&["collections", "abc", "Sequence"],
&["collections", "abc", "Set"],
&["typing", "AbstractSet"],
&["typing", "ByteString"],
&["typing", "Callable"],
&["typing", "Collection"],
&["typing", "Container"],
&["typing", "FrozenSet"],
&["typing", "Iterable"],
&["typing", "Literal"],
&["typing", "Mapping"],
&["typing", "Never"],
&["typing", "NoReturn"],
&["typing", "Reversible"],
&["typing", "Sequence"],
&["typing", "Tuple"],
];
pub fn is_mutable_func(
expr: &Expr,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
let call_path = dealias_call_path(collect_call_paths(expr), import_aliases);
MUTABLE_FUNCS
.iter()
.any(|(module, member)| match_call_path(&call_path, module, member, from_imports))
pub fn is_mutable_func(checker: &Checker, expr: &Expr) -> bool {
checker.resolve_call_path(expr).map_or(false, |call_path| {
MUTABLE_FUNCS.iter().any(|target| call_path == *target)
})
}
fn is_mutable_expr(
expr: &Expr,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
fn is_mutable_expr(checker: &Checker, expr: &Expr) -> bool {
match &expr.node {
ExprKind::List { .. }
| ExprKind::Dict { .. }
@@ -81,60 +70,53 @@ fn is_mutable_expr(
| ExprKind::ListComp { .. }
| ExprKind::DictComp { .. }
| ExprKind::SetComp { .. } => true,
ExprKind::Call { func, .. } => is_mutable_func(func, from_imports, import_aliases),
ExprKind::Call { func, .. } => is_mutable_func(checker, func),
_ => false,
}
}
fn is_immutable_annotation(
expr: &Expr,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
fn is_immutable_annotation(checker: &Checker, expr: &Expr) -> bool {
match &expr.node {
ExprKind::Name { .. } | ExprKind::Attribute { .. } => {
let call_path = dealias_call_path(collect_call_paths(expr), import_aliases);
IMMUTABLE_TYPES
.iter()
.chain(IMMUTABLE_GENERIC_TYPES)
.any(|(module, member)| match_call_path(&call_path, module, member, from_imports))
checker.resolve_call_path(expr).map_or(false, |call_path| {
IMMUTABLE_TYPES
.iter()
.chain(IMMUTABLE_GENERIC_TYPES)
.any(|target| call_path == *target)
})
}
ExprKind::Subscript { value, slice, .. } => {
let call_path = dealias_call_path(collect_call_paths(value), import_aliases);
if IMMUTABLE_GENERIC_TYPES
.iter()
.any(|(module, member)| match_call_path(&call_path, module, member, from_imports))
{
true
} else if match_call_path(&call_path, "typing", "Union", from_imports) {
if let ExprKind::Tuple { elts, .. } = &slice.node {
elts.iter()
.all(|elt| is_immutable_annotation(elt, from_imports, import_aliases))
checker.resolve_call_path(value).map_or(false, |call_path| {
if IMMUTABLE_GENERIC_TYPES
.iter()
.any(|target| call_path == *target)
{
true
} else if call_path == ["typing", "Union"] {
if let ExprKind::Tuple { elts, .. } = &slice.node {
elts.iter().all(|elt| is_immutable_annotation(checker, elt))
} else {
false
}
} else if call_path == ["typing", "Optional"] {
is_immutable_annotation(checker, slice)
} else if call_path == ["typing", "Annotated"] {
if let ExprKind::Tuple { elts, .. } = &slice.node {
elts.first()
.map_or(false, |elt| is_immutable_annotation(checker, elt))
} else {
false
}
} else {
false
}
} else if match_call_path(&call_path, "typing", "Optional", from_imports) {
is_immutable_annotation(slice, from_imports, import_aliases)
} else if match_call_path(&call_path, "typing", "Annotated", from_imports) {
if let ExprKind::Tuple { elts, .. } = &slice.node {
elts.first().map_or(false, |elt| {
is_immutable_annotation(elt, from_imports, import_aliases)
})
} else {
false
}
} else {
false
}
})
}
ExprKind::BinOp {
left,
op: Operator::BitOr,
right,
} => {
is_immutable_annotation(left, from_imports, import_aliases)
&& is_immutable_annotation(right, from_imports, import_aliases)
}
} => is_immutable_annotation(checker, left) && is_immutable_annotation(checker, right),
ExprKind::Constant {
value: Constant::None,
..
@@ -145,7 +127,7 @@ fn is_immutable_annotation(
/// B006
pub fn mutable_argument_default(checker: &mut Checker, arguments: &Arguments) {
// Scan in reverse order to right-align zip()
// Scan in reverse order to right-align zip().
for (arg, default) in arguments
.kwonlyargs
.iter()
@@ -160,10 +142,12 @@ pub fn mutable_argument_default(checker: &mut Checker, arguments: &Arguments) {
.zip(arguments.defaults.iter().rev()),
)
{
if is_mutable_expr(default, &checker.from_imports, &checker.import_aliases)
&& arg.node.annotation.as_ref().map_or(true, |expr| {
!is_immutable_annotation(expr, &checker.from_imports, &checker.import_aliases)
})
if is_mutable_expr(checker, default)
&& !arg
.node
.annotation
.as_ref()
.map_or(false, |expr| is_immutable_annotation(checker, expr))
{
checker.diagnostics.push(Diagnostic::new(
violations::MutableArgumentDefault,

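The constant tables above change shape: dotted modules such as `collections.abc` are no longer a single `(&str, &str)` pair but a `&[&str]` with one entry per segment, so a resolved call path can be compared element-wise. A small sketch under that assumption (the table below is a truncated, illustrative subset):

```rust
// Segment-wise type table, mirroring the new `&[&[&str]]` layout.
const IMMUTABLE_TYPES: &[&[&str]] = &[
    &["", "bool"],
    &["collections", "abc", "Sized"],
];

// A resolved call path compares directly against each target slice.
fn is_immutable_type(call_path: &[&str]) -> bool {
    IMMUTABLE_TYPES.iter().any(|target| call_path == *target)
}

fn main() {
    assert!(is_immutable_type(&["collections", "abc", "Sized"]));
    assert!(!is_immutable_type(&["typing", "List"]));
}
```

Storing segments instead of pre-joined `"collections.abc"` strings avoids re-splitting on every comparison and lets paths of any depth share one table type.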

@@ -1,6 +1,5 @@
use rustpython_ast::Expr;
use crate::ast::helpers::{collect_call_paths, match_call_path};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
@@ -8,12 +7,10 @@ use crate::violations;
/// B005
pub fn useless_contextlib_suppress(checker: &mut Checker, expr: &Expr, args: &[Expr]) {
if match_call_path(
&collect_call_paths(expr),
"contextlib",
"suppress",
&checker.from_imports,
) && args.is_empty()
if args.is_empty()
&& checker
.resolve_call_path(expr)
.map_or(false, |call_path| call_path == ["contextlib", "suppress"])
{
checker.diagnostics.push(Diagnostic::new(
violations::UselessContextlibSuppress,


@@ -1,64 +1,64 @@
---
source: src/flake8_bugbear/mod.rs
expression: checks
expression: diagnostics
---
- kind:
AbstractBaseClassWithoutAbstractMethod: Base_1
location:
row: 17
row: 18
column: 0
end_location:
row: 19
row: 20
column: 13
fix: ~
parent: ~
- kind:
AbstractBaseClassWithoutAbstractMethod: MetaBase_1
location:
row: 58
row: 71
column: 0
end_location:
row: 60
row: 73
column: 13
fix: ~
parent: ~
- kind:
AbstractBaseClassWithoutAbstractMethod: abc_Base_1
location:
row: 69
row: 82
column: 0
end_location:
row: 71
row: 84
column: 13
fix: ~
parent: ~
- kind:
AbstractBaseClassWithoutAbstractMethod: abc_Base_2
location:
row: 74
row: 87
column: 0
end_location:
row: 76
row: 89
column: 13
fix: ~
parent: ~
- kind:
AbstractBaseClassWithoutAbstractMethod: notabc_Base_1
location:
row: 79
row: 92
column: 0
end_location:
row: 81
row: 94
column: 13
fix: ~
parent: ~
- kind:
AbstractBaseClassWithoutAbstractMethod: abc_set_class_variable_4
location:
row: 128
row: 141
column: 0
end_location:
row: 129
row: 142
column: 7
fix: ~
parent: ~


@@ -1,44 +1,44 @@
---
source: src/flake8_bugbear/mod.rs
expression: checks
expression: diagnostics
---
- kind:
EmptyMethodWithoutAbstractDecorator: AbstractClass
location:
row: 12
row: 13
column: 4
end_location:
row: 13
row: 14
column: 11
fix: ~
parent: ~
- kind:
EmptyMethodWithoutAbstractDecorator: AbstractClass
location:
row: 15
row: 16
column: 4
end_location:
row: 16
row: 17
column: 12
fix: ~
parent: ~
- kind:
EmptyMethodWithoutAbstractDecorator: AbstractClass
location:
row: 18
row: 19
column: 4
end_location:
row: 20
row: 21
column: 11
fix: ~
parent: ~
- kind:
EmptyMethodWithoutAbstractDecorator: AbstractClass
location:
row: 22
row: 23
column: 4
end_location:
row: 27
row: 28
column: 12
fix: ~
parent: ~


@@ -258,6 +258,49 @@ pub fn fix_unnecessary_list_comprehension_dict(
))
}
/// Drop a trailing comma from a list of tuple elements.
fn drop_trailing_comma<'a>(
tuple: &Tuple<'a>,
) -> Result<(
Vec<Element<'a>>,
ParenthesizableWhitespace<'a>,
ParenthesizableWhitespace<'a>,
)> {
let whitespace_after = tuple
.lpar
.first()
.ok_or_else(|| anyhow::anyhow!("Expected at least one set of parentheses"))?
.whitespace_after
.clone();
let whitespace_before = tuple
.rpar
.first()
.ok_or_else(|| anyhow::anyhow!("Expected at least one set of parentheses"))?
.whitespace_before
.clone();
let mut elements = tuple.elements.clone();
if elements.len() == 1 {
if let Some(Element::Simple {
value,
comma: Some(..),
..
}) = elements.last()
{
if whitespace_before == ParenthesizableWhitespace::default()
&& whitespace_after == ParenthesizableWhitespace::default()
{
elements[0] = Element::Simple {
value: value.clone(),
comma: None,
};
}
}
}
Ok((elements, whitespace_after, whitespace_before))
}
/// (C405) Convert `set((1, 2))` to `{1, 2}`.
pub fn fix_unnecessary_literal_set(locator: &Locator, expr: &rustpython_ast::Expr) -> Result<Fix> {
// Expr(Call(List|Tuple)))) -> Expr(Set)))
@@ -267,9 +310,13 @@ pub fn fix_unnecessary_literal_set(locator: &Locator, expr: &rustpython_ast::Exp
let mut call = match_call(body)?;
let arg = match_arg(call)?;
let elements = match &arg.value {
Expression::Tuple(inner) => &inner.elements,
Expression::List(inner) => &inner.elements,
let (elements, whitespace_after, whitespace_before) = match &arg.value {
Expression::Tuple(inner) => drop_trailing_comma(inner)?,
Expression::List(inner) => (
inner.elements.clone(),
inner.lbracket.whitespace_after.clone(),
inner.rbracket.whitespace_before.clone(),
),
_ => {
bail!("Expected Expression::Tuple | Expression::List");
}
@@ -279,13 +326,9 @@ pub fn fix_unnecessary_literal_set(locator: &Locator, expr: &rustpython_ast::Exp
call.args = vec![];
} else {
body.value = Expression::Set(Box::new(Set {
elements: elements.clone(),
lbrace: LeftCurlyBrace {
whitespace_after: call.whitespace_before_args.clone(),
},
rbrace: RightCurlyBrace {
whitespace_before: arg.whitespace_after_arg.clone(),
},
elements,
lbrace: LeftCurlyBrace { whitespace_after },
rbrace: RightCurlyBrace { whitespace_before },
lpar: vec![],
rpar: vec![],
}));

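The new `drop_trailing_comma` helper exists so that rewriting a one-element tuple like `(1,)` into a set literal emits `{1}` rather than `{1,}`. A simplified model of that behavior, with elements as `(value, has_comma)` pairs instead of libcst's richer `Element` nodes:

```rust
// Drop the comma from a single-element sequence, as the fix above does
// for `set((1,))`. Multi-element sequences are left untouched.
fn drop_trailing_comma(mut elements: Vec<(String, bool)>) -> Vec<(String, bool)> {
    if elements.len() == 1 {
        if let Some(last) = elements.last_mut() {
            last.1 = false;
        }
    }
    elements
}

// Render the elements as a set literal for inspection.
fn render_set(elements: &[(String, bool)]) -> String {
    let body: Vec<String> = elements
        .iter()
        .map(|(v, comma)| if *comma { format!("{v},") } else { v.clone() })
        .collect();
    format!("{{{}}}", body.join(" "))
}

fn main() {
    let elts = drop_trailing_comma(vec![("1".to_string(), true)]);
    assert_eq!(render_set(&elts), "{1}");
}
```

The real helper additionally bails out when the parentheses carry whitespace, which is why the snapshot below keeps `{\n 1,\n}` for the multi-line case.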

@@ -1,73 +1,158 @@
---
source: src/flake8_comprehensions/mod.rs
expression: checks
expression: diagnostics
---
- kind:
UnnecessaryLiteralSet: list
location:
row: 1
column: 5
column: 0
end_location:
row: 1
column: 16
column: 11
fix:
content: "{1, 2}"
location:
row: 1
column: 5
column: 0
end_location:
row: 1
column: 16
column: 11
parent: ~
- kind:
UnnecessaryLiteralSet: tuple
location:
row: 2
column: 5
column: 0
end_location:
row: 2
column: 16
column: 11
fix:
content: "{1, 2}"
location:
row: 2
column: 5
column: 0
end_location:
row: 2
column: 16
column: 11
parent: ~
- kind:
UnnecessaryLiteralSet: list
location:
row: 3
column: 5
column: 0
end_location:
row: 3
column: 12
column: 7
fix:
content: set()
location:
row: 3
column: 5
column: 0
end_location:
row: 3
column: 12
column: 7
parent: ~
- kind:
UnnecessaryLiteralSet: tuple
location:
row: 4
column: 5
column: 0
end_location:
row: 4
column: 12
column: 7
fix:
content: set()
location:
row: 4
column: 5
column: 0
end_location:
row: 4
column: 12
column: 7
parent: ~
- kind:
UnnecessaryLiteralSet: tuple
location:
row: 6
column: 0
end_location:
row: 6
column: 9
fix:
content: "{1}"
location:
row: 6
column: 0
end_location:
row: 6
column: 9
parent: ~
- kind:
UnnecessaryLiteralSet: tuple
location:
row: 7
column: 0
end_location:
row: 9
column: 2
fix:
content: "{\n 1,\n}"
location:
row: 7
column: 0
end_location:
row: 9
column: 2
parent: ~
- kind:
UnnecessaryLiteralSet: list
location:
row: 10
column: 0
end_location:
row: 12
column: 2
fix:
content: "{\n 1,\n}"
location:
row: 10
column: 0
end_location:
row: 12
column: 2
parent: ~
- kind:
UnnecessaryLiteralSet: tuple
location:
row: 13
column: 0
end_location:
row: 15
column: 1
fix:
content: "{1}"
location:
row: 13
column: 0
end_location:
row: 15
column: 1
parent: ~
- kind:
UnnecessaryLiteralSet: list
location:
row: 16
column: 0
end_location:
row: 18
column: 1
fix:
content: "{1,}"
location:
row: 16
column: 0
end_location:
row: 18
column: 1
parent: ~


@@ -1,8 +1,6 @@
use rustpython_ast::{Constant, Expr, ExprKind, Keyword};
use crate::ast::helpers::{
collect_call_paths, dealias_call_path, has_non_none_keyword, is_const_none, match_call_path,
};
use crate::ast::helpers::{has_non_none_keyword, is_const_none};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
@@ -15,8 +13,10 @@ pub fn call_datetime_without_tzinfo(
keywords: &[Keyword],
location: Range,
) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if !match_call_path(&call_path, "datetime", "datetime", &checker.from_imports) {
if !checker
.resolve_call_path(func)
.map_or(false, |call_path| call_path == ["datetime", "datetime"])
{
return;
}
@@ -40,13 +40,9 @@ pub fn call_datetime_without_tzinfo(
/// DTZ002
pub fn call_datetime_today(checker: &mut Checker, func: &Expr, location: Range) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if match_call_path(
&call_path,
"datetime.datetime",
"today",
&checker.from_imports,
) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "datetime", "today"]
}) {
checker
.diagnostics
.push(Diagnostic::new(violations::CallDatetimeToday, location));
@@ -55,13 +51,9 @@ pub fn call_datetime_today(checker: &mut Checker, func: &Expr, location: Range)
/// DTZ003
pub fn call_datetime_utcnow(checker: &mut Checker, func: &Expr, location: Range) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if match_call_path(
&call_path,
"datetime.datetime",
"utcnow",
&checker.from_imports,
) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "datetime", "utcnow"]
}) {
checker
.diagnostics
.push(Diagnostic::new(violations::CallDatetimeUtcnow, location));
@@ -70,13 +62,9 @@ pub fn call_datetime_utcnow(checker: &mut Checker, func: &Expr, location: Range)
/// DTZ004
pub fn call_datetime_utcfromtimestamp(checker: &mut Checker, func: &Expr, location: Range) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if match_call_path(
&call_path,
"datetime.datetime",
"utcfromtimestamp",
&checker.from_imports,
) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "datetime", "utcfromtimestamp"]
}) {
checker.diagnostics.push(Diagnostic::new(
violations::CallDatetimeUtcfromtimestamp,
location,
@@ -92,13 +80,9 @@ pub fn call_datetime_now_without_tzinfo(
keywords: &[Keyword],
location: Range,
) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if !match_call_path(
&call_path,
"datetime.datetime",
"now",
&checker.from_imports,
) {
if !checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "datetime", "now"]
}) {
return;
}
@@ -137,13 +121,9 @@ pub fn call_datetime_fromtimestamp(
keywords: &[Keyword],
location: Range,
) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if !match_call_path(
&call_path,
"datetime.datetime",
"fromtimestamp",
&checker.from_imports,
) {
if !checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "datetime", "fromtimestamp"]
}) {
return;
}
@@ -181,13 +161,9 @@ pub fn call_datetime_strptime_without_zone(
args: &[Expr],
location: Range,
) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if !match_call_path(
&call_path,
"datetime.datetime",
"strptime",
&checker.from_imports,
) {
if !checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "datetime", "strptime"]
}) {
return;
}
@@ -234,8 +210,9 @@ pub fn call_datetime_strptime_without_zone(
/// DTZ011
pub fn call_date_today(checker: &mut Checker, func: &Expr, location: Range) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if match_call_path(&call_path, "datetime.date", "today", &checker.from_imports) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "date", "today"]
}) {
checker
.diagnostics
.push(Diagnostic::new(violations::CallDateToday, location));
@@ -244,13 +221,9 @@ pub fn call_date_today(checker: &mut Checker, func: &Expr, location: Range) {
/// DTZ012
pub fn call_date_fromtimestamp(checker: &mut Checker, func: &Expr, location: Range) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if match_call_path(
&call_path,
"datetime.date",
"fromtimestamp",
&checker.from_imports,
) {
if checker.resolve_call_path(func).map_or(false, |call_path| {
call_path == ["datetime", "date", "fromtimestamp"]
}) {
checker
.diagnostics
.push(Diagnostic::new(violations::CallDateFromtimestamp, location));


@@ -1,42 +1,39 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, Stmt};
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path};
use crate::ast::helpers::format_call_path;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::flake8_debugger::types::DebuggerUsingType;
use crate::registry::Diagnostic;
use crate::violations;
const DEBUGGERS: &[(&str, &str)] = &[
("pdb", "set_trace"),
("pudb", "set_trace"),
("ipdb", "set_trace"),
("ipdb", "sset_trace"),
("IPython.terminal.embed", "InteractiveShellEmbed"),
("IPython.frontend.terminal.embed", "InteractiveShellEmbed"),
("celery.contrib.rdb", "set_trace"),
("builtins", "breakpoint"),
("", "breakpoint"),
const DEBUGGERS: &[&[&str]] = &[
&["pdb", "set_trace"],
&["pudb", "set_trace"],
&["ipdb", "set_trace"],
&["ipdb", "sset_trace"],
&["IPython", "terminal", "embed", "InteractiveShellEmbed"],
&[
"IPython",
"frontend",
"terminal",
"embed",
"InteractiveShellEmbed",
],
&["celery", "contrib", "rdb", "set_trace"],
&["builtins", "breakpoint"],
&["", "breakpoint"],
];
/// Checks for the presence of a debugger call.
pub fn debugger_call(
expr: &Expr,
func: &Expr,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> Option<Diagnostic> {
let call_path = dealias_call_path(collect_call_paths(func), import_aliases);
if DEBUGGERS
.iter()
.any(|(module, member)| match_call_path(&call_path, module, member, from_imports))
{
Some(Diagnostic::new(
violations::Debugger(DebuggerUsingType::Call(call_path.join("."))),
Range::from_located(expr),
))
} else {
None
pub fn debugger_call(checker: &mut Checker, expr: &Expr, func: &Expr) {
if let Some(call_path) = checker.resolve_call_path(func) {
if DEBUGGERS.iter().any(|target| call_path == *target) {
checker.diagnostics.push(Diagnostic::new(
violations::Debugger(DebuggerUsingType::Call(format_call_path(&call_path))),
Range::from_located(expr),
));
}
}
}
@@ -49,23 +46,25 @@ pub fn debugger_import(stmt: &Stmt, module: Option<&str>, name: &str) -> Option<
}
if let Some(module) = module {
if let Some((module_name, member)) = DEBUGGERS
.iter()
.find(|(module_name, member)| module_name == &module && member == &name)
{
let mut call_path = module.split('.').collect::<Vec<_>>();
call_path.push(name);
if DEBUGGERS.iter().any(|target| call_path == **target) {
return Some(Diagnostic::new(
violations::Debugger(DebuggerUsingType::Import(format!("{module_name}.{member}"))),
violations::Debugger(DebuggerUsingType::Import(format_call_path(&call_path))),
Range::from_located(stmt),
));
}
} else {
let parts = name.split('.').collect::<Vec<_>>();
if DEBUGGERS
.iter()
.any(|call_path| call_path[..call_path.len() - 1] == parts)
{
return Some(Diagnostic::new(
violations::Debugger(DebuggerUsingType::Import(name.to_string())),
Range::from_located(stmt),
));
}
} else if DEBUGGERS
.iter()
.any(|(module_name, ..)| module_name == &name)
{
return Some(Diagnostic::new(
violations::Debugger(DebuggerUsingType::Import(name.to_string())),
Range::from_located(stmt),
));
}
None
}
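The refactor above replaces `(module, member)` tuple matching with direct comparison against full call-path targets. A minimal standalone sketch of that matching style (not ruff's actual code; the `DEBUGGERS` table mirrors the entries shown at the top of the file):

```rust
// Debugger targets stored as full call paths rather than
// (module, member) pairs, as in the new version of the rule.
const DEBUGGERS: &[&[&str]] = &[
    &["pdb", "set_trace"],
    &["celery", "contrib", "rdb", "set_trace"],
    &["builtins", "breakpoint"],
    &["", "breakpoint"],
];

/// Returns true if the resolved call path names a known debugger entry point.
fn is_debugger_call(call_path: &[&str]) -> bool {
    // A resolved call path can be compared against each target
    // with plain slice equality.
    DEBUGGERS.iter().any(|target| call_path == *target)
}

fn main() {
    assert!(is_debugger_call(&["pdb", "set_trace"]));
    assert!(!is_debugger_call(&["logging", "info"]));
    println!("ok");
}
```

The win over the old form is that aliasing and `from`-imports are handled once inside `resolve_call_path`, so rules only compare fully resolved paths.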


@@ -1,6 +1,6 @@
---
source: src/flake8_debugger/mod.rs
expression: checks
expression: diagnostics
---
- kind:
Debugger:
@@ -70,7 +70,7 @@ expression: checks
parent: ~
- kind:
Debugger:
Call: breakpoint
Call: builtins.breakpoint
location:
row: 11
column: 0
@@ -81,7 +81,7 @@ expression: checks
parent: ~
- kind:
Debugger:
Call: set_trace
Call: pdb.set_trace
location:
row: 12
column: 0
@@ -92,7 +92,7 @@ expression: checks
parent: ~
- kind:
Debugger:
Call: set_trace
Call: celery.contrib.rdb.set_trace
location:
row: 13
column: 0


@@ -1,7 +1,7 @@
use log::error;
use rustpython_ast::{Expr, Keyword, Stmt, StmtKind};
use crate::ast::helpers::{collect_call_paths, dealias_call_path, is_const_none, match_call_path};
use crate::ast::helpers::is_const_none;
use crate::ast::types::Range;
use crate::autofix::helpers;
use crate::checkers::ast::Checker;
@@ -11,8 +11,11 @@ use crate::violations;
/// T201, T203
pub fn print_call(checker: &mut Checker, func: &Expr, keywords: &[Keyword]) {
let mut diagnostic = {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if match_call_path(&call_path, "", "print", &checker.from_imports) {
let call_path = checker.resolve_call_path(func);
if call_path
.as_ref()
.map_or(false, |call_path| *call_path == ["", "print"])
{
// If the print call has a `file=` argument (that isn't `None`, `"sys.stdout"`,
// or `"sys.stderr"`), don't trigger T201.
if let Some(keyword) = keywords
@@ -20,16 +23,21 @@ pub fn print_call(checker: &mut Checker, func: &Expr, keywords: &[Keyword]) {
.find(|keyword| keyword.node.arg.as_ref().map_or(false, |arg| arg == "file"))
{
if !is_const_none(&keyword.node.value) {
let call_path = collect_call_paths(&keyword.node.value);
if !(match_call_path(&call_path, "sys", "stdout", &checker.from_imports)
|| match_call_path(&call_path, "sys", "stderr", &checker.from_imports))
if checker
.resolve_call_path(&keyword.node.value)
.map_or(true, |call_path| {
call_path != ["sys", "stdout"] && call_path != ["sys", "stderr"]
})
{
return;
}
}
}
Diagnostic::new(violations::PrintFound, Range::from_located(func))
} else if match_call_path(&call_path, "pprint", "pprint", &checker.from_imports) {
} else if call_path
.as_ref()
.map_or(false, |call_path| *call_path == ["pprint", "pprint"])
{
Diagnostic::new(violations::PPrintFound, Range::from_located(func))
} else {
return;
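The hunk above uses two different `map_or` defaults on the `Option` returned by `resolve_call_path`, and the asymmetry is deliberate. A dependency-free sketch with hypothetical helpers (not ruff's API) makes the two defaults explicit:

```rust
// - `map_or(false, ..)`: an unresolvable path is NOT treated as a
//   `print` call, avoiding false positives.
fn is_print(path: Option<&[&str]>) -> bool {
    path.map_or(false, |p| p == ["", "print"])
}

// - `map_or(true, ..)`: an unresolvable `file=` argument IS treated as
//   a non-standard stream, so the rule bails out rather than flagging.
fn file_is_nonstandard(path: Option<&[&str]>) -> bool {
    path.map_or(true, |p| p != ["sys", "stdout"] && p != ["sys", "stderr"])
}

fn main() {
    assert!(is_print(Some(&["", "print"])));
    assert!(!is_print(None)); // unresolvable: not a print call
    assert!(file_is_nonstandard(None)); // unresolvable: assume custom file
    assert!(!file_is_nonstandard(Some(&["sys", "stdout"])));
    println!("ok");
}
```

In short, the default encodes which way the rule should err when resolution fails: toward silence for detection, toward early exit for suppression.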


@@ -4,7 +4,7 @@ use super::helpers::{
get_mark_decorators, get_mark_name, is_abstractmethod_decorator, is_pytest_fixture,
is_pytest_yield_fixture, keyword_is_literal,
};
use crate::ast::helpers::{collect_arg_names, collect_call_paths};
use crate::ast::helpers::{collect_arg_names, collect_call_path};
use crate::ast::types::Range;
use crate::ast::visitor;
use crate::ast::visitor::Visitor;
@@ -50,7 +50,7 @@ where
}
}
ExprKind::Call { func, .. } => {
if collect_call_paths(func) == vec!["request", "addfinalizer"] {
if collect_call_path(func) == vec!["request", "addfinalizer"] {
self.addfinalizer_call = Some(expr);
};
visitor::walk_expr(self, expr);


@@ -1,7 +1,7 @@
use num_traits::identities::Zero;
use rustpython_ast::{Constant, Expr, ExprKind, Keyword};
use crate::ast::helpers::{collect_call_paths, compose_call_path, match_module_member};
use crate::ast::helpers::collect_call_path;
use crate::checkers::ast::Checker;
const ITERABLE_INITIALIZERS: &[&str] = &["dict", "frozenset", "list", "tuple", "set"];
@@ -14,55 +14,40 @@ pub fn get_mark_decorators(decorators: &[Expr]) -> Vec<&Expr> {
}
pub fn get_mark_name(decorator: &Expr) -> &str {
collect_call_paths(decorator).last().unwrap()
collect_call_path(decorator).last().unwrap()
}
pub fn is_pytest_fail(call: &Expr, checker: &Checker) -> bool {
match_module_member(
call,
"pytest",
"fail",
&checker.from_imports,
&checker.import_aliases,
)
checker
.resolve_call_path(call)
.map_or(false, |call_path| call_path == ["pytest", "fail"])
}
pub fn is_pytest_fixture(decorator: &Expr, checker: &Checker) -> bool {
match_module_member(
decorator,
"pytest",
"fixture",
&checker.from_imports,
&checker.import_aliases,
)
checker
.resolve_call_path(decorator)
.map_or(false, |call_path| call_path == ["pytest", "fixture"])
}
pub fn is_pytest_mark(decorator: &Expr) -> bool {
if let Some(qualname) = compose_call_path(decorator) {
qualname.starts_with("pytest.mark.")
let segments = collect_call_path(decorator);
if segments.len() > 2 {
segments[0] == "pytest" && segments[1] == "mark"
} else {
false
}
}
pub fn is_pytest_yield_fixture(decorator: &Expr, checker: &Checker) -> bool {
match_module_member(
decorator,
"pytest",
"yield_fixture",
&checker.from_imports,
&checker.import_aliases,
)
checker
.resolve_call_path(decorator)
.map_or(false, |call_path| call_path == ["pytest", "yield_fixture"])
}
pub fn is_abstractmethod_decorator(decorator: &Expr, checker: &Checker) -> bool {
match_module_member(
decorator,
"abc",
"abstractmethod",
&checker.from_imports,
&checker.import_aliases,
)
checker
.resolve_call_path(decorator)
.map_or(false, |call_path| call_path == ["abc", "abstractmethod"])
}
/// Check if the expression is a constant that evaluates to false.
@@ -108,13 +93,11 @@ pub fn is_falsy_constant(expr: &Expr) -> bool {
}
pub fn is_pytest_parametrize(decorator: &Expr, checker: &Checker) -> bool {
match_module_member(
decorator,
"pytest.mark",
"parametrize",
&checker.from_imports,
&checker.import_aliases,
)
checker
.resolve_call_path(decorator)
.map_or(false, |call_path| {
call_path == ["pytest", "mark", "parametrize"]
})
}
pub fn keyword_is_literal(kw: &Keyword, literal: &str) -> bool {


@@ -1,22 +1,16 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Expr, ExprKind, Keyword, Stmt, StmtKind, Withitem};
use super::helpers::is_empty_or_null_string;
use crate::ast::helpers::{
collect_call_paths, dealias_call_path, match_call_path, match_module_member,
to_module_and_member,
};
use crate::ast::helpers::{format_call_path, to_call_path};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::{Diagnostic, RuleCode};
use crate::violations;
fn is_pytest_raises(
func: &Expr,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
) -> bool {
match_module_member(func, "pytest", "raises", from_imports, import_aliases)
fn is_pytest_raises(checker: &Checker, func: &Expr) -> bool {
checker
.resolve_call_path(func)
.map_or(false, |call_path| call_path == ["pytest", "raises"])
}
fn is_non_trivial_with_body(body: &[Stmt]) -> bool {
@@ -30,7 +24,7 @@ fn is_non_trivial_with_body(body: &[Stmt]) -> bool {
}
pub fn raises_call(checker: &mut Checker, func: &Expr, args: &[Expr], keywords: &[Keyword]) {
if is_pytest_raises(func, &checker.from_imports, &checker.import_aliases) {
if is_pytest_raises(checker, func) {
if checker.settings.enabled.contains(&RuleCode::PT010) {
if args.is_empty() && keywords.is_empty() {
checker.diagnostics.push(Diagnostic::new(
@@ -62,9 +56,7 @@ pub fn complex_raises(checker: &mut Checker, stmt: &Stmt, items: &[Withitem], bo
let mut is_too_complex = false;
let raises_called = items.iter().any(|item| match &item.context_expr.node {
ExprKind::Call { func, .. } => {
is_pytest_raises(func, &checker.from_imports, &checker.import_aliases)
}
ExprKind::Call { func, .. } => is_pytest_raises(checker, func),
_ => false,
});
@@ -101,26 +93,24 @@ pub fn complex_raises(checker: &mut Checker, stmt: &Stmt, items: &[Withitem], bo
/// PT011
fn exception_needs_match(checker: &mut Checker, exception: &Expr) {
let call_path = dealias_call_path(collect_call_paths(exception), &checker.import_aliases);
let is_broad_exception = checker
.settings
.flake8_pytest_style
.raises_require_match_for
.iter()
.chain(
&checker
.settings
.flake8_pytest_style
.raises_extend_require_match_for,
)
.map(|target| to_module_and_member(target))
.any(|(module, member)| match_call_path(&call_path, module, member, &checker.from_imports));
if is_broad_exception {
checker.diagnostics.push(Diagnostic::new(
violations::RaisesTooBroad(call_path.join(".")),
Range::from_located(exception),
));
if let Some(call_path) = checker.resolve_call_path(exception) {
let is_broad_exception = checker
.settings
.flake8_pytest_style
.raises_require_match_for
.iter()
.chain(
&checker
.settings
.flake8_pytest_style
.raises_extend_require_match_for,
)
.any(|target| call_path == to_call_path(target));
if is_broad_exception {
checker.diagnostics.push(Diagnostic::new(
violations::RaisesTooBroad(format_call_path(&call_path)),
Range::from_located(exception),
));
}
}
}


@@ -1,6 +1,6 @@
use rustpython_ast::{Constant, Expr, ExprKind};
use crate::ast::helpers::{create_expr, match_module_member, unparse_expr};
use crate::ast::helpers::{create_expr, unparse_expr};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::fix::Fix;
@@ -16,21 +16,9 @@ pub fn use_capital_environment_variables(checker: &mut Checker, expr: &Expr) {
}
// Check `os.environ.get('foo')` and `os.getenv('foo')`.
let is_os_environ_get = match_module_member(
expr,
"os.environ",
"get",
&checker.from_imports,
&checker.import_aliases,
);
let is_os_getenv = match_module_member(
expr,
"os",
"getenv",
&checker.from_imports,
&checker.import_aliases,
);
if !(is_os_environ_get || is_os_getenv) {
if !checker.resolve_call_path(expr).map_or(false, |call_path| {
call_path == ["os", "environ", "get"] || call_path == ["os", "getenv"]
}) {
return;
}


@@ -149,25 +149,13 @@ pub fn use_ternary_operator(checker: &mut Checker, stmt: &Stmt, parent: Option<&
}
// Avoid suggesting ternary for `if sys.version_info >= ...`-style checks.
if contains_call_path(
test,
"sys",
"version_info",
&checker.import_aliases,
&checker.from_imports,
) {
if contains_call_path(checker, test, &["sys", "version_info"]) {
return;
}
// Avoid suggesting ternary for `if sys.platform.startswith("...")`-style
// checks.
if contains_call_path(
test,
"sys",
"platform",
&checker.import_aliases,
&checker.from_imports,
) {
if contains_call_path(checker, test, &["sys", "platform"]) {
return;
}


@@ -1,7 +1,6 @@
use rustpython_ast::Expr;
use rustpython_parser::ast::StmtKind;
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
@@ -9,12 +8,10 @@ use crate::violations;
/// SIM115
pub fn open_file_with_context_handler(checker: &mut Checker, func: &Expr) {
if match_call_path(
&dealias_call_path(collect_call_paths(func), &checker.import_aliases),
"",
"open",
&checker.from_imports,
) {
if checker
.resolve_call_path(func)
.map_or(false, |call_path| call_path == ["", "open"])
{
if checker.is_builtin("open") {
match checker.current_stmt().node {
StmtKind::With { .. } => (),


@@ -74,25 +74,4 @@ mod tests {
insta::assert_yaml_snapshot!(diagnostics);
Ok(())
}
#[test]
fn banned_api_false_positives() -> Result<()> {
let diagnostics = test_path(
Path::new("./resources/test/fixtures/flake8_tidy_imports/TID251_false_positives.py"),
&Settings {
flake8_tidy_imports: flake8_tidy_imports::settings::Settings {
banned_api: FxHashMap::from_iter([(
"typing.TypedDict".to_string(),
BannedApi {
msg: "Use typing_extensions.TypedDict instead.".to_string(),
},
)]),
..Default::default()
},
..Settings::for_rules(vec![RuleCode::TID251])
},
)?;
insta::assert_yaml_snapshot!(diagnostics);
Ok(())
}
}


@@ -2,7 +2,6 @@ use rustc_hash::FxHashMap;
use rustpython_ast::{Alias, Expr, Located, Stmt};
use super::settings::BannedApi;
use crate::ast::helpers::match_call_path;
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::flake8_tidy_imports::settings::Strictness;
@@ -75,15 +74,10 @@ pub fn name_or_parent_is_banned<T>(
}
/// TID251
pub fn banned_attribute_access(
checker: &mut Checker,
call_path: &[&str],
expr: &Expr,
banned_apis: &FxHashMap<String, BannedApi>,
) {
for (banned_path, ban) in banned_apis {
if let Some((module, member)) = banned_path.rsplit_once('.') {
if match_call_path(call_path, module, member, &checker.from_imports) {
pub fn banned_attribute_access(checker: &mut Checker, expr: &Expr) {
if let Some(call_path) = checker.resolve_call_path(expr) {
for (banned_path, ban) in &checker.settings.flake8_tidy_imports.banned_api {
if call_path == banned_path.split('.').collect::<Vec<_>>() {
checker.diagnostics.push(Diagnostic::new(
violations::BannedApi {
name: banned_path.to_string(),

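The new `banned_attribute_access` compares a resolved call path against each banned API stored as a dotted string. A hypothetical standalone sketch of that comparison (names are illustrative, not ruff's actual helpers):

```rust
/// Returns true if the resolved call path matches a banned API path
/// stored as a dotted string, e.g. "typing.TypedDict".
fn is_banned(call_path: &[&str], banned_path: &str) -> bool {
    // Split the configured string on '.' and compare segment-wise;
    // `&[&str] == Vec<&str>` works via std's PartialEq impls.
    call_path == banned_path.split('.').collect::<Vec<_>>()
}

fn main() {
    assert!(is_banned(&["typing", "TypedDict"], "typing.TypedDict"));
    assert!(!is_banned(&["typing", "List"], "typing.TypedDict"));
    println!("ok");
}
```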

@@ -1,41 +0,0 @@
---
source: src/flake8_tidy_imports/mod.rs
expression: checks
---
- kind:
BannedApi:
name: typing.TypedDict
message: Use typing_extensions.TypedDict instead.
location:
row: 2
column: 7
end_location:
row: 2
column: 23
fix: ~
parent: ~
- kind:
BannedApi:
name: typing.TypedDict
message: Use typing_extensions.TypedDict instead.
location:
row: 7
column: 0
end_location:
row: 7
column: 16
fix: ~
parent: ~
- kind:
BannedApi:
name: typing.TypedDict
message: Use typing_extensions.TypedDict instead.
location:
row: 11
column: 4
end_location:
row: 11
column: 20
fix: ~
parent: ~


@@ -118,11 +118,10 @@ pub fn unused_arguments(
..
}) => {
match function_type::classify(
checker,
parent,
name,
decorator_list,
&checker.from_imports,
&checker.import_aliases,
&checker.settings.pep8_naming.classmethod_decorators,
&checker.settings.pep8_naming.staticmethod_decorators,
) {


@@ -66,7 +66,7 @@ pub fn relativize_path(path: &Path) -> Cow<str> {
}
/// Read a file's contents from disk.
pub(crate) fn read_file<P: AsRef<Path>>(path: P) -> Result<String> {
pub fn read_file<P: AsRef<Path>>(path: P) -> Result<String> {
let file = File::open(path)?;
let mut buf_reader = BufReader::new(file);
let mut contents = String::new();


@@ -383,11 +383,12 @@ fn categorize_imports<'a>(
block_by_type
}
fn order_imports(
block: ImportBlock,
fn order_imports<'a>(
block: ImportBlock<'a>,
order_by_type: bool,
relative_imports_order: RelatveImportsOrder,
) -> OrderedImportBlock {
classes: &'a BTreeSet<String>,
) -> OrderedImportBlock<'a> {
let mut ordered = OrderedImportBlock::default();
// Sort `StmtKind::Import`.
@@ -466,7 +467,7 @@ fn order_imports(
aliases
.into_iter()
.sorted_by(|(alias1, _), (alias2, _)| {
cmp_members(alias1, alias2, order_by_type)
cmp_members(alias1, alias2, order_by_type, classes)
})
.collect::<Vec<(AliasData, CommentSet)>>(),
)
@@ -479,7 +480,7 @@ fn order_imports(
(None, Some(_)) => Ordering::Less,
(Some(_), None) => Ordering::Greater,
(Some((alias1, _)), Some((alias2, _))) => {
cmp_members(alias1, alias2, order_by_type)
cmp_members(alias1, alias2, order_by_type, classes)
}
},
)
@@ -556,6 +557,7 @@ pub fn format_imports(
relative_imports_order: RelatveImportsOrder,
single_line_exclusions: &BTreeSet<String>,
split_on_trailing_comma: bool,
classes: &BTreeSet<String>,
) -> String {
let trailer = &block.trailer;
let block = annotate_imports(&block.imports, comments, locator, split_on_trailing_comma);
@@ -578,7 +580,8 @@ pub fn format_imports(
// Generate replacement source code.
let mut is_first_block = true;
for import_block in block_by_type.into_values() {
let mut imports = order_imports(import_block, order_by_type, relative_imports_order);
let mut imports =
order_imports(import_block, order_by_type, relative_imports_order, classes);
if force_single_line {
imports = force_single_line_imports(imports, single_line_exclusions);
@@ -698,6 +701,7 @@ mod tests {
#[test_case(Path::new("split.py"))]
#[test_case(Path::new("trailing_suffix.py"))]
#[test_case(Path::new("type_comments.py"))]
#[test_case(Path::new("order_by_type_with_custom_classes.py"))]
fn default(path: &Path) -> Result<()> {
let snapshot = format!("{}", path.to_string_lossy());
let diagnostics = test_path(
@@ -818,6 +822,36 @@ mod tests {
Ok(())
}
#[test_case(Path::new("order_by_type_with_custom_classes.py"))]
fn order_by_type_with_custom_classes(path: &Path) -> Result<()> {
let snapshot = format!(
"order_by_type_with_custom_classes_{}",
path.to_string_lossy()
);
let mut diagnostics = test_path(
Path::new("./resources/test/fixtures/isort")
.join(path)
.as_path(),
&Settings {
isort: isort::settings::Settings {
order_by_type: true,
classes: BTreeSet::from([
"SVC".to_string(),
"SELU".to_string(),
"N_CLASS".to_string(),
"CLASS".to_string(),
]),
..isort::settings::Settings::default()
},
src: vec![Path::new("resources/test/fixtures/isort").to_path_buf()],
..Settings::for_rule(RuleCode::I001)
},
)?;
diagnostics.sort_by_key(|diagnostic| diagnostic.location);
insta::assert_yaml_snapshot!(snapshot, diagnostics);
Ok(())
}
#[test_case(Path::new("force_sort_within_sections.py"))]
fn force_sort_within_sections(path: &Path) -> Result<()> {
let snapshot = format!("force_sort_within_sections_{}", path.to_string_lossy());


@@ -84,6 +84,7 @@ pub fn organize_imports(
settings.isort.relative_imports_order,
&settings.isort.single_line_exclusions,
settings.isort.split_on_trailing_comma,
&settings.isort.classes,
);
// Expand the span the entire range, including leading and trailing space.


@@ -171,6 +171,16 @@ pub struct Options {
)]
/// Add the specified import line to all files.
pub required_imports: Option<Vec<String>>,
#[option(
default = r#"[]"#,
value_type = "Vec<String>",
example = r#"
classes = ["SVC"]
"#
)]
/// An override list of tokens to always recognize as a class for
/// `order-by-type`, regardless of casing.
pub classes: Option<Vec<String>>,
}
#[derive(Debug, Hash)]
@@ -188,6 +198,7 @@ pub struct Settings {
pub relative_imports_order: RelatveImportsOrder,
pub single_line_exclusions: BTreeSet<String>,
pub split_on_trailing_comma: bool,
pub classes: BTreeSet<String>,
}
impl Default for Settings {
@@ -205,6 +216,7 @@ impl Default for Settings {
relative_imports_order: RelatveImportsOrder::default(),
single_line_exclusions: BTreeSet::new(),
split_on_trailing_comma: true,
classes: BTreeSet::new(),
}
}
}
@@ -228,6 +240,7 @@ impl From<Options> for Settings {
options.single_line_exclusions.unwrap_or_default(),
),
split_on_trailing_comma: options.split_on_trailing_comma.unwrap_or(true),
classes: BTreeSet::from_iter(options.classes.unwrap_or_default()),
}
}
}
@@ -247,6 +260,7 @@ impl From<Settings> for Options {
relative_imports_order: Some(settings.relative_imports_order),
single_line_exclusions: Some(settings.single_line_exclusions.into_iter().collect()),
split_on_trailing_comma: Some(settings.split_on_trailing_comma),
classes: Some(settings.classes.into_iter().collect()),
}
}
}


@@ -0,0 +1,22 @@
---
source: src/isort/mod.rs
expression: diagnostics
---
- kind:
UnsortedImports: ~
location:
row: 1
column: 0
end_location:
row: 5
column: 0
fix:
content: "from subprocess import N_CLASS, PIPE, STDOUT, Popen\n\nfrom module import BASIC, CLASS, CONSTANT, Apple, Class, function\nfrom sklearn.svm import CONST, SVC, Klass, func\nfrom torch.nn import A_CONSTANT, SELU, AClass\n"
location:
row: 1
column: 0
end_location:
row: 5
column: 0
parent: ~


@@ -0,0 +1,22 @@
---
source: src/isort/mod.rs
expression: diagnostics
---
- kind:
UnsortedImports: ~
location:
row: 1
column: 0
end_location:
row: 5
column: 0
fix:
content: "from subprocess import PIPE, STDOUT, N_CLASS, Popen\n\nfrom module import BASIC, CONSTANT, Apple, CLASS, Class, function\nfrom sklearn.svm import CONST, Klass, SVC, func\nfrom torch.nn import A_CONSTANT, AClass, SELU\n"
location:
row: 1
column: 0
end_location:
row: 5
column: 0
parent: ~


@@ -1,5 +1,6 @@
/// See: <https://github.com/PyCQA/isort/blob/12cc5fbd67eebf92eb2213b03c07b138ae1fb448/isort/sorting.py#L13>
use std::cmp::Ordering;
use std::collections::BTreeSet;
use crate::isort::settings::RelatveImportsOrder;
use crate::isort::types::EitherImport::{Import, ImportFrom};
@@ -13,8 +14,11 @@ pub enum Prefix {
Variables,
}
fn prefix(name: &str) -> Prefix {
if name.len() > 1 && string::is_upper(name) {
fn prefix(name: &str, classes: &BTreeSet<String>) -> Prefix {
if classes.contains(name) {
// Ex) `CLASS`
Prefix::Classes
} else if name.len() > 1 && string::is_upper(name) {
// Ex) `CONSTANT`
Prefix::Constants
} else if name.chars().next().map_or(false, char::is_uppercase) {
@@ -39,10 +43,15 @@ pub fn cmp_modules(alias1: &AliasData, alias2: &AliasData) -> Ordering {
}
/// Compare two member imports within `StmtKind::ImportFrom` blocks.
pub fn cmp_members(alias1: &AliasData, alias2: &AliasData, order_by_type: bool) -> Ordering {
pub fn cmp_members(
alias1: &AliasData,
alias2: &AliasData,
order_by_type: bool,
classes: &BTreeSet<String>,
) -> Ordering {
if order_by_type {
prefix(alias1.name)
.cmp(&prefix(alias2.name))
prefix(alias1.name, classes)
.cmp(&prefix(alias2.name, classes))
.then_with(|| cmp_modules(alias1, alias2))
} else {
cmp_modules(alias1, alias2)

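The new `classes` override checks the set before any casing heuristic runs. A dependency-free sketch of that precedence, assuming a simplified `Prefix` enum and approximating ruff's `string::is_upper` with a char check (both are assumptions, not the actual definitions):

```rust
use std::collections::BTreeSet;

#[derive(Debug, PartialEq, Eq)]
enum Prefix {
    Constants,
    Classes,
    Variables,
}

fn prefix(name: &str, classes: &BTreeSet<String>) -> Prefix {
    if classes.contains(name) {
        // Ex) `SVC`, forced to sort as a class via the `classes` setting
        Prefix::Classes
    } else if name.len() > 1 && name.chars().all(|c| !c.is_lowercase()) {
        // Ex) `CONSTANT`
        Prefix::Constants
    } else if name.chars().next().map_or(false, char::is_uppercase) {
        // Ex) `Class`
        Prefix::Classes
    } else {
        // Ex) `variable`
        Prefix::Variables
    }
}

fn main() {
    let classes = BTreeSet::from(["SVC".to_string()]);
    // Without the override, `SVC` would classify as a constant
    // (all uppercase); the override wins.
    assert_eq!(prefix("SVC", &classes), Prefix::Classes);
    assert_eq!(prefix("CONST", &classes), Prefix::Constants);
    assert_eq!(prefix("Klass", &classes), Prefix::Classes);
    println!("ok");
}
```

This is exactly the difference between the two snapshot fixtures above: with `classes = ["SVC", ...]`, `SVC` sorts among the classes instead of among the constants.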

@@ -12,14 +12,14 @@
)]
#![forbid(unsafe_code)]
mod ast;
mod autofix;
mod cache;
extern crate core;
pub mod ast;
pub mod autofix;
pub mod cache;
mod checkers;
pub mod cli;
mod cst;
mod diagnostics;
mod directives;
pub mod directives;
mod doc_lines;
mod docstrings;
mod eradicate;
@@ -47,7 +47,6 @@ pub mod flake8_tidy_imports;
mod flake8_unused_arguments;
pub mod fs;
mod isort;
pub mod iterators;
mod lex;
pub mod linter;
pub mod logging;
@@ -56,7 +55,6 @@ pub mod message;
mod noqa;
mod pandas_vet;
pub mod pep8_naming;
pub mod printer;
mod pycodestyle;
pub mod pydocstyle;
mod pyflakes;
@@ -67,23 +65,20 @@ mod pyupgrade;
pub mod registry;
pub mod resolver;
mod ruff;
mod rustpython_helpers;
pub mod rustpython_helpers;
pub mod settings;
pub mod source_code;
mod vendor;
mod violation;
mod violations;
pub mod violations;
mod visibility;
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(not(target_family = "wasm"))] {
pub mod commands;
mod packaging;
pub mod packaging;
#[cfg(all(feature = "update-informer"))]
pub mod updates;
mod lib_native;
pub use lib_native::check;


@@ -26,7 +26,7 @@ const CARGO_PKG_REPOSITORY: &str = env!("CARGO_PKG_REPOSITORY");
/// Generate `Diagnostic`s from the source code contents at the
/// given `Path`.
#[allow(clippy::too_many_arguments)]
pub(crate) fn check_path(
pub fn check_path(
path: &Path,
package: Option<&Path>,
contents: &str,


@@ -1,42 +0,0 @@
#![allow(
clippy::collapsible_else_if,
clippy::collapsible_if,
clippy::implicit_hasher,
clippy::match_same_arms,
clippy::missing_errors_doc,
clippy::missing_panics_doc,
clippy::module_name_repetitions,
clippy::must_use_candidate,
clippy::similar_names,
clippy::too_many_lines
)]
#![forbid(unsafe_code)]
use std::process::ExitCode;
use cfg_if::cfg_if;
use colored::Colorize;
cfg_if! {
if #[cfg(not(target_family = "wasm"))] {
mod main_native;
use main_native::inner_main;
} else {
use anyhow::Result;
#[allow(clippy::unnecessary_wraps)]
fn inner_main() -> Result<ExitCode> {
Ok(ExitCode::FAILURE)
}
}
}
fn main() -> ExitCode {
match inner_main() {
Ok(code) => code,
Err(err) => {
eprintln!("{} {err:?}", "error".red().bold());
ExitCode::FAILURE
}
}
}


@@ -1,6 +1,6 @@
use std::cmp::Ordering;
use rustpython_parser::ast::Location;
pub use rustpython_parser::ast::Location;
use serde::{Deserialize, Serialize};
use crate::ast::types::Range;


@@ -1,8 +1,7 @@
use itertools::Itertools;
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Stmt, StmtKind};
use crate::ast::helpers::{collect_call_paths, match_call_path};
use crate::checkers::ast::Checker;
use crate::python::string::{is_lower, is_upper};
pub fn is_camelcase(name: &str) -> bool {
@@ -23,19 +22,13 @@ pub fn is_acronym(name: &str, asname: &str) -> bool {
name.chars().filter(|c| c.is_uppercase()).join("") == asname
}
pub fn is_namedtuple_assignment(
stmt: &Stmt,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
) -> bool {
pub fn is_namedtuple_assignment(checker: &Checker, stmt: &Stmt) -> bool {
let StmtKind::Assign { value, .. } = &stmt.node else {
return false;
};
match_call_path(
&collect_call_paths(value),
"collections",
"namedtuple",
from_imports,
)
checker.resolve_call_path(value).map_or(false, |call_path| {
call_path == ["collections", "namedtuple"]
})
}
#[cfg(test)]


@@ -1,4 +1,3 @@
use rustc_hash::{FxHashMap, FxHashSet};
use rustpython_ast::{Arg, Arguments, Expr, ExprKind, Stmt};
use crate::ast::function_type;
@@ -6,7 +5,6 @@ use crate::ast::helpers::identifier_range;
use crate::ast::types::{Range, Scope, ScopeKind};
use crate::checkers::ast::Checker;
use crate::pep8_naming::helpers;
use crate::pep8_naming::settings::Settings;
use crate::python::string::{self};
use crate::registry::Diagnostic;
use crate::source_code::Locator;
@@ -53,23 +51,20 @@ pub fn invalid_argument_name(name: &str, arg: &Arg) -> Option<Diagnostic> {
/// N804
pub fn invalid_first_argument_name_for_class_method(
checker: &Checker,
scope: &Scope,
name: &str,
decorator_list: &[Expr],
args: &Arguments,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
settings: &Settings,
) -> Option<Diagnostic> {
if !matches!(
function_type::classify(
checker,
scope,
name,
decorator_list,
from_imports,
import_aliases,
&settings.classmethod_decorators,
&settings.staticmethod_decorators,
&checker.settings.pep8_naming.classmethod_decorators,
&checker.settings.pep8_naming.staticmethod_decorators,
),
function_type::FunctionType::ClassMethod
) {
@@ -95,23 +90,20 @@ pub fn invalid_first_argument_name_for_class_method(
/// N805
pub fn invalid_first_argument_name_for_method(
checker: &Checker,
scope: &Scope,
name: &str,
decorator_list: &[Expr],
args: &Arguments,
from_imports: &FxHashMap<&str, FxHashSet<&str>>,
import_aliases: &FxHashMap<&str, &str>,
settings: &Settings,
) -> Option<Diagnostic> {
if !matches!(
function_type::classify(
checker,
scope,
name,
decorator_list,
from_imports,
import_aliases,
&settings.classmethod_decorators,
&settings.staticmethod_decorators,
&checker.settings.pep8_naming.classmethod_decorators,
&checker.settings.pep8_naming.staticmethod_decorators,
),
function_type::FunctionType::Method
) {
@@ -134,9 +126,7 @@ pub fn non_lowercase_variable_in_function(
stmt: &Stmt,
name: &str,
) {
if name.to_lowercase() != name
&& !helpers::is_namedtuple_assignment(stmt, &checker.from_imports)
{
if name.to_lowercase() != name && !helpers::is_namedtuple_assignment(checker, stmt) {
checker.diagnostics.push(Diagnostic::new(
violations::NonLowercaseVariableInFunction(name.to_string()),
Range::from_located(expr),
@@ -243,9 +233,7 @@ pub fn mixed_case_variable_in_class_scope(
stmt: &Stmt,
name: &str,
) {
if helpers::is_mixed_case(name)
&& !helpers::is_namedtuple_assignment(stmt, &checker.from_imports)
{
if helpers::is_mixed_case(name) && !helpers::is_namedtuple_assignment(checker, stmt) {
checker.diagnostics.push(Diagnostic::new(
violations::MixedCaseVariableInClassScope(name.to_string()),
Range::from_located(expr),
@@ -260,9 +248,7 @@ pub fn mixed_case_variable_in_global_scope(
stmt: &Stmt,
name: &str,
) {
if helpers::is_mixed_case(name)
&& !helpers::is_namedtuple_assignment(stmt, &checker.from_imports)
{
if helpers::is_mixed_case(name) && !helpers::is_namedtuple_assignment(checker, stmt) {
checker.diagnostics.push(Diagnostic::new(
violations::MixedCaseVariableInGlobalScope(name.to_string()),
Range::from_located(expr),


@@ -1,10 +1,10 @@
---
source: src/pyflakes/mod.rs
expression: checks
expression: diagnostics
---
- kind:
UnusedImport:
- background.BackgroundTasks
- ".background.BackgroundTasks"
- false
- false
location:
@@ -24,7 +24,7 @@ expression: checks
parent: ~
- kind:
UnusedImport:
- datastructures.UploadFile
- ".datastructures.UploadFile"
- false
- false
location:
@@ -48,18 +48,18 @@ expression: checks
- false
- false
location:
row: 17
row: 16
column: 7
end_location:
row: 17
row: 16
column: 17
fix:
content: ""
location:
row: 17
row: 16
column: 0
end_location:
row: 18
row: 17
column: 0
parent: ~
- kind:
@@ -68,18 +68,18 @@ expression: checks
- false
- false
location:
row: 20
row: 19
column: 7
end_location:
row: 20
row: 19
column: 35
fix:
content: ""
location:
row: 20
row: 19
column: 0
end_location:
row: 21
row: 20
column: 0
parent: ~


@@ -1,6 +1,5 @@
use rustpython_ast::Expr;
use crate::ast::helpers::{collect_call_paths, dealias_call_path, match_call_path};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
@@ -8,9 +7,9 @@ use crate::violations;
/// PGH002 - deprecated use of logging.warn
pub fn deprecated_log_warn(checker: &mut Checker, func: &Expr) {
let call_path = dealias_call_path(collect_call_paths(func), &checker.import_aliases);
if call_path == ["log", "warn"]
|| match_call_path(&call_path, "logging", "warn", &checker.from_imports)
if checker
.resolve_call_path(func)
.map_or(false, |call_path| call_path == ["logging", "warn"])
{
checker.diagnostics.push(Diagnostic::new(
violations::DeprecatedLogWarn,


@@ -1,6 +1,6 @@
---
source: src/pygrep_hooks/mod.rs
expression: checks
expression: diagnostics
---
- kind:
DeprecatedLogWarn: ~
@@ -19,27 +19,7 @@ expression: checks
column: 0
end_location:
row: 5
column: 8
fix: ~
parent: ~
- kind:
DeprecatedLogWarn: ~
location:
row: 6
column: 0
end_location:
row: 6
column: 4
fix: ~
parent: ~
- kind:
DeprecatedLogWarn: ~
location:
row: 15
column: 4
end_location:
row: 15
column: 8
fix: ~
parent: ~


@@ -17,6 +17,7 @@ mod tests {
#[test_case(RuleCode::PLE0117, Path::new("nonlocal_without_binding.py"); "PLE0117")]
#[test_case(RuleCode::PLE0118, Path::new("used_prior_global_declaration.py"); "PLE0118")]
#[test_case(RuleCode::PLE1142, Path::new("await_outside_async.py"); "PLE1142")]
#[test_case(RuleCode::PLR0133, Path::new("constant_comparison.py"); "PLR0133")]
#[test_case(RuleCode::PLR0206, Path::new("property_with_parameters.py"); "PLR0206")]
#[test_case(RuleCode::PLR0402, Path::new("import_aliasing.py"); "PLR0402")]
#[test_case(RuleCode::PLR1701, Path::new("consider_merging_isinstance.py"); "PLR1701")]
@@ -27,6 +28,7 @@ mod tests {
#[test_case(RuleCode::PLR1722, Path::new("consider_using_sys_exit_4.py"); "PLR1722_4")]
#[test_case(RuleCode::PLR1722, Path::new("consider_using_sys_exit_5.py"); "PLR1722_5")]
#[test_case(RuleCode::PLR1722, Path::new("consider_using_sys_exit_6.py"); "PLR1722_6")]
#[test_case(RuleCode::PLR2004, Path::new("magic_value_comparison.py"); "PLR2004")]
#[test_case(RuleCode::PLW0120, Path::new("useless_else_on_loop.py"); "PLW0120")]
#[test_case(RuleCode::PLW0602, Path::new("global_variable_not_assigned.py"); "PLW0602")]
fn rules(rule_code: RuleCode, path: &Path) -> Result<()> {


@@ -0,0 +1,44 @@
use itertools::Itertools;
use rustpython_ast::{Cmpop, Expr, ExprKind, Located};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
/// PLR0133
pub fn constant_comparison(
checker: &mut Checker,
left: &Expr,
ops: &[Cmpop],
comparators: &[Expr],
) {
for ((left, right), op) in std::iter::once(left)
.chain(comparators.iter())
.tuple_windows::<(&Located<_>, &Located<_>)>()
.zip(ops)
{
if let (
ExprKind::Constant {
value: left_constant,
..
},
ExprKind::Constant {
value: right_constant,
..
},
) = (&left.node, &right.node)
{
let diagnostic = Diagnostic::new(
violations::ConstantComparison {
left_constant: left_constant.to_string(),
op: op.into(),
right_constant: right_constant.to_string(),
},
Range::from_located(left),
);
checker.diagnostics.push(diagnostic);
};
}
}
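The pairing logic above chains the left operand onto the comparators and zips adjacent pairs with their operators (`a < b == c` arrives as left=`a`, comparators=`[b, c]`, ops=`[Lt, Eq]`). A sketch of the same pairing without the `itertools` dependency, using a stand-in `Node` enum in place of rustpython's `ExprKind::Constant`:

```rust
#[derive(Debug, PartialEq)]
enum Node {
    Const(i64),
    Name(&'static str),
}

/// Collects (left, op, right) triples where both operands are constants,
/// using std's `windows(2)` instead of itertools' `tuple_windows`.
fn constant_pairs<'a>(
    left: &'a Node,
    ops: &'a [&'static str],
    comparators: &'a [Node],
) -> Vec<(&'a Node, &'static str, &'a Node)> {
    let mut operands = vec![left];
    operands.extend(comparators.iter());
    operands
        .windows(2)
        .zip(ops)
        .filter(|(pair, _)| {
            matches!(pair[0], Node::Const(_)) && matches!(pair[1], Node::Const(_))
        })
        .map(|(pair, op)| (pair[0], *op, pair[1]))
        .collect()
}

fn main() {
    // `1 < x == 2`: no adjacent pair is constant-constant.
    let pairs = constant_pairs(
        &Node::Const(1),
        &["<", "=="],
        &[Node::Name("x"), Node::Const(2)],
    );
    assert!(pairs.is_empty());
    // `1 == 1`: one constant-constant pair, which PLR0133 would flag.
    let pairs = constant_pairs(&Node::Const(1), &["=="], &[Node::Const(1)]);
    assert_eq!(pairs.len(), 1);
    println!("ok");
}
```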


@@ -0,0 +1,51 @@
use itertools::Itertools;
use rustpython_ast::{Constant, Expr, ExprKind};
use crate::ast::types::Range;
use crate::checkers::ast::Checker;
use crate::registry::Diagnostic;
use crate::violations;
fn is_magic_value(constant: &Constant) -> bool {
match constant {
Constant::None => false,
// E712 `if True == do_something():`
Constant::Bool(_) => false,
Constant::Str(value) => !matches!(value.as_str(), "" | "__main__"),
Constant::Bytes(_) => true,
Constant::Int(value) => !matches!(value.try_into(), Ok(-1 | 0 | 1)),
Constant::Tuple(_) => true,
Constant::Float(_) => true,
Constant::Complex { .. } => true,
Constant::Ellipsis => true,
}
}
/// PLR2004
pub fn magic_value_comparison(checker: &mut Checker, left: &Expr, comparators: &[Expr]) {
for (left, right) in std::iter::once(left)
.chain(comparators.iter())
.tuple_windows()
{
// If both of the comparators are constant, skip rule for the whole expression.
// R0133: comparison-of-constants
if matches!(left.node, ExprKind::Constant { .. })
&& matches!(right.node, ExprKind::Constant { .. })
{
return;
}
}
for comparison_expr in std::iter::once(left).chain(comparators.iter()) {
if let ExprKind::Constant { value, .. } = &comparison_expr.node {
if is_magic_value(value) {
let diagnostic = Diagnostic::new(
violations::MagicValueComparison(value.to_string()),
Range::from_located(comparison_expr),
);
checker.diagnostics.push(diagnostic);
}
}
}
}
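The `is_magic_value` predicate above whitelists `None`, booleans, the integers `-1`/`0`/`1`, and the strings `""`/`"__main__"`; everything else is a magic value. A hypothetical Python mirror of that logic, written only to illustrate which constants PLR2004 flags (not ruff's API):

```python
def is_magic_value(value):
    # Check bool before int: in Python, True/False are also ints, and the
    # rule never treats booleans as magic (that territory belongs to E712).
    if value is None or isinstance(value, bool):
        return False
    if isinstance(value, str):
        return value not in ("", "__main__")
    if isinstance(value, int):
        return value not in (-1, 0, 1)
    # Bytes, floats, complex numbers, tuples, and Ellipsis are always magic.
    return True

print([is_magic_value(v) for v in (0, 10, "__main__", "Hunter2", True)])
# [False, True, False, True, False]
```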


@@ -1,4 +1,6 @@
pub use await_outside_async::await_outside_async;
pub use constant_comparison::constant_comparison;
pub use magic_value_comparison::magic_value_comparison;
pub use merge_isinstance::merge_isinstance;
pub use misplaced_comparison_constant::misplaced_comparison_constant;
pub use property_with_parameters::property_with_parameters;
@@ -10,6 +12,8 @@ pub use useless_else_on_loop::useless_else_on_loop;
pub use useless_import_alias::useless_import_alias;
mod await_outside_async;
mod constant_comparison;
mod magic_value_comparison;
mod merge_isinstance;
mod misplaced_comparison_constant;
mod property_with_parameters;


@@ -31,28 +31,44 @@ fn get_member_import_name_alias(checker: &Checker, module: &str, member: &str) -
// e.g. module=sys object=exit
// `import sys` -> `sys.exit`
// `import sys as sys2` -> `sys2.exit`
BindingKind::Importation(name, full_name) if full_name == module => {
Some(format!("{name}.{member}"))
BindingKind::Importation(name, full_name) => {
if full_name == &module {
Some(format!("{name}.{member}"))
} else {
None
}
}
// e.g. module=os.path object=join
// `from os.path import join` -> `join`
// `from os.path import join as join2` -> `join2`
BindingKind::FromImportation(name, full_name)
if full_name == &format!("{module}.{member}") =>
{
Some(name.to_string())
            BindingKind::FromImportation(name, full_name) => {
                // Compare against `"{module}.{member}"` without allocating the
                // joined string. Splitting on every `.` would mishandle dotted
                // modules like `os.path`, so split only on the last separator.
                if let Some((prefix, suffix)) = full_name.rsplit_once('.') {
                    if prefix == module && suffix == member {
                        Some((*name).to_string())
                    } else {
                        None
                    }
                } else {
                    None
                }
            }
// e.g. module=os.path object=join
// `from os.path import *` -> `join`
BindingKind::StarImportation(_, name)
if name.as_ref().map(|name| name == module).unwrap_or_default() =>
{
Some(member.to_string())
BindingKind::StarImportation(_, name) => {
if name.as_ref().map(|name| name == module).unwrap_or_default() {
Some(member.to_string())
} else {
None
}
}
// e.g. module=os.path object=join
// `import os.path ` -> `os.path.join`
BindingKind::SubmoduleImportation(_, full_name) if full_name == module => {
Some(format!("{full_name}.{member}"))
BindingKind::SubmoduleImportation(_, full_name) => {
if full_name == &module {
Some(format!("{full_name}.{member}"))
} else {
None
}
}
// Non-imports.
_ => None,
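The refactor above replaces `format!("{module}.{member}")` allocations with direct segment comparisons. A Python sketch of the intended `FromImportation` check — equivalent to comparing `full_name` against the joined string, but splitting on the last separator so dotted modules like `os.path` still match (the `matches_member` helper name is hypothetical):

```python
def matches_member(full_name, module, member):
    # `full_name` must equal "<module>.<member>" exactly; comparing the
    # segments around the last "." avoids building the joined string first.
    prefix, sep, suffix = full_name.rpartition(".")
    return sep == "." and prefix == module and suffix == member

print(matches_member("os.path.join", "os.path", "join"))  # True
print(matches_member("sys.exit", "sys", "exit"))          # True
print(matches_member("os.path.join", "os", "join"))       # False
```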


@@ -0,0 +1,135 @@
---
source: src/pylint/mod.rs
expression: diagnostics
---
- kind:
ConstantComparison:
left_constant: "100"
op: Eq
right_constant: "100"
location:
row: 3
column: 3
end_location:
row: 3
column: 6
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "1"
op: Eq
right_constant: "3"
location:
row: 6
column: 3
end_location:
row: 6
column: 4
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "1"
op: NotEq
right_constant: "3"
location:
row: 9
column: 3
end_location:
row: 9
column: 4
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "4"
op: Eq
right_constant: "3"
location:
row: 13
column: 3
end_location:
row: 13
column: 4
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "1"
op: Gt
right_constant: "0"
location:
row: 23
column: 3
end_location:
row: 23
column: 4
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "1"
op: GtE
right_constant: "0"
location:
row: 29
column: 3
end_location:
row: 29
column: 4
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "1"
op: Lt
right_constant: "0"
location:
row: 35
column: 3
end_location:
row: 35
column: 4
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "1"
op: LtE
right_constant: "0"
location:
row: 41
column: 3
end_location:
row: 41
column: 4
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "'hello'"
op: Eq
right_constant: "''"
location:
row: 51
column: 3
end_location:
row: 51
column: 10
fix: ~
parent: ~
- kind:
ConstantComparison:
left_constant: "True"
op: Eq
right_constant: "False"
location:
row: 58
column: 3
end_location:
row: 58
column: 7
fix: ~
parent: ~


@@ -0,0 +1,55 @@
---
source: src/pylint/mod.rs
expression: diagnostics
---
- kind:
MagicValueComparison: "10"
location:
row: 6
column: 3
end_location:
row: 6
column: 5
fix: ~
parent: ~
- kind:
MagicValueComparison: "2"
location:
row: 36
column: 11
end_location:
row: 36
column: 12
fix: ~
parent: ~
- kind:
MagicValueComparison: "'Hunter2'"
location:
row: 51
column: 21
end_location:
row: 51
column: 30
fix: ~
parent: ~
- kind:
MagicValueComparison: "3.141592653589793"
location:
row: 57
column: 20
end_location:
row: 57
column: 40
fix: ~
parent: ~
- kind:
MagicValueComparison: "b'something'"
location:
row: 66
column: 17
end_location:
row: 66
column: 29
fix: ~
parent: ~

Some files were not shown because too many files have changed in this diff.