Compare commits


41 Commits

Author SHA1 Message Date
Charlie Marsh
31a0b20271 Bump version to 0.0.52 2022-10-03 15:22:58 -04:00
Charlie Marsh
0966bf2c66 Handle multi-import lines (#307) 2022-10-03 15:22:46 -04:00
Charlie Marsh
64d8e25528 Bump version to 0.0.51 2022-10-03 14:08:39 -04:00
Charlie Marsh
b049cced04 Automatically remove unused imports (#298) 2022-10-03 14:08:16 -04:00
Charlie Marsh
bc335f839e Visit lambda arguments prior to deferral (#303) 2022-10-02 20:54:02 -04:00
Charlie Marsh
4819e19ba2 Bump version to 0.0.50 2022-10-02 20:43:30 -04:00
Charlie Marsh
622b8adb79 Avoid falling back to A003 when A001 is disabled (#302) 2022-10-02 20:43:12 -04:00
Charlie Marsh
558d9fcbe3 Enable LibCST-based autofixing for SPR001 (#297) 2022-10-02 19:58:13 -04:00
Charlie Marsh
83f18193c2 Add an end location to Check (#299) 2022-10-02 12:50:42 -04:00
Charlie Marsh
46e6a1b3be Add end locations to all nodes (#296) 2022-10-02 12:49:48 -04:00
Suguru Yamamoto
4d0d433af9 fix: Make assigns to dunder exception for E402. (#294) 2022-10-01 09:43:47 -04:00
Christian Clauss
11f7532e72 pre-commit: Validate pyproject.toml (#266) 2022-09-30 19:21:12 -04:00
Charlie Marsh
417764d309 Expose a public 'check' method (#289) 2022-09-30 11:30:37 -04:00
Charlie Marsh
1e36c109c6 Bump version to 0.0.49 2022-09-30 09:15:32 -04:00
Nikita Sobolev
3960016d55 Create .editorconfig (#290) 2022-09-30 09:15:08 -04:00
Charlie Marsh
75d669fa86 Update check ordering 2022-09-30 09:14:41 -04:00
Nikita Sobolev
20989e12ba Implement flake8-super check (#291) 2022-09-30 09:12:09 -04:00
Charlie Marsh
5a1b6c32eb Add CONTRIBUTING.md (#288) 2022-09-30 07:51:30 -04:00
Charlie Marsh
46bdcb9080 Add instructions on GitHub Actions integration 2022-09-29 19:05:02 -04:00
Charlie Marsh
d16a7252af Add instructions on PyCharm integration 2022-09-29 18:52:45 -04:00
Charlie Marsh
ca6551eb37 Remove misc. unnecessary statements 2022-09-29 18:45:10 -04:00
Charlie Marsh
43a4f5749e Create CODE_OF_CONDUCT.md (#287) 2022-09-29 16:59:17 -04:00
Charlie Marsh
6fef4db433 Bump version to 0.0.48 2022-09-29 16:40:01 -04:00
Charlie Marsh
7470d6832f Add pattern matching limitation to README.md 2022-09-29 16:39:25 -04:00
Nikita Sobolev
63ba0bfeef Adds flake8-builtins (#284) 2022-09-29 16:37:43 -04:00
Anders Kaseorg
91666fcaf6 Don’t follow directory symlinks found while walking (#280) 2022-09-29 15:10:25 -04:00
Heyward Fann
643e27221d chore: fix eslint fix link (#281) 2022-09-29 07:15:07 -04:00
Charlie Marsh
c7349b69c1 Bump version to 0.0.47 2022-09-28 22:30:48 -04:00
Charlie Marsh
7f84753f3c Improve rendering of --show-settings 2022-09-28 22:30:20 -04:00
Charlie Marsh
e2ec62cf33 Misc. follow-up changes to #272 (#278) 2022-09-28 22:15:58 -04:00
Charlie Marsh
1d5592d937 Use take-while to terminate on parse errors (#279) 2022-09-28 22:06:35 -04:00
Anders Kaseorg
886def13bd Upgrade to clap 4 (#272) 2022-09-28 17:11:57 -04:00
Charlie Marsh
949e4d4077 Bump version to 0.0.46 2022-09-24 13:10:10 -04:00
Charlie Marsh
c8cb2eead2 Remove README note about noqa patterns 2022-09-24 13:09:45 -04:00
Seamooo
02ae494a0e Enable per-file ignores (#261) 2022-09-24 13:02:34 -04:00
Harutaka Kawamura
dce86e065b Make unused variable pattern configurable (#265) 2022-09-24 10:43:39 -04:00
Harutaka Kawamura
d77979429c Print warning and error messages in stderr (#267) 2022-09-24 09:27:35 -04:00
Adrian Garcia Badaracco
a3a15d2eb2 error invalid pyproject.toml configs (#264) 2022-09-23 21:16:07 -04:00
Charlie Marsh
5af95428ff Tweak import 2022-09-23 18:53:57 -04:00
Harutaka Kawamura
6338cad4e6 Remove python 3.6 classifier (#260) 2022-09-22 20:38:09 -04:00
Harutaka Kawamura
485881877f Include error code and message in JSON output (#259) 2022-09-22 20:29:21 -04:00
85 changed files with 3147 additions and 478 deletions

14
.editorconfig Normal file

@@ -0,0 +1,14 @@
# Check http://editorconfig.org for more information
# This is the main config file for this project:
root = true
[*]
charset = utf-8
trim_trailing_whitespace = true
end_of_line = lf
indent_style = space
insert_final_newline = true
indent_size = 2
[*.{rs,py}]
indent_size = 4

.pre-commit-config.yaml

@@ -3,3 +3,8 @@ repos:
rev: v0.0.40
hooks:
- id: lint
- repo: https://github.com/abravalheri/validate-pyproject
rev: v0.10.1
hooks:
- id: validate-pyproject

128
CODE_OF_CONDUCT.md Normal file

@@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
charlie.r.marsh@gmail.com.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.

105
CONTRIBUTING.md Normal file

@@ -0,0 +1,105 @@
# Contributing to ruff
Welcome! We're happy to have you here. Thank you in advance for your contribution to ruff.
## The basics
ruff welcomes contributions in the form of Pull Requests. For small changes (e.g., bug fixes), feel
free to submit a PR. For larger changes (e.g., new lint rules, new functionality, new configuration
options), consider submitting an [Issue](https://github.com/charliermarsh/ruff/issues) outlining
your proposed change.
### Prerequisites
ruff is written in Rust (1.63.0). You'll need to install the
[Rust toolchain](https://www.rust-lang.org/tools/install) for development.
### Development
After cloning the repository, run ruff locally with:
```shell
cargo run resources/test/fixtures --no-cache
```
Prior to opening a pull request, ensure that your code has been auto-formatted, and that it passes
both the lint and test validation checks:
```shell
cargo fmt # Auto-formatting...
cargo clippy # Linting...
cargo test # Testing...
```
These checks will run on GitHub Actions when you open your Pull Request, but running them locally
will save you time and expedite the merge process.
Your Pull Request will be reviewed by a maintainer, which may involve a few rounds of iteration
prior to merging.
### Example: Adding a new lint rule
There are three phases to adding a new lint rule:
1. Define the rule in `src/checks.rs`.
2. Define the _logic_ for triggering the rule in `src/check_ast.rs` (for AST-based checks)
or `src/check_lines.rs` (for text-based checks).
3. Add a test fixture.
To define the rule, open up `src/checks.rs`. You'll need to define both a `CheckCode` and
`CheckKind`. As an example, you can grep for `E402` and `ModuleImportNotAtTopOfFile`, and follow the
pattern implemented therein.
To trigger the rule, you'll likely want to augment the logic in `src/check_ast.rs`, which defines
the Python AST visitor, responsible for iterating over the abstract syntax tree and collecting
lint-rule violations as it goes. Grep for the `Check::new` invocations to understand how other,
similar rules are implemented.
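As a sketch of that pattern, the `CheckCode`/`CheckKind` pairing looks roughly like the following. The types and messages below are simplified, illustrative stand-ins — ruff's actual definitions live in `src/checks.rs`:
```rust
// Illustrative versions of the CheckCode/CheckKind pairing described above.
#[derive(Debug, PartialEq)]
pub enum CheckCode {
    E402,
}

#[derive(Debug, PartialEq)]
pub enum CheckKind {
    ModuleImportNotAtTopOfFile,
}

impl CheckKind {
    // Every kind maps back to exactly one code...
    pub fn code(&self) -> CheckCode {
        match self {
            CheckKind::ModuleImportNotAtTopOfFile => CheckCode::E402,
        }
    }

    // ...and carries the user-facing message reported at the violation site.
    pub fn message(&self) -> String {
        match self {
            CheckKind::ModuleImportNotAtTopOfFile => {
                "Module level import not at top of file".to_string()
            }
        }
    }
}
```
The visitor then constructs a check from a kind plus the node's location, which is what the `Check::new` invocations mentioned above do.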
To add a test fixture, create a file under `resources/test/fixtures`, named to match the `CheckCode`
you defined earlier (e.g., `E402.py`). This file should contain a variety of violations and
non-violations designed to evaluate and demonstrate the behavior of your lint rule. Run ruff locally
with (e.g.) `cargo run resources/test/fixtures/E402.py`. Once you're satisfied with the output,
codify the behavior as a snapshot test by adding a new function to the `mod tests` section of
`src/linter.rs`, like so:
```rust
#[test]
fn e402() -> Result<()> {
let mut checks = check_path(
Path::new("./resources/test/fixtures/E402.py"),
&settings::Settings::for_rule(CheckCode::E402),
&fixer::Mode::Generate,
)?;
checks.sort_by_key(|check| check.location);
insta::assert_yaml_snapshot!(checks);
Ok(())
}
```
Then, run `cargo test`. Your test will fail, but you'll be prompted to follow up with
`cargo insta review`. Accept the generated snapshot, then commit the snapshot file alongside the
rest of your changes.
### Example: Adding a new configuration option
ruff's user-facing settings live in two places: first, the command-line options defined with
[clap](https://docs.rs/clap/latest/clap/) via the `Cli` struct in `src/main.rs`; and second, the
`Config` struct defined in `src/pyproject.rs`, which is responsible for extracting user-defined
settings from a `pyproject.toml` file.
Ultimately, these two sources of configuration are merged into the `Settings` struct defined
in `src/settings.rs`, which is then threaded through the codebase.
To add a new configuration option, you'll likely want to _both_ add a CLI option to `src/main.rs`
_and_ a `pyproject.toml` parameter to `src/pyproject.rs`. If you want to pattern-match against an
existing example, grep for `dummy_variable_rgx`, which defines a regular expression to match against
acceptable unused variables (e.g., `_`).
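For reference, once both halves are wired up, the existing option can be set from `pyproject.toml` roughly like so. The key spelling and value below are illustrative — `src/pyproject.rs` is authoritative, and the field may be serialized as `dummy_variable_rgx` rather than kebab-case:
```toml
[tool.ruff]
# Treat names matching this regex as intentionally-unused "dummy" variables.
dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
```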
## Release process
As of now, ruff has an ad hoc release process: releases are cut with high frequency via GitHub
Actions, which automatically generates the appropriate wheels across architectures and publishes
them to [PyPI](https://pypi.org/project/ruff/).
ruff follows the [semver](https://semver.org/) versioning standard. However, as pre-1.0 software,
even patch releases may contain [non-backwards-compatible changes](https://semver.org/#spec-item-4).

117
Cargo.lock generated

@@ -37,6 +37,12 @@ dependencies = [
"libc",
]
[[package]]
name = "annotate-snippets"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c7021ce4924a3f25f802b2cccd1af585e39ea1a363a1aa2e72afe54b67a3a7a7"
[[package]]
name = "anyhow"
version = "1.0.60"
@@ -359,6 +365,15 @@ version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"
[[package]]
name = "chic"
version = "1.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a5b5db619f3556839cb2223ae86ff3f9a09da2c5013be42bc9af08c9589bf70c"
dependencies = [
"annotate-snippets",
]
[[package]]
name = "chrono"
version = "0.4.21"
@@ -382,26 +397,24 @@ checksum = "fff857943da45f546682664a79488be82e69e43c1a7a2307679ab9afb3a66d2e"
[[package]]
name = "clap"
version = "3.2.16"
version = "4.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a3dbbb6653e7c55cc8595ad3e1f7be8f32aba4eb7ff7f0fd1163d4f3d137c0a9"
checksum = "dd03107d0f87139c1774a15f3db2165b0652b5460c58c27e561f89c20c599eaf"
dependencies = [
"atty",
"bitflags",
"clap_derive",
"clap_lex",
"indexmap",
"once_cell",
"strsim",
"termcolor",
"textwrap",
]
[[package]]
name = "clap_derive"
version = "3.2.15"
version = "4.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9ba52acd3b0a5c33aeada5cdaa3267cdc7c594a98731d4268cdc1532f4264cb4"
checksum = "ca689d7434ce44517a12a89456b2be4d1ea1cafcd8f581978c03d45f5a5c12a7"
dependencies = [
"heck",
"proc-macro-error",
@@ -412,9 +425,9 @@ dependencies = [
[[package]]
name = "clap_lex"
version = "0.2.4"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2850f2f5a82cbf437dd5af4d49848fbdfc27c157c3d010345776f952765261c5"
checksum = "0d4198f73e42b4936b35b5bb248d81d2b595ecb170da0bac7655c54eedfa8da8"
dependencies = [
"os_str_bytes",
]
@@ -1064,9 +1077,9 @@ dependencies = [
[[package]]
name = "itertools"
version = "0.10.3"
version = "0.10.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9a9d19fa1e79b6215ff29b9d6880b706147f16e9b1dbb1e4e5947b5b02bc5e3"
checksum = "b0fd2260e829bddf4cb6ea802289de2f86d6a7a690192fbe91b3f46e0f2c8473"
dependencies = [
"either",
]
@@ -1155,6 +1168,30 @@ version = "0.2.127"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "505e71a4706fa491e9b1b55f51b95d4037d0821ee40131190475f692b35b009b"
[[package]]
name = "libcst"
version = "0.1.0"
source = "git+https://github.com/charliermarsh/LibCST?rev=32a044c127668df44582f85699358e67803b0d73#32a044c127668df44582f85699358e67803b0d73"
dependencies = [
"chic",
"itertools",
"libcst_derive",
"once_cell",
"paste",
"peg",
"regex",
"thiserror",
]
[[package]]
name = "libcst_derive"
version = "0.1.0"
source = "git+https://github.com/charliermarsh/LibCST?rev=32a044c127668df44582f85699358e67803b0d73#32a044c127668df44582f85699358e67803b0d73"
dependencies = [
"quote",
"syn",
]
[[package]]
name = "linked-hash-map"
version = "0.5.6"
@@ -1381,9 +1418,9 @@ dependencies = [
[[package]]
name = "once_cell"
version = "1.13.1"
version = "1.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "074864da206b4973b84eb91683020dbefd6a8c3f0f38e054d93954e891935e4e"
checksum = "e82dad04139b71a90c080c8463fe0dc7902db5192d939bd0950f074d014339e1"
[[package]]
name = "opaque-debug"
@@ -1432,6 +1469,12 @@ dependencies = [
"windows-sys",
]
[[package]]
name = "paste"
version = "1.0.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1de2e551fb905ac83f73f7aedf2f0cb4a0da7e35efa24a202a936269f1f18e1"
[[package]]
name = "path-absolutize"
version = "3.0.13"
@@ -1450,6 +1493,33 @@ dependencies = [
"once_cell",
]
[[package]]
name = "peg"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a07f2cafdc3babeebc087e499118343442b742cc7c31b4d054682cc598508554"
dependencies = [
"peg-macros",
"peg-runtime",
]
[[package]]
name = "peg-macros"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4a90084dc05cf0428428e3d12399f39faad19b0909f64fb9170c9fdd6d9cd49b"
dependencies = [
"peg-runtime",
"proc-macro2",
"quote",
]
[[package]]
name = "peg-runtime"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9fa00462b37ead6d11a82c9d568b26682d78e0477dc02d1966c013af80969739"
[[package]]
name = "percent-encoding"
version = "2.1.0"
@@ -1801,7 +1871,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.0.45"
version = "0.0.52"
dependencies = [
"anyhow",
"bincode",
@@ -1817,6 +1887,7 @@ dependencies = [
"glob",
"insta",
"itertools",
"libcst",
"log",
"notify",
"once_cell",
@@ -1846,7 +1917,7 @@ dependencies = [
[[package]]
name = "rustpython-ast"
version = "0.1.0"
source = "git+https://github.com/charliermarsh/RustPython.git?rev=966a80597d626a9a47eaec78471164422d341453#966a80597d626a9a47eaec78471164422d341453"
source = "git+https://github.com/charliermarsh/RustPython.git?rev=4f457893efc381ad5c432576b24bcc7e4a08c641#4f457893efc381ad5c432576b24bcc7e4a08c641"
dependencies = [
"num-bigint",
"rustpython-compiler-core",
@@ -1855,7 +1926,7 @@ dependencies = [
[[package]]
name = "rustpython-compiler-core"
version = "0.1.2"
source = "git+https://github.com/charliermarsh/RustPython.git?rev=966a80597d626a9a47eaec78471164422d341453#966a80597d626a9a47eaec78471164422d341453"
source = "git+https://github.com/charliermarsh/RustPython.git?rev=4f457893efc381ad5c432576b24bcc7e4a08c641#4f457893efc381ad5c432576b24bcc7e4a08c641"
dependencies = [
"bincode",
"bitflags",
@@ -1872,7 +1943,7 @@ dependencies = [
[[package]]
name = "rustpython-parser"
version = "0.1.2"
source = "git+https://github.com/charliermarsh/RustPython.git?rev=966a80597d626a9a47eaec78471164422d341453#966a80597d626a9a47eaec78471164422d341453"
source = "git+https://github.com/charliermarsh/RustPython.git?rev=4f457893efc381ad5c432576b24bcc7e4a08c641#4f457893efc381ad5c432576b24bcc7e4a08c641"
dependencies = [
"ahash",
"anyhow",
@@ -2187,26 +2258,20 @@ dependencies = [
"phf_codegen 0.8.0",
]
[[package]]
name = "textwrap"
version = "0.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1141d4d61095b28419e22cb0bbf02755f5e54e0526f97f1e3d1d160e60885fb"
[[package]]
name = "thiserror"
version = "1.0.32"
version = "1.0.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f5f6586b7f764adc0231f4c79be7b920e766bb2f3e51b3661cdb263828f19994"
checksum = "10deb33631e3c9018b9baf9dcbbc4f737320d2b576bac10f6aefa048fa407e3e"
dependencies = [
"thiserror-impl",
]
[[package]]
name = "thiserror-impl"
version = "1.0.32"
version = "1.0.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "12bafc5b54507e0149cdf1b145a5d80ab80a90bcd9275df43d4fff68460f6c21"
checksum = "982d17546b47146b28f7c22e3d08465f6b8903d0ea13c1660d9d84a6e7adcdbb"
dependencies = [
"proc-macro2",
"quote",

Cargo.toml

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.0.45"
version = "0.0.52"
edition = "2021"
[lib]
@@ -11,7 +11,7 @@ anyhow = { version = "1.0.60" }
bincode = { version = "1.3.3" }
cacache = { version = "10.0.1" }
chrono = { version = "0.4.21" }
clap = { version = "3.2.16", features = ["derive"] }
clap = { version = "4.0.1", features = ["derive"] }
clearscreen = { version = "1.0.10" }
colored = { version = "2.0.0" }
common-path = { version = "1.0.0" }
@@ -19,14 +19,15 @@ dirs = { version = "4.0.0" }
fern = { version = "0.6.1" }
filetime = { version = "0.2.17" }
glob = { version = "0.3.0" }
itertools = { version = "0.10.3" }
itertools = { version = "0.10.5" }
libcst = { git = "https://github.com/charliermarsh/LibCST", rev = "32a044c127668df44582f85699358e67803b0d73" }
log = { version = "0.4.17" }
notify = { version = "4.0.17" }
once_cell = { version = "1.13.1" }
path-absolutize = { version = "3.0.13", features = ["once_cell_cache"] }
rayon = { version = "1.5.3" }
regex = { version = "1.6.0" }
rustpython-parser = { features = ["lalrpop"], git = "https://github.com/charliermarsh/RustPython.git", rev = "966a80597d626a9a47eaec78471164422d341453" }
rustpython-parser = { features = ["lalrpop"], git = "https://github.com/charliermarsh/RustPython.git", rev = "4f457893efc381ad5c432576b24bcc7e4a08c641" }
serde = { version = "1.0.143", features = ["derive"] }
serde_json = { version = "1.0.83" }
toml = { version = "0.5.9" }

146
README.md

@@ -20,7 +20,7 @@ An extremely fast Python linter, written in Rust.
- 🤝 Python 3.10 compatibility
- 🛠️ `pyproject.toml` support
- 📦 [ESLint](https://eslint.org/docs/latest/user-guide/command-line-interface#caching)-inspired cache support
- 🔧 [ESLint](https://eslint.org/docs/latest/user-guide/command-line-interface#caching)-inspired `--fix` support
- 🔧 [ESLint](https://eslint.org/docs/latest/user-guide/command-line-interface#--fix)-inspired `--fix` support
- 👀 [TypeScript](https://www.typescriptlang.org/docs/handbook/configuring-watch.html)-inspired `--watch` support
- ⚖️ [Near-complete parity](#Parity-with-Flake8) with the built-in Flake8 rule set
@@ -57,7 +57,7 @@ ruff also works with [pre-commit](https://pre-commit.com):
```yaml
repos:
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: v0.0.40
rev: v0.0.48
hooks:
- id: lint
```
@@ -80,59 +80,60 @@ select = [
Alternatively, on the command-line:
```shell
ruff path/to/code/ --select F401 F403
ruff path/to/code/ --select F401 --select F403
```
See `ruff --help` for more:
```shell
ruff (v0.0.45) 0.0.45
An extremely fast Python linter.
ruff: An extremely fast Python linter.
USAGE:
ruff [OPTIONS] <FILES>...
Usage: ruff [OPTIONS] <FILES>...
ARGS:
<FILES>...
Arguments:
<FILES>...
OPTIONS:
--select <SELECT>...
List of error codes to enable
--extend-select <EXTEND_SELECT>...
Like --select, but adds additional error codes on top of the selected ones
--ignore <IGNORE>...
List of error codes to ignore
--extend-ignore <EXTEND_IGNORE>...
Like --ignore, but adds additional error codes on top of the ignored ones
--exclude <EXCLUDE>...
List of paths, used to exclude files and/or directories from checks
--extend-exclude <EXTEND_EXCLUDE>...
Like --exclude, but adds additional files and directories on top of the excluded ones
-e, --exit-zero
Exit with status code "0", even upon detecting errors
-f, --fix
Attempt to automatically fix lint errors
--format <FORMAT>
Output serialization format for error messages [default: text] [possible values: text,
json]
-h, --help
Print help information
-n, --no-cache
Disable cache reads
-q, --quiet
Disable all logging (but still exit with status code "1" upon detecting errors)
--add-noqa
Enable automatic additions of noqa directives to failing lines
--show-files
See the files ruff will be run against with the current settings
--show-settings
See ruff's settings
-v, --verbose
Enable verbose logging
-V, --version
Print version information
-w, --watch
Run in watch mode by re-running whenever files change
Options:
-v, --verbose
Enable verbose logging
-q, --quiet
Disable all logging (but still exit with status code "1" upon detecting errors)
-e, --exit-zero
Exit with status code "0", even upon detecting errors
-w, --watch
Run in watch mode by re-running whenever files change
-f, --fix
Attempt to automatically fix lint errors
-n, --no-cache
Disable cache reads
--select <SELECT>
List of error codes to enable
--extend-select <EXTEND_SELECT>
Like --select, but adds additional error codes on top of the selected ones
--ignore <IGNORE>
List of error codes to ignore
--extend-ignore <EXTEND_IGNORE>
Like --ignore, but adds additional error codes on top of the ignored ones
--exclude <EXCLUDE>
List of paths, used to exclude files and/or directories from checks
--extend-exclude <EXTEND_EXCLUDE>
Like --exclude, but adds additional files and directories on top of the excluded ones
--per-file-ignores <PER_FILE_IGNORES>
List of mappings from file pattern to code to exclude
--format <FORMAT>
Output serialization format for error messages [default: text] [possible values: text, json]
--show-files
See the files ruff will be run against with the current settings
--show-settings
See ruff's settings
--add-noqa
Enable automatic additions of noqa directives to failing lines
--dummy-variable-rgx <DUMMY_VARIABLE_RGX>
Regular expression matching the name of dummy variables
-h, --help
Print help information
-V, --version
Print version information
```
### Excluding files
@@ -210,8 +211,8 @@ variables.)
Beyond rule-set parity, ruff suffers from the following limitations vis-à-vis Flake8:
1. Flake8 has a plugin architecture and supports writing custom lint rules.
2. Flake8 supports a wider range of `noqa` patterns, such as per-file ignores defined in `.flake8`.
3. ruff does not yet support parenthesized context managers.
2. ruff does not yet support a few Python 3.9 and 3.10 language features, including structural
pattern matching and parenthesized context managers.
## Rules
@@ -259,10 +260,52 @@ Beyond rule-set parity, ruff suffers from the following limitations vis-à-vis F
| F831 | DuplicateArgumentName | Duplicate argument name in function definition |
| F841 | UnusedVariable | Local variable `...` is assigned to but never used |
| F901 | RaiseNotImplemented | `raise NotImplemented` should be `raise NotImplementedError` |
| A001 | BuiltinVariableShadowing | Variable `...` is shadowing a python builtin |
| A002 | BuiltinArgumentShadowing | Argument `...` is shadowing a python builtin |
| A003 | BuiltinAttributeShadowing | class attribute `...` is shadowing a python builtin |
| SPR001 | SuperCallWithParameters | Use `super()` instead of `super(__class__, self)` |
| R001 | UselessObjectInheritance | Class `...` inherits from object |
| R002 | NoAssertEquals | `assertEquals` is deprecated, use `assertEqual` instead |
| M001 | UnusedNOQA | Unused `noqa` directive |
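For instance, the flake8-builtins checks (A001–A003) listed above flag shadowing like the following (a hypothetical snippet, not one of ruff's fixtures):
```python
# Each assignment below shadows a Python builtin of the same name.
list = [1, 2, 3]          # A001: module-level variable shadows `list`

def first(id):            # A002: argument shadows `id`
    return id

class Config:
    type = "yaml"         # A003: class attribute shadows `type`
```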
## Integrations
### PyCharm
ruff can be installed as an [External Tool](https://www.jetbrains.com/help/pycharm/configuring-third-party-tools.html)
in PyCharm. Open the Preferences pane, then navigate to "Tools", then "External Tools". From there,
add a new tool with the following configuration:
![Install ruff as an External Tool](https://user-images.githubusercontent.com/1309177/193155720-336e43f0-1a8d-46b4-bc12-e60f9ae01f7e.png)
ruff should then appear as a runnable action:
![ruff as a runnable action](https://user-images.githubusercontent.com/1309177/193156026-732b0aaf-3dd9-4549-9b4d-2de6d2168a33.png)
### GitHub Actions
GitHub Actions has everything you need to run ruff out-of-the-box:
```yaml
name: CI
on: push
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Install Python
uses: actions/setup-python@v2
with:
python-version: "3.10"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install ruff
- name: Run ruff
run: ruff .
```
## Development
ruff is written in Rust (1.63.0). You'll need to install the [Rust toolchain](https://www.rust-lang.org/tools/install)
@@ -422,3 +465,8 @@ Summary
## License
MIT
## Contributing
Contributions are welcome and hugely appreciated. To get started, check out the
[contributing guidelines](https://github.com/charliermarsh/ruff/blob/main/CONTRIBUTORS.md).


@@ -2,14 +2,14 @@
use std::path::PathBuf;
use anyhow::Result;
use clap::{Parser, ValueHint};
use clap::Parser;
use rustpython_parser::parser;
use ruff::fs;
#[derive(Debug, Parser)]
struct Cli {
#[clap(parse(from_os_str), value_hint = ValueHint::FilePath, required = true)]
#[arg(required = true)]
file: PathBuf,
}


@@ -2,14 +2,14 @@
use std::path::PathBuf;
use anyhow::Result;
use clap::{Parser, ValueHint};
use clap::Parser;
use rustpython_parser::lexer;
use ruff::fs;
#[derive(Debug, Parser)]
struct Cli {
#[clap(parse(from_os_str), value_hint = ValueHint::FilePath, required = true)]
#[arg(required = true)]
file: PathBuf,
}

pyproject.toml

@@ -8,7 +8,6 @@ classifiers = [
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",

27
resources/test/fixtures/A001.py vendored Normal file

@@ -0,0 +1,27 @@
import some as sum
from some import other as int
print = 1
copyright: 'annotation' = 2
(complex := 3)
float = object = 4
min, max = 5, 6
def bytes():
pass
class slice:
pass
try:
...
except ImportError as ValueError:
...
for memoryview, *bytearray in []:
pass
with open('file') as str, open('file2') as (all, any):
pass
[0 for sum in ()]

9
resources/test/fixtures/A002.py vendored Normal file

@@ -0,0 +1,9 @@
def func1(str, /, type, *complex, Exception, **getattr):
pass
async def func2(bytes):
pass
map([], lambda float: ...)

8
resources/test/fixtures/A003.py vendored Normal file

@@ -0,0 +1,8 @@
class MyClass:
ImportError = 4
def __init__(self):
self.float = 5 # is fine
def str(self):
pass


@@ -1,4 +1,8 @@
"""Top-level docstring."""
__all__ = ["y"]
__version__: str = "0.1.0"
import a
try:


@@ -1,6 +1,5 @@
from __future__ import all_feature_names
import os
import functools
import functools, os
from datetime import datetime
from collections import (
Counter,
@@ -11,14 +10,36 @@ import multiprocessing.pool
import multiprocessing.process
import logging.config
import logging.handlers
from typing import TYPING_CHECK, NamedTuple, Dict, Type, TypeVar, List, Set, Union, cast
from typing import (
TYPE_CHECKING,
NamedTuple,
Dict,
Type,
TypeVar,
List,
Set,
Union,
cast,
)
from dataclasses import MISSING, field
from blah import ClassA, ClassB, ClassC
if TYPING_CHECK:
if TYPE_CHECKING:
from models import Fruit, Nut, Vegetable
if TYPE_CHECKING:
import shelve
import importlib
if TYPE_CHECKING:
"""Hello, world!"""
import pathlib
z = 1
class X:
datetime: datetime
foo: Type["NamedTuple"]
@@ -28,6 +49,9 @@ class X:
    y = Counter()
    z = multiprocessing.pool.ThreadPool()

    def b(self) -> None:
        import pickle

__all__ = ["ClassA"] + ["ClassB"]
__all__ += ["ClassC"]
@@ -39,3 +63,5 @@ Z = TypeVar("Z", "List", "Set")
a = list["Fruit"]
b = Union["Nut", None]
c = cast("Vegetable", b)
Field = lambda default=MISSING: field(default=default)

View File

@@ -10,13 +10,13 @@ except ValueError as e:
    print(e)

def f():
def f1():
    x = 1
    y = 2
    z = x + y

def g():
def f2():
    foo = (1, 2)
    (a, b) = (1, 2)
@@ -26,6 +26,12 @@ def g():
    (x, y) = baz = bar

def h():
def f3():
    locals()
    x = 1

def f4():
    _ = 1
    __ = 1
    _discarded = 1

resources/test/fixtures/SPR001.py (new file, 22 lines)

@@ -0,0 +1,22 @@
class Parent:
    def method(self):
        pass

    def wrong(self):
        pass


class Child(Parent):
    def method(self):
        parent = super()  # ok
        super().method()  # ok
        Parent.method(self)  # ok
        Parent.super(1, 2)  # ok

    def wrong(self):
        parent = super(Child, self)  # wrong
        super(Child, self).method  # wrong
        super(
            Child,
            self,
        ).method()  # wrong
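SPR001 flags the `super(Child, self)` calls marked `# wrong` above because Python 3's zero-argument `super()` is equivalent inside a method body. A small sketch of why the autofix is behavior-preserving:

```python
class Parent:
    def greet(self):
        return "parent"

class Child(Parent):
    def legacy(self):
        # Python 2 style, flagged by SPR001.
        return super(Child, self).greet()

    def modern(self):
        # Zero-argument form the autofix produces.
        return super().greet()
```

Both methods resolve to the same MRO lookup; the explicit arguments are redundant noise inherited from Python 2.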

View File

@@ -2,7 +2,10 @@ from __future__ import annotations
from dataclasses import dataclass
from models import Fruit, Nut
from models import (
    Fruit,
    Nut,
)
@dataclass

View File

@@ -1,18 +1,22 @@
use std::collections::BTreeSet;
use itertools::izip;
use regex::Regex;
use rustpython_parser::ast::{
Arg, Arguments, Cmpop, Constant, Excepthandler, ExcepthandlerKind, Expr, ExprKind, Keyword,
Location, Stmt, StmtKind, Unaryop,
};
use crate::ast::operations::SourceCodeLocator;
use crate::ast::types::{Binding, BindingKind, CheckLocator, FunctionScope, Scope, ScopeKind};
use crate::ast::types::{
Binding, BindingKind, CheckLocator, FunctionScope, Range, Scope, ScopeKind,
};
use crate::autofix::{fixer, fixes};
use crate::checks::{Check, CheckKind, Fix, RejectedCmpop};
use crate::python::builtins::BUILTINS;
/// Check IfTuple compliance.
pub fn check_if_tuple(test: &Expr, location: Location) -> Option<Check> {
pub fn check_if_tuple(test: &Expr, location: Range) -> Option<Check> {
if let ExprKind::Tuple { elts, .. } = &test.node {
if !elts.is_empty() {
return Some(Check::new(CheckKind::IfTuple, location));
@@ -22,7 +26,7 @@ pub fn check_if_tuple(test: &Expr, location: Location) -> Option<Check> {
}
/// Check AssertTuple compliance.
pub fn check_assert_tuple(test: &Expr, location: Location) -> Option<Check> {
pub fn check_assert_tuple(test: &Expr, location: Range) -> Option<Check> {
if let ExprKind::Tuple { elts, .. } = &test.node {
if !elts.is_empty() {
return Some(Check::new(CheckKind::AssertTuple, location));
@@ -49,7 +53,7 @@ pub fn check_not_tests(
if check_not_in {
checks.push(Check::new(
CheckKind::NotInTest,
locator.locate_check(operand.location),
locator.locate_check(Range::from_located(operand)),
));
}
}
@@ -57,7 +61,7 @@ pub fn check_not_tests(
if check_not_is {
checks.push(Check::new(
CheckKind::NotIsTest,
locator.locate_check(operand.location),
locator.locate_check(Range::from_located(operand)),
));
}
}
@@ -71,7 +75,11 @@ pub fn check_not_tests(
}
/// Check UnusedVariable compliance.
pub fn check_unused_variables(scope: &Scope, locator: &dyn CheckLocator) -> Vec<Check> {
pub fn check_unused_variables(
scope: &Scope,
locator: &dyn CheckLocator,
dummy_variable_rgx: &Regex,
) -> Vec<Check> {
let mut checks: Vec<Check> = vec![];
if matches!(
@@ -82,17 +90,16 @@ pub fn check_unused_variables(scope: &Scope, locator: &dyn CheckLocator) -> Vec<
}
for (name, binding) in scope.values.iter() {
// TODO(charlie): Ignore if using `locals`.
if binding.used.is_none()
&& name != "_"
&& matches!(binding.kind, BindingKind::Assignment)
&& !dummy_variable_rgx.is_match(name)
&& name != "__tracebackhide__"
&& name != "__traceback_info__"
&& name != "__traceback_supplement__"
&& matches!(binding.kind, BindingKind::Assignment)
{
checks.push(Check::new(
CheckKind::UnusedVariable(name.to_string()),
locator.locate_check(binding.location),
locator.locate_check(binding.range),
));
}
}
@@ -101,7 +108,7 @@ pub fn check_unused_variables(scope: &Scope, locator: &dyn CheckLocator) -> Vec<
}
/// Check DoNotAssignLambda compliance.
pub fn check_do_not_assign_lambda(value: &Expr, location: Location) -> Option<Check> {
pub fn check_do_not_assign_lambda(value: &Expr, location: Range) -> Option<Check> {
if let ExprKind::Lambda { .. } = &value.node {
Some(Check::new(CheckKind::DoNotAssignLambda, location))
} else {
@@ -114,7 +121,7 @@ fn is_ambiguous_name(name: &str) -> bool {
}
/// Check AmbiguousVariableName compliance.
pub fn check_ambiguous_variable_name(name: &str, location: Location) -> Option<Check> {
pub fn check_ambiguous_variable_name(name: &str, location: Range) -> Option<Check> {
if is_ambiguous_name(name) {
Some(Check::new(
CheckKind::AmbiguousVariableName(name.to_string()),
@@ -126,7 +133,7 @@ pub fn check_ambiguous_variable_name(name: &str, location: Location) -> Option<C
}
/// Check AmbiguousClassName compliance.
pub fn check_ambiguous_class_name(name: &str, location: Location) -> Option<Check> {
pub fn check_ambiguous_class_name(name: &str, location: Range) -> Option<Check> {
if is_ambiguous_name(name) {
Some(Check::new(
CheckKind::AmbiguousClassName(name.to_string()),
@@ -138,7 +145,7 @@ pub fn check_ambiguous_class_name(name: &str, location: Location) -> Option<Chec
}
/// Check AmbiguousFunctionName compliance.
pub fn check_ambiguous_function_name(name: &str, location: Location) -> Option<Check> {
pub fn check_ambiguous_function_name(name: &str, location: Range) -> Option<Check> {
if is_ambiguous_name(name) {
Some(Check::new(
CheckKind::AmbiguousFunctionName(name.to_string()),
@@ -170,7 +177,7 @@ pub fn check_useless_object_inheritance(
}) => {
let mut check = Check::new(
CheckKind::UselessObjectInheritance(name.to_string()),
expr.location,
Range::from_located(expr),
);
if matches!(autofix, fixer::Mode::Generate | fixer::Mode::Apply) {
if let Some(fix) = fixes::remove_class_def_base(
@@ -201,7 +208,7 @@ pub fn check_default_except_not_last(handlers: &Vec<Excepthandler>) -> Option<Ch
if type_.is_none() && idx < handlers.len() - 1 {
return Some(Check::new(
CheckKind::DefaultExceptNotLast,
handler.location,
Range::from_located(handler),
));
}
}
@@ -215,13 +222,19 @@ pub fn check_raise_not_implemented(expr: &Expr) -> Option<Check> {
ExprKind::Call { func, .. } => {
if let ExprKind::Name { id, .. } = &func.node {
if id == "NotImplemented" {
return Some(Check::new(CheckKind::RaiseNotImplemented, expr.location));
return Some(Check::new(
CheckKind::RaiseNotImplemented,
Range::from_located(expr),
));
}
}
}
ExprKind::Name { id, .. } => {
if id == "NotImplemented" {
return Some(Check::new(CheckKind::RaiseNotImplemented, expr.location));
return Some(Check::new(
CheckKind::RaiseNotImplemented,
Range::from_located(expr),
));
}
}
_ => {}
@@ -253,7 +266,10 @@ pub fn check_duplicate_arguments(arguments: &Arguments) -> Vec<Check> {
for arg in all_arguments {
let ident = &arg.node.arg;
if idents.contains(ident.as_str()) {
checks.push(Check::new(CheckKind::DuplicateArgumentName, arg.location));
checks.push(Check::new(
CheckKind::DuplicateArgumentName,
Range::from_located(arg),
));
}
idents.insert(ident);
}
@@ -267,12 +283,16 @@ pub fn check_assert_equals(expr: &Expr, autofix: &fixer::Mode) -> Option<Check>
if attr == "assertEquals" {
if let ExprKind::Name { id, .. } = &value.node {
if id == "self" {
let mut check = Check::new(CheckKind::NoAssertEquals, expr.location);
let mut check =
Check::new(CheckKind::NoAssertEquals, Range::from_located(expr));
if matches!(autofix, fixer::Mode::Generate | fixer::Mode::Apply) {
check.amend(Fix {
content: "assertEqual".to_string(),
start: Location::new(expr.location.row(), expr.location.column() + 1),
end: Location::new(
location: Location::new(
expr.location.row(),
expr.location.column() + 1,
),
end_location: Location::new(
expr.location.row(),
expr.location.column() + 1 + "assertEquals".len(),
),
@@ -321,7 +341,7 @@ pub fn check_repeated_keys(
if check_repeated_literals && v1 == v2 {
checks.push(Check::new(
CheckKind::MultiValueRepeatedKeyLiteral,
locator.locate_check(k2.location),
locator.locate_check(Range::from_located(k2)),
))
}
}
@@ -329,7 +349,7 @@ pub fn check_repeated_keys(
if check_repeated_variables && v1 == v2 {
checks.push(Check::new(
CheckKind::MultiValueRepeatedKeyVariable((*v2).to_string()),
locator.locate_check(k2.location),
locator.locate_check(Range::from_located(k2)),
))
}
}
@@ -368,13 +388,13 @@ pub fn check_literal_comparisons(
if matches!(op, Cmpop::Eq) {
checks.push(Check::new(
CheckKind::NoneComparison(RejectedCmpop::Eq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
if matches!(op, Cmpop::NotEq) {
checks.push(Check::new(
CheckKind::NoneComparison(RejectedCmpop::NotEq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
}
@@ -388,13 +408,13 @@ pub fn check_literal_comparisons(
if matches!(op, Cmpop::Eq) {
checks.push(Check::new(
CheckKind::TrueFalseComparison(value, RejectedCmpop::Eq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
if matches!(op, Cmpop::NotEq) {
checks.push(Check::new(
CheckKind::TrueFalseComparison(value, RejectedCmpop::NotEq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
}
@@ -414,13 +434,13 @@ pub fn check_literal_comparisons(
if matches!(op, Cmpop::Eq) {
checks.push(Check::new(
CheckKind::NoneComparison(RejectedCmpop::Eq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
if matches!(op, Cmpop::NotEq) {
checks.push(Check::new(
CheckKind::NoneComparison(RejectedCmpop::NotEq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
}
@@ -434,13 +454,13 @@ pub fn check_literal_comparisons(
if matches!(op, Cmpop::Eq) {
checks.push(Check::new(
CheckKind::TrueFalseComparison(value, RejectedCmpop::Eq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
if matches!(op, Cmpop::NotEq) {
checks.push(Check::new(
CheckKind::TrueFalseComparison(value, RejectedCmpop::NotEq),
locator.locate_check(comparator.location),
locator.locate_check(Range::from_located(comparator)),
));
}
}
@@ -477,7 +497,7 @@ pub fn check_is_literal(
left: &Expr,
ops: &Vec<Cmpop>,
comparators: &Vec<Expr>,
location: Location,
location: Range,
) -> Vec<Check> {
let mut checks: Vec<Check> = vec![];
@@ -498,7 +518,7 @@ pub fn check_is_literal(
pub fn check_type_comparison(
ops: &Vec<Cmpop>,
comparators: &Vec<Expr>,
location: Location,
location: Range,
) -> Vec<Check> {
let mut checks: Vec<Check> = vec![];
@@ -539,7 +559,7 @@ pub fn check_starred_expressions(
elts: &[Expr],
check_too_many_expressions: bool,
check_two_starred_expressions: bool,
location: Location,
location: Range,
) -> Option<Check> {
let mut has_starred: bool = false;
let mut starred_index: Option<usize> = None;
@@ -601,7 +621,7 @@ pub fn check_break_outside_loop(
if !allowed {
Some(Check::new(
CheckKind::BreakOutsideLoop,
locator.locate_check(stmt.location),
locator.locate_check(Range::from_located(stmt)),
))
} else {
None
@@ -642,9 +662,62 @@ pub fn check_continue_outside_loop(
if !allowed {
Some(Check::new(
CheckKind::ContinueOutsideLoop,
locator.locate_check(stmt.location),
locator.locate_check(Range::from_located(stmt)),
))
} else {
None
}
}
// flake8-builtins
pub enum ShadowingType {
Variable,
Argument,
Attribute,
}
/// Check builtin name shadowing
pub fn check_builtin_shadowing(
name: &str,
location: Range,
node_type: ShadowingType,
) -> Option<Check> {
if BUILTINS.contains(&name) {
Some(Check::new(
match node_type {
ShadowingType::Variable => CheckKind::BuiltinVariableShadowing(name.to_string()),
ShadowingType::Argument => CheckKind::BuiltinArgumentShadowing(name.to_string()),
ShadowingType::Attribute => CheckKind::BuiltinAttributeShadowing(name.to_string()),
},
location,
))
} else {
None
}
}
// flake8-super
/// Check that `super()` has no args
pub fn check_super_args(
expr: &Expr,
func: &Expr,
args: &Vec<Expr>,
locator: &mut SourceCodeLocator,
autofix: &fixer::Mode,
) -> Option<Check> {
if let ExprKind::Name { id, .. } = &func.node {
if id == "super" && !args.is_empty() {
let mut check = Check::new(
CheckKind::SuperCallWithParameters,
Range::from_located(expr),
);
if matches!(autofix, fixer::Mode::Generate | fixer::Mode::Apply) {
if let Some(fix) = fixes::remove_super_arguments(locator, expr) {
check.amend(fix);
}
}
return Some(check);
}
}
None
}

View File

@@ -1,6 +1,6 @@
use rustpython_parser::ast::{Constant, Expr, ExprKind, Location, Stmt, StmtKind};
use crate::ast::types::{BindingKind, Scope};
use crate::ast::types::{BindingKind, Range, Scope};
/// Extract the names bound to a given __all__ assignment.
pub fn extract_all_names(stmt: &Stmt, scope: &Scope) -> Vec<String> {
@@ -134,7 +134,7 @@ impl<'a> SourceCodeLocator<'a> {
}
}
pub fn slice_source_code(&mut self, location: &Location) -> &'a str {
pub fn slice_source_code_at(&mut self, location: &Location) -> &'a str {
if !self.initialized {
let mut offset = 0;
for i in self.content.lines() {
@@ -147,4 +147,19 @@ impl<'a> SourceCodeLocator<'a> {
let offset = self.offsets[location.row() - 1] + location.column() - 1;
&self.content[offset..]
}
pub fn slice_source_code_range(&mut self, range: &Range) -> &'a str {
if !self.initialized {
let mut offset = 0;
for i in self.content.lines() {
self.offsets.push(offset);
offset += i.len();
offset += 1;
}
self.initialized = true;
}
let start = self.offsets[range.location.row() - 1] + range.location.column() - 1;
let end = self.offsets[range.end_location.row() - 1] + range.end_location.column() - 1;
&self.content[start..end]
}
}
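The new `slice_source_code_range` builds the table of line-start offsets once, then slices between two 1-indexed (row, column) positions. A minimal Python sketch of the same offset arithmetic (assuming `\n` line endings, as the Rust code does):

```python
def line_offsets(content: str) -> list[int]:
    """Offset of the start of each line; index i holds the offset of row i + 1."""
    offsets = [0]
    for line in content.splitlines(keepends=True):
        offsets.append(offsets[-1] + len(line))
    return offsets[:-1]

def slice_range(content: str, start: tuple[int, int], end: tuple[int, int]) -> str:
    """Slice `content` between two 1-indexed (row, column) positions."""
    offsets = line_offsets(content)
    s = offsets[start[0] - 1] + start[1] - 1
    e = offsets[end[0] - 1] + end[1] - 1
    return content[s:e]
```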

View File

@@ -1,13 +1,17 @@
use rustpython_parser::ast::{Expr, ExprKind, Keyword, Location};
use rustpython_parser::ast::{Expr, ExprKind, Keyword};
fn relocate_keyword(keyword: &mut Keyword, location: Location) {
keyword.location = location;
use crate::ast::types::Range;
fn relocate_keyword(keyword: &mut Keyword, location: Range) {
keyword.location = location.location;
keyword.end_location = location.end_location;
relocate_expr(&mut keyword.node.value, location);
}
/// Change an expression's location (recursively) to match a desired, fixed location.
pub fn relocate_expr(expr: &mut Expr, location: Location) {
expr.location = location;
pub fn relocate_expr(expr: &mut Expr, location: Range) {
expr.location = location.location;
expr.end_location = location.end_location;
match &mut expr.node {
ExprKind::BoolOp { values, .. } => {
for expr in values {

View File

@@ -1,13 +1,28 @@
use std::collections::BTreeMap;
use std::sync::atomic::{AtomicUsize, Ordering};
use rustpython_parser::ast::Location;
use rustpython_parser::ast::{Located, Location};
fn id() -> usize {
static COUNTER: AtomicUsize = AtomicUsize::new(1);
COUNTER.fetch_add(1, Ordering::Relaxed)
}
#[derive(Clone, Copy, Debug, Default, PartialEq, Eq, PartialOrd, Ord)]
pub struct Range {
pub location: Location,
pub end_location: Location,
}
impl Range {
pub fn from_located<T>(located: &Located<T>) -> Self {
Range {
location: located.location,
end_location: located.end_location,
}
}
}
#[derive(Clone, Debug, Default)]
pub struct FunctionScope {
pub uses_locals: bool,
@@ -40,6 +55,12 @@ impl Scope {
}
}
#[derive(Clone, Debug)]
pub struct BindingContext {
pub defined_by: usize,
pub defined_in: Option<usize>,
}
#[derive(Clone, Debug)]
pub enum BindingKind {
Annotation,
@@ -52,20 +73,27 @@ pub enum BindingKind {
Definition,
Export(Vec<String>),
FutureImportation,
Importation(String),
StarImportation,
SubmoduleImportation(String),
Importation(String, BindingContext),
FromImportation(String, BindingContext),
SubmoduleImportation(String, BindingContext),
}
#[derive(Clone, Debug)]
pub struct Binding {
pub kind: BindingKind,
pub location: Location,
/// Tuple of (scope index, location) indicating the scope and location at which the binding was
pub range: Range,
/// Tuple of (scope index, range) indicating the scope and range at which the binding was
/// last used.
pub used: Option<(usize, Location)>,
pub used: Option<(usize, Range)>,
}
pub trait CheckLocator {
fn locate_check(&self, default: Location) -> Location;
fn locate_check(&self, default: Range) -> Range;
}
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord)]
pub enum ImportKind {
Import,
ImportFrom,
}
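The new `Range` type simply pairs a start and end `Location`, with `from_located` lifting any AST node that carries both. A rough Python analogue (field names mirror the diff; this is not a real API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Location:
    row: int      # 1-indexed line number
    column: int   # 1-indexed column number

@dataclass(frozen=True)
class Range:
    location: Location      # where the node starts
    end_location: Location  # where the node ends
```

Threading `Range` (rather than a lone `Location`) through every check is what makes end-aware autofixes like SPR001's possible.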

View File

@@ -2,6 +2,7 @@ use std::fs;
use std::path::Path;
use anyhow::Result;
use itertools::Itertools;
use rustpython_parser::ast::Location;
use crate::checks::{Check, Fix};
@@ -43,41 +44,46 @@ fn apply_fixes<'a>(fixes: impl Iterator<Item = &'a mut Fix>, contents: &str) ->
let mut output = "".to_string();
let mut last_pos: Location = Location::new(0, 0);
for fix in fixes {
for fix in fixes.sorted_by_key(|fix| fix.location) {
// Best-effort approach: if this fix overlaps with a fix we've already applied, skip it.
if last_pos > fix.start {
if last_pos > fix.location {
continue;
}
if fix.start.row() > last_pos.row() {
if fix.location.row() > last_pos.row() {
if last_pos.row() > 0 || last_pos.column() > 0 {
output.push_str(&lines[last_pos.row() - 1][last_pos.column() - 1..]);
output.push('\n');
}
for line in &lines[last_pos.row()..fix.start.row() - 1] {
for line in &lines[last_pos.row()..fix.location.row() - 1] {
output.push_str(line);
output.push('\n');
}
output.push_str(&lines[fix.start.row() - 1][..fix.start.column() - 1]);
output.push_str(&lines[fix.location.row() - 1][..fix.location.column() - 1]);
output.push_str(&fix.content);
} else {
output.push_str(
&lines[last_pos.row() - 1][last_pos.column() - 1..fix.start.column() - 1],
&lines[last_pos.row() - 1][last_pos.column() - 1..fix.location.column() - 1],
);
output.push_str(&fix.content);
}
last_pos = fix.end;
last_pos = fix.end_location;
fix.applied = true;
}
if last_pos.row() > 0 || last_pos.column() > 0 {
if last_pos.row() > 0
&& (last_pos.row() - 1) < lines.len()
&& (last_pos.row() > 0 || last_pos.column() > 0)
{
output.push_str(&lines[last_pos.row() - 1][last_pos.column() - 1..]);
output.push('\n');
}
for line in &lines[last_pos.row()..] {
output.push_str(line);
output.push('\n');
if last_pos.row() < lines.len() {
for line in &lines[last_pos.row()..] {
output.push_str(line);
output.push('\n');
}
}
output
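The reworked `apply_fixes` sorts fixes by start position and skips any fix that overlaps one already applied. The same best-effort strategy, sketched with flat byte offsets instead of (row, column) pairs:

```python
def apply_fixes(source: str, fixes: list[tuple[int, int, str]]) -> str:
    """Apply (start, end, replacement) fixes to `source`, sorted by start;
    a fix that overlaps an already-applied one is skipped (best effort)."""
    out = []
    last = 0
    for start, end, replacement in sorted(fixes):
        if start < last:
            continue  # overlaps the previous fix: skip it
        out.append(source[last:start])
        out.append(replacement)
        last = end
    out.append(source[last:])
    return "".join(out)
```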
@@ -106,8 +112,8 @@ mod tests {
fn apply_single_replacement() -> Result<()> {
let mut fixes = vec![Fix {
content: "Bar".to_string(),
start: Location::new(1, 9),
end: Location::new(1, 15),
location: Location::new(1, 9),
end_location: Location::new(1, 15),
applied: false,
}];
let actual = apply_fixes(
@@ -130,8 +136,8 @@ mod tests {
fn apply_single_removal() -> Result<()> {
let mut fixes = vec![Fix {
content: "".to_string(),
start: Location::new(1, 8),
end: Location::new(1, 16),
location: Location::new(1, 8),
end_location: Location::new(1, 16),
applied: false,
}];
let actual = apply_fixes(
@@ -155,14 +161,14 @@ mod tests {
let mut fixes = vec![
Fix {
content: "".to_string(),
start: Location::new(1, 8),
end: Location::new(1, 17),
location: Location::new(1, 8),
end_location: Location::new(1, 17),
applied: false,
},
Fix {
content: "".to_string(),
start: Location::new(1, 17),
end: Location::new(1, 24),
location: Location::new(1, 17),
end_location: Location::new(1, 24),
applied: false,
},
];
@@ -187,14 +193,14 @@ mod tests {
let mut fixes = vec![
Fix {
content: "".to_string(),
start: Location::new(1, 8),
end: Location::new(1, 16),
location: Location::new(1, 8),
end_location: Location::new(1, 16),
applied: false,
},
Fix {
content: "ignored".to_string(),
start: Location::new(1, 10),
end: Location::new(1, 12),
location: Location::new(1, 10),
end_location: Location::new(1, 12),
applied: false,
},
];

View File

@@ -1,8 +1,14 @@
use rustpython_parser::ast::{Expr, Keyword, Location};
use anyhow::Result;
use itertools::Itertools;
use libcst_native::ImportNames::Aliases;
use libcst_native::NameOrAttribute::N;
use libcst_native::{Codegen, Expression, SmallStatement, Statement};
use rustpython_parser::ast::{ExcepthandlerKind, Expr, Keyword, Location, Stmt, StmtKind};
use rustpython_parser::lexer;
use rustpython_parser::token::Tok;
use crate::ast::operations::SourceCodeLocator;
use crate::ast::types::Range;
use crate::checks::Fix;
/// Convert a location within a file (relative to `base`) to an absolute position.
@@ -25,7 +31,7 @@ pub fn remove_class_def_base(
bases: &[Expr],
keywords: &[Keyword],
) -> Option<Fix> {
let content = locator.slice_source_code(stmt_at);
let content = locator.slice_source_code_at(stmt_at);
// Case 1: `object` is the only base.
if bases.len() == 1 && keywords.is_empty() {
@@ -52,8 +58,8 @@ pub fn remove_class_def_base(
return match (fix_start, fix_end) {
(Some(start), Some(end)) => Some(Fix {
content: "".to_string(),
start,
end,
location: start,
end_location: end,
applied: false,
}),
_ => None,
@@ -91,8 +97,8 @@ pub fn remove_class_def_base(
match (fix_start, fix_end) {
(Some(start), Some(end)) => Some(Fix {
content: "".to_string(),
start,
end,
location: start,
end_location: end,
applied: false,
}),
_ => None,
@@ -116,11 +122,271 @@ pub fn remove_class_def_base(
match (fix_start, fix_end) {
(Some(start), Some(end)) => Some(Fix {
content: "".to_string(),
start,
end,
location: start,
end_location: end,
applied: false,
}),
_ => None,
}
}
}
pub fn remove_super_arguments(locator: &mut SourceCodeLocator, expr: &Expr) -> Option<Fix> {
let range = Range::from_located(expr);
let contents = locator.slice_source_code_range(&range);
let mut tree = match libcst_native::parse_module(contents, None) {
Ok(m) => m,
Err(_) => return None,
};
if let Some(Statement::Simple(body)) = tree.body.first_mut() {
if let Some(SmallStatement::Expr(body)) = body.body.first_mut() {
if let Expression::Call(body) = &mut body.value {
body.args = vec![];
body.whitespace_before_args = Default::default();
body.whitespace_after_func = Default::default();
let mut state = Default::default();
tree.codegen(&mut state);
return Some(Fix {
content: state.to_string(),
location: range.location,
end_location: range.end_location,
applied: false,
});
}
}
}
None
}
/// Determine if a body contains only a single statement, taking deleted statements into account.
fn has_single_child(body: &[Stmt], deleted: &[&Stmt]) -> bool {
body.iter().filter(|child| !deleted.contains(child)).count() == 1
}
/// Determine if a child is the only statement in its body.
fn is_lone_child(child: &Stmt, parent: &Stmt, deleted: &[&Stmt]) -> Result<bool> {
match &parent.node {
StmtKind::FunctionDef { body, .. }
| StmtKind::AsyncFunctionDef { body, .. }
| StmtKind::ClassDef { body, .. }
| StmtKind::With { body, .. }
| StmtKind::AsyncWith { body, .. } => {
if body.iter().contains(child) {
Ok(has_single_child(body, deleted))
} else {
Err(anyhow::anyhow!("Unable to find child in parent body."))
}
}
StmtKind::For { body, orelse, .. }
| StmtKind::AsyncFor { body, orelse, .. }
| StmtKind::While { body, orelse, .. }
| StmtKind::If { body, orelse, .. } => {
if body.iter().contains(child) {
Ok(has_single_child(body, deleted))
} else if orelse.iter().contains(child) {
Ok(has_single_child(orelse, deleted))
} else {
Err(anyhow::anyhow!("Unable to find child in parent body."))
}
}
StmtKind::Try {
body,
handlers,
orelse,
finalbody,
} => {
if body.iter().contains(child) {
Ok(has_single_child(body, deleted))
} else if orelse.iter().contains(child) {
Ok(has_single_child(orelse, deleted))
} else if finalbody.iter().contains(child) {
Ok(has_single_child(finalbody, deleted))
} else if let Some(body) = handlers.iter().find_map(|handler| match &handler.node {
ExcepthandlerKind::ExceptHandler { body, .. } => {
if body.iter().contains(child) {
Some(body)
} else {
None
}
}
}) {
Ok(has_single_child(body, deleted))
} else {
Err(anyhow::anyhow!("Unable to find child in parent body."))
}
}
_ => Err(anyhow::anyhow!("Unable to find child in parent body.")),
}
}
fn remove_stmt(stmt: &Stmt, parent: Option<&Stmt>, deleted: &[&Stmt]) -> Result<Fix> {
if parent
.map(|parent| is_lone_child(stmt, parent, deleted))
.map_or(Ok(None), |v| v.map(Some))?
.unwrap_or_default()
{
// If removing this node would lead to an invalid syntax tree, replace
// it with a `pass`.
Ok(Fix {
location: stmt.location,
end_location: stmt.end_location,
content: "pass".to_string(),
applied: false,
})
} else {
// Otherwise, nuke the entire line.
// TODO(charlie): This logic assumes that there are no multi-statement physical lines.
Ok(Fix {
location: Location::new(stmt.location.row(), 1),
end_location: Location::new(stmt.end_location.row() + 1, 1),
content: "".to_string(),
applied: false,
})
}
}
/// Generate a Fix to remove any unused imports from an `import` statement.
pub fn remove_unused_imports(
locator: &mut SourceCodeLocator,
full_names: &[&str],
stmt: &Stmt,
parent: Option<&Stmt>,
deleted: &[&Stmt],
) -> Result<Fix> {
let mut tree = match libcst_native::parse_module(
locator.slice_source_code_range(&Range::from_located(stmt)),
None,
) {
Ok(m) => m,
Err(_) => return Err(anyhow::anyhow!("Failed to extract CST from source.")),
};
let body = if let Some(Statement::Simple(body)) = tree.body.first_mut() {
body
} else {
return Err(anyhow::anyhow!("Expected node to be: Statement::Simple."));
};
let body = if let Some(SmallStatement::Import(body)) = body.body.first_mut() {
body
} else {
return Err(anyhow::anyhow!(
"Expected node to be: SmallStatement::Import."
));
};
let aliases = &mut body.names;
// Preserve the trailing comma (or not) from the last entry.
let trailing_comma = aliases.last().and_then(|alias| alias.comma.clone());
// Identify unused imports from within the `import` statement.
let mut removable = vec![];
for (index, alias) in aliases.iter().enumerate() {
if let N(import_name) = &alias.name {
if full_names.contains(&import_name.value) {
removable.push(index);
}
}
}
// TODO(charlie): This is quadratic.
for index in removable.iter().rev() {
aliases.remove(*index);
}
if let Some(alias) = aliases.last_mut() {
alias.comma = trailing_comma;
}
if aliases.is_empty() {
remove_stmt(stmt, parent, deleted)
} else {
let mut state = Default::default();
tree.codegen(&mut state);
Ok(Fix {
content: state.to_string(),
location: stmt.location,
end_location: stmt.end_location,
applied: false,
})
}
}
/// Generate a Fix to remove any unused imports from an `import from` statement.
pub fn remove_unused_import_froms(
locator: &mut SourceCodeLocator,
full_names: &[&str],
stmt: &Stmt,
parent: Option<&Stmt>,
deleted: &[&Stmt],
) -> Result<Fix> {
let mut tree = match libcst_native::parse_module(
locator.slice_source_code_range(&Range::from_located(stmt)),
None,
) {
Ok(m) => m,
Err(_) => return Err(anyhow::anyhow!("Failed to extract CST from source.")),
};
let body = if let Some(Statement::Simple(body)) = tree.body.first_mut() {
body
} else {
return Err(anyhow::anyhow!("Expected node to be: Statement::Simple."));
};
let body = if let Some(SmallStatement::ImportFrom(body)) = body.body.first_mut() {
body
} else {
return Err(anyhow::anyhow!(
"Expected node to be: SmallStatement::ImportFrom."
));
};
let aliases = if let Aliases(aliases) = &mut body.names {
aliases
} else {
return Err(anyhow::anyhow!("Expected node to be: Aliases."));
};
// Preserve the trailing comma (or not) from the last entry.
let trailing_comma = aliases.last().and_then(|alias| alias.comma.clone());
// Identify unused imports from within the `import from`.
let mut removable = vec![];
for (index, alias) in aliases.iter().enumerate() {
if let N(name) = &alias.name {
let import_name = if let Some(N(module_name)) = &body.module {
format!("{}.{}", module_name.value, name.value)
} else {
name.value.to_string()
};
if full_names.contains(&import_name.as_str()) {
removable.push(index);
}
}
}
// TODO(charlie): This is quadratic.
for index in removable.iter().rev() {
aliases.remove(*index);
}
if let Some(alias) = aliases.last_mut() {
alias.comma = trailing_comma;
}
if aliases.is_empty() {
remove_stmt(stmt, parent, deleted)
} else {
let mut state = Default::default();
tree.codegen(&mut state);
Ok(Fix {
content: state.to_string(),
location: stmt.location,
end_location: stmt.end_location,
applied: false,
})
}
}
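`remove_unused_import_froms` rewrites the CST without the unused aliases, falling back to deleting the whole statement when nothing remains. A simplified Python sketch using `ast` instead of a CST (so, unlike the LibCST version in the diff, comments and original formatting are not preserved):

```python
import ast
from typing import Optional, Set

def remove_unused_names(line: str, unused: Set[str]) -> Optional[str]:
    """Drop `unused` names from a `from x import ...` line.
    Returns None when every name is unused, meaning the caller
    should delete the statement entirely (as remove_stmt does)."""
    node = ast.parse(line).body[0]
    assert isinstance(node, ast.ImportFrom)
    kept = [a for a in node.names if a.name not in unused]
    if not kept:
        return None
    names = ", ".join(
        a.name if a.asname is None else f"{a.name} as {a.asname}" for a in kept
    )
    return f"from {node.module} import {names}"
```

The real implementation matches on fully qualified names (`module.name`) and also preserves the trailing comma of the last surviving alias; both details are elided here.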

View File

@@ -1,18 +1,25 @@
use std::collections::{BTreeMap, BTreeSet};
use std::ops::Deref;
use std::path::Path;
use log::error;
use once_cell::sync::Lazy;
use regex::Regex;
use rustpython_parser::ast::{
Arg, Arguments, Constant, Excepthandler, ExcepthandlerKind, Expr, ExprContext, ExprKind,
KeywordData, Location, Operator, Stmt, StmtKind, Suite,
KeywordData, Operator, Stmt, StmtKind, Suite,
};
use rustpython_parser::parser;
use crate::ast::operations::{extract_all_names, SourceCodeLocator};
use crate::ast::relocate::relocate_expr;
use crate::ast::types::{Binding, BindingKind, CheckLocator, FunctionScope, Scope, ScopeKind};
use crate::ast::types::{
Binding, BindingContext, BindingKind, CheckLocator, FunctionScope, ImportKind, Range, Scope,
ScopeKind,
};
use crate::ast::visitor::{walk_excepthandler, Visitor};
use crate::ast::{checks, operations, visitor};
use crate::autofix::fixer;
use crate::autofix::{fixer, fixes};
use crate::checks::{Check, CheckCode, CheckKind};
use crate::python::builtins::{BUILTINS, MAGIC_GLOBALS};
use crate::python::future::ALL_FEATURE_NAMES;
@@ -21,6 +28,8 @@ use crate::settings::Settings;
pub const GLOBAL_SCOPE_INDEX: usize = 0;
static DUNDER_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r"__[^\s]+__").unwrap());
struct Checker<'a> {
// Input data.
locator: SourceCodeLocator<'a>,
@@ -36,19 +45,21 @@ struct Checker<'a> {
scopes: Vec<Scope>,
scope_stack: Vec<usize>,
dead_scopes: Vec<usize>,
deferred_string_annotations: Vec<(Location, &'a str)>,
deferred_string_annotations: Vec<(Range, &'a str)>,
deferred_annotations: Vec<(&'a Expr, Vec<usize>, Vec<usize>)>,
deferred_functions: Vec<(&'a Stmt, Vec<usize>, Vec<usize>)>,
deferred_lambdas: Vec<(&'a Expr, Vec<usize>, Vec<usize>)>,
deferred_assignments: Vec<usize>,
// Derivative state.
in_f_string: Option<Location>,
in_f_string: Option<Range>,
in_annotation: bool,
in_literal: bool,
seen_non_import: bool,
seen_docstring: bool,
futures_allowed: bool,
annotations_future_enabled: bool,
// Edit tracking.
deletions: BTreeSet<usize>,
}
impl<'a> Checker<'a> {
@@ -81,6 +92,7 @@ impl<'a> Checker<'a> {
seen_docstring: false,
futures_allowed: true,
annotations_future_enabled: false,
deletions: Default::default(),
}
}
}
@@ -101,6 +113,36 @@ fn is_annotated_subscript(expr: &Expr) -> bool {
}
}
fn is_assignment_to_a_dunder(node: &StmtKind) -> bool {
// Check whether it's an assignment to a dunder, with or without a type annotation.
// This is what pycodestyle (as of 2.9.1) does.
match node {
StmtKind::Assign {
targets,
value: _,
type_comment: _,
} => {
if targets.len() != 1 {
return false;
}
match &targets[0].node {
ExprKind::Name { id, ctx: _ } => DUNDER_REGEX.is_match(id),
_ => false,
}
}
StmtKind::AnnAssign {
target,
annotation: _,
value: _,
simple: _,
} => match &target.node {
ExprKind::Name { id, ctx: _ } => DUNDER_REGEX.is_match(id),
_ => false,
},
_ => false,
}
}
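The `is_assignment_to_a_dunder` helper above (added for the E402 dunder exception) translates almost directly to Python's `ast` module, using the same `__[^\s]+__` regex:

```python
import ast
import re

DUNDER_REGEX = re.compile(r"__[^\s]+__")

def is_assignment_to_a_dunder(stmt: ast.stmt) -> bool:
    """True for single-target (annotated) assignments to a dunder name,
    e.g. `__all__ = [...]` or `__version__: str = "0.1.0"`."""
    if isinstance(stmt, ast.Assign):
        if len(stmt.targets) != 1:
            return False
        target = stmt.targets[0]
    elif isinstance(stmt, ast.AnnAssign):
        target = stmt.target
    else:
        return False
    return isinstance(target, ast.Name) and bool(DUNDER_REGEX.search(target.id))
```

As the comment in the diff notes, this mirrors pycodestyle's (2.9.1) behavior: module-level dunder assignments no longer mark the module as having "seen a non-import" for E402 purposes.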
impl<'a, 'b> Visitor<'b> for Checker<'a>
where
'b: 'a,
@@ -157,10 +199,11 @@ where
self.futures_allowed = false;
}
}
_ => {
node => {
self.futures_allowed = false;
if !self.seen_non_import
&& !is_assignment_to_a_dunder(node)
&& !operations::in_nested_block(&self.parent_stack, &self.parents)
{
self.seen_non_import = true;
@@ -183,8 +226,8 @@ where
name.to_string(),
Binding {
kind: BindingKind::Assignment,
used: Some((global_scope_id, stmt.location)),
location: stmt.location,
used: Some((global_scope_id, Range::from_located(stmt))),
range: Range::from_located(stmt),
},
);
}
@@ -192,7 +235,7 @@ where
}
if self.settings.select.contains(&CheckCode::E741) {
let location = self.locate_check(stmt.location);
let location = self.locate_check(Range::from_located(stmt));
self.checks.extend(
names.iter().filter_map(|name| {
checks::check_ambiguous_variable_name(name, location)
@@ -241,11 +284,15 @@ where
if self.settings.select.contains(&CheckCode::E743) {
if let Some(check) = checks::check_ambiguous_function_name(
name,
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
) {
self.checks.push(check);
}
}
self.check_builtin_shadowing(name, Range::from_located(stmt), true);
// Visit the decorators and arguments, but avoid the body, which will be deferred.
for expr in decorator_list {
self.visit_expr(expr);
}
@@ -288,7 +335,7 @@ where
Binding {
kind: BindingKind::Definition,
used: None,
location: stmt.location,
range: Range::from_located(stmt),
},
);
}
@@ -303,7 +350,7 @@ where
ScopeKind::Class | ScopeKind::Module => {
self.checks.push(Check::new(
CheckKind::ReturnOutsideFunction,
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
));
}
_ => {}
@@ -335,13 +382,20 @@ where
}
if self.settings.select.contains(&CheckCode::E742) {
if let Some(check) =
checks::check_ambiguous_class_name(name, self.locate_check(stmt.location))
{
if let Some(check) = checks::check_ambiguous_class_name(
name,
self.locate_check(Range::from_located(stmt)),
) {
self.checks.push(check);
}
}
self.check_builtin_shadowing(
name,
self.locate_check(Range::from_located(stmt)),
false,
);
for expr in bases {
self.visit_expr(expr)
}
@@ -363,7 +417,7 @@ where
{
self.checks.push(Check::new(
CheckKind::ModuleImportNotAtTopOfFile,
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
));
}
@@ -376,12 +430,17 @@ where
Binding {
kind: BindingKind::SubmoduleImportation(
alias.node.name.to_string(),
self.binding_context(),
),
used: None,
location: stmt.location,
range: Range::from_located(stmt),
},
)
} else {
if let Some(asname) = &alias.node.asname {
self.check_builtin_shadowing(asname, Range::from_located(stmt), false);
}
self.add_binding(
alias
.node
@@ -395,9 +454,10 @@ where
.asname
.clone()
.unwrap_or_else(|| alias.node.name.clone()),
self.binding_context(),
),
used: None,
location: stmt.location,
range: Range::from_located(stmt),
},
)
}
@@ -417,7 +477,7 @@ where
{
self.checks.push(Check::new(
CheckKind::ModuleImportNotAtTopOfFile,
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
));
}
@@ -438,9 +498,9 @@ where
.last()
.expect("No current scope found."))]
.id,
stmt.location,
Range::from_located(stmt),
)),
location: stmt.location,
range: Range::from_located(stmt),
},
);
@@ -453,7 +513,7 @@ where
{
self.checks.push(Check::new(
CheckKind::FutureFeatureNotDefined(alias.node.name.to_string()),
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
));
}
@@ -461,7 +521,7 @@ where
{
self.checks.push(Check::new(
CheckKind::LateFutureImport,
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
));
}
} else if alias.node.name == "*" {
@@ -476,7 +536,7 @@ where
Binding {
kind: BindingKind::StarImportation,
used: None,
location: stmt.location,
range: Range::from_located(stmt),
},
);
@@ -486,7 +546,7 @@ where
if !matches!(scope.kind, ScopeKind::Module) {
self.checks.push(Check::new(
CheckKind::ImportStarNotPermitted(module_name.to_string()),
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
));
}
}
@@ -494,7 +554,7 @@ where
if self.settings.select.contains(&CheckCode::F403) {
self.checks.push(Check::new(
CheckKind::ImportStarUsed(module_name.to_string()),
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
));
}
@@ -504,13 +564,20 @@ where
.expect("No current scope found."))];
scope.import_starred = true;
} else {
if let Some(asname) = &alias.node.asname {
self.check_builtin_shadowing(asname, Range::from_located(stmt), false);
}
let binding = Binding {
kind: BindingKind::Importation(match module {
None => name.clone(),
Some(parent) => format!("{}.{}", parent, name),
}),
kind: BindingKind::FromImportation(
match module {
None => name.clone(),
Some(parent) => format!("{}.{}", parent, name),
},
self.binding_context(),
),
used: None,
location: stmt.location,
range: Range::from_located(stmt),
};
self.add_binding(name, binding)
}
@@ -531,7 +598,7 @@ where
StmtKind::If { test, .. } => {
if self.settings.select.contains(&CheckCode::F634) {
if let Some(check) =
checks::check_if_tuple(test, self.locate_check(stmt.location))
checks::check_if_tuple(test, self.locate_check(Range::from_located(stmt)))
{
self.checks.push(check);
}
@@ -539,9 +606,10 @@ where
}
StmtKind::Assert { test, .. } => {
if self.settings.select.contains(CheckKind::AssertTuple.code()) {
if let Some(check) =
checks::check_assert_tuple(test, self.locate_check(stmt.location))
{
if let Some(check) = checks::check_assert_tuple(
test,
self.locate_check(Range::from_located(stmt)),
) {
self.checks.push(check);
}
}
@@ -555,9 +623,10 @@ where
}
StmtKind::Assign { value, .. } => {
if self.settings.select.contains(&CheckCode::E731) {
if let Some(check) =
checks::check_do_not_assign_lambda(value, self.locate_check(stmt.location))
{
if let Some(check) = checks::check_do_not_assign_lambda(
value,
self.locate_check(Range::from_located(stmt)),
) {
self.checks.push(check);
}
}
@@ -567,7 +636,7 @@ where
if let Some(value) = value {
if let Some(check) = checks::check_do_not_assign_lambda(
value,
self.locate_check(stmt.location),
self.locate_check(Range::from_located(stmt)),
) {
self.checks.push(check);
}
@@ -603,7 +672,7 @@ where
Binding {
kind: BindingKind::ClassDefinition,
used: None,
location: stmt.location,
range: Range::from_located(stmt),
},
);
};
@@ -650,7 +719,7 @@ where
elts,
check_too_many_expressions,
check_two_starred_expressions,
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
) {
self.checks.push(check);
}
@@ -662,24 +731,36 @@ where
if self.settings.select.contains(&CheckCode::E741) {
if let Some(check) = checks::check_ambiguous_variable_name(
id,
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
) {
self.checks.push(check);
}
}
self.check_builtin_shadowing(id, Range::from_located(expr), true);
let parent =
self.parents[*(self.parent_stack.last().expect("No parent found."))];
self.handle_node_store(expr, parent);
}
ExprContext::Del => self.handle_node_delete(expr),
},
ExprKind::Call { func, .. } => {
ExprKind::Call { func, args, .. } => {
if self.settings.select.contains(&CheckCode::R002) {
if let Some(check) = checks::check_assert_equals(func, self.autofix) {
self.checks.push(check)
}
}
// flake8-super
if self.settings.select.contains(&CheckCode::SPR001) {
if let Some(check) =
checks::check_super_args(expr, func, args, &mut self.locator, self.autofix)
{
self.checks.push(check)
}
}
if let ExprKind::Name { id, ctx } = &func.node {
if id == "locals" && matches!(ctx, ExprContext::Load) {
let scope = &mut self.scopes[*(self
@@ -718,7 +799,7 @@ where
{
self.checks.push(Check::new(
CheckKind::YieldOutsideFunction,
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
));
}
}
@@ -734,10 +815,10 @@ where
{
self.checks.push(Check::new(
CheckKind::FStringMissingPlaceholders,
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
));
}
self.in_f_string = Some(expr.location);
self.in_f_string = Some(Range::from_located(expr));
}
ExprKind::BinOp {
left,
@@ -754,8 +835,10 @@ where
..
}) = scope.values.get("print")
{
self.checks
.push(Check::new(CheckKind::InvalidPrintSyntax, left.location));
self.checks.push(Check::new(
CheckKind::InvalidPrintSyntax,
Range::from_located(left),
));
}
}
}
@@ -797,7 +880,7 @@ where
left,
ops,
comparators,
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
));
}
@@ -805,7 +888,7 @@ where
self.checks.extend(checks::check_type_comparison(
ops,
comparators,
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
));
}
}
@@ -814,7 +897,41 @@ where
..
} if self.in_annotation && !self.in_literal => {
self.deferred_string_annotations
.push((expr.location, value));
.push((Range::from_located(expr), value));
}
ExprKind::Lambda { args, .. } => {
// Visit the arguments, but avoid the body, which will be deferred.
for arg in &args.posonlyargs {
if let Some(expr) = &arg.node.annotation {
self.visit_annotation(expr);
}
}
for arg in &args.args {
if let Some(expr) = &arg.node.annotation {
self.visit_annotation(expr);
}
}
if let Some(arg) = &args.vararg {
if let Some(expr) = &arg.node.annotation {
self.visit_annotation(expr);
}
}
for arg in &args.kwonlyargs {
if let Some(expr) = &arg.node.annotation {
self.visit_annotation(expr);
}
}
if let Some(arg) = &args.kwarg {
if let Some(expr) = &arg.node.annotation {
self.visit_annotation(expr);
}
}
for expr in &args.kw_defaults {
self.visit_expr(expr);
}
for expr in &args.defaults {
self.visit_expr(expr);
}
}
ExprKind::GeneratorExp { .. }
| ExprKind::ListComp { .. }
@@ -964,7 +1081,7 @@ where
if self.settings.select.contains(&CheckCode::E722) && type_.is_none() {
self.checks.push(Check::new(
CheckKind::DoNotUseBareExcept,
excepthandler.location,
Range::from_located(excepthandler),
));
}
match name {
@@ -972,11 +1089,18 @@ where
if self.settings.select.contains(&CheckCode::E741) {
if let Some(check) = checks::check_ambiguous_variable_name(
name,
self.locate_check(excepthandler.location),
self.locate_check(Range::from_located(excepthandler)),
) {
self.checks.push(check);
}
}
self.check_builtin_shadowing(
name,
Range::from_located(excepthandler),
false,
);
let scope = &self.scopes
[*(self.scope_stack.last().expect("No current scope found."))];
if scope.values.contains_key(name) {
@@ -985,6 +1109,7 @@ where
self.handle_node_store(
&Expr::new(
excepthandler.location,
excepthandler.end_location,
ExprKind::Name {
id: name.to_string(),
ctx: ExprContext::Store,
@@ -992,7 +1117,6 @@ where
),
parent,
);
self.parents.push(parent);
}
let parent =
@@ -1003,6 +1127,7 @@ where
self.handle_node_store(
&Expr::new(
excepthandler.location,
excepthandler.end_location,
ExprKind::Name {
id: name.to_string(),
ctx: ExprContext::Store,
@@ -1010,7 +1135,6 @@ where
),
parent,
);
self.parents.push(parent);
walk_excepthandler(self, excepthandler);
@@ -1022,7 +1146,7 @@ where
{
self.checks.push(Check::new(
CheckKind::UnusedVariable(name.to_string()),
excepthandler.location,
Range::from_located(excepthandler),
));
}
}
@@ -1068,23 +1192,25 @@ where
Binding {
kind: BindingKind::Argument,
used: None,
location: arg.location,
range: Range::from_located(arg),
},
);
if self.settings.select.contains(&CheckCode::E741) {
if let Some(check) = checks::check_ambiguous_variable_name(
&arg.node.arg,
self.locate_check(arg.location),
self.locate_check(Range::from_located(arg)),
) {
self.checks.push(check);
}
}
self.check_builtin_arg_shadowing(&arg.node.arg, Range::from_located(arg));
}
}
impl CheckLocator for Checker<'_> {
fn locate_check(&self, default: Location) -> Location {
fn locate_check(&self, default: Range) -> Range {
self.in_f_string.unwrap_or(default)
}
}
@@ -1122,7 +1248,7 @@ impl<'a> Checker<'a> {
(*builtin).to_string(),
Binding {
kind: BindingKind::Builtin,
location: Default::default(),
range: Default::default(),
used: None,
},
);
@@ -1132,13 +1258,23 @@ impl<'a> Checker<'a> {
(*builtin).to_string(),
Binding {
kind: BindingKind::Builtin,
location: Default::default(),
range: Default::default(),
used: None,
},
);
}
}
fn binding_context(&self) -> BindingContext {
let mut rev = self.parent_stack.iter().rev().fuse();
let defined_by = *rev.next().expect("Expected to bind within a statement.");
let defined_in = rev.next().cloned();
BindingContext {
defined_by,
defined_in,
}
}
fn add_binding(&mut self, name: String, binding: Binding) {
let scope = &mut self.scopes[*(self.scope_stack.last().expect("No current scope found."))];
@@ -1147,17 +1283,23 @@ impl<'a> Checker<'a> {
None => binding,
Some(existing) => {
if self.settings.select.contains(&CheckCode::F402)
&& matches!(existing.kind, BindingKind::Importation(_))
&& matches!(binding.kind, BindingKind::LoopVar)
&& matches!(
existing.kind,
BindingKind::Importation(_, _) | BindingKind::FromImportation(_, _)
)
{
self.checks.push(Check::new(
CheckKind::ImportShadowedByLoopVar(name.clone(), existing.location.row()),
binding.location,
CheckKind::ImportShadowedByLoopVar(
name.clone(),
existing.range.location.row(),
),
binding.range,
));
}
Binding {
kind: binding.kind,
location: binding.location,
range: binding.range,
used: existing.used,
}
}
@@ -1184,7 +1326,7 @@ impl<'a> Checker<'a> {
}
}
if let Some(binding) = scope.values.get_mut(id) {
binding.used = Some((scope_id, expr.location));
binding.used = Some((scope_id, Range::from_located(expr)));
return;
}
@@ -1208,7 +1350,7 @@ impl<'a> Checker<'a> {
self.checks.push(Check::new(
CheckKind::ImportStarUsage(id.clone(), from_list.join(", ")),
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
));
}
return;
@@ -1221,7 +1363,7 @@ impl<'a> Checker<'a> {
}
self.checks.push(Check::new(
CheckKind::UndefinedName(id.clone()),
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
))
}
}
@@ -1258,7 +1400,7 @@ impl<'a> Checker<'a> {
Binding {
kind: BindingKind::Annotation,
used: None,
location: expr.location,
range: Range::from_located(expr),
},
);
return;
@@ -1274,7 +1416,7 @@ impl<'a> Checker<'a> {
Binding {
kind: BindingKind::LoopVar,
used: None,
location: expr.location,
range: Range::from_located(expr),
},
);
return;
@@ -1286,7 +1428,7 @@ impl<'a> Checker<'a> {
Binding {
kind: BindingKind::Binding,
used: None,
location: expr.location,
range: Range::from_located(expr),
},
);
return;
@@ -1306,7 +1448,7 @@ impl<'a> Checker<'a> {
Binding {
kind: BindingKind::Export(extract_all_names(parent, current)),
used: None,
location: expr.location,
range: Range::from_located(expr),
},
);
return;
@@ -1317,7 +1459,7 @@ impl<'a> Checker<'a> {
Binding {
kind: BindingKind::Assignment,
used: None,
location: expr.location,
range: Range::from_located(expr),
},
);
}
@@ -1336,7 +1478,7 @@ impl<'a> Checker<'a> {
{
self.checks.push(Check::new(
CheckKind::UndefinedName(id.clone()),
self.locate_check(expr.location),
self.locate_check(Range::from_located(expr)),
))
}
}
@@ -1415,8 +1557,11 @@ impl<'a> Checker<'a> {
fn check_deferred_assignments(&mut self) {
if self.settings.select.contains(&CheckCode::F841) {
while let Some(index) = self.deferred_assignments.pop() {
self.checks
.extend(checks::check_unused_variables(&self.scopes[index], self));
self.checks.extend(checks::check_unused_variables(
&self.scopes[index],
self,
&self.settings.dummy_variable_rgx,
));
}
}
}
@@ -1448,7 +1593,7 @@ impl<'a> Checker<'a> {
if !scope.values.contains_key(name) {
self.checks.push(Check::new(
CheckKind::UndefinedExport(name.to_string()),
self.locate_check(all_binding.location),
self.locate_check(all_binding.range),
));
}
}
@@ -1471,7 +1616,7 @@ impl<'a> Checker<'a> {
if !scope.values.contains_key(name) {
self.checks.push(Check::new(
CheckKind::ImportStarUsage(name.clone(), from_list.join(", ")),
self.locate_check(all_binding.location),
self.locate_check(all_binding.range),
));
}
}
@@ -1480,6 +1625,11 @@ impl<'a> Checker<'a> {
}
if self.settings.select.contains(&CheckCode::F401) {
// Collect all unused imports by location. (Multiple unused imports at the same
// location indicate an `import from`.)
let mut unused: BTreeMap<(ImportKind, usize, Option<usize>), Vec<&str>> =
BTreeMap::new();
for (name, binding) in scope.values.iter().rev() {
let used = binding.used.is_some()
|| all_names
@@ -1488,17 +1638,103 @@ impl<'a> Checker<'a> {
if !used {
match &binding.kind {
BindingKind::Importation(full_name)
| BindingKind::SubmoduleImportation(full_name) => {
self.checks.push(Check::new(
CheckKind::UnusedImport(full_name.to_string()),
self.locate_check(binding.location),
));
BindingKind::FromImportation(full_name, context) => {
let full_names = unused
.entry((
ImportKind::ImportFrom,
context.defined_by,
context.defined_in,
))
.or_insert(vec![]);
full_names.push(full_name);
}
BindingKind::Importation(full_name, context)
| BindingKind::SubmoduleImportation(full_name, context) => {
let full_names = unused
.entry((
ImportKind::Import,
context.defined_by,
context.defined_in,
))
.or_insert(vec![]);
full_names.push(full_name);
}
_ => {}
}
}
}
for ((kind, defined_by, defined_in), full_names) in unused {
let child = self.parents[defined_by];
let parent = defined_in.map(|defined_in| self.parents[defined_in]);
let mut check = Check::new(
CheckKind::UnusedImport(full_names.join(", ")),
self.locate_check(Range::from_located(child)),
);
if matches!(self.autofix, fixer::Mode::Generate | fixer::Mode::Apply) {
let deleted: Vec<&Stmt> = self
.deletions
.iter()
.map(|index| self.parents[*index])
.collect();
let removal_fn = match kind {
ImportKind::Import => fixes::remove_unused_imports,
ImportKind::ImportFrom => fixes::remove_unused_import_froms,
};
match removal_fn(&mut self.locator, &full_names, child, parent, &deleted) {
Ok(fix) => {
if fix.content.is_empty() || fix.content == "pass" {
self.deletions.insert(defined_by);
}
check.amend(fix)
}
Err(e) => error!("Failed to fix unused imports: {}", e),
}
}
self.checks.push(check);
}
}
}
}
fn check_builtin_shadowing(&mut self, name: &str, location: Range, is_attribute: bool) {
let scope = &self.scopes[*(self.scope_stack.last().expect("No current scope found."))];
// flake8-builtins
if is_attribute && matches!(scope.kind, ScopeKind::Class) {
if self.settings.select.contains(&CheckCode::A003) {
if let Some(check) = checks::check_builtin_shadowing(
name,
self.locate_check(location),
checks::ShadowingType::Attribute,
) {
self.checks.push(check);
}
}
} else if self.settings.select.contains(&CheckCode::A001) {
if let Some(check) = checks::check_builtin_shadowing(
name,
self.locate_check(location),
checks::ShadowingType::Variable,
) {
self.checks.push(check);
}
}
}
fn check_builtin_arg_shadowing(&mut self, name: &str, location: Range) {
if self.settings.select.contains(&CheckCode::A002) {
if let Some(check) = checks::check_builtin_shadowing(
name,
self.locate_check(location),
checks::ShadowingType::Argument,
) {
self.checks.push(check);
}
}
}
@@ -1506,12 +1742,12 @@ impl<'a> Checker<'a> {
pub fn check_ast(
python_ast: &Suite,
content: &str,
contents: &str,
settings: &Settings,
autofix: &fixer::Mode,
path: &Path,
) -> Vec<Check> {
let mut checker = Checker::new(settings, autofix, path, content);
let mut checker = Checker::new(settings, autofix, path, contents);
checker.push_scope(Scope::new(ScopeKind::Module));
checker.bind_builtins();


@@ -1,5 +1,6 @@
use std::collections::BTreeMap;
use crate::ast::types::Range;
use rustpython_parser::ast::Location;
use crate::autofix::fixer;
@@ -64,11 +65,11 @@ pub fn check_lines(
.or_insert_with(|| (noqa::extract_noqa_directive(lines[noqa_lineno]), vec![]));
match noqa {
(Directive::All(_), matches) => {
(Directive::All(_, _), matches) => {
matches.push(check.kind.code().as_str());
ignored.push(index)
}
(Directive::Codes(_, codes), matches) => {
(Directive::Codes(_, _, codes), matches) => {
if codes.contains(&check.kind.code().as_str()) {
matches.push(check.kind.code().as_str());
ignored.push(index);
@@ -89,14 +90,17 @@ pub fn check_lines(
let check = Check::new(
CheckKind::LineTooLong(line_length, settings.line_length),
Location::new(lineno + 1, settings.line_length + 1),
Range {
location: Location::new(lineno + 1, 1),
end_location: Location::new(lineno + 1, line_length + 1),
},
);
match noqa {
(Directive::All(_), matches) => {
(Directive::All(_, _), matches) => {
matches.push(check.kind.code().as_str());
}
(Directive::Codes(_, codes), matches) => {
(Directive::Codes(_, _, codes), matches) => {
if codes.contains(&check.kind.code().as_str()) {
matches.push(check.kind.code().as_str());
} else {
@@ -113,24 +117,30 @@ pub fn check_lines(
if enforce_noqa {
for (row, (directive, matches)) in noqa_directives {
match directive {
Directive::All(column) => {
Directive::All(start, end) => {
if matches.is_empty() {
let mut check = Check::new(
CheckKind::UnusedNOQA(None),
Location::new(row + 1, column + 1),
Range {
location: Location::new(row + 1, start + 1),
end_location: Location::new(row + 1, end + 1),
},
);
if matches!(autofix, fixer::Mode::Generate | fixer::Mode::Apply) {
check.amend(Fix {
content: "".to_string(),
start: Location::new(row + 1, column + 1),
end: Location::new(row + 1, lines[row].chars().count() + 1),
location: Location::new(row + 1, start + 1),
end_location: Location::new(
row + 1,
lines[row].chars().count() + 1,
),
applied: false,
});
}
line_checks.push(check);
}
}
Directive::Codes(column, codes) => {
Directive::Codes(start, end, codes) => {
let mut invalid_codes = vec![];
let mut valid_codes = vec![];
for code in codes {
@@ -144,21 +154,30 @@ pub fn check_lines(
if !invalid_codes.is_empty() {
let mut check = Check::new(
CheckKind::UnusedNOQA(Some(invalid_codes.join(", "))),
Location::new(row + 1, column + 1),
Range {
location: Location::new(row + 1, start + 1),
end_location: Location::new(row + 1, end + 1),
},
);
if matches!(autofix, fixer::Mode::Generate | fixer::Mode::Apply) {
if valid_codes.is_empty() {
check.amend(Fix {
content: "".to_string(),
start: Location::new(row + 1, column + 1),
end: Location::new(row + 1, lines[row].chars().count() + 1),
location: Location::new(row + 1, start + 1),
end_location: Location::new(
row + 1,
lines[row].chars().count() + 1,
),
applied: false,
});
} else {
check.amend(Fix {
content: format!(" # noqa: {}", valid_codes.join(", ")),
start: Location::new(row + 1, column + 1),
end: Location::new(row + 1, lines[row].chars().count() + 1),
location: Location::new(row + 1, start + 1),
end_location: Location::new(
row + 1,
lines[row].chars().count() + 1,
),
applied: false,
});
}
@@ -180,10 +199,11 @@ pub fn check_lines(
#[cfg(test)]
mod tests {
use std::collections::BTreeSet;
use crate::autofix::fixer;
use crate::checks::{Check, CheckCode};
use crate::settings;
use super::check_lines;
use super::*;
#[test]
fn e501_non_ascii_char() {
@@ -191,19 +211,14 @@ mod tests {
let noqa_line_for: Vec<usize> = vec![1];
let check_with_max_line_length = |line_length: usize| {
let mut checks: Vec<Check> = vec![];
let settings = Settings {
pyproject: None,
project_root: None,
line_length,
exclude: vec![],
extend_exclude: vec![],
select: BTreeSet::from_iter(vec![CheckCode::E501]),
};
check_lines(
&mut checks,
line,
&noqa_line_for,
&settings,
&settings::Settings {
line_length,
..settings::Settings::for_rule(CheckCode::E501)
},
&fixer::Mode::Generate,
);
return checks;


@@ -1,10 +1,12 @@
use std::str::FromStr;
use crate::ast::types::Range;
use anyhow::Result;
use rustpython_parser::ast::Location;
use serde::{Deserialize, Serialize};
pub const DEFAULT_CHECK_CODES: [CheckCode; 42] = [
pub const DEFAULT_CHECK_CODES: [CheckCode; 46] = [
// pycodestyle
CheckCode::E402,
CheckCode::E501,
CheckCode::E711,
@@ -19,6 +21,7 @@ pub const DEFAULT_CHECK_CODES: [CheckCode; 42] = [
CheckCode::E743,
CheckCode::E902,
CheckCode::E999,
// pyflakes
CheckCode::F401,
CheckCode::F402,
CheckCode::F403,
@@ -47,9 +50,16 @@ pub const DEFAULT_CHECK_CODES: [CheckCode; 42] = [
CheckCode::F831,
CheckCode::F841,
CheckCode::F901,
// flake8-builtins
CheckCode::A001,
CheckCode::A002,
CheckCode::A003,
// flake8-super
CheckCode::SPR001,
];
pub const ALL_CHECK_CODES: [CheckCode; 45] = [
pub const ALL_CHECK_CODES: [CheckCode; 49] = [
// pycodestyle
CheckCode::E402,
CheckCode::E501,
CheckCode::E711,
@@ -64,6 +74,7 @@ pub const ALL_CHECK_CODES: [CheckCode; 45] = [
CheckCode::E743,
CheckCode::E902,
CheckCode::E999,
// pyflakes
CheckCode::F401,
CheckCode::F402,
CheckCode::F403,
@@ -92,13 +103,22 @@ pub const ALL_CHECK_CODES: [CheckCode; 45] = [
CheckCode::F831,
CheckCode::F841,
CheckCode::F901,
// flake8-builtins
CheckCode::A001,
CheckCode::A002,
CheckCode::A003,
// flake8-super
CheckCode::SPR001,
// Meta
CheckCode::M001,
// Refactor
CheckCode::R001,
CheckCode::R002,
];
#[derive(Debug, PartialEq, Eq, Clone, Serialize, Deserialize, Hash, PartialOrd, Ord)]
pub enum CheckCode {
// pycodestyle
E402,
E501,
E711,
@@ -113,6 +133,7 @@ pub enum CheckCode {
E743,
E902,
E999,
// pyflakes
F401,
F402,
F403,
@@ -141,8 +162,16 @@ pub enum CheckCode {
F831,
F841,
F901,
// flake8-builtins
A001,
A002,
A003,
// flake8-super
SPR001,
// Refactor
R001,
R002,
// Meta
M001,
}
@@ -151,6 +180,7 @@ impl FromStr for CheckCode {
fn from_str(s: &str) -> Result<Self> {
match s {
// pycodestyle
"E402" => Ok(CheckCode::E402),
"E501" => Ok(CheckCode::E501),
"E711" => Ok(CheckCode::E711),
@@ -165,6 +195,7 @@ impl FromStr for CheckCode {
"E743" => Ok(CheckCode::E743),
"E902" => Ok(CheckCode::E902),
"E999" => Ok(CheckCode::E999),
// pyflakes
"F401" => Ok(CheckCode::F401),
"F402" => Ok(CheckCode::F402),
"F403" => Ok(CheckCode::F403),
@@ -193,8 +224,16 @@ impl FromStr for CheckCode {
"F831" => Ok(CheckCode::F831),
"F841" => Ok(CheckCode::F841),
"F901" => Ok(CheckCode::F901),
// flake8-builtins
"A001" => Ok(CheckCode::A001),
"A002" => Ok(CheckCode::A002),
"A003" => Ok(CheckCode::A003),
// flake8-super
"SPR001" => Ok(CheckCode::SPR001),
// Refactor
"R001" => Ok(CheckCode::R001),
"R002" => Ok(CheckCode::R002),
// Meta
"M001" => Ok(CheckCode::M001),
_ => Err(anyhow::anyhow!("Unknown check code: {s}")),
}
@@ -204,6 +243,7 @@ impl FromStr for CheckCode {
impl CheckCode {
pub fn as_str(&self) -> &str {
match self {
// pycodestyle
CheckCode::E402 => "E402",
CheckCode::E501 => "E501",
CheckCode::E711 => "E711",
@@ -218,6 +258,7 @@ impl CheckCode {
CheckCode::E743 => "E743",
CheckCode::E902 => "E902",
CheckCode::E999 => "E999",
// pyflakes
CheckCode::F401 => "F401",
CheckCode::F402 => "F402",
CheckCode::F403 => "F403",
@@ -246,8 +287,16 @@ impl CheckCode {
CheckCode::F831 => "F831",
CheckCode::F841 => "F841",
CheckCode::F901 => "F901",
// flake8-builtins
CheckCode::A001 => "A001",
CheckCode::A002 => "A002",
CheckCode::A003 => "A003",
// flake8-super
CheckCode::SPR001 => "SPR001",
// Refactor
CheckCode::R001 => "R001",
CheckCode::R002 => "R002",
// Meta
CheckCode::M001 => "M001",
}
}
@@ -256,7 +305,7 @@ impl CheckCode {
pub fn lint_source(&self) -> &'static LintSource {
match self {
CheckCode::E501 | CheckCode::M001 => &LintSource::Lines,
CheckCode::E902 | CheckCode::E999 => &LintSource::FileSystem,
CheckCode::E902 => &LintSource::FileSystem,
_ => &LintSource::AST,
}
}
@@ -264,6 +313,7 @@ impl CheckCode {
/// A placeholder representation of the CheckKind for the check.
pub fn kind(&self) -> CheckKind {
match self {
// pycodestyle
CheckCode::E402 => CheckKind::ModuleImportNotAtTopOfFile,
CheckCode::E501 => CheckKind::LineTooLong(89, 88),
CheckCode::E711 => CheckKind::NoneComparison(RejectedCmpop::Eq),
@@ -278,6 +328,7 @@ impl CheckCode {
CheckCode::E743 => CheckKind::AmbiguousFunctionName("...".to_string()),
CheckCode::E902 => CheckKind::IOError("...".to_string()),
CheckCode::E999 => CheckKind::SyntaxError("...".to_string()),
// pyflakes
CheckCode::F401 => CheckKind::UnusedImport("...".to_string()),
CheckCode::F402 => CheckKind::ImportShadowedByLoopVar("...".to_string(), 1),
CheckCode::F403 => CheckKind::ImportStarUsed("...".to_string()),
@@ -306,9 +357,17 @@ impl CheckCode {
CheckCode::F831 => CheckKind::DuplicateArgumentName,
CheckCode::F841 => CheckKind::UnusedVariable("...".to_string()),
CheckCode::F901 => CheckKind::RaiseNotImplemented,
CheckCode::M001 => CheckKind::UnusedNOQA(None),
// flake8-builtins
CheckCode::A001 => CheckKind::BuiltinVariableShadowing("...".to_string()),
CheckCode::A002 => CheckKind::BuiltinArgumentShadowing("...".to_string()),
CheckCode::A003 => CheckKind::BuiltinAttributeShadowing("...".to_string()),
// flake8-super
CheckCode::SPR001 => CheckKind::SuperCallWithParameters,
// Refactor
CheckCode::R001 => CheckKind::UselessObjectInheritance("...".to_string()),
CheckCode::R002 => CheckKind::NoAssertEquals,
// Meta
CheckCode::M001 => CheckKind::UnusedNOQA(None),
}
}
}
@@ -373,6 +432,12 @@ pub enum CheckKind {
UnusedVariable(String),
UselessObjectInheritance(String),
YieldOutsideFunction,
// flake8-builtin
BuiltinVariableShadowing(String),
BuiltinArgumentShadowing(String),
BuiltinAttributeShadowing(String),
// flake8-super
SuperCallWithParameters,
}
impl CheckKind {
@@ -426,6 +491,12 @@ impl CheckKind {
CheckKind::UselessObjectInheritance(_) => "UselessObjectInheritance",
CheckKind::YieldOutsideFunction => "YieldOutsideFunction",
CheckKind::UnusedNOQA(_) => "UnusedNOQA",
// flake8-builtins
CheckKind::BuiltinVariableShadowing(_) => "BuiltinVariableShadowing",
CheckKind::BuiltinArgumentShadowing(_) => "BuiltinArgumentShadowing",
CheckKind::BuiltinAttributeShadowing(_) => "BuiltinAttributeShadowing",
// flake8-super
CheckKind::SuperCallWithParameters => "SuperCallWithParameters",
}
}
@@ -477,6 +548,12 @@ impl CheckKind {
CheckKind::UnusedVariable(_) => &CheckCode::F841,
CheckKind::UselessObjectInheritance(_) => &CheckCode::R001,
CheckKind::YieldOutsideFunction => &CheckCode::F704,
// flake8-builtins
CheckKind::BuiltinVariableShadowing(_) => &CheckCode::A001,
CheckKind::BuiltinArgumentShadowing(_) => &CheckCode::A002,
CheckKind::BuiltinAttributeShadowing(_) => &CheckCode::A003,
// flake8-super
CheckKind::SuperCallWithParameters => &CheckCode::SPR001,
}
}
@@ -611,6 +688,20 @@ impl CheckKind {
None => "Unused `noqa` directive".to_string(),
Some(code) => format!("Unused `noqa` directive for: {code}"),
},
// flake8-builtins
CheckKind::BuiltinVariableShadowing(name) => {
format!("Variable `{name}` is shadowing a Python builtin")
}
CheckKind::BuiltinArgumentShadowing(name) => {
format!("Argument `{name}` is shadowing a Python builtin")
}
CheckKind::BuiltinAttributeShadowing(name) => {
format!("Class attribute `{name}` is shadowing a Python builtin")
}
// flake8-super
CheckKind::SuperCallWithParameters => {
"Use `super()` instead of `super(__class__, self)`".to_string()
}
}
}
@@ -621,6 +712,8 @@ impl CheckKind {
CheckKind::NoAssertEquals
| CheckKind::UselessObjectInheritance(_)
| CheckKind::UnusedNOQA(_)
| CheckKind::SuperCallWithParameters
| CheckKind::UnusedImport(_)
)
}
}
@@ -628,8 +721,8 @@ impl CheckKind {
#[derive(Debug, PartialEq, Eq, Serialize, Deserialize)]
pub struct Fix {
pub content: String,
pub start: Location,
pub end: Location,
pub location: Location,
pub end_location: Location,
pub applied: bool,
}
@@ -637,14 +730,16 @@ pub struct Fix {
pub struct Check {
pub kind: CheckKind,
pub location: Location,
pub end_location: Location,
pub fix: Option<Fix>,
}
impl Check {
pub fn new(kind: CheckKind, location: Location) -> Self {
pub fn new(kind: CheckKind, span: Range) -> Self {
Self {
kind,
location,
location: span.location,
end_location: span.end_location,
fix: None,
}
}


@@ -1,4 +1,5 @@
use std::borrow::Cow;
use std::collections::BTreeSet;
use std::fs::File;
use std::io::{BufReader, Read};
use std::ops::Deref;
@@ -10,7 +11,8 @@ use path_absolutize::path_dedot;
use path_absolutize::Absolutize;
use walkdir::{DirEntry, WalkDir};
use crate::settings::FilePattern;
use crate::checks::CheckCode;
use crate::settings::{FilePattern, PerFileIgnore};
/// Extract the absolute path and basename (as strings) from a Path.
fn extract_path_names(path: &Path) -> Result<(&str, &str)> {
@@ -25,9 +27,12 @@ fn extract_path_names(path: &Path) -> Result<(&str, &str)> {
Ok((file_path, file_basename))
}
fn is_excluded(file_path: &str, file_basename: &str, exclude: &[FilePattern]) -> bool {
fn is_excluded<'a, T>(file_path: &str, file_basename: &str, exclude: T) -> bool
where
T: Iterator<Item = &'a FilePattern>,
{
for pattern in exclude {
match &pattern {
match pattern {
FilePattern::Simple(basename) => {
if *basename == file_basename {
return true;
@@ -45,7 +50,7 @@ fn is_excluded(file_path: &str, file_basename: &str, exclude: &[FilePattern]) ->
return true;
}
}
}
};
}
false
}
@@ -71,7 +76,6 @@ pub fn iter_python_files<'a>(
.all(|pattern| matches!(pattern, FilePattern::Simple(_)));
WalkDir::new(normalize_path(path))
.follow_links(true)
.into_iter()
.filter_entry(move |entry| {
if !has_exclude && !has_extend_exclude {
@@ -85,13 +89,13 @@ pub fn iter_python_files<'a>(
if has_exclude
&& (!exclude_simple || file_type.is_dir())
&& is_excluded(file_path, file_basename, exclude)
&& is_excluded(file_path, file_basename, exclude.iter())
{
debug!("Ignored path via `exclude`: {:?}", path);
false
} else if has_extend_exclude
&& (!extend_exclude_simple || file_type.is_dir())
&& is_excluded(file_path, file_basename, extend_exclude)
&& is_excluded(file_path, file_basename, extend_exclude.iter())
{
debug!("Ignored path via `extend-exclude`: {:?}", path);
false
@@ -107,11 +111,32 @@ pub fn iter_python_files<'a>(
})
.filter(|entry| {
entry.as_ref().map_or(true, |entry| {
(entry.depth() == 0 && !entry.file_type().is_dir()) || is_included(entry.path())
(entry.depth() == 0 || is_included(entry.path()))
&& !entry.file_type().is_dir()
&& !(entry.file_type().is_symlink() && entry.path().is_dir())
})
})
}
/// Create tree set with codes matching the pattern/code pairs.
pub fn ignores_from_path<'a>(
path: &Path,
pattern_code_pairs: &'a [PerFileIgnore],
) -> Result<BTreeSet<&'a CheckCode>> {
let (file_path, file_basename) = extract_path_names(path)?;
Ok(pattern_code_pairs
.iter()
.filter(|pattern_code_pair| {
is_excluded(
file_path,
file_basename,
[&pattern_code_pair.pattern].into_iter(),
)
})
.map(|pattern_code_pair| &pattern_code_pair.code)
.collect())
}
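The new `ignores_from_path` helper collects, for a given file, the set of check codes that should be suppressed, by matching each pattern/code pair against the path. A simplified std-only sketch of that filter-and-collect step (prefix matching and string codes stand in for `FilePattern` and `CheckCode`):

```rust
use std::collections::BTreeSet;

// Stand-in for ruff's PerFileIgnore pairing.
struct PerFileIgnore {
    pattern: &'static str, // path prefix, standing in for FilePattern
    code: &'static str,    // standing in for CheckCode
}

// Mirrors `ignores_from_path`: keep the codes whose pattern matches this file.
fn ignores_for(path: &str, pairs: &[PerFileIgnore]) -> BTreeSet<&'static str> {
    pairs
        .iter()
        .filter(|pair| path.starts_with(pair.pattern))
        .map(|pair| pair.code)
        .collect()
}

fn main() {
    let pairs = [
        PerFileIgnore { pattern: "tests/", code: "F401" },
        PerFileIgnore { pattern: "scripts/", code: "E501" },
    ];
    let ignores = ignores_for("tests/test_foo.py", &pairs);
    assert!(ignores.contains("F401"));
    assert!(!ignores.contains("E501"));
    println!("ok");
}
```

A `BTreeSet` gives cheap `contains` lookups when the linter later filters each check's code against the ignore set.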
/// Convert any path to an absolute path (based on the current working directory).
pub fn normalize_path(path: &Path) -> PathBuf {
if let Ok(path) = path.absolutize() {
@@ -180,7 +205,7 @@ mod tests {
&Some(project_root.to_path_buf()),
)];
let (file_path, file_basename) = extract_path_names(&path)?;
assert!(is_excluded(file_path, file_basename, &exclude));
assert!(is_excluded(file_path, file_basename, exclude.iter()));
let path = Path::new("foo/bar").absolutize_from(project_root).unwrap();
let exclude = vec![FilePattern::from_user(
@@ -188,7 +213,7 @@ mod tests {
&Some(project_root.to_path_buf()),
)];
let (file_path, file_basename) = extract_path_names(&path)?;
assert!(is_excluded(file_path, file_basename, &exclude));
assert!(is_excluded(file_path, file_basename, exclude.iter()));
let path = Path::new("foo/bar/baz.py")
.absolutize_from(project_root)
@@ -198,7 +223,7 @@ mod tests {
&Some(project_root.to_path_buf()),
)];
let (file_path, file_basename) = extract_path_names(&path)?;
assert!(is_excluded(file_path, file_basename, &exclude));
assert!(is_excluded(file_path, file_basename, exclude.iter()));
let path = Path::new("foo/bar").absolutize_from(project_root).unwrap();
let exclude = vec![FilePattern::from_user(
@@ -206,7 +231,7 @@ mod tests {
&Some(project_root.to_path_buf()),
)];
let (file_path, file_basename) = extract_path_names(&path)?;
assert!(is_excluded(file_path, file_basename, &exclude));
assert!(is_excluded(file_path, file_basename, exclude.iter()));
let path = Path::new("foo/bar/baz.py")
.absolutize_from(project_root)
@@ -216,7 +241,7 @@ mod tests {
&Some(project_root.to_path_buf()),
)];
let (file_path, file_basename) = extract_path_names(&path)?;
assert!(is_excluded(file_path, file_basename, &exclude));
assert!(is_excluded(file_path, file_basename, exclude.iter()));
let path = Path::new("foo/bar/baz.py")
.absolutize_from(project_root)
@@ -226,7 +251,7 @@ mod tests {
&Some(project_root.to_path_buf()),
)];
let (file_path, file_basename) = extract_path_names(&path)?;
assert!(is_excluded(file_path, file_basename, &exclude));
assert!(is_excluded(file_path, file_basename, exclude.iter()));
let path = Path::new("foo/bar/baz.py")
.absolutize_from(project_root)
@@ -236,7 +261,7 @@ mod tests {
&Some(project_root.to_path_buf()),
)];
let (file_path, file_basename) = extract_path_names(&path)?;
assert!(!is_excluded(file_path, file_basename, &exclude));
assert!(!is_excluded(file_path, file_basename, exclude.iter()));
Ok(())
}


@@ -1,4 +1,13 @@
extern crate core;
use std::path::Path;
use anyhow::Result;
use log::debug;
use rustpython_parser::lexer::LexResult;
use crate::autofix::fixer::Mode;
use crate::linter::{check_path, tokenize};
use crate::message::Message;
use crate::settings::Settings;
mod ast;
mod autofix;
@@ -15,3 +24,50 @@ pub mod printer;
pub mod pyproject;
mod python;
pub mod settings;
/// Run ruff over Python source code directly.
pub fn check(path: &Path, contents: &str) -> Result<Vec<Message>> {
// Find the project root and pyproject.toml.
let project_root = pyproject::find_project_root(&[path.to_path_buf()]);
match &project_root {
Some(path) => debug!("Found project root at: {:?}", path),
None => debug!("Unable to identify project root; assuming current directory..."),
};
let pyproject = pyproject::find_pyproject_toml(&project_root);
match &pyproject {
Some(path) => debug!("Found pyproject.toml at: {:?}", path),
None => debug!("Unable to find pyproject.toml; using default settings..."),
};
let settings = Settings::from_pyproject(pyproject, project_root)?;
// Tokenize once.
let tokens: Vec<LexResult> = tokenize(contents);
// Determine the noqa line for every line in the source.
let noqa_line_for = noqa::extract_noqa_line_for(&tokens);
// Generate checks.
let checks = check_path(
path,
contents,
tokens,
&noqa_line_for,
&settings,
&Mode::None,
)?;
// Convert to messages.
let messages: Vec<Message> = checks
.into_iter()
.map(|check| Message {
kind: check.kind,
fixed: check.fix.map(|fix| fix.applied).unwrap_or_default(),
location: check.location,
end_location: check.end_location,
filename: path.to_string_lossy().to_string(),
})
.collect();
Ok(messages)
}


@@ -5,6 +5,7 @@ use log::debug;
use rustpython_parser::lexer::LexResult;
use rustpython_parser::{lexer, parser};
use crate::ast::types::Range;
use crate::autofix::fixer;
use crate::autofix::fixer::fix_file;
use crate::check_ast::check_ast;
@@ -15,19 +16,30 @@ use crate::noqa::add_noqa;
use crate::settings::Settings;
use crate::{cache, fs, noqa};
fn check_path(
/// Collect tokens up to and including the first error.
pub(crate) fn tokenize(contents: &str) -> Vec<LexResult> {
let mut tokens: Vec<LexResult> = vec![];
for tok in lexer::make_tokenizer(contents) {
let is_err = tok.is_err();
tokens.push(tok);
if is_err {
break;
}
}
tokens
}
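The new `tokenize` helper stops lexing at the first error rather than draining the whole token stream: the error itself is kept (so a syntax error can still be reported as E999), but nothing after it is collected. The pattern is generic over any stream of `Result`s; a std-only sketch:

```rust
// Mimics `tokenize`: collect items up to and including the first Err, then stop.
fn collect_until_error<I, T, E>(iter: I) -> Vec<Result<T, E>>
where
    I: IntoIterator<Item = Result<T, E>>,
{
    let mut out = Vec::new();
    for item in iter {
        let is_err = item.is_err();
        out.push(item);
        if is_err {
            break;
        }
    }
    out
}

fn main() {
    let stream: Vec<Result<i32, &str>> = vec![Ok(1), Ok(2), Err("lex error"), Ok(3)];
    let tokens = collect_until_error(stream);
    // The error is kept for later reporting, but nothing after it is lexed.
    assert_eq!(tokens.len(), 3);
    assert!(tokens[2].is_err());
    println!("ok");
}
```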
pub(crate) fn check_path(
path: &Path,
contents: &str,
tokens: Vec<LexResult>,
noqa_line_for: &[usize],
settings: &Settings,
autofix: &fixer::Mode,
) -> Vec<Check> {
) -> Result<Vec<Check>> {
// Aggregate all checks.
let mut checks: Vec<Check> = vec![];
// Determine the noqa line for every line in the source.
let noqa_line_for = noqa::extract_noqa_line_for(&tokens);
// Run the AST-based checks.
if settings
.select
@@ -42,7 +54,10 @@ fn check_path(
if settings.select.contains(&CheckCode::E999) {
checks.push(Check::new(
CheckKind::SyntaxError(parse_error.error.to_string()),
parse_error.location,
Range {
location: parse_error.location,
end_location: parse_error.location,
},
))
}
}
@@ -50,9 +65,20 @@ fn check_path(
}
// Run the lines-based checks.
check_lines(&mut checks, contents, &noqa_line_for, settings, autofix);
check_lines(&mut checks, contents, noqa_line_for, settings, autofix);
checks
// Create path ignores.
if !checks.is_empty() && !settings.per_file_ignores.is_empty() {
let ignores = fs::ignores_from_path(path, &settings.per_file_ignores)?;
if !ignores.is_empty() {
return Ok(checks
.into_iter()
.filter(|check| !ignores.contains(check.kind.code()))
.collect());
}
}
Ok(checks)
}
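At the end of `check_path`, per-file ignores are applied by filtering the collected checks against the set returned by `ignores_from_path`. A std-only sketch of that filtering step (string codes stand in for `CheckCode`):

```rust
use std::collections::BTreeSet;

#[derive(Debug, PartialEq)]
struct Check {
    code: &'static str,
}

// Mirrors the new tail of `check_path`: drop checks whose code is ignored for this file.
fn apply_ignores(checks: Vec<Check>, ignores: &BTreeSet<&'static str>) -> Vec<Check> {
    // Skip the pass entirely when there is nothing to filter.
    if ignores.is_empty() {
        return checks;
    }
    checks
        .into_iter()
        .filter(|check| !ignores.contains(check.code))
        .collect()
}

fn main() {
    let checks = vec![Check { code: "F401" }, Check { code: "E501" }];
    let ignores: BTreeSet<_> = ["F401"].into_iter().collect();
    let kept = apply_ignores(checks, &ignores);
    assert_eq!(kept, vec![Check { code: "E501" }]);
    println!("ok");
}
```

The real code also short-circuits when `checks` or `settings.per_file_ignores` is empty, so the common case pays no extra cost.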
pub fn lint_path(
@@ -73,10 +99,13 @@ pub fn lint_path(
let contents = fs::read_file(path)?;
// Tokenize once.
let tokens: Vec<LexResult> = lexer::make_tokenizer(&contents).collect();
let tokens: Vec<LexResult> = tokenize(&contents);
// Determine the noqa line for every line in the source.
let noqa_line_for = noqa::extract_noqa_line_for(&tokens);
// Generate checks.
let mut checks = check_path(path, &contents, tokens, settings, autofix);
let mut checks = check_path(path, &contents, tokens, &noqa_line_for, settings, autofix)?;
// Apply autofix.
if matches!(autofix, fixer::Mode::Apply) {
@@ -90,6 +119,7 @@ pub fn lint_path(
kind: check.kind,
fixed: check.fix.map(|fix| fix.applied).unwrap_or_default(),
location: check.location,
end_location: check.end_location,
filename: path.to_string_lossy().to_string(),
})
.collect();
@@ -103,13 +133,20 @@ pub fn add_noqa_to_path(path: &Path, settings: &Settings) -> Result<usize> {
let contents = fs::read_file(path)?;
// Tokenize once.
let tokens: Vec<LexResult> = lexer::make_tokenizer(&contents).collect();
let tokens: Vec<LexResult> = tokenize(&contents);
// Determine the noqa line for every line in the source.
let noqa_line_for = noqa::extract_noqa_line_for(&tokens);
// Generate checks.
let checks = check_path(path, &contents, tokens, settings, &fixer::Mode::None);
let checks = check_path(
path,
&contents,
tokens,
&noqa_line_for,
settings,
&fixer::Mode::None,
)?;
add_noqa(&checks, &contents, &noqa_line_for, path)
}
@@ -119,14 +156,15 @@ mod tests {
use std::path::Path;
use anyhow::Result;
use rustpython_parser::lexer;
use regex::Regex;
use rustpython_parser::lexer::LexResult;
use crate::autofix::fixer;
use crate::checks::{Check, CheckCode};
use crate::fs;
use crate::linter;
use crate::linter::tokenize;
use crate::settings;
use crate::{fs, noqa};
fn check_path(
path: &Path,
@@ -134,10 +172,9 @@ mod tests {
autofix: &fixer::Mode,
) -> Result<Vec<Check>> {
let contents = fs::read_file(path)?;
let tokens: Vec<LexResult> = lexer::make_tokenizer(&contents).collect();
Ok(linter::check_path(
path, &contents, tokens, settings, autofix,
))
let tokens: Vec<LexResult> = tokenize(&contents);
let noqa_line_for = noqa::extract_noqa_line_for(&tokens);
linter::check_path(path, &contents, tokens, &noqa_line_for, settings, autofix)
}
#[test]
@@ -596,6 +633,21 @@ mod tests {
Ok(())
}
#[test]
fn f841_dummy_variable_rgx() -> Result<()> {
let mut checks = check_path(
Path::new("./resources/test/fixtures/F841.py"),
&settings::Settings {
dummy_variable_rgx: Regex::new(r"^z$").unwrap(),
..settings::Settings::for_rule(CheckCode::F841)
},
&fixer::Mode::Generate,
)?;
checks.sort_by_key(|check| check.location);
insta::assert_yaml_snapshot!(checks);
Ok(())
}
#[test]
fn f901() -> Result<()> {
let mut checks = check_path(
@@ -667,4 +719,64 @@ mod tests {
insta::assert_yaml_snapshot!(checks);
Ok(())
}
#[test]
fn e999() -> Result<()> {
let mut checks = check_path(
Path::new("./resources/test/fixtures/E999.py"),
&settings::Settings::for_rule(CheckCode::E999),
&fixer::Mode::Generate,
)?;
checks.sort_by_key(|check| check.location);
insta::assert_yaml_snapshot!(checks);
Ok(())
}
#[test]
fn a001() -> Result<()> {
let mut checks = check_path(
Path::new("./resources/test/fixtures/A001.py"),
&settings::Settings::for_rule(CheckCode::A001),
&fixer::Mode::Generate,
)?;
checks.sort_by_key(|check| check.location);
insta::assert_yaml_snapshot!(checks);
Ok(())
}
#[test]
fn a002() -> Result<()> {
let mut checks = check_path(
Path::new("./resources/test/fixtures/A002.py"),
&settings::Settings::for_rule(CheckCode::A002),
&fixer::Mode::Generate,
)?;
checks.sort_by_key(|check| check.location);
insta::assert_yaml_snapshot!(checks);
Ok(())
}
#[test]
fn a003() -> Result<()> {
let mut checks = check_path(
Path::new("./resources/test/fixtures/A003.py"),
&settings::Settings::for_rule(CheckCode::A003),
&fixer::Mode::Generate,
)?;
checks.sort_by_key(|check| check.location);
insta::assert_yaml_snapshot!(checks);
Ok(())
}
#[test]
fn spr001() -> Result<()> {
let mut checks = check_path(
Path::new("./resources/test/fixtures/SPR001.py"),
&settings::Settings::for_rule(CheckCode::SPR001),
&fixer::Mode::Generate,
)?;
checks.sort_by_key(|check| check.location);
insta::assert_yaml_snapshot!(checks);
Ok(())
}
}


@@ -1,5 +1,3 @@
extern crate core;
use std::io;
use std::path::{Path, PathBuf};
use std::process::ExitCode;
@@ -7,84 +5,91 @@ use std::sync::mpsc::channel;
use std::time::Instant;
use anyhow::Result;
use clap::{Parser, ValueHint};
use clap::{command, Parser};
use colored::Colorize;
use log::{debug, error};
use notify::{raw_watcher, RecursiveMode, Watcher};
use rayon::prelude::*;
use regex::Regex;
use walkdir::DirEntry;
use ::ruff::cache;
use ::ruff::checks::CheckCode;
use ::ruff::checks::CheckKind;
use ::ruff::fs::iter_python_files;
use ::ruff::linter::add_noqa_to_path;
use ::ruff::linter::lint_path;
use ::ruff::logging::set_up_logging;
use ::ruff::message::Message;
use ::ruff::printer::{Printer, SerializationFormat};
use ::ruff::pyproject;
use ::ruff::settings::{FilePattern, Settings};
use ::ruff::pyproject::{self, StrCheckCodePair};
use ::ruff::settings::CurrentSettings;
use ::ruff::settings::{FilePattern, PerFileIgnore, Settings};
use ::ruff::tell_user;
use ruff::linter::add_noqa_to_path;
const CARGO_PKG_NAME: &str = env!("CARGO_PKG_NAME");
const CARGO_PKG_VERSION: &str = env!("CARGO_PKG_VERSION");
#[derive(Debug, Parser)]
#[clap(name = format!("{CARGO_PKG_NAME} (v{CARGO_PKG_VERSION})"))]
#[clap(about = "An extremely fast Python linter.", long_about = None)]
#[clap(version)]
#[command(author, about = "ruff: An extremely fast Python linter.")]
#[command(version)]
struct Cli {
#[clap(parse(from_os_str), value_hint = ValueHint::AnyPath, required = true)]
#[arg(required = true)]
files: Vec<PathBuf>,
/// Enable verbose logging.
#[clap(short, long, action)]
#[arg(short, long)]
verbose: bool,
/// Disable all logging (but still exit with status code "1" upon detecting errors).
#[clap(short, long, action)]
#[arg(short, long)]
quiet: bool,
/// Exit with status code "0", even upon detecting errors.
#[clap(short, long, action)]
#[arg(short, long)]
exit_zero: bool,
/// Run in watch mode by re-running whenever files change.
#[clap(short, long, action)]
#[arg(short, long)]
watch: bool,
/// Attempt to automatically fix lint errors.
#[clap(short, long, action)]
#[arg(short, long)]
fix: bool,
/// Disable cache reads.
#[clap(short, long, action)]
#[arg(short, long)]
no_cache: bool,
/// List of error codes to enable.
#[clap(long, multiple = true)]
#[arg(long, value_delimiter = ',')]
select: Vec<CheckCode>,
/// Like --select, but adds additional error codes on top of the selected ones.
#[clap(long, multiple = true)]
#[arg(long, value_delimiter = ',')]
extend_select: Vec<CheckCode>,
/// List of error codes to ignore.
#[clap(long, multiple = true)]
#[arg(long, value_delimiter = ',')]
ignore: Vec<CheckCode>,
/// Like --ignore, but adds additional error codes on top of the ignored ones.
#[clap(long, multiple = true)]
#[arg(long, value_delimiter = ',')]
extend_ignore: Vec<CheckCode>,
/// List of paths, used to exclude files and/or directories from checks.
#[clap(long, multiple = true)]
#[arg(long, value_delimiter = ',')]
exclude: Vec<String>,
/// Like --exclude, but adds additional files and directories on top of the excluded ones.
#[clap(long, multiple = true)]
#[arg(long, value_delimiter = ',')]
extend_exclude: Vec<String>,
/// List of mappings from file pattern to code to exclude
#[arg(long, value_delimiter = ',')]
per_file_ignores: Vec<StrCheckCodePair>,
/// Output serialization format for error messages.
#[clap(long, arg_enum, default_value_t=SerializationFormat::Text)]
#[arg(long, value_enum, default_value_t=SerializationFormat::Text)]
format: SerializationFormat,
/// See the files ruff will be run against with the current settings.
#[clap(long, action)]
#[arg(long)]
show_files: bool,
/// See ruff's settings.
#[clap(long, action)]
#[arg(long)]
show_settings: bool,
/// Enable automatic additions of noqa directives to failing lines.
#[clap(long, action)]
#[arg(long)]
add_noqa: bool,
/// Regular expression matching the name of dummy variables.
#[arg(long)]
dummy_variable_rgx: Option<Regex>,
}
#[cfg(feature = "update-informer")]
@@ -111,8 +116,8 @@ fn check_for_updates() {
}
}
fn show_settings(settings: &Settings) {
println!("{:#?}", settings);
fn show_settings(settings: Settings) {
println!("{:#?}", CurrentSettings::from_settings(settings));
}
fn show_files(files: &[PathBuf], settings: &Settings) {
@@ -165,6 +170,7 @@ fn run_once(
kind: CheckKind::IOError(message),
fixed: false,
location: Default::default(),
end_location: Default::default(),
filename: path.to_string_lossy().to_string(),
}]
} else {
@@ -244,14 +250,22 @@ fn inner_main() -> Result<ExitCode> {
.iter()
.map(|path| FilePattern::from_user(path, &project_root))
.collect();
let per_file_ignores: Vec<PerFileIgnore> = cli
.per_file_ignores
.into_iter()
.map(|pair| PerFileIgnore::new(pair, &project_root))
.collect();
let mut settings = Settings::from_pyproject(pyproject, project_root);
let mut settings = Settings::from_pyproject(pyproject, project_root)?;
if !exclude.is_empty() {
settings.exclude = exclude;
}
if !extend_exclude.is_empty() {
settings.extend_exclude = extend_exclude;
}
if !per_file_ignores.is_empty() {
settings.per_file_ignores = per_file_ignores;
}
if !cli.select.is_empty() {
settings.clear();
settings.select(cli.select);
@@ -265,34 +279,37 @@ fn inner_main() -> Result<ExitCode> {
if !cli.extend_ignore.is_empty() {
settings.ignore(&cli.extend_ignore);
}
if let Some(dummy_variable_rgx) = cli.dummy_variable_rgx {
settings.dummy_variable_rgx = dummy_variable_rgx;
}
if cli.show_settings && cli.show_files {
println!("Error: specify --show-settings or show-files (not both).");
eprintln!("Error: specify --show-settings or show-files (not both).");
return Ok(ExitCode::FAILURE);
}
if cli.show_settings {
show_settings(&settings);
return Ok(ExitCode::SUCCESS);
}
if cli.show_files {
show_files(&cli.files, &settings);
return Ok(ExitCode::SUCCESS);
}
if cli.show_settings {
show_settings(settings);
return Ok(ExitCode::SUCCESS);
}
cache::init()?;
let mut printer = Printer::new(cli.format, cli.verbose);
if cli.watch {
if cli.fix {
println!("Warning: --fix is not enabled in watch mode.");
eprintln!("Warning: --fix is not enabled in watch mode.");
}
if cli.add_noqa {
println!("Warning: --no-qa is not enabled in watch mode.");
eprintln!("Warning: --no-qa is not enabled in watch mode.");
}
if cli.format != SerializationFormat::Text {
println!("Warning: --format 'text' is used in watch mode.");
eprintln!("Warning: --format 'text' is used in watch mode.");
}
// Perform an initial run instantly.
@@ -354,6 +371,9 @@ fn inner_main() -> Result<ExitCode> {
fn main() -> ExitCode {
match inner_main() {
Ok(code) => code,
Err(_) => ExitCode::FAILURE,
Err(err) => {
eprintln!("{} {:?}", "error".red().bold(), err);
ExitCode::FAILURE
}
}
}


@@ -14,6 +14,7 @@ pub struct Message {
pub kind: CheckKind,
pub fixed: bool,
pub location: Location,
pub end_location: Location,
pub filename: String,
}
@@ -39,7 +40,6 @@ impl fmt::Display for Message {
f,
"{}{}{}{}{}{} {} {}",
relativize_path(Path::new(&self.filename)).white().bold(),
// self.filename.white(),
":".cyan(),
self.location.row(),
":".cyan(),


@@ -1,13 +1,14 @@
use std::cmp::{max, min};
use std::collections::{BTreeMap, BTreeSet};
use std::fs;
use std::path::Path;
use crate::checks::{Check, CheckCode};
use anyhow::Result;
use once_cell::sync::Lazy;
use regex::Regex;
use rustpython_parser::lexer::{LexResult, Tok};
use std::collections::{BTreeMap, BTreeSet};
use std::fs;
use std::path::Path;
use crate::checks::{Check, CheckCode};
static NO_QA_REGEX: Lazy<Regex> = Lazy::new(|| {
Regex::new(r"(?i)(?P<noqa>\s*# noqa(?::\s?(?P<codes>([A-Z]+[0-9]+(?:[,\s]+)?)+))?)")
@@ -18,8 +19,8 @@ static SPLIT_COMMA_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r"[,\s]").expect
#[derive(Debug)]
pub enum Directive<'a> {
None,
All(usize),
Codes(usize, Vec<&'a str>),
All(usize, usize),
Codes(usize, usize, Vec<&'a str>),
}
pub fn extract_noqa_directive(line: &str) -> Directive {
@@ -28,13 +29,14 @@ pub fn extract_noqa_directive(line: &str) -> Directive {
Some(noqa) => match caps.name("codes") {
Some(codes) => Directive::Codes(
noqa.start(),
noqa.end(),
SPLIT_COMMA_REGEX
.split(codes.as_str())
.map(|code| code.trim())
.filter(|code| !code.is_empty())
.collect(),
),
None => Directive::All(noqa.start()),
None => Directive::All(noqa.start(), noqa.end()),
},
None => Directive::None,
},
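Both `Directive` variants now carry the directive's end offset alongside its start, so `add_noqa_inner` can splice text around an existing `# noqa` comment. A std-only sketch of the same shape, using plain string search instead of the regex (so, unlike the real implementation, it is case-sensitive and takes the end offset to be the end of the line):

```rust
// Sketch of the extended Directive: both variants carry start *and* end offsets.
#[derive(Debug, PartialEq)]
enum Directive<'a> {
    None,
    All(usize, usize),
    Codes(usize, usize, Vec<&'a str>),
}

fn extract_noqa_directive(line: &str) -> Directive<'_> {
    let start = match line.find("# noqa") {
        Some(index) => index,
        None => return Directive::None,
    };
    let after = &line[start + "# noqa".len()..];
    if let Some(rest) = after.strip_prefix(':') {
        // Split the code list on commas and spaces, dropping empty fragments.
        let codes: Vec<&str> = rest
            .split(|c| c == ',' || c == ' ')
            .map(str::trim)
            .filter(|code| !code.is_empty())
            .collect();
        Directive::Codes(start, line.len(), codes)
    } else {
        Directive::All(start, line.len())
    }
}

fn main() {
    assert_eq!(extract_noqa_directive("x = 1"), Directive::None);
    assert_eq!(extract_noqa_directive("x = 1  # noqa"), Directive::All(7, 13));
    assert_eq!(
        extract_noqa_directive("x = 1  # noqa: E741, E501"),
        Directive::Codes(7, 25, vec!["E741", "E501"])
    );
    println!("ok");
}
```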
@@ -132,8 +134,8 @@ fn add_noqa_inner(
Directive::None => {
output.push_str(line);
}
Directive::All(start) => output.push_str(&line[..start]),
Directive::Codes(start, _) => output.push_str(&line[..start]),
Directive::All(start, _) => output.push_str(&line[..start]),
Directive::Codes(start, _, _) => output.push_str(&line[..start]),
};
let codes: Vec<&str> = codes.iter().map(|code| code.as_str()).collect();
output.push_str(" # noqa: ");
@@ -160,12 +162,13 @@ pub fn add_noqa(
#[cfg(test)]
mod tests {
use crate::checks::{Check, CheckKind};
use crate::ast::types::Range;
use anyhow::Result;
use rustpython_parser::ast::Location;
use rustpython_parser::lexer;
use rustpython_parser::lexer::LexResult;
use crate::checks::{Check, CheckKind};
use crate::noqa::{add_noqa_inner, extract_noqa_line_for};
#[test]
@@ -235,7 +238,10 @@ z = x + 1",
let checks = vec![Check::new(
CheckKind::UnusedVariable("x".to_string()),
Location::new(1, 1),
Range {
location: Location::new(1, 1),
end_location: Location::new(1, 1),
},
)];
let contents = "x = 1";
let noqa_line_for = vec![1];
@@ -246,11 +252,17 @@ z = x + 1",
let checks = vec![
Check::new(
CheckKind::AmbiguousVariableName("x".to_string()),
Location::new(1, 1),
Range {
location: Location::new(1, 1),
end_location: Location::new(1, 1),
},
),
Check::new(
CheckKind::UnusedVariable("x".to_string()),
Location::new(1, 1),
Range {
location: Location::new(1, 1),
end_location: Location::new(1, 1),
},
),
];
let contents = "x = 1 # noqa: E741";
@@ -262,11 +274,17 @@ z = x + 1",
let checks = vec![
Check::new(
CheckKind::AmbiguousVariableName("x".to_string()),
Location::new(1, 1),
Range {
location: Location::new(1, 1),
end_location: Location::new(1, 1),
},
),
Check::new(
CheckKind::UnusedVariable("x".to_string()),
Location::new(1, 1),
Range {
location: Location::new(1, 1),
end_location: Location::new(1, 1),
},
),
];
let contents = "x = 1 # noqa";


@@ -1,7 +1,10 @@
use anyhow::Result;
use clap::ValueEnum;
use colored::Colorize;
use rustpython_parser::ast::Location;
use serde::Serialize;
use crate::checks::{CheckCode, CheckKind};
use crate::message::Message;
use crate::tell_user;
@@ -11,6 +14,17 @@ pub enum SerializationFormat {
Json,
}
#[derive(Serialize)]
struct ExpandedMessage<'a> {
kind: &'a CheckKind,
code: &'a CheckCode,
message: String,
fixed: bool,
location: Location,
end_location: Location,
filename: &'a String,
}
pub struct Printer {
format: SerializationFormat,
verbose: bool,
@@ -31,7 +45,23 @@ impl Printer {
match self.format {
SerializationFormat::Json => {
println!("{}", serde_json::to_string_pretty(&messages)?)
println!(
"{}",
serde_json::to_string_pretty(
&messages
.iter()
.map(|m| ExpandedMessage {
kind: &m.kind,
code: m.kind.code(),
message: m.kind.body(),
fixed: m.fixed,
location: m.location,
end_location: m.end_location,
filename: &m.filename,
})
.collect::<Vec<_>>()
)?
)
}
SerializationFormat::Text => {
if !fixed.is_empty() {


@@ -1,30 +1,25 @@
use std::path::{Path, PathBuf};
use std::str::FromStr;
use anyhow::Result;
use anyhow::{anyhow, Result};
use common_path::common_path_all;
use path_absolutize::Absolutize;
use serde::Deserialize;
use serde::de;
use serde::{Deserialize, Deserializer};
use crate::checks::CheckCode;
use crate::fs;
pub fn load_config(pyproject: &Option<PathBuf>) -> Config {
pub fn load_config(pyproject: &Option<PathBuf>) -> Result<Config> {
match pyproject {
Some(pyproject) => match parse_pyproject_toml(pyproject) {
Ok(pyproject) => pyproject
.tool
.and_then(|tool| tool.ruff)
.unwrap_or_default(),
Err(e) => {
println!("Failed to load pyproject.toml: {:?}", e);
println!("Falling back to default configuration...");
Default::default()
}
},
Some(pyproject) => Ok(parse_pyproject_toml(pyproject)?
.tool
.and_then(|tool| tool.ruff)
.unwrap_or_default()),
None => {
println!("No pyproject.toml found.");
println!("Falling back to default configuration...");
Default::default()
eprintln!("No pyproject.toml found.");
eprintln!("Falling back to default configuration...");
Ok(Default::default())
}
}
}
@@ -37,6 +32,50 @@ pub struct Config {
pub extend_exclude: Option<Vec<String>>,
pub select: Option<Vec<CheckCode>>,
pub ignore: Option<Vec<CheckCode>>,
pub per_file_ignores: Option<Vec<StrCheckCodePair>>,
pub dummy_variable_rgx: Option<String>,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct StrCheckCodePair {
pub pattern: String,
pub code: CheckCode,
}
impl StrCheckCodePair {
const EXPECTED_PATTERN: &'static str = "<FilePattern>:<CheckCode> pattern";
}
impl<'de> Deserialize<'de> for StrCheckCodePair {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
let str_result = String::deserialize(deserializer)?;
Self::from_str(str_result.as_str()).map_err(|_| {
de::Error::invalid_value(
de::Unexpected::Str(str_result.as_str()),
&Self::EXPECTED_PATTERN,
)
})
}
}
impl FromStr for StrCheckCodePair {
type Err = anyhow::Error;
fn from_str(string: &str) -> Result<Self, Self::Err> {
let (pattern_str, code_string) = {
let tokens = string.split(':').collect::<Vec<_>>();
if tokens.len() != 2 {
return Err(anyhow!("Expected {}", Self::EXPECTED_PATTERN));
}
(tokens[0], tokens[1])
};
let code = CheckCode::from_str(code_string)?;
let pattern = pattern_str.into();
Ok(Self { pattern, code })
}
}
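`StrCheckCodePair::from_str` accepts exactly one `<FilePattern>:<CheckCode>` pair, rejecting inputs with zero or multiple colons or an unknown code; the same parser backs both the `--per-file-ignores` CLI flag and the pyproject.toml deserializer. A std-only sketch with a hypothetical `PatternCodePair` type and a hard-coded code list in place of `CheckCode::from_str`:

```rust
use std::str::FromStr;

// Hypothetical stand-in for StrCheckCodePair: "<pattern>:<code>", exactly one colon.
#[derive(Debug, PartialEq)]
struct PatternCodePair {
    pattern: String,
    code: String,
}

// Stand-in for the CheckCode enum's known variants.
const KNOWN_CODES: &[&str] = &["E501", "F401"];

impl FromStr for PatternCodePair {
    type Err = String;

    fn from_str(string: &str) -> Result<Self, Self::Err> {
        let tokens: Vec<&str> = string.split(':').collect();
        if tokens.len() != 2 {
            return Err("expected <FilePattern>:<CheckCode> pattern".to_string());
        }
        let (pattern, code) = (tokens[0], tokens[1]);
        if !KNOWN_CODES.contains(&code) {
            return Err(format!("unknown check code: {code}"));
        }
        Ok(Self {
            pattern: pattern.to_string(),
            code: code.to_string(),
        })
    }
}

fn main() {
    assert!("foo:E501".parse::<PatternCodePair>().is_ok());
    assert!("E501".parse::<PatternCodePair>().is_err()); // no colon
    assert!("foo:E501:E402".parse::<PatternCodePair>().is_err()); // too many colons
    assert!("foo:BAD1".parse::<PatternCodePair>().is_err()); // unknown code
    println!("ok");
}
```

Implementing `FromStr` is what lets clap parse the flag and `Deserialize` reuse the exact same validation via `Self::from_str`.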
#[derive(Debug, PartialEq, Eq, Deserialize)]
@@ -102,9 +141,11 @@ pub fn find_project_root(sources: &[PathBuf]) -> Option<PathBuf> {
mod tests {
use std::env::current_dir;
use std::path::PathBuf;
use std::str::FromStr;
use anyhow::Result;
use super::StrCheckCodePair;
use crate::checks::CheckCode;
use crate::pyproject::{
find_project_root, find_pyproject_toml, parse_pyproject_toml, Config, PyProject, Tools,
@@ -137,6 +178,8 @@ mod tests {
extend_exclude: None,
select: None,
ignore: None,
per_file_ignores: None,
dummy_variable_rgx: None,
})
})
);
@@ -157,6 +200,8 @@ line-length = 79
extend_exclude: None,
select: None,
ignore: None,
per_file_ignores: None,
dummy_variable_rgx: None,
})
})
);
@@ -177,6 +222,8 @@ exclude = ["foo.py"]
extend_exclude: None,
select: None,
ignore: None,
per_file_ignores: None,
dummy_variable_rgx: None,
})
})
);
@@ -197,6 +244,8 @@ select = ["E501"]
extend_exclude: None,
select: Some(vec![CheckCode::E501]),
ignore: None,
per_file_ignores: None,
dummy_variable_rgx: None,
})
})
);
@@ -217,6 +266,8 @@ ignore = ["E501"]
extend_exclude: None,
select: None,
ignore: Some(vec![CheckCode::E501]),
per_file_ignores: None,
dummy_variable_rgx: None,
})
})
);
@@ -281,9 +332,29 @@ other-attribute = 1
]),
select: None,
ignore: None,
per_file_ignores: None,
dummy_variable_rgx: None,
}
);
Ok(())
}
#[test]
fn str_check_code_pair_strings() {
let result = StrCheckCodePair::from_str("foo:E501");
assert!(result.is_ok());
let result = StrCheckCodePair::from_str("E501:foo");
assert!(result.is_err());
let result = StrCheckCodePair::from_str("E501");
assert!(result.is_err());
let result = StrCheckCodePair::from_str("foo");
assert!(result.is_err());
let result = StrCheckCodePair::from_str("foo:E501:E402");
assert!(result.is_err());
let result = StrCheckCodePair::from_str("**/bar:E501");
assert!(result.is_ok());
let result = StrCheckCodePair::from_str("bar:E502");
assert!(result.is_err());
}
}


@@ -2,14 +2,16 @@ use std::collections::BTreeSet;
use std::hash::{Hash, Hasher};
use std::path::{Path, PathBuf};
use anyhow::{anyhow, Result};
use glob::Pattern;
use once_cell::sync::Lazy;
use regex::Regex;
use crate::checks::{CheckCode, DEFAULT_CHECK_CODES};
use crate::fs;
use crate::pyproject::load_config;
use crate::pyproject::{load_config, StrCheckCodePair};
#[derive(Debug, Clone)]
#[derive(Debug, Clone, Hash)]
pub enum FilePattern {
Simple(&'static str),
Complex(Pattern, Option<Pattern>),
@@ -34,6 +36,20 @@ impl FilePattern {
}
}
#[derive(Debug, Clone, Hash)]
pub struct PerFileIgnore {
pub pattern: FilePattern,
pub code: CheckCode,
}
impl PerFileIgnore {
pub fn new(user_in: StrCheckCodePair, project_root: &Option<PathBuf>) -> Self {
let pattern = FilePattern::from_user(user_in.pattern.as_str(), project_root);
let code = user_in.code;
Self { pattern, code }
}
}
#[derive(Debug)]
pub struct Settings {
pub pyproject: Option<PathBuf>,
@@ -42,6 +58,8 @@ pub struct Settings {
pub exclude: Vec<FilePattern>,
pub extend_exclude: Vec<FilePattern>,
pub select: BTreeSet<CheckCode>,
pub per_file_ignores: Vec<PerFileIgnore>,
pub dummy_variable_rgx: Regex,
}
impl Settings {
@@ -53,6 +71,8 @@ impl Settings {
exclude: vec![],
extend_exclude: vec![],
select: BTreeSet::from([check_code]),
per_file_ignores: vec![],
dummy_variable_rgx: DEFAULT_DUMMY_VARIABLE_RGX.clone(),
}
}
@@ -64,6 +84,8 @@ impl Settings {
exclude: vec![],
extend_exclude: vec![],
select: BTreeSet::from_iter(check_codes),
per_file_ignores: vec![],
dummy_variable_rgx: DEFAULT_DUMMY_VARIABLE_RGX.clone(),
}
}
}
@@ -71,9 +93,13 @@ impl Settings {
impl Hash for Settings {
fn hash<H: Hasher>(&self, state: &mut H) {
self.line_length.hash(state);
self.dummy_variable_rgx.as_str().hash(state);
for value in self.select.iter() {
value.hash(state);
}
for value in self.per_file_ignores.iter() {
value.hash(state);
}
}
}
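`regex::Regex` does not implement `Hash`, so the `Settings` hash incorporates `dummy_variable_rgx.as_str()` instead: two settings with the same pattern text hash identically, keeping the cache key stable. A std-only sketch of that technique with a stand-in compiled-pattern type:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in for a compiled Regex: the compiled form isn't hashable,
// but its source pattern string is.
struct CompiledPattern {
    source: String,
}

struct Settings {
    line_length: usize,
    dummy_variable_rgx: CompiledPattern,
}

impl Hash for Settings {
    fn hash<H: Hasher>(&self, state: &mut H) {
        self.line_length.hash(state);
        // Hash the pattern text, mirroring `self.dummy_variable_rgx.as_str().hash(state)`.
        self.dummy_variable_rgx.source.hash(state);
    }
}

fn digest(settings: &Settings) -> u64 {
    let mut hasher = DefaultHasher::new();
    settings.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let a = Settings { line_length: 88, dummy_variable_rgx: CompiledPattern { source: "^_+$".into() } };
    let b = Settings { line_length: 88, dummy_variable_rgx: CompiledPattern { source: "^_+$".into() } };
    let c = Settings { line_length: 88, dummy_variable_rgx: CompiledPattern { source: "^z$".into() } };
    assert_eq!(digest(&a), digest(&b)); // same pattern text, same hash
    assert_ne!(digest(&a), digest(&c)); // different pattern, different hash
    println!("ok");
}
```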
@@ -101,9 +127,15 @@ static DEFAULT_EXCLUDE: Lazy<Vec<FilePattern>> = Lazy::new(|| {
]
});
static DEFAULT_DUMMY_VARIABLE_RGX: Lazy<Regex> =
Lazy::new(|| Regex::new("^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$").unwrap());
impl Settings {
pub fn from_pyproject(pyproject: Option<PathBuf>, project_root: Option<PathBuf>) -> Self {
let config = load_config(&pyproject);
pub fn from_pyproject(
pyproject: Option<PathBuf>,
project_root: Option<PathBuf>,
) -> Result<Self> {
let config = load_config(&pyproject)?;
let mut settings = Settings {
line_length: config.line_length.unwrap_or(88),
exclude: config
@@ -129,13 +161,27 @@ impl Settings {
} else {
BTreeSet::from_iter(DEFAULT_CHECK_CODES)
},
per_file_ignores: config
.per_file_ignores
.map(|ignore_strings| {
ignore_strings
.into_iter()
.map(|pair| PerFileIgnore::new(pair, &project_root))
.collect()
})
.unwrap_or_default(),
dummy_variable_rgx: match config.dummy_variable_rgx {
Some(pattern) => Regex::new(&pattern)
.map_err(|e| anyhow!("Invalid dummy-variable-rgx value: {e}"))?,
None => DEFAULT_DUMMY_VARIABLE_RGX.clone(),
},
pyproject,
project_root,
};
if let Some(ignore) = &config.ignore {
settings.ignore(ignore);
}
settings
Ok(settings)
}
pub fn clear(&mut self) {
@@ -154,3 +200,62 @@ impl Settings {
}
}
}
/// Struct to render user-facing exclusion patterns.
#[derive(Debug)]
#[allow(dead_code)]
pub struct Exclusion {
basename: Option<String>,
absolute: Option<String>,
}
impl Exclusion {
pub fn from_file_pattern(file_pattern: FilePattern) -> Self {
match file_pattern {
FilePattern::Simple(basename) => Exclusion {
basename: Some(basename.to_string()),
absolute: None,
},
FilePattern::Complex(absolute, basename) => Exclusion {
basename: basename.map(|pattern| pattern.to_string()),
absolute: Some(absolute.to_string()),
},
}
}
}
/// Struct to render user-facing Settings.
#[derive(Debug)]
pub struct CurrentSettings {
pub pyproject: Option<PathBuf>,
pub project_root: Option<PathBuf>,
pub line_length: usize,
pub exclude: Vec<Exclusion>,
pub extend_exclude: Vec<Exclusion>,
pub select: BTreeSet<CheckCode>,
pub per_file_ignores: Vec<PerFileIgnore>,
pub dummy_variable_rgx: Regex,
}
impl CurrentSettings {
pub fn from_settings(settings: Settings) -> Self {
Self {
pyproject: settings.pyproject,
project_root: settings.project_root,
line_length: settings.line_length,
exclude: settings
.exclude
.into_iter()
.map(Exclusion::from_file_pattern)
.collect(),
extend_exclude: settings
.extend_exclude
.into_iter()
.map(Exclusion::from_file_pattern)
.collect(),
select: settings.select,
per_file_ignores: settings.per_file_ignores,
dummy_variable_rgx: settings.dummy_variable_rgx,
}
}
}


@@ -0,0 +1,167 @@
---
source: src/linter.rs
expression: checks
---
- kind:
BuiltinVariableShadowing: sum
location:
row: 1
column: 1
end_location:
row: 1
column: 19
fix: ~
- kind:
BuiltinVariableShadowing: int
location:
row: 2
column: 1
end_location:
row: 2
column: 30
fix: ~
- kind:
BuiltinVariableShadowing: print
location:
row: 4
column: 1
end_location:
row: 4
column: 6
fix: ~
- kind:
BuiltinVariableShadowing: copyright
location:
row: 5
column: 1
end_location:
row: 5
column: 10
fix: ~
- kind:
BuiltinVariableShadowing: complex
location:
row: 6
column: 2
end_location:
row: 6
column: 14
fix: ~
- kind:
BuiltinVariableShadowing: float
location:
row: 7
column: 1
end_location:
row: 7
column: 6
fix: ~
- kind:
BuiltinVariableShadowing: object
location:
row: 7
column: 9
end_location:
row: 7
column: 15
fix: ~
- kind:
BuiltinVariableShadowing: min
location:
row: 8
column: 1
end_location:
row: 8
column: 4
fix: ~
- kind:
BuiltinVariableShadowing: max
location:
row: 8
column: 6
end_location:
row: 8
column: 9
fix: ~
- kind:
BuiltinVariableShadowing: bytes
location:
row: 10
column: 1
end_location:
row: 13
column: 1
fix: ~
- kind:
BuiltinVariableShadowing: slice
location:
row: 13
column: 1
end_location:
row: 16
column: 1
fix: ~
- kind:
BuiltinVariableShadowing: ValueError
location:
row: 18
column: 1
end_location:
row: 21
column: 1
fix: ~
- kind:
BuiltinVariableShadowing: memoryview
location:
row: 21
column: 5
end_location:
row: 21
column: 15
fix: ~
- kind:
BuiltinVariableShadowing: bytearray
location:
row: 21
column: 18
end_location:
row: 21
column: 27
fix: ~
- kind:
BuiltinVariableShadowing: str
location:
row: 24
column: 22
end_location:
row: 24
column: 25
fix: ~
- kind:
BuiltinVariableShadowing: all
location:
row: 24
column: 45
end_location:
row: 24
column: 48
fix: ~
- kind:
BuiltinVariableShadowing: any
location:
row: 24
column: 50
end_location:
row: 24
column: 53
fix: ~
- kind:
BuiltinVariableShadowing: sum
location:
row: 27
column: 8
end_location:
row: 27
column: 11
fix: ~
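Each entry above records a builtin whose name was rebound at module scope. An illustrative snippet (not the actual test fixture) of the kind of code this check flags:

```python
import builtins

# Rebinding builtin names at module scope shadows them for the rest of the file.
sum = 0             # flagged: shadows the builtin sum()
int = "not a type"  # flagged: shadows the builtin int

# The originals remain reachable through the builtins module.
total = builtins.sum([1, 2, 3])
print(total)  # → 6
```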

View File

@@ -0,0 +1,68 @@
---
source: src/linter.rs
expression: checks
---
- kind:
BuiltinArgumentShadowing: str
location:
row: 1
column: 11
end_location:
row: 1
column: 14
fix: ~
- kind:
BuiltinArgumentShadowing: type
location:
row: 1
column: 19
end_location:
row: 1
column: 23
fix: ~
- kind:
BuiltinArgumentShadowing: complex
location:
row: 1
column: 26
end_location:
row: 1
column: 33
fix: ~
- kind:
BuiltinArgumentShadowing: Exception
location:
row: 1
column: 35
end_location:
row: 1
column: 44
fix: ~
- kind:
BuiltinArgumentShadowing: getattr
location:
row: 1
column: 48
end_location:
row: 1
column: 55
fix: ~
- kind:
BuiltinArgumentShadowing: bytes
location:
row: 5
column: 17
end_location:
row: 5
column: 22
fix: ~
- kind:
BuiltinArgumentShadowing: float
location:
row: 9
column: 16
end_location:
row: 9
column: 21
fix: ~
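Unlike the variable case, these entries point at function parameters. A hypothetical example of the pattern being flagged:

```python
# Parameters named after builtins shadow them inside the function body:
# within coerce(), `str` and `type` refer to the arguments, not the builtins.
def coerce(str, type):
    return f"{str} as {type}"

print(coerce("42", "int"))  # → 42 as int
```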

View File

@@ -0,0 +1,23 @@
---
source: src/linter.rs
expression: checks
---
- kind:
BuiltinAttributeShadowing: ImportError
location:
row: 2
column: 5
end_location:
row: 2
column: 16
fix: ~
- kind:
BuiltinAttributeShadowing: str
location:
row: 7
column: 5
end_location:
row: 9
column: 1
fix: ~
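The attribute variant fires on class-level names. A hypothetical sketch of code producing checks like the two above:

```python
class Response:
    # Class attributes named after builtins shadow them in the class body.
    str = "body"        # flagged: shadows the builtin str
    ImportError = None  # flagged: shadows the builtin exception

print(Response.str)  # → body
```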

View File

@@ -4,7 +4,10 @@ expression: checks
---
- kind: ModuleImportNotAtTopOfFile
location:
- row: 20
+ row: 24
column: 1
+ end_location:
+ row: 24
+ column: 9
fix: ~

View File

@@ -8,6 +8,9 @@ expression: checks
- 88
location:
row: 5
- column: 89
+ column: 1
+ end_location:
+ row: 5
+ column: 124
fix: ~

View File

@@ -7,47 +7,71 @@ expression: checks
location:
row: 2
column: 11
end_location:
row: 2
column: 15
fix: ~
- kind:
NoneComparison: NotEq
location:
row: 5
column: 11
end_location:
row: 5
column: 15
fix: ~
- kind:
NoneComparison: Eq
location:
row: 8
column: 4
end_location:
row: 8
column: 8
fix: ~
- kind:
NoneComparison: NotEq
location:
row: 11
column: 4
end_location:
row: 11
column: 8
fix: ~
- kind:
NoneComparison: Eq
location:
row: 14
column: 14
end_location:
row: 14
column: 18
fix: ~
- kind:
NoneComparison: NotEq
location:
row: 17
column: 14
end_location:
row: 17
column: 18
fix: ~
- kind:
NoneComparison: NotEq
location:
row: 20
column: 4
end_location:
row: 20
column: 8
fix: ~
- kind:
NoneComparison: Eq
location:
row: 23
column: 4
end_location:
row: 23
column: 8
fix: ~

View File

@@ -9,6 +9,9 @@ expression: checks
location:
row: 2
column: 11
end_location:
row: 2
column: 15
fix: ~
- kind:
TrueFalseComparison:
@@ -17,6 +20,9 @@ expression: checks
location:
row: 5
column: 11
end_location:
row: 5
column: 16
fix: ~
- kind:
TrueFalseComparison:
@@ -25,6 +31,9 @@ expression: checks
location:
row: 8
column: 4
end_location:
row: 8
column: 8
fix: ~
- kind:
TrueFalseComparison:
@@ -33,6 +42,9 @@ expression: checks
location:
row: 11
column: 4
end_location:
row: 11
column: 9
fix: ~
- kind:
TrueFalseComparison:
@@ -41,6 +53,9 @@ expression: checks
location:
row: 14
column: 14
end_location:
row: 14
column: 18
fix: ~
- kind:
TrueFalseComparison:
@@ -49,6 +64,9 @@ expression: checks
location:
row: 17
column: 14
end_location:
row: 17
column: 19
fix: ~
- kind:
TrueFalseComparison:
@@ -57,6 +75,9 @@ expression: checks
location:
row: 20
column: 20
end_location:
row: 20
column: 24
fix: ~
- kind:
TrueFalseComparison:
@@ -65,6 +86,9 @@ expression: checks
location:
row: 20
column: 44
end_location:
row: 20
column: 49
fix: ~
- kind:
TrueFalseComparison:
@@ -73,5 +97,8 @@ expression: checks
location:
row: 22
column: 5
end_location:
row: 22
column: 9
fix: ~

View File

@@ -6,25 +6,40 @@ expression: checks
location:
row: 2
column: 10
end_location:
row: 2
column: 14
fix: ~
- kind: NotInTest
location:
row: 5
column: 12
end_location:
row: 5
column: 16
fix: ~
- kind: NotInTest
location:
row: 8
column: 10
end_location:
row: 8
column: 14
fix: ~
- kind: NotInTest
location:
row: 11
column: 25
end_location:
row: 11
column: 29
fix: ~
- kind: NotInTest
location:
row: 14
column: 11
end_location:
row: 14
column: 15
fix: ~

View File

@@ -6,15 +6,24 @@ expression: checks
location:
row: 2
column: 10
end_location:
row: 2
column: 14
fix: ~
- kind: NotIsTest
location:
row: 5
column: 12
end_location:
row: 5
column: 16
fix: ~
- kind: NotIsTest
location:
row: 8
column: 10
end_location:
row: 8
column: 23
fix: ~

View File

@@ -6,80 +6,128 @@ expression: checks
location:
row: 2
column: 14
end_location:
row: 2
column: 25
fix: ~
- kind: TypeComparison
location:
row: 5
column: 14
end_location:
row: 5
column: 25
fix: ~
- kind: TypeComparison
location:
row: 10
column: 8
end_location:
row: 10
column: 24
fix: ~
- kind: TypeComparison
location:
row: 15
column: 14
end_location:
row: 15
column: 35
fix: ~
- kind: TypeComparison
location:
row: 18
column: 18
end_location:
row: 18
column: 32
fix: ~
- kind: TypeComparison
location:
row: 18
column: 46
end_location:
row: 18
column: 59
fix: ~
- kind: TypeComparison
location:
row: 20
column: 18
end_location:
row: 20
column: 29
fix: ~
- kind: TypeComparison
location:
row: 22
column: 18
end_location:
row: 22
column: 29
fix: ~
- kind: TypeComparison
location:
row: 24
column: 18
end_location:
row: 24
column: 31
fix: ~
- kind: TypeComparison
location:
row: 26
column: 18
end_location:
row: 26
column: 30
fix: ~
- kind: TypeComparison
location:
row: 28
column: 18
end_location:
row: 28
column: 31
fix: ~
- kind: TypeComparison
location:
row: 30
column: 18
end_location:
row: 30
column: 31
fix: ~
- kind: TypeComparison
location:
row: 32
column: 18
end_location:
row: 32
column: 35
fix: ~
- kind: TypeComparison
location:
row: 34
column: 18
end_location:
row: 38
column: 2
fix: ~
- kind: TypeComparison
location:
row: 40
column: 18
end_location:
row: 40
column: 29
fix: ~
- kind: TypeComparison
location:
row: 42
column: 18
end_location:
row: 42
column: 31
fix: ~

View File

@@ -6,15 +6,24 @@ expression: checks
location:
row: 4
column: 1
end_location:
row: 7
column: 1
fix: ~
- kind: DoNotUseBareExcept
location:
row: 11
column: 1
end_location:
row: 14
column: 1
fix: ~
- kind: DoNotUseBareExcept
location:
row: 16
column: 1
end_location:
row: 19
column: 1
fix: ~

View File

@@ -6,25 +6,40 @@ expression: checks
location:
row: 2
column: 1
end_location:
row: 2
column: 20
fix: ~
- kind: DoNotAssignLambda
location:
row: 4
column: 1
end_location:
row: 4
column: 20
fix: ~
- kind: DoNotAssignLambda
location:
row: 7
column: 5
end_location:
row: 7
column: 30
fix: ~
- kind: DoNotAssignLambda
location:
row: 12
column: 1
end_location:
row: 12
column: 28
fix: ~
- kind: DoNotAssignLambda
location:
row: 16
column: 1
end_location:
row: 16
column: 26
fix: ~

View File

@@ -7,149 +7,224 @@ expression: checks
location:
row: 3
column: 1
end_location:
row: 3
column: 2
fix: ~
- kind:
AmbiguousVariableName: I
location:
row: 4
column: 1
end_location:
row: 4
column: 2
fix: ~
- kind:
AmbiguousVariableName: O
location:
row: 5
column: 1
end_location:
row: 5
column: 2
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 6
column: 1
end_location:
row: 6
column: 2
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 8
column: 4
end_location:
row: 8
column: 5
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 9
column: 5
end_location:
row: 9
column: 6
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 10
column: 5
end_location:
row: 10
column: 6
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 11
column: 5
end_location:
row: 11
column: 6
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 16
column: 5
end_location:
row: 16
column: 6
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 20
column: 8
end_location:
row: 20
column: 9
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 25
column: 5
end_location:
row: 25
column: 13
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 26
column: 5
end_location:
row: 26
column: 6
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 30
column: 5
end_location:
row: 30
column: 6
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 33
column: 9
end_location:
row: 33
column: 19
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 34
column: 9
end_location:
row: 34
column: 10
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 40
column: 8
end_location:
row: 40
column: 9
fix: ~
- kind:
AmbiguousVariableName: I
location:
row: 40
column: 14
end_location:
row: 40
column: 15
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 44
column: 8
end_location:
row: 44
column: 9
fix: ~
- kind:
AmbiguousVariableName: I
location:
row: 44
column: 16
end_location:
row: 44
column: 17
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 48
column: 9
end_location:
row: 48
column: 10
fix: ~
- kind:
AmbiguousVariableName: I
location:
row: 48
column: 14
end_location:
row: 48
column: 15
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 57
column: 16
end_location:
row: 57
column: 17
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 66
column: 20
end_location:
row: 66
column: 21
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 71
column: 1
end_location:
row: 74
column: 1
fix: ~
- kind:
AmbiguousVariableName: l
location:
row: 74
column: 5
end_location:
row: 74
column: 11
fix: ~

View File

@@ -7,17 +7,26 @@ expression: checks
location:
row: 1
column: 1
end_location:
row: 5
column: 1
fix: ~
- kind:
AmbiguousClassName: I
location:
row: 5
column: 1
end_location:
row: 9
column: 1
fix: ~
- kind:
AmbiguousClassName: O
location:
row: 9
column: 1
end_location:
row: 13
column: 1
fix: ~

View File

@@ -7,17 +7,26 @@ expression: checks
location:
row: 1
column: 1
end_location:
row: 5
column: 1
fix: ~
- kind:
AmbiguousFunctionName: I
location:
row: 5
column: 1
end_location:
row: 9
column: 1
fix: ~
- kind:
AmbiguousFunctionName: O
location:
row: 10
column: 5
end_location:
row: 14
column: 1
fix: ~

View File

@@ -0,0 +1,14 @@
---
source: src/linter.rs
expression: checks
---
- kind:
SyntaxError: Got unexpected EOF
location:
row: 2
column: 1
end_location:
row: 2
column: 1
fix: ~

View File

@@ -5,19 +5,120 @@ expression: checks
- kind:
UnusedImport: functools
location:
- row: 3
+ row: 2
column: 1
- fix: ~
+ end_location:
+ row: 2
+ column: 21
+ fix:
+ content: import os
+ location:
+ row: 2
+ column: 1
+ end_location:
+ row: 2
+ column: 21
+ applied: false
- kind:
UnusedImport: collections.OrderedDict
location:
- row: 5
+ row: 4
column: 1
- fix: ~
+ end_location:
+ row: 8
+ column: 2
+ fix:
+ content: "from collections import (\n    Counter,\n    namedtuple,\n)"
+ location:
+ row: 4
+ column: 1
+ end_location:
+ row: 8
+ column: 2
+ applied: false
- kind:
UnusedImport: logging.handlers
location:
- row: 13
+ row: 12
column: 1
- fix: ~
+ end_location:
+ row: 12
+ column: 24
+ fix:
+ content: import logging.handlers
+ location:
+ row: 12
+ column: 1
+ end_location:
+ row: 12
+ column: 24
+ applied: false
- kind:
UnusedImport: shelve
location:
row: 33
column: 5
end_location:
row: 33
column: 18
fix:
content: ""
location:
row: 33
column: 1
end_location:
row: 34
column: 1
applied: false
- kind:
UnusedImport: importlib
location:
row: 34
column: 5
end_location:
row: 34
column: 21
fix:
content: pass
location:
row: 34
column: 5
end_location:
row: 34
column: 21
applied: false
- kind:
UnusedImport: pathlib
location:
row: 38
column: 5
end_location:
row: 38
column: 19
fix:
content: ""
location:
row: 38
column: 1
end_location:
row: 39
column: 1
applied: false
- kind:
UnusedImport: pickle
location:
row: 53
column: 9
end_location:
row: 53
column: 22
fix:
content: pass
location:
row: 53
column: 9
end_location:
row: 53
column: 22
applied: false
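The `content` payloads above reveal the removal strategies from #298: rewriting a multi-import statement to keep only the used names, deleting a standalone unused import outright (empty `content`), and substituting `pass` when the import is a block's only statement. A hypothetical sketch of source code exercising two of those cases:

```python
# A line importing several modules is rewritten to keep only the used ones:
import functools, os  # functools unused → fix content becomes "import os"

print(os.sep)

# When an unused import is the only statement in its block, deleting it would
# leave an empty body, so the fix substitutes `pass` instead:
def lazy_setup():
    import importlib  # unused → fix content becomes "pass"
```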

View File

@@ -9,6 +9,9 @@ expression: checks
location:
row: 5
column: 5
end_location:
row: 5
column: 7
fix: ~
- kind:
ImportShadowedByLoopVar:
@@ -17,5 +20,8 @@ expression: checks
location:
row: 8
column: 5
end_location:
row: 8
column: 9
fix: ~

View File

@@ -7,11 +7,17 @@ expression: checks
location:
row: 1
column: 1
end_location:
row: 1
column: 19
fix: ~
- kind:
ImportStarUsed: F634
location:
row: 2
column: 1
end_location:
row: 2
column: 19
fix: ~

View File

@@ -6,5 +6,8 @@ expression: checks
location:
row: 7
column: 1
end_location:
row: 7
column: 38
fix: ~

View File

@@ -9,6 +9,9 @@ expression: checks
location:
row: 5
column: 11
end_location:
row: 5
column: 15
fix: ~
- kind:
ImportStarUsage:
@@ -17,5 +20,8 @@ expression: checks
location:
row: 11
column: 1
end_location:
row: 11
column: 8
fix: ~

View File

@@ -7,11 +7,17 @@ expression: checks
location:
row: 5
column: 5
end_location:
row: 5
column: 23
fix: ~
- kind:
ImportStarNotPermitted: F634
location:
row: 9
column: 5
end_location:
row: 9
column: 23
fix: ~

View File

@@ -7,5 +7,8 @@ expression: checks
location:
row: 2
column: 1
end_location:
row: 2
column: 44
fix: ~

View File

@@ -6,15 +6,24 @@ expression: checks
location:
row: 4
column: 7
end_location:
row: 4
column: 11
fix: ~
- kind: FStringMissingPlaceholders
location:
row: 5
column: 7
end_location:
row: 5
column: 11
fix: ~
- kind: FStringMissingPlaceholders
location:
row: 7
column: 7
end_location:
row: 7
column: 11
fix: ~

View File

@@ -6,15 +6,24 @@ expression: checks
location:
row: 3
column: 6
end_location:
row: 3
column: 8
fix: ~
- kind: MultiValueRepeatedKeyLiteral
location:
row: 9
column: 5
end_location:
row: 9
column: 6
fix: ~
- kind: MultiValueRepeatedKeyLiteral
location:
row: 11
column: 7
end_location:
row: 11
column: 11
fix: ~

View File

@@ -7,5 +7,8 @@ expression: checks
location:
row: 5
column: 5
end_location:
row: 5
column: 6
fix: ~

View File

@@ -6,5 +6,8 @@ expression: checks
location:
row: 1
column: 1
end_location:
row: 1
column: 10
fix: ~

View File

@@ -6,10 +6,16 @@ expression: checks
location:
row: 1
column: 1
end_location:
row: 1
column: 20
fix: ~
- kind: AssertTuple
location:
row: 2
column: 1
end_location:
row: 2
column: 16
fix: ~

View File

@@ -6,10 +6,16 @@ expression: checks
location:
row: 1
column: 6
end_location:
row: 1
column: 14
fix: ~
- kind: IsLiteral
location:
row: 4
column: 8
end_location:
row: 4
column: 16
fix: ~

View File

@@ -6,5 +6,8 @@ expression: checks
location:
row: 4
column: 1
end_location:
row: 4
column: 6
fix: ~

View File

@@ -6,10 +6,16 @@ expression: checks
location:
row: 1
column: 1
end_location:
row: 4
column: 1
fix: ~
- kind: IfTuple
location:
row: 7
column: 5
end_location:
row: 9
column: 5
fix: ~

View File

@@ -6,20 +6,32 @@ expression: checks
location:
row: 4
column: 5
end_location:
row: 4
column: 10
fix: ~
- kind: BreakOutsideLoop
location:
row: 16
column: 5
end_location:
row: 16
column: 10
fix: ~
- kind: BreakOutsideLoop
location:
row: 20
column: 5
end_location:
row: 20
column: 10
fix: ~
- kind: BreakOutsideLoop
location:
row: 23
column: 1
end_location:
row: 23
column: 6
fix: ~

View File

@@ -6,20 +6,32 @@ expression: checks
location:
row: 4
column: 5
end_location:
row: 4
column: 13
fix: ~
- kind: ContinueOutsideLoop
location:
row: 16
column: 5
end_location:
row: 16
column: 13
fix: ~
- kind: ContinueOutsideLoop
location:
row: 20
column: 5
end_location:
row: 20
column: 13
fix: ~
- kind: ContinueOutsideLoop
location:
row: 23
column: 1
end_location:
row: 23
column: 9
fix: ~

View File

@@ -6,15 +6,24 @@ expression: checks
location:
row: 6
column: 5
end_location:
row: 6
column: 12
fix: ~
- kind: YieldOutsideFunction
location:
row: 9
column: 1
end_location:
row: 9
column: 8
fix: ~
- kind: YieldOutsideFunction
location:
row: 10
column: 1
end_location:
row: 10
column: 13
fix: ~

View File

@@ -6,10 +6,16 @@ expression: checks
location:
row: 6
column: 5
end_location:
row: 6
column: 13
fix: ~
- kind: ReturnOutsideFunction
location:
row: 9
column: 1
end_location:
row: 9
column: 9
fix: ~

View File

@@ -6,15 +6,24 @@ expression: checks
location:
row: 3
column: 1
end_location:
row: 5
column: 1
fix: ~
- kind: DefaultExceptNotLast
location:
row: 10
column: 1
end_location:
row: 12
column: 1
fix: ~
- kind: DefaultExceptNotLast
location:
row: 19
column: 1
end_location:
row: 21
column: 1
fix: ~

View File

@@ -7,5 +7,8 @@ expression: checks
location:
row: 9
column: 13
end_location:
row: 9
column: 17
fix: ~

View File

@@ -7,47 +7,71 @@ expression: checks
location:
row: 2
column: 12
end_location:
row: 2
column: 16
fix: ~
- kind:
UndefinedName: self
location:
row: 6
column: 13
end_location:
row: 6
column: 17
fix: ~
- kind:
UndefinedName: self
location:
row: 10
column: 9
end_location:
row: 10
column: 13
fix: ~
- kind:
UndefinedName: numeric_string
location:
row: 21
column: 12
end_location:
row: 21
column: 26
fix: ~
- kind:
UndefinedName: Bar
location:
row: 58
column: 5
end_location:
row: 58
column: 9
fix: ~
- kind:
UndefinedName: TOMATO
location:
row: 83
column: 11
end_location:
row: 83
column: 17
fix: ~
- kind:
UndefinedName: B
location:
row: 87
column: 7
end_location:
row: 87
column: 11
fix: ~
- kind:
UndefinedName: B
location:
row: 89
column: 7
end_location:
row: 89
column: 9
fix: ~

View File

@@ -7,5 +7,8 @@ expression: checks
location:
row: 3
column: 1
end_location:
row: 3
column: 8
fix: ~

View File

@@ -7,5 +7,8 @@ expression: checks
location:
row: 6
column: 5
end_location:
row: 6
column: 11
fix: ~

View File

@@ -6,15 +6,24 @@ expression: checks
location:
row: 1
column: 25
end_location:
row: 1
column: 31
fix: ~
- kind: DuplicateArgumentName
location:
row: 5
column: 28
end_location:
row: 5
column: 34
fix: ~
- kind: DuplicateArgumentName
location:
row: 9
column: 27
end_location:
row: 9
column: 33
fix: ~

View File

@@ -7,29 +7,44 @@ expression: checks
location:
row: 3
column: 1
end_location:
row: 7
column: 1
fix: ~
- kind:
UnusedVariable: z
location:
row: 16
column: 5
end_location:
row: 16
column: 6
fix: ~
- kind:
UnusedVariable: foo
location:
row: 20
column: 5
end_location:
row: 20
column: 8
fix: ~
- kind:
UnusedVariable: a
location:
row: 21
column: 6
end_location:
row: 21
column: 7
fix: ~
- kind:
UnusedVariable: b
location:
row: 21
column: 9
end_location:
row: 21
column: 10
fix: ~

View File

@@ -0,0 +1,68 @@
---
source: src/linter.rs
expression: checks
---
- kind:
UnusedVariable: e
location:
row: 3
column: 1
end_location:
row: 7
column: 1
fix: ~
- kind:
UnusedVariable: foo
location:
row: 20
column: 5
end_location:
row: 20
column: 8
fix: ~
- kind:
UnusedVariable: a
location:
row: 21
column: 6
end_location:
row: 21
column: 7
fix: ~
- kind:
UnusedVariable: b
location:
row: 21
column: 9
end_location:
row: 21
column: 10
fix: ~
- kind:
UnusedVariable: _
location:
row: 35
column: 5
end_location:
row: 35
column: 6
fix: ~
- kind:
UnusedVariable: __
location:
row: 36
column: 5
end_location:
row: 36
column: 7
fix: ~
- kind:
UnusedVariable: _discarded
location:
row: 37
column: 5
end_location:
row: 37
column: 15
fix: ~

View File

@@ -5,11 +5,17 @@ expression: checks
- kind: RaiseNotImplemented
location:
row: 2
- column: 25
+ column: 11
+ end_location:
+ row: 2
+ column: 27
fix: ~
- kind: RaiseNotImplemented
location:
row: 6
column: 11
end_location:
row: 6
column: 25
fix: ~

View File

@@ -7,11 +7,25 @@ expression: checks
location:
row: 5
column: 1
- fix: ~
+ end_location:
+ row: 8
+ column: 2
+ fix:
+ content: "from models import (\n    Fruit,\n)"
+ location:
+ row: 5
+ column: 1
+ end_location:
+ row: 8
+ column: 2
+ applied: false
- kind:
UndefinedName: Bar
location:
- row: 22
+ row: 25
column: 19
+ end_location:
+ row: 25
+ column: 22
fix: ~

View File

@@ -7,12 +7,15 @@ expression: checks
location:
row: 9
column: 10
end_location:
row: 9
column: 18
fix:
content: ""
start:
location:
row: 9
column: 10
end:
end_location:
row: 9
column: 18
applied: false
@@ -21,12 +24,15 @@ expression: checks
location:
row: 13
column: 10
end_location:
row: 13
column: 24
fix:
content: ""
start:
location:
row: 13
column: 10
end:
end_location:
row: 13
column: 24
applied: false
@@ -35,12 +41,15 @@ expression: checks
location:
row: 16
column: 10
end_location:
row: 16
column: 30
fix:
content: " # noqa: F841"
start:
location:
row: 16
column: 10
end:
end_location:
row: 16
column: 30
applied: false
@@ -49,12 +58,15 @@ expression: checks
location:
row: 41
column: 4
end_location:
row: 41
column: 24
fix:
content: " # noqa: E501"
start:
location:
row: 41
column: 4
end:
end_location:
row: 41
column: 24
applied: false
@@ -63,12 +75,15 @@ expression: checks
location:
row: 49
column: 4
end_location:
row: 49
column: 18
fix:
content: ""
start:
location:
row: 49
column: 4
end:
end_location:
row: 49
column: 18
applied: false
@@ -77,12 +92,15 @@ expression: checks
location:
row: 57
column: 4
end_location:
row: 57
column: 12
fix:
content: ""
start:
location:
row: 57
column: 4
end:
end_location:
row: 57
column: 12
applied: false

View File

@@ -7,12 +7,15 @@ expression: checks
location:
row: 5
column: 9
end_location:
row: 5
column: 15
fix:
content: ""
start:
location:
row: 5
column: 8
end:
end_location:
row: 5
column: 16
applied: false
@@ -21,12 +24,15 @@ expression: checks
location:
row: 10
column: 5
end_location:
row: 10
column: 11
fix:
content: ""
start:
location:
row: 9
column: 8
end:
end_location:
row: 11
column: 2
applied: false
@@ -35,12 +41,15 @@ expression: checks
location:
row: 16
column: 5
end_location:
row: 16
column: 11
fix:
content: ""
start:
location:
row: 15
column: 8
end:
end_location:
row: 18
column: 2
applied: false
@@ -49,12 +58,15 @@ expression: checks
location:
row: 24
column: 5
end_location:
row: 24
column: 11
fix:
content: ""
start:
location:
row: 22
column: 8
end:
end_location:
row: 25
column: 2
applied: false
@@ -63,12 +75,15 @@ expression: checks
location:
row: 31
column: 5
end_location:
row: 31
column: 11
fix:
content: ""
start:
location:
row: 29
column: 8
end:
end_location:
row: 32
column: 2
applied: false
@@ -77,12 +92,15 @@ expression: checks
location:
row: 37
column: 5
end_location:
row: 37
column: 11
fix:
content: ""
start:
location:
row: 36
column: 8
end:
end_location:
row: 39
column: 2
applied: false
@@ -91,12 +109,15 @@ expression: checks
location:
row: 45
column: 5
end_location:
row: 45
column: 11
fix:
content: ""
start:
location:
row: 43
column: 8
end:
end_location:
row: 47
column: 2
applied: false
@@ -105,12 +126,15 @@ expression: checks
location:
row: 53
column: 5
end_location:
row: 53
column: 11
fix:
content: ""
start:
location:
row: 51
column: 8
end:
end_location:
row: 55
column: 2
applied: false
@@ -119,12 +143,15 @@ expression: checks
location:
row: 61
column: 5
end_location:
row: 61
column: 11
fix:
content: ""
start:
location:
row: 59
column: 8
end:
end_location:
row: 63
column: 2
applied: false
@@ -133,12 +160,15 @@ expression: checks
location:
row: 69
column: 5
end_location:
row: 69
column: 11
fix:
content: ""
start:
location:
row: 67
column: 8
end:
end_location:
row: 71
column: 2
applied: false
@@ -147,12 +177,15 @@ expression: checks
location:
row: 75
column: 12
end_location:
row: 75
column: 18
fix:
content: ""
start:
location:
row: 75
column: 10
end:
end_location:
row: 75
column: 18
applied: false
@@ -161,12 +194,15 @@ expression: checks
location:
row: 79
column: 9
end_location:
row: 79
column: 15
fix:
content: ""
start:
location:
row: 79
column: 9
end:
end_location:
row: 79
column: 17
applied: false
@@ -175,12 +211,15 @@ expression: checks
location:
row: 84
column: 5
end_location:
row: 84
column: 11
fix:
content: ""
start:
location:
row: 84
column: 5
end:
end_location:
row: 85
column: 5
applied: false
@@ -189,12 +228,15 @@ expression: checks
location:
row: 92
column: 5
end_location:
row: 92
column: 11
fix:
content: ""
start:
location:
row: 91
column: 6
end:
end_location:
row: 92
column: 11
applied: false
@@ -203,12 +245,15 @@ expression: checks
location:
row: 98
column: 5
end_location:
row: 98
column: 11
fix:
content: ""
start:
location:
row: 98
column: 5
end:
end_location:
row: 100
column: 5
applied: false
@@ -217,12 +262,15 @@ expression: checks
location:
row: 108
column: 5
end_location:
row: 108
column: 11
fix:
content: ""
start:
location:
row: 107
column: 6
end:
end_location:
row: 108
column: 11
applied: false
@@ -231,12 +279,15 @@ expression: checks
location:
row: 114
column: 13
end_location:
row: 114
column: 19
fix:
content: ""
start:
location:
row: 114
column: 12
end:
end_location:
row: 114
column: 20
applied: false
@@ -245,12 +296,15 @@ expression: checks
location:
row: 119
column: 5
end_location:
row: 119
column: 11
fix:
content: ""
start:
location:
row: 118
column: 8
end:
end_location:
row: 120
column: 2
applied: false
@@ -259,12 +313,15 @@ expression: checks
location:
row: 125
column: 5
end_location:
row: 125
column: 11
fix:
content: ""
start:
location:
row: 124
column: 8
end:
end_location:
row: 126
column: 2
applied: false
@@ -273,12 +330,15 @@ expression: checks
location:
row: 131
column: 5
end_location:
row: 131
column: 11
fix:
content: ""
start:
location:
row: 130
column: 8
end:
end_location:
row: 133
column: 2
applied: false

View File

@@ -5,27 +5,33 @@ expression: checks
- kind: NoAssertEquals
location:
row: 1
- column: 5
+ column: 1
+ end_location:
+ row: 1
+ column: 18
fix:
content: assertEqual
- start:
+ location:
row: 1
- column: 6
- end:
+ column: 2
+ end_location:
row: 1
- column: 18
+ column: 14
applied: false
- kind: NoAssertEquals
location:
row: 2
- column: 5
+ column: 1
+ end_location:
+ row: 2
+ column: 18
fix:
content: assertEqual
- start:
+ location:
row: 2
- column: 6
- end:
+ column: 2
+ end_location:
row: 2
- column: 18
+ column: 14
applied: false
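The fix here swaps the deprecated `assertEquals` alias for `assertEqual`. A minimal unittest sketch of the corrected form (the deprecated spelling appears only in a comment, since it was removed entirely in Python 3.12):

```python
import unittest

class TestArithmetic(unittest.TestCase):
    def test_add(self):
        # Flagged form: self.assertEquals(1 + 1, 2)
        # Autofixed to the canonical method name:
        self.assertEqual(1 + 1, 2)

result = TestArithmetic("test_add").run()
print(result.wasSuccessful())  # → True
```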

View File

@@ -0,0 +1,53 @@
---
source: src/linter.rs
expression: checks
---
- kind: SuperCallWithParameters
location:
row: 17
column: 18
end_location:
row: 17
column: 36
fix:
content: super()
location:
row: 17
column: 18
end_location:
row: 17
column: 36
applied: false
- kind: SuperCallWithParameters
location:
row: 18
column: 9
end_location:
row: 18
column: 27
fix:
content: super()
location:
row: 18
column: 9
end_location:
row: 18
column: 27
applied: false
- kind: SuperCallWithParameters
location:
row: 19
column: 9
end_location:
row: 22
column: 10
fix:
content: super()
location:
row: 19
column: 9
end_location:
row: 22
column: 10
applied: false
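The SPR001 fix (wired up via LibCST in #297) rewrites calls like `super(Child, self)` to the zero-argument form, which is why every `content` above is simply `super()`. A sketch of the before/after behavior:

```python
class Base:
    def __init__(self):
        self.greeting = "hello"

class Child(Base):
    def __init__(self):
        # Flagged form: super(Child, self).__init__()
        # The fix rewrites it to the zero-argument call, which Python 3
        # resolves from the enclosing class automatically.
        super().__init__()

print(Child().greeting)  # → hello
```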