# Compare commits

70 commits (abbreviated SHA1s only; the author, message, and date columns were not captured):

c15595325c, e97b1a4280, 82ec884a61, 7c1a6bce7b, 84a8b628b8, 142b627bb8, fbf231e1b8, 1dd9ccf7f6,
d601abe01b, 15d4774b6b, 293c7e00d5, c3a3195922, 39d98d3488, cd3d82213a, a9a0026f2f, da4618d77b,
1b0748d19d, 0b7fa64481, 09d593b124, adc134ced0, 6051a0c1c8, 00495e8620, ad8693e3de, 69e20c4554,
b5816634b3, e8810eae64, ba26a60e2a, 42459c35b0, 1cbd929a0a, 5f07e70762, 8963a62ec0, 4589daa0bd,
ea0274d22c, ca1129ad27, 104c63afc6, ba457c21b5, a92958f941, 1cd206285e, 5ac5b69e9f, 01fedec1e7,
ef20692149, 50046fbed3, 6798675db1, 8e5a944ce1, 1e325edfb1, 502574797f, 7a83b65fbe, 74e3cdfd7c,
2ef28f217c, 63fc912ed8, d76a47d366, b532fce792, 3ee6a90905, 0a6d2294a7, e66fb42d0b, 5165b703d9,
b40cd1fabc, 64fb0bd2cc, 2ad29089af, f41796d559, 945a9e187c, 546413defb, e371ef9b1a, c9585fe304,
535868f939, cec993aaa9, 1a32d873f0, f308f9f27e, 73dccce5f5, fc9fae6579
## .github/release.yml (vendored, new file, +19)

````diff
@@ -0,0 +1,19 @@
+# https://docs.github.com/en/repositories/releasing-projects-on-github/automatically-generated-release-notes#configuring-automatically-generated-release-notes
+changelog:
+  categories:
+    - title: Breaking Changes
+      labels:
+        - breaking
+    - title: Rules
+      labels:
+        - rule
+        - autofix
+    - title: Settings
+      labels:
+        - configuration
+    - title: Bug Fixes
+      labels:
+        - bug
+    - title: Other Changes
+      labels:
+        - "*"
````
## .gitignore (vendored, +3)

````diff
@@ -184,3 +184,6 @@ cython_debug/
 # option (not recommended) you can uncomment the following to ignore the entire idea folder.
 .idea/
 .vimspector.json
+
+# Visual Studio Code
+.vscode/
````
## .pre-commit-config.yaml

````diff
@@ -1,6 +1,6 @@
 repos:
   - repo: https://github.com/charliermarsh/ruff-pre-commit
-    rev: v0.0.237
+    rev: v0.0.239
    hooks:
      - id: ruff
        args: [--fix]
````
## BREAKING_CHANGES.md

````diff
@@ -1,6 +1,43 @@
 # Breaking Changes
 
-## 0.0.237
+## 0.0.238
+
+### `select`, `extend-select`, `ignore`, and `extend-ignore` have new semantics ([#2312](https://github.com/charliermarsh/ruff/pull/2312))
+
+Previously, the interplay between `select` and its related options could lead to unexpected
+behavior. For example, `ruff --select E501 --ignore ALL` and `ruff --select E501 --extend-ignore
+ALL` behaved differently. (See [#2312](https://github.com/charliermarsh/ruff/pull/2312) for more
+examples.)
+
+When Ruff determines the enabled rule set, it has to reconcile `select` and `ignore` from a variety
+of sources, including the current `pyproject.toml`, any inherited `pyproject.toml` files, and the
+CLI.
+
+The new semantics are such that Ruff uses the "highest-priority" `select` as the basis for the rule
+set, and then applies any `extend-select`, `ignore`, and `extend-ignore` adjustments. CLI options
+are given higher priority than `pyproject.toml` options, and the current `pyproject.toml` file is
+given higher priority than any inherited `pyproject.toml` files.
+
+`extend-select` and `extend-ignore` are no longer given "top priority"; instead, they merely append
+to the `select` and `ignore` lists, as in Flake8.
+
+This change is largely backwards compatible -- most users should experience no change in behavior.
+However, as an example of a breaking change, consider the following:
+
+```toml
+[tool.ruff]
+ignore = ["F401"]
+```
+
+Running `ruff --select F` would previously have enabled all `F` rules, apart from `F401`. Now, it
+will enable all `F` rules, including `F401`, as the command line's `--select` resets the resolution.
+
+### `remove-six-compat` (`UP016`) has been removed ([#2332](https://github.com/charliermarsh/ruff/pull/2332))
+
+The `remove-six-compat` rule has been removed. This rule was only useful for one-time Python 2-to-3
+upgrades.
+
+## 0.0.237
 
 ### `--explain`, `--clean`, and `--generate-shell-completion` are now subcommands ([#2190](https://github.com/charliermarsh/ruff/pull/2190))
````
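The resolution described in the breaking-changes entry above can be sketched in a few lines. This is an illustrative model only, not Ruff's actual implementation, and it treats rule codes as atomic (real Ruff also expands prefixes such as `F` to the individual `F` rules, and applies a default `select` when none is given):

```python
def resolve_rules(sources):
    """Resolve the enabled rule set from config sources ordered
    lowest- to highest-priority (inherited pyproject.toml ->
    current pyproject.toml -> CLI)."""
    base_idx, enabled = 0, set()
    for i, src in enumerate(sources):
        if "select" in src:
            # The highest-priority `select` seeds (and resets) the rule set.
            base_idx, enabled = i, set(src["select"])
    # Adjustments below the priority of the winning `select` are discarded;
    # the rest merely append to (or subtract from) the base set.
    for src in sources[base_idx:]:
        enabled |= set(src.get("extend_select", []))
        enabled -= set(src.get("ignore", []))
        enabled -= set(src.get("extend_ignore", []))
    return enabled

# With `select = ["E", "F"]`, `ignore = ["F401"]` in pyproject.toml,
# `ruff --select F401` resets the resolution: only F401 is enforced.
print(resolve_rules([
    {"select": ["E", "F"], "ignore": ["F401"]},
    {"select": ["F401"]},
]))  # → {'F401'}
```

Note how a `select` on the command line discards the lower-priority `ignore`, while a bare `--extend-select` leaves it in effect — matching the two worked examples in the changelog text.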
## Cargo.lock (generated)

````diff
@@ -750,7 +750,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
 
 [[package]]
 name = "flake8-to-ruff"
-version = "0.0.237"
+version = "0.0.239"
 dependencies = [
  "anyhow",
  "clap 4.1.4",
@@ -1922,7 +1922,7 @@ dependencies = [
 
 [[package]]
 name = "ruff"
-version = "0.0.237"
+version = "0.0.239"
 dependencies = [
  "anyhow",
  "bitflags",
@@ -1977,7 +1977,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_cli"
-version = "0.0.237"
+version = "0.0.239"
 dependencies = [
  "annotate-snippets 0.9.1",
  "anyhow",
@@ -2014,7 +2014,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_dev"
-version = "0.0.237"
+version = "0.0.239"
 dependencies = [
  "anyhow",
  "clap 4.1.4",
@@ -2035,7 +2035,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_macros"
-version = "0.0.237"
+version = "0.0.239"
 dependencies = [
  "once_cell",
  "proc-macro2",
````
## Cargo.toml

````diff
@@ -8,7 +8,7 @@ default-members = [".", "ruff_cli"]
 
 [package]
 name = "ruff"
-version = "0.0.237"
+version = "0.0.239"
 authors = ["Charlie Marsh <charlie.r.marsh@gmail.com>"]
 edition = "2021"
 rust-version = "1.65.0"
@@ -46,7 +46,7 @@ num-traits = "0.2.15"
 once_cell = { version = "1.16.0" }
 path-absolutize = { version = "3.0.14", features = ["once_cell_cache", "use_unix_paths_on_wasm"] }
 regex = { version = "1.6.0" }
-ruff_macros = { version = "0.0.237", path = "ruff_macros" }
+ruff_macros = { version = "0.0.239", path = "ruff_macros" }
 rustc-hash = { version = "1.1.0" }
 rustpython-ast = { features = ["unparse"], git = "https://github.com/RustPython/RustPython.git", rev = "4f38cb68e4a97aeea9eb19673803a0bd5f655383" }
 rustpython-common = { git = "https://github.com/RustPython/RustPython.git", rev = "4f38cb68e4a97aeea9eb19673803a0bd5f655383" }
````
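The same version literal is bumped in lockstep across `Cargo.toml`, the `ruff_macros` path dependency, and the generated `Cargo.lock`. A release helper along these lines (hypothetical — not a script in this repository) makes that mechanical; the demo below operates on a temporary file rather than real manifests:

```shell
# Hypothetical sketch: rewrite every quoted occurrence of the old
# version string, demonstrated on a temporary file.
old="0.0.237" new="0.0.239"
manifest="$(mktemp)"
printf 'version = "%s"\nruff_macros = { version = "%s", path = "ruff_macros" }\n' \
    "$old" "$old" > "$manifest"
sed -i "s/\"$old\"/\"$new\"/g" "$manifest"
grep -c "$new" "$manifest"  # → 2 (both lines now carry the new version)
```

After editing the real manifests, a `cargo build` would refresh the corresponding `Cargo.lock` entries for the workspace crates.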
## LICENSE (+25)

````diff
@@ -1005,3 +1005,28 @@ are:
 See the License for the specific language governing permissions and
 limitations under the License.
 """
+
+- flake8-raise, licensed as follows:
+  """
+  MIT License
+
+  Copyright (c) 2020 Jon Dufresne
+
+  Permission is hereby granted, free of charge, to any person obtaining a copy
+  of this software and associated documentation files (the "Software"), to deal
+  in the Software without restriction, including without limitation the rights
+  to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+  copies of the Software, and to permit persons to whom the Software is
+  furnished to do so, subject to the following conditions:
+
+  The above copyright notice and this permission notice shall be included in all
+  copies or substantial portions of the Software.
+
+  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+  IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+  FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+  AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+  OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+  SOFTWARE.
+  """
````
## README.md
````diff
@@ -10,6 +10,8 @@
 
 An extremely fast Python linter, written in Rust.
 
+This README is also available as [documentation](https://beta.ruff.rs/docs/).
+
 <p align="center">
   <picture align="center">
     <source media="(prefers-color-scheme: dark)" srcset="https://user-images.githubusercontent.com/1309177/212613422-7faaf278-706b-4294-ad92-236ffcab3430.svg">
````
````diff
@@ -59,6 +61,7 @@ Ruff is extremely actively developed and used in major open-source projects like
 - [Sphinx](https://github.com/sphinx-doc/sphinx)
 - [Hatch](https://github.com/pypa/hatch)
 - [Jupyter](https://github.com/jupyter-server/jupyter_server)
+- [SciPy](https://github.com/scipy/scipy)
 - [Great Expectations](https://github.com/great-expectations/great_expectations)
 - [Polars](https://github.com/pola-rs/polars)
 - [Ibis](https://github.com/ibis-project/ibis)
@@ -72,6 +75,7 @@ Ruff is extremely actively developed and used in major open-source projects like
 - [build (PyPA)](https://github.com/pypa/build)
 - [Babel](https://github.com/python-babel/babel)
 - [featuretools](https://github.com/alteryx/featuretools)
+- [meson-python](https://github.com/mesonbuild/meson-python)
 
 Read the [launch blog post](https://notes.crmarsh.com/python-tooling-could-be-much-much-faster).
 
````
````diff
@@ -154,6 +158,7 @@ developer of [Zulip](https://github.com/zulip/zulip):
 1. [pygrep-hooks (PGH)](#pygrep-hooks-pgh)
 1. [Pylint (PL)](#pylint-pl)
 1. [tryceratops (TRY)](#tryceratops-try)
+1. [flake8-raise (RSE)](#flake8-raise-rse)
 1. [Ruff-specific rules (RUF)](#ruff-specific-rules-ruf)<!-- End auto-generated table of contents. -->
 1. [Editor Integrations](#editor-integrations)
 1. [FAQ](#faq)
````
````diff
@@ -206,9 +211,10 @@ apk add ruff
 To run Ruff, try any of the following:
 
 ```shell
-ruff path/to/code/to/lint.py  # Run Ruff over `lint.py`
-ruff path/to/code/  # Run Ruff over all files in `/path/to/code` (and any subdirectories)
-ruff path/to/code/*.py  # Run Ruff over all `.py` files in `/path/to/code`
+ruff .  # Lint all files in the current directory (and any subdirectories)
+ruff path/to/code/  # Lint all files in `/path/to/code` (and any subdirectories)
+ruff path/to/code/*.py  # Lint all `.py` files in `/path/to/code`
+ruff path/to/code/to/file.py  # Lint `file.py`
 ```
 
 You can run Ruff in `--watch` mode to automatically re-run on-change:
````
````diff
@@ -222,7 +228,7 @@ Ruff also works with [pre-commit](https://pre-commit.com):
 ```yaml
 - repo: https://github.com/charliermarsh/ruff-pre-commit
   # Ruff version.
-  rev: 'v0.0.237'
+  rev: 'v0.0.239'
   hooks:
     - id: ruff
 ```
````
````diff
@@ -346,15 +352,14 @@ an equivalent schema (though the `[tool.ruff]` hierarchy can be omitted). For ex
 `pyproject.toml` described above would be represented via the following `ruff.toml`:
 
 ```toml
-# Enable Pyflakes and pycodestyle rules.
-select = ["E", "F"]
+# Enable flake8-bugbear (`B`) rules.
+select = ["E", "F", "B"]
 
 # Never enforce `E501` (line length violations).
 ignore = ["E501"]
 
-# Always autofix, but never try to fix `F401` (unused imports).
-fix = true
-unfixable = ["F401"]
+# Avoid trying to fix flake8-bugbear (`B`) violations.
+unfixable = ["B"]
 
 # Ignore `E402` (import violations) in all `__init__.py` files, and in `path/to/file.py`.
 [per-file-ignores]
````
````diff
@@ -364,25 +369,29 @@ unfixable = ["F401"]
 
 For a full list of configurable options, see the [API reference](#reference).
 
-Some common configuration settings can be provided via the command-line:
+### Command-line interface
+
+Some configuration settings can be provided via the command-line, such as those related to
+rule enablement and disablement, file discovery, logging level, and more:
 
 ```shell
-ruff path/to/code/ --select F401 --select F403
+ruff path/to/code/ --select F401 --select F403 --quiet
 ```
 
-See `ruff --help` for more:
+See `ruff help` for more on Ruff's top-level commands:
 
-<!-- Begin auto-generated cli help. -->
+<!-- Begin auto-generated command help. -->
 ```
 Ruff: An extremely fast Python linter.
 
 Usage: ruff [OPTIONS] <COMMAND>
 
 Commands:
-  check  Run Ruff on the given files or directories (default)
-  rule   Explain a rule
-  clean  Clear any caches in the current directory and any subdirectories
-  help   Print this message or the help of the given subcommand(s)
+  check   Run Ruff on the given files or directories (default)
+  rule    Explain a rule
+  linter  List all supported upstream linters
+  clean   Clear any caches in the current directory and any subdirectories
+  help    Print this message or the help of the given subcommand(s)
 
 Options:
   -h, --help  Print help
````
````diff
@@ -395,7 +404,81 @@ Log levels:
 
 For help with a specific command, see: `ruff help <command>`.
 ```
-<!-- End auto-generated cli help. -->
+<!-- End auto-generated command help. -->
+
+Or `ruff help check` for more on the linting command:
+
+<!-- Begin auto-generated subcommand help. -->
+```
+Run Ruff on the given files or directories (default)
+
+Usage: ruff check [OPTIONS] [FILES]...
+
+Arguments:
+  [FILES]...  List of files or directories to check
+
+Options:
+      --fix              Attempt to automatically fix lint violations
+      --show-source      Show violations with source code
+      --diff             Avoid writing any fixed files back; instead, output a diff for each changed file to stdout
+  -w, --watch            Run in watch mode by re-running whenever files change
+      --fix-only         Fix any fixable lint violations, but don't report on leftover violations. Implies `--fix`
+      --format <FORMAT>  Output serialization format for violations [env: RUFF_FORMAT=] [possible values: text, json, junit, grouped, github, gitlab, pylint]
+      --config <CONFIG>  Path to the `pyproject.toml` or `ruff.toml` file to use for configuration
+      --statistics       Show counts for every rule with at least one violation
+      --add-noqa         Enable automatic additions of `noqa` directives to failing lines
+      --show-files       See the files Ruff will be run against with the current settings
+      --show-settings    See the settings Ruff will use to lint a given Python file
+  -h, --help             Print help
+
+Rule selection:
+      --select <RULE_CODE>
+          Comma-separated list of rule codes to enable (or ALL, to enable all rules)
+      --ignore <RULE_CODE>
+          Comma-separated list of rule codes to disable
+      --extend-select <RULE_CODE>
+          Like --select, but adds additional rule codes on top of the selected ones
+      --per-file-ignores <PER_FILE_IGNORES>
+          List of mappings from file pattern to code to exclude
+      --fixable <RULE_CODE>
+          List of rule codes to treat as eligible for autofix. Only applicable when autofix itself is enabled (e.g., via `--fix`)
+      --unfixable <RULE_CODE>
+          List of rule codes to treat as ineligible for autofix. Only applicable when autofix itself is enabled (e.g., via `--fix`)
+
+File selection:
+      --exclude <FILE_PATTERN>         List of paths, used to omit files and/or directories from analysis
+      --extend-exclude <FILE_PATTERN>  Like --exclude, but adds additional files and directories on top of those already excluded
+      --respect-gitignore              Respect file exclusions via `.gitignore` and other standard ignore files
+      --force-exclude                  Enforce exclusions, even for paths passed to Ruff directly on the command-line
+
+Rule configuration:
+      --target-version <TARGET_VERSION>
+          The minimum Python version that should be supported
+      --line-length <LINE_LENGTH>
+          Set the line-length for length-associated rules and automatic formatting
+      --dummy-variable-rgx <DUMMY_VARIABLE_RGX>
+          Regular expression matching the name of dummy variables
+
+Miscellaneous:
+  -n, --no-cache
+          Disable cache reads
+      --isolated
+          Ignore all configuration files
+      --cache-dir <CACHE_DIR>
+          Path to the cache directory [env: RUFF_CACHE_DIR=]
+      --stdin-filename <STDIN_FILENAME>
+          The name of the file when passing it through stdin
+  -e, --exit-zero
+          Exit with status code "0", even upon detecting lint violations
+      --update-check
+          Enable or disable automatic update checks
+
+Log levels:
+  -v, --verbose  Enable verbose logging
+  -q, --quiet    Print lint violations, but nothing else
+  -s, --silent   Disable all logging (but still exit with status code "1" upon detecting lint violations)
+```
+<!-- End auto-generated subcommand help. -->
 
 ### `pyproject.toml` discovery
````
````diff
@@ -448,7 +531,34 @@ By default, Ruff will also skip any files that are omitted via `.ignore`, `.giti
 Files that are passed to `ruff` directly are always linted, regardless of the above criteria.
 For example, `ruff /path/to/excluded/file.py` will always lint `file.py`.
 
-### Ignoring errors
+### Rule resolution
+
+The set of enabled rules is controlled via the [`select`](#select) and [`ignore`](#ignore) settings,
+along with the [`extend-select`](#extend-select) and [`extend-ignore`](#extend-ignore) modifiers.
+
+To resolve the enabled rule set, Ruff may need to reconcile `select` and `ignore` from a variety
+of sources, including the current `pyproject.toml`, any inherited `pyproject.toml` files, and the
+CLI (e.g., `--select`).
+
+In those scenarios, Ruff uses the "highest-priority" `select` as the basis for the rule set, and
+then applies any `extend-select`, `ignore`, and `extend-ignore` adjustments. CLI options are given
+higher priority than `pyproject.toml` options, and the current `pyproject.toml` file is given higher
+priority than any inherited `pyproject.toml` files.
+
+For example, given the following `pyproject.toml` file:
+
+```toml
+[tool.ruff]
+select = ["E", "F"]
+ignore = ["F401"]
+```
+
+Running `ruff --select F401` would result in Ruff enforcing `F401`, and no other rules.
+
+Running `ruff --extend-select B` would result in Ruff enforcing the `E`, `F`, and `B` rules, with
+the exception of `F401`.
+
+### Suppressing errors
 
 To omit a lint rule entirely, add it to the "ignore" list via [`ignore`](#ignore) or
 [`extend-ignore`](#extend-ignore), either on the command-line or in your `pyproject.toml` file.
````
````diff
@@ -473,7 +583,7 @@ will apply to the entire string, like so:
 ```python
 """Lorem ipsum dolor sit amet.
 
-Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
+Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor.
 """  # noqa: E501
 ```
 
````
````diff
@@ -484,17 +594,7 @@ disable enforcement across the entire file.
 For targeted exclusions across entire files (e.g., "Ignore all F841 violations in
 `/path/to/file.py`"), see the [`per-file-ignores`](#per-file-ignores) configuration setting.
 
-### "Action Comments"
-
-Ruff respects `isort`'s ["Action Comments"](https://pycqa.github.io/isort/docs/configuration/action_comments.html)
-(`# isort: skip_file`, `# isort: on`, `# isort: off`, `# isort: skip`, and `# isort: split`), which
-enable selectively enabling and disabling import sorting for blocks of code and other inline
-configuration.
-
-See the [`isort` documentation](https://pycqa.github.io/isort/docs/configuration/action_comments.html)
-for more.
-
-### Automating `noqa` Directives
+#### Automatic error suppression
 
 Ruff supports several workflows to aid in `noqa` management.
````
````diff
@@ -511,6 +611,16 @@ Third, Ruff can _automatically add_ `noqa` directives to all failing lines. This
 migrating a new codebase to Ruff. You can run `ruff /path/to/file.py --add-noqa` to automatically
 add `noqa` directives to all failing lines, with the appropriate rule codes.
 
+#### Action comments
+
+Ruff respects `isort`'s [action comments](https://pycqa.github.io/isort/docs/configuration/action_comments.html)
+(`# isort: skip_file`, `# isort: on`, `# isort: off`, `# isort: skip`, and `# isort: split`), which
+enable selectively enabling and disabling import sorting for blocks of code and other inline
+configuration.
+
+See the [`isort` documentation](https://pycqa.github.io/isort/docs/configuration/action_comments.html)
+for more.
+
 <!-- End section: Configuration -->
 
 ## Supported Rules
````
````diff
@@ -720,7 +830,6 @@ For more, see [pyupgrade](https://pypi.org/project/pyupgrade/) on PyPI.
 | UP013 | convert-typed-dict-functional-to-class | Convert `{name}` from `TypedDict` functional to class syntax | 🛠 |
 | UP014 | convert-named-tuple-functional-to-class | Convert `{name}` from `NamedTuple` functional to class syntax | 🛠 |
 | UP015 | redundant-open-modes | Unnecessary open mode parameters | 🛠 |
-| UP016 | remove-six-compat | Unnecessary `six` compatibility usage | 🛠 |
 | UP017 | datetime-timezone-utc | Use `datetime.UTC` alias | 🛠 |
 | UP018 | native-literals | Unnecessary call to `{literal_type}` | 🛠 |
 | UP019 | typing-text-str-alias | `typing.Text` is deprecated, use `str` | 🛠 |
@@ -739,6 +848,7 @@ For more, see [pyupgrade](https://pypi.org/project/pyupgrade/) on PyPI.
 | UP032 | f-string | Use f-string instead of `format` call | 🛠 |
 | UP033 | functools-cache | Use `@functools.cache` instead of `@functools.lru_cache(maxsize=None)` | 🛠 |
 | UP034 | extraneous-parentheses | Avoid extraneous parentheses | 🛠 |
+| UP035 | import-replacements | Import from `{module}` instead: {names} | 🛠 |
 
 ### flake8-2020 (YTT)
 
````
````diff
@@ -948,7 +1058,7 @@ For more, see [flake8-implicit-str-concat](https://pypi.org/project/flake8-impli
 | Code | Name | Message | Fix |
 | ---- | ---- | ------- | --- |
 | ISC001 | single-line-implicit-string-concatenation | Implicitly concatenated string literals on one line | |
-| ISC002 | multi-line-implicit-string-concatenation | Implicitly concatenated string literals over continuation line | |
+| ISC002 | multi-line-implicit-string-concatenation | Implicitly concatenated string literals over multiple lines | |
 | ISC003 | explicit-string-concatenation | Explicitly concatenated string should be implicitly concatenated | |
 
 ### flake8-import-conventions (ICN)
````
````diff
@@ -1012,7 +1122,7 @@ For more, see [flake8-pytest-style](https://pypi.org/project/flake8-pytest-style
 | ---- | ---- | ------- | --- |
 | PT001 | incorrect-fixture-parentheses-style | Use `@pytest.fixture{expected_parens}` over `@pytest.fixture{actual_parens}` | 🛠 |
 | PT002 | fixture-positional-args | Configuration for fixture `{function}` specified via positional args, use kwargs | |
-| PT003 | extraneous-scope-function | `scope='function'` is implied in `@pytest.fixture()` | |
+| PT003 | extraneous-scope-function | `scope='function'` is implied in `@pytest.fixture()` | 🛠 |
 | PT004 | missing-fixture-name-underscore | Fixture `{function}` does not return anything, add leading underscore | |
 | PT005 | incorrect-fixture-name-underscore | Fixture `{function}` returns a value, remove leading underscore | |
 | PT006 | parametrize-names-wrong-type | Wrong name(s) type in `@pytest.mark.parametrize`, expected `{expected}` | 🛠 |
````
````diff
@@ -1142,13 +1252,13 @@ For more, see [flake8-use-pathlib](https://pypi.org/project/flake8-use-pathlib/)
 | PTH106 | pathlib-rmdir | `os.rmdir` should be replaced by `.rmdir()` | |
 | PTH107 | pathlib-remove | `os.remove` should be replaced by `.unlink()` | |
 | PTH108 | pathlib-unlink | `os.unlink` should be replaced by `.unlink()` | |
-| PTH109 | pathlib-getcwd | `os.getcwd()` should be replaced by `Path.cwd()` | |
+| PTH109 | pathlib-getcwd | `os.getcwd` should be replaced by `Path.cwd()` | |
 | PTH110 | pathlib-exists | `os.path.exists` should be replaced by `.exists()` | |
 | PTH111 | pathlib-expanduser | `os.path.expanduser` should be replaced by `.expanduser()` | |
 | PTH112 | pathlib-is-dir | `os.path.isdir` should be replaced by `.is_dir()` | |
 | PTH113 | pathlib-is-file | `os.path.isfile` should be replaced by `.is_file()` | |
 | PTH114 | pathlib-is-link | `os.path.islink` should be replaced by `.is_symlink()` | |
-| PTH115 | pathlib-readlink | `os.readlink(` should be replaced by `.readlink()` | |
+| PTH115 | pathlib-readlink | `os.readlink` should be replaced by `.readlink()` | |
 | PTH116 | pathlib-stat | `os.stat` should be replaced by `.stat()` or `.owner()` or `.group()` | |
 | PTH117 | pathlib-is-abs | `os.path.isabs` should be replaced by `.is_absolute()` | |
 | PTH118 | pathlib-join | `os.path.join` should be replaced by foo_path / "bar" | |
````
````diff
@@ -1173,7 +1283,7 @@ For more, see [pandas-vet](https://pypi.org/project/pandas-vet/) on PyPI.
 
 | Code | Name | Message | Fix |
 | ---- | ---- | ------- | --- |
-| PD002 | use-of-inplace-argument | `inplace=True` should be avoided; it has inconsistent behavior | |
+| PD002 | use-of-inplace-argument | `inplace=True` should be avoided; it has inconsistent behavior | 🛠 |
 | PD003 | use-of-dot-is-null | `.isna` is preferred to `.isnull`; functionality is equivalent | |
 | PD004 | use-of-dot-not-null | `.notna` is preferred to `.notnull`; functionality is equivalent | |
 | PD007 | use-of-dot-ix | `.ix` is deprecated; use more explicit `.loc` or `.iloc` | |
````
````diff
@@ -1222,6 +1332,7 @@ For more, see [Pylint](https://pypi.org/project/pylint/) on PyPI.
 | PLR0133 | constant-comparison | Two constants compared in a comparison, consider replacing `{left_constant} {op} {right_constant}` | |
 | PLR0206 | property-with-parameters | Cannot have defined parameters for properties | |
 | PLR0402 | consider-using-from-import | Use `from {module} import {name}` in lieu of alias | |
+| PLR0913 | too-many-args | Too many arguments to function call ({c_args}/{max_args}) | |
 | PLR1701 | consider-merging-isinstance | Merge these isinstance calls: `isinstance({obj}, ({types}))` | |
 | PLR1722 | use-sys-exit | Use `sys.exit()` instead of `{name}` | 🛠 |
 | PLR2004 | magic-value-comparison | Magic value used in comparison, consider replacing {value} with a constant variable | |
````
````diff
@@ -1247,6 +1358,14 @@ For more, see [tryceratops](https://pypi.org/project/tryceratops/1.1.0/) on PyPI
 | TRY301 | raise-within-try | Abstract `raise` to an inner function | |
 | TRY400 | error-instead-of-exception | Use `logging.exception` instead of `logging.error` | |
 
+### flake8-raise (RSE)
+
+For more, see [flake8-raise](https://pypi.org/project/flake8-raise/) on PyPI.
+
+| Code | Name | Message | Fix |
+| ---- | ---- | ------- | --- |
+| RSE102 | unnecessary-paren-on-raise-exception | Unnecessary parentheses on raised exception | |
+
 ### Ruff-specific rules (RUF)
 
 | Code | Name | Message | Fix |
````
````diff
@@ -1431,30 +1550,10 @@ tools:
 
 ```lua
 local null_ls = require("null-ls")
-local methods = require("null-ls.methods")
-local helpers = require("null-ls.helpers")
-
-local function ruff_fix()
-    return helpers.make_builtin({
-        name = "ruff",
-        meta = {
-            url = "https://github.com/charliermarsh/ruff/",
-            description = "An extremely fast Python linter, written in Rust.",
-        },
-        method = methods.internal.FORMATTING,
-        filetypes = { "python" },
-        generator_opts = {
-            command = "ruff",
-            args = { "--fix", "-e", "-n", "--stdin-filename", "$FILENAME", "-" },
-            to_stdin = true
-        },
-        factory = helpers.formatter_factory
-    })
-end
 
 null_ls.setup({
     sources = {
-        ruff_fix(),
+        null_ls.builtins.formatting.ruff,
         null_ls.builtins.diagnostics.ruff,
     }
 })
````
````diff
@@ -1558,6 +1657,7 @@ natively, including:
 - [`flake8-print`](https://pypi.org/project/flake8-print/)
 - [`flake8-pytest-style`](https://pypi.org/project/flake8-pytest-style/)
 - [`flake8-quotes`](https://pypi.org/project/flake8-quotes/)
+- [`flake8-raise`](https://pypi.org/project/flake8-raise/)
 - [`flake8-return`](https://pypi.org/project/flake8-return/)
 - [`flake8-simplify`](https://pypi.org/project/flake8-simplify/) ([#998](https://github.com/charliermarsh/ruff/issues/998))
 - [`flake8-super`](https://pypi.org/project/flake8-super/)
@@ -1646,6 +1746,7 @@ Today, Ruff can be used to replace Flake8 when used with any of the following pl
 - [`flake8-print`](https://pypi.org/project/flake8-print/)
 - [`flake8-pytest-style`](https://pypi.org/project/flake8-pytest-style/)
 - [`flake8-quotes`](https://pypi.org/project/flake8-quotes/)
+- [`flake8-raise`](https://pypi.org/project/flake8-raise/)
 - [`flake8-return`](https://pypi.org/project/flake8-return/)
 - [`flake8-simplify`](https://pypi.org/project/flake8-simplify/) ([#998](https://github.com/charliermarsh/ruff/issues/998))
 - [`flake8-super`](https://pypi.org/project/flake8-super/)
````
````diff
@@ -1967,7 +2068,7 @@ enforcing `RUF001`, `RUF002`, and `RUF003`.
 
 **Default value**: `[]`
 
-**Type**: `Vec<char>`
+**Type**: `list[str]`
 
 **Example usage**:
 
@@ -1987,7 +2088,7 @@ system builtins.
 
 **Default value**: `[]`
 
-**Type**: `Vec<String>`
+**Type**: `list[str]`
 
 **Example usage**:
 
@@ -2013,7 +2114,7 @@ variable, if set.
 
 **Default value**: `.ruff_cache`
 
-**Type**: `PathBuf`
+**Type**: `str`
 
 **Example usage**:
 
````
````diff
@@ -2032,7 +2133,7 @@ default expression matches `_`, `__`, and `_var`, but not `_var_`.
 
 **Default value**: `"^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"`
 
-**Type**: `Regex`
+**Type**: `re.Pattern`
 
 **Example usage**:
 
````
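The default pattern's behavior is easy to verify directly with Python's `re` module (a quick check mirroring the examples in the setting's description: `_`, `__`, and `_var` match, `_var_` does not):

```python
import re

# Default `dummy-variable-rgx` from the settings reference.
DUMMY = re.compile(r"^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$")

# `_`, `__`, and `_var` are treated as dummy variables...
for name in ("_", "__", "_var"):
    assert DUMMY.match(name), name
# ...but `_var_` is not: the second alternative requires the final
# character to be alphanumeric, so a trailing underscore cannot match.
assert DUMMY.match("_var_") is None
```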
````diff
@@ -2065,7 +2166,7 @@ Note that you'll typically want to use
 
 **Default value**: `[".bzr", ".direnv", ".eggs", ".git", ".hg", ".mypy_cache", ".nox", ".pants.d", ".ruff_cache", ".svn", ".tox", ".venv", "__pypackages__", "_build", "buck-out", "build", "dist", "node_modules", "venv"]`
 
-**Type**: `Vec<FilePattern>`
+**Type**: `list[str]`
 
 **Example usage**:
 
@@ -2088,7 +2189,7 @@ in the current configuration file.
 
 **Default value**: `None`
 
-**Type**: `Path`
+**Type**: `str`
 
 **Example usage**:
 
@@ -2121,7 +2222,7 @@ For more information on the glob syntax, refer to the [`globset` documentation](
 
 **Default value**: `[]`
 
-**Type**: `Vec<FilePattern>`
+**Type**: `list[str]`
 
 **Example usage**:
 
````
@@ -2138,15 +2239,12 @@ extend-exclude = ["tests", "src/bad.py"]
|
||||
A list of rule codes or prefixes to ignore, in addition to those
|
||||
specified by `ignore`.
|
||||
|
||||
Note that `extend-ignore` is applied after resolving rules from
|
||||
`ignore`/`select` and a less specific rule in `extend-ignore`
|
||||
would overwrite a more specific rule in `select`. It is
|
||||
recommended to only use `extend-ignore` when extending a
|
||||
`pyproject.toml` file via `extend`.
|
||||
This option has been **deprecated** in favor of `ignore`
|
||||
since its usage is now interchangeable with `ignore`.
|
||||
|
||||
**Default value**: `[]`
|
||||
|
||||
**Type**: `Vec<RuleSelector>`
|
||||
**Type**: `list[RuleSelector]`
|
||||
|
||||
**Example usage**:
|
||||
|
||||
@@ -2163,15 +2261,9 @@ extend-ignore = ["F841"]
|
||||
A list of rule codes or prefixes to enable, in addition to those
|
||||
specified by `select`.
|
||||
|
||||
Note that `extend-select` is applied after resolving rules from
|
||||
`ignore`/`select` and a less specific rule in `extend-select`
|
||||
would overwrite a more specific rule in `ignore`. It is
|
||||
recommended to only use `extend-select` when extending a
|
||||
`pyproject.toml` file via `extend`.
|
||||
|
||||
**Default value**: `[]`
|
||||
|
||||
**Type**: `Vec<RuleSelector>`
|
||||
**Type**: `list[RuleSelector]`
|
||||
|
||||
**Example usage**:
|
||||
|
||||
@@ -2192,7 +2284,7 @@ by Ruff.
|
||||
|
||||
**Default value**: `[]`
|
||||
|
||||
**Type**: `Vec<String>`
|
||||
**Type**: `list[str]`
|
||||
|
||||
**Example usage**:
|
||||
|
||||
@@ -2242,11 +2334,12 @@ fix-only = true

#### [`fixable`](#fixable)

-A list of rule codes or prefixes to consider autofixable.
+A list of rule codes or prefixes to consider autofixable. By default, all rules are
+considered autofixable.

-**Default value**: `["A", "ANN", "ARG", "B", "BLE", "C", "D", "E", "ERA", "F", "FBT", "I", "ICN", "N", "PGH", "PLC", "PLE", "PLR", "PLW", "Q", "RET", "RUF", "S", "T", "TID", "UP", "W", "YTT"]`
+**Default value**: `["A", "ANN", "ARG", "B", "BLE", "C", "COM", "D", "DTZ", "E", "EM", "ERA", "EXE", "F", "FBT", "G", "I", "ICN", "INP", "ISC", "N", "PD", "PGH", "PIE", "PL", "PT", "PTH", "Q", "RET", "RUF", "S", "SIM", "T", "TCH", "TID", "TRY", "UP", "W", "YTT"]`

-**Type**: `Vec<RuleSelector>`
+**Type**: `list[RuleSelector]`

**Example usage**:
@@ -2294,7 +2387,7 @@ Actions annotations), `"gitlab"` (GitLab CI code quality report), or

**Default value**: `"text"`

-**Type**: `SerializationType`
+**Type**: `"text" | "json" | "junit" | "github" | "gitlab" | "pylint"`

**Example usage**:

@@ -2318,7 +2411,7 @@ specific prefixes.

**Default value**: `[]`

-**Type**: `Vec<RuleSelector>`
+**Type**: `list[RuleSelector]`

**Example usage**:

@@ -2357,7 +2450,7 @@ The line length to use when enforcing long-lines violations (like

**Default value**: `88`

-**Type**: `usize`
+**Type**: `int`

**Example usage**:

@@ -2377,7 +2470,7 @@ contained an `__init__.py` file.

**Default value**: `[]`

-**Type**: `Vec<PathBuf>`
+**Type**: `list[str]`

**Example usage**:

@@ -2395,7 +2488,7 @@ exclude, when considering any matching files.

**Default value**: `{}`

-**Type**: `HashMap<String, Vec<RuleSelector>>`
+**Type**: `dict[str, list[RuleSelector]]`

**Example usage**:

@@ -2417,7 +2510,7 @@ file).

**Default value**: `None`

-**Type**: `String`
+**Type**: `str`

**Example usage**:

@@ -2459,7 +2552,7 @@ specific prefixes.

**Default value**: `["E", "F"]`

-**Type**: `Vec<RuleSelector>`
+**Type**: `list[RuleSelector]`

**Example usage**:

@@ -2519,7 +2612,7 @@ variables will also be expanded.

**Default value**: `["."]`

-**Type**: `Vec<PathBuf>`
+**Type**: `list[str]`

**Example usage**:

@@ -2540,7 +2633,7 @@ and instead must be specified explicitly (as seen below).

**Default value**: `"py310"`

-**Type**: `PythonVersion`
+**Type**: `"py37" | "py38" | "py39" | "py310" | "py311"`

**Example usage**:

@@ -2562,7 +2655,7 @@ detection (`ERA`), and skipped by line-length rules (`E501`) if

**Default value**: `["TODO", "FIXME", "XXX"]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -2586,7 +2679,7 @@ as ordinary Python objects.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -2603,7 +2696,7 @@ A list of rule codes or prefixes to consider non-autofix-able.

**Default value**: `[]`

-**Type**: `Vec<RuleSelector>`
+**Type**: `list[RuleSelector]`

**Example usage**:

@@ -2737,7 +2830,7 @@ A list of directories to consider temporary.

**Default value**: `["/tmp", "/var/tmp", "/dev/shm"]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -2755,7 +2848,7 @@ specified by `hardcoded-tmp-directory`.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -2775,7 +2868,7 @@ e.g., the `no-mutable-default-argument` rule (`B006`).

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -2795,7 +2888,7 @@ Ignore list of builtins.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -2814,7 +2907,7 @@ Maximum string length for string literals in exception messages.

**Default value**: `0`

-**Type**: `usize`
+**Type**: `int`

**Example usage**:
@@ -2834,6 +2927,11 @@ By default, implicit concatenations of multiline strings are

allowed (but continuation lines, delimited with a backslash, are
prohibited).

+Note that setting `allow-multiline = false` should typically be coupled
+with disabling `explicit-string-concatenation` (`ISC003`). Otherwise,
+both explicit and implicit multiline string concatenations will be seen
+as violations.

**Default value**: `true`

**Type**: `bool`
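For reference, a small sketch of what "implicit" multiline concatenation looks like at runtime. Both forms below produce the same string; the ISC rules affect only linting, not semantics:

```python
# Implicit concatenation: adjacent string literals are joined by the parser
# (flagged by the implicit-str-concat rules when spanning multiple lines).
implicit = (
    "hello "
    "world"
)

# Explicit concatenation uses the `+` operator (flagged by ISC003 when enabled).
explicit = "hello " + "world"

assert implicit == explicit == "hello world"
```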
@@ -2854,9 +2952,9 @@ allow-multiline = false

The conventional aliases for imports. These aliases can be extended by
the `extend_aliases` option.

-**Default value**: `{"altair": "alt", "matplotlib.pyplot": "plt", "numpy": "np", "pandas": "pd", "seaborn": "sns"}`
+**Default value**: `{"altair": "alt", "matplotlib": "mpl", "matplotlib.pyplot": "plt", "numpy": "np", "pandas": "pd", "seaborn": "sns", "tensorflow": "tf", "holoviews": "hv", "panel": "pn", "plotly.express": "px", "polars": "pl", "pyarrow": "pa"}`

-**Type**: `FxHashMap<String, String>`
+**Type**: `dict[str, str]`

**Example usage**:

@@ -2869,6 +2967,7 @@ altair = "alt"
numpy = "np"
pandas = "pd"
seaborn = "sns"
+scipy = "sp"
```

---

@@ -2880,7 +2979,7 @@ will be added to the `aliases` mapping.

**Default value**: `{}`

-**Type**: `FxHashMap<String, String>`
+**Type**: `dict[str, str]`

**Example usage**:
@@ -2949,7 +3048,7 @@ The following values are supported:

**Default value**: `tuple`

-**Type**: `ParametrizeNameType`
+**Type**: `"csv" | "tuple" | "list"`

**Example usage**:

@@ -2971,7 +3070,7 @@ case of multiple parameters. The following values are supported:

**Default value**: `tuple`

-**Type**: `ParametrizeValuesRowType`
+**Type**: `"tuple" | "list"`

**Example usage**:

@@ -2991,7 +3090,7 @@ The following values are supported:

**Default value**: `list`

-**Type**: `ParametrizeValuesType`
+**Type**: `"tuple" | "list"`

**Example usage**:

@@ -3015,7 +3114,7 @@ list.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3033,7 +3132,7 @@ List of exception names that require a match= parameter in a

**Default value**: `["BaseException", "Exception", "ValueError", "OSError", "IOError", "EnvironmentError", "socket.error"]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:
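The `raises-require-match-for` setting concerns `pytest.raises(..., match=...)`, whose `match` argument is searched (regex) against the string form of the raised exception. A stdlib-only sketch of that behavior, with no pytest dependency (the `raises` helper here is our own stand-in, not pytest's implementation):

```python
import re
from contextlib import contextmanager

@contextmanager
def raises(exc_type, match=None):
    """Minimal stand-in for pytest.raises(): assert that the block raises
    `exc_type` and, optionally, that `match` is found in its message."""
    try:
        yield
    except exc_type as exc:
        if match is not None and re.search(match, str(exc)) is None:
            raise AssertionError(f"{exc!r} does not match {match!r}")
    else:
        raise AssertionError(f"{exc_type.__name__} was not raised")

# `int("oops")` raises ValueError("invalid literal for int() ..."),
# so the match succeeds and the block passes silently.
with raises(ValueError, match=r"invalid literal"):
    int("oops")
```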
@@ -3068,12 +3167,11 @@ avoid-escape = false

#### [`docstring-quotes`](#docstring-quotes)

-Quote style to prefer for docstrings (either "single" (`'`) or "double"
-(`"`)).
+Quote style to prefer for docstrings (either "single" or "double").

**Default value**: `"double"`

-**Type**: `Quote`
+**Type**: `"single" | "double"`

**Example usage**:

@@ -3086,12 +3184,12 @@ docstring-quotes = "single"

#### [`inline-quotes`](#inline-quotes)

-Quote style to prefer for inline strings (either "single" (`'`) or
-"double" (`"`)).
+Quote style to prefer for inline strings (either "single" or
+"double").

**Default value**: `"double"`

-**Type**: `Quote`
+**Type**: `"single" | "double"`

**Example usage**:

@@ -3104,12 +3202,12 @@ inline-quotes = "single"

#### [`multiline-quotes`](#multiline-quotes)

-Quote style to prefer for multiline strings (either "single" (`'`) or
-"double" (`"`)).
+Quote style to prefer for multiline strings (either "single" or
+"double").

**Default value**: `"double"`

-**Type**: `Quote`
+**Type**: `"single" | "double"`

**Example usage**:
@@ -3129,7 +3227,7 @@ that extend into the parent module or beyond (`"parents"`).

**Default value**: `"parents"`

-**Type**: `Strictness`
+**Type**: `"parents" | "all"`

**Example usage**:

@@ -3149,7 +3247,7 @@ and can be circumvented via `eval` or `importlib`.

**Default value**: `{}`

-**Type**: `HashMap<String, BannedApi>`
+**Type**: `dict[str, { "msg": str }]`

**Example usage**:

@@ -3171,7 +3269,7 @@ blocks.

**Default value**: `["typing"]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3229,7 +3327,7 @@ An override list of tokens to always recognize as a Class for

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3265,7 +3363,7 @@ for `order-by-type` regardless of casing.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3283,7 +3381,7 @@ known to Ruff in advance.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3370,7 +3468,7 @@ can be identified as such via introspection of the local filesystem.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3388,7 +3486,7 @@ can be identified as such via introspection of the local filesystem.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3406,7 +3504,7 @@ section via empty lines.

**Default value**: `[]`

-**Type**: `Option<Vec<ImportType>>`
+**Type**: `list["future" | "standard-library" | "third-party" | "first-party" | "local-folder"]`

**Example usage**:

@@ -3448,7 +3546,7 @@ this to "closest-to-furthest" is equivalent to isort's `reverse-relative

**Default value**: `furthest-to-closest`

-**Type**: `RelatveImportsOrder`
+**Type**: `"furthest-to-closest" | "closest-to-furthest"`

**Example usage**:

@@ -3465,7 +3563,7 @@ Add the specified import line to all files.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3482,7 +3580,7 @@ One or more modules to exclude from the single line rule.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3520,7 +3618,7 @@ for `order-by-type` regardless of casing.

**Default value**: `[]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:
@@ -3539,7 +3637,7 @@ The maximum McCabe complexity to allow before triggering `C901` errors.

**Default value**: `10`

-**Type**: `usize`
+**Type**: `int`

**Example usage**:
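McCabe complexity counts independent paths through a function. A rough stdlib sketch of the idea (our own simplification for illustration, not Ruff's implementation, which handles more node types): score 1, plus one point per branching construct.

```python
import ast

# Branching constructs that each add one path through the function.
# (A deliberate simplification of the real McCabe metric.)
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def approx_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

src = """
def f(x):
    if x > 0:
        return "positive"
    elif x < 0:
        return "negative"
    return "zero"
"""
# `elif` parses as a nested ast.If, so this function counts two branches.
assert approx_complexity(src) == 3
```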
@@ -3562,7 +3660,7 @@ expect that any method decorated by a decorator in this list takes a

**Default value**: `["classmethod"]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3580,7 +3678,7 @@ A list of names to ignore when considering `pep8-naming` violations.

**Default value**: `["setUp", "tearDown", "setUpClass", "tearDownClass", "setUpModule", "tearDownModule", "asyncSetUp", "asyncTearDown", "setUpTestData", "failureException", "longMessage", "maxDiff"]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3600,7 +3698,7 @@ expect that any method decorated by a decorator in this list has no

**Default value**: `["staticmethod"]`

-**Type**: `Vec<String>`
+**Type**: `list[str]`

**Example usage**:

@@ -3640,7 +3738,7 @@ documentation (`W505`), including standalone comments.

**Default value**: `None`

-**Type**: `usize`
+**Type**: `int`

**Example usage**:

@@ -3660,7 +3758,7 @@ defaults when analyzing docstring sections.

**Default value**: `None`

-**Type**: `Convention`
+**Type**: `"google" | "numpy" | "pep257"`

**Example usage**:

@@ -3676,11 +3774,11 @@ convention = "google"

#### [`allow-magic-value-types`](#allow-magic-value-types)

-Constant types to ignore when used as "magic values".
+Constant types to ignore when used as "magic values" (see: `PLR2004`).

-**Default value**: `["str"]`
+**Default value**: `["str", "bytes"]`

-**Type**: `Vec<ConstantType>`
+**Type**: `list["str" | "bytes" | "complex" | "float" | "int" | "tuple"]`

**Example usage**:
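`PLR2004` targets comparisons against unnamed literals; the usual fix is to compare against a named constant instead. A minimal illustration (the function and constant names are ours):

```python
# Flagged by PLR2004: the reader must guess what `200` means here.
def is_ok_magic(status: int) -> bool:
    return status == 200

# Preferred: compare against a named constant (not flagged).
HTTP_OK = 200

def is_ok_named(status: int) -> bool:
    return status == HTTP_OK

assert is_ok_magic(200) and is_ok_named(200)
```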
@@ -3691,6 +3789,23 @@ allow-magic-value-types = ["int"]

---

#### [`max-args`](#max-args)

Maximum number of arguments allowed for a function definition (see: `PLR0913`).

**Default value**: `5`

**Type**: `int`

**Example usage**:

```toml
[tool.ruff.pylint]
max-args = 5
```
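The count that `PLR0913` performs can be approximated at runtime with `inspect`; a quick sketch (the helper is our own, not Ruff's logic — Ruff inspects the AST and also honors `dummy-variable-rgx`, as the fixtures below show):

```python
import inspect

MAX_ARGS = 5  # mirrors the default `max-args` value

def too_many_args(func) -> bool:
    """Return True if `func` takes more parameters than `max-args` allows."""
    return len(inspect.signature(func).parameters) > MAX_ARGS

def ok(a, b, c, d, e): ...
def flagged(a, b, c, d, e, f, g, h): ...

assert not too_many_args(ok)
assert too_many_args(flagged)
```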
---

### `pyupgrade`

#### [`keep-runtime-typing`](#keep-runtime-typing)
4 flake8_to_ruff/Cargo.lock generated

@@ -771,7 +771,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"

[[package]]
name = "flake8_to_ruff"
-version = "0.0.237"
+version = "0.0.239"
dependencies = [
    "anyhow",
    "clap",

@@ -1975,7 +1975,7 @@ dependencies = [

[[package]]
name = "ruff"
-version = "0.0.237"
+version = "0.0.239"
dependencies = [
    "anyhow",
    "bincode",

@@ -1,6 +1,6 @@
[package]
name = "flake8-to-ruff"
-version = "0.0.237"
+version = "0.0.239"
edition = "2021"

[dependencies]

@@ -7,7 +7,7 @@ build-backend = "maturin"

[project]
name = "ruff"
-version = "0.0.237"
+version = "0.0.239"
description = "An extremely fast Python linter, written in Rust."
authors = [
    { name = "Charlie Marsh", email = "charlie.r.marsh@gmail.com" },
||||
@@ -4,6 +4,7 @@ d = {}
|
||||
safe = "s3cr3t"
|
||||
password = True
|
||||
password = safe
|
||||
password = ""
|
||||
password is True
|
||||
password == 1
|
||||
d["safe"] = "s3cr3t"
|
||||
|
||||
@@ -7,6 +7,7 @@ string = "Hello World"
|
||||
# OK
|
||||
func("s3cr3t")
|
||||
func(1, password=string)
|
||||
func(1, password="")
|
||||
func(pos="s3cr3t", password=string)
|
||||
|
||||
# Error
|
||||
|
||||
@@ -28,3 +28,7 @@ def ok_all(first, /, pos, default="posonly", *, kwonly="kwonly"):
|
||||
|
||||
def default_all(first, /, pos, secret="posonly", *, password="kwonly"):
|
||||
pass
|
||||
|
||||
|
||||
def ok_empty(first, password=""):
|
||||
pass
|
||||
|
||||
@@ -7,6 +7,13 @@ import matplotlib.pyplot  # unconventional
import numpy  # unconventional
import pandas  # unconventional
import seaborn  # unconventional
+import tensorflow  # unconventional
+import holoviews  # unconventional
+import panel  # unconventional
+import plotly.express  # unconventional
+import matplotlib  # unconventional
+import polars  # unconventional
+import pyarrow  # unconventional

import altair as altr  # unconventional
import matplotlib.pyplot as plot  # unconventional

@@ -15,6 +22,13 @@ import dask.dataframe as ddf  # unconventional
import numpy as nmp  # unconventional
import pandas as pdas  # unconventional
import seaborn as sbrn  # unconventional
+import tensorflow as tfz  # unconventional
+import holoviews as hsv  # unconventional
+import panel as pns  # unconventional
+import plotly.express as pltx  # unconventional
+import matplotlib as ml  # unconventional
+import polars as ps  # unconventional
+import pyarrow as arr  # unconventional

import altair as alt  # conventional
import dask.array as da  # conventional

@@ -23,3 +37,12 @@ import matplotlib.pyplot as plt  # conventional
import numpy as np  # conventional
import pandas as pd  # conventional
import seaborn as sns  # conventional
+import tensorflow as tf  # conventional
+import holoviews as hv  # conventional
+import panel as pn  # conventional
+import plotly.express as px  # conventional
+import matplotlib as mpl  # conventional
+import polars as pl  # conventional
+import pyarrow as pa  # conventional
+
+from tensorflow.keras import Model  # conventional
@@ -1,3 +1,21 @@
import logging
+import sys

-logging.error('Hello World', exc_info=True)
+# G201
+try:
+    pass
+except:
+    logging.error("Hello World", exc_info=True)
+
+try:
+    pass
+except:
+    logging.error("Hello World", exc_info=sys.exc_info())
+
+# OK
+try:
+    pass
+except:
+    logging.error("Hello World", exc_info=False)
+
+logging.error("Hello World", exc_info=sys.exc_info())

@@ -1,3 +1,21 @@
import logging
+import sys

-logging.exception('Hello World', exc_info=True)
+# G202
+try:
+    pass
+except:
+    logging.exception("Hello World", exc_info=True)
+
+try:
+    pass
+except:
+    logging.exception("Hello World", exc_info=sys.exc_info())
+
+# OK
+try:
+    pass
+except:
+    logging.exception("Hello World", exc_info=False)
+
+logging.exception("Hello World", exc_info=True)
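For context on the G201 fixture above: inside an `except` block, `logging.exception(...)` is equivalent to `logging.error(..., exc_info=True)`, which is why the rule prefers the shorter spelling. A short capture demo (the logger/handler setup is ours, purely for demonstration):

```python
import io
import logging

# Route log output to a string buffer so we can inspect it.
stream = io.StringIO()
logger = logging.getLogger("g201-demo")
logger.setLevel(logging.ERROR)
logger.propagate = False
logger.addHandler(logging.StreamHandler(stream))

try:
    1 / 0
except ZeroDivisionError:
    # Equivalent to logger.error("Hello World", exc_info=True),
    # but preferred by G201.
    logger.exception("Hello World")

output = stream.getvalue()
assert "Hello World" in output
assert "ZeroDivisionError" in output  # the traceback was attached
```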
0 resources/test/fixtures/flake8_no_pep420/test_pass_pyi/example.pyi vendored Normal file

@@ -14,3 +14,60 @@ def ok_other_scope():

@pytest.fixture(scope="function")
def error():
    ...


@pytest.fixture(scope="function", name="my_fixture")
def error_multiple_args():
    ...


@pytest.fixture(name="my_fixture", scope="function")
def error_multiple_args():
    ...


@pytest.fixture(name="my_fixture", scope="function", **kwargs)
def error_second_arg():
    ...


# pytest.fixture does not take positional arguments, however this
# tests the general case as we use a helper function that should
# work for all cases.
@pytest.fixture("my_fixture", scope="function")
def error_arg():
    ...


@pytest.fixture(
    scope="function",
    name="my_fixture",
)
def error_multiple_args():
    ...


@pytest.fixture(
    name="my_fixture",
    scope="function",
)
def error_multiple_args():
    ...


@pytest.fixture(
    "hello",
    name,
    *args
    ,

    # another comment ,)

    scope=\
        "function"  # some comment ),
    ,

    name2=name, name3="my_fixture", **kwargs
)
def error_multiple_args():
    ...
@@ -1,2 +1,4 @@
this_should_be_linted = "double quote string"
this_should_be_linted = u"double quote string"
+this_should_be_linted = f"double quote string"
+this_should_be_linted = f"double {'quote'} string"

@@ -4,3 +4,8 @@ this_is_fine = '"This" is a \'string\''
this_is_fine = "This is a 'string'"
this_is_fine = "\"This\" is a 'string'"
this_is_fine = r'This is a \'string\''
+this_is_fine = R'This is a \'string\''
+this_should_raise = (
+    'This is a'
+    '\'string\''
+)

27 resources/test/fixtures/flake8_quotes/doubles_implicit.py vendored Normal file

@@ -0,0 +1,27 @@
x = (
    "This"
    "is"
    "not"
)

x = (
    "This" \
    "is" \
    "not"
)

x = (
    "This"
    "is 'actually'"
    "fine"
)

x = (
    "This" \
    "is 'actually'" \
    "fine"
)

if True:
    "This can use 'double' quotes"
    "But this needs to be changed"

@@ -1,2 +1,4 @@
this_should_be_linted = 'single quote string'
this_should_be_linted = u'double quote string'
+this_should_be_linted = f'double quote string'
+this_should_be_linted = f'double {"quote"} string'

@@ -3,3 +3,8 @@ this_is_fine = "'This' is a \"string\""
this_is_fine = 'This is a "string"'
this_is_fine = '\'This\' is a "string"'
this_is_fine = r"This is a \"string\""
+this_is_fine = R"This is a \"string\""
+this_should_raise = (
+    "This is a"
+    "\"string\""
+)
27 resources/test/fixtures/flake8_quotes/singles_implicit.py vendored Normal file

@@ -0,0 +1,27 @@
x = (
    'This'
    'is'
    'not'
)

x = (
    'This' \
    'is' \
    'not'
)

x = (
    'This'
    'is "actually"'
    'fine'
)

x = (
    'This' \
    'is "actually"' \
    'fine'
)

if True:
    'This can use "single" quotes'
    'But this needs to be changed'

15 resources/test/fixtures/flake8_raise/RSE102.py vendored Normal file

@@ -0,0 +1,15 @@
try:
    y = 6 + "7"
except TypeError:
    raise ValueError()  # RSE102

try:
    x = 1 / 0
except ZeroDivisionError:
    raise

raise TypeError()  # RSE102

raise AssertionError

raise AttributeError("test message")
@@ -6,6 +6,14 @@ def f():
    return False


+def f():
+    # SIM103
+    if a == b:
+        return True
+    else:
+        return False
+
+
def f():
    # SIM103
    if a:

@@ -50,3 +58,29 @@ def f():
        return False
    else:
        return True
+
+
+def f():
+    # OK
+    if a:
+        return False
+    else:
+        return False
+
+
+def f():
+    # OK
+    if a:
+        return True
+    else:
+        return True
+
+
+def f():
+    # OK
+    def bool():
+        return False
+    if a:
+        return True
+    else:
+        return False

@@ -91,3 +91,10 @@ if True:
    b = cccccccccccccccccccccccccccccccccccc
else:
    b = ddddddddddddddddddddddddddddddddddddd
+
+
+# OK (trailing comments)
+if True:
+    exitcode = 0
+else:
+    exitcode = 1  # Trailing comment

@@ -5,3 +5,8 @@ a = True if b != c else False  # SIM210
a = True if b + c else False  # SIM210

a = False if b else True  # OK
+
+def bool():
+    return False
+
+a = True if b else False  # OK
29 resources/test/fixtures/isort/preserve_tabs.py vendored Normal file

@@ -0,0 +1,29 @@
from numpy import (
    cos,
    int8,
    int16,
    int32,
    int64,
    sin,
    tan,
    uint8,
    uint16,
    uint32,
    uint64,
)

if True:
    # inside nested block
    from numpy import (
        cos,
        int8,
        int16,
        int32,
        int64,
        sin,
        tan,
        uint8,
        uint16,
        uint32,
        uint64,
    )

13 resources/test/fixtures/isort/preserve_tabs_2.py vendored Normal file

@@ -0,0 +1,13 @@
from numpy import (
    cos,
    int8,
    int16,
    int32,
    int64,
    sin,
    tan,
    uint8,
    uint16,
    uint32,
    uint64,
)

3 resources/test/fixtures/isort/star_before_others.py vendored Normal file

@@ -0,0 +1,3 @@
from .logging import config_logging
from .settings import ENV
from .settings import *
20 resources/test/fixtures/pandas_vet/PD002.py vendored Normal file

@@ -0,0 +1,20 @@
import pandas as pd

x = pd.DataFrame()

x.drop(["a"], axis=1, inplace=True)

x.drop(["a"], axis=1, inplace=True)

x.drop(
    inplace=True,
    columns=["a"],
    axis=1,
)

if True:
    x.drop(
        inplace=True,
        columns=["a"],
        axis=1,
    )

7 resources/test/fixtures/pylint/PLE0605.py vendored

@@ -1,7 +0,0 @@
__all__ = ("CONST")  # [invalid-all-format]

__all__ = ["Hello"] + {"world"}  # [invalid-all-format]

__all__ += {"world"}  # [invalid-all-format]

__all__ = {"world"} + ["Hello"]  # [invalid-all-format]

19 resources/test/fixtures/pylint/invalid_all_format.py vendored Normal file

@@ -0,0 +1,19 @@
__all__ = "CONST"  # [invalid-all-format]

__all__ = ["Hello"] + {"world"}  # [invalid-all-format]

__all__ += {"world"}  # [invalid-all-format]

__all__ = {"world"} + ["Hello"]  # [invalid-all-format]

__all__ = (x for x in ["Hello", "world"])  # [invalid-all-format]

__all__ = {x for x in ["Hello", "world"]}  # [invalid-all-format]

__all__ = ["Hello"]

__all__ = ("Hello",)

__all__ = ["Hello"] + ("world",)

__all__ = [x for x in ["Hello", "world"]]
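The valid cases in the fixture above share one property: `__all__` ends up as a list or tuple of strings. A small sketch of that invariant (our own checker, purely illustrative; the lint inspects the AST rather than runtime values):

```python
def is_valid_all(value) -> bool:
    """True if `value` could serve as a well-formed __all__:
    a list or tuple whose elements are all strings."""
    return isinstance(value, (list, tuple)) and all(
        isinstance(item, str) for item in value
    )

assert is_valid_all(["Hello"])
assert is_valid_all(("Hello",))
assert not is_valid_all("CONST")    # a bare string is not a valid __all__
assert not is_valid_all({"world"})  # sets are rejected
```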
@@ -47,7 +47,7 @@ if input_password == "":  # correct
if input_password == ADMIN_PASSWORD:  # correct
    pass

-if input_password == "Hunter2":  # [magic-value-comparison]
+if input_password == "Hunter2":  # correct
    pass

PI = 3.141592653589793238

@@ -62,7 +62,7 @@ if pi_estimation == PI:  # correct
HELLO_WORLD = b"Hello, World!"
user_input = b"Hello, There!"

-if user_input == b"something":  # [magic-value-comparison]
+if user_input == b"something":  # correct
    pass

if user_input == HELLO_WORLD:  # correct
34 resources/test/fixtures/pylint/too_many_args.py vendored Normal file

@@ -0,0 +1,34 @@
def f(x, y, z, t, u, v, w, r):  # Too many arguments (8/5)
    pass


def f(x, y, z, t, u):  # OK
    pass


def f(x):  # OK
    pass


def f(x, y, z, _t, _u, _v, _w, r):  # OK (underscore-prefixed names are ignored)
    pass


def f(x, y, z, u=1, v=1, r=1):  # Too many arguments (6/5)
    pass


def f(x=1, y=1, z=1):  # OK
    pass


def f(x, y, z, /, u, v, w):  # OK
    pass


def f(x, y, z, *, u, v, w):  # OK
    pass


def f(x, y, z, a, b, c, *, u, v, w):  # Too many arguments (6/5)
    pass

10 resources/test/fixtures/pylint/too_many_args_params.py vendored Normal file

@@ -0,0 +1,10 @@
# Too many args (6/4) for max_args=4
# OK for dummy_variable_rgx ~ "skip_.*"
def f(x, y, z, skip_t, skip_u, skip_v):
    pass


# Too many args (6/4) for max_args=4
# Too many args (6/5) for dummy_variable_rgx ~ "skip_.*"
def f(x, y, z, t, u, v):
    pass
@@ -1,87 +0,0 @@
|
||||
# Replace names by built-in names, whether namespaced or not
# https://github.com/search?q=%22from+six+import%22&type=code
import six
from six.moves import map  # No need
from six import text_type

six.text_type  # str
six.binary_type  # bytes
six.class_types  # (type,)
six.string_types  # (str,)
six.integer_types  # (int,)
six.unichr  # chr
six.iterbytes  # iter
six.print_(...)  # print(...)
six.exec_(c, g, l)  # exec(c, g, l)
six.advance_iterator(it)  # next(it)
six.next(it)  # next(it)
six.callable(x)  # callable(x)
six.moves.range(x)  # range(x)
six.moves.xrange(x)  # range(x)
isinstance(..., six.class_types)  # isinstance(..., type)
issubclass(..., six.integer_types)  # issubclass(..., int)
isinstance(..., six.string_types)  # isinstance(..., str)

# Replace call on arg by method call on arg
six.iteritems(dct)  # dct.items()
six.iterkeys(dct)  # dct.keys()
six.itervalues(dct)  # dct.values()
six.viewitems(dct)  # dct.items()
six.viewkeys(dct)  # dct.keys()
six.viewvalues(dct)  # dct.values()
six.assertCountEqual(self, a1, a2)  # self.assertCountEqual(a1, a2)
six.assertRaisesRegex(self, e, r, fn)  # self.assertRaisesRegex(e, r, fn)
six.assertRegex(self, s, r)  # self.assertRegex(s, r)

# Replace call on arg by arg attribute
six.get_method_function(meth)  # meth.__func__
six.get_method_self(meth)  # meth.__self__
six.get_function_closure(fn)  # fn.__closure__
six.get_function_code(fn)  # fn.__code__
six.get_function_defaults(fn)  # fn.__defaults__
six.get_function_globals(fn)  # fn.__globals__

# Replace by string literal
six.b("...")  # b'...'
six.u("...")  # '...'
six.ensure_binary("...")  # b'...'
six.ensure_str("...")  # '...'
six.ensure_text("...")  # '...'
six.b(string)  # no change

# Replace by simple expression
six.get_unbound_function(meth)  # meth
six.create_unbound_method(fn, cls)  # fn

# Raise exception
six.raise_from(exc, exc_from)  # raise exc from exc_from
six.reraise(tp, exc, tb)  # raise exc.with_traceback(tb)
six.reraise(*sys.exc_info())  # raise

# Int / Bytes conversion
six.byte2int(bs)  # bs[0]
six.indexbytes(bs, i)  # bs[i]
six.int2byte(i)  # bytes((i, ))

# Special cases for next calls
next(six.iteritems(dct))  # next(iter(dct.items()))
next(six.iterkeys(dct))  # next(iter(dct.keys()))
next(six.itervalues(dct))  # next(iter(dct.values()))

# TODO: To implement


# Rewrite classes
@six.python_2_unicode_compatible  # Remove
class C(six.Iterator):
    pass  # class C: pass


class C(six.with_metaclass(M, B)):
    pass  # class C(B, metaclass=M): pass


# class C(B, metaclass=M): pass
@six.add_metaclass(M)
class C(B):
    pass
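The removed UP016 fixture pairs each `six` alias with the built-in that replaces it on Python 3. A few of those equivalences can be checked directly against the builtins, with no `six` required:

```python
# Each entry mirrors a comment in the removed fixture: the six alias on the
# left is replaced by the Python 3 builtin on the right.
equivalences = {
    "six.text_type": str,
    "six.binary_type": bytes,
    "six.unichr": chr,
    "six.callable": callable,
    "six.moves.range": range,
}

# The builtins behave as the fixture comments promise:
assert isinstance("abc", str)
assert isinstance(b"abc", bytes)
assert chr(97) == "a"
assert callable(print)
assert list(range(3)) == [0, 1, 2]

# And the method-call rewrites (six.iteritems(d) -> d.items(), etc.):
d = {"a": 1}
assert next(iter(d.items())) == ("a", 1)
```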
50
resources/test/fixtures/pyupgrade/UP035.py
vendored
Normal file
@@ -0,0 +1,50 @@
# UP035
from collections import Mapping

from collections import Mapping as MAP

from collections import Mapping, Sequence

from collections import Counter, Mapping

from collections import (Counter, Mapping)

from collections import (Counter,
                         Mapping)

from collections import Counter, \
    Mapping

from collections import Counter, Mapping, Sequence

from collections import Mapping as mapping, Counter

if True:
    from collections import Mapping, Counter

if True:
    if True:
        pass
    from collections import Mapping, Counter

if True: from collections import Mapping

import os
from collections import Counter, Mapping
import sys

if True:
    from collections import (
        Mapping,
        Callable,
        Bad,
        Good,
    )

from typing import Callable, Match, Pattern, List

if True: from collections import (
    Mapping, Counter)

# OK
from a import b
2
resources/test/fixtures/ruff/RUF100_0.py
vendored
@@ -86,3 +86,5 @@ import shelve  # noqa: RUF100
import sys  # noqa: F401, RUF100

print(sys.path)

"shape: (6,)\nSeries: '' [duration[μs]]\n[\n\t0µs\n\t1µs\n\t2µs\n\t3µs\n\t4µs\n\t5µs\n]"  # noqa: F401
1
resources/test/fixtures/ruff/RUF100_2.py
vendored
Normal file
@@ -0,0 +1 @@
import itertools  # noqa: F401
10
resources/test/fixtures/tryceratops/TRY201.py
vendored
@@ -35,6 +35,16 @@ def still_good():
        raise


def still_actually_good():
    try:
        process()
    except MyException as e:
        try:
            pass
        except TypeError:
            raise e


def bad_that_needs_recursion():
    try:
        process()
@@ -66,18 +66,8 @@
        "type": "string"
      }
    },
    "extend-ignore": {
      "description": "A list of rule codes or prefixes to ignore, in addition to those specified by `ignore`.\n\nNote that `extend-ignore` is applied after resolving rules from `ignore`/`select` and a less specific rule in `extend-ignore` would overwrite a more specific rule in `select`. It is recommended to only use `extend-ignore` when extending a `pyproject.toml` file via `extend`.",
      "type": [
        "array",
        "null"
      ],
      "items": {
        "$ref": "#/definitions/RuleSelector"
      }
    },
    "extend-select": {
      "description": "A list of rule codes or prefixes to enable, in addition to those specified by `select`.\n\nNote that `extend-select` is applied after resolving rules from `ignore`/`select` and a less specific rule in `extend-select` would overwrite a more specific rule in `ignore`. It is recommended to only use `extend-select` when extending a `pyproject.toml` file via `extend`.",
      "description": "A list of rule codes or prefixes to enable, in addition to those specified by `select`.",
      "type": [
        "array",
        "null"
@@ -111,7 +101,7 @@
      ]
    },
    "fixable": {
      "description": "A list of rule codes or prefixes to consider autofixable.",
      "description": "A list of rule codes or prefixes to consider autofixable. By default, all rules are considered autofixable.",
      "type": [
        "array",
        "null"
@@ -659,7 +649,7 @@
      "type": "object",
      "properties": {
        "allow-multiline": {
          "description": "Whether to allow implicit string concatenations for multiline strings. By default, implicit concatenations of multiline strings are allowed (but continuation lines, delimited with a backslash, are prohibited).",
          "description": "Whether to allow implicit string concatenations for multiline strings. By default, implicit concatenations of multiline strings are allowed (but continuation lines, delimited with a backslash, are prohibited).\n\nNote that setting `allow-multiline = false` should typically be coupled with disabling `explicit-string-concatenation` (`ISC003`). Otherwise, both explicit and implicit multiline string concatenations will be seen as violations.",
          "type": [
            "boolean",
            "null"
@@ -778,7 +768,7 @@
      ]
    },
    "docstring-quotes": {
      "description": "Quote style to prefer for docstrings (either \"single\" (`'`) or \"double\" (`\"`)).",
      "description": "Quote style to prefer for docstrings (either \"single\" or \"double\").",
      "anyOf": [
        {
          "$ref": "#/definitions/Quote"
@@ -789,7 +779,7 @@
      ]
    },
    "inline-quotes": {
      "description": "Quote style to prefer for inline strings (either \"single\" (`'`) or \"double\" (`\"`)).",
      "description": "Quote style to prefer for inline strings (either \"single\" or \"double\").",
      "anyOf": [
        {
          "$ref": "#/definitions/Quote"
@@ -800,7 +790,7 @@
      ]
    },
    "multiline-quotes": {
      "description": "Quote style to prefer for multiline strings (either \"single\" (`'`) or \"double\" (`\"`)).",
      "description": "Quote style to prefer for multiline strings (either \"single\" or \"double\").",
      "anyOf": [
        {
          "$ref": "#/definitions/Quote"
@@ -1164,7 +1154,7 @@
      "type": "object",
      "properties": {
        "allow-magic-value-types": {
          "description": "Constant types to ignore when used as \"magic values\".",
          "description": "Constant types to ignore when used as \"magic values\" (see: `PLR2004`).",
          "type": [
            "array",
            "null"
@@ -1172,6 +1162,15 @@
          "items": {
            "$ref": "#/definitions/ConstantType"
          }
        },
        "max-args": {
          "description": "Maximum number of arguments allowed for a function definition (see: `PLR0913`).",
          "type": [
            "integer",
            "null"
          ],
          "format": "uint",
          "minimum": 0.0
        }
      },
      "additionalProperties": false
@@ -1189,14 +1188,14 @@
    "Quote": {
      "oneOf": [
        {
          "description": "Use single quotes (`'`).",
          "description": "Use single quotes.",
          "type": "string",
          "enum": [
            "single"
          ]
        },
        {
          "description": "Use double quotes (`\"`).",
          "description": "Use double quotes.",
          "type": "string",
          "enum": [
            "double"
@@ -1641,6 +1640,9 @@
      "PLR04",
      "PLR040",
      "PLR0402",
      "PLR09",
      "PLR091",
      "PLR0913",
      "PLR1",
      "PLR17",
      "PLR170",
@@ -1737,6 +1739,10 @@
      "RET506",
      "RET507",
      "RET508",
      "RSE",
      "RSE1",
      "RSE10",
      "RSE102",
      "RUF",
      "RUF0",
      "RUF00",
@@ -1870,7 +1876,6 @@
      "UP013",
      "UP014",
      "UP015",
      "UP016",
      "UP017",
      "UP018",
      "UP019",
@@ -1891,6 +1896,7 @@
      "UP032",
      "UP033",
      "UP034",
      "UP035",
      "W",
      "W2",
      "W29",
@@ -1,6 +1,6 @@
[package]
name = "ruff_cli"
version = "0.0.237"
version = "0.0.239"
authors = ["Charlie Marsh <charlie.r.marsh@gmail.com>"]
edition = "2021"
rust-version = "1.65.0"
@@ -53,6 +53,7 @@ similar = { version = "2.2.1" }
textwrap = { version = "0.16.0" }
update-informer = { version = "0.6.0", default-features = false, features = ["pypi"], optional = true }
walkdir = { version = "2.3.2" }
strum = "0.24.1"

[dev-dependencies]
assert_cmd = { version = "2.0.4" }
@@ -5,6 +5,7 @@ use regex::Regex;
use ruff::logging::LogLevel;
use ruff::registry::Rule;
use ruff::resolver::ConfigProcessor;
use ruff::settings::configuration::RuleSelection;
use ruff::settings::types::{
    FilePattern, PatternPrefixPair, PerFileIgnore, PythonVersion, SerializationFormat,
};
@@ -41,6 +42,12 @@ pub enum Command {
        #[arg(long, value_enum, default_value = "text")]
        format: HelpFormat,
    },
    /// List all supported upstream linters
    Linter {
        /// Output format
        #[arg(long, value_enum, default_value = "text")]
        format: HelpFormat,
    },
    /// Clear any caches in the current directory and any subdirectories.
    #[clap(alias = "--clean")]
    Clean,
@@ -110,13 +117,13 @@ pub struct CheckArgs {
        help_heading = "Rule selection"
    )]
    pub extend_select: Option<Vec<RuleSelector>>,
    /// Like --ignore, but adds additional rule codes on top of the ignored
    /// ones.
    /// Like --ignore. (Deprecated: You can just use --ignore instead.)
    #[arg(
        long,
        value_delimiter = ',',
        value_name = "RULE_CODE",
        help_heading = "Rule selection"
        help_heading = "Rule selection",
        hide = true
    )]
    pub extend_ignore: Option<Vec<RuleSelector>>,
    /// List of mappings from file pattern to code to exclude
@@ -211,6 +218,15 @@ pub struct CheckArgs {
    update_check: bool,
    #[clap(long, overrides_with("update_check"), hide = true)]
    no_update_check: bool,
    /// Show counts for every rule with at least one violation.
    #[arg(
        long,
        // Unsupported default-command arguments.
        conflicts_with = "diff",
        conflicts_with = "show_source",
        conflicts_with = "watch",
    )]
    pub statistics: bool,
    /// Enable automatic additions of `noqa` directives to failing lines.
    #[arg(
        long,
@@ -218,6 +234,7 @@ pub struct CheckArgs {
        conflicts_with = "show_files",
        conflicts_with = "show_settings",
        // Unsupported default-command arguments.
        conflicts_with = "statistics",
        conflicts_with = "stdin_filename",
        conflicts_with = "watch",
    )]
@@ -230,6 +247,7 @@ pub struct CheckArgs {
        // conflicts_with = "show_files",
        conflicts_with = "show_settings",
        // Unsupported default-command arguments.
        conflicts_with = "statistics",
        conflicts_with = "stdin_filename",
        conflicts_with = "watch",
    )]
@@ -242,6 +260,7 @@ pub struct CheckArgs {
        conflicts_with = "show_files",
        // conflicts_with = "show_settings",
        // Unsupported default-command arguments.
        conflicts_with = "statistics",
        conflicts_with = "stdin_filename",
        conflicts_with = "watch",
    )]
@@ -316,6 +335,7 @@ impl CheckArgs {
            no_cache: self.no_cache,
            show_files: self.show_files,
            show_settings: self.show_settings,
            statistics: self.statistics,
            stdin_filename: self.stdin_filename,
            watch: self.watch,
        },
@@ -371,6 +391,7 @@ pub struct Arguments {
    pub no_cache: bool,
    pub show_files: bool,
    pub show_settings: bool,
    pub statistics: bool,
    pub stdin_filename: Option<PathBuf>,
    pub watch: bool,
}
@@ -422,18 +443,25 @@ impl ConfigProcessor for &Overrides {
        if let Some(fix_only) = &self.fix_only {
            config.fix_only = Some(*fix_only);
        }
        if let Some(fixable) = &self.fixable {
            config.fixable = Some(fixable.clone());
        }
        config.rule_selections.push(RuleSelection {
            select: self.select.clone(),
            ignore: self
                .ignore
                .iter()
                .cloned()
                .chain(self.extend_ignore.iter().cloned().into_iter())
                .flatten()
                .collect(),
            extend_select: self.extend_select.clone().unwrap_or_default(),
            fixable: self.fixable.clone(),
            unfixable: self.unfixable.clone().unwrap_or_default(),
        });
        if let Some(format) = &self.format {
            config.format = Some(*format);
        }
        if let Some(force_exclude) = &self.force_exclude {
            config.force_exclude = Some(*force_exclude);
        }
        if let Some(ignore) = &self.ignore {
            config.ignore = Some(ignore.clone());
        }
        if let Some(line_length) = &self.line_length {
            config.line_length = Some(*line_length);
        }
@@ -443,38 +471,15 @@ impl ConfigProcessor for &Overrides {
        if let Some(respect_gitignore) = &self.respect_gitignore {
            config.respect_gitignore = Some(*respect_gitignore);
        }
        if let Some(select) = &self.select {
            config.select = Some(select.clone());
        }
        if let Some(show_source) = &self.show_source {
            config.show_source = Some(*show_source);
        }
        if let Some(target_version) = &self.target_version {
            config.target_version = Some(*target_version);
        }
        if let Some(unfixable) = &self.unfixable {
            config.unfixable = Some(unfixable.clone());
        }
        if let Some(update_check) = &self.update_check {
            config.update_check = Some(*update_check);
        }
        // Special-case: `extend_ignore` and `extend_select` are parallel arrays, so
        // push an empty array if only one of the two is provided.
        match (&self.extend_ignore, &self.extend_select) {
            (Some(extend_ignore), Some(extend_select)) => {
                config.extend_ignore.push(extend_ignore.clone());
                config.extend_select.push(extend_select.clone());
            }
            (Some(extend_ignore), None) => {
                config.extend_ignore.push(extend_ignore.clone());
                config.extend_select.push(Vec::new());
            }
            (None, Some(extend_select)) => {
                config.extend_ignore.push(Vec::new());
                config.extend_select.push(extend_select.clone());
            }
            (None, None) => {}
        }
    }
}
@@ -27,6 +27,8 @@ use crate::cache;
use crate::diagnostics::{lint_path, lint_stdin, Diagnostics};
use crate::iterators::par_iter;

pub mod linter;

/// Run the linter over a collection of files.
pub fn run(
    files: &[PathBuf],
@@ -110,7 +112,7 @@ pub fn run(
    let settings = resolver.resolve(path, pyproject_strategy);
    if settings.rules.enabled(&Rule::IOError) {
        Diagnostics::new(vec![Message {
            kind: IOError(message).into(),
            kind: IOError { message }.into(),
            location: Location::default(),
            end_location: Location::default(),
            fix: None,
59
ruff_cli/src/commands/linter.rs
Normal file
@@ -0,0 +1,59 @@
use itertools::Itertools;
use serde::Serialize;
use strum::IntoEnumIterator;

use ruff::registry::{Linter, LinterCategory, RuleNamespace};

use crate::args::HelpFormat;

pub fn linter(format: HelpFormat) {
    match format {
        HelpFormat::Text => {
            for linter in Linter::iter() {
                let prefix = match linter.common_prefix() {
                    "" => linter
                        .categories()
                        .unwrap()
                        .iter()
                        .map(|LinterCategory(prefix, ..)| prefix)
                        .join("/"),
                    prefix => prefix.to_string(),
                };
                println!("{:>4} {}", prefix, linter.name());
            }
        }

        HelpFormat::Json => {
            let linters: Vec<_> = Linter::iter()
                .map(|linter_info| LinterInfo {
                    prefix: linter_info.common_prefix(),
                    name: linter_info.name(),
                    categories: linter_info.categories().map(|cats| {
                        cats.iter()
                            .map(|LinterCategory(prefix, name, ..)| LinterCategoryInfo {
                                prefix,
                                name,
                            })
                            .collect()
                    }),
                })
                .collect();

            println!("{}", serde_json::to_string_pretty(&linters).unwrap());
        }
    }
}

#[derive(Serialize)]
struct LinterInfo {
    prefix: &'static str,
    name: &'static str,
    #[serde(skip_serializing_if = "Option::is_none")]
    categories: Option<Vec<LinterCategoryInfo>>,
}

#[derive(Serialize)]
struct LinterCategoryInfo {
    prefix: &'static str,
    name: &'static str,
}
@@ -1,5 +1,5 @@
//! This library only exists to enable the Ruff internal tooling (`ruff_dev`)
//! to automatically update the `ruff --help` output in the `README.md`.
//! to automatically update the `ruff help` output in the `README.md`.
//!
//! For the actual Ruff library, see [`ruff`].
#![forbid(unsafe_code)]
@@ -10,7 +10,21 @@ mod args;

use clap::CommandFactory;

/// Returns the output of `ruff --help`.
pub fn help() -> String {
/// Returns the output of `ruff help`.
pub fn command_help() -> String {
    args::Args::command().render_help().to_string()
}

/// Returns the output of `ruff help check`.
pub fn subcommand_help() -> String {
    let mut cmd = args::Args::command();

    // The build call is necessary for the help output to contain `Usage: ruff check`
    // instead of `Usage: check`; see https://github.com/clap-rs/clap/issues/4685
    cmd.build();

    cmd.find_subcommand_mut("check")
        .expect("`check` subcommand not found")
        .render_help()
        .to_string()
}
@@ -58,26 +58,30 @@ fn inner_main() -> Result<ExitCode> {
        log_level_args,
    } = Args::parse_from(args);

    let default_panic_hook = std::panic::take_hook();
    std::panic::set_hook(Box::new(move |info| {
        eprintln!(
            r#"
    #[cfg(not(debug_assertions))]
    {
        let default_panic_hook = std::panic::take_hook();
        std::panic::set_hook(Box::new(move |info| {
            eprintln!(
                r#"
{}: `ruff` crashed. This indicates a bug in `ruff`. If you could open an issue at:

https://github.com/charliermarsh/ruff/issues/new?title=%5BPanic%5D

quoting the executed command, along with the relevant file contents and `pyproject.toml` settings, we'd be very appreciative!
"#,
            "error".red().bold(),
        );
        default_panic_hook(info);
    }));
                "error".red().bold(),
            );
            default_panic_hook(info);
        }));
    }

    let log_level: LogLevel = (&log_level_args).into();
    set_up_logging(&log_level)?;

    match command {
        Command::Rule { rule, format } => commands::rule(rule, format)?,
        Command::Linter { format } => commands::linter::linter(format),
        Command::Clean => commands::clean(log_level)?,
        Command::GenerateShellCompletion { shell } => {
            shell.generate(&mut Args::command(), &mut io::stdout());
@@ -244,7 +248,11 @@ fn check(args: CheckArgs, log_level: LogLevel) -> Result<ExitCode> {
    // unless we're writing fixes via stdin (in which case, the transformed
    // source code goes to stdout).
    if !(is_stdin && matches!(autofix, fix::FixMode::Apply | fix::FixMode::Diff)) {
        printer.write_once(&diagnostics)?;
        if cli.statistics {
            printer.write_statistics(&diagnostics)?;
        } else {
            printer.write_once(&diagnostics)?;
        }
    }

    // Check for updates if we're in a non-silent log level.
@@ -7,7 +7,7 @@ use annotate_snippets::display_list::{DisplayList, FormatOptions};
use annotate_snippets::snippet::{Annotation, AnnotationType, Slice, Snippet, SourceAnnotation};
use anyhow::Result;
use colored::Colorize;
use itertools::iterate;
use itertools::{iterate, Itertools};
use ruff::fs::relativize_path;
use ruff::logging::LogLevel;
use ruff::message::{Location, Message};
@@ -43,6 +43,13 @@ struct ExpandedMessage<'a> {
    filename: &'a str,
}

#[derive(Serialize)]
struct ExpandedStatistics<'a> {
    count: usize,
    code: &'a str,
    message: String,
}

struct SerializeRuleAsCode<'a>(&'a Rule);

impl Serialize for SerializeRuleAsCode<'_> {
@@ -336,6 +343,80 @@ impl<'a> Printer<'a> {
        Ok(())
    }

    pub fn write_statistics(&self, diagnostics: &Diagnostics) -> Result<()> {
        let violations = diagnostics
            .messages
            .iter()
            .map(|message| message.kind.rule())
            .sorted()
            .dedup()
            .collect::<Vec<_>>();
        if violations.is_empty() {
            return Ok(());
        }

        let statistics = violations
            .iter()
            .map(|rule| ExpandedStatistics {
                code: rule.code(),
                count: diagnostics
                    .messages
                    .iter()
                    .filter(|message| message.kind.rule() == *rule)
                    .count(),
                message: diagnostics
                    .messages
                    .iter()
                    .find(|message| message.kind.rule() == *rule)
                    .map(|message| message.kind.body())
                    .unwrap(),
            })
            .collect::<Vec<_>>();

        let mut stdout = BufWriter::new(io::stdout().lock());
        match self.format {
            SerializationFormat::Text => {
                // Compute the maximum number of digits in the count and code, for all
                // messages, to enable pretty-printing.
                let count_width = num_digits(
                    statistics
                        .iter()
                        .map(|statistic| statistic.count)
                        .max()
                        .unwrap(),
                );
                let code_width = statistics
                    .iter()
                    .map(|statistic| statistic.code.len())
                    .max()
                    .unwrap();

                // By default, we mimic Flake8's `--statistics` format.
                for msg in statistics {
                    writeln!(
                        stdout,
                        "{:>count_width$}\t{:<code_width$}\t{}",
                        msg.count, msg.code, msg.message
                    )?;
                }
                return Ok(());
            }
            SerializationFormat::Json => {
                writeln!(stdout, "{}", serde_json::to_string_pretty(&statistics)?)?;
            }
            _ => {
                anyhow::bail!(
                    "Unsupported serialization format for statistics: {:?}",
                    self.format
                )
            }
        }

        stdout.flush()?;

        Ok(())
    }

    pub fn write_continuously(&self, diagnostics: &Diagnostics) -> Result<()> {
        if matches!(self.log_level, LogLevel::Silent) {
            return Ok(());
@@ -14,7 +14,7 @@ const BIN_NAME: &str = "ruff";
#[test]
fn test_stdin_success() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    cmd.args(["-", "--format", "text"])
    cmd.args(["-", "--format", "text", "--isolated"])
        .write_stdin("")
        .assert()
        .success();
@@ -25,7 +25,7 @@ fn test_stdin_success() -> Result<()> {
fn test_stdin_error() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "text"])
        .args(["-", "--format", "text", "--isolated"])
        .write_stdin("import os\n")
        .assert()
        .failure();
@@ -41,7 +41,14 @@ fn test_stdin_error() -> Result<()> {
fn test_stdin_filename() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "text", "--stdin-filename", "F401.py"])
        .args([
            "-",
            "--format",
            "text",
            "--stdin-filename",
            "F401.py",
            "--isolated",
        ])
        .write_stdin("import os\n")
        .assert()
        .failure();
@@ -58,7 +65,14 @@ fn test_stdin_filename() -> Result<()> {
fn test_stdin_json() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "json", "--stdin-filename", "F401.py"])
        .args([
            "-",
            "--format",
            "json",
            "--stdin-filename",
            "F401.py",
            "--isolated",
        ])
        .write_stdin("import os\n")
        .assert()
        .failure();
@@ -107,7 +121,7 @@ fn test_stdin_json() -> Result<()> {
fn test_stdin_autofix() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "text", "--fix"])
        .args(["-", "--format", "text", "--fix", "--isolated"])
        .write_stdin("import os\nimport sys\n\nprint(sys.version)\n")
        .assert()
        .success();
@@ -122,7 +136,7 @@ fn test_stdin_autofix() -> Result<()> {
fn test_stdin_autofix_when_not_fixable_should_still_print_contents() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "text", "--fix"])
        .args(["-", "--format", "text", "--fix", "--isolated"])
        .write_stdin("import os\nimport sys\n\nif (1, 2):\n    print(sys.version)\n")
        .assert()
        .failure();
@@ -137,7 +151,7 @@ fn test_stdin_autofix_when_not_fixable_should_still_print_contents
fn test_stdin_autofix_when_no_issues_should_still_print_contents() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "text", "--fix"])
        .args(["-", "--format", "text", "--fix", "--isolated"])
        .write_stdin("import sys\n\nprint(sys.version)\n")
        .assert()
        .success();
@@ -152,7 +166,7 @@ fn test_stdin_autofix_when_no_issues_should_still_print_contents() -> Result<()>
fn test_show_source() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "text", "--show-source"])
        .args(["-", "--format", "text", "--show-source", "--isolated"])
        .write_stdin("l = 1")
        .assert()
        .failure();
@@ -168,3 +182,21 @@ fn explain_status_codes() -> Result<()> {
    cmd.args(["--explain", "RUF404"]).assert().failure();
    Ok(())
}

#[test]
fn show_statistics() -> Result<()> {
    let mut cmd = Command::cargo_bin(BIN_NAME)?;
    let output = cmd
        .args(["-", "--format", "text", "--select", "F401", "--statistics"])
        .write_stdin("import sys\nimport os\n\nprint(os.getuid())\n")
        .assert()
        .failure();
    assert_eq!(
        str::from_utf8(&output.get_output().stdout)?
            .lines()
            .last()
            .unwrap(),
        "1\tF401\t`sys` imported but unused"
    );
    Ok(())
}
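The `show_statistics` test above pins down the output shape of the new `--statistics` flag: one `count<TAB>code<TAB>message` line per rule, mimicking Flake8. As a rough Python sketch of the same aggregation (not ruff's code, which is the Rust `write_statistics` shown earlier), given `(code, message)` pairs:

```python
from collections import Counter


def write_statistics(messages: list[tuple[str, str]]) -> list[str]:
    """Aggregate (code, message) diagnostics into flake8-style
    `count<TAB>code<TAB>message` lines, one per distinct rule code."""
    counts = Counter(code for code, _ in messages)
    # Keep the first message seen for each code, as the Rust version does
    # with `find`.
    first_message: dict[str, str] = {}
    for code, message in messages:
        first_message.setdefault(code, message)
    # Pad the count and code columns to their widest entries.
    count_width = max(len(str(c)) for c in counts.values())
    code_width = max(len(code) for code in counts)
    return [
        f"{counts[code]:>{count_width}}\t{code:<{code_width}}\t{first_message[code]}"
        for code in sorted(counts)
    ]


for line in write_statistics([("F401", "`sys` imported but unused")]):
    print(line)
```

With a single unused import, this yields exactly the line the test asserts on.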
@@ -1,6 +1,6 @@
[package]
name = "ruff_dev"
version = "0.0.237"
version = "0.0.239"
edition = "2021"

[dependencies]
@@ -15,7 +15,7 @@ rustpython-ast = { features = ["unparse"], git = "https://github.com/RustPython/
rustpython-common = { git = "https://github.com/RustPython/RustPython.git", rev = "4f38cb68e4a97aeea9eb19673803a0bd5f655383" }
rustpython-parser = { features = ["lalrpop"], git = "https://github.com/RustPython/RustPython.git", rev = "4f38cb68e4a97aeea9eb19673803a0bd5f655383" }
schemars = { version = "0.8.11" }
serde_json = {version="1.0.91"}
serde_json = { version = "1.0.91" }
strum = { version = "0.24.1", features = ["strum_macros"] }
strum_macros = { version = "0.24.3" }
textwrap = { version = "0.16.0" }
@@ -1,11 +1,14 @@
//! Generate CLI help.

use anyhow::Result;

use crate::utils::replace_readme_section;
use anyhow::Result;
use std::str;

const HELP_BEGIN_PRAGMA: &str = "<!-- Begin auto-generated cli help. -->";
const HELP_END_PRAGMA: &str = "<!-- End auto-generated cli help. -->";
const COMMAND_HELP_BEGIN_PRAGMA: &str = "<!-- Begin auto-generated command help. -->";
const COMMAND_HELP_END_PRAGMA: &str = "<!-- End auto-generated command help. -->";

const SUBCOMMAND_HELP_BEGIN_PRAGMA: &str = "<!-- Begin auto-generated subcommand help. -->";
const SUBCOMMAND_HELP_END_PRAGMA: &str = "<!-- End auto-generated subcommand help. -->";

#[derive(clap::Args)]
pub struct Args {
@@ -19,15 +22,25 @@ fn trim_lines(s: &str) -> String {
}

pub fn main(args: &Args) -> Result<()> {
    let output = trim_lines(ruff_cli::help().trim());
    // Generate `ruff help`.
    let command_help = trim_lines(ruff_cli::command_help().trim());

    // Generate `ruff help check`.
    let subcommand_help = trim_lines(ruff_cli::subcommand_help().trim());

    if args.dry_run {
        print!("{output}");
        print!("{command_help}");
        print!("{subcommand_help}");
    } else {
        replace_readme_section(
            &format!("```\n{output}\n```\n"),
            HELP_BEGIN_PRAGMA,
            HELP_END_PRAGMA,
            &format!("```\n{command_help}\n```\n"),
            COMMAND_HELP_BEGIN_PRAGMA,
            COMMAND_HELP_END_PRAGMA,
        )?;
        replace_readme_section(
            &format!("```\n{subcommand_help}\n```\n"),
            SUBCOMMAND_HELP_BEGIN_PRAGMA,
            SUBCOMMAND_HELP_END_PRAGMA,
        )?;
    }


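The `replace_readme_section` calls above splice generated help text between begin/end HTML pragma comments in the README. A minimal Python sketch of that pragma-replacement idea (the function body is an assumption for illustration, not ruff's actual `utils` implementation):

```python
def replace_section(content: str, replacement: str, begin: str, end: str) -> str:
    """Replace everything between the begin/end pragma comments."""
    before, _, rest = content.partition(begin)
    _, _, after = rest.partition(end)
    return f"{before}{begin}\n{replacement}{end}{after}"


readme = (
    "intro\n<!-- Begin auto-generated cli help. -->\nold\n"
    "<!-- End auto-generated cli help. -->\noutro"
)
updated = replace_section(
    readme,
    "```\nnew help\n```\n",
    "<!-- Begin auto-generated cli help. -->",
    "<!-- End auto-generated cli help. -->",
)
```

Because the pragmas themselves are preserved, the same command can be re-run idempotently whenever the CLI help changes.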
@@ -1,6 +1,7 @@
//! Generate a Markdown-compatible table of supported lint rules.

use anyhow::Result;
use itertools::Itertools;
use ruff::registry::{Linter, LinterCategory, Rule, RuleNamespace};
use strum::IntoEnumIterator;

@@ -47,7 +48,15 @@ pub fn main(args: &Args) -> Result<()> {
    let mut table_out = String::new();
    let mut toc_out = String::new();
    for linter in Linter::iter() {
        let codes_csv: String = linter.prefixes().join(", ");
        let codes_csv: String = match linter.common_prefix() {
            "" => linter
                .categories()
                .unwrap()
                .iter()
                .map(|LinterCategory(prefix, ..)| prefix)
                .join(", "),
            prefix => prefix.to_string(),
        };
        table_out.push_str(&format!("### {} ({codes_csv})", linter.name()));
        table_out.push('\n');
        table_out.push('\n');

@@ -1,6 +1,6 @@
[package]
name = "ruff_macros"
version = "0.0.237"
version = "0.0.239"
edition = "2021"

[lib]

@@ -52,6 +52,7 @@ pub fn expand<'a>(
            ::strum_macros::EnumIter,
            ::strum_macros::EnumString,
            ::strum_macros::AsRefStr,
            ::strum_macros::IntoStaticStr,
            Debug,
            PartialEq,
            Eq,

@@ -15,13 +15,15 @@ pub fn derive_impl(input: DeriveInput) -> syn::Result<proc_macro2::TokenStream>

    let mut parsed = Vec::new();

    let mut prefix_match_arms = quote!();
    let mut common_prefix_match_arms = quote!();
    let mut name_match_arms = quote!(Self::Ruff => "Ruff-specific rules",);
    let mut url_match_arms = quote!(Self::Ruff => None,);
    let mut into_iter_match_arms = quote!();

    let mut all_prefixes = HashSet::new();

    for variant in variants {
        let mut first_chars = HashSet::new();
        let prefixes: Result<Vec<_>, _> = variant
            .attrs
            .iter()
@@ -33,7 +35,9 @@ pub fn derive_impl(input: DeriveInput) -> syn::Result<proc_macro2::TokenStream>
                let str = lit.value();
                match str.chars().next() {
                    None => return Err(Error::new(lit.span(), "expected prefix string to be non-empty")),
                    Some(_) => {},
                    Some(c) => if !first_chars.insert(c) {
                        return Err(Error::new(lit.span(), format!("this variant already has another prefix starting with the character '{c}'")))
                    }
                }
                if !all_prefixes.insert(str.clone()) {
                    return Err(Error::new(lit.span(), "prefix has already been defined before"));
@@ -63,31 +67,43 @@ pub fn derive_impl(input: DeriveInput) -> syn::Result<proc_macro2::TokenStream>
        }

        for lit in &prefixes {
            parsed.push((lit.clone(), variant_ident.clone()));
            parsed.push((
                lit.clone(),
                variant_ident.clone(),
                match prefixes.len() {
                    1 => ParseStrategy::SinglePrefix,
                    _ => ParseStrategy::MultiplePrefixes,
                },
            ));
        }

        prefix_match_arms.extend(quote! {
            Self::#variant_ident => &[#(#prefixes),*],
        });
        if let [prefix] = &prefixes[..] {
            common_prefix_match_arms.extend(quote! { Self::#variant_ident => #prefix, });

            let prefix_ident = Ident::new(prefix, Span::call_site());
            into_iter_match_arms.extend(quote! {
                #ident::#variant_ident => RuleCodePrefix::#prefix_ident.into_iter(),
            });
        } else {
            // There is more than one prefix. We already previously asserted
            // that prefixes of the same variant don't start with the same character
            // so the common prefix for this variant is the empty string.
            common_prefix_match_arms.extend(quote! { Self::#variant_ident => "", });
        }
    }

    parsed.sort_by_key(|(prefix, _)| Reverse(prefix.len()));
    parsed.sort_by_key(|(prefix, ..)| Reverse(prefix.len()));

    let mut if_statements = quote!();
    let mut into_iter_match_arms = quote!();

    for (prefix, field) in parsed {
    for (prefix, field, strategy) in parsed {
        let ret_str = match strategy {
            ParseStrategy::SinglePrefix => quote!(rest),
            ParseStrategy::MultiplePrefixes => quote!(code),
        };
        if_statements.extend(quote! {if let Some(rest) = code.strip_prefix(#prefix) {
            return Some((#ident::#field, rest));
            return Some((#ident::#field, #ret_str));
        }});

        let prefix_ident = Ident::new(&prefix, Span::call_site());

        if field != "Pycodestyle" {
            into_iter_match_arms.extend(quote! {
                #ident::#field => RuleCodePrefix::#prefix_ident.into_iter(),
            });
        }
    }

    into_iter_match_arms.extend(quote! {
@@ -104,9 +120,8 @@ pub fn derive_impl(input: DeriveInput) -> syn::Result<proc_macro2::TokenStream>
            None
        }


        fn prefixes(&self) -> &'static [&'static str] {
            match self { #prefix_match_arms }
        fn common_prefix(&self) -> &'static str {
            match self { #common_prefix_match_arms }
        }

        fn name(&self) -> &'static str {
@@ -152,3 +167,8 @@ fn parse_doc_attr(doc_attr: &Attribute) -> syn::Result<(String, String)> {
fn parse_markdown_link(link: &str) -> Option<(&str, &str)> {
    link.strip_prefix('[')?.strip_suffix(')')?.split_once("](")
}

enum ParseStrategy {
    SinglePrefix,
    MultiplePrefixes,
}

@@ -15,12 +15,22 @@ SECTIONS: list[tuple[str, str]] = [
    ("FAQ", "faq.md"),
]

DOCUMENTATION_LINK: str = (
    "This README is also available as [documentation](https://beta.ruff.rs/docs/)."
)


def main() -> None:
    """Generate an MkDocs-compatible `docs` and `mkdocs.yml`."""
    with Path("README.md").open(encoding="utf8") as fp:
        content = fp.read()

    # Remove the documentation link, since we're _in_ the docs.
    if DOCUMENTATION_LINK not in content:
        msg = "README.md is not in the expected format."
        raise ValueError(msg)
    content = content.replace(DOCUMENTATION_LINK, "")

    Path("docs").mkdir(parents=True, exist_ok=True)

    # Split the README.md into sections.

@@ -21,24 +21,18 @@ GITHUB = """
# https://github.com/pypi/warehouse/issues/11251
PYPI = """
<p align="center">
  <picture align="center">
    <img alt="Shows a bar chart with benchmark results." src="https://user-images.githubusercontent.com/1309177/212613257-5f4bca12-6d6b-4c79-9bac-51a4c6d08928.svg">
  </picture>
  <img alt="Shows a bar chart with benchmark results." src="https://user-images.githubusercontent.com/1309177/212613257-5f4bca12-6d6b-4c79-9bac-51a4c6d08928.svg">
</p>
"""

# https://squidfunk.github.io/mkdocs-material/reference/images/#light-and-dark-mode
MK_DOCS = """
<p align="center">
  <picture align="center">
    <img alt="Shows a bar chart with benchmark results." src="https://user-images.githubusercontent.com/1309177/212613257-5f4bca12-6d6b-4c79-9bac-51a4c6d08928.svg#only-light">
  </picture>
  <img alt="Shows a bar chart with benchmark results." src="https://user-images.githubusercontent.com/1309177/212613257-5f4bca12-6d6b-4c79-9bac-51a4c6d08928.svg#only-light">
</p>

<p align="center">
  <picture align="center">
    <img alt="Shows a bar chart with benchmark results." src="https://user-images.githubusercontent.com/1309177/212613422-7faaf278-706b-4294-ad92-236ffcab3430.svg#only-dark">
  </picture>
  <img alt="Shows a bar chart with benchmark results." src="https://user-images.githubusercontent.com/1309177/212613422-7faaf278-706b-4294-ad92-236ffcab3430.svg#only-dark">
</p>
"""


@@ -559,6 +559,17 @@ pub fn collect_arg_names<'a>(arguments: &'a Arguments) -> FxHashSet<&'a str> {
}

/// Returns `true` if a statement or expression includes at least one comment.
pub fn has_comments<T>(located: &Located<T>, locator: &Locator) -> bool {
    has_comments_in(
        Range::new(
            Location::new(located.location.row(), 0),
            Location::new(located.end_location.unwrap().row() + 1, 0),
        ),
        locator,
    )
}

/// Returns `true` if a [`Range`] includes at least one comment.
pub fn has_comments_in(range: Range, locator: &Locator) -> bool {
    lexer::make_tokenizer(locator.slice_source_code_range(&range))
        .any(|result| result.map_or(false, |(_, tok, _)| matches!(tok, Tok::Comment(..))))
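`has_comments_in` lexes the source slice and reports whether any `Tok::Comment` token appears. The same check expressed in Python, using the stdlib `tokenize` module in place of RustPython's lexer (an analogue of the idea, not the actual implementation):

```python
import io
import tokenize


def has_comments(source: str) -> bool:
    # Lex the source and report whether any COMMENT token appears.
    tokens = tokenize.generate_tokens(io.StringIO(source).readline)
    return any(tok.type == tokenize.COMMENT for tok in tokens)
```

Lexing, rather than scanning for `#`, is what keeps a `#` inside a string literal from counting as a comment.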
@@ -651,6 +662,17 @@ pub fn to_absolute(relative: Location, base: Location) -> Location {
    }
}

pub fn to_relative(absolute: Location, base: Location) -> Location {
    if absolute.row() == base.row() {
        Location::new(
            absolute.row() - base.row() + 1,
            absolute.column() - base.column(),
        )
    } else {
        Location::new(absolute.row() - base.row() + 1, absolute.column())
    }
}

/// Return `true` if a `Stmt` has leading content.
pub fn match_leading_content(stmt: &Stmt, locator: &Locator) -> bool {
    let range = Range::new(Location::new(stmt.location.row(), 0), stmt.location);

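The location arithmetic in `to_relative` is easy to get wrong: rows are 1-based, and the column offset is only subtracted when the absolute position sits on the base row, since later rows start at column 0 of the original source. The same logic over `(row, column)` tuples in Python, mirroring the Rust above:

```python
def to_relative(absolute: tuple[int, int], base: tuple[int, int]) -> tuple[int, int]:
    """Convert an absolute (1-based row, column) to one relative to `base`."""
    row = absolute[0] - base[0] + 1  # rows are 1-based
    if absolute[0] == base[0]:
        # Same row as the base: the column shifts by the base column.
        return (row, absolute[1] - base[1])
    # Later rows keep their original column.
    return (row, absolute[1])
```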
@@ -88,6 +88,11 @@ pub fn extract_all_names(
                }
            }
        }
        ExprKind::ListComp { .. } => {
            // Allow list comprehensions, even though we can't statically analyze them.
            // TODO(charlie): Allow `list()` and `tuple()` calls too, and extract the members
            // from them (even if, e.g., it's `list({...})`).
        }
        _ => {
            flags |= AllNamesFlags::INVALID_FORMAT;
        }

@@ -1 +0,0 @@

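The hunk above makes `extract_all_names` tolerate a list comprehension in `__all__` instead of flagging it as an invalid format. A rough Python counterpart using the stdlib `ast` module (a sketch of the behavior; ruff's flag handling covers more forms):

```python
import ast


def extract_all_names(source: str) -> tuple[list[str], bool]:
    """Return (names, valid) for a module-level `__all__` assignment."""
    names: list[str] = []
    valid = True
    for node in ast.parse(source).body:
        if isinstance(node, ast.Assign) and any(
            isinstance(t, ast.Name) and t.id == "__all__" for t in node.targets
        ):
            value = node.value
            if isinstance(value, (ast.List, ast.Tuple)):
                for elt in value.elts:
                    if isinstance(elt, ast.Constant) and isinstance(elt.value, str):
                        names.append(elt.value)
                    else:
                        valid = False
            elif isinstance(value, ast.ListComp):
                # Allowed, even though the members can't be analyzed statically.
                pass
            else:
                valid = False
    return names, valid
```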
@@ -3,7 +3,9 @@ use itertools::Itertools;
use libcst_native::{
    Codegen, CodegenState, ImportNames, ParenthesizableWhitespace, SmallStatement, Statement,
};
use rustpython_parser::ast::{ExcepthandlerKind, Location, Stmt, StmtKind};
use rustpython_parser::ast::{ExcepthandlerKind, Expr, Keyword, Location, Stmt, StmtKind};
use rustpython_parser::lexer;
use rustpython_parser::lexer::Tok;

use crate::ast::helpers;
use crate::ast::helpers::to_absolute;
@@ -321,6 +323,109 @@ pub fn remove_unused_imports<'a>(
    }
}

/// Generic function to remove arguments or keyword arguments in function
/// calls and class definitions. (For classes `args` should be considered
/// `bases`)
///
/// Supports the removal of parentheses when this is the only (kw)arg left.
/// For this behavior, set `remove_parentheses` to `true`.
pub fn remove_argument(
    locator: &Locator,
    stmt_at: Location,
    expr_at: Location,
    expr_end: Location,
    args: &[Expr],
    keywords: &[Keyword],
    remove_parentheses: bool,
) -> Result<Fix> {
    // TODO(sbrugman): Preserve trailing comments.
    let contents = locator.slice_source_code_at(stmt_at);

    let mut fix_start = None;
    let mut fix_end = None;

    let n_arguments = keywords.len() + args.len();
    if n_arguments == 0 {
        bail!("No arguments or keywords to remove");
    }

    if n_arguments == 1 {
        // Case 1: there is only one argument.
        let mut count: usize = 0;
        for (start, tok, end) in lexer::make_tokenizer_located(contents, stmt_at).flatten() {
            if matches!(tok, Tok::Lpar) {
                if count == 0 {
                    fix_start = Some(if remove_parentheses {
                        start
                    } else {
                        Location::new(start.row(), start.column() + 1)
                    });
                }
                count += 1;
            }

            if matches!(tok, Tok::Rpar) {
                count -= 1;
                if count == 0 {
                    fix_end = Some(if remove_parentheses {
                        end
                    } else {
                        Location::new(end.row(), end.column() - 1)
                    });
                    break;
                }
            }
        }
    } else if args
        .iter()
        .map(|node| node.location)
        .chain(keywords.iter().map(|node| node.location))
        .any(|location| location > expr_at)
    {
        // Case 2: argument or keyword is _not_ the last node.
        let mut seen_comma = false;
        for (start, tok, end) in lexer::make_tokenizer_located(contents, stmt_at).flatten() {
            if seen_comma {
                if matches!(tok, Tok::NonLogicalNewline) {
                    // Also delete any non-logical newlines after the comma.
                    continue;
                }
                fix_end = Some(if matches!(tok, Tok::Newline) {
                    end
                } else {
                    start
                });
                break;
            }
            if start == expr_at {
                fix_start = Some(start);
            }
            if fix_start.is_some() && matches!(tok, Tok::Comma) {
                seen_comma = true;
            }
        }
    } else {
        // Case 3: argument or keyword is the last node, so we have to find the last
        // comma in the stmt.
        for (start, tok, _) in lexer::make_tokenizer_located(contents, stmt_at).flatten() {
            if start == expr_at {
                fix_end = Some(expr_end);
                break;
            }
            if matches!(tok, Tok::Comma) {
                fix_start = Some(start);
            }
        }
    }

    match (fix_start, fix_end) {
        (Some(start), Some(end)) => Ok(Fix::deletion(start, end)),
        _ => {
            bail!("No fix could be constructed")
        }
    }
}

#[cfg(test)]
mod tests {
    use anyhow::Result;

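For Case 1 above (a single argument), the fix spans from just after the opening parenthesis to just before the closing one, or includes both parentheses when `remove_parentheses` is set. A string-level Python sketch of that single-argument case (a hypothetical simplification: it assumes exactly one parenthesized group in the text, with no nesting):

```python
def remove_only_argument(call: str, remove_parentheses: bool) -> str:
    """Delete the sole argument of a call or class definition."""
    open_idx = call.index("(")
    close_idx = call.rindex(")")
    if remove_parentheses:
        # Delete the parentheses along with the argument.
        return call[:open_idx] + call[close_idx + 1:]
    # Keep the (now empty) parentheses.
    return call[:open_idx + 1] + call[close_idx:]
```

The real implementation walks lexer tokens and balances `Lpar`/`Rpar` counts instead, which is what makes it safe for nested parentheses.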
@@ -8,7 +8,6 @@ use crate::fix::Fix;
use crate::registry::Diagnostic;
use crate::source_code::Locator;

pub mod fixer;
pub mod helpers;

/// Auto-fix errors in a file, and write the fixed source code to disk.
@@ -28,7 +27,7 @@ fn apply_fixes<'a>(
    fixes: impl Iterator<Item = &'a Fix>,
    locator: &'a Locator<'a>,
) -> (String, usize) {
    let mut output = String::new();
    let mut output = String::with_capacity(locator.len());
    let mut last_pos: Location = Location::new(1, 0);
    let mut applied: BTreeSet<&Fix> = BTreeSet::default();
    let mut num_fixed: usize = 0;
@@ -67,11 +66,29 @@ fn apply_fixes<'a>(
    (output, num_fixed)
}

/// Apply a single fix.
pub(crate) fn apply_fix(fix: &Fix, locator: &Locator) -> String {
    let mut output = String::with_capacity(locator.len());

    // Add all contents from `last_pos` to `fix.location`.
    let slice = locator.slice_source_code_range(&Range::new(Location::new(1, 0), fix.location));
    output.push_str(slice);

    // Add the patch itself.
    output.push_str(&fix.content);

    // Add the remaining content.
    let slice = locator.slice_source_code_at(fix.end_location);
    output.push_str(slice);

    output
}

#[cfg(test)]
mod tests {
    use rustpython_parser::ast::Location;

    use crate::autofix::apply_fixes;
    use crate::autofix::{apply_fix, apply_fixes};
    use crate::fix::Fix;
    use crate::source_code::Locator;

@@ -85,7 +102,7 @@ mod tests {
    }

    #[test]
    fn apply_single_replacement() {
    fn apply_one_replacement() {
        let fixes = vec![Fix {
            content: "Bar".to_string(),
            location: Location::new(1, 8),
@@ -111,7 +128,7 @@ class A(Bar):
    }

    #[test]
    fn apply_single_removal() {
    fn apply_one_removal() {
        let fixes = vec![Fix {
            content: String::new(),
            location: Location::new(1, 7),
@@ -137,7 +154,7 @@ class A:
    }

    #[test]
    fn apply_double_removal() {
    fn apply_two_removals() {
        let fixes = vec![
            Fix {
                content: String::new(),
@@ -202,4 +219,31 @@ class A:
        );
        assert_eq!(fixed, 1);
    }

    #[test]
    fn apply_single_fix() {
        let locator = Locator::new(
            r#"
class A(object):
    ...
"#
            .trim(),
        );
        let contents = apply_fix(
            &Fix {
                content: String::new(),
                location: Location::new(1, 7),
                end_location: Location::new(1, 15),
            },
            &locator,
        );
        assert_eq!(
            contents,
            r#"
class A:
    ...
"#
            .trim()
        );
    }
}

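The new `apply_fix` helper stitches together the text before `fix.location`, the replacement content, and the text after `fix.end_location`. The same splice over (1-based row, column) positions in Python, reproducing the Rust test case above (a sketch of the idea, not ruff's `Locator`):

```python
def apply_fix(
    source: str,
    content: str,
    start: tuple[int, int],
    end: tuple[int, int],
) -> str:
    """Replace source[start:end] (1-based row, 0-based column) with `content`."""
    lines = source.splitlines(keepends=True)
    # Text before the fix start.
    before = "".join(lines[: start[0] - 1]) + lines[start[0] - 1][: start[1]]
    # Text after the fix end.
    after = lines[end[0] - 1][end[1]:] + "".join(lines[end[0]:])
    return before + content + after
```

With an empty `content`, the splice is a pure deletion, matching the `class A(object):` → `class A:` fixture in the test.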
@@ -36,7 +36,7 @@ use crate::rules::{
    flake8_2020, flake8_annotations, flake8_bandit, flake8_blind_except, flake8_boolean_trap,
    flake8_bugbear, flake8_builtins, flake8_comprehensions, flake8_datetimez, flake8_debugger,
    flake8_errmsg, flake8_implicit_str_concat, flake8_import_conventions, flake8_logging_format,
    flake8_pie, flake8_print, flake8_pytest_style, flake8_return, flake8_simplify,
    flake8_pie, flake8_print, flake8_pytest_style, flake8_raise, flake8_return, flake8_simplify,
    flake8_tidy_imports, flake8_type_checking, flake8_unused_arguments, flake8_use_pathlib, mccabe,
    pandas_vet, pep8_naming, pycodestyle, pydocstyle, pyflakes, pygrep_hooks, pylint, pyupgrade,
    ruff, tryceratops,
@@ -93,6 +93,7 @@ pub struct Checker<'a> {
    in_type_definition: bool,
    in_deferred_string_type_definition: bool,
    in_deferred_type_definition: bool,
    in_exception_handler: bool,
    in_literal: bool,
    in_subscript: bool,
    in_type_checking_block: bool,
@@ -153,6 +154,7 @@ impl<'a> Checker<'a> {
            in_type_definition: false,
            in_deferred_string_type_definition: false,
            in_deferred_type_definition: false,
            in_exception_handler: false,
            in_literal: false,
            in_subscript: false,
            in_type_checking_block: false,
@@ -392,7 +394,9 @@ where
        if !exists {
            if self.settings.rules.enabled(&Rule::NonlocalWithoutBinding) {
                self.diagnostics.push(Diagnostic::new(
                    violations::NonlocalWithoutBinding(name.to_string()),
                    violations::NonlocalWithoutBinding {
                        name: name.to_string(),
                    },
                    *range,
                ));
            }
@@ -692,6 +696,9 @@ where
                context,
            },
        );
        if self.settings.rules.enabled(&Rule::TooManyArgs) {
            pylint::rules::too_many_args(self, args, stmt);
        }
    }
    StmtKind::Return { .. } => {
        if self.settings.rules.enabled(&Rule::ReturnOutsideFunction) {
@@ -1066,6 +1073,15 @@ where
        if self.settings.rules.enabled(&Rule::RewriteCElementTree) {
            pyupgrade::rules::replace_c_element_tree(self, stmt);
        }
        if self.settings.rules.enabled(&Rule::ImportReplacements) {
            pyupgrade::rules::import_replacements(
                self,
                stmt,
                names,
                module.as_ref().map(String::as_str),
                level.as_ref(),
            );
        }
        if self.settings.rules.enabled(&Rule::UnnecessaryBuiltinImport) {
            if let Some(module) = module.as_deref() {
                pyupgrade::rules::unnecessary_builtin_import(self, stmt, module, names);
@@ -1138,9 +1154,9 @@ where
        if self.settings.rules.enabled(&Rule::FutureFeatureNotDefined) {
            if !ALL_FEATURE_NAMES.contains(&&*alias.node.name) {
                self.diagnostics.push(Diagnostic::new(
                    violations::FutureFeatureNotDefined(
                        alias.node.name.to_string(),
                    ),
                    violations::FutureFeatureNotDefined {
                        name: alias.node.name.to_string(),
                    },
                    Range::from_located(alias),
                ));
            }
@@ -1173,12 +1189,12 @@ where
            [*(self.scope_stack.last().expect("No current scope found"))];
        if !matches!(scope.kind, ScopeKind::Module) {
            self.diagnostics.push(Diagnostic::new(
                violations::ImportStarNotPermitted(
                    helpers::format_import_from(
                violations::ImportStarNotPermitted {
                    name: helpers::format_import_from(
                        level.as_ref(),
                        module.as_deref(),
                    ),
                ),
                },
                Range::from_located(stmt),
            ));
        }
@@ -1186,10 +1202,12 @@ where

        if self.settings.rules.enabled(&Rule::ImportStarUsed) {
            self.diagnostics.push(Diagnostic::new(
                violations::ImportStarUsed(helpers::format_import_from(
                    level.as_ref(),
                    module.as_deref(),
                )),
                violations::ImportStarUsed {
                    name: helpers::format_import_from(
                        level.as_ref(),
                        module.as_deref(),
                    ),
                },
                Range::from_located(stmt),
            ));
        }
@@ -1416,6 +1434,15 @@ where
                tryceratops::rules::raise_vanilla_args(self, expr);
            }
        }
        if self
            .settings
            .rules
            .enabled(&Rule::UnnecessaryParenOnRaiseException)
        {
            if let Some(expr) = exc {
                flake8_raise::rules::unnecessary_paren_on_raise_exception(self, expr);
            }
        }
    }
    StmtKind::AugAssign { target, .. } => {
        self.handle_node_load(target);
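The new flake8-raise check above flags empty parentheses on a raised exception class: `raise ValueError` and `raise ValueError()` behave identically, because `raise` instantiates a bare exception class itself. A quick demonstration of the equivalence:

```python
def raises_with_parens() -> str:
    try:
        raise ValueError()  # flagged: the parentheses are unnecessary
    except ValueError as e:
        return type(e).__name__


def raises_without_parens() -> str:
    try:
        raise ValueError  # preferred: `raise` instantiates the class itself
    except ValueError as e:
        return type(e).__name__
```

The parentheses only matter when arguments are passed, e.g. `raise ValueError("bad input")`, which the rule leaves alone.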
@@ -1707,6 +1734,7 @@ where
        }

        // Recurse.
        let prev_in_exception_handler = self.in_exception_handler;
        let prev_visible_scope = self.visible_scope.clone();
        match &stmt.node {
            StmtKind::FunctionDef {
@@ -1858,9 +1886,13 @@ where
                }
                self.visit_body(body);
                self.except_handlers.pop();

                self.in_exception_handler = true;
                for excepthandler in handlers {
                    self.visit_excepthandler(excepthandler);
                }
                self.in_exception_handler = prev_in_exception_handler;

                self.visit_body(orelse);
                self.visit_body(finalbody);
            }
@@ -2085,11 +2117,6 @@ where
                {
                    pyupgrade::rules::use_pep585_annotation(self, expr);
                }

                if self.settings.rules.enabled(&Rule::RemoveSixCompat) {
                    pyupgrade::rules::remove_six_compat(self, expr);
                }

                if self.settings.rules.enabled(&Rule::DatetimeTimezoneUTC)
                    && self.settings.target_version >= PythonVersion::Py311
                {
@@ -2101,16 +2128,13 @@ where
                if self.settings.rules.enabled(&Rule::RewriteMockImport) {
                    pyupgrade::rules::rewrite_mock_attribute(self, expr);
                }

                if self.settings.rules.enabled(&Rule::SixPY3Referenced) {
                    flake8_2020::rules::name_or_attribute(self, expr);
                }

                pandas_vet::rules::check_attr(self, attr, value, expr);

                if self.settings.rules.enabled(&Rule::BannedApi) {
                    flake8_tidy_imports::banned_api::banned_attribute_access(self, expr);
                }
                pandas_vet::rules::check_attr(self, attr, value, expr);
            }
            ExprKind::Call {
                func,
@@ -2144,9 +2168,9 @@ where
                    .enabled(&Rule::StringDotFormatInvalidFormat)
                {
                    self.diagnostics.push(Diagnostic::new(
                        violations::StringDotFormatInvalidFormat(
                            pyflakes::format::error_to_string(&e),
                        ),
                        violations::StringDotFormatInvalidFormat {
                            message: pyflakes::format::error_to_string(&e),
                        },
                        location,
                    ));
                }
@@ -2223,9 +2247,6 @@ where
                if self.settings.rules.enabled(&Rule::RedundantOpenModes) {
                    pyupgrade::rules::redundant_open_modes(self, expr);
                }
                if self.settings.rules.enabled(&Rule::RemoveSixCompat) {
                    pyupgrade::rules::remove_six_compat(self, expr);
                }
                if self.settings.rules.enabled(&Rule::NativeLiterals) {
                    pyupgrade::rules::native_literals(self, expr, func, args, keywords);
                }
@@ -2475,8 +2496,9 @@ where

                // pandas-vet
                if self.settings.rules.enabled(&Rule::UseOfInplaceArgument) {
                    self.diagnostics
                        .extend(pandas_vet::rules::inplace_argument(keywords).into_iter());
                    self.diagnostics.extend(
                        pandas_vet::rules::inplace_argument(self, expr, args, keywords).into_iter(),
                    );
                }
                pandas_vet::rules::check_call(self, func);

@@ -2705,7 +2727,9 @@ where
                let scope = self.current_scope();
                if matches!(scope.kind, ScopeKind::Class(_) | ScopeKind::Module) {
                    self.diagnostics.push(Diagnostic::new(
                        violations::YieldOutsideFunction(DeferralKeyword::Yield),
                        violations::YieldOutsideFunction {
                            keyword: DeferralKeyword::Yield,
                        },
                        Range::from_located(expr),
                    ));
                }
@@ -2716,7 +2740,9 @@ where
                let scope = self.current_scope();
                if matches!(scope.kind, ScopeKind::Class(_) | ScopeKind::Module) {
                    self.diagnostics.push(Diagnostic::new(
                        violations::YieldOutsideFunction(DeferralKeyword::YieldFrom),
                        violations::YieldOutsideFunction {
                            keyword: DeferralKeyword::YieldFrom,
                        },
                        Range::from_located(expr),
                    ));
                }
@@ -2727,7 +2753,9 @@ where
                let scope = self.current_scope();
                if matches!(scope.kind, ScopeKind::Class(_) | ScopeKind::Module) {
                    self.diagnostics.push(Diagnostic::new(
                        violations::YieldOutsideFunction(DeferralKeyword::Await),
                        violations::YieldOutsideFunction {
                            keyword: DeferralKeyword::Await,
                        },
                        Range::from_located(expr),
                    ));
                }
@@ -2813,7 +2841,9 @@ where
                    .enabled(&Rule::PercentFormatUnsupportedFormatCharacter)
                {
                    self.diagnostics.push(Diagnostic::new(
                        violations::PercentFormatUnsupportedFormatCharacter(c),
                        violations::PercentFormatUnsupportedFormatCharacter {
                            char: c,
                        },
                        location,
                    ));
                }
@@ -2825,7 +2855,9 @@ where
                    .enabled(&Rule::PercentFormatInvalidFormat)
                {
                    self.diagnostics.push(Diagnostic::new(
                        violations::PercentFormatInvalidFormat(e.to_string()),
                        violations::PercentFormatInvalidFormat {
                            message: e.to_string(),
                        },
                        location,
                    ));
                }
@@ -3513,7 +3545,9 @@ where
        if !self.bindings[*index].used() {
            if self.settings.rules.enabled(&Rule::UnusedVariable) {
                let mut diagnostic = Diagnostic::new(
                    violations::UnusedVariable(name.to_string()),
                    violations::UnusedVariable {
                        name: name.to_string(),
                    },
                    name_range,
                );
                if self.patch(&Rule::UnusedVariable) {
@@ -3772,6 +3806,10 @@ impl<'a> Checker<'a> {
            .map(|index| &self.scopes[*index])
    }

    pub fn in_exception_handler(&self) -> bool {
        self.in_exception_handler
    }

    pub fn execution_context(&self) -> ExecutionContext {
        if self.in_type_checking_block
            || self.in_annotation
@@ -3823,10 +3861,10 @@ impl<'a> Checker<'a> {
        if matches!(binding.kind, BindingKind::LoopVar) && existing_is_import {
            if self.settings.rules.enabled(&Rule::ImportShadowedByLoopVar) {
                self.diagnostics.push(Diagnostic::new(
                    violations::ImportShadowedByLoopVar(
                        name.to_string(),
                        existing.range.location.row(),
                    ),
                    violations::ImportShadowedByLoopVar {
                        name: name.to_string(),
                        line: existing.range.location.row(),
                    },
                    binding.range,
                ));
            }
@@ -3842,10 +3880,10 @@ impl<'a> Checker<'a> {
        {
            if self.settings.rules.enabled(&Rule::RedefinedWhileUnused) {
                self.diagnostics.push(Diagnostic::new(
                    violations::RedefinedWhileUnused(
                        name.to_string(),
                        existing.range.location.row(),
                    ),
                    violations::RedefinedWhileUnused {
                        name: name.to_string(),
                        line: existing.range.location.row(),
                    },
                    binding_range(&binding, self.locator),
                ));
            }
@@ -4009,7 +4047,10 @@ impl<'a> Checker<'a> {
        from_list.sort();

        self.diagnostics.push(Diagnostic::new(
            violations::ImportStarUsage(id.to_string(), from_list),
            violations::ImportStarUsage {
                name: id.to_string(),
                sources: from_list,
            },
            Range::from_located(expr),
        ));
    }
@@ -4040,7 +4081,7 @@ impl<'a> Checker<'a> {
        }

        self.diagnostics.push(Diagnostic::new(
            violations::UndefinedName(id.clone()),
            violations::UndefinedName { name: id.clone() },
            Range::from_located(expr),
        ));
    }
@@ -4248,7 +4289,9 @@ impl<'a> Checker<'a> {
            && self.settings.rules.enabled(&Rule::UndefinedName)
        {
            self.diagnostics.push(Diagnostic::new(
                violations::UndefinedName(id.to_string()),
                violations::UndefinedName {
                    name: id.to_string(),
                },
                Range::from_located(expr),
            ));
        }
@@ -4314,7 +4357,9 @@ impl<'a> Checker<'a> {
            .enabled(&Rule::ForwardAnnotationSyntaxError)
        {
            self.diagnostics.push(Diagnostic::new(
                violations::ForwardAnnotationSyntaxError(expression.to_string()),
                violations::ForwardAnnotationSyntaxError {
                    body: expression.to_string(),
                },
                range,
            ));
        }
@@ -4486,7 +4531,9 @@ impl<'a> Checker<'a> {
            if let Some(stmt) = &binding.source {
                if matches!(stmt.node, StmtKind::Global { .. }) {
                    diagnostics.push(Diagnostic::new(
                        violations::GlobalVariableNotAssigned((*name).to_string()),
                        violations::GlobalVariableNotAssigned {
                            name: (*name).to_string(),
                        },
                        binding.range,
                    ));
                }
@@ -4517,7 +4564,9 @@ impl<'a> Checker<'a> {
            for &name in names {
                if !scope.values.contains_key(name) {
                    diagnostics.push(Diagnostic::new(
                        violations::UndefinedExport(name.to_string()),
                        violations::UndefinedExport {
                            name: name.to_string(),
                        },
                        all_binding.range,
                    ));
                }
@@ -4555,10 +4604,10 @@ impl<'a> Checker<'a> {
        if let Some(indices) = self.redefinitions.get(index) {
            for index in indices {
                diagnostics.push(Diagnostic::new(
                    violations::RedefinedWhileUnused(
                        (*name).to_string(),
                        binding.range.location.row(),
                    ),
                    violations::RedefinedWhileUnused {
                        name: (*name).to_string(),
                        line: binding.range.location.row(),
                    },
                    binding_range(&self.bindings[*index], self.locator),
                ));
            }
@@ -4586,10 +4635,10 @@ impl<'a> Checker<'a> {
            for &name in names {
                if !scope.values.contains_key(name) {
                    diagnostics.push(Diagnostic::new(
                        violations::ImportStarUsage(
                            name.to_string(),
                            from_list.clone(),
                        ),
                        violations::ImportStarUsage {
                            name: name.to_string(),
                            sources: from_list.clone(),
                        },
                        all_binding.range,
                    ));
                }

@@ -100,10 +100,13 @@ pub fn check_noqa(
if enforce_noqa {
for (row, (directive, matches)) in noqa_directives {
match directive {
Directive::All(spaces, start, end) => {
Directive::All(spaces, start_byte, end_byte) => {
if matches.is_empty() {
let start = lines[row][..start_byte].chars().count();
let end = start + lines[row][start_byte..end_byte].chars().count();

let mut diagnostic = Diagnostic::new(
violations::UnusedNOQA(None),
violations::UnusedNOQA { codes: None },
Range::new(Location::new(row + 1, start), Location::new(row + 1, end)),
);
if matches!(autofix, flags::Autofix::Enabled)
@@ -117,7 +120,7 @@ pub fn check_noqa(
diagnostics.push(diagnostic);
}
}
Directive::Codes(spaces, start, end, codes) => {
Directive::Codes(spaces, start_byte, end_byte, codes) => {
let mut disabled_codes = vec![];
let mut unknown_codes = vec![];
let mut unmatched_codes = vec![];
@@ -153,21 +156,26 @@ pub fn check_noqa(
&& unknown_codes.is_empty()
&& unmatched_codes.is_empty())
{
let start = lines[row][..start_byte].chars().count();
let end = start + lines[row][start_byte..end_byte].chars().count();

let mut diagnostic = Diagnostic::new(
violations::UnusedNOQA(Some(UnusedCodes {
disabled: disabled_codes
.iter()
.map(|code| (*code).to_string())
.collect(),
unknown: unknown_codes
.iter()
.map(|code| (*code).to_string())
.collect(),
unmatched: unmatched_codes
.iter()
.map(|code| (*code).to_string())
.collect(),
})),
violations::UnusedNOQA {
codes: Some(UnusedCodes {
disabled: disabled_codes
.iter()
.map(|code| (*code).to_string())
.collect(),
unknown: unknown_codes
.iter()
.map(|code| (*code).to_string())
.collect(),
unmatched: unmatched_codes
.iter()
.map(|code| (*code).to_string())
.collect(),
}),
},
Range::new(Location::new(row + 1, start), Location::new(row + 1, end)),
);
if matches!(autofix, flags::Autofix::Enabled)
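The `start_byte`/`end_byte` renames above pin down that the offsets carried by the directive are byte offsets, which the new lines then convert into character columns for the diagnostic. A self-contained sketch of that conversion (the helper name is ours, not ruff's):

```rust
/// Convert a byte-offset span within `line` into (start, end) character
/// columns. Byte offsets and char counts diverge on non-ASCII lines.
fn char_span(line: &str, start_byte: usize, end_byte: usize) -> (usize, usize) {
    let start = line[..start_byte].chars().count();
    let end = start + line[start_byte..end_byte].chars().count();
    (start, end)
}

fn main() {
    // "é" is two bytes in UTF-8 but a single character, so the byte
    // offset of "# noqa" (8) is one greater than its character column (7).
    let line = "é = 1  # noqa";
    let start_byte = line.find("# noqa").unwrap();
    let (start, end) = char_span(line, start_byte, start_byte + "# noqa".len());
    println!("{start}..{end}"); // prints "7..13"
}
```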
@@ -48,77 +48,73 @@ pub fn check_tokens(
|| settings.rules.enabled(&Rule::TrailingCommaProhibited);
let enforce_extraneous_parenthesis = settings.rules.enabled(&Rule::ExtraneousParentheses);

let mut state_machine = StateMachine::default();
for &(start, ref tok, end) in tokens.iter().flatten() {
let is_docstring = if enforce_ambiguous_unicode_character || enforce_quotes {
state_machine.consume(tok)
} else {
false
};
if enforce_ambiguous_unicode_character
|| enforce_commented_out_code
|| enforce_invalid_escape_sequence
{
let mut state_machine = StateMachine::default();
for &(start, ref tok, end) in tokens.iter().flatten() {
let is_docstring = if enforce_ambiguous_unicode_character {
state_machine.consume(tok)
} else {
false
};

// RUF001, RUF002, RUF003
if enforce_ambiguous_unicode_character {
if matches!(tok, Tok::String { .. } | Tok::Comment(_)) {
diagnostics.extend(ruff::rules::ambiguous_unicode_character(
locator,
start,
end,
if matches!(tok, Tok::String { .. }) {
if is_docstring {
Context::Docstring
// RUF001, RUF002, RUF003
if enforce_ambiguous_unicode_character {
if matches!(tok, Tok::String { .. } | Tok::Comment(_)) {
diagnostics.extend(ruff::rules::ambiguous_unicode_character(
locator,
start,
end,
if matches!(tok, Tok::String { .. }) {
if is_docstring {
Context::Docstring
} else {
Context::String
}
} else {
Context::String
}
} else {
Context::Comment
},
settings,
autofix,
));
Context::Comment
},
settings,
autofix,
));
}
}
}

// flake8-quotes
if enforce_quotes {
if matches!(tok, Tok::String { .. }) {
if let Some(diagnostic) = flake8_quotes::rules::quotes(
locator,
start,
end,
is_docstring,
settings,
autofix,
) {
if settings.rules.enabled(diagnostic.kind.rule()) {
// eradicate
if enforce_commented_out_code {
if matches!(tok, Tok::Comment(_)) {
if let Some(diagnostic) =
eradicate::rules::commented_out_code(locator, start, end, settings, autofix)
{
diagnostics.push(diagnostic);
}
}
}
}

// eradicate
if enforce_commented_out_code {
if matches!(tok, Tok::Comment(_)) {
if let Some(diagnostic) =
eradicate::rules::commented_out_code(locator, start, end, settings, autofix)
{
diagnostics.push(diagnostic);
// W605
if enforce_invalid_escape_sequence {
if matches!(tok, Tok::String { .. }) {
diagnostics.extend(pycodestyle::rules::invalid_escape_sequence(
locator,
start,
end,
matches!(autofix, flags::Autofix::Enabled)
&& settings.rules.should_fix(&Rule::InvalidEscapeSequence),
));
}
}
}
}

// W605
if enforce_invalid_escape_sequence {
if matches!(tok, Tok::String { .. }) {
diagnostics.extend(pycodestyle::rules::invalid_escape_sequence(
locator,
start,
end,
matches!(autofix, flags::Autofix::Enabled)
&& settings.rules.should_fix(&Rule::InvalidEscapeSequence),
));
}
}
// Q001, Q002, Q003
if enforce_quotes {
diagnostics.extend(
flake8_quotes::rules::from_tokens(tokens, locator, settings, autofix)
.into_iter()
.filter(|diagnostic| settings.rules.enabled(diagnostic.kind.rule())),
);
}

// ISC001, ISC002
@@ -36,6 +36,8 @@ impl Fix {
}

pub fn replacement(content: String, start: Location, end: Location) -> Self {
debug_assert!(!content.is_empty(), "Prefer `Fix::deletion`");

Self {
content,
location: start,
@@ -44,6 +46,8 @@ impl Fix {
}

pub fn insertion(content: String, at: Location) -> Self {
debug_assert!(!content.is_empty(), "Insert content is empty");

Self {
content,
location: at,
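The `debug_assert!` calls added to `Fix::replacement` and `Fix::insertion` above enforce constructor invariants in debug builds while compiling to nothing in release builds. A reduced sketch, with `Location` simplified to a plain `usize` offset (a stand-in, not ruff's actual type):

```rust
#[derive(Debug, PartialEq)]
struct Fix {
    content: String,
    location: usize,     // simplified stand-in for `Location`
    end_location: usize, // likewise
}

impl Fix {
    fn replacement(content: String, start: usize, end: usize) -> Self {
        // Checked in debug builds only; an empty replacement should be
        // expressed as a deletion instead.
        debug_assert!(!content.is_empty(), "Prefer `Fix::deletion`");
        Self {
            content,
            location: start,
            end_location: end,
        }
    }
}

fn main() {
    let fix = Fix::replacement("x".to_string(), 0, 1);
    println!("{fix:?}");
}
```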
@@ -1,6 +1,7 @@
use std::collections::{BTreeSet, HashMap};
use std::collections::{HashMap, HashSet};

use anyhow::Result;
use itertools::Itertools;

use super::external_config::ExternalConfig;
use super::plugin::Plugin;
@@ -38,7 +39,7 @@ pub fn convert(
.expect("Unable to find flake8 section in INI file");

// Extract all referenced rule code prefixes, to power plugin inference.
let mut referenced_codes: BTreeSet<RuleSelector> = BTreeSet::default();
let mut referenced_codes: HashSet<RuleSelector> = HashSet::default();
for (key, value) in flake8 {
if let Some(value) = value {
match key.as_str() {
@@ -80,15 +81,15 @@ pub fn convert(
.and_then(|value| {
value
.as_ref()
.map(|value| BTreeSet::from_iter(parser::parse_prefix_codes(value)))
.map(|value| HashSet::from_iter(parser::parse_prefix_codes(value)))
})
.unwrap_or_else(|| resolve_select(&plugins));
let mut ignore = flake8
let mut ignore: HashSet<RuleSelector> = flake8
.get("ignore")
.and_then(|value| {
value
.as_ref()
.map(|value| BTreeSet::from_iter(parser::parse_prefix_codes(value)))
.map(|value| HashSet::from_iter(parser::parse_prefix_codes(value)))
})
.unwrap_or_default();

@@ -349,8 +350,18 @@ pub fn convert(
}

// Deduplicate and sort.
options.select = Some(Vec::from_iter(select));
options.ignore = Some(Vec::from_iter(ignore));
options.select = Some(
select
.into_iter()
.sorted_by_key(RuleSelector::short_code)
.collect(),
);
options.ignore = Some(
ignore
.into_iter()
.sorted_by_key(RuleSelector::short_code)
.collect(),
);
if flake8_annotations != flake8_annotations::settings::Options::default() {
options.flake8_annotations = Some(flake8_annotations);
}
@@ -414,8 +425,8 @@ pub fn convert(

/// Resolve the set of enabled `RuleSelector` values for the given
/// plugins.
fn resolve_select(plugins: &[Plugin]) -> BTreeSet<RuleSelector> {
let mut select: BTreeSet<_> = DEFAULT_SELECTORS.iter().cloned().collect();
fn resolve_select(plugins: &[Plugin]) -> HashSet<RuleSelector> {
let mut select: HashSet<_> = DEFAULT_SELECTORS.iter().cloned().collect();
select.extend(plugins.iter().map(Plugin::selector));
select
}
@@ -446,7 +457,7 @@ mod tests {
.iter()
.cloned()
.chain(plugins)
.sorted()
.sorted_by_key(RuleSelector::short_code)
.collect(),
),
..Options::default()

@@ -1,4 +1,4 @@
use std::collections::{BTreeSet, HashMap};
use std::collections::{BTreeSet, HashMap, HashSet};
use std::fmt;
use std::str::FromStr;

@@ -293,7 +293,7 @@ pub fn infer_plugins_from_options(flake8: &HashMap<String, Option<String>>) -> V
///
/// For example, if the user ignores `ANN101`, we should infer that
/// `flake8-annotations` is active.
pub fn infer_plugins_from_codes(selectors: &BTreeSet<RuleSelector>) -> Vec<Plugin> {
pub fn infer_plugins_from_codes(selectors: &HashSet<RuleSelector>) -> Vec<Plugin> {
// Ignore cases in which we've knowingly changed rule prefixes.
[
Plugin::Flake82020,
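Switching from `BTreeSet` to `HashSet` above loses ordered iteration, so the converter restores determinism by sorting explicitly before emitting the options (via itertools' `sorted_by_key` in the real code). A std-only sketch of the same idea, sorting plain code strings rather than `RuleSelector` values:

```rust
use std::collections::HashSet;

/// Produce a stable, sorted listing from an unordered set of rule codes.
fn sorted_codes(select: &HashSet<&str>) -> Vec<String> {
    let mut out: Vec<String> = select.iter().map(|s| (*s).to_string()).collect();
    // HashSet iteration order is unspecified, so sort before emitting config.
    out.sort();
    out
}

fn main() {
    let select: HashSet<&str> = ["F401", "E501", "ANN101"].into_iter().collect();
    println!("{:?}", sorted_codes(&select));
}
```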
src/fs.rs
@@ -1,8 +1,10 @@
use std::fs::File;
use std::io::{BufReader, Read};
use std::ops::Deref;
use std::path::{Path, PathBuf};

use anyhow::{anyhow, Result};
use log::debug;
use path_absolutize::{path_dedot, Absolutize};
use rustc_hash::FxHashSet;

@@ -34,10 +36,28 @@ pub(crate) fn ignores_from_path<'a>(
let (file_path, file_basename) = extract_path_names(path)?;
Ok(pattern_code_pairs
.iter()
.filter(|(absolute, basename, _)| {
basename.is_match(file_basename) || absolute.is_match(file_path)
.filter_map(|(absolute, basename, codes)| {
if basename.is_match(file_basename) {
debug!(
"Adding per-file ignores for {:?} due to basename match on {:?}: {:?}",
path,
basename.deref().glob().regex(),
&**codes
);
return Some(codes.iter());
}
if absolute.is_match(file_path) {
debug!(
"Adding per-file ignores for {:?} due to absolute match on {:?}: {:?}",
path,
absolute.deref().glob().regex(),
&**codes
);
return Some(codes.iter());
}
None
})
.flat_map(|(_, _, codes)| codes.iter())
.flatten()
.collect())
}

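The rewrite above replaces `filter` + `flat_map` with a single `filter_map`, so each matching pattern can log which glob fired before contributing its codes. A std-only sketch of that shape, with a suffix check standing in for globset matching and `eprintln!` for `log::debug!` (both substitutions are ours, to keep the example dependency-free):

```rust
/// For each (pattern, codes) pair whose pattern matches `path`, log the
/// match and contribute its codes; flatten the per-pattern iterators.
fn ignores_from_path<'a>(path: &str, pairs: &'a [(&'a str, Vec<&'a str>)]) -> Vec<&'a str> {
    pairs
        .iter()
        .filter_map(|(suffix, codes)| {
            if path.ends_with(suffix) {
                eprintln!("Adding per-file ignores for {path:?} due to match on {suffix:?}: {codes:?}");
                return Some(codes.iter().copied());
            }
            None
        })
        .flatten()
        .collect()
}

fn main() {
    let pairs = vec![("__init__.py", vec!["F401"]), ("test_foo.py", vec!["S101"])];
    println!("{:?}", ignores_from_path("pkg/__init__.py", &pairs));
}
```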
@@ -118,7 +118,9 @@ pub fn check_path(
Err(parse_error) => {
if settings.rules.enabled(&Rule::SyntaxError) {
diagnostics.push(Diagnostic::new(
violations::SyntaxError(parse_error.error.to_string()),
violations::SyntaxError {
message: parse_error.error.to_string(),
},
Range::new(parse_error.location, parse_error.location),
));
}
@@ -149,6 +151,15 @@ pub fn check_path(
));
}

// Ignore diagnostics based on per-file-ignores.
if !diagnostics.is_empty() && !settings.per_file_ignores.is_empty() {
let ignores = fs::ignores_from_path(path, &settings.per_file_ignores)?;

if !ignores.is_empty() {
diagnostics.retain(|diagnostic| !ignores.contains(&diagnostic.kind.rule()));
}
};

// Enforce `noqa` directives.
if (matches!(noqa, flags::Noqa::Enabled) && !diagnostics.is_empty())
|| settings
@@ -166,17 +177,6 @@ pub fn check_path(
);
}

// Create path ignores.
if !diagnostics.is_empty() && !settings.per_file_ignores.is_empty() {
let ignores = fs::ignores_from_path(path, &settings.per_file_ignores)?;
if !ignores.is_empty() {
return Ok(diagnostics
.into_iter()
.filter(|diagnostic| !ignores.contains(&diagnostic.kind.rule()))
.collect());
}
}

Ok(diagnostics)
}

src/noqa.rs
@@ -166,13 +166,13 @@ fn add_noqa_inner(
output.push_str(line_ending);
count += 1;
}
Directive::Codes(_, start, _, existing) => {
Directive::Codes(_, start_byte, _, existing) => {
// Reconstruct the line based on the preserved rule codes.
// This enables us to tally the number of edits.
let mut formatted = String::new();
let mut formatted = String::with_capacity(line.len());

// Add existing content.
formatted.push_str(line[..start].trim_end());
formatted.push_str(line[..start_byte].trim_end());

// Add `noqa` directive.
formatted.push_str(" # noqa: ");
@@ -247,7 +247,9 @@ mod tests {
assert_eq!(output, format!("{contents}\n"));

let diagnostics = vec![Diagnostic::new(
violations::UnusedVariable("x".to_string()),
violations::UnusedVariable {
name: "x".to_string(),
},
Range::new(Location::new(1, 0), Location::new(1, 0)),
)];
let contents = "x = 1";
@@ -269,7 +271,9 @@ mod tests {
Range::new(Location::new(1, 0), Location::new(1, 0)),
),
Diagnostic::new(
violations::UnusedVariable("x".to_string()),
violations::UnusedVariable {
name: "x".to_string(),
},
Range::new(Location::new(1, 0), Location::new(1, 0)),
),
];
@@ -292,7 +296,9 @@ mod tests {
Range::new(Location::new(1, 0), Location::new(1, 0)),
),
Diagnostic::new(
violations::UnusedVariable("x".to_string()),
violations::UnusedVariable {
name: "x".to_string(),
},
Range::new(Location::new(1, 0), Location::new(1, 0)),
),
];

@@ -194,6 +194,7 @@ pub static KNOWN_STANDARD_LIBRARY: Lazy<FxHashSet<&'static str>> = Lazy::new(||
"tkinter",
"token",
"tokenize",
"tomllib",
"trace",
"traceback",
"tracemalloc",

src/registry.rs
@@ -7,7 +7,6 @@ use strum_macros::{AsRefStr, EnumIter};

use crate::ast::types::Range;
use crate::fix::Fix;
use crate::rule_selector::{prefix_to_selector, RuleSelector};
use crate::violation::Violation;
use crate::{rules, violations};

@@ -93,6 +92,7 @@ ruff_macros::define_rule_mapping!(
PLR2004 => violations::MagicValueComparison,
PLW0120 => violations::UselessElseOnLoop,
PLW0602 => violations::GlobalVariableNotAssigned,
PLR0913 => rules::pylint::rules::TooManyArgs,
// flake8-builtins
A001 => violations::BuiltinVariableShadowing,
A002 => violations::BuiltinArgumentShadowing,
@@ -237,7 +237,6 @@ ruff_macros::define_rule_mapping!(
UP013 => violations::ConvertTypedDictFunctionalToClass,
UP014 => violations::ConvertNamedTupleFunctionalToClass,
UP015 => violations::RedundantOpenModes,
UP016 => violations::RemoveSixCompat,
UP017 => violations::DatetimeTimezoneUTC,
UP018 => violations::NativeLiterals,
UP019 => violations::TypingTextStrAlias,
@@ -256,6 +255,7 @@ ruff_macros::define_rule_mapping!(
UP032 => violations::FString,
UP033 => violations::FunctoolsCache,
UP034 => violations::ExtraneousParentheses,
UP035 => rules::pyupgrade::rules::ImportReplacements,
// pydocstyle
D100 => violations::PublicModule,
D101 => violations::PublicClass,
@@ -284,7 +284,7 @@ ruff_macros::define_rule_mapping!(
D300 => violations::UsesTripleQuotes,
D301 => violations::UsesRPrefixForBackslashedContent,
D400 => violations::EndsInPeriod,
D401 => crate::rules::pydocstyle::rules::non_imperative_mood::NonImperativeMood,
D401 => rules::pydocstyle::rules::non_imperative_mood::NonImperativeMood,
D402 => violations::NoSignature,
D403 => violations::FirstLineCapitalized,
D404 => violations::NoThisPrefix,
@@ -370,28 +370,28 @@ ruff_macros::define_rule_mapping!(
PGH003 => violations::BlanketTypeIgnore,
PGH004 => violations::BlanketNOQA,
// pandas-vet
PD002 => violations::UseOfInplaceArgument,
PD003 => violations::UseOfDotIsNull,
PD004 => violations::UseOfDotNotNull,
PD007 => violations::UseOfDotIx,
PD008 => violations::UseOfDotAt,
PD009 => violations::UseOfDotIat,
PD010 => violations::UseOfDotPivotOrUnstack,
PD011 => violations::UseOfDotValues,
PD012 => violations::UseOfDotReadTable,
PD013 => violations::UseOfDotStack,
PD015 => violations::UseOfPdMerge,
PD901 => violations::DfIsABadVariableName,
PD002 => rules::pandas_vet::rules::UseOfInplaceArgument,
PD003 => rules::pandas_vet::rules::UseOfDotIsNull,
PD004 => rules::pandas_vet::rules::UseOfDotNotNull,
PD007 => rules::pandas_vet::rules::UseOfDotIx,
PD008 => rules::pandas_vet::rules::UseOfDotAt,
PD009 => rules::pandas_vet::rules::UseOfDotIat,
PD010 => rules::pandas_vet::rules::UseOfDotPivotOrUnstack,
PD011 => rules::pandas_vet::rules::UseOfDotValues,
PD012 => rules::pandas_vet::rules::UseOfDotReadTable,
PD013 => rules::pandas_vet::rules::UseOfDotStack,
PD015 => rules::pandas_vet::rules::UseOfPdMerge,
PD901 => rules::pandas_vet::rules::DfIsABadVariableName,
// flake8-errmsg
EM101 => violations::RawStringInException,
EM102 => violations::FStringInException,
EM103 => violations::DotFormatInException,
// flake8-pytest-style
PT001 => violations::IncorrectFixtureParenthesesStyle,
PT002 => violations::FixturePositionalArgs,
PT003 => violations::ExtraneousScopeFunction,
PT004 => violations::MissingFixtureNameUnderscore,
PT005 => violations::IncorrectFixtureNameUnderscore,
PT001 => rules::flake8_pytest_style::rules::IncorrectFixtureParenthesesStyle,
PT002 => rules::flake8_pytest_style::rules::FixturePositionalArgs,
PT003 => rules::flake8_pytest_style::rules::ExtraneousScopeFunction,
PT004 => rules::flake8_pytest_style::rules::MissingFixtureNameUnderscore,
PT005 => rules::flake8_pytest_style::rules::IncorrectFixtureNameUnderscore,
PT006 => violations::ParametrizeNamesWrongType,
PT007 => violations::ParametrizeValuesWrongType,
PT008 => violations::PatchWithLambda,
@@ -404,13 +404,13 @@ ruff_macros::define_rule_mapping!(
PT016 => violations::FailWithoutMessage,
PT017 => violations::AssertInExcept,
PT018 => violations::CompositeAssertion,
PT019 => violations::FixtureParamWithoutValue,
PT020 => violations::DeprecatedYieldFixture,
PT021 => violations::FixtureFinalizerCallback,
PT022 => violations::UselessYieldFixture,
PT019 => rules::flake8_pytest_style::rules::FixtureParamWithoutValue,
PT020 => rules::flake8_pytest_style::rules::DeprecatedYieldFixture,
PT021 => rules::flake8_pytest_style::rules::FixtureFinalizerCallback,
PT022 => rules::flake8_pytest_style::rules::UselessYieldFixture,
PT023 => violations::IncorrectMarkParenthesesStyle,
PT024 => violations::UnnecessaryAsyncioMarkOnFixture,
PT025 => violations::ErroneousUseFixturesOnFixture,
PT024 => rules::flake8_pytest_style::rules::UnnecessaryAsyncioMarkOnFixture,
PT025 => rules::flake8_pytest_style::rules::ErroneousUseFixturesOnFixture,
PT026 => violations::UseFixturesWithoutParameters,
// flake8-pie
PIE790 => rules::flake8_pie::rules::NoUnnecessaryPass,
@@ -481,6 +481,8 @@ ruff_macros::define_rule_mapping!(
G101 => rules::flake8_logging_format::violations::LoggingExtraAttrClash,
G201 => rules::flake8_logging_format::violations::LoggingExcInfo,
G202 => rules::flake8_logging_format::violations::LoggingRedundantExcInfo,
// flake8-raise
RSE102 => rules::flake8_raise::rules::UnnecessaryParenOnRaiseException,
// ruff
RUF001 => violations::AmbiguousUnicodeCharacterString,
RUF002 => violations::AmbiguousUnicodeCharacterDocstring,
@@ -610,15 +612,25 @@ pub enum Linter {
/// [tryceratops](https://pypi.org/project/tryceratops/1.1.0/)
#[prefix = "TRY"]
Tryceratops,
/// [flake8-raise](https://pypi.org/project/flake8-raise/)
#[prefix = "RSE"]
Flake8Raise,
/// Ruff-specific rules
#[prefix = "RUF"]
Ruff,
}

pub trait RuleNamespace: Sized {
fn parse_code(code: &str) -> Option<(Self, &str)>;
/// Returns the prefix that every single code that ruff uses to identify
/// rules from this linter starts with. In the case that multiple
/// `#[prefix]`es are configured for the variant in the `Linter` enum
/// definition this is the empty string.
fn common_prefix(&self) -> &'static str;

fn prefixes(&self) -> &'static [&'static str];
/// Attempts to parse the given rule code. If the prefix is recognized
/// returns the respective variant along with the code with the common
/// prefix stripped.
fn parse_code(code: &str) -> Option<(Self, &str)>;

fn name(&self) -> &'static str;

@@ -626,27 +638,21 @@ pub trait RuleNamespace: Sized {
}

/// The prefix, name and selector for an upstream linter category.
pub struct LinterCategory(pub &'static str, pub &'static str, pub RuleSelector);

// TODO(martin): Move these constant definitions back to Linter::categories impl
// once RuleSelector is an enum with a Linter variant
const PYCODESTYLE_CATEGORIES: &[LinterCategory] = &[
LinterCategory("E", "Error", prefix_to_selector(RuleCodePrefix::E)),
LinterCategory("W", "Warning", prefix_to_selector(RuleCodePrefix::W)),
];

const PYLINT_CATEGORIES: &[LinterCategory] = &[
LinterCategory("PLC", "Convention", prefix_to_selector(RuleCodePrefix::PLC)),
LinterCategory("PLE", "Error", prefix_to_selector(RuleCodePrefix::PLE)),
LinterCategory("PLR", "Refactor", prefix_to_selector(RuleCodePrefix::PLR)),
LinterCategory("PLW", "Warning", prefix_to_selector(RuleCodePrefix::PLW)),
];
pub struct LinterCategory(pub &'static str, pub &'static str, pub RuleCodePrefix);

impl Linter {
pub fn categories(&self) -> Option<&'static [LinterCategory]> {
match self {
Linter::Pycodestyle => Some(PYCODESTYLE_CATEGORIES),
Linter::Pylint => Some(PYLINT_CATEGORIES),
Linter::Pycodestyle => Some(&[
LinterCategory("E", "Error", RuleCodePrefix::E),
LinterCategory("W", "Warning", RuleCodePrefix::W),
]),
Linter::Pylint => Some(&[
LinterCategory("PLC", "Convention", RuleCodePrefix::PLC),
LinterCategory("PLE", "Error", RuleCodePrefix::PLE),
LinterCategory("PLR", "Refactor", RuleCodePrefix::PLR),
LinterCategory("PLW", "Warning", RuleCodePrefix::PLW),
]),
_ => None,
}
}
@@ -735,18 +741,18 @@ impl Diagnostic {
}

/// Pairs of checks that shouldn't be enabled together.
pub const INCOMPATIBLE_CODES: &[(Rule, Rule, &str)] = &[
pub const INCOMPATIBLE_CODES: &[(Rule, Rule, &str); 2] = &[
(
Rule::OneBlankLineBeforeClass,
Rule::NoBlankLineBeforeClass,
Rule::OneBlankLineBeforeClass,
"`one-blank-line-before-class` (D203) and `no-blank-line-before-class` (D211) are \
incompatible. Consider ignoring `one-blank-line-before-class`.",
incompatible. Ignoring `one-blank-line-before-class`.",
),
(
Rule::MultiLineSummaryFirstLine,
Rule::MultiLineSummarySecondLine,
"`multi-line-summary-first-line` (D212) and `multi-line-summary-second-line` (D213) are \
incompatible. Consider ignoring one.",
incompatible. Ignoring `multi-line-summary-second-line`.",
),
];

@@ -767,10 +773,12 @@ mod tests {
}

#[test]
fn test_linter_prefixes() {
fn test_linter_parse_code() {
for rule in Rule::iter() {
Linter::parse_code(rule.code())
.unwrap_or_else(|| panic!("couldn't parse {:?}", rule.code()));
let code = rule.code();
let (linter, rest) =
Linter::parse_code(code).unwrap_or_else(|| panic!("couldn't parse {:?}", code));
assert_eq!(code, format!("{}{rest}", linter.common_prefix()));
}
}
}

@@ -6,11 +6,12 @@ use schemars::JsonSchema;
use serde::de::{self, Visitor};
use serde::{Deserialize, Serialize};
use strum::IntoEnumIterator;
use strum_macros::EnumIter;

use crate::registry::{Rule, RuleCodePrefix, RuleIter};
use crate::rule_redirects::get_redirect;

#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord)]
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub enum RuleSelector {
/// All rules
All,
@@ -48,15 +49,21 @@ pub enum ParseError {
Unknown(String),
}

impl RuleSelector {
pub fn short_code(&self) -> &'static str {
match self {
RuleSelector::All => "ALL",
RuleSelector::Prefix { prefix, .. } => prefix.into(),
}
}
}

impl Serialize for RuleSelector {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
match self {
RuleSelector::All => serializer.serialize_str("ALL"),
RuleSelector::Prefix { prefix, .. } => prefix.serialize(serializer),
}
serializer.serialize_str(self.short_code())
}
}

@@ -171,7 +178,7 @@ impl RuleSelector {
}
}

#[derive(PartialEq, Eq, PartialOrd, Ord)]
#[derive(EnumIter, PartialEq, Eq, PartialOrd, Ord)]
pub(crate) enum Specificity {
All,
Linter,
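The new `short_code` accessor above centralizes the `All`/`Prefix` match so that `Serialize` and the converter's sort key share one code path instead of repeating the match. A simplified sketch (serde elided, and `Prefix` reduced to a plain `&str` rather than ruff's prefix type):

```rust
enum RuleSelector {
    All,
    Prefix(&'static str),
}

impl RuleSelector {
    // Single source of truth for the textual form of a selector;
    // serialization and sorting both call this.
    fn short_code(&self) -> &'static str {
        match self {
            RuleSelector::All => "ALL",
            RuleSelector::Prefix(prefix) => prefix,
        }
    }
}

fn main() {
    println!("{}", RuleSelector::All.short_code());
    println!("{}", RuleSelector::Prefix("E501").short_code());
}
```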
@@ -58,7 +58,7 @@ where
{
if checker.match_typing_expr(annotation, "Any") {
checker.diagnostics.push(Diagnostic::new(
violations::DynamicallyTypedExpression(func()),
violations::DynamicallyTypedExpression { name: func() },
Range::from_located(annotation),
));
};
@@ -115,7 +115,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
.enabled(&Rule::MissingTypeFunctionArgument)
{
checker.diagnostics.push(Diagnostic::new(
violations::MissingTypeFunctionArgument(arg.node.arg.to_string()),
violations::MissingTypeFunctionArgument {
name: arg.node.arg.to_string(),
},
Range::from_located(arg),
));
}
@@ -143,7 +145,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
{
if checker.settings.rules.enabled(&Rule::MissingTypeArgs) {
checker.diagnostics.push(Diagnostic::new(
violations::MissingTypeArgs(arg.node.arg.to_string()),
violations::MissingTypeArgs {
name: arg.node.arg.to_string(),
},
Range::from_located(arg),
));
}
@@ -171,7 +175,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
{
if checker.settings.rules.enabled(&Rule::MissingTypeKwargs) {
checker.diagnostics.push(Diagnostic::new(
violations::MissingTypeKwargs(arg.node.arg.to_string()),
violations::MissingTypeKwargs {
name: arg.node.arg.to_string(),
},
Range::from_located(arg),
));
}
@@ -186,14 +192,18 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
if visibility::is_classmethod(checker, cast::decorator_list(stmt)) {
if checker.settings.rules.enabled(&Rule::MissingTypeCls) {
checker.diagnostics.push(Diagnostic::new(
violations::MissingTypeCls(arg.node.arg.to_string()),
violations::MissingTypeCls {
name: arg.node.arg.to_string(),
},
Range::from_located(arg),
));
}
} else {
if checker.settings.rules.enabled(&Rule::MissingTypeSelf) {
checker.diagnostics.push(Diagnostic::new(
violations::MissingTypeSelf(arg.node.arg.to_string()),
violations::MissingTypeSelf {
name: arg.node.arg.to_string(),
},
Range::from_located(arg),
));
}
@@ -227,7 +237,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
.enabled(&Rule::MissingReturnTypeClassMethod)
{
checker.diagnostics.push(Diagnostic::new(
violations::MissingReturnTypeClassMethod(name.to_string()),
violations::MissingReturnTypeClassMethod {
name: name.to_string(),
},
helpers::identifier_range(stmt, checker.locator),
));
}
@@ -240,7 +252,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
.enabled(&Rule::MissingReturnTypeStaticMethod)
{
checker.diagnostics.push(Diagnostic::new(
violations::MissingReturnTypeStaticMethod(name.to_string()),
violations::MissingReturnTypeStaticMethod {
name: name.to_string(),
},
helpers::identifier_range(stmt, checker.locator),
));
}
@@ -256,7 +270,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
&& has_any_typed_arg)
{
let mut diagnostic = Diagnostic::new(
violations::MissingReturnTypeSpecialMethod(name.to_string()),
violations::MissingReturnTypeSpecialMethod {
name: name.to_string(),
},
helpers::identifier_range(stmt, checker.locator),
);
if checker.patch(diagnostic.kind.rule()) {
@@ -277,7 +293,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
.enabled(&Rule::MissingReturnTypeSpecialMethod)
{
checker.diagnostics.push(Diagnostic::new(
violations::MissingReturnTypeSpecialMethod(name.to_string()),
violations::MissingReturnTypeSpecialMethod {
name: name.to_string(),
},
helpers::identifier_range(stmt, checker.locator),
));
}
@@ -290,7 +308,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
.enabled(&Rule::MissingReturnTypePublicFunction)
{
checker.diagnostics.push(Diagnostic::new(
violations::MissingReturnTypePublicFunction(name.to_string()),
violations::MissingReturnTypePublicFunction {
name: name.to_string(),
},
helpers::identifier_range(stmt, checker.locator),
));
}
@@ -302,7 +322,9 @@ pub fn definition(checker: &mut Checker, definition: &Definition, visibility: &V
.enabled(&Rule::MissingReturnTypePrivateFunction)
{
checker.diagnostics.push(Diagnostic::new(
violations::MissingReturnTypePrivateFunction(name.to_string()),
violations::MissingReturnTypePrivateFunction {
name: name.to_string(),
},
helpers::identifier_range(stmt, checker.locator),
));
}

@@ -3,7 +3,8 @@ source: src/rules/flake8_annotations/mod.rs
|
||||
expression: diagnostics
|
||||
---
|
||||
- kind:
|
||||
MissingReturnTypePublicFunction: bar
|
||||
MissingReturnTypePublicFunction:
|
||||
name: bar
|
||||
location:
|
||||
row: 29
|
||||
column: 8
|
||||

@@ -3,7 +3,8 @@ source: src/rules/flake8_annotations/mod.rs
 expression: diagnostics
 ---
 - kind:
-    DynamicallyTypedExpression: a
+    DynamicallyTypedExpression:
+      name: a
   location:
     row: 10
     column: 11
@@ -13,7 +14,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: foo
+    DynamicallyTypedExpression:
+      name: foo
   location:
     row: 15
     column: 46
@@ -23,7 +25,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: a
+    DynamicallyTypedExpression:
+      name: a
   location:
     row: 40
     column: 28
@@ -33,7 +36,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: foo_method
+    DynamicallyTypedExpression:
+      name: foo_method
   location:
     row: 44
     column: 66

@@ -3,7 +3,8 @@ source: src/rules/flake8_annotations/mod.rs
 expression: diagnostics
 ---
 - kind:
-    MissingReturnTypePublicFunction: foo
+    MissingReturnTypePublicFunction:
+      name: foo
   location:
     row: 4
     column: 4
@@ -13,7 +14,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingTypeFunctionArgument: a
+    MissingTypeFunctionArgument:
+      name: a
   location:
     row: 4
     column: 8
@@ -23,7 +25,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingTypeFunctionArgument: b
+    MissingTypeFunctionArgument:
+      name: b
   location:
     row: 4
     column: 11
@@ -33,7 +36,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingReturnTypePublicFunction: foo
+    MissingReturnTypePublicFunction:
+      name: foo
   location:
     row: 9
     column: 4
@@ -43,7 +47,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingTypeFunctionArgument: b
+    MissingTypeFunctionArgument:
+      name: b
   location:
     row: 9
     column: 16
@@ -53,7 +58,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingTypeFunctionArgument: b
+    MissingTypeFunctionArgument:
+      name: b
   location:
     row: 14
     column: 16
@@ -63,7 +69,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingReturnTypePublicFunction: foo
+    MissingReturnTypePublicFunction:
+      name: foo
   location:
     row: 19
     column: 4
@@ -73,7 +80,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingReturnTypePublicFunction: foo
+    MissingReturnTypePublicFunction:
+      name: foo
   location:
     row: 24
     column: 4
@@ -83,7 +91,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: a
+    DynamicallyTypedExpression:
+      name: a
   location:
     row: 44
     column: 11
@@ -93,7 +102,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: foo
+    DynamicallyTypedExpression:
+      name: foo
   location:
     row: 49
     column: 46
@@ -103,7 +113,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "*args"
+    DynamicallyTypedExpression:
+      name: "*args"
   location:
     row: 54
     column: 23
@@ -113,7 +124,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "**kwargs"
+    DynamicallyTypedExpression:
+      name: "**kwargs"
   location:
     row: 54
     column: 38
@@ -123,7 +135,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "*args"
+    DynamicallyTypedExpression:
+      name: "*args"
   location:
     row: 59
     column: 23
@@ -133,7 +146,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "**kwargs"
+    DynamicallyTypedExpression:
+      name: "**kwargs"
   location:
     row: 64
     column: 38
@@ -143,7 +157,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingTypeSelf: self
+    MissingTypeSelf:
+      name: self
   location:
     row: 74
     column: 12
@@ -153,7 +168,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: a
+    DynamicallyTypedExpression:
+      name: a
   location:
     row: 78
     column: 28
@@ -163,7 +179,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: foo
+    DynamicallyTypedExpression:
+      name: foo
   location:
     row: 82
     column: 66
@@ -173,7 +190,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "*params"
+    DynamicallyTypedExpression:
+      name: "*params"
   location:
     row: 86
     column: 42
@@ -183,7 +201,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "**options"
+    DynamicallyTypedExpression:
+      name: "**options"
   location:
     row: 86
     column: 58
@@ -193,7 +212,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "*params"
+    DynamicallyTypedExpression:
+      name: "*params"
   location:
     row: 90
     column: 42
@@ -203,7 +223,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    DynamicallyTypedExpression: "**options"
+    DynamicallyTypedExpression:
+      name: "**options"
   location:
     row: 94
     column: 58
@@ -213,7 +234,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingTypeCls: cls
+    MissingTypeCls:
+      name: cls
   location:
     row: 104
     column: 12

@@ -3,7 +3,8 @@ source: src/rules/flake8_annotations/mod.rs
 expression: diagnostics
 ---
 - kind:
-    MissingReturnTypeSpecialMethod: __init__
+    MissingReturnTypeSpecialMethod:
+      name: __init__
   location:
     row: 5
     column: 8
@@ -21,7 +22,8 @@ expression: diagnostics
       column: 22
   parent: ~
 - kind:
-    MissingReturnTypeSpecialMethod: __init__
+    MissingReturnTypeSpecialMethod:
+      name: __init__
   location:
     row: 11
     column: 8
@@ -39,7 +41,8 @@ expression: diagnostics
       column: 27
   parent: ~
 - kind:
-    MissingReturnTypePrivateFunction: __init__
+    MissingReturnTypePrivateFunction:
+      name: __init__
   location:
     row: 40
     column: 4
@@ -49,7 +52,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingReturnTypeSpecialMethod: __init__
+    MissingReturnTypeSpecialMethod:
+      name: __init__
   location:
     row: 47
     column: 8

@@ -3,7 +3,8 @@ source: src/rules/flake8_annotations/mod.rs
 expression: diagnostics
 ---
 - kind:
-    MissingReturnTypePublicFunction: foo
+    MissingReturnTypePublicFunction:
+      name: foo
   location:
     row: 45
     column: 4
@@ -13,7 +14,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    MissingReturnTypePublicFunction: foo
+    MissingReturnTypePublicFunction:
+      name: foo
   location:
     row: 50
     column: 4

@@ -101,7 +101,7 @@ pub fn bad_file_permissions(
     if let Some(int_value) = get_int_value(mode_arg) {
         if (int_value & WRITE_WORLD > 0) || (int_value & EXECUTE_GROUP > 0) {
             checker.diagnostics.push(Diagnostic::new(
-                violations::BadFilePermissions(int_value),
+                violations::BadFilePermissions { mask: int_value },
                 Range::from_located(mode_arg),
             ));
         }

@@ -6,13 +6,15 @@ use crate::registry::Diagnostic;
 use crate::violations;
 
 fn check_password_kwarg(arg: &Located<ArgData>, default: &Expr) -> Option<Diagnostic> {
-    let string = string_literal(default)?;
+    let string = string_literal(default).filter(|string| !string.is_empty())?;
     let kwarg_name = &arg.node.arg;
     if !matches_password_name(kwarg_name) {
         return None;
     }
     Some(Diagnostic::new(
-        violations::HardcodedPasswordDefault(string.to_string()),
+        violations::HardcodedPasswordDefault {
+            string: string.to_string(),
+        },
         Range::from_located(default),
     ))
 }

@@ -10,13 +10,15 @@ pub fn hardcoded_password_func_arg(keywords: &[Keyword]) -> Vec<Diagnostic> {
     keywords
         .iter()
         .filter_map(|keyword| {
-            let string = string_literal(&keyword.node.value)?;
+            let string = string_literal(&keyword.node.value).filter(|string| !string.is_empty())?;
             let arg = keyword.node.arg.as_ref()?;
             if !matches_password_name(arg) {
                 return None;
             }
             Some(Diagnostic::new(
-                violations::HardcodedPasswordFuncArg(string.to_string()),
+                violations::HardcodedPasswordFuncArg {
+                    string: string.to_string(),
+                },
                 Range::from_located(keyword),
             ))
         })

@@ -30,12 +30,14 @@ pub fn compare_to_hardcoded_password_string(left: &Expr, comparators: &[Expr]) -
     comparators
         .iter()
         .filter_map(|comp| {
-            let string = string_literal(comp)?;
+            let string = string_literal(comp).filter(|string| !string.is_empty())?;
             if !is_password_target(left) {
                 return None;
             }
             Some(Diagnostic::new(
-                violations::HardcodedPasswordString(string.to_string()),
+                violations::HardcodedPasswordString {
+                    string: string.to_string(),
+                },
                 Range::from_located(comp),
             ))
         })
@@ -44,11 +46,13 @@ pub fn compare_to_hardcoded_password_string(left: &Expr, comparators: &[Expr]) -
 
 /// S105
 pub fn assign_hardcoded_password_string(value: &Expr, targets: &[Expr]) -> Option<Diagnostic> {
-    if let Some(string) = string_literal(value) {
+    if let Some(string) = string_literal(value).filter(|string| !string.is_empty()) {
         for target in targets {
             if is_password_target(target) {
                 return Some(Diagnostic::new(
-                    violations::HardcodedPasswordString(string.to_string()),
+                    violations::HardcodedPasswordString {
+                        string: string.to_string(),
+                    },
                     Range::from_located(value),
                 ));
             }

@@ -12,7 +12,9 @@ pub fn hardcoded_tmp_directory(
 ) -> Option<Diagnostic> {
     if prefixes.iter().any(|prefix| value.starts_with(prefix)) {
         Some(Diagnostic::new(
-            violations::HardcodedTempFile(value.to_string()),
+            violations::HardcodedTempFile {
+                string: value.to_string(),
+            },
             Range::from_located(expr),
         ))
     } else {

@@ -56,7 +56,9 @@ pub fn hashlib_insecure_hash_functions(
     if let Some(hash_func_name) = string_literal(name_arg) {
         if WEAK_HASHES.contains(&hash_func_name.to_lowercase().as_str()) {
             checker.diagnostics.push(Diagnostic::new(
-                violations::HashlibInsecureHashFunction(hash_func_name.to_string()),
+                violations::HashlibInsecureHashFunction {
+                    string: hash_func_name.to_string(),
+                },
                 Range::from_located(name_arg),
             ));
         }
@@ -71,7 +73,9 @@ pub fn hashlib_insecure_hash_functions(
             }
 
             checker.diagnostics.push(Diagnostic::new(
-                violations::HashlibInsecureHashFunction((*func_name).to_string()),
+                violations::HashlibInsecureHashFunction {
+                    string: (*func_name).to_string(),
+                },
                 Range::from_located(func),
             ));
         }

@@ -29,20 +29,20 @@ pub fn jinja2_autoescape_false(
                 if let ExprKind::Name { id, .. } = &func.node {
                     if id.as_str() != "select_autoescape" {
                         checker.diagnostics.push(Diagnostic::new(
-                            violations::Jinja2AutoescapeFalse(true),
+                            violations::Jinja2AutoescapeFalse { value: true },
                             Range::from_located(autoescape_arg),
                         ));
                     }
                 }
             }
             _ => checker.diagnostics.push(Diagnostic::new(
-                violations::Jinja2AutoescapeFalse(true),
+                violations::Jinja2AutoescapeFalse { value: true },
                 Range::from_located(autoescape_arg),
             )),
         }
     } else {
         checker.diagnostics.push(Diagnostic::new(
-            violations::Jinja2AutoescapeFalse(false),
+            violations::Jinja2AutoescapeFalse { value: false },
             Range::from_located(func),
         ));
     }

@@ -48,7 +48,9 @@ pub fn request_with_no_cert_validation(
     } = &verify_arg.node
     {
         checker.diagnostics.push(Diagnostic::new(
-            violations::RequestWithNoCertValidation(target.to_string()),
+            violations::RequestWithNoCertValidation {
+                string: target.to_string(),
+            },
             Range::from_located(verify_arg),
         ));
     }

@@ -31,13 +31,15 @@ pub fn request_without_timeout(
             _ => None,
         } {
             checker.diagnostics.push(Diagnostic::new(
-                violations::RequestWithoutTimeout(Some(timeout)),
+                violations::RequestWithoutTimeout {
+                    timeout: Some(timeout),
+                },
                 Range::from_located(timeout_arg),
             ));
         }
     } else {
         checker.diagnostics.push(Diagnostic::new(
-            violations::RequestWithoutTimeout(None),
+            violations::RequestWithoutTimeout { timeout: None },
             Range::from_located(func),
         ));
     }

@@ -27,13 +27,13 @@ pub fn unsafe_yaml_load(checker: &mut Checker, func: &Expr, args: &[Expr], keywo
             _ => None,
         };
         checker.diagnostics.push(Diagnostic::new(
-            violations::UnsafeYAMLLoad(loader),
+            violations::UnsafeYAMLLoad { loader },
             Range::from_located(loader_arg),
         ));
         }
     } else {
         checker.diagnostics.push(Diagnostic::new(
-            violations::UnsafeYAMLLoad(None),
+            violations::UnsafeYAMLLoad { loader: None },
             Range::from_located(func),
         ));
     }

@@ -21,14 +21,14 @@ fn default_tmp_dirs() -> Vec<String> {
 pub struct Options {
     #[option(
         default = "[\"/tmp\", \"/var/tmp\", \"/dev/shm\"]",
-        value_type = "Vec<String>",
+        value_type = "list[str]",
         example = "hardcoded-tmp-directory = [\"/foo/bar\"]"
     )]
     /// A list of directories to consider temporary.
     pub hardcoded_tmp_directory: Option<Vec<String>>,
     #[option(
         default = "[]",
-        value_type = "Vec<String>",
+        value_type = "list[str]",
         example = "extend-hardcoded-tmp-directory = [\"/foo/bar\"]"
     )]
     /// A list of directories to consider temporary, in addition to those

@@ -3,7 +3,8 @@ source: src/rules/flake8_bandit/mod.rs
 expression: diagnostics
 ---
 - kind:
-    BadFilePermissions: 151
+    BadFilePermissions:
+      mask: 151
   location:
     row: 6
     column: 24
@@ -13,7 +14,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 7
+    BadFilePermissions:
+      mask: 7
   location:
     row: 7
     column: 24
@@ -23,7 +25,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 511
+    BadFilePermissions:
+      mask: 511
   location:
     row: 9
     column: 24
@@ -33,7 +36,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 504
+    BadFilePermissions:
+      mask: 504
   location:
     row: 10
     column: 24
@@ -43,7 +47,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 510
+    BadFilePermissions:
+      mask: 510
   location:
     row: 11
     column: 24
@@ -53,7 +58,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 511
+    BadFilePermissions:
+      mask: 511
   location:
     row: 13
     column: 22
@@ -63,7 +69,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 511
+    BadFilePermissions:
+      mask: 511
   location:
     row: 14
     column: 23
@@ -73,7 +80,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 511
+    BadFilePermissions:
+      mask: 511
   location:
     row: 15
     column: 24
@@ -83,7 +91,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 511
+    BadFilePermissions:
+      mask: 511
   location:
     row: 17
     column: 18
@@ -93,7 +102,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 511
+    BadFilePermissions:
+      mask: 511
   location:
     row: 18
     column: 18
@@ -103,7 +113,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 511
+    BadFilePermissions:
+      mask: 511
   location:
     row: 19
     column: 18
@@ -113,7 +124,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 8
+    BadFilePermissions:
+      mask: 8
   location:
     row: 20
     column: 26
@@ -123,7 +135,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    BadFilePermissions: 2
+    BadFilePermissions:
+      mask: 2
   location:
     row: 22
     column: 24

@@ -3,87 +3,85 @@ source: src/rules/flake8_bandit/mod.rs
 expression: diagnostics
 ---
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 12
+    row: 13
     column: 11
   end_location:
-    row: 12
+    row: 13
     column: 19
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 13
+    row: 14
     column: 8
   end_location:
-    row: 13
+    row: 14
     column: 16
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 14
+    row: 15
     column: 9
   end_location:
-    row: 14
+    row: 15
     column: 17
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 15
+    row: 16
     column: 6
   end_location:
-    row: 15
+    row: 16
     column: 14
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 16
+    row: 17
     column: 9
   end_location:
-    row: 16
+    row: 17
     column: 17
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 17
+    row: 18
     column: 8
   end_location:
-    row: 17
+    row: 18
     column: 16
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 18
+    row: 19
     column: 10
   end_location:
-    row: 18
+    row: 19
     column: 18
   fix: ~
   parent: ~
-- kind:
-    HardcodedPasswordString: s3cr3t
-  location:
-    row: 19
-    column: 18
-  end_location:
-    row: 19
-    column: 26
-  fix: ~
-  parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
     row: 20
     column: 18
@@ -93,87 +91,96 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 22
+    row: 21
     column: 18
   end_location:
+    row: 21
+    column: 26
+  fix: ~
+  parent: ~
+- kind:
+    HardcodedPasswordString:
+      string: s3cr3t
+  location:
+    row: 23
+    column: 16
+  end_location:
-    row: 22
+    row: 23
     column: 24
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 23
+    row: 24
     column: 12
   end_location:
-    row: 23
+    row: 24
     column: 20
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 24
+    row: 25
     column: 14
   end_location:
-    row: 24
+    row: 25
     column: 22
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 25
+    row: 26
     column: 11
   end_location:
-    row: 25
+    row: 26
     column: 19
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 26
+    row: 27
     column: 14
   end_location:
-    row: 26
+    row: 27
     column: 22
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 27
+    row: 28
     column: 13
   end_location:
-    row: 27
+    row: 28
     column: 21
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 28
+    row: 29
     column: 15
   end_location:
-    row: 28
+    row: 29
     column: 23
   fix: ~
   parent: ~
-- kind:
-    HardcodedPasswordString: s3cr3t
-  location:
-    row: 29
-    column: 23
-  end_location:
-    row: 29
-    column: 31
-  fix: ~
-  parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
     row: 30
     column: 23
@@ -183,192 +190,222 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 34
+    row: 31
     column: 23
   end_location:
+    row: 31
+    column: 31
+  fix: ~
+  parent: ~
+- kind:
+    HardcodedPasswordString:
+      string: s3cr3t
+  location:
+    row: 35
+    column: 15
+  end_location:
-    row: 34
+    row: 35
     column: 23
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 38
+    row: 39
     column: 19
   end_location:
-    row: 38
+    row: 39
     column: 27
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 39
+    row: 40
     column: 16
   end_location:
-    row: 39
+    row: 40
     column: 24
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 40
+    row: 41
     column: 17
   end_location:
-    row: 40
+    row: 41
     column: 25
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 41
+    row: 42
     column: 14
   end_location:
-    row: 41
+    row: 42
     column: 22
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 42
+    row: 43
     column: 17
   end_location:
-    row: 42
+    row: 43
     column: 25
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 43
+    row: 44
     column: 16
   end_location:
-    row: 43
+    row: 44
     column: 24
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 44
+    row: 45
     column: 18
   end_location:
-    row: 44
+    row: 45
     column: 26
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 46
+    row: 47
     column: 12
   end_location:
-    row: 46
+    row: 47
     column: 20
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 47
+    row: 48
     column: 9
   end_location:
-    row: 47
+    row: 48
     column: 17
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 48
+    row: 49
     column: 10
   end_location:
-    row: 48
+    row: 49
     column: 18
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 49
+    row: 50
     column: 7
   end_location:
-    row: 49
+    row: 50
     column: 15
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 50
+    row: 51
     column: 10
   end_location:
-    row: 50
+    row: 51
     column: 18
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 51
+    row: 52
     column: 9
   end_location:
-    row: 51
+    row: 52
     column: 17
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 52
+    row: 53
     column: 11
   end_location:
-    row: 52
+    row: 53
     column: 19
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: s3cr3t
+    HardcodedPasswordString:
+      string: s3cr3t
   location:
-    row: 53
+    row: 54
     column: 20
   end_location:
-    row: 53
+    row: 54
     column: 28
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: "1\n2"
+    HardcodedPasswordString:
+      string: "1\n2"
   location:
-    row: 55
+    row: 56
     column: 12
   end_location:
-    row: 55
+    row: 56
     column: 18
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: "3\t4"
+    HardcodedPasswordString:
+      string: "3\t4"
   location:
-    row: 58
+    row: 59
     column: 12
   end_location:
-    row: 58
+    row: 59
     column: 18
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordString: "5\r6"
+    HardcodedPasswordString:
+      string: "5\r6"
   location:
-    row: 61
+    row: 62
     column: 12
   end_location:
-    row: 61
+    row: 62
     column: 18
   fix: ~
   parent: ~

@@ -3,12 +3,13 @@ source: src/rules/flake8_bandit/mod.rs
 expression: diagnostics
 ---
 - kind:
-    HardcodedPasswordFuncArg: s3cr3t
+    HardcodedPasswordFuncArg:
+      string: s3cr3t
   location:
-    row: 13
+    row: 14
     column: 8
   end_location:
-    row: 13
+    row: 14
     column: 25
   fix: ~
   parent: ~

@@ -3,7 +3,8 @@ source: src/rules/flake8_bandit/mod.rs
 expression: diagnostics
 ---
 - kind:
-    HardcodedPasswordDefault: default
+    HardcodedPasswordDefault:
+      string: default
   location:
     row: 5
     column: 28
@@ -13,7 +14,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordDefault: posonly
+    HardcodedPasswordDefault:
+      string: posonly
   location:
     row: 13
     column: 44
@@ -23,7 +25,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordDefault: kwonly
+    HardcodedPasswordDefault:
+      string: kwonly
   location:
     row: 21
     column: 38
@@ -33,7 +36,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordDefault: posonly
+    HardcodedPasswordDefault:
+      string: posonly
   location:
     row: 29
     column: 38
@@ -43,7 +47,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedPasswordDefault: kwonly
+    HardcodedPasswordDefault:
+      string: kwonly
   location:
     row: 29
     column: 61

@@ -3,7 +3,8 @@ source: src/rules/flake8_bandit/mod.rs
 expression: diagnostics
 ---
 - kind:
-    HardcodedTempFile: /tmp/abc
+    HardcodedTempFile:
+      string: /tmp/abc
   location:
     row: 5
     column: 10
@@ -13,7 +14,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedTempFile: /var/tmp/123
+    HardcodedTempFile:
+      string: /var/tmp/123
   location:
     row: 8
     column: 10
@@ -23,7 +25,8 @@ expression: diagnostics
   fix: ~
   parent: ~
 - kind:
-    HardcodedTempFile: /dev/shm/unit/test
+    HardcodedTempFile:
+      string: /dev/shm/unit/test
   location:
     row: 11
     column: 10
Some files were not shown because too many files have changed in this diff.