Compare commits

...

88 Commits

Author SHA1 Message Date
konstin
0f2979ed66 Formatter: Show preceding, following and enclosing of comments
**Summary** I used to always add `dbg!` calls to print the preceding, following, and enclosing nodes of comments. With this change, `--print-comments` can do this instead.
```python
import re  # import

def to_camel_case(node: str) -> str:
    """Converts PascalCase to camel_case"""
    return re.sub("([A-Z])", r"_\1", node).lower().lstrip("_")

# a
if True:
    pass  # b
else:
    print()
```
Debug output:
```
11..19 Some((StmtImport, 0..9)) Some((StmtFunctionDef, 22..165)) (ModModule, 0..213) "# import"
168..171 Some((StmtFunctionDef, 22..165)) Some((StmtIf, 172..212)) (ModModule, 0..213) "# a"
191..194 Some((StmtPass, 185..189)) Some((ElifElseClause, 195..212)) (StmtIf, 172..212) "# b"
```

**Test Plan** n/a
2023-08-23 12:55:24 +02:00
David Szotten
fe97a2a302 Fix panic with empty attribute inner comment (#6332)
Fixes https://github.com/astral-sh/ruff/issues/6181
2023-08-04 11:59:55 +02:00
konsti
a48d16e025 Replace Formatter<PyFormatContext<'_>> with PyFormatter (#6330)
This is a refactoring to use the type alias in more places. In the
process, I had to fix and run generate.py. There are no functional
changes.
2023-08-04 10:48:58 +02:00
Charlie Marsh
8a5bc93fdd Make the Nodes vector generic on node type (#6328) 2023-08-04 03:57:15 +00:00
Charlie Marsh
6da527170f Match left-hand side types() call in types-comparison (#6326)
Follow-up to https://github.com/astral-sh/ruff/pull/6325, to avoid false
positives in cases like:

```python
if x == int:
    ...
```

Which is valid, since we don't know that we're comparing the type _of_
something -- we're comparing the type objects directly.
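A quick runnable sketch (with hypothetical `obj`/`x` names) of why the two comparison styles differ, and why the left-hand-side case is exempt:

```python
obj = True

# Flagged by type-comparison: comparing the type *of* a value against a
# type object. `type(...)` ignores subclassing, which is usually a bug:
assert type(obj) is not int   # bool subclasses int, but is not int itself
assert isinstance(obj, int)   # the idiomatic check respects subclassing

# Not flagged after this change: `x` is itself a type object, so this is
# a direct comparison between types, not a check on a value's type.
x = int
if x == int:
    ...
```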
2023-08-03 23:01:23 -04:00
Charlie Marsh
8cddb6c08d Include comparisons to builtin types in type-comparison rule (#6325)
## Summary

Extends `type-comparison` to flag:

```python
if type(obj) is int:
    pass
```

In addition to the existing cases, like:

```python
if type(obj) is type(1):
    pass
```

Closes https://github.com/astral-sh/ruff/issues/6260.
2023-08-04 02:25:19 +00:00
Victor Hugo Gomes
b8ca220eeb [flake8-pyi] Implement PYI055 (#6316) 2023-08-04 01:36:00 +00:00
Charlie Marsh
1d8759d5df Generalize comment-after-bracket handling to lists, sets, etc. (#6320)
## Summary

We already support preserving the end-of-line comment in calls and type
parameters, as in:

```python
foo(  # comment
    bar,
)
```

This PR adds the same behavior for lists, sets, comprehensions, etc.,
such that we preserve:

```python
[  # comment
    1,
    2,
    3,
]
```

And related cases.
2023-08-04 01:28:05 +00:00
Charlie Marsh
d3aa8b4ee0 Add API to chain comment placement operations (#6319)
## Summary

This PR adds an API for chaining comment placement methods based on the
[`then_with`](https://doc.rust-lang.org/std/cmp/enum.Ordering.html#method.then_with)
from `Ordering` in the standard library.

For example, you can now do:

```rust
try_some_case(comment).then_with(|comment| try_some_other_case_if_still_default(comment))
```

This lets us avoid this kind of pattern, which I've seen in
`placement.rs` and used myself before:

```rust
let comment = match handle_own_line_comment_between_branches(comment, preceding, locator) {
    CommentPlacement::Default(comment) => comment,
    placement => return placement,
};
```
2023-08-03 21:08:50 -04:00
Zanie Blue
9ae498595c Upgrade Rust to 1.71 (#6323)
Addresses
[CVE-2023-38497](https://blog.rust-lang.org/2023/08/03/cve-2023-38497.html)

See also the [version release
post](https://blog.rust-lang.org/2023/08/03/Rust-1.71.1.html)
2023-08-03 21:08:39 -04:00
Charlie Marsh
5f225b18ab Generalize bracketed end-of-line comment handling (#6315)
Micha suggested this in
https://github.com/astral-sh/ruff/pull/6274#discussion_r1282774151, and
it allows us to unify the implementations for arguments and type params.
2023-08-03 20:51:03 +00:00
Charlie Marsh
8276b26480 Omit formatter PRs from releases by default (#6317)
I end up removing these manually every time, seems easier to just omit
them for now.
2023-08-03 20:45:50 +00:00
Charlie Marsh
1705fcef36 Mark trailing comments in parenthesized tests (#6287)
## Summary

This ensures that we treat `# comment` as parenthesized in contexts
like:

```python
while (
    True
    # comment
):
    pass
```

The same logic applies equally to `for`, `async for`, `if`, `with`, and
`async with`. The general pattern is that you have an expression which
precedes a colon-separated suite.
2023-08-03 20:45:03 +00:00
konsti
51ff98f9e9 Make formatter ecosystem check failure output better understandable (#6300)
**Summary** Prompted by
https://github.com/astral-sh/ruff/pull/6257#issuecomment-1661308410, I
tried to make the ecosystem script's output on failure easier to
understand. All log messages are now written to a file, which is printed
on error. When running locally, progress is still shown.

Looking through the log output, I saw that we currently log syntax
errors in the input, which is confusing because they aren't actual
errors; however, we don't check that these files stay unchanged across
parser regressions or improvements. I added `--files-with-errors` to
catch that.

**Test Plan** CI
2023-08-03 20:23:25 +02:00
Charlie Marsh
b3f3529499 Improve comments around Arguments handling in classes (#6310)
## Summary

Based on the confusion here:
https://github.com/astral-sh/ruff/pull/6274#discussion_r1282754515.

I looked into moving this logic into `placement.rs`, but I think it's
trickier than it may appear.
2023-08-03 12:34:03 -04:00
Charlie Marsh
2fa508793f Return a slice in StmtClassDef#bases (#6311)
Slices are strictly more flexible, since you can always convert to an
iterator, etc., but not the other way around. Suggested in
https://github.com/astral-sh/ruff/pull/6259#discussion_r1282730994.
2023-08-03 16:21:55 +00:00
Zanie Blue
718e3945e3 Add rule to upgrade type alias annotations to keyword (UP040) (#6289)
Adds rule to convert type aliases defined with annotations i.e. `x:
TypeAlias = int` to the new PEP-695 syntax e.g. `type x = int`.

Does not support the new generic syntax for type variables; that will be
addressed in a follow-up.
Added as part of pyupgrade — ~the code 100 was chosen to avoid collision
with real pyupgrade codes~.

Part of #4617 
Builds on #5062
2023-08-03 16:13:06 +00:00
Charlie Marsh
c75e8a8dab Move ExprCall's NeedsParentheses impl into expr_call.rs (#6309)
Accidental move.
2023-08-03 16:01:01 +00:00
Harutaka Kawamura
74e734e962 More precise invalid expression check for UP032 (#6308) 2023-08-03 15:49:02 +00:00
Zanie Blue
0e18abcf95 Add is_ and is_not to excluded functions for FBT003 (#6307)
These methods are commonly used in SQLAlchemy.

See https://github.com/astral-sh/ruff/discussions/6302
2023-08-03 10:41:45 -05:00
Anders Kaseorg
7c8bcede5b Broaden appropriate flake8-pyi rules to check non-stub code too (#6297)
Of the rules that flake8-pyi enforces for `.pyi` type stubs, many of
them equally make sense to check in normal runtime code with type
annotations. Broaden these rules to check all files:

PYI013 ellipsis-in-non-empty-class-body
PYI016 duplicate-union-member
PYI018 unused-private-type-var
PYI019 custom-type-var-return-type
PYI024 collections-named-tuple
PYI025 unaliased-collections-abc-set-import
PYI030 unnecessary-literal-union
PYI032 any-eq-ne-annotation
PYI034 non-self-return-type
PYI036 bad-exit-annotation
PYI041 redundant-numeric-union
PYI042 snake-case-type-alias
PYI043 t-suffixed-type-alias
PYI045 iter-method-return-iterable
PYI046 unused-private-protocol
PYI047 unused-private-type-alias
PYI049 unused-private-typed-dict
PYI050 no-return-argument-annotation-in-stub (Python ≥ 3.11)
PYI051 redundant-literal-union
PYI056 unsupported-method-call-on-all

The other rules are stub-specific and remain enabled only in `.pyi`
files.

PYI001 unprefixed-type-param
PYI002 complex-if-statement-in-stub
PYI003 unrecognized-version-info-check
PYI004 patch-version-comparison
PYI005 wrong-tuple-length-version-comparison (could make sense to
broaden, see
https://github.com/astral-sh/ruff/pull/6297#issuecomment-1663314807)
PYI006 bad-version-info-comparison (same)
PYI007 unrecognized-platform-check
PYI008 unrecognized-platform-name
PYI009 pass-statement-stub-body
PYI010 non-empty-stub-body
PYI011 typed-argument-default-in-stub
PYI012 pass-in-class-body
PYI014 argument-default-in-stub
PYI015 assignment-default-in-stub
PYI017 complex-assignment-in-stub
PYI020 quoted-annotation-in-stub
PYI021 docstring-in-stub
PYI026 type-alias-without-annotation (could make sense to broaden, but
gives many false positives on runtime code as currently implemented)
PYI029 str-or-repr-defined-in-stub
PYI033 type-comment-in-stub
PYI035 unassigned-special-variable-in-stub
PYI044 future-annotations-in-stub
PYI048 stub-body-multiple-statements
PYI052 unannotated-assignment-in-stub
PYI053 string-or-bytes-too-long
PYI054 numeric-literal-too-long

Signed-off-by: Anders Kaseorg <andersk@mit.edu>
2023-08-03 11:40:42 -04:00
Harutaka Kawamura
30c2e9430e Update UP032 to support await expressions (#6304)
<!--
Thank you for contributing to Ruff! To help us out with reviewing,
please consider the following:

- Does this pull request include a summary of the change? (See below.)
- Does this pull request include a descriptive title?
- Does this pull request include references to any relevant issues?
-->

## Summary

<!-- What's the purpose of the change? What does it do, and why? -->

In Python >= 3.7, `await` can be included in f-strings. 

https://bugs.python.org/issue28942
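A runnable sketch of the rewrite (with a hypothetical `get_name` coroutine), assuming the change targets `str.format` calls whose arguments include `await` expressions:

```python
import asyncio

async def get_name() -> str:
    return "world"

async def main() -> tuple:
    # The .format() call is what UP032 can now rewrite...
    old = "hello {}".format(await get_name())
    # ...into the equivalent f-string, valid since Python 3.7:
    new = f"hello {await get_name()}"
    return old, new

old, new = asyncio.run(main())
assert old == new == "hello world"
```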

## Test Plan

<!-- How was it tested? -->

Existing tests
2023-08-03 09:53:36 -05:00
Harutaka Kawamura
b6f0316d55 Add PT013 and PT015 docs (#6303)

## Summary


#2646

## Test Plan

2023-08-03 09:51:52 -05:00
Charlie Marsh
9f3567dea6 Use range: _ in lieu of range: _range (#6296)
## Summary

`range: _range` is slightly inconvenient because you can't use it
multiple times within a single match, unlike `_`.
2023-08-02 22:11:13 -04:00
Charlie Marsh
9e2bbf4beb Add a simple tooltip to the sidebar (#6295)
## Summary

Not perfect, but IMO helpful:

<img width="1792" alt="Screen Shot 2023-08-02 at 9 29 24 PM"
src="https://github.com/astral-sh/ruff/assets/1309177/e613e918-75cb-475e-9ea4-f833d1a0b5f6">

<img width="1792" alt="Screen Shot 2023-08-02 at 9 29 20 PM"
src="https://github.com/astral-sh/ruff/assets/1309177/bb3efdfe-40e1-45b5-b774-082521b2d214">
2023-08-03 01:41:07 +00:00
Charlie Marsh
d7627c398c Add an icon for FIR (#6292)
It's not a very _good_ icon, but I prefer the consistency. I'm also
going to add tooltips to these.
2023-08-03 01:20:00 +00:00
Charlie Marsh
23e527e386 Increase icon opacity on-hover (#6291)
## Summary

Makes it clearer that these are clickable.
2023-08-03 01:05:38 +00:00
Charlie Marsh
a15b0a9102 Tweak background on theme button (#6290)
## Summary

It's now white on-hover as opposed to yellow, to match the copy button:

<img width="1792" alt="Screen Shot 2023-08-02 at 8 52 10 PM"
src="https://github.com/astral-sh/ruff/assets/1309177/96d5cbf9-ef33-4fba-8888-f2a4af9a6ec4">
2023-08-03 01:00:37 +00:00
qdegraaf
d40597a266 [flake8-pyi] Implement custom_type_var_return_type (PYI019) (#6204)
## Summary

Implements `Y019` from
[flake8-pyi](https://github.com/PyCQA/flake8-pyi).

The rule checks whether

- instance methods that return `self`
- class methods that return an instance of `cls`
- `__new__` methods

return a custom `TypeVar` instead of `typing.Self`, and raises a
violation if so. The rule also covers
[PEP 695](https://peps.python.org/pep-0695/) syntax, as introduced
upstream in https://github.com/PyCQA/flake8-pyi/pull/402.

## Test Plan

Added fixtures with test cases from upstream implementation (plus
additional one for an excluded edge case, mentioned in upstream
implementation)
2023-08-03 00:42:42 +00:00
Silvano Cerza
82410524d9 [pylint] Implement Pylint bad-format-character (E1300) (#6171)
## Summary

Relates to #970.

Add new `bad-format-character` Pylint rule.

I had to make a change in `crates/ruff_python_literal/src/format.rs` to
get a more detailed error in case the format character is not correct. I
chose to do this since most of the format spec parsing functions are
private. It would have required me reimplementing most of the parsing
logic just to know if the format char was correct.

This PR also deviates from current Pylint behavior in two ways.

First, it handles new-style format strings correctly, which Pylint
currently does not. See pylint-dev/pylint#6085.

Second, when multiple adjacent string literals are delimited by
whitespace, the index of the invalid format character is reported
relative to the individual string. Pylint instead reports it relative to
the concatenated string.

Given this:
```
"%s" "%z" % ("hello", "world")
```

Ruff will report this:
```Unsupported format character 'z' (0x7a) at index 1```

Pylint instead:
```Unsupported format character 'z' (0x7a) at index 3```

I believe it's more sensible to report the index relative to the
individual string.
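For reference, the same invalid conversion also fails at runtime, which is the behavior the rule catches statically (a small sketch):

```python
# "%z" is not a valid printf-style conversion, so CPython rejects it at
# runtime; bad-format-character reports the same problem without
# executing the code.
try:
    "%s %z" % ("hello", "world")
except ValueError as exc:
    message = str(exc)

assert "unsupported format character" in message
```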

## Test Plan

Added new snapshot and a small test in
`crates/ruff_python_literal/src/format.rs`.

---------

Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
2023-08-02 21:32:43 +00:00
Zanie Blue
5b2e973fa5 Add formatting of type alias statements (#6162)
Part of #5062 
Extends https://github.com/astral-sh/ruff/pull/6161
Closes #5929
2023-08-02 20:40:32 +00:00
Zanie Blue
1a60d1e3c6 Add formatting of type parameters in class and function definitions (#6161)
Part of #5062 
Closes https://github.com/astral-sh/ruff/issues/5931

Implements formatting of a sequence of type parameters in a dedicated
struct for reuse by classes, functions, and type aliases (preparing for
#5929). Adds formatting of type parameters in class and function
definitions — previously, they were just elided.
2023-08-02 20:29:28 +00:00
Charlie Marsh
9425ed72a0 Break global and nonlocal statements over continuation lines (#6172)
## Summary

Builds on #6170 to break `global` and `nonlocal` statements, such that
we get:

```python
def f():
    global \
        analyze_featuremap_layer, \
        analyze_featuremapcompression_layer, \
        analyze_latencies_post, \
        analyze_motions_layer, \
        analyze_size_model
```

Instead of:

```python
def f():
    global analyze_featuremap_layer, analyze_featuremapcompression_layer, analyze_latencies_post, analyze_motions_layer, analyze_size_model
```

Notably, we avoid applying this formatting if the statement ends in a
comment. Otherwise, the comment would _need_ to be placed after the last
item, like:

```python
def f():
    global \
        analyze_featuremap_layer, \
        analyze_featuremapcompression_layer, \
        analyze_latencies_post, \
        analyze_motions_layer, \
        analyze_size_model  # noqa
```

To me, this seems wrong (and would break the `# noqa` comment). Ideally,
the items would be parenthesized, and the comment would be on the inner
parenthesis, like:

```python
def f():
    global (  # noqa
        analyze_featuremap_layer,
        analyze_featuremapcompression_layer,
        analyze_latencies_post,
        analyze_motions_layer,
        analyze_size_model
    )
```

But that's not valid syntax.
2023-08-02 19:55:00 +00:00
Victor Hugo Gomes
9f38dbd06e [flake8-pyi] Implement PYI051 (#6215)
## Summary
Checks for the presence of redundant `Literal` types and builtin super
types in a union. See [original
source](2a86db8271/pyi.py (L1261)).

This implementation differs from the original in two ways. First, we
support the `complex` and `float` builtin types. Second, when reporting
a diagnostic for a `Literal` with multiple members of the same type, we
print the entire `Literal`, while `flake8` only prints the `Literal`
with its first member.
For example:
```python
from typing import Literal

x: Literal[1, 2] | int
```  
Ruff will show `Literal[1, 2]` while flake8 only shows `Literal[1]`.

```shell
$ ruff crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:4:18: PYI051 `Literal["foo"]` is redundant in an union with `str`
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:5:37: PYI051 `Literal[b"bar", b"foo"]` is redundant in an union with `bytes`
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:6:37: PYI051 `Literal[5]` is redundant in an union with `int`
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:6:67: PYI051 `Literal["foo"]` is redundant in an union with `str`
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:7:37: PYI051 `Literal[b"str_bytes"]` is redundant in an union with `bytes`
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:7:51: PYI051 `Literal[42]` is redundant in an union with `int`
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:9:31: PYI051 `Literal[1J]` is redundant in an union with `complex`
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:9:53: PYI051 `Literal[3.14]` is redundant in an union with `float`
Found 8 errors.
```

```shell
$ flake8 crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:4:18: Y051 "Literal['foo']" is redundant in a union with "str"
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:5:37: Y051 "Literal[b'bar']" is redundant in a union with "bytes"
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:6:37: Y051 "Literal[5]" is redundant in a union with "int"
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:6:67: Y051 "Literal['foo']" is redundant in a union with "str"
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:7:37: Y051 "Literal[b'str_bytes']" is redundant in a union with "bytes"
crates/ruff/resources/test/fixtures/flake8_pyi/PYI051.pyi:7:51: Y051 "Literal[42]" is redundant in a union with "int"
```

While implementing this rule, I found a bug in the `is_unchecked_union`
check; the link below points to the new version.


1ab86bad35/crates/ruff/src/checkers/ast/analyze/expression.rs (L85-L102)

The purpose of the check was to prevent rules from descending into
nested `Union`s, since the rules already handle nesting themselves. As
implemented, this was not happening: the rules were executed more than
once and sometimes received expressions that were not `Union`s. For
example, with the following code:
 ```python
  typing.Union[Literal[5], int, typing.Union[Literal["foo"], str]]
 ```

The rules were receiving the expressions in the following order:
- `typing.Union[Literal[5], int, typing.Union[Literal["foo"], str]]`
     - `Literal[5]`
     - `typing.Union[Literal["foo"], str]]`

This was causing `PYI030` to report redundant information, for example:
 ```python
typing.Union[Literal[5], int, typing.Union[Literal["foo"],
Literal["bar"]]]
 ```
This is the `PYI030` output for this code:
```shell
PYI030 Multiple literal members in a union. Use a single literal, e.g. `Literal[5, "foo", "bar"]`
PYI030 Multiple literal members in a union. Use a single literal, e.g. `Literal[5, "foo"]`
```

If I haven't misinterpreted the rule, that looks incorrect. I didn't
have the time to check the `PYI016` rule.

Lastly, I couldn't find a rationale for the "Why is this bad?" section
for `PYI051`.

Ref: #848 

## Test Plan

Snapshots and manual runs of flake8.
2023-08-02 15:37:40 -04:00
Victor Hugo Gomes
7c5791fb77 Fix formatting of lambda star arguments (#6257)
## Summary
Previously, the ruff formatter was removing the star argument of
`lambda` expressions when formatting.

Given the following code snippet
```python
lambda *a: ()
lambda **b: ()
```
it would be formatted to
```python
lambda: ()
lambda: ()
```

We fix this by checking for the presence of `args`, `vararg`, or `kwarg`
in the `lambda` expression; previously, we only checked for the presence
of `args`.

Fixes #5894
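A small runnable sketch of why the old output was not behavior-preserving:

```python
# The original lambda accepts (and captures) arbitrary positional arguments:
f = lambda *a: a
assert f(1, 2) == (1, 2)

# The previously-formatted version rejects them:
g = lambda: ()
try:
    g(1, 2)
except TypeError:
    pass  # expected: the reformatted lambda takes no arguments
else:
    raise AssertionError("expected a TypeError")
```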

## Test Plan

Added new test cases.

---------

Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
2023-08-02 19:31:20 +00:00
Harutaka Kawamura
c362ea7fd4 Add PT025 and PT026 docs (#6264) 2023-08-02 19:00:03 +00:00
Harutaka Kawamura
ec8fad5b02 Extend UP032 to support implicitly concatenated strings (#6263) 2023-08-02 18:56:24 +00:00
Harutaka Kawamura
bcc41ba062 Extend UP032 to support repeated format fields (#6266) 2023-08-02 14:23:25 -04:00
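A sketch of the repeated-field case this extends UP032 to handle (hypothetical values):

```python
name = "ruff"

# A repeated positional field, now within UP032's scope:
old = "{0}-{0}".format(name)
# The suggested f-string form repeats the expression instead:
new = f"{name}-{name}"

assert old == new == "ruff-ruff"
```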
Charlie Marsh
556abf4bd3 Avoid PTH206 with maxsplit (#6283)
## Summary

Avoid suggesting `Path.parts` when a `maxsplit` is specified, since the
two behave differently.
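A runnable sketch of the divergence (using `PurePosixPath` so the separator is fixed):

```python
from pathlib import PurePosixPath

p = "a/b/c"

# Without maxsplit, splitting on the separator matches Path.parts,
# so the suggestion is equivalent:
assert p.split("/") == list(PurePosixPath(p).parts)

# With maxsplit, the two diverge, so the suggestion is now suppressed:
assert p.split("/", maxsplit=1) == ["a", "b/c"]
assert list(PurePosixPath(p).parts) == ["a", "b", "c"]
```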

## Test Plan

`cargo test`
2023-08-02 18:16:57 +00:00
Charlie Marsh
23b8fc4366 Move includes_arg_name onto Parameters (#6282)
## Summary

Like #6279, no reason for this to be a standalone method.
2023-08-02 18:05:26 +00:00
Charlie Marsh
fd40864924 Move find_keyword helpers onto Arguments struct (#6280)
## Summary

Similar to #6279, moving some helpers onto the struct in the name of
reducing the number of random undiscoverable utilities we have in
`helpers.rs`.

Most of the churn is migrating rules to take `ast::ExprCall` instead of
the spread call arguments.

## Test Plan

`cargo test`
2023-08-02 13:54:48 -04:00
Charlie Marsh
041946fb64 Remove CallArguments abstraction (#6279)
## Summary

This PR removes a now-unnecessary abstraction from `helper.rs`
(`CallArguments`), in favor of adding methods to `Arguments` directly,
which helps with discoverability.
2023-08-02 13:25:43 -04:00
Charlie Marsh
8a0f844642 Box type params and arguments fields on the class definition node (#6275)
## Summary

This PR boxes the `TypeParams` and `Arguments` fields on the class
definition node. These fields are optional and often omitted, and given
that class definition is our largest enum variant, we pay the cost of
including them for every statement in the AST. Boxing these types
reduces the statement size by 40 bytes, which seems like a good tradeoff
given how infrequently these are accessed.

## Test Plan

Need to benchmark, but no behavior changes.
2023-08-02 16:47:06 +00:00
Charlie Marsh
8c40886f87 Use Arguments node to power remove_argument method (#6278)
## Summary

Internal refactor to take advantage of the new `Arguments` node, to
power our `remove_argument` fix action.

## Test Plan

`cargo test`
2023-08-02 12:38:43 -04:00
Charlie Marsh
4c53bfe896 Add formatter support for call and class definition Arguments (#6274)
## Summary

This PR leverages the `Arguments` AST node introduced in #6259 in the
formatter, which ensures that we correctly handle trailing comments in
calls, like:

```python
f(
  1,
  # comment
)

pass
```

(Previously, this was treated as a leading comment on `pass`.)

This also allows us to unify the argument handling across calls and
class definitions.

## Test Plan

A bunch of new fixture tests, plus improved Black compatibility.
2023-08-02 11:54:22 -04:00
Charlie Marsh
b095b7204b Add a TypeParams node to the AST (#6261)
## Summary

Similar to #6259, this PR adds a `TypeParams` node to the AST, to
capture the list of type parameters with their surrounding brackets.

If a statement lacks type parameters, the `type_params` field will be
`None`.
2023-08-02 14:12:45 +00:00
Charlie Marsh
981e64f82b Introduce an Arguments AST node for function calls and class definitions (#6259)
## Summary

This PR adds a new `Arguments` AST node, which we can use for function
calls and class definitions.

The `Arguments` node spans from the left (open) to right (close)
parentheses inclusive.

In the case of classes, `Arguments` is optional, to differentiate
between:

```python
# None
class C: ...

# Some, with empty vectors
class C(): ...
```

In this PR, we don't really leverage this change (except that a few
rules get much simpler, since we don't need to lex to find the start and
end ranges of the parentheses, e.g.,
`crates/ruff/src/rules/pyupgrade/rules/lru_cache_without_parameters.rs`,
`crates/ruff/src/rules/pyupgrade/rules/unnecessary_class_parentheses.rs`).

In future PRs, this will be especially helpful for the formatter, since
we can track comments enclosed on the node itself.

## Test Plan

`cargo test`
2023-08-02 10:01:13 -04:00
Ran Benita
0d62ad2480 Permit ClassVar and Final without subscript in RUF012 (#6273)
Fix #6267.
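A sketch of the now-permitted forms (a hypothetical class), assuming the bare annotations behave like their subscripted counterparts:

```python
from typing import ClassVar, Final

class Registry:
    # Bare, unsubscripted annotations; previously RUF012 only recognized
    # the subscripted forms ClassVar[...] and Final[...]:
    entries: ClassVar = {}
    VERSION: Final = 1

assert Registry.entries == {}
assert Registry.VERSION == 1
```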
2023-08-02 12:58:44 +00:00
Harutaka Kawamura
b4f224ecea Fix links in docs (#6265)

## Summary


Before:

<img width="1031" alt="Screen Shot 2023-08-02 at 15 57 10"
src="https://github.com/astral-sh/ruff/assets/17039389/171a21d5-01a5-4aa5-8079-4e7f8a59ade8">

After:

<img width="1031" alt="Screen Shot 2023-08-02 at 15 58 03"
src="https://github.com/astral-sh/ruff/assets/17039389/afd1b9b7-89e0-4e38-a4a6-e3255b62f021">


## Test Plan


Manual inspection
2023-08-02 09:42:25 +02:00
Charlie Marsh
7842c82a0a Preserve end-of-line comments on import-from statements (#6216)
## Summary

Ensures that we keep comments at the end-of-line in cases like:

```python
from foo import (  # comment
  bar,
)
```

Closes https://github.com/astral-sh/ruff/issues/6067.
2023-08-01 18:58:05 +00:00
Charlie Marsh
9c708d8fc1 Rename Parameter#arg and ParameterWithDefault#def fields (#6255)
## Summary

This PR renames...

- `Parameter#arg` to `Parameter#name`
- `ParameterWithDefault#def` to `ParameterWithDefault#parameter` (such
that `ParameterWithDefault` has a `default` and a `parameter`)

## Test Plan

`cargo test`
2023-08-01 14:28:34 -04:00
Charlie Marsh
adc8bb7821 Rename Arguments to Parameters in the AST (#6253)
## Summary

This PR renames a few AST nodes for clarity:

- `Arguments` is now `Parameters`
- `Arg` is now `Parameter`
- `ArgWithDefault` is now `ParameterWithDefault`

For now, the attribute names that reference `Parameters` directly are
changed (e.g., on `StmtFunctionDef`), but the attributes on `Parameters`
itself are not (e.g., `vararg`). We may revisit that decision in the
future.

For context, the AST node formerly known as `Arguments` is used in
function definitions. Formally (outside of the Python context),
"arguments" typically refers to "the values passed to a function", while
"parameters" typically refers to "the variables used in a function
definition". E.g., if you Google "arguments vs parameters", you'll get
some explanation like:

> A parameter is a variable in a function definition. It is a
placeholder and hence does not have a concrete value. An argument is a
value passed during function invocation.

We're thus deviating from Python's nomenclature in favor of a scheme
that we find to be more precise.
2023-08-01 13:53:28 -04:00
Charlie Marsh
a82eb9544c Implement Black's rules around newlines before and after class docstrings (#6209)
## Summary

Black allows up to one blank line _before_ a class docstring, and
enforces one blank line _after_ a class docstring. This PR implements
that handling. The cases in
`crates/ruff_python_formatter/resources/test/fixtures/ruff/statement/class_definition.py`
match Black identically.
2023-08-01 13:33:01 -04:00
Zanie Blue
5e41f2fc7d Tweak pre-commit message (#6243)
Follow-up to https://github.com/astral-sh/ruff/pull/6153
2023-08-01 12:32:14 -05:00
konsti
1df7e9831b Replace .map_or(false, $closure) with .is_some_and(closure) (#6244)
**Summary**
[Option::is_some_and](https://doc.rust-lang.org/stable/std/option/enum.Option.html#method.is_some_and)
and
[Result::is_ok_and](https://doc.rust-lang.org/std/result/enum.Result.html#method.is_ok_and)
are new methods in Rust 1.70. I find them far more readable than
`.map_or(false, ...)`.

The changes are `s/.map_or(false,/.is_some_and(/g`, then manually
switching to `is_ok_and` where the value is a Result rather than an
Option.

**Test Plan** n/a
2023-08-01 19:29:42 +02:00
Zanie Blue
2e1754e5fc Update ecosystem checks for bokeh to 3.3 (#6249)
Bokeh 3.3 is planned for release this month
(https://github.com/bokeh/bokeh/issues/13207) and is their default
branch now
2023-08-01 11:56:58 -05:00
Zanie Blue
67b88803d8 Use prettier to format yaml files in pre-commit (#6250)
Prompted by
https://github.com/astral-sh/ruff/pull/6248#discussion_r1280855848
2023-08-01 16:45:08 +00:00
konsti
ed45fcb1f7 Remove old CI comment (#6246)
We don't build abi3 wheels
2023-08-01 11:35:47 -05:00
Zanie Blue
adf227b8a9 Run ecosystem checks on changes to ecosystem test script (#6248)
e.g. https://github.com/astral-sh/ruff/pull/6245 should probably run
checks before merge
2023-08-01 11:35:29 -05:00
Micha Reiser
debfca3a11 Remove Parse trait (#6235) 2023-08-01 18:35:03 +02:00
Charlie Marsh
83fe103d6e Allow generic tuple and list calls in __all__ (#6247)
## Summary

Allows, e.g., `__all__ = list[str]()`.

Closes https://github.com/astral-sh/ruff/issues/6226.
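A sketch of why this form is reasonable at runtime (using a hypothetical local name rather than a real module's `__all__`):

```python
# A subscripted generic like list[str] is still callable; the subscript
# is erased at runtime, so the call just produces an empty list:
exports = list[str]()
assert exports == []

exports.append("foo")
assert exports == ["foo"]
```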
2023-08-01 12:01:48 -04:00
Charlie Marsh
e08f873077 Add Poetry and FastAPI to ecosystem checks (#6245)
Poetry in particular would be useful to avoid issues like
https://github.com/astral-sh/ruff/issues/6233.
2023-08-01 11:48:34 -04:00
Charlie Marsh
928ab63a64 Add empty lines before nested functions and classes (#6206)
## Summary

This PR ensures that if a function or class is the first statement in a
nested suite that _isn't_ a function or class body, we insert a leading
newline.

For example, given:

```python
def f():
    if True:

        def register_type():
            pass
```

We _want_ to preserve the newline, whereas today, we remove it.

Note that this only applies when the function or class doesn't have any
leading comments.

Closes https://github.com/astral-sh/ruff/issues/6066.
2023-08-01 15:30:59 +00:00
Harutaka Kawamura
b68f76f0d9 Add pre-commit install in CONTRIBUTING.md (#6153)

## Summary


To enable the git hooks, `pre-commit install` needs to be executed.

## Test Plan


N/A
2023-08-01 09:28:50 -05:00
Charlie Marsh
1a85953129 Don't require docstrings in .pyi files (#6239)
Closes https://github.com/astral-sh/ruff/issues/6224.
2023-08-01 10:02:57 -04:00
Charlie Marsh
743118ae9a Bump version to 0.0.282 (#6241) 2023-08-01 13:21:33 +00:00
Charlie Marsh
0753017cf1 Revert "Expand scope of quoted-annotation rule (#5766)" (#6237)
This is causing some problems, so we'll just revert for now.

Closes https://github.com/astral-sh/ruff/issues/6189.
2023-08-01 09:03:02 -04:00
Charlie Marsh
29fb655e04 Fix logger-objects documentation (#6238)
Closes https://github.com/astral-sh/ruff/issues/6234.
2023-08-01 12:57:56 +00:00
Micha Reiser
f45e8645d7 Remove unused parser modes

## Summary

This PR removes the `Interactive` and `FunctionType` parser modes that are unused by ruff


## Test Plan

`cargo test`

2023-08-01 13:10:07 +02:00
Micha Reiser
7c7231db2e Remove unsupported type_comment field

## Summary

This PR removes the `type_comment` field which our parser doesn't support.


## Test Plan

`cargo test`

2023-08-01 12:53:13 +02:00
Micha Reiser
4ad5903ef6 Delete type-ignore node

## Summary

This PR removes the type ignore node from the AST because our parser doesn't support it, and just having it around is confusing.


## Test Plan

`cargo build`

2023-08-01 12:34:50 +02:00
konsti
c6986ac95d Consistent CommentPlacement conversion signatures (#6231)
**Summary** Allow passing any node to `CommentPlacement::{leading,
trailing, dangling}` without manually converting. Conversely, restrict
the comment argument to the only type we actually pass.

**Test Plan** No changes.
2023-08-01 12:01:17 +02:00
Micha Reiser
ecfdd8d58b Add static assertions to nodes (#6228) 2023-08-01 11:54:49 +02:00
David Szotten
07468f8be9 format ExprJoinedStr (#5932) 2023-08-01 08:26:30 +02:00
David Szotten
ba990b676f add DebugText for self-documenting f-strings (#6167) 2023-08-01 07:55:03 +02:00
Harutaka Kawamura
44a8d1c644 Add PT021, PT022 and PT023 docs (#6143) 2023-08-01 00:41:54 -04:00
Charlie Marsh
88b984e885 Avoid detecting continuations at non-start-of-line (#6219)
## Summary

Previously, given:

```python
a = \
  5;
```

When detecting continuations starting at the offset of the `;`, we'd
flag the previous line as a continuation. We should only flag a
continuation if there isn't leading content prior to the offset.

Closes https://github.com/astral-sh/ruff/issues/6214
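
The start-of-line check described above can be sketched in a few lines of Python (a hypothetical simplification; the real implementation works on the lexed tokens):

```python
def flags_continuation(source: str, offset: int) -> bool:
    """Hypothetical sketch: flag a continuation at `offset` only when
    the previous line ends with a backslash AND there is no leading
    content before `offset` on its own line."""
    line_start = source.rfind("\n", 0, offset) + 1
    if source[line_start:offset].strip():
        # Leading content prior to the offset: don't flag.
        return False
    return source[:line_start].rstrip("\n").endswith("\\")
```

With this check, an offset pointing at the `;` in the snippet above no longer flags the previous line, while an offset at the true start of the second line still does.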
2023-08-01 00:20:29 -04:00
Charlie Marsh
bf584c6d74 Remove use of SmallVec in unnecessary-literal-union (#6221)
I prefer to use this on an as-needed basis.
2023-08-01 04:03:58 +00:00
Konrad Listwan-Ciesielski
6ea3c178fd Add DTZ002 documentation (#6146)
## Summary

Adds documentation for DTZ002. Related to
https://github.com/astral-sh/ruff/issues/2646.

## Test Plan

`python scripts/test_docs_formatted.py`
2023-08-01 04:00:50 +00:00
Charlie Marsh
764d35667f Avoid PERF401 false positive on list access in loop (#6220)
Closes https://github.com/astral-sh/ruff/issues/6210.
2023-08-01 03:56:53 +00:00
Charlie Marsh
ff9ebbaa5f Skip trivia when searching for named exception (#6218)
Closes https://github.com/astral-sh/ruff/issues/6213.
2023-08-01 03:42:30 +00:00
Micha Reiser
38b5726948 formatter: WithNodeLevel helper (#6212) 2023-07-31 21:22:17 +00:00
Charlie Marsh
615337a54d Remove newline-insertion logic from JoinNodesBuilder (#6205)
## Summary

This PR moves the "insert empty lines" behavior out of
`JoinNodesBuilder` and into the `Suite` formatter. I find it a little
confusing that the logic is split between those two formatters right
now, and since this is _only_ used in that one place, IMO it is a bit
simpler to just inline it and use a single approach to tracking state
(right now, both are stateful).

The only other place this was used was for decorators. As a side effect,
we now remove blank lines in both of these cases, which is a known but
intentional deviation from Black (which preserves the empty line before
the comment in the first case):

```python
@foo

# Hello
@bar
def baz():
    pass

@foo

@bar
def baz():
    pass
```
2023-07-31 16:58:15 -04:00
Charlie Marsh
6ee5cb37c0 Reset model state when exiting deferred visitors (#6208)
## Summary

Very subtle bug related to the AST traversal. Given:

```python
from __future__ import annotations

from logging import getLogger

__all__ = ("getLogger",)


def foo() -> None:
    pass
```

We end up visiting the `-> None` annotation, then reusing the state
snapshot when we go to visit the `__all__` exports, so when we visit
`"getLogger"`, we think we're inside of a deferred type annotation.

This PR changes all the deferred visitors to snapshot and restore the
state, which is a lot safer -- that way, the visitors avoid modifying
the current visitor state. (Previously, they implicitly left the visitor
state set to the state of the _last_ thing they visited.)

Closes https://github.com/astral-sh/ruff/issues/6207.
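
The snapshot-and-restore pattern can be sketched as a minimal Python analogue of the Rust visitor (names and structure here are hypothetical, for illustration only):

```python
class Checker:
    """Minimal analogue of the semantic-model state handling."""

    def __init__(self):
        self.in_annotation = False
        self.deferred = []  # (snapshot, node) pairs

    def snapshot(self):
        return {"in_annotation": self.in_annotation}

    def restore(self, state):
        self.in_annotation = state["in_annotation"]

    def defer(self, node):
        # Remember the state the node was *collected* in.
        self.deferred.append((self.snapshot(), node))

    def visit_deferred(self, visit):
        # The fix: save the entry state and restore it afterwards, so
        # the visitor never leaves the state of the *last* deferred node.
        saved = self.snapshot()
        for state, node in self.deferred:
            self.restore(state)
            visit(node)
        self.restore(saved)
```

Each deferred node is visited with the state it was collected in, and the visitor's own state is unchanged once the loop finishes.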
2023-07-31 19:46:52 +00:00
konsti
0fddb31235 Use tracing for format_dev (#6177)
## Summary

[tracing](https://github.com/tokio-rs/tracing) is a library for logging,
tracing and related features that has a large ecosystem. Using
[tracing-subscriber](https://docs.rs/tracing-subscriber) and
[tracing-indicatif](https://github.com/emersonford/tracing-indicatif),
we get a nice logging output that you can configure with `RUST_LOG`
(e.g. `RUST_LOG=debug`) and a live look into the formatter progress.

Default:
![Screenshot from 2023-07-30
13-59-53](https://github.com/astral-sh/ruff/assets/6826232/6432f835-9ff1-4771-955b-398e54c406dc)

`RUST_LOG=debug`:
![Screenshot from 2023-07-30
14-01-32](https://github.com/astral-sh/ruff/assets/6826232/5f2c87da-0867-4159-82e7-b5757eebb8eb)

It's easy to see in this output which files take a disproportionate
amount of time.

[Peek 2023-07-30
14-35.webm](https://github.com/astral-sh/ruff/assets/6826232/2c92db5c-1354-465b-a6bc-ddfb281d6f9d)

It opens up further integration with the tracing ecosystem,
[tracing-timing](https://docs.rs/tracing-timing/latest/tracing_timing/)
and [tokio-console](https://github.com/tokio-rs/console) can e.g. show
histograms, and the JSON output allows us to build better pipelines than
grepping a log file.

One caveat is that we use `parent: None` for the logging statements, because
tracing-subscriber does not allow deactivating the span without also
reimplementing all the other log message formatting. We don't need span
information anyway, especially since it would currently show the progress
bar span.

## Test Plan

n/a
2023-07-31 19:14:01 +00:00
konsti
a7aa3caaae Rename formatter_progress to formatter_ecosystem_checks (#6194)
Rename `scripts/formatter_progress.sh` to
`scripts/formatter_ecosystem_checks.sh`, since that name fits the actual
task better.
2023-07-31 18:33:12 +00:00
konsti
e52b636da0 Log configuration in ruff_dev (#6193)
**Summary** This includes two changes:
 * Allow setting `-v` in `ruff_dev`, using the `ruff_cli` implementation
 * Log, via `debug!`, which Ruff configuration strategy was used

This is a byproduct of debugging #6187.

**Test Plan** n/a
2023-07-31 17:52:38 +00:00
konsti
9063f4524d Fix formatting of trailing unescaped quotes in raw triple quoted strings (#6202)
**Summary** This prevents us from turning `r'''\""'''` into
`r"""\"""""`, which is invalid syntax.

This PR fixes CI, which is currently broken on main (in a way that still
passes on linter PRs and allows merging formatter PRs, but it's bad to
have a job be red). Once merged, I'll make the formatter ecosystem
checks a required check.

**Test Plan** Added a regression test.
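
The constraint can be sketched as a predicate on the raw string body (a hypothetical simplification of the quote-normalization logic): since raw strings cannot escape quotes, switching a triple-quoted raw string to another quote character is only safe when the body neither contains a run of three of that character nor ends with it.

```python
def can_requote_raw_triple(body: str, quote: str = '"') -> bool:
    """Hypothetical check: a raw triple-quoted string can use `quote`
    only if the body has no run of three quotes and doesn't end with
    one (a trailing quote would glue onto the closing delimiter)."""
    return quote * 3 not in body and not body.endswith(quote)
```

The body of `r'''\""'''` is `\""`, which ends with `"`, so the formatter must keep the single quotes rather than produce the invalid `r"""\"""""`.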
2023-07-31 19:25:16 +02:00
599 changed files with 42362 additions and 40613 deletions

1
.github/release.yml vendored
View File

@@ -4,6 +4,7 @@ changelog:
labels:
- internal
- documentation
- formatter
categories:
- title: Breaking Changes
labels:

View File

@@ -3,12 +3,12 @@ name: Benchmark
on:
pull_request:
paths:
- 'Cargo.toml'
- 'Cargo.lock'
- 'rust-toolchain'
- 'crates/**'
- '!crates/ruff_dev'
- '!crates/ruff_shrinking'
- "Cargo.toml"
- "Cargo.lock"
- "rust-toolchain"
- "crates/**"
- "!crates/ruff_dev"
- "!crates/ruff_shrinking"
workflow_dispatch:
@@ -22,7 +22,7 @@ jobs:
name: "Run | ${{ matrix.os }}"
strategy:
matrix:
os: [ ubuntu-latest, windows-latest ]
os: [ubuntu-latest, windows-latest]
runs-on: ${{ matrix.os }}
steps:

View File

@@ -2,7 +2,7 @@ name: CI
on:
push:
branches: [ main ]
branches: [main]
pull_request:
workflow_dispatch:
@@ -16,7 +16,7 @@ env:
CARGO_TERM_COLOR: always
RUSTUP_MAX_RETRIES: 10
PACKAGE_NAME: ruff
PYTHON_VERSION: "3.11" # to build abi3 wheels
PYTHON_VERSION: "3.11"
jobs:
determine_changes:
@@ -42,6 +42,7 @@ jobs:
- "!crates/ruff_formatter/**"
- "!crates/ruff_dev/**"
- "!crates/ruff_shrinking/**"
- scripts/check_ecosystem.py
formatter:
- Cargo.toml
@@ -54,7 +55,7 @@ jobs:
- crates/ruff_python_index/**
- crates/ruff_text_size/**
- crates/ruff_python_parser/**
- crates/ruff_dev/**
cargo-fmt:
name: "cargo fmt"
@@ -83,7 +84,7 @@ jobs:
cargo-test:
strategy:
matrix:
os: [ ubuntu-latest, windows-latest ]
os: [ubuntu-latest, windows-latest]
runs-on: ${{ matrix.os }}
name: "cargo test | ${{ matrix.os }}"
steps:
@@ -235,7 +236,6 @@ jobs:
- name: "Run cargo-udeps"
run: cargo +nightly-2023-06-08 udeps
python-package:
name: "python package"
runs-on: ubuntu-latest
@@ -335,9 +335,9 @@ jobs:
- name: "Cache rust"
uses: Swatinem/rust-cache@v2
- name: "Formatter progress"
run: scripts/formatter_progress.sh
run: scripts/formatter_ecosystem_checks.sh
- name: "Github step summary"
run: grep "similarity index" target/progress_projects_report.txt | sort > $GITHUB_STEP_SUMMARY
run: grep "similarity index" target/progress_projects_log.txt | sort > $GITHUB_STEP_SUMMARY
# CPython is not black formatted, so we run only the stability check
- name: "Clone CPython 3.10"
run: git clone --branch 3.10 --depth 1 https://github.com/python/cpython.git crates/ruff/resources/test/cpython

View File

@@ -3,7 +3,7 @@ name: mkdocs
on:
workflow_dispatch:
release:
types: [ published ]
types: [published]
jobs:
mkdocs:

View File

@@ -66,7 +66,7 @@ jobs:
runs-on: windows-latest
strategy:
matrix:
target: [ x64, x86 ]
target: [x64, x86]
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
@@ -94,7 +94,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
target: [ x86_64, i686 ]
target: [x86_64, i686]
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
@@ -121,7 +121,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
target: [ aarch64, armv7, s390x, ppc64le, ppc64 ]
target: [aarch64, armv7, s390x, ppc64le, ppc64]
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4

View File

@@ -3,7 +3,7 @@ name: "[Playground] Release"
on:
workflow_dispatch:
release:
types: [ published ]
types: [published]
env:
CARGO_INCREMENTAL: 0

View File

@@ -2,8 +2,8 @@ name: PR Check Comment
on:
workflow_run:
workflows: [ CI, Benchmark ]
types: [ completed ]
workflows: [CI, Benchmark]
types: [completed]
workflow_dispatch:
inputs:
workflow_run_id:

View File

@@ -42,13 +42,13 @@ repos:
name: cargo fmt
entry: cargo fmt --
language: system
types: [ rust ]
types: [rust]
pass_filenames: false # This makes it a lot faster
- id: ruff
name: ruff
entry: cargo run --bin ruff -- check --no-cache --force-exclude --fix --exit-non-zero-on-fix
language: system
types_or: [ python, pyi ]
types_or: [python, pyi]
require_serial: true
exclude: |
(?x)^(
@@ -62,5 +62,12 @@ repos:
hooks:
- id: black
# Prettier
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v3.0.0
hooks:
- id: prettier
types: [yaml]
ci:
skip: [ cargo-fmt, dev-generate-all ]
skip: [cargo-fmt, dev-generate-all]

View File

@@ -69,6 +69,13 @@ and pre-commit to run some validation checks:
pipx install pre-commit # or `pip install pre-commit` if you have a virtualenv
```
You can optionally install pre-commit hooks to automatically run the validation checks
when making a commit:
```shell
pre-commit install
```
### Development
After cloning the repository, run Ruff locally from the repository root with:

63
Cargo.lock generated
View File

@@ -133,6 +133,12 @@ dependencies = [
"os_str_bytes",
]
[[package]]
name = "arrayvec"
version = "0.7.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "96d30a06541fbafbc7f82ed10c06164cfbd2c401138f6addd8404629c4b16711"
[[package]]
name = "ascii-canvas"
version = "3.0.0"
@@ -794,7 +800,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
[[package]]
name = "flake8-to-ruff"
version = "0.0.281"
version = "0.0.282"
dependencies = [
"anyhow",
"clap",
@@ -1027,6 +1033,7 @@ dependencies = [
"number_prefix",
"portable-atomic",
"unicode-width",
"vt100",
]
[[package]]
@@ -2035,7 +2042,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.0.281"
version = "0.0.282"
dependencies = [
"annotate-snippets 0.9.1",
"anyhow",
@@ -2134,7 +2141,7 @@ dependencies = [
[[package]]
name = "ruff_cli"
version = "0.0.281"
version = "0.0.282"
dependencies = [
"annotate-snippets 0.9.1",
"anyhow",
@@ -2194,7 +2201,6 @@ dependencies = [
"indoc",
"itertools",
"libcst",
"log",
"once_cell",
"pretty_assertions",
"rayon",
@@ -2218,6 +2224,9 @@ dependencies = [
"strum_macros",
"tempfile",
"toml",
"tracing",
"tracing-indicatif",
"tracing-subscriber",
]
[[package]]
@@ -2282,6 +2291,7 @@ dependencies = [
"rustc-hash",
"serde",
"smallvec",
"static_assertions",
]
[[package]]
@@ -3115,6 +3125,18 @@ dependencies = [
"valuable",
]
[[package]]
name = "tracing-indicatif"
version = "0.3.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b38ed3722d27705c3bd7ca0ccf29acc3d8e1c717b4cd87f97891a2c1834ea1af"
dependencies = [
"indicatif",
"tracing",
"tracing-core",
"tracing-subscriber",
]
[[package]]
name = "tracing-log"
version = "0.1.3"
@@ -3313,6 +3335,39 @@ version = "0.9.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "49874b5167b65d7193b8aba1567f5c7d93d001cafc34600cee003eda787e483f"
[[package]]
name = "vt100"
version = "0.15.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "84cd863bf0db7e392ba3bd04994be3473491b31e66340672af5d11943c6274de"
dependencies = [
"itoa",
"log",
"unicode-width",
"vte",
]
[[package]]
name = "vte"
version = "0.11.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f5022b5fbf9407086c180e9557be968742d839e68346af7792b8592489732197"
dependencies = [
"arrayvec",
"utf8parse",
"vte_generate_state_changes",
]
[[package]]
name = "vte_generate_state_changes"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d257817081c7dffcdbab24b9e62d2def62e2ff7d00b1c20062551e6cccc145ff"
dependencies = [
"proc-macro2",
"quote",
]
[[package]]
name = "wait-timeout"
version = "0.2.0"

View File

@@ -4,7 +4,7 @@ resolver = "2"
[workspace.package]
edition = "2021"
rust-version = "1.70"
rust-version = "1.71"
homepage = "https://beta.ruff.rs/docs"
documentation = "https://beta.ruff.rs/docs"
repository = "https://github.com/astral-sh/ruff"
@@ -46,6 +46,9 @@ syn = { version = "2.0.15" }
test-case = { version = "3.0.0" }
thiserror = { version = "1.0.43" }
toml = { version = "0.7.2" }
tracing = "0.1.37"
tracing-indicatif = "0.3.4"
tracing-subscriber = { version = "0.3.17", features = ["env-filter"] }
wsl = { version = "0.1.0" }
# v1.0.1

View File

@@ -140,7 +140,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com) hook:
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.281
rev: v0.0.282
hooks:
- id: ruff
```

View File

@@ -1,6 +1,6 @@
[package]
name = "flake8-to-ruff"
version = "0.0.281"
version = "0.0.282"
description = """
Convert Flake8 configuration files to Ruff configuration files.
"""

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.0.281"
version = "0.0.282"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -67,7 +67,8 @@ cfg.getboolean("hello", True)
os.set_blocking(0, False)
g_action.set_enabled(True)
settings.set_enable_developer_extras(True)
foo.is_(True)
bar.is_not(False)
class Registry:
def __init__(self) -> None:

View File

@@ -0,0 +1,42 @@
from typing import TypeVar, Self, Type
_S = TypeVar("_S", bound=BadClass)
_S2 = TypeVar("_S2", BadClass, GoodClass)
class BadClass:
def __new__(cls: type[_S], *args: str, **kwargs: int) -> _S: ... # PYI019
def bad_instance_method(self: _S, arg: bytes) -> _S: ... # PYI019
@classmethod
def bad_class_method(cls: type[_S], arg: int) -> _S: ... # PYI019
@classmethod
def excluded_edge_case(cls: Type[_S], arg: int) -> _S: ... # Ok
class GoodClass:
def __new__(cls: type[Self], *args: list[int], **kwargs: set[str]) -> Self: ...
def good_instance_method_1(self: Self, arg: bytes) -> Self: ...
def good_instance_method_2(self, arg1: _S2, arg2: _S2) -> _S2: ...
@classmethod
def good_cls_method_1(cls: type[Self], arg: int) -> Self: ...
@classmethod
def good_cls_method_2(cls, arg1: _S, arg2: _S) -> _S: ...
@staticmethod
def static_method(arg1: _S) -> _S: ...
# Python > 3.12
class PEP695BadDunderNew[T]:
def __new__[S](cls: type[S], *args: Any, ** kwargs: Any) -> S: ... # PYI019
def generic_instance_method[S](self: S) -> S: ... # PYI019
class PEP695GoodDunderNew[T]:
def __new__(cls, *args: Any, **kwargs: Any) -> Self: ...

View File

@@ -0,0 +1,42 @@
from typing import TypeVar, Self, Type
_S = TypeVar("_S", bound=BadClass)
_S2 = TypeVar("_S2", BadClass, GoodClass)
class BadClass:
def __new__(cls: type[_S], *args: str, **kwargs: int) -> _S: ... # PYI019
def bad_instance_method(self: _S, arg: bytes) -> _S: ... # PYI019
@classmethod
def bad_class_method(cls: type[_S], arg: int) -> _S: ... # PYI019
@classmethod
def excluded_edge_case(cls: Type[_S], arg: int) -> _S: ... # Ok
class GoodClass:
def __new__(cls: type[Self], *args: list[int], **kwargs: set[str]) -> Self: ...
def good_instance_method_1(self: Self, arg: bytes) -> Self: ...
def good_instance_method_2(self, arg1: _S2, arg2: _S2) -> _S2: ...
@classmethod
def good_cls_method_1(cls: type[Self], arg: int) -> Self: ...
@classmethod
def good_cls_method_2(cls, arg1: _S, arg2: _S) -> _S: ...
@staticmethod
def static_method(arg1: _S) -> _S: ...
# Python > 3.12
class PEP695BadDunderNew[T]:
def __new__[S](cls: type[S], *args: Any, ** kwargs: Any) -> S: ... # PYI019
def generic_instance_method[S](self: S) -> S: ... # PYI019
class PEP695GoodDunderNew[T]:
def __new__(cls, *args: Any, **kwargs: Any) -> Self: ...

View File

@@ -1,9 +1,11 @@
import collections
person: collections.namedtuple # OK
person: collections.namedtuple # Y024 Use "typing.NamedTuple" instead of "collections.namedtuple"
from collections import namedtuple
person: namedtuple # OK
person: namedtuple # Y024 Use "typing.NamedTuple" instead of "collections.namedtuple"
person = namedtuple("Person", ["name", "age"]) # OK
person = namedtuple(
"Person", ["name", "age"]
) # Y024 Use "typing.NamedTuple" instead of "collections.namedtuple"

View File

@@ -1,24 +1,38 @@
import typing
import typing_extensions
from typing import Literal
# Shouldn't emit for any cases in the non-stub file for compatibility with flake8-pyi.
# Note that this rule could be applied here in the future.
# Shouldn't affect non-union field types.
field1: Literal[1] # OK
field2: Literal[1] | Literal[2] # OK
def func1(arg1: Literal[1] | Literal[2]): # OK
# Should emit for duplicate field types.
field2: Literal[1] | Literal[2] # Error
# Should emit for union types in arguments.
def func1(arg1: Literal[1] | Literal[2]): # Error
print(arg1)
def func2() -> Literal[1] | Literal[2]: # OK
# Should emit for unions in return types.
def func2() -> Literal[1] | Literal[2]: # Error
return "my Literal[1]ing"
field3: Literal[1] | Literal[2] | str # OK
field4: str | Literal[1] | Literal[2] # OK
field5: Literal[1] | str | Literal[2] # OK
field6: Literal[1] | bool | Literal[2] | str # OK
field7 = Literal[1] | Literal[2] # OK
field8: Literal[1] | (Literal[2] | str) # OK
field9: Literal[1] | (Literal[2] | str) # OK
field10: (Literal[1] | str) | Literal[2] # OK
field11: dict[Literal[1] | Literal[2], str] # OK
# Should emit in longer unions, even if not directly adjacent.
field3: Literal[1] | Literal[2] | str # Error
field4: str | Literal[1] | Literal[2] # Error
field5: Literal[1] | str | Literal[2] # Error
field6: Literal[1] | bool | Literal[2] | str # Error
# Should emit for non-type unions.
field7 = Literal[1] | Literal[2] # Error
# Should emit for parenthesized unions.
field8: Literal[1] | (Literal[2] | str) # Error
# Should handle user parentheses when fixing.
field9: Literal[1] | (Literal[2] | str) # Error
field10: (Literal[1] | str) | Literal[2] # Error
# Should emit for union in generic parent type.
field11: dict[Literal[1] | Literal[2], str] # Error

View File

@@ -3,8 +3,8 @@ import typing
class Bad:
def __eq__(self, other: Any) -> bool: ... # Fine because not a stub file
def __ne__(self, other: typing.Any) -> typing.Any: ... # Fine because not a stub file
def __eq__(self, other: Any) -> bool: ... # Y032
def __ne__(self, other: typing.Any) -> typing.Any: ... # Y032
class Good:

View File

@@ -9,16 +9,16 @@ from typing import (
just_literals_pipe_union: TypeAlias = (
Literal[True] | Literal["idk"]
) # not PYI042 (not a stubfile)
) # PYI042, since not camel case
PublicAliasT: TypeAlias = str | int
PublicAliasT2: TypeAlias = Union[str, bytes]
_ABCDEFGHIJKLMNOPQRST: TypeAlias = typing.Any
_PrivateAliasS: TypeAlias = Literal["I", "guess", "this", "is", "okay"]
_PrivateAliasS2: TypeAlias = Annotated[str, "also okay"]
snake_case_alias1: TypeAlias = str | int # not PYI042 (not a stubfile)
_snake_case_alias2: TypeAlias = Literal["whatever"] # not PYI042 (not a stubfile)
Snake_case_alias: TypeAlias = int | float # not PYI042 (not a stubfile)
snake_case_alias1: TypeAlias = str | int # PYI042, since not camel case
_snake_case_alias2: TypeAlias = Literal["whatever"] # PYI042, since not camel case
Snake_case_alias: TypeAlias = int | float # PYI042, since not camel case
# check that this edge case doesn't crash
_: TypeAlias = str | int

View File

@@ -7,11 +7,11 @@ from typing import (
Literal,
)
_PrivateAliasT: TypeAlias = str | int # not PYI043 (not a stubfile)
_PrivateAliasT2: TypeAlias = typing.Any # not PYI043 (not a stubfile)
_PrivateAliasT: TypeAlias = str | int # PYI043, since this ends in a T
_PrivateAliasT2: TypeAlias = typing.Any # PYI043, since this ends in a T
_PrivateAliasT3: TypeAlias = Literal[
"not", "a", "chance"
] # not PYI043 (not a stubfile)
] # PYI043, since this ends in a T
just_literals_pipe_union: TypeAlias = Literal[True] | Literal["idk"]
PublicAliasT: TypeAlias = str | int
PublicAliasT2: TypeAlias = Union[str, bytes]

View File

@@ -0,0 +1,17 @@
import typing
from typing import Literal, TypeAlias, Union
A: str | Literal["foo"]
B: TypeAlias = typing.Union[Literal[b"bar", b"foo"], bytes, str]
C: TypeAlias = typing.Union[Literal[5], int, typing.Union[Literal["foo"], str]]
D: TypeAlias = typing.Union[Literal[b"str_bytes", 42], bytes, int]
def func(x: complex | Literal[1J], y: Union[Literal[3.14], float]): ...
# OK
A: Literal["foo"]
B: TypeAlias = Literal[b"bar", b"foo"]
C: TypeAlias = typing.Union[Literal[5], Literal["foo"]]
D: TypeAlias = Literal[b"str_bytes", 42]
def func(x: Literal[1J], y: Literal[3.14]): ...

View File

@@ -0,0 +1,17 @@
import typing
from typing import Literal, TypeAlias, Union
A: str | Literal["foo"]
B: TypeAlias = typing.Union[Literal[b"bar", b"foo"], bytes, str]
C: TypeAlias = typing.Union[Literal[5], int, typing.Union[Literal["foo"], str]]
D: TypeAlias = typing.Union[Literal[b"str_bytes", 42], bytes, int]
def func(x: complex | Literal[1J], y: Union[Literal[3.14], float]): ...
# OK
A: Literal["foo"]
B: TypeAlias = Literal[b"bar", b"foo"]
C: TypeAlias = typing.Union[Literal[5], Literal["foo"]]
D: TypeAlias = Literal[b"str_bytes", 42]
def func(x: Literal[1J], y: Literal[3.14]): ...

View File

@@ -0,0 +1,20 @@
import builtins
from typing import Union
w: builtins.type[int] | builtins.type[str] | builtins.type[complex]
x: type[int] | type[str] | type[float]
y: builtins.type[int] | type[str] | builtins.type[complex]
z: Union[type[float], type[complex]]
z: Union[type[float, int], type[complex]]
def func(arg: type[int] | str | type[float]) -> None: ...
# OK
x: type[int, str, float]
y: builtins.type[int, str, complex]
z: Union[float, complex]
def func(arg: type[int, float] | str) -> None: ...

View File

@@ -0,0 +1,20 @@
import builtins
from typing import Union
w: builtins.type[int] | builtins.type[str] | builtins.type[complex]
x: type[int] | type[str] | type[float]
y: builtins.type[int] | type[str] | builtins.type[complex]
z: Union[type[float], type[complex]]
z: Union[type[float, int], type[complex]]
def func(arg: type[int] | str | type[float]) -> None: ...
# OK
x: type[int, str, float]
y: builtins.type[int, str, complex]
z: Union[float, complex]
def func(arg: type[int, float] | str) -> None: ...

View File

@@ -0,0 +1,10 @@
"""Regression test: ensure that we don't treat the export entry as a typing-only reference."""
from __future__ import annotations
from logging import getLogger
__all__ = ("getLogger",)
def foo() -> None:
pass

View File

@@ -18,3 +18,5 @@ file_name.split(os.sep)
# OK
"foo/bar/".split("/")
"foo/bar/".split(os.sep, 1)
"foo/bar/".split(1, sep=os.sep)

View File

@@ -45,3 +45,18 @@ def f():
for i in items:
if i not in result:
result.append(i) # OK
def f():
fibonacci = [0, 1]
for i in range(20):
fibonacci.append(sum(fibonacci[-2:])) # OK
print(fibonacci)
def f():
foo = object()
foo.fibonacci = [0, 1]
for i in range(20):
foo.fibonacci.append(sum(foo.fibonacci[-2:])) # OK
print(foo.fibonacci)

View File

@@ -66,3 +66,6 @@ while 1:
#: E703:2:1
0\
;
#: E701:2:3
a = \
5;

View File

@@ -58,3 +58,6 @@ assert type(res) == type(None)
types = StrEnum
if x == types.X:
pass
#: E721
assert type(res) is int

View File

@@ -147,3 +147,10 @@ def f() -> None:
global CONSTANT
CONSTANT = 1
CONSTANT = 2
def f() -> None:
try:
print("hello")
except A as e :
print("oh no!")

View File

@@ -0,0 +1,29 @@
# pylint: disable=missing-docstring,consider-using-f-string, pointless-statement
## Old style formatting
"%s %z" % ("hello", "world") # [bad-format-character]
"%s" "%z" % ("hello", "world") # [bad-format-character]
"""%s %z""" % ("hello", "world") # [bad-format-character]
"""%s""" """%z""" % ("hello", "world") # [bad-format-character]
## New style formatting
"{:s} {:y}".format("hello", "world") # [bad-format-character]
"{:*^30s}".format("centered")
## f-strings
H, W = "hello", "world"
f"{H} {W}"
f"{H:s} {W:z}" # [bad-format-character]
f"{1:z}" # [bad-format-character]
## False negatives
print(("%" "z") % 1)

View File

@@ -40,3 +40,4 @@ __all__ = __all__ + ["Hello"]
__all__ = __all__ + multiprocessing.__all__
__all__ = list[str](["Hello", "world"])

View File

@@ -6,8 +6,12 @@
"{1} {0}".format(a, b)
"{0} {1} {0}".format(a, b)
"{x.y}".format(x=z)
"{x} {y} {x}".format(x=a, y=b)
"{.x} {.y}".format(a, b)
"{} {}".format(a.b, c.d)
@@ -72,6 +76,58 @@ aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
111111
)
"{a}" "{b}".format(a=1, b=1)
(
"{a}"
"{b}"
).format(a=1, b=1)
(
"{a}"
""
"{b}"
""
).format(a=1, b=1)
(
(
# comment
"{a}"
# comment
"{b}"
)
# comment
.format(a=1, b=1)
)
(
"{a}"
"b"
).format(a=1)
def d(osname, version, release):
return"{}-{}.{}".format(osname, version, release)
def e():
yield"{}".format(1)
assert"{}".format(1)
async def c():
return "{}".format(await 3)
async def c():
return "{}".format(1 + await 3)
"{}".format(1 * 2)
###
# Non-errors
###
@@ -85,8 +141,6 @@ aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
"{} {}".format(*a)
"{0} {0}".format(arg)
"{x} {x}".format(arg)
"{x.y} {x.z}".format(arg)
@@ -103,8 +157,6 @@ b"{} {}".format(a, b)
r'"\N{snowman} {}".format(a)'
"{a}" "{b}".format(a=1, b=1)
"123456789 {}".format(
11111111111111111111111111111111111111111111111111111111111111111111111111,
)
@@ -140,20 +192,9 @@ aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
]
)
async def c():
return "{}".format(await 3)
(
"{a}"
"{1 + 2}"
).format(a=1)
async def c():
return "{}".format(1 + await 3)
def d(osname, version, release):
return"{}-{}.{}".format(osname, version, release)
def e():
yield"{}".format(1)
assert"{}".format(1)
"{}".format(**c)

View File

@@ -1,5 +1,3 @@
"""A mirror of UP037_1.py, with `from __future__ import annotations`."""
from __future__ import annotations
from typing import (

View File

@@ -1,108 +0,0 @@
"""A mirror of UP037_0.py, without `from __future__ import annotations`."""
from typing import (
Annotated,
Callable,
List,
Literal,
NamedTuple,
Tuple,
TypeVar,
TypedDict,
cast,
)
from mypy_extensions import Arg, DefaultArg, DefaultNamedArg, NamedArg, VarArg
def foo(var: "MyClass") -> "MyClass":
x: "MyClass"
def foo(*, inplace: "bool"):
pass
def foo(*args: "str", **kwargs: "int"):
pass
x: Tuple["MyClass"]
x: Callable[["MyClass"], None]
class Foo(NamedTuple):
x: "MyClass"
class D(TypedDict):
E: TypedDict("E", foo="int", total=False)
class D(TypedDict):
E: TypedDict("E", {"foo": "int"})
x: Annotated["str", "metadata"]
x: Arg("str", "name")
x: DefaultArg("str", "name")
x: NamedArg("str", "name")
x: DefaultNamedArg("str", "name")
x: DefaultNamedArg("str", name="name")
x: VarArg("str")
x: List[List[List["MyClass"]]]
x: NamedTuple("X", [("foo", "int"), ("bar", "str")])
x: NamedTuple("X", fields=[("foo", "int"), ("bar", "str")])
x: NamedTuple(typename="X", fields=[("foo", "int")])
X: MyCallable("X")
# OK
class D(TypedDict):
E: TypedDict("E")
x: Annotated[()]
x: DefaultNamedArg(name="name", quox="str")
x: DefaultNamedArg(name="name")
x: NamedTuple("X", [("foo",), ("bar",)])
x: NamedTuple("X", ["foo", "bar"])
x: NamedTuple()
x: Literal["foo", "bar"]
x = cast(x, "str")
def foo(x, *args, **kwargs):
...
def foo(*, inplace):
...
x: Annotated[1:2] = ...
x = TypeVar("x", "str", "int")
x = cast("str", x)
X = List["MyClass"]

View File

@@ -0,0 +1,16 @@
import typing
from typing import TypeAlias
# UP040
x: typing.TypeAlias = int
x: TypeAlias = int
# UP040 with generics (todo)
T = typing.TypeVar["T"]
x: typing.TypeAlias = list[T]
# OK
x: TypeAlias
x: int = 1

View File

@@ -11,6 +11,8 @@ class A:
without_annotation = []
class_variable: ClassVar[list[int]] = []
final_variable: Final[list[int]] = []
class_variable_without_subscript: ClassVar = []
final_variable_without_subscript: Final = []
from dataclasses import dataclass, field

View File

@@ -1,14 +1,15 @@
//! Interface for generating autofix edits from higher-level actions (e.g., "remove an argument").
use anyhow::{bail, Result};
use ruff_python_ast::{self as ast, ExceptHandler, Expr, Keyword, Ranged, Stmt};
use ruff_python_parser::{lexer, Mode};
use ruff_text_size::{TextLen, TextRange, TextSize};
use ruff_diagnostics::Edit;
use ruff_python_ast::{self as ast, Arguments, ExceptHandler, Expr, Keyword, Ranged, Stmt};
use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::{lexer, Mode};
use ruff_python_trivia::{has_leading_content, is_python_whitespace, PythonWhitespace};
use ruff_source_file::{Locator, NewlineWithTrailingNewline};
use ruff_text_size::{TextLen, TextRange, TextSize};
use crate::autofix::codemods;
@@ -68,105 +69,101 @@ pub(crate) fn remove_unused_imports<'a>(
}
}
#[derive(Debug, Copy, Clone)]
pub(crate) enum Parentheses {
/// Remove parentheses, if the removed argument is the only argument left.
Remove,
/// Preserve parentheses, even if the removed argument is the only argument
Preserve,
}
/// Generic function to remove arguments or keyword arguments in function
/// calls and class definitions. (For classes `args` should be considered
/// `bases`)
///
/// Supports the removal of parentheses when this is the only (kw)arg left.
/// For this behavior, set `remove_parentheses` to `true`.
pub(crate) fn remove_argument(
pub(crate) fn remove_argument<T: Ranged>(
argument: &T,
arguments: &Arguments,
parentheses: Parentheses,
locator: &Locator,
call_at: TextSize,
expr_range: TextRange,
args: &[Expr],
keywords: &[Keyword],
remove_parentheses: bool,
) -> Result<Edit> {
// TODO(sbrugman): Preserve trailing comments.
let contents = locator.after(call_at);
if arguments.keywords.len() + arguments.args.len() > 1 {
let mut fix_start = None;
let mut fix_end = None;
let mut fix_start = None;
let mut fix_end = None;
let n_arguments = keywords.len() + args.len();
if n_arguments == 0 {
bail!("No arguments or keywords to remove");
}
if n_arguments == 1 {
// Case 1: there is only one argument.
let mut count = 0u32;
for (tok, range) in lexer::lex_starts_at(contents, Mode::Module, call_at).flatten() {
if tok.is_lpar() {
if count == 0 {
fix_start = Some(if remove_parentheses {
range.start()
} else {
range.start() + TextSize::from(1)
});
}
count = count.saturating_add(1);
}
if tok.is_rpar() {
count = count.saturating_sub(1);
if count == 0 {
fix_end = Some(if remove_parentheses {
if arguments
.args
.iter()
.map(Expr::start)
.chain(arguments.keywords.iter().map(Keyword::start))
.any(|location| location > argument.start())
{
// Case 1: argument or keyword is _not_ the last node, so delete from the start of the
// argument to the end of the subsequent comma.
let mut seen_comma = false;
for (tok, range) in lexer::lex_starts_at(
locator.slice(arguments.range()),
Mode::Module,
arguments.start(),
)
.flatten()
{
if seen_comma {
if tok.is_non_logical_newline() {
// Also delete any non-logical newlines after the comma.
continue;
}
fix_end = Some(if tok.is_newline() {
range.end()
} else {
range.end() - TextSize::from(1)
range.start()
});
break;
}
if range.start() == argument.start() {
fix_start = Some(range.start());
}
if fix_start.is_some() && tok.is_comma() {
seen_comma = true;
}
}
} else {
// Case 2: argument or keyword is the last node, so delete from the start of the
// previous comma to the end of the argument.
for (tok, range) in lexer::lex_starts_at(
locator.slice(arguments.range()),
Mode::Module,
arguments.start(),
)
.flatten()
{
if range.start() == argument.start() {
fix_end = Some(argument.end());
break;
}
if tok.is_comma() {
fix_start = Some(range.start());
}
}
}
} else if args
.iter()
.map(Expr::start)
.chain(keywords.iter().map(Keyword::start))
.any(|location| location > expr_range.start())
{
// Case 2: argument or keyword is _not_ the last node.
let mut seen_comma = false;
for (tok, range) in lexer::lex_starts_at(contents, Mode::Module, call_at).flatten() {
if seen_comma {
if tok.is_non_logical_newline() {
// Also delete any non-logical newlines after the comma.
continue;
}
fix_end = Some(if tok.is_newline() {
range.end()
} else {
range.start()
});
break;
}
if range.start() == expr_range.start() {
fix_start = Some(range.start());
}
if fix_start.is_some() && tok.is_comma() {
seen_comma = true;
match (fix_start, fix_end) {
(Some(start), Some(end)) => Ok(Edit::deletion(start, end)),
_ => {
bail!("No fix could be constructed")
}
}
} else {
// Case 3: argument or keyword is the last node, so we have to find the last
// comma in the stmt.
for (tok, range) in lexer::lex_starts_at(contents, Mode::Module, call_at).flatten() {
if range.start() == expr_range.start() {
fix_end = Some(expr_range.end());
break;
// Only one argument; remove it (but preserve parentheses, if needed).
Ok(match parentheses {
Parentheses::Remove => Edit::deletion(arguments.start(), arguments.end()),
Parentheses::Preserve => {
Edit::replacement("()".to_string(), arguments.start(), arguments.end())
}
if tok.is_comma() {
fix_start = Some(range.start());
}
}
}
match (fix_start, fix_end) {
(Some(start), Some(end)) => Ok(Edit::deletion(start, end)),
_ => {
bail!("No fix could be constructed")
}
})
}
}
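The `remove_argument` hunk above deletes either "the argument plus the following comma" (when it is not the last argument) or "the preceding comma plus the argument" (when it is), falling back to `Parentheses::Remove`/`Preserve` when only one argument remains. A toy, hypothetical sketch of that strategy on raw source text (splitting on top-level commas instead of using ruff's lexer, so strings containing commas are not handled):

```rust
/// Simplified, illustrative stand-in for the lexer-based `remove_argument`:
/// remove the argument at `index` from a call's argument-list source.
fn remove_argument(args_src: &str, index: usize) -> String {
    // Split on commas that are not nested inside (), [] or {}.
    let mut parts: Vec<String> = vec![String::new()];
    let mut depth = 0usize;
    for ch in args_src.chars() {
        match ch {
            '(' | '[' | '{' => depth += 1,
            ')' | ']' | '}' => depth = depth.saturating_sub(1),
            ',' if depth == 0 => {
                parts.push(String::new());
                continue;
            }
            _ => {}
        }
        parts.last_mut().unwrap().push(ch);
    }
    parts.remove(index);
    if parts.is_empty() {
        // Only argument removed: the caller keeps (or drops) the empty "()".
        return String::new();
    }
    parts
        .iter()
        .map(|p| p.trim())
        .collect::<Vec<_>>()
        .join(", ")
}

fn main() {
    assert_eq!(remove_argument("a, b, c", 1), "a, c");
    // Commas nested in a call are not split points.
    assert_eq!(remove_argument("a, f(x, y)", 1), "a");
    assert_eq!(remove_argument("a", 0), "");
    println!("ok");
}
```

The real implementation works on token ranges from `lexer::lex_starts_at` so it can also swallow trailing non-logical newlines; this sketch only shows the comma bookkeeping.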
@@ -294,24 +291,24 @@ fn next_stmt_break(semicolon: TextSize, locator: &Locator) -> TextSize {
#[cfg(test)]
mod tests {
use anyhow::Result;
use ruff_python_ast::{Ranged, Suite};
use ruff_python_parser::Parse;
use ruff_text_size::TextSize;
use ruff_python_ast::Ranged;
use ruff_python_parser::parse_suite;
use ruff_source_file::Locator;
use ruff_text_size::TextSize;
use crate::autofix::edits::{next_stmt_break, trailing_semicolon};
#[test]
fn find_semicolon() -> Result<()> {
let contents = "x = 1";
let program = Suite::parse(contents, "<filename>")?;
let program = parse_suite(contents, "<filename>")?;
let stmt = program.first().unwrap();
let locator = Locator::new(contents);
assert_eq!(trailing_semicolon(stmt.end(), &locator), None);
let contents = "x = 1; y = 1";
let program = Suite::parse(contents, "<filename>")?;
let program = parse_suite(contents, "<filename>")?;
let stmt = program.first().unwrap();
let locator = Locator::new(contents);
assert_eq!(
@@ -320,7 +317,7 @@ mod tests {
);
let contents = "x = 1 ; y = 1";
let program = Suite::parse(contents, "<filename>")?;
let program = parse_suite(contents, "<filename>")?;
let stmt = program.first().unwrap();
let locator = Locator::new(contents);
assert_eq!(
@@ -333,7 +330,7 @@ x = 1 \
; y = 1
"
.trim();
let program = Suite::parse(contents, "<filename>")?;
let program = parse_suite(contents, "<filename>")?;
let stmt = program.first().unwrap();
let locator = Locator::new(contents);
assert_eq!(
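The tests above exercise a `trailing_semicolon` helper: given a statement's end offset, it reports the offset of a trailing `;`, skipping spaces and backslash line continuations. A hypothetical, byte-offset sketch of that behavior (simplified from the real locator-based helper):

```rust
/// Find the offset of a `;` that trails the statement ending at `offset`,
/// skipping whitespace and `\`-continued line breaks. Returns None if the
/// next significant character is anything else.
fn trailing_semicolon(offset: usize, source: &str) -> Option<usize> {
    let bytes = source.as_bytes();
    let mut pos = offset;
    let mut continued = false; // saw a `\` line continuation
    while pos < bytes.len() {
        match bytes[pos] {
            b' ' | b'\t' => pos += 1,
            b'\\' => {
                continued = true;
                pos += 1;
            }
            b'\n' if continued => {
                continued = false;
                pos += 1;
            }
            b';' => return Some(pos),
            _ => return None,
        }
    }
    None
}

fn main() {
    // "x = 1" ends at offset 5 in each snippet below.
    assert_eq!(trailing_semicolon(5, "x = 1"), None);
    assert_eq!(trailing_semicolon(5, "x = 1; y = 1"), Some(5));
    assert_eq!(trailing_semicolon(5, "x = 1 ; y = 1"), Some(6));
    assert_eq!(trailing_semicolon(5, "x = 1 \\\n  ; y = 1"), Some(10));
    println!("ok");
}
```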


@@ -76,7 +76,7 @@ fn apply_fixes<'a>(
}
// If this fix overlaps with a fix we've already applied, skip it.
if last_pos.map_or(false, |last_pos| last_pos >= first.start()) {
if last_pos.is_some_and(|last_pos| last_pos >= first.start()) {
continue;
}
}
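The hunk above replaces `Option::map_or(false, …)` with `Option::is_some_and(…)`, which was stabilized in Rust 1.70. The two forms are equivalent for a boolean predicate; `is_some_and` simply states the intent directly:

```rust
// Mirror of the overlap check above, using the newer combinator.
fn overlaps(last_pos: Option<usize>, start: usize) -> bool {
    last_pos.is_some_and(|last_pos| last_pos >= start)
}

fn main() {
    // Equivalent to the older `map_or(false, …)` spelling.
    assert_eq!(overlaps(Some(10), 7), Some(10).map_or(false, |p| p >= 7));
    assert!(overlaps(Some(10), 7));
    assert!(!overlaps(None, 7));
    println!("ok");
}
```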


@@ -56,13 +56,11 @@ pub(crate) fn bindings(checker: &mut Checker) {
checker.diagnostics.push(diagnostic);
}
}
if checker.is_stub {
if checker.enabled(Rule::UnaliasedCollectionsAbcSetImport) {
if let Some(diagnostic) =
flake8_pyi::rules::unaliased_collections_abc_set_import(checker, binding)
{
checker.diagnostics.push(diagnostic);
}
if checker.enabled(Rule::UnaliasedCollectionsAbcSetImport) {
if let Some(diagnostic) =
flake8_pyi::rules::unaliased_collections_abc_set_import(checker, binding)
{
checker.diagnostics.push(diagnostic);
}
}
}


@@ -218,19 +218,17 @@ pub(crate) fn deferred_scopes(checker: &mut Checker) {
}
}
if checker.is_stub {
if checker.enabled(Rule::UnusedPrivateTypeVar) {
flake8_pyi::rules::unused_private_type_var(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateProtocol) {
flake8_pyi::rules::unused_private_protocol(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateTypeAlias) {
flake8_pyi::rules::unused_private_type_alias(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateTypedDict) {
flake8_pyi::rules::unused_private_typed_dict(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateTypeVar) {
flake8_pyi::rules::unused_private_type_var(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateProtocol) {
flake8_pyi::rules::unused_private_protocol(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateTypeAlias) {
flake8_pyi::rules::unused_private_type_alias(checker, scope, &mut diagnostics);
}
if checker.enabled(Rule::UnusedPrivateTypedDict) {
flake8_pyi::rules::unused_private_typed_dict(checker, scope, &mut diagnostics);
}
if matches!(


@@ -30,8 +30,8 @@ pub(crate) fn definitions(checker: &mut Checker) {
Rule::MissingTypeKwargs,
Rule::MissingTypeSelf,
]);
let enforce_stubs = checker.is_stub
&& checker.any_enabled(&[Rule::DocstringInStub, Rule::IterMethodReturnIterable]);
let enforce_stubs = checker.is_stub && checker.enabled(Rule::DocstringInStub);
let enforce_stubs_and_runtime = checker.enabled(Rule::IterMethodReturnIterable);
let enforce_docstrings = checker.any_enabled(&[
Rule::BlankLineAfterLastSection,
Rule::BlankLineAfterSummary,
@@ -81,7 +81,7 @@ pub(crate) fn definitions(checker: &mut Checker) {
Rule::UndocumentedPublicPackage,
]);
if !enforce_annotations && !enforce_docstrings && !enforce_stubs {
if !enforce_annotations && !enforce_docstrings && !enforce_stubs && !enforce_stubs_and_runtime {
return;
}
@@ -117,7 +117,7 @@ pub(crate) fn definitions(checker: &mut Checker) {
// interfaces, without any AST nodes in between. Right now, we
// only error when traversing definition boundaries (functions,
// classes, etc.).
if !overloaded_name.map_or(false, |overloaded_name| {
if !overloaded_name.is_some_and(|overloaded_name| {
flake8_annotations::helpers::is_overload_impl(
definition,
&overloaded_name,
@@ -141,6 +141,8 @@ pub(crate) fn definitions(checker: &mut Checker) {
if checker.enabled(Rule::DocstringInStub) {
flake8_pyi::rules::docstring_in_stubs(checker, docstring);
}
}
if enforce_stubs_and_runtime {
if checker.enabled(Rule::IterMethodReturnIterable) {
flake8_pyi::rules::iter_method_return_iterable(checker, definition);
}


@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, Constant, Expr, ExprContext, Operator, Ranged};
use ruff_python_ast::{self as ast, Arguments, Constant, Expr, ExprContext, Operator, Ranged};
use ruff_python_literal::cformat::{CFormatError, CFormatErrorType};
use ruff_diagnostics::Diagnostic;
@@ -74,9 +74,14 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
// Ex) Union[...]
if checker.any_enabled(&[Rule::UnnecessaryLiteralUnion, Rule::DuplicateUnionMember]) {
// Determine if the current expression is an union
// Avoid duplicate checks if the parent is an `Union[...]` since these rules traverse nested unions
if checker.any_enabled(&[
Rule::UnnecessaryLiteralUnion,
Rule::DuplicateUnionMember,
Rule::RedundantLiteralUnion,
Rule::UnnecessaryTypeUnion,
]) {
// Avoid duplicate checks if the parent is an `Union[...]` since these rules
// traverse nested unions.
let is_unchecked_union = checker
.semantic
.expr_grandparent()
@@ -92,6 +97,12 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
if checker.enabled(Rule::DuplicateUnionMember) {
flake8_pyi::rules::duplicate_union_member(checker, expr);
}
if checker.enabled(Rule::RedundantLiteralUnion) {
flake8_pyi::rules::redundant_literal_union(checker, expr);
}
if checker.enabled(Rule::UnnecessaryTypeUnion) {
flake8_pyi::rules::unnecessary_type_union(checker, expr);
}
}
}
@@ -152,10 +163,8 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
if checker.enabled(Rule::NumpyDeprecatedFunction) {
numpy::rules::deprecated_function(checker, expr);
}
if checker.is_stub {
if checker.enabled(Rule::CollectionsNamedTuple) {
flake8_pyi::rules::collections_named_tuple(checker, expr);
}
if checker.enabled(Rule::CollectionsNamedTuple) {
flake8_pyi::rules::collections_named_tuple(checker, expr);
}
// Ex) List[...]
@@ -174,9 +183,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
&& checker.semantic.in_annotation()
&& !checker.settings.pyupgrade.keep_runtime_typing
{
flake8_future_annotations::rules::future_rewritable_type_annotation(
checker, expr,
);
flake8_future_annotations::rules::future_rewritable_type_annotation(checker, expr);
}
}
if checker.enabled(Rule::NonPEP585Annotation) {
@@ -201,14 +208,9 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
if checker.enabled(Rule::NonLowercaseVariableInFunction) {
if checker.semantic.scope().kind.is_any_function() {
// Ignore globals.
if !checker
.semantic
.scope()
.get(id)
.map_or(false, |binding_id| {
checker.semantic.binding(binding_id).is_global()
})
{
if !checker.semantic.scope().get(id).is_some_and(|binding_id| {
checker.semantic.binding(binding_id).is_global()
}) {
pep8_naming::rules::non_lowercase_variable_in_function(
checker, expr, id,
);
@@ -216,11 +218,14 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
}
if checker.enabled(Rule::MixedCaseVariableInClassScope) {
if let ScopeKind::Class(ast::StmtClassDef { bases, .. }) =
if let ScopeKind::Class(ast::StmtClassDef { arguments, .. }) =
&checker.semantic.scope().kind
{
pep8_naming::rules::mixed_case_variable_in_class_scope(
checker, expr, id, bases,
checker,
expr,
id,
arguments.as_deref(),
);
}
}
@@ -318,18 +323,20 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
if checker.enabled(Rule::PrivateMemberAccess) {
flake8_self::rules::private_member_access(checker, expr);
}
if checker.is_stub {
if checker.enabled(Rule::CollectionsNamedTuple) {
flake8_pyi::rules::collections_named_tuple(checker, expr);
}
if checker.enabled(Rule::CollectionsNamedTuple) {
flake8_pyi::rules::collections_named_tuple(checker, expr);
}
pandas_vet::rules::attr(checker, attr, value, expr);
}
Expr::Call(
call @ ast::ExprCall {
func,
args,
keywords,
arguments:
Arguments {
args,
keywords,
range: _,
},
range: _,
},
) => {
@@ -382,9 +389,8 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
.enabled(Rule::StringDotFormatExtraPositionalArguments)
{
pyflakes::rules::string_dot_format_extra_positional_arguments(
checker,
&summary, args, location,
);
checker, &summary, args, location,
);
}
if checker.enabled(Rule::StringDotFormatMissingArguments) {
pyflakes::rules::string_dot_format_missing_argument(
@@ -410,6 +416,14 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
}
}
if checker.enabled(Rule::BadStringFormatCharacter) {
pylint::rules::bad_string_format_character::call(
checker,
val.as_str(),
location,
);
}
}
}
}
@@ -424,10 +438,10 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
pyupgrade::rules::super_call_with_parameters(checker, expr, func, args);
}
if checker.enabled(Rule::UnnecessaryEncodeUTF8) {
pyupgrade::rules::unnecessary_encode_utf8(checker, expr, func, args, keywords);
pyupgrade::rules::unnecessary_encode_utf8(checker, call);
}
if checker.enabled(Rule::RedundantOpenModes) {
pyupgrade::rules::redundant_open_modes(checker, expr);
pyupgrade::rules::redundant_open_modes(checker, call);
}
if checker.enabled(Rule::NativeLiterals) {
pyupgrade::rules::native_literals(checker, expr, func, args, keywords);
@@ -436,10 +450,10 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
pyupgrade::rules::open_alias(checker, expr, func);
}
if checker.enabled(Rule::ReplaceUniversalNewlines) {
pyupgrade::rules::replace_universal_newlines(checker, func, keywords);
pyupgrade::rules::replace_universal_newlines(checker, call);
}
if checker.enabled(Rule::ReplaceStdoutStderr) {
pyupgrade::rules::replace_stdout_stderr(checker, expr, func, args, keywords);
pyupgrade::rules::replace_stdout_stderr(checker, call);
}
if checker.enabled(Rule::OSErrorAlias) {
pyupgrade::rules::os_error_alias_call(checker, func);
@@ -459,7 +473,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_async::rules::blocking_os_call(checker, expr);
}
if checker.any_enabled(&[Rule::Print, Rule::PPrint]) {
flake8_print::rules::print_call(checker, func, keywords);
flake8_print::rules::print_call(checker, call);
}
if checker.any_enabled(&[
Rule::SuspiciousPickleUsage,
@@ -511,13 +525,11 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
if checker.enabled(Rule::ZipWithoutExplicitStrict) {
if checker.settings.target_version >= PythonVersion::Py310 {
flake8_bugbear::rules::zip_without_explicit_strict(
checker, expr, func, args, keywords,
);
flake8_bugbear::rules::zip_without_explicit_strict(checker, call);
}
}
if checker.enabled(Rule::NoExplicitStacklevel) {
flake8_bugbear::rules::no_explicit_stacklevel(checker, func, keywords);
flake8_bugbear::rules::no_explicit_stacklevel(checker, call);
}
if checker.enabled(Rule::UnnecessaryDictKwargs) {
flake8_pie::rules::unnecessary_dict_kwargs(checker, expr, keywords);
@@ -526,22 +538,22 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_bandit::rules::exec_used(checker, func);
}
if checker.enabled(Rule::BadFilePermissions) {
flake8_bandit::rules::bad_file_permissions(checker, func, args, keywords);
flake8_bandit::rules::bad_file_permissions(checker, call);
}
if checker.enabled(Rule::RequestWithNoCertValidation) {
flake8_bandit::rules::request_with_no_cert_validation(checker, func, keywords);
flake8_bandit::rules::request_with_no_cert_validation(checker, call);
}
if checker.enabled(Rule::UnsafeYAMLLoad) {
flake8_bandit::rules::unsafe_yaml_load(checker, func, args, keywords);
flake8_bandit::rules::unsafe_yaml_load(checker, call);
}
if checker.enabled(Rule::SnmpInsecureVersion) {
flake8_bandit::rules::snmp_insecure_version(checker, func, keywords);
flake8_bandit::rules::snmp_insecure_version(checker, call);
}
if checker.enabled(Rule::SnmpWeakCryptography) {
flake8_bandit::rules::snmp_weak_cryptography(checker, func, args, keywords);
flake8_bandit::rules::snmp_weak_cryptography(checker, call);
}
if checker.enabled(Rule::Jinja2AutoescapeFalse) {
flake8_bandit::rules::jinja2_autoescape_false(checker, func, keywords);
flake8_bandit::rules::jinja2_autoescape_false(checker, call);
}
if checker.enabled(Rule::HardcodedPasswordFuncArg) {
flake8_bandit::rules::hardcoded_password_func_arg(checker, keywords);
@@ -550,18 +562,16 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_bandit::rules::hardcoded_sql_expression(checker, expr);
}
if checker.enabled(Rule::HashlibInsecureHashFunction) {
flake8_bandit::rules::hashlib_insecure_hash_functions(
checker, func, args, keywords,
);
flake8_bandit::rules::hashlib_insecure_hash_functions(checker, call);
}
if checker.enabled(Rule::RequestWithoutTimeout) {
flake8_bandit::rules::request_without_timeout(checker, func, keywords);
flake8_bandit::rules::request_without_timeout(checker, call);
}
if checker.enabled(Rule::ParamikoCall) {
flake8_bandit::rules::paramiko_call(checker, func);
}
if checker.enabled(Rule::LoggingConfigInsecureListen) {
flake8_bandit::rules::logging_config_insecure_listen(checker, func, keywords);
flake8_bandit::rules::logging_config_insecure_listen(checker, call);
}
if checker.any_enabled(&[
Rule::SubprocessWithoutShellEqualsTrue,
@@ -572,7 +582,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
Rule::StartProcessWithPartialPath,
Rule::UnixCommandWildcardInjection,
]) {
flake8_bandit::rules::shell_injection(checker, func, args, keywords);
flake8_bandit::rules::shell_injection(checker, call);
}
if checker.enabled(Rule::UnnecessaryGeneratorList) {
flake8_comprehensions::rules::unnecessary_generator_list(
@@ -675,23 +685,17 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_debugger::rules::debugger_call(checker, expr, func);
}
if checker.enabled(Rule::PandasUseOfInplaceArgument) {
pandas_vet::rules::inplace_argument(checker, expr, func, args, keywords);
pandas_vet::rules::inplace_argument(checker, call);
}
pandas_vet::rules::call(checker, func);
if checker.enabled(Rule::PandasUseOfDotReadTable) {
pandas_vet::rules::use_of_read_table(checker, func, keywords);
pandas_vet::rules::use_of_read_table(checker, call);
}
if checker.enabled(Rule::PandasUseOfPdMerge) {
pandas_vet::rules::use_of_pd_merge(checker, func);
}
if checker.enabled(Rule::CallDatetimeWithoutTzinfo) {
flake8_datetimez::rules::call_datetime_without_tzinfo(
checker,
func,
args,
keywords,
expr.range(),
);
flake8_datetimez::rules::call_datetime_without_tzinfo(checker, call);
}
if checker.enabled(Rule::CallDatetimeToday) {
flake8_datetimez::rules::call_datetime_today(checker, func, expr.range());
@@ -707,30 +711,13 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
);
}
if checker.enabled(Rule::CallDatetimeNowWithoutTzinfo) {
flake8_datetimez::rules::call_datetime_now_without_tzinfo(
checker,
func,
args,
keywords,
expr.range(),
);
flake8_datetimez::rules::call_datetime_now_without_tzinfo(checker, call);
}
if checker.enabled(Rule::CallDatetimeFromtimestamp) {
flake8_datetimez::rules::call_datetime_fromtimestamp(
checker,
func,
args,
keywords,
expr.range(),
);
flake8_datetimez::rules::call_datetime_fromtimestamp(checker, call);
}
if checker.enabled(Rule::CallDatetimeStrptimeWithoutZone) {
flake8_datetimez::rules::call_datetime_strptime_without_zone(
checker,
func,
args,
expr.range(),
);
flake8_datetimez::rules::call_datetime_strptime_without_zone(checker, call);
}
if checker.enabled(Rule::CallDateToday) {
flake8_datetimez::rules::call_date_today(checker, func, expr.range());
@@ -754,18 +741,16 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
pylint::rules::bad_str_strip_call(checker, func, args);
}
if checker.enabled(Rule::InvalidEnvvarDefault) {
pylint::rules::invalid_envvar_default(checker, func, args, keywords);
pylint::rules::invalid_envvar_default(checker, call);
}
if checker.enabled(Rule::InvalidEnvvarValue) {
pylint::rules::invalid_envvar_value(checker, func, args, keywords);
pylint::rules::invalid_envvar_value(checker, call);
}
if checker.enabled(Rule::NestedMinMax) {
pylint::rules::nested_min_max(checker, expr, func, args, keywords);
}
if checker.enabled(Rule::PytestPatchWithLambda) {
if let Some(diagnostic) =
flake8_pytest_style::rules::patch_with_lambda(func, args, keywords)
{
if let Some(diagnostic) = flake8_pytest_style::rules::patch_with_lambda(call) {
checker.diagnostics.push(diagnostic);
}
}
@@ -777,16 +762,16 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
}
if checker.enabled(Rule::SubprocessPopenPreexecFn) {
pylint::rules::subprocess_popen_preexec_fn(checker, func, keywords);
pylint::rules::subprocess_popen_preexec_fn(checker, call);
}
if checker.any_enabled(&[
Rule::PytestRaisesWithoutException,
Rule::PytestRaisesTooBroad,
]) {
flake8_pytest_style::rules::raises_call(checker, func, args, keywords);
flake8_pytest_style::rules::raises_call(checker, call);
}
if checker.enabled(Rule::PytestFailWithoutMessage) {
flake8_pytest_style::rules::fail_call(checker, func, args, keywords);
flake8_pytest_style::rules::fail_call(checker, call);
}
if checker.enabled(Rule::PairwiseOverZipped) {
if checker.settings.target_version >= PythonVersion::Py310 {
@@ -857,7 +842,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_use_pathlib::rules::path_constructor_current_directory(checker, expr, func);
}
if checker.enabled(Rule::OsSepSplit) {
flake8_use_pathlib::rules::os_sep_split(checker, func, args, keywords);
flake8_use_pathlib::rules::os_sep_split(checker, call);
}
if checker.enabled(Rule::NumpyLegacyRandom) {
numpy::rules::legacy_random(checker, func);
@@ -872,15 +857,15 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
Rule::LoggingExcInfo,
Rule::LoggingRedundantExcInfo,
]) {
flake8_logging_format::rules::logging_call(checker, func, args, keywords);
flake8_logging_format::rules::logging_call(checker, call);
}
if checker.any_enabled(&[Rule::LoggingTooFewArgs, Rule::LoggingTooManyArgs]) {
pylint::rules::logging_call(checker, func, args, keywords);
pylint::rules::logging_call(checker, call);
}
if checker.enabled(Rule::DjangoLocalsInRenderFunction) {
flake8_django::rules::locals_in_render_function(checker, func, args, keywords);
flake8_django::rules::locals_in_render_function(checker, call);
}
if checker.is_stub && checker.enabled(Rule::UnsupportedMethodCallOnAll) {
if checker.enabled(Rule::UnsupportedMethodCallOnAll) {
flake8_pyi::rules::unsupported_method_call_on_all(checker, func);
}
}
@@ -1045,6 +1030,9 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
checker.locator,
);
}
if checker.enabled(Rule::BadStringFormatCharacter) {
pylint::rules::bad_string_format_character::percent(checker, expr);
}
if checker.enabled(Rule::BadStringFormatType) {
pylint::rules::bad_string_format_type(checker, expr, right);
}
@@ -1088,26 +1076,30 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
);
}
}
if checker.is_stub {
if checker.enabled(Rule::DuplicateUnionMember)
&& checker.semantic.in_type_definition()
// Avoid duplicate checks if the parent is an `|`
&& !matches!(
checker.semantic.expr_parent(),
Some(Expr::BinOp(ast::ExprBinOp { op: Operator::BitOr, ..}))
)
{
flake8_pyi::rules::duplicate_union_member(checker, expr);
}
if checker.enabled(Rule::UnnecessaryLiteralUnion)
// Avoid duplicate checks if the parent is an `|`
&& !matches!(
checker.semantic.expr_parent(),
Some(Expr::BinOp(ast::ExprBinOp { op: Operator::BitOr, ..}))
)
{
flake8_pyi::rules::unnecessary_literal_union(checker, expr);
}
// Avoid duplicate checks if the parent is an `|` since these rules
// traverse nested unions.
let is_unchecked_union = !matches!(
checker.semantic.expr_parent(),
Some(Expr::BinOp(ast::ExprBinOp {
op: Operator::BitOr,
..
}))
);
if checker.enabled(Rule::DuplicateUnionMember)
&& checker.semantic.in_type_definition()
&& is_unchecked_union
{
flake8_pyi::rules::duplicate_union_member(checker, expr);
}
if checker.enabled(Rule::UnnecessaryLiteralUnion) && is_unchecked_union {
flake8_pyi::rules::unnecessary_literal_union(checker, expr);
}
if checker.enabled(Rule::RedundantLiteralUnion) && is_unchecked_union {
flake8_pyi::rules::redundant_literal_union(checker, expr);
}
if checker.enabled(Rule::UnnecessaryTypeUnion) && is_unchecked_union {
flake8_pyi::rules::unnecessary_type_union(checker, expr);
}
}
Expr::UnaryOp(ast::ExprUnaryOp {
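The `is_unchecked_union` guard above exists because these rules collect every member of a nested `X | Y | Z` union themselves; firing them again on each nested sub-`|` would duplicate the work and the diagnostics. A toy illustration of that pattern on a made-up expression type (not ruff's AST):

```rust
// Minimal expression type for illustration only.
enum Expr {
    Name(&'static str),
    BitOr(Box<Expr>, Box<Expr>),
}

/// Flatten a `|` chain into its member names.
fn collect(expr: &Expr, out: &mut Vec<&'static str>) {
    match expr {
        Expr::Name(n) => out.push(n),
        Expr::BitOr(l, r) => {
            collect(l, out);
            collect(r, out);
        }
    }
}

/// Run a union-wide check only when the parent is *not* already a `|`,
/// i.e. only at the top of the chain.
fn check_union(expr: &Expr, parent_is_bitor: bool) -> Option<Vec<&'static str>> {
    let is_unchecked_union = !parent_is_bitor;
    match (expr, is_unchecked_union) {
        (Expr::BitOr(..), true) => {
            let mut members = Vec::new();
            collect(expr, &mut members);
            Some(members)
        }
        _ => None,
    }
}

fn main() {
    // int | str | int
    let union = Expr::BitOr(
        Box::new(Expr::BitOr(
            Box::new(Expr::Name("int")),
            Box::new(Expr::Name("str")),
        )),
        Box::new(Expr::Name("int")),
    );
    // Fires once, at the top of the chain, seeing every member…
    assert_eq!(check_union(&union, false), Some(vec!["int", "str", "int"]));
    // …and is skipped on the nested sub-union.
    if let Expr::BitOr(left, _) = &union {
        assert_eq!(check_union(left, true), None);
    }
    println!("ok");
}
```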
@@ -1142,12 +1134,14 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
flake8_simplify::rules::double_negation(checker, expr, *op, operand);
}
}
Expr::Compare(ast::ExprCompare {
left,
ops,
comparators,
range: _,
}) => {
Expr::Compare(
compare @ ast::ExprCompare {
left,
ops,
comparators,
range: _,
},
) => {
let check_none_comparisons = checker.enabled(Rule::NoneComparison);
let check_true_false_comparisons = checker.enabled(Rule::TrueFalseComparison);
if check_none_comparisons || check_true_false_comparisons {
@@ -1165,7 +1159,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
pyflakes::rules::invalid_literal_comparison(checker, left, ops, comparators, expr);
}
if checker.enabled(Rule::TypeComparison) {
pycodestyle::rules::type_comparison(checker, expr, ops, comparators);
pycodestyle::rules::type_comparison(checker, compare);
}
if checker.any_enabled(&[
Rule::SysVersionCmpStr3,
@@ -1261,7 +1255,7 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
}
Expr::Lambda(
lambda @ ast::ExprLambda {
args: _,
parameters: _,
body: _,
range: _,
},


@@ -1,5 +1,3 @@
pub(super) use argument::argument;
pub(super) use arguments::arguments;
pub(super) use bindings::bindings;
pub(super) use comprehension::comprehension;
pub(super) use deferred_for_loops::deferred_for_loops;
@@ -8,12 +6,12 @@ pub(super) use definitions::definitions;
pub(super) use except_handler::except_handler;
pub(super) use expression::expression;
pub(super) use module::module;
pub(super) use parameter::parameter;
pub(super) use parameters::parameters;
pub(super) use statement::statement;
pub(super) use suite::suite;
pub(super) use unresolved_references::unresolved_references;
mod argument;
mod arguments;
mod bindings;
mod comprehension;
mod deferred_for_loops;
@@ -22,6 +20,8 @@ mod definitions;
mod except_handler;
mod expression;
mod module;
mod parameter;
mod parameters;
mod statement;
mod suite;
mod unresolved_references;


@@ -1,27 +1,28 @@
use ruff_python_ast::{Arg, Ranged};
use ruff_python_ast::{Parameter, Ranged};
use crate::checkers::ast::Checker;
use crate::codes::Rule;
use crate::rules::{flake8_builtins, pep8_naming, pycodestyle};
/// Run lint rules over an [`Arg`] syntax node.
pub(crate) fn argument(arg: &Arg, checker: &mut Checker) {
/// Run lint rules over a [`Parameter`] syntax node.
pub(crate) fn parameter(parameter: &Parameter, checker: &mut Checker) {
if checker.enabled(Rule::AmbiguousVariableName) {
if let Some(diagnostic) = pycodestyle::rules::ambiguous_variable_name(&arg.arg, arg.range())
if let Some(diagnostic) =
pycodestyle::rules::ambiguous_variable_name(&parameter.name, parameter.range())
{
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::InvalidArgumentName) {
if let Some(diagnostic) = pep8_naming::rules::invalid_argument_name(
&arg.arg,
arg,
&parameter.name,
parameter,
&checker.settings.pep8_naming.ignore_names,
) {
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::BuiltinArgumentShadowing) {
flake8_builtins::rules::builtin_argument_shadowing(checker, arg);
flake8_builtins::rules::builtin_argument_shadowing(checker, parameter);
}
}


@@ -1,26 +1,26 @@
use ruff_python_ast::Arguments;
use ruff_python_ast::Parameters;
use crate::checkers::ast::Checker;
use crate::codes::Rule;
use crate::rules::{flake8_bugbear, flake8_pyi, ruff};
/// Run lint rules over a [`Arguments`] syntax node.
pub(crate) fn arguments(arguments: &Arguments, checker: &mut Checker) {
/// Run lint rules over a [`Parameters`] syntax node.
pub(crate) fn parameters(parameters: &Parameters, checker: &mut Checker) {
if checker.enabled(Rule::MutableArgumentDefault) {
flake8_bugbear::rules::mutable_argument_default(checker, arguments);
flake8_bugbear::rules::mutable_argument_default(checker, parameters);
}
if checker.enabled(Rule::FunctionCallInDefaultArgument) {
flake8_bugbear::rules::function_call_in_argument_default(checker, arguments);
flake8_bugbear::rules::function_call_in_argument_default(checker, parameters);
}
if checker.settings.rules.enabled(Rule::ImplicitOptional) {
ruff::rules::implicit_optional(checker, arguments);
ruff::rules::implicit_optional(checker, parameters);
}
if checker.is_stub {
if checker.enabled(Rule::TypedArgumentDefaultInStub) {
flake8_pyi::rules::typed_argument_simple_defaults(checker, arguments);
flake8_pyi::rules::typed_argument_simple_defaults(checker, parameters);
}
if checker.enabled(Rule::ArgumentDefaultInStub) {
flake8_pyi::rules::argument_simple_defaults(checker, arguments);
flake8_pyi::rules::argument_simple_defaults(checker, parameters);
}
}
}


@@ -73,16 +73,18 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
name,
decorator_list,
returns,
args,
parameters,
body,
type_params,
..
})
| Stmt::AsyncFunctionDef(ast::StmtAsyncFunctionDef {
name,
decorator_list,
returns,
args,
parameters,
body,
type_params,
..
}) => {
if checker.enabled(Rule::DjangoNonLeadingReceiverDecorator) {
@@ -114,7 +116,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker.semantic.scope(),
name,
decorator_list,
args,
parameters,
)
{
checker.diagnostics.push(diagnostic);
@@ -126,7 +128,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker.semantic.scope(),
name,
decorator_list,
args,
parameters,
) {
checker.diagnostics.push(diagnostic);
}
@@ -141,38 +143,52 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if checker.enabled(Rule::StubBodyMultipleStatements) {
flake8_pyi::rules::stub_body_multiple_statements(checker, stmt, body);
}
if checker.enabled(Rule::AnyEqNeAnnotation) {
flake8_pyi::rules::any_eq_ne_annotation(checker, name, args);
}
if checker.enabled(Rule::NonSelfReturnType) {
flake8_pyi::rules::non_self_return_type(
checker,
stmt,
name,
decorator_list,
returns.as_ref().map(AsRef::as_ref),
args,
stmt.is_async_function_def_stmt(),
);
}
}
if checker.enabled(Rule::AnyEqNeAnnotation) {
flake8_pyi::rules::any_eq_ne_annotation(checker, name, parameters);
}
if checker.enabled(Rule::NonSelfReturnType) {
flake8_pyi::rules::non_self_return_type(
checker,
stmt,
name,
decorator_list,
returns.as_ref().map(AsRef::as_ref),
parameters,
stmt.is_async_function_def_stmt(),
);
}
if checker.enabled(Rule::CustomTypeVarReturnType) {
flake8_pyi::rules::custom_type_var_return_type(
checker,
name,
decorator_list,
returns.as_ref().map(AsRef::as_ref),
parameters,
type_params.as_ref(),
);
}
if checker.is_stub {
if checker.enabled(Rule::StrOrReprDefinedInStub) {
flake8_pyi::rules::str_or_repr_defined_in_stub(checker, stmt);
}
}
if checker.is_stub || checker.settings.target_version >= PythonVersion::Py311 {
if checker.enabled(Rule::NoReturnArgumentAnnotationInStub) {
flake8_pyi::rules::no_return_argument_annotation(checker, args);
}
if checker.enabled(Rule::BadExitAnnotation) {
flake8_pyi::rules::bad_exit_annotation(
checker,
stmt.is_async_function_def_stmt(),
name,
args,
);
}
if checker.enabled(Rule::RedundantNumericUnion) {
flake8_pyi::rules::redundant_numeric_union(checker, args);
flake8_pyi::rules::no_return_argument_annotation(checker, parameters);
}
}
if checker.enabled(Rule::BadExitAnnotation) {
flake8_pyi::rules::bad_exit_annotation(
checker,
stmt.is_async_function_def_stmt(),
name,
parameters,
);
}
if checker.enabled(Rule::RedundantNumericUnion) {
flake8_pyi::rules::redundant_numeric_union(checker, parameters);
}
if checker.enabled(Rule::DunderFunctionName) {
if let Some(diagnostic) = pep8_naming::rules::dunder_function_name(
checker.semantic.scope(),
@@ -230,13 +246,13 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
}
if checker.enabled(Rule::HardcodedPasswordDefault) {
flake8_bandit::rules::hardcoded_password_default(checker, args);
flake8_bandit::rules::hardcoded_password_default(checker, parameters);
}
if checker.enabled(Rule::PropertyWithParameters) {
pylint::rules::property_with_parameters(checker, stmt, decorator_list, args);
pylint::rules::property_with_parameters(checker, stmt, decorator_list, parameters);
}
if checker.enabled(Rule::TooManyArguments) {
pylint::rules::too_many_arguments(checker, args, stmt);
pylint::rules::too_many_arguments(checker, parameters, stmt);
}
if checker.enabled(Rule::TooManyReturnStatements) {
if let Some(diagnostic) = pylint::rules::too_many_return_statements(
@@ -282,7 +298,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
stmt,
name,
args,
parameters,
decorator_list,
body,
);
@@ -304,7 +320,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
name,
decorator_list,
args,
parameters,
);
}
if checker.enabled(Rule::BooleanDefaultValueInFunctionDefinition) {
@@ -312,7 +328,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
checker,
name,
decorator_list,
args,
parameters,
);
}
if checker.enabled(Rule::UnexpectedSpecialMethodSignature) {
@@ -321,7 +337,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
name,
decorator_list,
args,
parameters,
);
}
if checker.enabled(Rule::FStringDocstring) {
@@ -363,8 +379,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
Stmt::ClassDef(
class_def @ ast::StmtClassDef {
name,
bases,
keywords,
arguments,
type_params: _,
decorator_list,
body,
@@ -375,21 +390,27 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
flake8_django::rules::nullable_model_string_field(checker, body);
}
if checker.enabled(Rule::DjangoExcludeWithModelForm) {
if let Some(diagnostic) =
flake8_django::rules::exclude_with_model_form(checker, bases, body)
{
if let Some(diagnostic) = flake8_django::rules::exclude_with_model_form(
checker,
arguments.as_deref(),
body,
) {
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::DjangoAllWithModelForm) {
if let Some(diagnostic) =
flake8_django::rules::all_with_model_form(checker, bases, body)
flake8_django::rules::all_with_model_form(checker, arguments.as_deref(), body)
{
checker.diagnostics.push(diagnostic);
}
}
if checker.enabled(Rule::DjangoUnorderedBodyContentInModel) {
flake8_django::rules::unordered_body_content_in_model(checker, bases, body);
flake8_django::rules::unordered_body_content_in_model(
checker,
arguments.as_deref(),
body,
);
}
if !checker.is_stub {
if checker.enabled(Rule::DjangoModelWithoutDunderStr) {
@@ -425,7 +446,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if checker.enabled(Rule::ErrorSuffixOnExceptionName) {
if let Some(diagnostic) = pep8_naming::rules::error_suffix_on_exception_name(
stmt,
bases,
arguments.as_deref(),
name,
&checker.settings.pep8_naming.ignore_names,
) {
@@ -438,7 +459,11 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
Rule::EmptyMethodWithoutAbstractDecorator,
]) {
flake8_bugbear::rules::abstract_base_class(
checker, stmt, name, bases, keywords, body,
checker,
stmt,
name,
arguments.as_deref(),
body,
);
}
}
@@ -449,9 +474,9 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if checker.enabled(Rule::PassInClassBody) {
flake8_pyi::rules::pass_in_class_body(checker, stmt, body);
}
if checker.enabled(Rule::EllipsisInNonEmptyClassBody) {
flake8_pyi::rules::ellipsis_in_non_empty_class_body(checker, stmt, body);
}
}
if checker.enabled(Rule::EllipsisInNonEmptyClassBody) {
flake8_pyi::rules::ellipsis_in_non_empty_class_body(checker, stmt, body);
}
if checker.enabled(Rule::PytestIncorrectMarkParenthesesStyle) {
flake8_pytest_style::rules::marks(checker, decorator_list);
@@ -478,7 +503,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
flake8_builtins::rules::builtin_variable_shadowing(checker, name, name.range());
}
if checker.enabled(Rule::DuplicateBases) {
pylint::rules::duplicate_bases(checker, name, bases);
pylint::rules::duplicate_bases(checker, name, arguments.as_deref());
}
if checker.enabled(Rule::NoSlotsInStrSubclass) {
flake8_slots::rules::no_slots_in_str_subclass(checker, stmt, class_def);
@@ -1337,12 +1362,14 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
}
}
Stmt::AnnAssign(ast::StmtAnnAssign {
target,
value,
annotation,
..
}) => {
Stmt::AnnAssign(
assign_stmt @ ast::StmtAnnAssign {
target,
value,
annotation,
..
},
) => {
if let Some(value) = value {
if checker.enabled(Rule::LambdaAssignment) {
pycodestyle::rules::lambda_assignment(
@@ -1365,6 +1392,9 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
stmt,
);
}
if checker.enabled(Rule::NonPEP695TypeAlias) {
pyupgrade::rules::non_pep695_type_alias(checker, assign_stmt);
}
if checker.is_stub {
if let Some(value) = value {
if checker.enabled(Rule::AssignmentDefaultInStub) {
@@ -1386,13 +1416,13 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
);
}
}
if checker.semantic.match_typing_expr(annotation, "TypeAlias") {
if checker.enabled(Rule::SnakeCaseTypeAlias) {
flake8_pyi::rules::snake_case_type_alias(checker, target);
}
if checker.enabled(Rule::TSuffixedTypeAlias) {
flake8_pyi::rules::t_suffixed_type_alias(checker, target);
}
}
if checker.semantic.match_typing_expr(annotation, "TypeAlias") {
if checker.enabled(Rule::SnakeCaseTypeAlias) {
flake8_pyi::rules::snake_case_type_alias(checker, target);
}
if checker.enabled(Rule::TSuffixedTypeAlias) {
flake8_pyi::rules::t_suffixed_type_alias(checker, target);
}
}
}
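Several hunks above replace the separate `bases`/`keywords` fields of `StmtClassDef` with a single optional `arguments` node, passed to rules via `arguments.as_deref()`. That call is the standard-library idiom `Option<Box<T>>::as_deref`, which borrows through the `Box` to yield an `Option<&T>` without cloning. A minimal sketch of the pattern (the `ClassArguments` struct here is a hypothetical stand-in, not ruff's actual AST node):

```rust
// Illustrative stand-in for an AST node holding class base expressions.
#[derive(Debug)]
struct ClassArguments {
    bases: Vec<String>,
}

// Rule functions can take `Option<&ClassArguments>` and never need to
// know the node was heap-allocated behind a `Box`.
fn count_bases(arguments: Option<&ClassArguments>) -> usize {
    arguments.map_or(0, |args| args.bases.len())
}

fn main() {
    let with_bases: Option<Box<ClassArguments>> = Some(Box::new(ClassArguments {
        bases: vec!["Model".to_string()],
    }));
    let without: Option<Box<ClassArguments>> = None;

    // `as_deref` converts `Option<Box<T>>` into `Option<&T>` by value-borrow.
    assert_eq!(count_bases(with_bases.as_deref()), 1);
    assert_eq!(count_bases(without.as_deref()), 0);
}
```

Taking `Option<&T>` at the call sites keeps the rule signatures independent of how the AST stores the node.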


@@ -31,8 +31,9 @@ use std::path::Path;
use itertools::Itertools;
use log::error;
use ruff_python_ast::{
self as ast, Arg, ArgWithDefault, Arguments, Comprehension, Constant, ElifElseClause,
ExceptHandler, Expr, ExprContext, Keyword, Pattern, Ranged, Stmt, Suite, UnaryOp,
self as ast, Arguments, Comprehension, Constant, ElifElseClause, ExceptHandler, Expr,
ExprContext, Keyword, Parameter, ParameterWithDefault, Parameters, Pattern, Ranged, Stmt,
Suite, UnaryOp,
};
use ruff_text_size::{TextRange, TextSize};
@@ -232,6 +233,11 @@ impl<'a> Checker<'a> {
&self.semantic
}
/// Return `true` if the current file is a stub file (`.pyi`).
pub(crate) const fn is_stub(&self) -> bool {
self.is_stub
}
/// The [`Path`] to the file under analysis.
pub(crate) const fn path(&self) -> &'a Path {
self.path
@@ -334,7 +340,7 @@ where
if alias
.asname
.as_ref()
.map_or(false, |asname| asname.as_str() == alias.name.as_str())
.is_some_and(|asname| asname.as_str() == alias.name.as_str())
{
flags |= BindingFlags::EXPLICIT_EXPORT;
}
@@ -379,7 +385,7 @@ where
if alias
.asname
.as_ref()
.map_or(false, |asname| asname.as_str() == alias.name.as_str())
.is_some_and(|asname| asname.as_str() == alias.name.as_str())
{
flags |= BindingFlags::EXPLICIT_EXPORT;
}
@@ -450,7 +456,7 @@ where
match stmt {
Stmt::FunctionDef(ast::StmtFunctionDef {
body,
args,
parameters,
decorator_list,
returns,
type_params,
@@ -458,7 +464,7 @@ where
})
| Stmt::AsyncFunctionDef(ast::StmtAsyncFunctionDef {
body,
args,
parameters,
decorator_list,
type_params,
returns,
@@ -476,28 +482,28 @@ where
self.semantic.push_scope(ScopeKind::Type);
for type_param in type_params {
self.visit_type_param(type_param);
if let Some(type_params) = type_params {
self.visit_type_params(type_params);
}
for arg_with_default in args
for parameter_with_default in parameters
.posonlyargs
.iter()
.chain(&args.args)
.chain(&args.kwonlyargs)
.chain(&parameters.args)
.chain(&parameters.kwonlyargs)
{
if let Some(expr) = &arg_with_default.def.annotation {
if let Some(expr) = &parameter_with_default.parameter.annotation {
if runtime_annotation {
self.visit_runtime_annotation(expr);
} else {
self.visit_annotation(expr);
};
}
if let Some(expr) = &arg_with_default.default {
if let Some(expr) = &parameter_with_default.default {
self.visit_expr(expr);
}
}
if let Some(arg) = &args.vararg {
if let Some(arg) = &parameters.vararg {
if let Some(expr) = &arg.annotation {
if runtime_annotation {
self.visit_runtime_annotation(expr);
@@ -506,7 +512,7 @@ where
};
}
}
if let Some(arg) = &args.kwarg {
if let Some(arg) = &parameters.kwarg {
if let Some(expr) = &arg.annotation {
if runtime_annotation {
self.visit_runtime_annotation(expr);
@@ -547,8 +553,7 @@ where
Stmt::ClassDef(
class_def @ ast::StmtClassDef {
body,
bases,
keywords,
arguments,
decorator_list,
type_params,
..
@@ -560,14 +565,12 @@ where
self.semantic.push_scope(ScopeKind::Type);
for type_param in type_params {
self.visit_type_param(type_param);
if let Some(type_params) = type_params {
self.visit_type_params(type_params);
}
for expr in bases {
self.visit_expr(expr);
}
for keyword in keywords {
self.visit_keyword(keyword);
if let Some(arguments) = arguments {
self.visit_arguments(arguments);
}
let definition = docstrings::extraction::extract_definition(
@@ -587,14 +590,14 @@ where
self.visit_body(body);
}
Stmt::TypeAlias(ast::StmtTypeAlias {
range: _range,
range: _,
name,
type_params,
value,
}) => {
self.semantic.push_scope(ScopeKind::Type);
for type_param in type_params {
self.visit_type_param(type_param);
if let Some(type_params) = type_params {
self.visit_type_params(type_params);
}
self.visit_expr(value);
self.semantic.pop_scope();
@@ -832,8 +835,7 @@ where
match expr {
Expr::Call(ast::ExprCall {
func,
args: _,
keywords: _,
arguments: _,
range: _,
}) => {
if let Expr::Name(ast::ExprName { id, ctx, range: _ }) = func.as_ref() {
@@ -883,21 +885,21 @@ where
}
Expr::Lambda(
lambda @ ast::ExprLambda {
args,
parameters,
body: _,
range: _,
},
) => {
// Visit the default arguments, but avoid the body, which will be deferred.
for ArgWithDefault {
for ParameterWithDefault {
default,
def: _,
parameter: _,
range: _,
} in args
} in parameters
.posonlyargs
.iter()
.chain(&args.args)
.chain(&args.kwonlyargs)
.chain(&parameters.args)
.chain(&parameters.kwonlyargs)
{
if let Some(expr) = &default {
self.visit_expr(expr);
@@ -919,8 +921,12 @@ where
}
Expr::Call(ast::ExprCall {
func,
args,
keywords,
arguments:
Arguments {
args,
keywords,
range: _,
},
range: _,
}) => {
self.visit_expr(func);
@@ -1098,7 +1104,7 @@ where
arg,
range: _,
} = keyword;
if arg.as_ref().map_or(false, |arg| arg == "type") {
if arg.as_ref().is_some_and(|arg| arg == "type") {
self.visit_type_definition(value);
} else {
self.visit_non_type_definition(value);
@@ -1288,43 +1294,43 @@ where
}
}
fn visit_arguments(&mut self, arguments: &'b Arguments) {
fn visit_parameters(&mut self, parameters: &'b Parameters) {
// Step 1: Binding.
// Bind, but intentionally avoid walking default expressions, as we handle them
// upstream.
for arg_with_default in &arguments.posonlyargs {
self.visit_arg(&arg_with_default.def);
for parameter_with_default in &parameters.posonlyargs {
self.visit_parameter(&parameter_with_default.parameter);
}
for arg_with_default in &arguments.args {
self.visit_arg(&arg_with_default.def);
for parameter_with_default in &parameters.args {
self.visit_parameter(&parameter_with_default.parameter);
}
if let Some(arg) = &arguments.vararg {
self.visit_arg(arg);
if let Some(arg) = &parameters.vararg {
self.visit_parameter(arg);
}
for arg_with_default in &arguments.kwonlyargs {
self.visit_arg(&arg_with_default.def);
for parameter_with_default in &parameters.kwonlyargs {
self.visit_parameter(&parameter_with_default.parameter);
}
if let Some(arg) = &arguments.kwarg {
self.visit_arg(arg);
if let Some(arg) = &parameters.kwarg {
self.visit_parameter(arg);
}
// Step 4: Analysis
analyze::arguments(arguments, self);
analyze::parameters(parameters, self);
}
fn visit_arg(&mut self, arg: &'b Arg) {
fn visit_parameter(&mut self, parameter: &'b Parameter) {
// Step 1: Binding.
// Bind, but intentionally avoid walking the annotation, as we handle it
// upstream.
self.add_binding(
&arg.arg,
arg.identifier(),
&parameter.name,
parameter.identifier(),
BindingKind::Argument,
BindingFlags::empty(),
);
// Step 4: Analysis
analyze::argument(arg, self);
analyze::parameter(parameter, self);
}
fn visit_pattern(&mut self, pattern: &'b Pattern) {
@@ -1731,6 +1737,7 @@ impl<'a> Checker<'a> {
}
fn visit_deferred_future_type_definitions(&mut self) {
let snapshot = self.semantic.snapshot();
while !self.deferred.future_type_definitions.is_empty() {
let type_definitions = std::mem::take(&mut self.deferred.future_type_definitions);
for (expr, snapshot) in type_definitions {
@@ -1741,9 +1748,11 @@ impl<'a> Checker<'a> {
self.visit_expr(expr);
}
}
self.semantic.restore(snapshot);
}
fn visit_deferred_type_param_definitions(&mut self) {
let snapshot = self.semantic.snapshot();
while !self.deferred.type_param_definitions.is_empty() {
let type_params = std::mem::take(&mut self.deferred.type_param_definitions);
for (type_param, snapshot) in type_params {
@@ -1757,9 +1766,11 @@ impl<'a> Checker<'a> {
}
}
}
self.semantic.restore(snapshot);
}
fn visit_deferred_string_type_definitions(&mut self, allocator: &'a typed_arena::Arena<Expr>) {
let snapshot = self.semantic.snapshot();
while !self.deferred.string_type_definitions.is_empty() {
let type_definitions = std::mem::take(&mut self.deferred.string_type_definitions);
for (range, value, snapshot) in type_definitions {
@@ -1770,7 +1781,7 @@ impl<'a> Checker<'a> {
self.semantic.restore(snapshot);
if self.semantic.in_typing_only_annotation() {
if self.semantic.in_annotation() && self.semantic.future_annotations() {
if self.enabled(Rule::QuotedAnnotation) {
pyupgrade::rules::quoted_annotation(self, value, range);
}
@@ -1803,18 +1814,24 @@ impl<'a> Checker<'a> {
}
}
}
self.semantic.restore(snapshot);
}
fn visit_deferred_functions(&mut self) {
let snapshot = self.semantic.snapshot();
while !self.deferred.functions.is_empty() {
let deferred_functions = std::mem::take(&mut self.deferred.functions);
for snapshot in deferred_functions {
self.semantic.restore(snapshot);
match &self.semantic.stmt() {
Stmt::FunctionDef(ast::StmtFunctionDef { body, args, .. })
| Stmt::AsyncFunctionDef(ast::StmtAsyncFunctionDef { body, args, .. }) => {
self.visit_arguments(args);
Stmt::FunctionDef(ast::StmtFunctionDef {
body, parameters, ..
})
| Stmt::AsyncFunctionDef(ast::StmtAsyncFunctionDef {
body, parameters, ..
}) => {
self.visit_parameters(parameters);
self.visit_body(body);
}
_ => {
@@ -1823,31 +1840,36 @@ impl<'a> Checker<'a> {
}
}
}
self.semantic.restore(snapshot);
}
fn visit_deferred_lambdas(&mut self) {
let snapshot = self.semantic.snapshot();
while !self.deferred.lambdas.is_empty() {
let lambdas = std::mem::take(&mut self.deferred.lambdas);
for (expr, snapshot) in lambdas {
self.semantic.restore(snapshot);
if let Expr::Lambda(ast::ExprLambda {
args,
parameters,
body,
range: _,
}) = expr
{
self.visit_arguments(args);
self.visit_parameters(parameters);
self.visit_expr(body);
} else {
unreachable!("Expected Expr::Lambda");
}
}
}
self.semantic.restore(snapshot);
}
/// Run any lint rules that operate over the module exports (i.e., members of `__all__`).
fn visit_exports(&mut self) {
let snapshot = self.semantic.snapshot();
let exports: Vec<(&str, TextRange)> = self
.semantic
.global_scope()
@@ -1890,6 +1912,8 @@ impl<'a> Checker<'a> {
}
}
}
self.semantic.restore(snapshot);
}
}


@@ -191,6 +191,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Pylint, "E1142") => (RuleGroup::Unspecified, rules::pylint::rules::AwaitOutsideAsync),
(Pylint, "E1205") => (RuleGroup::Unspecified, rules::pylint::rules::LoggingTooManyArgs),
(Pylint, "E1206") => (RuleGroup::Unspecified, rules::pylint::rules::LoggingTooFewArgs),
(Pylint, "E1300") => (RuleGroup::Unspecified, rules::pylint::rules::BadStringFormatCharacter),
(Pylint, "E1307") => (RuleGroup::Unspecified, rules::pylint::rules::BadStringFormatType),
(Pylint, "E1310") => (RuleGroup::Unspecified, rules::pylint::rules::BadStrStripCall),
(Pylint, "E1507") => (RuleGroup::Unspecified, rules::pylint::rules::InvalidEnvvarValue),
@@ -438,6 +439,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Pyupgrade, "037") => (RuleGroup::Unspecified, rules::pyupgrade::rules::QuotedAnnotation),
(Pyupgrade, "038") => (RuleGroup::Unspecified, rules::pyupgrade::rules::NonPEP604Isinstance),
(Pyupgrade, "039") => (RuleGroup::Unspecified, rules::pyupgrade::rules::UnnecessaryClassParentheses),
(Pyupgrade, "040") => (RuleGroup::Unspecified, rules::pyupgrade::rules::NonPEP695TypeAlias),
// pydocstyle
(Pydocstyle, "100") => (RuleGroup::Unspecified, rules::pydocstyle::rules::UndocumentedPublicModule),
@@ -636,6 +638,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Flake8Pyi, "016") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::DuplicateUnionMember),
(Flake8Pyi, "017") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::ComplexAssignmentInStub),
(Flake8Pyi, "018") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::UnusedPrivateTypeVar),
(Flake8Pyi, "019") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::CustomTypeVarReturnType),
(Flake8Pyi, "020") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::QuotedAnnotationInStub),
(Flake8Pyi, "021") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::DocstringInStub),
(Flake8Pyi, "024") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::CollectionsNamedTuple),
@@ -658,9 +661,11 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Flake8Pyi, "048") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::StubBodyMultipleStatements),
(Flake8Pyi, "049") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::UnusedPrivateTypedDict),
(Flake8Pyi, "050") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::NoReturnArgumentAnnotationInStub),
(Flake8Pyi, "051") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::RedundantLiteralUnion),
(Flake8Pyi, "052") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::UnannotatedAssignmentInStub),
(Flake8Pyi, "054") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::NumericLiteralTooLong),
(Flake8Pyi, "053") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::StringOrBytesTooLong),
(Flake8Pyi, "055") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::UnnecessaryTypeUnion),
(Flake8Pyi, "056") => (RuleGroup::Unspecified, rules::flake8_pyi::rules::UnsupportedMethodCallOnAll),
// flake8-pytest-style


@@ -402,7 +402,7 @@ fn is_docstring_section(
}
// Determine whether the next line is an underline, e.g., "-----".
let next_line_is_underline = next_line.map_or(false, |next_line| {
let next_line_is_underline = next_line.is_some_and(|next_line| {
let next_line = next_line.trim();
if next_line.is_empty() {
false


@@ -299,12 +299,12 @@ fn match_leading_semicolon(s: &str) -> Option<TextSize> {
#[cfg(test)]
mod tests {
use anyhow::Result;
use ruff_python_ast::Suite;
use ruff_python_parser::lexer::LexResult;
use ruff_python_parser::Parse;
use ruff_text_size::TextSize;
use ruff_python_codegen::Stylist;
use ruff_python_parser::parse_suite;
use ruff_source_file::{LineEnding, Locator};
use super::Insertion;
@@ -312,7 +312,7 @@ mod tests {
#[test]
fn start_of_file() -> Result<()> {
fn insert(contents: &str) -> Result<Insertion> {
let program = Suite::parse(contents, "<filename>")?;
let program = parse_suite(contents, "<filename>")?;
let tokens: Vec<LexResult> = ruff_python_parser::tokenize(contents);
let locator = Locator::new(contents);
let stylist = Stylist::from_tokens(&tokens, &locator);


@@ -305,7 +305,7 @@ impl<'a> Importer<'a> {
}) = stmt
{
if level.map_or(true, |level| level.to_u32() == 0)
&& name.as_ref().map_or(false, |name| name == module)
&& name.as_ref().is_some_and(|name| name == module)
{
import_from = Some(*stmt);
}


@@ -376,7 +376,7 @@ impl Notebook {
1
} else {
let trailing_newline =
usize::from(string_array.last().map_or(false, |s| s.ends_with('\n')));
usize::from(string_array.last().is_some_and(|s| s.ends_with('\n')));
u32::try_from(string_array.len() + trailing_newline).unwrap()
}
}


@@ -141,7 +141,7 @@ impl<'a> EmitterContext<'a> {
pub fn is_jupyter_notebook(&self, name: &str) -> bool {
self.source_kind
.get(name)
.map_or(false, SourceKind::is_jupyter)
.is_some_and(SourceKind::is_jupyter)
}
pub fn source_kind(&self, name: &str) -> Option<&SourceKind> {


@@ -60,7 +60,7 @@ impl<'a> Directive<'a> {
if text[..comment_start]
.chars()
.last()
.map_or(false, |c| c != '#')
.is_some_and(|c| c != '#')
{
continue;
}


@@ -304,7 +304,7 @@ pub fn python_files_in_path(
if let Ok(entry) = &result {
if entry
.file_type()
.map_or(false, |file_type| file_type.is_dir())
.is_some_and(|file_type| file_type.is_dir())
{
match settings_toml(entry.path()) {
Ok(Some(pyproject)) => match resolve_scoped_settings(


@@ -1,10 +1,8 @@
use ruff_python_ast as ast;
use ruff_python_ast::{Expr, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::find_keyword;
use ruff_python_ast as ast;
use ruff_python_ast::Constant;
use ruff_python_ast::{Expr, Ranged};
use crate::checkers::ast::Checker;
@@ -60,7 +58,10 @@ pub(crate) fn variable_name_task_id(
};
// If the value is not a call, we can't do anything.
let Expr::Call(ast::ExprCall { func, keywords, .. }) = value else {
let Expr::Call(ast::ExprCall {
func, arguments, ..
}) = value
else {
return None;
};
@@ -68,13 +69,13 @@ pub(crate) fn variable_name_task_id(
if !checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| matches!(call_path[0], "airflow"))
.is_some_and(|call_path| matches!(call_path[0], "airflow"))
{
return None;
}
// If the call doesn't have a `task_id` keyword argument, we can't do anything.
let keyword = find_keyword(keywords, "task_id")?;
let keyword = arguments.find_keyword("task_id")?;
// If the keyword argument is not a string, we can't do anything.
let task_id = match &keyword.value {


@@ -1,8 +1,8 @@
/// See: [eradicate.py](https://github.com/myint/eradicate/blob/98f199940979c94447a461d50d27862b118b282d/eradicate.py)
use once_cell::sync::Lazy;
use regex::Regex;
use ruff_python_ast::Suite;
use ruff_python_parser::Parse;
use ruff_python_parser::parse_suite;
static ALLOWLIST_REGEX: Lazy<Regex> = Lazy::new(|| {
Regex::new(
@@ -79,7 +79,7 @@ pub(crate) fn comment_contains_code(line: &str, task_tags: &[String]) -> bool {
}
// Finally, compile the source code.
Suite::parse(&line, "<filename>").is_ok()
parse_suite(&line, "<filename>").is_ok()
}
/// Returns `true` if a line is probably part of some multiline code.


@@ -5,5 +5,5 @@ use ruff_python_semantic::SemanticModel;
pub(super) fn is_sys(expr: &Expr, target: &str, semantic: &SemanticModel) -> bool {
semantic
.resolve_call_path(expr)
.map_or(false, |call_path| call_path.as_slice() == ["sys", target])
.is_some_and(|call_path| call_path.as_slice() == ["sys", target])
}
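Many of the hunks in this diff are the same mechanical migration: `Option::map_or(false, f)` rewritten as `Option::is_some_and(f)` (stabilized in Rust 1.70). A minimal sketch of the equivalence:

```rust
fn main() {
    let call_path: Option<&str> = Some("sys");

    // Old style: supply `false` as the default for the `None` case.
    let old = call_path.map_or(false, |p| p == "sys");
    // New style: reads as "is Some and the predicate holds".
    let new = call_path.is_some_and(|p| p == "sys");
    assert_eq!(old, new);

    // Both forms return `false` for `None` without unwrapping.
    assert!(!None::<&str>.is_some_and(|p| p == "sys"));
}
```

The two forms are behaviorally identical; `is_some_and` simply avoids the easy-to-misread `false` default argument.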


@@ -49,9 +49,7 @@ pub(crate) fn name_or_attribute(checker: &mut Checker, expr: &Expr) {
if checker
.semantic()
.resolve_call_path(expr)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["six", "PY3"])
})
.is_some_and(|call_path| matches!(call_path.as_slice(), ["six", "PY3"]))
{
checker
.diagnostics


@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, Arguments, Expr, Stmt};
use ruff_python_ast::{self as ast, Expr, Parameters, Stmt};
use ruff_python_ast::cast;
use ruff_python_semantic::analyze::visibility;
@@ -6,11 +6,11 @@ use ruff_python_semantic::{Definition, Member, MemberKind, SemanticModel};
pub(super) fn match_function_def(
stmt: &Stmt,
) -> (&str, &Arguments, Option<&Expr>, &[Stmt], &[ast::Decorator]) {
) -> (&str, &Parameters, Option<&Expr>, &[Stmt], &[ast::Decorator]) {
match stmt {
Stmt::FunctionDef(ast::StmtFunctionDef {
name,
args,
parameters,
returns,
body,
decorator_list,
@@ -18,14 +18,14 @@ pub(super) fn match_function_def(
})
| Stmt::AsyncFunctionDef(ast::StmtAsyncFunctionDef {
name,
args,
parameters,
returns,
body,
decorator_list,
..
}) => (
name,
args,
parameters,
returns.as_ref().map(AsRef::as_ref),
body,
decorator_list,

View File

@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, ArgWithDefault, Constant, Expr, Ranged, Stmt};
use ruff_python_ast::{self as ast, Constant, Expr, ParameterWithDefault, Ranged, Stmt};
use ruff_diagnostics::{AlwaysAutofixableViolation, Diagnostic, Fix, Violation};
use ruff_macros::{derive_message_formats, violation};
@@ -524,8 +524,8 @@ pub(crate) fn definition(
let is_overridden = visibility::is_override(decorator_list, checker.semantic());
// ANN001, ANN401
for ArgWithDefault {
def,
for ParameterWithDefault {
parameter,
default: _,
range: _,
} in arguments
@@ -542,26 +542,29 @@ pub(crate) fn definition(
)
{
// ANN401 for dynamically typed arguments
if let Some(annotation) = &def.annotation {
if let Some(annotation) = &parameter.annotation {
has_any_typed_arg = true;
if checker.enabled(Rule::AnyType) && !is_overridden {
check_dynamically_typed(
checker,
annotation,
|| def.arg.to_string(),
|| parameter.name.to_string(),
&mut diagnostics,
);
}
} else {
if !(checker.settings.flake8_annotations.suppress_dummy_args
&& checker.settings.dummy_variable_rgx.is_match(&def.arg))
&& checker
.settings
.dummy_variable_rgx
.is_match(&parameter.name))
{
if checker.enabled(Rule::MissingTypeFunctionArgument) {
diagnostics.push(Diagnostic::new(
MissingTypeFunctionArgument {
name: def.arg.to_string(),
name: parameter.name.to_string(),
},
def.range(),
parameter.range(),
));
}
}
@@ -574,18 +577,18 @@ pub(crate) fn definition(
has_any_typed_arg = true;
if !checker.settings.flake8_annotations.allow_star_arg_any {
if checker.enabled(Rule::AnyType) && !is_overridden {
let name = &arg.arg;
let name = &arg.name;
check_dynamically_typed(checker, expr, || format!("*{name}"), &mut diagnostics);
}
}
} else {
if !(checker.settings.flake8_annotations.suppress_dummy_args
&& checker.settings.dummy_variable_rgx.is_match(&arg.arg))
&& checker.settings.dummy_variable_rgx.is_match(&arg.name))
{
if checker.enabled(Rule::MissingTypeArgs) {
diagnostics.push(Diagnostic::new(
MissingTypeArgs {
name: arg.arg.to_string(),
name: arg.name.to_string(),
},
arg.range(),
));
@@ -600,7 +603,7 @@ pub(crate) fn definition(
has_any_typed_arg = true;
if !checker.settings.flake8_annotations.allow_star_arg_any {
if checker.enabled(Rule::AnyType) && !is_overridden {
let name = &arg.arg;
let name = &arg.name;
check_dynamically_typed(
checker,
expr,
@@ -611,12 +614,12 @@ pub(crate) fn definition(
}
} else {
if !(checker.settings.flake8_annotations.suppress_dummy_args
&& checker.settings.dummy_variable_rgx.is_match(&arg.arg))
&& checker.settings.dummy_variable_rgx.is_match(&arg.name))
{
if checker.enabled(Rule::MissingTypeKwargs) {
diagnostics.push(Diagnostic::new(
MissingTypeKwargs {
name: arg.arg.to_string(),
name: arg.name.to_string(),
},
arg.range(),
));
@@ -627,8 +630,8 @@ pub(crate) fn definition(
// ANN101, ANN102
if is_method && !visibility::is_staticmethod(cast::decorator_list(stmt), checker.semantic()) {
if let Some(ArgWithDefault {
def,
if let Some(ParameterWithDefault {
parameter,
default: _,
range: _,
}) = arguments
@@ -636,23 +639,23 @@ pub(crate) fn definition(
.first()
.or_else(|| arguments.args.first())
{
if def.annotation.is_none() {
if parameter.annotation.is_none() {
if visibility::is_classmethod(cast::decorator_list(stmt), checker.semantic()) {
if checker.enabled(Rule::MissingTypeCls) {
diagnostics.push(Diagnostic::new(
MissingTypeCls {
name: def.arg.to_string(),
name: parameter.name.to_string(),
},
def.range(),
parameter.range(),
));
}
} else {
if checker.enabled(Rule::MissingTypeSelf) {
diagnostics.push(Diagnostic::new(
MissingTypeSelf {
name: def.arg.to_string(),
name: parameter.name.to_string(),
},
def.range(),
parameter.range(),
));
}
}


@@ -68,7 +68,7 @@ pub(crate) fn blocking_http_call(checker: &mut Checker, expr: &Expr) {
.semantic()
.resolve_call_path(func)
.as_ref()
.map_or(false, is_blocking_http_call)
.is_some_and(is_blocking_http_call)
{
checker.diagnostics.push(Diagnostic::new(
BlockingHttpCallInAsyncFunction,


@@ -48,7 +48,7 @@ pub(crate) fn blocking_os_call(checker: &mut Checker, expr: &Expr) {
.semantic()
.resolve_call_path(func)
.as_ref()
.map_or(false, is_unsafe_os_method)
.is_some_and(is_unsafe_os_method)
{
checker
.diagnostics


@@ -48,7 +48,7 @@ pub(crate) fn open_sleep_or_subprocess_call(checker: &mut Checker, expr: &Expr)
.semantic()
.resolve_call_path(func)
.as_ref()
.map_or(false, is_open_sleep_or_subprocess_call)
.is_some_and(is_open_sleep_or_subprocess_call)
{
checker.diagnostics.push(Diagnostic::new(
OpenSleepOrSubprocessInAsyncFunction,


@@ -26,18 +26,14 @@ pub(super) fn is_untyped_exception(type_: Option<&Expr>, semantic: &SemanticMode
type_.map_or(true, |type_| {
if let Expr::Tuple(ast::ExprTuple { elts, .. }) = &type_ {
elts.iter().any(|type_| {
semantic
.resolve_call_path(type_)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["", "Exception" | "BaseException"])
})
})
} else {
semantic
.resolve_call_path(type_)
.map_or(false, |call_path| {
semantic.resolve_call_path(type_).is_some_and(|call_path| {
matches!(call_path.as_slice(), ["", "Exception" | "BaseException"])
})
})
} else {
semantic.resolve_call_path(type_).is_some_and(|call_path| {
matches!(call_path.as_slice(), ["", "Exception" | "BaseException"])
})
}
})
}


@@ -1,10 +1,9 @@
use num_traits::ToPrimitive;
use ruff_python_ast::{self as ast, Constant, Expr, Keyword, Operator, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::call_path::CallPath;
use ruff_python_ast::helpers::CallArguments;
use ruff_python_ast::{self as ast, Constant, Expr, Operator, Ranged};
use ruff_python_semantic::SemanticModel;
use crate::checkers::ast::Checker;
@@ -48,21 +47,13 @@ impl Violation for BadFilePermissions {
}
/// S103
pub(crate) fn bad_file_permissions(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
) {
pub(crate) fn bad_file_permissions(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["os", "chmod"])
})
.resolve_call_path(&call.func)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["os", "chmod"]))
{
let call_args = CallArguments::new(args, keywords);
if let Some(mode_arg) = call_args.argument("mode", 1) {
if let Some(mode_arg) = call.arguments.find_argument("mode", 1) {
if let Some(int_value) = int_value(mode_arg, checker.semantic()) {
if (int_value & WRITE_WORLD > 0) || (int_value & EXECUTE_GROUP > 0) {
checker.diagnostics.push(Diagnostic::new(


@@ -35,9 +35,7 @@ pub(crate) fn exec_used(checker: &mut Checker, func: &Expr) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["" | "builtin", "exec"])
})
.is_some_and(|call_path| matches!(call_path.as_slice(), ["" | "builtin", "exec"]))
{
checker
.diagnostics


@@ -1,4 +1,4 @@
use ruff_python_ast::{Arg, ArgWithDefault, Arguments, Expr, Ranged};
use ruff_python_ast::{Expr, Parameter, ParameterWithDefault, Parameters, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
@@ -53,9 +53,9 @@ impl Violation for HardcodedPasswordDefault {
}
}
fn check_password_kwarg(arg: &Arg, default: &Expr) -> Option<Diagnostic> {
fn check_password_kwarg(parameter: &Parameter, default: &Expr) -> Option<Diagnostic> {
string_literal(default).filter(|string| !string.is_empty())?;
let kwarg_name = &arg.arg;
let kwarg_name = &parameter.name;
if !matches_password_name(kwarg_name) {
return None;
}
@@ -68,21 +68,21 @@ fn check_password_kwarg(arg: &Arg, default: &Expr) -> Option<Diagnostic> {
}
/// S107
pub(crate) fn hardcoded_password_default(checker: &mut Checker, arguments: &Arguments) {
for ArgWithDefault {
def,
pub(crate) fn hardcoded_password_default(checker: &mut Checker, parameters: &Parameters) {
for ParameterWithDefault {
parameter,
default,
range: _,
} in arguments
} in parameters
.posonlyargs
.iter()
.chain(&arguments.args)
.chain(&arguments.kwonlyargs)
.chain(&parameters.args)
.chain(&parameters.kwonlyargs)
{
let Some(default) = default else {
continue;
};
if let Some(diagnostic) = check_password_kwarg(def, default) {
if let Some(diagnostic) = check_password_kwarg(parameter, default) {
checker.diagnostics.push(diagnostic);
}
}

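The `Arguments` → `Parameters` rename also standardizes how rules walk a function signature: positional-only, regular, and keyword-only parameters are visited in one pass by chaining the three vectors, as in the `hardcoded_password_default` hunk above. A self-contained sketch with simplified stand-in types (the real `ruff_python_ast` structs carry `Expr` nodes and ranges, not strings):

```rust
struct ParameterWithDefault {
    name: String,
    default: Option<String>, // simplified: the real field holds an Expr
}

struct Parameters {
    posonlyargs: Vec<ParameterWithDefault>,
    args: Vec<ParameterWithDefault>,
    kwonlyargs: Vec<ParameterWithDefault>,
}

/// Collect the names of all parameters that carry a default value,
/// in posonly -> regular -> kwonly order, mirroring the rule code above.
fn defaulted_names(parameters: &Parameters) -> Vec<&str> {
    parameters
        .posonlyargs
        .iter()
        .chain(&parameters.args)
        .chain(&parameters.kwonlyargs)
        .filter(|p| p.default.is_some())
        .map(|p| p.name.as_str())
        .collect()
}

fn main() {
    let params = Parameters {
        posonlyargs: vec![ParameterWithDefault { name: "a".into(), default: None }],
        args: vec![ParameterWithDefault { name: "b".into(), default: Some("1".into()) }],
        kwonlyargs: vec![ParameterWithDefault { name: "c".into(), default: Some("2".into()) }],
    };
    assert_eq!(defaulted_names(&params), ["b", "c"]);
}
```

`Iterator::chain` accepts any `IntoIterator` with the same item type, so chaining `&Vec<_>` onto an `iter()` works without extra calls.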

@@ -1,8 +1,7 @@
use ruff_python_ast::{Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::{find_keyword, is_const_false, CallArguments};
use ruff_python_ast::helpers::is_const_false;
use ruff_python_ast::{self as ast, Arguments, Ranged};
use crate::checkers::ast::Checker;
@@ -60,30 +59,26 @@ impl Violation for HashlibInsecureHashFunction {
}
/// S324
pub(crate) fn hashlib_insecure_hash_functions(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
) {
if let Some(hashlib_call) = checker
.semantic()
.resolve_call_path(func)
.and_then(|call_path| match call_path.as_slice() {
["hashlib", "new"] => Some(HashlibCall::New),
["hashlib", "md4"] => Some(HashlibCall::WeakHash("md4")),
["hashlib", "md5"] => Some(HashlibCall::WeakHash("md5")),
["hashlib", "sha"] => Some(HashlibCall::WeakHash("sha")),
["hashlib", "sha1"] => Some(HashlibCall::WeakHash("sha1")),
_ => None,
})
pub(crate) fn hashlib_insecure_hash_functions(checker: &mut Checker, call: &ast::ExprCall) {
if let Some(hashlib_call) =
checker
.semantic()
.resolve_call_path(&call.func)
.and_then(|call_path| match call_path.as_slice() {
["hashlib", "new"] => Some(HashlibCall::New),
["hashlib", "md4"] => Some(HashlibCall::WeakHash("md4")),
["hashlib", "md5"] => Some(HashlibCall::WeakHash("md5")),
["hashlib", "sha"] => Some(HashlibCall::WeakHash("sha")),
["hashlib", "sha1"] => Some(HashlibCall::WeakHash("sha1")),
_ => None,
})
{
if !is_used_for_security(keywords) {
if !is_used_for_security(&call.arguments) {
return;
}
match hashlib_call {
HashlibCall::New => {
if let Some(name_arg) = CallArguments::new(args, keywords).argument("name", 0) {
if let Some(name_arg) = call.arguments.find_argument("name", 0) {
if let Some(hash_func_name) = string_literal(name_arg) {
// `hashlib.new` accepts both lowercase and uppercase names for hash
// functions.
@@ -106,15 +101,16 @@ pub(crate) fn hashlib_insecure_hash_functions(
HashlibInsecureHashFunction {
string: (*func_name).to_string(),
},
func.range(),
call.func.range(),
));
}
}
}
}
fn is_used_for_security(keywords: &[Keyword]) -> bool {
find_keyword(keywords, "usedforsecurity")
fn is_used_for_security(arguments: &Arguments) -> bool {
arguments
.find_keyword("usedforsecurity")
.map_or(true, |keyword| !is_const_false(&keyword.value))
}


@@ -1,8 +1,6 @@
use ruff_python_ast::helpers::find_keyword;
use ruff_python_ast::{self as ast, Constant, Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::{self as ast, Constant, Expr, Ranged};
use crate::checkers::ast::Checker;
@@ -59,15 +57,13 @@ impl Violation for Jinja2AutoescapeFalse {
}
/// S701
pub(crate) fn jinja2_autoescape_false(checker: &mut Checker, func: &Expr, keywords: &[Keyword]) {
pub(crate) fn jinja2_autoescape_false(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["jinja2", "Environment"])
})
.resolve_call_path(&call.func)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["jinja2", "Environment"]))
{
if let Some(keyword) = find_keyword(keywords, "autoescape") {
if let Some(keyword) = call.arguments.find_keyword("autoescape") {
match &keyword.value {
Expr::Constant(ast::ExprConstant {
value: Constant::Bool(true),
@@ -91,7 +87,7 @@ pub(crate) fn jinja2_autoescape_false(checker: &mut Checker, func: &Expr, keywor
} else {
checker.diagnostics.push(Diagnostic::new(
Jinja2AutoescapeFalse { value: false },
func.range(),
call.func.range(),
));
}
}


@@ -1,8 +1,6 @@
use ruff_python_ast::{Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::find_keyword;
use ruff_python_ast::{self as ast, Ranged};
use crate::checkers::ast::Checker;
@@ -35,24 +33,19 @@ impl Violation for LoggingConfigInsecureListen {
}
/// S612
pub(crate) fn logging_config_insecure_listen(
checker: &mut Checker,
func: &Expr,
keywords: &[Keyword],
) {
pub(crate) fn logging_config_insecure_listen(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["logging", "config", "listen"])
})
.resolve_call_path(&call.func)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["logging", "config", "listen"]))
{
if find_keyword(keywords, "verify").is_some() {
if call.arguments.find_keyword("verify").is_some() {
return;
}
checker
.diagnostics
.push(Diagnostic::new(LoggingConfigInsecureListen, func.range()));
checker.diagnostics.push(Diagnostic::new(
LoggingConfigInsecureListen,
call.func.range(),
));
}
}


@@ -39,9 +39,7 @@ pub(crate) fn paramiko_call(checker: &mut Checker, func: &Expr) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["paramiko", "exec_command"])
})
.is_some_and(|call_path| matches!(call_path.as_slice(), ["paramiko", "exec_command"]))
{
checker
.diagnostics


@@ -1,8 +1,7 @@
use ruff_python_ast::{Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::{find_keyword, is_const_false};
use ruff_python_ast::helpers::is_const_false;
use ruff_python_ast::{self as ast, Ranged};
use crate::checkers::ast::Checker;
@@ -46,14 +45,10 @@ impl Violation for RequestWithNoCertValidation {
}
/// S501
pub(crate) fn request_with_no_cert_validation(
checker: &mut Checker,
func: &Expr,
keywords: &[Keyword],
) {
pub(crate) fn request_with_no_cert_validation(checker: &mut Checker, call: &ast::ExprCall) {
if let Some(target) = checker
.semantic()
.resolve_call_path(func)
.resolve_call_path(&call.func)
.and_then(|call_path| match call_path.as_slice() {
["requests", "get" | "options" | "head" | "post" | "put" | "patch" | "delete"] => {
Some("requests")
@@ -63,7 +58,7 @@ pub(crate) fn request_with_no_cert_validation(
_ => None,
})
{
if let Some(keyword) = find_keyword(keywords, "verify") {
if let Some(keyword) = call.arguments.find_keyword("verify") {
if is_const_false(&keyword.value) {
checker.diagnostics.push(Diagnostic::new(
RequestWithNoCertValidation {


@@ -1,8 +1,7 @@
use ruff_python_ast::{Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::{find_keyword, is_const_none};
use ruff_python_ast::helpers::is_const_none;
use ruff_python_ast::{self as ast, Ranged};
use crate::checkers::ast::Checker;
@@ -49,11 +48,11 @@ impl Violation for RequestWithoutTimeout {
}
/// S113
pub(crate) fn request_without_timeout(checker: &mut Checker, func: &Expr, keywords: &[Keyword]) {
pub(crate) fn request_without_timeout(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
.resolve_call_path(&call.func)
.is_some_and(|call_path| {
matches!(
call_path.as_slice(),
[
@@ -63,7 +62,7 @@ pub(crate) fn request_without_timeout(checker: &mut Checker, func: &Expr, keywor
)
})
{
if let Some(keyword) = find_keyword(keywords, "timeout") {
if let Some(keyword) = call.arguments.find_keyword("timeout") {
if is_const_none(&keyword.value) {
checker.diagnostics.push(Diagnostic::new(
RequestWithoutTimeout { implicit: false },
@@ -73,7 +72,7 @@ pub(crate) fn request_without_timeout(checker: &mut Checker, func: &Expr, keywor
} else {
checker.diagnostics.push(Diagnostic::new(
RequestWithoutTimeout { implicit: true },
func.range(),
call.func.range(),
));
}
}

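A third recurring change is that the free helpers `find_keyword(keywords, name)` and `CallArguments::argument(name, position)` become methods on the call's `Arguments` node (`call.arguments.find_keyword(...)` / `find_argument(...)`). A hedged sketch of what such lookups do, using simplified string-valued types rather than the real `Expr`-based ones:

```rust
struct Keyword {
    arg: Option<String>, // None for a `**kwargs` splat
    value: String,       // simplified: the real field holds an Expr
}

struct Arguments {
    args: Vec<String>, // positional argument values (simplified)
    keywords: Vec<Keyword>,
}

impl Arguments {
    /// Find a keyword argument by name, as in `arguments.find_keyword("timeout")`.
    fn find_keyword(&self, name: &str) -> Option<&Keyword> {
        self.keywords
            .iter()
            .find(|kw| kw.arg.as_deref() == Some(name))
    }

    /// Find an argument passed either by keyword `name` or at `position`,
    /// as in `arguments.find_argument("mode", 1)`.
    fn find_argument(&self, name: &str, position: usize) -> Option<&String> {
        self.find_keyword(name)
            .map(|kw| &kw.value)
            .or_else(|| self.args.get(position))
    }
}

fn main() {
    let call = Arguments {
        args: vec!["url".into()],
        keywords: vec![Keyword { arg: Some("verify".into()), value: "False".into() }],
    };
    assert!(call.find_keyword("verify").is_some());
    assert!(call.find_keyword("timeout").is_none());
    assert_eq!(call.find_argument("verify", 9), Some(&"False".to_string()));
    assert_eq!(call.find_argument("mode", 0), Some(&"url".to_string()));
}
```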

@@ -1,10 +1,9 @@
//! Checks relating to shell injection.
use ruff_python_ast::{self as ast, Constant, Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::{find_keyword, Truthiness};
use ruff_python_ast::helpers::Truthiness;
use ruff_python_ast::{self as ast, Arguments, Constant, Expr, Keyword, Ranged};
use ruff_python_semantic::SemanticModel;
use crate::{
@@ -144,17 +143,12 @@ impl Violation for UnixCommandWildcardInjection {
}
/// S602, S603, S604, S605, S606, S607, S609
pub(crate) fn shell_injection(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
) {
let call_kind = get_call_kind(func, checker.semantic());
let shell_keyword = find_shell_keyword(keywords, checker.semantic());
pub(crate) fn shell_injection(checker: &mut Checker, call: &ast::ExprCall) {
let call_kind = get_call_kind(&call.func, checker.semantic());
let shell_keyword = find_shell_keyword(&call.arguments, checker.semantic());
if matches!(call_kind, Some(CallKind::Subprocess)) {
if let Some(arg) = args.first() {
if let Some(arg) = call.arguments.args.first() {
match shell_keyword {
// S602
Some(ShellKeyword {
@@ -209,7 +203,7 @@ pub(crate) fn shell_injection(
// S605
if checker.enabled(Rule::StartProcessWithAShell) {
if matches!(call_kind, Some(CallKind::Shell)) {
if let Some(arg) = args.first() {
if let Some(arg) = call.arguments.args.first() {
checker.diagnostics.push(Diagnostic::new(
StartProcessWithAShell {
seems_safe: shell_call_seems_safe(arg),
@@ -225,14 +219,14 @@ pub(crate) fn shell_injection(
if matches!(call_kind, Some(CallKind::NoShell)) {
checker
.diagnostics
.push(Diagnostic::new(StartProcessWithNoShell, func.range()));
.push(Diagnostic::new(StartProcessWithNoShell, call.func.range()));
}
}
// S607
if checker.enabled(Rule::StartProcessWithPartialPath) {
if call_kind.is_some() {
if let Some(arg) = args.first() {
if let Some(arg) = call.arguments.args.first() {
if is_partial_path(arg) {
checker
.diagnostics
@@ -256,11 +250,12 @@ pub(crate) fn shell_injection(
)
)
{
if let Some(arg) = args.first() {
if let Some(arg) = call.arguments.args.first() {
if is_wildcard_command(arg) {
checker
.diagnostics
.push(Diagnostic::new(UnixCommandWildcardInjection, func.range()));
checker.diagnostics.push(Diagnostic::new(
UnixCommandWildcardInjection,
call.func.range(),
));
}
}
}
@@ -317,10 +312,10 @@ struct ShellKeyword<'a> {
/// Return the `shell` keyword argument to the given function call, if any.
fn find_shell_keyword<'a>(
keywords: &'a [Keyword],
arguments: &'a Arguments,
semantic: &SemanticModel,
) -> Option<ShellKeyword<'a>> {
find_keyword(keywords, "shell").map(|keyword| ShellKeyword {
arguments.find_keyword("shell").map(|keyword| ShellKeyword {
truthiness: Truthiness::from_expr(&keyword.value, |id| semantic.is_builtin(id)),
keyword,
})
@@ -379,7 +374,7 @@ fn is_partial_path(expr: &Expr) -> bool {
Expr::List(ast::ExprList { elts, .. }) => elts.first().and_then(string_literal),
_ => string_literal(expr),
};
string_literal.map_or(false, |text| !is_full_path(text))
string_literal.is_some_and(|text| !is_full_path(text))
}
/// Return `true` if the [`Expr`] is a wildcard command.
@@ -410,7 +405,7 @@ fn is_wildcard_command(expr: &Expr) -> bool {
has_star && has_command
} else {
let string_literal = string_literal(expr);
string_literal.map_or(false, |text| {
string_literal.is_some_and(|text| {
text.contains('*')
&& (text.contains("chown")
|| text.contains("chmod")


@@ -1,9 +1,8 @@
use num_traits::{One, Zero};
use ruff_python_ast::{self as ast, Constant, Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::find_keyword;
use ruff_python_ast::{self as ast, Constant, Expr, Ranged};
use crate::checkers::ast::Checker;
@@ -43,15 +42,15 @@ impl Violation for SnmpInsecureVersion {
}
/// S508
pub(crate) fn snmp_insecure_version(checker: &mut Checker, func: &Expr, keywords: &[Keyword]) {
pub(crate) fn snmp_insecure_version(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
.resolve_call_path(&call.func)
.is_some_and(|call_path| {
matches!(call_path.as_slice(), ["pysnmp", "hlapi", "CommunityData"])
})
{
if let Some(keyword) = find_keyword(keywords, "mpModel") {
if let Some(keyword) = call.arguments.find_keyword("mpModel") {
if let Expr::Constant(ast::ExprConstant {
value: Constant::Int(value),
..


@@ -1,8 +1,6 @@
use ruff_python_ast::{Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::CallArguments;
use ruff_python_ast::{self as ast, Ranged};
use crate::checkers::ast::Checker;
@@ -42,23 +40,18 @@ impl Violation for SnmpWeakCryptography {
}
/// S509
pub(crate) fn snmp_weak_cryptography(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["pysnmp", "hlapi", "UsmUserData"])
})
{
if CallArguments::new(args, keywords).len() < 3 {
pub(crate) fn snmp_weak_cryptography(checker: &mut Checker, call: &ast::ExprCall) {
if call.arguments.len() < 3 {
if checker
.semantic()
.resolve_call_path(&call.func)
.is_some_and(|call_path| {
matches!(call_path.as_slice(), ["pysnmp", "hlapi", "UsmUserData"])
})
{
checker
.diagnostics
.push(Diagnostic::new(SnmpWeakCryptography, func.range()));
.push(Diagnostic::new(SnmpWeakCryptography, call.func.range()));
}
}
}


@@ -1,8 +1,6 @@
use ruff_python_ast::{self as ast, Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::CallArguments;
use ruff_python_ast::{self as ast, Expr, Ranged};
use crate::checkers::ast::Checker;
@@ -60,25 +58,17 @@ impl Violation for UnsafeYAMLLoad {
}
/// S506
pub(crate) fn unsafe_yaml_load(
checker: &mut Checker,
func: &Expr,
args: &[Expr],
keywords: &[Keyword],
) {
pub(crate) fn unsafe_yaml_load(checker: &mut Checker, call: &ast::ExprCall) {
if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["yaml", "load"])
})
.resolve_call_path(&call.func)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["yaml", "load"]))
{
let call_args = CallArguments::new(args, keywords);
if let Some(loader_arg) = call_args.argument("Loader", 1) {
if let Some(loader_arg) = call.arguments.find_argument("Loader", 1) {
if !checker
.semantic()
.resolve_call_path(loader_arg)
.map_or(false, |call_path| {
.is_some_and(|call_path| {
matches!(call_path.as_slice(), ["yaml", "SafeLoader" | "CSafeLoader"])
})
{
@@ -95,7 +85,7 @@ pub(crate) fn unsafe_yaml_load(
} else {
checker.diagnostics.push(Diagnostic::new(
UnsafeYAMLLoad { loader: None },
func.range(),
call.func.range(),
));
}
}


@@ -1,8 +1,7 @@
use ruff_python_ast::{self as ast, Expr, Ranged, Stmt};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::{find_keyword, is_const_true};
use ruff_python_ast::helpers::is_const_true;
use ruff_python_ast::{self as ast, Expr, Ranged, Stmt};
use ruff_python_semantic::analyze::logging;
use crate::checkers::ast::Checker;
@@ -77,7 +76,7 @@ pub(crate) fn blind_except(
if let Stmt::Raise(ast::StmtRaise { exc, .. }) = stmt {
if let Some(exc) = exc {
if let Expr::Name(ast::ExprName { id, .. }) = exc.as_ref() {
name.map_or(false, |name| id == name)
name.is_some_and(|name| id == name)
} else {
false
}
@@ -94,7 +93,10 @@ pub(crate) fn blind_except(
// If the exception is logged, don't flag an error.
if body.iter().any(|stmt| {
if let Stmt::Expr(ast::StmtExpr { value, range: _ }) = stmt {
if let Expr::Call(ast::ExprCall { func, keywords, .. }) = value.as_ref() {
if let Expr::Call(ast::ExprCall {
func, arguments, ..
}) = value.as_ref()
{
if logging::is_logger_candidate(
func,
checker.semantic(),
@@ -106,7 +108,7 @@ pub(crate) fn blind_except(
return true;
}
if attr == "error" {
if let Some(keyword) = find_keyword(keywords, "exc_info") {
if let Some(keyword) = arguments.find_keyword("exc_info") {
if is_const_true(&keyword.value) {
return true;
}


@@ -28,6 +28,8 @@ pub(super) fn is_allowed_func_call(name: &str) -> bool {
| "index"
| "insert"
| "int"
| "is_"
| "is_not"
| "param"
| "pop"
| "remove"


@@ -1,4 +1,4 @@
use ruff_python_ast::{ArgWithDefault, Arguments, Decorator};
use ruff_python_ast::{Decorator, ParameterWithDefault, Parameters};
use ruff_diagnostics::Violation;
use ruff_macros::{derive_message_formats, violation};
@@ -59,7 +59,7 @@ pub(crate) fn check_boolean_default_value_in_function_definition(
checker: &mut Checker,
name: &str,
decorator_list: &[Decorator],
arguments: &Arguments,
parameters: &Parameters,
) {
if is_allowed_func_def(name) {
return;
@@ -67,16 +67,16 @@ pub(crate) fn check_boolean_default_value_in_function_definition(
if decorator_list.iter().any(|decorator| {
collect_call_path(&decorator.expression)
.map_or(false, |call_path| call_path.as_slice() == [name, "setter"])
.is_some_and(|call_path| call_path.as_slice() == [name, "setter"])
}) {
return;
}
for ArgWithDefault {
def: _,
for ParameterWithDefault {
parameter: _,
default,
range: _,
} in arguments.args.iter().chain(&arguments.posonlyargs)
} in parameters.args.iter().chain(&parameters.posonlyargs)
{
let Some(default) = default else {
continue;


@@ -1,4 +1,6 @@
use ruff_python_ast::{self as ast, ArgWithDefault, Arguments, Constant, Decorator, Expr, Ranged};
use ruff_python_ast::{
self as ast, Constant, Decorator, Expr, ParameterWithDefault, Parameters, Ranged,
};
use ruff_diagnostics::Diagnostic;
use ruff_diagnostics::Violation;
@@ -80,7 +82,7 @@ pub(crate) fn check_positional_boolean_in_def(
checker: &mut Checker,
name: &str,
decorator_list: &[Decorator],
arguments: &Arguments,
parameters: &Parameters,
) {
if is_allowed_func_def(name) {
return;
@@ -88,21 +90,21 @@ pub(crate) fn check_positional_boolean_in_def(
if decorator_list.iter().any(|decorator| {
collect_call_path(&decorator.expression)
.map_or(false, |call_path| call_path.as_slice() == [name, "setter"])
.is_some_and(|call_path| call_path.as_slice() == [name, "setter"])
}) {
return;
}
for ArgWithDefault {
def,
for ParameterWithDefault {
parameter,
default: _,
range: _,
} in arguments.posonlyargs.iter().chain(&arguments.args)
} in parameters.posonlyargs.iter().chain(&parameters.args)
{
if def.annotation.is_none() {
if parameter.annotation.is_none() {
continue;
}
let Some(expr) = &def.annotation else {
let Some(expr) = &parameter.annotation else {
continue;
};
@@ -120,7 +122,7 @@ pub(crate) fn check_positional_boolean_in_def(
}
checker.diagnostics.push(Diagnostic::new(
BooleanPositionalArgInFunctionDefinition,
def.range(),
parameter.range(),
));
}
}


@@ -81,12 +81,12 @@ FBT.py:19:5: FBT001 Boolean positional arg in function definition
21 | kwonly_nonvalued_nohint,
|
FBT.py:85:19: FBT001 Boolean positional arg in function definition
FBT.py:86:19: FBT001 Boolean positional arg in function definition
|
84 | # FBT001: Boolean positional arg in function definition
85 | def foo(self, value: bool) -> None:
85 | # FBT001: Boolean positional arg in function definition
86 | def foo(self, value: bool) -> None:
| ^^^^^^^^^^^ FBT001
86 | pass
87 | pass
|


@@ -34,6 +34,8 @@ FBT.py:69:38: FBT003 Boolean positional value in function call
68 | g_action.set_enabled(True)
69 | settings.set_enable_developer_extras(True)
| ^^^^ FBT003
70 | foo.is_(True)
71 | bar.is_not(False)
|


@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, Constant, Expr, Keyword, Ranged, Stmt};
use ruff_python_ast::{self as ast, Arguments, Constant, Expr, Keyword, Ranged, Stmt};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
@@ -106,26 +106,21 @@ impl Violation for EmptyMethodWithoutAbstractDecorator {
fn is_abc_class(bases: &[Expr], keywords: &[Keyword], semantic: &SemanticModel) -> bool {
keywords.iter().any(|keyword| {
keyword.arg.as_ref().map_or(false, |arg| arg == "metaclass")
keyword.arg.as_ref().is_some_and(|arg| arg == "metaclass")
&& semantic
.resolve_call_path(&keyword.value)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["abc", "ABCMeta"])
})
.is_some_and(|call_path| matches!(call_path.as_slice(), ["abc", "ABCMeta"]))
}) || bases.iter().any(|base| {
semantic.resolve_call_path(base).map_or(false, |call_path| {
matches!(call_path.as_slice(), ["abc", "ABC"])
})
semantic
.resolve_call_path(base)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["abc", "ABC"]))
})
}
fn is_empty_body(body: &[Stmt]) -> bool {
body.iter().all(|stmt| match stmt {
Stmt::Pass(_) => true,
Stmt::Expr(ast::StmtExpr {
value,
range: _range,
}) => match value.as_ref() {
Stmt::Expr(ast::StmtExpr { value, range: _ }) => match value.as_ref() {
Expr::Constant(ast::ExprConstant { value, .. }) => {
matches!(value, Constant::Str(..) | Constant::Ellipsis)
}
@@ -141,14 +136,17 @@ pub(crate) fn abstract_base_class(
checker: &mut Checker,
stmt: &Stmt,
name: &str,
bases: &[Expr],
keywords: &[Keyword],
arguments: Option<&Arguments>,
body: &[Stmt],
) {
if bases.len() + keywords.len() != 1 {
let Some(Arguments { args, keywords, .. }) = arguments else {
return;
};
if args.len() + keywords.len() != 1 {
return;
}
if !is_abc_class(bases, keywords, checker.semantic()) {
if !is_abc_class(args, keywords, checker.semantic()) {
return;
}


@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, Expr, ExprContext, Ranged, Stmt};
use ruff_python_ast::{self as ast, Arguments, Expr, ExprContext, Ranged, Stmt};
use ruff_text_size::TextRange;
use ruff_diagnostics::{AlwaysAutofixableViolation, Diagnostic, Edit, Fix};
@@ -53,12 +53,15 @@ fn assertion_error(msg: Option<&Expr>) -> Stmt {
ctx: ExprContext::Load,
range: TextRange::default(),
})),
args: if let Some(msg) = msg {
vec![msg.clone()]
} else {
vec![]
arguments: Arguments {
args: if let Some(msg) = msg {
vec![msg.clone()]
} else {
vec![]
},
keywords: vec![],
range: TextRange::default(),
},
keywords: vec![],
range: TextRange::default(),
}))),
cause: None,


@@ -1,10 +1,8 @@
use std::fmt;
use ruff_python_ast::{self as ast, Expr, Ranged, WithItem};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::find_keyword;
use ruff_python_ast::{self as ast, Expr, Ranged, WithItem};
use crate::checkers::ast::Checker;
@@ -81,23 +79,24 @@ pub(crate) fn assert_raises_exception(checker: &mut Checker, items: &[WithItem])
for item in items {
let Expr::Call(ast::ExprCall {
func,
args,
keywords,
arguments,
range: _,
}) = &item.context_expr
else {
return;
};
if args.len() != 1 {
return;
}
if item.optional_vars.is_some() {
return;
}
let [arg] = arguments.args.as_slice() else {
return;
};
let Some(exception) = checker
.semantic()
.resolve_call_path(args.first().unwrap())
.resolve_call_path(arg)
.and_then(|call_path| match call_path.as_slice() {
["", "Exception"] => Some(ExceptionKind::Exception),
["", "BaseException"] => Some(ExceptionKind::BaseException),
@@ -113,10 +112,8 @@ pub(crate) fn assert_raises_exception(checker: &mut Checker, items: &[WithItem])
} else if checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["pytest", "raises"])
})
&& find_keyword(keywords, "match").is_none()
.is_some_and(|call_path| matches!(call_path.as_slice(), ["pytest", "raises"]))
&& arguments.find_keyword("match").is_none()
{
AssertionKind::PytestRaises
} else {


@@ -70,7 +70,7 @@ impl Violation for CachedInstanceMethod {
}
fn is_cache_func(expr: &Expr, semantic: &SemanticModel) -> bool {
semantic.resolve_call_path(expr).map_or(false, |call_path| {
semantic.resolve_call_path(expr).is_some_and(|call_path| {
matches!(call_path.as_slice(), ["functools", "lru_cache" | "cache"])
})
}


@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, ArgWithDefault, Arguments, Expr, Ranged};
use ruff_python_ast::{self as ast, Expr, ParameterWithDefault, Parameters, Ranged};
use ruff_text_size::TextRange;
use ruff_diagnostics::Violation;
@@ -105,7 +105,7 @@ where
}
/// B008
pub(crate) fn function_call_in_argument_default(checker: &mut Checker, arguments: &Arguments) {
pub(crate) fn function_call_in_argument_default(checker: &mut Checker, parameters: &Parameters) {
// Map immutable calls to (module, member) format.
let extend_immutable_calls: Vec<CallPath> = checker
.settings
@@ -116,15 +116,15 @@ pub(crate) fn function_call_in_argument_default(checker: &mut Checker, arguments
.collect();
let diagnostics = {
let mut visitor = ArgumentDefaultVisitor::new(checker.semantic(), extend_immutable_calls);
for ArgWithDefault {
for ParameterWithDefault {
default,
def: _,
parameter: _,
range: _,
} in arguments
} in parameters
.posonlyargs
.iter()
.chain(&arguments.args)
.chain(&arguments.kwonlyargs)
.chain(&parameters.args)
.chain(&parameters.kwonlyargs)
{
if let Some(expr) = &default {
visitor.visit_expr(expr);


@@ -1,11 +1,9 @@
use ruff_python_ast::{self as ast, Comprehension, Expr, ExprContext, Ranged, Stmt};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::includes_arg_name;
use ruff_python_ast::types::Node;
use ruff_python_ast::visitor;
use ruff_python_ast::visitor::Visitor;
use ruff_python_ast::{self as ast, Arguments, Comprehension, Expr, ExprContext, Ranged, Stmt};
use crate::checkers::ast::Checker;
@@ -86,8 +84,12 @@ struct SuspiciousVariablesVisitor<'a> {
impl<'a> Visitor<'a> for SuspiciousVariablesVisitor<'a> {
fn visit_stmt(&mut self, stmt: &'a Stmt) {
match stmt {
Stmt::FunctionDef(ast::StmtFunctionDef { args, body, .. })
| Stmt::AsyncFunctionDef(ast::StmtAsyncFunctionDef { args, body, .. }) => {
Stmt::FunctionDef(ast::StmtFunctionDef {
parameters, body, ..
})
| Stmt::AsyncFunctionDef(ast::StmtAsyncFunctionDef {
parameters, body, ..
}) => {
// Collect all loaded variable names.
let mut visitor = LoadedNamesVisitor::default();
visitor.visit_body(body);
@@ -99,7 +101,7 @@ impl<'a> Visitor<'a> for SuspiciousVariablesVisitor<'a> {
return false;
}
if includes_arg_name(&loaded.id, args) {
if parameters.includes(&loaded.id) {
return false;
}
@@ -126,8 +128,12 @@ impl<'a> Visitor<'a> for SuspiciousVariablesVisitor<'a> {
match expr {
Expr::Call(ast::ExprCall {
func,
args,
keywords,
arguments:
Arguments {
args,
keywords,
range: _,
},
range: _,
}) => {
match func.as_ref() {
@@ -157,7 +163,7 @@ impl<'a> Visitor<'a> for SuspiciousVariablesVisitor<'a> {
}
for keyword in keywords {
if keyword.arg.as_ref().map_or(false, |arg| arg == "key")
if keyword.arg.as_ref().is_some_and(|arg| arg == "key")
&& keyword.value.is_lambda_expr()
{
self.safe_functions.push(&keyword.value);
@@ -165,7 +171,7 @@ impl<'a> Visitor<'a> for SuspiciousVariablesVisitor<'a> {
}
}
Expr::Lambda(ast::ExprLambda {
args,
parameters,
body,
range: _,
}) => {
@@ -181,7 +187,7 @@ impl<'a> Visitor<'a> for SuspiciousVariablesVisitor<'a> {
return false;
}
if includes_arg_name(&loaded.id, args) {
if parameters.includes(&loaded.id) {
return false;
}


@@ -1,4 +1,4 @@
use ruff_python_ast::{self as ast, ArgWithDefault, Expr, Ranged};
use ruff_python_ast::{self as ast, Expr, ParameterWithDefault, Ranged};
use rustc_hash::FxHashMap;
use ruff_diagnostics::{Diagnostic, Violation};
@@ -71,22 +71,22 @@ where
}
}
Expr::Lambda(ast::ExprLambda {
args,
parameters,
body,
range: _,
}) => {
visitor::walk_expr(self, body);
for ArgWithDefault {
def,
for ParameterWithDefault {
parameter,
default: _,
range: _,
} in args
} in parameters
.posonlyargs
.iter()
.chain(&args.args)
.chain(&args.kwonlyargs)
.chain(&parameters.args)
.chain(&parameters.kwonlyargs)
{
self.names.remove(def.arg.as_str());
self.names.remove(parameter.name.as_str());
}
}
_ => visitor::walk_expr(self, expr),


@@ -1,4 +1,4 @@
use ruff_python_ast::{ArgWithDefault, Arguments, Ranged};
use ruff_python_ast::{ParameterWithDefault, Parameters, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
@@ -57,26 +57,27 @@ impl Violation for MutableArgumentDefault {
}
/// B006
pub(crate) fn mutable_argument_default(checker: &mut Checker, arguments: &Arguments) {
pub(crate) fn mutable_argument_default(checker: &mut Checker, parameters: &Parameters) {
// Scan in reverse order to right-align zip().
for ArgWithDefault {
def,
for ParameterWithDefault {
parameter,
default,
range: _,
} in arguments
} in parameters
.posonlyargs
.iter()
.chain(&arguments.args)
.chain(&arguments.kwonlyargs)
.chain(&parameters.args)
.chain(&parameters.kwonlyargs)
{
let Some(default) = default else {
continue;
};
if is_mutable_expr(default, checker.semantic())
&& !def.annotation.as_ref().map_or(false, |expr| {
is_immutable_annotation(expr, checker.semantic())
})
&& !parameter
.annotation
.as_ref()
.is_some_and(|expr| is_immutable_annotation(expr, checker.semantic()))
{
checker
.diagnostics


@@ -1,8 +1,6 @@
use ruff_python_ast::{Expr, Keyword, Ranged};
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::helpers::find_keyword;
use ruff_python_ast::{self as ast, Ranged};
use crate::checkers::ast::Checker;
@@ -38,22 +36,20 @@ impl Violation for NoExplicitStacklevel {
}
/// B028
pub(crate) fn no_explicit_stacklevel(checker: &mut Checker, func: &Expr, keywords: &[Keyword]) {
pub(crate) fn no_explicit_stacklevel(checker: &mut Checker, call: &ast::ExprCall) {
if !checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["warnings", "warn"])
})
.resolve_call_path(&call.func)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["warnings", "warn"]))
{
return;
}
if find_keyword(keywords, "stacklevel").is_some() {
if call.arguments.find_keyword("stacklevel").is_some() {
return;
}
checker
.diagnostics
.push(Diagnostic::new(NoExplicitStacklevel, func.range()));
.push(Diagnostic::new(NoExplicitStacklevel, call.func.range()));
}


@@ -87,7 +87,7 @@ pub(crate) fn raise_without_from_inside_except(
if let Some(name) = name {
if exc
.as_name_expr()
.map_or(false, |ast::ExprName { id, .. }| name == id)
.is_some_and(|ast::ExprName { id, .. }| name == id)
{
continue;
}


@@ -68,7 +68,7 @@ pub(crate) fn re_sub_positional_args(checker: &mut Checker, call: &ast::ExprCall
return;
};
if call.args.len() > method.num_args() {
if call.arguments.args.len() > method.num_args() {
checker.diagnostics.push(Diagnostic::new(
ReSubPositionalArgs { method },
call.range(),


@@ -313,9 +313,7 @@ pub(crate) fn reuse_of_groupby_generator(
if !checker
.semantic()
.resolve_call_path(func)
.map_or(false, |call_path| {
matches!(call_path.as_slice(), ["itertools", "groupby"])
})
.is_some_and(|call_path| matches!(call_path.as_slice(), ["itertools", "groupby"]))
{
return;
}


@@ -57,7 +57,6 @@ fn assignment(obj: &Expr, name: &str, value: &Expr, generator: Generator) -> Str
range: TextRange::default(),
})],
value: Box::new(value.clone()),
type_comment: None,
range: TextRange::default(),
});
generator.stmt(&stmt)

Some files were not shown because too many files have changed in this diff.