Compare commits

...

159 Commits

Author SHA1 Message Date
Micha Reiser
d013b71974 Ignore the open parentheses when measuring BestFitsParenthesize 2023-12-01 13:08:51 +08:00
Micha Reiser
7d948b23b4 Inline comments for type alias similar to assignment 2023-12-01 12:31:13 +08:00
Micha Reiser
1de4fcdea8 Refactor the comment handling of a statement's last expression 2023-12-01 12:31:11 +08:00
Charlie Marsh
69dfe0a207 Fix doc formatting for zero-sleep-call (#8937) 2023-11-30 22:34:09 -05:00
Charlie Marsh
46a174a22e Use full arguments range for zero-sleep-call (#8936) 2023-12-01 03:09:18 +00:00
Charlie Marsh
912c39ce2a Add support for @functools.singledispatch (#8934)
## Summary

When a function uses `@functools.singledispatch`, we need to treat the
first argument of any implementations as runtime-required.

Closes https://github.com/astral-sh/ruff/issues/6849.
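
For illustration, a minimal sketch of the pattern this affects (the function
and type names here are hypothetical):

```python
import functools


@functools.singledispatch
def process(arg):
    raise NotImplementedError(f"unsupported type: {type(arg)!r}")


@process.register
def _(arg: int) -> str:
    # `register` resolves the annotation on the first parameter at runtime,
    # so the annotated type must be importable at runtime (e.g. not hidden
    # behind `if TYPE_CHECKING:`), which is why it is treated as
    # runtime-required.
    return f"int: {arg}"
```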
2023-12-01 03:04:58 +00:00
Charlie Marsh
b2638c62a5 Update formatter fixtures (#8935)
I merged a branch that wasn't up-to-date, which left us with test
failures on `main`.
2023-12-01 02:57:05 +00:00
Charlie Marsh
eaa310429f Insert trailing comma when function breaks with single argument (#8921)
## Summary

Given:

```python
def _example_function_xxxxxxx(
    variable: Optional[List[str]]
) -> List[example.ExampleConfig]:
    pass
```

We should be inserting a trailing comma after the argument (as long as
it's a single-argument function). This was an inconsistency with Black,
but also led to some internal inconsistencies, whereby we added the
comma if the argument contained a trailing end-of-line comment, but not
otherwise.

Closes https://github.com/astral-sh/ruff/issues/8912.
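
For reference, a sketch of the expected output for the snippet above, with a
trailing comma inserted after the single argument:

```python
def _example_function_xxxxxxx(
    variable: Optional[List[str]],
) -> List[example.ExampleConfig]:
    pass
```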

## Test Plan

`cargo test`

Before:

| project | similarity index | total files | changed files |
|----------------|------------------:|------------------:|------------------:|
| cpython | 0.75804 | 1799 | 1648 |
| django | 0.99984 | 2772 | 34 |
| home-assistant | 0.99963 | 10596 | 146 |
| poetry | 0.99925 | 317 | 12 |
| transformers | 0.99967 | 2657 | 322 |
| twine | 1.00000 | 33 | 0 |
| typeshed | 0.99980 | 3669 | 18 |
| warehouse | 0.99977 | 654 | 13 |
| zulip | 0.99970 | 1459 | 21 |

After:

| project | similarity index | total files | changed files |
|----------------|------------------:|------------------:|------------------:|
| cpython | 0.75804 | 1799 | 1648 |
| django | 0.99984 | 2772 | 34 |
| home-assistant | 0.99955 | 10596 | 213 |
| poetry | 0.99917 | 317 | 13 |
| transformers | 0.99967 | 2657 | 324 |
| twine | 1.00000 | 33 | 0 |
| typeshed | 0.99980 | 3669 | 18 |
| warehouse | 0.99976 | 654 | 14 |
| zulip | 0.99957 | 1459 | 36 |
2023-11-30 21:49:28 -05:00
Charlie Marsh
019d9aebe9 Implement multiline dictionary and list hugging for preview style (#8293)
## Summary

This PR implements Black's new single-argument hugging for lists, sets,
and dictionaries under preview style.

For example, this:

```python
foo(
    [
        1,
        2,
        3,
    ]
)
```

Would instead now be formatted as:

```python
foo([
    1,
    2,
    3,
])
```

A couple notes:

- This doesn't apply when the argument has a magic trailing comma.
- This _does_ apply when the argument is starred or double-starred.
- We don't apply this when there are comments before or after the
argument, though Black does in some cases (and moves the comments
outside the call parentheses).

It doesn't say it in the originating PR
(https://github.com/psf/black/pull/3964), but I think this also applies
to parenthesized expressions? At least, it does in my testing of preview
vs. stable, though it's possible that behavior predated the linked PR.

See: #8279.

## Test Plan

Before:

| project | similarity index | total files | changed files |
|----------------|------------------:|------------------:|------------------:|
| cpython | 0.75804 | 1799 | 1648 |
| django | 0.99984 | 2772 | 34 |
| home-assistant | 0.99963 | 10596 | 146 |
| poetry | 0.99925 | 317 | 12 |
| transformers | 0.99967 | 2657 | 322 |
| twine | 1.00000 | 33 | 0 |
| typeshed | 0.99980 | 3669 | 18 |
| warehouse | 0.99977 | 654 | 13 |
| zulip | 0.99970 | 1459 | 21 |

After:

| project | similarity index | total files | changed files |
|----------------|------------------:|------------------:|------------------:|
| cpython | 0.75804 | 1799 | 1648 |
| django | 0.99984 | 2772 | 34 |
| home-assistant | 0.99963 | 10596 | 146 |
| poetry | 0.96215 | 317 | 34 |
| transformers | 0.99967 | 2657 | 322 |
| twine | 1.00000 | 33 | 0 |
| typeshed | 0.99980 | 3669 | 18 |
| warehouse | 0.99977 | 654 | 13 |
| zulip | 0.99970 | 1459 | 21 |
2023-11-30 21:11:14 -05:00
Dhruv Manilawala
f06c5dc896 Use correct range for TRIO115 fix (#8933)
## Summary

This PR fixes a bug where the autofix for `TRIO115` used the entire
arguments range, which included the parentheses as well. This meant that
the fix would remove the arguments along with the parentheses. The fix is
to use the correct range.

fixes: #8713 
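
For context, a before/after sketch of the intended rewrite, assuming
`TRIO115` (zero-sleep-call) replaces `trio.sleep(0)` with
`trio.lowlevel.checkpoint()`:

```python
import trio


async def tick() -> None:
    await trio.sleep(0)  # flagged by TRIO115
    # The autofix should produce the following, keeping the call parentheses:
    # await trio.lowlevel.checkpoint()
```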

## Test Plan

Update existing snapshots :)
2023-12-01 01:42:46 +00:00
Charlie Marsh
c1dc4a60be Apply some minor changes to unnecessary-list-index-lookup (#8932)
## Summary

I was late in reviewing this but found a few things I wanted to tweak.
No functional changes.
2023-12-01 00:53:26 +00:00
Steve C
70febb1862 [pylint] - add unnecessary-list-index-lookup (PLR1736) + autofix (#7999)
## Summary

Add
[R1736](https://pylint.readthedocs.io/en/latest/user_guide/messages/refactor/unnecessary-list-index-lookup.html)
along with the autofix

See #970 
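
A minimal sketch of the pattern the rule targets (the variable names are
illustrative):

```python
fruits = ["apple", "banana", "cherry"]

for index, fruit in enumerate(fruits):
    print(fruits[index])  # PLR1736: the index lookup is redundant
    # The autofix would presumably rewrite this to:
    # print(fruit)
```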

## Test Plan

`cargo test` and manually

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2023-11-30 17:45:12 -06:00
Steve C
4212b41796 [pylint] - implement R0202 and R0203 with autofixes (#8335)
## Summary

Implements
[`no-classmethod-decorator`/`R0202`](https://pylint.readthedocs.io/en/latest/user_guide/messages/refactor/no-classmethod-decorator.html)
and
[`no-staticmethod-decorator`/`R0203`](https://pylint.readthedocs.io/en/latest/user_guide/messages/refactor/no-staticmethod-decorator.html)
with autofixes.

They're similar enough that all code is reusable for both.

See: #970 
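
A small sketch of the patterns these rules target (the class and method names
are illustrative):

```python
class Fruit:
    def pick(cls):
        ...

    def wash(basket):
        ...

    # PLR0202 / PLR0203: prefer decorating `pick` with @classmethod and
    # `wash` with @staticmethod instead of reassigning them afterwards.
    pick = classmethod(pick)
    wash = staticmethod(wash)
```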

## Test Plan

`cargo test`
2023-11-30 16:18:09 -06:00
Steve C
bbad4b4c93 Add autofix for PYI030 (#7934)
## Summary

Part 2 of implementing the reverted autofix for `PYI030`

Also handles `typing.Union`, `typing_extensions.Literal`, etc., and uses
the first subscript name it finds for each offending line.
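
A minimal sketch of the kind of stub-file code the fix targets, assuming it
collapses unions of `Literal`s into a single subscript:

```python
from typing import Literal, Union

x: Union[Literal[1], Literal[2]]  # PYI030; the fix would presumably yield `Literal[1, 2]`
y: Literal["a"] | Literal["b"]    # same idea with PEP 604 unions (Python 3.10+ syntax)
```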

## Test Plan

`cargo test` and manually

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2023-11-30 22:16:57 +00:00
Steve C
3ee1ec70cc Fix up some types in the ecosystem code (#8898)
## Summary

Fixes up the type annotations to make type analyzers a little happier 😄 

## Test Plan

N/A
2023-11-30 16:02:20 -06:00
Charlie Marsh
ee5d95f751 Remove duplicate imports from os-stat documentation (#8930)
Closes https://github.com/astral-sh/ruff/issues/8799.
2023-11-30 20:13:29 +00:00
Charlie Marsh
d674e7946d Ignore underlines when determining docstring logical lines (#8929) 2023-11-30 14:27:04 -05:00
Charlie Marsh
20782ab02c Support type alias statements in simple statement positions (#8916)
## Summary

Our `SoftKeywordTokenizer` only respected soft keywords in compound
statement positions -- for example, at the start of a logical line:

```python
type X = int
```

However, type aliases can also appear in simple statement positions,
like:

```python
class Class: type X = int
```

(Note that `match` and `case` are _not_ valid keywords in such
positions.)

This PR upgrades the tokenizer to track both kinds of valid positions.

Closes https://github.com/astral-sh/ruff/issues/8900.
Closes https://github.com/astral-sh/ruff/issues/8899.

## Test Plan

`cargo test`
2023-11-30 19:15:19 +00:00
Charlie Marsh
073eddb1d9 Use Python version to determine typing rewrite safety (#8919)
## Summary

These rewrites are only (potentially) unsafe on Python versions that
predate their introduction into the standard library and grammar, so it
seems correct to mark them as safe on those later versions.
2023-11-29 22:22:04 -05:00
Charlie Marsh
e8da95d09c Document fix safety for flake8-comprehensions and some pyupgrade rules (#8918)
See: https://github.com/astral-sh/ruff/issues/7993.
2023-11-29 20:51:23 -05:00
Charlie Marsh
c324cb6202 [pep8-naming] Allow Django model loads in non-lowercase-variable-in-function (N806) (#8917)
## Summary

Allows assignments of the form, e.g., `Attachment =
apps.get_model("zerver", "Attachment")`, for better compatibility with
Django.

Closes https://github.com/astral-sh/ruff/issues/7675.

## Test Plan

`cargo test`
2023-11-29 20:43:40 -05:00
Charlie Marsh
774c77adae Avoid off-by-one error in with-item named expressions (#8915)
## Summary

Given `with (a := b): pass`, we truncate the `WithItem` range by one on
both sides such that the parentheses are part of the statement, rather
than the item. However, for `with (a := b) as x: pass`, we want to avoid
this trick.

Closes https://github.com/astral-sh/ruff/issues/8913.
2023-11-30 00:11:04 +00:00
Micha Reiser
fd70cd789f Update Black tests (#8901) 2023-11-30 00:09:55 +00:00
Andrey
08f3110f1e Optimize workflow run (#8225) 2023-11-30 00:09:33 +00:00
Micha Reiser
2314b9aaca Add benchmark for running all rules including preview rules (#8865) 2023-11-29 04:26:25 +00:00
Micha Reiser
cddc696896 Stop at the first resolved parent configuration (#8864) 2023-11-29 04:21:07 +00:00
Charlie Marsh
6435e4e4aa Enable auto-return-type involving Optional and Union annotations (#8885)
## Summary

Previously, this was only supported for Python 3.10 and later, since we
always use the PEP 604-style unions.
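
A sketch of the kind of annotation this enables on pre-3.10 targets (the
function is hypothetical; presumably the fix now emits `Optional[...]` /
`Union[...]` spellings instead of PEP 604 unions):

```python
from typing import Optional


def find_port(config: dict) -> Optional[int]:  # annotation the fix could now suggest
    if "port" in config:
        return int(config["port"])
    return None
```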
2023-11-28 18:35:55 -08:00
Dhruv Manilawala
ec7456bac0 Rename as_str to to_str (#8886)
This PR renames the method on `StringLiteralValue` from `as_str` to
`to_str`. The main motivation is to follow the naming convention as
described in the [Rust API
Guidelines](https://rust-lang.github.io/api-guidelines/naming.html#ad-hoc-conversions-follow-as_-to_-into_-conventions-c-conv).
This method can perform a string allocation in case the string is
implicitly concatenated.
2023-11-28 18:50:42 -06:00
Dhruv Manilawala
b28556d739 Update E402 to work at cell level for notebooks (#8872)
## Summary

This PR updates the `E402` rule to work at cell level for Jupyter
notebooks. This is enabled only in preview to gather feedback.

The implementation basically resets the import boundary flag on the
semantic model when we encounter the first statement in a cell.

Another potential solution is to introduce `E403` rule that is
specifically for notebooks that works at cell level while `E402` will be
disabled for notebooks.

## Test Plan

Add a notebook with imports in multiple cells and verify that the rule
works as expected.

resolves: #8669
2023-11-29 00:32:35 +00:00
Andrew Gallant
4957d94beb ruff_python_formatter: small cleanups in doctest formatting (#8871)
This PR contains a few small clean-ups that are responses to
@MichaReiser's review of my #8811 PR.
2023-11-28 18:43:07 -05:00
Charlie Marsh
5d554edace Allow booleans in @override methods (#8882)
Closes #8867.
2023-11-28 13:42:31 -08:00
Charlie Marsh
412688826c Avoid filtering out un-representable types in return annotation (#8881)
## Summary

Given `Union[Dict, None]` (in our internal representation), we were
filtering out `Dict` since we treat it as un-representable (i.e., we
can't convert it to an expression), returning just `None` as the type
annotation. We should require that all members of the union are
representable.

Closes https://github.com/astral-sh/ruff/issues/8879.
2023-11-28 21:10:42 +00:00
Dhruv Manilawala
47d80f29a7 Lexer start of line is false only for Mode::Expression (#8880)
## Summary

This PR fixes a bug in the lexer where `Mode::Ipython` wasn't being
considered when initializing the soft keyword transformer that wraps the
lexer. This meant that if the source code started with either the `match`
or `type` keyword, the keyword was lexed as a name token instead. For
example,

```python
match foo:
    case bar:
        pass
```

This would transform the `match` keyword into an identifier if the mode
is `Ipython`.

The fix is to reverse the condition in the soft keyword initializer so
that any new modes are, by default, treated as starting at the beginning
of a line.

## Test Plan

Add a new test case for `Mode::Ipython` and verify the snapshot.

fixes: #8870
2023-11-28 20:38:25 +00:00
Dhruv Manilawala
9dee1883ce Update ruff-dev to use SourceKind (#8878)
Just a small quality-of-life improvement to make it possible to pass a
Jupyter Notebook to the `ruff-dev` CLI.
2023-11-28 14:27:35 -06:00
Andrew Gallant
f585e3e2dc remove several uses of unsafe (#8600)
This PR removes several uses of `unsafe`. I generally limited myself to
low hanging fruit that I could see. There are still a few remaining uses
of `unsafe` that looked a bit more difficult to remove (if possible at
all). But this gets rid of a good chunk of them.

I put each `unsafe` removal into its own commit with a justification for
why I did it. So I would encourage reviewing this PR commit-by-commit.
That way, we can legislate them independently. It's no problem to drop a
commit if we feel the `unsafe` should stay in that case.
2023-11-28 09:50:03 -05:00
Joffrey Bluthé
578ddf1bb1 [isort] Add support for length-sort settings (#8841)
## Summary

Closes #1567.

Add both `length-sort` and `length-sort-straight` settings for isort.

Here are a few notable points:
- The length is determined using the
[`unicode_width`](https://crates.io/crates/unicode-width) crate, i.e. we
are talking about displayed length (this is explicitly mentioned in the
description of the setting)
- The dots are taken into account in the length to be compatible with
the original isort
- I had to reorder a few fields of the module key struct for it all to
make sense (notably the `force_to_top` field is now the first one)
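
For illustration, with `length-sort` enabled, imports within a section would
presumably be ordered by displayed width rather than alphabetically, e.g.:

```python
import os
import sys
import shutil
import asyncio
import collections
```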

## Test Plan

I added tests for the following cases:
- Basic tests for length-sort with ASCII characters only
- Tests with non-ASCII characters
- Tests with relative imports
- Tests for length-sort-straight
2023-11-28 06:00:37 +00:00
Charlie Marsh
ed14fd9163 [pydocstyle] Avoid non-character breaks in over-indentation (D208) (#8866)
Closes https://github.com/astral-sh/ruff/issues/8844.
2023-11-27 21:47:35 -08:00
Tom Kuson
60eb11fa50 [refurb] Implement redundant-log-base (FURB163) (#8842)
## Summary

Implement
[`simplify-math-log`](https://github.com/dosisod/refurb/blob/master/refurb/checks/math/simplify_log.py)
as `redundant-log-base` (`FURB163`).

Auto-fixes

```python
import math

math.log(2, 2)
```

to

```python
import math

math.log2(2)
```

Related to #1348.

## Test Plan

`cargo test`
2023-11-27 23:57:00 +00:00
Andrew Gallant
33caa2ab1c ruff_python_formatter: move docstring handling to a submodule (#8861)
This turns `string` into a parent module with a `docstring` sub-module.
I arranged things this way because there are parts of the `string`
module that the `docstring` module wants to know about (such as a
`NormalizedString`). The alternative I think would be to make
`docstring` a sibling module and expose more of `string`'s internals.

I think I overall like this change because it gives docstring handling a
bit more room to breath. It has grown quite a bit with the addition of
code snippet formatting.

[This was suggested by
@charliermarsh.](https://github.com/astral-sh/ruff/pull/8811#discussion_r1401169531)
2023-11-27 13:32:26 -05:00
Andrew Gallant
d9845a2628 format doctests in docstrings (#8811)
## Summary

This PR adds opt-in support for formatting doctests in docstrings. This
is initial support, and the intent is to add support for Markdown and
reStructuredText Python code blocks in the future. I believe this PR lays
the groundwork, so those future additions should be less costly to add.

It's strongly recommended to review this PR commit-by-commit. The last
few commits in particular implement the bulk of the work here and
represent the denser portions.

Some things worth mentioning:

* The formatter is itself not perfect, and it is possible for it to
produce invalid Python code. Because of this, reformatted code snippets
are checked for Python validity. If they aren't valid, then we
(unfortunately silently) bail on formatting that code snippet.
* There are a couple places where it would be nice to at least warn the
user that doctest formatting failed, but it wasn't clear to me what the
best way to do that is.
* I haven't yet run this in anger on a real world code base. I think
that should happen before merging.
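
A small sketch of the kind of snippet this touches: with the opt-in enabled,
the code on the `>>>`/`...` lines would be reformatted (here, presumably
collapsed onto one line), while the expected-output lines are left alone:

```python
def add(a, b):
    """Add two numbers.

    >>> add(1,
    ...     2)
    3
    """
    return a + b
```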

Closes #7146 

## Test Plan

* [x] Pass the local test suite.
* [x] Scrutinize ecosystem changes.
* [x] Run this formatter on extant code and scrutinize the results.
(e.g., CPython, numpy.)
2023-11-27 11:14:55 -05:00
Samuel Searles-Bryant
1f14d9a9f7 Add advice for fixing RUF008 when mutability is not desired (#8853) 2023-11-27 09:27:22 -06:00
dependabot[bot]
0202a49297 Bump proc-macro2 from 1.0.69 to 1.0.70 (#8850)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-27 10:14:06 +00:00
dependabot[bot]
d0591561c9 Bump configparser from 3.0.2 to 3.0.3 (#8849)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-27 10:07:22 +00:00
dependabot[bot]
2f5859e79e Bump uuid from 1.6.0 to 1.6.1 (#8851)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-27 10:05:50 +00:00
dependabot[bot]
16085339bc Bump globset from 0.4.13 to 0.4.14 (#8848)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-27 10:04:24 +00:00
dependabot[bot]
074812115f Bump wasm-bindgen-test from 0.3.37 to 0.3.38 (#8847)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-27 10:02:53 +00:00
Charlie Marsh
9b17724d77 [pylint] Extend self-assigning-variable to multi-target assignments (#8839)
Closes https://github.com/astral-sh/ruff/issues/8667.
2023-11-25 18:42:19 +00:00
Charlie Marsh
0d4af9d3c6 Allow space-before-colon after end-of-slice (#8838)
Closes https://github.com/astral-sh/ruff/issues/8752.
2023-11-25 18:16:43 +00:00
Dhruv Manilawala
1dbfab9a0c Auto-generate formatter nodes for string parts (#8837)
A follow-up to auto-generate the `FormatNodeRule` implementation for the
string part nodes. This is just a dummy implementation that is
unreachable because it's handled by the parent nodes.
2023-11-25 13:00:47 +00:00
Dhruv Manilawala
501cca8b72 Remove #[allow(unused_variables)] from visitor methods (#8828)
Small follow-up to remove `#[allow(unused_variables)]` from visitor
methods and use underscore prefix for unused variables instead.
2023-11-25 00:09:46 +00:00
Dhruv Manilawala
626b0577cd Explicit as_str (no deref), add no allocation methods (#8826)
## Summary

This PR is a follow-up to the AST refactor which does the following:
- Remove `Deref` implementation on `StringLiteralValue` and use explicit
`as_str` calls instead. The `Deref` implementation would implicitly
perform allocations in case of implicitly concatenated strings. This is
to make sure the allocation is explicit.
- Certain methods can now be implemented without any allocations; the
following have been implemented in this PR:
    - `is_empty`
    - `len`
    - `chars`
    - Custom `PartialEq` implementation to compare each character

## Test Plan

Run the linter test suite and make sure all tests pass.
2023-11-25 00:03:59 +00:00
Dhruv Manilawala
017e829115 Update string nodes for implicit concatenation (#7927)
## Summary

This PR updates the string nodes (`ExprStringLiteral`,
`ExprBytesLiteral`, and `ExprFString`) to account for implicit string
concatenation.

### Motivation

In Python, implicitly concatenated strings are joined during parsing
because the interpreter doesn't require the information for each part.
While that's fine for an interpreter, it falls short for a static
analysis tool, where having such information is more useful. Currently,
various parts of the code use the lexer to get the individual string
parts.

One of the main challenges this solves is string formatting. Currently,
the formatter relies on the lexer to get the individual string parts and
formats them, including the comments, accordingly. But with PEP 701,
f-strings can also contain comments. Without this change, it becomes very
difficult to add support for f-string formatting.

### Implementation

The initial proposal was made in this discussion:
https://github.com/astral-sh/ruff/discussions/6183#discussioncomment-6591993.
There were various AST designs which were explored for this task which
are available in the linked internal document[^1].

The selected variant was the one where the nodes were kept as they are,
except that the `implicit_concatenated` field was removed and a new
struct was added to each `Expr*` struct instead. This private struct
contains the actual implementation of how the AST is designed for both
single and implicitly concatenated strings.

This is implemented as an enum with two variants, `Single` and
`Concatenated`, to avoid allocating a vector even for single strings.
Various public methods are available on the value struct to query
information about the node.

The nodes are structured in the following way:

```
ExprStringLiteral - "foo" "bar"
|- StringLiteral - "foo"
|- StringLiteral - "bar"

ExprBytesLiteral - b"foo" b"bar"
|- BytesLiteral - b"foo"
|- BytesLiteral - b"bar"

ExprFString - "foo" f"bar {x}"
|- FStringPart::Literal - "foo"
|- FStringPart::FString - f"bar {x}"
  |- StringLiteral - "bar "
  |- FormattedValue - "x"
```

[^1]: Internal document:
https://www.notion.so/astral-sh/Implicit-String-Concatenation-e036345dc48943f89e416c087bf6f6d9?pvs=4

#### Visitor

The nodes are structured such that the entire string, including all the
implicitly concatenated parts, is a single node containing individual
nodes for the parts. The previous section has a
representation of that tree for all the string nodes. This means that
new visitor methods are added to visit the individual parts of string,
bytes, and f-strings for `Visitor`, `PreorderVisitor`, and
`Transformer`.

## Test Plan

- `cargo insta test --workspace --all-features --unreferenced reject`
- Verify that the ecosystem results are unchanged
2023-11-24 17:55:41 -06:00
Chaojie
2590aa30ae [flake8-bandit] Implement tarfile-unsafe-members (S202) (#8829)
See: https://github.com/astral-sh/ruff/issues/1646.

Bandit origin:
https://github.com/PyCQA/bandit/blob/main/bandit/plugins/tarfile_unsafe_members.py
2023-11-24 17:46:06 +00:00
Samuel Cormier-Iijima
852a8f4a4f [PIE796] don't report when using ellipses for enum values in stub files (#8825)
## Summary

Just ignores ellipses as enum values inside stub files.

Fixes #8818.
2023-11-24 15:24:57 +00:00
Dhruv Manilawala
8365d2e0fd Avoid E703 for last expression in a cell (#8821)
## Summary

This PR updates the `E703` rule to avoid flagging any semicolons if
they're present after the last expression in a notebook cell. These are
intended to hide the cell output.
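
For illustration, a trailing semicolon on the last expression of a cell like
the one below is a common way to suppress the displayed output, so it would
presumably no longer be flagged:

```python
# Last line of a notebook cell; the semicolon suppresses the rendered output.
values = [1, 2, 3]
sum(values);
```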

Part of #8669 

## Test Plan

Add test notebook and update the snapshots.
2023-11-23 07:40:57 -06:00
Dhruv Manilawala
5b726f70f4 Avoid B015,B018 for last expression in a cell (#8815)
## Summary

This PR updates `B015` and `B018` to ignore last top-level expressions
in each cell of a Jupyter Notebook.

Part of #8669

## Test Plan

Add test cases for both rules and update the snapshots.
2023-11-22 15:33:23 +00:00
Dhruv Manilawala
727e389cac Add CellOffsets abstraction (#8814)
Refactor `Notebook::cell_offsets` to use an abstract struct for storing
the cell offsets. This will allow us to add useful methods on it.
2023-11-22 15:27:00 +00:00
Dhruv Manilawala
0cb438dd65 Avoid D100 for Jupyter Notebooks (#8816)
This PR avoids triggering `D100` for Jupyter Notebooks.

Part of #8669
2023-11-22 15:26:25 +00:00
Dhruv Manilawala
63a87dda63 Move notebook cell logic in separate file (#8813)
Small refactor to move cell-related logic into its own file.
2023-11-22 09:21:28 -06:00
Alan Du
359a68d18f Factor out a builder to handle common integration test arguments (#8733)
## Summary

This refactors the `ruff_cli` integration tests to create a new
`RuffCheck` struct -- this holds options to configure the "common case"
flags that we want to pass to Ruff (e.g. `--no-cache`, `--isolated`,
etc). This helps reduce the boilerplate and (IMO) makes it more obvious
what the core logic of each test is by keeping only the "interesting"
parameters.

## Test Plan

`cargo test`
2023-11-22 00:32:01 +00:00
Adrian
948094e691 [pylint] Add allow-dunder-method-names setting for bad-dunder-method-name (PLW3201) (#8812)
closes #8732

I noticed that the reference to the setting in the rule docs doesn't
work, but there seems to be something wrong with pylint settings in the
docs in general - the "For related settings, see ..." note is also
missing there.
2023-11-21 23:44:23 +00:00
Jelmer Vernooij
f1ed0f27c2 isort: Add support for the `from-first` setting (#8663)
# Summary

This setting behaves similarly to the ``from_first`` setting in isort
upstream, and sorts "from X import Y" type imports before straight
imports.

Like the other PR I added, happy to refactor if this is better in
another form.

Fixes #8662 

# Test plan

I've added a unit test, and ran this on a large codebase that relies on
this setting in isort to verify it doesn't have unexpected side effects.
2023-11-21 23:36:15 +00:00
Dhruv Manilawala
5ce6299e22 Avoid PERF101 if there's an append in loop body (#8809)
## Summary

Avoid `PERF101` if there's an append in loop body
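
A minimal sketch of the pattern that should no longer be flagged (assuming
`PERF101` normally suggests dropping the `list()` call around the iterable):

```python
items = [1, 2, 3]

for item in list(items):  # the copy is required here: the body mutates `items`
    if item % 2 == 1:
        items.append(item * 10)
```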

## Test Plan

Add new test cases for this pattern.

fixes: #8746
2023-11-21 15:35:42 -06:00
Charlie Marsh
5373759f62 Respect dictionary unpacking in NamedTuple assignments (#8810)
Closes https://github.com/astral-sh/ruff/issues/8803.
2023-11-21 19:30:48 +00:00
Zanie Blue
d9151b1948 Update ruff check and ruff format to default to the current directory (#8791)
Closes https://github.com/astral-sh/ruff/issues/7347
Closes #3970 via use of `include`

We could update examples in our documentation, but I worry that, since
we do not have versioned documentation, users on older versions would be
confused. Instead, I'll open an issue to track updating uses of `ruff
check .` in the documentation sometime in the future.
2023-11-21 11:34:21 -06:00
Charlie Marsh
b61ce7fa46 Replace generated reference to MkDocs (#8806) 2023-11-21 11:59:22 +00:00
maltevesper
6fb6478887 [flake8-simplify] Omit select context managers from SIM117 (#8801)
Semantically, it makes sense to put certain context managers into separate
`with` statements. Currently, `asyncio.timeout` and its relatives in `anyio`
and `trio` are exempt from SIM117.

Closes https://github.com/astral-sh/ruff/issues/8606

## Summary

Exempt asyncio.timeout and related functions from SIM117 (Collapse with
statements where possible).
See https://github.com/astral-sh/ruff/issues/8606 for more.
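
A sketch of the now-exempt pattern (`open_resource` is a placeholder written
just for this example; `asyncio.timeout` requires Python 3.11+):

```python
import asyncio
import contextlib


@contextlib.asynccontextmanager
async def open_resource():  # hypothetical resource, just for illustration
    yield "resource"


async def fetch() -> None:
    # SIM117 usually suggests collapsing nested `with` statements; keeping the
    # timeout in its own statement is now exempt from that suggestion:
    async with asyncio.timeout(5):
        async with open_resource() as res:
            print(res)
```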

## Test Plan

Extended the insta tests.
2023-11-21 11:53:42 +00:00
Ezra Shaw
bf729e7a77 fix: mark __main__ as first-party import (#8805)
## Summary

Fixes #8750. `import __main__` is now considered a first-party import,
and is grouped accordingly by the linter and formatter.

## Test Plan

Added a test based off code supplied in the linked issue.
2023-11-21 11:52:28 +00:00
Felix Yan
6ca2aaa245 Update Arch Linux package URL in installation.md (#8802)
The old URL returns 404 now.
2023-11-21 11:37:18 +00:00
Iipin
e306359411 Mark pydantic_settings.BaseSettings as having default copy semantics (#8793)
## Summary

In 2.0, Pydantic has moved the `BaseSettings` class to a separate
package called `pydantic-settings`
(https://docs.pydantic.dev/2.4/migration/#basesettings-has-moved-to-pydantic-settings),
which results in a false positive on `RUF012` (`mutable-class-default`).
A simple fix for that would be adding `pydantic_settings.BaseSettings`
base to the `has_default_copy_semantics` helper, which I've done in this
PR.

Related issue: #5308
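
A minimal sketch of the false positive being addressed (assumes the
`pydantic-settings` package is installed; the class name is illustrative):

```python
from pydantic_settings import BaseSettings


class AppSettings(BaseSettings):
    # Previously flagged by RUF012 (mutable-class-default); with this change,
    # `BaseSettings` subclasses get the same default-copy treatment as
    # `BaseModel`.
    tags: list[str] = []
```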

## Test Plan

`cargo test`
2023-11-20 19:29:48 +00:00
T-256
aec80dc3ab Ruff ecosystem: pass no-preview cli arg by default (#8775)
Addresses
https://github.com/astral-sh/ruff/pull/8489#issuecomment-1793513411.
That issue still exists for the formatter, but since `black` doesn't
support a `no-preview` CLI arg, I didn't include it in this PR.

---------

Co-authored-by: Zanie <contact@zanie.dev>
2023-11-20 12:21:51 -06:00
Charlie Marsh
10d937c1a1 [pep8-naming] Avoid N806 errors for type alias statements (#8785)
Allow, e.g.:

```python
def func():
    type MyInt = int
```

(We already allowed `MyInt: TypeAlias = int`.)

Closes https://github.com/astral-sh/ruff/issues/8773.
2023-11-20 12:28:52 +00:00
Chaojie
653e51ae97 [flake8-bandit] Implement django-raw-sql (S611) (#8651)
See: https://github.com/astral-sh/ruff/issues/1646.
2023-11-20 12:21:12 +00:00
dependabot[bot]
04f0625d23 Bump codspeed-criterion-compat from 2.3.1 to 2.3.3 (#8784) 2023-11-20 11:10:48 +00:00
dependabot[bot]
29dc8b986b Bump tracing-subscriber from 0.3.17 to 0.3.18 (#8777)
Bumps [tracing-subscriber](https://github.com/tokio-rs/tracing) from
0.3.17 to 0.3.18.
Release notes: tracing-subscriber 0.3.18 adds support for the `NO_COLOR`
environment variable in `fmt::Layer`, reintroduces `chrono`
implementations of `FormatTime`, makes `format::Writer::new()` public,
implements `layer::Filter` for `Option<Filter>`, bumps `tracing-log` to
0.2, and increases the minimum supported Rust version (MSRV) to Rust
1.63.0.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-20 09:55:55 +01:00
dependabot[bot]
f750257ef4 Bump js-sys from 0.3.64 to 0.3.65 (#8781)
Bumps [js-sys](https://github.com/rustwasm/wasm-bindgen) from 0.3.64 to
0.3.65.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-20 09:43:17 +01:00
dependabot[bot]
0f2c9ab4d2 Bump uuid from 1.5.0 to 1.6.0 (#8783)
Bumps [uuid](https://github.com/uuid-rs/uuid) from 1.5.0 to 1.6.0.
Release notes: uuid 1.6.0 stabilizes UUIDv6-v8 support and fixes
documentation links in the v6 module.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-20 09:37:33 +01:00
dependabot[bot]
779a5a9c32 Bump actions/github-script from 6 to 7 (#8780)
Bumps [actions/github-script](https://github.com/actions/github-script)
from 6 to 7.
Release notes: v7.0.0 adds a `base-url` option, exposes the
async-function argument type, and updates dependencies to run on Node 20.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-20 09:32:56 +01:00
dependabot[bot]
bba43029d4 Bump docker/metadata-action from 4 to 5 (#8778)
Bumps
[docker/metadata-action](https://github.com/docker/metadata-action) from
4 to 5.
Release notes: v5.0.0 moves the action to Node 20 as the default runtime
(requires Actions Runner v2.308.0 or later) and bumps several
dependencies.
href="https://github.com/docker/metadata-action/blob/master/#tag-match--tag-match-group"><code>tag-match</code>
/ <code>tag-match-group</code></a></li>
<li><a
href="https://github.com/docker/metadata-action/blob/master/#tag-latest"><code>tag-latest</code></a></li>
<li><a
href="https://github.com/docker/metadata-action/blob/master/#tag-schedule"><code>tag-schedule</code></a></li>
<li><a
href="https://github.com/docker/metadata-action/blob/master/#tag-custom--tag-custom-only"><code>tag-custom</code>
/ <code>tag-custom-only</code></a></li>
<li><a
href="https://github.com/docker/metadata-action/blob/master/#label-custom"><code>label-custom</code></a></li>
</ul>
</li>
<li><a
href="https://github.com/docker/metadata-action/blob/master/#basic-workflow">Basic
workflow</a></li>
<li><a
href="https://github.com/docker/metadata-action/blob/master/#semver-workflow">Semver
workflow</a></li>
</ul>
<h3>inputs</h3>
<table>
<thead>
<tr>
<th>New</th>
<th>Unchanged</th>
<th>Removed</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>tags</code></td>
<td><code>images</code></td>
<td><code>tag-sha</code></td>
</tr>
<tr>
<td><code>flavor</code></td>
<td><code>sep-tags</code></td>
<td><code>tag-edge</code></td>
</tr>
<tr>
<td><code>labels</code></td>
<td><code>sep-labels</code></td>
<td><code>tag-edge-branch</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>tag-semver</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>tag-match</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>tag-match-group</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>tag-latest</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>tag-schedule</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>tag-custom</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>tag-custom-only</code></td>
</tr>
<tr>
<td></td>
<td></td>
<td><code>label-custom</code></td>
</tr>
</tbody>
</table>
<h4><code>tag-sha</code></h4>
<pre lang="yaml"><code>tags: |
  type=sha
</code></pre>
<h4><code>tag-edge</code> / <code>tag-edge-branch</code></h4>
<pre lang="yaml"><code>tags: |
  # default branch
</code></pre>
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="96383f4557"><code>96383f4</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/metadata-action/issues/320">#320</a>
from docker/dependabot/npm_and_yarn/csv-parse-5.5.0</li>
<li><a
href="f138b9677b"><code>f138b96</code></a>
chore: update generated content</li>
<li><a
href="9cf7015b15"><code>9cf7015</code></a>
Bump csv-parse from 5.4.0 to 5.5.0</li>
<li><a
href="5a8a5ff8df"><code>5a8a5ff</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/metadata-action/issues/315">#315</a>
from docker/dependabot/npm_and_yarn/handlebars-4.7.8</li>
<li><a
href="2279d9af58"><code>2279d9a</code></a>
chore: update generated content</li>
<li><a
href="c659933213"><code>c659933</code></a>
Bump handlebars from 4.7.7 to 4.7.8</li>
<li><a
href="48d23ccc05"><code>48d23cc</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/metadata-action/issues/333">#333</a>
from docker/dependabot/npm_and_yarn/actions/core-1.10.1</li>
<li><a
href="b83ffb48d6"><code>b83ffb4</code></a>
chore: update generated content</li>
<li><a
href="3207f2405f"><code>3207f24</code></a>
Bump <code>@​actions/core</code> from 1.10.0 to 1.10.1</li>
<li><a
href="63f4a263e5"><code>63f4a26</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/metadata-action/issues/328">#328</a>
from crazy-max/update-node20</li>
<li>Additional commits viewable in <a
href="https://github.com/docker/metadata-action/compare/v4...v5">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=docker/metadata-action&package-manager=github_actions&previous-version=4&new-version=5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-20 09:32:31 +01:00
dependabot[bot]
1de5425f7d Bump docker/build-push-action from 3 to 5 (#8779)
Bumps
[docker/build-push-action](https://github.com/docker/build-push-action)
from 3 to 5.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/docker/build-push-action/releases">docker/build-push-action's
releases</a>.</em></p>
<blockquote>
<h2>v5.0.0</h2>
<ul>
<li>Node 20 as default runtime (requires <a
href="https://github.com/actions/runner/releases/tag/v2.308.0">Actions
Runner v2.308.0</a> or later) by <a
href="https://github.com/crazy-max"><code>@​crazy-max</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/954">docker/build-push-action#954</a></li>
<li>Bump <code>@​actions/core</code> from 1.10.0 to 1.10.1 in <a
href="https://redirect.github.com/docker/build-push-action/pull/959">docker/build-push-action#959</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/docker/build-push-action/compare/v4.2.1...v5.0.0">https://github.com/docker/build-push-action/compare/v4.2.1...v5.0.0</a></p>
<h2>v4.2.1</h2>
<blockquote>
<p><strong>Note</strong></p>
<p>Buildx v0.10 enables support for a minimal <a
href="https://slsa.dev/provenance/">SLSA Provenance</a> attestation,
which requires support for <a
href="https://github.com/opencontainers/image-spec">OCI-compliant</a>
multi-platform images. This may introduce issues with registry and
runtime support (e.g. <a
href="https://redirect.github.com/docker/buildx/issues/1533">Google
Cloud Run and AWS Lambda</a>). You can optionally disable the default
provenance attestation functionality using <code>provenance:
false</code>.</p>
</blockquote>
<ul>
<li>warn if docker config can't be parsed by <a
href="https://github.com/crazy-max"><code>@​crazy-max</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/957">docker/build-push-action#957</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/docker/build-push-action/compare/v4.2.0...v4.2.1">https://github.com/docker/build-push-action/compare/v4.2.0...v4.2.1</a></p>
<h2>v4.2.0</h2>
<blockquote>
<p><strong>Note</strong></p>
<p>Buildx v0.10 enables support for a minimal <a
href="https://slsa.dev/provenance/">SLSA Provenance</a> attestation,
which requires support for <a
href="https://github.com/opencontainers/image-spec">OCI-compliant</a>
multi-platform images. This may introduce issues with registry and
runtime support (e.g. <a
href="https://redirect.github.com/docker/buildx/issues/1533">Google
Cloud Run and AWS Lambda</a>). You can optionally disable the default
provenance attestation functionality using <code>provenance:
false</code>.</p>
</blockquote>
<ul>
<li>display proxy configuration by <a
href="https://github.com/crazy-max"><code>@​crazy-max</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/872">docker/build-push-action#872</a></li>
<li>chore(deps): Bump <code>@​docker/actions-toolkit</code> from 0.6.0
to 0.8.0 in <a
href="https://redirect.github.com/docker/build-push-action/pull/930">docker/build-push-action#930</a></li>
<li>chore(deps): Bump word-wrap from 1.2.3 to 1.2.5 in <a
href="https://redirect.github.com/docker/build-push-action/pull/925">docker/build-push-action#925</a></li>
<li>chore(deps): Bump semver from 6.3.0 to 6.3.1 in <a
href="https://redirect.github.com/docker/build-push-action/pull/902">docker/build-push-action#902</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/docker/build-push-action/compare/v4.1.1...v4.2.0">https://github.com/docker/build-push-action/compare/v4.1.1...v4.2.0</a></p>
<h2>v4.1.1</h2>
<blockquote>
<p><strong>Note</strong></p>
<p>Buildx v0.10 enables support for a minimal <a
href="https://slsa.dev/provenance/">SLSA Provenance</a> attestation,
which requires support for <a
href="https://github.com/opencontainers/image-spec">OCI-compliant</a>
multi-platform images. This may introduce issues with registry and
runtime support (e.g. <a
href="https://redirect.github.com/docker/buildx/issues/1533">Google
Cloud Run and AWS Lambda</a>). You can optionally disable the default
provenance attestation functionality using <code>provenance:
false</code>.</p>
</blockquote>
<ul>
<li>Bump <code>@​docker/actions-toolkit</code> from 0.3.0 to 0.5.0 by <a
href="https://github.com/crazy-max"><code>@​crazy-max</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/880">docker/build-push-action#880</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/docker/build-push-action/compare/v4.1.0...v4.1.1">https://github.com/docker/build-push-action/compare/v4.1.0...v4.1.1</a></p>
<h2>v4.1.0</h2>
<blockquote>
<p><strong>Note</strong></p>
<p>Buildx v0.10 enables support for a minimal <a
href="https://slsa.dev/provenance/">SLSA Provenance</a> attestation,
which requires support for <a
href="https://github.com/opencontainers/image-spec">OCI-compliant</a>
multi-platform images. This may introduce issues with registry and
runtime support (e.g. <a
href="https://redirect.github.com/docker/buildx/issues/1533">Google
Cloud Run and AWS Lambda</a>). You can optionally disable the default
provenance attestation functionality using <code>provenance:
false</code>.</p>
</blockquote>
<ul>
<li>Switch to actions-toolkit implementation by <a
href="https://github.com/crazy-max"><code>@​crazy-max</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/811">docker/build-push-action#811</a>
<a
href="https://redirect.github.com/docker/build-push-action/pull/838">docker/build-push-action#838</a>
<a
href="https://redirect.github.com/docker/build-push-action/pull/855">docker/build-push-action#855</a>
<a
href="https://redirect.github.com/docker/build-push-action/pull/860">docker/build-push-action#860</a>
<a
href="https://redirect.github.com/docker/build-push-action/pull/875">docker/build-push-action#875</a></li>
<li>e2e: quay.io by <a
href="https://github.com/crazy-max"><code>@​crazy-max</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/799">docker/build-push-action#799</a>
<a
href="https://redirect.github.com/docker/build-push-action/pull/805">docker/build-push-action#805</a></li>
<li>e2e: local harbor and nexus by <a
href="https://github.com/crazy-max"><code>@​crazy-max</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/800">docker/build-push-action#800</a></li>
<li>e2e: add artifactory container registry to test against by <a
href="https://github.com/jedevc"><code>@​jedevc</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/804">docker/build-push-action#804</a></li>
<li>e2e: add distribution tests by <a
href="https://github.com/jedevc"><code>@​jedevc</code></a> in <a
href="https://redirect.github.com/docker/build-push-action/pull/814">docker/build-push-action#814</a>
<a
href="https://redirect.github.com/docker/build-push-action/pull/815">docker/build-push-action#815</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/docker/build-push-action/compare/v4.0.0...v4.1.0">https://github.com/docker/build-push-action/compare/v4.0.0...v4.1.0</a></p>
<h2>v4.0.0</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4a13e500e5"><code>4a13e50</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/build-push-action/issues/1006">#1006</a>
from docker/dependabot/npm_and_yarn/docker/actions-t...</li>
<li><a
href="7416668686"><code>7416668</code></a>
chore: update generated content</li>
<li><a
href="b4f76a5dc6"><code>b4f76a5</code></a>
chore(deps): Bump <code>@​docker/actions-toolkit</code> from 0.13.0 to
0.14.0</li>
<li><a
href="b7feb766fa"><code>b7feb76</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/build-push-action/issues/1005">#1005</a>
from crazy-max/ci-inspect</li>
<li><a
href="fae8018297"><code>fae8018</code></a>
ci: inspect sbom and provenance</li>
<li><a
href="b625868b13"><code>b625868</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/build-push-action/issues/1004">#1004</a>
from crazy-max/ci-update-buildx</li>
<li><a
href="5193ef1da6"><code>5193ef1</code></a>
ci: update buildx to latest</li>
<li><a
href="d3afd779e4"><code>d3afd77</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/build-push-action/issues/991">#991</a>
from docker/dependabot/npm_and_yarn/babel/traverse-7....</li>
<li><a
href="7a786bb2b9"><code>7a786bb</code></a>
Merge pull request <a
href="https://redirect.github.com/docker/build-push-action/issues/992">#992</a>
from crazy-max/annotations</li>
<li><a
href="c66ae3adcf"><code>c66ae3a</code></a>
chore: update generated content</li>
<li>Additional commits viewable in <a
href="https://github.com/docker/build-push-action/compare/v3...v5">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=docker/build-push-action&package-manager=github_actions&previous-version=3&new-version=5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-20 09:32:22 +01:00
Charlie Marsh
71573fd35c Avoid repeated triggers in nested tryceratops diagnostics (#8772)
Closes https://github.com/astral-sh/ruff/issues/8770.
2023-11-19 23:43:59 +00:00
Charlie Marsh
95e2f632e6 Retain extra ellipses in protocols and abstract methods (#8769)
## Summary

It turns out that some type checkers rely on the presence of ellipses in
`Protocol` interfaces and abstract methods, in order to differentiate
between default implementations and stubs. This PR modifies the preview
behavior of `PIE790` to avoid flagging "unnecessary" ellipses in such
cases.
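
For illustration, a minimal sketch (names are hypothetical) of the kind of stub bodies the preview behavior now leaves alone:

```python
from abc import ABC, abstractmethod
from typing import Protocol


class Reader(Protocol):
    def read(self) -> bytes:
        """Read all remaining bytes."""
        ...  # retained: the ellipsis marks this as a stub, not a default implementation


class Task(ABC):
    @abstractmethod
    def run(self) -> None:
        """Run the task."""
        ...  # retained for the same reason
```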

Closes https://github.com/astral-sh/ruff/issues/8756.

## Test Plan

`cargo test`
2023-11-19 10:05:29 -05:00
Charlie Marsh
00a015ca24 Respect local subclasses in flake8-type-checking (#8768)
If you define a subclass of `pydantic.BaseModel`, and then a subclass of
_that_ class in the same file, we'll now correctly treat it as
runtime-evaluated.
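
A hedged sketch of the situation this addresses, assuming `pydantic` is installed (model names are illustrative):

```python
import datetime  # must not be moved into an `if TYPE_CHECKING:` block

import pydantic


class Event(pydantic.BaseModel):
    created: datetime.datetime


class ReleaseEvent(Event):  # a subclass of a local subclass: now also treated as runtime-evaluated
    published: datetime.datetime
```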

Closes https://github.com/astral-sh/ruff/issues/7893.
2023-11-19 09:49:25 -05:00
Charlie Marsh
94178a0320 [flake8-pyi] Respect local enum subclasses in simple-defaults (PYI052) (#8767)
We should reuse this approach in other rules, but this is a good start.

Closes https://github.com/astral-sh/ruff/issues/8764.
2023-11-19 09:31:59 -05:00
Charlie Marsh
7165f8f05d [flake8-pyi] Improve motivation for custom-type-var-return-type (PYI019) (#8766)
Closes https://github.com/astral-sh/ruff/issues/8765.
2023-11-19 09:15:10 -05:00
konsti
415e808046 Use pinned toolchain version in Dockerfile (#8763)
The previous build would use the cached rust version instead of
rust-toolchain.toml
2023-11-19 08:12:51 +00:00
Alex Bieg
9279114521 Add Implementation for Pylint E1132: Repeated Keyword (#8706)

## Summary

Adds the Pylint rule E1132 to check for repeated keyword arguments in a
function call.
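
Since a literal duplicate keyword is already a `SyntaxError`, the interesting case involves `**` unpacking with a known key; a minimal sketch (names are illustrative):

```python
def update(*, force: bool = False) -> None:
    ...


update(force=True, **{"force": False})  # flagged: `force` is passed twice (and raises TypeError at runtime)
```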

## Test Plan

Tested via the included unit tests and manual spot checking.
2023-11-19 00:26:24 +00:00
Charlie Marsh
8b86e8004d Extend dict-get-with-none-default (SIM910) to non-literals (#8762)
## Summary

Ensures that we can catch cases like:

```python
ages = {"Tom": 23, "Maria": 23, "Dog": 11}
age = ages.get("Cat", None)
```
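
The fix simply drops the redundant default, since `dict.get` already returns `None` when the key is missing:

```python
ages = {"Tom": 23, "Maria": 23, "Dog": 11}
age = ages.get("Cat")
```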

Previously, the rule was somewhat useless, as it only checked for
literal accesses.

Closes https://github.com/astral-sh/ruff/issues/8760.
2023-11-19 00:21:53 +00:00
konsti
a7fc785cc5 Add a ruff docker image at ghcr.io/astral-sh/ruff (#8554)
This dockerfile creates a minimal docker container that runs ruff

```console
$ docker run -v .:/io --rm ruff check --select G004 .
scripts/check_ecosystem.py:51:26: G004 Logging statement uses f-string
scripts/check_ecosystem.py:55:22: G004 Logging statement uses f-string
scripts/check_ecosystem.py:84:13: G004 Logging statement uses f-string
scripts/check_ecosystem.py:177:18: G004 Logging statement uses f-string
scripts/check_ecosystem.py:200:18: G004 Logging statement uses f-string
scripts/check_ecosystem.py:354:18: G004 Logging statement uses f-string
scripts/check_ecosystem.py:477:18: G004 Logging statement uses f-string
Found 7 errors.
```

```console
$ docker image ls ruff
 REPOSITORY   TAG       IMAGE ID       CREATED         SIZE
 ruff         latest    505876b0f817   2 minutes ago   16.2MB
```

Test repo: https://github.com/konstin/release-testing2
Successful build:
https://github.com/konstin/release-testing2/actions/runs/6862107104/job/18659155108
The package:
https://github.com/konstin/release-testing2/pkgs/container/release-testing2

After merging this, I have to manually push the first image and connect it
to the repo in the GitHub UI, or the action will fail due to lack of
permissions.

Open questions:
* Test the arm version: is anyone working on an aarch64 Linux machine? I
don't expect this to fail, and it's not a high-priority target (the vast
majority of Linux users are on x86), but it would be nice to have it tested
by someone.

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2023-11-17 19:44:28 +01:00
Charlie Marsh
f460f9c5c0 Bump version to v0.1.6 (#8744) 2023-11-17 13:29:19 -05:00
Tuomas Siipola
2faac1e7a8 [refurb] Implement math-constant (FURB152) (#8727)
## Summary

Implements
[FURB152](https://github.com/dosisod/refurb/blob/master/docs/checks.md#furb152-use-math-constant)
that checks for literals that are similar to constants in `math` module,
for example:

```python
A = 3.141592 * r ** 2
```

Use instead:
```python
A = math.pi * r ** 2
```

Related to #1348.
2023-11-17 17:37:44 +00:00
Charlie Marsh
b7dbb9062c Remove incorrect deprecation label for stdout and stderr (#8743)
Closes https://github.com/astral-sh/ruff/issues/8738.
2023-11-17 12:34:02 -05:00
Charlie Marsh
66794bc9fe Remove erroneous bad-dunder-name reference (#8742)
Closes #8731.
2023-11-17 17:26:29 +00:00
konsti
dca430f4d2 Fix instability with await fluent style (#8676)
Fix an instability where await was followed by a breaking fluent style
expression:

```python
test_data = await (
    Stream.from_async(async_data)
    .flat_map_async()
    .map()
    .filter_async(is_valid_data)
    .to_list()
)
```

Note that this technically a minor style change (see ecosystem check)
2023-11-17 12:24:19 -05:00
Adil Zouitine
841e6c889e Add River in "Who's Using Ruff?" section (#8740)
## Summary

I added [`River`](https://github.com/online-ml/river) to the "Who's Using
Ruff?" section. River is an online machine learning library with 4.5k
stars and 460k downloads; for a few months now we have used
[`Ruff`](3758eb1e0a/pyproject.toml (L31))
as our linter.
2023-11-17 08:13:50 -06:00
Zanie Blue
bd99175fea Update D208 to preserve indentation offsets when fixing overindented lines (#8699)
Closes https://github.com/astral-sh/ruff/issues/8695

We track the smallest over-indentation seen across the offending lines and
then reduce each line's indentation by only that much, preserving the
relative offsets between lines. This rule's behavior now matches our
formatter, which is nice.

We may want to gate this with preview.
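
A small before/after sketch (hypothetical function) of what preserving the offsets means:

```python
# Before: both lines are over-indented; the smallest excess is 8 spaces.
def before():
    """Summary.

            item one
              item two (two spaces deeper than item one)
    """


# After the fix: each line is shifted left by that common 8 spaces,
# so the relative two-space nesting between the items is preserved.
def after():
    """Summary.

    item one
      item two (two spaces deeper than item one)
    """
```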
2023-11-16 22:11:07 -06:00
Ofek Lev
4c86b155f2 Fix typo (#8735) 2023-11-16 22:10:09 -06:00
Vince Chan
e2109c1353 Improve debug printing for resolving origin of config settings (#8729)
## Summary

When running ruff in verbose mode with `-v`, the first debug logs show
where the config settings are taken from. For example:
```
❯ ruff check ./some_file.py -v
[2023-11-17][00:16:25][ruff_cli::resolve][DEBUG] Using pyproject.toml (parent) at /Users/vince/demo/ruff.toml
```

This threw me off for a second because I knew I had no Python project
there, and therefore no `pyproject.toml` file. Then I realised it was
actually reading a `ruff.toml` file (obvious when you read the whole
message, I suppose) and that "pyproject.toml" is a hardcoded string in
the debug log.

I think it would be nice to tweak the wording slightly so it is clear
that the settings don't necessarily have to come from a
`pyproject.toml` file.
2023-11-17 01:10:36 +00:00
Charlie Marsh
1fcccf82fc Avoid syntax error via importing trio.lowlevel (#8730)
We ended up with a syntax error here via `from trio import
lowlevel.checkpoint`. The new solution avoids that error, but does miss
cases like:

```py
from trio.lowlevel import Timer
```

Where it could insert `from trio.lowlevel import Timer, checkpoint`.
Instead, it'll add `from trio import lowlevel`.

See:
https://github.com/astral-sh/ruff/issues/8402#issuecomment-1810838129
2023-11-17 01:07:59 +00:00
konsti
14e65afdc6 Update to Rust 1.74 and use new clippy lints table (#8722)
Update to [Rust
1.74](https://blog.rust-lang.org/2023/11/16/Rust-1.74.0.html) and use
the new clippy lints table.

The update itself introduced a new clippy lint about superfluous hashes
in raw strings, which got removed.

I moved our lint config from `rustflags` to the newly stabilized
[workspace.lints](https://doc.rust-lang.org/stable/cargo/reference/workspaces.html#the-lints-table).
One consequence is that we have to `unsafe_code = "warn"` instead of
"forbid" because the latter now actually bans unsafe code:

```
error[E0453]: allow(unsafe_code) incompatible with previous forbid
  --> crates/ruff_source_file/src/newlines.rs:62:17
   |
62 |         #[allow(unsafe_code)]
   |                 ^^^^^^^^^^^ overruled by previous forbid
   |
   = note: `forbid` lint level was set on command line
```

---------

Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
2023-11-16 18:12:46 -05:00
Charlie Marsh
6d5d079a18 Avoid missing namespace violations in scripts with shebangs (#8710)
## Summary

I think it's reasonable to avoid raising `INP001` for scripts, and
shebangs are one sufficient way to detect scripts.
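
A minimal sketch: a file like this in a directory without an `__init__.py` would previously trigger `INP001`, but the shebang now marks it as a script:

```python
#!/usr/bin/env python3
"""A standalone script, not part of a package."""
print("hello from a script")
```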

Closes https://github.com/astral-sh/ruff/issues/8690.
2023-11-16 17:21:33 -05:00
Zanie Blue
d1e88dc984 Update UP032 to unescape curly braces in literal parts of converted strings (#8697)
Closes #8694
2023-11-16 13:59:54 -06:00
konsti
dda31b6996 List all ipython builtins (#8719)
I checked for ipython-specific builtins on python 3.11 using
```python
import json
from subprocess import check_output

builtins_python = json.loads(check_output(["python3", "-c" "import json; print(json.dumps(dir(__builtins__)))"]))
builtins_ipython = json.loads(check_output(["ipython3", "-c" "import json; print(json.dumps(dir(__builtins__)))"]))
print(sorted(set(builtins_ipython) - set(builtins_python)))
```
and updated the relevant constant and match. The list changes from

`display`

to

`__IPYTHON__`, `display`, `get_ipython`.

Followup to #8707
2023-11-16 19:06:25 +01:00
Charlie Marsh
b6a7787318 Remove pyproject.toml from fixtures directory (#8726)
## Summary

This exists to power a test, but it ends up affecting the behavior of
all files in the directory. Namely, it means that these files _aren't_
excluded when you format or lint them directly, since in that case, Ruff
will fall back to looking at the `pyproject.toml` in
`crates/ruff_linter/resources/test/fixtures`, which _doesn't_ exclude
these files, unlike our top-level `pyproject.toml`.
2023-11-16 13:04:52 -05:00
Jonas Haag
5fa961f670 Improve N803 example (#8714) 2023-11-16 12:50:31 -05:00
Charlie Marsh
2424188bb2 Trim trailing empty strings when converting to f-strings (#8712)
## Summary

When converting from a `.format` call to an f-string, we can trim any
trailing empty tokens.

Closes https://github.com/astral-sh/ruff/issues/8683.
2023-11-15 23:14:49 -05:00
Charlie Marsh
a59172528c Add fix for future-required-type-annotation (#8711)
## Summary

We already support inserting imports for `I002` -- this PR just adds the
same fix for `FA102`, which is explicitly about `from __future__ import
annotations`.
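
Roughly, for a file using PEP 604/585 syntax with an older `target-version`, the fix inserts the import shown on the first line of this sketch (the function is illustrative):

```python
from __future__ import annotations  # inserted by the new fix


def find(needle: str | None) -> list[str]:
    return [] if needle is None else [needle]
```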

Closes https://github.com/astral-sh/ruff/issues/8682.
2023-11-16 03:08:02 +00:00
Charlie Marsh
cd29761b9c Run unicode prefix rule over tokens (#8709)
## Summary

It seems like the range of an `ExprStringLiteral` can be somewhat
unreliable when the string is part of an implicit concatenation with an
f-string. Using the tokens themselves is more reliable.
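
For illustration (variable names are made up), this is the kind of implicit concatenation that tripped up the expression-based range; checking the tokens locates the redundant `u` prefix (e.g. `UP025`) reliably:

```python
value = 42
message = u"answer: " f"{value}"  # the `u` prefix is redundant
```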

Closes #8680.
Closes https://github.com/astral-sh/ruff/issues/7784.
2023-11-16 02:30:42 +00:00
Charlie Marsh
4ac78d5725 Treat display as a builtin in IPython (#8707)
## Summary

`display` is a special-cased builtin in IPython. This PR adds it to the
builtin namespace when analyzing IPython notebooks.
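
A notebook-cell sketch (only meaningful inside IPython, where `display` is injected):

```python
display({"answer": 42})  # no longer reported as an undefined name when linting .ipynb files
```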

Closes https://github.com/astral-sh/ruff/issues/8702.
2023-11-16 01:58:44 +00:00
Alan Du
2083352ae3 Add autofix for PIE800 (#8668)
## Summary

This adds an autofix for PIE800 (unnecessary spread) -- whenever we see
a `**{...}` inside another dictionary literal, just delete the `**{` and
`}` to inline the key-value pairs. So `{"a": "b", **{"c": "d"}}` becomes
just `{"a": "b", "c": "d"}`.

I have enabled this just for preview mode.

## Test Plan

Updated the preview snapshot test.
2023-11-15 18:11:04 +00:00
Tuomas Siipola
0e2ece5217 Implement FURB136 (#8664)
## Summary

Implements
[FURB136](https://github.com/dosisod/refurb/blob/master/docs/checks.md#furb136-use-min-max)
that checks for `if` expressions that can be replaced with `min()` or
`max()` calls. See issue #1348 for more information.

This implementation diverges from Refurb's original implementation by
retaining the order of equal values. For example, Refurb suggest that
the following expressions:

```python
highest_score1 = score1 if score1 > score2 else score2
highest_score2 = score1 if score1 >= score2 else score2
```

should be to rewritten as:

```python
highest_score1 = max(score1, score2)
highest_score2 = max(score1, score2)
```

whereas this implementation provides more correct alternatives:

```python
highest_score1 = max(score2, score1)
highest_score2 = max(score1, score2)
```

## Test Plan

Unit test checks all eight possibilities.
2023-11-15 18:10:13 +00:00
konsti
a783b14e7d Add --skip-magic-trailing-comma to formatter dev comment (#8689)
While testing compatibility with the future stable Black style, I realized
that the `ruff_python_formatter` dev binary was lacking the
`--skip-magic-trailing-comma` option. This does not affect `ruff
format`.

Usage:
```shell
cargo run --bin ruff_python_formatter -p ruff_python_formatter -- --skip-magic-trailing-comma --emit stdout scratch.py
```
2023-11-15 09:23:46 +00:00
Jelmer Vernooij
9d76e4e0b9 isort: Support disabling sections with `no-sections = true` (#8657)
## Summary

This adds a ``no-sections`` option for isort in the linter, similar to
the ``no_sections`` option that exists in upstream isort
(https://pycqa.github.io/isort/docs/configuration/options.html#no-sections)

This option puts all imports except for ``__future__`` into the same
section, and is mostly used by monorepos.
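
As a rough sketch of the effect (assuming `requests` is installed): by default, the third-party import would sit in its own section after the standard library, but with `no-sections = true` everything after `__future__` is sorted as one block:

```python
from __future__ import annotations

import os
import requests
import sys
```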

I've taken a bit of a leap in assuming that ruff wants to support the
exact same option; more than happy to refactor if you'd prefer a
different way of setting this up.

Fixes #8653

## Test Plan

I've added a test and have run it on a large Python codebase that uses
isort with --no-sections. The option is disabled by default.
2023-11-14 21:45:51 +00:00
bluthej
561277925f [isort] Simplify code structure for ordering imports (#8685)
While fixing #8661 I noticed that the code structure for sorting imports
could be simplified.

## Summary

- Move the logic for `force_sort_within_sections` from `isort/mod.rs` to
`isort/ordering.rs` => now there is just one line in `isort/mod.rs`:
`let imports = order_imports(import_block, settings);` which yields the
sorted imports
- Change the function signature of `order_imports` to directly return a
`Vec<EitherImport<'a>>` => no need for `OrderedImportBlock`

I think this is a bit of an improvement because the code is simpler and
there should be a bit of a speedup when setting
`force-sort-within-sections` to true. Indeed, when it's set to true
we're now directly ordering all the imports, whereas before we would
first order the straight imports, then the from imports, combine them
and finally sort the combination a second time (this is probably not
noticeable in practice though).

## Test Plan

No tests added, this is a simple refactor.
2023-11-14 16:43:46 -05:00
doolio
1074324c52 Add starlette as a user of ruff (#8672)
Mentioned [here on
discord](https://discord.com/channels/1039017663004942429/1039017663512449056/1173726569852837958).
Used their github url as that seems to be the most common url type
though not for Litestar.
2023-11-14 08:38:12 -06:00
Dhruv Manilawala
4099b9610f F-strings doesn't contain bytes literal for PLW0129 (#8675)
For the `PLW0129` rule, the f-string case shouldn't match against bytes
literal as f-strings cannot contain them. F-strings are made up of
either string literals or formatted expressions.
2023-11-14 18:56:18 +05:30
Charlie Marsh
f7d249ae06 Remove repeated and erroneous scoped settings headers in docs (#8670)
Closes https://github.com/astral-sh/ruff/issues/8505.
2023-11-14 05:44:30 +00:00
Charlie Marsh
bf2cc3f520 Add autotyping-like return type inference for annotation rules (#8643)
## Summary

This PR adds (unsafe) fixes to the flake8-annotations rules that enforce
missing return types, offering to automatically insert type annotations
for functions with literal return values. The logic is smart enough to
generate simplified unions (e.g., `float` instead of `int | float`) and
deal with implicit returns (`return` without a value).
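
A sketch of the kind of inference described (function names and the exact annotations are illustrative):

```python
# Before: missing return type; the returns are the literals 1 and 2.5.
def ratio_before(x):
    if x:
        return 1
    return 2.5


# After the fix: `int | float` is simplified to `float`.
def ratio_after(x) -> float:
    if x:
        return 1
    return 2.5


# A bare `return` (or falling off the end) contributes `None`,
# so a union like `int | None` would be inferred here.
def maybe(x):
    if x:
        return 1
    return
```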

Closes https://github.com/astral-sh/ruff/issues/1640 (though we could
open a separate issue for inferring parameter types).

Closes https://github.com/astral-sh/ruff/issues/8213.

## Test Plan

`cargo test`
2023-11-13 23:34:15 -05:00
bluthej
23c819b4b3 Fix ordering for force-sort-within-sections (#8665)
Fixes #8661 

## Summary

Imports like `from x import y` don't have an "asname" for the module, so
they were placed before imports like `import x as w` since `None` <
`Some(s)` for any string s.
The fix is to first sort by `first_alias`, since it's `None` for `import
x as w`, and then by `asname`.

## Test Plan

I included the example from the issue to avoid future regressions.
2023-11-13 18:27:56 -05:00
Adrian
16060670b8 Add new rule to check for useless quote escapes (#8630)
When using the autofixer for `Q000` it does not remove the backslashes
from quotes that no longer need escaping.

This new rule checks for such backslashes (regardless of whether they come
from the autofixer or not) and can remove them.
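
For example (the string content is arbitrary), the backslash here is useless inside double quotes and can be dropped:

```python
label = "it\'s fine"   # flagged: the escape is unnecessary
label = "it's fine"    # after the fix
```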

fixes #8617
2023-11-13 21:59:37 +00:00
Charlie Marsh
534fc34f11 Extend unnecessary-pass (PIE790) to include ellipses in preview (#8641)
## Summary

This PR extends `unnecessary-pass` (`PIE790`) to flag unnecessary
ellipsis expressions in addition to `pass` statements. A `pass` is
equivalent to a standalone `...`, so it feels correct to me that a
single rule should cover both cases.
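
For illustration (hypothetical names), the preview behavior now flags both placeholders below, since the docstring alone is a valid body:

```python
def configure() -> None:
    """Apply the default configuration."""
    ...  # flagged under preview, just like the `pass` below


class Config:
    """Holds configuration values."""

    pass  # flagged: unnecessary
```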

When we look to v0.2.0, we should also consider deprecating `PYI013`,
which flags ellipses only for classes.

Closes https://github.com/astral-sh/ruff/issues/8602.
2023-11-13 19:28:16 +00:00
Charlie Marsh
df9ade7fd9 Use AST transformer for relocate (#8660) 2023-11-13 13:24:27 -05:00
Charlie Marsh
345e1401cf Treat class C: ... and class C(): ... equivalently (#8659)
## Summary

These should be seen as identical from the `ComparableAst` perspective.
2023-11-13 18:03:04 +00:00
Charlie Marsh
a8e0d4ab4f Fix lingering generated reference for MkDocs (#8658)
Missed this in #8652.
2023-11-13 18:00:01 +00:00
Alan Du
6f23bdb78f Generalize PIE807 to handle dict literals (#8608)
## Summary

PIE807 will rewrite `lambda: []` to `list` -- AFAICT though, the same
rationale also applies to dicts, so I've modified the code to also
rewrite `lambda: {}` to `dict`.
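
A typical place this shows up (the variable name is illustrative):

```python
from collections import defaultdict

groups = defaultdict(lambda: {})  # flagged: the lambda just re-implements `dict`
groups = defaultdict(dict)        # suggested replacement
```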

Two things I'm not sure about:
* Should this go to a new rule? This no longer actually matches the
behavior of flake8-pie, and while I think thematically it makes sense to
be part of the same rule, we could make it a standalone rule (but if so,
where should I put it and what error code should I use)?
* If we want a single rule, are there backwards compatibility concerns
with the rule name change (from `reimplemented_list_builtin` to
`reimplemented_container_builtin`?

## Test Plan

Added snapshot tests of the functionality.
2023-11-13 17:55:17 +00:00
Charlie Marsh
d574fcd1ac Compare formatted and unformatted ASTs during formatter tests (#8624)
## Summary

This PR implements validation in the formatter tests to ensure that we
don't modify the AST during formatting. Black has similar logic.

In implementing this, I learned that Black actually _does_ modify the
AST, and their test infrastructure normalizes the AST to wipe away those
differences. Specifically, Black changes the indentation of docstrings,
which _does_ modify the AST; and it also inserts parentheses in `del`
statements, which changes the AST too.

Ruff also does both these things, so we _also_ implement the same
normalization using a new visitor that allows for modifying the AST.

Closes https://github.com/astral-sh/ruff/issues/8184.

## Test Plan

`cargo test`
2023-11-13 17:43:27 +00:00
Charlie Marsh
3592f44ade Allow whitespace around colon in slices for whitespace-before-punctuation (E203) (#8654)
## Summary

This PR makes `whitespace-before-punctuation` (`E203`) compatible with
the formatter by relaxing the rule a bit, as compared to the pycodestyle
implementation. It's also more consistent with PEP 8, which says:

> However, in a slice the colon acts like a binary operator, and should
have equal amounts on either side (treating it as the operator with the
lowest priority).
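
For example (values are arbitrary), balanced spacing around the slice colon is treated like spacing around a binary operator and is no longer flagged; unbalanced spacing presumably still is:

```python
ham = list(range(10))
lower, upper, offset = 2, 8, 1

piece = ham[lower + offset : upper + offset]  # allowed: equal space on both sides
```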

Closes https://github.com/astral-sh/ruff/issues/7259.
Closes https://github.com/astral-sh/ruff/issues/8642.

## Test Plan

`cargo test`
2023-11-13 12:16:13 -05:00
Andrew Gallant
8984072df2 ruff_python_formatter: copy and inline shared traits (#8656)
It seems as though using `include!(...)` to avoid the source code copy
breaks rust-analyzer. Namely, it treats the included file as unlinked,
and so any part of analysis (e.g., goto-definition) that needs that file
to reason about the code ends up failing.

2023-11-13 12:16:04 -05:00
Charlie Marsh
6a6de53722 Omit Insiders-only plugin when building docs on CI (#8652) 2023-11-13 10:24:58 -05:00
dependabot[bot]
5ba852a878 Bump annotate-snippets from 0.9.1 to 0.9.2 (#8646) 2023-11-13 14:55:15 +00:00
dependabot[bot]
c4fc2b8584 Bump pyproject-toml from 0.8.0 to 0.8.1 (#8648) 2023-11-13 14:53:47 +00:00
dependabot[bot]
1c5f2288ba Bump fs-err from 2.9.0 to 2.10.0 (#8649) 2023-11-13 09:38:44 -05:00
dependabot[bot]
62f1830898 Bump quick-junit from 0.3.3 to 0.3.5 (#8645) 2023-11-13 09:38:31 -05:00
dependabot[bot]
abca0a86ea Bump smallvec from 1.11.1 to 1.11.2 (#8647) 2023-11-13 09:38:11 -05:00
Charlie Marsh
213d315373 Avoid recommending Self usages in metaclasses (#8639)
PEP 673 forbids the use of `typing(_extensions).Self` in metaclasses, so
we want to avoid flagging `PYI034` on metaclasses. This is based on an
analogous change in `flake8-pyi`:
https://github.com/PyCQA/flake8-pyi/pull/436.
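
A stub-style sketch (the signature details are illustrative): `PYI034` would normally ask `__new__` here to return `Self`, but because the class is a metaclass it is now left alone:

```python
class Meta(type):
    def __new__(mcs, name: str, bases: tuple, namespace: dict) -> "Meta": ...
```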

Closes https://github.com/astral-sh/ruff/issues/8353.
2023-11-12 19:47:48 -05:00
Charlie Marsh
7fd95e15d9 Document conventions in the FAQ (#8638)
Enumerates all rules defined in each convention in the FAQ. These lists
mirror
[pydocstyle](https://www.pydocstyle.org/en/latest/error_codes.html#default-conventions).

Closes https://github.com/astral-sh/ruff/issues/8573.
2023-11-12 22:56:39 +00:00
Charlie Marsh
02946e7b0c Redirect from rule codes to rule pages in docs (#8636)
## Summary

This adds redirects from, e.g., `https://docs.astral.sh/ruff/rules/F401`
to `https://docs.astral.sh/ruff/rules/unused-import`, which are
generated automatically when creating the documentation. Though we want
to move towards human-readable names eventually, I think this is a nice
and user-friendly change (and doesn't require any fancy infrastructure,
since the redirects are handled via a plugin and added client-side).

Closes #4710.
2023-11-12 17:47:10 -05:00
Charlie Marsh
cbd9157bbf Use function range for no-self-use (#8637)
Previously, this rule used the range of the `self` annotation, but it's
a lot more natural to use the range of the function name (since it also
means the `# noqa` is associated with the method rather than its first
argument).

Closes https://github.com/astral-sh/ruff/issues/8635.
2023-11-12 16:37:52 -05:00
Charlie Marsh
70f491d31e Omit unrolled augmented assignments in PIE794 (#8634)
Closes https://github.com/astral-sh/ruff/issues/8497.
2023-11-12 20:40:33 +00:00
Jonathan Plasse
776eb8724f Fix FBT001 false negative with unions and optional (#7501)
## Summary

- Close #7487

In the spirit of `flake8-boolean-trap`, any positional argument that can
accept a boolean should raise `FBT001`.
Raise `FBT001` for all annotations that accept booleans (e.g.
`Optional[bool]`, `Union[int, bool]`).
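
A few illustrative signatures that are now flagged (function and parameter names are made up):

```python
from __future__ import annotations

from typing import Optional, Union


def set_visibility(visible: Optional[bool]): ...   # flagged: the positional argument accepts a bool

def toggle(state: Union[int, bool]): ...           # flagged

def set_flag(value: bool | None): ...              # flagged: `|` unions are covered too
```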

## Test Plan

Add a fixture, with an annotation using `|`, `Optional`, and `Union`,
and containing a boolean.
2023-11-12 15:09:23 -05:00
Charlie Wilson
5f78580775 Remove unecessary commentary in PD901 message (#8625)
## Summary

Removes unnecessary commentary from the PD901 message. This does make it
different from pandas-vet, but it improves consistency with the rest of
messages.

Current Message:

> `df` is a bad variable name. Be kinder to your future self.


New Message

> `df` is a bad variable name.


## Test Plan

The relevant snapshot has been updated with the new message.

---------

Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
2023-11-12 17:20:05 +00:00
Bodo Graumann
4d301f6dcc Improve docs for RUF001, RUF002 and RUF003 (#8628)
I got an error from RUF001 and wanted to override it, but how to do that
was not quite obvious. In the process, I have tried to improve the
documentation for the rule and its siblings.
2023-11-12 17:19:25 +00:00
Charlie Marsh
96b265ccec Implement autofix for multiple-spaces-after-operator and multiple-spaces-before-operator (#8623) 2023-11-11 23:46:16 +00:00
Charlie Marsh
e0a0ddcf7d Implement autofix for multiple-spaces-after-keyword and multiple-spaces-before-keyword (#8622)
Closes https://github.com/astral-sh/ruff/issues/8312.
2023-11-11 23:41:12 +00:00
Charlie Marsh
9724dfd939 Implement autofix for unnecessary-lambda (PLW0108) (#8621)
Closes https://github.com/astral-sh/ruff/issues/8618.
2023-11-11 18:34:02 -05:00
Steven DeMartini
d7144d6d8e Fix docs typo for ruff format preview configuration (#8611)
## Summary

The ruff configuration section is called "format", rather than
"preview". Using the configuration as it was written in the docs gives
an error:

```
$ ruff format --check .
ruff failed
  Cause: TOML parse error at line 143, column 1
    |
143 | [tool.ruff.preview]
    | ^^^^^^^^^^^^^^^^^^^
invalid type: map, expected a boolean
```

## Test Plan

Tested running `ruff format` with the following in my `pyproject.toml`:

```toml
[tool.ruff.format]
preview = true
```

and it worked properly (using preview rules for formatting).
2023-11-11 03:07:32 +00:00
Jesse Serrao
39728a1198 Add check for is comparison with mutable initialisers to rule F632 (#8607)
## Summary

Adds an extra check to F632 for `is` comparisons against mutable
initialisers.
Implements #8589.

Example:
```Python
named_var = {}
if named_var is {}:  # F632 (fix)
    pass
```
The `if` condition will always evaluate to False because it compares
identity, and an existing object can never share an identity with a freshly
constructed list/set/dict initializer.

## Test Plan

Multiple test cases were added to ensure the rule works + doesn't flag
false positives + the fix works correctly.
2023-11-11 00:29:23 +00:00
Shantanu
8207d6df82 Fix unnecessary parentheses in UP007 fix (#8610)
Fixes #8609
2023-11-10 19:15:09 -05:00
Jake Park
c8edac9d2b [pylint] Implement redefined-argument-from-local (R1704) (#8159)
## Summary

It implements Pylint rule R1704: redefined-argument-from-local

Problematic code:
```python
def show(host_id=10.11):
    # +1: [redefined-argument-from-local]
    for host_id, host in [[12.13, "Venus"], [14.15, "Mars"]]:
        print(host_id, host)
```

Correct code:
```python
def show(host_id=10.11):
    for inner_host_id, host in [[12.13, "Venus"], [14.15, "Mars"]]:
        print(host_id, inner_host_id, host)
```

References:
[Pylint
documentation](https://pylint.readthedocs.io/en/latest/user_guide/messages/refactor/redefined-argument-from-local.html)
[Related Issue](https://github.com/astral-sh/ruff/issues/970)

## Test Plan

`cargo test`
2023-11-10 14:13:07 -05:00
Alan Du
5a1a8bebca Allow overriding pydocstyle convention rules (#8586)
## Summary

This fixes #2606 by moving where we apply the convention ignores --
instead of applying them at the very end, we now track which
rules have been specifically enabled (via `Specificity::Rule`). If they
have, then we do *not* apply the docstring overrides at the end.

## Test Plan

Added unit tests to `ruff_workspace` and an integration test to
`ruff_cli`
2023-11-10 18:47:37 +00:00
Dhruv Manilawala
3e00ddce38 Preserve trailing semicolon for Notebooks (#8590)
## Summary

This PR updates the formatter to preserve trailing semicolon for Jupyter
Notebooks.

The motivation behind the change is that semicolons in notebooks are
typically used to hide the output, for example when plotting. This is
highlighted in the linked issue.

The conditions under which the trailing semicolon is preserved are:
1. It must be a top-level statement that is last in the module.
2. For statements, it can be an assignment, annotated assignment, or
augmented assignment whose target is a single identifier; multiple
assignments and tuple unpacking aren't considered.
3. For expression statements, any expression qualifies (see the sketch below).
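
A notebook-style sketch (assuming pandas and matplotlib are available) of the kind of cell this preserves:

```python
import pandas as pd

df = pd.DataFrame({"totals": [1, 2, 3]})
df.plot();  # last top-level statement: the semicolon that hides the output is kept
```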

## Test Plan

Add a new integration test in `ruff_cli`. The test notebook basically
acts as a document as to which trailing semicolons are to be preserved.

fixes: #8254
2023-11-10 21:53:35 +05:30
Andrew Gallant
a7dbe9d670 refine pyupgrade's TimeoutErrorAlias lint (UP041) to remove false positives (#8587)
Previously, this lint had its alias detection logic a little
backwards. That is, for Python 3.11+, it would *only* detect
asyncio.TimeoutError as an alias, but it should have also detected
socket.timeout as an alias. And in Python <3.11, it would falsely
detect asyncio.TimeoutError as an alias where it should have only
detected socket.timeout as an alias.

We fix it so that both asyncio.TimeoutError and socket.timeout are
detected as aliases in Python 3.11+, and only socket.timeout is
detected as an alias in Python 3.10.
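
A sketch of the corrected behavior:

```python
import asyncio
import socket

try:
    raise TimeoutError
except (asyncio.TimeoutError, socket.timeout):  # on Python 3.11+ both are aliases of TimeoutError
    pass

# On Python 3.11+ the whole tuple can be replaced with `TimeoutError`;
# on Python 3.10, only `socket.timeout` is treated as an alias.
```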

Fixes #8565

## Test Plan

I tested this by updating the existing snapshot test which had
erroneously
asserted that socket.timeout should not be replaced with TimeoutError in
Python
3.11+. I also added a new regression test that targets Python 3.10 and
ensures
that the suggestion to replace asyncio.TimeoutError with TimeoutError
does not
occur.
2023-11-10 10:15:33 -05:00
Charlie Marsh
036b6bc0bd Document context manager breaking deviation vs. Black (#8597)
Closes https://github.com/astral-sh/ruff/issues/8180.
Closes https://github.com/astral-sh/ruff/issues/8580.
Closes https://github.com/astral-sh/ruff/issues/7441.
2023-11-10 04:32:29 +00:00
Dhruv Manilawala
d5606b7705 Consider the new f-string tokens for flake8-commas (#8582)
## Summary

This fixes the bug where the `flake8-commas` rules weren't taking the
new f-string tokens into account.

## Test Plan

Add new test cases around f-strings for all of `flake8-commas`'s rules.

fixes: #8556
2023-11-10 09:49:14 +05:30
Charlie Marsh
7968e190dd Write unchanged, excluded files to stdout when read via stdin (#8596)
## Summary

When you run Ruff via stdin, and pass `format` or `check --fix`, we
typically write the changed or unchanged contents to stdout. It turns
out we forgot to do this when the file is _excluded_, so if you run
`ruff format /path/to/excluded/file.py`, we don't write _anything_ to
`stdout`. This led to a bug in the LSP whereby we deleted file contents
for third-party files.

The right thing to do here is write back the unchanged contents, as it
should always be safe to write the output of stdout back to a file.
2023-11-09 23:15:01 -05:00
Charlie Marsh
346a828db2 Add a BindingKind for WithItem variables (#8594) 2023-11-09 22:44:49 -05:00
Charlie Marsh
0ac124acef Make unpacked assignment a flag rather than a BindingKind (#8595)
## Summary

An assignment can be _both_ (e.g.) a loop variable _and_ assigned via
unpacking. In other words, unpacking is a quality of an assignment, not
a _kind_.
2023-11-09 21:41:30 -05:00
Adrian
4ebd0bd31e Support local and dynamic class- and static-method decorators (#8592)
## Summary

This brings ruff's behavior in line with what `pep8-naming` already does
and thus closes #8397.

I had initially implemented this to look at the last segment of a dotted
path only when the entry in the `*-decorators` setting started with a
`.`, but in the end I thought it better to remain consistent with
`pep8-naming` and match against the last segment of the
decorator name in any case.

If you prefer to diverge from this in favor of less ambiguity in the
configuration let me know and I'll change it so you would need to put
e.g. `.expression` in the `classmethod-decorators` list.

## Test Plan

Tested against the file in the issue linked below, plus the new testcase
added in this PR.
2023-11-10 02:04:25 +00:00
Zanie Blue
565ddebb15 Improve detection of TYPE_CHECKING blocks imported from typing_extensions or _typeshed (#8429)
~Improves detection of types imported from `typing_extensions`. Removes
the hard-coded list of supported types in `typing_extensions`; instead
assuming all types could be imported from `typing`, `_typeshed`, or
`typing_extensions`.~

~The typing extensions package appears to re-export types even if they
do not need modification.~


Adds detection of `if typing_extensions.TYPE_CHECKING` blocks. Avoids
inserting a new `if TYPE_CHECKING` block and `from typing import
TYPE_CHECKING` if `typing_extensions.TYPE_CHECKING` is used (closes
https://github.com/astral-sh/ruff/issues/8427)
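
A minimal sketch (assuming `typing_extensions` is installed; the imported module is illustrative):

```python
import typing_extensions

if typing_extensions.TYPE_CHECKING:  # now recognized as a type-checking block
    import collections.abc  # type-only imports can be added here instead of a new block


def head(items: "collections.abc.Sequence[int]") -> int:
    return items[0]
```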

---------

Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
2023-11-09 12:21:03 -06:00
1117 changed files with 75062 additions and 31239 deletions


@@ -1,37 +1,3 @@
[alias]
dev = "run --package ruff_dev --bin ruff_dev"
benchmark = "bench -p ruff_benchmark --bench linter --bench formatter --"
[target.'cfg(all())']
rustflags = [
# CLIPPY LINT SETTINGS
# This is a workaround to configure lints for the entire workspace, pending the ability to configure this via TOML.
# See: `https://github.com/rust-lang/cargo/issues/5034`
# `https://github.com/EmbarkStudios/rust-ecosystem/issues/22#issuecomment-947011395`
"-Dunsafe_code",
"-Wclippy::pedantic",
# Allowed pedantic lints
"-Wclippy::char_lit_as_u8",
"-Aclippy::collapsible_else_if",
"-Aclippy::collapsible_if",
"-Aclippy::implicit_hasher",
"-Aclippy::match_same_arms",
"-Aclippy::missing_errors_doc",
"-Aclippy::missing_panics_doc",
"-Aclippy::module_name_repetitions",
"-Aclippy::must_use_candidate",
"-Aclippy::similar_names",
"-Aclippy::too_many_lines",
# Disallowed restriction lints
"-Wclippy::print_stdout",
"-Wclippy::print_stderr",
"-Wclippy::dbg_macro",
"-Wclippy::empty_drop",
"-Wclippy::empty_structs_with_brackets",
"-Wclippy::exit",
"-Wclippy::get_unwrap",
"-Wclippy::rc_buffer",
"-Wclippy::rc_mutex",
"-Wclippy::rest_pat_in_fully_bound_structs",
"-Wunreachable_pub"
]

.gitattributes

@@ -3,5 +3,8 @@
crates/ruff_linter/resources/test/fixtures/isort/line_ending_crlf.py text eol=crlf
crates/ruff_linter/resources/test/fixtures/pycodestyle/W605_1.py text eol=crlf
crates/ruff_python_formatter/resources/test/fixtures/ruff/docstring_code_examples_crlf.py text eol=crlf
crates/ruff_python_formatter/tests/snapshots/format@docstring_code_examples_crlf.py.snap text eol=crlf
ruff.schema.json linguist-generated=true text=auto eol=lf
*.md.snap linguist-language=Markdown


@@ -23,8 +23,13 @@ jobs:
name: "Determine changes"
runs-on: ubuntu-latest
outputs:
# Flag that is raised when any code that affects linter is changed
linter: ${{ steps.changed.outputs.linter_any_changed }}
# Flag that is raised when any code that affects formatter is changed
formatter: ${{ steps.changed.outputs.formatter_any_changed }}
# Flag that is raised when any code is changed
# This is superset of the linter and formatter
code: ${{ steps.changed.outputs.code_any_changed }}
steps:
- uses: actions/checkout@v4
with:
@@ -44,6 +49,7 @@ jobs:
- "!crates/ruff_shrinking/**"
- scripts/*
- .github/workflows/ci.yaml
- python/**
formatter:
- Cargo.toml
@@ -58,8 +64,15 @@ jobs:
- crates/ruff_python_parser/**
- crates/ruff_dev/**
- scripts/*
- python/**
- .github/workflows/ci.yaml
code:
- "*/**"
- "!**/*.md"
- "!docs/**"
- "!assets/**"
cargo-fmt:
name: "cargo fmt"
runs-on: ubuntu-latest
@@ -72,6 +85,8 @@ jobs:
cargo-clippy:
name: "cargo clippy"
runs-on: ubuntu-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
steps:
- uses: actions/checkout@v4
- name: "Install Rust toolchain"
@@ -86,6 +101,8 @@ jobs:
cargo-test-linux:
runs-on: ubuntu-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
name: "cargo test (linux)"
steps:
- uses: actions/checkout@v4
@@ -110,6 +127,8 @@ jobs:
cargo-test-windows:
runs-on: windows-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
name: "cargo test (windows)"
steps:
- uses: actions/checkout@v4
@@ -127,6 +146,8 @@ jobs:
cargo-test-wasm:
runs-on: ubuntu-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
name: "cargo test (wasm)"
steps:
- uses: actions/checkout@v4
@@ -146,6 +167,8 @@ jobs:
cargo-fuzz:
runs-on: ubuntu-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
name: "cargo fuzz"
steps:
- uses: actions/checkout@v4
@@ -163,6 +186,8 @@ jobs:
scripts:
name: "test scripts"
runs-on: ubuntu-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
steps:
- uses: actions/checkout@v4
- name: "Install Rust toolchain"
@@ -186,8 +211,7 @@ jobs:
# Only runs on pull requests, since that is the only way we can find the base version for comparison.
# Ecosystem check needs linter and/or formatter changes.
if: github.event_name == 'pull_request' && ${{
needs.determine_changes.outputs.linter == 'true' ||
needs.determine_changes.outputs.formatter == 'true'
needs.determine_changes.outputs.code == 'true'
}}
steps:
- uses: actions/checkout@v4
@@ -296,6 +320,8 @@ jobs:
cargo-udeps:
name: "cargo udeps"
runs-on: ubuntu-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
steps:
- uses: actions/checkout@v4
- name: "Install nightly Rust toolchain"
@@ -392,7 +418,7 @@ jobs:
run: mkdocs build --strict -f mkdocs.insiders.yml
- name: "Build docs"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS != 'true' }}
run: mkdocs build --strict -f mkdocs.generated.yml
run: mkdocs build --strict -f mkdocs.public.yml
check-formatter-instability-and-black-similarity:
name: "formatter instabilities and black similarity"
@@ -415,7 +441,10 @@ jobs:
check-ruff-lsp:
name: "test ruff-lsp"
runs-on: ubuntu-latest
needs: cargo-test-linux
needs:
- cargo-test-linux
- determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
steps:
- uses: extractions/setup-just@v1
env:
@@ -453,6 +482,8 @@ jobs:
benchmarks:
runs-on: ubuntu-latest
needs: determine_changes
if: needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main'
steps:
- name: "Checkout Branch"
uses: actions/checkout@v4

View File

@@ -44,7 +44,7 @@ jobs:
run: mkdocs build --strict -f mkdocs.insiders.yml
- name: "Build docs"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS != 'true' }}
run: mkdocs build --strict -f mkdocs.generated.yml
run: mkdocs build --strict -f mkdocs.public.yml
- name: "Deploy to Cloudflare Pages"
if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
uses: cloudflare/wrangler-action@v3.3.2

View File

@@ -516,6 +516,62 @@ jobs:
files: binaries/*
tag_name: v${{ inputs.tag }}
docker-publish:
# This action doesn't need to wait on any other task, it's easy to re-tag if something failed and we're validating
# the tag here also
name: Push Docker image ghcr.io/astral-sh/ruff
runs-on: ubuntu-latest
environment:
name: release
permissions:
# For the docker push
packages: write
steps:
- uses: actions/checkout@v4
with:
ref: ${{ inputs.sha }}
- uses: docker/setup-buildx-action@v3
- uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: ghcr.io/astral-sh/ruff
- name: Check tag consistency
# Unlike validate-tag we don't check if the commit is on the main branch, but it seems good enough since we can
# change docker tags
if: ${{ inputs.tag }}
run: |
version=$(grep "version = " pyproject.toml | sed -e 's/version = "\(.*\)"/\1/g')
if [ "${{ inputs.tag }}" != "${version}" ]; then
echo "The input tag does not match the version from pyproject.toml:" >&2
echo "${{ inputs.tag }}" >&2
echo "${version}" >&2
exit 1
else
echo "Releasing ${version}"
fi
- name: "Build and push Docker image"
uses: docker/build-push-action@v5
with:
context: .
platforms: linux/amd64,linux/arm64
# Reuse the builder
cache-from: type=gha
cache-to: type=gha,mode=max
push: ${{ inputs.tag != '' }}
tags: ghcr.io/astral-sh/ruff:latest,ghcr.io/astral-sh/ruff:${{ inputs.tag || 'dry-run' }}
labels: ${{ steps.meta.outputs.labels }}
# After the release has been published, we update downstream repositories
# This is separate because if this fails the release is still fine, we just need to do some manual workflow triggers
update-dependents:
@@ -524,7 +580,7 @@ jobs:
needs: publish-release
steps:
- name: "Update pre-commit mirror"
uses: actions/github-script@v6
uses: actions/github-script@v7
with:
github-token: ${{ secrets.RUFF_PRE_COMMIT_PAT }}
script: |

View File

@@ -1,5 +1,78 @@
# Changelog
## 0.1.6
### Preview features
- \[`flake8-boolean-trap`\] Extend `boolean-type-hint-positional-argument` (`FBT001`) to include booleans in unions ([#7501](https://github.com/astral-sh/ruff/pull/7501))
- \[`flake8-pie`\] Extend `reimplemented-list-builtin` (`PIE807`) to `dict` reimplementations ([#8608](https://github.com/astral-sh/ruff/pull/8608))
- \[`flake8-pie`\] Extend `unnecessary-pass` (`PIE790`) to include ellipses (`...`) ([#8641](https://github.com/astral-sh/ruff/pull/8641))
- \[`flake8-pie`\] Implement fix for `unnecessary-spread` (`PIE800`) ([#8668](https://github.com/astral-sh/ruff/pull/8668))
- \[`flake8-quotes`\] Implement `unnecessary-escaped-quote` (`Q004`) ([#8630](https://github.com/astral-sh/ruff/pull/8630))
- \[`pycodestyle`\] Implement fix for `multiple-spaces-after-keyword` (`E271`) and `multiple-spaces-before-keyword` (`E272`) ([#8622](https://github.com/astral-sh/ruff/pull/8622))
- \[`pycodestyle`\] Implement fix for `multiple-spaces-after-operator` (`E222`) and `multiple-spaces-before-operator` (`E221`) ([#8623](https://github.com/astral-sh/ruff/pull/8623))
- \[`pyflakes`\] Extend `is-literal` (`F632`) to include comparisons against mutable initializers ([#8607](https://github.com/astral-sh/ruff/pull/8607))
- \[`pylint`\] Implement `redefined-argument-from-local` (`PLR1704`) ([#8159](https://github.com/astral-sh/ruff/pull/8159))
- \[`pylint`\] Implement fix for `unnecessary-lambda` (`PLW0108`) ([#8621](https://github.com/astral-sh/ruff/pull/8621))
- \[`refurb`\] Implement `if-expr-min-max` (`FURB136`) ([#8664](https://github.com/astral-sh/ruff/pull/8664))
- \[`refurb`\] Implement `math-constant` (`FURB152`) ([#8727](https://github.com/astral-sh/ruff/pull/8727))
### Rule changes
- \[`flake8-annotations`\] Add autotyping-like return type inference for annotation rules ([#8643](https://github.com/astral-sh/ruff/pull/8643))
- \[`flake8-future-annotations`\] Implement fix for `future-required-type-annotation` (`FA102`) ([#8711](https://github.com/astral-sh/ruff/pull/8711))
- \[`flake8-implicit-namespace-package`\] Avoid missing namespace violations in scripts with shebangs ([#8710](https://github.com/astral-sh/ruff/pull/8710))
- \[`pydocstyle`\] Update `over-indentation` (`D208`) to preserve indentation offsets when fixing overindented lines ([#8699](https://github.com/astral-sh/ruff/pull/8699))
- \[`pyupgrade`\] Refine `timeout-error-alias` (`UP041`) to remove false positives ([#8587](https://github.com/astral-sh/ruff/pull/8587))
### Formatter
- Fix instability in `await` formatting with fluent style ([#8676](https://github.com/astral-sh/ruff/pull/8676))
- Compare formatted and unformatted ASTs during formatter tests ([#8624](https://github.com/astral-sh/ruff/pull/8624))
- Preserve trailing semicolon for Notebooks ([#8590](https://github.com/astral-sh/ruff/pull/8590))
### CLI
- Improve debug printing for resolving origin of config settings ([#8729](https://github.com/astral-sh/ruff/pull/8729))
- Write unchanged, excluded files to stdout when read via stdin ([#8596](https://github.com/astral-sh/ruff/pull/8596))
### Configuration
- \[`isort`\] Support disabling sections with `no-sections = true` ([#8657](https://github.com/astral-sh/ruff/pull/8657))
- \[`pep8-naming`\] Support local and dynamic class- and static-method decorators ([#8592](https://github.com/astral-sh/ruff/pull/8592))
- \[`pydocstyle`\] Allow overriding pydocstyle convention rules ([#8586](https://github.com/astral-sh/ruff/pull/8586))
### Bug fixes
- Avoid syntax error when importing `trio.lowlevel` ([#8730](https://github.com/astral-sh/ruff/pull/8730))
- Omit unrolled augmented assignments in `PIE794` ([#8634](https://github.com/astral-sh/ruff/pull/8634))
- Slice source code instead of generating it for `EM` fixes ([#7746](https://github.com/astral-sh/ruff/pull/7746))
- Allow whitespace around colon in slices for `whitespace-before-punctuation` (`E203`) ([#8654](https://github.com/astral-sh/ruff/pull/8654))
- Use function range for `no-self-use` ([#8637](https://github.com/astral-sh/ruff/pull/8637))
- F-strings don't contain bytes literals for `PLW0129` ([#8675](https://github.com/astral-sh/ruff/pull/8675))
- Improve detection of `TYPE_CHECKING` blocks imported from `typing_extensions` or `_typeshed` ([#8429](https://github.com/astral-sh/ruff/pull/8429))
- Treat display as a builtin in IPython ([#8707](https://github.com/astral-sh/ruff/pull/8707))
- Avoid `FURB113` autofix if comments are present ([#8494](https://github.com/astral-sh/ruff/pull/8494))
- Consider the new f-string tokens for `flake8-commas` ([#8582](https://github.com/astral-sh/ruff/pull/8582))
- Remove erroneous bad-dunder-name reference ([#8742](https://github.com/astral-sh/ruff/pull/8742))
- Avoid recommending Self usages in metaclasses ([#8639](https://github.com/astral-sh/ruff/pull/8639))
- Detect runtime-evaluated base classes defined in the current file ([#8572](https://github.com/astral-sh/ruff/pull/8572))
- Avoid inserting trailing commas within f-strings ([#8574](https://github.com/astral-sh/ruff/pull/8574))
- Remove incorrect deprecation label for stdout and stderr ([#8743](https://github.com/astral-sh/ruff/pull/8743))
- Fix unnecessary parentheses in UP007 fix ([#8610](https://github.com/astral-sh/ruff/pull/8610))
- Remove repeated and erroneous scoped settings headers in docs ([#8670](https://github.com/astral-sh/ruff/pull/8670))
- Trim trailing empty strings when converting to f-strings ([#8712](https://github.com/astral-sh/ruff/pull/8712))
- Fix ordering for `force-sort-within-sections` ([#8665](https://github.com/astral-sh/ruff/pull/8665))
- Run unicode prefix rule over tokens ([#8709](https://github.com/astral-sh/ruff/pull/8709))
- Update UP032 to unescape curly braces in literal parts of converted strings ([#8697](https://github.com/astral-sh/ruff/pull/8697))
- List all IPython builtins ([#8719](https://github.com/astral-sh/ruff/pull/8719))
### Documentation
- Document conventions in the FAQ ([#8638](https://github.com/astral-sh/ruff/pull/8638))
- Redirect from rule codes to rule pages in docs ([#8636](https://github.com/astral-sh/ruff/pull/8636))
- Fix permalink to convention setting ([#8575](https://github.com/astral-sh/ruff/pull/8575))
## 0.1.5
### Preview features

View File

@@ -295,7 +295,7 @@ To preview any changes to the documentation locally:
```shell
# For contributors.
mkdocs serve -f mkdocs.generated.yml
mkdocs serve -f mkdocs.public.yml
# For members of the Astral org, which has access to MkDocs Insiders via sponsorship.
mkdocs serve -f mkdocs.insiders.yml

178
Cargo.lock generated
View File

@@ -64,9 +64,9 @@ checksum = "c7021ce4924a3f25f802b2cccd1af585e39ea1a363a1aa2e72afe54b67a3a7a7"
[[package]]
name = "annotate-snippets"
version = "0.9.1"
version = "0.9.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c3b9d411ecbaf79885c6df4d75fff75858d5995ff25385657a28af47e82f9c36"
checksum = "ccaf7e9dfbb6ab22c82e473cd1a8a7bd313c19a5b7e40970f3d89ef5a5c9e81e"
dependencies = [
"unicode-width",
"yansi-term",
@@ -278,9 +278,7 @@ checksum = "7f2c685bad3eb3d45a01354cedb7d5faa66194d1d58ba6e267a8de788f79db38"
dependencies = [
"android-tzdata",
"iana-time-zone",
"js-sys",
"num-traits",
"wasm-bindgen",
"windows-targets 0.48.5",
]
@@ -407,9 +405,9 @@ dependencies = [
[[package]]
name = "codspeed"
version = "2.3.1"
version = "2.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "918b13a0f1a32460ab3bd5debd56b5a27a7071fa5ff5dfeb3a5cf291a85b174b"
checksum = "0eb4ab4dcb6554eb4f590fb16f99d3b102ab76f5f56554c9a5340518b32c499b"
dependencies = [
"colored",
"libc",
@@ -418,9 +416,9 @@ dependencies = [
[[package]]
name = "codspeed-criterion-compat"
version = "2.3.1"
version = "2.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c683c7fef2b873fbbdf4062782914c652309951244bf0bd362fe608b7d6f901c"
checksum = "cc07a3d3f7e0c8961d0ffdee149d39b231bafdcdc3d978dc5ad790c615f55f3f"
dependencies = [
"codspeed",
"colored",
@@ -446,9 +444,9 @@ dependencies = [
[[package]]
name = "configparser"
version = "3.0.2"
version = "3.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5458d9d1a587efaf5091602c59d299696a3877a439c8f6d461a2d3cce11df87a"
checksum = "e0e56e414a2a52ab2a104f85cd40933c2fbc278b83637facf646ecf451b49237"
[[package]]
name = "console"
@@ -810,7 +808,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
[[package]]
name = "flake8-to-ruff"
version = "0.1.5"
version = "0.1.6"
dependencies = [
"anyhow",
"clap",
@@ -829,7 +827,7 @@ dependencies = [
"serde_json",
"strum",
"strum_macros",
"toml",
"toml 0.7.8",
]
[[package]]
@@ -859,9 +857,12 @@ dependencies = [
[[package]]
name = "fs-err"
version = "2.9.0"
version = "2.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0845fa252299212f0389d64ba26f34fa32cfe41588355f21ed507c59a0f64541"
checksum = "fb5fd9bcbe8b1087cbd395b51498c01bc997cef73e778a80b77a811af5e2d29f"
dependencies = [
"autocfg",
]
[[package]]
name = "fsevent-sys"
@@ -902,15 +903,15 @@ checksum = "d2fabcfbdc87f4758337ca535fb41a6d701b65693ce38287d856d1674551ec9b"
[[package]]
name = "globset"
version = "0.4.13"
version = "0.4.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "759c97c1e17c55525b57192c06a267cda0ac5210b222d6b82189a2338fa1c13d"
checksum = "57da3b9b5b85bd66f31093f8c408b90a74431672542466497dcbdfdc02034be1"
dependencies = [
"aho-corasick",
"bstr",
"fnv",
"log",
"regex",
"regex-automata 0.4.3",
"regex-syntax 0.8.2",
]
[[package]]
@@ -927,9 +928,9 @@ checksum = "8a9ee70c43aaf417c914396645a0fa852624801b24ebb7ae78fe8272889ac888"
[[package]]
name = "hashbrown"
version = "0.14.0"
version = "0.14.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2c6201b9ff9fd90a5a3bac2e56a830d0caa509576f0e503818ee82c181b3437a"
checksum = "f93e7192158dbcda357bdec5fb5788eebf8bbac027f3f33e719d29135ae84156"
[[package]]
name = "heck"
@@ -1033,12 +1034,12 @@ dependencies = [
[[package]]
name = "indexmap"
version = "2.0.0"
version = "2.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5477fe2230a79769d8dc68e0eabf5437907c0457a5614a9e8dddb67f65eb65d"
checksum = "d530e1a18b1cb4c484e6e34556a0d948706958449fca0cab753d649f2bce3d1f"
dependencies = [
"equivalent",
"hashbrown 0.14.0",
"hashbrown 0.14.2",
"serde",
]
@@ -1169,9 +1170,9 @@ checksum = "af150ab688ff2122fcef229be89cb50dd66af9e01a4ff320cc137eecc9bacc38"
[[package]]
name = "js-sys"
version = "0.3.64"
version = "0.3.65"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c5f195fe497f702db0f318b07fdd68edb16955aed830df8363d837542f8f935a"
checksum = "54c0c35952f67de54bb584e9fd912b3023117cbafc0a77d8f3dee1fb5f572fe8"
dependencies = [
"wasm-bindgen",
]
@@ -1792,45 +1793,46 @@ dependencies = [
[[package]]
name = "proc-macro2"
version = "1.0.69"
version = "1.0.70"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "134c189feb4956b20f6f547d2cf727d4c0fe06722b20a0eec87ed445a97f92da"
checksum = "39278fbbf5fb4f646ce651690877f89d1c5811a3d4acb27700c1cb3cdb78fd3b"
dependencies = [
"unicode-ident",
]
[[package]]
name = "pyproject-toml"
version = "0.8.0"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0774c13ff0b8b7ebb4791c050c497aefcfe3f6a222c0829c7017161ed38391ff"
checksum = "46d4a5e69187f23a29f8aa0ea57491d104ba541bc55f76552c2a74962aa20e04"
dependencies = [
"indexmap",
"pep440_rs",
"pep508_rs",
"serde",
"toml",
"toml 0.8.2",
]
[[package]]
name = "quick-junit"
version = "0.3.3"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6bf780b59d590c25f8c59b44c124166a2a93587868b619fb8f5b47fb15e9ed6d"
checksum = "1b9599bffc2cd7511355996e0cfd979266b2cfa3f3ff5247d07a3a6e1ded6158"
dependencies = [
"chrono",
"indexmap",
"nextest-workspace-hack",
"quick-xml",
"strip-ansi-escapes",
"thiserror",
"uuid",
]
[[package]]
name = "quick-xml"
version = "0.29.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "81b9228215d82c7b61490fec1de287136b5de6f5700f6e58ea9ad61a7964ca51"
checksum = "1004a344b30a54e2ee58d66a71b32d2db2feb0a31f9a2d302bf0536f15de2a33"
dependencies = [
"memchr",
]
@@ -2060,9 +2062,9 @@ dependencies = [
[[package]]
name = "ruff_cli"
version = "0.1.5"
version = "0.1.6"
dependencies = [
"annotate-snippets 0.9.1",
"annotate-snippets 0.9.2",
"anyhow",
"argfile",
"assert_cmd",
@@ -2152,7 +2154,7 @@ dependencies = [
"strum",
"strum_macros",
"tempfile",
"toml",
"toml 0.7.8",
"tracing",
"tracing-indicatif",
"tracing-subscriber",
@@ -2196,10 +2198,10 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.1.5"
version = "0.1.6"
dependencies = [
"aho-corasick",
"annotate-snippets 0.9.1",
"annotate-snippets 0.9.2",
"anyhow",
"bitflags 2.4.1",
"chrono",
@@ -2252,7 +2254,7 @@ dependencies = [
"tempfile",
"test-case",
"thiserror",
"toml",
"toml 0.7.8",
"typed-arena",
"unicode-width",
"unicode_names2",
@@ -2332,6 +2334,7 @@ dependencies = [
"itertools 0.11.0",
"memchr",
"once_cell",
"regex",
"ruff_cache",
"ruff_formatter",
"ruff_macros",
@@ -2447,7 +2450,7 @@ dependencies = [
[[package]]
name = "ruff_shrinking"
version = "0.1.5"
version = "0.1.6"
dependencies = [
"anyhow",
"clap",
@@ -2538,7 +2541,7 @@ dependencies = [
"shellexpand",
"strum",
"tempfile",
"toml",
"toml 0.7.8",
]
[[package]]
@@ -2802,9 +2805,9 @@ checksum = "38b58827f4464d87d377d175e90bf58eb00fd8716ff0a62f80356b5e61555d0d"
[[package]]
name = "smallvec"
version = "1.11.1"
version = "1.11.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "942b4a808e05215192e39f4ab80813e599068285906cc91aa64f923db842bd5a"
checksum = "4dccd0940a2dcdf68d092b8cbab7dc0ad8fa938bf95787e1b916b0e3d0e8e970"
[[package]]
name = "spin"
@@ -2831,6 +2834,15 @@ dependencies = [
"precomputed-hash",
]
[[package]]
name = "strip-ansi-escapes"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "55ff8ef943b384c414f54aefa961dd2bd853add74ec75e7ac74cf91dba62bcfa"
dependencies = [
"vte",
]
[[package]]
name = "strsim"
version = "0.10.0"
@@ -3086,7 +3098,19 @@ dependencies = [
"serde",
"serde_spanned",
"toml_datetime",
"toml_edit",
"toml_edit 0.19.15",
]
[[package]]
name = "toml"
version = "0.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "185d8ab0dfbb35cf1399a6344d8484209c088f75f8f68230da55d48d95d43e3d"
dependencies = [
"serde",
"serde_spanned",
"toml_datetime",
"toml_edit 0.20.2",
]
[[package]]
@@ -3111,6 +3135,19 @@ dependencies = [
"winnow",
]
[[package]]
name = "toml_edit"
version = "0.20.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "396e4d48bbb2b7554c944bde63101b5ae446cff6ec4a24227428f15eb72ef338"
dependencies = [
"indexmap",
"serde",
"serde_spanned",
"toml_datetime",
"winnow",
]
[[package]]
name = "tracing"
version = "0.1.40"
@@ -3158,20 +3195,20 @@ dependencies = [
[[package]]
name = "tracing-log"
version = "0.1.3"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "78ddad33d2d10b1ed7eb9d1f518a5674713876e97e5bb9b7345a7984fbb4f922"
checksum = "ee855f1f400bd0e5c02d150ae5de3840039a3f54b025156404e34c23c03f47c3"
dependencies = [
"lazy_static",
"log",
"once_cell",
"tracing-core",
]
[[package]]
name = "tracing-subscriber"
version = "0.3.17"
version = "0.3.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "30a651bc37f915e81f087d86e62a18eec5f79550c7faff886f7090b4ea757c77"
checksum = "ad0f048c97dbd9faa9b7df56362b8ebcaa52adb06b498c050d2f4e32f90a7a8b"
dependencies = [
"matchers",
"nu-ansi-term",
@@ -3331,9 +3368,9 @@ checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]]
name = "uuid"
version = "1.5.0"
version = "1.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "88ad59a7560b41a70d191093a945f0b87bc1deeda46fb237479708a1d6b6cdfc"
checksum = "5e395fcf16a7a3d8127ec99782007af141946b4795001f876d54fb0d55978560"
dependencies = [
"getrandom",
"rand",
@@ -3343,9 +3380,9 @@ dependencies = [
[[package]]
name = "uuid-macro-internal"
version = "1.5.0"
version = "1.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3d8c6bba9b149ee82950daefc9623b32bb1dacbfb1890e352f6b887bd582adaf"
checksum = "f49e7f3f3db8040a100710a11932239fd30697115e2ba4107080d8252939845e"
dependencies = [
"proc-macro2",
"quote",
@@ -3424,9 +3461,9 @@ checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423"
[[package]]
name = "wasm-bindgen"
version = "0.2.87"
version = "0.2.88"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7706a72ab36d8cb1f80ffbf0e071533974a60d0a308d01a5d0375bf60499a342"
checksum = "7daec296f25a1bae309c0cd5c29c4b260e510e6d813c286b19eaadf409d40fce"
dependencies = [
"cfg-if",
"wasm-bindgen-macro",
@@ -3434,9 +3471,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-backend"
version = "0.2.87"
version = "0.2.88"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5ef2b6d3c510e9625e5fe6f509ab07d66a760f0885d858736483c32ed7809abd"
checksum = "e397f4664c0e4e428e8313a469aaa58310d302159845980fd23b0f22a847f217"
dependencies = [
"bumpalo",
"log",
@@ -3449,9 +3486,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-futures"
version = "0.4.37"
version = "0.4.38"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c02dbc21516f9f1f04f187958890d7e6026df8d16540b7ad9492bc34a67cea03"
checksum = "9afec9963e3d0994cac82455b2b3502b81a7f40f9a0d32181f7528d9f4b43e02"
dependencies = [
"cfg-if",
"js-sys",
@@ -3461,9 +3498,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro"
version = "0.2.87"
version = "0.2.88"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dee495e55982a3bd48105a7b947fd2a9b4a8ae3010041b9e0faab3f9cd028f1d"
checksum = "5961017b3b08ad5f3fe39f1e79877f8ee7c23c5e5fd5eb80de95abc41f1f16b2"
dependencies = [
"quote",
"wasm-bindgen-macro-support",
@@ -3471,9 +3508,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro-support"
version = "0.2.87"
version = "0.2.88"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "54681b18a46765f095758388f2d0cf16eb8d4169b639ab575a8f5693af210c7b"
checksum = "c5353b8dab669f5e10f5bd76df26a9360c748f054f862ff5f3f8aae0c7fb3907"
dependencies = [
"proc-macro2",
"quote",
@@ -3484,15 +3521,15 @@ dependencies = [
[[package]]
name = "wasm-bindgen-shared"
version = "0.2.87"
version = "0.2.88"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ca6ad05a4870b2bf5fe995117d3728437bd27d7cd5f06f13c17443ef369775a1"
checksum = "0d046c5d029ba91a1ed14da14dca44b68bf2f124cfbaf741c54151fdb3e0750b"
[[package]]
name = "wasm-bindgen-test"
version = "0.3.37"
version = "0.3.38"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6e6e302a7ea94f83a6d09e78e7dc7d9ca7b186bc2829c24a22d0753efd680671"
checksum = "c6433b7c56db97397842c46b67e11873eda263170afeb3a2dc74a7cb370fee0d"
dependencies = [
"console_error_panic_hook",
"js-sys",
@@ -3504,12 +3541,13 @@ dependencies = [
[[package]]
name = "wasm-bindgen-test-macro"
version = "0.3.37"
version = "0.3.38"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ecb993dd8c836930ed130e020e77d9b2e65dd0fbab1b67c790b0f5d80b11a575"
checksum = "493fcbab756bb764fa37e6bee8cec2dd709eb4273d06d0c282a5e74275ded735"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.39",
]
[[package]]

View File

@@ -19,7 +19,7 @@ clap = { version = "4.4.7", features = ["derive"] }
colored = { version = "2.0.0" }
filetime = { version = "0.2.20" }
glob = { version = "0.3.1" }
globset = { version = "0.4.10" }
globset = { version = "0.4.14" }
ignore = { version = "0.4.20" }
insta = { version = "1.34.0", feature = ["filters", "glob"] }
is-macro = { version = "0.3.0" }
@@ -29,7 +29,7 @@ log = { version = "0.4.17" }
memchr = { version = "2.6.4" }
once_cell = { version = "1.17.1" }
path-absolutize = { version = "3.1.1" }
proc-macro2 = { version = "1.0.69" }
proc-macro2 = { version = "1.0.70" }
quote = { version = "1.0.23" }
regex = { version = "1.10.2" }
rustc-hash = { version = "1.1.0" }
@@ -38,7 +38,7 @@ serde = { version = "1.0.190", features = ["derive"] }
serde_json = { version = "1.0.108" }
shellexpand = { version = "3.0.0" }
similar = { version = "2.3.0", features = ["inline"] }
smallvec = { version = "1.11.1" }
smallvec = { version = "1.11.2" }
static_assertions = "1.1.0"
strum = { version = "0.25.0", features = ["strum_macros"] }
strum_macros = { version = "0.25.3" }
@@ -48,13 +48,45 @@ thiserror = { version = "1.0.50" }
toml = { version = "0.7.8" }
tracing = { version = "0.1.40" }
tracing-indicatif = { version = "0.3.4" }
tracing-subscriber = { version = "0.3.17", features = ["env-filter"] }
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }
unicode-ident = { version = "1.0.12" }
unicode_names2 = { version = "1.2.0" }
unicode-width = { version = "0.1.11" }
uuid = { version = "1.5.0", features = ["v4", "fast-rng", "macro-diagnostics", "js"] }
uuid = { version = "1.6.1", features = ["v4", "fast-rng", "macro-diagnostics", "js"] }
wsl = { version = "0.1.0" }
[workspace.lints.rust]
unsafe_code = "warn"
unreachable_pub = "warn"
[workspace.lints.clippy]
pedantic = { level = "warn", priority = -2 }
# Allowed pedantic lints
char_lit_as_u8 = "allow"
collapsible_else_if = "allow"
collapsible_if = "allow"
implicit_hasher = "allow"
match_same_arms = "allow"
missing_errors_doc = "allow"
missing_panics_doc = "allow"
module_name_repetitions = "allow"
must_use_candidate = "allow"
similar_names = "allow"
too_many_lines = "allow"
# To allow `#[allow(clippy::all)]` in `crates/ruff_python_parser/src/python.rs`.
needless_raw_string_hashes = "allow"
# Disallowed restriction lints
print_stdout = "warn"
print_stderr = "warn"
dbg_macro = "warn"
empty_drop = "warn"
empty_structs_with_brackets = "warn"
exit = "warn"
get_unwrap = "warn"
rc_buffer = "warn"
rc_mutex = "warn"
rest_pat_in_fully_bound_structs = "warn"
[profile.release]
lto = "fat"
codegen-units = 1

38
Dockerfile Normal file
View File

@@ -0,0 +1,38 @@
FROM --platform=$BUILDPLATFORM ubuntu as build
ENV HOME="/root"
WORKDIR $HOME
RUN apt update && apt install -y build-essential curl python3-venv
# Setup zig as cross compiling linker
RUN python3 -m venv $HOME/.venv
RUN .venv/bin/pip install cargo-zigbuild
ENV PATH="$HOME/.venv/bin:$PATH"
# Install rust
ARG TARGETPLATFORM
RUN case "$TARGETPLATFORM" in \
"linux/arm64") echo "aarch64-unknown-linux-musl" > rust_target.txt ;; \
"linux/amd64") echo "x86_64-unknown-linux-musl" > rust_target.txt ;; \
*) exit 1 ;; \
esac
# Update rustup whenever we bump the rust version
COPY rust-toolchain.toml rust-toolchain.toml
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --target $(cat rust_target.txt) --profile minimal --default-toolchain none
ENV PATH="$HOME/.cargo/bin:$PATH"
# Installs the correct toolchain version from rust-toolchain.toml and then the musl target
RUN rustup target add $(cat rust_target.txt)
# Build
COPY crates crates
COPY Cargo.toml Cargo.toml
COPY Cargo.lock Cargo.lock
RUN cargo zigbuild --bin ruff --target $(cat rust_target.txt) --release
RUN cp target/$(cat rust_target.txt)/release/ruff /ruff
# TODO: Optimize binary size, with a version that also works when cross compiling
# RUN strip --strip-all /ruff
FROM scratch
COPY --from=build /ruff /ruff
WORKDIR /io
ENTRYPOINT ["/ruff"]

View File

@@ -54,7 +54,7 @@ Ruff is extremely actively developed and used in major open-source projects like
- [Pandas](https://github.com/pandas-dev/pandas)
- [SciPy](https://github.com/scipy/scipy)
...and many more.
...and [many more](#whos-using-ruff).
Ruff is backed by [Astral](https://astral.sh). Read the [launch post](https://astral.sh/blog/announcing-astral-the-company-behind-ruff),
or the original [project announcement](https://notes.crmarsh.com/python-tooling-could-be-much-much-faster).
@@ -150,7 +150,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.1.5
rev: v0.1.6
hooks:
# Run the linter.
- id: ruff
@@ -377,8 +377,8 @@ Ruff is used by a number of major open-source projects and companies, including:
- Anthropic ([Python SDK](https://github.com/anthropics/anthropic-sdk-python))
- [Apache Airflow](https://github.com/apache/airflow)
- AstraZeneca ([Magnus](https://github.com/AstraZeneca/magnus-core))
- Benchling ([Refac](https://github.com/benchling/refac))
- [Babel](https://github.com/python-babel/babel)
- Benchling ([Refac](https://github.com/benchling/refac))
- [Bokeh](https://github.com/bokeh/bokeh)
- [Cryptography (PyCA)](https://github.com/pyca/cryptography)
- [DVC](https://github.com/iterative/dvc)
@@ -389,15 +389,16 @@ Ruff is used by a number of major open-source projects and companies, including:
- [Gradio](https://github.com/gradio-app/gradio)
- [Great Expectations](https://github.com/great-expectations/great_expectations)
- [HTTPX](https://github.com/encode/httpx)
- [Hatch](https://github.com/pypa/hatch)
- [Home Assistant](https://github.com/home-assistant/core)
- Hugging Face ([Transformers](https://github.com/huggingface/transformers),
[Datasets](https://github.com/huggingface/datasets),
[Diffusers](https://github.com/huggingface/diffusers))
- [Hatch](https://github.com/pypa/hatch)
- [Home Assistant](https://github.com/home-assistant/core)
- ING Bank ([popmon](https://github.com/ing-bank/popmon), [probatus](https://github.com/ing-bank/probatus))
- [Ibis](https://github.com/ibis-project/ibis)
- [Jupyter](https://github.com/jupyter-server/jupyter_server)
- [LangChain](https://github.com/hwchase17/langchain)
- [Litestar](https://litestar.dev/)
- [LlamaIndex](https://github.com/jerryjliu/llama_index)
- Matrix ([Synapse](https://github.com/matrix-org/synapse))
- [MegaLinter](https://github.com/oxsecurity/megalinter)
@@ -422,20 +423,21 @@ Ruff is used by a number of major open-source projects and companies, including:
- [PostHog](https://github.com/PostHog/posthog)
- Prefect ([Python SDK](https://github.com/PrefectHQ/prefect), [Marvin](https://github.com/PrefectHQ/marvin))
- [PyInstaller](https://github.com/pyinstaller/pyinstaller)
- [PyMC-Marketing](https://github.com/pymc-labs/pymc-marketing)
- [PyTorch](https://github.com/pytorch/pytorch)
- [Pydantic](https://github.com/pydantic/pydantic)
- [Pylint](https://github.com/PyCQA/pylint)
- [PyMC-Marketing](https://github.com/pymc-labs/pymc-marketing)
- [Reflex](https://github.com/reflex-dev/reflex)
- [River](https://github.com/online-ml/river)
- [Rippling](https://rippling.com)
- [Robyn](https://github.com/sansyrox/robyn)
- Scale AI ([Launch SDK](https://github.com/scaleapi/launch-python-client))
- Snowflake ([SnowCLI](https://github.com/Snowflake-Labs/snowcli))
- [Saleor](https://github.com/saleor/saleor)
- Scale AI ([Launch SDK](https://github.com/scaleapi/launch-python-client))
- [SciPy](https://github.com/scipy/scipy)
- Snowflake ([SnowCLI](https://github.com/Snowflake-Labs/snowcli))
- [Sphinx](https://github.com/sphinx-doc/sphinx)
- [Stable Baselines3](https://github.com/DLR-RM/stable-baselines3)
- [Litestar](https://litestar.dev/)
- [Starlette](https://github.com/encode/starlette)
- [The Algorithms](https://github.com/TheAlgorithms/Python)
- [Vega-Altair](https://github.com/altair-viz/altair)
- WordPress ([Openverse](https://github.com/WordPress/openverse))

View File

@@ -1,6 +1,6 @@
[package]
name = "flake8-to-ruff"
version = "0.1.5"
version = "0.1.6"
description = """
Convert Flake8 configuration files to Ruff configuration files.
"""
@@ -19,7 +19,7 @@ ruff_workspace = { path = "../ruff_workspace" }
anyhow = { workspace = true }
clap = { workspace = true }
colored = { workspace = true }
configparser = { version = "3.0.2" }
configparser = { version = "3.0.3" }
itertools = { workspace = true }
log = { workspace = true }
once_cell = { workspace = true }
@@ -34,3 +34,6 @@ toml = { workspace = true }
[dev-dependencies]
pretty_assertions = "1.3.0"
[lints]
workspace = true

View File

@@ -37,7 +37,7 @@ serde_json.workspace = true
url = "2.3.1"
ureq = "2.8.0"
criterion = { version = "0.5.1", default-features = false }
codspeed-criterion-compat = { version="2.3.1", default-features = false, optional = true}
codspeed-criterion-compat = { version="2.3.3", default-features = false, optional = true}
[dev-dependencies]
ruff_linter.path = "../ruff_linter"
@@ -46,6 +46,9 @@ ruff_python_formatter = { path = "../ruff_python_formatter" }
ruff_python_index = { path = "../ruff_python_index" }
ruff_python_parser = { path = "../ruff_python_parser" }
[lints]
workspace = true
[features]
codspeed = ["codspeed-criterion-compat"]

View File

@@ -3,7 +3,9 @@ use ruff_benchmark::criterion::{
};
use ruff_benchmark::{TestCase, TestFile, TestFileDownloadError};
use ruff_linter::linter::lint_only;
use ruff_linter::rule_selector::PreviewOptions;
use ruff_linter::settings::rule_table::RuleTable;
use ruff_linter::settings::types::PreviewMode;
use ruff_linter::settings::{flags, LinterSettings};
use ruff_linter::source_kind::SourceKind;
use ruff_linter::{registry::Rule, RuleSelector};
@@ -78,12 +80,21 @@ fn benchmark_default_rules(criterion: &mut Criterion) {
benchmark_linter(group, &LinterSettings::default());
}
fn benchmark_all_rules(criterion: &mut Criterion) {
let mut rules: RuleTable = RuleSelector::All.all_rules().collect();
// Disable IO based rules because it is a source of flakiness
/// Disables IO based rules because they are a source of flakiness
fn disable_io_rules(rules: &mut RuleTable) {
rules.disable(Rule::ShebangMissingExecutableFile);
rules.disable(Rule::ShebangNotExecutable);
}
fn benchmark_all_rules(criterion: &mut Criterion) {
let mut rules: RuleTable = RuleSelector::All
.rules(&PreviewOptions {
mode: PreviewMode::Disabled,
require_explicit: false,
})
.collect();
disable_io_rules(&mut rules);
let settings = LinterSettings {
rules,
@@ -94,6 +105,22 @@ fn benchmark_all_rules(criterion: &mut Criterion) {
benchmark_linter(group, &settings);
}
fn benchmark_preview_rules(criterion: &mut Criterion) {
let mut rules: RuleTable = RuleSelector::All.all_rules().collect();
disable_io_rules(&mut rules);
let settings = LinterSettings {
rules,
preview: PreviewMode::Enabled,
..LinterSettings::default()
};
let group = criterion.benchmark_group("linter/all-with-preview-rules");
benchmark_linter(group, &settings);
}
criterion_group!(default_rules, benchmark_default_rules);
criterion_group!(all_rules, benchmark_all_rules);
criterion_main!(default_rules, all_rules);
criterion_group!(preview_rules, benchmark_preview_rules);
criterion_main!(default_rules, all_rules, preview_rules);

View File

@@ -20,3 +20,6 @@ seahash = "4.1.0"
[dev-dependencies]
ruff_macros = { path = "../ruff_macros" }
[lints]
workspace = true

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_cli"
version = "0.1.5"
version = "0.1.6"
publish = false
authors = { workspace = true }
edition = { workspace = true }
@@ -28,7 +28,7 @@ ruff_python_trivia = { path = "../ruff_python_trivia" }
ruff_workspace = { path = "../ruff_workspace" }
ruff_text_size = { path = "../ruff_text_size" }
annotate-snippets = { version = "0.9.1", features = ["color"] }
annotate-snippets = { version = "0.9.2", features = ["color"] }
anyhow = { workspace = true }
argfile = { version = "0.1.6" }
bincode = { version = "1.3.3" }
@@ -76,3 +76,6 @@ mimalloc = "0.1.39"
[target.'cfg(all(not(target_os = "windows"), not(target_os = "openbsd"), any(target_arch = "x86_64", target_arch = "aarch64", target_arch = "powerpc64")))'.dependencies]
tikv-jemallocator = "0.5.0"
[lints]
workspace = true

View File

@@ -0,0 +1,2 @@
[tool.ruff]
select = []

View File

@@ -0,0 +1,2 @@
[tool.ruff]
include = ["a.py", "subdirectory/c.py"]

View File

@@ -0,0 +1,413 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"id": "4f8ce941-1492-4d4e-8ab5-70d733fe891a",
"metadata": {},
"outputs": [],
"source": [
"%config ZMQInteractiveShell.ast_node_interactivity=\"last_expr_or_assign\""
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "721ec705-0c65-4bfb-9809-7ed8bc534186",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Assignment statement without a semicolon\n",
"x = 1"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "de50e495-17e5-41cc-94bd-565757555d7e",
"metadata": {},
"outputs": [],
"source": [
"# Assignment statement with a semicolon\n",
"x = 1;\n",
"x = 1;"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "39e31201-23da-44eb-8684-41bba3663991",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"2"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Augmented assignment without a semicolon\n",
"x += 1"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "6b73d3dd-c73a-4697-9e97-e109a6c1fbab",
"metadata": {},
"outputs": [],
"source": [
"# Augmented assignment without a semicolon\n",
"x += 1;\n",
"x += 1; # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "2a3e5b86-aa5b-46ba-b9c6-0386d876f58c",
"metadata": {},
"outputs": [],
"source": [
"# Multiple assignment without a semicolon\n",
"x = y = 1"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "07f89e51-9357-4cfb-8fc5-76fb75e35949",
"metadata": {},
"outputs": [],
"source": [
"# Multiple assignment with a semicolon\n",
"x = y = 1;\n",
"x = y = 1;"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "c22b539d-473e-48f8-a236-625e58c47a00",
"metadata": {},
"outputs": [],
"source": [
"# Tuple unpacking without a semicolon\n",
"x, y = 1, 2"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "12c87940-a0d5-403b-a81c-7507eb06dc7e",
"metadata": {},
"outputs": [],
"source": [
"# Tuple unpacking with a semicolon (irrelevant)\n",
"x, y = 1, 2;\n",
"x, y = 1, 2; # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "5a768c76-6bc4-470c-b37e-8cc14bc6caf4",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Annotated assignment statement without a semicolon\n",
"x: int = 1"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "21bfda82-1a9a-4ba1-9078-74ac480804b5",
"metadata": {},
"outputs": [],
"source": [
"# Annotated assignment statement without a semicolon\n",
"x: int = 1;\n",
"x: int = 1; # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "09929999-ff29-4d10-ad2b-e665af15812d",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Assignment expression without a semicolon\n",
"(x := 1)"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "32a83217-1bad-4f61-855e-ffcdb119c763",
"metadata": {},
"outputs": [],
"source": [
"# Assignment expression with a semicolon\n",
"(x := 1);\n",
"(x := 1); # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "61b81865-277e-4964-b03e-eb78f1f318eb",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x = 1\n",
"# Expression without a semicolon\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "974c29be-67e1-4000-95fa-6ca118a63bad",
"metadata": {},
"outputs": [],
"source": [
"x = 1\n",
"# Expression with a semicolon\n",
"x;"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "cfeb1757-46d6-4f13-969f-a283b6d0304f",
"metadata": {},
"outputs": [],
"source": [
"class Point:\n",
" def __init__(self, x, y):\n",
" self.x = x\n",
" self.y = y\n",
"\n",
"\n",
"p = Point(0, 0);"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "2ee7f1a5-ccfe-4004-bfa4-ef834a58da97",
"metadata": {},
"outputs": [],
"source": [
"# Assignment statement where the left is an attribute access doesn't\n",
"# print the value.\n",
"p.x = 1;"
]
},
{
"cell_type": "code",
"execution_count": 18,
"id": "3e49370a-048b-474d-aa0a-3d1d4a73ad37",
"metadata": {},
"outputs": [],
"source": [
"data = {}\n",
"\n",
"# Neither does the subscript node\n",
"data[\"foo\"] = 1;"
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "d594bdd3-eaa9-41ef-8cda-cf01bc273b2d",
"metadata": {},
"outputs": [],
"source": [
"if (x := 1):\n",
" # It should be the top level statement\n",
" x"
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "e532f0cf-80c7-42b7-8226-6002fcf74fb6",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 20,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Parentheses with comments\n",
"(\n",
" x := 1 # comment\n",
") # comment"
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "473c5d62-871b-46ed-8a34-27095243f462",
"metadata": {},
"outputs": [],
"source": [
"# Parentheses with comments\n",
"(\n",
" x := 1 # comment\n",
"); # comment"
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "8c3c2361-f49f-45fe-bbe3-7e27410a8a86",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'Hello world!'"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"\"\"\"Hello world!\"\"\""
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "23dbe9b5-3f68-4890-ab2d-ab0dbfd0712a",
"metadata": {},
"outputs": [],
"source": [
"\"\"\"Hello world!\"\"\"; # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "3ce33108-d95d-4c70-83d1-0d4fd36a2951",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'x = 1'"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x = 1\n",
"f\"x = {x}\""
]
},
{
"cell_type": "code",
"execution_count": 25,
"id": "654a4a67-de43-4684-824a-9451c67db48f",
"metadata": {},
"outputs": [],
"source": [
"x = 1\n",
"f\"x = {x}\";\n",
"f\"x = {x}\"; # comment\n",
"# comment"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python (ruff-playground)",
"language": "python",
"name": "ruff-playground"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

View File

@@ -88,6 +88,7 @@ pub enum Command {
#[allow(clippy::struct_excessive_bools)]
pub struct CheckCommand {
/// List of files or directories to check.
#[clap(help = "List of files or directories to check [default: .]")]
pub files: Vec<PathBuf>,
/// Apply fixes to resolve lint violations.
/// Use `--no-fix` to disable or `--unsafe-fixes` to include unsafe fixes.
@@ -363,6 +364,7 @@ pub struct CheckCommand {
#[allow(clippy::struct_excessive_bools)]
pub struct FormatCommand {
/// List of files or directories to format.
#[clap(help = "List of files or directories to format [default: .]")]
pub files: Vec<PathBuf>,
/// Avoid writing any formatted files back; instead, exit with a non-zero status code if any
/// files would have been modified, and zero otherwise.

View File

@@ -202,12 +202,12 @@ fn lint_path(
match result {
Ok(inner) => inner,
Err(error) => {
let message = r#"This indicates a bug in Ruff. If you could open an issue at:
let message = r"This indicates a bug in Ruff. If you could open an issue at:
https://github.com/astral-sh/ruff/issues/new?title=%5BLinter%20panic%5D
...with the relevant file contents, the `pyproject.toml` settings, and the following stack trace, we'd be very appreciative!
"#;
";
error!(
"{}{}{} {message}\n{error}",

View File

@@ -8,7 +8,7 @@ use ruff_workspace::resolver::{match_exclusion, python_file_at_path, PyprojectCo
use crate::args::CliOverrides;
use crate::diagnostics::{lint_stdin, Diagnostics};
use crate::stdin::read_from_stdin;
use crate::stdin::{parrot_stdin, read_from_stdin};
/// Run the linter over a single file, read from `stdin`.
pub(crate) fn check_stdin(
@@ -21,6 +21,9 @@ pub(crate) fn check_stdin(
if pyproject_config.settings.file_resolver.force_exclude {
if let Some(filename) = filename {
if !python_file_at_path(filename, pyproject_config, overrides)? {
if fix_mode.is_apply() {
parrot_stdin()?;
}
return Ok(Diagnostics::default());
}
@@ -29,14 +32,17 @@ pub(crate) fn check_stdin(
.file_name()
.is_some_and(|name| match_exclusion(filename, name, &lint_settings.exclude))
{
if fix_mode.is_apply() {
parrot_stdin()?;
}
return Ok(Diagnostics::default());
}
}
}
let stdin = read_from_stdin()?;
let package_root = filename.and_then(Path::parent).and_then(|path| {
packaging::detect_package_root(path, &pyproject_config.settings.linter.namespace_packages)
});
let stdin = read_from_stdin()?;
let mut diagnostics = lint_stdin(
filename,
package_root,

View File

@@ -34,7 +34,7 @@ use crate::args::{CliOverrides, FormatArguments};
use crate::cache::{Cache, FileCacheKey, PackageCacheMap, PackageCaches};
use crate::panic::{catch_unwind, PanicError};
use crate::resolve::resolve;
use crate::ExitStatus;
use crate::{resolve_default_files, ExitStatus};
#[derive(Debug, Copy, Clone, is_macro::Is)]
pub(crate) enum FormatMode {
@@ -60,7 +60,7 @@ impl FormatMode {
/// Format a set of files, and return the exit status.
pub(crate) fn format(
cli: &FormatArguments,
cli: FormatArguments,
overrides: &CliOverrides,
log_level: LogLevel,
) -> Result<ExitStatus> {
@@ -70,8 +70,9 @@ pub(crate) fn format(
overrides,
cli.stdin_filename.as_deref(),
)?;
let mode = FormatMode::from_cli(cli);
let (paths, resolver) = python_files_in_path(&cli.files, &pyproject_config, overrides)?;
let mode = FormatMode::from_cli(&cli);
let files = resolve_default_files(cli.files, false);
let (paths, resolver) = python_files_in_path(&files, &pyproject_config, overrides)?;
if paths.is_empty() {
warn_user_once!("No Python files found under the given path(s)");
@@ -660,12 +661,12 @@ impl Display for FormatCommandError {
}
}
Self::Panic(path, err) => {
let message = r#"This indicates a bug in Ruff. If you could open an issue at:
let message = r"This indicates a bug in Ruff. If you could open an issue at:
https://github.com/astral-sh/ruff/issues/new?title=%5BFormatter%20panic%5D
...with the relevant file contents, the `pyproject.toml` settings, and the following stack trace, we'd be very appreciative!
"#;
";
if let Some(path) = path {
write!(
f,

View File

@@ -15,7 +15,7 @@ use crate::commands::format::{
FormatResult, FormattedSource,
};
use crate::resolve::resolve;
use crate::stdin::read_from_stdin;
use crate::stdin::{parrot_stdin, read_from_stdin};
use crate::ExitStatus;
/// Run the formatter over a single file, read from `stdin`.
@@ -34,6 +34,9 @@ pub(crate) fn format_stdin(cli: &FormatArguments, overrides: &CliOverrides) -> R
if pyproject_config.settings.file_resolver.force_exclude {
if let Some(filename) = cli.stdin_filename.as_deref() {
if !python_file_at_path(filename, &pyproject_config, overrides)? {
if mode.is_write() {
parrot_stdin()?;
}
return Ok(ExitStatus::Success);
}
@@ -42,6 +45,9 @@ pub(crate) fn format_stdin(cli: &FormatArguments, overrides: &CliOverrides) -> R
.file_name()
.is_some_and(|name| match_exclusion(filename, name, &format_settings.exclude))
{
if mode.is_write() {
parrot_stdin()?;
}
return Ok(ExitStatus::Success);
}
}
@@ -50,6 +56,9 @@ pub(crate) fn format_stdin(cli: &FormatArguments, overrides: &CliOverrides) -> R
let path = cli.stdin_filename.as_deref();
let SourceType::Python(source_type) = path.map(SourceType::from).unwrap_or_default() else {
if mode.is_write() {
parrot_stdin()?;
}
return Ok(ExitStatus::Success);
};

View File

@@ -63,7 +63,7 @@ fn format_rule_text(rule: Rule) -> String {
if rule.is_preview() || rule.is_nursery() {
output.push_str(
r#"This rule is in preview and is not stable. The `--preview` flag is required for use."#,
r"This rule is in preview and is not stable. The `--preview` flag is required for use.",
);
output.push('\n');
output.push('\n');

View File

@@ -101,6 +101,19 @@ fn is_stdin(files: &[PathBuf], stdin_filename: Option<&Path>) -> bool {
file == Path::new("-")
}
/// Returns the default set of files if none are provided, otherwise returns the provided files.
fn resolve_default_files(files: Vec<PathBuf>, is_stdin: bool) -> Vec<PathBuf> {
if files.is_empty() {
if is_stdin {
vec![Path::new("-").to_path_buf()]
} else {
vec![Path::new(".").to_path_buf()]
}
} else {
files
}
}
/// Get the actual value of the `format` desired from either `output_format`
/// or `format`, and warn the user if they're using the deprecated form.
fn resolve_help_output_format(output_format: HelpFormat, format: Option<HelpFormat>) -> HelpFormat {
@@ -196,7 +209,7 @@ fn format(args: FormatCommand, log_level: LogLevel) -> Result<ExitStatus> {
if is_stdin(&cli.files, cli.stdin_filename.as_deref()) {
commands::format_stdin::format_stdin(&cli, &overrides)
} else {
commands::format::format(&cli, &overrides, log_level)
commands::format::format(cli, &overrides, log_level)
}
}
@@ -222,17 +235,15 @@ pub fn check(args: CheckCommand, log_level: LogLevel) -> Result<ExitStatus> {
};
let stderr_writer = Box::new(BufWriter::new(io::stderr()));
let is_stdin = is_stdin(&cli.files, cli.stdin_filename.as_deref());
let files = resolve_default_files(cli.files, is_stdin);
if cli.show_settings {
commands::show_settings::show_settings(
&cli.files,
&pyproject_config,
&overrides,
&mut writer,
)?;
commands::show_settings::show_settings(&files, &pyproject_config, &overrides, &mut writer)?;
return Ok(ExitStatus::Success);
}
if cli.show_files {
commands::show_files::show_files(&cli.files, &pyproject_config, &overrides, &mut writer)?;
commands::show_files::show_files(&files, &pyproject_config, &overrides, &mut writer)?;
return Ok(ExitStatus::Success);
}
@@ -295,8 +306,7 @@ pub fn check(args: CheckCommand, log_level: LogLevel) -> Result<ExitStatus> {
if !fix_mode.is_generate() {
warn_user!("--fix is incompatible with --add-noqa.");
}
let modifications =
commands::add_noqa::add_noqa(&cli.files, &pyproject_config, &overrides)?;
let modifications = commands::add_noqa::add_noqa(&files, &pyproject_config, &overrides)?;
if modifications > 0 && log_level >= LogLevel::Default {
let s = if modifications == 1 { "" } else { "s" };
#[allow(clippy::print_stderr)]
@@ -323,7 +333,7 @@ pub fn check(args: CheckCommand, log_level: LogLevel) -> Result<ExitStatus> {
// Configure the file watcher.
let (tx, rx) = channel();
let mut watcher = recommended_watcher(tx)?;
for file in &cli.files {
for file in &files {
watcher.watch(file, RecursiveMode::Recursive)?;
}
if let Some(file) = pyproject_config.path.as_ref() {
@@ -335,7 +345,7 @@ pub fn check(args: CheckCommand, log_level: LogLevel) -> Result<ExitStatus> {
printer.write_to_user("Starting linter in watch mode...\n");
let messages = commands::check::check(
&cli.files,
&files,
&pyproject_config,
&overrides,
cache.into(),
@@ -368,7 +378,7 @@ pub fn check(args: CheckCommand, log_level: LogLevel) -> Result<ExitStatus> {
printer.write_to_user("File change detected...\n");
let messages = commands::check::check(
&cli.files,
&files,
&pyproject_config,
&overrides,
cache.into(),
@@ -382,8 +392,6 @@ pub fn check(args: CheckCommand, log_level: LogLevel) -> Result<ExitStatus> {
}
}
} else {
let is_stdin = is_stdin(&cli.files, cli.stdin_filename.as_deref());
// Generate lint violations.
let diagnostics = if is_stdin {
commands::check_stdin::check_stdin(
@@ -395,7 +403,7 @@ pub fn check(args: CheckCommand, log_level: LogLevel) -> Result<ExitStatus> {
)?
} else {
commands::check::check(
&cli.files,
&files,
&pyproject_config,
&overrides,
cache.into(),

View File

@@ -43,7 +43,7 @@ pub fn resolve(
{
let settings = resolve_root_settings(&pyproject, Relativity::Cwd, overrides)?;
debug!(
"Using user specified pyproject.toml at {}",
"Using user-specified configuration file at: {}",
pyproject.display()
);
return Ok(PyprojectConfig::new(
@@ -63,7 +63,10 @@ pub fn resolve(
.as_ref()
.unwrap_or(&path_dedot::CWD.as_path()),
)? {
debug!("Using pyproject.toml (parent) at {}", pyproject.display());
debug!(
"Using configuration file (via parent) at: {}",
pyproject.display()
);
let settings = resolve_root_settings(&pyproject, Relativity::Parent, overrides)?;
return Ok(PyprojectConfig::new(
PyprojectDiscoveryStrategy::Hierarchical,
@@ -77,7 +80,10 @@ pub fn resolve(
// end up the "closest" `pyproject.toml` file for every Python file later on, so
// these act as the "default" settings.)
if let Some(pyproject) = pyproject::find_user_settings_toml() {
debug!("Using pyproject.toml (cwd) at {}", pyproject.display());
debug!(
"Using configuration file (via cwd) at: {}",
pyproject.display()
);
let settings = resolve_root_settings(&pyproject, Relativity::Cwd, overrides)?;
return Ok(PyprojectConfig::new(
PyprojectDiscoveryStrategy::Hierarchical,

View File

@@ -1,5 +1,5 @@
use std::io;
use std::io::Read;
use std::io::{Read, Write};
/// Read a string from `stdin`.
pub(crate) fn read_from_stdin() -> Result<String, io::Error> {
@@ -7,3 +7,11 @@ pub(crate) fn read_from_stdin() -> Result<String, io::Error> {
io::stdin().lock().read_to_string(&mut buffer)?;
Ok(buffer)
}
/// Read bytes from `stdin` and write them to `stdout`.
pub(crate) fn parrot_stdin() -> Result<(), io::Error> {
let mut buffer = String::new();
io::stdin().lock().read_to_string(&mut buffer)?;
io::stdout().write_all(buffer.as_bytes())?;
Ok(())
}

View File

@@ -43,6 +43,53 @@ if condition:
"###);
}
#[test]
fn default_files() -> Result<()> {
let tempdir = TempDir::new()?;
fs::write(
tempdir.path().join("foo.py"),
r#"
foo = "needs formatting"
"#,
)?;
fs::write(
tempdir.path().join("bar.py"),
r#"
bar = "needs formatting"
"#,
)?;
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["format", "--isolated", "--no-cache", "--check"]).current_dir(tempdir.path()), @r###"
success: false
exit_code: 1
----- stdout -----
Would reformat: bar.py
Would reformat: foo.py
2 files would be reformatted
----- stderr -----
"###);
Ok(())
}
#[test]
fn format_warn_stdin_filename_with_files() {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["format", "--isolated", "--stdin-filename", "foo.py"])
.arg("foo.py")
.pass_stdin("foo = 1"), @r###"
success: true
exit_code: 0
----- stdout -----
foo = 1
----- stderr -----
warning: Ignoring file foo.py in favor of standard input.
"###);
}
#[test]
fn format_options() -> Result<()> {
let tempdir = TempDir::new()?;
@@ -320,6 +367,11 @@ if __name__ == '__main__':
exit_code: 0
----- stdout -----
from test import say_hy
if __name__ == '__main__':
say_hy("dear Ruff contributor")
----- stderr -----
"###);
Ok(())
@@ -390,9 +442,9 @@ fn deprecated_options() -> Result<()> {
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
r#"
r"
tab-size = 2
"#,
",
)?;
insta::with_settings!({filters => vec![
@@ -402,10 +454,10 @@ tab-size = 2
.args(["format", "--config"])
.arg(&ruff_toml)
.arg("-")
.pass_stdin(r#"
.pass_stdin(r"
if True:
pass
"#), @r###"
"), @r###"
success: true
exit_code: 0
----- stdout -----
@@ -438,9 +490,9 @@ format = "json"
.args(["check", "--select", "F401", "--no-cache", "--config"])
.arg(&ruff_toml)
.arg("-")
.pass_stdin(r#"
.pass_stdin(r"
import os
"#), @r###"
"), @r###"
success: false
exit_code: 2
----- stdout -----
@@ -813,3 +865,432 @@ fn test_diff_stdin_formatted() {
----- stderr -----
"###);
}
#[test]
fn test_notebook_trailing_semicolon() {
let fixtures = Path::new("resources").join("test").join("fixtures");
let unformatted = fs::read(fixtures.join("trailing_semicolon.ipynb")).unwrap();
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["format", "--isolated", "--stdin-filename", "test.ipynb"])
.arg("-")
.pass_stdin(unformatted), @r###"
success: true
exit_code: 0
----- stdout -----
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"id": "4f8ce941-1492-4d4e-8ab5-70d733fe891a",
"metadata": {},
"outputs": [],
"source": [
"%config ZMQInteractiveShell.ast_node_interactivity=\"last_expr_or_assign\""
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "721ec705-0c65-4bfb-9809-7ed8bc534186",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Assignment statement without a semicolon\n",
"x = 1"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "de50e495-17e5-41cc-94bd-565757555d7e",
"metadata": {},
"outputs": [],
"source": [
"# Assignment statement with a semicolon\n",
"x = 1\n",
"x = 1;"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "39e31201-23da-44eb-8684-41bba3663991",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"2"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Augmented assignment without a semicolon\n",
"x += 1"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "6b73d3dd-c73a-4697-9e97-e109a6c1fbab",
"metadata": {},
"outputs": [],
"source": [
"# Augmented assignment without a semicolon\n",
"x += 1\n",
"x += 1; # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "2a3e5b86-aa5b-46ba-b9c6-0386d876f58c",
"metadata": {},
"outputs": [],
"source": [
"# Multiple assignment without a semicolon\n",
"x = y = 1"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "07f89e51-9357-4cfb-8fc5-76fb75e35949",
"metadata": {},
"outputs": [],
"source": [
"# Multiple assignment with a semicolon\n",
"x = y = 1\n",
"x = y = 1"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "c22b539d-473e-48f8-a236-625e58c47a00",
"metadata": {},
"outputs": [],
"source": [
"# Tuple unpacking without a semicolon\n",
"x, y = 1, 2"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "12c87940-a0d5-403b-a81c-7507eb06dc7e",
"metadata": {},
"outputs": [],
"source": [
"# Tuple unpacking with a semicolon (irrelevant)\n",
"x, y = 1, 2\n",
"x, y = 1, 2 # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "5a768c76-6bc4-470c-b37e-8cc14bc6caf4",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Annotated assignment statement without a semicolon\n",
"x: int = 1"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "21bfda82-1a9a-4ba1-9078-74ac480804b5",
"metadata": {},
"outputs": [],
"source": [
"# Annotated assignment statement without a semicolon\n",
"x: int = 1\n",
"x: int = 1; # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "09929999-ff29-4d10-ad2b-e665af15812d",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Assignment expression without a semicolon\n",
"(x := 1)"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "32a83217-1bad-4f61-855e-ffcdb119c763",
"metadata": {},
"outputs": [],
"source": [
"# Assignment expression with a semicolon\n",
"(x := 1)\n",
"(x := 1); # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "61b81865-277e-4964-b03e-eb78f1f318eb",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x = 1\n",
"# Expression without a semicolon\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "974c29be-67e1-4000-95fa-6ca118a63bad",
"metadata": {},
"outputs": [],
"source": [
"x = 1\n",
"# Expression with a semicolon\n",
"x;"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "cfeb1757-46d6-4f13-969f-a283b6d0304f",
"metadata": {},
"outputs": [],
"source": [
"class Point:\n",
" def __init__(self, x, y):\n",
" self.x = x\n",
" self.y = y\n",
"\n",
"\n",
"p = Point(0, 0);"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "2ee7f1a5-ccfe-4004-bfa4-ef834a58da97",
"metadata": {},
"outputs": [],
"source": [
"# Assignment statement where the left is an attribute access doesn't\n",
"# print the value.\n",
"p.x = 1"
]
},
{
"cell_type": "code",
"execution_count": 18,
"id": "3e49370a-048b-474d-aa0a-3d1d4a73ad37",
"metadata": {},
"outputs": [],
"source": [
"data = {}\n",
"\n",
"# Neither does the subscript node\n",
"data[\"foo\"] = 1"
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "d594bdd3-eaa9-41ef-8cda-cf01bc273b2d",
"metadata": {},
"outputs": [],
"source": [
"if x := 1:\n",
" # It should be the top level statement\n",
" x"
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "e532f0cf-80c7-42b7-8226-6002fcf74fb6",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 20,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Parentheses with comments\n",
"(\n",
" x := 1 # comment\n",
") # comment"
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "473c5d62-871b-46ed-8a34-27095243f462",
"metadata": {},
"outputs": [],
"source": [
"# Parentheses with comments\n",
"(\n",
" x := 1 # comment\n",
"); # comment"
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "8c3c2361-f49f-45fe-bbe3-7e27410a8a86",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'Hello world!'"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"\"\"\"Hello world!\"\"\""
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "23dbe9b5-3f68-4890-ab2d-ab0dbfd0712a",
"metadata": {},
"outputs": [],
"source": [
"\"\"\"Hello world!\"\"\"; # comment\n",
"# comment"
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "3ce33108-d95d-4c70-83d1-0d4fd36a2951",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'x = 1'"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x = 1\n",
"f\"x = {x}\""
]
},
{
"cell_type": "code",
"execution_count": 25,
"id": "654a4a67-de43-4684-824a-9451c67db48f",
"metadata": {},
"outputs": [],
"source": [
"x = 1\n",
"f\"x = {x}\"\n",
"f\"x = {x}\"; # comment\n",
"# comment"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python (ruff-playground)",
"language": "python",
"name": "ruff-playground"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
----- stderr -----
"###);
}

File diff suppressed because it is too large

View File

@@ -396,3 +396,43 @@ if __name__ == "__main__":
"###);
Ok(())
}
/// Regression test for [#8858](https://github.com/astral-sh/ruff/issues/8858)
#[test]
fn parent_configuration_override() -> Result<()> {
let tempdir = TempDir::new()?;
let root_ruff = tempdir.path().join("ruff.toml");
fs::write(
root_ruff,
r#"
[lint]
select = ["ALL"]
"#,
)?;
let sub_dir = tempdir.path().join("subdirectory");
fs::create_dir(&sub_dir)?;
let subdirectory_ruff = sub_dir.join("ruff.toml");
fs::write(
subdirectory_ruff,
r#"
[lint]
ignore = ["D203", "D212"]
"#,
)?;
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.current_dir(sub_dir)
.arg("check")
.args(STDIN_BASE_OPTIONS)
, @r###"
success: true
exit_code: 0
----- stdout -----
----- stderr -----
warning: No Python files found under the given path(s)
"###);
Ok(())
}

View File

@@ -0,0 +1,101 @@
#![cfg(not(target_family = "wasm"))]
use std::path::Path;
use std::process::Command;
use std::str;
use insta_cmd::{assert_cmd_snapshot, get_cargo_bin};
const BIN_NAME: &str = "ruff";
#[cfg(not(target_os = "windows"))]
const TEST_FILTERS: &[(&str, &str)] = &[(".*/resources/test/fixtures/", "[BASEPATH]/")];
#[cfg(target_os = "windows")]
const TEST_FILTERS: &[(&str, &str)] = &[
(r".*\\resources\\test\\fixtures\\", "[BASEPATH]\\"),
(r"\\", "/"),
];
#[test]
fn check_project_include_defaults() {
// Defaults to checking the current working directory
//
// The test directory includes:
// - A pyproject.toml which specifies an include
// - A nested pyproject.toml which has a Ruff section
//
// The nested project should be checked in its entirety instead of respecting the parent's includes
insta::with_settings!({
filters => TEST_FILTERS.to_vec()
}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--show-files"]).current_dir(Path::new("./resources/test/fixtures/include-test")), @r###"
success: true
exit_code: 0
----- stdout -----
[BASEPATH]/include-test/a.py
[BASEPATH]/include-test/nested-project/e.py
[BASEPATH]/include-test/nested-project/pyproject.toml
[BASEPATH]/include-test/subdirectory/c.py
----- stderr -----
"###);
});
}
#[test]
fn check_project_respects_direct_paths() {
// Given a direct path that is not covered by the project's `include`, it should still be checked
insta::with_settings!({
filters => TEST_FILTERS.to_vec()
}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--show-files", "b.py"]).current_dir(Path::new("./resources/test/fixtures/include-test")), @r###"
success: true
exit_code: 0
----- stdout -----
[BASEPATH]/include-test/b.py
----- stderr -----
"###);
});
}
#[test]
fn check_project_respects_subdirectory_includes() {
// Given a direct path to a subdirectory, the include should be respected
insta::with_settings!({
filters => TEST_FILTERS.to_vec()
}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--show-files", "subdirectory"]).current_dir(Path::new("./resources/test/fixtures/include-test")), @r###"
success: true
exit_code: 0
----- stdout -----
[BASEPATH]/include-test/subdirectory/c.py
----- stderr -----
"###);
});
}
#[test]
fn check_project_from_project_subdirectory_respects_includes() {
// When run from a project subdirectory, the include specified in the parent directory should be respected
insta::with_settings!({
filters => TEST_FILTERS.to_vec()
}, {
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.args(["check", "--show-files"]).current_dir(Path::new("./resources/test/fixtures/include-test/subdirectory")), @r###"
success: true
exit_code: 0
----- stdout -----
[BASEPATH]/include-test/subdirectory/c.py
----- stderr -----
"###);
});
}
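For readers who want to reproduce the include-resolution behavior exercised by the tests above, the following is a minimal sketch of a fixture tree with the described shape. The file names, the `include` globs, and the nested project's settings are illustrative assumptions, not the contents of the repository's actual `include-test` fixture.

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Builds a throwaway project tree with the shape the tests above describe:
/// a root `pyproject.toml` that sets `include`, plus a nested project that
/// carries its own Ruff configuration. Names and globs are illustrative.
fn build_include_fixture(root: &Path) -> io::Result<()> {
    // Root project: only `a.py` and the `subdirectory` tree are included.
    fs::write(
        root.join("pyproject.toml"),
        r#"
[tool.ruff]
include = ["a.py", "subdirectory/**"]
"#,
    )?;
    fs::write(root.join("a.py"), "a = 1\n")?;
    // Not matched by `include`; only checked when passed as a direct path.
    fs::write(root.join("b.py"), "b = 1\n")?;
    fs::create_dir_all(root.join("subdirectory"))?;
    fs::write(root.join("subdirectory").join("c.py"), "c = 1\n")?;

    // Nested project with its own `[tool.ruff]` table: it is treated as a
    // separate project root and checked in full, regardless of the parent's
    // `include`.
    let nested = root.join("nested-project");
    fs::create_dir_all(&nested)?;
    fs::write(
        nested.join("pyproject.toml"),
        "[tool.ruff]\nline-length = 100\n",
    )?;
    fs::write(nested.join("e.py"), "e = 1\n")?;
    Ok(())
}
```

Pointing `ruff check --show-files` at such a tree should list `a.py`, the `subdirectory` contents, and everything under `nested-project`, while `b.py` only appears when passed explicitly.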

View File

@@ -48,9 +48,12 @@ tracing-indicatif = { workspace = true }
tracing-subscriber = { workspace = true, features = ["env-filter"] }
imara-diff = "0.1.5"
[dev-dependencies]
indoc = "2.0.4"
[features]
# Turn off rayon for profiling
singlethreaded = []
[dev-dependencies]
indoc = "2.0.4"
[lints]
workspace = true

View File

@@ -49,7 +49,7 @@ pub(crate) fn main(args: &Args) -> Result<()> {
if rule.is_preview() || rule.is_nursery() {
output.push_str(
r#"This rule is unstable and in [preview](../preview.md). The `--preview` flag is required for use."#,
r"This rule is unstable and in [preview](../preview.md). The `--preview` flag is required for use.",
);
output.push('\n');
output.push('\n');

View File

@@ -129,12 +129,12 @@ fn emit_field(output: &mut String, name: &str, field: &OptionField, parent_set:
output.push_str("**Example usage**:\n\n");
output.push_str(&format_tab(
"pyproject.toml",
&format_header(parent_set, ConfigurationFile::PyprojectToml),
&format_header(field.scope, parent_set, ConfigurationFile::PyprojectToml),
field.example,
));
output.push_str(&format_tab(
"ruff.toml",
&format_header(parent_set, ConfigurationFile::RuffToml),
&format_header(field.scope, parent_set, ConfigurationFile::RuffToml),
field.example,
));
output.push('\n');
@@ -149,23 +149,53 @@ fn format_tab(tab_name: &str, header: &str, content: &str) -> String {
)
}
fn format_header(parent_set: &Set, configuration: ConfigurationFile) -> String {
let fmt = if let Some(set_name) = parent_set.name() {
if set_name == "format" {
String::from(".format")
} else {
format!(".lint.{set_name}")
}
} else {
String::new()
};
/// Format the TOML header for a given option's example usage.
///
/// For example: `[tool.ruff.format]` or `[tool.ruff.lint.isort]`.
fn format_header(
scope: Option<&str>,
parent_set: &Set,
configuration: ConfigurationFile,
) -> String {
match configuration {
ConfigurationFile::PyprojectToml => format!("[tool.ruff{fmt}]"),
ConfigurationFile::PyprojectToml => {
let mut header = if let Some(set_name) = parent_set.name() {
if set_name == "format" {
String::from("tool.ruff.format")
} else {
format!("tool.ruff.lint.{set_name}")
}
} else {
"tool.ruff".to_string()
};
if let Some(scope) = scope {
if !header.is_empty() {
header.push('.');
}
header.push_str(scope);
}
format!("[{header}]")
}
ConfigurationFile::RuffToml => {
if fmt.is_empty() {
let mut header = if let Some(set_name) = parent_set.name() {
if set_name == "format" {
String::from("format")
} else {
format!("lint.{set_name}")
}
} else {
String::new()
};
if let Some(scope) = scope {
if !header.is_empty() {
header.push('.');
}
header.push_str(scope);
}
if header.is_empty() {
String::new()
} else {
format!("[{}]", fmt.strip_prefix('.').unwrap())
format!("[{header}]")
}
}
}
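As a rough, standalone illustration of the header construction above: the sketch below combines the same three ingredients (the `tool.ruff` prefix, the rule-set name, and the option scope), assuming plain strings in place of the crate's `Set` and `ConfigurationFile` types. `example_header` and its parameters are hypothetical names introduced only for this sketch.

```rust
/// Standalone sketch of the header logic above, with plain string inputs.
fn example_header(set_name: Option<&str>, scope: Option<&str>, pyproject: bool) -> String {
    // `pyproject.toml` headers are prefixed with `tool.ruff`; `ruff.toml`
    // headers are not.
    let mut header = match (set_name, pyproject) {
        (Some("format"), true) => String::from("tool.ruff.format"),
        (Some("format"), false) => String::from("format"),
        (Some(set), true) => format!("tool.ruff.lint.{set}"),
        (Some(set), false) => format!("lint.{set}"),
        (None, true) => String::from("tool.ruff"),
        (None, false) => String::new(),
    };
    // Options that live in a named sub-table append their scope.
    if let Some(scope) = scope {
        if !header.is_empty() {
            header.push('.');
        }
        header.push_str(scope);
    }
    if header.is_empty() {
        String::new()
    } else {
        format!("[{header}]")
    }
}

// For example, under these assumptions:
//   example_header(Some("isort"), None, true)   => "[tool.ruff.lint.isort]"
//   example_header(Some("format"), None, false) => "[format]"
//   example_header(None, None, false)           => ""  (top-level ruff.toml options need no header)
```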

View File

@@ -1,30 +1,34 @@
//! Print the AST for a given Python file.
#![allow(clippy::print_stdout, clippy::print_stderr)]
use std::fs;
use std::path::PathBuf;
use anyhow::Result;
use ruff_python_parser::{parse, Mode};
use ruff_linter::source_kind::SourceKind;
use ruff_python_ast::PySourceType;
use ruff_python_parser::{parse, AsMode};
#[derive(clap::Args)]
pub(crate) struct Args {
/// Python file for which to generate the AST.
#[arg(required = true)]
file: PathBuf,
/// Run in Jupyter mode i.e., allow line magics.
#[arg(long)]
jupyter: bool,
}
pub(crate) fn main(args: &Args) -> Result<()> {
let contents = fs::read_to_string(&args.file)?;
let mode = if args.jupyter {
Mode::Ipython
} else {
Mode::Module
};
let python_ast = parse(&contents, mode, &args.file.to_string_lossy())?;
let source_type = PySourceType::from(&args.file);
let source_kind = SourceKind::from_path(&args.file, source_type)?.ok_or_else(|| {
anyhow::anyhow!(
"Could not determine source kind for file: {}",
args.file.display()
)
})?;
let python_ast = parse(
source_kind.source_code(),
source_type.as_mode(),
&args.file.to_string_lossy(),
)?;
println!("{python_ast:#?}");
Ok(())
}

View File

@@ -1,30 +1,30 @@
//! Print the token stream for a given Python file.
#![allow(clippy::print_stdout, clippy::print_stderr)]
use std::fs;
use std::path::PathBuf;
use anyhow::Result;
use ruff_python_parser::{lexer, Mode};
use ruff_linter::source_kind::SourceKind;
use ruff_python_ast::PySourceType;
use ruff_python_parser::{lexer, AsMode};
#[derive(clap::Args)]
pub(crate) struct Args {
/// Python file for which to generate the token stream.
#[arg(required = true)]
file: PathBuf,
/// Run in Jupyter mode i.e., allow line magics (`%`, `!`, `?`, `/`, `,`, `;`).
#[arg(long)]
jupyter: bool,
}
pub(crate) fn main(args: &Args) -> Result<()> {
let contents = fs::read_to_string(&args.file)?;
let mode = if args.jupyter {
Mode::Ipython
} else {
Mode::Module
};
for (tok, range) in lexer::lex(&contents, mode).flatten() {
let source_type = PySourceType::from(&args.file);
let source_kind = SourceKind::from_path(&args.file, source_type)?.ok_or_else(|| {
anyhow::anyhow!(
"Could not determine source kind for file: {}",
args.file.display()
)
})?;
for (tok, range) in lexer::lex(source_kind.source_code(), source_type.as_mode()).flatten() {
println!(
"{start:#?} {tok:#?} {end:#?}",
start = range.start(),

View File

@@ -29,3 +29,6 @@ insta = { workspace = true }
[features]
serde = ["dep:serde", "ruff_text_size/serde"]
schemars = ["dep:schemars", "ruff_text_size/schemars"]
[lints]
workspace = true

View File

@@ -1,23 +1,9 @@
use super::{Buffer, Format, Formatter};
use crate::FormatResult;
use std::ffi::c_void;
use std::marker::PhantomData;
/// Mono-morphed type to format an object. Used by the [`crate::format`!], [`crate::format_args`!], and
/// [`crate::write`!] macros.
///
/// This struct is similar to a dynamic dispatch (using `dyn Format`) because it stores a pointer to the value.
/// However, it doesn't store the pointer to `dyn Format`'s vtable, instead it statically resolves the function
/// pointer of `Format::format` and stores it in `formatter`.
/// A convenience wrapper for representing a formattable argument.
pub struct Argument<'fmt, Context> {
/// The value to format stored as a raw pointer where `lifetime` stores the value's lifetime.
value: *const c_void,
/// Stores the lifetime of the value. To get the most out of our dear borrow checker.
lifetime: PhantomData<&'fmt ()>,
/// The function pointer to `value`'s `Format::format` method
formatter: fn(*const c_void, &mut Formatter<'_, Context>) -> FormatResult<()>,
value: &'fmt dyn Format<Context>,
}
impl<Context> Clone for Argument<'_, Context> {
@@ -28,32 +14,19 @@ impl<Context> Clone for Argument<'_, Context> {
impl<Context> Copy for Argument<'_, Context> {}
impl<'fmt, Context> Argument<'fmt, Context> {
/// Called by the [ruff_formatter::format_args] macro. Creates a mono-morphed value for formatting
/// an object.
/// Called by the [ruff_formatter::format_args] macro.
#[doc(hidden)]
#[inline]
pub fn new<F: Format<Context>>(value: &'fmt F) -> Self {
#[inline]
fn formatter<F: Format<Context>, Context>(
ptr: *const c_void,
fmt: &mut Formatter<Context>,
) -> FormatResult<()> {
// SAFETY: Safe because the 'fmt lifetime is captured by the 'lifetime' field.
#[allow(unsafe_code)]
F::fmt(unsafe { &*ptr.cast::<F>() }, fmt)
}
Self {
value: (value as *const F).cast::<std::ffi::c_void>(),
lifetime: PhantomData,
formatter: formatter::<F, Context>,
}
Self { value }
}
/// Formats the value stored by this argument using the given formatter.
#[inline]
// Seems to only be triggered on wasm32 and looks like a false positive?
#[allow(clippy::trivially_copy_pass_by_ref)]
pub(super) fn format(&self, f: &mut Formatter<Context>) -> FormatResult<()> {
(self.formatter)(self.value, f)
self.value.fmt(f)
}
}
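The refactor above swaps a hand-rolled, single-method vtable (a type-erased raw pointer plus a monomorphized function pointer) for an ordinary trait object. A self-contained sketch of the two shapes, using a toy `Show` trait in place of the crate's `Format` trait, might look like this:

```rust
use std::ffi::c_void;
use std::marker::PhantomData;

// A toy stand-in for the crate's `Format` trait.
trait Show {
    fn show(&self) -> String;
}

// Old shape: erase the type behind a raw pointer and statically resolve the
// method into a stored function pointer -- effectively a hand-rolled,
// single-method vtable.
struct RawArgument<'a> {
    value: *const c_void,
    lifetime: PhantomData<&'a ()>,
    show_fn: fn(*const c_void) -> String,
}

impl<'a> RawArgument<'a> {
    fn new<T: Show>(value: &'a T) -> Self {
        fn shim<T: Show>(ptr: *const c_void) -> String {
            // SAFETY: `ptr` was derived from an `&T` that lives for `'a`.
            unsafe { (*ptr.cast::<T>()).show() }
        }
        Self {
            value: (value as *const T).cast::<c_void>(),
            lifetime: PhantomData,
            show_fn: shim::<T>,
        }
    }

    fn show(&self) -> String {
        (self.show_fn)(self.value)
    }
}

// New shape: store an ordinary trait object and let the compiler build the
// vtable; no raw pointers or unsafe code required.
struct DynArgument<'a> {
    value: &'a dyn Show,
}

impl<'a> DynArgument<'a> {
    fn new(value: &'a dyn Show) -> Self {
        Self { value }
    }

    fn show(&self) -> String {
        self.value.show()
    }
}
```

Both shapes are two pointers wide, so the trait-object form drops the `unsafe` block and the manual lifetime bookkeeping without changing the size of the argument.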

View File

@@ -1451,11 +1451,12 @@ impl<Context> std::fmt::Debug for Group<'_, Context> {
}
}
/// Content that may get parenthesized if it exceeds the configured line width but only if the parenthesized
/// layout doesn't exceed the line width too, in which case it falls back to the flat layout.
/// Content that may get parenthesized if it exceeds the configured line width but only if the parenthesized content
/// doesn't exceed the line width too, in which case it falls back to the flat layout.
///
/// This IR is identical to the following [`best_fitting`] layout but is implemented as custom IR for
/// best performance.
/// This IR is almost identical to the following [`best_fitting`] layout but is implemented as custom IR for
/// best performance. It differs in that the open parenthesis is ignored when testing whether the content fits because
/// the goal is to only parenthesize the content if parenthesizing makes the content fit.
///
/// ```rust
/// # use ruff_formatter::prelude::*;
@@ -2555,17 +2556,17 @@ pub struct BestFitting<'a, Context> {
}
impl<'a, Context> BestFitting<'a, Context> {
/// Creates a new best fitting IR with the given variants. The method itself isn't unsafe
/// but it is to discourage people from using it because the printer will panic if
/// the slice doesn't contain at least the least and most expanded variants.
/// Creates a new best fitting IR with the given variants.
///
/// Callers are required to ensure that the number of variants given
/// is at least 2.
///
/// If you're looking for a way to create a `BestFitting` object, use the `best_fitting![least_expanded, most_expanded]` macro.
///
/// ## Safety
/// The slice must contain at least two variants.
#[allow(unsafe_code)]
pub unsafe fn from_arguments_unchecked(variants: Arguments<'a, Context>) -> Self {
/// # Panics
///
/// When the slice contains fewer than two variants.
pub fn from_arguments_unchecked(variants: Arguments<'a, Context>) -> Self {
assert!(
variants.0.len() >= 2,
"Requires at least the least expanded and most expanded variants"
@@ -2696,14 +2697,12 @@ impl<Context> Format<Context> for BestFitting<'_, Context> {
buffer.write_element(FormatElement::Tag(EndBestFittingEntry));
}
// SAFETY: The constructor guarantees that there are always at least two variants. It's, therefore,
// safe to call into the unsafe `from_vec_unchecked` function
#[allow(unsafe_code)]
let element = unsafe {
FormatElement::BestFitting {
variants: BestFittingVariants::from_vec_unchecked(buffer.into_vec()),
mode: self.mode,
}
// OK because the constructor guarantees that there are always at
// least two variants.
let variants = BestFittingVariants::from_vec_unchecked(buffer.into_vec());
let element = FormatElement::BestFitting {
variants,
mode: self.mode,
};
f.write_element(element);
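The two-variant requirement documented above reflects how selection works: the printer tries the flattest variant first and must always be able to fall back to the most expanded one. A conceptual sketch of that selection over plain strings, with "fits" reduced to a per-line width check, could look like this (the real printer operates on IR elements and measurement modes, not strings):

```rust
/// Conceptual sketch of best-fitting selection over plain strings: take the
/// first (flattest) variant whose widest line fits within the line width, and
/// otherwise fall back to the last (most expanded) variant.
fn pick_best_fitting<'a>(variants: &[&'a str], line_width: usize) -> &'a str {
    assert!(
        variants.len() >= 2,
        "Requires at least the least expanded and most expanded variants"
    );
    variants
        .iter()
        .find(|variant| {
            variant
                .lines()
                .all(|line| line.chars().count() <= line_width)
        })
        .copied()
        .unwrap_or_else(|| *variants.last().unwrap())
}

// e.g. pick_best_fitting(&["f(a, b)", "f(\n    a,\n    b,\n)"], 80) == "f(a, b)"
```

In the crate itself, `best_fitting![least_expanded, most_expanded]` is the intended entry point, which is why the macro can call `from_arguments_unchecked` without further checks.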

View File

@@ -332,17 +332,14 @@ pub enum BestFittingMode {
pub struct BestFittingVariants(Box<[FormatElement]>);
impl BestFittingVariants {
/// Creates a new best fitting IR with the given variants. The method itself isn't unsafe
/// but it is to discourage people from using it because the printer will panic if
/// the slice doesn't contain at least the least and most expanded variants.
/// Creates a new best fitting IR with the given variants.
///
/// Callers are required to ensure that the number of variants given
/// is at least 2 when using `most_expanded` or `most_flat`.
///
/// If you're looking for a way to create a `BestFitting` object, use the `best_fitting![least_expanded, most_expanded]` macro.
///
/// ## Safety
/// The slice must contain at least two variants.
#[doc(hidden)]
#[allow(unsafe_code)]
pub unsafe fn from_vec_unchecked(variants: Vec<FormatElement>) -> Self {
pub fn from_vec_unchecked(variants: Vec<FormatElement>) -> Self {
debug_assert!(
variants
.iter()
@@ -351,12 +348,23 @@ impl BestFittingVariants {
>= 2,
"Requires at least the least expanded and most expanded variants"
);
Self(variants.into_boxed_slice())
}
/// Returns the most expanded variant
///
/// # Panics
///
/// When the number of variants is less than two.
pub fn most_expanded(&self) -> &[FormatElement] {
assert!(
self.as_slice()
.iter()
.filter(|element| matches!(element, FormatElement::Tag(Tag::StartBestFittingEntry)))
.count()
>= 2,
"Requires at least the least expanded and most expanded variants"
);
self.into_iter().last().unwrap()
}
@@ -365,7 +373,19 @@ impl BestFittingVariants {
}
/// Returns the least expanded variant
///
/// # Panics
///
/// When the number of variants is less than two.
pub fn most_flat(&self) -> &[FormatElement] {
assert!(
self.as_slice()
.iter()
.filter(|element| matches!(element, FormatElement::Tag(Tag::StartBestFittingEntry)))
.count()
>= 2,
"Requires at least the least expanded and most expanded variants"
);
self.into_iter().next().unwrap()
}
}

View File

@@ -329,10 +329,8 @@ macro_rules! format {
#[macro_export]
macro_rules! best_fitting {
($least_expanded:expr, $($tail:expr),+ $(,)?) => {{
#[allow(unsafe_code)]
unsafe {
$crate::BestFitting::from_arguments_unchecked($crate::format_args!($least_expanded, $($tail),+))
}
// OK because the macro syntax requires at least two variants.
$crate::BestFitting::from_arguments_unchecked($crate::format_args!($least_expanded, $($tail),+))
}}
}

View File

@@ -203,7 +203,9 @@ impl<'a> Printer<'a> {
args.with_measure_mode(MeasureMode::AllLines),
);
queue.extend_back(&[OPEN_PAREN, INDENT, HARD_LINE_BREAK]);
// Don't measure the opening parenthesis. We're only concerned with whether
// parenthesizing makes the content fit.
queue.extend_back(&[INDENT, HARD_LINE_BREAK]);
let fits_expanded = self.fits(queue, stack)?;
queue.pop_slice();
stack.pop(TagKind::BestFitParenthesize)?;
@@ -1299,10 +1301,11 @@ impl<'a, 'print> FitsMeasurer<'a, 'print> {
self.state.line_width = 0;
self.state.pending_indent = unindented.indention();
return Ok(self.fits_text(Text::Token(")"), unindented));
// Don't measure the `)`, similar to the open parenthesis, but do increment the line width and indent level.
self.fits_text(Text::Token(")"), unindented);
} else {
self.stack.pop(TagKind::BestFitParenthesize)?;
}
self.stack.pop(TagKind::BestFitParenthesize)?;
}
FormatElement::Tag(StartConditionalGroup(group)) => {
@@ -1711,14 +1714,14 @@ mod tests {
));
assert_eq!(
r#"a
"a
b
c
d
d
c
b
a"#,
a",
formatted.as_code()
);
}
@@ -2047,10 +2050,10 @@ two lines`,
assert_eq!(
printed.as_code(),
r#"Group with id-2
"Group with id-2
Group with id-1 does not fit on the line because it exceeds the line width of 80 characters by
Group 2 fits
Group 1 breaks"#
Group 1 breaks"
);
}
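The measurement change above can be restated as a small rule of thumb: when deciding whether to parenthesize, only the content as it would sit on its own indented lines matters; the `(` left behind on the current line shouldn't tip the decision, and if even the indented content overflows, parenthesizing won't help and the printer falls back to the flat layout. A conceptual sketch, under the simplifying assumption that "fits" just means every line stays within the configured width:

```rust
/// Conceptual sketch of the fits check above: decide whether parenthesizing
/// helps by measuring only the content as it would appear on its own
/// indented lines, ignoring the `(` that stays on the current line. The
/// indent width and the handling of the closing `)` are simplifications.
fn parenthesizing_helps(content: &str, line_width: usize, indent: usize) -> bool {
    content
        .lines()
        .all(|line| indent + line.chars().count() <= line_width)
}

// parenthesizing_helps("some_call(a, b)\n+ other_call(c)", 88, 4) == true
```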

View File

@@ -17,3 +17,6 @@ ruff_macros = { path = "../ruff_macros" }
[dev-dependencies]
static_assertions = "1.1.0"
[lints]
workspace = true

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.1.5"
version = "0.1.6"
publish = false
authors = { workspace = true }
edition = { workspace = true }
@@ -30,7 +30,7 @@ ruff_source_file = { path = "../ruff_source_file", features = ["serde"] }
ruff_text_size = { path = "../ruff_text_size" }
aho-corasick = { version = "1.1.2" }
annotate-snippets = { version = "0.9.1", features = ["color"] }
annotate-snippets = { version = "0.9.2", features = ["color"] }
anyhow = { workspace = true }
bitflags = { workspace = true }
chrono = { workspace = true }
@@ -53,8 +53,8 @@ path-absolutize = { workspace = true, features = [
] }
pathdiff = { version = "0.2.1" }
pep440_rs = { version = "0.3.12", features = ["serde"] }
pyproject-toml = { version = "0.8.0" }
quick-junit = { version = "0.3.2" }
pyproject-toml = { version = "0.8.1" }
quick-junit = { version = "0.3.5" }
regex = { workspace = true }
result-like = { version = "0.4.6" }
rustc-hash = { workspace = true }
@@ -86,3 +86,6 @@ default = []
schemars = ["dep:schemars"]
# Enables the UnreachableCode rule
unreachable-code = []
[lints]
workspace = true

View File

@@ -1,3 +0,0 @@
# fixtures
Fixture files used for snapshot testing.

View File

@@ -0,0 +1,65 @@
def func():
return 1
def func():
return 1.5
def func(x: int):
if x > 0:
return 1
else:
return 1.5
def func():
return True
def func(x: int):
if x > 0:
return None
else:
return
def func(x: int):
return 1 or 2.5 if x > 0 else 1.5 or "str"
def func(x: int):
return 1 + 2.5 if x > 0 else 1.5 or "str"
def func(x: int):
if not x:
return None
return {"foo": 1}
def func(x: int):
return {"foo": 1}
def func(x: int):
if not x:
return 1
else:
return True
def func(x: int):
if not x:
return 1
else:
return None
def func(x: int):
if not x:
return 1
elif x > 5:
return "str"
else:
return None

View File

@@ -0,0 +1,65 @@
import sys
import tarfile
import tempfile
def unsafe_archive_handler(filename):
tar = tarfile.open(filename)
tar.extractall(path=tempfile.mkdtemp())
tar.close()
def managed_members_archive_handler(filename):
tar = tarfile.open(filename)
tar.extractall(path=tempfile.mkdtemp(), members=members_filter(tar))
tar.close()
def list_members_archive_handler(filename):
tar = tarfile.open(filename)
tar.extractall(path=tempfile.mkdtemp(), members=[])
tar.close()
def provided_members_archive_handler(filename):
tar = tarfile.open(filename)
tarfile.extractall(path=tempfile.mkdtemp(), members=tar)
tar.close()
def filter_data(filename):
tar = tarfile.open(filename)
tarfile.extractall(path=tempfile.mkdtemp(), filter="data")
tar.close()
def filter_fully_trusted(filename):
tar = tarfile.open(filename)
tarfile.extractall(path=tempfile.mkdtemp(), filter="fully_trusted")
tar.close()
def filter_tar(filename):
tar = tarfile.open(filename)
tarfile.extractall(path=tempfile.mkdtemp(), filter="tar")
tar.close()
def members_filter(tarfile):
result = []
for member in tarfile.getmembers():
if '../' in member.name:
print('Member name contains directory traversal sequence')
continue
elif (member.issym() or member.islnk()) and ('../' in member.linkname):
print('Symlink to external resource')
continue
result.append(member)
return result
if __name__ == "__main__":
if len(sys.argv) > 1:
filename = sys.argv[1]
unsafe_archive_handler(filename)
managed_members_archive_handler(filename)

View File

@@ -0,0 +1,13 @@
from django.db.models.expressions import RawSQL
from django.contrib.auth.models import User
User.objects.annotate(val=RawSQL('secure', []))
User.objects.annotate(val=RawSQL('%secure' % 'nos', []))
User.objects.annotate(val=RawSQL('{}secure'.format('no'), []))
raw = '"username") AS "val" FROM "auth_user" WHERE "username"="admin" --'
User.objects.annotate(val=RawSQL(raw, []))
raw = '"username") AS "val" FROM "auth_user"' \
' WHERE "username"="admin" OR 1=%s --'
User.objects.annotate(val=RawSQL(raw, [0]))
User.objects.annotate(val=RawSQL(sql='{}secure'.format('no'), params=[]))
User.objects.annotate(val=RawSQL(params=[], sql='{}secure'.format('no')))

View File

@@ -91,3 +91,26 @@ class Registry:
def foo(self) -> None:
object.__setattr__(self, "flag", True)
from typing import Optional, Union
def func(x: Union[list, Optional[int | str | float | bool]]):
pass
def func(x: bool | str):
pass
def func(x: int | str):
pass
from typing import override
@override
def func(x: bool):
pass

View File

@@ -0,0 +1,149 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"id": "33faf7ad-a3fd-4ac4-a0c3-52e507ed49df",
"metadata": {},
"outputs": [],
"source": [
"x = 1"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "481fb4bf-c1b9-47da-927f-3cfdfe4b49ec",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Simple case\n",
"x == 1"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "2f0c65a5-0a0e-4080-afce-5a8ed0d706df",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Only skip the last expression\n",
"x == 1 # B018\n",
"x == 1"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "5a3fd75d-26d9-44f7-b013-1684aabfd0ae",
"metadata": {},
"outputs": [],
"source": [
"# Nested expressions isn't relevant\n",
"if True:\n",
" x == 1"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "e00e1afa-b76c-4774-be2a-7223641579e4",
"metadata": {},
"outputs": [],
"source": [
"# Semicolons shouldn't affect the output\n",
"x == 1;"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "05eab5b9-e2ba-4954-8ef3-b035a79573fe",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Semicolons with multiple expressions\n",
"x == 1; x == 1"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "9cbbddc5-83fc-4fdb-81ab-53a3912ae898",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Comments, newlines and whitespace\n",
"x == 1 # comment\n",
"\n",
"# another comment"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python (ruff-playground)",
"language": "python",
"name": "ruff-playground"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

View File

@@ -0,0 +1,149 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"id": "33faf7ad-a3fd-4ac4-a0c3-52e507ed49df",
"metadata": {},
"outputs": [],
"source": [
"x = 1"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "481fb4bf-c1b9-47da-927f-3cfdfe4b49ec",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Simple case\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "2f0c65a5-0a0e-4080-afce-5a8ed0d706df",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Only skip the last expression\n",
"x # B018\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "5a3fd75d-26d9-44f7-b013-1684aabfd0ae",
"metadata": {},
"outputs": [],
"source": [
"# Nested expressions isn't relevant\n",
"if True:\n",
" x"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "e00e1afa-b76c-4774-be2a-7223641579e4",
"metadata": {},
"outputs": [],
"source": [
"# Semicolons shouldn't affect the output\n",
"x;"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "05eab5b9-e2ba-4954-8ef3-b035a79573fe",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Semicolons with multiple expressions\n",
"x; x"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "9cbbddc5-83fc-4fdb-81ab-53a3912ae898",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Comments, newlines and whitespace\n",
"x # comment\n",
"\n",
"# another comment"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python (ruff-playground)",
"language": "python",
"name": "ruff-playground"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

View File

@@ -639,3 +639,18 @@ foo = namedtuple(
:20
],
)
# F-strings
kwargs.pop("remove", f"this {trailing_comma}",)
raise Exception(
"first", extra=f"Add trailing comma here ->"
)
assert False, f"<- This is not a trailing comma"
f"""This is a test. {
"Another sentence."
if True else
"Don't add a trailing comma here ->"
}"""

View File

@@ -148,3 +148,62 @@ for i in range(10):
for i in range(10):
pass # comment
pass
def foo():
print("foo")
...
def foo():
"""A docstring."""
print("foo")
...
for i in range(10):
...
...
for i in range(10):
...
...
for i in range(10):
... # comment
...
for i in range(10):
...
pass
from typing import Protocol
class Repro(Protocol):
def func(self) -> str:
"""Docstring"""
...
def impl(self) -> str:
"""Docstring"""
return self.func()
import abc
class Repro:
@abc.abstractmethod
def func(self) -> str:
"""Docstring"""
...
def impl(self) -> str:
"""Docstring"""
return self.func()
def stub(self) -> str:
"""Docstring"""
...

View File

@@ -38,3 +38,15 @@ class User:
foo: bool = BooleanField()
# ...
bar = StringField() # PIE794
class Person:
name = "Foo"
name = name + " Bar"
name = "Bar" # PIE794
class Person:
name: str = "Foo"
name: str = name + " Bar"
name: str = "Bar" # PIE794

View File

@@ -64,3 +64,9 @@ class FakeEnum10(enum.Enum):
A = enum.auto()
B = enum.auto()
C = enum.auto()
class FakeEnum10(enum.Enum):
A = ...
B = ... # PIE796
C = ... # PIE796

View File

@@ -0,0 +1,7 @@
import enum
class FakeEnum1(enum.Enum):
A = ...
B = ...
C = ...

View File

@@ -1,9 +1,21 @@
{"foo": 1, **{"bar": 1}} # PIE800
{**{"bar": 10}, "a": "b"} # PIE800
foo({**foo, **{"bar": True}}) # PIE800
{**foo, **{"bar": 10}} # PIE800
{ # PIE800
"a": "b",
# Preserve
**{
# all
"bar": 10, # the
# comments
},
}
{**foo, **buzz, **{bar: 10}} # PIE800
{**foo, "bar": True } # OK

View File

@@ -1,27 +1,37 @@
@dataclass
class Foo:
foo: List[str] = field(default_factory=lambda: []) # PIE807
bar: Dict[str, int] = field(default_factory=lambda: {}) # PIE807
class FooTable(BaseTable):
bar = fields.ListField(default=lambda: []) # PIE807
foo = fields.ListField(default=lambda: []) # PIE807
bar = fields.ListField(default=lambda: {}) # PIE807
class FooTable(BaseTable):
bar = fields.ListField(lambda: []) # PIE807
foo = fields.ListField(lambda: []) # PIE807
bar = fields.ListField(default=lambda: {}) # PIE807
@dataclass
class Foo:
foo: List[str] = field(default_factory=list)
bar: Dict[str, int] = field(default_factory=dict)
class FooTable(BaseTable):
bar = fields.ListField(list)
foo = fields.ListField(list)
bar = fields.ListField(dict)
lambda *args, **kwargs: []
lambda *args, **kwargs: {}
lambda *args: []
lambda *args: {}
lambda **kwargs: []
lambda **kwargs: {}
lambda: {**unwrap}

View File

@@ -36,3 +36,54 @@ field10: (Literal[1] | str) | Literal[2] # Error
# Should emit for union in generic parent type.
field11: dict[Literal[1] | Literal[2], str] # Error
# Should emit for unions with more than two cases
field12: Literal[1] | Literal[2] | Literal[3] # Error
field13: Literal[1] | Literal[2] | Literal[3] | Literal[4] # Error
# Should emit for unions with more than two cases, even if not directly adjacent
field14: Literal[1] | Literal[2] | str | Literal[3] # Error
# Should emit for unions with mixed literal internal types
field15: Literal[1] | Literal["foo"] | Literal[True] # Error
# Shouldn't emit for duplicate field types with same value; covered by Y016
field16: Literal[1] | Literal[1] # OK
# Shouldn't emit if in new parent type
field17: Literal[1] | dict[Literal[2], str] # OK
# Shouldn't emit if not in a union parent
field18: dict[Literal[1], Literal[2]] # OK
# Should respect name of literal type used
field19: typing.Literal[1] | typing.Literal[2] # Error
# Should emit in cases with newlines
field20: typing.Union[
Literal[
1 # test
],
Literal[2],
] # Error, newline and comment will not be emitted in message
# Should handle multiple unions with multiple members
field21: Literal[1, 2] | Literal[3, 4] # Error
# Should emit in cases with `typing.Union` instead of `|`
field22: typing.Union[Literal[1], Literal[2]] # Error
# Should emit in cases with `typing_extensions.Literal`
field23: typing_extensions.Literal[1] | typing_extensions.Literal[2] # Error
# Should emit in cases with nested `typing.Union`
field24: typing.Union[Literal[1], typing.Union[Literal[2], str]] # Error
# Should emit in cases with mixed `typing.Union` and `|`
field25: typing.Union[Literal[1], Literal[2] | str] # Error
# Should emit only once in cases with multiple nested `typing.Union`
field24: typing.Union[Literal[1], typing.Union[Literal[2], typing.Union[Literal[3], Literal[4]]]] # Error
# Should use the first literal subscript attribute when fixing
field25: typing.Union[typing_extensions.Literal[1], typing.Union[Literal[2], typing.Union[Literal[3], Literal[4]]], str] # Error

View File

@@ -84,3 +84,6 @@ field25: typing.Union[Literal[1], Literal[2] | str] # Error
# Should emit only once in cases with multiple nested `typing.Union`
field24: typing.Union[Literal[1], typing.Union[Literal[2], typing.Union[Literal[3], Literal[4]]]] # Error
# Should use the first literal subscript attribute when fixing
field25: typing.Union[typing_extensions.Literal[1], typing.Union[Literal[2], typing.Union[Literal[3], Literal[4]]], str] # Error

View File

@@ -3,9 +3,11 @@
import abc
import builtins
import collections.abc
import enum
import typing
from abc import abstractmethod
from abc import ABCMeta, abstractmethod
from collections.abc import AsyncIterable, AsyncIterator, Iterable, Iterator
from enum import EnumMeta
from typing import Any, overload
import typing_extensions
@@ -199,6 +201,31 @@ class AsyncIteratorReturningAsyncIterable:
... # Y045 "__aiter__" methods should return an AsyncIterator, not an AsyncIterable
class MetaclassInWhichSelfCannotBeUsed(type):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed) -> MetaclassInWhichSelfCannotBeUsed: ...
class MetaclassInWhichSelfCannotBeUsed2(EnumMeta):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed2: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed2: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed2: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed2) -> MetaclassInWhichSelfCannotBeUsed2: ...
class MetaclassInWhichSelfCannotBeUsed3(enum.EnumType):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed3: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed3: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed3: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed3) -> MetaclassInWhichSelfCannotBeUsed3: ...
class MetaclassInWhichSelfCannotBeUsed4(ABCMeta):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed4: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed4: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed4: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed4) -> MetaclassInWhichSelfCannotBeUsed4: ...
class Abstract(Iterator[str]):
@abstractmethod
def __iter__(self) -> Iterator[str]:

View File

@@ -3,9 +3,11 @@
import abc
import builtins
import collections.abc
import enum
import typing
from abc import abstractmethod
from abc import ABCMeta, abstractmethod
from collections.abc import AsyncIterable, AsyncIterator, Iterable, Iterator
from enum import EnumMeta
from typing import Any, overload
import typing_extensions
@@ -152,6 +154,30 @@ class AsyncIteratorReturningAsyncIterable:
str
]: ... # Y045 "__aiter__" methods should return an AsyncIterator, not an AsyncIterable
class MetaclassInWhichSelfCannotBeUsed(type):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed) -> MetaclassInWhichSelfCannotBeUsed: ...
class MetaclassInWhichSelfCannotBeUsed2(EnumMeta):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed2: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed2: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed2: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed2) -> MetaclassInWhichSelfCannotBeUsed2: ...
class MetaclassInWhichSelfCannotBeUsed3(enum.EnumType):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed3: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed3: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed3: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed3) -> MetaclassInWhichSelfCannotBeUsed3: ...
class MetaclassInWhichSelfCannotBeUsed4(ABCMeta):
def __new__(cls) -> MetaclassInWhichSelfCannotBeUsed4: ...
def __enter__(self) -> MetaclassInWhichSelfCannotBeUsed4: ...
async def __aenter__(self) -> MetaclassInWhichSelfCannotBeUsed4: ...
def __isub__(self, other: MetaclassInWhichSelfCannotBeUsed4) -> MetaclassInWhichSelfCannotBeUsed4: ...
class Abstract(Iterator[str]):
@abstractmethod
def __iter__(self) -> Iterator[str]: ...

View File

@@ -91,3 +91,17 @@ field27 = list[str]
field28 = builtins.str
field29 = str
field30 = str | bytes | None
# We shouldn't emit Y052 for `enum` subclasses.
from enum import Enum
class Foo(Enum):
FOO = 0
BAR = 1
class Bar(Foo):
BAZ = 2
BOP = 3
class Bop:
WIZ = 4

View File

@@ -98,3 +98,17 @@ field27 = list[str]
field28 = builtins.str
field29 = str
field30 = str | bytes | None
# We shouldn't emit Y052 for `enum` subclasses.
from enum import Enum
class Foo(Enum):
FOO = 0
BAR = 1
class Bar(Foo):
BAZ = 2
BOP = 3
class Bop:
WIZ = 4

View File

@@ -0,0 +1,45 @@
this_should_raise_Q004 = 'This is a \"string\"'
this_should_raise_Q004 = 'This is \\ a \\\"string\"'
this_is_fine = '"This" is a \"string\"'
this_is_fine = "This is a 'string'"
this_is_fine = "\"This\" is a 'string'"
this_is_fine = r'This is a \"string\"'
this_is_fine = R'This is a \"string\"'
this_should_raise_Q004 = (
'This is a'
'\"string\"'
)
# Same as above, but with f-strings
f'This is a \"string\"' # Q004
f'This is \\ a \\\"string\"' # Q004
f'"This" is a \"string\"'
f"This is a 'string'"
f"\"This\" is a 'string'"
fr'This is a \"string\"'
fR'This is a \"string\"'
this_should_raise_Q004 = (
f'This is a'
f'\"string\"' # Q004
)
# Nested f-strings (Python 3.12+)
#
# The first one is interesting because the fix for it is valid pre 3.12:
#
# f"'foo' {'nested'}"
#
# but as the actual string itself is invalid pre 3.12, we don't catch it.
f'\"foo\" {'nested'}' # Q004
f'\"foo\" {f'nested'}' # Q004
f'\"foo\" {f'\"nested\"'} \"\"' # Q004
f'normal {f'nested'} normal'
f'\"normal\" {f'nested'} normal' # Q004
f'\"normal\" {f'nested'} "double quotes"'
f'\"normal\" {f'\"nested\" {'other'} normal'} "double quotes"' # Q004
f'\"normal\" {f'\"nested\" {'other'} "double quotes"'} normal' # Q004
# Make sure we do not unescape quotes
this_is_fine = 'This is an \\"escaped\\" quote'
this_should_raise_Q004 = 'This is an \\\"escaped\\\" quote with an extra backslash'

View File

@@ -0,0 +1,43 @@
this_should_raise_Q004 = "This is a \'string\'"
this_should_raise_Q004 = "'This' is a \'string\'"
this_is_fine = 'This is a "string"'
this_is_fine = '\'This\' is a "string"'
this_is_fine = r"This is a \'string\'"
this_is_fine = R"This is a \'string\'"
this_should_raise_Q004 = (
"This is a"
"\'string\'"
)
# Same as above, but with f-strings
f"This is a \'string\'" # Q004
f"'This' is a \'string\'" # Q004
f'This is a "string"'
f'\'This\' is a "string"'
fr"This is a \'string\'"
fR"This is a \'string\'"
this_should_raise_Q004 = (
f"This is a"
f"\'string\'" # Q004
)
# Nested f-strings (Python 3.12+)
#
# The first one is interesting because the fix for it is valid pre 3.12:
#
# f'"foo" {"nested"}'
#
# but as the actual string itself is invalid pre 3.12, we don't catch it.
f"\'foo\' {"foo"}" # Q004
f"\'foo\' {f"foo"}" # Q004
f"\'foo\' {f"\'foo\'"} \'\'" # Q004
f"normal {f"nested"} normal"
f"\'normal\' {f"nested"} normal" # Q004
f"\'normal\' {f"nested"} 'single quotes'"
f"\'normal\' {f"\'nested\' {"other"} normal"} 'single quotes'" # Q004
f"\'normal\' {f"\'nested\' {"other"} 'single quotes'"} normal" # Q004
# Make sure we do not unescape quotes
this_is_fine = "This is an \\'escaped\\' quote"
this_should_raise_Q004 = "This is an \\\'escaped\\\' quote with an extra backslash"

View File

@@ -134,3 +134,32 @@ with A() as a:
f" something { my_dict["key"] } something else "
f"foo {f"bar {x}"} baz"
# Allow cascading for some statements.
import anyio
import asyncio
import trio
async with asyncio.timeout(1):
async with A() as a:
pass
async with A():
async with asyncio.timeout(1):
pass
async with asyncio.timeout(1):
async with asyncio.timeout_at(1):
async with anyio.CancelScope():
async with anyio.fail_after(1):
async with anyio.move_on_after(1):
async with trio.fail_after(1):
async with trio.fail_at(1):
async with trio.move_on_after(1):
async with trio.move_on_at(1):
pass
# Do not suppress combination if a context manager is already combined with another.
async with asyncio.timeout(1), A():
async with B():
pass

View File

@@ -25,3 +25,11 @@ a = {}.get(key, None)
# SIM910
({}).get(key, None)
# SIM910
ages = {"Tom": 23, "Maria": 23, "Dog": 11}
age = ages.get("Cat", None)
# OK
ages = ["Tom", "Maria", "Dog"]
age = ages.get("Cat", None)

View File

@@ -1,8 +1,7 @@
import trio
from trio import sleep
async def func():
import trio
from trio import sleep
await trio.sleep(0) # TRIO115
await trio.sleep(1) # OK
await trio.sleep(0, 1) # OK
@@ -21,8 +20,16 @@ async def func():
trio.sleep(bar)
trio.sleep(0) # TRIO115
def func():
trio.run(trio.sleep(0)) # TRIO115
from trio import Event, sleep
def func():
trio.run(trio.sleep(0)) # TRIO115
sleep(0) # TRIO115
async def func():
await sleep(seconds=0) # TRIO115

View File

@@ -172,3 +172,14 @@ def f():
from module import Member
x: Member = 1
def f():
from typing_extensions import TYPE_CHECKING
from pandas import y
if TYPE_CHECKING:
_type = x
elif True:
_type = y

View File

@@ -0,0 +1,10 @@
from __future__ import annotations
from typing_extensions import TYPE_CHECKING
if TYPE_CHECKING:
from pandas import DataFrame
def example() -> DataFrame:
pass

View File

@@ -0,0 +1,10 @@
from __future__ import annotations
from typing_extensions import TYPE_CHECKING
if TYPE_CHECKING:
from pandas import DataFrame
def example() -> DataFrame:
x = DataFrame()

View File

@@ -37,3 +37,9 @@ if False:
if 0:
x: List
from typing_extensions import TYPE_CHECKING
if TYPE_CHECKING:
pass # TCH005

View File

@@ -1,11 +1,12 @@
from __future__ import annotations
from collections.abc import Sequence # TCH003
from pandas import DataFrame
from pydantic import BaseModel
class MyBaseClass:
pass
class Parent(BaseModel):
...
class Foo(MyBaseClass):
foo: Sequence
class Child(Parent):
baz: DataFrame

View File

@@ -0,0 +1,34 @@
"""Test module."""
from __future__ import annotations
from functools import singledispatch
from typing import TYPE_CHECKING
from numpy import asarray
from numpy.typing import ArrayLike
from scipy.sparse import spmatrix
from pandas import DataFrame
if TYPE_CHECKING:
from numpy import ndarray
@singledispatch
def to_array_or_mat(a: ArrayLike | spmatrix) -> ndarray | spmatrix:
"""Convert arg to array or leaves it as sparse matrix."""
msg = f"Unhandled type {type(a)}"
raise NotImplementedError(msg)
@to_array_or_mat.register
def _(a: ArrayLike) -> ndarray:
return asarray(a)
@to_array_or_mat.register
def _(a: spmatrix) -> spmatrix:
return a
def _(a: DataFrame) -> DataFrame:
return a

View File

@@ -0,0 +1,9 @@
from __future__ import annotations
from typing_extensions import Self
def func():
from pandas import DataFrame
df: DataFrame

View File

@@ -0,0 +1,9 @@
from __future__ import annotations
import typing_extensions
def func():
from pandas import DataFrame
df: DataFrame

View File

@@ -0,0 +1,5 @@
import encodings
from datetime import timezone as tz
from datetime import timedelta
import datetime as dt
import datetime

View File

@@ -0,0 +1,5 @@
from __future__ import blah
from os import path
import os

View File

@@ -0,0 +1,3 @@
from mediuuuuuuuuuuum import a
from short import b
from loooooooooooooooooooooog import c

View File

@@ -0,0 +1,11 @@
from module1 import (
loooooooooooooong,
σηορτ,
mediuuuuum,
shoort,
looooooooooooooong,
μεδιυυυυυμ,
short,
mediuuuuuum,
λοοοοοοοοοοοοοονγ,
)

View File

@@ -0,0 +1,9 @@
import loooooooooooooong
import mediuuuuuum
import short
import σηορτ
import shoort
import mediuuuuum
import λοοοοοοοοοοοοοονγ
import μεδιυυυυυμ
import looooooooooooooong

View File

@@ -0,0 +1,6 @@
import mediuuuuuum
import short
import looooooooooooooooong
from looooooooooooooong import a
from mediuuuum import c
from short import b

View File

@@ -0,0 +1,4 @@
import mediuuuuuumb
import short
import looooooooooooooooong
import mediuuuuuuma

View File

@@ -0,0 +1,7 @@
from ..looooooooooooooong import a
from ...mediuuuum import b
from .short import c
from ....short import c
from . import d
from .mediuuuum import a
from ......short import b

View File

@@ -0,0 +1,3 @@
from looooooooooooooong import a
from mediuuuum import *
from short import *

View File

@@ -0,0 +1,11 @@
import os
import __main__
import third_party
import first_party
os.a
third_party.a
__main__.a
first_party.a

View File

@@ -0,0 +1,8 @@
from __future__ import annotations
import django.settings
import os
import pytz
import sys
from . import local
from library import foo

View File

@@ -63,3 +63,33 @@ class PosOnlyClass:
def bad_method_pos_only(this, blah, /, self, something: str):
pass
class ModelClass:
@hybrid_property
def bad(cls):
pass
@bad.expression
def bad(self):
pass
@bad.wtf
def bad(cls):
pass
@hybrid_property
def good(self):
pass
@good.expression
def good(cls):
pass
@good.wtf
def good(self):
pass
@foobar.thisisstatic
def badstatic(foo):
pass

View File

@@ -1,6 +1,6 @@
import collections
from collections import namedtuple
from typing import TypeAlias, TypeVar, NewType, NamedTuple, TypedDict
from typing import Type, TypeAlias, TypeVar, NewType, NamedTuple, TypedDict
GLOBAL: str = "foo"
@@ -25,6 +25,8 @@ def assign():
IntOrStr: TypeAlias = int | str
type MyInt = int
def aug_assign(rank, world_size):
global CURRENT_PORT
@@ -38,3 +40,15 @@ def loop_assign():
global CURRENT_PORT
for CURRENT_PORT in range(5):
pass
def model_assign() -> None:
Bad = apps.get_model("zerver", "Stream") # N806
Attachment = apps.get_model("zerver", "Attachment") # OK
Recipient = apps.get_model("zerver", model_name="Recipient") # OK
Address: Type = apps.get_model("zerver", "Address") # OK
from django.utils.module_loading import import_string
Bad = import_string("django.core.exceptions.ValidationError") # N806
ValidationError = import_string("django.core.exceptions.ValidationError") # OK

View File

@@ -50,3 +50,16 @@ import itertools
for i in itertools.product(foo_int): # Ok
pass
for i in list(foo_list): # Ok
foo_list.append(i + 1)
for i in list(foo_list): # PERF101
# Make sure we match the correct list
other_list.append(i + 1)
for i in list(foo_tuple): # Ok
foo_tuple.append(i + 1)
for i in list(foo_set): # Ok
foo_set.append(i + 1)

View File

@@ -89,3 +89,61 @@ x = [ #
f"{ {'a': 1} }"
f"{[ { {'a': 1} } ]}"
f"normal { {f"{ { [1, 2] } }" } } normal"
#: Okay
ham[lower + offset : upper + offset]
#: Okay
ham[(lower + offset) : upper + offset]
#: E203:1:19
ham{lower + offset : upper + offset}
#: E203:1:19
ham[lower + offset : upper + offset]
#: Okay
release_lines = history_file_lines[history_file_lines.index('## Unreleased') + 1: -1]
#: Okay
release_lines = history_file_lines[history_file_lines.index('## Unreleased') + 1 : -1]
#: Okay
ham[1:9], ham[1:9:3], ham[:9:3], ham[1::3], ham[1:9:]
ham[lower:upper], ham[lower:upper:], ham[lower::step]
ham[lower+offset : upper+offset]
ham[: upper_fn(x) : step_fn(x)], ham[:: step_fn(x)]
ham[lower + offset : upper + offset]
#: E201:1:5
ham[ : upper]
#: Okay
ham[lower + offset :: upper + offset]
#: Okay
ham[(lower + offset) :: upper + offset]
#: Okay
ham[lower + offset::upper + offset]
#: E203:1:21
ham[lower + offset : : upper + offset]
#: E203:1:20
ham[lower + offset: :upper + offset]
#: E203:1:20
ham[{lower + offset : upper + offset} : upper + offset]
#: Okay
ham[upper:]
#: Okay
ham[upper :]
#: E202:1:12
ham[upper : ]
#: E203:1:10
ham[upper :]

View File

@@ -0,0 +1,113 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"id": "33faf7ad-a3fd-4ac4-a0c3-52e507ed49df",
"metadata": {},
"outputs": [],
"source": [
"import sys\n",
"\n",
"sys.path"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1331140f-2741-4661-9086-0764368710c9",
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "code",
"execution_count": null,
"id": "a4113383-725d-4f04-80b8-a3080b2b8c4b",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"os.path\n",
"\n",
"import pathlib"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a5d2ef63-ae60-4311-bae3-42e845afba3f",
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "code",
"execution_count": null,
"id": "79599475-a5ee-4f60-80d1-6efa77693da0",
"metadata": {},
"outputs": [],
"source": [
"import a\n",
"\n",
"try:\n",
" import b\n",
"except ImportError:\n",
" pass\n",
"else:\n",
" pass\n",
"\n",
"__some__magic = 1\n",
"\n",
"import c"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "863dcc35-5c8d-4d05-8b4a-91059e944112",
"metadata": {},
"outputs": [],
"source": [
"import ok\n",
"\n",
"\n",
"def foo() -> None:\n",
" import e\n",
"\n",
"\n",
"import no_ok"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6b2377d0-b814-4057-83ec-d443d8e19401",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python (ruff-playground)",
"language": "python",
"name": "ruff-playground"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

Some files were not shown because too many files have changed in this diff