Compare commits

...

129 Commits

Author SHA1 Message Date
Brent Westbrook
47a5543109 chore: stabilize RUF059 2025-06-05 12:33:28 -04:00
Andrew Gallant
55100209c7 [ty] IDE: add support for object.<CURSOR> completions (#18468)
This PR adds logic for detecting `Name Dot [Name]` token patterns,
finding the corresponding `ExprAttribute`, getting the type of the
object and returning the members available on that object.
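
For illustration, a hypothetical completion request of this shape (not
taken from the PR) would now be resolved through the object's type:

```py
import math

math.<CURSOR>  # completions now list the members of `math`: `sqrt`, `pi`, ...
```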

Here's a video demonstrating this working:

https://github.com/user-attachments/assets/42ce78e8-5930-4211-a18a-fa2a0434d0eb

Ref astral-sh/ty#86
2025-06-05 11:15:19 -04:00
chiri
c0bb83b882 [perflint] fix missing parentheses for lambda and ternary conditions (PERF401, PERF403) (#18412)
Closes #18405
2025-06-05 09:57:08 -05:00
Brent Westbrook
74a4e9af3d Combine lint and syntax error handling (#18471)
## Summary

This is a spin-off from
https://github.com/astral-sh/ruff/pull/18447#discussion_r2125844669 to
avoid using `Message::noqa_code` to differentiate between lints and
syntax errors. I went through all of the calls on `main` and on the
branch from #18447, and the instance in `ruff_server` noted in the
linked comment was actually the primary place where this was being done.
Other calls to `noqa_code` are typically some variation of
`message.noqa_code().map_or(String::new, format!(...))`, with the major
exception of the gitlab output format:


a120610b5b/crates/ruff_linter/src/message/gitlab.rs (L93-L105)

which obviously assumes that `None` means syntax error. A simple fix
here would be to use `message.name()` for `check_name` instead of the
noqa code, but I'm not sure how breaking that would be. In that case,
this could just be:

```rust
 let description = message.body();
 let description = description.strip_prefix("SyntaxError: ").unwrap_or(description).to_string();
 let check_name = message.name();
```

This sounds reasonable based on the [Code Quality report
format](https://docs.gitlab.com/ci/testing/code_quality/#code-quality-report-format)
docs:

> | Name | Type | Description |
> |------|------|-------------|
> | `check_name` | String | A unique name representing the check, or rule, associated with this violation. |

## Test Plan

Existing tests
2025-06-05 12:50:02 +00:00
Alex Waygood
8485dbb324 [ty] Fix --python argument for Windows, and improve error messages for bad --python arguments (#18457)
## Summary

Fixes https://github.com/astral-sh/ty/issues/556.

On Windows, system installations have different layouts to virtual
environments. In Windows virtual environments, the Python executable is
found at `<sys.prefix>/Scripts/python.exe`. But in Windows system
installations, the Python executable is found at
`<sys.prefix>/python.exe`. That means that Windows users were able to
point to Python executables inside virtual environments with the
`--python` flag, but they weren't able to point to Python executables
inside system installations.

This PR fixes that issue. It also makes a couple of other changes:
- Nearly all `sys.prefix` resolution is moved inside `site_packages.rs`.
That was the original design of the `site-packages` resolution logic,
but features implemented since the initial implementation have added
some resolution and validation to `resolver.rs` inside the module
resolver. That means that we've ended up with a somewhat confusing code
structure and a situation where several checks are unnecessarily
duplicated between the two modules.
- I noticed that we had quite bad error messages if you e.g. pointed to
a path that didn't exist on disk with `--python` (we just gave a
somewhat impenetrable message saying that we "failed to canonicalize"
the path). I improved the error messages here and added CLI tests for
`--python` and the `environment.python` configuration setting.

## Test Plan

- Existing tests pass
- Added new CLI tests
- I manually checked that virtual-environment discovery still works if
no configuration is given
- Micha did some manual testing to check that pointing `--python` to a
system-installation executable now works on Windows
2025-06-05 08:19:15 +01:00
Shunsuke Shibayama
0858896bc4 [ty] type narrowing by attribute/subscript assignments (#18041)
## Summary

This PR partially solves https://github.com/astral-sh/ty/issues/164
(derived from #17643).

Currently, the definitions we manage are limited to those for simple
name (symbol) targets, but we expand this to track definitions for
attribute and subscript targets as well.

This was originally planned as part of the work in #17643, but the
changes are significant, so I made it a separate PR.
After merging this PR, I will reflect these changes in #17643.

There is still some incomplete work remaining, but the basic features
have been implemented, so I am publishing it as a draft PR.
Here is the TODO list (there may be more to come):
* [x] Complete rewrite and refactoring of documentation (removing
`Symbol` and replacing it with `Place`)
* [x] More thorough testing
* [x] Consolidation of duplicated code (maybe we can consolidate the
handling related to name, attribute, and subscript)

This PR replaces the current `Symbol` API with the `Place` API, which is
a concept that includes attributes and subscripts (the term is borrowed
from Rust).
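
As a minimal sketch of the kind of narrowing this enables (a
hypothetical example; the real cases are in the new mdtest suite):

```py
class C:
    attr: int | None = None

c = C()
c.attr = 1
# The assignment to the attribute target `c.attr` is now tracked as a
# definition, so the type can be narrowed here:
reveal_type(c.attr)  # revealed: Literal[1]
```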

## Test Plan

`mdtest/narrow/assignment.md` is added.

---------

Co-authored-by: David Peter <sharkdp@users.noreply.github.com>
Co-authored-by: Carl Meyer <carl@astral.sh>
2025-06-04 17:24:27 -07:00
Alex Waygood
ce8b744f17 [ty] Only calculate information for unresolved-reference subdiagnostic if we know we'll emit the diagnostic (#18465)
## Summary

This optimizes some of the logic added in
https://github.com/astral-sh/ruff/pull/18444. In general, we only
calculate information for subdiagnostics if we know we'll actually emit
the diagnostic. The check to see whether we'll emit the diagnostic is
work we'll definitely have to do, whereas the work to gather
information for a subdiagnostic isn't work we necessarily have to do if
the diagnostic isn't going to be emitted at all.

This PR makes us lazier about gathering the information we need for the
subdiagnostic, and moves all the subdiagnostic logic into one function
rather than having some `unresolved-reference` subdiagnostic logic in
`infer.rs` and some in `diagnostic.rs`.

## Test Plan

`cargo test -p ty_python_semantic`
2025-06-04 20:41:00 +01:00
Alex Waygood
5a8cdab771 [ty] Only consider a type T a subtype of a protocol P if all of P's members are fully bound on T (#18466)
## Summary

Fixes https://github.com/astral-sh/ty/issues/578

## Test Plan

mdtests
2025-06-04 19:39:14 +00:00
Alex Waygood
3a8191529c [ty] Exclude members starting with _abc_ from a protocol interface (#18467)
## Summary

As well as excluding a hardcoded set of special attributes, CPython at
runtime also excludes any attributes or declarations starting with
`_abc_` from the set of members that make up a protocol interface. I
missed this in my initial implementation.

This is a bit of a CPython implementation detail, but I do think it's
important that we try to model the runtime as best we can here. The
closer we are to the runtime behaviour, the closer we come to sound
behaviour when narrowing types from `isinstance()` checks against
runtime-checkable protocols (for example).

## Test Plan

Extended an existing mdtest
2025-06-04 20:34:09 +01:00
lipefree
e658778ced [ty] Add subdiagnostic suggestion to unresolved-reference diagnostic when variable exists on self (#18444)
## Summary

Closes https://github.com/astral-sh/ty/issues/502.

In the following example:
```py
class Foo:
    x: int

    def method(self):
        y = x
```
The user may have intended to use `y = self.x` in `method`.

This is now added as a subdiagnostic in the following form:

`info: An attribute with the same name as 'x' is defined, consider using
'self.x'`

## Test Plan

Added mdtest with snapshot diagnostics.
2025-06-04 08:13:50 -07:00
David Peter
f1883d71a4 [ty] IDE: only provide declarations and bindings as completions (#18456)
## Summary

Previously, all symbols were provided as possible completions. In an
example like the following, both `foo` and `f` were suggested as
completions, because `f` itself is a symbol.
```py
foo = 1

f<CURSOR>
```
Similarly, in the following example, `hidden_symbol` was suggested, even
though it is not statically visible:
```py
if 1 + 2 != 3:
    hidden_symbol = 1

hidden_<CURSOR>
```

With the change suggested here, we only use statically visible
declarations and bindings as a source for completions.


## Test Plan

- Updated snapshot tests
- New test for statically hidden definitions
- Added test for star import
2025-06-04 16:11:05 +02:00
David Peter
11db567b0b [ty] ty_ide: Hotfix for expression_scope_id panics (#18455)
## Summary

Implement a hotfix for the playground/LSP crashes related to missing
`expression_scope_id`s.

relates to: https://github.com/astral-sh/ty/issues/572

## Test Plan

* Regression tests from https://github.com/astral-sh/ruff/pull/18441
* Ran the playground locally to check if panics occur / completions
still work.

---------

Co-authored-by: Andrew Gallant <andrew@astral.sh>
2025-06-04 10:39:16 +02:00
David Peter
9f8c3de462 [ty] Improve docs for Class{Literal,Type}::instance_member (#18454)
## Summary

Mostly just refer to `Type::instance_member` which has much more
details.
2025-06-04 09:55:45 +02:00
David Peter
293d4ac388 [ty] Add meta-type tests for legacy TypeVars (#18453)
## Summary

Follow up to the comment by @dcreager
[here](https://github.com/astral-sh/ruff/pull/18439#discussion_r2123802784).
2025-06-04 07:44:44 +00:00
Carl Meyer
9e8a7e9353 update to salsa that doesn't panic silently on cycles (#18450) 2025-06-04 07:40:16 +02:00
Dhruv Manilawala
453e5f5934 [ty] Add tests for empty list/tuple unpacking (#18451)
## Summary

This PR is to address this comment:
https://github.com/astral-sh/ruff/pull/18438#issuecomment-2935344415

## Test Plan

Run mdtest
2025-06-04 02:40:26 +00:00
Dhruv Manilawala
7ea773daf2 [ty] Argument type expansion for overload call evaluation (#18382)
## Summary

Part of astral-sh/ty#104, closes: astral-sh/ty#468

This PR implements the argument type expansion which is step 3 of the
overload call evaluation algorithm.

Specifically, this step needs to be taken if type checking resolves to
no matching overload and there are argument types that can be expanded.
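
For example, in a sketch like the following (illustrative, not from the
PR's test suite), no single overload accepts `int | str`, but expanding
the union lets each element match one overload:

```py
from typing import overload

@overload
def f(x: int) -> int: ...
@overload
def f(x: str) -> str: ...
def f(x: int | str) -> int | str:
    return x

def g(x: int | str):
    # Without expansion this would be a false `no-matching-overload`
    # error; with it, `int` and `str` each match an overload and the
    # return types are unioned:
    reveal_type(f(x))  # revealed: int | str
```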

## Test Plan

Add new test cases.

## Ecosystem analysis

This PR removes 174 `no-matching-overload` false positives -- I looked
at a lot of them and they all are false positives.

One thing that I'm not able to understand is that in
2b7e3adf27/sphinx/ext/autodoc/preserve_defaults.py (L179)
the inferred type of `value` is `str | None` by ty and Pyright, which is
correct, but it's only ty that raises `invalid-argument-type` error
while Pyright doesn't. The constructor method of `DefaultValue` has
declared type of `str` which is invalid.

There are few cases of false positives resulting due to the fact that ty
doesn't implement narrowing on attribute expressions.
2025-06-04 02:12:00 +00:00
Alex Waygood
0079cc6817 [ty] Minor cleanup for site-packages discovery logic (#18446) 2025-06-03 18:49:14 +00:00
Matthew Mckee
e8ea40012a [ty] Add generic inference for dataclasses (#18443)
## Summary

An issue seen here https://github.com/astral-sh/ty/issues/500

The `__init__` method of dataclasses had no inherited generic context,
so we could not infer the type of an instance from a constructor call
with generics.
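
A sketch of the pattern this fixes (hypothetical example):

```py
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")

@dataclass
class Box(Generic[T]):
    value: T

# The synthesized `__init__` now inherits the class's generic context,
# so `T` is inferred from the constructor call:
box = Box(value=1)
reveal_type(box)  # revealed: Box[int]
```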

## Test Plan

Add tests to `classes.md` in the generics folder
2025-06-03 09:59:43 -07:00
Abhijeet Prasad Bodas
71d8a5da2a [ty] dataclasses: Allow using dataclasses.dataclass as a function. (#18440)
## Summary

Part of https://github.com/astral-sh/ty/issues/111

Using `dataclass` as a function, instead of as a decorator, did not
work as expected prior to this. Fix that by modifying the dataclass
overload's return type.
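
A sketch of the now-working pattern (hypothetical example):

```py
from dataclasses import dataclass

class Point:
    x: int
    y: int

# `dataclass` used as a plain function rather than as a decorator:
PointDC = dataclass(Point)
p = PointDC(x=1, y=2)  # previously this constructor call was not understood
```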

## Test Plan

New mdtests, fixing the existing TODO.
2025-06-03 09:50:29 -07:00
Douglas Creager
2c3b3d3230 [ty] Create separate FunctionLiteral and FunctionType types (#18360)
This updates our representation of functions to more closely match our
representation of classes.

The new `OverloadLiteral` and `FunctionLiteral` classes represent a
function definition in the AST. If a function is generic, this is
unspecialized. `FunctionType` has been updated to represent a function
type, which is specialized if the function is generic. (These names are
chosen to match `ClassLiteral` and `ClassType` on the class side.)

This PR does not add a separate `Type` variant for `FunctionLiteral`.
Maybe we should? Possibly as a follow-on PR?

Part of https://github.com/astral-sh/ty/issues/462

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-03 10:59:31 -04:00
Dhruv Manilawala
8d98c601d8 [ty] Infer list[T] when unpacking non-tuple type (#18438)
## Summary

Follow-up from #18401, I was looking at whether that would fix the issue
at https://github.com/astral-sh/ty/issues/247#issuecomment-2917656676
and it didn't, which made me realize that the PR only inferred `list[T]`
when the value type was tuple but it could be other types as well.

This PR fixes the actual issue by inferring `list[T]` for the non-tuple
type case.

## Test Plan

Add test cases for starred expression involved with non-tuple type. I
also added a few test cases for list type and list literal.

I also verified that the example in the linked issue comment works:
```py
def _(line: str):
    a, b, *c = line.split(maxsplit=2)
    c.pop()
```
2025-06-03 19:17:47 +05:30
David Peter
0986edf427 [ty] Meta-type of type variables should be type[..] (#18439)
## Summary

Came across this while debugging some ecosystem changes in
https://github.com/astral-sh/ruff/pull/18347. I think the meta-type of a
typevar-annotated variable should be equal to `type`, not `<class
'object'>`.
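
A sketch of the expected behavior (hypothetical example):

```py
from typing import TypeVar

T = TypeVar("T")

def f(x: T) -> T:
    # The meta-type of a typevar-annotated variable should be
    # `type[T]` (and thus assignable to `type`), not `<class 'object'>`:
    reveal_type(type(x))  # revealed: type[T]
    return x
```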

## Test Plan

New Markdown tests.
2025-06-03 15:22:00 +02:00
chiri
03f1f8e218 [pyupgrade] Make fix unsafe if it deletes comments (UP050) (#18390)
## Summary
Closes #18387

## Test Plan

Update snapshots.
2025-06-03 09:10:15 -04:00
chiri
628bb2cd1d [pyupgrade] Make fix unsafe if it deletes comments (UP004) (#18393)
## Summary
See https://github.com/astral-sh/ruff/issues/18387#issuecomment-2923039331.

## Test Plan

Update snapshots.
2025-06-03 09:09:33 -04:00
lipefree
f23d2c9b9e [ty] Support using legacy typing aliases for generic classes in type annotations (#18404)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-06-03 12:09:51 +01:00
Micha Reiser
67d94d9ec8 Use ty's completions in playground (#18425) 2025-06-03 10:11:39 +02:00
otakutyrant
d1cb8e2142 Update editor setup docs about Neovim and Vim (#18324)
## Summary

I struggled to make ruff_organize_imports work, and then I found out
that I had missed the key note about conform.nvim because it was
wrongly placed in the Vim section. So I refined both sections.

---------

Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2025-06-03 07:40:22 +00:00
renovate[bot]
57202c1c77 Update NPM Development dependencies (#18423) 2025-06-03 08:06:56 +02:00
Dhruv Manilawala
2289187b74 Infer list[T] for starred target in unpacking (#18401)
## Summary

Closes: astral-sh/ty#191
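
A sketch of the inference (hypothetical example; exact output may differ):

```py
def _(t: tuple[int, str, str]):
    a, *rest = t
    reveal_type(a)     # revealed: int
    reveal_type(rest)  # revealed: list[str]
```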

## Test Plan

Update existing tests.
2025-06-03 07:25:07 +05:30
Robsdedude
14c42a8ddf [refurb] Mark FURB180 fix unsafe when class has bases (#18149)
## Summary

Mark `FURB180`'s fix as unsafe if the class already has base classes.
This is because the base classes might validate the other base classes
(like `typing.Protocol` does) or otherwise alter runtime behavior if
more base classes are added.
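
A sketch of why this matters (hypothetical example):

```py
import abc

class Base:
    def __init_subclass__(cls, **kwargs):
        # A base class can inspect or validate `cls.__bases__` here.
        super().__init_subclass__(**kwargs)

# FURB180: prefer inheriting from `abc.ABC` over `metaclass=abc.ABCMeta`.
class A(metaclass=abc.ABCMeta): ...       # fix is safe: no other bases

# The fix would change the base list to `class B(Base, abc.ABC)`, which
# `Base.__init_subclass__` (or a base like `typing.Protocol`) can
# observe, so the fix is now marked unsafe:
class B(Base, metaclass=abc.ABCMeta): ...
```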

## Test Plan

The existing snapshot test covers this case already.

## References

Partially addresses https://github.com/astral-sh/ruff/issues/13307 (a
way to permit certain exceptions was left out).

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-06-03 00:51:09 +00:00
Denys Kyslytsyn
e677863787 [fastapi] Avoid false positive for class dependencies (FAST003) (#18271)
## Summary

Closes #17226.

This PR updates the `FAST003` rule to correctly handle [FastAPI class
dependencies](https://fastapi.tiangolo.com/tutorial/dependencies/classes-as-dependencies/).
Specifically, if a path parameter is declared in either:

- a `pydantic.BaseModel` used as a dependency, or  
- the `__init__` method of a class used as a dependency,  

then `FAST003` will no longer incorrectly report it as unused.

FastAPI allows a shortcut when using annotated class dependencies -
`Depends` can be called without arguments, e.g.:

```python
class MyParams(BaseModel):
    my_id: int

@router.get("/{my_id}")
def get_id(params: Annotated[MyParams, Depends()]): ...
```
This PR ensures that such usage is properly supported by the linter.

Note: Support for dataclasses is not included in this PR. Let me know if
you’d like it to be added.

## Test Plan

Added relevant test cases to the `FAST003.py` fixture.
2025-06-02 14:34:50 -04:00
lipefree
f379eb6e62 [ty] Treat lambda functions as instances of types.FunctionType (#18431) 2025-06-02 16:46:26 +01:00
Alex Waygood
47698883ae [ty] Fix false positives for legacy ParamSpecs inside Callable type expressions (#18426) 2025-06-02 14:10:00 +01:00
Alex Waygood
e2d96df501 [ty] Improve diagnostics if the user attempts to import a stdlib module that does not exist on their configured Python version (#18403) 2025-06-02 10:52:26 +00:00
renovate[bot]
384e80ec80 Update taiki-e/install-action action to v2.52.4 (#18420)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-02 09:03:32 +02:00
renovate[bot]
b9f3b0e0a6 Update docker/build-push-action action to v6.18.0 (#18422)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-02 09:03:09 +02:00
Micha Reiser
1e6d76c878 [ty] Fix server hang after shutdown request (#18414) 2025-06-02 06:57:51 +00:00
renovate[bot]
844c8626c3 Update Rust crate libcst to v1.8.0 (#18424) 2025-06-02 07:40:18 +02:00
renovate[bot]
1c8d9d707e Update Rust crate clap to v4.5.39 (#18419) 2025-06-02 07:39:27 +02:00
renovate[bot]
4856377478 Update cargo-bins/cargo-binstall action to v1.12.6 (#18416) 2025-06-02 07:38:57 +02:00
renovate[bot]
643c845a47 Update dependency mdformat-mkdocs to v4.3.0 (#18421) 2025-06-02 07:38:36 +02:00
renovate[bot]
9e952cf0e0 Update pre-commit dependencies (#18418) 2025-06-02 07:38:10 +02:00
renovate[bot]
c4015edf48 Update dependency ruff to v0.11.12 (#18417) 2025-06-02 07:37:56 +02:00
Matthew Mckee
97b824db3e [ty] Ensure Literal types are considered assignable to anything their Instance supertypes are assignable to (#18351) 2025-06-01 16:39:56 +01:00
Micha Reiser
220ab88779 [ty] Promote projects to good that now no longer hang (#18370) 2025-06-01 17:25:46 +02:00
github-actions[bot]
7a63ac145a Sync vendored typeshed stubs (#18407)
Co-authored-by: typeshedbot <>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-06-01 15:21:18 +01:00
Micha Reiser
54f597658c [ty] Fix multithreading related hangs and panics (#18238) 2025-06-01 11:07:55 +02:00
Ibraheem Ahmed
aa1fad61e0 Support relative --ty-path in ty-benchmark (#18385)
## Summary

This currently doesn't work because the benchmark changes the working
directory. Also updates the process name to make it easier to compare
two local ty binaries.
2025-05-30 18:19:20 -04:00
Alex Waygood
b390b3cb8e [ty] Update docs for Python version inference (#18397) 2025-05-30 22:45:28 +01:00
Zanie Blue
88866f0048 [ty] Infer the Python version from the environment if feasible (#18057)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-05-30 21:22:51 +00:00
Dylan
9bbf4987e8 Implement template strings (#17851)
This PR implements template strings (t-strings) in the parser and
formatter for Ruff.

Minimal changes necessary to compile were made in other parts of the code (e.g. ty, the linter, etc.). These will be covered properly in follow-up PRs.
2025-05-30 15:00:56 -05:00
Carl Meyer
ad024f9a09 [ty] support callability of bound/constrained typevars (#18389)
## Summary

Allow a typevar to be callable if it is bound to a callable type, or
constrained to callable types.

I spent some time digging into why this support didn't fall out
naturally, and ultimately the reason is that we look up `__call__` on
the meta type (since it's a dunder), and our implementation of
`Type::to_meta_type` for `Type::Callable` does not return a type with
`__call__`.

A more general solution here would be to have `Type::to_meta_type` for
`Type::Callable` synthesize a protocol with `__call__` and return an
intersection with that protocol (since for a type to be callable, we
know its meta-type must have `__call__`). That solution could in
principle also replace the special-case handling of `Type::Callable`
itself, here in `Type::bindings`. But that more general approach would
also be slower, and our protocol support isn't quite ready for that yet,
and handling this directly in `Type::bindings` is really not bad.
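
A sketch of the now-supported pattern (hypothetical example):

```py
from typing import Callable, TypeVar

T = TypeVar("T", bound=Callable[[], int])

def call_it(f: T) -> int:
    # `T` is bound to a callable type, so calling it is allowed
    # (previously rejected as non-callable):
    return f()
```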

Fixes https://github.com/astral-sh/ty/issues/480

## Test Plan

Added mdtests.
2025-05-30 12:01:51 -07:00
Andrew Gallant
fc549bda94 [ty] Minor tweaks to "list all members" docs and tests (#18388)
Ref https://github.com/astral-sh/ruff/pull/18251#pullrequestreview-2881810681
2025-05-30 13:36:57 -04:00
Alex Waygood
77c8ddf101 [ty] Fix broken property tests for disjointness (#18384) 2025-05-30 16:49:20 +01:00
David Peter
e730f27f80 [ty] List available members for a given type (#18251)
This PR adds initial support for listing all attributes of
an object. It is exposed through a new `all_members`
routine in `ty_extensions`, which is in turn used to test
the functionality.

The purpose of listing all members is for code
completion. That is, given a `object.<CURSOR>`, we
would like to list all available attributes on
`object`.
2025-05-30 11:24:20 -04:00
Wei Lee
d65bd69963 [airflow] Add unsafe fix for module moved cases (AIR312) (#18363)
## Summary

Follow up on https://github.com/astral-sh/ruff/pull/18093 and apply it
to AIR312

## Test Plan

The existing test fixtures have been updated
2025-05-30 09:36:20 -04:00
Brent Westbrook
c713e76e4d Add a SourceFile to OldDiagnostic (#18356)
Summary
--

This is the last main difference between the `OldDiagnostic` and
`Message` types, so attaching a `SourceFile` to `OldDiagnostic` should
make combining the two types almost trivial.

Initially I updated the remaining rules without access to a `Checker`
to take a `&SourceFile` directly, but after Micha's suggestion in
https://github.com/astral-sh/ruff/pull/18356#discussion_r2113281552, I
updated all of these calls to take a `LintContext` instead. This new
type is a thin wrapper around a `RefCell<Vec<OldDiagnostic>>` and a
`SourceFile`, and it now has the `report_diagnostic` method returning a
`DiagnosticGuard`, rather than `Checker`. This allows the same
`Drop`-based implementation to be used in cases without a `Checker` and
also avoids a lot of intermediate allocations of `Vec<OldDiagnostic>`s.

`Checker` now also contains a `LintContext`, which it defers to for its
`report_diagnostic` methods, which I preserved for convenience.

Test Plan
--

Existing tests
2025-05-30 13:34:38 +00:00
Micha Reiser
8005ebb405 Update salsa past generational id change (#18362) 2025-05-30 15:31:33 +02:00
Wei Lee
0c29e258c6 [airflow] Add unsafe fix for module moved cases (AIR311) (#18366)
## Summary

Follow up on https://github.com/astral-sh/ruff/pull/18093 and apply it
to AIR311

---

Rules fixed
* `airflow.models.datasets.expand_alias_to_datasets` →
`airflow.models.asset.expand_alias_to_assets`
* `airflow.models.baseoperatorlink.BaseOperatorLink` →
`airflow.sdk.BaseOperatorLink`


## Test Plan

The existing test fixtures have been updated
2025-05-30 09:27:14 -04:00
Wei Lee
b5b6b657cc [airflow] Add unsafe fix for module moved cases (AIR301) (#18367)
## Summary

Follow up on https://github.com/astral-sh/ruff/pull/18093 and apply it
to AIR301

## Test Plan

The existing test fixtures have been updated
2025-05-30 08:46:39 -04:00
Alex Waygood
ad2f667ee4 [ty] Improve tests for site-packages discovery (#18374)
## Summary

- Convert tests demonstrating our resilience to malformed/absent
`version` fields in `pyvenv.cfg` files to mdtests. Also make them more
expansive.
- Convert the regression test I added in
https://github.com/astral-sh/ruff/pull/18157 to an mdtest
- Add comments next to unit tests that cannot be converted to mdtests
(but where it's not obvious why they can't) so I don't have to do this
exercise again 😄
- In `site_packages.rs`, factor out the logic for figuring out where we
expect the system-installation `site-packages` to be. Currently we have
the same logic twice.

## Test Plan

`cargo test -p ty_python_semantic`
2025-05-30 07:32:21 +01:00
Carl Meyer
363f061f09 [ty] _typeshed.Self is not a special form (#18377)
## Summary

This change was based on a mis-reading of a comment in typeshed, and a
wrong assumption about what was causing a test failure in a prior PR.
Reverting it doesn't cause any tests to fail.

## Test Plan

Existing tests.
2025-05-29 17:11:13 -07:00
InSync
9b0dfc505f [ty] Callable types are disjoint from non-callable @final nominal instance types (#18368)
## Summary

Resolves [#513](https://github.com/astral-sh/ty/issues/513).

Callable types are now considered to be disjoint from nominal instance
types where:

* The class is `@final`, and
* Its `__call__` either does not exist or is not assignable to `(...) ->
Unknown`.
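
A sketch of what the disjointness enables (hypothetical example):

```py
from typing import Callable, final

@final
class NotCallable:
    pass

def f(x: Callable[[], int]):
    # `NotCallable` is `@final` and defines no `__call__`, so it is
    # disjoint from every callable type; the narrowed intersection
    # should therefore be `Never`:
    if isinstance(x, NotCallable):
        reveal_type(x)  # revealed: Never
```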

## Test Plan

Markdown tests.

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-05-29 23:27:27 +00:00
lipefree
695de4f27f [ty] Add diagnosis for function with no return statement but with return type annotation (#18359)
## Summary

Partially implement https://github.com/astral-sh/ty/issues/538, 
```py
from pathlib import Path

def setup_test_project(registry_name: str, registry_url: str, project_dir: str) -> Path:
    pyproject_file = Path(project_dir) / "pyproject.toml"
    pyproject_file.write_text("...", encoding="utf-8")
```
As no return statement is defined in the function `setup_test_project`,
whose annotated return type is `Path`, we provide the following
diagnostic:

- error[invalid-return-type]: Function **always** implicitly returns
`None`, which is not assignable to return type `Path`

with a subdiagnostic:
- note: Consider changing your return annotation to `-> None` or adding a `return` statement
 
## Test Plan

mdtests with snapshots to capture the subdiagnostic. I have to mention
that existing snapshots were modified since they now fall in this
category.

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-05-29 23:17:18 +00:00
Wei Lee
3445d1322d [airflow] Add unsafe fix module moved cases (AIR302) (#18093)
## Summary

Add utility functions `generate_import_edit` and
`generate_remove_and_runtime_import_edit` to generate the fix needed for
the airflow rules.

1. `generate_import_edit` is for cases where the member name has
changed (e.g., `airflow.datasets.Dataset` to `airflow.sdk.Asset`). It's
just extracted from the original logic.
2. `generate_remove_and_runtime_import_edit` is for cases where the
member name has not changed (e.g.,
`airflow.operators.pig_operator.PigOperator` to
`airflow.providers.apache.pig.hooks.pig.PigCliHook`). This is newly
introduced. As it introduces a runtime import, I mark it as an unsafe
fix. Under the hood, it tries to find the original import statement,
remove it, and add a new import.

---

Rule fixed:
* `airflow.sensors.external_task_sensor.ExternalTaskSensorLink` →
`airflow.providers.standard.sensors.external_task.ExternalDagLink`
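
A before/after sketch of the new unsafe fix, using the rule listed above:

```py
# Before (AIR302):
from airflow.sensors.external_task_sensor import ExternalTaskSensorLink

# After the unsafe fix, the old import is removed and the replacement
# symbol is imported instead:
from airflow.providers.standard.sensors.external_task import ExternalDagLink
```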

## Test Plan

The existing test fixtures have been updated
2025-05-29 16:30:40 -04:00
Brent Westbrook
2c3f091e0e Rename ruff_linter::Diagnostic to OldDiagnostic (#18355)
Summary
--

It's a bit late in the refactoring process, but I think there are still
a couple of PRs left before getting rid of this type entirely, so I
thought it would still be worth doing.

This PR is just a quick rename with no other changes.

Test Plan
--

Existing tests
2025-05-29 15:04:31 -04:00
Marcus Näslund
9d3cad95bc [refurb] Add coverage of set and frozenset calls (FURB171) (#18035)
## Summary

Adds coverage of using `set(...)` in addition to `{...}` in
`SingleItemMembershipTest`.

Fixes #15792
(and replaces the old PR #15793)

## Test Plan

Updated unit test and snapshot.

Steps to reproduce are in the issue linked above.

2025-05-29 14:59:49 -04:00
Alex Waygood
7df79cfb70 Add offset method to ruff_python_trivia::Cursor (#18371) 2025-05-29 16:08:15 +01:00
Andrew Gallant
33ed502edb ty_ide: improve completions by using scopes
Previously, completions were based on just returning every identifier
parsed in the current Python file. In this commit, we change it to
identify an expression under the cursor and then return all symbols
available to the scope containing that expression.

This is still returning too much, and also, in some cases, not enough.
Namely, it doesn't really take the specific context into account other
than scope. But this does improve on the status quo. For example:

    def foo(): ...
    def bar():
        def fast(): ...
    def foofoo(): ...

    f<CURSOR>

When asking for completions here, the LSP will no longer include `fast`
as a possible completion in this context.

Ref https://github.com/astral-sh/ty/issues/86
2025-05-29 10:31:30 -04:00
Andrew Gallant
a827b16ebd ruff_python_parser: add Tokens::before method
This is analogous to the existing `Tokens::after` method. Its
implementation is almost identical.

We plan to use this for looking at the tokens immediately before the
cursor when fetching completions.
2025-05-29 10:31:30 -04:00
Alex Waygood
47a2ec002e [ty] Split Type::KnownInstance into two type variants (#18350) 2025-05-29 14:47:55 +01:00
Brent Westbrook
aee3af0f7a Bump 0.11.12 (#18369) 2025-05-29 09:17:12 -04:00
Victor Hugo Gomes
04dc48e17c [refurb] Fix FURB129 autofix generating invalid syntax (#18235)
## Summary

Fixes #18231

## Test Plan

Snapshot tests
2025-05-28 17:01:03 -04:00
vjurczenia
27743efa1b [pylint] Implement missing-maxsplit-arg (PLC0207) (#17454)
## Summary

Implements  `use-maxsplit-arg` (`PLC0207`)

https://pylint.readthedocs.io/en/latest/user_guide/messages/convention/use-maxsplit-arg.html
> Emitted when accessing only the first or last element of str.split().
The first and last element can be accessed by using str.split(sep,
maxsplit=1)[0] or str.rsplit(sep, maxsplit=1)[-1] instead.

This is part of https://github.com/astral-sh/ruff/issues/970
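
For example (an illustrative sketch):

```py
url = "www.example.com"

first = url.split(".")[0]               # PLC0207: splits the whole string
first = url.split(".", maxsplit=1)[0]   # preferred

last = url.split(".")[-1]               # PLC0207
last = url.rsplit(".", maxsplit=1)[-1]  # preferred
```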

## Test Plan

`cargo test`

Additionally compared Ruff output to Pylint:
```
pylint --disable=all --enable=use-maxsplit-arg crates/ruff_linter/resources/test/fixtures/pylint/missing_maxsplit_arg.py

cargo run -p ruff -- check crates/ruff_linter/resources/test/fixtures/pylint/missing_maxsplit_arg.py --no-cache --select PLC0207
```

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-05-28 20:46:30 +00:00
Matthew Mckee
c60b4d7f30 [ty] Add subtyping between Callable types and class literals with __init__ (#17638)
## Summary

Allow classes with `__init__` to be subtypes of `Callable`

Fixes https://github.com/astral-sh/ty/issues/358
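
A sketch of the now-accepted assignment (hypothetical example):

```py
from typing import Callable

class C:
    def __init__(self, x: int) -> None: ...

# Constructing `C` behaves like a call of type `(int) -> C`, so the
# class literal is now accepted where that callable type is expected:
f: Callable[[int], C] = C
```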

## Test Plan

Update is_subtype_of.md

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-05-28 13:43:07 -07:00
Hans
16621fa19d [flake8-bugbear ] Add fix safety section (B006) (#17652)
## Summary

This PR adds the `fix safety` section for rule `B006` in
`mutable_argument_default.rs` for #15584.

When applying this rule's fix, certain changes may alter the original
logical behavior. For example:

before:
```python
def cache(x, storage=[]):
    storage.append(x)
    return storage

print(cache(1))  # [1]
print(cache(2))  # [1, 2]
```

after:
```python
def cache(x, storage=None):
    if storage is None:
        storage = []
    storage.append(x)
    return storage

print(cache(1))  # [1]
print(cache(2))  # [2]
```
2025-05-28 16:27:13 -04:00
Victor Hugo Gomes
e23d4ea027 [flake8-bugbear] Ignore __debug__ attribute in B010 (#18357)
## Summary

Fixes #18353
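
A sketch of the case that is now ignored (hypothetical example; the
motivation, as I understand it, is that `__debug__` cannot appear as an
assignment target):

```py
class Config: ...
config = Config()

setattr(config, "attr", 1)  # B010: use `config.attr = 1` instead

# Rewriting this to `config.__debug__ = False` would be a `SyntaxError`,
# so the `__debug__` attribute is now ignored:
setattr(config, "__debug__", False)
```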

## Test Plan

Snapshot tests
2025-05-28 16:24:52 -04:00
Douglas Creager
452f992fbc [ty] Simplify signature types, use them in CallableType (#18344)
There were many fields in `Signature` and friends that really had more
to do with how a signature was being _used_ — how it was looked up,
details about an individual call site, etc. Those fields more properly
belong in `Bindings` and friends.

This is a pure refactoring, and should not affect any tests or ecosystem
projects.

I started on this journey in support of
https://github.com/astral-sh/ty/issues/462. It seemed worth pulling out
as a separate PR.

One major concrete benefit of this refactoring is that we can now use
`CallableSignature` directly in `CallableType`. (We can't use
`CallableSignature` directly in that `Type` variant because signatures
are not currently interned.)
2025-05-28 13:11:45 -04:00
Alex Waygood
a5ebb3f3a2 [ty] Support ephemeral uv virtual environments (#18335) 2025-05-28 14:54:59 +00:00
Brent Westbrook
9925910a29 Add a ViolationMetadata::rule method (#18234)
Summary
--

This PR adds a macro-generated method to retrieve the `Rule` associated
with a given `Violation` struct, which makes it substantially cheaper
than parsing from the rule name. The rule is then converted to a
`NoqaCode` for storage on the `Message` (and eventually on the new
diagnostic type). The `ViolationMetadata::rule_name` method was now
unused, so the `rule` method replaces it.

Several types had to be moved from the `ruff_diagnostics` crate to the
`ruff_linter` crate to make this work, namely the `Violation` traits and
the old `Diagnostic` type, which had a constructor generic over a
`Violation`.

It's actually a fairly small PR, minus the hundreds of import changes.
The main changes are in these files:

-
[crates/ruff_linter/src/message/mod.rs](https://github.com/astral-sh/ruff/pull/18234/files#diff-139754ea310d75f28307008d21c771a190038bd106efe3b9267cc2d6c0fa0921)
-
[crates/ruff_diagnostics/src/lib.rs](https://github.com/astral-sh/ruff/pull/18234/files#diff-8e8ea5c586935bf21ea439f24253fcfd5955d2cb130f5377c2fa7bfee3ea3a81)
-
[crates/ruff_linter/src/diagnostic.rs](https://github.com/astral-sh/ruff/pull/18234/files#diff-1d0c9aad90d8f9446079c5be5f284150d97797158715bd9729e6f1f70246297a)
-
[crates/ruff_linter/src/lib.rs](https://github.com/astral-sh/ruff/pull/18234/files#diff-eb93ef7e78a612f5fa9145412c75cf6b1a5cefba1c2233e4a11a880a1ce1fbcc)

Test Plan
--

Existing tests
2025-05-28 09:27:09 -04:00
Brent Westbrook
a3ee6bb3b5 Return DiagnosticGuard from Checker::report_diagnostic (#18232)
Summary
--

This PR adds a `DiagnosticGuard` type to ruff that is adapted from the
`DiagnosticGuard` and `LintDiagnosticGuard` types from ty. This guard is
returned by `Checker::report_diagnostic` and derefs to a
`ruff_diagnostics::Diagnostic` (`OldDiagnostic`), allowing methods like
`OldDiagnostic::set_fix` to be called on the result. On `Drop` the
`DiagnosticGuard` pushes its contained `OldDiagnostic` to the `Checker`.

The main motivation for this is to make a following PR adding a
`SourceFile` to each diagnostic easier. For every rule where a `Checker`
is available, this will now only require modifying
`Checker::report_diagnostic` rather than all the rules.

In the few cases where we need to create a diagnostic before we know if
we actually want to emit it, there is a `DiagnosticGuard::defuse`
method, which consumes the guard without emitting the diagnostic. I was
able to restructure about half of the rules that naively called this to
avoid calling it, but a handful of rules still need it.

One of the fairly common patterns where `defuse` was needed initially
was something like

```rust
let diagnostic = Diagnostic::new(DiagnosticKind, range);

if !checker.enabled(diagnostic.rule()) {
    return;
}
```

So I also added a `Checker::checked_report_diagnostic` method that
handles this check internally. That helped to avoid some additional
`defuse` calls. The name is a bit repetitive, so I'm definitely open to
suggestions there. I included a warning against using it in the docs
since, as we've seen, the conversion from a diagnostic to a rule is
actually pretty expensive.

Test Plan
--

Existing tests
2025-05-28 07:41:31 -04:00
Viktor Merkurev
b60ba75d09 [flake8_use_pathlib]: Replace os.symlink with Path.symlink_to (PTH211) (#18337)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-05-28 12:39:05 +02:00
Micha Reiser
66ba1d8775 [ty] Support cancellation and retry in the server (#18273) 2025-05-28 10:59:29 +02:00
David Peter
bbcd7e0196 [ty] Synthetic function-like callables (#18242)
## Summary

We create `Callable` types for synthesized functions like the `__init__`
method of a dataclass. These generated functions are real functions
though, with descriptor-like behavior. That is, they can bind `self`
when accessed on an instance. This was modeled incorrectly so far.

## Test Plan

Updated tests
2025-05-28 10:00:56 +02:00
Dhruv Manilawala
48c425c15b [ty] Support publishing diagnostics in the server (#18309)
## Summary

This PR adds support for [publishing
diagnostics](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_publishDiagnostics)
from the ty language server.

It only adds support for text documents, and not notebook documents,
because the server doesn't have full notebook support yet.

Closes: astral-sh/ty#79

## Test Plan

Testing this out in Helix and Zed since those are the two editors that I
know of that don't support pull diagnostics:

### Helix


https://github.com/user-attachments/assets/e193f804-0b32-4f7e-8b83-6f9307e3d2d4



### Zed



https://github.com/user-attachments/assets/93ec7169-ce2b-4521-b009-a82d8afb9eaa
2025-05-28 13:15:11 +05:30
Max Mynter
6d210dd0c7 Add Autofix for ISC003 (#18256)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-05-28 09:30:51 +02:00
chiri
9ce83c215d [pyupgrade]: new rule UP050 (useless-class-metaclass-type) (#18334)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-05-28 09:22:44 +02:00
हिमांशु
602dd5c039 [pycodestyle] Make E712 suggestion not assume a context (#18328) 2025-05-28 09:06:39 +02:00
Carl Meyer
3eada01153 put similar dunder-call tests next to each other (#18343)
Follow-up from post-land review on
https://github.com/astral-sh/ruff/pull/18260
2025-05-27 12:16:41 -07:00
Alex Waygood
3e811fc369 [ty] Derive PartialOrd, Ord for KnownInstanceType (#18340) 2025-05-27 19:37:01 +01:00
Alex Waygood
743764d384 [ty] Simplify Type::try_bool() (#18342)
## Summary

I don't think we're ever going to add any `KnownInstanceType` variants
that evaluate to `False` in a boolean context; the
`KnownInstanceType::bool()` method just seems like unnecessary
complexity.

## Test Plan

`cargo test -p ty_python_semantic`
2025-05-27 19:32:17 +01:00
Alex Waygood
e03e05d2b3 [ty] Simplify Type::normalized slightly (#18339) 2025-05-27 18:08:59 +00:00
Alex Waygood
9ec4a178a4 [ty] Move arviz off the list of selected primer projects (#18336) 2025-05-27 17:51:19 +01:00
justin
8d5655a7ba [ty] Add --config-file CLI arg (#18083) 2025-05-27 08:00:38 +02:00
Alex Waygood
6453ac9ea1 [ty] Tell the user why we inferred a certain Python version when reporting version-specific syntax errors (#18295) 2025-05-26 20:44:43 +00:00
Alex Waygood
0a11baf29c [ty] Implement implicit inheritance from Generic[] for PEP-695 generic classes (#18283) 2025-05-26 20:40:16 +01:00
lipefree
1d20cf9570 [ty] Add hint if async context manager is used in non-async with statement (#18299)
# Summary

Adds a subdiagnostic hint in the following scenario where a
synchronous `with` is used with an async context manager:
```py
class Manager:
    async def __aenter__(self): ...
    async def __aexit__(self, *args): ...

# error: [invalid-context-manager] "Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and `__exit__`"
# note: Objects of type `Manager` *can* be used as async context managers
# note: Consider using `async with` here
with Manager():
    ...
```

closes https://github.com/astral-sh/ty/issues/508

## Test Plan

New MD snapshot tests

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-05-26 21:34:47 +02:00
Micha Reiser
62ef96f51e [ty] Move respect-ignore-files under src section (#18322) 2025-05-26 18:45:48 +01:00
David Peter
4e68dd96a6 [ty] Infer types for ty_extensions.Intersection[A, B] tuple expressions (#18321)
## Summary

fixes astral-sh/ty#366

## Test Plan

* Added panic corpus regression tests
* I also wrote a hover regression test (see below), but decided not to
include it. The corpus tests are much more "effective" at finding these
types of errors, since they exhaustively check all expressions for
types.

<details>

```rs
#[test]
fn hover_regression_test_366() {
    let test = cursor_test(
        r#"
    from ty_extensions import Intersection

    class A: ...
    class B: ...

    def _(x: Intersection[A,<CURSOR> B]):
        pass
    "#,
    );

    assert_snapshot!(test.hover(), @r"
    A & B
    ---------------------------------------------
    ```text
    A & B
    ```
    ---------------------------------------------
    info[hover]: Hovered content is
     --> main.py:7:31
      |
    5 |         class B: ...
    6 |
    7 |         def _(x: Intersection[A, B]):
      |                               ^^-^
      |                               | |
      |                               | Cursor offset
      |                               source
    8 |             pass
      |
    ");
}
```

</details>
2025-05-26 17:08:52 +02:00
Maddy Guthridge
b25b642371 Improve readability of rule status icons in documentation (#18297)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-05-26 14:35:20 +00:00
Micha Reiser
175402aa75 [ty] Remove unnecessary lifetimes for Task (#18261) 2025-05-26 12:44:43 +00:00
Micha Reiser
d8216fa328 [ty] Gracefully handle salsa cancellations and panics in background request handlers (#18254) 2025-05-26 13:37:49 +01:00
David Peter
d51f6940fe [ty] Playground: Better default settings (#18316)
## Summary

The playground default settings set the `division-by-zero` rule severity
to `error`. This slightly confusing because `division-by-zero` is now
disabled by default. I am assuming that we have a `rules` section in
there to make it easier for users to customize those settings (in
addition to what the JSON schema gives us).

Here, I'm proposing a different default rule-set (`"undefined-reveal":
"ignore"`) that I would personally find more helpful for the playground,
since we're using it so frequently for MREs that often involve some
`reveal_type` calls.
2025-05-26 14:14:23 +02:00
Micha Reiser
66b082ff71 [ty] Abort process if worker thread panics (#18211) 2025-05-26 13:09:06 +01:00
Micha Reiser
5d93d619f3 Use git-commit as ty playground version instead of 0.0.0 (#18314) 2025-05-26 11:55:11 +00:00
David Peter
e1b662bf5d [ty] Always pass NO_INSTANCE_FALLBACK in try_call_dunder_with_policy (#18315)
## Summary

The previous `try_call_dunder_with_policy` API was a bit of a footgun
since you needed to pass `NO_INSTANCE_FALLBACK` in *addition* to other
policies that you wanted for the member lookup. Implicit calls to dunder
methods never access instance members though, so we can do this
implicitly in `try_call_dunder_with_policy`.

No functional changes.
2025-05-26 13:20:27 +02:00
Felix Scherz
f885cb8a2f [ty] use __getattribute__ to lookup unknown members on a type (#18280)
## Summary

`Type::member_lookup_with_policy` now falls back to calling
`__getattribute__` when a member cannot be found, as a second fallback
after `__getattr__`.
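
A sketch of the behavior (hypothetical example):

```py
class Proxy:
    def __getattribute__(self, name: str) -> int:
        return 1

p = Proxy()
# `whatever` is not declared anywhere, but the lookup now falls back
# to the return type of `__getattribute__`:
reveal_type(p.whatever)  # revealed: int
```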


closes https://github.com/astral-sh/ty/issues/441

## Test Plan

Added markdown tests.

---------

Co-authored-by: Alex Waygood <Alex.Waygood@Gmail.com>
Co-authored-by: David Peter <mail@david-peter.de>
2025-05-26 12:59:45 +02:00
David Peter
4ef2c223c9 [ty] Respect MRO_NO_OBJECT_FALLBACK policy when looking up symbols on type instances (#18312)
## Summary

This should address a problem that came up while working on
https://github.com/astral-sh/ruff/pull/18280. When looking up an
attribute (typically a dunder method) with the `MRO_NO_OBJECT_FALLBACK`
policy, the attribute is first looked up on the meta type. If the meta
type happens to be `type`, we go through the following branch in
`find_name_in_mro_with_policy`:


97ff015c88/crates/ty_python_semantic/src/types.rs (L2565-L2573)

The problem is that we now look up the attribute on `object` *directly*
(instead of just having `object` in the MRO). In this case,
`MRO_NO_OBJECT_FALLBACK` has no effect in `class_member_from_mro`:


c3feb8ce27/crates/ty_python_semantic/src/types/class.rs (L1081-L1082)

So instead, we need to explicitly respect the `MRO_NO_OBJECT_FALLBACK`
policy here by returning `Symbol::Unbound`.

## Test Plan

Added new Markdown tests that explain the ecosystem changes that we
observe.
2025-05-26 12:03:29 +02:00
Vasanth
d078ecff37 [flake8_async] Refactor argument name resolution for async sleep func… (#18262)
Co-authored-by: Vasanth-96 <ramavath.naik@itilite.com>
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-05-26 09:53:03 +00:00
David Peter
7eca6f96e3 [ty] Fix attribute writes to unions/intersections including modules (#18313)
## Summary

Fix a bug that involved writes to attributes on union/intersection types
that included modules as elements.

This is a prerequisite to avoid some ecosystem false positives in
https://github.com/astral-sh/ruff/pull/18312

## Test Plan

Added regression test
2025-05-26 11:41:03 +02:00
David Sherret
fbaf826a9d Only enable js feature of uuid crate for wasm crates (#18152) 2025-05-26 10:33:51 +01:00
Wei Lee
d8a5b9de17 [airflow] Revise fix title AIR3 (#18215) 2025-05-26 10:31:48 +01:00
otakutyrant
c3feb8ce27 Update editor integrations link in README (#17977)
Co-authored-by: Oscar Gustafsson <oscar.gustafsson@gmail.com>
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-05-26 09:50:09 +01:00
Jo
97ff015c88 [ty] Add tests to src.root if it exists and is not a package (#18286) 2025-05-26 09:08:57 +01:00
renovate[bot]
1f7134f727 Update rui314/setup-mold digest to 67424c1 (#18300) 2025-05-26 07:43:52 +02:00
renovate[bot]
6a0b93170e Update pre-commit dependencies (#18302) 2025-05-26 07:43:31 +02:00
renovate[bot]
cc59ff8aad Update dependency ruff to v0.11.11 (#18301) 2025-05-26 07:41:54 +02:00
renovate[bot]
2b90e7fcd7 Update NPM Development dependencies (#18305) 2025-05-26 07:41:37 +02:00
renovate[bot]
a43f5b2129 Update taiki-e/install-action action to v2.52.1 (#18307) 2025-05-26 07:41:18 +02:00
renovate[bot]
f3fb7429ca Update astral-sh/setup-uv action to v6.1.0 (#18304) 2025-05-26 07:40:51 +02:00
renovate[bot]
83498b95fb Update Rust crate uuid to v1.17.0 (#18306) 2025-05-26 07:40:01 +02:00
renovate[bot]
03d7be3747 Update Rust crate jiff to v0.2.14 (#18303) 2025-05-26 07:38:37 +02:00
Dhruv Manilawala
d95b029862 [ty] Move diagnostics API for the server (#18308)
## Summary

This PR moves the diagnostics API for the language server out from the
request handler module to the diagnostics API module.

This is in preparation to add support for publishing diagnostics.
2025-05-26 04:16:38 +00:00
Charlie Marsh
14c3755445 Fix YTT201 for '!=' comparisons (#18293)
## Summary

Closes #18292.
2025-05-25 13:16:19 -04:00
Jo
83a036960b [ty] Add long help for --config argument (#18285) 2025-05-25 13:09:02 +02:00
chiri
be76fadb05 [pyupgrade] make fix unsafe if it deletes comments (UP010, unnecessary-future-import) (#18291) 2025-05-25 12:44:21 +02:00
Alex Waygood
e293411679 [ty] get_protocol_members returns a frozenset, not a tuple (#18284) 2025-05-23 23:20:34 +00:00
lipefree
53d19f8368 [ty] Resolving Python path using CONDA_PREFIX variable to support Conda and Pixi (#18267) 2025-05-23 20:00:42 +02:00
1304 changed files with 47874 additions and 18004 deletions

View File

@@ -79,7 +79,7 @@ jobs:
# Adapted from https://docs.docker.com/build/ci/github-actions/multi-platform/
- name: Build and push by digest
id: build
-uses: docker/build-push-action@1dc73863535b631f98b2378be8619f83b136f4a0 # v6.17.0
+uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
platforms: ${{ matrix.platform }}
@@ -231,7 +231,7 @@ jobs:
${{ env.TAG_PATTERNS }}
- name: Build and push
-uses: docker/build-push-action@1dc73863535b631f98b2378be8619f83b136f4a0 # v6.17.0
+uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
platforms: linux/amd64,linux/arm64

View File

@@ -237,13 +237,13 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
-uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
+uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- name: "Install cargo nextest"
-uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
+uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Install cargo insta"
-uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
+uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-insta
- name: ty mdtests (GitHub annotations)
@@ -295,13 +295,13 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
-uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
+uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- name: "Install cargo nextest"
-uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
+uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Install cargo insta"
-uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
+uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-insta
- name: "Run tests"
@@ -324,7 +324,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo nextest"
-uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
+uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Run tests"
@@ -380,7 +380,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
-uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
+uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- name: "Build"
run: cargo build --release --locked
@@ -405,13 +405,13 @@ jobs:
MSRV: ${{ steps.msrv.outputs.value }}
run: rustup default "${MSRV}"
- name: "Install mold"
-uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
+uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- name: "Install cargo nextest"
-uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
+uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Install cargo insta"
-uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
+uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-insta
- name: "Run tests"
@@ -437,7 +437,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo-binstall"
-uses: cargo-bins/cargo-binstall@5cbf019d8cb9b9d5b086218c41458ea35d817691 # v1.12.5
+uses: cargo-bins/cargo-binstall@e8c9cc3599f6c4063d143083205f98ca25d91677 # v1.12.6
with:
tool: cargo-fuzz@0.11.2
- name: "Install cargo-fuzz"
@@ -459,7 +459,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
-- uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
+- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
name: Download Ruff binary to test
id: download-cached-binary
@@ -660,7 +660,7 @@ jobs:
branch: ${{ github.event.pull_request.base.ref }}
workflow: "ci.yaml"
check_artifacts: true
-- uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
+- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- name: Fuzz
env:
FORCE_COLOR: 1
@@ -690,7 +690,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: cargo-bins/cargo-binstall@5cbf019d8cb9b9d5b086218c41458ea35d817691 # v1.12.5
- uses: cargo-bins/cargo-binstall@e8c9cc3599f6c4063d143083205f98ca25d91677 # v1.12.6
- run: cargo binstall --no-confirm cargo-shear
- run: cargo shear
@@ -730,7 +730,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
@@ -773,7 +773,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: Install uv
uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- name: "Install Insiders dependencies"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uv pip install -r docs/requirements-insiders.txt --system
@@ -910,7 +910,7 @@ jobs:
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@941e8a4d9d7cdb696bd4f017cf54aca281f8ffff # v2.51.2
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-codspeed

View File

@@ -34,11 +34,11 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install mold"
uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- name: Build ruff
# A debug build means the script runs slower once it gets started,

View File

@@ -37,7 +37,7 @@ jobs:
persist-credentials: false
- name: Install the latest version of uv
uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
with:

View File

@@ -22,7 +22,7 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
pattern: wheels-*

View File

@@ -80,7 +80,7 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.10
rev: v0.11.12
hooks:
- id: ruff-format
- id: ruff
@@ -98,7 +98,7 @@ repos:
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/woodruffw/zizmor-pre-commit
rev: v1.7.0
rev: v1.9.0
hooks:
- id: zizmor

View File

@@ -1,5 +1,33 @@
# Changelog
## 0.11.12
### Preview features
- \[`airflow`\] Revise fix titles (`AIR3`) ([#18215](https://github.com/astral-sh/ruff/pull/18215))
- \[`pylint`\] Implement `missing-maxsplit-arg` (`PLC0207`) ([#17454](https://github.com/astral-sh/ruff/pull/17454))
- \[`pyupgrade`\] New rule `UP050` (`useless-class-metaclass-type`) ([#18334](https://github.com/astral-sh/ruff/pull/18334))
- \[`flake8-use-pathlib`\] Replace `os.symlink` with `Path.symlink_to` (`PTH211`) ([#18337](https://github.com/astral-sh/ruff/pull/18337))
### Bug fixes
- \[`flake8-bugbear`\] Ignore `__debug__` attribute in `B010` ([#18357](https://github.com/astral-sh/ruff/pull/18357))
- \[`flake8-async`\] Fix `anyio.sleep` argument name (`ASYNC115`, `ASYNC116`) ([#18262](https://github.com/astral-sh/ruff/pull/18262))
- \[`refurb`\] Fix `FURB129` autofix generating invalid syntax ([#18235](https://github.com/astral-sh/ruff/pull/18235))
### Rule changes
- \[`flake8-implicit-str-concat`\] Add autofix for `ISC003` ([#18256](https://github.com/astral-sh/ruff/pull/18256))
- \[`pycodestyle`\] Improve the diagnostic message for `E712` ([#18328](https://github.com/astral-sh/ruff/pull/18328))
- \[`flake8-2020`\] Fix diagnostic message for `!=` comparisons (`YTT201`) ([#18293](https://github.com/astral-sh/ruff/pull/18293))
- \[`pyupgrade`\] Make fix unsafe if it deletes comments (`UP010`) ([#18291](https://github.com/astral-sh/ruff/pull/18291))
### Documentation
- Simplify rules table to improve readability ([#18297](https://github.com/astral-sh/ruff/pull/18297))
- Update editor integrations link in README ([#17977](https://github.com/astral-sh/ruff/pull/17977))
- \[`flake8-bugbear`\] Add fix safety section (`B006`) ([#17652](https://github.com/astral-sh/ruff/pull/17652))
## 0.11.11
### Preview features

Cargo.lock generated
View File

@@ -8,6 +8,18 @@ version = "2.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "512761e0bb2578dd7380c6baaa0f4ce03e84f95e960231d1dec8bf4d7d6e2627"
[[package]]
name = "ahash"
version = "0.8.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a15f179cd60c4584b8a8c596927aadc462e27f2ca70c04e0071964a73ba7a75"
dependencies = [
"cfg-if",
"once_cell",
"version_check",
"zerocopy",
]
[[package]]
name = "aho-corasick"
version = "1.1.3"
@@ -342,9 +354,9 @@ dependencies = [
[[package]]
name = "clap"
version = "4.5.38"
version = "4.5.39"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed93b9805f8ba930df42c2590f05453d5ec36cbb85d018868a5b24d31f6ac000"
checksum = "fd60e63e9be68e5fb56422e397cf9baddded06dae1d2e523401542383bc72a9f"
dependencies = [
"clap_builder",
"clap_derive",
@@ -352,9 +364,9 @@ dependencies = [
[[package]]
name = "clap_builder"
version = "4.5.38"
version = "4.5.39"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "379026ff283facf611b0ea629334361c4211d1b12ee01024eec1591133b04120"
checksum = "89cc6392a1f72bbeb820d71f32108f61fdaf18bc526e1d23954168a67759ef51"
dependencies = [
"anstream",
"anstyle",
@@ -1106,6 +1118,10 @@ name = "hashbrown"
version = "0.14.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e5274423e17b7c9fc20b6e7e208532f9b19825d82dfd615708b70edd83df41f1"
dependencies = [
"ahash",
"allocator-api2",
]
[[package]]
name = "hashbrown"
@@ -1498,9 +1514,9 @@ checksum = "4a5f13b858c8d314ee3e8f639011f7ccefe71f97f96e50151fb991f267928e2c"
[[package]]
name = "jiff"
version = "0.2.13"
version = "0.2.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f02000660d30638906021176af16b17498bd0d12813dbfe7b276d8bc7f3c0806"
checksum = "a194df1107f33c79f4f93d02c80798520551949d59dfad22b6157048a88cca93"
dependencies = [
"jiff-static",
"jiff-tzdb-platform",
@@ -1513,9 +1529,9 @@ dependencies = [
[[package]]
name = "jiff-static"
version = "0.2.13"
version = "0.2.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f3c30758ddd7188629c6713fc45d1188af4f44c90582311d0c8d8c9907f60c48"
checksum = "6c6e1db7ed32c6c71b759497fae34bf7933636f75a251b9e736555da426f6442"
dependencies = [
"proc-macro2",
"quote",
@@ -1597,9 +1613,9 @@ checksum = "d750af042f7ef4f724306de029d18836c26c1765a54a6a3f094cbd23a7267ffa"
[[package]]
name = "libcst"
version = "1.7.0"
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad9e315e3f679e61b9095ffd5e509de78b8a4ea3bba9d772f6fb243209f808d4"
checksum = "3ac076e37f8fe6bcddbb6c3282897e6e9498b254907ccbfc806dc8f9f1491f02"
dependencies = [
"annotate-snippets",
"libcst_derive",
@@ -1607,14 +1623,14 @@ dependencies = [
"paste",
"peg",
"regex",
"thiserror 1.0.69",
"thiserror 2.0.12",
]
[[package]]
name = "libcst_derive"
version = "1.7.0"
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bfa96ed35d0dccc67cf7ba49350cb86de3dcb1d072a7ab28f99117f19d874953"
checksum = "9cf4a12c744a301b216c4f0cb73542709ab15e6dadbb06966ac05864109d05da"
dependencies = [
"quote",
"syn",
@@ -2485,7 +2501,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.11.11"
version = "0.11.12"
dependencies = [
"anyhow",
"argfile",
@@ -2642,7 +2658,6 @@ dependencies = [
"rayon",
"regex",
"ruff",
"ruff_diagnostics",
"ruff_formatter",
"ruff_linter",
"ruff_notebook",
@@ -2672,9 +2687,7 @@ dependencies = [
name = "ruff_diagnostics"
version = "0.0.0"
dependencies = [
"anyhow",
"is-macro",
"log",
"ruff_text_size",
"serde",
]
@@ -2725,7 +2738,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.11.11"
version = "0.11.12"
dependencies = [
"aho-corasick",
"anyhow",
@@ -3061,7 +3074,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.11.11"
version = "0.11.12"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -3081,6 +3094,7 @@ dependencies = [
"ruff_workspace",
"serde",
"serde-wasm-bindgen",
"uuid",
"wasm-bindgen",
"wasm-bindgen-test",
]
@@ -3179,13 +3193,14 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.21.1"
source = "git+https://github.com/salsa-rs/salsa.git?rev=4818b15f3b7516555d39f5a41cb75970448bee4c#4818b15f3b7516555d39f5a41cb75970448bee4c"
version = "0.22.0"
source = "git+https://github.com/carljm/salsa.git?rev=0f6d406f6c309964279baef71588746b8c67b4a3#0f6d406f6c309964279baef71588746b8c67b4a3"
dependencies = [
"boxcar",
"compact_str",
"crossbeam-queue",
"dashmap 6.1.0",
"hashbrown 0.14.5",
"hashbrown 0.15.3",
"hashlink",
"indexmap",
@@ -3202,15 +3217,14 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.21.1"
source = "git+https://github.com/salsa-rs/salsa.git?rev=4818b15f3b7516555d39f5a41cb75970448bee4c#4818b15f3b7516555d39f5a41cb75970448bee4c"
version = "0.22.0"
source = "git+https://github.com/carljm/salsa.git?rev=0f6d406f6c309964279baef71588746b8c67b4a3#0f6d406f6c309964279baef71588746b8c67b4a3"
[[package]]
name = "salsa-macros"
version = "0.21.1"
source = "git+https://github.com/salsa-rs/salsa.git?rev=4818b15f3b7516555d39f5a41cb75970448bee4c#4818b15f3b7516555d39f5a41cb75970448bee4c"
version = "0.22.0"
source = "git+https://github.com/carljm/salsa.git?rev=0f6d406f6c309964279baef71588746b8c67b4a3#0f6d406f6c309964279baef71588746b8c67b4a3"
dependencies = [
"heck",
"proc-macro2",
"quote",
"syn",
@@ -3874,6 +3888,7 @@ dependencies = [
"countme",
"crossbeam",
"ctrlc",
"dunce",
"filetime",
"indicatif",
"insta",
@@ -4003,6 +4018,7 @@ dependencies = [
"ruff_source_file",
"ruff_text_size",
"rustc-hash 2.1.1",
"salsa",
"serde",
"serde_json",
"shellexpand",
@@ -4071,6 +4087,7 @@ dependencies = [
"ty_ide",
"ty_project",
"ty_python_semantic",
"uuid",
"wasm-bindgen",
"wasm-bindgen-test",
]
@@ -4240,9 +4257,9 @@ checksum = "06abde3611657adf66d383f00b093d7faecc7fa57071cce2578660c9f1010821"
[[package]]
name = "uuid"
version = "1.16.0"
version = "1.17.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "458f7a779bf54acc9f347480ac654f68407d3aab21269a6e3c9f922acd9e2da9"
checksum = "3cf4199d1e5d15ddd86a694e4d0dffa9c323ce759fea589f00fef9d81cc1931d"
dependencies = [
"getrandom 0.3.3",
"js-sys",
@@ -4253,9 +4270,9 @@ dependencies = [
[[package]]
name = "uuid-macro-internal"
version = "1.16.0"
version = "1.17.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "72dcd78c4f979627a754f5522cea6e6a25e55139056535fe6e69c506cd64a862"
checksum = "26b682e8c381995ea03130e381928e0e005b7c9eb483c6c8682f50e07b33c2b7"
dependencies = [
"proc-macro2",
"quote",

View File

@@ -129,7 +129,7 @@ regex = { version = "1.10.2" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "4818b15f3b7516555d39f5a41cb75970448bee4c" }
salsa = { git = "https://github.com/carljm/salsa.git", rev = "0f6d406f6c309964279baef71588746b8c67b4a3" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }
@@ -179,7 +179,6 @@ uuid = { version = "1.6.1", features = [
"v4",
"fast-rng",
"macro-diagnostics",
"js",
] }
walkdir = { version = "2.3.2" }
wasm-bindgen = { version = "0.2.92" }
@@ -188,7 +187,7 @@ wild = { version = "2" }
zip = { version = "0.6.6", default-features = false }
[workspace.metadata.cargo-shear]
ignored = ["getrandom", "ruff_options_metadata"]
ignored = ["getrandom", "ruff_options_metadata", "uuid"]
[workspace.lints.rust]

View File

@@ -34,8 +34,7 @@ An extremely fast Python linter and code formatter, written in Rust.
- 🔧 Fix support, for automatic error correction (e.g., automatically remove unused imports)
- 📏 Over [800 built-in rules](https://docs.astral.sh/ruff/rules/), with native re-implementations
of popular Flake8 plugins, like flake8-bugbear
- ⌨️ First-party [editor integrations](https://docs.astral.sh/ruff/integrations/) for
[VS Code](https://github.com/astral-sh/ruff-vscode) and [more](https://docs.astral.sh/ruff/editors/setup)
- ⌨️ First-party [editor integrations](https://docs.astral.sh/ruff/editors) for [VS Code](https://github.com/astral-sh/ruff-vscode) and [more](https://docs.astral.sh/ruff/editors/setup)
- 🌎 Monorepo-friendly, with [hierarchical and cascading configuration](https://docs.astral.sh/ruff/configuration/#config-file-discovery)
Ruff aims to be orders of magnitude faster than alternative tools while integrating more
@@ -149,8 +148,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.11.11/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.11.11/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.11.12/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.11.12/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -183,7 +182,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.11.11
rev: v0.11.12
hooks:
# Run the linter.
- id: ruff

View File

@@ -4,6 +4,10 @@ extend-exclude = [
"crates/ty_vendored/vendor/**/*",
"**/resources/**/*",
"**/snapshots/**/*",
# Completion tests tend to have a lot of incomplete
# words naturally. It's annoying to have to make all
# of them actually words. So just ignore typos here.
"crates/ty_ide/src/completion.rs",
]
[default.extend-words]

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.11.11"
version = "0.11.12"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -349,7 +349,6 @@ impl FileCache {
.iter()
.map(|msg| {
Message::diagnostic(
msg.rule.into(),
msg.body.clone(),
msg.suggestion.clone(),
msg.range,
@@ -357,6 +356,7 @@ impl FileCache {
msg.parent,
file.clone(),
msg.noqa_offset,
msg.rule,
)
})
.collect()

View File

@@ -12,7 +12,7 @@ use rayon::prelude::*;
use rustc_hash::FxHashMap;
use ruff_db::panic::catch_unwind;
use ruff_diagnostics::Diagnostic;
use ruff_linter::OldDiagnostic;
use ruff_linter::message::Message;
use ruff_linter::package::PackageRoot;
use ruff_linter::registry::Rule;
@@ -131,8 +131,7 @@ pub(crate) fn check(
Diagnostics::new(
vec![Message::from_diagnostic(
Diagnostic::new(IOError { message }, TextRange::default()),
dummy,
OldDiagnostic::new(IOError { message }, TextRange::default(), &dummy),
None,
)],
FxHashMap::default(),

View File

@@ -6,7 +6,7 @@ use serde::ser::SerializeSeq;
use serde::{Serialize, Serializer};
use strum::IntoEnumIterator;
use ruff_diagnostics::FixAvailability;
use ruff_linter::FixAvailability;
use ruff_linter::registry::{Linter, Rule, RuleNamespace};
use crate::args::HelpFormat;

View File

@@ -12,7 +12,7 @@ use colored::Colorize;
use log::{debug, warn};
use rustc_hash::FxHashMap;
use ruff_diagnostics::Diagnostic;
use ruff_linter::OldDiagnostic;
use ruff_linter::codes::Rule;
use ruff_linter::linter::{FixTable, FixerResult, LinterResult, ParseSource, lint_fix, lint_only};
use ruff_linter::message::Message;
@@ -64,13 +64,13 @@ impl Diagnostics {
let source_file = SourceFileBuilder::new(name, "").finish();
Self::new(
vec![Message::from_diagnostic(
Diagnostic::new(
OldDiagnostic::new(
IOError {
message: err.to_string(),
},
TextRange::default(),
&source_file,
),
source_file,
None,
)],
FxHashMap::default(),
@@ -235,7 +235,7 @@ pub(crate) fn lint_path(
};
let source_file =
SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
lint_pyproject_toml(source_file, settings)
lint_pyproject_toml(&source_file, settings)
} else {
vec![]
};
@@ -396,7 +396,7 @@ pub(crate) fn lint_stdin(
}
return Ok(Diagnostics {
messages: lint_pyproject_toml(source_file, &settings.linter),
messages: lint_pyproject_toml(&source_file, &settings.linter),
fixed: FixMap::from_iter([(fs::relativize_path(path), FixTable::default())]),
notebook_indexes: FxHashMap::default(),
});

View File

@@ -566,7 +566,7 @@ fn venv() -> Result<()> {
----- stderr -----
ruff failed
Cause: Invalid search path settings
Cause: Failed to discover the site-packages directory: Invalid `--python` argument: `none` could not be canonicalized
Cause: Failed to discover the site-packages directory: Invalid `--python` argument: `none` does not point to a Python executable or a directory on disk
");
});

View File

@@ -78,7 +78,7 @@ fn setup_tomllib_case() -> Case {
let src_root = SystemPath::new("/src");
let mut metadata = ProjectMetadata::discover(src_root, &system).unwrap();
metadata.apply_cli_options(Options {
metadata.apply_options(Options {
environment: Some(EnvironmentOptions {
python_version: Some(RangedValue::cli(PythonVersion::PY312)),
..EnvironmentOptions::default()
@@ -131,7 +131,7 @@ fn benchmark_incremental(criterion: &mut Criterion) {
fn setup() -> Case {
let case = setup_tomllib_case();
let result: Vec<_> = case.db.check().unwrap();
let result: Vec<_> = case.db.check();
assert_diagnostics(&case.db, &result, EXPECTED_TOMLLIB_DIAGNOSTICS);
@@ -159,7 +159,7 @@ fn benchmark_incremental(criterion: &mut Criterion) {
None,
);
let result = db.check().unwrap();
let result = db.check();
assert_eq!(result.len(), EXPECTED_TOMLLIB_DIAGNOSTICS.len());
}
@@ -179,7 +179,7 @@ fn benchmark_cold(criterion: &mut Criterion) {
setup_tomllib_case,
|case| {
let Case { db, .. } = case;
let result: Vec<_> = db.check().unwrap();
let result: Vec<_> = db.check();
assert_diagnostics(db, &result, EXPECTED_TOMLLIB_DIAGNOSTICS);
},
@@ -224,7 +224,7 @@ fn setup_micro_case(code: &str) -> Case {
let src_root = SystemPath::new("/src");
let mut metadata = ProjectMetadata::discover(src_root, &system).unwrap();
metadata.apply_cli_options(Options {
metadata.apply_options(Options {
environment: Some(EnvironmentOptions {
python_version: Some(RangedValue::cli(PythonVersion::PY312)),
..EnvironmentOptions::default()
@@ -293,7 +293,7 @@ fn benchmark_many_string_assignments(criterion: &mut Criterion) {
},
|case| {
let Case { db, .. } = case;
let result = db.check().unwrap();
let result = db.check();
assert_eq!(result.len(), 0);
},
BatchSize::SmallInput,
@@ -339,7 +339,7 @@ fn benchmark_many_tuple_assignments(criterion: &mut Criterion) {
},
|case| {
let Case { db, .. } = case;
let result = db.check().unwrap();
let result = db.check();
assert_eq!(result.len(), 0);
},
BatchSize::SmallInput,

View File

@@ -1,3 +1,4 @@
use std::any::Any;
use std::backtrace::BacktraceStatus;
use std::cell::Cell;
use std::panic::Location;
@@ -24,17 +25,25 @@ impl Payload {
None
}
}
pub fn downcast_ref<R: Any>(&self) -> Option<&R> {
self.0.downcast_ref::<R>()
}
}
impl std::fmt::Display for PanicError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
writeln!(f, "panicked at")?;
write!(f, "panicked at")?;
if let Some(location) = &self.location {
write!(f, " {location}")?;
}
if let Some(payload) = self.payload.as_str() {
write!(f, ":\n{payload}")?;
}
if let Some(query_trace) = self.salsa_backtrace.as_ref() {
let _ = writeln!(f, "{query_trace}");
}
if let Some(backtrace) = &self.backtrace {
match backtrace.status() {
BacktraceStatus::Disabled => {
@@ -49,6 +58,7 @@ impl std::fmt::Display for PanicError {
_ => {}
}
}
Ok(())
}
}
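A minimal usage sketch for the new `downcast_ref` helper on `Payload`, assuming `ruff_db::panic::catch_unwind` returns `Result<T, PanicError>` and that the `payload` field is reachable as the diff suggests (field visibility is an assumption here, not confirmed by the diff):

```rust
use ruff_db::panic::catch_unwind;

fn main() {
    // catch_unwind converts a panic into a PanicError instead of unwinding.
    let error = catch_unwind(|| -> () { panic!("boom") }).unwrap_err();

    // `as_str` still covers the common &str/String payloads; `downcast_ref`
    // additionally recovers typed payloads raised via std::panic::panic_any.
    if let Some(message) = error.payload.downcast_ref::<&str>() {
        println!("panicked with: {message}");
    }
}
```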

View File

@@ -596,6 +596,13 @@ impl AsRef<SystemPath> for Utf8PathBuf {
}
}
impl AsRef<SystemPath> for camino::Utf8Component<'_> {
#[inline]
fn as_ref(&self) -> &SystemPath {
SystemPath::new(self.as_str())
}
}
impl AsRef<SystemPath> for str {
#[inline]
fn as_ref(&self) -> &SystemPath {
@@ -626,6 +633,22 @@ impl Deref for SystemPathBuf {
}
}
impl<P: AsRef<SystemPath>> FromIterator<P> for SystemPathBuf {
fn from_iter<I: IntoIterator<Item = P>>(iter: I) -> Self {
let mut buf = SystemPathBuf::new();
buf.extend(iter);
buf
}
}
impl<P: AsRef<SystemPath>> Extend<P> for SystemPathBuf {
fn extend<I: IntoIterator<Item = P>>(&mut self, iter: I) {
for path in iter {
self.push(path);
}
}
}
impl std::fmt::Debug for SystemPath {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.0.fmt(f)

View File

@@ -13,12 +13,12 @@ pub fn assert_function_query_was_not_run<Db, Q, QDb, I, R>(
Q: Fn(QDb, I) -> R,
I: salsa::plumbing::AsId + std::fmt::Debug + Copy,
{
let id = input.as_id().as_u32();
let id = input.as_id();
let (query_name, will_execute_event) = find_will_execute_event(db, query, input, events);
db.attach(|_| {
if let Some(will_execute_event) = will_execute_event {
panic!("Expected query {query_name}({id}) not to have run but it did: {will_execute_event:?}\n\n{events:#?}");
panic!("Expected query {query_name}({id:?}) not to have run but it did: {will_execute_event:?}\n\n{events:#?}");
}
});
}
@@ -65,7 +65,7 @@ pub fn assert_function_query_was_run<Db, Q, QDb, I, R>(
Q: Fn(QDb, I) -> R,
I: salsa::plumbing::AsId + std::fmt::Debug + Copy,
{
let id = input.as_id().as_u32();
let id = input.as_id();
let (query_name, will_execute_event) = find_will_execute_event(db, query, input, events);
db.attach(|_| {
@@ -224,7 +224,7 @@ fn query_was_not_run() {
}
#[test]
#[should_panic(expected = "Expected query len(0) not to have run but it did:")]
#[should_panic(expected = "Expected query len(Id(0)) not to have run but it did:")]
fn query_was_not_run_fails_if_query_was_run() {
use crate::tests::TestDb;
use salsa::prelude::*;
@@ -287,7 +287,7 @@ fn const_query_was_not_run_fails_if_query_was_run() {
}
#[test]
#[should_panic(expected = "Expected query len(0) to have run but it did not:")]
#[should_panic(expected = "Expected query len(Id(0)) to have run but it did not:")]
fn query_was_run_fails_if_query_was_not_run() {
use crate::tests::TestDb;
use salsa::prelude::*;
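The updated expectations above follow from the salsa upgrade: the helpers now keep the opaque `salsa::Id` and format it with `Debug` instead of calling `as_u32()`, so messages read `len(Id(0))` rather than `len(0)`. A stand-in illustration of the resulting message shape (the `Id` newtype below is hypothetical, not salsa's actual type):

```rust
// Stand-in for salsa::Id's Debug representation; salsa's real type is opaque.
#[derive(Debug, Clone, Copy)]
struct Id(u32);

fn main() {
    let id = Id(0);
    let query_name = "len";
    // Mirrors the updated #[should_panic] expectations: `len(Id(0))`, not `len(0)`.
    let message =
        format!("Expected query {query_name}({id:?}) not to have run but it did");
    assert_eq!(message, "Expected query len(Id(0)) not to have run but it did");
}
```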

View File

@@ -14,7 +14,6 @@ license = { workspace = true }
ty = { workspace = true }
ty_project = { workspace = true, features = ["schemars"] }
ruff = { workspace = true }
ruff_diagnostics = { workspace = true }
ruff_formatter = { workspace = true }
ruff_linter = { workspace = true, features = ["schemars"] }
ruff_notebook = { workspace = true }

View File

@@ -10,7 +10,7 @@ use itertools::Itertools;
use regex::{Captures, Regex};
use strum::IntoEnumIterator;
use ruff_diagnostics::FixAvailability;
use ruff_linter::FixAvailability;
use ruff_linter::registry::{Linter, Rule, RuleNamespace};
use ruff_options_metadata::{OptionEntry, OptionsMetadata};
use ruff_workspace::options::Options;

View File

@@ -8,7 +8,7 @@ use std::borrow::Cow;
use std::fmt::Write;
use strum::IntoEnumIterator;
use ruff_diagnostics::FixAvailability;
use ruff_linter::FixAvailability;
use ruff_linter::registry::{Linter, Rule, RuleNamespace};
use ruff_linter::upstream_categories::UpstreamCategoryAndPrefix;
use ruff_options_metadata::OptionsMetadata;
@@ -18,44 +18,43 @@ const FIX_SYMBOL: &str = "🛠️";
const PREVIEW_SYMBOL: &str = "🧪";
const REMOVED_SYMBOL: &str = "";
const WARNING_SYMBOL: &str = "⚠️";
const STABLE_SYMBOL: &str = "✔️";
const SPACER: &str = "&nbsp;&nbsp;&nbsp;&nbsp;";
/// Style for the rule's fixability and status icons.
const SYMBOL_STYLE: &str = "style='width: 1em; display: inline-block;'";
/// Style for the container wrapping the fixability and status icons.
const SYMBOLS_CONTAINER: &str = "style='display: flex; gap: 0.5rem; justify-content: end;'";
fn generate_table(table_out: &mut String, rules: impl IntoIterator<Item = Rule>, linter: &Linter) {
table_out.push_str("| Code | Name | Message | |");
table_out.push_str("| Code | Name | Message | |");
table_out.push('\n');
table_out.push_str("| ---- | ---- | ------- | ------: |");
table_out.push_str("| ---- | ---- | ------- | -: |");
table_out.push('\n');
for rule in rules {
let status_token = match rule.group() {
RuleGroup::Removed => {
format!("<span title='Rule has been removed'>{REMOVED_SYMBOL}</span>")
format!(
"<span {SYMBOL_STYLE} title='Rule has been removed'>{REMOVED_SYMBOL}</span>"
)
}
RuleGroup::Deprecated => {
format!("<span title='Rule has been deprecated'>{WARNING_SYMBOL}</span>")
format!(
"<span {SYMBOL_STYLE} title='Rule has been deprecated'>{WARNING_SYMBOL}</span>"
)
}
RuleGroup::Preview => {
format!("<span title='Rule is in preview'>{PREVIEW_SYMBOL}</span>")
}
RuleGroup::Stable => {
// A full opacity checkmark is a bit aggressive for indicating stable
format!("<span title='Rule is stable' style='opacity: 0.6'>{STABLE_SYMBOL}</span>")
format!("<span {SYMBOL_STYLE} title='Rule is in preview'>{PREVIEW_SYMBOL}</span>")
}
RuleGroup::Stable => format!("<span {SYMBOL_STYLE}></span>"),
};
let fix_token = match rule.fixable() {
FixAvailability::Always | FixAvailability::Sometimes => {
format!("<span title='Automatic fix available'>{FIX_SYMBOL}</span>")
}
FixAvailability::None => {
format!(
"<span title='Automatic fix not available' style='opacity: 0.1' aria-hidden='true'>{FIX_SYMBOL}</span>"
)
format!("<span {SYMBOL_STYLE} title='Automatic fix available'>{FIX_SYMBOL}</span>")
}
FixAvailability::None => format!("<span {SYMBOL_STYLE}></span>"),
};
let tokens = format!("{status_token} {fix_token}");
let rule_name = rule.as_ref();
// If the message ends in a bracketed expression (like: "Use {replacement}"), escape the
@@ -82,15 +81,14 @@ fn generate_table(table_out: &mut String, rules: impl IntoIterator<Item = Rule>,
#[expect(clippy::or_fun_call)]
let _ = write!(
table_out,
"| {ss}{0}{1}{se} {{ #{0}{1} }} | {ss}{2}{se} | {ss}{3}{se} | {ss}{4}{se} |",
linter.common_prefix(),
linter.code_for_rule(rule).unwrap(),
rule.explanation()
"| {ss}{prefix}{code}{se} {{ #{prefix}{code} }} | {ss}{explanation}{se} | {ss}{message}{se} | <div {SYMBOLS_CONTAINER}>{status_token}{fix_token}</div>|",
prefix = linter.common_prefix(),
code = linter.code_for_rule(rule).unwrap(),
explanation = rule
.explanation()
.is_some()
.then_some(format_args!("[{rule_name}](rules/{rule_name}.md)"))
.unwrap_or(format_args!("{rule_name}")),
message,
tokens,
);
table_out.push('\n');
}
@@ -104,12 +102,6 @@ pub(crate) fn generate() -> String {
table_out.push_str("### Legend");
table_out.push('\n');
let _ = write!(
&mut table_out,
"{SPACER}{STABLE_SYMBOL}{SPACER} The rule is stable."
);
table_out.push_str("<br />");
let _ = write!(
&mut table_out,
"{SPACER}{PREVIEW_SYMBOL}{SPACER} The rule is unstable and is in [\"preview\"](faq.md#what-is-preview)."
@@ -132,7 +124,8 @@ pub(crate) fn generate() -> String {
&mut table_out,
"{SPACER}{FIX_SYMBOL}{SPACER} The rule is automatically fixable by the `--fix` command-line option."
);
table_out.push_str("<br />");
table_out.push_str("\n\n");
table_out.push_str("All rules not marked as preview, deprecated or removed are stable.");
table_out.push('\n');
for linter in Linter::iter() {
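Illustrative only: the approximate shape of a table row the generator emits after this change, with both icons inside a single right-aligned flex container. The `ss`/`se` span wrappers are defined elsewhere in the generator, so plain stand-ins are used here, and the rule values are examples:

```rust
const SYMBOL_STYLE: &str = "style='width: 1em; display: inline-block;'";
const SYMBOLS_CONTAINER: &str = "style='display: flex; gap: 0.5rem; justify-content: end;'";

fn main() {
    // Example tokens for a preview rule with an available fix.
    let status_token = format!("<span {SYMBOL_STYLE} title='Rule is in preview'>🧪</span>");
    let fix_token = format!("<span {SYMBOL_STYLE} title='Automatic fix available'>🛠️</span>");

    // `ss`/`se` stand in for the generator's span wrappers.
    let (ss, se) = ("<span>", "</span>");
    let row = format!(
        "| {ss}E501{se} {{ #E501 }} | {ss}[line-too-long](rules/line-too-long.md){se} | {ss}Line too long{se} | <div {SYMBOLS_CONTAINER}>{status_token}{fix_token}</div>|"
    );
    println!("{row}");
}
```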

View File

@@ -16,7 +16,5 @@ doctest = false
[dependencies]
ruff_text_size = { workspace = true }
anyhow = { workspace = true }
log = { workspace = true }
is-macro = { workspace = true }
serde = { workspace = true, optional = true, features = [] }

View File

@@ -1,11 +1,7 @@
pub use diagnostic::Diagnostic;
pub use edit::Edit;
pub use fix::{Applicability, Fix, IsolationLevel};
pub use source_map::{SourceMap, SourceMarker};
pub use violation::{AlwaysFixableViolation, FixAvailability, Violation, ViolationMetadata};
mod diagnostic;
mod edit;
mod fix;
mod source_map;
mod violation;

View File

@@ -10,7 +10,7 @@ use ruff_python_ast::PythonVersion;
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{
Db, Program, ProgramSettings, PythonPath, PythonPlatform, PythonVersionSource,
PythonVersionWithSource, SearchPathSettings, default_lint_registry,
PythonVersionWithSource, SearchPathSettings, SysPrefixPathOrigin, default_lint_registry,
};
static EMPTY_VENDORED: std::sync::LazyLock<VendoredFileSystem> = std::sync::LazyLock::new(|| {
@@ -37,17 +37,18 @@ impl ModuleDb {
) -> Result<Self> {
let mut search_paths = SearchPathSettings::new(src_roots);
if let Some(venv_path) = venv_path {
search_paths.python_path = PythonPath::from_cli_flag(venv_path);
search_paths.python_path =
PythonPath::sys_prefix(venv_path, SysPrefixPathOrigin::PythonCliFlag);
}
let db = Self::default();
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource {
python_version: Some(PythonVersionWithSource {
version: python_version,
source: PythonVersionSource::default(),
},
}),
python_platform: PythonPlatform::default(),
search_paths,
},

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.11.11"
version = "0.11.12"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -10,22 +10,10 @@ from airflow import (
PY312,
)
from airflow.api_connexion.security import requires_access
from airflow.configuration import (
as_dict,
get,
getboolean,
getfloat,
getint,
has_option,
remove_option,
set,
)
from airflow.contrib.aws_athena_hook import AWSAthenaHook
from airflow.datasets import DatasetAliasEvent
from airflow.hooks.base_hook import BaseHook
from airflow.operators.subdag import SubDagOperator
from airflow.secrets.local_filesystem import LocalFilesystemBackend
from airflow.sensors.base_sensor_operator import BaseSensorOperator
from airflow.triggers.external_task import TaskStateTrigger
from airflow.utils import dates
from airflow.utils.dag_cycle_tester import test_cycle
@@ -40,13 +28,10 @@ from airflow.utils.dates import (
)
from airflow.utils.db import create_session
from airflow.utils.decorators import apply_defaults
from airflow.utils.file import TemporaryDirectory, mkdirs
from airflow.utils.helpers import chain as helper_chain
from airflow.utils.helpers import cross_downstream as helper_cross_downstream
from airflow.utils.log import secrets_masker
from airflow.utils.file import mkdirs
from airflow.utils.state import SHUTDOWN, terminating_states
from airflow.utils.trigger_rule import TriggerRule
from airflow.www.auth import has_access
from airflow.www.auth import has_access, has_access_dataset
from airflow.www.utils import get_sensitive_variables_fields, should_hide_value_for_key
# airflow root
@@ -55,11 +40,6 @@ PY36, PY37, PY38, PY39, PY310, PY311, PY312
# airflow.api_connexion.security
requires_access
# airflow.configuration
get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
# airflow.contrib.*
AWSAthenaHook()
@@ -68,10 +48,6 @@ AWSAthenaHook()
DatasetAliasEvent()
# airflow.hooks
BaseHook()
# airflow.operators.subdag.*
SubDagOperator()
@@ -81,10 +57,6 @@ SubDagOperator()
LocalFilesystemBackend()
# airflow.sensors.base_sensor_operator
BaseSensorOperator()
# airflow.triggers.external_task
TaskStateTrigger()
@@ -114,15 +86,8 @@ create_session
apply_defaults
# airflow.utils.file
TemporaryDirectory()
mkdirs
# airflow.utils.helpers
helper_chain
helper_cross_downstream
# airflow.utils.log
secrets_masker
# airflow.utils.state
SHUTDOWN
@@ -135,37 +100,8 @@ TriggerRule.NONE_FAILED_OR_SKIPPED
# airflow.www.auth
has_access
has_access_dataset
# airflow.www.utils
get_sensitive_variables_fields
should_hide_value_for_key
# airflow.operators.python
from airflow.operators.python import get_current_context
get_current_context()
# airflow.providers.mysql
from airflow.providers.mysql.datasets.mysql import sanitize_uri
sanitize_uri
# airflow.providers.postgres
from airflow.providers.postgres.datasets.postgres import sanitize_uri
sanitize_uri
# airflow.providers.trino
from airflow.providers.trino.datasets.trino import sanitize_uri
sanitize_uri
# airflow.notifications.basenotifier
from airflow.notifications.basenotifier import BaseNotifier
BaseNotifier()
# airflow.auth.manager
from airflow.auth.managers.base_auth_manager import BaseAuthManager
BaseAuthManager()

View File

@@ -3,7 +3,6 @@ from __future__ import annotations
from airflow.api_connexion.security import requires_access_dataset
from airflow.auth.managers.models.resource_details import (
DatasetDetails,
)
from airflow.datasets.manager import (
DatasetManager,
@@ -12,15 +11,13 @@ from airflow.datasets.manager import (
)
from airflow.lineage.hook import DatasetLineageInfo
from airflow.metrics.validators import AllowListValidator, BlockListValidator
from airflow.secrets.local_filesystm import load_connections
from airflow.secrets.local_filesystem import load_connections
from airflow.security.permissions import RESOURCE_DATASET
from airflow.www.auth import has_access_dataset
requires_access_dataset()
DatasetDetails()
DatasetManager()
dataset_manager()
resolve_dataset_manager()
@@ -34,7 +31,6 @@ load_connections()
RESOURCE_DATASET
has_access_dataset()
from airflow.listeners.spec.dataset import (
on_dataset_changed,
@@ -43,3 +39,76 @@ from airflow.listeners.spec.dataset import (
on_dataset_created()
on_dataset_changed()
# airflow.operators.python
from airflow.operators.python import get_current_context
get_current_context()
# airflow.providers.mysql
from airflow.providers.mysql.datasets.mysql import sanitize_uri
sanitize_uri
# airflow.providers.postgres
from airflow.providers.postgres.datasets.postgres import sanitize_uri
sanitize_uri
# airflow.providers.trino
from airflow.providers.trino.datasets.trino import sanitize_uri
sanitize_uri
# airflow.notifications.basenotifier
from airflow.notifications.basenotifier import BaseNotifier
BaseNotifier()
# airflow.auth.manager
from airflow.auth.managers.base_auth_manager import BaseAuthManager
BaseAuthManager()
from airflow.configuration import (
as_dict,
get,
getboolean,
getfloat,
getint,
has_option,
remove_option,
set,
)
# airflow.configuration
get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
from airflow.hooks.base_hook import BaseHook
# airflow.hooks
BaseHook()
from airflow.sensors.base_sensor_operator import BaseSensorOperator
# airflow.sensors.base_sensor_operator
BaseSensorOperator()
BaseHook()
from airflow.utils.helpers import chain as helper_chain
from airflow.utils.helpers import cross_downstream as helper_cross_downstream
# airflow.utils.helpers
helper_chain
helper_cross_downstream
# airflow.utils.file
from airflow.utils.file import TemporaryDirectory
TemporaryDirectory()
from airflow.utils.log import secrets_masker
# airflow.utils.log
secrets_masker

View File

@@ -1,54 +1,54 @@
from __future__ import annotations
from airflow.providers.amazon.aws.auth_manager.avp.entities import AvpEntities
from airflow.providers.openlineage.utils.utils import (
DatasetInfo,
translate_airflow_dataset,
)
from airflow.secrets.local_filesystem import load_connections
from airflow.security.permissions import RESOURCE_DATASET
AvpEntities.DATASET
# airflow.providers.openlineage.utils.utils
DatasetInfo()
translate_airflow_dataset()
# airflow.secrets.local_filesystem
load_connections()
# airflow.security.permissions
RESOURCE_DATASET
from airflow.providers.amazon.aws.datasets.s3 import (
convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
)
from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
s3_create_dataset()
s3_convert_dataset_to_openlineage()
from airflow.providers.common.io.dataset.file import (
convert_dataset_to_openlineage as io_convert_dataset_to_openlineage,
)
from airflow.providers.common.io.dataset.file import create_dataset as io_create_dataset
from airflow.providers.google.datasets.bigquery import (
create_dataset as bigquery_create_dataset,
)
from airflow.providers.google.datasets.gcs import (
convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
)
from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
from airflow.providers.openlineage.utils.utils import (
DatasetInfo,
translate_airflow_dataset,
)
AvpEntities.DATASET
s3_create_dataset()
s3_convert_dataset_to_openlineage()
io_create_dataset()
io_convert_dataset_to_openlineage()
# # airflow.providers.google.datasets.bigquery
from airflow.providers.google.datasets.bigquery import (
create_dataset as bigquery_create_dataset,
)
# airflow.providers.google.datasets.bigquery
bigquery_create_dataset()
# airflow.providers.google.datasets.gcs
from airflow.providers.google.datasets.gcs import (
convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
)
from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
gcs_create_dataset()
gcs_convert_dataset_to_openlineage()
# airflow.providers.openlineage.utils.utils
DatasetInfo()
translate_airflow_dataset()
#
# airflow.secrets.local_filesystem
load_connections()
#
# airflow.security.permissions
RESOURCE_DATASET
# airflow.timetables
DatasetTriggeredTimetable()
#
# airflow.www.auth
has_access_dataset

View File

@@ -5,35 +5,30 @@ from airflow.hooks.S3_hook import (
provide_bucket_name,
)
from airflow.operators.gcs_to_s3 import GCSToS3Operator
from airflow.operators.google_api_to_s3_transfer import (
GoogleApiToS3Operator,
GoogleApiToS3Transfer,
)
from airflow.operators.redshift_to_s3_operator import (
RedshiftToS3Operator,
RedshiftToS3Transfer,
)
from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
from airflow.operators.s3_to_redshift_operator import (
S3ToRedshiftOperator,
S3ToRedshiftTransfer,
)
from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
from airflow.sensors.s3_key_sensor import S3KeySensor
S3Hook()
provide_bucket_name()
GCSToS3Operator()
GoogleApiToS3Operator()
RedshiftToS3Operator()
S3FileTransformOperator()
S3ToRedshiftOperator()
S3KeySensor()
from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
GoogleApiToS3Transfer()
RedshiftToS3Operator()
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
RedshiftToS3Transfer()
S3FileTransformOperator()
from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
S3ToRedshiftOperator()
S3ToRedshiftTransfer()
S3KeySensor()

View File

@@ -4,10 +4,13 @@ from airflow.hooks.dbapi import (
ConnectorProtocol,
DbApiHook,
)
ConnectorProtocol()
DbApiHook()
from airflow.hooks.dbapi_hook import DbApiHook
from airflow.operators.check_operator import SQLCheckOperator
ConnectorProtocol()
DbApiHook()
SQLCheckOperator()
@@ -114,16 +117,11 @@ from airflow.sensors.sql_sensor import SqlSensor
SqlSensor()
from airflow.operators.jdbc_operator import JdbcOperator
from airflow.operators.mssql_operator import MsSqlOperator
from airflow.operators.mysql_operator import MySqlOperator
from airflow.operators.oracle_operator import OracleOperator
from airflow.operators.postgres_operator import PostgresOperator
from airflow.operators.sqlite_operator import SqliteOperator
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
JdbcOperator()
MsSqlOperator()
MySqlOperator()
OracleOperator()
PostgresOperator()
SqliteOperator()
SQLExecuteQueryOperator()
SQLExecuteQueryOperator()
SQLExecuteQueryOperator()
SQLExecuteQueryOperator()
SQLExecuteQueryOperator()
SQLExecuteQueryOperator()

View File

@@ -12,55 +12,59 @@ from airflow.macros.hive import (
)
from airflow.operators.hive_operator import HiveOperator
from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
from airflow.operators.hive_to_mysql import (
HiveToMySqlOperator,
HiveToMySqlTransfer,
)
from airflow.operators.hive_to_mysql import HiveToMySqlOperator
from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
from airflow.operators.mssql_to_hive import (
MsSqlToHiveOperator,
MsSqlToHiveTransfer,
)
from airflow.operators.mysql_to_hive import (
MySqlToHiveOperator,
MySqlToHiveTransfer,
)
from airflow.operators.s3_to_hive_operator import (
S3ToHiveOperator,
S3ToHiveTransfer,
)
from airflow.sensors.hive_partition_sensor import HivePartitionSensor
from airflow.sensors.metastore_partition_sensor import MetastorePartitionSensor
from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
HIVE_QUEUE_PRIORITIES
HiveCliHook()
HiveMetastoreHook()
HiveServer2Hook()
closest_ds_partition()
max_partition()
HiveCliHook()
HiveMetastoreHook()
HiveServer2Hook()
HIVE_QUEUE_PRIORITIES
HiveOperator()
HiveStatsCollectionOperator()
HiveToMySqlOperator()
HiveToMySqlTransfer()
HiveToSambaOperator()
MsSqlToHiveOperator()
MsSqlToHiveTransfer()
from airflow.operators.hive_to_mysql import HiveToMySqlTransfer
HiveToMySqlTransfer()
from airflow.operators.mysql_to_hive import MySqlToHiveOperator
MySqlToHiveOperator()
from airflow.operators.mysql_to_hive import MySqlToHiveTransfer
MySqlToHiveTransfer()
from airflow.operators.mssql_to_hive import MsSqlToHiveOperator
MsSqlToHiveOperator()
from airflow.operators.mssql_to_hive import MsSqlToHiveTransfer
MsSqlToHiveTransfer()
from airflow.operators.s3_to_hive_operator import S3ToHiveOperator
S3ToHiveOperator()
from airflow.operators.s3_to_hive_operator import S3ToHiveTransfer
S3ToHiveTransfer()
from airflow.sensors.hive_partition_sensor import HivePartitionSensor
HivePartitionSensor()
from airflow.sensors.metastore_partition_sensor import MetastorePartitionSensor
MetastorePartitionSensor()
from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
NamedHivePartitionSensor()

View File

@@ -16,14 +16,7 @@ from airflow.kubernetes.kube_client import (
from airflow.kubernetes.kubernetes_helper_functions import (
add_pod_suffix,
annotations_for_logging_task_metadata,
annotations_to_key,
create_pod_id,
get_logs_task_metadata,
rand_str,
)
from airflow.kubernetes.pod import (
Port,
Resources,
)
ALL_NAMESPACES
@@ -37,21 +30,13 @@ _enable_tcp_keepalive()
get_kube_client()
add_pod_suffix()
create_pod_id()
annotations_for_logging_task_metadata()
annotations_to_key()
get_logs_task_metadata()
rand_str()
Port()
Resources()
create_pod_id()
from airflow.kubernetes.pod_generator import (
PodDefaults,
PodGenerator,
PodGeneratorDeprecated,
add_pod_suffix,
datetime_to_label_safe_datestring,
extend_object_field,
@@ -61,18 +46,16 @@ from airflow.kubernetes.pod_generator import (
rand_str,
)
PodDefaults()
PodGenerator()
add_pod_suffix()
datetime_to_label_safe_datestring()
extend_object_field()
label_safe_datestring_to_datetime()
make_safe_label_value()
merge_objects()
PodGenerator()
PodDefaults()
PodGeneratorDeprecated()
add_pod_suffix()
rand_str()
from airflow.kubernetes.pod_generator_deprecated import (
PodDefaults,
PodGenerator,
@@ -90,7 +73,6 @@ make_safe_label_value()
PodLauncher()
PodStatus()
from airflow.kubernetes.pod_launcher_deprecated import (
PodDefaults,
PodLauncher,
@@ -115,3 +97,17 @@ K8SModel()
Secret()
Volume()
VolumeMount()
from airflow.kubernetes.kubernetes_helper_functions import (
annotations_to_key,
get_logs_task_metadata,
rand_str,
)
annotations_to_key()
get_logs_task_metadata()
rand_str()
from airflow.kubernetes.pod_generator import PodGeneratorDeprecated
PodGeneratorDeprecated()

View File

@@ -5,10 +5,6 @@ from airflow.operators.dagrun_operator import (
TriggerDagRunLink,
TriggerDagRunOperator,
)
from airflow.operators.dummy import (
DummyOperator,
EmptyOperator,
)
from airflow.operators.latest_only_operator import LatestOnlyOperator
from airflow.operators.python_operator import (
BranchPythonOperator,
@@ -19,15 +15,12 @@ from airflow.operators.python_operator import (
from airflow.sensors.external_task_sensor import (
ExternalTaskMarker,
ExternalTaskSensor,
ExternalTaskSensorLink,
)
BashOperator()
TriggerDagRunLink()
TriggerDagRunOperator()
DummyOperator()
EmptyOperator()
LatestOnlyOperator()
@@ -38,25 +31,48 @@ ShortCircuitOperator()
ExternalTaskMarker()
ExternalTaskSensor()
ExternalTaskSensorLink()
from airflow.operators.dummy_operator import (
DummyOperator,
EmptyOperator,
)
DummyOperator()
EmptyOperator()
from airflow.hooks.subprocess import SubprocessResult
SubprocessResult()
from airflow.hooks.subprocess import working_directory
working_directory()
from airflow.operators.datetime import target_times_as_dates
target_times_as_dates()
from airflow.operators.trigger_dagrun import TriggerDagRunLink
TriggerDagRunLink()
from airflow.sensors.external_task import ExternalTaskSensorLink
ExternalTaskSensorLink()
from airflow.sensors.time_delta import WaitSensor
WaitSensor()
WaitSensor()
from airflow.operators.dummy import DummyOperator
DummyOperator()
from airflow.operators.dummy import EmptyOperator
EmptyOperator()
from airflow.operators.dummy_operator import DummyOperator
DummyOperator()
from airflow.operators.dummy_operator import EmptyOperator
EmptyOperator()
from airflow.sensors.external_task_sensor import ExternalTaskSensorLink
ExternalTaskSensorLink()

View File

@@ -9,19 +9,12 @@ from airflow.datasets import (
expand_alias_to_datasets,
)
from airflow.datasets.metadata import Metadata
from airflow.decorators import dag, setup, task, task_group, teardown
from airflow.io.path import ObjectStoragePath
from airflow.io.storage import attach
from airflow.models import DAG as DAGFromModel
from airflow.models import (
Connection,
Variable,
from airflow.decorators import (
dag,
setup,
task,
task_group,
)
from airflow.models.baseoperator import chain, chain_linear, cross_downstream
from airflow.models.baseoperatorlink import BaseOperatorLink
from airflow.models.dag import DAG as DAGFromDag
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.utils.dag_parsing_context import get_parsing_context
# airflow
DatasetFromRoot()
@@ -39,9 +32,22 @@ dag()
task()
task_group()
setup()
from airflow.decorators import teardown
from airflow.io.path import ObjectStoragePath
from airflow.io.storage import attach
from airflow.models import DAG as DAGFromModel
from airflow.models import (
Connection,
Variable,
)
from airflow.models.baseoperator import chain, chain_linear, cross_downstream
from airflow.models.baseoperatorlink import BaseOperatorLink
from airflow.models.dag import DAG as DAGFromDag
# airflow.decorators
teardown()
# airflow.io
# # airflow.io
ObjectStoragePath()
attach()
@@ -60,6 +66,9 @@ BaseOperatorLink()
# airflow.models.dag
DAGFromDag()
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.utils.dag_parsing_context import get_parsing_context
# airflow.timetables.datasets
DatasetOrTimeSchedule()

View File

@@ -7,49 +7,71 @@ from airflow.operators.bash import BashOperator
from airflow.operators.datetime import BranchDateTimeOperator
from airflow.operators.empty import EmptyOperator
from airflow.operators.latest_only import LatestOnlyOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.operators.weekday import BranchDayOfWeekOperator
from airflow.sensors.date_time import DateTimeSensor
FSHook()
PackageIndexHook()
SubprocessHook()
BashOperator()
BranchDateTimeOperator()
TriggerDagRunOperator()
EmptyOperator()
LatestOnlyOperator()
BranchDayOfWeekOperator()
DateTimeSensor()
from airflow.operators.python import (
BranchPythonOperator,
PythonOperator,
PythonVirtualenvOperator,
ShortCircuitOperator,
)
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.operators.weekday import BranchDayOfWeekOperator
from airflow.sensors.date_time import DateTimeSensor, DateTimeSensorAsync
from airflow.sensors.date_time import DateTimeSensorAsync
from airflow.sensors.external_task import (
ExternalTaskMarker,
ExternalTaskSensor,
)
from airflow.sensors.time_sensor import (
TimeSensor,
TimeSensorAsync,
)
from airflow.sensors.filesystem import FileSensor
from airflow.sensors.time_delta import TimeDeltaSensor, TimeDeltaSensorAsync
from airflow.sensors.time_sensor import TimeSensor, TimeSensorAsync
from airflow.sensors.weekday import DayOfWeekSensor
from airflow.triggers.external_task import DagStateTrigger, WorkflowTrigger
from airflow.triggers.file import FileTrigger
from airflow.triggers.temporal import DateTimeTrigger, TimeDeltaTrigger
FSHook()
PackageIndexHook()
SubprocessHook()
BashOperator()
BranchDateTimeOperator()
TriggerDagRunOperator()
EmptyOperator()
LatestOnlyOperator()
(
BranchPythonOperator(),
PythonOperator(),
PythonVirtualenvOperator(),
ShortCircuitOperator(),
)
BranchDayOfWeekOperator()
DateTimeSensor(), DateTimeSensorAsync()
ExternalTaskMarker(), ExternalTaskSensor()
BranchPythonOperator()
PythonOperator()
PythonVirtualenvOperator()
ShortCircuitOperator()
DateTimeSensorAsync()
ExternalTaskMarker()
ExternalTaskSensor()
FileSensor()
TimeSensor(), TimeSensorAsync()
TimeDeltaSensor(), TimeDeltaSensorAsync()
TimeSensor()
TimeSensorAsync()
from airflow.sensors.time_delta import (
TimeDeltaSensor,
TimeDeltaSensorAsync,
)
from airflow.sensors.weekday import DayOfWeekSensor
from airflow.triggers.external_task import (
DagStateTrigger,
WorkflowTrigger,
)
from airflow.triggers.file import FileTrigger
from airflow.triggers.temporal import (
DateTimeTrigger,
TimeDeltaTrigger,
)
TimeDeltaSensor()
TimeDeltaSensorAsync()
DayOfWeekSensor()
DagStateTrigger(), WorkflowTrigger()
DagStateTrigger()
WorkflowTrigger()
FileTrigger()
DateTimeTrigger(), TimeDeltaTrigger()
DateTimeTrigger()
TimeDeltaTrigger()

View File

@@ -178,3 +178,38 @@ async def unknown_1(other: str = Depends(unknown_unresolved)): ...
async def unknown_2(other: str = Depends(unknown_not_function)): ...
@app.get("/things/{thing_id}")
async def unknown_3(other: str = Depends(unknown_imported)): ...
# Class dependencies
from pydantic import BaseModel
from dataclasses import dataclass
class PydanticParams(BaseModel):
my_id: int
class InitParams:
def __init__(self, my_id: int):
self.my_id = my_id
# Errors
@app.get("/{id}")
async def get_id_pydantic_full(
params: Annotated[PydanticParams, Depends(PydanticParams)],
): ...
@app.get("/{id}")
async def get_id_pydantic_short(params: Annotated[PydanticParams, Depends()]): ...
@app.get("/{id}")
async def get_id_init_not_annotated(params = Depends(InitParams)): ...
# No errors
@app.get("/{my_id}")
async def get_id_pydantic_full(
params: Annotated[PydanticParams, Depends(PydanticParams)],
): ...
@app.get("/{my_id}")
async def get_id_pydantic_short(params: Annotated[PydanticParams, Depends()]): ...
@app.get("/{my_id}")
async def get_id_init_not_annotated(params = Depends(InitParams)): ...

View File

@@ -145,3 +145,23 @@ def func():
sleep = 10
anyio.run(main)
async def test_anyio_async115_helpers():
import anyio
await anyio.sleep(delay=1) # OK
await anyio.sleep(seconds=1) # OK
await anyio.sleep(delay=0) # ASYNC115
await anyio.sleep(seconds=0) # OK
async def test_trio_async115_helpers():
import trio
await trio.sleep(seconds=1) # OK
await trio.sleep(delay=1) # OK
await trio.sleep(seconds=0) # ASYNC115
await trio.sleep(delay=0) # OK

View File

@@ -108,3 +108,23 @@ async def import_from_anyio():
# catch from import
await sleep(86401) # error: 116, "async"
async def test_anyio_async116_helpers():
import anyio
await anyio.sleep(delay=1) # OK
await anyio.sleep(seconds=1) # OK
await anyio.sleep(delay=86401) # ASYNC116
await anyio.sleep(seconds=86401) # OK
async def test_trio_async116_helpers():
import trio
await trio.sleep(seconds=1) # OK
await trio.sleep(delay=1) # OK
await trio.sleep(seconds=86401) # ASYNC116
await trio.sleep(delay=86401) # OK

View File

@@ -22,3 +22,8 @@ def my_func():
# Implicit string concatenation
"0.0.0.0" f"0.0.0.0{expr}0.0.0.0"
# t-strings - all ok
t"0.0.0.0"
"0.0.0.0" t"0.0.0.0{expr}0.0.0.0"
"0.0.0.0" f"0.0.0.0{expr}0.0.0.0" t"0.0.0.0{expr}0.0.0.0"

View File

@@ -40,3 +40,7 @@ with tempfile.TemporaryDirectory(dir="/dev/shm") as d:
with TemporaryDirectory(dir="/tmp") as d:
pass
# ok (runtime error from t-string)
with open(t"/foo/bar", "w") as f:
f.write("def")

View File

@@ -169,3 +169,13 @@ query60 = f"""
# https://github.com/astral-sh/ruff/issues/17967
query61 = f"SELECT * FROM table" # skip expressionless f-strings
# t-strings
query62 = t"SELECT * FROM table"
query63 = t"""
SELECT *,
foo
FROM ({user_input}) raw
"""
query64 = f"update {t"{table}"} set var = {t"{var}"}"
query65 = t"update {f"{table}"} set var = {f"{var}"}"

View File

@@ -67,3 +67,6 @@ getattr(self.
import builtins
builtins.getattr(foo, "bar")
# Regression test for: https://github.com/astral-sh/ruff/issues/18353
setattr(foo, "__debug__", 0)

View File

@@ -91,3 +91,99 @@ _ = "\8""0" # fix should be "\80"
_ = "\12""8" # fix should be "\128"
_ = "\12""foo" # fix should be "\12foo"
_ = "\12" "" # fix should be "\12"
# Mixed literal + non-literal scenarios
_ = (
"start" +
variable +
"end"
)
_ = (
f"format" +
func_call() +
"literal"
)
_ = (
rf"raw_f{x}" +
r"raw_normal"
)
# Different prefix combinations
_ = (
u"unicode" +
r"raw"
)
_ = (
rb"raw_bytes" +
b"normal_bytes"
)
_ = (
b"bytes" +
b"with_bytes"
)
# Repeated concatenation
_ = ("a" +
"b" +
"c" +
"d" + "e"
)
_ = ("a"
+ "b"
+ "c"
+ "d"
+ "e"
)
_ = (
"start" +
variable + # comment
"end"
)
_ = (
"start" +
variable
# leading comment
+ "end"
)
_ = (
"first"
+ "second" # extra spaces around +
)
_ = (
"first" + # trailing spaces before +
"second"
)
_ = ((
"deep" +
"nesting"
))
_ = (
"contains + plus" +
"another string"
)
_ = (
"start"
# leading comment
+ "end"
)
_ = (
"start" +
# leading comment
"end"
)

View File

@@ -72,3 +72,5 @@ def not_warnings_dot_deprecated(
@not_warnings_dot_deprecated("Not warnings.deprecated, so this one *should* lead to PYI053 in a stub!")
def not_a_deprecated_function() -> None: ...
baz: str = t"51 character stringgggggggggggggggggggggggggggggggg"

View File

@@ -80,3 +80,7 @@ x: TypeAlias = Literal["fooooooooooooooooooooooooooooooooooooooooooooooooooooooo
# Ok
y: TypeAlias = Annotated[int, "metadataaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"]
ttoo: str = t"50 character stringggggggggggggggggggggggggggggggg" # OK
tbar: str = t"51 character stringgggggggggggggggggggggggggggggggg" # Error: PYI053

View File

@@ -39,3 +39,27 @@ f'\'normal\' {f'nested'} normal' # Q003
f'\'normal\' {f'nested'} "double quotes"'
f'\'normal\' {f'\'nested\' {'other'} normal'} "double quotes"' # Q003
f'\'normal\' {f'\'nested\' {'other'} "double quotes"'} normal' # Q003
# Same as above, but with t-strings
t'This is a \'string\'' # Q003
t'This is \\ a \\\'string\'' # Q003
t'"This" is a \'string\''
f"This is a 'string'"
f"\"This\" is a 'string'"
fr'This is a \'string\''
fR'This is a \'string\''
foo = (
t'This is a'
t'\'string\'' # Q003
)
t'\'foo\' {'nested'}' # Q003
t'\'foo\' {t'nested'}' # Q003
t'\'foo\' {t'\'nested\''} \'\'' # Q003
t'normal {t'nested'} normal'
t'\'normal\' {t'nested'} normal' # Q003
t'\'normal\' {t'nested'} "double quotes"'
t'\'normal\' {t'\'nested\' {'other'} normal'} "double quotes"' # Q003
t'\'normal\' {t'\'nested\' {'other'} "double quotes"'} normal' # Q003

View File

@@ -37,3 +37,25 @@ f"\"normal\" {f"nested"} normal" # Q003
f"\"normal\" {f"nested"} 'single quotes'"
f"\"normal\" {f"\"nested\" {"other"} normal"} 'single quotes'" # Q003
f"\"normal\" {f"\"nested\" {"other"} 'single quotes'"} normal" # Q003
# Same as above, but with t-strings
t"This is a \"string\""
t"'This' is a \"string\""
f'This is a "string"'
f'\'This\' is a "string"'
fr"This is a \"string\""
fR"This is a \"string\""
foo = (
t"This is a"
t"\"string\""
)
t"\"foo\" {"foo"}" # Q003
t"\"foo\" {t"foo"}" # Q003
t"\"foo\" {t"\"foo\""} \"\"" # Q003
t"normal {t"nested"} normal"
t"\"normal\" {t"nested"} normal" # Q003
t"\"normal\" {t"nested"} 'single quotes'"
t"\"normal\" {t"\"nested\" {"other"} normal"} 'single quotes'" # Q003
t"\"normal\" {t"\"nested\" {"other"} 'single quotes'"} normal" # Q003

View File

@@ -0,0 +1,15 @@
import os
from pathlib import Path
os.symlink("usr/bin/python", "tmp/python")
os.symlink(b"usr/bin/python", b"tmp/python")
Path("tmp/python").symlink_to("usr/bin/python") # Ok
os.symlink("usr/bin/python", "tmp/python", target_is_directory=True)
os.symlink(b"usr/bin/python", b"tmp/python", target_is_directory=True)
Path("tmp/python").symlink_to("usr/bin/python", target_is_directory=True) # Ok
fd = os.open(".", os.O_RDONLY)
os.symlink("source.txt", "link.txt", dir_fd=fd) # Ok: dir_fd is not supported by pathlib
os.close(fd)

View File

@@ -266,3 +266,15 @@ def f():
result = list() # this should be replaced with a comprehension
for i in values:
result.append(i + 1) # PERF401
def f():
src = [1]
dst = []
for i in src:
if True if True else False:
dst.append(i)
for i in src:
if lambda: 0:
dst.append(i)


@@ -151,3 +151,16 @@ def foo():
result = {}
for idx, name in indices, fruit:
result[name] = idx # PERF403
def foo():
src = (("x", 1),)
dst = {}
for k, v in src:
if True if True else False:
dst[k] = v
for k, v in src:
if lambda: 0:
dst[k] = v


@@ -1,4 +1,4 @@
# Same as `W605_0.py` but using f-strings instead.
# Same as `W605_0.py` but using f-strings and t-strings instead.
#: W605:1:10
regex = f'\.png$'
@@ -66,3 +66,72 @@ s = f"TOTAL: {total}\nOK: {ok}\INCOMPLETE: {incomplete}\n"
# Debug text (should trigger)
t = f"{'\InHere'=}"
#: W605:1:10
regex = t'\.png$'
#: W605:2:1
regex = t'''
\.png$
'''
#: W605:2:6
f(
t'\_'
)
#: W605:4:6
t"""
multi-line
literal
with \_ somewhere
in the middle
"""
#: W605:1:38
value = t'new line\nand invalid escape \_ here'
#: Okay
regex = fr'\.png$'
regex = t'\\.png$'
regex = fr'''
\.png$
'''
regex = fr'''
\\.png$
'''
s = t'\\'
regex = t'\w' # noqa
regex = t'''
\w
''' # noqa
regex = t'\\\_'
value = t'\{{1}}'
value = t'\{1}'
value = t'{1:\}'
value = t"{t"\{1}"}"
value = rt"{t"\{1}"}"
# Okay
value = rt'\{{1}}'
value = rt'\{1}'
value = rt'{1:\}'
value = t"{rt"\{1}"}"
# Regression tests for https://github.com/astral-sh/ruff/issues/10434
t"{{}}+-\d"
t"\n{{}}+-\d+"
t"\n{{}}<7D>+-\d+"
# See https://github.com/astral-sh/ruff/issues/11491
total = 10
ok = 7
incomplete = 3
s = t"TOTAL: {total}\nOK: {ok}\INCOMPLETE: {incomplete}\n"
# Debug text (should trigger)
t = t"{'\InHere'=}"


@@ -0,0 +1,184 @@
SEQ = "1,2,3"
class Foo(str):
class_str = "1,2,3"
def split(self, sep=None, maxsplit=-1) -> list[str]:
return super().split(sep, maxsplit)
class Bar():
split = "1,2,3"
# Errors
## Test split called directly on string literal
"1,2,3".split(",")[0] # [missing-maxsplit-arg]
"1,2,3".split(",")[-1] # [missing-maxsplit-arg]
"1,2,3".rsplit(",")[0] # [missing-maxsplit-arg]
"1,2,3".rsplit(",")[-1] # [missing-maxsplit-arg]
## Test split called on string variable
SEQ.split(",")[0] # [missing-maxsplit-arg]
SEQ.split(",")[-1] # [missing-maxsplit-arg]
SEQ.rsplit(",")[0] # [missing-maxsplit-arg]
SEQ.rsplit(",")[-1] # [missing-maxsplit-arg]
## Test split called on class attribute
Foo.class_str.split(",")[0] # [missing-maxsplit-arg]
Foo.class_str.split(",")[-1] # [missing-maxsplit-arg]
Foo.class_str.rsplit(",")[0] # [missing-maxsplit-arg]
Foo.class_str.rsplit(",")[-1] # [missing-maxsplit-arg]
## Test split called on sliced string
"1,2,3"[::-1].split(",")[0] # [missing-maxsplit-arg]
"1,2,3"[::-1][::-1].split(",")[0] # [missing-maxsplit-arg]
SEQ[:3].split(",")[0] # [missing-maxsplit-arg]
Foo.class_str[1:3].split(",")[-1] # [missing-maxsplit-arg]
"1,2,3"[::-1].rsplit(",")[0] # [missing-maxsplit-arg]
SEQ[:3].rsplit(",")[0] # [missing-maxsplit-arg]
Foo.class_str[1:3].rsplit(",")[-1] # [missing-maxsplit-arg]
## Test sep given as named argument
"1,2,3".split(sep=",")[0] # [missing-maxsplit-arg]
"1,2,3".split(sep=",")[-1] # [missing-maxsplit-arg]
"1,2,3".rsplit(sep=",")[0] # [missing-maxsplit-arg]
"1,2,3".rsplit(sep=",")[-1] # [missing-maxsplit-arg]
## Special cases
"1,2,3".split("\n")[0] # [missing-maxsplit-arg]
"1,2,3".split("split")[-1] # [missing-maxsplit-arg]
"1,2,3".rsplit("rsplit")[0] # [missing-maxsplit-arg]
## Test class attribute named split
Bar.split.split(",")[0] # [missing-maxsplit-arg]
Bar.split.split(",")[-1] # [missing-maxsplit-arg]
Bar.split.rsplit(",")[0] # [missing-maxsplit-arg]
Bar.split.rsplit(",")[-1] # [missing-maxsplit-arg]
## Test unpacked dict literal kwargs
"1,2,3".split(**{"sep": ","})[0] # [missing-maxsplit-arg]
# OK
## Test not accessing the first or last element
### Test split called directly on string literal
"1,2,3".split(",")[1]
"1,2,3".split(",")[-2]
"1,2,3".rsplit(",")[1]
"1,2,3".rsplit(",")[-2]
### Test split called on string variable
SEQ.split(",")[1]
SEQ.split(",")[-2]
SEQ.rsplit(",")[1]
SEQ.rsplit(",")[-2]
### Test split called on class attribute
Foo.class_str.split(",")[1]
Foo.class_str.split(",")[-2]
Foo.class_str.rsplit(",")[1]
Foo.class_str.rsplit(",")[-2]
### Test split called on sliced string
"1,2,3"[::-1].split(",")[1]
SEQ[:3].split(",")[1]
Foo.class_str[1:3].split(",")[-2]
"1,2,3"[::-1].rsplit(",")[1]
SEQ[:3].rsplit(",")[1]
Foo.class_str[1:3].rsplit(",")[-2]
### Test sep given as named argument
"1,2,3".split(sep=",")[1]
"1,2,3".split(sep=",")[-2]
"1,2,3".rsplit(sep=",")[1]
"1,2,3".rsplit(sep=",")[-2]
## Test varying maxsplit argument
### str.split() tests
"1,2,3".split(sep=",", maxsplit=1)[-1]
"1,2,3".split(sep=",", maxsplit=1)[0]
"1,2,3".split(sep=",", maxsplit=2)[-1]
"1,2,3".split(sep=",", maxsplit=2)[0]
"1,2,3".split(sep=",", maxsplit=2)[1]
### str.rsplit() tests
"1,2,3".rsplit(sep=",", maxsplit=1)[-1]
"1,2,3".rsplit(sep=",", maxsplit=1)[0]
"1,2,3".rsplit(sep=",", maxsplit=2)[-1]
"1,2,3".rsplit(sep=",", maxsplit=2)[0]
"1,2,3".rsplit(sep=",", maxsplit=2)[1]
## Test user-defined split
Foo("1,2,3").split(",")[0]
Foo("1,2,3").split(",")[-1]
Foo("1,2,3").rsplit(",")[0]
Foo("1,2,3").rsplit(",")[-1]
## Test split called on sliced list
["1", "2", "3"][::-1].split(",")[0]
## Test class attribute named split
Bar.split[0]
Bar.split[-1]
Bar.split[0]
Bar.split[-1]
## Test unpacked dict literal kwargs
"1,2,3".split(",", **{"maxsplit": 1})[0]
"1,2,3".split(**{"sep": ",", "maxsplit": 1})[0]
# TODO
## Test variable split result index
## TODO: These require the ability to resolve a variable name to a value
# Errors
result_index = 0
"1,2,3".split(",")[result_index] # TODO: [missing-maxsplit-arg]
result_index = -1
"1,2,3".split(",")[result_index] # TODO: [missing-maxsplit-arg]
# OK
result_index = 1
"1,2,3".split(",")[result_index]
result_index = -2
"1,2,3".split(",")[result_index]
## Test split result index modified in loop
## TODO: These require the ability to recognize being in a loop where:
## - the result of split called on a string is indexed by a variable
## - the variable index above is modified
# OK
result_index = 0
for j in range(3):
print(SEQ.split(",")[result_index])
result_index = result_index + 1
## Test accessor
## TODO: These require the ability to get the return type of a method
## (possibly via `typing::is_string`)
class Baz():
def __init__(self):
self.my_str = "1,2,3"
def get_string(self) -> str:
return self.my_str
# Errors
Baz().get_string().split(",")[0] # TODO: [missing-maxsplit-arg]
Baz().get_string().split(",")[-1] # TODO: [missing-maxsplit-arg]
# OK
Baz().get_string().split(",")[1]
Baz().get_string().split(",")[-2]
## Test unpacked dict instance kwargs
## TODO: These require the ability to resolve a dict variable name to a value
# Errors
kwargs_without_maxsplit = {"seq": ","}
"1,2,3".split(**kwargs_without_maxsplit)[0] # TODO: [missing-maxsplit-arg]
# OK
kwargs_with_maxsplit = {"maxsplit": 1}
"1,2,3".split(",", **kwargs_with_maxsplit)[0] # TODO: false positive
kwargs_with_maxsplit = {"sep": ",", "maxsplit": 1}
"1,2,3".split(**kwargs_with_maxsplit)[0] # TODO: false positive


@@ -12,3 +12,4 @@ if True:
if True:
from __future__ import generator_stop
from __future__ import invalid_module, generators
from __future__ import generators # comment


@@ -0,0 +1,84 @@
class A:
...
class A(metaclass=type):
...
class A(
metaclass=type
):
...
class A(
metaclass=type
#
):
...
class A(
#
metaclass=type
):
...
class A(
metaclass=type,
#
):
...
class A(
#
metaclass=type,
#
):
...
class B(A, metaclass=type):
...
class B(
A,
metaclass=type,
):
...
class B(
A,
# comment
metaclass=type,
):
...
def foo():
class A(metaclass=type):
...
class A(
metaclass=type # comment
,
):
...
type = str
class Foo(metaclass=type):
...
import builtins
class A(metaclass=builtins.type):
...


@@ -43,7 +43,6 @@ def func():
import builtins
with builtins.open("FURB129.py") as f:
for line in f.readlines():
pass
@@ -51,7 +50,6 @@ with builtins.open("FURB129.py") as f:
from builtins import open as o
with o("FURB129.py") as f:
for line in f.readlines():
pass
@@ -89,3 +87,18 @@ with open("FURB129.py") as f:
pass
for _not_line in f.readline():
pass
# https://github.com/astral-sh/ruff/issues/18231
with open("furb129.py") as f:
for line in (f).readlines():
pass
with open("furb129.py") as f:
[line for line in (f).readlines()]
with open("furb129.py") as f:
for line in (((f))).readlines():
pass
for line in(f).readlines():
pass


@@ -0,0 +1,53 @@
# Errors.
if 1 in set([1]):
print("Single-element set")
if 1 in set((1,)):
print("Single-element set")
if 1 in set({1}):
print("Single-element set")
if 1 in frozenset([1]):
print("Single-element set")
if 1 in frozenset((1,)):
print("Single-element set")
if 1 in frozenset({1}):
print("Single-element set")
if 1 in set(set([1])):
print('Recursive solution')
# Non-errors.
if 1 in set((1, 2)):
pass
if 1 in set([1, 2]):
pass
if 1 in set({1, 2}):
pass
if 1 in frozenset((1, 2)):
pass
if 1 in frozenset([1, 2]):
pass
if 1 in frozenset({1, 2}):
pass
if 1 in set(1,):
pass
if 1 in set(1,2):
pass
if 1 in set((x for x in range(2))):
pass


@@ -1,6 +1,6 @@
use ruff_diagnostics::{Diagnostic, Fix};
use ruff_text_size::Ranged;
use crate::Fix;
use crate::checkers::ast::Checker;
use crate::codes::Rule;
use crate::rules::{
@@ -38,92 +38,64 @@ pub(crate) fn bindings(checker: &Checker) {
.dummy_variable_rgx
.is_match(binding.name(checker.source()))
{
let mut diagnostic = Diagnostic::new(
pyflakes::rules::UnusedVariable {
name: binding.name(checker.source()).to_string(),
},
binding.range(),
);
diagnostic.try_set_fix(|| {
pyflakes::fixes::remove_exception_handler_assignment(binding, checker.locator)
checker
.report_diagnostic(
pyflakes::rules::UnusedVariable {
name: binding.name(checker.source()).to_string(),
},
binding.range(),
)
.try_set_fix(|| {
pyflakes::fixes::remove_exception_handler_assignment(
binding,
checker.locator,
)
.map(Fix::safe_edit)
});
checker.report_diagnostic(diagnostic);
});
}
}
if checker.enabled(Rule::InvalidAllFormat) {
if let Some(diagnostic) = pylint::rules::invalid_all_format(binding) {
checker.report_diagnostic(diagnostic);
}
pylint::rules::invalid_all_format(checker, binding);
}
if checker.enabled(Rule::InvalidAllObject) {
if let Some(diagnostic) = pylint::rules::invalid_all_object(binding) {
checker.report_diagnostic(diagnostic);
}
pylint::rules::invalid_all_object(checker, binding);
}
if checker.enabled(Rule::NonAsciiName) {
if let Some(diagnostic) = pylint::rules::non_ascii_name(binding, checker.locator) {
checker.report_diagnostic(diagnostic);
}
pylint::rules::non_ascii_name(checker, binding);
}
if checker.enabled(Rule::UnconventionalImportAlias) {
if let Some(diagnostic) = flake8_import_conventions::rules::unconventional_import_alias(
flake8_import_conventions::rules::unconventional_import_alias(
checker,
binding,
&checker.settings.flake8_import_conventions.aliases,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::UnaliasedCollectionsAbcSetImport) {
if let Some(diagnostic) =
flake8_pyi::rules::unaliased_collections_abc_set_import(checker, binding)
{
checker.report_diagnostic(diagnostic);
}
flake8_pyi::rules::unaliased_collections_abc_set_import(checker, binding);
}
if !checker.source_type.is_stub() && checker.enabled(Rule::UnquotedTypeAlias) {
flake8_type_checking::rules::unquoted_type_alias(checker, binding);
}
if checker.enabled(Rule::UnsortedDunderSlots) {
if let Some(diagnostic) = ruff::rules::sort_dunder_slots(checker, binding) {
checker.report_diagnostic(diagnostic);
}
ruff::rules::sort_dunder_slots(checker, binding);
}
if checker.enabled(Rule::UsedDummyVariable) {
if let Some(diagnostic) = ruff::rules::used_dummy_variable(checker, binding, binding_id)
{
checker.report_diagnostic(diagnostic);
}
ruff::rules::used_dummy_variable(checker, binding, binding_id);
}
if checker.enabled(Rule::AssignmentInAssert) {
if let Some(diagnostic) = ruff::rules::assignment_in_assert(checker, binding) {
checker.report_diagnostic(diagnostic);
}
ruff::rules::assignment_in_assert(checker, binding);
}
if checker.enabled(Rule::PytestUnittestRaisesAssertion) {
if let Some(diagnostic) =
flake8_pytest_style::rules::unittest_raises_assertion_binding(checker, binding)
{
checker.report_diagnostic(diagnostic);
}
flake8_pytest_style::rules::unittest_raises_assertion_binding(checker, binding);
}
if checker.enabled(Rule::ForLoopWrites) {
if let Some(diagnostic) = refurb::rules::for_loop_writes_binding(checker, binding) {
checker.report_diagnostic(diagnostic);
}
refurb::rules::for_loop_writes_binding(checker, binding);
}
if checker.enabled(Rule::CustomTypeVarForSelf) {
if let Some(diagnostic) =
flake8_pyi::rules::custom_type_var_instead_of_self(checker, binding)
{
checker.report_diagnostic(diagnostic);
}
flake8_pyi::rules::custom_type_var_instead_of_self(checker, binding);
}
if checker.enabled(Rule::PrivateTypeParameter) {
if let Some(diagnostic) = pyupgrade::rules::private_type_parameter(checker, binding) {
checker.report_diagnostic(diagnostic);
}
pyupgrade::rules::private_type_parameter(checker, binding);
}
}
}


@@ -1,9 +1,9 @@
use ruff_diagnostics::{Diagnostic, Fix};
use ruff_python_semantic::analyze::visibility;
use ruff_python_semantic::{Binding, BindingKind, Imported, ResolvedReference, ScopeKind};
use ruff_text_size::Ranged;
use rustc_hash::FxHashMap;
use crate::Fix;
use crate::checkers::ast::Checker;
use crate::codes::Rule;
use crate::fix;
@@ -112,12 +112,12 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
.map(|id| checker.semantic.reference(*id))
.all(ResolvedReference::is_load)
{
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pylint::rules::GlobalVariableNotAssigned {
name: (*name).to_string(),
},
binding.range(),
));
);
}
}
}
@@ -146,12 +146,12 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
if scope.kind.is_generator() {
continue;
}
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pylint::rules::RedefinedArgumentFromLocal {
name: name.to_string(),
},
binding.range(),
));
);
}
}
}
@@ -186,13 +186,13 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
continue;
}
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::ImportShadowedByLoopVar {
name: name.to_string(),
row: checker.compute_source_row(shadowed.start()),
},
binding.range(),
));
);
}
}
}
@@ -331,7 +331,7 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
// Create diagnostics for each statement.
for (source, entries) in &redefinitions {
for (shadowed, binding) in entries {
let mut diagnostic = Diagnostic::new(
let mut diagnostic = checker.report_diagnostic(
pyflakes::rules::RedefinedWhileUnused {
name: binding.name(checker.source()).to_string(),
row: checker.compute_source_row(shadowed.start()),
@@ -346,8 +346,6 @@ pub(crate) fn deferred_scopes(checker: &Checker) {
if let Some(fix) = source.as_ref().and_then(|source| fixes.get(source)) {
diagnostic.set_fix(fix.clone());
}
checker.report_diagnostic(diagnostic);
}
}
}


@@ -17,14 +17,7 @@ pub(crate) fn except_handler(except_handler: &ExceptHandler, checker: &Checker)
range: _,
}) => {
if checker.enabled(Rule::BareExcept) {
if let Some(diagnostic) = pycodestyle::rules::bare_except(
type_.as_deref(),
body,
except_handler,
checker.locator,
) {
checker.report_diagnostic(diagnostic);
}
pycodestyle::rules::bare_except(checker, type_.as_deref(), body, except_handler);
}
if checker.enabled(Rule::RaiseWithoutFromInsideExcept) {
flake8_bugbear::rules::raise_without_from_inside_except(


@@ -1,8 +1,6 @@
use ruff_python_ast::{self as ast, Arguments, Expr, ExprContext, Operator};
use ruff_python_literal::cformat::{CFormatError, CFormatErrorType};
use ruff_diagnostics::Diagnostic;
use ruff_python_ast::types::Node;
use ruff_python_semantic::ScopeKind;
use ruff_python_semantic::analyze::typing;
@@ -178,6 +176,9 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if checker.enabled(Rule::Airflow3Removal) {
airflow::rules::airflow_3_removal_expr(checker, expr);
}
if checker.enabled(Rule::MissingMaxsplitArg) {
pylint::rules::missing_maxsplit_arg(checker, value, slice, expr);
}
pandas_vet::rules::subscript(checker, value, expr);
}
Expr::Tuple(ast::ExprTuple {
@@ -195,14 +196,13 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
let check_too_many_expressions = checker.enabled(Rule::ExpressionsInStarAssignment);
let check_two_starred_expressions =
checker.enabled(Rule::MultipleStarredExpressions);
if let Some(diagnostic) = pyflakes::rules::starred_expressions(
pyflakes::rules::starred_expressions(
checker,
elts,
check_too_many_expressions,
check_two_starred_expressions,
expr.range(),
) {
checker.report_diagnostic(diagnostic);
}
);
}
}
Expr::Name(ast::ExprName { id, ctx, range }) => {
@@ -527,12 +527,12 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
match pyflakes::format::FormatSummary::try_from(string_value.to_str()) {
Err(e) => {
if checker.enabled(Rule::StringDotFormatInvalidFormat) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::StringDotFormatInvalidFormat {
message: pyflakes::format::error_to_string(&e),
},
location,
));
);
}
}
Ok(summary) => {
@@ -936,9 +936,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
pylint::rules::repeated_keyword_argument(checker, call);
}
if checker.enabled(Rule::PytestPatchWithLambda) {
if let Some(diagnostic) = flake8_pytest_style::rules::patch_with_lambda(call) {
checker.report_diagnostic(diagnostic);
}
flake8_pytest_style::rules::patch_with_lambda(checker, call);
}
if checker.any_enabled(&[
Rule::PytestParametrizeNamesWrongType,
@@ -1041,6 +1039,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
Rule::OsPathGetctime,
Rule::Glob,
Rule::OsListdir,
Rule::OsSymlink,
]) {
flake8_use_pathlib::rules::replaceable_by_pathlib(checker, call);
}
@@ -1285,22 +1284,22 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
..
}) => {
if checker.enabled(Rule::PercentFormatUnsupportedFormatCharacter) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::PercentFormatUnsupportedFormatCharacter {
char: c,
},
location,
));
);
}
}
Err(e) => {
if checker.enabled(Rule::PercentFormatInvalidFormat) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::PercentFormatInvalidFormat {
message: e.to_string(),
},
location,
));
);
}
}
Ok(summary) => {
@@ -1364,13 +1363,7 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
op: Operator::Add, ..
}) => {
if checker.enabled(Rule::ExplicitStringConcatenation) {
if let Some(diagnostic) = flake8_implicit_str_concat::rules::explicit(
expr,
checker.locator,
checker.settings,
) {
checker.report_diagnostic(diagnostic);
}
flake8_implicit_str_concat::rules::explicit(checker, expr);
}
if checker.enabled(Rule::CollectionLiteralConcatenation) {
ruff::rules::collection_literal_concatenation(checker, expr);


@@ -1,4 +1,3 @@
use ruff_diagnostics::Diagnostic;
use ruff_python_ast::helpers;
use ruff_python_ast::types::Node;
use ruff_python_ast::{self as ast, Expr, Stmt};
@@ -39,12 +38,12 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if !checker.semantic.scope_id.is_global() {
for name in names {
if checker.semantic.nonlocal(name).is_none() {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pylint::rules::NonlocalWithoutBinding {
name: name.to_string(),
},
name.range(),
));
);
}
}
}
@@ -55,22 +54,20 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
Stmt::Break(_) => {
if checker.enabled(Rule::BreakOutsideLoop) {
if let Some(diagnostic) = pyflakes::rules::break_outside_loop(
pyflakes::rules::break_outside_loop(
checker,
stmt,
&mut checker.semantic.current_statements().skip(1),
) {
checker.report_diagnostic(diagnostic);
}
);
}
}
Stmt::Continue(_) => {
if checker.enabled(Rule::ContinueOutsideLoop) {
if let Some(diagnostic) = pyflakes::rules::continue_outside_loop(
pyflakes::rules::continue_outside_loop(
checker,
stmt,
&mut checker.semantic.current_statements().skip(1),
) {
checker.report_diagnostic(diagnostic);
}
);
}
}
Stmt::FunctionDef(
@@ -98,9 +95,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
fastapi::rules::fastapi_unused_path_parameter(checker, function_def);
}
if checker.enabled(Rule::AmbiguousFunctionName) {
if let Some(diagnostic) = pycodestyle::rules::ambiguous_function_name(name) {
checker.report_diagnostic(diagnostic);
}
pycodestyle::rules::ambiguous_function_name(checker, name);
}
if checker.enabled(Rule::InvalidBoolReturnType) {
pylint::rules::invalid_bool_return(checker, function_def);
@@ -121,15 +116,14 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pylint::rules::invalid_str_return(checker, function_def);
}
if checker.enabled(Rule::InvalidFunctionName) {
if let Some(diagnostic) = pep8_naming::rules::invalid_function_name(
pep8_naming::rules::invalid_function_name(
checker,
stmt,
name,
decorator_list,
&checker.settings.pep8_naming.ignore_names,
&checker.semantic,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.source_type.is_stub() {
if checker.enabled(Rule::PassStatementStubBody) {
@@ -179,14 +173,13 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
flake8_pyi::rules::pep_484_positional_parameter(checker, function_def);
}
if checker.enabled(Rule::DunderFunctionName) {
if let Some(diagnostic) = pep8_naming::rules::dunder_function_name(
pep8_naming::rules::dunder_function_name(
checker,
checker.semantic.current_scope(),
stmt,
name,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::GlobalStatement) {
pylint::rules::global_statement(checker, name);
@@ -231,14 +224,13 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
);
}
if checker.enabled(Rule::ComplexStructure) {
if let Some(diagnostic) = mccabe::rules::function_is_too_complex(
mccabe::rules::function_is_too_complex(
checker,
stmt,
name,
body,
checker.settings.mccabe.max_complexity,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::HardcodedPasswordDefault) {
flake8_bandit::rules::hardcoded_password_default(checker, parameters);
@@ -258,31 +250,28 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pylint::rules::too_many_positional_arguments(checker, function_def);
}
if checker.enabled(Rule::TooManyReturnStatements) {
if let Some(diagnostic) = pylint::rules::too_many_return_statements(
pylint::rules::too_many_return_statements(
checker,
stmt,
body,
checker.settings.pylint.max_returns,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::TooManyBranches) {
if let Some(diagnostic) = pylint::rules::too_many_branches(
pylint::rules::too_many_branches(
checker,
stmt,
body,
checker.settings.pylint.max_branches,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::TooManyStatements) {
if let Some(diagnostic) = pylint::rules::too_many_statements(
pylint::rules::too_many_statements(
checker,
stmt,
body,
checker.settings.pylint.max_statements,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.any_enabled(&[
Rule::PytestFixtureIncorrectParenthesesStyle,
@@ -439,6 +428,9 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if checker.enabled(Rule::UselessObjectInheritance) {
pyupgrade::rules::useless_object_inheritance(checker, class_def);
}
if checker.enabled(Rule::UselessClassMetaclassType) {
pyupgrade::rules::useless_class_metaclass_type(checker, class_def);
}
if checker.enabled(Rule::ReplaceStrEnum) {
if checker.target_version() >= PythonVersion::PY311 {
pyupgrade::rules::replace_str_enum(checker, class_def);
@@ -448,28 +440,24 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pyupgrade::rules::unnecessary_class_parentheses(checker, class_def);
}
if checker.enabled(Rule::AmbiguousClassName) {
if let Some(diagnostic) = pycodestyle::rules::ambiguous_class_name(name) {
checker.report_diagnostic(diagnostic);
}
pycodestyle::rules::ambiguous_class_name(checker, name);
}
if checker.enabled(Rule::InvalidClassName) {
if let Some(diagnostic) = pep8_naming::rules::invalid_class_name(
pep8_naming::rules::invalid_class_name(
checker,
stmt,
name,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::ErrorSuffixOnExceptionName) {
if let Some(diagnostic) = pep8_naming::rules::error_suffix_on_exception_name(
pep8_naming::rules::error_suffix_on_exception_name(
checker,
stmt,
arguments.as_deref(),
name,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if !checker.source_type.is_stub() {
if checker.any_enabled(&[
@@ -608,11 +596,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
if checker.enabled(Rule::Debugger) {
if let Some(diagnostic) =
flake8_debugger::rules::debugger_import(stmt, None, &alias.name)
{
checker.report_diagnostic(diagnostic);
}
flake8_debugger::rules::debugger_import(checker, stmt, None, &alias.name);
}
if checker.enabled(Rule::BannedApi) {
flake8_tidy_imports::rules::banned_api(
@@ -635,94 +619,74 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pylint::rules::manual_from_import(checker, stmt, alias, names);
}
if checker.enabled(Rule::ImportSelf) {
if let Some(diagnostic) =
pylint::rules::import_self(alias, checker.module.qualified_name())
{
checker.report_diagnostic(diagnostic);
}
pylint::rules::import_self(checker, alias, checker.module.qualified_name());
}
if let Some(asname) = &alias.asname {
let name = alias.name.split('.').next_back().unwrap();
if checker.enabled(Rule::ConstantImportedAsNonConstant) {
if let Some(diagnostic) =
pep8_naming::rules::constant_imported_as_non_constant(
name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
}
}
if checker.enabled(Rule::LowercaseImportedAsNonLowercase) {
if let Some(diagnostic) =
pep8_naming::rules::lowercase_imported_as_non_lowercase(
name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsLowercase) {
if let Some(diagnostic) =
pep8_naming::rules::camelcase_imported_as_lowercase(
name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsConstant) {
if let Some(diagnostic) = pep8_naming::rules::camelcase_imported_as_constant(
pep8_naming::rules::constant_imported_as_non_constant(
checker,
name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::LowercaseImportedAsNonLowercase) {
pep8_naming::rules::lowercase_imported_as_non_lowercase(
checker,
name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
);
}
if checker.enabled(Rule::CamelcaseImportedAsLowercase) {
pep8_naming::rules::camelcase_imported_as_lowercase(
checker,
name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
);
}
if checker.enabled(Rule::CamelcaseImportedAsConstant) {
pep8_naming::rules::camelcase_imported_as_constant(
checker,
name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
);
}
if checker.enabled(Rule::CamelcaseImportedAsAcronym) {
if let Some(diagnostic) = pep8_naming::rules::camelcase_imported_as_acronym(
pep8_naming::rules::camelcase_imported_as_acronym(
name, asname, alias, stmt, checker,
) {
checker.report_diagnostic(diagnostic);
}
);
}
}
if checker.enabled(Rule::BannedImportAlias) {
if let Some(asname) = &alias.asname {
if let Some(diagnostic) =
flake8_import_conventions::rules::banned_import_alias(
stmt,
&alias.name,
asname,
&checker.settings.flake8_import_conventions.banned_aliases,
)
{
checker.report_diagnostic(diagnostic);
}
flake8_import_conventions::rules::banned_import_alias(
checker,
stmt,
&alias.name,
asname,
&checker.settings.flake8_import_conventions.banned_aliases,
);
}
}
if checker.enabled(Rule::PytestIncorrectPytestImport) {
if let Some(diagnostic) = flake8_pytest_style::rules::import(
flake8_pytest_style::rules::import(
checker,
stmt,
&alias.name,
alias.asname.as_deref(),
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::BuiltinImportShadowing) {
flake8_builtins::rules::builtin_import_shadowing(checker, alias);
@@ -834,11 +798,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
if checker.enabled(Rule::PytestIncorrectPytestImport) {
if let Some(diagnostic) =
flake8_pytest_style::rules::import_from(stmt, module, level)
{
checker.report_diagnostic(diagnostic);
}
flake8_pytest_style::rules::import_from(checker, stmt, module, level);
}
if checker.source_type.is_stub() {
if checker.enabled(Rule::FutureAnnotationsInStub) {
@@ -853,119 +813,98 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
} else if &alias.name == "*" {
if checker.enabled(Rule::UndefinedLocalWithNestedImportStarUsage) {
if !matches!(checker.semantic.current_scope().kind, ScopeKind::Module) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::UndefinedLocalWithNestedImportStarUsage {
name: helpers::format_import_from(level, module).to_string(),
},
stmt.range(),
));
);
}
}
if checker.enabled(Rule::UndefinedLocalWithImportStar) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::UndefinedLocalWithImportStar {
name: helpers::format_import_from(level, module).to_string(),
},
stmt.range(),
));
);
}
}
if checker.enabled(Rule::RelativeImports) {
if let Some(diagnostic) = flake8_tidy_imports::rules::banned_relative_import(
flake8_tidy_imports::rules::banned_relative_import(
checker,
stmt,
level,
module,
checker.module.qualified_name(),
checker.settings.flake8_tidy_imports.ban_relative_imports,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::Debugger) {
if let Some(diagnostic) =
flake8_debugger::rules::debugger_import(stmt, module, &alias.name)
{
checker.report_diagnostic(diagnostic);
}
flake8_debugger::rules::debugger_import(checker, stmt, module, &alias.name);
}
if checker.enabled(Rule::BannedImportAlias) {
if let Some(asname) = &alias.asname {
let qualified_name =
helpers::format_import_from_member(level, module, &alias.name);
if let Some(diagnostic) =
flake8_import_conventions::rules::banned_import_alias(
stmt,
&qualified_name,
asname,
&checker.settings.flake8_import_conventions.banned_aliases,
)
{
checker.report_diagnostic(diagnostic);
}
flake8_import_conventions::rules::banned_import_alias(
checker,
stmt,
&qualified_name,
asname,
&checker.settings.flake8_import_conventions.banned_aliases,
);
}
}
if let Some(asname) = &alias.asname {
if checker.enabled(Rule::ConstantImportedAsNonConstant) {
if let Some(diagnostic) =
pep8_naming::rules::constant_imported_as_non_constant(
&alias.name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
}
}
if checker.enabled(Rule::LowercaseImportedAsNonLowercase) {
if let Some(diagnostic) =
pep8_naming::rules::lowercase_imported_as_non_lowercase(
&alias.name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsLowercase) {
if let Some(diagnostic) =
pep8_naming::rules::camelcase_imported_as_lowercase(
&alias.name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
)
{
checker.report_diagnostic(diagnostic);
}
}
if checker.enabled(Rule::CamelcaseImportedAsConstant) {
if let Some(diagnostic) = pep8_naming::rules::camelcase_imported_as_constant(
pep8_naming::rules::constant_imported_as_non_constant(
checker,
&alias.name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::LowercaseImportedAsNonLowercase) {
pep8_naming::rules::lowercase_imported_as_non_lowercase(
checker,
&alias.name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
);
}
if checker.enabled(Rule::CamelcaseImportedAsLowercase) {
pep8_naming::rules::camelcase_imported_as_lowercase(
checker,
&alias.name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
);
}
if checker.enabled(Rule::CamelcaseImportedAsConstant) {
pep8_naming::rules::camelcase_imported_as_constant(
checker,
&alias.name,
asname,
alias,
stmt,
&checker.settings.pep8_naming.ignore_names,
);
}
if checker.enabled(Rule::CamelcaseImportedAsAcronym) {
if let Some(diagnostic) = pep8_naming::rules::camelcase_imported_as_acronym(
pep8_naming::rules::camelcase_imported_as_acronym(
&alias.name,
asname,
alias,
stmt,
checker,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if !checker.source_type.is_stub() {
if checker.enabled(Rule::UselessImportAlias) {
@@ -978,23 +917,21 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
}
if checker.enabled(Rule::ImportSelf) {
if let Some(diagnostic) = pylint::rules::import_from_self(
pylint::rules::import_from_self(
checker,
level,
module,
names,
checker.module.qualified_name(),
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::BannedImportFrom) {
if let Some(diagnostic) = flake8_import_conventions::rules::banned_import_from(
flake8_import_conventions::rules::banned_import_from(
checker,
stmt,
&helpers::format_import_from(level, module),
&checker.settings.flake8_import_conventions.banned_from,
) {
checker.report_diagnostic(diagnostic);
}
);
}
if checker.enabled(Rule::ByteStringUsage) {
flake8_pyi::rules::bytestring_import(checker, import_from);
@@ -1209,7 +1146,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
) => {
if !checker.semantic.in_type_checking_block() {
if checker.enabled(Rule::Assert) {
checker.report_diagnostic(flake8_bandit::rules::assert_used(stmt));
flake8_bandit::rules::assert_used(checker, stmt);
}
}
if checker.enabled(Rule::AssertTuple) {
@@ -1406,11 +1343,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
}
}
if checker.enabled(Rule::DefaultExceptNotLast) {
if let Some(diagnostic) =
pyflakes::rules::default_except_not_last(handlers, checker.locator)
{
checker.report_diagnostic(diagnostic);
}
pyflakes::rules::default_except_not_last(checker, handlers, checker.locator);
}
if checker.any_enabled(&[
Rule::DuplicateHandlerException,
@@ -1505,9 +1438,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
);
}
if checker.enabled(Rule::PandasDfVariableName) {
if let Some(diagnostic) = pandas_vet::rules::assignment_to_df(targets) {
checker.report_diagnostic(diagnostic);
}
pandas_vet::rules::assignment_to_df(checker, targets);
}
if checker
.settings
@@ -1701,11 +1632,7 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pylint::rules::named_expr_without_context(checker, value);
}
if checker.enabled(Rule::AsyncioDanglingTask) {
if let Some(diagnostic) =
ruff::rules::asyncio_dangling_task(value, checker.semantic())
{
checker.report_diagnostic(diagnostic);
}
ruff::rules::asyncio_dangling_task(checker, value, checker.semantic());
}
if checker.enabled(Rule::RepeatedAppend) {
refurb::rules::repeated_append(checker, stmt);


@@ -1,4 +1,3 @@
use ruff_diagnostics::Diagnostic;
use ruff_python_semantic::Exceptions;
use ruff_python_stdlib::builtins::version_builtin_was_added;
@@ -15,12 +14,12 @@ pub(crate) fn unresolved_references(checker: &Checker) {
for reference in checker.semantic.unresolved_references() {
if reference.is_wildcard_import() {
if checker.enabled(Rule::UndefinedLocalWithImportStarUsage) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: reference.name(checker.source()).to_string(),
},
reference.range(),
));
);
}
} else {
if checker.enabled(Rule::UndefinedName) {
@@ -42,13 +41,13 @@ pub(crate) fn unresolved_references(checker: &Checker) {
let symbol_name = reference.name(checker.source());
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
pyflakes::rules::UndefinedName {
name: symbol_name.to_string(),
minor_version_builtin_added: version_builtin_was_added(symbol_name),
},
reference.range(),
));
);
}
}
}


@@ -26,12 +26,9 @@ use std::path::Path;
use itertools::Itertools;
use log::debug;
use ruff_python_parser::semantic_errors::{
SemanticSyntaxChecker, SemanticSyntaxContext, SemanticSyntaxError, SemanticSyntaxErrorKind,
};
use rustc_hash::{FxHashMap, FxHashSet};
use ruff_diagnostics::{Diagnostic, Edit, IsolationLevel};
use ruff_diagnostics::IsolationLevel;
use ruff_notebook::{CellOffsets, NotebookIndex};
use ruff_python_ast::helpers::{collect_import_from_member, is_docstring_stmt, to_module_path};
use ruff_python_ast::identifier::Identifier;
@@ -40,12 +37,15 @@ use ruff_python_ast::str::Quote;
use ruff_python_ast::visitor::{Visitor, walk_except_handler, walk_pattern};
use ruff_python_ast::{
self as ast, AnyParameterRef, ArgOrKeyword, Comprehension, ElifElseClause, ExceptHandler, Expr,
ExprContext, FStringElement, Keyword, MatchCase, ModModule, Parameter, Parameters, Pattern,
PythonVersion, Stmt, Suite, UnaryOp,
ExprContext, InterpolatedStringElement, Keyword, MatchCase, ModModule, Parameter, Parameters,
Pattern, PythonVersion, Stmt, Suite, UnaryOp,
};
use ruff_python_ast::{PySourceType, helpers, str, visitor};
use ruff_python_codegen::{Generator, Stylist};
use ruff_python_index::Indexer;
use ruff_python_parser::semantic_errors::{
SemanticSyntaxChecker, SemanticSyntaxContext, SemanticSyntaxError, SemanticSyntaxErrorKind,
};
use ruff_python_parser::typing::{AnnotationKind, ParsedAnnotation, parse_type_annotation};
use ruff_python_parser::{ParseError, Parsed, Tokens};
use ruff_python_semantic::all::{DunderAllDefinition, DunderAllFlags};
@@ -57,7 +57,7 @@ use ruff_python_semantic::{
};
use ruff_python_stdlib::builtins::{MAGIC_GLOBALS, python_builtins};
use ruff_python_trivia::CommentRanges;
use ruff_source_file::{OneIndexed, SourceRow};
use ruff_source_file::{OneIndexed, SourceFile, SourceRow};
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::checkers::ast::annotation::AnnotationContext;
@@ -66,13 +66,14 @@ use crate::importer::{ImportRequest, Importer, ResolutionError};
use crate::noqa::NoqaMapping;
use crate::package::PackageRoot;
use crate::preview::{is_semantic_errors_enabled, is_undefined_export_in_dunder_init_enabled};
use crate::registry::Rule;
use crate::registry::{AsRule, Rule};
use crate::rules::pyflakes::rules::{
LateFutureImport, ReturnOutsideFunction, YieldOutsideFunction,
};
use crate::rules::pylint::rules::{AwaitOutsideAsync, LoadBeforeGlobalDeclaration};
use crate::rules::{flake8_pyi, flake8_type_checking, pyflakes, pyupgrade};
use crate::settings::{LinterSettings, TargetVersion, flags};
use crate::{Edit, OldDiagnostic, Violation};
use crate::{Locator, docstrings, noqa};
mod analyze;
@@ -223,8 +224,6 @@ pub(crate) struct Checker<'a> {
visit: deferred::Visit<'a>,
/// A set of deferred nodes to be analyzed after the AST traversal (e.g., `for` loops).
analyze: deferred::Analyze,
/// The cumulative set of diagnostics computed across all lint rules.
diagnostics: RefCell<Vec<Diagnostic>>,
/// The list of names already seen by flake8-bugbear diagnostics, to avoid duplicate violations.
flake8_bugbear_seen: RefCell<FxHashSet<TextRange>>,
/// The end offset of the last visited statement.
@@ -238,6 +237,7 @@ pub(crate) struct Checker<'a> {
semantic_checker: SemanticSyntaxChecker,
/// Errors collected by the `semantic_checker`.
semantic_errors: RefCell<Vec<SemanticSyntaxError>>,
context: &'a LintContext<'a>,
}
impl<'a> Checker<'a> {
@@ -258,6 +258,7 @@ impl<'a> Checker<'a> {
cell_offsets: Option<&'a CellOffsets>,
notebook_index: Option<&'a NotebookIndex>,
target_version: TargetVersion,
context: &'a LintContext<'a>,
) -> Checker<'a> {
let semantic = SemanticModel::new(&settings.typing_modules, path, module);
Self {
@@ -278,7 +279,6 @@ impl<'a> Checker<'a> {
semantic,
visit: deferred::Visit::default(),
analyze: deferred::Analyze::default(),
diagnostics: RefCell::default(),
flake8_bugbear_seen: RefCell::default(),
cell_offsets,
notebook_index,
@@ -287,6 +287,7 @@ impl<'a> Checker<'a> {
target_version,
semantic_checker: SemanticSyntaxChecker::new(),
semantic_errors: RefCell::default(),
context,
}
}
}
@@ -337,6 +338,7 @@ impl<'a> Checker<'a> {
ast::BytesLiteralFlags::empty().with_quote_style(self.preferred_quote())
}
// TODO(dylan) add similar method for t-strings
/// Return the default f-string flags a generated `FString` node should use, given where we are
/// in the AST.
pub(crate) fn default_fstring_flags(&self) -> ast::FStringFlags {
@@ -379,10 +381,30 @@ impl<'a> Checker<'a> {
self.indexer.comment_ranges()
}
/// Push a new [`Diagnostic`] to the collection in the [`Checker`]
pub(crate) fn report_diagnostic(&self, diagnostic: Diagnostic) {
let mut diagnostics = self.diagnostics.borrow_mut();
diagnostics.push(diagnostic);
/// Return a [`DiagnosticGuard`] for reporting a diagnostic.
///
/// The guard derefs to a [`Diagnostic`], so it can be used to further modify the diagnostic
/// before it is added to the collection in the checker on `Drop`.
pub(crate) fn report_diagnostic<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
) -> DiagnosticGuard<'chk, 'a> {
self.context.report_diagnostic(kind, range)
}
/// Return a [`DiagnosticGuard`] for reporting a diagnostic if the corresponding rule is
/// enabled.
///
/// Prefer [`Checker::report_diagnostic`] in general because the conversion from a `Diagnostic`
/// to a `Rule` is somewhat expensive.
pub(crate) fn report_diagnostic_if_enabled<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
) -> Option<DiagnosticGuard<'chk, 'a>> {
self.context
.report_diagnostic_if_enabled(kind, range, self.settings)
}
/// Adds a [`TextRange`] to the set of ranges of variable names
@@ -508,9 +530,9 @@ impl<'a> Checker<'a> {
}
/// Push `diagnostic` if the checker is not in a `@no_type_check` context.
pub(crate) fn report_type_diagnostic(&self, diagnostic: Diagnostic) {
pub(crate) fn report_type_diagnostic<T: Violation>(&self, kind: T, range: TextRange) {
if !self.semantic.in_no_type_check() {
self.report_diagnostic(diagnostic);
self.report_diagnostic(kind, range);
}
}
@@ -595,7 +617,7 @@ impl SemanticSyntaxContext for Checker<'_> {
match error.kind {
SemanticSyntaxErrorKind::LateFutureImport => {
if self.settings.rules.enabled(Rule::LateFutureImport) {
self.report_diagnostic(Diagnostic::new(LateFutureImport, error.range));
self.report_diagnostic(LateFutureImport, error.range);
}
}
SemanticSyntaxErrorKind::LoadBeforeGlobalDeclaration { name, start } => {
@@ -604,31 +626,28 @@ impl SemanticSyntaxContext for Checker<'_> {
.rules
.enabled(Rule::LoadBeforeGlobalDeclaration)
{
self.report_diagnostic(Diagnostic::new(
self.report_diagnostic(
LoadBeforeGlobalDeclaration {
name,
row: self.compute_source_row(start),
},
error.range,
));
);
}
}
SemanticSyntaxErrorKind::YieldOutsideFunction(kind) => {
if self.settings.rules.enabled(Rule::YieldOutsideFunction) {
self.report_diagnostic(Diagnostic::new(
YieldOutsideFunction::new(kind),
error.range,
));
self.report_diagnostic(YieldOutsideFunction::new(kind), error.range);
}
}
SemanticSyntaxErrorKind::ReturnOutsideFunction => {
if self.settings.rules.enabled(Rule::ReturnOutsideFunction) {
self.report_diagnostic(Diagnostic::new(ReturnOutsideFunction, error.range));
self.report_diagnostic(ReturnOutsideFunction, error.range);
}
}
SemanticSyntaxErrorKind::AwaitOutsideAsyncFunction(_) => {
if self.settings.rules.enabled(Rule::AwaitOutsideAsync) {
self.report_diagnostic(Diagnostic::new(AwaitOutsideAsync, error.range));
self.report_diagnostic(AwaitOutsideAsync, error.range);
}
}
SemanticSyntaxErrorKind::ReboundComprehensionVariable
@@ -1879,6 +1898,10 @@ impl<'a> Visitor<'a> for Checker<'a> {
self.semantic.flags |= SemanticModelFlags::F_STRING;
visitor::walk_expr(self, expr);
}
Expr::TString(_) => {
self.semantic.flags |= SemanticModelFlags::T_STRING;
visitor::walk_expr(self, expr);
}
Expr::Named(ast::ExprNamed {
target,
value,
@@ -1912,6 +1935,7 @@ impl<'a> Visitor<'a> for Checker<'a> {
}
Expr::BytesLiteral(bytes_literal) => analyze::string_like(bytes_literal.into(), self),
Expr::FString(f_string) => analyze::string_like(f_string.into(), self),
Expr::TString(t_string) => analyze::string_like(t_string.into(), self),
_ => {}
}
@@ -2101,12 +2125,15 @@ impl<'a> Visitor<'a> for Checker<'a> {
}
}
fn visit_f_string_element(&mut self, f_string_element: &'a FStringElement) {
fn visit_interpolated_string_element(
&mut self,
interpolated_string_element: &'a InterpolatedStringElement,
) {
let snapshot = self.semantic.flags;
if f_string_element.is_expression() {
self.semantic.flags |= SemanticModelFlags::F_STRING_REPLACEMENT_FIELD;
if interpolated_string_element.is_interpolation() {
self.semantic.flags |= SemanticModelFlags::INTERPOLATED_STRING_REPLACEMENT_FIELD;
}
visitor::walk_f_string_element(self, f_string_element);
visitor::walk_interpolated_string_element(self, interpolated_string_element);
self.semantic.flags = snapshot;
}
}
@@ -2717,12 +2744,12 @@ impl<'a> Checker<'a> {
self.semantic.restore(snapshot);
if self.enabled(Rule::ForwardAnnotationSyntaxError) {
self.report_type_diagnostic(Diagnostic::new(
self.report_type_diagnostic(
pyflakes::rules::ForwardAnnotationSyntaxError {
parse_error: parse_error.error.to_string(),
},
string_expr.range(),
));
);
}
}
}
@@ -2863,30 +2890,26 @@ impl<'a> Checker<'a> {
} else {
if self.semantic.global_scope().uses_star_imports() {
if self.enabled(Rule::UndefinedLocalWithImportStarUsage) {
self.diagnostics.get_mut().push(
Diagnostic::new(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: name.to_string(),
},
range,
)
.with_parent(definition.start()),
);
self.report_diagnostic(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: name.to_string(),
},
range,
)
.set_parent(definition.start());
}
} else {
if self.enabled(Rule::UndefinedExport) {
if is_undefined_export_in_dunder_init_enabled(self.settings)
|| !self.path.ends_with("__init__.py")
{
self.diagnostics.get_mut().push(
Diagnostic::new(
pyflakes::rules::UndefinedExport {
name: name.to_string(),
},
range,
)
.with_parent(definition.start()),
);
self.report_diagnostic(
pyflakes::rules::UndefinedExport {
name: name.to_string(),
},
range,
)
.set_parent(definition.start());
}
}
}
@@ -2947,7 +2970,8 @@ pub(crate) fn check_ast(
cell_offsets: Option<&CellOffsets>,
notebook_index: Option<&NotebookIndex>,
target_version: TargetVersion,
) -> (Vec<Diagnostic>, Vec<SemanticSyntaxError>) {
context: &LintContext,
) -> Vec<SemanticSyntaxError> {
let module_path = package
.map(PackageRoot::path)
.and_then(|package| to_module_path(package, path));
@@ -2987,6 +3011,7 @@ pub(crate) fn check_ast(
cell_offsets,
notebook_index,
target_version,
context,
);
checker.bind_builtins();
@@ -3013,10 +3038,137 @@ pub(crate) fn check_ast(
analyze::deferred_scopes(&checker);
let Checker {
diagnostics,
semantic_errors,
..
semantic_errors, ..
} = checker;
(diagnostics.into_inner(), semantic_errors.into_inner())
semantic_errors.into_inner()
}
/// A type for collecting diagnostics in a given file.
///
/// [`LintContext::report_diagnostic`] can be used to obtain a [`DiagnosticGuard`], which will push
/// a [`Violation`] to the contained [`OldDiagnostic`] collection on `Drop`.
pub(crate) struct LintContext<'a> {
diagnostics: RefCell<Vec<OldDiagnostic>>,
source_file: &'a SourceFile,
}
impl<'a> LintContext<'a> {
/// Create a new collector with the given `source_file` and an empty collection of
/// `OldDiagnostic`s.
pub(crate) fn new(source_file: &'a SourceFile) -> Self {
Self {
diagnostics: RefCell::default(),
source_file,
}
}
/// Return a [`DiagnosticGuard`] for reporting a diagnostic.
///
/// The guard derefs to an [`OldDiagnostic`], so it can be used to further modify the diagnostic
/// before it is added to the collection in the collector on `Drop`.
pub(crate) fn report_diagnostic<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
) -> DiagnosticGuard<'chk, 'a> {
DiagnosticGuard {
context: self,
diagnostic: Some(OldDiagnostic::new(kind, range, self.source_file)),
}
}
/// Return a [`DiagnosticGuard`] for reporting a diagnostic if the corresponding rule is
/// enabled.
///
/// Prefer [`LintContext::report_diagnostic`] in general because the conversion from an
/// `OldDiagnostic` to a `Rule` is somewhat expensive.
pub(crate) fn report_diagnostic_if_enabled<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
settings: &LinterSettings,
) -> Option<DiagnosticGuard<'chk, 'a>> {
let diagnostic = OldDiagnostic::new(kind, range, self.source_file);
if settings.rules.enabled(diagnostic.rule()) {
Some(DiagnosticGuard {
context: self,
diagnostic: Some(diagnostic),
})
} else {
None
}
}
pub(crate) fn into_diagnostics(self) -> Vec<OldDiagnostic> {
self.diagnostics.into_inner()
}
pub(crate) fn is_empty(&self) -> bool {
self.diagnostics.borrow().is_empty()
}
pub(crate) fn as_mut_vec(&mut self) -> &mut Vec<OldDiagnostic> {
self.diagnostics.get_mut()
}
pub(crate) fn iter(&mut self) -> impl Iterator<Item = &OldDiagnostic> {
self.diagnostics.get_mut().iter()
}
}
/// An abstraction for mutating a diagnostic.
///
/// Callers can build this guard by starting with `Checker::report_diagnostic`.
///
/// The primary function of this guard is to add the underlying diagnostic to the `Checker`'s list
/// of diagnostics on `Drop`, while dereferencing to the underlying diagnostic for mutations like
/// adding fixes or parent ranges.
pub(crate) struct DiagnosticGuard<'a, 'b> {
/// The parent checker that will receive the diagnostic on `Drop`.
context: &'a LintContext<'b>,
/// The diagnostic that we want to report.
///
/// This is always `Some` until the `Drop` (or `defuse`) call.
diagnostic: Option<OldDiagnostic>,
}
impl DiagnosticGuard<'_, '_> {
/// Consume the underlying `Diagnostic` without emitting it.
///
/// In general you should avoid constructing diagnostics that may not be emitted, but this
/// method can be used where this is unavoidable.
pub(crate) fn defuse(mut self) {
self.diagnostic = None;
}
}
impl std::ops::Deref for DiagnosticGuard<'_, '_> {
type Target = OldDiagnostic;
fn deref(&self) -> &OldDiagnostic {
// OK because `self.diagnostic` is only `None` within `Drop`.
self.diagnostic.as_ref().unwrap()
}
}
/// Return a mutable borrow of the diagnostic in this guard.
impl std::ops::DerefMut for DiagnosticGuard<'_, '_> {
fn deref_mut(&mut self) -> &mut OldDiagnostic {
// OK because `self.diagnostic` is only `None` within `Drop`.
self.diagnostic.as_mut().unwrap()
}
}
impl Drop for DiagnosticGuard<'_, '_> {
fn drop(&mut self) {
if std::thread::panicking() {
// Don't submit diagnostics when panicking because they might be incomplete.
return;
}
if let Some(diagnostic) = self.diagnostic.take() {
self.context.diagnostics.borrow_mut().push(diagnostic);
}
}
}
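
Taken together, these hunks replace the old two-step flow (build a `Diagnostic`, then `checker.report_diagnostic(diagnostic)`) with an RAII guard: `report_diagnostic` now returns a `DiagnosticGuard` that derefs to the diagnostic for mutation and commits it to the `LintContext` collection when it drops. A minimal, self-contained sketch of that guard-on-Drop pattern, using simplified stand-in types rather than the actual ruff API:

use std::cell::RefCell;
use std::ops::{Deref, DerefMut};

// Stand-ins for `OldDiagnostic` and `LintContext` from the diff above.
#[derive(Debug)]
struct Diagnostic {
    message: String,
    fix: Option<String>,
}

struct Context {
    diagnostics: RefCell<Vec<Diagnostic>>,
}

struct Guard<'a> {
    context: &'a Context,
    // Always `Some` until `Drop` (or an explicit `defuse`) takes it.
    diagnostic: Option<Diagnostic>,
}

impl Context {
    fn report(&self, message: &str) -> Guard<'_> {
        Guard {
            context: self,
            diagnostic: Some(Diagnostic {
                message: message.to_string(),
                fix: None,
            }),
        }
    }
}

impl Deref for Guard<'_> {
    type Target = Diagnostic;
    fn deref(&self) -> &Diagnostic {
        self.diagnostic.as_ref().unwrap()
    }
}

impl DerefMut for Guard<'_> {
    fn deref_mut(&mut self) -> &mut Diagnostic {
        self.diagnostic.as_mut().unwrap()
    }
}

impl Drop for Guard<'_> {
    fn drop(&mut self) {
        // The diagnostic is only committed to the collection here.
        if let Some(diagnostic) = self.diagnostic.take() {
            self.context.diagnostics.borrow_mut().push(diagnostic);
        }
    }
}

fn main() {
    let context = Context {
        diagnostics: RefCell::default(),
    };
    {
        // Mutate through `DerefMut` while the guard is alive...
        let mut diagnostic = context.report("unused variable");
        diagnostic.fix = Some("remove the binding".to_string());
    } // ...and the diagnostic lands in `context.diagnostics` here.
    assert_eq!(context.diagnostics.borrow().len(), 1);
}

This is why the call sites in the earlier hunks shrink to `checker.report_diagnostic(kind, range)` plus an optional `.set_fix(...)` with no trailing `push`; the real guard additionally offers `defuse()` for the rare case where a constructed diagnostic should not be emitted.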


@@ -1,10 +1,10 @@
use std::path::Path;
use ruff_diagnostics::Diagnostic;
use ruff_python_ast::PythonVersion;
use ruff_python_trivia::CommentRanges;
use crate::Locator;
use crate::checkers::ast::LintContext;
use crate::package::PackageRoot;
use crate::preview::is_allow_nested_roots_enabled;
use crate::registry::Rule;
@@ -20,13 +20,12 @@ pub(crate) fn check_file_path(
comment_ranges: &CommentRanges,
settings: &LinterSettings,
target_version: PythonVersion,
) -> Vec<Diagnostic> {
let mut diagnostics: Vec<Diagnostic> = vec![];
context: &LintContext,
) {
// flake8-no-pep420
if settings.rules.enabled(Rule::ImplicitNamespacePackage) {
let allow_nested_roots = is_allow_nested_roots_enabled(settings);
if let Some(diagnostic) = implicit_namespace_package(
implicit_namespace_package(
path,
package,
locator,
@@ -34,26 +33,17 @@ pub(crate) fn check_file_path(
&settings.project_root,
&settings.src,
allow_nested_roots,
) {
diagnostics.push(diagnostic);
}
context,
);
}
// pep8-naming
if settings.rules.enabled(Rule::InvalidModuleName) {
if let Some(diagnostic) =
invalid_module_name(path, package, &settings.pep8_naming.ignore_names)
{
diagnostics.push(diagnostic);
}
invalid_module_name(path, package, &settings.pep8_naming.ignore_names, context);
}
// flake8-builtins
if settings.rules.enabled(Rule::StdlibModuleShadowing) {
if let Some(diagnostic) = stdlib_module_shadowing(path, settings, target_version) {
diagnostics.push(diagnostic);
}
stdlib_module_shadowing(path, settings, target_version, context);
}
diagnostics
}


@@ -1,6 +1,5 @@
//! Lint rules based on import analysis.
use ruff_diagnostics::Diagnostic;
use ruff_notebook::CellOffsets;
use ruff_python_ast::statement_visitor::StatementVisitor;
use ruff_python_ast::{ModModule, PySourceType, PythonVersion};
@@ -16,6 +15,8 @@ use crate::rules::isort;
use crate::rules::isort::block::{Block, BlockBuilder};
use crate::settings::LinterSettings;
use super::ast::LintContext;
#[expect(clippy::too_many_arguments)]
pub(crate) fn check_imports(
parsed: &Parsed<ModModule>,
@@ -28,7 +29,8 @@ pub(crate) fn check_imports(
source_type: PySourceType,
cell_offsets: Option<&CellOffsets>,
target_version: PythonVersion,
) -> Vec<Diagnostic> {
context: &LintContext,
) {
// Extract all import blocks from the AST.
let tracker = {
let mut tracker =
@@ -40,11 +42,10 @@ pub(crate) fn check_imports(
let blocks: Vec<&Block> = tracker.iter().collect();
// Enforce import rules.
let mut diagnostics = vec![];
if settings.rules.enabled(Rule::UnsortedImports) {
for block in &blocks {
if !block.imports.is_empty() {
if let Some(diagnostic) = isort::rules::organize_imports(
isort::rules::organize_imports(
block,
locator,
stylist,
@@ -54,21 +55,19 @@ pub(crate) fn check_imports(
source_type,
parsed.tokens(),
target_version,
) {
diagnostics.push(diagnostic);
}
context,
);
}
}
}
if settings.rules.enabled(Rule::MissingRequiredImport) {
diagnostics.extend(isort::rules::add_required_imports(
isort::rules::add_required_imports(
parsed,
locator,
stylist,
settings,
source_type,
));
context,
);
}
diagnostics
}


@@ -1,13 +1,11 @@
use ruff_diagnostics::Diagnostic;
use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::{TokenKind, Tokens};
use ruff_source_file::LineRanges;
use ruff_text_size::{Ranged, TextRange};
use crate::Locator;
use crate::line_width::IndentWidth;
use crate::registry::{AsRule, Rule};
use crate::registry::Rule;
use crate::rules::pycodestyle::rules::logical_lines::{
LogicalLines, TokenFlags, extraneous_whitespace, indentation, missing_whitespace,
missing_whitespace_after_keyword, missing_whitespace_around_operator, redundant_backslash,
@@ -16,6 +14,9 @@ use crate::rules::pycodestyle::rules::logical_lines::{
whitespace_before_parameters,
};
use crate::settings::LinterSettings;
use crate::{Locator, Violation};
use super::ast::{DiagnosticGuard, LintContext};
/// Return the amount of indentation, expanding tabs to the next multiple of the settings' tab size.
pub(crate) fn expand_indent(line: &str, indent_width: IndentWidth) -> usize {
@@ -40,8 +41,9 @@ pub(crate) fn check_logical_lines(
indexer: &Indexer,
stylist: &Stylist,
settings: &LinterSettings,
) -> Vec<Diagnostic> {
let mut context = LogicalLinesContext::new(settings);
lint_context: &LintContext,
) {
let mut context = LogicalLinesContext::new(settings, lint_context);
let mut prev_line = None;
let mut prev_indent_level = None;
@@ -170,7 +172,7 @@ pub(crate) fn check_logical_lines(
let indent_size = 4;
if enforce_indentation {
for diagnostic in indentation(
indentation(
&line,
prev_line.as_ref(),
indent_char,
@@ -178,11 +180,9 @@ pub(crate) fn check_logical_lines(
prev_indent_level,
indent_size,
range,
) {
if settings.rules.enabled(diagnostic.rule()) {
context.push_diagnostic(diagnostic);
}
}
lint_context,
settings,
);
}
if !line.is_comment_only() {
@@ -190,26 +190,24 @@ pub(crate) fn check_logical_lines(
prev_indent_level = Some(indent_level);
}
}
context.diagnostics
}
#[derive(Debug, Clone)]
pub(crate) struct LogicalLinesContext<'a> {
pub(crate) struct LogicalLinesContext<'a, 'b> {
settings: &'a LinterSettings,
diagnostics: Vec<Diagnostic>,
context: &'a LintContext<'b>,
}
impl<'a> LogicalLinesContext<'a> {
fn new(settings: &'a LinterSettings) -> Self {
Self {
settings,
diagnostics: Vec::new(),
}
impl<'a, 'b> LogicalLinesContext<'a, 'b> {
fn new(settings: &'a LinterSettings, context: &'a LintContext<'b>) -> Self {
Self { settings, context }
}
pub(crate) fn push_diagnostic(&mut self, diagnostic: Diagnostic) {
if self.settings.rules.enabled(diagnostic.rule()) {
self.diagnostics.push(diagnostic);
}
pub(crate) fn report_diagnostic<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
) -> Option<DiagnosticGuard<'chk, 'a>> {
self.context
.report_diagnostic_if_enabled(kind, range, self.settings)
}
}
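
For the logical-lines pass, rule gating moves into the context as well: `report_diagnostic` returns `Option<DiagnosticGuard>`, yielding `None` when the corresponding rule is disabled (the violation is still constructed in order to determine its rule, which is why the ungated `report_diagnostic` is preferred where possible). A hedged sketch of a call site; `ctx`, `kind`, `range`, and `fix` are placeholders, not the crate's actual bindings:

// Sketch only: `ctx` is a `LogicalLinesContext`, `kind` a `Violation`,
// `range` a `TextRange`, and `fix` a `Fix`.
if let Some(mut diagnostic) = ctx.report_diagnostic(kind, range) {
    diagnostic.set_fix(fix); // mutate through `DerefMut` while the guard lives
} // the guard drops here and the diagnostic joins the shared `LintContext`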


@@ -5,11 +5,9 @@ use std::path::Path;
use itertools::Itertools;
use rustc_hash::FxHashSet;
use ruff_diagnostics::{Diagnostic, Edit, Fix};
use ruff_python_trivia::CommentRanges;
use ruff_text_size::Ranged;
use crate::Locator;
use crate::fix::edits::delete_comment;
use crate::noqa::{
Code, Directive, FileExemption, FileNoqaDirectives, NoqaDirectives, NoqaMapping,
@@ -21,10 +19,13 @@ use crate::rules::pygrep_hooks;
use crate::rules::ruff;
use crate::rules::ruff::rules::{UnusedCodes, UnusedNOQA};
use crate::settings::LinterSettings;
use crate::{Edit, Fix, Locator};
use super::ast::LintContext;
#[expect(clippy::too_many_arguments)]
pub(crate) fn check_noqa(
diagnostics: &mut Vec<Diagnostic>,
context: &mut LintContext,
path: &Path,
locator: &Locator,
comment_ranges: &CommentRanges,
@@ -46,7 +47,7 @@ pub(crate) fn check_noqa(
let mut ignored_diagnostics = vec![];
// Remove any ignored diagnostics.
'outer: for (index, diagnostic) in diagnostics.iter().enumerate() {
'outer: for (index, diagnostic) in context.iter().enumerate() {
let rule = diagnostic.rule();
if matches!(rule, Rule::BlanketNOQA) {
@@ -135,11 +136,9 @@ pub(crate) fn check_noqa(
Directive::All(directive) => {
if matches.is_empty() {
let edit = delete_comment(directive.range(), locator);
let mut diagnostic =
Diagnostic::new(UnusedNOQA { codes: None }, directive.range());
let mut diagnostic = context
.report_diagnostic(UnusedNOQA { codes: None }, directive.range());
diagnostic.set_fix(Fix::safe_edit(edit));
diagnostics.push(diagnostic);
}
}
Directive::Codes(directive) => {
@@ -159,9 +158,7 @@ pub(crate) fn check_noqa(
if seen_codes.insert(original_code) {
let is_code_used = if is_file_level {
diagnostics
.iter()
.any(|diag| diag.rule().noqa_code() == code)
context.iter().any(|diag| diag.rule().noqa_code() == code)
} else {
matches.iter().any(|match_| *match_ == code)
} || settings
@@ -212,7 +209,7 @@ pub(crate) fn check_noqa(
directive.range(),
)
};
let mut diagnostic = Diagnostic::new(
let mut diagnostic = context.report_diagnostic(
UnusedNOQA {
codes: Some(UnusedCodes {
disabled: disabled_codes
@@ -236,7 +233,6 @@ pub(crate) fn check_noqa(
directive.range(),
);
diagnostic.set_fix(Fix::safe_edit(edit));
diagnostics.push(diagnostic);
}
}
}
@@ -247,8 +243,8 @@ pub(crate) fn check_noqa(
&& !per_file_ignores.contains(Rule::RedirectedNOQA)
&& !exemption.includes(Rule::RedirectedNOQA)
{
ruff::rules::redirected_noqa(diagnostics, &noqa_directives);
ruff::rules::redirected_file_noqa(diagnostics, &file_noqa_directives);
ruff::rules::redirected_noqa(context, &noqa_directives);
ruff::rules::redirected_file_noqa(context, &file_noqa_directives);
}
if settings.rules.enabled(Rule::BlanketNOQA)
@@ -256,7 +252,7 @@ pub(crate) fn check_noqa(
&& !exemption.enumerates(Rule::BlanketNOQA)
{
pygrep_hooks::rules::blanket_noqa(
diagnostics,
context,
&noqa_directives,
locator,
&file_noqa_directives,
@@ -267,7 +263,7 @@ pub(crate) fn check_noqa(
&& !per_file_ignores.contains(Rule::InvalidRuleCode)
&& !exemption.enumerates(Rule::InvalidRuleCode)
{
ruff::rules::invalid_noqa_code(diagnostics, &noqa_directives, locator, &settings.external);
ruff::rules::invalid_noqa_code(context, &noqa_directives, locator, &settings.external);
}
ignored_diagnostics.sort_unstable();

View File

@@ -1,6 +1,5 @@
//! Lint rules based on checking physical lines.
use ruff_diagnostics::Diagnostic;
use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_source_file::UniversalNewlines;
@@ -17,15 +16,16 @@ use crate::rules::pylint;
use crate::rules::ruff::rules::indented_form_feed;
use crate::settings::LinterSettings;
use super::ast::LintContext;
pub(crate) fn check_physical_lines(
locator: &Locator,
stylist: &Stylist,
indexer: &Indexer,
doc_lines: &[TextSize],
settings: &LinterSettings,
) -> Vec<Diagnostic> {
let mut diagnostics: Vec<Diagnostic> = vec![];
context: &LintContext,
) {
let enforce_doc_line_too_long = settings.rules.enabled(Rule::DocLineTooLong);
let enforce_line_too_long = settings.rules.enabled(Rule::LineTooLong);
let enforce_no_newline_at_end_of_file = settings.rules.enabled(Rule::MissingNewlineAtEndOfFile);
@@ -45,54 +45,38 @@ pub(crate) fn check_physical_lines(
.is_some()
{
if enforce_doc_line_too_long {
if let Some(diagnostic) = doc_line_too_long(&line, comment_ranges, settings) {
diagnostics.push(diagnostic);
}
doc_line_too_long(&line, comment_ranges, settings, context);
}
}
if enforce_mixed_spaces_and_tabs {
if let Some(diagnostic) = mixed_spaces_and_tabs(&line) {
diagnostics.push(diagnostic);
}
mixed_spaces_and_tabs(&line, context);
}
if enforce_line_too_long {
if let Some(diagnostic) = line_too_long(&line, comment_ranges, settings) {
diagnostics.push(diagnostic);
}
line_too_long(&line, comment_ranges, settings, context);
}
if enforce_bidirectional_unicode {
diagnostics.extend(pylint::rules::bidirectional_unicode(&line));
pylint::rules::bidirectional_unicode(&line, context);
}
if enforce_trailing_whitespace || enforce_blank_line_contains_whitespace {
if let Some(diagnostic) = trailing_whitespace(&line, locator, indexer, settings) {
diagnostics.push(diagnostic);
}
trailing_whitespace(&line, locator, indexer, settings, context);
}
if settings.rules.enabled(Rule::IndentedFormFeed) {
if let Some(diagnostic) = indented_form_feed(&line) {
diagnostics.push(diagnostic);
}
indented_form_feed(&line, context);
}
}
if enforce_no_newline_at_end_of_file {
if let Some(diagnostic) = no_newline_at_end_of_file(locator, stylist) {
diagnostics.push(diagnostic);
}
no_newline_at_end_of_file(locator, stylist, context);
}
if enforce_copyright_notice {
if let Some(diagnostic) = missing_copyright_notice(locator, settings) {
diagnostics.push(diagnostic);
}
missing_copyright_notice(locator, settings, context);
}
diagnostics
}
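
The same collect-to-report migration applies to each physical-line rule: instead of returning `Option<Diagnostic>` for the caller to push, a rule now takes the context and reports into it. A self-contained toy model of the pattern (all names illustrative; the real `LintContext` is passed by shared reference, which the toy simplifies to `&mut`):

```rust
// Toy stand-ins for LintContext and a physical-line rule.
struct ToyContext {
    enabled: bool,
    reports: Vec<String>,
}

impl ToyContext {
    // Mirrors `report_diagnostic_if_enabled`: a no-op when the rule is off.
    fn report(&mut self, body: &str) {
        if self.enabled {
            self.reports.push(body.to_string());
        }
    }
}

// After the refactor, the rule reports directly instead of returning a value.
fn trailing_whitespace(line: &str, context: &mut ToyContext) {
    if line.trim_end() != line {
        context.report("trailing whitespace");
    }
}

fn main() {
    let mut context = ToyContext { enabled: true, reports: Vec::new() };
    trailing_whitespace("x = 1  ", &mut context);
    assert_eq!(context.reports.len(), 1);
}
```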
#[cfg(test)]
@@ -100,8 +84,10 @@ mod tests {
use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::parse_module;
use ruff_source_file::SourceFileBuilder;
use crate::Locator;
use crate::checkers::ast::LintContext;
use crate::line_width::LineLength;
use crate::registry::Rule;
use crate::rules::pycodestyle;
@@ -118,6 +104,8 @@ mod tests {
let stylist = Stylist::from_tokens(parsed.tokens(), locator.contents());
let check_with_max_line_length = |line_length: LineLength| {
let source_file = SourceFileBuilder::new("<filename>", line).finish();
let diagnostics = LintContext::new(&source_file);
check_physical_lines(
&locator,
&stylist,
@@ -130,7 +118,9 @@ mod tests {
},
..LinterSettings::for_rule(Rule::LineTooLong)
},
)
&diagnostics,
);
diagnostics.into_diagnostics()
};
let line_length = LineLength::try_from(8).unwrap();
assert_eq!(check_with_max_line_length(line_length), vec![]);

View File

@@ -2,7 +2,6 @@
use std::path::Path;
use ruff_diagnostics::Diagnostic;
use ruff_notebook::CellOffsets;
use ruff_python_ast::PySourceType;
use ruff_python_codegen::Stylist;
@@ -19,6 +18,8 @@ use crate::rules::{
};
use crate::settings::LinterSettings;
use super::ast::LintContext;
#[expect(clippy::too_many_arguments)]
pub(crate) fn check_tokens(
tokens: &Tokens,
@@ -29,8 +30,8 @@ pub(crate) fn check_tokens(
settings: &LinterSettings,
source_type: PySourceType,
cell_offsets: Option<&CellOffsets>,
) -> Vec<Diagnostic> {
let mut diagnostics: Vec<Diagnostic> = vec![];
context: &mut LintContext,
) {
let comment_ranges = indexer.comment_ranges();
if settings.rules.any_enabled(&[
@@ -41,16 +42,23 @@ pub(crate) fn check_tokens(
Rule::BlankLinesAfterFunctionOrClass,
Rule::BlankLinesBeforeNestedDefinition,
]) {
BlankLinesChecker::new(locator, stylist, settings, source_type, cell_offsets)
.check_lines(tokens, &mut diagnostics);
BlankLinesChecker::new(
locator,
stylist,
settings,
source_type,
cell_offsets,
context,
)
.check_lines(tokens);
}
if settings.rules.enabled(Rule::BlanketTypeIgnore) {
pygrep_hooks::rules::blanket_type_ignore(&mut diagnostics, comment_ranges, locator);
pygrep_hooks::rules::blanket_type_ignore(context, comment_ranges, locator);
}
if settings.rules.enabled(Rule::EmptyComment) {
pylint::rules::empty_comments(&mut diagnostics, comment_ranges, locator);
pylint::rules::empty_comments(context, comment_ranges, locator);
}
if settings
@@ -58,25 +66,20 @@ pub(crate) fn check_tokens(
.enabled(Rule::AmbiguousUnicodeCharacterComment)
{
for range in comment_ranges {
ruff::rules::ambiguous_unicode_character_comment(
&mut diagnostics,
locator,
range,
settings,
);
ruff::rules::ambiguous_unicode_character_comment(context, locator, range, settings);
}
}
if settings.rules.enabled(Rule::CommentedOutCode) {
eradicate::rules::commented_out_code(&mut diagnostics, locator, comment_ranges, settings);
eradicate::rules::commented_out_code(context, locator, comment_ranges, settings);
}
if settings.rules.enabled(Rule::UTF8EncodingDeclaration) {
pyupgrade::rules::unnecessary_coding_comment(&mut diagnostics, locator, comment_ranges);
pyupgrade::rules::unnecessary_coding_comment(context, locator, comment_ranges);
}
if settings.rules.enabled(Rule::TabIndentation) {
pycodestyle::rules::tab_indentation(&mut diagnostics, locator, indexer);
pycodestyle::rules::tab_indentation(context, locator, indexer);
}
if settings.rules.any_enabled(&[
@@ -87,7 +90,7 @@ pub(crate) fn check_tokens(
Rule::InvalidCharacterZeroWidthSpace,
]) {
for token in tokens {
pylint::rules::invalid_string_characters(&mut diagnostics, token, locator);
pylint::rules::invalid_string_characters(context, token, locator);
}
}
@@ -97,7 +100,7 @@ pub(crate) fn check_tokens(
Rule::UselessSemicolon,
]) {
pycodestyle::rules::compound_statements(
&mut diagnostics,
context,
tokens,
locator,
indexer,
@@ -110,13 +113,7 @@ pub(crate) fn check_tokens(
Rule::SingleLineImplicitStringConcatenation,
Rule::MultiLineImplicitStringConcatenation,
]) {
flake8_implicit_str_concat::rules::implicit(
&mut diagnostics,
tokens,
locator,
indexer,
settings,
);
flake8_implicit_str_concat::rules::implicit(context, tokens, locator, indexer, settings);
}
if settings.rules.any_enabled(&[
@@ -124,15 +121,15 @@ pub(crate) fn check_tokens(
Rule::TrailingCommaOnBareTuple,
Rule::ProhibitedTrailingComma,
]) {
flake8_commas::rules::trailing_commas(&mut diagnostics, tokens, locator, indexer);
flake8_commas::rules::trailing_commas(context, tokens, locator, indexer);
}
if settings.rules.enabled(Rule::ExtraneousParentheses) {
pyupgrade::rules::extraneous_parentheses(&mut diagnostics, tokens, locator);
pyupgrade::rules::extraneous_parentheses(context, tokens, locator);
}
if source_type.is_stub() && settings.rules.enabled(Rule::TypeCommentInStub) {
flake8_pyi::rules::type_comment_in_stub(&mut diagnostics, locator, comment_ranges);
flake8_pyi::rules::type_comment_in_stub(context, locator, comment_ranges);
}
if settings.rules.any_enabled(&[
@@ -142,13 +139,7 @@ pub(crate) fn check_tokens(
Rule::ShebangNotFirstLine,
Rule::ShebangMissingPython,
]) {
flake8_executable::rules::from_tokens(
&mut diagnostics,
path,
locator,
comment_ranges,
settings,
);
flake8_executable::rules::from_tokens(context, path, locator, comment_ranges, settings);
}
if settings.rules.any_enabled(&[
@@ -172,19 +163,15 @@ pub(crate) fn check_tokens(
TodoComment::from_comment(comment, *comment_range, i)
})
.collect();
flake8_todos::rules::todos(&mut diagnostics, &todo_comments, locator, comment_ranges);
flake8_fixme::rules::todos(&mut diagnostics, &todo_comments);
flake8_todos::rules::todos(context, &todo_comments, locator, comment_ranges);
flake8_fixme::rules::todos(context, &todo_comments);
}
if settings.rules.enabled(Rule::TooManyNewlinesAtEndOfFile) {
pycodestyle::rules::too_many_newlines_at_end_of_file(
&mut diagnostics,
tokens,
cell_offsets,
);
pycodestyle::rules::too_many_newlines_at_end_of_file(context, tokens, cell_offsets);
}
diagnostics.retain(|diagnostic| settings.rules.enabled(diagnostic.rule()));
diagnostics
context
.as_mut_vec()
.retain(|diagnostic| settings.rules.enabled(diagnostic.rule()));
}

View File

@@ -6,7 +6,7 @@ use std::fmt::Formatter;
use strum_macros::{AsRefStr, EnumIter};
use crate::registry::{AsRule, Linter};
use crate::registry::Linter;
use crate::rule_selector::is_single_rule_selector;
use crate::rules;
@@ -198,6 +198,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Pylint, "C0132") => (RuleGroup::Stable, rules::pylint::rules::TypeParamNameMismatch),
(Pylint, "C0205") => (RuleGroup::Stable, rules::pylint::rules::SingleStringSlots),
(Pylint, "C0206") => (RuleGroup::Stable, rules::pylint::rules::DictIndexMissingItems),
(Pylint, "C0207") => (RuleGroup::Preview, rules::pylint::rules::MissingMaxsplitArg),
(Pylint, "C0208") => (RuleGroup::Stable, rules::pylint::rules::IterationOverSet),
(Pylint, "C0414") => (RuleGroup::Stable, rules::pylint::rules::UselessImportAlias),
(Pylint, "C0415") => (RuleGroup::Preview, rules::pylint::rules::ImportOutsideTopLevel),
@@ -552,6 +553,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Pyupgrade, "046") => (RuleGroup::Preview, rules::pyupgrade::rules::NonPEP695GenericClass),
(Pyupgrade, "047") => (RuleGroup::Preview, rules::pyupgrade::rules::NonPEP695GenericFunction),
(Pyupgrade, "049") => (RuleGroup::Preview, rules::pyupgrade::rules::PrivateTypeParameter),
(Pyupgrade, "050") => (RuleGroup::Preview, rules::pyupgrade::rules::UselessClassMetaclassType),
// pydocstyle
(Pydocstyle, "100") => (RuleGroup::Stable, rules::pydocstyle::rules::UndocumentedPublicModule),
@@ -933,6 +935,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Flake8UsePathlib, "207") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::Glob),
(Flake8UsePathlib, "208") => (RuleGroup::Stable, rules::flake8_use_pathlib::violations::OsListdir),
(Flake8UsePathlib, "210") => (RuleGroup::Stable, rules::flake8_use_pathlib::rules::InvalidPathlibWithSuffix),
(Flake8UsePathlib, "211") => (RuleGroup::Preview, rules::flake8_use_pathlib::violations::OsSymlink),
// flake8-logging-format
(Flake8LoggingFormat, "001") => (RuleGroup::Stable, rules::flake8_logging_format::violations::LoggingStringFormat),
@@ -1022,7 +1025,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Ruff, "056") => (RuleGroup::Preview, rules::ruff::rules::FalsyDictGetFallback),
(Ruff, "057") => (RuleGroup::Preview, rules::ruff::rules::UnnecessaryRound),
(Ruff, "058") => (RuleGroup::Preview, rules::ruff::rules::StarmapZip),
(Ruff, "059") => (RuleGroup::Preview, rules::ruff::rules::UnusedUnpackedVariable),
(Ruff, "059") => (RuleGroup::Stable, rules::ruff::rules::UnusedUnpackedVariable),
(Ruff, "060") => (RuleGroup::Preview, rules::ruff::rules::InEmptyCollection),
(Ruff, "100") => (RuleGroup::Stable, rules::ruff::rules::UnusedNOQA),
(Ruff, "101") => (RuleGroup::Stable, rules::ruff::rules::RedirectedNOQA),
@@ -1154,3 +1157,9 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
_ => return None,
})
}
impl std::fmt::Display for Rule {
fn fmt(&self, f: &mut Formatter) -> std::fmt::Result {
f.write_str(self.into())
}
}
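
The new `Display` impl means a `Rule` can be interpolated directly wherever its lint name is needed, which is what the `debug!` calls in `try_set_fix` below rely on. A small sketch (the kebab-case name comes from the strum-derived conversion, per the `Message` docs later in this diff):

```rust
// `self.into()` goes through the strum-derived &Rule -> &'static str
// conversion, so formatting a Rule yields its lint name.
fn fix_failure_message(rule: Rule) -> String {
    format!("Failed to create fix for {rule}")
}
```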

View File

@@ -1,14 +1,15 @@
use anyhow::Result;
use log::debug;
use ruff_source_file::SourceFile;
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::{Fix, Violation};
use crate::registry::AsRule;
use crate::violation::Violation;
use crate::{Fix, codes::Rule};
#[derive(Debug, PartialEq, Eq, Clone)]
pub struct Diagnostic {
/// The identifier of the diagnostic, used to align the diagnostic with a rule.
pub name: &'static str,
pub struct OldDiagnostic {
/// The message body to display to the user, to explain the diagnostic.
pub body: String,
/// The message to display to the user, to explain the suggested fix.
@@ -16,17 +17,27 @@ pub struct Diagnostic {
pub range: TextRange,
pub fix: Option<Fix>,
pub parent: Option<TextSize>,
pub(crate) rule: Rule,
pub(crate) file: SourceFile,
}
impl Diagnostic {
pub fn new<T: Violation>(kind: T, range: TextRange) -> Self {
impl OldDiagnostic {
// TODO(brent) We temporarily allow this to avoid updating all of the call sites to add
// references. I expect this method to go away or change significantly with the rest of the
// diagnostic refactor, but if it still exists in this form at the end of the refactor, we
// should just update the call sites.
#[expect(clippy::needless_pass_by_value)]
pub fn new<T: Violation>(kind: T, range: TextRange, file: &SourceFile) -> Self {
Self {
name: T::rule_name(),
body: Violation::message(&kind),
suggestion: Violation::fix_title(&kind),
range,
fix: None,
parent: None,
rule: T::rule(),
file: file.clone(),
}
}
@@ -50,7 +61,7 @@ impl Diagnostic {
pub fn try_set_fix(&mut self, func: impl FnOnce() -> Result<Fix>) {
match func() {
Ok(fix) => self.fix = Some(fix),
Err(err) => debug!("Failed to create fix for {}: {}", self.name, err),
Err(err) => debug!("Failed to create fix for {}: {}", self.rule, err),
}
}
@@ -61,7 +72,7 @@ impl Diagnostic {
match func() {
Ok(None) => {}
Ok(Some(fix)) => self.fix = Some(fix),
Err(err) => debug!("Failed to create fix for {}: {}", self.name, err),
Err(err) => debug!("Failed to create fix for {}: {}", self.rule, err),
}
}
@@ -80,7 +91,13 @@ impl Diagnostic {
}
}
impl Ranged for Diagnostic {
impl AsRule for OldDiagnostic {
fn rule(&self) -> Rule {
self.rule
}
}
impl Ranged for OldDiagnostic {
fn range(&self) -> TextRange {
self.range
}
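
Construction of the renamed type now requires the source file up front and records the rule from the violation's type. A minimal sketch, mirroring the tests elsewhere in this diff; the choice of `MissingNewlineAtEndOfFile` is arbitrary, as in those tests:

```rust
use ruff_source_file::SourceFileBuilder;
use ruff_text_size::TextRange;

let file = SourceFileBuilder::new("<filename>", "<code>").finish();
let diagnostic = OldDiagnostic::new(MissingNewlineAtEndOfFile, TextRange::default(), &file);
// The rule is recovered through the new `AsRule` impl rather than a name string.
assert_eq!(diagnostic.rule(), Rule::MissingNewlineAtEndOfFile);
```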

View File

@@ -2,7 +2,6 @@
use anyhow::{Context, Result};
use ruff_diagnostics::Edit;
use ruff_python_ast::parenthesize::parenthesized_range;
use ruff_python_ast::{self as ast, Arguments, ExceptHandler, Expr, ExprList, Parameters, Stmt};
use ruff_python_ast::{AnyNodeRef, ArgOrKeyword};
@@ -16,6 +15,7 @@ use ruff_python_trivia::{
use ruff_source_file::{LineRanges, NewlineWithTrailingNewline, UniversalNewlines};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
use crate::Edit;
use crate::Locator;
use crate::cst::matchers::{match_function_def, match_indented_block, match_statement};
use crate::fix::codemods;
@@ -595,18 +595,17 @@ mod tests {
use ruff_source_file::SourceFileBuilder;
use test_case::test_case;
use ruff_diagnostics::{Diagnostic, Edit, Fix};
use ruff_python_ast::Stmt;
use ruff_python_codegen::Stylist;
use ruff_python_parser::{parse_expression, parse_module};
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::Locator;
use crate::fix::apply_fixes;
use crate::fix::edits::{
add_to_dunder_all, make_redundant_alias, next_stmt_break, trailing_semicolon,
};
use crate::message::Message;
use crate::{Edit, Fix, Locator, OldDiagnostic};
/// Parse the given source using [`Mode::Module`] and return the first statement.
fn parse_first_stmt(source: &str) -> Result<Stmt> {
@@ -737,24 +736,16 @@ x = 1 \
let diag = {
use crate::rules::pycodestyle::rules::MissingNewlineAtEndOfFile;
let mut iter = edits.into_iter();
let diag = Diagnostic::new(
let diag = OldDiagnostic::new(
MissingNewlineAtEndOfFile, // The choice of rule here is arbitrary.
TextRange::default(),
&SourceFileBuilder::new("<filename>", "<code>").finish(),
)
.with_fix(Fix::safe_edits(
iter.next().ok_or(anyhow!("expected edits nonempty"))?,
iter,
));
Message::diagnostic(
diag.name,
diag.body,
diag.suggestion,
diag.range,
diag.fix,
diag.parent,
SourceFileBuilder::new("<filename>", "<code>").finish(),
None,
)
Message::from_diagnostic(diag, None)
};
assert_eq!(apply_fixes([diag].iter(), &locator).code, expect);
Ok(())

View File

@@ -3,7 +3,7 @@ use std::collections::BTreeSet;
use itertools::Itertools;
use rustc_hash::{FxHashMap, FxHashSet};
use ruff_diagnostics::{Edit, Fix, IsolationLevel, SourceMap};
use ruff_diagnostics::{IsolationLevel, SourceMap};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
use crate::Locator;
@@ -11,6 +11,7 @@ use crate::linter::FixTable;
use crate::message::Message;
use crate::registry::Rule;
use crate::settings::types::UnsafeFixes;
use crate::{Edit, Fix};
pub(crate) mod codemods;
pub(crate) mod edits;
@@ -157,14 +158,16 @@ fn cmp_fix(rule1: Rule, rule2: Rule, fix1: &Fix, fix2: &Fix) -> std::cmp::Orderi
#[cfg(test)]
mod tests {
use ruff_diagnostics::{Diagnostic, Edit, Fix, SourceMarker};
use ruff_diagnostics::SourceMarker;
use ruff_source_file::SourceFileBuilder;
use ruff_text_size::{Ranged, TextSize};
use crate::Locator;
use crate::diagnostic::OldDiagnostic;
use crate::fix::{FixResult, apply_fixes};
use crate::message::Message;
use crate::rules::pycodestyle::rules::MissingNewlineAtEndOfFile;
use crate::{Edit, Fix};
fn create_diagnostics(
filename: &str,
@@ -174,12 +177,12 @@ mod tests {
edit.into_iter()
.map(|edit| {
// The choice of rule here is arbitrary.
let diagnostic = Diagnostic::new(MissingNewlineAtEndOfFile, edit.range());
Message::from_diagnostic(
diagnostic.with_fix(Fix::safe_edit(edit)),
SourceFileBuilder::new(filename, source).finish(),
None,
)
let diagnostic = OldDiagnostic::new(
MissingNewlineAtEndOfFile,
edit.range(),
&SourceFileBuilder::new(filename, source).finish(),
);
Message::from_diagnostic(diagnostic.with_fix(Fix::safe_edit(edit)), None)
})
.collect()
}

View File

@@ -1,7 +1,6 @@
//! Insert statements into Python code.
use std::ops::Add;
use ruff_diagnostics::Edit;
use ruff_python_ast::Stmt;
use ruff_python_ast::helpers::is_docstring_stmt;
use ruff_python_codegen::Stylist;
@@ -10,6 +9,7 @@ use ruff_python_trivia::{PythonWhitespace, textwrap::indent};
use ruff_source_file::{LineRanges, UniversalNewlineIterator};
use ruff_text_size::{Ranged, TextSize};
use crate::Edit;
use crate::Locator;
#[derive(Debug, Clone, PartialEq, Eq)]

View File

@@ -8,7 +8,6 @@ use std::error::Error;
use anyhow::Result;
use libcst_native::{ImportAlias, Name as cstName, NameOrAttribute};
use ruff_diagnostics::Edit;
use ruff_python_ast::{self as ast, Expr, ModModule, Stmt};
use ruff_python_codegen::Stylist;
use ruff_python_parser::{Parsed, Tokens};
@@ -18,6 +17,7 @@ use ruff_python_semantic::{
use ruff_python_trivia::textwrap::indent;
use ruff_text_size::{Ranged, TextSize};
use crate::Edit;
use crate::Locator;
use crate::cst::matchers::{match_aliases, match_import_from, match_statement};
use crate::fix;

View File

@@ -14,12 +14,17 @@ pub use rule_selector::RuleSelector;
pub use rule_selector::clap_completion::RuleSelectorParser;
pub use rules::pycodestyle::rules::IOError;
pub use diagnostic::OldDiagnostic;
pub(crate) use ruff_diagnostics::{Applicability, Edit, Fix};
pub use violation::{AlwaysFixableViolation, FixAvailability, Violation, ViolationMetadata};
pub const VERSION: &str = env!("CARGO_PKG_VERSION");
mod checkers;
pub mod codes;
mod comments;
mod cst;
mod diagnostic;
pub mod directives;
mod doc_lines;
mod docstrings;
@@ -45,6 +50,7 @@ pub mod settings;
pub mod source_kind;
mod text_helpers;
pub mod upstream_categories;
mod violation;
#[cfg(any(test, fuzzing))]
pub mod test;

View File

@@ -1,6 +1,4 @@
use std::borrow::Cow;
use std::cell::LazyCell;
use std::ops::Deref;
use std::path::Path;
use anyhow::{Result, anyhow};
@@ -9,16 +7,16 @@ use itertools::Itertools;
use ruff_python_parser::semantic_errors::SemanticSyntaxError;
use rustc_hash::FxHashMap;
use ruff_diagnostics::Diagnostic;
use ruff_notebook::Notebook;
use ruff_python_ast::{ModModule, PySourceType, PythonVersion};
use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::{ParseError, ParseOptions, Parsed, UnsupportedSyntaxError};
use ruff_source_file::SourceFileBuilder;
use ruff_source_file::{SourceFile, SourceFileBuilder};
use ruff_text_size::Ranged;
use crate::checkers::ast::check_ast;
use crate::OldDiagnostic;
use crate::checkers::ast::{LintContext, check_ast};
use crate::checkers::filesystem::check_file_path;
use crate::checkers::imports::check_imports;
use crate::checkers::noqa::check_noqa;
@@ -113,8 +111,11 @@ pub fn check_path(
parsed: &Parsed<ModModule>,
target_version: TargetVersion,
) -> Vec<Message> {
let source_file =
SourceFileBuilder::new(path.to_string_lossy().as_ref(), locator.contents()).finish();
// Aggregate all diagnostics.
let mut diagnostics = vec![];
let mut diagnostics = LintContext::new(&source_file);
// Aggregate all semantic syntax errors.
let mut semantic_syntax_errors = vec![];
@@ -136,7 +137,7 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_tokens())
{
diagnostics.extend(check_tokens(
check_tokens(
tokens,
path,
locator,
@@ -145,7 +146,8 @@ pub fn check_path(
settings,
source_type,
source_kind.as_ipy_notebook().map(Notebook::cell_offsets),
));
&mut diagnostics,
);
}
// Run the filesystem-based rules.
@@ -154,14 +156,15 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_filesystem())
{
diagnostics.extend(check_file_path(
check_file_path(
path,
package,
locator,
comment_ranges,
settings,
target_version.linter_version(),
));
&diagnostics,
);
}
// Run the logical line-based rules.
@@ -170,9 +173,14 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_logical_lines())
{
diagnostics.extend(crate::checkers::logical_lines::check_logical_lines(
tokens, locator, indexer, stylist, settings,
));
crate::checkers::logical_lines::check_logical_lines(
tokens,
locator,
indexer,
stylist,
settings,
&diagnostics,
);
}
// Run the AST-based rules only if there are no syntax errors.
@@ -180,7 +188,7 @@ pub fn check_path(
let cell_offsets = source_kind.as_ipy_notebook().map(Notebook::cell_offsets);
let notebook_index = source_kind.as_ipy_notebook().map(Notebook::index);
let (new_diagnostics, new_semantic_syntax_errors) = check_ast(
semantic_syntax_errors.extend(check_ast(
parsed,
locator,
stylist,
@@ -194,9 +202,8 @@ pub fn check_path(
cell_offsets,
notebook_index,
target_version,
);
diagnostics.extend(new_diagnostics);
semantic_syntax_errors.extend(new_semantic_syntax_errors);
&diagnostics,
));
let use_imports = !directives.isort.skip_file
&& settings
@@ -205,7 +212,7 @@ pub fn check_path(
.any(|rule_code| rule_code.lint_source().is_imports());
if use_imports || use_doc_lines {
if use_imports {
let import_diagnostics = check_imports(
check_imports(
parsed,
locator,
indexer,
@@ -216,9 +223,8 @@ pub fn check_path(
source_type,
cell_offsets,
target_version.linter_version(),
&diagnostics,
);
diagnostics.extend(import_diagnostics);
}
if use_doc_lines {
doc_lines.extend(doc_lines_from_ast(parsed.suite(), locator));
@@ -238,9 +244,14 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_physical_lines())
{
diagnostics.extend(check_physical_lines(
locator, stylist, indexer, &doc_lines, settings,
));
check_physical_lines(
locator,
stylist,
indexer,
&doc_lines,
settings,
&diagnostics,
);
}
// Raise violations for internal test rules
@@ -250,47 +261,70 @@ pub fn check_path(
if !settings.rules.enabled(*test_rule) {
continue;
}
let diagnostic = match test_rule {
match test_rule {
Rule::StableTestRule => {
test_rules::StableTestRule::diagnostic(locator, comment_ranges)
}
Rule::StableTestRuleSafeFix => {
test_rules::StableTestRuleSafeFix::diagnostic(locator, comment_ranges)
}
Rule::StableTestRuleUnsafeFix => {
test_rules::StableTestRuleUnsafeFix::diagnostic(locator, comment_ranges)
test_rules::StableTestRule::diagnostic(locator, comment_ranges, &diagnostics);
}
Rule::StableTestRuleSafeFix => test_rules::StableTestRuleSafeFix::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::StableTestRuleUnsafeFix => test_rules::StableTestRuleUnsafeFix::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::StableTestRuleDisplayOnlyFix => {
test_rules::StableTestRuleDisplayOnlyFix::diagnostic(locator, comment_ranges)
test_rules::StableTestRuleDisplayOnlyFix::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
Rule::PreviewTestRule => {
test_rules::PreviewTestRule::diagnostic(locator, comment_ranges)
test_rules::PreviewTestRule::diagnostic(locator, comment_ranges, &diagnostics);
}
Rule::DeprecatedTestRule => {
test_rules::DeprecatedTestRule::diagnostic(locator, comment_ranges)
test_rules::DeprecatedTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
Rule::AnotherDeprecatedTestRule => {
test_rules::AnotherDeprecatedTestRule::diagnostic(locator, comment_ranges)
test_rules::AnotherDeprecatedTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
Rule::RemovedTestRule => {
test_rules::RemovedTestRule::diagnostic(locator, comment_ranges)
}
Rule::AnotherRemovedTestRule => {
test_rules::AnotherRemovedTestRule::diagnostic(locator, comment_ranges)
}
Rule::RedirectedToTestRule => {
test_rules::RedirectedToTestRule::diagnostic(locator, comment_ranges)
}
Rule::RedirectedFromTestRule => {
test_rules::RedirectedFromTestRule::diagnostic(locator, comment_ranges)
test_rules::RemovedTestRule::diagnostic(locator, comment_ranges, &diagnostics);
}
Rule::AnotherRemovedTestRule => test_rules::AnotherRemovedTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::RedirectedToTestRule => test_rules::RedirectedToTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::RedirectedFromTestRule => test_rules::RedirectedFromTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::RedirectedFromPrefixTestRule => {
test_rules::RedirectedFromPrefixTestRule::diagnostic(locator, comment_ranges)
test_rules::RedirectedFromPrefixTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
_ => unreachable!("All test rules must have an implementation"),
};
if let Some(diagnostic) = diagnostic {
diagnostics.push(diagnostic);
}
}
}
@@ -308,7 +342,9 @@ pub fn check_path(
RuleSet::empty()
};
if !per_file_ignores.is_empty() {
diagnostics.retain(|diagnostic| !per_file_ignores.contains(diagnostic.rule()));
diagnostics
.as_mut_vec()
.retain(|diagnostic| !per_file_ignores.contains(diagnostic.rule()));
}
// Enforce `noqa` directives.
@@ -330,11 +366,13 @@ pub fn check_path(
);
if noqa.is_enabled() {
for index in ignored.iter().rev() {
diagnostics.swap_remove(*index);
diagnostics.as_mut_vec().swap_remove(*index);
}
}
}
let mut diagnostics = diagnostics.into_diagnostics();
if parsed.has_valid_syntax() {
// Remove fixes for any rules marked as unfixable.
for diagnostic in &mut diagnostics {
@@ -372,9 +410,9 @@ pub fn check_path(
parsed.errors(),
syntax_errors,
&semantic_syntax_errors,
path,
locator,
directives,
&source_file,
)
}
@@ -438,7 +476,7 @@ pub fn add_noqa_to_path(
)
}
/// Generate a [`Message`] for each [`Diagnostic`] triggered by the given source
/// Generate a [`Message`] for each [`OldDiagnostic`] triggered by the given source
/// code.
pub fn lint_only(
path: &Path,
@@ -503,39 +541,28 @@ pub fn lint_only(
/// Convert from diagnostics to messages.
fn diagnostics_to_messages(
diagnostics: Vec<Diagnostic>,
diagnostics: Vec<OldDiagnostic>,
parse_errors: &[ParseError],
unsupported_syntax_errors: &[UnsupportedSyntaxError],
semantic_syntax_errors: &[SemanticSyntaxError],
path: &Path,
locator: &Locator,
directives: &Directives,
source_file: &SourceFile,
) -> Vec<Message> {
let file = LazyCell::new(|| {
let mut builder =
SourceFileBuilder::new(path.to_string_lossy().as_ref(), locator.contents());
if let Some(line_index) = locator.line_index() {
builder.set_line_index(line_index.clone());
}
builder.finish()
});
parse_errors
.iter()
.map(|parse_error| Message::from_parse_error(parse_error, locator, file.deref().clone()))
.map(|parse_error| Message::from_parse_error(parse_error, locator, source_file.clone()))
.chain(unsupported_syntax_errors.iter().map(|syntax_error| {
Message::from_unsupported_syntax_error(syntax_error, file.deref().clone())
Message::from_unsupported_syntax_error(syntax_error, source_file.clone())
}))
.chain(
semantic_syntax_errors
.iter()
.map(|error| Message::from_semantic_syntax_error(error, file.deref().clone())),
.map(|error| Message::from_semantic_syntax_error(error, source_file.clone())),
)
.chain(diagnostics.into_iter().map(|diagnostic| {
let noqa_offset = directives.noqa_line_for.resolve(diagnostic.start());
Message::from_diagnostic(diagnostic, file.deref().clone(), Some(noqa_offset))
Message::from_diagnostic(diagnostic, Some(noqa_offset))
}))
.collect()
}
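
Summarizing the new control flow in `check_path`: all checkers report into one `LintContext` tied to the file's `SourceFile`, and the accumulated diagnostics are only materialized into a `Vec` at the end. A condensed sketch of that lifecycle, with elisions marked (note that `check_tokens` and `check_noqa` take the context mutably while the other checkers borrow it shared):

```rust
let source_file =
    SourceFileBuilder::new(path.to_string_lossy().as_ref(), locator.contents()).finish();
let mut diagnostics = LintContext::new(&source_file);

// ... token, filesystem, logical-line, AST, import, and physical-line
// checkers all report into `diagnostics` ...

// Post-filtering operates on the underlying Vec, then the context is consumed.
diagnostics
    .as_mut_vec()
    .retain(|diagnostic| !per_file_ignores.contains(diagnostic.rule()));
let mut diagnostics = diagnostics.into_diagnostics();
```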

View File

@@ -6,11 +6,11 @@ use colored::{Color, ColoredString, Colorize, Styles};
use ruff_text_size::{Ranged, TextRange, TextSize};
use similar::{ChangeTag, TextDiff};
use ruff_diagnostics::{Applicability, Fix};
use ruff_source_file::{OneIndexed, SourceFile};
use crate::message::Message;
use crate::text_helpers::ShowNonprinting;
use crate::{Applicability, Fix};
/// Renders a diff that shows the code fixes.
///

View File

@@ -4,11 +4,11 @@ use serde::ser::SerializeSeq;
use serde::{Serialize, Serializer};
use serde_json::{Value, json};
use ruff_diagnostics::Edit;
use ruff_notebook::NotebookIndex;
use ruff_source_file::{LineColumn, OneIndexed, SourceCode};
use ruff_text_size::Ranged;
use crate::Edit;
use crate::message::{Emitter, EmitterContext, Message};
#[derive(Default)]

View File

@@ -16,7 +16,6 @@ pub use json_lines::JsonLinesEmitter;
pub use junit::JunitEmitter;
pub use pylint::PylintEmitter;
pub use rdjson::RdjsonEmitter;
use ruff_diagnostics::{Diagnostic, Fix};
use ruff_notebook::NotebookIndex;
use ruff_python_parser::{ParseError, UnsupportedSyntaxError};
use ruff_source_file::{LineColumn, SourceFile};
@@ -28,6 +27,7 @@ use crate::Locator;
use crate::codes::NoqaCode;
use crate::logging::DisplayParseErrorType;
use crate::registry::Rule;
use crate::{Fix, OldDiagnostic};
mod azure;
mod diff;
@@ -50,7 +50,7 @@ mod text;
/// `noqa` offsets.
///
/// For diagnostic messages, the [`db::Diagnostic`]'s primary message contains the
/// [`Diagnostic::body`], and the primary annotation optionally contains the suggestion accompanying
/// [`OldDiagnostic::body`], and the primary annotation optionally contains the suggestion accompanying
/// a fix. The `db::Diagnostic::id` field contains the kebab-case lint name derived from the `Rule`.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct Message {
@@ -60,6 +60,7 @@ pub struct Message {
pub fix: Option<Fix>,
pub parent: Option<TextSize>,
pub(crate) noqa_offset: Option<TextSize>,
noqa_code: Option<NoqaCode>,
}
impl Message {
@@ -76,12 +77,12 @@ impl Message {
fix: None,
parent: None,
noqa_offset: None,
noqa_code: None,
}
}
#[expect(clippy::too_many_arguments)]
pub fn diagnostic(
name: &'static str,
body: String,
suggestion: Option<String>,
range: TextRange,
@@ -89,9 +90,10 @@ impl Message {
parent: Option<TextSize>,
file: SourceFile,
noqa_offset: Option<TextSize>,
rule: Rule,
) -> Message {
let mut diagnostic = db::Diagnostic::new(
DiagnosticId::Lint(LintName::of(name)),
DiagnosticId::Lint(LintName::of(rule.into())),
Severity::Error,
body,
);
@@ -107,25 +109,22 @@ impl Message {
fix,
parent,
noqa_offset,
noqa_code: Some(rule.noqa_code()),
}
}
/// Create a [`Message`] from the given [`Diagnostic`] corresponding to a rule violation.
pub fn from_diagnostic(
diagnostic: Diagnostic,
file: SourceFile,
noqa_offset: Option<TextSize>,
) -> Message {
let Diagnostic {
name,
/// Create a [`Message`] from the given [`OldDiagnostic`] corresponding to a rule violation.
pub fn from_diagnostic(diagnostic: OldDiagnostic, noqa_offset: Option<TextSize>) -> Message {
let OldDiagnostic {
body,
suggestion,
range,
fix,
parent,
rule,
file,
} = diagnostic;
Self::diagnostic(
name,
body,
suggestion,
range,
@@ -133,6 +132,7 @@ impl Message {
parent,
file,
noqa_offset,
rule,
)
}
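
A usage sketch for the reworked constructor, mirroring the test fixtures later in this file; the argument values are illustrative, the argument order is as in those fixtures, and the `fix`/`parent` slots are left empty:

```rust
use ruff_source_file::SourceFileBuilder;
use ruff_text_size::{TextRange, TextSize};

let file = SourceFileBuilder::new("fib.py", "import os\n").finish();
let message = Message::diagnostic(
    "`os` imported but unused".to_string(),
    Some("Remove unused import: `os`".to_string()),
    TextRange::new(TextSize::from(7), TextSize::from(9)),
    None, // fix
    None, // parent
    file,
    None, // noqa offset
    Rule::UnusedImport,
);
// The noqa code is cached at construction, so this no longer re-derives the rule.
assert!(message.to_noqa_code().is_some());
```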
@@ -235,7 +235,7 @@ impl Message {
/// Returns the [`NoqaCode`] corresponding to the diagnostic message.
pub fn to_noqa_code(&self) -> Option<NoqaCode> {
self.to_rule().map(|rule| rule.noqa_code())
self.noqa_code
}
/// Returns the URL for the rule documentation, if it exists.
@@ -371,7 +371,8 @@ impl<'a> EmitterContext<'a> {
mod tests {
use rustc_hash::FxHashMap;
use ruff_diagnostics::{Edit, Fix};
use crate::codes::Rule;
use crate::{Edit, Fix};
use ruff_notebook::NotebookIndex;
use ruff_python_parser::{Mode, ParseOptions, parse_unchecked};
use ruff_source_file::{OneIndexed, SourceFileBuilder};
@@ -417,7 +418,6 @@ def fibonacci(n):
let unused_import_start = TextSize::from(7);
let unused_import = Message::diagnostic(
"unused-import",
"`os` imported but unused".to_string(),
Some("Remove unused import: `os`".to_string()),
TextRange::new(unused_import_start, TextSize::from(9)),
@@ -428,11 +428,11 @@ def fibonacci(n):
None,
fib_source.clone(),
Some(unused_import_start),
Rule::UnusedImport,
);
let unused_variable_start = TextSize::from(94);
let unused_variable = Message::diagnostic(
"unused-variable",
"Local variable `x` is assigned to but never used".to_string(),
Some("Remove assignment to unused variable `x`".to_string()),
TextRange::new(unused_variable_start, TextSize::from(95)),
@@ -443,13 +443,13 @@ def fibonacci(n):
None,
fib_source,
Some(unused_variable_start),
Rule::UnusedVariable,
);
let file_2 = r"if a == 1: pass";
let undefined_name_start = TextSize::from(3);
let undefined_name = Message::diagnostic(
"undefined-name",
"Undefined name `a`".to_string(),
None,
TextRange::new(undefined_name_start, TextSize::from(4)),
@@ -457,6 +457,7 @@ def fibonacci(n):
None,
SourceFileBuilder::new("undef.py", file_2).finish(),
Some(undefined_name_start),
Rule::UndefinedName,
);
vec![unused_import, unused_variable, undefined_name]
@@ -479,7 +480,6 @@ def foo():
let unused_import_os_start = TextSize::from(16);
let unused_import_os = Message::diagnostic(
"unused-import",
"`os` imported but unused".to_string(),
Some("Remove unused import: `os`".to_string()),
TextRange::new(unused_import_os_start, TextSize::from(18)),
@@ -490,11 +490,11 @@ def foo():
None,
notebook_source.clone(),
Some(unused_import_os_start),
Rule::UnusedImport,
);
let unused_import_math_start = TextSize::from(35);
let unused_import_math = Message::diagnostic(
"unused-import",
"`math` imported but unused".to_string(),
Some("Remove unused import: `math`".to_string()),
TextRange::new(unused_import_math_start, TextSize::from(39)),
@@ -505,11 +505,11 @@ def foo():
None,
notebook_source.clone(),
Some(unused_import_math_start),
Rule::UnusedImport,
);
let unused_variable_start = TextSize::from(98);
let unused_variable = Message::diagnostic(
"unused-variable",
"Local variable `x` is assigned to but never used".to_string(),
Some("Remove assignment to unused variable `x`".to_string()),
TextRange::new(unused_variable_start, TextSize::from(99)),
@@ -520,6 +520,7 @@ def foo():
None,
notebook_source,
Some(unused_variable_start),
Rule::UnusedVariable,
);
let mut notebook_indexes = FxHashMap::default();

View File

@@ -4,10 +4,10 @@ use serde::ser::SerializeSeq;
use serde::{Serialize, Serializer};
use serde_json::{Value, json};
use ruff_diagnostics::Edit;
use ruff_source_file::SourceCode;
use ruff_text_size::Ranged;
use crate::Edit;
use crate::message::{Emitter, EmitterContext, LineColumn, Message};
#[derive(Default)]

View File

@@ -9,11 +9,11 @@ use anyhow::Result;
use itertools::Itertools;
use log::warn;
use ruff_diagnostics::Edit;
use ruff_python_trivia::{CommentRanges, Cursor, indentation_at_offset};
use ruff_source_file::{LineEnding, LineRanges};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
use crate::Edit;
use crate::Locator;
use crate::codes::NoqaCode;
use crate::fs::relativize_path;
@@ -1221,7 +1221,6 @@ mod tests {
use insta::assert_debug_snapshot;
use ruff_diagnostics::{Diagnostic, Edit};
use ruff_python_trivia::CommentRanges;
use ruff_source_file::{LineEnding, SourceFileBuilder};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
@@ -1234,6 +1233,7 @@ mod tests {
use crate::rules::pycodestyle::rules::{AmbiguousVariableName, UselessSemicolon};
use crate::rules::pyflakes::rules::UnusedVariable;
use crate::rules::pyupgrade::rules::PrintfStringFormatting;
use crate::{Edit, OldDiagnostic};
use crate::{Locator, generate_noqa_edits};
fn assert_lexed_ranges_match_slices(
@@ -1252,14 +1252,9 @@ mod tests {
}
/// Create a [`Message`] with a placeholder filename and rule code from `diagnostic`.
fn message_from_diagnostic(
diagnostic: Diagnostic,
path: impl AsRef<Path>,
source: &str,
) -> Message {
fn message_from_diagnostic(diagnostic: OldDiagnostic) -> Message {
let noqa_offset = diagnostic.start();
let file = SourceFileBuilder::new(path.as_ref().to_string_lossy(), source).finish();
Message::from_diagnostic(diagnostic, file, Some(noqa_offset))
Message::from_diagnostic(diagnostic, Some(noqa_offset))
}
#[test]
@@ -2842,13 +2837,15 @@ mod tests {
assert_eq!(count, 0);
assert_eq!(output, format!("{contents}"));
let messages = [Diagnostic::new(
let source_file = SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
let messages = [OldDiagnostic::new(
UnusedVariable {
name: "x".to_string(),
},
TextRange::new(TextSize::from(0), TextSize::from(0)),
&source_file,
)]
.map(|d| message_from_diagnostic(d, path, contents));
.map(message_from_diagnostic);
let contents = "x = 1";
let noqa_line_for = NoqaMapping::default();
@@ -2864,19 +2861,22 @@ mod tests {
assert_eq!(count, 1);
assert_eq!(output, "x = 1 # noqa: F841\n");
let source_file = SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
let messages = [
Diagnostic::new(
OldDiagnostic::new(
AmbiguousVariableName("x".to_string()),
TextRange::new(TextSize::from(0), TextSize::from(0)),
&source_file,
),
Diagnostic::new(
OldDiagnostic::new(
UnusedVariable {
name: "x".to_string(),
},
TextRange::new(TextSize::from(0), TextSize::from(0)),
&source_file,
),
]
.map(|d| message_from_diagnostic(d, path, contents));
.map(message_from_diagnostic);
let contents = "x = 1 # noqa: E741\n";
let noqa_line_for = NoqaMapping::default();
let comment_ranges =
@@ -2893,19 +2893,22 @@ mod tests {
assert_eq!(count, 1);
assert_eq!(output, "x = 1 # noqa: E741, F841\n");
let source_file = SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
let messages = [
Diagnostic::new(
OldDiagnostic::new(
AmbiguousVariableName("x".to_string()),
TextRange::new(TextSize::from(0), TextSize::from(0)),
&source_file,
),
Diagnostic::new(
OldDiagnostic::new(
UnusedVariable {
name: "x".to_string(),
},
TextRange::new(TextSize::from(0), TextSize::from(0)),
&source_file,
),
]
.map(|d| message_from_diagnostic(d, path, contents));
.map(message_from_diagnostic);
let contents = "x = 1 # noqa";
let noqa_line_for = NoqaMapping::default();
let comment_ranges =
@@ -2936,11 +2939,13 @@ print(
)
"#;
let noqa_line_for = [TextRange::new(8.into(), 68.into())].into_iter().collect();
let messages = [Diagnostic::new(
let source_file = SourceFileBuilder::new(path.to_string_lossy(), source).finish();
let messages = [OldDiagnostic::new(
PrintfStringFormatting,
TextRange::new(12.into(), 79.into()),
&source_file,
)]
.map(|d| message_from_diagnostic(d, path, source));
.map(message_from_diagnostic);
let comment_ranges = CommentRanges::default();
let edits = generate_noqa_edits(
path,
@@ -2968,11 +2973,13 @@ print(
foo;
bar =
";
let messages = [Diagnostic::new(
let source_file = SourceFileBuilder::new(path.to_string_lossy(), source).finish();
let messages = [OldDiagnostic::new(
UselessSemicolon,
TextRange::new(4.into(), 5.into()),
&source_file,
)]
.map(|d| message_from_diagnostic(d, path, source));
.map(message_from_diagnostic);
let noqa_line_for = NoqaMapping::default();
let comment_ranges = CommentRanges::default();
let edits = generate_noqa_edits(

View File

@@ -3,16 +3,16 @@ use log::warn;
use pyproject_toml::PyProjectToml;
use ruff_text_size::{TextRange, TextSize};
use ruff_diagnostics::Diagnostic;
use ruff_source_file::SourceFile;
use crate::IOError;
use crate::OldDiagnostic;
use crate::message::Message;
use crate::registry::Rule;
use crate::rules::ruff::rules::InvalidPyprojectToml;
use crate::settings::LinterSettings;
pub fn lint_pyproject_toml(source_file: SourceFile, settings: &LinterSettings) -> Vec<Message> {
pub fn lint_pyproject_toml(source_file: &SourceFile, settings: &LinterSettings) -> Vec<Message> {
let Some(err) = toml::from_str::<PyProjectToml>(source_file.source_text()).err() else {
return Vec::default();
};
@@ -29,8 +29,9 @@ pub fn lint_pyproject_toml(source_file: SourceFile, settings: &LinterSettings) -
source_file.name(),
);
if settings.rules.enabled(Rule::IOError) {
let diagnostic = Diagnostic::new(IOError { message }, TextRange::default());
messages.push(Message::from_diagnostic(diagnostic, source_file, None));
let diagnostic =
OldDiagnostic::new(IOError { message }, TextRange::default(), source_file);
messages.push(Message::from_diagnostic(diagnostic, None));
} else {
warn!(
"{}{}{} {message}",
@@ -51,8 +52,12 @@ pub fn lint_pyproject_toml(source_file: SourceFile, settings: &LinterSettings) -
if settings.rules.enabled(Rule::InvalidPyprojectToml) {
let toml_err = err.message().to_string();
let diagnostic = Diagnostic::new(InvalidPyprojectToml { message: toml_err }, range);
messages.push(Message::from_diagnostic(diagnostic, source_file, None));
let diagnostic = OldDiagnostic::new(
InvalidPyprojectToml { message: toml_err },
range,
source_file,
);
messages.push(Message::from_diagnostic(diagnostic, None));
}
messages

View File

@@ -3,13 +3,13 @@
use anyhow::{Result, anyhow};
use itertools::Itertools;
use ruff_diagnostics::Edit;
use ruff_python_ast as ast;
use ruff_python_codegen::Stylist;
use ruff_python_semantic::{Binding, BindingKind, Scope, ScopeId, SemanticModel};
use ruff_python_stdlib::{builtins::is_python_builtin, keyword::is_keyword};
use ruff_text_size::Ranged;
use crate::Edit;
use crate::checkers::ast::Checker;
pub(crate) struct Renamer;

View File

@@ -1,10 +1,17 @@
use crate::checkers::ast::Checker;
use crate::fix::edits::remove_unused_imports;
use crate::importer::ImportRequest;
use crate::rules::numpy::helpers::{AttributeSearcher, ImportSearcher};
use ruff_diagnostics::{Edit, Fix};
use ruff_python_ast::name::QualifiedNameBuilder;
use ruff_python_ast::statement_visitor::StatementVisitor;
use ruff_python_ast::visitor::Visitor;
use ruff_python_ast::{Expr, ExprName, StmtTry};
use ruff_python_ast::{Expr, ExprAttribute, ExprName, StmtTry};
use ruff_python_semantic::Exceptions;
use ruff_python_semantic::SemanticModel;
use ruff_python_semantic::{MemberNameImport, NameImport};
use ruff_text_size::Ranged;
use ruff_text_size::TextRange;
#[derive(Clone, Debug, Eq, PartialEq)]
pub(crate) enum Replacement {
@@ -170,3 +177,70 @@ pub(crate) fn is_airflow_builtin_or_provider(
_ => false,
}
}
/// Return the [`ast::ExprName`] at the head of the expression, if any.
pub(crate) fn match_head(value: &Expr) -> Option<&ExprName> {
match value {
Expr::Attribute(ExprAttribute { value, .. }) => value.as_name_expr(),
Expr::Name(name) => Some(name),
_ => None,
}
}
/// Return the [`Fix`] that imports the new name and updates where the import is referenced.
/// This is used for cases where the member name has changed.
/// (e.g., `airflow.datasets.Dataset` to `airflow.sdk.Asset`)
pub(crate) fn generate_import_edit(
expr: &Expr,
checker: &Checker,
module: &str,
name: &str,
ranged: TextRange,
) -> Option<Fix> {
let (import_edit, _) = checker
.importer()
.get_or_import_symbol(
&ImportRequest::import_from(module, name),
expr.start(),
checker.semantic(),
)
.ok()?;
let replacement_edit = Edit::range_replacement(name.to_string(), ranged.range());
Some(Fix::safe_edits(import_edit, [replacement_edit]))
}
/// Return the [`Fix`] that removes the original import and imports the same name from the new path.
/// This is used for cases where the member name has not changed.
/// (e.g., `airflow.operators.pig_operator.PigOperator` to `airflow.providers.apache.pig.hooks.pig.PigCliHook`)
pub(crate) fn generate_remove_and_runtime_import_edit(
expr: &Expr,
checker: &Checker,
module: &str,
name: &str,
) -> Option<Fix> {
let head = match_head(expr)?;
let semantic = checker.semantic();
let binding = semantic
.resolve_name(head)
.or_else(|| checker.semantic().lookup_symbol(&head.id))
.map(|id| checker.semantic().binding(id))?;
let stmt = binding.statement(semantic)?;
let remove_edit = remove_unused_imports(
std::iter::once(name),
stmt,
None,
checker.locator(),
checker.stylist(),
checker.indexer(),
)
.ok()?;
let import_edit = checker.importer().add_import(
&NameImport::ImportFrom(MemberNameImport::member(
(*module).to_string(),
name.to_string(),
)),
expr.start(),
);
Some(Fix::unsafe_edits(remove_edit, [import_edit]))
}
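
For context, here is how the two helpers compose at a call site, mirroring the moved-to-provider rule later in this diff: the member-rename fix is attempted first, with removal-and-reimport as the fallback.

```rust
// Try the import-and-rename fix first; fall back to removing the old import
// and re-adding the same name under its new module path.
if let Some(fix) = generate_import_edit(expr, checker, module, name, ranged)
    .or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
{
    diagnostic.set_fix(fix);
}
```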

View File

@@ -1,10 +1,10 @@
use ruff_diagnostics::{Diagnostic, Violation};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::Expr;
use ruff_python_ast::{self as ast};
use ruff_python_semantic::Modules;
use ruff_text_size::Ranged;
use crate::Violation;
use crate::checkers::ast::Checker;
/// ## What it does
@@ -86,6 +86,5 @@ pub(crate) fn dag_no_schedule_argument(checker: &Checker, expr: &Expr) {
}
// Produce a diagnostic when the `schedule` keyword argument is not found.
let diagnostic = Diagnostic::new(AirflowDagNoScheduleArgument, expr.range());
checker.report_diagnostic(diagnostic);
checker.report_diagnostic(AirflowDagNoScheduleArgument, expr.range());
}

View File

@@ -1,13 +1,16 @@
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{ProviderReplacement, is_guarded_by_try_except};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use crate::checkers::ast::Checker;
use crate::rules::airflow::helpers::{
ProviderReplacement, generate_import_edit, generate_remove_and_runtime_import_edit,
is_guarded_by_try_except,
};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::name::QualifiedName;
use ruff_python_ast::{Expr, ExprAttribute};
use ruff_python_semantic::Modules;
use ruff_text_size::Ranged;
use ruff_text_size::TextRange;
use crate::checkers::ast::Checker;
use crate::{FixAvailability, Violation};
/// ## What it does
/// Checks for uses of Airflow functions and values that have been moved to its providers.
@@ -28,12 +31,12 @@ use crate::checkers::ast::Checker;
/// from airflow.providers.fab.auth_manager.fab_auth_manage import FabAuthManager
/// ```
#[derive(ViolationMetadata)]
pub(crate) struct Airflow3MovedToProvider {
deprecated: String,
pub(crate) struct Airflow3MovedToProvider<'a> {
deprecated: QualifiedName<'a>,
replacement: ProviderReplacement,
}
impl Violation for Airflow3MovedToProvider {
impl Violation for Airflow3MovedToProvider<'_> {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
#[derive_message_formats]
@@ -65,24 +68,26 @@ impl Violation for Airflow3MovedToProvider {
fn fix_title(&self) -> Option<String> {
let Airflow3MovedToProvider { replacement, .. } = self;
match replacement {
ProviderReplacement::None => None,
if let Some((module, name, provider, version)) = match &replacement {
ProviderReplacement::AutoImport {
name,
module,
name,
provider,
version,
} => Some(format!(
"Install `apache-airflow-providers-{provider}>={version}` and use `{module}.{name}` instead."
)),
} => Some((module, *name, provider, version)),
ProviderReplacement::SourceModuleMovedToProvider {
name,
module,
name,
provider,
version,
} => Some(format!(
"Install `apache-airflow-providers-{provider}>={version}` and use `{module}.{name}` instead."
)),
} => Some((module, name.as_str(), provider, version)),
ProviderReplacement::None => None,
} {
Some(format!(
"Install `apache-airflow-providers-{provider}>={version}` and use `{name}` from `{module}` instead."
))
} else {
None
}
}
}
@@ -1166,7 +1171,7 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
[
"airflow",
"sensors",
"external_task",
"external_task" | "external_task_sensor",
"ExternalTaskSensorLink",
] => ProviderReplacement::AutoImport {
module: "airflow.providers.standard.sensors.external_task",
@@ -1178,7 +1183,7 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
"airflow",
"sensors",
"external_task_sensor",
rest @ ("ExternalTaskMarker" | "ExternalTaskSensor" | "ExternalTaskSensorLink"),
rest @ ("ExternalTaskMarker" | "ExternalTaskSensor"),
] => ProviderReplacement::SourceModuleMovedToProvider {
module: "airflow.providers.standard.sensors.external_task",
name: (*rest).to_string(),
@@ -1195,34 +1200,38 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
_ => return,
};
let mut diagnostic = Diagnostic::new(
Airflow3MovedToProvider {
deprecated: qualified_name.to_string(),
replacement: replacement.clone(),
},
ranged.range(),
);
let semantic = checker.semantic();
if let Some((module, name)) = match &replacement {
ProviderReplacement::AutoImport { module, name, .. } => Some((module, *name)),
let (module, name) = match &replacement {
ProviderReplacement::AutoImport { module, name, .. } => (module, *name),
ProviderReplacement::SourceModuleMovedToProvider { module, name, .. } => {
Some((module, name.as_str()))
(module, name.as_str())
}
ProviderReplacement::None => None,
} {
if is_guarded_by_try_except(expr, module, name, semantic) {
ProviderReplacement::None => {
checker.report_diagnostic(
Airflow3MovedToProvider {
deprecated: qualified_name,
replacement,
},
ranged,
);
return;
}
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, name),
expr.start(),
checker.semantic(),
)?;
let replacement_edit = Edit::range_replacement(binding, ranged.range());
Ok(Fix::safe_edits(import_edit, [replacement_edit]))
});
};
if is_guarded_by_try_except(expr, module, name, checker.semantic()) {
return;
}
let mut diagnostic = checker.report_diagnostic(
Airflow3MovedToProvider {
deprecated: qualified_name,
replacement: replacement.clone(),
},
ranged,
);
if let Some(fix) = generate_import_edit(expr, checker, module, name, ranged)
.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
{
diagnostic.set_fix(fix);
}
checker.report_diagnostic(diagnostic);
}

View File

@@ -1,9 +1,9 @@
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{
Replacement, is_airflow_builtin_or_provider, is_guarded_by_try_except,
Replacement, generate_import_edit, generate_remove_and_runtime_import_edit,
is_airflow_builtin_or_provider, is_guarded_by_try_except,
};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use crate::{Edit, Fix, FixAvailability, Violation};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::helpers::map_callable;
use ruff_python_ast::{
@@ -73,10 +73,10 @@ impl Violation for Airflow3Removal {
Replacement::AttrName(name) => Some(format!("Use `{name}` instead")),
Replacement::Message(message) => Some((*message).to_string()),
Replacement::AutoImport { module, name } => {
Some(format!("Use `{module}.{name}` instead"))
Some(format!("Use `{name}` from `{module}` instead."))
}
Replacement::SourceModuleMoved { module, name } => {
Some(format!("Use `{module}.{name}` instead"))
Some(format!("Use `{name}` from `{module}` instead."))
}
}
}
@@ -166,13 +166,13 @@ fn check_function_parameters(checker: &Checker, function_def: &StmtFunctionDef)
for param in function_def.parameters.iter_non_variadic_params() {
let param_name = param.name();
if REMOVED_CONTEXT_KEYS.contains(&param_name.as_str()) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
Airflow3Removal {
deprecated: param_name.to_string(),
replacement: Replacement::None,
},
param_name.range(),
));
);
}
}
}
@@ -200,7 +200,7 @@ fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, argum
segments => {
if is_airflow_auth_manager(segments) {
if !arguments.is_empty() {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
Airflow3Removal {
deprecated: String::from("appbuilder"),
replacement: Replacement::Message(
@@ -208,7 +208,7 @@ fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, argum
),
},
arguments.range(),
));
);
}
} else if is_airflow_task_handler(segments) {
diagnostic_for_argument(checker, arguments, "filename_template", None);
@@ -305,7 +305,7 @@ fn check_class_attribute(checker: &Checker, attribute_expr: &ExprAttribute) {
} else {
None
};
let mut diagnostic = Diagnostic::new(
let mut diagnostic = checker.report_diagnostic(
Airflow3Removal {
deprecated: attr.to_string(),
replacement,
@@ -315,7 +315,6 @@ fn check_class_attribute(checker: &Checker, attribute_expr: &ExprAttribute) {
if let Some(fix) = fix {
diagnostic.set_fix(fix);
}
checker.report_diagnostic(diagnostic);
}
/// Checks whether a context key removed in Airflow 3.0 is used in a function decorated with `@task`.
@@ -382,13 +381,13 @@ fn check_context_key_usage_in_call(checker: &Checker, call_expr: &ExprCall) {
continue;
};
if value == removed_key {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
Airflow3Removal {
deprecated: removed_key.to_string(),
replacement: Replacement::None,
},
*range,
));
);
}
}
}
@@ -423,13 +422,13 @@ fn check_context_key_usage_in_subscript(checker: &Checker, subscript: &ExprSubsc
}
if REMOVED_CONTEXT_KEYS.contains(&key.to_str()) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
Airflow3Removal {
deprecated: key.to_string(),
replacement: Replacement::None,
},
slice.range(),
));
);
}
}
@@ -537,7 +536,7 @@ fn check_method(checker: &Checker, call_expr: &ExprCall) {
None
};
let mut diagnostic = Diagnostic::new(
let mut diagnostic = checker.report_diagnostic(
Airflow3Removal {
deprecated: attr.to_string(),
replacement,
@@ -547,7 +546,6 @@ fn check_method(checker: &Checker, call_expr: &ExprCall) {
if let Some(fix) = fix {
diagnostic.set_fix(fix);
}
checker.report_diagnostic(diagnostic);
}
/// Check whether a removed Airflow name is used.
@@ -616,7 +614,6 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
},
// airflow.configuration
// TODO: check whether we could improve it
[
"airflow",
"configuration",
@@ -966,37 +963,39 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
_ => return,
};
let mut diagnostic = Diagnostic::new(
let (module, name) = match &replacement {
Replacement::AutoImport { module, name } => (module, *name),
Replacement::SourceModuleMoved { module, name } => (module, name.as_str()),
_ => {
checker.report_diagnostic(
Airflow3Removal {
deprecated: qualified_name.to_string(),
replacement: replacement.clone(),
},
range,
);
return;
}
};
if is_guarded_by_try_except(expr, module, name, checker.semantic()) {
return;
}
let import_target = name.split('.').next().unwrap_or(name);
let mut diagnostic = checker.report_diagnostic(
Airflow3Removal {
deprecated: qualified_name.to_string(),
replacement: replacement.clone(),
},
range,
);
let semantic = checker.semantic();
if let Some((module, name)) = match &replacement {
Replacement::AutoImport { module, name } => Some((module, *name)),
Replacement::SourceModuleMoved { module, name } => Some((module, name.as_str())),
_ => None,
} {
if is_guarded_by_try_except(expr, module, name, semantic) {
return;
}
let import_target = name.split('.').next().unwrap_or(name);
diagnostic.try_set_fix(|| {
let (import_edit, _) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, import_target),
expr.start(),
checker.semantic(),
)?;
let replacement_edit = Edit::range_replacement(name.to_string(), range);
Ok(Fix::safe_edits(import_edit, [replacement_edit]))
});
if let Some(fix) = generate_import_edit(expr, checker, module, import_target, range)
.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
{
diagnostic.set_fix(fix);
}
checker.report_diagnostic(diagnostic);
}
/// Check whether a customized Airflow plugin contains removed extensions.
@@ -1028,7 +1027,7 @@ fn check_airflow_plugin_extension(
)
})
}) {
checker.report_diagnostic(Diagnostic::new(
checker.report_diagnostic(
Airflow3Removal {
deprecated: name.to_string(),
replacement: Replacement::Message(
@@ -1036,7 +1035,7 @@ fn check_airflow_plugin_extension(
),
},
expr.range(),
));
);
}
}
}
@@ -1052,7 +1051,11 @@ fn diagnostic_for_argument(
let Some(keyword) = arguments.find_keyword(deprecated) else {
return;
};
let mut diagnostic = Diagnostic::new(
let range = keyword
.arg
.as_ref()
.map_or_else(|| keyword.range(), Ranged::range);
let mut diagnostic = checker.report_diagnostic(
Airflow3Removal {
deprecated: deprecated.to_string(),
replacement: match replacement {
@@ -1060,20 +1063,15 @@ fn diagnostic_for_argument(
None => Replacement::None,
},
},
keyword
.arg
.as_ref()
.map_or_else(|| keyword.range(), Ranged::range),
range,
);
if let Some(replacement) = replacement {
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
replacement.to_string(),
diagnostic.range,
range,
)));
}
checker.report_diagnostic(diagnostic);
}
/// Check whether the symbol is coming from the `secrets` builtin or provider module which ends
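The hunks above converge on two changes: help messages now read "Use `X` from `Y` instead." and fix construction goes through the shared helpers `generate_import_edit` / `generate_remove_and_runtime_import_edit` instead of hand-rolled `try_set_fix` closures. To make the effect of a `Replacement::AutoImport` fix concrete, here is a minimal before/after sketch of the edit those helpers produce, using the `DatasetManager` -> `AssetManager` replacement that appears in the snapshots below (the snippet is illustrative, not taken verbatim from the test fixtures):

```python
# Before: AIR301 flags the name removed in Airflow 3.0.
from airflow.datasets.manager import DatasetManager

dm = DatasetManager()

# After the safe fix: the symbol is imported from its new home and the
# reference is rewritten, matching the new help text
# "Use `AssetManager` from `airflow.assets.manager` instead."
from airflow.assets.manager import AssetManager

dm = AssetManager()
```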

View File

@@ -1,15 +1,16 @@
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{ProviderReplacement, is_guarded_by_try_except};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use crate::checkers::ast::Checker;
use crate::rules::airflow::helpers::{
ProviderReplacement, generate_import_edit, generate_remove_and_runtime_import_edit,
is_guarded_by_try_except,
};
use crate::{FixAvailability, Violation};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::name::QualifiedName;
use ruff_python_ast::{Expr, ExprAttribute};
use ruff_python_semantic::Modules;
use ruff_text_size::Ranged;
use ruff_text_size::TextRange;
use crate::checkers::ast::Checker;
/// ## What it does
/// Checks for uses of Airflow functions and values that have been moved to its providers
/// but still have a compatibility layer (e.g., `apache-airflow-providers-standard`).
@@ -30,12 +31,12 @@ use crate::checkers::ast::Checker;
/// from airflow.providers.standard.operators.python import PythonOperator
/// ```
#[derive(ViolationMetadata)]
pub(crate) struct Airflow3SuggestedToMoveToProvider {
deprecated: String,
pub(crate) struct Airflow3SuggestedToMoveToProvider<'a> {
deprecated: QualifiedName<'a>,
replacement: ProviderReplacement,
}
impl Violation for Airflow3SuggestedToMoveToProvider {
impl Violation for Airflow3SuggestedToMoveToProvider<'_> {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
#[derive_message_formats]
fn message(&self) -> String {
@@ -77,7 +78,7 @@ impl Violation for Airflow3SuggestedToMoveToProvider {
provider,
version,
} => Some(format!(
"Install `apache-airflow-providers-{provider}>={version}` and use `{module}.{name}` instead."
"Install `apache-airflow-providers-{provider}>={version}` and use `{name}` from `{module}` instead."
)),
ProviderReplacement::SourceModuleMovedToProvider {
module,
@@ -85,7 +86,7 @@ impl Violation for Airflow3SuggestedToMoveToProvider {
provider,
version,
} => Some(format!(
"Install `apache-airflow-providers-{provider}>={version}` and use `{module}.{name}` instead."
"Install `apache-airflow-providers-{provider}>={version}` and use `{name}` from `{module}` instead."
)),
}
}
@@ -281,35 +282,37 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
_ => return,
};
let mut diagnostic = Diagnostic::new(
Airflow3SuggestedToMoveToProvider {
deprecated: qualified_name.to_string(),
replacement: replacement.clone(),
},
ranged.range(),
);
let semantic = checker.semantic();
if let Some((module, name)) = match &replacement {
ProviderReplacement::AutoImport { module, name, .. } => Some((module, *name)),
let (module, name) = match &replacement {
ProviderReplacement::AutoImport { module, name, .. } => (module, *name),
ProviderReplacement::SourceModuleMovedToProvider { module, name, .. } => {
Some((module, name.as_str()))
(module, name.as_str())
}
ProviderReplacement::None => None,
} {
if is_guarded_by_try_except(expr, module, name, semantic) {
ProviderReplacement::None => {
checker.report_diagnostic(
Airflow3SuggestedToMoveToProvider {
deprecated: qualified_name,
replacement: replacement.clone(),
},
ranged.range(),
);
return;
}
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, name),
expr.start(),
checker.semantic(),
)?;
let replacement_edit = Edit::range_replacement(binding, ranged.range());
Ok(Fix::safe_edits(import_edit, [replacement_edit]))
});
}
};
checker.report_diagnostic(diagnostic);
if is_guarded_by_try_except(expr, module, name, checker.semantic()) {
return;
}
let mut diagnostic = checker.report_diagnostic(
Airflow3SuggestedToMoveToProvider {
deprecated: qualified_name,
replacement: replacement.clone(),
},
ranged,
);
if let Some(fix) = generate_import_edit(expr, checker, module, name, ranged)
.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
{
diagnostic.set_fix(fix);
}
}
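The restructured control flow above bails out before reporting a fix when `is_guarded_by_try_except` matches, so a compatibility shim like the following would be left alone. This is a hypothetical DAG snippet, assuming the provider package may be absent; the provider path is the one used in the rule's own documentation:

```python
# Hypothetical compatibility shim: the provider import is attempted first,
# so the fallback to the old location is treated as guarded and not rewritten.
try:
    from airflow.providers.standard.operators.python import PythonOperator
except ImportError:
    from airflow.operators.python import PythonOperator
```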

View File

@@ -1,9 +1,9 @@
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{Replacement, is_airflow_builtin_or_provider};
use crate::rules::airflow::helpers::{
Replacement, is_airflow_builtin_or_provider, is_guarded_by_try_except,
generate_import_edit, generate_remove_and_runtime_import_edit, is_guarded_by_try_except,
};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use crate::{Edit, Fix, FixAvailability, Violation};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::{Arguments, Expr, ExprAttribute, ExprCall, ExprName, name::QualifiedName};
use ruff_python_semantic::Modules;
@@ -72,10 +72,10 @@ impl Violation for Airflow3SuggestedUpdate {
Replacement::AttrName(name) => Some(format!("Use `{name}` instead")),
Replacement::Message(message) => Some((*message).to_string()),
Replacement::AutoImport { module, name } => {
Some(format!("Use `{module}.{name}` instead"))
Some(format!("Use `{name}` from `{module}` instead."))
}
Replacement::SourceModuleMoved { module, name } => {
Some(format!("Use `{module}.{name}` instead"))
Some(format!("Use `{name}` from `{module}` instead."))
}
}
}
@@ -120,7 +120,11 @@ fn diagnostic_for_argument(
let Some(keyword) = arguments.find_keyword(deprecated) else {
return;
};
let mut diagnostic = Diagnostic::new(
let range = keyword
.arg
.as_ref()
.map_or_else(|| keyword.range(), Ranged::range);
let mut diagnostic = checker.report_diagnostic(
Airflow3SuggestedUpdate {
deprecated: deprecated.to_string(),
replacement: match replacement {
@@ -128,20 +132,15 @@ fn diagnostic_for_argument(
None => Replacement::None,
},
},
keyword
.arg
.as_ref()
.map_or_else(|| keyword.range(), Ranged::range),
range,
);
if let Some(replacement) = replacement {
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
replacement.to_string(),
diagnostic.range,
range,
)));
}
checker.report_diagnostic(diagnostic);
}
/// Check whether a removed Airflow argument is passed.
///
@@ -212,7 +211,7 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
name: "AssetAny",
},
"expand_alias_to_datasets" => Replacement::AutoImport {
module: "airflow.sdk",
module: "airflow.models.asset",
name: "expand_alias_to_assets",
},
_ => return,
@@ -257,7 +256,7 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
name: (*rest).to_string(),
},
["airflow", "models", "baseoperatorlink", "BaseOperatorLink"] => Replacement::AutoImport {
module: "airflow.sdk.definitions.baseoperatorlink",
module: "airflow.sdk",
name: "BaseOperatorLink",
},
// airflow.models.DAG
@@ -284,33 +283,34 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
_ => return,
};
let mut diagnostic = Diagnostic::new(
let (module, name) = match &replacement {
Replacement::AutoImport { module, name } => (module, *name),
Replacement::SourceModuleMoved { module, name } => (module, name.as_str()),
_ => {
checker.report_diagnostic(
Airflow3SuggestedUpdate {
deprecated: qualified_name.to_string(),
replacement: replacement.clone(),
},
range,
);
return;
}
};
if is_guarded_by_try_except(expr, module, name, checker.semantic()) {
return;
}
let mut diagnostic = checker.report_diagnostic(
Airflow3SuggestedUpdate {
deprecated: qualified_name.to_string(),
replacement: replacement.clone(),
},
range,
);
let semantic = checker.semantic();
if let Some((module, name)) = match &replacement {
Replacement::AutoImport { module, name } => Some((module, *name)),
Replacement::SourceModuleMoved { module, name } => Some((module, name.as_str())),
_ => None,
} {
if is_guarded_by_try_except(expr, module, name, semantic) {
return;
}
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, name),
expr.start(),
checker.semantic(),
)?;
let replacement_edit = Edit::range_replacement(binding, range);
Ok(Fix::safe_edits(import_edit, [replacement_edit]))
});
if let Some(fix) = generate_import_edit(expr, checker, module, name, range)
.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
{
diagnostic.set_fix(fix);
}
checker.report_diagnostic(diagnostic);
}
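Besides the reporting refactor, the hunk above corrects two replacement targets; for example, `BaseOperatorLink` now points at `airflow.sdk` rather than `airflow.sdk.definitions.baseoperatorlink`. In user terms, a minimal sketch of the rewrite the `Airflow3SuggestedUpdate` rule now suggests:

```python
# Before: deprecated location flagged by the rule.
from airflow.models.baseoperatorlink import BaseOperatorLink

# After: the corrected replacement target from this diff.
from airflow.sdk import BaseOperatorLink
```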

View File

@@ -1,4 +1,4 @@
use ruff_diagnostics::{Diagnostic, Violation};
use crate::Violation;
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast as ast;
use ruff_python_ast::Expr;
@@ -110,11 +110,10 @@ pub(crate) fn variable_name_task_id(checker: &Checker, targets: &[Expr], value:
return;
}
let diagnostic = Diagnostic::new(
checker.report_diagnostic(
AirflowVariableNameTaskIdMismatch {
task_id: task_id.to_string(),
},
target.range(),
);
checker.report_diagnostic(diagnostic);
}
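The simplification above only changes how the diagnostic is reported; the rule itself still checks that the variable an operator is assigned to matches its `task_id`. A minimal illustration (the operator import is hypothetical, chosen only to make the snippet self-contained):

```python
from airflow.operators.empty import EmptyOperator  # hypothetical import, for illustration

incorrect_name = EmptyOperator(task_id="my_task")  # flagged: variable name != task_id
my_task = EmptyOperator(task_id="my_task")         # ok: names match
```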

View File

@@ -171,7 +171,7 @@ AIR301_class_attribute.py:42:6: AIR301 [*] `airflow.datasets.manager.DatasetMana
43 | dm.register_dataset_change()
44 | dm.create_datasets()
|
= help: Use `airflow.assets.manager.AssetManager` instead
= help: Use `AssetManager` from `airflow.assets.manager` instead.
Safe fix
19 19 | from airflow.providers_manager import ProvidersManager
@@ -302,7 +302,7 @@ AIR301_class_attribute.py:50:11: AIR301 [*] `airflow.lineage.hook.DatasetLineage
| ^^^^^^^^^^^^^^^^^^ AIR301
51 | dl_info.dataset
|
= help: Use `airflow.lineage.hook.AssetLineageInfo` instead
= help: Use `AssetLineageInfo` from `airflow.lineage.hook` instead.
Safe fix
9 9 | DatasetAny,

View File

@@ -1,651 +1,294 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR301_names.py:53:1: AIR301 `airflow.PY36` is removed in Airflow 3.0
AIR301_names.py:38:1: AIR301 `airflow.PY36` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:7: AIR301 `airflow.PY37` is removed in Airflow 3.0
AIR301_names.py:38:7: AIR301 `airflow.PY37` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:13: AIR301 `airflow.PY38` is removed in Airflow 3.0
AIR301_names.py:38:13: AIR301 `airflow.PY38` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:19: AIR301 `airflow.PY39` is removed in Airflow 3.0
AIR301_names.py:38:19: AIR301 `airflow.PY39` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:25: AIR301 `airflow.PY310` is removed in Airflow 3.0
AIR301_names.py:38:25: AIR301 `airflow.PY310` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:32: AIR301 `airflow.PY311` is removed in Airflow 3.0
AIR301_names.py:38:32: AIR301 `airflow.PY311` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:39: AIR301 `airflow.PY312` is removed in Airflow 3.0
AIR301_names.py:38:39: AIR301 `airflow.PY312` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:56:1: AIR301 `airflow.api_connexion.security.requires_access` is removed in Airflow 3.0
AIR301_names.py:41:1: AIR301 `airflow.api_connexion.security.requires_access` is removed in Airflow 3.0
|
55 | # airflow.api_connexion.security
56 | requires_access
40 | # airflow.api_connexion.security
41 | requires_access
| ^^^^^^^^^^^^^^^ AIR301
42 |
43 | # airflow.contrib.*
|
= help: Use `airflow.api_fastapi.core_api.security.requires_access_*` instead
AIR301_names.py:60:1: AIR301 [*] `airflow.configuration.get` is removed in Airflow 3.0
AIR301_names.py:44:1: AIR301 `airflow.contrib.aws_athena_hook.AWSAthenaHook` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^ AIR301
|
= help: Use `airflow.configuration.conf.get` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+conf.get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:6: AIR301 [*] `airflow.configuration.getboolean` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^ AIR301
|
= help: Use `airflow.configuration.conf.getboolean` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, conf.getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:18: AIR301 [*] `airflow.configuration.getfloat` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^ AIR301
|
= help: Use `airflow.configuration.conf.getfloat` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, conf.getfloat, getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:28: AIR301 [*] `airflow.configuration.getint` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^ AIR301
|
= help: Use `airflow.configuration.conf.getint` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, conf.getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:36: AIR301 [*] `airflow.configuration.has_option` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^ AIR301
|
= help: Use `airflow.configuration.conf.has_option` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, conf.has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:48: AIR301 [*] `airflow.configuration.remove_option` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^^^^ AIR301
|
= help: Use `airflow.configuration.conf.remove_option` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, has_option, conf.remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:63: AIR301 [*] `airflow.configuration.as_dict` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^ AIR301
|
= help: Use `airflow.configuration.conf.as_dict` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, has_option, remove_option, conf.as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:72: AIR301 [*] `airflow.configuration.set` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^ AIR301
|
= help: Use `airflow.configuration.conf.set` instead
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, has_option, remove_option, as_dict, conf.set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:64:1: AIR301 `airflow.contrib.aws_athena_hook.AWSAthenaHook` is removed in Airflow 3.0
|
63 | # airflow.contrib.*
64 | AWSAthenaHook()
43 | # airflow.contrib.*
44 | AWSAthenaHook()
| ^^^^^^^^^^^^^ AIR301
|
= help: The whole `airflow.contrib` module has been removed.
AIR301_names.py:68:1: AIR301 `airflow.datasets.DatasetAliasEvent` is removed in Airflow 3.0
AIR301_names.py:48:1: AIR301 `airflow.datasets.DatasetAliasEvent` is removed in Airflow 3.0
|
67 | # airflow.datasets
68 | DatasetAliasEvent()
47 | # airflow.datasets
48 | DatasetAliasEvent()
| ^^^^^^^^^^^^^^^^^ AIR301
|
AIR301_names.py:72:1: AIR301 `airflow.hooks.base_hook.BaseHook` is removed in Airflow 3.0
AIR301_names.py:52:1: AIR301 `airflow.operators.subdag.SubDagOperator` is removed in Airflow 3.0
|
71 | # airflow.hooks
72 | BaseHook()
| ^^^^^^^^ AIR301
|
= help: Use `airflow.hooks.base.BaseHook` instead
AIR301_names.py:76:1: AIR301 `airflow.operators.subdag.SubDagOperator` is removed in Airflow 3.0
|
75 | # airflow.operators.subdag.*
76 | SubDagOperator()
51 | # airflow.operators.subdag.*
52 | SubDagOperator()
| ^^^^^^^^^^^^^^ AIR301
|
= help: The whole `airflow.subdag` module has been removed.
AIR301_names.py:85:1: AIR301 `airflow.sensors.base_sensor_operator.BaseSensorOperator` is removed in Airflow 3.0
AIR301_names.py:61:1: AIR301 `airflow.triggers.external_task.TaskStateTrigger` is removed in Airflow 3.0
|
84 | # airflow.sensors.base_sensor_operator
85 | BaseSensorOperator()
| ^^^^^^^^^^^^^^^^^^ AIR301
|
= help: Use `airflow.sdk.bases.sensor.BaseSensorOperator` instead
AIR301_names.py:89:1: AIR301 `airflow.triggers.external_task.TaskStateTrigger` is removed in Airflow 3.0
|
88 | # airflow.triggers.external_task
89 | TaskStateTrigger()
60 | # airflow.triggers.external_task
61 | TaskStateTrigger()
| ^^^^^^^^^^^^^^^^ AIR301
90 |
91 | # airflow.utils.date
62 |
63 | # airflow.utils.date
|
AIR301_names.py:92:1: AIR301 `airflow.utils.dates.date_range` is removed in Airflow 3.0
AIR301_names.py:64:1: AIR301 `airflow.utils.dates.date_range` is removed in Airflow 3.0
|
91 | # airflow.utils.date
92 | dates.date_range
63 | # airflow.utils.date
64 | dates.date_range
| ^^^^^^^^^^^^^^^^ AIR301
93 | dates.days_ago
65 | dates.days_ago
|
AIR301_names.py:93:1: AIR301 `airflow.utils.dates.days_ago` is removed in Airflow 3.0
AIR301_names.py:65:1: AIR301 `airflow.utils.dates.days_ago` is removed in Airflow 3.0
|
91 | # airflow.utils.date
92 | dates.date_range
93 | dates.days_ago
63 | # airflow.utils.date
64 | dates.date_range
65 | dates.days_ago
| ^^^^^^^^^^^^^^ AIR301
94 |
95 | date_range
66 |
67 | date_range
|
= help: Use `pendulum.today('UTC').add(days=-N, ...)` instead
AIR301_names.py:95:1: AIR301 `airflow.utils.dates.date_range` is removed in Airflow 3.0
AIR301_names.py:67:1: AIR301 `airflow.utils.dates.date_range` is removed in Airflow 3.0
|
93 | dates.days_ago
94 |
95 | date_range
65 | dates.days_ago
66 |
67 | date_range
| ^^^^^^^^^^ AIR301
96 | days_ago
97 | infer_time_unit
68 | days_ago
69 | infer_time_unit
|
AIR301_names.py:96:1: AIR301 `airflow.utils.dates.days_ago` is removed in Airflow 3.0
AIR301_names.py:68:1: AIR301 `airflow.utils.dates.days_ago` is removed in Airflow 3.0
|
95 | date_range
96 | days_ago
67 | date_range
68 | days_ago
| ^^^^^^^^ AIR301
97 | infer_time_unit
98 | parse_execution_date
69 | infer_time_unit
70 | parse_execution_date
|
= help: Use `pendulum.today('UTC').add(days=-N, ...)` instead
AIR301_names.py:97:1: AIR301 `airflow.utils.dates.infer_time_unit` is removed in Airflow 3.0
AIR301_names.py:69:1: AIR301 `airflow.utils.dates.infer_time_unit` is removed in Airflow 3.0
|
95 | date_range
96 | days_ago
97 | infer_time_unit
67 | date_range
68 | days_ago
69 | infer_time_unit
| ^^^^^^^^^^^^^^^ AIR301
98 | parse_execution_date
99 | round_time
70 | parse_execution_date
71 | round_time
|
AIR301_names.py:98:1: AIR301 `airflow.utils.dates.parse_execution_date` is removed in Airflow 3.0
|
96 | days_ago
97 | infer_time_unit
98 | parse_execution_date
| ^^^^^^^^^^^^^^^^^^^^ AIR301
99 | round_time
100 | scale_time_units
|
AIR301_names.py:70:1: AIR301 `airflow.utils.dates.parse_execution_date` is removed in Airflow 3.0
|
68 | days_ago
69 | infer_time_unit
70 | parse_execution_date
| ^^^^^^^^^^^^^^^^^^^^ AIR301
71 | round_time
72 | scale_time_units
|
AIR301_names.py:99:1: AIR301 `airflow.utils.dates.round_time` is removed in Airflow 3.0
AIR301_names.py:71:1: AIR301 `airflow.utils.dates.round_time` is removed in Airflow 3.0
|
69 | infer_time_unit
70 | parse_execution_date
71 | round_time
| ^^^^^^^^^^ AIR301
72 | scale_time_units
|
AIR301_names.py:72:1: AIR301 `airflow.utils.dates.scale_time_units` is removed in Airflow 3.0
|
70 | parse_execution_date
71 | round_time
72 | scale_time_units
| ^^^^^^^^^^^^^^^^ AIR301
73 |
74 | # This one was not deprecated.
|
AIR301_names.py:79:1: AIR301 `airflow.utils.dag_cycle_tester.test_cycle` is removed in Airflow 3.0
|
78 | # airflow.utils.dag_cycle_tester
79 | test_cycle
| ^^^^^^^^^^ AIR301
|
AIR301_names.py:83:1: AIR301 `airflow.utils.db.create_session` is removed in Airflow 3.0
|
82 | # airflow.utils.db
83 | create_session
| ^^^^^^^^^^^^^^ AIR301
84 |
85 | # airflow.utils.decorators
|
AIR301_names.py:86:1: AIR301 `airflow.utils.decorators.apply_defaults` is removed in Airflow 3.0
|
85 | # airflow.utils.decorators
86 | apply_defaults
| ^^^^^^^^^^^^^^ AIR301
87 |
88 | # airflow.utils.file
|
= help: `apply_defaults` is now unconditionally done and can be safely removed.
AIR301_names.py:89:1: AIR301 `airflow.utils.file.mkdirs` is removed in Airflow 3.0
|
88 | # airflow.utils.file
89 | mkdirs
| ^^^^^^ AIR301
|
= help: Use `pathlib.Path({path}).mkdir` instead
AIR301_names.py:93:1: AIR301 `airflow.utils.state.SHUTDOWN` is removed in Airflow 3.0
|
92 | # airflow.utils.state
93 | SHUTDOWN
| ^^^^^^^^ AIR301
94 | terminating_states
|
AIR301_names.py:94:1: AIR301 `airflow.utils.state.terminating_states` is removed in Airflow 3.0
|
92 | # airflow.utils.state
93 | SHUTDOWN
94 | terminating_states
| ^^^^^^^^^^^^^^^^^^ AIR301
95 |
96 | # airflow.utils.trigger_rule
|
AIR301_names.py:97:1: AIR301 `airflow.utils.trigger_rule.TriggerRule.DUMMY` is removed in Airflow 3.0
|
96 | # airflow.utils.trigger_rule
97 | TriggerRule.DUMMY
| ^^^^^^^^^^^^^^^^^ AIR301
98 | TriggerRule.NONE_FAILED_OR_SKIPPED
|
AIR301_names.py:98:1: AIR301 `airflow.utils.trigger_rule.TriggerRule.NONE_FAILED_OR_SKIPPED` is removed in Airflow 3.0
|
96 | # airflow.utils.trigger_rule
97 | TriggerRule.DUMMY
98 | TriggerRule.NONE_FAILED_OR_SKIPPED
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
|
AIR301_names.py:102:1: AIR301 `airflow.www.auth.has_access` is removed in Airflow 3.0
|
97 | infer_time_unit
98 | parse_execution_date
99 | round_time
101 | # airflow.www.auth
102 | has_access
| ^^^^^^^^^^ AIR301
100 | scale_time_units
103 | has_access_dataset
|
AIR301_names.py:100:1: AIR301 `airflow.utils.dates.scale_time_units` is removed in Airflow 3.0
AIR301_names.py:103:1: AIR301 `airflow.www.auth.has_access_dataset` is removed in Airflow 3.0
|
98 | parse_execution_date
99 | round_time
100 | scale_time_units
| ^^^^^^^^^^^^^^^^ AIR301
101 |
102 | # This one was not deprecated.
|
AIR301_names.py:107:1: AIR301 `airflow.utils.dag_cycle_tester.test_cycle` is removed in Airflow 3.0
|
106 | # airflow.utils.dag_cycle_tester
107 | test_cycle
| ^^^^^^^^^^ AIR301
|
AIR301_names.py:111:1: AIR301 `airflow.utils.db.create_session` is removed in Airflow 3.0
|
110 | # airflow.utils.db
111 | create_session
| ^^^^^^^^^^^^^^ AIR301
112 |
113 | # airflow.utils.decorators
|
AIR301_names.py:114:1: AIR301 `airflow.utils.decorators.apply_defaults` is removed in Airflow 3.0
|
113 | # airflow.utils.decorators
114 | apply_defaults
| ^^^^^^^^^^^^^^ AIR301
115 |
116 | # airflow.utils.file
|
= help: `apply_defaults` is now unconditionally done and can be safely removed.
AIR301_names.py:117:1: AIR301 `airflow.utils.file.TemporaryDirectory` is removed in Airflow 3.0
|
116 | # airflow.utils.file
117 | TemporaryDirectory()
101 | # airflow.www.auth
102 | has_access
103 | has_access_dataset
| ^^^^^^^^^^^^^^^^^^ AIR301
118 | mkdirs
|
= help: Use `tempfile.TemporaryDirectory` instead
AIR301_names.py:118:1: AIR301 `airflow.utils.file.mkdirs` is removed in Airflow 3.0
|
116 | # airflow.utils.file
117 | TemporaryDirectory()
118 | mkdirs
| ^^^^^^ AIR301
119 |
120 | # airflow.utils.helpers
|
= help: Use `pathlib.Path({path}).mkdir` instead
AIR301_names.py:121:1: AIR301 [*] `airflow.utils.helpers.chain` is removed in Airflow 3.0
|
120 | # airflow.utils.helpers
121 | helper_chain
| ^^^^^^^^^^^^ AIR301
122 | helper_cross_downstream
|
= help: Use `airflow.sdk.chain` instead
Safe fix
48 48 | from airflow.utils.trigger_rule import TriggerRule
49 49 | from airflow.www.auth import has_access
50 50 | from airflow.www.utils import get_sensitive_variables_fields, should_hide_value_for_key
51 |+from airflow.sdk import chain
51 52 |
52 53 | # airflow root
53 54 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
--------------------------------------------------------------------------------
118 119 | mkdirs
119 120 |
120 121 | # airflow.utils.helpers
121 |-helper_chain
122 |+chain
122 123 | helper_cross_downstream
123 124 |
124 125 | # airflow.utils.log
AIR301_names.py:122:1: AIR301 [*] `airflow.utils.helpers.cross_downstream` is removed in Airflow 3.0
|
120 | # airflow.utils.helpers
121 | helper_chain
122 | helper_cross_downstream
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
123 |
124 | # airflow.utils.log
|
= help: Use `airflow.sdk.cross_downstream` instead
Safe fix
48 48 | from airflow.utils.trigger_rule import TriggerRule
49 49 | from airflow.www.auth import has_access
50 50 | from airflow.www.utils import get_sensitive_variables_fields, should_hide_value_for_key
51 |+from airflow.sdk import cross_downstream
51 52 |
52 53 | # airflow root
53 54 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
--------------------------------------------------------------------------------
119 120 |
120 121 | # airflow.utils.helpers
121 122 | helper_chain
122 |-helper_cross_downstream
123 |+cross_downstream
123 124 |
124 125 | # airflow.utils.log
125 126 | secrets_masker
AIR301_names.py:125:1: AIR301 `airflow.utils.log.secrets_masker` is removed in Airflow 3.0
|
124 | # airflow.utils.log
125 | secrets_masker
| ^^^^^^^^^^^^^^ AIR301
126 |
127 | # airflow.utils.state
|
= help: Use `airflow.sdk.execution_time.secrets_masker` instead
AIR301_names.py:128:1: AIR301 `airflow.utils.state.SHUTDOWN` is removed in Airflow 3.0
|
127 | # airflow.utils.state
128 | SHUTDOWN
| ^^^^^^^^ AIR301
129 | terminating_states
104 |
105 | # airflow.www.utils
|
AIR301_names.py:129:1: AIR301 `airflow.utils.state.terminating_states` is removed in Airflow 3.0
AIR301_names.py:106:1: AIR301 `airflow.www.utils.get_sensitive_variables_fields` is removed in Airflow 3.0
|
127 | # airflow.utils.state
128 | SHUTDOWN
129 | terminating_states
| ^^^^^^^^^^^^^^^^^^ AIR301
130 |
131 | # airflow.utils.trigger_rule
|
AIR301_names.py:132:1: AIR301 `airflow.utils.trigger_rule.TriggerRule.DUMMY` is removed in Airflow 3.0
|
131 | # airflow.utils.trigger_rule
132 | TriggerRule.DUMMY
| ^^^^^^^^^^^^^^^^^ AIR301
133 | TriggerRule.NONE_FAILED_OR_SKIPPED
|
AIR301_names.py:133:1: AIR301 `airflow.utils.trigger_rule.TriggerRule.NONE_FAILED_OR_SKIPPED` is removed in Airflow 3.0
|
131 | # airflow.utils.trigger_rule
132 | TriggerRule.DUMMY
133 | TriggerRule.NONE_FAILED_OR_SKIPPED
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
|
AIR301_names.py:137:1: AIR301 `airflow.www.auth.has_access` is removed in Airflow 3.0
|
136 | # airflow.www.auth
137 | has_access
| ^^^^^^^^^^ AIR301
138 |
139 | # airflow.www.utils
|
AIR301_names.py:140:1: AIR301 `airflow.www.utils.get_sensitive_variables_fields` is removed in Airflow 3.0
|
139 | # airflow.www.utils
140 | get_sensitive_variables_fields
105 | # airflow.www.utils
106 | get_sensitive_variables_fields
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
141 | should_hide_value_for_key
107 | should_hide_value_for_key
|
AIR301_names.py:141:1: AIR301 `airflow.www.utils.should_hide_value_for_key` is removed in Airflow 3.0
AIR301_names.py:107:1: AIR301 `airflow.www.utils.should_hide_value_for_key` is removed in Airflow 3.0
|
139 | # airflow.www.utils
140 | get_sensitive_variables_fields
141 | should_hide_value_for_key
105 | # airflow.www.utils
106 | get_sensitive_variables_fields
107 | should_hide_value_for_key
| ^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
142 |
143 | # airflow.operators.python
|
AIR301_names.py:146:1: AIR301 `airflow.operators.python.get_current_context` is removed in Airflow 3.0
|
144 | from airflow.operators.python import get_current_context
145 |
146 | get_current_context()
| ^^^^^^^^^^^^^^^^^^^ AIR301
147 |
148 | # airflow.providers.mysql
|
= help: Use `airflow.sdk.get_current_context` instead
AIR301_names.py:151:1: AIR301 `airflow.providers.mysql.datasets.mysql.sanitize_uri` is removed in Airflow 3.0
|
149 | from airflow.providers.mysql.datasets.mysql import sanitize_uri
150 |
151 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
152 |
153 | # airflow.providers.postgres
|
= help: Use `airflow.providers.mysql.assets.mysql.sanitize_uri` instead
AIR301_names.py:156:1: AIR301 `airflow.providers.postgres.datasets.postgres.sanitize_uri` is removed in Airflow 3.0
|
154 | from airflow.providers.postgres.datasets.postgres import sanitize_uri
155 |
156 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
157 |
158 | # airflow.providers.trino
|
= help: Use `airflow.providers.postgres.assets.postgres.sanitize_uri` instead
AIR301_names.py:161:1: AIR301 `airflow.providers.trino.datasets.trino.sanitize_uri` is removed in Airflow 3.0
|
159 | from airflow.providers.trino.datasets.trino import sanitize_uri
160 |
161 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
162 |
163 | # airflow.notifications.basenotifier
|
= help: Use `airflow.providers.trino.assets.trino.sanitize_uri` instead
AIR301_names.py:166:1: AIR301 `airflow.notifications.basenotifier.BaseNotifier` is removed in Airflow 3.0
|
164 | from airflow.notifications.basenotifier import BaseNotifier
165 |
166 | BaseNotifier()
| ^^^^^^^^^^^^ AIR301
167 |
168 | # airflow.auth.manager
|
= help: Use `airflow.sdk.bases.notifier.BaseNotifier` instead
AIR301_names.py:171:1: AIR301 `airflow.auth.managers.base_auth_manager.BaseAuthManager` is removed in Airflow 3.0
|
169 | from airflow.auth.managers.base_auth_manager import BaseAuthManager
170 |
171 | BaseAuthManager()
| ^^^^^^^^^^^^^^^ AIR301
|
= help: Use `airflow.api_fastapi.auth.managers.base_auth_manager.BaseAuthManager` instead
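Among the new fixes recorded in this snapshot, the `airflow.configuration` accessors gain a safe fix that adds `conf` to the existing import and qualifies each call. Condensed to a single name (paraphrasing the fixture rather than quoting it):

```python
# Before: module-level accessor removed in Airflow 3.0.
from airflow.configuration import get

get("core", "dags_folder")  # hypothetical section/key, for illustration

# After the safe fix: `conf` is added to the import and the call is qualified.
from airflow.configuration import get, conf

conf.get("core", "dags_folder")
```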

View File

@@ -1,216 +1,243 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR301_provider_names_fix.py:25:1: AIR301 [*] `airflow.providers.amazon.aws.auth_manager.avp.entities.AvpEntities.DATASET` is removed in Airflow 3.0
AIR301_provider_names_fix.py:11:1: AIR301 [*] `airflow.providers.amazon.aws.auth_manager.avp.entities.AvpEntities.DATASET` is removed in Airflow 3.0
|
23 | )
24 |
25 | AvpEntities.DATASET
9 | from airflow.security.permissions import RESOURCE_DATASET
10 |
11 | AvpEntities.DATASET
| ^^^^^^^^^^^^^^^^^^^ AIR301
26 |
27 | s3_create_dataset()
12 |
13 | # airflow.providers.openlineage.utils.utils
|
= help: Use `airflow.providers.amazon.aws.auth_manager.avp.entities.AvpEntities.ASSET` instead
= help: Use `AvpEntities.ASSET` from `airflow.providers.amazon.aws.auth_manager.avp.entities` instead.
Safe fix
22 22 | translate_airflow_dataset,
23 23 | )
24 24 |
25 |-AvpEntities.DATASET
25 |+AvpEntities.ASSET
26 26 |
27 27 | s3_create_dataset()
28 28 | s3_convert_dataset_to_openlineage()
8 8 | from airflow.secrets.local_filesystem import load_connections
9 9 | from airflow.security.permissions import RESOURCE_DATASET
10 10 |
11 |-AvpEntities.DATASET
11 |+AvpEntities
12 12 |
13 13 | # airflow.providers.openlineage.utils.utils
14 14 | DatasetInfo()
AIR301_provider_names_fix.py:27:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.create_dataset` is removed in Airflow 3.0
AIR301_provider_names_fix.py:14:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.DatasetInfo` is removed in Airflow 3.0
|
25 | AvpEntities.DATASET
26 |
27 | s3_create_dataset()
| ^^^^^^^^^^^^^^^^^ AIR301
28 | s3_convert_dataset_to_openlineage()
|
= help: Use `airflow.providers.amazon.aws.assets.s3.create_asset` instead
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.amazon.aws.assets.s3 import create_asset
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
27 |-s3_create_dataset()
28 |+create_asset()
28 29 | s3_convert_dataset_to_openlineage()
29 30 |
30 31 | io_create_dataset()
AIR301_provider_names_fix.py:28:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
27 | s3_create_dataset()
28 | s3_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
29 |
30 | io_create_dataset()
|
= help: Use `airflow.providers.amazon.aws.assets.s3.convert_asset_to_openlineage` instead
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.amazon.aws.assets.s3 import convert_asset_to_openlineage
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
27 28 | s3_create_dataset()
28 |-s3_convert_dataset_to_openlineage()
29 |+convert_asset_to_openlineage()
29 30 |
30 31 | io_create_dataset()
31 32 | io_convert_dataset_to_openlineage()
AIR301_provider_names_fix.py:36:1: AIR301 [*] `airflow.providers.google.datasets.bigquery.create_dataset` is removed in Airflow 3.0
|
35 | # airflow.providers.google.datasets.bigquery
36 | bigquery_create_dataset()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
37 | # airflow.providers.google.datasets.gcs
38 | gcs_create_dataset()
|
= help: Use `airflow.providers.google.assets.bigquery.create_asset` instead
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.google.assets.bigquery import create_asset
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
--------------------------------------------------------------------------------
33 34 |
34 35 |
35 36 | # airflow.providers.google.datasets.bigquery
36 |-bigquery_create_dataset()
37 |+create_asset()
37 38 | # airflow.providers.google.datasets.gcs
38 39 | gcs_create_dataset()
39 40 | gcs_convert_dataset_to_openlineage()
AIR301_provider_names_fix.py:38:1: AIR301 [*] `airflow.providers.google.datasets.gcs.create_dataset` is removed in Airflow 3.0
|
36 | bigquery_create_dataset()
37 | # airflow.providers.google.datasets.gcs
38 | gcs_create_dataset()
| ^^^^^^^^^^^^^^^^^^ AIR301
39 | gcs_convert_dataset_to_openlineage()
40 | # airflow.providers.openlineage.utils.utils
|
= help: Use `airflow.providers.google.assets.gcs.create_asset` instead
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.google.assets.gcs import create_asset
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
--------------------------------------------------------------------------------
35 36 | # airflow.providers.google.datasets.bigquery
36 37 | bigquery_create_dataset()
37 38 | # airflow.providers.google.datasets.gcs
38 |-gcs_create_dataset()
39 |+create_asset()
39 40 | gcs_convert_dataset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 42 | DatasetInfo()
AIR301_provider_names_fix.py:39:1: AIR301 [*] `airflow.providers.google.datasets.gcs.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
37 | # airflow.providers.google.datasets.gcs
38 | gcs_create_dataset()
39 | gcs_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
40 | # airflow.providers.openlineage.utils.utils
41 | DatasetInfo()
|
= help: Use `airflow.providers.google.assets.gcs.convert_asset_to_openlineage` instead
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.google.assets.gcs import convert_asset_to_openlineage
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
--------------------------------------------------------------------------------
36 37 | bigquery_create_dataset()
37 38 | # airflow.providers.google.datasets.gcs
38 39 | gcs_create_dataset()
39 |-gcs_convert_dataset_to_openlineage()
40 |+convert_asset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 42 | DatasetInfo()
42 43 | translate_airflow_dataset()
AIR301_provider_names_fix.py:41:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.DatasetInfo` is removed in Airflow 3.0
|
39 | gcs_convert_dataset_to_openlineage()
40 | # airflow.providers.openlineage.utils.utils
41 | DatasetInfo()
13 | # airflow.providers.openlineage.utils.utils
14 | DatasetInfo()
| ^^^^^^^^^^^ AIR301
42 | translate_airflow_dataset()
43 | #
15 | translate_airflow_dataset()
|
= help: Use `airflow.providers.openlineage.utils.utils.AssetInfo` instead
= help: Use `AssetInfo` from `airflow.providers.openlineage.utils.utils` instead.
Safe fix
20 20 | from airflow.providers.openlineage.utils.utils import (
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 |+AssetInfo,
23 24 | )
24 25 |
25 26 | AvpEntities.DATASET
4 4 | from airflow.providers.openlineage.utils.utils import (
5 5 | DatasetInfo,
6 6 | translate_airflow_dataset,
7 |+AssetInfo,
7 8 | )
8 9 | from airflow.secrets.local_filesystem import load_connections
9 10 | from airflow.security.permissions import RESOURCE_DATASET
--------------------------------------------------------------------------------
38 39 | gcs_create_dataset()
39 40 | gcs_convert_dataset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 |-DatasetInfo()
42 |+AssetInfo()
42 43 | translate_airflow_dataset()
43 44 | #
44 45 | # airflow.secrets.local_filesystem
11 12 | AvpEntities.DATASET
12 13 |
13 14 | # airflow.providers.openlineage.utils.utils
14 |-DatasetInfo()
15 |+AssetInfo()
15 16 | translate_airflow_dataset()
16 17 |
17 18 | # airflow.secrets.local_filesystem
AIR301_provider_names_fix.py:42:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.translate_airflow_dataset` is removed in Airflow 3.0
AIR301_provider_names_fix.py:15:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.translate_airflow_dataset` is removed in Airflow 3.0
|
40 | # airflow.providers.openlineage.utils.utils
41 | DatasetInfo()
42 | translate_airflow_dataset()
13 | # airflow.providers.openlineage.utils.utils
14 | DatasetInfo()
15 | translate_airflow_dataset()
| ^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
43 | #
44 | # airflow.secrets.local_filesystem
16 |
17 | # airflow.secrets.local_filesystem
|
= help: Use `airflow.providers.openlineage.utils.utils.translate_airflow_asset` instead
= help: Use `translate_airflow_asset` from `airflow.providers.openlineage.utils.utils` instead.
Safe fix
20 20 | from airflow.providers.openlineage.utils.utils import (
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 |+translate_airflow_asset,
23 24 | )
24 25 |
25 26 | AvpEntities.DATASET
4 4 | from airflow.providers.openlineage.utils.utils import (
5 5 | DatasetInfo,
6 6 | translate_airflow_dataset,
7 |+translate_airflow_asset,
7 8 | )
8 9 | from airflow.secrets.local_filesystem import load_connections
9 10 | from airflow.security.permissions import RESOURCE_DATASET
--------------------------------------------------------------------------------
39 40 | gcs_convert_dataset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 42 | DatasetInfo()
42 |-translate_airflow_dataset()
43 |+translate_airflow_asset()
43 44 | #
44 45 | # airflow.secrets.local_filesystem
45 46 | load_connections()
12 13 |
13 14 | # airflow.providers.openlineage.utils.utils
14 15 | DatasetInfo()
15 |-translate_airflow_dataset()
16 |+translate_airflow_asset()
16 17 |
17 18 | # airflow.secrets.local_filesystem
18 19 | load_connections()
AIR301_provider_names_fix.py:18:1: AIR301 [*] `airflow.secrets.local_filesystem.load_connections` is removed in Airflow 3.0
|
17 | # airflow.secrets.local_filesystem
18 | load_connections()
| ^^^^^^^^^^^^^^^^ AIR301
19 |
20 | # airflow.security.permissions
|
= help: Use `load_connections_dict` from `airflow.secrets.local_filesystem` instead.
Safe fix
5 5 | DatasetInfo,
6 6 | translate_airflow_dataset,
7 7 | )
8 |-from airflow.secrets.local_filesystem import load_connections
8 |+from airflow.secrets.local_filesystem import load_connections, load_connections_dict
9 9 | from airflow.security.permissions import RESOURCE_DATASET
10 10 |
11 11 | AvpEntities.DATASET
--------------------------------------------------------------------------------
15 15 | translate_airflow_dataset()
16 16 |
17 17 | # airflow.secrets.local_filesystem
18 |-load_connections()
18 |+load_connections_dict()
19 19 |
20 20 | # airflow.security.permissions
21 21 | RESOURCE_DATASET
AIR301_provider_names_fix.py:21:1: AIR301 [*] `airflow.security.permissions.RESOURCE_DATASET` is removed in Airflow 3.0
|
20 | # airflow.security.permissions
21 | RESOURCE_DATASET
| ^^^^^^^^^^^^^^^^ AIR301
22 |
23 | from airflow.providers.amazon.aws.datasets.s3 import (
|
= help: Use `RESOURCE_ASSET` from `airflow.security.permissions` instead.
Safe fix
6 6 | translate_airflow_dataset,
7 7 | )
8 8 | from airflow.secrets.local_filesystem import load_connections
9 |-from airflow.security.permissions import RESOURCE_DATASET
9 |+from airflow.security.permissions import RESOURCE_DATASET, RESOURCE_ASSET
10 10 |
11 11 | AvpEntities.DATASET
12 12 |
--------------------------------------------------------------------------------
18 18 | load_connections()
19 19 |
20 20 | # airflow.security.permissions
21 |-RESOURCE_DATASET
21 |+RESOURCE_ASSET
22 22 |
23 23 | from airflow.providers.amazon.aws.datasets.s3 import (
24 24 | convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
AIR301_provider_names_fix.py:28:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.create_dataset` is removed in Airflow 3.0
|
26 | from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
27 |
28 | s3_create_dataset()
| ^^^^^^^^^^^^^^^^^ AIR301
29 | s3_convert_dataset_to_openlineage()
|
= help: Use `create_asset` from `airflow.providers.amazon.aws.assets.s3` instead.
Safe fix
24 24 | convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
25 25 | )
26 26 | from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
27 |+from airflow.providers.amazon.aws.assets.s3 import create_asset
27 28 |
28 |-s3_create_dataset()
29 |+create_asset()
29 30 | s3_convert_dataset_to_openlineage()
30 31 |
31 32 | from airflow.providers.common.io.dataset.file import (
AIR301_provider_names_fix.py:29:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
28 | s3_create_dataset()
29 | s3_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
30 |
31 | from airflow.providers.common.io.dataset.file import (
|
= help: Use `convert_asset_to_openlineage` from `airflow.providers.amazon.aws.assets.s3` instead.
Safe fix
24 24 | convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
25 25 | )
26 26 | from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
27 |+from airflow.providers.amazon.aws.assets.s3 import convert_asset_to_openlineage
27 28 |
28 29 | s3_create_dataset()
29 |-s3_convert_dataset_to_openlineage()
30 |+convert_asset_to_openlineage()
30 31 |
31 32 | from airflow.providers.common.io.dataset.file import (
32 33 | convert_dataset_to_openlineage as io_convert_dataset_to_openlineage,
AIR301_provider_names_fix.py:45:1: AIR301 [*] `airflow.providers.google.datasets.bigquery.create_dataset` is removed in Airflow 3.0
|
43 | )
44 |
45 | bigquery_create_dataset()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
46 |
47 | # airflow.providers.google.datasets.gcs
|
= help: Use `create_asset` from `airflow.providers.google.assets.bigquery` instead.
Safe fix
41 41 | from airflow.providers.google.datasets.bigquery import (
42 42 | create_dataset as bigquery_create_dataset,
43 43 | )
44 |+from airflow.providers.google.assets.bigquery import create_asset
44 45 |
45 |-bigquery_create_dataset()
46 |+create_asset()
46 47 |
47 48 | # airflow.providers.google.datasets.gcs
48 49 | from airflow.providers.google.datasets.gcs import (
AIR301_provider_names_fix.py:53:1: AIR301 [*] `airflow.providers.google.datasets.gcs.create_dataset` is removed in Airflow 3.0
|
51 | from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
52 |
53 | gcs_create_dataset()
| ^^^^^^^^^^^^^^^^^^ AIR301
54 | gcs_convert_dataset_to_openlineage()
|
= help: Use `create_asset` from `airflow.providers.google.assets.gcs` instead.
Safe fix
49 49 | convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
50 50 | )
51 51 | from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
52 |+from airflow.providers.google.assets.gcs import create_asset
52 53 |
53 |-gcs_create_dataset()
54 |+create_asset()
54 55 | gcs_convert_dataset_to_openlineage()
AIR301_provider_names_fix.py:54:1: AIR301 [*] `airflow.providers.google.datasets.gcs.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
53 | gcs_create_dataset()
54 | gcs_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
|
= help: Use `convert_asset_to_openlineage` from `airflow.providers.google.assets.gcs` instead.
Safe fix
49 49 | convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
50 50 | )
51 51 | from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
52 |+from airflow.providers.google.assets.gcs import convert_asset_to_openlineage
52 53 |
53 54 | gcs_create_dataset()
54 |-gcs_convert_dataset_to_openlineage()
55 |+convert_asset_to_openlineage()
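The same snapshot records the updated message and fix for `airflow.secrets.local_filesystem.load_connections`: the safe fix imports `load_connections_dict` alongside the old name and rewrites the call site, as sketched below:

```python
# Before: removed in Airflow 3.0.
from airflow.secrets.local_filesystem import load_connections

load_connections()

# After the safe fix, as recorded above: the replacement is imported
# next to the old name and the call is rewritten.
from airflow.secrets.local_filesystem import load_connections, load_connections_dict

load_connections_dict()
```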

View File

@@ -1,113 +1,252 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_amazon.py:23:1: AIR302 `airflow.hooks.S3_hook.S3Hook` is moved into `amazon` provider in Airflow 3.0;
AIR302_amazon.py:14:1: AIR302 [*] `airflow.hooks.S3_hook.S3Hook` is moved into `amazon` provider in Airflow 3.0;
|
21 | from airflow.sensors.s3_key_sensor import S3KeySensor
22 |
23 | S3Hook()
12 | from airflow.sensors.s3_key_sensor import S3KeySensor
13 |
14 | S3Hook()
| ^^^^^^ AIR302
24 | provide_bucket_name()
15 | provide_bucket_name()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `airflow.providers.amazon.aws.hooks.s3.S3Hook` instead.
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3Hook` from `airflow.providers.amazon.aws.hooks.s3` instead.
AIR302_amazon.py:24:1: AIR302 `airflow.hooks.S3_hook.provide_bucket_name` is moved into `amazon` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.hooks.S3_hook import (
4 |- S3Hook,
5 4 | provide_bucket_name,
6 5 | )
7 6 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
--------------------------------------------------------------------------------
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.hooks.s3 import S3Hook
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:15:1: AIR302 [*] `airflow.hooks.S3_hook.provide_bucket_name` is moved into `amazon` provider in Airflow 3.0;
|
23 | S3Hook()
24 | provide_bucket_name()
14 | S3Hook()
15 | provide_bucket_name()
| ^^^^^^^^^^^^^^^^^^^ AIR302
25 |
26 | GCSToS3Operator()
16 |
17 | GCSToS3Operator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `airflow.providers.amazon.aws.hooks.s3.provide_bucket_name` instead.
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `provide_bucket_name` from `airflow.providers.amazon.aws.hooks.s3` instead.
AIR302_amazon.py:26:1: AIR302 `airflow.operators.gcs_to_s3.GCSToS3Operator` is moved into `amazon` provider in Airflow 3.0;
Unsafe fix
2 2 |
3 3 | from airflow.hooks.S3_hook import (
4 4 | S3Hook,
5 |- provide_bucket_name,
6 5 | )
7 6 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 7 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
--------------------------------------------------------------------------------
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.hooks.s3 import provide_bucket_name
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:17:1: AIR302 [*] `airflow.operators.gcs_to_s3.GCSToS3Operator` is moved into `amazon` provider in Airflow 3.0;
|
24 | provide_bucket_name()
25 |
26 | GCSToS3Operator()
15 | provide_bucket_name()
16 |
17 | GCSToS3Operator()
| ^^^^^^^^^^^^^^^ AIR302
27 |
28 | GoogleApiToS3Operator()
18 | GoogleApiToS3Operator()
19 | RedshiftToS3Operator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `airflow.providers.amazon.aws.transfers.gcs_to_s3.GCSToS3Operator` instead.
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `GCSToS3Operator` from `airflow.providers.amazon.aws.transfers.gcs_to_s3` instead.
AIR302_amazon.py:28:1: AIR302 `airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Operator` is moved into `amazon` provider in Airflow 3.0;
Unsafe fix
4 4 | S3Hook,
5 5 | provide_bucket_name,
6 6 | )
7 |-from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 7 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 8 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:18:1: AIR302 [*] `airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Operator` is moved into `amazon` provider in Airflow 3.0;
|
17 | GCSToS3Operator()
18 | GoogleApiToS3Operator()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
19 | RedshiftToS3Operator()
20 | S3FileTransformOperator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `GoogleApiToS3Operator` from `airflow.providers.amazon.aws.transfers.google_api_to_s3` instead.
Unsafe fix
5 5 | provide_bucket_name,
6 6 | )
7 7 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 |-from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 8 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.google_api_to_s3 import GoogleApiToS3Operator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:19:1: AIR302 [*] `airflow.operators.redshift_to_s3_operator.RedshiftToS3Operator` is moved into `amazon` provider in Airflow 3.0;
|
17 | GCSToS3Operator()
18 | GoogleApiToS3Operator()
19 | RedshiftToS3Operator()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
20 | S3FileTransformOperator()
21 | S3ToRedshiftOperator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `RedshiftToS3Operator` from `airflow.providers.amazon.aws.transfers.redshift_to_s3` instead.
Unsafe fix
6 6 | )
7 7 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 8 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 |-from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:20:1: AIR302 [*] `airflow.operators.s3_file_transform_operator.S3FileTransformOperator` is moved into `amazon` provider in Airflow 3.0;
|
18 | GoogleApiToS3Operator()
19 | RedshiftToS3Operator()
20 | S3FileTransformOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR302
21 | S3ToRedshiftOperator()
22 | S3KeySensor()
|
= help: Install `apache-airflow-providers-amazon>=3.0.0` and use `S3FileTransformOperator` from `airflow.providers.amazon.aws.operators.s3` instead.
Unsafe fix
7 7 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 8 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 9 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 |-from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:21:1: AIR302 [*] `airflow.operators.s3_to_redshift_operator.S3ToRedshiftOperator` is moved into `amazon` provider in Airflow 3.0;
|
19 | RedshiftToS3Operator()
20 | S3FileTransformOperator()
21 | S3ToRedshiftOperator()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
22 | S3KeySensor()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3ToRedshiftOperator` from `airflow.providers.amazon.aws.transfers.s3_to_redshift` instead.
Unsafe fix
8 8 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 9 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 10 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 |-from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:22:1: AIR302 [*] `airflow.sensors.s3_key_sensor.S3KeySensor` is moved into `amazon` provider in Airflow 3.0;
|
20 | S3FileTransformOperator()
21 | S3ToRedshiftOperator()
22 | S3KeySensor()
| ^^^^^^^^^^^ AIR302
23 |
24 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3KeySensor` from `airflow.providers.amazon.aws.sensors.s3` instead.
Unsafe fix
9 9 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 10 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 11 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 |-from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:26:1: AIR302 [*] `airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Transfer` is moved into `amazon` provider in Airflow 3.0;
|
24 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
25 |
26 | GoogleApiToS3Transfer()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
27 |
28 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `GoogleApiToS3Operator` from `airflow.providers.amazon.aws.transfers.google_api_to_s3` instead.
Unsafe fix
22 22 | S3KeySensor()
23 23 |
24 24 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
25 |+from airflow.providers.amazon.aws.transfers.google_api_to_s3 import GoogleApiToS3Operator
25 26 |
26 27 | GoogleApiToS3Transfer()
27 28 |
AIR302_amazon.py:30:1: AIR302 [*] `airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer` is moved into `amazon` provider in Airflow 3.0;
|
28 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
29 |
30 | RedshiftToS3Transfer()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
31 |
32 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `RedshiftToS3Operator` from `airflow.providers.amazon.aws.transfers.redshift_to_s3` instead.
Unsafe fix
26 26 | GoogleApiToS3Transfer()
27 27 |
28 28 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
29 |+from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator
29 30 |
30 31 | RedshiftToS3Transfer()
31 32 |
AIR302_amazon.py:34:1: AIR302 [*] `airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer` is moved into `amazon` provider in Airflow 3.0;
|
32 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
33 |
34 | S3ToRedshiftTransfer()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3ToRedshiftOperator` from `airflow.providers.amazon.aws.transfers.s3_to_redshift` instead.
Unsafe fix
30 30 | RedshiftToS3Transfer()
31 31 |
32 32 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
33 |+from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator
33 34 |
34 35 | S3ToRedshiftTransfer()
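
All of the fixes above follow one migration pattern: imports move out of the removed top-level `airflow` modules into the `amazon` provider package, and the deprecated `*Transfer` aliases are replaced by their `*Operator` counterparts. A minimal before/after sketch of the imports involved, assuming the `apache-airflow-providers-amazon` package is installed at the versions named in the help messages (the module paths below are taken directly from the fixes; the grouping is illustrative only):

```python
# Airflow 2.x imports flagged by AIR302 (removed in Airflow 3.0):
# from airflow.hooks.S3_hook import S3Hook, provide_bucket_name
# from airflow.operators.gcs_to_s3 import GCSToS3Operator
# from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
# from airflow.sensors.s3_key_sensor import S3KeySensor

# Provider-package equivalents suggested by the fixes:
from airflow.providers.amazon.aws.hooks.s3 import S3Hook, provide_bucket_name
from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator
from airflow.providers.amazon.aws.transfers.redshift_to_s3 import (
    RedshiftToS3Operator,  # also replaces the removed RedshiftToS3Transfer alias
)
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
```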

Some files were not shown because too many files have changed in this diff.