Compare commits

...

103 Commits

Author SHA1 Message Date
dylwil3
fc01814f65 preview->stable 2025-06-09 08:04:01 -05:00
Dylan
829acf498d [flake8-boolean-trap] Stabilize lint bool supertypes in boolean-type-hint-positional-argument (FBT001) (#18520)
Feel free to complain about the rephrasing in the docs!
2025-06-08 20:22:48 -04:00
Dylan
e07f352f99 [flake8-bandit] Stabilize more trusted inputs in subprocess-without-shell-equals-true (S603) (#18521) 2025-06-08 20:22:48 -04:00
Dylan
8d0b6882b7 [flake8-pyi] Stabilize autofix for future-annotations-in-stub (PYI044) (#18518) 2025-06-08 20:22:48 -04:00
Dylan
65a2daea02 [semantic errors] Stabilize semantic errors (#18523) 2025-06-08 20:22:48 -04:00
Dylan
8baaa2f7f3 [syntax errors] Stabilize version-specific unsupported syntax errors (#18522) 2025-06-08 20:22:48 -04:00
Dylan
8b1ce32f04 [ruff] Stabilize checking for file-level directives in unused-noqa (RUF100) (#18497)
Note that the preview behavior was not documented (shame on us!) so the
documentation was not modified.

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-06-08 20:22:48 -04:00
Dylan
eb5abda8ac [flake8-simplify] Stabilize further simplification to binary expressions in autofix for if-else-block-instead-of-if-exp (SIM108) (#18506) 2025-06-08 20:22:48 -04:00
Brent Westbrook
9c4ecf77b6 [refurb] Stabilize fromisoformat-replace-z (FURB162) (#18510)
This PR stabilizes the FURB162 rule by moving it from preview to stable
status for the 0.12.0 release.

## Summary
- **Rule**: FURB162 (`fromisoformat-replace-z`)
- **Purpose**: Detects unnecessary timezone replacement operations when
calling `datetime.fromisoformat()`
- **Change**: Move from `RuleGroup::Preview` to `RuleGroup::Stable` in
`codes.rs`

## Verification Links
- **Tests**:
[refurb/mod.rs](https://github.com/astral-sh/ruff/blob/main/crates/ruff_linter/src/rules/refurb/mod.rs#L54)
- Confirms FURB162 has only standard tests, no preview-specific test
cases
- **Documentation**:
https://docs.astral.sh/ruff/rules/fromisoformat-replace-z/ - Current
documentation shows preview status that will be automatically updated
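A minimal sketch of the pattern the rule targets (the timestamp value is invented for illustration):

```python
from datetime import datetime

ts = "2025-06-08T20:22:48Z"

# What FURB162 flags: manually replacing the trailing "Z" before parsing.
dt_manual = datetime.fromisoformat(ts.replace("Z", "+00:00"))

# The suggested fix: on Python 3.11+, fromisoformat() accepts "Z" directly,
# so the .replace() call is unnecessary:
#   dt_direct = datetime.fromisoformat(ts)

print(dt_manual.isoformat())
```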
2025-06-08 20:22:48 -04:00
Brent Westbrook
0809d88ca0 [ruff] Stabilize class-with-mixed-type-vars (RUF053) (#18512)
This PR stabilizes the RUF053 rule by moving it from preview to stable
status for the 0.12.0 release.

## Summary
- **Rule**: RUF053 (`class-with-mixed-type-vars`)
- **Purpose**: Detects classes that have both PEP 695 type parameter
lists while also inheriting from `typing.Generic`
- **Change**: Move from `RuleGroup::Preview` to `RuleGroup::Stable` in
`codes.rs` and migrate preview tests to stable tests

## Verification Links
- **Tests**:
[ruff/mod.rs](https://github.com/astral-sh/ruff/blob/main/crates/ruff_linter/src/rules/ruff/mod.rs#L98)
- Shows RUF053 moved from preview_rules to main rules test function
- **Documentation**:
https://docs.astral.sh/ruff/rules/class-with-mixed-type-vars/ - Current
documentation shows preview status that will be automatically updated
2025-06-08 20:22:48 -04:00
Dylan
5c59167686 [ruff] Stabilize checking in presence of slices for collection-literal-concatenation (RUF005) (#18500) 2025-06-08 20:22:48 -04:00
Dylan
e2ea301c74 [refurb] Stabilize fix safety for readlines-in-for (FURB129) (#18496)
Note that the preview behavior was not documented (shame on us!) so the
documentation was not modified.
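For context, a minimal sketch of the pattern FURB129 rewrites (using `io.StringIO` as a stand-in for a real file):

```python
import io

f = io.StringIO("a\nb\nc\n")

# What FURB129 flags: readlines() materializes every line up front.
lines_eager = [line for line in f.readlines()]

# The fix: iterate the file object directly, which reads lazily.
f.seek(0)
lines_lazy = [line for line in f]

print(lines_eager == lines_lazy)
```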

---------

Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-06-08 20:22:44 -04:00
Brent Westbrook
62364ea47e Ruff 0.12
Summary
--

Release branch for Ruff 0.12.0

TODOs
--

- [ ] Drop empty first commit
- [ ] Merge with rebase-merge (**don't squash merge!!!!**)
2025-06-08 20:14:44 -04:00
Charlie Marsh
331821244b Refactor fix in readlines-in-for (#18573)
## Summary

Post-merge feedback from https://github.com/astral-sh/ruff/pull/18542.
2025-06-08 20:10:13 -04:00
Ben Bar-Or
1dc8f8f903 [ty] Add hints to invalid-type-form for common mistakes (#18543)
Co-authored-by: Ben Bar-Or <ben.baror@ridewithvia.com>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-06-09 00:40:05 +01:00
Charlie Marsh
301b9f4135 Add trailing space around readlines (#18542)
Closes https://github.com/astral-sh/ruff/issues/17683.
2025-06-08 12:00:30 -04:00
Micha Reiser
86e5a311f0 [ty] Introduce and use System::env_var for better test isolation (#18538) 2025-06-07 19:56:58 +02:00
Micha Reiser
0c20010bb9 [ty] Split CLI tests into multiple files (#18537) 2025-06-07 16:43:28 +00:00
Alex Waygood
72552f31e4 [ty] Fix panic when pulling types for UnaryOp expressions inside Literal slices (#18536) 2025-06-07 15:26:10 +00:00
Alex Waygood
95497ffaab [ty] Fix panic when trying to pull types for attribute expressions inside Literal type expressions (#18535) 2025-06-07 15:59:12 +01:00
Micha Reiser
b3b900dc1e Treat ty: comments as pragma comments (#18532)
## Summary

Add support for ty's `ty:` pragma comments to ruff's formatter and E501

Fixes https://github.com/astral-sh/ruff/issues/18529

## Test Plan

Added test
2025-06-07 16:02:43 +02:00
Alex Waygood
503427855d [ty] Enable more corpus tests (#18531) 2025-06-07 14:18:25 +01:00
Alex Waygood
6e785867c3 [ty] Unify Type::is_subtype_of() and Type::is_assignable_to() (#18430) 2025-06-06 17:28:55 +00:00
Alex Waygood
1274521f9f [ty] Track the origin of the environment.python setting for better error messages (#18483) 2025-06-06 13:36:41 +01:00
shimies
8d24760643 Fix doc for Neovim setting examples (#18491)
## Summary
This PR fixes an error in the example Neovim configuration on [this
documentation
page](https://docs.astral.sh/ruff/editors/settings/#configuration).
The `configuration` block should be nested under `settings`, consistent
with other properties and as outlined
[here](https://docs.astral.sh/ruff/editors/setup/#neovim).

I encountered this issue when copying the example to configure Ruff
integration in my Neovim setup: the config didn't work until I corrected
the nesting.

## Test Plan
- [x] Confirmed that the corrected configuration works in a real Neovim
+ Ruff setup
- [x] Verified that the updated configuration renders correctly in
MkDocs
<img width="382" alt="image"
src="https://github.com/user-attachments/assets/0722fb35-8ffa-4b10-90ba-c6e8417e40bf"
/>
2025-06-06 15:19:16 +05:30
Carl Meyer
db8db536f8 [ty] clarify requirements for scope_id argument to in_type_expression (#18488) 2025-06-05 22:46:26 -07:00
Carl Meyer
cb8246bc5f [ty] remove unnecessary Either (#18489)
Just a quick review-comment follow-up.
2025-06-05 18:39:22 -07:00
Dylan
5faf72a4d9 Bump 0.11.13 (#18484) 2025-06-05 15:18:38 -05:00
Micha Reiser
28dbc5c51e [ty] Fix completion order in playground (#18480) 2025-06-05 18:55:54 +02:00
Brent Westbrook
ce216c79cc Remove Message::to_rule (#18447)
## Summary

As the title says, this PR removes the `Message::to_rule` method by
replacing related uses of `Rule` with `NoqaCode` (or the rule's name in
the case of the cache). Where it seemed a `Rule` was really needed, we
convert back to the `Rule` by parsing either the rule name (with
`str::parse`) or the `NoqaCode` (with `Rule::from_code`).

I thought this was kind of like cheating and that it might not resolve
this part of Micha's
[comment](https://github.com/astral-sh/ruff/pull/18391#issuecomment-2933764275):

> because we can't add Rule to Diagnostic or **have it anywhere in our
shared rendering logic**

but after looking again, the only remaining `Rule` conversion in
rendering code is for the SARIF output format. The other two non-test
`Rule` conversions are for caching and writing a fix summary, which I
don't think fall into the shared rendering logic. That leaves the SARIF
format as the only real problem, but maybe we can delay that for now.

The motivation here is that we won't be able to store a `Rule` on the
new `Diagnostic` type, but we should be able to store a `NoqaCode`,
likely as a string.

## Test Plan

Existing tests

##
[Benchmarks](https://codspeed.io/astral-sh/ruff/branches/brent%2Fremove-to-rule)

Almost no perf regression, only -1% on
`linter/default-rules[large/dataset.py]`.

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-05 12:48:29 -04:00
Victorien
33468cc8cc [pyupgrade] Apply UP035 only on py313+ for get_type_hints() (#18476) 2025-06-05 17:16:29 +01:00
Ibraheem Ahmed
8531f4b3ca [ty] Add infrastructure for AST garbage collection (#18445)
## Summary

https://github.com/astral-sh/ty/issues/214 will require a couple
invasive changes that I would like to get merged even before garbage
collection is fully implemented (to avoid rebasing):
- `ParsedModule` can no longer be dereferenced directly. Instead you
need to load a `ParsedModuleRef` to access the AST, which requires a
reference to the salsa database (as it may require re-parsing the AST if
it was collected).
- `AstNodeRef` can only be dereferenced with the `node` method, which
takes a reference to the `ParsedModuleRef`. This allows us to encode the
fact that ASTs do not live as long as the database and may be collected
as soon as a given instance of a `ParsedModuleRef` is dropped. There are a
number of places where we currently merge the `'db` and `'ast`
lifetimes, so this requires giving some types/functions two separate
lifetime parameters.
2025-06-05 11:43:18 -04:00
Andrew Gallant
55100209c7 [ty] IDE: add support for object.<CURSOR> completions (#18468)
This PR adds logic for detecting `Name Dot [Name]` token patterns,
finding the corresponding `ExprAttribute`, getting the type of the
object and returning the members available on that object.

Here's a video demonstrating this working:

https://github.com/user-attachments/assets/42ce78e8-5930-4211-a18a-fa2a0434d0eb

Ref astral-sh/ty#86
2025-06-05 11:15:19 -04:00
chiri
c0bb83b882 [perflint] fix missing parentheses for lambda and ternary conditions (PERF401, PERF403) (#18412)
Closes #18405
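A rough sketch of the kind of rewrite involved (names invented; the exact cases fixed are in the linked issue). When the accumulated value is a lambda or ternary, the generated comprehension parenthesizes it so the result stays unambiguous:

```python
flag = True
items = [1, 2, 3]

# Manual accumulation that PERF403 rewrites into a dict comprehension:
mapping = {}
for i in items:
    mapping[i] = (lambda x=i: x) if flag else None

# Sketch of the fixed form, with the lambda/ternary value parenthesized:
mapping_fixed = {i: ((lambda x=i: x) if flag else None) for i in items}

print(sorted(mapping) == sorted(mapping_fixed))
```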
2025-06-05 09:57:08 -05:00
Brent Westbrook
74a4e9af3d Combine lint and syntax error handling (#18471)
## Summary

This is a spin-off from
https://github.com/astral-sh/ruff/pull/18447#discussion_r2125844669 to
avoid using `Message::noqa_code` to differentiate between lints and
syntax errors. I went through all of the calls on `main` and on the
branch from #18447, and the instance in `ruff_server` noted in the
linked comment was actually the primary place where this was being done.
Other calls to `noqa_code` are typically some variation of
`message.noqa_code().map_or(String::new, format!(...))`, with the major
exception of the gitlab output format:


a120610b5b/crates/ruff_linter/src/message/gitlab.rs (L93-L105)

which obviously assumes that `None` means syntax error. A simple fix
here would be to use `message.name()` for `check_name` instead of the
noqa code, but I'm not sure how breaking that would be. This could just
be:

```rust
 let description = message.body();
 let description = description.strip_prefix("SyntaxError: ").unwrap_or(description).to_string();
 let check_name = message.name();
```

In that case. This sounds reasonable based on the [Code Quality report
format](https://docs.gitlab.com/ci/testing/code_quality/#code-quality-report-format)
docs:

> | Name | Type | Description|
> |-----|-----|----|
> |`check_name` | String | A unique name representing the check, or
rule, associated with this violation. |

## Test Plan

Existing tests
2025-06-05 12:50:02 +00:00
Alex Waygood
8485dbb324 [ty] Fix --python argument for Windows, and improve error messages for bad --python arguments (#18457)
## Summary

Fixes https://github.com/astral-sh/ty/issues/556.

On Windows, system installations have different layouts to virtual
environments. In Windows virtual environments, the Python executable is
found at `<sys.prefix>/Scripts/python.exe`. But in Windows system
installations, the Python executable is found at
`<sys.prefix>/python.exe`. That means that Windows users were able to
point to Python executables inside virtual environments with the
`--python` flag, but they weren't able to point to Python executables
inside system installations.

This PR fixes that issue. It also makes a couple of other changes:
- Nearly all `sys.prefix` resolution is moved inside `site_packages.rs`.
That was the original design of the `site-packages` resolution logic,
but features implemented since the initial implementation have added
some resolution and validation to `resolver.rs` inside the module
resolver. That means that we've ended up with a somewhat confusing code
structure and a situation where several checks are unnecessarily
duplicated between the two modules.
- I noticed that we had quite bad error messages if you e.g. pointed to
a path that didn't exist on disk with `--python` (we just gave a
somewhat impenetrable message saying that we "failed to canonicalize"
the path). I improved the error messages here and added CLI tests for
`--python` and the `environment.python` configuration setting.

## Test Plan

- Existing tests pass
- Added new CLI tests
- I manually checked that virtual-environment discovery still works if
no configuration is given
- Micha did some manual testing to check that pointing `--python` to a
system-installation executable now works on Windows
2025-06-05 08:19:15 +01:00
Shunsuke Shibayama
0858896bc4 [ty] type narrowing by attribute/subscript assignments (#18041)
## Summary

This PR partially solves https://github.com/astral-sh/ty/issues/164
(derived from #17643).

Currently, the definitions we manage are limited to those for simple
name (symbol) targets, but we expand this to track definitions for
attribute and subscript targets as well.

This was originally planned as part of the work in #17643, but the
changes are significant, so I made it a separate PR.
After merging this PR, I will reflect these changes in #17643.

There is still some incomplete work remaining, but the basic features
have been implemented, so I am publishing it as a draft PR.
Here is the TODO list (there may be more to come):
* [x] Complete rewrite and refactoring of documentation (removing
`Symbol` and replacing it with `Place`)
* [x] More thorough testing
* [x] Consolidation of duplicated code (maybe we can consolidate the
handling related to name, attribute, and subscript)

This PR replaces the current `Symbol` API with the `Place` API, which is
a concept that includes attributes and subscripts (the term is borrowed
from Rust).
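A small example of the kind of code this enables (class and attribute names invented):

```python
from __future__ import annotations


class Connection:
    timeout: int | None = None


conn = Connection()
conn.timeout = 30
# With this change, the assignment above narrows `conn.timeout` from
# `int | None` to `int`, so arithmetic on it type-checks cleanly.
doubled = conn.timeout * 2
print(doubled)
```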

## Test Plan

`mdtest/narrow/assignment.md` is added.

---------

Co-authored-by: David Peter <sharkdp@users.noreply.github.com>
Co-authored-by: Carl Meyer <carl@astral.sh>
2025-06-04 17:24:27 -07:00
Alex Waygood
ce8b744f17 [ty] Only calculate information for unresolved-reference subdiagnostic if we know we'll emit the diagnostic (#18465)
## Summary

This optimizes some of the logic added in
https://github.com/astral-sh/ruff/pull/18444. In general, we only
calculate information for subdiagnostics if we know we'll actually emit
the diagnostic. The check to see whether we'll emit the diagnostic is
work we'll definitely have to do, whereas the work to gather
information for a subdiagnostic isn't work we necessarily have to do if
the diagnostic isn't going to be emitted at all.

This PR makes us lazier about gathering the information we need for the
subdiagnostic, and moves all the subdiagnostic logic into one function
rather than having some `unresolved-reference` subdiagnostic logic in
`infer.rs` and some in `diagnostic.rs`.

## Test Plan

`cargo test -p ty_python_semantic`
2025-06-04 20:41:00 +01:00
Alex Waygood
5a8cdab771 [ty] Only consider a type T a subtype of a protocol P if all of P's members are fully bound on T (#18466)
## Summary

Fixes https://github.com/astral-sh/ty/issues/578
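An illustration of the "fully bound" distinction (invented names): a member that is only declared on a class, never assigned, does not exist on instances at runtime:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class HasX(Protocol):
    x: int


class Declared:
    x: int      # declared only: instances have no `x` attribute at runtime


class Bound:
    x: int = 0  # declared *and* bound: `x` is present on instances


print(isinstance(Declared(), HasX), isinstance(Bound(), HasX))
```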

## Test Plan

mdtests
2025-06-04 19:39:14 +00:00
Alex Waygood
3a8191529c [ty] Exclude members starting with _abc_ from a protocol interface (#18467)
## Summary

As well as excluding a hardcoded set of special attributes, CPython at
runtime also excludes any attributes or declarations starting with
`_abc_` from the set of members that make up a protocol interface. I
missed this in my initial implementation.

This is a bit of a CPython implementation detail, but I do think it's
important that we try to model the runtime as best we can here. The
closer we are to the runtime behaviour, the closer we come to sound
behaviour when narrowing types from `isinstance()` checks against
runtime-checkable protocols (for example).

## Test Plan

Extended an existing mdtest
2025-06-04 20:34:09 +01:00
lipefree
e658778ced [ty] Add subdiagnostic suggestion to unresolved-reference diagnostic when variable exists on self (#18444)
## Summary

Closes https://github.com/astral-sh/ty/issues/502.

In the following example:
```py
class Foo:
    x: int

    def method(self):
        y = x
```
The user may have intended to use `y = self.x` in `method`.

This is now added as a subdiagnostic in the following form:

`info: An attribute with the same name as 'x' is defined, consider using
'self.x'`

## Test Plan

Added mdtest with snapshot diagnostics.
2025-06-04 08:13:50 -07:00
David Peter
f1883d71a4 [ty] IDE: only provide declarations and bindings as completions (#18456)
## Summary

Previously, all symbols were provided as possible completions. In an
example like the following, both `foo` and `f` were suggested as
completions, because `f` itself is a symbol.
```py
foo = 1

f<CURSOR>
```
Similarly, in the following example, `hidden_symbol` was suggested, even
though it is not statically visible:
```py
if 1 + 2 != 3:
    hidden_symbol = 1

hidden_<CURSOR>
```

With the change suggested here, we only use statically visible
declarations and bindings as a source for completions.


## Test Plan

- Updated snapshot tests
- New test for statically hidden definitions
- Added test for star import
2025-06-04 16:11:05 +02:00
David Peter
11db567b0b [ty] ty_ide: Hotfix for expression_scope_id panics (#18455)
## Summary

Implement a hotfix for the playground/LSP crashes related to missing
`expression_scope_id`s.

relates to: https://github.com/astral-sh/ty/issues/572

## Test Plan

* Regression tests from https://github.com/astral-sh/ruff/pull/18441
* Ran the playground locally to check if panics occur / completions
still work.

---------

Co-authored-by: Andrew Gallant <andrew@astral.sh>
2025-06-04 10:39:16 +02:00
David Peter
9f8c3de462 [ty] Improve docs for Class{Literal,Type}::instance_member (#18454)
## Summary

Mostly just refer to `Type::instance_member` which has much more
details.
2025-06-04 09:55:45 +02:00
David Peter
293d4ac388 [ty] Add meta-type tests for legacy TypeVars (#18453)
## Summary

Follow up to the comment by @dcreager
[here](https://github.com/astral-sh/ruff/pull/18439#discussion_r2123802784).
2025-06-04 07:44:44 +00:00
Carl Meyer
9e8a7e9353 update to salsa that doesn't panic silently on cycles (#18450) 2025-06-04 07:40:16 +02:00
Dhruv Manilawala
453e5f5934 [ty] Add tests for empty list/tuple unpacking (#18451)
## Summary

This PR is to address this comment:
https://github.com/astral-sh/ruff/pull/18438#issuecomment-2935344415

## Test Plan

Run mdtest
2025-06-04 02:40:26 +00:00
Dhruv Manilawala
7ea773daf2 [ty] Argument type expansion for overload call evaluation (#18382)
## Summary

Part of astral-sh/ty#104, closes: astral-sh/ty#468

This PR implements the argument type expansion which is step 3 of the
overload call evaluation algorithm.

Specifically, this step needs to be taken if type checking resolves to
no matching overload and there are argument types that can be expanded.

## Test Plan

Add new test cases.

## Ecosystem analysis

This PR removes 174 `no-matching-overload` false positives -- I looked
at a lot of them and they all are false positives.

One thing that I'm not able to understand is that in
2b7e3adf27/sphinx/ext/autodoc/preserve_defaults.py (L179)
the inferred type of `value` is `str | None` by ty and Pyright, which is
correct, but it's only ty that raises `invalid-argument-type` error
while Pyright doesn't. The constructor method of `DefaultValue` has
declared type of `str` which is invalid.

There are a few cases of false positives resulting from the fact that ty
doesn't implement narrowing on attribute expressions.
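A sketch of the expansion step (hypothetical overloads):

```python
from typing import Union, overload


@overload
def parse(value: int) -> str: ...
@overload
def parse(value: str) -> int: ...
def parse(value):
    return str(value) if isinstance(value, int) else len(value)


def handle(value: Union[int, str]):
    # No single overload accepts `int | str`, but expanding the union and
    # checking `int` and `str` separately makes each element match one
    # overload, so the call is accepted with return type `str | int`.
    return parse(value)


print(handle(7), handle("abc"))
```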
2025-06-04 02:12:00 +00:00
Alex Waygood
0079cc6817 [ty] Minor cleanup for site-packages discovery logic (#18446) 2025-06-03 18:49:14 +00:00
Matthew Mckee
e8ea40012a [ty] Add generic inference for dataclasses (#18443)
## Summary

An issue seen here https://github.com/astral-sh/ty/issues/500

The `__init__` method of dataclasses had no inherited generic context,
so we could not infer the type of an instance from a constructor call
with generics

## Test Plan

Add tests to `classes.md` in the generics folder.
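A minimal sketch of the case this fixes (invented names): the synthesized `__init__` now inherits the class's generic context, so the constructor call can solve `T`:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")


@dataclass
class Box(Generic[T]):
    value: T


# ty now infers `Box[int]` for this constructor call instead of failing
# to solve the dataclass's type variable.
box = Box(1)
print(box.value)
```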
2025-06-03 09:59:43 -07:00
Abhijeet Prasad Bodas
71d8a5da2a [ty] dataclasses: Allow using dataclasses.dataclass as a function. (#18440)
## Summary

Part of https://github.com/astral-sh/ty/issues/111

Using `dataclass` as a function, instead of as a decorator, did not work
as expected prior to this change.
Fix that by modifying the dataclass overload's return type.

## Test Plan

New mdtests, fixing the existing TODO.
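A minimal sketch of the call form this fixes (class name invented):

```python
import dataclasses


class Plain:
    x: int = 0


# Calling dataclasses.dataclass as a plain function, rather than applying
# it as a decorator, is the form whose overload return type is fixed here.
PlainDC = dataclasses.dataclass(Plain)

p = PlainDC(x=5)
print(p.x)
```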
2025-06-03 09:50:29 -07:00
Douglas Creager
2c3b3d3230 [ty] Create separate FunctionLiteral and FunctionType types (#18360)
This updates our representation of functions to more closely match our
representation of classes.

The new `OverloadLiteral` and `FunctionLiteral` classes represent a
function definition in the AST. If a function is generic, this is
unspecialized. `FunctionType` has been updated to represent a function
type, which is specialized if the function is generic. (These names are
chosen to match `ClassLiteral` and `ClassType` on the class side.)

This PR does not add a separate `Type` variant for `FunctionLiteral`.
Maybe we should? Possibly as a follow-on PR?

Part of https://github.com/astral-sh/ty/issues/462

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-06-03 10:59:31 -04:00
Dhruv Manilawala
8d98c601d8 [ty] Infer list[T] when unpacking non-tuple type (#18438)
## Summary

Follow-up from #18401, I was looking at whether that would fix the issue
at https://github.com/astral-sh/ty/issues/247#issuecomment-2917656676
and it didn't, which made me realize that the PR only inferred `list[T]`
when the value type was tuple but it could be other types as well.

This PR fixes the actual issue by inferring `list[T]` for the non-tuple
type case.

## Test Plan

Add test cases for starred expression involved with non-tuple type. I
also added a few test cases for list type and list literal.

I also verified that the example in the linked issue comment works:
```py
def _(line: str):
    a, b, *c = line.split(maxsplit=2)
    c.pop()
```
2025-06-03 19:17:47 +05:30
David Peter
0986edf427 [ty] Meta-type of type variables should be type[..] (#18439)
## Summary

Came across this while debugging some ecosystem changes in
https://github.com/astral-sh/ruff/pull/18347. I think the meta-type of a
typevar-annotated variable should be equal to `type`, not `<class
'object'>`.

## Test Plan

New Markdown tests.
2025-06-03 15:22:00 +02:00
chiri
03f1f8e218 [pyupgrade] Make fix unsafe if it deletes comments (UP050) (#18390)

## Summary
Closes #18387

## Test Plan
update snapshots
2025-06-03 09:10:15 -04:00
chiri
628bb2cd1d [pyupgrade] Make fix unsafe if it deletes comments (UP004) (#18393)

## Summary
https://github.com/astral-sh/ruff/issues/18387#issuecomment-2923039331

## Test Plan
update snapshots
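For context, a minimal sketch of UP004 and the comment case that now makes the fix unsafe (invented names):

```python
# UP004 flags explicit inheritance from `object`, which is redundant on
# Python 3. The fix deletes the base list; if a comment sits inside it,
# the fix would delete the comment too, so it is now marked unsafe.

class Config(  # legacy note the autofix would have to delete
    object,
):
    pass


class ConfigFixed:  # the fixed form, with the redundant base removed
    pass


print(Config.__bases__ == ConfigFixed.__bases__ == (object,))
```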
2025-06-03 09:09:33 -04:00
lipefree
f23d2c9b9e [ty] Support using legacy typing aliases for generic classes in type annotations (#18404)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-06-03 12:09:51 +01:00
Micha Reiser
67d94d9ec8 Use ty's completions in playground (#18425) 2025-06-03 10:11:39 +02:00
otakutyrant
d1cb8e2142 Update editor setup docs about Neovim and Vim (#18324)
## Summary

I struggled to make `ruff_organize_imports` work, and then found out I
had missed the key note about conform.nvim because it had been placed in
the Vim section by mistake. So I refined both sections.

---------

Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
2025-06-03 07:40:22 +00:00
renovate[bot]
57202c1c77 Update NPM Development dependencies (#18423) 2025-06-03 08:06:56 +02:00
Dhruv Manilawala
2289187b74 Infer list[T] for starred target in unpacking (#18401)
## Summary

Closes: astral-sh/ty#191

## Test Plan

Update existing tests.
2025-06-03 07:25:07 +05:30
Robsdedude
14c42a8ddf [refurb] Mark FURB180 fix unsafe when class has bases (#18149)

## Summary

Mark `FURB180`'s fix as unsafe if the class already has base classes.
This is because the base classes might validate the other base classes
(like `typing.Protocol` does) or otherwise alter runtime behavior if
more base classes are added.
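A minimal sketch of the rule and the fix (class names invented):

```python
import abc


# What FURB180 flags: setting the metaclass keyword directly ...
class Original(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def run(self) -> None: ...


# ... and the fix: inherit from abc.ABC instead.
class Fixed(abc.ABC):
    @abc.abstractmethod
    def run(self) -> None: ...


# Both forms yield an abstract class with the same metaclass. But when
# other bases are present, appending `abc.ABC` can change the MRO or trip
# base-class validation (typing.Protocol, for instance, rejects
# non-protocol bases), hence the unsafe marking.
print(type(Original) is abc.ABCMeta and type(Fixed) is abc.ABCMeta)
```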

## Test Plan

The existing snapshot test covers this case already.

## References

Partially addresses https://github.com/astral-sh/ruff/issues/13307 (left
out way to permit certain exceptions)

---------

Co-authored-by: Brent Westbrook <36778786+ntBre@users.noreply.github.com>
Co-authored-by: Brent Westbrook <brentrwestbrook@gmail.com>
2025-06-03 00:51:09 +00:00
Denys Kyslytsyn
e677863787 [fastapi] Avoid false positive for class dependencies (FAST003) (#18271)

## Summary

Closes #17226.

This PR updates the `FAST003` rule to correctly handle [FastAPI class
dependencies](https://fastapi.tiangolo.com/tutorial/dependencies/classes-as-dependencies/).
Specifically, if a path parameter is declared in either:

- a `pydantic.BaseModel` used as a dependency, or  
- the `__init__` method of a class used as a dependency,  

then `FAST003` will no longer incorrectly report it as unused.

FastAPI allows a shortcut when using annotated class dependencies -
`Depends` can be called without arguments, e.g.:

```python
class MyParams(BaseModel):
    my_id: int

@router.get("/{my_id}")
def get_id(params: Annotated[MyParams, Depends()]): ...
```
This PR ensures that such usage is properly supported by the linter.

Note: Support for dataclasses is not included in this PR. Let me know if
you’d like it to be added.

## Test Plan

Added relevant test cases to the `FAST003.py` fixture.
2025-06-02 14:34:50 -04:00
lipefree
f379eb6e62 [ty] Treat lambda functions as instances of types.FunctionType (#18431) 2025-06-02 16:46:26 +01:00
Alex Waygood
47698883ae [ty] Fix false positives for legacy ParamSpecs inside Callable type expressions (#18426) 2025-06-02 14:10:00 +01:00
Alex Waygood
e2d96df501 [ty] Improve diagnostics if the user attempts to import a stdlib module that does not exist on their configured Python version (#18403) 2025-06-02 10:52:26 +00:00
renovate[bot]
384e80ec80 Update taiki-e/install-action action to v2.52.4 (#18420)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-02 09:03:32 +02:00
renovate[bot]
b9f3b0e0a6 Update docker/build-push-action action to v6.18.0 (#18422)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-02 09:03:09 +02:00
Micha Reiser
1e6d76c878 [ty] Fix server hang after shutdown request (#18414) 2025-06-02 06:57:51 +00:00
renovate[bot]
844c8626c3 Update Rust crate libcst to v1.8.0 (#18424) 2025-06-02 07:40:18 +02:00
renovate[bot]
1c8d9d707e Update Rust crate clap to v4.5.39 (#18419) 2025-06-02 07:39:27 +02:00
renovate[bot]
4856377478 Update cargo-bins/cargo-binstall action to v1.12.6 (#18416) 2025-06-02 07:38:57 +02:00
renovate[bot]
643c845a47 Update dependency mdformat-mkdocs to v4.3.0 (#18421) 2025-06-02 07:38:36 +02:00
renovate[bot]
9e952cf0e0 Update pre-commit dependencies (#18418) 2025-06-02 07:38:10 +02:00
renovate[bot]
c4015edf48 Update dependency ruff to v0.11.12 (#18417) 2025-06-02 07:37:56 +02:00
Matthew Mckee
97b824db3e [ty] Ensure Literal types are considered assignable to anything their Instance supertypes are assignable to (#18351) 2025-06-01 16:39:56 +01:00
Micha Reiser
220ab88779 [ty] Promote projects to good that now no longer hang (#18370) 2025-06-01 17:25:46 +02:00
github-actions[bot]
7a63ac145a Sync vendored typeshed stubs (#18407)
Co-authored-by: typeshedbot <>
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-06-01 15:21:18 +01:00
Micha Reiser
54f597658c [ty] Fix multithreading related hangs and panics (#18238) 2025-06-01 11:07:55 +02:00
Ibraheem Ahmed
aa1fad61e0 Support relative --ty-path in ty-benchmark (#18385)
## Summary

This currently doesn't work because the benchmark changes the working
directory. Also updates the process name to make it easier to compare
two local ty binaries.
2025-05-30 18:19:20 -04:00
Alex Waygood
b390b3cb8e [ty] Update docs for Python version inference (#18397) 2025-05-30 22:45:28 +01:00
Zanie Blue
88866f0048 [ty] Infer the Python version from the environment if feasible (#18057)
Co-authored-by: Alex Waygood <alex.waygood@gmail.com>
2025-05-30 21:22:51 +00:00
Dylan
9bbf4987e8 Implement template strings (#17851)
This PR implements template strings (t-strings) in the parser and
formatter for Ruff.

Minimal changes necessary to compile were made in other parts of the code (e.g. ty, the linter, etc.). These will be covered properly in follow-up PRs.
2025-05-30 15:00:56 -05:00
Carl Meyer
ad024f9a09 [ty] support callability of bound/constrained typevars (#18389)
## Summary

Allow a typevar to be callable if it is bound to a callable type, or
constrained to callable types.

I spent some time digging into why this support didn't fall out
naturally, and ultimately the reason is that we look up `__call__` on
the meta type (since its a dunder), and our implementation of
`Type::to_meta_type` for `Type::Callable` does not return a type with
`__call__`.

A more general solution here would be to have `Type::to_meta_type` for
`Type::Callable` synthesize a protocol with `__call__` and return an
intersection with that protocol (since for a type to be callable, we
know its meta-type must have `__call__`). That solution could in
principle also replace the special-case handling of `Type::Callable`
itself, here in `Type::bindings`. But that more general approach would
also be slower, and our protocol support isn't quite ready for that yet,
and handling this directly in `Type::bindings` is really not bad.

Fixes https://github.com/astral-sh/ty/issues/480

## Test Plan

Added mdtests.
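A small example of what this enables (names invented):

```python
from typing import Callable, TypeVar

# A typevar bounded by a callable type: any value typed as F is callable.
F = TypeVar("F", bound=Callable[[int], int])


def twice(func: F, x: int) -> int:
    # With this change, ty consults the typevar's bound when looking up
    # __call__, so this call is accepted.
    return func(func(x))


print(twice(lambda n: n + 1, 0))
```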
2025-05-30 12:01:51 -07:00
Andrew Gallant
fc549bda94 [ty] Minor tweaks to "list all members" docs and tests (#18388)
Ref https://github.com/astral-sh/ruff/pull/18251#pullrequestreview-2881810681
2025-05-30 13:36:57 -04:00
Alex Waygood
77c8ddf101 [ty] Fix broken property tests for disjointness (#18384) 2025-05-30 16:49:20 +01:00
David Peter
e730f27f80 [ty] List available members for a given type (#18251)
This PR adds initial support for listing all attributes of
an object. It is exposed through a new `all_members`
routine in `ty_extensions`, which is in turn used to test
the functionality.

The purpose of listing all members is for code
completion. That is, given a `object.<CURSOR>`, we
would like to list all available attributes on
`object`.
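`all_members` is a static computation inside ty; the closest runtime analogue is Python's built-in `dir()`, which gives a feel for what a completion list at `obj.<CURSOR>` contains:

```python
obj = "hello"

# Roughly what a completion provider would offer for `obj.<CURSOR>`,
# filtering out dunder/private names:
completions = sorted(name for name in dir(obj) if not name.startswith("_"))

assert "upper" in completions
assert "capitalize" in completions
```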
2025-05-30 11:24:20 -04:00
Wei Lee
d65bd69963 [airflow] Add unsafe fix for module moved cases (AIR312) (#18363)

## Summary


Follow up on https://github.com/astral-sh/ruff/pull/18093 and apply it
to AIR312

## Test Plan


The existing test fixtures have been updated
2025-05-30 09:36:20 -04:00
Brent Westbrook
c713e76e4d Add a SourceFile to OldDiagnostic (#18356)
Summary
--

This is the last main difference between the `OldDiagnostic` and
`Message` types, so attaching a `SourceFile` to `OldDiagnostic` should
make combining the two types almost trivial.

Initially I updated the remaining rules without access to a `Checker`
to take a `&SourceFile` directly, but after Micha's suggestion in
https://github.com/astral-sh/ruff/pull/18356#discussion_r2113281552, I
updated all of these calls to take a `LintContext` instead. This new
type is a thin wrapper around a `RefCell<Vec<OldDiagnostic>>` and a
`SourceFile`, and now has the `report_diagnostic` method returning a
`DiagnosticGuard` instead of `Checker`. This allows the same
`Drop`-based implementation to be used in cases without a `Checker`
and also avoids a lot of intermediate allocations of
`Vec<OldDiagnostic>`s.

`Checker` now also contains a `LintContext`, which it defers to for its
`report_diagnostic` methods, which I preserved for convenience.

Test Plan
--

Existing tests
2025-05-30 13:34:38 +00:00
Micha Reiser
8005ebb405 Update salsa past generational id change (#18362) 2025-05-30 15:31:33 +02:00
Wei Lee
0c29e258c6 [airflow] Add unsafe fix for module moved cases (AIR311) (#18366)

## Summary


Follow up on https://github.com/astral-sh/ruff/pull/18093 and apply it
to AIR311

---

Rules fixed
* `airflow.models.datasets.expand_alias_to_datasets` →
`airflow.models.asset.expand_alias_to_assets`
* `airflow.models.baseoperatorlink.BaseOperatorLink` →
`airflow.sdk.BaseOperatorLink`


## Test Plan

The existing test fixtures have been updated
2025-05-30 09:27:14 -04:00
Wei Lee
b5b6b657cc [airflow] Add unsafe fix for module moved cases (AIR301) (#18367)

## Summary


Follow up on https://github.com/astral-sh/ruff/pull/18093 and apply it
to AIR301

## Test Plan


The existing test fixtures have been updated
2025-05-30 08:46:39 -04:00
Alex Waygood
ad2f667ee4 [ty] Improve tests for site-packages discovery (#18374)
## Summary

- Convert tests demonstrating our resilience to malformed/absent
`version` fields in `pyvenv.cfg` files to mdtests. Also make them more
expansive.
- Convert the regression test I added in
https://github.com/astral-sh/ruff/pull/18157 to an mdtest
- Add comments next to unit tests that cannot be converted to mdtests
(but where it's not obvious why they can't) so I don't have to do this
exercise again 😄
- In `site_packages.rs`, factor out the logic for figuring out where we
expect the system-installation `site-packages` to be. Currently we have
the same logic twice.

## Test Plan

`cargo test -p ty_python_semantic`
2025-05-30 07:32:21 +01:00
Carl Meyer
363f061f09 [ty] _typeshed.Self is not a special form (#18377)
## Summary

This change was based on a misreading of a comment in typeshed, and a
wrong assumption about what was causing a test failure in a prior PR.
Reverting it doesn't cause any tests to fail.

## Test Plan

Existing tests.
2025-05-29 17:11:13 -07:00
InSync
9b0dfc505f [ty] Callable types are disjoint from non-callable @final nominal instance types (#18368)
## Summary

Resolves [#513](https://github.com/astral-sh/ty/issues/513).

Callable types are now considered to be disjoint from nominal instance
types where:

* The class is `@final`, and
* Its `__call__` either does not exist or is not assignable to `(...) ->
Unknown`.
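The static rule can be illustrated at runtime (a sketch, not ty's implementation): a `@final` class with no `__call__` can never produce a callable instance, so the two types share no inhabitants.

```python
from typing import final

@final
class Sealed:
    """No `__call__`, and `@final` means no subclass can add one."""

def describe(x: object) -> str:
    # With the new disjointness rule, ty can narrow a
    # `Callable[..., object] | Sealed` union on a `callable()` check.
    return "callable" if callable(x) else "not callable"

assert describe(Sealed()) == "not callable"
assert describe(print) == "callable"
```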

## Test Plan

Markdown tests.

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-05-29 23:27:27 +00:00
lipefree
695de4f27f [ty] Add diagnosis for function with no return statement but with return type annotation (#18359)
## Summary

Partially implements https://github.com/astral-sh/ty/issues/538:
```py
from pathlib import Path

def setup_test_project(registry_name: str, registry_url: str, project_dir: str) -> Path:
    pyproject_file = Path(project_dir) / "pyproject.toml"
    pyproject_file.write_text("...", encoding="utf-8")
```
As no return statement is defined in the function `setup_test_project`,
which has the annotated return type `Path`, we emit the following diagnostic:

- error[invalid-return-type]: Function **always** implicitly returns
`None`, which is not assignable to return type `Path`

with a subdiagnostic:
- note: Consider changing your return annotation to `-> None` or adding a `return` statement
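Both suggested fixes, sketched against a simplified version of the example (the file write is elided so the functions are self-contained):

```python
from pathlib import Path

# Fix 1: change the annotation to match the implicit `return None`.
def setup_none(project_dir: str) -> None:
    pyproject_file = Path(project_dir) / "pyproject.toml"
    # ... write the file ...

# Fix 2: add a `return` statement matching the annotation.
def setup_path(project_dir: str) -> Path:
    pyproject_file = Path(project_dir) / "pyproject.toml"
    return pyproject_file

assert setup_none("proj") is None
assert setup_path("proj") == Path("proj") / "pyproject.toml"
```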
 
## Test Plan

mdtests with snapshots to capture the subdiagnostic. Note that
existing snapshots were modified since they now fall into this
category.

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-05-29 23:17:18 +00:00
Wei Lee
3445d1322d [airflow] Add unsafe fix module moved cases (AIR302) (#18093)

## Summary


Add utility functions `generate_import_edit` and
`generate_remove_and_runtime_import_edit` to generate the fix needed for
the airflow rules.

1. `generate_import_edit` is for cases where the member name has
changed (e.g., `airflow.datasets.Dataset` to `airflow.sdk.Asset`). It's
just extracted from the original logic.
2. `generate_remove_and_runtime_import_edit` is for cases where the
member name has not changed (e.g.,
`airflow.operators.pig_operator.PigOperator` to
`airflow.providers.apache.pig.hooks.pig.PigCliHook`). This is newly
introduced. As it introduces a runtime import, I mark it as an unsafe
fix. Under the hood, it finds the original import statement, removes
it, and adds a new import.

---

* rules fix
* `airflow.sensors.external_task_sensor.ExternalTaskSensorLink` →
`airflow.providers.standard.sensors.external_task.ExternalDagLink`

## Test Plan

The existing test fixtures have been updated
2025-05-29 16:30:40 -04:00
Brent Westbrook
2c3f091e0e Rename ruff_linter::Diagnostic to OldDiagnostic (#18355)
Summary
--

It's a bit late in the refactoring process, but I think there are still
a couple of PRs left before getting rid of this type entirely, so I
thought it would still be worth doing.

This PR is just a quick rename with no other changes.

Test Plan
--

Existing tests
2025-05-29 15:04:31 -04:00
Marcus Näslund
9d3cad95bc [refurb] Add coverage of set and frozenset calls (FURB171) (#18035)
## Summary

Adds coverage of `set(...)` and `frozenset(...)` calls in addition to
`{...}` literals in `SingleItemMembershipTest`.

Fixes #15792
(and replaces the old PR #15793)


## Test Plan

Updated unit test and snapshot.

Steps to reproduce are in the issue linked above.
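The forms involved, shown side by side (illustrative code, not a fixture from the PR) — all four membership tests are equivalent, and the comparison is the preferred spelling:

```python
def is_ruff(name: str) -> bool:
    flagged_set_literal = name in {"ruff"}           # previously covered
    flagged_set_call = name in set(("ruff",))        # newly covered
    flagged_frozenset = name in frozenset({"ruff"})  # newly covered
    preferred = name == "ruff"                       # suggested replacement
    assert flagged_set_literal == flagged_set_call == flagged_frozenset == preferred
    return preferred

assert is_ruff("ruff") is True
assert is_ruff("black") is False
```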

2025-05-29 14:59:49 -04:00
Alex Waygood
7df79cfb70 Add offset method to ruff_python_trivia::Cursor (#18371) 2025-05-29 16:08:15 +01:00
Andrew Gallant
33ed502edb ty_ide: improve completions by using scopes
Previously, completions were based on just returning every identifier
parsed in the current Python file. In this commit, we change it to
identify an expression under the cursor and then return all symbols
available to the scope containing that expression.

This is still returning too much, and also, in some cases, not enough.
Namely, it doesn't really take the specific context into account other
than scope. But this does improve on the status quo. For example:

    def foo(): ...
    def bar():
        def fast(): ...
    def foofoo(): ...

    f<CURSOR>

When asking for completions here, the LSP will no longer include `fast`
as a possible completion in this context.
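A runtime sketch of the scope rule (not ty's actual implementation) applied to the example above, using Python's `ast` module: only names defined directly in the enclosing scope are offered, so `fast` is excluded.

```python
import ast

SOURCE = """\
def foo(): ...
def bar():
    def fast(): ...
def foofoo(): ...
"""

def module_scope_functions(source: str) -> list[str]:
    # Only direct children of the module body define names visible at a
    # module-level cursor; `fast` lives inside `bar`'s scope.
    tree = ast.parse(source)
    return [
        node.name
        for node in tree.body
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef))
    ]

assert module_scope_functions(SOURCE) == ["foo", "bar", "foofoo"]
```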

Ref https://github.com/astral-sh/ty/issues/86
2025-05-29 10:31:30 -04:00
Andrew Gallant
a827b16ebd ruff_python_parser: add Tokens::before method
This is analogous to the existing `Tokens::after` method. Its
implementation is almost identical.

We plan to use this for looking at the tokens immediately before the
cursor when fetching completions.
2025-05-29 10:31:30 -04:00
Alex Waygood
47a2ec002e [ty] Split Type::KnownInstance into two type variants (#18350) 2025-05-29 14:47:55 +01:00
692 changed files with 42348 additions and 16056 deletions

View File

@@ -79,7 +79,7 @@ jobs:
# Adapted from https://docs.docker.com/build/ci/github-actions/multi-platform/
- name: Build and push by digest
id: build
uses: docker/build-push-action@1dc73863535b631f98b2378be8619f83b136f4a0 # v6.17.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
platforms: ${{ matrix.platform }}
@@ -231,7 +231,7 @@ jobs:
${{ env.TAG_PATTERNS }}
- name: Build and push
uses: docker/build-push-action@1dc73863535b631f98b2378be8619f83b136f4a0 # v6.17.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
platforms: linux/amd64,linux/arm64

View File

@@ -239,11 +239,11 @@ jobs:
- name: "Install mold"
uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-insta
- name: ty mdtests (GitHub annotations)
@@ -297,11 +297,11 @@ jobs:
- name: "Install mold"
uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-insta
- name: "Run tests"
@@ -324,7 +324,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Run tests"
@@ -407,11 +407,11 @@ jobs:
- name: "Install mold"
uses: rui314/setup-mold@67424c1b3680e35255d95971cbd5de0047bf31c3 # v1
- name: "Install cargo nextest"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-nextest
- name: "Install cargo insta"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-insta
- name: "Run tests"
@@ -437,7 +437,7 @@ jobs:
- name: "Install Rust toolchain"
run: rustup show
- name: "Install cargo-binstall"
uses: cargo-bins/cargo-binstall@5cbf019d8cb9b9d5b086218c41458ea35d817691 # v1.12.5
uses: cargo-bins/cargo-binstall@e8c9cc3599f6c4063d143083205f98ca25d91677 # v1.12.6
with:
tool: cargo-fuzz@0.11.2
- name: "Install cargo-fuzz"
@@ -690,7 +690,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- uses: cargo-bins/cargo-binstall@5cbf019d8cb9b9d5b086218c41458ea35d817691 # v1.12.5
- uses: cargo-bins/cargo-binstall@e8c9cc3599f6c4063d143083205f98ca25d91677 # v1.12.6
- run: cargo binstall --no-confirm cargo-shear
- run: cargo shear
@@ -910,7 +910,7 @@ jobs:
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@6c6479b49816fcc0975a31af977bdc1f847c2920 # v2.52.1
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
with:
tool: cargo-codspeed

View File

@@ -80,7 +80,7 @@ repos:
pass_filenames: false # This makes it a lot faster
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.11
rev: v0.11.12
hooks:
- id: ruff-format
- id: ruff
@@ -98,7 +98,7 @@ repos:
# zizmor detects security vulnerabilities in GitHub Actions workflows.
# Additional configuration for the tool is found in `.github/zizmor.yml`
- repo: https://github.com/woodruffw/zizmor-pre-commit
rev: v1.8.0
rev: v1.9.0
hooks:
- id: zizmor

View File

@@ -1,5 +1,31 @@
# Changelog
## 0.11.13
### Preview features
- \[`airflow`\] Add unsafe fix for module moved cases (`AIR301`,`AIR311`,`AIR312`,`AIR302`) ([#18367](https://github.com/astral-sh/ruff/pull/18367),[#18366](https://github.com/astral-sh/ruff/pull/18366),[#18363](https://github.com/astral-sh/ruff/pull/18363),[#18093](https://github.com/astral-sh/ruff/pull/18093))
- \[`refurb`\] Add coverage of `set` and `frozenset` calls (`FURB171`) ([#18035](https://github.com/astral-sh/ruff/pull/18035))
- \[`refurb`\] Mark `FURB180` fix unsafe when class has bases ([#18149](https://github.com/astral-sh/ruff/pull/18149))
### Bug fixes
- \[`perflint`\] Fix missing parentheses for lambda and ternary conditions (`PERF401`, `PERF403`) ([#18412](https://github.com/astral-sh/ruff/pull/18412))
- \[`pyupgrade`\] Apply `UP035` only on py313+ for `get_type_hints()` ([#18476](https://github.com/astral-sh/ruff/pull/18476))
- \[`pyupgrade`\] Make fix unsafe if it deletes comments (`UP004`,`UP050`) ([#18393](https://github.com/astral-sh/ruff/pull/18393), [#18390](https://github.com/astral-sh/ruff/pull/18390))
### Rule changes
- \[`fastapi`\] Avoid false positive for class dependencies (`FAST003`) ([#18271](https://github.com/astral-sh/ruff/pull/18271))
### Documentation
- Update editor setup docs for Neovim and Vim ([#18324](https://github.com/astral-sh/ruff/pull/18324))
### Other changes
- Support Python 3.14 template strings (t-strings) in formatter and parser ([#17851](https://github.com/astral-sh/ruff/pull/17851))
## 0.11.12
### Preview features

Cargo.lock generated
View File

@@ -8,6 +8,18 @@ version = "2.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "512761e0bb2578dd7380c6baaa0f4ce03e84f95e960231d1dec8bf4d7d6e2627"
[[package]]
name = "ahash"
version = "0.8.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a15f179cd60c4584b8a8c596927aadc462e27f2ca70c04e0071964a73ba7a75"
dependencies = [
"cfg-if",
"once_cell",
"version_check",
"zerocopy",
]
[[package]]
name = "aho-corasick"
version = "1.1.3"
@@ -342,9 +354,9 @@ dependencies = [
[[package]]
name = "clap"
version = "4.5.38"
version = "4.5.39"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed93b9805f8ba930df42c2590f05453d5ec36cbb85d018868a5b24d31f6ac000"
checksum = "fd60e63e9be68e5fb56422e397cf9baddded06dae1d2e523401542383bc72a9f"
dependencies = [
"clap_builder",
"clap_derive",
@@ -352,9 +364,9 @@ dependencies = [
[[package]]
name = "clap_builder"
version = "4.5.38"
version = "4.5.39"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "379026ff283facf611b0ea629334361c4211d1b12ee01024eec1591133b04120"
checksum = "89cc6392a1f72bbeb820d71f32108f61fdaf18bc526e1d23954168a67759ef51"
dependencies = [
"anstream",
"anstyle",
@@ -486,7 +498,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "117725a109d387c937a1533ce01b450cbde6b88abceea8473c4d7a85853cda3c"
dependencies = [
"lazy_static",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -495,7 +507,7 @@ version = "3.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fde0e0ec90c9dfb3b4b1a0891a7dcd0e2bffde2f7efed5fe7c9bb00e5bfb915e"
dependencies = [
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -909,7 +921,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cea14ef9355e3beab063703aa9dab15afd25f0667c341310c1e5274bb1d0da18"
dependencies = [
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -1106,6 +1118,10 @@ name = "hashbrown"
version = "0.14.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e5274423e17b7c9fc20b6e7e208532f9b19825d82dfd615708b70edd83df41f1"
dependencies = [
"ahash",
"allocator-api2",
]
[[package]]
name = "hashbrown"
@@ -1444,7 +1460,7 @@ checksum = "e04d7f318608d35d4b61ddd75cbdaee86b023ebe2bd5a66ee0915f0bf93095a9"
dependencies = [
"hermit-abi 0.5.1",
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -1508,7 +1524,7 @@ dependencies = [
"portable-atomic",
"portable-atomic-util",
"serde",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -1597,9 +1613,9 @@ checksum = "d750af042f7ef4f724306de029d18836c26c1765a54a6a3f094cbd23a7267ffa"
[[package]]
name = "libcst"
version = "1.7.0"
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad9e315e3f679e61b9095ffd5e509de78b8a4ea3bba9d772f6fb243209f808d4"
checksum = "3ac076e37f8fe6bcddbb6c3282897e6e9498b254907ccbfc806dc8f9f1491f02"
dependencies = [
"annotate-snippets",
"libcst_derive",
@@ -1607,14 +1623,14 @@ dependencies = [
"paste",
"peg",
"regex",
"thiserror 1.0.69",
"thiserror 2.0.12",
]
[[package]]
name = "libcst_derive"
version = "1.7.0"
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bfa96ed35d0dccc67cf7ba49350cb86de3dcb1d072a7ab28f99117f19d874953"
checksum = "9cf4a12c744a301b216c4f0cb73542709ab15e6dadbb06966ac05864109d05da"
dependencies = [
"quote",
"syn",
@@ -2485,7 +2501,7 @@ dependencies = [
[[package]]
name = "ruff"
version = "0.11.12"
version = "0.11.13"
dependencies = [
"anyhow",
"argfile",
@@ -2722,7 +2738,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.11.12"
version = "0.11.13"
dependencies = [
"aho-corasick",
"anyhow",
@@ -3058,7 +3074,7 @@ dependencies = [
[[package]]
name = "ruff_wasm"
version = "0.11.12"
version = "0.11.13"
dependencies = [
"console_error_panic_hook",
"console_log",
@@ -3160,7 +3176,7 @@ dependencies = [
"errno",
"libc",
"linux-raw-sys",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -3177,13 +3193,14 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.21.1"
source = "git+https://github.com/salsa-rs/salsa.git?rev=4818b15f3b7516555d39f5a41cb75970448bee4c#4818b15f3b7516555d39f5a41cb75970448bee4c"
version = "0.22.0"
source = "git+https://github.com/carljm/salsa.git?rev=0f6d406f6c309964279baef71588746b8c67b4a3#0f6d406f6c309964279baef71588746b8c67b4a3"
dependencies = [
"boxcar",
"compact_str",
"crossbeam-queue",
"dashmap 6.1.0",
"hashbrown 0.14.5",
"hashbrown 0.15.3",
"hashlink",
"indexmap",
@@ -3200,15 +3217,14 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.21.1"
source = "git+https://github.com/salsa-rs/salsa.git?rev=4818b15f3b7516555d39f5a41cb75970448bee4c#4818b15f3b7516555d39f5a41cb75970448bee4c"
version = "0.22.0"
source = "git+https://github.com/carljm/salsa.git?rev=0f6d406f6c309964279baef71588746b8c67b4a3#0f6d406f6c309964279baef71588746b8c67b4a3"
[[package]]
name = "salsa-macros"
version = "0.21.1"
source = "git+https://github.com/salsa-rs/salsa.git?rev=4818b15f3b7516555d39f5a41cb75970448bee4c#4818b15f3b7516555d39f5a41cb75970448bee4c"
version = "0.22.0"
source = "git+https://github.com/carljm/salsa.git?rev=0f6d406f6c309964279baef71588746b8c67b4a3#0f6d406f6c309964279baef71588746b8c67b4a3"
dependencies = [
"heck",
"proc-macro2",
"quote",
"syn",
@@ -3529,7 +3545,7 @@ dependencies = [
"getrandom 0.3.3",
"once_cell",
"rustix",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -3872,6 +3888,7 @@ dependencies = [
"countme",
"crossbeam",
"ctrlc",
"dunce",
"filetime",
"indicatif",
"insta",
@@ -3948,6 +3965,7 @@ dependencies = [
"anyhow",
"bitflags 2.9.1",
"camino",
"colored 3.0.0",
"compact_str",
"countme",
"dir-test",
@@ -3960,6 +3978,7 @@ dependencies = [
"ordermap",
"quickcheck",
"quickcheck_macros",
"ruff_annotate_snippets",
"ruff_db",
"ruff_index",
"ruff_macros",
@@ -4523,7 +4542,7 @@ version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cf221c93e13a30d793f7645a0e7762c55d169dbb0a49671918a2319d289b10bb"
dependencies = [
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]

View File

@@ -129,7 +129,7 @@ regex = { version = "1.10.2" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "4818b15f3b7516555d39f5a41cb75970448bee4c" }
salsa = { git = "https://github.com/carljm/salsa.git", rev = "0f6d406f6c309964279baef71588746b8c67b4a3" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }

View File

@@ -148,8 +148,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
# For a specific version.
curl -LsSf https://astral.sh/ruff/0.11.12/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.11.12/install.ps1 | iex"
curl -LsSf https://astral.sh/ruff/0.11.13/install.sh | sh
powershell -c "irm https://astral.sh/ruff/0.11.13/install.ps1 | iex"
```
You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -182,7 +182,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.11.12
rev: v0.11.13
hooks:
# Run the linter.
- id: ruff

View File

@@ -4,6 +4,10 @@ extend-exclude = [
"crates/ty_vendored/vendor/**/*",
"**/resources/**/*",
"**/snapshots/**/*",
# Completion tests tend to have a lot of incomplete
# words naturally. It's annoying to have to make all
# of them actually words. So just ignore typos here.
"crates/ty_ide/src/completion.rs",
]
[default.extend-words]

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff"
version = "0.11.12"
version = "0.11.13"
publish = true
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -439,7 +439,10 @@ impl LintCacheData {
let messages = messages
.iter()
.filter_map(|msg| msg.to_rule().map(|rule| (rule, msg)))
// Parse the kebab-case rule name into a `Rule`. This will fail for syntax errors, so
// this also serves to filter them out, but we shouldn't be caching files with syntax
// errors anyway.
.filter_map(|msg| Some((msg.name().parse().ok()?, msg)))
.map(|(rule, msg)| {
// Make sure that all message use the same source file.
assert_eq!(

View File

@@ -12,7 +12,7 @@ use rayon::prelude::*;
use rustc_hash::FxHashMap;
use ruff_db::panic::catch_unwind;
use ruff_linter::Diagnostic;
use ruff_linter::OldDiagnostic;
use ruff_linter::message::Message;
use ruff_linter::package::PackageRoot;
use ruff_linter::registry::Rule;
@@ -131,8 +131,7 @@ pub(crate) fn check(
Diagnostics::new(
vec![Message::from_diagnostic(
Diagnostic::new(IOError { message }, TextRange::default()),
dummy,
OldDiagnostic::new(IOError { message }, TextRange::default(), &dummy),
None,
)],
FxHashMap::default(),

View File

@@ -30,7 +30,7 @@ impl<'a> Explanation<'a> {
let (linter, _) = Linter::parse_code(&code).unwrap();
let fix = rule.fixable().to_string();
Self {
name: rule.as_ref(),
name: rule.name().as_str(),
code,
linter: linter.name(),
summary: rule.message_formats()[0],
@@ -44,7 +44,7 @@ impl<'a> Explanation<'a> {
fn format_rule_text(rule: Rule) -> String {
let mut output = String::new();
let _ = write!(&mut output, "# {} ({})", rule.as_ref(), rule.noqa_code());
let _ = write!(&mut output, "# {} ({})", rule.name(), rule.noqa_code());
output.push('\n');
output.push('\n');

View File

@@ -12,7 +12,7 @@ use colored::Colorize;
use log::{debug, warn};
use rustc_hash::FxHashMap;
use ruff_linter::Diagnostic;
use ruff_linter::OldDiagnostic;
use ruff_linter::codes::Rule;
use ruff_linter::linter::{FixTable, FixerResult, LinterResult, ParseSource, lint_fix, lint_only};
use ruff_linter::message::Message;
@@ -64,13 +64,13 @@ impl Diagnostics {
let source_file = SourceFileBuilder::new(name, "").finish();
Self::new(
vec![Message::from_diagnostic(
Diagnostic::new(
OldDiagnostic::new(
IOError {
message: err.to_string(),
},
TextRange::default(),
&source_file,
),
source_file,
None,
)],
FxHashMap::default(),
@@ -165,9 +165,9 @@ impl AddAssign for FixMap {
continue;
}
let fixed_in_file = self.0.entry(filename).or_default();
for (rule, count) in fixed {
for (rule, name, count) in fixed.iter() {
if count > 0 {
*fixed_in_file.entry(rule).or_default() += count;
*fixed_in_file.entry(rule).or_default(name) += count;
}
}
}
@@ -235,7 +235,7 @@ pub(crate) fn lint_path(
};
let source_file =
SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
lint_pyproject_toml(source_file, settings)
lint_pyproject_toml(&source_file, settings)
} else {
vec![]
};
@@ -305,7 +305,7 @@ pub(crate) fn lint_path(
ParseSource::None,
);
let transformed = source_kind;
let fixed = FxHashMap::default();
let fixed = FixTable::default();
(result, transformed, fixed)
}
} else {
@@ -319,7 +319,7 @@ pub(crate) fn lint_path(
ParseSource::None,
);
let transformed = source_kind;
let fixed = FxHashMap::default();
let fixed = FixTable::default();
(result, transformed, fixed)
};
@@ -396,7 +396,7 @@ pub(crate) fn lint_stdin(
}
return Ok(Diagnostics {
messages: lint_pyproject_toml(source_file, &settings.linter),
messages: lint_pyproject_toml(&source_file, &settings.linter),
fixed: FixMap::from_iter([(fs::relativize_path(path), FixTable::default())]),
notebook_indexes: FxHashMap::default(),
});
@@ -473,7 +473,7 @@ pub(crate) fn lint_stdin(
}
let transformed = source_kind;
let fixed = FxHashMap::default();
let fixed = FixTable::default();
(result, transformed, fixed)
}
} else {
@@ -487,7 +487,7 @@ pub(crate) fn lint_stdin(
ParseSource::None,
);
let transformed = source_kind;
let fixed = FxHashMap::default();
let fixed = FixTable::default();
(result, transformed, fixed)
};

View File

@@ -7,6 +7,7 @@ use bitflags::bitflags;
use colored::Colorize;
use itertools::{Itertools, iterate};
use ruff_linter::codes::NoqaCode;
use ruff_linter::linter::FixTable;
use serde::Serialize;
use ruff_linter::fs::relativize_path;
@@ -80,7 +81,7 @@ impl Printer {
let fixed = diagnostics
.fixed
.values()
.flat_map(std::collections::HashMap::values)
.flat_map(FixTable::counts)
.sum::<usize>();
if self.flags.intersects(Flags::SHOW_VIOLATIONS) {
@@ -302,7 +303,7 @@ impl Printer {
let statistics: Vec<ExpandedStatistics> = diagnostics
.messages
.iter()
.map(|message| (message.to_noqa_code(), message))
.map(|message| (message.noqa_code(), message))
.sorted_by_key(|(code, message)| (*code, message.fixable()))
.fold(
vec![],
@@ -472,13 +473,13 @@ fn show_fix_status(fix_mode: flags::FixMode, fixables: Option<&FixableStatistics
fn print_fix_summary(writer: &mut dyn Write, fixed: &FixMap) -> Result<()> {
let total = fixed
.values()
.map(|table| table.values().sum::<usize>())
.map(|table| table.counts().sum::<usize>())
.sum::<usize>();
assert!(total > 0);
let num_digits = num_digits(
*fixed
fixed
.values()
.filter_map(|table| table.values().max())
.filter_map(|table| table.counts().max())
.max()
.unwrap(),
);
@@ -498,12 +499,11 @@ fn print_fix_summary(writer: &mut dyn Write, fixed: &FixMap) -> Result<()> {
relativize_path(filename).bold(),
":".cyan()
)?;
for (rule, count) in table.iter().sorted_by_key(|(.., count)| Reverse(*count)) {
for (code, name, count) in table.iter().sorted_by_key(|(.., count)| Reverse(*count)) {
writeln!(
writer,
" {count:>num_digits$} × {} ({})",
rule.noqa_code().to_string().red().bold(),
rule.as_ref(),
" {count:>num_digits$} × {code} ({name})",
code = code.to_string().red().bold(),
)?;
}
}

View File

@@ -566,7 +566,7 @@ fn venv() -> Result<()> {
----- stderr -----
ruff failed
Cause: Invalid search path settings
Cause: Failed to discover the site-packages directory: Invalid `--python` argument: `none` could not be canonicalized
Cause: Failed to discover the site-packages directory: Invalid `--python` argument `none`: does not point to a Python executable or a directory on disk
");
});

View File

@@ -5436,14 +5436,15 @@ match 2:
print("it's one")
"#
),
@r"
success: true
exit_code: 0
@r###"
success: false
exit_code: 1
----- stdout -----
All checks passed!
test.py:2:1: SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)
Found 1 error.
----- stderr -----
"
"###
);
// syntax error on 3.9 with preview

View File

@@ -1,5 +1,4 @@
use std::fmt::Formatter;
use std::ops::Deref;
use std::sync::Arc;
use ruff_python_ast::ModModule;
@@ -18,7 +17,7 @@ use crate::source::source_text;
/// The query is only cached when the [`source_text()`] hasn't changed. This is because
/// comparing two ASTs is a non-trivial operation and every offset change is directly
/// reflected in the changed AST offsets.
/// The other reason is that Ruff's AST doesn't implement `Eq` which Sala requires
/// The other reason is that Ruff's AST doesn't implement `Eq` which Salsa requires
/// for determining if a query result is unchanged.
#[salsa::tracked(returns(ref), no_eq)]
pub fn parsed_module(db: &dyn Db, file: File) -> ParsedModule {
@@ -36,7 +35,10 @@ pub fn parsed_module(db: &dyn Db, file: File) -> ParsedModule {
ParsedModule::new(parsed)
}
/// Cheap cloneable wrapper around the parsed module.
/// A wrapper around a parsed module.
///
/// This type manages instances of the module AST. A particular instance of the AST
/// is represented with the [`ParsedModuleRef`] type.
#[derive(Clone)]
pub struct ParsedModule {
inner: Arc<Parsed<ModModule>>,
@@ -49,17 +51,11 @@ impl ParsedModule {
}
}
/// Consumes `self` and returns the Arc storing the parsed module.
pub fn into_arc(self) -> Arc<Parsed<ModModule>> {
self.inner
}
}
impl Deref for ParsedModule {
type Target = Parsed<ModModule>;
fn deref(&self) -> &Self::Target {
&self.inner
/// Loads a reference to the parsed module.
pub fn load(&self, _db: &dyn Db) -> ParsedModuleRef {
ParsedModuleRef {
module_ref: self.inner.clone(),
}
}
}
@@ -77,6 +73,30 @@ impl PartialEq for ParsedModule {
impl Eq for ParsedModule {}
/// Cheap cloneable wrapper around an instance of a module AST.
#[derive(Clone)]
pub struct ParsedModuleRef {
module_ref: Arc<Parsed<ModModule>>,
}
impl ParsedModuleRef {
pub fn as_arc(&self) -> &Arc<Parsed<ModModule>> {
&self.module_ref
}
pub fn into_arc(self) -> Arc<Parsed<ModModule>> {
self.module_ref
}
}
impl std::ops::Deref for ParsedModuleRef {
type Target = Parsed<ModModule>;
fn deref(&self) -> &Self::Target {
&self.module_ref
}
}
#[cfg(test)]
mod tests {
use crate::Db;
@@ -98,7 +118,7 @@ mod tests {
let file = system_path_to_file(&db, path).unwrap();
let parsed = parsed_module(&db, file);
let parsed = parsed_module(&db, file).load(&db);
assert!(parsed.has_valid_syntax());
@@ -114,7 +134,7 @@ mod tests {
let file = system_path_to_file(&db, path).unwrap();
let parsed = parsed_module(&db, file);
let parsed = parsed_module(&db, file).load(&db);
assert!(parsed.has_valid_syntax());
@@ -130,7 +150,7 @@ mod tests {
let virtual_file = db.files().virtual_file(&db, path);
let parsed = parsed_module(&db, virtual_file.file());
let parsed = parsed_module(&db, virtual_file.file()).load(&db);
assert!(parsed.has_valid_syntax());
@@ -146,7 +166,7 @@ mod tests {
let virtual_file = db.files().virtual_file(&db, path);
let parsed = parsed_module(&db, virtual_file.file());
let parsed = parsed_module(&db, virtual_file.file()).load(&db);
assert!(parsed.has_valid_syntax());
@@ -177,7 +197,7 @@ else:
let file = vendored_path_to_file(&db, VendoredPath::new("path.pyi")).unwrap();
let parsed = parsed_module(&db, file);
let parsed = parsed_module(&db, file).load(&db);
assert!(parsed.has_valid_syntax());
}

View File

@@ -171,6 +171,21 @@ pub trait System: Debug {
PatternError,
>;
/// Fetches the environment variable `key` from the current process.
///
/// # Errors
///
/// Returns [`std::env::VarError::NotPresent`] if:
/// - The variable is not set.
/// - The variable's name contains an equal sign or NUL (`'='` or `'\0'`).
///
/// Returns [`std::env::VarError::NotUnicode`] if the variable's value is not valid
/// Unicode.
fn env_var(&self, name: &str) -> std::result::Result<String, std::env::VarError> {
let _ = name;
Err(std::env::VarError::NotPresent)
}
fn as_any(&self) -> &dyn std::any::Any;
fn as_any_mut(&mut self) -> &mut dyn std::any::Any;

View File

@@ -214,6 +214,10 @@ impl System for OsSystem {
})
})))
}
fn env_var(&self, name: &str) -> std::result::Result<String, std::env::VarError> {
std::env::var(name)
}
}
impl OsSystem {

View File

@@ -13,12 +13,12 @@ pub fn assert_function_query_was_not_run<Db, Q, QDb, I, R>(
Q: Fn(QDb, I) -> R,
I: salsa::plumbing::AsId + std::fmt::Debug + Copy,
{
let id = input.as_id().as_u32();
let id = input.as_id();
let (query_name, will_execute_event) = find_will_execute_event(db, query, input, events);
db.attach(|_| {
if let Some(will_execute_event) = will_execute_event {
panic!("Expected query {query_name}({id}) not to have run but it did: {will_execute_event:?}\n\n{events:#?}");
panic!("Expected query {query_name}({id:?}) not to have run but it did: {will_execute_event:?}\n\n{events:#?}");
}
});
}
@@ -65,7 +65,7 @@ pub fn assert_function_query_was_run<Db, Q, QDb, I, R>(
Q: Fn(QDb, I) -> R,
I: salsa::plumbing::AsId + std::fmt::Debug + Copy,
{
let id = input.as_id().as_u32();
let id = input.as_id();
let (query_name, will_execute_event) = find_will_execute_event(db, query, input, events);
db.attach(|_| {
@@ -224,7 +224,7 @@ fn query_was_not_run() {
}
#[test]
#[should_panic(expected = "Expected query len(0) not to have run but it did:")]
#[should_panic(expected = "Expected query len(Id(0)) not to have run but it did:")]
fn query_was_not_run_fails_if_query_was_run() {
use crate::tests::TestDb;
use salsa::prelude::*;
@@ -287,7 +287,7 @@ fn const_query_was_not_run_fails_if_query_was_run() {
}
#[test]
#[should_panic(expected = "Expected query len(0) to have run but it did not:")]
#[should_panic(expected = "Expected query len(Id(0)) to have run but it did not:")]
fn query_was_run_fails_if_query_was_not_run() {
use crate::tests::TestDb;
use salsa::prelude::*;

View File

@@ -29,7 +29,7 @@ pub(crate) fn main(args: &Args) -> Result<()> {
if let Some(explanation) = rule.explanation() {
let mut output = String::new();
let _ = writeln!(&mut output, "# {} ({})", rule.as_ref(), rule.noqa_code());
let _ = writeln!(&mut output, "# {} ({})", rule.name(), rule.noqa_code());
let (linter, _) = Linter::parse_code(&rule.noqa_code().to_string()).unwrap();
if linter.url().is_some() {
@@ -101,7 +101,7 @@ pub(crate) fn main(args: &Args) -> Result<()> {
let filename = PathBuf::from(ROOT_DIR)
.join("docs")
.join("rules")
.join(rule.as_ref())
.join(&*rule.name())
.with_extension("md");
if args.dry_run {

View File

@@ -55,7 +55,7 @@ fn generate_table(table_out: &mut String, rules: impl IntoIterator<Item = Rule>,
FixAvailability::None => format!("<span {SYMBOL_STYLE}></span>"),
};
let rule_name = rule.as_ref();
let rule_name = rule.name();
// If the message ends in a bracketed expression (like: "Use {replacement}"), escape the
// brackets. Otherwise, it'll be interpreted as an HTML attribute via the `attr_list`

View File

@@ -10,7 +10,7 @@ use ruff_python_ast::PythonVersion;
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{
Db, Program, ProgramSettings, PythonPath, PythonPlatform, PythonVersionSource,
PythonVersionWithSource, SearchPathSettings, default_lint_registry,
PythonVersionWithSource, SearchPathSettings, SysPrefixPathOrigin, default_lint_registry,
};
static EMPTY_VENDORED: std::sync::LazyLock<VendoredFileSystem> = std::sync::LazyLock::new(|| {
@@ -37,17 +37,18 @@ impl ModuleDb {
) -> Result<Self> {
let mut search_paths = SearchPathSettings::new(src_roots);
if let Some(venv_path) = venv_path {
search_paths.python_path = PythonPath::from_cli_flag(venv_path);
search_paths.python_path =
PythonPath::sys_prefix(venv_path, SysPrefixPathOrigin::PythonCliFlag);
}
let db = Self::default();
Program::from_settings(
&db,
ProgramSettings {
python_version: PythonVersionWithSource {
python_version: Some(PythonVersionWithSource {
version: python_version,
source: PythonVersionSource::default(),
},
}),
python_platform: PythonPlatform::default(),
search_paths,
},

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.11.12"
version = "0.11.13"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -10,22 +10,10 @@ from airflow import (
PY312,
)
from airflow.api_connexion.security import requires_access
from airflow.configuration import (
as_dict,
get,
getboolean,
getfloat,
getint,
has_option,
remove_option,
set,
)
from airflow.contrib.aws_athena_hook import AWSAthenaHook
from airflow.datasets import DatasetAliasEvent
from airflow.hooks.base_hook import BaseHook
from airflow.operators.subdag import SubDagOperator
from airflow.secrets.local_filesystem import LocalFilesystemBackend
from airflow.sensors.base_sensor_operator import BaseSensorOperator
from airflow.triggers.external_task import TaskStateTrigger
from airflow.utils import dates
from airflow.utils.dag_cycle_tester import test_cycle
@@ -40,13 +28,10 @@ from airflow.utils.dates import (
)
from airflow.utils.db import create_session
from airflow.utils.decorators import apply_defaults
from airflow.utils.file import TemporaryDirectory, mkdirs
from airflow.utils.helpers import chain as helper_chain
from airflow.utils.helpers import cross_downstream as helper_cross_downstream
from airflow.utils.log import secrets_masker
from airflow.utils.file import mkdirs
from airflow.utils.state import SHUTDOWN, terminating_states
from airflow.utils.trigger_rule import TriggerRule
from airflow.www.auth import has_access
from airflow.www.auth import has_access, has_access_dataset
from airflow.www.utils import get_sensitive_variables_fields, should_hide_value_for_key
# airflow root
@@ -55,11 +40,6 @@ PY36, PY37, PY38, PY39, PY310, PY311, PY312
# airflow.api_connexion.security
requires_access
# airflow.configuration
get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
# airflow.contrib.*
AWSAthenaHook()
@@ -68,10 +48,6 @@ AWSAthenaHook()
DatasetAliasEvent()
# airflow.hooks
BaseHook()
# airflow.operators.subdag.*
SubDagOperator()
@@ -81,10 +57,6 @@ SubDagOperator()
LocalFilesystemBackend()
# airflow.sensors.base_sensor_operator
BaseSensorOperator()
# airflow.triggers.external_task
TaskStateTrigger()
@@ -114,15 +86,8 @@ create_session
apply_defaults
# airflow.utils.file
TemporaryDirectory()
mkdirs
# airflow.utils.helpers
helper_chain
helper_cross_downstream
# airflow.utils.log
secrets_masker
# airflow.utils.state
SHUTDOWN
@@ -135,37 +100,8 @@ TriggerRule.NONE_FAILED_OR_SKIPPED
# airflow.www.auth
has_access
has_access_dataset
# airflow.www.utils
get_sensitive_variables_fields
should_hide_value_for_key
# airflow.operators.python
from airflow.operators.python import get_current_context
get_current_context()
# airflow.providers.mysql
from airflow.providers.mysql.datasets.mysql import sanitize_uri
sanitize_uri
# airflow.providers.postgres
from airflow.providers.postgres.datasets.postgres import sanitize_uri
sanitize_uri
# airflow.providers.trino
from airflow.providers.trino.datasets.trino import sanitize_uri
sanitize_uri
# airflow.notifications.basenotifier
from airflow.notifications.basenotifier import BaseNotifier
BaseNotifier()
# airflow.auth.manager
from airflow.auth.managers.base_auth_manager import BaseAuthManager
BaseAuthManager()

View File

@@ -3,7 +3,6 @@ from __future__ import annotations
from airflow.api_connexion.security import requires_access_dataset
from airflow.auth.managers.models.resource_details import (
DatasetDetails,
)
from airflow.datasets.manager import (
DatasetManager,
@@ -12,15 +11,13 @@ from airflow.datasets.manager import (
)
from airflow.lineage.hook import DatasetLineageInfo
from airflow.metrics.validators import AllowListValidator, BlockListValidator
from airflow.secrets.local_filesystm import load_connections
from airflow.secrets.local_filesystem import load_connections
from airflow.security.permissions import RESOURCE_DATASET
from airflow.www.auth import has_access_dataset
requires_access_dataset()
DatasetDetails()
DatasetManager()
dataset_manager()
resolve_dataset_manager()
@@ -34,7 +31,6 @@ load_connections()
RESOURCE_DATASET
has_access_dataset()
from airflow.listeners.spec.dataset import (
on_dataset_changed,
@@ -43,3 +39,76 @@ from airflow.listeners.spec.dataset import (
on_dataset_created()
on_dataset_changed()
# airflow.operators.python
from airflow.operators.python import get_current_context
get_current_context()
# airflow.providers.mysql
from airflow.providers.mysql.datasets.mysql import sanitize_uri
sanitize_uri
# airflow.providers.postgres
from airflow.providers.postgres.datasets.postgres import sanitize_uri
sanitize_uri
# airflow.providers.trino
from airflow.providers.trino.datasets.trino import sanitize_uri
sanitize_uri
# airflow.notifications.basenotifier
from airflow.notifications.basenotifier import BaseNotifier
BaseNotifier()
# airflow.auth.manager
from airflow.auth.managers.base_auth_manager import BaseAuthManager
BaseAuthManager()
from airflow.configuration import (
as_dict,
get,
getboolean,
getfloat,
getint,
has_option,
remove_option,
set,
)
# airflow.configuration
get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
from airflow.hooks.base_hook import BaseHook
# airflow.hooks
BaseHook()
from airflow.sensors.base_sensor_operator import BaseSensorOperator
# airflow.sensors.base_sensor_operator
BaseSensorOperator()
BaseHook()
from airflow.utils.helpers import chain as helper_chain
from airflow.utils.helpers import cross_downstream as helper_cross_downstream
# airflow.utils.helpers
helper_chain
helper_cross_downstream
# airflow.utils.file
from airflow.utils.file import TemporaryDirectory
TemporaryDirectory()
from airflow.utils.log import secrets_masker
# airflow.utils.log
secrets_masker

View File

@@ -1,54 +1,54 @@
from __future__ import annotations
from airflow.providers.amazon.aws.auth_manager.avp.entities import AvpEntities
from airflow.providers.openlineage.utils.utils import (
DatasetInfo,
translate_airflow_dataset,
)
from airflow.secrets.local_filesystem import load_connections
from airflow.security.permissions import RESOURCE_DATASET
AvpEntities.DATASET
# airflow.providers.openlineage.utils.utils
DatasetInfo()
translate_airflow_dataset()
# airflow.secrets.local_filesystem
load_connections()
# airflow.security.permissions
RESOURCE_DATASET
from airflow.providers.amazon.aws.datasets.s3 import (
convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
)
from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
s3_create_dataset()
s3_convert_dataset_to_openlineage()
from airflow.providers.common.io.dataset.file import (
convert_dataset_to_openlineage as io_convert_dataset_to_openlineage,
)
from airflow.providers.common.io.dataset.file import create_dataset as io_create_dataset
from airflow.providers.google.datasets.bigquery import (
create_dataset as bigquery_create_dataset,
)
from airflow.providers.google.datasets.gcs import (
convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
)
from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
from airflow.providers.openlineage.utils.utils import (
DatasetInfo,
translate_airflow_dataset,
)
AvpEntities.DATASET
s3_create_dataset()
s3_convert_dataset_to_openlineage()
io_create_dataset()
io_convert_dataset_to_openlineage()
# # airflow.providers.google.datasets.bigquery
from airflow.providers.google.datasets.bigquery import (
create_dataset as bigquery_create_dataset,
)
# airflow.providers.google.datasets.bigquery
bigquery_create_dataset()
# airflow.providers.google.datasets.gcs
from airflow.providers.google.datasets.gcs import (
convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
)
from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
gcs_create_dataset()
gcs_convert_dataset_to_openlineage()
# airflow.providers.openlineage.utils.utils
DatasetInfo()
translate_airflow_dataset()
#
# airflow.secrets.local_filesystem
load_connections()
#
# airflow.security.permissions
RESOURCE_DATASET
# airflow.timetables
DatasetTriggeredTimetable()
#
# airflow.www.auth
has_access_dataset

View File

@@ -5,35 +5,30 @@ from airflow.hooks.S3_hook import (
provide_bucket_name,
)
from airflow.operators.gcs_to_s3 import GCSToS3Operator
from airflow.operators.google_api_to_s3_transfer import (
GoogleApiToS3Operator,
GoogleApiToS3Transfer,
)
from airflow.operators.redshift_to_s3_operator import (
RedshiftToS3Operator,
RedshiftToS3Transfer,
)
from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
from airflow.operators.s3_to_redshift_operator import (
S3ToRedshiftOperator,
S3ToRedshiftTransfer,
)
from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
from airflow.sensors.s3_key_sensor import S3KeySensor
S3Hook()
provide_bucket_name()
GCSToS3Operator()
GoogleApiToS3Operator()
RedshiftToS3Operator()
S3FileTransformOperator()
S3ToRedshiftOperator()
S3KeySensor()
from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
GoogleApiToS3Transfer()
RedshiftToS3Operator()
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
RedshiftToS3Transfer()
S3FileTransformOperator()
from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
S3ToRedshiftOperator()
S3ToRedshiftTransfer()
S3KeySensor()

View File

@@ -4,10 +4,13 @@ from airflow.hooks.dbapi import (
ConnectorProtocol,
DbApiHook,
)
ConnectorProtocol()
DbApiHook()
from airflow.hooks.dbapi_hook import DbApiHook
from airflow.operators.check_operator import SQLCheckOperator
ConnectorProtocol()
DbApiHook()
SQLCheckOperator()

View File

@@ -12,55 +12,59 @@ from airflow.macros.hive import (
)
from airflow.operators.hive_operator import HiveOperator
from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
from airflow.operators.hive_to_mysql import (
HiveToMySqlOperator,
HiveToMySqlTransfer,
)
from airflow.operators.hive_to_mysql import HiveToMySqlOperator
from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
from airflow.operators.mssql_to_hive import (
MsSqlToHiveOperator,
MsSqlToHiveTransfer,
)
from airflow.operators.mysql_to_hive import (
MySqlToHiveOperator,
MySqlToHiveTransfer,
)
from airflow.operators.s3_to_hive_operator import (
S3ToHiveOperator,
S3ToHiveTransfer,
)
from airflow.sensors.hive_partition_sensor import HivePartitionSensor
from airflow.sensors.metastore_partition_sensor import MetastorePartitionSensor
from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
HIVE_QUEUE_PRIORITIES
HiveCliHook()
HiveMetastoreHook()
HiveServer2Hook()
closest_ds_partition()
max_partition()
HiveCliHook()
HiveMetastoreHook()
HiveServer2Hook()
HIVE_QUEUE_PRIORITIES
HiveOperator()
HiveStatsCollectionOperator()
HiveToMySqlOperator()
HiveToMySqlTransfer()
HiveToSambaOperator()
MsSqlToHiveOperator()
MsSqlToHiveTransfer()
from airflow.operators.hive_to_mysql import HiveToMySqlTransfer
HiveToMySqlTransfer()
from airflow.operators.mysql_to_hive import MySqlToHiveOperator
MySqlToHiveOperator()
from airflow.operators.mysql_to_hive import MySqlToHiveTransfer
MySqlToHiveTransfer()
from airflow.operators.mssql_to_hive import MsSqlToHiveOperator
MsSqlToHiveOperator()
from airflow.operators.mssql_to_hive import MsSqlToHiveTransfer
MsSqlToHiveTransfer()
from airflow.operators.s3_to_hive_operator import S3ToHiveOperator
S3ToHiveOperator()
from airflow.operators.s3_to_hive_operator import S3ToHiveTransfer
S3ToHiveTransfer()
from airflow.sensors.hive_partition_sensor import HivePartitionSensor
HivePartitionSensor()
from airflow.sensors.metastore_partition_sensor import MetastorePartitionSensor
MetastorePartitionSensor()
from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
NamedHivePartitionSensor()

View File

@@ -16,14 +16,7 @@ from airflow.kubernetes.kube_client import (
from airflow.kubernetes.kubernetes_helper_functions import (
add_pod_suffix,
annotations_for_logging_task_metadata,
annotations_to_key,
create_pod_id,
get_logs_task_metadata,
rand_str,
)
from airflow.kubernetes.pod import (
Port,
Resources,
)
ALL_NAMESPACES
@@ -37,21 +30,13 @@ _enable_tcp_keepalive()
get_kube_client()
add_pod_suffix()
create_pod_id()
annotations_for_logging_task_metadata()
annotations_to_key()
get_logs_task_metadata()
rand_str()
Port()
Resources()
create_pod_id()
from airflow.kubernetes.pod_generator import (
PodDefaults,
PodGenerator,
PodGeneratorDeprecated,
add_pod_suffix,
datetime_to_label_safe_datestring,
extend_object_field,
@@ -61,18 +46,16 @@ from airflow.kubernetes.pod_generator import (
rand_str,
)
PodDefaults()
PodGenerator()
add_pod_suffix()
datetime_to_label_safe_datestring()
extend_object_field()
label_safe_datestring_to_datetime()
make_safe_label_value()
merge_objects()
PodGenerator()
PodDefaults()
PodGeneratorDeprecated()
add_pod_suffix()
rand_str()
from airflow.kubernetes.pod_generator_deprecated import (
PodDefaults,
PodGenerator,
@@ -90,7 +73,6 @@ make_safe_label_value()
PodLauncher()
PodStatus()
from airflow.kubernetes.pod_launcher_deprecated import (
PodDefaults,
PodLauncher,
@@ -115,3 +97,17 @@ K8SModel()
Secret()
Volume()
VolumeMount()
from airflow.kubernetes.kubernetes_helper_functions import (
annotations_to_key,
get_logs_task_metadata,
rand_str,
)
annotations_to_key()
get_logs_task_metadata()
rand_str()
from airflow.kubernetes.pod_generator import PodGeneratorDeprecated
PodGeneratorDeprecated()

View File

@@ -5,10 +5,6 @@ from airflow.operators.dagrun_operator import (
TriggerDagRunLink,
TriggerDagRunOperator,
)
from airflow.operators.dummy import (
DummyOperator,
EmptyOperator,
)
from airflow.operators.latest_only_operator import LatestOnlyOperator
from airflow.operators.python_operator import (
BranchPythonOperator,
@@ -19,15 +15,12 @@ from airflow.operators.python_operator import (
from airflow.sensors.external_task_sensor import (
ExternalTaskMarker,
ExternalTaskSensor,
ExternalTaskSensorLink,
)
BashOperator()
TriggerDagRunLink()
TriggerDagRunOperator()
DummyOperator()
EmptyOperator()
LatestOnlyOperator()
@@ -38,25 +31,48 @@ ShortCircuitOperator()
ExternalTaskMarker()
ExternalTaskSensor()
ExternalTaskSensorLink()
from airflow.operators.dummy_operator import (
DummyOperator,
EmptyOperator,
)
DummyOperator()
EmptyOperator()
from airflow.hooks.subprocess import SubprocessResult
SubprocessResult()
from airflow.hooks.subprocess import working_directory
working_directory()
from airflow.operators.datetime import target_times_as_dates
target_times_as_dates()
from airflow.operators.trigger_dagrun import TriggerDagRunLink
TriggerDagRunLink()
from airflow.sensors.external_task import ExternalTaskSensorLink
ExternalTaskSensorLink()
from airflow.sensors.time_delta import WaitSensor
WaitSensor()
WaitSensor()
from airflow.operators.dummy import DummyOperator
DummyOperator()
from airflow.operators.dummy import EmptyOperator
EmptyOperator()
from airflow.operators.dummy_operator import DummyOperator
DummyOperator()
from airflow.operators.dummy_operator import EmptyOperator
EmptyOperator()
from airflow.sensors.external_task_sensor import ExternalTaskSensorLink
ExternalTaskSensorLink()
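As an aside, fixtures like the one above exercise rules that flag imports of names moved or removed in newer Airflow releases. The core mechanic can be sketched with the standard `ast` module; the `MOVED` mapping below is a hypothetical one-entry stand-in, not the rule's real table:

```python
import ast

# Hypothetical mapping of relocated import paths to their replacements;
# the real tables live inside the ruff Airflow rules, not here.
MOVED = {
    "airflow.operators.dummy.DummyOperator": "airflow.operators.empty.EmptyOperator",
}

def deprecated_imports(source: str) -> list[str]:
    """Return dotted paths of flagged `from ... import ...` names."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ImportFrom) and node.module:
            for alias in node.names:
                path = f"{node.module}.{alias.name}"
                if path in MOVED:
                    hits.append(path)
    return hits

print(deprecated_imports("from airflow.operators.dummy import DummyOperator"))
```

A real implementation also has to handle plain `import x.y` statements and attribute accesses, which is what the large fixtures above cover.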

View File

@@ -9,19 +9,12 @@ from airflow.datasets import (
expand_alias_to_datasets,
)
from airflow.datasets.metadata import Metadata
from airflow.decorators import dag, setup, task, task_group, teardown
from airflow.io.path import ObjectStoragePath
from airflow.io.storage import attach
from airflow.models import DAG as DAGFromModel
from airflow.models import (
Connection,
Variable,
from airflow.decorators import (
dag,
setup,
task,
task_group,
)
from airflow.models.baseoperator import chain, chain_linear, cross_downstream
from airflow.models.baseoperatorlink import BaseOperatorLink
from airflow.models.dag import DAG as DAGFromDag
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.utils.dag_parsing_context import get_parsing_context
# airflow
DatasetFromRoot()
@@ -39,9 +32,22 @@ dag()
task()
task_group()
setup()
from airflow.decorators import teardown
from airflow.io.path import ObjectStoragePath
from airflow.io.storage import attach
from airflow.models import DAG as DAGFromModel
from airflow.models import (
Connection,
Variable,
)
from airflow.models.baseoperator import chain, chain_linear, cross_downstream
from airflow.models.baseoperatorlink import BaseOperatorLink
from airflow.models.dag import DAG as DAGFromDag
# airflow.decorators
teardown()
# airflow.io
# # airflow.io
ObjectStoragePath()
attach()
@@ -60,6 +66,9 @@ BaseOperatorLink()
# airflow.models.dag
DAGFromDag()
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.utils.dag_parsing_context import get_parsing_context
# airflow.timetables.datasets
DatasetOrTimeSchedule()

View File

@@ -7,49 +7,71 @@ from airflow.operators.bash import BashOperator
from airflow.operators.datetime import BranchDateTimeOperator
from airflow.operators.empty import EmptyOperator
from airflow.operators.latest_only import LatestOnlyOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.operators.weekday import BranchDayOfWeekOperator
from airflow.sensors.date_time import DateTimeSensor
FSHook()
PackageIndexHook()
SubprocessHook()
BashOperator()
BranchDateTimeOperator()
TriggerDagRunOperator()
EmptyOperator()
LatestOnlyOperator()
BranchDayOfWeekOperator()
DateTimeSensor()
from airflow.operators.python import (
BranchPythonOperator,
PythonOperator,
PythonVirtualenvOperator,
ShortCircuitOperator,
)
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.operators.weekday import BranchDayOfWeekOperator
from airflow.sensors.date_time import DateTimeSensor, DateTimeSensorAsync
from airflow.sensors.date_time import DateTimeSensorAsync
from airflow.sensors.external_task import (
ExternalTaskMarker,
ExternalTaskSensor,
)
from airflow.sensors.time_sensor import (
TimeSensor,
TimeSensorAsync,
)
from airflow.sensors.filesystem import FileSensor
from airflow.sensors.time_delta import TimeDeltaSensor, TimeDeltaSensorAsync
from airflow.sensors.time_sensor import TimeSensor, TimeSensorAsync
from airflow.sensors.weekday import DayOfWeekSensor
from airflow.triggers.external_task import DagStateTrigger, WorkflowTrigger
from airflow.triggers.file import FileTrigger
from airflow.triggers.temporal import DateTimeTrigger, TimeDeltaTrigger
FSHook()
PackageIndexHook()
SubprocessHook()
BashOperator()
BranchDateTimeOperator()
TriggerDagRunOperator()
EmptyOperator()
LatestOnlyOperator()
(
BranchPythonOperator(),
PythonOperator(),
PythonVirtualenvOperator(),
ShortCircuitOperator(),
)
BranchDayOfWeekOperator()
DateTimeSensor(), DateTimeSensorAsync()
ExternalTaskMarker(), ExternalTaskSensor()
BranchPythonOperator()
PythonOperator()
PythonVirtualenvOperator()
ShortCircuitOperator()
DateTimeSensorAsync()
ExternalTaskMarker()
ExternalTaskSensor()
FileSensor()
TimeSensor(), TimeSensorAsync()
TimeDeltaSensor(), TimeDeltaSensorAsync()
TimeSensor()
TimeSensorAsync()
from airflow.sensors.time_delta import (
TimeDeltaSensor,
TimeDeltaSensorAsync,
)
from airflow.sensors.weekday import DayOfWeekSensor
from airflow.triggers.external_task import (
DagStateTrigger,
WorkflowTrigger,
)
from airflow.triggers.file import FileTrigger
from airflow.triggers.temporal import (
DateTimeTrigger,
TimeDeltaTrigger,
)
TimeDeltaSensor()
TimeDeltaSensorAsync()
DayOfWeekSensor()
DagStateTrigger(), WorkflowTrigger()
DagStateTrigger()
WorkflowTrigger()
FileTrigger()
DateTimeTrigger(), TimeDeltaTrigger()
DateTimeTrigger()
TimeDeltaTrigger()

View File

@@ -178,3 +178,38 @@ async def unknown_1(other: str = Depends(unknown_unresolved)): ...
async def unknown_2(other: str = Depends(unknown_not_function)): ...
@app.get("/things/{thing_id}")
async def unknown_3(other: str = Depends(unknown_imported)): ...
# Class dependencies
from pydantic import BaseModel
from dataclasses import dataclass
class PydanticParams(BaseModel):
my_id: int
class InitParams:
def __init__(self, my_id: int):
self.my_id = my_id
# Errors
@app.get("/{id}")
async def get_id_pydantic_full(
params: Annotated[PydanticParams, Depends(PydanticParams)],
): ...
@app.get("/{id}")
async def get_id_pydantic_short(params: Annotated[PydanticParams, Depends()]): ...
@app.get("/{id}")
async def get_id_init_not_annotated(params = Depends(InitParams)): ...
# No errors
@app.get("/{my_id}")
async def get_id_pydantic_full(
params: Annotated[PydanticParams, Depends(PydanticParams)],
): ...
@app.get("/{my_id}")
async def get_id_pydantic_short(params: Annotated[PydanticParams, Depends()]): ...
@app.get("/{my_id}")
async def get_id_init_not_annotated(params = Depends(InitParams)): ...

View File

@@ -22,3 +22,8 @@ def my_func():
# Implicit string concatenation
"0.0.0.0" f"0.0.0.0{expr}0.0.0.0"
# t-strings - all ok
t"0.0.0.0"
"0.0.0.0" t"0.0.0.0{expr}0.0.0.0"
"0.0.0.0" f"0.0.0.0{expr}0.0.0.0" t"0.0.0.0{expr}0.0.0.0"

View File

@@ -40,3 +40,7 @@ with tempfile.TemporaryDirectory(dir="/dev/shm") as d:
with TemporaryDirectory(dir="/tmp") as d:
pass
# ok (runtime error from t-string)
with open(t"/foo/bar", "w") as f:
f.write("def")

View File

@@ -169,3 +169,13 @@ query60 = f"""
# https://github.com/astral-sh/ruff/issues/17967
query61 = f"SELECT * FROM table" # skip expressionless f-strings
# t-strings
query62 = t"SELECT * FROM table"
query63 = t"""
SELECT *,
foo
FROM ({user_input}) raw
"""
query64 = f"update {t"{table}"} set var = {t"{var}"}"
query65 = t"update {f"{table}"} set var = {f"{var}"}"
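The queries above feed a rule that flags SQL assembled by string interpolation. A minimal sketch of the safe alternative it pushes toward, using the stdlib `sqlite3` driver with placeholders (table and column names here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (var TEXT)")
conn.execute("INSERT INTO t VALUES ('old')")

new_value = "new"
# Instead of f"UPDATE t SET var = {new_value}" (string interpolation,
# which is what gets flagged), pass values via DB-API placeholders:
conn.execute("UPDATE t SET var = ?", (new_value,))
print(conn.execute("SELECT var FROM t").fetchone()[0])
```

The placeholder form keeps user input out of the SQL text entirely, which is why expressionless f-strings and t-strings need separate handling in the rule.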

View File

@@ -72,3 +72,5 @@ def not_warnings_dot_deprecated(
@not_warnings_dot_deprecated("Not warnings.deprecated, so this one *should* lead to PYI053 in a stub!")
def not_a_deprecated_function() -> None: ...
baz: str = t"51 character stringgggggggggggggggggggggggggggggggg"

View File

@@ -80,3 +80,7 @@ x: TypeAlias = Literal["fooooooooooooooooooooooooooooooooooooooooooooooooooooooo
# Ok
y: TypeAlias = Annotated[int, "metadataaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"]
ttoo: str = t"50 character stringggggggggggggggggggggggggggggggg" # OK
tbar: str = t"51 character stringgggggggggggggggggggggggggggggggg" # Error: PYI053

View File

@@ -39,3 +39,27 @@ f'\'normal\' {f'nested'} normal' # Q003
f'\'normal\' {f'nested'} "double quotes"'
f'\'normal\' {f'\'nested\' {'other'} normal'} "double quotes"' # Q003
f'\'normal\' {f'\'nested\' {'other'} "double quotes"'} normal' # Q00l
# Same as above, but with t-strings
t'This is a \'string\'' # Q003
t'This is \\ a \\\'string\'' # Q003
t'"This" is a \'string\''
f"This is a 'string'"
f"\"This\" is a 'string'"
fr'This is a \'string\''
fR'This is a \'string\''
foo = (
t'This is a'
t'\'string\'' # Q003
)
t'\'foo\' {'nested'}' # Q003
t'\'foo\' {t'nested'}' # Q003
t'\'foo\' {t'\'nested\''} \'\'' # Q003
t'normal {t'nested'} normal'
t'\'normal\' {t'nested'} normal' # Q003
t'\'normal\' {t'nested'} "double quotes"'
t'\'normal\' {t'\'nested\' {'other'} normal'} "double quotes"' # Q003
t'\'normal\' {t'\'nested\' {'other'} "double quotes"'} normal' # Q00l
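The Q003 cases above all reduce to one observation: escaping the delimiter is avoidable when the other quote style is free. A trivial runnable illustration:

```python
# Flagged: escaping the delimiter inside the string...
s1 = 'This is a \'string\''
# ...when switching the outer quote style removes the escapes entirely:
s2 = "This is a 'string'"
assert s1 == s2
print(s2)
```

The nested f-string and t-string fixtures probe the harder part: whether the *inner* string can change quote style without colliding with the quotes of the string it is interpolated into.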

View File

@@ -37,3 +37,25 @@ f"\"normal\" {f"nested"} normal" # Q003
f"\"normal\" {f"nested"} 'single quotes'"
f"\"normal\" {f"\"nested\" {"other"} normal"} 'single quotes'" # Q003
f"\"normal\" {f"\"nested\" {"other"} 'single quotes'"} normal" # Q003
# Same as above, but with t-strings
t"This is a \"string\""
t"'This' is a \"string\""
f'This is a "string"'
f'\'This\' is a "string"'
fr"This is a \"string\""
fR"This is a \"string\""
foo = (
t"This is a"
t"\"string\""
)
t"\"foo\" {"foo"}" # Q003
t"\"foo\" {t"foo"}" # Q003
t"\"foo\" {t"\"foo\""} \"\"" # Q003
t"normal {t"nested"} normal"
t"\"normal\" {t"nested"} normal" # Q003
t"\"normal\" {t"nested"} 'single quotes'"
t"\"normal\" {t"\"nested\" {"other"} normal"} 'single quotes'" # Q003
t"\"normal\" {t"\"nested\" {"other"} 'single quotes'"} normal" # Q003

View File

@@ -266,3 +266,15 @@ def f():
result = list() # this should be replaced with a comprehension
for i in values:
result.append(i + 1) # PERF401
def f():
src = [1]
dst = []
for i in src:
if True if True else False:
dst.append(i)
for i in src:
if lambda: 0:
dst.append(i)
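The new PERF401 cases above use deliberately odd conditions (a conditional expression, a lambda) to test when the comprehension fix still applies. The baseline rewrite the rule suggests is a sketch like this:

```python
values = [1, 2, 3]

# Loop-and-append version (what PERF401 flags):
result = []
for i in values:
    result.append(i + 1)

# Equivalent list comprehension the fix suggests:
result2 = [i + 1 for i in values]
assert result == result2
print(result2)
```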

View File

@@ -151,3 +151,16 @@ def foo():
result = {}
for idx, name in indices, fruit:
result[name] = idx # PERF403
def foo():
src = (("x", 1),)
dst = {}
for k, v in src:
if True if True else False:
dst[k] = v
for k, v in src:
if lambda: 0:
dst[k] = v
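PERF403 is the dict-building analogue of the case above. A sketch of the unconditional form and the equivalents a fix can offer:

```python
src = [("x", 1), ("y", 2)]

# Loop-and-assign version (what PERF403 flags):
dst = {}
for k, v in src:
    dst[k] = v

# Equivalent forms the fix can suggest:
assert dst == {k: v for k, v in src}
assert dst == dict(src)
print(dst)
```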

View File

@@ -1,4 +1,4 @@
# Same as `W605_0.py` but using f-strings instead.
# Same as `W605_0.py` but using f-strings and t-strings instead.
#: W605:1:10
regex = f'\.png$'
@@ -66,3 +66,72 @@ s = f"TOTAL: {total}\nOK: {ok}\INCOMPLETE: {incomplete}\n"
# Debug text (should trigger)
t = f"{'\InHere'=}"
#: W605:1:10
regex = t'\.png$'
#: W605:2:1
regex = t'''
\.png$
'''
#: W605:2:6
f(
t'\_'
)
#: W605:4:6
t"""
multi-line
literal
with \_ somewhere
in the middle
"""
#: W605:1:38
value = t'new line\nand invalid escape \_ here'
#: Okay
regex = fr'\.png$'
regex = t'\\.png$'
regex = fr'''
\.png$
'''
regex = fr'''
\\.png$
'''
s = t'\\'
regex = t'\w' # noqa
regex = t'''
\w
''' # noqa
regex = t'\\\_'
value = t'\{{1}}'
value = t'\{1}'
value = t'{1:\}'
value = t"{t"\{1}"}"
value = rt"{t"\{1}"}"
# Okay
value = rt'\{{1}}'
value = rt'\{1}'
value = rt'{1:\}'
value = t"{rt"\{1}"}"
# Regression tests for https://github.com/astral-sh/ruff/issues/10434
t"{{}}+-\d"
t"\n{{}}+-\d+"
t"\n{{}}<7D>+-\d+"
# See https://github.com/astral-sh/ruff/issues/11491
total = 10
ok = 7
incomplete = 3
s = t"TOTAL: {total}\nOK: {ok}\INCOMPLETE: {incomplete}\n"
# Debug text (should trigger)
t = t"{'\InHere'=}"
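Behind these fixtures is CPython's own behavior: compiling a literal with an unrecognized escape such as `\d` emits a warning (`SyntaxWarning` on 3.12+, `DeprecationWarning` on earlier versions), and the fix is a raw string or a doubled backslash. A quick demonstration with plain strings:

```python
import warnings

# Source text containing the invalid escape \d inside a string literal.
src = "regex = '\\d'"

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    compile(src, "<example>", "exec")

# CPython warns about the invalid escape at compile time.
assert any(
    issubclass(w.category, (SyntaxWarning, DeprecationWarning)) for w in caught
)

# Either W605 fix yields the same two-character string: backslash + d.
assert r"\d" == "\\d"
```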

View File

@@ -110,6 +110,8 @@ from typing_extensions import CapsuleType
# UP035 on py313+ only
from typing_extensions import deprecated
# UP035 on py313+ only
from typing_extensions import get_type_hints
# https://github.com/astral-sh/ruff/issues/15780
from typing_extensions import is_typeddict

View File

@@ -102,3 +102,6 @@ with open("furb129.py") as f:
pass
for line in(f).readlines():
pass
# Test case for issue #17683 (missing space before keyword)
print([line for line in f.readlines()if True])
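FURB129 rests on files being directly iterable: `for line in f` yields the same lines as `for line in f.readlines()` without materializing a list first. A self-contained check using `io.StringIO` as a stand-in file:

```python
import io

f = io.StringIO("a\nb\n")

# The readlines() form flagged by FURB129 ...
via_readlines = [line for line in f.readlines()]

# ... versus iterating the file object directly, as the rule suggests.
f.seek(0)
direct = [line for line in f]

assert via_readlines == direct == ["a\n", "b\n"]
```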

View File

@@ -0,0 +1,53 @@
# Errors.
if 1 in set([1]):
print("Single-element set")
if 1 in set((1,)):
print("Single-element set")
if 1 in set({1}):
print("Single-element set")
if 1 in frozenset([1]):
print("Single-element set")
if 1 in frozenset((1,)):
print("Single-element set")
if 1 in frozenset({1}):
print("Single-element set")
if 1 in set(set([1])):
print('Recursive solution')
# Non-errors.
if 1 in set((1, 2)):
pass
if 1 in set([1, 2]):
pass
if 1 in set({1, 2}):
pass
if 1 in frozenset((1, 2)):
pass
if 1 in frozenset([1, 2]):
pass
if 1 in frozenset({1, 2}):
pass
if 1 in set(1,):
pass
if 1 in set(1,2):
pass
if 1 in set((x for x in range(2))):
pass
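The new fixture's "errors" all reduce to the same observation: membership in a single-element `set` or `frozenset` is just an equality check, so wrapping the element in a collection buys nothing. Sketch:

```python
x = 1

# All of these are equivalent ways of asking `x == 1`.
assert (x in set([1])) == (x in frozenset((1,))) == (x in {1}) == (x == 1)

# With two or more members, a set membership test is idiomatic and stays
# un-flagged, matching the fixture's non-error cases.
assert 3 in {1, 2, 3}
```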

View File

@@ -37,8 +37,8 @@ use ruff_python_ast::str::Quote;
use ruff_python_ast::visitor::{Visitor, walk_except_handler, walk_pattern};
use ruff_python_ast::{
self as ast, AnyParameterRef, ArgOrKeyword, Comprehension, ElifElseClause, ExceptHandler, Expr,
ExprContext, FStringElement, Keyword, MatchCase, ModModule, Parameter, Parameters, Pattern,
PythonVersion, Stmt, Suite, UnaryOp,
ExprContext, InterpolatedStringElement, Keyword, MatchCase, ModModule, Parameter, Parameters,
Pattern, PythonVersion, Stmt, Suite, UnaryOp,
};
use ruff_python_ast::{PySourceType, helpers, str, visitor};
use ruff_python_codegen::{Generator, Stylist};
@@ -57,7 +57,7 @@ use ruff_python_semantic::{
};
use ruff_python_stdlib::builtins::{MAGIC_GLOBALS, python_builtins};
use ruff_python_trivia::CommentRanges;
use ruff_source_file::{OneIndexed, SourceRow};
use ruff_source_file::{OneIndexed, SourceFile, SourceRow};
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::checkers::ast::annotation::AnnotationContext;
@@ -65,7 +65,7 @@ use crate::docstrings::extraction::ExtractionTarget;
use crate::importer::{ImportRequest, Importer, ResolutionError};
use crate::noqa::NoqaMapping;
use crate::package::PackageRoot;
use crate::preview::{is_semantic_errors_enabled, is_undefined_export_in_dunder_init_enabled};
use crate::preview::is_undefined_export_in_dunder_init_enabled;
use crate::registry::{AsRule, Rule};
use crate::rules::pyflakes::rules::{
LateFutureImport, ReturnOutsideFunction, YieldOutsideFunction,
@@ -73,7 +73,7 @@ use crate::rules::pyflakes::rules::{
use crate::rules::pylint::rules::{AwaitOutsideAsync, LoadBeforeGlobalDeclaration};
use crate::rules::{flake8_pyi, flake8_type_checking, pyflakes, pyupgrade};
use crate::settings::{LinterSettings, TargetVersion, flags};
use crate::{Diagnostic, Edit, Violation};
use crate::{Edit, OldDiagnostic, Violation};
use crate::{Locator, docstrings, noqa};
mod analyze;
@@ -224,8 +224,6 @@ pub(crate) struct Checker<'a> {
visit: deferred::Visit<'a>,
/// A set of deferred nodes to be analyzed after the AST traversal (e.g., `for` loops).
analyze: deferred::Analyze,
/// The cumulative set of diagnostics computed across all lint rules.
diagnostics: RefCell<Vec<Diagnostic>>,
/// The list of names already seen by flake8-bugbear diagnostics, to avoid duplicate violations.
flake8_bugbear_seen: RefCell<FxHashSet<TextRange>>,
/// The end offset of the last visited statement.
@@ -239,6 +237,7 @@ pub(crate) struct Checker<'a> {
semantic_checker: SemanticSyntaxChecker,
/// Errors collected by the `semantic_checker`.
semantic_errors: RefCell<Vec<SemanticSyntaxError>>,
context: &'a LintContext<'a>,
}
impl<'a> Checker<'a> {
@@ -259,6 +258,7 @@ impl<'a> Checker<'a> {
cell_offsets: Option<&'a CellOffsets>,
notebook_index: Option<&'a NotebookIndex>,
target_version: TargetVersion,
context: &'a LintContext<'a>,
) -> Checker<'a> {
let semantic = SemanticModel::new(&settings.typing_modules, path, module);
Self {
@@ -279,7 +279,6 @@ impl<'a> Checker<'a> {
semantic,
visit: deferred::Visit::default(),
analyze: deferred::Analyze::default(),
diagnostics: RefCell::default(),
flake8_bugbear_seen: RefCell::default(),
cell_offsets,
notebook_index,
@@ -288,6 +287,7 @@ impl<'a> Checker<'a> {
target_version,
semantic_checker: SemanticSyntaxChecker::new(),
semantic_errors: RefCell::default(),
context,
}
}
}
@@ -338,6 +338,7 @@ impl<'a> Checker<'a> {
ast::BytesLiteralFlags::empty().with_quote_style(self.preferred_quote())
}
// TODO(dylan) add similar method for t-strings
/// Return the default f-string flags a generated `FString` node should use, given where we are
/// in the AST.
pub(crate) fn default_fstring_flags(&self) -> ast::FStringFlags {
@@ -389,10 +390,7 @@ impl<'a> Checker<'a> {
kind: T,
range: TextRange,
) -> DiagnosticGuard<'chk, 'a> {
DiagnosticGuard {
checker: self,
diagnostic: Some(Diagnostic::new(kind, range)),
}
self.context.report_diagnostic(kind, range)
}
/// Return a [`DiagnosticGuard`] for reporting a diagnostic if the corresponding rule is
@@ -405,15 +403,8 @@ impl<'a> Checker<'a> {
kind: T,
range: TextRange,
) -> Option<DiagnosticGuard<'chk, 'a>> {
let diagnostic = Diagnostic::new(kind, range);
if self.enabled(diagnostic.rule()) {
Some(DiagnosticGuard {
checker: self,
diagnostic: Some(diagnostic),
})
} else {
None
}
self.context
.report_diagnostic_if_enabled(kind, range, self.settings)
}
/// Adds a [`TextRange`] to the set of ranges of variable names
@@ -672,9 +663,7 @@ impl SemanticSyntaxContext for Checker<'_> {
| SemanticSyntaxErrorKind::AsyncComprehensionInSyncComprehension(_)
| SemanticSyntaxErrorKind::DuplicateParameter(_)
| SemanticSyntaxErrorKind::NonlocalDeclarationAtModuleLevel => {
if is_semantic_errors_enabled(self.settings) {
self.semantic_errors.borrow_mut().push(error);
}
self.semantic_errors.borrow_mut().push(error);
}
}
}
@@ -1907,6 +1896,10 @@ impl<'a> Visitor<'a> for Checker<'a> {
self.semantic.flags |= SemanticModelFlags::F_STRING;
visitor::walk_expr(self, expr);
}
Expr::TString(_) => {
self.semantic.flags |= SemanticModelFlags::T_STRING;
visitor::walk_expr(self, expr);
}
Expr::Named(ast::ExprNamed {
target,
value,
@@ -1940,6 +1933,7 @@ impl<'a> Visitor<'a> for Checker<'a> {
}
Expr::BytesLiteral(bytes_literal) => analyze::string_like(bytes_literal.into(), self),
Expr::FString(f_string) => analyze::string_like(f_string.into(), self),
Expr::TString(t_string) => analyze::string_like(t_string.into(), self),
_ => {}
}
@@ -2129,12 +2123,15 @@ impl<'a> Visitor<'a> for Checker<'a> {
}
}
fn visit_f_string_element(&mut self, f_string_element: &'a FStringElement) {
fn visit_interpolated_string_element(
&mut self,
interpolated_string_element: &'a InterpolatedStringElement,
) {
let snapshot = self.semantic.flags;
if f_string_element.is_expression() {
self.semantic.flags |= SemanticModelFlags::F_STRING_REPLACEMENT_FIELD;
if interpolated_string_element.is_interpolation() {
self.semantic.flags |= SemanticModelFlags::INTERPOLATED_STRING_REPLACEMENT_FIELD;
}
visitor::walk_f_string_element(self, f_string_element);
visitor::walk_interpolated_string_element(self, interpolated_string_element);
self.semantic.flags = snapshot;
}
}
@@ -2891,30 +2888,26 @@ impl<'a> Checker<'a> {
} else {
if self.semantic.global_scope().uses_star_imports() {
if self.enabled(Rule::UndefinedLocalWithImportStarUsage) {
self.diagnostics.get_mut().push(
Diagnostic::new(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: name.to_string(),
},
range,
)
.with_parent(definition.start()),
);
self.report_diagnostic(
pyflakes::rules::UndefinedLocalWithImportStarUsage {
name: name.to_string(),
},
range,
)
.set_parent(definition.start());
}
} else {
if self.enabled(Rule::UndefinedExport) {
if is_undefined_export_in_dunder_init_enabled(self.settings)
|| !self.path.ends_with("__init__.py")
{
self.diagnostics.get_mut().push(
Diagnostic::new(
pyflakes::rules::UndefinedExport {
name: name.to_string(),
},
range,
)
.with_parent(definition.start()),
);
self.report_diagnostic(
pyflakes::rules::UndefinedExport {
name: name.to_string(),
},
range,
)
.set_parent(definition.start());
}
}
}
@@ -2975,7 +2968,8 @@ pub(crate) fn check_ast(
cell_offsets: Option<&CellOffsets>,
notebook_index: Option<&NotebookIndex>,
target_version: TargetVersion,
) -> (Vec<Diagnostic>, Vec<SemanticSyntaxError>) {
context: &LintContext,
) -> Vec<SemanticSyntaxError> {
let module_path = package
.map(PackageRoot::path)
.and_then(|package| to_module_path(package, path));
@@ -3015,6 +3009,7 @@ pub(crate) fn check_ast(
cell_offsets,
notebook_index,
target_version,
context,
);
checker.bind_builtins();
@@ -3041,12 +3036,83 @@ pub(crate) fn check_ast(
analyze::deferred_scopes(&checker);
let Checker {
diagnostics,
semantic_errors,
..
semantic_errors, ..
} = checker;
(diagnostics.into_inner(), semantic_errors.into_inner())
semantic_errors.into_inner()
}
/// A type for collecting diagnostics in a given file.
///
/// [`LintContext::report_diagnostic`] can be used to obtain a [`DiagnosticGuard`], which will push
/// an [`OldDiagnostic`] (built from the given [`Violation`]) into the contained collection on `Drop`.
pub(crate) struct LintContext<'a> {
diagnostics: RefCell<Vec<OldDiagnostic>>,
source_file: &'a SourceFile,
}
impl<'a> LintContext<'a> {
/// Create a new context with the given `source_file` and an empty collection of
/// `OldDiagnostic`s.
pub(crate) fn new(source_file: &'a SourceFile) -> Self {
Self {
diagnostics: RefCell::default(),
source_file,
}
}
/// Return a [`DiagnosticGuard`] for reporting a diagnostic.
///
/// The guard derefs to an [`OldDiagnostic`], so it can be used to further modify the diagnostic
/// before it is added to the collection in the context on `Drop`.
pub(crate) fn report_diagnostic<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
) -> DiagnosticGuard<'chk, 'a> {
DiagnosticGuard {
context: self,
diagnostic: Some(OldDiagnostic::new(kind, range, self.source_file)),
}
}
/// Return a [`DiagnosticGuard`] for reporting a diagnostic if the corresponding rule is
/// enabled.
///
/// Prefer [`LintContext::report_diagnostic`] in general because the conversion from an
/// `OldDiagnostic` to a `Rule` is somewhat expensive.
pub(crate) fn report_diagnostic_if_enabled<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
settings: &LinterSettings,
) -> Option<DiagnosticGuard<'chk, 'a>> {
let diagnostic = OldDiagnostic::new(kind, range, self.source_file);
if settings.rules.enabled(diagnostic.rule()) {
Some(DiagnosticGuard {
context: self,
diagnostic: Some(diagnostic),
})
} else {
None
}
}
pub(crate) fn into_diagnostics(self) -> Vec<OldDiagnostic> {
self.diagnostics.into_inner()
}
pub(crate) fn is_empty(&self) -> bool {
self.diagnostics.borrow().is_empty()
}
pub(crate) fn as_mut_vec(&mut self) -> &mut Vec<OldDiagnostic> {
self.diagnostics.get_mut()
}
pub(crate) fn iter(&mut self) -> impl Iterator<Item = &OldDiagnostic> {
self.diagnostics.get_mut().iter()
}
}
/// An abstraction for mutating a diagnostic.
@@ -3058,11 +3124,11 @@ pub(crate) fn check_ast(
/// adding fixes or parent ranges.
pub(crate) struct DiagnosticGuard<'a, 'b> {
/// The parent context that will receive the diagnostic on `Drop`.
checker: &'a Checker<'b>,
context: &'a LintContext<'b>,
/// The diagnostic that we want to report.
///
/// This is always `Some` until the `Drop` (or `defuse`) call.
diagnostic: Option<Diagnostic>,
diagnostic: Option<OldDiagnostic>,
}
impl DiagnosticGuard<'_, '_> {
@@ -3076,9 +3142,9 @@ impl DiagnosticGuard<'_, '_> {
}
impl std::ops::Deref for DiagnosticGuard<'_, '_> {
type Target = Diagnostic;
type Target = OldDiagnostic;
fn deref(&self) -> &Diagnostic {
fn deref(&self) -> &OldDiagnostic {
// OK because `self.diagnostic` is only `None` within `Drop`.
self.diagnostic.as_ref().unwrap()
}
@@ -3086,7 +3152,7 @@ impl std::ops::Deref for DiagnosticGuard<'_, '_> {
/// Return a mutable borrow of the diagnostic in this guard.
impl std::ops::DerefMut for DiagnosticGuard<'_, '_> {
fn deref_mut(&mut self) -> &mut Diagnostic {
fn deref_mut(&mut self) -> &mut OldDiagnostic {
// OK because `self.diagnostic` is only `None` within `Drop`.
self.diagnostic.as_mut().unwrap()
}
@@ -3100,7 +3166,7 @@ impl Drop for DiagnosticGuard<'_, '_> {
}
if let Some(diagnostic) = self.diagnostic.take() {
self.checker.diagnostics.borrow_mut().push(diagnostic);
self.context.diagnostics.borrow_mut().push(diagnostic);
}
}
}
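The refactor above routes every diagnostic through a guard that derefs to the diagnostic for mutation and pushes it into the shared `LintContext` when dropped. Rust gets the push-for-free from `Drop`; purely as an illustration of the flow (not ruff's API, and Python has no deterministic drop), a context manager can mimic it:

```python
from contextlib import contextmanager

class LintContext:
    """Loose analogue of the Rust `LintContext`: collects diagnostics per file."""
    def __init__(self):
        self.diagnostics = []

    @contextmanager
    def report_diagnostic(self, kind, range_):
        diag = {"kind": kind, "range": range_, "parent": None}
        try:
            yield diag  # caller may still mutate the diagnostic here
        finally:
            self.diagnostics.append(diag)  # pushed on scope exit, like Drop

ctx = LintContext()
with ctx.report_diagnostic("undefined-export", (10, 20)) as diag:
    diag["parent"] = 5  # mirrors `.set_parent(definition.start())` in the diff

assert ctx.diagnostics == [
    {"kind": "undefined-export", "range": (10, 20), "parent": 5}
]
```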

View File

@@ -3,8 +3,8 @@ use std::path::Path;
use ruff_python_ast::PythonVersion;
use ruff_python_trivia::CommentRanges;
use crate::Diagnostic;
use crate::Locator;
use crate::checkers::ast::LintContext;
use crate::package::PackageRoot;
use crate::preview::is_allow_nested_roots_enabled;
use crate::registry::Rule;
@@ -20,13 +20,12 @@ pub(crate) fn check_file_path(
comment_ranges: &CommentRanges,
settings: &LinterSettings,
target_version: PythonVersion,
) -> Vec<Diagnostic> {
let mut diagnostics: Vec<Diagnostic> = vec![];
context: &LintContext,
) {
// flake8-no-pep420
if settings.rules.enabled(Rule::ImplicitNamespacePackage) {
let allow_nested_roots = is_allow_nested_roots_enabled(settings);
if let Some(diagnostic) = implicit_namespace_package(
implicit_namespace_package(
path,
package,
locator,
@@ -34,26 +33,17 @@ pub(crate) fn check_file_path(
&settings.project_root,
&settings.src,
allow_nested_roots,
) {
diagnostics.push(diagnostic);
}
context,
);
}
// pep8-naming
if settings.rules.enabled(Rule::InvalidModuleName) {
if let Some(diagnostic) =
invalid_module_name(path, package, &settings.pep8_naming.ignore_names)
{
diagnostics.push(diagnostic);
}
invalid_module_name(path, package, &settings.pep8_naming.ignore_names, context);
}
// flake8-builtins
if settings.rules.enabled(Rule::StdlibModuleShadowing) {
if let Some(diagnostic) = stdlib_module_shadowing(path, settings, target_version) {
diagnostics.push(diagnostic);
}
stdlib_module_shadowing(path, settings, target_version, context);
}
diagnostics
}

View File

@@ -7,7 +7,6 @@ use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::Parsed;
use crate::Diagnostic;
use crate::Locator;
use crate::directives::IsortDirectives;
use crate::package::PackageRoot;
@@ -16,6 +15,8 @@ use crate::rules::isort;
use crate::rules::isort::block::{Block, BlockBuilder};
use crate::settings::LinterSettings;
use super::ast::LintContext;
#[expect(clippy::too_many_arguments)]
pub(crate) fn check_imports(
parsed: &Parsed<ModModule>,
@@ -28,7 +29,8 @@ pub(crate) fn check_imports(
source_type: PySourceType,
cell_offsets: Option<&CellOffsets>,
target_version: PythonVersion,
) -> Vec<Diagnostic> {
context: &LintContext,
) {
// Extract all import blocks from the AST.
let tracker = {
let mut tracker =
@@ -40,11 +42,10 @@ pub(crate) fn check_imports(
let blocks: Vec<&Block> = tracker.iter().collect();
// Enforce import rules.
let mut diagnostics = vec![];
if settings.rules.enabled(Rule::UnsortedImports) {
for block in &blocks {
if !block.imports.is_empty() {
if let Some(diagnostic) = isort::rules::organize_imports(
isort::rules::organize_imports(
block,
locator,
stylist,
@@ -54,21 +55,19 @@ pub(crate) fn check_imports(
source_type,
parsed.tokens(),
target_version,
) {
diagnostics.push(diagnostic);
}
context,
);
}
}
}
if settings.rules.enabled(Rule::MissingRequiredImport) {
diagnostics.extend(isort::rules::add_required_imports(
isort::rules::add_required_imports(
parsed,
locator,
stylist,
settings,
source_type,
));
context,
);
}
diagnostics
}

View File

@@ -4,10 +4,8 @@ use ruff_python_parser::{TokenKind, Tokens};
use ruff_source_file::LineRanges;
use ruff_text_size::{Ranged, TextRange};
use crate::Diagnostic;
use crate::Locator;
use crate::line_width::IndentWidth;
use crate::registry::{AsRule, Rule};
use crate::registry::Rule;
use crate::rules::pycodestyle::rules::logical_lines::{
LogicalLines, TokenFlags, extraneous_whitespace, indentation, missing_whitespace,
missing_whitespace_after_keyword, missing_whitespace_around_operator, redundant_backslash,
@@ -16,6 +14,9 @@ use crate::rules::pycodestyle::rules::logical_lines::{
whitespace_before_parameters,
};
use crate::settings::LinterSettings;
use crate::{Locator, Violation};
use super::ast::{DiagnosticGuard, LintContext};
/// Return the amount of indentation, expanding tabs to the next multiple of the settings' tab size.
pub(crate) fn expand_indent(line: &str, indent_width: IndentWidth) -> usize {
@@ -40,8 +41,9 @@ pub(crate) fn check_logical_lines(
indexer: &Indexer,
stylist: &Stylist,
settings: &LinterSettings,
) -> Vec<Diagnostic> {
let mut context = LogicalLinesContext::new(settings);
lint_context: &LintContext,
) {
let mut context = LogicalLinesContext::new(settings, lint_context);
let mut prev_line = None;
let mut prev_indent_level = None;
@@ -170,7 +172,7 @@ pub(crate) fn check_logical_lines(
let indent_size = 4;
if enforce_indentation {
for diagnostic in indentation(
indentation(
&line,
prev_line.as_ref(),
indent_char,
@@ -178,11 +180,9 @@ pub(crate) fn check_logical_lines(
prev_indent_level,
indent_size,
range,
) {
if settings.rules.enabled(diagnostic.rule()) {
context.push_diagnostic(diagnostic);
}
}
lint_context,
settings,
);
}
if !line.is_comment_only() {
@@ -190,26 +190,24 @@ pub(crate) fn check_logical_lines(
prev_indent_level = Some(indent_level);
}
}
context.diagnostics
}
#[derive(Debug, Clone)]
pub(crate) struct LogicalLinesContext<'a> {
pub(crate) struct LogicalLinesContext<'a, 'b> {
settings: &'a LinterSettings,
diagnostics: Vec<Diagnostic>,
context: &'a LintContext<'b>,
}
impl<'a> LogicalLinesContext<'a> {
fn new(settings: &'a LinterSettings) -> Self {
Self {
settings,
diagnostics: Vec::new(),
}
impl<'a, 'b> LogicalLinesContext<'a, 'b> {
fn new(settings: &'a LinterSettings, context: &'a LintContext<'b>) -> Self {
Self { settings, context }
}
pub(crate) fn push_diagnostic(&mut self, diagnostic: Diagnostic) {
if self.settings.rules.enabled(diagnostic.rule()) {
self.diagnostics.push(diagnostic);
}
pub(crate) fn report_diagnostic<'chk, T: Violation>(
&'chk self,
kind: T,
range: TextRange,
) -> Option<DiagnosticGuard<'chk, 'a>> {
self.context
.report_diagnostic_if_enabled(kind, range, self.settings)
}
}

View File

@@ -8,23 +8,23 @@ use rustc_hash::FxHashSet;
use ruff_python_trivia::CommentRanges;
use ruff_text_size::Ranged;
use crate::Locator;
use crate::fix::edits::delete_comment;
use crate::noqa::{
Code, Directive, FileExemption, FileNoqaDirectives, NoqaDirectives, NoqaMapping,
};
use crate::preview::is_check_file_level_directives_enabled;
use crate::registry::{AsRule, Rule, RuleSet};
use crate::rule_redirects::get_redirect_target;
use crate::rules::pygrep_hooks;
use crate::rules::ruff;
use crate::rules::ruff::rules::{UnusedCodes, UnusedNOQA};
use crate::settings::LinterSettings;
use crate::{Diagnostic, Edit, Fix};
use crate::{Edit, Fix, Locator};
use super::ast::LintContext;
#[expect(clippy::too_many_arguments)]
pub(crate) fn check_noqa(
diagnostics: &mut Vec<Diagnostic>,
context: &mut LintContext,
path: &Path,
locator: &Locator,
comment_ranges: &CommentRanges,
@@ -46,7 +46,7 @@ pub(crate) fn check_noqa(
let mut ignored_diagnostics = vec![];
// Remove any ignored diagnostics.
'outer: for (index, diagnostic) in diagnostics.iter().enumerate() {
'outer: for (index, diagnostic) in context.iter().enumerate() {
let rule = diagnostic.rule();
if matches!(rule, Rule::BlanketNOQA) {
@@ -111,35 +111,24 @@ pub(crate) fn check_noqa(
&& !exemption.includes(Rule::UnusedNOQA)
&& !per_file_ignores.contains(Rule::UnusedNOQA)
{
let directives: Vec<_> = if is_check_file_level_directives_enabled(settings) {
noqa_directives
.lines()
.iter()
.map(|line| (&line.directive, &line.matches, false))
.chain(
file_noqa_directives
.lines()
.iter()
.map(|line| (&line.parsed_file_exemption, &line.matches, true)),
)
.collect()
} else {
noqa_directives
.lines()
.iter()
.map(|line| (&line.directive, &line.matches, false))
.collect()
};
let directives = noqa_directives
.lines()
.iter()
.map(|line| (&line.directive, &line.matches, false))
.chain(
file_noqa_directives
.lines()
.iter()
.map(|line| (&line.parsed_file_exemption, &line.matches, true)),
);
for (directive, matches, is_file_level) in directives {
match directive {
Directive::All(directive) => {
if matches.is_empty() {
let edit = delete_comment(directive.range(), locator);
let mut diagnostic =
Diagnostic::new(UnusedNOQA { codes: None }, directive.range());
let mut diagnostic = context
.report_diagnostic(UnusedNOQA { codes: None }, directive.range());
diagnostic.set_fix(Fix::safe_edit(edit));
diagnostics.push(diagnostic);
}
}
Directive::Codes(directive) => {
@@ -159,9 +148,7 @@ pub(crate) fn check_noqa(
if seen_codes.insert(original_code) {
let is_code_used = if is_file_level {
diagnostics
.iter()
.any(|diag| diag.rule().noqa_code() == code)
context.iter().any(|diag| diag.rule().noqa_code() == code)
} else {
matches.iter().any(|match_| *match_ == code)
} || settings
@@ -212,7 +199,7 @@ pub(crate) fn check_noqa(
directive.range(),
)
};
let mut diagnostic = Diagnostic::new(
let mut diagnostic = context.report_diagnostic(
UnusedNOQA {
codes: Some(UnusedCodes {
disabled: disabled_codes
@@ -236,7 +223,6 @@ pub(crate) fn check_noqa(
directive.range(),
);
diagnostic.set_fix(Fix::safe_edit(edit));
diagnostics.push(diagnostic);
}
}
}
@@ -247,8 +233,8 @@ pub(crate) fn check_noqa(
&& !per_file_ignores.contains(Rule::RedirectedNOQA)
&& !exemption.includes(Rule::RedirectedNOQA)
{
ruff::rules::redirected_noqa(diagnostics, &noqa_directives);
ruff::rules::redirected_file_noqa(diagnostics, &file_noqa_directives);
ruff::rules::redirected_noqa(context, &noqa_directives);
ruff::rules::redirected_file_noqa(context, &file_noqa_directives);
}
if settings.rules.enabled(Rule::BlanketNOQA)
@@ -256,7 +242,7 @@ pub(crate) fn check_noqa(
&& !exemption.enumerates(Rule::BlanketNOQA)
{
pygrep_hooks::rules::blanket_noqa(
diagnostics,
context,
&noqa_directives,
locator,
&file_noqa_directives,
@@ -267,7 +253,7 @@ pub(crate) fn check_noqa(
&& !per_file_ignores.contains(Rule::InvalidRuleCode)
&& !exemption.enumerates(Rule::InvalidRuleCode)
{
ruff::rules::invalid_noqa_code(diagnostics, &noqa_directives, locator, &settings.external);
ruff::rules::invalid_noqa_code(context, &noqa_directives, locator, &settings.external);
}
ignored_diagnostics.sort_unstable();

View File

@@ -5,7 +5,6 @@ use ruff_python_index::Indexer;
use ruff_source_file::UniversalNewlines;
use ruff_text_size::TextSize;
use crate::Diagnostic;
use crate::Locator;
use crate::registry::Rule;
use crate::rules::flake8_copyright::rules::missing_copyright_notice;
@@ -17,15 +16,16 @@ use crate::rules::pylint;
use crate::rules::ruff::rules::indented_form_feed;
use crate::settings::LinterSettings;
use super::ast::LintContext;
pub(crate) fn check_physical_lines(
locator: &Locator,
stylist: &Stylist,
indexer: &Indexer,
doc_lines: &[TextSize],
settings: &LinterSettings,
) -> Vec<Diagnostic> {
let mut diagnostics: Vec<Diagnostic> = vec![];
context: &LintContext,
) {
let enforce_doc_line_too_long = settings.rules.enabled(Rule::DocLineTooLong);
let enforce_line_too_long = settings.rules.enabled(Rule::LineTooLong);
let enforce_no_newline_at_end_of_file = settings.rules.enabled(Rule::MissingNewlineAtEndOfFile);
@@ -45,54 +45,38 @@ pub(crate) fn check_physical_lines(
.is_some()
{
if enforce_doc_line_too_long {
if let Some(diagnostic) = doc_line_too_long(&line, comment_ranges, settings) {
diagnostics.push(diagnostic);
}
doc_line_too_long(&line, comment_ranges, settings, context);
}
}
if enforce_mixed_spaces_and_tabs {
if let Some(diagnostic) = mixed_spaces_and_tabs(&line) {
diagnostics.push(diagnostic);
}
mixed_spaces_and_tabs(&line, context);
}
if enforce_line_too_long {
if let Some(diagnostic) = line_too_long(&line, comment_ranges, settings) {
diagnostics.push(diagnostic);
}
line_too_long(&line, comment_ranges, settings, context);
}
if enforce_bidirectional_unicode {
diagnostics.extend(pylint::rules::bidirectional_unicode(&line));
pylint::rules::bidirectional_unicode(&line, context);
}
if enforce_trailing_whitespace || enforce_blank_line_contains_whitespace {
if let Some(diagnostic) = trailing_whitespace(&line, locator, indexer, settings) {
diagnostics.push(diagnostic);
}
trailing_whitespace(&line, locator, indexer, settings, context);
}
if settings.rules.enabled(Rule::IndentedFormFeed) {
if let Some(diagnostic) = indented_form_feed(&line) {
diagnostics.push(diagnostic);
}
indented_form_feed(&line, context);
}
}
if enforce_no_newline_at_end_of_file {
if let Some(diagnostic) = no_newline_at_end_of_file(locator, stylist) {
diagnostics.push(diagnostic);
}
no_newline_at_end_of_file(locator, stylist, context);
}
if enforce_copyright_notice {
if let Some(diagnostic) = missing_copyright_notice(locator, settings) {
diagnostics.push(diagnostic);
}
missing_copyright_notice(locator, settings, context);
}
diagnostics
}
#[cfg(test)]
@@ -100,8 +84,10 @@ mod tests {
use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::parse_module;
use ruff_source_file::SourceFileBuilder;
use crate::Locator;
use crate::checkers::ast::LintContext;
use crate::line_width::LineLength;
use crate::registry::Rule;
use crate::rules::pycodestyle;
@@ -118,6 +104,8 @@ mod tests {
let stylist = Stylist::from_tokens(parsed.tokens(), locator.contents());
let check_with_max_line_length = |line_length: LineLength| {
let source_file = SourceFileBuilder::new("<filename>", line).finish();
let diagnostics = LintContext::new(&source_file);
check_physical_lines(
&locator,
&stylist,
@@ -130,7 +118,9 @@ mod tests {
},
..LinterSettings::for_rule(Rule::LineTooLong)
},
)
&diagnostics,
);
diagnostics.into_diagnostics()
};
let line_length = LineLength::try_from(8).unwrap();
assert_eq!(check_with_max_line_length(line_length), vec![]);

View File

@@ -8,7 +8,6 @@ use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::Tokens;
use crate::Diagnostic;
use crate::Locator;
use crate::directives::TodoComment;
use crate::registry::{AsRule, Rule};
@@ -19,6 +18,8 @@ use crate::rules::{
};
use crate::settings::LinterSettings;
use super::ast::LintContext;
#[expect(clippy::too_many_arguments)]
pub(crate) fn check_tokens(
tokens: &Tokens,
@@ -29,8 +30,8 @@ pub(crate) fn check_tokens(
settings: &LinterSettings,
source_type: PySourceType,
cell_offsets: Option<&CellOffsets>,
) -> Vec<Diagnostic> {
let mut diagnostics: Vec<Diagnostic> = vec![];
context: &mut LintContext,
) {
let comment_ranges = indexer.comment_ranges();
if settings.rules.any_enabled(&[
@@ -41,16 +42,23 @@ pub(crate) fn check_tokens(
Rule::BlankLinesAfterFunctionOrClass,
Rule::BlankLinesBeforeNestedDefinition,
]) {
BlankLinesChecker::new(locator, stylist, settings, source_type, cell_offsets)
.check_lines(tokens, &mut diagnostics);
BlankLinesChecker::new(
locator,
stylist,
settings,
source_type,
cell_offsets,
context,
)
.check_lines(tokens);
}
if settings.rules.enabled(Rule::BlanketTypeIgnore) {
pygrep_hooks::rules::blanket_type_ignore(&mut diagnostics, comment_ranges, locator);
pygrep_hooks::rules::blanket_type_ignore(context, comment_ranges, locator);
}
if settings.rules.enabled(Rule::EmptyComment) {
pylint::rules::empty_comments(&mut diagnostics, comment_ranges, locator);
pylint::rules::empty_comments(context, comment_ranges, locator);
}
if settings
@@ -58,25 +66,20 @@ pub(crate) fn check_tokens(
.enabled(Rule::AmbiguousUnicodeCharacterComment)
{
for range in comment_ranges {
ruff::rules::ambiguous_unicode_character_comment(
&mut diagnostics,
locator,
range,
settings,
);
ruff::rules::ambiguous_unicode_character_comment(context, locator, range, settings);
}
}
if settings.rules.enabled(Rule::CommentedOutCode) {
eradicate::rules::commented_out_code(&mut diagnostics, locator, comment_ranges, settings);
eradicate::rules::commented_out_code(context, locator, comment_ranges, settings);
}
if settings.rules.enabled(Rule::UTF8EncodingDeclaration) {
pyupgrade::rules::unnecessary_coding_comment(&mut diagnostics, locator, comment_ranges);
pyupgrade::rules::unnecessary_coding_comment(context, locator, comment_ranges);
}
if settings.rules.enabled(Rule::TabIndentation) {
pycodestyle::rules::tab_indentation(&mut diagnostics, locator, indexer);
pycodestyle::rules::tab_indentation(context, locator, indexer);
}
if settings.rules.any_enabled(&[
@@ -87,7 +90,7 @@ pub(crate) fn check_tokens(
Rule::InvalidCharacterZeroWidthSpace,
]) {
for token in tokens {
pylint::rules::invalid_string_characters(&mut diagnostics, token, locator);
pylint::rules::invalid_string_characters(context, token, locator);
}
}
@@ -97,7 +100,7 @@ pub(crate) fn check_tokens(
Rule::UselessSemicolon,
]) {
pycodestyle::rules::compound_statements(
&mut diagnostics,
context,
tokens,
locator,
indexer,
@@ -110,13 +113,7 @@ pub(crate) fn check_tokens(
Rule::SingleLineImplicitStringConcatenation,
Rule::MultiLineImplicitStringConcatenation,
]) {
flake8_implicit_str_concat::rules::implicit(
&mut diagnostics,
tokens,
locator,
indexer,
settings,
);
flake8_implicit_str_concat::rules::implicit(context, tokens, locator, indexer, settings);
}
if settings.rules.any_enabled(&[
@@ -124,15 +121,15 @@ pub(crate) fn check_tokens(
Rule::TrailingCommaOnBareTuple,
Rule::ProhibitedTrailingComma,
]) {
flake8_commas::rules::trailing_commas(&mut diagnostics, tokens, locator, indexer);
flake8_commas::rules::trailing_commas(context, tokens, locator, indexer);
}
if settings.rules.enabled(Rule::ExtraneousParentheses) {
pyupgrade::rules::extraneous_parentheses(&mut diagnostics, tokens, locator);
pyupgrade::rules::extraneous_parentheses(context, tokens, locator);
}
if source_type.is_stub() && settings.rules.enabled(Rule::TypeCommentInStub) {
flake8_pyi::rules::type_comment_in_stub(&mut diagnostics, locator, comment_ranges);
flake8_pyi::rules::type_comment_in_stub(context, locator, comment_ranges);
}
if settings.rules.any_enabled(&[
@@ -142,13 +139,7 @@ pub(crate) fn check_tokens(
Rule::ShebangNotFirstLine,
Rule::ShebangMissingPython,
]) {
flake8_executable::rules::from_tokens(
&mut diagnostics,
path,
locator,
comment_ranges,
settings,
);
flake8_executable::rules::from_tokens(context, path, locator, comment_ranges, settings);
}
if settings.rules.any_enabled(&[
@@ -172,19 +163,15 @@ pub(crate) fn check_tokens(
TodoComment::from_comment(comment, *comment_range, i)
})
.collect();
flake8_todos::rules::todos(&mut diagnostics, &todo_comments, locator, comment_ranges);
flake8_fixme::rules::todos(&mut diagnostics, &todo_comments);
flake8_todos::rules::todos(context, &todo_comments, locator, comment_ranges);
flake8_fixme::rules::todos(context, &todo_comments);
}
if settings.rules.enabled(Rule::TooManyNewlinesAtEndOfFile) {
pycodestyle::rules::too_many_newlines_at_end_of_file(
&mut diagnostics,
tokens,
cell_offsets,
);
pycodestyle::rules::too_many_newlines_at_end_of_file(context, tokens, cell_offsets);
}
diagnostics.retain(|diagnostic| settings.rules.enabled(diagnostic.rule()));
diagnostics
context
.as_mut_vec()
.retain(|diagnostic| settings.rules.enabled(diagnostic.rule()));
}

View File

@@ -4,13 +4,13 @@
/// `--select`. For pylint this is e.g. C0414 and E0118 but also C and E01.
use std::fmt::Formatter;
use strum_macros::{AsRefStr, EnumIter};
use strum_macros::EnumIter;
use crate::registry::Linter;
use crate::rule_selector::is_single_rule_selector;
use crate::rules;
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct NoqaCode(&'static str, &'static str);
impl NoqaCode {
@@ -847,7 +847,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Flake8PytestStyle, "026") => (RuleGroup::Stable, rules::flake8_pytest_style::rules::PytestUseFixturesWithoutParameters),
(Flake8PytestStyle, "027") => (RuleGroup::Stable, rules::flake8_pytest_style::rules::PytestUnittestRaisesAssertion),
(Flake8PytestStyle, "028") => (RuleGroup::Preview, rules::flake8_pytest_style::rules::PytestParameterWithDefaultArgument),
(Flake8PytestStyle, "029") => (RuleGroup::Preview, rules::flake8_pytest_style::rules::PytestWarnsWithoutWarning),
(Flake8PytestStyle, "029") => (RuleGroup::Stable, rules::flake8_pytest_style::rules::PytestWarnsWithoutWarning),
(Flake8PytestStyle, "030") => (RuleGroup::Preview, rules::flake8_pytest_style::rules::PytestWarnsTooBroad),
(Flake8PytestStyle, "031") => (RuleGroup::Preview, rules::flake8_pytest_style::rules::PytestWarnsWithMultipleStatements),
@@ -1019,7 +1019,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Ruff, "049") => (RuleGroup::Preview, rules::ruff::rules::DataclassEnum),
(Ruff, "051") => (RuleGroup::Stable, rules::ruff::rules::IfKeyInDictDel),
(Ruff, "052") => (RuleGroup::Preview, rules::ruff::rules::UsedDummyVariable),
(Ruff, "053") => (RuleGroup::Preview, rules::ruff::rules::ClassWithMixedTypeVars),
(Ruff, "053") => (RuleGroup::Stable, rules::ruff::rules::ClassWithMixedTypeVars),
(Ruff, "054") => (RuleGroup::Preview, rules::ruff::rules::IndentedFormFeed),
(Ruff, "055") => (RuleGroup::Preview, rules::ruff::rules::UnnecessaryRegularExpression),
(Ruff, "056") => (RuleGroup::Preview, rules::ruff::rules::FalsyDictGetFallback),
@@ -1129,7 +1129,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Refurb, "156") => (RuleGroup::Preview, rules::refurb::rules::HardcodedStringCharset),
(Refurb, "157") => (RuleGroup::Preview, rules::refurb::rules::VerboseDecimalConstructor),
(Refurb, "161") => (RuleGroup::Stable, rules::refurb::rules::BitCount),
(Refurb, "162") => (RuleGroup::Preview, rules::refurb::rules::FromisoformatReplaceZ),
(Refurb, "162") => (RuleGroup::Stable, rules::refurb::rules::FromisoformatReplaceZ),
(Refurb, "163") => (RuleGroup::Stable, rules::refurb::rules::RedundantLogBase),
(Refurb, "164") => (RuleGroup::Preview, rules::refurb::rules::UnnecessaryFromFloat),
(Refurb, "166") => (RuleGroup::Preview, rules::refurb::rules::IntOnSlicedStr),


@@ -1,6 +1,7 @@
use anyhow::Result;
use log::debug;
use ruff_source_file::SourceFile;
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::registry::AsRule;
@@ -8,7 +9,7 @@ use crate::violation::Violation;
use crate::{Fix, codes::Rule};
#[derive(Debug, PartialEq, Eq, Clone)]
pub struct Diagnostic {
pub struct OldDiagnostic {
/// The message body to display to the user, to explain the diagnostic.
pub body: String,
/// The message to display to the user, to explain the suggested fix.
@@ -18,15 +19,17 @@ pub struct Diagnostic {
pub parent: Option<TextSize>,
pub(crate) rule: Rule,
pub(crate) file: SourceFile,
}
impl Diagnostic {
impl OldDiagnostic {
// TODO(brent) We temporarily allow this to avoid updating all of the call sites to add
// references. I expect this method to go away or change significantly with the rest of the
// diagnostic refactor, but if it still exists in this form at the end of the refactor, we
// should just update the call sites.
#[expect(clippy::needless_pass_by_value)]
pub fn new<T: Violation>(kind: T, range: TextRange) -> Self {
pub fn new<T: Violation>(kind: T, range: TextRange, file: &SourceFile) -> Self {
Self {
body: Violation::message(&kind),
suggestion: Violation::fix_title(&kind),
@@ -34,6 +37,7 @@ impl Diagnostic {
fix: None,
parent: None,
rule: T::rule(),
file: file.clone(),
}
}
@@ -87,13 +91,13 @@ impl Diagnostic {
}
}
impl AsRule for Diagnostic {
impl AsRule for OldDiagnostic {
fn rule(&self) -> Rule {
self.rule
}
}
impl Ranged for Diagnostic {
impl Ranged for OldDiagnostic {
fn range(&self) -> TextRange {
self.range
}


@@ -600,14 +600,12 @@ mod tests {
use ruff_python_parser::{parse_expression, parse_module};
use ruff_text_size::{Ranged, TextRange, TextSize};
use crate::Locator;
use crate::codes::Rule;
use crate::fix::apply_fixes;
use crate::fix::edits::{
add_to_dunder_all, make_redundant_alias, next_stmt_break, trailing_semicolon,
};
use crate::message::Message;
use crate::{Diagnostic, Edit, Fix};
use crate::{Edit, Fix, Locator, OldDiagnostic};
/// Parse the given source using [`Mode::Module`] and return the first statement.
fn parse_first_stmt(source: &str) -> Result<Stmt> {
@@ -738,24 +736,16 @@ x = 1 \
let diag = {
use crate::rules::pycodestyle::rules::MissingNewlineAtEndOfFile;
let mut iter = edits.into_iter();
let diag = Diagnostic::new(
let diag = OldDiagnostic::new(
MissingNewlineAtEndOfFile, // The choice of rule here is arbitrary.
TextRange::default(),
&SourceFileBuilder::new("<filename>", "<code>").finish(),
)
.with_fix(Fix::safe_edits(
iter.next().ok_or(anyhow!("expected edits nonempty"))?,
iter,
));
Message::diagnostic(
diag.body,
diag.suggestion,
diag.range,
diag.fix,
diag.parent,
SourceFileBuilder::new("<filename>", "<code>").finish(),
None,
Rule::MissingNewlineAtEndOfFile,
)
Message::from_diagnostic(diag, None)
};
assert_eq!(apply_fixes([diag].iter(), &locator).code, expect);
Ok(())


@@ -1,7 +1,7 @@
use std::collections::BTreeSet;
use itertools::Itertools;
use rustc_hash::{FxHashMap, FxHashSet};
use rustc_hash::FxHashSet;
use ruff_diagnostics::{IsolationLevel, SourceMap};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
@@ -59,13 +59,13 @@ fn apply_fixes<'a>(
let mut last_pos: Option<TextSize> = None;
let mut applied: BTreeSet<&Edit> = BTreeSet::default();
let mut isolated: FxHashSet<u32> = FxHashSet::default();
let mut fixed = FxHashMap::default();
let mut fixed = FixTable::default();
let mut source_map = SourceMap::default();
for (rule, fix) in diagnostics
.filter_map(|msg| msg.to_rule().map(|rule| (rule, msg)))
.filter_map(|(rule, diagnostic)| diagnostic.fix().map(|fix| (rule, fix)))
.sorted_by(|(rule1, fix1), (rule2, fix2)| cmp_fix(*rule1, *rule2, fix1, fix2))
for (code, name, fix) in diagnostics
.filter_map(|msg| msg.noqa_code().map(|code| (code, msg.name(), msg)))
.filter_map(|(code, name, diagnostic)| diagnostic.fix().map(|fix| (code, name, fix)))
.sorted_by(|(_, name1, fix1), (_, name2, fix2)| cmp_fix(name1, name2, fix1, fix2))
{
let mut edits = fix
.edits()
@@ -110,7 +110,7 @@ fn apply_fixes<'a>(
}
applied.extend(applied_edits.drain(..));
*fixed.entry(rule).or_default() += 1;
*fixed.entry(code).or_default(name) += 1;
}
// Add the remaining content.
@@ -125,34 +125,44 @@ fn apply_fixes<'a>(
}
/// Compare two fixes.
fn cmp_fix(rule1: Rule, rule2: Rule, fix1: &Fix, fix2: &Fix) -> std::cmp::Ordering {
fn cmp_fix(name1: &str, name2: &str, fix1: &Fix, fix2: &Fix) -> std::cmp::Ordering {
// Always apply `RedefinedWhileUnused` before `UnusedImport`, as the latter can end up fixing
// the former. But we can't apply this just for `RedefinedWhileUnused` and `UnusedImport` because it violates
// `< is transitive: a < b and b < c implies a < c. The same must hold for both == and >.`
// See https://github.com/astral-sh/ruff/issues/12469#issuecomment-2244392085
match (rule1, rule2) {
(Rule::RedefinedWhileUnused, Rule::RedefinedWhileUnused) => std::cmp::Ordering::Equal,
(Rule::RedefinedWhileUnused, _) => std::cmp::Ordering::Less,
(_, Rule::RedefinedWhileUnused) => std::cmp::Ordering::Greater,
_ => std::cmp::Ordering::Equal,
let redefined_while_unused = Rule::RedefinedWhileUnused.name().as_str();
if (name1, name2) == (redefined_while_unused, redefined_while_unused) {
std::cmp::Ordering::Equal
} else if name1 == redefined_while_unused {
std::cmp::Ordering::Less
} else if name2 == redefined_while_unused {
std::cmp::Ordering::Greater
} else {
std::cmp::Ordering::Equal
}
// Apply fixes in order of their start position.
.then_with(|| fix1.min_start().cmp(&fix2.min_start()))
// Break ties in the event of overlapping rules, for some specific combinations.
.then_with(|| match (&rule1, &rule2) {
.then_with(|| {
let rules = (name1, name2);
// Apply `MissingTrailingPeriod` fixes before `NewLineAfterLastParagraph` fixes.
(Rule::MissingTrailingPeriod, Rule::NewLineAfterLastParagraph) => std::cmp::Ordering::Less,
(Rule::NewLineAfterLastParagraph, Rule::MissingTrailingPeriod) => {
let missing_trailing_period = Rule::MissingTrailingPeriod.name().as_str();
let newline_after_last_paragraph = Rule::NewLineAfterLastParagraph.name().as_str();
let if_else_instead_of_dict_get = Rule::IfElseBlockInsteadOfDictGet.name().as_str();
let if_else_instead_of_if_exp = Rule::IfElseBlockInsteadOfIfExp.name().as_str();
if rules == (missing_trailing_period, newline_after_last_paragraph) {
std::cmp::Ordering::Less
} else if rules == (newline_after_last_paragraph, missing_trailing_period) {
std::cmp::Ordering::Greater
}
// Apply `IfElseBlockInsteadOfDictGet` fixes before `IfElseBlockInsteadOfIfExp` fixes.
(Rule::IfElseBlockInsteadOfDictGet, Rule::IfElseBlockInsteadOfIfExp) => {
else if rules == (if_else_instead_of_dict_get, if_else_instead_of_if_exp) {
std::cmp::Ordering::Less
}
(Rule::IfElseBlockInsteadOfIfExp, Rule::IfElseBlockInsteadOfDictGet) => {
} else if rules == (if_else_instead_of_if_exp, if_else_instead_of_dict_get) {
std::cmp::Ordering::Greater
} else {
std::cmp::Ordering::Equal
}
_ => std::cmp::Ordering::Equal,
})
}
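The rewritten `cmp_fix` above compares rule *names* rather than `Rule` values: `RedefinedWhileUnused` always sorts first, then fixes order by their minimum start offset, then a few hard-coded tie-breaks resolve overlapping rules. A simplified, self-contained sketch of that comparator (the kebab-case name strings and the `u32` offsets below stand in for `Rule::name()` and `fix.min_start()`, and only one tie-break pair is shown — these are assumptions for illustration, not the crate's exact code):

```rust
use std::cmp::Ordering;

// Order fixes so that `redefined-while-unused` applies before anything else,
// then by start offset, with a specific tie-break for one overlapping pair.
fn cmp_fix(name1: &str, name2: &str, start1: u32, start2: u32) -> Ordering {
    const FIRST: &str = "redefined-while-unused";
    let primary = if name1 == FIRST && name2 == FIRST {
        Ordering::Equal
    } else if name1 == FIRST {
        Ordering::Less
    } else if name2 == FIRST {
        Ordering::Greater
    } else {
        Ordering::Equal
    };
    primary
        // Apply fixes in order of their start position.
        .then_with(|| start1.cmp(&start2))
        // Apply `missing-trailing-period` before `newline-after-last-paragraph`.
        .then_with(|| match (name1, name2) {
            ("missing-trailing-period", "newline-after-last-paragraph") => Ordering::Less,
            ("newline-after-last-paragraph", "missing-trailing-period") => Ordering::Greater,
            _ => Ordering::Equal,
        })
}

fn main() {
    // The special-cased rule wins even with a later start offset.
    assert_eq!(cmp_fix("redefined-while-unused", "unused-import", 10, 0), Ordering::Less);
    // Otherwise, earlier start offsets come first.
    assert_eq!(cmp_fix("unused-import", "line-too-long", 5, 2), Ordering::Greater);
    println!("ordering ok");
}
```

Note that comparing by name preserves the transitivity caveat from the original comment: the special case is symmetric, so `a < b` and `b < c` still imply `a < c`.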
@@ -163,7 +173,7 @@ mod tests {
use ruff_text_size::{Ranged, TextSize};
use crate::Locator;
use crate::diagnostic::Diagnostic;
use crate::diagnostic::OldDiagnostic;
use crate::fix::{FixResult, apply_fixes};
use crate::message::Message;
use crate::rules::pycodestyle::rules::MissingNewlineAtEndOfFile;
@@ -177,12 +187,12 @@ mod tests {
edit.into_iter()
.map(|edit| {
// The choice of rule here is arbitrary.
let diagnostic = Diagnostic::new(MissingNewlineAtEndOfFile, edit.range());
Message::from_diagnostic(
diagnostic.with_fix(Fix::safe_edit(edit)),
SourceFileBuilder::new(filename, source).finish(),
None,
)
let diagnostic = OldDiagnostic::new(
MissingNewlineAtEndOfFile,
edit.range(),
&SourceFileBuilder::new(filename, source).finish(),
);
Message::from_diagnostic(diagnostic.with_fix(Fix::safe_edit(edit)), None)
})
.collect()
}
@@ -197,7 +207,7 @@ mod tests {
source_map,
} = apply_fixes(diagnostics.iter(), &locator);
assert_eq!(code, "");
assert_eq!(fixes.values().sum::<usize>(), 0);
assert_eq!(fixes.counts().sum::<usize>(), 0);
assert!(source_map.markers().is_empty());
}
@@ -234,7 +244,7 @@ print("hello world")
"#
.trim()
);
assert_eq!(fixes.values().sum::<usize>(), 1);
assert_eq!(fixes.counts().sum::<usize>(), 1);
assert_eq!(
source_map.markers(),
&[
@@ -275,7 +285,7 @@ class A(Bar):
"
.trim(),
);
assert_eq!(fixes.values().sum::<usize>(), 1);
assert_eq!(fixes.counts().sum::<usize>(), 1);
assert_eq!(
source_map.markers(),
&[
@@ -312,7 +322,7 @@ class A:
"
.trim()
);
assert_eq!(fixes.values().sum::<usize>(), 1);
assert_eq!(fixes.counts().sum::<usize>(), 1);
assert_eq!(
source_map.markers(),
&[
@@ -353,7 +363,7 @@ class A(object):
"
.trim()
);
assert_eq!(fixes.values().sum::<usize>(), 2);
assert_eq!(fixes.counts().sum::<usize>(), 2);
assert_eq!(
source_map.markers(),
&[
@@ -395,7 +405,7 @@ class A:
"
.trim(),
);
assert_eq!(fixes.values().sum::<usize>(), 1);
assert_eq!(fixes.counts().sum::<usize>(), 1);
assert_eq!(
source_map.markers(),
&[


@@ -14,7 +14,7 @@ pub use rule_selector::RuleSelector;
pub use rule_selector::clap_completion::RuleSelectorParser;
pub use rules::pycodestyle::rules::IOError;
pub use diagnostic::Diagnostic;
pub use diagnostic::OldDiagnostic;
pub(crate) use ruff_diagnostics::{Applicability, Edit, Fix};
pub use violation::{AlwaysFixableViolation, FixAvailability, Violation, ViolationMetadata};


@@ -1,6 +1,5 @@
use std::borrow::Cow;
use std::cell::LazyCell;
use std::ops::Deref;
use std::collections::hash_map::Entry;
use std::path::Path;
use anyhow::{Result, anyhow};
@@ -14,23 +13,24 @@ use ruff_python_ast::{ModModule, PySourceType, PythonVersion};
use ruff_python_codegen::Stylist;
use ruff_python_index::Indexer;
use ruff_python_parser::{ParseError, ParseOptions, Parsed, UnsupportedSyntaxError};
use ruff_source_file::SourceFileBuilder;
use ruff_source_file::{SourceFile, SourceFileBuilder};
use ruff_text_size::Ranged;
use crate::Diagnostic;
use crate::checkers::ast::check_ast;
use crate::OldDiagnostic;
use crate::checkers::ast::{LintContext, check_ast};
use crate::checkers::filesystem::check_file_path;
use crate::checkers::imports::check_imports;
use crate::checkers::noqa::check_noqa;
use crate::checkers::physical_lines::check_physical_lines;
use crate::checkers::tokens::check_tokens;
use crate::codes::NoqaCode;
use crate::directives::Directives;
use crate::doc_lines::{doc_lines_from_ast, doc_lines_from_tokens};
use crate::fix::{FixResult, fix_file};
use crate::message::Message;
use crate::noqa::add_noqa;
use crate::package::PackageRoot;
use crate::preview::{is_py314_support_enabled, is_unsupported_syntax_enabled};
use crate::preview::is_py314_support_enabled;
use crate::registry::{AsRule, Rule, RuleSet};
#[cfg(any(feature = "test-rules", test))]
use crate::rules::ruff::rules::test_rules::{self, TEST_RULES, TestRule};
@@ -86,7 +86,53 @@ impl LinterResult {
}
}
pub type FixTable = FxHashMap<Rule, usize>;
#[derive(Debug, Default, PartialEq)]
struct FixCount {
rule_name: &'static str,
count: usize,
}
/// A mapping from a noqa code to the corresponding lint name and a count of applied fixes.
#[derive(Debug, Default, PartialEq)]
pub struct FixTable(FxHashMap<NoqaCode, FixCount>);
impl FixTable {
pub fn counts(&self) -> impl Iterator<Item = usize> {
self.0.values().map(|fc| fc.count)
}
pub fn entry(&mut self, code: NoqaCode) -> FixTableEntry {
FixTableEntry(self.0.entry(code))
}
pub fn iter(&self) -> impl Iterator<Item = (NoqaCode, &'static str, usize)> {
self.0
.iter()
.map(|(code, FixCount { rule_name, count })| (*code, *rule_name, *count))
}
pub fn keys(&self) -> impl Iterator<Item = NoqaCode> {
self.0.keys().copied()
}
pub fn is_empty(&self) -> bool {
self.0.is_empty()
}
}
pub struct FixTableEntry<'a>(Entry<'a, NoqaCode, FixCount>);
impl<'a> FixTableEntry<'a> {
pub fn or_default(self, rule_name: &'static str) -> &'a mut usize {
&mut (self
.0
.or_insert(FixCount {
rule_name,
count: 0,
})
.count)
}
}
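The old `FixTable` was a plain `FxHashMap<Rule, usize>`; the new one above keys on `NoqaCode` and also records the lint's name, so later reporting no longer needs to resolve a `Rule`. A minimal standalone sketch of the same shape (using std `HashMap` with `String` keys in place of the crate's `NoqaCode` and an `add` helper in place of the `entry(..).or_default(name)` API — both assumptions for illustration):

```rust
use std::collections::HashMap;

// Count of applied fixes for one lint, remembering the human-readable
// rule name alongside the counter.
#[derive(Debug, Default, PartialEq)]
struct FixCount {
    rule_name: &'static str,
    count: usize,
}

// A mapping from a noqa code to the corresponding lint name and fix count.
#[derive(Debug, Default)]
struct FixTable(HashMap<String, FixCount>);

impl FixTable {
    // Increment the counter for `code`, recording `rule_name` on first insert.
    fn add(&mut self, code: &str, rule_name: &'static str) {
        self.0
            .entry(code.to_string())
            .or_insert(FixCount { rule_name, count: 0 })
            .count += 1;
    }

    // Total number of fixes across all codes (mirrors `counts().sum()`).
    fn total(&self) -> usize {
        self.0.values().map(|fc| fc.count).sum()
    }
}

fn main() {
    let mut fixed = FixTable::default();
    fixed.add("F401", "unused-import");
    fixed.add("F401", "unused-import");
    fixed.add("E501", "line-too-long");
    assert_eq!(fixed.total(), 3);
    println!("{} fixes applied", fixed.total());
}
```

This is why call sites in the diff change from `fixes.values().sum::<usize>()` to `fixes.counts().sum::<usize>()`: the map's values are now structs, not bare counters.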
pub struct FixerResult<'a> {
/// The result returned by the linter, after applying any fixes.
@@ -113,8 +159,11 @@ pub fn check_path(
parsed: &Parsed<ModModule>,
target_version: TargetVersion,
) -> Vec<Message> {
let source_file =
SourceFileBuilder::new(path.to_string_lossy().as_ref(), locator.contents()).finish();
// Aggregate all diagnostics.
let mut diagnostics = vec![];
let mut diagnostics = LintContext::new(&source_file);
// Aggregate all semantic syntax errors.
let mut semantic_syntax_errors = vec![];
@@ -136,7 +185,7 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_tokens())
{
diagnostics.extend(check_tokens(
check_tokens(
tokens,
path,
locator,
@@ -145,7 +194,8 @@ pub fn check_path(
settings,
source_type,
source_kind.as_ipy_notebook().map(Notebook::cell_offsets),
));
&mut diagnostics,
);
}
// Run the filesystem-based rules.
@@ -154,14 +204,15 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_filesystem())
{
diagnostics.extend(check_file_path(
check_file_path(
path,
package,
locator,
comment_ranges,
settings,
target_version.linter_version(),
));
&diagnostics,
);
}
// Run the logical line-based rules.
@@ -170,9 +221,14 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_logical_lines())
{
diagnostics.extend(crate::checkers::logical_lines::check_logical_lines(
tokens, locator, indexer, stylist, settings,
));
crate::checkers::logical_lines::check_logical_lines(
tokens,
locator,
indexer,
stylist,
settings,
&diagnostics,
);
}
// Run the AST-based rules only if there are no syntax errors.
@@ -180,7 +236,7 @@ pub fn check_path(
let cell_offsets = source_kind.as_ipy_notebook().map(Notebook::cell_offsets);
let notebook_index = source_kind.as_ipy_notebook().map(Notebook::index);
let (new_diagnostics, new_semantic_syntax_errors) = check_ast(
semantic_syntax_errors.extend(check_ast(
parsed,
locator,
stylist,
@@ -194,9 +250,8 @@ pub fn check_path(
cell_offsets,
notebook_index,
target_version,
);
diagnostics.extend(new_diagnostics);
semantic_syntax_errors.extend(new_semantic_syntax_errors);
&diagnostics,
));
let use_imports = !directives.isort.skip_file
&& settings
@@ -205,7 +260,7 @@ pub fn check_path(
.any(|rule_code| rule_code.lint_source().is_imports());
if use_imports || use_doc_lines {
if use_imports {
let import_diagnostics = check_imports(
check_imports(
parsed,
locator,
indexer,
@@ -216,9 +271,8 @@ pub fn check_path(
source_type,
cell_offsets,
target_version.linter_version(),
&diagnostics,
);
diagnostics.extend(import_diagnostics);
}
if use_doc_lines {
doc_lines.extend(doc_lines_from_ast(parsed.suite(), locator));
@@ -238,9 +292,14 @@ pub fn check_path(
.iter_enabled()
.any(|rule_code| rule_code.lint_source().is_physical_lines())
{
diagnostics.extend(check_physical_lines(
locator, stylist, indexer, &doc_lines, settings,
));
check_physical_lines(
locator,
stylist,
indexer,
&doc_lines,
settings,
&diagnostics,
);
}
// Raise violations for internal test rules
@@ -250,47 +309,70 @@ pub fn check_path(
if !settings.rules.enabled(*test_rule) {
continue;
}
let diagnostic = match test_rule {
match test_rule {
Rule::StableTestRule => {
test_rules::StableTestRule::diagnostic(locator, comment_ranges)
}
Rule::StableTestRuleSafeFix => {
test_rules::StableTestRuleSafeFix::diagnostic(locator, comment_ranges)
}
Rule::StableTestRuleUnsafeFix => {
test_rules::StableTestRuleUnsafeFix::diagnostic(locator, comment_ranges)
test_rules::StableTestRule::diagnostic(locator, comment_ranges, &diagnostics);
}
Rule::StableTestRuleSafeFix => test_rules::StableTestRuleSafeFix::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::StableTestRuleUnsafeFix => test_rules::StableTestRuleUnsafeFix::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::StableTestRuleDisplayOnlyFix => {
test_rules::StableTestRuleDisplayOnlyFix::diagnostic(locator, comment_ranges)
test_rules::StableTestRuleDisplayOnlyFix::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
Rule::PreviewTestRule => {
test_rules::PreviewTestRule::diagnostic(locator, comment_ranges)
test_rules::PreviewTestRule::diagnostic(locator, comment_ranges, &diagnostics);
}
Rule::DeprecatedTestRule => {
test_rules::DeprecatedTestRule::diagnostic(locator, comment_ranges)
test_rules::DeprecatedTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
Rule::AnotherDeprecatedTestRule => {
test_rules::AnotherDeprecatedTestRule::diagnostic(locator, comment_ranges)
test_rules::AnotherDeprecatedTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
Rule::RemovedTestRule => {
test_rules::RemovedTestRule::diagnostic(locator, comment_ranges)
}
Rule::AnotherRemovedTestRule => {
test_rules::AnotherRemovedTestRule::diagnostic(locator, comment_ranges)
}
Rule::RedirectedToTestRule => {
test_rules::RedirectedToTestRule::diagnostic(locator, comment_ranges)
}
Rule::RedirectedFromTestRule => {
test_rules::RedirectedFromTestRule::diagnostic(locator, comment_ranges)
test_rules::RemovedTestRule::diagnostic(locator, comment_ranges, &diagnostics);
}
Rule::AnotherRemovedTestRule => test_rules::AnotherRemovedTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::RedirectedToTestRule => test_rules::RedirectedToTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::RedirectedFromTestRule => test_rules::RedirectedFromTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
),
Rule::RedirectedFromPrefixTestRule => {
test_rules::RedirectedFromPrefixTestRule::diagnostic(locator, comment_ranges)
test_rules::RedirectedFromPrefixTestRule::diagnostic(
locator,
comment_ranges,
&diagnostics,
);
}
_ => unreachable!("All test rules must have an implementation"),
};
if let Some(diagnostic) = diagnostic {
diagnostics.push(diagnostic);
}
}
}
@@ -308,7 +390,9 @@ pub fn check_path(
RuleSet::empty()
};
if !per_file_ignores.is_empty() {
diagnostics.retain(|diagnostic| !per_file_ignores.contains(diagnostic.rule()));
diagnostics
.as_mut_vec()
.retain(|diagnostic| !per_file_ignores.contains(diagnostic.rule()));
}
// Enforce `noqa` directives.
@@ -330,11 +414,13 @@ pub fn check_path(
);
if noqa.is_enabled() {
for index in ignored.iter().rev() {
diagnostics.swap_remove(*index);
diagnostics.as_mut_vec().swap_remove(*index);
}
}
}
let mut diagnostics = diagnostics.into_diagnostics();
if parsed.has_valid_syntax() {
// Remove fixes for any rules marked as unfixable.
for diagnostic in &mut diagnostics {
@@ -361,20 +447,16 @@ pub fn check_path(
}
}
let syntax_errors = if is_unsupported_syntax_enabled(settings) {
parsed.unsupported_syntax_errors()
} else {
&[]
};
let syntax_errors = parsed.unsupported_syntax_errors();
diagnostics_to_messages(
diagnostics,
parsed.errors(),
syntax_errors,
&semantic_syntax_errors,
path,
locator,
directives,
&source_file,
)
}
@@ -438,7 +520,7 @@ pub fn add_noqa_to_path(
)
}
/// Generate a [`Message`] for each [`Diagnostic`] triggered by the given source
/// Generate a [`Message`] for each [`OldDiagnostic`] triggered by the given source
/// code.
pub fn lint_only(
path: &Path,
@@ -503,39 +585,28 @@ pub fn lint_only(
/// Convert from diagnostics to messages.
fn diagnostics_to_messages(
diagnostics: Vec<Diagnostic>,
diagnostics: Vec<OldDiagnostic>,
parse_errors: &[ParseError],
unsupported_syntax_errors: &[UnsupportedSyntaxError],
semantic_syntax_errors: &[SemanticSyntaxError],
path: &Path,
locator: &Locator,
directives: &Directives,
source_file: &SourceFile,
) -> Vec<Message> {
let file = LazyCell::new(|| {
let mut builder =
SourceFileBuilder::new(path.to_string_lossy().as_ref(), locator.contents());
if let Some(line_index) = locator.line_index() {
builder.set_line_index(line_index.clone());
}
builder.finish()
});
parse_errors
.iter()
.map(|parse_error| Message::from_parse_error(parse_error, locator, file.deref().clone()))
.map(|parse_error| Message::from_parse_error(parse_error, locator, source_file.clone()))
.chain(unsupported_syntax_errors.iter().map(|syntax_error| {
Message::from_unsupported_syntax_error(syntax_error, file.deref().clone())
Message::from_unsupported_syntax_error(syntax_error, source_file.clone())
}))
.chain(
semantic_syntax_errors
.iter()
.map(|error| Message::from_semantic_syntax_error(error, file.deref().clone())),
.map(|error| Message::from_semantic_syntax_error(error, source_file.clone())),
)
.chain(diagnostics.into_iter().map(|diagnostic| {
let noqa_offset = directives.noqa_line_for.resolve(diagnostic.start());
Message::from_diagnostic(diagnostic, file.deref().clone(), Some(noqa_offset))
Message::from_diagnostic(diagnostic, Some(noqa_offset))
}))
.collect()
}
@@ -554,7 +625,7 @@ pub fn lint_fix<'a>(
let mut transformed = Cow::Borrowed(source_kind);
// Track the number of fixed errors across iterations.
let mut fixed = FxHashMap::default();
let mut fixed = FixTable::default();
// As an escape hatch, bail after 100 iterations.
let mut iterations = 0;
@@ -623,12 +694,7 @@ pub fn lint_fix<'a>(
// syntax error. Return the original code.
if has_valid_syntax && has_no_syntax_errors {
if let Some(error) = parsed.errors().first() {
report_fix_syntax_error(
path,
transformed.source_code(),
error,
fixed.keys().copied(),
);
report_fix_syntax_error(path, transformed.source_code(), error, fixed.keys());
return Err(anyhow!("Fix introduced a syntax error"));
}
}
@@ -643,8 +709,8 @@ pub fn lint_fix<'a>(
{
if iterations < MAX_ITERATIONS {
// Count the number of fixed errors.
for (rule, count) in applied {
*fixed.entry(rule).or_default() += count;
for (rule, name, count) in applied.iter() {
*fixed.entry(rule).or_default(name) += count;
}
transformed = Cow::Owned(transformed.updated(fixed_contents, &source_map));
@@ -671,10 +737,10 @@ pub fn lint_fix<'a>(
}
}
fn collect_rule_codes(rules: impl IntoIterator<Item = Rule>) -> String {
fn collect_rule_codes(rules: impl IntoIterator<Item = NoqaCode>) -> String {
rules
.into_iter()
.map(|rule| rule.noqa_code().to_string())
.map(|rule| rule.to_string())
.sorted_unstable()
.dedup()
.join(", ")
@@ -682,7 +748,7 @@ fn collect_rule_codes(rules: impl IntoIterator<Item = Rule>) -> String {
#[expect(clippy::print_stderr)]
fn report_failed_to_converge_error(path: &Path, transformed: &str, messages: &[Message]) {
let codes = collect_rule_codes(messages.iter().filter_map(Message::to_rule));
let codes = collect_rule_codes(messages.iter().filter_map(Message::noqa_code));
if cfg!(debug_assertions) {
eprintln!(
"{}{} Failed to converge after {} iterations in `{}` with rule codes {}:---\n{}\n---",
@@ -718,7 +784,7 @@ fn report_fix_syntax_error(
path: &Path,
transformed: &str,
error: &ParseError,
rules: impl IntoIterator<Item = Rule>,
rules: impl IntoIterator<Item = NoqaCode>,
) {
let codes = collect_rule_codes(rules);
if cfg!(debug_assertions) {


@@ -33,7 +33,7 @@ impl Emitter for AzureEmitter {
line = location.line,
col = location.column,
code = message
.to_noqa_code()
.noqa_code()
.map_or_else(String::new, |code| format!("code={code};")),
body = message.body(),
)?;


@@ -33,7 +33,7 @@ impl Emitter for GithubEmitter {
writer,
"::error title=Ruff{code},file={file},line={row},col={column},endLine={end_row},endColumn={end_column}::",
code = message
.to_noqa_code()
.noqa_code()
.map_or_else(String::new, |code| format!(" ({code})")),
file = message.filename(),
row = source_location.line,
@@ -50,7 +50,7 @@ impl Emitter for GithubEmitter {
column = location.column,
)?;
if let Some(code) = message.to_noqa_code() {
if let Some(code) = message.noqa_code() {
write!(writer, " {code}")?;
}


@@ -90,7 +90,7 @@ impl Serialize for SerializedMessages<'_> {
}
fingerprints.insert(message_fingerprint);
let (description, check_name) = if let Some(code) = message.to_noqa_code() {
let (description, check_name) = if let Some(code) = message.noqa_code() {
(message.body().to_string(), code.to_string())
} else {
let description = message.body();


@@ -81,8 +81,8 @@ pub(crate) fn message_to_json_value(message: &Message, context: &EmitterContext)
}
json!({
"code": message.to_noqa_code().map(|code| code.to_string()),
"url": message.to_rule().and_then(|rule| rule.url()),
"code": message.noqa_code().map(|code| code.to_string()),
"url": message.to_url(),
"message": message.body(),
"fix": fix,
"cell": notebook_cell_index,


@@ -59,7 +59,7 @@ impl Emitter for JunitEmitter {
body = message.body()
));
let mut case = TestCase::new(
if let Some(code) = message.to_noqa_code() {
if let Some(code) = message.noqa_code() {
format!("org.ruff.{code}")
} else {
"org.ruff".to_string()


@@ -27,7 +27,7 @@ use crate::Locator;
use crate::codes::NoqaCode;
use crate::logging::DisplayParseErrorType;
use crate::registry::Rule;
use crate::{Diagnostic, Fix};
use crate::{Fix, OldDiagnostic};
mod azure;
mod diff;
@@ -50,7 +50,7 @@ mod text;
/// `noqa` offsets.
///
/// For diagnostic messages, the [`db::Diagnostic`]'s primary message contains the
/// [`Diagnostic::body`], and the primary annotation optionally contains the suggestion accompanying
/// [`OldDiagnostic::body`], and the primary annotation optionally contains the suggestion accompanying
/// a fix. The `db::Diagnostic::id` field contains the kebab-case lint name derived from the `Rule`.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct Message {
@@ -113,19 +113,16 @@ impl Message {
}
}
/// Create a [`Message`] from the given [`Diagnostic`] corresponding to a rule violation.
pub fn from_diagnostic(
diagnostic: Diagnostic,
file: SourceFile,
noqa_offset: Option<TextSize>,
) -> Message {
let Diagnostic {
/// Create a [`Message`] from the given [`OldDiagnostic`] corresponding to a rule violation.
pub fn from_diagnostic(diagnostic: OldDiagnostic, noqa_offset: Option<TextSize>) -> Message {
let OldDiagnostic {
body,
suggestion,
range,
fix,
parent,
rule,
file,
} = diagnostic;
Self::diagnostic(
body,
@@ -227,30 +224,22 @@ impl Message {
self.fix().is_some()
}
/// Returns the [`Rule`] corresponding to the diagnostic message.
pub fn to_rule(&self) -> Option<Rule> {
if self.is_syntax_error() {
None
} else {
Some(self.name().parse().expect("Expected a valid rule name"))
}
}
/// Returns the [`NoqaCode`] corresponding to the diagnostic message.
pub fn to_noqa_code(&self) -> Option<NoqaCode> {
pub fn noqa_code(&self) -> Option<NoqaCode> {
self.noqa_code
}
/// Returns the URL for the rule documentation, if it exists.
pub fn to_url(&self) -> Option<String> {
// TODO(brent) Rule::url calls Rule::explanation, which calls ViolationMetadata::explain,
// which when derived (seems always to be the case?) is always `Some`, so I think it's
// pretty safe to inline the Rule::url implementation here, using `self.name()`:
//
// format!("{}/rules/{}", env!("CARGO_PKG_HOMEPAGE"), self.name())
//
// at least in the case of diagnostics, I guess syntax errors will return None
self.to_rule().and_then(|rule| rule.url())
if self.is_syntax_error() {
None
} else {
Some(format!(
"{}/rules/{}",
env!("CARGO_PKG_HOMEPAGE"),
self.name()
))
}
}
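The change above resolves the `to_url` TODO by inlining the `Rule::url` format string, so syntax errors yield `None` and everything else builds the docs URL from the package homepage and the lint name. A hedged sketch of the resulting behavior (the homepage string is passed as a parameter here purely for illustration; the real code reads `env!("CARGO_PKG_HOMEPAGE")` at compile time):

```rust
// Build the documentation URL for a lint, or None for syntax errors,
// mirroring the inlined `Rule::url` logic.
fn to_url(homepage: &str, name: &str, is_syntax_error: bool) -> Option<String> {
    if is_syntax_error {
        None
    } else {
        Some(format!("{homepage}/rules/{name}"))
    }
}

fn main() {
    assert_eq!(
        to_url("https://docs.astral.sh/ruff", "unused-import", false).as_deref(),
        Some("https://docs.astral.sh/ruff/rules/unused-import")
    );
    assert_eq!(to_url("https://docs.astral.sh/ruff", "invalid-syntax", true), None);
    println!("url ok");
}
```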
/// Returns the filename for the message.


@@ -26,7 +26,7 @@ impl Emitter for PylintEmitter {
message.compute_start_location().line
};
let body = if let Some(code) = message.to_noqa_code() {
let body = if let Some(code) = message.noqa_code() {
format!("[{code}] {body}", body = message.body())
} else {
message.body().to_string()


@@ -71,7 +71,7 @@ fn message_to_rdjson_value(message: &Message) -> Value {
"range": rdjson_range(start_location, end_location),
},
"code": {
"value": message.to_noqa_code().map(|code| code.to_string()),
"value": message.noqa_code().map(|code| code.to_string()),
"url": message.to_url(),
},
"suggestions": rdjson_suggestions(fix.edits(), &source_code),
@@ -84,7 +84,7 @@ fn message_to_rdjson_value(message: &Message) -> Value {
"range": rdjson_range(start_location, end_location),
},
"code": {
"value": message.to_noqa_code().map(|code| code.to_string()),
"value": message.noqa_code().map(|code| code.to_string()),
"url": message.to_url(),
},
})


@@ -8,7 +8,7 @@ use serde_json::json;
use ruff_source_file::OneIndexed;
use crate::VERSION;
use crate::codes::Rule;
use crate::codes::NoqaCode;
use crate::fs::normalize_path;
use crate::message::{Emitter, EmitterContext, Message};
use crate::registry::{Linter, RuleNamespace};
@@ -27,7 +27,7 @@ impl Emitter for SarifEmitter {
.map(SarifResult::from_message)
.collect::<Result<Vec<_>>>()?;
let unique_rules: HashSet<_> = results.iter().filter_map(|result| result.rule).collect();
let unique_rules: HashSet<_> = results.iter().filter_map(|result| result.code).collect();
let mut rules: Vec<SarifRule> = unique_rules.into_iter().map(SarifRule::from).collect();
rules.sort_by(|a, b| a.code.cmp(&b.code));
@@ -61,13 +61,19 @@ struct SarifRule<'a> {
url: Option<String>,
}
impl From<Rule> for SarifRule<'_> {
fn from(rule: Rule) -> Self {
let code = rule.noqa_code().to_string();
let (linter, _) = Linter::parse_code(&code).unwrap();
impl From<NoqaCode> for SarifRule<'_> {
fn from(code: NoqaCode) -> Self {
let code_str = code.to_string();
// This is a manual re-implementation of Rule::from_code, but we also want the Linter. This
// avoids calling Linter::parse_code twice.
let (linter, suffix) = Linter::parse_code(&code_str).unwrap();
let rule = linter
.all_rules()
.find(|rule| rule.noqa_code().suffix() == suffix)
.expect("Expected a valid noqa code corresponding to a rule");
Self {
name: rule.into(),
code,
code: code_str,
linter: linter.name(),
summary: rule.message_formats()[0],
explanation: rule.explanation(),
@@ -106,7 +112,7 @@ impl Serialize for SarifRule<'_> {
#[derive(Debug)]
struct SarifResult {
rule: Option<Rule>,
code: Option<NoqaCode>,
level: String,
message: String,
uri: String,
@@ -123,7 +129,7 @@ impl SarifResult {
let end_location = message.compute_end_location();
let path = normalize_path(&*message.filename());
Ok(Self {
rule: message.to_rule(),
code: message.noqa_code(),
level: "error".to_string(),
message: message.body().to_string(),
uri: url::Url::from_file_path(&path)
@@ -143,7 +149,7 @@ impl SarifResult {
let end_location = message.compute_end_location();
let path = normalize_path(&*message.filename());
Ok(Self {
-rule: message.to_rule(),
+code: message.noqa_code(),
level: "error".to_string(),
message: message.body().to_string(),
uri: path.display().to_string(),
@@ -178,7 +184,7 @@ impl Serialize for SarifResult {
}
}
}],
"ruleId": self.rule.map(|rule| rule.noqa_code().to_string()),
"ruleId": self.code.map(|code| code.to_string()),
})
.serialize(serializer)
}
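The `From<NoqaCode>` impl above parses the linter prefix once and then scans only that linter's rules for a matching suffix, rather than calling `Linter::parse_code` twice. A minimal standalone sketch of that prefix-plus-suffix lookup (toy types and a hypothetical rule table, not ruff's actual `Linter`/`Rule` API):

```rust
// Toy stand-in: a code like "F841" is a linter prefix ("F") plus a suffix ("841").
fn parse_code(code: &str) -> Option<(&str, &str)> {
    // Split at the first digit; everything before it is the linter prefix.
    let split = code.find(|c: char| c.is_ascii_digit())?;
    Some((&code[..split], &code[split..]))
}

// Hypothetical per-linter rule table; ruff derives this from its registry.
fn rules_for(linter: &str) -> &'static [(&'static str, &'static str)] {
    match linter {
        "F" => &[("841", "unused-variable"), ("401", "unused-import")],
        "E" => &[("741", "ambiguous-variable-name")],
        _ => &[],
    }
}

/// Resolve a code string to a rule name: one prefix parse, one suffix scan.
fn rule_name(code: &str) -> Option<&'static str> {
    let (linter, suffix) = parse_code(code)?;
    rules_for(linter)
        .iter()
        .find(|(s, _)| *s == suffix)
        .map(|(_, name)| *name)
}

fn main() {
    assert_eq!(rule_name("F841"), Some("unused-variable"));
    assert_eq!(rule_name("E741"), Some("ambiguous-variable-name"));
    assert_eq!(rule_name("X999"), None);
    println!("ok");
}
```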


@@ -151,7 +151,7 @@ impl Display for RuleCodeAndBody<'_> {
if let Some(fix) = self.message.fix() {
// Do not display an indicator for inapplicable fixes
if fix.applies(self.unsafe_fixes.required_applicability()) {
-if let Some(code) = self.message.to_noqa_code() {
+if let Some(code) = self.message.noqa_code() {
write!(f, "{} ", code.to_string().red().bold())?;
}
return write!(
@@ -164,7 +164,7 @@ impl Display for RuleCodeAndBody<'_> {
}
}
-if let Some(code) = self.message.to_noqa_code() {
+if let Some(code) = self.message.noqa_code() {
write!(
f,
"{code} {body}",
@@ -254,7 +254,7 @@ impl Display for MessageCodeFrame<'_> {
let label = self
.message
-.to_noqa_code()
+.noqa_code()
.map_or_else(String::new, |code| code.to_string());
let line_start = self.notebook_index.map_or_else(


@@ -12,13 +12,14 @@ use log::warn;
use ruff_python_trivia::{CommentRanges, Cursor, indentation_at_offset};
use ruff_source_file::{LineEnding, LineRanges};
use ruff_text_size::{Ranged, TextLen, TextRange, TextSize};
+use rustc_hash::FxHashSet;
 use crate::Edit;
 use crate::Locator;
+use crate::codes::NoqaCode;
 use crate::fs::relativize_path;
 use crate::message::Message;
-use crate::registry::{Rule, RuleSet};
+use crate::registry::Rule;
use crate::rule_redirects::get_redirect_target;
/// Generates an array of edits that matches the length of `messages`.
@@ -780,7 +781,7 @@ fn build_noqa_edits_by_diagnostic(
if let Some(noqa_edit) = generate_noqa_edit(
comment.directive,
comment.line,
-RuleSet::from_rule(comment.rule),
+FxHashSet::from_iter([comment.code]),
locator,
line_ending,
) {
@@ -816,7 +817,7 @@ fn build_noqa_edits_by_line<'a>(
offset,
matches
.into_iter()
-.map(|NoqaComment { rule, .. }| rule)
+.map(|NoqaComment { code, .. }| code)
.collect(),
locator,
line_ending,
@@ -829,7 +830,7 @@ fn build_noqa_edits_by_line<'a>(
struct NoqaComment<'a> {
line: TextSize,
-rule: Rule,
+code: NoqaCode,
directive: Option<&'a Directive<'a>>,
}
@@ -845,13 +846,11 @@ fn find_noqa_comments<'a>(
// Mark any non-ignored diagnostics.
for message in messages {
-let Some(rule) = message.to_rule() else {
+let Some(code) = message.noqa_code() else {
comments_by_line.push(None);
continue;
};
-let code = rule.noqa_code();
match &exemption {
FileExemption::All(_) => {
// If the file is exempted, don't add any noqa directives.
@@ -900,7 +899,7 @@ fn find_noqa_comments<'a>(
if !codes.includes(code) {
comments_by_line.push(Some(NoqaComment {
line: directive_line.start(),
-rule,
+code,
directive: Some(directive),
}));
}
@@ -912,7 +911,7 @@ fn find_noqa_comments<'a>(
// There's no existing noqa directive that suppresses the diagnostic.
comments_by_line.push(Some(NoqaComment {
line: locator.line_start(noqa_offset),
-rule,
+code,
directive: None,
}));
}
@@ -922,7 +921,7 @@ fn find_noqa_comments<'a>(
struct NoqaEdit<'a> {
edit_range: TextRange,
-rules: RuleSet,
+noqa_codes: FxHashSet<NoqaCode>,
codes: Option<&'a Codes<'a>>,
line_ending: LineEnding,
}
@@ -941,18 +940,15 @@ impl NoqaEdit<'_> {
Some(codes) => {
push_codes(
writer,
-self.rules
+self.noqa_codes
 .iter()
-.map(|rule| rule.noqa_code().to_string())
+.map(ToString::to_string)
.chain(codes.iter().map(ToString::to_string))
.sorted_unstable(),
);
}
None => {
-push_codes(
-writer,
-self.rules.iter().map(|rule| rule.noqa_code().to_string()),
-);
+push_codes(writer, self.noqa_codes.iter().map(ToString::to_string));
}
}
write!(writer, "{}", self.line_ending.as_str()).unwrap();
@@ -968,7 +964,7 @@ impl Ranged for NoqaEdit<'_> {
fn generate_noqa_edit<'a>(
directive: Option<&'a Directive>,
offset: TextSize,
-rules: RuleSet,
+noqa_codes: FxHashSet<NoqaCode>,
locator: &Locator,
line_ending: LineEnding,
) -> Option<NoqaEdit<'a>> {
@@ -997,7 +993,7 @@ fn generate_noqa_edit<'a>(
Some(NoqaEdit {
edit_range,
-rules,
+noqa_codes,
codes,
line_ending,
})
@@ -1233,7 +1229,7 @@ mod tests {
use crate::rules::pycodestyle::rules::{AmbiguousVariableName, UselessSemicolon};
use crate::rules::pyflakes::rules::UnusedVariable;
use crate::rules::pyupgrade::rules::PrintfStringFormatting;
-use crate::{Diagnostic, Edit};
+use crate::{Edit, OldDiagnostic};
use crate::{Locator, generate_noqa_edits};
fn assert_lexed_ranges_match_slices(
@@ -1252,14 +1248,9 @@ mod tests {
}
/// Create a [`Message`] with a placeholder filename and rule code from `diagnostic`.
-fn message_from_diagnostic(
-diagnostic: Diagnostic,
-path: impl AsRef<Path>,
-source: &str,
-) -> Message {
+fn message_from_diagnostic(diagnostic: OldDiagnostic) -> Message {
let noqa_offset = diagnostic.start();
-let file = SourceFileBuilder::new(path.as_ref().to_string_lossy(), source).finish();
-Message::from_diagnostic(diagnostic, file, Some(noqa_offset))
+Message::from_diagnostic(diagnostic, Some(noqa_offset))
}
#[test]
@@ -2842,13 +2833,15 @@ mod tests {
assert_eq!(count, 0);
assert_eq!(output, format!("{contents}"));
-let messages = [Diagnostic::new(
+let source_file = SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
+let messages = [OldDiagnostic::new(
 UnusedVariable {
 name: "x".to_string(),
 },
 TextRange::new(TextSize::from(0), TextSize::from(0)),
+&source_file,
 )]
-.map(|d| message_from_diagnostic(d, path, contents));
+.map(message_from_diagnostic);
let contents = "x = 1";
let noqa_line_for = NoqaMapping::default();
@@ -2864,19 +2857,22 @@ mod tests {
assert_eq!(count, 1);
assert_eq!(output, "x = 1 # noqa: F841\n");
+let source_file = SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
 let messages = [
-Diagnostic::new(
+OldDiagnostic::new(
 AmbiguousVariableName("x".to_string()),
 TextRange::new(TextSize::from(0), TextSize::from(0)),
+&source_file,
 ),
-Diagnostic::new(
+OldDiagnostic::new(
 UnusedVariable {
 name: "x".to_string(),
 },
 TextRange::new(TextSize::from(0), TextSize::from(0)),
+&source_file,
 ),
 ]
-.map(|d| message_from_diagnostic(d, path, contents));
+.map(message_from_diagnostic);
let contents = "x = 1 # noqa: E741\n";
let noqa_line_for = NoqaMapping::default();
let comment_ranges =
@@ -2893,19 +2889,22 @@ mod tests {
assert_eq!(count, 1);
assert_eq!(output, "x = 1 # noqa: E741, F841\n");
+let source_file = SourceFileBuilder::new(path.to_string_lossy(), contents).finish();
 let messages = [
-Diagnostic::new(
+OldDiagnostic::new(
 AmbiguousVariableName("x".to_string()),
 TextRange::new(TextSize::from(0), TextSize::from(0)),
+&source_file,
 ),
-Diagnostic::new(
+OldDiagnostic::new(
 UnusedVariable {
 name: "x".to_string(),
 },
 TextRange::new(TextSize::from(0), TextSize::from(0)),
+&source_file,
 ),
 ]
-.map(|d| message_from_diagnostic(d, path, contents));
+.map(message_from_diagnostic);
let contents = "x = 1 # noqa";
let noqa_line_for = NoqaMapping::default();
let comment_ranges =
@@ -2936,11 +2935,13 @@ print(
)
"#;
let noqa_line_for = [TextRange::new(8.into(), 68.into())].into_iter().collect();
-let messages = [Diagnostic::new(
+let source_file = SourceFileBuilder::new(path.to_string_lossy(), source).finish();
+let messages = [OldDiagnostic::new(
 PrintfStringFormatting,
 TextRange::new(12.into(), 79.into()),
+&source_file,
 )]
-.map(|d| message_from_diagnostic(d, path, source));
+.map(message_from_diagnostic);
let comment_ranges = CommentRanges::default();
let edits = generate_noqa_edits(
path,
@@ -2968,11 +2969,13 @@ print(
foo;
bar =
";
-let messages = [Diagnostic::new(
+let source_file = SourceFileBuilder::new(path.to_string_lossy(), source).finish();
+let messages = [OldDiagnostic::new(
 UselessSemicolon,
 TextRange::new(4.into(), 5.into()),
+&source_file,
 )]
-.map(|d| message_from_diagnostic(d, path, source));
+.map(message_from_diagnostic);
let noqa_line_for = NoqaMapping::default();
let comment_ranges = CommentRanges::default();
let edits = generate_noqa_edits(
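The edits in this file now collect `NoqaCode`s into a hash set and sort their string forms before writing the directive. A small sketch of that dedupe-then-sort step (plain `std::collections::HashSet` here instead of ruff's `FxHashSet`, and a hypothetical `noqa_comment` helper rather than ruff's `push_codes`):

```rust
use std::collections::HashSet;

// Build the code list for a `# noqa:` comment: dedupe the new codes,
// stringify them, merge in codes already on the directive, then sort.
fn noqa_comment(codes: &[&str], existing: &[&str]) -> String {
    let unique: HashSet<&str> = codes.iter().copied().collect();
    let mut all: Vec<String> = unique
        .into_iter()
        .map(str::to_string)
        .chain(existing.iter().map(|c| (*c).to_string()))
        .collect();
    all.sort_unstable();
    format!("# noqa: {}", all.join(", "))
}

fn main() {
    // Duplicate diagnostics for the same code collapse to one entry.
    assert_eq!(
        noqa_comment(&["F841", "F841"], &["E741"]),
        "# noqa: E741, F841"
    );
    println!("ok");
}
```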


@@ -7,17 +7,6 @@
use crate::settings::LinterSettings;
-// https://github.com/astral-sh/ruff/issues/17412
-// https://github.com/astral-sh/ruff/issues/11934
-pub(crate) const fn is_semantic_errors_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
-// https://github.com/astral-sh/ruff/pull/16429
-pub(crate) const fn is_unsupported_syntax_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
pub(crate) const fn is_py314_support_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}
@@ -29,23 +18,11 @@ pub(crate) const fn is_full_path_match_source_strategy_enabled(settings: &Linter
// Rule-specific behavior
-// https://github.com/astral-sh/ruff/pull/17136
-pub(crate) const fn is_shell_injection_only_trusted_input_enabled(
-settings: &LinterSettings,
-) -> bool {
-settings.preview.is_enabled()
-}
-// https://github.com/astral-sh/ruff/pull/15541
-pub(crate) const fn is_suspicious_function_reference_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
-// https://github.com/astral-sh/ruff/pull/7501
-pub(crate) const fn is_bool_subtype_of_annotation_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
// https://github.com/astral-sh/ruff/pull/10759
pub(crate) const fn is_comprehension_with_min_max_sum_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
@@ -63,21 +40,11 @@ pub(crate) const fn is_bad_version_info_in_non_stub_enabled(settings: &LinterSet
settings.preview.is_enabled()
}
-// https://github.com/astral-sh/ruff/pull/12676
-pub(crate) const fn is_fix_future_annotations_in_stub_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
// https://github.com/astral-sh/ruff/pull/11074
pub(crate) const fn is_only_add_return_none_at_end_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
}
-// https://github.com/astral-sh/ruff/pull/12796
-pub(crate) const fn is_simplify_ternary_to_binary_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
// https://github.com/astral-sh/ruff/pull/16719
pub(crate) const fn is_fix_manual_dict_comprehension_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
@@ -104,13 +71,6 @@ pub(crate) const fn is_unicode_to_unicode_confusables_enabled(settings: &LinterS
settings.preview.is_enabled()
}
-// https://github.com/astral-sh/ruff/pull/17078
-pub(crate) const fn is_support_slices_in_literal_concatenation_enabled(
-settings: &LinterSettings,
-) -> bool {
-settings.preview.is_enabled()
-}
// https://github.com/astral-sh/ruff/pull/11370
pub(crate) const fn is_undefined_export_in_dunder_init_enabled(settings: &LinterSettings) -> bool {
settings.preview.is_enabled()
@@ -121,16 +81,9 @@ pub(crate) const fn is_allow_nested_roots_enabled(settings: &LinterSettings) ->
settings.preview.is_enabled()
}
-// https://github.com/astral-sh/ruff/pull/17061
-pub(crate) const fn is_check_file_level_directives_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
-// https://github.com/astral-sh/ruff/pull/17644
-pub(crate) const fn is_readlines_in_for_fix_safe_enabled(settings: &LinterSettings) -> bool {
-settings.preview.is_enabled()
-}
-pub(crate) const fn multiple_with_statements_fix_safe_enabled(settings: &LinterSettings) -> bool {
+// https://github.com/astral-sh/ruff/pull/18208
+pub(crate) const fn is_multiple_with_statements_fix_safe_enabled(
+settings: &LinterSettings,
+) -> bool {
settings.preview.is_enabled()
}
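Every gate deleted above, and every gate that remains in this file, shares one shape: a `const` predicate over `LinterSettings` that forwards to the preview flag, and stabilizing a behavior means deleting its gate so callers take the new path unconditionally. A sketch of that pattern with a minimal stand-in for ruff's `LinterSettings` (hypothetical types and gate name):

```rust
// Minimal stand-in for the preview flag carried on LinterSettings.
#[derive(Clone, Copy)]
enum PreviewMode {
    Disabled,
    Enabled,
}

impl PreviewMode {
    const fn is_enabled(self) -> bool {
        matches!(self, PreviewMode::Enabled)
    }
}

struct LinterSettings {
    preview: PreviewMode,
}

// A hypothetical gate: until the behavior is stabilized, callers branch
// on this; stabilization deletes the function and the branch.
const fn is_some_new_behavior_enabled(settings: &LinterSettings) -> bool {
    settings.preview.is_enabled()
}

fn main() {
    let stable = LinterSettings { preview: PreviewMode::Disabled };
    let preview = LinterSettings { preview: PreviewMode::Enabled };
    assert!(!is_some_new_behavior_enabled(&stable));
    assert!(is_some_new_behavior_enabled(&preview));
    println!("ok");
}
```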


@@ -5,14 +5,14 @@ use ruff_text_size::{TextRange, TextSize};
use ruff_source_file::SourceFile;
-use crate::Diagnostic;
 use crate::IOError;
+use crate::OldDiagnostic;
use crate::message::Message;
use crate::registry::Rule;
use crate::rules::ruff::rules::InvalidPyprojectToml;
use crate::settings::LinterSettings;
-pub fn lint_pyproject_toml(source_file: SourceFile, settings: &LinterSettings) -> Vec<Message> {
+pub fn lint_pyproject_toml(source_file: &SourceFile, settings: &LinterSettings) -> Vec<Message> {
let Some(err) = toml::from_str::<PyProjectToml>(source_file.source_text()).err() else {
return Vec::default();
};
@@ -29,8 +29,9 @@ pub fn lint_pyproject_toml(source_file: SourceFile, settings: &LinterSettings) -
source_file.name(),
);
if settings.rules.enabled(Rule::IOError) {
-let diagnostic = Diagnostic::new(IOError { message }, TextRange::default());
-messages.push(Message::from_diagnostic(diagnostic, source_file, None));
+let diagnostic =
+OldDiagnostic::new(IOError { message }, TextRange::default(), source_file);
+messages.push(Message::from_diagnostic(diagnostic, None));
} else {
warn!(
"{}{}{} {message}",
@@ -51,8 +52,12 @@ pub fn lint_pyproject_toml(source_file: SourceFile, settings: &LinterSettings) -
if settings.rules.enabled(Rule::InvalidPyprojectToml) {
let toml_err = err.message().to_string();
-let diagnostic = Diagnostic::new(InvalidPyprojectToml { message: toml_err }, range);
-messages.push(Message::from_diagnostic(diagnostic, source_file, None));
+let diagnostic = OldDiagnostic::new(
+InvalidPyprojectToml { message: toml_err },
+range,
+source_file,
+);
+messages.push(Message::from_diagnostic(diagnostic, None));
}
messages


@@ -1,6 +1,7 @@
//! Remnant of the registry of all [`Rule`] implementations; it now re-exports from `codes.rs`
//! along with some helper symbols.
+use ruff_db::diagnostic::LintName;
use strum_macros::EnumIter;
pub use codes::Rule;
@@ -348,9 +349,18 @@ impl Rule {
/// Return the URL for the rule documentation, if it exists.
pub fn url(&self) -> Option<String> {
-self.explanation()
-.is_some()
-.then(|| format!("{}/rules/{}", env!("CARGO_PKG_HOMEPAGE"), self.as_ref()))
+self.explanation().is_some().then(|| {
+format!(
+"{}/rules/{name}",
+env!("CARGO_PKG_HOMEPAGE"),
+name = self.name()
+)
+})
}
+pub fn name(&self) -> LintName {
+let name: &'static str = self.into();
+LintName::of(name)
+}
}
@@ -421,7 +431,7 @@ pub mod clap_completion {
fn possible_values(&self) -> Option<Box<dyn Iterator<Item = PossibleValue> + '_>> {
Some(Box::new(Rule::iter().map(|rule| {
let name = rule.noqa_code().to_string();
-let help = rule.as_ref().to_string();
+let help = rule.name().as_str();
PossibleValue::new(name).help(help)
})))
}
@@ -443,7 +453,7 @@ mod tests {
assert!(
rule.explanation().is_some(),
"Rule {} is missing documentation",
-rule.as_ref()
+rule.name()
);
}
}
@@ -460,10 +470,10 @@ mod tests {
.collect();
for rule in Rule::iter() {
-let rule_name = rule.as_ref();
+let rule_name = rule.name();
 for pattern in &patterns {
 assert!(
-!pattern.matches(rule_name),
+!pattern.matches(&rule_name),
"{rule_name} does not match naming convention, see CONTRIBUTING.md"
);
}
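The `url` method rewritten above builds the docs link from `self.name()` rather than the `AsRef<str>` form, and only for rules that have an explanation. The same pattern, sketched with toy stand-ins for the registry types (the homepage string matches ruff's published docs URL; the struct and field names are hypothetical):

```rust
// Hypothetical stand-in for an entry in the rule registry.
struct Rule {
    name: &'static str,
    has_explanation: bool,
}

// ruff reads this from CARGO_PKG_HOMEPAGE; hard-coded here for the sketch.
const HOMEPAGE: &str = "https://docs.astral.sh/ruff";

impl Rule {
    /// Return the documentation URL, but only for documented rules.
    fn url(&self) -> Option<String> {
        self.has_explanation
            .then(|| format!("{HOMEPAGE}/rules/{name}", name = self.name))
    }
}

fn main() {
    let documented = Rule { name: "unused-variable", has_explanation: true };
    let undocumented = Rule { name: "mystery", has_explanation: false };
    assert_eq!(
        documented.url().as_deref(),
        Some("https://docs.astral.sh/ruff/rules/unused-variable")
    );
    assert_eq!(undocumented.url(), None);
    println!("ok");
}
```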


@@ -302,9 +302,8 @@ impl Display for RuleSet {
} else {
writeln!(f, "[")?;
for rule in self {
-let name = rule.as_ref();
 let code = rule.noqa_code();
-writeln!(f, "\t{name} ({code}),")?;
+writeln!(f, "\t{name} ({code}),", name = rule.name())?;
}
write!(f, "]")?;
}


@@ -485,8 +485,7 @@ pub mod clap_completion {
prefix.linter().common_prefix(),
prefix.short_code()
);
-let name: &'static str = rule.into();
-return Some(PossibleValue::new(code).help(name));
+return Some(PossibleValue::new(code).help(rule.name().as_str()));
}
None


@@ -1,10 +1,17 @@
use crate::checkers::ast::Checker;
use crate::fix::edits::remove_unused_imports;
use crate::importer::ImportRequest;
use crate::rules::numpy::helpers::{AttributeSearcher, ImportSearcher};
use ruff_diagnostics::{Edit, Fix};
use ruff_python_ast::name::QualifiedNameBuilder;
use ruff_python_ast::statement_visitor::StatementVisitor;
use ruff_python_ast::visitor::Visitor;
-use ruff_python_ast::{Expr, ExprName, StmtTry};
+use ruff_python_ast::{Expr, ExprAttribute, ExprName, StmtTry};
use ruff_python_semantic::Exceptions;
use ruff_python_semantic::SemanticModel;
use ruff_python_semantic::{MemberNameImport, NameImport};
use ruff_text_size::Ranged;
use ruff_text_size::TextRange;
#[derive(Clone, Debug, Eq, PartialEq)]
pub(crate) enum Replacement {
@@ -170,3 +177,70 @@ pub(crate) fn is_airflow_builtin_or_provider(
_ => false,
}
}
/// Return the [`ast::ExprName`] at the head of the expression, if any.
pub(crate) fn match_head(value: &Expr) -> Option<&ExprName> {
match value {
Expr::Attribute(ExprAttribute { value, .. }) => value.as_name_expr(),
Expr::Name(name) => Some(name),
_ => None,
}
}
/// Return the [`Fix`] that imports the new name and updates where the import is referenced.
/// This is used for cases where the member name has changed.
/// (e.g., `airflow.datasets.Dataset` to `airflow.sdk.Asset`)
pub(crate) fn generate_import_edit(
expr: &Expr,
checker: &Checker,
module: &str,
name: &str,
ranged: TextRange,
) -> Option<Fix> {
let (import_edit, _) = checker
.importer()
.get_or_import_symbol(
&ImportRequest::import_from(module, name),
expr.start(),
checker.semantic(),
)
.ok()?;
let replacement_edit = Edit::range_replacement(name.to_string(), ranged.range());
Some(Fix::safe_edits(import_edit, [replacement_edit]))
}
/// Return the [`Fix`] that removes the original import and imports the same name from the new path.
/// This is used for cases where the member name has not changed.
/// (e.g., `airflow.operators.pig_operator.PigOperator` to `airflow.providers.apache.pig.hooks.pig.PigCliHook`)
pub(crate) fn generate_remove_and_runtime_import_edit(
expr: &Expr,
checker: &Checker,
module: &str,
name: &str,
) -> Option<Fix> {
let head = match_head(expr)?;
let semantic = checker.semantic();
let binding = semantic
.resolve_name(head)
.or_else(|| checker.semantic().lookup_symbol(&head.id))
.map(|id| checker.semantic().binding(id))?;
let stmt = binding.statement(semantic)?;
let remove_edit = remove_unused_imports(
std::iter::once(name),
stmt,
None,
checker.locator(),
checker.stylist(),
checker.indexer(),
)
.ok()?;
let import_edit = checker.importer().add_import(
&NameImport::ImportFrom(MemberNameImport::member(
(*module).to_string(),
name.to_string(),
)),
expr.start(),
);
Some(Fix::unsafe_edits(remove_edit, [import_edit]))
}
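`match_head` above peels an attribute expression down to its leading name so the binding for that name can be resolved. The same head-of-expression idea, sketched over a toy expression enum rather than ruff's AST (note this sketch recurses through nested attributes, whereas ruff's helper inspects only one level):

```rust
// Toy expression tree standing in for ruff_python_ast's Expr.
#[allow(dead_code)]
enum Expr {
    Name(String),
    Attribute { value: Box<Expr>, attr: String },
}

/// Return the name at the head of the expression, if any:
/// `a.b.c` has head `a`; a bare name is its own head.
fn match_head(expr: &Expr) -> Option<&str> {
    match expr {
        Expr::Name(name) => Some(name.as_str()),
        Expr::Attribute { value, .. } => match_head(value),
    }
}

fn main() {
    // Build `a.b.c` by hand.
    let expr = Expr::Attribute {
        value: Box::new(Expr::Attribute {
            value: Box::new(Expr::Name("a".to_string())),
            attr: "b".to_string(),
        }),
        attr: "c".to_string(),
    };
    assert_eq!(match_head(&expr), Some("a"));
    assert_eq!(match_head(&Expr::Name("x".to_string())), Some("x"));
    println!("ok");
}
```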


@@ -1,3 +1,8 @@
+use crate::checkers::ast::Checker;
+use crate::rules::airflow::helpers::{
+ProviderReplacement, generate_import_edit, generate_remove_and_runtime_import_edit,
+is_guarded_by_try_except,
+};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::name::QualifiedName;
use ruff_python_ast::{Expr, ExprAttribute};
@@ -5,10 +10,7 @@ use ruff_python_semantic::Modules;
use ruff_text_size::Ranged;
use ruff_text_size::TextRange;
-use crate::checkers::ast::Checker;
-use crate::importer::ImportRequest;
-use crate::rules::airflow::helpers::{ProviderReplacement, is_guarded_by_try_except};
-use crate::{Edit, Fix, FixAvailability, Violation};
+use crate::{FixAvailability, Violation};
/// ## What it does
/// Checks for uses of Airflow functions and values that have been moved to its providers.
@@ -1169,7 +1171,7 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
[
"airflow",
"sensors",
"external_task",
"external_task" | "external_task_sensor",
"ExternalTaskSensorLink",
] => ProviderReplacement::AutoImport {
module: "airflow.providers.standard.sensors.external_task",
@@ -1181,7 +1183,7 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
"airflow",
"sensors",
"external_task_sensor",
rest @ ("ExternalTaskMarker" | "ExternalTaskSensor" | "ExternalTaskSensorLink"),
rest @ ("ExternalTaskMarker" | "ExternalTaskSensor"),
] => ProviderReplacement::SourceModuleMovedToProvider {
module: "airflow.providers.standard.sensors.external_task",
name: (*rest).to_string(),
@@ -1219,21 +1221,17 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
return;
}
-checker
-.report_diagnostic(
-Airflow3MovedToProvider {
-deprecated: qualified_name,
-replacement: replacement.clone(),
-},
-ranged,
-)
-.try_set_fix(|| {
-let (import_edit, binding) = checker.importer().get_or_import_symbol(
-&ImportRequest::import_from(module, name),
-expr.start(),
-checker.semantic(),
-)?;
-let replacement_edit = Edit::range_replacement(binding, ranged);
-Ok(Fix::safe_edits(import_edit, [replacement_edit]))
-});
+let mut diagnostic = checker.report_diagnostic(
+Airflow3MovedToProvider {
+deprecated: qualified_name,
+replacement: replacement.clone(),
+},
+ranged,
+);
+if let Some(fix) = generate_import_edit(expr, checker, module, name, ranged)
+.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
+{
+diagnostic.set_fix(fix);
+}
}


@@ -1,7 +1,7 @@
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{
-Replacement, is_airflow_builtin_or_provider, is_guarded_by_try_except,
+Replacement, generate_import_edit, generate_remove_and_runtime_import_edit,
+is_airflow_builtin_or_provider, is_guarded_by_try_except,
};
use crate::{Edit, Fix, FixAvailability, Violation};
use ruff_macros::{ViolationMetadata, derive_message_formats};
@@ -614,7 +614,6 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
},
// airflow.configuration
-// TODO: check whether we could improve it
[
"airflow",
"configuration",
@@ -984,24 +983,19 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
}
let import_target = name.split('.').next().unwrap_or(name);
+let mut diagnostic = checker.report_diagnostic(
+Airflow3Removal {
+deprecated: qualified_name.to_string(),
+replacement: replacement.clone(),
+},
+range,
+);
-checker
-.report_diagnostic(
-Airflow3Removal {
-deprecated: qualified_name.to_string(),
-replacement: replacement.clone(),
-},
-range,
-)
-.try_set_fix(|| {
-let (import_edit, _) = checker.importer().get_or_import_symbol(
-&ImportRequest::import_from(module, import_target),
-expr.start(),
-checker.semantic(),
-)?;
-let replacement_edit = Edit::range_replacement(name.to_string(), range);
-Ok(Fix::safe_edits(import_edit, [replacement_edit]))
-});
+if let Some(fix) = generate_import_edit(expr, checker, module, import_target, range)
+.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
+{
+diagnostic.set_fix(fix);
+}
}
/// Check whether a customized Airflow plugin contains removed extensions.


@@ -1,7 +1,9 @@
-use crate::importer::ImportRequest;
-use crate::rules::airflow::helpers::{ProviderReplacement, is_guarded_by_try_except};
-use crate::{Edit, Fix, FixAvailability, Violation};
+use crate::checkers::ast::Checker;
+use crate::rules::airflow::helpers::{
+ProviderReplacement, generate_import_edit, generate_remove_and_runtime_import_edit,
+is_guarded_by_try_except,
+};
+use crate::{FixAvailability, Violation};
use ruff_macros::{ViolationMetadata, derive_message_formats};
use ruff_python_ast::name::QualifiedName;
use ruff_python_ast::{Expr, ExprAttribute};
@@ -9,8 +11,6 @@ use ruff_python_semantic::Modules;
use ruff_text_size::Ranged;
use ruff_text_size::TextRange;
-use crate::checkers::ast::Checker;
/// ## What it does
/// Checks for uses of Airflow functions and values that have been moved to its providers
/// but still have a compatibility layer (e.g., `apache-airflow-providers-standard`).
@@ -302,22 +302,17 @@ fn check_names_moved_to_provider(checker: &Checker, expr: &Expr, ranged: TextRan
if is_guarded_by_try_except(expr, module, name, checker.semantic()) {
return;
}
+let mut diagnostic = checker.report_diagnostic(
+Airflow3SuggestedToMoveToProvider {
+deprecated: qualified_name,
+replacement: replacement.clone(),
+},
+ranged,
+);
-checker
-.report_diagnostic(
-Airflow3SuggestedToMoveToProvider {
-deprecated: qualified_name,
-replacement: replacement.clone(),
-},
-ranged.range(),
-)
-.try_set_fix(|| {
-let (import_edit, binding) = checker.importer().get_or_import_symbol(
-&ImportRequest::import_from(module, name),
-expr.start(),
-checker.semantic(),
-)?;
-let replacement_edit = Edit::range_replacement(binding, ranged.range());
-Ok(Fix::safe_edits(import_edit, [replacement_edit]))
-});
+if let Some(fix) = generate_import_edit(expr, checker, module, name, ranged)
+.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
+{
+diagnostic.set_fix(fix);
+}
}


@@ -1,7 +1,7 @@
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{Replacement, is_airflow_builtin_or_provider};
use crate::rules::airflow::helpers::{
Replacement, is_airflow_builtin_or_provider, is_guarded_by_try_except,
generate_import_edit, generate_remove_and_runtime_import_edit, is_guarded_by_try_except,
};
use crate::{Edit, Fix, FixAvailability, Violation};
use ruff_macros::{ViolationMetadata, derive_message_formats};
@@ -211,7 +211,7 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
name: "AssetAny",
},
"expand_alias_to_datasets" => Replacement::AutoImport {
module: "airflow.sdk",
module: "airflow.models.asset",
name: "expand_alias_to_assets",
},
_ => return,
@@ -256,7 +256,7 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
name: (*rest).to_string(),
},
["airflow", "models", "baseoperatorlink", "BaseOperatorLink"] => Replacement::AutoImport {
module: "airflow.sdk.definitions.baseoperatorlink",
module: "airflow.sdk",
name: "BaseOperatorLink",
},
// airflow.models.DAG
@@ -301,22 +301,16 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
if is_guarded_by_try_except(expr, module, name, checker.semantic()) {
return;
}
-checker
-.report_diagnostic(
-Airflow3SuggestedUpdate {
-deprecated: qualified_name.to_string(),
-replacement: replacement.clone(),
-},
-range,
-)
-.try_set_fix(|| {
-let (import_edit, binding) = checker.importer().get_or_import_symbol(
-&ImportRequest::import_from(module, name),
-expr.start(),
-checker.semantic(),
-)?;
-let replacement_edit = Edit::range_replacement(binding, range);
-Ok(Fix::safe_edits(import_edit, [replacement_edit]))
-});
+let mut diagnostic = checker.report_diagnostic(
+Airflow3SuggestedUpdate {
+deprecated: qualified_name.to_string(),
+replacement: replacement.clone(),
+},
+range,
+);
+if let Some(fix) = generate_import_edit(expr, checker, module, name, range)
+.or_else(|| generate_remove_and_runtime_import_edit(expr, checker, module, name))
+{
+diagnostic.set_fix(fix);
+}
}


@@ -1,651 +1,294 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR301_names.py:53:1: AIR301 `airflow.PY36` is removed in Airflow 3.0
AIR301_names.py:38:1: AIR301 `airflow.PY36` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:7: AIR301 `airflow.PY37` is removed in Airflow 3.0
AIR301_names.py:38:7: AIR301 `airflow.PY37` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:13: AIR301 `airflow.PY38` is removed in Airflow 3.0
AIR301_names.py:38:13: AIR301 `airflow.PY38` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:19: AIR301 `airflow.PY39` is removed in Airflow 3.0
AIR301_names.py:38:19: AIR301 `airflow.PY39` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:25: AIR301 `airflow.PY310` is removed in Airflow 3.0
AIR301_names.py:38:25: AIR301 `airflow.PY310` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:32: AIR301 `airflow.PY311` is removed in Airflow 3.0
AIR301_names.py:38:32: AIR301 `airflow.PY311` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:53:39: AIR301 `airflow.PY312` is removed in Airflow 3.0
AIR301_names.py:38:39: AIR301 `airflow.PY312` is removed in Airflow 3.0
|
52 | # airflow root
53 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
37 | # airflow root
38 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
| ^^^^^ AIR301
54 |
55 | # airflow.api_connexion.security
39 |
40 | # airflow.api_connexion.security
|
= help: Use `sys.version_info` instead
AIR301_names.py:56:1: AIR301 `airflow.api_connexion.security.requires_access` is removed in Airflow 3.0
AIR301_names.py:41:1: AIR301 `airflow.api_connexion.security.requires_access` is removed in Airflow 3.0
|
55 | # airflow.api_connexion.security
56 | requires_access
40 | # airflow.api_connexion.security
41 | requires_access
| ^^^^^^^^^^^^^^^ AIR301
42 |
43 | # airflow.contrib.*
|
= help: Use `airflow.api_fastapi.core_api.security.requires_access_*` instead
AIR301_names.py:60:1: AIR301 [*] `airflow.configuration.get` is removed in Airflow 3.0
AIR301_names.py:44:1: AIR301 `airflow.contrib.aws_athena_hook.AWSAthenaHook` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^ AIR301
|
= help: Use `conf.get` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+conf.get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:6: AIR301 [*] `airflow.configuration.getboolean` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^ AIR301
|
= help: Use `conf.getboolean` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, conf.getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:18: AIR301 [*] `airflow.configuration.getfloat` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^ AIR301
|
= help: Use `conf.getfloat` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, conf.getfloat, getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:28: AIR301 [*] `airflow.configuration.getint` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^ AIR301
|
= help: Use `conf.getint` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, conf.getint, has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:36: AIR301 [*] `airflow.configuration.has_option` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^ AIR301
|
= help: Use `conf.has_option` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, conf.has_option, remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:48: AIR301 [*] `airflow.configuration.remove_option` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^^^^ AIR301
|
= help: Use `conf.remove_option` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, has_option, conf.remove_option, as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:63: AIR301 [*] `airflow.configuration.as_dict` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^ AIR301
|
= help: Use `conf.as_dict` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, has_option, remove_option, conf.as_dict, set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:60:72: AIR301 [*] `airflow.configuration.set` is removed in Airflow 3.0
|
59 | # airflow.configuration
60 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^ AIR301
|
= help: Use `conf.set` from `airflow.configuration` instead.
Safe fix
19 19 | has_option,
20 20 | remove_option,
21 21 | set,
22 |+conf,
22 23 | )
23 24 | from airflow.contrib.aws_athena_hook import AWSAthenaHook
24 25 | from airflow.datasets import DatasetAliasEvent
--------------------------------------------------------------------------------
57 58 |
58 59 |
59 60 | # airflow.configuration
60 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
61 |+get, getboolean, getfloat, getint, has_option, remove_option, as_dict, conf.set
61 62 |
62 63 |
63 64 | # airflow.contrib.*
AIR301_names.py:64:1: AIR301 `airflow.contrib.aws_athena_hook.AWSAthenaHook` is removed in Airflow 3.0
|
63 | # airflow.contrib.*
64 | AWSAthenaHook()
| ^^^^^^^^^^^^^ AIR301
|
= help: The whole `airflow.contrib` module has been removed.
AIR301_names.py:68:1: AIR301 `airflow.datasets.DatasetAliasEvent` is removed in Airflow 3.0
|
67 | # airflow.datasets
68 | DatasetAliasEvent()
| ^^^^^^^^^^^^^^^^^ AIR301
|
AIR301_names.py:72:1: AIR301 `airflow.hooks.base_hook.BaseHook` is removed in Airflow 3.0
|
71 | # airflow.hooks
72 | BaseHook()
| ^^^^^^^^ AIR301
|
= help: Use `BaseHook` from `airflow.hooks.base` instead.
AIR301_names.py:76:1: AIR301 `airflow.operators.subdag.SubDagOperator` is removed in Airflow 3.0
|
75 | # airflow.operators.subdag.*
76 | SubDagOperator()
| ^^^^^^^^^^^^^^ AIR301
|
= help: The whole `airflow.subdag` module has been removed.
AIR301_names.py:85:1: AIR301 `airflow.sensors.base_sensor_operator.BaseSensorOperator` is removed in Airflow 3.0
|
84 | # airflow.sensors.base_sensor_operator
85 | BaseSensorOperator()
| ^^^^^^^^^^^^^^^^^^ AIR301
|
= help: Use `BaseSensorOperator` from `airflow.sdk.bases.sensor` instead.
AIR301_names.py:89:1: AIR301 `airflow.triggers.external_task.TaskStateTrigger` is removed in Airflow 3.0
|
88 | # airflow.triggers.external_task
89 | TaskStateTrigger()
| ^^^^^^^^^^^^^^^^ AIR301
90 |
91 | # airflow.utils.date
|
AIR301_names.py:92:1: AIR301 `airflow.utils.dates.date_range` is removed in Airflow 3.0
|
91 | # airflow.utils.date
92 | dates.date_range
| ^^^^^^^^^^^^^^^^ AIR301
93 | dates.days_ago
|
AIR301_names.py:93:1: AIR301 `airflow.utils.dates.days_ago` is removed in Airflow 3.0
|
91 | # airflow.utils.date
92 | dates.date_range
93 | dates.days_ago
| ^^^^^^^^^^^^^^ AIR301
94 |
95 | date_range
|
= help: Use `pendulum.today('UTC').add(days=-N, ...)` instead
AIR301_names.py:95:1: AIR301 `airflow.utils.dates.date_range` is removed in Airflow 3.0
|
93 | dates.days_ago
94 |
95 | date_range
| ^^^^^^^^^^ AIR301
96 | days_ago
97 | infer_time_unit
|
AIR301_names.py:96:1: AIR301 `airflow.utils.dates.days_ago` is removed in Airflow 3.0
|
95 | date_range
96 | days_ago
| ^^^^^^^^ AIR301
97 | infer_time_unit
98 | parse_execution_date
|
= help: Use `pendulum.today('UTC').add(days=-N, ...)` instead
AIR301_names.py:97:1: AIR301 `airflow.utils.dates.infer_time_unit` is removed in Airflow 3.0
|
95 | date_range
96 | days_ago
97 | infer_time_unit
| ^^^^^^^^^^^^^^^ AIR301
98 | parse_execution_date
99 | round_time
|
AIR301_names.py:98:1: AIR301 `airflow.utils.dates.parse_execution_date` is removed in Airflow 3.0
|
96 | days_ago
97 | infer_time_unit
98 | parse_execution_date
| ^^^^^^^^^^^^^^^^^^^^ AIR301
99 | round_time
100 | scale_time_units
|
AIR301_names.py:99:1: AIR301 `airflow.utils.dates.round_time` is removed in Airflow 3.0
|
97 | infer_time_unit
98 | parse_execution_date
99 | round_time
| ^^^^^^^^^^ AIR301
100 | scale_time_units
|
AIR301_names.py:100:1: AIR301 `airflow.utils.dates.scale_time_units` is removed in Airflow 3.0
|
98 | parse_execution_date
99 | round_time
100 | scale_time_units
| ^^^^^^^^^^^^^^^^ AIR301
101 |
102 | # This one was not deprecated.
|
AIR301_names.py:107:1: AIR301 `airflow.utils.dag_cycle_tester.test_cycle` is removed in Airflow 3.0
|
106 | # airflow.utils.dag_cycle_tester
107 | test_cycle
| ^^^^^^^^^^ AIR301
|
AIR301_names.py:111:1: AIR301 `airflow.utils.db.create_session` is removed in Airflow 3.0
|
110 | # airflow.utils.db
111 | create_session
| ^^^^^^^^^^^^^^ AIR301
112 |
113 | # airflow.utils.decorators
|
AIR301_names.py:114:1: AIR301 `airflow.utils.decorators.apply_defaults` is removed in Airflow 3.0
|
113 | # airflow.utils.decorators
114 | apply_defaults
| ^^^^^^^^^^^^^^ AIR301
115 |
116 | # airflow.utils.file
|
= help: `apply_defaults` is now unconditionally done and can be safely removed.
AIR301_names.py:117:1: AIR301 `airflow.utils.file.TemporaryDirectory` is removed in Airflow 3.0
|
116 | # airflow.utils.file
117 | TemporaryDirectory()
| ^^^^^^^^^^^^^^^^^^ AIR301
118 | mkdirs
|
= help: Use `TemporaryDirectory` from `tempfile` instead.
AIR301_names.py:118:1: AIR301 `airflow.utils.file.mkdirs` is removed in Airflow 3.0
|
116 | # airflow.utils.file
117 | TemporaryDirectory()
118 | mkdirs
| ^^^^^^ AIR301
119 |
120 | # airflow.utils.helpers
|
= help: Use `pathlib.Path({path}).mkdir` instead
AIR301_names.py:121:1: AIR301 [*] `airflow.utils.helpers.chain` is removed in Airflow 3.0
|
120 | # airflow.utils.helpers
121 | helper_chain
| ^^^^^^^^^^^^ AIR301
122 | helper_cross_downstream
|
= help: Use `chain` from `airflow.sdk` instead.
Safe fix
48 48 | from airflow.utils.trigger_rule import TriggerRule
49 49 | from airflow.www.auth import has_access
50 50 | from airflow.www.utils import get_sensitive_variables_fields, should_hide_value_for_key
51 |+from airflow.sdk import chain
51 52 |
52 53 | # airflow root
53 54 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
--------------------------------------------------------------------------------
118 119 | mkdirs
119 120 |
120 121 | # airflow.utils.helpers
121 |-helper_chain
122 |+chain
122 123 | helper_cross_downstream
123 124 |
124 125 | # airflow.utils.log
AIR301_names.py:122:1: AIR301 [*] `airflow.utils.helpers.cross_downstream` is removed in Airflow 3.0
|
120 | # airflow.utils.helpers
121 | helper_chain
122 | helper_cross_downstream
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
123 |
124 | # airflow.utils.log
|
= help: Use `cross_downstream` from `airflow.sdk` instead.
Safe fix
48 48 | from airflow.utils.trigger_rule import TriggerRule
49 49 | from airflow.www.auth import has_access
50 50 | from airflow.www.utils import get_sensitive_variables_fields, should_hide_value_for_key
51 |+from airflow.sdk import cross_downstream
51 52 |
52 53 | # airflow root
53 54 | PY36, PY37, PY38, PY39, PY310, PY311, PY312
--------------------------------------------------------------------------------
119 120 |
120 121 | # airflow.utils.helpers
121 122 | helper_chain
122 |-helper_cross_downstream
123 |+cross_downstream
123 124 |
124 125 | # airflow.utils.log
125 126 | secrets_masker
AIR301_names.py:125:1: AIR301 `airflow.utils.log.secrets_masker` is removed in Airflow 3.0
|
124 | # airflow.utils.log
125 | secrets_masker
| ^^^^^^^^^^^^^^ AIR301
126 |
127 | # airflow.utils.state
|
= help: Use `secrets_masker` from `airflow.sdk.execution_time` instead.
AIR301_names.py:128:1: AIR301 `airflow.utils.state.SHUTDOWN` is removed in Airflow 3.0
|
127 | # airflow.utils.state
128 | SHUTDOWN
| ^^^^^^^^ AIR301
129 | terminating_states
|
AIR301_names.py:129:1: AIR301 `airflow.utils.state.terminating_states` is removed in Airflow 3.0
|
127 | # airflow.utils.state
128 | SHUTDOWN
129 | terminating_states
| ^^^^^^^^^^^^^^^^^^ AIR301
130 |
131 | # airflow.utils.trigger_rule
|
AIR301_names.py:132:1: AIR301 `airflow.utils.trigger_rule.TriggerRule.DUMMY` is removed in Airflow 3.0
|
131 | # airflow.utils.trigger_rule
132 | TriggerRule.DUMMY
| ^^^^^^^^^^^^^^^^^ AIR301
133 | TriggerRule.NONE_FAILED_OR_SKIPPED
|
AIR301_names.py:133:1: AIR301 `airflow.utils.trigger_rule.TriggerRule.NONE_FAILED_OR_SKIPPED` is removed in Airflow 3.0
|
131 | # airflow.utils.trigger_rule
132 | TriggerRule.DUMMY
133 | TriggerRule.NONE_FAILED_OR_SKIPPED
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
|
AIR301_names.py:137:1: AIR301 `airflow.www.auth.has_access` is removed in Airflow 3.0
|
136 | # airflow.www.auth
137 | has_access
| ^^^^^^^^^^ AIR301
138 |
139 | # airflow.www.utils
|
AIR301_names.py:140:1: AIR301 `airflow.www.utils.get_sensitive_variables_fields` is removed in Airflow 3.0
|
139 | # airflow.www.utils
140 | get_sensitive_variables_fields
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
141 | should_hide_value_for_key
|
AIR301_names.py:141:1: AIR301 `airflow.www.utils.should_hide_value_for_key` is removed in Airflow 3.0
|
139 | # airflow.www.utils
140 | get_sensitive_variables_fields
141 | should_hide_value_for_key
| ^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
142 |
143 | # airflow.operators.python
|
AIR301_names.py:146:1: AIR301 `airflow.operators.python.get_current_context` is removed in Airflow 3.0
|
144 | from airflow.operators.python import get_current_context
145 |
146 | get_current_context()
| ^^^^^^^^^^^^^^^^^^^ AIR301
147 |
148 | # airflow.providers.mysql
|
= help: Use `get_current_context` from `airflow.sdk` instead.
AIR301_names.py:151:1: AIR301 `airflow.providers.mysql.datasets.mysql.sanitize_uri` is removed in Airflow 3.0
|
149 | from airflow.providers.mysql.datasets.mysql import sanitize_uri
150 |
151 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
152 |
153 | # airflow.providers.postgres
|
= help: Use `sanitize_uri` from `airflow.providers.mysql.assets.mysql` instead.
AIR301_names.py:156:1: AIR301 `airflow.providers.postgres.datasets.postgres.sanitize_uri` is removed in Airflow 3.0
|
154 | from airflow.providers.postgres.datasets.postgres import sanitize_uri
155 |
156 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
157 |
158 | # airflow.providers.trino
|
= help: Use `sanitize_uri` from `airflow.providers.postgres.assets.postgres` instead.
AIR301_names.py:161:1: AIR301 `airflow.providers.trino.datasets.trino.sanitize_uri` is removed in Airflow 3.0
|
159 | from airflow.providers.trino.datasets.trino import sanitize_uri
160 |
161 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
162 |
163 | # airflow.notifications.basenotifier
|
= help: Use `sanitize_uri` from `airflow.providers.trino.assets.trino` instead.
AIR301_names.py:166:1: AIR301 `airflow.notifications.basenotifier.BaseNotifier` is removed in Airflow 3.0
|
164 | from airflow.notifications.basenotifier import BaseNotifier
165 |
166 | BaseNotifier()
| ^^^^^^^^^^^^ AIR301
167 |
168 | # airflow.auth.manager
|
= help: Use `BaseNotifier` from `airflow.sdk.bases.notifier` instead.
AIR301_names.py:171:1: AIR301 `airflow.auth.managers.base_auth_manager.BaseAuthManager` is removed in Airflow 3.0
|
169 | from airflow.auth.managers.base_auth_manager import BaseAuthManager
170 |
171 | BaseAuthManager()
| ^^^^^^^^^^^^^^^ AIR301
|
= help: Use `BaseAuthManager` from `airflow.api_fastapi.auth.managers.base_auth_manager` instead.
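Several of the help texts above point at plain standard-library replacements for removed `airflow.utils.file` helpers (`TemporaryDirectory` and `mkdirs`). A minimal illustrative sketch of those two migrations, not part of the snapshot itself:

```python
# Illustrative sketch of the stdlib replacements the AIR301 help texts suggest:
#   airflow.utils.file.TemporaryDirectory -> tempfile.TemporaryDirectory
#   airflow.utils.file.mkdirs(path)       -> pathlib.Path(path).mkdir(...)
import pathlib
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    target = pathlib.Path(tmp) / "dags" / "generated"
    # `parents=True, exist_ok=True` recreates the recursive, idempotent
    # behavior of the removed mkdirs() helper.
    target.mkdir(parents=True, exist_ok=True)
    assert target.is_dir()
```

The context manager also removes the directory on exit, which the removed Airflow wrapper delegated to `tempfile` anyway.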


@@ -1,296 +1,780 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR301_names_fix.py:17:1: AIR301 [*] `airflow.api_connexion.security.requires_access_dataset` is removed in Airflow 3.0
|
15 | from airflow.security.permissions import RESOURCE_DATASET
16 |
17 | requires_access_dataset()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
18 |
19 | DatasetDetails()
|
= help: Use `requires_access_asset` from `airflow.api_fastapi.core_api.security` instead.
Safe fix
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 |+from airflow.api_fastapi.core_api.security import requires_access_asset
16 17 |
17 |-requires_access_dataset()
18 |+requires_access_asset()
18 19 |
19 20 | DatasetDetails()
20 21 |
AIR301_names_fix.py:19:1: AIR301 [*] `airflow.auth.managers.models.resource_details.DatasetDetails` is removed in Airflow 3.0
|
17 | requires_access_dataset()
18 |
19 | DatasetDetails()
| ^^^^^^^^^^^^^^ AIR301
20 |
21 | DatasetManager()
|
= help: Use `AssetDetails` from `airflow.api_fastapi.auth.managers.models.resource_details` instead.
Safe fix
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 |+from airflow.api_fastapi.auth.managers.models.resource_details import AssetDetails
16 17 |
17 18 | requires_access_dataset()
18 19 |
19 |-DatasetDetails()
20 |+AssetDetails()
20 21 |
21 22 | DatasetManager()
22 23 | dataset_manager()
AIR301_names_fix.py:21:1: AIR301 [*] `airflow.datasets.manager.DatasetManager` is removed in Airflow 3.0
|
19 | DatasetDetails()
20 |
21 | DatasetManager()
| ^^^^^^^^^^^^^^ AIR301
22 | dataset_manager()
23 | resolve_dataset_manager()
|
= help: Use `AssetManager` from `airflow.assets.manager` instead.
Safe fix
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 |+from airflow.assets.manager import AssetManager
16 17 |
17 18 | requires_access_dataset()
18 19 |
19 20 | DatasetDetails()
20 21 |
21 |-DatasetManager()
22 |+AssetManager()
22 23 | dataset_manager()
23 24 | resolve_dataset_manager()
24 25 |
AIR301_names_fix.py:22:1: AIR301 [*] `airflow.datasets.manager.dataset_manager` is removed in Airflow 3.0
|
21 | DatasetManager()
22 | dataset_manager()
| ^^^^^^^^^^^^^^^ AIR301
23 | resolve_dataset_manager()
|
= help: Use `asset_manager` from `airflow.assets.manager` instead.
Safe fix
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 |+from airflow.assets.manager import asset_manager
16 17 |
17 18 | requires_access_dataset()
18 19 |
19 20 | DatasetDetails()
20 21 |
21 22 | DatasetManager()
22 |-dataset_manager()
23 |+asset_manager()
23 24 | resolve_dataset_manager()
24 25 |
25 26 | DatasetLineageInfo()
AIR301_names_fix.py:23:1: AIR301 [*] `airflow.datasets.manager.resolve_dataset_manager` is removed in Airflow 3.0
|
21 | DatasetManager()
22 | dataset_manager()
23 | resolve_dataset_manager()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
24 |
25 | DatasetLineageInfo()
|
= help: Use `resolve_asset_manager` from `airflow.assets.manager` instead.
Safe fix
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 |+from airflow.assets.manager import resolve_asset_manager
16 17 |
17 18 | requires_access_dataset()
18 19 |
--------------------------------------------------------------------------------
20 21 |
21 22 | DatasetManager()
22 23 | dataset_manager()
23 |-resolve_dataset_manager()
24 |+resolve_asset_manager()
24 25 |
25 26 | DatasetLineageInfo()
26 27 |
AIR301_names_fix.py:25:1: AIR301 [*] `airflow.lineage.hook.DatasetLineageInfo` is removed in Airflow 3.0
|
23 | resolve_dataset_manager()
24 |
25 | DatasetLineageInfo()
| ^^^^^^^^^^^^^^^^^^ AIR301
26 |
27 | AllowListValidator()
|
= help: Use `AssetLineageInfo` from `airflow.lineage.hook` instead.
Safe fix
9 9 | dataset_manager,
10 10 | resolve_dataset_manager,
11 11 | )
12 |-from airflow.lineage.hook import DatasetLineageInfo
12 |+from airflow.lineage.hook import DatasetLineageInfo, AssetLineageInfo
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
--------------------------------------------------------------------------------
22 22 | dataset_manager()
23 23 | resolve_dataset_manager()
24 24 |
25 |-DatasetLineageInfo()
25 |+AssetLineageInfo()
26 26 |
27 27 | AllowListValidator()
28 28 | BlockListValidator()
AIR301_names_fix.py:27:1: AIR301 [*] `airflow.metrics.validators.AllowListValidator` is removed in Airflow 3.0
|
25 | DatasetLineageInfo()
26 |
27 | AllowListValidator()
| ^^^^^^^^^^^^^^^^^^ AIR301
28 | BlockListValidator()
|
= help: Use `PatternAllowListValidator` from `airflow.metrics.validators` instead.
Safe fix
10 10 | resolve_dataset_manager,
11 11 | )
12 12 | from airflow.lineage.hook import DatasetLineageInfo
13 |-from airflow.metrics.validators import AllowListValidator, BlockListValidator
13 |+from airflow.metrics.validators import AllowListValidator, BlockListValidator, PatternAllowListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 16 |
--------------------------------------------------------------------------------
24 24 |
25 25 | DatasetLineageInfo()
26 26 |
27 |-AllowListValidator()
27 |+PatternAllowListValidator()
28 28 | BlockListValidator()
29 29 |
30 30 | load_connections()
AIR301_names_fix.py:28:1: AIR301 [*] `airflow.metrics.validators.BlockListValidator` is removed in Airflow 3.0
|
27 | AllowListValidator()
28 | BlockListValidator()
| ^^^^^^^^^^^^^^^^^^ AIR301
29 |
30 | load_connections()
|
= help: Use `PatternBlockListValidator` from `airflow.metrics.validators` instead.
Safe fix
10 10 | resolve_dataset_manager,
11 11 | )
12 12 | from airflow.lineage.hook import DatasetLineageInfo
13 |-from airflow.metrics.validators import AllowListValidator, BlockListValidator
13 |+from airflow.metrics.validators import AllowListValidator, BlockListValidator, PatternBlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 16 |
--------------------------------------------------------------------------------
25 25 | DatasetLineageInfo()
26 26 |
27 27 | AllowListValidator()
28 |-BlockListValidator()
28 |+PatternBlockListValidator()
29 29 |
30 30 | load_connections()
31 31 |
AIR301_names_fix.py:30:1: AIR301 [*] `airflow.secrets.local_filesystem.load_connections` is removed in Airflow 3.0
|
28 | BlockListValidator()
29 |
30 | load_connections()
| ^^^^^^^^^^^^^^^^ AIR301
31 |
32 | RESOURCE_DATASET
|
= help: Use `load_connections_dict` from `airflow.secrets.local_filesystem` instead.
Safe fix
11 11 | )
12 12 | from airflow.lineage.hook import DatasetLineageInfo
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 |-from airflow.secrets.local_filesystem import load_connections
14 |+from airflow.secrets.local_filesystem import load_connections, load_connections_dict
15 15 | from airflow.security.permissions import RESOURCE_DATASET
16 16 |
17 17 | requires_access_dataset()
--------------------------------------------------------------------------------
27 27 | AllowListValidator()
28 28 | BlockListValidator()
29 29 |
30 |-load_connections()
30 |+load_connections_dict()
31 31 |
32 32 | RESOURCE_DATASET
33 33 |
AIR301_names_fix.py:32:1: AIR301 [*] `airflow.security.permissions.RESOURCE_DATASET` is removed in Airflow 3.0
|
30 | load_connections()
31 |
32 | RESOURCE_DATASET
| ^^^^^^^^^^^^^^^^ AIR301
36 |
37 | has_access_dataset()
|
= help: Use `RESOURCE_ASSET` from `airflow.security.permissions` instead.
Safe fix
13 13 | from airflow.lineage.hook import DatasetLineageInfo
14 14 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
15 15 | from airflow.secrets.local_filesystm import load_connections
16 |-from airflow.security.permissions import RESOURCE_DATASET
16 |+from airflow.security.permissions import RESOURCE_DATASET, RESOURCE_ASSET
17 17 | from airflow.www.auth import has_access_dataset
12 12 | from airflow.lineage.hook import DatasetLineageInfo
13 13 | from airflow.metrics.validators import AllowListValidator, BlockListValidator
14 14 | from airflow.secrets.local_filesystem import load_connections
15 |-from airflow.security.permissions import RESOURCE_DATASET
15 |+from airflow.security.permissions import RESOURCE_DATASET, RESOURCE_ASSET
16 16 |
17 17 | requires_access_dataset()
18 18 |
19 19 | requires_access_dataset()
--------------------------------------------------------------------------------
32 32 |
33 33 | load_connections()
29 29 |
30 30 | load_connections()
31 31 |
32 |-RESOURCE_DATASET
32 |+RESOURCE_ASSET
33 33 |
34 34 |
35 |-RESOURCE_DATASET
35 |+RESOURCE_ASSET
36 36 |
37 37 | has_access_dataset()
38 38 |
35 35 | from airflow.listeners.spec.dataset import (
AIR301_names_fix.py:37:1: AIR301 `airflow.www.auth.has_access_dataset` is removed in Airflow 3.0
AIR301_names_fix.py:40:1: AIR301 [*] `airflow.listeners.spec.dataset.on_dataset_created` is removed in Airflow 3.0
|
35 | RESOURCE_DATASET
36 |
37 | has_access_dataset()
38 | )
39 |
40 | on_dataset_created()
| ^^^^^^^^^^^^^^^^^^ AIR301
38 |
39 | from airflow.listeners.spec.dataset import (
|
AIR301_names_fix.py:44:1: AIR301 [*] `airflow.listeners.spec.dataset.on_dataset_created` is removed in Airflow 3.0
|
42 | )
43 |
44 | on_dataset_created()
| ^^^^^^^^^^^^^^^^^^ AIR301
45 | on_dataset_changed()
41 | on_dataset_changed()
|
= help: Use `on_asset_created` from `airflow.listeners.spec.asset` instead.
Safe fix
40 40 | on_dataset_changed,
41 41 | on_dataset_created,
42 42 | )
43 |+from airflow.listeners.spec.asset import on_asset_created
36 36 | on_dataset_changed,
37 37 | on_dataset_created,
38 38 | )
39 |+from airflow.listeners.spec.asset import on_asset_created
39 40 |
40 |-on_dataset_created()
41 |+on_asset_created()
41 42 | on_dataset_changed()
42 43 |
43 44 |
44 |-on_dataset_created()
45 |+on_asset_created()
45 46 | on_dataset_changed()
AIR301_names_fix.py:45:1: AIR301 [*] `airflow.listeners.spec.dataset.on_dataset_changed` is removed in Airflow 3.0
AIR301_names_fix.py:41:1: AIR301 [*] `airflow.listeners.spec.dataset.on_dataset_changed` is removed in Airflow 3.0
|
44 | on_dataset_created()
45 | on_dataset_changed()
40 | on_dataset_created()
41 | on_dataset_changed()
| ^^^^^^^^^^^^^^^^^^ AIR301
|
= help: Use `on_asset_changed` from `airflow.listeners.spec.asset` instead.
Safe fix
40 40 | on_dataset_changed,
41 41 | on_dataset_created,
42 42 | )
43 |+from airflow.listeners.spec.asset import on_asset_changed
36 36 | on_dataset_changed,
37 37 | on_dataset_created,
38 38 | )
39 |+from airflow.listeners.spec.asset import on_asset_changed
39 40 |
40 41 | on_dataset_created()
41 |-on_dataset_changed()
42 |+on_asset_changed()
42 43 |
43 44 |
44 45 | on_dataset_created()
45 |-on_dataset_changed()
46 |+on_asset_changed()
44 45 | # airflow.operators.python
AIR301_names_fix.py:47:1: AIR301 [*] `airflow.operators.python.get_current_context` is removed in Airflow 3.0
|
45 | from airflow.operators.python import get_current_context
46 |
47 | get_current_context()
| ^^^^^^^^^^^^^^^^^^^ AIR301
48 |
49 | # airflow.providers.mysql
|
= help: Use `get_current_context` from `airflow.sdk` instead.
Unsafe fix
42 42 |
43 43 |
44 44 | # airflow.operators.python
45 |-from airflow.operators.python import get_current_context
45 |+from airflow.sdk import get_current_context
46 46 |
47 47 | get_current_context()
48 48 |
AIR301_names_fix.py:52:1: AIR301 [*] `airflow.providers.mysql.datasets.mysql.sanitize_uri` is removed in Airflow 3.0
|
50 | from airflow.providers.mysql.datasets.mysql import sanitize_uri
51 |
52 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
53 |
54 | # airflow.providers.postgres
|
= help: Use `sanitize_uri` from `airflow.providers.mysql.assets.mysql` instead.
Unsafe fix
47 47 | get_current_context()
48 48 |
49 49 | # airflow.providers.mysql
50 |-from airflow.providers.mysql.datasets.mysql import sanitize_uri
50 |+from airflow.providers.mysql.assets.mysql import sanitize_uri
51 51 |
52 52 | sanitize_uri
53 53 |
AIR301_names_fix.py:57:1: AIR301 [*] `airflow.providers.postgres.datasets.postgres.sanitize_uri` is removed in Airflow 3.0
|
55 | from airflow.providers.postgres.datasets.postgres import sanitize_uri
56 |
57 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
58 |
59 | # airflow.providers.trino
|
= help: Use `sanitize_uri` from `airflow.providers.postgres.assets.postgres` instead.
Unsafe fix
52 52 | sanitize_uri
53 53 |
54 54 | # airflow.providers.postgres
55 |-from airflow.providers.postgres.datasets.postgres import sanitize_uri
55 |+from airflow.providers.postgres.assets.postgres import sanitize_uri
56 56 |
57 57 | sanitize_uri
58 58 |
AIR301_names_fix.py:62:1: AIR301 [*] `airflow.providers.trino.datasets.trino.sanitize_uri` is removed in Airflow 3.0
|
60 | from airflow.providers.trino.datasets.trino import sanitize_uri
61 |
62 | sanitize_uri
| ^^^^^^^^^^^^ AIR301
63 |
64 | # airflow.notifications.basenotifier
|
= help: Use `sanitize_uri` from `airflow.providers.trino.assets.trino` instead.
Unsafe fix
57 57 | sanitize_uri
58 58 |
59 59 | # airflow.providers.trino
60 |-from airflow.providers.trino.datasets.trino import sanitize_uri
60 |+from airflow.providers.trino.assets.trino import sanitize_uri
61 61 |
62 62 | sanitize_uri
63 63 |
AIR301_names_fix.py:67:1: AIR301 [*] `airflow.notifications.basenotifier.BaseNotifier` is removed in Airflow 3.0
|
65 | from airflow.notifications.basenotifier import BaseNotifier
66 |
67 | BaseNotifier()
| ^^^^^^^^^^^^ AIR301
68 |
69 | # airflow.auth.manager
|
= help: Use `BaseNotifier` from `airflow.sdk.bases.notifier` instead.
Unsafe fix
62 62 | sanitize_uri
63 63 |
64 64 | # airflow.notifications.basenotifier
65 |-from airflow.notifications.basenotifier import BaseNotifier
65 |+from airflow.sdk.bases.notifier import BaseNotifier
66 66 |
67 67 | BaseNotifier()
68 68 |
AIR301_names_fix.py:72:1: AIR301 [*] `airflow.auth.managers.base_auth_manager.BaseAuthManager` is removed in Airflow 3.0
|
70 | from airflow.auth.managers.base_auth_manager import BaseAuthManager
71 |
72 | BaseAuthManager()
| ^^^^^^^^^^^^^^^ AIR301
|
= help: Use `BaseAuthManager` from `airflow.api_fastapi.auth.managers.base_auth_manager` instead.
Unsafe fix
67 67 | BaseNotifier()
68 68 |
69 69 | # airflow.auth.manager
70 |-from airflow.auth.managers.base_auth_manager import BaseAuthManager
70 |+from airflow.api_fastapi.auth.managers.base_auth_manager import BaseAuthManager
71 71 |
72 72 | BaseAuthManager()
73 73 |
AIR301_names_fix.py:87:1: AIR301 [*] `airflow.configuration.get` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.get` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+conf, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:87:6: AIR301 [*] `airflow.configuration.getboolean` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.getboolean` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+get, conf, getfloat, getint, has_option, remove_option, as_dict, set
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:87:18: AIR301 [*] `airflow.configuration.getfloat` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.getfloat` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+get, getboolean, conf, getint, has_option, remove_option, as_dict, set
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:87:28: AIR301 [*] `airflow.configuration.getint` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.getint` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+get, getboolean, getfloat, conf, has_option, remove_option, as_dict, set
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:87:36: AIR301 [*] `airflow.configuration.has_option` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.has_option` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+get, getboolean, getfloat, getint, conf, remove_option, as_dict, set
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:87:48: AIR301 [*] `airflow.configuration.remove_option` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^^^^^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.remove_option` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+get, getboolean, getfloat, getint, has_option, conf, as_dict, set
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:87:63: AIR301 [*] `airflow.configuration.as_dict` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^^^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.as_dict` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+get, getboolean, getfloat, getint, has_option, remove_option, conf, set
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:87:72: AIR301 [*] `airflow.configuration.set` is removed in Airflow 3.0
|
86 | # airflow.configuration
87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
| ^^^ AIR301
88 | from airflow.hooks.base_hook import BaseHook
|
= help: Use `conf.set` from `airflow.configuration` instead.
Safe fix
81 81 | has_option,
82 82 | remove_option,
83 83 | set,
84 |+conf,
84 85 | )
85 86 |
86 87 | # airflow.configuration
87 |-get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |+get, getboolean, getfloat, getint, has_option, remove_option, as_dict, conf
88 89 | from airflow.hooks.base_hook import BaseHook
89 90 |
90 91 | # airflow.hooks
AIR301_names_fix.py:91:1: AIR301 [*] `airflow.hooks.base_hook.BaseHook` is removed in Airflow 3.0
|
90 | # airflow.hooks
91 | BaseHook()
| ^^^^^^^^ AIR301
92 |
93 | from airflow.sensors.base_sensor_operator import BaseSensorOperator
|
= help: Use `BaseHook` from `airflow.hooks.base` instead.
Unsafe fix
85 85 |
86 86 | # airflow.configuration
87 87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |-from airflow.hooks.base_hook import BaseHook
88 |+from airflow.hooks.base import BaseHook
89 89 |
90 90 | # airflow.hooks
91 91 | BaseHook()
AIR301_names_fix.py:96:1: AIR301 [*] `airflow.sensors.base_sensor_operator.BaseSensorOperator` is removed in Airflow 3.0
|
95 | # airflow.sensors.base_sensor_operator
96 | BaseSensorOperator()
| ^^^^^^^^^^^^^^^^^^ AIR301
97 | BaseHook()
|
= help: Use `BaseSensorOperator` from `airflow.sdk.bases.sensor` instead.
Unsafe fix
90 90 | # airflow.hooks
91 91 | BaseHook()
92 92 |
93 |-from airflow.sensors.base_sensor_operator import BaseSensorOperator
93 |+from airflow.sdk.bases.sensor import BaseSensorOperator
94 94 |
95 95 | # airflow.sensors.base_sensor_operator
96 96 | BaseSensorOperator()
AIR301_names_fix.py:97:1: AIR301 [*] `airflow.hooks.base_hook.BaseHook` is removed in Airflow 3.0
|
95 | # airflow.sensors.base_sensor_operator
96 | BaseSensorOperator()
97 | BaseHook()
| ^^^^^^^^ AIR301
98 |
99 | from airflow.utils.helpers import chain as helper_chain
|
= help: Use `BaseHook` from `airflow.hooks.base` instead.
Unsafe fix
85 85 |
86 86 | # airflow.configuration
87 87 | get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
88 |-from airflow.hooks.base_hook import BaseHook
89 88 |
90 89 | # airflow.hooks
91 90 | BaseHook()
92 91 |
93 92 | from airflow.sensors.base_sensor_operator import BaseSensorOperator
93 |+from airflow.hooks.base import BaseHook
94 94 |
95 95 | # airflow.sensors.base_sensor_operator
96 96 | BaseSensorOperator()
AIR301_names_fix.py:103:1: AIR301 [*] `airflow.utils.helpers.chain` is removed in Airflow 3.0
|
102 | # airflow.utils.helpers
103 | helper_chain
| ^^^^^^^^^^^^ AIR301
104 | helper_cross_downstream
|
= help: Use `chain` from `airflow.sdk` instead.
Safe fix
98 98 |
99 99 | from airflow.utils.helpers import chain as helper_chain
100 100 | from airflow.utils.helpers import cross_downstream as helper_cross_downstream
101 |+from airflow.sdk import chain
101 102 |
102 103 | # airflow.utils.helpers
103 |-helper_chain
104 |+chain
104 105 | helper_cross_downstream
105 106 |
106 107 | # airflow.utils.file
AIR301_names_fix.py:104:1: AIR301 [*] `airflow.utils.helpers.cross_downstream` is removed in Airflow 3.0
|
102 | # airflow.utils.helpers
103 | helper_chain
104 | helper_cross_downstream
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
105 |
106 | # airflow.utils.file
|
= help: Use `cross_downstream` from `airflow.sdk` instead.
Safe fix
98 98 |
99 99 | from airflow.utils.helpers import chain as helper_chain
100 100 | from airflow.utils.helpers import cross_downstream as helper_cross_downstream
101 |+from airflow.sdk import cross_downstream
101 102 |
102 103 | # airflow.utils.helpers
103 104 | helper_chain
104 |-helper_cross_downstream
105 |+cross_downstream
105 106 |
106 107 | # airflow.utils.file
107 108 | from airflow.utils.file import TemporaryDirectory
AIR301_names_fix.py:109:1: AIR301 [*] `airflow.utils.file.TemporaryDirectory` is removed in Airflow 3.0
|
107 | from airflow.utils.file import TemporaryDirectory
108 |
109 | TemporaryDirectory()
| ^^^^^^^^^^^^^^^^^^ AIR301
110 |
111 | from airflow.utils.log import secrets_masker
|
= help: Use `TemporaryDirectory` from `tempfile` instead.
Unsafe fix
104 104 | helper_cross_downstream
105 105 |
106 106 | # airflow.utils.file
107 |-from airflow.utils.file import TemporaryDirectory
107 |+from tempfile import TemporaryDirectory
108 108 |
109 109 | TemporaryDirectory()
110 110 |
AIR301_names_fix.py:114:1: AIR301 [*] `airflow.utils.log.secrets_masker` is removed in Airflow 3.0
|
113 | # airflow.utils.log
114 | secrets_masker
| ^^^^^^^^^^^^^^ AIR301
|
= help: Use `secrets_masker` from `airflow.sdk.execution_time` instead.
Unsafe fix
108 108 |
109 109 | TemporaryDirectory()
110 110 |
111 |-from airflow.utils.log import secrets_masker
111 |+from airflow.sdk.execution_time import secrets_masker
112 112 |
113 113 | # airflow.utils.log
114 114 | secrets_masker


@@ -1,216 +1,243 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR301_provider_names_fix.py:25:1: AIR301 [*] `airflow.providers.amazon.aws.auth_manager.avp.entities.AvpEntities.DATASET` is removed in Airflow 3.0
AIR301_provider_names_fix.py:11:1: AIR301 [*] `airflow.providers.amazon.aws.auth_manager.avp.entities.AvpEntities.DATASET` is removed in Airflow 3.0
|
23 | )
24 |
25 | AvpEntities.DATASET
9 | from airflow.security.permissions import RESOURCE_DATASET
10 |
11 | AvpEntities.DATASET
| ^^^^^^^^^^^^^^^^^^^ AIR301
26 |
27 | s3_create_dataset()
12 |
13 | # airflow.providers.openlineage.utils.utils
|
= help: Use `AvpEntities.ASSET` from `airflow.providers.amazon.aws.auth_manager.avp.entities` instead.
Safe fix
22 22 | translate_airflow_dataset,
23 23 | )
24 24 |
25 |-AvpEntities.DATASET
25 |+AvpEntities.ASSET
26 26 |
27 27 | s3_create_dataset()
28 28 | s3_convert_dataset_to_openlineage()
8 8 | from airflow.secrets.local_filesystem import load_connections
9 9 | from airflow.security.permissions import RESOURCE_DATASET
10 10 |
11 |-AvpEntities.DATASET
11 |+AvpEntities
12 12 |
13 13 | # airflow.providers.openlineage.utils.utils
14 14 | DatasetInfo()
AIR301_provider_names_fix.py:27:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.create_dataset` is removed in Airflow 3.0
AIR301_provider_names_fix.py:14:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.DatasetInfo` is removed in Airflow 3.0
|
25 | AvpEntities.DATASET
26 |
27 | s3_create_dataset()
| ^^^^^^^^^^^^^^^^^ AIR301
28 | s3_convert_dataset_to_openlineage()
|
= help: Use `create_asset` from `airflow.providers.amazon.aws.assets.s3` instead.
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.amazon.aws.assets.s3 import create_asset
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
27 |-s3_create_dataset()
28 |+create_asset()
28 29 | s3_convert_dataset_to_openlineage()
29 30 |
30 31 | io_create_dataset()
AIR301_provider_names_fix.py:28:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
27 | s3_create_dataset()
28 | s3_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
29 |
30 | io_create_dataset()
|
= help: Use `convert_asset_to_openlineage` from `airflow.providers.amazon.aws.assets.s3` instead.
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.amazon.aws.assets.s3 import convert_asset_to_openlineage
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
27 28 | s3_create_dataset()
28 |-s3_convert_dataset_to_openlineage()
29 |+convert_asset_to_openlineage()
29 30 |
30 31 | io_create_dataset()
31 32 | io_convert_dataset_to_openlineage()
AIR301_provider_names_fix.py:36:1: AIR301 [*] `airflow.providers.google.datasets.bigquery.create_dataset` is removed in Airflow 3.0
|
35 | # airflow.providers.google.datasets.bigquery
36 | bigquery_create_dataset()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
37 | # airflow.providers.google.datasets.gcs
38 | gcs_create_dataset()
|
= help: Use `create_asset` from `airflow.providers.google.assets.bigquery` instead.
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.google.assets.bigquery import create_asset
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
--------------------------------------------------------------------------------
33 34 |
34 35 |
35 36 | # airflow.providers.google.datasets.bigquery
36 |-bigquery_create_dataset()
37 |+create_asset()
37 38 | # airflow.providers.google.datasets.gcs
38 39 | gcs_create_dataset()
39 40 | gcs_convert_dataset_to_openlineage()
AIR301_provider_names_fix.py:38:1: AIR301 [*] `airflow.providers.google.datasets.gcs.create_dataset` is removed in Airflow 3.0
|
36 | bigquery_create_dataset()
37 | # airflow.providers.google.datasets.gcs
38 | gcs_create_dataset()
| ^^^^^^^^^^^^^^^^^^ AIR301
39 | gcs_convert_dataset_to_openlineage()
40 | # airflow.providers.openlineage.utils.utils
|
= help: Use `create_asset` from `airflow.providers.google.assets.gcs` instead.
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.google.assets.gcs import create_asset
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
--------------------------------------------------------------------------------
35 36 | # airflow.providers.google.datasets.bigquery
36 37 | bigquery_create_dataset()
37 38 | # airflow.providers.google.datasets.gcs
38 |-gcs_create_dataset()
39 |+create_asset()
39 40 | gcs_convert_dataset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 42 | DatasetInfo()
AIR301_provider_names_fix.py:39:1: AIR301 [*] `airflow.providers.google.datasets.gcs.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
37 | # airflow.providers.google.datasets.gcs
38 | gcs_create_dataset()
39 | gcs_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
40 | # airflow.providers.openlineage.utils.utils
41 | DatasetInfo()
|
= help: Use `convert_asset_to_openlineage` from `airflow.providers.google.assets.gcs` instead.
Safe fix
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 23 | )
24 |+from airflow.providers.google.assets.gcs import convert_asset_to_openlineage
24 25 |
25 26 | AvpEntities.DATASET
26 27 |
--------------------------------------------------------------------------------
36 37 | bigquery_create_dataset()
37 38 | # airflow.providers.google.datasets.gcs
38 39 | gcs_create_dataset()
39 |-gcs_convert_dataset_to_openlineage()
40 |+convert_asset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 42 | DatasetInfo()
42 43 | translate_airflow_dataset()
AIR301_provider_names_fix.py:41:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.DatasetInfo` is removed in Airflow 3.0
|
39 | gcs_convert_dataset_to_openlineage()
40 | # airflow.providers.openlineage.utils.utils
41 | DatasetInfo()
13 | # airflow.providers.openlineage.utils.utils
14 | DatasetInfo()
| ^^^^^^^^^^^ AIR301
42 | translate_airflow_dataset()
43 | #
15 | translate_airflow_dataset()
|
= help: Use `AssetInfo` from `airflow.providers.openlineage.utils.utils` instead.
Safe fix
20 20 | from airflow.providers.openlineage.utils.utils import (
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 |+AssetInfo,
23 24 | )
24 25 |
25 26 | AvpEntities.DATASET
4 4 | from airflow.providers.openlineage.utils.utils import (
5 5 | DatasetInfo,
6 6 | translate_airflow_dataset,
7 |+AssetInfo,
7 8 | )
8 9 | from airflow.secrets.local_filesystem import load_connections
9 10 | from airflow.security.permissions import RESOURCE_DATASET
--------------------------------------------------------------------------------
38 39 | gcs_create_dataset()
39 40 | gcs_convert_dataset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 |-DatasetInfo()
42 |+AssetInfo()
42 43 | translate_airflow_dataset()
43 44 | #
44 45 | # airflow.secrets.local_filesystem
11 12 | AvpEntities.DATASET
12 13 |
13 14 | # airflow.providers.openlineage.utils.utils
14 |-DatasetInfo()
15 |+AssetInfo()
15 16 | translate_airflow_dataset()
16 17 |
17 18 | # airflow.secrets.local_filesystem
AIR301_provider_names_fix.py:42:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.translate_airflow_dataset` is removed in Airflow 3.0
AIR301_provider_names_fix.py:15:1: AIR301 [*] `airflow.providers.openlineage.utils.utils.translate_airflow_dataset` is removed in Airflow 3.0
|
40 | # airflow.providers.openlineage.utils.utils
41 | DatasetInfo()
42 | translate_airflow_dataset()
13 | # airflow.providers.openlineage.utils.utils
14 | DatasetInfo()
15 | translate_airflow_dataset()
| ^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
43 | #
44 | # airflow.secrets.local_filesystem
16 |
17 | # airflow.secrets.local_filesystem
|
= help: Use `translate_airflow_asset` from `airflow.providers.openlineage.utils.utils` instead.
Safe fix
20 20 | from airflow.providers.openlineage.utils.utils import (
21 21 | DatasetInfo,
22 22 | translate_airflow_dataset,
23 |+translate_airflow_asset,
23 24 | )
24 25 |
25 26 | AvpEntities.DATASET
4 4 | from airflow.providers.openlineage.utils.utils import (
5 5 | DatasetInfo,
6 6 | translate_airflow_dataset,
7 |+translate_airflow_asset,
7 8 | )
8 9 | from airflow.secrets.local_filesystem import load_connections
9 10 | from airflow.security.permissions import RESOURCE_DATASET
--------------------------------------------------------------------------------
39 40 | gcs_convert_dataset_to_openlineage()
40 41 | # airflow.providers.openlineage.utils.utils
41 42 | DatasetInfo()
42 |-translate_airflow_dataset()
43 |+translate_airflow_asset()
43 44 | #
44 45 | # airflow.secrets.local_filesystem
45 46 | load_connections()
12 13 |
13 14 | # airflow.providers.openlineage.utils.utils
14 15 | DatasetInfo()
15 |-translate_airflow_dataset()
16 |+translate_airflow_asset()
16 17 |
17 18 | # airflow.secrets.local_filesystem
18 19 | load_connections()
AIR301_provider_names_fix.py:18:1: AIR301 [*] `airflow.secrets.local_filesystem.load_connections` is removed in Airflow 3.0
|
17 | # airflow.secrets.local_filesystem
18 | load_connections()
| ^^^^^^^^^^^^^^^^ AIR301
19 |
20 | # airflow.security.permissions
|
= help: Use `load_connections_dict` from `airflow.secrets.local_filesystem` instead.
Safe fix
5 5 | DatasetInfo,
6 6 | translate_airflow_dataset,
7 7 | )
8 |-from airflow.secrets.local_filesystem import load_connections
8 |+from airflow.secrets.local_filesystem import load_connections, load_connections_dict
9 9 | from airflow.security.permissions import RESOURCE_DATASET
10 10 |
11 11 | AvpEntities.DATASET
--------------------------------------------------------------------------------
15 15 | translate_airflow_dataset()
16 16 |
17 17 | # airflow.secrets.local_filesystem
18 |-load_connections()
18 |+load_connections_dict()
19 19 |
20 20 | # airflow.security.permissions
21 21 | RESOURCE_DATASET
AIR301_provider_names_fix.py:21:1: AIR301 [*] `airflow.security.permissions.RESOURCE_DATASET` is removed in Airflow 3.0
|
20 | # airflow.security.permissions
21 | RESOURCE_DATASET
| ^^^^^^^^^^^^^^^^ AIR301
22 |
23 | from airflow.providers.amazon.aws.datasets.s3 import (
|
= help: Use `RESOURCE_ASSET` from `airflow.security.permissions` instead.
Safe fix
6 6 | translate_airflow_dataset,
7 7 | )
8 8 | from airflow.secrets.local_filesystem import load_connections
9 |-from airflow.security.permissions import RESOURCE_DATASET
9 |+from airflow.security.permissions import RESOURCE_DATASET, RESOURCE_ASSET
10 10 |
11 11 | AvpEntities.DATASET
12 12 |
--------------------------------------------------------------------------------
18 18 | load_connections()
19 19 |
20 20 | # airflow.security.permissions
21 |-RESOURCE_DATASET
21 |+RESOURCE_ASSET
22 22 |
23 23 | from airflow.providers.amazon.aws.datasets.s3 import (
24 24 | convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
AIR301_provider_names_fix.py:28:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.create_dataset` is removed in Airflow 3.0
|
26 | from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
27 |
28 | s3_create_dataset()
| ^^^^^^^^^^^^^^^^^ AIR301
29 | s3_convert_dataset_to_openlineage()
|
= help: Use `create_asset` from `airflow.providers.amazon.aws.assets.s3` instead.
Safe fix
24 24 | convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
25 25 | )
26 26 | from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
27 |+from airflow.providers.amazon.aws.assets.s3 import create_asset
27 28 |
28 |-s3_create_dataset()
29 |+create_asset()
29 30 | s3_convert_dataset_to_openlineage()
30 31 |
31 32 | from airflow.providers.common.io.dataset.file import (
AIR301_provider_names_fix.py:29:1: AIR301 [*] `airflow.providers.amazon.aws.datasets.s3.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
28 | s3_create_dataset()
29 | s3_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
30 |
31 | from airflow.providers.common.io.dataset.file import (
|
= help: Use `convert_asset_to_openlineage` from `airflow.providers.amazon.aws.assets.s3` instead.
Safe fix
24 24 | convert_dataset_to_openlineage as s3_convert_dataset_to_openlineage,
25 25 | )
26 26 | from airflow.providers.amazon.aws.datasets.s3 import create_dataset as s3_create_dataset
27 |+from airflow.providers.amazon.aws.assets.s3 import convert_asset_to_openlineage
27 28 |
28 29 | s3_create_dataset()
29 |-s3_convert_dataset_to_openlineage()
30 |+convert_asset_to_openlineage()
30 31 |
31 32 | from airflow.providers.common.io.dataset.file import (
32 33 | convert_dataset_to_openlineage as io_convert_dataset_to_openlineage,
AIR301_provider_names_fix.py:45:1: AIR301 [*] `airflow.providers.google.datasets.bigquery.create_dataset` is removed in Airflow 3.0
|
43 | )
44 |
45 | bigquery_create_dataset()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR301
46 |
47 | # airflow.providers.google.datasets.gcs
|
= help: Use `create_asset` from `airflow.providers.google.assets.bigquery` instead.
Safe fix
41 41 | from airflow.providers.google.datasets.bigquery import (
42 42 | create_dataset as bigquery_create_dataset,
43 43 | )
44 |+from airflow.providers.google.assets.bigquery import create_asset
44 45 |
45 |-bigquery_create_dataset()
46 |+create_asset()
46 47 |
47 48 | # airflow.providers.google.datasets.gcs
48 49 | from airflow.providers.google.datasets.gcs import (
AIR301_provider_names_fix.py:53:1: AIR301 [*] `airflow.providers.google.datasets.gcs.create_dataset` is removed in Airflow 3.0
|
51 | from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
52 |
53 | gcs_create_dataset()
| ^^^^^^^^^^^^^^^^^^ AIR301
54 | gcs_convert_dataset_to_openlineage()
|
= help: Use `create_asset` from `airflow.providers.google.assets.gcs` instead.
Safe fix
49 49 | convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
50 50 | )
51 51 | from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
52 |+from airflow.providers.google.assets.gcs import create_asset
52 53 |
53 |-gcs_create_dataset()
54 |+create_asset()
54 55 | gcs_convert_dataset_to_openlineage()
AIR301_provider_names_fix.py:54:1: AIR301 [*] `airflow.providers.google.datasets.gcs.convert_dataset_to_openlineage` is removed in Airflow 3.0
|
53 | gcs_create_dataset()
54 | gcs_convert_dataset_to_openlineage()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR301
|
= help: Use `convert_asset_to_openlineage` from `airflow.providers.google.assets.gcs` instead.
Safe fix
49 49 | convert_dataset_to_openlineage as gcs_convert_dataset_to_openlineage,
50 50 | )
51 51 | from airflow.providers.google.datasets.gcs import create_dataset as gcs_create_dataset
52 |+from airflow.providers.google.assets.gcs import convert_asset_to_openlineage
52 53 |
53 54 | gcs_create_dataset()
54 |-gcs_convert_dataset_to_openlineage()
55 |+convert_asset_to_openlineage()
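Read as plain Python, the AIR301 fixes above all follow one pattern: drop the removed `datasets` entry point and import the matching `assets` replacement. A minimal sketch of that mapping, with the paths taken from the diagnostics above (the dict and helper are illustrative, not a ruff API):

```python
# Dataset -> asset renames exercised by the AIR301 snapshots above.
# Illustrative only; ruff encodes these replacements internally.
AIR301_RENAMES = {
    "airflow.providers.google.datasets.bigquery.create_dataset":
        "airflow.providers.google.assets.bigquery.create_asset",
    "airflow.providers.google.datasets.gcs.create_dataset":
        "airflow.providers.google.assets.gcs.create_asset",
    "airflow.providers.google.datasets.gcs.convert_dataset_to_openlineage":
        "airflow.providers.google.assets.gcs.convert_asset_to_openlineage",
}


def split_rename(old_path: str) -> tuple[str, str, str, str]:
    """Return (old_module, old_name, new_module, new_name) for one rename."""
    new_path = AIR301_RENAMES[old_path]
    old_mod, _, old_name = old_path.rpartition(".")
    new_mod, _, new_name = new_path.rpartition(".")
    return old_mod, old_name, new_mod, new_name
```

The fix preview above matches this shape: a new `from <new_module> import <new_name>` line is inserted and the call site is rewritten to the new name.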


@@ -1,113 +1,252 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_amazon.py:23:1: AIR302 `airflow.hooks.S3_hook.S3Hook` is moved into `amazon` provider in Airflow 3.0;
AIR302_amazon.py:14:1: AIR302 [*] `airflow.hooks.S3_hook.S3Hook` is moved into `amazon` provider in Airflow 3.0;
|
21 | from airflow.sensors.s3_key_sensor import S3KeySensor
22 |
23 | S3Hook()
12 | from airflow.sensors.s3_key_sensor import S3KeySensor
13 |
14 | S3Hook()
| ^^^^^^ AIR302
24 | provide_bucket_name()
15 | provide_bucket_name()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3Hook` from `airflow.providers.amazon.aws.hooks.s3` instead.
AIR302_amazon.py:24:1: AIR302 `airflow.hooks.S3_hook.provide_bucket_name` is moved into `amazon` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.hooks.S3_hook import (
4 |- S3Hook,
5 4 | provide_bucket_name,
6 5 | )
7 6 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
--------------------------------------------------------------------------------
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.hooks.s3 import S3Hook
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:15:1: AIR302 [*] `airflow.hooks.S3_hook.provide_bucket_name` is moved into `amazon` provider in Airflow 3.0;
|
23 | S3Hook()
24 | provide_bucket_name()
14 | S3Hook()
15 | provide_bucket_name()
| ^^^^^^^^^^^^^^^^^^^ AIR302
25 |
26 | GCSToS3Operator()
16 |
17 | GCSToS3Operator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `provide_bucket_name` from `airflow.providers.amazon.aws.hooks.s3` instead.
AIR302_amazon.py:26:1: AIR302 `airflow.operators.gcs_to_s3.GCSToS3Operator` is moved into `amazon` provider in Airflow 3.0;
Unsafe fix
2 2 |
3 3 | from airflow.hooks.S3_hook import (
4 4 | S3Hook,
5 |- provide_bucket_name,
6 5 | )
7 6 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 7 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
--------------------------------------------------------------------------------
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.hooks.s3 import provide_bucket_name
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:17:1: AIR302 [*] `airflow.operators.gcs_to_s3.GCSToS3Operator` is moved into `amazon` provider in Airflow 3.0;
|
24 | provide_bucket_name()
25 |
26 | GCSToS3Operator()
15 | provide_bucket_name()
16 |
17 | GCSToS3Operator()
| ^^^^^^^^^^^^^^^ AIR302
27 |
28 | GoogleApiToS3Operator()
18 | GoogleApiToS3Operator()
19 | RedshiftToS3Operator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `GCSToS3Operator` from `airflow.providers.amazon.aws.transfers.gcs_to_s3` instead.
AIR302_amazon.py:28:1: AIR302 `airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Operator` is moved into `amazon` provider in Airflow 3.0;
Unsafe fix
4 4 | S3Hook,
5 5 | provide_bucket_name,
6 6 | )
7 |-from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 7 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 8 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:18:1: AIR302 [*] `airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Operator` is moved into `amazon` provider in Airflow 3.0;
|
26 | GCSToS3Operator()
27 |
28 | GoogleApiToS3Operator()
17 | GCSToS3Operator()
18 | GoogleApiToS3Operator()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
29 | GoogleApiToS3Transfer()
19 | RedshiftToS3Operator()
20 | S3FileTransformOperator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `GoogleApiToS3Operator` from `airflow.providers.amazon.aws.transfers.google_api_to_s3` instead.
AIR302_amazon.py:29:1: AIR302 `airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Transfer` is moved into `amazon` provider in Airflow 3.0;
|
28 | GoogleApiToS3Operator()
29 | GoogleApiToS3Transfer()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
30 |
31 | RedshiftToS3Operator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `GoogleApiToS3Operator` from `airflow.providers.amazon.aws.transfers.google_api_to_s3` instead.
Unsafe fix
5 5 | provide_bucket_name,
6 6 | )
7 7 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 |-from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 8 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.google_api_to_s3 import GoogleApiToS3Operator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:31:1: AIR302 `airflow.operators.redshift_to_s3_operator.RedshiftToS3Operator` is moved into `amazon` provider in Airflow 3.0;
AIR302_amazon.py:19:1: AIR302 [*] `airflow.operators.redshift_to_s3_operator.RedshiftToS3Operator` is moved into `amazon` provider in Airflow 3.0;
|
29 | GoogleApiToS3Transfer()
30 |
31 | RedshiftToS3Operator()
17 | GCSToS3Operator()
18 | GoogleApiToS3Operator()
19 | RedshiftToS3Operator()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
32 | RedshiftToS3Transfer()
20 | S3FileTransformOperator()
21 | S3ToRedshiftOperator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `RedshiftToS3Operator` from `airflow.providers.amazon.aws.transfers.redshift_to_s3` instead.
AIR302_amazon.py:32:1: AIR302 `airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer` is moved into `amazon` provider in Airflow 3.0;
|
31 | RedshiftToS3Operator()
32 | RedshiftToS3Transfer()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
33 |
34 | S3FileTransformOperator()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `RedshiftToS3Operator` from `airflow.providers.amazon.aws.transfers.redshift_to_s3` instead.
Unsafe fix
6 6 | )
7 7 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 8 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 |-from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 9 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:34:1: AIR302 `airflow.operators.s3_file_transform_operator.S3FileTransformOperator` is moved into `amazon` provider in Airflow 3.0;
AIR302_amazon.py:20:1: AIR302 [*] `airflow.operators.s3_file_transform_operator.S3FileTransformOperator` is moved into `amazon` provider in Airflow 3.0;
|
32 | RedshiftToS3Transfer()
33 |
34 | S3FileTransformOperator()
18 | GoogleApiToS3Operator()
19 | RedshiftToS3Operator()
20 | S3FileTransformOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^ AIR302
35 |
36 | S3ToRedshiftOperator()
21 | S3ToRedshiftOperator()
22 | S3KeySensor()
|
= help: Install `apache-airflow-providers-amazon>=3.0.0` and use `S3FileTransformOperator` from `airflow.providers.amazon.aws.operators.s3` instead.
AIR302_amazon.py:36:1: AIR302 `airflow.operators.s3_to_redshift_operator.S3ToRedshiftOperator` is moved into `amazon` provider in Airflow 3.0;
Unsafe fix
7 7 | from airflow.operators.gcs_to_s3 import GCSToS3Operator
8 8 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 9 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 |-from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 10 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:21:1: AIR302 [*] `airflow.operators.s3_to_redshift_operator.S3ToRedshiftOperator` is moved into `amazon` provider in Airflow 3.0;
|
34 | S3FileTransformOperator()
35 |
36 | S3ToRedshiftOperator()
19 | RedshiftToS3Operator()
20 | S3FileTransformOperator()
21 | S3ToRedshiftOperator()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
37 | S3ToRedshiftTransfer()
22 | S3KeySensor()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3ToRedshiftOperator` from `airflow.providers.amazon.aws.transfers.s3_to_redshift` instead.
AIR302_amazon.py:37:1: AIR302 `airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer` is moved into `amazon` provider in Airflow 3.0;
|
36 | S3ToRedshiftOperator()
37 | S3ToRedshiftTransfer()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
38 |
39 | S3KeySensor()
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3ToRedshiftOperator` from `airflow.providers.amazon.aws.transfers.s3_to_redshift` instead.
Unsafe fix
8 8 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Operator
9 9 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 10 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 |-from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 11 | from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:39:1: AIR302 `airflow.sensors.s3_key_sensor.S3KeySensor` is moved into `amazon` provider in Airflow 3.0;
AIR302_amazon.py:22:1: AIR302 [*] `airflow.sensors.s3_key_sensor.S3KeySensor` is moved into `amazon` provider in Airflow 3.0;
|
37 | S3ToRedshiftTransfer()
38 |
39 | S3KeySensor()
20 | S3FileTransformOperator()
21 | S3ToRedshiftOperator()
22 | S3KeySensor()
| ^^^^^^^^^^^ AIR302
23 |
24 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3KeySensor` from `airflow.providers.amazon.aws.sensors.s3` instead.
Unsafe fix
9 9 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator
10 10 | from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
11 11 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator
12 |-from airflow.sensors.s3_key_sensor import S3KeySensor
12 |+from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
13 13 |
14 14 | S3Hook()
15 15 | provide_bucket_name()
AIR302_amazon.py:26:1: AIR302 [*] `airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Transfer` is moved into `amazon` provider in Airflow 3.0;
|
24 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
25 |
26 | GoogleApiToS3Transfer()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
27 |
28 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `GoogleApiToS3Operator` from `airflow.providers.amazon.aws.transfers.google_api_to_s3` instead.
Unsafe fix
22 22 | S3KeySensor()
23 23 |
24 24 | from airflow.operators.google_api_to_s3_transfer import GoogleApiToS3Transfer
25 |+from airflow.providers.amazon.aws.transfers.google_api_to_s3 import GoogleApiToS3Operator
25 26 |
26 27 | GoogleApiToS3Transfer()
27 28 |
AIR302_amazon.py:30:1: AIR302 [*] `airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer` is moved into `amazon` provider in Airflow 3.0;
|
28 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
29 |
30 | RedshiftToS3Transfer()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
31 |
32 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `RedshiftToS3Operator` from `airflow.providers.amazon.aws.transfers.redshift_to_s3` instead.
Unsafe fix
26 26 | GoogleApiToS3Transfer()
27 27 |
28 28 | from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
29 |+from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator
29 30 |
30 31 | RedshiftToS3Transfer()
31 32 |
AIR302_amazon.py:34:1: AIR302 [*] `airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer` is moved into `amazon` provider in Airflow 3.0;
|
32 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
33 |
34 | S3ToRedshiftTransfer()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-amazon>=1.0.0` and use `S3ToRedshiftOperator` from `airflow.providers.amazon.aws.transfers.s3_to_redshift` instead.
Unsafe fix
30 30 | RedshiftToS3Transfer()
31 31 |
32 32 | from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
33 |+from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator
33 34 |
34 35 | S3ToRedshiftTransfer()
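The AIR302 amazon fixes above replace legacy top-level `airflow.*` imports with their provider-package homes, sometimes under a different class name (e.g. `GoogleApiToS3Transfer` becomes `GoogleApiToS3Operator`). A hedged sketch of how such a fix builds the replacement import line, using a few mappings copied from the diagnostics (the dict and function are illustrative, not ruff internals):

```python
# Legacy path -> amazon-provider path, per the AIR302 diagnostics above.
# Illustrative subset; ruff maintains the full table internally.
AIR302_AMAZON_MOVES = {
    "airflow.hooks.S3_hook.S3Hook":
        "airflow.providers.amazon.aws.hooks.s3.S3Hook",
    "airflow.operators.gcs_to_s3.GCSToS3Operator":
        "airflow.providers.amazon.aws.transfers.gcs_to_s3.GCSToS3Operator",
    "airflow.operators.google_api_to_s3_transfer.GoogleApiToS3Transfer":
        "airflow.providers.amazon.aws.transfers.google_api_to_s3.GoogleApiToS3Operator",
    "airflow.sensors.s3_key_sensor.S3KeySensor":
        "airflow.providers.amazon.aws.sensors.s3.S3KeySensor",
}


def replacement_import(old_path: str) -> str:
    """Build the `from ... import ...` line the fix inserts."""
    new_path = AIR302_AMAZON_MOVES[old_path]
    module, _, name = new_path.rpartition(".")
    return f"from {module} import {name}"
```

Note the fixes are marked unsafe: deleting the old import and rebinding the name can change behavior if the old symbol is re-exported or referenced elsewhere.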


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_celery.py:9:1: AIR302 `airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG` is moved into `celery` provider in Airflow 3.0;
AIR302_celery.py:9:1: AIR302 [*] `airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG` is moved into `celery` provider in Airflow 3.0;
|
7 | )
8 |
@@ -12,7 +12,20 @@ AIR302_celery.py:9:1: AIR302 `airflow.config_templates.default_celery.DEFAULT_CE
|
= help: Install `apache-airflow-providers-celery>=3.3.0` and use `DEFAULT_CELERY_CONFIG` from `airflow.providers.celery.executors.default_celery` instead.
AIR302_celery.py:11:1: AIR302 `airflow.executors.celery_executor.app` is moved into `celery` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 |-from airflow.config_templates.default_celery import DEFAULT_CELERY_CONFIG
4 3 | from airflow.executors.celery_executor import (
5 4 | CeleryExecutor,
6 5 | app,
7 6 | )
7 |+from airflow.providers.celery.executors.default_celery import DEFAULT_CELERY_CONFIG
8 8 |
9 9 | DEFAULT_CELERY_CONFIG
10 10 |
AIR302_celery.py:11:1: AIR302 [*] `airflow.executors.celery_executor.app` is moved into `celery` provider in Airflow 3.0;
|
9 | DEFAULT_CELERY_CONFIG
10 |
@@ -22,10 +35,33 @@ AIR302_celery.py:11:1: AIR302 `airflow.executors.celery_executor.app` is moved i
|
= help: Install `apache-airflow-providers-celery>=3.3.0` and use `app` from `airflow.providers.celery.executors.celery_executor_utils` instead.
AIR302_celery.py:12:1: AIR302 `airflow.executors.celery_executor.CeleryExecutor` is moved into `celery` provider in Airflow 3.0;
Unsafe fix
3 3 | from airflow.config_templates.default_celery import DEFAULT_CELERY_CONFIG
4 4 | from airflow.executors.celery_executor import (
5 5 | CeleryExecutor,
6 |- app,
7 6 | )
7 |+from airflow.providers.celery.executors.celery_executor_utils import app
8 8 |
9 9 | DEFAULT_CELERY_CONFIG
10 10 |
AIR302_celery.py:12:1: AIR302 [*] `airflow.executors.celery_executor.CeleryExecutor` is moved into `celery` provider in Airflow 3.0;
|
11 | app
12 | CeleryExecutor()
| ^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-celery>=3.3.0` and use `CeleryExecutor` from `airflow.providers.celery.executors.celery_executor` instead.
Unsafe fix
2 2 |
3 3 | from airflow.config_templates.default_celery import DEFAULT_CELERY_CONFIG
4 4 | from airflow.executors.celery_executor import (
5 |- CeleryExecutor,
6 5 | app,
7 6 | )
7 |+from airflow.providers.celery.executors.celery_executor import CeleryExecutor
8 8 |
9 9 | DEFAULT_CELERY_CONFIG
10 10 |
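The celery moves above keep every symbol name but relocate each one to a different provider module, so the fix only has to rewrite the import. A small sketch of the mapping, transcribed from the diagnostics (illustrative, not a ruff data structure):

```python
# Celery executor moves exercised by the AIR302 snapshots above.
# Transcribed from the diagnostics; illustrative only.
AIR302_CELERY_MOVES = {
    "airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG":
        "airflow.providers.celery.executors.default_celery.DEFAULT_CELERY_CONFIG",
    "airflow.executors.celery_executor.app":
        "airflow.providers.celery.executors.celery_executor_utils.app",
    "airflow.executors.celery_executor.CeleryExecutor":
        "airflow.providers.celery.executors.celery_executor.CeleryExecutor",
}

# Unlike some amazon moves, the trailing symbol name never changes here.
names_preserved = all(
    old.rpartition(".")[2] == new.rpartition(".")[2]
    for old, new in AIR302_CELERY_MOVES.items()
)
```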


@@ -1,290 +1,639 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_common_sql.py:10:1: AIR302 `airflow.hooks.dbapi.ConnectorProtocol` is moved into `common-sql` provider in Airflow 3.0;
|
8 | from airflow.operators.check_operator import SQLCheckOperator
9 |
10 | ConnectorProtocol()
| ^^^^^^^^^^^^^^^^^ AIR302
11 | DbApiHook()
12 | SQLCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `ConnectorProtocol` from `airflow.providers.common.sql.hooks.sql` instead.
AIR302_common_sql.py:8:1: AIR302 [*] `airflow.hooks.dbapi.ConnectorProtocol` is moved into `common-sql` provider in Airflow 3.0;
|
6 | )
7 |
8 | ConnectorProtocol()
| ^^^^^^^^^^^^^^^^^ AIR302
9 | DbApiHook()
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `ConnectorProtocol` from `airflow.providers.common.sql.hooks.sql` instead.
AIR302_common_sql.py:11:1: AIR302 `airflow.hooks.dbapi_hook.DbApiHook` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.hooks.dbapi import (
4 |- ConnectorProtocol,
5 4 | DbApiHook,
6 5 | )
6 |+from airflow.providers.common.sql.hooks.sql import ConnectorProtocol
7 7 |
8 8 | ConnectorProtocol()
9 9 | DbApiHook()
AIR302_common_sql.py:9:1: AIR302 [*] `airflow.hooks.dbapi.DbApiHook` is moved into `common-sql` provider in Airflow 3.0;
|
10 | ConnectorProtocol()
11 | DbApiHook()
8 | ConnectorProtocol()
9 | DbApiHook()
| ^^^^^^^^^ AIR302
12 | SQLCheckOperator()
10 |
11 | from airflow.hooks.dbapi_hook import DbApiHook
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `DbApiHook` from `airflow.providers.common.sql.hooks.sql` instead.
AIR302_common_sql.py:12:1: AIR302 `airflow.operators.check_operator.SQLCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
2 2 |
3 3 | from airflow.hooks.dbapi import (
4 4 | ConnectorProtocol,
5 |- DbApiHook,
6 5 | )
6 |+from airflow.providers.common.sql.hooks.sql import DbApiHook
7 7 |
8 8 | ConnectorProtocol()
9 9 | DbApiHook()
AIR302_common_sql.py:14:1: AIR302 [*] `airflow.hooks.dbapi_hook.DbApiHook` is moved into `common-sql` provider in Airflow 3.0;
|
10 | ConnectorProtocol()
11 | DbApiHook()
12 | SQLCheckOperator()
12 | from airflow.operators.check_operator import SQLCheckOperator
13 |
14 | DbApiHook()
| ^^^^^^^^^ AIR302
15 | SQLCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `DbApiHook` from `airflow.providers.common.sql.hooks.sql` instead.
Unsafe fix
8 8 | ConnectorProtocol()
9 9 | DbApiHook()
10 10 |
11 |-from airflow.hooks.dbapi_hook import DbApiHook
12 11 | from airflow.operators.check_operator import SQLCheckOperator
12 |+from airflow.providers.common.sql.hooks.sql import DbApiHook
13 13 |
14 14 | DbApiHook()
15 15 | SQLCheckOperator()
AIR302_common_sql.py:15:1: AIR302 [*] `airflow.operators.check_operator.SQLCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
14 | DbApiHook()
15 | SQLCheckOperator()
| ^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:18:1: AIR302 `airflow.operators.sql.SQLCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
9 9 | DbApiHook()
10 10 |
11 11 | from airflow.hooks.dbapi_hook import DbApiHook
12 |-from airflow.operators.check_operator import SQLCheckOperator
12 |+from airflow.providers.common.sql.operators.sql import SQLCheckOperator
13 13 |
14 14 | DbApiHook()
15 15 | SQLCheckOperator()
AIR302_common_sql.py:21:1: AIR302 [*] `airflow.operators.sql.SQLCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
16 | from airflow.operators.sql import SQLCheckOperator
17 |
18 | SQLCheckOperator()
19 | from airflow.operators.sql import SQLCheckOperator
20 |
21 | SQLCheckOperator()
| ^^^^^^^^^^^^^^^^ AIR302
19 | CheckOperator()
22 | CheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:19:1: AIR302 `airflow.operators.check_operator.CheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
16 16 |
17 17 |
18 18 | from airflow.operators.check_operator import CheckOperator
19 |-from airflow.operators.sql import SQLCheckOperator
19 |+from airflow.providers.common.sql.operators.sql import SQLCheckOperator
20 20 |
21 21 | SQLCheckOperator()
22 22 | CheckOperator()
AIR302_common_sql.py:22:1: AIR302 [*] `airflow.operators.check_operator.CheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
18 | SQLCheckOperator()
19 | CheckOperator()
21 | SQLCheckOperator()
22 | CheckOperator()
| ^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:24:1: AIR302 `airflow.operators.druid_check_operator.CheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
17 17 |
18 18 | from airflow.operators.check_operator import CheckOperator
19 19 | from airflow.operators.sql import SQLCheckOperator
20 |+from airflow.providers.common.sql.operators.sql import SQLCheckOperator
20 21 |
21 22 | SQLCheckOperator()
22 23 | CheckOperator()
AIR302_common_sql.py:27:1: AIR302 [*] `airflow.operators.druid_check_operator.CheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
22 | from airflow.operators.druid_check_operator import CheckOperator
23 |
24 | CheckOperator()
25 | from airflow.operators.druid_check_operator import CheckOperator
26 |
27 | CheckOperator()
| ^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:29:1: AIR302 `airflow.operators.presto_check_operator.CheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
23 23 |
24 24 |
25 25 | from airflow.operators.druid_check_operator import CheckOperator
26 |+from airflow.providers.common.sql.operators.sql import SQLCheckOperator
26 27 |
27 28 | CheckOperator()
28 29 |
AIR302_common_sql.py:32:1: AIR302 [*] `airflow.operators.presto_check_operator.CheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
27 | from airflow.operators.presto_check_operator import CheckOperator
28 |
29 | CheckOperator()
30 | from airflow.operators.presto_check_operator import CheckOperator
31 |
32 | CheckOperator()
| ^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:39:1: AIR302 `airflow.operators.druid_check_operator.DruidCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
28 28 |
29 29 |
30 30 | from airflow.operators.presto_check_operator import CheckOperator
31 |+from airflow.providers.common.sql.operators.sql import SQLCheckOperator
31 32 |
32 33 | CheckOperator()
33 34 |
AIR302_common_sql.py:42:1: AIR302 [*] `airflow.operators.druid_check_operator.DruidCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
37 | from airflow.operators.presto_check_operator import PrestoCheckOperator
38 |
39 | DruidCheckOperator()
40 | from airflow.operators.presto_check_operator import PrestoCheckOperator
41 |
42 | DruidCheckOperator()
| ^^^^^^^^^^^^^^^^^^ AIR302
40 | PrestoCheckOperator()
41 | IntervalCheckOperator()
43 | PrestoCheckOperator()
44 | IntervalCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:40:1: AIR302 `airflow.operators.presto_check_operator.PrestoCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
38 38 | )
39 39 | from airflow.operators.druid_check_operator import DruidCheckOperator
40 40 | from airflow.operators.presto_check_operator import PrestoCheckOperator
41 |+from airflow.providers.common.sql.operators.sql import SQLCheckOperator
41 42 |
42 43 | DruidCheckOperator()
43 44 | PrestoCheckOperator()
AIR302_common_sql.py:43:1: AIR302 [*] `airflow.operators.presto_check_operator.PrestoCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
39 | DruidCheckOperator()
40 | PrestoCheckOperator()
42 | DruidCheckOperator()
43 | PrestoCheckOperator()
| ^^^^^^^^^^^^^^^^^^^ AIR302
41 | IntervalCheckOperator()
42 | SQLIntervalCheckOperator()
44 | IntervalCheckOperator()
45 | SQLIntervalCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:41:1: AIR302 `airflow.operators.check_operator.IntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
38 38 | )
39 39 | from airflow.operators.druid_check_operator import DruidCheckOperator
40 40 | from airflow.operators.presto_check_operator import PrestoCheckOperator
41 |+from airflow.providers.common.sql.operators.sql import SQLCheckOperator
41 42 |
42 43 | DruidCheckOperator()
43 44 | PrestoCheckOperator()
AIR302_common_sql.py:44:1: AIR302 [*] `airflow.operators.check_operator.IntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
39 | DruidCheckOperator()
40 | PrestoCheckOperator()
41 | IntervalCheckOperator()
42 | DruidCheckOperator()
43 | PrestoCheckOperator()
44 | IntervalCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
42 | SQLIntervalCheckOperator()
45 | SQLIntervalCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLIntervalCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:42:1: AIR302 `airflow.operators.check_operator.SQLIntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
34 34 |
35 35 | from airflow.operators.check_operator import (
36 36 | IntervalCheckOperator,
37 |- SQLIntervalCheckOperator,
38 37 | )
39 38 | from airflow.operators.druid_check_operator import DruidCheckOperator
40 39 | from airflow.operators.presto_check_operator import PrestoCheckOperator
40 |+from airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
41 41 |
42 42 | DruidCheckOperator()
43 43 | PrestoCheckOperator()
AIR302_common_sql.py:45:1: AIR302 [*] `airflow.operators.check_operator.SQLIntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
40 | PrestoCheckOperator()
41 | IntervalCheckOperator()
42 | SQLIntervalCheckOperator()
43 | PrestoCheckOperator()
44 | IntervalCheckOperator()
45 | SQLIntervalCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLIntervalCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:51:1: AIR302 `airflow.operators.presto_check_operator.IntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
34 34 |
35 35 | from airflow.operators.check_operator import (
36 36 | IntervalCheckOperator,
37 |- SQLIntervalCheckOperator,
38 37 | )
39 38 | from airflow.operators.druid_check_operator import DruidCheckOperator
40 39 | from airflow.operators.presto_check_operator import PrestoCheckOperator
40 |+from airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
41 41 |
42 42 | DruidCheckOperator()
43 43 | PrestoCheckOperator()
AIR302_common_sql.py:54:1: AIR302 [*] `airflow.operators.presto_check_operator.IntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
49 | from airflow.operators.sql import SQLIntervalCheckOperator
50 |
51 | IntervalCheckOperator()
52 | from airflow.operators.sql import SQLIntervalCheckOperator
53 |
54 | IntervalCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
52 | SQLIntervalCheckOperator()
53 | PrestoIntervalCheckOperator()
55 | SQLIntervalCheckOperator()
56 | PrestoIntervalCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLIntervalCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:52:1: AIR302 `airflow.operators.sql.SQLIntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
50 50 | PrestoIntervalCheckOperator,
51 51 | )
52 52 | from airflow.operators.sql import SQLIntervalCheckOperator
53 |+from airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
53 54 |
54 55 | IntervalCheckOperator()
55 56 | SQLIntervalCheckOperator()
AIR302_common_sql.py:55:1: AIR302 [*] `airflow.operators.sql.SQLIntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
51 | IntervalCheckOperator()
52 | SQLIntervalCheckOperator()
54 | IntervalCheckOperator()
55 | SQLIntervalCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
53 | PrestoIntervalCheckOperator()
56 | PrestoIntervalCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLIntervalCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:53:1: AIR302 `airflow.operators.presto_check_operator.PrestoIntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
49 49 | IntervalCheckOperator,
50 50 | PrestoIntervalCheckOperator,
51 51 | )
52 |-from airflow.operators.sql import SQLIntervalCheckOperator
52 |+from airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
53 53 |
54 54 | IntervalCheckOperator()
55 55 | SQLIntervalCheckOperator()
AIR302_common_sql.py:56:1: AIR302 [*] `airflow.operators.presto_check_operator.PrestoIntervalCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
51 | IntervalCheckOperator()
52 | SQLIntervalCheckOperator()
53 | PrestoIntervalCheckOperator()
54 | IntervalCheckOperator()
55 | SQLIntervalCheckOperator()
56 | PrestoIntervalCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLIntervalCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:61:1: AIR302 `airflow.operators.check_operator.SQLThresholdCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
50 50 | PrestoIntervalCheckOperator,
51 51 | )
52 52 | from airflow.operators.sql import SQLIntervalCheckOperator
53 |+from airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
53 54 |
54 55 | IntervalCheckOperator()
55 56 | SQLIntervalCheckOperator()
AIR302_common_sql.py:64:1: AIR302 [*] `airflow.operators.check_operator.SQLThresholdCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-59 | )
-60 |
-61 | SQLThresholdCheckOperator()
+62 | )
+63 |
+64 | SQLThresholdCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
-62 | ThresholdCheckOperator()
+65 | ThresholdCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLThresholdCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:62:1: AIR302 `airflow.operators.check_operator.ThresholdCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
57 57 |
58 58 |
59 59 | from airflow.operators.check_operator import (
60 |- SQLThresholdCheckOperator,
61 60 | ThresholdCheckOperator,
62 61 | )
62 |+from airflow.providers.common.sql.operators.sql import SQLThresholdCheckOperator
63 63 |
64 64 | SQLThresholdCheckOperator()
65 65 | ThresholdCheckOperator()
AIR302_common_sql.py:65:1: AIR302 [*] `airflow.operators.check_operator.ThresholdCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-61 | SQLThresholdCheckOperator()
-62 | ThresholdCheckOperator()
+64 | SQLThresholdCheckOperator()
+65 | ThresholdCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLThresholdCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:67:1: AIR302 `airflow.operators.sql.SQLThresholdCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
57 57 |
58 58 |
59 59 | from airflow.operators.check_operator import (
60 |- SQLThresholdCheckOperator,
61 60 | ThresholdCheckOperator,
62 61 | )
62 |+from airflow.providers.common.sql.operators.sql import SQLThresholdCheckOperator
63 63 |
64 64 | SQLThresholdCheckOperator()
65 65 | ThresholdCheckOperator()
AIR302_common_sql.py:70:1: AIR302 [*] `airflow.operators.sql.SQLThresholdCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-65 | from airflow.operators.sql import SQLThresholdCheckOperator
-66 |
-67 | SQLThresholdCheckOperator()
+68 | from airflow.operators.sql import SQLThresholdCheckOperator
+69 |
+70 | SQLThresholdCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLThresholdCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:75:1: AIR302 `airflow.operators.check_operator.SQLValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
65 65 | ThresholdCheckOperator()
66 66 |
67 67 |
68 |-from airflow.operators.sql import SQLThresholdCheckOperator
68 |+from airflow.providers.common.sql.operators.sql import SQLThresholdCheckOperator
69 69 |
70 70 | SQLThresholdCheckOperator()
71 71 |
AIR302_common_sql.py:78:1: AIR302 [*] `airflow.operators.check_operator.SQLValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-73 | )
-74 |
-75 | SQLValueCheckOperator()
+76 | )
+77 |
+78 | SQLValueCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
-76 | ValueCheckOperator()
+79 | ValueCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLValueCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:76:1: AIR302 `airflow.operators.check_operator.ValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
71 71 |
72 72 |
73 73 | from airflow.operators.check_operator import (
74 |- SQLValueCheckOperator,
75 74 | ValueCheckOperator,
76 75 | )
76 |+from airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
77 77 |
78 78 | SQLValueCheckOperator()
79 79 | ValueCheckOperator()
AIR302_common_sql.py:79:1: AIR302 [*] `airflow.operators.check_operator.ValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-75 | SQLValueCheckOperator()
-76 | ValueCheckOperator()
+78 | SQLValueCheckOperator()
+79 | ValueCheckOperator()
| ^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLValueCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:85:1: AIR302 `airflow.operators.sql.SQLValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
71 71 |
72 72 |
73 73 | from airflow.operators.check_operator import (
74 |- SQLValueCheckOperator,
75 74 | ValueCheckOperator,
76 75 | )
76 |+from airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
77 77 |
78 78 | SQLValueCheckOperator()
79 79 | ValueCheckOperator()
AIR302_common_sql.py:88:1: AIR302 [*] `airflow.operators.sql.SQLValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-83 | from airflow.operators.sql import SQLValueCheckOperator
-84 |
-85 | SQLValueCheckOperator()
+86 | from airflow.operators.sql import SQLValueCheckOperator
+87 |
+88 | SQLValueCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
-86 | ValueCheckOperator()
-87 | PrestoValueCheckOperator()
+89 | ValueCheckOperator()
+90 | PrestoValueCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLValueCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:86:1: AIR302 `airflow.operators.presto_check_operator.ValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
83 83 | PrestoValueCheckOperator,
84 84 | ValueCheckOperator,
85 85 | )
86 |-from airflow.operators.sql import SQLValueCheckOperator
86 |+from airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
87 87 |
88 88 | SQLValueCheckOperator()
89 89 | ValueCheckOperator()
AIR302_common_sql.py:89:1: AIR302 [*] `airflow.operators.presto_check_operator.ValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-85 | SQLValueCheckOperator()
-86 | ValueCheckOperator()
+88 | SQLValueCheckOperator()
+89 | ValueCheckOperator()
| ^^^^^^^^^^^^^^^^^^ AIR302
-87 | PrestoValueCheckOperator()
+90 | PrestoValueCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLValueCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:87:1: AIR302 `airflow.operators.presto_check_operator.PrestoValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
84 84 | ValueCheckOperator,
85 85 | )
86 86 | from airflow.operators.sql import SQLValueCheckOperator
87 |+from airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
87 88 |
88 89 | SQLValueCheckOperator()
89 90 | ValueCheckOperator()
AIR302_common_sql.py:90:1: AIR302 [*] `airflow.operators.presto_check_operator.PrestoValueCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-85 | SQLValueCheckOperator()
-86 | ValueCheckOperator()
-87 | PrestoValueCheckOperator()
+88 | SQLValueCheckOperator()
+89 | ValueCheckOperator()
+90 | PrestoValueCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLValueCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:99:1: AIR302 `airflow.operators.sql.BaseSQLOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
84 84 | ValueCheckOperator,
85 85 | )
86 86 | from airflow.operators.sql import SQLValueCheckOperator
87 |+from airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
87 88 |
88 89 | SQLValueCheckOperator()
89 90 | ValueCheckOperator()
AIR302_common_sql.py:102:1: AIR302 [*] `airflow.operators.sql.BaseSQLOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-97 | )
-98 |
-99 | BaseSQLOperator()
+100 | )
+101 |
+102 | BaseSQLOperator()
| ^^^^^^^^^^^^^^^ AIR302
-100 | BranchSQLOperator()
-101 | SQLTableCheckOperator()
+103 | BranchSQLOperator()
+104 | SQLTableCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `BaseSQLOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:100:1: AIR302 `airflow.operators.sql.BranchSQLOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
91 91 |
92 92 |
93 93 | from airflow.operators.sql import (
94 |- BaseSQLOperator,
95 94 | BranchSQLOperator,
96 95 | SQLColumnCheckOperator,
97 96 | SQLTableCheckOperator,
98 97 | _convert_to_float_if_possible,
99 98 | parse_boolean,
100 99 | )
100 |+from airflow.providers.common.sql.operators.sql import BaseSQLOperator
101 101 |
102 102 | BaseSQLOperator()
103 103 | BranchSQLOperator()
AIR302_common_sql.py:103:1: AIR302 [*] `airflow.operators.sql.BranchSQLOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-99 | BaseSQLOperator()
-100 | BranchSQLOperator()
+102 | BaseSQLOperator()
+103 | BranchSQLOperator()
| ^^^^^^^^^^^^^^^^^ AIR302
-101 | SQLTableCheckOperator()
-102 | SQLColumnCheckOperator()
+104 | SQLTableCheckOperator()
+105 | SQLColumnCheckOperator()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `BranchSQLOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:101:1: AIR302 `airflow.operators.sql.SQLTableCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
92 92 |
93 93 | from airflow.operators.sql import (
94 94 | BaseSQLOperator,
95 |- BranchSQLOperator,
96 95 | SQLColumnCheckOperator,
97 96 | SQLTableCheckOperator,
98 97 | _convert_to_float_if_possible,
99 98 | parse_boolean,
100 99 | )
100 |+from airflow.providers.common.sql.operators.sql import BranchSQLOperator
101 101 |
102 102 | BaseSQLOperator()
103 103 | BranchSQLOperator()
AIR302_common_sql.py:104:1: AIR302 [*] `airflow.operators.sql.SQLTableCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-99 | BaseSQLOperator()
-100 | BranchSQLOperator()
-101 | SQLTableCheckOperator()
+102 | BaseSQLOperator()
+103 | BranchSQLOperator()
+104 | SQLTableCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
-102 | SQLColumnCheckOperator()
-103 | _convert_to_float_if_possible()
+105 | SQLColumnCheckOperator()
+106 | _convert_to_float_if_possible()
|
= help: Install `apache-airflow-providers-common-sql>=1.1.0` and use `SQLTableCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:102:1: AIR302 `airflow.operators.sql.SQLColumnCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
94 94 | BaseSQLOperator,
95 95 | BranchSQLOperator,
96 96 | SQLColumnCheckOperator,
97 |- SQLTableCheckOperator,
98 97 | _convert_to_float_if_possible,
99 98 | parse_boolean,
100 99 | )
100 |+from airflow.providers.common.sql.operators.sql import SQLTableCheckOperator
101 101 |
102 102 | BaseSQLOperator()
103 103 | BranchSQLOperator()
AIR302_common_sql.py:105:1: AIR302 [*] `airflow.operators.sql.SQLColumnCheckOperator` is moved into `common-sql` provider in Airflow 3.0;
|
-100 | BranchSQLOperator()
-101 | SQLTableCheckOperator()
-102 | SQLColumnCheckOperator()
+103 | BranchSQLOperator()
+104 | SQLTableCheckOperator()
+105 | SQLColumnCheckOperator()
| ^^^^^^^^^^^^^^^^^^^^^^ AIR302
-103 | _convert_to_float_if_possible()
-104 | parse_boolean()
+106 | _convert_to_float_if_possible()
+107 | parse_boolean()
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `SQLColumnCheckOperator` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:103:1: AIR302 `airflow.operators.sql._convert_to_float_if_possible` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
93 93 | from airflow.operators.sql import (
94 94 | BaseSQLOperator,
95 95 | BranchSQLOperator,
96 |- SQLColumnCheckOperator,
97 96 | SQLTableCheckOperator,
98 97 | _convert_to_float_if_possible,
99 98 | parse_boolean,
100 99 | )
100 |+from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator
101 101 |
102 102 | BaseSQLOperator()
103 103 | BranchSQLOperator()
AIR302_common_sql.py:106:1: AIR302 [*] `airflow.operators.sql._convert_to_float_if_possible` is moved into `common-sql` provider in Airflow 3.0;
|
-101 | SQLTableCheckOperator()
-102 | SQLColumnCheckOperator()
-103 | _convert_to_float_if_possible()
+104 | SQLTableCheckOperator()
+105 | SQLColumnCheckOperator()
+106 | _convert_to_float_if_possible()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
-104 | parse_boolean()
+107 | parse_boolean()
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `_convert_to_float_if_possible` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:104:1: AIR302 `airflow.operators.sql.parse_boolean` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
95 95 | BranchSQLOperator,
96 96 | SQLColumnCheckOperator,
97 97 | SQLTableCheckOperator,
98 |- _convert_to_float_if_possible,
99 98 | parse_boolean,
100 99 | )
100 |+from airflow.providers.common.sql.operators.sql import _convert_to_float_if_possible
101 101 |
102 102 | BaseSQLOperator()
103 103 | BranchSQLOperator()
AIR302_common_sql.py:107:1: AIR302 [*] `airflow.operators.sql.parse_boolean` is moved into `common-sql` provider in Airflow 3.0;
|
-102 | SQLColumnCheckOperator()
-103 | _convert_to_float_if_possible()
-104 | parse_boolean()
+105 | SQLColumnCheckOperator()
+106 | _convert_to_float_if_possible()
+107 | parse_boolean()
| ^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `parse_boolean` from `airflow.providers.common.sql.operators.sql` instead.
AIR302_common_sql.py:109:1: AIR302 `airflow.sensors.sql.SqlSensor` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
96 96 | SQLColumnCheckOperator,
97 97 | SQLTableCheckOperator,
98 98 | _convert_to_float_if_possible,
99 |- parse_boolean,
100 99 | )
100 |+from airflow.providers.common.sql.operators.sql import parse_boolean
101 101 |
102 102 | BaseSQLOperator()
103 103 | BranchSQLOperator()
AIR302_common_sql.py:112:1: AIR302 [*] `airflow.sensors.sql.SqlSensor` is moved into `common-sql` provider in Airflow 3.0;
|
-107 | from airflow.sensors.sql import SqlSensor
-108 |
-109 | SqlSensor()
+110 | from airflow.sensors.sql import SqlSensor
+111 |
+112 | SqlSensor()
| ^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `SqlSensor` from `airflow.providers.common.sql.sensors.sql` instead.
AIR302_common_sql.py:114:1: AIR302 `airflow.sensors.sql_sensor.SqlSensor` is moved into `common-sql` provider in Airflow 3.0;
Unsafe fix
107 107 | parse_boolean()
108 108 |
109 109 |
110 |-from airflow.sensors.sql import SqlSensor
110 |+from airflow.providers.common.sql.sensors.sql import SqlSensor
111 111 |
112 112 | SqlSensor()
113 113 |
AIR302_common_sql.py:117:1: AIR302 [*] `airflow.sensors.sql_sensor.SqlSensor` is moved into `common-sql` provider in Airflow 3.0;
|
-112 | from airflow.sensors.sql_sensor import SqlSensor
-113 |
-114 | SqlSensor()
+115 | from airflow.sensors.sql_sensor import SqlSensor
+116 |
+117 | SqlSensor()
| ^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-common-sql>=1.0.0` and use `SqlSensor` from `airflow.providers.common.sql.sensors.sql` instead.
Unsafe fix
112 112 | SqlSensor()
113 113 |
114 114 |
115 |-from airflow.sensors.sql_sensor import SqlSensor
115 |+from airflow.providers.common.sql.sensors.sql import SqlSensor
116 116 |
117 117 | SqlSensor()
118 118 |
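Every fix in the snapshot above performs the same mechanical transformation: rewrite a `from <deprecated airflow module> import X` statement to target the corresponding `common-sql` provider module. As an illustrative sketch only (the mapping below is hand-assembled from the help messages above, and ruff's actual implementation is in Rust with per-symbol qualified-name replacement), the rewrite can be modeled as a lookup table over module paths:

```python
# Hand-assembled from the AIR302 help messages above (illustrative only):
# deprecated airflow module -> common-sql provider module.
COMMON_SQL_MOVES = {
    "airflow.operators.sql": "airflow.providers.common.sql.operators.sql",
    "airflow.operators.check_operator": "airflow.providers.common.sql.operators.sql",
    "airflow.sensors.sql": "airflow.providers.common.sql.sensors.sql",
    "airflow.sensors.sql_sensor": "airflow.providers.common.sql.sensors.sql",
}


def rewrite_import(line: str) -> str:
    """Rewrite `from <old> import X` to the provider module, if it moved."""
    parts = line.split()
    if len(parts) >= 4 and parts[0] == "from" and parts[2] == "import":
        new_module = COMMON_SQL_MOVES.get(parts[1])
        if new_module is not None:
            return line.replace(parts[1], new_module, 1)
    return line


print(rewrite_import("from airflow.sensors.sql import SqlSensor"))
# -> from airflow.providers.common.sql.sensors.sql import SqlSensor
```

The fixes are marked unsafe because the rewrite only succeeds at runtime once the `apache-airflow-providers-common-sql` package named in the help text is actually installed.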


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
-AIR302_daskexecutor.py:5:1: AIR302 `airflow.executors.dask_executor.DaskExecutor` is moved into `daskexecutor` provider in Airflow 3.0;
+AIR302_daskexecutor.py:5:1: AIR302 [*] `airflow.executors.dask_executor.DaskExecutor` is moved into `daskexecutor` provider in Airflow 3.0;
|
3 | from airflow.executors.dask_executor import DaskExecutor
4 |
@@ -9,3 +9,11 @@ AIR302_daskexecutor.py:5:1: AIR302 `airflow.executors.dask_executor.DaskExecutor
| ^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-daskexecutor>=1.0.0` and use `DaskExecutor` from `airflow.providers.daskexecutor.executors.dask_executor` instead.
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 |-from airflow.executors.dask_executor import DaskExecutor
3 |+from airflow.providers.daskexecutor.executors.dask_executor import DaskExecutor
4 4 |
5 5 | DaskExecutor()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
-AIR302_druid.py:12:1: AIR302 `airflow.hooks.druid_hook.DruidDbApiHook` is moved into `apache-druid` provider in Airflow 3.0;
+AIR302_druid.py:12:1: AIR302 [*] `airflow.hooks.druid_hook.DruidDbApiHook` is moved into `apache-druid` provider in Airflow 3.0;
|
10 | )
11 |
@@ -11,7 +11,23 @@ AIR302_druid.py:12:1: AIR302 `airflow.hooks.druid_hook.DruidDbApiHook` is moved
|
= help: Install `apache-airflow-providers-apache-druid>=1.0.0` and use `DruidDbApiHook` from `airflow.providers.apache.druid.hooks.druid` instead.
AIR302_druid.py:13:1: AIR302 `airflow.hooks.druid_hook.DruidHook` is moved into `apache-druid` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.hooks.druid_hook import (
4 |- DruidDbApiHook,
5 4 | DruidHook,
6 5 | )
7 6 | from airflow.operators.hive_to_druid import (
8 7 | HiveToDruidOperator,
9 8 | HiveToDruidTransfer,
10 9 | )
10 |+from airflow.providers.apache.druid.hooks.druid import DruidDbApiHook
11 11 |
12 12 | DruidDbApiHook()
13 13 | DruidHook()
AIR302_druid.py:13:1: AIR302 [*] `airflow.hooks.druid_hook.DruidHook` is moved into `apache-druid` provider in Airflow 3.0;
|
12 | DruidDbApiHook()
13 | DruidHook()
@@ -21,7 +37,22 @@ AIR302_druid.py:13:1: AIR302 `airflow.hooks.druid_hook.DruidHook` is moved into
|
= help: Install `apache-airflow-providers-apache-druid>=1.0.0` and use `DruidHook` from `airflow.providers.apache.druid.hooks.druid` instead.
AIR302_druid.py:15:1: AIR302 `airflow.operators.hive_to_druid.HiveToDruidOperator` is moved into `apache-druid` provider in Airflow 3.0;
Unsafe fix
2 2 |
3 3 | from airflow.hooks.druid_hook import (
4 4 | DruidDbApiHook,
5 |- DruidHook,
6 5 | )
7 6 | from airflow.operators.hive_to_druid import (
8 7 | HiveToDruidOperator,
9 8 | HiveToDruidTransfer,
10 9 | )
10 |+from airflow.providers.apache.druid.hooks.druid import DruidHook
11 11 |
12 12 | DruidDbApiHook()
13 13 | DruidHook()
AIR302_druid.py:15:1: AIR302 [*] `airflow.operators.hive_to_druid.HiveToDruidOperator` is moved into `apache-druid` provider in Airflow 3.0;
|
13 | DruidHook()
14 |
@@ -31,10 +62,34 @@ AIR302_druid.py:15:1: AIR302 `airflow.operators.hive_to_druid.HiveToDruidOperato
|
= help: Install `apache-airflow-providers-apache-druid>=1.0.0` and use `HiveToDruidOperator` from `airflow.providers.apache.druid.transfers.hive_to_druid` instead.
AIR302_druid.py:16:1: AIR302 `airflow.operators.hive_to_druid.HiveToDruidTransfer` is moved into `apache-druid` provider in Airflow 3.0;
Unsafe fix
5 5 | DruidHook,
6 6 | )
7 7 | from airflow.operators.hive_to_druid import (
8 |- HiveToDruidOperator,
9 8 | HiveToDruidTransfer,
10 9 | )
10 |+from airflow.providers.apache.druid.transfers.hive_to_druid import HiveToDruidOperator
11 11 |
12 12 | DruidDbApiHook()
13 13 | DruidHook()
AIR302_druid.py:16:1: AIR302 [*] `airflow.operators.hive_to_druid.HiveToDruidTransfer` is moved into `apache-druid` provider in Airflow 3.0;
|
15 | HiveToDruidOperator()
16 | HiveToDruidTransfer()
| ^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-apache-druid>=1.0.0` and use `HiveToDruidOperator` from `airflow.providers.apache.druid.transfers.hive_to_druid` instead.
Unsafe fix
5 5 | DruidHook,
6 6 | )
7 7 | from airflow.operators.hive_to_druid import (
8 |- HiveToDruidOperator,
9 8 | HiveToDruidTransfer,
10 9 | )
10 |+from airflow.providers.apache.druid.transfers.hive_to_druid import HiveToDruidOperator
11 11 |
12 12 | DruidDbApiHook()
13 13 | DruidHook()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
-AIR302_fab.py:10:1: AIR302 `airflow.api.auth.backend.basic_auth.CLIENT_AUTH` is moved into `fab` provider in Airflow 3.0;
+AIR302_fab.py:10:1: AIR302 [*] `airflow.api.auth.backend.basic_auth.CLIENT_AUTH` is moved into `fab` provider in Airflow 3.0;
|
8 | )
9 |
@@ -12,7 +12,21 @@ AIR302_fab.py:10:1: AIR302 `airflow.api.auth.backend.basic_auth.CLIENT_AUTH` is
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `CLIENT_AUTH` from `airflow.providers.fab.auth_manager.api.auth.backend.basic_auth` instead.
AIR302_fab.py:11:1: AIR302 `airflow.api.auth.backend.basic_auth.init_app` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.api.auth.backend.basic_auth import (
4 |- CLIENT_AUTH,
5 4 | auth_current_user,
6 5 | init_app,
7 6 | requires_authentication,
8 7 | )
8 |+from airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import CLIENT_AUTH
9 9 |
10 10 | CLIENT_AUTH
11 11 | init_app()
AIR302_fab.py:11:1: AIR302 [*] `airflow.api.auth.backend.basic_auth.init_app` is moved into `fab` provider in Airflow 3.0;
|
10 | CLIENT_AUTH
11 | init_app()
@@ -22,7 +36,19 @@ AIR302_fab.py:11:1: AIR302 `airflow.api.auth.backend.basic_auth.init_app` is mov
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `init_app` from `airflow.providers.fab.auth_manager.api.auth.backend.basic_auth` instead.
AIR302_fab.py:12:1: AIR302 `airflow.api.auth.backend.basic_auth.auth_current_user` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
3 3 | from airflow.api.auth.backend.basic_auth import (
4 4 | CLIENT_AUTH,
5 5 | auth_current_user,
6 |- init_app,
7 6 | requires_authentication,
8 7 | )
8 |+from airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import init_app
9 9 |
10 10 | CLIENT_AUTH
11 11 | init_app()
AIR302_fab.py:12:1: AIR302 [*] `airflow.api.auth.backend.basic_auth.auth_current_user` is moved into `fab` provider in Airflow 3.0;
|
10 | CLIENT_AUTH
11 | init_app()
@@ -32,7 +58,20 @@ AIR302_fab.py:12:1: AIR302 `airflow.api.auth.backend.basic_auth.auth_current_use
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `auth_current_user` from `airflow.providers.fab.auth_manager.api.auth.backend.basic_auth` instead.
AIR302_fab.py:13:1: AIR302 `airflow.api.auth.backend.basic_auth.requires_authentication` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
2 2 |
3 3 | from airflow.api.auth.backend.basic_auth import (
4 4 | CLIENT_AUTH,
5 |- auth_current_user,
6 5 | init_app,
7 6 | requires_authentication,
8 7 | )
8 |+from airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import auth_current_user
9 9 |
10 10 | CLIENT_AUTH
11 11 | init_app()
AIR302_fab.py:13:1: AIR302 [*] `airflow.api.auth.backend.basic_auth.requires_authentication` is moved into `fab` provider in Airflow 3.0;
|
11 | init_app()
12 | auth_current_user()
@@ -43,7 +82,18 @@ AIR302_fab.py:13:1: AIR302 `airflow.api.auth.backend.basic_auth.requires_authent
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `requires_authentication` from `airflow.providers.fab.auth_manager.api.auth.backend.basic_auth` instead.
AIR302_fab.py:23:1: AIR302 `airflow.api.auth.backend.kerberos_auth.log` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
4 4 | CLIENT_AUTH,
5 5 | auth_current_user,
6 6 | init_app,
7 |- requires_authentication,
8 7 | )
8 |+from airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import requires_authentication
9 9 |
10 10 | CLIENT_AUTH
11 11 | init_app()
AIR302_fab.py:23:1: AIR302 [*] `airflow.api.auth.backend.kerberos_auth.log` is moved into `fab` provider in Airflow 3.0;
|
21 | )
22 |
@@ -54,7 +104,19 @@ AIR302_fab.py:23:1: AIR302 `airflow.api.auth.backend.kerberos_auth.log` is moved
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `log` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:24:1: AIR302 `airflow.api.auth.backend.kerberos_auth.CLIENT_AUTH` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
16 16 | CLIENT_AUTH,
17 17 | find_user,
18 18 | init_app,
19 |- log,
20 19 | requires_authentication,
21 20 | )
21 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import log
22 22 |
23 23 | log()
24 24 | CLIENT_AUTH
AIR302_fab.py:24:1: AIR302 [*] `airflow.api.auth.backend.kerberos_auth.CLIENT_AUTH` is moved into `fab` provider in Airflow 3.0;
|
23 | log()
24 | CLIENT_AUTH
@@ -64,7 +126,22 @@ AIR302_fab.py:24:1: AIR302 `airflow.api.auth.backend.kerberos_auth.CLIENT_AUTH`
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `CLIENT_AUTH` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:25:1: AIR302 `airflow.api.auth.backend.kerberos_auth.find_user` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
13 13 | requires_authentication()
14 14 |
15 15 | from airflow.api.auth.backend.kerberos_auth import (
16 |- CLIENT_AUTH,
17 16 | find_user,
18 17 | init_app,
19 18 | log,
20 19 | requires_authentication,
21 20 | )
21 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import CLIENT_AUTH
22 22 |
23 23 | log()
24 24 | CLIENT_AUTH
AIR302_fab.py:25:1: AIR302 [*] `airflow.api.auth.backend.kerberos_auth.find_user` is moved into `fab` provider in Airflow 3.0;
|
23 | log()
24 | CLIENT_AUTH
@@ -75,7 +152,21 @@ AIR302_fab.py:25:1: AIR302 `airflow.api.auth.backend.kerberos_auth.find_user` is
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `find_user` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:26:1: AIR302 `airflow.api.auth.backend.kerberos_auth.init_app` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
14 14 |
15 15 | from airflow.api.auth.backend.kerberos_auth import (
16 16 | CLIENT_AUTH,
17 |- find_user,
18 17 | init_app,
19 18 | log,
20 19 | requires_authentication,
21 20 | )
21 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import find_user
22 22 |
23 23 | log()
24 24 | CLIENT_AUTH
AIR302_fab.py:26:1: AIR302 [*] `airflow.api.auth.backend.kerberos_auth.init_app` is moved into `fab` provider in Airflow 3.0;
|
24 | CLIENT_AUTH
25 | find_user()
@@ -85,7 +176,20 @@ AIR302_fab.py:26:1: AIR302 `airflow.api.auth.backend.kerberos_auth.init_app` is
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `init_app` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:27:1: AIR302 `airflow.api.auth.backend.kerberos_auth.requires_authentication` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
15 15 | from airflow.api.auth.backend.kerberos_auth import (
16 16 | CLIENT_AUTH,
17 17 | find_user,
18 |- init_app,
19 18 | log,
20 19 | requires_authentication,
21 20 | )
21 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import init_app
22 22 |
23 23 | log()
24 24 | CLIENT_AUTH
AIR302_fab.py:27:1: AIR302 [*] `airflow.api.auth.backend.kerberos_auth.requires_authentication` is moved into `fab` provider in Airflow 3.0;
|
25 | find_user()
26 | init_app()
@@ -96,7 +200,18 @@ AIR302_fab.py:27:1: AIR302 `airflow.api.auth.backend.kerberos_auth.requires_auth
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `requires_authentication` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:37:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.log` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
17 17 | find_user,
18 18 | init_app,
19 19 | log,
20 |- requires_authentication,
21 20 | )
21 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import requires_authentication
22 22 |
23 23 | log()
24 24 | CLIENT_AUTH
AIR302_fab.py:37:1: AIR302 [*] `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.log` is moved into `fab` provider in Airflow 3.0;
|
35 | )
36 |
@@ -107,7 +222,19 @@ AIR302_fab.py:37:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `log` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:38:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.CLIENT_AUTH` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
30 30 | CLIENT_AUTH,
31 31 | find_user,
32 32 | init_app,
33 |- log,
34 33 | requires_authentication,
35 34 | )
35 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import log
36 36 |
37 37 | log()
38 38 | CLIENT_AUTH
AIR302_fab.py:38:1: AIR302 [*] `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.CLIENT_AUTH` is moved into `fab` provider in Airflow 3.0;
|
37 | log()
38 | CLIENT_AUTH
@@ -117,7 +244,22 @@ AIR302_fab.py:38:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `CLIENT_AUTH` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:39:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.find_user` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
27 27 | requires_authentication()
28 28 |
29 29 | from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import (
30 |- CLIENT_AUTH,
31 30 | find_user,
32 31 | init_app,
33 32 | log,
34 33 | requires_authentication,
35 34 | )
35 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import CLIENT_AUTH
36 36 |
37 37 | log()
38 38 | CLIENT_AUTH
AIR302_fab.py:39:1: AIR302 [*] `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.find_user` is moved into `fab` provider in Airflow 3.0;
|
37 | log()
38 | CLIENT_AUTH
@@ -128,7 +270,21 @@ AIR302_fab.py:39:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `find_user` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:40:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.init_app` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
28 28 |
29 29 | from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import (
30 30 | CLIENT_AUTH,
31 |- find_user,
32 31 | init_app,
33 32 | log,
34 33 | requires_authentication,
35 34 | )
35 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import find_user
36 36 |
37 37 | log()
38 38 | CLIENT_AUTH
AIR302_fab.py:40:1: AIR302 [*] `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.init_app` is moved into `fab` provider in Airflow 3.0;
|
38 | CLIENT_AUTH
39 | find_user()
@@ -138,7 +294,20 @@ AIR302_fab.py:40:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `init_app` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:41:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.requires_authentication` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
29 29 | from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import (
30 30 | CLIENT_AUTH,
31 31 | find_user,
32 |- init_app,
33 32 | log,
34 33 | requires_authentication,
35 34 | )
35 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import init_app
36 36 |
37 37 | log()
38 38 | CLIENT_AUTH
AIR302_fab.py:41:1: AIR302 [*] `airflow.auth.managers.fab.api.auth.backend.kerberos_auth.requires_authentication` is moved into `fab` provider in Airflow 3.0;
|
39 | find_user()
40 | init_app()
@@ -149,7 +318,18 @@ AIR302_fab.py:41:1: AIR302 `airflow.auth.managers.fab.api.auth.backend.kerberos_
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `requires_authentication` from `airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth` instead.
AIR302_fab.py:49:1: AIR302 `airflow.auth.managers.fab.fab_auth_manager.FabAuthManager` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
31 31 | find_user,
32 32 | init_app,
33 33 | log,
34 |- requires_authentication,
35 34 | )
35 |+from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import requires_authentication
36 36 |
37 37 | log()
38 38 | CLIENT_AUTH
AIR302_fab.py:49:1: AIR302 [*] `airflow.auth.managers.fab.fab_auth_manager.FabAuthManager` is moved into `fab` provider in Airflow 3.0;
|
47 | )
48 |
@@ -160,7 +340,21 @@ AIR302_fab.py:49:1: AIR302 `airflow.auth.managers.fab.fab_auth_manager.FabAuthMa
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `FabAuthManager` from `airflow.providers.fab.auth_manager.fab_auth_manager` instead.
AIR302_fab.py:50:1: AIR302 `airflow.auth.managers.fab.security_manager.override.MAX_NUM_DATABASE_USER_SESSIONS` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
40 40 | init_app()
41 41 | requires_authentication()
42 42 |
43 |-from airflow.auth.managers.fab.fab_auth_manager import FabAuthManager
44 43 | from airflow.auth.managers.fab.security_manager.override import (
45 44 | MAX_NUM_DATABASE_USER_SESSIONS,
46 45 | FabAirflowSecurityManagerOverride,
47 46 | )
47 |+from airflow.providers.fab.auth_manager.fab_auth_manager import FabAuthManager
48 48 |
49 49 | FabAuthManager()
50 50 | MAX_NUM_DATABASE_USER_SESSIONS
AIR302_fab.py:50:1: AIR302 [*] `airflow.auth.managers.fab.security_manager.override.MAX_NUM_DATABASE_USER_SESSIONS` is moved into `fab` provider in Airflow 3.0;
|
49 | FabAuthManager()
50 | MAX_NUM_DATABASE_USER_SESSIONS
@@ -169,7 +363,19 @@ AIR302_fab.py:50:1: AIR302 `airflow.auth.managers.fab.security_manager.override.
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `MAX_NUM_DATABASE_USER_SESSIONS` from `airflow.providers.fab.auth_manager.security_manager.override` instead.
AIR302_fab.py:51:1: AIR302 `airflow.auth.managers.fab.security_manager.override.FabAirflowSecurityManagerOverride` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
42 42 |
43 43 | from airflow.auth.managers.fab.fab_auth_manager import FabAuthManager
44 44 | from airflow.auth.managers.fab.security_manager.override import (
45 |- MAX_NUM_DATABASE_USER_SESSIONS,
46 45 | FabAirflowSecurityManagerOverride,
47 46 | )
47 |+from airflow.providers.fab.auth_manager.security_manager.override import MAX_NUM_DATABASE_USER_SESSIONS
48 48 |
49 49 | FabAuthManager()
50 50 | MAX_NUM_DATABASE_USER_SESSIONS
AIR302_fab.py:51:1: AIR302 [*] `airflow.auth.managers.fab.security_manager.override.FabAirflowSecurityManagerOverride` is moved into `fab` provider in Airflow 3.0;
|
49 | FabAuthManager()
50 | MAX_NUM_DATABASE_USER_SESSIONS
@@ -180,7 +386,18 @@ AIR302_fab.py:51:1: AIR302 `airflow.auth.managers.fab.security_manager.override.
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `FabAirflowSecurityManagerOverride` from `airflow.providers.fab.auth_manager.security_manager.override` instead.
AIR302_fab.py:55:1: AIR302 `airflow.www.security.FabAirflowSecurityManagerOverride` is moved into `fab` provider in Airflow 3.0;
Unsafe fix
43 43 | from airflow.auth.managers.fab.fab_auth_manager import FabAuthManager
44 44 | from airflow.auth.managers.fab.security_manager.override import (
45 45 | MAX_NUM_DATABASE_USER_SESSIONS,
46 |- FabAirflowSecurityManagerOverride,
47 46 | )
47 |+from airflow.providers.fab.auth_manager.security_manager.override import FabAirflowSecurityManagerOverride
48 48 |
49 49 | FabAuthManager()
50 50 | MAX_NUM_DATABASE_USER_SESSIONS
AIR302_fab.py:55:1: AIR302 [*] `airflow.www.security.FabAirflowSecurityManagerOverride` is moved into `fab` provider in Airflow 3.0;
|
53 | from airflow.www.security import FabAirflowSecurityManagerOverride
54 |
@@ -188,3 +405,12 @@ AIR302_fab.py:55:1: AIR302 `airflow.www.security.FabAirflowSecurityManagerOverri
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-fab>=1.0.0` and use `FabAirflowSecurityManagerOverride` from `airflow.providers.fab.auth_manager.security_manager.override` instead.
Unsafe fix
50 50 | MAX_NUM_DATABASE_USER_SESSIONS
51 51 | FabAirflowSecurityManagerOverride()
52 52 |
53 |-from airflow.www.security import FabAirflowSecurityManagerOverride
53 |+from airflow.providers.fab.auth_manager.security_manager.override import FabAirflowSecurityManagerOverride
54 54 |
55 55 | FabAirflowSecurityManagerOverride()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_hdfs.py:6:1: AIR302 `airflow.hooks.webhdfs_hook.WebHDFSHook` is moved into `apache-hdfs` provider in Airflow 3.0;
AIR302_hdfs.py:6:1: AIR302 [*] `airflow.hooks.webhdfs_hook.WebHDFSHook` is moved into `apache-hdfs` provider in Airflow 3.0;
|
4 | from airflow.sensors.web_hdfs_sensor import WebHdfsSensor
5 |
@@ -11,10 +11,30 @@ AIR302_hdfs.py:6:1: AIR302 `airflow.hooks.webhdfs_hook.WebHDFSHook` is moved int
|
= help: Install `apache-airflow-providers-apache-hdfs>=1.0.0` and use `WebHDFSHook` from `airflow.providers.apache.hdfs.hooks.webhdfs` instead.
AIR302_hdfs.py:7:1: AIR302 `airflow.sensors.web_hdfs_sensor.WebHdfsSensor` is moved into `apache-hdfs` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 |-from airflow.hooks.webhdfs_hook import WebHDFSHook
4 3 | from airflow.sensors.web_hdfs_sensor import WebHdfsSensor
4 |+from airflow.providers.apache.hdfs.hooks.webhdfs import WebHDFSHook
5 5 |
6 6 | WebHDFSHook()
7 7 | WebHdfsSensor()
AIR302_hdfs.py:7:1: AIR302 [*] `airflow.sensors.web_hdfs_sensor.WebHdfsSensor` is moved into `apache-hdfs` provider in Airflow 3.0;
|
6 | WebHDFSHook()
7 | WebHdfsSensor()
| ^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-apache-hdfs>=1.0.0` and use `WebHdfsSensor` from `airflow.providers.apache.hdfs.sensors.web_hdfs` instead.
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.hooks.webhdfs_hook import WebHDFSHook
4 |-from airflow.sensors.web_hdfs_sensor import WebHdfsSensor
4 |+from airflow.providers.apache.hdfs.sensors.web_hdfs import WebHdfsSensor
5 5 |
6 6 | WebHDFSHook()
7 7 | WebHdfsSensor()


@@ -1,208 +1,452 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_hive.py:36:1: AIR302 `airflow.macros.hive.closest_ds_partition` is moved into `apache-hive` provider in Airflow 3.0;
AIR302_hive.py:18:1: AIR302 [*] `airflow.hooks.hive_hooks.HIVE_QUEUE_PRIORITIES` is moved into `apache-hive` provider in Airflow 3.0;
|
34 | from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
35 |
36 | closest_ds_partition()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
37 | max_partition()
|
= help: Install `apache-airflow-providers-apache-hive>=5.1.0` and use `closest_ds_partition` from `airflow.providers.apache.hive.macros.hive` instead.
AIR302_hive.py:37:1: AIR302 `airflow.macros.hive.max_partition` is moved into `apache-hive` provider in Airflow 3.0;
|
36 | closest_ds_partition()
37 | max_partition()
| ^^^^^^^^^^^^^ AIR302
38 |
39 | HiveCliHook()
|
= help: Install `apache-airflow-providers-apache-hive>=5.1.0` and use `max_partition` from `airflow.providers.apache.hive.macros.hive` instead.
AIR302_hive.py:39:1: AIR302 `airflow.hooks.hive_hooks.HiveCliHook` is moved into `apache-hive` provider in Airflow 3.0;
|
37 | max_partition()
38 |
39 | HiveCliHook()
| ^^^^^^^^^^^ AIR302
40 | HiveMetastoreHook()
41 | HiveServer2Hook()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveCliHook` from `airflow.providers.apache.hive.hooks.hive` instead.
AIR302_hive.py:40:1: AIR302 `airflow.hooks.hive_hooks.HiveMetastoreHook` is moved into `apache-hive` provider in Airflow 3.0;
|
39 | HiveCliHook()
40 | HiveMetastoreHook()
| ^^^^^^^^^^^^^^^^^ AIR302
41 | HiveServer2Hook()
42 | HIVE_QUEUE_PRIORITIES
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveMetastoreHook` from `airflow.providers.apache.hive.hooks.hive` instead.
AIR302_hive.py:41:1: AIR302 `airflow.hooks.hive_hooks.HiveServer2Hook` is moved into `apache-hive` provider in Airflow 3.0;
|
39 | HiveCliHook()
40 | HiveMetastoreHook()
41 | HiveServer2Hook()
| ^^^^^^^^^^^^^^^ AIR302
42 | HIVE_QUEUE_PRIORITIES
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveServer2Hook` from `airflow.providers.apache.hive.hooks.hive` instead.
AIR302_hive.py:42:1: AIR302 `airflow.hooks.hive_hooks.HIVE_QUEUE_PRIORITIES` is moved into `apache-hive` provider in Airflow 3.0;
|
40 | HiveMetastoreHook()
41 | HiveServer2Hook()
42 | HIVE_QUEUE_PRIORITIES
16 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
17 |
18 | HIVE_QUEUE_PRIORITIES
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
43 |
44 | HiveOperator()
19 | HiveCliHook()
20 | HiveMetastoreHook()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HIVE_QUEUE_PRIORITIES` from `airflow.providers.apache.hive.hooks.hive` instead.
AIR302_hive.py:44:1: AIR302 `airflow.operators.hive_operator.HiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.hooks.hive_hooks import (
4 |- HIVE_QUEUE_PRIORITIES,
5 4 | HiveCliHook,
6 5 | HiveMetastoreHook,
7 6 | HiveServer2Hook,
--------------------------------------------------------------------------------
14 13 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.hooks.hive import HIVE_QUEUE_PRIORITIES
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:19:1: AIR302 [*] `airflow.hooks.hive_hooks.HiveCliHook` is moved into `apache-hive` provider in Airflow 3.0;
|
42 | HIVE_QUEUE_PRIORITIES
43 |
44 | HiveOperator()
18 | HIVE_QUEUE_PRIORITIES
19 | HiveCliHook()
| ^^^^^^^^^^^ AIR302
20 | HiveMetastoreHook()
21 | HiveServer2Hook()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveCliHook` from `airflow.providers.apache.hive.hooks.hive` instead.
Unsafe fix
2 2 |
3 3 | from airflow.hooks.hive_hooks import (
4 4 | HIVE_QUEUE_PRIORITIES,
5 |- HiveCliHook,
6 5 | HiveMetastoreHook,
7 6 | HiveServer2Hook,
8 7 | )
--------------------------------------------------------------------------------
14 13 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.hooks.hive import HiveCliHook
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:20:1: AIR302 [*] `airflow.hooks.hive_hooks.HiveMetastoreHook` is moved into `apache-hive` provider in Airflow 3.0;
|
18 | HIVE_QUEUE_PRIORITIES
19 | HiveCliHook()
20 | HiveMetastoreHook()
| ^^^^^^^^^^^^^^^^^ AIR302
21 | HiveServer2Hook()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveMetastoreHook` from `airflow.providers.apache.hive.hooks.hive` instead.
Unsafe fix
3 3 | from airflow.hooks.hive_hooks import (
4 4 | HIVE_QUEUE_PRIORITIES,
5 5 | HiveCliHook,
6 |- HiveMetastoreHook,
7 6 | HiveServer2Hook,
8 7 | )
9 8 | from airflow.macros.hive import (
--------------------------------------------------------------------------------
14 13 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.hooks.hive import HiveMetastoreHook
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:21:1: AIR302 [*] `airflow.hooks.hive_hooks.HiveServer2Hook` is moved into `apache-hive` provider in Airflow 3.0;
|
19 | HiveCliHook()
20 | HiveMetastoreHook()
21 | HiveServer2Hook()
| ^^^^^^^^^^^^^^^ AIR302
22 |
23 | closest_ds_partition()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveServer2Hook` from `airflow.providers.apache.hive.hooks.hive` instead.
Unsafe fix
4 4 | HIVE_QUEUE_PRIORITIES,
5 5 | HiveCliHook,
6 6 | HiveMetastoreHook,
7 |- HiveServer2Hook,
8 7 | )
9 8 | from airflow.macros.hive import (
10 9 | closest_ds_partition,
--------------------------------------------------------------------------------
14 13 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.hooks.hive import HiveServer2Hook
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:23:1: AIR302 [*] `airflow.macros.hive.closest_ds_partition` is moved into `apache-hive` provider in Airflow 3.0;
|
21 | HiveServer2Hook()
22 |
23 | closest_ds_partition()
| ^^^^^^^^^^^^^^^^^^^^ AIR302
24 | max_partition()
|
= help: Install `apache-airflow-providers-apache-hive>=5.1.0` and use `closest_ds_partition` from `airflow.providers.apache.hive.macros.hive` instead.
Unsafe fix
7 7 | HiveServer2Hook,
8 8 | )
9 9 | from airflow.macros.hive import (
10 |- closest_ds_partition,
11 10 | max_partition,
12 11 | )
13 12 | from airflow.operators.hive_operator import HiveOperator
14 13 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.macros.hive import closest_ds_partition
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:24:1: AIR302 [*] `airflow.macros.hive.max_partition` is moved into `apache-hive` provider in Airflow 3.0;
|
23 | closest_ds_partition()
24 | max_partition()
| ^^^^^^^^^^^^^ AIR302
25 |
26 | HiveOperator()
|
= help: Install `apache-airflow-providers-apache-hive>=5.1.0` and use `max_partition` from `airflow.providers.apache.hive.macros.hive` instead.
Unsafe fix
8 8 | )
9 9 | from airflow.macros.hive import (
10 10 | closest_ds_partition,
11 |- max_partition,
12 11 | )
13 12 | from airflow.operators.hive_operator import HiveOperator
14 13 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.macros.hive import max_partition
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:26:1: AIR302 [*] `airflow.operators.hive_operator.HiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
24 | max_partition()
25 |
26 | HiveOperator()
| ^^^^^^^^^^^^ AIR302
45 |
46 | HiveStatsCollectionOperator()
27 | HiveStatsCollectionOperator()
28 | HiveToMySqlOperator()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveOperator` from `airflow.providers.apache.hive.operators.hive` instead.
AIR302_hive.py:46:1: AIR302 `airflow.operators.hive_stats_operator.HiveStatsCollectionOperator` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
10 10 | closest_ds_partition,
11 11 | max_partition,
12 12 | )
13 |-from airflow.operators.hive_operator import HiveOperator
14 13 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.operators.hive import HiveOperator
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:27:1: AIR302 [*] `airflow.operators.hive_stats_operator.HiveStatsCollectionOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
44 | HiveOperator()
45 |
46 | HiveStatsCollectionOperator()
26 | HiveOperator()
27 | HiveStatsCollectionOperator()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
47 |
48 | HiveToMySqlOperator()
28 | HiveToMySqlOperator()
29 | HiveToSambaOperator()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveStatsCollectionOperator` from `airflow.providers.apache.hive.operators.hive_stats` instead.
AIR302_hive.py:48:1: AIR302 `airflow.operators.hive_to_mysql.HiveToMySqlOperator` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
11 11 | max_partition,
12 12 | )
13 13 | from airflow.operators.hive_operator import HiveOperator
14 |-from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 14 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.operators.hive_stats import HiveStatsCollectionOperator
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:28:1: AIR302 [*] `airflow.operators.hive_to_mysql.HiveToMySqlOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
46 | HiveStatsCollectionOperator()
47 |
48 | HiveToMySqlOperator()
26 | HiveOperator()
27 | HiveStatsCollectionOperator()
28 | HiveToMySqlOperator()
| ^^^^^^^^^^^^^^^^^^^ AIR302
49 | HiveToMySqlTransfer()
29 | HiveToSambaOperator()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveToMySqlOperator` from `airflow.providers.apache.hive.transfers.hive_to_mysql` instead.
AIR302_hive.py:49:1: AIR302 `airflow.operators.hive_to_mysql.HiveToMySqlTransfer` is moved into `apache-hive` provider in Airflow 3.0;
|
48 | HiveToMySqlOperator()
49 | HiveToMySqlTransfer()
| ^^^^^^^^^^^^^^^^^^^ AIR302
50 |
51 | HiveToSambaOperator()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveToMySqlOperator` from `airflow.providers.apache.hive.transfers.hive_to_mysql` instead.
Unsafe fix
12 12 | )
13 13 | from airflow.operators.hive_operator import HiveOperator
14 14 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 |-from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 15 | from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.transfers.hive_to_mysql import HiveToMySqlOperator
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:51:1: AIR302 `airflow.operators.hive_to_samba_operator.HiveToSambaOperator` is moved into `apache-hive` provider in Airflow 3.0;
AIR302_hive.py:29:1: AIR302 [*] `airflow.operators.hive_to_samba_operator.HiveToSambaOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
49 | HiveToMySqlTransfer()
50 |
51 | HiveToSambaOperator()
27 | HiveStatsCollectionOperator()
28 | HiveToMySqlOperator()
29 | HiveToSambaOperator()
| ^^^^^^^^^^^^^^^^^^^ AIR302
52 |
53 | MsSqlToHiveOperator()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveToSambaOperator` from `airflow.providers.apache.hive.transfers.hive_to_samba` instead.
AIR302_hive.py:53:1: AIR302 `airflow.operators.mssql_to_hive.MsSqlToHiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
51 | HiveToSambaOperator()
52 |
53 | MsSqlToHiveOperator()
| ^^^^^^^^^^^^^^^^^^^ AIR302
54 | MsSqlToHiveTransfer()
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `MsSqlToHiveOperator` from `airflow.providers.apache.hive.transfers.mssql_to_hive` instead.
Unsafe fix
13 13 | from airflow.operators.hive_operator import HiveOperator
14 14 | from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator
15 15 | from airflow.operators.hive_to_mysql import HiveToMySqlOperator
16 |-from airflow.operators.hive_to_samba_operator import HiveToSambaOperator
16 |+from airflow.providers.apache.hive.transfers.hive_to_samba import HiveToSambaOperator
17 17 |
18 18 | HIVE_QUEUE_PRIORITIES
19 19 | HiveCliHook()
AIR302_hive.py:54:1: AIR302 `airflow.operators.mssql_to_hive.MsSqlToHiveTransfer` is moved into `apache-hive` provider in Airflow 3.0;
AIR302_hive.py:34:1: AIR302 [*] `airflow.operators.hive_to_mysql.HiveToMySqlTransfer` is moved into `apache-hive` provider in Airflow 3.0;
|
53 | MsSqlToHiveOperator()
54 | MsSqlToHiveTransfer()
32 | from airflow.operators.hive_to_mysql import HiveToMySqlTransfer
33 |
34 | HiveToMySqlTransfer()
| ^^^^^^^^^^^^^^^^^^^ AIR302
55 |
56 | MySqlToHiveOperator()
35 |
36 | from airflow.operators.mysql_to_hive import MySqlToHiveOperator
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `MsSqlToHiveOperator` from `airflow.providers.apache.hive.transfers.mssql_to_hive` instead.
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HiveToMySqlOperator` from `airflow.providers.apache.hive.transfers.hive_to_mysql` instead.
AIR302_hive.py:56:1: AIR302 `airflow.operators.mysql_to_hive.MySqlToHiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
30 30 |
31 31 |
32 32 | from airflow.operators.hive_to_mysql import HiveToMySqlTransfer
33 |+from airflow.providers.apache.hive.transfers.hive_to_mysql import HiveToMySqlOperator
33 34 |
34 35 | HiveToMySqlTransfer()
35 36 |
AIR302_hive.py:38:1: AIR302 [*] `airflow.operators.mysql_to_hive.MySqlToHiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
54 | MsSqlToHiveTransfer()
55 |
56 | MySqlToHiveOperator()
36 | from airflow.operators.mysql_to_hive import MySqlToHiveOperator
37 |
38 | MySqlToHiveOperator()
| ^^^^^^^^^^^^^^^^^^^ AIR302
57 | MySqlToHiveTransfer()
39 |
40 | from airflow.operators.mysql_to_hive import MySqlToHiveTransfer
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `MySqlToHiveOperator` from `airflow.providers.apache.hive.transfers.mysql_to_hive` instead.
AIR302_hive.py:57:1: AIR302 `airflow.operators.mysql_to_hive.MySqlToHiveTransfer` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
33 33 |
34 34 | HiveToMySqlTransfer()
35 35 |
36 |-from airflow.operators.mysql_to_hive import MySqlToHiveOperator
36 |+from airflow.providers.apache.hive.transfers.mysql_to_hive import MySqlToHiveOperator
37 37 |
38 38 | MySqlToHiveOperator()
39 39 |
AIR302_hive.py:42:1: AIR302 [*] `airflow.operators.mysql_to_hive.MySqlToHiveTransfer` is moved into `apache-hive` provider in Airflow 3.0;
|
56 | MySqlToHiveOperator()
57 | MySqlToHiveTransfer()
40 | from airflow.operators.mysql_to_hive import MySqlToHiveTransfer
41 |
42 | MySqlToHiveTransfer()
| ^^^^^^^^^^^^^^^^^^^ AIR302
58 |
59 | S3ToHiveOperator()
43 |
44 | from airflow.operators.mssql_to_hive import MsSqlToHiveOperator
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `MySqlToHiveOperator` from `airflow.providers.apache.hive.transfers.mysql_to_hive` instead.
AIR302_hive.py:59:1: AIR302 `airflow.operators.s3_to_hive_operator.S3ToHiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
38 38 | MySqlToHiveOperator()
39 39 |
40 40 | from airflow.operators.mysql_to_hive import MySqlToHiveTransfer
41 |+from airflow.providers.apache.hive.transfers.mysql_to_hive import MySqlToHiveOperator
41 42 |
42 43 | MySqlToHiveTransfer()
43 44 |
AIR302_hive.py:46:1: AIR302 [*] `airflow.operators.mssql_to_hive.MsSqlToHiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
57 | MySqlToHiveTransfer()
58 |
59 | S3ToHiveOperator()
44 | from airflow.operators.mssql_to_hive import MsSqlToHiveOperator
45 |
46 | MsSqlToHiveOperator()
| ^^^^^^^^^^^^^^^^^^^ AIR302
47 |
48 | from airflow.operators.mssql_to_hive import MsSqlToHiveTransfer
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `MsSqlToHiveOperator` from `airflow.providers.apache.hive.transfers.mssql_to_hive` instead.
Unsafe fix
41 41 |
42 42 | MySqlToHiveTransfer()
43 43 |
44 |-from airflow.operators.mssql_to_hive import MsSqlToHiveOperator
44 |+from airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator
45 45 |
46 46 | MsSqlToHiveOperator()
47 47 |
AIR302_hive.py:50:1: AIR302 [*] `airflow.operators.mssql_to_hive.MsSqlToHiveTransfer` is moved into `apache-hive` provider in Airflow 3.0;
|
48 | from airflow.operators.mssql_to_hive import MsSqlToHiveTransfer
49 |
50 | MsSqlToHiveTransfer()
| ^^^^^^^^^^^^^^^^^^^ AIR302
51 |
52 | from airflow.operators.s3_to_hive_operator import S3ToHiveOperator
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `MsSqlToHiveOperator` from `airflow.providers.apache.hive.transfers.mssql_to_hive` instead.
Unsafe fix
46 46 | MsSqlToHiveOperator()
47 47 |
48 48 | from airflow.operators.mssql_to_hive import MsSqlToHiveTransfer
49 |+from airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator
49 50 |
50 51 | MsSqlToHiveTransfer()
51 52 |
AIR302_hive.py:54:1: AIR302 [*] `airflow.operators.s3_to_hive_operator.S3ToHiveOperator` is moved into `apache-hive` provider in Airflow 3.0;
|
52 | from airflow.operators.s3_to_hive_operator import S3ToHiveOperator
53 |
54 | S3ToHiveOperator()
| ^^^^^^^^^^^^^^^^ AIR302
60 | S3ToHiveTransfer()
55 |
56 | from airflow.operators.s3_to_hive_operator import S3ToHiveTransfer
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `S3ToHiveOperator` from `airflow.providers.apache.hive.transfers.s3_to_hive` instead.
AIR302_hive.py:60:1: AIR302 `airflow.operators.s3_to_hive_operator.S3ToHiveTransfer` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
49 49 |
50 50 | MsSqlToHiveTransfer()
51 51 |
52 |-from airflow.operators.s3_to_hive_operator import S3ToHiveOperator
52 |+from airflow.providers.apache.hive.transfers.s3_to_hive import S3ToHiveOperator
53 53 |
54 54 | S3ToHiveOperator()
55 55 |
AIR302_hive.py:58:1: AIR302 [*] `airflow.operators.s3_to_hive_operator.S3ToHiveTransfer` is moved into `apache-hive` provider in Airflow 3.0;
|
59 | S3ToHiveOperator()
60 | S3ToHiveTransfer()
56 | from airflow.operators.s3_to_hive_operator import S3ToHiveTransfer
57 |
58 | S3ToHiveTransfer()
| ^^^^^^^^^^^^^^^^ AIR302
61 |
62 | HivePartitionSensor()
59 |
60 | from airflow.sensors.hive_partition_sensor import HivePartitionSensor
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `S3ToHiveOperator` from `airflow.providers.apache.hive.transfers.s3_to_hive` instead.
AIR302_hive.py:62:1: AIR302 `airflow.sensors.hive_partition_sensor.HivePartitionSensor` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
54 54 | S3ToHiveOperator()
55 55 |
56 56 | from airflow.operators.s3_to_hive_operator import S3ToHiveTransfer
57 |+from airflow.providers.apache.hive.transfers.s3_to_hive import S3ToHiveOperator
57 58 |
58 59 | S3ToHiveTransfer()
59 60 |
AIR302_hive.py:62:1: AIR302 [*] `airflow.sensors.hive_partition_sensor.HivePartitionSensor` is moved into `apache-hive` provider in Airflow 3.0;
|
60 | S3ToHiveTransfer()
60 | from airflow.sensors.hive_partition_sensor import HivePartitionSensor
61 |
62 | HivePartitionSensor()
| ^^^^^^^^^^^^^^^^^^^ AIR302
63 |
64 | MetastorePartitionSensor()
64 | from airflow.sensors.metastore_partition_sensor import MetastorePartitionSensor
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `HivePartitionSensor` from `airflow.providers.apache.hive.sensors.hive_partition` instead.
AIR302_hive.py:64:1: AIR302 `airflow.sensors.metastore_partition_sensor.MetastorePartitionSensor` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
57 57 |
58 58 | S3ToHiveTransfer()
59 59 |
60 |-from airflow.sensors.hive_partition_sensor import HivePartitionSensor
60 |+from airflow.providers.apache.hive.sensors.hive_partition import HivePartitionSensor
61 61 |
62 62 | HivePartitionSensor()
63 63 |
AIR302_hive.py:66:1: AIR302 [*] `airflow.sensors.metastore_partition_sensor.MetastorePartitionSensor` is moved into `apache-hive` provider in Airflow 3.0;
|
62 | HivePartitionSensor()
63 |
64 | MetastorePartitionSensor()
| ^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
64 | from airflow.sensors.metastore_partition_sensor import MetastorePartitionSensor
65 |
66 | NamedHivePartitionSensor()
66 | MetastorePartitionSensor()
| ^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
67 |
68 | from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `MetastorePartitionSensor` from `airflow.providers.apache.hive.sensors.metastore_partition` instead.
AIR302_hive.py:66:1: AIR302 `airflow.sensors.named_hive_partition_sensor.NamedHivePartitionSensor` is moved into `apache-hive` provider in Airflow 3.0;
Unsafe fix
61 61 |
62 62 | HivePartitionSensor()
63 63 |
64 |-from airflow.sensors.metastore_partition_sensor import MetastorePartitionSensor
64 |+from airflow.providers.apache.hive.sensors.metastore_partition import MetastorePartitionSensor
65 65 |
66 66 | MetastorePartitionSensor()
67 67 |
AIR302_hive.py:70:1: AIR302 [*] `airflow.sensors.named_hive_partition_sensor.NamedHivePartitionSensor` is moved into `apache-hive` provider in Airflow 3.0;
|
64 | MetastorePartitionSensor()
65 |
66 | NamedHivePartitionSensor()
68 | from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
69 |
70 | NamedHivePartitionSensor()
| ^^^^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-apache-hive>=1.0.0` and use `NamedHivePartitionSensor` from `airflow.providers.apache.hive.sensors.named_hive_partition` instead.
Unsafe fix
65 65 |
66 66 | MetastorePartitionSensor()
67 67 |
68 |-from airflow.sensors.named_hive_partition_sensor import NamedHivePartitionSensor
68 |+from airflow.providers.apache.hive.sensors.named_hive_partition import NamedHivePartitionSensor
69 69 |
70 70 | NamedHivePartitionSensor()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_http.py:7:1: AIR302 `airflow.hooks.http_hook.HttpHook` is moved into `http` provider in Airflow 3.0;
AIR302_http.py:7:1: AIR302 [*] `airflow.hooks.http_hook.HttpHook` is moved into `http` provider in Airflow 3.0;
|
5 | from airflow.sensors.http_sensor import HttpSensor
6 |
@@ -12,6 +12,17 @@ AIR302_http.py:7:1: AIR302 `airflow.hooks.http_hook.HttpHook` is moved into `htt
|
= help: Install `apache-airflow-providers-http>=1.0.0` and use `HttpHook` from `airflow.providers.http.hooks.http` instead.
Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 |-from airflow.hooks.http_hook import HttpHook
4 3 | from airflow.operators.http_operator import SimpleHttpOperator
5 4 | from airflow.sensors.http_sensor import HttpSensor
5 |+from airflow.providers.http.hooks.http import HttpHook
6 6 |
7 7 | HttpHook()
8 8 | SimpleHttpOperator()
AIR302_http.py:8:1: AIR302 [*] `airflow.operators.http_operator.SimpleHttpOperator` is moved into `http` provider in Airflow 3.0;
|
7 | HttpHook()
@@ -32,7 +43,7 @@ AIR302_http.py:8:1: AIR302 [*] `airflow.operators.http_operator.SimpleHttpOperat
9 |+HttpOperator()
9 10 | HttpSensor()
AIR302_http.py:9:1: AIR302 `airflow.sensors.http_sensor.HttpSensor` is moved into `http` provider in Airflow 3.0;
AIR302_http.py:9:1: AIR302 [*] `airflow.sensors.http_sensor.HttpSensor` is moved into `http` provider in Airflow 3.0;
|
7 | HttpHook()
8 | SimpleHttpOperator()
@@ -40,3 +51,13 @@ AIR302_http.py:9:1: AIR302 `airflow.sensors.http_sensor.HttpSensor` is moved int
| ^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-http>=1.0.0` and use `HttpSensor` from `airflow.providers.http.sensors.http` instead.
Unsafe fix
2 2 |
3 3 | from airflow.hooks.http_hook import HttpHook
4 4 | from airflow.operators.http_operator import SimpleHttpOperator
5 |-from airflow.sensors.http_sensor import HttpSensor
5 |+from airflow.providers.http.sensors.http import HttpSensor
6 6 |
7 7 | HttpHook()
8 8 | SimpleHttpOperator()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR302_jdbc.py:8:1: AIR302 `airflow.hooks.jdbc_hook.JdbcHook` is moved into `jdbc` provider in Airflow 3.0;
AIR302_jdbc.py:8:1: AIR302 [*] `airflow.hooks.jdbc_hook.JdbcHook` is moved into `jdbc` provider in Airflow 3.0;
|
6 | )
7 |
@@ -11,10 +11,33 @@ AIR302_jdbc.py:8:1: AIR302 `airflow.hooks.jdbc_hook.JdbcHook` is moved into `jdb
|
= help: Install `apache-airflow-providers-jdbc>=1.0.0` and use `JdbcHook` from `airflow.providers.jdbc.hooks.jdbc` instead.
AIR302_jdbc.py:9:1: AIR302 `airflow.hooks.jdbc_hook.jaydebeapi` is moved into `jdbc` provider in Airflow 3.0;
ℹ Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 3 | from airflow.hooks.jdbc_hook import (
4 |- JdbcHook,
5 4 | jaydebeapi,
6 5 | )
6 |+from airflow.providers.jdbc.hooks.jdbc import JdbcHook
7 7 |
8 8 | JdbcHook()
9 9 | jaydebeapi()
AIR302_jdbc.py:9:1: AIR302 [*] `airflow.hooks.jdbc_hook.jaydebeapi` is moved into `jdbc` provider in Airflow 3.0;
|
8 | JdbcHook()
9 | jaydebeapi()
| ^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-jdbc>=1.0.0` and use `jaydebeapi` from `airflow.providers.jdbc.hooks.jdbc` instead.
ℹ Unsafe fix
2 2 |
3 3 | from airflow.hooks.jdbc_hook import (
4 4 | JdbcHook,
5 |- jaydebeapi,
6 5 | )
6 |+from airflow.providers.jdbc.hooks.jdbc import jaydebeapi
7 7 |
8 8 | JdbcHook()
9 9 | jaydebeapi()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
-AIR302_mysql.py:9:1: AIR302 `airflow.hooks.mysql_hook.MySqlHook` is moved into `mysql` provider in Airflow 3.0;
+AIR302_mysql.py:9:1: AIR302 [*] `airflow.hooks.mysql_hook.MySqlHook` is moved into `mysql` provider in Airflow 3.0;
|
7 | )
8 |
@@ -12,7 +12,20 @@ AIR302_mysql.py:9:1: AIR302 `airflow.hooks.mysql_hook.MySqlHook` is moved into `
|
= help: Install `apache-airflow-providers-mysql>=1.0.0` and use `MySqlHook` from `airflow.providers.mysql.hooks.mysql` instead.
AIR302_mysql.py:10:1: AIR302 `airflow.operators.presto_to_mysql.PrestoToMySqlOperator` is moved into `mysql` provider in Airflow 3.0;
ℹ Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 |-from airflow.hooks.mysql_hook import MySqlHook
4 3 | from airflow.operators.presto_to_mysql import (
5 4 | PrestoToMySqlOperator,
6 5 | PrestoToMySqlTransfer,
7 6 | )
7 |+from airflow.providers.mysql.hooks.mysql import MySqlHook
8 8 |
9 9 | MySqlHook()
10 10 | PrestoToMySqlOperator()
AIR302_mysql.py:10:1: AIR302 [*] `airflow.operators.presto_to_mysql.PrestoToMySqlOperator` is moved into `mysql` provider in Airflow 3.0;
|
9 | MySqlHook()
10 | PrestoToMySqlOperator()
@@ -21,7 +34,19 @@ AIR302_mysql.py:10:1: AIR302 `airflow.operators.presto_to_mysql.PrestoToMySqlOpe
|
= help: Install `apache-airflow-providers-mysql>=1.0.0` and use `PrestoToMySqlOperator` from `airflow.providers.mysql.transfers.presto_to_mysql` instead.
AIR302_mysql.py:11:1: AIR302 `airflow.operators.presto_to_mysql.PrestoToMySqlTransfer` is moved into `mysql` provider in Airflow 3.0;
ℹ Unsafe fix
2 2 |
3 3 | from airflow.hooks.mysql_hook import MySqlHook
4 4 | from airflow.operators.presto_to_mysql import (
5 |- PrestoToMySqlOperator,
6 5 | PrestoToMySqlTransfer,
7 6 | )
7 |+from airflow.providers.mysql.transfers.presto_to_mysql import PrestoToMySqlOperator
8 8 |
9 9 | MySqlHook()
10 10 | PrestoToMySqlOperator()
AIR302_mysql.py:11:1: AIR302 [*] `airflow.operators.presto_to_mysql.PrestoToMySqlTransfer` is moved into `mysql` provider in Airflow 3.0;
|
9 | MySqlHook()
10 | PrestoToMySqlOperator()
@@ -29,3 +54,15 @@ AIR302_mysql.py:11:1: AIR302 `airflow.operators.presto_to_mysql.PrestoToMySqlTra
| ^^^^^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-mysql>=1.0.0` and use `PrestoToMySqlOperator` from `airflow.providers.mysql.transfers.presto_to_mysql` instead.
ℹ Unsafe fix
2 2 |
3 3 | from airflow.hooks.mysql_hook import MySqlHook
4 4 | from airflow.operators.presto_to_mysql import (
5 |- PrestoToMySqlOperator,
6 5 | PrestoToMySqlTransfer,
7 6 | )
7 |+from airflow.providers.mysql.transfers.presto_to_mysql import PrestoToMySqlOperator
8 8 |
9 9 | MySqlHook()
10 10 | PrestoToMySqlOperator()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
-AIR302_oracle.py:5:1: AIR302 `airflow.hooks.oracle_hook.OracleHook` is moved into `oracle` provider in Airflow 3.0;
+AIR302_oracle.py:5:1: AIR302 [*] `airflow.hooks.oracle_hook.OracleHook` is moved into `oracle` provider in Airflow 3.0;
|
3 | from airflow.hooks.oracle_hook import OracleHook
4 |
@@ -9,3 +9,11 @@ AIR302_oracle.py:5:1: AIR302 `airflow.hooks.oracle_hook.OracleHook` is moved int
| ^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-oracle>=1.0.0` and use `OracleHook` from `airflow.providers.oracle.hooks.oracle` instead.
ℹ Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 |-from airflow.hooks.oracle_hook import OracleHook
3 |+from airflow.providers.oracle.hooks.oracle import OracleHook
4 4 |
5 5 | OracleHook()


@@ -1,7 +1,7 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
-AIR302_papermill.py:5:1: AIR302 `airflow.operators.papermill_operator.PapermillOperator` is moved into `papermill` provider in Airflow 3.0;
+AIR302_papermill.py:5:1: AIR302 [*] `airflow.operators.papermill_operator.PapermillOperator` is moved into `papermill` provider in Airflow 3.0;
|
3 | from airflow.operators.papermill_operator import PapermillOperator
4 |
@@ -9,3 +9,11 @@ AIR302_papermill.py:5:1: AIR302 `airflow.operators.papermill_operator.PapermillO
| ^^^^^^^^^^^^^^^^^ AIR302
|
= help: Install `apache-airflow-providers-papermill>=1.0.0` and use `PapermillOperator` from `airflow.providers.papermill.operators.papermill` instead.
ℹ Unsafe fix
1 1 | from __future__ import annotations
2 2 |
3 |-from airflow.operators.papermill_operator import PapermillOperator
3 |+from airflow.providers.papermill.operators.papermill import PapermillOperator
4 4 |
5 5 | PapermillOperator()
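Every fix in the snapshot diffs above follows the same shape: a legacy top-level Airflow 2.x import path is rewritten to its `airflow.providers.*` equivalent. A minimal sketch of that mapping (an illustration only, not ruff's actual implementation; the entries are copied from the diagnostics above):

```python
# Old import path -> new provider-package path, as shown in the AIR302 fixes above.
AIR302_MOVES = {
    "airflow.sensors.http_sensor.HttpSensor": "airflow.providers.http.sensors.http.HttpSensor",
    "airflow.hooks.jdbc_hook.JdbcHook": "airflow.providers.jdbc.hooks.jdbc.JdbcHook",
    "airflow.hooks.mysql_hook.MySqlHook": "airflow.providers.mysql.hooks.mysql.MySqlHook",
    "airflow.hooks.oracle_hook.OracleHook": "airflow.providers.oracle.hooks.oracle.OracleHook",
    "airflow.operators.papermill_operator.PapermillOperator": "airflow.providers.papermill.operators.papermill.PapermillOperator",
}

def rewrite_import(old_path: str) -> str:
    """Build the replacement `from ... import ...` line for a moved symbol."""
    new_path = AIR302_MOVES[old_path]
    module, _, name = new_path.rpartition(".")
    return f"from {module} import {name}"

print(rewrite_import("airflow.hooks.oracle_hook.OracleHook"))
# from airflow.providers.oracle.hooks.oracle import OracleHook
```

Each rewrite also carries the corresponding `apache-airflow-providers-*>=1.0.0` install requirement, which is why ruff marks these fixes as unsafe.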

Some files were not shown because too many files have changed in this diff.