Compare commits


16 Commits

Author SHA1 Message Date
Micha Reiser
371f3f7340 [ty] Remove the scope from TypeInference in release builds 2025-07-19 17:09:12 +02:00
Micha Reiser
98d1811dd1 [ty] Add goto definition to playground (#19425) 2025-07-19 15:44:44 +02:00
Aria Desires
06f9f52e59 [ty] Add support for @warnings.deprecated (#19376)
* [x] basic handling
  * [x] parse and discover `@warnings.deprecated` attributes
  * [x] associate them with function definitions
  * [x] associate them with class definitions
  * [x] add a new "deprecated" diagnostic
* [x] ensure diagnostic is styled appropriately for LSPs
(DiagnosticTag::Deprecated)

* [x] functions
  * [x] fire on calls
  * [x] fire on arbitrary references 
* [x] classes
  * [x] fire on initializers
  * [x] fire on arbitrary references
* [x] methods
  * [x] fire on calls
  * [x] fire on arbitrary references
* [ ] overloads
  * [ ] fire on calls
  * [ ] fire on arbitrary references (??? maybe not ???)
  * [ ] only fire if the actual selected overload is deprecated 

* [ ] dunder desugaring (warn on deprecated `__add__` if `+` is invoked)
* [ ] alias suppression? (don't warn on uses of variables that deprecated items were assigned to)

* [ ] import logic
  * [x] fire on imports of deprecated items
* [ ] suppress subsequent diagnostics if the import diagnostic fired (is this handled by alias suppression?)
  * [x] fire on all qualified references (`module.mydeprecated`)
  * [x] fire on all references that depend on a `*` import
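The runtime side of the checklist above can be sketched with a minimal decorator (an illustration of the PEP 702 semantics, not ty's implementation; `warnings.deprecated` itself ships with Python 3.13, and type checkers discover the marker statically rather than at runtime):

```python
import warnings

# Minimal sketch of @warnings.deprecated (PEP 702): mark a callable and warn
# when it is called. A checker fires its "deprecated" diagnostic on references.
def deprecated(message):
    def decorator(func):
        def wrapper(*args, **kwargs):
            warnings.warn(message, DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        wrapper.__deprecated__ = message  # the attribute checkers look for
        return wrapper
    return decorator

@deprecated("use new_api() instead")
def old_api():
    return 42

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = old_api()

assert result == 42
assert caught[0].category is DeprecationWarning
```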
    


Fixes https://github.com/astral-sh/ty/issues/153
2025-07-18 23:50:29 +00:00
Jack O'Connor
e9a64e5825 [ty] make `del x` force local resolution of `x` in the current scope (#19389)
Fixes https://github.com/astral-sh/ty/issues/769.

**Updated:** The preferred approach here is to keep the SemanticIndex
simple (`del` of any name marks that name "bound" in the current scope)
and to move complexity to type inference (free variable resolution stops
when it finds a binding, unless that binding is declared `nonlocal`). As
part of this change, free variable resolution will now union the types
it finds as it walks in enclosing scopes. This approach is still
incomplete, because it doesn't consider inner scopes or sibling scopes,
but it improves the common case.
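The runtime already treats a `del`-ed name as bound in its scope, which is the behavior the SemanticIndex now models:

```python
x = "global"

def without_del():
    return x  # no local binding of x in this scope: resolves to the global

def with_del():
    # The `del x` below marks `x` as local for the WHOLE scope,
    # so this read no longer falls through to the global binding.
    try:
        value = x
    except UnboundLocalError:
        value = "unbound local"
    try:
        del x
    except UnboundLocalError:
        pass  # nothing was ever bound, so the del itself also fails
    return value

assert without_del() == "global"
assert with_del() == "unbound local"
```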

---------

Co-authored-by: Carl Meyer <carl@astral.sh>
2025-07-18 14:58:32 -07:00
UnboundVariable
360eb7005f [ty] Added support for "go to definition" for attribute accesses and keyword arguments (#19417)
This PR builds upon #19371. It addresses a few additional code review
suggestions and adds support for attribute accesses (expressions of the
form `x.y`) and keyword arguments within call expressions.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-18 11:33:57 -07:00
Micha Reiser
630c7a3152 [ty] Reduce number of inline stored definitions per place (#19409) 2025-07-18 18:28:46 +02:00
Ibraheem Ahmed
e6e029a8b7 Update salsa (#19258)
## Summary

Pulls in https://github.com/salsa-rs/salsa/pull/934.
2025-07-18 12:14:28 -04:00
Andrew Gallant
64f9481fd0 [ty] Add caching for submodule completion suggestions (#19408)
This change makes it so we aren't doing a directory traversal every time
we ask for completions from a module. Specifically, submodules that
aren't attributes of their parent module can only be discovered by
looking at the directory tree. But we want to avoid doing a directory
scan unless we think there are changes.

To make this work, this change does a little bit of surgery to
`FileRoot`. Previously, a `FileRoot` was only used for library search
paths. Its revision was bumped whenever a file in that tree was added,
deleted or even modified (to support the discovery of `.pth` files and
changes to their contents). This generally seems fine since these are
presumably dependency paths that shouldn't change frequently.

In this change, we add a `FileRoot` for the project. But having the
`FileRoot`'s revision bumped for every change in the project makes
caching based on that `FileRoot` rather ineffective. That is, cache
invalidation would occur so aggressively that there would be little
point in adding caching in the first place. To mitigate this, a
`FileRoot`'s revision is only bumped on a change to a child file's
contents when the `FileRoot` is a `LibrarySearchPath`. Otherwise, we
only bump the revision when a file is created or deleted.

The effect is that, at least in VS Code, when a new module is added or
removed, this change is picked up and the cache is properly invalidated.
Other LSP clients with worse support for file watching (which seems to
be the case for the CoC vim plugin that I use) don't work as well. Here,
the cache is less likely to be invalidated which might cause completions
to have stale results. Unless there's an obvious way to fix or improve
this, I propose punting on improvements here for now.
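The invalidation policy described above can be sketched as follows (illustrative names, not ty's actual types):

```python
from dataclasses import dataclass
from enum import Enum, auto

class RootKind(Enum):
    PROJECT = auto()
    LIBRARY_SEARCH_PATH = auto()

@dataclass
class FileRoot:
    kind: RootKind
    revision: int = 0

    def on_file_created_or_deleted(self) -> None:
        # Structural changes always invalidate: the set of submodules may differ.
        self.revision += 1

    def on_file_contents_changed(self) -> None:
        # Content edits only bump library roots (e.g. for `.pth` discovery);
        # bumping the project root on every keystroke would defeat the cache.
        if self.kind is RootKind.LIBRARY_SEARCH_PATH:
            self.revision += 1

project = FileRoot(RootKind.PROJECT)
library = FileRoot(RootKind.LIBRARY_SEARCH_PATH)
project.on_file_contents_changed()    # no bump: completion cache stays valid
library.on_file_contents_changed()    # bump
project.on_file_created_or_deleted()  # bump: a new module must appear
assert (project.revision, library.revision) == (1, 1)
```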
2025-07-18 11:54:27 -04:00
Dhruv Manilawala
99d0ac60b4 [ty] Track open files in the server (#19264)
## Summary

This PR updates the server to keep track of open files, both system and
virtual.

This is done by adding the file to the project's set of open files in
the `didOpen` notification and removing it in the `didClose`
notification.

This does mean that, for workspace diagnostics, ty will only check open
files, because the diagnostic builders first check `is_file_open` and
only add diagnostics for open files. So, this required replacing the
`is_file_open` model with a `should_check_file` model, which decides
whether the file needs to be checked based on the `CheckMode`. If the
check mode is open-files-only, it checks whether the file is open; if
it is all files, it returns `true` by default.
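A minimal sketch of that dispatch (illustrative, not ty's actual Rust code):

```python
from enum import Enum, auto

class CheckMode(Enum):
    OPEN_FILES = auto()
    ALL_FILES = auto()

def should_check_file(mode: CheckMode, open_files: set[str], path: str) -> bool:
    # Open-files mode only checks files the client has opened;
    # all-files mode checks everything unconditionally.
    if mode is CheckMode.OPEN_FILES:
        return path in open_files
    return True

open_files = {"diagnostics.py"}
assert should_check_file(CheckMode.OPEN_FILES, open_files, "diagnostics.py")
assert not should_check_file(CheckMode.OPEN_FILES, open_files, "__init__.py")
assert should_check_file(CheckMode.ALL_FILES, open_files, "__init__.py")
```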

Closes: astral-sh/ty#619

## Test Plan

### Before

There are two files in the project: `__init__.py` and `diagnostics.py`.

In the video, I'm demonstrating the old behavior where making changes to
the (open) `diagnostics.py` file results in re-parsing the file:


https://github.com/user-attachments/assets/c2ac0ecd-9c77-42af-a924-c3744b146045

### After

Same setup as above.

In the video, I'm demonstrating the new behavior where making changes to
the (open) `diagnostics.py` file doesn't result in re-parsing the file:


https://github.com/user-attachments/assets/7b82fe92-f330-44c7-b527-c841c4545f8f
2025-07-18 19:33:35 +05:30
Andrew Gallant
ba7ed3a6f9 [ty] Use `…` as the "cut" indicator in diagnostic rendering (#19420)
This makes ty match ruff's behavior. Specifically, we want to use `…`
instead of the default `...` because `...` has special significance in
Python.
2025-07-18 07:46:48 -04:00
justin
39b41838f3 [ty] synthesize __setattr__ for frozen dataclasses (#19307)
## Summary

Synthesize a `__setattr__` method with a return type of `Never` for
frozen dataclasses.

https://docs.python.org/3/library/dataclasses.html#frozen-instances

https://docs.python.org/3/library/dataclasses.html#dataclasses.FrozenInstanceError
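At runtime the assignment already fails, which is exactly what a synthesized `__setattr__` returning `Never` models statically:

```python
from dataclasses import FrozenInstanceError, dataclass

@dataclass(frozen=True)
class Point:
    x: int
    y: int

p = Point(1, 2)
try:
    p.x = 10  # a __setattr__ -> Never lets the checker flag this statically
    raised = False
except FrozenInstanceError:
    raised = True

assert raised
assert (p.x, p.y) == (1, 2)  # the instance is unchanged
```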

### Related
https://github.com/astral-sh/ty/issues/111
https://github.com/astral-sh/ruff/pull/17974#discussion_r2108527106
https://github.com/astral-sh/ruff/pull/18347#discussion_r2128174665

## Test Plan

New Markdown tests

---------

Co-authored-by: David Peter <mail@david-peter.de>
2025-07-18 11:35:05 +02:00
UnboundVariable
c7640a433e [ty] Fixed bug in semantic token provider for parameters. (#19418)
This fixes https://github.com/astral-sh/ty/issues/832.

New tests were added to prevent future regressions.

---------

Co-authored-by: UnboundVariable <unbound@gmail.com>
2025-07-18 00:02:23 -07:00
Micha Reiser
1765014be3 [ty] Shrink reachability constraints (#19410) 2025-07-18 07:36:18 +02:00
Brent Westbrook
997dc2e7cc Move JUnit rendering to ruff_db (#19370)
Summary
--

This PR moves the JUnit output format to the new rendering
infrastructure. As I
mention in a TODO in the code, there's some code that will be shared
with the
`grouped` output format. Hopefully I'll have that PR up too by the time
this one
is reviewed.

Test Plan
--

Existing tests moved to `ruff_db`

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-17 18:24:13 -04:00
Douglas Creager
4aee0398cb [ty] Show the raw argument type in reveal_type (#19400)
This PR changes how `reveal_type` determines which type to reveal, in
a way that should be a no-op for most callers.

Previously, we would reveal the type of the first parameter, _after_ all
of the call binding machinery had done its work. This includes inferring
the specialization of a generic function, and then applying that
specialization to all parameter and argument types, which is relevant
since the typeshed definition of `reveal_type` is generic:

```pyi
def reveal_type(obj: _T, /) -> _T: ...
```

Normally this does not matter, since we infer `_T = [arg type]` and
apply that to the parameter type, yielding `[arg type]`. But applying
that specialization also simplifies the argument type, which makes
`reveal_type` less useful as a debugging aid when we want to see the
actual, raw, unsimplified argument type.

With this patch, we now grab the original unmodified argument type and
reveal that instead.

In addition to making the debugging aid example work, this also makes
our `reveal_type` implementation more robust to custom typeshed
definitions, such as

```py
def reveal_type(obj: Any) -> Any: ...
```

(That custom definition is probably not what anyone would want, since
you wouldn't be able to depend on the return type being equivalent to
the argument type, but still.)
2025-07-17 16:50:29 -04:00
Brent Westbrook
1fd9103e81 Canonicalize path before filtering (#19407)
## Summary

This came up on
[Discord](https://discord.com/channels/1039017663004942429/1343692072921731082/1395447082520678440)
and also in #19387, but on macOS the tmp directory is a symlink to
`/private/tmp`, which breaks this filter. I'm still not quite sure why
only these tests are affected when we use the `tempdir_filter`
elsewhere, but hopefully this fixes the immediate issue. Just
`tempdir.path().canonicalize()` also worked, but I used `dunce` since
that's what I saw in other tests (I guess it's not _just_ these tests).

Some related links from uv:
-
1b2f212e8b/crates/uv/tests/it/common/mod.rs (L1161-L1178)
-
1b2f212e8b/crates/uv/tests/it/common/mod.rs (L424-L438)
- https://github.com/astral-sh/uv/pull/14290

Thanks to @zanieb for those!

## Test Plan

I tested the `main` branch on my MacBook and reproduced the test
failure, then confirmed that the tests pass after the change. Now to
make sure it passes on Windows, which caused most of the trouble in the
first PR!
2025-07-17 14:02:17 -04:00
77 changed files with 3214 additions and 904 deletions

Cargo.lock generated
View File

@@ -1557,6 +1557,15 @@ dependencies = [
"memoffset",
]
[[package]]
name = "inventory"
version = "0.3.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab08d7cd2c5897f2c949e5383ea7c7db03fb19130ffcfbf7eda795137ae3cb83"
dependencies = [
"rustversion",
]
[[package]]
name = "is-docker"
version = "0.2.0"
@@ -2123,16 +2132,6 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]]
name = "papaya"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f92dd0b07c53a0a0c764db2ace8c541dc47320dad97c2200c2a637ab9dd2328f"
dependencies = [
"equivalent",
"seize",
]
[[package]]
name = "parking_lot"
version = "0.12.3"
@@ -2839,6 +2838,7 @@ dependencies = [
"insta",
"matchit",
"path-slash",
"quick-junit",
"ruff_annotate_snippets",
"ruff_cache",
"ruff_diagnostics",
@@ -2987,7 +2987,6 @@ dependencies = [
"pathdiff",
"pep440_rs",
"pyproject-toml",
"quick-junit",
"regex",
"ruff_annotate_snippets",
"ruff_cache",
@@ -3409,7 +3408,7 @@ checksum = "28d3b2b1366ec20994f1fd18c3c594f05c5dd4bc44d8bb0c1c632c8d6829481f"
[[package]]
name = "salsa"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
source = "git+https://github.com/salsa-rs/salsa?rev=dba66f1a37acca014c2402f231ed5b361bd7d8fe#dba66f1a37acca014c2402f231ed5b361bd7d8fe"
dependencies = [
"boxcar",
"compact_str",
@@ -3419,7 +3418,7 @@ dependencies = [
"hashlink",
"indexmap",
"intrusive-collections",
"papaya",
"inventory",
"parking_lot",
"portable-atomic",
"rayon",
@@ -3434,12 +3433,12 @@ dependencies = [
[[package]]
name = "salsa-macro-rules"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
source = "git+https://github.com/salsa-rs/salsa?rev=dba66f1a37acca014c2402f231ed5b361bd7d8fe#dba66f1a37acca014c2402f231ed5b361bd7d8fe"
[[package]]
name = "salsa-macros"
version = "0.23.0"
source = "git+https://github.com/salsa-rs/salsa?rev=fc00eba89e5dcaa5edba51c41aa5f309b5cb126b#fc00eba89e5dcaa5edba51c41aa5f309b5cb126b"
source = "git+https://github.com/salsa-rs/salsa?rev=dba66f1a37acca014c2402f231ed5b361bd7d8fe#dba66f1a37acca014c2402f231ed5b361bd7d8fe"
dependencies = [
"proc-macro2",
"quote",
@@ -3492,16 +3491,6 @@ version = "4.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1c107b6f4780854c8b126e228ea8869f4d7b71260f962fefb57b996b8959ba6b"
[[package]]
name = "seize"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e4b8d813387d566f627f3ea1b914c068aac94c40ae27ec43f5f33bde65abefe7"
dependencies = [
"libc",
"windows-sys 0.52.0",
]
[[package]]
name = "serde"
version = "1.0.219"

View File

@@ -138,7 +138,7 @@ regex-automata = { version = "0.4.9" }
rustc-hash = { version = "2.0.0" }
rustc-stable-hash = { version = "0.1.2" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "fc00eba89e5dcaa5edba51c41aa5f309b5cb126b" }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "dba66f1a37acca014c2402f231ed5b361bd7d8fe" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }
@@ -150,7 +150,7 @@ serde_with = { version = "3.6.0", default-features = false, features = [
] }
shellexpand = { version = "3.0.0" }
similar = { version = "2.4.0", features = ["inline"] }
smallvec = { version = "1.13.2" }
smallvec = { version = "1.13.2", features = ["union", "const_generics", "const_new"] }
snapbox = { version = "0.6.0", features = [
"diff",
"term-svg",

View File

@@ -15,8 +15,8 @@ use ruff_db::diagnostic::{
use ruff_linter::fs::relativize_path;
use ruff_linter::logging::LogLevel;
use ruff_linter::message::{
Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter, JunitEmitter,
SarifEmitter, TextEmitter,
Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter, SarifEmitter,
TextEmitter,
};
use ruff_linter::notify_user;
use ruff_linter::settings::flags::{self};
@@ -252,7 +252,11 @@ impl Printer {
write!(writer, "{value}")?;
}
OutputFormat::Junit => {
JunitEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Junit)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Concise | OutputFormat::Full => {
TextEmitter::default()

View File

@@ -5718,8 +5718,11 @@ match 42: # invalid-syntax
let snapshot = format!("output_format_{output_format}");
let project_dir = dunce::canonicalize(tempdir.path())?;
insta::with_settings!({
filters => vec![
(tempdir_filter(&project_dir).as_str(), "[TMP]/"),
(tempdir_filter(&tempdir).as_str(), "[TMP]/"),
(r#""[^"]+\\?/?input.py"#, r#""[TMP]/input.py"#),
(ruff_linter::VERSION, "[VERSION]"),

View File

@@ -25,7 +25,7 @@ exit_code: 1
<testcase name="org.ruff.F821" classname="[TMP]/input" line="2" column="5">
<failure message="Undefined name `y`">line 2, col 5, Undefined name `y`</failure>
</testcase>
<testcase name="org.ruff" classname="[TMP]/input" line="3" column="1">
<testcase name="org.ruff.invalid-syntax" classname="[TMP]/input" line="3" column="1">
<failure message="SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)">line 3, col 1, SyntaxError: Cannot use `match` statement on Python 3.9 (syntax was added in Python 3.10)</failure>
</testcase>
</testsuite>

View File

@@ -18,7 +18,7 @@ use ruff_python_ast::PythonVersion;
use ty_project::metadata::options::{EnvironmentOptions, Options};
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
use ty_project::watch::{ChangeEvent, ChangedKind};
use ty_project::{Db, ProjectDatabase, ProjectMetadata};
use ty_project::{CheckMode, Db, ProjectDatabase, ProjectMetadata};
struct Case {
db: ProjectDatabase,
@@ -102,6 +102,7 @@ fn setup_tomllib_case() -> Case {
let re = re.unwrap();
db.set_check_mode(CheckMode::OpenFiles);
db.project().set_open_files(&mut db, tomllib_files);
let re_path = re.path(&db).as_system_path().unwrap().to_owned();
@@ -237,6 +238,7 @@ fn setup_micro_case(code: &str) -> Case {
let mut db = ProjectDatabase::new(metadata, system).unwrap();
let file = system_path_to_file(&db, SystemPathBuf::from(file_path)).unwrap();
db.set_check_mode(CheckMode::OpenFiles);
db.project()
.set_open_files(&mut db, FxHashSet::from_iter([file]));
@@ -525,14 +527,21 @@ impl<'a> ProjectBenchmark<'a> {
#[track_caller]
fn bench_project(benchmark: &ProjectBenchmark, criterion: &mut Criterion) {
fn check_project(db: &mut ProjectDatabase, max_diagnostics: usize) {
fn check_project(db: &mut ProjectDatabase, project_name: &str, max_diagnostics: usize) {
let result = db.check();
let diagnostics = result.len();
assert!(
diagnostics <= max_diagnostics,
"Expected <={max_diagnostics} diagnostics but got {diagnostics}"
);
if diagnostics > max_diagnostics {
let details = result
.into_iter()
.map(|diagnostic| diagnostic.concise_message().to_string())
.collect::<Vec<_>>()
.join("\n ");
assert!(
diagnostics <= max_diagnostics,
"{project_name}: Expected <={max_diagnostics} diagnostics but got {diagnostics}:\n {details}",
);
}
}
setup_rayon();
@@ -542,7 +551,7 @@ fn bench_project(benchmark: &ProjectBenchmark, criterion: &mut Criterion) {
group.bench_function(benchmark.project.config.name, |b| {
b.iter_batched_ref(
|| benchmark.setup_iteration(),
|db| check_project(db, benchmark.max_diagnostics),
|db| check_project(db, benchmark.project.config.name, benchmark.max_diagnostics),
BatchSize::SmallInput,
);
});
@@ -610,7 +619,7 @@ fn datetype(criterion: &mut Criterion) {
max_dep_date: "2025-07-04",
python_version: PythonVersion::PY313,
},
0,
2,
);
bench_project(&benchmark, criterion);

View File

@@ -34,6 +34,7 @@ glob = { workspace = true }
ignore = { workspace = true, optional = true }
matchit = { workspace = true }
path-slash = { workspace = true }
quick-junit = { workspace = true, optional = true }
rustc-hash = { workspace = true }
salsa = { workspace = true }
schemars = { workspace = true, optional = true }
@@ -56,6 +57,7 @@ tempfile = { workspace = true }
[features]
cache = ["ruff_cache"]
junit = ["dep:quick-junit"]
os = ["ignore", "dep:etcetera"]
serde = ["camino/serde1", "dep:serde", "dep:serde_json", "ruff_diagnostics/serde"]
# Exposes testing utilities.

View File

@@ -1282,6 +1282,9 @@ pub enum DiagnosticFormat {
Rdjson,
/// Print diagnostics in the format emitted by Pylint.
Pylint,
/// Print diagnostics in the format expected by JUnit.
#[cfg(feature = "junit")]
Junit,
}
/// A representation of the kinds of messages inside a diagnostic.

View File

@@ -30,6 +30,8 @@ mod azure;
mod json;
#[cfg(feature = "serde")]
mod json_lines;
#[cfg(feature = "junit")]
mod junit;
mod pylint;
#[cfg(feature = "serde")]
mod rdjson;
@@ -156,7 +158,8 @@ impl std::fmt::Display for DisplayDiagnostics<'_> {
AnnotateRenderer::styled()
} else {
AnnotateRenderer::plain()
};
}
.cut_indicator("…");
renderer = renderer
.error(stylesheet.error)
@@ -196,6 +199,10 @@ impl std::fmt::Display for DisplayDiagnostics<'_> {
DiagnosticFormat::Pylint => {
PylintRenderer::new(self.resolver).render(f, self.diagnostics)?;
}
#[cfg(feature = "junit")]
DiagnosticFormat::Junit => {
junit::JunitRenderer::new(self.resolver).render(f, self.diagnostics)?;
}
}
Ok(())

View File

@@ -0,0 +1,195 @@
use std::{collections::BTreeMap, ops::Deref, path::Path};
use quick_junit::{NonSuccessKind, Report, TestCase, TestCaseStatus, TestSuite, XmlString};
use ruff_source_file::LineColumn;
use crate::diagnostic::{Diagnostic, SecondaryCode, render::FileResolver};
/// A renderer for diagnostics in the [JUnit] format.
///
/// See [`junit.xsd`] for the specification in the JUnit repository and an annotated [version]
/// linked from the [`quick_junit`] docs.
///
/// [JUnit]: https://junit.org/
/// [`junit.xsd`]: https://github.com/junit-team/junit-framework/blob/2870b7d8fd5bf7c1efe489d3991d3ed3900e82bb/platform-tests/src/test/resources/jenkins-junit.xsd
/// [version]: https://llg.cubic.org/docs/junit/
/// [`quick_junit`]: https://docs.rs/quick-junit/latest/quick_junit/
pub struct JunitRenderer<'a> {
resolver: &'a dyn FileResolver,
}
impl<'a> JunitRenderer<'a> {
pub fn new(resolver: &'a dyn FileResolver) -> Self {
Self { resolver }
}
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
let mut report = Report::new("ruff");
if diagnostics.is_empty() {
let mut test_suite = TestSuite::new("ruff");
test_suite
.extra
.insert(XmlString::new("package"), XmlString::new("org.ruff"));
let mut case = TestCase::new("No errors found", TestCaseStatus::success());
case.set_classname("ruff");
test_suite.add_test_case(case);
report.add_test_suite(test_suite);
} else {
for (filename, diagnostics) in group_diagnostics_by_filename(diagnostics, self.resolver)
{
let mut test_suite = TestSuite::new(filename);
test_suite
.extra
.insert(XmlString::new("package"), XmlString::new("org.ruff"));
let classname = Path::new(filename).with_extension("");
for diagnostic in diagnostics {
let DiagnosticWithLocation {
diagnostic,
start_location: location,
} = diagnostic;
let mut status = TestCaseStatus::non_success(NonSuccessKind::Failure);
status.set_message(diagnostic.body());
if let Some(location) = location {
status.set_description(format!(
"line {row}, col {col}, {body}",
row = location.line,
col = location.column,
body = diagnostic.body()
));
} else {
status.set_description(diagnostic.body());
}
let code = diagnostic
.secondary_code()
.map_or_else(|| diagnostic.name(), SecondaryCode::as_str);
let mut case = TestCase::new(format!("org.ruff.{code}"), status);
case.set_classname(classname.to_str().unwrap());
if let Some(location) = location {
case.extra.insert(
XmlString::new("line"),
XmlString::new(location.line.to_string()),
);
case.extra.insert(
XmlString::new("column"),
XmlString::new(location.column.to_string()),
);
}
test_suite.add_test_case(case);
}
report.add_test_suite(test_suite);
}
}
let adapter = FmtAdapter { fmt: f };
report.serialize(adapter).map_err(|_| std::fmt::Error)
}
}
// TODO(brent) this and `group_diagnostics_by_filename` are also used by the `grouped` output
// format. I think they'd make more sense in that file, but I started here first. I'll move them to
// that module when adding the `grouped` output format.
struct DiagnosticWithLocation<'a> {
diagnostic: &'a Diagnostic,
start_location: Option<LineColumn>,
}
impl Deref for DiagnosticWithLocation<'_> {
type Target = Diagnostic;
fn deref(&self) -> &Self::Target {
self.diagnostic
}
}
fn group_diagnostics_by_filename<'a>(
diagnostics: &'a [Diagnostic],
resolver: &'a dyn FileResolver,
) -> BTreeMap<&'a str, Vec<DiagnosticWithLocation<'a>>> {
let mut grouped_diagnostics = BTreeMap::default();
for diagnostic in diagnostics {
let (filename, start_location) = diagnostic
.primary_span_ref()
.map(|span| {
let file = span.file();
let start_location =
span.range()
.filter(|_| !resolver.is_notebook(file))
.map(|range| {
file.diagnostic_source(resolver)
.as_source_code()
.line_column(range.start())
});
(span.file().path(resolver), start_location)
})
.unwrap_or_default();
grouped_diagnostics
.entry(filename)
.or_insert_with(Vec::new)
.push(DiagnosticWithLocation {
diagnostic,
start_location,
});
}
grouped_diagnostics
}
struct FmtAdapter<'a> {
fmt: &'a mut dyn std::fmt::Write,
}
impl std::io::Write for FmtAdapter<'_> {
fn write(&mut self, buf: &[u8]) -> std::io::Result<usize> {
self.fmt
.write_str(std::str::from_utf8(buf).map_err(|_| {
std::io::Error::new(
std::io::ErrorKind::InvalidData,
"Invalid UTF-8 in JUnit report",
)
})?)
.map_err(std::io::Error::other)?;
Ok(buf.len())
}
fn flush(&mut self) -> std::io::Result<()> {
Ok(())
}
fn write_fmt(&mut self, args: std::fmt::Arguments<'_>) -> std::io::Result<()> {
self.fmt.write_fmt(args).map_err(std::io::Error::other)
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{create_diagnostics, create_syntax_error_diagnostics},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Junit);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Junit);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/junit.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/junit.rs
expression: env.render_diagnostics(&diagnostics)
---
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff" tests="3" failures="3" errors="0">

View File

@@ -1,15 +1,14 @@
---
source: crates/ruff_linter/src/message/junit.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/junit.rs
expression: env.render_diagnostics(&diagnostics)
---
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff" tests="2" failures="2" errors="0">
<testsuite name="syntax_errors.py" tests="2" disabled="0" errors="0" failures="2" package="org.ruff">
<testcase name="org.ruff" classname="syntax_errors" line="1" column="15">
<testcase name="org.ruff.invalid-syntax" classname="syntax_errors" line="1" column="15">
<failure message="SyntaxError: Expected one or more symbol names after import">line 1, col 15, SyntaxError: Expected one or more symbol names after import</failure>
</testcase>
<testcase name="org.ruff" classname="syntax_errors" line="3" column="12">
<testcase name="org.ruff.invalid-syntax" classname="syntax_errors" line="3" column="12">
<failure message="SyntaxError: Expected &apos;)&apos;, found newline">line 3, col 12, SyntaxError: Expected &apos;)&apos;, found newline</failure>
</testcase>
</testsuite>

View File

@@ -232,7 +232,7 @@ impl Files {
let roots = inner.roots.read().unwrap();
for root in roots.all() {
if root.path(db).starts_with(&path) {
if path.starts_with(root.path(db)) {
root.set_revision(db).to(FileRevision::now());
}
}
@@ -375,12 +375,25 @@ impl File {
}
/// Refreshes the file metadata by querying the file system if needed.
///
/// This also "touches" the file root associated with the given path.
/// This means that any Salsa queries that depend on the corresponding
/// root's revision will become invalidated.
pub fn sync_path(db: &mut dyn Db, path: &SystemPath) {
let absolute = SystemPath::absolute(path, db.system().current_directory());
Files::touch_root(db, &absolute);
Self::sync_system_path(db, &absolute, None);
}
/// Refreshes *only* the file metadata by querying the file system if needed.
///
/// This specifically does not touch any file root associated with the
/// given file path.
pub fn sync_path_only(db: &mut dyn Db, path: &SystemPath) {
let absolute = SystemPath::absolute(path, db.system().current_directory());
Self::sync_system_path(db, &absolute, None);
}
/// Increments the revision for the virtual file at `path`.
pub fn sync_virtual_path(db: &mut dyn Db, path: &SystemVirtualPath) {
if let Some(virtual_file) = db.files().try_virtual_file(path) {
@@ -486,7 +499,7 @@ impl fmt::Debug for File {
///
/// This is a wrapper around a [`File`] that provides additional methods to interact with a virtual
/// file.
#[derive(Copy, Clone)]
#[derive(Copy, Clone, Debug)]
pub struct VirtualFile(File);
impl VirtualFile {

View File

@@ -23,7 +23,7 @@ pub struct FileRoot {
pub path: SystemPathBuf,
/// The kind of the root at the time of its creation.
kind_at_time_of_creation: FileRootKind,
pub kind_at_time_of_creation: FileRootKind,
/// A revision that changes when the contents of the source root change.
///

View File

@@ -87,7 +87,7 @@ impl SourceDb for ModuleDb {
#[salsa::db]
impl Db for ModuleDb {
fn is_file_open(&self, file: File) -> bool {
fn should_check_file(&self, file: File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -15,7 +15,7 @@ license = { workspace = true }
[dependencies]
ruff_annotate_snippets = { workspace = true }
ruff_cache = { workspace = true }
ruff_db = { workspace = true, features = ["serde"] }
ruff_db = { workspace = true, features = ["junit", "serde"] }
ruff_diagnostics = { workspace = true, features = ["serde"] }
ruff_notebook = { workspace = true }
ruff_macros = { workspace = true }
@@ -55,7 +55,6 @@ path-absolutize = { workspace = true, features = [
pathdiff = { workspace = true }
pep440_rs = { workspace = true }
pyproject-toml = { workspace = true }
quick-junit = { workspace = true }
regex = { workspace = true }
rustc-hash = { workspace = true }
schemars = { workspace = true, optional = true }

View File

@@ -1,117 +0,0 @@
use std::io::Write;
use std::path::Path;
use quick_junit::{NonSuccessKind, Report, TestCase, TestCaseStatus, TestSuite, XmlString};
use ruff_db::diagnostic::Diagnostic;
use ruff_source_file::LineColumn;
use crate::message::{Emitter, EmitterContext, MessageWithLocation, group_diagnostics_by_filename};
#[derive(Default)]
pub struct JunitEmitter;
impl Emitter for JunitEmitter {
fn emit(
&mut self,
writer: &mut dyn Write,
diagnostics: &[Diagnostic],
context: &EmitterContext,
) -> anyhow::Result<()> {
let mut report = Report::new("ruff");
if diagnostics.is_empty() {
let mut test_suite = TestSuite::new("ruff");
test_suite
.extra
.insert(XmlString::new("package"), XmlString::new("org.ruff"));
let mut case = TestCase::new("No errors found", TestCaseStatus::success());
case.set_classname("ruff");
test_suite.add_test_case(case);
report.add_test_suite(test_suite);
} else {
for (filename, messages) in group_diagnostics_by_filename(diagnostics) {
let mut test_suite = TestSuite::new(&filename);
test_suite
.extra
.insert(XmlString::new("package"), XmlString::new("org.ruff"));
for message in messages {
let MessageWithLocation {
message,
start_location,
} = message;
let mut status = TestCaseStatus::non_success(NonSuccessKind::Failure);
status.set_message(message.body());
let location = if context.is_notebook(&message.expect_ruff_filename()) {
// We can't give a reasonable location for the structured formats,
// so we show one that's clearly a fallback
LineColumn::default()
} else {
start_location
};
status.set_description(format!(
"line {row}, col {col}, {body}",
row = location.line,
col = location.column,
body = message.body()
));
let mut case = TestCase::new(
if let Some(code) = message.secondary_code() {
format!("org.ruff.{code}")
} else {
"org.ruff".to_string()
},
status,
);
let file_path = Path::new(&*filename);
let file_stem = file_path.file_stem().unwrap().to_str().unwrap();
let classname = file_path.parent().unwrap().join(file_stem);
case.set_classname(classname.to_str().unwrap());
case.extra.insert(
XmlString::new("line"),
XmlString::new(location.line.to_string()),
);
case.extra.insert(
XmlString::new("column"),
XmlString::new(location.column.to_string()),
);
test_suite.add_test_case(case);
}
report.add_test_suite(test_suite);
}
}
report.serialize(writer)?;
Ok(())
}
}
#[cfg(test)]
mod tests {
    use insta::assert_snapshot;

    use crate::message::JunitEmitter;
    use crate::message::tests::{
        capture_emitter_output, create_diagnostics, create_syntax_error_diagnostics,
    };

    #[test]
    fn output() {
        let mut emitter = JunitEmitter;
        let content = capture_emitter_output(&mut emitter, &create_diagnostics());

        assert_snapshot!(content);
    }

    #[test]
    fn syntax_errors() {
        let mut emitter = JunitEmitter;
        let content = capture_emitter_output(&mut emitter, &create_syntax_error_diagnostics());

        assert_snapshot!(content);
    }
}
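
The emitter above serializes diagnostics into the common JUnit XML schema (one `testsuite` per file, one failing `testcase` per diagnostic, with the rule code folded into the case name). As a rough sketch of the resulting shape — the sample document below is hand-written and its attribute placement for the `line`/`column` extras is an assumption, not captured from a real run — it can be inspected with Python's stdlib XML parser:

```python
# Hand-written sample of the JUnit XML shape the emitter produces
# (values and attribute placement are illustrative assumptions).
import xml.etree.ElementTree as ET

sample = """<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="ruff">
  <testsuite name="main.py" package="org.ruff">
    <testcase name="org.ruff.F401" classname="main" line="1" column="8">
      <failure message="`os` imported but unused">line 1, col 8, `os` imported but unused</failure>
    </testcase>
  </testsuite>
</testsuites>
"""

root = ET.fromstring(sample)
case = root.find("./testsuite/testcase")
print(case.get("name"))       # org.ruff.F401
print(case.get("classname"))  # main
print(case.find("failure").get("message"))
```

CI systems that understand JUnit reports (Jenkins, GitLab, etc.) can then render each diagnostic as a failed test case.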


@@ -14,7 +14,6 @@ use ruff_db::files::File;
pub use github::GithubEmitter;
pub use gitlab::GitlabEmitter;
pub use grouped::GroupedEmitter;
pub use junit::JunitEmitter;
use ruff_notebook::NotebookIndex;
use ruff_source_file::{LineColumn, SourceFile};
use ruff_text_size::{Ranged, TextRange, TextSize};
@@ -28,7 +27,6 @@ mod diff;
mod github;
mod gitlab;
mod grouped;
mod junit;
mod sarif;
mod text;


@@ -77,10 +77,31 @@ struct ExpandedEdit {
content: Option<String>,
}
/// Perform global constructor initialization.
#[cfg(target_family = "wasm")]
#[expect(unsafe_code)]
pub fn before_main() {
    unsafe extern "C" {
        fn __wasm_call_ctors();
    }

    // Salsa uses the `inventory` crate, which registers global constructors that may need to be
    // called explicitly on WASM. See <https://github.com/dtolnay/inventory/blob/master/src/lib.rs#L105>
    // for details.
    unsafe {
        __wasm_call_ctors();
    }
}

#[cfg(not(target_family = "wasm"))]
pub fn before_main() {}

#[wasm_bindgen(start)]
pub fn run() {
    use log::Level;

    before_main();

    // When the `console_error_panic_hook` feature is enabled, we can call the
    // `set_panic_hook` function at least once during initialization, and then
    // we will get better error messages if our code ever panics.


@@ -21,6 +21,8 @@ macro_rules! check {
#[wasm_bindgen_test]
fn empty_config() {
    ruff_wasm::before_main();

    check!(
        "if (1, 2):\n pass",
        r#"{}"#,
@@ -42,6 +44,8 @@ fn empty_config() {
#[wasm_bindgen_test]
fn syntax_error() {
    ruff_wasm::before_main();

    check!(
        "x =\ny = 1\n",
        r#"{}"#,
@@ -63,6 +67,8 @@ fn syntax_error() {
#[wasm_bindgen_test]
fn unsupported_syntax_error() {
    ruff_wasm::before_main();

    check!(
        "match 2:\n case 1: ...",
        r#"{"target-version": "py39"}"#,
@@ -84,11 +90,15 @@ fn unsupported_syntax_error() {
#[wasm_bindgen_test]
fn partial_config() {
    ruff_wasm::before_main();

    check!("if (1, 2):\n pass", r#"{"ignore": ["F"]}"#, []);
}

#[wasm_bindgen_test]
fn partial_nested_config() {
    ruff_wasm::before_main();

    let config = r#"{
        "select": ["Q"],
        "flake8-quotes": {

crates/ty/docs/rules.md generated

@@ -36,7 +36,7 @@ def test(): -> "int":
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20call-non-callable) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L99)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L100)
</small>
**What it does**
@@ -58,7 +58,7 @@ Calling a non-callable object will raise a `TypeError` at runtime.
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20conflicting-argument-forms) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L143)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L144)
</small>
**What it does**
@@ -88,7 +88,7 @@ f(int) # error
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20conflicting-declarations) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L169)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L170)
</small>
**What it does**
@@ -117,7 +117,7 @@ a = 1
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20conflicting-metaclass) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L194)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L195)
</small>
**What it does**
@@ -147,7 +147,7 @@ class C(A, B): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20cyclic-class-definition) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L220)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L221)
</small>
**What it does**
@@ -177,7 +177,7 @@ class B(A): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20duplicate-base) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L264)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L286)
</small>
**What it does**
@@ -202,7 +202,7 @@ class B(A, A): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20duplicate-kw-only) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L285)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L307)
</small>
**What it does**
@@ -306,7 +306,7 @@ def test(): -> "Literal[5]":
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20inconsistent-mro) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L427)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L449)
</small>
**What it does**
@@ -334,7 +334,7 @@ class C(A, B): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20index-out-of-bounds) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L451)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L473)
</small>
**What it does**
@@ -358,7 +358,7 @@ t[3] # IndexError: tuple index out of range
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20instance-layout-conflict) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L317)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L339)
</small>
**What it does**
@@ -445,7 +445,7 @@ an atypical memory layout.
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-argument-type) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L471)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L493)
</small>
**What it does**
@@ -470,7 +470,7 @@ func("foo") # error: [invalid-argument-type]
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-assignment) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L511)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L533)
</small>
**What it does**
@@ -496,7 +496,7 @@ a: int = ''
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-attribute-access) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1515)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1537)
</small>
**What it does**
@@ -528,7 +528,7 @@ C.instance_var = 3 # error: Cannot assign to instance variable
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-base) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L533)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L555)
</small>
**What it does**
@@ -550,7 +550,7 @@ class A(42): ... # error: [invalid-base]
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-context-manager) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L584)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L606)
</small>
**What it does**
@@ -575,7 +575,7 @@ with 1:
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-declaration) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L605)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L627)
</small>
**What it does**
@@ -602,7 +602,7 @@ a: str
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-exception-caught) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L628)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L650)
</small>
**What it does**
@@ -644,7 +644,7 @@ except ZeroDivisionError:
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-generic-class) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L664)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L686)
</small>
**What it does**
@@ -675,7 +675,7 @@ class C[U](Generic[T]): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-legacy-type-variable) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L690)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L712)
</small>
**What it does**
@@ -708,7 +708,7 @@ def f(t: TypeVar("U")): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-metaclass) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L739)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L761)
</small>
**What it does**
@@ -740,7 +740,7 @@ class B(metaclass=f): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-overload) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L766)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L788)
</small>
**What it does**
@@ -788,7 +788,7 @@ def foo(x: int) -> int: ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-parameter-default) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L809)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L831)
</small>
**What it does**
@@ -812,7 +812,7 @@ def f(a: int = ''): ...
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-protocol) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L399)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L421)
</small>
**What it does**
@@ -844,7 +844,7 @@ TypeError: Protocols can only inherit from other protocols, got <class 'int'>
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-raise) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L829)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L851)
</small>
Checks for `raise` statements that raise non-exceptions or use invalid
@@ -891,7 +891,7 @@ def g():
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-return-type) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L492)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L514)
</small>
**What it does**
@@ -914,7 +914,7 @@ def func() -> int:
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-super-argument) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L872)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L894)
</small>
**What it does**
@@ -968,7 +968,7 @@ TODO #14889
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-type-alias-type) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L718)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L740)
</small>
**What it does**
@@ -993,7 +993,7 @@ NewAlias = TypeAliasType(get_name(), int) # error: TypeAliasType name mus
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-type-checking-constant) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L911)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L933)
</small>
**What it does**
@@ -1021,7 +1021,7 @@ TYPE_CHECKING = ''
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-type-form) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L935)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L957)
</small>
**What it does**
@@ -1049,7 +1049,7 @@ b: Annotated[int] # `Annotated` expects at least two arguments
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-type-guard-call) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L987)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1009)
</small>
**What it does**
@@ -1081,7 +1081,7 @@ f(10) # Error
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-type-guard-definition) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L959)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L981)
</small>
**What it does**
@@ -1113,7 +1113,7 @@ class C:
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20invalid-type-variable-constraints) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1015)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1037)
</small>
**What it does**
@@ -1146,7 +1146,7 @@ T = TypeVar('T', bound=str) # valid bound TypeVar
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20missing-argument) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1044)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1066)
</small>
**What it does**
@@ -1169,7 +1169,7 @@ func() # TypeError: func() missing 1 required positional argument: 'x'
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20no-matching-overload) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1063)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1085)
</small>
**What it does**
@@ -1196,7 +1196,7 @@ func("string") # error: [no-matching-overload]
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20non-subscriptable) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1086)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1108)
</small>
**What it does**
@@ -1218,7 +1218,7 @@ Subscripting an object that does not support it will raise a `TypeError` at runt
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20not-iterable) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1104)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1126)
</small>
**What it does**
@@ -1242,7 +1242,7 @@ for i in 34: # TypeError: 'int' object is not iterable
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20parameter-already-assigned) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1155)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1177)
</small>
**What it does**
@@ -1296,7 +1296,7 @@ def test(): -> "int":
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20static-assert-error) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1491)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1513)
</small>
**What it does**
@@ -1324,7 +1324,7 @@ static_assert(int(2.0 * 3.0) == 6) # error: does not have a statically known tr
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20subclass-of-final-class) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1246)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1268)
</small>
**What it does**
@@ -1351,7 +1351,7 @@ class B(A): ... # Error raised here
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20too-many-positional-arguments) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1291)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1313)
</small>
**What it does**
@@ -1376,7 +1376,7 @@ f("foo") # Error raised here
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20type-assertion-failure) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1269)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1291)
</small>
**What it does**
@@ -1402,7 +1402,7 @@ def _(x: int):
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unavailable-implicit-super-arguments) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1312)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1334)
</small>
**What it does**
@@ -1446,7 +1446,7 @@ class A:
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unknown-argument) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1369)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1391)
</small>
**What it does**
@@ -1471,7 +1471,7 @@ f(x=1, y=2) # Error raised here
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unresolved-attribute) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1390)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1412)
</small>
**What it does**
@@ -1497,7 +1497,7 @@ A().foo # AttributeError: 'A' object has no attribute 'foo'
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unresolved-import) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1412)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1434)
</small>
**What it does**
@@ -1520,7 +1520,7 @@ import foo # ModuleNotFoundError: No module named 'foo'
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unresolved-reference) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1431)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1453)
</small>
**What it does**
@@ -1543,7 +1543,7 @@ print(x) # NameError: name 'x' is not defined
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unsupported-bool-conversion) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1124)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1146)
</small>
**What it does**
@@ -1578,7 +1578,7 @@ b1 < b2 < b1 # exception raised here
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unsupported-operator) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1450)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1472)
</small>
**What it does**
@@ -1604,7 +1604,7 @@ A() + A() # TypeError: unsupported operand type(s) for +: 'A' and 'A'
<small>
Default level: [`error`](../rules.md#rule-levels "This lint has a default level of 'error'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20zero-stepsize-in-slice) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1472)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1494)
</small>
**What it does**
@@ -1622,6 +1622,31 @@ l = list(range(10))
l[1:10:0] # ValueError: slice step cannot be zero
```
## `deprecated`
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20deprecated) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L265)
</small>
**What it does**
Checks for uses of deprecated items.
**Why is this bad?**
Deprecated items should no longer be used.
**Examples**
```python
@warnings.deprecated("use new_func instead")
def old_func(): ...
old_func() # emits [deprecated] diagnostic
```
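
At runtime, `warnings.deprecated` (new in Python 3.13, previously `typing_extensions.deprecated`) causes calls to the decorated object to emit a `DeprecationWarning`. The behavior this static diagnostic mirrors can be sketched with a minimal stand-in decorator — this is an illustration only, not the stdlib implementation:

```python
import functools
import warnings

def deprecated(message):
    """Minimal stand-in for warnings.deprecated: warn on each call."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(message, DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated("use new_func instead")
def old_func():
    return 42

# Capture the warning that firing a call produces.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = old_func()

print(result)             # 42
print(caught[0].message)  # use new_func instead
```

Unlike this runtime sketch, ty reports the diagnostic statically, without executing the call.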
## `invalid-ignore-comment`
<small>
@@ -1655,7 +1680,7 @@ a = 20 / 0 # type: ignore
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20possibly-unbound-attribute) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1176)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1198)
</small>
**What it does**
@@ -1681,7 +1706,7 @@ A.c # AttributeError: type object 'A' has no attribute 'c'
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20possibly-unbound-implicit-call) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L117)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L118)
</small>
**What it does**
@@ -1711,7 +1736,7 @@ A()[0] # TypeError: 'A' object is not subscriptable
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20possibly-unbound-import) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1198)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1220)
</small>
**What it does**
@@ -1741,7 +1766,7 @@ from module import a # ImportError: cannot import name 'a' from 'module'
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20redundant-cast) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1543)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1565)
</small>
**What it does**
@@ -1766,7 +1791,7 @@ cast(int, f()) # Redundant
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20undefined-reveal) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1351)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1373)
</small>
**What it does**
@@ -1817,7 +1842,7 @@ a = 20 / 0 # ty: ignore[division-by-zero]
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unresolved-global) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1564)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1586)
</small>
**What it does**
@@ -1871,7 +1896,7 @@ def g():
<small>
Default level: [`warn`](../rules.md#rule-levels "This lint has a default level of 'warn'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20unsupported-base) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L551)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L573)
</small>
**What it does**
@@ -1908,7 +1933,7 @@ class D(C): ... # error: [unsupported-base]
<small>
Default level: [`ignore`](../rules.md#rule-levels "This lint has a default level of 'ignore'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20division-by-zero) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L246)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L247)
</small>
**What it does**
@@ -1930,7 +1955,7 @@ Dividing by zero raises a `ZeroDivisionError` at runtime.
<small>
Default level: [`ignore`](../rules.md#rule-levels "This lint has a default level of 'ignore'.") ·
[Related issues](https://github.com/astral-sh/ty/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20possibly-unresolved-reference) ·
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1224)
[View source](https://github.com/astral-sh/ruff/blob/main/crates%2Fty_python_semantic%2Fsrc%2Ftypes%2Fdiagnostic.rs#L1246)
</small>
**What it does**

View File

@@ -660,7 +660,7 @@ fn can_handle_large_binop_expressions() -> anyhow::Result<()> {
--> test.py:4:13
|
2 | from typing_extensions import reveal_type
3 | total = 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1...
3 | total = 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 +…
4 | reveal_type(total)
| ^^^^^ `Literal[2000]`
|

View File

@@ -15,7 +15,7 @@ use ty_project::metadata::pyproject::{PyProject, Tool};
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
use ty_project::watch::{ChangeEvent, ProjectWatcher, directory_watcher};
use ty_project::{Db, ProjectDatabase, ProjectMetadata};
use ty_python_semantic::{ModuleName, PythonPlatform, resolve_module};
use ty_python_semantic::{Module, ModuleName, PythonPlatform, resolve_module};
struct TestCase {
db: ProjectDatabase,
@@ -40,6 +40,14 @@ impl TestCase {
&self.db
}
/// Stops file-watching and returns the collected change events.
///
/// The caller must pass a `MatchEvent` filter that is applied to
/// the change events returned. To get all change events, use `|_:
/// &ChangeEvent| true`. If possible, callers should pass a filter for a
/// specific file name, e.g., `event_for_file("foo.py")`. When done this
/// way, the watcher will specifically try to wait for a change event
/// matching the filter. This can help avoid flakes.
#[track_caller]
fn stop_watch<M>(&mut self, matcher: M) -> Vec<ChangeEvent>
where
@@ -1877,3 +1885,156 @@ fn rename_files_casing_only() -> anyhow::Result<()> {
Ok(())
}
/// This tests that retrieving submodules from a module has its cache
/// appropriately invalidated after a file is created.
#[test]
fn submodule_cache_invalidation_created() -> anyhow::Result<()> {
let mut case = setup([("lib.py", ""), ("bar/__init__.py", ""), ("bar/foo.py", "")])?;
let module = resolve_module(case.db(), &ModuleName::new("bar").unwrap()).expect("`bar` module");
let get_submodules = |db: &dyn Db, module: &Module| {
let mut names = module
.all_submodules(db)
.iter()
.map(|name| name.as_str().to_string())
.collect::<Vec<String>>();
names.sort();
names.join("\n")
};
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@"foo",
);
std::fs::write(case.project_path("bar/wazoo.py").as_std_path(), "")?;
let changes = case.stop_watch(event_for_file("wazoo.py"));
case.apply_changes(changes, None);
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@r"
foo
wazoo
",
);
Ok(())
}
/// This tests that retrieving submodules from a module has its cache
/// appropriately invalidated after a file is deleted.
#[test]
fn submodule_cache_invalidation_deleted() -> anyhow::Result<()> {
let mut case = setup([
("lib.py", ""),
("bar/__init__.py", ""),
("bar/foo.py", ""),
("bar/wazoo.py", ""),
])?;
let module = resolve_module(case.db(), &ModuleName::new("bar").unwrap()).expect("`bar` module");
let get_submodules = |db: &dyn Db, module: &Module| {
let mut names = module
.all_submodules(db)
.iter()
.map(|name| name.as_str().to_string())
.collect::<Vec<String>>();
names.sort();
names.join("\n")
};
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@r"
foo
wazoo
",
);
std::fs::remove_file(case.project_path("bar/wazoo.py").as_std_path())?;
let changes = case.stop_watch(event_for_file("wazoo.py"));
case.apply_changes(changes, None);
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@"foo",
);
Ok(())
}
/// This tests that retrieving submodules from a module has its cache
/// appropriately invalidated after a file is created and then deleted.
#[test]
fn submodule_cache_invalidation_created_then_deleted() -> anyhow::Result<()> {
let mut case = setup([("lib.py", ""), ("bar/__init__.py", ""), ("bar/foo.py", "")])?;
let module = resolve_module(case.db(), &ModuleName::new("bar").unwrap()).expect("`bar` module");
let get_submodules = |db: &dyn Db, module: &Module| {
let mut names = module
.all_submodules(db)
.iter()
.map(|name| name.as_str().to_string())
.collect::<Vec<String>>();
names.sort();
names.join("\n")
};
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@"foo",
);
std::fs::write(case.project_path("bar/wazoo.py").as_std_path(), "")?;
let changes = case.take_watch_changes(event_for_file("wazoo.py"));
case.apply_changes(changes, None);
std::fs::remove_file(case.project_path("bar/wazoo.py").as_std_path())?;
let changes = case.stop_watch(event_for_file("wazoo.py"));
case.apply_changes(changes, None);
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@"foo",
);
Ok(())
}
/// This tests that retrieving submodules from a module has its cache
/// appropriately invalidated after a file is created *after* a project
/// configuration change.
#[test]
fn submodule_cache_invalidation_after_pyproject_created() -> anyhow::Result<()> {
let mut case = setup([("lib.py", ""), ("bar/__init__.py", ""), ("bar/foo.py", "")])?;
let module = resolve_module(case.db(), &ModuleName::new("bar").unwrap()).expect("`bar` module");
let get_submodules = |db: &dyn Db, module: &Module| {
let mut names = module
.all_submodules(db)
.iter()
.map(|name| name.as_str().to_string())
.collect::<Vec<String>>();
names.sort();
names.join("\n")
};
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@"foo",
);
case.update_options(Options::default())?;
std::fs::write(case.project_path("bar/wazoo.py").as_std_path(), "")?;
let changes = case.take_watch_changes(event_for_file("wazoo.py"));
case.apply_changes(changes, None);
insta::assert_snapshot!(
get_submodules(case.db(), &module),
@r"
foo
wazoo
",
);
Ok(())
}

View File

@@ -96,7 +96,7 @@ pub(crate) mod tests {
#[salsa::db]
impl SemanticDb for TestDb {
fn is_file_open(&self, file: File) -> bool {
fn should_check_file(&self, file: File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -4,12 +4,13 @@ pub use crate::goto_type_definition::goto_type_definition;
use crate::find_node::covering_node;
use crate::stub_mapping::StubMapper;
use ruff_db::parsed::ParsedModuleRef;
use ruff_db::parsed::{ParsedModuleRef, parsed_module};
use ruff_python_ast::{self as ast, AnyNodeRef};
use ruff_python_parser::TokenKind;
use ruff_text_size::{Ranged, TextRange, TextSize};
use ty_python_semantic::types::Type;
use ty_python_semantic::{HasType, SemanticModel};
use ty_python_semantic::types::definitions_for_keyword_argument;
use ty_python_semantic::{HasType, SemanticModel, definitions_for_name};
#[derive(Clone, Copy, Debug)]
pub(crate) enum GotoTarget<'a> {
@@ -150,15 +151,19 @@ impl GotoTarget<'_> {
use ruff_python_ast as ast;
match self {
// For names, find the definitions of the symbol
GotoTarget::Expression(expression) => {
if let ast::ExprRef::Name(name) = expression {
Self::get_name_definition_targets(name, file, db, stub_mapper)
} else {
// For other expressions, we can't find definitions
None
}
}
GotoTarget::Expression(expression) => match expression {
ast::ExprRef::Name(name) => definitions_to_navigation_targets(
db,
stub_mapper,
definitions_for_name(db, file, name),
),
ast::ExprRef::Attribute(attribute) => definitions_to_navigation_targets(
db,
stub_mapper,
ty_python_semantic::definitions_for_attribute(db, file, attribute),
),
_ => None,
},
// For already-defined symbols, they are their own definitions
GotoTarget::FunctionDef(function) => {
@@ -195,41 +200,31 @@ impl GotoTarget<'_> {
None
}
// TODO: Handle attribute and method accesses (y in `x.y` expressions)
// TODO: Handle keyword arguments in call expression
// Handle keyword arguments in call expressions
GotoTarget::KeywordArgument(keyword) => {
// Find the call expression that contains this keyword
let module = parsed_module(db, file).load(db);
// Use the keyword's range to find the containing call expression
let covering_node = covering_node(module.syntax().into(), keyword.range())
.find_first(|node| matches!(node, AnyNodeRef::ExprCall(_)))
.ok()?;
if let AnyNodeRef::ExprCall(call_expr) = covering_node.node() {
let definitions =
definitions_for_keyword_argument(db, file, keyword, call_expr);
return definitions_to_navigation_targets(db, stub_mapper, definitions);
}
None
}
// TODO: Handle multi-part module names in import statements
// TODO: Handle imported symbol in y in `from x import y as z` statement
// TODO: Handle string literals that map to TypedDict fields
_ => None,
}
}
/// Get navigation targets for definitions associated with a name expression
fn get_name_definition_targets(
name: &ruff_python_ast::ExprName,
file: ruff_db::files::File,
db: &dyn crate::Db,
stub_mapper: Option<&StubMapper>,
) -> Option<crate::NavigationTargets> {
use ty_python_semantic::definitions_for_name;
// Get all definitions for this name
let mut definitions = definitions_for_name(db, file, name);
// Apply stub mapping if a mapper is provided
if let Some(mapper) = stub_mapper {
definitions = mapper.map_definitions(definitions);
}
if definitions.is_empty() {
return None;
}
// Convert definitions to navigation targets
let targets = convert_resolved_definitions_to_targets(db, definitions);
Some(crate::NavigationTargets::unique(targets))
}
}
impl Ranged for GotoTarget<'_> {
@@ -279,18 +274,35 @@ fn convert_resolved_definitions_to_targets(
full_range: full_range.range(),
}
}
ty_python_semantic::ResolvedDefinition::ModuleFile(module_file) => {
// For module files, navigate to the beginning of the file
ty_python_semantic::ResolvedDefinition::FileWithRange(file_range) => {
// For file ranges, navigate to the specific range within the file
crate::NavigationTarget {
file: module_file,
focus_range: ruff_text_size::TextRange::default(), // Start of file
full_range: ruff_text_size::TextRange::default(), // Start of file
file: file_range.file(),
focus_range: file_range.range(),
full_range: file_range.range(),
}
}
})
.collect()
}
/// Shared helper to map and convert resolved definitions into navigation targets.
fn definitions_to_navigation_targets<'db>(
db: &dyn crate::Db,
stub_mapper: Option<&StubMapper<'db>>,
mut definitions: Vec<ty_python_semantic::ResolvedDefinition<'db>>,
) -> Option<crate::NavigationTargets> {
if let Some(mapper) = stub_mapper {
definitions = mapper.map_definitions(definitions);
}
if definitions.is_empty() {
None
} else {
let targets = convert_resolved_definitions_to_targets(db, definitions);
Some(crate::NavigationTargets::unique(targets))
}
}
pub(crate) fn find_goto_target(
parsed: &ParsedModuleRef,
offset: TextSize,

View File

@@ -611,7 +611,107 @@ def another_helper():
}
#[test]
fn goto_declaration_builtin_type() {
fn goto_declaration_instance_attribute() {
let test = cursor_test(
"
class C:
def __init__(self):
self.x: int = 1
c = C()
y = c.x<CURSOR>
",
);
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> main.py:4:21
|
2 | class C:
3 | def __init__(self):
4 | self.x: int = 1
| ^^^^^^
5 |
6 | c = C()
|
info: Source
--> main.py:7:17
|
6 | c = C()
7 | y = c.x
| ^^^
|
");
}
#[test]
fn goto_declaration_instance_attribute_no_annotation() {
let test = cursor_test(
"
class C:
def __init__(self):
self.x = 1
c = C()
y = c.x<CURSOR>
",
);
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> main.py:4:21
|
2 | class C:
3 | def __init__(self):
4 | self.x = 1
| ^^^^^^
5 |
6 | c = C()
|
info: Source
--> main.py:7:17
|
6 | c = C()
7 | y = c.x
| ^^^
|
");
}
#[test]
fn goto_declaration_method_call_to_definition() {
let test = cursor_test(
"
class C:
def foo(self):
return 42
c = C()
res = c.foo<CURSOR>()
",
);
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> main.py:3:21
|
2 | class C:
3 | def foo(self):
| ^^^
4 | return 42
|
info: Source
--> main.py:7:19
|
6 | c = C()
7 | res = c.foo()
| ^^^^^
|
");
}
#[test]
fn goto_declaration_module_attribute() {
let test = cursor_test(
r#"
x: i<CURSOR>nt = 42
@@ -721,6 +821,152 @@ def function():
"#);
}
#[test]
fn goto_declaration_inherited_attribute() {
let test = cursor_test(
"
class A:
x = 10
class B(A):
pass
b = B()
y = b.x<CURSOR>
",
);
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> main.py:3:17
|
2 | class A:
3 | x = 10
| ^
4 |
5 | class B(A):
|
info: Source
--> main.py:9:17
|
8 | b = B()
9 | y = b.x
| ^^^
|
");
}
#[test]
fn goto_declaration_property_getter_setter() {
let test = cursor_test(
"
class C:
def __init__(self):
self._value = 0
@property
def value(self):
return self._value
c = C()
c.value<CURSOR> = 42
",
);
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> main.py:7:21
|
6 | @property
7 | def value(self):
| ^^^^^
8 | return self._value
|
info: Source
--> main.py:11:13
|
10 | c = C()
11 | c.value = 42
| ^^^^^^^
|
");
}
#[test]
fn goto_declaration_function_doc_attribute() {
let test = cursor_test(
r#"
def my_function():
"""This is a docstring."""
return 42
doc = my_function.__doc<CURSOR>__
"#,
);
// Should navigate to the __doc__ property in the FunctionType class in typeshed
let result = test.goto_declaration();
assert!(
!result.contains("No goto target found"),
"Should find builtin __doc__ attribute"
);
assert!(
!result.contains("No declarations found"),
"Should find builtin __doc__ declarations"
);
// Should navigate to a typeshed file containing the __doc__ attribute
assert!(
result.contains("types.pyi") || result.contains("builtins.pyi"),
"Should navigate to typeshed file with __doc__ definition"
);
assert!(
result.contains("__doc__"),
"Should find the __doc__ attribute definition"
);
assert!(
result.contains("info[goto-declaration]: Declaration"),
"Should be a goto-declaration result"
);
}
#[test]
fn goto_declaration_protocol_instance_attribute() {
let test = cursor_test(
"
from typing import Protocol
class Drawable(Protocol):
def draw(self) -> None: ...
name: str
def use_drawable(obj: Drawable):
obj.na<CURSOR>me
",
);
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> main.py:6:17
|
4 | class Drawable(Protocol):
5 | def draw(self) -> None: ...
6 | name: str
| ^^^^
7 |
8 | def use_drawable(obj: Drawable):
|
info: Source
--> main.py:9:17
|
8 | def use_drawable(obj: Drawable):
9 | obj.name
| ^^^^^^^^
|
");
}
#[test]
fn goto_declaration_generic_method_class_type() {
let test = cursor_test(
@@ -756,6 +1002,94 @@ class MyClass:
");
}
#[test]
fn goto_declaration_keyword_argument_simple() {
let test = cursor_test(
"
def my_function(x, y, z=10):
return x + y + z
result = my_function(1, y<CURSOR>=2, z=3)
",
);
assert_snapshot!(test.goto_declaration(), @r"
info[goto-declaration]: Declaration
--> main.py:2:32
|
2 | def my_function(x, y, z=10):
| ^
3 | return x + y + z
|
info: Source
--> main.py:5:37
|
3 | return x + y + z
4 |
5 | result = my_function(1, y=2, z=3)
| ^
|
");
}
#[test]
fn goto_declaration_keyword_argument_overloaded() {
let test = cursor_test(
r#"
from typing import overload
@overload
def process(data: str, format: str) -> str: ...
@overload
def process(data: int, format: int) -> int: ...
def process(data, format):
return data
# Call the overloaded function
result = process("hello", format<CURSOR>="json")
"#,
);
// Should navigate to the parameter in both matching overloads
assert_snapshot!(test.goto_declaration(), @r#"
info[goto-declaration]: Declaration
--> main.py:5:36
|
4 | @overload
5 | def process(data: str, format: str) -> str: ...
| ^^^^^^
6 |
7 | @overload
|
info: Source
--> main.py:14:39
|
13 | # Call the overloaded function
14 | result = process("hello", format="json")
| ^^^^^^
|
info[goto-declaration]: Declaration
--> main.py:8:36
|
7 | @overload
8 | def process(data: int, format: int) -> int: ...
| ^^^^^^
9 |
10 | def process(data, format):
|
info: Source
--> main.py:14:39
|
13 | # Call the overloaded function
14 | result = process("hello", format="json")
| ^^^^^^
|
"#);
}
impl CursorTest {
fn goto_declaration(&self) -> String {
let Some(targets) = goto_declaration(&self.db, self.cursor.file, self.cursor.offset)

View File

@@ -105,11 +105,11 @@ pub struct NavigationTargets(smallvec::SmallVec<[NavigationTarget; 1]>);
impl NavigationTargets {
fn single(target: NavigationTarget) -> Self {
Self(smallvec::smallvec![target])
Self(smallvec::smallvec_inline![target])
}
fn empty() -> Self {
Self(smallvec::SmallVec::new())
Self(smallvec::SmallVec::new_const())
}
fn unique(targets: impl IntoIterator<Item = NavigationTarget>) -> Self {

View File

@@ -482,25 +482,40 @@ impl<'db> SemanticTokenVisitor<'db> {
parameters: &ast::Parameters,
func: Option<&ast::StmtFunctionDef>,
) {
// Parameters
for (i, param) in parameters.args.iter().enumerate() {
let token_type = if let Some(func) = func {
// For function definitions, use the classification logic to determine
// whether this is a self/cls parameter or just a regular parameter
self.classify_parameter(&param.parameter, i == 0, func)
} else {
// For lambdas, all parameters are just parameters (no self/cls)
SemanticTokenType::Parameter
let mut param_index = 0;
for any_param in parameters {
let parameter = any_param.as_parameter();
let token_type = match any_param {
ast::AnyParameterRef::NonVariadic(_) => {
// For non-variadic parameters (positional-only, regular, keyword-only),
// check if this should be classified as self/cls parameter
if let Some(func) = func {
let result = self.classify_parameter(parameter, param_index == 0, func);
param_index += 1;
result
} else {
// For lambdas, all parameters are just parameters (no self/cls)
param_index += 1;
SemanticTokenType::Parameter
}
}
ast::AnyParameterRef::Variadic(_) => {
// Variadic parameters (*args, **kwargs) are always just parameters
param_index += 1;
SemanticTokenType::Parameter
}
};
self.add_token(
param.parameter.name.range(),
parameter.name.range(),
token_type,
SemanticTokenModifier::empty(),
);
// Handle parameter type annotations
if let Some(annotation) = &param.parameter.annotation {
if let Some(annotation) = &parameter.annotation {
self.visit_type_annotation(annotation);
}
}
@@ -977,7 +992,8 @@ class MyClass:
class MyClass:
def method(instance, x): pass
@classmethod
def other(klass, y): pass<CURSOR>
def other(klass, y): pass
def complex_method(instance, posonly, /, regular, *args, kwonly, **kwargs): pass<CURSOR>
",
);
@@ -992,6 +1008,13 @@ class MyClass:
"other" @ 75..80: Method [definition]
"klass" @ 81..86: ClsParameter
"y" @ 88..89: Parameter
"complex_method" @ 105..119: Method [definition]
"instance" @ 120..128: SelfParameter
"posonly" @ 130..137: Parameter
"regular" @ 142..149: Parameter
"args" @ 152..156: Parameter
"kwonly" @ 158..164: Parameter
"kwargs" @ 168..174: Parameter
"#);
}
@@ -1665,6 +1688,12 @@ class BoundedContainer[T: int, U = str]:
"P" @ 324..325: Variable
"str" @ 327..330: Class
"wrapper" @ 341..348: Function [definition]
"args" @ 350..354: Parameter
"P" @ 356..357: Variable
"args" @ 358..362: Variable
"kwargs" @ 366..372: Parameter
"P" @ 374..375: Variable
"kwargs" @ 376..382: Variable
"str" @ 387..390: Class
"str" @ 407..410: Class
"func" @ 411..415: Variable

View File

@@ -12,8 +12,8 @@ use ruff_db::diagnostic::Diagnostic;
use ruff_db::files::{File, Files};
use ruff_db::system::System;
use ruff_db::vendored::VendoredFileSystem;
use salsa::Event;
use salsa::plumbing::ZalsaDatabase;
use salsa::{Event, Setter};
use ty_ide::Db as IdeDb;
use ty_python_semantic::lint::{LintRegistry, RuleSelection};
use ty_python_semantic::{Db as SemanticDb, Program};
@@ -82,22 +82,25 @@ impl ProjectDatabase {
Ok(db)
}
/// Checks all open files in the project and its dependencies.
/// Checks the files in the project and its dependencies as per the project's check mode.
///
/// Use [`set_check_mode`] to update the check mode.
///
/// [`set_check_mode`]: ProjectDatabase::set_check_mode
pub fn check(&self) -> Vec<Diagnostic> {
self.check_with_mode(CheckMode::OpenFiles)
}
/// Checks all open files in the project and its dependencies, using the given reporter.
pub fn check_with_reporter(&self, reporter: &mut dyn ProgressReporter) -> Vec<Diagnostic> {
let reporter = AssertUnwindSafe(reporter);
self.project().check(self, CheckMode::OpenFiles, reporter)
}
/// Check the project with the given mode.
pub fn check_with_mode(&self, mode: CheckMode) -> Vec<Diagnostic> {
let mut reporter = DummyReporter;
let reporter = AssertUnwindSafe(&mut reporter as &mut dyn ProgressReporter);
self.project().check(self, mode, reporter)
self.project().check(self, reporter)
}
/// Checks the files in the project and its dependencies, using the given reporter.
///
/// Use [`set_check_mode`] to update the check mode.
///
/// [`set_check_mode`]: ProjectDatabase::set_check_mode
pub fn check_with_reporter(&self, reporter: &mut dyn ProgressReporter) -> Vec<Diagnostic> {
let reporter = AssertUnwindSafe(reporter);
self.project().check(self, reporter)
}
#[tracing::instrument(level = "debug", skip(self))]
@@ -105,6 +108,12 @@ impl ProjectDatabase {
self.project().check_file(self, file)
}
/// Set the check mode for the project.
pub fn set_check_mode(&mut self, mode: CheckMode) {
tracing::debug!("Updating project to check {mode}");
self.project().set_check_mode(self).to(mode);
}
/// Returns a mutable reference to the system.
///
/// WARNING: Triggers a new revision, canceling other database handles. This can lead to deadlock.
@@ -163,17 +172,28 @@ impl std::fmt::Debug for ProjectDatabase {
}
}
#[derive(Debug, Clone, Copy, PartialEq)]
#[derive(Default, Debug, Clone, Copy, PartialEq, Eq)]
#[cfg_attr(test, derive(serde::Serialize))]
pub enum CheckMode {
/// Checks only the open files in the project.
/// Checks the open files in the project.
OpenFiles,
/// Checks all files in the project, ignoring the open file set.
///
/// This includes virtual files, such as those created by the language server.
/// This includes virtual files, such as those opened in an editor.
#[default]
AllFiles,
}
impl fmt::Display for CheckMode {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
CheckMode::OpenFiles => write!(f, "open files"),
CheckMode::AllFiles => write!(f, "all files"),
}
}
}
/// Stores memory usage information.
pub struct SalsaMemoryDump {
total_fields: usize,
@@ -389,12 +409,9 @@ impl IdeDb for ProjectDatabase {}
#[salsa::db]
impl SemanticDb for ProjectDatabase {
fn is_file_open(&self, file: File) -> bool {
let Some(project) = &self.project else {
return false;
};
project.is_file_open(self, file)
fn should_check_file(&self, file: File) -> bool {
self.project
.is_some_and(|project| project.should_check_file(self, file))
}
fn rule_selection(&self, file: File) -> &RuleSelection {
@@ -543,7 +560,7 @@ pub(crate) mod tests {
#[salsa::db]
impl ty_python_semantic::Db for TestDb {
fn is_file_open(&self, file: ruff_db::files::File) -> bool {
fn should_check_file(&self, file: ruff_db::files::File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -6,7 +6,8 @@ use std::collections::BTreeSet;
use crate::walk::ProjectFilesWalker;
use ruff_db::Db as _;
use ruff_db::files::{File, Files};
use ruff_db::file_revision::FileRevision;
use ruff_db::files::{File, FileRootKind, Files};
use ruff_db::system::SystemPath;
use rustc_hash::FxHashSet;
use salsa::Setter;
@@ -57,12 +58,6 @@ impl ProjectDatabase {
let mut synced_files = FxHashSet::default();
let mut sync_recursively = BTreeSet::default();
let mut sync_path = |db: &mut ProjectDatabase, path: &SystemPath| {
if synced_files.insert(path.to_path_buf()) {
File::sync_path(db, path);
}
};
for change in changes {
tracing::trace!("Handle change: {:?}", change);
@@ -92,12 +87,49 @@ impl ProjectDatabase {
match change {
ChangeEvent::Changed { path, kind: _ } | ChangeEvent::Opened(path) => {
sync_path(self, &path);
if synced_files.insert(path.to_path_buf()) {
let absolute =
SystemPath::absolute(&path, self.system().current_directory());
File::sync_path_only(self, &absolute);
if let Some(root) = self.files().root(self, &absolute) {
match root.kind_at_time_of_creation(self) {
// When a file inside the root of
// the project is changed, we don't
// want to mark the entire root as
// having changed too. In theory it
// might make sense to, but at time
// of writing, the file root revision
// on a project is used to invalidate
// the submodule files found within a
// directory. If we bumped the revision
// on every change within a project,
// then this caching technique would be
// effectively useless.
//
// It's plausible we should explore
// a more robust cache invalidation
// strategy that models more directly
// what we care about. For example, by
// keeping track of directories and
// their direct children explicitly,
// and then keying the submodule cache
// off of that instead. ---AG
FileRootKind::Project => {}
FileRootKind::LibrarySearchPath => {
root.set_revision(self).to(FileRevision::now());
}
}
}
}
}
ChangeEvent::Created { kind, path } => {
match kind {
CreatedKind::File => sync_path(self, &path),
CreatedKind::File => {
if synced_files.insert(path.to_path_buf()) {
File::sync_path(self, &path);
}
}
CreatedKind::Directory | CreatedKind::Any => {
sync_recursively.insert(path.clone());
}
@@ -138,7 +170,9 @@ impl ProjectDatabase {
};
if is_file {
sync_path(self, &path);
if synced_files.insert(path.to_path_buf()) {
File::sync_path(self, &path);
}
if let Some(file) = self.files().try_system(self, &path) {
project.remove_file(self, file);

View File

@@ -18,7 +18,7 @@ use crate::{IOErrorDiagnostic, Project};
/// The implementation uses internal mutability to transition between the lazy and indexed state
/// without triggering a new salsa revision. This is safe because the initial indexing happens on first access,
so no query can depend on the contents of the indexed files before that. All subsequent mutations to
/// the indexed files must go through `IndexedMut`, which uses the Salsa setter `package.set_file_set` to
/// the indexed files must go through `IndexedMut`, which uses the Salsa setter `project.set_file_set` to
ensure that Salsa always knows when the set of indexed files has changed.
#[derive(Debug)]
pub struct IndexedFiles {
@@ -280,7 +280,7 @@ mod tests {
// Calling files a second time should not dead-lock.
// This can e.g. happen when `check_file` iterates over all files and
// `is_file_open` queries the open files.
// `should_check_file` queries the open files.
let files_2 = project.file_set(&db).get();
match files_2 {

View File

@@ -6,7 +6,7 @@ use files::{Index, Indexed, IndexedFiles};
use metadata::settings::Settings;
pub use metadata::{ProjectMetadata, ProjectMetadataError};
use ruff_db::diagnostic::{Annotation, Diagnostic, DiagnosticId, Severity, Span, SubDiagnostic};
use ruff_db::files::File;
use ruff_db::files::{File, FileRootKind};
use ruff_db::parsed::parsed_module;
use ruff_db::source::{SourceTextError, source_text};
use ruff_db::system::{SystemPath, SystemPathBuf};
@@ -14,6 +14,8 @@ use rustc_hash::FxHashSet;
use salsa::Durability;
use salsa::Setter;
use std::backtrace::BacktraceStatus;
use std::collections::hash_set;
use std::iter::FusedIterator;
use std::panic::{AssertUnwindSafe, UnwindSafe};
use std::sync::Arc;
use thiserror::Error;
@@ -54,13 +56,10 @@ pub fn default_lints_registry() -> LintRegistry {
#[salsa::input]
#[derive(Debug)]
pub struct Project {
/// The files that are open in the project.
///
/// Setting the open files to a non-`None` value changes `check` to only check the
/// open files rather than all files in the project.
#[returns(as_deref)]
/// The files that are open in the project, [`None`] if there are no open files.
#[returns(ref)]
#[default]
open_fileset: Option<Arc<FxHashSet<File>>>,
open_fileset: FxHashSet<File>,
/// The first-party files of this project.
#[default]
@@ -110,6 +109,13 @@ pub struct Project {
/// Diagnostics that were generated when resolving the project settings.
#[returns(deref)]
settings_diagnostics: Vec<OptionDiagnostic>,
/// The mode in which the project should be checked.
///
/// This changes the behavior of `check` to either check only the open files or all files in
the project, including the virtual files that might exist in the editor.
#[default]
check_mode: CheckMode,
}
/// A progress reporter.
@@ -135,6 +141,13 @@ impl Project {
pub fn from_metadata(db: &dyn Db, metadata: ProjectMetadata) -> Result<Self, ToSettingsError> {
let (settings, diagnostics) = metadata.options().to_settings(db, metadata.root())?;
// This adds a file root for the project itself. This enables
// tracking of when changes are made to the files in a project
// at the directory level. At time of writing (2025-07-17),
// this is used for caching completions for submodules.
db.files()
.try_add_root(db, metadata.root(), FileRootKind::Project);
let project = Project::builder(Box::new(metadata), Box::new(settings), diagnostics)
.durability(Durability::MEDIUM)
.open_fileset_durability(Durability::LOW)
@@ -207,17 +220,20 @@ impl Project {
self.reload_files(db);
}
/// Checks all open files in the project and its dependencies.
/// Checks the project and its dependencies according to the project's check mode.
pub(crate) fn check(
self,
db: &ProjectDatabase,
mode: CheckMode,
mut reporter: AssertUnwindSafe<&mut dyn ProgressReporter>,
) -> Vec<Diagnostic> {
let project_span = tracing::debug_span!("Project::check");
let _span = project_span.enter();
tracing::debug!("Checking project '{name}'", name = self.name(db));
tracing::debug!(
"Checking {} in project '{name}'",
self.check_mode(db),
name = self.name(db)
);
let mut diagnostics: Vec<Diagnostic> = Vec::new();
diagnostics.extend(
@@ -226,11 +242,7 @@ impl Project {
.map(OptionDiagnostic::to_diagnostic),
);
let files = match mode {
CheckMode::OpenFiles => ProjectFiles::new(db, self),
// TODO: Consider open virtual files as well
CheckMode::AllFiles => ProjectFiles::Indexed(self.files(db)),
};
let files = ProjectFiles::new(db, self);
reporter.set_files(files.len());
diagnostics.extend(
@@ -284,7 +296,7 @@ impl Project {
}
pub(crate) fn check_file(self, db: &dyn Db, file: File) -> Vec<Diagnostic> {
if !self.is_file_open(db, file) {
if !self.should_check_file(db, file) {
return Vec::new();
}
@@ -292,8 +304,6 @@ impl Project {
}
/// Opens a file in the project.
///
/// This changes the behavior of `check` to only check the open files rather than all files in the project.
pub fn open_file(self, db: &mut dyn Db, file: File) {
tracing::debug!("Opening file `{}`", file.path(db));
@@ -340,45 +350,40 @@ impl Project {
}
}
/// Returns the open files in the project or `None` if the entire project should be checked.
pub fn open_files(self, db: &dyn Db) -> Option<&FxHashSet<File>> {
/// Returns the open files in the project or `None` if there are no open files.
pub fn open_files(self, db: &dyn Db) -> &FxHashSet<File> {
self.open_fileset(db)
}
/// Sets the open files in the project.
///
/// This changes the behavior of `check` to only check the open files rather than all files in the project.
#[tracing::instrument(level = "debug", skip(self, db))]
pub fn set_open_files(self, db: &mut dyn Db, open_files: FxHashSet<File>) {
tracing::debug!("Set open project files (count: {})", open_files.len());
self.set_open_fileset(db).to(Some(Arc::new(open_files)));
self.set_open_fileset(db).to(open_files);
}
/// This takes the open files from the project and returns them.
///
/// This changes the behavior of `check` to check all files in the project instead of just the open files.
fn take_open_files(self, db: &mut dyn Db) -> FxHashSet<File> {
tracing::debug!("Take open project files");
// Salsa will cancel any pending queries and remove its own reference to `open_files`
// so that the reference counter to `open_files` now drops to 1.
let open_files = self.set_open_fileset(db).to(None);
if let Some(open_files) = open_files {
Arc::try_unwrap(open_files).unwrap()
} else {
FxHashSet::default()
}
self.set_open_fileset(db).to(FxHashSet::default())
}
/// Returns `true` if the file is open in the project.
/// Returns `true` if the file should be checked.
///
/// A file is considered open when:
/// * explicitly set as an open file using [`open_file`](Self::open_file)
/// * It has a [`SystemPath`] and belongs to a package's `src` files
/// * It has a [`SystemVirtualPath`](ruff_db::system::SystemVirtualPath)
pub fn is_file_open(self, db: &dyn Db, file: File) -> bool {
/// This depends on the project's check mode:
/// * For [`OpenFiles`], it checks if the file is either explicitly set as an open file using
/// [`open_file`] or a system virtual path
/// * For [`AllFiles`], it checks if the file is either a system virtual path or a part of the
/// indexed files in the project
///
/// [`open_file`]: Self::open_file
/// [`OpenFiles`]: CheckMode::OpenFiles
/// [`AllFiles`]: CheckMode::AllFiles
pub fn should_check_file(self, db: &dyn Db, file: File) -> bool {
let path = file.path(db);
// Try to return early to avoid adding a dependency on `open_files` or `file_set` which
@@ -387,12 +392,12 @@ impl Project {
return false;
}
if let Some(open_files) = self.open_files(db) {
open_files.contains(&file)
} else if file.path(db).is_system_path() {
self.files(db).contains(&file)
} else {
file.path(db).is_system_virtual_path()
match self.check_mode(db) {
CheckMode::OpenFiles => self.open_files(db).contains(&file),
CheckMode::AllFiles => {
// Virtual files are always checked.
path.is_system_virtual_path() || self.files(db).contains(&file)
}
}
}
@@ -524,11 +529,7 @@ pub(crate) fn check_file_impl(db: &dyn Db, file: File) -> Box<[Diagnostic]> {
}
}
if db
.project()
.open_fileset(db)
.is_none_or(|files| !files.contains(&file))
{
if !db.project().open_fileset(db).contains(&file) {
// Drop the AST now that we are done checking this file. It is not currently open,
// so it is unlikely to be accessed again soon. If any queries need to access the AST
// from across files, it will be re-parsed.
@@ -554,24 +555,23 @@ enum ProjectFiles<'a> {
impl<'a> ProjectFiles<'a> {
fn new(db: &'a dyn Db, project: Project) -> Self {
if let Some(open_files) = project.open_files(db) {
ProjectFiles::OpenFiles(open_files)
} else {
ProjectFiles::Indexed(project.files(db))
match project.check_mode(db) {
CheckMode::OpenFiles => ProjectFiles::OpenFiles(project.open_files(db)),
CheckMode::AllFiles => ProjectFiles::Indexed(project.files(db)),
}
}
fn diagnostics(&self) -> &[IOErrorDiagnostic] {
match self {
ProjectFiles::OpenFiles(_) => &[],
ProjectFiles::Indexed(indexed) => indexed.diagnostics(),
ProjectFiles::Indexed(files) => files.diagnostics(),
}
}
fn len(&self) -> usize {
match self {
ProjectFiles::OpenFiles(open_files) => open_files.len(),
ProjectFiles::Indexed(indexed) => indexed.len(),
ProjectFiles::Indexed(files) => files.len(),
}
}
}
@@ -583,16 +583,14 @@ impl<'a> IntoIterator for &'a ProjectFiles<'a> {
fn into_iter(self) -> Self::IntoIter {
match self {
ProjectFiles::OpenFiles(files) => ProjectFilesIter::OpenFiles(files.iter()),
ProjectFiles::Indexed(indexed) => ProjectFilesIter::Indexed {
files: indexed.into_iter(),
},
ProjectFiles::Indexed(files) => ProjectFilesIter::Indexed(files.into_iter()),
}
}
}
enum ProjectFilesIter<'db> {
OpenFiles(std::collections::hash_set::Iter<'db, File>),
Indexed { files: files::IndexedIter<'db> },
OpenFiles(hash_set::Iter<'db, File>),
Indexed(files::IndexedIter<'db>),
}
impl Iterator for ProjectFilesIter<'_> {
@@ -601,11 +599,13 @@ impl Iterator for ProjectFilesIter<'_> {
fn next(&mut self) -> Option<Self::Item> {
match self {
ProjectFilesIter::OpenFiles(files) => files.next().copied(),
ProjectFilesIter::Indexed { files } => files.next(),
ProjectFilesIter::Indexed(files) => files.next(),
}
}
}
impl FusedIterator for ProjectFilesIter<'_> {}
#[derive(Debug, Clone)]
pub struct IOErrorDiagnostic {
file: Option<File>,

View File

@@ -372,11 +372,11 @@ mod tests {
with_escaped_paths(|| {
assert_ron_snapshot!(&project, @r#"
ProjectMetadata(
name: Name("app"),
root: "/app",
options: Options(),
)
ProjectMetadata(
name: Name("app"),
root: "/app",
options: Options(),
)
"#);
});
@@ -410,11 +410,11 @@ mod tests {
with_escaped_paths(|| {
assert_ron_snapshot!(&project, @r#"
ProjectMetadata(
name: Name("backend"),
root: "/app",
options: Options(),
)
ProjectMetadata(
name: Name("backend"),
root: "/app",
options: Options(),
)
"#);
});
@@ -552,16 +552,16 @@ unclosed table, expected `]`
with_escaped_paths(|| {
assert_ron_snapshot!(root, @r#"
ProjectMetadata(
name: Name("project-root"),
root: "/app",
options: Options(
src: Some(SrcOptions(
root: Some("src"),
)),
),
)
"#);
ProjectMetadata(
name: Name("project-root"),
root: "/app",
options: Options(
src: Some(SrcOptions(
root: Some("src"),
)),
),
)
"#);
});
Ok(())

View File

@@ -1806,7 +1806,7 @@ class Frozen:
raise AttributeError("Attributes can not be modified")
instance = Frozen()
instance.non_existing = 2 # error: [invalid-assignment] "Cannot assign to attribute `non_existing` on type `Frozen` whose `__setattr__` method returns `Never`/`NoReturn`"
instance.non_existing = 2 # error: [invalid-assignment] "Can not assign to unresolved attribute `non_existing` on type `Frozen`"
instance.existing = 2 # error: [invalid-assignment] "Cannot assign to attribute `existing` on type `Frozen` whose `__setattr__` method returns `Never`/`NoReturn`"
```

View File

@@ -415,8 +415,7 @@ frozen_instance = MyFrozenGeneric[int](1)
frozen_instance.x = 2 # error: [invalid-assignment]
```
When attempting to mutate an unresolved attribute on a frozen dataclass, only `unresolved-attribute`
is emitted:
Attempting to mutate an unresolved attribute on a frozen dataclass:
```py
from dataclasses import dataclass
@@ -425,7 +424,39 @@ from dataclasses import dataclass
class MyFrozenClass: ...
frozen = MyFrozenClass()
frozen.x = 2 # error: [unresolved-attribute]
frozen.x = 2 # error: [invalid-assignment] "Can not assign to unresolved attribute `x` on type `MyFrozenClass`"
```
A diagnostic is also emitted if a frozen dataclass is inherited, and an attempt is made to mutate an
attribute in the child class:
```py
from dataclasses import dataclass
@dataclass(frozen=True)
class MyFrozenClass:
x: int = 1
class MyFrozenChildClass(MyFrozenClass): ...
frozen = MyFrozenChildClass()
frozen.x = 2 # error: [invalid-assignment]
```
The same diagnostic is emitted if a frozen dataclass is inherited, and an attempt is made to delete
an attribute:
```py
from dataclasses import dataclass
@dataclass(frozen=True)
class MyFrozenClass:
x: int = 1
class MyFrozenChildClass(MyFrozenClass): ...
frozen = MyFrozenChildClass()
del frozen.x # TODO this should emit an [invalid-assignment]
```
### `match_args`

View File

@@ -40,22 +40,92 @@ else:
# error: [possibly-unresolved-reference]
reveal_type(c) # revealed: Literal[2]
d = 1
d = [1, 2, 3]
def delete():
# TODO: this results in `UnboundLocalError`; we should emit `unresolved-reference`
del d
del d # error: [unresolved-reference] "Name `d` used when not defined"
delete()
reveal_type(d) # revealed: Literal[1]
reveal_type(d) # revealed: list[Unknown]
def delete_element():
# When the `del` target isn't a name, it doesn't force local resolution.
del d[0]
print(d)
def delete_global():
global d
del d
# We could lint that `d` is unbound in this trivial case, but because it's global we'd need to
# be careful about false positives if `d` got reinitialized somehow in between the two `del`s.
del d
delete_global()
# The variable should have been removed, but we won't check it for now.
reveal_type(d) # revealed: Literal[1]
# Again, the variable should have been removed, but we don't check it.
reveal_type(d) # revealed: list[Unknown]
def delete_nonlocal():
e = 2
def delete_nonlocal_bad():
del e # error: [unresolved-reference] "Name `e` used when not defined"
def delete_nonlocal_ok():
nonlocal e
del e
# As with `global` above, we don't track that the nonlocal `e` is unbound.
del e
```
## `del` forces local resolution even if it's unreachable
Without a `global x` or `nonlocal x` declaration in `foo`, `del x` in `foo` causes `print(x)` in an
inner function `bar` to resolve to `foo`'s binding, in this case an unresolved reference / unbound
local error:
```py
x = 1
def foo():
print(x) # error: [unresolved-reference] "Name `x` used when not defined"
if False:
# Assigning to `x` would have the same effect here.
del x
def bar():
print(x) # error: [unresolved-reference] "Name `x` used when not defined"
```
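CPython's runtime behaves the same way; a minimal sketch (names are illustrative) showing that even an unreachable `del` makes the name local and the earlier read fails:

```python
x = 1

def foo():
    try:
        # UnboundLocalError: the (unreachable) `del x` below makes `x`
        # local to `foo`, so this read does not fall back to the global.
        return x
    except UnboundLocalError:
        return None
    if False:
        del x
```

Calling `foo()` returns `None` because the read raises before the global `x = 1` is ever consulted.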
## But `del` doesn't force local resolution of `global` or `nonlocal` variables
However, with `global x` in `foo`, `print(x)` in `bar` resolves in the global scope, despite the
`del` in `foo`:
```py
x = 1
def foo():
global x
def bar():
# allowed, refers to `x` in the global scope
reveal_type(x) # revealed: Unknown | Literal[1]
bar()
del x # allowed, deletes `x` in the global scope (though we don't track that)
```
`nonlocal x` has a similar effect, if we add an extra `enclosing` scope to give it something to
refer to:
```py
def enclosing():
x = 2
def foo():
nonlocal x
def bar():
# allowed, refers to `x` in `enclosing`
reveal_type(x) # revealed: Unknown | Literal[2]
bar()
del x # allowed, deletes `x` in `enclosing` (though we don't track that)
```
## Delete attributes

View File

@@ -0,0 +1,355 @@
# Tests for the `@deprecated` decorator
## Introduction
<!-- snapshot-diagnostics -->
The decorator `@deprecated("some message")` can be applied to functions, methods, overloads, and
classes. Uses of these items should subsequently produce a warning.
```py
from typing_extensions import deprecated
@deprecated("use OtherClass")
def myfunc(): ...
myfunc() # error: [deprecated] "use OtherClass"
```
```py
from typing_extensions import deprecated
@deprecated("use BetterClass")
class MyClass: ...
MyClass() # error: [deprecated] "use BetterClass"
```
```py
from typing_extensions import deprecated
class MyClass:
@deprecated("use something else")
def afunc(): ...
@deprecated("don't use this!")
def amethod(self): ...
MyClass.afunc() # error: [deprecated] "use something else"
MyClass().amethod() # error: [deprecated] "don't use this!"
```
## Syntax
<!-- snapshot-diagnostics -->
The typeshed declaration of the decorator is as follows:
```ignore
class deprecated:
message: LiteralString
category: type[Warning] | None
stacklevel: int
def __init__(self, message: LiteralString, /, *, category: type[Warning] | None = ..., stacklevel: int = 1) -> None: ...
def __call__(self, arg: _T, /) -> _T: ...
```
Only the mandatory message string is of interest to static analysis; the other two affect only
runtime behaviour.
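The runtime side of that shape can be sketched with a minimal stand-in decorator (functions only; the class name and `__deprecated__` attribute mirror the typeshed declaration above, everything else is illustrative):

```python
import functools
import warnings

class deprecated:
    """Minimal sketch of the typeshed shape above (functions only)."""

    def __init__(self, message, /, *, category=DeprecationWarning, stacklevel=1):
        self.message = message
        self.category = category
        self.stacklevel = stacklevel

    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Warn on every call, as the real decorator does at runtime.
            warnings.warn(self.message, self.category, stacklevel=self.stacklevel + 1)
            return func(*args, **kwargs)

        wrapper.__deprecated__ = self.message
        return wrapper

@deprecated("use new_func instead")
def old_func():
    return 42
```

Static analysis only needs the message; the `category` and `stacklevel` machinery exists purely for the runtime warning shown here.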
```py
from typing_extensions import deprecated
@deprecated # error: [invalid-argument-type] "LiteralString"
def invalid_deco(): ...
invalid_deco() # error: [missing-argument]
```
```py
from typing_extensions import deprecated
@deprecated() # error: [missing-argument] "message"
def invalid_deco(): ...
invalid_deco()
```
The argument is supposed to be a `LiteralString`, and we can handle simple constant propagation like
this:
```py
from typing_extensions import deprecated
x = "message"
@deprecated(x)
def invalid_deco(): ...
invalid_deco() # error: [deprecated] "message"
```
However, we can't resolve sufficiently opaque `LiteralString`s, so we lose the message:
```py
from typing_extensions import deprecated, LiteralString
def opaque() -> LiteralString:
return "message"
@deprecated(opaque())
def valid_deco(): ...
valid_deco() # error: [deprecated]
```
Fully dynamic strings are technically allowed at runtime, but typeshed mandates that the input is a
`LiteralString`, so we can and should emit a diagnostic for this:
```py
from typing_extensions import deprecated
def opaque() -> str:
return "message"
@deprecated(opaque()) # error: [invalid-argument-type] "LiteralString"
def dubious_deco(): ...
dubious_deco()
```
Although we have no use for the other arguments, we should still error if they're wrong.
```py
from typing_extensions import deprecated
@deprecated("some message", dsfsdf="whatever") # error: [unknown-argument] "dsfsdf"
def invalid_deco(): ...
invalid_deco()
```
And correct arguments should always be handled without issue.
```py
from typing_extensions import deprecated
@deprecated("some message", category=DeprecationWarning, stacklevel=1)
def valid_deco(): ...
valid_deco() # error: [deprecated] "some message"
```
## Different Versions
There are two different sources of `@deprecated`: `warnings` and `typing_extensions`. The version in
`warnings` was added in Python 3.13; the version in `typing_extensions` is a compatibility shim.
```toml
[environment]
python-version = "3.13"
```
`main.py`:
```py
import warnings
import typing_extensions
@warnings.deprecated("nope")
def func1(): ...
@typing_extensions.deprecated("nada")
def func2(): ...
func1() # error: [deprecated] "nope"
func2() # error: [deprecated] "nada"
```
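Code that needs to run on versions older than 3.13 typically selects between the two sources with an import fallback; a hedged sketch (the `typing_extensions` shim may or may not be installed, so the fallback is guarded):

```python
# Prefer the stdlib decorator on Python 3.13+, fall back to the
# typing_extensions compatibility shim on older versions.
try:
    from warnings import deprecated  # added in Python 3.13
except ImportError:
    try:
        from typing_extensions import deprecated  # compatibility shim
    except ImportError:
        # Shim not installed on an older interpreter; callers must cope.
        deprecated = None
```

Either import path yields the same decorator semantics, which is why both sources should produce the same `[deprecated]` diagnostics above.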
## Imports
### Direct Import Deprecated
Importing a deprecated item should produce a warning. Subsequent uses of the deprecated item
shouldn't produce a warning.
`module.py`:
```py
from typing_extensions import deprecated
@deprecated("Use OtherType instead")
class DeprType: ...
@deprecated("Use other_func instead")
def depr_func(): ...
```
`main.py`:
```py
# error: [deprecated] "Use OtherType instead"
# error: [deprecated] "Use other_func instead"
from module import DeprType, depr_func
# TODO: these diagnostics ideally shouldn't fire since we warn on the import
DeprType() # error: [deprecated] "Use OtherType instead"
depr_func() # error: [deprecated] "Use other_func instead"
def higher_order(x): ...
# TODO: these diagnostics ideally shouldn't fire since we warn on the import
higher_order(DeprType) # error: [deprecated] "Use OtherType instead"
higher_order(depr_func) # error: [deprecated] "Use other_func instead"
# TODO: these diagnostics ideally shouldn't fire since we warn on the import
DeprType.__str__ # error: [deprecated] "Use OtherType instead"
depr_func.__str__ # error: [deprecated] "Use other_func instead"
```
### Non-Import Deprecated
If the items aren't imported and instead referenced using `module.item` then each use should produce
a warning.
`module.py`:
```py
from typing_extensions import deprecated
@deprecated("Use OtherType instead")
class DeprType: ...
@deprecated("Use other_func instead")
def depr_func(): ...
```
`main.py`:
```py
import module
module.DeprType() # error: [deprecated] "Use OtherType instead"
module.depr_func() # error: [deprecated] "Use other_func instead"
def higher_order(x): ...
higher_order(module.DeprType) # error: [deprecated] "Use OtherType instead"
higher_order(module.depr_func) # error: [deprecated] "Use other_func instead"
module.DeprType.__str__ # error: [deprecated] "Use OtherType instead"
module.depr_func.__str__ # error: [deprecated] "Use other_func instead"
```
### Star Import Deprecated
If the items are instead star-imported, then the actual uses should warn.
`module.py`:
```py
from typing_extensions import deprecated
@deprecated("Use OtherType instead")
class DeprType: ...
@deprecated("Use other_func instead")
def depr_func(): ...
```
`main.py`:
```py
from module import *
DeprType() # error: [deprecated] "Use OtherType instead"
depr_func() # error: [deprecated] "Use other_func instead"
def higher_order(x): ...
higher_order(DeprType) # error: [deprecated] "Use OtherType instead"
higher_order(depr_func) # error: [deprecated] "Use other_func instead"
DeprType.__str__ # error: [deprecated] "Use OtherType instead"
depr_func.__str__ # error: [deprecated] "Use other_func instead"
```
## Aliases
Ideally a deprecated warning shouldn't transitively follow assignments, as you already had to "name"
the deprecated symbol to assign it to something else. These kinds of diagnostics would therefore be
redundant and annoying.
```py
from typing_extensions import deprecated
@deprecated("Use OtherType instead")
class DeprType: ...
@deprecated("Use other_func instead")
def depr_func(): ...
alias_func = depr_func # error: [deprecated] "Use other_func instead"
AliasClass = DeprType # error: [deprecated] "Use OtherType instead"
# TODO: these diagnostics ideally shouldn't fire
alias_func() # error: [deprecated] "Use other_func instead"
AliasClass() # error: [deprecated] "Use OtherType instead"
```
## Dunders
If a dunder like `__add__` is deprecated, then the equivalent syntactic sugar like `+` should fire a
diagnostic.
```py
from typing_extensions import deprecated
class MyInt:
def __init__(self, val):
self.val = val
@deprecated("MyInt `+` support is broken")
def __add__(self, other):
return MyInt(self.val + other.val)
x = MyInt(1)
y = MyInt(2)
z = x + y # TODO error: [deprecated] "MyInt `+` support is broken"
```
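A plain-Python illustration of why the desugaring matters (the warning call inside the dunder is a stand-in for what `@deprecated` arranges at runtime; `MyInt` is the hypothetical class from above):

```python
import warnings

class MyInt:
    def __init__(self, val):
        self.val = val

    def __add__(self, other):
        # `x + y` desugars to `x.__add__(y)`, so a deprecation warning
        # raised here fires on the plain `+` operator.
        warnings.warn("MyInt `+` support is broken", DeprecationWarning, stacklevel=2)
        return MyInt(self.val + other.val)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    z = MyInt(1) + MyInt(2)
```

Because the user never spells out `__add__`, a checker has to map the operator back to the dunder to surface the diagnostic at the `+` expression.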
## Overloads
Overloads can be deprecated, but only trigger warnings when invoked.
```py
from typing_extensions import deprecated
from typing_extensions import overload
@overload
@deprecated("strings are no longer supported")
def f(x: str): ...
@overload
def f(x: int): ...
def f(x):
print(x)
f(1)
f("hello") # TODO: error: [deprecated] "strings are no longer supported"
```
If the actual impl is deprecated, the deprecation always fires.
```py
from typing_extensions import deprecated
from typing_extensions import overload
@overload
def f(x: str): ...
@overload
def f(x: int): ...
@deprecated("unusable")
def f(x):
print(x)
f(1) # error: [deprecated] "unusable"
f("hello") # error: [deprecated] "unusable"
```

View File

@@ -84,6 +84,52 @@ def f():
x = "hello" # error: [invalid-assignment] "Object of type `Literal["hello"]` is not assignable to `int`"
```
## The types of `nonlocal` binding get unioned
Without a type declaration, we union the bindings in enclosing scopes to infer a type. But name
resolution stops at the closest binding that isn't declared `nonlocal`, and we ignore bindings
outside of that one:
```py
def a():
# This binding is shadowed in `b`, so we ignore it in inner scopes.
x = 1
def b():
x = 2
def c():
nonlocal x
x = 3
def d():
nonlocal x
reveal_type(x) # revealed: Unknown | Literal[3, 2]
x = 4
reveal_type(x) # revealed: Literal[4]
def e():
reveal_type(x) # revealed: Unknown | Literal[4, 3, 2]
```
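The runtime counterpart of this union: a `nonlocal` write in an inner scope rebinds the enclosing variable, so a reader in a sibling scope can observe any of the possible bindings (a minimal sketch with illustrative names):

```python
def a():
    x = 1

    def b():
        nonlocal x
        x = 2  # rebinds `a`'s `x`

    def c():
        return x  # sees 1 or 2 depending on whether `b` has run

    before = c()
    b()
    after = c()
    return before, after

result = a()
```

This is why inference has to union the bindings from every scope that can write through the `nonlocal` declaration.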
However, currently the union of types that we build is incomplete. We walk parent scopes, but not
sibling scopes, child scopes, second-cousin-once-removed scopes, etc:
```py
def a():
x = 1
def b():
nonlocal x
x = 2
def c():
def d():
nonlocal x
x = 3
# TODO: This should include 2 and 3.
reveal_type(x) # revealed: Unknown | Literal[1]
```
## Local variable bindings "look ahead" to any assignment in the current scope
The binding `x = 2` in `g` causes the earlier read of `x` to refer to `g`'s not-yet-initialized
@@ -390,3 +436,13 @@ def f():
nonlocal x
x = 1
```
## Narrowing nonlocal types to `Never` doesn't make them unbound
```py
def foo():
x: int = 1
def bar():
if isinstance(x, str):
reveal_type(x) # revealed: Never
```

View File

@@ -0,0 +1,93 @@
---
source: crates/ty_test/src/lib.rs
expression: snapshot
---
---
mdtest name: deprecated.md - Tests for the `@deprecated` decorator - Introduction
mdtest path: crates/ty_python_semantic/resources/mdtest/deprecated.md
---
# Python source files
## mdtest_snippet.py
```
1 | from typing_extensions import deprecated
2 |
3 | @deprecated("use OtherClass")
4 | def myfunc(): ...
5 |
6 | myfunc() # error: [deprecated] "use OtherClass"
7 | from typing_extensions import deprecated
8 |
9 | @deprecated("use BetterClass")
10 | class MyClass: ...
11 |
12 | MyClass() # error: [deprecated] "use BetterClass"
13 | from typing_extensions import deprecated
14 |
15 | class MyClass:
16 | @deprecated("use something else")
17 | def afunc(): ...
18 | @deprecated("don't use this!")
19 | def amethod(self): ...
20 |
21 | MyClass.afunc() # error: [deprecated] "use something else"
22 | MyClass().amethod() # error: [deprecated] "don't use this!"
```
# Diagnostics
```
warning[deprecated]: The function `myfunc` is deprecated
--> src/mdtest_snippet.py:6:1
|
4 | def myfunc(): ...
5 |
6 | myfunc() # error: [deprecated] "use OtherClass"
| ^^^^^^ use OtherClass
7 | from typing_extensions import deprecated
|
info: rule `deprecated` is enabled by default
```
```
warning[deprecated]: The class `MyClass` is deprecated
--> src/mdtest_snippet.py:12:1
|
10 | class MyClass: ...
11 |
12 | MyClass() # error: [deprecated] "use BetterClass"
| ^^^^^^^ use BetterClass
13 | from typing_extensions import deprecated
|
info: rule `deprecated` is enabled by default
```
```
warning[deprecated]: The function `afunc` is deprecated
--> src/mdtest_snippet.py:21:9
|
19 | def amethod(self): ...
20 |
21 | MyClass.afunc() # error: [deprecated] "use something else"
| ^^^^^ use something else
22 | MyClass().amethod() # error: [deprecated] "don't use this!"
|
info: rule `deprecated` is enabled by default
```
```
warning[deprecated]: The function `amethod` is deprecated
--> src/mdtest_snippet.py:22:11
|
21 | MyClass.afunc() # error: [deprecated] "use something else"
22 | MyClass().amethod() # error: [deprecated] "don't use this!"
| ^^^^^^^ don't use this!
|
info: rule `deprecated` is enabled by default
```

View File

@@ -0,0 +1,178 @@
---
source: crates/ty_test/src/lib.rs
expression: snapshot
---
---
mdtest name: deprecated.md - Tests for the `@deprecated` decorator - Syntax
mdtest path: crates/ty_python_semantic/resources/mdtest/deprecated.md
---
# Python source files
## mdtest_snippet.py
```
1 | from typing_extensions import deprecated
2 |
3 | @deprecated # error: [invalid-argument-type] "LiteralString"
4 | def invalid_deco(): ...
5 |
6 | invalid_deco() # error: [missing-argument]
7 | from typing_extensions import deprecated
8 |
9 | @deprecated() # error: [missing-argument] "message"
10 | def invalid_deco(): ...
11 |
12 | invalid_deco()
13 | from typing_extensions import deprecated
14 |
15 | x = "message"
16 |
17 | @deprecated(x)
18 | def invalid_deco(): ...
19 |
20 | invalid_deco() # error: [deprecated] "message"
21 | from typing_extensions import deprecated, LiteralString
22 |
23 | def opaque() -> LiteralString:
24 | return "message"
25 |
26 | @deprecated(opaque())
27 | def valid_deco(): ...
28 |
29 | valid_deco() # error: [deprecated]
30 | from typing_extensions import deprecated
31 |
32 | def opaque() -> str:
33 | return "message"
34 |
35 | @deprecated(opaque()) # error: [invalid-argument-type] "LiteralString"
36 | def dubious_deco(): ...
37 |
38 | dubious_deco()
39 | from typing_extensions import deprecated
40 |
41 | @deprecated("some message", dsfsdf="whatever") # error: [unknown-argument] "dsfsdf"
42 | def invalid_deco(): ...
43 |
44 | invalid_deco()
45 | from typing_extensions import deprecated
46 |
47 | @deprecated("some message", category=DeprecationWarning, stacklevel=1)
48 | def valid_deco(): ...
49 |
50 | valid_deco() # error: [deprecated] "some message"
```
# Diagnostics
```
error[invalid-argument-type]: Argument to class `deprecated` is incorrect
--> src/mdtest_snippet.py:3:1
|
1 | from typing_extensions import deprecated
2 |
3 | @deprecated # error: [invalid-argument-type] "LiteralString"
| ^^^^^^^^^^^ Expected `LiteralString`, found `def invalid_deco() -> Unknown`
4 | def invalid_deco(): ...
|
info: rule `invalid-argument-type` is enabled by default
```
```
error[missing-argument]: No argument provided for required parameter `arg` of bound method `__call__`
--> src/mdtest_snippet.py:6:1
|
4 | def invalid_deco(): ...
5 |
6 | invalid_deco() # error: [missing-argument]
| ^^^^^^^^^^^^^^
7 | from typing_extensions import deprecated
|
info: rule `missing-argument` is enabled by default
```
```
error[missing-argument]: No argument provided for required parameter `message` of class `deprecated`
--> src/mdtest_snippet.py:9:2
|
7 | from typing_extensions import deprecated
8 |
9 | @deprecated() # error: [missing-argument] "message"
| ^^^^^^^^^^^^
10 | def invalid_deco(): ...
|
info: rule `missing-argument` is enabled by default
```
```
warning[deprecated]: The function `invalid_deco` is deprecated
--> src/mdtest_snippet.py:20:1
|
18 | def invalid_deco(): ...
19 |
20 | invalid_deco() # error: [deprecated] "message"
| ^^^^^^^^^^^^ message
21 | from typing_extensions import deprecated, LiteralString
|
info: rule `deprecated` is enabled by default
```
```
warning[deprecated]: The function `valid_deco` is deprecated
--> src/mdtest_snippet.py:29:1
|
27 | def valid_deco(): ...
28 |
29 | valid_deco() # error: [deprecated]
| ^^^^^^^^^^
30 | from typing_extensions import deprecated
|
info: rule `deprecated` is enabled by default
```
```
error[invalid-argument-type]: Argument to class `deprecated` is incorrect
--> src/mdtest_snippet.py:35:13
|
33 | return "message"
34 |
35 | @deprecated(opaque()) # error: [invalid-argument-type] "LiteralString"
| ^^^^^^^^ Expected `LiteralString`, found `str`
36 | def dubious_deco(): ...
|
info: rule `invalid-argument-type` is enabled by default
```
```
error[unknown-argument]: Argument `dsfsdf` does not match any known parameter of class `deprecated`
--> src/mdtest_snippet.py:41:29
|
39 | from typing_extensions import deprecated
40 |
41 | @deprecated("some message", dsfsdf="whatever") # error: [unknown-argument] "dsfsdf"
| ^^^^^^^^^^^^^^^^^
42 | def invalid_deco(): ...
|
info: rule `unknown-argument` is enabled by default
```
```
warning[deprecated]: The function `valid_deco` is deprecated
--> src/mdtest_snippet.py:50:1
|
48 | def valid_deco(): ...
49 |
50 | valid_deco() # error: [deprecated] "some message"
| ^^^^^^^^^^ some message
|
info: rule `deprecated` is enabled by default
```

View File

@@ -36,7 +36,7 @@ error[invalid-syntax]
--> src/mdtest_snippet.py:6:19
|
4 | async def f():
5 | # error: 19 [invalid-syntax] "cannot use an asynchronous comprehension inside of a synchronous comprehension on Python 3.10 (synt...
5 | # error: 19 [invalid-syntax] "cannot use an asynchronous comprehension inside of a synchronous comprehension on Python 3.10 (syntax…
6 | return {n: [x async for x in elements(n)] for n in range(3)}
| ^^^^^^^^^^^^^^^^^^^^^^^^^^ cannot use an asynchronous comprehension inside of a synchronous comprehension on Python 3.10 (syntax was added in 3.11)
7 | async def test():

View File

@@ -41,7 +41,7 @@ mdtest path: crates/ty_python_semantic/resources/mdtest/with/sync.md
error[invalid-context-manager]: Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and `__exit__`
--> src/mdtest_snippet.py:6:6
|
5 | # error: [invalid-context-manager] "Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and...
5 | # error: [invalid-context-manager] "Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and `…
6 | with Manager():
| ^^^^^^^^^
7 | ...
@@ -57,7 +57,7 @@ info: rule `invalid-context-manager` is enabled by default
error[invalid-context-manager]: Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and `__exit__`
--> src/mdtest_snippet.py:13:6
|
12 | # error: [invalid-context-manager] "Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` an...
12 | # error: [invalid-context-manager] "Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and …
13 | with Manager():
| ^^^^^^^^^
14 | ...
@@ -73,7 +73,7 @@ info: rule `invalid-context-manager` is enabled by default
error[invalid-context-manager]: Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and `__exit__`
--> src/mdtest_snippet.py:20:6
|
19 | # error: [invalid-context-manager] "Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` an...
19 | # error: [invalid-context-manager] "Object of type `Manager` cannot be used with `with` because it does not implement `__enter__` and …
20 | with Manager():
| ^^^^^^^^^
21 | ...

View File

@@ -5,7 +5,8 @@ use ruff_db::files::File;
/// Database giving access to semantic information about a Python program.
#[salsa::db]
pub trait Db: SourceDb {
fn is_file_open(&self, file: File) -> bool;
/// Returns `true` if the file should be checked.
fn should_check_file(&self, file: File) -> bool;
/// Resolves the rule selection for a given file.
fn rule_selection(&self, file: File) -> &RuleSelection;
@@ -114,7 +115,7 @@ pub(crate) mod tests {
#[salsa::db]
impl Db for TestDb {
fn is_file_open(&self, file: File) -> bool {
fn should_check_file(&self, file: File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -17,8 +17,8 @@ pub use program::{
pub use python_platform::PythonPlatform;
pub use semantic_model::{Completion, CompletionKind, HasType, NameKind, SemanticModel};
pub use site_packages::{PythonEnvironment, SitePackagesPaths, SysPrefixPathOrigin};
pub use types::definitions_for_name;
pub use types::ide_support::ResolvedDefinition;
pub use types::{definitions_for_attribute, definitions_for_name};
pub use util::diagnostics::add_inferred_python_version_hint_to_diagnostic;
pub mod ast_node_ref;

View File

@@ -17,6 +17,7 @@ pub struct Module {
inner: Arc<ModuleInner>,
}
#[salsa::tracked]
impl Module {
pub(crate) fn file_module(
name: ModuleName,
@@ -97,11 +98,16 @@ impl Module {
///
/// The names returned correspond to the "base" name of the module.
/// That is, `{self.name}.{basename}` should give the full module name.
pub fn all_submodules(&self, db: &dyn Db) -> Vec<Name> {
self.all_submodules_inner(db).unwrap_or_default()
pub fn all_submodules<'db>(&self, db: &'db dyn Db) -> &'db [Name] {
self.clone()
.all_submodules_inner(db, ())
.as_deref()
.unwrap_or_default()
}
fn all_submodules_inner(&self, db: &dyn Db) -> Option<Vec<Name>> {
#[allow(clippy::ref_option, clippy::used_underscore_binding)]
#[salsa::tracked(returns(ref))]
fn all_submodules_inner(self, db: &dyn Db, _dummy: ()) -> Option<Vec<Name>> {
fn is_submodule(
is_dir: bool,
is_file: bool,
@@ -136,32 +142,42 @@ impl Module {
);
Some(match path.parent()? {
SystemOrVendoredPathRef::System(parent_directory) => db
.system()
.read_directory(parent_directory)
.inspect_err(|err| {
tracing::debug!(
"Failed to read {parent_directory:?} when looking for \
its possible submodules: {err}"
);
})
.ok()?
.flatten()
.filter(|entry| {
let ty = entry.file_type();
let path = entry.path();
is_submodule(
ty.is_directory(),
ty.is_file(),
path.file_name(),
path.extension(),
)
})
.filter_map(|entry| {
let stem = entry.path().file_stem()?;
is_identifier(stem).then(|| Name::from(stem))
})
.collect(),
SystemOrVendoredPathRef::System(parent_directory) => {
// Read the revision on the corresponding file root to
// register an explicit dependency on this directory
// tree. When the revision gets bumped, the cache
// that Salsa creates for this routine will be
// invalidated.
if let Some(root) = db.files().root(db, parent_directory) {
let _ = root.revision(db);
}
db.system()
.read_directory(parent_directory)
.inspect_err(|err| {
tracing::debug!(
"Failed to read {parent_directory:?} when looking for \
its possible submodules: {err}"
);
})
.ok()?
.flatten()
.filter(|entry| {
let ty = entry.file_type();
let path = entry.path();
is_submodule(
ty.is_directory(),
ty.is_file(),
path.file_name(),
path.extension(),
)
})
.filter_map(|entry| {
let stem = entry.path().file_stem()?;
is_identifier(stem).then(|| Name::from(stem))
})
.collect()
}
SystemOrVendoredPathRef::Vendored(parent_directory) => db
.vendored()
.read_directory(parent_directory)
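The scan above enumerates candidate submodules from directory entries. A hedged Python sketch of the same filtering (the exact extension set and `__init__` handling here are assumptions, not a transcription of `is_submodule`):

```python
import keyword
import os


def all_submodules(package_dir):
    # A child counts as a submodule if it is a directory, or a file with
    # a .py/.pyi extension, and its stem is a valid Python identifier.
    # `__init__` names the package itself, so it is skipped here.
    names = set()
    for entry in os.scandir(package_dir):
        if entry.is_dir():
            stem = entry.name
        else:
            stem, ext = os.path.splitext(entry.name)
            if ext not in (".py", ".pyi"):
                continue
        if stem != "__init__" and stem.isidentifier() and not keyword.iskeyword(stem):
            names.add(stem)
    return sorted(names)
```

The revision read in the Rust hunk has no Python analogue; it exists purely so Salsa re-runs this scan when the directory tree changes.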
@@ -259,6 +275,7 @@ pub enum KnownModule {
UnittestMock,
#[cfg(test)]
Uuid,
Warnings,
}
impl KnownModule {
@@ -278,6 +295,7 @@ impl KnownModule {
Self::TypeCheckerInternals => "_typeshed._type_checker_internals",
Self::TyExtensions => "ty_extensions",
Self::ImportLib => "importlib",
Self::Warnings => "warnings",
#[cfg(test)]
Self::UnittestMock => "unittest.mock",
#[cfg(test)]

View File

@@ -1994,8 +1994,26 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
walk_stmt(self, stmt);
for target in targets {
if let Ok(target) = PlaceExpr::try_from(target) {
let is_name = target.is_name();
let place_id = self.add_place(PlaceExprWithFlags::new(target));
self.current_place_table_mut().mark_place_used(place_id);
let place_table = self.current_place_table_mut();
if is_name {
// `del x` behaves like an assignment in that it forces all references
// to `x` in the current scope (including *prior* references) to refer
// to the current scope's binding (unless `x` is declared `global` or
// `nonlocal`). For example, this is an UnboundLocalError at runtime:
//
// ```py
// x = 1
// def foo():
// print(x) # can't refer to global `x`
// if False:
// del x
// foo()
// ```
place_table.mark_place_bound(place_id);
}
place_table.mark_place_used(place_id);
self.delete_binding(place_id);
}
}
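The comment above describes the rule this hunk implements; a runnable illustration of the runtime behavior it mirrors — a `del` anywhere in a function body makes the name local for the whole scope, so an earlier read fails even though the `del` never executes:

```python
x = 1


def foo():
    # `del x` below makes `x` local to `foo` for the *entire* scope,
    # even though the branch never runs, so this read raises
    # UnboundLocalError instead of finding the global `x`.
    try:
        print(x)
    except UnboundLocalError:
        return "unbound"
    if False:
        del x
    return "bound"


assert foo() == "unbound"
```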
@@ -2522,7 +2540,7 @@ impl SemanticSyntaxContext for SemanticIndexBuilder<'_, '_> {
}
fn report_semantic_error(&self, error: SemanticSyntaxError) {
if self.db.is_file_open(self.file) {
if self.db.should_check_file(self.file) {
self.semantic_syntax_errors.borrow_mut().push(error);
}
}

View File

@@ -107,7 +107,7 @@ pub struct Definitions<'db> {
impl<'db> Definitions<'db> {
pub(crate) fn single(definition: Definition<'db>) -> Self {
Self {
definitions: smallvec::smallvec![definition],
definitions: smallvec::smallvec_inline![definition],
}
}

View File

@@ -10,7 +10,7 @@ use ruff_index::{IndexVec, newtype_index};
use ruff_python_ast as ast;
use ruff_python_ast::name::Name;
use rustc_hash::FxHasher;
use smallvec::{SmallVec, smallvec};
use smallvec::SmallVec;
use crate::Db;
use crate::ast_node_ref::AstNodeRef;
@@ -162,10 +162,10 @@ impl TryFrom<ast::ExprRef<'_>> for PlaceExpr {
}
impl PlaceExpr {
pub(crate) fn name(name: Name) -> Self {
pub(crate) const fn name(name: Name) -> Self {
Self {
root_name: name,
sub_segments: smallvec![],
sub_segments: SmallVec::new_const(),
}
}

View File

@@ -334,7 +334,9 @@ pub(crate) struct ReachabilityConstraintsBuilder {
}
impl ReachabilityConstraintsBuilder {
pub(crate) fn build(self) -> ReachabilityConstraints {
pub(crate) fn build(mut self) -> ReachabilityConstraints {
self.interiors.shrink_to_fit();
ReachabilityConstraints {
interiors: self.interiors,
}

View File

@@ -71,16 +71,12 @@ impl ScopedDefinitionId {
}
}
/// Can keep inline this many live bindings or declarations per place at a given time; more will
/// go to heap.
const INLINE_DEFINITIONS_PER_PLACE: usize = 4;
/// Live declarations for a single place at some point in control flow, with their
/// corresponding reachability constraints.
#[derive(Clone, Debug, Default, PartialEq, Eq, salsa::Update, get_size2::GetSize)]
pub(super) struct Declarations {
/// A list of live declarations for this place, sorted by their `ScopedDefinitionId`
live_declarations: SmallVec<[LiveDeclaration; INLINE_DEFINITIONS_PER_PLACE]>,
live_declarations: SmallVec<[LiveDeclaration; 2]>,
}
/// One of the live declarations for a single place at some point in control flow.
@@ -199,7 +195,7 @@ pub(super) struct Bindings {
/// "unbound" binding.
unbound_narrowing_constraint: Option<ScopedNarrowingConstraint>,
/// A list of live bindings for this place, sorted by their `ScopedDefinitionId`
live_bindings: SmallVec<[LiveBinding; INLINE_DEFINITIONS_PER_PLACE]>,
live_bindings: SmallVec<[LiveBinding; 2]>,
}
impl Bindings {

View File

@@ -86,7 +86,7 @@ impl<'db> SemanticModel<'db> {
};
let ty = Type::module_literal(self.db, self.file, &submodule);
completions.push(Completion {
name: submodule_basename,
name: submodule_basename.clone(),
ty,
builtin,
});

View File

@@ -508,7 +508,7 @@ impl<'a> SuppressionsBuilder<'a> {
lint_registry,
seen_non_trivia_token: false,
line: Vec::new(),
file: SmallVec::new(),
file: SmallVec::new_const(),
unknown: Vec::new(),
invalid: Vec::new(),
}

View File

@@ -49,7 +49,7 @@ use crate::types::generics::{
};
pub use crate::types::ide_support::{
CallSignatureDetails, Member, all_members, call_signature_details, definition_kind_for_name,
definitions_for_name,
definitions_for_attribute, definitions_for_keyword_argument, definitions_for_name,
};
use crate::types::infer::infer_unpack_types;
use crate::types::mro::{Mro, MroError, MroIterator};
@@ -4187,6 +4187,45 @@ impl<'db> Type<'db> {
.into()
}
Some(KnownClass::Deprecated) => {
// ```py
// class deprecated:
// def __new__(
// cls,
// message: LiteralString,
// /,
// *,
// category: type[Warning] | None = ...,
// stacklevel: int = 1
// ) -> Self: ...
// ```
Binding::single(
self,
Signature::new(
Parameters::new([
Parameter::positional_only(Some(Name::new_static("message")))
.with_annotated_type(Type::LiteralString),
Parameter::keyword_only(Name::new_static("category"))
.with_annotated_type(UnionType::from_elements(
db,
[
// TODO: should be `type[Warning]`
Type::any(),
KnownClass::NoneType.to_instance(db),
],
))
// TODO: should be `type[Warning]`
.with_default_type(Type::any()),
Parameter::keyword_only(Name::new_static("stacklevel"))
.with_annotated_type(KnownClass::Int.to_instance(db))
.with_default_type(Type::IntLiteral(1)),
]),
Some(KnownClass::Deprecated.to_instance(db)),
),
)
.into()
}
Some(KnownClass::TypeAliasType) => {
// ```py
// def __new__(
@@ -4450,8 +4489,11 @@ impl<'db> Type<'db> {
Type::EnumLiteral(enum_literal) => enum_literal.enum_class_instance(db).bindings(db),
Type::KnownInstance(known_instance) => {
known_instance.instance_fallback(db).bindings(db)
}
Type::PropertyInstance(_)
| Type::KnownInstance(_)
| Type::AlwaysFalsy
| Type::AlwaysTruthy
| Type::IntLiteral(_)
@@ -5007,21 +5049,27 @@ impl<'db> Type<'db> {
| Type::ProtocolInstance(_)
| Type::PropertyInstance(_)
| Type::TypeIs(_) => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![InvalidTypeExpression::InvalidType(
*self, scope_id
)],
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::InvalidType(*self, scope_id)
],
fallback_type: Type::unknown(),
}),
Type::KnownInstance(known_instance) => match known_instance {
KnownInstanceType::TypeAliasType(alias) => Ok(alias.value_type(db)),
KnownInstanceType::TypeVar(typevar) => Ok(Type::TypeVar(*typevar)),
KnownInstanceType::Deprecated(_) => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![InvalidTypeExpression::Deprecated],
fallback_type: Type::unknown(),
}),
KnownInstanceType::SubscriptedProtocol(_) => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![InvalidTypeExpression::Protocol],
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::Protocol
],
fallback_type: Type::unknown(),
}),
KnownInstanceType::SubscriptedGeneric(_) => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![InvalidTypeExpression::Generic],
invalid_expressions: smallvec::smallvec_inline![InvalidTypeExpression::Generic],
fallback_type: Type::unknown(),
}),
},
@@ -5057,7 +5105,7 @@ impl<'db> Type<'db> {
let Some(class) = nearest_enclosing_class(db, index, scope_id, &module) else {
return Err(InvalidTypeExpressionError {
fallback_type: Type::unknown(),
invalid_expressions: smallvec::smallvec![
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::InvalidType(*self, scope_id)
],
});
@@ -5081,18 +5129,20 @@ impl<'db> Type<'db> {
SpecialFormType::Literal
| SpecialFormType::Union
| SpecialFormType::Intersection => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::RequiresArguments(*self)
],
fallback_type: Type::unknown(),
}),
SpecialFormType::Protocol => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![InvalidTypeExpression::Protocol],
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::Protocol
],
fallback_type: Type::unknown(),
}),
SpecialFormType::Generic => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![InvalidTypeExpression::Generic],
invalid_expressions: smallvec::smallvec_inline![InvalidTypeExpression::Generic],
fallback_type: Type::unknown(),
}),
@@ -5103,7 +5153,7 @@ impl<'db> Type<'db> {
| SpecialFormType::TypeGuard
| SpecialFormType::Unpack
| SpecialFormType::CallableTypeOf => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::RequiresOneArgument(*self)
],
fallback_type: Type::unknown(),
@@ -5111,7 +5161,7 @@ impl<'db> Type<'db> {
SpecialFormType::Annotated | SpecialFormType::Concatenate => {
Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::RequiresTwoArguments(*self)
],
fallback_type: Type::unknown(),
@@ -5120,7 +5170,7 @@ impl<'db> Type<'db> {
SpecialFormType::ClassVar | SpecialFormType::Final => {
Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::TypeQualifier(*special_form)
],
fallback_type: Type::unknown(),
@@ -5130,7 +5180,7 @@ impl<'db> Type<'db> {
SpecialFormType::ReadOnly
| SpecialFormType::NotRequired
| SpecialFormType::Required => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::TypeQualifierRequiresOneArgument(*special_form)
],
fallback_type: Type::unknown(),
@@ -5184,9 +5234,9 @@ impl<'db> Type<'db> {
"Support for `types.UnionType` instances in type expressions"
)),
_ => Err(InvalidTypeExpressionError {
invalid_expressions: smallvec::smallvec![InvalidTypeExpression::InvalidType(
*self, scope_id
)],
invalid_expressions: smallvec::smallvec_inline![
InvalidTypeExpression::InvalidType(*self, scope_id)
],
fallback_type: Type::unknown(),
}),
},
@@ -5857,6 +5907,9 @@ pub enum KnownInstanceType<'db> {
/// A single instance of `typing.TypeAliasType` (PEP 695 type alias)
TypeAliasType(TypeAliasType<'db>),
/// A single instance of `warnings.deprecated` or `typing_extensions.deprecated`
Deprecated(DeprecatedInstance<'db>),
}
fn walk_known_instance_type<'db, V: visitor::TypeVisitor<'db> + ?Sized>(
@@ -5875,6 +5928,9 @@ fn walk_known_instance_type<'db, V: visitor::TypeVisitor<'db> + ?Sized>(
KnownInstanceType::TypeAliasType(type_alias) => {
visitor.visit_type_alias_type(db, type_alias);
}
KnownInstanceType::Deprecated(_) => {
// Nothing to visit
}
}
}
@@ -5891,6 +5947,10 @@ impl<'db> KnownInstanceType<'db> {
Self::TypeAliasType(type_alias) => {
Self::TypeAliasType(type_alias.normalized_impl(db, visitor))
}
Self::Deprecated(deprecated) => {
// Nothing to normalize
Self::Deprecated(deprecated)
}
}
}
@@ -5899,6 +5959,7 @@ impl<'db> KnownInstanceType<'db> {
Self::SubscriptedProtocol(_) | Self::SubscriptedGeneric(_) => KnownClass::SpecialForm,
Self::TypeVar(_) => KnownClass::TypeVar,
Self::TypeAliasType(_) => KnownClass::TypeAliasType,
Self::Deprecated(_) => KnownClass::Deprecated,
}
}
@@ -5943,6 +6004,7 @@ impl<'db> KnownInstanceType<'db> {
// it as an instance of `typing.TypeVar`. Inside of a generic class or function, we'll
// have a `Type::TypeVar(_)`, which is rendered as the typevar's name.
KnownInstanceType::TypeVar(_) => f.write_str("typing.TypeVar"),
KnownInstanceType::Deprecated(_) => f.write_str("warnings.deprecated"),
}
}
}
@@ -6131,6 +6193,8 @@ enum InvalidTypeExpression<'db> {
Protocol,
/// Same for `Generic`
Generic,
/// Same for `@deprecated`
Deprecated,
/// Type qualifiers are always invalid in *type expressions*,
/// but these ones are okay with 0 arguments in *annotation expressions*
TypeQualifier(SpecialFormType),
@@ -6172,6 +6236,9 @@ impl<'db> InvalidTypeExpression<'db> {
InvalidTypeExpression::Generic => {
f.write_str("`typing.Generic` is not allowed in type expressions")
}
InvalidTypeExpression::Deprecated => {
f.write_str("`warnings.deprecated` is not allowed in type expressions")
}
InvalidTypeExpression::TypeQualifier(qualifier) => write!(
f,
"Type qualifier `{qualifier}` is not allowed in type expressions \
@@ -6227,6 +6294,17 @@ impl<'db> InvalidTypeExpression<'db> {
}
}
/// Data regarding a `warnings.deprecated` or `typing_extensions.deprecated` decorator.
#[salsa::interned(debug)]
#[derive(PartialOrd, Ord)]
pub struct DeprecatedInstance<'db> {
/// The message for the deprecation
pub message: Option<StringLiteralType<'db>>,
}
// The Salsa heap is tracked separately.
impl get_size2::GetSize for DeprecatedInstance<'_> {}
/// Whether this typevar was created via the legacy `TypeVar` constructor, or using PEP 695 syntax.
#[derive(Clone, Copy, Debug, Eq, Hash, PartialEq)]
pub enum TypeVarKind {

View File

@@ -8,7 +8,7 @@ use std::fmt;
use itertools::Itertools;
use ruff_db::parsed::parsed_module;
use smallvec::{SmallVec, smallvec};
use smallvec::{SmallVec, smallvec, smallvec_inline};
use super::{Argument, CallArguments, CallError, CallErrorKind, InferContext, Signature, Type};
use crate::db::Db;
@@ -848,6 +848,7 @@ impl<'db> Bindings<'db> {
class_literal.name(db),
class_literal.body_scope(db),
class_literal.known(db),
class_literal.deprecated(db),
Some(params),
class_literal.dataclass_transformer_params(db),
)));
@@ -1019,7 +1020,7 @@ impl<'db> From<CallableBinding<'db>> for Bindings<'db> {
fn from(from: CallableBinding<'db>) -> Bindings<'db> {
Bindings {
callable_type: from.callable_type,
elements: smallvec![from],
elements: smallvec_inline![from],
argument_forms: Box::from([]),
conflicting_forms: Box::from([]),
}
@@ -1037,11 +1038,11 @@ impl<'db> From<Binding<'db>> for Bindings<'db> {
bound_type: None,
overload_call_return_type: None,
matching_overload_index: None,
overloads: smallvec![from],
overloads: smallvec_inline![from],
};
Bindings {
callable_type,
elements: smallvec![callable_binding],
elements: smallvec_inline![callable_binding],
argument_forms: Box::from([]),
conflicting_forms: Box::from([]),
}

View File

@@ -22,8 +22,9 @@ use crate::types::signatures::{CallableSignature, Parameter, Parameters, Signatu
use crate::types::tuple::TupleType;
use crate::types::{
BareTypeAliasType, Binding, BoundSuperError, BoundSuperType, CallableType, DataclassParams,
DynamicType, KnownInstanceType, TypeAliasType, TypeMapping, TypeRelation, TypeTransformer,
TypeVarBoundOrConstraints, TypeVarInstance, TypeVarKind, infer_definition_types,
DeprecatedInstance, DynamicType, KnownInstanceType, TypeAliasType, TypeMapping, TypeRelation,
TypeTransformer, TypeVarBoundOrConstraints, TypeVarInstance, TypeVarKind,
infer_definition_types,
};
use crate::{
Db, FxOrderSet, KnownModule, Program,
@@ -799,6 +800,9 @@ pub struct ClassLiteral<'db> {
pub(crate) known: Option<KnownClass>,
/// If this class is deprecated, this holds the deprecation message.
pub(crate) deprecated: Option<DeprecatedInstance<'db>>,
pub(crate) dataclass_params: Option<DataclassParams>,
pub(crate) dataclass_transformer_params: Option<DataclassTransformerParams>,
}
@@ -1600,6 +1604,25 @@ impl<'db> ClassLiteral<'db> {
.place
.ignore_possibly_unbound()
}
(CodeGeneratorKind::DataclassLike, "__setattr__") => {
if has_dataclass_param(DataclassParams::FROZEN) {
let signature = Signature::new(
Parameters::new([
Parameter::positional_or_keyword(Name::new_static("self"))
.with_annotated_type(Type::instance(
db,
self.apply_optional_specialization(db, specialization),
)),
Parameter::positional_or_keyword(Name::new_static("name")),
Parameter::positional_or_keyword(Name::new_static("value")),
]),
Some(Type::Never),
);
return Some(CallableType::function_like(db, signature));
}
None
}
_ => None,
}
}
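The hunk above synthesizes a `__setattr__` returning `Never` for frozen dataclasses, so attribute assignment is flagged statically. The runtime counterpart is the generated `__setattr__` raising `FrozenInstanceError`:

```python
from dataclasses import FrozenInstanceError, dataclass


@dataclass(frozen=True)
class Point:
    x: int
    y: int


p = Point(1, 2)
try:
    p.x = 3  # frozen dataclasses generate a __setattr__ that rejects this
except FrozenInstanceError:
    blocked = True
assert blocked
```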
@@ -2399,6 +2422,7 @@ pub enum KnownClass {
NoneType, // Part of `types` for Python >= 3.10
// Typing
Any,
Deprecated,
StdlibAlias,
SpecialForm,
TypeVar,
@@ -2516,6 +2540,7 @@ impl KnownClass {
| Self::NotImplementedType
| Self::Staticmethod
| Self::Classmethod
| Self::Deprecated
| Self::Field
| Self::KwOnly
| Self::NamedTupleFallback => Truthiness::Ambiguous,
@@ -2543,6 +2568,7 @@ impl KnownClass {
| Self::Property
| Self::Staticmethod
| Self::Classmethod
| Self::Deprecated
| Self::Type
| Self::ModuleType
| Self::Super
@@ -2629,6 +2655,7 @@ impl KnownClass {
| KnownClass::ExceptionGroup
| KnownClass::Staticmethod
| KnownClass::Classmethod
| KnownClass::Deprecated
| KnownClass::Super
| KnownClass::Enum
| KnownClass::Auto
@@ -2712,6 +2739,7 @@ impl KnownClass {
| Self::ExceptionGroup
| Self::Staticmethod
| Self::Classmethod
| Self::Deprecated
| Self::GenericAlias
| Self::GeneratorType
| Self::AsyncGeneratorType
@@ -2778,6 +2806,7 @@ impl KnownClass {
Self::ExceptionGroup => "ExceptionGroup",
Self::Staticmethod => "staticmethod",
Self::Classmethod => "classmethod",
Self::Deprecated => "deprecated",
Self::GenericAlias => "GenericAlias",
Self::ModuleType => "ModuleType",
Self::FunctionType => "FunctionType",
@@ -3052,6 +3081,7 @@ impl KnownClass {
| Self::ParamSpec
| Self::ParamSpecArgs
| Self::ParamSpecKwargs
| Self::Deprecated
| Self::NewType => KnownModule::TypingExtensions,
Self::NoDefaultType => {
let python_version = Program::get(db).python_version(db);
@@ -3120,6 +3150,7 @@ impl KnownClass {
| Self::ExceptionGroup
| Self::Staticmethod
| Self::Classmethod
| Self::Deprecated
| Self::GenericAlias
| Self::ModuleType
| Self::FunctionType
@@ -3207,6 +3238,7 @@ impl KnownClass {
| Self::ExceptionGroup
| Self::Staticmethod
| Self::Classmethod
| Self::Deprecated
| Self::TypeVar
| Self::ParamSpec
| Self::ParamSpecArgs
@@ -3259,6 +3291,7 @@ impl KnownClass {
"ExceptionGroup" => Self::ExceptionGroup,
"staticmethod" => Self::Staticmethod,
"classmethod" => Self::Classmethod,
"deprecated" => Self::Deprecated,
"GenericAlias" => Self::GenericAlias,
"NoneType" => Self::NoneType,
"ModuleType" => Self::ModuleType,
@@ -3378,6 +3411,8 @@ impl KnownClass {
| Self::NamedTuple
| Self::Iterable
| Self::NewType => matches!(module, KnownModule::Typing | KnownModule::TypingExtensions),
Self::Deprecated => matches!(module, KnownModule::Warnings | KnownModule::TypingExtensions),
}
}
@@ -3387,10 +3422,10 @@ impl KnownClass {
self,
context: &InferContext<'db, '_>,
index: &SemanticIndex<'db>,
overload_binding: &Binding<'db>,
call_argument_types: &CallArguments<'_, 'db>,
overload: &mut Binding<'db>,
call_arguments: &CallArguments<'_, 'db>,
call_expression: &ast::ExprCall,
) -> Option<Type<'db>> {
) {
let db = context.db();
let scope = context.scope();
let module = context.module();
@@ -3401,14 +3436,15 @@ impl KnownClass {
// In this case, we need to infer the two arguments:
// 1. The nearest enclosing class
// 2. The first parameter of the current function (typically `self` or `cls`)
match overload_binding.parameter_types() {
match overload.parameter_types() {
[] => {
let Some(enclosing_class) =
nearest_enclosing_class(db, index, scope, module)
else {
BoundSuperError::UnavailableImplicitArguments
.report_diagnostic(context, call_expression.into());
return Some(Type::unknown());
overload.set_return_type(Type::unknown());
return;
};
// The type of the first parameter if the given scope is function-like (i.e. function or lambda).
@@ -3430,7 +3466,8 @@ impl KnownClass {
let Some(first_param) = first_param else {
BoundSuperError::UnavailableImplicitArguments
.report_diagnostic(context, call_expression.into());
return Some(Type::unknown());
overload.set_return_type(Type::unknown());
return;
};
let definition = index.expect_single_definition(first_param);
@@ -3447,7 +3484,7 @@ impl KnownClass {
Type::unknown()
});
Some(bound_super)
overload.set_return_type(bound_super);
}
[Some(pivot_class_type), Some(owner_type)] => {
let bound_super = BoundSuperType::build(db, *pivot_class_type, *owner_type)
@@ -3455,13 +3492,37 @@ impl KnownClass {
err.report_diagnostic(context, call_expression.into());
Type::unknown()
});
Some(bound_super)
overload.set_return_type(bound_super);
}
_ => None,
_ => {}
}
}
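The `super()` handling above infers the two implicit arguments — the nearest enclosing class and the first parameter of the current function — which is exactly what zero-argument `super()` means at runtime:

```python
class Base:
    def greet(self):
        return "base"


class Child(Base):
    def greet(self):
        # Zero-argument super() is equivalent to super(Child, self):
        # the enclosing class and the first parameter are supplied
        # implicitly, matching the inference in the hunk above.
        return super().greet() + "+child"


assert Child().greet() == "base+child"
```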
KnownClass::Deprecated => {
// Parsing something of the form:
//
// @deprecated("message")
// @deprecated("message", category = DeprecationWarning, stacklevel = 1)
//
// "Static type checker behavior is not affected by the category and stacklevel arguments"
// so we only need the message and can ignore everything else. The message is mandatory,
// must be a LiteralString, and always comes first.
//
// We aren't guaranteed to know the static value of a LiteralString, so we need to
// accept that sometimes we will fail to include the message.
//
// We don't do any serious validation/diagnostics here, as the signature for this
// is included in `Type::bindings`.
//
// See: <https://typing.python.org/en/latest/spec/directives.html#deprecated>
let [Some(message), ..] = overload.parameter_types() else {
// The check in `Type::bindings` will complain about this for us
return;
};
overload.set_return_type(Type::KnownInstance(KnownInstanceType::Deprecated(
DeprecatedInstance::new(db, message.into_string_literal()),
)));
}
KnownClass::TypeVar => {
let assigned_to = index
.try_expression(ast::ExprRef::from(call_expression))
@@ -3473,12 +3534,14 @@ impl KnownClass {
_ => None,
}
}) else {
let builder =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)?;
builder.into_diagnostic(
"A legacy `typing.TypeVar` must be immediately assigned to a variable",
);
return None;
if let Some(builder) =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)
{
builder.into_diagnostic(
"A legacy `typing.TypeVar` must be immediately assigned to a variable",
);
}
return;
};
let [
@@ -3489,9 +3552,9 @@ impl KnownClass {
contravariant,
covariant,
_infer_variance,
] = overload_binding.parameter_types()
] = overload.parameter_types()
else {
return None;
return;
};
let covariant = covariant
@@ -3504,30 +3567,37 @@ impl KnownClass {
let variance = match (contravariant, covariant) {
(Truthiness::Ambiguous, _) => {
let builder =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)?;
builder.into_diagnostic(
"The `contravariant` parameter of a legacy `typing.TypeVar` \
if let Some(builder) =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)
{
builder.into_diagnostic(
"The `contravariant` parameter of a legacy `typing.TypeVar` \
cannot have an ambiguous value",
);
return None;
);
}
return;
}
(_, Truthiness::Ambiguous) => {
let builder =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)?;
builder.into_diagnostic(
"The `covariant` parameter of a legacy `typing.TypeVar` \
if let Some(builder) =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)
{
builder.into_diagnostic(
"The `covariant` parameter of a legacy `typing.TypeVar` \
cannot have an ambiguous value",
);
return None;
);
}
return;
}
(Truthiness::AlwaysTrue, Truthiness::AlwaysTrue) => {
let builder =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)?;
builder.into_diagnostic(
"A legacy `typing.TypeVar` cannot be both covariant and contravariant",
);
return None;
if let Some(builder) =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)
{
builder.into_diagnostic(
"A legacy `typing.TypeVar` cannot be both \
covariant and contravariant",
);
}
return;
}
(Truthiness::AlwaysTrue, Truthiness::AlwaysFalse) => {
TypeVarVariance::Contravariant
@@ -3541,19 +3611,21 @@ impl KnownClass {
let name_param = name_param.into_string_literal().map(|name| name.value(db));
if name_param.is_none_or(|name_param| name_param != target.id) {
let builder =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)?;
builder.into_diagnostic(format_args!(
"The name of a legacy `typing.TypeVar`{} must match \
if let Some(builder) =
context.report_lint(&INVALID_LEGACY_TYPE_VARIABLE, call_expression)
{
builder.into_diagnostic(format_args!(
"The name of a legacy `typing.TypeVar`{} must match \
the name of the variable it is assigned to (`{}`)",
if let Some(name_param) = name_param {
format!(" (`{name_param}`)")
} else {
String::new()
},
target.id,
));
return None;
if let Some(name_param) = name_param {
format!(" (`{name_param}`)")
} else {
String::new()
},
target.id,
));
}
return;
}
let bound_or_constraint = match (bound, constraints) {
@@ -3568,8 +3640,8 @@ impl KnownClass {
// typevar constraints.
let elements = UnionType::new(
db,
overload_binding
.arguments_for_parameter(call_argument_types, 1)
overload
.arguments_for_parameter(call_arguments, 1)
.map(|(_, ty)| ty)
.collect::<Box<_>>(),
);
@@ -3578,13 +3650,13 @@ impl KnownClass {
// TODO: Emit a diagnostic that TypeVar cannot be both bounded and
// constrained
(Some(_), Some(_)) => return None,
(Some(_), Some(_)) => return,
(None, None) => None,
};
let containing_assignment = index.expect_single_definition(target);
Some(Type::KnownInstance(KnownInstanceType::TypeVar(
overload.set_return_type(Type::KnownInstance(KnownInstanceType::TypeVar(
TypeVarInstance::new(
db,
&target.id,
@@ -3594,7 +3666,7 @@ impl KnownClass {
*default,
TypeVarKind::Legacy,
),
)))
)));
}
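The variance checks above mirror a constraint `typing.TypeVar` also enforces at runtime — a typevar cannot be both covariant and contravariant (the name-matching rule, by contrast, is checker-only):

```python
from typing import TypeVar

T = TypeVar("T")  # checkers require the name to match the assigned variable
T_co = TypeVar("T_co", covariant=True)

try:
    Bad = TypeVar("Bad", covariant=True, contravariant=True)
except ValueError:
    # CPython rejects this with "Bivariant types are not supported."
    bivariant_rejected = True
assert bivariant_rejected
```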
KnownClass::TypeAliasType => {
@@ -3609,32 +3681,31 @@ impl KnownClass {
}
});
let [Some(name), Some(value), ..] = overload_binding.parameter_types() else {
return None;
let [Some(name), Some(value), ..] = overload.parameter_types() else {
return;
};
name.into_string_literal()
.map(|name| {
Type::KnownInstance(KnownInstanceType::TypeAliasType(TypeAliasType::Bare(
BareTypeAliasType::new(
db,
ast::name::Name::new(name.value(db)),
containing_assignment,
value,
),
)))
})
.or_else(|| {
let builder =
context.report_lint(&INVALID_TYPE_ALIAS_TYPE, call_expression)?;
let Some(name) = name.into_string_literal() else {
if let Some(builder) =
context.report_lint(&INVALID_TYPE_ALIAS_TYPE, call_expression)
{
builder.into_diagnostic(
"The name of a `typing.TypeAlias` must be a string literal",
);
None
})
}
return;
};
overload.set_return_type(Type::KnownInstance(KnownInstanceType::TypeAliasType(
TypeAliasType::Bare(BareTypeAliasType::new(
db,
ast::name::Name::new(name.value(db)),
containing_assignment,
value,
)),
)));
}
_ => None,
_ => {}
}
}
}

View File

@@ -163,7 +163,9 @@ impl<'db> ClassBase<'db> {
Type::KnownInstance(known_instance) => match known_instance {
KnownInstanceType::SubscriptedGeneric(_) => Some(Self::Generic),
KnownInstanceType::SubscriptedProtocol(_) => Some(Self::Protocol),
KnownInstanceType::TypeAliasType(_) | KnownInstanceType::TypeVar(_) => None,
KnownInstanceType::TypeAliasType(_)
| KnownInstanceType::TypeVar(_)
| KnownInstanceType::Deprecated(_) => None,
},
Type::SpecialForm(special_form) => match special_form {

View File

@@ -288,7 +288,6 @@ impl LintDiagnosticGuard<'_, '_> {
///
/// Callers can add additional primary or secondary annotations via the
/// `DerefMut` trait implementation to a `Diagnostic`.
#[expect(dead_code)]
pub(super) fn add_primary_tag(&mut self, tag: DiagnosticTag) {
let ann = self.primary_annotation_mut().unwrap();
ann.push_tag(tag);
@@ -399,7 +398,7 @@ impl<'db, 'ctx> LintDiagnosticGuardBuilder<'db, 'ctx> {
// returns a rule selector for a given file that respects the package's settings,
// any global pragma comments in the file, and any per-file-ignores.
if !ctx.db.is_file_open(ctx.file) {
if !ctx.db.should_check_file(ctx.file) {
return None;
}
let lint_id = LintId::of(lint);
@@ -573,7 +572,7 @@ impl<'db, 'ctx> DiagnosticGuardBuilder<'db, 'ctx> {
id: DiagnosticId,
severity: Severity,
) -> Option<DiagnosticGuardBuilder<'db, 'ctx>> {
if !ctx.db.is_file_open(ctx.file) {
if !ctx.db.should_check_file(ctx.file) {
return None;
}
Some(DiagnosticGuardBuilder { ctx, id, severity })

View File

@@ -34,6 +34,7 @@ pub(crate) fn register_lints(registry: &mut LintRegistryBuilder) {
registry.register_lint(&CONFLICTING_DECLARATIONS);
registry.register_lint(&CONFLICTING_METACLASS);
registry.register_lint(&CYCLIC_CLASS_DEFINITION);
registry.register_lint(&DEPRECATED);
registry.register_lint(&DIVISION_BY_ZERO);
registry.register_lint(&DUPLICATE_BASE);
registry.register_lint(&DUPLICATE_KW_ONLY);
@@ -261,6 +262,27 @@ declare_lint! {
}
}
declare_lint! {
/// ## What it does
/// Checks for uses of deprecated items.
///
/// ## Why is this bad?
/// Deprecated items should no longer be used.
///
/// ## Examples
/// ```python
/// @warnings.deprecated("use new_func instead")
/// def old_func(): ...
///
/// old_func() # emits [deprecated] diagnostic
/// ```
pub(crate) static DEPRECATED = {
summary: "detects uses of deprecated items",
status: LintStatus::preview("1.0.0"),
default_level: Level::Warn,
}
}
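For readers without Python 3.13, a minimal runtime sketch of what `@warnings.deprecated` (PEP 702) does — this is an assumption-labeled stand-in, not the stdlib implementation; static checkers like ty react to the decoration itself rather than this runtime behavior:

```python
import functools
import warnings


def deprecated(message):
    # Sketch: record the message on __deprecated__ and emit a
    # DeprecationWarning at call time, as warnings.deprecated does
    # on Python 3.13+ (typing_extensions.deprecated earlier).
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(message, DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)

        wrapper.__deprecated__ = message
        return wrapper

    return decorator


@deprecated("use new_func instead")
def old_func():
    return 42


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert old_func() == 42
assert caught[0].category is DeprecationWarning
assert old_func.__deprecated__ == "use new_func instead"
```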
declare_lint! {
/// ## What it does
/// Checks for class definitions with duplicate bases.

View File

@@ -64,6 +64,7 @@ use crate::semantic_index::ast_ids::HasScopedUseId;
use crate::semantic_index::definition::Definition;
use crate::semantic_index::place::ScopeId;
use crate::semantic_index::semantic_index;
use crate::types::call::{Binding, CallArguments};
use crate::types::context::InferContext;
use crate::types::diagnostic::{
REDUNDANT_CAST, STATIC_ASSERT_ERROR, TYPE_ASSERTION_FAILURE,
@@ -75,8 +76,8 @@ use crate::types::narrow::ClassInfoConstraintFunction;
use crate::types::signatures::{CallableSignature, Signature};
use crate::types::visitor::any_over_type;
use crate::types::{
BoundMethodType, CallableType, DynamicType, KnownClass, Type, TypeMapping, TypeRelation,
TypeTransformer, TypeVarInstance, walk_type_mapping,
BoundMethodType, CallableType, DeprecatedInstance, DynamicType, KnownClass, Type, TypeMapping,
TypeRelation, TypeTransformer, TypeVarInstance, UnionBuilder, walk_type_mapping,
};
use crate::{Db, FxOrderSet, ModuleName, resolve_module};
@@ -198,6 +199,9 @@ pub struct OverloadLiteral<'db> {
/// A set of special decorators that were applied to this function
pub(crate) decorators: FunctionDecorators,
/// If `Some`, contains the `@warnings.deprecated` decorator applied to this function.
pub(crate) deprecated: Option<DeprecatedInstance<'db>>,
/// The arguments to `dataclass_transformer`, if this function was annotated
/// with `@dataclass_transformer(...)`.
pub(crate) dataclass_transformer_params: Option<DataclassTransformerParams>,
@@ -219,6 +223,7 @@ impl<'db> OverloadLiteral<'db> {
self.known(db),
self.body_scope(db),
self.decorators(db),
self.deprecated(db),
Some(params),
)
}
@@ -464,6 +469,14 @@ impl<'db> FunctionLiteral<'db> {
.any(|overload| overload.decorators(db).contains(decorator))
}
/// If the implementation of this function is deprecated, returns the corresponding `@warnings.deprecated` instance.
///
/// Checking if an overload is deprecated requires deeper call analysis.
fn implementation_deprecated(self, db: &'db dyn Db) -> Option<DeprecatedInstance<'db>> {
let (_overloads, implementation) = self.overloads_and_implementation(db);
implementation.and_then(|overload| overload.deprecated(db))
}
fn definition(self, db: &'db dyn Db) -> Definition<'db> {
self.last_definition(db).definition(db)
}
@@ -671,6 +684,16 @@ impl<'db> FunctionType<'db> {
self.literal(db).has_known_decorator(db, decorator)
}
/// If the implementation of this function is deprecated, returns the corresponding `@warnings.deprecated` instance.
///
/// Checking if an overload is deprecated requires deeper call analysis.
pub(crate) fn implementation_deprecated(
self,
db: &'db dyn Db,
) -> Option<DeprecatedInstance<'db>> {
self.literal(db).implementation_deprecated(db)
}
/// Returns the [`Definition`] of the implementation or first overload of this function.
///
/// ## Warning
@@ -1039,86 +1062,90 @@ impl KnownFunction {
pub(super) fn check_call<'db>(
self,
context: &InferContext<'db, '_>,
parameter_types: &[Option<Type<'db>>],
overload: &mut Binding<'db>,
call_arguments: &CallArguments<'_, 'db>,
call_expression: &ast::ExprCall,
file: File,
) -> Option<Type<'db>> {
) {
let db = context.db();
let parameter_types = overload.parameter_types();
match self {
KnownFunction::RevealType => {
let [Some(revealed_type)] = parameter_types else {
return None;
};
let builder =
context.report_diagnostic(DiagnosticId::RevealedType, Severity::Info)?;
let mut diag = builder.into_diagnostic("Revealed type");
let span = context.span(&call_expression.arguments.args[0]);
diag.annotate(
Annotation::primary(span)
.message(format_args!("`{}`", revealed_type.display(db))),
);
None
let revealed_type = overload
.arguments_for_parameter(call_arguments, 0)
.fold(UnionBuilder::new(db), |builder, (_, ty)| builder.add(ty))
.build();
if let Some(builder) =
context.report_diagnostic(DiagnosticId::RevealedType, Severity::Info)
{
let mut diag = builder.into_diagnostic("Revealed type");
let span = context.span(&call_expression.arguments.args[0]);
diag.annotate(
Annotation::primary(span)
.message(format_args!("`{}`", revealed_type.display(db))),
);
}
}
KnownFunction::AssertType => {
let [Some(actual_ty), Some(asserted_ty)] = parameter_types else {
return None;
return;
};
if actual_ty.is_equivalent_to(db, *asserted_ty) {
return None;
return;
}
let builder = context.report_lint(&TYPE_ASSERTION_FAILURE, call_expression)?;
if let Some(builder) = context.report_lint(&TYPE_ASSERTION_FAILURE, call_expression)
{
let mut diagnostic = builder.into_diagnostic(format_args!(
"Argument does not have asserted type `{}`",
asserted_ty.display(db),
));
let mut diagnostic = builder.into_diagnostic(format_args!(
"Argument does not have asserted type `{}`",
asserted_ty.display(db),
));
diagnostic.annotate(
Annotation::secondary(context.span(&call_expression.arguments.args[0]))
.message(format_args!(
"Inferred type of argument is `{}`",
actual_ty.display(db),
)),
);
diagnostic.annotate(
Annotation::secondary(context.span(&call_expression.arguments.args[0]))
.message(format_args!(
"Inferred type of argument is `{}`",
actual_ty.display(db),
)),
);
diagnostic.info(format_args!(
"`{asserted_type}` and `{inferred_type}` are not equivalent types",
asserted_type = asserted_ty.display(db),
inferred_type = actual_ty.display(db),
));
None
diagnostic.info(format_args!(
"`{asserted_type}` and `{inferred_type}` are not equivalent types",
asserted_type = asserted_ty.display(db),
inferred_type = actual_ty.display(db),
));
}
}
KnownFunction::AssertNever => {
let [Some(actual_ty)] = parameter_types else {
return None;
return;
};
if actual_ty.is_equivalent_to(db, Type::Never) {
return None;
return;
}
if let Some(builder) = context.report_lint(&TYPE_ASSERTION_FAILURE, call_expression)
{
let mut diagnostic =
builder.into_diagnostic("Argument does not have asserted type `Never`");
diagnostic.annotate(
Annotation::secondary(context.span(&call_expression.arguments.args[0]))
.message(format_args!(
"Inferred type of argument is `{}`",
actual_ty.display(db)
)),
);
diagnostic.info(format_args!(
"`Never` and `{inferred_type}` are not equivalent types",
inferred_type = actual_ty.display(db),
));
}
let builder = context.report_lint(&TYPE_ASSERTION_FAILURE, call_expression)?;
let mut diagnostic =
builder.into_diagnostic("Argument does not have asserted type `Never`");
diagnostic.annotate(
Annotation::secondary(context.span(&call_expression.arguments.args[0]))
.message(format_args!(
"Inferred type of argument is `{}`",
actual_ty.display(db)
)),
);
diagnostic.info(format_args!(
"`Never` and `{inferred_type}` are not equivalent types",
inferred_type = actual_ty.display(db),
));
None
}
KnownFunction::StaticAssert => {
let [Some(parameter_ty), message] = parameter_types else {
return None;
return;
};
let truthiness = match parameter_ty.try_bool(db) {
Ok(truthiness) => truthiness,
@@ -1138,41 +1165,42 @@ impl KnownFunction {
err.report_diagnostic(context, condition);
return None;
return;
}
};
let builder = context.report_lint(&STATIC_ASSERT_ERROR, call_expression)?;
if truthiness.is_always_true() {
return None;
}
if let Some(message) = message
.and_then(Type::into_string_literal)
.map(|s| s.value(db))
{
builder.into_diagnostic(format_args!("Static assertion error: {message}"));
} else if *parameter_ty == Type::BooleanLiteral(false) {
builder
.into_diagnostic("Static assertion error: argument evaluates to `False`");
} else if truthiness.is_always_false() {
builder.into_diagnostic(format_args!(
"Static assertion error: argument of type `{parameter_ty}` \
if let Some(builder) = context.report_lint(&STATIC_ASSERT_ERROR, call_expression) {
if truthiness.is_always_true() {
return;
}
if let Some(message) = message
.and_then(Type::into_string_literal)
.map(|s| s.value(db))
{
builder.into_diagnostic(format_args!("Static assertion error: {message}"));
} else if *parameter_ty == Type::BooleanLiteral(false) {
builder.into_diagnostic(
"Static assertion error: argument evaluates to `False`",
);
} else if truthiness.is_always_false() {
builder.into_diagnostic(format_args!(
"Static assertion error: argument of type `{parameter_ty}` \
is statically known to be falsy",
parameter_ty = parameter_ty.display(db)
));
} else {
builder.into_diagnostic(format_args!(
"Static assertion error: argument of type `{parameter_ty}` \
parameter_ty = parameter_ty.display(db)
));
} else {
builder.into_diagnostic(format_args!(
"Static assertion error: argument of type `{parameter_ty}` \
has an ambiguous static truthiness",
parameter_ty = parameter_ty.display(db)
));
parameter_ty = parameter_ty.display(db)
));
}
}
None
}
KnownFunction::Cast => {
let [Some(casted_type), Some(source_type)] = parameter_types else {
return None;
return;
};
let contains_unknown_or_todo =
|ty| matches!(ty, Type::Dynamic(dynamic) if dynamic != DynamicType::Any);
@@ -1180,31 +1208,34 @@ impl KnownFunction {
&& !any_over_type(db, *source_type, &contains_unknown_or_todo)
&& !any_over_type(db, *casted_type, &contains_unknown_or_todo)
{
let builder = context.report_lint(&REDUNDANT_CAST, call_expression)?;
builder.into_diagnostic(format_args!(
"Value is already of type `{}`",
casted_type.display(db),
));
if let Some(builder) = context.report_lint(&REDUNDANT_CAST, call_expression) {
builder.into_diagnostic(format_args!(
"Value is already of type `{}`",
casted_type.display(db),
));
}
}
None
}
KnownFunction::GetProtocolMembers => {
let [Some(Type::ClassLiteral(class))] = parameter_types else {
return None;
return;
};
if class.is_protocol(db) {
return None;
return;
}
report_bad_argument_to_get_protocol_members(context, call_expression, *class);
None
}
KnownFunction::IsInstance | KnownFunction::IsSubclass => {
let [_, Some(Type::ClassLiteral(class))] = parameter_types else {
return None;
return;
};
let Some(protocol_class) = class.into_protocol_class(db) else {
return;
};
let protocol_class = class.into_protocol_class(db)?;
if protocol_class.is_runtime_checkable(db) {
return None;
return;
}
report_runtime_check_against_non_runtime_checkable_protocol(
context,
@@ -1212,16 +1243,16 @@ impl KnownFunction {
protocol_class,
self,
);
None
}
known @ (KnownFunction::DunderImport | KnownFunction::ImportModule) => {
let [Some(Type::StringLiteral(full_module_name)), rest @ ..] = parameter_types
else {
return None;
return;
};
if rest.iter().any(Option::is_some) {
return None;
return;
}
let module_name = full_module_name.value(db);
@@ -1231,16 +1262,20 @@ impl KnownFunction {
// `importlib.import_module("collections.abc")` returns the `collections.abc` module.
// ty doesn't have a way to represent the return type of the former yet.
// https://github.com/astral-sh/ruff/pull/19008#discussion_r2173481311
return None;
return;
}
let module_name = ModuleName::new(module_name)?;
let module = resolve_module(db, &module_name)?;
let Some(module_name) = ModuleName::new(module_name) else {
return;
};
let Some(module) = resolve_module(db, &module_name) else {
return;
};
Some(Type::module_literal(db, file, &module))
overload.set_return_type(Type::module_literal(db, file, &module));
}
_ => None,
_ => {}
}
}
}

View File
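
For context (not part of the diff), the `check_call` special cases above implement checker-time semantics for helpers like `reveal_type`, `assert_type`, and `assert_never`: the real work happens statically, and most of these are no-ops at runtime. A minimal sketch of the runtime side:

```python
# `assert_type` landed in `typing` in Python 3.11; at runtime it is an
# identity function, so provide a stand-in on older interpreters.
try:
    from typing import assert_type
except ImportError:
    def assert_type(value, _type):
        return value

x = 1 + 1
# A checker such as ty verifies the asserted type statically and reports
# `type-assertion-failure` on mismatch; at runtime this just returns `x`.
y = assert_type(x, int)
print(y)  # → 2
```

This is why the diff routes these functions through diagnostics rather than ordinary call evaluation: the interesting output is the diagnostic, not the return value.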

@@ -13,12 +13,16 @@ use crate::types::call::CallArguments;
use crate::types::signatures::Signature;
use crate::types::{ClassBase, ClassLiteral, DynamicType, KnownClass, KnownInstanceType, Type};
use crate::{Db, HasType, NameKind, SemanticModel};
use ruff_db::files::File;
use ruff_db::files::{File, FileRange};
use ruff_db::parsed::parsed_module;
use ruff_python_ast as ast;
use ruff_python_ast::name::Name;
use ruff_text_size::TextRange;
use ruff_text_size::{Ranged, TextRange};
use rustc_hash::FxHashSet;
pub use resolve_definition::ResolvedDefinition;
use resolve_definition::{find_symbol_in_scope, resolve_definition};
pub(crate) fn all_declarations_and_bindings<'db>(
db: &'db dyn Db,
scope_id: ScopeId<'db>,
@@ -366,7 +370,7 @@ pub fn definition_kind_for_name<'db>(
let name_str = name.id.as_str();
// Get the scope for this name expression
let file_scope = index.try_expression_scope_id(&ast::Expr::Name(name.clone()))?;
let file_scope = index.expression_scope_id(&ast::ExprRef::from(name));
// Get the place table for this scope
let place_table = index.place_table(file_scope);
@@ -399,9 +403,7 @@ pub fn definitions_for_name<'db>(
let name_str = name.id.as_str();
// Get the scope for this name expression
let Some(file_scope) = index.try_expression_scope_id(&ast::Expr::Name(name.clone())) else {
return Vec::new();
};
let file_scope = index.expression_scope_id(&ast::ExprRef::from(name));
let mut all_definitions = Vec::new();
@@ -503,6 +505,183 @@ pub fn definitions_for_name<'db>(
}
}
/// Returns all resolved definitions for an attribute expression `x.y`.
/// This function duplicates much of the functionality in the semantic
/// analyzer, but it has somewhat different behavior, so we've decided
/// to keep it separate for now. One key difference is that this function
/// doesn't model the descriptor protocol when accessing attributes.
/// For "go to definition", we want to get the type of the descriptor object
/// rather than "invoking" its `__get__` or `__set__` method.
/// If this becomes a maintenance burden in the future, it may be worth
/// changing the corresponding logic in the semantic analyzer to conditionally
/// handle this case through the use of mode flags.
pub fn definitions_for_attribute<'db>(
db: &'db dyn Db,
file: File,
attribute: &ast::ExprAttribute,
) -> Vec<ResolvedDefinition<'db>> {
let name_str = attribute.attr.as_str();
let model = SemanticModel::new(db, file);
let mut resolved = Vec::new();
// Determine the type of the LHS
let lhs_ty = attribute.value.inferred_type(&model);
let tys = match lhs_ty {
Type::Union(union) => union.elements(db).to_vec(),
_ => vec![lhs_ty],
};
// Expand intersections for each subtype into their components
let expanded_tys = tys
.into_iter()
.flat_map(|ty| match ty {
Type::Intersection(intersection) => intersection.positive(db).iter().copied().collect(),
_ => vec![ty],
})
.collect::<Vec<_>>();
for ty in expanded_tys {
// Handle modules
if let Type::ModuleLiteral(module_literal) = ty {
if let Some(module_file) = module_literal.module(db).file() {
let module_scope = global_scope(db, module_file);
for def in find_symbol_in_scope(db, module_scope, name_str) {
resolved.extend(resolve_definition(db, def, Some(name_str)));
}
}
continue;
}
// First, transform the type to its meta type, unless it's already a class-like type.
let meta_type = match ty {
Type::ClassLiteral(_) | Type::SubclassOf(_) | Type::GenericAlias(_) => ty,
_ => ty.to_meta_type(db),
};
let class_literal = match meta_type {
Type::ClassLiteral(class_literal) => class_literal,
Type::SubclassOf(subclass) => match subclass.subclass_of().into_class() {
Some(cls) => cls.class_literal(db).0,
None => continue,
},
_ => continue,
};
// Walk the MRO: include the class and its ancestors, but stop once we find a match
'scopes: for ancestor in class_literal
.iter_mro(db, None)
.filter_map(ClassBase::into_class)
.map(|cls| cls.class_literal(db).0)
{
let class_scope = ancestor.body_scope(db);
let class_place_table = crate::semantic_index::place_table(db, class_scope);
// Look for class-level declarations and bindings
if let Some(place_id) = class_place_table.place_id_by_name(name_str) {
let use_def = use_def_map(db, class_scope);
// Check declarations first
for decl in use_def.all_reachable_declarations(place_id) {
if let Some(def) = decl.declaration.definition() {
resolved.extend(resolve_definition(db, def, Some(name_str)));
break 'scopes;
}
}
// If no declarations found, check bindings
for binding in use_def.all_reachable_bindings(place_id) {
if let Some(def) = binding.binding.definition() {
resolved.extend(resolve_definition(db, def, Some(name_str)));
break 'scopes;
}
}
}
// Look for instance attributes in method scopes (e.g., self.x = 1)
let file = class_scope.file(db);
let index = semantic_index(db, file);
for function_scope_id in attribute_scopes(db, class_scope) {
let place_table = index.place_table(function_scope_id);
if let Some(place_id) = place_table.place_id_by_instance_attribute_name(name_str) {
let use_def = index.use_def_map(function_scope_id);
// Check declarations first
for decl in use_def.all_reachable_declarations(place_id) {
if let Some(def) = decl.declaration.definition() {
resolved.extend(resolve_definition(db, def, Some(name_str)));
break 'scopes;
}
}
// If no declarations found, check bindings
for binding in use_def.all_reachable_bindings(place_id) {
if let Some(def) = binding.binding.definition() {
resolved.extend(resolve_definition(db, def, Some(name_str)));
break 'scopes;
}
}
}
}
// TODO: Add support for metaclass attribute lookups
}
}
resolved
}
/// Returns definitions for a keyword argument in a call expression.
/// This resolves the keyword argument to the corresponding parameter(s) in the callable's signature(s).
pub fn definitions_for_keyword_argument<'db>(
db: &'db dyn Db,
file: File,
keyword: &ast::Keyword,
call_expr: &ast::ExprCall,
) -> Vec<ResolvedDefinition<'db>> {
let model = SemanticModel::new(db, file);
let func_type = call_expr.func.inferred_type(&model);
let Some(keyword_name) = keyword.arg.as_ref() else {
return Vec::new();
};
let keyword_name_str = keyword_name.as_str();
let mut resolved_definitions = Vec::new();
if let Some(Type::Callable(callable_type)) = func_type.into_callable(db) {
let signatures = callable_type.signatures(db);
// For each signature, find the parameter with the matching name
for signature in signatures {
if let Some((_param_index, _param)) =
signature.parameters().keyword_by_name(keyword_name_str)
{
if let Some(function_definition) = signature.definition() {
let function_file = function_definition.file(db);
let module = parsed_module(db, function_file).load(db);
let def_kind = function_definition.kind(db);
if let DefinitionKind::Function(function_ast_ref) = def_kind {
let function_node = function_ast_ref.node(&module);
if let Some(parameter_range) =
find_parameter_range(&function_node.parameters, keyword_name_str)
{
resolved_definitions.push(ResolvedDefinition::FileWithRange(
FileRange::new(function_file, parameter_range),
));
}
}
}
}
}
}
resolved_definitions
}
/// Details about a callable signature for IDE support.
#[derive(Debug, Clone)]
pub struct CallSignatureDetails<'db> {
@@ -526,7 +705,7 @@ pub struct CallSignatureDetails<'db> {
pub definition: Option<Definition<'db>>,
/// Mapping from argument indices to parameter indices. This helps
/// determine which parameter corresponds to which argument position.
/// identify which argument corresponds to which parameter.
pub argument_to_parameter_mapping: Vec<Option<usize>>,
}
@@ -573,14 +752,27 @@ pub fn call_signature_details<'db>(
}
}
/// Find the text range of a specific parameter in function parameters by name.
/// Only searches for parameters that can be addressed by name in keyword arguments.
fn find_parameter_range(parameters: &ast::Parameters, parameter_name: &str) -> Option<TextRange> {
// Check regular positional and keyword-only parameters
parameters
.args
.iter()
.chain(&parameters.kwonlyargs)
.find(|param| param.parameter.name.as_str() == parameter_name)
.map(|param| param.parameter.name.range())
}
mod resolve_definition {
//! Resolves an Import, `ImportFrom` or `StarImport` definition to one or more
//! "resolved definitions". This is done recursively to find the original
//! definition targeted by the import.
use ruff_db::files::File;
use ruff_db::files::{File, FileRange};
use ruff_db::parsed::parsed_module;
use ruff_python_ast as ast;
use ruff_text_size::TextRange;
use rustc_hash::FxHashSet;
use crate::semantic_index::definition::{Definition, DefinitionKind};
@@ -588,16 +780,17 @@ mod resolve_definition {
use crate::semantic_index::{global_scope, place_table, use_def_map};
use crate::{Db, ModuleName, resolve_module};
/// Represents the result of resolving an import to either a specific definition or a module file.
/// Represents the result of resolving an import to either a specific definition or
/// a specific range within a file.
/// This enum helps distinguish between cases where an import resolves to:
/// - A specific definition within a module (e.g., `from os import path` -> definition of `path`)
/// - An entire module file (e.g., `import os` -> the `os` module file itself)
/// - A specific range within a file, sometimes an empty range at the top of the file
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum ResolvedDefinition<'db> {
/// The import resolved to a specific definition within a module
Definition(Definition<'db>),
/// The import resolved to an entire module file
ModuleFile(File),
/// The import resolved to a file with a specific range
FileWithRange(FileRange),
}
/// Resolve import definitions to their targets.
@@ -657,7 +850,10 @@ mod resolve_definition {
// For simple imports like "import os", we want to navigate to the module itself.
// Return the module file directly instead of trying to find definitions within it.
vec![ResolvedDefinition::ModuleFile(module_file)]
vec![ResolvedDefinition::FileWithRange(FileRange::new(
module_file,
TextRange::default(),
))]
}
DefinitionKind::ImportFrom(import_from_def) => {
@@ -767,6 +963,3 @@ mod resolve_definition {
definitions
}
}
pub use resolve_definition::ResolvedDefinition;
use resolve_definition::{find_symbol_in_scope, resolve_definition};

View File

@@ -424,6 +424,7 @@ pub(crate) struct TypeInference<'db> {
diagnostics: TypeCheckDiagnostics,
/// The scope this region is part of.
#[cfg(debug_assertions)]
scope: ScopeId<'db>,
/// The fallback type for missing expressions/bindings/declarations.
@@ -434,24 +435,30 @@ pub(crate) struct TypeInference<'db> {
impl<'db> TypeInference<'db> {
pub(crate) fn empty(scope: ScopeId<'db>) -> Self {
let _ = scope;
Self {
expressions: FxHashMap::default(),
bindings: FxHashMap::default(),
declarations: FxHashMap::default(),
deferred: FxHashSet::default(),
diagnostics: TypeCheckDiagnostics::default(),
#[cfg(debug_assertions)]
scope,
cycle_fallback_type: None,
}
}
fn cycle_fallback(scope: ScopeId<'db>, cycle_fallback_type: Type<'db>) -> Self {
let _ = scope;
Self {
expressions: FxHashMap::default(),
bindings: FxHashMap::default(),
declarations: FxHashMap::default(),
deferred: FxHashSet::default(),
diagnostics: TypeCheckDiagnostics::default(),
#[cfg(debug_assertions)]
scope,
cycle_fallback_type: Some(cycle_fallback_type),
}
@@ -584,6 +591,9 @@ pub(super) struct TypeInferenceBuilder<'db, 'ast> {
index: &'db SemanticIndex<'db>,
region: InferenceRegion<'db>,
/// The scope of the current region.
scope: ScopeId<'db>,
/// The type inference results
types: TypeInference<'db>,
@@ -645,6 +655,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
Self {
context: InferContext::new(db, scope, module),
scope,
index,
region,
return_types_and_ranges: vec![],
@@ -655,8 +666,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
fn extend(&mut self, inference: &TypeInference<'db>) {
debug_assert_eq!(self.types.scope, inference.scope);
#[cfg(debug_assertions)]
assert_eq!(self.scope, inference.scope);
self.extend_unchecked(inference);
}
fn extend_unchecked(&mut self, inference: &TypeInference<'db>) {
self.types.bindings.extend(inference.bindings.iter());
self.types
.declarations
@@ -683,7 +699,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
fn scope(&self) -> ScopeId<'db> {
self.types.scope
self.scope
}
/// Are we currently inferring types in file with deferred types?
@@ -2298,6 +2314,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let mut decorator_types_and_nodes = Vec::with_capacity(decorator_list.len());
let mut function_decorators = FunctionDecorators::empty();
let mut deprecated = None;
let mut dataclass_transformer_params = None;
for decorator in decorator_list {
@@ -2315,6 +2332,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
continue;
}
}
Type::KnownInstance(KnownInstanceType::Deprecated(deprecated_inst)) => {
deprecated = Some(deprecated_inst);
}
Type::DataclassTransformer(params) => {
dataclass_transformer_params = Some(params);
}
@@ -2362,6 +2382,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
known_function,
body_scope,
function_decorators,
deprecated,
dataclass_transformer_params,
);
@@ -2624,6 +2645,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
body: _,
} = class_node;
let mut deprecated = None;
let mut dataclass_params = None;
let mut dataclass_transformer_params = None;
for decorator in decorator_list {
@@ -2641,6 +2663,13 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
continue;
}
if let Type::KnownInstance(KnownInstanceType::Deprecated(deprecated_inst)) =
decorator_ty
{
deprecated = Some(deprecated_inst);
continue;
}
if let Type::FunctionLiteral(f) = decorator_ty {
// We do not yet detect or flag `@dataclass_transform` applied to more than one
// overload, or an overload and the implementation both. Nevertheless, this is not
@@ -2673,6 +2702,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
name.id.clone(),
body_scope,
maybe_known_class,
deprecated,
dataclass_params,
dataclass_transformer_params,
));
@@ -3446,20 +3476,6 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
| Type::AlwaysTruthy
| Type::AlwaysFalsy
| Type::TypeIs(_) => {
let is_read_only = || {
let dataclass_params = match object_ty {
Type::NominalInstance(instance) => match instance.class {
ClassType::NonGeneric(cls) => cls.dataclass_params(self.db()),
ClassType::Generic(cls) => {
cls.origin(self.db()).dataclass_params(self.db())
}
},
_ => None,
};
dataclass_params.is_some_and(|params| params.contains(DataclassParams::FROZEN))
};
// First, try to call the `__setattr__` dunder method. If this is present/defined, it
// overrides assigning the attribute by the normal mechanism.
let setattr_dunder_call_result = object_ty.try_call_dunder_with_policy(
@@ -3476,11 +3492,41 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
if let Some(builder) =
self.context.report_lint(&INVALID_ASSIGNMENT, target)
{
builder.into_diagnostic(format_args!(
"Cannot assign to attribute `{attribute}` on type `{}` \
whose `__setattr__` method returns `Never`/`NoReturn`",
object_ty.display(db)
));
let is_setattr_synthesized = match object_ty
.class_member_with_policy(
db,
"__setattr__".into(),
MemberLookupPolicy::MRO_NO_OBJECT_FALLBACK,
) {
PlaceAndQualifiers {
place: Place::Type(attr_ty, _),
qualifiers: _,
} => attr_ty.is_callable_type(),
_ => false,
};
let member_exists =
!object_ty.member(db, attribute).place.is_unbound();
let msg = if !member_exists {
format!(
"Cannot assign to unresolved attribute `{attribute}` on type `{}`",
object_ty.display(db)
)
} else if is_setattr_synthesized {
format!(
"Property `{attribute}` defined in `{}` is read-only",
object_ty.display(db)
)
} else {
format!(
"Cannot assign to attribute `{attribute}` on type `{}` \
whose `__setattr__` method returns `Never`/`NoReturn`",
object_ty.display(db)
)
};
builder.into_diagnostic(msg);
}
}
false
@@ -3530,85 +3576,71 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
place: Place::Type(meta_attr_ty, meta_attr_boundness),
qualifiers: _,
} => {
if is_read_only() {
if emit_diagnostics {
if let Some(builder) =
self.context.report_lint(&INVALID_ASSIGNMENT, target)
{
builder.into_diagnostic(format_args!(
"Property `{attribute}` defined in `{ty}` is read-only",
ty = object_ty.display(self.db()),
));
}
}
false
} else {
let assignable_to_meta_attr =
if let Place::Type(meta_dunder_set, _) =
meta_attr_ty.class_member(db, "__set__".into()).place
{
let successful_call = meta_dunder_set
.try_call(
db,
&CallArguments::positional([
meta_attr_ty,
object_ty,
value_ty,
]),
)
.is_ok();
let assignable_to_meta_attr =
if let Place::Type(meta_dunder_set, _) =
meta_attr_ty.class_member(db, "__set__".into()).place
{
let successful_call = meta_dunder_set
.try_call(
db,
&CallArguments::positional([
meta_attr_ty,
object_ty,
value_ty,
]),
)
.is_ok();
if !successful_call && emit_diagnostics {
if let Some(builder) = self
.context
.report_lint(&INVALID_ASSIGNMENT, target)
{
// TODO: Here, it would be nice to emit an additional diagnostic that explains why the call failed
builder.into_diagnostic(format_args!(
if !successful_call && emit_diagnostics {
if let Some(builder) = self
.context
.report_lint(&INVALID_ASSIGNMENT, target)
{
// TODO: Here, it would be nice to emit an additional diagnostic that explains why the call failed
builder.into_diagnostic(format_args!(
"Invalid assignment to data descriptor attribute \
`{attribute}` on type `{}` with custom `__set__` method",
object_ty.display(db)
));
}
}
}
successful_call
} else {
ensure_assignable_to(meta_attr_ty)
};
successful_call
} else {
ensure_assignable_to(meta_attr_ty)
};
let assignable_to_instance_attribute =
if meta_attr_boundness == Boundness::PossiblyUnbound {
let (assignable, boundness) = if let Place::Type(
instance_attr_ty,
let assignable_to_instance_attribute =
if meta_attr_boundness == Boundness::PossiblyUnbound {
let (assignable, boundness) = if let Place::Type(
instance_attr_ty,
instance_attr_boundness,
) =
object_ty.instance_member(db, attribute).place
{
(
ensure_assignable_to(instance_attr_ty),
instance_attr_boundness,
) =
object_ty.instance_member(db, attribute).place
{
(
ensure_assignable_to(instance_attr_ty),
instance_attr_boundness,
)
} else {
(true, Boundness::PossiblyUnbound)
};
if boundness == Boundness::PossiblyUnbound {
report_possibly_unbound_attribute(
&self.context,
target,
attribute,
object_ty,
);
}
assignable
)
} else {
true
(true, Boundness::PossiblyUnbound)
};
assignable_to_meta_attr && assignable_to_instance_attribute
}
if boundness == Boundness::PossiblyUnbound {
report_possibly_unbound_attribute(
&self.context,
target,
attribute,
object_ty,
);
}
assignable
} else {
true
};
assignable_to_meta_attr && assignable_to_instance_attribute
}
PlaceAndQualifiers {
@@ -3627,22 +3659,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
);
}
if is_read_only() {
if emit_diagnostics {
if let Some(builder) = self
.context
.report_lint(&INVALID_ASSIGNMENT, target)
{
builder.into_diagnostic(format_args!(
"Property `{attribute}` defined in `{ty}` is read-only",
ty = object_ty.display(self.db()),
));
}
}
false
} else {
ensure_assignable_to(instance_attr_ty)
}
ensure_assignable_to(instance_attr_ty)
} else {
if emit_diagnostics {
if let Some(builder) =
@@ -4371,7 +4388,14 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
for alias in names {
for definition in self.index.definitions(alias) {
self.extend(infer_definition_types(self.db(), *definition));
let inferred = infer_definition_types(self.db(), *definition);
// Check non-star imports for deprecations
if definition.kind(self.db()).as_star_import().is_none() {
for ty in inferred.declarations.values() {
self.check_deprecated(alias, ty.inner);
}
}
self.extend(inferred);
}
}
}
@@ -5334,10 +5358,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
if comprehension.is_first() && target.is_name_expr() {
result.expression_type(iterable)
} else {
let scope = self.types.scope;
self.types.scope = result.scope;
self.extend(result);
self.types.scope = scope;
self.extend_unchecked(result);
result.expression_type(iterable)
}
};
@@ -5575,6 +5596,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
| KnownClass::NamedTuple
| KnownClass::TypeAliasType
| KnownClass::Tuple
| KnownClass::Deprecated
)
)
// temporary special-casing for all subclasses of `enum.Enum`
@@ -5615,30 +5637,24 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
match binding_type {
Type::FunctionLiteral(function_literal) => {
if let Some(known_function) = function_literal.known(self.db()) {
if let Some(return_type) = known_function.check_call(
known_function.check_call(
&self.context,
overload.parameter_types(),
overload,
&call_arguments,
call_expression,
self.file(),
) {
overload.set_return_type(return_type);
}
);
}
}
Type::ClassLiteral(class) => {
let Some(known_class) = class.known(self.db()) else {
continue;
};
let overridden_return = known_class.check_call(
&self.context,
self.index,
overload,
&call_arguments,
call_expression,
);
if let Some(overridden_return) = overridden_return {
overload.set_return_type(overridden_return);
if let Some(known_class) = class.known(self.db()) {
known_class.check_call(
&self.context,
self.index,
overload,
&call_arguments,
call_expression,
);
}
}
_ => {}
@@ -5818,6 +5834,62 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
ty
}
/// Check whether the given `ty` is marked `@deprecated`, and report a diagnostic if so
fn check_deprecated<T: Ranged>(&self, ranged: T, ty: Type) {
// First handle classes
if let Type::ClassLiteral(class_literal) = ty {
let Some(deprecated) = class_literal.deprecated(self.db()) else {
return;
};
let Some(builder) = self
.context
.report_lint(&crate::types::diagnostic::DEPRECATED, ranged)
else {
return;
};
let class_name = class_literal.name(self.db());
let mut diag =
builder.into_diagnostic(format_args!(r#"The class `{class_name}` is deprecated"#));
if let Some(message) = deprecated.message(self.db()) {
diag.set_primary_message(message.value(self.db()));
}
diag.add_primary_tag(ruff_db::diagnostic::DiagnosticTag::Deprecated);
return;
}
// Next handle functions
let function = match ty {
Type::FunctionLiteral(function) => function,
Type::BoundMethod(bound) => bound.function(self.db()),
_ => return,
};
// Currently we only check the final implementation for deprecation, as
// that check can be done on any reference to the function. Analysis of
// deprecated overloads needs to be done in places where we resolve the
// actual overloads being used.
let Some(deprecated) = function.implementation_deprecated(self.db()) else {
return;
};
let Some(builder) = self
.context
.report_lint(&crate::types::diagnostic::DEPRECATED, ranged)
else {
return;
};
let func_name = function.name(self.db());
let mut diag =
builder.into_diagnostic(format_args!(r#"The function `{func_name}` is deprecated"#));
if let Some(message) = deprecated.message(self.db()) {
diag.set_primary_message(message.value(self.db()));
}
diag.add_primary_tag(ruff_db::diagnostic::DiagnosticTag::Deprecated);
}
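`check_deprecated` reports uses of items carrying the `@warnings.deprecated` marker (PEP 702; in the stdlib since Python 3.13). As a rough sketch of the runtime side of that contract — re-implemented here so it runs on older interpreters; the real decorator also handles classes and accepts `category`/`stacklevel` arguments:

```python
import functools
import warnings

def deprecated(message):
    # Minimal sketch of what `@warnings.deprecated` does to a function at
    # runtime: warn on every call and record the message on the object.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(message, DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        # Static checkers key off this marker rather than the warning itself.
        wrapper.__deprecated__ = message
        return wrapper
    return decorator

@deprecated("use new_func instead")
def old_func():
    return 42
```

A checker like the one in this diff never runs the wrapper; it reads the deprecation metadata off the definition and attaches a `DiagnosticTag::Deprecated`-tagged diagnostic to each reference.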
fn infer_name_load(&mut self, name_node: &ast::ExprName) -> Type<'db> {
let ast::ExprName {
range: _,
@@ -5830,6 +5902,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let (resolved, constraint_keys) =
self.infer_place_load(&expr, ast::ExprRef::Name(name_node));
resolved
// Not found in the module's explicitly declared global symbols?
// Check the "implicit globals" such as `__doc__`, `__file__`, `__name__`, etc.
@@ -5911,6 +5984,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let use_id = expr_ref.scoped_use_id(db, scope);
let place = place_from_bindings(db, use_def.bindings_at_use(use_id));
(place, Some(use_id))
}
}
@@ -5997,6 +6071,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// definition of this name visible to us (would be `LOAD_DEREF` at runtime.)
// Note that we skip the scope containing the use that we are resolving, since we
// already looked for the place there up above.
let mut nonlocal_union_builder = UnionBuilder::new(db);
let mut found_some_definition = false;
for (enclosing_scope_file_id, _) in self.index.ancestor_scopes(file_scope_id).skip(1) {
// Class scopes are not visible to nested scopes, and we need to handle global
// scope differently (because an unbound name there falls back to builtins), so
@@ -6082,21 +6158,25 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let Some(enclosing_place) = enclosing_place_table.place_by_expr(expr) else {
continue;
};
// Reads of "free" variables terminate at any enclosing scope that marks the
// variable `global`, whether or not that scope actually binds the variable. If we
// see a `global` declaration, stop walking scopes and proceed to the global
// handling below. (If we're walking from a prior/inner scope where this variable
// is `nonlocal`, then this is a semantic syntax error, but we don't enforce that
// here. See `infer_nonlocal_statement`.)
if enclosing_place.is_marked_global() {
// Reads of "free" variables can terminate at an enclosing scope that marks the
// variable `global` but doesn't actually bind it. In that case, stop walking
// scopes and proceed to the global handling below. (But note that it's a
// semantic syntax error for the `nonlocal` keyword to do this. See
// `infer_nonlocal_statement`.)
break;
}
// If the name is declared or bound in this scope, figure out its type. This might
// resolve the name and end the walk. But if the name is declared `nonlocal` in
// this scope, we'll keep walking enclosing scopes and union this type with the
// other types we find. (It's a semantic syntax error to declare a type for a
// `nonlocal` variable, but we don't enforce that here. See the
// `ast::Stmt::AnnAssign` handling in `SemanticIndexBuilder::visit_stmt`.)
if enclosing_place.is_bound() || enclosing_place.is_declared() {
// We can return early here, because the nearest function-like scope that
// defines a name must be the only source for the nonlocal reference (at
// runtime, it is the scope that creates the cell for our closure.) If the name
// isn't bound in that scope, we should get an unbound name, not continue
// falling back to other scopes / globals / builtins.
return place(
let local_place_and_qualifiers = place(
db,
enclosing_scope_id,
expr,
@@ -6105,6 +6185,25 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
.map_type(|ty| {
self.narrow_place_with_applicable_constraints(expr, ty, &constraint_keys)
});
// We could have Place::Unbound here, despite the checks above, for example if
// this scope contains a `del` statement but no binding or declaration.
if let Place::Type(type_, boundness) = local_place_and_qualifiers.place {
nonlocal_union_builder.add_in_place(type_);
// `ConsideredDefinitions::AllReachable` never returns PossiblyUnbound
debug_assert_eq!(boundness, Boundness::Bound);
found_some_definition = true;
}
if !enclosing_place.is_marked_nonlocal() {
// We've reached a function-like scope that marks this name bound or
// declared but doesn't mark it `nonlocal`. The name is therefore resolved,
// and we won't consider any scopes outside of this one.
return if found_some_definition {
Place::Type(nonlocal_union_builder.build(), Boundness::Bound).into()
} else {
Place::Unbound.into()
};
}
}
}
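The scope walk above encodes two Python rules: `del x` marks `x` as bound in its scope (so resolution can stop there even when the `del` never executes), and a `nonlocal` declaration keeps the walk going to enclosing scopes. Both are observable at runtime:

```python
x = "global"

def del_marks_local():
    try:
        value = x  # raises: the `del x` below makes `x` local to this scope
    except UnboundLocalError:
        value = "unbound"
    if False:
        del x  # never executes, but still marks `x` as local
    return value

def outer():
    x = "cell"
    def middle():
        nonlocal x  # resolution walks past scopes that mark the name nonlocal
        def inner():
            return x  # resolves to `outer`'s binding via `middle`'s declaration
        return inner
    return middle()()
```

Here `del_marks_local()` returns `"unbound"` — the read never falls back to the global `x` — and `outer()` returns `"cell"`, matching the "keep walking and union" path in the code above.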
@@ -6159,6 +6258,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
})
});
if let Some(ty) = place.place.ignore_possibly_unbound() {
self.check_deprecated(expr_ref, ty);
}
(place, constraint_keys)
}
@@ -6362,6 +6465,9 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
})
.inner_type();
self.check_deprecated(attr, resolved_type);
// Even if we can obtain the attribute type based on the assignments, we still perform default type inference
// (to report errors).
assigned_type.unwrap_or(resolved_type)
@@ -9288,6 +9394,15 @@ impl<'db> TypeInferenceBuilder<'db, '_> {
}
Type::unknown()
}
KnownInstanceType::Deprecated(_) => {
self.infer_type_expression(&subscript.slice);
if let Some(builder) = self.context.report_lint(&INVALID_TYPE_FORM, subscript) {
builder.into_diagnostic(format_args!(
"`warnings.deprecated` is not allowed in type expressions",
));
}
Type::unknown()
}
KnownInstanceType::TypeVar(_) => {
self.infer_type_expression(&subscript.slice);
todo_type!("TypeVar annotations")

View File

@@ -13,7 +13,7 @@
use std::{collections::HashMap, slice::Iter};
use itertools::EitherOrBoth;
use smallvec::{SmallVec, smallvec};
use smallvec::{SmallVec, smallvec_inline};
use super::{DynamicType, Type, TypeTransformer, TypeVarVariance, definition_expression_type};
use crate::semantic_index::definition::Definition;
@@ -34,7 +34,7 @@ pub struct CallableSignature<'db> {
impl<'db> CallableSignature<'db> {
pub(crate) fn single(signature: Signature<'db>) -> Self {
Self {
overloads: smallvec![signature],
overloads: smallvec_inline![signature],
}
}

View File

@@ -244,7 +244,7 @@ impl ruff_db::Db for CorpusDb {
#[salsa::db]
impl ty_python_semantic::Db for CorpusDb {
fn is_file_open(&self, file: File) -> bool {
fn should_check_file(&self, file: File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -98,7 +98,7 @@ impl<S> tracing_subscriber::layer::Filter<S> for LogLevelFilter {
meta: &tracing::Metadata<'_>,
_: &tracing_subscriber::layer::Context<'_, S>,
) -> bool {
let filter = if meta.target().starts_with("ty") {
let filter = if meta.target().starts_with("ty") || meta.target().starts_with("ruff") {
self.filter.trace_level()
} else {
tracing::Level::WARN

View File

@@ -8,7 +8,8 @@ use crate::system::AnySystemPath;
use lsp_server::ErrorCode;
use lsp_types::notification::DidCloseTextDocument;
use lsp_types::{DidCloseTextDocumentParams, TextDocumentIdentifier};
use ty_project::watch::ChangeEvent;
use ruff_db::Db as _;
use ty_project::Db as _;
pub(crate) struct DidCloseTextDocumentHandler;
@@ -38,11 +39,29 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
.close_document(&key)
.with_failure_code(ErrorCode::InternalError)?;
if let AnySystemPath::SystemVirtual(virtual_path) = key.path() {
session.apply_changes(
key.path(),
vec![ChangeEvent::DeletedVirtual(virtual_path.clone())],
);
let path = key.path();
let db = session.project_db_mut(path);
match path {
AnySystemPath::System(system_path) => {
if let Some(file) = db.files().try_system(db, system_path) {
db.project().close_file(db, file);
} else {
// This can only fail when the path is a directory or doesn't exist, but the
// file should exist in this branch: every close call is preceded by an open
// call, which ensures that the file is interned in the lookup table (`Files`).
tracing::warn!("Salsa file does not exist for {}", system_path);
}
}
AnySystemPath::SystemVirtual(virtual_path) => {
if let Some(virtual_file) = db.files().try_virtual_file(virtual_path) {
db.project().close_file(db, virtual_file.file());
virtual_file.close(db);
} else {
tracing::warn!("Salsa virtual file does not exist for {}", virtual_path);
}
}
}
if !session.global_settings().diagnostic_mode().is_workspace() {

View File

@@ -1,5 +1,9 @@
use lsp_types::notification::DidOpenTextDocument;
use lsp_types::{DidOpenTextDocumentParams, TextDocumentItem};
use ruff_db::Db as _;
use ruff_db::files::system_path_to_file;
use ty_project::Db as _;
use ty_project::watch::{ChangeEvent, CreatedKind};
use crate::TextDocument;
use crate::server::Result;
@@ -8,8 +12,6 @@ use crate::server::api::traits::{NotificationHandler, SyncNotificationHandler};
use crate::session::Session;
use crate::session::client::Client;
use crate::system::AnySystemPath;
use ruff_db::Db;
use ty_project::watch::ChangeEvent;
pub(crate) struct DidOpenTextDocumentHandler;
@@ -46,13 +48,38 @@ impl SyncNotificationHandler for DidOpenTextDocumentHandler {
let path = key.path();
// This is a "maybe" because the `File` might not have been interned yet, i.e., the
// `try_system` call returns `None`. That doesn't mean the file is new; the server
// just hasn't needed it yet.
let is_maybe_new_system_file = path.as_system().is_some_and(|system_path| {
let db = session.project_db(path);
db.files()
.try_system(db, system_path)
.is_none_or(|file| !file.exists(db))
});
match path {
AnySystemPath::System(system_path) => {
session.apply_changes(path, vec![ChangeEvent::Opened(system_path.clone())]);
let event = if is_maybe_new_system_file {
ChangeEvent::Created {
path: system_path.clone(),
kind: CreatedKind::File,
}
} else {
ChangeEvent::Opened(system_path.clone())
};
session.apply_changes(path, vec![event]);
let db = session.project_db_mut(path);
match system_path_to_file(db, system_path) {
Ok(file) => db.project().open_file(db, file),
Err(err) => tracing::warn!("Failed to open file {system_path}: {err}"),
}
}
AnySystemPath::SystemVirtual(virtual_path) => {
let db = session.project_db_mut(path);
db.files().virtual_file(db, virtual_path);
let virtual_file = db.files().virtual_file(db, virtual_path);
db.project().open_file(db, virtual_file.file());
}
}

View File

@@ -7,7 +7,6 @@ use lsp_types::{
WorkspaceFullDocumentDiagnosticReport,
};
use rustc_hash::FxHashMap;
use ty_project::CheckMode;
use crate::server::Result;
use crate::server::api::diagnostics::to_lsp_diagnostic;
@@ -33,6 +32,8 @@ impl BackgroundRequestHandler for WorkspaceDiagnosticRequestHandler {
let index = snapshot.index();
if !index.global_settings().diagnostic_mode().is_workspace() {
// VS Code sends us the workspace diagnostic request every 2 seconds, so these logs can
// be quite verbose.
tracing::trace!("Workspace diagnostics is disabled; returning empty report");
return Ok(WorkspaceDiagnosticReportResult::Report(
WorkspaceDiagnosticReport { items: vec![] },
@@ -42,7 +43,7 @@ impl BackgroundRequestHandler for WorkspaceDiagnosticRequestHandler {
let mut items = Vec::new();
for db in snapshot.projects() {
let diagnostics = db.check_with_mode(CheckMode::AllFiles);
let diagnostics = db.check();
// Group diagnostics by URL
let mut diagnostics_by_url: FxHashMap<Url, Vec<_>> = FxHashMap::default();

View File

@@ -103,8 +103,8 @@ impl Session {
let index = Arc::new(Index::new(global_options.into_settings()));
let mut workspaces = Workspaces::default();
for (url, options) in workspace_folders {
workspaces.register(url, options.into_settings())?;
for (url, workspace_options) in workspace_folders {
workspaces.register(url, workspace_options.into_settings())?;
}
Ok(Self {
@@ -347,7 +347,10 @@ impl Session {
});
let (root, db) = match project {
Ok(db) => (root, db),
Ok(mut db) => {
db.set_check_mode(workspace.settings.diagnostic_mode().into_check_mode());
(root, db)
}
Err(err) => {
tracing::error!(
"Failed to create project for `{root}`: {err:#}. Falling back to default settings"
@@ -747,17 +750,22 @@ impl DefaultProject {
pub(crate) fn get(&self, index: Option<&Arc<Index>>) -> &ProjectState {
self.0.get_or_init(|| {
tracing::info!("Initialize default project");
tracing::info!("Initializing the default project");
let system = LSPSystem::new(index.unwrap().clone());
let index = index.unwrap();
let system = LSPSystem::new(index.clone());
let metadata = ProjectMetadata::from_options(
Options::default(),
system.current_directory().to_path_buf(),
None,
)
.unwrap();
let mut db = ProjectDatabase::new(metadata, system).unwrap();
db.set_check_mode(index.global_settings().diagnostic_mode().into_check_mode());
ProjectState {
db: ProjectDatabase::new(metadata, system).unwrap(),
db,
untracked_files_with_pushed_diagnostics: Vec::new(),
}
})

View File

@@ -176,7 +176,7 @@ impl Index {
// may need revisiting in the future as we support more editors with notebook support.
if let DocumentKey::NotebookCell { cell_url, .. } = key {
if self.notebook_cells.remove(cell_url).is_none() {
tracing::warn!("Tried to remove a notebook cell that does not exist: {cell_url}",);
tracing::warn!("Tried to remove a notebook cell that does not exist: {cell_url}");
}
return Ok(());
}

View File

@@ -3,6 +3,7 @@ use ruff_db::system::SystemPathBuf;
use ruff_python_ast::PythonVersion;
use rustc_hash::FxHashMap;
use serde::Deserialize;
use ty_project::CheckMode;
use ty_project::metadata::Options;
use ty_project::metadata::options::ProjectOptionsOverrides;
use ty_project::metadata::value::{RangedValue, RelativePathBuf};
@@ -76,6 +77,13 @@ impl DiagnosticMode {
pub(crate) fn is_workspace(self) -> bool {
matches!(self, DiagnosticMode::Workspace)
}
pub(crate) fn into_check_mode(self) -> CheckMode {
match self {
DiagnosticMode::OpenFilesOnly => CheckMode::OpenFiles,
DiagnosticMode::Workspace => CheckMode::AllFiles,
}
}
}
impl ClientOptions {

View File

@@ -77,7 +77,7 @@ impl SourceDb for Db {
#[salsa::db]
impl SemanticDb for Db {
fn is_file_open(&self, file: File) -> bool {
fn should_check_file(&self, file: File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -4,7 +4,7 @@ use js_sys::{Error, JsString};
use ruff_db::Db as _;
use ruff_db::diagnostic::{self, DisplayDiagnosticConfig};
use ruff_db::files::{File, FileRange, system_path_to_file};
use ruff_db::source::{line_index, source_text};
use ruff_db::source::{SourceText, line_index, source_text};
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
use ruff_db::system::{
CaseSensitivity, DirectoryEntry, GlobError, MemoryFileSystem, Metadata, PatternError, System,
@@ -14,12 +14,15 @@ use ruff_notebook::Notebook;
use ruff_python_formatter::formatted_file;
use ruff_source_file::{LineIndex, OneIndexed, SourceLocation};
use ruff_text_size::{Ranged, TextSize};
use ty_ide::signature_help;
use ty_ide::{MarkupKind, goto_type_definition, hover, inlay_hints};
use ty_project::ProjectMetadata;
use ty_ide::{
MarkupKind, RangedValue, goto_declaration, goto_definition, goto_type_definition, hover,
inlay_hints,
};
use ty_ide::{NavigationTargets, signature_help};
use ty_project::metadata::options::Options;
use ty_project::metadata::value::ValueSource;
use ty_project::watch::{ChangeEvent, ChangedKind, CreatedKind, DeletedKind};
use ty_project::{CheckMode, ProjectMetadata};
use ty_project::{Db, ProjectDatabase};
use ty_python_semantic::Program;
use wasm_bindgen::prelude::*;
@@ -32,10 +35,31 @@ pub fn version() -> String {
.to_string()
}
/// Perform global constructor initialization.
#[cfg(target_family = "wasm")]
#[expect(unsafe_code)]
pub fn before_main() {
unsafe extern "C" {
fn __wasm_call_ctors();
}
// Salsa uses the `inventory` crate, which registers global constructors that may need to be
// called explicitly on WASM. See <https://github.com/dtolnay/inventory/blob/master/src/lib.rs#L105>
// for details.
unsafe {
__wasm_call_ctors();
}
}
#[cfg(not(target_family = "wasm"))]
pub fn before_main() {}
#[wasm_bindgen(start)]
pub fn run() {
use log::Level;
before_main();
ruff_db::set_program_version(version()).unwrap();
// When the `console_error_panic_hook` feature is enabled, we can call the
@@ -76,7 +100,11 @@ impl Workspace {
let project = ProjectMetadata::from_options(options, SystemPathBuf::from(root), None)
.map_err(into_error)?;
let db = ProjectDatabase::new(project, system.clone()).map_err(into_error)?;
let mut db = ProjectDatabase::new(project, system.clone()).map_err(into_error)?;
// By default, the project database checks all files, but the playground should
// only check the files that are open in the editor.
db.set_check_mode(CheckMode::OpenFiles);
Ok(Self {
db,
@@ -242,32 +270,61 @@ impl Workspace {
return Ok(Vec::new());
};
let source_range = Range::from_text_range(
targets.file_range().range(),
&index,
Ok(map_targets_to_links(
&self.db,
targets,
&source,
&index,
self.position_encoding,
);
))
}
let links: Vec<_> = targets
.into_iter()
.map(|target| LocationLink {
path: target.file().path(&self.db).to_string(),
full_range: Range::from_file_range(
&self.db,
FileRange::new(target.file(), target.full_range()),
self.position_encoding,
),
selection_range: Some(Range::from_file_range(
&self.db,
FileRange::new(target.file(), target.focus_range()),
self.position_encoding,
)),
origin_selection_range: Some(source_range),
})
.collect();
#[wasm_bindgen(js_name = "gotoDeclaration")]
pub fn goto_declaration(
&self,
file_id: &FileHandle,
position: Position,
) -> Result<Vec<LocationLink>, Error> {
let source = source_text(&self.db, file_id.file);
let index = line_index(&self.db, file_id.file);
Ok(links)
let offset = position.to_text_size(&source, &index, self.position_encoding)?;
let Some(targets) = goto_declaration(&self.db, file_id.file, offset) else {
return Ok(Vec::new());
};
Ok(map_targets_to_links(
&self.db,
targets,
&source,
&index,
self.position_encoding,
))
}
#[wasm_bindgen(js_name = "gotoDefinition")]
pub fn goto_definition(
&self,
file_id: &FileHandle,
position: Position,
) -> Result<Vec<LocationLink>, Error> {
let source = source_text(&self.db, file_id.file);
let index = line_index(&self.db, file_id.file);
let offset = position.to_text_size(&source, &index, self.position_encoding)?;
let Some(targets) = goto_definition(&self.db, file_id.file, offset) else {
return Ok(Vec::new());
};
Ok(map_targets_to_links(
&self.db,
targets,
&source,
&index,
self.position_encoding,
))
}
#[wasm_bindgen]
@@ -439,6 +496,39 @@ pub(crate) fn into_error<E: std::fmt::Display>(err: E) -> Error {
Error::new(&err.to_string())
}
fn map_targets_to_links(
db: &dyn Db,
targets: RangedValue<NavigationTargets>,
source: &SourceText,
index: &LineIndex,
position_encoding: PositionEncoding,
) -> Vec<LocationLink> {
let source_range = Range::from_text_range(
targets.file_range().range(),
index,
source,
position_encoding,
);
targets
.into_iter()
.map(|target| LocationLink {
path: target.file().path(db).to_string(),
full_range: Range::from_file_range(
db,
FileRange::new(target.file(), target.full_range()),
position_encoding,
),
selection_range: Some(Range::from_file_range(
db,
FileRange::new(target.file(), target.focus_range()),
position_encoding,
)),
origin_selection_range: Some(source_range),
})
.collect()
}
#[derive(Debug, Eq, PartialEq)]
#[wasm_bindgen(inspectable)]
pub struct FileHandle {

View File

@@ -5,6 +5,8 @@ use wasm_bindgen_test::wasm_bindgen_test;
#[wasm_bindgen_test]
fn check() {
ty_wasm::before_main();
let mut workspace = Workspace::new(
"/",
PositionEncoding::Utf32,

View File

@@ -30,7 +30,7 @@ ty_python_semantic = { path = "../crates/ty_python_semantic" }
ty_vendored = { path = "../crates/ty_vendored" }
libfuzzer-sys = { git = "https://github.com/rust-fuzz/libfuzzer", default-features = false }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "fc00eba89e5dcaa5edba51c41aa5f309b5cb126b" }
salsa = { git = "https://github.com/salsa-rs/salsa", rev = "dba66f1a37acca014c2402f231ed5b361bd7d8fe" }
similar = { version = "2.5.0" }
tracing = { version = "0.1.40" }

View File

@@ -82,7 +82,7 @@ impl DbWithTestSystem for TestDb {
#[salsa::db]
impl SemanticDb for TestDb {
fn is_file_open(&self, file: File) -> bool {
fn should_check_file(&self, file: File) -> bool {
!file.path(self).is_vendored_path()
}

View File

@@ -146,6 +146,8 @@ interface PlaygroundServerProps {
class PlaygroundServer
implements
languages.TypeDefinitionProvider,
languages.DeclarationProvider,
languages.DefinitionProvider,
editor.ICodeEditorOpener,
languages.HoverProvider,
languages.InlayHintsProvider,
@@ -156,6 +158,8 @@ class PlaygroundServer
languages.SignatureHelpProvider
{
private typeDefinitionProviderDisposable: IDisposable;
private declarationProviderDisposable: IDisposable;
private definitionProviderDisposable: IDisposable;
private editorOpenerDisposable: IDisposable;
private hoverDisposable: IDisposable;
private inlayHintsDisposable: IDisposable;
@@ -171,6 +175,10 @@ class PlaygroundServer
) {
this.typeDefinitionProviderDisposable =
monaco.languages.registerTypeDefinitionProvider("python", this);
this.declarationProviderDisposable =
monaco.languages.registerDeclarationProvider("python", this);
this.definitionProviderDisposable =
monaco.languages.registerDefinitionProvider("python", this);
this.hoverDisposable = monaco.languages.registerHoverProvider(
"python",
this,
@@ -517,29 +525,61 @@ class PlaygroundServer
new TyPosition(position.lineNumber, position.column),
);
return (
links
.map((link) => {
const targetSelection =
link.selection_range == null
? undefined
: tyRangeToMonacoRange(link.selection_range);
return mapNavigationTargets(links);
}
const originSelection =
link.origin_selection_range == null
? undefined
: tyRangeToMonacoRange(link.origin_selection_range);
provideDeclaration(
model: editor.ITextModel,
position: Position,
// eslint-disable-next-line @typescript-eslint/no-unused-vars
_: CancellationToken,
): languages.ProviderResult<languages.Definition | languages.LocationLink[]> {
const workspace = this.props.workspace;
return {
uri: Uri.parse(link.path),
range: tyRangeToMonacoRange(link.full_range),
targetSelectionRange: targetSelection,
originSelectionRange: originSelection,
} as languages.LocationLink;
})
// Filter out vendored files because they aren't open in the editor.
.filter((link) => link.uri.scheme !== "vendored")
const selectedFile = this.props.files.selected;
if (selectedFile == null) {
return;
}
const selectedHandle = this.props.files.handles[selectedFile];
if (selectedHandle == null) {
return;
}
const links = workspace.gotoDeclaration(
selectedHandle,
new TyPosition(position.lineNumber, position.column),
);
return mapNavigationTargets(links);
}
provideDefinition(
model: editor.ITextModel,
position: Position,
// eslint-disable-next-line @typescript-eslint/no-unused-vars
_: CancellationToken,
): languages.ProviderResult<languages.Definition | languages.LocationLink[]> {
const workspace = this.props.workspace;
const selectedFile = this.props.files.selected;
if (selectedFile == null) {
return;
}
const selectedHandle = this.props.files.handles[selectedFile];
if (selectedHandle == null) {
return;
}
const links = workspace.gotoDefinition(
selectedHandle,
new TyPosition(position.lineNumber, position.column),
);
return mapNavigationTargets(links);
}
openCodeEditor(
@@ -625,6 +665,8 @@ class PlaygroundServer
this.hoverDisposable.dispose();
this.editorOpenerDisposable.dispose();
this.typeDefinitionProviderDisposable.dispose();
this.declarationProviderDisposable.dispose();
this.definitionProviderDisposable.dispose();
this.inlayHintsDisposable.dispose();
this.formatDisposable.dispose();
this.rangeSemanticTokensDisposable.dispose();
@@ -683,6 +725,29 @@ function generateMonacoTokens(
return { data: Uint32Array.from(result) };
}
function mapNavigationTargets(links: any[]): languages.LocationLink[] {
return links
.map((link) => {
const targetSelection =
link.selection_range == null
? undefined
: tyRangeToMonacoRange(link.selection_range);
const originSelection =
link.origin_selection_range == null
? undefined
: tyRangeToMonacoRange(link.origin_selection_range);
return {
uri: Uri.parse(link.path),
range: tyRangeToMonacoRange(link.full_range),
targetSelectionRange: targetSelection,
originSelectionRange: originSelection,
} as languages.LocationLink;
})
.filter((link) => link.uri.scheme !== "vendored");
}
function mapCompletionKind(kind: CompletionKind): CompletionItemKind {
switch (kind) {
case CompletionKind.Text:

ty.schema.json generated
View File

@@ -331,6 +331,16 @@
}
]
},
"deprecated": {
"title": "detects uses of deprecated items",
"description": "## What it does\nChecks for uses of deprecated items\n\n## Why is this bad?\nDeprecated items should no longer be used.\n\n## Examples\n```python\n@warnings.deprecated(\"use new_func instead\")\ndef old_func(): ...\n\nold_func() # emits [deprecated] diagnostic\n```",
"default": "warn",
"oneOf": [
{
"$ref": "#/definitions/Level"
}
]
},
"division-by-zero": {
"title": "detects division by zero",
"description": "## What it does\nIt detects division by zero.\n\n## Why is this bad?\nDividing by zero raises a `ZeroDivisionError` at runtime.\n\n## Examples\n```python\n5 / 0\n```",