Compare commits

...

19 Commits

Author SHA1 Message Date
Micha Reiser
56cb92f486 Map directly to UseId 2025-07-12 12:58:23 +02:00
Micha Reiser
d6b1081898 [ty] Move ScopedUseId to UseDefMap 2025-07-12 12:43:11 +02:00
GiGaGon
7154b64248 [pylint] Make example error out-of-the-box (PLE1507) (#19288)
## Summary

Part of #18972

This PR makes [invalid-envvar-value
(PLE1507)](https://docs.astral.sh/ruff/rules/invalid-envvar-value/#invalid-envvar-value-ple1507)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/a46a9bca-edd5-4474-b20d-e6b6d87291ca)
```py
os.getenv(1)
```

[New example](https://play.ruff.rs/8348d32d-71fa-422c-b228-e2bc343765b1)
```py
import os

os.getenv(1)
```

The "Use instead" section was also updated similarly.

## Test Plan

N/A, no functionality/tests affected
2025-07-11 16:08:47 -05:00
GiGaGon
6d01c487a5 [pyupgrade] Make example error out-of-the-box (UP041) (#19292)

## Summary

Part of #18972

This PR makes [timeout-error-alias
(UP041)](https://docs.astral.sh/ruff/rules/timeout-error-alias/#timeout-error-alias-up041)'s
example error out-of-the-box.

[Old example](https://play.ruff.rs/87e20352-d80a-46ec-98a2-6f6ea700438b)
```py
raise asyncio.TimeoutError
```

[New example](https://play.ruff.rs/d3b95557-46a2-4856-bd71-30d5f3f5ca44)
```py
import asyncio

raise asyncio.TimeoutError
```
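
For context, the rule's suggested fix is the builtin alias (`asyncio.TimeoutError` is just an alias of it since Python 3.11), so the corresponding "Use instead" snippet is presumably just:

```py
raise TimeoutError
```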

## Test Plan

N/A, no functionality/tests affected
2025-07-11 16:08:20 -05:00
GiGaGon
6660b11422 [pyupgrade] Make example error out-of-the-box (UP023) (#19291)
## Summary

Part of #18972

This PR makes [deprecated-c-element-tree
(UP023)](https://docs.astral.sh/ruff/rules/deprecated-c-element-tree/#deprecated-c-element-tree-up023)'s
example error out-of-the-box. I have no clue why the `import
xml.etree.cElementTree` and `from xml.etree import cElementTree` cases
are specifically carved out when they do not have an `as ...` alias, but the
tests explicitly call this out, and that's how it is in `pyupgrade`'s
source as well.


b5c5f710fc/crates/ruff_linter/resources/test/fixtures/pyupgrade/UP023.py (L23-L31)
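
In code, the carve-out looks roughly like this (a sketch based on the linked fixture):

```py
# Not flagged: bare imports without an `as ...` alias are carved out.
import xml.etree.cElementTree
from xml.etree import cElementTree

# Flagged: the aliased form used in the new example.
from xml.etree import cElementTree as ET
```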

[Old example](https://play.ruff.rs/632b8ce1-393d-45e5-9504-5444ae71a0d8)
```py
from xml.etree import cElementTree
```

[New example](https://play.ruff.rs/fef4d378-8c54-41b2-8778-2d02bcbbd7d3)
```py
from xml.etree import cElementTree as ET
```

The "Use instead" section was also updated similarly.

## Test Plan

N/A, no functionality/tests affected
2025-07-11 16:07:34 -05:00
Brent Westbrook
b5c5f710fc Render Azure, JSON, and JSON lines output with the new diagnostics (#19133)
## Summary

This was originally stacked on #19129, but some of the changes I made
for JSON also impacted the Azure format, so I went ahead and combined
them. The main changes here are:

- Implementing `FileResolver` for Ruff's `EmitterContext`
- Adding `FileResolver::notebook_index` and `FileResolver::is_notebook`
methods
- Adding a `DisplayDiagnostics` (with an "s") type for rendering a group
of diagnostics at once
- Adding `Azure`, `Json`, and `JsonLines` as new `DiagnosticFormat`s

I tried a couple of alternatives to the `FileResolver::notebook` methods
like passing down the `NotebookIndex` separately and trying to reparse a
`Notebook` from Ruff's `SourceFile`. The latter seemed promising, but
the `SourceFile` only stores the concatenated plain text of the
notebook, not the re-parsable JSON. I guess the current version is just
a variation on passing the `NotebookIndex`, but at least we can reuse
the existing `resolver` argument. I think a lot of this can be cleaned
up once Ruff has its own actual file resolver.
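
For reference, call sites swap the per-format `Emitter`s for the new type roughly like this (adapted from the `Printer` changes further down):

```rust
let config = DisplayDiagnosticConfig::default()
    .format(DiagnosticFormat::Json)
    .preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
```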

As suggested, I also tried deleting the corresponding `Emitter` files in
`ruff_linter`, but it doesn't look like git was able to follow this as a
rename. It did, however, track that the tests were moved, so the
snapshots should be easy to review.

## Test Plan

Existing Ruff tests ported to tests in `ruff_db`. I think some other
existing ruff tests also cover parts of this refactor.

---------

Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-11 15:04:46 -04:00
Dan Parizher
ee88abf77c [flake8_django] Fix DJ008 false positive for abstract models with type-annotated abstract field (#19221)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-11 16:50:59 +00:00
Jack O'Connor
78bd73f25a [ty] add support for nonlocal statements 2025-07-11 09:44:54 -07:00
Dan Parizher
110765154f [flake8-bugbear] Fix B017 false negatives for keyword exception arguments (#19217)
Co-authored-by: Micha Reiser <micha@reiser.io>
2025-07-11 16:43:09 +00:00
Dan Parizher
30ee44770d Fix I002 import insertion after docstring with multiple string statements (#19222) 2025-07-11 18:35:41 +02:00
Dhruv Manilawala
fd69533fe5 [ty] Make sure to always respond to client requests (#19277)
## Summary

This PR fixes a bug that didn't return a response to the client if the
document snapshotting failed.

This is resolved by making sure that the server always creates the
document snapshot and embeds any failures inside the snapshot.
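
A minimal sketch of the shape of the fix, with hypothetical names (not the actual ty API):

```rust
// Hypothetical illustration only: the failure travels inside the snapshot
// instead of aborting early, so the handler can always answer the request.
struct DocumentSnapshot {
    document: Result<String, String>, // contents, or the snapshotting error
}

fn handle(snapshot: DocumentSnapshot) -> Result<String, String> {
    match snapshot.document {
        Ok(contents) => Ok(format!("response for {} bytes", contents.len())),
        // Previously this case returned early and the client never got a reply.
        Err(error) => Err(error),
    }
}
```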

Closes: astral-sh/ty#798

## Test Plan

Using the test case as described in the linked issue:



https://github.com/user-attachments/assets/f32833f8-03e5-4641-8c7f-2a536fe2e270
2025-07-11 14:27:27 +00:00
Zanie Blue
39c6364545 Only build tests in the msrv job (#19261)
Alternative to https://github.com/astral-sh/ruff/pull/19260
2025-07-11 09:16:12 -05:00
Andrew Gallant
100d765ddf [ty] Document path separator usage in VendoredFileSystem
Ref https://github.com/astral-sh/ruff/pull/19266#discussion_r2198530383
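
A small illustration of the documented behavior (the path is illustrative; `fs` is a `VendoredFileSystem`):

```rust
// `\`-separated input is transparently normalized to `/`, so both of
// these refer to the same vendored entry.
let a = fs.read_to_string(VendoredPath::new(r"stdlib\functools.pyi"));
let b = fs.read_to_string(VendoredPath::new("stdlib/functools.pyi"));
assert_eq!(a.unwrap(), b.unwrap());
```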
2025-07-11 10:06:35 -04:00
Andrew Gallant
6ea231e458 [ty] Add debug output with completion request timings
I had this in a branch somewhere but forgot to get it
merged. So I'm sneaking it in here.

This is useful for very ad hoc performance testing.
2025-07-11 10:06:35 -04:00
Alex Waygood
c9df4ddf6a [ty] Add completions for submodule imports
While we did previously support submodule completions via our
`all_members` API, that only works when submodules are attributes of
their parent module. For example, `os.path`. But that didn't work when
the submodule was not an attribute of its parent. For example,
`http.client`. To make the latter work, we read the directory of the
parent module to discover its submodules.
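
Concretely, in Python (whether the attribute exists depends on what has already been imported elsewhere):

```py
import os
os.path  # works: `os` itself imports the platform path module as `os.path`

import http
http.client  # AttributeError: `http` does not import its submodule, so
             # `client` is not an attribute until `import http.client` runs
```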
2025-07-11 10:06:35 -04:00
Andrew Gallant
948463aafa [ty] Move SystemOrVendoredPathRef
This moves the type and adds a few methods so that it can
be used elsewhere.
2025-07-11 10:06:35 -04:00
Andrew Gallant
729fa12575 [ty] Add "readdir" for vendored file systems
This is mostly just holding a zip file in the right way
to simulate reading a directory. We want this to be able
to discover sub-modules for completions.
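
For example, with the new API from the diff below (output format follows the mock-typeshed tests):

```rust
for entry in fs.read_directory(VendoredPath::new("stdlib")) {
    // Directories keep a trailing slash, e.g. "vendored://stdlib/asyncio/";
    // files render like "vendored://stdlib/functools.pyi".
    let kind = if entry.file_type().is_file() { "file" } else { "directory" };
    println!("{} ({kind})", entry.path().to_string());
}
```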
2025-07-11 10:06:35 -04:00
Brent Westbrook
f14ee9edd5 Use structs for JSON serialization (#19270)
## Summary

See https://github.com/astral-sh/ruff/pull/19133#discussion_r2198413586
for recent discussion. This PR moves to using structs for the types in
our JSON output format instead of the `json!` macro.

I didn't rename any of the `message` references because that should be
handled when rebasing #19133 onto this.

My plan for handling the `preview` behavior with the new diagnostics is
to use a wrapper enum. Something like:

```rust
#[derive(Serialize)]
#[serde(untagged)]
pub(crate) enum JsonDiagnostic<'a> {
    Old(OldJsonDiagnostic<'a>),
}

#[derive(Serialize)]
pub(crate) struct OldJsonDiagnostic<'a> {
    // ...
}
```

Initially I thought I could use a `&dyn Serialize` for the affected
fields, but I see that `Serialize` isn't dyn-compatible in testing this
now.
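
For context, that's because `Serialize::serialize` is generic over its `Serializer` parameter, and traits with generic methods can't be used as trait objects; a minimal reproduction:

```rust
use serde::Serialize;

// error[E0038]: the trait `Serialize` is not dyn compatible
fn render(_value: &dyn Serialize) {}
```

(`erased-serde` is the usual workaround if we ever need dynamic dispatch here.)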

## Test Plan

Existing tests. One quirk of the new types is that their fields are in
alphabetical order: `serde_json`'s `Map` is backed by a `BTreeMap` (unless
the `preserve_order` feature is enabled), so `json!` emits keys sorted
alphabetically. The tests were failing before I sorted the struct fields to
match.

## Other formats

It looks like the `rdjson`, `sarif`, and `gitlab` formats also use
`json!`, so if we decide to merge this, I can do something similar for
those before moving them to the new diagnostic format.
2025-07-11 09:37:44 -04:00
Alex Waygood
a67630f907 [ty] Filter out private type aliases from stub files when offering autocomplete suggestions (#19282) 2025-07-11 13:20:16 +00:00
92 changed files with 2751 additions and 989 deletions

View File

@@ -407,20 +407,11 @@ jobs:
        run: rustup default "${MSRV}"
      - name: "Install mold"
        uses: rui314/setup-mold@85c79d00377f0d32cdbae595a46de6f7c2fa6599 # v1
      - name: "Install cargo nextest"
        uses: taiki-e/install-action@f3a27926ea13d7be3ee2f4cbb925883cf9442b56 # v2.56.7
        with:
          tool: cargo-nextest
      - name: "Install cargo insta"
        uses: taiki-e/install-action@f3a27926ea13d7be3ee2f4cbb925883cf9442b56 # v2.56.7
        with:
          tool: cargo-insta
      - name: "Run tests"
      - name: "Build tests"
        shell: bash
        env:
          NEXTEST_PROFILE: "ci"
          MSRV: ${{ steps.msrv.outputs.value }}
        run: cargo "+${MSRV}" insta test --all-features --unreferenced reject --test-runner nextest
        run: cargo "+${MSRV}" test --no-run --all-features
  cargo-fuzz-build:
    name: "cargo fuzz build"

Cargo.lock generated
View File

@@ -2852,6 +2852,7 @@ dependencies = [
"salsa",
"schemars",
"serde",
"serde_json",
"tempfile",
"thiserror 2.0.12",
"tracing",
@@ -4272,6 +4273,7 @@ dependencies = [
"serde",
"serde_json",
"shellexpand",
"thiserror 2.0.12",
"tracing",
"tracing-subscriber",
"ty_ide",

View File

@@ -439,7 +439,7 @@ pub fn check(args: CheckCommand, global_options: GlobalConfigArgs) -> Result<Exi
if cli.statistics {
printer.write_statistics(&diagnostics, &mut summary_writer)?;
} else {
printer.write_once(&diagnostics, &mut summary_writer)?;
printer.write_once(&diagnostics, &mut summary_writer, preview)?;
}
if !cli.exit_zero {

View File

@@ -9,13 +9,14 @@ use itertools::{Itertools, iterate};
use ruff_linter::linter::FixTable;
use serde::Serialize;
use ruff_db::diagnostic::{Diagnostic, SecondaryCode};
use ruff_db::diagnostic::{
Diagnostic, DiagnosticFormat, DisplayDiagnosticConfig, DisplayDiagnostics, SecondaryCode,
};
use ruff_linter::fs::relativize_path;
use ruff_linter::logging::LogLevel;
use ruff_linter::message::{
AzureEmitter, Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter,
JsonEmitter, JsonLinesEmitter, JunitEmitter, PylintEmitter, RdjsonEmitter, SarifEmitter,
TextEmitter,
Emitter, EmitterContext, GithubEmitter, GitlabEmitter, GroupedEmitter, JunitEmitter,
PylintEmitter, RdjsonEmitter, SarifEmitter, TextEmitter,
};
use ruff_linter::notify_user;
use ruff_linter::settings::flags::{self};
@@ -202,6 +203,7 @@ impl Printer {
&self,
diagnostics: &Diagnostics,
writer: &mut dyn Write,
preview: bool,
) -> Result<()> {
if matches!(self.log_level, LogLevel::Silent) {
return Ok(());
@@ -229,13 +231,21 @@ impl Printer {
match self.format {
OutputFormat::Json => {
JsonEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Json)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Rdjson => {
RdjsonEmitter.emit(writer, &diagnostics.inner, &context)?;
}
OutputFormat::JsonLines => {
JsonLinesEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::JsonLines)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Junit => {
JunitEmitter.emit(writer, &diagnostics.inner, &context)?;
@@ -283,7 +293,11 @@ impl Printer {
PylintEmitter.emit(writer, &diagnostics.inner, &context)?;
}
OutputFormat::Azure => {
AzureEmitter.emit(writer, &diagnostics.inner, &context)?;
let config = DisplayDiagnosticConfig::default()
.format(DiagnosticFormat::Azure)
.preview(preview);
let value = DisplayDiagnostics::new(&context, &config, &diagnostics.inner);
write!(writer, "{value}")?;
}
OutputFormat::Sarif => {
SarifEmitter.emit(writer, &diagnostics.inner, &context)?;

View File

@@ -38,6 +38,7 @@ rustc-hash = { workspace = true }
salsa = { workspace = true }
schemars = { workspace = true, optional = true }
serde = { workspace = true, optional = true }
serde_json = { workspace = true, optional = true }
thiserror = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true, optional = true }
@@ -56,6 +57,6 @@ tempfile = { workspace = true }
[features]
cache = ["ruff_cache"]
os = ["ignore", "dep:etcetera"]
serde = ["dep:serde", "camino/serde1"]
serde = ["camino/serde1", "dep:serde", "dep:serde_json", "ruff_diagnostics/serde"]
# Exposes testing utilities.
testing = ["tracing-subscriber"]

View File

@@ -1,13 +1,12 @@
use std::{fmt::Formatter, sync::Arc};
use render::{FileResolver, Input};
use ruff_diagnostics::Fix;
use ruff_source_file::{LineColumn, SourceCode, SourceFile};
use ruff_annotate_snippets::Level as AnnotateLevel;
use ruff_text_size::{Ranged, TextRange, TextSize};
pub use self::render::DisplayDiagnostic;
pub use self::render::{DisplayDiagnostic, DisplayDiagnostics, FileResolver, Input};
use crate::{Db, files::File};
mod render;
@@ -380,7 +379,7 @@ impl Diagnostic {
}
/// Returns the URL for the rule documentation, if it exists.
pub fn to_url(&self) -> Option<String> {
pub fn to_ruff_url(&self) -> Option<String> {
if self.is_invalid_syntax() {
None
} else {
@@ -432,8 +431,9 @@ impl Diagnostic {
/// Returns the [`SourceFile`] which the message belongs to.
///
/// Panics if the diagnostic has no primary span, or if its file is not a `SourceFile`.
pub fn expect_ruff_source_file(&self) -> SourceFile {
self.expect_primary_span().expect_ruff_file().clone()
pub fn expect_ruff_source_file(&self) -> &SourceFile {
self.ruff_source_file()
.expect("Expected a ruff source file")
}
/// Returns the [`TextRange`] for the diagnostic.
@@ -1174,6 +1174,12 @@ pub struct DisplayDiagnosticConfig {
/// here for now as the most "sensible" place for it to live until
/// we had more concrete use cases. ---AG
context: usize,
/// Whether to use preview formatting for Ruff diagnostics.
#[allow(
dead_code,
reason = "This is currently only used for JSON but will be needed soon for other formats"
)]
preview: bool,
}
impl DisplayDiagnosticConfig {
@@ -1194,6 +1200,14 @@ impl DisplayDiagnosticConfig {
..self
}
}
/// Whether to enable preview behavior or not.
pub fn preview(self, yes: bool) -> DisplayDiagnosticConfig {
DisplayDiagnosticConfig {
preview: yes,
..self
}
}
}
impl Default for DisplayDiagnosticConfig {
@@ -1202,6 +1216,7 @@ impl Default for DisplayDiagnosticConfig {
format: DiagnosticFormat::default(),
color: false,
context: 2,
preview: false,
}
}
}
@@ -1229,6 +1244,21 @@ pub enum DiagnosticFormat {
///
/// This may use color when printing to a `tty`.
Concise,
/// Print diagnostics in the [Azure Pipelines] format.
///
/// [Azure Pipelines]: https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash#logissue-log-an-error-or-warning
Azure,
/// Print diagnostics in JSON format.
///
/// Unlike `json-lines`, this prints all of the diagnostics as a JSON array.
#[cfg(feature = "serde")]
Json,
/// Print diagnostics in JSON format, one per line.
///
/// This will print each diagnostic as a separate JSON object on its own line. See the `json`
/// format for an array of all diagnostics. See <https://jsonlines.org/> for more details.
#[cfg(feature = "serde")]
JsonLines,
}
/// A representation of the kinds of messages inside a diagnostic.

View File

@@ -4,6 +4,7 @@ use ruff_annotate_snippets::{
Annotation as AnnotateAnnotation, Level as AnnotateLevel, Message as AnnotateMessage,
Renderer as AnnotateRenderer, Snippet as AnnotateSnippet,
};
use ruff_notebook::{Notebook, NotebookIndex};
use ruff_source_file::{LineIndex, OneIndexed, SourceCode};
use ruff_text_size::{TextRange, TextSize};
@@ -17,9 +18,17 @@ use crate::{
use super::{
Annotation, Diagnostic, DiagnosticFormat, DiagnosticSource, DisplayDiagnosticConfig, Severity,
SubDiagnostic,
SubDiagnostic, UnifiedFile,
};
use azure::AzureRenderer;
mod azure;
#[cfg(feature = "serde")]
mod json;
#[cfg(feature = "serde")]
mod json_lines;
/// A type that implements `std::fmt::Display` for diagnostic rendering.
///
/// It is created via [`Diagnostic::display`].
@@ -34,7 +43,6 @@ use super::{
pub struct DisplayDiagnostic<'a> {
config: &'a DisplayDiagnosticConfig,
resolver: &'a dyn FileResolver,
annotate_renderer: AnnotateRenderer,
diag: &'a Diagnostic,
}
@@ -44,16 +52,9 @@ impl<'a> DisplayDiagnostic<'a> {
config: &'a DisplayDiagnosticConfig,
diag: &'a Diagnostic,
) -> DisplayDiagnostic<'a> {
let annotate_renderer = if config.color {
AnnotateRenderer::styled()
} else {
AnnotateRenderer::plain()
};
DisplayDiagnostic {
config,
resolver,
annotate_renderer,
diag,
}
}
@@ -61,68 +62,131 @@ impl<'a> DisplayDiagnostic<'a> {
impl std::fmt::Display for DisplayDiagnostic<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
let stylesheet = if self.config.color {
DiagnosticStylesheet::styled()
} else {
DiagnosticStylesheet::plain()
};
DisplayDiagnostics::new(self.resolver, self.config, std::slice::from_ref(self.diag)).fmt(f)
}
}
if matches!(self.config.format, DiagnosticFormat::Concise) {
let (severity, severity_style) = match self.diag.severity() {
Severity::Info => ("info", stylesheet.info),
Severity::Warning => ("warning", stylesheet.warning),
Severity::Error => ("error", stylesheet.error),
Severity::Fatal => ("fatal", stylesheet.error),
};
/// A type that implements `std::fmt::Display` for rendering a collection of diagnostics.
///
/// It is intended for collections of diagnostics that need to be serialized together, as is the
/// case for JSON, for example.
///
/// See [`DisplayDiagnostic`] for rendering individual `Diagnostic`s and details about the lifetime
/// constraints.
pub struct DisplayDiagnostics<'a> {
config: &'a DisplayDiagnosticConfig,
resolver: &'a dyn FileResolver,
diagnostics: &'a [Diagnostic],
}
write!(
f,
"{severity}[{id}]",
severity = fmt_styled(severity, severity_style),
id = fmt_styled(self.diag.id(), stylesheet.emphasis)
)?;
impl<'a> DisplayDiagnostics<'a> {
pub fn new(
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
diagnostics: &'a [Diagnostic],
) -> DisplayDiagnostics<'a> {
DisplayDiagnostics {
config,
resolver,
diagnostics,
}
}
}
if let Some(span) = self.diag.primary_span() {
write!(
f,
" {path}",
path = fmt_styled(span.file().path(self.resolver), stylesheet.emphasis)
)?;
if let Some(range) = span.range() {
let diagnostic_source = span.file().diagnostic_source(self.resolver);
let start = diagnostic_source
.as_source_code()
.line_column(range.start());
impl std::fmt::Display for DisplayDiagnostics<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match self.config.format {
DiagnosticFormat::Concise => {
let stylesheet = if self.config.color {
DiagnosticStylesheet::styled()
} else {
DiagnosticStylesheet::plain()
};
for diag in self.diagnostics {
let (severity, severity_style) = match diag.severity() {
Severity::Info => ("info", stylesheet.info),
Severity::Warning => ("warning", stylesheet.warning),
Severity::Error => ("error", stylesheet.error),
Severity::Fatal => ("fatal", stylesheet.error),
};
write!(
f,
":{line}:{col}",
line = fmt_styled(start.line, stylesheet.emphasis),
col = fmt_styled(start.column, stylesheet.emphasis),
"{severity}[{id}]",
severity = fmt_styled(severity, severity_style),
id = fmt_styled(diag.id(), stylesheet.emphasis)
)?;
if let Some(span) = diag.primary_span() {
write!(
f,
" {path}",
path = fmt_styled(span.file().path(self.resolver), stylesheet.emphasis)
)?;
if let Some(range) = span.range() {
let diagnostic_source = span.file().diagnostic_source(self.resolver);
let start = diagnostic_source
.as_source_code()
.line_column(range.start());
write!(
f,
":{line}:{col}",
line = fmt_styled(start.line, stylesheet.emphasis),
col = fmt_styled(start.column, stylesheet.emphasis),
)?;
}
write!(f, ":")?;
}
writeln!(f, " {message}", message = diag.concise_message())?;
}
write!(f, ":")?;
}
return writeln!(f, " {message}", message = self.diag.concise_message());
DiagnosticFormat::Full => {
let stylesheet = if self.config.color {
DiagnosticStylesheet::styled()
} else {
DiagnosticStylesheet::plain()
};
let mut renderer = if self.config.color {
AnnotateRenderer::styled()
} else {
AnnotateRenderer::plain()
};
renderer = renderer
.error(stylesheet.error)
.warning(stylesheet.warning)
.info(stylesheet.info)
.note(stylesheet.note)
.help(stylesheet.help)
.line_no(stylesheet.line_no)
.emphasis(stylesheet.emphasis)
.none(stylesheet.none);
for diag in self.diagnostics {
let resolved = Resolved::new(self.resolver, diag);
let renderable = resolved.to_renderable(self.config.context);
for diag in renderable.diagnostics.iter() {
writeln!(f, "{}", renderer.render(diag.to_annotate()))?;
}
writeln!(f)?;
}
}
DiagnosticFormat::Azure => {
AzureRenderer::new(self.resolver).render(f, self.diagnostics)?;
}
#[cfg(feature = "serde")]
DiagnosticFormat::Json => {
json::JsonRenderer::new(self.resolver, self.config).render(f, self.diagnostics)?;
}
#[cfg(feature = "serde")]
DiagnosticFormat::JsonLines => {
json_lines::JsonLinesRenderer::new(self.resolver, self.config)
.render(f, self.diagnostics)?;
}
}
let mut renderer = self.annotate_renderer.clone();
renderer = renderer
.error(stylesheet.error)
.warning(stylesheet.warning)
.info(stylesheet.info)
.note(stylesheet.note)
.help(stylesheet.help)
.line_no(stylesheet.line_no)
.emphasis(stylesheet.emphasis)
.none(stylesheet.none);
let resolved = Resolved::new(self.resolver, self.diag);
let renderable = resolved.to_renderable(self.config.context);
for diag in renderable.diagnostics.iter() {
writeln!(f, "{}", renderer.render(diag.to_annotate()))?;
}
writeln!(f)
Ok(())
}
}
@@ -635,6 +699,12 @@ pub trait FileResolver {
/// Returns the input contents associated with the file given.
fn input(&self, file: File) -> Input;
/// Returns the [`NotebookIndex`] associated with the file given, if it's a Jupyter notebook.
fn notebook_index(&self, file: &UnifiedFile) -> Option<NotebookIndex>;
/// Returns whether the file given is a Jupyter notebook.
fn is_notebook(&self, file: &UnifiedFile) -> bool;
}
impl<T> FileResolver for T
@@ -651,6 +721,25 @@ where
line_index: line_index(self, file),
}
}
fn notebook_index(&self, file: &UnifiedFile) -> Option<NotebookIndex> {
match file {
UnifiedFile::Ty(file) => self
.input(*file)
.text
.as_notebook()
.map(Notebook::index)
.cloned(),
UnifiedFile::Ruff(_) => unimplemented!("Expected an interned ty file"),
}
}
fn is_notebook(&self, file: &UnifiedFile) -> bool {
match file {
UnifiedFile::Ty(file) => self.input(*file).text.as_notebook().is_some(),
UnifiedFile::Ruff(_) => unimplemented!("Expected an interned ty file"),
}
}
}
impl FileResolver for &dyn Db {
@@ -664,6 +753,25 @@ impl FileResolver for &dyn Db {
line_index: line_index(*self, file),
}
}
fn notebook_index(&self, file: &UnifiedFile) -> Option<NotebookIndex> {
match file {
UnifiedFile::Ty(file) => self
.input(*file)
.text
.as_notebook()
.map(Notebook::index)
.cloned(),
UnifiedFile::Ruff(_) => unimplemented!("Expected an interned ty file"),
}
}
fn is_notebook(&self, file: &UnifiedFile) -> bool {
match file {
UnifiedFile::Ty(file) => self.input(*file).text.as_notebook().is_some(),
UnifiedFile::Ruff(_) => unimplemented!("Expected an interned ty file"),
}
}
}
/// An abstraction over a unit of user input.
@@ -724,7 +832,9 @@ fn relativize_path<'p>(cwd: &SystemPath, path: &'p str) -> &'p str {
#[cfg(test)]
mod tests {
use crate::diagnostic::{Annotation, DiagnosticId, Severity, Span};
use ruff_diagnostics::{Edit, Fix};
use crate::diagnostic::{Annotation, DiagnosticId, SecondaryCode, Severity, Span};
use crate::files::system_path_to_file;
use crate::system::{DbWithWritableSystem, SystemPath};
use crate::tests::TestDb;
@@ -2121,7 +2231,7 @@ watermelon
/// A small harness for setting up an environment specifically for testing
/// diagnostic rendering.
struct TestEnvironment {
pub(super) struct TestEnvironment {
db: TestDb,
config: DisplayDiagnosticConfig,
}
@@ -2130,7 +2240,7 @@ watermelon
/// Create a new test harness.
///
/// This uses the default diagnostic rendering configuration.
fn new() -> TestEnvironment {
pub(super) fn new() -> TestEnvironment {
TestEnvironment {
db: TestDb::new(),
config: DisplayDiagnosticConfig::default(),
@@ -2149,8 +2259,26 @@ watermelon
self.config = config;
}
/// Set the output format to use in diagnostic rendering.
pub(super) fn format(&mut self, format: DiagnosticFormat) {
let mut config = std::mem::take(&mut self.config);
config = config.format(format);
self.config = config;
}
/// Enable preview functionality for diagnostic rendering.
#[allow(
dead_code,
reason = "This is currently only used for JSON but will be needed soon for other formats"
)]
pub(super) fn preview(&mut self, yes: bool) {
let mut config = std::mem::take(&mut self.config);
config = config.preview(yes);
self.config = config;
}
/// Add a file with the given path and contents to this environment.
fn add(&mut self, path: &str, contents: &str) {
pub(super) fn add(&mut self, path: &str, contents: &str) {
let path = SystemPath::new(path);
self.db.write_file(path, contents).unwrap();
}
@@ -2200,7 +2328,7 @@ watermelon
/// A convenience function for returning a builder for a diagnostic
/// with "error" severity and canned values for its identifier
/// and message.
fn err(&mut self) -> DiagnosticBuilder<'_> {
pub(super) fn err(&mut self) -> DiagnosticBuilder<'_> {
self.builder(
"test-diagnostic",
Severity::Error,
@@ -2226,6 +2354,12 @@ watermelon
DiagnosticBuilder { env: self, diag }
}
/// A convenience function for returning a builder for an invalid syntax diagnostic.
fn invalid_syntax(&mut self, message: &str) -> DiagnosticBuilder<'_> {
let diag = Diagnostic::new(DiagnosticId::InvalidSyntax, Severity::Error, message);
DiagnosticBuilder { env: self, diag }
}
/// Returns a builder for tersely constructing sub-diagnostics.
fn sub_builder(&mut self, severity: Severity, message: &str) -> SubDiagnosticBuilder<'_> {
let subdiag = SubDiagnostic::new(severity, message);
@@ -2235,9 +2369,18 @@ watermelon
/// Render the given diagnostic into a `String`.
///
/// (This will set the "printed" flag on `Diagnostic`.)
fn render(&self, diag: &Diagnostic) -> String {
pub(super) fn render(&self, diag: &Diagnostic) -> String {
diag.display(&self.db, &self.config).to_string()
}
/// Render the given diagnostics into a `String`.
///
/// See `render` for rendering a single diagnostic.
///
/// (This will set the "printed" flag on `Diagnostic`.)
pub(super) fn render_diagnostics(&self, diagnostics: &[Diagnostic]) -> String {
DisplayDiagnostics::new(&self.db, &self.config, diagnostics).to_string()
}
}
/// A helper builder for tersely populating a `Diagnostic`.
@@ -2246,14 +2389,14 @@ watermelon
/// supported by this builder, and this only needs to be done
/// infrequently, consider doing it more verbosely on `diag`
/// itself.
struct DiagnosticBuilder<'e> {
pub(super) struct DiagnosticBuilder<'e> {
env: &'e mut TestEnvironment,
diag: Diagnostic,
}
impl<'e> DiagnosticBuilder<'e> {
/// Return the built diagnostic.
fn build(self) -> Diagnostic {
pub(super) fn build(self) -> Diagnostic {
self.diag
}
@@ -2302,6 +2445,25 @@ watermelon
self.diag.annotate(ann);
self
}
/// Set the secondary code on the diagnostic.
fn secondary_code(mut self, secondary_code: &str) -> DiagnosticBuilder<'e> {
self.diag
.set_secondary_code(SecondaryCode::new(secondary_code.to_string()));
self
}
/// Set the fix on the diagnostic.
pub(super) fn fix(mut self, fix: Fix) -> DiagnosticBuilder<'e> {
self.diag.set_fix(fix);
self
}
/// Set the noqa offset on the diagnostic.
fn noqa_offset(mut self, noqa_offset: TextSize) -> DiagnosticBuilder<'e> {
self.diag.set_noqa_offset(noqa_offset);
self
}
}
/// A helper builder for tersely populating a `SubDiagnostic`.
@@ -2381,4 +2543,199 @@ watermelon
let offset = TextSize::from(offset.parse::<u32>().unwrap());
(line_number, Some(offset))
}
/// Create Ruff-style diagnostics for testing the various output formats.
pub(crate) fn create_diagnostics(
format: DiagnosticFormat,
) -> (TestEnvironment, Vec<Diagnostic>) {
let mut env = TestEnvironment::new();
env.add(
"fib.py",
r#"import os
def fibonacci(n):
"""Compute the nth number in the Fibonacci sequence."""
x = 1
if n == 0:
return 0
elif n == 1:
return 1
else:
return fibonacci(n - 1) + fibonacci(n - 2)
"#,
);
env.add("undef.py", r"if a == 1: pass");
env.format(format);
let diagnostics = vec![
env.builder("unused-import", Severity::Error, "`os` imported but unused")
.primary("fib.py", "1:7", "1:9", "Remove unused import: `os`")
.secondary_code("F401")
.fix(Fix::unsafe_edit(Edit::range_deletion(TextRange::new(
TextSize::from(0),
TextSize::from(10),
))))
.noqa_offset(TextSize::from(7))
.build(),
env.builder(
"unused-variable",
Severity::Error,
"Local variable `x` is assigned to but never used",
)
.primary(
"fib.py",
"6:4",
"6:5",
"Remove assignment to unused variable `x`",
)
.secondary_code("F841")
.fix(Fix::unsafe_edit(Edit::deletion(
TextSize::from(94),
TextSize::from(99),
)))
.noqa_offset(TextSize::from(94))
.build(),
env.builder("undefined-name", Severity::Error, "Undefined name `a`")
.primary("undef.py", "1:3", "1:4", "")
.secondary_code("F821")
.noqa_offset(TextSize::from(3))
.build(),
];
(env, diagnostics)
}
/// Create Ruff-style syntax error diagnostics for testing the various output formats.
pub(crate) fn create_syntax_error_diagnostics(
format: DiagnosticFormat,
) -> (TestEnvironment, Vec<Diagnostic>) {
let mut env = TestEnvironment::new();
env.add(
"syntax_errors.py",
r"from os import
if call(foo
def bar():
pass
",
);
env.format(format);
let diagnostics = vec![
env.invalid_syntax("SyntaxError: Expected one or more symbol names after import")
.primary("syntax_errors.py", "1:14", "1:15", "")
.build(),
env.invalid_syntax("SyntaxError: Expected ')', found newline")
.primary("syntax_errors.py", "3:11", "3:12", "")
.build(),
];
(env, diagnostics)
}
/// Create Ruff-style diagnostics for testing the various output formats for a notebook.
#[allow(
dead_code,
reason = "This is currently only used for JSON but will be needed soon for other formats"
)]
pub(crate) fn create_notebook_diagnostics(
format: DiagnosticFormat,
) -> (TestEnvironment, Vec<Diagnostic>) {
let mut env = TestEnvironment::new();
env.add(
"notebook.ipynb",
r##"
{
  "cells": [
    {
      "cell_type": "code",
      "metadata": {},
      "outputs": [],
      "source": [
        "# cell 1\n",
        "import os"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "outputs": [],
      "source": [
        "# cell 2\n",
        "import math\n",
        "\n",
        "print('hello world')"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "outputs": [],
      "source": [
        "# cell 3\n",
        "def foo():\n",
        "    print()\n",
        "    x = 1\n"
      ]
    }
  ],
  "metadata": {},
  "nbformat": 4,
  "nbformat_minor": 5
}
"##,
);
env.format(format);
let diagnostics = vec![
env.builder("unused-import", Severity::Error, "`os` imported but unused")
.primary("notebook.ipynb", "2:7", "2:9", "Remove unused import: `os`")
.secondary_code("F401")
.fix(Fix::safe_edit(Edit::range_deletion(TextRange::new(
TextSize::from(9),
TextSize::from(19),
))))
.noqa_offset(TextSize::from(16))
.build(),
env.builder(
"unused-import",
Severity::Error,
"`math` imported but unused",
)
.primary(
"notebook.ipynb",
"4:7",
"4:11",
"Remove unused import: `math`",
)
.secondary_code("F401")
.fix(Fix::safe_edit(Edit::range_deletion(TextRange::new(
TextSize::from(28),
TextSize::from(40),
))))
.noqa_offset(TextSize::from(35))
.build(),
env.builder(
"unused-variable",
Severity::Error,
"Local variable `x` is assigned to but never used",
)
.primary(
"notebook.ipynb",
"10:4",
"10:5",
"Remove assignment to unused variable `x`",
)
.secondary_code("F841")
.fix(Fix::unsafe_edit(Edit::range_deletion(TextRange::new(
TextSize::from(94),
TextSize::from(104),
))))
.noqa_offset(TextSize::from(98))
.build(),
];
(env, diagnostics)
}
}

View File

@@ -0,0 +1,83 @@
use ruff_source_file::LineColumn;
use crate::diagnostic::{Diagnostic, Severity};
use super::FileResolver;
pub(super) struct AzureRenderer<'a> {
resolver: &'a dyn FileResolver,
}
impl<'a> AzureRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver) -> Self {
Self { resolver }
}
}
impl AzureRenderer<'_> {
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
for diag in diagnostics {
let severity = match diag.severity() {
Severity::Info | Severity::Warning => "warning",
Severity::Error | Severity::Fatal => "error",
};
write!(f, "##vso[task.logissue type={severity};")?;
if let Some(span) = diag.primary_span() {
let filename = span.file().path(self.resolver);
write!(f, "sourcepath={filename};")?;
if let Some(range) = span.range() {
let location = if self.resolver.notebook_index(span.file()).is_some() {
// We can't give a reasonable location for the structured formats,
// so we show one that's clearly a fallback
LineColumn::default()
} else {
span.file()
.diagnostic_source(self.resolver)
.as_source_code()
.line_column(range.start())
};
write!(
f,
"linenumber={line};columnnumber={col};",
line = location.line,
col = location.column,
)?;
}
}
writeln!(
f,
"{code}]{body}",
code = diag
.secondary_code()
.map_or_else(String::new, |code| format!("code={code};")),
body = diag.body(),
)?;
}
Ok(())
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{create_diagnostics, create_syntax_error_diagnostics},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Azure);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Azure);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
}

View File

@@ -0,0 +1,393 @@
use serde::{Serialize, Serializer, ser::SerializeSeq};
use serde_json::{Value, json};
use ruff_diagnostics::{Applicability, Edit};
use ruff_notebook::NotebookIndex;
use ruff_source_file::{LineColumn, OneIndexed};
use ruff_text_size::Ranged;
use crate::diagnostic::{Diagnostic, DiagnosticSource, DisplayDiagnosticConfig, SecondaryCode};
use super::FileResolver;
pub(super) struct JsonRenderer<'a> {
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
}
impl<'a> JsonRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver, config: &'a DisplayDiagnosticConfig) -> Self {
Self { resolver, config }
}
}
impl JsonRenderer<'_> {
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
write!(
f,
"{:#}",
diagnostics_to_json_value(diagnostics, self.resolver, self.config)
)
}
}
fn diagnostics_to_json_value<'a>(
diagnostics: impl IntoIterator<Item = &'a Diagnostic>,
resolver: &dyn FileResolver,
config: &DisplayDiagnosticConfig,
) -> Value {
let values: Vec<_> = diagnostics
.into_iter()
.map(|diag| diagnostic_to_json(diag, resolver, config))
.collect();
json!(values)
}
pub(super) fn diagnostic_to_json<'a>(
diagnostic: &'a Diagnostic,
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
) -> JsonDiagnostic<'a> {
let span = diagnostic.primary_span_ref();
let filename = span.map(|span| span.file().path(resolver));
let range = span.and_then(|span| span.range());
let diagnostic_source = span.map(|span| span.file().diagnostic_source(resolver));
let source_code = diagnostic_source
.as_ref()
.map(|diagnostic_source| diagnostic_source.as_source_code());
let notebook_index = span.and_then(|span| resolver.notebook_index(span.file()));
let mut start_location = None;
let mut end_location = None;
let mut noqa_location = None;
let mut notebook_cell_index = None;
if let Some(source_code) = source_code {
noqa_location = diagnostic
.noqa_offset()
.map(|offset| source_code.line_column(offset));
if let Some(range) = range {
let mut start = source_code.line_column(range.start());
let mut end = source_code.line_column(range.end());
if let Some(notebook_index) = &notebook_index {
notebook_cell_index =
Some(notebook_index.cell(start.line).unwrap_or(OneIndexed::MIN));
start = notebook_index.translate_line_column(&start);
end = notebook_index.translate_line_column(&end);
noqa_location =
noqa_location.map(|location| notebook_index.translate_line_column(&location));
}
start_location = Some(start);
end_location = Some(end);
}
}
let fix = diagnostic.fix().map(|fix| JsonFix {
applicability: fix.applicability(),
message: diagnostic.suggestion(),
edits: ExpandedEdits {
edits: fix.edits(),
notebook_index,
config,
diagnostic_source,
},
});
// In preview, the locations and filename can be optional.
if config.preview {
JsonDiagnostic {
code: diagnostic.secondary_code(),
url: diagnostic.to_ruff_url(),
message: diagnostic.body(),
fix,
cell: notebook_cell_index,
location: start_location.map(JsonLocation::from),
end_location: end_location.map(JsonLocation::from),
filename,
noqa_row: noqa_location.map(|location| location.line),
}
} else {
JsonDiagnostic {
code: diagnostic.secondary_code(),
url: diagnostic.to_ruff_url(),
message: diagnostic.body(),
fix,
cell: notebook_cell_index,
location: Some(start_location.unwrap_or_default().into()),
end_location: Some(end_location.unwrap_or_default().into()),
filename: Some(filename.unwrap_or_default()),
noqa_row: noqa_location.map(|location| location.line),
}
}
}
struct ExpandedEdits<'a> {
edits: &'a [Edit],
notebook_index: Option<NotebookIndex>,
config: &'a DisplayDiagnosticConfig,
diagnostic_source: Option<DiagnosticSource>,
}
impl Serialize for ExpandedEdits<'_> {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let mut s = serializer.serialize_seq(Some(self.edits.len()))?;
for edit in self.edits {
let (location, end_location) = if let Some(diagnostic_source) = &self.diagnostic_source
{
let source_code = diagnostic_source.as_source_code();
let mut location = source_code.line_column(edit.start());
let mut end_location = source_code.line_column(edit.end());
if let Some(notebook_index) = &self.notebook_index {
// There exists a newline between each cell's source code in the
// concatenated source code in Ruff. This newline doesn't actually
// exist in the JSON source field.
//
// Now, certain edits may try to remove this newline, which means
// the edit will spill over to the first character of the next cell.
// If it does, we need to translate the end location to the last
// character of the previous cell.
match (
notebook_index.cell(location.line),
notebook_index.cell(end_location.line),
) {
(Some(start_cell), Some(end_cell)) if start_cell != end_cell => {
debug_assert_eq!(end_location.column.get(), 1);
let prev_row = end_location.line.saturating_sub(1);
end_location = LineColumn {
line: notebook_index.cell_row(prev_row).unwrap_or(OneIndexed::MIN),
column: source_code
.line_column(source_code.line_end_exclusive(prev_row))
.column,
};
}
(Some(_), None) => {
debug_assert_eq!(end_location.column.get(), 1);
let prev_row = end_location.line.saturating_sub(1);
end_location = LineColumn {
line: notebook_index.cell_row(prev_row).unwrap_or(OneIndexed::MIN),
column: source_code
.line_column(source_code.line_end_exclusive(prev_row))
.column,
};
}
_ => {
end_location = notebook_index.translate_line_column(&end_location);
}
}
location = notebook_index.translate_line_column(&location);
}
(Some(location), Some(end_location))
} else {
(None, None)
};
// In preview, the locations can be optional.
let value = if self.config.preview {
JsonEdit {
content: edit.content().unwrap_or_default(),
location: location.map(JsonLocation::from),
end_location: end_location.map(JsonLocation::from),
}
} else {
JsonEdit {
content: edit.content().unwrap_or_default(),
location: Some(location.unwrap_or_default().into()),
end_location: Some(end_location.unwrap_or_default().into()),
}
};
s.serialize_element(&value)?;
}
s.end()
}
}
/// A serializable version of `Diagnostic`.
///
/// The `Old` variant only exists to preserve backwards compatibility. Both this and `JsonEdit`
/// should become structs with the `New` definitions in a future Ruff release.
#[derive(Serialize)]
pub(crate) struct JsonDiagnostic<'a> {
cell: Option<OneIndexed>,
code: Option<&'a SecondaryCode>,
end_location: Option<JsonLocation>,
filename: Option<&'a str>,
fix: Option<JsonFix<'a>>,
location: Option<JsonLocation>,
message: &'a str,
noqa_row: Option<OneIndexed>,
url: Option<String>,
}
#[derive(Serialize)]
struct JsonFix<'a> {
applicability: Applicability,
edits: ExpandedEdits<'a>,
message: Option<&'a str>,
}
#[derive(Serialize)]
struct JsonLocation {
column: OneIndexed,
row: OneIndexed,
}
impl From<LineColumn> for JsonLocation {
fn from(location: LineColumn) -> Self {
JsonLocation {
row: location.line,
column: location.column,
}
}
}
#[derive(Serialize)]
struct JsonEdit<'a> {
content: &'a str,
end_location: Option<JsonLocation>,
location: Option<JsonLocation>,
}
#[cfg(test)]
mod tests {
use ruff_diagnostics::{Edit, Fix};
use ruff_text_size::TextSize;
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{
TestEnvironment, create_diagnostics, create_notebook_diagnostics,
create_syntax_error_diagnostics,
},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::Json);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::Json);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn notebook_output() {
let (env, diagnostics) = create_notebook_diagnostics(DiagnosticFormat::Json);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn missing_file_stable() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Json);
env.preview(false);
let diag = env
.err()
.fix(Fix::safe_edit(Edit::insertion(
"edit".to_string(),
TextSize::from(0),
)))
.build();
insta::assert_snapshot!(
env.render(&diag),
@r#"
[
{
"cell": null,
"code": null,
"end_location": {
"column": 1,
"row": 1
},
"filename": "",
"fix": {
"applicability": "safe",
"edits": [
{
"content": "edit",
"end_location": {
"column": 1,
"row": 1
},
"location": {
"column": 1,
"row": 1
}
}
],
"message": null
},
"location": {
"column": 1,
"row": 1
},
"message": "main diagnostic message",
"noqa_row": null,
"url": "https://docs.astral.sh/ruff/rules/test-diagnostic"
}
]
"#,
);
}
#[test]
fn missing_file_preview() {
let mut env = TestEnvironment::new();
env.format(DiagnosticFormat::Json);
env.preview(true);
let diag = env
.err()
.fix(Fix::safe_edit(Edit::insertion(
"edit".to_string(),
TextSize::from(0),
)))
.build();
insta::assert_snapshot!(
env.render(&diag),
@r#"
[
{
"cell": null,
"code": null,
"end_location": null,
"filename": null,
"fix": {
"applicability": "safe",
"edits": [
{
"content": "edit",
"end_location": null,
"location": null
}
],
"message": null
},
"location": null,
"message": "main diagnostic message",
"noqa_row": null,
"url": "https://docs.astral.sh/ruff/rules/test-diagnostic"
}
]
"#,
);
}
}

View File

@@ -0,0 +1,59 @@
use crate::diagnostic::{Diagnostic, DisplayDiagnosticConfig, render::json::diagnostic_to_json};
use super::FileResolver;
pub(super) struct JsonLinesRenderer<'a> {
resolver: &'a dyn FileResolver,
config: &'a DisplayDiagnosticConfig,
}
impl<'a> JsonLinesRenderer<'a> {
pub(super) fn new(resolver: &'a dyn FileResolver, config: &'a DisplayDiagnosticConfig) -> Self {
Self { resolver, config }
}
}
impl JsonLinesRenderer<'_> {
pub(super) fn render(
&self,
f: &mut std::fmt::Formatter,
diagnostics: &[Diagnostic],
) -> std::fmt::Result {
for diag in diagnostics {
writeln!(
f,
"{}",
serde_json::json!(diagnostic_to_json(diag, self.resolver, self.config))
)?;
}
Ok(())
}
}
#[cfg(test)]
mod tests {
use crate::diagnostic::{
DiagnosticFormat,
render::tests::{
create_diagnostics, create_notebook_diagnostics, create_syntax_error_diagnostics,
},
};
#[test]
fn output() {
let (env, diagnostics) = create_diagnostics(DiagnosticFormat::JsonLines);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn syntax_errors() {
let (env, diagnostics) = create_syntax_error_diagnostics(DiagnosticFormat::JsonLines);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
#[test]
fn notebook_output() {
let (env, diagnostics) = create_notebook_diagnostics(DiagnosticFormat::JsonLines);
insta::assert_snapshot!(env.render_diagnostics(&diagnostics));
}
}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/azure.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/azure.rs
expression: env.render_diagnostics(&diagnostics)
---
##vso[task.logissue type=error;sourcepath=fib.py;linenumber=1;columnnumber=8;code=F401;]`os` imported but unused
##vso[task.logissue type=error;sourcepath=fib.py;linenumber=6;columnnumber=5;code=F841;]Local variable `x` is assigned to but never used

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/azure.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/azure.rs
expression: env.render_diagnostics(&diagnostics)
---
##vso[task.logissue type=error;sourcepath=syntax_errors.py;linenumber=1;columnnumber=15;]SyntaxError: Expected one or more symbol names after import
##vso[task.logissue type=error;sourcepath=syntax_errors.py;linenumber=3;columnnumber=12;]SyntaxError: Expected ')', found newline

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json.rs
expression: env.render_diagnostics(&diagnostics)
---
[
{
@@ -84,8 +83,8 @@ snapshot_kind: text
{
"content": "",
"end_location": {
"column": 10,
"row": 4
"column": 1,
"row": 5
},
"location": {
"column": 1,

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json.rs
expression: env.render_diagnostics(&diagnostics)
---
[
{

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json.rs
expression: env.render_diagnostics(&diagnostics)
---
[
{

View File

@@ -1,8 +1,7 @@
---
source: crates/ruff_linter/src/message/json_lines.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":1,"code":"F401","end_location":{"column":10,"row":2},"filename":"notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":10,"row":2},"location":{"column":1,"row":2}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":2},"message":"`os` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":2,"code":"F401","end_location":{"column":12,"row":2},"filename":"notebook.ipynb","fix":{"applicability":"safe","edits":[{"content":"","end_location":{"column":1,"row":3},"location":{"column":1,"row":2}}],"message":"Remove unused import: `math`"},"location":{"column":8,"row":2},"message":"`math` imported but unused","noqa_row":2,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":3,"code":"F841","end_location":{"column":6,"row":4},"filename":"notebook.ipynb","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":10,"row":4},"location":{"column":1,"row":4}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":4},"message":"Local variable `x` is assigned to but never used","noqa_row":4,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}
{"cell":3,"code":"F841","end_location":{"column":6,"row":4},"filename":"notebook.ipynb","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":5},"location":{"column":1,"row":4}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":4},"message":"Local variable `x` is assigned to but never used","noqa_row":4,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json_lines.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":null,"code":"F401","end_location":{"column":10,"row":1},"filename":"fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":1,"row":2},"location":{"column":1,"row":1}}],"message":"Remove unused import: `os`"},"location":{"column":8,"row":1},"message":"`os` imported but unused","noqa_row":1,"url":"https://docs.astral.sh/ruff/rules/unused-import"}
{"cell":null,"code":"F841","end_location":{"column":6,"row":6},"filename":"fib.py","fix":{"applicability":"unsafe","edits":[{"content":"","end_location":{"column":10,"row":6},"location":{"column":5,"row":6}}],"message":"Remove assignment to unused variable `x`"},"location":{"column":5,"row":6},"message":"Local variable `x` is assigned to but never used","noqa_row":6,"url":"https://docs.astral.sh/ruff/rules/unused-variable"}

View File

@@ -1,7 +1,6 @@
---
source: crates/ruff_linter/src/message/json_lines.rs
expression: content
snapshot_kind: text
source: crates/ruff_db/src/diagnostic/render/json_lines.rs
expression: env.render_diagnostics(&diagnostics)
---
{"cell":null,"code":null,"end_location":{"column":1,"row":2},"filename":"syntax_errors.py","fix":null,"location":{"column":15,"row":1},"message":"SyntaxError: Expected one or more symbol names after import","noqa_row":null,"url":null}
{"cell":null,"code":null,"end_location":{"column":1,"row":4},"filename":"syntax_errors.py","fix":null,"location":{"column":12,"row":3},"message":"SyntaxError: Expected ')', found newline","noqa_row":null,"url":null}

View File

@@ -21,6 +21,19 @@ type LockedZipArchive<'a> = MutexGuard<'a, VendoredZipArchive>;
///
/// "Files" in the `VendoredFileSystem` are read-only and immutable.
/// Directories are supported, but symlinks and hardlinks cannot exist.
///
/// # Path separators
///
/// At time of writing (2025-07-11), this implementation always uses `/` as a
/// path separator, even in Windows environments where `\` is traditionally
/// used as a file path separator. Namely, this is only currently used with zip
/// files built by `crates/ty_vendored/build.rs`.
///
/// Callers using this may provide paths that use a `\` as a separator. It will
/// be transparently normalized to `/`.
///
/// This is particularly important because the presence of a trailing separator
/// in a zip file is conventionally used to indicate a directory entry.
#[derive(Clone)]
pub struct VendoredFileSystem {
inner: Arc<Mutex<VendoredZipArchive>>,
@@ -115,6 +128,68 @@ impl VendoredFileSystem {
read_to_string(self, path.as_ref())
}
/// Read the direct children of the directory
/// identified by `path`.
///
/// If `path` is not a directory, then this will
/// return an empty `Vec`.
pub fn read_directory(&self, dir: impl AsRef<VendoredPath>) -> Vec<DirectoryEntry> {
// N.B. We specifically do not return an iterator here to avoid
// holding a lock for the lifetime of the iterator returned.
// That is, it seems like a footgun to keep the zip archive
// locked during iteration, since the unit of work for each
// item in the iterator could be arbitrarily long. Allocating
// up front and stuffing all entries into it is probably the
// simplest solution and what we do here. If this becomes
// a problem, there are other strategies we could pursue.
// (Amortizing allocs, using a different synchronization
// behavior or even exposing additional APIs.) ---AG
fn read_directory(fs: &VendoredFileSystem, dir: &VendoredPath) -> Vec<DirectoryEntry> {
let mut normalized = NormalizedVendoredPath::from(dir);
if !normalized.as_str().ends_with('/') {
normalized = normalized.with_trailing_slash();
}
let archive = fs.lock_archive();
let mut entries = vec![];
for name in archive.0.file_names() {
// Any entry that doesn't have the `path` (with a
// trailing slash) as a prefix cannot possibly be in
// the directory referenced by `path`.
let Some(without_dir_prefix) = name.strip_prefix(normalized.as_str()) else {
continue;
};
// Filter out an entry equivalent to the path given
// since we only want children of the directory.
if without_dir_prefix.is_empty() {
continue;
}
// We only want *direct* children. Files that are
// direct children cannot have any slashes (or else
// they are not direct children). Directories that
// are direct children can only have one slash and
// it must be at the end.
//
// (We do this manually ourselves to avoid doing a
// full file lookup and metadata retrieval via the
// `zip` crate.)
let file_type = FileType::from_zip_file_name(without_dir_prefix);
let slash_count = without_dir_prefix.matches('/').count();
match file_type {
FileType::File if slash_count > 0 => continue,
FileType::Directory if slash_count > 1 => continue,
_ => {}
}
entries.push(DirectoryEntry {
path: VendoredPathBuf::from(name),
file_type,
});
}
entries
}
read_directory(self, dir.as_ref())
}
/// Acquire a lock on the underlying zip archive.
/// The call will block until it is able to acquire the lock.
///
@@ -206,6 +281,14 @@ pub enum FileType {
}
impl FileType {
fn from_zip_file_name(name: &str) -> FileType {
if name.ends_with('/') {
FileType::Directory
} else {
FileType::File
}
}
pub const fn is_file(self) -> bool {
matches!(self, Self::File)
}
@@ -244,6 +327,30 @@ impl Metadata {
}
}
#[derive(Debug, PartialEq, Eq)]
pub struct DirectoryEntry {
path: VendoredPathBuf,
file_type: FileType,
}
impl DirectoryEntry {
pub fn new(path: VendoredPathBuf, file_type: FileType) -> Self {
Self { path, file_type }
}
pub fn into_path(self) -> VendoredPathBuf {
self.path
}
pub fn path(&self) -> &VendoredPath {
&self.path
}
pub fn file_type(&self) -> FileType {
self.file_type
}
}
/// Newtype wrapper around a ZipArchive.
#[derive(Debug)]
struct VendoredZipArchive(ZipArchive<io::Cursor<Cow<'static, [u8]>>>);
@@ -498,6 +605,60 @@ pub(crate) mod tests {
test_directory("./stdlib/asyncio/../asyncio/")
}
fn readdir_snapshot(fs: &VendoredFileSystem, path: &str) -> String {
let mut paths = fs
.read_directory(VendoredPath::new(path))
.into_iter()
.map(|entry| entry.path().to_string())
.collect::<Vec<String>>();
paths.sort();
paths.join("\n")
}
#[test]
fn read_directory_stdlib() {
let mock_typeshed = mock_typeshed();
assert_snapshot!(readdir_snapshot(&mock_typeshed, "stdlib"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
assert_snapshot!(readdir_snapshot(&mock_typeshed, "stdlib/"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
assert_snapshot!(readdir_snapshot(&mock_typeshed, "./stdlib"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
assert_snapshot!(readdir_snapshot(&mock_typeshed, "./stdlib/"), @r"
vendored://stdlib/asyncio/
vendored://stdlib/functools.pyi
");
}
#[test]
fn read_directory_asyncio() {
let mock_typeshed = mock_typeshed();
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "stdlib/asyncio"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "./stdlib/asyncio"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "stdlib/asyncio/"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
assert_snapshot!(
readdir_snapshot(&mock_typeshed, "./stdlib/asyncio/"),
@"vendored://stdlib/asyncio/tasks.pyi",
);
}
fn test_nonexistent_path(path: &str) {
let mock_typeshed = mock_typeshed();
let path = VendoredPath::new(path);

View File

@@ -17,6 +17,10 @@ impl VendoredPath {
unsafe { &*(path as *const Utf8Path as *const VendoredPath) }
}
pub fn file_name(&self) -> Option<&str> {
self.0.file_name()
}
pub fn to_path_buf(&self) -> VendoredPathBuf {
VendoredPathBuf(self.0.to_path_buf())
}

View File

@@ -1,6 +1,6 @@
"""
Should emit:
B017 - on lines 24, 28, 46, 49, 52, and 58
B017 - on lines 24, 28, 46, 49, 52, 58, 62, 68, and 71
"""
import asyncio
import unittest
@@ -56,3 +56,17 @@ def test_pytest_raises():
    with contextlib.nullcontext(), pytest.raises(Exception):
        raise ValueError("Multiple context managers")


def test_pytest_raises_keyword():
    with pytest.raises(expected_exception=Exception):
        raise ValueError("Should be flagged")

def test_assert_raises_keyword():
    class TestKwargs(unittest.TestCase):
        def test_method(self):
            with self.assertRaises(exception=Exception):
                raise ValueError("Should be flagged")

            with self.assertRaises(exception=BaseException):
                raise ValueError("Should be flagged")

View File

@@ -181,3 +181,51 @@ class SubclassTestModel2(TestModel4):
# Subclass without __str__
class SubclassTestModel3(TestModel1):
pass
# Test cases for type-annotated abstract models - these should NOT trigger DJ008
from typing import ClassVar
from django_stubs_ext.db.models import TypedModelMeta
class TypeAnnotatedAbstractModel1(models.Model):
"""Model with type-annotated abstract = True - should not trigger DJ008"""
new_field = models.CharField(max_length=10)
class Meta(TypedModelMeta):
abstract: ClassVar[bool] = True
class TypeAnnotatedAbstractModel2(models.Model):
"""Model with type-annotated abstract = True using regular Meta - should not trigger DJ008"""
new_field = models.CharField(max_length=10)
class Meta:
abstract: ClassVar[bool] = True
class TypeAnnotatedAbstractModel3(models.Model):
"""Model with type-annotated abstract = True but without ClassVar - should not trigger DJ008"""
new_field = models.CharField(max_length=10)
class Meta:
abstract: bool = True
class TypeAnnotatedNonAbstractModel(models.Model):
"""Model with type-annotated abstract = False - should trigger DJ008"""
new_field = models.CharField(max_length=10)
class Meta:
abstract: ClassVar[bool] = False
class TypeAnnotatedAbstractModelWithStr(models.Model):
"""Model with type-annotated abstract = True and __str__ method - should not trigger DJ008"""
new_field = models.CharField(max_length=10)
class Meta(TypedModelMeta):
abstract: ClassVar[bool] = True
def __str__(self):
return self.new_field

View File

@@ -0,0 +1,5 @@
"""This is a docstring."""
"This is not a docstring."
"This is also not a docstring."
x = 1

View File

@@ -670,7 +670,11 @@ impl SemanticSyntaxContext for Checker<'_> {
| SemanticSyntaxErrorKind::InvalidStarExpression
| SemanticSyntaxErrorKind::AsyncComprehensionInSyncComprehension(_)
| SemanticSyntaxErrorKind::DuplicateParameter(_)
| SemanticSyntaxErrorKind::NonlocalDeclarationAtModuleLevel => {
| SemanticSyntaxErrorKind::NonlocalDeclarationAtModuleLevel
| SemanticSyntaxErrorKind::LoadBeforeNonlocalDeclaration { .. }
| SemanticSyntaxErrorKind::NonlocalAndGlobal(_)
| SemanticSyntaxErrorKind::AnnotatedGlobal(_)
| SemanticSyntaxErrorKind::AnnotatedNonlocal(_) => {
self.semantic_errors.borrow_mut().push(error);
}
}

View File

@@ -275,19 +275,12 @@ impl<'a> Insertion<'a> {
}
}
/// Find the end of the last docstring.
/// Find the end of the docstring (first string statement).
fn match_docstring_end(body: &[Stmt]) -> Option<TextSize> {
let mut iter = body.iter();
let mut stmt = iter.next()?;
let stmt = body.first()?;
if !is_docstring_stmt(stmt) {
return None;
}
for next in iter {
if !is_docstring_stmt(next) {
break;
}
stmt = next;
}
Some(stmt.end())
}
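The insertion point therefore stops after the first string statement instead of scanning a run of consecutive string literals. A minimal sketch of the new behavior (file contents hypothetical, mirroring the `multiple_strings.py` fixture above):

```python
"""This is a docstring."""
# A required import such as `from __future__ import annotations` is now
# inserted here, directly after the first string statement, rather than
# after the run of string literals below.
"This is not a docstring."
"This is also not a docstring."
x = 1
```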
@@ -367,7 +360,7 @@ mod tests {
.trim_start();
assert_eq!(
insert(contents)?,
Insertion::own_line("", TextSize::from(40), "\n")
Insertion::own_line("", TextSize::from(20), "\n")
);
let contents = r"

View File

@@ -1,71 +0,0 @@
use std::io::Write;
use ruff_db::diagnostic::Diagnostic;
use ruff_source_file::LineColumn;
use crate::message::{Emitter, EmitterContext};
/// Generate error logging commands for Azure Pipelines format.
/// See [documentation](https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash#logissue-log-an-error-or-warning)
#[derive(Default)]
pub struct AzureEmitter;
impl Emitter for AzureEmitter {
fn emit(
&mut self,
writer: &mut dyn Write,
diagnostics: &[Diagnostic],
context: &EmitterContext,
) -> anyhow::Result<()> {
for diagnostic in diagnostics {
let filename = diagnostic.expect_ruff_filename();
let location = if context.is_notebook(&filename) {
// We can't give a reasonable location for the structured formats,
// so we show one that's clearly a fallback
LineColumn::default()
} else {
diagnostic.expect_ruff_start_location()
};
writeln!(
writer,
"##vso[task.logissue type=error\
;sourcepath={filename};linenumber={line};columnnumber={col};{code}]{body}",
line = location.line,
col = location.column,
code = diagnostic
.secondary_code()
.map_or_else(String::new, |code| format!("code={code};")),
body = diagnostic.body(),
)?;
}
Ok(())
}
}
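For reference, a single emitted line has roughly this shape; a hedged sketch with invented values:

```python
# Hypothetical Azure Pipelines logging command for one diagnostic.
filename, line, col = "example.py", 3, 8
code, body = "F401", "`os` imported but unused"
print(
    "##vso[task.logissue type=error"
    f";sourcepath={filename};linenumber={line};columnnumber={col};code={code};]{body}"
)
```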
#[cfg(test)]
mod tests {
use insta::assert_snapshot;
use crate::message::AzureEmitter;
use crate::message::tests::{
capture_emitter_output, create_diagnostics, create_syntax_error_diagnostics,
};
#[test]
fn output() {
let mut emitter = AzureEmitter;
let content = capture_emitter_output(&mut emitter, &create_diagnostics());
assert_snapshot!(content);
}
#[test]
fn syntax_errors() {
let mut emitter = AzureEmitter;
let content = capture_emitter_output(&mut emitter, &create_syntax_error_diagnostics());
assert_snapshot!(content);
}
}

View File

@@ -21,7 +21,7 @@ use crate::{Applicability, Fix};
/// * Compute the diff from the [`Edit`] because diff calculation is expensive.
pub(super) struct Diff<'a> {
fix: &'a Fix,
source_code: SourceFile,
source_code: &'a SourceFile,
}
impl<'a> Diff<'a> {

View File

@@ -1,220 +0,0 @@
use std::io::Write;
use serde::ser::SerializeSeq;
use serde::{Serialize, Serializer};
use serde_json::{Value, json};
use ruff_db::diagnostic::Diagnostic;
use ruff_notebook::NotebookIndex;
use ruff_source_file::{LineColumn, OneIndexed, SourceCode};
use ruff_text_size::Ranged;
use crate::Edit;
use crate::message::{Emitter, EmitterContext};
#[derive(Default)]
pub struct JsonEmitter;
impl Emitter for JsonEmitter {
fn emit(
&mut self,
writer: &mut dyn Write,
diagnostics: &[Diagnostic],
context: &EmitterContext,
) -> anyhow::Result<()> {
serde_json::to_writer_pretty(
writer,
&ExpandedMessages {
diagnostics,
context,
},
)?;
Ok(())
}
}
struct ExpandedMessages<'a> {
diagnostics: &'a [Diagnostic],
context: &'a EmitterContext<'a>,
}
impl Serialize for ExpandedMessages<'_> {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let mut s = serializer.serialize_seq(Some(self.diagnostics.len()))?;
for message in self.diagnostics {
let value = message_to_json_value(message, self.context);
s.serialize_element(&value)?;
}
s.end()
}
}
pub(crate) fn message_to_json_value(message: &Diagnostic, context: &EmitterContext) -> Value {
let source_file = message.expect_ruff_source_file();
let source_code = source_file.to_source_code();
let filename = message.expect_ruff_filename();
let notebook_index = context.notebook_index(&filename);
let fix = message.fix().map(|fix| {
json!({
"applicability": fix.applicability(),
"message": message.suggestion(),
"edits": &ExpandedEdits { edits: fix.edits(), source_code: &source_code, notebook_index },
})
});
let mut start_location = source_code.line_column(message.expect_range().start());
let mut end_location = source_code.line_column(message.expect_range().end());
let mut noqa_location = message
.noqa_offset()
.map(|offset| source_code.line_column(offset));
let mut notebook_cell_index = None;
if let Some(notebook_index) = notebook_index {
notebook_cell_index = Some(
notebook_index
.cell(start_location.line)
.unwrap_or(OneIndexed::MIN),
);
start_location = notebook_index.translate_line_column(&start_location);
end_location = notebook_index.translate_line_column(&end_location);
noqa_location =
noqa_location.map(|location| notebook_index.translate_line_column(&location));
}
json!({
"code": message.secondary_code(),
"url": message.to_url(),
"message": message.body(),
"fix": fix,
"cell": notebook_cell_index,
"location": location_to_json(start_location),
"end_location": location_to_json(end_location),
"filename": filename,
"noqa_row": noqa_location.map(|location| location.line)
})
}
fn location_to_json(location: LineColumn) -> serde_json::Value {
json!({
"row": location.line,
"column": location.column
})
}
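Put together, one serialized diagnostic looks roughly like the following; a hedged sketch with invented values, the keys mirroring the `json!` call above:

```python
# Hypothetical JSON value produced for a single diagnostic.
example = {
    "code": "F401",
    "url": "https://docs.astral.sh/ruff/rules/unused-import",
    "message": "`os` imported but unused",
    "fix": {
        "applicability": "safe",
        "message": "Remove unused import: `os`",
        "edits": [
            {
                "content": "",
                "location": {"row": 1, "column": 1},
                "end_location": {"row": 2, "column": 1},
            }
        ],
    },
    "cell": None,  # only populated for notebook files
    "location": {"row": 1, "column": 8},
    "end_location": {"row": 1, "column": 10},
    "filename": "example.py",
    "noqa_row": 1,
}
```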
struct ExpandedEdits<'a> {
edits: &'a [Edit],
source_code: &'a SourceCode<'a, 'a>,
notebook_index: Option<&'a NotebookIndex>,
}
impl Serialize for ExpandedEdits<'_> {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let mut s = serializer.serialize_seq(Some(self.edits.len()))?;
for edit in self.edits {
let mut location = self.source_code.line_column(edit.start());
let mut end_location = self.source_code.line_column(edit.end());
if let Some(notebook_index) = self.notebook_index {
// There exists a newline between each cell's source code in the
// concatenated source code in Ruff. This newline doesn't actually
// exist in the JSON source field.
//
// Now, certain edits may try to remove this newline, which means
// the edit will spill over to the first character of the next cell.
// If it does, we need to translate the end location to the last
// character of the previous cell.
match (
notebook_index.cell(location.line),
notebook_index.cell(end_location.line),
) {
(Some(start_cell), Some(end_cell)) if start_cell != end_cell => {
debug_assert_eq!(end_location.column.get(), 1);
let prev_row = end_location.line.saturating_sub(1);
end_location = LineColumn {
line: notebook_index.cell_row(prev_row).unwrap_or(OneIndexed::MIN),
column: self
.source_code
.line_column(self.source_code.line_end_exclusive(prev_row))
.column,
};
}
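// The same translation applies when the edit extends past the last cell,
// in which case there is no next cell for the end location.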
(Some(_), None) => {
debug_assert_eq!(end_location.column.get(), 1);
let prev_row = end_location.line.saturating_sub(1);
end_location = LineColumn {
line: notebook_index.cell_row(prev_row).unwrap_or(OneIndexed::MIN),
column: self
.source_code
.line_column(self.source_code.line_end_exclusive(prev_row))
.column,
};
}
_ => {
end_location = notebook_index.translate_line_column(&end_location);
}
}
location = notebook_index.translate_line_column(&location);
}
let value = json!({
"content": edit.content().unwrap_or_default(),
"location": location_to_json(location),
"end_location": location_to_json(end_location)
});
s.serialize_element(&value)?;
}
s.end()
}
}
#[cfg(test)]
mod tests {
use insta::assert_snapshot;
use crate::message::JsonEmitter;
use crate::message::tests::{
capture_emitter_notebook_output, capture_emitter_output, create_diagnostics,
create_notebook_diagnostics, create_syntax_error_diagnostics,
};
#[test]
fn output() {
let mut emitter = JsonEmitter;
let content = capture_emitter_output(&mut emitter, &create_diagnostics());
assert_snapshot!(content);
}
#[test]
fn syntax_errors() {
let mut emitter = JsonEmitter;
let content = capture_emitter_output(&mut emitter, &create_syntax_error_diagnostics());
assert_snapshot!(content);
}
#[test]
fn notebook_output() {
let mut emitter = JsonEmitter;
let (diagnostics, notebook_indexes) = create_notebook_diagnostics();
let content =
capture_emitter_notebook_output(&mut emitter, &diagnostics, &notebook_indexes);
assert_snapshot!(content);
}
}

View File

@@ -1,60 +0,0 @@
use std::io::Write;
use ruff_db::diagnostic::Diagnostic;
use crate::message::json::message_to_json_value;
use crate::message::{Emitter, EmitterContext};
#[derive(Default)]
pub struct JsonLinesEmitter;
impl Emitter for JsonLinesEmitter {
fn emit(
&mut self,
writer: &mut dyn Write,
diagnostics: &[Diagnostic],
context: &EmitterContext,
) -> anyhow::Result<()> {
for diagnostic in diagnostics {
serde_json::to_writer(&mut *writer, &message_to_json_value(diagnostic, context))?;
writer.write_all(b"\n")?;
}
Ok(())
}
}
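Each diagnostic is serialized as a single JSON object followed by a newline, so consumers can process the output as a stream. A hedged sketch of reading it (sample values invented):

```python
import json

# Hypothetical JSON Lines output: one diagnostic object per line.
output = (
    '{"code": "F401", "filename": "a.py"}\n'
    '{"code": "E501", "filename": "b.py"}\n'
)
for line in output.splitlines():
    diagnostic = json.loads(line)
    print(diagnostic["code"], diagnostic["filename"])
```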
#[cfg(test)]
mod tests {
use insta::assert_snapshot;
use crate::message::json_lines::JsonLinesEmitter;
use crate::message::tests::{
capture_emitter_notebook_output, capture_emitter_output, create_diagnostics,
create_notebook_diagnostics, create_syntax_error_diagnostics,
};
#[test]
fn output() {
let mut emitter = JsonLinesEmitter;
let content = capture_emitter_output(&mut emitter, &create_diagnostics());
assert_snapshot!(content);
}
#[test]
fn syntax_errors() {
let mut emitter = JsonLinesEmitter;
let content = capture_emitter_output(&mut emitter, &create_syntax_error_diagnostics());
assert_snapshot!(content);
}
#[test]
fn notebook_output() {
let mut emitter = JsonLinesEmitter;
let (messages, notebook_indexes) = create_notebook_diagnostics();
let content = capture_emitter_notebook_output(&mut emitter, &messages, &notebook_indexes);
assert_snapshot!(content);
}
}

View File

@@ -3,17 +3,17 @@ use std::fmt::Display;
use std::io::Write;
use std::ops::Deref;
use ruff_db::diagnostic::{
Annotation, Diagnostic, DiagnosticId, LintName, SecondaryCode, Severity, Span,
};
use rustc_hash::FxHashMap;
pub use azure::AzureEmitter;
use ruff_db::diagnostic::{
Annotation, Diagnostic, DiagnosticId, FileResolver, Input, LintName, SecondaryCode, Severity,
Span, UnifiedFile,
};
use ruff_db::files::File;
pub use github::GithubEmitter;
pub use gitlab::GitlabEmitter;
pub use grouped::GroupedEmitter;
pub use json::JsonEmitter;
pub use json_lines::JsonLinesEmitter;
pub use junit::JunitEmitter;
pub use pylint::PylintEmitter;
pub use rdjson::RdjsonEmitter;
@@ -26,13 +26,10 @@ pub use text::TextEmitter;
use crate::Fix;
use crate::registry::Rule;
mod azure;
mod diff;
mod github;
mod gitlab;
mod grouped;
mod json;
mod json_lines;
mod junit;
mod pylint;
mod rdjson;
@@ -107,6 +104,34 @@ where
diagnostic
}
impl FileResolver for EmitterContext<'_> {
fn path(&self, _file: File) -> &str {
unimplemented!("Expected a Ruff file for rendering a Ruff diagnostic");
}
fn input(&self, _file: File) -> Input {
unimplemented!("Expected a Ruff file for rendering a Ruff diagnostic");
}
fn notebook_index(&self, file: &UnifiedFile) -> Option<NotebookIndex> {
match file {
UnifiedFile::Ty(_) => {
unimplemented!("Expected a Ruff file for rendering a Ruff diagnostic")
}
UnifiedFile::Ruff(file) => self.notebook_indexes.get(file.name()).cloned(),
}
}
fn is_notebook(&self, file: &UnifiedFile) -> bool {
match file {
UnifiedFile::Ty(_) => {
unimplemented!("Expected a Ruff file for rendering a Ruff diagnostic")
}
UnifiedFile::Ruff(file) => self.notebook_indexes.get(file.name()).is_some(),
}
}
}
struct MessageWithLocation<'a> {
message: &'a Diagnostic,
start_location: LineColumn,

View File

@@ -73,7 +73,7 @@ fn message_to_rdjson_value(message: &Diagnostic) -> Value {
},
"code": {
"value": message.secondary_code(),
"url": message.to_url(),
"url": message.to_ruff_url(),
},
"suggestions": rdjson_suggestions(fix.edits(), &source_code),
})
@@ -86,7 +86,7 @@ fn message_to_rdjson_value(message: &Diagnostic) -> Value {
},
"code": {
"value": message.secondary_code(),
"url": message.to_url(),
"url": message.to_ruff_url(),
},
})
}

View File

@@ -87,9 +87,14 @@ fn detect_blind_exception(
}
}
let first_arg = arguments.args.first()?;
let exception_argument_name = if is_pytest_raises {
"expected_exception"
} else {
"exception"
};
let builtin_symbol = semantic.resolve_builtin_symbol(first_arg)?;
let exception_expr = arguments.find_argument_value(exception_argument_name, 0)?;
let builtin_symbol = semantic.resolve_builtin_symbol(exception_expr)?;
match builtin_symbol {
"Exception" => Some(ExceptionKind::Exception),

View File

@@ -43,3 +43,29 @@ B017_0.py:57:36: B017 Do not assert blind exception: `Exception`
| ^^^^^^^^^^^^^^^^^^^^^^^^ B017
58 | raise ValueError("Multiple context managers")
|
B017_0.py:62:10: B017 Do not assert blind exception: `Exception`
|
61 | def test_pytest_raises_keyword():
62 | with pytest.raises(expected_exception=Exception):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ B017
63 | raise ValueError("Should be flagged")
|
B017_0.py:68:18: B017 Do not assert blind exception: `Exception`
|
66 | class TestKwargs(unittest.TestCase):
67 | def test_method(self):
68 | with self.assertRaises(exception=Exception):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ B017
69 | raise ValueError("Should be flagged")
|
B017_0.py:71:18: B017 Do not assert blind exception: `BaseException`
|
69 | raise ValueError("Should be flagged")
70 |
71 | with self.assertRaises(exception=BaseException):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ B017
72 | raise ValueError("Should be flagged")
|

View File

@@ -43,3 +43,29 @@ B017_0.py:57:36: B017 Do not assert blind exception: `Exception`
| ^^^^^^^^^^^^^^^^^^^^^^^^ B017
58 | raise ValueError("Multiple context managers")
|
B017_0.py:62:10: B017 Do not assert blind exception: `Exception`
|
61 | def test_pytest_raises_keyword():
62 | with pytest.raises(expected_exception=Exception):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ B017
63 | raise ValueError("Should be flagged")
|
B017_0.py:68:18: B017 Do not assert blind exception: `Exception`
|
66 | class TestKwargs(unittest.TestCase):
67 | def test_method(self):
68 | with self.assertRaises(exception=Exception):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ B017
69 | raise ValueError("Should be flagged")
|
B017_0.py:71:18: B017 Do not assert blind exception: `BaseException`
|
69 | raise ValueError("Should be flagged")
70 |
71 | with self.assertRaises(exception=BaseException):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ B017
72 | raise ValueError("Should be flagged")
|

View File

@@ -96,22 +96,43 @@ fn is_model_abstract(class_def: &ast::StmtClassDef) -> bool {
continue;
}
for element in body {
let Stmt::Assign(ast::StmtAssign { targets, value, .. }) = element else {
continue;
};
for target in targets {
let Expr::Name(ast::ExprName { id, .. }) = target else {
continue;
};
if id != "abstract" {
continue;
match element {
Stmt::Assign(ast::StmtAssign { targets, value, .. }) => {
if targets
.iter()
.any(|target| is_abstract_true_assignment(target, Some(value)))
{
return true;
}
}
if !is_const_true(value) {
continue;
Stmt::AnnAssign(ast::StmtAnnAssign { target, value, .. }) => {
if is_abstract_true_assignment(target, value.as_deref()) {
return true;
}
}
return true;
_ => {}
}
}
}
false
}
fn is_abstract_true_assignment(target: &Expr, value: Option<&Expr>) -> bool {
let Expr::Name(ast::ExprName { id, .. }) = target else {
return false;
};
if id != "abstract" {
return false;
}
let Some(value) = value else {
return false;
};
if !is_const_true(value) {
return false;
}
true
}
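In effect, DJ008 now treats an annotated assignment such as `abstract: ClassVar[bool] = True` the same as a plain `abstract = True` when deciding whether a model is abstract. A hedged Python sketch, mirroring the fixture above:

```python
from typing import ClassVar

from django.db import models

class AnnotatedAbstractModel(models.Model):
    new_field = models.CharField(max_length=10)

    class Meta:
        # Matched by the annotated-assignment branch above, so the model
        # counts as abstract and DJ008 does not demand a __str__ method.
        abstract: ClassVar[bool] = True
```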

View File

@@ -1,6 +1,5 @@
---
source: crates/ruff_linter/src/rules/flake8_django/mod.rs
snapshot_kind: text
---
DJ008.py:6:7: DJ008 Model does not define `__str__` method
|
@@ -31,3 +30,11 @@ DJ008.py:182:7: DJ008 Model does not define `__str__` method
| ^^^^^^^^^^^^^^^^^^ DJ008
183 | pass
|
DJ008.py:215:7: DJ008 Model does not define `__str__` method
|
215 | class TypeAnnotatedNonAbstractModel(models.Model):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ DJ008
216 | """Model with type-annotated abstract = False - should trigger DJ008"""
217 | new_field = models.CharField(max_length=10)
|

View File

@@ -912,6 +912,7 @@ mod tests {
#[test_case(Path::new("docstring.pyi"))]
#[test_case(Path::new("docstring_only.py"))]
#[test_case(Path::new("empty.py"))]
#[test_case(Path::new("multiple_strings.py"))]
fn required_imports(path: &Path) -> Result<()> {
let snapshot = format!("required_imports_{}", path.to_string_lossy());
let diagnostics = test_path(

View File

@@ -0,0 +1,18 @@
---
source: crates/ruff_linter/src/rules/isort/mod.rs
---
multiple_strings.py:1:1: I002 [*] Missing required import: `from __future__ import annotations`
Safe fix
1 1 | """This is a docstring."""
2 |+from __future__ import annotations
2 3 | "This is not a docstring."
3 4 | "This is also not a docstring."
4 5 |
multiple_strings.py:1:1: I002 [*] Missing required import: `from __future__ import generator_stop`
Safe fix
1 1 | """This is a docstring."""
2 |+from __future__ import generator_stop
2 3 | "This is not a docstring."
3 4 | "This is also not a docstring."
4 5 |

View File

@@ -18,11 +18,15 @@ use crate::checkers::ast::Checker;
///
/// ## Example
/// ```python
/// import os
///
/// os.getenv(1)
/// ```
///
/// Use instead:
/// ```python
/// import os
///
/// os.getenv("1")
/// ```
#[derive(ViolationMetadata)]

View File

@@ -14,12 +14,12 @@ use crate::{AlwaysFixableViolation, Edit, Fix};
///
/// ## Example
/// ```python
/// from xml.etree import cElementTree
/// from xml.etree import cElementTree as ET
/// ```
///
/// Use instead:
/// ```python
/// from xml.etree import ElementTree
/// from xml.etree import ElementTree as ET
/// ```
///
/// ## References

View File

@@ -27,6 +27,8 @@ use crate::{AlwaysFixableViolation, Edit, Fix};
///
/// ## Example
/// ```python
/// import asyncio
///
/// raise asyncio.TimeoutError
/// ```
///

View File

@@ -952,6 +952,9 @@ impl Display for SemanticSyntaxError {
SemanticSyntaxErrorKind::LoadBeforeGlobalDeclaration { name, start: _ } => {
write!(f, "name `{name}` is used prior to global declaration")
}
SemanticSyntaxErrorKind::LoadBeforeNonlocalDeclaration { name, start: _ } => {
write!(f, "name `{name}` is used prior to nonlocal declaration")
}
SemanticSyntaxErrorKind::InvalidStarExpression => {
f.write_str("Starred expression cannot be used here")
}
@@ -977,6 +980,15 @@ impl Display for SemanticSyntaxError {
SemanticSyntaxErrorKind::NonlocalDeclarationAtModuleLevel => {
write!(f, "nonlocal declaration not allowed at module level")
}
SemanticSyntaxErrorKind::NonlocalAndGlobal(name) => {
write!(f, "name `{name}` is nonlocal and global")
}
SemanticSyntaxErrorKind::AnnotatedGlobal(name) => {
write!(f, "annotated name `{name}` can't be global")
}
SemanticSyntaxErrorKind::AnnotatedNonlocal(name) => {
write!(f, "annotated name `{name}` can't be nonlocal")
}
}
}
}
@@ -1207,6 +1219,24 @@ pub enum SemanticSyntaxErrorKind {
/// [#111123]: https://github.com/python/cpython/issues/111123
LoadBeforeGlobalDeclaration { name: String, start: TextSize },
/// Represents the use of a `nonlocal` variable before its `nonlocal` declaration.
///
/// ## Examples
///
/// ```python
/// def f():
/// counter = 0
/// def increment():
/// print(f"Adding 1 to {counter}")
/// nonlocal counter # SyntaxError: name 'counter' is used prior to nonlocal declaration
/// counter += 1
/// ```
///
/// ## Known Issues
///
/// See [`LoadBeforeGlobalDeclaration`][Self::LoadBeforeGlobalDeclaration].
LoadBeforeNonlocalDeclaration { name: String, start: TextSize },
/// Represents the use of a starred expression in an invalid location, such as a `return` or
/// `yield` statement.
///
@@ -1307,6 +1337,15 @@ pub enum SemanticSyntaxErrorKind {
/// Represents a nonlocal declaration at module level
NonlocalDeclarationAtModuleLevel,
/// Represents the same variable declared as both nonlocal and global
NonlocalAndGlobal(String),
/// Represents a type annotation on a variable that's been declared global
AnnotatedGlobal(String),
/// Represents a type annotation on a variable that's been declared nonlocal
AnnotatedNonlocal(String),
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, get_size2::GetSize)]

View File

@@ -301,7 +301,7 @@ fn to_lsp_diagnostic(
severity,
tags,
code,
code_description: diagnostic.to_url().and_then(|url| {
code_description: diagnostic.to_ruff_url().and_then(|url| {
Some(lsp_types::CodeDescription {
href: lsp_types::Url::parse(&url).ok()?,
})

View File

@@ -323,8 +323,8 @@ pub enum OutputFormat {
Concise,
}
impl From<OutputFormat> for ruff_db::diagnostic::DiagnosticFormat {
fn from(format: OutputFormat) -> ruff_db::diagnostic::DiagnosticFormat {
impl From<OutputFormat> for ty_project::metadata::options::OutputFormat {
fn from(format: OutputFormat) -> ty_project::metadata::options::OutputFormat {
match format {
OutputFormat::Full => Self::Full,
OutputFormat::Concise => Self::Concise,

View File

@@ -290,7 +290,7 @@ impl MainLoop {
} => {
let terminal_settings = db.project().settings(db).terminal();
let display_config = DisplayDiagnosticConfig::default()
.format(terminal_settings.output_format)
.format(terminal_settings.output_format.into())
.color(colored::control::SHOULD_COLORIZE.should_colorize());
if check_revision == revision {

View File

@@ -536,6 +536,9 @@ _private_type_var_tuple = TypeVarTuple("_private_type_var_tuple")
public_explicit_type_alias: TypeAlias = Literal[1]
_private_explicit_type_alias: TypeAlias = Literal[1]
public_implicit_union_alias = int | str
_private_implicit_union_alias = int | str
class PublicProtocol(Protocol):
def method(self) -> None: ...
@@ -557,7 +560,9 @@ class _PrivateProtocol(Protocol):
test.assert_completions_include("public_type_var_tuple");
test.assert_completions_do_not_include("_private_type_var_tuple");
test.assert_completions_include("public_explicit_type_alias");
test.assert_completions_include("_private_explicit_type_alias");
test.assert_completions_do_not_include("_private_explicit_type_alias");
test.assert_completions_include("public_implicit_union_alias");
test.assert_completions_do_not_include("_private_implicit_union_alias");
test.assert_completions_include("PublicProtocol");
test.assert_completions_do_not_include("_PrivateProtocol");
}
@@ -2391,6 +2396,48 @@ Cougar = 3
test.assert_completions_include("Cheetah");
}
#[test]
fn from_import_with_submodule1() {
let test = CursorTest::builder()
.source("main.py", "from package import <CURSOR>")
.source("package/__init__.py", "")
.source("package/foo.py", "")
.source("package/bar.pyi", "")
.source("package/foo-bar.py", "")
.source("package/data.txt", "")
.source("package/sub/__init__.py", "")
.source("package/not-a-submodule/__init__.py", "")
.build();
test.assert_completions_include("foo");
test.assert_completions_include("bar");
test.assert_completions_include("sub");
test.assert_completions_do_not_include("foo-bar");
test.assert_completions_do_not_include("data");
test.assert_completions_do_not_include("not-a-submodule");
}
#[test]
fn from_import_with_vendored_submodule1() {
let test = cursor_test(
"\
from http import <CURSOR>
",
);
test.assert_completions_include("client");
}
#[test]
fn from_import_with_vendored_submodule2() {
let test = cursor_test(
"\
from email import <CURSOR>
",
);
test.assert_completions_include("mime");
test.assert_completions_do_not_include("base");
}
#[test]
fn import_submodule_not_attribute1() {
let test = cursor_test(

View File

@@ -979,6 +979,39 @@ impl GlobFilterContext {
}
}
/// The diagnostic output format.
#[derive(Debug, Default, Clone, Copy, Eq, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case", deny_unknown_fields)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
pub enum OutputFormat {
/// The default full mode will print "pretty" diagnostics.
///
/// That is, color will be used when printing to a `tty`.
/// Moreover, diagnostic messages may include additional
/// context and annotations on the input to help understand
/// the message.
#[default]
Full,
/// Print diagnostics in a concise mode.
///
/// This will guarantee that each diagnostic is printed on
/// a single line. Only the most important or primary aspects
/// of the diagnostic are included. Contextual information is
/// dropped.
///
/// This may use color when printing to a `tty`.
Concise,
}
impl From<OutputFormat> for DiagnosticFormat {
fn from(value: OutputFormat) -> Self {
match value {
OutputFormat::Full => Self::Full,
OutputFormat::Concise => Self::Concise,
}
}
}
#[derive(
Debug, Default, Clone, Eq, PartialEq, Combine, Serialize, Deserialize, OptionsMetadata,
)]
@@ -996,7 +1029,7 @@ pub struct TerminalOptions {
output-format = "concise"
"#
)]
pub output_format: Option<RangedValue<DiagnosticFormat>>,
pub output_format: Option<RangedValue<OutputFormat>>,
/// Use exit code 1 if there are any warning-level diagnostics.
///
/// Defaults to `false`.
@@ -1295,7 +1328,7 @@ pub(super) struct InnerOverrideOptions {
#[derive(Debug)]
pub struct ToSettingsError {
diagnostic: Box<OptionDiagnostic>,
output_format: DiagnosticFormat,
output_format: OutputFormat,
color: bool,
}
@@ -1309,7 +1342,7 @@ impl ToSettingsError {
impl fmt::Display for DisplayPretty<'_> {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let display_config = DisplayDiagnosticConfig::default()
.format(self.error.output_format)
.format(self.error.output_format.into())
.color(self.error.color);
write!(

View File

@@ -1,9 +1,9 @@
use std::sync::Arc;
use ruff_db::{diagnostic::DiagnosticFormat, files::File};
use ruff_db::files::File;
use ty_python_semantic::lint::RuleSelection;
use crate::metadata::options::InnerOverrideOptions;
use crate::metadata::options::{InnerOverrideOptions, OutputFormat};
use crate::{Db, combine::Combine, glob::IncludeExcludeFilter};
/// The resolved [`super::Options`] for the project.
@@ -57,7 +57,7 @@ impl Settings {
#[derive(Debug, Clone, PartialEq, Eq, Default)]
pub struct TerminalSettings {
pub output_format: DiagnosticFormat,
pub output_format: OutputFormat,
pub error_on_warning: bool,
}

View File

@@ -1304,7 +1304,7 @@ scope of the name that was declared `global`, can add a symbol to the global nam
def f():
global g, h
g: bool = True
g = True
f()
```

View File

@@ -83,7 +83,7 @@ def f():
x = 1
def g() -> None:
nonlocal x
global x # TODO: error: [invalid-syntax] "name 'x' is nonlocal and global"
global x # error: [invalid-syntax] "name `x` is nonlocal and global"
x = None
```
@@ -209,5 +209,18 @@ x: int = 1
def f():
global x
x: str = "foo" # TODO: error: [invalid-syntax] "annotated name 'x' can't be global"
x: str = "foo" # error: [invalid-syntax] "annotated name `x` can't be global"
```
## Global declarations affect the inferred type of the binding
Even if the `global` declaration isn't used in an assignment, we conservatively assume it could be:
```py
x = 1
def f():
global x
# TODO: reveal_type(x) # revealed: Unknown | Literal[1]
```

View File

@@ -43,3 +43,321 @@ def f():
def h():
reveal_type(x) # revealed: Unknown | Literal[1]
```
## The `nonlocal` keyword
Without the `nonlocal` keyword, bindings in an inner scope shadow variables of the same name in
enclosing scopes. This example isn't a type error, because the inner `x` shadows the outer one:
```py
def f():
x: int = 1
def g():
x = "hello" # allowed
```
With `nonlocal` it is a type error, because `x` refers to the same place in both scopes:
```py
def f():
x: int = 1
def g():
nonlocal x
x = "hello" # error: [invalid-assignment] "Object of type `Literal["hello"]` is not assignable to `int`"
```
## Local variable bindings "look ahead" to any assignment in the current scope
The binding `x = 2` in `g` causes the earlier read of `x` to refer to `g`'s not-yet-initialized
binding, rather than to `x = 1` in `f`'s scope:
```py
def f():
x = 1
def g():
if x == 1: # error: [unresolved-reference] "Name `x` used when not defined"
x = 2
```
The `nonlocal` keyword makes this example legal (and makes the assignment `x = 2` affect the outer
scope):
```py
def f():
x = 1
def g():
nonlocal x
if x == 1:
x = 2
```
For the same reason, using the `+=` operator in an inner scope is an error without `nonlocal`
(unless you shadow the outer variable first):
```py
def f():
x = 1
def g():
x += 1 # error: [unresolved-reference] "Name `x` used when not defined"
def f():
x = 1
def g():
x = 1
x += 1 # allowed, but doesn't affect the outer scope
def f():
x = 1
def g():
nonlocal x
x += 1 # allowed, and affects the outer scope
```
## `nonlocal` declarations must match an outer binding
`nonlocal x` isn't allowed when there's no binding for `x` in an enclosing scope:
```py
def f():
def g():
nonlocal x # error: [invalid-syntax] "no binding for nonlocal `x` found"
def f():
x = 1
def g():
nonlocal x, y # error: [invalid-syntax] "no binding for nonlocal `y` found"
```
A global `x` doesn't work. The target must be in a function-like scope:
```py
x = 1
def f():
def g():
nonlocal x # error: [invalid-syntax] "no binding for nonlocal `x` found"
def f():
global x
def g():
nonlocal x # error: [invalid-syntax] "no binding for nonlocal `x` found"
```
A class-scoped `x` also doesn't work:
```py
class Foo:
x = 1
@staticmethod
def f():
nonlocal x # error: [invalid-syntax] "no binding for nonlocal `x` found"
```
However, class-scoped bindings don't break the `nonlocal` chain the way `global` declarations do:
```py
def f():
x: int = 1
class Foo:
x: str = "hello"
@staticmethod
def g():
# Skips the class scope and reaches the outer function scope.
nonlocal x
x = 2 # allowed
x = "goodbye" # error: [invalid-assignment]
```
## `nonlocal` uses the closest binding
```py
def f():
x = 1
def g():
x = 2
def h():
nonlocal x
reveal_type(x) # revealed: Unknown | Literal[2]
```
## `nonlocal` "chaining"
Multiple `nonlocal` statements can "chain" through nested scopes:
```py
def f():
x = 1
def g():
nonlocal x
def h():
nonlocal x
reveal_type(x) # revealed: Unknown | Literal[1]
```
And the `nonlocal` chain can skip over a scope that doesn't bind the variable:
```py
def f1():
x = 1
def f2():
nonlocal x
def f3():
# No binding; this scope gets skipped.
def f4():
nonlocal x
reveal_type(x) # revealed: Unknown | Literal[1]
```
But a `global` statement breaks the chain:
```py
def f():
x = 1
def g():
global x
def h():
nonlocal x # error: [invalid-syntax] "no binding for nonlocal `x` found"
```
## `nonlocal` bindings respect declared types from the defining scope, even without a binding
```py
def f():
x: int
def g():
nonlocal x
x = "string" # error: [invalid-assignment] "Object of type `Literal["string"]` is not assignable to `int`"
```
## A complicated mixture of `nonlocal` chaining, empty scopes, class scopes, and the `global` keyword
```py
def f1():
# The original bindings of `x`, `y`, and `z` with type declarations.
x: int = 1
y: int = 2
z: int = 3
def f2():
# This scope doesn't touch `x`, `y`, or `z` at all.
class Foo:
# This class scope is totally ignored.
x: str = "a"
y: str = "b"
z: str = "c"
@staticmethod
def f3():
# This scope declares `x` nonlocal and `y` as global, and it shadows `z` without
# giving it a type declaration.
nonlocal x
x = 4
y = 5
global z
z = 6
def f4():
# This scope sees `x` from `f1` and `y` from `f3`. It *can't* declare `z`
# nonlocal, because of the global statement above, but it *can* load `z` as a
# "free" variable, in which case it sees the global value.
nonlocal x, y, z # error: [invalid-syntax] "no binding for nonlocal `z` found"
x = "string" # error: [invalid-assignment]
y = "string" # allowed, because `f3`'s `y` is untyped
reveal_type(z) # revealed: Unknown | Literal[6]
```
## TODO: `nonlocal` affects the inferred type in the outer scope
Without `nonlocal`, `g` can't write to `x`, and the inferred type of `x` in `f`'s scope isn't
affected by `g`:
```py
def f():
x = 1
def g():
reveal_type(x) # revealed: Unknown | Literal[1]
reveal_type(x) # revealed: Literal[1]
```
But with `nonlocal`, `g` could write to `x`, and that affects its inferred type in `f`. That's true
regardless of whether `g` actually writes to `x`. With a write:
```py
def f():
x = 1
def g():
nonlocal x
reveal_type(x) # revealed: Unknown | Literal[1]
x += 1
reveal_type(x) # revealed: Unknown | Literal[2]
# TODO: should be `Unknown | Literal[1]`
reveal_type(x) # revealed: Literal[1]
```
Without a write:
```py
def f():
x = 1
def g():
nonlocal x
reveal_type(x) # revealed: Unknown | Literal[1]
# TODO: should be `Unknown | Literal[1]`
reveal_type(x) # revealed: Literal[1]
```
## Annotating a `nonlocal` binding is a syntax error
```py
def f():
x: int = 1
def g():
nonlocal x
x: str = "foo" # error: [invalid-syntax] "annotated name `x` can't be nonlocal"
```
## Use before `nonlocal`
Using a name prior to its `nonlocal` declaration in the same scope is a syntax error:
```py
def f():
x = 1
def g():
x = 2
nonlocal x # error: [invalid-syntax] "name `x` is used prior to nonlocal declaration"
```
This is true even if there are multiple `nonlocal` declarations of the same variable, as long as any of them comes after the usage:
```py
def f():
x = 1
def g():
nonlocal x
x = 2
nonlocal x # error: [invalid-syntax] "name `x` is used prior to nonlocal declaration"
def f():
x = 1
def g():
nonlocal x
nonlocal x
x = 2 # allowed
```
## `nonlocal` before outer initialization
`nonlocal x` works even if `x` isn't bound in the enclosing scope until afterwards:
```py
def f():
def g():
# This is allowed, because of the subsequent definition of `x`.
nonlocal x
x = 1
```

View File

@@ -147,8 +147,7 @@ def nonlocal_use():
X: Final[int] = 1
def inner():
nonlocal X
# TODO: this should be an error
X = 2
X = 2 # error: [invalid-assignment] "Reassignment of `Final` symbol `X` is not allowed: Reassignment of `Final` symbol"
```
`main.py`:

View File

@@ -3,9 +3,13 @@ use std::str::FromStr;
use std::sync::Arc;
use ruff_db::files::File;
use ruff_python_ast::name::Name;
use ruff_python_stdlib::identifiers::is_identifier;
use super::path::SearchPath;
use crate::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::path::SystemOrVendoredPathRef;
/// Representation of a Python module.
#[derive(Clone, PartialEq, Eq, Hash, get_size2::GetSize)]
@@ -85,6 +89,100 @@ impl Module {
ModuleInner::NamespacePackage { .. } => ModuleKind::Package,
}
}
/// Return a list of all submodules of this module.
///
/// Returns an empty list if the module is not a package, if it is an empty package,
/// or if it is a namespace package (one without an `__init__.py` or `__init__.pyi` file).
///
/// The names returned correspond to the "base" name of the module.
/// That is, `{self.name}.{basename}` should give the full module name.
pub fn all_submodules(&self, db: &dyn Db) -> Vec<Name> {
self.all_submodules_inner(db).unwrap_or_default()
}
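Concretely, the returned names are base names only; joining them with the package name reconstructs the full module paths. A small Python illustration with a hypothetical layout:

```python
# Hypothetical package layout:
#   package/__init__.py
#   package/foo.py
#   package/bar.pyi
#   package/sub/__init__.py
#
# all_submodules would yield the base names; the full module names are
# recovered by joining each one with the package name.
package = "package"
for basename in ("foo", "bar", "sub"):
    print(f"{package}.{basename}")
```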
fn all_submodules_inner(&self, db: &dyn Db) -> Option<Vec<Name>> {
fn is_submodule(
is_dir: bool,
is_file: bool,
basename: Option<&str>,
extension: Option<&str>,
) -> bool {
is_dir
|| (is_file
&& matches!(extension, Some("py" | "pyi"))
&& !matches!(basename, Some("__init__.py" | "__init__.pyi")))
}
// It would be complex and expensive to compute all submodules for
// namespace packages, since a namespace package doesn't correspond
// to a single file; it can span multiple directories across multiple
// search paths. For now, we only compute submodules for traditional
// packages that exist in a single directory on a single search path.
let ModuleInner::FileModule {
kind: ModuleKind::Package,
file,
..
} = &*self.inner
else {
return None;
};
let path = SystemOrVendoredPathRef::try_from_file(db, *file)?;
debug_assert!(
matches!(path.file_name(), Some("__init__.py" | "__init__.pyi")),
"expected package file `{:?}` to be `__init__.py` or `__init__.pyi`",
path.file_name(),
);
Some(match path.parent()? {
SystemOrVendoredPathRef::System(parent_directory) => db
.system()
.read_directory(parent_directory)
.inspect_err(|err| {
tracing::debug!(
"Failed to read {parent_directory:?} when looking for \
its possible submodules: {err}"
);
})
.ok()?
.flatten()
.filter(|entry| {
let ty = entry.file_type();
let path = entry.path();
is_submodule(
ty.is_directory(),
ty.is_file(),
path.file_name(),
path.extension(),
)
})
.filter_map(|entry| {
let stem = entry.path().file_stem()?;
is_identifier(stem).then(|| Name::from(stem))
})
.collect(),
SystemOrVendoredPathRef::Vendored(parent_directory) => db
.vendored()
.read_directory(parent_directory)
.into_iter()
.filter(|entry| {
let ty = entry.file_type();
let path = entry.path();
is_submodule(
ty.is_directory(),
ty.is_file(),
path.file_name(),
path.extension(),
)
})
.filter_map(|entry| {
let stem = entry.path().file_stem()?;
is_identifier(stem).then(|| Name::from(stem))
})
.collect(),
})
}
}
impl std::fmt::Debug for Module {

View File

@@ -4,11 +4,12 @@ use std::fmt;
use std::sync::Arc;
use camino::{Utf8Path, Utf8PathBuf};
use ruff_db::files::{File, FileError, system_path_to_file, vendored_path_to_file};
use ruff_db::files::{File, FileError, FilePath, system_path_to_file, vendored_path_to_file};
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ruff_db::vendored::{VendoredPath, VendoredPathBuf};
use super::typeshed::{TypeshedVersionsParseError, TypeshedVersionsQueryResult, typeshed_versions};
use crate::Db;
use crate::module_name::ModuleName;
use crate::module_resolver::resolver::ResolverContext;
use crate::site_packages::SitePackagesDiscoveryError;
@@ -652,6 +653,48 @@ impl fmt::Display for SearchPath {
}
}
#[derive(Debug, Clone)]
pub(super) enum SystemOrVendoredPathRef<'db> {
System(&'db SystemPath),
Vendored(&'db VendoredPath),
}
impl<'db> SystemOrVendoredPathRef<'db> {
pub(super) fn try_from_file(db: &'db dyn Db, file: File) -> Option<Self> {
match file.path(db) {
FilePath::System(system) => Some(Self::System(system)),
FilePath::Vendored(vendored) => Some(Self::Vendored(vendored)),
FilePath::SystemVirtual(_) => None,
}
}
pub(super) fn file_name(&self) -> Option<&str> {
match self {
Self::System(system) => system.file_name(),
Self::Vendored(vendored) => vendored.file_name(),
}
}
pub(super) fn parent<'a>(&'a self) -> Option<SystemOrVendoredPathRef<'a>>
where
'a: 'db,
{
match self {
Self::System(system) => system.parent().map(Self::System),
Self::Vendored(vendored) => vendored.parent().map(Self::Vendored),
}
}
}
impl std::fmt::Display for SystemOrVendoredPathRef<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
SystemOrVendoredPathRef::System(system) => system.fmt(f),
SystemOrVendoredPathRef::Vendored(vendored) => vendored.fmt(f),
}
}
}
#[cfg(test)]
mod tests {
use ruff_db::Db;

View File

@@ -8,7 +8,7 @@ use rustc_hash::{FxBuildHasher, FxHashSet};
use ruff_db::files::{File, FilePath, FileRootKind};
use ruff_db::system::{DirectoryEntry, System, SystemPath, SystemPathBuf};
use ruff_db::vendored::{VendoredFileSystem, VendoredPath};
use ruff_db::vendored::VendoredFileSystem;
use ruff_python_ast::PythonVersion;
use crate::db::Db;
@@ -17,7 +17,7 @@ use crate::module_resolver::typeshed::{TypeshedVersions, vendored_typeshed_versi
use crate::{Program, SearchPathSettings};
use super::module::{Module, ModuleKind};
use super::path::{ModulePath, SearchPath, SearchPathValidationError};
use super::path::{ModulePath, SearchPath, SearchPathValidationError, SystemOrVendoredPathRef};
/// Resolves a module name to a module.
pub fn resolve_module(db: &dyn Db, module_name: &ModuleName) -> Option<Module> {
@@ -77,21 +77,6 @@ pub(crate) fn path_to_module(db: &dyn Db, path: &FilePath) -> Option<Module> {
file_to_module(db, file)
}
#[derive(Debug, Clone, Copy)]
enum SystemOrVendoredPathRef<'a> {
System(&'a SystemPath),
Vendored(&'a VendoredPath),
}
impl std::fmt::Display for SystemOrVendoredPathRef<'_> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
SystemOrVendoredPathRef::System(system) => system.fmt(f),
SystemOrVendoredPathRef::Vendored(vendored) => vendored.fmt(f),
}
}
}
/// Resolves the module for the file with the given id.
///
/// Returns `None` if the file is not a module locatable via any of the known search paths.
@@ -99,11 +84,7 @@ impl std::fmt::Display for SystemOrVendoredPathRef<'_> {
pub(crate) fn file_to_module(db: &dyn Db, file: File) -> Option<Module> {
let _span = tracing::trace_span!("file_to_module", ?file).entered();
let path = match file.path(db) {
FilePath::System(system) => SystemOrVendoredPathRef::System(system),
FilePath::Vendored(vendored) => SystemOrVendoredPathRef::Vendored(vendored),
FilePath::SystemVirtual(_) => return None,
};
let path = SystemOrVendoredPathRef::try_from_file(db, file)?;
let module_name = search_paths(db).find_map(|candidate| {
let relative_path = match path {

View File

@@ -1,4 +1,4 @@
use ruff_python_ast::{HasNodeIndex, NodeIndex};
use ruff_python_ast::{self as ast, HasNodeIndex, NodeIndex};
/// Compact key for a node for use in a hash map.
#[derive(Copy, Clone, Debug, Eq, PartialEq, Hash, get_size2::GetSize)]
@@ -12,3 +12,31 @@ impl NodeKey {
NodeKey(node.node_index().load())
}
}
#[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, salsa::Update, get_size2::GetSize)]
pub(crate) struct ExpressionNodeKey(NodeKey);
// TODO: Delete after merging https://github.com/astral-sh/ruff/pull/19025
impl From<&ast::Identifier> for ExpressionNodeKey {
fn from(value: &ast::Identifier) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<ast::ExprRef<'_>> for ExpressionNodeKey {
fn from(value: ast::ExprRef<'_>) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::Expr> for ExpressionNodeKey {
fn from(value: &ast::Expr) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::ExprCall> for ExpressionNodeKey {
fn from(value: &ast::ExprCall) -> Self {
Self(NodeKey::from_node(value))
}
}

View File

@@ -1421,7 +1421,7 @@ impl RequiresExplicitReExport {
/// ```py
/// def _():
/// x = 1
///
///
/// x = 2
///
/// if flag():

View File

@@ -12,9 +12,7 @@ use salsa::plumbing::AsId;
use crate::Db;
use crate::module_name::ModuleName;
use crate::node_key::NodeKey;
use crate::semantic_index::ast_ids::AstIds;
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::node_key::{ExpressionNodeKey, NodeKey};
use crate::semantic_index::builder::SemanticIndexBuilder;
use crate::semantic_index::definition::{Definition, DefinitionNodeKey, Definitions};
use crate::semantic_index::expression::Expression;
@@ -24,9 +22,9 @@ use crate::semantic_index::place::{
ScopeKind, ScopedPlaceId,
};
use crate::semantic_index::use_def::{EagerSnapshotKey, ScopedEagerSnapshotId, UseDefMap};
pub(crate) use crate::semantic_index::use_def::{FileUseId, HasFileUseId};
use crate::util::get_size::untracked_arc_size;
pub mod ast_ids;
mod builder;
pub mod definition;
pub mod expression;
@@ -217,18 +215,9 @@ pub(crate) struct SemanticIndex<'db> {
/// Map from the file-local [`FileScopeId`] to the salsa-ingredient [`ScopeId`].
scope_ids_by_scope: IndexVec<FileScopeId, ScopeId<'db>>,
/// Map from the file-local [`FileScopeId`] to the set of explicit-global symbols it contains.
globals_by_scope: FxHashMap<FileScopeId, FxHashSet<ScopedPlaceId>>,
/// Use-def map for each scope in this file.
use_def_maps: IndexVec<FileScopeId, ArcUseDefMap<'db>>,
/// Lookup table to map between node ids and ast nodes.
///
/// Note: We should not depend on this map when analysing other files or
/// changing a file invalidates all dependents.
ast_ids: IndexVec<FileScopeId, AstIds>,
/// The set of modules that are imported anywhere within this file.
imported_modules: Arc<FxHashSet<ModuleName>>,
@@ -264,11 +253,6 @@ impl<'db> SemanticIndex<'db> {
self.use_def_maps[scope_id].clone()
}
#[track_caller]
pub(crate) fn ast_ids(&self, scope_id: FileScopeId) -> &AstIds {
&self.ast_ids[scope_id]
}
/// Returns the ID of the `expression`'s enclosing scope.
#[track_caller]
pub(crate) fn expression_scope_id(
@@ -308,9 +292,19 @@ impl<'db> SemanticIndex<'db> {
symbol: ScopedPlaceId,
scope: FileScopeId,
) -> bool {
self.globals_by_scope
.get(&scope)
.is_some_and(|globals| globals.contains(&symbol))
self.place_table(scope)
.place_expr(symbol)
.is_marked_global()
}
pub(crate) fn symbol_is_nonlocal_in_scope(
&self,
symbol: ScopedPlaceId,
scope: FileScopeId,
) -> bool {
self.place_table(scope)
.place_expr(symbol)
.is_marked_nonlocal()
}
/// Returns the id of the parent scope.
@@ -616,11 +610,12 @@ mod tests {
use crate::Db;
use crate::db::tests::{TestDb, TestDbBuilder};
use crate::semantic_index::ast_ids::{HasScopedUseId, ScopedUseId};
use crate::semantic_index::definition::{Definition, DefinitionKind};
use crate::semantic_index::place::{FileScopeId, PlaceTable, Scope, ScopeKind, ScopedPlaceId};
use crate::semantic_index::use_def::UseDefMap;
use crate::semantic_index::{global_scope, place_table, semantic_index, use_def_map};
use crate::semantic_index::{
FileUseId, HasFileUseId, global_scope, place_table, semantic_index, use_def_map,
};
impl UseDefMap<'_> {
fn first_public_binding(&self, symbol: ScopedPlaceId) -> Option<Definition<'_>> {
@@ -628,8 +623,8 @@ mod tests {
.find_map(|constrained_binding| constrained_binding.binding.definition())
}
fn first_binding_at_use(&self, use_id: ScopedUseId) -> Option<Definition<'_>> {
self.bindings_at_use(use_id)
fn first_binding_at_use(&self, use_id: FileUseId) -> Option<Definition<'_>> {
self.bindings_for_node(use_id)
.find_map(|constrained_binding| constrained_binding.binding.definition())
}
}
@@ -1052,8 +1047,7 @@ def f(a: str, /, b: str, c: int = 1, *args, d: int = 2, **kwargs):
.elt
.as_name_expr()
.unwrap();
let element_use_id =
element.scoped_use_id(&db, comprehension_scope_id.to_scope_id(&db, file));
let element_use_id = element.use_id();
let binding = use_def.first_binding_at_use(element_use_id).unwrap();
let DefinitionKind::Comprehension(comprehension) = binding.kind(&db) else {
@@ -1327,7 +1321,7 @@ class C[T]:
let ast::Expr::Name(x_use_expr_name) = x_use_expr.as_ref() else {
panic!("expected a Name");
};
let x_use_id = x_use_expr_name.scoped_use_id(&db, scope);
let x_use_id = x_use_expr_name.use_id();
let use_def = use_def_map(&db, scope);
let binding = use_def.first_binding_at_use(x_use_id).unwrap();
let DefinitionKind::Assignment(assignment) = binding.kind(&db) else {

View File

@@ -1,144 +0,0 @@
use rustc_hash::FxHashMap;
use ruff_index::newtype_index;
use ruff_python_ast as ast;
use ruff_python_ast::ExprRef;
use crate::Db;
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::place::ScopeId;
use crate::semantic_index::semantic_index;
/// AST ids for a single scope.
///
/// The motivation for building the AST ids per scope isn't about reducing invalidation because
/// the struct changes whenever the parsed AST changes. Instead, it's mainly that we can
/// build the AST ids struct when building the place table and also keep the property that
/// IDs of outer scopes are unaffected by changes in inner scopes.
///
/// For example, we don't want adding new statements to `foo` to change the statement id of `x = foo()` in:
///
/// ```python
/// def foo():
/// return 5
///
/// x = foo()
/// ```
#[derive(Debug, salsa::Update, get_size2::GetSize)]
pub(crate) struct AstIds {
/// Maps expressions which "use" a place (that is, [`ast::ExprName`], [`ast::ExprAttribute`] or [`ast::ExprSubscript`]) to a use id.
uses_map: FxHashMap<ExpressionNodeKey, ScopedUseId>,
}
impl AstIds {
fn use_id(&self, key: impl Into<ExpressionNodeKey>) -> ScopedUseId {
self.uses_map[&key.into()]
}
}
fn ast_ids<'db>(db: &'db dyn Db, scope: ScopeId) -> &'db AstIds {
semantic_index(db, scope.file(db)).ast_ids(scope.file_scope_id(db))
}
/// Uniquely identifies a use of a name in a [`crate::semantic_index::place::FileScopeId`].
#[newtype_index]
#[derive(get_size2::GetSize)]
pub struct ScopedUseId;
pub trait HasScopedUseId {
/// Returns the ID that uniquely identifies the use in `scope`.
fn scoped_use_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedUseId;
}
impl HasScopedUseId for ast::Identifier {
fn scoped_use_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedUseId {
let ast_ids = ast_ids(db, scope);
ast_ids.use_id(self)
}
}
impl HasScopedUseId for ast::ExprName {
fn scoped_use_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedUseId {
let expression_ref = ExprRef::from(self);
expression_ref.scoped_use_id(db, scope)
}
}
impl HasScopedUseId for ast::ExprAttribute {
fn scoped_use_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedUseId {
let expression_ref = ExprRef::from(self);
expression_ref.scoped_use_id(db, scope)
}
}
impl HasScopedUseId for ast::ExprSubscript {
fn scoped_use_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedUseId {
let expression_ref = ExprRef::from(self);
expression_ref.scoped_use_id(db, scope)
}
}
impl HasScopedUseId for ast::ExprRef<'_> {
fn scoped_use_id(&self, db: &dyn Db, scope: ScopeId) -> ScopedUseId {
let ast_ids = ast_ids(db, scope);
ast_ids.use_id(*self)
}
}
#[derive(Debug, Default)]
pub(super) struct AstIdsBuilder {
uses_map: FxHashMap<ExpressionNodeKey, ScopedUseId>,
}
impl AstIdsBuilder {
/// Adds `expr` to the use ids map and returns its id.
pub(super) fn record_use(&mut self, expr: impl Into<ExpressionNodeKey>) -> ScopedUseId {
let use_id = self.uses_map.len().into();
self.uses_map.insert(expr.into(), use_id);
use_id
}
pub(super) fn finish(mut self) -> AstIds {
self.uses_map.shrink_to_fit();
AstIds {
uses_map: self.uses_map,
}
}
}
/// Node key that can only be constructed for expressions.
pub(crate) mod node_key {
use ruff_python_ast as ast;
use crate::node_key::NodeKey;
#[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, salsa::Update, get_size2::GetSize)]
pub(crate) struct ExpressionNodeKey(NodeKey);
impl From<ast::ExprRef<'_>> for ExpressionNodeKey {
fn from(value: ast::ExprRef<'_>) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::Expr> for ExpressionNodeKey {
fn from(value: &ast::Expr) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::ExprCall> for ExpressionNodeKey {
fn from(value: &ast::ExprCall) -> Self {
Self(NodeKey::from_node(value))
}
}
impl From<&ast::Identifier> for ExpressionNodeKey {
fn from(value: &ast::Identifier) -> Self {
Self(NodeKey::from_node(value))
}
}
}

View File

@@ -19,9 +19,8 @@ use ruff_text_size::TextRange;
use crate::ast_node_ref::AstNodeRef;
use crate::module_name::ModuleName;
use crate::module_resolver::resolve_module;
use crate::node_key::NodeKey;
use crate::semantic_index::ast_ids::AstIdsBuilder;
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::node_key::{ExpressionNodeKey, NodeKey};
use crate::semantic_index::definition::{
AnnotatedAssignmentDefinitionNodeRef, AssignmentDefinitionNodeRef,
ComprehensionDefinitionNodeRef, Definition, DefinitionCategory, DefinitionNodeKey,
@@ -45,7 +44,7 @@ use crate::semantic_index::reachability_constraints::{
use crate::semantic_index::use_def::{
EagerSnapshotKey, FlowSnapshot, ScopedEagerSnapshotId, UseDefMapBuilder,
};
use crate::semantic_index::{ArcUseDefMap, SemanticIndex};
use crate::semantic_index::{ArcUseDefMap, HasFileUseId, SemanticIndex};
use crate::unpack::{Unpack, UnpackKind, UnpackPosition, UnpackValue};
use crate::{Db, Program};
@@ -99,11 +98,9 @@ pub(super) struct SemanticIndexBuilder<'db, 'ast> {
scopes: IndexVec<FileScopeId, Scope>,
scope_ids_by_scope: IndexVec<FileScopeId, ScopeId<'db>>,
place_tables: IndexVec<FileScopeId, PlaceTableBuilder>,
ast_ids: IndexVec<FileScopeId, AstIdsBuilder>,
use_def_maps: IndexVec<FileScopeId, UseDefMapBuilder<'db>>,
scopes_by_node: FxHashMap<NodeWithScopeKey, FileScopeId>,
scopes_by_expression: FxHashMap<ExpressionNodeKey, FileScopeId>,
globals_by_scope: FxHashMap<FileScopeId, FxHashSet<ScopedPlaceId>>,
definitions_by_node: FxHashMap<DefinitionNodeKey, Definitions<'db>>,
expressions_by_node: FxHashMap<ExpressionNodeKey, Expression<'db>>,
imported_modules: FxHashSet<ModuleName>,
@@ -133,7 +130,6 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
scopes: IndexVec::new(),
place_tables: IndexVec::new(),
ast_ids: IndexVec::new(),
scope_ids_by_scope: IndexVec::new(),
use_def_maps: IndexVec::new(),
@@ -141,7 +137,6 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
scopes_by_node: FxHashMap::default(),
definitions_by_node: FxHashMap::default(),
expressions_by_node: FxHashMap::default(),
globals_by_scope: FxHashMap::default(),
imported_modules: FxHashSet::default(),
generator_functions: FxHashSet::default(),
@@ -257,7 +252,6 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
self.place_tables.push(PlaceTableBuilder::default());
self.use_def_maps
.push(UseDefMapBuilder::new(is_class_scope));
let ast_id_scope = self.ast_ids.push(AstIdsBuilder::default());
let scope_id = ScopeId::new(self.db, self.file, file_scope_id);
@@ -265,8 +259,6 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
let previous = self.scopes_by_node.insert(node.node_key(), file_scope_id);
debug_assert_eq!(previous, None);
debug_assert_eq!(ast_id_scope, file_scope_id);
self.scope_stack.push(ScopeInfo {
file_scope_id,
current_loop: None,
@@ -349,7 +341,12 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
popped_scope_id
}
fn current_place_table(&mut self) -> &mut PlaceTableBuilder {
fn current_place_table(&self) -> &PlaceTableBuilder {
let scope_id = self.current_scope();
&self.place_tables[scope_id]
}
fn current_place_table_mut(&mut self) -> &mut PlaceTableBuilder {
let scope_id = self.current_scope();
&mut self.place_tables[scope_id]
}
@@ -369,11 +366,6 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
&mut self.use_def_maps[scope_id].reachability_constraints
}
fn current_ast_ids(&mut self) -> &mut AstIdsBuilder {
let scope_id = self.current_scope();
&mut self.ast_ids[scope_id]
}
fn flow_snapshot(&self) -> FlowSnapshot {
self.current_use_def_map().snapshot()
}
@@ -389,7 +381,7 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
/// Add a symbol to the place table and the use-def map.
/// Return the [`ScopedPlaceId`] that uniquely identifies the symbol in both.
fn add_symbol(&mut self, name: Name) -> ScopedPlaceId {
let (place_id, added) = self.current_place_table().add_symbol(name);
let (place_id, added) = self.current_place_table_mut().add_symbol(name);
if added {
self.current_use_def_map_mut().add_place(place_id);
}
@@ -399,7 +391,7 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
/// Add a place to the place table and the use-def map.
/// Return the [`ScopedPlaceId`] that uniquely identifies the place in both.
fn add_place(&mut self, place_expr: PlaceExprWithFlags) -> ScopedPlaceId {
let (place_id, added) = self.current_place_table().add_place(place_expr);
let (place_id, added) = self.current_place_table_mut().add_place(place_expr);
if added {
self.current_use_def_map_mut().add_place(place_id);
}
@@ -407,15 +399,15 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
}
fn mark_place_bound(&mut self, id: ScopedPlaceId) {
self.current_place_table().mark_place_bound(id);
self.current_place_table_mut().mark_place_bound(id);
}
fn mark_place_declared(&mut self, id: ScopedPlaceId) {
self.current_place_table().mark_place_declared(id);
self.current_place_table_mut().mark_place_declared(id);
}
fn mark_place_used(&mut self, id: ScopedPlaceId) {
self.current_place_table().mark_place_used(id);
self.current_place_table_mut().mark_place_used(id);
}
fn add_entry_for_definition_key(&mut self, key: DefinitionNodeKey) -> &mut Definitions<'db> {
@@ -1025,16 +1017,9 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
.map(|builder| ArcUseDefMap::new(builder.finish()))
.collect();
let mut ast_ids: IndexVec<_, _> = self
.ast_ids
.into_iter()
.map(super::ast_ids::AstIdsBuilder::finish)
.collect();
self.scopes.shrink_to_fit();
place_tables.shrink_to_fit();
use_def_maps.shrink_to_fit();
ast_ids.shrink_to_fit();
self.scopes_by_expression.shrink_to_fit();
self.definitions_by_node.shrink_to_fit();
@@ -1042,7 +1027,6 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
self.scopes_by_node.shrink_to_fit();
self.generator_functions.shrink_to_fit();
self.eager_snapshots.shrink_to_fit();
self.globals_by_scope.shrink_to_fit();
SemanticIndex {
place_tables,
@@ -1050,8 +1034,6 @@ impl<'db, 'ast> SemanticIndexBuilder<'db, 'ast> {
definitions_by_node: self.definitions_by_node,
expressions_by_node: self.expressions_by_node,
scope_ids_by_scope: self.scope_ids_by_scope,
globals_by_scope: self.globals_by_scope,
ast_ids,
scopes_by_expression: self.scopes_by_expression,
scopes_by_node: self.scopes_by_node,
use_def_maps,
@@ -1143,9 +1125,11 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
// done on the `Identifier` node as opposed to `ExprName` because that's what the
// AST uses.
self.mark_place_used(symbol);
let use_id = self.current_ast_ids().record_use(name);
self.current_use_def_map_mut()
.record_use(symbol, use_id, NodeKey::from_node(name));
self.current_use_def_map_mut().record_use(
symbol,
name.use_id(),
NodeKey::from_node(name),
);
self.add_definition(symbol, function_def);
}
@@ -1418,6 +1402,29 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
self.visit_expr(value);
}
if let ast::Expr::Name(name) = &*node.target {
let symbol_id = self.add_symbol(name.id.clone());
let symbol = self.current_place_table().place_expr(symbol_id);
// Check whether the variable has been declared global.
if symbol.is_marked_global() {
self.report_semantic_error(SemanticSyntaxError {
kind: SemanticSyntaxErrorKind::AnnotatedGlobal(name.id.as_str().into()),
range: name.range,
python_version: self.python_version,
});
}
// Check whether the variable has been declared nonlocal.
if symbol.is_marked_nonlocal() {
self.report_semantic_error(SemanticSyntaxError {
kind: SemanticSyntaxErrorKind::AnnotatedNonlocal(
name.id.as_str().into(),
),
range: name.range,
python_version: self.python_version,
});
}
}
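For reference, the two checks added here mirror CPython, which rejects annotating a name that has been declared `global` or `nonlocal` in the same scope. A minimal reproduction (the `<demo>` file name is just a placeholder):

```py
# CPython rejects this at compile time, matching the new
# `AnnotatedGlobal` semantic-syntax error.
src = """
def f():
    global x
    x: int = 1
"""
try:
    compile(src, "<demo>", "exec")
except SyntaxError as err:
    print(err.msg)  # annotated name 'x' can't be global
```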
// See https://docs.python.org/3/library/ast.html#ast.AnnAssign
if matches!(
*node.target,
@@ -1858,8 +1865,8 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
}) => {
for name in names {
let symbol_id = self.add_symbol(name.id.clone());
let symbol_table = self.current_place_table();
let symbol = symbol_table.place_expr(symbol_id);
let symbol = self.current_place_table().place_expr(symbol_id);
// Check whether the variable has already been accessed in this scope.
if symbol.is_bound() || symbol.is_declared() || symbol.is_used() {
self.report_semantic_error(SemanticSyntaxError {
kind: SemanticSyntaxErrorKind::LoadBeforeGlobalDeclaration {
@@ -1870,11 +1877,56 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
python_version: self.python_version,
});
}
let scope_id = self.current_scope();
self.globals_by_scope
.entry(scope_id)
.or_default()
.insert(symbol_id);
// Check whether the variable has also been declared nonlocal.
if symbol.is_marked_nonlocal() {
self.report_semantic_error(SemanticSyntaxError {
kind: SemanticSyntaxErrorKind::NonlocalAndGlobal(name.to_string()),
range: name.range,
python_version: self.python_version,
});
}
self.current_place_table_mut().mark_place_global(symbol_id);
}
walk_stmt(self, stmt);
}
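The `LoadBeforeGlobalDeclaration` check matches CPython, which rejects any use of a name before its `global` statement in the same scope. A minimal sketch:

```py
src = """
def f():
    print(x)
    global x
"""
try:
    compile(src, "<demo>", "exec")
except SyntaxError as err:
    print(err.msg)  # name 'x' is used prior to global declaration
```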
ast::Stmt::Nonlocal(ast::StmtNonlocal {
range: _,
node_index: _,
names,
}) => {
for name in names {
let symbol_id = self.add_symbol(name.id.clone());
let symbol = self.current_place_table().place_expr(symbol_id);
// Check whether the variable has already been accessed in this scope.
if symbol.is_bound() || symbol.is_declared() || symbol.is_used() {
self.report_semantic_error(SemanticSyntaxError {
kind: SemanticSyntaxErrorKind::LoadBeforeNonlocalDeclaration {
name: name.to_string(),
start: name.range.start(),
},
range: name.range,
python_version: self.python_version,
});
}
// Check whether the variable has also been declared global.
if symbol.is_marked_global() {
self.report_semantic_error(SemanticSyntaxError {
kind: SemanticSyntaxErrorKind::NonlocalAndGlobal(name.to_string()),
range: name.range,
python_version: self.python_version,
});
}
// The variable is required to exist in an enclosing scope, but that definition
// might come later. For example, this is legal, but we can't check
// that here, because we haven't gotten to `x = 1`:
// ```py
// def f():
// def g():
// nonlocal x
// x = 1
// ```
self.current_place_table_mut()
.mark_place_nonlocal(symbol_id);
}
walk_stmt(self, stmt);
}
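These `nonlocal` checks mirror the `global` ones above, with `NonlocalAndGlobal` covering the conflicting declaration. As the comment notes, a `nonlocal` may legally target a binding that only appears later in the enclosing scope, so that case can't be validated here. Both behaviors, assuming CPython semantics:

```py
# Conflict: a name can't be both nonlocal and global in one scope.
bad = """
def f():
    x = 1
    def g():
        nonlocal x
        global x
"""
try:
    compile(bad, "<demo>", "exec")
except SyntaxError as err:
    print(err.msg)  # name 'x' is nonlocal and global

# Legal: the enclosing binding `x = 1` appears after the nested function.
def f():
    def g():
        nonlocal x
        x = 2
    x = 1
    g()
    return x

print(f())  # 2
```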
@@ -1888,7 +1940,7 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
for target in targets {
if let Ok(target) = PlaceExpr::try_from(target) {
let place_id = self.add_place(PlaceExprWithFlags::new(target));
self.current_place_table().mark_place_used(place_id);
self.current_place_table_mut().mark_place_used(place_id);
self.delete_binding(place_id);
}
}
@@ -1984,8 +2036,14 @@ impl<'ast> Visitor<'ast> for SemanticIndexBuilder<'_, 'ast> {
let place_id = self.add_place(place_expr);
if is_use {
let use_id = match expr {
ast::Expr::Name(name) => name.use_id(),
ast::Expr::Attribute(attribute) => attribute.use_id(),
ast::Expr::Subscript(subscript) => subscript.use_id(),
_ => unreachable!(),
};
self.mark_place_used(place_id);
let use_id = self.current_ast_ids().record_use(expr);
self.current_use_def_map_mut()
.record_use(place_id, use_id, node_key);
}


@@ -29,9 +29,9 @@
//! [`Predicate`]: crate::semantic_index::predicate::Predicate
use crate::list::{List, ListBuilder, ListSetReverseIterator, ListStorage};
use crate::semantic_index::ast_ids::ScopedUseId;
use crate::semantic_index::place::FileScopeId;
use crate::semantic_index::predicate::ScopedPredicateId;
use crate::semantic_index::use_def::FileUseId;
/// A narrowing constraint associated with a live binding.
///
@@ -44,7 +44,7 @@ pub(crate) type ScopedNarrowingConstraint = List<ScopedNarrowingConstraintPredic
pub(crate) enum ConstraintKey {
NarrowingConstraint(ScopedNarrowingConstraint),
EagerNestedScope(FileScopeId),
UseId(ScopedUseId),
UseId(FileUseId),
}
/// One of the [`Predicate`]s in a narrowing constraint, which constrains the type of the


@@ -330,6 +330,16 @@ impl PlaceExprWithFlags {
self.flags.contains(PlaceFlags::IS_DECLARED)
}
/// Is the place marked `global` in its containing scope?
pub fn is_marked_global(&self) -> bool {
self.flags.contains(PlaceFlags::MARKED_GLOBAL)
}
/// Is the place marked `nonlocal` in its containing scope?
pub fn is_marked_nonlocal(&self) -> bool {
self.flags.contains(PlaceFlags::MARKED_NONLOCAL)
}
pub(crate) fn as_name(&self) -> Option<&Name> {
self.expr.as_name()
}
@@ -397,9 +407,7 @@ bitflags! {
const IS_USED = 1 << 0;
const IS_BOUND = 1 << 1;
const IS_DECLARED = 1 << 2;
/// TODO: This flag is not yet set by anything
const MARKED_GLOBAL = 1 << 3;
/// TODO: This flag is not yet set by anything
const MARKED_NONLOCAL = 1 << 4;
const IS_INSTANCE_ATTRIBUTE = 1 << 5;
}
@@ -663,7 +671,7 @@ impl PlaceTable {
}
/// Returns the place named `name`.
#[allow(unused)] // used in tests
#[cfg(test)]
pub(crate) fn place_by_name(&self, name: &str) -> Option<&PlaceExprWithFlags> {
let id = self.place_id_by_name(name)?;
Some(self.place_expr(id))
@@ -814,6 +822,14 @@ impl PlaceTableBuilder {
self.table.places[id].insert_flags(PlaceFlags::IS_USED);
}
pub(super) fn mark_place_global(&mut self, id: ScopedPlaceId) {
self.table.places[id].insert_flags(PlaceFlags::MARKED_GLOBAL);
}
pub(super) fn mark_place_nonlocal(&mut self, id: ScopedPlaceId) {
self.table.places[id].insert_flags(PlaceFlags::MARKED_NONLOCAL);
}
pub(super) fn places(&self) -> impl Iterator<Item = &PlaceExprWithFlags> {
self.table.places()
}


@@ -241,6 +241,7 @@
//! visits a `StmtIf` node.
use ruff_index::{IndexVec, newtype_index};
use ruff_python_ast as ast;
use rustc_hash::FxHashMap;
use self::place_state::{
@@ -249,7 +250,6 @@ use self::place_state::{
};
use crate::node_key::NodeKey;
use crate::place::BoundnessAnalysis;
use crate::semantic_index::ast_ids::ScopedUseId;
use crate::semantic_index::definition::{Definition, DefinitionState};
use crate::semantic_index::narrowing_constraints::{
ConstraintKey, NarrowingConstraints, NarrowingConstraintsBuilder, NarrowingConstraintsIterator,
@@ -285,8 +285,8 @@ pub(crate) struct UseDefMap<'db> {
/// Array of reachability constraints in this scope.
reachability_constraints: ReachabilityConstraints,
/// [`Bindings`] reaching a [`ScopedUseId`].
bindings_by_use: IndexVec<ScopedUseId, Bindings>,
/// [`Bindings`] reaching a [`FileUseId`].
bindings_by_use: FxHashMap<FileUseId, Bindings>,
/// Tracks whether or not a given AST node is reachable from the start of the scope.
node_reachability: FxHashMap<NodeKey, ScopedReachabilityConstraintId>,
@@ -347,12 +347,13 @@ pub(crate) enum ApplicableConstraints<'map, 'db> {
}
impl<'db> UseDefMap<'db> {
pub(crate) fn bindings_at_use(
#[track_caller]
pub(crate) fn bindings_for_node(
&self,
use_id: ScopedUseId,
use_id: FileUseId,
) -> BindingWithConstraintsIterator<'_, 'db> {
self.bindings_iterator(
&self.bindings_by_use[use_id],
&self.bindings_by_use[&use_id],
BoundnessAnalysis::BasedOnUnboundVisibility,
)
}
@@ -382,7 +383,7 @@ impl<'db> UseDefMap<'db> {
ApplicableConstraints::ConstrainedBindings(bindings)
}
ConstraintKey::UseId(use_id) => {
ApplicableConstraints::ConstrainedBindings(self.bindings_at_use(use_id))
ApplicableConstraints::ConstrainedBindings(self.bindings_for_node(use_id))
}
}
}
@@ -735,7 +736,7 @@ pub(super) struct UseDefMapBuilder<'db> {
pub(super) reachability_constraints: ReachabilityConstraintsBuilder,
/// Live bindings at each so-far-recorded use.
bindings_by_use: IndexVec<ScopedUseId, Bindings>,
bindings_by_use: FxHashMap<FileUseId, Bindings>,
/// Tracks whether or not the current point in control flow is reachable from the
/// start of the scope.
@@ -771,7 +772,7 @@ impl<'db> UseDefMapBuilder<'db> {
predicates: PredicatesBuilder::default(),
narrowing_constraints: NarrowingConstraintsBuilder::default(),
reachability_constraints: ReachabilityConstraintsBuilder::default(),
bindings_by_use: IndexVec::new(),
bindings_by_use: FxHashMap::default(),
reachability: ScopedReachabilityConstraintId::ALWAYS_TRUE,
node_reachability: FxHashMap::default(),
declarations_by_binding: FxHashMap::default(),
@@ -1003,23 +1004,24 @@ impl<'db> UseDefMapBuilder<'db> {
pub(super) fn record_use(
&mut self,
place: ScopedPlaceId,
use_id: ScopedUseId,
use_id: FileUseId,
node_key: NodeKey,
) {
// We have a use of a place; clone the current bindings for that place, and record them
// as the live bindings for this use.
let new_use = self
let old_use = self
.bindings_by_use
.push(self.place_states[place].bindings().clone());
debug_assert_eq!(use_id, new_use);
.insert(use_id, self.place_states[place].bindings().clone());
debug_assert_eq!(old_use, None);
// Track reachability of all uses of places to silence `unresolved-reference`
// diagnostics in unreachable code.
self.record_node_reachability(node_key);
}
pub(super) fn record_node_reachability(&mut self, node_key: NodeKey) {
self.node_reachability.insert(node_key, self.reachability);
pub(super) fn record_node_reachability(&mut self, node: NodeKey) {
self.node_reachability.insert(node, self.reachability);
}
pub(super) fn snapshot_eager_state(
@@ -1144,3 +1146,42 @@ impl<'db> UseDefMapBuilder<'db> {
}
}
}
/// Uniquely identifies a use of a name within a file.
#[derive(get_size2::GetSize, Debug, Copy, Clone, Eq, PartialEq, Hash)]
pub(crate) struct FileUseId(NodeKey);
pub(crate) trait HasFileUseId {
/// Returns the ID that uniquely identifies this use within its file.
fn use_id(&self) -> FileUseId;
}
impl HasFileUseId for ast::Identifier {
fn use_id(&self) -> FileUseId {
FileUseId(NodeKey::from_node(self))
}
}
impl HasFileUseId for ast::ExprName {
fn use_id(&self) -> FileUseId {
FileUseId(NodeKey::from_node(self))
}
}
impl HasFileUseId for ast::ExprAttribute {
fn use_id(&self) -> FileUseId {
FileUseId(NodeKey::from_node(self))
}
}
impl HasFileUseId for ast::ExprSubscript {
fn use_id(&self) -> FileUseId {
FileUseId(NodeKey::from_node(self))
}
}
impl HasFileUseId for ast::ExprRef<'_> {
fn use_id(&self) -> FileUseId {
FileUseId(NodeKey::from_node(self))
}
}
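Since `FileUseId` wraps a `NodeKey`, a use is identified by its AST node rather than by a sequential per-scope index, which is what made the `AstIdsBuilder` plumbing earlier in this diff removable. A rough Python analogy, keying each use of a name by its node's source position:

```py
import ast

tree = ast.parse("x = 1\nprint(x, x)")

# Each load of `x` is a distinct node; its position acts as a file-wide key,
# much like deriving an ID from the node itself instead of counting uses
# per scope.
uses = {
    (node.lineno, node.col_offset): node.id
    for node in ast.walk(tree)
    if isinstance(node, ast.Name)
    and isinstance(node.ctx, ast.Load)
    and node.id == "x"
}
print(uses)  # {(2, 6): 'x', (2, 9): 'x'}
```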


@@ -1,6 +1,6 @@
use ruff_db::files::{File, FilePath};
use ruff_db::source::line_index;
use ruff_python_ast as ast;
use ruff_python_ast::{self as ast};
use ruff_python_ast::{Expr, ExprRef, name::Name};
use ruff_source_file::LineIndex;
@@ -69,14 +69,29 @@ impl<'db> SemanticModel<'db> {
};
let ty = Type::module_literal(self.db, self.file, &module);
let builtin = module.is_known(KnownModule::Builtins);
crate::types::all_members(self.db, ty)
.into_iter()
.map(|member| Completion {
name: member.name,
ty: member.ty,
let mut completions = vec![];
for crate::types::Member { name, ty } in crate::types::all_members(self.db, ty) {
completions.push(Completion { name, ty, builtin });
}
for submodule_basename in module.all_submodules(self.db) {
let Some(basename) = ModuleName::new(submodule_basename.as_str()) else {
continue;
};
let mut submodule_name = module_name.clone();
submodule_name.extend(&basename);
let Some(submodule) = resolve_module(self.db, &submodule_name) else {
continue;
};
let ty = Type::module_literal(self.db, self.file, &submodule);
completions.push(Completion {
name: submodule_basename,
ty,
builtin,
})
.collect()
});
}
completions
}
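This rewrite also surfaces importable submodules as completions, not just the module's members. A rough standard-library analogy for the extra candidates (output varies by Python version and platform):

```py
import pkgutil
import xml

# Submodules that a completion for `xml.<CURSOR>` should offer in addition
# to the module's ordinary members.
print(sorted(module.name for module in pkgutil.iter_modules(xml.__path__)))
# e.g. ['dom', 'etree', 'parsers', 'sax']
```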
/// Returns completions for symbols available in a `object.<CURSOR>` context.


@@ -47,7 +47,7 @@ use crate::types::generics::{
walk_partial_specialization, walk_specialization,
};
pub use crate::types::ide_support::{
CallSignatureDetails, all_members, call_signature_details, definition_kind_for_name,
CallSignatureDetails, Member, all_members, call_signature_details, definition_kind_for_name,
};
use crate::types::infer::infer_unpack_types;
use crate::types::mro::{Mro, MroError, MroIterator};
@@ -4995,7 +4995,7 @@ impl<'db> Type<'db> {
TypeVarKind::Legacy,
)))
}
SpecialFormType::TypeAlias => Ok(todo_type!("Support for `typing.TypeAlias`")),
SpecialFormType::TypeAlias => Ok(Type::Dynamic(DynamicType::TodoTypeAlias)),
SpecialFormType::TypedDict => Ok(todo_type!("Support for `typing.TypedDict`")),
SpecialFormType::Literal
@@ -5880,6 +5880,9 @@ pub enum DynamicType {
/// A special Todo-variant for PEP-695 `ParamSpec` types. A temporary variant to detect and special-
/// case the handling of these types in `Callable` annotations.
TodoPEP695ParamSpec,
/// A special Todo-variant for type aliases declared using `typing.TypeAlias`.
/// A temporary variant to detect and special-case the handling of these aliases in autocomplete suggestions.
TodoTypeAlias,
}
impl DynamicType {
@@ -5904,6 +5907,13 @@ impl std::fmt::Display for DynamicType {
f.write_str("@Todo")
}
}
DynamicType::TodoTypeAlias => {
if cfg!(debug_assertions) {
f.write_str("@Todo(Support for `typing.TypeAlias`)")
} else {
f.write_str("@Todo")
}
}
}
}
}
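For context, `TodoTypeAlias` splits aliases declared via `typing.TypeAlias` out of the generic `Todo` bucket so they can be special-cased (they're filtered from completion candidates later in this diff). The kind of declaration it covers:

```py
from typing import TypeAlias

# ty currently infers a dynamic "Todo" type for this alias; the dedicated
# variant lets it be told apart from other todo types.
IntList: TypeAlias = list[int]
```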


@@ -48,7 +48,11 @@ impl<'db> ClassBase<'db> {
ClassBase::Class(class) => class.name(db),
ClassBase::Dynamic(DynamicType::Any) => "Any",
ClassBase::Dynamic(DynamicType::Unknown) => "Unknown",
ClassBase::Dynamic(DynamicType::Todo(_) | DynamicType::TodoPEP695ParamSpec) => "@Todo",
ClassBase::Dynamic(
DynamicType::Todo(_)
| DynamicType::TodoPEP695ParamSpec
| DynamicType::TodoTypeAlias,
) => "@Todo",
ClassBase::Protocol => "Protocol",
ClassBase::Generic => "Generic",
}


@@ -60,10 +60,9 @@ use ruff_text_size::Ranged;
use crate::module_resolver::{KnownModule, file_to_module};
use crate::place::{Boundness, Place, place_from_bindings};
use crate::semantic_index::ast_ids::HasScopedUseId;
use crate::semantic_index::definition::Definition;
use crate::semantic_index::place::ScopeId;
use crate::semantic_index::semantic_index;
use crate::semantic_index::{HasFileUseId, semantic_index};
use crate::types::context::InferContext;
use crate::types::diagnostic::{
REDUNDANT_CAST, STATIC_ASSERT_ERROR, TYPE_ASSERTION_FAILURE,
@@ -293,10 +292,10 @@ impl<'db> OverloadLiteral<'db> {
.node(db)
.expect_function(&module)
.name
.scoped_use_id(db, scope);
.use_id();
let Place::Type(Type::FunctionLiteral(previous_type), Boundness::Bound) =
place_from_bindings(db, use_def.bindings_at_use(use_id))
place_from_bindings(db, use_def.bindings_for_node(use_id))
else {
return None;
};


@@ -9,7 +9,7 @@ use crate::semantic_index::{
};
use crate::types::call::CallArguments;
use crate::types::signatures::Signature;
use crate::types::{ClassBase, ClassLiteral, KnownClass, KnownInstanceType, Type};
use crate::types::{ClassBase, ClassLiteral, DynamicType, KnownClass, KnownInstanceType, Type};
use crate::{Db, HasType, NameKind, SemanticModel};
use ruff_db::files::File;
use ruff_python_ast as ast;
@@ -181,6 +181,7 @@ impl<'db> AllMembers<'db> {
KnownClass::TypeVar
| KnownClass::TypeVarTuple
| KnownClass::ParamSpec
| KnownClass::UnionType
)
) =>
{
@@ -190,6 +191,7 @@ impl<'db> AllMembers<'db> {
Type::KnownInstance(
KnownInstanceType::TypeVar(_) | KnownInstanceType::TypeAliasType(_),
) => continue,
Type::Dynamic(DynamicType::TodoTypeAlias) => continue,
_ => {}
}
}
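Both additions widen the set of type-level values that `all_members` skips when collecting completion candidates: instances of `types.UnionType` and the new `TodoTypeAlias`. What those values look like at runtime (Python 3.10+):

```py
from typing import TypeAlias

IntOrStr = int | str          # an instance of types.UnionType
Alias: TypeAlias = list[int]  # inferred by ty as the TypeAlias todo type

print(type(IntOrStr))         # <class 'types.UnionType'>
```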


@@ -35,6 +35,7 @@
//! be considered a bug.)
use itertools::{Either, Itertools};
use ruff_db::diagnostic::{Annotation, DiagnosticId, Severity};
use ruff_db::files::File;
use ruff_db::parsed::{ParsedModuleRef, parsed_module};
use ruff_python_ast::visitor::{Visitor, walk_expr};
@@ -62,15 +63,14 @@ use super::subclass_of::SubclassOfInner;
use super::{ClassBase, NominalInstanceType, add_inferred_python_version_hint_to_diagnostic};
use crate::module_name::{ModuleName, ModuleNameResolutionError};
use crate::module_resolver::resolve_module;
use crate::node_key::NodeKey;
use crate::node_key::{ExpressionNodeKey, NodeKey};
use crate::place::{
Boundness, ConsideredDefinitions, LookupError, Place, PlaceAndQualifiers,
builtins_module_scope, builtins_symbol, explicit_global_symbol, global_symbol,
module_type_implicit_global_declaration, module_type_implicit_global_symbol, place,
place_from_bindings, place_from_declarations, typing_extensions_symbol,
};
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::semantic_index::ast_ids::{HasScopedUseId, ScopedUseId};
use crate::semantic_index::definition::{
AnnotatedAssignmentDefinitionKind, AssignmentDefinitionKind, ComprehensionDefinitionKind,
Definition, DefinitionKind, DefinitionNodeKey, DefinitionState, ExceptHandlerDefinitionKind,
@@ -82,7 +82,8 @@ use crate::semantic_index::place::{
FileScopeId, NodeWithScopeKind, NodeWithScopeRef, PlaceExpr, ScopeId, ScopeKind, ScopedPlaceId,
};
use crate::semantic_index::{
ApplicableConstraints, EagerSnapshotResult, SemanticIndex, place_table, semantic_index,
ApplicableConstraints, EagerSnapshotResult, FileUseId, HasFileUseId, SemanticIndex,
place_table, semantic_index,
};
use crate::types::call::{Binding, Bindings, CallArgumentTypes, CallArguments, CallError};
use crate::types::class::{CodeGeneratorKind, MetaclassErrorKind, SliceLiteral};
@@ -1562,6 +1563,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let mut bound_ty = ty;
let global_use_def_map = self.index.use_def_map(FileScopeId::global());
let nonlocal_use_def_map;
let place_id = binding.place(self.db());
let place = place_table.place_expr(place_id);
let skip_non_global_scopes = self.skip_non_global_scopes(file_scope_id, place_id);
@@ -1572,9 +1574,58 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
.place_id_by_expr(&place.expr)
{
Some(id) => (global_use_def_map.end_of_scope_declarations(id), false),
// This case is a syntax error (load before global declaration) but ignore that here
// This variable shows up in `global` declarations but doesn't have an explicit
// binding in the global scope.
None => (use_def.declarations_at_binding(binding), true),
}
} else if self
.index
.symbol_is_nonlocal_in_scope(place_id, file_scope_id)
{
// If we run out of ancestor scopes without finding a definition, we'll fall back to
// the local scope. This will also be a syntax error in `infer_nonlocal_statement` (no
// binding for `nonlocal` found), but ignore that here.
let mut declarations = use_def.declarations_at_binding(binding);
let mut is_local = true;
// Walk up parent scopes looking for the enclosing scope that has definition of this
// name. `ancestor_scopes` includes the current scope, so skip that one.
for (enclosing_scope_file_id, enclosing_scope) in
self.index.ancestor_scopes(file_scope_id).skip(1)
{
// Ignore class scopes and the global scope.
if !enclosing_scope.kind().is_function_like() {
continue;
}
let enclosing_place_table = self.index.place_table(enclosing_scope_file_id);
let Some(enclosing_place_id) = enclosing_place_table.place_id_by_expr(&place.expr)
else {
// This ancestor scope doesn't have a binding. Keep going.
continue;
};
if self
.index
.symbol_is_nonlocal_in_scope(enclosing_place_id, enclosing_scope_file_id)
{
// The variable is `nonlocal` in this ancestor scope. Keep going.
continue;
}
if self
.index
.symbol_is_global_in_scope(enclosing_place_id, enclosing_scope_file_id)
{
// The variable is `global` in this ancestor scope. This breaks the `nonlocal`
// chain, and it's a syntax error in `infer_nonlocal_statement`. Ignore that
// here and just bail out of this loop.
break;
}
// We found the closest definition. Note that (unlike in `infer_place_load`) this
// does *not* need to be a binding. It could be just `x: int`.
nonlocal_use_def_map = self.index.use_def_map(enclosing_scope_file_id);
declarations = nonlocal_use_def_map.end_of_scope_declarations(enclosing_place_id);
is_local = false;
break;
}
(declarations, is_local)
} else {
(use_def.declarations_at_binding(binding), true)
};
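A concrete case for the ancestor walk above: the `nonlocal` chain passes over intermediate scopes that merely re-declare the name `nonlocal` and resolves to the nearest function-like scope with an actual declaration, which need not be a binding. Assuming CPython semantics:

```py
def outer():
    x: int = 0          # the declaration the innermost use resolves to
    def mid():
        nonlocal x      # joins the chain without defining `x` itself
        def inner():
            nonlocal x  # walks past `mid` (also nonlocal) up to `outer`
            x = 1
        inner()
    mid()
    return x

print(outer())  # 1
```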
@@ -2204,12 +2255,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
ast::Stmt::Raise(raise) => self.infer_raise_statement(raise),
ast::Stmt::Return(ret) => self.infer_return_statement(ret),
ast::Stmt::Delete(delete) => self.infer_delete_statement(delete),
ast::Stmt::Nonlocal(nonlocal) => self.infer_nonlocal_statement(nonlocal),
ast::Stmt::Break(_)
| ast::Stmt::Continue(_)
| ast::Stmt::Pass(_)
| ast::Stmt::IpyEscapeCommand(_)
| ast::Stmt::Global(_)
| ast::Stmt::Nonlocal(_) => {
| ast::Stmt::Global(_) => {
// No-op
}
}
@@ -4609,6 +4660,69 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
}
}
fn infer_nonlocal_statement(&mut self, nonlocal: &ast::StmtNonlocal) {
let ast::StmtNonlocal {
node_index: _,
range,
names,
} = nonlocal;
let db = self.db();
let scope = self.scope();
let file_scope_id = scope.file_scope_id(db);
let current_file = self.file();
'names: for name in names {
// Walk up parent scopes looking for a possible enclosing scope that may have a
// definition of this name visible to us. Note that we skip the scope containing the
// use that we are resolving, since we already looked for the place there up above.
for (enclosing_scope_file_id, _) in self.index.ancestor_scopes(file_scope_id).skip(1) {
// Class scopes are not visible to nested scopes, and `nonlocal` cannot refer to
// globals, so check only function-like scopes.
let enclosing_scope_id = enclosing_scope_file_id.to_scope_id(db, current_file);
if !enclosing_scope_id.is_function_like(db) {
continue;
}
let enclosing_place_table = self.index.place_table(enclosing_scope_file_id);
let Some(enclosing_place_id) = enclosing_place_table.place_id_by_name(name) else {
// This scope doesn't define this name. Keep going.
continue;
};
// We've found a definition for this name in an enclosing function-like scope.
// Either this definition is the valid place this name refers to, or else we'll
// emit a syntax error. Either way, we won't walk any more enclosing scopes. Note
// that there are differences here compared to `infer_place_load`: A regular load
// (e.g. `print(x)`) is allowed to refer to a global variable (e.g. `x = 1` in the
// global scope), and similarly it's allowed to refer to a local variable in an
// enclosing function that's declared `global` (e.g. `global x`). However, the
// `nonlocal` keyword can't refer to global variables (that's a `SyntaxError`), and
// it also can't refer to local variables in enclosing functions that are declared
// `global` (also a `SyntaxError`).
if self
.index
.symbol_is_global_in_scope(enclosing_place_id, enclosing_scope_file_id)
{
// A "chain" of `nonlocal` statements is "broken" by a `global` statement. Stop
// looping and report that this `nonlocal` statement is invalid.
break;
}
// We found a definition. We've checked that the name isn't `global` in this scope,
// but it's ok if it's `nonlocal`. If a "chain" of `nonlocal` statements fails to
// lead to a valid binding, the outermost one will be an error; we don't need to
// walk the whole chain for each one.
continue 'names;
}
// There's no matching binding in an enclosing scope. This `nonlocal` statement is
// invalid.
if let Some(builder) = self
.context
.report_diagnostic(DiagnosticId::InvalidSyntax, Severity::Error)
{
builder
.into_diagnostic(format_args!("no binding for nonlocal `{name}` found"))
.annotate(Annotation::primary(self.context.span(*range)));
}
}
}
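The diagnostic emitted at the end of this function matches CPython's compile-time error for a `nonlocal` with no matching enclosing binding:

```py
src = """
def f():
    def g():
        nonlocal x
        x = 1
"""
try:
    compile(src, "<demo>", "exec")
except SyntaxError as err:
    print(err.msg)  # no binding for nonlocal 'x' found
```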
fn module_type_from_name(&self, module_name: &ModuleName) -> Option<Type<'db>> {
resolve_module(self.db(), module_name)
.map(|module| Type::module_literal(self.db(), self.file(), &module))
@@ -5709,7 +5823,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
&self,
expr: &PlaceExpr,
expr_ref: ast::ExprRef,
) -> (Place<'db>, Option<ScopedUseId>) {
) -> (Place<'db>, Option<FileUseId>) {
let db = self.db();
let scope = self.scope();
let file_scope_id = scope.file_scope_id(db);
@@ -5736,8 +5850,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
return (Place::Unbound, None);
}
let use_id = expr_ref.scoped_use_id(db, scope);
let place = place_from_bindings(db, use_def.bindings_at_use(use_id));
let use_id = expr_ref.use_id();
let place = place_from_bindings(db, use_def.bindings_for_node(use_id));
(place, Some(use_id))
}
}
@@ -5775,13 +5889,15 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let current_file = self.file();
let mut is_nonlocal_binding = false;
if let Some(name) = expr.as_name() {
let skip_non_global_scopes = place_table
.place_id_by_name(name)
.is_some_and(|symbol_id| self.skip_non_global_scopes(file_scope_id, symbol_id));
if skip_non_global_scopes {
return global_symbol(self.db(), self.file(), name);
if let Some(symbol_id) = place_table.place_id_by_name(name) {
if self.skip_non_global_scopes(file_scope_id, symbol_id) {
return global_symbol(self.db(), self.file(), name);
}
is_nonlocal_binding = self
.index
.symbol_is_nonlocal_in_scope(symbol_id, file_scope_id);
}
}
@@ -5794,7 +5910,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
// a local variable or not in function-like scopes. If a variable has any bindings in a
// function-like scope, it is considered a local variable; it never references another
// scope. (At runtime, it would use the `LOAD_FAST` opcode.)
if has_bindings_in_this_scope && scope.is_function_like(db) {
if has_bindings_in_this_scope && scope.is_function_like(db) && !is_nonlocal_binding {
return Place::Unbound.into();
}
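The new `!is_nonlocal_binding` carve-out matches runtime behavior: a name with bindings in a function-like scope is normally local (`LOAD_FAST`) and never falls through to an enclosing scope, but a `nonlocal` declaration reroutes its loads and stores to the enclosing cell (`LOAD_DEREF`). One way to observe the difference:

```py
import dis

def f():
    x = 1
    def local_x():
        x = 2       # binding makes `x` local: the load compiles to LOAD_FAST
        return x
    def nonlocal_x():
        nonlocal x  # loads/stores now target f's cell: LOAD_DEREF
        return x
    return local_x, nonlocal_x

local_x, nonlocal_x = f()
dis.dis(local_x)     # ... LOAD_FAST ...
dis.dis(nonlocal_x)  # ... LOAD_DEREF ...
```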
@@ -6390,13 +6506,21 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
(unknown @ Type::Dynamic(DynamicType::Unknown), _, _)
| (_, unknown @ Type::Dynamic(DynamicType::Unknown), _) => Some(unknown),
(
todo @ Type::Dynamic(DynamicType::Todo(_) | DynamicType::TodoPEP695ParamSpec),
todo @ Type::Dynamic(
DynamicType::Todo(_)
| DynamicType::TodoPEP695ParamSpec
| DynamicType::TodoTypeAlias,
),
_,
_,
)
| (
_,
todo @ Type::Dynamic(DynamicType::Todo(_) | DynamicType::TodoPEP695ParamSpec),
todo @ Type::Dynamic(
DynamicType::Todo(_)
| DynamicType::TodoPEP695ParamSpec
| DynamicType::TodoTypeAlias,
),
_,
) => Some(todo),
(Type::Never, _, _) | (_, Type::Never, _) => Some(Type::Never),


@@ -250,6 +250,9 @@ fn dynamic_elements_ordering(left: DynamicType, right: DynamicType) -> Ordering
(DynamicType::TodoPEP695ParamSpec, _) => Ordering::Less,
(_, DynamicType::TodoPEP695ParamSpec) => Ordering::Greater,
(DynamicType::TodoTypeAlias, _) => Ordering::Less,
(_, DynamicType::TodoTypeAlias) => Ordering::Greater,
}
}


@@ -6,7 +6,7 @@ use rustc_hash::FxHashMap;
use ruff_python_ast::{self as ast, AnyNodeRef};
use crate::Db;
use crate::semantic_index::ast_ids::node_key::ExpressionNodeKey;
use crate::node_key::ExpressionNodeKey;
use crate::semantic_index::place::ScopeId;
use crate::types::tuple::{ResizeTupleError, Tuple, TupleLength, TupleUnpacker};
use crate::types::{Type, TypeCheckDiagnostics, infer_expression_types};


@@ -31,6 +31,7 @@ salsa = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
shellexpand = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true, features = ["chrono"] }


@@ -1,7 +1,7 @@
use crate::server::{ConnectionInitializer, Server};
use anyhow::Context;
pub use document::{NotebookDocument, PositionEncoding, TextDocument};
pub use session::{DocumentQuery, DocumentSnapshot, Session};
pub(crate) use session::{DocumentQuery, Session};
use std::num::NonZeroUsize;
mod document;


@@ -218,8 +218,22 @@ where
let url = R::document_url(&params).into_owned();
let Ok(path) = AnySystemPath::try_from_url(&url) else {
tracing::warn!("Ignoring request for invalid `{url}`");
return Box::new(|_| {});
let reason = format!("URL `{url}` isn't a valid system path");
tracing::warn!(
"Ignoring request id={id} method={} because {reason}",
R::METHOD
);
return Box::new(|client| {
respond_silent_error(
id,
client,
lsp_server::ResponseError {
code: lsp_server::ErrorCode::InvalidParams as i32,
message: reason,
data: None,
},
);
});
};
let db = match &path {
@@ -230,10 +244,7 @@ where
AnySystemPath::SystemVirtual(_) => session.default_project_db().clone(),
};
let Some(snapshot) = session.take_document_snapshot(url) else {
tracing::warn!("Ignoring request because snapshot for path `{path:?}` doesn't exist");
return Box::new(|_| {});
};
let snapshot = session.take_document_snapshot(url);
Box::new(move |client| {
let _span = tracing::debug_span!("request", %id, method = R::METHOD).entered();
@@ -331,12 +342,7 @@ where
let (id, params) = cast_notification::<N>(req)?;
Ok(Task::background(schedule, move |session: &Session| {
let url = N::document_url(&params);
let Some(snapshot) = session.take_document_snapshot((*url).clone()) else {
tracing::debug!(
"Ignoring notification because snapshot for url `{url}` doesn't exist."
);
return Box::new(|_| {});
};
let snapshot = session.take_document_snapshot((*url).clone());
Box::new(move |client| {
let _span = tracing::debug_span!("notification", method = N::METHOD).entered();


@@ -10,11 +10,10 @@ use ruff_db::files::FileRange;
use ruff_db::source::{line_index, source_text};
use ty_project::{Db, ProjectDatabase};
use super::LSPResult;
use crate::document::{DocumentKey, FileRangeExt, ToRangeExt};
use crate::server::Result;
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use crate::{DocumentSnapshot, PositionEncoding, Session};
use crate::{PositionEncoding, Session};
/// Represents the diagnostics for a text document or a notebook document.
pub(super) enum Diagnostics {
@@ -64,30 +63,29 @@ pub(super) fn clear_diagnostics(key: &DocumentKey, client: &Client) {
/// This function is a no-op if the client supports pull diagnostics.
///
/// [publish diagnostics notification]: https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_publishDiagnostics
pub(super) fn publish_diagnostics(
session: &Session,
key: &DocumentKey,
client: &Client,
) -> Result<()> {
pub(super) fn publish_diagnostics(session: &Session, key: &DocumentKey, client: &Client) {
if session.client_capabilities().pull_diagnostics {
return Ok(());
return;
}
let Some(url) = key.to_url() else {
return Ok(());
return;
};
let path = key.path();
let snapshot = session.take_document_snapshot(url.clone());
let snapshot = session
.take_document_snapshot(url.clone())
.ok_or_else(|| anyhow::anyhow!("Unable to take snapshot for document with URL {url}"))
.with_failure_code(lsp_server::ErrorCode::InternalError)?;
let document = match snapshot.document() {
Ok(document) => document,
Err(err) => {
tracing::debug!("Failed to resolve document for URL `{}`: {}", url, err);
return;
}
};
let db = session.project_db_or_default(path);
let db = session.project_db_or_default(key.path());
let Some(diagnostics) = compute_diagnostics(db, &snapshot) else {
return Ok(());
return;
};
// Sends a notification to the client with the diagnostics for the document.
@@ -95,7 +93,7 @@ pub(super) fn publish_diagnostics(
client.send_notification::<PublishDiagnostics>(PublishDiagnosticsParams {
uri,
diagnostics,
version: Some(snapshot.query().version()),
version: Some(document.version()),
});
};
@@ -109,25 +107,28 @@ pub(super) fn publish_diagnostics(
}
}
}
Ok(())
}
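For orientation, the notification this (now infallible) function sends has the shape sketched below; per LSP 3.17 the optional `version` field must be the version of the document the diagnostics were computed for, which is why it's now read from the resolved document rather than the old snapshot query. Field names follow the spec; the values are placeholders:

```py
# Hypothetical `textDocument/publishDiagnostics` payload (LSP 3.17 field names).
notification = {
    "method": "textDocument/publishDiagnostics",
    "params": {
        "uri": "file:///project/example.py",
        "diagnostics": [],  # an empty list clears previously published diagnostics
        "version": 3,       # version of the document the diagnostics belong to
    },
}
```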
pub(super) fn compute_diagnostics(
db: &ProjectDatabase,
snapshot: &DocumentSnapshot,
) -> Option<Diagnostics> {
let Some(file) = snapshot.file(db) else {
tracing::info!(
"No file found for snapshot for `{}`",
snapshot.query().file_url()
);
let document = match snapshot.document() {
Ok(document) => document,
Err(err) => {
tracing::info!("Failed to resolve document for snapshot: {}", err);
return None;
}
};
let Some(file) = document.file(db) else {
tracing::info!("No file found for snapshot for `{}`", document.file_path());
return None;
};
let diagnostics = db.check_file(file);
if let Some(notebook) = snapshot.query().as_notebook() {
if let Some(notebook) = document.as_notebook() {
let mut cell_diagnostics: FxHashMap<Url, Vec<Diagnostic>> = FxHashMap::default();
// Populates all relevant URLs with an empty diagnostic list. This ensures that documents


@@ -28,9 +28,12 @@ impl SyncNotificationHandler for DidChangeTextDocumentHandler {
content_changes,
} = params;
let Ok(key) = session.key_from_url(uri.clone()) else {
tracing::debug!("Failed to create document key from URI: {}", uri);
return Ok(());
let key = match session.key_from_url(uri) {
Ok(key) => key,
Err(uri) => {
tracing::debug!("Failed to create document key from URI: {}", uri);
return Ok(());
}
};
session
@@ -54,6 +57,8 @@ impl SyncNotificationHandler for DidChangeTextDocumentHandler {
}
}
publish_diagnostics(session, &key, client)
publish_diagnostics(session, &key, client);
Ok(())
}
}


@@ -109,7 +109,7 @@ impl SyncNotificationHandler for DidChangeWatchedFiles {
);
} else {
for key in session.text_document_keys() {
publish_diagnostics(session, &key, client)?;
publish_diagnostics(session, &key, client);
}
}


@@ -6,8 +6,8 @@ use crate::session::Session;
use crate::session::client::Client;
use crate::system::AnySystemPath;
use lsp_server::ErrorCode;
use lsp_types::DidCloseTextDocumentParams;
use lsp_types::notification::DidCloseTextDocument;
use lsp_types::{DidCloseTextDocumentParams, TextDocumentIdentifier};
use ty_project::watch::ChangeEvent;
pub(crate) struct DidCloseTextDocumentHandler;
@@ -22,13 +22,18 @@ impl SyncNotificationHandler for DidCloseTextDocumentHandler {
client: &Client,
params: DidCloseTextDocumentParams,
) -> Result<()> {
let Ok(key) = session.key_from_url(params.text_document.uri.clone()) else {
tracing::debug!(
"Failed to create document key from URI: {}",
params.text_document.uri
);
return Ok(());
let DidCloseTextDocumentParams {
text_document: TextDocumentIdentifier { uri },
} = params;
let key = match session.key_from_url(uri) {
Ok(key) => key,
Err(uri) => {
tracing::debug!("Failed to create document key from URI: {}", uri);
return Ok(());
}
};
session
.close_document(&key)
.with_failure_code(ErrorCode::InternalError)?;


@@ -1,5 +1,5 @@
use lsp_types::DidCloseNotebookDocumentParams;
use lsp_types::notification::DidCloseNotebookDocument;
use lsp_types::{DidCloseNotebookDocumentParams, NotebookDocumentIdentifier};
use crate::server::Result;
use crate::server::api::LSPResult;
@@ -21,13 +21,19 @@ impl SyncNotificationHandler for DidCloseNotebookHandler {
_client: &Client,
params: DidCloseNotebookDocumentParams,
) -> Result<()> {
let Ok(key) = session.key_from_url(params.notebook_document.uri.clone()) else {
tracing::debug!(
"Failed to create document key from URI: {}",
params.notebook_document.uri
);
return Ok(());
let DidCloseNotebookDocumentParams {
notebook_document: NotebookDocumentIdentifier { uri },
..
} = params;
let key = match session.key_from_url(uri) {
Ok(key) => key,
Err(uri) => {
tracing::debug!("Failed to create document key from URI: {}", uri);
return Ok(());
}
};
session
.close_document(&key)
.with_failure_code(lsp_server::ErrorCode::InternalError)?;


@@ -21,7 +21,9 @@ impl SyncNotificationHandler for DidOpenTextDocumentHandler {
fn run(
session: &mut Session,
client: &Client,
DidOpenTextDocumentParams {
params: DidOpenTextDocumentParams,
) -> Result<()> {
let DidOpenTextDocumentParams {
text_document:
TextDocumentItem {
uri,
@@ -29,11 +31,14 @@ impl SyncNotificationHandler for DidOpenTextDocumentHandler {
version,
language_id,
},
}: DidOpenTextDocumentParams,
) -> Result<()> {
let Ok(key) = session.key_from_url(uri.clone()) else {
tracing::debug!("Failed to create document key from URI: {}", uri);
return Ok(());
} = params;
let key = match session.key_from_url(uri) {
Ok(key) => key,
Err(uri) => {
tracing::debug!("Failed to create document key from URI: {}", uri);
return Ok(());
}
};
let document = TextDocument::new(text, version).with_language_id(&language_id);
@@ -53,6 +58,8 @@ impl SyncNotificationHandler for DidOpenTextDocumentHandler {
}
}
publish_diagnostics(session, &key, client)
publish_diagnostics(session, &key, client);
Ok(())
}
}


@@ -1,4 +1,5 @@
use std::borrow::Cow;
use std::time::Instant;
use lsp_types::request::Completion;
use lsp_types::{CompletionItem, CompletionItemKind, CompletionParams, CompletionResponse, Url};
@@ -7,11 +8,11 @@ use ty_ide::completion;
use ty_project::ProjectDatabase;
use ty_python_semantic::CompletionKind;
use crate::DocumentSnapshot;
use crate::document::PositionExt;
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
pub(crate) struct CompletionRequestHandler;
@@ -31,12 +32,13 @@ impl BackgroundDocumentRequestHandler for CompletionRequestHandler {
_client: &Client,
params: CompletionParams,
) -> crate::server::Result<Option<CompletionResponse>> {
let start = Instant::now();
if snapshot.client_settings().is_language_services_disabled() {
return Ok(None);
}
let Some(file) = snapshot.file(db) else {
tracing::debug!("Failed to resolve file for {:?}", params);
let Some(file) = snapshot.file_ok(db) else {
return Ok(None);
};
@@ -66,7 +68,12 @@ impl BackgroundDocumentRequestHandler for CompletionRequestHandler {
}
})
.collect();
let len = items.len();
let response = CompletionResponse::Array(items);
tracing::debug!(
"Completions request returned {len} suggestions in {elapsed:?}",
elapsed = Instant::now().duration_since(start)
);
Ok(Some(response))
}
}


@@ -6,11 +6,11 @@ use ruff_db::source::{line_index, source_text};
use ty_ide::goto_type_definition;
use ty_project::ProjectDatabase;
use crate::DocumentSnapshot;
use crate::document::{PositionExt, ToLink};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
pub(crate) struct GotoTypeDefinitionRequestHandler;
@@ -34,8 +34,7 @@ impl BackgroundDocumentRequestHandler for GotoTypeDefinitionRequestHandler {
return Ok(None);
}
let Some(file) = snapshot.file(db) else {
tracing::debug!("Failed to resolve file for {:?}", params);
let Some(file) = snapshot.file_ok(db) else {
return Ok(None);
};


@@ -1,10 +1,10 @@
use std::borrow::Cow;
use crate::DocumentSnapshot;
use crate::document::{PositionExt, ToRangeExt};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use lsp_types::request::HoverRequest;
use lsp_types::{HoverContents, HoverParams, MarkupContent, Url};
@@ -34,8 +34,7 @@ impl BackgroundDocumentRequestHandler for HoverRequestHandler {
return Ok(None);
}
let Some(file) = snapshot.file(db) else {
tracing::debug!("Failed to resolve file for {:?}", params);
let Some(file) = snapshot.file_ok(db) else {
return Ok(None);
};


@@ -1,10 +1,10 @@
use std::borrow::Cow;
use crate::DocumentSnapshot;
use crate::document::{RangeExt, TextSizeExt};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use lsp_types::request::InlayHintRequest;
use lsp_types::{InlayHintParams, Url};
@@ -33,8 +33,7 @@ impl BackgroundDocumentRequestHandler for InlayHintRequestHandler {
return Ok(None);
}
let Some(file) = snapshot.file(db) else {
tracing::debug!("Failed to resolve file for {:?}", params);
let Some(file) = snapshot.file_ok(db) else {
return Ok(None);
};


@@ -1,10 +1,10 @@
use std::borrow::Cow;
use crate::DocumentSnapshot;
use crate::server::api::semantic_tokens::generate_semantic_tokens;
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use lsp_types::{SemanticTokens, SemanticTokensParams, SemanticTokensResult, Url};
use ty_project::ProjectDatabase;
@@ -24,14 +24,13 @@ impl BackgroundDocumentRequestHandler for SemanticTokensRequestHandler {
db: &ProjectDatabase,
snapshot: DocumentSnapshot,
_client: &Client,
params: SemanticTokensParams,
_params: SemanticTokensParams,
) -> crate::server::Result<Option<SemanticTokensResult>> {
if snapshot.client_settings().is_language_services_disabled() {
return Ok(None);
}
let Some(file) = snapshot.file(db) else {
tracing::debug!("Failed to resolve file for {:?}", params);
let Some(file) = snapshot.file_ok(db) else {
return Ok(None);
};


@@ -1,11 +1,11 @@
use std::borrow::Cow;
use crate::DocumentSnapshot;
use crate::document::RangeExt;
use crate::server::api::semantic_tokens::generate_semantic_tokens;
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use lsp_types::{SemanticTokens, SemanticTokensRangeParams, SemanticTokensRangeResult, Url};
use ruff_db::source::{line_index, source_text};
@@ -32,8 +32,7 @@ impl BackgroundDocumentRequestHandler for SemanticTokensRangeRequestHandler {
return Ok(None);
}
let Some(file) = snapshot.file(db) else {
tracing::debug!("Failed to resolve file for {:?}", params);
let Some(file) = snapshot.file_ok(db) else {
return Ok(None);
};


@@ -1,10 +1,10 @@
use std::borrow::Cow;
use crate::DocumentSnapshot;
use crate::document::{PositionEncoding, PositionExt};
use crate::server::api::traits::{
BackgroundDocumentRequestHandler, RequestHandler, RetriableRequestHandler,
};
use crate::session::DocumentSnapshot;
use crate::session::client::Client;
use lsp_types::request::SignatureHelpRequest;
use lsp_types::{
@@ -36,8 +36,7 @@ impl BackgroundDocumentRequestHandler for SignatureHelpRequestHandler {
return Ok(None);
}
let Some(file) = snapshot.file(db) else {
tracing::debug!("Failed to resolve file for {:?}", params);
let Some(file) = snapshot.file_ok(db) else {
return Ok(None);
};


@@ -65,7 +65,7 @@ impl BackgroundRequestHandler for WorkspaceDiagnosticRequestHandler {
let version = index
.key_from_url(url.clone())
.ok()
.and_then(|key| index.make_document_ref(&key))
.and_then(|key| index.make_document_ref(key).ok())
.map(|doc| i64::from(doc.version()));
// Convert diagnostics to LSP format


@@ -5,23 +5,25 @@ use std::ops::{Deref, DerefMut};
use std::sync::Arc;
use anyhow::{Context, anyhow};
use index::DocumentQueryError;
use lsp_server::Message;
use lsp_types::{ClientCapabilities, TextDocumentContentChangeEvent, Url};
use options::GlobalOptions;
use ruff_db::Db;
use ruff_db::files::{File, system_path_to_file};
use ruff_db::files::File;
use ruff_db::system::{System, SystemPath, SystemPathBuf};
use ty_project::metadata::Options;
use ty_project::{ProjectDatabase, ProjectMetadata};
pub(crate) use self::capabilities::ResolvedClientCapabilities;
pub use self::index::DocumentQuery;
pub(crate) use self::index::DocumentQuery;
pub(crate) use self::options::{AllOptions, ClientOptions, DiagnosticMode};
pub(crate) use self::settings::ClientSettings;
use crate::document::{DocumentKey, DocumentVersion, NotebookDocument};
use crate::session::request_queue::RequestQueue;
use crate::system::{AnySystemPath, LSPSystem};
use crate::{PositionEncoding, TextDocument};
use index::Index;
mod capabilities;
pub(crate) mod client;
@@ -31,7 +33,7 @@ mod request_queue;
mod settings;
/// The global state for the LSP
pub struct Session {
pub(crate) struct Session {
/// Used to retrieve information about open documents and settings.
///
/// This will be [`None`] when a mutable reference is held to the index via [`index_mut`]
@@ -39,7 +41,7 @@ pub struct Session {
/// when the mutable reference ([`MutIndexGuard`]) is dropped.
///
/// [`index_mut`]: Session::index_mut
index: Option<Arc<index::Index>>,
index: Option<Arc<Index>>,
/// Maps workspace folders to their respective workspace.
workspaces: Workspaces,
@@ -71,7 +73,7 @@ impl Session {
global_options: GlobalOptions,
workspace_folders: Vec<(Url, ClientOptions)>,
) -> crate::Result<Self> {
let index = Arc::new(index::Index::new(global_options.into_settings()));
let index = Arc::new(Index::new(global_options.into_settings()));
let mut workspaces = Workspaces::default();
for (url, options) in workspace_folders {
@@ -219,7 +221,10 @@ impl Session {
.chain(std::iter::once(&mut self.default_project))
}
pub(crate) fn key_from_url(&self, url: Url) -> crate::Result<DocumentKey> {
/// Returns the [`DocumentKey`] for the given URL.
///
/// Refer to [`Index::key_from_url`] for more details.
pub(crate) fn key_from_url(&self, url: Url) -> Result<DocumentKey, Url> {
self.index().key_from_url(url)
}
@@ -278,16 +283,17 @@ impl Session {
}
/// Creates a document snapshot with the URL referencing the document to snapshot.
///
/// Returns `None` if the url can't be converted to a document key or if the document isn't open.
pub(crate) fn take_document_snapshot(&self, url: Url) -> Option<DocumentSnapshot> {
let key = self.key_from_url(url).ok()?;
Some(DocumentSnapshot {
pub(crate) fn take_document_snapshot(&self, url: Url) -> DocumentSnapshot {
let index = self.index();
DocumentSnapshot {
resolved_client_capabilities: self.resolved_client_capabilities.clone(),
client_settings: self.index().global_settings(),
document_ref: self.index().make_document_ref(&key)?,
client_settings: index.global_settings(),
position_encoding: self.position_encoding,
})
document_query_result: self
.key_from_url(url)
.map_err(DocumentQueryError::InvalidUrl)
.and_then(|key| index.make_document_ref(key)),
}
}
/// Creates a snapshot of the current state of the [`Session`].
@@ -350,7 +356,7 @@ impl Session {
/// Panics if there's a mutable reference to the index via [`index_mut`].
///
/// [`index_mut`]: Session::index_mut
fn index(&self) -> &index::Index {
fn index(&self) -> &Index {
self.index.as_ref().unwrap()
}
@@ -394,11 +400,11 @@ impl Session {
/// When dropped, this guard restores all references to the index.
struct MutIndexGuard<'a> {
session: &'a mut Session,
index: Option<index::Index>,
index: Option<Index>,
}
impl Deref for MutIndexGuard<'_> {
type Target = index::Index;
type Target = Index;
fn deref(&self) -> &Self::Target {
self.index.as_ref().unwrap()
@@ -428,48 +434,69 @@ impl Drop for MutIndexGuard<'_> {
}
}
/// An immutable snapshot of `Session` that references
/// a specific document.
/// An immutable snapshot of [`Session`] that references a specific document.
#[derive(Debug)]
pub struct DocumentSnapshot {
pub(crate) struct DocumentSnapshot {
resolved_client_capabilities: Arc<ResolvedClientCapabilities>,
client_settings: Arc<ClientSettings>,
document_ref: index::DocumentQuery,
position_encoding: PositionEncoding,
document_query_result: Result<DocumentQuery, DocumentQueryError>,
}
impl DocumentSnapshot {
/// Returns the resolved client capabilities that were captured during initialization.
pub(crate) fn resolved_client_capabilities(&self) -> &ResolvedClientCapabilities {
&self.resolved_client_capabilities
}
pub(crate) fn query(&self) -> &index::DocumentQuery {
&self.document_ref
}
/// Returns the position encoding that was negotiated during initialization.
pub(crate) fn encoding(&self) -> PositionEncoding {
self.position_encoding
}
/// Returns the client settings for this document.
pub(crate) fn client_settings(&self) -> &ClientSettings {
&self.client_settings
}
pub(crate) fn file(&self, db: &dyn Db) -> Option<File> {
match AnySystemPath::try_from_url(self.document_ref.file_url()).ok()? {
AnySystemPath::System(path) => system_path_to_file(db, path).ok(),
AnySystemPath::SystemVirtual(virtual_path) => db
.files()
.try_virtual_file(&virtual_path)
.map(|virtual_file| virtual_file.file()),
/// Returns the result of the document query for this snapshot.
pub(crate) fn document(&self) -> Result<&DocumentQuery, &DocumentQueryError> {
self.document_query_result.as_ref()
}
pub(crate) fn file_ok(&self, db: &dyn Db) -> Option<File> {
match self.file(db) {
Ok(file) => Some(file),
Err(err) => {
tracing::debug!("Failed to resolve file: {}", err);
None
}
}
}
fn file(&self, db: &dyn Db) -> Result<File, FileLookupError> {
let document = match self.document() {
Ok(document) => document,
Err(err) => return Err(FileLookupError::DocumentQuery(err.clone())),
};
document
.file(db)
.ok_or_else(|| FileLookupError::NotFound(document.file_path().clone()))
}
}
#[derive(Debug, thiserror::Error)]
pub(crate) enum FileLookupError {
#[error("file not found for path `{0}`")]
NotFound(AnySystemPath),
#[error(transparent)]
DocumentQuery(DocumentQueryError),
}
/// An immutable snapshot of the current state of [`Session`].
pub(crate) struct SessionSnapshot {
projects: Vec<ProjectDatabase>,
index: Arc<index::Index>,
index: Arc<Index>,
position_encoding: PositionEncoding,
}
@@ -478,7 +505,7 @@ impl SessionSnapshot {
&self.projects
}
pub(crate) fn index(&self) -> &index::Index {
pub(crate) fn index(&self) -> &Index {
&self.index
}


@@ -1,6 +1,8 @@
use std::sync::Arc;
use lsp_types::Url;
use ruff_db::Db;
use ruff_db::files::{File, system_path_to_file};
use rustc_hash::FxHashMap;
use crate::session::settings::ClientSettings;
@@ -68,16 +70,17 @@ impl Index {
Ok(())
}
pub(crate) fn key_from_url(&self, url: Url) -> crate::Result<DocumentKey> {
/// Returns the [`DocumentKey`] corresponding to the given URL.
///
/// It returns [`Err`] with the original URL if it cannot be converted to an [`AnySystemPath`].
pub(crate) fn key_from_url(&self, url: Url) -> Result<DocumentKey, Url> {
if let Some(notebook_path) = self.notebook_cells.get(&url) {
Ok(DocumentKey::NotebookCell {
cell_url: url,
notebook_path: notebook_path.clone(),
})
} else {
let path = AnySystemPath::try_from_url(&url)
.map_err(|()| anyhow::anyhow!("Failed to convert URL to system path: {}", url))?;
let path = AnySystemPath::try_from_url(&url).map_err(|()| url)?;
if path
.extension()
.is_some_and(|ext| ext.eq_ignore_ascii_case("ipynb"))
@@ -122,17 +125,27 @@ impl Index {
Ok(())
}
pub(crate) fn make_document_ref(&self, key: &DocumentKey) -> Option<DocumentQuery> {
/// Create a document reference corresponding to the given document key.
///
/// Returns an error if the document is not found or if the path cannot be converted to a URL.
pub(crate) fn make_document_ref(
&self,
key: DocumentKey,
) -> Result<DocumentQuery, DocumentQueryError> {
let path = key.path();
let controller = self.documents.get(path)?;
let (cell_url, file_url) = match &key {
let Some(controller) = self.documents.get(path) else {
return Err(DocumentQueryError::NotFound(key));
};
// TODO: The `to_url` conversion shouldn't be an error because the paths themselves are
// constructed from the URLs, but the `Index` APIs don't maintain this invariant.
let (cell_url, file_path) = match key {
DocumentKey::NotebookCell {
cell_url,
notebook_path,
} => (Some(cell_url.clone()), notebook_path.to_url()?),
DocumentKey::Notebook(path) | DocumentKey::Text(path) => (None, path.to_url()?),
} => (Some(cell_url), notebook_path),
DocumentKey::Notebook(path) | DocumentKey::Text(path) => (None, path),
};
Some(controller.make_ref(cell_url, file_url))
Ok(controller.make_ref(cell_url, file_path))
}
pub(super) fn open_text_document(&mut self, path: &AnySystemPath, document: TextDocument) {
@@ -207,15 +220,15 @@ impl DocumentController {
Self::Notebook(Arc::new(document))
}
fn make_ref(&self, cell_url: Option<Url>, file_url: Url) -> DocumentQuery {
fn make_ref(&self, cell_url: Option<Url>, file_path: AnySystemPath) -> DocumentQuery {
match &self {
Self::Notebook(notebook) => DocumentQuery::Notebook {
cell_url,
file_url,
file_path,
notebook: notebook.clone(),
},
Self::Text(document) => DocumentQuery::Text {
file_url,
file_path,
document: document.clone(),
},
}
@@ -251,26 +264,27 @@ impl DocumentController {
}
/// A read-only query to an open document.
///
/// This query can 'select' a text document, full notebook, or a specific notebook cell.
/// It also includes document settings.
#[derive(Debug, Clone)]
pub enum DocumentQuery {
pub(crate) enum DocumentQuery {
Text {
file_url: Url,
file_path: AnySystemPath,
document: Arc<TextDocument>,
},
Notebook {
/// The selected notebook cell, if it exists.
cell_url: Option<Url>,
/// The URL of the notebook.
file_url: Url,
/// The path to the notebook.
file_path: AnySystemPath,
notebook: Arc<NotebookDocument>,
},
}
impl DocumentQuery {
/// Attempts to access the underlying notebook document that this query is selecting.
pub fn as_notebook(&self) -> Option<&NotebookDocument> {
pub(crate) fn as_notebook(&self) -> Option<&NotebookDocument> {
match self {
Self::Notebook { notebook, .. } => Some(notebook),
Self::Text { .. } => None,
@@ -285,10 +299,10 @@ impl DocumentQuery {
}
}
/// Get the URL for the document selected by this query.
pub(crate) fn file_url(&self) -> &Url {
/// Get the system path for the document selected by this query.
pub(crate) fn file_path(&self) -> &AnySystemPath {
match self {
Self::Text { file_url, .. } | Self::Notebook { file_url, .. } => file_url,
Self::Text { file_path, .. } | Self::Notebook { file_path, .. } => file_path,
}
}
@@ -307,4 +321,27 @@ impl DocumentQuery {
.and_then(|cell_uri| notebook.cell_document_by_uri(cell_uri)),
}
}
/// Returns the salsa interned [`File`] for the document selected by this query.
///
/// It returns [`None`] in the following cases:
/// - For a virtual file, if it's not yet opened
/// - For a regular file, if it does not exist or is a directory
pub(crate) fn file(&self, db: &dyn Db) -> Option<File> {
match self.file_path() {
AnySystemPath::System(path) => system_path_to_file(db, path).ok(),
AnySystemPath::SystemVirtual(virtual_path) => db
.files()
.try_virtual_file(virtual_path)
.map(|virtual_file| virtual_file.file()),
}
}
}
#[derive(Debug, Clone, thiserror::Error)]
pub(crate) enum DocumentQueryError {
#[error("invalid URL: {0}")]
InvalidUrl(Url),
#[error("document not found for key: {0}")]
NotFound(DocumentKey),
}


@@ -1,4 +1,5 @@
use std::any::Any;
use std::fmt;
use std::fmt::Display;
use std::sync::Arc;
@@ -97,6 +98,15 @@ impl AnySystemPath {
}
}
impl fmt::Display for AnySystemPath {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
AnySystemPath::System(system_path) => write!(f, "{system_path}"),
AnySystemPath::SystemVirtual(virtual_path) => write!(f, "{virtual_path}"),
}
}
}
#[derive(Debug)]
pub(crate) struct LSPSystem {
/// A read-only copy of the index where the server stores all the open documents and settings.
@@ -145,7 +155,7 @@ impl LSPSystem {
fn make_document_ref(&self, path: AnySystemPath) -> Option<DocumentQuery> {
let index = self.index();
let key = DocumentKey::from_path(path);
index.make_document_ref(&key)
index.make_document_ref(key).ok()
}
fn system_path_to_document_ref(&self, path: &SystemPath) -> Option<DocumentQuery> {

ty.schema.json (generated)

@@ -58,25 +58,6 @@
},
"additionalProperties": false,
"definitions": {
"DiagnosticFormat": {
"description": "The diagnostic output format.",
"oneOf": [
{
"description": "The default full mode will print \"pretty\" diagnostics.\n\nThat is, color will be used when printing to a `tty`. Moreover, diagnostic messages may include additional context and annotations on the input to help understand the message.",
"type": "string",
"enum": [
"full"
]
},
{
"description": "Print diagnostics in a concise mode.\n\nThis will guarantee that each diagnostic is printed on a single line. Only the most important or primary aspects of the diagnostic are included. Contextual information is dropped.\n\nThis may use color when printing to a `tty`.",
"type": "string",
"enum": [
"concise"
]
}
]
},
"EnvironmentOptions": {
"type": "object",
"properties": {
@@ -167,6 +148,25 @@
}
]
},
"OutputFormat": {
"description": "The diagnostic output format.",
"oneOf": [
{
"description": "The default full mode will print \"pretty\" diagnostics.\n\nThat is, color will be used when printing to a `tty`. Moreover, diagnostic messages may include additional context and annotations on the input to help understand the message.",
"type": "string",
"enum": [
"full"
]
},
{
"description": "Print diagnostics in a concise mode.\n\nThis will guarantee that each diagnostic is printed on a single line. Only the most important or primary aspects of the diagnostic are included. Contextual information is dropped.\n\nThis may use color when printing to a `tty`.",
"type": "string",
"enum": [
"concise"
]
}
]
},
"OverrideOptions": {
"type": "object",
"properties": {
@@ -991,7 +991,7 @@
"description": "The format to use for printing diagnostic messages.\n\nDefaults to `full`.",
"anyOf": [
{
"$ref": "#/definitions/DiagnosticFormat"
"$ref": "#/definitions/OutputFormat"
},
{
"type": "null"