Compare commits: amy/ruffen...dcreager/g (24 commits)

- bb07775617
- 180b38987c
- 7bbe3d73fc
- d5089ccd4d
- 1c247af1cf
- f735222ed6
- f1188c74b1
- e5533da00f
- 138bf79857
- e39f4654cb
- 2c95befaff
- 61aafffab5
- 47840fdd0c
- acd1e6c466
- 9ca1207667
- 83378f01b1
- be4e7e773d
- 6df88e8fd2
- 8888c3ce1d
- b6ca2d9050
- 2fa8636a2e
- c8664f68bb
- 4ff828d996
- 517566a8ca
CHANGELOG.md (58 lines changed)
````diff
@@ -1,63 +1,5 @@
 # Changelog
 
-## 0.14.11
-
-Released on 2026-01-08.
-
-### Preview features
-
-- Consolidate diagnostics for matched disable/enable suppression comments ([#22099](https://github.com/astral-sh/ruff/pull/22099))
-- Report diagnostics for invalid/unmatched range suppression comments ([#21908](https://github.com/astral-sh/ruff/pull/21908))
-- \[`airflow`\] Passing positional argument into `airflow.lineage.hook.HookLineageCollector.create_asset` is not allowed (`AIR303`) ([#22046](https://github.com/astral-sh/ruff/pull/22046))
-- \[`refurb`\] Mark `FURB192` fix as always unsafe ([#22210](https://github.com/astral-sh/ruff/pull/22210))
-- \[`ruff`\] Add `non-empty-init-module` (`RUF067`) ([#22143](https://github.com/astral-sh/ruff/pull/22143))
-
-### Bug fixes
-
-- Fix GitHub format for multi-line diagnostics ([#22108](https://github.com/astral-sh/ruff/pull/22108))
-- \[`flake8-unused-arguments`\] Mark `**kwargs` in `TypeVar` as used (`ARG001`) ([#22214](https://github.com/astral-sh/ruff/pull/22214))
-
-### Rule changes
-
-- Add `help:` subdiagnostics for several Ruff rules that can sometimes appear to disagree with `ty` ([#22331](https://github.com/astral-sh/ruff/pull/22331))
-- \[`pylint`\] Demote `PLW1510` fix to display-only ([#22318](https://github.com/astral-sh/ruff/pull/22318))
-- \[`pylint`\] Ignore identical members (`PLR1714`) ([#22220](https://github.com/astral-sh/ruff/pull/22220))
-- \[`pylint`\] Improve diagnostic range for `PLC0206` ([#22312](https://github.com/astral-sh/ruff/pull/22312))
-- \[`ruff`\] Improve fix title for `RUF102` invalid rule code ([#22100](https://github.com/astral-sh/ruff/pull/22100))
-- \[`flake8-simplify`\]: Avoid unnecessary builtins import for `SIM105` ([#22358](https://github.com/astral-sh/ruff/pull/22358))
-
-### Configuration
-
-- Allow Python 3.15 as valid `target-version` value in preview ([#22419](https://github.com/astral-sh/ruff/pull/22419))
-- Check `required-version` before parsing rules ([#22410](https://github.com/astral-sh/ruff/pull/22410))
-- Include configured `src` directories when resolving graphs ([#22451](https://github.com/astral-sh/ruff/pull/22451))
-
-### Documentation
-
-- Update `T201` suggestion to not use root logger to satisfy `LOG015` ([#22059](https://github.com/astral-sh/ruff/pull/22059))
-- Fix `iter` example in unsafe fixes doc ([#22118](https://github.com/astral-sh/ruff/pull/22118))
-- \[`flake8_print`\] better suggestion for `basicConfig` in `T201` docs ([#22101](https://github.com/astral-sh/ruff/pull/22101))
-- \[`pylint`\] Restore the fix safety docs for `PLW0133` ([#22211](https://github.com/astral-sh/ruff/pull/22211))
-- Fix Jupyter notebook discovery info for editors ([#22447](https://github.com/astral-sh/ruff/pull/22447))
-
-### Contributors
-
-- [@charliermarsh](https://github.com/charliermarsh)
-- [@ntBre](https://github.com/ntBre)
-- [@cenviity](https://github.com/cenviity)
-- [@njhearp](https://github.com/njhearp)
-- [@cbachhuber](https://github.com/cbachhuber)
-- [@jelle-openai](https://github.com/jelle-openai)
-- [@AlexWaygood](https://github.com/AlexWaygood)
-- [@ValdonVitija](https://github.com/ValdonVitija)
-- [@BurntSushi](https://github.com/BurntSushi)
-- [@Jkhall81](https://github.com/Jkhall81)
-- [@PeterJCLaw](https://github.com/PeterJCLaw)
-- [@harupy](https://github.com/harupy)
-- [@amyreese](https://github.com/amyreese)
-- [@sjyangkevin](https://github.com/sjyangkevin)
-- [@woodruffw](https://github.com/woodruffw)
-
 ## 0.14.10
 
 Released on 2025-12-18.
````
Cargo.lock (generated, 7 lines changed)
````diff
@@ -2912,7 +2912,7 @@ dependencies = [
 
 [[package]]
 name = "ruff"
-version = "0.14.11"
+version = "0.14.10"
 dependencies = [
  "anyhow",
  "argfile",
@@ -2928,7 +2928,6 @@ dependencies = [
  "filetime",
  "globwalk",
  "ignore",
- "indexmap",
  "indoc",
  "insta",
  "insta-cmd",
@@ -3172,7 +3171,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_linter"
-version = "0.14.11"
+version = "0.14.10"
 dependencies = [
  "aho-corasick",
  "anyhow",
@@ -3530,7 +3529,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_wasm"
-version = "0.14.11"
+version = "0.14.10"
 dependencies = [
  "console_error_panic_hook",
  "console_log",
````
````diff
@@ -150,8 +150,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
 powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
 
 # For a specific version.
-curl -LsSf https://astral.sh/ruff/0.14.11/install.sh | sh
-powershell -c "irm https://astral.sh/ruff/0.14.11/install.ps1 | iex"
+curl -LsSf https://astral.sh/ruff/0.14.10/install.sh | sh
+powershell -c "irm https://astral.sh/ruff/0.14.10/install.ps1 | iex"
 ```
 
 You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -184,7 +184,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.14.11
+  rev: v0.14.10
   hooks:
     # Run the linter.
     - id: ruff-check
````
````diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff"
-version = "0.14.11"
+version = "0.14.10"
 publish = true
 authors = { workspace = true }
 edition = { workspace = true }
@@ -31,7 +31,6 @@ ruff_options_metadata = { workspace = true, features = ["serde"] }
 ruff_python_ast = { workspace = true }
 ruff_python_formatter = { workspace = true }
 ruff_python_parser = { workspace = true }
-ruff_python_trivia = { workspace = true }
 ruff_server = { workspace = true }
 ruff_source_file = { workspace = true }
 ruff_text_size = { workspace = true }
@@ -49,7 +48,6 @@ colored = { workspace = true }
 filetime = { workspace = true }
 globwalk = { workspace = true }
 ignore = { workspace = true }
-indexmap = { workspace = true }
 is-macro = { workspace = true }
 itertools = { workspace = true }
 jiff = { workspace = true }
````
````diff
@@ -2,7 +2,6 @@ use crate::args::{AnalyzeGraphArgs, ConfigArguments};
 use crate::resolve::resolve;
 use crate::{ExitStatus, resolve_default_files};
 use anyhow::Result;
-use indexmap::IndexSet;
 use log::{debug, warn};
 use path_absolutize::CWD;
 use ruff_db::system::{SystemPath, SystemPathBuf};
@@ -12,7 +11,7 @@ use ruff_linter::source_kind::SourceKind;
 use ruff_linter::{warn_user, warn_user_once};
 use ruff_python_ast::{PySourceType, SourceType};
 use ruff_workspace::resolver::{ResolvedFile, match_exclusion, python_files_in_path};
-use rustc_hash::{FxBuildHasher, FxHashMap};
+use rustc_hash::FxHashMap;
 use std::io::Write;
 use std::path::{Path, PathBuf};
 use std::sync::{Arc, Mutex};
````
````diff
@@ -60,34 +59,17 @@ pub(crate) fn analyze_graph(
         })
         .collect::<FxHashMap<_, _>>();
 
-    // Create a database from the source roots, combining configured `src` paths with detected
-    // package roots. Configured paths are added first so they take precedence, and duplicates
-    // are removed.
-    let mut src_roots: IndexSet<SystemPathBuf, FxBuildHasher> = IndexSet::default();
-
-    // Add configured `src` paths first (for precedence), filtering to only include existing
-    // directories.
-    src_roots.extend(
-        pyproject_config
-            .settings
-            .linter
-            .src
-            .iter()
-            .filter(|path| path.is_dir())
-            .filter_map(|path| SystemPathBuf::from_path_buf(path.clone()).ok()),
-    );
-
-    // Add detected package roots.
-    src_roots.extend(
-        package_roots
-            .values()
-            .filter_map(|package| package.as_deref())
-            .filter_map(|path| path.parent())
-            .filter_map(|path| SystemPathBuf::from_path_buf(path.to_path_buf()).ok()),
-    );
+    // Create a database from the source roots.
+    let src_roots = package_roots
+        .values()
+        .filter_map(|package| package.as_deref())
+        .filter_map(|package| package.parent())
+        .map(Path::to_path_buf)
+        .filter_map(|path| SystemPathBuf::from_path_buf(path).ok())
+        .collect();
 
     let db = ModuleDb::from_src_roots(
-        src_roots.into_iter().collect(),
+        src_roots,
         pyproject_config
             .settings
             .analyze
````
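The deleted block above merges configured `src` paths with detected package roots, keeping configured paths first so they take precedence and dropping duplicates. The same first-wins dedup can be sketched with only the standard library (the real code uses `indexmap::IndexSet`; the function and names here are illustrative):

```rust
use std::collections::HashSet;

/// Combine configured `src` paths with detected package roots, keeping the
/// first occurrence of each path so configured paths win over detected ones.
fn merge_src_roots(configured: &[&str], detected: &[&str]) -> Vec<String> {
    let mut seen = HashSet::new();
    configured
        .iter()
        .chain(detected.iter())
        // `insert` returns false for paths already seen, filtering duplicates
        // while preserving the order of first appearance.
        .filter(|path| seen.insert(path.to_string()))
        .map(|path| path.to_string())
        .collect()
}

fn main() {
    let roots = merge_src_roots(&["lib", "app"], &["app", "vendor"]);
    // "app" appears once, in its configured (first) position.
    assert_eq!(roots, vec!["lib", "app", "vendor"]);
}
```

An ordered set gives the same behavior in one container; the sketch separates the ordering (`Vec`) from the dedup (`HashSet`) to stay dependency-free.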
````diff
@@ -11,7 +11,6 @@ use itertools::Itertools;
 use log::{error, warn};
 use rayon::iter::Either::{Left, Right};
 use rayon::iter::{IntoParallelRefIterator, ParallelIterator};
-use regex::{Captures, Regex};
 use ruff_db::diagnostic::{
     Annotation, Diagnostic, DiagnosticId, DisplayDiagnosticConfig, Severity, Span,
 };
@@ -19,7 +18,6 @@ use ruff_linter::message::{EmitterContext, create_panic_diagnostic, render_diagn
 use ruff_linter::settings::types::OutputFormat;
 use ruff_notebook::NotebookIndex;
 use ruff_python_parser::ParseError;
-use ruff_python_trivia::textwrap::{dedent, indent};
 use rustc_hash::{FxHashMap, FxHashSet};
 use thiserror::Error;
 use tracing::debug;
````
````diff
@@ -491,66 +489,6 @@ pub(crate) fn format_source(
                 formatted,
             )))
         }
-        SourceKind::Markdown(unformatted_document) => {
-            // adapted from blacken-docs
-            // https://github.com/adamchainz/blacken-docs/blob/fb107c1dce25f9206e29297aaa1ed7afc2980a5a/src/blacken_docs/__init__.py#L17
-            let code_block_regex = Regex::new(
-                r"(?imsx)
-                (?<before>
-                    ^(?<indent>\ *)```[^\S\r\n]*
-                    (?:python|py|python3|py3)
-                    (?:\ .*?)?\n
-                )
-                (?<code>.*?)
-                (?<after>
-                    ^\ *```[^\S\r\n]*$
-                )
-                ",
-            )
-            .unwrap();
-
-            let mut changed = false;
-            let formatted_document =
-                code_block_regex.replace_all(unformatted_document, |capture: &Captures| {
-                    let (original, [before, code_indent, unformatted_code, after]) =
-                        capture.extract();
-
-                    let unformatted_code = dedent(unformatted_code);
-                    let options = settings.to_format_options(source_type, &unformatted_code, path);
-
-                    let formatted_code = if let Some(_range) = range {
-                        unimplemented!()
-                    } else {
-                        // Using `Printed::into_code` requires adding `ruff_formatter` as a direct dependency, and I suspect that Rust can optimize the closure away regardless.
-                        #[expect(clippy::redundant_closure_for_method_calls)]
-                        format_module_source(&unformatted_code, options)
-                            .map(|formatted| formatted.into_code())
-                    };
-
-                    // TODO: figure out how to properly raise errors from inside closure
-                    if let Ok(formatted_code) = formatted_code {
-                        if formatted_code.len() == unformatted_code.len()
-                            && formatted_code == *unformatted_code
-                        {
-                            original.to_string()
-                        } else {
-                            changed = true;
-                            let formatted_code = indent(formatted_code.as_str(), code_indent);
-                            format!("{before}{formatted_code}{after}")
-                        }
-                    } else {
-                        original.to_string()
-                    }
-                });
-
-            if changed {
-                Ok(FormattedSource::Formatted(SourceKind::Markdown(
-                    formatted_document.to_string(),
-                )))
-            } else {
-                Ok(FormattedSource::Unchanged)
-            }
-        }
     }
 }
````
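The removed `Markdown` branch rewrites each fenced Python block in a document in place, blacken-docs style. The same scan-and-rewrite idea can be sketched without the `regex` crate by walking fence lines (a toy, not the real implementation: it ignores fence indentation and info-string suffixes, and the function name is made up):

```rust
/// Apply `format` to the body of each ```python fenced code block in a
/// Markdown document, leaving all other lines untouched.
fn format_python_blocks(doc: &str, format: impl Fn(&str) -> String) -> String {
    let mut out = String::new();
    let mut lines = doc.lines().peekable();
    while let Some(line) = lines.next() {
        out.push_str(line);
        out.push('\n');
        let fence = line.trim_start();
        if fence.starts_with("```python") || fence.starts_with("```py") {
            // Collect the block body up to the closing fence, then emit the
            // formatted body followed by the closing fence itself.
            let mut code = String::new();
            for code_line in lines.by_ref() {
                if code_line.trim_start().starts_with("```") {
                    out.push_str(&format(&code));
                    out.push_str(code_line);
                    out.push('\n');
                    break;
                }
                code.push_str(code_line);
                code.push('\n');
            }
        }
    }
    out
}

fn main() {
    let doc = "# Title\n```python\nx=1\n```\n";
    let formatted = format_python_blocks(doc, |code| code.replace("x=1", "x = 1"));
    assert_eq!(formatted, "# Title\n```python\nx = 1\n```\n");
}
```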
````diff
@@ -714,121 +714,6 @@ fn notebook_basic() -> Result<()> {
     Ok(())
 }
 
-/// Test that the `src` configuration option is respected.
-///
-/// This is useful for monorepos where there are multiple source directories that need to be
-/// included in the module resolution search path.
-#[test]
-fn src_option() -> Result<()> {
-    let tempdir = TempDir::new()?;
-    let root = ChildPath::new(tempdir.path());
-
-    // Create a lib directory with a package.
-    root.child("lib")
-        .child("mylib")
-        .child("__init__.py")
-        .write_str("def helper(): pass")?;
-
-    // Create an app directory with a file that imports from mylib.
-    root.child("app").child("__init__.py").write_str("")?;
-    root.child("app")
-        .child("main.py")
-        .write_str("from mylib import helper")?;
-
-    // Without src configured, the import from mylib won't resolve.
-    insta::with_settings!({
-        filters => INSTA_FILTERS.to_vec(),
-    }, {
-        assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
-        success: true
-        exit_code: 0
-        ----- stdout -----
-        {
-          "app/__init__.py": [],
-          "app/main.py": []
-        }
-
-        ----- stderr -----
-        "#);
-    });
-
-    // With src = ["lib"], the import should resolve.
-    root.child("ruff.toml").write_str(indoc::indoc! {r#"
-        src = ["lib"]
-    "#})?;
-
-    insta::with_settings!({
-        filters => INSTA_FILTERS.to_vec(),
-    }, {
-        assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
-        success: true
-        exit_code: 0
-        ----- stdout -----
-        {
-          "app/__init__.py": [],
-          "app/main.py": [
-            "lib/mylib/__init__.py"
-          ]
-        }
-
-        ----- stderr -----
-        "#);
-    });
-
-    Ok(())
-}
-
-/// Test that glob patterns in `src` are expanded.
-#[test]
-fn src_glob_expansion() -> Result<()> {
-    let tempdir = TempDir::new()?;
-    let root = ChildPath::new(tempdir.path());
-
-    // Create multiple lib directories with packages.
-    root.child("libs")
-        .child("lib_a")
-        .child("pkg_a")
-        .child("__init__.py")
-        .write_str("def func_a(): pass")?;
-    root.child("libs")
-        .child("lib_b")
-        .child("pkg_b")
-        .child("__init__.py")
-        .write_str("def func_b(): pass")?;
-
-    // Create an app that imports from both packages.
-    root.child("app").child("__init__.py").write_str("")?;
-    root.child("app")
-        .child("main.py")
-        .write_str("from pkg_a import func_a\nfrom pkg_b import func_b")?;
-
-    // Use a glob pattern to include all lib directories.
-    root.child("ruff.toml").write_str(indoc::indoc! {r#"
-        src = ["libs/*"]
-    "#})?;
-
-    insta::with_settings!({
-        filters => INSTA_FILTERS.to_vec(),
-    }, {
-        assert_cmd_snapshot!(command().arg("app").current_dir(&root), @r#"
-        success: true
-        exit_code: 0
-        ----- stdout -----
-        {
-          "app/__init__.py": [],
-          "app/main.py": [
-            "libs/lib_a/pkg_a/__init__.py",
-            "libs/lib_b/pkg_b/__init__.py"
-          ]
-        }
-
-        ----- stderr -----
-        "#);
-    });
-
-    Ok(())
-}
-
 #[test]
 fn notebook_with_magic() -> Result<()> {
     let tempdir = TempDir::new()?;
````
````diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_linter"
-version = "0.14.11"
+version = "0.14.10"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
````
````diff
@@ -22,8 +22,6 @@ pub enum SourceKind {
     Python(String),
     /// The source contains a Jupyter notebook.
     IpyNotebook(Box<Notebook>),
-    /// The source contains Markdown text.
-    Markdown(String),
 }
 
 impl SourceKind {
@@ -35,7 +33,6 @@ impl SourceKind {
         match self {
             SourceKind::IpyNotebook(notebook) => Some(notebook),
             SourceKind::Python(_) => None,
-            SourceKind::Markdown(_) => None,
         }
     }
 
@@ -43,36 +40,20 @@ impl SourceKind {
         match self {
             SourceKind::Python(code) => Some(code),
             SourceKind::IpyNotebook(_) => None,
-            SourceKind::Markdown(_) => None,
         }
     }
 
-    pub fn as_markdown(&self) -> Option<&str> {
-        match self {
-            SourceKind::Markdown(code) => Some(code),
-            SourceKind::Python(_) => None,
-            SourceKind::IpyNotebook(_) => None,
-        }
-    }
-
     pub fn expect_python(self) -> String {
         match self {
             SourceKind::Python(code) => code,
-            _ => panic!("expected python code"),
+            SourceKind::IpyNotebook(_) => panic!("expected python code"),
         }
     }
 
     pub fn expect_ipy_notebook(self) -> Notebook {
         match self {
             SourceKind::IpyNotebook(notebook) => *notebook,
-            _ => panic!("expected ipy notebook"),
+            SourceKind::Python(_) => panic!("expected ipy notebook"),
         }
     }
-
-    pub fn expect_markdown(self) -> String {
-        match self {
-            SourceKind::Markdown(code) => code,
-            _ => panic!("expected markdown text"),
-        }
-    }
@@ -85,7 +66,6 @@ impl SourceKind {
                 SourceKind::IpyNotebook(cloned)
             }
             SourceKind::Python(_) => SourceKind::Python(new_source),
-            SourceKind::Markdown(_) => SourceKind::Markdown(new_source),
         }
     }
@@ -94,28 +74,20 @@ impl SourceKind {
         match self {
             SourceKind::Python(source) => source,
             SourceKind::IpyNotebook(notebook) => notebook.source_code(),
-            SourceKind::Markdown(source) => source,
         }
     }
 
     /// Read the [`SourceKind`] from the given path. Returns `None` if the source is not a Python
     /// source file.
     pub fn from_path(path: &Path, source_type: PySourceType) -> Result<Option<Self>, SourceError> {
-        match source_type {
-            PySourceType::Ipynb => {
-                let notebook = Notebook::from_path(path)?;
-                Ok(notebook
-                    .is_python_notebook()
-                    .then_some(Self::IpyNotebook(Box::new(notebook))))
-            }
-            PySourceType::Markdown => {
-                let contents = std::fs::read_to_string(path)?;
-                Ok(Some(Self::Markdown(contents)))
-            }
-            PySourceType::Python | PySourceType::Stub => {
-                let contents = std::fs::read_to_string(path)?;
-                Ok(Some(Self::Python(contents)))
-            }
+        if source_type.is_ipynb() {
+            let notebook = Notebook::from_path(path)?;
+            Ok(notebook
+                .is_python_notebook()
+                .then_some(Self::IpyNotebook(Box::new(notebook))))
+        } else {
+            let contents = std::fs::read_to_string(path)?;
+            Ok(Some(Self::Python(contents)))
         }
     }
@@ -148,10 +120,6 @@ impl SourceKind {
                 notebook.write(writer)?;
                 Ok(())
             }
-            SourceKind::Markdown(source) => {
-                writer.write_all(source.as_bytes())?;
-                Ok(())
-            }
         }
     }
@@ -172,10 +140,6 @@ impl SourceKind {
                 kind: DiffKind::IpyNotebook(src, dst),
                 path,
             }),
-            (SourceKind::Markdown(src), SourceKind::Markdown(dst)) => Some(SourceKindDiff {
-                kind: DiffKind::Markdown(src, dst),
-                path,
-            }),
             _ => None,
         }
     }
@@ -248,17 +212,6 @@ impl std::fmt::Display for SourceKindDiff<'_> {
 
                 writeln!(f)?;
             }
-            DiffKind::Markdown(original, modified) => {
-                let mut diff = CodeDiff::new(original, modified);
-
-                let relative_path = self.path.map(fs::relativize_path);
-
-                if let Some(relative_path) = &relative_path {
-                    diff.header(relative_path, relative_path);
-                }
-
-                writeln!(f, "{diff}")?;
-            }
         }
 
         Ok(())
@@ -269,7 +222,6 @@ impl std::fmt::Display for SourceKindDiff<'_> {
 enum DiffKind<'a> {
     Python(&'a str, &'a str),
     IpyNotebook(&'a Notebook, &'a Notebook),
-    Markdown(&'a str, &'a str),
 }
 
 struct CodeDiff<'a> {
````
````diff
@@ -89,8 +89,6 @@ pub enum PySourceType {
     Stub,
     /// The source is a Jupyter notebook (`.ipynb`).
     Ipynb,
-    /// The source is a Markdown file (`.md`).
-    Markdown,
 }
 
 impl PySourceType {
@@ -108,7 +106,6 @@ impl PySourceType {
         "pyi" => Self::Stub,
         "pyw" => Self::Python,
         "ipynb" => Self::Ipynb,
-        "md" => Self::Markdown,
         _ => return None,
     };
 
@@ -137,10 +134,6 @@ impl PySourceType {
     pub const fn is_ipynb(self) -> bool {
         matches!(self, Self::Ipynb)
     }
-
-    pub const fn is_markdown(self) -> bool {
-        matches!(self, Self::Markdown)
-    }
 }
 
 impl<P: AsRef<Path>> From<P> for PySourceType {
````
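The deleted `"md" => Self::Markdown` arm sits in an extension-to-source-type match. That dispatch pattern, reduced to a standalone sketch (illustrative names, not the real `PySourceType` API):

```rust
/// Toy source-type detection by file extension, mirroring the match shown in
/// the hunk above: known extensions map to a variant, everything else is None.
#[derive(Debug, PartialEq)]
enum SourceType {
    Python,
    Stub,
    Ipynb,
}

fn from_extension(ext: &str) -> Option<SourceType> {
    let ty = match ext {
        "py" | "pyw" => SourceType::Python,
        "pyi" => SourceType::Stub,
        "ipynb" => SourceType::Ipynb,
        // Unknown extensions short-circuit out of the match entirely.
        _ => return None,
    };
    Some(ty)
}

fn main() {
    assert_eq!(from_extension("pyi"), Some(SourceType::Stub));
    assert_eq!(from_extension("md"), None);
}
```

The `_ => return None` inside the match is the same trick the real code uses: it keeps the happy path as a plain value match while still producing an `Option`.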
````diff
@@ -334,7 +334,7 @@ impl Format<PyFormatContext<'_>> for FormatEmptyLines {
             PySourceType::Stub => {
                 write!(f, [empty_line()])
             }
-            PySourceType::Python | PySourceType::Ipynb | PySourceType::Markdown => {
+            PySourceType::Python | PySourceType::Ipynb => {
                 write!(f, [empty_line(), empty_line()])
             }
         },
@@ -283,9 +283,7 @@ impl FormatRule<Suite, PyFormatContext<'_>> for FormatSuite {
             PySourceType::Stub => {
                 empty_line().fmt(f)?;
             }
-            PySourceType::Python
-            | PySourceType::Ipynb
-            | PySourceType::Markdown => {
+            PySourceType::Python | PySourceType::Ipynb => {
                 write!(f, [empty_line(), empty_line()])?;
             }
         },
@@ -326,7 +324,7 @@ impl FormatRule<Suite, PyFormatContext<'_>> for FormatSuite {
             PySourceType::Stub => {
                 empty_line().fmt(f)?;
             }
-            PySourceType::Python | PySourceType::Ipynb | PySourceType::Markdown => {
+            PySourceType::Python | PySourceType::Ipynb => {
                 write!(f, [empty_line(), empty_line()])?;
             }
         },
@@ -378,7 +376,7 @@ impl FormatRule<Suite, PyFormatContext<'_>> for FormatSuite {
             PySourceType::Stub => {
                 empty_line().fmt(f)?;
             }
-            PySourceType::Python | PySourceType::Ipynb | PySourceType::Markdown => {
+            PySourceType::Python | PySourceType::Ipynb => {
                 write!(f, [empty_line(), empty_line()])?;
             }
         },
@@ -525,7 +525,7 @@ pub trait AsMode {
 impl AsMode for PySourceType {
     fn as_mode(&self) -> Mode {
         match self {
-            PySourceType::Python | PySourceType::Stub | PySourceType::Markdown => Mode::Module,
+            PySourceType::Python | PySourceType::Stub => Mode::Module,
             PySourceType::Ipynb => Mode::Ipython,
         }
     }
````
````diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_wasm"
-version = "0.14.11"
+version = "0.14.10"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
````
````diff
@@ -40,22 +40,25 @@ impl ExcludeFilter {
     }
 
     fn matches(&self, path: &SystemPath, mode: GlobFilterCheckMode, directory: bool) -> bool {
-        // If the path is excluded, return `ignore`
-        if self.ignore.matched(path, directory).is_ignore() {
-            return true;
-        }
-
         match mode {
             GlobFilterCheckMode::TopDown => {
-                // No hit or an allow hit means the file or directory is not excluded.
-                false
+                match self.ignore.matched(path, directory) {
+                    // No hit or an allow hit means the file or directory is not excluded.
+                    Match::None | Match::Allow => false,
+                    Match::Ignore => true,
+                }
             }
             GlobFilterCheckMode::Adhoc => {
-                // If the path is allowlisted or there's no hit, try the parent to ensure we don't return false
-                // for a folder where there's an exclude for a parent.
-                path.ancestors()
-                    .skip(1)
-                    .any(|ancestor| self.ignore.matched(ancestor, true).is_ignore())
+                for ancestor in path.ancestors() {
+                    match self.ignore.matched(ancestor, directory) {
+                        // If the path is allowlisted or there's no hit, try the parent to ensure we don't return false
+                        // for a folder where there's an exclude for a parent.
+                        Match::None | Match::Allow => {}
+                        Match::Ignore => return true,
+                    }
+                }
+
+                false
             }
         }
     }
@@ -184,12 +187,6 @@ enum Match {
     Allow,
 }
 
-impl Match {
-    const fn is_ignore(self) -> bool {
-        matches!(self, Match::Ignore)
-    }
-}
-
 #[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
 struct IgnoreGlob {
     /// The pattern that was originally parsed.
````
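The `Adhoc` arm walks a path's ancestors and reports the path as excluded if the path itself or any ancestor matches an exclude pattern. A stripped-down sketch of that ancestor walk, with exact name suffixes standing in for real glob matching (illustrative only):

```rust
/// Ad-hoc exclusion check: a path is excluded if it, or any of its ancestor
/// directories, matches an exclude pattern (simplified to suffix matches
/// on path components instead of real globs).
fn is_excluded(path: &str, excludes: &[&str]) -> bool {
    let mut components: Vec<&str> = path.split('/').collect();
    while !components.is_empty() {
        let candidate = components.join("/");
        if excludes.iter().any(|pattern| candidate.ends_with(pattern)) {
            return true;
        }
        // Drop the last component to check the parent directory next.
        components.pop();
    }
    false
}

fn main() {
    // Excluded because an ancestor directory matches the pattern.
    assert!(is_excluded("src/generated/models.py", &["generated"]));
    assert!(!is_excluded("src/app/models.py", &["generated"]));
}
```

Checking ancestors matters because a file under an excluded directory should be excluded even when the filter is asked about the file directly rather than during a top-down traversal.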
````diff
@@ -17,8 +17,7 @@ from datetime import time
 t = time(12, 0, 0)
 t = replace(t, minute=30)
 
-# TODO: this should be `time`, once we support specialization of generic protocols
-reveal_type(t) # revealed: Unknown
+reveal_type(t) # revealed: time
 ```
 
 ## The `__replace__` protocol
@@ -48,8 +47,7 @@ b = a.__replace__(x=3, y=4)
 reveal_type(b) # revealed: Point
 
 b = replace(a, x=3, y=4)
-# TODO: this should be `Point`, once we support specialization of generic protocols
-reveal_type(b) # revealed: Unknown
+reveal_type(b) # revealed: Point
 ```
 
 A call to `replace` does not require all keyword arguments:
@@ -59,8 +57,7 @@ c = a.__replace__(y=4)
 reveal_type(c) # revealed: Point
 
 d = replace(a, y=4)
-# TODO: this should be `Point`, once we support specialization of generic protocols
-reveal_type(d) # revealed: Unknown
+reveal_type(d) # revealed: Point
 ```
 
 Invalid calls to `__replace__` or `replace` will raise an error:
@@ -96,8 +93,7 @@ b = a.__replace__(x=3, y=4)
 reveal_type(b) # revealed: Point
 
 b = replace(a, x=3, y=4)
-# TODO: this should be `Point`, once we support specialization of generic protocols
-reveal_type(b) # revealed: Unknown
+reveal_type(b) # revealed: Point
 ```
 
 Invalid calls to `__replace__` will raise an error:
````
````diff
@@ -561,7 +561,7 @@ from typing_extensions import overload, Generic, TypeVar
 from ty_extensions import generic_context, into_callable
 
 T = TypeVar("T")
-U = TypeVar("U")
+U = TypeVar("U", covariant=True)
 
 class C(Generic[T]):
     @overload
@@ -611,9 +611,9 @@ reveal_type(generic_context(D))
 # revealed: ty_extensions.GenericContext[T@D, U@D]
 reveal_type(generic_context(into_callable(D)))
 
-reveal_type(D("string")) # revealed: D[str, str]
-reveal_type(D(1)) # revealed: D[str, int]
-reveal_type(D(1, "string")) # revealed: D[int, str]
+reveal_type(D("string")) # revealed: D[str, Literal["string"]]
+reveal_type(D(1)) # revealed: D[str, Literal[1]]
+reveal_type(D(1, "string")) # revealed: D[int, Literal["string"]]
 ```
 
 ### Synthesized methods with dataclasses
````
````diff
@@ -90,13 +90,11 @@ def takes_in_protocol(x: CanIndex[T]) -> T:
 
 def deep_list(x: list[str]) -> None:
     reveal_type(takes_in_list(x)) # revealed: list[str]
-    # TODO: revealed: str
-    reveal_type(takes_in_protocol(x)) # revealed: Unknown
+    reveal_type(takes_in_protocol(x)) # revealed: str
 
 def deeper_list(x: list[set[str]]) -> None:
     reveal_type(takes_in_list(x)) # revealed: list[set[str]]
-    # TODO: revealed: set[str]
-    reveal_type(takes_in_protocol(x)) # revealed: Unknown
+    reveal_type(takes_in_protocol(x)) # revealed: set[str]
 
 def deep_explicit(x: ExplicitlyImplements[str]) -> None:
     reveal_type(takes_in_protocol(x)) # revealed: str
@@ -129,12 +127,10 @@ class Sub(list[int]): ...
 class GenericSub(list[T]): ...
 
 reveal_type(takes_in_list(Sub())) # revealed: list[int]
-# TODO: revealed: int
-reveal_type(takes_in_protocol(Sub())) # revealed: Unknown
+reveal_type(takes_in_protocol(Sub())) # revealed: int
 
 reveal_type(takes_in_list(GenericSub[str]())) # revealed: list[str]
-# TODO: revealed: str
-reveal_type(takes_in_protocol(GenericSub[str]())) # revealed: Unknown
+reveal_type(takes_in_protocol(GenericSub[str]())) # revealed: str
 
 class ExplicitSub(ExplicitlyImplements[int]): ...
 class ExplicitGenericSub(ExplicitlyImplements[T]): ...
@@ -538,6 +538,10 @@ C[None](b"bytes") # error: [no-matching-overload]
 C[None](12)
 
 class D[T, U]:
+    # we need to use the type variable or else the class is bivariant in T, and
+    # specializations become meaningless
+    x: T
+
     @overload
     def __init__(self: "D[str, U]", u: U) -> None: ...
     @overload
@@ -551,7 +555,7 @@ reveal_type(generic_context(into_callable(D)))
 
 reveal_type(D("string")) # revealed: D[str, Literal["string"]]
 reveal_type(D(1)) # revealed: D[str, Literal[1]]
-reveal_type(D(1, "string")) # revealed: D[Literal[1], Literal["string"]]
+reveal_type(D(1, "string")) # revealed: D[int, Literal["string"]]
 ```
 
 ### Synthesized methods with dataclasses
````
````diff
@@ -85,13 +85,11 @@ def takes_in_protocol[T](x: CanIndex[T]) -> T:
 
 def deep_list(x: list[str]) -> None:
     reveal_type(takes_in_list(x)) # revealed: list[str]
-    # TODO: revealed: str
-    reveal_type(takes_in_protocol(x)) # revealed: Unknown
+    reveal_type(takes_in_protocol(x)) # revealed: str
 
 def deeper_list(x: list[set[str]]) -> None:
     reveal_type(takes_in_list(x)) # revealed: list[set[str]]
-    # TODO: revealed: set[str]
-    reveal_type(takes_in_protocol(x)) # revealed: Unknown
+    reveal_type(takes_in_protocol(x)) # revealed: set[str]
 
 def deep_explicit(x: ExplicitlyImplements[str]) -> None:
     reveal_type(takes_in_protocol(x)) # revealed: str
@@ -124,12 +122,10 @@ class Sub(list[int]): ...
 class GenericSub[T](list[T]): ...
 
 reveal_type(takes_in_list(Sub())) # revealed: list[int]
-# TODO: revealed: int
-reveal_type(takes_in_protocol(Sub())) # revealed: Unknown
+reveal_type(takes_in_protocol(Sub())) # revealed: int
 
 reveal_type(takes_in_list(GenericSub[str]())) # revealed: list[str]
-# TODO: revealed: str
-reveal_type(takes_in_protocol(GenericSub[str]())) # revealed: Unknown
+reveal_type(takes_in_protocol(GenericSub[str]())) # revealed: str
 
 class ExplicitSub(ExplicitlyImplements[int]): ...
 class ExplicitGenericSub[T](ExplicitlyImplements[T]): ...
````
````diff
@@ -1369,6 +1369,31 @@ impl<'db> Type<'db> {
         }
     }
 
+    /// Returns the number of union clauses in this type. If the type is not a union, returns 1.
+    pub(crate) fn union_size(self, db: &'db dyn Db) -> usize {
+        self.as_union()
+            .map(|union_type| union_type.elements(db).len())
+            .unwrap_or(1)
+    }
+
+    /// Returns the number of intersection clauses in this type. If the type is a union, this is
+    /// the maximum of the `intersection_size` of each union element. If the type is not a union
+    /// nor an intersection, returns 1.
+    pub(crate) fn intersection_size(self, db: &'db dyn Db) -> usize {
+        match self {
+            Type::Intersection(intersection) => {
+                intersection.positive(db).len() + intersection.negative(db).len()
+            }
+            Type::Union(union_type) => union_type
+                .elements(db)
+                .iter()
+                .map(|element| element.intersection_size(db))
+                .max()
+                .unwrap_or(1),
+            _ => 1,
+        }
+    }
+
````
pub(crate) const fn as_function_literal(self) -> Option<FunctionType<'db>> {
|
||||
match self {
|
||||
Type::FunctionLiteral(function_type) => Some(function_type),
|
||||
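The two helpers above count clauses in a type written in disjunctive normal form (a union of intersections). A small Python sketch of the same counting logic, with hypothetical `Atom`/`Union`/`Intersection` classes standing in for ty's internal type representation:

```python
# Illustrative sketch (not ty's API): clause counting over DNF types.
from dataclasses import dataclass, field


@dataclass
class Atom:
    name: str


@dataclass
class Union:
    elements: list


@dataclass
class Intersection:
    positive: list
    negative: list = field(default_factory=list)


def union_size(ty) -> int:
    """Number of union clauses; any non-union counts as one clause."""
    return len(ty.elements) if isinstance(ty, Union) else 1


def intersection_size(ty) -> int:
    """Largest intersection clause in the type; atoms count as 1."""
    if isinstance(ty, Intersection):
        return len(ty.positive) + len(ty.negative)
    if isinstance(ty, Union):
        return max((intersection_size(e) for e in ty.elements), default=1)
    return 1


# `(a & b) | (c & ~d)`: two union clauses, each of size two.
ty = Union([
    Intersection([Atom("a"), Atom("b")]),
    Intersection([Atom("c")], [Atom("d")]),
])
assert union_size(ty) == 2
assert intersection_size(ty) == 2
```

These sizes are what the later `intersect` change multiplies together to estimate how large a distributed intersection would become.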
@@ -7895,6 +7920,16 @@ impl<'db> TypeVarInstance<'db> {
         })
     }

+    /// Returns the bounds or constraints of this typevar. If the typevar is unbounded, returns
+    /// `object` as its upper bound.
+    pub(crate) fn require_bound_or_constraints(
+        self,
+        db: &'db dyn Db,
+    ) -> TypeVarBoundOrConstraints<'db> {
+        self.bound_or_constraints(db)
+            .unwrap_or_else(|| TypeVarBoundOrConstraints::UpperBound(Type::object()))
+    }
+
     pub(crate) fn default_type(self, db: &'db dyn Db) -> Option<Type<'db>> {
         self._default(db).and_then(|d| match d {
             TypeVarDefaultEvaluation::Eager(ty) => Some(ty),
@@ -4489,7 +4489,6 @@ impl<'db> BindingError<'db> {
             return;
         };

-        let typevar = error.bound_typevar().typevar(context.db());
         let argument_type = error.argument_type();
         let argument_ty_display = argument_type.display(context.db());
@@ -4502,21 +4501,51 @@ impl<'db> BindingError<'db> {
                 }
             ));

-        let typevar_name = typevar.name(context.db());
         match error {
-            SpecializationError::MismatchedBound { .. } => {
-                diag.set_primary_message(format_args!("Argument type `{argument_ty_display}` does not satisfy upper bound `{}` of type variable `{typevar_name}`",
-                    typevar.upper_bound(context.db()).expect("type variable should have an upper bound if this error occurs").display(context.db())
-                ));
-            }
-            SpecializationError::MismatchedConstraint { .. } => {
-                diag.set_primary_message(format_args!("Argument type `{argument_ty_display}` does not satisfy constraints ({}) of type variable `{typevar_name}`",
-                    typevar.constraints(context.db()).expect("type variable should have constraints if this error occurs").iter().map(|ty| format!("`{}`", ty.display(context.db()))).join(", ")
-                ));
-            }
+            SpecializationError::NoSolution { parameter, .. } => {
+                diag.set_primary_message(format_args!(
+                    "Argument type `{argument_ty_display}` does not \
+                     satisfy generic parameter annotation `{}`",
+                    parameter.display(context.db()),
+                ));
+            }
+            SpecializationError::MismatchedBound { bound_typevar, .. } => {
+                let typevar = bound_typevar.typevar(context.db());
+                let typevar_name = typevar.name(context.db());
+                diag.set_primary_message(format_args!(
+                    "Argument type `{argument_ty_display}` does not \
+                     satisfy upper bound `{}` of type variable `{typevar_name}`",
+                    typevar
+                        .upper_bound(context.db())
+                        .expect(
+                            "type variable should have an upper bound if this error occurs"
+                        )
+                        .display(context.db())
+                ));
+            }
+            SpecializationError::MismatchedConstraint { bound_typevar, .. } => {
+                let typevar = bound_typevar.typevar(context.db());
+                let typevar_name = typevar.name(context.db());
+                diag.set_primary_message(format_args!(
+                    "Argument type `{argument_ty_display}` does not \
+                     satisfy constraints ({}) of type variable `{typevar_name}`",
+                    typevar
+                        .constraints(context.db())
+                        .expect(
+                            "type variable should have constraints if this error occurs"
+                        )
+                        .iter()
+                        .format_with(", ", |ty, f| f(&format_args!(
+                            "`{}`",
+                            ty.display(context.db())
+                        )))
+                ));
+            }
         }

-        if let Some(typevar_definition) = typevar.definition(context.db()) {
+        if let Some(typevar_definition) = error.bound_typevar().and_then(|bound_typevar| {
+            bound_typevar.typevar(context.db()).definition(context.db())
+        }) {
             let module = parsed_module(context.db(), typevar_definition.file(context.db()))
                 .load(context.db());
             let typevar_range = typevar_definition.full_range(context.db(), &module);
@@ -72,6 +72,7 @@ use std::fmt::Display;
 use std::ops::Range;

 use itertools::Itertools;
+use ordermap::map::Entry;
 use rustc_hash::{FxHashMap, FxHashSet};
 use salsa::plumbing::AsId;
@@ -757,9 +758,25 @@ impl<'db> ConstrainedTypeVar<'db> {

     /// Returns the intersection of two range constraints, or `None` if the intersection is empty.
     fn intersect(self, db: &'db dyn Db, other: Self) -> IntersectionResult<'db> {
+        // TODO: For now, we treat some upper bounds as unsimplifiable if they become "too big".
+        // When intersecting constraints, the upper bounds are also intersected together. If the
+        // lhs and rhs upper bounds are unions of intersections (e.g. `(a & b) | (c & d)`), then
+        // intersecting them together will require distributing across every pair of union
+        // elements. That can quickly balloon in size. We are looking at a better representation
+        // that would let us model this case more directly, but for now, we punt.
+        const MAX_UPPER_BOUND_SIZE: usize = 4;
+        let self_upper = self.upper(db);
+        let other_upper = other.upper(db);
+        let estimated_upper_bound_size = self_upper.union_size(db)
+            * other_upper.union_size(db)
+            * (self_upper.intersection_size(db) + other_upper.intersection_size(db));
+        if estimated_upper_bound_size >= MAX_UPPER_BOUND_SIZE {
+            return IntersectionResult::CannotSimplify;
+        }
+
         // (s₁ ≤ α ≤ t₁) ∧ (s₂ ≤ α ≤ t₂) = (s₁ ∪ s₂) ≤ α ≤ (t₁ ∩ t₂)
         let lower = UnionType::from_elements(db, [self.lower(db), other.lower(db)]);
-        let upper = IntersectionType::from_elements(db, [self.upper(db), other.upper(db)]);
+        let upper = IntersectionType::from_elements(db, [self_upper, other_upper]);

         // If `lower ≰ upper`, then the intersection is empty, since there is no type that is both
         // greater than `lower`, and less than `upper`.
@@ -767,6 +784,8 @@ impl<'db> ConstrainedTypeVar<'db> {
             return IntersectionResult::Disjoint;
         }

+        // We do not create lower bounds that are unions, or upper bounds that are intersections,
+        // since those can be broken apart into BDDs over simpler constraints.
         if lower.is_union() || upper.is_nontrivial_intersection(db) {
             return IntersectionResult::CannotSimplify;
         }
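The guard above estimates the distributed size as `u₁ · u₂ · (i₁ + i₂)`: intersecting two DNF upper bounds distributes every pair of union clauses, and each resulting clause combines intersections from both sides. A self-contained Python sketch of that arithmetic (the constant mirrors the diff; the function name is hypothetical):

```python
# Size guard before distributing two DNF upper bounds. u/i are the
# union/intersection sizes of the left- and right-hand upper bounds.
MAX_UPPER_BOUND_SIZE = 4


def can_simplify(u1: int, i1: int, u2: int, i2: int) -> bool:
    """True if the distributed intersection stays under the size cap."""
    estimated = u1 * u2 * (i1 + i2)
    return estimated < MAX_UPPER_BOUND_SIZE


# Two plain (non-union) bounds: 1 * 1 * (1 + 1) = 2 < 4, so simplify.
assert can_simplify(1, 1, 1, 1)
# Two 2-clause unions like `(a & b) | (c & d)`: 2 * 2 * (2 + 2) = 16, so punt.
assert not can_simplify(2, 2, 2, 2)
```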
@@ -3437,9 +3456,11 @@ impl<'db> PathAssignments<'db> {
             );
             return Err(PathAssignmentConflict);
         }
-        if self.assignments.insert(assignment, source_order).is_some() {
-            return Ok(());
-        }
+        match self.assignments.entry(assignment) {
+            Entry::Vacant(entry) => entry.insert(source_order),
+            Entry::Occupied(_) => return Ok(()),
+        };

         // Then use our sequents to add additional facts that we know to be true. We currently
         // reuse the `source_order` of the "real" constraint passed into `walk_edge` when we add
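The change above is behavioral, not cosmetic: `insert` overwrites the `source_order` already recorded for an existing assignment before bailing out, whereas the `Entry` match only fills a vacant slot and leaves an occupied one untouched. Python's `dict` has the same pair of behaviors, which makes the difference easy to see:

```python
# First-write-wins (Entry::Vacant) vs overwrite (plain insert), using a
# dict keyed by assignment with source_order values.
assignments = {"constraint_a": 1}  # first seen at source_order 1

# Overwriting insert: the original source_order is lost.
overwritten = dict(assignments)
overwritten["constraint_a"] = 7
assert overwritten["constraint_a"] == 7

# Entry-style insert: only fills the slot if it was empty.
kept = dict(assignments)
kept.setdefault("constraint_a", 7)
assert kept["constraint_a"] == 1  # original order preserved
kept.setdefault("constraint_b", 7)
assert kept["constraint_b"] == 7  # vacant slot gets the new value
```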
@@ -3741,6 +3762,11 @@ impl<'db> BoundTypeVarInstance<'db> {
     /// when this typevar is in inferable position, where we only need _some_ specialization to
     /// satisfy the constraint set.
     fn valid_specializations(self, db: &'db dyn Db) -> Node<'db> {
+        if self.paramspec_attr(db).is_some() {
+            // P.args and P.kwargs are variadic, and do not have an upper bound or constraints.
+            return Node::AlwaysTrue;
+        }
+
         // For gradual upper bounds and constraints, we are free to choose any materialization that
         // makes the check succeed. In inferable positions, it is most helpful to choose a
         // materialization that is as permissive as possible, since that maximizes the number of
@@ -13,7 +13,6 @@ use crate::semantic_index::{SemanticIndex, semantic_index};
 use crate::types::class::ClassType;
 use crate::types::class_base::ClassBase;
 use crate::types::constraints::{ConstraintSet, IteratorConstraintsExtension};
-use crate::types::instance::{Protocol, ProtocolInstanceType};
 use crate::types::relation::{
     HasRelationToVisitor, IsDisjointVisitor, IsEquivalentVisitor, TypeRelation,
 };
@@ -1634,12 +1633,43 @@ impl<'db> SpecializationBuilder<'db> {
             upper: FxOrderSet<Type<'db>>,
         }

+        impl<'db> Bounds<'db> {
+            fn add_lower(&mut self, _db: &'db dyn Db, ty: Type<'db>) {
+                // Lower bounds are unioned. Our type representation is in DNF, so unioning a new
+                // element is typically cheap (in that it does not involve a combinatorial
+                // explosion from distributing the clause through an existing disjunction). So we
+                // don't need to be as clever here as in `add_upper`.
+                self.lower.insert(ty);
+            }
+
+            fn add_upper(&mut self, db: &'db dyn Db, ty: Type<'db>) {
+                // Upper bounds are intersected. If `ty` is a union, that involves distributing
+                // the union elements through the existing type. That makes it worth checking first
+                // whether any of the types in the upper bound are redundant.
+
+                // First check if there's an existing upper bound clause that is a subtype of the
+                // new type. If so, adding the new type does nothing to the intersection.
+                if self
+                    .upper
+                    .iter()
+                    .any(|existing| existing.is_subtype_of(db, ty))
+                {
+                    return;
+                }
+
+                // Otherwise remove any existing clauses that are a supertype of the new type,
+                // since the intersection will clip them to the new type.
+                self.upper
+                    .retain(|existing| !ty.is_subtype_of(db, *existing));
+                self.upper.insert(ty);
+            }
+        }
+
         // Sort the constraints in each path by their `source_order`s, to ensure that we construct
         // any unions or intersections in our type mappings in a stable order. Constraints might
         // come out of `PathAssignment`s with identical `source_order`s, but if they do, those
         // "tied" constraints will still be ordered in a stable way. So we need a stable sort to
         // retain that stable per-tie ordering.
+        let constraints = constraints.limit_to_valid_specializations(self.db);
         let mut sorted_paths = Vec::new();
         constraints.for_each_path(self.db, |path| {
             let mut path: Vec<_> = path.positive_constraints().collect();
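`add_upper`'s pruning is worth spelling out: a new clause is dropped when some existing clause is already a subtype of it (intersecting adds nothing), and existing clauses that are supertypes of the new clause are evicted (the intersection clips them anyway). A Python sketch, with `frozenset` subset (`<=`) standing in for `is_subtype_of` and types modeled as sets of inhabitants:

```python
# Redundancy pruning for an intersection of upper-bound clauses.
def add_upper(upper: list[frozenset], ty: frozenset) -> None:
    # An existing clause that is a subtype of `ty` already implies it.
    if any(existing <= ty for existing in upper):
        return
    # Drop clauses that `ty` clips to itself (supertypes of `ty`).
    upper[:] = [existing for existing in upper if not ty <= existing]
    upper.append(ty)


upper: list[frozenset] = []
add_upper(upper, frozenset({"int", "str"}))
add_upper(upper, frozenset({"int"}))           # clips {int, str} away
add_upper(upper, frozenset({"int", "bytes"}))  # redundant: {int} <= it
assert upper == [frozenset({"int"})]
```

The asymmetry with `add_lower` is deliberate: unioning a new lower-bound element into a DNF type is cheap, so only the upper (intersected) side needs this care.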
@@ -1660,33 +1690,68 @@ impl<'db> SpecializationBuilder<'db> {
                 let lower = constraint.lower(self.db);
                 let upper = constraint.upper(self.db);
                 let bounds = mappings.entry(typevar).or_default();
-                bounds.lower.insert(lower);
-                bounds.upper.insert(upper);
+                bounds.add_lower(self.db, lower);
+                bounds.add_upper(self.db, upper);

                 if let Type::TypeVar(lower_bound_typevar) = lower {
                     let bounds = mappings.entry(lower_bound_typevar).or_default();
-                    bounds.upper.insert(Type::TypeVar(typevar));
+                    bounds.add_upper(self.db, Type::TypeVar(typevar));
                 }

                 if let Type::TypeVar(upper_bound_typevar) = upper {
                     let bounds = mappings.entry(upper_bound_typevar).or_default();
-                    bounds.lower.insert(Type::TypeVar(typevar));
+                    bounds.add_lower(self.db, Type::TypeVar(typevar));
                 }
             }

             for (bound_typevar, bounds) in mappings.drain() {
                 let variance = formal.variance_of(self.db, bound_typevar);
-                // Prefer the lower bound (often the concrete actual type seen) over the
-                // upper bound (which may include TypeVar bounds/constraints). The upper bound
-                // should only be used as a fallback when no concrete type was inferred.
-                let lower = UnionType::from_elements(self.db, bounds.lower);
-                if !lower.is_never() {
-                    self.add_type_mapping(bound_typevar, lower, variance, &mut f);
-                    continue;
-                }
-                let upper = IntersectionType::from_elements(self.db, bounds.upper);
-                if !upper.is_object() {
-                    self.add_type_mapping(bound_typevar, upper, variance, &mut f);
-                }
+                match bound_typevar
+                    .typevar(self.db)
+                    .require_bound_or_constraints(self.db)
+                {
+                    TypeVarBoundOrConstraints::UpperBound(bound) => {
+                        let bound = bound.top_materialization(self.db);
+                        let lower = UnionType::from_elements(self.db, bounds.lower);
+                        if !lower.is_assignable_to(self.db, bound) {
+                            // This path does not satisfy the typevar's upper bound, and is
+                            // therefore not a valid specialization.
+                            continue;
+                        }
+
+                        let upper = IntersectionType::from_elements(
+                            self.db,
+                            bounds.upper.into_iter().chain([bound]),
+                        );
+                        if upper != bound {
+                            self.add_type_mapping(bound_typevar, upper, variance, &mut f);
+                        } else if !lower.is_never() {
+                            self.add_type_mapping(bound_typevar, lower, variance, &mut f);
+                        }
+                    }
+
+                    TypeVarBoundOrConstraints::Constraints(constraints) => {
+                        // Filter out the typevar constraints that aren't satisfied by this path.
+                        let lower = UnionType::from_elements(self.db, bounds.lower);
+                        let upper = IntersectionType::from_elements(self.db, bounds.upper);
+                        let compatible_constraints =
+                            constraints.elements(self.db).iter().filter(|constraint| {
+                                let constraint_lower = constraint.bottom_materialization(self.db);
+                                let constraint_upper = constraint.top_materialization(self.db);
+                                lower.is_assignable_to(self.db, constraint_lower)
+                                    && constraint_upper.is_assignable_to(self.db, upper)
+                            });
+
+                        // If only one constraint remains, that's our specialization for this path.
+                        if let Ok(compatible_constraint) = compatible_constraints.exactly_one() {
+                            self.add_type_mapping(
+                                bound_typevar,
+                                *compatible_constraint,
+                                variance,
+                                &mut f,
+                            );
+                        }
+                    }
+                }
             }
         }
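The upper-bound arm of the new match can be summarized as: reject the path if the inferred lower bound does not fit under the typevar's declared bound; otherwise report the clipped upper bound when the path actually constrains the typevar beyond its declaration, and fall back to the inferred lower bound when it does not. A toy Python model, with sets as types (union `|`, intersection `&`, assignability `<=`); `resolve` is a hypothetical name, not ty's API:

```python
# Per-path resolution for a typevar with a declared upper bound.
def resolve(lowers: list[frozenset], uppers: list[frozenset], bound: frozenset):
    lower = frozenset().union(*lowers) if lowers else frozenset()
    if not lower <= bound:
        return None  # path violates the typevar's upper bound
    upper = bound
    for u in uppers:
        upper &= u  # clip every recorded upper clause against the bound
    if upper != bound:
        return upper  # the path constrains the typevar beyond its bound
    if lower:
        return lower  # otherwise prefer the concrete inferred lower bound
    return None


bound = frozenset({"int", "str"})
assert resolve([frozenset({"int"})], [], bound) == frozenset({"int"})
assert resolve([frozenset({"bytes"})], [], bound) is None  # bytes ⊄ bound
assert resolve([], [frozenset({"int"})], bound) == frozenset({"int"})
```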
@@ -1967,14 +2032,31 @@ impl<'db> SpecializationBuilder<'db> {
             Type::NominalInstance(formal_nominal) => {
                 formal_nominal.class(self.db).into_generic_alias()
             }
-            // TODO: This will only handle classes that explicitly implement a generic protocol
-            // by listing it as a base class. To handle classes that implicitly implement a
-            // generic protocol, we will need to check the types of the protocol members to be
-            // able to infer the specialization of the protocol that the class implements.
-            Type::ProtocolInstance(ProtocolInstanceType {
-                inner: Protocol::FromClass(class),
-                ..
-            }) => class.into_generic_alias(),

+            Type::ProtocolInstance(_) => {
+                // TODO: For protocols, we use the new constraint set implementation, which
+                // will handle implicitly implemented protocols and generic protocols. We
+                // eventually want this logic to be used for _all_ nominal instances
+                // (replacing the logic below).
+                let when = actual.when_constraint_set_assignable_to(
+                    self.db,
+                    formal,
+                    self.inferable,
+                );
+                if when
+                    .limit_to_valid_specializations(self.db)
+                    .is_never_satisfied(self.db)
+                    && (formal.has_typevar(self.db) || actual.has_typevar(self.db))
+                {
+                    return Err(SpecializationError::NoSolution {
+                        parameter: formal,
+                        argument: actual,
+                    });
+                }
+                self.add_type_mappings_from_constraint_set(formal, when, &mut f);
+                return Ok(());
+            }

             _ => None,
         };
@@ -2132,6 +2214,10 @@ impl<'db> SpecializationBuilder<'db> {

 #[derive(Clone, Debug, Eq, PartialEq)]
 pub(crate) enum SpecializationError<'db> {
+    NoSolution {
+        parameter: Type<'db>,
+        argument: Type<'db>,
+    },
     MismatchedBound {
         bound_typevar: BoundTypeVarInstance<'db>,
         argument: Type<'db>,
@@ -2143,15 +2229,17 @@ pub(crate) enum SpecializationError<'db> {
 }

 impl<'db> SpecializationError<'db> {
-    pub(crate) fn bound_typevar(&self) -> BoundTypeVarInstance<'db> {
+    pub(crate) fn bound_typevar(&self) -> Option<BoundTypeVarInstance<'db>> {
         match self {
-            Self::MismatchedBound { bound_typevar, .. } => *bound_typevar,
-            Self::MismatchedConstraint { bound_typevar, .. } => *bound_typevar,
+            Self::NoSolution { .. } => None,
+            Self::MismatchedBound { bound_typevar, .. } => Some(*bound_typevar),
+            Self::MismatchedConstraint { bound_typevar, .. } => Some(*bound_typevar),
         }
     }

     pub(crate) fn argument_type(&self) -> Type<'db> {
         match self {
+            Self::NoSolution { argument, .. } => *argument,
             Self::MismatchedBound { argument, .. } => *argument,
             Self::MismatchedConstraint { argument, .. } => *argument,
         }
@@ -134,14 +134,29 @@ impl<'db> Type<'db> {
         disjointness_visitor: &IsDisjointVisitor<'db>,
     ) -> ConstraintSet<'db> {
         let structurally_satisfied = if let Type::ProtocolInstance(self_protocol) = self {
-            self_protocol.interface(db).has_relation_to_impl(
-                db,
-                protocol.interface(db),
-                inferable,
-                relation,
-                relation_visitor,
-                disjointness_visitor,
-            )
+            let self_as_nominal = self_protocol.to_nominal_instance();
+            let other_as_nominal = protocol.to_nominal_instance();
+            let nominal_match = match self_as_nominal.zip(other_as_nominal) {
+                Some((self_as_nominal, other_as_nominal)) => self_as_nominal.has_relation_to_impl(
+                    db,
+                    other_as_nominal,
+                    inferable,
+                    relation,
+                    relation_visitor,
+                    disjointness_visitor,
+                ),
+                _ => ConstraintSet::from(false),
+            };
+            nominal_match.or(db, || {
+                self_protocol.interface(db).has_relation_to_impl(
+                    db,
+                    protocol.interface(db),
+                    inferable,
+                    relation,
+                    relation_visitor,
+                    disjointness_visitor,
+                )
+            })
         } else {
             protocol
                 .inner
@@ -910,15 +910,7 @@ impl Session {

         let db = self.project_db_mut(path);
         match system_path_to_file(db, system_path) {
-            Ok(file) => {
-                let project = db.project();
-
-                // Only mark this file as open if it's part of the project.
-                // This ensures that we don't show diagnostics for files outside the project.
-                if project.is_file_included(db, system_path) {
-                    project.open_file(db, file);
-                }
-            }
+            Ok(file) => db.project().open_file(db, file),
             Err(err) => tracing::warn!("Failed to open file {system_path}: {err}"),
         }
     }
@@ -151,41 +151,6 @@ reveal_type(total)
     Ok(())
 }

-#[test]
-fn pull_excluded_file() -> Result<()> {
-    let _filter = filter_result_id();
-
-    let main_path = SystemPath::new("src/foo.py");
-    let main_content = r#"reveal_type("included")"#;
-
-    let excluded_path = SystemPath::new("src/excluded/lib.py");
-    let excluded_content = r#"reveal_type("Excluded")"#;
-
-    let config = r#"
-[src]
-exclude = ["src/excluded/"]
-"#;
-
-    let mut server = TestServerBuilder::new()?
-        .with_workspace(SystemPath::new("src"), None)?
-        .with_file(main_path, main_content)?
-        .with_file(excluded_path, excluded_content)?
-        .with_file(SystemPath::new("ty.toml"), config)?
-        .enable_pull_diagnostics(true)
-        .build()
-        .wait_until_workspaces_are_initialized();
-
-    server.open_text_document(main_path, main_content, 1);
-    let main_diagnostics = server.document_diagnostic_request(main_path, None);
-    assert_compact_json_snapshot!("main", main_diagnostics);
-
-    server.open_text_document(excluded_path, excluded_content, 1);
-    let excluded_diagnostics = server.document_diagnostic_request(excluded_path, None);
-    assert_compact_json_snapshot!("excluded", excluded_diagnostics);
-
-    Ok(())
-}
-
 #[test]
 fn document_diagnostic_caching_unchanged() -> Result<()> {
     let _filter = filter_result_id();
@@ -1,5 +0,0 @@
----
-source: crates/ty_server/tests/e2e/pull_diagnostics.rs
-expression: excluded_diagnostics
----
-{"kind": "full", "items": []}
@@ -1,45 +0,0 @@
----
-source: crates/ty_server/tests/e2e/pull_diagnostics.rs
-expression: main_diagnostics
----
-{
-  "kind": "full",
-  "resultId": "[RESULT_ID]",
-  "items": [
-    {
-      "range": {
-        "start": {
-          "line": 0,
-          "character": 0
-        },
-        "end": {
-          "line": 0,
-          "character": 11
-        }
-      },
-      "severity": 2,
-      "code": "undefined-reveal",
-      "codeDescription": {
-        "href": "https://ty.dev/rules#undefined-reveal"
-      },
-      "source": "ty",
-      "message": "`reveal_type` used without importing it"
-    },
-    {
-      "range": {
-        "start": {
-          "line": 0,
-          "character": 12
-        },
-        "end": {
-          "line": 0,
-          "character": 22
-        }
-      },
-      "severity": 3,
-      "code": "revealed-type",
-      "source": "ty",
-      "message": "Revealed type: `Literal[\"included\"]`"
-    }
-  ]
-}
@@ -318,7 +318,6 @@ impl EmbeddedFilePath<'_> {
             EmbeddedFilePath::Autogenerated(PySourceType::Python) => "mdtest_snippet.py",
             EmbeddedFilePath::Autogenerated(PySourceType::Stub) => "mdtest_snippet.pyi",
             EmbeddedFilePath::Autogenerated(PySourceType::Ipynb) => "mdtest_snippet.ipynb",
-            EmbeddedFilePath::Autogenerated(PySourceType::Markdown) => "mdtest_snippet.md",
             EmbeddedFilePath::Explicit(path) => path,
         }
     }
@@ -80,7 +80,7 @@ You can add the following configuration to `.gitlab-ci.yml` to run a `ruff forma
   stage: build
   interruptible: true
   image:
-    name: ghcr.io/astral-sh/ruff:0.14.11-alpine
+    name: ghcr.io/astral-sh/ruff:0.14.10-alpine
   before_script:
     - cd $CI_PROJECT_DIR
     - ruff --version
@@ -106,7 +106,7 @@ Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-c
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.14.11
+  rev: v0.14.10
   hooks:
     # Run the linter.
     - id: ruff-check
@@ -119,7 +119,7 @@ To enable lint fixes, add the `--fix` argument to the lint hook:
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.14.11
+  rev: v0.14.10
   hooks:
     # Run the linter.
     - id: ruff-check
@@ -133,7 +133,7 @@ To avoid running on Jupyter Notebooks, remove `jupyter` from the list of allowed
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.14.11
+  rev: v0.14.10
   hooks:
     # Run the linter.
     - id: ruff-check
@@ -369,7 +369,7 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.14.11
+  rev: v0.14.10
   hooks:
     # Run the linter.
     - id: ruff-check
@@ -4,7 +4,7 @@ build-backend = "maturin"

 [project]
 name = "ruff"
-version = "0.14.11"
+version = "0.14.10"
 description = "An extremely fast Python linter and code formatter, written in Rust."
 authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
 readme = "README.md"
@@ -1,6 +1,6 @@
 [project]
 name = "scripts"
-version = "0.14.11"
+version = "0.14.10"
 description = ""
 authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]