Compare commits

1 commit

c8623f6936 — Micha Reiser — Test that demonstrates the expected behavior
Signed-off-by: Micha Reiser <micha@reiser.io>
2023-11-03 17:51:30 +09:00

36 changed files with 1918 additions and 2485 deletions


@@ -1,66 +1,5 @@
# Changelog
## 0.1.4
### Preview features
- \[`flake8-trio`\] Implement `timeout-without-await` (`TRIO001`) ([#8439](https://github.com/astral-sh/ruff/pull/8439))
- \[`numpy`\] Implement NumPy 2.0 migration rule (`NPY200`) ([#7702](https://github.com/astral-sh/ruff/pull/7702))
- \[`pylint`\] Implement `bad-open-mode` (`W1501`) ([#8294](https://github.com/astral-sh/ruff/pull/8294))
- \[`pylint`\] Implement `import-outside-toplevel` (`C0415`) rule ([#5180](https://github.com/astral-sh/ruff/pull/5180))
- \[`pylint`\] Implement `useless-with-lock` (`W2101`) ([#8321](https://github.com/astral-sh/ruff/pull/8321))
- \[`pyupgrade`\] Implement `timeout-error-alias` (`UP041`) ([#8476](https://github.com/astral-sh/ruff/pull/8476))
- \[`refurb`\] Implement `isinstance-type-none` (`FURB168`) ([#8308](https://github.com/astral-sh/ruff/pull/8308))
- Detect confusable Unicode-to-Unicode units in `RUF001`, `RUF002`, and `RUF003` ([#4430](https://github.com/astral-sh/ruff/pull/4430))
- Add newline after module docstrings in preview style ([#8283](https://github.com/astral-sh/ruff/pull/8283))
### Formatter
- Add a note on line-too-long to the formatter docs ([#8314](https://github.com/astral-sh/ruff/pull/8314))
- Preserve trailing statement semicolons when using `fmt: skip` ([#8273](https://github.com/astral-sh/ruff/pull/8273))
- Preserve trailing semicolons when using `fmt: off` ([#8275](https://github.com/astral-sh/ruff/pull/8275))
- Avoid duplicating linter-formatter compatibility warnings ([#8292](https://github.com/astral-sh/ruff/pull/8292))
- Avoid inserting a newline after function docstrings ([#8375](https://github.com/astral-sh/ruff/pull/8375))
- Insert newline between docstring and following own line comment ([#8216](https://github.com/astral-sh/ruff/pull/8216))
- Split tuples in return positions by comma first ([#8280](https://github.com/astral-sh/ruff/pull/8280))
- Avoid treating byte strings as docstrings ([#8350](https://github.com/astral-sh/ruff/pull/8350))
- Add `--line-length` option to `format` command ([#8363](https://github.com/astral-sh/ruff/pull/8363))
- Avoid parenthesizing unsplittable because of comments ([#8431](https://github.com/astral-sh/ruff/pull/8431))
### CLI
- Add `--output-format` to `ruff rule` and `ruff linter` ([#8203](https://github.com/astral-sh/ruff/pull/8203))
### Bug fixes
- Respect `--force-exclude` in `lint.exclude` and `format.exclude` ([#8393](https://github.com/astral-sh/ruff/pull/8393))
- Respect `--extend-per-file-ignores` on the CLI ([#8329](https://github.com/astral-sh/ruff/pull/8329))
- Extend `bad-dunder-method-name` to permit `__index__` ([#8300](https://github.com/astral-sh/ruff/pull/8300))
- Fix panic with 8 in octal escape ([#8356](https://github.com/astral-sh/ruff/pull/8356))
- Avoid raising `D300` when both triple quote styles are present ([#8462](https://github.com/astral-sh/ruff/pull/8462))
- Consider unterminated f-strings in `FStringRanges` ([#8154](https://github.com/astral-sh/ruff/pull/8154))
- Avoid including literal `shell=True` for truthy, non-`True` diagnostics ([#8359](https://github.com/astral-sh/ruff/pull/8359))
- Avoid triggering single-element test for starred expressions ([#8433](https://github.com/astral-sh/ruff/pull/8433))
- Detect and ignore Jupyter automagics ([#8398](https://github.com/astral-sh/ruff/pull/8398))
- Fix invalid E231 error with f-strings ([#8369](https://github.com/astral-sh/ruff/pull/8369))
- Avoid triggering `NamedTuple` rewrite with starred annotation ([#8434](https://github.com/astral-sh/ruff/pull/8434))
- Avoid un-setting bracket flag in logical lines ([#8380](https://github.com/astral-sh/ruff/pull/8380))
- Place 'r' prefix before 'f' for raw format strings ([#8464](https://github.com/astral-sh/ruff/pull/8464))
- Remove trailing periods from NumPy 2.0 code actions ([#8475](https://github.com/astral-sh/ruff/pull/8475))
- Fix bug where `PLE1307` was raised when formatting `%c` with characters ([#8407](https://github.com/astral-sh/ruff/pull/8407))
- Remove unicode flag from comparable ([#8440](https://github.com/astral-sh/ruff/pull/8440))
- Improve B015 message ([#8295](https://github.com/astral-sh/ruff/pull/8295))
- Use `fixedOverflowWidgets` for playground popover ([#8458](https://github.com/astral-sh/ruff/pull/8458))
- Mark `byte_bounds` as a non-backwards-compatible NumPy 2.0 change ([#8474](https://github.com/astral-sh/ruff/pull/8474))
### Internals
- Add a dedicated cache directory per Ruff version ([#8333](https://github.com/astral-sh/ruff/pull/8333))
- Allow selective caching for `--fix` and `--diff` ([#8316](https://github.com/astral-sh/ruff/pull/8316))
- Improve performance of comment parsing ([#8193](https://github.com/astral-sh/ruff/pull/8193))
- Improve performance of string parsing ([#8227](https://github.com/astral-sh/ruff/pull/8227))
- Use a dedicated sort key for isort import sorting ([#7963](https://github.com/astral-sh/ruff/pull/7963))
## 0.1.3
This release includes a variety of improvements to the Ruff formatter, removing several known and

Cargo.lock (generated)

@@ -810,7 +810,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
[[package]]
name = "flake8-to-ruff"
version = "0.1.4"
version = "0.1.3"
dependencies = [
"anyhow",
"clap",
@@ -2060,7 +2060,7 @@ dependencies = [
[[package]]
name = "ruff_cli"
version = "0.1.4"
version = "0.1.3"
dependencies = [
"annotate-snippets 0.9.1",
"anyhow",
@@ -2196,7 +2196,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.1.4"
version = "0.1.3"
dependencies = [
"aho-corasick",
"annotate-snippets 0.9.1",
@@ -2447,7 +2447,7 @@ dependencies = [
[[package]]
name = "ruff_shrinking"
version = "0.1.4"
version = "0.1.3"
dependencies = [
"anyhow",
"clap",


@@ -148,9 +148,10 @@ ruff format @arguments.txt # Format using an input file, treating its
Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff-pre-commit`](https://github.com/astral-sh/ruff-pre-commit):
```yaml
# Run the Ruff linter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.1.4
rev: v0.1.3
hooks:
# Run the Ruff linter.
- id: ruff


@@ -1,6 +1,6 @@
[package]
name = "flake8-to-ruff"
version = "0.1.4"
version = "0.1.3"
description = """
Convert Flake8 configuration files to Ruff configuration files.
"""


@@ -1,6 +1,6 @@
[package]
name = "ruff_cli"
version = "0.1.4"
version = "0.1.3"
publish = false
authors = { workspace = true }
edition = { workspace = true }


@@ -0,0 +1,170 @@
#![cfg(not(target_family = "wasm"))]
use std::fs;
use std::path::Path;
use std::process::Command;
use std::str;
use anyhow::Result;
use insta_cmd::{assert_cmd_snapshot, get_cargo_bin};
use tempfile::TempDir;
fn ruff_cli() -> Command {
Command::new(get_cargo_bin("ruff"))
}
fn ruff_check(current_dir: &Path) -> Command {
let mut command = ruff_cli();
command
.current_dir(current_dir)
.args(["check", ".", "--no-cache"]);
command
}
/// Tests excluding files and directories by their basename.
///
/// ```toml
/// exclude = ["logs"]
/// ```
///
/// Excludes files and directories named `logs`.
#[test]
fn exclude_basename() -> Result<()> {
let tempdir = TempDir::new()?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
r#"
select = ["F401"]
exclude = ["logs.py"]
"#,
)?;
fs::write(tempdir.path().join("main.py"), "import im_included")?;
let logs_dir = tempdir.path().join("logs.py");
fs::create_dir(&logs_dir)?;
// Excluded because the file is inside of the `logs.py` directory
fs::write(logs_dir.join("excluded.py"), "import im_excluded")?;
let src = tempdir.path().join("src");
let sub_logs_dir = src.join("nested").join("logs.py").join("output");
fs::create_dir_all(&sub_logs_dir)?;
// Excluded because the file name is `logs.py`
fs::write(src.join("logs.py"), "import im_excluded")?;
// Excluded because the file is in a nested `logs.py` directory
fs::write(sub_logs_dir.join("excluded2.py"), "import im_excluded")?;
assert_cmd_snapshot!(
ruff_check(tempdir.path())
.args(["--config"])
.arg(ruff_toml.file_name().unwrap()), @r###"
success: false
exit_code: 1
----- stdout -----
main.py:1:8: F401 [*] `im_included` imported but unused
Found 1 error.
[*] 1 fixable with the `--fix` option.
----- stderr -----
"###);
Ok(())
}
/// Tests excluding directories by their name
///
/// ```toml
/// exclude = ["logs/"]
/// ```
///
/// Excludes the directory `logs`.
#[test]
fn exclude_dirname() -> Result<()> {
let tempdir = TempDir::new()?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
r#"
select = ["F401"]
exclude = ["logs.py/"]
"#,
)?;
fs::write(tempdir.path().join("main.py"), "import im_included")?;
let logs_dir = tempdir.path().join("logs.py");
fs::create_dir(&logs_dir)?;
let src = tempdir.path().join("src");
let sub_logs_dir = src.join("nested").join("logs.py").join("output");
fs::create_dir_all(&sub_logs_dir)?;
// Included, because the pattern only excludes directories named `logs.py`
fs::write(src.join("logs.py"), "import im_included")?;
// This file is included. Ruff does not support the gitignore syntax where `logs.py/` matches any
// directory named `logs.py`
fs::write(sub_logs_dir.join("included2.py"), "import im_included")?;
assert_cmd_snapshot!(
ruff_check(tempdir.path())
.args(["--config"])
.arg(ruff_toml.file_name().unwrap()), @r###"
success: false
exit_code: 1
----- stdout -----
main.py:1:8: F401 [*] `im_included` imported but unused
src/logs.py:1:8: F401 [*] `im_included` imported but unused
src/nested/logs.py/output/included2.py:1:8: F401 [*] `im_included` imported but unused
Found 3 errors.
[*] 3 fixable with the `--fix` option.
----- stderr -----
"###);
Ok(())
}
/// Tests that a directory pattern doesn't match a file with the same name.
///
/// ```toml
/// exclude = ["logs/"]
/// ```
///
/// Excludes the directory `logs` but not a file named `logs`.
#[test]
fn exclude_dirname_doesnt_match_file() -> Result<()> {
let tempdir = TempDir::new()?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
r#"
select = ["F401"]
exclude = ["logs.py/"]
"#,
)?;
fs::write(tempdir.path().join("main.py"), "import im_included")?;
// Included, because the pattern only excludes directories named `logs.py`
fs::write(tempdir.path().join("logs.py"), "import im_included")?;
assert_cmd_snapshot!(
ruff_check(tempdir.path())
.args(["--config"])
.arg(ruff_toml.file_name().unwrap()), @r###"
success: false
exit_code: 1
----- stdout -----
main.py:1:8: F401 [*] `im_included` imported but unused
Found 1 error.
[*] 1 fixable with the `--fix` option.
----- stderr -----
"###);
Ok(())
}
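The trailing-slash semantics exercised by the three tests above can be sketched as follows. This is an illustrative approximation under stated assumptions, not Ruff's actual matcher, and the `excluded` helper is hypothetical: a trailing slash restricts the pattern to directories, and (per the expected behavior demonstrated here) the pattern matches the path relative to the project root rather than any nested directory with that name.

```python
def excluded(pattern: str, rel_path: str, is_dir: bool) -> bool:
    """Approximate Ruff's `exclude` basename semantics (sketch only).

    A trailing slash means the pattern matches directories only; the
    pattern is compared against the project-relative path, so a nested
    `src/nested/logs.py` directory is NOT matched by `logs.py/`.
    """
    dir_only = pattern.endswith("/")
    if dir_only and not is_dir:
        # `logs.py/` never matches a plain file named `logs.py`.
        return False
    return rel_path == pattern.rstrip("/")


# Matches the snapshots above:
assert excluded("logs.py/", "logs.py", is_dir=True)        # top-level dir excluded
assert not excluded("logs.py/", "logs.py", is_dir=False)   # same-named file included
assert not excluded("logs.py/", "src/nested/logs.py", is_dir=True)  # nested dir included
```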


@@ -255,6 +255,55 @@ OTHER = "OTHER"
Ok(())
}
#[test]
fn exclude_dir() -> Result<()> {
let tempdir = TempDir::new()?;
let ruff_toml = tempdir.path().join("ruff.toml");
fs::write(
&ruff_toml,
r#"
[format]
exclude = ["out"]
"#,
)?;
fs::write(
tempdir.path().join("main.py"),
r#"
from test import say_hy
if __name__ == "__main__":
say_hy("dear Ruff contributor")
"#,
)?;
let out_dir = tempdir.path().join("out");
fs::create_dir(&out_dir)?;
fs::write(
&out_dir.join("test.py"),
r#"
def say_hy(name: str):
print(f"Hy {name}")"#,
)?;
assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
.current_dir(tempdir.path())
.args(["format", "--no-cache", "--check", "--config"])
.arg(ruff_toml.file_name().unwrap())
.arg("."), @r###"
success: false
exit_code: 1
----- stdout -----
Would reformat: main.py
Would reformat: out/test.py
2 files would be reformatted
----- stderr -----
"###);
Ok(())
}
#[test]
fn exclude_stdin() -> Result<()> {
let tempdir = TempDir::new()?;


@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.1.4"
version = "0.1.3"
publish = false
authors = { workspace = true }
edition = { workspace = true }


@@ -8,19 +8,3 @@ def ends_in_quote():
def contains_quote():
'Sum"\\mary.'
# OK
def contains_triples(t):
"""('''|\""")"""
# OK
def contains_triples(t):
'''(\'''|""")'''
# TODO: this should raise D300, suggesting double quotes instead,
# because the escaped double quote allows us to use them.
def contains_triples(t):
'''(\""")'''


@@ -1,68 +0,0 @@
import asyncio, socket
# These should be fixed
try:
pass
except asyncio.TimeoutError:
pass
try:
pass
except socket.timeout:
pass
# Should NOT be in parentheses when replaced
try:
pass
except (asyncio.TimeoutError,):
pass
try:
pass
except (socket.timeout,):
pass
try:
pass
except (asyncio.TimeoutError, socket.timeout,):
pass
# Should be kept in parentheses (because multiple)
try:
pass
except (asyncio.TimeoutError, socket.timeout, KeyError, TimeoutError):
pass
# First should change, second should not
from .mmap import error
try:
pass
except (asyncio.TimeoutError, error):
pass
# These should not change
from foo import error
try:
pass
except (TimeoutError, error):
pass
try:
pass
except:
pass
try:
pass
except TimeoutError:
pass
try:
pass
except (TimeoutError, KeyError):
pass


@@ -466,11 +466,6 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
if checker.enabled(Rule::OSErrorAlias) {
pyupgrade::rules::os_error_alias_call(checker, func);
}
if checker.enabled(Rule::TimeoutErrorAlias) {
if checker.settings.target_version >= PythonVersion::Py310 {
pyupgrade::rules::timeout_error_alias_call(checker, func);
}
}
if checker.enabled(Rule::NonPEP604Isinstance) {
if checker.settings.target_version >= PythonVersion::Py310 {
pyupgrade::rules::use_pep604_isinstance(checker, expr, func, args);


@@ -1006,13 +1006,6 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pyupgrade::rules::os_error_alias_raise(checker, item);
}
}
if checker.enabled(Rule::TimeoutErrorAlias) {
if checker.settings.target_version >= PythonVersion::Py310 {
if let Some(item) = exc {
pyupgrade::rules::timeout_error_alias_raise(checker, item);
}
}
}
if checker.enabled(Rule::RaiseVanillaClass) {
if let Some(expr) = exc {
tryceratops::rules::raise_vanilla_class(checker, expr);
@@ -1311,11 +1304,6 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if checker.enabled(Rule::OSErrorAlias) {
pyupgrade::rules::os_error_alias_handlers(checker, handlers);
}
if checker.enabled(Rule::TimeoutErrorAlias) {
if checker.settings.target_version >= PythonVersion::Py310 {
pyupgrade::rules::timeout_error_alias_handlers(checker, handlers);
}
}
if checker.enabled(Rule::PytestAssertInExcept) {
flake8_pytest_style::rules::assert_in_exception_handler(checker, handlers);
}


@@ -500,7 +500,6 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Pyupgrade, "038") => (RuleGroup::Stable, rules::pyupgrade::rules::NonPEP604Isinstance),
(Pyupgrade, "039") => (RuleGroup::Stable, rules::pyupgrade::rules::UnnecessaryClassParentheses),
(Pyupgrade, "040") => (RuleGroup::Stable, rules::pyupgrade::rules::NonPEP695TypeAlias),
(Pyupgrade, "041") => (RuleGroup::Preview, rules::pyupgrade::rules::TimeoutErrorAlias),
// pydocstyle
(Pydocstyle, "100") => (RuleGroup::Stable, rules::pydocstyle::rules::UndocumentedPublicModule),


@@ -19,13 +19,10 @@ use crate::importer::ImportRequest;
/// constants were removed from the main namespace.
///
/// The majority of these functions and constants can be automatically replaced
/// by other members of the NumPy API or by equivalents from the Python
/// standard library. With the exception of renaming `numpy.byte_bounds` to
/// `numpy.lib.array_utils.byte_bounds`, all such replacements are backwards
/// compatible with earlier versions of NumPy.
///
/// This rule flags all uses of removed members, along with automatic fixes for
/// any backwards-compatible replacements.
/// by other members of the NumPy API, even prior to NumPy 2.0, or by
/// equivalents from the Python standard library. This rule flags all uses of
/// removed members, along with automatic fixes for any backwards-compatible
/// replacements.
///
/// ## Examples
/// ```python
@@ -48,7 +45,6 @@ use crate::importer::ImportRequest;
pub struct Numpy2Deprecation {
existing: String,
migration_guide: Option<String>,
code_action: Option<String>,
}
impl Violation for Numpy2Deprecation {
@@ -59,23 +55,21 @@ impl Violation for Numpy2Deprecation {
let Numpy2Deprecation {
existing,
migration_guide,
code_action: _,
} = self;
match migration_guide {
Some(migration_guide) => {
format!("`np.{existing}` will be removed in NumPy 2.0. {migration_guide}",)
}
None => format!("`np.{existing}` will be removed without replacement in NumPy 2.0"),
None => format!("`np.{existing}` will be removed without replacement in NumPy 2.0."),
}
}
fn fix_title(&self) -> Option<String> {
let Numpy2Deprecation {
existing: _,
migration_guide: _,
code_action,
migration_guide,
} = self;
code_action.clone()
migration_guide.clone()
}
}
@@ -88,11 +82,7 @@ struct Replacement<'a> {
#[derive(Debug)]
enum Details<'a> {
/// The deprecated member can be replaced by another member in the NumPy API.
AutoImport {
path: &'a str,
name: &'a str,
compatibility: Compatibility,
},
AutoImport { path: &'a str, name: &'a str },
/// The deprecated member can be replaced by a member of the Python standard library.
AutoPurePython { python_expr: &'a str },
/// The deprecated member can be replaced by a manual migration.
@@ -102,54 +92,15 @@ enum Details<'a> {
impl Details<'_> {
fn guideline(&self) -> Option<String> {
match self {
Details::AutoImport {
path,
name,
compatibility: Compatibility::BackwardsCompatible,
} => Some(format!("Use `{path}.{name}` instead.")),
Details::AutoImport {
path,
name,
compatibility: Compatibility::Breaking,
} => Some(format!(
"Use `{path}.{name}` on NumPy 2.0, or ignore this warning on earlier versions."
)),
Details::AutoImport { path, name } => Some(format!("Use `{path}.{name}` instead.")),
Details::AutoPurePython { python_expr } => {
Some(format!("Use `{python_expr}` instead."))
}
Details::Manual { guideline } => guideline.map(ToString::to_string),
}
}
fn code_action(&self) -> Option<String> {
match self {
Details::AutoImport {
path,
name,
compatibility: Compatibility::BackwardsCompatible,
} => Some(format!("Replace with `{path}.{name}`")),
Details::AutoImport {
path,
name,
compatibility: Compatibility::Breaking,
} => Some(format!(
"Replace with `{path}.{name}` (requires NumPy 2.0 or greater)"
)),
Details::AutoPurePython { python_expr } => {
Some(format!("Replace with `{python_expr}`"))
}
Details::Manual { guideline: _ } => None,
}
}
}
#[derive(Debug)]
enum Compatibility {
/// The change is backwards compatible with earlier versions of NumPy.
BackwardsCompatible,
/// The change is breaking in NumPy 2.0.
Breaking,
}
/// NPY201
pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
let maybe_replacement = checker
@@ -162,7 +113,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib",
name: "add_docstring",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "add_newdoc"] => Some(Replacement {
@@ -170,7 +120,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib",
name: "add_newdoc",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "add_newdoc_ufunc"] => Some(Replacement {
@@ -190,7 +139,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib.array_utils",
name: "byte_bounds",
compatibility: Compatibility::Breaking,
},
}),
["numpy", "cast"] => Some(Replacement {
@@ -204,7 +152,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "complex128",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "clongfloat"] => Some(Replacement {
@@ -212,7 +159,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "clongdouble",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "compat"] => Some(Replacement {
@@ -226,7 +172,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "complex128",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "DataSource"] => Some(Replacement {
@@ -234,7 +179,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib.npyio",
name: "DataSource",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "deprecate"] => Some(Replacement {
@@ -278,7 +222,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "float64",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "geterrobj"] => Some(Replacement {
@@ -292,7 +235,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "Inf"] => Some(Replacement {
@@ -300,7 +242,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "Infinity"] => Some(Replacement {
@@ -308,7 +249,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "infty"] => Some(Replacement {
@@ -316,7 +256,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "issctype"] => Some(Replacement {
@@ -336,7 +275,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "issubdtype",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "mat"] => Some(Replacement {
@@ -344,7 +282,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "asmatrix",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "maximum_sctype"] => Some(Replacement {
@@ -358,7 +295,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "nan",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "nbytes"] => Some(Replacement {
@@ -384,7 +320,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "clongdouble",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "longfloat"] => Some(Replacement {
@@ -392,7 +327,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "longdouble",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "lookfor"] => Some(Replacement {
@@ -412,7 +346,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "PZERO"] => Some(Replacement {
@@ -436,7 +369,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "round",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "safe_eval"] => Some(Replacement {
@@ -444,7 +376,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "ast",
name: "literal_eval",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "sctype2char"] => Some(Replacement {
@@ -476,7 +407,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "complex64",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "string_"] => Some(Replacement {
@@ -484,7 +414,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "bytes_",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "source"] => Some(Replacement {
@@ -492,7 +421,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "inspect",
name: "getsource",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "tracemalloc_domain"] => Some(Replacement {
@@ -500,7 +428,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib",
name: "tracemalloc_domain",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "unicode_"] => Some(Replacement {
@@ -508,7 +435,6 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "str_",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "who"] => Some(Replacement {
@@ -525,16 +451,11 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
Numpy2Deprecation {
existing: replacement.existing.to_string(),
migration_guide: replacement.details.guideline(),
code_action: replacement.details.code_action(),
},
expr.range(),
);
match replacement.details {
Details::AutoImport {
path,
name,
compatibility,
} => {
Details::AutoImport { path, name } => {
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(path, name),
@@ -542,14 +463,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
checker.semantic(),
)?;
let replacement_edit = Edit::range_replacement(binding, expr.range());
Ok(match compatibility {
Compatibility::BackwardsCompatible => {
Fix::safe_edits(import_edit, [replacement_edit])
}
Compatibility::Breaking => {
Fix::unsafe_edits(import_edit, [replacement_edit])
}
})
Ok(Fix::safe_edits(import_edit, [replacement_edit]))
});
}
Details::AutoPurePython { python_expr } => diagnostic.set_fix(Fix::safe_edit(


@@ -10,7 +10,7 @@ NPY201.py:4:5: NPY201 [*] `np.add_docstring` will be removed in NumPy 2.0. Use `
5 |
6 | np.add_newdoc
|
= help: Replace with `numpy.lib.add_docstring`
= help: Use `numpy.lib.add_docstring` instead.
Fix
1 |+from numpy.lib import add_docstring
@@ -32,7 +32,7 @@ NPY201.py:6:5: NPY201 [*] `np.add_newdoc` will be removed in NumPy 2.0. Use `num
7 |
8 | np.add_newdoc_ufunc
|
= help: Replace with `numpy.lib.add_newdoc`
= help: Use `numpy.lib.add_newdoc` instead.
Fix
1 |+from numpy.lib import add_newdoc
@@ -56,6 +56,7 @@ NPY201.py:8:5: NPY201 `np.add_newdoc_ufunc` will be removed in NumPy 2.0. `add_n
9 |
10 | np.asfarray([1,2,3])
|
= help: `add_newdoc_ufunc` is an internal function.
NPY201.py:10:5: NPY201 `np.asfarray` will be removed in NumPy 2.0. Use `np.asarray` with a `float` dtype instead.
|
@@ -66,8 +67,9 @@ NPY201.py:10:5: NPY201 `np.asfarray` will be removed in NumPy 2.0. Use `np.asarr
11 |
12 | np.byte_bounds(np.array([1,2,3]))
|
= help: Use `np.asarray` with a `float` dtype instead.
NPY201.py:12:5: NPY201 [*] `np.byte_bounds` will be removed in NumPy 2.0. Use `numpy.lib.array_utils.byte_bounds` on NumPy 2.0, or ignore this warning on earlier versions.
NPY201.py:12:5: NPY201 [*] `np.byte_bounds` will be removed in NumPy 2.0. Use `numpy.lib.array_utils.byte_bounds` instead.
|
10 | np.asfarray([1,2,3])
11 |
@@ -76,9 +78,9 @@ NPY201.py:12:5: NPY201 [*] `np.byte_bounds` will be removed in NumPy 2.0. Use `n
13 |
14 | np.cast
|
= help: Replace with `numpy.lib.array_utils.byte_bounds` (requires NumPy 2.0 or greater)
= help: Use `numpy.lib.array_utils.byte_bounds` instead.
Suggested fix
Fix
1 |+from numpy.lib.array_utils import byte_bounds
1 2 | def func():
2 3 | import numpy as np
@@ -102,6 +104,7 @@ NPY201.py:14:5: NPY201 `np.cast` will be removed in NumPy 2.0. Use `np.asarray(a
15 |
16 | np.cfloat(12+34j)
|
= help: Use `np.asarray(arr, dtype=dtype)` instead.
NPY201.py:16:5: NPY201 [*] `np.cfloat` will be removed in NumPy 2.0. Use `numpy.complex128` instead.
|
@@ -112,7 +115,7 @@ NPY201.py:16:5: NPY201 [*] `np.cfloat` will be removed in NumPy 2.0. Use `numpy.
17 |
18 | np.clongfloat(12+34j)
|
= help: Replace with `numpy.complex128`
= help: Use `numpy.complex128` instead.
Fix
13 13 |
@@ -133,7 +136,7 @@ NPY201.py:18:5: NPY201 [*] `np.clongfloat` will be removed in NumPy 2.0. Use `nu
19 |
20 | np.compat
|
= help: Replace with `numpy.clongdouble`
= help: Use `numpy.clongdouble` instead.
Fix
15 15 |
@@ -154,6 +157,7 @@ NPY201.py:20:5: NPY201 `np.compat` will be removed in NumPy 2.0. Python 2 is no
21 |
22 | np.complex_(12+34j)
|
= help: Python 2 is no longer supported.
NPY201.py:22:5: NPY201 [*] `np.complex_` will be removed in NumPy 2.0. Use `numpy.complex128` instead.
|
@@ -164,7 +168,7 @@ NPY201.py:22:5: NPY201 [*] `np.complex_` will be removed in NumPy 2.0. Use `nump
23 |
24 | np.DataSource
|
= help: Replace with `numpy.complex128`
= help: Use `numpy.complex128` instead.
Fix
19 19 |
@@ -185,7 +189,7 @@ NPY201.py:24:5: NPY201 [*] `np.DataSource` will be removed in NumPy 2.0. Use `nu
25 |
26 | np.deprecate
|
= help: Replace with `numpy.lib.npyio.DataSource`
= help: Use `numpy.lib.npyio.DataSource` instead.
Fix
1 |+from numpy.lib.npyio import DataSource
@@ -211,6 +215,7 @@ NPY201.py:26:5: NPY201 `np.deprecate` will be removed in NumPy 2.0. Emit `Deprec
27 |
28 | np.deprecate_with_doc
|
= help: Emit `DeprecationWarning` with `warnings.warn` directly, or use `typing.deprecated`.
NPY201.py:28:5: NPY201 `np.deprecate_with_doc` will be removed in NumPy 2.0. Emit `DeprecationWarning` with `warnings.warn` directly, or use `typing.deprecated`.
|
@@ -221,6 +226,7 @@ NPY201.py:28:5: NPY201 `np.deprecate_with_doc` will be removed in NumPy 2.0. Emi
29 |
30 | np.disp(10)
|
= help: Emit `DeprecationWarning` with `warnings.warn` directly, or use `typing.deprecated`.
NPY201.py:30:5: NPY201 `np.disp` will be removed in NumPy 2.0. Use a dedicated print function instead.
|
@@ -231,6 +237,7 @@ NPY201.py:30:5: NPY201 `np.disp` will be removed in NumPy 2.0. Use a dedicated p
31 |
32 | np.fastCopyAndTranspose
|
= help: Use a dedicated print function instead.
NPY201.py:32:5: NPY201 `np.fastCopyAndTranspose` will be removed in NumPy 2.0. Use `arr.T.copy()` instead.
|
@@ -241,6 +248,7 @@ NPY201.py:32:5: NPY201 `np.fastCopyAndTranspose` will be removed in NumPy 2.0. U
33 |
34 | np.find_common_type
|
= help: Use `arr.T.copy()` instead.
NPY201.py:34:5: NPY201 `np.find_common_type` will be removed in NumPy 2.0. Use `numpy.promote_types` or `numpy.result_type` instead. To achieve semantics for the `scalar_types` argument, use `numpy.result_type` and pass the Python values `0`, `0.0`, or `0j`.
|
@@ -251,8 +259,9 @@ NPY201.py:34:5: NPY201 `np.find_common_type` will be removed in NumPy 2.0. Use `
35 |
36 | np.get_array_wrap
|
= help: Use `numpy.promote_types` or `numpy.result_type` instead. To achieve semantics for the `scalar_types` argument, use `numpy.result_type` and pass the Python values `0`, `0.0`, or `0j`.
NPY201.py:36:5: NPY201 `np.get_array_wrap` will be removed without replacement in NumPy 2.0
NPY201.py:36:5: NPY201 `np.get_array_wrap` will be removed without replacement in NumPy 2.0.
|
34 | np.find_common_type
35 |
@@ -271,7 +280,7 @@ NPY201.py:38:5: NPY201 [*] `np.float_` will be removed in NumPy 2.0. Use `numpy.
39 |
40 | np.geterrobj
|
= help: Replace with `numpy.float64`
= help: Use `numpy.float64` instead.
Fix
35 35 |
@@ -292,6 +301,7 @@ NPY201.py:40:5: NPY201 `np.geterrobj` will be removed in NumPy 2.0. Use the `np.
41 |
42 | np.Inf
|
= help: Use the `np.errstate` context manager instead.
NPY201.py:42:5: NPY201 [*] `np.Inf` will be removed in NumPy 2.0. Use `numpy.inf` instead.
|
@@ -302,7 +312,7 @@ NPY201.py:42:5: NPY201 [*] `np.Inf` will be removed in NumPy 2.0. Use `numpy.inf
43 |
44 | np.Infinity
|
= help: Replace with `numpy.inf`
= help: Use `numpy.inf` instead.
Fix
39 39 |
@@ -323,7 +333,7 @@ NPY201.py:44:5: NPY201 [*] `np.Infinity` will be removed in NumPy 2.0. Use `nump
45 |
46 | np.infty
|
= help: Replace with `numpy.inf`
= help: Use `numpy.inf` instead.
Fix
41 41 |
@@ -344,7 +354,7 @@ NPY201.py:46:5: NPY201 [*] `np.infty` will be removed in NumPy 2.0. Use `numpy.i
47 |
48 | np.issctype
|
= help: Replace with `numpy.inf`
= help: Use `numpy.inf` instead.
Fix
43 43 |
@@ -356,7 +366,7 @@ NPY201.py:46:5: NPY201 [*] `np.infty` will be removed in NumPy 2.0. Use `numpy.i
48 48 | np.issctype
49 49 |
NPY201.py:48:5: NPY201 `np.issctype` will be removed without replacement in NumPy 2.0
NPY201.py:48:5: NPY201 `np.issctype` will be removed without replacement in NumPy 2.0.
|
46 | np.infty
47 |
@@ -375,7 +385,7 @@ NPY201.py:50:5: NPY201 [*] `np.issubclass_` will be removed in NumPy 2.0. Use `i
51 |
52 | np.issubsctype
|
= help: Replace with `issubclass`
= help: Use `issubclass` instead.
Fix
47 47 |
@@ -396,7 +406,7 @@ NPY201.py:52:5: NPY201 [*] `np.issubsctype` will be removed in NumPy 2.0. Use `n
53 |
54 | np.mat
|
= help: Replace with `numpy.issubdtype`
= help: Use `numpy.issubdtype` instead.
Fix
49 49 |
@@ -417,7 +427,7 @@ NPY201.py:54:5: NPY201 [*] `np.mat` will be removed in NumPy 2.0. Use `numpy.asm
55 |
56 | np.maximum_sctype
|
= help: Replace with `numpy.asmatrix`
= help: Use `numpy.asmatrix` instead.
Fix
51 51 |
@@ -429,7 +439,7 @@ NPY201.py:54:5: NPY201 [*] `np.mat` will be removed in NumPy 2.0. Use `numpy.asm
56 56 | np.maximum_sctype
57 57 |
NPY201.py:56:5: NPY201 `np.maximum_sctype` will be removed without replacement in NumPy 2.0
NPY201.py:56:5: NPY201 `np.maximum_sctype` will be removed without replacement in NumPy 2.0.
|
54 | np.mat
55 |
@@ -448,7 +458,7 @@ NPY201.py:58:5: NPY201 [*] `np.NaN` will be removed in NumPy 2.0. Use `numpy.nan
59 |
60 | np.nbytes[np.int64]
|
= help: Replace with `numpy.nan`
= help: Use `numpy.nan` instead.
Fix
55 55 |
@@ -469,6 +479,7 @@ NPY201.py:60:5: NPY201 `np.nbytes` will be removed in NumPy 2.0. Use `np.dtype(<
61 |
62 | np.NINF
|
= help: Use `np.dtype(<dtype>).itemsize` instead.
NPY201.py:62:5: NPY201 [*] `np.NINF` will be removed in NumPy 2.0. Use `-np.inf` instead.
|
@@ -479,7 +490,7 @@ NPY201.py:62:5: NPY201 [*] `np.NINF` will be removed in NumPy 2.0. Use `-np.inf`
63 |
64 | np.NZERO
|
= help: Replace with `-np.inf`
= help: Use `-np.inf` instead.
Fix
59 59 |
@@ -500,7 +511,7 @@ NPY201.py:64:5: NPY201 [*] `np.NZERO` will be removed in NumPy 2.0. Use `-0.0` i
65 |
66 | np.longcomplex(12+34j)
|
= help: Replace with `-0.0`
= help: Use `-0.0` instead.
Fix
61 61 |
@@ -521,7 +532,7 @@ NPY201.py:66:5: NPY201 [*] `np.longcomplex` will be removed in NumPy 2.0. Use `n
67 |
68 | np.longfloat(12+34j)
|
= help: Replace with `numpy.clongdouble`
= help: Use `numpy.clongdouble` instead.
Fix
63 63 |
@@ -542,7 +553,7 @@ NPY201.py:68:5: NPY201 [*] `np.longfloat` will be removed in NumPy 2.0. Use `num
69 |
70 | np.lookfor
|
= help: Replace with `numpy.longdouble`
= help: Use `numpy.longdouble` instead.
Fix
65 65 |
@@ -563,8 +574,9 @@ NPY201.py:70:5: NPY201 `np.lookfor` will be removed in NumPy 2.0. Search NumPy
71 |
72 | np.obj2sctype(int)
|
= help: Search NumPy's documentation directly.
NPY201.py:72:5: NPY201 `np.obj2sctype` will be removed without replacement in NumPy 2.0
NPY201.py:72:5: NPY201 `np.obj2sctype` will be removed without replacement in NumPy 2.0.
|
70 | np.lookfor
71 |
@@ -583,7 +595,7 @@ NPY201.py:74:5: NPY201 [*] `np.PINF` will be removed in NumPy 2.0. Use `numpy.in
75 |
76 | np.PZERO
|
= help: Replace with `numpy.inf`
= help: Use `numpy.inf` instead.
Fix
71 71 |
@@ -604,7 +616,7 @@ NPY201.py:76:5: NPY201 [*] `np.PZERO` will be removed in NumPy 2.0. Use `0.0` in
77 |
78 | np.recfromcsv
|
= help: Replace with `0.0`
= help: Use `0.0` instead.
Fix
73 73 |
@@ -625,6 +637,7 @@ NPY201.py:78:5: NPY201 `np.recfromcsv` will be removed in NumPy 2.0. Use `np.gen
79 |
80 | np.recfromtxt
|
= help: Use `np.genfromtxt` with comma delimiter instead.
NPY201.py:80:5: NPY201 `np.recfromtxt` will be removed in NumPy 2.0. Use `np.genfromtxt` instead.
|
@@ -635,6 +648,7 @@ NPY201.py:80:5: NPY201 `np.recfromtxt` will be removed in NumPy 2.0. Use `np.gen
81 |
82 | np.round_(12.34)
|
= help: Use `np.genfromtxt` instead.
NPY201.py:82:5: NPY201 [*] `np.round_` will be removed in NumPy 2.0. Use `numpy.round` instead.
|
@@ -645,7 +659,7 @@ NPY201.py:82:5: NPY201 [*] `np.round_` will be removed in NumPy 2.0. Use `numpy.
83 |
84 | np.safe_eval
|
= help: Replace with `numpy.round`
= help: Use `numpy.round` instead.
Fix
79 79 |
@@ -666,7 +680,7 @@ NPY201.py:84:5: NPY201 [*] `np.safe_eval` will be removed in NumPy 2.0. Use `ast
85 |
86 | np.sctype2char
|
= help: Replace with `ast.literal_eval`
= help: Use `ast.literal_eval` instead.
Fix
1 |+from ast import literal_eval
@@ -683,7 +697,7 @@ NPY201.py:84:5: NPY201 [*] `np.safe_eval` will be removed in NumPy 2.0. Use `ast
86 87 | np.sctype2char
87 88 |
NPY201.py:86:5: NPY201 `np.sctype2char` will be removed without replacement in NumPy 2.0
NPY201.py:86:5: NPY201 `np.sctype2char` will be removed without replacement in NumPy 2.0.
|
84 | np.safe_eval
85 |
@@ -693,7 +707,7 @@ NPY201.py:86:5: NPY201 `np.sctype2char` will be removed without replacement in N
88 | np.sctypes
|
NPY201.py:88:5: NPY201 `np.sctypes` will be removed without replacement in NumPy 2.0
NPY201.py:88:5: NPY201 `np.sctypes` will be removed without replacement in NumPy 2.0.
|
86 | np.sctype2char
87 |
@@ -712,6 +726,7 @@ NPY201.py:90:5: NPY201 `np.seterrobj` will be removed in NumPy 2.0. Use the `np.
91 |
92 | np.set_numeric_ops
|
= help: Use the `np.errstate` context manager instead.
NPY201.py:94:5: NPY201 `np.set_string_function` will be removed in NumPy 2.0. Use `np.set_printoptions` for custom printing of NumPy objects.
|
@@ -722,6 +737,7 @@ NPY201.py:94:5: NPY201 `np.set_string_function` will be removed in NumPy 2.0. Us
95 |
96 | np.singlecomplex(12+1j)
|
= help: Use `np.set_printoptions` for custom printing of NumPy objects.
NPY201.py:96:5: NPY201 [*] `np.singlecomplex` will be removed in NumPy 2.0. Use `numpy.complex64` instead.
|
@@ -732,7 +748,7 @@ NPY201.py:96:5: NPY201 [*] `np.singlecomplex` will be removed in NumPy 2.0. Use
97 |
98 | np.string_("asdf")
|
= help: Replace with `numpy.complex64`
= help: Use `numpy.complex64` instead.
Fix
93 93 |
@@ -753,7 +769,7 @@ NPY201.py:98:5: NPY201 [*] `np.string_` will be removed in NumPy 2.0. Use `numpy
99 |
100 | np.source
|
= help: Replace with `numpy.bytes_`
= help: Use `numpy.bytes_` instead.
Fix
95 95 |
@@ -774,7 +790,7 @@ NPY201.py:100:5: NPY201 [*] `np.source` will be removed in NumPy 2.0. Use `inspe
101 |
102 | np.tracemalloc_domain
|
= help: Replace with `inspect.getsource`
= help: Use `inspect.getsource` instead.
Fix
1 |+from inspect import getsource
@@ -800,7 +816,7 @@ NPY201.py:102:5: NPY201 [*] `np.tracemalloc_domain` will be removed in NumPy 2.0
103 |
104 | np.unicode_("asf")
|
= help: Replace with `numpy.lib.tracemalloc_domain`
= help: Use `numpy.lib.tracemalloc_domain` instead.
Fix
1 |+from numpy.lib import tracemalloc_domain
@@ -826,7 +842,7 @@ NPY201.py:104:5: NPY201 [*] `np.unicode_` will be removed in NumPy 2.0. Use `num
105 |
106 | np.who()
|
= help: Replace with `numpy.str_`
= help: Use `numpy.str_` instead.
Fix
101 101 |
@@ -844,5 +860,6 @@ NPY201.py:106:5: NPY201 `np.who` will be removed in NumPy 2.0. Use an IDE variab
106 | np.who()
| ^^^^^^ NPY201
|
= help: Use an IDE variable explorer or `locals()` instead.
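Most of the constant removals flagged above are straight renames of IEEE 754 values; a hedged sketch (pure stdlib, no NumPy required) of the values the replacements stand for:

```python
import math

# np.NaN -> numpy.nan: the IEEE 754 quiet NaN, which never compares equal to itself.
assert math.nan != math.nan
# np.Inf / np.Infinity / np.infty -> numpy.inf: all name the same float infinity.
assert float("inf") == math.inf
# np.NINF -> -np.inf, and np.PZERO / np.NZERO -> 0.0 / -0.0 (signed zeros).
assert -math.inf < 0.0 < math.inf
assert math.copysign(1.0, -0.0) == -1.0
```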

View File

@@ -61,14 +61,12 @@ impl Violation for TripleSingleQuotes {
pub(crate) fn triple_quotes(checker: &mut Checker, docstring: &Docstring) {
let leading_quote = docstring.leading_quote();
let prefixes = leading_quote
let prefixes = docstring
.leading_quote()
.trim_end_matches(|c| c == '\'' || c == '"')
.to_owned();
let expected_quote = if docstring.body().contains("\"\"\"") {
if docstring.body().contains("\'\'\'") {
return;
}
Quote::Single
} else {
Quote::Double

View File

@@ -1,6 +1,5 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
assertion_line: 134
---
D300.py:6:5: D300 Use triple double quotes `"""`
|
@@ -24,8 +23,5 @@ D300.py:10:5: D300 [*] Use triple double quotes `"""`
9 9 | def contains_quote():
10 |- 'Sum"\\mary.'
10 |+ """Sum"\\mary."""
11 11 |
12 12 |
13 13 | # OK

View File

@@ -60,7 +60,6 @@ mod tests {
#[test_case(Rule::ReplaceStdoutStderr, Path::new("UP022.py"))]
#[test_case(Rule::ReplaceUniversalNewlines, Path::new("UP021.py"))]
#[test_case(Rule::SuperCallWithParameters, Path::new("UP008.py"))]
#[test_case(Rule::TimeoutErrorAlias, Path::new("UP041.py"))]
#[test_case(Rule::TypeOfPrimitive, Path::new("UP003.py"))]
#[test_case(Rule::TypingTextStrAlias, Path::new("UP019.py"))]
#[test_case(Rule::UTF8EncodingDeclaration, Path::new("UP009_0.py"))]

View File

@@ -191,21 +191,15 @@ fn try_convert_to_f_string(
summary: &mut FormatSummaryValues,
locator: &Locator,
) -> Result<Option<String>> {
let contents = locator.slice(range);
// Strip the unicode prefix. It's redundant in Python 3, and invalid when used
// with f-strings.
let contents = locator.slice(range);
let contents = if contents.starts_with('U') || contents.starts_with('u') {
&contents[1..]
} else {
contents
};
// Temporarily strip the raw prefix, if present. It will be prepended to the result, before the
// 'f', to match the prefix order that both the Ruff formatter and Black use when formatting code.
let raw = contents.starts_with('R') || contents.starts_with('r');
let contents = if raw { &contents[1..] } else { contents };
// Remove the leading and trailing quotes.
let leading_quote = leading_quote(contents).context("Unable to identify leading quote")?;
let trailing_quote = trailing_quote(contents).context("Unable to identify trailing quote")?;
@@ -297,10 +291,7 @@ fn try_convert_to_f_string(
}
// Construct the format string.
let mut contents = String::with_capacity(usize::from(raw) + 1 + converted.len());
if raw {
contents.push('r');
}
let mut contents = String::with_capacity(1 + converted.len());
contents.push('f');
contents.push_str(leading_quote);
contents.push_str(&converted);

View File

@@ -20,7 +20,6 @@ pub(crate) use redundant_open_modes::*;
pub(crate) use replace_stdout_stderr::*;
pub(crate) use replace_universal_newlines::*;
pub(crate) use super_call_with_parameters::*;
pub(crate) use timeout_error_alias::*;
pub(crate) use type_of_primitive::*;
pub(crate) use typing_text_str_alias::*;
pub(crate) use unicode_kind_prefix::*;
@@ -60,7 +59,6 @@ mod redundant_open_modes;
mod replace_stdout_stderr;
mod replace_universal_newlines;
mod super_call_with_parameters;
mod timeout_error_alias;
mod type_of_primitive;
mod typing_text_str_alias;
mod unicode_kind_prefix;

View File

@@ -1,192 +0,0 @@
use ruff_python_ast::{self as ast, ExceptHandler, Expr, ExprContext};
use ruff_text_size::{Ranged, TextRange};
use crate::fix::edits::pad;
use ruff_diagnostics::{AlwaysFixableViolation, Diagnostic, Edit, Fix};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::call_path::compose_call_path;
use ruff_python_semantic::SemanticModel;
use crate::checkers::ast::Checker;
use crate::settings::types::PythonVersion;
/// ## What it does
/// Checks for uses of exceptions that alias `TimeoutError`.
///
/// ## Why is this bad?
/// `TimeoutError` is the builtin error type used for exceptions when a system
/// function timed out at the system level.
///
/// In Python 3.10, `socket.timeout` was aliased to `TimeoutError`. In Python
/// 3.11, `asyncio.TimeoutError` was aliased to `TimeoutError`.
///
/// These aliases remain in place for compatibility with older versions of
/// Python, but may be removed in future versions.
///
/// Prefer using `TimeoutError` directly, as it is more idiomatic and future-proof.
///
/// ## Example
/// ```python
/// raise asyncio.TimeoutError
/// ```
///
/// Use instead:
/// ```python
/// raise TimeoutError
/// ```
///
/// ## References
/// - [Python documentation: `TimeoutError`](https://docs.python.org/3/library/exceptions.html#TimeoutError)
#[violation]
pub struct TimeoutErrorAlias {
name: Option<String>,
}
impl AlwaysFixableViolation for TimeoutErrorAlias {
#[derive_message_formats]
fn message(&self) -> String {
format!("Replace aliased errors with `TimeoutError`")
}
fn fix_title(&self) -> String {
let TimeoutErrorAlias { name } = self;
match name {
None => "Replace with builtin `TimeoutError`".to_string(),
Some(name) => format!("Replace `{name}` with builtin `TimeoutError`"),
}
}
}
/// Return `true` if an [`Expr`] is an alias of `TimeoutError`.
fn is_alias(expr: &Expr, semantic: &SemanticModel, target_version: PythonVersion) -> bool {
semantic.resolve_call_path(expr).is_some_and(|call_path| {
if target_version >= PythonVersion::Py311 {
matches!(call_path.as_slice(), [""] | ["asyncio", "TimeoutError"])
} else {
matches!(
call_path.as_slice(),
[""] | ["asyncio", "TimeoutError"] | ["socket", "timeout"]
)
}
})
}
/// Return `true` if an [`Expr`] is `TimeoutError`.
fn is_timeout_error(expr: &Expr, semantic: &SemanticModel) -> bool {
semantic
.resolve_call_path(expr)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["", "TimeoutError"]))
}
/// Create a [`Diagnostic`] for a single target, like an [`Expr::Name`].
fn atom_diagnostic(checker: &mut Checker, target: &Expr) {
let mut diagnostic = Diagnostic::new(
TimeoutErrorAlias {
name: compose_call_path(target),
},
target.range(),
);
if checker.semantic().is_builtin("TimeoutError") {
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
"TimeoutError".to_string(),
target.range(),
)));
}
checker.diagnostics.push(diagnostic);
}
/// Create a [`Diagnostic`] for a tuple of expressions.
fn tuple_diagnostic(checker: &mut Checker, tuple: &ast::ExprTuple, aliases: &[&Expr]) {
let mut diagnostic = Diagnostic::new(TimeoutErrorAlias { name: None }, tuple.range());
if checker.semantic().is_builtin("TimeoutError") {
// Filter out any `TimeoutErrors` aliases.
let mut remaining: Vec<Expr> = tuple
.elts
.iter()
.filter_map(|elt| {
if aliases.contains(&elt) {
None
} else {
Some(elt.clone())
}
})
.collect();
// If `TimeoutError` itself isn't already in the tuple, add it.
if tuple
.elts
.iter()
.all(|elt| !is_timeout_error(elt, checker.semantic()))
{
let node = ast::ExprName {
id: "TimeoutError".into(),
ctx: ExprContext::Load,
range: TextRange::default(),
};
remaining.insert(0, node.into());
}
let content = if remaining.len() == 1 {
"TimeoutError".to_string()
} else {
let node = ast::ExprTuple {
elts: remaining,
ctx: ExprContext::Load,
range: TextRange::default(),
};
format!("({})", checker.generator().expr(&node.into()))
};
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
pad(content, tuple.range(), checker.locator()),
tuple.range(),
)));
}
checker.diagnostics.push(diagnostic);
}
/// UP041
pub(crate) fn timeout_error_alias_handlers(checker: &mut Checker, handlers: &[ExceptHandler]) {
for handler in handlers {
let ExceptHandler::ExceptHandler(ast::ExceptHandlerExceptHandler { type_, .. }) = handler;
let Some(expr) = type_.as_ref() else {
continue;
};
match expr.as_ref() {
Expr::Name(_) | Expr::Attribute(_) => {
if is_alias(expr, checker.semantic(), checker.settings.target_version) {
atom_diagnostic(checker, expr);
}
}
Expr::Tuple(tuple) => {
// List of aliases to replace with `TimeoutError`.
let mut aliases: Vec<&Expr> = vec![];
for elt in &tuple.elts {
if is_alias(elt, checker.semantic(), checker.settings.target_version) {
aliases.push(elt);
}
}
if !aliases.is_empty() {
tuple_diagnostic(checker, tuple, &aliases);
}
}
_ => {}
}
}
}
/// UP041
pub(crate) fn timeout_error_alias_call(checker: &mut Checker, func: &Expr) {
if is_alias(func, checker.semantic(), checker.settings.target_version) {
atom_diagnostic(checker, func);
}
}
/// UP041
pub(crate) fn timeout_error_alias_raise(checker: &mut Checker, expr: &Expr) {
if matches!(expr, Expr::Name(_) | Expr::Attribute(_)) {
if is_alias(expr, checker.semantic(), checker.settings.target_version) {
atom_diagnostic(checker, expr);
}
}
}
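The safety of the UP041 rewrite rests on the runtime aliasing documented in the rule above; a quick version-guarded check of that fact:

```python
import socket
import sys

# socket.timeout has been an alias of the builtin TimeoutError since Python 3.10,
# so `except socket.timeout:` and `except TimeoutError:` catch the same exception.
if sys.version_info >= (3, 10):
    assert socket.timeout is TimeoutError
if sys.version_info >= (3, 11):
    import asyncio
    # asyncio.TimeoutError joined the alias family in Python 3.11.
    assert asyncio.TimeoutError is TimeoutError
```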

View File

@@ -353,7 +353,7 @@ UP032_0.py:37:1: UP032 [*] Use f-string instead of `format` call
35 35 | "foo{}".format(1)
36 36 |
37 |-r"foo{}".format(1)
37 |+rf"foo{1}"
37 |+fr"foo{1}"
38 38 |
39 39 | x = "{a}".format(a=1)
40 40 |
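Both prefix spellings shown in this snapshot are equivalent at runtime; the change only swaps which order the fix emits:

```python
# `rf` and `fr` are interchangeable raw f-string prefixes in Python;
# they parse to the same string object.
assert rf"foo{1}" == fr"foo{1}" == "foo1"
```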

View File

@@ -1,104 +0,0 @@
---
source: crates/ruff_linter/src/rules/pyupgrade/mod.rs
---
UP041.py:5:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
3 | try:
4 | pass
5 | except asyncio.TimeoutError:
| ^^^^^^^^^^^^^^^^^^^^ UP041
6 | pass
|
= help: Replace `asyncio.TimeoutError` with builtin `TimeoutError`
Fix
2 2 | # These should be fixed
3 3 | try:
4 4 | pass
5 |-except asyncio.TimeoutError:
5 |+except TimeoutError:
6 6 | pass
7 7 |
8 8 | try:
UP041.py:17:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
15 | try:
16 | pass
17 | except (asyncio.TimeoutError,):
| ^^^^^^^^^^^^^^^^^^^^^^^ UP041
18 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
14 14 |
15 15 | try:
16 16 | pass
17 |-except (asyncio.TimeoutError,):
17 |+except TimeoutError:
18 18 | pass
19 19 |
20 20 | try:
UP041.py:27:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
25 | try:
26 | pass
27 | except (asyncio.TimeoutError, socket.timeout,):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ UP041
28 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
24 24 |
25 25 | try:
26 26 | pass
27 |-except (asyncio.TimeoutError, socket.timeout,):
27 |+except (TimeoutError, socket.timeout):
28 28 | pass
29 29 |
30 30 | # Should be kept in parentheses (because multiple)
UP041.py:34:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
32 | try:
33 | pass
34 | except (asyncio.TimeoutError, socket.timeout, KeyError, TimeoutError):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ UP041
35 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
31 31 |
32 32 | try:
33 33 | pass
34 |-except (asyncio.TimeoutError, socket.timeout, KeyError, TimeoutError):
34 |+except (socket.timeout, KeyError, TimeoutError):
35 35 | pass
36 36 |
37 37 | # First should change, second should not
UP041.py:42:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
40 | try:
41 | pass
42 | except (asyncio.TimeoutError, error):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ UP041
43 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
39 39 | from .mmap import error
40 40 | try:
41 41 | pass
42 |-except (asyncio.TimeoutError, error):
42 |+except (TimeoutError, error):
43 43 | pass
44 44 |
45 45 | # These should not change

View File

@@ -17,7 +17,7 @@ mod tests {
use crate::pyproject_toml::lint_pyproject_toml;
use crate::registry::Rule;
use crate::settings::resolve_per_file_ignores;
use crate::settings::types::{PerFileIgnore, PreviewMode, PythonVersion};
use crate::settings::types::{PerFileIgnore, PythonVersion};
use crate::test::{test_path, test_resource_path};
use crate::{assert_messages, settings};
@@ -88,24 +88,6 @@ mod tests {
Ok(())
}
#[test]
fn preview_confusables() -> Result<()> {
let diagnostics = test_path(
Path::new("ruff/confusables.py"),
&settings::LinterSettings {
preview: PreviewMode::Enabled,
allowed_confusables: FxHashSet::from_iter(['−', 'ρ', '∗'])
..settings::LinterSettings::for_rules(vec![
Rule::AmbiguousUnicodeCharacterString,
Rule::AmbiguousUnicodeCharacterDocstring,
Rule::AmbiguousUnicodeCharacterComment,
])
},
)?;
assert_messages!(diagnostics);
Ok(())
}
#[test]
fn noqa() -> Result<()> {
let diagnostics = test_path(

View File

@@ -13,20 +13,12 @@ use crate::rules::ruff::rules::Context;
use crate::settings::LinterSettings;
/// ## What it does
/// Checks for ambiguous Unicode characters in strings.
/// Checks for ambiguous unicode characters in strings.
///
/// ## Why is this bad?
/// Some Unicode characters are visually similar to ASCII characters, but have
/// different code points. For example, `LATIN CAPITAL LETTER A` (`U+0041`) is
/// visually similar, but not identical, to the ASCII character `A`.
///
/// The use of ambiguous Unicode characters can confuse readers and cause
/// The use of ambiguous unicode characters can confuse readers and cause
/// subtle bugs.
///
/// In [preview], this rule will also flag Unicode characters that are
/// confusable with other, non-preferred Unicode characters. For example, the
/// spec recommends `GREEK CAPITAL LETTER OMEGA` over `OHM SIGN`.
///
/// ## Example
/// ```python
/// print("Ηello, world!") # "Η" is the Greek eta (`U+0397`).
@@ -36,8 +28,6 @@ use crate::settings::LinterSettings;
/// ```python
/// print("Hello, world!") # "H" is the Latin capital H (`U+0048`).
/// ```
///
/// [preview]: https://docs.astral.sh/ruff/preview/
#[violation]
pub struct AmbiguousUnicodeCharacterString {
confusable: char,
@@ -60,20 +50,12 @@ impl Violation for AmbiguousUnicodeCharacterString {
}
/// ## What it does
/// Checks for ambiguous Unicode characters in docstrings.
/// Checks for ambiguous unicode characters in docstrings.
///
/// ## Why is this bad?
/// Some Unicode characters are visually similar to ASCII characters, but have
/// different code points. For example, `LATIN CAPITAL LETTER A` (`U+0041`) is
/// visually similar, but not identical, to the ASCII character `A`.
///
/// The use of ambiguous Unicode characters can confuse readers and cause
/// The use of ambiguous unicode characters can confuse readers and cause
/// subtle bugs.
///
/// In [preview], this rule will also flag Unicode characters that are
/// confusable with other, non-preferred Unicode characters. For example, the
/// spec recommends `GREEK CAPITAL LETTER OMEGA` over `OHM SIGN`.
///
/// ## Example
/// ```python
/// """A lovely docstring (with a `U+FF09` parenthesis."""
@@ -83,8 +65,6 @@ impl Violation for AmbiguousUnicodeCharacterString {
/// ```python
/// """A lovely docstring (with no strange parentheses)."""
/// ```
///
/// [preview]: https://docs.astral.sh/ruff/preview/
#[violation]
pub struct AmbiguousUnicodeCharacterDocstring {
confusable: char,
@@ -107,20 +87,12 @@ impl Violation for AmbiguousUnicodeCharacterDocstring {
}
/// ## What it does
/// Checks for ambiguous Unicode characters in comments.
/// Checks for ambiguous unicode characters in comments.
///
/// ## Why is this bad?
/// Some Unicode characters are visually similar to ASCII characters, but have
/// different code points. For example, `LATIN CAPITAL LETTER A` (`U+0041`) is
/// visually similar, but not identical, to the ASCII character `A`.
///
/// The use of ambiguous Unicode characters can confuse readers and cause
/// The use of ambiguous unicode characters can confuse readers and cause
/// subtle bugs.
///
/// In [preview], this rule will also flag Unicode characters that are
/// confusable with other, non-preferred Unicode characters. For example, the
/// spec recommends `GREEK CAPITAL LETTER OMEGA` over `OHM SIGN`.
///
/// ## Example
/// ```python
/// foo() # nоqa # "о" is Cyrillic (`U+043E`)
@@ -130,8 +102,6 @@ impl Violation for AmbiguousUnicodeCharacterDocstring {
/// ```python
/// foo() # noqa # "o" is Latin (`U+006F`)
/// ```
///
/// [preview]: https://docs.astral.sh/ruff/preview/
#[violation]
pub struct AmbiguousUnicodeCharacterComment {
confusable: char,
@@ -189,13 +159,11 @@ pub(crate) fn ambiguous_unicode_character(
// Check if the boundary character is itself an ambiguous unicode character, in which
// case, it's always included as a diagnostic.
if !current_char.is_ascii() {
if let Some(representant) = confusable(current_char as u32)
.filter(|representant| settings.preview.is_enabled() || representant.is_ascii())
{
if let Some(representant) = confusable(current_char as u32) {
let candidate = Candidate::new(
TextSize::try_from(relative_offset).unwrap() + range.start(),
current_char,
representant,
char::from_u32(representant).unwrap(),
);
if let Some(diagnostic) = candidate.into_diagnostic(context, settings) {
diagnostics.push(diagnostic);
@@ -205,14 +173,12 @@ pub(crate) fn ambiguous_unicode_character(
} else if current_char.is_ascii() {
// The current word contains at least one ASCII character.
word_flags |= WordFlags::ASCII;
} else if let Some(representant) = confusable(current_char as u32)
.filter(|representant| settings.preview.is_enabled() || representant.is_ascii())
{
} else if let Some(representant) = confusable(current_char as u32) {
// The current word contains an ambiguous unicode character.
word_candidates.push(Candidate::new(
TextSize::try_from(relative_offset).unwrap() + range.start(),
current_char,
representant,
char::from_u32(representant).unwrap(),
));
} else {
// The current word contains at least one unambiguous unicode character.

File diff suppressed because it is too large

View File

@@ -155,4 +155,10 @@ confusables.py:46:62: RUF003 Comment contains ambiguous `` (PHILIPPINE SINGLE
47 | }"
|
confusables.py:55:28: RUF001 String contains ambiguous `µ` (MICRO SIGN). Did you mean `μ` (GREEK SMALL LETTER MU)?
|
55 | assert getattr(Labware(), "µL") == 1.5
| ^ RUF001
|

View File

@@ -1,164 +0,0 @@
---
source: crates/ruff_linter/src/rules/ruff/mod.rs
---
confusables.py:1:6: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
1 | x = "𝐁ad string"
| ^ RUF001
2 | y = ""
|
confusables.py:6:56: RUF002 Docstring contains ambiguous `）` (FULLWIDTH RIGHT PARENTHESIS). Did you mean `)` (RIGHT PARENTHESIS)?
|
5 | def f():
6 | """Here's a docstring with an unusual parenthesis: ）"""
| ^^ RUF002
7 | # And here's a comment with an unusual punctuation mark: ᜵
8 | ...
|
confusables.py:7:62: RUF003 Comment contains ambiguous `᜵` (PHILIPPINE SINGLE PUNCTUATION). Did you mean `/` (SOLIDUS)?
|
5 | def f():
6 | """Here's a docstring with an unusual parenthesis: ）"""
7 | # And here's a comment with an unusual punctuation mark: ᜵
| ^ RUF003
8 | ...
|
confusables.py:17:6: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
17 | x = "𝐁ad string"
| ^ RUF001
18 | x = ""
|
confusables.py:26:10: RUF001 String contains ambiguous `α` (GREEK SMALL LETTER ALPHA). Did you mean `a` (LATIN SMALL LETTER A)?
|
24 | # The first word should be ignored, while the second should be included, since it
25 | # contains ASCII.
26 | x = "βα Bαd"
| ^ RUF001
27 |
28 | # The two characters should be flagged here. The first character is a "word"
|
confusables.py:31:6: RUF001 String contains ambiguous `Р` (CYRILLIC CAPITAL LETTER ER). Did you mean `P` (LATIN CAPITAL LETTER P)?
|
29 | # consisting of a single ambiguous character, while the second character is a "word
30 | # boundary" (whitespace) that it itself ambiguous.
31 | x = "Р усский"
| ^ RUF001
32 |
33 | # Same test cases as above but using f-strings instead:
|
confusables.py:31:7: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
29 | # consisting of a single ambiguous character, while the second character is a "word
30 | # boundary" (whitespace) that it itself ambiguous.
31 | x = "Р усский"
| ^ RUF001
32 |
33 | # Same test cases as above but using f-strings instead:
|
confusables.py:34:7: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
33 | # Same test cases as above but using f-strings instead:
34 | x = f"𝐁ad string"
| ^ RUF001
35 | x = f""
36 | x = f"Русский"
|
confusables.py:37:11: RUF001 String contains ambiguous `α` (GREEK SMALL LETTER ALPHA). Did you mean `a` (LATIN SMALL LETTER A)?
|
35 | x = f""
36 | x = f"Русский"
37 | x = f"βα Bαd"
| ^ RUF001
38 | x = f"Р усский"
|
confusables.py:38:7: RUF001 String contains ambiguous `Р` (CYRILLIC CAPITAL LETTER ER). Did you mean `P` (LATIN CAPITAL LETTER P)?
|
36 | x = f"Русский"
37 | x = f"βα Bαd"
38 | x = f"Р усский"
| ^ RUF001
39 |
40 | # Nested f-strings
|
confusables.py:38:8: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
36 | x = f"Русский"
37 | x = f"βα Bαd"
38 | x = f"Р усский"
| ^ RUF001
39 |
40 | # Nested f-strings
|
confusables.py:41:7: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:41:21: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:41:25: RUF001 String contains ambiguous `Р` (CYRILLIC CAPITAL LETTER ER). Did you mean `P` (LATIN CAPITAL LETTER P)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:41:26: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:44:68: RUF003 Comment contains ambiguous `）` (FULLWIDTH RIGHT PARENTHESIS). Did you mean `)` (RIGHT PARENTHESIS)?
|
43 | # Comments inside f-strings
44 | x = f"string { # And here's a comment with an unusual parenthesis: ）
| ^^ RUF003
45 | # And here's a comment with a greek alpha:
46 | foo # And here's a comment with an unusual punctuation mark: ᜵
|
confusables.py:46:62: RUF003 Comment contains ambiguous `᜵` (PHILIPPINE SINGLE PUNCTUATION). Did you mean `/` (SOLIDUS)?
|
44 | x = f"string { # And here's a comment with an unusual parenthesis: ）
45 | # And here's a comment with a greek alpha:
46 | foo # And here's a comment with an unusual punctuation mark: ᜵
| ^ RUF003
47 | }"
|
confusables.py:55:28: RUF001 String contains ambiguous `µ` (MICRO SIGN). Did you mean `μ` (GREEK SMALL LETTER MU)?
|
55 | assert getattr(Labware(), "µL") == 1.5
| ^ RUF001
|

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_shrinking"
version = "0.1.4"
version = "0.1.3"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

View File

@@ -219,28 +219,6 @@ extend-include = ["*.ipynb"]
This will prompt Ruff to discover Jupyter Notebook (`.ipynb`) files in any specified
directories, then lint and format them accordingly.
If you'd prefer to either only lint or only format Jupyter Notebook files, you can use the
section-specific `exclude` option to do so. For example, the following would only lint Jupyter
Notebook files and not format them:
```toml
[tool.ruff]
extend-include = ["*.ipynb"]
[tool.ruff.format]
exclude = ["*.ipynb"]
```
And, conversely, the following would only format Jupyter Notebook files and not lint them:
```toml
[tool.ruff]
extend-include = ["*.ipynb"]
[tool.ruff.lint]
exclude = ["*.ipynb"]
```
Alternatively, pass the notebook file(s) to `ruff` on the command-line directly. For example,
`ruff check /path/to/notebook.ipynb` will always lint `notebook.ipynb`. Similarly,
`ruff format /path/to/notebook.ipynb` will always format `notebook.ipynb`.

View File

@@ -12,18 +12,18 @@ which supports fix actions, import sorting, and more.
Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-commit`](https://github.com/astral-sh/ruff-pre-commit):
```yaml
# Run the Ruff linter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.291
hooks:
- id: ruff
# Run the Ruff formatter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.291
hooks:
- id: ruff-format
# Run the Ruff linter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.291
hooks:
- id: ruff
```
To enable fixes, add the `--fix` argument to the linter:
@@ -47,7 +47,7 @@ To run the hooks over Jupyter Notebooks too, add `jupyter` to the list of allowed
   rev: v0.0.291
   hooks:
     - id: ruff
-      types_or: [ python, pyi, jupyter ]
+      types_or: [python, pyi, jupyter]
 ```
 
 Ruff's lint hook should be placed after other formatting tools, such as Ruff's format hook, Black,
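The `types_or` key used above selects files by tag with OR semantics (a hook runs on a file carrying *any* of the listed types), as opposed to `types`, which requires all of them. A minimal sketch of that filtering logic, with a hypothetical helper name (not pre-commit's actual implementation):

```python
def hook_runs_on(file_tags: set[str], types_or: list[str]) -> bool:
    """Run the hook if the file carries ANY of the `types_or` tags (OR semantics)."""
    return any(tag in file_tags for tag in types_or)


# pre-commit tags each file; a notebook carries the "jupyter" tag.
print(hook_runs_on({"file", "text", "jupyter"}, ["python", "pyi", "jupyter"]))   # notebook: runs
print(hook_runs_on({"file", "text", "markdown"}, ["python", "pyi", "jupyter"]))  # markdown: skipped
```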


@@ -284,13 +284,13 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
 # Run the Ruff linter.
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.1.4
+  rev: v0.1.3
   hooks:
     - id: ruff
 # Run the Ruff formatter.
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.1.4
+  rev: v0.1.3
   hooks:
     - id: ruff-format
 ```


@@ -4,7 +4,7 @@ build-backend = "maturin"
 
 [project]
 name = "ruff"
-version = "0.1.4"
+version = "0.1.3"
 description = "An extremely fast Python linter and code formatter, written in Rust."
 authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
 readme = "README.md"

ruff.schema.json (generated)

@@ -3536,7 +3536,6 @@
     "UP039",
     "UP04",
     "UP040",
-    "UP041",
     "W",
     "W1",
     "W19",


@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "scripts"
-version = "0.1.4"
+version = "0.1.3"
 description = ""
 authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]


@@ -13,7 +13,7 @@ prelude = """
 /// Via: <https://github.com/hediet/vscode-unicode-data/blob/main/out/ambiguous.json>
 /// See: <https://github.com/microsoft/vscode/blob/095ddabc52b82498ee7f718a34f9dd11d59099a8/src/vs/base/common/strings.ts#L1094>
-pub(crate) fn confusable(c: u32) -> Option<char> {
+pub(crate) fn confusable(c: u32) -> Option<u8> {
     let result = match c {
 """.lstrip()
@@ -49,14 +49,6 @@ def format_number(number: int) -> str:
     return f"{number}u32"
 
-
-def format_char(number: int) -> str:
-    """Format a Python integer as a Rust character literal."""
-    char = chr(number)
-    if char == "\\":
-        return "\\\\"
-    return char
-
 
 def format_confusables_rs(raw_data: dict[str, list[int]]) -> str:
     """Format the downloaded data into a Rust source file."""
     # The input data contains duplicate entries.
@@ -67,7 +59,7 @@ def format_confusables_rs(raw_data: dict[str, list[int]]) -> str:
             flattened_items.add((items[i], items[i + 1]))
 
     tuples = [
-        f"        {format_number(left)} => '{format_char(right)}',\n"
+        f"        {format_number(left)} => {right},\n"
        for left, right in sorted(flattened_items)
     ]
@@ -75,13 +67,13 @@ def format_confusables_rs(raw_data: dict[str, list[int]]) -> str:
# as they're unicode-to-unicode confusables, not unicode-to-ASCII confusables.
confusable_units = [
# ANGSTROM SIGN → LATIN CAPITAL LETTER A WITH RING ABOVE
("0x212B", chr(0x00C5)),
("0x212B", "0x00C5"),
# OHM SIGN → GREEK CAPITAL LETTER OMEGA
("0x2126", chr(0x03A9)),
("0x2126", "0x03A9"),
# MICRO SIGN → GREEK SMALL LETTER MU
("0x00B5", chr(0x03BC)),
("0x00B5", "0x03BC"),
]
tuples += [f" {left} => '{right}',\n" for left, right in confusable_units]
tuples += [f" {left} => {right},\n" for left, right in confusable_units]
print(f"{len(tuples)} confusable tuples.")