Compare commits

...

11 Commits

Author SHA1 Message Date
Charlie Marsh
c4889196e7 Add missing pyupgrade entry to changelog (#8479)
This got merged after the changelog was generated, but is part of the
release.
2023-11-03 20:57:19 +00:00
Charlie Marsh
6e635e99f4 Add changelog for v0.1.4 (#8478) 2023-11-03 20:11:21 +00:00
Charlie Marsh
260ea41975 Bump version to v0.1.4 (#8477) 2023-11-03 14:52:56 -04:00
Hugo van Kemenade
65effc6666 Add pyupgrade UP041 to replace TimeoutError aliases (#8476)
## Summary

Add UP041 to replace `TimeoutError` aliases:

* Python 3.10+: `socket.timeout`
* Python 3.11+: `asyncio.TimeoutError`

Re:

* https://github.com/asottile/pyupgrade#timeouterror-aliases
* https://docs.python.org/3/library/asyncio-exceptions.html#asyncio.TimeoutError
* https://docs.python.org/3/library/socket.html#socket.timeout

Based on `os_error_alias.rs`.
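
As a rough sketch of the intended behavior (illustrative, not the fixture verbatim): with `--target-version py311`, both aliases collapse into the builtin.

```python
import asyncio
import socket

# Before: aliases that UP041 flags (socket.timeout on Python 3.10+,
# asyncio.TimeoutError on Python 3.11+).
try:
    ...
except (asyncio.TimeoutError, socket.timeout):
    ...

# After the autofix:
try:
    ...
except TimeoutError:
    ...
```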

## Test Plan


By running:

```
cargo clippy --workspace --all-targets --all-features -- -D warnings  # Rust linting
RUFF_UPDATE_SCHEMA=1 cargo test  # Rust testing and updating ruff.schema.json
pre-commit run --all-files --show-diff-on-failure  # Rust and Python formatting, Markdown and Python linting, etc.
cargo insta review
```

And also running with different `--target-version` values:

```sh
cargo run -p ruff_cli -- check crates/ruff_linter/resources/test/fixtures/pyupgrade/UP041.py --no-cache --select UP041 --target-version py37 --diff
cargo run -p ruff_cli -- check crates/ruff_linter/resources/test/fixtures/pyupgrade/UP041.py --no-cache --select UP041 --target-version py310 --diff
cargo run -p ruff_cli -- check crates/ruff_linter/resources/test/fixtures/pyupgrade/UP041.py --no-cache --select UP041 --target-version py311 --diff
```
2023-11-03 17:24:47 +00:00
T-256
4982694b54 D300: prevent autofix when both triples are in body (#8462)
## Summary
Addresses
https://github.com/astral-sh/ruff/issues/8402#issuecomment-1788782750
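
As a sketch of the case in question (based on the fixture cases added below): when a docstring's body contains both triple-quote styles, switching the outer quotes either way would terminate the string early, so D300 is now skipped there rather than offering a broken fix.

```python
# OK (as of this change): the body contains both ''' and """, so neither
# quote style can safely wrap it. D300 no longer fires here.
def contains_triples():
    '''(\'''|""")'''
```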

## Test Plan

Added associated test
2023-11-03 12:49:50 -04:00
Charlie Marsh
536ac550ed Remove trailing periods from NumPy 2.0 code actions (#8475)
Very minor consistency fix with other rules. For code actions, we tend
to say `Replace with {X}` rather than `Use {X} instead`.
2023-11-03 16:28:06 +00:00
Charlie Marsh
f2335fe692 Make Unicode-to-Unicode confusables a preview change (#8473) 2023-11-03 12:17:28 -04:00
Charlie Marsh
b0f9a14d9a Mark byte_bounds as a non-backwards-compatible NumPy 2.0 change (#8474)
This is the one refactor in the NumPy 2.0 upgrade rule that isn't
compatible with earlier versions of NumPy, so I'm marking it as unsafe
and adding a dedicated message.
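
For context, a hedged sketch (not part of this diff) of why the fix is version-dependent: the replacement lives under `numpy.lib.array_utils`, which only exists in NumPy 2.0, so fixed code must either require NumPy 2.0 or guard on the version.

```python
import numpy as np

arr = np.array([1, 2, 3])

if np.lib.NumpyVersion(np.__version__) >= "2.0.0":
    # NumPy 2.0+: the module the (now unsafe) fix points to.
    from numpy.lib.array_utils import byte_bounds
else:
    # NumPy 1.x: the original function, removed from the main namespace in 2.0.
    byte_bounds = np.byte_bounds

low, high = byte_bounds(arr)  # start and past-the-end addresses of the data buffer
```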
2023-11-03 12:14:57 -04:00
Deepyaman Datta
f56bc1983b Place 'r' prefix before 'f' for raw format strings (#8464)
## Summary

Currently, `UP032` applied to raw strings produces f-strings with the prefix
'fr', which Ruff format (and Black) then rewrite to 'rf'. To avoid that churn,
this PR emits the 'rf' prefix to begin with.
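
A minimal before/after sketch (the `UP032_0.py` snapshot further down shows the same case):

```python
# Before this change, UP032 rewrote the call below as fr"foo{1}", which the
# formatter then reordered to rf"foo{1}". The rule now emits that order directly.
x = r"foo{}".format(1)

# Result of the UP032 fix:
x = rf"foo{1}"
```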

## Test Plan

Updated the expectation on an existing test.
2023-11-03 10:56:21 -04:00
Charlie Marsh
7c12eaf322 Use characters instead of u32 in confusable map (#8463) 2023-11-03 09:57:47 -04:00
Dhruv Manilawala
41e538a748 Provide example for exclusive linting or formatting Notebooks (#8461)
Reference screenshot: https://github.com/astral-sh/ruff/assets/67177269/eef5ab79-77e9-4ced-be7b-a61b7bb20ecd
2023-11-03 16:56:20 +05:30
34 changed files with 2485 additions and 1699 deletions

View File

@@ -1,5 +1,66 @@
# Changelog
## 0.1.4
### Preview features
- \[`flake8-trio`\] Implement `timeout-without-await` (`TRIO001`) ([#8439](https://github.com/astral-sh/ruff/pull/8439))
- \[`numpy`\] Implement NumPy 2.0 migration rule (`NPY200`) ([#7702](https://github.com/astral-sh/ruff/pull/7702))
- \[`pylint`\] Implement `bad-open-mode` (`W1501`) ([#8294](https://github.com/astral-sh/ruff/pull/8294))
- \[`pylint`\] Implement `import-outside-toplevel` (`C0415`) rule ([#5180](https://github.com/astral-sh/ruff/pull/5180))
- \[`pylint`\] Implement `useless-with-lock` (`W2101`) ([#8321](https://github.com/astral-sh/ruff/pull/8321))
- \[`pyupgrade`\] Implement `timeout-error-alias` (`UP041`) ([#8476](https://github.com/astral-sh/ruff/pull/8476))
- \[`refurb`\] Implement `isinstance-type-none` (`FURB168`) ([#8308](https://github.com/astral-sh/ruff/pull/8308))
- Detect confusable Unicode-to-Unicode units in `RUF001`, `RUF002`, and `RUF003` ([#4430](https://github.com/astral-sh/ruff/pull/4430))
- Add newline after module docstrings in preview style ([#8283](https://github.com/astral-sh/ruff/pull/8283))
### Formatter
- Add a note on line-too-long to the formatter docs ([#8314](https://github.com/astral-sh/ruff/pull/8314))
- Preserve trailing statement semicolons when using `fmt: skip` ([#8273](https://github.com/astral-sh/ruff/pull/8273))
- Preserve trailing semicolons when using `fmt: off` ([#8275](https://github.com/astral-sh/ruff/pull/8275))
- Avoid duplicating linter-formatter compatibility warnings ([#8292](https://github.com/astral-sh/ruff/pull/8292))
- Avoid inserting a newline after function docstrings ([#8375](https://github.com/astral-sh/ruff/pull/8375))
- Insert newline between docstring and following own line comment ([#8216](https://github.com/astral-sh/ruff/pull/8216))
- Split tuples in return positions by comma first ([#8280](https://github.com/astral-sh/ruff/pull/8280))
- Avoid treating byte strings as docstrings ([#8350](https://github.com/astral-sh/ruff/pull/8350))
- Add `--line-length` option to `format` command ([#8363](https://github.com/astral-sh/ruff/pull/8363))
- Avoid parenthesizing unsplittable because of comments ([#8431](https://github.com/astral-sh/ruff/pull/8431))
### CLI
- Add `--output-format` to `ruff rule` and `ruff linter` ([#8203](https://github.com/astral-sh/ruff/pull/8203))
### Bug fixes
- Respect `--force-exclude` in `lint.exclude` and `format.exclude` ([#8393](https://github.com/astral-sh/ruff/pull/8393))
- Respect `--extend-per-file-ignores` on the CLI ([#8329](https://github.com/astral-sh/ruff/pull/8329))
- Extend `bad-dunder-method-name` to permit `__index__` ([#8300](https://github.com/astral-sh/ruff/pull/8300))
- Fix panic with 8 in octal escape ([#8356](https://github.com/astral-sh/ruff/pull/8356))
- Avoid raising `D300` when both triple quote styles are present ([#8462](https://github.com/astral-sh/ruff/pull/8462))
- Consider unterminated f-strings in `FStringRanges` ([#8154](https://github.com/astral-sh/ruff/pull/8154))
- Avoid including literal `shell=True` for truthy, non-`True` diagnostics ([#8359](https://github.com/astral-sh/ruff/pull/8359))
- Avoid triggering single-element test for starred expressions ([#8433](https://github.com/astral-sh/ruff/pull/8433))
- Detect and ignore Jupyter automagics ([#8398](https://github.com/astral-sh/ruff/pull/8398))
- Fix invalid E231 error with f-strings ([#8369](https://github.com/astral-sh/ruff/pull/8369))
- Avoid triggering `NamedTuple` rewrite with starred annotation ([#8434](https://github.com/astral-sh/ruff/pull/8434))
- Avoid un-setting bracket flag in logical lines ([#8380](https://github.com/astral-sh/ruff/pull/8380))
- Place 'r' prefix before 'f' for raw format strings ([#8464](https://github.com/astral-sh/ruff/pull/8464))
- Remove trailing periods from NumPy 2.0 code actions ([#8475](https://github.com/astral-sh/ruff/pull/8475))
- Fix bug where `PLE1307` was raised when formatting `%c` with characters ([#8407](https://github.com/astral-sh/ruff/pull/8407))
- Remove unicode flag from comparable ([#8440](https://github.com/astral-sh/ruff/pull/8440))
- Improve B015 message ([#8295](https://github.com/astral-sh/ruff/pull/8295))
- Use `fixedOverflowWidgets` for playground popover ([#8458](https://github.com/astral-sh/ruff/pull/8458))
- Mark `byte_bounds` as a non-backwards-compatible NumPy 2.0 change ([#8474](https://github.com/astral-sh/ruff/pull/8474))
### Internals
- Add a dedicated cache directory per Ruff version ([#8333](https://github.com/astral-sh/ruff/pull/8333))
- Allow selective caching for `--fix` and `--diff` ([#8316](https://github.com/astral-sh/ruff/pull/8316))
- Improve performance of comment parsing ([#8193](https://github.com/astral-sh/ruff/pull/8193))
- Improve performance of string parsing ([#8227](https://github.com/astral-sh/ruff/pull/8227))
- Use a dedicated sort key for isort import sorting ([#7963](https://github.com/astral-sh/ruff/pull/7963))
## 0.1.3
This release includes a variety of improvements to the Ruff formatter, removing several known and

Cargo.lock generated
View File

@@ -810,7 +810,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
[[package]]
name = "flake8-to-ruff"
version = "0.1.3"
version = "0.1.4"
dependencies = [
"anyhow",
"clap",
@@ -2060,7 +2060,7 @@ dependencies = [
[[package]]
name = "ruff_cli"
version = "0.1.3"
version = "0.1.4"
dependencies = [
"annotate-snippets 0.9.1",
"anyhow",
@@ -2196,7 +2196,7 @@ dependencies = [
[[package]]
name = "ruff_linter"
version = "0.1.3"
version = "0.1.4"
dependencies = [
"aho-corasick",
"annotate-snippets 0.9.1",
@@ -2447,7 +2447,7 @@ dependencies = [
[[package]]
name = "ruff_shrinking"
version = "0.1.3"
version = "0.1.4"
dependencies = [
"anyhow",
"clap",

View File

@@ -148,10 +148,9 @@ ruff format @arguments.txt # Format using an input file, treating its
Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff-pre-commit`](https://github.com/astral-sh/ruff-pre-commit):
```yaml
# Run the Ruff linter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.1.3
rev: v0.1.4
hooks:
# Run the Ruff linter.
- id: ruff

View File

@@ -1,6 +1,6 @@
[package]
name = "flake8-to-ruff"
version = "0.1.3"
version = "0.1.4"
description = """
Convert Flake8 configuration files to Ruff configuration files.
"""

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_cli"
version = "0.1.3"
version = "0.1.4"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_linter"
version = "0.1.3"
version = "0.1.4"
publish = false
authors = { workspace = true }
edition = { workspace = true }

View File

@@ -8,3 +8,19 @@ def ends_in_quote():
def contains_quote():
'Sum"\\mary.'
# OK
def contains_triples(t):
"""('''|\""")"""
# OK
def contains_triples(t):
'''(\'''|""")'''
# TODO: this should raise D300 and suggest double quotes instead,
# because the escaped double quote would allow that rewrite.
def contains_triples(t):
'''(\""")'''

View File

@@ -0,0 +1,68 @@
import asyncio, socket
# These should be fixed
try:
pass
except asyncio.TimeoutError:
pass
try:
pass
except socket.timeout:
pass
# Should NOT be in parentheses when replaced
try:
pass
except (asyncio.TimeoutError,):
pass
try:
pass
except (socket.timeout,):
pass
try:
pass
except (asyncio.TimeoutError, socket.timeout,):
pass
# Should be kept in parentheses (because multiple)
try:
pass
except (asyncio.TimeoutError, socket.timeout, KeyError, TimeoutError):
pass
# First should change, second should not
from .mmap import error
try:
pass
except (asyncio.TimeoutError, error):
pass
# These should not change
from foo import error
try:
pass
except (TimeoutError, error):
pass
try:
pass
except:
pass
try:
pass
except TimeoutError:
pass
try:
pass
except (TimeoutError, KeyError):
pass

View File

@@ -466,6 +466,11 @@ pub(crate) fn expression(expr: &Expr, checker: &mut Checker) {
if checker.enabled(Rule::OSErrorAlias) {
pyupgrade::rules::os_error_alias_call(checker, func);
}
if checker.enabled(Rule::TimeoutErrorAlias) {
if checker.settings.target_version >= PythonVersion::Py310 {
pyupgrade::rules::timeout_error_alias_call(checker, func);
}
}
if checker.enabled(Rule::NonPEP604Isinstance) {
if checker.settings.target_version >= PythonVersion::Py310 {
pyupgrade::rules::use_pep604_isinstance(checker, expr, func, args);

View File

@@ -1006,6 +1006,13 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
pyupgrade::rules::os_error_alias_raise(checker, item);
}
}
if checker.enabled(Rule::TimeoutErrorAlias) {
if checker.settings.target_version >= PythonVersion::Py310 {
if let Some(item) = exc {
pyupgrade::rules::timeout_error_alias_raise(checker, item);
}
}
}
if checker.enabled(Rule::RaiseVanillaClass) {
if let Some(expr) = exc {
tryceratops::rules::raise_vanilla_class(checker, expr);
@@ -1304,6 +1311,11 @@ pub(crate) fn statement(stmt: &Stmt, checker: &mut Checker) {
if checker.enabled(Rule::OSErrorAlias) {
pyupgrade::rules::os_error_alias_handlers(checker, handlers);
}
if checker.enabled(Rule::TimeoutErrorAlias) {
if checker.settings.target_version >= PythonVersion::Py310 {
pyupgrade::rules::timeout_error_alias_handlers(checker, handlers);
}
}
if checker.enabled(Rule::PytestAssertInExcept) {
flake8_pytest_style::rules::assert_in_exception_handler(checker, handlers);
}

View File

@@ -500,6 +500,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Pyupgrade, "038") => (RuleGroup::Stable, rules::pyupgrade::rules::NonPEP604Isinstance),
(Pyupgrade, "039") => (RuleGroup::Stable, rules::pyupgrade::rules::UnnecessaryClassParentheses),
(Pyupgrade, "040") => (RuleGroup::Stable, rules::pyupgrade::rules::NonPEP695TypeAlias),
(Pyupgrade, "041") => (RuleGroup::Preview, rules::pyupgrade::rules::TimeoutErrorAlias),
// pydocstyle
(Pydocstyle, "100") => (RuleGroup::Stable, rules::pydocstyle::rules::UndocumentedPublicModule),

View File

@@ -19,10 +19,13 @@ use crate::importer::ImportRequest;
/// constants were removed from the main namespace.
///
/// The majority of these functions and constants can be automatically replaced
/// by other members of the NumPy API, even prior to NumPy 2.0, or by
/// equivalents from the Python standard library. This rule flags all uses of
/// removed members, along with automatic fixes for any backwards-compatible
/// replacements.
/// by other members of the NumPy API or by equivalents from the Python
/// standard library. With the exception of renaming `numpy.byte_bounds` to
/// `numpy.lib.array_utils.byte_bounds`, all such replacements are backwards
/// compatible with earlier versions of NumPy.
///
/// This rule flags all uses of removed members, along with automatic fixes for
/// any backwards-compatible replacements.
///
/// ## Examples
/// ```python
@@ -45,6 +48,7 @@ use crate::importer::ImportRequest;
pub struct Numpy2Deprecation {
existing: String,
migration_guide: Option<String>,
code_action: Option<String>,
}
impl Violation for Numpy2Deprecation {
@@ -55,21 +59,23 @@ impl Violation for Numpy2Deprecation {
let Numpy2Deprecation {
existing,
migration_guide,
code_action: _,
} = self;
match migration_guide {
Some(migration_guide) => {
format!("`np.{existing}` will be removed in NumPy 2.0. {migration_guide}",)
}
None => format!("`np.{existing}` will be removed without replacement in NumPy 2.0."),
None => format!("`np.{existing}` will be removed without replacement in NumPy 2.0"),
}
}
fn fix_title(&self) -> Option<String> {
let Numpy2Deprecation {
existing: _,
migration_guide,
migration_guide: _,
code_action,
} = self;
migration_guide.clone()
code_action.clone()
}
}
@@ -82,7 +88,11 @@ struct Replacement<'a> {
#[derive(Debug)]
enum Details<'a> {
/// The deprecated member can be replaced by another member in the NumPy API.
AutoImport { path: &'a str, name: &'a str },
AutoImport {
path: &'a str,
name: &'a str,
compatibility: Compatibility,
},
/// The deprecated member can be replaced by a member of the Python standard library.
AutoPurePython { python_expr: &'a str },
/// The deprecated member can be replaced by a manual migration.
@@ -92,15 +102,54 @@ enum Details<'a> {
impl Details<'_> {
fn guideline(&self) -> Option<String> {
match self {
Details::AutoImport { path, name } => Some(format!("Use `{path}.{name}` instead.")),
Details::AutoImport {
path,
name,
compatibility: Compatibility::BackwardsCompatible,
} => Some(format!("Use `{path}.{name}` instead.")),
Details::AutoImport {
path,
name,
compatibility: Compatibility::Breaking,
} => Some(format!(
"Use `{path}.{name}` on NumPy 2.0, or ignore this warning on earlier versions."
)),
Details::AutoPurePython { python_expr } => {
Some(format!("Use `{python_expr}` instead."))
}
Details::Manual { guideline } => guideline.map(ToString::to_string),
}
}
fn code_action(&self) -> Option<String> {
match self {
Details::AutoImport {
path,
name,
compatibility: Compatibility::BackwardsCompatible,
} => Some(format!("Replace with `{path}.{name}`")),
Details::AutoImport {
path,
name,
compatibility: Compatibility::Breaking,
} => Some(format!(
"Replace with `{path}.{name}` (requires NumPy 2.0 or greater)"
)),
Details::AutoPurePython { python_expr } => {
Some(format!("Replace with `{python_expr}`"))
}
Details::Manual { guideline: _ } => None,
}
}
}
#[derive(Debug)]
enum Compatibility {
/// The change is backwards compatible with earlier versions of NumPy.
BackwardsCompatible,
/// The change is breaking in NumPy 2.0.
Breaking,
}
/// NPY201
pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
let maybe_replacement = checker
@@ -113,6 +162,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib",
name: "add_docstring",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "add_newdoc"] => Some(Replacement {
@@ -120,6 +170,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib",
name: "add_newdoc",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "add_newdoc_ufunc"] => Some(Replacement {
@@ -139,6 +190,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib.array_utils",
name: "byte_bounds",
compatibility: Compatibility::Breaking,
},
}),
["numpy", "cast"] => Some(Replacement {
@@ -152,6 +204,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "complex128",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "clongfloat"] => Some(Replacement {
@@ -159,6 +212,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "clongdouble",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "compat"] => Some(Replacement {
@@ -172,6 +226,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "complex128",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "DataSource"] => Some(Replacement {
@@ -179,6 +234,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib.npyio",
name: "DataSource",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "deprecate"] => Some(Replacement {
@@ -222,6 +278,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "float64",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "geterrobj"] => Some(Replacement {
@@ -235,6 +292,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "Inf"] => Some(Replacement {
@@ -242,6 +300,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "Infinity"] => Some(Replacement {
@@ -249,6 +308,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "infty"] => Some(Replacement {
@@ -256,6 +316,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "issctype"] => Some(Replacement {
@@ -275,6 +336,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "issubdtype",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "mat"] => Some(Replacement {
@@ -282,6 +344,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "asmatrix",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "maximum_sctype"] => Some(Replacement {
@@ -295,6 +358,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "nan",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "nbytes"] => Some(Replacement {
@@ -320,6 +384,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "clongdouble",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "longfloat"] => Some(Replacement {
@@ -327,6 +392,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "longdouble",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "lookfor"] => Some(Replacement {
@@ -346,6 +412,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "inf",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "PZERO"] => Some(Replacement {
@@ -369,6 +436,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "round",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "safe_eval"] => Some(Replacement {
@@ -376,6 +444,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "ast",
name: "literal_eval",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "sctype2char"] => Some(Replacement {
@@ -407,6 +476,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "complex64",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "string_"] => Some(Replacement {
@@ -414,6 +484,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "bytes_",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "source"] => Some(Replacement {
@@ -421,6 +492,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "inspect",
name: "getsource",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "tracemalloc_domain"] => Some(Replacement {
@@ -428,6 +500,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy.lib",
name: "tracemalloc_domain",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "unicode_"] => Some(Replacement {
@@ -435,6 +508,7 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
details: Details::AutoImport {
path: "numpy",
name: "str_",
compatibility: Compatibility::BackwardsCompatible,
},
}),
["numpy", "who"] => Some(Replacement {
@@ -451,11 +525,16 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
Numpy2Deprecation {
existing: replacement.existing.to_string(),
migration_guide: replacement.details.guideline(),
code_action: replacement.details.code_action(),
},
expr.range(),
);
match replacement.details {
Details::AutoImport { path, name } => {
Details::AutoImport {
path,
name,
compatibility,
} => {
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(path, name),
@@ -463,7 +542,14 @@ pub(crate) fn numpy_2_0_deprecation(checker: &mut Checker, expr: &Expr) {
checker.semantic(),
)?;
let replacement_edit = Edit::range_replacement(binding, expr.range());
Ok(Fix::safe_edits(import_edit, [replacement_edit]))
Ok(match compatibility {
Compatibility::BackwardsCompatible => {
Fix::safe_edits(import_edit, [replacement_edit])
}
Compatibility::Breaking => {
Fix::unsafe_edits(import_edit, [replacement_edit])
}
})
});
}
Details::AutoPurePython { python_expr } => diagnostic.set_fix(Fix::safe_edit(

View File

@@ -10,7 +10,7 @@ NPY201.py:4:5: NPY201 [*] `np.add_docstring` will be removed in NumPy 2.0. Use `
5 |
6 | np.add_newdoc
|
= help: Use `numpy.lib.add_docstring` instead.
= help: Replace with `numpy.lib.add_docstring`
Fix
1 |+from numpy.lib import add_docstring
@@ -32,7 +32,7 @@ NPY201.py:6:5: NPY201 [*] `np.add_newdoc` will be removed in NumPy 2.0. Use `num
7 |
8 | np.add_newdoc_ufunc
|
= help: Use `numpy.lib.add_newdoc` instead.
= help: Replace with `numpy.lib.add_newdoc`
Fix
1 |+from numpy.lib import add_newdoc
@@ -56,7 +56,6 @@ NPY201.py:8:5: NPY201 `np.add_newdoc_ufunc` will be removed in NumPy 2.0. `add_n
9 |
10 | np.asfarray([1,2,3])
|
= help: `add_newdoc_ufunc` is an internal function.
NPY201.py:10:5: NPY201 `np.asfarray` will be removed in NumPy 2.0. Use `np.asarray` with a `float` dtype instead.
|
@@ -67,9 +66,8 @@ NPY201.py:10:5: NPY201 `np.asfarray` will be removed in NumPy 2.0. Use `np.asarr
11 |
12 | np.byte_bounds(np.array([1,2,3]))
|
= help: Use `np.asarray` with a `float` dtype instead.
NPY201.py:12:5: NPY201 [*] `np.byte_bounds` will be removed in NumPy 2.0. Use `numpy.lib.array_utils.byte_bounds` instead.
NPY201.py:12:5: NPY201 [*] `np.byte_bounds` will be removed in NumPy 2.0. Use `numpy.lib.array_utils.byte_bounds` on NumPy 2.0, or ignore this warning on earlier versions.
|
10 | np.asfarray([1,2,3])
11 |
@@ -78,9 +76,9 @@ NPY201.py:12:5: NPY201 [*] `np.byte_bounds` will be removed in NumPy 2.0. Use `n
13 |
14 | np.cast
|
= help: Use `numpy.lib.array_utils.byte_bounds` instead.
= help: Replace with `numpy.lib.array_utils.byte_bounds` (requires NumPy 2.0 or greater)
Fix
Suggested fix
1 |+from numpy.lib.array_utils import byte_bounds
1 2 | def func():
2 3 | import numpy as np
@@ -104,7 +102,6 @@ NPY201.py:14:5: NPY201 `np.cast` will be removed in NumPy 2.0. Use `np.asarray(a
15 |
16 | np.cfloat(12+34j)
|
= help: Use `np.asarray(arr, dtype=dtype)` instead.
NPY201.py:16:5: NPY201 [*] `np.cfloat` will be removed in NumPy 2.0. Use `numpy.complex128` instead.
|
@@ -115,7 +112,7 @@ NPY201.py:16:5: NPY201 [*] `np.cfloat` will be removed in NumPy 2.0. Use `numpy.
17 |
18 | np.clongfloat(12+34j)
|
= help: Use `numpy.complex128` instead.
= help: Replace with `numpy.complex128`
Fix
13 13 |
@@ -136,7 +133,7 @@ NPY201.py:18:5: NPY201 [*] `np.clongfloat` will be removed in NumPy 2.0. Use `nu
19 |
20 | np.compat
|
= help: Use `numpy.clongdouble` instead.
= help: Replace with `numpy.clongdouble`
Fix
15 15 |
@@ -157,7 +154,6 @@ NPY201.py:20:5: NPY201 `np.compat` will be removed in NumPy 2.0. Python 2 is no
21 |
22 | np.complex_(12+34j)
|
= help: Python 2 is no longer supported.
NPY201.py:22:5: NPY201 [*] `np.complex_` will be removed in NumPy 2.0. Use `numpy.complex128` instead.
|
@@ -168,7 +164,7 @@ NPY201.py:22:5: NPY201 [*] `np.complex_` will be removed in NumPy 2.0. Use `nump
23 |
24 | np.DataSource
|
= help: Use `numpy.complex128` instead.
= help: Replace with `numpy.complex128`
Fix
19 19 |
@@ -189,7 +185,7 @@ NPY201.py:24:5: NPY201 [*] `np.DataSource` will be removed in NumPy 2.0. Use `nu
25 |
26 | np.deprecate
|
= help: Use `numpy.lib.npyio.DataSource` instead.
= help: Replace with `numpy.lib.npyio.DataSource`
Fix
1 |+from numpy.lib.npyio import DataSource
@@ -215,7 +211,6 @@ NPY201.py:26:5: NPY201 `np.deprecate` will be removed in NumPy 2.0. Emit `Deprec
27 |
28 | np.deprecate_with_doc
|
= help: Emit `DeprecationWarning` with `warnings.warn` directly, or use `typing.deprecated`.
NPY201.py:28:5: NPY201 `np.deprecate_with_doc` will be removed in NumPy 2.0. Emit `DeprecationWarning` with `warnings.warn` directly, or use `typing.deprecated`.
|
@@ -226,7 +221,6 @@ NPY201.py:28:5: NPY201 `np.deprecate_with_doc` will be removed in NumPy 2.0. Emi
29 |
30 | np.disp(10)
|
= help: Emit `DeprecationWarning` with `warnings.warn` directly, or use `typing.deprecated`.
NPY201.py:30:5: NPY201 `np.disp` will be removed in NumPy 2.0. Use a dedicated print function instead.
|
@@ -237,7 +231,6 @@ NPY201.py:30:5: NPY201 `np.disp` will be removed in NumPy 2.0. Use a dedicated p
31 |
32 | np.fastCopyAndTranspose
|
= help: Use a dedicated print function instead.
NPY201.py:32:5: NPY201 `np.fastCopyAndTranspose` will be removed in NumPy 2.0. Use `arr.T.copy()` instead.
|
@@ -248,7 +241,6 @@ NPY201.py:32:5: NPY201 `np.fastCopyAndTranspose` will be removed in NumPy 2.0. U
33 |
34 | np.find_common_type
|
= help: Use `arr.T.copy()` instead.
NPY201.py:34:5: NPY201 `np.find_common_type` will be removed in NumPy 2.0. Use `numpy.promote_types` or `numpy.result_type` instead. To achieve semantics for the `scalar_types` argument, use `numpy.result_type` and pass the Python values `0`, `0.0`, or `0j`.
|
@@ -259,9 +251,8 @@ NPY201.py:34:5: NPY201 `np.find_common_type` will be removed in NumPy 2.0. Use `
35 |
36 | np.get_array_wrap
|
= help: Use `numpy.promote_types` or `numpy.result_type` instead. To achieve semantics for the `scalar_types` argument, use `numpy.result_type` and pass the Python values `0`, `0.0`, or `0j`.
NPY201.py:36:5: NPY201 `np.get_array_wrap` will be removed without replacement in NumPy 2.0.
NPY201.py:36:5: NPY201 `np.get_array_wrap` will be removed without replacement in NumPy 2.0
|
34 | np.find_common_type
35 |
@@ -280,7 +271,7 @@ NPY201.py:38:5: NPY201 [*] `np.float_` will be removed in NumPy 2.0. Use `numpy.
39 |
40 | np.geterrobj
|
= help: Use `numpy.float64` instead.
= help: Replace with `numpy.float64`
Fix
35 35 |
@@ -301,7 +292,6 @@ NPY201.py:40:5: NPY201 `np.geterrobj` will be removed in NumPy 2.0. Use the `np.
41 |
42 | np.Inf
|
= help: Use the `np.errstate` context manager instead.
NPY201.py:42:5: NPY201 [*] `np.Inf` will be removed in NumPy 2.0. Use `numpy.inf` instead.
|
@@ -312,7 +302,7 @@ NPY201.py:42:5: NPY201 [*] `np.Inf` will be removed in NumPy 2.0. Use `numpy.inf
43 |
44 | np.Infinity
|
= help: Use `numpy.inf` instead.
= help: Replace with `numpy.inf`
Fix
39 39 |
@@ -333,7 +323,7 @@ NPY201.py:44:5: NPY201 [*] `np.Infinity` will be removed in NumPy 2.0. Use `nump
45 |
46 | np.infty
|
= help: Use `numpy.inf` instead.
= help: Replace with `numpy.inf`
Fix
41 41 |
@@ -354,7 +344,7 @@ NPY201.py:46:5: NPY201 [*] `np.infty` will be removed in NumPy 2.0. Use `numpy.i
47 |
48 | np.issctype
|
= help: Use `numpy.inf` instead.
= help: Replace with `numpy.inf`
Fix
43 43 |
@@ -366,7 +356,7 @@ NPY201.py:46:5: NPY201 [*] `np.infty` will be removed in NumPy 2.0. Use `numpy.i
48 48 | np.issctype
49 49 |
NPY201.py:48:5: NPY201 `np.issctype` will be removed without replacement in NumPy 2.0.
NPY201.py:48:5: NPY201 `np.issctype` will be removed without replacement in NumPy 2.0
|
46 | np.infty
47 |
@@ -385,7 +375,7 @@ NPY201.py:50:5: NPY201 [*] `np.issubclass_` will be removed in NumPy 2.0. Use `i
51 |
52 | np.issubsctype
|
= help: Use `issubclass` instead.
= help: Replace with `issubclass`
Fix
47 47 |
@@ -406,7 +396,7 @@ NPY201.py:52:5: NPY201 [*] `np.issubsctype` will be removed in NumPy 2.0. Use `n
53 |
54 | np.mat
|
= help: Use `numpy.issubdtype` instead.
= help: Replace with `numpy.issubdtype`
Fix
49 49 |
@@ -427,7 +417,7 @@ NPY201.py:54:5: NPY201 [*] `np.mat` will be removed in NumPy 2.0. Use `numpy.asm
55 |
56 | np.maximum_sctype
|
= help: Use `numpy.asmatrix` instead.
= help: Replace with `numpy.asmatrix`
Fix
51 51 |
@@ -439,7 +429,7 @@ NPY201.py:54:5: NPY201 [*] `np.mat` will be removed in NumPy 2.0. Use `numpy.asm
56 56 | np.maximum_sctype
57 57 |
NPY201.py:56:5: NPY201 `np.maximum_sctype` will be removed without replacement in NumPy 2.0.
NPY201.py:56:5: NPY201 `np.maximum_sctype` will be removed without replacement in NumPy 2.0
|
54 | np.mat
55 |
@@ -458,7 +448,7 @@ NPY201.py:58:5: NPY201 [*] `np.NaN` will be removed in NumPy 2.0. Use `numpy.nan
59 |
60 | np.nbytes[np.int64]
|
= help: Use `numpy.nan` instead.
= help: Replace with `numpy.nan`
Fix
55 55 |
@@ -479,7 +469,6 @@ NPY201.py:60:5: NPY201 `np.nbytes` will be removed in NumPy 2.0. Use `np.dtype(<
61 |
62 | np.NINF
|
= help: Use `np.dtype(<dtype>).itemsize` instead.
NPY201.py:62:5: NPY201 [*] `np.NINF` will be removed in NumPy 2.0. Use `-np.inf` instead.
|
@@ -490,7 +479,7 @@ NPY201.py:62:5: NPY201 [*] `np.NINF` will be removed in NumPy 2.0. Use `-np.inf`
63 |
64 | np.NZERO
|
= help: Use `-np.inf` instead.
= help: Replace with `-np.inf`
Fix
59 59 |
@@ -511,7 +500,7 @@ NPY201.py:64:5: NPY201 [*] `np.NZERO` will be removed in NumPy 2.0. Use `-0.0` i
65 |
66 | np.longcomplex(12+34j)
|
= help: Use `-0.0` instead.
= help: Replace with `-0.0`
Fix
61 61 |
@@ -532,7 +521,7 @@ NPY201.py:66:5: NPY201 [*] `np.longcomplex` will be removed in NumPy 2.0. Use `n
67 |
68 | np.longfloat(12+34j)
|
= help: Use `numpy.clongdouble` instead.
= help: Replace with `numpy.clongdouble`
Fix
63 63 |
@@ -553,7 +542,7 @@ NPY201.py:68:5: NPY201 [*] `np.longfloat` will be removed in NumPy 2.0. Use `num
69 |
70 | np.lookfor
|
= help: Use `numpy.longdouble` instead.
= help: Replace with `numpy.longdouble`
Fix
65 65 |
@@ -574,9 +563,8 @@ NPY201.py:70:5: NPY201 `np.lookfor` will be removed in NumPy 2.0. Search NumPy
71 |
72 | np.obj2sctype(int)
|
= help: Search NumPy's documentation directly.
NPY201.py:72:5: NPY201 `np.obj2sctype` will be removed without replacement in NumPy 2.0.
NPY201.py:72:5: NPY201 `np.obj2sctype` will be removed without replacement in NumPy 2.0
|
70 | np.lookfor
71 |
@@ -595,7 +583,7 @@ NPY201.py:74:5: NPY201 [*] `np.PINF` will be removed in NumPy 2.0. Use `numpy.in
75 |
76 | np.PZERO
|
= help: Use `numpy.inf` instead.
= help: Replace with `numpy.inf`
Fix
71 71 |
@@ -616,7 +604,7 @@ NPY201.py:76:5: NPY201 [*] `np.PZERO` will be removed in NumPy 2.0. Use `0.0` in
77 |
78 | np.recfromcsv
|
= help: Use `0.0` instead.
= help: Replace with `0.0`
Fix
73 73 |
@@ -637,7 +625,6 @@ NPY201.py:78:5: NPY201 `np.recfromcsv` will be removed in NumPy 2.0. Use `np.gen
79 |
80 | np.recfromtxt
|
= help: Use `np.genfromtxt` with comma delimiter instead.
NPY201.py:80:5: NPY201 `np.recfromtxt` will be removed in NumPy 2.0. Use `np.genfromtxt` instead.
|
@@ -648,7 +635,6 @@ NPY201.py:80:5: NPY201 `np.recfromtxt` will be removed in NumPy 2.0. Use `np.gen
81 |
82 | np.round_(12.34)
|
= help: Use `np.genfromtxt` instead.
NPY201.py:82:5: NPY201 [*] `np.round_` will be removed in NumPy 2.0. Use `numpy.round` instead.
|
@@ -659,7 +645,7 @@ NPY201.py:82:5: NPY201 [*] `np.round_` will be removed in NumPy 2.0. Use `numpy.
83 |
84 | np.safe_eval
|
= help: Use `numpy.round` instead.
= help: Replace with `numpy.round`
Fix
79 79 |
@@ -680,7 +666,7 @@ NPY201.py:84:5: NPY201 [*] `np.safe_eval` will be removed in NumPy 2.0. Use `ast
85 |
86 | np.sctype2char
|
= help: Use `ast.literal_eval` instead.
= help: Replace with `ast.literal_eval`
Fix
1 |+from ast import literal_eval
@@ -697,7 +683,7 @@ NPY201.py:84:5: NPY201 [*] `np.safe_eval` will be removed in NumPy 2.0. Use `ast
86 87 | np.sctype2char
87 88 |
NPY201.py:86:5: NPY201 `np.sctype2char` will be removed without replacement in NumPy 2.0.
NPY201.py:86:5: NPY201 `np.sctype2char` will be removed without replacement in NumPy 2.0
|
84 | np.safe_eval
85 |
@@ -707,7 +693,7 @@ NPY201.py:86:5: NPY201 `np.sctype2char` will be removed without replacement in N
88 | np.sctypes
|
NPY201.py:88:5: NPY201 `np.sctypes` will be removed without replacement in NumPy 2.0.
NPY201.py:88:5: NPY201 `np.sctypes` will be removed without replacement in NumPy 2.0
|
86 | np.sctype2char
87 |
@@ -726,7 +712,6 @@ NPY201.py:90:5: NPY201 `np.seterrobj` will be removed in NumPy 2.0. Use the `np.
91 |
92 | np.set_numeric_ops
|
= help: Use the `np.errstate` context manager instead.
NPY201.py:94:5: NPY201 `np.set_string_function` will be removed in NumPy 2.0. Use `np.set_printoptions` for custom printing of NumPy objects.
|
@@ -737,7 +722,6 @@ NPY201.py:94:5: NPY201 `np.set_string_function` will be removed in NumPy 2.0. Us
95 |
96 | np.singlecomplex(12+1j)
|
= help: Use `np.set_printoptions` for custom printing of NumPy objects.
NPY201.py:96:5: NPY201 [*] `np.singlecomplex` will be removed in NumPy 2.0. Use `numpy.complex64` instead.
|
@@ -748,7 +732,7 @@ NPY201.py:96:5: NPY201 [*] `np.singlecomplex` will be removed in NumPy 2.0. Use
97 |
98 | np.string_("asdf")
|
= help: Use `numpy.complex64` instead.
= help: Replace with `numpy.complex64`
Fix
93 93 |
@@ -769,7 +753,7 @@ NPY201.py:98:5: NPY201 [*] `np.string_` will be removed in NumPy 2.0. Use `numpy
99 |
100 | np.source
|
= help: Use `numpy.bytes_` instead.
= help: Replace with `numpy.bytes_`
Fix
95 95 |
@@ -790,7 +774,7 @@ NPY201.py:100:5: NPY201 [*] `np.source` will be removed in NumPy 2.0. Use `inspe
101 |
102 | np.tracemalloc_domain
|
= help: Use `inspect.getsource` instead.
= help: Replace with `inspect.getsource`
Fix
1 |+from inspect import getsource
@@ -816,7 +800,7 @@ NPY201.py:102:5: NPY201 [*] `np.tracemalloc_domain` will be removed in NumPy 2.0
103 |
104 | np.unicode_("asf")
|
= help: Use `numpy.lib.tracemalloc_domain` instead.
= help: Replace with `numpy.lib.tracemalloc_domain`
Fix
1 |+from numpy.lib import tracemalloc_domain
@@ -842,7 +826,7 @@ NPY201.py:104:5: NPY201 [*] `np.unicode_` will be removed in NumPy 2.0. Use `num
105 |
106 | np.who()
|
= help: Use `numpy.str_` instead.
= help: Replace with `numpy.str_`
Fix
101 101 |
@@ -860,6 +844,5 @@ NPY201.py:106:5: NPY201 `np.who` will be removed in NumPy 2.0. Use an IDE variab
106 | np.who()
| ^^^^^^ NPY201
|
= help: Use an IDE variable explorer or `locals()` instead.

View File

@@ -61,12 +61,14 @@ impl Violation for TripleSingleQuotes {
pub(crate) fn triple_quotes(checker: &mut Checker, docstring: &Docstring) {
let leading_quote = docstring.leading_quote();
let prefixes = docstring
.leading_quote()
let prefixes = leading_quote
.trim_end_matches(|c| c == '\'' || c == '"')
.to_owned();
let expected_quote = if docstring.body().contains("\"\"\"") {
if docstring.body().contains("\'\'\'") {
return;
}
Quote::Single
} else {
Quote::Double

View File

@@ -1,5 +1,6 @@
---
source: crates/ruff_linter/src/rules/pydocstyle/mod.rs
assertion_line: 134
---
D300.py:6:5: D300 Use triple double quotes `"""`
|
@@ -23,5 +24,8 @@ D300.py:10:5: D300 [*] Use triple double quotes `"""`
9 9 | def contains_quote():
10 |- 'Sum"\\mary.'
10 |+ """Sum"\\mary."""
11 11 |
12 12 |
13 13 | # OK

View File

@@ -60,6 +60,7 @@ mod tests {
#[test_case(Rule::ReplaceStdoutStderr, Path::new("UP022.py"))]
#[test_case(Rule::ReplaceUniversalNewlines, Path::new("UP021.py"))]
#[test_case(Rule::SuperCallWithParameters, Path::new("UP008.py"))]
#[test_case(Rule::TimeoutErrorAlias, Path::new("UP041.py"))]
#[test_case(Rule::TypeOfPrimitive, Path::new("UP003.py"))]
#[test_case(Rule::TypingTextStrAlias, Path::new("UP019.py"))]
#[test_case(Rule::UTF8EncodingDeclaration, Path::new("UP009_0.py"))]

View File

@@ -191,15 +191,21 @@ fn try_convert_to_f_string(
summary: &mut FormatSummaryValues,
locator: &Locator,
) -> Result<Option<String>> {
let contents = locator.slice(range);
// Strip the unicode prefix. It's redundant in Python 3, and invalid when used
// with f-strings.
let contents = locator.slice(range);
let contents = if contents.starts_with('U') || contents.starts_with('u') {
&contents[1..]
} else {
contents
};
// Temporarily strip the raw prefix, if present. It will be prepended to the result, before the
// 'f', to match the prefix order both the Ruff formatter (and Black) use when formatting code.
let raw = contents.starts_with('R') || contents.starts_with('r');
let contents = if raw { &contents[1..] } else { contents };
// Remove the leading and trailing quotes.
let leading_quote = leading_quote(contents).context("Unable to identify leading quote")?;
let trailing_quote = trailing_quote(contents).context("Unable to identify trailing quote")?;
@@ -291,7 +297,10 @@ fn try_convert_to_f_string(
}
// Construct the format string.
let mut contents = String::with_capacity(1 + converted.len());
let mut contents = String::with_capacity(usize::from(raw) + 1 + converted.len());
if raw {
contents.push('r');
}
contents.push('f');
contents.push_str(leading_quote);
contents.push_str(&converted);

View File

@@ -20,6 +20,7 @@ pub(crate) use redundant_open_modes::*;
pub(crate) use replace_stdout_stderr::*;
pub(crate) use replace_universal_newlines::*;
pub(crate) use super_call_with_parameters::*;
pub(crate) use timeout_error_alias::*;
pub(crate) use type_of_primitive::*;
pub(crate) use typing_text_str_alias::*;
pub(crate) use unicode_kind_prefix::*;
@@ -59,6 +60,7 @@ mod redundant_open_modes;
mod replace_stdout_stderr;
mod replace_universal_newlines;
mod super_call_with_parameters;
mod timeout_error_alias;
mod type_of_primitive;
mod typing_text_str_alias;
mod unicode_kind_prefix;

View File

@@ -0,0 +1,192 @@
use ruff_python_ast::{self as ast, ExceptHandler, Expr, ExprContext};
use ruff_text_size::{Ranged, TextRange};
use crate::fix::edits::pad;
use ruff_diagnostics::{AlwaysFixableViolation, Diagnostic, Edit, Fix};
use ruff_macros::{derive_message_formats, violation};
use ruff_python_ast::call_path::compose_call_path;
use ruff_python_semantic::SemanticModel;
use crate::checkers::ast::Checker;
use crate::settings::types::PythonVersion;
/// ## What it does
/// Checks for uses of exceptions that alias `TimeoutError`.
///
/// ## Why is this bad?
/// `TimeoutError` is the builtin error type raised when a system function
/// times out at the system level.
///
/// In Python 3.10, `socket.timeout` was aliased to `TimeoutError`. In Python
/// 3.11, `asyncio.TimeoutError` was aliased to `TimeoutError`.
///
/// These aliases remain in place for compatibility with older versions of
/// Python, but may be removed in future versions.
///
/// Prefer using `TimeoutError` directly, as it is more idiomatic and future-proof.
///
/// ## Example
/// ```python
/// raise asyncio.TimeoutError
/// ```
///
/// Use instead:
/// ```python
/// raise TimeoutError
/// ```
///
/// ## References
/// - [Python documentation: `TimeoutError`](https://docs.python.org/3/library/exceptions.html#TimeoutError)
#[violation]
pub struct TimeoutErrorAlias {
name: Option<String>,
}
impl AlwaysFixableViolation for TimeoutErrorAlias {
#[derive_message_formats]
fn message(&self) -> String {
format!("Replace aliased errors with `TimeoutError`")
}
fn fix_title(&self) -> String {
let TimeoutErrorAlias { name } = self;
match name {
None => "Replace with builtin `TimeoutError`".to_string(),
Some(name) => format!("Replace `{name}` with builtin `TimeoutError`"),
}
}
}
/// Return `true` if an [`Expr`] is an alias of `TimeoutError`.
fn is_alias(expr: &Expr, semantic: &SemanticModel, target_version: PythonVersion) -> bool {
semantic.resolve_call_path(expr).is_some_and(|call_path| {
if target_version >= PythonVersion::Py311 {
matches!(call_path.as_slice(), [""] | ["asyncio", "TimeoutError"])
} else {
matches!(
call_path.as_slice(),
[""] | ["asyncio", "TimeoutError"] | ["socket", "timeout"]
)
}
})
}
/// Return `true` if an [`Expr`] is `TimeoutError`.
fn is_timeout_error(expr: &Expr, semantic: &SemanticModel) -> bool {
semantic
.resolve_call_path(expr)
.is_some_and(|call_path| matches!(call_path.as_slice(), ["", "TimeoutError"]))
}
/// Create a [`Diagnostic`] for a single target, like an [`Expr::Name`].
fn atom_diagnostic(checker: &mut Checker, target: &Expr) {
let mut diagnostic = Diagnostic::new(
TimeoutErrorAlias {
name: compose_call_path(target),
},
target.range(),
);
if checker.semantic().is_builtin("TimeoutError") {
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
"TimeoutError".to_string(),
target.range(),
)));
}
checker.diagnostics.push(diagnostic);
}
/// Create a [`Diagnostic`] for a tuple of expressions.
fn tuple_diagnostic(checker: &mut Checker, tuple: &ast::ExprTuple, aliases: &[&Expr]) {
let mut diagnostic = Diagnostic::new(TimeoutErrorAlias { name: None }, tuple.range());
if checker.semantic().is_builtin("TimeoutError") {
// Filter out any `TimeoutError` aliases.
let mut remaining: Vec<Expr> = tuple
.elts
.iter()
.filter_map(|elt| {
if aliases.contains(&elt) {
None
} else {
Some(elt.clone())
}
})
.collect();
// If `TimeoutError` itself isn't already in the tuple, add it.
if tuple
.elts
.iter()
.all(|elt| !is_timeout_error(elt, checker.semantic()))
{
let node = ast::ExprName {
id: "TimeoutError".into(),
ctx: ExprContext::Load,
range: TextRange::default(),
};
remaining.insert(0, node.into());
}
let content = if remaining.len() == 1 {
"TimeoutError".to_string()
} else {
let node = ast::ExprTuple {
elts: remaining,
ctx: ExprContext::Load,
range: TextRange::default(),
};
format!("({})", checker.generator().expr(&node.into()))
};
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
pad(content, tuple.range(), checker.locator()),
tuple.range(),
)));
}
checker.diagnostics.push(diagnostic);
}
/// UP041
pub(crate) fn timeout_error_alias_handlers(checker: &mut Checker, handlers: &[ExceptHandler]) {
for handler in handlers {
let ExceptHandler::ExceptHandler(ast::ExceptHandlerExceptHandler { type_, .. }) = handler;
let Some(expr) = type_.as_ref() else {
continue;
};
match expr.as_ref() {
Expr::Name(_) | Expr::Attribute(_) => {
if is_alias(expr, checker.semantic(), checker.settings.target_version) {
atom_diagnostic(checker, expr);
}
}
Expr::Tuple(tuple) => {
// List of aliases to replace with `TimeoutError`.
let mut aliases: Vec<&Expr> = vec![];
for elt in &tuple.elts {
if is_alias(elt, checker.semantic(), checker.settings.target_version) {
aliases.push(elt);
}
}
if !aliases.is_empty() {
tuple_diagnostic(checker, tuple, &aliases);
}
}
_ => {}
}
}
}
/// UP041
pub(crate) fn timeout_error_alias_call(checker: &mut Checker, func: &Expr) {
if is_alias(func, checker.semantic(), checker.settings.target_version) {
atom_diagnostic(checker, func);
}
}
/// UP041
pub(crate) fn timeout_error_alias_raise(checker: &mut Checker, expr: &Expr) {
if matches!(expr, Expr::Name(_) | Expr::Attribute(_)) {
if is_alias(expr, checker.semantic(), checker.settings.target_version) {
atom_diagnostic(checker, expr);
}
}
}

View File

@@ -353,7 +353,7 @@ UP032_0.py:37:1: UP032 [*] Use f-string instead of `format` call
35 35 | "foo{}".format(1)
36 36 |
37 |-r"foo{}".format(1)
37 |+fr"foo{1}"
37 |+rf"foo{1}"
38 38 |
39 39 | x = "{a}".format(a=1)
40 40 |

View File

@@ -0,0 +1,104 @@
---
source: crates/ruff_linter/src/rules/pyupgrade/mod.rs
---
UP041.py:5:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
3 | try:
4 | pass
5 | except asyncio.TimeoutError:
| ^^^^^^^^^^^^^^^^^^^^ UP041
6 | pass
|
= help: Replace `asyncio.TimeoutError` with builtin `TimeoutError`
Fix
2 2 | # These should be fixed
3 3 | try:
4 4 | pass
5 |-except asyncio.TimeoutError:
5 |+except TimeoutError:
6 6 | pass
7 7 |
8 8 | try:
UP041.py:17:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
15 | try:
16 | pass
17 | except (asyncio.TimeoutError,):
| ^^^^^^^^^^^^^^^^^^^^^^^ UP041
18 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
14 14 |
15 15 | try:
16 16 | pass
17 |-except (asyncio.TimeoutError,):
17 |+except TimeoutError:
18 18 | pass
19 19 |
20 20 | try:
UP041.py:27:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
25 | try:
26 | pass
27 | except (asyncio.TimeoutError, socket.timeout,):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ UP041
28 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
24 24 |
25 25 | try:
26 26 | pass
27 |-except (asyncio.TimeoutError, socket.timeout,):
27 |+except (TimeoutError, socket.timeout):
28 28 | pass
29 29 |
30 30 | # Should be kept in parentheses (because multiple)
UP041.py:34:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
32 | try:
33 | pass
34 | except (asyncio.TimeoutError, socket.timeout, KeyError, TimeoutError):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ UP041
35 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
31 31 |
32 32 | try:
33 33 | pass
34 |-except (asyncio.TimeoutError, socket.timeout, KeyError, TimeoutError):
34 |+except (socket.timeout, KeyError, TimeoutError):
35 35 | pass
36 36 |
37 37 | # First should change, second should not
UP041.py:42:8: UP041 [*] Replace aliased errors with `TimeoutError`
|
40 | try:
41 | pass
42 | except (asyncio.TimeoutError, error):
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ UP041
43 | pass
|
= help: Replace with builtin `TimeoutError`
Fix
39 39 | from .mmap import error
40 40 | try:
41 41 | pass
42 |-except (asyncio.TimeoutError, error):
42 |+except (TimeoutError, error):
43 43 | pass
44 44 |
45 45 | # These should not change

View File

@@ -17,7 +17,7 @@ mod tests {
use crate::pyproject_toml::lint_pyproject_toml;
use crate::registry::Rule;
use crate::settings::resolve_per_file_ignores;
use crate::settings::types::{PerFileIgnore, PythonVersion};
use crate::settings::types::{PerFileIgnore, PreviewMode, PythonVersion};
use crate::test::{test_path, test_resource_path};
use crate::{assert_messages, settings};
@@ -88,6 +88,24 @@ mod tests {
Ok(())
}
#[test]
fn preview_confusables() -> Result<()> {
let diagnostics = test_path(
Path::new("ruff/confusables.py"),
&settings::LinterSettings {
preview: PreviewMode::Enabled,
allowed_confusables: FxHashSet::from_iter(['', 'ρ', '']),
..settings::LinterSettings::for_rules(vec![
Rule::AmbiguousUnicodeCharacterString,
Rule::AmbiguousUnicodeCharacterDocstring,
Rule::AmbiguousUnicodeCharacterComment,
])
},
)?;
assert_messages!(diagnostics);
Ok(())
}
#[test]
fn noqa() -> Result<()> {
let diagnostics = test_path(

View File

@@ -13,12 +13,20 @@ use crate::rules::ruff::rules::Context;
use crate::settings::LinterSettings;
/// ## What it does
/// Checks for ambiguous unicode characters in strings.
/// Checks for ambiguous Unicode characters in strings.
///
/// ## Why is this bad?
/// The use of ambiguous unicode characters can confuse readers and cause
/// Some Unicode characters are visually similar to ASCII characters, but have
/// different code points. For example, `LATIN CAPITAL LETTER A` (`U+0041`) is
/// visually similar, but not identical, to the ASCII character `A`.
///
/// The use of ambiguous Unicode characters can confuse readers and cause
/// subtle bugs.
///
/// In [preview], this rule will also flag Unicode characters that are
/// confusable with other, non-preferred Unicode characters. For example, the
/// spec recommends `GREEK CAPITAL LETTER OMEGA` over `OHM SIGN`.
///
/// ## Example
/// ```python
/// print("Ηello, world!") # "Η" is the Greek eta (`U+0397`).
@@ -28,6 +36,8 @@ use crate::settings::LinterSettings;
/// ```python
/// print("Hello, world!") # "H" is the Latin capital H (`U+0048`).
/// ```
///
/// [preview]: https://docs.astral.sh/ruff/preview/
#[violation]
pub struct AmbiguousUnicodeCharacterString {
confusable: char,
@@ -50,12 +60,20 @@ impl Violation for AmbiguousUnicodeCharacterString {
}
/// ## What it does
/// Checks for ambiguous unicode characters in docstrings.
/// Checks for ambiguous Unicode characters in docstrings.
///
/// ## Why is this bad?
/// The use of ambiguous unicode characters can confuse readers and cause
/// Some Unicode characters are visually similar to ASCII characters, but have
/// different code points. For example, `LATIN CAPITAL LETTER A` (`U+0041`) is
/// visually similar, but not identical, to the ASCII character `A`.
///
/// The use of ambiguous Unicode characters can confuse readers and cause
/// subtle bugs.
///
/// In [preview], this rule will also flag Unicode characters that are
/// confusable with other, non-preferred Unicode characters. For example, the
/// spec recommends `GREEK CAPITAL LETTER OMEGA` over `OHM SIGN`.
///
/// ## Example
/// ```python
/// """A lovely docstring (with a `U+FF09` parenthesis."""
@@ -65,6 +83,8 @@ impl Violation for AmbiguousUnicodeCharacterString {
/// ```python
/// """A lovely docstring (with no strange parentheses)."""
/// ```
///
/// [preview]: https://docs.astral.sh/ruff/preview/
#[violation]
pub struct AmbiguousUnicodeCharacterDocstring {
confusable: char,
@@ -87,12 +107,20 @@ impl Violation for AmbiguousUnicodeCharacterDocstring {
}
/// ## What it does
/// Checks for ambiguous unicode characters in comments.
/// Checks for ambiguous Unicode characters in comments.
///
/// ## Why is this bad?
/// The use of ambiguous unicode characters can confuse readers and cause
/// Some Unicode characters are visually similar to ASCII characters, but have
/// different code points. For example, `LATIN CAPITAL LETTER A` (`U+0041`) is
/// visually similar, but not identical, to the ASCII character `A`.
///
/// The use of ambiguous Unicode characters can confuse readers and cause
/// subtle bugs.
///
/// In [preview], this rule will also flag Unicode characters that are
/// confusable with other, non-preferred Unicode characters. For example, the
/// spec recommends `GREEK CAPITAL LETTER OMEGA` over `OHM SIGN`.
///
/// ## Example
/// ```python
/// foo() # nоqa # "о" is Cyrillic (`U+043E`)
@@ -102,6 +130,8 @@ impl Violation for AmbiguousUnicodeCharacterDocstring {
/// ```python
/// foo() # noqa # "o" is Latin (`U+006F`)
/// ```
///
/// [preview]: https://docs.astral.sh/ruff/preview/
#[violation]
pub struct AmbiguousUnicodeCharacterComment {
confusable: char,
@@ -159,11 +189,13 @@ pub(crate) fn ambiguous_unicode_character(
// Check if the boundary character is itself an ambiguous unicode character, in which
// case, it's always included as a diagnostic.
if !current_char.is_ascii() {
if let Some(representant) = confusable(current_char as u32) {
if let Some(representant) = confusable(current_char as u32)
.filter(|representant| settings.preview.is_enabled() || representant.is_ascii())
{
let candidate = Candidate::new(
TextSize::try_from(relative_offset).unwrap() + range.start(),
current_char,
char::from_u32(representant).unwrap(),
representant,
);
if let Some(diagnostic) = candidate.into_diagnostic(context, settings) {
diagnostics.push(diagnostic);
@@ -173,12 +205,14 @@ pub(crate) fn ambiguous_unicode_character(
} else if current_char.is_ascii() {
// The current word contains at least one ASCII character.
word_flags |= WordFlags::ASCII;
} else if let Some(representant) = confusable(current_char as u32) {
} else if let Some(representant) = confusable(current_char as u32)
.filter(|representant| settings.preview.is_enabled() || representant.is_ascii())
{
// The current word contains an ambiguous unicode character.
word_candidates.push(Candidate::new(
TextSize::try_from(relative_offset).unwrap() + range.start(),
current_char,
char::from_u32(representant).unwrap(),
representant,
));
} else {
// The current word contains at least one unambiguous unicode character.

File diff suppressed because it is too large

View File

@@ -155,10 +155,4 @@ confusables.py:46:62: RUF003 Comment contains ambiguous `` (PHILIPPINE SINGLE
47 | }"
|
confusables.py:55:28: RUF001 String contains ambiguous `µ` (MICRO SIGN). Did you mean `μ` (GREEK SMALL LETTER MU)?
|
55 | assert getattr(Labware(), "µL") == 1.5
| ^ RUF001
|

View File

@@ -0,0 +1,164 @@
---
source: crates/ruff_linter/src/rules/ruff/mod.rs
---
confusables.py:1:6: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
1 | x = "𝐁ad string"
| ^ RUF001
2 | y = ""
|
confusables.py:6:56: RUF002 Docstring contains ambiguous `` (FULLWIDTH RIGHT PARENTHESIS). Did you mean `)` (RIGHT PARENTHESIS)?
|
5 | def f():
6 | """Here's a docstring with an unusual parenthesis: """
| ^^ RUF002
7 | # And here's a comment with an unusual punctuation mark:
8 | ...
|
confusables.py:7:62: RUF003 Comment contains ambiguous `` (PHILIPPINE SINGLE PUNCTUATION). Did you mean `/` (SOLIDUS)?
|
5 | def f():
6 | """Here's a docstring with an unusual parenthesis: """
7 | # And here's a comment with an unusual punctuation mark:
| ^ RUF003
8 | ...
|
confusables.py:17:6: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
17 | x = "𝐁ad string"
| ^ RUF001
18 | x = ""
|
confusables.py:26:10: RUF001 String contains ambiguous `α` (GREEK SMALL LETTER ALPHA). Did you mean `a` (LATIN SMALL LETTER A)?
|
24 | # The first word should be ignored, while the second should be included, since it
25 | # contains ASCII.
26 | x = "βα Bαd"
| ^ RUF001
27 |
28 | # The two characters should be flagged here. The first character is a "word"
|
confusables.py:31:6: RUF001 String contains ambiguous `Р` (CYRILLIC CAPITAL LETTER ER). Did you mean `P` (LATIN CAPITAL LETTER P)?
|
29 | # consisting of a single ambiguous character, while the second character is a "word
30 | # boundary" (whitespace) that it itself ambiguous.
31 | x = "Р усский"
| ^ RUF001
32 |
33 | # Same test cases as above but using f-strings instead:
|
confusables.py:31:7: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
29 | # consisting of a single ambiguous character, while the second character is a "word
30 | # boundary" (whitespace) that it itself ambiguous.
31 | x = "Р усский"
| ^ RUF001
32 |
33 | # Same test cases as above but using f-strings instead:
|
confusables.py:34:7: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
33 | # Same test cases as above but using f-strings instead:
34 | x = f"𝐁ad string"
| ^ RUF001
35 | x = f""
36 | x = f"Русский"
|
confusables.py:37:11: RUF001 String contains ambiguous `α` (GREEK SMALL LETTER ALPHA). Did you mean `a` (LATIN SMALL LETTER A)?
|
35 | x = f""
36 | x = f"Русский"
37 | x = f"βα Bαd"
| ^ RUF001
38 | x = f"Р усский"
|
confusables.py:38:7: RUF001 String contains ambiguous `Р` (CYRILLIC CAPITAL LETTER ER). Did you mean `P` (LATIN CAPITAL LETTER P)?
|
36 | x = f"Русский"
37 | x = f"βα Bαd"
38 | x = f"Р усский"
| ^ RUF001
39 |
40 | # Nested f-strings
|
confusables.py:38:8: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
36 | x = f"Русский"
37 | x = f"βα Bαd"
38 | x = f"Р усский"
| ^ RUF001
39 |
40 | # Nested f-strings
|
confusables.py:41:7: RUF001 String contains ambiguous `𝐁` (MATHEMATICAL BOLD CAPITAL B). Did you mean `B` (LATIN CAPITAL LETTER B)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:41:21: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:41:25: RUF001 String contains ambiguous `Р` (CYRILLIC CAPITAL LETTER ER). Did you mean `P` (LATIN CAPITAL LETTER P)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:41:26: RUF001 String contains ambiguous ` ` (EN QUAD). Did you mean ` ` (SPACE)?
|
40 | # Nested f-strings
41 | x = f"𝐁ad string {f" {f"Р усский"}"}"
| ^ RUF001
42 |
43 | # Comments inside f-strings
|
confusables.py:44:68: RUF003 Comment contains ambiguous `）` (FULLWIDTH RIGHT PARENTHESIS). Did you mean `)` (RIGHT PARENTHESIS)?
|
43 | # Comments inside f-strings
44 | x = f"string { # And here's a comment with an unusual parenthesis:
| ^^ RUF003
45 | # And here's a comment with a greek alpha:
46 | foo # And here's a comment with an unusual punctuation mark: ᜵
|
confusables.py:46:62: RUF003 Comment contains ambiguous `᜵` (PHILIPPINE SINGLE PUNCTUATION). Did you mean `/` (SOLIDUS)?
|
44 | x = f"string { # And here's a comment with an unusual parenthesis:
45 | # And here's a comment with a greek alpha:
46 | foo # And here's a comment with an unusual punctuation mark: ᜵
| ^ RUF003
47 | }"
|
confusables.py:55:28: RUF001 String contains ambiguous `µ` (MICRO SIGN). Did you mean `μ` (GREEK SMALL LETTER MU)?
|
55 | assert getattr(Labware(), "µL") == 1.5
| ^ RUF001
|

View File

@@ -1,6 +1,6 @@
[package]
name = "ruff_shrinking"
version = "0.1.3"
version = "0.1.4"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

View File

@@ -219,6 +219,28 @@ extend-include = ["*.ipynb"]
This will prompt Ruff to discover Jupyter Notebook (`.ipynb`) files in any specified
directories, then lint and format them accordingly.
If you'd prefer to either only lint or only format Jupyter Notebook files, you can use the
section-specific `exclude` option to do so. For example, the following would only lint Jupyter
Notebook files and not format them:
```toml
[tool.ruff]
extend-include = ["*.ipynb"]
[tool.ruff.format]
exclude = ["*.ipynb"]
```
And, conversely, the following would only format Jupyter Notebook files and not lint them:
```toml
[tool.ruff]
extend-include = ["*.ipynb"]
[tool.ruff.lint]
exclude = ["*.ipynb"]
```
Alternatively, pass the notebook file(s) to `ruff` on the command line directly. For example,
`ruff check /path/to/notebook.ipynb` will always lint `notebook.ipynb`. Similarly,
`ruff format /path/to/notebook.ipynb` will always format `notebook.ipynb`.
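As a quick sketch of that usage at the shell (the paths are placeholders):
```sh
ruff check /path/to/notebook.ipynb   # lints the notebook passed explicitly
ruff format /path/to/notebook.ipynb  # formats the notebook passed explicitly
```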

View File

@@ -12,18 +12,18 @@ which supports fix actions, import sorting, and more.
Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-commit`](https://github.com/astral-sh/ruff-pre-commit):
```yaml
# Run the Ruff formatter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.291
hooks:
- id: ruff-format
# Run the Ruff linter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.291
hooks:
- id: ruff
# Run the Ruff formatter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.291
hooks:
- id: ruff-format
```
To enable fixes, add the `--fix` argument to the linter:
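A minimal sketch of what such a hook entry could look like, assuming the same hook id and version as above (the exact block in the README may differ):
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.0.291
  hooks:
    - id: ruff
      args: [ --fix ]
```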
@@ -47,7 +47,7 @@ To run the hooks over Jupyter Notebooks too, add `jupyter` to the list of allowe
rev: v0.0.291
hooks:
- id: ruff
types_or: [python, pyi, jupyter]
types_or: [ python, pyi, jupyter ]
```
Ruff's lint hook should be placed after other formatting tools, such as Ruff's format hook, Black,

View File

@@ -284,13 +284,13 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
# Run the Ruff linter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.1.3
rev: v0.1.4
hooks:
- id: ruff
# Run the Ruff formatter.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.1.3
rev: v0.1.4
hooks:
- id: ruff-format
```

View File

@@ -4,7 +4,7 @@ build-backend = "maturin"
[project]
name = "ruff"
version = "0.1.3"
version = "0.1.4"
description = "An extremely fast Python linter and code formatter, written in Rust."
authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
readme = "README.md"

ruff.schema.json (generated)
View File

@@ -3536,6 +3536,7 @@
"UP039",
"UP04",
"UP040",
"UP041",
"W",
"W1",
"W19",

View File

@@ -1,6 +1,6 @@
[tool.poetry]
name = "scripts"
version = "0.1.3"
version = "0.1.4"
description = ""
authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]

View File

@@ -13,7 +13,7 @@ prelude = """
/// Via: <https://github.com/hediet/vscode-unicode-data/blob/main/out/ambiguous.json>
/// See: <https://github.com/microsoft/vscode/blob/095ddabc52b82498ee7f718a34f9dd11d59099a8/src/vs/base/common/strings.ts#L1094>
pub(crate) fn confusable(c: u32) -> Option<u32> {
pub(crate) fn confusable(c: u32) -> Option<char> {
let result = match c {
""".lstrip()
@@ -49,6 +49,14 @@ def format_number(number: int) -> str:
return f"{number}u32"
def format_char(number: int) -> str:
"""Format a Python integer as a Rust character literal."""
char = chr(number)
if char == "\\":
return "\\\\"
return char
def format_confusables_rs(raw_data: dict[str, list[int]]) -> str:
"""Format the downloaded data into a Rust source file."""
# The input data contains duplicate entries.
@@ -59,7 +67,7 @@ def format_confusables_rs(raw_data: dict[str, list[int]]) -> str:
flattened_items.add((items[i], items[i + 1]))
tuples = [
f" {format_number(left)} => {right},\n"
f" {format_number(left)} => '{format_char(right)}',\n"
for left, right in sorted(flattened_items)
]
@@ -67,13 +75,13 @@ def format_confusables_rs(raw_data: dict[str, list[int]]) -> str:
# as they're unicode-to-unicode confusables, not unicode-to-ASCII confusables.
confusable_units = [
# ANGSTROM SIGN → LATIN CAPITAL LETTER A WITH RING ABOVE
("0x212B", "0x00C5"),
("0x212B", chr(0x00C5)),
# OHM SIGN → GREEK CAPITAL LETTER OMEGA
("0x2126", "0x03A9"),
("0x2126", chr(0x03A9)),
# MICRO SIGN → GREEK SMALL LETTER MU
("0x00B5", "0x03BC"),
("0x00B5", chr(0x03BC)),
]
tuples += [f" {left} => {right},\n" for left, right in confusable_units]
tuples += [f" {left} => '{right}',\n" for left, right in confusable_units]
print(f"{len(tuples)} confusable tuples.")