Compare commits
620 Commits
0.11.2...cjm/tvassi
.gitattributes (vendored): 6 lines changed

```diff
@@ -12,6 +12,12 @@ crates/ruff_python_parser/resources/invalid/re_lexing/line_continuation_windows_
 crates/ruff_python_parser/resources/invalid/re_lex_logical_token_windows_eol.py text eol=crlf
 crates/ruff_python_parser/resources/invalid/re_lex_logical_token_mac_eol.py text eol=cr
 
+crates/ruff_linter/resources/test/fixtures/ruff/RUF046_CR.py text eol=cr
+crates/ruff_linter/resources/test/fixtures/ruff/RUF046_LF.py text eol=lf
+
+crates/ruff_linter/resources/test/fixtures/pyupgrade/UP018_CR.py text eol=cr
+crates/ruff_linter/resources/test/fixtures/pyupgrade/UP018_LF.py text eol=lf
+
 crates/ruff_python_parser/resources/inline linguist-generated=true
 
 ruff.schema.json -diff linguist-generated=true text=auto eol=lf
```
.github/CODEOWNERS (vendored): 10 lines changed

```diff
@@ -14,11 +14,11 @@
 # flake8-pyi
 /crates/ruff_linter/src/rules/flake8_pyi/ @AlexWaygood
 
-# Script for fuzzing the parser/red-knot etc.
+# Script for fuzzing the parser/ty etc.
 /python/py-fuzzer/ @AlexWaygood
 
-# red-knot
-/crates/red_knot* @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager
+# ty
+/crates/ty* @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager
 /crates/ruff_db/ @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager
-/scripts/knot_benchmark/ @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager
-/crates/red_knot_python_semantic @carljm @AlexWaygood @sharkdp @dcreager
+/scripts/ty_benchmark/ @carljm @MichaReiser @AlexWaygood @sharkdp @dcreager
+/crates/ty_python_semantic @carljm @AlexWaygood @sharkdp @dcreager
```
.github/actionlint.yaml (vendored): 1 line changed

```diff
@@ -6,5 +6,6 @@ self-hosted-runner:
   labels:
     - depot-ubuntu-latest-8
     - depot-ubuntu-22.04-16
     - depot-ubuntu-22.04-32
     - github-windows-2025-x86_64-8
+    - github-windows-2025-x86_64-16
```
.github/workflows/build-binaries.yml (vendored): 82 lines changed

```diff
@@ -28,7 +28,7 @@ permissions: {}
 env:
   PACKAGE_NAME: ruff
   MODULE_NAME: ruff
-  PYTHON_VERSION: "3.11"
+  PYTHON_VERSION: "3.13"
   CARGO_INCREMENTAL: 0
   CARGO_NET_RETRY: 10
   CARGO_TERM_COLOR: always
@@ -39,17 +39,17 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build sdist"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           command: sdist
           args: --out dist
@@ -59,7 +59,7 @@
           "${MODULE_NAME}" --help
           python -m "${MODULE_NAME}" --help
       - name: "Upload sdist"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-sdist
           path: dist
@@ -68,23 +68,23 @@
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
     runs-on: macos-14
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           architecture: x64
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels - x86_64"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           target: x86_64
           args: --release --locked --out dist
       - name: "Upload wheels"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-macos-x86_64
           path: dist
@@ -99,7 +99,7 @@
           tar czvf $ARCHIVE_FILE $ARCHIVE_NAME
           shasum -a 256 $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
       - name: "Upload binary"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: artifacts-macos-x86_64
           path: |
@@ -110,18 +110,18 @@
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
     runs-on: macos-14
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           architecture: arm64
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels - aarch64"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           target: aarch64
           args: --release --locked --out dist
@@ -131,7 +131,7 @@
           ruff --help
           python -m ruff --help
       - name: "Upload wheels"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-aarch64-apple-darwin
           path: dist
@@ -146,7 +146,7 @@
           tar czvf $ARCHIVE_FILE $ARCHIVE_NAME
           shasum -a 256 $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
       - name: "Upload binary"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: artifacts-aarch64-apple-darwin
           path: |
@@ -166,18 +166,18 @@
           - target: aarch64-pc-windows-msvc
             arch: x64
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           architecture: ${{ matrix.platform.arch }}
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           target: ${{ matrix.platform.target }}
           args: --release --locked --out dist
@@ -192,7 +192,7 @@
           "${MODULE_NAME}" --help
           python -m "${MODULE_NAME}" --help
       - name: "Upload wheels"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-${{ matrix.platform.target }}
           path: dist
@@ -203,7 +203,7 @@
           7z a $ARCHIVE_FILE ./target/${{ matrix.platform.target }}/release/ruff.exe
           sha256sum $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
       - name: "Upload binary"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: artifacts-${{ matrix.platform.target }}
           path: |
@@ -219,18 +219,18 @@
           - x86_64-unknown-linux-gnu
           - i686-unknown-linux-gnu
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
          python-version: ${{ env.PYTHON_VERSION }}
          architecture: x64
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           target: ${{ matrix.target }}
           manylinux: auto
@@ -242,7 +242,7 @@
           "${MODULE_NAME}" --help
           python -m "${MODULE_NAME}" --help
       - name: "Upload wheels"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-${{ matrix.target }}
           path: dist
@@ -260,7 +260,7 @@
           tar czvf $ARCHIVE_FILE $ARCHIVE_NAME
           shasum -a 256 $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
       - name: "Upload binary"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: artifacts-${{ matrix.target }}
           path: |
@@ -294,17 +294,17 @@
             arch: arm
 
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           target: ${{ matrix.platform.target }}
           manylinux: auto
@@ -325,7 +325,7 @@
           pip3 install ${{ env.PACKAGE_NAME }} --no-index --find-links dist/ --force-reinstall
           ruff --help
       - name: "Upload wheels"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-${{ matrix.platform.target }}
           path: dist
@@ -343,7 +343,7 @@
           tar czvf $ARCHIVE_FILE $ARCHIVE_NAME
           shasum -a 256 $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
       - name: "Upload binary"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: artifacts-${{ matrix.platform.target }}
           path: |
@@ -359,25 +359,25 @@
           - x86_64-unknown-linux-musl
           - i686-unknown-linux-musl
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           architecture: x64
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           target: ${{ matrix.target }}
           manylinux: musllinux_1_2
           args: --release --locked --out dist
       - name: "Test wheel"
         if: matrix.target == 'x86_64-unknown-linux-musl'
-        uses: addnab/docker-run-action@v3
+        uses: addnab/docker-run-action@4f65fabd2431ebc8d299f8e5a018d79a769ae185 # v3
         with:
           image: alpine:latest
           options: -v ${{ github.workspace }}:/io -w /io
@@ -387,7 +387,7 @@
           .venv/bin/pip3 install ${{ env.PACKAGE_NAME }} --no-index --find-links dist/ --force-reinstall
           .venv/bin/${{ env.MODULE_NAME }} --help
       - name: "Upload wheels"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-${{ matrix.target }}
           path: dist
@@ -405,7 +405,7 @@
           tar czvf $ARCHIVE_FILE $ARCHIVE_NAME
           shasum -a 256 $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
       - name: "Upload binary"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: artifacts-${{ matrix.target }}
           path: |
@@ -425,17 +425,17 @@
             arch: armv7
 
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           target: ${{ matrix.platform.target }}
           manylinux: musllinux_1_2
@@ -454,7 +454,7 @@
           .venv/bin/pip3 install ${{ env.PACKAGE_NAME }} --no-index --find-links dist/ --force-reinstall
           .venv/bin/${{ env.MODULE_NAME }} --help
       - name: "Upload wheels"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: wheels-${{ matrix.platform.target }}
           path: dist
@@ -472,7 +472,7 @@
           tar czvf $ARCHIVE_FILE $ARCHIVE_NAME
           shasum -a 256 $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
       - name: "Upload binary"
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: artifacts-${{ matrix.platform.target }}
           path: |
```
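Every binary-upload job above follows the same pattern: tar the build output, write a SHA-256 digest next to it, then upload both files. A minimal sketch of that round trip (the directory and file names here are invented for illustration, not artifacts of this workflow):

```shell
# Mirror the workflow's archive + checksum steps on a throwaway archive.
rm -rf /tmp/sha-demo && mkdir -p /tmp/sha-demo && cd /tmp/sha-demo
ARCHIVE_FILE=ruff-demo.tar.gz
printf 'demo payload\n' > payload.txt
tar czf "$ARCHIVE_FILE" payload.txt
sha256sum "$ARCHIVE_FILE" > "$ARCHIVE_FILE.sha256"
# --check exits non-zero if the archive no longer matches its recorded digest.
sha256sum --check "$ARCHIVE_FILE.sha256"
```

Shipping the `.sha256` alongside the archive lets a downloader run the same `--check` command to detect corruption or tampering before installing the binary.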
.github/workflows/build-docker.yml (vendored): 12 lines changed

```diff
@@ -33,7 +33,7 @@ jobs:
           - linux/amd64
           - linux/arm64
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           submodules: recursive
           persist-credentials: false
@@ -79,7 +79,7 @@
       # Adapted from https://docs.docker.com/build/ci/github-actions/multi-platform/
       - name: Build and push by digest
         id: build
-        uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6
         with:
           context: .
           platforms: ${{ matrix.platform }}
@@ -96,7 +96,7 @@
           touch "/tmp/digests/${digest#sha256:}"
 
       - name: Upload digests
-        uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: digests-${{ env.PLATFORM_TUPLE }}
           path: /tmp/digests/*
@@ -113,7 +113,7 @@
     if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
     steps:
       - name: Download digests
-        uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
         with:
           path: /tmp/digests
           pattern: digests-*
@@ -231,7 +231,7 @@
             ${{ env.TAG_PATTERNS }}
 
       - name: Build and push
-        uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6
         with:
           context: .
           platforms: linux/amd64,linux/arm64
@@ -256,7 +256,7 @@
     if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
     steps:
       - name: Download digests
-        uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
         with:
           path: /tmp/digests
           pattern: digests-*
```
250
.github/workflows/ci.yaml
vendored
250
.github/workflows/ci.yaml
vendored
@@ -18,7 +18,7 @@ env:
   CARGO_TERM_COLOR: always
   RUSTUP_MAX_RETRIES: 10
   PACKAGE_NAME: ruff
-  PYTHON_VERSION: "3.12"
+  PYTHON_VERSION: "3.13"

 jobs:
   determine_changes:
@@ -36,11 +36,13 @@ jobs:
       code: ${{ steps.check_code.outputs.changed }}
       # Flag that is raised when any code that affects the fuzzer is changed
       fuzz: ${{ steps.check_fuzzer.outputs.changed }}
+      # Flag that is set to "true" when code related to ty changes.
+      ty: ${{ steps.check_ty.outputs.changed }}

       # Flag that is set to "true" when code related to the playground changes.
       playground: ${{ steps.check_playground.outputs.changed }}
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           fetch-depth: 0
           persist-credentials: false
@@ -82,7 +84,7 @@ jobs:
           if git diff --quiet "${MERGE_BASE}...HEAD" -- ':Cargo.toml' \
             ':Cargo.lock' \
             ':crates/**' \
-            ':!crates/red_knot*/**' \
+            ':!crates/ty*/**' \
             ':!crates/ruff_python_formatter/**' \
             ':!crates/ruff_formatter/**' \
             ':!crates/ruff_dev/**' \
@@ -141,9 +143,9 @@ jobs:
         env:
           MERGE_BASE: ${{ steps.merge_base.outputs.sha }}
         run: |
-          if git diff --quiet "${MERGE_BASE}...HEAD" -- ':**/*' \
+          if git diff --quiet "${MERGE_BASE}...HEAD" -- ':**' \
             ':!**/*.md' \
-            ':crates/red_knot_python_semantic/resources/mdtest/**/*.md' \
+            ':crates/ty_python_semantic/resources/mdtest/**/*.md' \
             ':!docs/**' \
             ':!assets/**' \
             ':.github/workflows/ci.yaml' \
@@ -166,12 +168,35 @@ jobs:
             echo "changed=true" >> "$GITHUB_OUTPUT"
           fi

+      - name: Check if the ty code changed
+        id: check_ty
+        env:
+          MERGE_BASE: ${{ steps.merge_base.outputs.sha }}
+        run: |
+          if git diff --quiet "${MERGE_BASE}...HEAD" -- \
+            ':Cargo.toml' \
+            ':Cargo.lock' \
+            ':crates/ty*/**' \
+            ':crates/ruff_db/**' \
+            ':crates/ruff_annotate_snippets/**' \
+            ':crates/ruff_python_ast/**' \
+            ':crates/ruff_python_parser/**' \
+            ':crates/ruff_python_trivia/**' \
+            ':crates/ruff_source_file/**' \
+            ':crates/ruff_text_size/**' \
+            ':.github/workflows/ci.yaml' \
+          ; then
+            echo "changed=false" >> "$GITHUB_OUTPUT"
+          else
+            echo "changed=true" >> "$GITHUB_OUTPUT"
+          fi
+
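The change-detection steps all follow one pattern: `git diff --quiet` against the merge base exits 0 when nothing matching the pathspecs changed and 1 otherwise, and the branch taken writes a `changed` flag that downstream jobs read via `needs.determine_changes.outputs.*`. A minimal runnable sketch of that pattern, using a throwaway repository (the file and directory names here are hypothetical stand-ins):

```shell
set -euo pipefail

# Build a throwaway repo with a base commit and one follow-up commit
# that touches a path matching the ty pathspecs.
repo="$(mktemp -d)"
cd "$repo"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m base
MERGE_BASE="$(git rev-parse HEAD)"
mkdir -p crates/ty_core
echo "fn main() {}" > crates/ty_core/lib.rs
git add -A
git -c user.email=ci@example.com -c user.name=ci commit -q -m "touch ty code"

# In a real job, GITHUB_OUTPUT is provided by the runner.
GITHUB_OUTPUT="$repo/out.txt"

# Exit code 0 = no matching changes, 1 = something matched.
if git diff --quiet "${MERGE_BASE}...HEAD" -- ':crates/ty*/**' ':Cargo.toml'; then
  echo "changed=false" >> "$GITHUB_OUTPUT"
else
  echo "changed=true" >> "$GITHUB_OUTPUT"
fi
cat "$GITHUB_OUTPUT"
```

The `A...B` (triple-dot) form diffs against the merge base rather than `A` itself, which is what makes the check stable on long-lived PR branches.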
   cargo-fmt:
     name: "cargo fmt"
     runs-on: ubuntu-latest
     timeout-minutes: 10
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
       - name: "Install Rust toolchain"
@@ -185,10 +210,10 @@ jobs:
     if: ${{ needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main' }}
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: |
           rustup component add clippy
@@ -196,7 +221,7 @@ jobs:
       - name: "Clippy"
         run: cargo clippy --workspace --all-targets --all-features --locked -- -D warnings
       - name: "Clippy (wasm)"
-        run: cargo clippy -p ruff_wasm -p red_knot_wasm --target wasm32-unknown-unknown --all-features --locked -- -D warnings
+        run: cargo clippy -p ruff_wasm -p ty_wasm --target wasm32-unknown-unknown --all-features --locked -- -D warnings

   cargo-test-linux:
     name: "cargo test (linux)"
@@ -205,22 +230,30 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Install mold"
-        uses: rui314/setup-mold@v1
+        uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
       - name: "Install cargo nextest"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-nextest
       - name: "Install cargo insta"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-insta
+      - name: ty mdtests (GitHub annotations)
+        if: ${{ needs.determine_changes.outputs.ty == 'true' }}
+        env:
+          NO_COLOR: 1
+          MDTEST_GITHUB_ANNOTATIONS_FORMAT: 1
+        # Ignore errors if this step fails; we want to continue to later steps in the workflow anyway.
+        # This step is just to get nice GitHub annotations on the PR diff in the files-changed tab.
+        run: cargo test -p ty_python_semantic --test mdtest || true
       - name: "Run tests"
         shell: bash
         env:
@@ -235,14 +268,18 @@ jobs:
       # sync, not just public items. Eventually we should do this for all
       # crates; for now add crates here as they are warning-clean to prevent
       # regression.
-      - run: cargo doc --no-deps -p red_knot_python_semantic -p red_knot -p red_knot_test -p ruff_db --document-private-items
+      - run: cargo doc --no-deps -p ty_python_semantic -p ty -p ty_test -p ruff_db --document-private-items
         env:
           # Setting RUSTDOCFLAGS because `cargo doc --check` isn't yet implemented (https://github.com/rust-lang/cargo/issues/10025).
           RUSTDOCFLAGS: "-D warnings"
-      - uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+      - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         with:
           name: ruff
           path: target/debug/ruff
+      - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+        with:
+          name: ty
+          path: target/debug/ty

   cargo-test-linux-release:
     name: "cargo test (linux, release)"
@@ -251,20 +288,20 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Install mold"
-        uses: rui314/setup-mold@v1
+        uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
       - name: "Install cargo nextest"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-nextest
       - name: "Install cargo insta"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-insta
       - name: "Run tests"
@@ -280,14 +317,14 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Install cargo nextest"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-nextest
       - name: "Run tests"
@@ -307,13 +344,13 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
     timeout-minutes: 10
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: rustup target add wasm32-unknown-unknown
-      - uses: actions/setup-node@cdca7365b2dadb8aad0a33bc7601856ffabcc48e # v4
+      - uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
         with:
           node-version: 20
           cache: "npm"
@@ -325,9 +362,9 @@ jobs:
         run: |
           cd crates/ruff_wasm
           wasm-pack test --node
-      - name: "Test red_knot_wasm"
+      - name: "Test ty_wasm"
         run: |
-          cd crates/red_knot_wasm
+          cd crates/ty_wasm
           wasm-pack test --node

   cargo-build-release:
@@ -336,14 +373,14 @@ jobs:
     if: ${{ github.ref == 'refs/heads/main' }}
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Install mold"
-        uses: rui314/setup-mold@v1
+        uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
       - name: "Build"
         run: cargo build --release --locked

@@ -354,7 +391,7 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
       - uses: SebRollen/toml-action@b1b3628f55fc3a28208d4203ada8b737e9687876 # v1.2.0
@@ -362,19 +399,19 @@ jobs:
         with:
           file: "Cargo.toml"
           field: "workspace.package.rust-version"
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         env:
           MSRV: ${{ steps.msrv.outputs.value }}
         run: rustup default "${MSRV}"
       - name: "Install mold"
-        uses: rui314/setup-mold@v1
+        uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
       - name: "Install cargo nextest"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-nextest
       - name: "Install cargo insta"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-insta
       - name: "Run tests"
@@ -391,16 +428,16 @@ jobs:
     if: ${{ github.ref == 'refs/heads/main' || needs.determine_changes.outputs.fuzz == 'true' || needs.determine_changes.outputs.code == 'true' }}
     timeout-minutes: 10
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
         with:
           workspaces: "fuzz -> target"
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Install cargo-binstall"
-        uses: cargo-bins/cargo-binstall@main
+        uses: cargo-bins/cargo-binstall@63aaa5c1932cebabc34eceda9d92a70215dcead6 # v1.12.3
         with:
           tool: cargo-fuzz@0.11.2
       - name: "Install cargo-fuzz"
@@ -419,11 +456,11 @@ jobs:
     env:
       FORCE_COLOR: 1
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: astral-sh/setup-uv@f94ec6bedd8674c4426838e6b50417d36b6ab231 # v5
-      - uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+      - uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
+      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
         name: Download Ruff binary to test
         id: download-cached-binary
         with:
@@ -453,10 +490,10 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
     timeout-minutes: 5
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: rustup component add rustfmt
       # Run all code generation scripts, and verify that the current output is
@@ -485,14 +522,14 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && github.event_name == 'pull_request' && needs.determine_changes.outputs.code == 'true' }}
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}

-      - uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
         name: Download comparison Ruff binary
         id: ruff-target
         with:
@@ -587,28 +624,75 @@ jobs:
         run: |
           echo ${{ github.event.number }} > pr-number

-      - uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+      - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         name: Upload PR Number
         with:
           name: pr-number
           path: pr-number

-      - uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+      - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
         name: Upload Results
         with:
           name: ecosystem-result
           path: ecosystem-result

+  fuzz-ty:
+    name: "Fuzz for new ty panics"
+    runs-on: depot-ubuntu-22.04-16
+    needs:
+      - cargo-test-linux
+      - determine_changes
+    # Only runs on pull requests, since that is the only way we can find the base version for comparison.
+    if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && github.event_name == 'pull_request' && needs.determine_changes.outputs.ty == 'true' }}
+    timeout-minutes: 20
+    steps:
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+        with:
+          persist-credentials: false
+      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
+        name: Download new ty binary
+        id: ty-new
+        with:
+          name: ty
+          path: target/debug
+      - uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
+        name: Download baseline ty binary
+        with:
+          name: ty
+          branch: ${{ github.event.pull_request.base.ref }}
+          workflow: "ci.yaml"
+          check_artifacts: true
+      - uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
+      - name: Fuzz
+        env:
+          FORCE_COLOR: 1
+          NEW_TY: ${{ steps.ty-new.outputs.download-path }}
+        run: |
+          # Make executable, since artifact download doesn't preserve this
+          chmod +x "${PWD}/ty" "${NEW_TY}/ty"
+
+          (
+            uvx \
+              --python="${PYTHON_VERSION}" \
+              --from=./python/py-fuzzer \
+              fuzz \
+              --test-executable="${NEW_TY}/ty" \
+              --baseline-executable="${PWD}/ty" \
+              --only-new-bugs \
+              --bin=ty \
+              0-500
+          )
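The `fuzz-ty` job only fails on regressions: with `--only-new-bugs`, an input counts as a bug only when the new `ty` binary misbehaves on it while the baseline binary from the base branch does not, so pre-existing panics never block a PR. A toy stand-in for that differential idea (both scripts below are fabricated for illustration and are not part of py-fuzzer):

```shell
workdir="$(mktemp -d)"

# Fabricated stand-ins: the baseline never fails; the new binary "panics" on input 7.
printf '#!/bin/sh\nexit 0\n' > "$workdir/base_bin"
printf '#!/bin/sh\nif [ "$1" = "7" ]; then exit 101; fi\nexit 0\n' > "$workdir/new_bin"
chmod +x "$workdir/base_bin" "$workdir/new_bin"  # artifact downloads drop the +x bit, hence the chmod in CI too

new_bugs=0
for seed in $(seq 0 9); do
  new_rc=0;  "$workdir/new_bin"  "$seed" || new_rc=$?
  base_rc=0; "$workdir/base_bin" "$seed" || base_rc=$?
  # Only count inputs where the new binary fails and the baseline succeeds.
  if [ "$new_rc" -ne 0 ] && [ "$base_rc" -eq 0 ]; then
    new_bugs=$((new_bugs + 1))
  fi
done
echo "new_bugs=$new_bugs"
```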

   cargo-shear:
     name: "cargo shear"
     runs-on: ubuntu-latest
     needs: determine_changes
     if: ${{ needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main' }}
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: cargo-bins/cargo-binstall@main
+      - uses: cargo-bins/cargo-binstall@63aaa5c1932cebabc34eceda9d92a70215dcead6 # v1.12.3
       - run: cargo binstall --no-confirm cargo-shear
       - run: cargo shear

@@ -618,18 +702,18 @@ jobs:
     timeout-minutes: 20
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') }}
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           architecture: x64
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Prep README.md"
         run: python scripts/transform_readme.py --target pypi
       - name: "Build wheels"
-        uses: PyO3/maturin-action@36db84001d74475ad1b8e6613557ae4ee2dc3598 # v1
+        uses: PyO3/maturin-action@aef21716ff3dcae8a1c301d23ec3e4446972a6e3 # v1.49.1
         with:
           args: --out dist
       - name: "Test wheel"
@@ -642,22 +726,15 @@ jobs:

   pre-commit:
     name: "pre-commit"
-    runs-on: ubuntu-latest
+    runs-on: depot-ubuntu-22.04-16
     timeout-minutes: 10
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
-        with:
-          python-version: ${{ env.PYTHON_VERSION }}
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
-      - name: "Install Rust toolchain"
-        run: rustup show
-      - name: "Install pre-commit"
-        run: pip install pre-commit
+      - uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
       - name: "Cache pre-commit"
-        uses: actions/cache@d4323d4df104b026a6aa633fdb11d772146be0bf # v4
+        uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
         with:
           path: ~/.cache/pre-commit
           key: pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
@@ -666,7 +743,7 @@ jobs:
           echo '```console' > "$GITHUB_STEP_SUMMARY"
           # Enable color output for pre-commit and remove it for the summary
           # Use --hook-stage=manual to enable slower pre-commit hooks that are skipped by default
-          SKIP=cargo-fmt,clippy,dev-generate-all pre-commit run --all-files --show-diff-on-failure --color=always --hook-stage=manual | \
+          SKIP=cargo-fmt,clippy,dev-generate-all uvx --python="${PYTHON_VERSION}" pre-commit run --all-files --show-diff-on-failure --color=always --hook-stage=manual | \
             tee >(sed -E 's/\x1B\[([0-9]{1,2}(;[0-9]{1,2})*)?[mGK]//g' >> "$GITHUB_STEP_SUMMARY") >&1
           exit_code="${PIPESTATUS[0]}"
           echo '```' >> "$GITHUB_STEP_SUMMARY"
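The summary plumbing in that step is worth unpacking: `tee >(sed …)` streams the colored pre-commit output to the console while appending a color-stripped copy to `$GITHUB_STEP_SUMMARY`, and `PIPESTATUS[0]` recovers the exit code of the first pipeline stage, which `tee`'s own (always zero) status would otherwise mask. A self-contained bash sketch, where the `printf` stands in for the real pre-commit invocation:

```shell
# Stand-in for the runner-provided step summary file.
GITHUB_STEP_SUMMARY="$(mktemp)"

# Stand-in for pre-commit: emits ANSI-colored output (red "hook failed").
printf '\x1B[31mhook failed\x1B[0m\n' | \
  tee >(sed -E 's/\x1B\[([0-9]{1,2}(;[0-9]{1,2})*)?[mGK]//g' >> "$GITHUB_STEP_SUMMARY")

# tee always exits 0; PIPESTATUS[0] is the first stage's real exit code.
exit_code="${PIPESTATUS[0]}"
echo "first stage exited with ${exit_code}"
```

The process substitution `>( … )` runs the `sed` asynchronously, which is why the real workflow only reads the summary file after the step completes.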
@@ -679,22 +756,22 @@ jobs:
     env:
       MKDOCS_INSIDERS_SSH_KEY_EXISTS: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY != '' }}
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: "3.13"
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Add SSH key"
         if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
-        uses: webfactory/ssh-agent@dc588b651fe13675774614f8e6a936a468676387 # v0.9.0
+        uses: webfactory/ssh-agent@a6f90b1f127823b31d4d4a8d96047790581349bd # v0.9.1
         with:
           ssh-private-key: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY }}
       - name: "Install Rust toolchain"
         run: rustup show
       - name: Install uv
-        uses: astral-sh/setup-uv@f94ec6bedd8674c4426838e6b50417d36b6ab231 # v5
+        uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
       - name: "Install Insiders dependencies"
         if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
         run: uv pip install -r docs/requirements-insiders.txt --system
@@ -721,10 +798,10 @@ jobs:
     if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.formatter == 'true' || github.ref == 'refs/heads/main') }}
     timeout-minutes: 10
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Run checks"
@@ -747,17 +824,18 @@ jobs:
         env:
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         name: "Download ruff-lsp source"
         with:
           persist-credentials: false
           repository: "astral-sh/ruff-lsp"

-      - uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
-          python-version: ${{ env.PYTHON_VERSION }}
+          # installation fails on 3.13 and newer
+          python-version: "3.12"

-      - uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
         name: Download development ruff binary
         id: ruff-target
         with:
@@ -788,13 +866,13 @@ jobs:
       - determine_changes
     if: ${{ (needs.determine_changes.outputs.playground == 'true') }}
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
       - name: "Install Rust toolchain"
         run: rustup target add wasm32-unknown-unknown
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
-      - uses: actions/setup-node@cdca7365b2dadb8aad0a33bc7601856ffabcc48e # v4
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
+      - uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
         with:
           node-version: 22
           cache: "npm"
@@ -820,17 +898,17 @@ jobs:
     timeout-minutes: 20
     steps:
       - name: "Checkout Branch"
-        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false

-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8

       - name: "Install Rust toolchain"
         run: rustup show

       - name: "Install codspeed"
-        uses: taiki-e/install-action@2c41309d51ede152b6f2ee6bf3b71e6dc9a8b7df # v2
+        uses: taiki-e/install-action@86c23eed46c17b80677df6d8151545ce3e236c61 # v2
         with:
           tool: cargo-codspeed

@@ -838,7 +916,7 @@ jobs:
         run: cargo codspeed build --features codspeed -p ruff_benchmark

       - name: "Run benchmarks"
-        uses: CodSpeedHQ/action@0010eb0ca6e89b80c88e8edaaa07cfe5f3e6664d # v3
+        uses: CodSpeedHQ/action@0010eb0ca6e89b80c88e8edaaa07cfe5f3e6664d # v3.5.0
         with:
           run: cargo codspeed run
           token: ${{ secrets.CODSPEED_TOKEN }}
10  .github/workflows/daily_fuzz.yaml  vendored
@@ -31,15 +31,15 @@ jobs:
     # Don't run the cron job on forks:
     if: ${{ github.repository == 'astral-sh/ruff' || github.event_name != 'schedule' }}
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
-      - uses: astral-sh/setup-uv@f94ec6bedd8674c4426838e6b50417d36b6ab231 # v5
+      - uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Install mold"
-        uses: rui314/setup-mold@v1
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+        uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
       - name: Build ruff
         # A debug build means the script runs slower once it gets started,
         # but this is outweighed by the fact that a release build takes *much* longer to compile in CI
@@ -65,7 +65,7 @@ jobs:
     permissions:
       issues: write
     steps:
-      - uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
+      - uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
         with:
           github-token: ${{ secrets.GITHUB_TOKEN }}
           script: |
18  .github/workflows/daily_property_tests.yaml  vendored
@@ -30,25 +30,25 @@ jobs:
     # Don't run the cron job on forks:
     if: ${{ github.repository == 'astral-sh/ruff' || github.event_name != 'schedule' }}
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           persist-credentials: false
       - name: "Install Rust toolchain"
         run: rustup show
       - name: "Install mold"
-        uses: rui314/setup-mold@v1
-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
-      - name: Build Red Knot
+        uses: rui314/setup-mold@e16410e7f8d9e167b74ad5697a9089a35126eb50 # v1
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
+      - name: Build ty
         # A release build takes longer (2 min vs 1 min), but the property tests run much faster in release
         # mode (1.5 min vs 14 min), so the overall time is shorter with a release build.
-        run: cargo build --locked --release --package red_knot_python_semantic --tests
+        run: cargo build --locked --release --package ty_python_semantic --tests
       - name: Run property tests
         shell: bash
         run: |
           export QUICKCHECK_TESTS=100000
           for _ in {1..5}; do
-            cargo test --locked --release --package red_knot_python_semantic -- --ignored list::property_tests
-            cargo test --locked --release --package red_knot_python_semantic -- --ignored types::property_tests::stable
+            cargo test --locked --release --package ty_python_semantic -- --ignored list::property_tests
+            cargo test --locked --release --package ty_python_semantic -- --ignored types::property_tests::stable
           done

   create-issue-on-failure:
@@ -59,7 +59,7 @@ jobs:
     permissions:
       issues: write
     steps:
-      - uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
+      - uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
         with:
           github-token: ${{ secrets.GITHUB_TOKEN }}
           script: |
@@ -68,5 +68,5 @@ jobs:
             repo: "ruff",
             title: `Daily property test run failed on ${new Date().toDateString()}`,
             body: "Run listed here: https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}",
-            labels: ["bug", "red-knot", "testing"],
+            labels: ["bug", "ty", "testing"],
           })
31  .github/workflows/mypy_primer.yaml  vendored
@@ -5,7 +5,7 @@ permissions: {}
 on:
   pull_request:
     paths:
-      - "crates/red_knot*/**"
+      - "crates/ty*/**"
       - "crates/ruff_db"
       - "crates/ruff_python_ast"
       - "crates/ruff_python_parser"
@@ -21,37 +21,37 @@ env:
   CARGO_NET_RETRY: 10
   CARGO_TERM_COLOR: always
   RUSTUP_MAX_RETRIES: 10
   RUST_BACKTRACE: 1

 jobs:
   mypy_primer:
     name: Run mypy_primer
-    runs-on: ubuntu-24.04
+    runs-on: depot-ubuntu-22.04-32
     timeout-minutes: 20
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           path: ruff
           fetch-depth: 0
           persist-credentials: false

       - name: Install the latest version of uv
-        uses: astral-sh/setup-uv@f94ec6bedd8674c4426838e6b50417d36b6ab231 # v5
+        uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2

-      - uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+      - uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
         with:
           workspaces: "ruff"

       - name: Install Rust toolchain
         run: rustup show

-      - name: Install mypy_primer
-        run: |
-          uv tool install "git+https://github.com/astral-sh/mypy_primer.git@add-red-knot-support"

       - name: Run mypy_primer
         shell: bash
         run: |
           cd ruff

+          PRIMER_SELECTOR="$(paste -s -d'|' crates/ty_python_semantic/resources/primer/good.txt)"

           echo "new commit"
           git rev-list --format=%s --max-count=1 "$GITHUB_SHA"
@@ -62,13 +62,16 @@ jobs:
|
||||
|
||||
cd ..
|
||||
|
||||
echo "Project selector: $PRIMER_SELECTOR"
|
||||
# Allow the exit code to be 0 or 1, only fail for actual mypy_primer crashes/bugs
|
||||
uvx mypy_primer \
|
||||
uvx \
|
||||
--from="git+https://github.com/hauntsaninja/mypy_primer@4b15cf3b07db69db67bbfaebfffb2a8a28040933" \
|
||||
mypy_primer \
|
||||
--repo ruff \
|
||||
--type-checker knot \
|
||||
--type-checker ty \
|
||||
--old base_commit \
|
||||
--new "$GITHUB_SHA" \
|
||||
--project-selector '/(mypy_primer|black|pyp|git-revise|zipp|arrow|isort|itsdangerous|rich|packaging|pybind11|pyinstrument)$' \
|
||||
--project-selector "/($PRIMER_SELECTOR)\$" \
|
||||
--output concise \
|
||||
--debug > mypy_primer.diff || [ $? -eq 1 ]
|
||||
|
||||
@@ -81,13 +84,13 @@ jobs:
|
||||
echo ${{ github.event.number }} > pr-number
|
||||
|
||||
- name: Upload diff
|
||||
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
|
||||
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
|
||||
with:
|
||||
name: mypy_primer_diff
|
||||
path: mypy_primer.diff
|
||||
|
||||
- name: Upload pr-number
|
||||
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
|
||||
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
|
||||
with:
|
||||
name: pr-number
|
||||
path: pr-number
|
||||
|
||||
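The `|| [ $? -eq 1 ]` idiom in the mypy_primer step above lets the workflow treat both exit code 0 (no differences) and exit code 1 (differences found) as success, while still failing on a genuine crash. A minimal sketch of the pattern — `mock_primer` here is a hypothetical stand-in for the real `uvx mypy_primer ...` invocation, not part of the workflow:

```shell
#!/usr/bin/env bash
set -u

# Stand-in for `uvx mypy_primer ...`: writes a diff and exits 1 when
# differences were found (0 would mean "no differences", >1 a crash).
mock_primer() {
  echo "project-a: 2 new diagnostics"
  return 1
}

# Tolerate exit codes 0 and 1; any other status propagates as a failure
# because the `[ $? -eq 1 ]` test itself then fails.
mock_primer > mypy_primer.diff || [ $? -eq 1 ]
echo "step continues; diff captured in mypy_primer.diff"
```

With `return 2` in the stand-in, the guard test fails and the step fails, which is exactly the "only fail for actual crashes/bugs" behavior the comment in the workflow describes.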
2
.github/workflows/notify-dependents.yml
vendored
@@ -17,7 +17,7 @@ jobs:
 runs-on: ubuntu-latest
 steps:
 - name: "Update pre-commit mirror"
-uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
+uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
 with:
 github-token: ${{ secrets.RUFF_PRE_COMMIT_PAT }}
 script: |
8
.github/workflows/publish-docs.yml
vendored
@@ -23,12 +23,12 @@ jobs:
 env:
 MKDOCS_INSIDERS_SSH_KEY_EXISTS: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY != '' }}
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 with:
 ref: ${{ inputs.ref }}
 persist-credentials: true

-- uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5
+- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
 with:
 python-version: 3.12

@@ -61,14 +61,14 @@ jobs:

 - name: "Add SSH key"
 if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
-uses: webfactory/ssh-agent@dc588b651fe13675774614f8e6a936a468676387 # v0.9.0
+uses: webfactory/ssh-agent@a6f90b1f127823b31d4d4a8d96047790581349bd # v0.9.1
 with:
 ssh-private-key: ${{ secrets.MKDOCS_INSIDERS_SSH_KEY }}

 - name: "Install Rust toolchain"
 run: rustup show

-- uses: Swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
+- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8

 - name: "Install Insiders dependencies"
 if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
4
.github/workflows/publish-playground.yml
vendored
@@ -24,12 +24,12 @@ jobs:
 env:
 CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 with:
 persist-credentials: false
 - name: "Install Rust toolchain"
 run: rustup target add wasm32-unknown-unknown
-- uses: actions/setup-node@cdca7365b2dadb8aad0a33bc7601856ffabcc48e # v4
+- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
 with:
 node-version: 22
 cache: "npm"
4
.github/workflows/publish-pypi.yml
vendored
@@ -22,8 +22,8 @@ jobs:
 id-token: write
 steps:
 - name: "Install uv"
-uses: astral-sh/setup-uv@f94ec6bedd8674c4426838e6b50417d36b6ab231 # v5
-- uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
+- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
 with:
 pattern: wheels-*
 path: wheels
@@ -1,5 +1,5 @@
-# Publish the Red Knot playground.
-name: "[Knot Playground] Release"
+# Publish the ty playground.
+name: "[ty Playground] Release"

 permissions: {}

@@ -7,12 +7,12 @@ on:
 push:
 branches: [main]
 paths:
-- "crates/red_knot*/**"
-- "crates/ruff_db"
-- "crates/ruff_python_ast"
-- "crates/ruff_python_parser"
-- "playground"
-- ".github/workflows/publish-knot-playground.yml"
+- "crates/ty*/**"
+- "crates/ruff_db/**"
+- "crates/ruff_python_ast/**"
+- "crates/ruff_python_parser/**"
+- "playground/**"
+- ".github/workflows/publish-ty-playground.yml"

 concurrency:
 group: ${{ github.workflow }}-${{ github.ref_name }}
@@ -30,12 +30,12 @@ jobs:
 env:
 CF_API_TOKEN_EXISTS: ${{ secrets.CF_API_TOKEN != '' }}
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 with:
 persist-credentials: false
 - name: "Install Rust toolchain"
 run: rustup target add wasm32-unknown-unknown
-- uses: actions/setup-node@cdca7365b2dadb8aad0a33bc7601856ffabcc48e # v4
+- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
 with:
 node-version: 22
 - uses: jetli/wasm-bindgen-action@20b33e20595891ab1a0ed73145d8a21fc96e7c29 # v0.2.0
@@ -45,8 +45,8 @@ jobs:
 - name: "Run TypeScript checks"
 run: npm run check
 working-directory: playground
-- name: "Build Knot playground"
-run: npm run build --workspace knot-playground
+- name: "Build ty playground"
+run: npm run build --workspace ty-playground
 working-directory: playground
 - name: "Deploy to Cloudflare Pages"
 if: ${{ env.CF_API_TOKEN_EXISTS == 'true' }}
@@ -55,4 +55,4 @@ jobs:
 apiToken: ${{ secrets.CF_API_TOKEN }}
 accountId: ${{ secrets.CF_ACCOUNT_ID }}
 # `github.head_ref` is only set during pull requests and for manual runs or tags we use `main` to deploy to production
-command: pages deploy playground/knot/dist --project-name=knot-playground --branch ${{ github.head_ref || 'main' }} --commit-hash ${GITHUB_SHA}
+command: pages deploy playground/ty/dist --project-name=ty-playground --branch ${{ github.head_ref || 'main' }} --commit-hash ${GITHUB_SHA}
4
.github/workflows/publish-wasm.yml
vendored
@@ -29,7 +29,7 @@ jobs:
 target: [web, bundler, nodejs]
 fail-fast: false
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 with:
 persist-credentials: false
 - name: "Install Rust toolchain"
@@ -45,7 +45,7 @@ jobs:
 jq '.name="@astral-sh/ruff-wasm-${{ matrix.target }}"' crates/ruff_wasm/pkg/package.json > /tmp/package.json
 mv /tmp/package.json crates/ruff_wasm/pkg
 - run: cp LICENSE crates/ruff_wasm/pkg # wasm-pack does not put the LICENSE file in the pkg
-- uses: actions/setup-node@cdca7365b2dadb8aad0a33bc7601856ffabcc48e # v4
+- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
 with:
 node-version: 20
 registry-url: "https://registry.npmjs.org"
36
.github/workflows/release.yml
vendored
@@ -1,6 +1,7 @@
-# This file was autogenerated by dist: https://opensource.axo.dev/cargo-dist/
+# This file was autogenerated by dist: https://github.com/astral-sh/cargo-dist
 #
 # Copyright 2022-2024, axodotdev
+# Copyright 2025 Astral Software Inc.
 # SPDX-License-Identifier: MIT or Apache-2.0
 #
 # CI that:
@@ -39,6 +40,7 @@ permissions:
 # If there's a prerelease-style suffix to the version, then the release(s)
 # will be marked as a prerelease.
 on:
+pull_request:
 workflow_dispatch:
 inputs:
 tag:
@@ -59,16 +61,17 @@ jobs:
 env:
 GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@85e6279cec87321a52edac9c87bce653a07cf6c2
 with:
 persist-credentials: false
 submodules: recursive
 - name: Install dist
 # we specify bash to get pipefail; it guards against the `curl` command
 # failing. otherwise `sh` won't catch that `curl` returned non-0
 shell: bash
-run: "curl --proto '=https' --tlsv1.2 -LsSf https://github.com/axodotdev/cargo-dist/releases/download/v0.25.2-prerelease.3/cargo-dist-installer.sh | sh"
+run: "curl --proto '=https' --tlsv1.2 -LsSf https://github.com/astral-sh/cargo-dist/releases/download/v0.28.5-prerelease.1/cargo-dist-installer.sh | sh"
 - name: Cache dist
-uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
 with:
 name: cargo-dist-cache
 path: ~/.cargo/bin/dist
@@ -84,7 +87,7 @@ jobs:
 cat plan-dist-manifest.json
 echo "manifest=$(jq -c "." plan-dist-manifest.json)" >> "$GITHUB_OUTPUT"
 - name: "Upload dist-manifest.json"
-uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
 with:
 name: artifacts-plan-dist-manifest
 path: plan-dist-manifest.json
@@ -121,18 +124,19 @@ jobs:
 GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 BUILD_MANIFEST_NAME: target/distrib/global-dist-manifest.json
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@85e6279cec87321a52edac9c87bce653a07cf6c2
 with:
 persist-credentials: false
 submodules: recursive
 - name: Install cached dist
-uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
 with:
 name: cargo-dist-cache
 path: ~/.cargo/bin/
 - run: chmod +x ~/.cargo/bin/dist
 # Get all the local artifacts for the global tasks to use (for e.g. checksums)
 - name: Fetch local artifacts
-uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
 with:
 pattern: artifacts-*
 path: target/distrib/
@@ -150,7 +154,7 @@ jobs:

 cp dist-manifest.json "$BUILD_MANIFEST_NAME"
 - name: "Upload artifacts"
-uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
 with:
 name: artifacts-build-global
 path: |
@@ -171,18 +175,19 @@ jobs:
 outputs:
 val: ${{ steps.host.outputs.manifest }}
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@85e6279cec87321a52edac9c87bce653a07cf6c2
 with:
 persist-credentials: false
 submodules: recursive
 - name: Install cached dist
-uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
 with:
 name: cargo-dist-cache
 path: ~/.cargo/bin/
 - run: chmod +x ~/.cargo/bin/dist
 # Fetch artifacts from scratch-storage
 - name: Fetch artifacts
-uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
 with:
 pattern: artifacts-*
 path: target/distrib/
@@ -196,7 +201,7 @@ jobs:
 cat dist-manifest.json
 echo "manifest=$(jq -c "." dist-manifest.json)" >> "$GITHUB_OUTPUT"
 - name: "Upload dist-manifest.json"
-uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
+uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
 with:
 # Overwrite the previous copy
 name: artifacts-dist-manifest
@@ -246,12 +251,13 @@ jobs:
 env:
 GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@85e6279cec87321a52edac9c87bce653a07cf6c2
 with:
 persist-credentials: false
 submodules: recursive
 # Create a GitHub Release while uploading all files to it
 - name: "Download GitHub Artifacts"
-uses: actions/download-artifact@cc203385981b70ca67e1cc392babf9cc229d5806 # v4
+uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
 with:
 pattern: artifacts-*
 path: artifacts
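The release workflow's comment "we specify bash to get pipefail" is about a real footgun in `curl ... | sh` installers: a pipeline's exit status is normally that of its last command, so a failed download is silently swallowed. A small illustration of the difference, using `false` as a generic stand-in for a failing `curl`:

```shell
#!/usr/bin/env bash

# Without pipefail: the pipeline's status is the last command's (cat's, 0),
# even though `false` (standing in for a failed download) failed.
false | cat
echo "without pipefail: $?"

# With pipefail: the pipeline reports failure if any component fails.
set -o pipefail
false | cat
echo "with pipefail: $?"
```

This prints `without pipefail: 0` then `with pipefail: 1`, which is why the step forces `shell: bash` (GitHub's default bash invocation enables pipefail) rather than relying on `sh`.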
22
.github/workflows/sync_typeshed.yaml
vendored
@@ -21,12 +21,12 @@ jobs:
 contents: write
 pull-requests: write
 steps:
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 name: Checkout Ruff
 with:
 path: ruff
 persist-credentials: true
-- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
+- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 name: Checkout typeshed
 with:
 repository: python/typeshed
@@ -39,13 +39,13 @@ jobs:
 - name: Sync typeshed
 id: sync
 run: |
-rm -rf ruff/crates/red_knot_vendored/vendor/typeshed
-mkdir ruff/crates/red_knot_vendored/vendor/typeshed
-cp typeshed/README.md ruff/crates/red_knot_vendored/vendor/typeshed
-cp typeshed/LICENSE ruff/crates/red_knot_vendored/vendor/typeshed
-cp -r typeshed/stdlib ruff/crates/red_knot_vendored/vendor/typeshed/stdlib
-rm -rf ruff/crates/red_knot_vendored/vendor/typeshed/stdlib/@tests
-git -C typeshed rev-parse HEAD > ruff/crates/red_knot_vendored/vendor/typeshed/source_commit.txt
+rm -rf ruff/crates/ty_vendored/vendor/typeshed
+mkdir ruff/crates/ty_vendored/vendor/typeshed
+cp typeshed/README.md ruff/crates/ty_vendored/vendor/typeshed
+cp typeshed/LICENSE ruff/crates/ty_vendored/vendor/typeshed
+cp -r typeshed/stdlib ruff/crates/ty_vendored/vendor/typeshed/stdlib
+rm -rf ruff/crates/ty_vendored/vendor/typeshed/stdlib/@tests
+git -C typeshed rev-parse HEAD > ruff/crates/ty_vendored/vendor/typeshed/source_commit.txt
 - name: Commit the changes
 id: commit
 if: ${{ steps.sync.outcome == 'success' }}
@@ -70,7 +70,7 @@ jobs:
 permissions:
 issues: write
 steps:
-- uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
+- uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
 with:
 github-token: ${{ secrets.GITHUB_TOKEN }}
 script: |
@@ -79,5 +79,5 @@ jobs:
 repo: "ruff",
 title: `Automated typeshed sync failed on ${new Date().toDateString()}`,
 body: "Run listed here: https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}",
-labels: ["bug", "red-knot"],
+labels: ["bug", "ty"],
 })
@@ -3,8 +3,8 @@ fail_fast: false
 exclude: |
 (?x)^(
 .github/workflows/release.yml|
-crates/red_knot_vendored/vendor/.*|
-crates/red_knot_project/resources/.*|
+crates/ty_vendored/vendor/.*|
+crates/ty_project/resources/.*|
 crates/ruff_benchmark/resources/.*|
 crates/ruff_linter/resources/.*|
 crates/ruff_linter/src/rules/.*/snapshots/.*|
@@ -18,8 +18,13 @@ exclude: |
 )$

 repos:
+- repo: https://github.com/pre-commit/pre-commit-hooks
+rev: v5.0.0
+hooks:
+- id: check-merge-conflict
+
 - repo: https://github.com/abravalheri/validate-pyproject
-rev: v0.24
+rev: v0.24.1
 hooks:
 - id: validate-pyproject

@@ -60,7 +65,7 @@ repos:
 - black==25.1.0

 - repo: https://github.com/crate-ci/typos
-rev: v1.30.2
+rev: v1.32.0
 hooks:
 - id: typos

@@ -74,7 +79,7 @@ repos:
 pass_filenames: false # This makes it a lot faster

 - repo: https://github.com/astral-sh/ruff-pre-commit
-rev: v0.11.0
+rev: v0.11.8
 hooks:
 - id: ruff-format
 - id: ruff
@@ -92,12 +97,12 @@ repos:
 # zizmor detects security vulnerabilities in GitHub Actions workflows.
 # Additional configuration for the tool is found in `.github/zizmor.yml`
 - repo: https://github.com/woodruffw/zizmor-pre-commit
-rev: v1.5.1
+rev: v1.6.0
 hooks:
 - id: zizmor

 - repo: https://github.com/python-jsonschema/check-jsonschema
-rev: 0.31.3
+rev: 0.33.0
 hooks:
 - id: check-github-workflows
155
CHANGELOG.md
@@ -1,5 +1,150 @@
 # Changelog

+## 0.11.8
+
+### Preview features
+
+- \[`airflow`\] Apply auto fixes to cases where the names have changed in Airflow 3 (`AIR302`, `AIR311`) ([#17553](https://github.com/astral-sh/ruff/pull/17553), [#17570](https://github.com/astral-sh/ruff/pull/17570), [#17571](https://github.com/astral-sh/ruff/pull/17571))
+- \[`airflow`\] Extend `AIR301` rule ([#17598](https://github.com/astral-sh/ruff/pull/17598))
+- \[`airflow`\] Update existing `AIR302` rules with better suggestions ([#17542](https://github.com/astral-sh/ruff/pull/17542))
+- \[`refurb`\] Mark fix as safe for `readlines-in-for` (`FURB129`) ([#17644](https://github.com/astral-sh/ruff/pull/17644))
+- [syntax-errors] `nonlocal` declaration at module level ([#17559](https://github.com/astral-sh/ruff/pull/17559))
+- [syntax-errors] Detect single starred expression assignment `x = *y` ([#17624](https://github.com/astral-sh/ruff/pull/17624))
+
+### Bug fixes
+
+- \[`flake8-pyi`\] Ensure `Literal[None,] | Literal[None,]` is not autofixed to `None | None` (`PYI061`) ([#17659](https://github.com/astral-sh/ruff/pull/17659))
+- \[`flake8-use-pathlib`\] Avoid suggesting `Path.iterdir()` for `os.listdir` with file descriptor (`PTH208`) ([#17715](https://github.com/astral-sh/ruff/pull/17715))
+- \[`flake8-use-pathlib`\] Fix `PTH104` false positive when `rename` is passed a file descriptor ([#17712](https://github.com/astral-sh/ruff/pull/17712))
+- \[`flake8-use-pathlib`\] Fix `PTH116` false positive when `stat` is passed a file descriptor ([#17709](https://github.com/astral-sh/ruff/pull/17709))
+- \[`flake8-use-pathlib`\] Fix `PTH123` false positive when `open` is passed a file descriptor from a function call ([#17705](https://github.com/astral-sh/ruff/pull/17705))
+- \[`pycodestyle`\] Fix duplicated diagnostic in `E712` ([#17651](https://github.com/astral-sh/ruff/pull/17651))
+- \[`pylint`\] Detect `global` declarations in module scope (`PLE0118`) ([#17411](https://github.com/astral-sh/ruff/pull/17411))
+- [syntax-errors] Make `async-comprehension-in-sync-comprehension` more specific ([#17460](https://github.com/astral-sh/ruff/pull/17460))
+
+### Configuration
+
+- Add option to disable `typing_extensions` imports ([#17611](https://github.com/astral-sh/ruff/pull/17611))
+
+### Documentation
+
+- Fix example syntax for the `lint.pydocstyle.ignore-var-parameters` option ([#17740](https://github.com/astral-sh/ruff/pull/17740))
+- Add fix safety sections (`ASYNC116`, `FLY002`, `D200`, `RUF005`, `RUF017`, `RUF027`, `RUF028`, `RUF057`) ([#17497](https://github.com/astral-sh/ruff/pull/17497), [#17496](https://github.com/astral-sh/ruff/pull/17496), [#17502](https://github.com/astral-sh/ruff/pull/17502), [#17484](https://github.com/astral-sh/ruff/pull/17484), [#17480](https://github.com/astral-sh/ruff/pull/17480), [#17485](https://github.com/astral-sh/ruff/pull/17485), [#17722](https://github.com/astral-sh/ruff/pull/17722), [#17483](https://github.com/astral-sh/ruff/pull/17483))
+
+### Other changes
+
+- Add Python 3.14 to configuration options ([#17647](https://github.com/astral-sh/ruff/pull/17647))
+- Make syntax error for unparenthesized except tuples version specific to before 3.14 ([#17660](https://github.com/astral-sh/ruff/pull/17660))
+
+## 0.11.7
+
+### Preview features
+
+- \[`airflow`\] Apply auto fixes to cases where the names have changed in Airflow 3 (`AIR301`) ([#17355](https://github.com/astral-sh/ruff/pull/17355))
+- \[`perflint`\] Implement fix for `manual-dict-comprehension` (`PERF403`) ([#16719](https://github.com/astral-sh/ruff/pull/16719))
+- [syntax-errors] Make duplicate parameter names a semantic error ([#17131](https://github.com/astral-sh/ruff/pull/17131))
+
+### Bug fixes
+
+- \[`airflow`\] Fix typos in provider package names (`AIR302`, `AIR312`) ([#17574](https://github.com/astral-sh/ruff/pull/17574))
+- \[`flake8-type-checking`\] Visit keyword arguments in checks involving `typing.cast`/`typing.NewType` arguments ([#17538](https://github.com/astral-sh/ruff/pull/17538))
+- \[`pyupgrade`\] Preserve parenthesis when fixing native literals containing newlines (`UP018`) ([#17220](https://github.com/astral-sh/ruff/pull/17220))
+- \[`refurb`\] Mark the `FURB161` fix unsafe except for integers and booleans ([#17240](https://github.com/astral-sh/ruff/pull/17240))
+
+### Rule changes
+
+- \[`perflint`\] Allow list function calls to be replaced with a comprehension (`PERF401`) ([#17519](https://github.com/astral-sh/ruff/pull/17519))
+- \[`pycodestyle`\] Auto-fix redundant boolean comparison (`E712`) ([#17090](https://github.com/astral-sh/ruff/pull/17090))
+- \[`pylint`\] make fix unsafe if delete comments (`PLR1730`) ([#17459](https://github.com/astral-sh/ruff/pull/17459))
+
+### Documentation
+
+- Add fix safety sections to docs for several rules ([#17410](https://github.com/astral-sh/ruff/pull/17410),[#17440](https://github.com/astral-sh/ruff/pull/17440),[#17441](https://github.com/astral-sh/ruff/pull/17441),[#17443](https://github.com/astral-sh/ruff/pull/17443),[#17444](https://github.com/astral-sh/ruff/pull/17444))
+
+## 0.11.6
+
+### Preview features
+
+- Avoid adding whitespace to the end of a docstring after an escaped quote ([#17216](https://github.com/astral-sh/ruff/pull/17216))
+- \[`airflow`\] Extract `AIR311` from `AIR301` rules (`AIR301`, `AIR311`) ([#17310](https://github.com/astral-sh/ruff/pull/17310), [#17422](https://github.com/astral-sh/ruff/pull/17422))
+
+### Bug fixes
+
+- Raise syntax error when `\` is at end of file ([#17409](https://github.com/astral-sh/ruff/pull/17409))
+
+## 0.11.5
+
+### Preview features
+
+- \[`airflow`\] Add missing `AIR302` attribute check ([#17115](https://github.com/astral-sh/ruff/pull/17115))
+- \[`airflow`\] Expand module path check to individual symbols (`AIR302`) ([#17278](https://github.com/astral-sh/ruff/pull/17278))
+- \[`airflow`\] Extract `AIR312` from `AIR302` rules (`AIR302`, `AIR312`) ([#17152](https://github.com/astral-sh/ruff/pull/17152))
+- \[`airflow`\] Update oudated `AIR301`, `AIR302` rules ([#17123](https://github.com/astral-sh/ruff/pull/17123))
+- [syntax-errors] Async comprehension in sync comprehension ([#17177](https://github.com/astral-sh/ruff/pull/17177))
+- [syntax-errors] Check annotations in annotated assignments ([#17283](https://github.com/astral-sh/ruff/pull/17283))
+- [syntax-errors] Extend annotation checks to `await` ([#17282](https://github.com/astral-sh/ruff/pull/17282))
+
+### Bug fixes
+
+- \[`flake8-pie`\] Avoid false positive for multiple assignment with `auto()` (`PIE796`) ([#17274](https://github.com/astral-sh/ruff/pull/17274))
+
+### Rule changes
+
+- \[`ruff`\] Fix `RUF100` to detect unused file-level `noqa` directives with specific codes (#17042) ([#17061](https://github.com/astral-sh/ruff/pull/17061))
+- \[`flake8-pytest-style`\] Avoid false positive for legacy form of `pytest.raises` (`PT011`) ([#17231](https://github.com/astral-sh/ruff/pull/17231))
+
+### Documentation
+
+- Fix formatting of "See Style Guide" link ([#17272](https://github.com/astral-sh/ruff/pull/17272))
+
+## 0.11.4
+
+### Preview features
+
+- \[`ruff`\] Implement `invalid-rule-code` as `RUF102` ([#17138](https://github.com/astral-sh/ruff/pull/17138))
+- [syntax-errors] Detect duplicate keys in `match` mapping patterns ([#17129](https://github.com/astral-sh/ruff/pull/17129))
+- [syntax-errors] Detect duplicate attributes in `match` class patterns ([#17186](https://github.com/astral-sh/ruff/pull/17186))
+- [syntax-errors] Detect invalid syntax in annotations ([#17101](https://github.com/astral-sh/ruff/pull/17101))
+
+### Bug fixes
+
+- [syntax-errors] Fix multiple assignment error for class fields in `match` patterns ([#17184](https://github.com/astral-sh/ruff/pull/17184))
+- Don't skip visiting non-tuple slice in `typing.Annotated` subscripts ([#17201](https://github.com/astral-sh/ruff/pull/17201))
+
+## 0.11.3
+
+### Preview features
+
+- \[`airflow`\] Add more autofixes for `AIR302` ([#16876](https://github.com/astral-sh/ruff/pull/16876), [#16977](https://github.com/astral-sh/ruff/pull/16977), [#16976](https://github.com/astral-sh/ruff/pull/16976), [#16965](https://github.com/astral-sh/ruff/pull/16965))
+- \[`airflow`\] Move `AIR301` to `AIR002` ([#16978](https://github.com/astral-sh/ruff/pull/16978))
+- \[`airflow`\] Move `AIR302` to `AIR301` and `AIR303` to `AIR302` ([#17151](https://github.com/astral-sh/ruff/pull/17151))
+- \[`flake8-bandit`\] Mark `str` and `list[str]` literals as trusted input (`S603`) ([#17136](https://github.com/astral-sh/ruff/pull/17136))
+- \[`ruff`\] Support slices in `RUF005` ([#17078](https://github.com/astral-sh/ruff/pull/17078))
+- [syntax-errors] Start detecting compile-time syntax errors ([#16106](https://github.com/astral-sh/ruff/pull/16106))
+- [syntax-errors] Duplicate type parameter names ([#16858](https://github.com/astral-sh/ruff/pull/16858))
+- [syntax-errors] Irrefutable `case` pattern before final case ([#16905](https://github.com/astral-sh/ruff/pull/16905))
+- [syntax-errors] Multiple assignments in `case` pattern ([#16957](https://github.com/astral-sh/ruff/pull/16957))
+- [syntax-errors] Single starred assignment target ([#17024](https://github.com/astral-sh/ruff/pull/17024))
+- [syntax-errors] Starred expressions in `return`, `yield`, and `for` ([#17134](https://github.com/astral-sh/ruff/pull/17134))
+- [syntax-errors] Store to or delete `__debug__` ([#16984](https://github.com/astral-sh/ruff/pull/16984))
+
+### Bug fixes
+
+- Error instead of `panic!` when running Ruff from a deleted directory (#16903) ([#17054](https://github.com/astral-sh/ruff/pull/17054))
+- [syntax-errors] Fix false positive for parenthesized tuple index ([#16948](https://github.com/astral-sh/ruff/pull/16948))
+
+### CLI
+
+- Check `pyproject.toml` correctly when it is passed via stdin ([#16971](https://github.com/astral-sh/ruff/pull/16971))
+
+### Configuration
+
+- \[`flake8-import-conventions`\] Add import `numpy.typing as npt` to default `flake8-import-conventions.aliases` ([#17133](https://github.com/astral-sh/ruff/pull/17133))
+
+### Documentation
+
+- \[`refurb`\] Document why `UserDict`, `UserList`, and `UserString` are preferred over `dict`, `list`, and `str` (`FURB189`) ([#16927](https://github.com/astral-sh/ruff/pull/16927))
+
 ## 0.11.2

 ### Preview features
@@ -1421,11 +1566,11 @@ The following rules have been stabilized and are no longer in preview:

 The following behaviors have been stabilized:

-- [`cancel-scope-no-checkpoint`](https://docs.astral.sh/ruff/rules/cancel-scope-no-checkpoint/) (`ASYNC100`): Support `asyncio` and `anyio` context mangers.
-- [`async-function-with-timeout`](https://docs.astral.sh/ruff/rules/async-function-with-timeout/) (`ASYNC109`): Support `asyncio` and `anyio` context mangers.
-- [`async-busy-wait`](https://docs.astral.sh/ruff/rules/async-busy-wait/) (`ASYNC110`): Support `asyncio` and `anyio` context mangers.
-- [`async-zero-sleep`](https://docs.astral.sh/ruff/rules/async-zero-sleep/) (`ASYNC115`): Support `anyio` context mangers.
-- [`long-sleep-not-forever`](https://docs.astral.sh/ruff/rules/long-sleep-not-forever/) (`ASYNC116`): Support `anyio` context mangers.
+- [`cancel-scope-no-checkpoint`](https://docs.astral.sh/ruff/rules/cancel-scope-no-checkpoint/) (`ASYNC100`): Support `asyncio` and `anyio` context managers.
+- [`async-function-with-timeout`](https://docs.astral.sh/ruff/rules/async-function-with-timeout/) (`ASYNC109`): Support `asyncio` and `anyio` context managers.
+- [`async-busy-wait`](https://docs.astral.sh/ruff/rules/async-busy-wait/) (`ASYNC110`): Support `asyncio` and `anyio` context managers.
+- [`async-zero-sleep`](https://docs.astral.sh/ruff/rules/async-zero-sleep/) (`ASYNC115`): Support `anyio` context managers.
+- [`long-sleep-not-forever`](https://docs.astral.sh/ruff/rules/long-sleep-not-forever/) (`ASYNC116`): Support `anyio` context managers.

 The following fixes have been stabilized:

@@ -71,8 +71,7 @@ representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
-reported to the community leaders responsible for enforcement at
-<charlie.r.marsh@gmail.com>.
+reported to the community leaders responsible for enforcement at <hey@astral.sh>.
All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the
Cargo.lock (709 changes, generated): file diff suppressed because it is too large.
Cargo.toml (85 changes):
@@ -4,7 +4,7 @@ resolver = "2"

[workspace.package]
edition = "2021"
-rust-version = "1.83"
+rust-version = "1.84"
homepage = "https://docs.astral.sh/ruff"
documentation = "https://docs.astral.sh/ruff"
repository = "https://github.com/astral-sh/ruff"

@@ -35,13 +35,14 @@ ruff_python_trivia = { path = "crates/ruff_python_trivia" }
ruff_server = { path = "crates/ruff_server" }
ruff_source_file = { path = "crates/ruff_source_file" }
ruff_text_size = { path = "crates/ruff_text_size" }
-red_knot_vendored = { path = "crates/red_knot_vendored" }
ruff_workspace = { path = "crates/ruff_workspace" }

-red_knot_python_semantic = { path = "crates/red_knot_python_semantic" }
-red_knot_server = { path = "crates/red_knot_server" }
-red_knot_test = { path = "crates/red_knot_test" }
-red_knot_project = { path = "crates/red_knot_project", default-features = false }
+ty_ide = { path = "crates/ty_ide" }
+ty_project = { path = "crates/ty_project", default-features = false }
+ty_python_semantic = { path = "crates/ty_python_semantic" }
+ty_server = { path = "crates/ty_server" }
+ty_test = { path = "crates/ty_test" }
+ty_vendored = { path = "crates/ty_vendored" }

aho-corasick = { version = "1.1.3" }
anstream = { version = "0.6.18" }

@@ -54,7 +55,6 @@ bitflags = { version = "2.5.0" }
bstr = { version = "1.9.1" }
cachedir = { version = "0.3.1" }
camino = { version = "1.1.7" }
chrono = { version = "0.4.35", default-features = false, features = ["clock"] }
clap = { version = "4.5.3", features = ["derive"] }
clap_complete_command = { version = "0.6.0" }
clearscreen = { version = "4.0.0" }

@@ -94,6 +94,7 @@ insta-cmd = { version = "0.6.0" }
is-macro = { version = "0.3.5" }
is-wsl = { version = "0.4.0" }
itertools = { version = "0.14.0" }
jiff = { version = "0.2.0" }
js-sys = { version = "0.3.69" }
jod-thread = { version = "0.1.2" }
libc = { version = "0.2.153" }

@@ -123,7 +124,7 @@ rayon = { version = "1.10.0" }
regex = { version = "1.10.2" }
rustc-hash = { version = "2.0.0" }
# When updating salsa, make sure to also update the revision in `fuzz/Cargo.toml`
-salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "d758691ba17ee1a60c5356ea90888d529e1782ad" }
+salsa = { git = "https://github.com/salsa-rs/salsa.git", rev = "b2b82bccdbef3e7ce7f302c52f43a0c98ac7177a" }
schemars = { version = "0.8.16" }
seahash = { version = "4.1.0" }
serde = { version = "1.0.197", features = ["derive"] }

@@ -208,6 +209,7 @@ must_use_candidate = "allow"
similar_names = "allow"
single_match_else = "allow"
too_many_lines = "allow"
needless_continue = "allow" # An explicit continue can be more readable, especially if the alternative is an empty block.
# Without the hashes we run into a `rustfmt` bug in some snapshot tests, see #13250
needless_raw_string_hashes = "allow"
# Disallowed restriction lints

@@ -266,70 +268,3 @@ debug = 1
# The profile that 'cargo dist' will build with.
[profile.dist]
inherits = "release"
-
-# Config for 'dist'
-[workspace.metadata.dist]
-# The preferred dist version to use in CI (Cargo.toml SemVer syntax)
-cargo-dist-version = "0.25.2-prerelease.3"
-# CI backends to support
-ci = "github"
-# The installers to generate for each app
-installers = ["shell", "powershell"]
-# The archive format to use for windows builds (defaults .zip)
-windows-archive = ".zip"
-# The archive format to use for non-windows builds (defaults .tar.xz)
-unix-archive = ".tar.gz"
-# Target platforms to build apps for (Rust target-triple syntax)
-targets = [
-    "aarch64-apple-darwin",
-    "aarch64-pc-windows-msvc",
-    "aarch64-unknown-linux-gnu",
-    "aarch64-unknown-linux-musl",
-    "arm-unknown-linux-musleabihf",
-    "armv7-unknown-linux-gnueabihf",
-    "armv7-unknown-linux-musleabihf",
-    "i686-pc-windows-msvc",
-    "i686-unknown-linux-gnu",
-    "i686-unknown-linux-musl",
-    "powerpc64-unknown-linux-gnu",
-    "powerpc64le-unknown-linux-gnu",
-    "s390x-unknown-linux-gnu",
-    "x86_64-apple-darwin",
-    "x86_64-pc-windows-msvc",
-    "x86_64-unknown-linux-gnu",
-    "x86_64-unknown-linux-musl",
-]
-# Whether to auto-include files like READMEs, LICENSEs, and CHANGELOGs (default true)
-auto-includes = false
-# Whether dist should create a Github Release or use an existing draft
-create-release = true
-# Which actions to run on pull requests
-pr-run-mode = "skip"
-# Whether CI should trigger releases with dispatches instead of tag pushes
-dispatch-releases = true
-# Which phase dist should use to create the GitHub release
-github-release = "announce"
-# Whether CI should include auto-generated code to build local artifacts
-build-local-artifacts = false
-# Local artifacts jobs to run in CI
-local-artifacts-jobs = ["./build-binaries", "./build-docker"]
-# Publish jobs to run in CI
-publish-jobs = ["./publish-pypi", "./publish-wasm"]
-# Post-announce jobs to run in CI
-post-announce-jobs = [
-    "./notify-dependents",
-    "./publish-docs",
-    "./publish-playground",
-]
-# Custom permissions for GitHub Jobs
-github-custom-job-permissions = { "build-docker" = { packages = "write", contents = "read" }, "publish-wasm" = { contents = "read", id-token = "write", packages = "write" } }
-# Whether to install an updater program
-install-updater = false
-# Path that installers should place binaries in
-install-path = ["$XDG_BIN_HOME/", "$XDG_DATA_HOME/../bin", "~/.local/bin"]
-# Temporarily allow changes to the `release` workflow, in which we pin actions
-# to a SHA instead of a tag (https://github.com/astral-sh/uv/issues/12253)
-allow-dirty = ["ci"]
-
-[workspace.metadata.dist.github-custom-runners]
-global = "depot-ubuntu-latest-4"
@@ -149,8 +149,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"

# For a specific version.
-curl -LsSf https://astral.sh/ruff/0.11.2/install.sh | sh
-powershell -c "irm https://astral.sh/ruff/0.11.2/install.ps1 | iex"
+curl -LsSf https://astral.sh/ruff/0.11.8/install.sh | sh
+powershell -c "irm https://astral.sh/ruff/0.11.8/install.ps1 | iex"
```

You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),

@@ -183,7 +183,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff

```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
-  rev: v0.11.2
+  rev: v0.11.8
  hooks:
    # Run the linter.
    - id: ruff
@@ -1,7 +1,7 @@
[files]
# https://github.com/crate-ci/typos/issues/868
extend-exclude = [
-    "crates/red_knot_vendored/vendor/**/*",
+    "crates/ty_vendored/vendor/**/*",
    "**/resources/**/*",
    "**/snapshots/**/*",
]
@@ -1,25 +0,0 @@
# Red Knot

Red Knot is an extremely fast type checker.
Currently, it is a work in progress and not ready for user testing.

Red Knot is designed to prioritize good type inference, even in unannotated code,
and aims to avoid false positives.

While Red Knot will produce similar results to mypy and pyright on many codebases,
100% compatibility with these tools is a non-goal.
On some codebases, Red Knot's design decisions lead to different outcomes
than you would get from running one of these more established tools.

## Contributing

Core type checking tests are written as Markdown code blocks.
They can be found in [`red_knot_python_semantic/resources/mdtest`][resources-mdtest].
See [`red_knot_test/README.md`][mdtest-readme] for more information
on the test framework itself.

The list of open issues can be found [here][open-issues].

[mdtest-readme]: ../red_knot_test/README.md
[open-issues]: https://github.com/astral-sh/ruff/issues?q=sort%3Aupdated-desc%20is%3Aissue%20is%3Aopen%20label%3Ared-knot
[resources-mdtest]: ../red_knot_python_semantic/resources/mdtest
@@ -1,83 +0,0 @@
# Any

## Annotation

`typing.Any` is a way to name the Any type.

```py
from typing import Any

x: Any = 1
x = "foo"

def f():
    reveal_type(x)  # revealed: Any
```

## Aliased to a different name

If you alias `typing.Any` to another name, we still recognize that as a spelling of the Any type.

```py
from typing import Any as RenamedAny

x: RenamedAny = 1
x = "foo"

def f():
    reveal_type(x)  # revealed: Any
```

## Shadowed class

If you define your own class named `Any`, using that in a type expression refers to your class, and
isn't a spelling of the Any type.

```py
class Any: ...

x: Any

def f():
    reveal_type(x)  # revealed: Any

# This verifies that we're not accidentally seeing typing.Any, since str is assignable
# to that but not to our locally defined class.
y: Any = "not an Any"  # error: [invalid-assignment]
```

## Subclass

The spec allows you to define subclasses of `Any`.

TODO: Handle assignments correctly. `Subclass` has an unknown superclass, which might be `int`. The
assignment to `x` should not be allowed, even when the unknown superclass is `int`. The assignment
to `y` should be allowed, since `Subclass` might have `int` as a superclass, and is therefore
assignable to `int`.

```py
from typing import Any

class Subclass(Any): ...

reveal_type(Subclass.__mro__)  # revealed: tuple[Literal[Subclass], Any, Literal[object]]

x: Subclass = 1  # error: [invalid-assignment]
# TODO: no diagnostic
y: int = Subclass()  # error: [invalid-assignment]

def _(s: Subclass):
    reveal_type(s)  # revealed: Subclass
```

## Invalid

`Any` cannot be parameterized:

```py
from typing import Any

# error: [invalid-type-form] "Type `typing.Any` expected no type parameter"
def f(x: Any[int]):
    reveal_type(x)  # revealed: Unknown
```
@@ -1,47 +0,0 @@
# Deferred annotations

## Deferred annotations in stubs always resolve

`mod.pyi`:

```pyi
def get_foo() -> Foo: ...
class Foo: ...
```

```py
from mod import get_foo

reveal_type(get_foo())  # revealed: Foo
```

## Deferred annotations in regular code fail

In (regular) source files, annotations are *not* deferred. This also tests that imports from
`__future__` that are not `annotations` are ignored.

```py
from __future__ import with_statement as annotations

# error: [unresolved-reference]
def get_foo() -> Foo: ...

class Foo: ...

reveal_type(get_foo())  # revealed: Unknown
```

## Deferred annotations in regular code with `__future__.annotations`

If `__future__.annotations` is imported, annotations *are* deferred.

```py
from __future__ import annotations

def get_foo() -> Foo:
    return Foo()

class Foo: ...

reveal_type(get_foo())  # revealed: Foo
```
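The runtime semantics these tests mirror can be observed directly in CPython; the following is a minimal sketch using `exec` on illustrative source strings (the names `get_foo` and `Foo` echo the tests above, everything else is an assumption for demonstration):

```python
# Annotations in a regular module are evaluated eagerly, so a forward
# reference raises NameError -- unless `from __future__ import annotations`
# defers them into strings (the compiler honors future statements in the
# source passed to exec()).
eager_src = "def get_foo() -> Foo: ...\nclass Foo: ...\n"
try:
    exec(eager_src, {})
    eager_failed = False
except NameError:
    # `Foo` is looked up at definition time, before the class exists.
    eager_failed = True

deferred_src = (
    "from __future__ import annotations\n"
    "def get_foo() -> Foo: ...\n"
    "class Foo: ...\n"
)
namespace = {}
exec(deferred_src, namespace)  # no error: the annotation is kept as a string

print(eager_failed)                                    # True
print(namespace["get_foo"].__annotations__["return"])  # Foo
```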
@@ -1,126 +0,0 @@
# Typing-module aliases to other stdlib classes

The `typing` module has various aliases to other stdlib classes. These are a legacy feature, but
still need to be supported by a type checker.

## Correspondence

All of the following symbols can be mapped one-to-one with the actual type:

```py
import typing

def f(
    list_bare: typing.List,
    list_parametrized: typing.List[int],
    dict_bare: typing.Dict,
    dict_parametrized: typing.Dict[int, str],
    set_bare: typing.Set,
    set_parametrized: typing.Set[int],
    frozen_set_bare: typing.FrozenSet,
    frozen_set_parametrized: typing.FrozenSet[str],
    chain_map_bare: typing.ChainMap,
    chain_map_parametrized: typing.ChainMap[int],
    counter_bare: typing.Counter,
    counter_parametrized: typing.Counter[int],
    default_dict_bare: typing.DefaultDict,
    default_dict_parametrized: typing.DefaultDict[str, int],
    deque_bare: typing.Deque,
    deque_parametrized: typing.Deque[str],
    ordered_dict_bare: typing.OrderedDict,
    ordered_dict_parametrized: typing.OrderedDict[int, str],
):
    reveal_type(list_bare)  # revealed: list
    reveal_type(list_parametrized)  # revealed: list

    reveal_type(dict_bare)  # revealed: dict
    reveal_type(dict_parametrized)  # revealed: dict

    reveal_type(set_bare)  # revealed: set
    reveal_type(set_parametrized)  # revealed: set

    reveal_type(frozen_set_bare)  # revealed: frozenset
    reveal_type(frozen_set_parametrized)  # revealed: frozenset

    reveal_type(chain_map_bare)  # revealed: ChainMap
    reveal_type(chain_map_parametrized)  # revealed: ChainMap

    reveal_type(counter_bare)  # revealed: Counter
    reveal_type(counter_parametrized)  # revealed: Counter

    reveal_type(default_dict_bare)  # revealed: defaultdict
    reveal_type(default_dict_parametrized)  # revealed: defaultdict

    reveal_type(deque_bare)  # revealed: deque
    reveal_type(deque_parametrized)  # revealed: deque

    reveal_type(ordered_dict_bare)  # revealed: OrderedDict
    reveal_type(ordered_dict_parametrized)  # revealed: OrderedDict
```
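At runtime, each legacy alias records the concrete class it stands for, which `typing.get_origin` exposes; a small sketch of the correspondence table above (the `aliases` mapping itself is illustrative):

```python
import collections
import typing

# Each legacy alias in `typing` carries an `__origin__` pointing at the
# concrete runtime class -- exactly the one-to-one mapping a checker uses.
aliases = {
    typing.List: list,
    typing.Dict: dict,
    typing.Set: set,
    typing.FrozenSet: frozenset,
    typing.ChainMap: collections.ChainMap,
    typing.Counter: collections.Counter,
    typing.DefaultDict: collections.defaultdict,
    typing.Deque: collections.deque,
    typing.OrderedDict: collections.OrderedDict,
}
for alias, concrete in aliases.items():
    assert typing.get_origin(alias) is concrete
print("all aliases map to their concrete classes")
```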
## Inheritance

The aliases can be inherited from. Some of these are still partially or wholly TODOs.

```py
import typing

####################
### Built-ins

class ListSubclass(typing.List): ...

# revealed: tuple[Literal[ListSubclass], Literal[list], Literal[MutableSequence], Literal[Sequence], Literal[Reversible], Literal[Collection], Literal[Iterable], Literal[Container], @Todo(protocol), Literal[object]]
reveal_type(ListSubclass.__mro__)

class DictSubclass(typing.Dict): ...

# TODO: should have `Generic`, should not have `Unknown`
# revealed: tuple[Literal[DictSubclass], Literal[dict], Unknown, Literal[object]]
reveal_type(DictSubclass.__mro__)

class SetSubclass(typing.Set): ...

# TODO: should have `Generic`, should not have `Unknown`
# revealed: tuple[Literal[SetSubclass], Literal[set], Unknown, Literal[object]]
reveal_type(SetSubclass.__mro__)

class FrozenSetSubclass(typing.FrozenSet): ...

# TODO: should have `Generic`, should not have `Unknown`
# revealed: tuple[Literal[FrozenSetSubclass], Literal[frozenset], Unknown, Literal[object]]
reveal_type(FrozenSetSubclass.__mro__)

####################
### `collections`

class ChainMapSubclass(typing.ChainMap): ...

# TODO: Should be (ChainMapSubclass, ChainMap, MutableMapping, Mapping, Collection, Sized, Iterable, Container, Generic, object)
# revealed: tuple[Literal[ChainMapSubclass], Literal[ChainMap], Unknown, Literal[object]]
reveal_type(ChainMapSubclass.__mro__)

class CounterSubclass(typing.Counter): ...

# TODO: Should be (CounterSubclass, Counter, dict, MutableMapping, Mapping, Collection, Sized, Iterable, Container, Generic, object)
# revealed: tuple[Literal[CounterSubclass], Literal[Counter], Unknown, Literal[object]]
reveal_type(CounterSubclass.__mro__)

class DefaultDictSubclass(typing.DefaultDict): ...

# TODO: Should be (DefaultDictSubclass, defaultdict, dict, MutableMapping, Mapping, Collection, Sized, Iterable, Container, Generic, object)
# revealed: tuple[Literal[DefaultDictSubclass], Literal[defaultdict], Unknown, Literal[object]]
reveal_type(DefaultDictSubclass.__mro__)

class DequeSubclass(typing.Deque): ...

# TODO: Should be (DequeSubclass, deque, MutableSequence, Sequence, Reversible, Collection, Sized, Iterable, Container, Generic, object)
# revealed: tuple[Literal[DequeSubclass], Literal[deque], Unknown, Literal[object]]
reveal_type(DequeSubclass.__mro__)

class OrderedDictSubclass(typing.OrderedDict): ...

# TODO: Should be (OrderedDictSubclass, OrderedDict, dict, MutableMapping, Mapping, Collection, Sized, Iterable, Container, Generic, object)
# revealed: tuple[Literal[OrderedDictSubclass], Literal[OrderedDict], Unknown, Literal[object]]
reveal_type(OrderedDictSubclass.__mro__)
```
@@ -1,40 +0,0 @@
# Calling builtins

## `bool` with incorrect arguments

```py
class NotBool:
    __bool__ = None

# error: [too-many-positional-arguments] "Too many positional arguments to class `bool`: expected 1, got 2"
bool(1, 2)

# TODO: We should emit an `unsupported-bool-conversion` error here because the argument doesn't implement `__bool__` correctly.
bool(NotBool())
```
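For reference, CPython rejects both calls at runtime; a hedged sketch of the analogous behavior (the flag names are illustrative):

```python
class NotBool:
    __bool__ = None

# `bool` accepts at most one positional argument, so the extra argument
# fails with a TypeError, mirroring `too-many-positional-arguments`.
try:
    bool(1, 2)
    extra_arg_raised = False
except TypeError:
    extra_arg_raised = True

# A non-callable `__bool__` also fails when truthiness is computed, which
# is the behavior the `unsupported-bool-conversion` TODO above refers to.
try:
    bool(NotBool())
    bad_dunder_raised = False
except TypeError:
    bad_dunder_raised = True

print(extra_arg_raised, bad_dunder_raised)  # True True
```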
## Calls to `type()`

A single-argument call to `type()` returns an object that has the argument's meta-type. (This is
tested more extensively in `crates/red_knot_python_semantic/resources/mdtest/attributes.md`,
alongside the tests for the `__class__` attribute.)

```py
reveal_type(type(1))  # revealed: Literal[int]
```

But a three-argument call to `type` creates a dynamic instance of the `type` class:

```py
reveal_type(type("Foo", (), {}))  # revealed: type
```

Other numbers of arguments are invalid:

```py
# error: [no-matching-overload] "No overload of class `type` matches arguments"
type("Foo", ())

# error: [no-matching-overload] "No overload of class `type` matches arguments"
type("Foo", (), {}, weird_other_arg=42)
```
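A runtime sketch of the two calling conventions of `type()` (the `Foo` name and `answer` attribute are illustrative):

```python
# One argument: returns the argument's class.
assert type(1) is int

# Three arguments (name, bases, namespace): creates a new class dynamically.
Foo = type("Foo", (), {"answer": 42})
assert isinstance(Foo, type)
assert Foo().answer == 42

# Any other number of positional arguments matches no overload and fails
# at runtime as well.
try:
    type("Foo", ())
    raised = False
except TypeError:
    raised = True
assert raised
print("ok")
```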
@@ -1,7 +0,0 @@
# Constructor

```py
class Foo: ...

reveal_type(Foo())  # revealed: Foo
```
@@ -1,163 +0,0 @@
# Unions in calls

## Union of return types

```py
def _(flag: bool):
    if flag:
        def f() -> int:
            return 1
    else:
        def f() -> str:
            return "foo"
    reveal_type(f())  # revealed: int | str
```

## Calling with an unknown union

```py
from nonexistent import f  # error: [unresolved-import] "Cannot resolve import `nonexistent`"

def coinflip() -> bool:
    return True

if coinflip():
    def f() -> int:
        return 1

reveal_type(f())  # revealed: Unknown | int
```

## Non-callable elements in a union

Calling a union with a non-callable element should emit a diagnostic.

```py
def _(flag: bool):
    if flag:
        f = 1
    else:
        def f() -> int:
            return 1
    x = f()  # error: [call-non-callable] "Object of type `Literal[1]` is not callable"
    reveal_type(x)  # revealed: Unknown | int
```

## Multiple non-callable elements in a union

Calling a union with multiple non-callable elements should mention all of them in the diagnostic.

```py
def _(flag: bool, flag2: bool):
    if flag:
        f = 1
    elif flag2:
        f = "foo"
    else:
        def f() -> int:
            return 1
    # TODO we should mention all non-callable elements of the union
    # error: [call-non-callable] "Object of type `Literal[1]` is not callable"
    # revealed: Unknown | int
    reveal_type(f())
```

## All non-callable union elements

Calling a union with no callable elements can emit a simpler diagnostic.

```py
def _(flag: bool):
    if flag:
        f = 1
    else:
        f = "foo"

    x = f()  # error: [call-non-callable] "Object of type `Literal[1, "foo"]` is not callable"
    reveal_type(x)  # revealed: Unknown
```

## Mismatching signatures

Calling a union where the arguments don't match the signature of all variants.

```py
def f1(a: int) -> int:
    return a

def f2(a: str) -> str:
    return a

def _(flag: bool):
    if flag:
        f = f1
    else:
        f = f2

    # error: [invalid-argument-type] "Object of type `Literal[3]` cannot be assigned to parameter 1 (`a`) of function `f2`; expected type `str`"
    x = f(3)
    reveal_type(x)  # revealed: int | str
```

## Any non-callable variant

```py
def f1(a: int): ...
def _(flag: bool):
    if flag:
        f = f1
    else:
        f = "This is a string literal"

    # error: [call-non-callable] "Object of type `Literal["This is a string literal"]` is not callable"
    x = f(3)
    reveal_type(x)  # revealed: Unknown
```

## Union of binding errors

```py
def f1(): ...
def f2(): ...
def _(flag: bool):
    if flag:
        f = f1
    else:
        f = f2

    # TODO: we should show all errors from the union, not arbitrarily pick one union element
    # error: [too-many-positional-arguments] "Too many positional arguments to function `f1`: expected 0, got 1"
    x = f(3)
    reveal_type(x)  # revealed: Unknown
```

## One not-callable, one wrong argument

```py
class C: ...

def f1(): ...
def _(flag: bool):
    if flag:
        f = f1
    else:
        f = C()

    # TODO: we should either show all union errors here, or prioritize the not-callable error
    # error: [too-many-positional-arguments] "Too many positional arguments to function `f1`: expected 0, got 1"
    x = f(3)
    reveal_type(x)  # revealed: Unknown
```

## Union including a special-cased function

```py
def _(flag: bool):
    if flag:
        f = str
    else:
        f = repr
    reveal_type(str("string"))  # revealed: Literal["string"]
    reveal_type(repr("string"))  # revealed: Literal["'string'"]
    reveal_type(f("string"))  # revealed: Literal["string", "'string'"]
```
@@ -1,63 +0,0 @@
# Pattern matching

## With wildcard

```py
def _(target: int):
    match target:
        case 1:
            y = 2
        case _:
            y = 3

    reveal_type(y)  # revealed: Literal[2, 3]
```
## Without wildcard

```py
def _(target: int):
    match target:
        case 1:
            y = 2
        case 2:
            y = 3

    # revealed: Literal[2, 3]
    # error: [possibly-unresolved-reference]
    reveal_type(y)
```

## Basic match

```py
def _(target: int):
    y = 1
    y = 2

    match target:
        case 1:
            y = 3
        case 2:
            y = 4

    reveal_type(y)  # revealed: Literal[2, 3, 4]
```

## Guard with object that implements `__bool__` incorrectly

```py
class NotBoolable:
    __bool__: int = 3

def _(target: int, flag: NotBoolable):
    y = 1
    match target:
        # error: [unsupported-bool-conversion] "Boolean conversion is unsupported for type `NotBoolable`; its `__bool__` method isn't callable"
        case 1 if flag:
            y = 2
        case 2:
            y = 3

    reveal_type(y)  # revealed: Literal[1, 2, 3]
```
@@ -1,29 +0,0 @@
# `cast`

`cast()` takes two arguments, one type and one value, and returns a value of the given type.

The (inferred) type of the value and the given type do not need to have any correlation.

```py
from typing import Literal, cast

reveal_type(True)  # revealed: Literal[True]
reveal_type(cast(str, True))  # revealed: str
reveal_type(cast("str", True))  # revealed: str

reveal_type(cast(int | str, 1))  # revealed: int | str

# error: [invalid-type-form]
reveal_type(cast(Literal, True))  # revealed: Unknown

# error: [invalid-type-form]
reveal_type(cast(1, True))  # revealed: Unknown

# TODO: These should be errors
cast(str)
cast(str, b"ar", "foo")

# TODO: Either support keyword arguments properly,
# or give a comprehensible error message saying they're unsupported
cast(val="foo", typ=int)  # error: [unresolved-reference] "Name `foo` used when not defined"
```
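At runtime `cast` is an identity function, which is why the value's inferred type and the target type need no correlation; a minimal sketch:

```python
from typing import cast

# `cast` returns its second argument unchanged; only the static type changes.
value = cast(str, True)
assert value is True           # same object comes back
assert type(value) is bool     # the runtime class is untouched
print("cast is a runtime no-op")
```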
@@ -1,200 +0,0 @@
|
||||
# Generic classes
|
||||
|
||||
## PEP 695 syntax
|
||||
|
||||
TODO: Add a `red_knot_extension` function that asserts whether a function or class is generic.
|
||||
|
||||
This is a generic class defined using PEP 695 syntax:
|
||||
|
||||
```py
|
||||
class C[T]: ...
|
||||
```
|
||||
|
||||
A class that inherits from a generic class, and fills its type parameters with typevars, is generic:
|
||||
|
||||
```py
|
||||
# TODO: no error
|
||||
# error: [non-subscriptable]
|
||||
class D[U](C[U]): ...
|
||||
```
|
||||
|
||||
A class that inherits from a generic class, but fills its type parameters with concrete types, is
|
||||
_not_ generic:
|
||||
|
||||
```py
|
||||
# TODO: no error
|
||||
# error: [non-subscriptable]
|
||||
class E(C[int]): ...
|
||||
```
|
||||
|
||||
A class that inherits from a generic class, and doesn't fill its type parameters at all, implicitly
|
||||
uses the default value for the typevar. In this case, that default type is `Unknown`, so `F`
|
||||
inherits from `C[Unknown]` and is not itself generic.
|
||||
|
||||
```py
|
||||
class F(C): ...
|
||||
```
|
||||
|
||||
## Legacy syntax
|
||||
|
||||
This is a generic class defined using the legacy syntax:
|
||||
|
||||
```py
|
||||
from typing import Generic, TypeVar
|
||||
|
||||
T = TypeVar("T")
|
||||
|
||||
# TODO: no error
|
||||
# error: [invalid-base]
|
||||
class C(Generic[T]): ...
|
||||
```
|
||||
|
||||
A class that inherits from a generic class, and fills its type parameters with typevars, is generic.
|
||||
|
||||
```py
|
||||
class D(C[T]): ...
|
||||
```
|
||||
|
||||
(Examples `E` and `F` from above do not have analogues in the legacy syntax.)
|
||||
|
||||
## Inferring generic class parameters
|
||||
|
||||
The type parameter can be specified explicitly:
|
||||
|
||||
```py
|
||||
class C[T]:
|
||||
x: T
|
||||
|
||||
# TODO: no error
|
||||
# TODO: revealed: C[int]
|
||||
# error: [non-subscriptable]
|
||||
reveal_type(C[int]()) # revealed: C
|
||||
```
|
||||
|
||||
We can infer the type parameter from a type context:
|
||||
|
||||
```py
|
||||
c: C[int] = C()
|
||||
# TODO: revealed: C[int]
|
||||
reveal_type(c) # revealed: C
|
||||
```
|
||||
|
||||
The typevars of a fully specialized generic class should no longer be visible:
|
||||
|
||||
```py
|
||||
# TODO: revealed: int
|
||||
reveal_type(c.x) # revealed: T
|
||||
```
|
||||
|
||||
If the type parameter is not specified explicitly, and there are no constraints that let us infer a
|
||||
specific type, we infer the typevar's default type:
|
||||
|
||||
```py
|
||||
class D[T = int]: ...
|
||||
|
||||
# TODO: revealed: D[int]
|
||||
reveal_type(D()) # revealed: D
|
||||
```
|
||||
|
||||
If a typevar does not provide a default, we use `Unknown`:
|
||||
|
||||
```py
|
||||
# TODO: revealed: C[Unknown]
|
||||
reveal_type(C()) # revealed: C
|
||||
```
|
||||
|
||||
If the type of a constructor parameter is a class typevar, we can use that to infer the type
|
||||
parameter:
|
||||
|
||||
```py
|
||||
class E[T]:
|
||||
def __init__(self, x: T) -> None: ...
|
||||
|
||||
# TODO: revealed: E[int] or E[Literal[1]]
reveal_type(E(1))  # revealed: E
```

The types inferred from a type context and from a constructor parameter must be consistent with each
other:

```py
# TODO: error
wrong_innards: E[int] = E("five")
```

## Generic subclass

When a generic subclass fills its superclass's type parameter with one of its own, the actual types
propagate through:

```py
class Base[T]:
    x: T | None = None

# TODO: no error
# error: [non-subscriptable]
class Sub[U](Base[U]): ...

# TODO: no error
# TODO: revealed: int | None
# error: [non-subscriptable]
reveal_type(Base[int].x)  # revealed: T | None

# TODO: revealed: int | None
# error: [non-subscriptable]
reveal_type(Sub[int].x)  # revealed: T | None
```

## Cyclic class definition

A class can use itself as the type parameter of one of its superclasses. (This is also known as the
[curiously recurring template pattern][crtp] or [F-bounded quantification][f-bound].)

Here, `Sub` is not a generic class, since it fills its superclass's type parameter (with itself).

`stub.pyi`:

```pyi
class Base[T]: ...
# TODO: no error
# error: [non-subscriptable]
class Sub(Base[Sub]): ...

reveal_type(Sub)  # revealed: Literal[Sub]
```

A similar case can work in a non-stub file, if forward references are stringified:

`string_annotation.py`:

```py
class Base[T]: ...

# TODO: no error
# error: [non-subscriptable]
class Sub(Base["Sub"]): ...

reveal_type(Sub)  # revealed: Literal[Sub]
```

In a non-stub file, without stringified forward references, this raises a `NameError`:

`bare_annotation.py`:

```py
class Base[T]: ...

# TODO: error: [unresolved-reference]
# error: [non-subscriptable]
class Sub(Base[Sub]): ...
```

## Another cyclic case

```pyi
# TODO no error (generics)
# error: [invalid-base]
class Derived[T](list[Derived[T]]): ...
```

[crtp]: https://en.wikipedia.org/wiki/Curiously_recurring_template_pattern
[f-bound]: https://en.wikipedia.org/wiki/Bounded_quantification#F-bounded_quantification
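The curiously-recurring pattern tested above can also be written with the legacy `TypeVar` syntax, which runs on any supported Python version. The following is an illustrative sketch (the `Ordered`/`Version` names are invented for this example, they do not appear in the tests above):

```python
from typing import Generic, TypeVar

# F-bounded quantification: the type variable is bounded by the generic base itself.
T = TypeVar("T", bound="Ordered")

class Ordered(Generic[T]):
    def compare(self, other: T) -> int:
        raise NotImplementedError

# The subclass fills the base's type parameter with itself (CRTP), so
# `compare` on a `Version` only accepts other `Version` instances.
class Version(Ordered["Version"]):
    def __init__(self, n: int) -> None:
        self.n = n

    def compare(self, other: "Version") -> int:
        return self.n - other.n

print(Version(2).compare(Version(1)))  # 1
```

Because `Version` fills the type parameter, it is not itself generic — the same situation as `Sub(Base[Sub])` in the stub above.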
@@ -1,51 +0,0 @@

# PEP 695 Generics

[PEP 695] and Python 3.12 introduced new, more ergonomic syntax for type variables.

## Type variables

### Defining PEP 695 type variables

PEP 695 introduces a new syntax for defining type variables. The resulting type variables are
instances of `typing.TypeVar`, just like legacy type variables.

```py
def f[T]():
    reveal_type(type(T))  # revealed: Literal[TypeVar]
    reveal_type(T)  # revealed: T
    reveal_type(T.__name__)  # revealed: Literal["T"]
```

### Cannot have only one constraint

> `TypeVar` supports constraining parametric types to a fixed set of possible types... There should
> be at least two constraints, if any; specifying a single constraint is disallowed.

```py
# error: [invalid-type-variable-constraints] "TypeVar must have at least two constrained types"
def f[T: (int,)]():
    pass
```

## Invalid uses

Note that many of the invalid uses of legacy typevars do not apply to PEP 695 typevars, since the
PEP 695 syntax is only allowed in places where typevars are allowed.

## Displaying typevars

We use a suffix when displaying the typevars of a generic function or class. This helps distinguish
different uses of the same typevar.

```py
def f[T](x: T, y: T) -> None:
    # TODO: revealed: T@f
    reveal_type(x)  # revealed: T

class C[T]:
    def m(self, x: T) -> None:
        # TODO: revealed: T@C
        reveal_type(x)  # revealed: T
```

[pep 695]: https://peps.python.org/pep-0695/
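The claim that PEP 695 type variables are plain `typing.TypeVar` instances can be checked at runtime. This sketch uses the legacy spelling (since `def f[T]` needs Python 3.12+) and also demonstrates the single-constraint rule, which `typing` enforces at runtime as well:

```python
from typing import TypeVar

# Legacy spelling of the type variable created by `def f[T]`.
T = TypeVar("T")

print(type(T).__name__)  # TypeVar
print(T.__name__)        # T

# A single constraint is rejected by typing itself, mirroring the
# `invalid-type-variable-constraints` diagnostic above.
try:
    TypeVar("U", int)
except TypeError as exc:
    print("rejected:", exc)
```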
@@ -1,7 +0,0 @@

# Dictionaries

## Empty dictionary

```py
reveal_type({})  # revealed: dict
```
@@ -1,47 +0,0 @@

# Narrowing for nested conditionals

## Multiple negative contributions

```py
def _(x: int):
    if x != 1:
        if x != 2:
            if x != 3:
                reveal_type(x)  # revealed: int & ~Literal[1] & ~Literal[2] & ~Literal[3]
```

## Multiple negative contributions with simplification

```py
def _(flag1: bool, flag2: bool):
    x = 1 if flag1 else 2 if flag2 else 3

    if x != 1:
        reveal_type(x)  # revealed: Literal[2, 3]
        if x != 2:
            reveal_type(x)  # revealed: Literal[3]
```

## elif-else blocks

```py
def _(flag1: bool, flag2: bool):
    x = 1 if flag1 else 2 if flag2 else 3

    if x != 1:
        reveal_type(x)  # revealed: Literal[2, 3]
        if x == 2:
            # TODO should be `Literal[2]`
            reveal_type(x)  # revealed: Literal[2, 3]
        elif x == 3:
            reveal_type(x)  # revealed: Literal[3]
        else:
            reveal_type(x)  # revealed: Never

    elif x != 2:
        # TODO should be Literal[1]
        reveal_type(x)  # revealed: Literal[1, 3]
    else:
        # TODO should be Never
        reveal_type(x)  # revealed: Literal[1, 2, 3]
```
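The successive negative narrowing tested above can be modeled as set subtraction over the values a variable may hold. This is a hypothetical model for intuition, not the checker's implementation:

```python
# Model a union of int literals as the set of values it admits.
possible = {1, 2, 3, 4}

# Each `x != n` guard removes one inhabitant, which is exactly what the
# intersection `int & ~Literal[1] & ~Literal[2] & ~Literal[3]` expresses.
for guard in (1, 2, 3):
    possible = {v for v in possible if v != guard}

print(possible)  # {4}
```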
@@ -1,91 +0,0 @@

# Narrowing for `!=` conditionals

## `x != None`

```py
def _(flag: bool):
    x = None if flag else 1

    if x != None:
        reveal_type(x)  # revealed: Literal[1]
    else:
        # TODO should be None
        reveal_type(x)  # revealed: None | Literal[1]
```

## `!=` for other singleton types

```py
def _(flag: bool):
    x = True if flag else False

    if x != False:
        reveal_type(x)  # revealed: Literal[True]
    else:
        # TODO should be Literal[False]
        reveal_type(x)  # revealed: bool
```

## `x != y` where `y` is of literal type

```py
def _(flag: bool):
    x = 1 if flag else 2

    if x != 1:
        reveal_type(x)  # revealed: Literal[2]
```

## `x != y` where `y` is a single-valued type

```py
def _(flag: bool):
    class A: ...
    class B: ...
    C = A if flag else B

    if C != A:
        reveal_type(C)  # revealed: Literal[B]
    else:
        # TODO should be Literal[A]
        reveal_type(C)  # revealed: Literal[A, B]
```

## `x != y` where `y` has multiple single-valued options

```py
def _(flag1: bool, flag2: bool):
    x = 1 if flag1 else 2
    y = 2 if flag2 else 3

    if x != y:
        reveal_type(x)  # revealed: Literal[1, 2]
    else:
        # TODO should be Literal[2]
        reveal_type(x)  # revealed: Literal[1, 2]
```

## `!=` for non-single-valued types

Only single-valued types should narrow the type:

```py
def _(flag: bool, a: int, y: int):
    x = a if flag else None

    if x != y:
        reveal_type(x)  # revealed: int | None
```

## Mix of single-valued and non-single-valued types

```py
def _(flag1: bool, flag2: bool, a: int):
    x = 1 if flag1 else 2
    y = 2 if flag2 else a

    if x != y:
        reveal_type(x)  # revealed: Literal[1, 2]
    else:
        reveal_type(x)  # revealed: Literal[1, 2]
```
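The asymmetry in the sections above — the `else` branch of `x != y` narrows only when `y`'s type is single-valued — can be sketched with the same value-set model. `narrow_eq` is an invented helper for illustration, not the checker's API:

```python
def narrow_eq(x_values: set[int], y_values: set[int]) -> set[int]:
    """Narrow x in the false branch of `x != y`, where `x == y` must hold."""
    if len(y_values) == 1:
        # y is single-valued: x must equal that one value.
        return x_values & y_values
    # Otherwise knowing x == y pins x to no particular value.
    return x_values

print(narrow_eq({1, 2}, {2}))     # {2}: y is Literal[2], so x narrows
print(narrow_eq({1, 2}, {2, 3}))  # {1, 2}: y has several options, no narrowing
```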
@@ -1,66 +0,0 @@

# Narrowing for `match` statements

## Single `match` pattern

```py
def _(flag: bool):
    x = None if flag else 1

    reveal_type(x)  # revealed: None | Literal[1]

    y = 0

    match x:
        case None:
            y = x

    reveal_type(y)  # revealed: Literal[0] | None
```

## Class patterns

```py
def get_object() -> object:
    return object()

class A: ...
class B: ...

x = get_object()

reveal_type(x)  # revealed: object

match x:
    case A():
        reveal_type(x)  # revealed: A
    case B():
        # TODO could be `B & ~A`
        reveal_type(x)  # revealed: B

reveal_type(x)  # revealed: object
```

## Class pattern with guard

```py
def get_object() -> object:
    return object()

class A:
    def y() -> int:
        return 1

class B: ...

x = get_object()

reveal_type(x)  # revealed: object

match x:
    case A() if reveal_type(x):  # revealed: A
        pass
    case B() if reveal_type(x):  # revealed: B
        pass

reveal_type(x)  # revealed: object
```
@@ -1,25 +0,0 @@

# Protocols

We do not support protocols yet, but to avoid false positives, we *partially* support some known
protocols.

## `typing.SupportsIndex`

```py
from typing import SupportsIndex, Literal

def _(some_int: int, some_literal_int: Literal[1], some_indexable: SupportsIndex):
    a: SupportsIndex = some_int
    b: SupportsIndex = some_literal_int
    c: SupportsIndex = some_indexable
```

## Invalid

```py
from typing import Protocol

# error: [invalid-type-form] "`typing.Protocol` is not allowed in type expressions"
def f(x: Protocol) -> None:
    reveal_type(x)  # revealed: Unknown
```
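`SupportsIndex` is satisfied by any object with an `__index__` method, which is why both plain and literal ints are assignable to it above. A runnable sketch with an invented `PageNumber` class:

```python
from typing import SupportsIndex

class PageNumber:
    def __init__(self, n: int) -> None:
        self.n = n

    def __index__(self) -> int:
        return self.n

def nth(items: list[str], i: SupportsIndex) -> str:
    # Sequence indexing calls __index__ on non-int arguments.
    return items[i]

print(nth(["a", "b", "c"], 1))            # b
print(nth(["a", "b", "c"], PageNumber(2)))  # c
```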
@@ -1,28 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: basic.md - Structures - Unresolvable module import
mdtest path: crates/red_knot_python_semantic/resources/mdtest/import/basic.md
---

# Python source files

## mdtest_snippet.py

```
1 | import zqzqzqzqzqzqzq  # error: [unresolved-import] "Cannot resolve import `zqzqzqzqzqzqzq`"
```

# Diagnostics

```
error: lint:unresolved-import
 --> /src/mdtest_snippet.py:1:8
  |
1 | import zqzqzqzqzqzqzq  # error: [unresolved-import] "Cannot resolve import `zqzqzqzqzqzqzq`"
  |        ^^^^^^^^^^^^^^ Cannot resolve import `zqzqzqzqzqzqzq`
  |

```
@@ -1,32 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: for.md - For loops - Invalid iterable
mdtest path: crates/red_knot_python_semantic/resources/mdtest/loops/for.md
---

# Python source files

## mdtest_snippet.py

```
1 | nonsense = 123
2 | for x in nonsense:  # error: [not-iterable]
3 |     pass
```

# Diagnostics

```
error: lint:not-iterable
 --> /src/mdtest_snippet.py:2:10
  |
1 | nonsense = 123
2 | for x in nonsense:  # error: [not-iterable]
  |          ^^^^^^^^ Object of type `Literal[123]` is not iterable because it doesn't have an `__iter__` method or a `__getitem__` method
3 |     pass
  |

```
@@ -1,40 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: invalid_argument_type.md - Invalid argument type diagnostics - Basic
mdtest path: crates/red_knot_python_semantic/resources/mdtest/diagnostics/invalid_argument_type.md
---

# Python source files

## mdtest_snippet.py

```
1 | def foo(x: int) -> int:
2 |     return x * x
3 |
4 | foo("hello")  # error: [invalid-argument-type]
```

# Diagnostics

```
error: lint:invalid-argument-type
 --> /src/mdtest_snippet.py:4:5
  |
2 |     return x * x
3 |
4 | foo("hello")  # error: [invalid-argument-type]
  |     ^^^^^^^ Object of type `Literal["hello"]` cannot be assigned to parameter 1 (`x`) of function `foo`; expected type `int`
  |
info
 --> /src/mdtest_snippet.py:1:9
  |
1 | def foo(x: int) -> int:
  |         ------ parameter declared in function definition here
2 |     return x * x
  |

```
@@ -1,81 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: invalid_argument_type.md - Invalid argument type diagnostics - Many parameters with multiple invalid arguments
mdtest path: crates/red_knot_python_semantic/resources/mdtest/diagnostics/invalid_argument_type.md
---

# Python source files

## mdtest_snippet.py

```
1 | def foo(x: int, y: int, z: int) -> int:
2 |     return x * y * z
3 |
4 | # error: [invalid-argument-type]
5 | # error: [invalid-argument-type]
6 | # error: [invalid-argument-type]
7 | foo("a", "b", "c")
```

# Diagnostics

```
error: lint:invalid-argument-type
 --> /src/mdtest_snippet.py:7:5
  |
5 | # error: [invalid-argument-type]
6 | # error: [invalid-argument-type]
7 | foo("a", "b", "c")
  |     ^^^ Object of type `Literal["a"]` cannot be assigned to parameter 1 (`x`) of function `foo`; expected type `int`
  |
info
 --> /src/mdtest_snippet.py:1:9
  |
1 | def foo(x: int, y: int, z: int) -> int:
  |         ------ parameter declared in function definition here
2 |     return x * y * z
  |

```

```
error: lint:invalid-argument-type
 --> /src/mdtest_snippet.py:7:10
  |
5 | # error: [invalid-argument-type]
6 | # error: [invalid-argument-type]
7 | foo("a", "b", "c")
  |          ^^^ Object of type `Literal["b"]` cannot be assigned to parameter 2 (`y`) of function `foo`; expected type `int`
  |
info
 --> /src/mdtest_snippet.py:1:17
  |
1 | def foo(x: int, y: int, z: int) -> int:
  |                 ------ parameter declared in function definition here
2 |     return x * y * z
  |

```

```
error: lint:invalid-argument-type
 --> /src/mdtest_snippet.py:7:15
  |
5 | # error: [invalid-argument-type]
6 | # error: [invalid-argument-type]
7 | foo("a", "b", "c")
  |               ^^^ Object of type `Literal["c"]` cannot be assigned to parameter 3 (`z`) of function `foo`; expected type `int`
  |
info
 --> /src/mdtest_snippet.py:1:25
  |
1 | def foo(x: int, y: int, z: int) -> int:
  |                         ------ parameter declared in function definition here
2 |     return x * y * z
  |

```
@@ -1,95 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: return_type.md - Function return type - Invalid return type
mdtest path: crates/red_knot_python_semantic/resources/mdtest/function/return_type.md
---

# Python source files

## mdtest_snippet.py

```
 1 | # error: [invalid-return-type]
 2 | def f() -> int:
 3 |     1
 4 |
 5 | def f() -> str:
 6 |     # error: [invalid-return-type]
 7 |     return 1
 8 |
 9 | def f() -> int:
10 |     # error: [invalid-return-type]
11 |     return
12 |
13 | from typing import TypeVar
14 |
15 | T = TypeVar("T")
16 |
17 | # TODO: `invalid-return-type` error should be emitted
18 | def m(x: T) -> T: ...
```

# Diagnostics

```
error: lint:invalid-return-type
 --> /src/mdtest_snippet.py:2:12
  |
1 | # error: [invalid-return-type]
2 | def f() -> int:
  |            ^^^ Function can implicitly return `None`, which is not assignable to return type `int`
3 |     1
  |

```

```
error: lint:invalid-return-type
 --> /src/mdtest_snippet.py:7:12
  |
5 | def f() -> str:
6 |     # error: [invalid-return-type]
7 |     return 1
  |            ^ Object of type `Literal[1]` is not assignable to return type `str`
8 |
9 | def f() -> int:
  |
info
 --> /src/mdtest_snippet.py:5:12
  |
3 |     1
4 |
5 | def f() -> str:
  |            --- Return type is declared here as `str`
6 |     # error: [invalid-return-type]
7 |     return 1
  |

```

```
error: lint:invalid-return-type
  --> /src/mdtest_snippet.py:11:5
   |
 9 | def f() -> int:
10 |     # error: [invalid-return-type]
11 |     return
   |     ^^^^^^ Object of type `None` is not assignable to return type `int`
12 |
13 | from typing import TypeVar
   |
info
  --> /src/mdtest_snippet.py:9:12
   |
 7 |     return 1
 8 |
 9 | def f() -> int:
   |            --- Return type is declared here as `int`
10 |     # error: [invalid-return-type]
11 |     return
   |

```
@@ -1,37 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: tuples.md - Comparison: Tuples - Equality with elements that incorrectly implement `__bool__`
mdtest path: crates/red_knot_python_semantic/resources/mdtest/comparison/tuples.md
---

# Python source files

## mdtest_snippet.py

```
1 | class A:
2 |     def __eq__(self, other) -> NotBoolable:
3 |         return NotBoolable()
4 |
5 | class NotBoolable:
6 |     __bool__: None = None
7 |
8 | # error: [unsupported-bool-conversion]
9 | (A(),) == (A(),)
```

# Diagnostics

```
error: lint:unsupported-bool-conversion
 --> /src/mdtest_snippet.py:9:1
  |
8 | # error: [unsupported-bool-conversion]
9 | (A(),) == (A(),)
  | ^^^^^^^^^^^^^^^^ Boolean conversion is unsupported for type `NotBoolable`; its `__bool__` method isn't callable
  |

```
@@ -1,28 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: unpacking.md - Unpacking - Right hand side not iterable
mdtest path: crates/red_knot_python_semantic/resources/mdtest/diagnostics/unpacking.md
---

# Python source files

## mdtest_snippet.py

```
1 | a, b = 1  # error: [not-iterable]
```

# Diagnostics

```
error: lint:not-iterable
 --> /src/mdtest_snippet.py:1:8
  |
1 | a, b = 1  # error: [not-iterable]
  |        ^ Object of type `Literal[1]` is not iterable because it doesn't have an `__iter__` method or a `__getitem__` method
  |

```
@@ -1,28 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: unpacking.md - Unpacking - Too few values to unpack
mdtest path: crates/red_knot_python_semantic/resources/mdtest/diagnostics/unpacking.md
---

# Python source files

## mdtest_snippet.py

```
1 | a, b = (1,)  # error: [invalid-assignment]
```

# Diagnostics

```
error: lint:invalid-assignment
 --> /src/mdtest_snippet.py:1:1
  |
1 | a, b = (1,)  # error: [invalid-assignment]
  | ^^^^ Not enough values to unpack (expected 2, got 1)
  |

```
@@ -1,28 +0,0 @@
---
source: crates/red_knot_test/src/lib.rs
expression: snapshot
---
---
mdtest name: unpacking.md - Unpacking - Too many values to unpack
mdtest path: crates/red_knot_python_semantic/resources/mdtest/diagnostics/unpacking.md
---

# Python source files

## mdtest_snippet.py

```
1 | a, b = (1, 2, 3)  # error: [invalid-assignment]
```

# Diagnostics

```
error: lint:invalid-assignment
 --> /src/mdtest_snippet.py:1:1
  |
1 | a, b = (1, 2, 3)  # error: [invalid-assignment]
  | ^^^^ Too many values to unpack (expected 2, got 3)
  |

```
@@ -1,191 +0,0 @@

# Suppressing errors with `knot: ignore`

Type check errors can be suppressed by a `knot: ignore` comment on the same line as the violation.

## Simple `knot: ignore`

```py
a = 4 + test  # knot: ignore
```

## Suppressing a specific code

```py
a = 4 + test  # knot: ignore[unresolved-reference]
```

## Unused suppression

```py
test = 10
# error: [unused-ignore-comment] "Unused `knot: ignore` directive: 'possibly-unresolved-reference'"
a = test + 3  # knot: ignore[possibly-unresolved-reference]
```

## Unused suppression if the error codes don't match

```py
# error: [unresolved-reference]
# error: [unused-ignore-comment] "Unused `knot: ignore` directive: 'possibly-unresolved-reference'"
a = test + 3  # knot: ignore[possibly-unresolved-reference]
```

## Suppressed unused comment

```py
# error: [unused-ignore-comment]
a = 10 / 2  # knot: ignore[division-by-zero]
a = 10 / 2  # knot: ignore[division-by-zero, unused-ignore-comment]
a = 10 / 2  # knot: ignore[unused-ignore-comment, division-by-zero]
a = 10 / 2  # knot: ignore[unused-ignore-comment]  # type: ignore
a = 10 / 2  # type: ignore  # knot: ignore[unused-ignore-comment]
```

## Unused ignore comment

```py
# error: [unused-ignore-comment] "Unused `knot: ignore` directive: 'unused-ignore-comment'"
a = 10 / 0  # knot: ignore[division-by-zero, unused-ignore-comment]
```

## Multiple unused comments

Today, Red Knot emits a diagnostic for every unused code. We might want to group the codes by
comment at some point in the future.

```py
# error: [unused-ignore-comment] "Unused `knot: ignore` directive: 'division-by-zero'"
# error: [unused-ignore-comment] "Unused `knot: ignore` directive: 'unresolved-reference'"
a = 10 / 2  # knot: ignore[division-by-zero, unresolved-reference]

# error: [unused-ignore-comment] "Unused `knot: ignore` directive: 'invalid-assignment'"
# error: [unused-ignore-comment] "Unused `knot: ignore` directive: 'unresolved-reference'"
a = 10 / 0  # knot: ignore[invalid-assignment, division-by-zero, unresolved-reference]
```

## Multiple suppressions

```py
# fmt: off
def test(a: f"f-string type annotation", b: b"byte-string-type-annotation"): ...  # knot: ignore[fstring-type-annotation, byte-string-type-annotation]
```

## Can't suppress syntax errors

<!-- blacken-docs:off -->

```py
# error: [invalid-syntax]
# error: [unused-ignore-comment]
def test($):  # knot: ignore
    pass
```

<!-- blacken-docs:on -->

## Can't suppress `revealed-type` diagnostics

```py
a = 10
# revealed: Literal[10]
# error: [unknown-rule] "Unknown rule `revealed-type`"
reveal_type(a)  # knot: ignore[revealed-type]
```

## Extra whitespace in type ignore comments is allowed

```py
a = 10 / 0  # knot : ignore
a = 10 / 0  # knot: ignore [ division-by-zero ]
```

## Whitespace is optional

```py
# fmt: off
a = 10 / 0  #knot:ignore[division-by-zero]
```

## Trailing codes comma

Trailing commas in the codes section are allowed:

```py
a = 10 / 0  # knot: ignore[division-by-zero,]
```

## Invalid characters in codes

```py
# error: [division-by-zero]
# error: [invalid-ignore-comment] "Invalid `knot: ignore` comment: expected a alphanumeric character or `-` or `_` as code"
a = 10 / 0  # knot: ignore[*-*]
```

## Trailing whitespace

<!-- blacken-docs:off -->

```py
a = 10 / 0  # knot: ignore[division-by-zero]      
#                                           ^^^^^^ trailing whitespace
```

<!-- blacken-docs:on -->

## Missing comma

A missing comma results in an invalid suppression comment. We may want to recover from this in the
future.

```py
# error: [unresolved-reference]
# error: [invalid-ignore-comment] "Invalid `knot: ignore` comment: expected a comma separating the rule codes"
a = x / 0  # knot: ignore[division-by-zero unresolved-reference]
```

## Missing closing bracket

```py
# error: [unresolved-reference] "Name `x` used when not defined"
# error: [invalid-ignore-comment] "Invalid `knot: ignore` comment: expected a comma separating the rule codes"
a = x / 2  # knot: ignore[unresolved-reference
```

## Empty codes

An empty codes array suppresses no diagnostics and is always useless:

```py
# error: [division-by-zero]
# error: [unused-ignore-comment] "Unused `knot: ignore` without a code"
a = 4 / 0  # knot: ignore[]
```

## File-level suppression comments

File-level suppression comments are currently intentionally unsupported because we've yet to decide
if they should use a different syntax that also supports enabling rules or changing the rule's
severity: `knot: possibly-undefined-reference=error`

```py
# error: [unused-ignore-comment]
# knot: ignore[division-by-zero]

a = 4 / 0  # error: [division-by-zero]
```

## Unknown rule

```py
# error: [unknown-rule] "Unknown rule `is-equal-14`"
a = 10 + 4  # knot: ignore[is-equal-14]
```

## Code with `lint:` prefix

```py
# error: [unknown-rule] "Unknown rule `lint:division-by-zero`. Did you mean `division-by-zero`?"
# error: [division-by-zero]
a = 10 / 0  # knot: ignore[lint:division-by-zero]
```
@@ -1,97 +0,0 @@

# Singleton types

A type is a singleton type iff it has exactly one inhabitant.

## Basic

```py
from typing_extensions import Literal, Never, Callable
from knot_extensions import is_singleton, static_assert

static_assert(is_singleton(None))
static_assert(is_singleton(Literal[True]))
static_assert(is_singleton(Literal[False]))

static_assert(is_singleton(type[bool]))

static_assert(not is_singleton(Never))
static_assert(not is_singleton(str))

static_assert(not is_singleton(Literal[345]))
static_assert(not is_singleton(Literal[1, 2]))

static_assert(not is_singleton(tuple[()]))
static_assert(not is_singleton(tuple[None]))
static_assert(not is_singleton(tuple[None, Literal[True]]))

static_assert(not is_singleton(Callable[..., None]))
static_assert(not is_singleton(Callable[[int, str], None]))
```

## `NoDefault`

### 3.12

```toml
[environment]
python-version = "3.12"
```

```py
from typing_extensions import _NoDefaultType
from knot_extensions import is_singleton, static_assert

static_assert(is_singleton(_NoDefaultType))
```

### 3.13

```toml
[environment]
python-version = "3.13"
```

```py
from typing import _NoDefaultType
from knot_extensions import is_singleton, static_assert

static_assert(is_singleton(_NoDefaultType))
```

## `builtins.ellipsis`/`types.EllipsisType`

### All Python versions

The type of the builtin symbol `Ellipsis` is the same as the type of an ellipsis literal (`...`).
The type is not actually exposed from the standard library on Python \<3.10, but we still recognise
the type as a singleton on any Python version.

```toml
[environment]
python-version = "3.9"
```

```py
import sys
from knot_extensions import is_singleton, static_assert

static_assert(is_singleton(Ellipsis.__class__))
static_assert(is_singleton((...).__class__))
```

### Python 3.10+

On Python 3.10+, the standard library exposes the type of `...` as `types.EllipsisType`, and we also
recognise this as a singleton type when it is referenced directly:

```toml
[environment]
python-version = "3.10"
```

```py
import types
from knot_extensions import static_assert, is_singleton

static_assert(is_singleton(types.EllipsisType))
```
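The runtime intuition behind `is_singleton`: every expression of a singleton type evaluates to the same object, so identity comparisons on it are sound. A small runnable illustration:

```python
# `None` and `...` each have exactly one inhabitant, so `is` checks are reliable.
assert (...) is Ellipsis
assert type(None)() is None  # NoneType always hands back the same object

# `bool` has two inhabitants (True and False), so bool itself is not a
# singleton type — but each of Literal[True] and Literal[False] is.
assert True is not False
print("identity checks passed")
```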
@@ -1,37 +0,0 @@
use crate::{
    semantic_index::{ast_ids::ScopedExpressionId, expression::Expression},
    unpack::Unpack,
};

use ruff_python_ast::name::Name;

use rustc_hash::FxHashMap;

/// Describes an (annotated) attribute assignment that we discovered in a method
/// body, typically of the form `self.x: int`, `self.x: int = …` or `self.x = …`.
#[derive(Debug, Clone, PartialEq, Eq, salsa::Update)]
pub(crate) enum AttributeAssignment<'db> {
    /// An attribute assignment with an explicit type annotation, either
    /// `self.x: <annotation>` or `self.x: <annotation> = …`.
    Annotated { annotation: Expression<'db> },

    /// An attribute assignment without a type annotation, e.g. `self.x = <value>`.
    Unannotated { value: Expression<'db> },

    /// An attribute assignment where the right-hand side is an iterable, for example
    /// `for self.x in <iterable>`.
    Iterable { iterable: Expression<'db> },

    /// An attribute assignment where the expression to be assigned is a context manager, for example
    /// `with <context_manager> as self.x`.
    ContextManager { context_manager: Expression<'db> },

    /// An attribute assignment where the left-hand side is an unpacking expression,
    /// e.g. `self.x, self.y = <value>`.
    Unpack {
        attribute_expression_id: ScopedExpressionId,
        unpack: Unpack<'db>,
    },
}

pub(crate) type AttributeAssignments<'db> = FxHashMap<Name, Vec<AttributeAssignment<'db>>>;
@@ -1,84 +0,0 @@
|
||||
//! _Predicates_ are Python expressions whose runtime values can affect type inference.
|
||||
//!
|
||||
//! We currently use predicates in two places:
|
||||
//!
|
||||
//! - [_Narrowing constraints_][crate::semantic_index::narrowing_constraints] constrain the type of
//!   a binding that is visible at a particular use.
//! - [_Visibility constraints_][crate::semantic_index::visibility_constraints] determine the
//!   static visibility of a binding, and the reachability of a statement.

use ruff_db::files::File;
use ruff_index::{newtype_index, IndexVec};
use ruff_python_ast::Singleton;

use crate::db::Db;
use crate::semantic_index::expression::Expression;
use crate::semantic_index::symbol::{FileScopeId, ScopeId};

/// A scoped identifier for each `Predicate` in a scope.
#[newtype_index]
#[derive(Ord, PartialOrd)]
pub(crate) struct ScopedPredicateId;

/// A collection of predicates for a given scope.
pub(crate) type Predicates<'db> = IndexVec<ScopedPredicateId, Predicate<'db>>;

#[derive(Debug, Default)]
pub(crate) struct PredicatesBuilder<'db> {
    predicates: IndexVec<ScopedPredicateId, Predicate<'db>>,
}

impl<'db> PredicatesBuilder<'db> {
    /// Adds a predicate. Note that we do not deduplicate predicates. If you add a `Predicate`
    /// more than once, you will get distinct `ScopedPredicateId`s for each one. (This lets you
    /// model predicates that might evaluate to different values at different points of execution.)
    pub(crate) fn add_predicate(&mut self, predicate: Predicate<'db>) -> ScopedPredicateId {
        self.predicates.push(predicate)
    }

    pub(crate) fn build(mut self) -> Predicates<'db> {
        self.predicates.shrink_to_fit();
        self.predicates
    }
}

#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq, salsa::Update)]
pub(crate) struct Predicate<'db> {
    pub(crate) node: PredicateNode<'db>,
    pub(crate) is_positive: bool,
}

#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq, salsa::Update)]
pub(crate) enum PredicateNode<'db> {
    Expression(Expression<'db>),
    Pattern(PatternPredicate<'db>),
}

/// Pattern kinds for which we support type narrowing and/or static visibility analysis.
#[derive(Debug, Clone, Hash, PartialEq, salsa::Update)]
pub(crate) enum PatternPredicateKind<'db> {
    Singleton(Singleton, Option<Expression<'db>>),
    Value(Expression<'db>, Option<Expression<'db>>),
    Class(Expression<'db>, Option<Expression<'db>>),
    Unsupported,
}

#[salsa::tracked(debug)]
pub(crate) struct PatternPredicate<'db> {
    pub(crate) file: File,

    pub(crate) file_scope: FileScopeId,

    pub(crate) subject: Expression<'db>,

    #[return_ref]
    pub(crate) kind: PatternPredicateKind<'db>,

    count: countme::Count<PatternPredicate<'static>>,
}

impl<'db> PatternPredicate<'db> {
    pub(crate) fn scope(self, db: &'db dyn Db) -> ScopeId<'db> {
        self.file_scope(db).to_scope_id(db, self.file(db))
    }
}
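The `add_predicate` doc comment above stresses that predicates are deliberately not deduplicated: pushing the same predicate twice yields two distinct ids. A minimal standalone sketch of that behavior, using a plain `Vec` and `usize` ids in place of `IndexVec` and `ScopedPredicateId` (all names here are illustrative, not the real API):

```rust
// Simplified analogue of `PredicatesBuilder`: each push returns the index of
// the entry just added, so the same predicate added twice gets two ids and
// can later be assigned a different truthiness at each point of execution.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Predicate {
    is_positive: bool,
}

#[derive(Default)]
struct Builder {
    predicates: Vec<Predicate>,
}

impl Builder {
    fn add_predicate(&mut self, predicate: Predicate) -> usize {
        self.predicates.push(predicate);
        self.predicates.len() - 1 // id of the entry we just pushed
    }
}

fn main() {
    let mut builder = Builder::default();
    let p = Predicate { is_positive: true };
    let first = builder.add_predicate(p);
    let second = builder.add_predicate(p); // same predicate, distinct id
    assert_ne!(first, second);
}
```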
File diff suppressed because it is too large
@@ -1,88 +0,0 @@
use super::Type;

/// Typed arguments for a single call, in source order.
#[derive(Clone, Debug, Default)]
pub(crate) struct CallArguments<'a, 'db>(Vec<Argument<'a, 'db>>);

impl<'a, 'db> CallArguments<'a, 'db> {
    /// Create a [`CallArguments`] with no arguments.
    pub(crate) fn none() -> Self {
        Self(Vec::new())
    }

    /// Create a [`CallArguments`] from an iterator over non-variadic positional argument types.
    pub(crate) fn positional(positional_tys: impl IntoIterator<Item = Type<'db>>) -> Self {
        positional_tys
            .into_iter()
            .map(Argument::Positional)
            .collect()
    }

    /// Prepend an extra positional argument.
    pub(crate) fn with_self(&self, self_ty: Type<'db>) -> Self {
        let mut arguments = Vec::with_capacity(self.0.len() + 1);
        arguments.push(Argument::Synthetic(self_ty));
        arguments.extend_from_slice(&self.0);
        Self(arguments)
    }

    pub(crate) fn iter(&self) -> impl Iterator<Item = &Argument<'a, 'db>> {
        self.0.iter()
    }

    // TODO: this should be eliminated in favor of [`bind_call`]
    pub(crate) fn first_argument(&self) -> Option<Type<'db>> {
        self.0.first().map(Argument::ty)
    }

    // TODO: this should be eliminated in favor of [`bind_call`]
    pub(crate) fn second_argument(&self) -> Option<Type<'db>> {
        self.0.get(1).map(Argument::ty)
    }

    // TODO: this should be eliminated in favor of [`bind_call`]
    pub(crate) fn third_argument(&self) -> Option<Type<'db>> {
        self.0.get(2).map(Argument::ty)
    }
}

impl<'db, 'a, 'b> IntoIterator for &'b CallArguments<'a, 'db> {
    type Item = &'b Argument<'a, 'db>;
    type IntoIter = std::slice::Iter<'b, Argument<'a, 'db>>;

    fn into_iter(self) -> Self::IntoIter {
        self.0.iter()
    }
}

impl<'a, 'db> FromIterator<Argument<'a, 'db>> for CallArguments<'a, 'db> {
    fn from_iter<T: IntoIterator<Item = Argument<'a, 'db>>>(iter: T) -> Self {
        Self(iter.into_iter().collect())
    }
}

#[derive(Clone, Debug)]
pub(crate) enum Argument<'a, 'db> {
    /// The synthetic `self` or `cls` argument, which doesn't appear explicitly at the call site.
    Synthetic(Type<'db>),
    /// A positional argument.
    Positional(Type<'db>),
    /// A starred positional argument (e.g. `*args`).
    Variadic(Type<'db>),
    /// A keyword argument (e.g. `a=1`).
    Keyword { name: &'a str, ty: Type<'db> },
    /// The double-starred keyword arguments (e.g. `**kwargs`).
    Keywords(Type<'db>),
}

impl<'db> Argument<'_, 'db> {
    fn ty(&self) -> Type<'db> {
        match self {
            Self::Synthetic(ty) => *ty,
            Self::Positional(ty) => *ty,
            Self::Variadic(ty) => *ty,
            Self::Keyword { name: _, ty } => *ty,
            Self::Keywords(ty) => *ty,
        }
    }
}
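`with_self` above prepends a synthetic `self`/`cls` receiver when a bound method is called, shifting every explicit positional argument right by one. A standalone sketch of that shifting, with `Type` reduced to a `String` (illustrative names, not the real API):

```rust
// Simplified analogue of `CallArguments::with_self`: the synthetic receiver
// is prepended into a fresh argument list; the original list is untouched.
#[derive(Clone, Debug, PartialEq)]
enum Argument {
    Synthetic(String),
    Positional(String),
}

#[derive(Clone, Debug, Default)]
struct CallArguments(Vec<Argument>);

impl CallArguments {
    fn with_self(&self, self_ty: &str) -> Self {
        let mut arguments = Vec::with_capacity(self.0.len() + 1);
        arguments.push(Argument::Synthetic(self_ty.to_string()));
        arguments.extend(self.0.iter().cloned());
        Self(arguments)
    }
}

fn main() {
    let args = CallArguments(vec![Argument::Positional("int".into())]);
    let bound = args.with_self("MyClass");
    // The receiver comes first; the original positional argument follows it.
    assert_eq!(bound.0.len(), 2);
    assert!(matches!(bound.0[0], Argument::Synthetic(_)));
    // `with_self` builds a new list; the original is unchanged.
    assert_eq!(args.0.len(), 1);
}
```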
@@ -1,793 +0,0 @@
//! When analyzing a call site, we create _bindings_, which match and type-check the actual
//! arguments against the parameters of the callable. As with
//! [signatures][crate::types::signatures], we have to handle the fact that the callable might be a
//! union of types, each of which might contain multiple overloads.

use std::borrow::Cow;

use smallvec::SmallVec;

use super::{
    Argument, CallArguments, CallError, CallErrorKind, CallableSignature, InferContext, Signature,
    Signatures, Type,
};
use crate::db::Db;
use crate::types::diagnostic::{
    CALL_NON_CALLABLE, INVALID_ARGUMENT_TYPE, MISSING_ARGUMENT, NO_MATCHING_OVERLOAD,
    PARAMETER_ALREADY_ASSIGNED, TOO_MANY_POSITIONAL_ARGUMENTS, UNKNOWN_ARGUMENT,
};
use crate::types::signatures::Parameter;
use crate::types::{CallableType, UnionType};
use ruff_db::diagnostic::{OldSecondaryDiagnosticMessage, Span};
use ruff_python_ast as ast;
use ruff_text_size::Ranged;

/// Binding information for a possible union of callables. At a call site, the arguments must be
/// compatible with _all_ of the types in the union for the call to be valid.
///
/// It's guaranteed that the wrapped bindings have no errors.
#[derive(Debug, Clone, PartialEq, Eq)]
pub(crate) struct Bindings<'db> {
    pub(crate) callable_type: Type<'db>,
    /// By using `SmallVec`, we avoid an extra heap allocation for the common case of a non-union
    /// type.
    elements: SmallVec<[CallableBinding<'db>; 1]>,
}

impl<'db> Bindings<'db> {
    /// Binds the arguments of a call site against a signature.
    ///
    /// The returned bindings provide the return type of the call, the bound types for all
    /// parameters, and any errors resulting from binding the call, all for each union element and
    /// overload (if any).
    pub(crate) fn bind(
        db: &'db dyn Db,
        signatures: &Signatures<'db>,
        arguments: &CallArguments<'_, 'db>,
    ) -> Result<Self, CallError<'db>> {
        let elements: SmallVec<[CallableBinding<'db>; 1]> = signatures
            .into_iter()
            .map(|signature| CallableBinding::bind(db, signature, arguments))
            .collect();

        // In order of precedence:
        //
        // - If every union element is Ok, then the union is too.
        // - If any element has a BindingError, the union has a BindingError.
        // - If every element is NotCallable, then the union is also NotCallable.
        // - Otherwise, the elements are some mixture of Ok, NotCallable, and PossiblyNotCallable.
        //   The union as a whole is PossiblyNotCallable.
        //
        // For example, the union type `Callable[[int], int] | None` may not be callable at all,
        // because the `None` element in this union has no `__call__` method.
        //
        // On the other hand, the union type `Callable[[int], int] | Callable[[str], str]` is
        // always *callable*, but it would produce a `BindingError` if an inhabitant of this type
        // was called with a single `int` argument passed in. That's because the second element in
        // the union doesn't accept an `int` when it's called: it only accepts a `str`.
        let mut all_ok = true;
        let mut any_binding_error = false;
        let mut all_not_callable = true;
        for binding in &elements {
            let result = binding.as_result();
            all_ok &= result.is_ok();
            any_binding_error |= matches!(result, Err(CallErrorKind::BindingError));
            all_not_callable &= matches!(result, Err(CallErrorKind::NotCallable));
        }

        let bindings = Bindings {
            callable_type: signatures.callable_type,
            elements,
        };

        if all_ok {
            Ok(bindings)
        } else if any_binding_error {
            Err(CallError(CallErrorKind::BindingError, Box::new(bindings)))
        } else if all_not_callable {
            Err(CallError(CallErrorKind::NotCallable, Box::new(bindings)))
        } else {
            Err(CallError(
                CallErrorKind::PossiblyNotCallable,
                Box::new(bindings),
            ))
        }
    }

    pub(crate) fn is_single(&self) -> bool {
        self.elements.len() == 1
    }

    /// Returns the return type of the call. For successful calls, this is the actual return type.
    /// For calls with binding errors, this is a type that best approximates the return type. For
    /// types that are not callable, returns `Type::unknown`.
    pub(crate) fn return_type(&self, db: &'db dyn Db) -> Type<'db> {
        if let [binding] = self.elements.as_slice() {
            return binding.return_type();
        }
        UnionType::from_elements(db, self.into_iter().map(CallableBinding::return_type))
    }

    /// Report diagnostics for all of the errors that occurred when trying to match actual
    /// arguments to formal parameters. If the callable is a union, or has multiple overloads, we
    /// report a single diagnostic if we couldn't match any union element or overload.
    ///
    /// TODO: Update this to add subdiagnostics about how we failed to match each union element
    /// and overload.
    pub(crate) fn report_diagnostics(&self, context: &InferContext<'db>, node: ast::AnyNodeRef) {
        // If all union elements are not callable, report that the union as a whole is not
        // callable.
        if self.into_iter().all(|b| !b.is_callable()) {
            context.report_lint(
                &CALL_NON_CALLABLE,
                node,
                format_args!(
                    "Object of type `{}` is not callable",
                    self.callable_type.display(context.db())
                ),
            );
            return;
        }

        // TODO: We currently only report errors for the first union element. Ideally, we'd report
        // an error saying that the union type can't be called, followed by subdiagnostics
        // explaining why.
        if let Some(first) = self.into_iter().find(|b| b.as_result().is_err()) {
            first.report_diagnostics(context, node);
        }
    }
}

impl<'a, 'db> IntoIterator for &'a Bindings<'db> {
    type Item = &'a CallableBinding<'db>;
    type IntoIter = std::slice::Iter<'a, CallableBinding<'db>>;

    fn into_iter(self) -> Self::IntoIter {
        self.elements.iter()
    }
}
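The precedence rules in the comment inside `Bindings::bind` (every element Ok; else any `BindingError`; else all `NotCallable`; else `PossiblyNotCallable`) can be exercised in isolation. A standalone sketch over a plain result enum (illustrative, not the real `CallErrorKind` types):

```rust
// Simplified analogue of the union-result folding in `Bindings::bind`.
#[derive(Clone, Copy, Debug, PartialEq)]
enum ElementResult {
    Ok,
    NotCallable,
    BindingError,
    PossiblyNotCallable,
}

fn union_result(elements: &[ElementResult]) -> ElementResult {
    let mut all_ok = true;
    let mut any_binding_error = false;
    let mut all_not_callable = true;
    for result in elements {
        all_ok &= *result == ElementResult::Ok;
        any_binding_error |= *result == ElementResult::BindingError;
        all_not_callable &= *result == ElementResult::NotCallable;
    }
    // Precedence mirrors the if/else chain in `Bindings::bind`.
    if all_ok {
        ElementResult::Ok
    } else if any_binding_error {
        ElementResult::BindingError
    } else if all_not_callable {
        ElementResult::NotCallable
    } else {
        ElementResult::PossiblyNotCallable
    }
}

fn main() {
    use ElementResult as R;
    // Like `Callable[[int], int] | None`: one element binds, one has no
    // `__call__` at all, so the union is only possibly callable.
    assert_eq!(union_result(&[R::Ok, R::NotCallable]), R::PossiblyNotCallable);
    // Any element with a binding error makes the whole union a binding error.
    assert_eq!(union_result(&[R::Ok, R::BindingError]), R::BindingError);
    assert_eq!(union_result(&[R::NotCallable, R::NotCallable]), R::NotCallable);
    assert_eq!(union_result(&[R::Ok, R::Ok]), R::Ok);
}
```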

impl<'a, 'db> IntoIterator for &'a mut Bindings<'db> {
    type Item = &'a mut CallableBinding<'db>;
    type IntoIter = std::slice::IterMut<'a, CallableBinding<'db>>;

    fn into_iter(self) -> Self::IntoIter {
        self.elements.iter_mut()
    }
}

/// Binding information for a single callable. If the callable is overloaded, there is a separate
/// [`Binding`] for each overload.
///
/// For a successful binding, each argument is mapped to one of the callable's formal parameters.
/// If the callable has multiple overloads, the first one that matches is used as the overall
/// binding match.
///
/// TODO: Implement the call site evaluation algorithm in the [proposed updated typing
/// spec][overloads], which is much more subtle than “first match wins”.
///
/// If the arguments cannot be matched to formal parameters, we store information about the
/// specific errors that occurred when trying to match them up. If the callable has multiple
/// overloads, we store this error information for each overload.
///
/// [overloads]: https://github.com/python/typing/pull/1839
#[derive(Debug, Clone, PartialEq, Eq)]
pub(crate) struct CallableBinding<'db> {
    pub(crate) callable_type: Type<'db>,
    pub(crate) signature_type: Type<'db>,
    pub(crate) dunder_call_is_possibly_unbound: bool,

    /// The bindings of each overload of this callable. Will be empty if the type is not callable.
    ///
    /// By using `SmallVec`, we avoid an extra heap allocation for the common case of a
    /// non-overloaded callable.
    overloads: SmallVec<[Binding<'db>; 1]>,
}

impl<'db> CallableBinding<'db> {
    /// Bind a [`CallArguments`] against a [`CallableSignature`].
    ///
    /// The returned [`CallableBinding`] provides the return type of the call, the bound types for
    /// all parameters, and any errors resulting from binding the call.
    fn bind(
        db: &'db dyn Db,
        signature: &CallableSignature<'db>,
        arguments: &CallArguments<'_, 'db>,
    ) -> Self {
        // If this callable is a bound method, prepend the self instance onto the arguments list
        // before checking.
        let arguments = if let Some(bound_type) = signature.bound_type {
            Cow::Owned(arguments.with_self(bound_type))
        } else {
            Cow::Borrowed(arguments)
        };

        // TODO: This checks every overload. In the proposed more detailed call checking spec [1],
        // arguments are checked for arity first, and are only checked for type assignability
        // against the matching overloads. Make sure to implement that as part of separating call
        // binding into two phases.
        //
        // [1] https://github.com/python/typing/pull/1839
        let overloads = signature
            .into_iter()
            .map(|signature| Binding::bind(db, signature, arguments.as_ref()))
            .collect();
        CallableBinding {
            callable_type: signature.callable_type,
            signature_type: signature.signature_type,
            dunder_call_is_possibly_unbound: signature.dunder_call_is_possibly_unbound,
            overloads,
        }
    }

    fn as_result(&self) -> Result<(), CallErrorKind> {
        if !self.is_callable() {
            return Err(CallErrorKind::NotCallable);
        }

        if self.has_binding_errors() {
            return Err(CallErrorKind::BindingError);
        }

        if self.dunder_call_is_possibly_unbound {
            return Err(CallErrorKind::PossiblyNotCallable);
        }

        Ok(())
    }

    fn is_callable(&self) -> bool {
        !self.overloads.is_empty()
    }

    /// Returns whether there were any errors binding this call site. If the callable has multiple
    /// overloads, they must _all_ have errors.
    pub(crate) fn has_binding_errors(&self) -> bool {
        self.matching_overload().is_none()
    }

    /// Returns the overload that matched for this call binding. Returns `None` if none of the
    /// overloads matched.
    pub(crate) fn matching_overload(&self) -> Option<(usize, &Binding<'db>)> {
        self.overloads
            .iter()
            .enumerate()
            .find(|(_, overload)| overload.as_result().is_ok())
    }

    /// Returns a mutable reference to the overload that matched for this call binding. Returns
    /// `None` if none of the overloads matched.
    pub(crate) fn matching_overload_mut(&mut self) -> Option<(usize, &mut Binding<'db>)> {
        self.overloads
            .iter_mut()
            .enumerate()
            .find(|(_, overload)| overload.as_result().is_ok())
    }

    /// Returns the return type of this call. For a valid call, this is the return type of the
    /// overload that the arguments matched against. For an invalid call to a non-overloaded
    /// function, this is the return type of the function. For an invalid call to an overloaded
    /// function, we return `Type::unknown`, since we cannot make any useful conclusions about
    /// which overload was intended to be called.
    pub(crate) fn return_type(&self) -> Type<'db> {
        if let Some((_, overload)) = self.matching_overload() {
            return overload.return_type();
        }
        if let [overload] = self.overloads.as_slice() {
            return overload.return_type();
        }
        Type::unknown()
    }

    fn report_diagnostics(&self, context: &InferContext<'db>, node: ast::AnyNodeRef) {
        if !self.is_callable() {
            context.report_lint(
                &CALL_NON_CALLABLE,
                node,
                format_args!(
                    "Object of type `{}` is not callable",
                    self.callable_type.display(context.db()),
                ),
            );
            return;
        }

        if self.dunder_call_is_possibly_unbound {
            context.report_lint(
                &CALL_NON_CALLABLE,
                node,
                format_args!(
                    "Object of type `{}` is not callable (possibly unbound `__call__` method)",
                    self.callable_type.display(context.db()),
                ),
            );
            return;
        }

        let callable_description = CallableDescription::new(context.db(), self.callable_type);
        if self.overloads.len() > 1 {
            context.report_lint(
                &NO_MATCHING_OVERLOAD,
                node,
                format_args!(
                    "No overload{} matches arguments",
                    if let Some(CallableDescription { kind, name }) = callable_description {
                        format!(" of {kind} `{name}`")
                    } else {
                        String::new()
                    }
                ),
            );
            return;
        }

        let callable_description = CallableDescription::new(context.db(), self.signature_type);
        for overload in &self.overloads {
            overload.report_diagnostics(
                context,
                node,
                self.signature_type,
                callable_description.as_ref(),
            );
        }
    }
}

/// Binding information for one of the overloads of a callable.
#[derive(Debug, Clone, PartialEq, Eq)]
pub(crate) struct Binding<'db> {
    /// Return type of the call.
    return_ty: Type<'db>,

    /// Bound types for parameters, in parameter source order.
    parameter_tys: Box<[Type<'db>]>,

    /// Call binding errors, if any.
    errors: Vec<BindingError<'db>>,
}

impl<'db> Binding<'db> {
    fn bind(
        db: &'db dyn Db,
        signature: &Signature<'db>,
        arguments: &CallArguments<'_, 'db>,
    ) -> Self {
        let parameters = signature.parameters();
        // The type assigned to each parameter at this call site.
        let mut parameter_tys = vec![None; parameters.len()];
        let mut errors = vec![];
        let mut next_positional = 0;
        let mut first_excess_positional = None;
        let mut num_synthetic_args = 0;
        let get_argument_index = |argument_index: usize, num_synthetic_args: usize| {
            if argument_index >= num_synthetic_args {
                // Adjust the argument index to skip synthetic args, which don't appear at the
                // call site and thus won't be in the Call node's arguments list.
                Some(argument_index - num_synthetic_args)
            } else {
                // We are erroring on a synthetic argument; emit the diagnostic on the entire
                // Call node, since there's no argument node for this argument at the call site.
                None
            }
        };
        for (argument_index, argument) in arguments.iter().enumerate() {
            let (index, parameter, argument_ty, positional) = match argument {
                Argument::Positional(ty) | Argument::Synthetic(ty) => {
                    if matches!(argument, Argument::Synthetic(_)) {
                        num_synthetic_args += 1;
                    }
                    let Some((index, parameter)) = parameters
                        .get_positional(next_positional)
                        .map(|param| (next_positional, param))
                        .or_else(|| parameters.variadic())
                    else {
                        first_excess_positional.get_or_insert(argument_index);
                        next_positional += 1;
                        continue;
                    };
                    next_positional += 1;
                    (index, parameter, ty, !parameter.is_variadic())
                }
                Argument::Keyword { name, ty } => {
                    let Some((index, parameter)) = parameters
                        .keyword_by_name(name)
                        .or_else(|| parameters.keyword_variadic())
                    else {
                        errors.push(BindingError::UnknownArgument {
                            argument_name: ast::name::Name::new(name),
                            argument_index: get_argument_index(argument_index, num_synthetic_args),
                        });
                        continue;
                    };
                    (index, parameter, ty, false)
                }

                Argument::Variadic(_) | Argument::Keywords(_) => {
                    // TODO
                    continue;
                }
            };
            if let Some(expected_ty) = parameter.annotated_type() {
                if !argument_ty.is_assignable_to(db, expected_ty) {
                    errors.push(BindingError::InvalidArgumentType {
                        parameter: ParameterContext::new(parameter, index, positional),
                        argument_index: get_argument_index(argument_index, num_synthetic_args),
                        expected_ty,
                        provided_ty: *argument_ty,
                    });
                }
            }
            if let Some(existing) = parameter_tys[index].replace(*argument_ty) {
                if parameter.is_variadic() || parameter.is_keyword_variadic() {
                    let union = UnionType::from_elements(db, [existing, *argument_ty]);
                    parameter_tys[index].replace(union);
                } else {
                    errors.push(BindingError::ParameterAlreadyAssigned {
                        argument_index: get_argument_index(argument_index, num_synthetic_args),
                        parameter: ParameterContext::new(parameter, index, positional),
                    });
                }
            }
        }
        if let Some(first_excess_argument_index) = first_excess_positional {
            errors.push(BindingError::TooManyPositionalArguments {
                first_excess_argument_index: get_argument_index(
                    first_excess_argument_index,
                    num_synthetic_args,
                ),
                expected_positional_count: parameters.positional().count(),
                provided_positional_count: next_positional,
            });
        }
        let mut missing = vec![];
        for (index, bound_ty) in parameter_tys.iter().enumerate() {
            if bound_ty.is_none() {
                let param = &parameters[index];
                if param.is_variadic()
                    || param.is_keyword_variadic()
                    || param.default_type().is_some()
                {
                    // Variadic/keyword-variadic and defaulted parameters are not required.
                    continue;
                }
                missing.push(ParameterContext::new(param, index, false));
            }
        }

        if !missing.is_empty() {
            errors.push(BindingError::MissingArguments {
                parameters: ParameterContexts(missing),
            });
        }

        Self {
            return_ty: signature.return_ty.unwrap_or(Type::unknown()),
            parameter_tys: parameter_tys
                .into_iter()
                .map(|opt_ty| opt_ty.unwrap_or(Type::unknown()))
                .collect(),
            errors,
        }
    }

    pub(crate) fn set_return_type(&mut self, return_ty: Type<'db>) {
        self.return_ty = return_ty;
    }

    pub(crate) fn return_type(&self) -> Type<'db> {
        self.return_ty
    }

    pub(crate) fn parameter_types(&self) -> &[Type<'db>] {
        &self.parameter_tys
    }

    fn report_diagnostics(
        &self,
        context: &InferContext<'db>,
        node: ast::AnyNodeRef,
        callable_ty: Type<'db>,
        callable_description: Option<&CallableDescription>,
    ) {
        for error in &self.errors {
            error.report_diagnostic(context, node, callable_ty, callable_description);
        }
    }

    fn as_result(&self) -> Result<(), CallErrorKind> {
        if !self.errors.is_empty() {
            return Err(CallErrorKind::BindingError);
        }
        Ok(())
    }
}

/// Describes a callable for the purposes of diagnostics.
#[derive(Debug)]
pub(crate) struct CallableDescription<'a> {
    name: &'a str,
    kind: &'a str,
}

impl<'db> CallableDescription<'db> {
    fn new(db: &'db dyn Db, callable_type: Type<'db>) -> Option<CallableDescription<'db>> {
        match callable_type {
            Type::FunctionLiteral(function) => Some(CallableDescription {
                kind: "function",
                name: function.name(db),
            }),
            Type::ClassLiteral(class_type) => Some(CallableDescription {
                kind: "class",
                name: class_type.class().name(db),
            }),
            Type::Callable(CallableType::BoundMethod(bound_method)) => Some(CallableDescription {
                kind: "bound method",
                name: bound_method.function(db).name(db),
            }),
            Type::Callable(CallableType::MethodWrapperDunderGet(function)) => {
                Some(CallableDescription {
                    kind: "method wrapper `__get__` of function",
                    name: function.name(db),
                })
            }
            Type::Callable(CallableType::WrapperDescriptorDunderGet) => Some(CallableDescription {
                kind: "wrapper descriptor",
                name: "FunctionType.__get__",
            }),
            _ => None,
        }
    }
}

/// Information needed to emit a diagnostic regarding a parameter.
#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct ParameterContext {
    name: Option<ast::name::Name>,
    index: usize,

    /// Was the argument for this parameter passed positionally, and matched to a non-variadic
    /// positional parameter? (If so, we will provide the index in the diagnostic, not just the
    /// name.)
    positional: bool,
}

impl ParameterContext {
    fn new(parameter: &Parameter, index: usize, positional: bool) -> Self {
        Self {
            name: parameter.display_name(),
            index,
            positional,
        }
    }
}

impl std::fmt::Display for ParameterContext {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        if let Some(name) = &self.name {
            if self.positional {
                write!(f, "{} (`{name}`)", self.index + 1)
            } else {
                write!(f, "`{name}`")
            }
        } else {
            write!(f, "{}", self.index + 1)
        }
    }
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct ParameterContexts(Vec<ParameterContext>);

impl std::fmt::Display for ParameterContexts {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        let mut iter = self.0.iter();
        if let Some(first) = iter.next() {
            write!(f, "{first}")?;
            for param in iter {
                f.write_str(", ")?;
                write!(f, "{param}")?;
            }
        }
        Ok(())
    }
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) enum BindingError<'db> {
    /// The type of an argument is not assignable to the annotated type of its corresponding
    /// parameter.
    InvalidArgumentType {
        parameter: ParameterContext,
        argument_index: Option<usize>,
        expected_ty: Type<'db>,
        provided_ty: Type<'db>,
    },
    /// One or more required parameters (that is, parameters with no default) are not supplied by
    /// any argument.
    MissingArguments { parameters: ParameterContexts },
    /// A call argument can't be matched to any parameter.
    UnknownArgument {
        argument_name: ast::name::Name,
        argument_index: Option<usize>,
    },
    /// More positional arguments are provided in the call than can be handled by the signature.
    TooManyPositionalArguments {
        first_excess_argument_index: Option<usize>,
        expected_positional_count: usize,
        provided_positional_count: usize,
    },
    /// Multiple arguments were provided for a single parameter.
    ParameterAlreadyAssigned {
        argument_index: Option<usize>,
        parameter: ParameterContext,
    },
}
|
||||
impl<'db> BindingError<'db> {
|
||||
fn parameter_span_from_index(
|
||||
db: &'db dyn Db,
|
||||
callable_ty: Type<'db>,
|
||||
parameter_index: usize,
|
||||
) -> Option<Span> {
|
||||
match callable_ty {
|
||||
Type::FunctionLiteral(function) => {
|
||||
let function_scope = function.body_scope(db);
|
||||
let mut span = Span::from(function_scope.file(db));
|
||||
let node = function_scope.node(db);
|
||||
if let Some(func_def) = node.as_function() {
|
||||
let range = func_def
|
||||
.parameters
|
||||
.iter()
|
||||
.nth(parameter_index)
|
||||
.map(|param| param.range())
|
||||
.unwrap_or(func_def.parameters.range);
|
||||
span = span.with_range(range);
|
||||
Some(span)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
Type::Callable(CallableType::BoundMethod(bound_method)) => {
|
||||
Self::parameter_span_from_index(
|
||||
db,
|
||||
Type::FunctionLiteral(bound_method.function(db)),
|
||||
parameter_index,
|
||||
)
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
pub(super) fn report_diagnostic(
|
||||
&self,
|
||||
context: &InferContext<'db>,
|
||||
node: ast::AnyNodeRef,
|
||||
callable_ty: Type<'db>,
|
||||
callable_description: Option<&CallableDescription>,
|
||||
) {
|
||||
match self {
|
||||
Self::InvalidArgumentType {
|
||||
parameter,
|
||||
argument_index,
|
||||
expected_ty,
|
||||
provided_ty,
|
||||
} => {
|
||||
let mut messages = vec![];
|
||||
if let Some(span) =
|
||||
Self::parameter_span_from_index(context.db(), callable_ty, parameter.index)
|
||||
{
|
||||
messages.push(OldSecondaryDiagnosticMessage::new(
|
||||
span,
|
||||
"parameter declared in function definition here",
|
||||
));
|
||||
}
|
||||
|
||||
let provided_ty_display = provided_ty.display(context.db());
|
||||
let expected_ty_display = expected_ty.display(context.db());
|
||||
context.report_lint_with_secondary_messages(
|
||||
&INVALID_ARGUMENT_TYPE,
|
||||
Self::get_node(node, *argument_index),
|
||||
format_args!(
|
||||
"Object of type `{provided_ty_display}` cannot be assigned to \
|
||||
parameter {parameter}{}; expected type `{expected_ty_display}`",
|
||||
if let Some(CallableDescription { kind, name }) = callable_description {
|
||||
format!(" of {kind} `{name}`")
|
||||
} else {
|
||||
String::new()
|
||||
}
|
||||
),
|
||||
messages,
|
||||
);
|
||||
}
|
            Self::TooManyPositionalArguments {
                first_excess_argument_index,
                expected_positional_count,
                provided_positional_count,
            } => {
                context.report_lint(
                    &TOO_MANY_POSITIONAL_ARGUMENTS,
                    Self::get_node(node, *first_excess_argument_index),
                    format_args!(
                        "Too many positional arguments{}: expected \
                         {expected_positional_count}, got {provided_positional_count}",
                        if let Some(CallableDescription { kind, name }) = callable_description {
                            format!(" to {kind} `{name}`")
                        } else {
                            String::new()
                        }
                    ),
                );
            }

            Self::MissingArguments { parameters } => {
                let s = if parameters.0.len() == 1 { "" } else { "s" };
                context.report_lint(
                    &MISSING_ARGUMENT,
                    node,
                    format_args!(
                        "No argument{s} provided for required parameter{s} {parameters}{}",
                        if let Some(CallableDescription { kind, name }) = callable_description {
                            format!(" of {kind} `{name}`")
                        } else {
                            String::new()
                        }
                    ),
                );
            }

            Self::UnknownArgument {
                argument_name,
                argument_index,
            } => {
                context.report_lint(
                    &UNKNOWN_ARGUMENT,
                    Self::get_node(node, *argument_index),
                    format_args!(
                        "Argument `{argument_name}` does not match any known parameter{}",
                        if let Some(CallableDescription { kind, name }) = callable_description {
                            format!(" of {kind} `{name}`")
                        } else {
                            String::new()
                        }
                    ),
                );
            }

            Self::ParameterAlreadyAssigned {
                argument_index,
                parameter,
            } => {
                context.report_lint(
                    &PARAMETER_ALREADY_ASSIGNED,
                    Self::get_node(node, *argument_index),
                    format_args!(
                        "Multiple values provided for parameter {parameter}{}",
                        if let Some(CallableDescription { kind, name }) = callable_description {
                            format!(" of {kind} `{name}`")
                        } else {
                            String::new()
                        }
                    ),
                );
            }
        }
    }

    fn get_node(node: ast::AnyNodeRef, argument_index: Option<usize>) -> ast::AnyNodeRef {
        // If we have a `Call` node and an argument index, report the diagnostic on the correct
        // argument node; otherwise, report it on the entire provided node.
        match (node, argument_index) {
            (ast::AnyNodeRef::ExprCall(call_node), Some(argument_index)) => {
                match call_node
                    .arguments
                    .arguments_source_order()
                    .nth(argument_index)
                    .expect("argument index should not be out of range")
                {
                    ast::ArgOrKeyword::Arg(expr) => expr.into(),
                    ast::ArgOrKeyword::Keyword(keyword) => keyword.into(),
                }
            }
            _ => node,
        }
    }
}
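The four match arms above all build their message the same way: a fixed lint message plus an optional suffix naming the callable. A minimal, self-contained sketch of that pattern (the `message` helper and its tuple argument are illustrative, not the crate's API):

```rust
// Sketch of the diagnostic-message pattern used above: an optional
// (kind, name) pair contributes a suffix, otherwise the suffix is empty.
fn message(expected: usize, provided: usize, callable: Option<(&str, &str)>) -> String {
    format!(
        "Too many positional arguments{}: expected {expected}, got {provided}",
        callable
            .map(|(kind, name)| format!(" to {kind} `{name}`"))
            .unwrap_or_default()
    )
}

fn main() {
    assert_eq!(
        message(1, 2, Some(("function", "f"))),
        "Too many positional arguments to function `f`: expected 1, got 2"
    );
    // Without a callable description, the suffix is simply omitted.
    assert_eq!(
        message(1, 2, None),
        "Too many positional arguments: expected 1, got 2"
    );
    println!("ok");
}
```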
File diff suppressed because it is too large
@@ -1,191 +0,0 @@
use crate::types::{todo_type, Class, DynamicType, KnownClass, KnownInstanceType, Type};
use crate::Db;
use itertools::Either;

/// Enumeration of the possible kinds of types we allow in class bases.
///
/// This is much more limited than the [`Type`] enum:
/// all types that would be invalid to have as a class base are
/// transformed into [`ClassBase::unknown`].
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq, salsa::Update)]
pub(crate) enum ClassBase<'db> {
    Dynamic(DynamicType),
    Class(Class<'db>),
}

impl<'db> ClassBase<'db> {
    pub(crate) const fn any() -> Self {
        Self::Dynamic(DynamicType::Any)
    }

    pub(crate) const fn unknown() -> Self {
        Self::Dynamic(DynamicType::Unknown)
    }

    pub(crate) const fn is_dynamic(self) -> bool {
        match self {
            ClassBase::Dynamic(_) => true,
            ClassBase::Class(_) => false,
        }
    }

    pub(crate) fn display(self, db: &'db dyn Db) -> impl std::fmt::Display + 'db {
        struct Display<'db> {
            base: ClassBase<'db>,
            db: &'db dyn Db,
        }

        impl std::fmt::Display for Display<'_> {
            fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
                match self.base {
                    ClassBase::Dynamic(dynamic) => dynamic.fmt(f),
                    ClassBase::Class(class) => write!(f, "<class '{}'>", class.name(self.db)),
                }
            }
        }

        Display { base: self, db }
    }

    /// Return a `ClassBase` representing the class `builtins.object`.
    pub(super) fn object(db: &'db dyn Db) -> Self {
        KnownClass::Object
            .to_class_literal(db)
            .into_class_literal()
            .map_or(Self::unknown(), |literal| Self::Class(literal.class()))
    }

    /// Attempt to resolve `ty` into a `ClassBase`.
    ///
    /// Return `None` if `ty` is not an acceptable type for a class base.
    pub(super) fn try_from_type(db: &'db dyn Db, ty: Type<'db>) -> Option<Self> {
        match ty {
            Type::Dynamic(dynamic) => Some(Self::Dynamic(dynamic)),
            Type::ClassLiteral(literal) => Some(Self::Class(literal.class())),
            Type::Union(_) => None, // TODO -- forces consideration of multiple possible MROs?
            Type::Intersection(_) => None, // TODO -- probably incorrect?
            Type::Instance(_) => None, // TODO -- handle `__mro_entries__`?
            Type::Never
            | Type::BooleanLiteral(_)
            | Type::FunctionLiteral(_)
            | Type::Callable(..)
            | Type::BytesLiteral(_)
            | Type::IntLiteral(_)
            | Type::StringLiteral(_)
            | Type::LiteralString
            | Type::Tuple(_)
            | Type::SliceLiteral(_)
            | Type::ModuleLiteral(_)
            | Type::SubclassOf(_)
            | Type::AlwaysFalsy
            | Type::AlwaysTruthy => None,
            Type::KnownInstance(known_instance) => match known_instance {
                KnownInstanceType::TypeVar(_)
                | KnownInstanceType::TypeAliasType(_)
                | KnownInstanceType::Annotated
                | KnownInstanceType::Literal
                | KnownInstanceType::LiteralString
                | KnownInstanceType::Union
                | KnownInstanceType::NoReturn
                | KnownInstanceType::Never
                | KnownInstanceType::Final
                | KnownInstanceType::NotRequired
                | KnownInstanceType::TypeGuard
                | KnownInstanceType::TypeIs
                | KnownInstanceType::TypingSelf
                | KnownInstanceType::Unpack
                | KnownInstanceType::ClassVar
                | KnownInstanceType::Concatenate
                | KnownInstanceType::Required
                | KnownInstanceType::TypeAlias
                | KnownInstanceType::ReadOnly
                | KnownInstanceType::Optional
                | KnownInstanceType::Not
                | KnownInstanceType::Intersection
                | KnownInstanceType::TypeOf
                | KnownInstanceType::CallableTypeFromFunction
                | KnownInstanceType::AlwaysTruthy
                | KnownInstanceType::AlwaysFalsy => None,
                KnownInstanceType::Unknown => Some(Self::unknown()),
                KnownInstanceType::Any => Some(Self::any()),
                // TODO: Classes inheriting from `typing.Type` et al. also have `Generic` in their MRO
                KnownInstanceType::Dict => {
                    Self::try_from_type(db, KnownClass::Dict.to_class_literal(db))
                }
                KnownInstanceType::List => {
                    Self::try_from_type(db, KnownClass::List.to_class_literal(db))
                }
                KnownInstanceType::Type => {
                    Self::try_from_type(db, KnownClass::Type.to_class_literal(db))
                }
                KnownInstanceType::Tuple => {
                    Self::try_from_type(db, KnownClass::Tuple.to_class_literal(db))
                }
                KnownInstanceType::Set => {
                    Self::try_from_type(db, KnownClass::Set.to_class_literal(db))
                }
                KnownInstanceType::FrozenSet => {
                    Self::try_from_type(db, KnownClass::FrozenSet.to_class_literal(db))
                }
                KnownInstanceType::ChainMap => {
                    Self::try_from_type(db, KnownClass::ChainMap.to_class_literal(db))
                }
                KnownInstanceType::Counter => {
                    Self::try_from_type(db, KnownClass::Counter.to_class_literal(db))
                }
                KnownInstanceType::DefaultDict => {
                    Self::try_from_type(db, KnownClass::DefaultDict.to_class_literal(db))
                }
                KnownInstanceType::Deque => {
                    Self::try_from_type(db, KnownClass::Deque.to_class_literal(db))
                }
                KnownInstanceType::OrderedDict => {
                    Self::try_from_type(db, KnownClass::OrderedDict.to_class_literal(db))
                }
                KnownInstanceType::Callable => {
                    Self::try_from_type(db, todo_type!("Support for Callable as a base class"))
                }
                KnownInstanceType::Protocol => Some(ClassBase::Dynamic(DynamicType::TodoProtocol)),
            },
        }
    }

    pub(super) fn into_class(self) -> Option<Class<'db>> {
        match self {
            Self::Class(class) => Some(class),
            Self::Dynamic(_) => None,
        }
    }

    /// Iterate over the MRO of this base.
    pub(super) fn mro(
        self,
        db: &'db dyn Db,
    ) -> Either<impl Iterator<Item = ClassBase<'db>>, impl Iterator<Item = ClassBase<'db>>> {
        match self {
            ClassBase::Dynamic(_) => Either::Left([self, ClassBase::object(db)].into_iter()),
            ClassBase::Class(class) => Either::Right(class.iter_mro(db)),
        }
    }
}

impl<'db> From<Class<'db>> for ClassBase<'db> {
    fn from(value: Class<'db>) -> Self {
        ClassBase::Class(value)
    }
}

impl<'db> From<ClassBase<'db>> for Type<'db> {
    fn from(value: ClassBase<'db>) -> Self {
        match value {
            ClassBase::Dynamic(dynamic) => Type::Dynamic(dynamic),
            ClassBase::Class(class) => Type::class_literal(class),
        }
    }
}

impl<'db> From<&ClassBase<'db>> for Type<'db> {
    fn from(value: &ClassBase<'db>) -> Self {
        Self::from(*value)
    }
}
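`ClassBase::mro` returns `Either` because its two branches produce different concrete iterator types. A minimal, self-contained sketch of that trick with a hand-rolled `Either` and string stand-ins for classes (names are illustrative, not the crate's types):

```rust
// Two branches return different iterator types; one enum unifies them
// so the function can still return a single `impl Iterator`.
enum Either<L, R> {
    Left(L),
    Right(R),
}

impl<L, R, T> Iterator for Either<L, R>
where
    L: Iterator<Item = T>,
    R: Iterator<Item = T>,
{
    type Item = T;
    fn next(&mut self) -> Option<T> {
        match self {
            Either::Left(l) => l.next(),
            Either::Right(r) => r.next(),
        }
    }
}

fn mro(dynamic: bool) -> impl Iterator<Item = &'static str> {
    if dynamic {
        // A dynamic base contributes the two-element MRO `[<base>, object]`.
        Either::Left(["<dynamic>", "object"].into_iter())
    } else {
        // A real class walks its full MRO (here: a fixed example chain).
        Either::Right(["C", "B", "A", "object"].into_iter())
    }
}

fn main() {
    assert_eq!(mro(true).collect::<Vec<_>>(), vec!["<dynamic>", "object"]);
    assert_eq!(mro(false).collect::<Vec<_>>(), vec!["C", "B", "A", "object"]);
    println!("ok");
}
```

The real code uses `itertools::Either`, which provides this `Iterator` impl out of the box.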
@@ -1,236 +0,0 @@
use std::fmt;

use drop_bomb::DebugDropBomb;
use ruff_db::{
    diagnostic::{DiagnosticId, OldSecondaryDiagnosticMessage, Severity},
    files::File,
};
use ruff_text_size::{Ranged, TextRange};

use super::{binding_type, KnownFunction, TypeCheckDiagnostic, TypeCheckDiagnostics};

use crate::semantic_index::semantic_index;
use crate::semantic_index::symbol::ScopeId;
use crate::{
    lint::{LintId, LintMetadata},
    suppression::suppressions,
    Db,
};

/// Context for inferring the types of a single file.
///
/// At least one context exists for every inferred region, but it's
/// possible that inferring a sub-region, like an unpack assignment, creates
/// a sub-context.
///
/// Tracks the reported diagnostics of the inferred region.
///
/// ## Consuming
/// It's important that the context is explicitly consumed before dropping by calling
/// [`InferContext::finish`], and the returned diagnostics must be stored
/// on the current [`TypeInference`](super::infer::TypeInference) result.
pub(crate) struct InferContext<'db> {
    db: &'db dyn Db,
    scope: ScopeId<'db>,
    file: File,
    diagnostics: std::cell::RefCell<TypeCheckDiagnostics>,
    no_type_check: InNoTypeCheck,
    bomb: DebugDropBomb,
}

impl<'db> InferContext<'db> {
    pub(crate) fn new(db: &'db dyn Db, scope: ScopeId<'db>) -> Self {
        Self {
            db,
            scope,
            file: scope.file(db),
            diagnostics: std::cell::RefCell::new(TypeCheckDiagnostics::default()),
            no_type_check: InNoTypeCheck::default(),
            bomb: DebugDropBomb::new("`InferContext` needs to be explicitly consumed by calling `::finish` to prevent accidental loss of diagnostics."),
        }
    }

    /// The file for which the types are inferred.
    pub(crate) fn file(&self) -> File {
        self.file
    }

    pub(crate) fn db(&self) -> &'db dyn Db {
        self.db
    }

    pub(crate) fn extend<T>(&mut self, other: &T)
    where
        T: WithDiagnostics,
    {
        self.diagnostics.get_mut().extend(other.diagnostics());
    }

    /// Reports a lint located at `ranged`.
    pub(super) fn report_lint<T>(
        &self,
        lint: &'static LintMetadata,
        ranged: T,
        message: fmt::Arguments,
    ) where
        T: Ranged,
    {
        self.report_lint_with_secondary_messages(lint, ranged, message, vec![]);
    }

    /// Reports a lint located at `ranged`, with additional secondary messages.
    pub(super) fn report_lint_with_secondary_messages<T>(
        &self,
        lint: &'static LintMetadata,
        ranged: T,
        message: fmt::Arguments,
        secondary_messages: Vec<OldSecondaryDiagnosticMessage>,
    ) where
        T: Ranged,
    {
        fn lint_severity(
            context: &InferContext,
            lint: &'static LintMetadata,
            range: TextRange,
        ) -> Option<Severity> {
            if !context.db.is_file_open(context.file) {
                return None;
            }

            // Skip over diagnostics if the rule is disabled.
            let severity = context.db.rule_selection().severity(LintId::of(lint))?;

            if context.is_in_no_type_check() {
                return None;
            }

            let suppressions = suppressions(context.db, context.file);

            if let Some(suppression) = suppressions.find_suppression(range, LintId::of(lint)) {
                context.diagnostics.borrow_mut().mark_used(suppression.id());
                return None;
            }

            Some(severity)
        }

        let Some(severity) = lint_severity(self, lint, ranged.range()) else {
            return;
        };

        self.report_diagnostic(
            ranged,
            DiagnosticId::Lint(lint.name()),
            severity,
            message,
            secondary_messages,
        );
    }

    /// Adds a new diagnostic.
    ///
    /// The diagnostic does not get added if the rule isn't enabled for this file.
    pub(super) fn report_diagnostic<T>(
        &self,
        ranged: T,
        id: DiagnosticId,
        severity: Severity,
        message: fmt::Arguments,
        secondary_messages: Vec<OldSecondaryDiagnosticMessage>,
    ) where
        T: Ranged,
    {
        if !self.db.is_file_open(self.file) {
            return;
        }

        // TODO: Don't emit the diagnostic if:
        // * The enclosing node contains any syntax errors
        // * The rule is disabled for this file. We probably want to introduce a new query that
        //   returns a rule selector for a given file that respects the package's settings,
        //   any global pragma comments in the file, and any per-file-ignores.

        self.diagnostics.borrow_mut().push(TypeCheckDiagnostic {
            file: self.file,
            id,
            message: message.to_string(),
            range: ranged.range(),
            severity,
            secondary_messages,
        });
    }

    pub(super) fn set_in_no_type_check(&mut self, no_type_check: InNoTypeCheck) {
        self.no_type_check = no_type_check;
    }

    fn is_in_no_type_check(&self) -> bool {
        match self.no_type_check {
            InNoTypeCheck::Possibly => {
                // Accessing the semantic index here is fine because
                // the index belongs to the same file as the one for which we emit the diagnostic.
                let index = semantic_index(self.db, self.file);

                let scope_id = self.scope.file_scope_id(self.db);

                // Inspect all ancestor function scopes by walking bottom up and infer the function's type.
                let mut function_scope_tys = index
                    .ancestor_scopes(scope_id)
                    .filter_map(|(_, scope)| scope.node().as_function())
                    .filter_map(|function| {
                        binding_type(self.db, index.definition(function)).into_function_literal()
                    });

                // Iterate over all functions and test if any is decorated with `@no_type_check`.
                function_scope_tys.any(|function_ty| {
                    function_ty
                        .decorators(self.db)
                        .iter()
                        .filter_map(|decorator| decorator.into_function_literal())
                        .any(|decorator_ty| {
                            decorator_ty.is_known(self.db, KnownFunction::NoTypeCheck)
                        })
                })
            }
            InNoTypeCheck::Yes => true,
        }
    }

    /// Are we currently inferring types in a stub file?
    pub(crate) fn in_stub(&self) -> bool {
        self.file.is_stub(self.db().upcast())
    }

    #[must_use]
    pub(crate) fn finish(mut self) -> TypeCheckDiagnostics {
        self.bomb.defuse();
        let mut diagnostics = self.diagnostics.into_inner();
        diagnostics.shrink_to_fit();
        diagnostics
    }
}

impl fmt::Debug for InferContext<'_> {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        f.debug_struct("TyContext")
            .field("file", &self.file)
            .field("diagnostics", &self.diagnostics)
            .field("defused", &self.bomb)
            .finish()
    }
}

#[derive(Copy, Clone, Debug, PartialEq, Eq, Default)]
pub(crate) enum InNoTypeCheck {
    /// The inference might be in a `no_type_check` block, but only if an
    /// ancestor function is decorated with `@no_type_check`.
    #[default]
    Possibly,

    /// The inference is known to be in an `@no_type_check` decorated function.
    Yes,
}

pub(crate) trait WithDiagnostics {
    fn diagnostics(&self) -> &TypeCheckDiagnostics;
}
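`InferContext` relies on the "drop bomb" pattern (via the `drop_bomb` crate) to guarantee that `finish` is called before the context is dropped. A hand-rolled sketch of the same idea (this is not the `drop_bomb` crate's implementation, just an illustration):

```rust
// A guard that trips a debug assertion if dropped without being defused,
// mirroring how `InferContext` forces callers to go through `finish`.
struct DropBomb(bool, &'static str);

impl DropBomb {
    fn new(msg: &'static str) -> Self {
        DropBomb(false, msg)
    }
    fn defuse(&mut self) {
        self.0 = true;
    }
}

impl Drop for DropBomb {
    fn drop(&mut self) {
        // Only fire in debug builds, and never while already unwinding.
        if !self.0 && !std::thread::panicking() {
            debug_assert!(false, "{}", self.1);
        }
    }
}

fn main() {
    let mut bomb = DropBomb::new("context dropped without calling `finish`");
    bomb.defuse(); // comparable to `InferContext::finish` defusing its bomb
    println!("ok");
}
```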
@@ -1,565 +0,0 @@
use crate::semantic_index::ast_ids::HasScopedExpressionId;
use crate::semantic_index::definition::Definition;
use crate::semantic_index::expression::Expression;
use crate::semantic_index::predicate::{
    PatternPredicate, PatternPredicateKind, Predicate, PredicateNode,
};
use crate::semantic_index::symbol::{ScopeId, ScopedSymbolId, SymbolTable};
use crate::semantic_index::symbol_table;
use crate::types::infer::infer_same_file_expression_type;
use crate::types::{
    infer_expression_types, ClassLiteralType, IntersectionBuilder, KnownClass, SubclassOfType,
    Truthiness, Type, UnionBuilder,
};
use crate::Db;
use itertools::Itertools;
use ruff_python_ast as ast;
use ruff_python_ast::{BoolOp, ExprBoolOp};
use rustc_hash::FxHashMap;
use std::collections::hash_map::Entry;
use std::sync::Arc;

/// Return the type constraint that `test` (if true) would place on `definition`, if any.
///
/// For example, if we have this code:
///
/// ```python
/// y = 1 if flag else None
/// x = 1 if flag else None
/// if x is not None:
///     ...
/// ```
///
/// The `test` expression `x is not None` places the constraint "not None" on the definition of
/// `x`, so in that case we'd return `Some(Type::Intersection(negative=[Type::None]))`.
///
/// But if we called this with the same `test` expression, but the `definition` of `y`, no
/// constraint is applied to that definition, so we'd just return `None`.
pub(crate) fn infer_narrowing_constraint<'db>(
    db: &'db dyn Db,
    predicate: Predicate<'db>,
    definition: Definition<'db>,
) -> Option<Type<'db>> {
    let constraints = match predicate.node {
        PredicateNode::Expression(expression) => {
            if predicate.is_positive {
                all_narrowing_constraints_for_expression(db, expression)
            } else {
                all_negative_narrowing_constraints_for_expression(db, expression)
            }
        }
        PredicateNode::Pattern(pattern) => all_narrowing_constraints_for_pattern(db, pattern),
    };
    if let Some(constraints) = constraints {
        constraints.get(&definition.symbol(db)).copied()
    } else {
        None
    }
}

#[allow(clippy::ref_option)]
#[salsa::tracked(return_ref)]
fn all_narrowing_constraints_for_pattern<'db>(
    db: &'db dyn Db,
    pattern: PatternPredicate<'db>,
) -> Option<NarrowingConstraints<'db>> {
    NarrowingConstraintsBuilder::new(db, PredicateNode::Pattern(pattern), true).finish()
}

#[allow(clippy::ref_option)]
#[salsa::tracked(return_ref)]
fn all_narrowing_constraints_for_expression<'db>(
    db: &'db dyn Db,
    expression: Expression<'db>,
) -> Option<NarrowingConstraints<'db>> {
    NarrowingConstraintsBuilder::new(db, PredicateNode::Expression(expression), true).finish()
}

#[allow(clippy::ref_option)]
#[salsa::tracked(return_ref)]
fn all_negative_narrowing_constraints_for_expression<'db>(
    db: &'db dyn Db,
    expression: Expression<'db>,
) -> Option<NarrowingConstraints<'db>> {
    NarrowingConstraintsBuilder::new(db, PredicateNode::Expression(expression), false).finish()
}

#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum KnownConstraintFunction {
    /// `builtins.isinstance`
    IsInstance,
    /// `builtins.issubclass`
    IsSubclass,
}

impl KnownConstraintFunction {
    /// Generate a constraint from the type of a `classinfo` argument to `isinstance` or `issubclass`.
    ///
    /// The `classinfo` argument can be a class literal or a tuple of (tuples of) class literals.
    /// PEP 604 union types are not yet supported. Returns `None` if the `classinfo` argument has
    /// a wrong type.
    fn generate_constraint<'db>(self, db: &'db dyn Db, classinfo: Type<'db>) -> Option<Type<'db>> {
        let constraint_fn = |class| match self {
            KnownConstraintFunction::IsInstance => Type::instance(class),
            KnownConstraintFunction::IsSubclass => SubclassOfType::from(db, class),
        };

        match classinfo {
            Type::Tuple(tuple) => {
                let mut builder = UnionBuilder::new(db);
                for element in tuple.elements(db) {
                    builder = builder.add(self.generate_constraint(db, *element)?);
                }
                Some(builder.build())
            }
            Type::ClassLiteral(class_literal) => Some(constraint_fn(class_literal.class())),
            Type::SubclassOf(subclass_of_ty) => {
                subclass_of_ty.subclass_of().into_class().map(constraint_fn)
            }
            _ => None,
        }
    }
}

type NarrowingConstraints<'db> = FxHashMap<ScopedSymbolId, Type<'db>>;

fn merge_constraints_and<'db>(
    into: &mut NarrowingConstraints<'db>,
    from: NarrowingConstraints<'db>,
    db: &'db dyn Db,
) {
    for (key, value) in from {
        match into.entry(key) {
            Entry::Occupied(mut entry) => {
                *entry.get_mut() = IntersectionBuilder::new(db)
                    .add_positive(*entry.get())
                    .add_positive(value)
                    .build();
            }
            Entry::Vacant(entry) => {
                entry.insert(value);
            }
        }
    }
}

fn merge_constraints_or<'db>(
    into: &mut NarrowingConstraints<'db>,
    from: &NarrowingConstraints<'db>,
    db: &'db dyn Db,
) {
    for (key, value) in from {
        match into.entry(*key) {
            Entry::Occupied(mut entry) => {
                *entry.get_mut() = UnionBuilder::new(db).add(*entry.get()).add(*value).build();
            }
            Entry::Vacant(entry) => {
                entry.insert(Type::object(db));
            }
        }
    }
    for (key, value) in into.iter_mut() {
        if !from.contains_key(key) {
            *value = Type::object(db);
        }
    }
}
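The two merge functions above encode how narrowing composes: `and` intersects the constraints per symbol, while `or` unions them and widens any symbol narrowed in only one branch back to `object`. A minimal, self-contained model of that semantics, with type names as plain string sets instead of the crate's `Type` and builders (all names here are illustrative):

```rust
use std::collections::{HashMap, HashSet};

// Simplified model: a constraint is the set of admissible type names for a
// symbol, so "and" is set intersection and "or" is set union.
type Constraints = HashMap<&'static str, HashSet<&'static str>>;

fn merge_and(into: &mut Constraints, from: Constraints) {
    for (sym, tys) in from {
        into.entry(sym)
            // Both conjuncts constrain `sym`: keep only types allowed by both.
            .and_modify(|cur| cur.retain(|t| tys.contains(t)))
            .or_insert(tys);
    }
}

fn merge_or(into: &mut Constraints, from: &Constraints, object: &HashSet<&'static str>) {
    for (sym, tys) in from.iter() {
        into.entry(*sym)
            // Both disjuncts constrain `sym`: either set of types is possible.
            .and_modify(|cur| cur.extend(tys.iter().copied()))
            // Narrowed in only one branch: no narrowing overall.
            .or_insert_with(|| object.clone());
    }
    for (sym, cur) in into.iter_mut() {
        if !from.contains_key(sym) {
            *cur = object.clone(); // same widening, other direction
        }
    }
}

fn main() {
    let object: HashSet<&'static str> = HashSet::from(["int", "str", "None"]);

    let mut a: Constraints = HashMap::from([("x", HashSet::from(["int", "str"]))]);
    merge_and(&mut a, HashMap::from([("x", HashSet::from(["int"]))]));
    assert_eq!(a["x"], HashSet::from(["int"])); // intersection

    let mut b: Constraints = HashMap::from([
        ("x", HashSet::from(["int"])),
        ("y", HashSet::from(["str"])),
    ]);
    merge_or(&mut b, &HashMap::from([("x", HashSet::from(["str"]))]), &object);
    assert_eq!(b["x"], HashSet::from(["int", "str"])); // union
    assert_eq!(b["y"], object); // widened: `y` narrowed in only one branch
    println!("ok");
}
```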

struct NarrowingConstraintsBuilder<'db> {
    db: &'db dyn Db,
    predicate: PredicateNode<'db>,
    is_positive: bool,
}

impl<'db> NarrowingConstraintsBuilder<'db> {
    fn new(db: &'db dyn Db, predicate: PredicateNode<'db>, is_positive: bool) -> Self {
        Self {
            db,
            predicate,
            is_positive,
        }
    }

    fn finish(mut self) -> Option<NarrowingConstraints<'db>> {
        let constraints: Option<NarrowingConstraints<'db>> = match self.predicate {
            PredicateNode::Expression(expression) => {
                self.evaluate_expression_predicate(expression, self.is_positive)
            }
            PredicateNode::Pattern(pattern) => self.evaluate_pattern_predicate(pattern),
        };
        if let Some(mut constraints) = constraints {
            constraints.shrink_to_fit();
            Some(constraints)
        } else {
            None
        }
    }

    fn evaluate_expression_predicate(
        &mut self,
        expression: Expression<'db>,
        is_positive: bool,
    ) -> Option<NarrowingConstraints<'db>> {
        let expression_node = expression.node_ref(self.db).node();
        self.evaluate_expression_node_predicate(expression_node, expression, is_positive)
    }

    fn evaluate_expression_node_predicate(
        &mut self,
        expression_node: &ruff_python_ast::Expr,
        expression: Expression<'db>,
        is_positive: bool,
    ) -> Option<NarrowingConstraints<'db>> {
        match expression_node {
            ast::Expr::Name(name) => Some(self.evaluate_expr_name(name, is_positive)),
            ast::Expr::Compare(expr_compare) => {
                self.evaluate_expr_compare(expr_compare, expression, is_positive)
            }
            ast::Expr::Call(expr_call) => {
                self.evaluate_expr_call(expr_call, expression, is_positive)
            }
            ast::Expr::UnaryOp(unary_op) if unary_op.op == ast::UnaryOp::Not => {
                self.evaluate_expression_node_predicate(&unary_op.operand, expression, !is_positive)
            }
            ast::Expr::BoolOp(bool_op) => self.evaluate_bool_op(bool_op, expression, is_positive),
            _ => None, // TODO: other test expression kinds
        }
    }

    fn evaluate_pattern_predicate(
        &mut self,
        pattern: PatternPredicate<'db>,
    ) -> Option<NarrowingConstraints<'db>> {
        let subject = pattern.subject(self.db);

        match pattern.kind(self.db) {
            PatternPredicateKind::Singleton(singleton, _guard) => {
                self.evaluate_match_pattern_singleton(subject, *singleton)
            }
            PatternPredicateKind::Class(cls, _guard) => {
                self.evaluate_match_pattern_class(subject, *cls)
            }
            // TODO: support more pattern kinds
            PatternPredicateKind::Value(..) | PatternPredicateKind::Unsupported => None,
        }
    }

    fn symbols(&self) -> Arc<SymbolTable> {
        symbol_table(self.db, self.scope())
    }

    fn scope(&self) -> ScopeId<'db> {
        match self.predicate {
            PredicateNode::Expression(expression) => expression.scope(self.db),
            PredicateNode::Pattern(pattern) => pattern.scope(self.db),
        }
    }

    fn evaluate_expr_name(
        &mut self,
        expr_name: &ast::ExprName,
        is_positive: bool,
    ) -> NarrowingConstraints<'db> {
        let ast::ExprName { id, .. } = expr_name;

        let symbol = self
            .symbols()
            .symbol_id_by_name(id)
            .expect("Should always have a symbol for every Name node");
        let mut constraints = NarrowingConstraints::default();

        constraints.insert(
            symbol,
            if is_positive {
                Type::AlwaysFalsy.negate(self.db)
            } else {
                Type::AlwaysTruthy.negate(self.db)
            },
        );

        constraints
    }

    fn evaluate_expr_compare(
        &mut self,
        expr_compare: &ast::ExprCompare,
        expression: Expression<'db>,
        is_positive: bool,
    ) -> Option<NarrowingConstraints<'db>> {
        fn is_narrowing_target_candidate(expr: &ast::Expr) -> bool {
            matches!(expr, ast::Expr::Name(_) | ast::Expr::Call(_))
        }

        let ast::ExprCompare {
            range: _,
            left,
            ops,
            comparators,
        } = expr_compare;

        // Performance optimization: early return if there are no potential narrowing targets.
        if !is_narrowing_target_candidate(left)
            && comparators
                .iter()
                .all(|c| !is_narrowing_target_candidate(c))
        {
            return None;
        }

        if !is_positive && comparators.len() > 1 {
            // We can't negate a constraint made by a multi-comparator expression, since we can't
            // know which comparison part is the one being negated.
            // For example, the negation of `x is 1 is y is 2` would be
            // `(x is not 1) or (y is not 1) or (y is not 2)`,
            // and that requires cross-symbol constraints, which we don't support yet.
            return None;
        }
        let scope = self.scope();
        let inference = infer_expression_types(self.db, expression);

        let comparator_tuples = std::iter::once(&**left)
            .chain(comparators)
            .tuple_windows::<(&ruff_python_ast::Expr, &ruff_python_ast::Expr)>();
        let mut constraints = NarrowingConstraints::default();

        let mut last_rhs_ty: Option<Type> = None;

        for (op, (left, right)) in std::iter::zip(&**ops, comparator_tuples) {
            let lhs_ty = last_rhs_ty.unwrap_or_else(|| {
                inference.expression_type(left.scoped_expression_id(self.db, scope))
            });
            let rhs_ty = inference.expression_type(right.scoped_expression_id(self.db, scope));
            last_rhs_ty = Some(rhs_ty);

            match left {
                ast::Expr::Name(ast::ExprName {
                    range: _,
                    id,
                    ctx: _,
                }) => {
                    let symbol = self
                        .symbols()
                        .symbol_id_by_name(id)
                        .expect("Should always have a symbol for every Name node");

                    match if is_positive { *op } else { op.negate() } {
                        ast::CmpOp::IsNot => {
                            if rhs_ty.is_singleton(self.db) {
                                let ty = IntersectionBuilder::new(self.db)
                                    .add_negative(rhs_ty)
                                    .build();
                                constraints.insert(symbol, ty);
                            } else {
                                // Non-singletons cannot be safely narrowed using `is not`.
                            }
                        }
                        ast::CmpOp::Is => {
                            constraints.insert(symbol, rhs_ty);
                        }
                        ast::CmpOp::NotEq => {
                            if rhs_ty.is_single_valued(self.db) {
                                let ty = IntersectionBuilder::new(self.db)
                                    .add_negative(rhs_ty)
                                    .build();
                                constraints.insert(symbol, ty);
                            }
                        }
                        ast::CmpOp::Eq if lhs_ty.is_literal_string() => {
                            constraints.insert(symbol, rhs_ty);
                        }
                        _ => {
                            // TODO: other comparison types
                        }
                    }
                }
                ast::Expr::Call(ast::ExprCall {
                    range: _,
                    func: callable,
                    arguments:
                        ast::Arguments {
                            args,
                            keywords,
                            range: _,
                        },
                }) if keywords.is_empty() => {
                    let Type::ClassLiteral(ClassLiteralType { class: rhs_class }) = rhs_ty else {
                        continue;
                    };

                    let [ast::Expr::Name(ast::ExprName { id, .. })] = &**args else {
                        continue;
                    };

                    let is_valid_constraint = if is_positive {
                        op == &ast::CmpOp::Is
                    } else {
                        op == &ast::CmpOp::IsNot
                    };

                    if !is_valid_constraint {
                        continue;
                    }

                    let callable_type =
                        inference.expression_type(callable.scoped_expression_id(self.db, scope));

                    if callable_type
                        .into_class_literal()
                        .is_some_and(|c| c.class().is_known(self.db, KnownClass::Type))
                    {
                        let symbol = self
                            .symbols()
                            .symbol_id_by_name(id)
                            .expect("Should always have a symbol for every Name node");
                        constraints.insert(symbol, Type::instance(rhs_class));
                    }
                }
                _ => {}
            }
        }
        Some(constraints)
    }

    fn evaluate_expr_call(
        &mut self,
        expr_call: &ast::ExprCall,
        expression: Expression<'db>,
        is_positive: bool,
    ) -> Option<NarrowingConstraints<'db>> {
        let scope = self.scope();
        let inference = infer_expression_types(self.db, expression);

        let callable_ty =
            inference.expression_type(expr_call.func.scoped_expression_id(self.db, scope));

        // TODO: add support for PEP 604 union types on the right hand side of `isinstance`
        // and `issubclass`, for example `isinstance(x, str | (int | float))`.
        match callable_ty {
            Type::FunctionLiteral(function_type) if expr_call.arguments.keywords.is_empty() => {
                let function = function_type.known(self.db)?.into_constraint_function()?;

                let [ast::Expr::Name(ast::ExprName { id, .. }), class_info] =
                    &*expr_call.arguments.args
                else {
                    return None;
                };

                let symbol = self.symbols().symbol_id_by_name(id).unwrap();

                let class_info_ty =
                    inference.expression_type(class_info.scoped_expression_id(self.db, scope));

                function
                    .generate_constraint(self.db, class_info_ty)
                    .map(|constraint| {
                        let mut constraints = NarrowingConstraints::default();
                        constraints.insert(symbol, constraint.negate_if(self.db, !is_positive));
                        constraints
                    })
            }
            // For the expression `bool(E)`, we further narrow the type based on `E`.
            Type::ClassLiteral(class_type)
                if expr_call.arguments.args.len() == 1
                    && expr_call.arguments.keywords.is_empty()
                    && class_type.class().is_known(self.db, KnownClass::Bool) =>
            {
                self.evaluate_expression_node_predicate(
                    &expr_call.arguments.args[0],
                    expression,
                    is_positive,
                )
            }
            _ => None,
        }
    }

    fn evaluate_match_pattern_singleton(
        &mut self,
        subject: Expression<'db>,
        singleton: ast::Singleton,
    ) -> Option<NarrowingConstraints<'db>> {
        if let Some(ast::ExprName { id, .. }) = subject.node_ref(self.db).as_name_expr() {
            // SAFETY: we should always have a symbol for every Name node.
            let symbol = self.symbols().symbol_id_by_name(id).unwrap();

            let ty = match singleton {
                ast::Singleton::None => Type::none(self.db),
                ast::Singleton::True => Type::BooleanLiteral(true),
                ast::Singleton::False => Type::BooleanLiteral(false),
            };
            let mut constraints = NarrowingConstraints::default();
            constraints.insert(symbol, ty);
            Some(constraints)
        } else {
            None
        }
    }

    fn evaluate_match_pattern_class(
        &mut self,
        subject: Expression<'db>,
        cls: Expression<'db>,
    ) -> Option<NarrowingConstraints<'db>> {
        let ast::ExprName { id, .. } = subject.node_ref(self.db).as_name_expr()?;
        let symbol = self
            .symbols()
            .symbol_id_by_name(id)
            .expect("We should always have a symbol for every `Name` node");
        let ty = infer_same_file_expression_type(self.db, cls).to_instance(self.db)?;

        let mut constraints = NarrowingConstraints::default();
        constraints.insert(symbol, ty);
        Some(constraints)
    }

    fn evaluate_bool_op(
        &mut self,
        expr_bool_op: &ExprBoolOp,
        expression: Expression<'db>,
        is_positive: bool,
    ) -> Option<NarrowingConstraints<'db>> {
        let inference = infer_expression_types(self.db, expression);
        let scope = self.scope();
        let mut sub_constraints = expr_bool_op
            .values
            .iter()
            // Filter out arms with statically known truthiness.
            .filter(|expr| {
                inference
                    .expression_type(expr.scoped_expression_id(self.db, scope))
                    .bool(self.db)
                    != match expr_bool_op.op {
                        BoolOp::And => Truthiness::AlwaysTrue,
                        BoolOp::Or => Truthiness::AlwaysFalse,
                    }
            })
            .map(|sub_expr| {
                self.evaluate_expression_node_predicate(sub_expr, expression, is_positive)
            })
            .collect::<Vec<_>>();
        match (expr_bool_op.op, is_positive) {
            (BoolOp::And, true) | (BoolOp::Or, false) => {
                let mut aggregation: Option<NarrowingConstraints> = None;
|
||||
for sub_constraint in sub_constraints.into_iter().flatten() {
|
||||
if let Some(ref mut some_aggregation) = aggregation {
|
||||
merge_constraints_and(some_aggregation, sub_constraint, self.db);
|
||||
} else {
|
||||
aggregation = Some(sub_constraint);
|
||||
}
|
||||
}
|
||||
aggregation
|
||||
}
|
||||
(BoolOp::Or, true) | (BoolOp::And, false) => {
|
||||
let (first, rest) = sub_constraints.split_first_mut()?;
|
||||
if let Some(ref mut first) = first {
|
||||
for rest_constraint in rest {
|
||||
if let Some(rest_constraint) = rest_constraint {
|
||||
merge_constraints_or(first, rest_constraint, self.db);
|
||||
} else {
|
||||
return None;
|
||||
}
|
||||
}
|
||||
}
|
||||
first.clone()
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
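The `bool(E)` arm and the `and`/`or` handling above model ordinary Python truthiness. As a minimal, illustrative Python sketch (separate from this Rust code; the function name is hypothetical), `if bool(x):` narrows exactly the way `if x:` does:

```python
# Illustrative Python: the runtime semantics the narrowing logic models.
# `bool(E)` is truthy exactly when `E` is truthy, so a checker may reuse
# the narrowing constraints of `E` for `bool(E)`.
def describe(x):
    if isinstance(x, (int, float)):
        return "number"
    if bool(x):  # narrows the same way `if x:` would
        return "truthy"
    return "falsy"

assert describe(3) == "number"
assert describe("hi") == "truthy"
assert describe("") == "falsy"
```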
@@ -1,995 +0,0 @@
//! _Signatures_ describe the expected parameters and return type of a function or other callable.
//! Overloads and unions add complexity to this simple description.
//!
//! In a call expression, the type of the callable might be a union of several types. The call must
//! be compatible with _all_ of these types, since at runtime the callable might be an instance of
//! any of them.
//!
//! Each of the atomic types in the union must be callable. Each callable might be _overloaded_,
//! containing multiple _overload signatures_, each of which describes a different combination of
//! argument types and return types. For each callable type in the union, the call expression's
//! arguments must match _at least one_ overload.
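The module doc above can be illustrated in plain Python (an illustrative sketch; the function names are hypothetical and not part of this module): a call through a union-typed callable must be valid for every member of the union, and for an overloaded member the arguments must match at least one overload.

```python
# Illustrative Python: `f` may be either callable at runtime, so the call
# `f(1)` must be compatible with both; `parse` is overloaded, and `f(1)`
# matches its `int` overload.
from typing import overload

@overload
def parse(x: int) -> str: ...
@overload
def parse(x: bytes) -> str: ...
def parse(x) -> str:
    # Runtime implementation covering both overloads.
    return str(x)

def stringify(x: object) -> str:
    return str(x)

for f in (parse, stringify):
    assert f(1) == "1"
```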
use smallvec::{smallvec, SmallVec};

use super::{definition_expression_type, DynamicType, Type};
use crate::semantic_index::definition::Definition;
use crate::types::todo_type;
use crate::Db;
use ruff_python_ast::{self as ast, name::Name};

/// The signature of a possible union of callables.
#[derive(Clone, Debug, PartialEq, Eq, Hash, salsa::Update)]
pub(crate) struct Signatures<'db> {
    /// The type that is (hopefully) callable.
    pub(crate) callable_type: Type<'db>,
    /// The type we'll use for error messages referring to details of the called signature. For
    /// calls to functions this will be the same as `callable_type`; for other callable instances
    /// it may be a `__call__` method.
    pub(crate) signature_type: Type<'db>,
    /// By using `SmallVec`, we avoid an extra heap allocation for the common case of a non-union
    /// type.
    elements: SmallVec<[CallableSignature<'db>; 1]>,
}

impl<'db> Signatures<'db> {
    pub(crate) fn not_callable(signature_type: Type<'db>) -> Self {
        Self {
            callable_type: signature_type,
            signature_type,
            elements: smallvec![CallableSignature::not_callable(signature_type)],
        }
    }

    pub(crate) fn single(signature: CallableSignature<'db>) -> Self {
        Self {
            callable_type: signature.callable_type,
            signature_type: signature.signature_type,
            elements: smallvec![signature],
        }
    }

    /// Creates a new `Signatures` from an iterator of [`Signatures`]. Panics if the iterator is
    /// empty.
    pub(crate) fn from_union<I>(signature_type: Type<'db>, elements: I) -> Self
    where
        I: IntoIterator<Item = Signatures<'db>>,
    {
        let elements: SmallVec<_> = elements
            .into_iter()
            .flat_map(|s| s.elements.into_iter())
            .collect();
        assert!(!elements.is_empty());
        Self {
            callable_type: signature_type,
            signature_type,
            elements,
        }
    }

    pub(crate) fn replace_callable_type(&mut self, before: Type<'db>, after: Type<'db>) {
        if self.callable_type == before {
            self.callable_type = after;
        }
        for signature in &mut self.elements {
            signature.replace_callable_type(before, after);
        }
    }

    pub(crate) fn set_dunder_call_is_possibly_unbound(&mut self) {
        for signature in &mut self.elements {
            signature.dunder_call_is_possibly_unbound = true;
        }
    }
}

impl<'a, 'db> IntoIterator for &'a Signatures<'db> {
    type Item = &'a CallableSignature<'db>;
    type IntoIter = std::slice::Iter<'a, CallableSignature<'db>>;

    fn into_iter(self) -> Self::IntoIter {
        self.elements.iter()
    }
}

/// The signature of a single callable. If the callable is overloaded, there is a separate
/// [`Signature`] for each overload.
#[derive(Clone, Debug, PartialEq, Eq, Hash, salsa::Update)]
pub(crate) struct CallableSignature<'db> {
    /// The type that is (hopefully) callable.
    pub(crate) callable_type: Type<'db>,

    /// The type we'll use for error messages referring to details of the called signature. For
    /// calls to functions this will be the same as `callable_type`; for other callable instances
    /// it may be a `__call__` method.
    pub(crate) signature_type: Type<'db>,

    /// If this is a callable object (i.e. called via a `__call__` method), the boundness of
    /// that call method.
    pub(crate) dunder_call_is_possibly_unbound: bool,

    /// The type of the bound `self` or `cls` parameter if this signature is for a bound method.
    pub(crate) bound_type: Option<Type<'db>>,

    /// The signatures of each overload of this callable. Will be empty if the type is not
    /// callable.
    ///
    /// By using `SmallVec`, we avoid an extra heap allocation for the common case of a
    /// non-overloaded callable.
    overloads: SmallVec<[Signature<'db>; 1]>,
}

impl<'db> CallableSignature<'db> {
    pub(crate) fn not_callable(signature_type: Type<'db>) -> Self {
        Self {
            callable_type: signature_type,
            signature_type,
            dunder_call_is_possibly_unbound: false,
            bound_type: None,
            overloads: smallvec![],
        }
    }

    pub(crate) fn single(signature_type: Type<'db>, signature: Signature<'db>) -> Self {
        Self {
            callable_type: signature_type,
            signature_type,
            dunder_call_is_possibly_unbound: false,
            bound_type: None,
            overloads: smallvec![signature],
        }
    }

    /// Creates a new `CallableSignature` from an iterator of [`Signature`]s. Returns a
    /// non-callable signature if the iterator is empty.
    pub(crate) fn from_overloads<I>(signature_type: Type<'db>, overloads: I) -> Self
    where
        I: IntoIterator<Item = Signature<'db>>,
    {
        Self {
            callable_type: signature_type,
            signature_type,
            dunder_call_is_possibly_unbound: false,
            bound_type: None,
            overloads: overloads.into_iter().collect(),
        }
    }

    /// Return a signature for a dynamic callable.
    pub(crate) fn dynamic(signature_type: Type<'db>) -> Self {
        let signature = Signature {
            parameters: Parameters::gradual_form(),
            return_ty: Some(signature_type),
        };
        Self::single(signature_type, signature)
    }

    /// Return a todo signature: `(*args: Todo, **kwargs: Todo) -> Todo`
    #[allow(unused_variables)] // `reason` is only used in debug builds
    pub(crate) fn todo(reason: &'static str) -> Self {
        let signature_type = todo_type!(reason);
        let signature = Signature {
            parameters: Parameters::todo(),
            return_ty: Some(signature_type),
        };
        Self::single(signature_type, signature)
    }

    pub(crate) fn with_bound_type(mut self, bound_type: Type<'db>) -> Self {
        self.bound_type = Some(bound_type);
        self
    }

    fn replace_callable_type(&mut self, before: Type<'db>, after: Type<'db>) {
        if self.callable_type == before {
            self.callable_type = after;
        }
    }
}

impl<'a, 'db> IntoIterator for &'a CallableSignature<'db> {
    type Item = &'a Signature<'db>;
    type IntoIter = std::slice::Iter<'a, Signature<'db>>;

    fn into_iter(self) -> Self::IntoIter {
        self.overloads.iter()
    }
}

/// The signature of one of the overloads of a callable.
#[derive(Clone, Debug, PartialEq, Eq, Hash, salsa::Update)]
pub struct Signature<'db> {
    /// Parameters, in source order.
    ///
    /// The ordering of parameters in a valid signature must be: first positional-only parameters,
    /// then positional-or-keyword, then optionally the variadic parameter, then keyword-only
    /// parameters, and last, optionally the variadic keywords parameter. Parameters with defaults
    /// must come after parameters without defaults.
    ///
    /// We may get invalid signatures, though, and need to handle them without panicking.
    parameters: Parameters<'db>,

    /// Annotated return type, if any.
    pub(crate) return_ty: Option<Type<'db>>,
}
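The parameter ordering documented on `parameters` above is the same order Python itself enforces; as an illustrative check (separate from this Rust code, using the standard `inspect` module):

```python
# Illustrative Python: a signature's parameters in source order are
# positional-only, then positional-or-keyword, then *args, then
# keyword-only, then **kwargs.
import inspect

def f(a, /, b, *args, c=0, **kwargs):
    return (a, b, args, c, kwargs)

kinds = [p.kind.name for p in inspect.signature(f).parameters.values()]
assert kinds == [
    "POSITIONAL_ONLY",
    "POSITIONAL_OR_KEYWORD",
    "VAR_POSITIONAL",
    "KEYWORD_ONLY",
    "VAR_KEYWORD",
]
```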
impl<'db> Signature<'db> {
    pub(crate) fn new(parameters: Parameters<'db>, return_ty: Option<Type<'db>>) -> Self {
        Self {
            parameters,
            return_ty,
        }
    }

    /// Return a todo signature: `(*args: Todo, **kwargs: Todo) -> Todo`
    #[allow(unused_variables)] // `reason` is only used in debug builds
    pub(crate) fn todo(reason: &'static str) -> Self {
        Signature {
            parameters: Parameters::todo(),
            return_ty: Some(todo_type!(reason)),
        }
    }

    /// Return a typed signature from a function definition.
    pub(super) fn from_function(
        db: &'db dyn Db,
        definition: Definition<'db>,
        function_node: &ast::StmtFunctionDef,
    ) -> Self {
        let return_ty = function_node.returns.as_ref().map(|returns| {
            if function_node.is_async {
                todo_type!("generic types.CoroutineType")
            } else {
                definition_expression_type(db, definition, returns.as_ref())
            }
        });

        Self {
            parameters: Parameters::from_parameters(
                db,
                definition,
                function_node.parameters.as_ref(),
            ),
            return_ty,
        }
    }

    /// Return the parameters in this signature.
    pub(crate) fn parameters(&self) -> &Parameters<'db> {
        &self.parameters
    }
}

#[derive(Clone, Debug, PartialEq, Eq, Hash, salsa::Update)]
pub(crate) struct Parameters<'db> {
    // TODO: use SmallVec here once the invariance bug is fixed
    value: Vec<Parameter<'db>>,

    /// Whether this parameter list represents a gradual form using `...` as the only parameter.
    ///
    /// If this is `true`, the `value` will still contain the variadic and keyword-variadic
    /// parameters. This flag is used to distinguish between an explicit `...` in a callable type,
    /// as in `Callable[..., int]`, and the variadic arguments of a `lambda` expression, as in
    /// `lambda *args, **kwargs: None`.
    ///
    /// The display implementation uses this flag to show `...` instead of the individual variadic
    /// and keyword-variadic parameters.
    ///
    /// Note: this flag is also used to indicate invalid forms of `Callable` annotations.
    is_gradual: bool,
}
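The distinction drawn by `is_gradual` above can be seen at the Python level (an illustrative sketch, separate from this Rust code): `Callable[..., int]` carries a literal `...` in place of a parameter list, while a lambda spells out its variadic parameters explicitly, even though both accept arbitrary arguments.

```python
# Illustrative Python: the gradual `...` form versus explicit variadics.
from typing import Callable, get_args

C = Callable[..., int]
args = get_args(C)
assert args[0] is Ellipsis  # the parameter list is the gradual `...`
assert args[-1] is int      # the return type

accept_anything = lambda *args, **kwargs: 0  # explicit variadics, not gradual
assert accept_anything(1, two=2) == 0
```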
impl<'db> Parameters<'db> {
    pub(crate) fn new(parameters: impl IntoIterator<Item = Parameter<'db>>) -> Self {
        Self {
            value: parameters.into_iter().collect(),
            is_gradual: false,
        }
    }

    /// Create an empty parameter list.
    pub(crate) fn empty() -> Self {
        Self {
            value: Vec::new(),
            is_gradual: false,
        }
    }

    pub(crate) fn as_slice(&self) -> &[Parameter<'db>] {
        self.value.as_slice()
    }

    pub(crate) const fn is_gradual(&self) -> bool {
        self.is_gradual
    }

    /// Return todo parameters: `(*args: Todo, **kwargs: Todo)`
    pub(crate) fn todo() -> Self {
        Self {
            value: vec![
                Parameter {
                    annotated_ty: Some(todo_type!("todo signature *args")),
                    kind: ParameterKind::Variadic {
                        name: Name::new_static("args"),
                    },
                },
                Parameter {
                    annotated_ty: Some(todo_type!("todo signature **kwargs")),
                    kind: ParameterKind::KeywordVariadic {
                        name: Name::new_static("kwargs"),
                    },
                },
            ],
            is_gradual: false,
        }
    }

    /// Return parameters that represent a gradual form using `...` as the only parameter.
    ///
    /// Internally, this is represented as `(*Any, **Any)`, which accepts parameters of type
    /// [`Any`].
    ///
    /// [`Any`]: crate::types::DynamicType::Any
    pub(crate) fn gradual_form() -> Self {
        Self {
            value: vec![
                Parameter {
                    annotated_ty: Some(Type::Dynamic(DynamicType::Any)),
                    kind: ParameterKind::Variadic {
                        name: Name::new_static("args"),
                    },
                },
                Parameter {
                    annotated_ty: Some(Type::Dynamic(DynamicType::Any)),
                    kind: ParameterKind::KeywordVariadic {
                        name: Name::new_static("kwargs"),
                    },
                },
            ],
            is_gradual: true,
        }
    }

    /// Return parameters that represent an unknown list of parameters.
    ///
    /// Internally, this is represented as `(*Unknown, **Unknown)`, which accepts parameters of
    /// type [`Unknown`].
    ///
    /// [`Unknown`]: crate::types::DynamicType::Unknown
    pub(crate) fn unknown() -> Self {
        Self {
            value: vec![
                Parameter {
                    annotated_ty: Some(Type::Dynamic(DynamicType::Unknown)),
                    kind: ParameterKind::Variadic {
                        name: Name::new_static("args"),
                    },
                },
                Parameter {
                    annotated_ty: Some(Type::Dynamic(DynamicType::Unknown)),
                    kind: ParameterKind::KeywordVariadic {
                        name: Name::new_static("kwargs"),
                    },
                },
            ],
            is_gradual: true,
        }
    }

    fn from_parameters(
        db: &'db dyn Db,
        definition: Definition<'db>,
        parameters: &ast::Parameters,
    ) -> Self {
        let ast::Parameters {
            posonlyargs,
            args,
            vararg,
            kwonlyargs,
            kwarg,
            range: _,
        } = parameters;
        let default_ty = |param: &ast::ParameterWithDefault| {
            param
                .default()
                .map(|default| definition_expression_type(db, definition, default))
        };
        let positional_only = posonlyargs.iter().map(|arg| {
            Parameter::from_node_and_kind(
                db,
                definition,
                &arg.parameter,
                ParameterKind::PositionalOnly {
                    name: Some(arg.parameter.name.id.clone()),
                    default_ty: default_ty(arg),
                },
            )
        });
        let positional_or_keyword = args.iter().map(|arg| {
            Parameter::from_node_and_kind(
                db,
                definition,
                &arg.parameter,
                ParameterKind::PositionalOrKeyword {
                    name: arg.parameter.name.id.clone(),
                    default_ty: default_ty(arg),
                },
            )
        });
        let variadic = vararg.as_ref().map(|arg| {
            Parameter::from_node_and_kind(
                db,
                definition,
                arg,
                ParameterKind::Variadic {
                    name: arg.name.id.clone(),
                },
            )
        });
        let keyword_only = kwonlyargs.iter().map(|arg| {
            Parameter::from_node_and_kind(
                db,
                definition,
                &arg.parameter,
                ParameterKind::KeywordOnly {
                    name: arg.parameter.name.id.clone(),
                    default_ty: default_ty(arg),
                },
            )
        });
        let keywords = kwarg.as_ref().map(|arg| {
            Parameter::from_node_and_kind(
                db,
                definition,
                arg,
                ParameterKind::KeywordVariadic {
                    name: arg.name.id.clone(),
                },
            )
        });
        Self::new(
            positional_only
                .chain(positional_or_keyword)
                .chain(variadic)
                .chain(keyword_only)
                .chain(keywords),
        )
    }

    pub(crate) fn len(&self) -> usize {
        self.value.len()
    }

    pub(crate) fn iter(&self) -> std::slice::Iter<Parameter<'db>> {
        self.value.iter()
    }

    /// Iterate over the initial positional parameters, not including the variadic parameter, if
    /// any.
    ///
    /// For a valid signature, this will be all positional parameters. In an invalid signature,
    /// there could be non-initial positional parameters; effectively, we just won't consider those
    /// to be positional, which is fine.
    pub(crate) fn positional(&self) -> impl Iterator<Item = &Parameter<'db>> {
        self.iter().take_while(|param| param.is_positional())
    }

    /// Return the parameter at the given index, or `None` if the index is out of range.
    pub(crate) fn get(&self, index: usize) -> Option<&Parameter<'db>> {
        self.value.get(index)
    }

    /// Return the positional parameter at the given index, or `None` if `index` is out of range.
    ///
    /// Does not return the variadic parameter.
    pub(crate) fn get_positional(&self, index: usize) -> Option<&Parameter<'db>> {
        self.get(index)
            .and_then(|parameter| parameter.is_positional().then_some(parameter))
    }

    /// Return the variadic parameter (`*args`), if any, and its index, or `None`.
    pub(crate) fn variadic(&self) -> Option<(usize, &Parameter<'db>)> {
        self.iter()
            .enumerate()
            .find(|(_, parameter)| parameter.is_variadic())
    }

    /// Return the parameter (with its index) for the given name, or `None` if there is no such
    /// parameter.
    ///
    /// Does not return the keywords (`**kwargs`) parameter.
    ///
    /// In an invalid signature, there could be multiple parameters with the same name; we will
    /// just return the first that matches.
    pub(crate) fn keyword_by_name(&self, name: &str) -> Option<(usize, &Parameter<'db>)> {
        self.iter()
            .enumerate()
            .find(|(_, parameter)| parameter.callable_by_name(name))
    }

    /// Return the keywords parameter (`**kwargs`), if any, and its index, or `None`.
    pub(crate) fn keyword_variadic(&self) -> Option<(usize, &Parameter<'db>)> {
        self.iter()
            .enumerate()
            .rfind(|(_, parameter)| parameter.is_keyword_variadic())
    }
}

impl<'db, 'a> IntoIterator for &'a Parameters<'db> {
    type Item = &'a Parameter<'db>;
    type IntoIter = std::slice::Iter<'a, Parameter<'db>>;

    fn into_iter(self) -> Self::IntoIter {
        self.value.iter()
    }
}

impl<'db> std::ops::Index<usize> for Parameters<'db> {
    type Output = Parameter<'db>;

    fn index(&self, index: usize) -> &Self::Output {
        &self.value[index]
    }
}
#[derive(Clone, Debug, PartialEq, Eq, Hash, salsa::Update)]
pub(crate) struct Parameter<'db> {
    /// Annotated type of the parameter.
    annotated_ty: Option<Type<'db>>,

    kind: ParameterKind<'db>,
}

impl<'db> Parameter<'db> {
    pub(crate) fn new(annotated_ty: Option<Type<'db>>, kind: ParameterKind<'db>) -> Self {
        Self { annotated_ty, kind }
    }

    fn from_node_and_kind(
        db: &'db dyn Db,
        definition: Definition<'db>,
        parameter: &ast::Parameter,
        kind: ParameterKind<'db>,
    ) -> Self {
        Self {
            annotated_ty: parameter
                .annotation()
                .map(|annotation| definition_expression_type(db, definition, annotation)),
            kind,
        }
    }

    /// Returns `true` if this is a keyword-only parameter.
    pub(crate) fn is_keyword_only(&self) -> bool {
        matches!(self.kind, ParameterKind::KeywordOnly { .. })
    }

    /// Returns `true` if this is a positional-only parameter.
    pub(crate) fn is_positional_only(&self) -> bool {
        matches!(self.kind, ParameterKind::PositionalOnly { .. })
    }

    /// Returns `true` if this is a variadic parameter.
    pub(crate) fn is_variadic(&self) -> bool {
        matches!(self.kind, ParameterKind::Variadic { .. })
    }

    /// Returns `true` if this is a keyword-variadic parameter.
    pub(crate) fn is_keyword_variadic(&self) -> bool {
        matches!(self.kind, ParameterKind::KeywordVariadic { .. })
    }

    /// Returns `true` if this is either a positional-only or standard (positional or keyword)
    /// parameter.
    pub(crate) fn is_positional(&self) -> bool {
        matches!(
            self.kind,
            ParameterKind::PositionalOnly { .. } | ParameterKind::PositionalOrKeyword { .. }
        )
    }

    pub(crate) fn callable_by_name(&self, name: &str) -> bool {
        match &self.kind {
            ParameterKind::PositionalOrKeyword {
                name: param_name, ..
            }
            | ParameterKind::KeywordOnly {
                name: param_name, ..
            } => param_name == name,
            _ => false,
        }
    }

    /// Annotated type of the parameter, if annotated.
    pub(crate) fn annotated_type(&self) -> Option<Type<'db>> {
        self.annotated_ty
    }

    /// Kind of the parameter.
    pub(crate) fn kind(&self) -> &ParameterKind<'db> {
        &self.kind
    }

    /// Name of the parameter, if it has one.
    pub(crate) fn name(&self) -> Option<&ast::name::Name> {
        match &self.kind {
            ParameterKind::PositionalOnly { name, .. } => name.as_ref(),
            ParameterKind::PositionalOrKeyword { name, .. } => Some(name),
            ParameterKind::Variadic { name } => Some(name),
            ParameterKind::KeywordOnly { name, .. } => Some(name),
            ParameterKind::KeywordVariadic { name } => Some(name),
        }
    }

    /// Display name of the parameter, if it has one.
    pub(crate) fn display_name(&self) -> Option<ast::name::Name> {
        self.name().map(|name| match self.kind {
            ParameterKind::Variadic { .. } => ast::name::Name::new(format!("*{name}")),
            ParameterKind::KeywordVariadic { .. } => ast::name::Name::new(format!("**{name}")),
            _ => name.clone(),
        })
    }

    /// Default-value type of the parameter, if any.
    pub(crate) fn default_type(&self) -> Option<Type<'db>> {
        match self.kind {
            ParameterKind::PositionalOnly { default_ty, .. } => default_ty,
            ParameterKind::PositionalOrKeyword { default_ty, .. } => default_ty,
            ParameterKind::Variadic { .. } => None,
            ParameterKind::KeywordOnly { default_ty, .. } => default_ty,
            ParameterKind::KeywordVariadic { .. } => None,
        }
    }
}

#[derive(Clone, Debug, PartialEq, Eq, Hash, salsa::Update)]
pub(crate) enum ParameterKind<'db> {
    /// Positional-only parameter, e.g. `def f(x, /): ...`
    PositionalOnly {
        /// Parameter name.
        ///
        /// It is possible for signatures to be defined in ways that leave positional-only
        /// parameters nameless (e.g. via `Callable` annotations).
        name: Option<Name>,
        default_ty: Option<Type<'db>>,
    },

    /// Positional-or-keyword parameter, e.g. `def f(x): ...`
    PositionalOrKeyword {
        /// Parameter name.
        name: Name,
        default_ty: Option<Type<'db>>,
    },

    /// Variadic parameter, e.g. `def f(*args): ...`
    Variadic {
        /// Parameter name.
        name: Name,
    },

    /// Keyword-only parameter, e.g. `def f(*, x): ...`
    KeywordOnly {
        /// Parameter name.
        name: Name,
        default_ty: Option<Type<'db>>,
    },

    /// Variadic keywords parameter, e.g. `def f(**kwargs): ...`
    KeywordVariadic {
        /// Parameter name.
        name: Name,
    },
}
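The nameless positional-only case noted on `PositionalOnly.name` comes from `Callable` annotations, which give only parameter types, never names (an illustrative Python sketch, separate from this Rust code):

```python
# Illustrative Python: a `Callable` annotation carries a list of parameter
# types with no parameter names, so its parameters are nameless and
# positional-only.
from typing import Callable, get_args

C = Callable[[int, str], bool]
arg_types, ret = get_args(C)
assert arg_types == [int, str]  # types only; no names to recover
assert ret is bool
```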
#[cfg(test)]
mod tests {
    use super::*;
    use crate::db::tests::{setup_db, TestDb};
    use crate::symbol::global_symbol;
    use crate::types::{FunctionType, KnownClass};
    use ruff_db::system::DbWithWritableSystem as _;

    #[track_caller]
    fn get_function_f<'db>(db: &'db TestDb, file: &'static str) -> FunctionType<'db> {
        let module = ruff_db::files::system_path_to_file(db, file).unwrap();
        global_symbol(db, module, "f")
            .symbol
            .expect_type()
            .expect_function_literal()
    }

    #[track_caller]
    fn assert_params<'db>(signature: &Signature<'db>, expected: &[Parameter<'db>]) {
        assert_eq!(signature.parameters.value.as_slice(), expected);
    }

    #[test]
    fn empty() {
        let mut db = setup_db();
        db.write_dedented("/src/a.py", "def f(): ...").unwrap();
        let func = get_function_f(&db, "/src/a.py");

        let sig = func.internal_signature(&db);

        assert!(sig.return_ty.is_none());
        assert_params(&sig, &[]);
    }

    #[test]
    #[allow(clippy::many_single_char_names)]
    fn full() {
        let mut db = setup_db();
        db.write_dedented(
            "/src/a.py",
            "
            from typing import Literal

            def f(a, b: int, c = 1, d: int = 2, /,
                  e = 3, f: Literal[4] = 4, *args: object,
                  g = 5, h: Literal[6] = 6, **kwargs: str) -> bytes: ...
            ",
        )
        .unwrap();
        let func = get_function_f(&db, "/src/a.py");

        let sig = func.internal_signature(&db);

        assert_eq!(sig.return_ty.unwrap().display(&db).to_string(), "bytes");
        assert_params(
            &sig,
            &[
                Parameter {
                    annotated_ty: None,
                    kind: ParameterKind::PositionalOnly {
                        name: Some(Name::new_static("a")),
                        default_ty: None,
                    },
                },
                Parameter {
                    annotated_ty: Some(KnownClass::Int.to_instance(&db)),
                    kind: ParameterKind::PositionalOnly {
                        name: Some(Name::new_static("b")),
                        default_ty: None,
                    },
                },
                Parameter {
                    annotated_ty: None,
                    kind: ParameterKind::PositionalOnly {
                        name: Some(Name::new_static("c")),
                        default_ty: Some(Type::IntLiteral(1)),
                    },
                },
                Parameter {
                    annotated_ty: Some(KnownClass::Int.to_instance(&db)),
                    kind: ParameterKind::PositionalOnly {
                        name: Some(Name::new_static("d")),
                        default_ty: Some(Type::IntLiteral(2)),
                    },
                },
                Parameter {
                    annotated_ty: None,
                    kind: ParameterKind::PositionalOrKeyword {
                        name: Name::new_static("e"),
                        default_ty: Some(Type::IntLiteral(3)),
                    },
                },
                Parameter {
                    annotated_ty: Some(Type::IntLiteral(4)),
                    kind: ParameterKind::PositionalOrKeyword {
                        name: Name::new_static("f"),
                        default_ty: Some(Type::IntLiteral(4)),
                    },
                },
                Parameter {
                    annotated_ty: Some(Type::object(&db)),
                    kind: ParameterKind::Variadic {
                        name: Name::new_static("args"),
                    },
                },
                Parameter {
                    annotated_ty: None,
                    kind: ParameterKind::KeywordOnly {
                        name: Name::new_static("g"),
                        default_ty: Some(Type::IntLiteral(5)),
                    },
                },
                Parameter {
                    annotated_ty: Some(Type::IntLiteral(6)),
                    kind: ParameterKind::KeywordOnly {
                        name: Name::new_static("h"),
                        default_ty: Some(Type::IntLiteral(6)),
                    },
                },
                Parameter {
                    annotated_ty: Some(KnownClass::Str.to_instance(&db)),
                    kind: ParameterKind::KeywordVariadic {
                        name: Name::new_static("kwargs"),
                    },
                },
            ],
        );
    }

    #[test]
    fn not_deferred() {
        let mut db = setup_db();
        db.write_dedented(
            "/src/a.py",
            "
            class A: ...
            class B: ...

            alias = A

            def f(a: alias): ...

            alias = B
            ",
        )
        .unwrap();
        let func = get_function_f(&db, "/src/a.py");

        let sig = func.internal_signature(&db);

        let [Parameter {
            annotated_ty,
            kind: ParameterKind::PositionalOrKeyword { name, .. },
        }] = &sig.parameters.value[..]
        else {
            panic!("expected one positional-or-keyword parameter");
        };
        assert_eq!(name, "a");
        // Parameter resolution is not deferred; we should see `A`, not `B`.
        assert_eq!(annotated_ty.unwrap().display(&db).to_string(), "A");
    }

    #[test]
    fn deferred_in_stub() {
        let mut db = setup_db();
        db.write_dedented(
            "/src/a.pyi",
            "
            class A: ...
            class B: ...

            alias = A

            def f(a: alias): ...

            alias = B
            ",
        )
        .unwrap();
        let func = get_function_f(&db, "/src/a.pyi");

        let sig = func.internal_signature(&db);

        let [Parameter {
            annotated_ty,
            kind: ParameterKind::PositionalOrKeyword { name, .. },
        }] = &sig.parameters.value[..]
        else {
            panic!("expected one positional-or-keyword parameter");
        };
        assert_eq!(name, "a");
        // Parameter resolution is deferred; we should see `B`.
        assert_eq!(annotated_ty.unwrap().display(&db).to_string(), "B");
    }

    #[test]
    fn generic_not_deferred() {
        let mut db = setup_db();
        db.write_dedented(
            "/src/a.py",
            "
            class A: ...
            class B: ...

            alias = A

            def f[T](a: alias, b: T) -> T: ...

            alias = B
            ",
        )
        .unwrap();
        let func = get_function_f(&db, "/src/a.py");

        let sig = func.internal_signature(&db);

        let [Parameter {
            annotated_ty: a_annotated_ty,
            kind: ParameterKind::PositionalOrKeyword { name: a_name, .. },
        }, Parameter {
            annotated_ty: b_annotated_ty,
            kind: ParameterKind::PositionalOrKeyword { name: b_name, .. },
        }] = &sig.parameters.value[..]
        else {
            panic!("expected two positional-or-keyword parameters");
        };
        assert_eq!(a_name, "a");
        assert_eq!(b_name, "b");
        // TODO: resolution should not be deferred; we should see `A`, not `B`
        assert_eq!(
            a_annotated_ty.unwrap().display(&db).to_string(),
            "Unknown | B"
        );
        assert_eq!(b_annotated_ty.unwrap().display(&db).to_string(), "T");
    }

    #[test]
    fn generic_deferred_in_stub() {
        let mut db = setup_db();
        db.write_dedented(
            "/src/a.pyi",
            "
            class A: ...
            class B: ...

            alias = A

            def f[T](a: alias, b: T) -> T: ...

            alias = B
            ",
        )
        .unwrap();
        let func = get_function_f(&db, "/src/a.pyi");

        let sig = func.internal_signature(&db);

        let [Parameter {
            annotated_ty: a_annotated_ty,
            kind: ParameterKind::PositionalOrKeyword { name: a_name, .. },
        }, Parameter {
            annotated_ty: b_annotated_ty,
            kind: ParameterKind::PositionalOrKeyword { name: b_name, .. },
        }] = &sig.parameters.value[..]
        else {
            panic!("expected two positional-or-keyword parameters");
        };
        assert_eq!(a_name, "a");
        assert_eq!(b_name, "b");
        // Parameter resolution is deferred; we should see `B`.
        assert_eq!(
            a_annotated_ty.unwrap().display(&db).to_string(),
            "Unknown | B"
        );
        assert_eq!(b_annotated_ty.unwrap().display(&db).to_string(), "T");
    }

    #[test]
    fn external_signature_no_decorator() {
        let mut db = setup_db();
        db.write_dedented(
            "/src/a.py",
            "
            def f(a: int) -> int: ...
            ",
        )
        .unwrap();
|
||||
let func = get_function_f(&db, "/src/a.py");
|
||||
|
||||
let expected_sig = func.internal_signature(&db);
|
||||
|
||||
// With no decorators, internal and external signature are the same
|
||||
assert_eq!(func.signature(&db), &expected_sig);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn external_signature_decorated() {
|
||||
let mut db = setup_db();
|
||||
db.write_dedented(
|
||||
"/src/a.py",
|
||||
"
|
||||
def deco(func): ...
|
||||
|
||||
@deco
|
||||
def f(a: int) -> int: ...
|
||||
",
|
||||
)
|
||||
.unwrap();
|
||||
let func = get_function_f(&db, "/src/a.py");
|
||||
|
||||
let expected_sig = Signature::todo("return type of decorated function");
|
||||
|
||||
    // The decorator replaces the function, so the external signature is the
    // (currently todo) return type of the decorator call, not the internal signature.
    assert_eq!(func.signature(&db), &expected_sig);
}
}
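The deferred-in-stub behavior exercised by the tests above can be illustrated in plain Python (this is not the checker's implementation, just the runtime analogue): with `from __future__ import annotations`, annotations are stringized and only resolved on request, after the whole module has executed, so the later reassignment of `alias` wins, as in a `.pyi` stub. The names `A`, `B`, `alias`, and `f` mirror the test fixture.

```python
from __future__ import annotations

import typing

class A: ...
class B: ...

alias = A

def f(a: alias): ...

alias = B

# Deferred resolution: the annotation string "alias" is evaluated only
# when requested, so it sees the final binding, B.
print(typing.get_type_hints(f)["a"] is B)
```

Without the future import (on Python up to 3.13), annotations are evaluated eagerly at `def` time and `f` would capture `A` instead, matching the non-deferred `.py` tests.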
@@ -1,102 +0,0 @@
use crate::symbol::SymbolAndQualifiers;

use super::{ClassBase, ClassLiteralType, Db, KnownClass, Type};

/// A type that represents `type[C]`, i.e. the class object `C` and class objects that are subclasses of `C`.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, salsa::Update)]
pub struct SubclassOfType<'db> {
    // Keep this field private, so that the only way of constructing the struct is through the `from` method.
    subclass_of: ClassBase<'db>,
}

impl<'db> SubclassOfType<'db> {
    /// Construct a new [`Type`] instance representing a given class object (or a given dynamic type)
    /// and all possible subclasses of that class object/dynamic type.
    ///
    /// This method does not always return a [`Type::SubclassOf`] variant.
    /// If the class object is known to be a final class,
    /// this method will return a [`Type::ClassLiteral`] variant; this is a more precise type.
    /// If the class object is `builtins.object`, `Type::Instance(<builtins.type>)` will be returned;
    /// this is no more precise, but it is exactly equivalent to `type[object]`.
    ///
    /// The eager normalization here means that we do not need to worry elsewhere about distinguishing
    /// between `@final` classes and other classes when dealing with [`Type::SubclassOf`] variants.
    pub(crate) fn from(db: &'db dyn Db, subclass_of: impl Into<ClassBase<'db>>) -> Type<'db> {
        let subclass_of = subclass_of.into();
        match subclass_of {
            ClassBase::Dynamic(_) => Type::SubclassOf(Self { subclass_of }),
            ClassBase::Class(class) => {
                if class.is_final(db) {
                    Type::ClassLiteral(ClassLiteralType { class })
                } else if class.is_object(db) {
                    KnownClass::Type.to_instance(db)
                } else {
                    Type::SubclassOf(Self { subclass_of })
                }
            }
        }
    }

    /// Return a [`Type`] instance representing the type `type[Unknown]`.
    pub(crate) const fn subclass_of_unknown() -> Type<'db> {
        Type::SubclassOf(SubclassOfType {
            subclass_of: ClassBase::unknown(),
        })
    }

    /// Return a [`Type`] instance representing the type `type[Any]`.
    pub(crate) const fn subclass_of_any() -> Type<'db> {
        Type::SubclassOf(SubclassOfType {
            subclass_of: ClassBase::any(),
        })
    }

    /// Return the inner [`ClassBase`] value wrapped by this `SubclassOfType`.
    pub(crate) const fn subclass_of(self) -> ClassBase<'db> {
        self.subclass_of
    }

    pub const fn is_dynamic(self) -> bool {
        // Unpack `self` so that we're forced to update this method if any more fields are added in the future.
        let Self { subclass_of } = self;
        subclass_of.is_dynamic()
    }

    pub const fn is_fully_static(self) -> bool {
        !self.is_dynamic()
    }

    pub(crate) fn find_name_in_mro(
        self,
        db: &'db dyn Db,
        name: &str,
    ) -> Option<SymbolAndQualifiers<'db>> {
        Type::from(self.subclass_of).find_name_in_mro(db, name)
    }

    /// Return `true` if `self` is a subtype of `other`.
    ///
    /// This can only return `true` if `self.subclass_of` is a [`ClassBase::Class`] variant;
    /// only fully static types participate in subtyping.
    pub(crate) fn is_subtype_of(self, db: &'db dyn Db, other: SubclassOfType<'db>) -> bool {
        match (self.subclass_of, other.subclass_of) {
            // Non-fully-static types do not participate in subtyping
            (ClassBase::Dynamic(_), _) | (_, ClassBase::Dynamic(_)) => false,

            // For example, `type[bool]` describes all possible runtime subclasses of the class `bool`,
            // and `type[int]` describes all possible runtime subclasses of the class `int`.
            // The first set is a subset of the second set, because `bool` is itself a subclass of `int`.
            (ClassBase::Class(self_class), ClassBase::Class(other_class)) => {
                // N.B. The subclass relation is fully static
                self_class.is_subclass_of(db, other_class)
            }
        }
    }

    pub(crate) fn to_instance(self) -> Type<'db> {
        match self.subclass_of {
            ClassBase::Class(class) => Type::instance(class),
            ClassBase::Dynamic(dynamic_type) => Type::Dynamic(dynamic_type),
        }
    }
}
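The subtyping rule in `is_subtype_of` above mirrors the runtime subclass relation, which can be checked directly in Python: every class object matching `type[bool]` (i.e. `bool` and its subclasses) also matches `type[int]`, because `bool` is a subclass of `int`, but not vice versa.

```python
# type[bool] <: type[int] holds because bool is a runtime subclass of int.
print(issubclass(bool, int))  # True
print(issubclass(int, bool))  # False
```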
@@ -1,215 +0,0 @@
use super::notebook;
use super::PositionEncoding;
use lsp_types as types;
use ruff_notebook::NotebookIndex;
use ruff_source_file::OneIndexed;
use ruff_source_file::{LineIndex, SourceLocation};
use ruff_text_size::{TextRange, TextSize};

pub(crate) struct NotebookRange {
    pub(crate) cell: notebook::CellId,
    pub(crate) range: types::Range,
}

pub(crate) trait RangeExt {
    fn to_text_range(&self, text: &str, index: &LineIndex, encoding: PositionEncoding)
        -> TextRange;
}

pub(crate) trait ToRangeExt {
    fn to_range(&self, text: &str, index: &LineIndex, encoding: PositionEncoding) -> types::Range;
    fn to_notebook_range(
        &self,
        text: &str,
        source_index: &LineIndex,
        notebook_index: &NotebookIndex,
        encoding: PositionEncoding,
    ) -> NotebookRange;
}

fn u32_index_to_usize(index: u32) -> usize {
    usize::try_from(index).expect("u32 fits in usize")
}

impl RangeExt for lsp_types::Range {
    fn to_text_range(
        &self,
        text: &str,
        index: &LineIndex,
        encoding: PositionEncoding,
    ) -> TextRange {
        let start_line = index.line_range(
            OneIndexed::from_zero_indexed(u32_index_to_usize(self.start.line)),
            text,
        );
        let end_line = index.line_range(
            OneIndexed::from_zero_indexed(u32_index_to_usize(self.end.line)),
            text,
        );

        let (start_column_offset, end_column_offset) = match encoding {
            PositionEncoding::UTF8 => (
                TextSize::new(self.start.character),
                TextSize::new(self.end.character),
            ),

            PositionEncoding::UTF16 => {
                // Fast path for ASCII only documents
                if index.is_ascii() {
                    (
                        TextSize::new(self.start.character),
                        TextSize::new(self.end.character),
                    )
                } else {
                    // UTF-16 encodes characters as either one or two 16-bit words.
                    // The position in `range` is the 16-bit word offset from the start of the
                    // line (and not the character offset), so it must be converted to a UTF-8
                    // offset for text that may use variable-length characters.
                    (
                        utf8_column_offset(self.start.character, &text[start_line]),
                        utf8_column_offset(self.end.character, &text[end_line]),
                    )
                }
            }
            PositionEncoding::UTF32 => {
                // UTF-32 uses 4 bytes for each character. Meaning, the position in range is a character offset.
                return TextRange::new(
                    index.offset(
                        OneIndexed::from_zero_indexed(u32_index_to_usize(self.start.line)),
                        OneIndexed::from_zero_indexed(u32_index_to_usize(self.start.character)),
                        text,
                    ),
                    index.offset(
                        OneIndexed::from_zero_indexed(u32_index_to_usize(self.end.line)),
                        OneIndexed::from_zero_indexed(u32_index_to_usize(self.end.character)),
                        text,
                    ),
                );
            }
        };

        TextRange::new(
            start_line.start() + start_column_offset.clamp(TextSize::new(0), start_line.end()),
            end_line.start() + end_column_offset.clamp(TextSize::new(0), end_line.end()),
        )
    }
}

impl ToRangeExt for TextRange {
    fn to_range(&self, text: &str, index: &LineIndex, encoding: PositionEncoding) -> types::Range {
        types::Range {
            start: source_location_to_position(&offset_to_source_location(
                self.start(),
                text,
                index,
                encoding,
            )),
            end: source_location_to_position(&offset_to_source_location(
                self.end(),
                text,
                index,
                encoding,
            )),
        }
    }

    fn to_notebook_range(
        &self,
        text: &str,
        source_index: &LineIndex,
        notebook_index: &NotebookIndex,
        encoding: PositionEncoding,
    ) -> NotebookRange {
        let start = offset_to_source_location(self.start(), text, source_index, encoding);
        let mut end = offset_to_source_location(self.end(), text, source_index, encoding);
        let starting_cell = notebook_index.cell(start.row);

        // weird edge case here - if the end of the range is where the newline after the cell got added (making it 'out of bounds')
        // we need to move it one character back (which should place it at the end of the last line).
        // we test this by checking if the ending offset is in a different (or nonexistent) cell compared to the cell of the starting offset.
        if notebook_index.cell(end.row) != starting_cell {
            end.row = end.row.saturating_sub(1);
            end.column = offset_to_source_location(
                self.end().checked_sub(1.into()).unwrap_or_default(),
                text,
                source_index,
                encoding,
            )
            .column;
        }

        let start = source_location_to_position(&notebook_index.translate_location(&start));
        let end = source_location_to_position(&notebook_index.translate_location(&end));

        NotebookRange {
            cell: starting_cell
                .map(OneIndexed::to_zero_indexed)
                .unwrap_or_default(),
            range: types::Range { start, end },
        }
    }
}

/// Converts a UTF-16 code unit offset for a given line into a UTF-8 column number.
fn utf8_column_offset(utf16_code_unit_offset: u32, line: &str) -> TextSize {
    let mut utf8_code_unit_offset = TextSize::new(0);

    let mut i = 0u32;

    for c in line.chars() {
        if i >= utf16_code_unit_offset {
            break;
        }

        // Count characters encoded as two 16 bit words as 2 characters.
        {
            utf8_code_unit_offset +=
                TextSize::new(u32::try_from(c.len_utf8()).expect("utf8 len always <=4"));
            i += u32::try_from(c.len_utf16()).expect("utf16 len always <=2");
        }
    }

    utf8_code_unit_offset
}

fn offset_to_source_location(
    offset: TextSize,
    text: &str,
    index: &LineIndex,
    encoding: PositionEncoding,
) -> SourceLocation {
    match encoding {
        PositionEncoding::UTF8 => {
            let row = index.line_index(offset);
            let column = offset - index.line_start(row, text);

            SourceLocation {
                column: OneIndexed::from_zero_indexed(column.to_usize()),
                row,
            }
        }
        PositionEncoding::UTF16 => {
            let row = index.line_index(offset);

            let column = if index.is_ascii() {
                (offset - index.line_start(row, text)).to_usize()
            } else {
                let up_to_line = &text[TextRange::new(index.line_start(row, text), offset)];
                up_to_line.encode_utf16().count()
            };

            SourceLocation {
                column: OneIndexed::from_zero_indexed(column),
                row,
            }
        }
        PositionEncoding::UTF32 => index.source_location(offset, text),
    }
}

fn source_location_to_position(location: &SourceLocation) -> types::Position {
    types::Position {
        line: u32::try_from(location.row.to_zero_indexed()).expect("row usize fits in u32"),
        character: u32::try_from(location.column.to_zero_indexed())
            .expect("character usize fits in u32"),
    }
}
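The conversion done by `utf8_column_offset` above can be sketched in a few lines of Python (an illustration of the algorithm, not the Rust implementation): walk the line character by character, accumulating UTF-16 code units until the requested offset is reached, while summing UTF-8 byte lengths.

```python
def utf8_column(line: str, utf16_offset: int) -> int:
    """Convert a UTF-16 code-unit column offset into a UTF-8 byte offset."""
    utf8 = 0
    u16 = 0
    for ch in line:
        if u16 >= utf16_offset:
            break
        utf8 += len(ch.encode("utf-8"))          # 1-4 bytes per character
        u16 += len(ch.encode("utf-16-le")) // 2  # 1-2 code units per character
    return utf8

# "𝄞" (U+1D11E) is one character, 2 UTF-16 code units, 4 UTF-8 bytes,
# so the character "b" sits at UTF-16 offset 3 but UTF-8 offset 5.
print(utf8_column("a𝄞b", 3))  # 5
```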
@@ -1,3 +0,0 @@
mod diagnostic;

pub(super) use diagnostic::DocumentDiagnosticRequestHandler;
@@ -1,3 +0,0 @@
The `knot_extensions.pyi` file in this directory will be symlinked into
the `vendor/typeshed/stdlib` directory every time we sync our `typeshed`
stubs (see `.github/workflows/sync_typeshed.yaml`).
@@ -1 +0,0 @@
cdfb10c340c3df0f8b4112705e6e229b6ae269fd
@@ -1,117 +0,0 @@
import sys
from _typeshed import ReadableBuffer
from typing import ClassVar, final
from typing_extensions import Self

BLAKE2B_MAX_DIGEST_SIZE: int = 64
BLAKE2B_MAX_KEY_SIZE: int = 64
BLAKE2B_PERSON_SIZE: int = 16
BLAKE2B_SALT_SIZE: int = 16
BLAKE2S_MAX_DIGEST_SIZE: int = 32
BLAKE2S_MAX_KEY_SIZE: int = 32
BLAKE2S_PERSON_SIZE: int = 8
BLAKE2S_SALT_SIZE: int = 8

@final
class blake2b:
    MAX_DIGEST_SIZE: ClassVar[int] = 64
    MAX_KEY_SIZE: ClassVar[int] = 64
    PERSON_SIZE: ClassVar[int] = 16
    SALT_SIZE: ClassVar[int] = 16
    block_size: int
    digest_size: int
    name: str
    if sys.version_info >= (3, 9):
        def __new__(
            cls,
            data: ReadableBuffer = b"",
            /,
            *,
            digest_size: int = 64,
            key: ReadableBuffer = b"",
            salt: ReadableBuffer = b"",
            person: ReadableBuffer = b"",
            fanout: int = 1,
            depth: int = 1,
            leaf_size: int = 0,
            node_offset: int = 0,
            node_depth: int = 0,
            inner_size: int = 0,
            last_node: bool = False,
            usedforsecurity: bool = True,
        ) -> Self: ...
    else:
        def __new__(
            cls,
            data: ReadableBuffer = b"",
            /,
            *,
            digest_size: int = 64,
            key: ReadableBuffer = b"",
            salt: ReadableBuffer = b"",
            person: ReadableBuffer = b"",
            fanout: int = 1,
            depth: int = 1,
            leaf_size: int = 0,
            node_offset: int = 0,
            node_depth: int = 0,
            inner_size: int = 0,
            last_node: bool = False,
        ) -> Self: ...

    def copy(self) -> Self: ...
    def digest(self) -> bytes: ...
    def hexdigest(self) -> str: ...
    def update(self, data: ReadableBuffer, /) -> None: ...

@final
class blake2s:
    MAX_DIGEST_SIZE: ClassVar[int] = 32
    MAX_KEY_SIZE: ClassVar[int] = 32
    PERSON_SIZE: ClassVar[int] = 8
    SALT_SIZE: ClassVar[int] = 8
    block_size: int
    digest_size: int
    name: str
    if sys.version_info >= (3, 9):
        def __new__(
            cls,
            data: ReadableBuffer = b"",
            /,
            *,
            digest_size: int = 32,
            key: ReadableBuffer = b"",
            salt: ReadableBuffer = b"",
            person: ReadableBuffer = b"",
            fanout: int = 1,
            depth: int = 1,
            leaf_size: int = 0,
            node_offset: int = 0,
            node_depth: int = 0,
            inner_size: int = 0,
            last_node: bool = False,
            usedforsecurity: bool = True,
        ) -> Self: ...
    else:
        def __new__(
            cls,
            data: ReadableBuffer = b"",
            /,
            *,
            digest_size: int = 32,
            key: ReadableBuffer = b"",
            salt: ReadableBuffer = b"",
            person: ReadableBuffer = b"",
            fanout: int = 1,
            depth: int = 1,
            leaf_size: int = 0,
            node_offset: int = 0,
            node_depth: int = 0,
            inner_size: int = 0,
            last_node: bool = False,
        ) -> Self: ...

    def copy(self) -> Self: ...
    def digest(self) -> bytes: ...
    def hexdigest(self) -> str: ...
    def update(self, data: ReadableBuffer, /) -> None: ...
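The `digest_size` parameter declared in the stub above is the distinctive feature of the BLAKE2 constructors: the output length is configurable up to `MAX_DIGEST_SIZE` (64 bytes for `blake2b`). A quick check via the public `hashlib` wrapper:

```python
import hashlib

# Request a 16-byte digest; hexdigest() is twice as long (2 hex chars per byte).
h = hashlib.blake2b(b"hello", digest_size=16)
print(len(h.digest()))     # 16
print(len(h.hexdigest()))  # 32
```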
@@ -1,33 +0,0 @@
from collections.abc import Callable
from types import TracebackType
from typing import Any, NoReturn, overload
from typing_extensions import TypeVarTuple, Unpack

__all__ = ["error", "start_new_thread", "exit", "get_ident", "allocate_lock", "interrupt_main", "LockType", "RLock"]

_Ts = TypeVarTuple("_Ts")

TIMEOUT_MAX: int
error = RuntimeError

@overload
def start_new_thread(function: Callable[[Unpack[_Ts]], object], args: tuple[Unpack[_Ts]]) -> None: ...
@overload
def start_new_thread(function: Callable[..., object], args: tuple[Any, ...], kwargs: dict[str, Any]) -> None: ...
def exit() -> NoReturn: ...
def get_ident() -> int: ...
def allocate_lock() -> LockType: ...
def stack_size(size: int | None = None) -> int: ...

class LockType:
    locked_status: bool
    def acquire(self, waitflag: bool | None = None, timeout: int = -1) -> bool: ...
    def __enter__(self, waitflag: bool | None = None, timeout: int = -1) -> bool: ...
    def __exit__(self, typ: type[BaseException] | None, val: BaseException | None, tb: TracebackType | None) -> None: ...
    def release(self) -> bool: ...
    def locked(self) -> bool: ...

class RLock(LockType):
    def release(self) -> None: ...  # type: ignore[override]

def interrupt_main() -> None: ...
@@ -1,56 +0,0 @@
from _threading_local import local as local
from _typeshed import ProfileFunction, TraceFunction
from threading import (
    TIMEOUT_MAX as TIMEOUT_MAX,
    Barrier as Barrier,
    BoundedSemaphore as BoundedSemaphore,
    BrokenBarrierError as BrokenBarrierError,
    Condition as Condition,
    Event as Event,
    ExceptHookArgs as ExceptHookArgs,
    Lock as Lock,
    RLock as RLock,
    Semaphore as Semaphore,
    Thread as Thread,
    ThreadError as ThreadError,
    Timer as Timer,
    _DummyThread as _DummyThread,
    _RLock as _RLock,
    excepthook as excepthook,
)

__all__ = [
    "get_ident",
    "active_count",
    "Condition",
    "current_thread",
    "enumerate",
    "main_thread",
    "TIMEOUT_MAX",
    "Event",
    "Lock",
    "RLock",
    "Semaphore",
    "BoundedSemaphore",
    "Thread",
    "Barrier",
    "BrokenBarrierError",
    "Timer",
    "ThreadError",
    "setprofile",
    "settrace",
    "local",
    "stack_size",
    "ExceptHookArgs",
    "excepthook",
]

def active_count() -> int: ...
def current_thread() -> Thread: ...
def currentThread() -> Thread: ...
def get_ident() -> int: ...
def enumerate() -> list[Thread]: ...
def main_thread() -> Thread: ...
def settrace(func: TraceFunction) -> None: ...
def setprofile(func: ProfileFunction | None) -> None: ...
def stack_size(size: int | None = None) -> int: ...
@@ -1,93 +0,0 @@
import sys
from _typeshed import ReadableBuffer
from collections.abc import Callable
from types import ModuleType
from typing import AnyStr, Protocol, final, overload, type_check_only
from typing_extensions import Self, TypeAlias

_DigestMod: TypeAlias = str | Callable[[], _HashObject] | ModuleType | None

openssl_md_meth_names: frozenset[str]

@type_check_only
class _HashObject(Protocol):
    @property
    def digest_size(self) -> int: ...
    @property
    def block_size(self) -> int: ...
    @property
    def name(self) -> str: ...
    def copy(self) -> Self: ...
    def digest(self) -> bytes: ...
    def hexdigest(self) -> str: ...
    def update(self, obj: ReadableBuffer, /) -> None: ...

class HASH:
    @property
    def digest_size(self) -> int: ...
    @property
    def block_size(self) -> int: ...
    @property
    def name(self) -> str: ...
    def copy(self) -> Self: ...
    def digest(self) -> bytes: ...
    def hexdigest(self) -> str: ...
    def update(self, obj: ReadableBuffer, /) -> None: ...

if sys.version_info >= (3, 10):
    class UnsupportedDigestmodError(ValueError): ...

if sys.version_info >= (3, 9):
    class HASHXOF(HASH):
        def digest(self, length: int) -> bytes: ...  # type: ignore[override]
        def hexdigest(self, length: int) -> str: ...  # type: ignore[override]

    @final
    class HMAC:
        @property
        def digest_size(self) -> int: ...
        @property
        def block_size(self) -> int: ...
        @property
        def name(self) -> str: ...
        def copy(self) -> Self: ...
        def digest(self) -> bytes: ...
        def hexdigest(self) -> str: ...
        def update(self, msg: ReadableBuffer) -> None: ...

    @overload
    def compare_digest(a: ReadableBuffer, b: ReadableBuffer, /) -> bool: ...
    @overload
    def compare_digest(a: AnyStr, b: AnyStr, /) -> bool: ...
    def get_fips_mode() -> int: ...
    def hmac_new(key: bytes | bytearray, msg: ReadableBuffer = b"", digestmod: _DigestMod = None) -> HMAC: ...
    def new(name: str, string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_md5(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha1(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha224(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha256(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha384(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha512(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha3_224(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha3_256(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha3_384(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_sha3_512(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASH: ...
    def openssl_shake_128(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASHXOF: ...
    def openssl_shake_256(string: ReadableBuffer = b"", *, usedforsecurity: bool = True) -> HASHXOF: ...

else:
    def new(name: str, string: ReadableBuffer = b"") -> HASH: ...
    def openssl_md5(string: ReadableBuffer = b"") -> HASH: ...
    def openssl_sha1(string: ReadableBuffer = b"") -> HASH: ...
    def openssl_sha224(string: ReadableBuffer = b"") -> HASH: ...
    def openssl_sha256(string: ReadableBuffer = b"") -> HASH: ...
    def openssl_sha384(string: ReadableBuffer = b"") -> HASH: ...
    def openssl_sha512(string: ReadableBuffer = b"") -> HASH: ...

def hmac_digest(key: bytes | bytearray, msg: ReadableBuffer, digest: str) -> bytes: ...
def pbkdf2_hmac(
    hash_name: str, password: ReadableBuffer, salt: ReadableBuffer, iterations: int, dklen: int | None = None
) -> bytes: ...
def scrypt(
    password: ReadableBuffer, *, salt: ReadableBuffer, n: int, r: int, p: int, maxmem: int = 0, dklen: int = 64
) -> bytes: ...
@@ -1,2 +0,0 @@
from _dummy_threading import *
from _dummy_threading import __all__ as __all__
@@ -1,21 +0,0 @@
import codecs
from _codecs import _EncodingMap
from _typeshed import ReadableBuffer

class Codec(codecs.Codec):
    def encode(self, input: str, errors: str = "strict") -> tuple[bytes, int]: ...
    def decode(self, input: bytes, errors: str = "strict") -> tuple[str, int]: ...

class IncrementalEncoder(codecs.IncrementalEncoder):
    def encode(self, input: str, final: bool = False) -> bytes: ...

class IncrementalDecoder(codecs.IncrementalDecoder):
    def decode(self, input: ReadableBuffer, final: bool = False) -> str: ...

class StreamWriter(Codec, codecs.StreamWriter): ...
class StreamReader(Codec, codecs.StreamReader): ...

def getregentry() -> codecs.CodecInfo: ...

decoding_table: str
encoding_table: _EncodingMap
@@ -1,34 +0,0 @@
import codecs
import sys
from _typeshed import ReadableBuffer

class Codec(codecs.Codec):
    # At runtime, this is codecs.raw_unicode_escape_encode
    @staticmethod
    def encode(str: str, errors: str | None = None, /) -> tuple[bytes, int]: ...
    # At runtime, this is codecs.raw_unicode_escape_decode
    if sys.version_info >= (3, 9):
        @staticmethod
        def decode(data: str | ReadableBuffer, errors: str | None = None, final: bool = True, /) -> tuple[str, int]: ...
    else:
        @staticmethod
        def decode(data: str | ReadableBuffer, errors: str | None = None, /) -> tuple[str, int]: ...

class IncrementalEncoder(codecs.IncrementalEncoder):
    def encode(self, input: str, final: bool = False) -> bytes: ...

if sys.version_info >= (3, 9):
    class IncrementalDecoder(codecs.BufferedIncrementalDecoder):
        def _buffer_decode(self, input: str | ReadableBuffer, errors: str | None, final: bool) -> tuple[str, int]: ...

else:
    class IncrementalDecoder(codecs.IncrementalDecoder):
        def decode(self, input: str | ReadableBuffer, final: bool = False) -> str: ...

class StreamWriter(Codec, codecs.StreamWriter): ...

class StreamReader(Codec, codecs.StreamReader):
    if sys.version_info >= (3, 9):
        def decode(self, input: str | ReadableBuffer, errors: str = "strict") -> tuple[str, int]: ...  # type: ignore[override]

def getregentry() -> codecs.CodecInfo: ...
@@ -1,34 +0,0 @@
import codecs
import sys
from _typeshed import ReadableBuffer

class Codec(codecs.Codec):
    # At runtime, this is codecs.unicode_escape_encode
    @staticmethod
    def encode(str: str, errors: str | None = None, /) -> tuple[bytes, int]: ...
    # At runtime, this is codecs.unicode_escape_decode
    if sys.version_info >= (3, 9):
        @staticmethod
        def decode(data: str | ReadableBuffer, errors: str | None = None, final: bool = True, /) -> tuple[str, int]: ...
    else:
        @staticmethod
        def decode(data: str | ReadableBuffer, errors: str | None = None, /) -> tuple[str, int]: ...

class IncrementalEncoder(codecs.IncrementalEncoder):
    def encode(self, input: str, final: bool = False) -> bytes: ...

if sys.version_info >= (3, 9):
    class IncrementalDecoder(codecs.BufferedIncrementalDecoder):
        def _buffer_decode(self, input: str | ReadableBuffer, errors: str | None, final: bool) -> tuple[str, int]: ...

else:
    class IncrementalDecoder(codecs.IncrementalDecoder):
        def decode(self, input: str | ReadableBuffer, final: bool = False) -> str: ...

class StreamWriter(Codec, codecs.StreamWriter): ...

class StreamReader(Codec, codecs.StreamReader):
    if sys.version_info >= (3, 9):
        def decode(self, input: str | ReadableBuffer, errors: str = "strict") -> tuple[str, int]: ...  # type: ignore[override]

def getregentry() -> codecs.CodecInfo: ...
@@ -1,21 +0,0 @@
import sys
from collections.abc import Sequence
from typing import Final

if sys.version_info >= (3, 9):
    __all__ = ["iskeyword", "issoftkeyword", "kwlist", "softkwlist"]
else:
    __all__ = ["iskeyword", "kwlist"]

def iskeyword(s: str, /) -> bool: ...

# a list at runtime, but you're not meant to mutate it;
# type it as a sequence
kwlist: Final[Sequence[str]]

if sys.version_info >= (3, 9):
    def issoftkeyword(s: str, /) -> bool: ...

    # a list at runtime, but you're not meant to mutate it;
    # type it as a sequence
    softkwlist: Final[Sequence[str]]
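The `iskeyword`/`issoftkeyword` split in the stub above distinguishes hard keywords (always reserved) from soft keywords, which are only reserved in specific grammatical positions. A quick illustration:

```python
import keyword

print(keyword.iskeyword("for"))        # True: hard keyword
print(keyword.iskeyword("match"))      # False: only a soft keyword
print(keyword.issoftkeyword("match"))  # True on Python 3.10+, where match/case were added
```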
@@ -1,20 +0,0 @@
import sys
from tkinter import Misc
from tkinter.commondialog import Dialog
from typing import ClassVar

if sys.version_info >= (3, 9):
    __all__ = ["Chooser", "askcolor"]

class Chooser(Dialog):
    command: ClassVar[str]

if sys.version_info >= (3, 9):
    def askcolor(
        color: str | bytes | None = None, *, initialcolor: str = ..., parent: Misc = ..., title: str = ...
    ) -> tuple[None, None] | tuple[tuple[int, int, int], str]: ...

else:
    def askcolor(
        color: str | bytes | None = None, *, initialcolor: str = ..., parent: Misc = ..., title: str = ...
    ) -> tuple[None, None] | tuple[tuple[float, float, float], str]: ...
@@ -1,35 +0,0 @@
import sys
from collections.abc import Iterable
from datetime import datetime, timedelta, tzinfo
from typing_extensions import Self

# TODO: remove this version check
# In theory we shouldn't need this version check. Pyright complains about the imports
# from zoneinfo.* when run on 3.8 and 3.7 without this. Updates to typeshed's
# pyright test script are probably needed, see #11189
if sys.version_info >= (3, 9):
    from zoneinfo._common import ZoneInfoNotFoundError as ZoneInfoNotFoundError, _IOBytes
    from zoneinfo._tzpath import (
        TZPATH as TZPATH,
        InvalidTZPathWarning as InvalidTZPathWarning,
        available_timezones as available_timezones,
        reset_tzpath as reset_tzpath,
    )

    __all__ = ["ZoneInfo", "reset_tzpath", "available_timezones", "TZPATH", "ZoneInfoNotFoundError", "InvalidTZPathWarning"]

    class ZoneInfo(tzinfo):
        @property
        def key(self) -> str: ...
        def __new__(cls, key: str) -> Self: ...
        @classmethod
        def no_cache(cls, key: str) -> Self: ...
        @classmethod
        def from_file(cls, fobj: _IOBytes, /, key: str | None = None) -> Self: ...
        @classmethod
        def clear_cache(cls, *, only_keys: Iterable[str] | None = None) -> None: ...
        def tzname(self, dt: datetime | None, /) -> str | None: ...
        def utcoffset(self, dt: datetime | None, /) -> timedelta | None: ...
        def dst(self, dt: datetime | None, /) -> timedelta | None: ...

    def __dir__() -> list[str]: ...
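For reference, the `ZoneInfo` API described by the deleted stub above in use (stdlib `zoneinfo`; assumes IANA time-zone data is available on the system or via the `tzdata` package):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Construct from an IANA key; `key` is exposed as a read-only property.
tz = ZoneInfo("UTC")
print(tz.key)  # UTC

# ZoneInfo is a tzinfo subclass, so it plugs into datetime directly.
dt = datetime(2024, 1, 1, tzinfo=tz)
print(dt.utcoffset() == timedelta(0))  # True
print(tz.tzname(dt))  # UTC
```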
@@ -1,470 +0,0 @@
use std::any::Any;

use js_sys::{Error, JsString};
use red_knot_project::metadata::options::{EnvironmentOptions, Options};
use red_knot_project::metadata::value::RangedValue;
use red_knot_project::watch::{ChangeEvent, ChangedKind, CreatedKind, DeletedKind};
use red_knot_project::ProjectMetadata;
use red_knot_project::{Db, ProjectDatabase};
use ruff_db::diagnostic::{DisplayDiagnosticConfig, OldDiagnosticTrait};
use ruff_db::files::{system_path_to_file, File};
use ruff_db::source::{line_index, source_text};
use ruff_db::system::walk_directory::WalkDirectoryBuilder;
use ruff_db::system::{
    CaseSensitivity, DirectoryEntry, GlobError, MemoryFileSystem, Metadata, PatternError, System,
    SystemPath, SystemPathBuf, SystemVirtualPath,
};
use ruff_db::Upcast;
use ruff_notebook::Notebook;
use ruff_source_file::SourceLocation;
use wasm_bindgen::prelude::*;

#[wasm_bindgen(start)]
pub fn run() {
    use log::Level;

    // When the `console_error_panic_hook` feature is enabled, we can call the
    // `set_panic_hook` function at least once during initialization, and then
    // we will get better error messages if our code ever panics.
    //
    // For more details see
    // https://github.com/rustwasm/console_error_panic_hook#readme
    #[cfg(feature = "console_error_panic_hook")]
    console_error_panic_hook::set_once();

    console_log::init_with_level(Level::Debug).expect("Initializing logger went wrong.");
}

#[wasm_bindgen]
pub struct Workspace {
    db: ProjectDatabase,
    system: WasmSystem,
    options: Options,
}

#[wasm_bindgen]
impl Workspace {
    #[wasm_bindgen(constructor)]
    pub fn new(root: &str, settings: &Settings) -> Result<Workspace, Error> {
        let system = WasmSystem::new(SystemPath::new(root));

        let mut workspace =
            ProjectMetadata::discover(SystemPath::new(root), &system).map_err(into_error)?;

        let options = Options {
            environment: Some(EnvironmentOptions {
                python_version: Some(RangedValue::cli(settings.python_version.into())),
                ..EnvironmentOptions::default()
            }),
            ..Options::default()
        };

        workspace.apply_cli_options(options.clone());

        let db = ProjectDatabase::new(workspace, system.clone()).map_err(into_error)?;

        Ok(Self {
            db,
            system,
            options,
        })
    }

    #[wasm_bindgen(js_name = "openFile")]
    pub fn open_file(&mut self, path: &str, contents: &str) -> Result<FileHandle, Error> {
        let path = SystemPath::new(path);

        self.system
            .fs
            .write_file_all(path, contents)
            .map_err(into_error)?;

        self.db.apply_changes(
            vec![ChangeEvent::Created {
                path: path.to_path_buf(),
                kind: CreatedKind::File,
            }],
            Some(&self.options),
        );

        let file = system_path_to_file(&self.db, path).expect("File to exist");

        self.db.project().open_file(&mut self.db, file);

        Ok(FileHandle {
            file,
            path: path.to_path_buf(),
        })
    }

    #[wasm_bindgen(js_name = "updateFile")]
    pub fn update_file(&mut self, file_id: &FileHandle, contents: &str) -> Result<(), Error> {
        if !self.system.fs.exists(&file_id.path) {
            return Err(Error::new("File does not exist"));
        }

        self.system
            .fs
            .write_file(&file_id.path, contents)
            .map_err(into_error)?;

        self.db.apply_changes(
            vec![
                ChangeEvent::Changed {
                    path: file_id.path.to_path_buf(),
                    kind: ChangedKind::FileContent,
                },
                ChangeEvent::Changed {
                    path: file_id.path.to_path_buf(),
                    kind: ChangedKind::FileMetadata,
                },
            ],
            Some(&self.options),
        );

        Ok(())
    }

    #[wasm_bindgen(js_name = "closeFile")]
    pub fn close_file(&mut self, file_id: &FileHandle) -> Result<(), Error> {
        let file = file_id.file;

        self.db.project().close_file(&mut self.db, file);
        self.system
            .fs
            .remove_file(&file_id.path)
            .map_err(into_error)?;

        self.db.apply_changes(
            vec![ChangeEvent::Deleted {
                path: file_id.path.to_path_buf(),
                kind: DeletedKind::File,
            }],
            Some(&self.options),
        );

        Ok(())
    }

    /// Checks a single file.
    #[wasm_bindgen(js_name = "checkFile")]
    pub fn check_file(&self, file_id: &FileHandle) -> Result<Vec<Diagnostic>, Error> {
        let result = self.db.check_file(file_id.file).map_err(into_error)?;

        Ok(result.into_iter().map(Diagnostic::wrap).collect())
    }

    /// Checks all open files
    pub fn check(&self) -> Result<Vec<Diagnostic>, Error> {
        let result = self.db.check().map_err(into_error)?;

        Ok(result.into_iter().map(Diagnostic::wrap).collect())
    }

    /// Returns the parsed AST for `path`
    pub fn parsed(&self, file_id: &FileHandle) -> Result<String, Error> {
        let parsed = ruff_db::parsed::parsed_module(&self.db, file_id.file);

        Ok(format!("{:#?}", parsed.syntax()))
    }

    /// Returns the token stream for `path` serialized as a string.
    pub fn tokens(&self, file_id: &FileHandle) -> Result<String, Error> {
        let parsed = ruff_db::parsed::parsed_module(&self.db, file_id.file);

        Ok(format!("{:#?}", parsed.tokens()))
    }

    #[wasm_bindgen(js_name = "sourceText")]
    pub fn source_text(&self, file_id: &FileHandle) -> Result<String, Error> {
        let source_text = ruff_db::source::source_text(&self.db, file_id.file);

        Ok(source_text.to_string())
    }
}

pub(crate) fn into_error<E: std::fmt::Display>(err: E) -> Error {
    Error::new(&err.to_string())
}

#[derive(Debug, Eq, PartialEq)]
#[wasm_bindgen(inspectable)]
pub struct FileHandle {
    path: SystemPathBuf,
    file: File,
}

#[wasm_bindgen]
impl FileHandle {
    #[wasm_bindgen(js_name = toString)]
    pub fn js_to_string(&self) -> String {
        format!("file(id: {:?}, path: {})", self.file, self.path)
    }
}

#[wasm_bindgen]
pub struct Settings {
    pub python_version: PythonVersion,
}

#[wasm_bindgen]
impl Settings {
    #[wasm_bindgen(constructor)]
    pub fn new(python_version: PythonVersion) -> Self {
        Self { python_version }
    }
}

#[wasm_bindgen]
pub struct Diagnostic {
    #[wasm_bindgen(readonly)]
    inner: Box<dyn OldDiagnosticTrait>,
}

#[wasm_bindgen]
impl Diagnostic {
    fn wrap(diagnostic: Box<dyn OldDiagnosticTrait>) -> Self {
        Self { inner: diagnostic }
    }

    #[wasm_bindgen]
    pub fn message(&self) -> JsString {
        JsString::from(&*self.inner.message())
    }

    #[wasm_bindgen]
    pub fn id(&self) -> JsString {
        JsString::from(self.inner.id().to_string())
    }

    #[wasm_bindgen]
    pub fn severity(&self) -> Severity {
        Severity::from(self.inner.severity())
    }

    #[wasm_bindgen]
    pub fn text_range(&self) -> Option<TextRange> {
        self.inner
            .span()
            .and_then(|span| Some(TextRange::from(span.range()?)))
    }

    #[wasm_bindgen]
    pub fn to_range(&self, workspace: &Workspace) -> Option<Range> {
        self.inner.span().and_then(|span| {
            let line_index = line_index(workspace.db.upcast(), span.file());
            let source = source_text(workspace.db.upcast(), span.file());
            let text_range = span.range()?;

            Some(Range {
                start: line_index
                    .source_location(text_range.start(), &source)
                    .into(),
                end: line_index.source_location(text_range.end(), &source).into(),
            })
        })
    }

    #[wasm_bindgen]
    pub fn display(&self, workspace: &Workspace) -> JsString {
        let config = DisplayDiagnosticConfig::default().color(false);
        self.inner
            .display(workspace.db.upcast(), &config)
            .to_string()
            .into()
    }
}

#[wasm_bindgen]
#[derive(Copy, Clone, Eq, PartialEq, Debug)]
pub struct Range {
    pub start: Position,
    pub end: Position,
}

impl From<SourceLocation> for Position {
    fn from(location: SourceLocation) -> Self {
        Self {
            line: location.row.to_zero_indexed(),
            character: location.column.to_zero_indexed(),
        }
    }
}

#[wasm_bindgen]
#[derive(Copy, Clone, Eq, PartialEq, Debug)]
pub struct Position {
    pub line: usize,
    pub character: usize,
}

#[wasm_bindgen]
#[derive(Copy, Clone, Hash, PartialEq, Eq)]
pub enum Severity {
    Info,
    Warning,
    Error,
    Fatal,
}

impl From<ruff_db::diagnostic::Severity> for Severity {
    fn from(value: ruff_db::diagnostic::Severity) -> Self {
        match value {
            ruff_db::diagnostic::Severity::Info => Self::Info,
            ruff_db::diagnostic::Severity::Warning => Self::Warning,
            ruff_db::diagnostic::Severity::Error => Self::Error,
            ruff_db::diagnostic::Severity::Fatal => Self::Fatal,
        }
    }
}

#[wasm_bindgen]
pub struct TextRange {
    pub start: u32,
    pub end: u32,
}

impl From<ruff_text_size::TextRange> for TextRange {
    fn from(value: ruff_text_size::TextRange) -> Self {
        Self {
            start: value.start().into(),
            end: value.end().into(),
        }
    }
}

#[wasm_bindgen]
#[derive(Copy, Clone, Hash, PartialEq, Eq, PartialOrd, Ord, Default)]
pub enum PythonVersion {
    Py37,
    Py38,
    #[default]
    Py39,
    Py310,
    Py311,
    Py312,
    Py313,
}

impl From<PythonVersion> for ruff_python_ast::PythonVersion {
    fn from(value: PythonVersion) -> Self {
        match value {
            PythonVersion::Py37 => Self::PY37,
            PythonVersion::Py38 => Self::PY38,
            PythonVersion::Py39 => Self::PY39,
            PythonVersion::Py310 => Self::PY310,
            PythonVersion::Py311 => Self::PY311,
            PythonVersion::Py312 => Self::PY312,
            PythonVersion::Py313 => Self::PY313,
        }
    }
}

#[derive(Debug, Clone)]
struct WasmSystem {
    fs: MemoryFileSystem,
}

impl WasmSystem {
    fn new(root: &SystemPath) -> Self {
        Self {
            fs: MemoryFileSystem::with_current_directory(root),
        }
    }
}

impl System for WasmSystem {
    fn path_metadata(&self, path: &SystemPath) -> ruff_db::system::Result<Metadata> {
        self.fs.metadata(path)
    }

    fn canonicalize_path(&self, path: &SystemPath) -> ruff_db::system::Result<SystemPathBuf> {
        self.fs.canonicalize(path)
    }

    fn read_to_string(&self, path: &SystemPath) -> ruff_db::system::Result<String> {
        self.fs.read_to_string(path)
    }

    fn read_to_notebook(
        &self,
        path: &SystemPath,
    ) -> Result<ruff_notebook::Notebook, ruff_notebook::NotebookError> {
        let content = self.read_to_string(path)?;
        Notebook::from_source_code(&content)
    }

    fn read_virtual_path_to_string(
        &self,
        _path: &SystemVirtualPath,
    ) -> ruff_db::system::Result<String> {
        Err(not_found())
    }

    fn read_virtual_path_to_notebook(
        &self,
        _path: &SystemVirtualPath,
    ) -> Result<Notebook, ruff_notebook::NotebookError> {
        Err(ruff_notebook::NotebookError::Io(not_found()))
    }

    fn path_exists_case_sensitive(&self, path: &SystemPath, _prefix: &SystemPath) -> bool {
        self.path_exists(path)
    }

    fn case_sensitivity(&self) -> CaseSensitivity {
        CaseSensitivity::CaseSensitive
    }

    fn current_directory(&self) -> &SystemPath {
        self.fs.current_directory()
    }

    fn user_config_directory(&self) -> Option<SystemPathBuf> {
        None
    }

    fn read_directory<'a>(
        &'a self,
        path: &SystemPath,
    ) -> ruff_db::system::Result<
        Box<dyn Iterator<Item = ruff_db::system::Result<DirectoryEntry>> + 'a>,
    > {
        Ok(Box::new(self.fs.read_directory(path)?))
    }

    fn walk_directory(&self, path: &SystemPath) -> WalkDirectoryBuilder {
        self.fs.walk_directory(path)
    }

    fn glob(
        &self,
        pattern: &str,
    ) -> Result<Box<dyn Iterator<Item = Result<SystemPathBuf, GlobError>>>, PatternError> {
        Ok(Box::new(self.fs.glob(pattern)?))
    }

    fn as_any(&self) -> &dyn Any {
        self
    }

    fn as_any_mut(&mut self) -> &mut dyn Any {
        self
    }
}

fn not_found() -> std::io::Error {
    std::io::Error::new(std::io::ErrorKind::NotFound, "No such file or directory")
}

#[cfg(test)]
mod tests {
    use crate::PythonVersion;

    #[test]
    fn same_default_as_python_version() {
        assert_eq!(
            ruff_python_ast::PythonVersion::from(PythonVersion::default()),
            ruff_python_ast::PythonVersion::default()
        );
    }
}
@@ -1,6 +1,6 @@
[package]
name = "ruff"
-version = "0.11.2"
+version = "0.11.8"
publish = true
authors = { workspace = true }
edition = { workspace = true }
@@ -33,7 +33,6 @@ argfile = { workspace = true }
bincode = { workspace = true }
bitflags = { workspace = true }
cachedir = { workspace = true }
-chrono = { workspace = true }
clap = { workspace = true, features = ["derive", "env", "wrap_help"] }
clap_complete_command = { workspace = true }
clearscreen = { workspace = true }
@@ -43,6 +42,7 @@ globwalk = { workspace = true }
ignore = { workspace = true }
is-macro = { workspace = true }
itertools = { workspace = true }
+jiff = { workspace = true }
log = { workspace = true }
notify = { workspace = true }
path-absolutize = { workspace = true, features = ["once_cell_cache"] }
@@ -75,7 +75,10 @@ test-case = { workspace = true }

[package.metadata.cargo-shear]
# Used via macro expansion.
-ignored = ["chrono"]
+ignored = ["jiff"]

[package.metadata.dist]
dist = true
+
+[target.'cfg(target_os = "windows")'.dependencies]
+mimalloc = { workspace = true }

@@ -13,7 +13,6 @@ fn main() {

    commit_info(&workspace_root);

-    #[allow(clippy::disallowed_methods)]
    let target = std::env::var("TARGET").unwrap();
    println!("cargo::rustc-env=RUST_HOST_TARGET={target}");
}

@@ -1,10 +1,11 @@
use std::cmp::Ordering;
-use std::fmt::Formatter;
+use std::fmt::{Formatter, Write as _};
use std::ops::Deref;
use std::path::{Path, PathBuf};
use std::str::FromStr;
use std::sync::Arc;

+use crate::commands::completions::config::{OptionString, OptionStringParser};
use anyhow::bail;
use clap::builder::{TypedValueParser, ValueParserFactory};
use clap::{command, Parser, Subcommand};
@@ -22,7 +23,7 @@ use ruff_linter::settings::types::{
};
use ruff_linter::{RuleParser, RuleSelector, RuleSelectorParser};
use ruff_python_ast as ast;
-use ruff_source_file::{LineIndex, OneIndexed};
+use ruff_source_file::{LineIndex, OneIndexed, PositionEncoding};
use ruff_text_size::TextRange;
use ruff_workspace::configuration::{Configuration, RuleSelection};
use ruff_workspace::options::{Options, PycodestyleOptions};
@@ -31,8 +32,6 @@ use ruff_workspace::resolver::ConfigurationTransformer;
use rustc_hash::FxHashMap;
use toml;

-use crate::commands::completions::config::{OptionString, OptionStringParser};
-
/// All configuration options that can be passed "globally",
/// i.e., can be passed to all subcommands
#[derive(Debug, Default, Clone, clap::Args)]
@@ -94,7 +93,7 @@ pub struct Args {
    pub(crate) global_options: GlobalConfigArgs,
}

-#[allow(clippy::large_enum_variant)]
+#[expect(clippy::large_enum_variant)]
#[derive(Debug, clap::Subcommand)]
pub enum Command {
    /// Run Ruff on the given files or directories.
@@ -178,11 +177,14 @@ pub struct AnalyzeGraphCommand {
    /// The minimum Python version that should be supported.
    #[arg(long, value_enum)]
    target_version: Option<PythonVersion>,
+    /// Path to a virtual environment to use for resolving additional dependencies
+    #[arg(long)]
+    python: Option<PathBuf>,
}

// The `Parser` derive is for ruff_dev, for ruff `Args` would be sufficient
#[derive(Clone, Debug, clap::Parser)]
-#[allow(clippy::struct_excessive_bools)]
+#[expect(clippy::struct_excessive_bools)]
pub struct CheckCommand {
    /// List of files or directories to check.
    #[clap(help = "List of files or directories to check [default: .]")]
@@ -444,7 +446,7 @@ pub struct CheckCommand {
}

#[derive(Clone, Debug, clap::Parser)]
-#[allow(clippy::struct_excessive_bools)]
+#[expect(clippy::struct_excessive_bools)]
pub struct FormatCommand {
    /// List of files or directories to format.
    #[clap(help = "List of files or directories to format [default: .]")]
@@ -558,7 +560,7 @@ pub enum HelpFormat {
    Json,
}

-#[allow(clippy::module_name_repetitions)]
+#[expect(clippy::module_name_repetitions)]
#[derive(Debug, Default, Clone, clap::Args)]
pub struct LogLevelArgs {
    /// Enable verbose logging.
@@ -797,6 +799,7 @@ impl AnalyzeGraphCommand {
        let format_arguments = AnalyzeGraphArgs {
            files: self.files,
            direction: self.direction,
+            python: self.python,
        };

        let cli_overrides = ExplicitConfigOverrides {
@@ -974,12 +977,13 @@ A `--config` flag must either be a path to a `.toml` configuration file
            .is_some_and(|ext| ext.eq_ignore_ascii_case("toml"))
        {
            if !value.contains('=') {
-                tip.push_str(&format!(
+                let _ = write!(
+                    &mut tip,
                    "

It looks like you were trying to pass a path to a configuration file.
The path `{value}` does not point to a configuration file"
-                ));
+                );
            }
        } else if let Some((key, value)) = value.split_once('=') {
            let key = key.trim_ascii();
@@ -993,7 +997,8 @@ The path `{value}` does not point to a configuration file"
                    .map(|(name, _)| format!("- `{key}.{name}`"))
                    .join("\n");

-                tip.push_str(&format!(
+                let _ = write!(
+                    &mut tip,
                    "

`{key}` is a table of configuration options.
@@ -1002,13 +1007,14 @@ Did you want to override one of the table's subkeys?
Possible choices:

{prefixed_subfields}"
-                ));
+                );
            }
            _ => {
-                tip.push_str(&format!(
+                let _ = write!(
+                    &mut tip,
                    "\n\n{}:\n\n{underlying_error}",
                    config_parse_error.description()
-                ));
+                );
            }
        }
    }
@@ -1025,7 +1031,7 @@ Possible choices:

/// CLI settings that are distinct from configuration (commands, lists of files,
/// etc.).
-#[allow(clippy::struct_excessive_bools)]
+#[expect(clippy::struct_excessive_bools)]
pub struct CheckArguments {
    pub add_noqa: bool,
    pub diff: bool,
@@ -1044,7 +1050,7 @@ pub struct CheckArguments {

/// CLI settings that are distinct from configuration (commands, lists of files,
/// etc.).
-#[allow(clippy::struct_excessive_bools)]
+#[expect(clippy::struct_excessive_bools)]
pub struct FormatArguments {
    pub check: bool,
    pub no_cache: bool,
@@ -1067,8 +1073,9 @@ impl FormatRange {
    ///
    /// Returns an empty range if the start range is past the end of `source`.
    pub(super) fn to_text_range(self, source: &str, line_index: &LineIndex) -> TextRange {
-        let start_byte_offset = line_index.offset(self.start.line, self.start.column, source);
-        let end_byte_offset = line_index.offset(self.end.line, self.end.column, source);
+        let start_byte_offset =
+            line_index.offset(self.start.into(), source, PositionEncoding::Utf32);
+        let end_byte_offset = line_index.offset(self.end.into(), source, PositionEncoding::Utf32);

        TextRange::new(start_byte_offset, end_byte_offset)
    }
@@ -1139,6 +1146,15 @@ pub struct LineColumn {
    pub column: OneIndexed,
}

+impl From<LineColumn> for ruff_source_file::SourceLocation {
+    fn from(value: LineColumn) -> Self {
+        Self {
+            line: value.line,
+            character_offset: value.column,
+        }
+    }
+}
+
impl std::fmt::Display for LineColumn {
    fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
        write!(f, "{line}:{column}", line = self.line, column = self.column)
@@ -1249,12 +1265,12 @@ impl LineColumnParseError {
pub struct AnalyzeGraphArgs {
    pub files: Vec<PathBuf>,
    pub direction: Direction,
+    pub python: Option<PathBuf>,
}

/// Configuration overrides provided via dedicated CLI flags:
/// `--line-length`, `--respect-gitignore`, etc.
#[derive(Clone, Default)]
-#[allow(clippy::struct_excessive_bools)]
struct ExplicitConfigOverrides {
    dummy_variable_rgx: Option<Regex>,
    exclude: Option<Vec<FilePattern>>,

@@ -86,7 +86,7 @@ pub(crate) struct Cache {
    changes: Mutex<Vec<Change>>,
    /// The "current" timestamp used as cache for the updates of
    /// [`FileCache::last_seen`]
-    #[allow(clippy::struct_field_names)]
+    #[expect(clippy::struct_field_names)]
    last_seen_cache: u64,
}

@@ -146,7 +146,7 @@ impl Cache {
        Cache::new(path, package)
    }

-    #[allow(clippy::cast_possible_truncation)]
+    #[expect(clippy::cast_possible_truncation)]
    fn new(path: PathBuf, package: PackageCache) -> Self {
        Cache {
            path,
@@ -204,7 +204,7 @@ impl Cache {
    }

    /// Applies the pending changes without storing the cache to disk.
-    #[allow(clippy::cast_possible_truncation)]
+    #[expect(clippy::cast_possible_truncation)]
    pub(crate) fn save(&mut self) -> bool {
        /// Maximum duration for which we keep a file in cache that hasn't been seen.
        const MAX_LAST_SEEN: Duration = Duration::from_secs(30 * 24 * 60 * 60); // 30 days.
@@ -616,7 +616,7 @@ mod tests {
        let settings = Settings {
            cache_dir,
            linter: LinterSettings {
-                unresolved_target_version: PythonVersion::latest(),
+                unresolved_target_version: PythonVersion::latest().into(),
                ..Default::default()
            },
            ..Settings::default()
@@ -834,7 +834,6 @@ mod tests {
    // Regression test for issue #3086.

    #[cfg(unix)]
-    #[allow(clippy::items_after_statements)]
    fn flip_execute_permission_bit(path: &Path) -> io::Result<()> {
        use std::os::unix::fs::PermissionsExt;
        let file = fs::OpenOptions::new().write(true).open(path)?;
@@ -843,7 +842,6 @@ mod tests {
    }

    #[cfg(windows)]
-    #[allow(clippy::items_after_statements)]
    fn flip_read_only_permission(path: &Path) -> io::Result<()> {
        let file = fs::OpenOptions::new().write(true).open(path)?;
        let mut perms = file.metadata()?.permissions();

@@ -26,7 +26,7 @@ pub(crate) fn add_noqa(
    let start = Instant::now();
    let (paths, resolver) = python_files_in_path(files, pyproject_config, config_arguments)?;
    let duration = start.elapsed();
-    debug!("Identified files to lint in: {:?}", duration);
+    debug!("Identified files to lint in: {duration:?}");

    if paths.is_empty() {
        warn_user_once!("No Python files found under the given path(s)");
@@ -87,7 +87,7 @@ pub(crate) fn add_noqa(
        .sum();

    let duration = start.elapsed();
-    debug!("Added noqa to files in: {:?}", duration);
+    debug!("Added noqa to files in: {duration:?}");

    Ok(modifications)
}

@@ -75,6 +75,8 @@ pub(crate) fn analyze_graph(
            .target_version
            .as_tuple()
            .into(),
+        args.python
+            .and_then(|python| SystemPathBuf::from_path_buf(python).ok()),
    )?;

    let imports = {

@@ -30,7 +30,6 @@ use crate::cache::{Cache, PackageCacheMap, PackageCaches};
use crate::diagnostics::Diagnostics;

/// Run the linter over a collection of files.
-#[allow(clippy::too_many_arguments)]
pub(crate) fn check(
    files: &[PathBuf],
    pyproject_config: &PyprojectConfig,
@@ -174,14 +173,13 @@ pub(crate) fn check(
    caches.persist()?;

    let duration = start.elapsed();
-    debug!("Checked {:?} files in: {:?}", checked_files, duration);
+    debug!("Checked {checked_files:?} files in: {duration:?}");

    Ok(all_diagnostics)
}

/// Wraps [`lint_path`](crate::diagnostics::lint_path) in a [`catch_unwind`](std::panic::catch_unwind) and emits
/// a diagnostic if the linting the file panics.
-#[allow(clippy::too_many_arguments)]
fn lint_path(
    path: &Path,
    package: Option<PackageRoot<'_>>,

@@ -5,7 +5,7 @@ use crate::args::HelpFormat;
use ruff_workspace::options::Options;
use ruff_workspace::options_base::OptionsMetadata;

-#[allow(clippy::print_stdout)]
+#[expect(clippy::print_stdout)]
pub(crate) fn config(key: Option<&str>, format: HelpFormat) -> Result<()> {
    match key {
        None => {

@@ -160,7 +160,7 @@ pub(crate) fn format(
            }),
            Err(error) => Err(FormatCommandError::Panic(
                Some(resolved_file.path().to_path_buf()),
-                error,
+                Box::new(error),
            )),
        },
    )
@@ -362,7 +362,7 @@ pub(crate) fn format_source(
        })
    } else {
        // Using `Printed::into_code` requires adding `ruff_formatter` as a direct dependency, and I suspect that Rust can optimize the closure away regardless.
-        #[allow(clippy::redundant_closure_for_method_calls)]
+        #[expect(clippy::redundant_closure_for_method_calls)]
        format_module_source(unformatted, options).map(|formatted| formatted.into_code())
    };

@@ -635,7 +635,7 @@ impl<'a> FormatResults<'a> {
pub(crate) enum FormatCommandError {
    Ignore(#[from] ignore::Error),
    Parse(#[from] DisplayParseError),
-    Panic(Option<PathBuf>, PanicError),
+    Panic(Option<PathBuf>, Box<PanicError>),
    Read(Option<PathBuf>, SourceError),
    Format(Option<PathBuf>, FormatModuleError),
    Write(Option<PathBuf>, SourceError),

@@ -1,3 +1,4 @@
+use std::fmt::Write as _;
use std::io::{self, BufWriter, Write};

use anyhow::Result;
@@ -18,7 +19,7 @@ struct Explanation<'a> {
    summary: &'a str,
    message_formats: &'a [&'a str],
    fix: String,
-    #[allow(clippy::struct_field_names)]
+    #[expect(clippy::struct_field_names)]
    explanation: Option<&'a str>,
    preview: bool,
}
@@ -43,12 +44,16 @@ impl<'a> Explanation<'a> {

fn format_rule_text(rule: Rule) -> String {
    let mut output = String::new();
-    output.push_str(&format!("# {} ({})", rule.as_ref(), rule.noqa_code()));
+    let _ = write!(&mut output, "# {} ({})", rule.as_ref(), rule.noqa_code());
    output.push('\n');
    output.push('\n');

    let (linter, _) = Linter::parse_code(&rule.noqa_code().to_string()).unwrap();
-    output.push_str(&format!("Derived from the **{}** linter.", linter.name()));
+    let _ = write!(
+        &mut output,
+        "Derived from the **{}** linter.",
+        linter.name()
+    );
    output.push('\n');
    output.push('\n');
@@ -76,7 +81,7 @@ fn format_rule_text(rule: Rule) -> String {
        output.push_str("Message formats:");
        for format in rule.message_formats() {
            output.push('\n');
-            output.push_str(&format!("* {format}"));
+            let _ = write!(&mut output, "* {format}");
        }
    }
    output
@@ -92,7 +97,7 @@ pub(crate) fn rule(rule: Rule, format: HelpFormat) -> Result<()> {
            HelpFormat::Json => {
                serde_json::to_writer_pretty(stdout, &Explanation::from_rule(&rule))?;
            }
        };
    }
    Ok(())
}

@@ -16,6 +16,6 @@ pub(crate) fn version(output_format: HelpFormat) -> Result<()> {
            HelpFormat::Json => {
                serde_json::to_writer_pretty(stdout, &version_info)?;
            }
        };
    }
    Ok(())
}

@@ -367,8 +367,7 @@ pub(crate) fn lint_path(
    })
}

-/// Generate `Diagnostic`s from source code content derived from
-/// stdin.
+/// Generate `Diagnostic`s from source code content derived from stdin.
pub(crate) fn lint_stdin(
    path: Option<&Path>,
    package: Option<PackageRoot<'_>>,
@@ -377,13 +376,37 @@ pub(crate) fn lint_stdin(
    noqa: flags::Noqa,
    fix_mode: flags::FixMode,
) -> Result<Diagnostics> {
-    // TODO(charlie): Support `pyproject.toml`.
    let source_type = match path.and_then(|path| settings.linter.extension.get(path)) {
        None => match path.map(SourceType::from).unwrap_or_default() {
            SourceType::Python(source_type) => source_type,
-            SourceType::Toml(_) => {
-                return Ok(Diagnostics::default());
-            }
+            SourceType::Toml(source_type) if source_type.is_pyproject() => {
+                if !settings
+                    .linter
+                    .rules
+                    .iter_enabled()
+                    .any(|rule_code| rule_code.lint_source().is_pyproject_toml())
+                {
+                    return Ok(Diagnostics::default());
+                }
+
+                let path = path.unwrap();
+                let source_file =
+                    SourceFileBuilder::new(path.to_string_lossy(), contents.clone()).finish();
+
+                match fix_mode {
+                    flags::FixMode::Diff | flags::FixMode::Generate => {}
+                    flags::FixMode::Apply => write!(&mut io::stdout().lock(), "{contents}")?,
+                }
+
+                return Ok(Diagnostics {
+                    messages: lint_pyproject_toml(source_file, &settings.linter),
+                    fixed: FixMap::from_iter([(fs::relativize_path(path), FixTable::default())]),
+                    notebook_indexes: FxHashMap::default(),
+                });
+            }
+            SourceType::Toml(_) => return Ok(Diagnostics::default()),
        },
        Some(language) => PySourceType::from(language),
    };

@@ -112,7 +112,7 @@ fn is_stdin(files: &[PathBuf], stdin_filename: Option<&Path>) -> bool {
file == Path::new("-")
}

/// Returns the default set of files if none are provided, otherwise returns `None`.
/// Returns the default set of files if none are provided, otherwise returns provided files.
fn resolve_default_files(files: Vec<PathBuf>, is_stdin: bool) -> Vec<PathBuf> {
    if files.is_empty() {
        if is_stdin {
@@ -134,7 +134,7 @@ pub fn run(
{
    let default_panic_hook = std::panic::take_hook();
    std::panic::set_hook(Box::new(move |info| {
        #[allow(clippy::print_stderr)]
        #[expect(clippy::print_stderr)]
        {
            eprintln!(
                r#"
@@ -228,7 +228,7 @@ fn server(args: ServerCommand) -> Result<ExitStatus> {
// by default, we set the number of worker threads to `num_cpus`, with a maximum of 4.
let worker_threads = std::thread::available_parallelism()
    .unwrap_or(four)
    .max(four);
    .min(four);
commands::server::run_server(worker_threads, args.resolve_preview())
}

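The fix above replaces `.max(four)` with `.min(four)`: `max(4)` would *raise* a 2-core machine's thread count to 4, while `min(4)` caps a many-core machine at 4, matching the comment. A minimal sketch of the capped selection, factored to take the detected value so it is testable (the helper name is illustrative, not ruff's):

```rust
use std::num::NonZeroUsize;

// Cap a detected parallelism value at 4 worker threads.
// `min`, not `max`: `max(4)` would inflate small machines to 4 threads.
fn capped_workers(detected: usize) -> NonZeroUsize {
    let four = NonZeroUsize::new(4).unwrap();
    // Fall back to 4 when detection yields zero (i.e. is unavailable).
    NonZeroUsize::new(detected).unwrap_or(four).min(four)
}
```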
@@ -290,8 +290,8 @@ pub fn check(args: CheckCommand, global_options: GlobalConfigArgs) -> Result<Exi
// - If `--fix` or `--fix-only` is set, apply applicable fixes to the filesystem (or
//   print them to stdout, if we're reading from stdin).
// - If `--diff` or `--fix-only` are set, don't print any violations (only applicable fixes)
// - By default, applicable fixes only include [`Applicablility::Automatic`], but if
//   `--unsafe-fixes` is set, then [`Applicablility::Suggested`] fixes are included.
// - By default, applicable fixes only include [`Applicability::Automatic`], but if
//   `--unsafe-fixes` is set, then [`Applicability::Suggested`] fixes are included.

let fix_mode = if cli.diff {
    FixMode::Diff
@@ -326,7 +326,7 @@ pub fn check(args: CheckCommand, global_options: GlobalConfigArgs) -> Result<Exi
commands::add_noqa::add_noqa(&files, &pyproject_config, &config_arguments)?;
if modifications > 0 && config_arguments.log_level >= LogLevel::Default {
    let s = if modifications == 1 { "" } else { "s" };
    #[allow(clippy::print_stderr)]
    #[expect(clippy::print_stderr)]
    {
        eprintln!("Added {modifications} noqa directive{s}.");
    }

@@ -31,7 +31,7 @@ pub fn main() -> ExitCode {

// support FORCE_COLOR env var
if let Some(force_color) = std::env::var_os("FORCE_COLOR") {
    if force_color.len() > 0 {
    if !force_color.is_empty() {
        colored::control::set_override(true);
    }
}

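The change above replaces `force_color.len() > 0` with the idiomatic `!force_color.is_empty()`: any non-empty `FORCE_COLOR` value enables color. A sketch of that predicate, factored to take the variable's value so it can be tested without touching the process environment (the helper name is illustrative):

```rust
use std::ffi::OsString;

// Color is forced when FORCE_COLOR is set to any non-empty value.
fn force_color_enabled(value: Option<OsString>) -> bool {
    value.is_some_and(|v| !v.is_empty())
}
```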
@@ -241,7 +241,6 @@ impl Printer {
}

if !self.flags.intersects(Flags::SHOW_VIOLATIONS) {
    #[allow(deprecated)]
    if matches!(
        self.format,
        OutputFormat::Full | OutputFormat::Concise | OutputFormat::Grouped

@@ -1,6 +1,6 @@
use std::path::Path;

use anyhow::Result;
use anyhow::{bail, Result};
use log::debug;
use path_absolutize::path_dedot;

@@ -21,10 +21,14 @@ pub fn resolve(
    config_arguments: &ConfigArguments,
    stdin_filename: Option<&Path>,
) -> Result<PyprojectConfig> {
    let Ok(cwd) = std::env::current_dir() else {
        bail!("Working directory does not exist")
    };

    // First priority: if we're running in isolated mode, use the default settings.
    if config_arguments.isolated {
        let config = config_arguments.transform(Configuration::default());
        let settings = config.into_settings(&path_dedot::CWD)?;
        let settings = config.into_settings(&cwd)?;
        debug!("Isolated mode, not reading any pyproject.toml");
        return Ok(PyprojectConfig::new(
            PyprojectDiscoveryStrategy::Fixed,
@@ -58,11 +62,7 @@ pub fn resolve(
    // that directory. (With `Strategy::Hierarchical`, we'll end up finding
    // the "closest" `pyproject.toml` file for every Python file later on,
    // so these act as the "default" settings.)
    if let Some(pyproject) = pyproject::find_settings_toml(
        stdin_filename
            .as_ref()
            .unwrap_or(&path_dedot::CWD.as_path()),
    )? {
    if let Some(pyproject) = pyproject::find_settings_toml(stdin_filename.unwrap_or(&cwd))? {
        debug!(
            "Using configuration file (via parent) at: {}",
            pyproject.display()
@@ -130,17 +130,13 @@ pub fn resolve(
    // `pyproject::find_settings_toml`.)
    // However, there may be a `pyproject.toml` with a `requires-python`
    // specified, and that is what we look for in this step.
    let fallback = find_fallback_target_version(
        stdin_filename
            .as_ref()
            .unwrap_or(&path_dedot::CWD.as_path()),
    );
    let fallback = find_fallback_target_version(stdin_filename.unwrap_or(&cwd));
    if let Some(version) = fallback {
        debug!("Derived `target-version` from found `requires-python`: {version:?}");
    }
    config.target_version = fallback.map(ast::PythonVersion::from);
}
let settings = config.into_settings(&path_dedot::CWD)?;
let settings = config.into_settings(&cwd)?;
Ok(PyprojectConfig::new(
    PyprojectDiscoveryStrategy::Hierarchical,
    settings,

@@ -422,3 +422,153 @@ fn nested_imports() -> Result<()> {

    Ok(())
}

/// Test for venv resolution with the `--python` flag.
///
/// Based on the [albatross-virtual-workspace] example from the uv repo and the report in [#16598].
///
/// [albatross-virtual-workspace]: https://github.com/astral-sh/uv/tree/aa629c4a/scripts/workspaces/albatross-virtual-workspace
/// [#16598]: https://github.com/astral-sh/ruff/issues/16598
#[test]
fn venv() -> Result<()> {
    let tempdir = TempDir::new()?;
    let root = ChildPath::new(tempdir.path());

    // packages
    // ├── albatross
    // │   ├── check_installed_albatross.py
    // │   ├── pyproject.toml
    // │   └── src
    // │       └── albatross
    // │           └── __init__.py
    // └── bird-feeder
    //     ├── check_installed_bird_feeder.py
    //     ├── pyproject.toml
    //     └── src
    //         └── bird_feeder
    //             └── __init__.py

    let packages = root.child("packages");

    let albatross = packages.child("albatross");
    albatross
        .child("check_installed_albatross.py")
        .write_str("from albatross import fly")?;
    albatross
        .child("pyproject.toml")
        .write_str(indoc::indoc! {r#"
            [project]
            name = "albatross"
            version = "0.1.0"
            requires-python = ">=3.12"
            dependencies = ["bird-feeder", "tqdm>=4,<5"]

            [tool.uv.sources]
            bird-feeder = { workspace = true }
        "#})?;
    albatross
        .child("src")
        .child("albatross")
        .child("__init__.py")
        .write_str("import tqdm; from bird_feeder import use")?;

    let bird_feeder = packages.child("bird-feeder");
    bird_feeder
        .child("check_installed_bird_feeder.py")
        .write_str("from bird_feeder import use; from albatross import fly")?;
    bird_feeder
        .child("pyproject.toml")
        .write_str(indoc::indoc! {r#"
            [project]
            name = "bird-feeder"
            version = "1.0.0"
            requires-python = ">=3.12"
            dependencies = ["anyio>=4.3.0,<5"]
        "#})?;
    bird_feeder
        .child("src")
        .child("bird_feeder")
        .child("__init__.py")
        .write_str("import anyio")?;

    let venv = root.child(".venv");
    let bin = venv.child("bin");
    bin.child("python").touch()?;
    let home = format!("home = {}", bin.to_string_lossy());
    venv.child("pyvenv.cfg").write_str(&home)?;
    let site_packages = venv.child("lib").child("python3.12").child("site-packages");
    site_packages
        .child("_albatross.pth")
        .write_str(&albatross.join("src").to_string_lossy())?;
    site_packages
        .child("_bird_feeder.pth")
        .write_str(&bird_feeder.join("src").to_string_lossy())?;
    site_packages.child("tqdm").child("__init__.py").touch()?;
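The setup above fakes a virtual environment by hand: a `pyvenv.cfg` with a `home =` line pointing at the interpreter's `bin` directory, plus `.pth` files in `site-packages` that name extra directories to add to the import search path. A minimal sketch of reading such a `home` entry (a hypothetical helper, not the resolver ruff/ty actually uses):

```rust
// Extract the `home` value from pyvenv.cfg-style text.
// Keys and values are separated by `=`; surrounding whitespace is ignored.
fn pyvenv_home(cfg: &str) -> Option<&str> {
    cfg.lines().find_map(|line| {
        let (key, value) = line.split_once('=')?;
        (key.trim() == "home").then(|| value.trim())
    })
}
```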

    // without `--python .venv`, the result should only include dependencies within the albatross
    // package
    insta::with_settings!({
        filters => INSTA_FILTERS.to_vec(),
    }, {
        assert_cmd_snapshot!(
            command().arg("packages/albatross").current_dir(&root),
            @r#"
        success: true
        exit_code: 0
        ----- stdout -----
        {
          "packages/albatross/check_installed_albatross.py": [
            "packages/albatross/src/albatross/__init__.py"
          ],
          "packages/albatross/src/albatross/__init__.py": []
        }

        ----- stderr -----
        "#);
    });

    // with `--python .venv` both workspace and third-party dependencies are included
    insta::with_settings!({
        filters => INSTA_FILTERS.to_vec(),
    }, {
        assert_cmd_snapshot!(
            command().args(["--python", ".venv"]).arg("packages/albatross").current_dir(&root),
            @r#"
        success: true
        exit_code: 0
        ----- stdout -----
        {
          "packages/albatross/check_installed_albatross.py": [
            "packages/albatross/src/albatross/__init__.py"
          ],
          "packages/albatross/src/albatross/__init__.py": [
            ".venv/lib/python3.12/site-packages/tqdm/__init__.py",
            "packages/bird-feeder/src/bird_feeder/__init__.py"
          ]
        }

        ----- stderr -----
        "#);
    });

    // test the error message for a non-existent venv. it's important that the `ruff analyze graph`
    // flag matches the ty flag used to generate the error message (`--python`)
    insta::with_settings!({
        filters => INSTA_FILTERS.to_vec(),
    }, {
        assert_cmd_snapshot!(
            command().args(["--python", "none"]).arg("packages/albatross").current_dir(&root),
            @r"
        success: false
        exit_code: 2
        ----- stdout -----

        ----- stderr -----
        ruff failed
          Cause: Invalid search path settings
          Cause: Failed to discover the site-packages directory: Invalid `--python` argument: `none` could not be canonicalized
        ");
    });

    Ok(())
}

30 crates/ruff/tests/direxist_guard.rs Normal file
@@ -0,0 +1,30 @@
//! Test to verify Ruff's behavior when run from a deleted directory.
//! It has to be isolated in a separate module.
//! Tests in the same module become flaky under `cargo test`'s parallel execution
//! due to in-test working directory manipulation.

#![cfg(target_family = "unix")]

use std::env::set_current_dir;
use std::process::Command;

use insta_cmd::{assert_cmd_snapshot, get_cargo_bin};
const BIN_NAME: &str = "ruff";

#[test]
fn check_in_deleted_directory_errors() {
    let temp_dir = tempfile::tempdir().unwrap();
    let temp_path = temp_dir.path().to_path_buf();
    set_current_dir(&temp_path).unwrap();
    drop(temp_dir);

    assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME)).arg("check"), @r###"
    success: false
    exit_code: 2
    ----- stdout -----

    ----- stderr -----
    ruff failed
      Cause: Working directory does not exist
    "###);
}

@@ -1054,6 +1054,52 @@ def mvce(keys, values):
    "#);
}

#[test]
fn show_statistics_syntax_errors() {
    let mut cmd = RuffCheck::default()
        .args(["--statistics", "--target-version=py39", "--preview"])
        .build();

    // ParseError
    assert_cmd_snapshot!(
        cmd.pass_stdin("x ="),
        @r"
    success: false
    exit_code: 1
    ----- stdout -----
    1	syntax-error
    Found 1 error.

    ----- stderr -----
    ");

    // match before 3.10, UnsupportedSyntaxError
    assert_cmd_snapshot!(
        cmd.pass_stdin("match 2:\n    case 1: ..."),
        @r"
    success: false
    exit_code: 1
    ----- stdout -----
    1	syntax-error
    Found 1 error.

    ----- stderr -----
    ");

    // rebound comprehension variable, SemanticSyntaxError
    assert_cmd_snapshot!(
        cmd.pass_stdin("[x := 1 for x in range(0)]"),
        @r"
    success: false
    exit_code: 1
    ----- stdout -----
    1	syntax-error
    Found 1 error.

    ----- stderr -----
    ");
}

#[test]
fn preview_enabled_prefix() {
    // All the RUF9XX test rules should be triggered

@@ -2187,3 +2233,195 @@ unfixable = ["RUF"]

    Ok(())
}

#[test]
fn pyproject_toml_stdin_syntax_error() {
    let mut cmd = RuffCheck::default()
        .args(["--stdin-filename", "pyproject.toml", "--select", "RUF200"])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin("[project"),
        @r"
    success: false
    exit_code: 1
    ----- stdout -----
    pyproject.toml:1:9: RUF200 Failed to parse pyproject.toml: invalid table header
    expected `.`, `]`
      |
    1 | [project
      |         ^ RUF200
      |

    Found 1 error.

    ----- stderr -----
    "
    );
}

#[test]
fn pyproject_toml_stdin_schema_error() {
    let mut cmd = RuffCheck::default()
        .args(["--stdin-filename", "pyproject.toml", "--select", "RUF200"])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin("[project]\nname = 1"),
        @r"
    success: false
    exit_code: 1
    ----- stdout -----
    pyproject.toml:2:8: RUF200 Failed to parse pyproject.toml: invalid type: integer `1`, expected a string
      |
    1 | [project]
    2 | name = 1
      |        ^ RUF200
      |

    Found 1 error.

    ----- stderr -----
    "
    );
}

#[test]
fn pyproject_toml_stdin_no_applicable_rules_selected() {
    let mut cmd = RuffCheck::default()
        .args(["--stdin-filename", "pyproject.toml"])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin("[project"),
        @r"
    success: true
    exit_code: 0
    ----- stdout -----
    All checks passed!

    ----- stderr -----
    "
    );
}

#[test]
fn pyproject_toml_stdin_no_applicable_rules_selected_2() {
    let mut cmd = RuffCheck::default()
        .args(["--stdin-filename", "pyproject.toml", "--select", "F401"])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin("[project"),
        @r"
    success: true
    exit_code: 0
    ----- stdout -----
    All checks passed!

    ----- stderr -----
    "
    );
}

#[test]
fn pyproject_toml_stdin_no_errors() {
    let mut cmd = RuffCheck::default()
        .args(["--stdin-filename", "pyproject.toml"])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin(r#"[project]\nname = "ruff"\nversion = "0.0.0""#),
        @r"
    success: true
    exit_code: 0
    ----- stdout -----
    All checks passed!

    ----- stderr -----
    "
    );
}

#[test]
fn pyproject_toml_stdin_schema_error_fix() {
    let mut cmd = RuffCheck::default()
        .args([
            "--stdin-filename",
            "pyproject.toml",
            "--select",
            "RUF200",
            "--fix",
        ])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin("[project]\nname = 1"),
        @r"
    success: false
    exit_code: 1
    ----- stdout -----
    [project]
    name = 1
    ----- stderr -----
    pyproject.toml:2:8: RUF200 Failed to parse pyproject.toml: invalid type: integer `1`, expected a string
      |
    1 | [project]
    2 | name = 1
      |        ^ RUF200
      |

    Found 1 error.
    "
    );
}

#[test]
fn pyproject_toml_stdin_schema_error_fix_only() {
    let mut cmd = RuffCheck::default()
        .args([
            "--stdin-filename",
            "pyproject.toml",
            "--select",
            "RUF200",
            "--fix-only",
        ])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin("[project]\nname = 1"),
        @r"
    success: true
    exit_code: 0
    ----- stdout -----
    [project]
    name = 1
    ----- stderr -----
    "
    );
}

#[test]
fn pyproject_toml_stdin_schema_error_fix_diff() {
    let mut cmd = RuffCheck::default()
        .args([
            "--stdin-filename",
            "pyproject.toml",
            "--select",
            "RUF200",
            "--fix",
            "--diff",
        ])
        .build();

    assert_cmd_snapshot!(
        cmd.pass_stdin("[project]\nname = 1"),
        @r"
    success: true
    exit_code: 0
    ----- stdout -----

    ----- stderr -----
    "
    );
}

@@ -994,6 +994,7 @@ fn value_given_to_table_key_is_not_inline_table_2() {
    - `lint.extend-per-file-ignores`
    - `lint.exclude`
    - `lint.preview`
    - `lint.typing-extensions`

    For more information, try '--help'.
    ");
@@ -1899,6 +1900,40 @@ def first_square():
    Ok(())
}

/// Regression test for <https://github.com/astral-sh/ruff/issues/2253>
#[test]
fn add_noqa_parent() -> Result<()> {
    let tempdir = TempDir::new()?;
    let test_path = tempdir.path().join("noqa.py");
    fs::write(
        &test_path,
        r#"
from foo import (  # noqa: F401
    bar
)
"#,
    )?;

    insta::with_settings!({
        filters => vec![(tempdir_filter(&tempdir).as_str(), "[TMP]/")]
    }, {
        assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
            .args(STDIN_BASE_OPTIONS)
            .arg("--add-noqa")
            .arg("--select=F401")
            .arg("noqa.py")
            .current_dir(&tempdir), @r"
        success: true
        exit_code: 0
        ----- stdout -----

        ----- stderr -----
        ");
    });

    Ok(())
}

/// Infer `3.11` from `requires-python` in `pyproject.toml`.
#[test]
fn requires_python() -> Result<()> {
@@ -2117,7 +2152,7 @@ requires-python = ">= 3.11"
            .arg("test.py")
            .arg("-")
            .current_dir(project_dir)
            , @r###"
            , @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -2207,6 +2242,7 @@ requires-python = ">= 3.11"
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -2244,6 +2280,7 @@ requires-python = ">= 3.11"
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -2389,7 +2426,7 @@ requires-python = ">= 3.11"
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });
    Ok(())
}

@@ -2427,7 +2464,7 @@ requires-python = ">= 3.11"
            .arg("test.py")
            .arg("-")
            .current_dir(project_dir)
            , @r###"
            , @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -2517,6 +2554,7 @@ requires-python = ">= 3.11"
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -2554,6 +2592,7 @@ requires-python = ">= 3.11"
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -2699,7 +2738,7 @@ requires-python = ">= 3.11"
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });
    Ok(())
}

@@ -2788,7 +2827,7 @@ from typing import Union;foo: Union[int, str] = 1
            .args(STDIN_BASE_OPTIONS)
            .arg("test.py")
            .arg("--show-settings")
            .current_dir(project_dir), @r###"
            .current_dir(project_dir), @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -2879,6 +2918,7 @@ from typing import Union;foo: Union[int, str] = 1
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -2916,6 +2956,7 @@ from typing import Union;foo: Union[int, str] = 1
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -3061,7 +3102,7 @@ from typing import Union;foo: Union[int, str] = 1
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });
    Ok(())
}

@@ -3167,7 +3208,7 @@ from typing import Union;foo: Union[int, str] = 1
            .arg("--show-settings")
            .args(["--select","UP007"])
            .arg("foo/test.py")
            .current_dir(&project_dir), @r###"
            .current_dir(&project_dir), @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -3257,6 +3298,7 @@ from typing import Union;foo: Union[int, str] = 1
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -3294,6 +3336,7 @@ from typing import Union;foo: Union[int, str] = 1
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -3439,7 +3482,7 @@ from typing import Union;foo: Union[int, str] = 1
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });
    Ok(())
}

@@ -3471,7 +3514,7 @@ requires-python = ">= 3.11"
        &inner_pyproject,
        r#"
[tool.ruff]
target-version = "py310"
target-version = "py310"
"#,
    )?;

@@ -3493,7 +3536,7 @@ from typing import Union;foo: Union[int, str] = 1
            .arg("--show-settings")
            .args(["--select","UP007"])
            .arg("foo/test.py")
            .current_dir(&project_dir), @r###"
            .current_dir(&project_dir), @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -3583,6 +3626,7 @@ from typing import Union;foo: Union[int, str] = 1
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -3620,6 +3664,7 @@ from typing import Union;foo: Union[int, str] = 1
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -3765,7 +3810,7 @@ from typing import Union;foo: Union[int, str] = 1
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });
    Ok(())
}

@@ -3818,7 +3863,7 @@ from typing import Union;foo: Union[int, str] = 1
            .args(STDIN_BASE_OPTIONS)
            .arg("--show-settings")
            .arg("foo/test.py")
            .current_dir(&project_dir), @r###"
            .current_dir(&project_dir), @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -3885,7 +3930,7 @@ from typing import Union;foo: Union[int, str] = 1
        linter.per_file_ignores = {}
        linter.safety_table.forced_safe = []
        linter.safety_table.forced_unsafe = []
        linter.unresolved_target_version = 3.9
        linter.unresolved_target_version = none
        linter.per_file_target_version = {}
        linter.preview = disabled
        linter.explicit_preview_rules = false
@@ -3909,6 +3954,7 @@ from typing import Union;foo: Union[int, str] = 1
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -3946,6 +3992,7 @@ from typing import Union;foo: Union[int, str] = 1
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -4091,7 +4138,7 @@ from typing import Union;foo: Union[int, str] = 1
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });

    insta::with_settings!({
@@ -4101,7 +4148,7 @@ from typing import Union;foo: Union[int, str] = 1
            .args(STDIN_BASE_OPTIONS)
            .arg("--show-settings")
            .arg("test.py")
            .current_dir(project_dir.join("foo")), @r###"
            .current_dir(project_dir.join("foo")), @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -4168,7 +4215,7 @@ from typing import Union;foo: Union[int, str] = 1
        linter.per_file_ignores = {}
        linter.safety_table.forced_safe = []
        linter.safety_table.forced_unsafe = []
        linter.unresolved_target_version = 3.9
        linter.unresolved_target_version = none
        linter.per_file_target_version = {}
        linter.preview = disabled
        linter.explicit_preview_rules = false
@@ -4192,6 +4239,7 @@ from typing import Union;foo: Union[int, str] = 1
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -4229,6 +4277,7 @@ from typing import Union;foo: Union[int, str] = 1
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -4374,7 +4423,7 @@ from typing import Union;foo: Union[int, str] = 1
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });
    Ok(())
}

@@ -4437,7 +4486,7 @@ from typing import Union;foo: Union[int, str] = 1
            .args(STDIN_BASE_OPTIONS)
            .arg("--show-settings")
            .arg("test.py")
            .current_dir(&project_dir), @r###"
            .current_dir(&project_dir), @r#"
        success: true
        exit_code: 0
        ----- stdout -----
@@ -4528,6 +4577,7 @@ from typing import Union;foo: Union[int, str] = 1
        	XXX,
        ]
        linter.typing_modules = []
        linter.typing_extensions = true

        # Linter Plugins
        linter.flake8_annotations.mypy_init_return = false
@@ -4565,6 +4615,7 @@ from typing import Union;foo: Union[int, str] = 1
        	matplotlib.pyplot = plt,
        	networkx = nx,
        	numpy = np,
        	numpy.typing = npt,
        	pandas = pd,
        	panel = pn,
        	plotly.express = px,
@@ -4710,7 +4761,7 @@ from typing import Union;foo: Union[int, str] = 1
        analyze.include_dependencies = {}

        ----- stderr -----
        "###);
        "#);
    });

    Ok(())
@@ -4972,6 +5023,53 @@ fn flake8_import_convention_unused_aliased_import_no_conflict() {
        .pass_stdin("1"));
}

// See: https://github.com/astral-sh/ruff/issues/16177
#[test]
fn flake8_pyi_redundant_none_literal() {
    let snippet = r#"
from typing import Literal

# For each of these expressions, Ruff provides a fix for one of the `Literal[None]` elements
# but not both, as if both were autofixed it would result in `None | None`,
# which leads to a `TypeError` at runtime.
a: Literal[None,] | Literal[None,]
b: Literal[None] | Literal[None]
c: Literal[None] | Literal[None,]
d: Literal[None,] | Literal[None]
"#;

    assert_cmd_snapshot!(Command::new(get_cargo_bin(BIN_NAME))
        .args(STDIN_BASE_OPTIONS)
        .args(["--select", "PYI061"])
        .args(["--stdin-filename", "test.py"])
        .arg("--preview")
        .arg("--diff")
        .arg("-")
        .pass_stdin(snippet), @r"
    success: false
    exit_code: 1
    ----- stdout -----
    --- test.py
    +++ test.py
    @@ -4,7 +4,7 @@
     # For each of these expressions, Ruff provides a fix for one of the `Literal[None]` elements
     # but not both, as if both were autofixed it would result in `None | None`,
     # which leads to a `TypeError` at runtime.
    -a: Literal[None,] | Literal[None,]
    -b: Literal[None] | Literal[None]
    -c: Literal[None] | Literal[None,]
    -d: Literal[None,] | Literal[None]
    +a: None | Literal[None,]
    +b: None | Literal[None]
    +c: None | Literal[None,]
    +d: None | Literal[None]


    ----- stderr -----
    Would fix 4 errors.
    ");
}

/// Test that private, old-style `TypeVar` generics
/// 1. Get replaced with PEP 695 type parameters (UP046, UP047)
/// 2. Get renamed to remove leading underscores (UP049)
@@ -5167,8 +5265,8 @@ fn a005_module_shadowing_non_strict() -> Result<()> {
}

/// Test A005 with `strict-checking` unset
/// TODO(brent) This should currently match the strict version, but after the next minor
/// release it will match the non-strict version directly above
///
/// This should match the non-strict version directly above
#[test]
fn a005_module_shadowing_strict_default() -> Result<()> {
    let tempdir = TempDir::new()?;
@@ -5491,3 +5589,99 @@ fn cookiecutter_globbing_no_project_root() -> Result<()> {

    Ok(())
}

/// Test that semantic syntax errors (1) are emitted, (2) are not cached, (3) don't affect the
|
||||
/// reporting of normal diagnostics, and (4) are not suppressed by `select = []` (or otherwise
|
||||
/// disabling all AST-based rules).
|
||||
#[test]
|
||||
fn semantic_syntax_errors() -> Result<()> {
|
||||
let tempdir = TempDir::new()?;
|
||||
let contents = "[(x := 1) for x in foo]";
|
||||
fs::write(tempdir.path().join("main.py"), contents)?;
|
||||
|
||||
let mut cmd = Command::new(get_cargo_bin(BIN_NAME));
|
||||
// inline STDIN_BASE_OPTIONS to remove --no-cache
|
||||
cmd.args(["check", "--output-format", "concise"])
|
||||
.arg("--preview")
|
||||
.arg("--quiet") // suppress `debug build without --no-cache` warnings
|
||||
.current_dir(&tempdir);
|
||||
|
||||
assert_cmd_snapshot!(
|
||||
cmd,
|
||||
@r"
|
||||
success: false
|
||||
exit_code: 1
|
||||
----- stdout -----
|
||||
main.py:1:3: SyntaxError: assignment expression cannot rebind comprehension variable
|
||||
main.py:1:20: F821 Undefined name `foo`
|
||||
|
||||
----- stderr -----
|
||||
"
|
||||
);
|
||||
|
||||
// this should *not* be cached, like normal parse errors
|
||||
assert_cmd_snapshot!(
|
||||
cmd,
|
||||
@r"
|
||||
success: false
|
||||
exit_code: 1
|
||||
----- stdout -----
|
||||
main.py:1:3: SyntaxError: assignment expression cannot rebind comprehension variable
|
||||
main.py:1:20: F821 Undefined name `foo`
|
||||
|
||||
----- stderr -----
|
||||
"
|
||||
);
|
||||
|
||||
// ensure semantic errors are caught even without AST-based rules selected
|
||||
assert_cmd_snapshot!(
|
||||
Command::new(get_cargo_bin(BIN_NAME))
|
||||
.args(STDIN_BASE_OPTIONS)
|
||||
.args(["--config", "lint.select = []"])
|
||||
.arg("--preview")
|
||||
.arg("-")
|
||||
.pass_stdin(contents),
|
||||
@r"
|
||||
success: false
|
||||
exit_code: 1
|
||||
----- stdout -----
|
||||
-:1:3: SyntaxError: assignment expression cannot rebind comprehension variable
|
||||
Found 1 error.
|
||||
|
||||
----- stderr -----
|
||||
"
|
||||
);
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
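The stdin snippet `[(x := 1) for x in foo]` used by the test above trips the same compile-time check in CPython itself. A small sketch demonstrating the semantic syntax error the test asserts Ruff reports:

```python
# CPython rejects an assignment expression that rebinds a comprehension
# iteration variable; compiling the snippet raises SyntaxError.
src = "[(x := 1) for x in foo]"
try:
    compile(src, "main.py", "eval")
except SyntaxError as exc:
    print(f"SyntaxError: {exc.msg}")
```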

/// Regression test for <https://github.com/astral-sh/ruff/issues/17821>.
///
/// `lint.typing-extensions = false` with Python 3.9 should disable the PYI019 lint because it would
/// try to import `Self` from `typing_extensions`.
#[test]
fn combine_typing_extensions_config() {
    let contents = "
from typing import TypeVar
T = TypeVar('T')
class Foo:
    def f(self: T) -> T: ...
";
    assert_cmd_snapshot!(
        Command::new(get_cargo_bin(BIN_NAME))
            .args(STDIN_BASE_OPTIONS)
            .args(["--config", "lint.typing-extensions = false"])
            .arg("--select=PYI019")
            .arg("--target-version=py39")
            .arg("-")
            .pass_stdin(contents),
        @r"
    success: true
    exit_code: 0
    ----- stdout -----
    All checks passed!

    ----- stderr -----
    "
    );
}

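For context, a sketch of the rewrite PYI019 would suggest: replacing the old-style `TypeVar` self-annotation with `Self`, which only lives in `typing` from Python 3.11 onward (before that it is in `typing_extensions`, which the config above opts out of). The `Old` and `New` class names here are illustrative, not from the test.

```python
import sys
from typing import TypeVar

T = TypeVar("T")

class Old:
    # Old-style spelling, as in the stdin snippet above.
    def f(self: T) -> T: ...

if sys.version_info >= (3, 11):
    # PYI019's suggested spelling; before 3.11, `Self` exists only in
    # `typing_extensions`, which `lint.typing-extensions = false` forbids.
    from typing import Self

    class New:
        def f(self) -> Self: ...
```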
@@ -5,7 +5,6 @@ info:
args:
- rule
- F401
snapshot_kind: text
---
success: true
exit_code: 0
@@ -84,6 +83,11 @@ else:
    print("numpy is not installed")
```

## Preview
When [preview](https://docs.astral.sh/ruff/preview/) is enabled,
the criterion for determining whether an import is first-party
is stricter, which could affect the suggested fix. See [this FAQ section](https://docs.astral.sh/ruff/faq/#how-does-ruff-determine-which-of-my-imports-are-first-party-third-party-etc) for more details.

## Options
- `lint.ignore-init-module-imports`
- `lint.pyflakes.allowed-unused-imports`
@@ -91,6 +95,6 @@ else:
## References
- [Python documentation: `import`](https://docs.python.org/3/reference/simple_stmts.html#the-import-statement)
- [Python documentation: `importlib.util.find_spec`](https://docs.python.org/3/library/importlib.html#importlib.util.find_spec)
- [Typing documentation: interface conventions](https://typing.readthedocs.io/en/latest/source/libraries.html#library-interface-public-and-private-symbols)
- [Typing documentation: interface conventions](https://typing.python.org/en/latest/source/libraries.html#library-interface-public-and-private-symbols)

----- stderr -----

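The F401 docs above reference `importlib.util.find_spec`; a small sketch of the availability-check pattern it enables, probing for a module without importing it (mirroring the `numpy` example in the doc excerpt):

```python
import importlib.util

# find_spec returns None for an absent top-level module instead of
# raising, so availability can be checked before importing.
if importlib.util.find_spec("numpy") is not None:
    print("numpy is available")
else:
    print("numpy is not installed")
```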
@@ -213,6 +213,7 @@ linter.task_tags = [
    XXX,
]
linter.typing_modules = []
linter.typing_extensions = true

# Linter Plugins
linter.flake8_annotations.mypy_init_return = false
@@ -250,6 +251,7 @@ linter.flake8_import_conventions.aliases = {
    matplotlib.pyplot = plt,
    networkx = nx,
    numpy = np,
    numpy.typing = npt,
    pandas = pd,
    panel = pn,
    plotly.express = px,
