New upstream version 1.66.0+dfsg1

Fabian Grünbichler 2023-04-23 19:45:23 +02:00
parent f2b60f7d56
commit 2b03887add
5777 changed files with 135516 additions and 93949 deletions

COPYRIGHT

@ -23,65 +23,251 @@ The Rust Project includes packages written by third parties.
The following third party packages are included, and carry
their own copyright notices and license terms:
* LLVM. Code for this package is found in src/llvm-project.
* LLVM, located in src/llvm-project, is licensed under the following
terms.
Copyright (c) 2003-2013 University of Illinois at
Urbana-Champaign. All rights reserved.
==============================================================================
The LLVM Project is under the Apache License v2.0 with LLVM Exceptions:
==============================================================================
Developed by:
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
LLVM Team
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
University of Illinois at Urbana-Champaign
1. Definitions.
http://llvm.org
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal with the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
* Redistributions of source code must retain the
above copyright notice, this list of conditions
and the following disclaimers.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
* Redistributions in binary form must reproduce the
above copyright notice, this list of conditions
and the following disclaimers in the documentation
and/or other materials provided with the
distribution.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
* Neither the names of the LLVM Team, University of
Illinois at Urbana-Champaign, nor the names of its
contributors may be used to endorse or promote
products derived from this Software without
specific prior written permission.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE CONTRIBUTORS OR COPYRIGHT HOLDERS BE LIABLE
FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT
OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS WITH THE SOFTWARE.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
* Additional libraries included in LLVM carry separate
BSD-compatible licenses. See src/llvm-project/llvm/LICENSE.TXT
for details.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
* compiler-rt, in src/compiler-rt is dual licensed under
LLVM's license and MIT:
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
Copyright (c) 2009-2014 by the contributors listed in
CREDITS.TXT
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
---- LLVM Exceptions to the Apache 2.0 License ----
As an exception, if, as a result of your compiling your source code, portions
of this Software are embedded into an Object form of such source code, you
may redistribute such embedded portions in such Object form without complying
with the conditions of Sections 4(a), 4(b) and 4(d) of the License.
In addition, if you combine or link compiled forms of this Software with
software that is licensed under the GPLv2 ("Combined Software") and if a
court of competent jurisdiction determines that the patent provision (Section
3), the indemnity provision (Section 9) or other Section of the License
conflicts with the conditions of the GPLv2, you may retroactively and
prospectively choose to deem waived or otherwise exclude such Section(s) of
the License, but only in their entirety and only with respect to the Combined
Software.
==============================================================================
Software from third parties included in the LLVM Project:
==============================================================================
The LLVM Project contains third party software which is under different license
terms. All such code will be identified clearly using at least one of two
mechanisms:
1) It will be in a separate directory tree with its own `LICENSE.txt` or
`LICENSE` file at the top containing the specific license and restrictions
which apply to that software, or
2) It will contain specific license and restriction terms at the top of every
file.
==============================================================================
Legacy LLVM License (https://llvm.org/docs/DeveloperPolicy.html#legacy):
==============================================================================
University of Illinois/NCSA
Open Source License
Copyright (c) 2003-2019 University of Illinois at Urbana-Champaign.
All rights reserved.
Developed by:
@ -92,70 +278,32 @@ their own copyright notices and license terms:
http://llvm.org
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal with the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal with
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
* Redistributions of source code must retain the
above copyright notice, this list of conditions
and the following disclaimers.
* Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimers.
* Redistributions in binary form must reproduce the
above copyright notice, this list of conditions
and the following disclaimers in the documentation
and/or other materials provided with the
distribution.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimers in the
documentation and/or other materials provided with the distribution.
* Neither the names of the LLVM Team, University of
Illinois at Urbana-Champaign, nor the names of its
contributors may be used to endorse or promote
products derived from this Software without
specific prior written permission.
* Neither the names of the LLVM Team, University of Illinois at
Urbana-Champaign, nor the names of its contributors may be used to
endorse or promote products derived from this Software without specific
prior written permission.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE CONTRIBUTORS OR COPYRIGHT HOLDERS BE LIABLE
FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT
OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS WITH THE SOFTWARE.
========================================================
Copyright (c) 2009-2014 by the contributors listed in
CREDITS.TXT
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
CONTRIBUTORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS WITH THE
SOFTWARE.
* Portions of the FFI code for interacting with the native ABI
is derived from the Clay programming language, which carries
@ -191,41 +339,3 @@ their own copyright notices and license terms:
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
OF SUCH DAMAGE.
* libbacktrace, under src/libbacktrace:
Copyright (C) 2012-2014 Free Software Foundation, Inc.
Written by Ian Lance Taylor, Google.
Redistribution and use in source and binary forms, with
or without modification, are permitted provided that the
following conditions are met:
(1) Redistributions of source code must retain the
above copyright notice, this list of conditions and
the following disclaimer.
(2) Redistributions in binary form must reproduce
the above copyright notice, this list of conditions
and the following disclaimer in the documentation
and/or other materials provided with the
distribution.
(3) The name of the author may not be used to
endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY
AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN
NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT,
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF
USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
OF SUCH DAMAGE. */

Cargo.lock

@ -288,7 +288,7 @@ dependencies = [
[[package]]
name = "cargo"
version = "0.66.0"
version = "0.67.0"
dependencies = [
"anyhow",
"atty",
@ -297,7 +297,7 @@ dependencies = [
"cargo-test-macro",
"cargo-test-support",
"cargo-util",
"clap",
"clap 4.0.15",
"crates-io",
"curl",
"curl-sys",
@ -330,8 +330,10 @@ dependencies = [
"pretty_env_logger",
"rustc-workspace-hack",
"rustfix",
"same-file",
"semver",
"serde",
"serde-value",
"serde_ignored",
"serde_json",
"shell-escape",
@ -383,11 +385,12 @@ version = "0.1.0"
dependencies = [
"cargo_metadata 0.15.0",
"directories",
"rustc-build-sysroot",
"rustc-workspace-hack",
"rustc_tools_util",
"rustc_version",
"serde",
"serde_json",
"vergen",
]
[[package]]
@ -417,6 +420,7 @@ dependencies = [
"anyhow",
"cargo-test-macro",
"cargo-util",
"crates-io",
"filetime",
"flate2",
"git2",
@ -507,6 +511,10 @@ name = "cfg-if"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"
dependencies = [
"compiler_builtins",
"rustc-std-workspace-core",
]
[[package]]
name = "chalk-derive"
@ -584,7 +592,7 @@ dependencies = [
"atty",
"bitflags",
"clap_derive",
"clap_lex",
"clap_lex 0.2.2",
"indexmap",
"once_cell",
"strsim",
@ -592,13 +600,26 @@ dependencies = [
"textwrap",
]
[[package]]
name = "clap"
version = "4.0.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6bf8832993da70a4c6d13c581f4463c2bdda27b9bf1c5498dc4365543abe6d6f"
dependencies = [
"atty",
"bitflags",
"clap_lex 0.3.0",
"strsim",
"termcolor",
]
[[package]]
name = "clap_complete"
version = "3.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df6f3613c0a3cddfd78b41b10203eb322cb29b600cbdf808a7d3db95691b8e25"
dependencies = [
"clap",
"clap 3.2.20",
]
[[package]]
@ -623,9 +644,18 @@ dependencies = [
"os_str_bytes",
]
[[package]]
name = "clap_lex"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d4198f73e42b4936b35b5bb248d81d2b595ecb170da0bac7655c54eedfa8da8"
dependencies = [
"os_str_bytes",
]
[[package]]
name = "clippy"
version = "0.1.65"
version = "0.1.66"
dependencies = [
"clippy_lints",
"clippy_utils",
@ -657,7 +687,7 @@ name = "clippy_dev"
version = "0.0.1"
dependencies = [
"aho-corasick",
"clap",
"clap 3.2.20",
"indoc",
"itertools",
"opener",
@ -668,7 +698,7 @@ dependencies = [
[[package]]
name = "clippy_lints"
version = "0.1.65"
version = "0.1.66"
dependencies = [
"cargo_metadata 0.14.0",
"clippy_utils",
@ -690,7 +720,7 @@ dependencies = [
[[package]]
name = "clippy_utils"
version = "0.1.65"
version = "0.1.66"
dependencies = [
"arrayvec",
"if_chain",
@ -766,9 +796,9 @@ dependencies = [
[[package]]
name = "compiler_builtins"
version = "0.1.79"
version = "0.1.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4f873ce2bd3550b0b565f878b3d04ea8253f4259dc3d20223af2e1ba86f5ecca"
checksum = "18cd7635fea7bb481ea543b392789844c1ad581299da70184c7175ce3af76603"
dependencies = [
"cc",
"rustc-std-workspace-core",
@ -799,9 +829,9 @@ dependencies = [
[[package]]
name = "compiletest_rs"
version = "0.8.0"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "262134ef87408da1ddfe45e33daa0ca43b75286d6b1076446e602d264cf9847e"
checksum = "70489bbb718aea4f92e5f48f2e3b5be670c2051de30e57cb6e5377b4aa08b372"
dependencies = [
"diff",
"filetime",
@ -1075,9 +1105,9 @@ dependencies = [
[[package]]
name = "directories"
version = "3.0.2"
version = "4.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e69600ff1703123957937708eb27f7a564e48885c537782722ed0ba3189ce1d7"
checksum = "f51c5d4ddabd36886dd3e1438cb358cdcb0d7c499cb99cb4ac2e38e18b5cb210"
dependencies = [
"dirs-sys",
]
@ -1167,26 +1197,6 @@ dependencies = [
"log",
]
[[package]]
name = "enum-iterator"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c79a6321a1197d7730510c7e3f6cb80432dfefecb32426de8cea0aa19b4bb8d7"
dependencies = [
"enum-iterator-derive",
]
[[package]]
name = "enum-iterator-derive"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e94aa31f7c0dc764f57896dc615ddd76fc13b0d5dca7eb6cc5e018a5a09ec06"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "env_logger"
version = "0.7.1"
@ -1524,18 +1534,6 @@ dependencies = [
"wasi 0.9.0+wasi-snapshot-preview1",
]
[[package]]
name = "getset"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24b328c01a4d71d2d8173daa93562a73ab0fe85616876f02500f53d82948c504"
dependencies = [
"proc-macro-error",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "gimli"
version = "0.25.0"
@ -1802,7 +1800,7 @@ name = "installer"
version = "0.0.0"
dependencies = [
"anyhow",
"clap",
"clap 3.2.20",
"flate2",
"lazy_static",
"num_cpus",
@ -1845,9 +1843,9 @@ dependencies = [
[[package]]
name = "itertools"
version = "0.10.1"
version = "0.10.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "69ddb889f9d0d08a67338271fa9b62996bc788c7796a5c18cf057420aaed5eaf"
checksum = "b0fd2260e829bddf4cb6ea802289de2f86d6a7a690192fbe91b3f46e0f2c8473"
dependencies = [
"either",
]
@ -1937,9 +1935,9 @@ checksum = "830d08ce1d1d941e6b30645f1a0eb5643013d835ce3779a5fc208261dbe10f55"
[[package]]
name = "libc"
version = "0.2.131"
version = "0.2.135"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "04c3b4822ccebfa39c02fc03d1534441b22ead323fa0f48bb7ddd8e6ba076a40"
checksum = "68783febc7782c6c5cb401fbda4de5a9898be1762314da0bb2c10ced61f18b0c"
dependencies = [
"rustc-std-workspace-core",
]
@ -2145,7 +2143,7 @@ dependencies = [
"ammonia",
"anyhow",
"chrono",
"clap",
"clap 3.2.20",
"clap_complete",
"elasticlunr-rs",
"env_logger 0.9.0",
@ -2264,6 +2262,7 @@ dependencies = [
"rand 0.8.5",
"regex",
"rustc-workspace-hack",
"rustc_version",
"shell-escape",
"smallvec",
"ui_test",
@ -2414,6 +2413,15 @@ dependencies = [
"vcpkg",
]
[[package]]
name = "ordered-float"
version = "2.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7940cf2ca942593318d07fcf2596cdca60a85c9e7fab408a5e21a4f9dcd40d87"
dependencies = [
"num-traits",
]
[[package]]
name = "os_info"
version = "3.5.0"
@ -2645,9 +2653,9 @@ checksum = "8b870d8c151b6f2fb93e84a13146138f05d02ed11c7e7c54f8826aaaf7c9f184"
[[package]]
name = "pkg-config"
version = "0.3.18"
version = "0.3.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d36492546b6af1463394d46f0c834346f31548646f6ba10849802c9c9a27ac33"
checksum = "1df8c4ec4b0627e53bdf214615ad287367e482558cf84b109250b37464dc03ae"
[[package]]
name = "polonius-engine"
@ -2714,11 +2722,11 @@ checksum = "dbf0c48bc1d91375ae5c3cd81e3722dff1abcf81a30960240640d223f59fe0e5"
[[package]]
name = "proc-macro2"
version = "1.0.37"
version = "1.0.46"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec757218438d5fda206afc041538b2f6d889286160d649a86a24d37e1235afd1"
checksum = "94e2ef8dbfc347b10c094890f778ee2e36ca9bb4262e86dc99cd217e35f3470b"
dependencies = [
"unicode-xid",
"unicode-ident",
]
[[package]]
@ -3008,11 +3016,22 @@ dependencies = [
name = "rustbook"
version = "0.1.0"
dependencies = [
"clap",
"clap 3.2.20",
"env_logger 0.7.1",
"mdbook",
]
[[package]]
name = "rustc-build-sysroot"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec5f3689b6c560d6a3a17fcbe54204cd870b4fcf46342d60de16715b660d2c92"
dependencies = [
"anyhow",
"rustc_version",
"tempfile",
]
[[package]]
name = "rustc-demangle"
version = "0.1.21"
@ -3095,7 +3114,7 @@ name = "rustc-workspace-hack"
version = "1.0.0"
dependencies = [
"bstr",
"clap",
"clap 3.2.20",
"libz-sys",
"regex",
"serde_json",
@ -3259,7 +3278,6 @@ dependencies = [
"bitflags",
"cstr",
"libc",
"libloading",
"measureme",
"object 0.29.0",
"rustc-demangle",
@ -3282,6 +3300,7 @@ dependencies = [
"rustc_symbol_mangling",
"rustc_target",
"smallvec",
"tempfile",
"tracing",
]
@ -3391,6 +3410,7 @@ dependencies = [
"rustc_errors",
"rustc_feature",
"rustc_hir",
"rustc_hir_analysis",
"rustc_hir_pretty",
"rustc_interface",
"rustc_lint",
@ -3404,7 +3424,6 @@ dependencies = [
"rustc_session",
"rustc_span",
"rustc_target",
"rustc_typeck",
"serde_json",
"tracing",
"winapi",
@ -3435,6 +3454,8 @@ version = "0.0.0"
dependencies = [
"annotate-snippets",
"atty",
"rustc_ast",
"rustc_ast_pretty",
"rustc_data_structures",
"rustc_error_messages",
"rustc_hir",
@ -3509,6 +3530,34 @@ dependencies = [
"tracing",
]
[[package]]
name = "rustc_hir_analysis"
version = "0.0.0"
dependencies = [
"rustc_arena",
"rustc_ast",
"rustc_attr",
"rustc_data_structures",
"rustc_errors",
"rustc_feature",
"rustc_graphviz",
"rustc_hir",
"rustc_hir_pretty",
"rustc_index",
"rustc_infer",
"rustc_lint",
"rustc_macros",
"rustc_middle",
"rustc_serialize",
"rustc_session",
"rustc_span",
"rustc_target",
"rustc_trait_selection",
"rustc_type_ir",
"smallvec",
"tracing",
]
[[package]]
name = "rustc_hir_pretty"
version = "0.0.0"
@ -3520,6 +3569,32 @@ dependencies = [
"rustc_target",
]
[[package]]
name = "rustc_hir_typeck"
version = "0.1.0"
dependencies = [
"rustc_ast",
"rustc_data_structures",
"rustc_errors",
"rustc_graphviz",
"rustc_hir",
"rustc_hir_analysis",
"rustc_hir_pretty",
"rustc_index",
"rustc_infer",
"rustc_lint",
"rustc_macros",
"rustc_middle",
"rustc_serialize",
"rustc_session",
"rustc_span",
"rustc_target",
"rustc_trait_selection",
"rustc_type_ir",
"smallvec",
"tracing",
]
[[package]]
name = "rustc_incremental"
version = "0.0.0"
@ -3588,6 +3663,8 @@ dependencies = [
"rustc_errors",
"rustc_expand",
"rustc_hir",
"rustc_hir_analysis",
"rustc_hir_typeck",
"rustc_incremental",
"rustc_lint",
"rustc_macros",
@ -3610,7 +3687,6 @@ dependencies = [
"rustc_trait_selection",
"rustc_traits",
"rustc_ty_utils",
"rustc_typeck",
"smallvec",
"tracing",
"winapi",
@ -3735,8 +3811,6 @@ dependencies = [
"either",
"gsgdt",
"polonius-engine",
"rand 0.8.5",
"rand_xoshiro",
"rustc-rayon",
"rustc-rayon-core",
"rustc_apfloat",
@ -3920,12 +3994,12 @@ dependencies = [
"rustc_data_structures",
"rustc_errors",
"rustc_hir",
"rustc_hir_analysis",
"rustc_macros",
"rustc_middle",
"rustc_session",
"rustc_span",
"rustc_trait_selection",
"rustc_typeck",
"tracing",
]
@ -4106,6 +4180,7 @@ version = "0.0.0"
dependencies = [
"bitflags",
"rustc_data_structures",
"rustc_feature",
"rustc_index",
"rustc_macros",
"rustc_serialize",
@ -4116,7 +4191,9 @@ dependencies = [
[[package]]
name = "rustc_tools_util"
version = "0.2.0"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "598f48ce2a421542b3e64828aa742b687cc1b91d2f96591cfdb7ac5988cd6366"
[[package]]
name = "rustc_trait_selection"
@ -4181,6 +4258,8 @@ dependencies = [
name = "rustc_ty_utils"
version = "0.0.0"
dependencies = [
"rand 0.8.5",
"rand_xoshiro",
"rustc_data_structures",
"rustc_errors",
"rustc_hir",
@ -4208,35 +4287,6 @@ dependencies = [
"smallvec",
]
[[package]]
name = "rustc_typeck"
version = "0.0.0"
dependencies = [
"rustc_arena",
"rustc_ast",
"rustc_attr",
"rustc_data_structures",
"rustc_errors",
"rustc_feature",
"rustc_graphviz",
"rustc_hir",
"rustc_hir_pretty",
"rustc_index",
"rustc_infer",
"rustc_lint",
"rustc_macros",
"rustc_middle",
"rustc_serialize",
"rustc_session",
"rustc_span",
"rustc_target",
"rustc_trait_selection",
"rustc_ty_utils",
"rustc_type_ir",
"smallvec",
"tracing",
]
[[package]]
name = "rustc_version"
version = "0.4.0"
@ -4320,7 +4370,7 @@ dependencies = [
"anyhow",
"bytecount",
"cargo_metadata 0.14.0",
"clap",
"clap 3.2.20",
"derive-new",
"diff",
"dirs",
@ -4426,18 +4476,28 @@ dependencies = [
[[package]]
name = "serde"
version = "1.0.143"
version = "1.0.147"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "53e8e5d5b70924f74ff5c6d64d9a5acd91422117c60f48c4e07855238a254553"
checksum = "d193d69bae983fc11a79df82342761dfbf28a99fc8d203dca4c3c1b590948965"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.143"
name = "serde-value"
version = "0.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d3d8e8de557aee63c26b85b947f5e59b690d0454c753f3adeb5cd7835ab88391"
checksum = "f3a1a3341211875ef120e117ea7fd5228530ae7e7036a779fdc9117be6b3282c"
dependencies = [
"ordered-float",
"serde",
]
[[package]]
name = "serde_derive"
version = "1.0.147"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4f1d362ca8fc9c3e3a7484440752472d68a6caa98f1ab81d99b5dfe517cec852"
dependencies = [
"proc-macro2",
"quote",
@ -4613,7 +4673,7 @@ version = "0.0.0"
dependencies = [
"addr2line 0.16.0",
"alloc",
"cfg-if 0.1.10",
"cfg-if 1.0.0",
"compiler_builtins",
"core",
"dlmalloc",
@ -4637,7 +4697,7 @@ dependencies = [
name = "std_detect"
version = "0.1.5"
dependencies = [
"cfg-if 0.1.10",
"cfg-if 1.0.0",
"compiler_builtins",
"libc",
"rustc-std-workspace-alloc",
@ -4687,13 +4747,13 @@ checksum = "73473c0e59e6d5812c5dfe2a064a6444949f089e20eec9a2e5506596494e4623"
[[package]]
name = "syn"
version = "1.0.91"
version = "1.0.102"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b683b2b825c8eef438b77c36a06dc262294da3d5a5813fac20da149241dcd44d"
checksum = "3fcd952facd492f9be3ef0d0b7032a6e442ee9b361d4acc2b1d0c4aaa5f613a1"
dependencies = [
"proc-macro2",
"quote",
"unicode-xid",
"unicode-ident",
]
[[package]]
@ -4886,9 +4946,18 @@ checksum = "29738eedb4388d9ea620eeab9384884fc3f06f586a2eddb56bedc5885126c7c1"
[[package]]
name = "tinyvec"
version = "0.3.4"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "238ce071d267c5710f9d31451efec16c5ee22de34df17cc05e56cbc92e967117"
checksum = "87cc5ceb3875bb20c2890005a4e226a4651264a5c75edb2421b52861a0a0cb50"
dependencies = [
"tinyvec_macros",
]
[[package]]
name = "tinyvec_macros"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cda74da7e1a664f795bb1f8a87ec406fb89a02522cf6e50620d016add6dbbf5c"
[[package]]
name = "tokio"
@ -4912,16 +4981,26 @@ dependencies = [
]
[[package]]
name = "toml_edit"
version = "0.14.3"
name = "toml_datetime"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ba98375fd631b83696f87c64e4ed8e29e6a1f3404d6aed95fa95163bad38e705"
checksum = "808b51e57d0ef8f71115d8f3a01e7d3750d01c79cac4b3eda910f4389fdf92fd"
dependencies = [
"serde",
]
[[package]]
name = "toml_edit"
version = "0.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1541ba70885967e662f69d31ab3aeca7b1aaecfcd58679590b893e9239c3646"
dependencies = [
"combine",
"indexmap",
"itertools",
"kstring",
"serde",
"toml_datetime",
]
[[package]]
@ -5175,25 +5254,31 @@ dependencies = [
]
[[package]]
name = "unicode-normalization"
version = "0.1.13"
name = "unicode-ident"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6fb19cf769fa8c6a80a162df694621ebeb4dafb606470b2b2fce0be40a98a977"
checksum = "6ceab39d59e4c9499d4e5a8ee0e2735b891bb7308ac83dfb4e80cad195c9f6f3"
[[package]]
name = "unicode-normalization"
version = "0.1.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c5713f0fc4b5db668a2ac63cdb7bb4469d8c9fed047b1d0292cc7b0ce2ba921"
dependencies = [
"tinyvec",
]
[[package]]
name = "unicode-script"
version = "0.5.3"
version = "0.5.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "098ec66172ce21cd55f8bcc786ee209dd20e04eff70acfca30cb79924d173ae9"
checksum = "7d817255e1bed6dfd4ca47258685d14d2bdcfbc64fdc9e3819bd5848057b8ecc"
[[package]]
name = "unicode-security"
version = "0.0.5"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5d87c28edc5b263377e448d6cdcb935c06b95413d8013ba6fae470558ccab18f"
checksum = "9ef5756b3097992b934b06608c69f48448a0fbe804bb1e72b982f6d7983e9e63"
dependencies = [
"unicode-normalization",
"unicode-script",
@ -5201,15 +5286,15 @@ dependencies = [
[[package]]
name = "unicode-segmentation"
version = "1.9.0"
version = "1.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7e8820f5d777f6224dc4be3632222971ac30164d4a258d595640799554ebfd99"
checksum = "0fdbf052a0783de01e944a6ce7a8cb939e295b1e7be835a1112c3b9a7f047a5a"
[[package]]
name = "unicode-width"
version = "0.1.8"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9337591893a19b88d8d87f2cec1e73fad5cdfd10e5a6f349f498ad6ea2ffb1e3"
checksum = "c0edd1e5b14653f783770bce4a4dabb4a5108a5370a5f5d8cfe8710c361f6c8b"
dependencies = [
"compiler_builtins",
"rustc-std-workspace-core",
@ -5218,9 +5303,9 @@ dependencies = [
[[package]]
name = "unicode-xid"
version = "0.2.2"
version = "0.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8ccb82d61f80a663efe1f787a51b16b5a51e3314d6ac365b08639f52387b33f3"
checksum = "f962df74c8c05a667b5ee8bcf162993134c104e96440b663c8daa176dc772d8c"
[[package]]
name = "unicode_categories"
@ -5293,21 +5378,6 @@ version = "0.2.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6454029bf181f092ad1b853286f23e2c507d8e8194d01d92da4a55c274a5508c"
[[package]]
name = "vergen"
version = "5.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dfbc87f9a7a9d61b15d51d1d3547284f67b6b4f1494ce3fc5814c101f35a5183"
dependencies = [
"anyhow",
"chrono",
"enum-iterator",
"getset",
"git2",
"rustversion",
"thiserror",
]
[[package]]
name = "version_check"
version = "0.9.3"


@ -1,3 +1,94 @@
Version 1.66.0 (2022-12-15)
==========================
Language
--------
- [Permit specifying explicit discriminants on all `repr(Int)` enums](https://github.com/rust-lang/rust/pull/95710/)
```rust
#[repr(u8)]
enum Foo {
A(u8) = 0,
B(i8) = 1,
C(bool) = 42,
}
```
- [Allow transmutes between the same type differing only in lifetimes](https://github.com/rust-lang/rust/pull/101520/)
- [Change constant evaluation errors from a deny-by-default lint to a hard error](https://github.com/rust-lang/rust/pull/102091/)
- [Trigger `must_use` on `impl Trait` for supertraits](https://github.com/rust-lang/rust/pull/102287/)
This makes `impl ExactSizeIterator` respect the existing `#[must_use]` annotation on `Iterator`.
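  A rough illustration of the effect (the function below is made up): a return type of `impl ExactSizeIterator` now carries `Iterator`'s `#[must_use]` annotation, so discarding the result warns.
  ```rust
  fn evens() -> impl ExactSizeIterator<Item = u32> {
      (0..10).map(|n| n * 2)
  }

  fn main() {
      // Warns under `unused_must_use`: `Iterator` is `#[must_use]`,
      // and since 1.66 that also applies through `impl ExactSizeIterator`.
      evens();
  }
  ```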
- [Allow `..X` and `..=X` in patterns](https://github.com/rust-lang/rust/pull/102275/)
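  For illustration, a small sketch using the inclusive half-open form `..=X` (function and values are made up):
  ```rust
  fn classify(byte: u8) -> &'static str {
      match byte {
          ..=31 => "control",            // everything up to and including 31
          32..=126 => "printable ASCII",
          _ => "extended",
      }
  }
  ```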
- [Uplift `clippy::for_loops_over_fallibles` lint into rustc](https://github.com/rust-lang/rust/pull/99696/)
- [Stabilize `sym` operands in inline assembly](https://github.com/rust-lang/rust/pull/103168/)
- [Update to Unicode 15](https://github.com/rust-lang/rust/pull/101912/)
- [Opaque types no longer imply lifetime bounds](https://github.com/rust-lang/rust/pull/95474/)
This is a soundness fix which may break code that was erroneously relying on this behavior.
Compiler
--------
- [Add armv5te-none-eabi and thumbv5te-none-eabi tier 3 targets](https://github.com/rust-lang/rust/pull/101329/)
- Refer to Rust's [platform support page][platform-support-doc] for more
information on Rust's tiered platform support.
- [Add support for linking against macOS universal libraries](https://github.com/rust-lang/rust/pull/98736)
Libraries
---------
- [Fix `#[derive(Default)]` on a generic `#[default]` enum adding unnecessary `Default` bounds](https://github.com/rust-lang/rust/pull/101040/)
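  A minimal sketch of what this fixes (type names are made up): the derive no longer demands `T: Default` when the `#[default]` variant does not use `T`.
  ```rust
  #[allow(dead_code)] // only `Disconnected` is constructed in this sketch
  #[derive(Default)]
  enum ConnState<T> {
      #[default]
      Disconnected,
      Connected(T),
  }

  struct NoDefault; // deliberately has no Default impl

  fn main() {
      // Compiles on 1.66 even though `NoDefault: Default` does not hold.
      let s: ConnState<NoDefault> = ConnState::default();
      assert!(matches!(s, ConnState::Disconnected));
  }
  ```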
- [Update to Unicode 15](https://github.com/rust-lang/rust/pull/101821/)
Stabilized APIs
---------------
- [`proc_macro::Span::source_text`](https://doc.rust-lang.org/stable/proc_macro/struct.Span.html#method.source_text)
- [`uX::{checked_add_signed, overflowing_add_signed, saturating_add_signed, wrapping_add_signed}`](https://doc.rust-lang.org/stable/std/primitive.u8.html#method.checked_add_signed)
- [`iX::{checked_add_unsigned, overflowing_add_unsigned, saturating_add_unsigned, wrapping_add_unsigned}`](https://doc.rust-lang.org/stable/std/primitive.i8.html#method.checked_add_unsigned)
- [`iX::{checked_sub_unsigned, overflowing_sub_unsigned, saturating_sub_unsigned, wrapping_sub_unsigned}`](https://doc.rust-lang.org/stable/std/primitive.i8.html#method.checked_sub_unsigned)
- [`BTreeSet::{first, last, pop_first, pop_last}`](https://doc.rust-lang.org/stable/std/collections/struct.BTreeSet.html#method.first)
- [`BTreeMap::{first_key_value, last_key_value, first_entry, last_entry, pop_first, pop_last}`](https://doc.rust-lang.org/stable/std/collections/struct.BTreeMap.html#method.first_key_value)
- [Add `AsFd` implementations for stdio lock types on WASI.](https://github.com/rust-lang/rust/pull/101768/)
- [`impl TryFrom<Vec<T>> for Box<[T; N]>`](https://doc.rust-lang.org/stable/std/boxed/struct.Box.html#impl-TryFrom%3CVec%3CT%2C%20Global%3E%3E-for-Box%3C%5BT%3B%20N%5D%2C%20Global%3E)
- [`core::hint::black_box`](https://doc.rust-lang.org/stable/std/hint/fn.black_box.html)
- [`Duration::try_from_secs_{f32,f64}`](https://doc.rust-lang.org/stable/std/time/struct.Duration.html#method.try_from_secs_f32)
- [`Option::unzip`](https://doc.rust-lang.org/stable/std/option/enum.Option.html#method.unzip)
- [`std::os::fd`](https://doc.rust-lang.org/stable/std/os/fd/index.html)
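A short, self-contained sketch exercising a few of the newly stabilized APIs listed above (the values are arbitrary):
```rust
use std::collections::BTreeMap;
use std::hint::black_box;

fn main() {
    // u8::checked_add_signed: add a signed offset to an unsigned value.
    assert_eq!(250u8.checked_add_signed(-10), Some(240));
    assert_eq!(250u8.checked_add_signed(10), None); // would overflow

    // BTreeMap::{first_key_value, pop_last}.
    let mut map = BTreeMap::from([(1, "a"), (3, "c"), (2, "b")]);
    assert_eq!(map.first_key_value(), Some((&1, &"a")));
    assert_eq!(map.pop_last(), Some((3, "c")));

    // Option::unzip turns an Option of a pair into a pair of Options.
    let pair = Some((7, "seven"));
    assert_eq!(pair.unzip(), (Some(7), Some("seven")));

    // TryFrom<Vec<T>> for Box<[T; N]>.
    let boxed: Box<[u8; 3]> = vec![1, 2, 3].try_into().unwrap();
    assert_eq!(*boxed, [1, 2, 3]);

    // core::hint::black_box: an identity function the optimizer treats as
    // opaque, mainly useful in benchmarks.
    assert_eq!(black_box(42), 42);
}
```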
Rustdoc
-------
- [Add Rustdoc warning for invalid HTML tags in the documentation](https://github.com/rust-lang/rust/pull/101720/)
Cargo
-----
- [Added `cargo remove` to remove dependencies from Cargo.toml](https://doc.rust-lang.org/nightly/cargo/commands/cargo-remove.html)
- [`cargo publish` now waits for the new version to be downloadable before exiting](https://github.com/rust-lang/cargo/pull/11062)
See [detailed release notes](https://github.com/rust-lang/cargo/blob/master/CHANGELOG.md#cargo-166-2022-12-15) for more.
Compatibility Notes
-------------------
- [Only apply `ProceduralMasquerade` hack to older versions of `rental`](https://github.com/rust-lang/rust/pull/94063/)
- [Don't export `__heap_base` and `__data_end` on wasm32-wasi.](https://github.com/rust-lang/rust/pull/102385/)
- [Don't export `__wasm_init_memory` on WebAssembly.](https://github.com/rust-lang/rust/pull/102426/)
- [Only export `__tls_*` on wasm32-unknown-unknown.](https://github.com/rust-lang/rust/pull/102440/)
- [Don't link to `libresolv` in libstd on Darwin](https://github.com/rust-lang/rust/pull/102766/)
- [Update libstd's libc to 0.2.135 (to make `libstd` no longer pull in `libiconv.dylib` on Darwin)](https://github.com/rust-lang/rust/pull/103277/)
- [Opaque types no longer imply lifetime bounds](https://github.com/rust-lang/rust/pull/95474/)
This is a soundness fix which may break code that was erroneously relying on this behavior.
- [Make `order_dependent_trait_objects` show up in future-breakage reports](https://github.com/rust-lang/rust/pull/102635/)
- [Change std::process::Command spawning to default to inheriting the parent's signal mask](https://github.com/rust-lang/rust/pull/101077/)
Internal Changes
----------------
These changes do not affect any public interfaces of Rust, but they represent
significant improvements to the performance or internals of rustc and related
tools.
- [Enable BOLT for LLVM compilation](https://github.com/rust-lang/rust/pull/94381/)
- [Enable LTO for rustc_driver.so](https://github.com/rust-lang/rust/pull/101403/)
Version 1.65.0 (2022-11-03)
==========================
@ -6,7 +97,7 @@ Language
- [Error on `as` casts of enums with `#[non_exhaustive]` variants](https://github.com/rust-lang/rust/pull/92744/)
- [Stabilize `let else`](https://github.com/rust-lang/rust/pull/93628/)
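  For reference, a minimal sketch of the stabilized `let`-`else` form (the function is made up):
  ```rust
  fn first_word(s: &str) -> &str {
      // The `else` branch must diverge (return, break, panic, ...).
      let Some(word) = s.split_whitespace().next() else {
          return "";
      };
      word
  }
  ```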
- [Stabilize generic associated types (GATs)](https://github.com/rust-lang/rust/pull/96709/)
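  A compact sketch of what generic associated types allow: an associated type that is itself generic over a lifetime (the trait and names are illustrative, not from the release notes).
  ```rust
  trait LendingIterator {
      // The associated type takes its own lifetime parameter,
      // so each item may borrow from the iterator itself.
      type Item<'a>
      where
          Self: 'a;

      fn next(&mut self) -> Option<Self::Item<'_>>;
  }
  ```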
- [Add lints `let_underscore_drop`, `let_underscore_lock`, and `let_underscore_must_use` from Clippy](https://github.com/rust-lang/rust/pull/97739/)
- [Add lints `let_underscore_drop` and `let_underscore_lock` from Clippy](https://github.com/rust-lang/rust/pull/97739/)
- [Stabilize `break`ing from arbitrary labeled blocks ("label-break-value")](https://github.com/rust-lang/rust/pull/99332/)
- [Uninitialized integers, floats, and raw pointers are now considered immediate UB](https://github.com/rust-lang/rust/pull/98919/).
Usage of `MaybeUninit` is the correct way to work with uninitialized memory.
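  A short sketch of the sanctioned `MaybeUninit` pattern alluded to above:
  ```rust
  use std::mem::MaybeUninit;

  fn main() {
      // `unsafe { MaybeUninit::<u32>::uninit().assume_init() }` is immediate
      // undefined behavior; initialize the storage first instead.
      let mut slot = MaybeUninit::<u32>::uninit();
      slot.write(42);
      // SAFETY: `slot` was fully initialized by the `write` above.
      let value = unsafe { slot.assume_init() };
      assert_eq!(value, 42);
  }
  ```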
@ -87,6 +178,9 @@ Compatibility Notes
This strengthens the forward compatibility lint deprecated_cfg_attr_crate_type_name to deny.
- [`llvm-has-rust-patches` allows setting the build system to treat the LLVM as having Rust-specific patches](https://github.com/rust-lang/rust/pull/101072)
This option may need to be set for distributions that are building Rust with a patched LLVM via `llvm-config`, not the built-in LLVM.
- Combining three or more languages (e.g. Objective C, C++ and Rust) into one binary may hit linker limitations when using `lld`. For more information, see [issue 102754][102754].
[102754]: https://github.com/rust-lang/rust/issues/102754
Internal Changes
----------------


@ -1,3 +1,5 @@
#![feature(unix_sigpipe)]
// A note about jemalloc: rustc uses jemalloc when built for CI and
// distribution. The obvious way to do this is with the `#[global_allocator]`
// mechanism. However, for complicated reasons (see
@ -23,6 +25,7 @@
// libraries. So we must reference jemalloc symbols one way or another, because
// this file is the only object code in the rustc executable.
#[unix_sigpipe = "sig_dfl"]
fn main() {
// See the comment at the top of this file for an explanation of this.
#[cfg(feature = "jemalloc-sys")]
@ -58,6 +61,5 @@ fn main() {
}
}
rustc_driver::set_sigpipe_handler();
rustc_driver::main()
}


@ -4,7 +4,6 @@ version = "0.0.0"
edition = "2021"
[lib]
doctest = false
[dependencies]
bitflags = "1.2.1"


@ -1112,24 +1112,6 @@ pub struct Expr {
}
impl Expr {
/// Returns `true` if this expression would be valid somewhere that expects a value;
/// for example, an `if` condition.
pub fn returns(&self) -> bool {
if let ExprKind::Block(ref block, _) = self.kind {
match block.stmts.last().map(|last_stmt| &last_stmt.kind) {
// Implicit return
Some(StmtKind::Expr(_)) => true,
// Last statement is an explicit return?
Some(StmtKind::Semi(expr)) => matches!(expr.kind, ExprKind::Ret(_)),
// This is a block that doesn't end in either an implicit or explicit return.
_ => false,
}
} else {
// This is not a block, it is a value.
true
}
}
/// Is this expr either `N`, or `{ N }`.
///
/// If this is not the case, name resolution does not resolve `N` when using
@ -1338,14 +1320,13 @@ pub enum ExprKind {
///
/// The `PathSegment` represents the method name and its generic arguments
/// (within the angle brackets).
/// The first element of the vector of an `Expr` is the expression that evaluates
/// to the object on which the method is being called on (the receiver),
/// and the remaining elements are the rest of the arguments.
/// Thus, `x.foo::<Bar, Baz>(a, b, c, d)` is represented as
/// `ExprKind::MethodCall(PathSegment { foo, [Bar, Baz] }, [x, a, b, c, d])`.
/// The standalone `Expr` is the receiver expression.
/// The vector of `Expr` is the arguments.
/// `x.foo::<Bar, Baz>(a, b, c, d)` is represented as
/// `ExprKind::MethodCall(PathSegment { foo, [Bar, Baz] }, x, [a, b, c, d])`.
/// This `Span` is the span of the function, without the dot and receiver
/// (e.g. `foo(a, b)` in `x.foo(a, b)`).
MethodCall(PathSegment, Vec<P<Expr>>, Span),
MethodCall(PathSegment, P<Expr>, Vec<P<Expr>>, Span),
/// A tuple (e.g., `(a, b, c, d)`).
Tup(Vec<P<Expr>>),
/// A binary operation (e.g., `a + b`, `a * b`).
@ -2957,7 +2938,7 @@ pub enum AssocItemKind {
/// An associated function.
Fn(Box<Fn>),
/// An associated type.
TyAlias(Box<TyAlias>),
Type(Box<TyAlias>),
/// A macro expanding to associated items.
MacCall(P<MacCall>),
}
@ -2967,7 +2948,7 @@ impl AssocItemKind {
match *self {
Self::Const(defaultness, ..)
| Self::Fn(box Fn { defaultness, .. })
| Self::TyAlias(box TyAlias { defaultness, .. }) => defaultness,
| Self::Type(box TyAlias { defaultness, .. }) => defaultness,
Self::MacCall(..) => Defaultness::Final,
}
}
@ -2978,7 +2959,7 @@ impl From<AssocItemKind> for ItemKind {
match assoc_item_kind {
AssocItemKind::Const(a, b, c) => ItemKind::Const(a, b, c),
AssocItemKind::Fn(fn_kind) => ItemKind::Fn(fn_kind),
AssocItemKind::TyAlias(ty_alias_kind) => ItemKind::TyAlias(ty_alias_kind),
AssocItemKind::Type(ty_alias_kind) => ItemKind::TyAlias(ty_alias_kind),
AssocItemKind::MacCall(a) => ItemKind::MacCall(a),
}
}
@ -2991,7 +2972,7 @@ impl TryFrom<ItemKind> for AssocItemKind {
Ok(match item_kind {
ItemKind::Const(a, b, c) => AssocItemKind::Const(a, b, c),
ItemKind::Fn(fn_kind) => AssocItemKind::Fn(fn_kind),
ItemKind::TyAlias(ty_alias_kind) => AssocItemKind::TyAlias(ty_alias_kind),
ItemKind::TyAlias(ty_kind) => AssocItemKind::Type(ty_kind),
ItemKind::MacCall(a) => AssocItemKind::MacCall(a),
_ => return Err(item_kind),
})
@ -3043,14 +3024,13 @@ pub type ForeignItem = Item<ForeignItemKind>;
mod size_asserts {
use super::*;
use rustc_data_structures::static_assert_size;
// These are in alphabetical order, which is easy to maintain.
// tidy-alphabetical-start
static_assert_size!(AssocItem, 104);
static_assert_size!(AssocItemKind, 32);
static_assert_size!(Attribute, 32);
static_assert_size!(Block, 48);
static_assert_size!(Expr, 104);
static_assert_size!(ExprKind, 72);
#[cfg(not(bootstrap))]
static_assert_size!(Fn, 184);
static_assert_size!(ForeignItem, 96);
static_assert_size!(ForeignItemKind, 24);
@ -3065,11 +3045,12 @@ mod size_asserts {
static_assert_size!(Local, 72);
static_assert_size!(Param, 40);
static_assert_size!(Pat, 120);
static_assert_size!(PatKind, 96);
static_assert_size!(Path, 40);
static_assert_size!(PathSegment, 24);
static_assert_size!(PatKind, 96);
static_assert_size!(Stmt, 32);
static_assert_size!(StmtKind, 16);
static_assert_size!(Ty, 96);
static_assert_size!(TyKind, 72);
// tidy-alphabetical-end
}


@ -13,9 +13,7 @@
#![feature(const_default_impls)]
#![feature(const_trait_impl)]
#![feature(if_let_guard)]
#![cfg_attr(bootstrap, feature(label_break_value))]
#![feature(let_chains)]
#![cfg_attr(bootstrap, feature(let_else))]
#![feature(min_specialization)]
#![feature(negative_impls)]
#![feature(slice_internals)]


@ -152,6 +152,12 @@ pub trait MutVisitor: Sized {
noop_visit_expr(e, self);
}
/// This method is a hack to work around the unstable `stmt_expr_attributes` feature.
/// It can be removed once that feature is stabilized.
fn visit_method_receiver_expr(&mut self, ex: &mut P<Expr>) {
self.visit_expr(ex)
}
fn filter_map_expr(&mut self, e: P<Expr>) -> Option<P<Expr>> {
noop_filter_map_expr(e, self)
}
@ -1106,7 +1112,7 @@ pub fn noop_flat_map_assoc_item<T: MutVisitor>(
visit_fn_sig(sig, visitor);
visit_opt(body, |body| visitor.visit_block(body));
}
AssocItemKind::TyAlias(box TyAlias {
AssocItemKind::Type(box TyAlias {
defaultness,
generics,
where_clauses,
@ -1297,10 +1303,11 @@ pub fn noop_visit_expr<T: MutVisitor>(
vis.visit_expr(f);
visit_exprs(args, vis);
}
ExprKind::MethodCall(PathSegment { ident, id, args }, exprs, span) => {
ExprKind::MethodCall(PathSegment { ident, id, args }, receiver, exprs, span) => {
vis.visit_ident(ident);
vis.visit_id(id);
visit_opt(args, |args| vis.visit_generic_args(args));
vis.visit_method_receiver_expr(receiver);
visit_exprs(exprs, vis);
vis.visit_span(span);
}
@ -1588,3 +1595,9 @@ impl DummyAstNode for Crate {
}
}
}
impl<N: DummyAstNode, T: DummyAstNode> DummyAstNode for crate::ast_traits::AstNodeWrapper<N, T> {
fn dummy() -> Self {
crate::ast_traits::AstNodeWrapper::new(N::dummy(), T::dummy())
}
}


@ -13,7 +13,7 @@ use rustc_span::symbol::{kw, sym};
use rustc_span::symbol::{Ident, Symbol};
use rustc_span::{self, edition::Edition, Span, DUMMY_SP};
use std::borrow::Cow;
use std::{fmt, mem};
use std::fmt;
#[derive(Clone, Copy, PartialEq, Encodable, Decodable, Debug, HashStable_Generic)]
pub enum CommentKind {
@ -256,10 +256,6 @@ pub enum TokenKind {
Eof,
}
// `TokenKind` is used a lot. Make sure it doesn't unintentionally get bigger.
#[cfg(all(target_arch = "x86_64", target_pointer_width = "64"))]
rustc_data_structures::static_assert_size!(TokenKind, 16);
#[derive(Clone, PartialEq, Encodable, Decodable, Debug, HashStable_Generic)]
pub struct Token {
pub kind: TokenKind,
@ -335,11 +331,6 @@ impl Token {
Token::new(Ident(ident.name, ident.is_raw_guess()), ident.span)
}
/// Return this token by value and leave a dummy token in its place.
pub fn take(&mut self) -> Self {
mem::replace(self, Token::dummy())
}
/// For interpolated tokens, returns a span of the fragment to which the interpolated
/// token refers. For all other tokens this is just a regular span.
/// It is particularly important to use this for identifiers and lifetimes
@ -354,17 +345,14 @@ impl Token {
}
pub fn is_op(&self) -> bool {
!matches!(
self.kind,
OpenDelim(..)
| CloseDelim(..)
| Literal(..)
| DocComment(..)
| Ident(..)
| Lifetime(..)
| Interpolated(..)
| Eof
)
match self.kind {
Eq | Lt | Le | EqEq | Ne | Ge | Gt | AndAnd | OrOr | Not | Tilde | BinOp(_)
| BinOpEq(_) | At | Dot | DotDot | DotDotDot | DotDotEq | Comma | Semi | Colon
| ModSep | RArrow | LArrow | FatArrow | Pound | Dollar | Question | SingleQuote => true,
OpenDelim(..) | CloseDelim(..) | Literal(..) | DocComment(..) | Ident(..)
| Lifetime(..) | Interpolated(..) | Eof => false,
}
}
pub fn is_like_plus(&self) -> bool {
@ -733,6 +721,7 @@ impl Token {
}
impl PartialEq<TokenKind> for Token {
#[inline]
fn eq(&self, rhs: &TokenKind) -> bool {
self.kind == *rhs
}
@ -756,10 +745,6 @@ pub enum Nonterminal {
NtVis(P<ast::Visibility>),
}
// `Nonterminal` is used a lot. Make sure it doesn't unintentionally get bigger.
#[cfg(all(target_arch = "x86_64", target_pointer_width = "64"))]
rustc_data_structures::static_assert_size!(Nonterminal, 16);
#[derive(Debug, Copy, Clone, PartialEq, Encodable, Decodable)]
pub enum NonterminalKind {
Item,
@ -898,3 +883,17 @@ where
panic!("interpolated tokens should not be present in the HIR")
}
}
// Some types are used a lot. Make sure they don't unintentionally get bigger.
#[cfg(all(target_arch = "x86_64", target_pointer_width = "64"))]
mod size_asserts {
use super::*;
use rustc_data_structures::static_assert_size;
// tidy-alphabetical-start
static_assert_size!(Lit, 12);
static_assert_size!(LitKind, 2);
static_assert_size!(Nonterminal, 16);
static_assert_size!(Token, 24);
static_assert_size!(TokenKind, 16);
// tidy-alphabetical-end
}


@ -47,10 +47,6 @@ pub enum TokenTree {
Delimited(DelimSpan, Delimiter, TokenStream),
}
// This type is used a lot. Make sure it doesn't unintentionally get bigger.
#[cfg(all(target_arch = "x86_64", target_pointer_width = "64"))]
rustc_data_structures::static_assert_size!(TokenTree, 32);
// Ensure all fields of `TokenTree` is `Send` and `Sync`.
#[cfg(parallel_compiler)]
fn _dummy()
@ -249,12 +245,12 @@ impl AttrTokenStream {
// properly implemented - we always synthesize fake tokens,
// so we never reach this code.
let mut builder = TokenStreamBuilder::new();
let mut stream = TokenStream::default();
for inner_attr in inner_attrs {
builder.push(inner_attr.tokens());
stream.push_stream(inner_attr.tokens());
}
builder.push(delim_tokens.clone());
*tree = TokenTree::Delimited(*span, *delim, builder.build());
stream.push_stream(delim_tokens.clone());
*tree = TokenTree::Delimited(*span, *delim, stream);
found = true;
break;
}
@ -308,13 +304,20 @@ pub struct AttributesData {
#[derive(Clone, Debug, Default, Encodable, Decodable)]
pub struct TokenStream(pub(crate) Lrc<Vec<TokenTree>>);
// `TokenStream` is used a lot. Make sure it doesn't unintentionally get bigger.
#[cfg(all(target_arch = "x86_64", target_pointer_width = "64"))]
rustc_data_structures::static_assert_size!(TokenStream, 8);
/// Similar to `proc_macro::Spacing`, but for tokens.
///
/// Note that all `ast::TokenTree::Token` instances have a `Spacing`, but when
/// we convert to `proc_macro::TokenTree` for proc macros only `Punct`
/// `TokenTree`s have a `proc_macro::Spacing`.
#[derive(Clone, Copy, Debug, PartialEq, Encodable, Decodable, HashStable_Generic)]
pub enum Spacing {
/// The token is not immediately followed by an operator token (as
/// determined by `Token::is_op`). E.g. a `+` token is `Alone` in `+ =`,
/// `+/*foo*/=`, `+ident`, and `+()`.
Alone,
/// The token is immediately followed by an operator token. E.g. a `+`
/// token is `Joint` in `+=` and `++`.
Joint,
}
@ -502,76 +505,49 @@ impl TokenStream {
self.trees().map(|tree| TokenStream::flatten_token_tree(tree)).collect()
}
}
// 99.5%+ of the time we have 1 or 2 elements in this vector.
#[derive(Clone)]
pub struct TokenStreamBuilder(SmallVec<[TokenStream; 2]>);
impl TokenStreamBuilder {
pub fn new() -> TokenStreamBuilder {
TokenStreamBuilder(SmallVec::new())
// If `vec` is not empty, try to glue `tt` onto its last token. The return
// value indicates if gluing took place.
fn try_glue_to_last(vec: &mut Vec<TokenTree>, tt: &TokenTree) -> bool {
if let Some(TokenTree::Token(last_tok, Spacing::Joint)) = vec.last()
&& let TokenTree::Token(tok, spacing) = tt
&& let Some(glued_tok) = last_tok.glue(&tok)
{
// ...then overwrite the last token tree in `vec` with the
// glued token, and skip the first token tree from `stream`.
*vec.last_mut().unwrap() = TokenTree::Token(glued_tok, *spacing);
true
} else {
false
}
}
pub fn push(&mut self, stream: TokenStream) {
self.0.push(stream);
// Push `tt` onto the end of the stream, possibly gluing it to the last
// token. Uses `make_mut` to maximize efficiency.
pub fn push_tree(&mut self, tt: TokenTree) {
let vec_mut = Lrc::make_mut(&mut self.0);
if Self::try_glue_to_last(vec_mut, &tt) {
// nothing else to do
} else {
vec_mut.push(tt);
}
}
pub fn build(self) -> TokenStream {
let mut streams = self.0;
match streams.len() {
0 => TokenStream::default(),
1 => streams.pop().unwrap(),
_ => {
// We will extend the first stream in `streams` with the
// elements from the subsequent streams. This requires using
// `make_mut()` on the first stream, and in practice this
// doesn't cause cloning 99.9% of the time.
//
// One very common use case is when `streams` has two elements,
// where the first stream has any number of elements within
// (often 1, but sometimes many more) and the second stream has
// a single element within.
// Push `stream` onto the end of the stream, possibly gluing the first
// token tree to the last token. (No other token trees will be glued.)
// Uses `make_mut` to maximize efficiency.
pub fn push_stream(&mut self, stream: TokenStream) {
let vec_mut = Lrc::make_mut(&mut self.0);
// Determine how much the first stream will be extended.
// Needed to avoid quadratic blow up from on-the-fly
// reallocations (#57735).
let num_appends = streams.iter().skip(1).map(|ts| ts.len()).sum();
let stream_iter = stream.0.iter().cloned();
// Get the first stream, which will become the result stream.
// If it's `None`, create an empty stream.
let mut iter = streams.into_iter();
let mut res_stream_lrc = iter.next().unwrap().0;
// Append the subsequent elements to the result stream, after
// reserving space for them.
let res_vec_mut = Lrc::make_mut(&mut res_stream_lrc);
res_vec_mut.reserve(num_appends);
for stream in iter {
let stream_iter = stream.0.iter().cloned();
// If (a) `res_mut_vec` is not empty and the last tree
// within it is a token tree marked with `Joint`, and (b)
// `stream` is not empty and the first tree within it is a
// token tree, and (c) the two tokens can be glued
// together...
if let Some(TokenTree::Token(last_tok, Spacing::Joint)) = res_vec_mut.last()
&& let Some(TokenTree::Token(tok, spacing)) = stream.0.first()
&& let Some(glued_tok) = last_tok.glue(&tok)
{
// ...then overwrite the last token tree in
// `res_vec_mut` with the glued token, and skip the
// first token tree from `stream`.
*res_vec_mut.last_mut().unwrap() = TokenTree::Token(glued_tok, *spacing);
res_vec_mut.extend(stream_iter.skip(1));
} else {
// Append all of `stream`.
res_vec_mut.extend(stream_iter);
}
}
TokenStream(res_stream_lrc)
}
if let Some(first) = stream.0.first() && Self::try_glue_to_last(vec_mut, first) {
// Now skip the first token tree from `stream`.
vec_mut.extend(stream_iter.skip(1));
} else {
// Append all of `stream`.
vec_mut.extend(stream_iter);
}
}
}
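
A self-contained model of the gluing step shared by `try_glue_to_last`, `push_tree`, and `push_stream` (hypothetical simplified types, not the real `Token`/`TokenTree`): a `Joint`-spaced token followed by a compatible token is merged, e.g. `+` then `=` becomes a single `+=`.

#[derive(Clone, Debug, PartialEq)]
enum Spacing { Alone, Joint }

#[derive(Clone, Debug, PartialEq)]
struct Tok(String, Spacing);

// Which adjacent token texts can be merged into one token.
fn glue(a: &str, b: &str) -> Option<String> {
    match (a, b) {
        ("+", "=") => Some("+=".to_string()),
        ("&", "&") => Some("&&".to_string()),
        _ => None,
    }
}

// Push `tok`, gluing it onto the last token when that token is Joint-spaced.
fn push_tok(vec: &mut Vec<Tok>, tok: Tok) {
    let glued = match vec.last() {
        Some(Tok(last, Spacing::Joint)) => glue(last, &tok.0),
        _ => None,
    };
    match glued {
        Some(g) => *vec.last_mut().unwrap() = Tok(g, tok.1),
        None => vec.push(tok),
    }
}

fn main() {
    // `+` and `=` were adjacent in the source (Joint), so they glue to `+=`.
    let mut v = Vec::new();
    push_tok(&mut v, Tok("+".into(), Spacing::Joint));
    push_tok(&mut v, Tok("=".into(), Spacing::Alone));
    assert_eq!(v, vec![Tok("+=".into(), Spacing::Alone)]);

    // With whitespace between them (Alone), no gluing happens.
    let mut w = Vec::new();
    push_tok(&mut w, Tok("+".into(), Spacing::Alone));
    push_tok(&mut w, Tok("=".into(), Spacing::Alone));
    assert_eq!(w.len(), 2);
}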
@ -664,3 +640,17 @@ impl DelimSpan {
self.open.with_hi(self.close.hi())
}
}
// Some types are used a lot. Make sure they don't unintentionally get bigger.
#[cfg(all(target_arch = "x86_64", target_pointer_width = "64"))]
mod size_asserts {
use super::*;
use rustc_data_structures::static_assert_size;
// tidy-alphabetical-start
static_assert_size!(AttrTokenStream, 8);
static_assert_size!(AttrTokenTree, 32);
static_assert_size!(LazyAttrTokenStream, 8);
static_assert_size!(TokenStream, 8);
static_assert_size!(TokenTree, 32);
// tidy-alphabetical-end
}

View File

@ -396,9 +396,9 @@ pub fn contains_exterior_struct_lit(value: &ast::Expr) -> bool {
contains_exterior_struct_lit(&x)
}
ast::ExprKind::MethodCall(.., ref exprs, _) => {
ast::ExprKind::MethodCall(_, ref receiver, _, _) => {
// X { y: 1 }.bar(...)
contains_exterior_struct_lit(&exprs[0])
contains_exterior_struct_lit(&receiver)
}
_ => false,
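
For context on what `contains_exterior_struct_lit` detects (an illustrative example, not part of the diff): a struct literal at the head of a condition collides with the opening brace of the block, so such expressions need parentheses, and the receiver of a method call like `X { y: 1 }.bar()` sits in exactly that exterior position.

struct X { y: i32 }

impl X {
    fn bar(&self) -> bool { self.y > 0 }
}

fn main() {
    // `if X { y: 1 }.bar() { ... }` does not parse: the `{` after `X` would
    // be read as the start of the if-body, so the struct literal must be
    // parenthesized.
    if (X { y: 1 }).bar() {
        println!("parenthesized struct literal in the condition");
    }
}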

View File

@ -140,6 +140,11 @@ pub trait Visitor<'ast>: Sized {
fn visit_expr(&mut self, ex: &'ast Expr) {
walk_expr(self, ex)
}
/// This method is a hack to work around the unstable `stmt_expr_attributes` feature.
/// It can be removed once that feature is stabilized.
fn visit_method_receiver_expr(&mut self, ex: &'ast Expr) {
self.visit_expr(ex)
}
fn visit_expr_post(&mut self, _ex: &'ast Expr) {}
fn visit_ty(&mut self, t: &'ast Ty) {
walk_ty(self, t)
@ -244,14 +249,12 @@ pub trait Visitor<'ast>: Sized {
#[macro_export]
macro_rules! walk_list {
($visitor: expr, $method: ident, $list: expr) => {
for elem in $list {
$visitor.$method(elem)
}
};
($visitor: expr, $method: ident, $list: expr, $($extra_args: expr),*) => {
for elem in $list {
$visitor.$method(elem, $($extra_args,)*)
($visitor: expr, $method: ident, $list: expr $(, $($extra_args: expr),* )?) => {
{
#[cfg_attr(not(bootstrap), allow(for_loops_over_fallibles))]
for elem in $list {
$visitor.$method(elem $(, $($extra_args,)* )?)
}
}
}
}
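
The two old `walk_list!` arms are folded into one by making the extra-argument list an optional `$( ... )?` group. A standalone sketch of the same macro_rules pattern, with hypothetical names:

macro_rules! call_each {
    // One arm serves both call forms, with and without extra arguments,
    // thanks to the optional `$( ... )?` group around the extras.
    ($f:expr, $list:expr $(, $($extra:expr),* )?) => {
        for elem in $list {
            $f(elem $(, $($extra,)* )?)
        }
    };
}

fn main() {
    let xs = [1, 2, 3];
    call_each!(|x: &i32| println!("{x}"), &xs);
    call_each!(|x: &i32, scale: i32| println!("{}", x * scale), &xs, 10);
}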
@ -685,7 +688,7 @@ pub fn walk_assoc_item<'a, V: Visitor<'a>>(visitor: &mut V, item: &'a AssocItem,
let kind = FnKind::Fn(FnCtxt::Assoc(ctxt), ident, sig, vis, generics, body.as_deref());
visitor.visit_fn(kind, span, id);
}
AssocItemKind::TyAlias(box TyAlias { generics, bounds, ty, .. }) => {
AssocItemKind::Type(box TyAlias { generics, bounds, ty, .. }) => {
visitor.visit_generics(generics);
walk_list!(visitor, visit_param_bound, bounds, BoundKind::Bound);
walk_list!(visitor, visit_ty, ty);
@ -795,8 +798,9 @@ pub fn walk_expr<'a, V: Visitor<'a>>(visitor: &mut V, expression: &'a Expr) {
visitor.visit_expr(callee_expression);
walk_list!(visitor, visit_expr, arguments);
}
ExprKind::MethodCall(ref segment, ref arguments, _span) => {
ExprKind::MethodCall(ref segment, ref receiver, ref arguments, _span) => {
visitor.visit_path_segment(segment);
visitor.visit_expr(receiver);
walk_list!(visitor, visit_expr, arguments);
}
ExprKind::Binary(_, ref left_expression, ref right_expression) => {

View File

@ -192,26 +192,13 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
}
}
InlineAsmOperand::Sym { ref sym } => {
if !self.tcx.features().asm_sym {
feature_err(
&sess.parse_sess,
sym::asm_sym,
*op_sp,
"sym operands for inline assembly are unstable",
)
.emit();
}
let static_def_id = self
.resolver
.get_partial_res(sym.id)
.filter(|res| res.unresolved_segments() == 0)
.and_then(|res| {
if let Res::Def(DefKind::Static(_), def_id) = res.base_res() {
Some(def_id)
} else {
None
}
.and_then(|res| res.full_res())
.and_then(|res| match res {
Res::Def(DefKind::Static(_), def_id) => Some(def_id),
_ => None,
});
if let Some(def_id) = static_def_id {
@ -237,7 +224,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// Wrap the expression in an AnonConst.
let parent_def_id = self.current_hir_id_owner;
let node_id = self.next_node_id();
self.create_def(parent_def_id, node_id, DefPathData::AnonConst);
self.create_def(parent_def_id.def_id, node_id, DefPathData::AnonConst);
let anon_const = AnonConst { id: node_id, value: P(expr) };
hir::InlineAsmOperand::SymFn {
anon_const: self.lower_anon_const(&anon_const),

View File

@ -1,9 +1,9 @@
use rustc_errors::{fluent, AddSubdiagnostic, Applicability, Diagnostic, DiagnosticArgFromDisplay};
use rustc_macros::{SessionDiagnostic, SessionSubdiagnostic};
use rustc_errors::DiagnosticArgFromDisplay;
use rustc_macros::{Diagnostic, Subdiagnostic};
use rustc_span::{symbol::Ident, Span, Symbol};
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::generic_type_with_parentheses, code = "E0214")]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_generic_type_with_parentheses, code = "E0214")]
pub struct GenericTypeWithParentheses {
#[primary_span]
#[label]
@ -12,35 +12,42 @@ pub struct GenericTypeWithParentheses {
pub sub: Option<UseAngleBrackets>,
}
#[derive(Clone, Copy)]
#[derive(Clone, Copy, Subdiagnostic)]
#[multipart_suggestion(ast_lowering_use_angle_brackets, applicability = "maybe-incorrect")]
pub struct UseAngleBrackets {
#[suggestion_part(code = "<")]
pub open_param: Span,
#[suggestion_part(code = ">")]
pub close_param: Span,
}
impl AddSubdiagnostic for UseAngleBrackets {
fn add_to_diagnostic(self, diag: &mut Diagnostic) {
diag.multipart_suggestion(
fluent::ast_lowering::use_angle_brackets,
vec![(self.open_param, String::from("<")), (self.close_param, String::from(">"))],
Applicability::MaybeIncorrect,
);
}
}
#[derive(SessionDiagnostic)]
#[help]
#[diag(ast_lowering::invalid_abi, code = "E0703")]
#[derive(Diagnostic)]
#[diag(ast_lowering_invalid_abi, code = "E0703")]
#[note]
pub struct InvalidAbi {
#[primary_span]
#[label]
pub span: Span,
pub abi: Symbol,
pub valid_abis: String,
pub command: String,
#[subdiagnostic]
pub suggestion: Option<InvalidAbiSuggestion>,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::assoc_ty_parentheses)]
#[derive(Subdiagnostic)]
#[suggestion(
ast_lowering_invalid_abi_suggestion,
code = "{suggestion}",
applicability = "maybe-incorrect"
)]
pub struct InvalidAbiSuggestion {
#[primary_span]
pub span: Span,
pub suggestion: String,
}
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_assoc_ty_parentheses)]
pub struct AssocTyParentheses {
#[primary_span]
pub span: Span,
@ -48,123 +55,116 @@ pub struct AssocTyParentheses {
pub sub: AssocTyParenthesesSub,
}
#[derive(Clone, Copy)]
#[derive(Clone, Copy, Subdiagnostic)]
pub enum AssocTyParenthesesSub {
Empty { parentheses_span: Span },
NotEmpty { open_param: Span, close_param: Span },
#[multipart_suggestion(ast_lowering_remove_parentheses)]
Empty {
#[suggestion_part(code = "")]
parentheses_span: Span,
},
#[multipart_suggestion(ast_lowering_use_angle_brackets)]
NotEmpty {
#[suggestion_part(code = "<")]
open_param: Span,
#[suggestion_part(code = ">")]
close_param: Span,
},
}
impl AddSubdiagnostic for AssocTyParenthesesSub {
fn add_to_diagnostic(self, diag: &mut Diagnostic) {
match self {
Self::Empty { parentheses_span } => diag.multipart_suggestion(
fluent::ast_lowering::remove_parentheses,
vec![(parentheses_span, String::new())],
Applicability::MaybeIncorrect,
),
Self::NotEmpty { open_param, close_param } => diag.multipart_suggestion(
fluent::ast_lowering::use_angle_brackets,
vec![(open_param, String::from("<")), (close_param, String::from(">"))],
Applicability::MaybeIncorrect,
),
};
}
}
#[derive(SessionDiagnostic)]
#[diag(ast_lowering::misplaced_impl_trait, code = "E0562")]
#[derive(Diagnostic)]
#[diag(ast_lowering_misplaced_impl_trait, code = "E0562")]
pub struct MisplacedImplTrait<'a> {
#[primary_span]
pub span: Span,
pub position: DiagnosticArgFromDisplay<'a>,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::rustc_box_attribute_error)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_rustc_box_attribute_error)]
pub struct RustcBoxAttributeError {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::underscore_expr_lhs_assign)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_underscore_expr_lhs_assign)]
pub struct UnderscoreExprLhsAssign {
#[primary_span]
#[label]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::base_expression_double_dot)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_base_expression_double_dot)]
pub struct BaseExpressionDoubleDot {
#[primary_span]
#[label]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::await_only_in_async_fn_and_blocks, code = "E0728")]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_await_only_in_async_fn_and_blocks, code = "E0728")]
pub struct AwaitOnlyInAsyncFnAndBlocks {
#[primary_span]
#[label]
pub dot_await_span: Span,
#[label(ast_lowering::this_not_async)]
#[label(ast_lowering_this_not_async)]
pub item_span: Option<Span>,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::generator_too_many_parameters, code = "E0628")]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_generator_too_many_parameters, code = "E0628")]
pub struct GeneratorTooManyParameters {
#[primary_span]
pub fn_decl_span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::closure_cannot_be_static, code = "E0697")]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_closure_cannot_be_static, code = "E0697")]
pub struct ClosureCannotBeStatic {
#[primary_span]
pub fn_decl_span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[derive(Diagnostic, Clone, Copy)]
#[help]
#[diag(ast_lowering::async_non_move_closure_not_supported, code = "E0708")]
#[diag(ast_lowering_async_non_move_closure_not_supported, code = "E0708")]
pub struct AsyncNonMoveClosureNotSupported {
#[primary_span]
pub fn_decl_span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::functional_record_update_destructuring_assignment)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_functional_record_update_destructuring_assignment)]
pub struct FunctionalRecordUpdateDestructuringAssignemnt {
#[primary_span]
#[suggestion(code = "", applicability = "machine-applicable")]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::async_generators_not_supported, code = "E0727")]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_async_generators_not_supported, code = "E0727")]
pub struct AsyncGeneratorsNotSupported {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::inline_asm_unsupported_target, code = "E0472")]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_inline_asm_unsupported_target, code = "E0472")]
pub struct InlineAsmUnsupportedTarget {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::att_syntax_only_x86)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_att_syntax_only_x86)]
pub struct AttSyntaxOnlyX86 {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::abi_specified_multiple_times)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_abi_specified_multiple_times)]
pub struct AbiSpecifiedMultipleTimes {
#[primary_span]
pub abi_span: Span,
@ -175,24 +175,24 @@ pub struct AbiSpecifiedMultipleTimes {
pub equivalent: Option<()>,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::clobber_abi_not_supported)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_clobber_abi_not_supported)]
pub struct ClobberAbiNotSupported {
#[primary_span]
pub abi_span: Span,
}
#[derive(SessionDiagnostic)]
#[derive(Diagnostic)]
#[note]
#[diag(ast_lowering::invalid_abi_clobber_abi)]
#[diag(ast_lowering_invalid_abi_clobber_abi)]
pub struct InvalidAbiClobberAbi {
#[primary_span]
pub abi_span: Span,
pub supported_abis: String,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::invalid_register)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_invalid_register)]
pub struct InvalidRegister<'a> {
#[primary_span]
pub op_span: Span,
@ -200,8 +200,8 @@ pub struct InvalidRegister<'a> {
pub error: &'a str,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::invalid_register_class)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_invalid_register_class)]
pub struct InvalidRegisterClass<'a> {
#[primary_span]
pub op_span: Span,
@ -209,61 +209,61 @@ pub struct InvalidRegisterClass<'a> {
pub error: &'a str,
}
#[derive(SessionDiagnostic)]
#[diag(ast_lowering::invalid_asm_template_modifier_reg_class)]
#[derive(Diagnostic)]
#[diag(ast_lowering_invalid_asm_template_modifier_reg_class)]
pub struct InvalidAsmTemplateModifierRegClass {
#[primary_span]
#[label(ast_lowering::template_modifier)]
#[label(ast_lowering_template_modifier)]
pub placeholder_span: Span,
#[label(ast_lowering::argument)]
#[label(ast_lowering_argument)]
pub op_span: Span,
#[subdiagnostic]
pub sub: InvalidAsmTemplateModifierRegClassSub,
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub enum InvalidAsmTemplateModifierRegClassSub {
#[note(ast_lowering::support_modifiers)]
#[note(ast_lowering_support_modifiers)]
SupportModifier { class_name: Symbol, modifiers: String },
#[note(ast_lowering::does_not_support_modifiers)]
#[note(ast_lowering_does_not_support_modifiers)]
DoesNotSupportModifier { class_name: Symbol },
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::invalid_asm_template_modifier_const)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_invalid_asm_template_modifier_const)]
pub struct InvalidAsmTemplateModifierConst {
#[primary_span]
#[label(ast_lowering::template_modifier)]
#[label(ast_lowering_template_modifier)]
pub placeholder_span: Span,
#[label(ast_lowering::argument)]
#[label(ast_lowering_argument)]
pub op_span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::invalid_asm_template_modifier_sym)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_invalid_asm_template_modifier_sym)]
pub struct InvalidAsmTemplateModifierSym {
#[primary_span]
#[label(ast_lowering::template_modifier)]
#[label(ast_lowering_template_modifier)]
pub placeholder_span: Span,
#[label(ast_lowering::argument)]
#[label(ast_lowering_argument)]
pub op_span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::register_class_only_clobber)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_register_class_only_clobber)]
pub struct RegisterClassOnlyClobber {
#[primary_span]
pub op_span: Span,
pub reg_class_name: Symbol,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::register_conflict)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_register_conflict)]
pub struct RegisterConflict<'a> {
#[primary_span]
#[label(ast_lowering::register1)]
#[label(ast_lowering_register1)]
pub op_span1: Span,
#[label(ast_lowering::register2)]
#[label(ast_lowering_register2)]
pub op_span2: Span,
pub reg1_name: &'a str,
pub reg2_name: &'a str,
@ -271,14 +271,14 @@ pub struct RegisterConflict<'a> {
pub in_out: Option<Span>,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[derive(Diagnostic, Clone, Copy)]
#[help]
#[diag(ast_lowering::sub_tuple_binding)]
#[diag(ast_lowering_sub_tuple_binding)]
pub struct SubTupleBinding<'a> {
#[primary_span]
#[label]
#[suggestion_verbose(
ast_lowering::sub_tuple_binding_suggestion,
ast_lowering_sub_tuple_binding_suggestion,
code = "..",
applicability = "maybe-incorrect"
)]
@ -288,57 +288,57 @@ pub struct SubTupleBinding<'a> {
pub ctx: &'a str,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::extra_double_dot)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_extra_double_dot)]
pub struct ExtraDoubleDot<'a> {
#[primary_span]
#[label]
pub span: Span,
#[label(ast_lowering::previously_used_here)]
#[label(ast_lowering_previously_used_here)]
pub prev_span: Span,
pub ctx: &'a str,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[derive(Diagnostic, Clone, Copy)]
#[note]
#[diag(ast_lowering::misplaced_double_dot)]
#[diag(ast_lowering_misplaced_double_dot)]
pub struct MisplacedDoubleDot {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::misplaced_relax_trait_bound)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_misplaced_relax_trait_bound)]
pub struct MisplacedRelaxTraitBound {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::not_supported_for_lifetime_binder_async_closure)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_not_supported_for_lifetime_binder_async_closure)]
pub struct NotSupportedForLifetimeBinderAsyncClosure {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::arbitrary_expression_in_pattern)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_arbitrary_expression_in_pattern)]
pub struct ArbitraryExpressionInPattern {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::inclusive_range_with_no_end)]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_inclusive_range_with_no_end)]
pub struct InclusiveRangeWithNoEnd {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic, Clone, Copy)]
#[diag(ast_lowering::trait_fn_async, code = "E0706")]
#[derive(Diagnostic, Clone, Copy)]
#[diag(ast_lowering_trait_fn_async, code = "E0706")]
#[note]
#[note(ast_lowering::note2)]
#[note(note2)]
pub struct TraitFnAsync {
#[primary_span]
pub fn_span: Span,

View File

@ -60,7 +60,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
hir::ExprKind::Call(f, self.lower_exprs(args))
}
}
ExprKind::MethodCall(ref seg, ref args, span) => {
ExprKind::MethodCall(ref seg, ref receiver, ref args, span) => {
let hir_seg = self.arena.alloc(self.lower_path_segment(
e.span,
seg,
@ -68,9 +68,9 @@ impl<'hir> LoweringContext<'_, 'hir> {
ParenthesizedGenericArgs::Err,
&ImplTraitContext::Disallowed(ImplTraitPosition::Path),
));
let receiver = self.lower_expr(&args[0]);
let receiver = self.lower_expr(receiver);
let args =
self.arena.alloc_from_iter(args[1..].iter().map(|x| self.lower_expr_mut(x)));
self.arena.alloc_from_iter(args.iter().map(|x| self.lower_expr_mut(x)));
hir::ExprKind::MethodCall(hir_seg, receiver, args, self.lower_span(span))
}
ExprKind::Binary(binop, ref lhs, ref rhs) => {
@ -359,7 +359,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
let node_id = self.next_node_id();
// Add a definition for the in-band const def.
self.create_def(parent_def_id, node_id, DefPathData::AnonConst);
self.create_def(parent_def_id.def_id, node_id, DefPathData::AnonConst);
let anon_const = AnonConst { id: node_id, value: arg };
generic_args.push(AngleBracketedArg::Arg(GenericArg::Const(anon_const)));
@ -387,32 +387,58 @@ impl<'hir> LoweringContext<'_, 'hir> {
then: &Block,
else_opt: Option<&Expr>,
) -> hir::ExprKind<'hir> {
let lowered_cond = self.lower_expr(cond);
let new_cond = self.manage_let_cond(lowered_cond);
let lowered_cond = self.lower_cond(cond);
let then_expr = self.lower_block_expr(then);
if let Some(rslt) = else_opt {
hir::ExprKind::If(new_cond, self.arena.alloc(then_expr), Some(self.lower_expr(rslt)))
hir::ExprKind::If(
lowered_cond,
self.arena.alloc(then_expr),
Some(self.lower_expr(rslt)),
)
} else {
hir::ExprKind::If(new_cond, self.arena.alloc(then_expr), None)
hir::ExprKind::If(lowered_cond, self.arena.alloc(then_expr), None)
}
}
// If `cond` kind is `let`, returns `let`. Otherwise, wraps and returns `cond`
// in a temporary block.
fn manage_let_cond(&mut self, cond: &'hir hir::Expr<'hir>) -> &'hir hir::Expr<'hir> {
fn has_let_expr<'hir>(expr: &'hir hir::Expr<'hir>) -> bool {
match expr.kind {
hir::ExprKind::Binary(_, lhs, rhs) => has_let_expr(lhs) || has_let_expr(rhs),
hir::ExprKind::Let(..) => true,
// Lowers a condition (i.e. `cond` in `if cond` or `while cond`), wrapping it in a terminating scope
// so that temporaries created in the condition don't live beyond it.
fn lower_cond(&mut self, cond: &Expr) -> &'hir hir::Expr<'hir> {
fn has_let_expr(expr: &Expr) -> bool {
match &expr.kind {
ExprKind::Binary(_, lhs, rhs) => has_let_expr(lhs) || has_let_expr(rhs),
ExprKind::Let(..) => true,
_ => false,
}
}
if has_let_expr(cond) {
cond
} else {
let reason = DesugaringKind::CondTemporary;
let span_block = self.mark_span_with_reason(reason, cond.span, None);
self.expr_drop_temps(span_block, cond, AttrVec::new())
// We have to take special care for `let` exprs in the condition, e.g. in
// `if let pat = val` or `if foo && let pat = val`, as we _do_ want `val` to live beyond the
// condition in this case.
//
// In order to maintain the drop behavior for the non-`let` parts of the condition,
// we still wrap them in terminating scopes, e.g. `if foo && let pat = val` essentially
// gets transformed into `if { let _t = foo; _t } && let pat = val`
match &cond.kind {
ExprKind::Binary(op @ Spanned { node: ast::BinOpKind::And, .. }, lhs, rhs)
if has_let_expr(cond) =>
{
let op = self.lower_binop(*op);
let lhs = self.lower_cond(lhs);
let rhs = self.lower_cond(rhs);
self.arena.alloc(self.expr(
cond.span,
hir::ExprKind::Binary(op, lhs, rhs),
AttrVec::new(),
))
}
ExprKind::Let(..) => self.lower_expr(cond),
_ => {
let cond = self.lower_expr(cond);
let reason = DesugaringKind::CondTemporary;
let span_block = self.mark_span_with_reason(reason, cond.span, None);
self.expr_drop_temps(span_block, cond, AttrVec::new())
}
}
}
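
The drop-timing difference that `lower_cond` preserves is observable from ordinary Rust (an illustrative sketch): a temporary in a plain `if` condition is dropped before the body runs, while the value matched by `if let` lives until the end of the whole expression.

struct Guard(&'static str);

impl Drop for Guard {
    fn drop(&mut self) {
        println!("dropping {}", self.0);
    }
}

fn check(g: &Guard) -> bool {
    println!("checking {}", g.0);
    true
}

fn main() {
    // Plain `if`: the condition is wrapped in a terminating scope
    // (DesugaringKind::CondTemporary), so the temporary is dropped before
    // the body executes.
    if check(&Guard("cond-temp")) {
        println!("body of plain `if` (cond-temp already dropped)");
    }

    // `if let`: the matched value must stay alive for the body, so it is not
    // wrapped and is dropped only at the end of the whole `if let`.
    if let Some(g) = Some(Guard("let-bound")) {
        println!("body of `if let` ({} still alive)", g.0);
    }
}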
@ -439,14 +465,13 @@ impl<'hir> LoweringContext<'_, 'hir> {
body: &Block,
opt_label: Option<Label>,
) -> hir::ExprKind<'hir> {
let lowered_cond = self.with_loop_condition_scope(|t| t.lower_expr(cond));
let new_cond = self.manage_let_cond(lowered_cond);
let lowered_cond = self.with_loop_condition_scope(|t| t.lower_cond(cond));
let then = self.lower_block_expr(body);
let expr_break = self.expr_break(span, AttrVec::new());
let stmt_break = self.stmt_expr(span, expr_break);
let else_blk = self.block_all(span, arena_vec![self; stmt_break], None);
let else_expr = self.arena.alloc(self.expr_block(else_blk, AttrVec::new()));
let if_kind = hir::ExprKind::If(new_cond, self.arena.alloc(then), Some(else_expr));
let if_kind = hir::ExprKind::If(lowered_cond, self.arena.alloc(then), Some(else_expr));
let if_expr = self.expr(span, if_kind, AttrVec::new());
let block = self.block_expr(self.arena.alloc(if_expr));
let span = self.lower_span(span.with_hi(cond.span.hi()));
@ -1044,9 +1069,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
if let ExprKind::Path(qself, path) = &expr.kind {
// Does the path resolve to something disallowed in a tuple struct/variant pattern?
if let Some(partial_res) = self.resolver.get_partial_res(expr.id) {
if partial_res.unresolved_segments() == 0
&& !partial_res.base_res().expected_in_tuple_struct_pat()
{
if let Some(res) = partial_res.full_res() && !res.expected_in_tuple_struct_pat() {
return None;
}
}
@ -1066,9 +1089,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
if let ExprKind::Path(qself, path) = &expr.kind {
// Does the path resolve to something disallowed in a unit struct/variant pattern?
if let Some(partial_res) = self.resolver.get_partial_res(expr.id) {
if partial_res.unresolved_segments() == 0
&& !partial_res.base_res().expected_in_unit_struct_pat()
{
if let Some(res) = partial_res.full_res() && !res.expected_in_unit_struct_pat() {
return None;
}
}
@ -1609,11 +1630,11 @@ impl<'hir> LoweringContext<'_, 'hir> {
}
/// Desugar `ExprKind::Yeet` from: `do yeet <expr>` into:
/// ```rust
/// ```ignore(illustrative)
/// // If there is an enclosing `try {...}`:
/// break 'catch_target FromResidual::from_residual(Yeet(residual)),
/// break 'catch_target FromResidual::from_residual(Yeet(residual));
/// // Otherwise:
/// return FromResidual::from_residual(Yeet(residual)),
/// return FromResidual::from_residual(Yeet(residual));
/// ```
/// But to simplify this, there's a `from_yeet` lang item function which
/// handles the combined `FromResidual::from_residual(Yeet(residual))`.

View File

@ -24,7 +24,7 @@ pub(super) struct NodeCollector<'a, 'hir> {
/// The parent of this node
parent_node: hir::ItemLocalId,
owner: LocalDefId,
owner: OwnerId,
definitions: &'a definitions::Definitions,
}
@ -81,9 +81,9 @@ impl<'a, 'hir> NodeCollector<'a, 'hir> {
current_dep_node_owner={} ({:?}), hir_id.owner={} ({:?})",
self.source_map.span_to_diagnostic_string(span),
node,
self.definitions.def_path(self.owner).to_string_no_crate_verbose(),
self.definitions.def_path(self.owner.def_id).to_string_no_crate_verbose(),
self.owner,
self.definitions.def_path(hir_id.owner).to_string_no_crate_verbose(),
self.definitions.def_path(hir_id.owner.def_id).to_string_no_crate_verbose(),
hir_id.owner,
)
}
@ -112,19 +112,19 @@ impl<'a, 'hir> Visitor<'hir> for NodeCollector<'a, 'hir> {
fn visit_nested_item(&mut self, item: ItemId) {
debug!("visit_nested_item: {:?}", item);
self.insert_nested(item.def_id);
self.insert_nested(item.owner_id.def_id);
}
fn visit_nested_trait_item(&mut self, item_id: TraitItemId) {
self.insert_nested(item_id.def_id);
self.insert_nested(item_id.owner_id.def_id);
}
fn visit_nested_impl_item(&mut self, item_id: ImplItemId) {
self.insert_nested(item_id.def_id);
self.insert_nested(item_id.owner_id.def_id);
}
fn visit_nested_foreign_item(&mut self, foreign_id: ForeignItemId) {
self.insert_nested(foreign_id.def_id);
self.insert_nested(foreign_id.owner_id.def_id);
}
fn visit_nested_body(&mut self, id: BodyId) {
@ -143,7 +143,7 @@ impl<'a, 'hir> Visitor<'hir> for NodeCollector<'a, 'hir> {
#[instrument(level = "debug", skip(self))]
fn visit_item(&mut self, i: &'hir Item<'hir>) {
debug_assert_eq!(i.def_id, self.owner);
debug_assert_eq!(i.owner_id, self.owner);
self.with_parent(i.hir_id(), |this| {
if let ItemKind::Struct(ref struct_def, _) = i.kind {
// If this is a tuple or unit-like struct, register the constructor.
@ -157,7 +157,7 @@ impl<'a, 'hir> Visitor<'hir> for NodeCollector<'a, 'hir> {
#[instrument(level = "debug", skip(self))]
fn visit_foreign_item(&mut self, fi: &'hir ForeignItem<'hir>) {
debug_assert_eq!(fi.def_id, self.owner);
debug_assert_eq!(fi.owner_id, self.owner);
self.with_parent(fi.hir_id(), |this| {
intravisit::walk_foreign_item(this, fi);
});
@ -176,7 +176,7 @@ impl<'a, 'hir> Visitor<'hir> for NodeCollector<'a, 'hir> {
#[instrument(level = "debug", skip(self))]
fn visit_trait_item(&mut self, ti: &'hir TraitItem<'hir>) {
debug_assert_eq!(ti.def_id, self.owner);
debug_assert_eq!(ti.owner_id, self.owner);
self.with_parent(ti.hir_id(), |this| {
intravisit::walk_trait_item(this, ti);
});
@ -184,7 +184,7 @@ impl<'a, 'hir> Visitor<'hir> for NodeCollector<'a, 'hir> {
#[instrument(level = "debug", skip(self))]
fn visit_impl_item(&mut self, ii: &'hir ImplItem<'hir>) {
debug_assert_eq!(ii.def_id, self.owner);
debug_assert_eq!(ii.owner_id, self.owner);
self.with_parent(ii.hir_id(), |this| {
intravisit::walk_impl_item(this, ii);
});

View File

@ -1,4 +1,4 @@
use super::errors::{InvalidAbi, MisplacedRelaxTraitBound};
use super::errors::{InvalidAbi, InvalidAbiSuggestion, MisplacedRelaxTraitBound};
use super::ResolverAstLoweringExt;
use super::{Arena, AstOwner, ImplTraitContext, ImplTraitPosition};
use super::{FnDeclKind, LoweringContext, ParamMode};
@ -14,9 +14,10 @@ use rustc_hir::def_id::{LocalDefId, CRATE_DEF_ID};
use rustc_hir::PredicateOrigin;
use rustc_index::vec::{Idx, IndexVec};
use rustc_middle::ty::{DefIdTree, ResolverAstLowering, TyCtxt};
use rustc_span::lev_distance::find_best_match_for_name;
use rustc_span::source_map::DesugaringKind;
use rustc_span::symbol::{kw, sym, Ident};
use rustc_span::Span;
use rustc_span::{Span, Symbol};
use rustc_target::spec::abi;
use smallvec::{smallvec, SmallVec};
@ -67,7 +68,7 @@ impl<'a, 'hir> ItemLowerer<'a, 'hir> {
bodies: Vec::new(),
attrs: SortedMap::default(),
children: FxHashMap::default(),
current_hir_id_owner: CRATE_DEF_ID,
current_hir_id_owner: hir::CRATE_OWNER_ID,
item_local_id_counter: hir::ItemLocalId::new(0),
node_id_to_local_id: Default::default(),
local_id_to_def_id: SortedMap::new(),
@ -176,7 +177,8 @@ impl<'hir> LoweringContext<'_, 'hir> {
}
pub(super) fn lower_item_ref(&mut self, i: &Item) -> SmallVec<[hir::ItemId; 1]> {
let mut node_ids = smallvec![hir::ItemId { def_id: self.local_def_id(i.id) }];
let mut node_ids =
smallvec![hir::ItemId { owner_id: hir::OwnerId { def_id: self.local_def_id(i.id) } }];
if let ItemKind::Use(ref use_tree) = &i.kind {
self.lower_item_id_use_tree(use_tree, i.id, &mut node_ids);
}
@ -192,7 +194,9 @@ impl<'hir> LoweringContext<'_, 'hir> {
match tree.kind {
UseTreeKind::Nested(ref nested_vec) => {
for &(ref nested, id) in nested_vec {
vec.push(hir::ItemId { def_id: self.local_def_id(id) });
vec.push(hir::ItemId {
owner_id: hir::OwnerId { def_id: self.local_def_id(id) },
});
self.lower_item_id_use_tree(nested, id, vec);
}
}
@ -201,7 +205,9 @@ impl<'hir> LoweringContext<'_, 'hir> {
for (_, &id) in
iter::zip(self.expect_full_res_from_use(base_id).skip(1), &[id1, id2])
{
vec.push(hir::ItemId { def_id: self.local_def_id(id) });
vec.push(hir::ItemId {
owner_id: hir::OwnerId { def_id: self.local_def_id(id) },
});
}
}
}
@ -214,7 +220,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
let attrs = self.lower_attrs(hir_id, &i.attrs);
let kind = self.lower_item_kind(i.span, i.id, hir_id, &mut ident, attrs, vis_span, &i.kind);
let item = hir::Item {
def_id: hir_id.expect_owner(),
owner_id: hir_id.expect_owner(),
ident: self.lower_ident(ident),
kind,
vis_span,
@ -539,7 +545,11 @@ impl<'hir> LoweringContext<'_, 'hir> {
let ident = *ident;
let mut path = path.clone();
for seg in &mut path.segments {
seg.id = self.next_node_id();
// Give the cloned segment the same resolution information
// as the old one (this is needed for stability checking).
let new_id = self.next_node_id();
self.resolver.clone_res(seg.id, new_id);
seg.id = new_id;
}
let span = path.span;
@ -552,7 +562,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
}
let item = hir::Item {
def_id: new_id,
owner_id: hir::OwnerId { def_id: new_id },
ident: this.lower_ident(ident),
kind,
vis_span,
@ -608,7 +618,11 @@ impl<'hir> LoweringContext<'_, 'hir> {
// Give the segments new node-ids since they are being cloned.
for seg in &mut prefix.segments {
seg.id = self.next_node_id();
// Give the cloned segment the same resolution information
// as the old one (this is needed for stability checking).
let new_id = self.next_node_id();
self.resolver.clone_res(seg.id, new_id);
seg.id = new_id;
}
// Each `use` import is an item and thus are owners of the
@ -626,7 +640,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
}
let item = hir::Item {
def_id: new_hir_id,
owner_id: hir::OwnerId { def_id: new_hir_id },
ident: this.lower_ident(ident),
kind,
vis_span,
@ -646,10 +660,10 @@ impl<'hir> LoweringContext<'_, 'hir> {
fn lower_foreign_item(&mut self, i: &ForeignItem) -> &'hir hir::ForeignItem<'hir> {
let hir_id = self.lower_node_id(i.id);
let def_id = hir_id.expect_owner();
let owner_id = hir_id.expect_owner();
self.lower_attrs(hir_id, &i.attrs);
let item = hir::ForeignItem {
def_id,
owner_id,
ident: self.lower_ident(i.ident),
kind: match i.kind {
ForeignItemKind::Fn(box Fn { ref sig, ref generics, .. }) => {
@ -688,7 +702,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
fn lower_foreign_item_ref(&mut self, i: &ForeignItem) -> hir::ForeignItemRef {
hir::ForeignItemRef {
id: hir::ForeignItemId { def_id: self.local_def_id(i.id) },
id: hir::ForeignItemId { owner_id: hir::OwnerId { def_id: self.local_def_id(i.id) } },
ident: self.lower_ident(i.ident),
span: self.lower_span(i.span),
}
@ -798,7 +812,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
);
(generics, hir::TraitItemKind::Fn(sig, hir::TraitFn::Provided(body_id)), true)
}
AssocItemKind::TyAlias(box TyAlias {
AssocItemKind::Type(box TyAlias {
ref generics,
where_clauses,
ref bounds,
@ -831,7 +845,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
self.lower_attrs(hir_id, &i.attrs);
let item = hir::TraitItem {
def_id: trait_item_def_id,
owner_id: trait_item_def_id,
ident: self.lower_ident(i.ident),
generics,
kind,
@ -844,13 +858,13 @@ impl<'hir> LoweringContext<'_, 'hir> {
fn lower_trait_item_ref(&mut self, i: &AssocItem) -> hir::TraitItemRef {
let kind = match &i.kind {
AssocItemKind::Const(..) => hir::AssocItemKind::Const,
AssocItemKind::TyAlias(..) => hir::AssocItemKind::Type,
AssocItemKind::Type(..) => hir::AssocItemKind::Type,
AssocItemKind::Fn(box Fn { sig, .. }) => {
hir::AssocItemKind::Fn { has_self: sig.decl.has_self() }
}
AssocItemKind::MacCall(..) => unimplemented!(),
};
let id = hir::TraitItemId { def_id: self.local_def_id(i.id) };
let id = hir::TraitItemId { owner_id: hir::OwnerId { def_id: self.local_def_id(i.id) } };
hir::TraitItemRef {
id,
ident: self.lower_ident(i.ident),
@ -892,7 +906,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
(generics, hir::ImplItemKind::Fn(sig, body_id))
}
AssocItemKind::TyAlias(box TyAlias { generics, where_clauses, ty, .. }) => {
AssocItemKind::Type(box TyAlias { generics, where_clauses, ty, .. }) => {
let mut generics = generics.clone();
add_ty_alias_where_clause(&mut generics, *where_clauses, false);
self.lower_generics(
@ -902,11 +916,11 @@ impl<'hir> LoweringContext<'_, 'hir> {
|this| match ty {
None => {
let ty = this.arena.alloc(this.ty(i.span, hir::TyKind::Err));
hir::ImplItemKind::TyAlias(ty)
hir::ImplItemKind::Type(ty)
}
Some(ty) => {
let ty = this.lower_ty(ty, &ImplTraitContext::TypeAliasesOpaqueTy);
hir::ImplItemKind::TyAlias(ty)
hir::ImplItemKind::Type(ty)
}
},
)
@ -917,7 +931,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
let hir_id = self.lower_node_id(i.id);
self.lower_attrs(hir_id, &i.attrs);
let item = hir::ImplItem {
def_id: hir_id.expect_owner(),
owner_id: hir_id.expect_owner(),
ident: self.lower_ident(i.ident),
generics,
kind,
@ -930,18 +944,21 @@ impl<'hir> LoweringContext<'_, 'hir> {
fn lower_impl_item_ref(&mut self, i: &AssocItem) -> hir::ImplItemRef {
hir::ImplItemRef {
id: hir::ImplItemId { def_id: self.local_def_id(i.id) },
id: hir::ImplItemId { owner_id: hir::OwnerId { def_id: self.local_def_id(i.id) } },
ident: self.lower_ident(i.ident),
span: self.lower_span(i.span),
kind: match &i.kind {
AssocItemKind::Const(..) => hir::AssocItemKind::Const,
AssocItemKind::TyAlias(..) => hir::AssocItemKind::Type,
AssocItemKind::Type(..) => hir::AssocItemKind::Type,
AssocItemKind::Fn(box Fn { sig, .. }) => {
hir::AssocItemKind::Fn { has_self: sig.decl.has_self() }
}
AssocItemKind::MacCall(..) => unimplemented!(),
},
trait_item_def_id: self.resolver.get_partial_res(i.id).map(|r| r.base_res().def_id()),
trait_item_def_id: self
.resolver
.get_partial_res(i.id)
.map(|r| r.expect_full_res().def_id()),
}
}
@ -1049,9 +1066,9 @@ impl<'hir> LoweringContext<'_, 'hir> {
asyncness: Async,
body: Option<&Block>,
) -> hir::BodyId {
let closure_id = match asyncness {
Async::Yes { closure_id, .. } => closure_id,
Async::No => return self.lower_fn_body_block(span, decl, body),
let (closure_id, body) = match (asyncness, body) {
(Async::Yes { closure_id, .. }, Some(body)) => (closure_id, body),
_ => return self.lower_fn_body_block(span, decl, body),
};
self.lower_body(|this| {
@ -1193,16 +1210,15 @@ impl<'hir> LoweringContext<'_, 'hir> {
parameters.push(new_parameter);
}
let body_span = body.map_or(span, |b| b.span);
let async_expr = this.make_async_expr(
CaptureBy::Value,
closure_id,
None,
body_span,
body.span,
hir::AsyncGeneratorKind::Fn,
|this| {
// Create a block from the user's function body:
let user_body = this.lower_block_expr_opt(body_span, body);
let user_body = this.lower_block_expr(body);
// Transform into `drop-temps { <user-body> }`, an expression:
let desugared_span =
@ -1234,7 +1250,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
(
this.arena.alloc_from_iter(parameters),
this.expr(body_span, async_expr, AttrVec::new()),
this.expr(body.span, async_expr, AttrVec::new()),
)
})
}
@ -1280,10 +1296,19 @@ impl<'hir> LoweringContext<'_, 'hir> {
}
fn error_on_invalid_abi(&self, abi: StrLit) {
let abi_names = abi::enabled_names(self.tcx.features(), abi.span)
.iter()
.map(|s| Symbol::intern(s))
.collect::<Vec<_>>();
let suggested_name = find_best_match_for_name(&abi_names, abi.symbol_unescaped, None);
self.tcx.sess.emit_err(InvalidAbi {
abi: abi.symbol_unescaped,
span: abi.span,
abi: abi.symbol,
valid_abis: abi::all_names().join(", "),
suggestion: suggested_name.map(|suggested_name| InvalidAbiSuggestion {
span: abi.span,
suggestion: format!("\"{suggested_name}\""),
}),
command: "rustc --print=calling-conventions".to_string(),
});
}
@ -1335,9 +1360,9 @@ impl<'hir> LoweringContext<'_, 'hir> {
match self
.resolver
.get_partial_res(bound_pred.bounded_ty.id)
.map(|d| (d.base_res(), d.unresolved_segments()))
.and_then(|r| r.full_res())
{
Some((Res::Def(DefKind::TyParam, def_id), 0))
Some(Res::Def(DefKind::TyParam, def_id))
if bound_pred.bound_generic_params.is_empty() =>
{
generics
@ -1464,6 +1489,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
let bounded_ty =
self.ty_path(ty_id, param_span, hir::QPath::Resolved(None, ty_path));
Some(hir::WherePredicate::BoundPredicate(hir::WhereBoundPredicate {
hir_id: self.next_id(),
bounded_ty: self.arena.alloc(bounded_ty),
bounds,
span,
@ -1494,6 +1520,7 @@ impl<'hir> LoweringContext<'_, 'hir> {
ref bounds,
span,
}) => hir::WherePredicate::BoundPredicate(hir::WhereBoundPredicate {
hir_id: self.next_id(),
bound_generic_params: self.lower_generic_params(bound_generic_params),
bounded_ty: self
.lower_ty(bounded_ty, &ImplTraitContext::Disallowed(ImplTraitPosition::Type)),

View File

@ -32,7 +32,6 @@
#![feature(box_patterns)]
#![feature(let_chains)]
#![cfg_attr(bootstrap, feature(let_else))]
#![feature(never_type)]
#![recursion_limit = "256"]
#![allow(rustc::potential_query_instability)]
@ -62,8 +61,8 @@ use rustc_hir::def_id::{LocalDefId, CRATE_DEF_ID};
use rustc_hir::definitions::DefPathData;
use rustc_hir::{ConstArg, GenericArg, ItemLocalId, ParamName, TraitCandidate};
use rustc_index::vec::{Idx, IndexVec};
use rustc_middle::span_bug;
use rustc_middle::ty::{ResolverAstLowering, TyCtxt};
use rustc_middle::{bug, span_bug};
use rustc_session::parse::feature_err;
use rustc_span::hygiene::MacroKind;
use rustc_span::source_map::DesugaringKind;
@ -126,7 +125,7 @@ struct LoweringContext<'a, 'hir> {
is_in_trait_impl: bool,
is_in_dyn_type: bool,
current_hir_id_owner: LocalDefId,
current_hir_id_owner: hir::OwnerId,
item_local_id_counter: hir::ItemLocalId,
local_id_to_def_id: SortedMap<ItemLocalId, LocalDefId>,
trait_map: FxHashMap<ItemLocalId, Box<[TraitCandidate]>>,
@ -161,6 +160,10 @@ trait ResolverAstLoweringExt {
fn legacy_const_generic_args(&self, expr: &Expr) -> Option<Vec<usize>>;
fn get_partial_res(&self, id: NodeId) -> Option<PartialRes>;
fn get_import_res(&self, id: NodeId) -> PerNS<Option<Res<NodeId>>>;
// Clones the resolution (if any) on 'source' and applies it
// to 'target'. Used when desugaring a `UseTreeKind::Nested` to
// multiple `UseTreeKind::Simple`s
fn clone_res(&mut self, source: NodeId, target: NodeId);
fn get_label_res(&self, id: NodeId) -> Option<NodeId>;
fn get_lifetime_res(&self, id: NodeId) -> Option<LifetimeRes>;
fn take_extra_lifetime_params(&mut self, id: NodeId) -> Vec<(Ident, NodeId, LifetimeRes)>;
@ -176,12 +179,7 @@ impl ResolverAstLoweringExt for ResolverAstLowering {
return None;
}
let partial_res = self.partial_res_map.get(&expr.id)?;
if partial_res.unresolved_segments() != 0 {
return None;
}
if let Res::Def(DefKind::Fn, def_id) = partial_res.base_res() {
if let Res::Def(DefKind::Fn, def_id) = self.partial_res_map.get(&expr.id)?.full_res()? {
// We only support cross-crate argument rewriting. Uses
// within the same crate should be updated to use the new
// const generics style.
@ -198,6 +196,12 @@ impl ResolverAstLoweringExt for ResolverAstLowering {
None
}
fn clone_res(&mut self, source: NodeId, target: NodeId) {
if let Some(res) = self.partial_res_map.get(&source) {
self.partial_res_map.insert(target, *res);
}
}
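
What `clone_res` supports, shown at the surface level (illustrative only): one nested use tree is lowered into several simple imports, the shared prefix segments are cloned for each generated item, and the clones must carry the original segments' resolutions so stability checking still works.

// A nested use tree ...
use std::fmt::{Debug, Display};
// ... behaves as if each leaf were written as its own simple import:
//     use std::fmt::Debug;
//     use std::fmt::Display;

fn show<T: Debug + Display>(value: T) {
    println!("{value} / {value:?}");
}

fn main() {
    show(42);
}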
/// Obtains resolution for a `NodeId` with a single resolution.
fn get_partial_res(&self, id: NodeId) -> Option<PartialRes> {
self.partial_res_map.get(&id).copied()
@ -324,17 +328,20 @@ enum FnDeclKind {
}
impl FnDeclKind {
fn impl_trait_return_allowed(&self, tcx: TyCtxt<'_>) -> bool {
fn impl_trait_allowed(&self, tcx: TyCtxt<'_>) -> bool {
match self {
FnDeclKind::Fn | FnDeclKind::Inherent => true,
FnDeclKind::Impl if tcx.features().return_position_impl_trait_in_trait => true,
FnDeclKind::Trait if tcx.features().return_position_impl_trait_in_trait => true,
_ => false,
}
}
fn impl_trait_in_trait_allowed(&self, tcx: TyCtxt<'_>) -> bool {
fn async_fn_allowed(&self, tcx: TyCtxt<'_>) -> bool {
match self {
FnDeclKind::Trait if tcx.features().return_position_impl_trait_in_trait => true,
FnDeclKind::Fn | FnDeclKind::Inherent => true,
FnDeclKind::Impl if tcx.features().async_fn_in_trait => true,
FnDeclKind::Trait if tcx.features().async_fn_in_trait => true,
_ => false,
}
}
@ -572,7 +579,8 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
let current_node_ids = std::mem::take(&mut self.node_id_to_local_id);
let current_id_to_def_id = std::mem::take(&mut self.local_id_to_def_id);
let current_trait_map = std::mem::take(&mut self.trait_map);
let current_owner = std::mem::replace(&mut self.current_hir_id_owner, def_id);
let current_owner =
std::mem::replace(&mut self.current_hir_id_owner, hir::OwnerId { def_id });
let current_local_counter =
std::mem::replace(&mut self.item_local_id_counter, hir::ItemLocalId::new(1));
let current_impl_trait_defs = std::mem::take(&mut self.impl_trait_defs);
@ -587,7 +595,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
debug_assert_eq!(_old, None);
let item = f(self);
debug_assert_eq!(def_id, item.def_id());
debug_assert_eq!(def_id, item.def_id().def_id);
// `f` should have consumed all the elements in these vectors when constructing `item`.
debug_assert!(self.impl_trait_defs.is_empty());
debug_assert!(self.impl_trait_bounds.is_empty());
@ -753,12 +761,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
}
fn expect_full_res(&mut self, id: NodeId) -> Res<NodeId> {
self.resolver.get_partial_res(id).map_or(Res::Err, |pr| {
if pr.unresolved_segments() != 0 {
panic!("path not fully resolved: {:?}", pr);
}
pr.base_res()
})
self.resolver.get_partial_res(id).map_or(Res::Err, |pr| pr.expect_full_res())
}
fn expect_full_res_from_use(&mut self, id: NodeId) -> impl Iterator<Item = Res<NodeId>> {
@ -786,7 +789,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
/// Mark a span as relative to the current owning item.
fn lower_span(&self, span: Span) -> Span {
if self.tcx.sess.opts.unstable_opts.incremental_relative_spans {
span.with_parent(Some(self.current_hir_id_owner))
span.with_parent(Some(self.current_hir_id_owner.def_id))
} else {
// Do not make spans relative when not using incremental compilation.
span
@ -812,7 +815,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
LifetimeRes::Fresh { param, .. } => {
// Late resolution delegates to us the creation of the `LocalDefId`.
let _def_id = self.create_def(
self.current_hir_id_owner,
self.current_hir_id_owner.def_id,
param,
DefPathData::LifetimeNs(kw::UnderscoreLifetime),
);
@ -1060,9 +1063,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// Desugar `AssocTy: Bounds` into `AssocTy = impl Bounds`. We do this by
// constructing the HIR for `impl bounds...` and then lowering that.
let parent_def_id = self.current_hir_id_owner;
let impl_trait_node_id = self.next_node_id();
self.create_def(parent_def_id, impl_trait_node_id, DefPathData::ImplTrait);
self.with_dyn_type_scope(false, |this| {
let node_id = this.next_node_id();
@ -1140,8 +1141,11 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// type and value namespaces. If we resolved the path in the value namespace, we
// transform it into a generic const argument.
TyKind::Path(ref qself, ref path) => {
if let Some(partial_res) = self.resolver.get_partial_res(ty.id) {
let res = partial_res.base_res();
if let Some(res) = self
.resolver
.get_partial_res(ty.id)
.and_then(|partial_res| partial_res.full_res())
{
if !res.matches_ns(Namespace::TypeNS) {
debug!(
"lower_generic_arg: Lowering type argument as const argument: {:?}",
@ -1154,7 +1158,11 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
let node_id = self.next_node_id();
// Add a definition for the in-band const def.
self.create_def(parent_def_id, node_id, DefPathData::AnonConst);
self.create_def(
parent_def_id.def_id,
node_id,
DefPathData::AnonConst,
);
let span = self.lower_span(ty.span);
let path_expr = Expr {
@ -1204,8 +1212,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// by `ty_path`.
if qself.is_none()
&& let Some(partial_res) = self.resolver.get_partial_res(t.id)
&& partial_res.unresolved_segments() == 0
&& let Res::Def(DefKind::Trait | DefKind::TraitAlias, _) = partial_res.base_res()
&& let Some(Res::Def(DefKind::Trait | DefKind::TraitAlias, _)) = partial_res.full_res()
{
let (bounds, lifetime_bound) = self.with_dyn_type_scope(true, |this| {
let poly_trait_ref = this.ast_arena.ptr.alloc(PolyTraitRef {
@ -1349,9 +1356,14 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
def_node_id,
bounds,
false,
&ImplTraitContext::TypeAliasesOpaqueTy,
itctx,
),
ImplTraitContext::Universal => {
self.create_def(
self.current_hir_id_owner.def_id,
def_node_id,
DefPathData::ImplTrait,
);
let span = t.span;
let ident = Ident::from_str_and_span(&pprust::ty_to_string(t), span);
let (param, bounds, path) =
@ -1445,7 +1457,17 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// frequently opened issues show.
let opaque_ty_span = self.mark_span_with_reason(DesugaringKind::OpaqueTy, span, None);
let opaque_ty_def_id = self.local_def_id(opaque_ty_node_id);
let opaque_ty_def_id = match origin {
hir::OpaqueTyOrigin::TyAlias => self.create_def(
self.current_hir_id_owner.def_id,
opaque_ty_node_id,
DefPathData::ImplTrait,
),
hir::OpaqueTyOrigin::FnReturn(fn_def_id) => {
self.create_def(fn_def_id, opaque_ty_node_id, DefPathData::ImplTrait)
}
hir::OpaqueTyOrigin::AsyncFn(..) => bug!("unreachable"),
};
debug!(?opaque_ty_def_id);
// Contains the new lifetime definitions created for the TAIT (if any).
@ -1551,7 +1573,11 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
debug!(?lifetimes);
// `impl Trait` now just becomes `Foo<'a, 'b, ..>`.
hir::TyKind::OpaqueDef(hir::ItemId { def_id: opaque_ty_def_id }, lifetimes, in_trait)
hir::TyKind::OpaqueDef(
hir::ItemId { owner_id: hir::OwnerId { def_id: opaque_ty_def_id } },
lifetimes,
in_trait,
)
}
/// Registers a new opaque type with the proper `NodeId`s and
@ -1567,7 +1593,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// Generate an `type Foo = impl Trait;` declaration.
trace!("registering opaque type with id {:#?}", opaque_ty_id);
let opaque_ty_item = hir::Item {
def_id: opaque_ty_id,
owner_id: hir::OwnerId { def_id: opaque_ty_id },
ident: Ident::empty(),
kind: opaque_ty_item_kind,
vis_span: self.lower_span(span.shrink_to_lo()),
@ -1701,62 +1727,38 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
}));
let output = if let Some((ret_id, span)) = make_ret_async {
match kind {
FnDeclKind::Trait => {
if !kind.impl_trait_in_trait_allowed(self.tcx) {
if !kind.async_fn_allowed(self.tcx) {
match kind {
FnDeclKind::Trait | FnDeclKind::Impl => {
self.tcx
.sess
.create_feature_err(
TraitFnAsync { fn_span, span },
sym::return_position_impl_trait_in_trait,
sym::async_fn_in_trait,
)
.emit();
}
self.lower_async_fn_ret_ty(
&decl.output,
fn_node_id.expect("`make_ret_async` but no `fn_def_id`"),
ret_id,
true,
)
}
_ => {
if !kind.impl_trait_return_allowed(self.tcx) {
if kind == FnDeclKind::Impl {
self.tcx
.sess
.create_feature_err(
TraitFnAsync { fn_span, span },
sym::return_position_impl_trait_in_trait,
)
.emit();
} else {
self.tcx.sess.emit_err(TraitFnAsync { fn_span, span });
}
_ => {
self.tcx.sess.emit_err(TraitFnAsync { fn_span, span });
}
self.lower_async_fn_ret_ty(
&decl.output,
fn_node_id.expect("`make_ret_async` but no `fn_def_id`"),
ret_id,
false,
)
}
}
self.lower_async_fn_ret_ty(
&decl.output,
fn_node_id.expect("`make_ret_async` but no `fn_def_id`"),
ret_id,
matches!(kind, FnDeclKind::Trait),
)
} else {
match decl.output {
FnRetTy::Ty(ref ty) => {
let mut context = match fn_node_id {
Some(fn_node_id) if kind.impl_trait_return_allowed(self.tcx) => {
Some(fn_node_id) if kind.impl_trait_allowed(self.tcx) => {
let fn_def_id = self.local_def_id(fn_node_id);
ImplTraitContext::ReturnPositionOpaqueTy {
origin: hir::OpaqueTyOrigin::FnReturn(fn_def_id),
in_trait: false,
}
}
Some(fn_node_id) if kind.impl_trait_in_trait_allowed(self.tcx) => {
let fn_def_id = self.local_def_id(fn_node_id);
ImplTraitContext::ReturnPositionOpaqueTy {
origin: hir::OpaqueTyOrigin::FnReturn(fn_def_id),
in_trait: true,
in_trait: matches!(kind, FnDeclKind::Trait),
}
}
_ => ImplTraitContext::Disallowed(match kind {
@ -1950,9 +1952,13 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
let future_bound = this.lower_async_fn_output_type_to_future_bound(
output,
span,
ImplTraitContext::ReturnPositionOpaqueTy {
origin: hir::OpaqueTyOrigin::FnReturn(fn_def_id),
in_trait,
if in_trait && !this.tcx.features().return_position_impl_trait_in_trait {
ImplTraitContext::Disallowed(ImplTraitPosition::TraitReturn)
} else {
ImplTraitContext::ReturnPositionOpaqueTy {
origin: hir::OpaqueTyOrigin::FnReturn(fn_def_id),
in_trait,
}
},
);
@ -2038,7 +2044,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// async fn, so the *type parameters* are inherited. It's
// only the lifetime parameters that we must supply.
let opaque_ty_ref = hir::TyKind::OpaqueDef(
hir::ItemId { def_id: opaque_ty_def_id },
hir::ItemId { owner_id: hir::OwnerId { def_id: opaque_ty_def_id } },
generic_args,
in_trait,
);

View File

@ -239,7 +239,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
ident: Ident,
lower_sub: impl FnOnce(&mut Self) -> Option<&'hir hir::Pat<'hir>>,
) -> hir::PatKind<'hir> {
match self.resolver.get_partial_res(p.id).map(|d| d.base_res()) {
match self.resolver.get_partial_res(p.id).map(|d| d.expect_full_res()) {
// `None` can occur in body-less function signatures
res @ (None | Some(Res::Local(_))) => {
let canonical_id = match res {

View File

@ -29,11 +29,13 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
let partial_res =
self.resolver.get_partial_res(id).unwrap_or_else(|| PartialRes::new(Res::Err));
let base_res = partial_res.base_res();
let unresolved_segments = partial_res.unresolved_segments();
let path_span_lo = p.span.shrink_to_lo();
let proj_start = p.segments.len() - partial_res.unresolved_segments();
let proj_start = p.segments.len() - unresolved_segments;
let path = self.arena.alloc(hir::Path {
res: self.lower_res(partial_res.base_res()),
res: self.lower_res(base_res),
segments: self.arena.alloc_from_iter(p.segments[..proj_start].iter().enumerate().map(
|(i, segment)| {
let param_mode = match (qself_position, param_mode) {
@ -46,7 +48,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
_ => param_mode,
};
let parenthesized_generic_args = match partial_res.base_res() {
let parenthesized_generic_args = match base_res {
// `a::b::Trait(Args)`
Res::Def(DefKind::Trait, _) if i + 1 == proj_start => {
ParenthesizedGenericArgs::Ok
@ -83,7 +85,7 @@ impl<'a, 'hir> LoweringContext<'a, 'hir> {
// Simple case, either no projections, or only fully-qualified.
// E.g., `std::mem::size_of` or `<I as Iterator>::Item`.
if partial_res.unresolved_segments() == 0 {
if unresolved_segments == 0 {
return hir::QPath::Resolved(qself, path);
}

View File

@ -14,6 +14,7 @@ use rustc_ast::*;
use rustc_ast_pretty::pprust::{self, State};
use rustc_data_structures::fx::FxHashMap;
use rustc_errors::{error_code, fluent, pluralize, struct_span_err, Applicability};
use rustc_macros::Subdiagnostic;
use rustc_parse::validate_attr;
use rustc_session::lint::builtin::{
DEPRECATED_WHERE_CLAUSE_LOCATION, MISSING_ABI, PATTERNS_IN_FNS_WITHOUT_BODY,
@ -38,6 +39,13 @@ enum SelfSemantic {
No,
}
/// What is the context that prevents using `~const`?
enum DisallowTildeConstContext<'a> {
TraitObject,
ImplTrait,
Fn(FnKind<'a>),
}
struct AstValidator<'a> {
session: &'a Session,
@ -56,7 +64,7 @@ struct AstValidator<'a> {
/// e.g., `impl Iterator<Item = impl Debug>`.
outer_impl_trait: Option<Span>,
is_tilde_const_allowed: bool,
disallow_tilde_const: Option<DisallowTildeConstContext<'a>>,
/// Used to ban `impl Trait` in path projections like `<impl Iterator>::Item`
/// or `Foo::Bar<impl Trait>`
@ -93,18 +101,26 @@ impl<'a> AstValidator<'a> {
self.is_impl_trait_banned = old;
}
fn with_tilde_const(&mut self, allowed: bool, f: impl FnOnce(&mut Self)) {
let old = mem::replace(&mut self.is_tilde_const_allowed, allowed);
fn with_tilde_const(
&mut self,
disallowed: Option<DisallowTildeConstContext<'a>>,
f: impl FnOnce(&mut Self),
) {
let old = mem::replace(&mut self.disallow_tilde_const, disallowed);
f(self);
self.is_tilde_const_allowed = old;
self.disallow_tilde_const = old;
}
fn with_tilde_const_allowed(&mut self, f: impl FnOnce(&mut Self)) {
self.with_tilde_const(true, f)
self.with_tilde_const(None, f)
}
fn with_banned_tilde_const(&mut self, f: impl FnOnce(&mut Self)) {
self.with_tilde_const(false, f)
fn with_banned_tilde_const(
&mut self,
ctx: DisallowTildeConstContext<'a>,
f: impl FnOnce(&mut Self),
) {
self.with_tilde_const(Some(ctx), f)
}
fn with_let_management(
@ -154,7 +170,7 @@ impl<'a> AstValidator<'a> {
DEPRECATED_WHERE_CLAUSE_LOCATION,
id,
where_clauses.0.1,
fluent::ast_passes::deprecated_where_clause_location,
fluent::ast_passes_deprecated_where_clause_location,
BuiltinLintDiagnostics::DeprecatedWhereclauseLocation(
where_clauses.1.1.shrink_to_hi(),
suggestion,
@ -172,7 +188,7 @@ impl<'a> AstValidator<'a> {
fn with_impl_trait(&mut self, outer: Option<Span>, f: impl FnOnce(&mut Self)) {
let old = mem::replace(&mut self.outer_impl_trait, outer);
if outer.is_some() {
self.with_banned_tilde_const(f);
self.with_banned_tilde_const(DisallowTildeConstContext::ImplTrait, f);
} else {
f(self);
}
@ -197,7 +213,10 @@ impl<'a> AstValidator<'a> {
TyKind::ImplTrait(..) => {
self.with_impl_trait(Some(t.span), |this| visit::walk_ty(this, t))
}
TyKind::TraitObject(..) => self.with_banned_tilde_const(|this| visit::walk_ty(this, t)),
TyKind::TraitObject(..) => self
.with_banned_tilde_const(DisallowTildeConstContext::TraitObject, |this| {
visit::walk_ty(this, t)
}),
TyKind::Path(ref qself, ref path) => {
// We allow these:
// - `Option<impl Trait>`
@ -233,20 +252,6 @@ impl<'a> AstValidator<'a> {
}
}
fn visit_struct_field_def(&mut self, field: &'a FieldDef) {
if let Some(ident) = field.ident {
if ident.name == kw::Underscore {
self.visit_vis(&field.vis);
self.visit_ident(ident);
self.visit_ty_common(&field.ty);
self.walk_ty(&field.ty);
walk_list!(self, visit_attribute, &field.attrs);
return;
}
}
self.visit_field_def(field);
}
fn err_handler(&self) -> &rustc_errors::Handler {
&self.session.diagnostic()
}
@ -987,8 +992,8 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
visit::walk_lifetime(self, lifetime);
}
fn visit_field_def(&mut self, s: &'a FieldDef) {
visit::walk_field_def(self, s)
fn visit_field_def(&mut self, field: &'a FieldDef) {
visit::walk_field_def(self, field)
}
fn visit_item(&mut self, item: &'a Item) {
@ -1176,42 +1181,10 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
self.check_mod_file_item_asciionly(item.ident);
}
}
ItemKind::Struct(ref vdata, ref generics) => match vdata {
// Duplicating the `Visitor` logic allows catching all cases
// of `Anonymous(Struct, Union)` outside of a field struct or union.
//
// Inside `visit_ty` the validator catches every `Anonymous(Struct, Union)` it
// encounters, and only on `ItemKind::Struct` and `ItemKind::Union`
// it uses `visit_ty_common`, which doesn't contain that specific check.
VariantData::Struct(ref fields, ..) => {
self.visit_vis(&item.vis);
self.visit_ident(item.ident);
self.visit_generics(generics);
self.with_banned_assoc_ty_bound(|this| {
walk_list!(this, visit_struct_field_def, fields);
});
walk_list!(self, visit_attribute, &item.attrs);
return;
}
_ => {}
},
ItemKind::Union(ref vdata, ref generics) => {
ItemKind::Union(ref vdata, ..) => {
if vdata.fields().is_empty() {
self.err_handler().span_err(item.span, "unions cannot have zero fields");
}
match vdata {
VariantData::Struct(ref fields, ..) => {
self.visit_vis(&item.vis);
self.visit_ident(item.ident);
self.visit_generics(generics);
self.with_banned_assoc_ty_bound(|this| {
walk_list!(this, visit_struct_field_def, fields);
});
walk_list!(self, visit_attribute, &item.attrs);
return;
}
_ => {}
}
}
ItemKind::Const(def, .., None) => {
self.check_defaultness(item.span, def);
@ -1411,13 +1384,15 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
);
err.emit();
}
(_, TraitBoundModifier::MaybeConst) => {
if !self.is_tilde_const_allowed {
self.err_handler()
.struct_span_err(bound.span(), "`~const` is not allowed here")
.note("only allowed on bounds on traits' associated types and functions, const fns, const impls and its associated functions")
.emit();
}
(_, TraitBoundModifier::MaybeConst) if let Some(reason) = &self.disallow_tilde_const => {
let mut err = self.err_handler().struct_span_err(bound.span(), "`~const` is not allowed here");
match reason {
DisallowTildeConstContext::TraitObject => err.note("trait objects cannot have `~const` trait bounds"),
DisallowTildeConstContext::ImplTrait => err.note("`impl Trait`s cannot have `~const` trait bounds"),
DisallowTildeConstContext::Fn(FnKind::Closure(..)) => err.note("closures cannot have `~const` trait bounds"),
DisallowTildeConstContext::Fn(FnKind::Fn(_, ident, ..)) => err.span_note(ident.span, "this function is not `const`, so it cannot have `~const` trait bounds"),
};
err.emit();
}
(_, TraitBoundModifier::MaybeConstMaybe) => {
self.err_handler()
@ -1524,10 +1499,12 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
}
let tilde_const_allowed =
matches!(fk.header(), Some(FnHeader { constness: Const::Yes(_), .. }))
matches!(fk.header(), Some(FnHeader { constness: ast::Const::Yes(_), .. }))
|| matches!(fk.ctxt(), Some(FnCtxt::Assoc(_)));
self.with_tilde_const(tilde_const_allowed, |this| visit::walk_fn(this, fk));
let disallowed = (!tilde_const_allowed).then(|| DisallowTildeConstContext::Fn(fk));
self.with_tilde_const(disallowed, |this| visit::walk_fn(this, fk));
}
fn visit_assoc_item(&mut self, item: &'a AssocItem, ctxt: AssocCtxt) {
@ -1557,7 +1534,7 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
});
}
}
AssocItemKind::TyAlias(box TyAlias {
AssocItemKind::Type(box TyAlias {
generics,
where_clauses,
where_predicates_split,
@ -1596,7 +1573,7 @@ impl<'a> Visitor<'a> for AstValidator<'a> {
}
match item.kind {
AssocItemKind::TyAlias(box TyAlias { ref generics, ref bounds, ref ty, .. })
AssocItemKind::Type(box TyAlias { ref generics, ref bounds, ref ty, .. })
if ctxt == AssocCtxt::Trait =>
{
self.visit_vis(&item.vis);
@ -1771,7 +1748,7 @@ pub fn check_crate(session: &Session, krate: &Crate, lints: &mut LintBuffer) ->
in_const_trait_impl: false,
has_proc_macro_decls: false,
outer_impl_trait: None,
is_tilde_const_allowed: false,
disallow_tilde_const: None,
is_impl_trait_banned: false,
is_assoc_ty_bound_banned: false,
forbidden_let_reason: Some(ForbiddenLetReason::GenericForbidden),
@ -1783,15 +1760,17 @@ pub fn check_crate(session: &Session, krate: &Crate, lints: &mut LintBuffer) ->
}
/// Used to forbid `let` expressions in certain syntactic locations.
#[derive(Clone, Copy)]
#[derive(Clone, Copy, Subdiagnostic)]
pub(crate) enum ForbiddenLetReason {
/// `let` is not valid and the source environment is not important
GenericForbidden,
/// A let chain with the `||` operator
NotSupportedOr(Span),
#[note(not_supported_or)]
NotSupportedOr(#[primary_span] Span),
/// A let chain with invalid parentheses
///
/// For example, `let 1 = 1 && (expr && expr)` is allowed
/// but `(let 1 = 1 && (let 1 = 1 && (let 1 = 1))) && let a = 1` is not
NotSupportedParentheses(Span),
#[note(not_supported_parentheses)]
NotSupportedParentheses(#[primary_span] Span),
}


@ -1,13 +1,13 @@
//! Errors emitted by ast_passes.
use rustc_errors::{fluent, AddSubdiagnostic, Applicability, Diagnostic};
use rustc_macros::{SessionDiagnostic, SessionSubdiagnostic};
use rustc_errors::{fluent, AddToDiagnostic, Applicability, Diagnostic, SubdiagnosticMessage};
use rustc_macros::{Diagnostic, Subdiagnostic};
use rustc_span::{Span, Symbol};
use crate::ast_validation::ForbiddenLetReason;
#[derive(SessionDiagnostic)]
#[diag(ast_passes::forbidden_let)]
#[derive(Diagnostic)]
#[diag(ast_passes_forbidden_let)]
#[note]
pub struct ForbiddenLet {
#[primary_span]
@ -16,130 +16,116 @@ pub struct ForbiddenLet {
pub(crate) reason: ForbiddenLetReason,
}
impl AddSubdiagnostic for ForbiddenLetReason {
fn add_to_diagnostic(self, diag: &mut Diagnostic) {
match self {
Self::GenericForbidden => {}
Self::NotSupportedOr(span) => {
diag.span_note(span, fluent::ast_passes::not_supported_or);
}
Self::NotSupportedParentheses(span) => {
diag.span_note(span, fluent::ast_passes::not_supported_parentheses);
}
}
}
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::forbidden_let_stable)]
#[derive(Diagnostic)]
#[diag(ast_passes_forbidden_let_stable)]
#[note]
pub struct ForbiddenLetStable {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::forbidden_assoc_constraint)]
#[derive(Diagnostic)]
#[diag(ast_passes_forbidden_assoc_constraint)]
pub struct ForbiddenAssocConstraint {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::keyword_lifetime)]
#[derive(Diagnostic)]
#[diag(ast_passes_keyword_lifetime)]
pub struct KeywordLifetime {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::invalid_label)]
#[derive(Diagnostic)]
#[diag(ast_passes_invalid_label)]
pub struct InvalidLabel {
#[primary_span]
pub span: Span,
pub name: Symbol,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::invalid_visibility, code = "E0449")]
#[derive(Diagnostic)]
#[diag(ast_passes_invalid_visibility, code = "E0449")]
pub struct InvalidVisibility {
#[primary_span]
pub span: Span,
#[label(ast_passes::implied)]
#[label(implied)]
pub implied: Option<Span>,
#[subdiagnostic]
pub note: Option<InvalidVisibilityNote>,
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub enum InvalidVisibilityNote {
#[note(ast_passes::individual_impl_items)]
#[note(individual_impl_items)]
IndividualImplItems,
#[note(ast_passes::individual_foreign_items)]
#[note(individual_foreign_items)]
IndividualForeignItems,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::trait_fn_const, code = "E0379")]
#[derive(Diagnostic)]
#[diag(ast_passes_trait_fn_const, code = "E0379")]
pub struct TraitFnConst {
#[primary_span]
#[label]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::forbidden_lifetime_bound)]
#[derive(Diagnostic)]
#[diag(ast_passes_forbidden_lifetime_bound)]
pub struct ForbiddenLifetimeBound {
#[primary_span]
pub spans: Vec<Span>,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::forbidden_non_lifetime_param)]
#[derive(Diagnostic)]
#[diag(ast_passes_forbidden_non_lifetime_param)]
pub struct ForbiddenNonLifetimeParam {
#[primary_span]
pub spans: Vec<Span>,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::fn_param_too_many)]
#[derive(Diagnostic)]
#[diag(ast_passes_fn_param_too_many)]
pub struct FnParamTooMany {
#[primary_span]
pub span: Span,
pub max_num_args: usize,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::fn_param_c_var_args_only)]
#[derive(Diagnostic)]
#[diag(ast_passes_fn_param_c_var_args_only)]
pub struct FnParamCVarArgsOnly {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::fn_param_c_var_args_not_last)]
#[derive(Diagnostic)]
#[diag(ast_passes_fn_param_c_var_args_not_last)]
pub struct FnParamCVarArgsNotLast {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::fn_param_doc_comment)]
#[derive(Diagnostic)]
#[diag(ast_passes_fn_param_doc_comment)]
pub struct FnParamDocComment {
#[primary_span]
#[label]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::fn_param_forbidden_attr)]
#[derive(Diagnostic)]
#[diag(ast_passes_fn_param_forbidden_attr)]
pub struct FnParamForbiddenAttr {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::fn_param_forbidden_self)]
#[derive(Diagnostic)]
#[diag(ast_passes_fn_param_forbidden_self)]
#[note]
pub struct FnParamForbiddenSelf {
#[primary_span]
@ -147,8 +133,8 @@ pub struct FnParamForbiddenSelf {
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::forbidden_default)]
#[derive(Diagnostic)]
#[diag(ast_passes_forbidden_default)]
pub struct ForbiddenDefault {
#[primary_span]
pub span: Span,
@ -156,8 +142,8 @@ pub struct ForbiddenDefault {
pub def_span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::assoc_const_without_body)]
#[derive(Diagnostic)]
#[diag(ast_passes_assoc_const_without_body)]
pub struct AssocConstWithoutBody {
#[primary_span]
pub span: Span,
@ -165,8 +151,8 @@ pub struct AssocConstWithoutBody {
pub replace_span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::assoc_fn_without_body)]
#[derive(Diagnostic)]
#[diag(ast_passes_assoc_fn_without_body)]
pub struct AssocFnWithoutBody {
#[primary_span]
pub span: Span,
@ -174,8 +160,8 @@ pub struct AssocFnWithoutBody {
pub replace_span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::assoc_type_without_body)]
#[derive(Diagnostic)]
#[diag(ast_passes_assoc_type_without_body)]
pub struct AssocTypeWithoutBody {
#[primary_span]
pub span: Span,
@ -183,8 +169,8 @@ pub struct AssocTypeWithoutBody {
pub replace_span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::const_without_body)]
#[derive(Diagnostic)]
#[diag(ast_passes_const_without_body)]
pub struct ConstWithoutBody {
#[primary_span]
pub span: Span,
@ -192,8 +178,8 @@ pub struct ConstWithoutBody {
pub replace_span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::static_without_body)]
#[derive(Diagnostic)]
#[diag(ast_passes_static_without_body)]
pub struct StaticWithoutBody {
#[primary_span]
pub span: Span,
@ -201,8 +187,8 @@ pub struct StaticWithoutBody {
pub replace_span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::ty_alias_without_body)]
#[derive(Diagnostic)]
#[diag(ast_passes_ty_alias_without_body)]
pub struct TyAliasWithoutBody {
#[primary_span]
pub span: Span,
@ -210,8 +196,8 @@ pub struct TyAliasWithoutBody {
pub replace_span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(ast_passes::fn_without_body)]
#[derive(Diagnostic)]
#[diag(ast_passes_fn_without_body)]
pub struct FnWithoutBody {
#[primary_span]
pub span: Span,
@ -227,8 +213,11 @@ pub struct ExternBlockSuggestion {
pub abi: Option<Symbol>,
}
impl AddSubdiagnostic for ExternBlockSuggestion {
fn add_to_diagnostic(self, diag: &mut Diagnostic) {
impl AddToDiagnostic for ExternBlockSuggestion {
fn add_to_diagnostic_with<F>(self, diag: &mut Diagnostic, _: F)
where
F: Fn(&mut Diagnostic, SubdiagnosticMessage) -> SubdiagnosticMessage,
{
let start_suggestion = if let Some(abi) = self.abi {
format!("extern \"{}\" {{", abi)
} else {
@ -237,7 +226,7 @@ impl AddSubdiagnostic for ExternBlockSuggestion {
let end_suggestion = " }".to_owned();
diag.multipart_suggestion(
fluent::ast_passes::extern_block_suggestion,
fluent::extern_block_suggestion,
vec![(self.start_span, start_suggestion), (self.end_span, end_suggestion)],
Applicability::MaybeIncorrect,
);


@ -1,15 +1,15 @@
use rustc_ast as ast;
use rustc_ast::visit::{self, AssocCtxt, FnCtxt, FnKind, Visitor};
use rustc_ast::{AssocConstraint, AssocConstraintKind, NodeId};
use rustc_ast::{PatKind, RangeEnd, VariantData};
use rustc_ast::{PatKind, RangeEnd};
use rustc_errors::{struct_span_err, Applicability, StashKey};
use rustc_feature::Features;
use rustc_feature::{AttributeGate, BuiltinAttribute, BUILTIN_ATTRIBUTE_MAP};
use rustc_session::parse::{feature_err, feature_warn};
use rustc_feature::{AttributeGate, BuiltinAttribute, Features, GateIssue, BUILTIN_ATTRIBUTE_MAP};
use rustc_session::parse::{feature_err, feature_err_issue, feature_warn};
use rustc_session::Session;
use rustc_span::source_map::Spanned;
use rustc_span::symbol::sym;
use rustc_span::Span;
use rustc_target::spec::abi;
macro_rules! gate_feature_fn {
($visitor: expr, $has_feature: expr, $span: expr, $name: expr, $explain: expr, $help: expr) => {{
@ -84,210 +84,26 @@ impl<'a> PostExpansionVisitor<'a> {
}
}
match symbol_unescaped.as_str() {
// Stable
"Rust" | "C" | "cdecl" | "stdcall" | "fastcall" | "aapcs" | "win64" | "sysv64"
| "system" => {}
"rust-intrinsic" => {
gate_feature_post!(&self, intrinsics, span, "intrinsics are subject to change");
}
"platform-intrinsic" => {
gate_feature_post!(
&self,
platform_intrinsics,
match abi::is_enabled(&self.features, span, symbol_unescaped.as_str()) {
Ok(()) => (),
Err(abi::AbiDisabled::Unstable { feature, explain }) => {
feature_err_issue(
&self.sess.parse_sess,
feature,
span,
"platform intrinsics are experimental and possibly buggy"
);
GateIssue::Language,
explain,
)
.emit();
}
"vectorcall" => {
gate_feature_post!(
&self,
abi_vectorcall,
span,
"vectorcall is experimental and subject to change"
);
}
"thiscall" => {
gate_feature_post!(
&self,
abi_thiscall,
span,
"thiscall is experimental and subject to change"
);
}
"rust-call" => {
gate_feature_post!(
&self,
unboxed_closures,
span,
"rust-call ABI is subject to change"
);
}
"rust-cold" => {
gate_feature_post!(
&self,
rust_cold_cc,
span,
"rust-cold is experimental and subject to change"
);
}
"ptx-kernel" => {
gate_feature_post!(
&self,
abi_ptx,
span,
"PTX ABIs are experimental and subject to change"
);
}
"unadjusted" => {
gate_feature_post!(
&self,
abi_unadjusted,
span,
"unadjusted ABI is an implementation detail and perma-unstable"
);
}
"msp430-interrupt" => {
gate_feature_post!(
&self,
abi_msp430_interrupt,
span,
"msp430-interrupt ABI is experimental and subject to change"
);
}
"x86-interrupt" => {
gate_feature_post!(
&self,
abi_x86_interrupt,
span,
"x86-interrupt ABI is experimental and subject to change"
);
}
"amdgpu-kernel" => {
gate_feature_post!(
&self,
abi_amdgpu_kernel,
span,
"amdgpu-kernel ABI is experimental and subject to change"
);
}
"avr-interrupt" | "avr-non-blocking-interrupt" => {
gate_feature_post!(
&self,
abi_avr_interrupt,
span,
"avr-interrupt and avr-non-blocking-interrupt ABIs are experimental and subject to change"
);
}
"efiapi" => {
gate_feature_post!(
&self,
abi_efiapi,
span,
"efiapi ABI is experimental and subject to change"
);
}
"C-cmse-nonsecure-call" => {
gate_feature_post!(
&self,
abi_c_cmse_nonsecure_call,
span,
"C-cmse-nonsecure-call ABI is experimental and subject to change"
);
}
"C-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"C-unwind ABI is experimental and subject to change"
);
}
"stdcall-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"stdcall-unwind ABI is experimental and subject to change"
);
}
"system-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"system-unwind ABI is experimental and subject to change"
);
}
"thiscall-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"thiscall-unwind ABI is experimental and subject to change"
);
}
"cdecl-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"cdecl-unwind ABI is experimental and subject to change"
);
}
"fastcall-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"fastcall-unwind ABI is experimental and subject to change"
);
}
"vectorcall-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"vectorcall-unwind ABI is experimental and subject to change"
);
}
"aapcs-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"aapcs-unwind ABI is experimental and subject to change"
);
}
"win64-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"win64-unwind ABI is experimental and subject to change"
);
}
"sysv64-unwind" => {
gate_feature_post!(
&self,
c_unwind,
span,
"sysv64-unwind ABI is experimental and subject to change"
);
}
"wasm" => {
gate_feature_post!(
&self,
wasm_abi,
span,
"wasm ABI is experimental and subject to change"
);
}
abi => {
Err(abi::AbiDisabled::Unrecognized) => {
if self.sess.opts.pretty.map_or(true, |ppm| ppm.needs_hir()) {
self.sess.parse_sess.span_diagnostic.delay_span_bug(
span,
&format!("unrecognized ABI not caught in lowering: {}", abi),
&format!(
"unrecognized ABI not caught in lowering: {}",
symbol_unescaped.as_str()
),
);
}
}
@ -300,46 +116,6 @@ impl<'a> PostExpansionVisitor<'a> {
}
}
fn maybe_report_invalid_custom_discriminants(&self, variants: &[ast::Variant]) {
let has_fields = variants.iter().any(|variant| match variant.data {
VariantData::Tuple(..) | VariantData::Struct(..) => true,
VariantData::Unit(..) => false,
});
let discriminant_spans = variants
.iter()
.filter(|variant| match variant.data {
VariantData::Tuple(..) | VariantData::Struct(..) => false,
VariantData::Unit(..) => true,
})
.filter_map(|variant| variant.disr_expr.as_ref().map(|c| c.value.span))
.collect::<Vec<_>>();
if !discriminant_spans.is_empty() && has_fields {
let mut err = feature_err(
&self.sess.parse_sess,
sym::arbitrary_enum_discriminant,
discriminant_spans.clone(),
"custom discriminant values are not allowed in enums with tuple or struct variants",
);
for sp in discriminant_spans {
err.span_label(sp, "disallowed custom discriminant");
}
for variant in variants.iter() {
match &variant.data {
VariantData::Struct(..) => {
err.span_label(variant.span, "struct variant defined here");
}
VariantData::Tuple(..) => {
err.span_label(variant.span, "tuple variant defined here");
}
VariantData::Unit(..) => {}
}
}
err.emit();
}
}
/// Feature gate `impl Trait` inside `type Alias = $type_expr;`.
fn check_impl_trait(&self, ty: &ast::Ty) {
struct ImplTraitVisitor<'a> {
@ -457,26 +233,6 @@ impl<'a> Visitor<'a> for PostExpansionVisitor<'a> {
}
}
ast::ItemKind::Enum(ast::EnumDef { ref variants, .. }, ..) => {
for variant in variants {
match (&variant.data, &variant.disr_expr) {
(ast::VariantData::Unit(..), _) => {}
(_, Some(disr_expr)) => gate_feature_post!(
&self,
arbitrary_enum_discriminant,
disr_expr.value.span,
"discriminants on non-unit variants are experimental"
),
_ => {}
}
}
let has_feature = self.features.arbitrary_enum_discriminant;
if !has_feature && !i.span.allows_unstable(sym::arbitrary_enum_discriminant) {
self.maybe_report_invalid_custom_discriminants(&variants);
}
}
ast::ItemKind::Impl(box ast::Impl { polarity, defaultness, ref of_trait, .. }) => {
if let ast::ImplPolarity::Negative(span) = polarity {
gate_feature_post!(
@ -645,7 +401,7 @@ impl<'a> Visitor<'a> for PostExpansionVisitor<'a> {
if let PatKind::Range(Some(_), None, Spanned { .. }) = inner_pat.kind {
gate_feature_post!(
&self,
half_open_range_patterns,
half_open_range_patterns_in_slices,
pat.span,
"`X..` patterns in slices are experimental"
);
@ -701,7 +457,7 @@ impl<'a> Visitor<'a> for PostExpansionVisitor<'a> {
fn visit_assoc_item(&mut self, i: &'a ast::AssocItem, ctxt: AssocCtxt) {
let is_fn = match i.kind {
ast::AssocItemKind::Fn(_) => true,
ast::AssocItemKind::TyAlias(box ast::TyAlias { ref ty, .. }) => {
ast::AssocItemKind::Type(box ast::TyAlias { ref ty, .. }) => {
if let (Some(_), AssocCtxt::Trait) = (ty, ctxt) {
gate_feature_post!(
&self,
@ -773,7 +529,10 @@ pub fn check_crate(krate: &ast::Crate, sess: &Session) {
gate_all!(generators, "yield syntax is experimental");
gate_all!(raw_ref_op, "raw address of syntax is experimental");
gate_all!(const_trait_impl, "const trait impls are experimental");
gate_all!(half_open_range_patterns, "half-open range patterns are unstable");
gate_all!(
half_open_range_patterns_in_slices,
"half-open range patterns in slices are unstable"
);
gate_all!(inline_const, "inline-const is experimental");
gate_all!(inline_const_pat, "inline-const in pattern position is experimental");
gate_all!(associated_const_equality, "associated const equality is incomplete");


@ -9,7 +9,6 @@
#![feature(if_let_guard)]
#![feature(iter_is_partitioned)]
#![feature(let_chains)]
#![cfg_attr(bootstrap, feature(let_else))]
#![recursion_limit = "256"]
#[macro_use]


@ -4,7 +4,6 @@ version = "0.0.0"
edition = "2021"
[lib]
doctest = false
[dependencies]
rustc_span = { path = "../rustc_span" }


@ -193,9 +193,13 @@ impl<'a> State<'a> {
self.print_call_post(args)
}
fn print_expr_method_call(&mut self, segment: &ast::PathSegment, args: &[P<ast::Expr>]) {
let base_args = &args[1..];
self.print_expr_maybe_paren(&args[0], parser::PREC_POSTFIX);
fn print_expr_method_call(
&mut self,
segment: &ast::PathSegment,
receiver: &ast::Expr,
base_args: &[P<ast::Expr>],
) {
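// The method-call receiver is now passed separately from the remaining
// arguments (see the `ExprKind::MethodCall` match below), so print it on
// its own before the `.` and the method name.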
self.print_expr_maybe_paren(receiver, parser::PREC_POSTFIX);
self.word(".");
self.print_ident(segment.ident);
if let Some(ref args) = segment.args {
@ -303,8 +307,8 @@ impl<'a> State<'a> {
ast::ExprKind::Call(ref func, ref args) => {
self.print_expr_call(func, &args);
}
ast::ExprKind::MethodCall(ref segment, ref args, _) => {
self.print_expr_method_call(segment, &args);
ast::ExprKind::MethodCall(ref segment, ref receiver, ref args, _) => {
self.print_expr_method_call(segment, &receiver, &args);
}
ast::ExprKind::Binary(op, ref lhs, ref rhs) => {
self.print_expr_binary(op, lhs, rhs);


@ -516,7 +516,7 @@ impl<'a> State<'a> {
ast::AssocItemKind::Const(def, ty, body) => {
self.print_item_const(ident, None, ty, body.as_deref(), vis, *def);
}
ast::AssocItemKind::TyAlias(box ast::TyAlias {
ast::AssocItemKind::Type(box ast::TyAlias {
defaultness,
generics,
where_clauses,


@ -4,7 +4,6 @@ version = "0.0.0"
edition = "2021"
[lib]
doctest = false
[dependencies]
rustc_ast_pretty = { path = "../rustc_ast_pretty" }


@ -5,7 +5,6 @@
//! to this crate.
#![feature(let_chains)]
#![cfg_attr(bootstrap, feature(let_else))]
#![deny(rustc::untranslatable_diagnostic)]
#![deny(rustc::diagnostic_outside_of_impl)]


@ -2,23 +2,22 @@ use std::num::IntErrorKind;
use rustc_ast as ast;
use rustc_errors::{
error_code, fluent, Applicability, DiagnosticBuilder, ErrorGuaranteed, Handler,
error_code, fluent, Applicability, DiagnosticBuilder, ErrorGuaranteed, Handler, IntoDiagnostic,
};
use rustc_macros::SessionDiagnostic;
use rustc_session::SessionDiagnostic;
use rustc_macros::Diagnostic;
use rustc_span::{Span, Symbol};
use crate::UnsupportedLiteralReason;
#[derive(SessionDiagnostic)]
#[diag(attr::expected_one_cfg_pattern, code = "E0536")]
#[derive(Diagnostic)]
#[diag(attr_expected_one_cfg_pattern, code = "E0536")]
pub(crate) struct ExpectedOneCfgPattern {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::invalid_predicate, code = "E0537")]
#[derive(Diagnostic)]
#[diag(attr_invalid_predicate, code = "E0537")]
pub(crate) struct InvalidPredicate {
#[primary_span]
pub span: Span,
@ -26,8 +25,8 @@ pub(crate) struct InvalidPredicate {
pub predicate: String,
}
#[derive(SessionDiagnostic)]
#[diag(attr::multiple_item, code = "E0538")]
#[derive(Diagnostic)]
#[diag(attr_multiple_item, code = "E0538")]
pub(crate) struct MultipleItem {
#[primary_span]
pub span: Span,
@ -35,8 +34,8 @@ pub(crate) struct MultipleItem {
pub item: String,
}
#[derive(SessionDiagnostic)]
#[diag(attr::incorrect_meta_item, code = "E0539")]
#[derive(Diagnostic)]
#[diag(attr_incorrect_meta_item, code = "E0539")]
pub(crate) struct IncorrectMetaItem {
#[primary_span]
pub span: Span,
@ -50,44 +49,44 @@ pub(crate) struct UnknownMetaItem<'a> {
}
// Manual implementation to be able to format `expected` items correctly.
impl<'a> SessionDiagnostic<'a> for UnknownMetaItem<'_> {
impl<'a> IntoDiagnostic<'a> for UnknownMetaItem<'_> {
fn into_diagnostic(self, handler: &'a Handler) -> DiagnosticBuilder<'a, ErrorGuaranteed> {
let expected = self.expected.iter().map(|name| format!("`{}`", name)).collect::<Vec<_>>();
let mut diag = handler.struct_span_err_with_code(
self.span,
fluent::attr::unknown_meta_item,
fluent::attr_unknown_meta_item,
error_code!(E0541),
);
diag.set_arg("item", self.item);
diag.set_arg("expected", expected.join(", "));
diag.span_label(self.span, fluent::attr::label);
diag.span_label(self.span, fluent::label);
diag
}
}
#[derive(SessionDiagnostic)]
#[diag(attr::missing_since, code = "E0542")]
#[derive(Diagnostic)]
#[diag(attr_missing_since, code = "E0542")]
pub(crate) struct MissingSince {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::missing_note, code = "E0543")]
#[derive(Diagnostic)]
#[diag(attr_missing_note, code = "E0543")]
pub(crate) struct MissingNote {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::multiple_stability_levels, code = "E0544")]
#[derive(Diagnostic)]
#[diag(attr_multiple_stability_levels, code = "E0544")]
pub(crate) struct MultipleStabilityLevels {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::invalid_issue_string, code = "E0545")]
#[derive(Diagnostic)]
#[diag(attr_invalid_issue_string, code = "E0545")]
pub(crate) struct InvalidIssueString {
#[primary_span]
pub span: Span,
@ -98,33 +97,33 @@ pub(crate) struct InvalidIssueString {
// The error kinds of `IntErrorKind` are duplicated here in order to allow the messages to be
// translatable.
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub(crate) enum InvalidIssueStringCause {
#[label(attr::must_not_be_zero)]
#[label(must_not_be_zero)]
MustNotBeZero {
#[primary_span]
span: Span,
},
#[label(attr::empty)]
#[label(empty)]
Empty {
#[primary_span]
span: Span,
},
#[label(attr::invalid_digit)]
#[label(invalid_digit)]
InvalidDigit {
#[primary_span]
span: Span,
},
#[label(attr::pos_overflow)]
#[label(pos_overflow)]
PosOverflow {
#[primary_span]
span: Span,
},
#[label(attr::neg_overflow)]
#[label(neg_overflow)]
NegOverflow {
#[primary_span]
span: Span,
@ -144,22 +143,22 @@ impl InvalidIssueStringCause {
}
}
#[derive(SessionDiagnostic)]
#[diag(attr::missing_feature, code = "E0546")]
#[derive(Diagnostic)]
#[diag(attr_missing_feature, code = "E0546")]
pub(crate) struct MissingFeature {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::non_ident_feature, code = "E0546")]
#[derive(Diagnostic)]
#[diag(attr_non_ident_feature, code = "E0546")]
pub(crate) struct NonIdentFeature {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::missing_issue, code = "E0547")]
#[derive(Diagnostic)]
#[diag(attr_missing_issue, code = "E0547")]
pub(crate) struct MissingIssue {
#[primary_span]
pub span: Span,
@ -167,8 +166,8 @@ pub(crate) struct MissingIssue {
// FIXME: This diagnostic is identical to `IncorrectMetaItem`, barring the error code. Consider
// changing this to `IncorrectMetaItem`. See #51489.
#[derive(SessionDiagnostic)]
#[diag(attr::incorrect_meta_item, code = "E0551")]
#[derive(Diagnostic)]
#[diag(attr_incorrect_meta_item, code = "E0551")]
pub(crate) struct IncorrectMetaItem2 {
#[primary_span]
pub span: Span,
@ -176,15 +175,15 @@ pub(crate) struct IncorrectMetaItem2 {
// FIXME: Why is this the same error code as `InvalidReprHintNoParen` and `InvalidReprHintNoValue`?
// It is more similar to `IncorrectReprFormatGeneric`.
#[derive(SessionDiagnostic)]
#[diag(attr::incorrect_repr_format_packed_one_or_zero_arg, code = "E0552")]
#[derive(Diagnostic)]
#[diag(attr_incorrect_repr_format_packed_one_or_zero_arg, code = "E0552")]
pub(crate) struct IncorrectReprFormatPackedOneOrZeroArg {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::invalid_repr_hint_no_paren, code = "E0552")]
#[derive(Diagnostic)]
#[diag(attr_invalid_repr_hint_no_paren, code = "E0552")]
pub(crate) struct InvalidReprHintNoParen {
#[primary_span]
pub span: Span,
@ -192,8 +191,8 @@ pub(crate) struct InvalidReprHintNoParen {
pub name: String,
}
#[derive(SessionDiagnostic)]
#[diag(attr::invalid_repr_hint_no_value, code = "E0552")]
#[derive(Diagnostic)]
#[diag(attr_invalid_repr_hint_no_value, code = "E0552")]
pub(crate) struct InvalidReprHintNoValue {
#[primary_span]
pub span: Span,
@ -209,18 +208,18 @@ pub(crate) struct UnsupportedLiteral {
pub start_point_span: Span,
}
impl<'a> SessionDiagnostic<'a> for UnsupportedLiteral {
impl<'a> IntoDiagnostic<'a> for UnsupportedLiteral {
fn into_diagnostic(self, handler: &'a Handler) -> DiagnosticBuilder<'a, ErrorGuaranteed> {
let mut diag = handler.struct_span_err_with_code(
self.span,
match self.reason {
UnsupportedLiteralReason::Generic => fluent::attr::unsupported_literal_generic,
UnsupportedLiteralReason::CfgString => fluent::attr::unsupported_literal_cfg_string,
UnsupportedLiteralReason::Generic => fluent::attr_unsupported_literal_generic,
UnsupportedLiteralReason::CfgString => fluent::attr_unsupported_literal_cfg_string,
UnsupportedLiteralReason::DeprecatedString => {
fluent::attr::unsupported_literal_deprecated_string
fluent::attr_unsupported_literal_deprecated_string
}
UnsupportedLiteralReason::DeprecatedKvPair => {
fluent::attr::unsupported_literal_deprecated_kv_pair
fluent::attr_unsupported_literal_deprecated_kv_pair
}
},
error_code!(E0565),
@ -228,7 +227,7 @@ impl<'a> SessionDiagnostic<'a> for UnsupportedLiteral {
if self.is_bytestr {
diag.span_suggestion(
self.start_point_span,
fluent::attr::unsupported_literal_suggestion,
fluent::attr_unsupported_literal_suggestion,
"",
Applicability::MaybeIncorrect,
);
@ -237,16 +236,16 @@ impl<'a> SessionDiagnostic<'a> for UnsupportedLiteral {
}
}
#[derive(SessionDiagnostic)]
#[diag(attr::invalid_repr_align_need_arg, code = "E0589")]
#[derive(Diagnostic)]
#[diag(attr_invalid_repr_align_need_arg, code = "E0589")]
pub(crate) struct InvalidReprAlignNeedArg {
#[primary_span]
#[suggestion(code = "align(...)", applicability = "has-placeholders")]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::invalid_repr_generic, code = "E0589")]
#[derive(Diagnostic)]
#[diag(attr_invalid_repr_generic, code = "E0589")]
pub(crate) struct InvalidReprGeneric<'a> {
#[primary_span]
pub span: Span,
@ -255,15 +254,15 @@ pub(crate) struct InvalidReprGeneric<'a> {
pub error_part: &'a str,
}
#[derive(SessionDiagnostic)]
#[diag(attr::incorrect_repr_format_align_one_arg, code = "E0693")]
#[derive(Diagnostic)]
#[diag(attr_incorrect_repr_format_align_one_arg, code = "E0693")]
pub(crate) struct IncorrectReprFormatAlignOneArg {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::incorrect_repr_format_generic, code = "E0693")]
#[derive(Diagnostic)]
#[diag(attr_incorrect_repr_format_generic, code = "E0693")]
pub(crate) struct IncorrectReprFormatGeneric<'a> {
#[primary_span]
pub span: Span,
@ -274,9 +273,9 @@ pub(crate) struct IncorrectReprFormatGeneric<'a> {
pub cause: Option<IncorrectReprFormatGenericCause<'a>>,
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub(crate) enum IncorrectReprFormatGenericCause<'a> {
#[suggestion(attr::suggestion, code = "{name}({int})", applicability = "machine-applicable")]
#[suggestion(suggestion, code = "{name}({int})", applicability = "machine-applicable")]
Int {
#[primary_span]
span: Span,
@ -288,11 +287,7 @@ pub(crate) enum IncorrectReprFormatGenericCause<'a> {
int: u128,
},
#[suggestion(
attr::suggestion,
code = "{name}({symbol})",
applicability = "machine-applicable"
)]
#[suggestion(suggestion, code = "{name}({symbol})", applicability = "machine-applicable")]
Symbol {
#[primary_span]
span: Span,
@ -317,29 +312,29 @@ impl<'a> IncorrectReprFormatGenericCause<'a> {
}
}
#[derive(SessionDiagnostic)]
#[diag(attr::rustc_promotable_pairing, code = "E0717")]
#[derive(Diagnostic)]
#[diag(attr_rustc_promotable_pairing, code = "E0717")]
pub(crate) struct RustcPromotablePairing {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::rustc_allowed_unstable_pairing, code = "E0789")]
#[derive(Diagnostic)]
#[diag(attr_rustc_allowed_unstable_pairing, code = "E0789")]
pub(crate) struct RustcAllowedUnstablePairing {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::cfg_predicate_identifier)]
#[derive(Diagnostic)]
#[diag(attr_cfg_predicate_identifier)]
pub(crate) struct CfgPredicateIdentifier {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::deprecated_item_suggestion)]
#[derive(Diagnostic)]
#[diag(attr_deprecated_item_suggestion)]
pub(crate) struct DeprecatedItemSuggestion {
#[primary_span]
pub span: Span,
@ -351,22 +346,22 @@ pub(crate) struct DeprecatedItemSuggestion {
pub details: (),
}
#[derive(SessionDiagnostic)]
#[diag(attr::expected_single_version_literal)]
#[derive(Diagnostic)]
#[diag(attr_expected_single_version_literal)]
pub(crate) struct ExpectedSingleVersionLiteral {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::expected_version_literal)]
#[derive(Diagnostic)]
#[diag(attr_expected_version_literal)]
pub(crate) struct ExpectedVersionLiteral {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::expects_feature_list)]
#[derive(Diagnostic)]
#[diag(attr_expects_feature_list)]
pub(crate) struct ExpectsFeatureList {
#[primary_span]
pub span: Span,
@ -374,8 +369,8 @@ pub(crate) struct ExpectsFeatureList {
pub name: String,
}
#[derive(SessionDiagnostic)]
#[diag(attr::expects_features)]
#[derive(Diagnostic)]
#[diag(attr_expects_features)]
pub(crate) struct ExpectsFeatures {
#[primary_span]
pub span: Span,
@ -383,15 +378,15 @@ pub(crate) struct ExpectsFeatures {
pub name: String,
}
#[derive(SessionDiagnostic)]
#[diag(attr::soft_no_args)]
#[derive(Diagnostic)]
#[diag(attr_soft_no_args)]
pub(crate) struct SoftNoArgs {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(attr::unknown_version_literal)]
#[derive(Diagnostic)]
#[diag(attr_unknown_version_literal)]
pub(crate) struct UnknownVersionLiteral {
#[primary_span]
pub span: Span,


@ -4,7 +4,6 @@ version = "0.0.0"
edition = "2021"
[lib]
doctest = false
[dependencies]
either = "1.5.0"


@ -14,8 +14,8 @@ use crate::{
places_conflict, region_infer::values::LivenessValues,
};
pub(super) fn generate_constraints<'cx, 'tcx>(
infcx: &InferCtxt<'cx, 'tcx>,
pub(super) fn generate_constraints<'tcx>(
infcx: &InferCtxt<'tcx>,
liveness_constraints: &mut LivenessValues<RegionVid>,
all_facts: &mut Option<AllFacts>,
location_table: &LocationTable,
@ -37,8 +37,8 @@ pub(super) fn generate_constraints<'cx, 'tcx>(
}
/// 'cg = the duration of the constraint generation process itself.
struct ConstraintGeneration<'cg, 'cx, 'tcx> {
infcx: &'cg InferCtxt<'cx, 'tcx>,
struct ConstraintGeneration<'cg, 'tcx> {
infcx: &'cg InferCtxt<'tcx>,
all_facts: &'cg mut Option<AllFacts>,
location_table: &'cg LocationTable,
liveness_constraints: &'cg mut LivenessValues<RegionVid>,
@ -46,7 +46,7 @@ struct ConstraintGeneration<'cg, 'cx, 'tcx> {
body: &'cg Body<'tcx>,
}
impl<'cg, 'cx, 'tcx> Visitor<'tcx> for ConstraintGeneration<'cg, 'cx, 'tcx> {
impl<'cg, 'tcx> Visitor<'tcx> for ConstraintGeneration<'cg, 'tcx> {
fn visit_basic_block_data(&mut self, bb: BasicBlock, data: &BasicBlockData<'tcx>) {
self.super_basic_block_data(bb, data);
}
@ -156,7 +156,7 @@ impl<'cg, 'cx, 'tcx> Visitor<'tcx> for ConstraintGeneration<'cg, 'cx, 'tcx> {
}
}
impl<'cx, 'cg, 'tcx> ConstraintGeneration<'cx, 'cg, 'tcx> {
impl<'cx, 'tcx> ConstraintGeneration<'cx, 'tcx> {
/// Some variable with type `live_ty` is "regular live" at
/// `location` -- i.e., it may be used later. This means that all
/// regions appearing in the type `live_ty` must be live at


@ -31,9 +31,8 @@ pub fn get_body_with_borrowck_facts<'tcx>(
def: ty::WithOptConstParam<LocalDefId>,
) -> BodyWithBorrowckFacts<'tcx> {
let (input_body, promoted) = tcx.mir_promoted(def);
tcx.infer_ctxt().with_opaque_type_inference(DefiningAnchor::Bind(def.did)).enter(|infcx| {
let input_body: &Body<'_> = &input_body.borrow();
let promoted: &IndexVec<_, _> = &promoted.borrow();
*super::do_mir_borrowck(&infcx, input_body, promoted, true).1.unwrap()
})
let infcx = tcx.infer_ctxt().with_opaque_type_inference(DefiningAnchor::Bind(def.did)).build();
let input_body: &Body<'_> = &input_body.borrow();
let promoted: &IndexVec<_, _> = &promoted.borrow();
*super::do_mir_borrowck(&infcx, input_body, promoted, true).1.unwrap()
}


@ -56,7 +56,7 @@ impl<'tcx> UniverseInfo<'tcx> {
) {
match self.0 {
UniverseInfoInner::RelateTys { expected, found } => {
let err = mbcx.infcx.report_mismatched_types(
let err = mbcx.infcx.err_ctxt().report_mismatched_types(
&cause,
expected,
found,
@ -238,20 +238,11 @@ impl<'tcx> TypeOpInfo<'tcx> for PredicateQuery<'tcx> {
placeholder_region: ty::Region<'tcx>,
error_region: Option<ty::Region<'tcx>>,
) -> Option<DiagnosticBuilder<'tcx, ErrorGuaranteed>> {
mbcx.infcx.tcx.infer_ctxt().enter_with_canonical(
cause.span,
&self.canonical_query,
|ref infcx, key, _| {
let mut fulfill_cx = <dyn TraitEngine<'_>>::new(infcx.tcx);
type_op_prove_predicate_with_cause(infcx, &mut *fulfill_cx, key, cause);
try_extract_error_from_fulfill_cx(
fulfill_cx,
infcx,
placeholder_region,
error_region,
)
},
)
let (ref infcx, key, _) =
mbcx.infcx.tcx.infer_ctxt().build_with_canonical(cause.span, &self.canonical_query);
let mut fulfill_cx = <dyn TraitEngine<'_>>::new(infcx.tcx);
type_op_prove_predicate_with_cause(infcx, &mut *fulfill_cx, key, cause);
try_extract_error_from_fulfill_cx(fulfill_cx, infcx, placeholder_region, error_region)
}
}
@ -288,37 +279,24 @@ where
placeholder_region: ty::Region<'tcx>,
error_region: Option<ty::Region<'tcx>>,
) -> Option<DiagnosticBuilder<'tcx, ErrorGuaranteed>> {
mbcx.infcx.tcx.infer_ctxt().enter_with_canonical(
cause.span,
&self.canonical_query,
|ref infcx, key, _| {
let mut fulfill_cx = <dyn TraitEngine<'_>>::new(infcx.tcx);
let (ref infcx, key, _) =
mbcx.infcx.tcx.infer_ctxt().build_with_canonical(cause.span, &self.canonical_query);
let mut fulfill_cx = <dyn TraitEngine<'_>>::new(infcx.tcx);
let mut selcx = SelectionContext::new(infcx);
let mut selcx = SelectionContext::new(infcx);
// FIXME(lqd): Unify and de-duplicate the following with the actual
// `rustc_traits::type_op::type_op_normalize` query to allow the span we need in the
// `ObligationCause`. The normalization results are currently different between
// `AtExt::normalize` used in the query and `normalize` called below: the former fails
// to normalize the `nll/relate_tys/impl-fn-ignore-binder-via-bottom.rs` test. Check
// after #85499 lands to see if its fixes have erased this difference.
let (param_env, value) = key.into_parts();
let Normalized { value: _, obligations } = rustc_trait_selection::traits::normalize(
&mut selcx,
param_env,
cause,
value.value,
);
fulfill_cx.register_predicate_obligations(infcx, obligations);
// FIXME(lqd): Unify and de-duplicate the following with the actual
// `rustc_traits::type_op::type_op_normalize` query to allow the span we need in the
// `ObligationCause`. The normalization results are currently different between
// `AtExt::normalize` used in the query and `normalize` called below: the former fails
// to normalize the `nll/relate_tys/impl-fn-ignore-binder-via-bottom.rs` test. Check
// after #85499 lands to see if its fixes have erased this difference.
let (param_env, value) = key.into_parts();
let Normalized { value: _, obligations } =
rustc_trait_selection::traits::normalize(&mut selcx, param_env, cause, value.value);
fulfill_cx.register_predicate_obligations(infcx, obligations);
try_extract_error_from_fulfill_cx(
fulfill_cx,
infcx,
placeholder_region,
error_region,
)
},
)
try_extract_error_from_fulfill_cx(fulfill_cx, infcx, placeholder_region, error_region)
}
}
@ -349,21 +327,11 @@ impl<'tcx> TypeOpInfo<'tcx> for AscribeUserTypeQuery<'tcx> {
placeholder_region: ty::Region<'tcx>,
error_region: Option<ty::Region<'tcx>>,
) -> Option<DiagnosticBuilder<'tcx, ErrorGuaranteed>> {
mbcx.infcx.tcx.infer_ctxt().enter_with_canonical(
cause.span,
&self.canonical_query,
|ref infcx, key, _| {
let mut fulfill_cx = <dyn TraitEngine<'_>>::new(infcx.tcx);
type_op_ascribe_user_type_with_span(infcx, &mut *fulfill_cx, key, Some(cause.span))
.ok()?;
try_extract_error_from_fulfill_cx(
fulfill_cx,
infcx,
placeholder_region,
error_region,
)
},
)
let (ref infcx, key, _) =
mbcx.infcx.tcx.infer_ctxt().build_with_canonical(cause.span, &self.canonical_query);
let mut fulfill_cx = <dyn TraitEngine<'_>>::new(infcx.tcx);
type_op_ascribe_user_type_with_span(infcx, &mut *fulfill_cx, key, Some(cause.span)).ok()?;
try_extract_error_from_fulfill_cx(fulfill_cx, infcx, placeholder_region, error_region)
}
}
@ -407,7 +375,7 @@ impl<'tcx> TypeOpInfo<'tcx> for crate::type_check::InstantiateOpaqueType<'tcx> {
#[instrument(skip(fulfill_cx, infcx), level = "debug")]
fn try_extract_error_from_fulfill_cx<'tcx>(
mut fulfill_cx: Box<dyn TraitEngine<'tcx> + 'tcx>,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
placeholder_region: ty::Region<'tcx>,
error_region: Option<ty::Region<'tcx>>,
) -> Option<DiagnosticBuilder<'tcx, ErrorGuaranteed>> {
@ -427,7 +395,7 @@ fn try_extract_error_from_fulfill_cx<'tcx>(
}
fn try_extract_error_from_region_constraints<'tcx>(
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
placeholder_region: ty::Region<'tcx>,
error_region: Option<ty::Region<'tcx>>,
region_constraints: &RegionConstraintData<'tcx>,
@ -449,42 +417,38 @@ fn try_extract_error_from_region_constraints<'tcx>(
})?;
debug!(?sub_region, "cause = {:#?}", cause);
let nice_error = match (error_region, *sub_region) {
(Some(error_region), ty::ReVar(vid)) => NiceRegionError::new(
infcx,
RegionResolutionError::SubSupConflict(
vid,
region_var_origin(vid),
cause.clone(),
error_region,
cause.clone(),
placeholder_region,
vec![],
),
),
(Some(error_region), _) => NiceRegionError::new(
infcx,
RegionResolutionError::ConcreteFailure(cause.clone(), error_region, placeholder_region),
let error = match (error_region, *sub_region) {
(Some(error_region), ty::ReVar(vid)) => RegionResolutionError::SubSupConflict(
vid,
region_var_origin(vid),
cause.clone(),
error_region,
cause.clone(),
placeholder_region,
vec![],
),
(Some(error_region), _) => {
RegionResolutionError::ConcreteFailure(cause.clone(), error_region, placeholder_region)
}
// Note universe here is wrong...
(None, ty::ReVar(vid)) => NiceRegionError::new(
infcx,
RegionResolutionError::UpperBoundUniverseConflict(
vid,
region_var_origin(vid),
universe_of_region(vid),
cause.clone(),
placeholder_region,
),
),
(None, _) => NiceRegionError::new(
infcx,
RegionResolutionError::ConcreteFailure(cause.clone(), sub_region, placeholder_region),
(None, ty::ReVar(vid)) => RegionResolutionError::UpperBoundUniverseConflict(
vid,
region_var_origin(vid),
universe_of_region(vid),
cause.clone(),
placeholder_region,
),
(None, _) => {
RegionResolutionError::ConcreteFailure(cause.clone(), sub_region, placeholder_region)
}
};
nice_error.try_report_from_nll().or_else(|| {
NiceRegionError::new(&infcx.err_ctxt(), error).try_report_from_nll().or_else(|| {
if let SubregionOrigin::Subtype(trace) = cause {
Some(infcx.report_and_explain_type_error(*trace, TypeError::RegionsPlaceholderMismatch))
Some(
infcx
.err_ctxt()
.report_and_explain_type_error(*trace, TypeError::RegionsPlaceholderMismatch),
)
} else {
None
}

View File

@ -16,7 +16,7 @@ use rustc_middle::mir::{
FakeReadCause, LocalDecl, LocalInfo, LocalKind, Location, Operand, Place, PlaceRef,
ProjectionElem, Rvalue, Statement, StatementKind, Terminator, TerminatorKind, VarBindingForm,
};
use rustc_middle::ty::{self, subst::Subst, suggest_constraining_type_params, PredicateKind, Ty};
use rustc_middle::ty::{self, suggest_constraining_type_params, PredicateKind, Ty};
use rustc_mir_dataflow::move_paths::{InitKind, MoveOutIndex, MovePathIndex};
use rustc_span::def_id::LocalDefId;
use rustc_span::hygiene::DesugaringKind;
@ -198,7 +198,6 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
move_span,
move_spans,
*moved_place,
Some(used_place),
partially_str,
loop_message,
move_msg,
@ -369,6 +368,7 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
let mut visitor = ConditionVisitor { spans: &spans, name: &name, errors: vec![] };
visitor.visit_body(&body);
let mut show_assign_sugg = false;
let isnt_initialized = if let InitializationRequiringAction::PartialAssignment
| InitializationRequiringAction::Assignment = desired_action
{
@ -396,6 +396,7 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
.count()
== 0
{
show_assign_sugg = true;
"isn't initialized"
} else {
"is possibly-uninitialized"
@ -446,10 +447,84 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
}
}
}
err.span_label(decl_span, "binding declared here but left uninitialized");
if show_assign_sugg {
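// Walk the enclosing body looking for the `let` statement that declared
// this binding without an initializer; record the span of its type
// annotation (or of the declaration itself) so the suggestion can be
// attached there.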
struct LetVisitor {
decl_span: Span,
sugg_span: Option<Span>,
}
impl<'v> Visitor<'v> for LetVisitor {
fn visit_stmt(&mut self, ex: &'v hir::Stmt<'v>) {
if self.sugg_span.is_some() {
return;
}
if let hir::StmtKind::Local(hir::Local {
span, ty, init: None, ..
}) = &ex.kind && span.contains(self.decl_span) {
self.sugg_span = ty.map_or(Some(self.decl_span), |ty| Some(ty.span));
}
hir::intravisit::walk_stmt(self, ex);
}
}
let mut visitor = LetVisitor { decl_span, sugg_span: None };
visitor.visit_body(&body);
if let Some(span) = visitor.sugg_span {
self.suggest_assign_value(&mut err, moved_place, span);
}
}
err
}
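/// Suggest assigning a placeholder value to a binding that is used while
/// uninitialized: `0` for integers, `false` for `bool`, `vec![]` for `Vec`,
/// `Default::default()` when the type implements `Default`, and `todo!()`
/// otherwise.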
fn suggest_assign_value(
&self,
err: &mut Diagnostic,
moved_place: PlaceRef<'tcx>,
sugg_span: Span,
) {
let ty = moved_place.ty(self.body, self.infcx.tcx).ty;
debug!("ty: {:?}, kind: {:?}", ty, ty.kind());
let tcx = self.infcx.tcx;
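// Check whether `ty` implements `Default`, so `Default::default()` can be
// offered as the placeholder value.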
let implements_default = |ty, param_env| {
let Some(default_trait) = tcx.get_diagnostic_item(sym::Default) else {
return false;
};
// Regions are already solved, so we must use a fresh InferCtxt,
// but the type has region variables, so erase those.
tcx.infer_ctxt()
.build()
.type_implements_trait(
default_trait,
tcx.erase_regions(ty),
ty::List::empty(),
param_env,
)
.must_apply_modulo_regions()
};
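// Choose a conservative placeholder expression for the binding's type,
// falling back to `todo!()` when nothing simpler applies.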
let assign_value = match ty.kind() {
ty::Bool => "false",
ty::Float(_) => "0.0",
ty::Int(_) | ty::Uint(_) => "0",
ty::Never | ty::Error(_) => "",
ty::Adt(def, _) if Some(def.did()) == tcx.get_diagnostic_item(sym::Vec) => "vec![]",
ty::Adt(_, _) if implements_default(ty, self.param_env) => "Default::default()",
_ => "todo!()",
};
if !assign_value.is_empty() {
err.span_suggestion_verbose(
sugg_span.shrink_to_hi(),
format!("consider assigning a value"),
format!(" = {}", assign_value),
Applicability::MaybeIncorrect,
);
}
}
fn suggest_borrow_fn_like(
&self,
err: &mut Diagnostic,
@ -537,41 +612,40 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
.and_then(|def_id| tcx.hir().get_generics(def_id))
else { return; };
// Try to find predicates on *generic params* that would allow copying `ty`
let predicates: Result<Vec<_>, _> = tcx.infer_ctxt().enter(|infcx| {
let mut fulfill_cx = <dyn rustc_infer::traits::TraitEngine<'_>>::new(infcx.tcx);
let infcx = tcx.infer_ctxt().build();
let mut fulfill_cx = <dyn rustc_infer::traits::TraitEngine<'_>>::new(infcx.tcx);
let copy_did = infcx.tcx.lang_items().copy_trait().unwrap();
let cause = ObligationCause::new(
span,
self.mir_hir_id(),
rustc_infer::traits::ObligationCauseCode::MiscObligation,
);
fulfill_cx.register_bound(
&infcx,
self.param_env,
// Erase any region vids from the type, which may not be resolved
infcx.tcx.erase_regions(ty),
copy_did,
cause,
);
// Select all, including ambiguous predicates
let errors = fulfill_cx.select_all_or_error(&infcx);
let copy_did = infcx.tcx.lang_items().copy_trait().unwrap();
let cause = ObligationCause::new(
span,
self.mir_hir_id(),
rustc_infer::traits::ObligationCauseCode::MiscObligation,
);
fulfill_cx.register_bound(
&infcx,
self.param_env,
// Erase any region vids from the type, which may not be resolved
infcx.tcx.erase_regions(ty),
copy_did,
cause,
);
// Select all, including ambiguous predicates
let errors = fulfill_cx.select_all_or_error(&infcx);
// Only emit suggestion if all required predicates are on generic
errors
.into_iter()
.map(|err| match err.obligation.predicate.kind().skip_binder() {
PredicateKind::Trait(predicate) => match predicate.self_ty().kind() {
ty::Param(param_ty) => Ok((
generics.type_param(param_ty, tcx),
predicate.trait_ref.print_only_trait_path().to_string(),
)),
_ => Err(()),
},
// Only emit suggestion if all required predicates are on generic
let predicates: Result<Vec<_>, _> = errors
.into_iter()
.map(|err| match err.obligation.predicate.kind().skip_binder() {
PredicateKind::Trait(predicate) => match predicate.self_ty().kind() {
ty::Param(param_ty) => Ok((
generics.type_param(param_ty, tcx),
predicate.trait_ref.print_only_trait_path().to_string(),
)),
_ => Err(()),
})
.collect()
});
},
_ => Err(()),
})
.collect();
if let Ok(predicates) = predicates {
suggest_constraining_type_params(
@ -2146,7 +2220,9 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
}
StorageDeadOrDrop::Destructor(_) => kind,
},
ProjectionElem::Field(..) | ProjectionElem::Downcast(..) => {
ProjectionElem::OpaqueCast { .. }
| ProjectionElem::Field(..)
| ProjectionElem::Downcast(..) => {
match place_ty.ty.kind() {
ty::Adt(def, _) if def.has_dtor(tcx) => {
// Report the outermost adt with a destructor


@ -1,8 +1,5 @@
//! Print diagnostics to explain why values are borrowed.
use std::collections::VecDeque;
use rustc_data_structures::fx::FxHashSet;
use rustc_errors::{Applicability, Diagnostic};
use rustc_index::vec::IndexVec;
use rustc_infer::infer::NllRegionVariableOrigin;
@ -359,19 +356,37 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
let borrow_region_vid = borrow.region;
debug!(?borrow_region_vid);
let region_sub = self.regioncx.find_sub_region_live_at(borrow_region_vid, location);
let mut region_sub = self.regioncx.find_sub_region_live_at(borrow_region_vid, location);
debug!(?region_sub);
match find_use::find(body, regioncx, tcx, region_sub, location) {
let mut use_location = location;
let mut use_in_later_iteration_of_loop = false;
if region_sub == borrow_region_vid {
// When `region_sub` is the same as `borrow_region_vid` (the location where the borrow is
// issued is the same location that invalidates the reference), this is likely a loop iteration
// - in this case, try using the loop terminator location in `find_sub_region_live_at`.
if let Some(loop_terminator_location) =
regioncx.find_loop_terminator_location(borrow.region, body)
{
region_sub = self
.regioncx
.find_sub_region_live_at(borrow_region_vid, loop_terminator_location);
debug!("explain_why_borrow_contains_point: region_sub in loop={:?}", region_sub);
use_location = loop_terminator_location;
use_in_later_iteration_of_loop = true;
}
}
match find_use::find(body, regioncx, tcx, region_sub, use_location) {
Some(Cause::LiveVar(local, location)) => {
let span = body.source_info(location).span;
let spans = self
.move_spans(Place::from(local).as_ref(), location)
.or_else(|| self.borrow_spans(span, location));
let borrow_location = location;
if self.is_use_in_later_iteration_of_loop(borrow_location, location) {
let later_use = self.later_use_kind(borrow, spans, location);
if use_in_later_iteration_of_loop {
let later_use = self.later_use_kind(borrow, spans, use_location);
BorrowExplanation::UsedLaterInLoop(later_use.0, later_use.1, later_use.2)
} else {
// Check if the location represents a `FakeRead`, and adapt the error
@ -425,131 +440,6 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
}
}
/// true if `borrow_location` can reach `use_location` by going through a loop and
/// `use_location` is also inside of that loop
fn is_use_in_later_iteration_of_loop(
&self,
borrow_location: Location,
use_location: Location,
) -> bool {
let back_edge = self.reach_through_backedge(borrow_location, use_location);
back_edge.map_or(false, |back_edge| self.can_reach_head_of_loop(use_location, back_edge))
}
/// Returns the outmost back edge if `from` location can reach `to` location passing through
/// that back edge
fn reach_through_backedge(&self, from: Location, to: Location) -> Option<Location> {
let mut visited_locations = FxHashSet::default();
let mut pending_locations = VecDeque::new();
visited_locations.insert(from);
pending_locations.push_back(from);
debug!("reach_through_backedge: from={:?} to={:?}", from, to,);
let mut outmost_back_edge = None;
while let Some(location) = pending_locations.pop_front() {
debug!(
"reach_through_backedge: location={:?} outmost_back_edge={:?}
pending_locations={:?} visited_locations={:?}",
location, outmost_back_edge, pending_locations, visited_locations
);
if location == to && outmost_back_edge.is_some() {
// We've managed to reach the use location
debug!("reach_through_backedge: found!");
return outmost_back_edge;
}
let block = &self.body.basic_blocks[location.block];
if location.statement_index < block.statements.len() {
let successor = location.successor_within_block();
if visited_locations.insert(successor) {
pending_locations.push_back(successor);
}
} else {
pending_locations.extend(
block
.terminator()
.successors()
.map(|bb| Location { statement_index: 0, block: bb })
.filter(|s| visited_locations.insert(*s))
.map(|s| {
if self.is_back_edge(location, s) {
match outmost_back_edge {
None => {
outmost_back_edge = Some(location);
}
Some(back_edge)
if location.dominates(back_edge, &self.dominators) =>
{
outmost_back_edge = Some(location);
}
Some(_) => {}
}
}
s
}),
);
}
}
None
}
/// true if `from` location can reach `loop_head` location and `loop_head` dominates all the
/// intermediate nodes
fn can_reach_head_of_loop(&self, from: Location, loop_head: Location) -> bool {
self.find_loop_head_dfs(from, loop_head, &mut FxHashSet::default())
}
fn find_loop_head_dfs(
&self,
from: Location,
loop_head: Location,
visited_locations: &mut FxHashSet<Location>,
) -> bool {
visited_locations.insert(from);
if from == loop_head {
return true;
}
if loop_head.dominates(from, &self.dominators) {
let block = &self.body.basic_blocks[from.block];
if from.statement_index < block.statements.len() {
let successor = from.successor_within_block();
if !visited_locations.contains(&successor)
&& self.find_loop_head_dfs(successor, loop_head, visited_locations)
{
return true;
}
} else {
for bb in block.terminator().successors() {
let successor = Location { statement_index: 0, block: bb };
if !visited_locations.contains(&successor)
&& self.find_loop_head_dfs(successor, loop_head, visited_locations)
{
return true;
}
}
}
}
false
}
/// True if an edge `source -> target` is a backedge -- in other words, if the target
/// dominates the source.
fn is_back_edge(&self, source: Location, target: Location) -> bool {
target.dominates(source, &self.dominators)
}
/// Determine how the borrow was later used.
/// First span returned points to the location of the conflicting use
/// Second span if `Some` is returned in the case of closures and points


@ -237,6 +237,7 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
}
ProjectionElem::Downcast(..) if opt.including_downcast => return None,
ProjectionElem::Downcast(..) => (),
ProjectionElem::OpaqueCast(..) => (),
ProjectionElem::Field(field, _ty) => {
// FIXME(project-rfc_2229#36): print capture precisely here.
if let Some(field) = self.is_upvar_field_projection(PlaceRef {
@ -317,6 +318,7 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
PlaceRef { local, projection: proj_base }.ty(self.body, self.infcx.tcx)
}
ProjectionElem::Downcast(..) => place.ty(self.body, self.infcx.tcx),
ProjectionElem::OpaqueCast(ty) => PlaceTy::from_ty(*ty),
ProjectionElem::Field(_, field_type) => PlaceTy::from_ty(*field_type),
},
};
@ -970,7 +972,6 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
move_span: Span,
move_spans: UseSpans<'tcx>,
moved_place: Place<'tcx>,
used_place: Option<PlaceRef<'tcx>>,
partially_str: &str,
loop_message: &str,
move_msg: &str,
@ -1024,7 +1025,8 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
if let Some((CallDesugaringKind::ForLoopIntoIter, _)) = desugaring {
let ty = moved_place.ty(self.body, self.infcx.tcx).ty;
let suggest = match self.infcx.tcx.get_diagnostic_item(sym::IntoIterator) {
Some(def_id) => self.infcx.tcx.infer_ctxt().enter(|infcx| {
Some(def_id) => {
let infcx = self.infcx.tcx.infer_ctxt().build();
type_known_to_meet_bound_modulo_regions(
&infcx,
self.param_env,
@ -1035,7 +1037,7 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
def_id,
DUMMY_SP,
)
}),
}
_ => false,
};
if suggest {
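An illustrative (hypothetical) example of the `ForLoopIntoIter` desugaring case handled here: the iterated value is moved by the implicit `into_iter()` call, and the usual suggestion is to iterate by reference instead.

```rust
fn main() {
    let v = vec![1, 2, 3];
    for x in v {                 // `v` is moved by the implicit `IntoIterator::into_iter` call
        println!("{x}");
    }
    // println!("{}", v.len());  // error[E0382]: borrow of moved value: `v`
    //                           // suggested fix: iterate by reference, `for x in &v { ... }`
}
```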
@ -1058,9 +1060,11 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
place_name, partially_str, loop_message
),
);
// If we have a `&mut` ref, we need to reborrow.
if let Some(ty::Ref(_, _, hir::Mutability::Mut)) = used_place
.map(|used_place| used_place.ty(self.body, self.infcx.tcx).ty.kind())
// If the moved place was a `&mut` ref, then we can
// suggest to reborrow it where it was moved, so it
// will still be valid by the time we get to the usage.
if let ty::Ref(_, _, hir::Mutability::Mut) =
moved_place.ty(self.body, self.infcx.tcx).ty.kind()
{
// If we are in a loop this will be suggested later.
if !is_loop_move {
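A minimal compile-fail sketch (hypothetical, not part of this change) of the moved `&mut` case the comment above describes, where reborrowing with `&mut *r` at the call site keeps the reference usable afterwards.

```rust
fn consume<T>(_: T) {}           // generic parameter, so no implicit reborrow at the call

fn main() {
    let mut v = vec![1];
    let r = &mut v;
    consume(r);                  // `r` (a `&mut`) is moved into the call;
                                 // `consume(&mut *r)` would reborrow instead
    r.push(2);                   // error[E0382]: borrow of moved value: `r`
}
```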


@ -401,7 +401,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
};
if let Some(use_spans) = use_spans {
self.explain_captures(
&mut err, span, span, use_spans, move_place, None, "", "", "", false, true,
&mut err, span, span, use_spans, move_place, "", "", "", false, true,
);
}
err


@ -169,6 +169,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
..,
ProjectionElem::Index(_)
| ProjectionElem::ConstantIndex { .. }
| ProjectionElem::OpaqueCast { .. }
| ProjectionElem::Subslice { .. }
| ProjectionElem::Downcast(..),
],
@ -931,7 +932,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
let opt_suggestions = self
.infcx
.tcx
.typeck(path_segment.hir_id.owner)
.typeck(path_segment.hir_id.owner.def_id)
.type_dependent_def_id(*hir_id)
.and_then(|def_id| self.infcx.tcx.impl_of_method(def_id))
.map(|def_id| self.infcx.tcx.associated_items(def_id))
@ -1031,7 +1032,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
if look_at_return && hir.get_return_block(closure_id).is_some() {
// ...otherwise we are probably in the tail expression of the function, point at the
// return type.
match hir.get_by_def_id(hir.get_parent_item(fn_call_id)) {
match hir.get_by_def_id(hir.get_parent_item(fn_call_id).def_id) {
hir::Node::Item(hir::Item { ident, kind: hir::ItemKind::Fn(sig, ..), .. })
| hir::Node::TraitItem(hir::TraitItem {
ident,


@ -186,7 +186,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
if let Some(lower_bound_region) = lower_bound_region {
let generic_ty = type_test.generic_kind.to_ty(self.infcx.tcx);
let origin = RelateParamBound(type_test_span, generic_ty, None);
self.buffer_error(self.infcx.construct_generic_bound_failure(
self.buffer_error(self.infcx.err_ctxt().construct_generic_bound_failure(
self.body.source.def_id().expect_local(),
type_test_span,
Some(origin),
@ -281,7 +281,8 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
let tcx = self.infcx.tcx;
match tcx.hir().get_if_local(def_id) {
Some(Node::ImplItem(impl_item)) => {
match tcx.hir().find_by_def_id(tcx.hir().get_parent_item(impl_item.hir_id())) {
match tcx.hir().find_by_def_id(tcx.hir().get_parent_item(impl_item.hir_id()).def_id)
{
Some(Node::Item(Item {
kind: ItemKind::Impl(hir::Impl { self_ty, .. }),
..
@ -291,7 +292,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
}
Some(Node::TraitItem(trait_item)) => {
let trait_did = tcx.hir().get_parent_item(trait_item.hir_id());
match tcx.hir().find_by_def_id(trait_did) {
match tcx.hir().find_by_def_id(trait_did.def_id) {
Some(Node::Item(Item { kind: ItemKind::Trait(..), .. })) => {
// The method being called is defined in the `trait`, but the `'static`
// obligation comes from the `impl`. Find that `impl` so that we can point
@ -340,7 +341,7 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
/// Report an error because the universal region `fr` was required to outlive
/// `outlived_fr` but it is not known to do so. For example:
///
/// ```compile_fail,E0312
/// ```compile_fail
/// fn foo<'a, 'b>(x: &'a u32) -> &'b u32 { x }
/// ```
///
@ -364,7 +365,8 @@ impl<'a, 'tcx> MirBorrowckCtxt<'a, 'tcx> {
// Check if we can use one of the "nice region errors".
if let (Some(f), Some(o)) = (self.to_error_region(fr), self.to_error_region(outlived_fr)) {
let nice = NiceRegionError::new_from_span(self.infcx, cause.span, o, f);
let infer_err = self.infcx.err_ctxt();
let nice = NiceRegionError::new_from_span(&infer_err, cause.span, o, f);
if let Some(diag) = nice.try_report_from_nll() {
self.buffer_error(diag);
return;


@ -251,7 +251,8 @@ impl<'tcx> MirBorrowckCtxt<'_, 'tcx> {
.or_else(|| self.give_name_if_anonymous_region_appears_in_upvars(fr))
.or_else(|| self.give_name_if_anonymous_region_appears_in_output(fr))
.or_else(|| self.give_name_if_anonymous_region_appears_in_yield_ty(fr))
.or_else(|| self.give_name_if_anonymous_region_appears_in_impl_signature(fr));
.or_else(|| self.give_name_if_anonymous_region_appears_in_impl_signature(fr))
.or_else(|| self.give_name_if_anonymous_region_appears_in_arg_position_impl_trait(fr));
if let Some(ref value) = value {
self.region_names.try_borrow_mut().unwrap().insert(fr, value.clone());
@ -707,7 +708,8 @@ impl<'tcx> MirBorrowckCtxt<'_, 'tcx> {
hir::AsyncGeneratorKind::Block => " of async block",
hir::AsyncGeneratorKind::Closure => " of async closure",
hir::AsyncGeneratorKind::Fn => {
let parent_item = hir.get_by_def_id(hir.get_parent_item(mir_hir_id));
let parent_item =
hir.get_by_def_id(hir.get_parent_item(mir_hir_id).def_id);
let output = &parent_item
.fn_decl()
.expect("generator lowered from async fn should be in fn")
@ -863,20 +865,13 @@ impl<'tcx> MirBorrowckCtxt<'_, 'tcx> {
};
let tcx = self.infcx.tcx;
let body_parent_did = tcx.opt_parent(self.mir_def_id().to_def_id())?;
if tcx.parent(region.def_id) != body_parent_did
|| tcx.def_kind(body_parent_did) != DefKind::Impl
{
let region_parent = tcx.parent(region.def_id);
if tcx.def_kind(region_parent) != DefKind::Impl {
return None;
}
let mut found = false;
tcx.fold_regions(tcx.type_of(body_parent_did), |r: ty::Region<'tcx>, _| {
if *r == ty::ReEarlyBound(region) {
found = true;
}
r
});
let found = tcx
.any_free_region_meets(&tcx.type_of(region_parent), |r| *r == ty::ReEarlyBound(region));
Some(RegionName {
name: self.synthesize_region_name(),
@ -889,4 +884,92 @@ impl<'tcx> MirBorrowckCtxt<'_, 'tcx> {
),
})
}
fn give_name_if_anonymous_region_appears_in_arg_position_impl_trait(
&self,
fr: RegionVid,
) -> Option<RegionName> {
let ty::ReEarlyBound(region) = *self.to_error_region(fr)? else {
return None;
};
if region.has_name() {
return None;
};
let predicates = self
.infcx
.tcx
.predicates_of(self.body.source.def_id())
.instantiate_identity(self.infcx.tcx)
.predicates;
if let Some(upvar_index) = self
.regioncx
.universal_regions()
.defining_ty
.upvar_tys()
.position(|ty| self.any_param_predicate_mentions(&predicates, ty, region))
{
let (upvar_name, upvar_span) = self.regioncx.get_upvar_name_and_span_for_region(
self.infcx.tcx,
&self.upvars,
upvar_index,
);
let region_name = self.synthesize_region_name();
Some(RegionName {
name: region_name,
source: RegionNameSource::AnonRegionFromUpvar(upvar_span, upvar_name),
})
} else if let Some(arg_index) = self
.regioncx
.universal_regions()
.unnormalized_input_tys
.iter()
.position(|ty| self.any_param_predicate_mentions(&predicates, *ty, region))
{
let (arg_name, arg_span) = self.regioncx.get_argument_name_and_span_for_region(
self.body,
&self.local_names,
arg_index,
);
let region_name = self.synthesize_region_name();
Some(RegionName {
name: region_name,
source: RegionNameSource::AnonRegionFromArgument(
RegionNameHighlight::CannotMatchHirTy(arg_span, arg_name?.to_string()),
),
})
} else {
None
}
}
fn any_param_predicate_mentions(
&self,
predicates: &[ty::Predicate<'tcx>],
ty: Ty<'tcx>,
region: ty::EarlyBoundRegion,
) -> bool {
let tcx = self.infcx.tcx;
ty.walk().any(|arg| {
if let ty::GenericArgKind::Type(ty) = arg.unpack()
&& let ty::Param(_) = ty.kind()
{
predicates.iter().any(|pred| {
match pred.kind().skip_binder() {
ty::PredicateKind::Trait(data) if data.self_ty() == ty => {}
ty::PredicateKind::Projection(data) if data.projection_ty.self_ty() == ty => {}
_ => return false,
}
tcx.any_free_region_meets(pred, |r| {
*r == ty::ReEarlyBound(region)
})
})
} else {
false
}
})
}
}


@ -3,7 +3,6 @@
#![allow(rustc::potential_query_instability)]
#![feature(box_patterns)]
#![feature(let_chains)]
#![cfg_attr(bootstrap, feature(let_else))]
#![feature(min_specialization)]
#![feature(never_type)]
#![feature(rustc_attrs)]
@ -132,14 +131,11 @@ fn mir_borrowck<'tcx>(
debug!("run query mir_borrowck: {}", tcx.def_path_str(def.did.to_def_id()));
let hir_owner = tcx.hir().local_def_id_to_hir_id(def.did).owner;
let opt_closure_req = tcx
.infer_ctxt()
.with_opaque_type_inference(DefiningAnchor::Bind(hir_owner))
.enter(|infcx| {
let input_body: &Body<'_> = &input_body.borrow();
let promoted: &IndexVec<_, _> = &promoted.borrow();
do_mir_borrowck(&infcx, input_body, promoted, false).0
});
let infcx =
tcx.infer_ctxt().with_opaque_type_inference(DefiningAnchor::Bind(hir_owner.def_id)).build();
let input_body: &Body<'_> = &input_body.borrow();
let promoted: &IndexVec<_, _> = &promoted.borrow();
let opt_closure_req = do_mir_borrowck(&infcx, input_body, promoted, false).0;
debug!("mir_borrowck done");
tcx.arena.alloc(opt_closure_req)
@ -151,8 +147,8 @@ fn mir_borrowck<'tcx>(
/// region ids on which the borrow checking was performed together with Polonius
/// facts.
#[instrument(skip(infcx, input_body, input_promoted), fields(id=?input_body.source.with_opt_param().as_local().unwrap()), level = "debug")]
fn do_mir_borrowck<'a, 'tcx>(
infcx: &InferCtxt<'a, 'tcx>,
fn do_mir_borrowck<'tcx>(
infcx: &InferCtxt<'tcx>,
input_body: &Body<'tcx>,
input_promoted: &IndexVec<Promoted, Body<'tcx>>,
return_body_with_facts: bool,
@ -475,7 +471,7 @@ pub struct BodyWithBorrowckFacts<'tcx> {
}
struct MirBorrowckCtxt<'cx, 'tcx> {
infcx: &'cx InferCtxt<'cx, 'tcx>,
infcx: &'cx InferCtxt<'tcx>,
param_env: ParamEnv<'tcx>,
body: &'cx Body<'tcx>,
move_data: &'cx MoveData<'tcx>,
@ -1781,6 +1777,7 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
for (place_base, elem) in place.iter_projections().rev() {
match elem {
ProjectionElem::Index(_/*operand*/) |
ProjectionElem::OpaqueCast(_) |
ProjectionElem::ConstantIndex { .. } |
// assigning to P[i] requires P to be valid.
ProjectionElem::Downcast(_/*adt_def*/, _/*variant_idx*/) =>
@ -2172,6 +2169,7 @@ impl<'cx, 'tcx> MirBorrowckCtxt<'cx, 'tcx> {
| ProjectionElem::Index(..)
| ProjectionElem::ConstantIndex { .. }
| ProjectionElem::Subslice { .. }
| ProjectionElem::OpaqueCast { .. }
| ProjectionElem::Downcast(..) => {
let upvar_field_projection = self.is_upvar_field_projection(place);
if let Some(field) = upvar_field_projection {


@ -55,8 +55,8 @@ pub(crate) struct NllOutput<'tcx> {
/// regions (e.g., region parameters) declared on the function. That set will need to be given to
/// `compute_regions`.
#[instrument(skip(infcx, param_env, body, promoted), level = "debug")]
pub(crate) fn replace_regions_in_mir<'cx, 'tcx>(
infcx: &InferCtxt<'cx, 'tcx>,
pub(crate) fn replace_regions_in_mir<'tcx>(
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
body: &mut Body<'tcx>,
promoted: &mut IndexVec<Promoted, Body<'tcx>>,
@ -155,7 +155,7 @@ fn populate_polonius_move_facts(
///
/// This may result in errors being reported.
pub(crate) fn compute_regions<'cx, 'tcx>(
infcx: &InferCtxt<'cx, 'tcx>,
infcx: &InferCtxt<'tcx>,
universal_regions: UniversalRegions<'tcx>,
body: &Body<'tcx>,
promoted: &IndexVec<Promoted, Body<'tcx>>,
@ -318,8 +318,8 @@ pub(crate) fn compute_regions<'cx, 'tcx>(
}
}
pub(super) fn dump_mir_results<'a, 'tcx>(
infcx: &InferCtxt<'a, 'tcx>,
pub(super) fn dump_mir_results<'tcx>(
infcx: &InferCtxt<'tcx>,
body: &Body<'tcx>,
regioncx: &RegionInferenceContext<'tcx>,
closure_region_requirements: &Option<ClosureRegionRequirements<'_>>,
@ -368,8 +368,8 @@ pub(super) fn dump_mir_results<'a, 'tcx>(
};
}
pub(super) fn dump_annotation<'a, 'tcx>(
infcx: &InferCtxt<'a, 'tcx>,
pub(super) fn dump_annotation<'tcx>(
infcx: &InferCtxt<'tcx>,
body: &Body<'tcx>,
regioncx: &RegionInferenceContext<'tcx>,
closure_region_requirements: &Option<ClosureRegionRequirements<'_>>,


@ -250,6 +250,7 @@ fn place_components_conflict<'tcx>(
| (ProjectionElem::Index { .. }, _, _)
| (ProjectionElem::ConstantIndex { .. }, _, _)
| (ProjectionElem::Subslice { .. }, _, _)
| (ProjectionElem::OpaqueCast { .. }, _, _)
| (ProjectionElem::Downcast { .. }, _, _) => {
// Recursive case. This can still be disjoint on a
// further iteration if this a shallow access and
@ -317,6 +318,17 @@ fn place_projection_conflict<'tcx>(
debug!("place_element_conflict: DISJOINT-OR-EQ-DEREF");
Overlap::EqualOrDisjoint
}
(ProjectionElem::OpaqueCast(v1), ProjectionElem::OpaqueCast(v2)) => {
if v1 == v2 {
// same type - recur.
debug!("place_element_conflict: DISJOINT-OR-EQ-OPAQUE");
Overlap::EqualOrDisjoint
} else {
// Different types. Disjoint!
debug!("place_element_conflict: DISJOINT-OPAQUE");
Overlap::Disjoint
}
}
(ProjectionElem::Field(f1, _), ProjectionElem::Field(f2, _)) => {
if f1 == f2 {
// same field (e.g., `a.y` vs. `a.y`) - recur.
@ -520,6 +532,7 @@ fn place_projection_conflict<'tcx>(
| ProjectionElem::Field(..)
| ProjectionElem::Index(..)
| ProjectionElem::ConstantIndex { .. }
| ProjectionElem::OpaqueCast { .. }
| ProjectionElem::Subslice { .. }
| ProjectionElem::Downcast(..),
_,


@ -81,6 +81,7 @@ impl<'cx, 'tcx> Iterator for Prefixes<'cx, 'tcx> {
}
ProjectionElem::Downcast(..)
| ProjectionElem::Subslice { .. }
| ProjectionElem::OpaqueCast { .. }
| ProjectionElem::ConstantIndex { .. }
| ProjectionElem::Index(_) => {
cursor = cursor_base;


@ -15,7 +15,7 @@ use rustc_infer::infer::region_constraints::{GenericKind, VarInfos, VerifyBound,
use rustc_infer::infer::{InferCtxt, NllRegionVariableOrigin, RegionVariableOrigin};
use rustc_middle::mir::{
Body, ClosureOutlivesRequirement, ClosureOutlivesSubject, ClosureRegionRequirements,
ConstraintCategory, Local, Location, ReturnConstraint,
ConstraintCategory, Local, Location, ReturnConstraint, TerminatorKind,
};
use rustc_middle::traits::ObligationCause;
use rustc_middle::traits::ObligationCauseCode;
@ -565,7 +565,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
#[instrument(skip(self, infcx, body, polonius_output), level = "debug")]
pub(super) fn solve(
&mut self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
body: &Body<'tcx>,
polonius_output: Option<Rc<PoloniusOutput>>,
@ -835,7 +835,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
/// 'a`. See `TypeTest` for more details.
fn check_type_tests(
&self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
body: &Body<'tcx>,
mut propagated_outlives_requirements: Option<&mut Vec<ClosureOutlivesRequirement<'tcx>>>,
@ -923,7 +923,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
#[instrument(level = "debug", skip(self, infcx, propagated_outlives_requirements))]
fn try_promote_type_test(
&self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
body: &Body<'tcx>,
type_test: &TypeTest<'tcx>,
@ -1036,7 +1036,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
#[instrument(level = "debug", skip(self, infcx))]
fn try_promote_type_test_subject(
&self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
ty: Ty<'tcx>,
) -> Option<ClosureOutlivesSubject<'tcx>> {
let tcx = infcx.tcx;
@ -1212,7 +1212,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
/// `point`.
fn eval_verify_bound(
&self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
body: &Body<'tcx>,
generic_ty: Ty<'tcx>,
@ -1262,7 +1262,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
fn eval_if_eq(
&self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
generic_ty: Ty<'tcx>,
lower_bound: RegionVid,
@ -1398,7 +1398,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
/// whether any of the constraints were too strong. In particular,
/// we want to check for a case where a universally quantified
/// region exceeded its bounds. Consider:
/// ```compile_fail,E0312
/// ```compile_fail
/// fn foo<'a, 'b>(x: &'a u32) -> &'b u32 { x }
/// ```
/// In this case, returning `x` requires `&'a u32 <: &'b u32`
@ -1451,7 +1451,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
/// <https://smallcultfollowing.com/babysteps/blog/2019/01/17/polonius-and-region-errors/>
///
/// In the canonical example
/// ```compile_fail,E0312
/// ```compile_fail
/// fn foo<'a, 'b>(x: &'a u32) -> &'b u32 { x }
/// ```
/// returning `x` requires `&'a u32 <: &'b u32` and hence we establish (transitively) a
@ -1718,7 +1718,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
fn check_member_constraints(
&self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
errors_buffer: &mut RegionErrors<'tcx>,
) {
let member_constraints = self.member_constraints.clone();
@ -2236,6 +2236,27 @@ impl<'tcx> RegionInferenceContext<'tcx> {
pub(crate) fn universe_info(&self, universe: ty::UniverseIndex) -> UniverseInfo<'tcx> {
self.universe_causes[&universe].clone()
}
/// Tries to find the terminator of the loop in which the region `r` resides.
/// Returns the location of the terminator if found.
pub(crate) fn find_loop_terminator_location(
&self,
r: RegionVid,
body: &Body<'_>,
) -> Option<Location> {
let scc = self.constraint_sccs.scc(r.to_region_vid());
let locations = self.scc_values.locations_outlived_by(scc);
for location in locations {
let bb = &body[location.block];
if let Some(terminator) = &bb.terminator {
// terminator of a loop should be TerminatorKind::FalseUnwind
if let TerminatorKind::FalseUnwind { .. } = terminator.kind {
return Some(location);
}
}
}
None
}
}
impl<'tcx> RegionDefinition<'tcx> {


@ -2,22 +2,18 @@ use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::vec_map::VecMap;
use rustc_hir::def_id::LocalDefId;
use rustc_hir::OpaqueTyOrigin;
use rustc_infer::infer::error_reporting::unexpected_hidden_region_diagnostic;
use rustc_infer::infer::TyCtxtInferExt as _;
use rustc_infer::infer::{DefiningAnchor, InferCtxt};
use rustc_infer::traits::{Obligation, ObligationCause, TraitEngine};
use rustc_middle::ty::fold::{TypeFolder, TypeSuperFoldable};
use rustc_middle::ty::subst::{GenericArg, GenericArgKind, InternalSubsts};
use rustc_middle::ty::subst::{GenericArgKind, InternalSubsts};
use rustc_middle::ty::visit::TypeVisitable;
use rustc_middle::ty::{
self, OpaqueHiddenType, OpaqueTypeKey, ToPredicate, Ty, TyCtxt, TypeFoldable,
};
use rustc_span::Span;
use rustc_trait_selection::traits::error_reporting::InferCtxtExt as _;
use rustc_trait_selection::traits::error_reporting::TypeErrCtxtExt as _;
use rustc_trait_selection::traits::TraitEngineExt as _;
use crate::session_diagnostics::ConstNotUsedTraitAlias;
use super::RegionInferenceContext;
impl<'tcx> RegionInferenceContext<'tcx> {
@ -63,7 +59,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
#[instrument(level = "debug", skip(self, infcx), ret)]
pub(crate) fn infer_opaque_types(
&self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
opaque_ty_decls: VecMap<OpaqueTypeKey<'tcx>, (OpaqueHiddenType<'tcx>, OpaqueTyOrigin)>,
) -> VecMap<LocalDefId, OpaqueHiddenType<'tcx>> {
let mut result: VecMap<LocalDefId, OpaqueHiddenType<'tcx>> = VecMap::new();
@ -194,7 +190,7 @@ pub trait InferCtxtExt<'tcx> {
) -> Ty<'tcx>;
}
impl<'a, 'tcx> InferCtxtExt<'tcx> for InferCtxt<'a, 'tcx> {
impl<'tcx> InferCtxtExt<'tcx> for InferCtxt<'tcx> {
/// Given the fully resolved, instantiated type for an opaque
/// type, i.e., the value of an inference variable like C1 or C2
/// (*), computes the "definition type" for an opaque type
@ -229,31 +225,9 @@ impl<'a, 'tcx> InferCtxtExt<'tcx> for InferCtxt<'a, 'tcx> {
return self.tcx.ty_error();
}
let OpaqueTypeKey { def_id, substs } = opaque_type_key;
// Use substs to build up a reverse map from regions to their
// identity mappings. This is necessary because `impl
// Trait` lifetimes are computed by replacing existing
// lifetimes with 'static and remapping only those used in the
// `impl Trait` return type, resulting in the parameters
// shifting.
let id_substs = InternalSubsts::identity_for_item(self.tcx, def_id.to_def_id());
debug!(?id_substs);
let map: FxHashMap<GenericArg<'tcx>, GenericArg<'tcx>> =
substs.iter().enumerate().map(|(index, subst)| (subst, id_substs[index])).collect();
debug!("map = {:#?}", map);
// Convert the type from the function into a type valid outside
// the function, by replacing invalid regions with 'static,
// after producing an error for each of them.
let definition_ty = instantiated_ty.ty.fold_with(&mut ReverseMapper::new(
self.tcx,
opaque_type_key,
map,
instantiated_ty.ty,
instantiated_ty.span,
));
debug!(?definition_ty);
let definition_ty = instantiated_ty
.remap_generic_params_to_declaration_params(opaque_type_key, self.tcx, false, origin)
.ty;
if !check_opaque_type_parameter_valid(
self.tcx,
@ -266,72 +240,70 @@ impl<'a, 'tcx> InferCtxtExt<'tcx> for InferCtxt<'a, 'tcx> {
// Only check this for TAIT. RPIT already supports `src/test/ui/impl-trait/nested-return-type2.rs`
// on stable and we'd break that.
if let OpaqueTyOrigin::TyAlias = origin {
// This logic duplicates most of `check_opaque_meets_bounds`.
// FIXME(oli-obk): Also do region checks here and then consider removing `check_opaque_meets_bounds` entirely.
let param_env = self.tcx.param_env(def_id);
let body_id = self.tcx.local_def_id_to_hir_id(def_id);
// HACK: This bubble is required for these tests to pass:
// type-alias-impl-trait/issue-67844-nested-opaque.rs
self.tcx.infer_ctxt().with_opaque_type_inference(DefiningAnchor::Bubble).enter(
move |infcx| {
// Require the hidden type to be well-formed with only the generics of the opaque type.
// Defining use functions may have more bounds than the opaque type, which is ok, as long as the
// hidden type is well formed even without those bounds.
let predicate =
ty::Binder::dummy(ty::PredicateKind::WellFormed(definition_ty.into()))
.to_predicate(infcx.tcx);
let mut fulfillment_cx = <dyn TraitEngine<'tcx>>::new(infcx.tcx);
let OpaqueTyOrigin::TyAlias = origin else {
return definition_ty;
};
let def_id = opaque_type_key.def_id;
// This logic duplicates most of `check_opaque_meets_bounds`.
// FIXME(oli-obk): Also do region checks here and then consider removing `check_opaque_meets_bounds` entirely.
let param_env = self.tcx.param_env(def_id);
let body_id = self.tcx.local_def_id_to_hir_id(def_id);
// HACK: This bubble is required for these tests to pass:
// type-alias-impl-trait/issue-67844-nested-opaque.rs
let infcx =
self.tcx.infer_ctxt().with_opaque_type_inference(DefiningAnchor::Bubble).build();
// Require the hidden type to be well-formed with only the generics of the opaque type.
// Defining use functions may have more bounds than the opaque type, which is ok, as long as the
// hidden type is well formed even without those bounds.
let predicate = ty::Binder::dummy(ty::PredicateKind::WellFormed(definition_ty.into()))
.to_predicate(infcx.tcx);
let mut fulfillment_cx = <dyn TraitEngine<'tcx>>::new(infcx.tcx);
// Require that the hidden type actually fulfills all the bounds of the opaque type, even without
// the bounds that the function supplies.
match infcx.register_hidden_type(
OpaqueTypeKey { def_id, substs: id_substs },
ObligationCause::misc(instantiated_ty.span, body_id),
param_env,
let id_substs = InternalSubsts::identity_for_item(self.tcx, def_id.to_def_id());
// Require that the hidden type actually fulfills all the bounds of the opaque type, even without
// the bounds that the function supplies.
let opaque_ty = self.tcx.mk_opaque(def_id.to_def_id(), id_substs);
match infcx
.at(&ObligationCause::misc(instantiated_ty.span, body_id), param_env)
.eq(opaque_ty, definition_ty)
{
Ok(infer_ok) => {
for obligation in infer_ok.obligations {
fulfillment_cx.register_predicate_obligation(&infcx, obligation);
}
}
Err(err) => {
infcx
.err_ctxt()
.report_mismatched_types(
&ObligationCause::misc(instantiated_ty.span, body_id),
opaque_ty,
definition_ty,
origin,
) {
Ok(infer_ok) => {
for obligation in infer_ok.obligations {
fulfillment_cx.register_predicate_obligation(&infcx, obligation);
}
}
Err(err) => {
infcx
.report_mismatched_types(
&ObligationCause::misc(instantiated_ty.span, body_id),
self.tcx.mk_opaque(def_id.to_def_id(), id_substs),
definition_ty,
err,
)
.emit();
}
}
err,
)
.emit();
}
}
fulfillment_cx.register_predicate_obligation(
&infcx,
Obligation::misc(instantiated_ty.span, body_id, param_env, predicate),
);
fulfillment_cx.register_predicate_obligation(
&infcx,
Obligation::misc(instantiated_ty.span, body_id, param_env, predicate),
);
// Check that all obligations are satisfied by the implementation's
// version.
let errors = fulfillment_cx.select_all_or_error(&infcx);
// Check that all obligations are satisfied by the implementation's
// version.
let errors = fulfillment_cx.select_all_or_error(&infcx);
// This is still required for many (half of the tests in ui/type-alias-impl-trait)
// tests to pass
let _ = infcx.inner.borrow_mut().opaque_type_storage.take_opaque_types();
// This is still required for many (half of the tests in ui/type-alias-impl-trait)
// tests to pass
let _ = infcx.inner.borrow_mut().opaque_type_storage.take_opaque_types();
if errors.is_empty() {
definition_ty
} else {
infcx.report_fulfillment_errors(&errors, None, false);
self.tcx.ty_error()
}
},
)
} else {
if errors.is_empty() {
definition_ty
} else {
infcx.err_ctxt().report_fulfillment_errors(&errors, None, false);
self.tcx.ty_error()
}
}
}
@ -427,221 +399,3 @@ fn check_opaque_type_parameter_valid(
}
true
}
struct ReverseMapper<'tcx> {
tcx: TyCtxt<'tcx>,
key: ty::OpaqueTypeKey<'tcx>,
map: FxHashMap<GenericArg<'tcx>, GenericArg<'tcx>>,
do_not_error: bool,
/// initially `Some`, set to `None` once error has been reported
hidden_ty: Option<Ty<'tcx>>,
/// Span of function being checked.
span: Span,
}
impl<'tcx> ReverseMapper<'tcx> {
fn new(
tcx: TyCtxt<'tcx>,
key: ty::OpaqueTypeKey<'tcx>,
map: FxHashMap<GenericArg<'tcx>, GenericArg<'tcx>>,
hidden_ty: Ty<'tcx>,
span: Span,
) -> Self {
Self { tcx, key, map, do_not_error: false, hidden_ty: Some(hidden_ty), span }
}
fn fold_kind_no_missing_regions_error(&mut self, kind: GenericArg<'tcx>) -> GenericArg<'tcx> {
assert!(!self.do_not_error);
self.do_not_error = true;
let kind = kind.fold_with(self);
self.do_not_error = false;
kind
}
fn fold_kind_normally(&mut self, kind: GenericArg<'tcx>) -> GenericArg<'tcx> {
assert!(!self.do_not_error);
kind.fold_with(self)
}
}
impl<'tcx> TypeFolder<'tcx> for ReverseMapper<'tcx> {
fn tcx(&self) -> TyCtxt<'tcx> {
self.tcx
}
#[instrument(skip(self), level = "debug")]
fn fold_region(&mut self, r: ty::Region<'tcx>) -> ty::Region<'tcx> {
match *r {
// Ignore bound regions and `'static` regions that appear in the
// type, we only need to remap regions that reference lifetimes
// from the function declaration.
// This would ignore `'r` in a type like `for<'r> fn(&'r u32)`.
ty::ReLateBound(..) | ty::ReStatic => return r,
// If regions have been erased (by writeback), don't try to unerase
// them.
ty::ReErased => return r,
// The regions that we expect from borrow checking.
ty::ReEarlyBound(_) | ty::ReFree(_) => {}
ty::RePlaceholder(_) | ty::ReVar(_) => {
// All of the regions in the type should either have been
// erased by writeback, or mapped back to named regions by
// borrow checking.
bug!("unexpected region kind in opaque type: {:?}", r);
}
}
let generics = self.tcx().generics_of(self.key.def_id);
match self.map.get(&r.into()).map(|k| k.unpack()) {
Some(GenericArgKind::Lifetime(r1)) => r1,
Some(u) => panic!("region mapped to unexpected kind: {:?}", u),
None if self.do_not_error => self.tcx.lifetimes.re_static,
None if generics.parent.is_some() => {
if let Some(hidden_ty) = self.hidden_ty.take() {
unexpected_hidden_region_diagnostic(
self.tcx,
self.tcx.def_span(self.key.def_id),
hidden_ty,
r,
self.key,
)
.emit();
}
self.tcx.lifetimes.re_static
}
None => {
self.tcx
.sess
.struct_span_err(self.span, "non-defining opaque type use in defining scope")
.span_label(
self.span,
format!(
"lifetime `{}` is part of concrete type but not used in \
parameter list of the `impl Trait` type alias",
r
),
)
.emit();
self.tcx().lifetimes.re_static
}
}
}
fn fold_ty(&mut self, ty: Ty<'tcx>) -> Ty<'tcx> {
match *ty.kind() {
ty::Closure(def_id, substs) => {
// I am a horrible monster and I pray for death. When
// we encounter a closure here, it is always a closure
// from within the function that we are currently
// type-checking -- one that is now being encapsulated
// in an opaque type. Ideally, we would
// go through the types/lifetimes that it references
// and treat them just like we would any other type,
// which means we would error out if we find any
// reference to a type/region that is not in the
// "reverse map".
//
// **However,** in the case of closures, there is a
// somewhat subtle (read: hacky) consideration. The
// problem is that our closure types currently include
// all the lifetime parameters declared on the
// enclosing function, even if they are unused by the
// closure itself. We can't readily filter them out,
// so here we replace those values with `'empty`. This
// can't really make a difference to the rest of the
// compiler; those regions are ignored for the
// outlives relation, and hence don't affect trait
// selection or auto traits, and they are erased
// during codegen.
let generics = self.tcx.generics_of(def_id);
let substs = self.tcx.mk_substs(substs.iter().enumerate().map(|(index, kind)| {
if index < generics.parent_count {
// Accommodate missing regions in the parent kinds...
self.fold_kind_no_missing_regions_error(kind)
} else {
// ...but not elsewhere.
self.fold_kind_normally(kind)
}
}));
self.tcx.mk_closure(def_id, substs)
}
ty::Generator(def_id, substs, movability) => {
let generics = self.tcx.generics_of(def_id);
let substs = self.tcx.mk_substs(substs.iter().enumerate().map(|(index, kind)| {
if index < generics.parent_count {
// Accommodate missing regions in the parent kinds...
self.fold_kind_no_missing_regions_error(kind)
} else {
// ...but not elsewhere.
self.fold_kind_normally(kind)
}
}));
self.tcx.mk_generator(def_id, substs, movability)
}
ty::Param(param) => {
// Look it up in the substitution list.
match self.map.get(&ty.into()).map(|k| k.unpack()) {
// Found it in the substitution list; replace with the parameter from the
// opaque type.
Some(GenericArgKind::Type(t1)) => t1,
Some(u) => panic!("type mapped to unexpected kind: {:?}", u),
None => {
debug!(?param, ?self.map);
self.tcx
.sess
.struct_span_err(
self.span,
&format!(
"type parameter `{}` is part of concrete type but not \
used in parameter list for the `impl Trait` type alias",
ty
),
)
.emit();
self.tcx().ty_error()
}
}
}
_ => ty.super_fold_with(self),
}
}
fn fold_const(&mut self, ct: ty::Const<'tcx>) -> ty::Const<'tcx> {
trace!("checking const {:?}", ct);
// Find a const parameter
match ct.kind() {
ty::ConstKind::Param(..) => {
// Look it up in the substitution list.
match self.map.get(&ct.into()).map(|k| k.unpack()) {
// Found it in the substitution list, replace with the parameter from the
// opaque type.
Some(GenericArgKind::Const(c1)) => c1,
Some(u) => panic!("const mapped to unexpected kind: {:?}", u),
None => {
self.tcx.sess.emit_err(ConstNotUsedTraitAlias {
ct: ct.to_string(),
span: self.span,
});
self.tcx().const_error(ct.ty())
}
}
}
_ => ct,
}
}
}


@ -1,8 +1,8 @@
use rustc_index::vec::IndexVec;
use rustc_infer::infer::{InferCtxt, NllRegionVariableOrigin};
use rustc_middle::mir::visit::{MutVisitor, TyContext};
use rustc_middle::mir::Constant;
use rustc_middle::mir::{Body, Location, Promoted};
use rustc_middle::mir::{Constant, ConstantKind};
use rustc_middle::ty::subst::SubstsRef;
use rustc_middle::ty::{self, Ty, TyCtxt, TypeFoldable};
@ -10,7 +10,7 @@ use rustc_middle::ty::{self, Ty, TyCtxt, TypeFoldable};
/// inference variables, returning the number of variables created.
#[instrument(skip(infcx, body, promoted), level = "debug")]
pub fn renumber_mir<'tcx>(
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
body: &mut Body<'tcx>,
promoted: &mut IndexVec<Promoted, Body<'tcx>>,
) {
@ -28,7 +28,7 @@ pub fn renumber_mir<'tcx>(
/// Replaces all regions appearing in `value` with fresh inference
/// variables.
#[instrument(skip(infcx), level = "debug")]
pub fn renumber_regions<'tcx, T>(infcx: &InferCtxt<'_, 'tcx>, value: T) -> T
pub fn renumber_regions<'tcx, T>(infcx: &InferCtxt<'tcx>, value: T) -> T
where
T: TypeFoldable<'tcx>,
{
@ -38,23 +38,8 @@ where
})
}
// FIXME(valtrees): This function is necessary because `fold_regions`
// panics for mir constants in the visitor.
//
// Once `visit_mir_constant` is removed we can also remove this function
// and just use `renumber_regions`.
fn renumber_regions_in_mir_constant<'tcx>(
infcx: &InferCtxt<'_, 'tcx>,
value: ConstantKind<'tcx>,
) -> ConstantKind<'tcx> {
infcx.tcx.super_fold_regions(value, |_region, _depth| {
let origin = NllRegionVariableOrigin::Existential { from_forall: false };
infcx.next_nll_region_var(origin)
})
}
struct NllVisitor<'a, 'tcx> {
infcx: &'a InferCtxt<'a, 'tcx>,
infcx: &'a InferCtxt<'tcx>,
}
impl<'a, 'tcx> NllVisitor<'a, 'tcx> {
@ -64,13 +49,6 @@ impl<'a, 'tcx> NllVisitor<'a, 'tcx> {
{
renumber_regions(self.infcx, value)
}
fn renumber_regions_in_mir_constant(
&mut self,
value: ConstantKind<'tcx>,
) -> ConstantKind<'tcx> {
renumber_regions_in_mir_constant(self.infcx, value)
}
}
impl<'a, 'tcx> MutVisitor<'tcx> for NllVisitor<'a, 'tcx> {
@ -103,7 +81,7 @@ impl<'a, 'tcx> MutVisitor<'tcx> for NllVisitor<'a, 'tcx> {
#[instrument(skip(self), level = "debug")]
fn visit_constant(&mut self, constant: &mut Constant<'tcx>, _location: Location) {
let literal = constant.literal;
constant.literal = self.renumber_regions_in_mir_constant(literal);
constant.literal = self.renumber_regions(literal);
debug!("constant: {:#?}", constant);
}
}


@ -1,12 +1,12 @@
use rustc_errors::{IntoDiagnosticArg, MultiSpan};
use rustc_macros::{LintDiagnostic, SessionDiagnostic, SessionSubdiagnostic};
use rustc_macros::{Diagnostic, LintDiagnostic, Subdiagnostic};
use rustc_middle::ty::Ty;
use rustc_span::Span;
use crate::diagnostics::RegionName;
#[derive(SessionDiagnostic)]
#[diag(borrowck::move_unsized, code = "E0161")]
#[derive(Diagnostic)]
#[diag(borrowck_move_unsized, code = "E0161")]
pub(crate) struct MoveUnsized<'tcx> {
pub ty: Ty<'tcx>,
#[primary_span]
@ -14,8 +14,8 @@ pub(crate) struct MoveUnsized<'tcx> {
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(borrowck::higher_ranked_lifetime_error)]
#[derive(Diagnostic)]
#[diag(borrowck_higher_ranked_lifetime_error)]
pub(crate) struct HigherRankedLifetimeError {
#[subdiagnostic]
pub cause: Option<HigherRankedErrorCause>,
@ -23,23 +23,23 @@ pub(crate) struct HigherRankedLifetimeError {
pub span: Span,
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub(crate) enum HigherRankedErrorCause {
#[note(borrowck::could_not_prove)]
#[note(borrowck_could_not_prove)]
CouldNotProve { predicate: String },
#[note(borrowck::could_not_normalize)]
#[note(borrowck_could_not_normalize)]
CouldNotNormalize { value: String },
}
#[derive(SessionDiagnostic)]
#[diag(borrowck::higher_ranked_subtype_error)]
#[derive(Diagnostic)]
#[diag(borrowck_higher_ranked_subtype_error)]
pub(crate) struct HigherRankedSubtypeError {
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(borrowck::generic_does_not_live_long_enough)]
#[derive(Diagnostic)]
#[diag(borrowck_generic_does_not_live_long_enough)]
pub(crate) struct GenericDoesNotLiveLongEnough {
pub kind: String,
#[primary_span]
@ -47,24 +47,15 @@ pub(crate) struct GenericDoesNotLiveLongEnough {
}
#[derive(LintDiagnostic)]
#[diag(borrowck::var_does_not_need_mut)]
#[diag(borrowck_var_does_not_need_mut)]
pub(crate) struct VarNeedNotMut {
#[suggestion_short(applicability = "machine-applicable", code = "")]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(borrowck::const_not_used_in_type_alias)]
pub(crate) struct ConstNotUsedTraitAlias {
pub ct: String,
#[primary_span]
pub span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(borrowck::var_cannot_escape_closure)]
#[derive(Diagnostic)]
#[diag(borrowck_var_cannot_escape_closure)]
#[note]
#[note(borrowck::cannot_escape)]
#[note(cannot_escape)]
pub(crate) struct FnMutError {
#[primary_span]
pub span: Span,
@ -72,54 +63,54 @@ pub(crate) struct FnMutError {
pub ty_err: FnMutReturnTypeErr,
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub(crate) enum VarHereDenote {
#[label(borrowck::var_here_captured)]
#[label(borrowck_var_here_captured)]
Captured {
#[primary_span]
span: Span,
},
#[label(borrowck::var_here_defined)]
#[label(borrowck_var_here_defined)]
Defined {
#[primary_span]
span: Span,
},
#[label(borrowck::closure_inferred_mut)]
#[label(borrowck_closure_inferred_mut)]
FnMutInferred {
#[primary_span]
span: Span,
},
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub(crate) enum FnMutReturnTypeErr {
#[label(borrowck::returned_closure_escaped)]
#[label(borrowck_returned_closure_escaped)]
ReturnClosure {
#[primary_span]
span: Span,
},
#[label(borrowck::returned_async_block_escaped)]
#[label(borrowck_returned_async_block_escaped)]
ReturnAsyncBlock {
#[primary_span]
span: Span,
},
#[label(borrowck::returned_ref_escaped)]
#[label(borrowck_returned_ref_escaped)]
ReturnRef {
#[primary_span]
span: Span,
},
}
#[derive(SessionDiagnostic)]
#[diag(borrowck::lifetime_constraints_error)]
#[derive(Diagnostic)]
#[diag(borrowck_lifetime_constraints_error)]
pub(crate) struct LifetimeOutliveErr {
#[primary_span]
pub span: Span,
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub(crate) enum LifetimeReturnCategoryErr<'a> {
#[label(borrowck::returned_lifetime_wrong)]
#[label(borrowck_returned_lifetime_wrong)]
WrongReturn {
#[primary_span]
span: Span,
@ -127,7 +118,7 @@ pub(crate) enum LifetimeReturnCategoryErr<'a> {
outlived_fr_name: RegionName,
fr_name: &'a RegionName,
},
#[label(borrowck::returned_lifetime_short)]
#[label(borrowck_returned_lifetime_short)]
ShortReturn {
#[primary_span]
span: Span,
@ -149,9 +140,9 @@ impl IntoDiagnosticArg for RegionName {
}
}
#[derive(SessionSubdiagnostic)]
#[derive(Subdiagnostic)]
pub(crate) enum RequireStaticErr {
#[note(borrowck::used_impl_require_static)]
#[note(borrowck_used_impl_require_static)]
UsedImpl {
#[primary_span]
multi_span: MultiSpan,


@ -52,11 +52,8 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
Some(error_info) => error_info.to_universe_info(old_universe),
None => UniverseInfo::other(),
};
for u in old_universe..universe {
self.borrowck_context
.constraints
.universe_causes
.insert(u + 1, universe_info.clone());
for u in (old_universe + 1)..=universe {
self.borrowck_context.constraints.universe_causes.insert(u, universe_info.clone());
}
}
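A quick sanity check (illustrative only, using plain `u32` instead of `UniverseIndex`) that the rewritten inclusive range visits the same universes as the old shifted loop.

```rust
fn main() {
    let (old_universe, universe) = (2u32, 5u32);
    let old_form: Vec<u32> = (old_universe..universe).map(|u| u + 1).collect();
    let new_form: Vec<u32> = ((old_universe + 1)..=universe).collect();
    assert_eq!(old_form, new_form); // both are [3, 4, 5]
}
```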
@ -71,15 +68,13 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
where
T: TypeFoldable<'tcx>,
{
let old_universe = self.infcx.universe();
let (instantiated, _) =
self.infcx.instantiate_canonical_with_fresh_inference_vars(span, canonical);
for u in 0..canonical.max_universe.as_u32() {
let info = UniverseInfo::other();
self.borrowck_context
.constraints
.universe_causes
.insert(ty::UniverseIndex::from_u32(u), info);
for u in (old_universe + 1)..=self.infcx.universe() {
self.borrowck_context.constraints.universe_causes.insert(u, UniverseInfo::other());
}
instantiated


@ -19,7 +19,7 @@ use crate::{
};
pub(crate) struct ConstraintConversion<'a, 'tcx> {
infcx: &'a InferCtxt<'a, 'tcx>,
infcx: &'a InferCtxt<'tcx>,
tcx: TyCtxt<'tcx>,
universal_regions: &'a UniversalRegions<'tcx>,
/// Each RBP `GK: 'a` is assumed to be true. These encode
@ -43,7 +43,7 @@ pub(crate) struct ConstraintConversion<'a, 'tcx> {
impl<'a, 'tcx> ConstraintConversion<'a, 'tcx> {
pub(crate) fn new(
infcx: &'a InferCtxt<'a, 'tcx>,
infcx: &'a InferCtxt<'tcx>,
universal_regions: &'a UniversalRegions<'tcx>,
region_bound_pairs: &'a RegionBoundPairs<'tcx>,
implicit_region_bound: ty::Region<'tcx>,


@ -8,7 +8,6 @@ use rustc_infer::infer::InferCtxt;
use rustc_middle::mir::ConstraintCategory;
use rustc_middle::traits::query::OutlivesBound;
use rustc_middle::ty::{self, RegionVid, Ty};
use rustc_span::DUMMY_SP;
use rustc_trait_selection::traits::query::type_op::{self, TypeOp};
use std::rc::Rc;
use type_op::TypeOpOutput;
@ -48,7 +47,7 @@ pub(crate) struct CreateResult<'tcx> {
}
pub(crate) fn create<'tcx>(
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
implicit_region_bound: ty::Region<'tcx>,
universal_regions: &Rc<UniversalRegions<'tcx>>,
@ -197,7 +196,7 @@ impl UniversalRegionRelations<'_> {
}
struct UniversalRegionRelationsBuilder<'this, 'tcx> {
infcx: &'this InferCtxt<'this, 'tcx>,
infcx: &'this InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
universal_regions: Rc<UniversalRegions<'tcx>>,
implicit_region_bound: ty::Region<'tcx>,
@ -219,6 +218,7 @@ impl<'tcx> UniversalRegionRelationsBuilder<'_, 'tcx> {
}
pub(crate) fn create(mut self) -> CreateResult<'tcx> {
let span = self.infcx.tcx.def_span(self.universal_regions.defining_ty.def_id());
let unnormalized_input_output_tys = self
.universal_regions
.unnormalized_input_tys
@ -250,7 +250,7 @@ impl<'tcx> UniversalRegionRelationsBuilder<'_, 'tcx> {
self.infcx
.tcx
.sess
.delay_span_bug(DUMMY_SP, &format!("failed to normalize {:?}", ty));
.delay_span_bug(span, &format!("failed to normalize {:?}", ty));
TypeOpOutput {
output: self.infcx.tcx.ty_error(),
constraints: None,
@ -301,8 +301,8 @@ impl<'tcx> UniversalRegionRelationsBuilder<'_, 'tcx> {
&self.region_bound_pairs,
self.implicit_region_bound,
self.param_env,
Locations::All(DUMMY_SP),
DUMMY_SP,
Locations::All(span),
span,
ConstraintCategory::Internal,
&mut self.constraints,
)
@ -362,6 +362,11 @@ impl<'tcx> UniversalRegionRelationsBuilder<'_, 'tcx> {
self.region_bound_pairs
.insert(ty::OutlivesPredicate(GenericKind::Projection(projection_b), r_a));
}
OutlivesBound::RegionSubOpaque(r_a, def_id, substs) => {
self.region_bound_pairs
.insert(ty::OutlivesPredicate(GenericKind::Opaque(def_id, substs), r_a));
}
}
}
}


@ -7,16 +7,11 @@
//! `RETURN_PLACE` the MIR arguments) are always fully normalized (and
//! contain revealed `impl Trait` values).
use crate::type_check::constraint_conversion::ConstraintConversion;
use rustc_index::vec::Idx;
use rustc_infer::infer::LateBoundRegionConversionTime;
use rustc_middle::mir::*;
use rustc_middle::ty::Ty;
use rustc_span::Span;
use rustc_span::DUMMY_SP;
use rustc_trait_selection::traits::query::type_op::{self, TypeOp};
use rustc_trait_selection::traits::query::Fallible;
use type_op::TypeOpOutput;
use crate::universal_regions::UniversalRegions;
@ -185,7 +180,7 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
}
}
#[instrument(skip(self, span), level = "debug")]
#[instrument(skip(self), level = "debug")]
fn equate_normalized_input_or_output(&mut self, a: Ty<'tcx>, b: Ty<'tcx>, span: Span) {
if let Err(_) =
self.eq_types(a, b, Locations::All(span), ConstraintCategory::BoringNoLocation)
@ -194,13 +189,7 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
// `rustc_traits::normalize_after_erasing_regions`. Ideally, we'd
// like to normalize *before* inserting into `local_decls`, but
// doing so ends up causing some other trouble.
let b = match self.normalize_and_add_constraints(b) {
Ok(n) => n,
Err(_) => {
debug!("equate_inputs_and_outputs: NoSolution");
b
}
};
let b = self.normalize(b, Locations::All(span));
// Note: if we have to introduce new placeholders during normalization above, then we won't have
// added those universes to the universe info, which we would want in `relate_tys`.
@ -218,28 +207,4 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
}
}
}
pub(crate) fn normalize_and_add_constraints(&mut self, t: Ty<'tcx>) -> Fallible<Ty<'tcx>> {
let TypeOpOutput { output: norm_ty, constraints, .. } =
self.param_env.and(type_op::normalize::Normalize::new(t)).fully_perform(self.infcx)?;
debug!("{:?} normalized to {:?}", t, norm_ty);
for data in constraints {
ConstraintConversion::new(
self.infcx,
&self.borrowck_context.universal_regions,
&self.region_bound_pairs,
self.implicit_region_bound,
self.param_env,
Locations::All(DUMMY_SP),
DUMMY_SP,
ConstraintCategory::Internal,
&mut self.borrowck_context.constraints,
)
.convert_all(&*data);
}
Ok(norm_ty)
}
}


@ -123,7 +123,7 @@ mod relate_tys;
/// - `move_data` -- move-data constructed when performing the maybe-init dataflow analysis
/// - `elements` -- MIR region map
pub(crate) fn type_check<'mir, 'tcx>(
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
body: &Body<'tcx>,
promoted: &IndexVec<Promoted, Body<'tcx>>,
@ -138,8 +138,6 @@ pub(crate) fn type_check<'mir, 'tcx>(
use_polonius: bool,
) -> MirTypeckResults<'tcx> {
let implicit_region_bound = infcx.tcx.mk_region(ty::ReVar(universal_regions.fr_fn_body));
let mut universe_causes = FxHashMap::default();
universe_causes.insert(ty::UniverseIndex::from_u32(0), UniverseInfo::other());
let mut constraints = MirTypeckRegionConstraints {
placeholder_indices: PlaceholderIndices::default(),
placeholder_index_to_region: IndexVec::default(),
@ -148,7 +146,7 @@ pub(crate) fn type_check<'mir, 'tcx>(
member_constraints: MemberConstraintSet::default(),
closure_bounds_mapping: Default::default(),
type_tests: Vec::default(),
universe_causes,
universe_causes: FxHashMap::default(),
};
let CreateResult {
@ -165,9 +163,8 @@ pub(crate) fn type_check<'mir, 'tcx>(
debug!(?normalized_inputs_and_output);
for u in ty::UniverseIndex::ROOT..infcx.universe() {
let info = UniverseInfo::other();
constraints.universe_causes.insert(u, info);
for u in ty::UniverseIndex::ROOT..=infcx.universe() {
constraints.universe_causes.insert(u, UniverseInfo::other());
}
let mut borrowck_context = BorrowCheckContext {
@ -236,7 +233,7 @@ pub(crate) fn type_check<'mir, 'tcx>(
.unwrap();
let mut hidden_type = infcx.resolve_vars_if_possible(decl.hidden_type);
trace!("finalized opaque type {:?} to {:#?}", opaque_type_key, hidden_type.ty.kind());
if hidden_type.has_infer_types_or_consts() {
if hidden_type.has_non_region_infer() {
infcx.tcx.sess.delay_span_bug(
decl.hidden_type.span,
&format!("could not resolve {:#?}", hidden_type.ty.kind()),
@ -428,12 +425,18 @@ impl<'a, 'b, 'tcx> Visitor<'tcx> for TypeVerifier<'a, 'b, 'tcx> {
}
if let ty::FnDef(def_id, substs) = *constant.literal.ty().kind() {
// const_trait_impl: use a non-const param env when checking that a FnDef type is well formed.
// This is because proving the function well-formed does not require `const`
// impls for its trait bounds.
let instantiated_predicates = tcx.predicates_of(def_id).instantiate(tcx, substs);
let prev = self.cx.param_env;
self.cx.param_env = prev.without_const();
self.cx.normalize_and_prove_instantiated_predicates(
def_id,
instantiated_predicates,
locations,
);
self.cx.param_env = prev;
}
}
}
@ -581,6 +584,7 @@ impl<'a, 'b, 'tcx> TypeVerifier<'a, 'b, 'tcx> {
// modify their locations.
let all_facts = &mut None;
let mut constraints = Default::default();
let mut type_tests = Default::default();
let mut closure_bounds = Default::default();
let mut liveness_constraints =
LivenessValues::new(Rc::new(RegionValueElements::new(&promoted_body)));
@ -592,6 +596,7 @@ impl<'a, 'b, 'tcx> TypeVerifier<'a, 'b, 'tcx> {
&mut this.cx.borrowck_context.constraints.outlives_constraints,
&mut constraints,
);
mem::swap(&mut this.cx.borrowck_context.constraints.type_tests, &mut type_tests);
mem::swap(
&mut this.cx.borrowck_context.constraints.closure_bounds_mapping,
&mut closure_bounds,
@ -616,6 +621,13 @@ impl<'a, 'b, 'tcx> TypeVerifier<'a, 'b, 'tcx> {
swap_constraints(self);
let locations = location.to_locations();
// Use location of promoted const in collected constraints
for type_test in type_tests.iter() {
let mut type_test = type_test.clone();
type_test.locations = locations;
self.cx.borrowck_context.constraints.type_tests.push(type_test)
}
for constraint in constraints.outlives().iter() {
let mut constraint = constraint.clone();
constraint.locations = locations;
@ -764,6 +776,19 @@ impl<'a, 'b, 'tcx> TypeVerifier<'a, 'b, 'tcx> {
}
PlaceTy::from_ty(fty)
}
ProjectionElem::OpaqueCast(ty) => {
let ty = self.sanitize_type(place, ty);
let ty = self.cx.normalize(ty, location);
self.cx
.eq_types(
base.ty,
ty,
location.to_locations(),
ConstraintCategory::TypeAnnotation,
)
.unwrap();
PlaceTy::from_ty(ty)
}
}
}
@ -857,7 +882,7 @@ impl<'a, 'b, 'tcx> TypeVerifier<'a, 'b, 'tcx> {
/// way, it accrues region constraints -- these can later be used by
/// NLL region checking.
struct TypeChecker<'a, 'tcx> {
infcx: &'a InferCtxt<'a, 'tcx>,
infcx: &'a InferCtxt<'tcx>,
param_env: ty::ParamEnv<'tcx>,
last_span: Span,
body: &'a Body<'tcx>,
@ -927,7 +952,7 @@ pub(crate) struct MirTypeckRegionConstraints<'tcx> {
impl<'tcx> MirTypeckRegionConstraints<'tcx> {
fn placeholder_region(
&mut self,
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
placeholder: ty::PlaceholderRegion,
) -> ty::Region<'tcx> {
let placeholder_index = self.placeholder_indices.insert(placeholder);
@ -1011,7 +1036,7 @@ impl Locations {
impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
fn new(
infcx: &'a InferCtxt<'a, 'tcx>,
infcx: &'a InferCtxt<'tcx>,
body: &'a Body<'tcx>,
param_env: ty::ParamEnv<'tcx>,
region_bound_pairs: &'a RegionBoundPairs<'tcx>,
@ -1170,10 +1195,11 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
tcx,
self.param_env,
proj,
|this, field, ()| {
|this, field, _| {
let ty = this.field_ty(tcx, field);
self.normalize(ty, locations)
},
|_, _| unreachable!(),
);
curr_projected_ty = projected_ty;
}
@ -1604,7 +1630,7 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
let op_arg_ty = self.normalize(op_arg_ty, term_location);
let category = if from_hir_call {
ConstraintCategory::CallArgument(func_ty)
ConstraintCategory::CallArgument(self.infcx.tcx.erase_regions(func_ty))
} else {
ConstraintCategory::Boring
};
@ -1757,7 +1783,7 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
// `Sized` bound in no way depends on precise regions, so this
// shouldn't affect `is_sized`.
let erased_ty = tcx.erase_regions(ty);
if !erased_ty.is_sized(tcx.at(span), self.param_env) {
if !erased_ty.is_sized(tcx, self.param_env) {
// in current MIR construction, all non-control-flow rvalue
// expressions evaluate through `as_temp` or `into` a return
// slot or local, so to find all unsized rvalues it is enough
@ -2189,25 +2215,104 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
}
}
}
CastKind::Misc => {
CastKind::IntToInt => {
let ty_from = op.ty(body, tcx);
let cast_ty_from = CastTy::from_ty(ty_from);
let cast_ty_to = CastTy::from_ty(*ty);
// Misc casts are either between floats and ints, or one ptr type to another.
match (cast_ty_from, cast_ty_to) {
(
Some(CastTy::Int(_) | CastTy::Float),
Some(CastTy::Int(_) | CastTy::Float),
)
| (Some(CastTy::Ptr(_) | CastTy::FnPtr), Some(CastTy::Ptr(_))) => (),
(Some(CastTy::Int(_)), Some(CastTy::Int(_))) => (),
_ => {
span_mirbug!(
self,
rvalue,
"Invalid Misc cast {:?} -> {:?}",
"Invalid IntToInt cast {:?} -> {:?}",
ty_from,
ty,
ty
)
}
}
}
CastKind::IntToFloat => {
let ty_from = op.ty(body, tcx);
let cast_ty_from = CastTy::from_ty(ty_from);
let cast_ty_to = CastTy::from_ty(*ty);
match (cast_ty_from, cast_ty_to) {
(Some(CastTy::Int(_)), Some(CastTy::Float)) => (),
_ => {
span_mirbug!(
self,
rvalue,
"Invalid IntToFloat cast {:?} -> {:?}",
ty_from,
ty
)
}
}
}
CastKind::FloatToInt => {
let ty_from = op.ty(body, tcx);
let cast_ty_from = CastTy::from_ty(ty_from);
let cast_ty_to = CastTy::from_ty(*ty);
match (cast_ty_from, cast_ty_to) {
(Some(CastTy::Float), Some(CastTy::Int(_))) => (),
_ => {
span_mirbug!(
self,
rvalue,
"Invalid FloatToInt cast {:?} -> {:?}",
ty_from,
ty
)
}
}
}
CastKind::FloatToFloat => {
let ty_from = op.ty(body, tcx);
let cast_ty_from = CastTy::from_ty(ty_from);
let cast_ty_to = CastTy::from_ty(*ty);
match (cast_ty_from, cast_ty_to) {
(Some(CastTy::Float), Some(CastTy::Float)) => (),
_ => {
span_mirbug!(
self,
rvalue,
"Invalid FloatToFloat cast {:?} -> {:?}",
ty_from,
ty
)
}
}
}
CastKind::FnPtrToPtr => {
let ty_from = op.ty(body, tcx);
let cast_ty_from = CastTy::from_ty(ty_from);
let cast_ty_to = CastTy::from_ty(*ty);
match (cast_ty_from, cast_ty_to) {
(Some(CastTy::FnPtr), Some(CastTy::Ptr(_))) => (),
_ => {
span_mirbug!(
self,
rvalue,
"Invalid FnPtrToPtr cast {:?} -> {:?}",
ty_from,
ty
)
}
}
}
CastKind::PtrToPtr => {
let ty_from = op.ty(body, tcx);
let cast_ty_from = CastTy::from_ty(ty_from);
let cast_ty_to = CastTy::from_ty(*ty);
match (cast_ty_from, cast_ty_to) {
(Some(CastTy::Ptr(_)), Some(CastTy::Ptr(_))) => (),
_ => {
span_mirbug!(
self,
rvalue,
"Invalid PtrToPtr cast {:?} -> {:?}",
ty_from,
ty
)
}
}
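For reference, surface-level `as` casts that roughly correspond to the MIR cast kinds checked above (an illustrative mapping, not taken from this change).

```rust
fn main() {
    let a = 5u8 as i64;          // IntToInt
    let b = 5u8 as f64;          // IntToFloat
    let c = 3.7f64 as i32;       // FloatToInt
    let d = 3.7f64 as f32;       // FloatToFloat
    let f: fn() = main;
    let p = f as *const ();      // FnPtrToPtr
    let q = p as *const u8;      // PtrToPtr
    let _ = (a, b, c, d, q);
}
```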
@ -2503,6 +2608,7 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
}
ProjectionElem::Field(..)
| ProjectionElem::Downcast(..)
| ProjectionElem::OpaqueCast(..)
| ProjectionElem::Index(..)
| ProjectionElem::ConstantIndex { .. }
| ProjectionElem::Subslice { .. } => {
@ -2723,7 +2829,7 @@ impl<'tcx> TypeOp<'tcx> for InstantiateOpaqueType<'tcx> {
/// constraints in our `InferCtxt`
type ErrorInfo = InstantiateOpaqueType<'tcx>;
fn fully_perform(mut self, infcx: &InferCtxt<'_, 'tcx>) -> Fallible<TypeOpOutput<'tcx, Self>> {
fn fully_perform(mut self, infcx: &InferCtxt<'tcx>) -> Fallible<TypeOpOutput<'tcx, Self>> {
let (mut output, region_constraints) = scrape_region_constraints(infcx, || {
Ok(InferOk { value: (), obligations: self.obligations.clone() })
})?;


@ -1,6 +1,6 @@
use rustc_infer::infer::nll_relate::{NormalizationStrategy, TypeRelating, TypeRelatingDelegate};
use rustc_infer::infer::NllRegionVariableOrigin;
use rustc_infer::traits::ObligationCause;
use rustc_infer::traits::PredicateObligations;
use rustc_middle::mir::ConstraintCategory;
use rustc_middle::ty::error::TypeError;
use rustc_middle::ty::relate::TypeRelation;
@ -155,27 +155,16 @@ impl<'tcx> TypeRelatingDelegate<'tcx> for NllTypeRelatingDelegate<'_, '_, 'tcx>
true
}
fn register_opaque_type(
fn register_opaque_type_obligations(
&mut self,
a: Ty<'tcx>,
b: Ty<'tcx>,
a_is_expected: bool,
obligations: PredicateObligations<'tcx>,
) -> Result<(), TypeError<'tcx>> {
let param_env = self.param_env();
let span = self.span();
let def_id = self.type_checker.body.source.def_id().expect_local();
let body_id = self.type_checker.tcx().hir().local_def_id_to_hir_id(def_id);
let cause = ObligationCause::misc(span, body_id);
self.type_checker
.fully_perform_op(
self.locations,
self.category,
InstantiateOpaqueType {
obligations: self
.type_checker
.infcx
.handle_opaque_type(a, b, a_is_expected, &cause, param_env)?
.obligations,
obligations,
// These fields are filled in during execution of the operation
base_universe: None,
region_constraints: None,


@ -22,8 +22,8 @@ use rustc_hir::{BodyOwnerKind, HirId};
use rustc_index::vec::{Idx, IndexVec};
use rustc_infer::infer::{InferCtxt, NllRegionVariableOrigin};
use rustc_middle::ty::fold::TypeFoldable;
use rustc_middle::ty::subst::{InternalSubsts, Subst, SubstsRef};
use rustc_middle::ty::{self, InlineConstSubsts, InlineConstSubstsParts, RegionVid, Ty, TyCtxt};
use rustc_middle::ty::{InternalSubsts, SubstsRef};
use std::iter;
use crate::nll::ToRegionVid;
@ -219,7 +219,7 @@ impl<'tcx> UniversalRegions<'tcx> {
/// signature. This will also compute the relationships that are
/// known between those regions.
pub fn new(
infcx: &InferCtxt<'_, 'tcx>,
infcx: &InferCtxt<'tcx>,
mir_def: ty::WithOptConstParam<LocalDefId>,
param_env: ty::ParamEnv<'tcx>,
) -> Self {
@ -382,7 +382,7 @@ impl<'tcx> UniversalRegions<'tcx> {
}
struct UniversalRegionsBuilder<'cx, 'tcx> {
infcx: &'cx InferCtxt<'cx, 'tcx>,
infcx: &'cx InferCtxt<'tcx>,
mir_def: ty::WithOptConstParam<LocalDefId>,
mir_hir_id: HirId,
param_env: ty::ParamEnv<'tcx>,
@ -414,7 +414,7 @@ impl<'cx, 'tcx> UniversalRegionsBuilder<'cx, 'tcx> {
let typeck_root_def_id = self.infcx.tcx.typeck_root_def_id(self.mir_def.did.to_def_id());
// If this is is a 'root' body (not a closure/generator/inline const), then
// If this is a 'root' body (not a closure/generator/inline const), then
// there are no extern regions, so the local regions start at the same
// position as the (empty) sub-list of extern regions
let first_local_index = if self.mir_def.did.to_def_id() == typeck_root_def_id {
@ -699,7 +699,7 @@ trait InferCtxtExt<'tcx> {
);
}
impl<'cx, 'tcx> InferCtxtExt<'tcx> for InferCtxt<'cx, 'tcx> {
impl<'tcx> InferCtxtExt<'tcx> for InferCtxt<'tcx> {
fn replace_free_regions_with_nll_infer_vars<T>(
&self,
origin: NllRegionVariableOrigin,

View File

@ -58,6 +58,7 @@ impl<'cx, 'a> Context<'cx, 'a> {
/// Builds the whole `assert!` expression. For example, `let elem = 1; assert!(elem == 1);` expands to:
///
/// ```rust
/// #![feature(generic_assert_internals)]
/// let elem = 1;
/// {
/// #[allow(unused_imports)]
@ -70,7 +71,7 @@ impl<'cx, 'a> Context<'cx, 'a> {
/// __local_bind0
/// } == 1
/// ) {
/// panic!("Assertion failed: elem == 1\nWith captures:\n elem = {}", __capture0)
/// panic!("Assertion failed: elem == 1\nWith captures:\n elem = {:?}", __capture0)
/// }
/// }
/// ```
@ -241,8 +242,8 @@ impl<'cx, 'a> Context<'cx, 'a> {
self.manage_cond_expr(prefix);
self.manage_cond_expr(suffix);
}
ExprKind::MethodCall(_, ref mut local_exprs, _) => {
for local_expr in local_exprs.iter_mut().skip(1) {
ExprKind::MethodCall(_, _,ref mut local_exprs, _) => {
for local_expr in local_exprs.iter_mut() {
self.manage_cond_expr(local_expr);
}
}
@ -378,14 +379,12 @@ impl<'cx, 'a> Context<'cx, 'a> {
id: DUMMY_NODE_ID,
ident: Ident::new(sym::try_capture, self.span),
},
vec![
expr_paren(self.cx, self.span, self.cx.expr_addr_of(self.span, wrapper)),
expr_addr_of_mut(
self.cx,
self.span,
self.cx.expr_path(Path::from_ident(capture)),
),
],
expr_paren(self.cx, self.span, self.cx.expr_addr_of(self.span, wrapper)),
vec![expr_addr_of_mut(
self.cx,
self.span,
self.cx.expr_path(Path::from_ident(capture)),
)],
self.span,
))
.add_trailing_semicolon();
@ -443,10 +442,11 @@ fn expr_addr_of_mut(cx: &ExtCtxt<'_>, sp: Span, e: P<Expr>) -> P<Expr> {
fn expr_method_call(
cx: &ExtCtxt<'_>,
path: PathSegment,
receiver: P<Expr>,
args: Vec<P<Expr>>,
span: Span,
) -> P<Expr> {
cx.expr(span, ExprKind::MethodCall(path, args, span))
cx.expr(span, ExprKind::MethodCall(path, receiver, args, span))
}
fn expr_paren(cx: &ExtCtxt<'_>, sp: Span, e: P<Expr>) -> P<Expr> {

View File

@ -8,7 +8,7 @@ use rustc_ast::tokenstream::TokenStream;
use rustc_attr as attr;
use rustc_errors::PResult;
use rustc_expand::base::{self, *};
use rustc_macros::SessionDiagnostic;
use rustc_macros::Diagnostic;
use rustc_span::Span;
pub fn expand_cfg(
@ -35,16 +35,16 @@ pub fn expand_cfg(
}
}
#[derive(SessionDiagnostic)]
#[diag(builtin_macros::requires_cfg_pattern)]
#[derive(Diagnostic)]
#[diag(builtin_macros_requires_cfg_pattern)]
struct RequiresCfgPattern {
#[primary_span]
#[label]
span: Span,
}
#[derive(SessionDiagnostic)]
#[diag(builtin_macros::expected_one_cfg_pattern)]
#[derive(Diagnostic)]
#[diag(builtin_macros_expected_one_cfg_pattern)]
struct OneCfgPattern {
#[primary_span]
span: Span,

View File

@ -7,6 +7,7 @@ use rustc_ast::visit::Visitor;
use rustc_ast::NodeId;
use rustc_ast::{mut_visit, visit};
use rustc_ast::{Attribute, HasAttrs, HasTokens};
use rustc_errors::PResult;
use rustc_expand::base::{Annotatable, ExtCtxt};
use rustc_expand::config::StripUnconfigured;
use rustc_expand::configure;
@ -144,33 +145,34 @@ impl CfgEval<'_, '_> {
// the location of `#[cfg]` and `#[cfg_attr]` in the token stream. The tokenization
// process is lossless, so this process is invisible to proc-macros.
let parse_annotatable_with: fn(&mut Parser<'_>) -> _ = match annotatable {
Annotatable::Item(_) => {
|parser| Annotatable::Item(parser.parse_item(ForceCollect::Yes).unwrap().unwrap())
}
Annotatable::TraitItem(_) => |parser| {
Annotatable::TraitItem(
parser.parse_trait_item(ForceCollect::Yes).unwrap().unwrap().unwrap(),
)
},
Annotatable::ImplItem(_) => |parser| {
Annotatable::ImplItem(
parser.parse_impl_item(ForceCollect::Yes).unwrap().unwrap().unwrap(),
)
},
Annotatable::ForeignItem(_) => |parser| {
Annotatable::ForeignItem(
parser.parse_foreign_item(ForceCollect::Yes).unwrap().unwrap().unwrap(),
)
},
Annotatable::Stmt(_) => |parser| {
Annotatable::Stmt(P(parser.parse_stmt(ForceCollect::Yes).unwrap().unwrap()))
},
Annotatable::Expr(_) => {
|parser| Annotatable::Expr(parser.parse_expr_force_collect().unwrap())
}
_ => unreachable!(),
};
let parse_annotatable_with: for<'a> fn(&mut Parser<'a>) -> PResult<'a, _> =
match annotatable {
Annotatable::Item(_) => {
|parser| Ok(Annotatable::Item(parser.parse_item(ForceCollect::Yes)?.unwrap()))
}
Annotatable::TraitItem(_) => |parser| {
Ok(Annotatable::TraitItem(
parser.parse_trait_item(ForceCollect::Yes)?.unwrap().unwrap(),
))
},
Annotatable::ImplItem(_) => |parser| {
Ok(Annotatable::ImplItem(
parser.parse_impl_item(ForceCollect::Yes)?.unwrap().unwrap(),
))
},
Annotatable::ForeignItem(_) => |parser| {
Ok(Annotatable::ForeignItem(
parser.parse_foreign_item(ForceCollect::Yes)?.unwrap().unwrap(),
))
},
Annotatable::Stmt(_) => |parser| {
Ok(Annotatable::Stmt(P(parser.parse_stmt(ForceCollect::Yes)?.unwrap())))
},
Annotatable::Expr(_) => {
|parser| Ok(Annotatable::Expr(parser.parse_expr_force_collect()?))
}
_ => unreachable!(),
};
// 'Flatten' all nonterminals (i.e. `TokenKind::Interpolated`)
// to `None`-delimited groups containing the corresponding tokens. This
@ -193,7 +195,13 @@ impl CfgEval<'_, '_> {
let mut parser =
rustc_parse::stream_to_parser(&self.cfg.sess.parse_sess, orig_tokens, None);
parser.capture_cfg = true;
annotatable = parse_annotatable_with(&mut parser);
match parse_annotatable_with(&mut parser) {
Ok(a) => annotatable = a,
Err(mut err) => {
err.emit();
return Some(annotatable);
}
}
// Now that we have our re-parsed `AttrTokenStream`, recursively configuring
// our attribute target will correctly configure the tokens as well.
@ -202,8 +210,15 @@ impl CfgEval<'_, '_> {
}
impl MutVisitor for CfgEval<'_, '_> {
#[instrument(level = "trace", skip(self))]
fn visit_expr(&mut self, expr: &mut P<ast::Expr>) {
self.cfg.configure_expr(expr);
self.cfg.configure_expr(expr, false);
mut_visit::noop_visit_expr(expr, self);
}
#[instrument(level = "trace", skip(self))]
fn visit_method_receiver_expr(&mut self, expr: &mut P<ast::Expr>) {
self.cfg.configure_expr(expr, true);
mut_visit::noop_visit_expr(expr, self);
}

View File

@ -16,6 +16,7 @@ pub fn expand_deriving_copy(
let trait_def = TraitDef {
span,
path: path_std!(marker::Copy),
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: true,

View File

@ -72,6 +72,7 @@ pub fn expand_deriving_clone(
let trait_def = TraitDef {
span,
path: path_std!(clone::Clone),
skip_path_as_bound: false,
additional_bounds: bounds,
generics: Bounds::empty(),
supports_unions: true,

View File

@ -25,6 +25,7 @@ pub fn expand_deriving_eq(
let trait_def = TraitDef {
span,
path: path_std!(cmp::Eq),
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: true,

View File

@ -19,6 +19,7 @@ pub fn expand_deriving_ord(
let trait_def = TraitDef {
span,
path: path_std!(cmp::Ord),
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: false,

View File

@ -83,6 +83,7 @@ pub fn expand_deriving_partial_eq(
let trait_def = TraitDef {
span,
path: path_std!(cmp::PartialEq),
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: false,

View File

@ -37,6 +37,7 @@ pub fn expand_deriving_partial_ord(
let trait_def = TraitDef {
span,
path: path_std!(cmp::PartialOrd),
skip_path_as_bound: false,
additional_bounds: vec![],
generics: Bounds::empty(),
supports_unions: false,

View File

@ -20,6 +20,7 @@ pub fn expand_deriving_debug(
let trait_def = TraitDef {
span,
path: path_std!(fmt::Debug),
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: false,

View File

@ -23,6 +23,7 @@ pub fn expand_deriving_rustc_decodable(
let trait_def = TraitDef {
span,
path: Path::new_(vec![krate, sym::Decodable], vec![], PathKind::Global),
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: false,

View File

@ -24,6 +24,7 @@ pub fn expand_deriving_default(
let trait_def = TraitDef {
span,
path: Path::new(vec![kw::Default, sym::Default]),
skip_path_as_bound: has_a_default_variant(item),
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: false,
@ -262,3 +263,22 @@ impl<'a, 'b> rustc_ast::visit::Visitor<'a> for DetectNonVariantDefaultAttr<'a, '
}
}
}
fn has_a_default_variant(item: &Annotatable) -> bool {
struct HasDefaultAttrOnVariant {
found: bool,
}
impl<'ast> rustc_ast::visit::Visitor<'ast> for HasDefaultAttrOnVariant {
fn visit_variant(&mut self, v: &'ast rustc_ast::Variant) {
if v.attrs.iter().any(|attr| attr.has_name(kw::Default)) {
self.found = true;
}
// no need to subrecurse.
}
}
let mut visitor = HasDefaultAttrOnVariant { found: false };
item.visit_with(&mut visitor);
visitor.found
}
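The visitor above only checks whether any variant carries a `#[default]` attribute; combined with `skip_path_as_bound` this appears intended to drop the `Default` bound on type parameters in that case. A minimal stand-alone illustration of the surface feature (assumed behaviour, not taken from this diff):

```rust
// With a `#[default]` unit variant, `Mode::<T>::default()` never constructs a `T`,
// so the derived impl does not need a `T: Default` bound.
#[derive(Default)]
enum Mode<T> {
    #[default]
    Off,
    On(T),
}

fn main() {
    let m: Mode<String> = Mode::default();
    assert!(matches!(m, Mode::Off));
}
```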

View File

@ -107,6 +107,7 @@ pub fn expand_deriving_rustc_encodable(
let trait_def = TraitDef {
span,
path: Path::new_(vec![krate, sym::Encodable], vec![], PathKind::Global),
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: false,

View File

@ -174,6 +174,7 @@ use rustc_span::symbol::{kw, sym, Ident, Symbol};
use rustc_span::Span;
use std::cell::RefCell;
use std::iter;
use std::ops::Not;
use std::vec;
use thin_vec::thin_vec;
use ty::{Bounds, Path, Ref, Self_, Ty};
@ -187,6 +188,9 @@ pub struct TraitDef<'a> {
/// Path of the trait, including any type parameters
pub path: Path,
/// Whether to skip adding the current trait as a bound to the type parameters of the type.
pub skip_path_as_bound: bool,
/// Additional bounds required of any type parameters of the type,
/// other than the current trait
pub additional_bounds: Vec<Ty>,
@ -562,7 +566,7 @@ impl<'a> TraitDef<'a> {
tokens: None,
},
attrs: ast::AttrVec::new(),
kind: ast::AssocItemKind::TyAlias(Box::new(ast::TyAlias {
kind: ast::AssocItemKind::Type(Box::new(ast::TyAlias {
defaultness: ast::Defaultness::Final,
generics: Generics::default(),
where_clauses: (
@ -596,7 +600,7 @@ impl<'a> TraitDef<'a> {
cx.trait_bound(p.to_path(cx, self.span, type_ident, generics))
}).chain(
// require the current trait
iter::once(cx.trait_bound(trait_path.clone()))
self.skip_path_as_bound.not().then(|| cx.trait_bound(trait_path.clone()))
).chain(
// also add in any bounds from the declaration
param.bounds.iter().cloned()
@ -1110,6 +1114,11 @@ impl<'a> MethodDef<'a> {
/// ```
/// is equivalent to:
/// ```
/// #![feature(core_intrinsics)]
/// enum A {
/// A1,
/// A2(i32)
/// }
/// impl ::core::cmp::PartialEq for A {
/// #[inline]
/// fn eq(&self, other: &A) -> bool {

View File

@ -22,6 +22,7 @@ pub fn expand_deriving_hash(
let hash_trait_def = TraitDef {
span,
path,
skip_path_as_bound: false,
additional_bounds: Vec::new(),
generics: Bounds::empty(),
supports_unions: false,

View File

@ -131,6 +131,8 @@ fn inject_impl_of_structural_trait(
// Create generics param list for where clauses and impl headers
let mut generics = generics.clone();
let ctxt = span.ctxt();
// Create the type of `self`.
//
// in addition, remove defaults from generic params (impls cannot have them).
@ -138,16 +140,18 @@ fn inject_impl_of_structural_trait(
.params
.iter_mut()
.map(|param| match &mut param.kind {
ast::GenericParamKind::Lifetime => {
ast::GenericArg::Lifetime(cx.lifetime(span, param.ident))
}
ast::GenericParamKind::Lifetime => ast::GenericArg::Lifetime(
cx.lifetime(param.ident.span.with_ctxt(ctxt), param.ident),
),
ast::GenericParamKind::Type { default } => {
*default = None;
ast::GenericArg::Type(cx.ty_ident(span, param.ident))
ast::GenericArg::Type(cx.ty_ident(param.ident.span.with_ctxt(ctxt), param.ident))
}
ast::GenericParamKind::Const { ty: _, kw_span: _, default } => {
*default = None;
ast::GenericArg::Const(cx.const_ident(span, param.ident))
ast::GenericArg::Const(
cx.const_ident(param.ident.span.with_ctxt(ctxt), param.ident),
)
}
})
.collect();
@ -174,6 +178,8 @@ fn inject_impl_of_structural_trait(
})
.cloned(),
);
// Mark as `automatically_derived` to avoid some silly lints.
attrs.push(cx.attribute(cx.meta_word(span, sym::automatically_derived)));
let newitem = cx.item(
span,

File diff suppressed because it is too large

View File

@ -0,0 +1,240 @@
use rustc_ast::ptr::P;
use rustc_ast::Expr;
use rustc_data_structures::fx::FxHashMap;
use rustc_span::symbol::{Ident, Symbol};
use rustc_span::Span;
// Definitions:
//
// format_args!("hello {abc:.xyz$}!!", abc="world");
// └──────────────────────────────────────────────┘
// FormatArgs
//
// format_args!("hello {abc:.xyz$}!!", abc="world");
// └─────────┘
// argument
//
// format_args!("hello {abc:.xyz$}!!", abc="world");
// └───────────────────┘
// template
//
// format_args!("hello {abc:.xyz$}!!", abc="world");
// └────┘└─────────┘└┘
// pieces
//
// format_args!("hello {abc:.xyz$}!!", abc="world");
// └────┘ └┘
// literal pieces
//
// format_args!("hello {abc:.xyz$}!!", abc="world");
// └─────────┘
// placeholder
//
// format_args!("hello {abc:.xyz$}!!", abc="world");
// └─┘ └─┘
// positions (could be names, numbers, empty, or `*`)
/// (Parsed) format args.
///
/// Basically the "AST" for a complete `format_args!()`.
///
/// E.g., `format_args!("hello {name}");`.
#[derive(Clone, Debug)]
pub struct FormatArgs {
pub span: Span,
pub template: Vec<FormatArgsPiece>,
pub arguments: FormatArguments,
}
/// A piece of a format template string.
///
/// E.g. "hello" or "{name}".
#[derive(Clone, Debug)]
pub enum FormatArgsPiece {
Literal(Symbol),
Placeholder(FormatPlaceholder),
}
/// The arguments to format_args!().
///
/// E.g. `1, 2, name="ferris", n=3`,
/// but also implicit captured arguments like `x` in `format_args!("{x}")`.
#[derive(Clone, Debug)]
pub struct FormatArguments {
arguments: Vec<FormatArgument>,
num_unnamed_args: usize,
num_explicit_args: usize,
names: FxHashMap<Symbol, usize>,
}
impl FormatArguments {
pub fn new() -> Self {
Self {
arguments: Vec::new(),
names: FxHashMap::default(),
num_unnamed_args: 0,
num_explicit_args: 0,
}
}
pub fn add(&mut self, arg: FormatArgument) -> usize {
let index = self.arguments.len();
if let Some(name) = arg.kind.ident() {
self.names.insert(name.name, index);
} else if self.names.is_empty() {
// Only count the unnamed args before the first named arg.
// (Any later ones are errors.)
self.num_unnamed_args += 1;
}
if !matches!(arg.kind, FormatArgumentKind::Captured(..)) {
// This is an explicit argument.
// Make sure that all arguments so far are explicit.
assert_eq!(
self.num_explicit_args,
self.arguments.len(),
"captured arguments must be added last"
);
self.num_explicit_args += 1;
}
self.arguments.push(arg);
index
}
pub fn by_name(&self, name: Symbol) -> Option<(usize, &FormatArgument)> {
let i = *self.names.get(&name)?;
Some((i, &self.arguments[i]))
}
pub fn by_index(&self, i: usize) -> Option<&FormatArgument> {
(i < self.num_explicit_args).then(|| &self.arguments[i])
}
pub fn unnamed_args(&self) -> &[FormatArgument] {
&self.arguments[..self.num_unnamed_args]
}
pub fn named_args(&self) -> &[FormatArgument] {
&self.arguments[self.num_unnamed_args..self.num_explicit_args]
}
pub fn explicit_args(&self) -> &[FormatArgument] {
&self.arguments[..self.num_explicit_args]
}
pub fn into_vec(self) -> Vec<FormatArgument> {
self.arguments
}
}
#[derive(Clone, Debug)]
pub struct FormatArgument {
pub kind: FormatArgumentKind,
pub expr: P<Expr>,
}
#[derive(Clone, Debug)]
pub enum FormatArgumentKind {
/// `format_args(…, arg)`
Normal,
/// `format_args(…, arg = 1)`
Named(Ident),
/// `format_args("… {arg} …")`
Captured(Ident),
}
impl FormatArgumentKind {
pub fn ident(&self) -> Option<Ident> {
match self {
&Self::Normal => None,
&Self::Named(id) => Some(id),
&Self::Captured(id) => Some(id),
}
}
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct FormatPlaceholder {
/// Index into [`FormatArgs::arguments`].
pub argument: FormatArgPosition,
/// The span inside the format string for the full `{…}` placeholder.
pub span: Option<Span>,
/// `{}`, `{:?}`, or `{:x}`, etc.
pub format_trait: FormatTrait,
/// `{}` or `{:.5}` or `{:-^20}`, etc.
pub format_options: FormatOptions,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct FormatArgPosition {
/// Which argument this position refers to (Ok),
/// or would've referred to if it existed (Err).
pub index: Result<usize, usize>,
/// What kind of position this is. See [`FormatArgPositionKind`].
pub kind: FormatArgPositionKind,
/// The span of the name or number.
pub span: Option<Span>,
}
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub enum FormatArgPositionKind {
/// `{}` or `{:.*}`
Implicit,
/// `{1}` or `{:1$}` or `{:.1$}`
Number,
/// `{a}` or `{:a$}` or `{:.a$}`
Named,
}
#[derive(Copy, Clone, Debug, Hash, PartialEq, Eq)]
pub enum FormatTrait {
/// `{}`
Display,
/// `{:?}`
Debug,
/// `{:e}`
LowerExp,
/// `{:E}`
UpperExp,
/// `{:o}`
Octal,
/// `{:p}`
Pointer,
/// `{:b}`
Binary,
/// `{:x}`
LowerHex,
/// `{:X}`
UpperHex,
}
#[derive(Clone, Debug, Default, PartialEq, Eq)]
pub struct FormatOptions {
/// The width. E.g. `{:5}` or `{:width$}`.
pub width: Option<FormatCount>,
/// The precision. E.g. `{:.5}` or `{:.precision$}`.
pub precision: Option<FormatCount>,
/// The alignment. E.g. `{:>}` or `{:<}` or `{:^}`.
pub alignment: Option<FormatAlignment>,
/// The fill character. E.g. the `.` in `{:.>10}`.
pub fill: Option<char>,
/// The `+`, `-`, `0`, `#`, `x?` and `X?` flags.
pub flags: u32,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum FormatAlignment {
/// `{:<}`
Left,
/// `{:>}`
Right,
/// `{:^}`
Center,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum FormatCount {
/// `{:5}` or `{:.5}`
Literal(usize),
/// `{:.*}`, `{:.5$}`, or `{:a$}`, etc.
Argument(FormatArgPosition),
}
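As a concrete tie-in to the annotated format string at the top of this file, a small stand-alone program using the same placeholder shapes (plain stable `println!`, nothing compiler-internal):

```rust
fn main() {
    // `abc` is an explicit named argument (FormatArgumentKind::Named);
    // `xyz` is implicitly captured and used as the precision (FormatCount::Argument).
    let xyz = 3;
    println!("hello {abc:.xyz$}!!", abc = "world"); // prints: hello wor!!
}
```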

View File

@ -0,0 +1,353 @@
use super::*;
use rustc_ast as ast;
use rustc_ast::visit::{self, Visitor};
use rustc_ast::{BlockCheckMode, UnsafeSource};
use rustc_data_structures::fx::FxIndexSet;
use rustc_span::{sym, symbol::kw};
#[derive(Copy, Clone, Debug, Hash, PartialEq, Eq)]
enum ArgumentType {
Format(FormatTrait),
Usize,
}
fn make_argument(ecx: &ExtCtxt<'_>, sp: Span, arg: P<ast::Expr>, ty: ArgumentType) -> P<ast::Expr> {
// Generate:
// ::core::fmt::ArgumentV1::new_…(arg)
use ArgumentType::*;
use FormatTrait::*;
ecx.expr_call_global(
sp,
ecx.std_path(&[
sym::fmt,
sym::ArgumentV1,
match ty {
Format(Display) => sym::new_display,
Format(Debug) => sym::new_debug,
Format(LowerExp) => sym::new_lower_exp,
Format(UpperExp) => sym::new_upper_exp,
Format(Octal) => sym::new_octal,
Format(Pointer) => sym::new_pointer,
Format(Binary) => sym::new_binary,
Format(LowerHex) => sym::new_lower_hex,
Format(UpperHex) => sym::new_upper_hex,
Usize => sym::from_usize,
},
]),
vec![arg],
)
}
fn make_count(
ecx: &ExtCtxt<'_>,
sp: Span,
count: &Option<FormatCount>,
argmap: &mut FxIndexSet<(usize, ArgumentType)>,
) -> P<ast::Expr> {
// Generate:
// ::core::fmt::rt::v1::Count::…(…)
match count {
Some(FormatCount::Literal(n)) => ecx.expr_call_global(
sp,
ecx.std_path(&[sym::fmt, sym::rt, sym::v1, sym::Count, sym::Is]),
vec![ecx.expr_usize(sp, *n)],
),
Some(FormatCount::Argument(arg)) => {
if let Ok(arg_index) = arg.index {
let (i, _) = argmap.insert_full((arg_index, ArgumentType::Usize));
ecx.expr_call_global(
sp,
ecx.std_path(&[sym::fmt, sym::rt, sym::v1, sym::Count, sym::Param]),
vec![ecx.expr_usize(sp, i)],
)
} else {
DummyResult::raw_expr(sp, true)
}
}
None => ecx.expr_path(ecx.path_global(
sp,
ecx.std_path(&[sym::fmt, sym::rt, sym::v1, sym::Count, sym::Implied]),
)),
}
}
fn make_format_spec(
ecx: &ExtCtxt<'_>,
sp: Span,
placeholder: &FormatPlaceholder,
argmap: &mut FxIndexSet<(usize, ArgumentType)>,
) -> P<ast::Expr> {
// Generate:
// ::core::fmt::rt::v1::Argument {
// position: 0usize,
// format: ::core::fmt::rt::v1::FormatSpec {
// fill: ' ',
// align: ::core::fmt::rt::v1::Alignment::Unknown,
// flags: 0u32,
// precision: ::core::fmt::rt::v1::Count::Implied,
// width: ::core::fmt::rt::v1::Count::Implied,
// },
// }
let position = match placeholder.argument.index {
Ok(arg_index) => {
let (i, _) =
argmap.insert_full((arg_index, ArgumentType::Format(placeholder.format_trait)));
ecx.expr_usize(sp, i)
}
Err(_) => DummyResult::raw_expr(sp, true),
};
let fill = ecx.expr_char(sp, placeholder.format_options.fill.unwrap_or(' '));
let align = ecx.expr_path(ecx.path_global(
sp,
ecx.std_path(&[
sym::fmt,
sym::rt,
sym::v1,
sym::Alignment,
match placeholder.format_options.alignment {
Some(FormatAlignment::Left) => sym::Left,
Some(FormatAlignment::Right) => sym::Right,
Some(FormatAlignment::Center) => sym::Center,
None => sym::Unknown,
},
]),
));
let flags = ecx.expr_u32(sp, placeholder.format_options.flags);
let prec = make_count(ecx, sp, &placeholder.format_options.precision, argmap);
let width = make_count(ecx, sp, &placeholder.format_options.width, argmap);
ecx.expr_struct(
sp,
ecx.path_global(sp, ecx.std_path(&[sym::fmt, sym::rt, sym::v1, sym::Argument])),
vec![
ecx.field_imm(sp, Ident::new(sym::position, sp), position),
ecx.field_imm(
sp,
Ident::new(sym::format, sp),
ecx.expr_struct(
sp,
ecx.path_global(
sp,
ecx.std_path(&[sym::fmt, sym::rt, sym::v1, sym::FormatSpec]),
),
vec![
ecx.field_imm(sp, Ident::new(sym::fill, sp), fill),
ecx.field_imm(sp, Ident::new(sym::align, sp), align),
ecx.field_imm(sp, Ident::new(sym::flags, sp), flags),
ecx.field_imm(sp, Ident::new(sym::precision, sp), prec),
ecx.field_imm(sp, Ident::new(sym::width, sp), width),
],
),
),
],
)
}
pub fn expand_parsed_format_args(ecx: &mut ExtCtxt<'_>, fmt: FormatArgs) -> P<ast::Expr> {
let macsp = ecx.with_def_site_ctxt(ecx.call_site());
let lit_pieces = ecx.expr_array_ref(
fmt.span,
fmt.template
.iter()
.enumerate()
.filter_map(|(i, piece)| match piece {
&FormatArgsPiece::Literal(s) => Some(ecx.expr_str(fmt.span, s)),
&FormatArgsPiece::Placeholder(_) => {
// Inject empty string before placeholders when not already preceded by a literal piece.
if i == 0 || matches!(fmt.template[i - 1], FormatArgsPiece::Placeholder(_)) {
Some(ecx.expr_str(fmt.span, kw::Empty))
} else {
None
}
}
})
.collect(),
);
// Whether we'll use the `Arguments::new_v1_formatted` form (true),
// or the `Arguments::new_v1` form (false).
let mut use_format_options = false;
// Create a list of all _unique_ (argument, format trait) combinations.
// E.g. "{0} {0:x} {0} {1}" -> [(0, Display), (0, LowerHex), (1, Display)]
let mut argmap = FxIndexSet::default();
for piece in &fmt.template {
let FormatArgsPiece::Placeholder(placeholder) = piece else { continue };
if placeholder.format_options != Default::default() {
// Can't use basic form if there's any formatting options.
use_format_options = true;
}
if let Ok(index) = placeholder.argument.index {
if !argmap.insert((index, ArgumentType::Format(placeholder.format_trait))) {
// Duplicate (argument, format trait) combination,
// which we'll only put once in the args array.
use_format_options = true;
}
}
}
let format_options = use_format_options.then(|| {
// Generate:
// &[format_spec_0, format_spec_1, format_spec_2]
ecx.expr_array_ref(
macsp,
fmt.template
.iter()
.filter_map(|piece| {
let FormatArgsPiece::Placeholder(placeholder) = piece else { return None };
Some(make_format_spec(ecx, macsp, placeholder, &mut argmap))
})
.collect(),
)
});
let arguments = fmt.arguments.into_vec();
// If the args array contains exactly all the original arguments once,
// in order, we can use a simple array instead of a `match` construction.
// However, if there's a yield point in any argument except the first one,
// we don't do this, because an ArgumentV1 cannot be kept across yield points.
let use_simple_array = argmap.len() == arguments.len()
&& argmap.iter().enumerate().all(|(i, &(j, _))| i == j)
&& arguments.iter().skip(1).all(|arg| !may_contain_yield_point(&arg.expr));
let args = if use_simple_array {
// Generate:
// &[
// ::core::fmt::ArgumentV1::new_display(&arg0),
// ::core::fmt::ArgumentV1::new_lower_hex(&arg1),
// ::core::fmt::ArgumentV1::new_debug(&arg2),
// ]
ecx.expr_array_ref(
macsp,
arguments
.into_iter()
.zip(argmap)
.map(|(arg, (_, ty))| {
let sp = arg.expr.span.with_ctxt(macsp.ctxt());
make_argument(ecx, sp, ecx.expr_addr_of(sp, arg.expr), ty)
})
.collect(),
)
} else {
// Generate:
// match (&arg0, &arg1, &arg2) {
// args => &[
// ::core::fmt::ArgumentV1::new_display(args.0),
// ::core::fmt::ArgumentV1::new_lower_hex(args.1),
// ::core::fmt::ArgumentV1::new_debug(args.0),
// ]
// }
let args_ident = Ident::new(sym::args, macsp);
let args = argmap
.iter()
.map(|&(arg_index, ty)| {
if let Some(arg) = arguments.get(arg_index) {
let sp = arg.expr.span.with_ctxt(macsp.ctxt());
make_argument(
ecx,
sp,
ecx.expr_field(
sp,
ecx.expr_ident(macsp, args_ident),
Ident::new(sym::integer(arg_index), macsp),
),
ty,
)
} else {
DummyResult::raw_expr(macsp, true)
}
})
.collect();
ecx.expr_addr_of(
macsp,
ecx.expr_match(
macsp,
ecx.expr_tuple(
macsp,
arguments
.into_iter()
.map(|arg| {
ecx.expr_addr_of(arg.expr.span.with_ctxt(macsp.ctxt()), arg.expr)
})
.collect(),
),
vec![ecx.arm(macsp, ecx.pat_ident(macsp, args_ident), ecx.expr_array(macsp, args))],
),
)
};
if let Some(format_options) = format_options {
// Generate:
// ::core::fmt::Arguments::new_v1_formatted(
// lit_pieces,
// args,
// format_options,
// unsafe { ::core::fmt::UnsafeArg::new() }
// )
ecx.expr_call_global(
macsp,
ecx.std_path(&[sym::fmt, sym::Arguments, sym::new_v1_formatted]),
vec![
lit_pieces,
args,
format_options,
ecx.expr_block(P(ast::Block {
stmts: vec![ecx.stmt_expr(ecx.expr_call_global(
macsp,
ecx.std_path(&[sym::fmt, sym::UnsafeArg, sym::new]),
Vec::new(),
))],
id: ast::DUMMY_NODE_ID,
rules: BlockCheckMode::Unsafe(UnsafeSource::CompilerGenerated),
span: macsp,
tokens: None,
could_be_bare_literal: false,
})),
],
)
} else {
// Generate:
// ::core::fmt::Arguments::new_v1(
// lit_pieces,
// args,
// )
ecx.expr_call_global(
macsp,
ecx.std_path(&[sym::fmt, sym::Arguments, sym::new_v1]),
vec![lit_pieces, args],
)
}
}
fn may_contain_yield_point(e: &ast::Expr) -> bool {
struct MayContainYieldPoint(bool);
impl Visitor<'_> for MayContainYieldPoint {
fn visit_expr(&mut self, e: &ast::Expr) {
if let ast::ExprKind::Await(_) | ast::ExprKind::Yield(_) = e.kind {
self.0 = true;
} else {
visit::walk_expr(self, e);
}
}
fn visit_mac_call(&mut self, _: &ast::MacCall) {
self.0 = true;
}
fn visit_attribute(&mut self, _: &ast::Attribute) {
// Conservatively assume this may be a proc macro attribute in
// expression position.
self.0 = true;
}
fn visit_item(&mut self, _: &ast::Item) {
// Do not recurse into nested items.
}
}
let mut visitor = MayContainYieldPoint(false);
visitor.visit_expr(e);
visitor.0
}
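The `argmap` comment earlier in this file (`"{0} {0:x} {0} {1}" -> [(0, Display), (0, LowerHex), (1, Display)]`) describes the deduplication that decides between `new_v1` and `new_v1_formatted`. A stand-alone sketch of that bookkeeping, using a plain `Vec` in place of `FxIndexSet`; everything here is illustrative, not compiler code:

```rust
#[derive(Debug, PartialEq, Clone, Copy)]
enum Trait {
    Display,
    LowerHex,
}

fn main() {
    // Placeholders of "{0} {0:x} {0} {1}" as (argument index, format trait) pairs.
    let placeholders: [(usize, Trait); 4] =
        [(0, Trait::Display), (0, Trait::LowerHex), (0, Trait::Display), (1, Trait::Display)];

    let mut argmap: Vec<(usize, Trait)> = Vec::new();
    let mut needs_format_options = false;
    for p in placeholders {
        if argmap.contains(&p) {
            // Duplicate combination: it appears only once in the args array,
            // so the `new_v1_formatted` form with explicit specs is required.
            needs_format_options = true;
        } else {
            argmap.push(p);
        }
    }

    assert_eq!(argmap, [(0, Trait::Display), (0, Trait::LowerHex), (1, Trait::Display)]);
    assert!(needs_format_options);
}
```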

View File

@ -7,9 +7,9 @@
#![feature(box_patterns)]
#![feature(decl_macro)]
#![feature(if_let_guard)]
#![feature(is_some_and)]
#![feature(is_sorted)]
#![feature(let_chains)]
#![cfg_attr(bootstrap, feature(let_else))]
#![feature(proc_macro_internals)]
#![feature(proc_macro_quote)]
#![recursion_limit = "256"]

View File

@ -36,13 +36,22 @@ pub fn expand_test_case(
let sp = ecx.with_def_site_ctxt(attr_sp);
let mut item = anno_item.expect_item();
item = item.map(|mut item| {
let test_path_symbol = Symbol::intern(&item_path(
// skip the name of the root module
&ecx.current_expansion.module.mod_path[1..],
&item.ident,
));
item.vis = ast::Visibility {
span: item.vis.span,
kind: ast::VisibilityKind::Public,
tokens: None,
};
item.ident.span = item.ident.span.with_ctxt(sp.ctxt());
item.attrs.push(ecx.attribute(ecx.meta_word(sp, sym::rustc_test_marker)));
item.attrs.push(ecx.attribute(attr::mk_name_value_item_str(
Ident::new(sym::rustc_test_marker, sp),
test_path_symbol,
sp,
)));
item
});
@ -115,7 +124,7 @@ pub fn expand_test_or_bench(
// reworked in the future to not need it, it'd be nice.
_ => diag.struct_span_err(attr_sp, msg).forget_guarantee(),
};
err.span_label(attr_sp, "the `#[test]` macro causes a a function to be run on a test and has no effect on non-functions")
err.span_label(attr_sp, "the `#[test]` macro causes a function to be run on a test and has no effect on non-functions")
.span_label(item.span, format!("expected a non-associated function, found {} {}", item.kind.article(), item.kind.descr()))
.span_suggestion(attr_sp, "replace with conditional compilation to make the item only exist when tests are being run", "#[cfg(test)]", Applicability::MaybeIncorrect)
.emit();
@ -215,6 +224,12 @@ pub fn expand_test_or_bench(
)
};
let test_path_symbol = Symbol::intern(&item_path(
// skip the name of the root module
&cx.current_expansion.module.mod_path[1..],
&item.ident,
));
let mut test_const = cx.item(
sp,
Ident::new(item.ident.name, sp),
@ -224,9 +239,14 @@ pub fn expand_test_or_bench(
Ident::new(sym::cfg, attr_sp),
vec![attr::mk_nested_word_item(Ident::new(sym::test, attr_sp))],
)),
// #[rustc_test_marker]
cx.attribute(cx.meta_word(attr_sp, sym::rustc_test_marker)),
],
// #[rustc_test_marker = "test_case_sort_key"]
cx.attribute(attr::mk_name_value_item_str(
Ident::new(sym::rustc_test_marker, attr_sp),
test_path_symbol,
attr_sp,
)),
]
.into(),
// const $ident: test::TestDescAndFn =
ast::ItemKind::Const(
ast::Defaultness::Final,
@ -250,14 +270,7 @@ pub fn expand_test_or_bench(
cx.expr_call(
sp,
cx.expr_path(test_path("StaticTestName")),
vec![cx.expr_str(
sp,
Symbol::intern(&item_path(
// skip the name of the root module
&cx.current_expansion.module.mod_path[1..],
&item.ident,
)),
)],
vec![cx.expr_str(sp, test_path_symbol)],
),
),
// ignore: true | false

View File

@ -18,9 +18,11 @@ use thin_vec::thin_vec;
use std::{iter, mem};
#[derive(Clone)]
struct Test {
span: Span,
ident: Ident,
name: Symbol,
}
struct TestCtxt<'a> {
@ -120,10 +122,10 @@ impl<'a> MutVisitor for TestHarnessGenerator<'a> {
fn flat_map_item(&mut self, i: P<ast::Item>) -> SmallVec<[P<ast::Item>; 1]> {
let mut item = i.into_inner();
if is_test_case(&self.cx.ext_cx.sess, &item) {
if let Some(name) = get_test_name(&self.cx.ext_cx.sess, &item) {
debug!("this is a test item");
let test = Test { span: item.span, ident: item.ident };
let test = Test { span: item.span, ident: item.ident, name };
self.tests.push(test);
}
@ -357,9 +359,12 @@ fn mk_tests_slice(cx: &TestCtxt<'_>, sp: Span) -> P<ast::Expr> {
debug!("building test vector from {} tests", cx.test_cases.len());
let ecx = &cx.ext_cx;
let mut tests = cx.test_cases.clone();
tests.sort_by(|a, b| a.name.as_str().cmp(&b.name.as_str()));
ecx.expr_array_ref(
sp,
cx.test_cases
tests
.iter()
.map(|test| {
ecx.expr_addr_of(test.span, ecx.expr_path(ecx.path(test.span, vec![test.ident])))
@ -368,8 +373,8 @@ fn mk_tests_slice(cx: &TestCtxt<'_>, sp: Span) -> P<ast::Expr> {
)
}
fn is_test_case(sess: &Session, i: &ast::Item) -> bool {
sess.contains_name(&i.attrs, sym::rustc_test_marker)
fn get_test_name(sess: &Session, i: &ast::Item) -> Option<Symbol> {
sess.first_attr_value_str_by_name(&i.attrs, sym::rustc_test_marker)
}
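With the test path now stored in `#[rustc_test_marker = "…"]`, `mk_tests_slice` sorts the generated slice by that name. A trivial stand-alone sketch of the same comparison over plain strings (names invented for illustration):

```rust
fn main() {
    // Equivalent of `tests.sort_by(|a, b| a.name.as_str().cmp(&b.name.as_str()))`.
    let mut names = vec!["outer::z_last", "outer::a_first", "inner::m_mid"];
    names.sort_by(|a, b| a.cmp(b));
    assert_eq!(names, ["inner::m_mid", "outer::a_first", "outer::z_last"]);
}
```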
fn get_test_runner(

View File

@ -29,7 +29,11 @@ jobs:
matrix:
include:
- os: ubuntu-latest
env:
TARGET_TRIPLE: x86_64-unknown-linux-gnu
- os: macos-latest
env:
TARGET_TRIPLE: x86_64-apple-darwin
# cross-compile from Linux to Windows using mingw
- os: ubuntu-latest
env:
@ -112,7 +116,7 @@ jobs:
if: matrix.env.TARGET_TRIPLE != 'x86_64-pc-windows-gnu'
uses: actions/upload-artifact@v2
with:
name: cg_clif-${{ runner.os }}
name: cg_clif-${{ matrix.env.TARGET_TRIPLE }}
path: cg_clif.tar.xz
- name: Upload prebuilt cg_clif (cross compile)
@ -122,56 +126,89 @@ jobs:
name: cg_clif-${{ runner.os }}-cross-x86_64-mingw
path: cg_clif.tar.xz
build_windows:
runs-on: windows-latest
windows:
runs-on: ${{ matrix.os }}
timeout-minutes: 60
strategy:
fail-fast: false
matrix:
include:
# Native Windows build with MSVC
- os: windows-latest
env:
TARGET_TRIPLE: x86_64-pc-windows-msvc
# cross-compile from Windows to Windows MinGW
- os: windows-latest
env:
TARGET_TRIPLE: x86_64-pc-windows-gnu
steps:
- uses: actions/checkout@v3
#- name: Cache cargo installed crates
# uses: actions/cache@v2
# with:
# path: ~/.cargo/bin
# key: ${{ runner.os }}-cargo-installed-crates
- name: Cache cargo installed crates
uses: actions/cache@v2
with:
path: ~/.cargo/bin
key: ${{ runner.os }}-${{ matrix.env.TARGET_TRIPLE }}-cargo-installed-crates
#- name: Cache cargo registry and index
# uses: actions/cache@v2
# with:
# path: |
# ~/.cargo/registry
# ~/.cargo/git
# key: ${{ runner.os }}-cargo-registry-and-index-${{ hashFiles('**/Cargo.lock') }}
- name: Cache cargo registry and index
uses: actions/cache@v2
with:
path: |
~/.cargo/registry
~/.cargo/git
key: ${{ runner.os }}-${{ matrix.env.TARGET_TRIPLE }}-cargo-registry-and-index-${{ hashFiles('**/Cargo.lock') }}
#- name: Cache cargo target dir
# uses: actions/cache@v2
# with:
# path: target
# key: ${{ runner.os }}-cargo-build-target-${{ hashFiles('rust-toolchain', '**/Cargo.lock') }}
- name: Cache cargo target dir
uses: actions/cache@v2
with:
path: target
key: ${{ runner.os }}-${{ matrix.env.TARGET_TRIPLE }}-cargo-build-target-${{ hashFiles('rust-toolchain', '**/Cargo.lock') }}
- name: Set MinGW as the default toolchain
if: matrix.env.TARGET_TRIPLE == 'x86_64-pc-windows-gnu'
run: rustup set default-host x86_64-pc-windows-gnu
- name: Prepare dependencies
run: |
git config --global user.email "user@example.com"
git config --global user.name "User"
git config --global core.autocrlf false
rustup set default-host x86_64-pc-windows-gnu
rustc y.rs -o y.exe -g
./y.exe prepare
- name: Build without unstable features
env:
TARGET_TRIPLE: ${{ matrix.env.TARGET_TRIPLE }}
# This is the config rust-lang/rust uses for builds
run: ./y.rs build --no-unstable-features
- name: Build
#name: Test
run: ./y.rs build --sysroot none
- name: Test
run: |
# Enable backtraces for easier debugging
#$Env:RUST_BACKTRACE=1
$Env:RUST_BACKTRACE=1
# Reduce amount of benchmark runs as they are slow
#$Env:COMPILE_RUNS=2
#$Env:RUN_RUNS=2
$Env:COMPILE_RUNS=2
$Env:RUN_RUNS=2
# Enable extra checks
#$Env:CG_CLIF_ENABLE_VERIFIER=1
$Env:CG_CLIF_ENABLE_VERIFIER=1
# WIP Disable some tests
# This fails due to some weird argument handling by hyperfine, not an actual regression
# more of a build system issue
(Get-Content config.txt) -replace '(bench.simple-raytracer)', '# $1' | Out-File config.txt
# This fails with a different output than expected
(Get-Content config.txt) -replace '(test.regex-shootout-regex-dna)', '# $1' | Out-File config.txt
./y.exe build
./y.exe test
- name: Package prebuilt cg_clif
# don't use compression as xzip isn't supported by tar on windows and bzip2 hangs
@ -180,5 +217,5 @@ jobs:
- name: Upload prebuilt cg_clif
uses: actions/upload-artifact@v2
with:
name: cg_clif-${{ runner.os }}
name: cg_clif-${{ matrix.env.TARGET_TRIPLE }}
path: cg_clif.tar

View File

@ -7,7 +7,7 @@
"rust-analyzer.cargo.features": ["unstable-features"],
"rust-analyzer.linkedProjects": [
"./Cargo.toml",
//"./build_sysroot/sysroot_src/src/libstd/Cargo.toml",
//"./build_sysroot/sysroot_src/library/std/Cargo.toml",
{
"roots": [
"./example/mini_core.rs",
@ -36,10 +36,10 @@
]
},
{
"roots": ["./scripts/filter_profile.rs"],
"roots": ["./example/std_example.rs"],
"crates": [
{
"root_module": "./scripts/filter_profile.rs",
"root_module": "./example/std_example.rs",
"edition": "2018",
"deps": [{ "crate": 1, "name": "std" }],
"cfg": [],

View File

@ -24,6 +24,12 @@ name = "ar"
version = "0.8.0"
source = "git+https://github.com/bjorn3/rust-ar.git?branch=do_not_remove_cg_clif_ranlib#de9ab0e56bf3a208381d342aa5b60f9ff2891648"
[[package]]
name = "arrayvec"
version = "0.7.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8da52d66c7071e2e3fa2a1e5c6d088fec47b593032b254f5e980de8ea54454d6"
[[package]]
name = "autocfg"
version = "1.1.0"
@ -36,6 +42,12 @@ version = "1.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
[[package]]
name = "bumpalo"
version = "3.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c1ad822118d20d2c234f427000d5acc36eabe1e29a348c89b63dd60b13f28e5d"
[[package]]
name = "byteorder"
version = "1.4.3"
@ -50,19 +62,21 @@ checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"
[[package]]
name = "cranelift-bforest"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "93945adbccc8d731503d3038814a51e8317497c9e205411820348132fa01a358"
checksum = "44409ccf2d0f663920cab563d2b79fcd6b2e9a2bcc6e929fef76c8f82ad6c17a"
dependencies = [
"cranelift-entity",
]
[[package]]
name = "cranelift-codegen"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b482acc9d0d0d1ad3288a90a8150ee648be3dce8dc8c8669ff026f72debdc31"
checksum = "98de2018ad96eb97f621f7d6b900a0cc661aec8d02ea4a50e56ecb48e5a2fcaf"
dependencies = [
"arrayvec",
"bumpalo",
"cranelift-bforest",
"cranelift-codegen-meta",
"cranelift-codegen-shared",
@ -77,30 +91,30 @@ dependencies = [
[[package]]
name = "cranelift-codegen-meta"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f9ec188d71e663192ef9048f204e410a7283b609942efc9fcc77da6d496edbb8"
checksum = "5287ce36e6c4758fbaf298bd1a8697ad97a4f2375a3d1b61142ea538db4877e5"
dependencies = [
"cranelift-codegen-shared",
]
[[package]]
name = "cranelift-codegen-shared"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3ad794b1b1c2c7bd9f7b76cfe0f084eaf7753e55d56191c3f7d89e8fa4978b99"
checksum = "2855c24219e2f08827f3f4ffb2da92e134ae8d8ecc185b11ec8f9878cf5f588e"
[[package]]
name = "cranelift-entity"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "342da0d5056f4119d3c311c4aab2460ceb6ee6e127bb395b76dd2279a09ea7a5"
checksum = "0b65673279d75d34bf11af9660ae2dbd1c22e6d28f163f5c72f4e1dc56d56103"
[[package]]
name = "cranelift-frontend"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dfff792f775b07d4d9cfe9f1c767ce755c6cbadda1bbd6db18a1c75ff9f7376a"
checksum = "3ed2b3d7a4751163f6c4a349205ab1b7d9c00eecf19dcea48592ef1f7688eefc"
dependencies = [
"cranelift-codegen",
"log",
@ -110,15 +124,15 @@ dependencies = [
[[package]]
name = "cranelift-isle"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8d51089478849f2ac8ef60a8a2d5346c8d4abfec0e45ac5b24530ef9f9499e1e"
checksum = "3be64cecea9d90105fc6a2ba2d003e98c867c1d6c4c86cc878f97ad9fb916293"
[[package]]
name = "cranelift-jit"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "095936e41720f86004b4c57ce88e6a13af28646bb3a6fb4afbebd5ae90c50029"
checksum = "f98ed42a70a0c9c388e34ec9477f57fc7300f541b1e5136a0e2ea02b1fac6015"
dependencies = [
"anyhow",
"cranelift-codegen",
@ -134,9 +148,9 @@ dependencies = [
[[package]]
name = "cranelift-module"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "704a1aea4723d97eafe0fb7af110f6f6868b1ac95f5380bbc9adb2a3b8cf97e8"
checksum = "d658ac7f156708bfccb647216cc8b9387469f50d352ba4ad80150541e4ae2d49"
dependencies = [
"anyhow",
"cranelift-codegen",
@ -144,9 +158,9 @@ dependencies = [
[[package]]
name = "cranelift-native"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "885debe62f2078638d6585f54c9f05f5c2008f22ce5a2a9100ada785fc065dbd"
checksum = "c4a03a6ac1b063e416ca4b93f6247978c991475e8271465340caa6f92f3c16a4"
dependencies = [
"cranelift-codegen",
"libc",
@ -155,9 +169,9 @@ dependencies = [
[[package]]
name = "cranelift-object"
version = "0.87.0"
version = "0.88.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aac1310cf1081ae8eca916c92cd163b977c77cab6e831fa812273c26ff921816"
checksum = "eef0b4119b645b870a43a036d76c0ada3a076b1f82e8b8487659304c8b09049b"
dependencies = [
"anyhow",
"cranelift-codegen",
@ -232,9 +246,9 @@ checksum = "505e71a4706fa491e9b1b55f51b95d4037d0821ee40131190475f692b35b009b"
[[package]]
name = "libloading"
version = "0.6.7"
version = "0.7.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "351a32417a12d5f7e82c368a66781e307834dae04c6ce0cd4456d52989229883"
checksum = "efbc0f03f9a775e9f6aed295c6a1ba2253c5757a9e03d55c6caa46a681abcddd"
dependencies = [
"cfg-if",
"winapi",

View File

@ -8,19 +8,19 @@ crate-type = ["dylib"]
[dependencies]
# These have to be in sync with each other
cranelift-codegen = { version = "0.87.0", features = ["unwind", "all-arch"] }
cranelift-frontend = "0.87.0"
cranelift-module = "0.87.0"
cranelift-native = "0.87.0"
cranelift-jit = { version = "0.87.0", optional = true }
cranelift-object = "0.87.0"
cranelift-codegen = { version = "0.88.1", features = ["unwind", "all-arch"] }
cranelift-frontend = "0.88.1"
cranelift-module = "0.88.1"
cranelift-native = "0.88.1"
cranelift-jit = { version = "0.88.1", optional = true }
cranelift-object = "0.88.1"
target-lexicon = "0.12.0"
gimli = { version = "0.26.0", default-features = false, features = ["write"]}
object = { version = "0.29.0", default-features = false, features = ["std", "read_core", "write", "archive", "coff", "elf", "macho", "pe"] }
ar = { git = "https://github.com/bjorn3/rust-ar.git", branch = "do_not_remove_cg_clif_ranlib" }
indexmap = "1.9.1"
libloading = { version = "0.6.0", optional = true }
libloading = { version = "0.7.3", optional = true }
once_cell = "1.10.0"
smallvec = "1.8.1"

View File

@ -55,10 +55,20 @@ dependencies = [
]
[[package]]
name = "compiler_builtins"
version = "0.1.79"
name = "cfg-if"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4f873ce2bd3550b0b565f878b3d04ea8253f4259dc3d20223af2e1ba86f5ecca"
checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"
dependencies = [
"compiler_builtins",
"rustc-std-workspace-core",
]
[[package]]
name = "compiler_builtins"
version = "0.1.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "18cd7635fea7bb481ea543b392789844c1ad581299da70184c7175ce3af76603"
dependencies = [
"rustc-std-workspace-core",
]
@ -123,9 +133,9 @@ dependencies = [
[[package]]
name = "hermit-abi"
version = "0.2.5"
version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "897cd85af6387be149f55acf168e41be176a02de7872403aaab184afc2f327e6"
checksum = "ee512640fe35acbfb4bb779db6f0d80704c2cacfa2e39b601ef3e3f47d1ae4c7"
dependencies = [
"compiler_builtins",
"libc",
@ -135,9 +145,9 @@ dependencies = [
[[package]]
name = "libc"
version = "0.2.132"
version = "0.2.135"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8371e4e5341c3a96db127eb2465ac681ced4c433e01dd0e938adbef26ba93ba5"
checksum = "68783febc7782c6c5cb401fbda4de5a9898be1762314da0bb2c10ced61f18b0c"
dependencies = [
"rustc-std-workspace-core",
]
@ -182,7 +192,7 @@ name = "panic_abort"
version = "0.0.0"
dependencies = [
"alloc",
"cfg-if",
"cfg-if 0.1.10",
"compiler_builtins",
"core",
"libc",
@ -193,7 +203,7 @@ name = "panic_unwind"
version = "0.0.0"
dependencies = [
"alloc",
"cfg-if",
"cfg-if 0.1.10",
"compiler_builtins",
"core",
"libc",
@ -245,7 +255,7 @@ version = "0.0.0"
dependencies = [
"addr2line",
"alloc",
"cfg-if",
"cfg-if 1.0.0",
"compiler_builtins",
"core",
"dlmalloc",
@ -267,7 +277,7 @@ dependencies = [
name = "std_detect"
version = "0.1.5"
dependencies = [
"cfg-if",
"cfg-if 1.0.0",
"compiler_builtins",
"libc",
"rustc-std-workspace-alloc",
@ -289,7 +299,7 @@ dependencies = [
name = "test"
version = "0.0.0"
dependencies = [
"cfg-if",
"cfg-if 0.1.10",
"core",
"getopts",
"libc",
@ -301,9 +311,9 @@ dependencies = [
[[package]]
name = "unicode-width"
version = "0.1.9"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3ed742d4ea2bd1176e236172c8429aaf54486e7ac098db29ffe6529e0ce50973"
checksum = "c0edd1e5b14653f783770bce4a4dabb4a5108a5370a5f5d8cfe8710c361f6c8b"
dependencies = [
"compiler_builtins",
"rustc-std-workspace-core",
@ -315,7 +325,7 @@ name = "unwind"
version = "0.0.0"
dependencies = [
"cc",
"cfg-if",
"cfg-if 0.1.10",
"compiler_builtins",
"core",
"libc",

View File

@ -0,0 +1,52 @@
use std::env;
use std::path::Path;
use super::build_sysroot;
use super::config;
use super::prepare;
use super::utils::{cargo_command, spawn_and_wait};
use super::SysrootKind;
pub(crate) fn run(
channel: &str,
sysroot_kind: SysrootKind,
target_dir: &Path,
cg_clif_dylib: &Path,
host_triple: &str,
target_triple: &str,
) {
if !config::get_bool("testsuite.abi-cafe") {
eprintln!("[SKIP] abi-cafe");
return;
}
if host_triple != target_triple {
eprintln!("[SKIP] abi-cafe (cross-compilation not supported)");
return;
}
eprintln!("Building sysroot for abi-cafe");
build_sysroot::build_sysroot(
channel,
sysroot_kind,
target_dir,
cg_clif_dylib,
host_triple,
target_triple,
);
eprintln!("Running abi-cafe");
let abi_cafe_path = prepare::ABI_CAFE.source_dir();
env::set_current_dir(abi_cafe_path.clone()).unwrap();
let pairs = ["rustc_calls_cgclif", "cgclif_calls_rustc", "cgclif_calls_cc", "cc_calls_cgclif"];
let mut cmd = cargo_command("cargo", "run", Some(target_triple), &abi_cafe_path);
cmd.arg("--");
cmd.arg("--pairs");
cmd.args(pairs);
cmd.arg("--add-rustc-codegen-backend");
cmd.arg(format!("cgclif:{}", cg_clif_dylib.display()));
spawn_and_wait(cmd);
}

View File

@ -1,60 +0,0 @@
use super::build_sysroot;
use super::config;
use super::utils::spawn_and_wait;
use build_system::SysrootKind;
use std::env;
use std::path::Path;
use std::process::Command;
pub(crate) fn run(
channel: &str,
sysroot_kind: SysrootKind,
target_dir: &Path,
cg_clif_build_dir: &Path,
host_triple: &str,
target_triple: &str,
) {
if !config::get_bool("testsuite.abi-checker") {
eprintln!("[SKIP] abi-checker");
return;
}
if host_triple != target_triple {
eprintln!("[SKIP] abi-checker (cross-compilation not supported)");
return;
}
eprintln!("Building sysroot for abi-checker");
build_sysroot::build_sysroot(
channel,
sysroot_kind,
target_dir,
cg_clif_build_dir,
host_triple,
target_triple,
);
eprintln!("Running abi-checker");
let mut abi_checker_path = env::current_dir().unwrap();
abi_checker_path.push("abi-checker");
env::set_current_dir(abi_checker_path.clone()).unwrap();
let build_dir = abi_checker_path.parent().unwrap().join("build");
let cg_clif_dylib_path = build_dir.join(if cfg!(windows) { "bin" } else { "lib" }).join(
env::consts::DLL_PREFIX.to_string() + "rustc_codegen_cranelift" + env::consts::DLL_SUFFIX,
);
let pairs = ["rustc_calls_cgclif", "cgclif_calls_rustc", "cgclif_calls_cc", "cc_calls_cgclif"];
let mut cmd = Command::new("cargo");
cmd.arg("run");
cmd.arg("--target");
cmd.arg(target_triple);
cmd.arg("--");
cmd.arg("--pairs");
cmd.args(pairs);
cmd.arg("--add-rustc-codegen-backend");
cmd.arg(format!("cgclif:{}", cg_clif_dylib_path.display()));
spawn_and_wait(cmd);
}

View File

@ -1,16 +1,16 @@
use std::env;
use std::path::{Path, PathBuf};
use std::process::Command;
use std::path::PathBuf;
use super::utils::is_ci;
use super::rustc_info::get_file_name;
use super::utils::{cargo_command, is_ci};
pub(crate) fn build_backend(
channel: &str,
host_triple: &str,
use_unstable_features: bool,
) -> PathBuf {
let mut cmd = Command::new("cargo");
cmd.arg("build").arg("--target").arg(host_triple);
let source_dir = std::env::current_dir().unwrap();
let mut cmd = cargo_command("cargo", "build", Some(host_triple), &source_dir);
cmd.env("CARGO_BUILD_INCREMENTAL", "true"); // Force incr comp even in release mode
@ -41,5 +41,9 @@ pub(crate) fn build_backend(
eprintln!("[BUILD] rustc_codegen_cranelift");
super::utils::spawn_and_wait(cmd);
Path::new("target").join(host_triple).join(channel)
source_dir
.join("target")
.join(host_triple)
.join(channel)
.join(get_file_name("rustc_codegen_cranelift", "dylib"))
}

View File

@ -3,14 +3,14 @@ use std::path::{Path, PathBuf};
use std::process::{self, Command};
use super::rustc_info::{get_file_name, get_rustc_version, get_wrapper_file_name};
use super::utils::{spawn_and_wait, try_hard_link};
use super::utils::{cargo_command, spawn_and_wait, try_hard_link};
use super::SysrootKind;
pub(crate) fn build_sysroot(
channel: &str,
sysroot_kind: SysrootKind,
target_dir: &Path,
cg_clif_build_dir: &Path,
cg_clif_dylib_src: &Path,
host_triple: &str,
target_triple: &str,
) {
@ -23,7 +23,6 @@ pub(crate) fn build_sysroot(
fs::create_dir_all(target_dir.join("lib")).unwrap();
// Copy the backend
let cg_clif_dylib = get_file_name("rustc_codegen_cranelift", "dylib");
let cg_clif_dylib_path = target_dir
.join(if cfg!(windows) {
// Windows doesn't have rpath support, so the cg_clif dylib needs to be next to the
@ -32,8 +31,8 @@ pub(crate) fn build_sysroot(
} else {
"lib"
})
.join(&cg_clif_dylib);
try_hard_link(cg_clif_build_dir.join(cg_clif_dylib), &cg_clif_dylib_path);
.join(get_file_name("rustc_codegen_cranelift", "dylib"));
try_hard_link(cg_clif_dylib_src, &cg_clif_dylib_path);
// Build and copy rustc and cargo wrappers
for wrapper in ["rustc-clif", "cargo-clif"] {
@ -186,10 +185,10 @@ fn build_clif_sysroot_for_triple(
}
// Build sysroot
let mut build_cmd = Command::new("cargo");
build_cmd.arg("build").arg("--target").arg(triple).current_dir("build_sysroot");
let mut build_cmd = cargo_command("cargo", "build", Some(triple), Path::new("build_sysroot"));
let mut rustflags = "-Zforce-unstable-if-unmarked -Cpanic=abort".to_string();
rustflags.push_str(&format!(" -Zcodegen-backend={}", cg_clif_dylib_path.to_str().unwrap()));
rustflags.push_str(&format!(" --sysroot={}", target_dir.to_str().unwrap()));
if channel == "release" {
build_cmd.arg("--release");
rustflags.push_str(" -Zmir-opt-level=3");

View File

@ -1,4 +1,5 @@
use std::{fs, process};
use std::fs;
use std::process;
fn load_config_file() -> Vec<(String, Option<String>)> {
fs::read_to_string("config.txt")

View File

@ -4,7 +4,7 @@ use std::process;
use self::utils::is_ci;
mod abi_checker;
mod abi_cafe;
mod build_backend;
mod build_sysroot;
mod config;
@ -122,32 +122,23 @@ pub fn main() {
host_triple.clone()
};
if target_triple.ends_with("-msvc") {
eprintln!("The MSVC toolchain is not yet supported by rustc_codegen_cranelift.");
eprintln!("Switch to the MinGW toolchain for Windows support.");
eprintln!("Hint: You can use `rustup set default-host x86_64-pc-windows-gnu` to");
eprintln!("set the global default target to MinGW");
process::exit(1);
}
let cg_clif_build_dir =
build_backend::build_backend(channel, &host_triple, use_unstable_features);
let cg_clif_dylib = build_backend::build_backend(channel, &host_triple, use_unstable_features);
match command {
Command::Test => {
tests::run_tests(
channel,
sysroot_kind,
&target_dir,
&cg_clif_build_dir,
&cg_clif_dylib,
&host_triple,
&target_triple,
);
abi_checker::run(
abi_cafe::run(
channel,
sysroot_kind,
&target_dir,
&cg_clif_build_dir,
&cg_clif_dylib,
&host_triple,
&target_triple,
);
@ -157,7 +148,7 @@ pub fn main() {
channel,
sysroot_kind,
&target_dir,
&cg_clif_build_dir,
&cg_clif_dylib,
&host_triple,
&target_triple,
);

View File

@ -1,64 +1,63 @@
use std::env;
use std::ffi::OsStr;
use std::ffi::OsString;
use std::fs;
use std::path::Path;
use std::path::{Path, PathBuf};
use std::process::Command;
use super::rustc_info::{get_file_name, get_rustc_path, get_rustc_version};
use super::utils::{copy_dir_recursively, spawn_and_wait};
use super::utils::{cargo_command, copy_dir_recursively, spawn_and_wait};
pub(crate) const ABI_CAFE: GitRepo =
GitRepo::github("Gankra", "abi-cafe", "4c6dc8c9c687e2b3a760ff2176ce236872b37212", "abi-cafe");
pub(crate) const RAND: GitRepo =
GitRepo::github("rust-random", "rand", "0f933f9c7176e53b2a3c7952ded484e1783f0bf1", "rand");
pub(crate) const REGEX: GitRepo =
GitRepo::github("rust-lang", "regex", "341f207c1071f7290e3f228c710817c280c8dca1", "regex");
pub(crate) const PORTABLE_SIMD: GitRepo = GitRepo::github(
"rust-lang",
"portable-simd",
"d5cd4a8112d958bd3a252327e0d069a6363249bd",
"portable-simd",
);
pub(crate) const SIMPLE_RAYTRACER: GitRepo = GitRepo::github(
"ebobby",
"simple-raytracer",
"804a7a21b9e673a482797aa289a18ed480e4d813",
"<none>",
);
pub(crate) fn prepare() {
if Path::new("download").exists() {
std::fs::remove_dir_all(Path::new("download")).unwrap();
}
std::fs::create_dir_all(Path::new("download")).unwrap();
prepare_sysroot();
// FIXME maybe install this only locally?
eprintln!("[INSTALL] hyperfine");
Command::new("cargo").arg("install").arg("hyperfine").spawn().unwrap().wait().unwrap();
clone_repo_shallow_github(
"abi-checker",
"Gankra",
"abi-checker",
"a2232d45f202846f5c02203c9f27355360f9a2ff",
);
apply_patches("abi-checker", Path::new("abi-checker"));
clone_repo_shallow_github(
"rand",
"rust-random",
"rand",
"0f933f9c7176e53b2a3c7952ded484e1783f0bf1",
);
apply_patches("rand", Path::new("rand"));
clone_repo_shallow_github(
"regex",
"rust-lang",
"regex",
"341f207c1071f7290e3f228c710817c280c8dca1",
);
clone_repo_shallow_github(
"portable-simd",
"rust-lang",
"portable-simd",
"b8d6b6844602f80af79cd96401339ec594d472d8",
);
apply_patches("portable-simd", Path::new("portable-simd"));
clone_repo_shallow_github(
"simple-raytracer",
"ebobby",
"simple-raytracer",
"804a7a21b9e673a482797aa289a18ed480e4d813",
);
ABI_CAFE.fetch();
RAND.fetch();
REGEX.fetch();
PORTABLE_SIMD.fetch();
SIMPLE_RAYTRACER.fetch();
eprintln!("[LLVM BUILD] simple-raytracer");
let mut build_cmd = Command::new("cargo");
build_cmd.arg("build").env_remove("CARGO_TARGET_DIR").current_dir("simple-raytracer");
let build_cmd = cargo_command("cargo", "build", None, &SIMPLE_RAYTRACER.source_dir());
spawn_and_wait(build_cmd);
fs::copy(
Path::new("simple-raytracer/target/debug").join(get_file_name("main", "bin")),
Path::new("simple-raytracer").join(get_file_name("raytracer_cg_llvm", "bin")),
SIMPLE_RAYTRACER
.source_dir()
.join("target")
.join("debug")
.join(get_file_name("main", "bin")),
SIMPLE_RAYTRACER.source_dir().join(get_file_name("raytracer_cg_llvm", "bin")),
)
.unwrap();
}
@ -90,38 +89,78 @@ fn prepare_sysroot() {
apply_patches("sysroot", &sysroot_src);
}
pub(crate) struct GitRepo {
url: GitRepoUrl,
rev: &'static str,
patch_name: &'static str,
}
enum GitRepoUrl {
Github { user: &'static str, repo: &'static str },
}
impl GitRepo {
const fn github(
user: &'static str,
repo: &'static str,
rev: &'static str,
patch_name: &'static str,
) -> GitRepo {
GitRepo { url: GitRepoUrl::Github { user, repo }, rev, patch_name }
}
pub(crate) fn source_dir(&self) -> PathBuf {
match self.url {
GitRepoUrl::Github { user: _, repo } => {
std::env::current_dir().unwrap().join("download").join(repo)
}
}
}
fn fetch(&self) {
match self.url {
GitRepoUrl::Github { user, repo } => {
clone_repo_shallow_github(&self.source_dir(), user, repo, self.rev);
}
}
apply_patches(self.patch_name, &self.source_dir());
}
}
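For orientation, the path convention that `source_dir` encodes (every mirrored repo is unpacked into `download/<repo>` under the current directory) can be sketched stand-alone; the helper below is illustrative only, not part of the build system:

```rust
use std::path::PathBuf;

// Illustrative only: mirrors the `GitRepo::source_dir` convention above.
fn source_dir(repo: &str) -> PathBuf {
    std::env::current_dir().unwrap().join("download").join(repo)
}

fn main() {
    // e.g. `/path/to/rustc_codegen_cranelift/download/rand`
    println!("{}", source_dir("rand").display());
}
```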
#[allow(dead_code)]
fn clone_repo(target_dir: &str, repo: &str, rev: &str) {
fn clone_repo(download_dir: &Path, repo: &str, rev: &str) {
eprintln!("[CLONE] {}", repo);
// Ignore exit code as the repo may already have been checked out
Command::new("git").arg("clone").arg(repo).arg(target_dir).spawn().unwrap().wait().unwrap();
Command::new("git").arg("clone").arg(repo).arg(&download_dir).spawn().unwrap().wait().unwrap();
let mut clean_cmd = Command::new("git");
clean_cmd.arg("checkout").arg("--").arg(".").current_dir(target_dir);
clean_cmd.arg("checkout").arg("--").arg(".").current_dir(&download_dir);
spawn_and_wait(clean_cmd);
let mut checkout_cmd = Command::new("git");
checkout_cmd.arg("checkout").arg("-q").arg(rev).current_dir(target_dir);
checkout_cmd.arg("checkout").arg("-q").arg(rev).current_dir(download_dir);
spawn_and_wait(checkout_cmd);
}
fn clone_repo_shallow_github(target_dir: &str, username: &str, repo: &str, rev: &str) {
fn clone_repo_shallow_github(download_dir: &Path, user: &str, repo: &str, rev: &str) {
if cfg!(windows) {
// Older windows doesn't have tar or curl by default. Fall back to using git.
clone_repo(target_dir, &format!("https://github.com/{}/{}.git", username, repo), rev);
clone_repo(download_dir, &format!("https://github.com/{}/{}.git", user, repo), rev);
return;
}
let archive_url = format!("https://github.com/{}/{}/archive/{}.tar.gz", username, repo, rev);
let archive_file = format!("{}.tar.gz", rev);
let archive_dir = format!("{}-{}", repo, rev);
let downloads_dir = std::env::current_dir().unwrap().join("download");
eprintln!("[DOWNLOAD] {}/{} from {}", username, repo, archive_url);
let archive_url = format!("https://github.com/{}/{}/archive/{}.tar.gz", user, repo, rev);
let archive_file = downloads_dir.join(format!("{}.tar.gz", rev));
let archive_dir = downloads_dir.join(format!("{}-{}", repo, rev));
eprintln!("[DOWNLOAD] {}/{} from {}", user, repo, archive_url);
// Remove previous results if they exist
let _ = std::fs::remove_file(&archive_file);
let _ = std::fs::remove_dir_all(&archive_dir);
let _ = std::fs::remove_dir_all(target_dir);
let _ = std::fs::remove_dir_all(&download_dir);
// Download zip archive
let mut download_cmd = Command::new("curl");
@ -130,13 +169,13 @@ fn clone_repo_shallow_github(target_dir: &str, username: &str, repo: &str, rev:
// Unpack tar archive
let mut unpack_cmd = Command::new("tar");
unpack_cmd.arg("xf").arg(&archive_file);
unpack_cmd.arg("xf").arg(&archive_file).current_dir(downloads_dir);
spawn_and_wait(unpack_cmd);
// Rename unpacked dir to the expected name
std::fs::rename(archive_dir, target_dir).unwrap();
std::fs::rename(archive_dir, &download_dir).unwrap();
init_git_repo(Path::new(target_dir));
init_git_repo(&download_dir);
// Cleanup
std::fs::remove_file(archive_file).unwrap();
@ -156,14 +195,20 @@ fn init_git_repo(repo_dir: &Path) {
spawn_and_wait(git_commit_cmd);
}
fn get_patches(crate_name: &str) -> Vec<OsString> {
let mut patches: Vec<_> = fs::read_dir("patches")
fn get_patches(source_dir: &Path, crate_name: &str) -> Vec<PathBuf> {
let mut patches: Vec<_> = fs::read_dir(source_dir.join("patches"))
.unwrap()
.map(|entry| entry.unwrap().path())
.filter(|path| path.extension() == Some(OsStr::new("patch")))
.map(|path| path.file_name().unwrap().to_owned())
.filter(|file_name| {
file_name.to_str().unwrap().split_once("-").unwrap().1.starts_with(crate_name)
.filter(|path| {
path.file_name()
.unwrap()
.to_str()
.unwrap()
.split_once("-")
.unwrap()
.1
.starts_with(crate_name)
})
.collect();
patches.sort();
@ -171,11 +216,18 @@ fn get_patches(crate_name: &str) -> Vec<OsString> {
}
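// For illustration only (the file name is invented, not taken from this diff):
// with crate_name = "rand", a file named "0001-rand-Disable-failing-test.patch"
// in the patches directory passes the filter above, because the text after the
// first '-' ("rand-Disable-failing-test.patch") starts with "rand".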
fn apply_patches(crate_name: &str, target_dir: &Path) {
for patch in get_patches(crate_name) {
eprintln!("[PATCH] {:?} <- {:?}", target_dir.file_name().unwrap(), patch);
let patch_arg = env::current_dir().unwrap().join("patches").join(patch);
if crate_name == "<none>" {
return;
}
for patch in get_patches(&std::env::current_dir().unwrap(), crate_name) {
eprintln!(
"[PATCH] {:?} <- {:?}",
target_dir.file_name().unwrap(),
patch.file_name().unwrap()
);
let mut apply_patch_cmd = Command::new("git");
apply_patch_cmd.arg("am").arg(patch_arg).arg("-q").current_dir(target_dir);
apply_patch_cmd.arg("am").arg(patch).arg("-q").current_dir(target_dir);
spawn_and_wait(apply_patch_cmd);
}
}


@ -65,7 +65,7 @@ pub(crate) fn get_file_name(crate_name: &str, crate_type: &str) -> String {
}
/// Similar to `get_file_name`, but converts any dashes (`-`) in the `crate_name` to
/// underscores (`_`). This is specially made for the the rustc and cargo wrappers
/// underscores (`_`). This is specially made for the rustc and cargo wrappers
/// which have a dash in the name, and that is not allowed in a crate name.
pub(crate) fn get_wrapper_file_name(crate_name: &str, crate_type: &str) -> String {
let crate_name = crate_name.replace('-', "_");
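// Illustration (assumed, not part of this change): get_wrapper_file_name("cargo-clif", "bin")
// rewrites the crate name to "cargo_clif" and then, per the doc comment above, resolves the
// platform-specific executable name the same way get_file_name does.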


@ -1,7 +1,8 @@
use super::build_sysroot;
use super::config;
use super::prepare;
use super::rustc_info::get_wrapper_file_name;
use super::utils::{spawn_and_wait, spawn_and_wait_with_input};
use super::utils::{cargo_command, hyperfine_command, spawn_and_wait, spawn_and_wait_with_input};
use build_system::SysrootKind;
use std::env;
use std::ffi::OsStr;
@ -217,103 +218,95 @@ const BASE_SYSROOT_SUITE: &[TestCase] = &[
const EXTENDED_SYSROOT_SUITE: &[TestCase] = &[
TestCase::new("test.rust-random/rand", &|runner| {
runner.in_dir(["rand"], |runner| {
runner.run_cargo(["clean"]);
runner.in_dir(prepare::RAND.source_dir(), |runner| {
runner.run_cargo("clean", []);
if runner.host_triple == runner.target_triple {
eprintln!("[TEST] rust-random/rand");
runner.run_cargo(["test", "--workspace"]);
runner.run_cargo("test", ["--workspace"]);
} else {
eprintln!("[AOT] rust-random/rand");
runner.run_cargo([
"build",
"--workspace",
"--target",
&runner.target_triple,
"--tests",
]);
runner.run_cargo("build", ["--workspace", "--tests"]);
}
});
}),
TestCase::new("bench.simple-raytracer", &|runner| {
runner.in_dir(["simple-raytracer"], |runner| {
let run_runs = env::var("RUN_RUNS").unwrap_or("10".to_string());
runner.in_dir(prepare::SIMPLE_RAYTRACER.source_dir(), |runner| {
let run_runs = env::var("RUN_RUNS").unwrap_or("10".to_string()).parse().unwrap();
if runner.host_triple == runner.target_triple {
eprintln!("[BENCH COMPILE] ebobby/simple-raytracer");
let mut bench_compile = Command::new("hyperfine");
bench_compile.arg("--runs");
bench_compile.arg(&run_runs);
bench_compile.arg("--warmup");
bench_compile.arg("1");
bench_compile.arg("--prepare");
bench_compile.arg(format!("{:?}", runner.cargo_command(["clean"])));
let prepare = runner.cargo_command("clean", []);
if cfg!(windows) {
bench_compile.arg("cmd /C \"set RUSTFLAGS= && cargo build\"");
} else {
bench_compile.arg("RUSTFLAGS='' cargo build");
}
let llvm_build_cmd = cargo_command("cargo", "build", None, Path::new("."));
let cargo_clif = runner
.root_dir
.clone()
.join("build")
.join(get_wrapper_file_name("cargo-clif", "bin"));
let clif_build_cmd = cargo_command(cargo_clif, "build", None, Path::new("."));
let bench_compile =
hyperfine_command(1, run_runs, Some(prepare), llvm_build_cmd, clif_build_cmd);
bench_compile.arg(format!("{:?}", runner.cargo_command(["build"])));
spawn_and_wait(bench_compile);
eprintln!("[BENCH RUN] ebobby/simple-raytracer");
fs::copy(PathBuf::from("./target/debug/main"), PathBuf::from("raytracer_cg_clif"))
.unwrap();
let mut bench_run = Command::new("hyperfine");
bench_run.arg("--runs");
bench_run.arg(&run_runs);
bench_run.arg(PathBuf::from("./raytracer_cg_llvm"));
bench_run.arg(PathBuf::from("./raytracer_cg_clif"));
let bench_run = hyperfine_command(
0,
run_runs,
None,
Command::new("./raytracer_cg_llvm"),
Command::new("./raytracer_cg_clif"),
);
spawn_and_wait(bench_run);
} else {
runner.run_cargo(["clean"]);
runner.run_cargo("clean", []);
eprintln!("[BENCH COMPILE] ebobby/simple-raytracer (skipped)");
eprintln!("[COMPILE] ebobby/simple-raytracer");
runner.run_cargo(["build", "--target", &runner.target_triple]);
runner.run_cargo("build", []);
eprintln!("[BENCH RUN] ebobby/simple-raytracer (skipped)");
}
});
}),
TestCase::new("test.libcore", &|runner| {
runner.in_dir(["build_sysroot", "sysroot_src", "library", "core", "tests"], |runner| {
runner.run_cargo(["clean"]);
runner.in_dir(
std::env::current_dir()
.unwrap()
.join("build_sysroot")
.join("sysroot_src")
.join("library")
.join("core")
.join("tests"),
|runner| {
runner.run_cargo("clean", []);
if runner.host_triple == runner.target_triple {
runner.run_cargo(["test"]);
} else {
eprintln!("Cross-Compiling: Not running tests");
runner.run_cargo(["build", "--target", &runner.target_triple, "--tests"]);
}
});
if runner.host_triple == runner.target_triple {
runner.run_cargo("test", []);
} else {
eprintln!("Cross-Compiling: Not running tests");
runner.run_cargo("build", ["--tests"]);
}
},
);
}),
TestCase::new("test.regex-shootout-regex-dna", &|runner| {
runner.in_dir(["regex"], |runner| {
runner.run_cargo(["clean"]);
runner.in_dir(prepare::REGEX.source_dir(), |runner| {
runner.run_cargo("clean", []);
// newer aho_corasick versions throw a deprecation warning
let lint_rust_flags = format!("{} --cap-lints warn", runner.rust_flags);
let mut build_cmd = runner.cargo_command([
"build",
"--example",
"shootout-regex-dna",
"--target",
&runner.target_triple,
]);
let mut build_cmd = runner.cargo_command("build", ["--example", "shootout-regex-dna"]);
build_cmd.env("RUSTFLAGS", lint_rust_flags.clone());
spawn_and_wait(build_cmd);
if runner.host_triple == runner.target_triple {
let mut run_cmd = runner.cargo_command([
"run",
"--example",
"shootout-regex-dna",
"--target",
&runner.target_triple,
]);
let mut run_cmd = runner.cargo_command("run", ["--example", "shootout-regex-dna"]);
run_cmd.env("RUSTFLAGS", lint_rust_flags);
let input =
@ -353,41 +346,43 @@ const EXTENDED_SYSROOT_SUITE: &[TestCase] = &[
});
}),
TestCase::new("test.regex", &|runner| {
runner.in_dir(["regex"], |runner| {
runner.run_cargo(["clean"]);
runner.in_dir(prepare::REGEX.source_dir(), |runner| {
runner.run_cargo("clean", []);
// newer aho_corasick versions throw a deprecation warning
let lint_rust_flags = format!("{} --cap-lints warn", runner.rust_flags);
if runner.host_triple == runner.target_triple {
let mut run_cmd = runner.cargo_command([
let mut run_cmd = runner.cargo_command(
"test",
"--tests",
"--",
"--exclude-should-panic",
"--test-threads",
"1",
"-Zunstable-options",
"-q",
]);
[
"--tests",
"--",
"--exclude-should-panic",
"--test-threads",
"1",
"-Zunstable-options",
"-q",
],
);
run_cmd.env("RUSTFLAGS", lint_rust_flags);
spawn_and_wait(run_cmd);
} else {
eprintln!("Cross-Compiling: Not running tests");
let mut build_cmd =
runner.cargo_command(["build", "--tests", "--target", &runner.target_triple]);
runner.cargo_command("build", ["--tests", "--target", &runner.target_triple]);
build_cmd.env("RUSTFLAGS", lint_rust_flags.clone());
spawn_and_wait(build_cmd);
}
});
}),
TestCase::new("test.portable-simd", &|runner| {
runner.in_dir(["portable-simd"], |runner| {
runner.run_cargo(["clean"]);
runner.run_cargo(["build", "--all-targets", "--target", &runner.target_triple]);
runner.in_dir(prepare::PORTABLE_SIMD.source_dir(), |runner| {
runner.run_cargo("clean", []);
runner.run_cargo("build", ["--all-targets", "--target", &runner.target_triple]);
if runner.host_triple == runner.target_triple {
runner.run_cargo(["test", "-q"]);
runner.run_cargo("test", ["-q"]);
}
});
}),
@ -397,7 +392,7 @@ pub(crate) fn run_tests(
channel: &str,
sysroot_kind: SysrootKind,
target_dir: &Path,
cg_clif_build_dir: &Path,
cg_clif_dylib: &Path,
host_triple: &str,
target_triple: &str,
) {
@ -408,7 +403,7 @@ pub(crate) fn run_tests(
channel,
SysrootKind::None,
&target_dir,
cg_clif_build_dir,
cg_clif_dylib,
&host_triple,
&target_triple,
);
@ -427,7 +422,7 @@ pub(crate) fn run_tests(
channel,
sysroot_kind,
&target_dir,
cg_clif_build_dir,
cg_clif_dylib,
&host_triple,
&target_triple,
);
@ -521,16 +516,8 @@ impl TestRunner {
}
}
fn in_dir<'a, I, F>(&self, dir: I, callback: F)
where
I: IntoIterator<Item = &'a str>,
F: FnOnce(&TestRunner),
{
fn in_dir(&self, new: impl AsRef<Path>, callback: impl FnOnce(&TestRunner)) {
let current = env::current_dir().unwrap();
let mut new = current.clone();
for d in dir {
new.push(d);
}
env::set_current_dir(new).unwrap();
callback(self);
@ -595,25 +582,29 @@ impl TestRunner {
spawn_and_wait(cmd);
}
fn cargo_command<I, S>(&self, args: I) -> Command
fn cargo_command<'a, I>(&self, subcommand: &str, args: I) -> Command
where
I: IntoIterator<Item = S>,
S: AsRef<OsStr>,
I: IntoIterator<Item = &'a str>,
{
let mut cargo_clif = self.root_dir.clone();
cargo_clif.push("build");
cargo_clif.push(get_wrapper_file_name("cargo-clif", "bin"));
let mut cmd = Command::new(cargo_clif);
let mut cmd = cargo_command(
cargo_clif,
subcommand,
if subcommand == "clean" { None } else { Some(&self.target_triple) },
Path::new("."),
);
cmd.args(args);
cmd.env("RUSTFLAGS", &self.rust_flags);
cmd
}
fn run_cargo<'a, I>(&self, args: I)
fn run_cargo<'a, I>(&self, subcommand: &str, args: I)
where
I: IntoIterator<Item = &'a str>,
{
spawn_and_wait(self.cargo_command(args));
spawn_and_wait(self.cargo_command(subcommand, args));
}
}


@ -4,6 +4,52 @@ use std::io::Write;
use std::path::Path;
use std::process::{self, Command, Stdio};
pub(crate) fn cargo_command(
cargo: impl AsRef<Path>,
subcommand: &str,
triple: Option<&str>,
source_dir: &Path,
) -> Command {
let mut cmd = Command::new(cargo.as_ref());
cmd.arg(subcommand)
.arg("--manifest-path")
.arg(source_dir.join("Cargo.toml"))
.arg("--target-dir")
.arg(source_dir.join("target"));
if let Some(triple) = triple {
cmd.arg("--target").arg(triple);
}
cmd
}
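// Usage sketch (the target triple and source path are illustrative, not taken from
// this change): build a downloaded crate in its own target directory, mirroring the
// calls shown earlier in this diff.
fn _example_cargo_build() -> Command {
    let mut cmd = cargo_command(
        "cargo",
        "build",
        Some("x86_64-unknown-linux-gnu"),
        Path::new("download/rand"),
    );
    cmd.env("RUSTFLAGS", "");
    cmd
}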
pub(crate) fn hyperfine_command(
warmup: u64,
runs: u64,
prepare: Option<Command>,
a: Command,
b: Command,
) -> Command {
let mut bench = Command::new("hyperfine");
if warmup != 0 {
bench.arg("--warmup").arg(warmup.to_string());
}
if runs != 0 {
bench.arg("--runs").arg(runs.to_string());
}
if let Some(prepare) = prepare {
bench.arg("--prepare").arg(format!("{:?}", prepare));
}
bench.arg(format!("{:?}", a)).arg(format!("{:?}", b));
bench
}
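// Usage sketch (binary paths and run counts are illustrative): compare two
// already-built raytracer binaries with one warmup run and ten measured runs each.
// A Command's Debug output ("prog" "arg" ...) is the string hyperfine receives and
// runs through a shell, which is why the commands are passed via format!("{:?}", ...)
// above.
fn _example_bench_run() -> Command {
    hyperfine_command(
        1,
        10,
        None,
        Command::new("./raytracer_cg_llvm"),
        Command::new("./raytracer_cg_clif"),
    )
}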
#[track_caller]
pub(crate) fn try_hard_link(src: impl AsRef<Path>, dst: impl AsRef<Path>) {
let src = src.as_ref();


@ -3,4 +3,8 @@ set -e
rm -rf build_sysroot/{sysroot_src/,target/,compiler-builtins/,rustc_version}
rm -rf target/ build/ perf.data{,.old} y.bin
rm -rf rand/ regex/ simple-raytracer/ portable-simd/ abi-checker/
rm -rf download/
# Kept for now in case someone updates their checkout of cg_clif before running clean_all.sh
# FIXME remove at some point in the future
rm -rf rand/ regex/ simple-raytracer/ portable-simd/ abi-checker/ abi-cafe/


@ -49,4 +49,4 @@ test.regex-shootout-regex-dna
test.regex
test.portable-simd
testsuite.abi-checker
testsuite.abi-cafe


@ -5,7 +5,6 @@
// Test that we can handle unsized types with an extern type tail part.
// Regression test for issue #91827.
#![feature(const_ptr_offset_from)]
#![feature(extern_types)]
use std::ptr::addr_of;


@ -559,16 +559,22 @@ pub union MaybeUninit<T> {
pub mod intrinsics {
extern "rust-intrinsic" {
#[rustc_safe_intrinsic]
pub fn abort() -> !;
#[rustc_safe_intrinsic]
pub fn size_of<T>() -> usize;
pub fn size_of_val<T: ?::Sized>(val: *const T) -> usize;
#[rustc_safe_intrinsic]
pub fn min_align_of<T>() -> usize;
pub fn min_align_of_val<T: ?::Sized>(val: *const T) -> usize;
pub fn copy<T>(src: *const T, dst: *mut T, count: usize);
pub fn transmute<T, U>(e: T) -> U;
pub fn ctlz_nonzero<T>(x: T) -> T;
#[rustc_safe_intrinsic]
pub fn needs_drop<T: ?::Sized>() -> bool;
#[rustc_safe_intrinsic]
pub fn bitreverse<T>(x: T) -> T;
#[rustc_safe_intrinsic]
pub fn bswap<T>(x: T) -> T;
pub fn write_bytes<T>(dst: *mut T, val: u8, count: usize);
}


@ -93,6 +93,7 @@ fn start<T: Termination + 'static>(
main: fn() -> T,
argc: isize,
argv: *const *const u8,
_sigpipe: u8,
) -> isize {
if argc == 3 {
unsafe { puts(*argv as *const i8); }


@ -1,4 +1,4 @@
#![feature(core_intrinsics, generators, generator_trait, is_sorted, bench_black_box)]
#![feature(core_intrinsics, generators, generator_trait, is_sorted)]
#[cfg(target_arch = "x86_64")]
use std::arch::x86_64::*;


@ -0,0 +1,29 @@
From 2b15fee2bb5fd14e34c7e17e44d99cb34f4c555d Mon Sep 17 00:00:00 2001
From: Afonso Bordado <afonsobordado@az8.co>
Date: Tue, 27 Sep 2022 07:55:17 +0100
Subject: [PATCH] Disable some test on x86_64-pc-windows-gnu
---
src/report.rs | 6 ++++++
1 file changed, 6 insertions(+)
diff --git a/src/report.rs b/src/report.rs
index eeec614..f582867 100644
--- a/src/report.rs
+++ b/src/report.rs
@@ -48,6 +48,12 @@ pub fn get_test_rules(test: &TestKey, caller: &dyn AbiImpl, callee: &dyn AbiImpl
//
// THIS AREA RESERVED FOR VENDORS TO APPLY PATCHES
+ // x86_64-pc-windows-gnu has some broken i128 tests that aren't disabled by default
+ if cfg!(all(target_os = "windows", target_env = "gnu")) && test.test_name == "ui128" {
+ result.run = Link;
+ result.check = Pass(Link);
+ }
+
// END OF VENDOR RESERVED AREA
//
//
--
2.30.1.windows.1
