New upstream version 1.41.1+dfsg1

Ximin Luo 2020-03-08 23:20:00 +00:00
parent e74abb3270
commit 60c5eb7d04
5444 changed files with 168661 additions and 77570 deletions

View File

@ -105,7 +105,7 @@ contributions to the compiler and the standard library. It also lists some
really useful commands to the build system (`./x.py`), which could save you a
lot of time.
[rustcguidebuild]: https://rust-lang.github.io/rustc-guide/how-to-build-and-run.html
[rustcguidebuild]: https://rust-lang.github.io/rustc-guide/building/how-to-build-and-run.html
## Pull Requests
[pull-requests]: #pull-requests
@ -150,13 +150,13 @@ All pull requests are reviewed by another person. We have a bot,
request.
If you want to request that a specific person reviews your pull request,
you can add an `r?` to the message. For example, [Steve][steveklabnik] usually reviews
you can add an `r?` to the pull request description. For example, [Steve][steveklabnik] usually reviews
documentation changes. So if you were to make a documentation change, add
r? @steveklabnik
to the end of the message, and @rust-highfive will assign [@steveklabnik][steveklabnik] instead
of a random person. This is entirely optional.
to the end of the pull request description, and [@rust-highfive][rust-highfive] will assign
[@steveklabnik][steveklabnik] instead of a random person. This is entirely optional.
After someone has reviewed your pull request, they will leave an annotation
on the pull request with an `r+`. It will look something like this:

Cargo.lock (generated, 1046 changed lines)

File diff suppressed because it is too large

View File

@ -21,7 +21,7 @@ The Rust build system has a Python script called `x.py` to bootstrap building
the compiler. More information about it may be found by running `./x.py --help`
or reading the [rustc guide][rustcguidebuild].
[rustcguidebuild]: https://rust-lang.github.io/rustc-guide/how-to-build-and-run.html
[rustcguidebuild]: https://rust-lang.github.io/rustc-guide/building/how-to-build-and-run.html
### Building on *nix
1. Make sure you have installed the dependencies:

View File

@ -1,3 +1,265 @@
Version 1.41.1 (2020-02-27)
===========================
* [Always check types of static items][69145]
* [Always check lifetime bounds of `Copy` impls][69145]
* [Fix miscompilation in callers of `Layout::repeat`][69225]
[69225]: https://github.com/rust-lang/rust/issues/69225
[69145]: https://github.com/rust-lang/rust/pull/69145
Version 1.41.0 (2020-01-30)
===========================
Language
--------
- [You can now pass type parameters to foreign items when implementing
traits.][65879] E.g. You can now write `impl<T> From<Foo> for Vec<T> {}`.
- [You can now arbitrarily nest receiver types in the `self` position.][64325] E.g. you can
now write `fn foo(self: Box<Box<Self>>) {}`. Previously only `Self`, `&Self`,
`&mut Self`, `Arc<Self>`, `Rc<Self>`, and `Box<Self>` were allowed. See the sketch after this list.
- [You can now use any valid identifier in a `format_args` macro.][66847]
Previously identifiers starting with an underscore were not allowed.
- [Visibility modifiers (e.g. `pub`) are now syntactically allowed on trait items and
enum variants.][66183] These are still rejected semantically, but
can be seen and parsed by procedural macros and conditional compilation.
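A small illustrative sketch (not part of the official notes) combining three of the items above: an `impl<T> From<Foo> for Vec<T>` made legal by the relaxed coherence rules, a nested `Box<Box<Self>>` receiver, and a named format argument starting with an underscore. The type `Foo` and the method `consume` are made up for the example.
```rust
// Local type; `From` and `Vec` come from the standard library.
struct Foo(u32);

// Allowed since 1.41: `T` is covered by `Vec`, and `Foo` is local.
impl<T> From<Foo> for Vec<T> {
    fn from(_: Foo) -> Vec<T> {
        Vec::new()
    }
}

impl Foo {
    // Allowed since 1.41: receiver types may be nested arbitrarily.
    fn consume(self: Box<Box<Self>>) -> u32 {
        (**self).0
    }
}

fn main() {
    let empty: Vec<i32> = Foo(1).into();
    assert!(empty.is_empty());
    assert_eq!(Box::new(Box::new(Foo(7))).consume(), 7);
    // Allowed since 1.41: named format arguments may start with `_`.
    println!("{_answer}", _answer = 42);
}
```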
Compiler
--------
- [Rustc will now warn if you have unused loop `'label`s.][66325]
- [Removed support for the `i686-unknown-dragonfly` target.][67255]
- [Added tier 3 support\* for the `riscv64gc-unknown-linux-gnu` target.][66661]
- [You can now pass an arguments file to rustc using the `@path` syntax.][66172]
Note that the format differs somewhat from what is
found in other tooling; please see [the documentation][argfile-docs] for
more information.
- [You can now provide `--extern` flag without a path, indicating that it is
available from the search path or specified with an `-L` flag.][64882]
\* Refer to Rust's [platform support page][forge-platform-support] for more
information on Rust's tiered platform support.
[argfile-docs]: https://doc.rust-lang.org/nightly/rustc/command-line-arguments.html#path-load-command-line-flags-from-a-path
Libraries
---------
- [The `core::panic` module is now stable.][66771] It was already stable
through `std`.
- [`NonZero*` numerics now implement `From<NonZero*>` if it's a smaller integer
width.][66277] E.g. `NonZeroU16` now implements `From<NonZeroU8>`.
- [`MaybeUninit<T>` now implements `fmt::Debug`.][65013]
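A brief sketch of the last two library items, using made-up variable names:
```rust
use std::mem::MaybeUninit;
use std::num::{NonZeroU16, NonZeroU8};

fn main() {
    // Widening `From` conversions between `NonZero*` types (new in 1.41).
    let small = NonZeroU8::new(42).unwrap();
    let wide = NonZeroU16::from(small);
    assert_eq!(wide.get(), 42);

    // `MaybeUninit<T>` now has a `Debug` impl; it prints only the type,
    // not the (possibly uninitialized) contents.
    let slot: MaybeUninit<u32> = MaybeUninit::uninit();
    println!("{:?}", slot);
}
```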
Stabilized APIs
---------------
- [`Result::map_or`]
- [`Result::map_or_else`]
- [`std::rc::Weak::weak_count`]
- [`std::rc::Weak::strong_count`]
- [`std::sync::Weak::weak_count`]
- [`std::sync::Weak::strong_count`]
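A quick sketch exercising the newly stabilized APIs; the values and names are illustrative only:
```rust
use std::rc::Rc;

fn main() {
    // `Result::map_or` / `Result::map_or_else` replace a common match.
    let ok: Result<u32, &str> = Ok(3);
    let err: Result<u32, &str> = Err("boom");
    assert_eq!(ok.map_or(0, |v| v * 2), 6);
    assert_eq!(err.map_or_else(|e| e.len() as u32, |v| v * 2), 4);

    // `Weak::strong_count` / `Weak::weak_count` (shown here on `rc::Weak`;
    // the `sync::Weak` versions behave the same way).
    let strong = Rc::new(());
    let weak = Rc::downgrade(&strong);
    assert_eq!(weak.strong_count(), 1);
    assert_eq!(weak.weak_count(), 1);
}
```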
Cargo
-----
- [Cargo will now document all the private items for binary crates
by default.][cargo/7593]
- [`cargo-install` will now reinstall the package if it detects that it is out
of date.][cargo/7560]
- [Cargo.lock now uses a more git-friendly format that should help to reduce
merge conflicts.][cargo/7579]
- [You can now override specific dependencies' build settings.][cargo/7591] E.g.
`[profile.dev.package.image] opt-level = 2` sets the `image` crate's
optimisation level to `2` for debug builds. You can also use
`[profile.<profile>.build-override]` to override build scripts and
their dependencies.
Misc
----
- [You can now specify `edition` in documentation code blocks to compile the block
for that edition.][66238] E.g. `edition2018` tells rustdoc that the code sample
should be compiled with the 2018 edition of Rust.
- [You can now provide custom themes to rustdoc with `--theme`, and check the
current theme with `--check-theme`.][54733]
- [You can use `#[cfg(doc)]` to compile an item when building documentation.][61351]
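A deliberately minimal (and somewhat artificial) sketch of `#[cfg(doc)]`, written as it might appear in a library's `lib.rs`; in practice `doc` is usually combined with platform cfgs such as `#[cfg(any(windows, doc))]` so docs built on one platform still show items from the others. The `edition2018` annotation mentioned above simply replaces `rust` on a documentation code fence.
```rust
// This item is compiled only when rustdoc builds the documentation
// (`cargo doc`), so it appears in the rendered docs but not in
// ordinary builds.
#[cfg(doc)]
pub struct OnlyInDocs;

/// A regular item. Doc examples on it can be annotated with
/// `edition2018` so rustdoc compiles them as 2018-edition code.
pub fn always_present() {}
```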
Compatibility Notes
-------------------
- [As previously announced, 1.41.0 will be the last tier 1 release for 32-bit
Apple targets.][apple-32bit-drop] This means that the source code is still
available to build, but the targets are no longer being tested and release
binaries for those platforms will no longer be distributed by the Rust project.
Please refer to the linked blog post for more information.
[54733]: https://github.com/rust-lang/rust/pull/54733/
[61351]: https://github.com/rust-lang/rust/pull/61351/
[67255]: https://github.com/rust-lang/rust/pull/67255/
[66661]: https://github.com/rust-lang/rust/pull/66661/
[66771]: https://github.com/rust-lang/rust/pull/66771/
[66847]: https://github.com/rust-lang/rust/pull/66847/
[66238]: https://github.com/rust-lang/rust/pull/66238/
[66277]: https://github.com/rust-lang/rust/pull/66277/
[66325]: https://github.com/rust-lang/rust/pull/66325/
[66172]: https://github.com/rust-lang/rust/pull/66172/
[66183]: https://github.com/rust-lang/rust/pull/66183/
[65879]: https://github.com/rust-lang/rust/pull/65879/
[65013]: https://github.com/rust-lang/rust/pull/65013/
[64882]: https://github.com/rust-lang/rust/pull/64882/
[64325]: https://github.com/rust-lang/rust/pull/64325/
[cargo/7560]: https://github.com/rust-lang/cargo/pull/7560/
[cargo/7579]: https://github.com/rust-lang/cargo/pull/7579/
[cargo/7591]: https://github.com/rust-lang/cargo/pull/7591/
[cargo/7593]: https://github.com/rust-lang/cargo/pull/7593/
[`Result::map_or_else`]: https://doc.rust-lang.org/std/result/enum.Result.html#method.map_or_else
[`Result::map_or`]: https://doc.rust-lang.org/std/result/enum.Result.html#method.map_or
[`std::rc::Weak::weak_count`]: https://doc.rust-lang.org/std/rc/struct.Weak.html#method.weak_count
[`std::rc::Weak::strong_count`]: https://doc.rust-lang.org/std/rc/struct.Weak.html#method.strong_count
[`std::sync::Weak::weak_count`]: https://doc.rust-lang.org/std/sync/struct.Weak.html#method.weak_count
[`std::sync::Weak::strong_count`]: https://doc.rust-lang.org/std/sync/struct.Weak.html#method.strong_count
[apple-32bit-drop]: https://blog.rust-lang.org/2020/01/03/reducing-support-for-32-bit-apple-targets.html
Version 1.40.0 (2019-12-19)
===========================
Language
--------
- [You can now use tuple `struct`s and tuple `enum` variant's constructors in
`const` contexts.][65188] e.g.
```rust
pub struct Point(i32, i32);
const ORIGIN: Point = {
    let constructor = Point;
    constructor(0, 0)
};
```
- [You can now mark `struct`s, `enum`s, and `enum` variants with the `#[non_exhaustive]` attribute to
indicate that there may be variants or fields added in the future.][64639]
For example, this requires adding a wild-card branch (`_ => {}`) to any match
statements on a non-exhaustive `enum`; see the sketch after this list. [(RFC 2008)]
- [You can now use function-like procedural macros in `extern` blocks and in
type positions.][63931] e.g. `type Generated = macro!();`
- [Function-like and attribute procedural macros can now emit
`macro_rules!` items, so you can now have your macros generate macros.][64035]
- [The `meta` pattern matcher in `macro_rules!` now correctly matches the modern
attribute syntax.][63674] For example `(#[$m:meta])` now matches `#[attr]`,
`#[attr{tokens}]`, `#[attr[tokens]]`, and `#[attr(tokens)]`.
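A short sketch of `#[non_exhaustive]` with made-up names; inside the defining crate the attribute has no effect, so the wildcard arm below mimics what a downstream crate would be forced to write.
```rust
#[non_exhaustive]
pub enum Error {
    NotFound,
    PermissionDenied,
}

fn is_not_found(e: &Error) -> bool {
    match e {
        Error::NotFound => true,
        // Downstream crates must keep a wildcard arm, since new variants
        // may be added to `Error` without a breaking change.
        _ => false,
    }
}

fn main() {
    assert!(is_not_found(&Error::NotFound));
    assert!(!is_not_found(&Error::PermissionDenied));
}
```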
Compiler
--------
- [Added tier 3 support\* for the
`thumbv7neon-unknown-linux-musleabihf` target.][66103]
- [Added tier 3 support for the
`aarch64-unknown-none-softfloat` target.][64589]
- [Added tier 3 support for the `mips64-unknown-linux-muslabi64`, and
`mips64el-unknown-linux-muslabi64` targets.][65843]
\* Refer to Rust's [platform support page][forge-platform-support] for more
information on Rust's tiered platform support.
Libraries
---------
- [The `is_power_of_two` method on unsigned numeric types is now a `const` function.][65092]
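For instance (an illustrative constant, not taken from the notes), the check can now run at compile time:
```rust
const CHUNK: usize = 4096;
// Evaluated at compile time now that `is_power_of_two` is a `const fn`.
const CHUNK_IS_POW2: bool = CHUNK.is_power_of_two();

fn main() {
    assert!(CHUNK_IS_POW2);
}
```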
Stabilized APIs
---------------
- [`BTreeMap::get_key_value`]
- [`HashMap::get_key_value`]
- [`Option::as_deref_mut`]
- [`Option::as_deref`]
- [`Option::flatten`]
- [`UdpSocket::peer_addr`]
- [`f32::to_be_bytes`]
- [`f32::to_le_bytes`]
- [`f32::to_ne_bytes`]
- [`f64::to_be_bytes`]
- [`f64::to_le_bytes`]
- [`f64::to_ne_bytes`]
- [`f32::from_be_bytes`]
- [`f32::from_le_bytes`]
- [`f32::from_ne_bytes`]
- [`f64::from_be_bytes`]
- [`f64::from_le_bytes`]
- [`f64::from_ne_bytes`]
- [`mem::take`]
- [`slice::repeat`]
- [`todo!`]
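A compact sketch touching several of the newly stabilized APIs; all values are arbitrary:
```rust
use std::collections::HashMap;
use std::mem;

fn main() {
    // `Option::flatten` removes one level of nesting.
    let nested: Option<Option<u32>> = Some(Some(5));
    assert_eq!(nested.flatten(), Some(5));

    // `Option::as_deref` turns `Option<String>` into `Option<&str>`.
    let name: Option<String> = Some("rustc".to_string());
    assert_eq!(name.as_deref(), Some("rustc"));

    // `mem::take` moves the value out, leaving `Default::default()` behind.
    let mut buf = vec![1, 2, 3];
    let taken = mem::take(&mut buf);
    assert!(buf.is_empty());
    assert_eq!(taken, vec![1, 2, 3]);

    // `f32::to_be_bytes` / `f32::from_be_bytes` round-trip the raw bytes.
    let bytes = 1.0f32.to_be_bytes();
    assert_eq!(f32::from_be_bytes(bytes), 1.0);

    // `HashMap::get_key_value` returns the stored key alongside the value.
    let mut map = HashMap::new();
    map.insert("answer", 42);
    assert_eq!(map.get_key_value("answer"), Some((&"answer", &42)));

    // `slice::repeat` builds a `Vec` by repeating the slice.
    assert_eq!([0u8, 1].repeat(2), vec![0, 1, 0, 1]);
}
```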
Cargo
-----
- [Cargo will now always display warnings, rather than only on
fresh builds.][cargo/7450]
- [Feature flags (except `--all-features`) passed to a virtual workspace will
now produce an error.][cargo/7507] Previously these flags were ignored.
- [You can now publish `dev-dependencies` without including
a `version`.][cargo/7333]
Misc
----
- [You can now specify the `#[cfg(doctest)]` attribute to include an item only
when running documentation tests with `rustdoc`.][63803]
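A minimal, somewhat contrived sketch for a library crate; the `#[doc]` string attribute stands in for an ordinary doc comment with a fenced example, only so that this sketch itself stays a single code block.
```rust
// The `doctest` cfg is set only while rustdoc collects and runs
// documentation tests (`cargo test --doc`), so this module and its
// extra doctest are skipped in normal builds.
#[cfg(doctest)]
pub mod extra_doctests {
    #[doc = "```\nassert_eq!(2 + 2, 4);\n```"]
    pub struct ExtraDoctests;
}
```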
Compatibility Notes
-------------------
- [As previously announced, any previous NLL warnings in the 2015 edition are
now hard errors.][64221]
- [The `include!` macro will now warn if it failed to include the
entire file.][64284] The `include!` macro unintentionally only includes the
first _expression_ in a file, and this can be unintuitive. This will become
either a hard error in a future release, or the behavior may be fixed to include all expressions as expected.
- [Using `#[inline]` on function prototypes and consts now emits a warning under
`unused_attribute` lint.][65294] Using `#[inline]` anywhere else inside traits
or `extern` blocks now correctly emits a hard error.
[65294]: https://github.com/rust-lang/rust/pull/65294/
[66103]: https://github.com/rust-lang/rust/pull/66103/
[65843]: https://github.com/rust-lang/rust/pull/65843/
[65188]: https://github.com/rust-lang/rust/pull/65188/
[65092]: https://github.com/rust-lang/rust/pull/65092/
[64589]: https://github.com/rust-lang/rust/pull/64589/
[64639]: https://github.com/rust-lang/rust/pull/64639/
[64221]: https://github.com/rust-lang/rust/pull/64221/
[64284]: https://github.com/rust-lang/rust/pull/64284/
[63931]: https://github.com/rust-lang/rust/pull/63931/
[64035]: https://github.com/rust-lang/rust/pull/64035/
[63674]: https://github.com/rust-lang/rust/pull/63674/
[63803]: https://github.com/rust-lang/rust/pull/63803/
[cargo/7450]: https://github.com/rust-lang/cargo/pull/7450/
[cargo/7507]: https://github.com/rust-lang/cargo/pull/7507/
[cargo/7525]: https://github.com/rust-lang/cargo/pull/7525/
[cargo/7333]: https://github.com/rust-lang/cargo/pull/7333/
[(rfc 2008)]: https://rust-lang.github.io/rfcs/2008-non-exhaustive.html
[`f32::to_be_bytes`]: https://doc.rust-lang.org/std/primitive.f32.html#method.to_be_bytes
[`f32::to_le_bytes`]: https://doc.rust-lang.org/std/primitive.f32.html#method.to_le_bytes
[`f32::to_ne_bytes`]: https://doc.rust-lang.org/std/primitive.f32.html#method.to_ne_bytes
[`f64::to_be_bytes`]: https://doc.rust-lang.org/std/primitive.f64.html#method.to_be_bytes
[`f64::to_le_bytes`]: https://doc.rust-lang.org/std/primitive.f64.html#method.to_le_bytes
[`f64::to_ne_bytes`]: https://doc.rust-lang.org/std/primitive.f64.html#method.to_ne_bytes
[`f32::from_be_bytes`]: https://doc.rust-lang.org/std/primitive.f32.html#method.from_be_bytes
[`f32::from_le_bytes`]: https://doc.rust-lang.org/std/primitive.f32.html#method.from_le_bytes
[`f32::from_ne_bytes`]: https://doc.rust-lang.org/std/primitive.f32.html#method.from_ne_bytes
[`f64::from_be_bytes`]: https://doc.rust-lang.org/std/primitive.f64.html#method.from_be_bytes
[`f64::from_le_bytes`]: https://doc.rust-lang.org/std/primitive.f64.html#method.from_le_bytes
[`f64::from_ne_bytes`]: https://doc.rust-lang.org/std/primitive.f64.html#method.from_ne_bytes
[`option::flatten`]: https://doc.rust-lang.org/std/option/enum.Option.html#method.flatten
[`option::as_deref`]: https://doc.rust-lang.org/std/option/enum.Option.html#method.as_deref
[`option::as_deref_mut`]: https://doc.rust-lang.org/std/option/enum.Option.html#method.as_deref_mut
[`hashmap::get_key_value`]: https://doc.rust-lang.org/std/collections/struct.HashMap.html#method.get_key_value
[`btreemap::get_key_value`]: https://doc.rust-lang.org/std/collections/struct.BTreeMap.html#method.get_key_value
[`slice::repeat`]: https://doc.rust-lang.org/std/primitive.slice.html#method.repeat
[`mem::take`]: https://doc.rust-lang.org/std/mem/fn.take.html
[`udpsocket::peer_addr`]: https://doc.rust-lang.org/std/net/struct.UdpSocket.html#method.peer_addr
[`todo!`]: https://doc.rust-lang.org/std/macro.todo.html
Version 1.39.0 (2019-11-07)
===========================
@ -7804,7 +8066,7 @@ Version 0.7 (2013-07-03)
* extra: Implementation of fixed output size variations of SHA-2.
* Tooling
* `unused_variable` lint mode for unused variables (default: warn).
* `unused_variables` lint mode for unused variables (default: warn).
* `unused_unsafe` lint mode for detecting unnecessary `unsafe` blocks
(default: warn).
* `unused_mut` lint mode for identifying unused `mut` qualifiers

View File

@ -379,9 +379,6 @@
# and currently the only standard option supported is `"llvm"`
#codegen-backends = ["llvm"]
# This is the name of the directory in which codegen backends will get installed
#codegen-backends-dir = "codegen-backends"
# Indicates whether LLD will be compiled and made available in the sysroot for
# rustc to execute.
#lld = false

View File

@ -1 +1 @@
73528e339aae0f17a15ffa49a8ac608f50c6cf14
f3e1a954d2ead4e2fc197c7da7d71e6c61bad196

View File

@ -25,7 +25,7 @@ fn main() {
let mut dylib_path = bootstrap::util::dylib_path();
dylib_path.insert(0, PathBuf::from(libdir.clone()));
//FIXME(misdreavus): once stdsimd uses cfg(rustdoc) instead of cfg(dox), remove the `--cfg dox`
//FIXME(misdreavus): once stdsimd uses cfg(doc) instead of cfg(dox), remove the `--cfg dox`
//arguments here
let mut cmd = Command::new(rustdoc);
cmd.args(&args)

View File

@ -102,10 +102,10 @@ def verify(path, sha_path, verbose):
return verified
def unpack(tarball, dst, verbose=False, match=None):
def unpack(tarball, tarball_suffix, dst, verbose=False, match=None):
"""Unpack the given tarball file"""
print("extracting", tarball)
fname = os.path.basename(tarball).replace(".tar.gz", "")
fname = os.path.basename(tarball).replace(tarball_suffix, "")
with contextlib.closing(tarfile.open(tarball)) as tar:
for member in tar.getnames():
if "/" not in member:
@ -331,6 +331,7 @@ class RustBuild(object):
self.use_vendored_sources = ''
self.verbose = False
def download_stage0(self):
"""Fetch the build system for Rust, written in Rust
@ -344,18 +345,30 @@ class RustBuild(object):
rustc_channel = self.rustc_channel
cargo_channel = self.cargo_channel
def support_xz():
try:
with tempfile.NamedTemporaryFile(delete=False) as temp_file:
temp_path = temp_file.name
with tarfile.open(temp_path, "w:xz") as tar:
pass
return True
except tarfile.CompressionError:
return False
if self.rustc().startswith(self.bin_root()) and \
(not os.path.exists(self.rustc()) or
self.program_out_of_date(self.rustc_stamp())):
if os.path.exists(self.bin_root()):
shutil.rmtree(self.bin_root())
filename = "rust-std-{}-{}.tar.gz".format(
rustc_channel, self.build)
tarball_suffix = '.tar.xz' if support_xz() else '.tar.gz'
filename = "rust-std-{}-{}{}".format(
rustc_channel, self.build, tarball_suffix)
pattern = "rust-std-{}".format(self.build)
self._download_stage0_helper(filename, pattern)
self._download_stage0_helper(filename, pattern, tarball_suffix)
filename = "rustc-{}-{}.tar.gz".format(rustc_channel, self.build)
self._download_stage0_helper(filename, "rustc")
filename = "rustc-{}-{}{}".format(rustc_channel, self.build,
tarball_suffix)
self._download_stage0_helper(filename, "rustc", tarball_suffix)
self.fix_executable("{}/bin/rustc".format(self.bin_root()))
self.fix_executable("{}/bin/rustdoc".format(self.bin_root()))
with output(self.rustc_stamp()) as rust_stamp:
@ -365,20 +378,22 @@ class RustBuild(object):
# libraries/binaries that are included in rust-std with
# the system MinGW ones.
if "pc-windows-gnu" in self.build:
filename = "rust-mingw-{}-{}.tar.gz".format(
rustc_channel, self.build)
self._download_stage0_helper(filename, "rust-mingw")
filename = "rust-mingw-{}-{}{}".format(
rustc_channel, self.build, tarball_suffix)
self._download_stage0_helper(filename, "rust-mingw", tarball_suffix)
if self.cargo().startswith(self.bin_root()) and \
(not os.path.exists(self.cargo()) or
self.program_out_of_date(self.cargo_stamp())):
filename = "cargo-{}-{}.tar.gz".format(cargo_channel, self.build)
self._download_stage0_helper(filename, "cargo")
tarball_suffix = '.tar.xz' if support_xz() else '.tar.gz'
filename = "cargo-{}-{}{}".format(cargo_channel, self.build,
tarball_suffix)
self._download_stage0_helper(filename, "cargo", tarball_suffix)
self.fix_executable("{}/bin/cargo".format(self.bin_root()))
with output(self.cargo_stamp()) as cargo_stamp:
cargo_stamp.write(self.date)
def _download_stage0_helper(self, filename, pattern):
def _download_stage0_helper(self, filename, pattern, tarball_suffix):
cache_dst = os.path.join(self.build_dir, "cache")
rustc_cache = os.path.join(cache_dst, self.date)
if not os.path.exists(rustc_cache):
@ -388,7 +403,7 @@ class RustBuild(object):
tarball = os.path.join(rustc_cache, filename)
if not os.path.exists(tarball):
get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(), match=pattern, verbose=self.verbose)
unpack(tarball, tarball_suffix, self.bin_root(), match=pattern, verbose=self.verbose)
@staticmethod
def fix_executable(fname):
@ -628,7 +643,9 @@ class RustBuild(object):
env["LIBRARY_PATH"] = os.path.join(self.bin_root(), "lib") + \
(os.pathsep + env["LIBRARY_PATH"]) \
if "LIBRARY_PATH" in env else ""
env["RUSTFLAGS"] = "-Cdebuginfo=2 "
# preserve existing RUSTFLAGS
env.setdefault("RUSTFLAGS", "")
env["RUSTFLAGS"] += " -Cdebuginfo=2"
build_section = "target.{}".format(self.build_triple())
target_features = []
@ -637,13 +654,13 @@ class RustBuild(object):
elif self.get_toml("crt-static", build_section) == "false":
target_features += ["-crt-static"]
if target_features:
env["RUSTFLAGS"] += "-C target-feature=" + (",".join(target_features)) + " "
env["RUSTFLAGS"] += " -C target-feature=" + (",".join(target_features))
target_linker = self.get_toml("linker", build_section)
if target_linker is not None:
env["RUSTFLAGS"] += "-C linker=" + target_linker + " "
env["RUSTFLAGS"] += " -Wrust_2018_idioms -Wunused_lifetimes "
env["RUSTFLAGS"] += " -C linker=" + target_linker
env["RUSTFLAGS"] += " -Wrust_2018_idioms -Wunused_lifetimes"
if self.get_toml("deny-warnings", "rust") != "false":
env["RUSTFLAGS"] += "-Dwarnings "
env["RUSTFLAGS"] += " -Dwarnings"
env["PATH"] = os.path.join(self.bin_root(), "bin") + \
os.pathsep + env["PATH"]

View File

@ -339,7 +339,6 @@ impl<'a> Builder<'a> {
Kind::Build => describe!(
compile::Std,
compile::Rustc,
compile::CodegenBackend,
compile::StartupObjects,
tool::BuildManifest,
tool::Rustbook,
@ -364,10 +363,10 @@ impl<'a> Builder<'a> {
Kind::Check | Kind::Clippy | Kind::Fix => describe!(
check::Std,
check::Rustc,
check::CodegenBackend,
check::Rustdoc
),
Kind::Test => describe!(
crate::toolstate::ToolStateCheck,
test::Tidy,
test::Ui,
test::CompileFail,
@ -631,11 +630,6 @@ impl<'a> Builder<'a> {
self.ensure(Libdir { compiler, target })
}
pub fn sysroot_codegen_backends(&self, compiler: Compiler) -> PathBuf {
self.sysroot_libdir(compiler, compiler.host)
.with_file_name(self.config.rust_codegen_backends_dir.clone())
}
/// Returns the compiler's libdir where it stores the dynamic libraries that
/// it itself links against.
///
@ -706,15 +700,6 @@ impl<'a> Builder<'a> {
}
}
/// Gets the paths to all of the compiler's codegen backends.
fn codegen_backends(&self, compiler: Compiler) -> impl Iterator<Item = PathBuf> {
fs::read_dir(self.sysroot_codegen_backends(compiler))
.into_iter()
.flatten()
.filter_map(Result::ok)
.map(|entry| entry.path())
}
pub fn rustdoc(&self, compiler: Compiler) -> PathBuf {
self.ensure(tool::Rustdoc { compiler })
}
@ -758,12 +743,6 @@ impl<'a> Builder<'a> {
let mut cargo = Command::new(&self.initial_cargo);
let out_dir = self.stage_out(compiler, mode);
// Codegen backends are not yet tracked by -Zbinary-dep-depinfo,
// so we need to explicitly clear out if they've been updated.
for backend in self.codegen_backends(compiler) {
self.clear_if_dirty(&out_dir, &backend);
}
if cmd == "doc" || cmd == "rustdoc" {
let my_out = match mode {
// This is the intended out directory for compiler documentation.
@ -980,7 +959,7 @@ impl<'a> Builder<'a> {
// argument manually via `-C link-args=-Wl,-rpath,...`. Plus isn't it
// fun to pass a flag to a tool to pass a flag to pass a flag to a tool
// to change a flag in a binary?
if self.config.rust_rpath {
if self.config.rust_rpath && util::use_host_linker(&target) {
let rpath = if target.contains("apple") {
// Note that we need to take one extra step on macOS to also pass
@ -990,10 +969,7 @@ impl<'a> Builder<'a> {
// flesh out rpath support more fully in the future.
rustflags.arg("-Zosx-rpath-install-name");
Some("-Wl,-rpath,@loader_path/../lib")
} else if !target.contains("windows") &&
!target.contains("wasm32") &&
!target.contains("emscripten") &&
!target.contains("fuchsia") {
} else if !target.contains("windows") {
Some("-Wl,-rpath,$ORIGIN/../lib")
} else {
None
@ -1242,7 +1218,8 @@ impl<'a> Builder<'a> {
cargo.arg("--frozen");
}
cargo.env("RUSTC_INSTALL_BINDIR", &self.config.bindir);
// Try to use a sysroot-relative bindir, in case it was configured absolutely.
cargo.env("RUSTC_INSTALL_BINDIR", self.config.bindir_relative());
self.ci_env.force_coloring_in_ci(&mut cargo);

View File

@ -363,6 +363,10 @@ fn dist_with_same_targets_and_hosts() {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
]
);
assert_eq!(

View File

@ -13,7 +13,7 @@ use build_helper::output;
use crate::Build;
// The version number
pub const CFG_RELEASE_NUM: &str = "1.40.0";
pub const CFG_RELEASE_NUM: &str = "1.41.1";
pub struct GitInfo {
inner: Option<Info>,
@ -29,31 +29,28 @@ impl GitInfo {
pub fn new(ignore_git: bool, dir: &Path) -> GitInfo {
// See if this even begins to look like a git dir
if ignore_git || !dir.join(".git").exists() {
return GitInfo { inner: None }
return GitInfo { inner: None };
}
// Make sure git commands work
match Command::new("git")
.arg("rev-parse")
.current_dir(dir)
.output()
{
match Command::new("git").arg("rev-parse").current_dir(dir).output() {
Ok(ref out) if out.status.success() => {}
_ => return GitInfo { inner: None },
}
// Ok, let's scrape some info
let ver_date = output(Command::new("git").current_dir(dir)
.arg("log").arg("-1")
.arg("--date=short")
.arg("--pretty=format:%cd"));
let ver_hash = output(Command::new("git").current_dir(dir)
.arg("rev-parse").arg("HEAD"));
let short_ver_hash = output(Command::new("git")
.current_dir(dir)
.arg("rev-parse")
.arg("--short=9")
.arg("HEAD"));
let ver_date = output(
Command::new("git")
.current_dir(dir)
.arg("log")
.arg("-1")
.arg("--date=short")
.arg("--pretty=format:%cd"),
);
let ver_hash = output(Command::new("git").current_dir(dir).arg("rev-parse").arg("HEAD"));
let short_ver_hash = output(
Command::new("git").current_dir(dir).arg("rev-parse").arg("--short=9").arg("HEAD"),
);
GitInfo {
inner: Some(Info {
commit_date: ver_date.trim().to_string(),

View File

@ -1,11 +1,10 @@
//! Implementation of compiling the compiler and standard library, in "check"-based modes.
use crate::compile::{run_cargo, std_cargo, rustc_cargo, rustc_cargo_env,
add_to_sysroot};
use crate::compile::{run_cargo, std_cargo, rustc_cargo, add_to_sysroot};
use crate::builder::{RunConfig, Builder, Kind, ShouldRun, Step};
use crate::tool::{prepare_tool_cargo, SourceType};
use crate::{Compiler, Mode};
use crate::cache::{INTERNER, Interned};
use crate::cache::Interned;
use std::path::PathBuf;
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
@ -97,7 +96,7 @@ impl Step for Rustc {
let mut cargo = builder.cargo(compiler, Mode::Rustc, target,
cargo_subcommand(builder.kind));
rustc_cargo(builder, &mut cargo);
rustc_cargo(builder, &mut cargo, target);
builder.info(&format!("Checking compiler artifacts ({} -> {})", &compiler.host, target));
run_cargo(builder,
@ -113,55 +112,6 @@ impl Step for Rustc {
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct CodegenBackend {
pub target: Interned<String>,
pub backend: Interned<String>,
}
impl Step for CodegenBackend {
type Output = ();
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("rustc_codegen_llvm")
}
fn make_run(run: RunConfig<'_>) {
let backend = run.builder.config.rust_codegen_backends.get(0);
let backend = backend.cloned().unwrap_or_else(|| {
INTERNER.intern_str("llvm")
});
run.builder.ensure(CodegenBackend {
target: run.target,
backend,
});
}
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(0, builder.config.build);
let target = self.target;
let backend = self.backend;
builder.ensure(Rustc { target });
let mut cargo = builder.cargo(compiler, Mode::Codegen, target,
cargo_subcommand(builder.kind));
cargo.arg("--manifest-path").arg(builder.src.join("src/librustc_codegen_llvm/Cargo.toml"));
rustc_cargo_env(builder, &mut cargo);
// We won't build LLVM if it's not available, as it shouldn't affect `check`.
run_cargo(builder,
cargo,
args(builder.kind),
&codegen_backend_stamp(builder, compiler, target, backend),
vec![],
true);
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Rustdoc {
pub target: Interned<String>,
@ -231,16 +181,6 @@ pub fn librustc_stamp(
builder.cargo_out(compiler, Mode::Rustc, target).join(".librustc-check.stamp")
}
/// Cargo's output path for librustc_codegen_llvm in a given stage, compiled by a particular
/// compiler for the specified target and backend.
fn codegen_backend_stamp(builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
backend: Interned<String>) -> PathBuf {
builder.cargo_out(compiler, Mode::Codegen, target)
.join(format!(".librustc_codegen_llvm-{}-check.stamp", backend))
}
/// Cargo's output path for rustdoc in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn rustdoc_stamp(

View File

@ -27,7 +27,7 @@ use crate::{Compiler, Mode, GitRepo};
use crate::native;
use crate::cache::{INTERNER, Interned};
use crate::builder::{Step, RunConfig, ShouldRun, Builder};
use crate::builder::{Step, RunConfig, ShouldRun, Builder, Kind};
#[derive(Debug, PartialOrd, Ord, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Std {
@ -113,7 +113,7 @@ impl Step for Std {
}
}
/// Copies third pary objects needed by various targets.
/// Copies third party objects needed by various targets.
fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target: Interned<String>)
-> Vec<PathBuf>
{
@ -445,7 +445,7 @@ impl Step for Rustc {
});
let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "build");
rustc_cargo(builder, &mut cargo);
rustc_cargo(builder, &mut cargo, target);
builder.info(&format!("Building stage{} compiler artifacts ({} -> {})",
compiler.stage, &compiler.host, target));
@ -456,6 +456,44 @@ impl Step for Rustc {
vec![],
false);
// We used to build librustc_codegen_llvm as a separate step,
// which produced a dylib that the compiler would dlopen() at runtime.
// This meant that we only needed to make sure that libLLVM.so was
// installed by the time we went to run a tool using it - since
// librustc_codegen_llvm was effectively a standalone artifact,
// other crates were completely oblivious to its dependency
// on `libLLVM.so` during build time.
//
// However, librustc_codegen_llvm is now built as an ordinary
// crate during the same step as the rest of the compiler crates.
// This means that any crates depending on it will see the fact
// that it uses `libLLVM.so` as a native library, and will
// cause us to pass `-llibLLVM.so` to the linker when we link
// a binary.
//
// For `rustc` itself, this works out fine.
// During the `Assemble` step, we call `dist::maybe_install_llvm_dylib`
// to copy libLLVM.so into the `stage` directory. We then link
// the compiler binary, which will find `libLLVM.so` in the correct place.
//
// However, this is insufficient for tools that are built against stage0
// (e.g. stage1 rustdoc). Since `Assemble` for stage0 doesn't actually do anything,
// we won't have `libLLVM.so` in the stage0 sysroot. In the past, this wasn't
// a problem - we would copy the tool binary into its correct stage directory
// (e.g. stage1 for a stage1 rustdoc built against a stage0 compiler).
// Since libLLVM.so wasn't resolved until runtime, it was fine for it to
// not exist while we were building it.
//
// To ensure that we can still build stage1 tools against a stage0 compiler,
// we explicitly copy libLLVM.so into the stage0 sysroot when building
// the stage0 compiler. This ensures that tools built against stage0
// will see libLLVM.so at build time, making the linker happy.
if compiler.stage == 0 {
builder.info(&format!("Installing libLLVM.so to stage 0 ({})", compiler.host));
let sysroot = builder.sysroot(compiler);
dist::maybe_install_llvm_dylib(builder, compiler.host, &sysroot);
}
builder.ensure(RustcLink {
compiler: builder.compiler(compiler.stage, builder.config.build),
target_compiler: compiler,
@ -464,21 +502,20 @@ impl Step for Rustc {
}
}
pub fn rustc_cargo(builder: &Builder<'_>, cargo: &mut Cargo) {
pub fn rustc_cargo(builder: &Builder<'_>, cargo: &mut Cargo, target: Interned<String>) {
cargo.arg("--features").arg(builder.rustc_features())
.arg("--manifest-path")
.arg(builder.src.join("src/rustc/Cargo.toml"));
rustc_cargo_env(builder, cargo);
rustc_cargo_env(builder, cargo, target);
}
pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Cargo) {
pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Cargo, target: Interned<String>) {
// Set some configuration variables picked up by build scripts and
// the compiler alike
cargo.env("CFG_RELEASE", builder.rust_release())
.env("CFG_RELEASE_CHANNEL", &builder.config.channel)
.env("CFG_VERSION", builder.rust_version())
.env("CFG_PREFIX", builder.config.prefix.clone().unwrap_or_default())
.env("CFG_CODEGEN_BACKENDS_DIR", &builder.config.rust_codegen_backends_dir);
.env("CFG_PREFIX", builder.config.prefix.clone().unwrap_or_default());
let libdir_relative = builder.config.libdir_relative().unwrap_or(Path::new("lib"));
cargo.env("CFG_LIBDIR_RELATIVE", libdir_relative);
@ -501,6 +538,49 @@ pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Cargo) {
if builder.config.rust_verify_llvm_ir {
cargo.env("RUSTC_VERIFY_LLVM_IR", "1");
}
// Pass down configuration from the LLVM build into the build of
// librustc_llvm and librustc_codegen_llvm.
//
// Note that this is disabled if LLVM itself is disabled or we're in a check
// build, where if we're in a check build there's no need to build all of
// LLVM and such.
if builder.config.llvm_enabled() && builder.kind != Kind::Check {
if builder.is_rust_llvm(target) {
cargo.env("LLVM_RUSTLLVM", "1");
}
let llvm_config = builder.ensure(native::Llvm { target });
cargo.env("LLVM_CONFIG", &llvm_config);
let target_config = builder.config.target_config.get(&target);
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
cargo.env("CFG_LLVM_ROOT", s);
}
// Some LLVM linker flags (-L and -l) may be needed to link librustc_llvm.
if let Some(ref s) = builder.config.llvm_ldflags {
cargo.env("LLVM_LINKER_FLAGS", s);
}
// Building with a static libstdc++ is only supported on linux right now,
// not for MSVC or macOS
if builder.config.llvm_static_stdcpp &&
!target.contains("freebsd") &&
!target.contains("msvc") &&
!target.contains("apple") {
let file = compiler_file(builder,
builder.cxx(target).unwrap(),
target,
"libstdc++.a");
cargo.env("LLVM_STATIC_STDCPP", file);
}
if builder.config.llvm_link_shared || builder.config.llvm_thin_lto {
cargo.env("LLVM_LINK_SHARED", "1");
}
if builder.config.llvm_use_libcxx {
cargo.env("LLVM_USE_LIBCXX", "1");
}
if builder.config.llvm_optimize && !builder.config.llvm_release_debuginfo {
cargo.env("LLVM_NDEBUG", "1");
}
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
@ -537,215 +617,6 @@ impl Step for RustcLink {
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct CodegenBackend {
pub compiler: Compiler,
pub target: Interned<String>,
pub backend: Interned<String>,
}
impl Step for CodegenBackend {
type Output = ();
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("rustc_codegen_llvm")
}
fn make_run(run: RunConfig<'_>) {
let backend = run.builder.config.rust_codegen_backends.get(0);
let backend = backend.cloned().unwrap_or_else(|| {
INTERNER.intern_str("llvm")
});
run.builder.ensure(CodegenBackend {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
target: run.target,
backend,
});
}
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target = self.target;
let backend = self.backend;
builder.ensure(Rustc { compiler, target });
if builder.config.keep_stage.contains(&compiler.stage) {
builder.info("Warning: Using a potentially old codegen backend. \
This may not behave well.");
// Codegen backends are linked separately from this step today, so we don't do
// anything here.
return;
}
let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
if compiler_to_use != compiler {
builder.ensure(CodegenBackend {
compiler: compiler_to_use,
target,
backend,
});
return;
}
let out_dir = builder.cargo_out(compiler, Mode::Codegen, target);
let mut cargo = builder.cargo(compiler, Mode::Codegen, target, "build");
cargo.arg("--manifest-path")
.arg(builder.src.join("src/librustc_codegen_llvm/Cargo.toml"));
rustc_cargo_env(builder, &mut cargo);
let features = build_codegen_backend(&builder, &mut cargo, &compiler, target, backend);
cargo.arg("--features").arg(features);
let tmp_stamp = out_dir.join(".tmp.stamp");
let files = run_cargo(builder, cargo, vec![], &tmp_stamp, vec![], false);
if builder.config.dry_run {
return;
}
let mut files = files.into_iter()
.filter(|f| {
let filename = f.file_name().unwrap().to_str().unwrap();
is_dylib(filename) && filename.contains("rustc_codegen_llvm-")
});
let codegen_backend = match files.next() {
Some(f) => f,
None => panic!("no dylibs built for codegen backend?"),
};
if let Some(f) = files.next() {
panic!("codegen backend built two dylibs:\n{}\n{}",
codegen_backend.display(),
f.display());
}
let stamp = codegen_backend_stamp(builder, compiler, target, backend);
let codegen_backend = codegen_backend.to_str().unwrap();
t!(fs::write(&stamp, &codegen_backend));
}
}
pub fn build_codegen_backend(builder: &Builder<'_>,
cargo: &mut Cargo,
compiler: &Compiler,
target: Interned<String>,
backend: Interned<String>) -> String {
match &*backend {
"llvm" => {
// Build LLVM for our target. This will implicitly build the
// host LLVM if necessary.
let llvm_config = builder.ensure(native::Llvm {
target,
});
builder.info(&format!("Building stage{} codegen artifacts ({} -> {}, {})",
compiler.stage, &compiler.host, target, backend));
// Pass down configuration from the LLVM build into the build of
// librustc_llvm and librustc_codegen_llvm.
if builder.is_rust_llvm(target) {
cargo.env("LLVM_RUSTLLVM", "1");
}
cargo.env("LLVM_CONFIG", &llvm_config);
let target_config = builder.config.target_config.get(&target);
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
cargo.env("CFG_LLVM_ROOT", s);
}
// Some LLVM linker flags (-L and -l) may be needed to link librustc_llvm.
if let Some(ref s) = builder.config.llvm_ldflags {
cargo.env("LLVM_LINKER_FLAGS", s);
}
// Building with a static libstdc++ is only supported on linux and mingw right now,
// not for MSVC or macOS
if builder.config.llvm_static_stdcpp &&
!target.contains("freebsd") &&
!target.contains("msvc") &&
!target.contains("apple") {
let file = compiler_file(builder,
builder.cxx(target).unwrap(),
target,
"libstdc++.a");
cargo.env("LLVM_STATIC_STDCPP", file);
}
if builder.config.llvm_link_shared || builder.config.llvm_thin_lto {
cargo.env("LLVM_LINK_SHARED", "1");
}
if builder.config.llvm_use_libcxx {
cargo.env("LLVM_USE_LIBCXX", "1");
}
if builder.config.llvm_optimize && !builder.config.llvm_release_debuginfo {
cargo.env("LLVM_NDEBUG", "1");
}
}
_ => panic!("unknown backend: {}", backend),
}
String::new()
}
/// Creates the `codegen-backends` folder for a compiler that's about to be
/// assembled as a complete compiler.
///
/// This will take the codegen artifacts produced by `compiler` and link them
/// into an appropriate location for `target_compiler` to be a functional
/// compiler.
fn copy_codegen_backends_to_sysroot(builder: &Builder<'_>,
compiler: Compiler,
target_compiler: Compiler) {
let target = target_compiler.host;
// Note that this step is different than all the other `*Link` steps in
// that it's not assembling a bunch of libraries but rather is primarily
// moving the codegen backend into place. The codegen backend of rustc is
// not linked into the main compiler by default but is rather dynamically
// selected at runtime for inclusion.
//
// Here we're looking for the output dylib of the `CodegenBackend` step and
// we're copying that into the `codegen-backends` folder.
let dst = builder.sysroot_codegen_backends(target_compiler);
t!(fs::create_dir_all(&dst));
if builder.config.dry_run {
return;
}
for backend in builder.config.rust_codegen_backends.iter() {
let stamp = codegen_backend_stamp(builder, compiler, target, *backend);
let dylib = t!(fs::read_to_string(&stamp));
let file = Path::new(&dylib);
let filename = file.file_name().unwrap().to_str().unwrap();
// change `librustc_codegen_llvm-xxxxxx.so` to `librustc_codegen_llvm-llvm.so`
let target_filename = {
let dash = filename.find('-').unwrap();
let dot = filename.find('.').unwrap();
format!("{}-{}{}",
&filename[..dash],
backend,
&filename[dot..])
};
builder.copy(&file, &dst.join(target_filename));
}
}
fn copy_lld_to_sysroot(builder: &Builder<'_>,
target_compiler: Compiler,
lld_install_root: &Path) {
let target = target_compiler.host;
let dst = builder.sysroot_libdir(target_compiler, target)
.parent()
.unwrap()
.join("bin");
t!(fs::create_dir_all(&dst));
let src_exe = exe("lld", &target);
let dst_exe = exe("rust-lld", &target);
// we prepend this bin directory to the user PATH when linking Rust binaries. To
// avoid shadowing the system LLD we rename the LLD we provide to `rust-lld`.
builder.copy(&lld_install_root.join("bin").join(&src_exe), &dst.join(&dst_exe));
}
/// Cargo's output path for the standard library in a given stage, compiled
/// by a particular compiler for the specified target.
pub fn libstd_stamp(
@ -766,16 +637,6 @@ pub fn librustc_stamp(
builder.cargo_out(compiler, Mode::Rustc, target).join(".librustc.stamp")
}
/// Cargo's output path for librustc_codegen_llvm in a given stage, compiled by a particular
/// compiler for the specified target and backend.
fn codegen_backend_stamp(builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
backend: Interned<String>) -> PathBuf {
builder.cargo_out(compiler, Mode::Codegen, target)
.join(format!(".librustc_codegen_llvm-{}.stamp", backend))
}
pub fn compiler_file(
builder: &Builder<'_>,
compiler: &Path,
@ -879,13 +740,6 @@ impl Step for Assemble {
compiler: build_compiler,
target: target_compiler.host,
});
for &backend in builder.config.rust_codegen_backends.iter() {
builder.ensure(CodegenBackend {
compiler: build_compiler,
target: target_compiler.host,
backend,
});
}
let lld_install = if builder.config.lld_enabled {
Some(builder.ensure(native::Lld {
@ -911,13 +765,19 @@ impl Step for Assemble {
}
}
copy_codegen_backends_to_sysroot(builder,
build_compiler,
target_compiler);
let libdir = builder.sysroot_libdir(target_compiler, target_compiler.host);
if let Some(lld_install) = lld_install {
copy_lld_to_sysroot(builder, target_compiler, &lld_install);
let src_exe = exe("lld", &target_compiler.host);
let dst_exe = exe("rust-lld", &target_compiler.host);
// we prepend this bin directory to the user PATH when linking Rust binaries. To
// avoid shadowing the system LLD we rename the LLD we provide to `rust-lld`.
let dst = libdir.parent().unwrap().join("bin");
t!(fs::create_dir_all(&dst));
builder.copy(&lld_install.join("bin").join(&src_exe), &dst.join(&dst_exe));
}
// Ensure that `libLLVM.so` ends up in the newly build compiler directory,
// so that it can be found when the newly built `rustc` is run.
dist::maybe_install_llvm_dylib(builder, target_compiler.host, &sysroot);
// Link the compiler binary itself into place

View File

@ -105,7 +105,6 @@ pub struct Config {
pub rust_optimize_tests: bool,
pub rust_dist_src: bool,
pub rust_codegen_backends: Vec<Interned<String>>,
pub rust_codegen_backends_dir: String,
pub rust_verify_llvm_ir: bool,
pub rust_remap_debuginfo: bool,
@ -316,7 +315,6 @@ struct Rust {
dist_src: Option<bool>,
save_toolstates: Option<String>,
codegen_backends: Option<Vec<String>>,
codegen_backends_dir: Option<String>,
lld: Option<bool>,
llvm_tools: Option<bool>,
lldb: Option<bool>,
@ -372,7 +370,6 @@ impl Config {
config.ignore_git = false;
config.rust_dist_src = true;
config.rust_codegen_backends = vec![INTERNER.intern_str("llvm")];
config.rust_codegen_backends_dir = "codegen-backends".to_owned();
config.deny_warnings = true;
config.missing_tools = false;
@ -575,8 +572,6 @@ impl Config {
.collect();
}
set(&mut config.rust_codegen_backends_dir, rust.codegen_backends_dir.clone());
config.rust_codegen_units = rust.codegen_units.map(threads_from_config);
config.rust_codegen_units_std = rust.codegen_units_std.map(threads_from_config);
}
@ -647,6 +642,20 @@ impl Config {
config
}
/// Try to find the relative path of `bindir`, otherwise return it in full.
pub fn bindir_relative(&self) -> &Path {
let bindir = &self.bindir;
if bindir.is_absolute() {
// Try to make it relative to the prefix.
if let Some(prefix) = &self.prefix {
if let Ok(stripped) = bindir.strip_prefix(prefix) {
return stripped;
}
}
}
bindir
}
/// Try to find the relative path of `libdir`.
pub fn libdir_relative(&self) -> Option<&Path> {
let libdir = self.libdir.as_ref()?;

View File

@ -498,16 +498,6 @@ impl Step for Rustc {
}
}
// Copy over the codegen backends
let backends_src = builder.sysroot_codegen_backends(compiler);
let backends_rel = backends_src.strip_prefix(&src).unwrap()
.strip_prefix(builder.sysroot_libdir_relative(compiler)).unwrap();
// Don't use custom libdir here because ^lib/ will be resolved again with installer
let backends_dst = image.join("lib").join(&backends_rel);
t!(fs::create_dir_all(&backends_dst));
builder.cp_r(&backends_src, &backends_dst);
// Copy libLLVM.so to the lib dir as well, if needed. While not
// technically needed by rustc itself it's needed by lots of other
// components like the llvm tools and LLD. LLD is included below and
@ -616,6 +606,7 @@ impl Step for DebuggerScripts {
cp_debugger_script("natvis/intrinsic.natvis");
cp_debugger_script("natvis/liballoc.natvis");
cp_debugger_script("natvis/libcore.natvis");
cp_debugger_script("natvis/libstd.natvis");
} else {
cp_debugger_script("debugger_pretty_printers_common.py");
@ -2133,6 +2124,10 @@ impl Step for HashSign {
// Maybe add libLLVM.so to the lib-dir. It will only have been built if
// LLVM tools are linked dynamically.
//
// We add this to both the libdir of the rustc binary itself (for it to load at
// runtime) and also to the target directory so it can find it at link-time.
//
// Note: This function does not yet support Windows, but we also don't support
// linking LLVM tools dynamically on Windows yet.
pub fn maybe_install_llvm_dylib(builder: &Builder<'_>,
@ -2141,13 +2136,19 @@ pub fn maybe_install_llvm_dylib(builder: &Builder<'_>,
let src_libdir = builder
.llvm_out(target)
.join("lib");
let dst_libdir = sysroot.join("lib/rustlib").join(&*target).join("lib");
t!(fs::create_dir_all(&dst_libdir));
let dst_libdir1 = sysroot.join("lib/rustlib").join(&*target).join("lib");
let dst_libdir2 = sysroot.join(builder.sysroot_libdir_relative(Compiler {
stage: 1,
host: target,
}));
t!(fs::create_dir_all(&dst_libdir1));
t!(fs::create_dir_all(&dst_libdir2));
if target.contains("apple-darwin") {
let llvm_dylib_path = src_libdir.join("libLLVM.dylib");
if llvm_dylib_path.exists() {
builder.install(&llvm_dylib_path, &dst_libdir, 0o644);
builder.install(&llvm_dylib_path, &dst_libdir1, 0o644);
builder.install(&llvm_dylib_path, &dst_libdir2, 0o644);
}
return
}
@ -2163,7 +2164,8 @@ pub fn maybe_install_llvm_dylib(builder: &Builder<'_>,
});
builder.install(&llvm_dylib_path, &dst_libdir, 0o644);
builder.install(&llvm_dylib_path, &dst_libdir1, 0o644);
builder.install(&llvm_dylib_path, &dst_libdir2, 0o644);
}
}

View File

@ -433,7 +433,7 @@ impl Step for Std {
builder.info(&format!("Documenting stage{} std ({})", stage, target));
let out = builder.doc_out(target);
t!(fs::create_dir_all(&out));
let compiler = builder.compiler_for(stage, builder.config.build, target);
let compiler = builder.compiler(stage, builder.config.build);
builder.ensure(compile::Std { compiler, target });
let out_dir = builder.stage_out(compiler, Mode::Std)
@ -541,7 +541,7 @@ impl Step for Rustc {
// Build cargo command.
let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "doc");
cargo.env("RUSTDOCFLAGS", "--document-private-items --passes strip-hidden");
compile::rustc_cargo(builder, &mut cargo);
compile::rustc_cargo(builder, &mut cargo, target);
// Only include compiler crates, no dependencies of those, such as `libc`.
cargo.arg("--no-deps");

View File

@ -448,12 +448,12 @@ Arguments:
Flags {
verbose: matches.opt_count("verbose"),
stage: matches.opt_str("stage").map(|j| j.parse().unwrap()),
stage: matches.opt_str("stage").map(|j| j.parse().expect("`stage` should be a number")),
dry_run: matches.opt_present("dry-run"),
on_fail: matches.opt_str("on-fail"),
rustc_error_format: matches.opt_str("error-format"),
keep_stage: matches.opt_strs("keep-stage")
.into_iter().map(|j| j.parse().unwrap())
.into_iter().map(|j| j.parse().expect("`keep-stage` should be a number"))
.collect(),
host: split(&matches.opt_strs("host"))
.into_iter()
@ -464,7 +464,7 @@ Arguments:
.map(|x| INTERNER.intern_string(x))
.collect::<Vec<_>>(),
config: cfg_file,
jobs: matches.opt_str("jobs").map(|j| j.parse().unwrap()),
jobs: matches.opt_str("jobs").map(|j| j.parse().expect("`jobs` should be a number")),
cmd,
incremental: matches.opt_present("incremental"),
exclude: split(&matches.opt_strs("exclude"))

View File

@ -260,7 +260,7 @@ install!((self, builder, _config),
};
Rustc, "src/librustc", true, only_hosts: true, {
builder.ensure(dist::Rustc {
compiler: self.compiler,
compiler: builder.compiler(builder.top_stage, self.target),
});
install_rustc(builder, self.compiler.stage, self.target);
};

View File

@ -169,7 +169,6 @@ mod job {
pub use crate::config::Config;
use crate::flags::Subcommand;
use crate::cache::{Interned, INTERNER};
use crate::toolstate::ToolState;
const LLVM_TOOLS: &[&str] = &[
"llvm-nm", // used to inspect binaries; it shows symbol names, their sizes and visibility
@ -501,6 +500,9 @@ impl Build {
if self.config.jemalloc {
features.push_str("jemalloc");
}
if self.config.llvm_enabled() {
features.push_str(" llvm");
}
features
}
@ -806,11 +808,8 @@ impl Build {
.and_then(|c| c.linker.as_ref()) {
Some(linker)
} else if target != self.config.build &&
!target.contains("msvc") &&
!target.contains("emscripten") &&
!target.contains("wasm32") &&
!target.contains("nvptx") &&
!target.contains("fuchsia") {
util::use_host_linker(&target) &&
!target.contains("msvc") {
Some(self.cc(target))
} else {
None
@ -1073,32 +1072,6 @@ impl Build {
}
}
/// Updates the actual toolstate of a tool.
///
/// The toolstates are saved to the file specified by the key
/// `rust.save-toolstates` in `config.toml`. If unspecified, nothing will be
/// done. The file is updated immediately after this function completes.
pub fn save_toolstate(&self, tool: &str, state: ToolState) {
if let Some(ref path) = self.config.save_toolstates {
if let Some(parent) = path.parent() {
// Ensure the parent directory always exists
t!(std::fs::create_dir_all(parent));
}
let mut file = t!(fs::OpenOptions::new()
.create(true)
.read(true)
.write(true)
.open(path));
let mut current_toolstates: HashMap<Box<str>, ToolState> =
serde_json::from_reader(&mut file).unwrap_or_default();
current_toolstates.insert(tool.into(), state);
t!(file.seek(SeekFrom::Start(0)));
t!(file.set_len(0));
t!(serde_json::to_writer(file, &current_toolstates));
}
}
fn in_tree_crates(&self, root: &str) -> Vec<&Crate> {
let mut ret = Vec::new();
let mut list = vec![INTERNER.intern_str(root)];

View File

@ -294,11 +294,11 @@ fn check_llvm_version(builder: &Builder<'_>, llvm_config: &Path) {
let mut parts = version.split('.').take(2)
.filter_map(|s| s.parse::<u32>().ok());
if let (Some(major), Some(_minor)) = (parts.next(), parts.next()) {
if major >= 6 {
if major >= 7 {
return
}
}
panic!("\n\nbad LLVM version: {}, need >=6.0\n\n", version)
panic!("\n\nbad LLVM version: {}, need >=7.0\n\n", version)
}
fn configure_cmake(builder: &Builder<'_>,

View File

@ -570,7 +570,12 @@ impl Step for Clippy {
let host_libs = builder
.stage_out(compiler, Mode::ToolRustc)
.join(builder.cargo_dir());
let target_libs = builder
.stage_out(compiler, Mode::ToolRustc)
.join(&self.host)
.join(builder.cargo_dir());
cargo.env("HOST_LIBS", host_libs);
cargo.env("TARGET_LIBS", target_libs);
// clippy tests need to find the driver
cargo.env("CLIPPY_DRIVER_PATH", clippy);
@ -1768,7 +1773,7 @@ impl Step for Crate {
}
Mode::Rustc => {
builder.ensure(compile::Rustc { compiler, target });
compile::rustc_cargo(builder, &mut cargo);
compile::rustc_cargo(builder, &mut cargo, target);
}
_ => panic!("can only test libraries"),
};

View File

@ -1,4 +1,28 @@
use serde::{Deserialize, Serialize};
use build_helper::t;
use std::time;
use std::fs;
use std::io::{Seek, SeekFrom};
use std::collections::HashMap;
use crate::builder::{Builder, RunConfig, ShouldRun, Step};
use std::fmt;
use std::process::Command;
use std::path::PathBuf;
use std::env;
// Each cycle is 42 days long (6 weeks); the last week is 35..=42 then.
const BETA_WEEK_START: u64 = 35;
#[cfg(linux)]
const OS: Option<&str> = Some("linux");
#[cfg(windows)]
const OS: Option<&str> = Some("windows");
#[cfg(all(not(linux), not(windows)))]
const OS: Option<&str> = None;
type ToolstateData = HashMap<Box<str>, ToolState>;
#[derive(Copy, Clone, Debug, Deserialize, Serialize, PartialEq, Eq)]
#[serde(rename_all = "kebab-case")]
@ -12,9 +36,392 @@ pub enum ToolState {
BuildFail = 0,
}
impl fmt::Display for ToolState {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "{}", match self {
ToolState::TestFail => "test-fail",
ToolState::TestPass => "test-pass",
ToolState::BuildFail => "build-fail",
})
}
}
impl Default for ToolState {
fn default() -> Self {
// err on the safe side
ToolState::BuildFail
}
}
/// Number of days after the last promotion of beta.
/// Its value is 41 on the Tuesday where "Promote master to beta (T-2)" happens.
/// The Wednesday after this has value 0.
/// We track this value to prevent regressing tools in the last week of the 6-week cycle.
fn days_since_beta_promotion() -> u64 {
let since_epoch = t!(time::SystemTime::UNIX_EPOCH.elapsed());
(since_epoch.as_secs() / 86400 - 20) % 42
}
// These tools must test-pass on the beta/stable channels.
//
// On the nightly channel, their build step must be attempted, but they may not
// be able to build successfully.
static STABLE_TOOLS: &[(&str, &str)] = &[
("book", "src/doc/book"),
("nomicon", "src/doc/nomicon"),
("reference", "src/doc/reference"),
("rust-by-example", "src/doc/rust-by-example"),
("edition-guide", "src/doc/edition-guide"),
("rls", "src/tools/rls"),
("rustfmt", "src/tools/rustfmt"),
("clippy-driver", "src/tools/clippy"),
];
// These tools are permitted to not build on the beta/stable channels.
//
// We do require that we checked whether they build or not on the tools builder,
// though, as otherwise we will be unable to file an issue if they start
// failing.
static NIGHTLY_TOOLS: &[(&str, &str)] = &[
("miri", "src/tools/miri"),
("embedded-book", "src/doc/embedded-book"),
("rustc-guide", "src/doc/rustc-guide"),
];
fn print_error(tool: &str, submodule: &str) {
eprintln!("");
eprintln!("We detected that this PR updated '{}', but its tests failed.", tool);
eprintln!("");
eprintln!("If you do intend to update '{}', please check the error messages above and", tool);
eprintln!("commit another update.");
eprintln!("");
eprintln!("If you do NOT intend to update '{}', please ensure you did not accidentally", tool);
eprintln!("change the submodule at '{}'. You may ask your reviewer for the", submodule);
eprintln!("proper steps.");
std::process::exit(3);
}
fn check_changed_files(toolstates: &HashMap<Box<str>, ToolState>) {
// Changed files
let output = std::process::Command::new("git")
.arg("diff")
.arg("--name-status")
.arg("HEAD")
.arg("HEAD^")
.output();
let output = match output {
Ok(o) => o,
Err(e) => {
eprintln!("Failed to get changed files: {:?}", e);
std::process::exit(1);
}
};
let output = t!(String::from_utf8(output.stdout));
for (tool, submodule) in STABLE_TOOLS.iter().chain(NIGHTLY_TOOLS.iter()) {
let changed = output.lines().any(|l| {
l.starts_with("M") && l.ends_with(submodule)
});
eprintln!("Verifying status of {}...", tool);
if !changed {
continue;
}
eprintln!("This PR updated '{}', verifying if status is 'test-pass'...", submodule);
if toolstates[*tool] != ToolState::TestPass {
print_error(tool, submodule);
}
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct ToolStateCheck;
impl Step for ToolStateCheck {
type Output = ();
/// Runs the `linkchecker` tool as compiled in `stage` by the `host` compiler.
///
/// This tool in `src/tools` will verify the validity of all our links in the
/// documentation to ensure we don't have a bunch of dead ones.
fn run(self, builder: &Builder<'_>) {
if builder.config.dry_run {
return;
}
let days_since_beta_promotion = days_since_beta_promotion();
let in_beta_week = days_since_beta_promotion >= BETA_WEEK_START;
let is_nightly = !(builder.config.channel == "beta" || builder.config.channel == "stable");
let toolstates = builder.toolstates();
let mut did_error = false;
for (tool, _) in STABLE_TOOLS.iter().chain(NIGHTLY_TOOLS.iter()) {
if !toolstates.contains_key(*tool) {
did_error = true;
eprintln!("error: Tool `{}` was not recorded in tool state.", tool);
}
}
if did_error {
std::process::exit(1);
}
check_changed_files(&toolstates);
for (tool, _) in STABLE_TOOLS.iter() {
let state = toolstates[*tool];
if state != ToolState::TestPass {
if !is_nightly {
did_error = true;
eprintln!("error: Tool `{}` should be test-pass but is {}", tool, state);
} else if in_beta_week {
did_error = true;
eprintln!("error: Tool `{}` should be test-pass but is {} during beta week.",
tool, state);
}
}
}
if did_error {
std::process::exit(1);
}
if builder.config.channel == "nightly" && env::var_os("TOOLSTATE_PUBLISH").is_some() {
commit_toolstate_change(&toolstates, in_beta_week);
}
}
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("check-tools")
}
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(ToolStateCheck);
}
}
impl Builder<'_> {
fn toolstates(&self) -> HashMap<Box<str>, ToolState> {
if let Some(ref path) = self.config.save_toolstates {
if let Some(parent) = path.parent() {
// Ensure the parent directory always exists
t!(std::fs::create_dir_all(parent));
}
let mut file = t!(fs::OpenOptions::new()
.create(true)
.write(true)
.read(true)
.open(path));
serde_json::from_reader(&mut file).unwrap_or_default()
} else {
Default::default()
}
}
/// Updates the actual toolstate of a tool.
///
/// The toolstates are saved to the file specified by the key
/// `rust.save-toolstates` in `config.toml`. If unspecified, nothing will be
/// done. The file is updated immediately after this function completes.
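///
/// The file is a flat JSON map from tool name to state, for example
/// `{"miri":"test-fail","clippy-driver":"test-pass"}` (an illustrative
/// snapshot; the actual set of tools and states varies per build).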
pub fn save_toolstate(&self, tool: &str, state: ToolState) {
if let Some(ref path) = self.config.save_toolstates {
if let Some(parent) = path.parent() {
// Ensure the parent directory always exists
t!(std::fs::create_dir_all(parent));
}
let mut file = t!(fs::OpenOptions::new()
.create(true)
.read(true)
.write(true)
.open(path));
let mut current_toolstates: HashMap<Box<str>, ToolState> =
serde_json::from_reader(&mut file).unwrap_or_default();
current_toolstates.insert(tool.into(), state);
t!(file.seek(SeekFrom::Start(0)));
t!(file.set_len(0));
t!(serde_json::to_writer(file, &current_toolstates));
}
}
}
/// Pushes a toolstate change to the `rust-toolstate` repository.
///
/// The function relies on a GitHub bot user, which should have a Personal access
/// token defined in the environment variable $TOOLSTATE_REPO_ACCESS_TOKEN. If for
/// some reason you need to change the token, please update the Azure Pipelines
/// variable group.
///
/// 1. Generate a new Personal access token:
///
/// * Login to the bot account, and go to Settings -> Developer settings ->
/// Personal access tokens
/// * Click "Generate new token"
/// * Enable the "public_repo" permission, then click "Generate token"
/// * Copy the generated token (should be a 40-digit hexadecimal number).
/// Save it somewhere secure, as the token would be gone once you leave
/// the page.
///
/// 2. Update the variable group in Azure Pipelines
///
/// * Ping a member of the infrastructure team to do this.
///
/// 3. Replace the email address below if the bot account identity is changed
///
/// * See <https://help.github.com/articles/about-commit-email-addresses/>
/// if a private email by GitHub is wanted.
fn commit_toolstate_change(
current_toolstate: &ToolstateData,
in_beta_week: bool,
) {
fn git_config(key: &str, value: &str) {
let status = Command::new("git").arg("config").arg("--global").arg(key).arg(value).status();
let success = match status {
Ok(s) => s.success(),
Err(_) => false,
};
if !success {
panic!("git config key={} value={} successful (status: {:?})", key, value, status);
}
}
// If changing anything here, then please check that src/ci/publish_toolstate.sh is up to date
// as well.
git_config("user.email", "7378925+rust-toolstate-update@users.noreply.github.com");
git_config("user.name", "Rust Toolstate Update");
git_config("credential.helper", "store");
let credential = format!(
"https://{}:x-oauth-basic@github.com\n",
t!(env::var("TOOLSTATE_REPO_ACCESS_TOKEN")),
);
let git_credential_path = PathBuf::from(t!(env::var("HOME"))).join(".git-credentials");
t!(fs::write(&git_credential_path, credential));
let status = Command::new("git").arg("clone")
.arg("--depth=1")
.arg(t!(env::var("TOOLSTATE_REPO")))
.status();
let success = match status {
Ok(s) => s.success(),
Err(_) => false,
};
if !success {
panic!("git clone successful (status: {:?})", status);
}
let old_toolstate = t!(fs::read("rust-toolstate/_data/latest.json"));
let old_toolstate: Vec<RepoState> = t!(serde_json::from_slice(&old_toolstate));
let message = format!("({} CI update)", OS.expect("linux/windows only"));
let mut success = false;
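// Try to publish the update, retrying up to five times; if the push races
// with another builder, re-sync with origin/master and reapply the change.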
for _ in 1..=5 {
// Update the toolstate results (the new commit-to-toolstate mapping) in the toolstate repo.
change_toolstate(&current_toolstate, &old_toolstate, in_beta_week);
// `git commit` failing means nothing to commit.
let status = t!(Command::new("git")
.current_dir("rust-toolstate")
.arg("commit")
.arg("-a")
.arg("-m")
.arg(&message)
.status());
if !status.success() {
success = true;
break;
}
let status = t!(Command::new("git")
.current_dir("rust-toolstate")
.arg("push")
.arg("origin")
.arg("master")
.status());
// If we successfully push, exit.
if status.success() {
success = true;
break;
}
eprintln!("Sleeping for 3 seconds before retrying push");
std::thread::sleep(std::time::Duration::from_secs(3));
let status = t!(Command::new("git")
.current_dir("rust-toolstate")
.arg("fetch")
.arg("origin")
.arg("master")
.status());
assert!(status.success());
let status = t!(Command::new("git")
.current_dir("rust-toolstate")
.arg("reset")
.arg("--hard")
.arg("origin/master")
.status());
assert!(status.success());
}
if !success {
panic!("Failed to update toolstate repository with new data");
}
}
fn change_toolstate(
current_toolstate: &ToolstateData,
old_toolstate: &[RepoState],
in_beta_week: bool,
) {
let mut regressed = false;
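// Compare the freshly measured toolstate against the state recorded in
// `_data/latest.json` for this OS and note any tool whose state got worse.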
for repo_state in old_toolstate {
let tool = &repo_state.tool;
let state = if cfg!(target_os = "linux") {
&repo_state.linux
} else if cfg!(windows) {
&repo_state.windows
} else {
unimplemented!()
};
let new_state = current_toolstate[tool.as_str()];
if new_state != *state {
eprintln!("The state of `{}` has changed from `{}` to `{}`", tool, state, new_state);
if (new_state as u8) < (*state as u8) {
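// Only regressions in the stable tools count as blocking; the
// nightly-only tools (see NIGHTLY_TOOLS) are reported but tolerated.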
if !["rustc-guide", "miri", "embedded-book"].contains(&tool.as_str()) {
regressed = true;
}
}
}
}
if regressed && in_beta_week {
std::process::exit(1);
}
let commit = t!(std::process::Command::new("git")
.arg("rev-parse")
.arg("HEAD")
.output());
let commit = t!(String::from_utf8(commit.stdout));
let toolstate_serialized = t!(serde_json::to_string(&current_toolstate));
let history_path = format!("rust-toolstate/history/{}.tsv", OS.expect("linux/windows only"));
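// `history/<os>.tsv` starts with a header line followed by one
// `<commit>\t<json>` row per CI run; insert the new row right below the
// header, trimming the trailing newline `git rev-parse` leaves on the hash.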
let mut file = t!(fs::read_to_string(&history_path));
let end_of_first_line = file.find('\n').unwrap();
file.insert_str(end_of_first_line, &format!("\n{}\t{}", commit.trim(), toolstate_serialized));
t!(fs::write(&history_path, file));
}
#[derive(Debug, Serialize, Deserialize)]
struct RepoState {
tool: String,
windows: ToolState,
linux: ToolState,
commit: String,
datetime: String,
}

View File

@ -15,6 +15,7 @@ use build_helper::t;
use crate::config::Config;
use crate::builder::Builder;
use crate::cache::Interned;
/// Returns the `name` as the filename of a static library for `target`.
pub fn staticlib(name: &str, target: &str) -> String {
@ -262,6 +263,8 @@ pub enum CiEnv {
None,
/// The Azure Pipelines environment, for Linux (including Docker), Windows, and macOS builds.
AzurePipelines,
/// The GitHub Actions environment, for Linux (including Docker), Windows and macOS builds.
GitHubActions,
}
impl CiEnv {
@ -269,6 +272,8 @@ impl CiEnv {
pub fn current() -> CiEnv {
if env::var("TF_BUILD").ok().map_or(false, |e| &*e == "True") {
CiEnv::AzurePipelines
} else if env::var("GITHUB_ACTIONS").ok().map_or(false, |e| &*e == "true") {
CiEnv::GitHubActions
} else {
CiEnv::None
}
@ -302,3 +307,15 @@ pub fn forcing_clang_based_tests() -> bool {
false
}
}
pub fn use_host_linker(target: &Interned<String>) -> bool {
// FIXME: this information should be gotten by checking the linker flavor
// of the rustc target
!(
target.contains("emscripten") ||
target.contains("wasm32") ||
target.contains("nvptx") ||
target.contains("fortanix") ||
target.contains("fuchsia")
)
}

View File

@ -18,137 +18,47 @@ jobs:
- template: steps/run.yml
strategy:
matrix:
x86_64-gnu-llvm-6.0:
IMAGE: x86_64-gnu-llvm-6.0
x86_64-gnu-llvm-7:
RUST_BACKTRACE: 1
dist-x86_64-linux:
IMAGE: dist-x86_64-linux
DEPLOY: 1
# "alternate" deployments, these are "nightlies" but have LLVM assertions
# turned on, they're deployed to a different location primarily for
# additional testing.
dist-x86_64-linux: {}
dist-x86_64-linux-alt:
IMAGE: dist-x86_64-linux
DEPLOY_ALT: 1
# Linux builders, remaining docker images
arm-android:
IMAGE: arm-android
armhf-gnu:
IMAGE: armhf-gnu
dist-various-1:
IMAGE: dist-various-1
DEPLOY: 1
dist-various-2:
IMAGE: dist-various-2
DEPLOY: 1
dist-aarch64-linux:
IMAGE: dist-aarch64-linux
DEPLOY: 1
dist-android:
IMAGE: dist-android
DEPLOY: 1
dist-arm-linux:
IMAGE: dist-arm-linux
DEPLOY: 1
dist-armhf-linux:
IMAGE: dist-armhf-linux
DEPLOY: 1
dist-armv7-linux:
IMAGE: dist-armv7-linux
DEPLOY: 1
dist-i586-gnu-i586-i686-musl:
IMAGE: dist-i586-gnu-i586-i686-musl
DEPLOY: 1
dist-i686-freebsd:
IMAGE: dist-i686-freebsd
DEPLOY: 1
dist-i686-linux:
IMAGE: dist-i686-linux
DEPLOY: 1
dist-mips-linux:
IMAGE: dist-mips-linux
DEPLOY: 1
dist-mips64-linux:
IMAGE: dist-mips64-linux
DEPLOY: 1
dist-mips64el-linux:
IMAGE: dist-mips64el-linux
DEPLOY: 1
dist-mipsel-linux:
IMAGE: dist-mipsel-linux
DEPLOY: 1
dist-powerpc-linux:
IMAGE: dist-powerpc-linux
DEPLOY: 1
dist-powerpc64-linux:
IMAGE: dist-powerpc64-linux
DEPLOY: 1
dist-powerpc64le-linux:
IMAGE: dist-powerpc64le-linux
DEPLOY: 1
dist-s390x-linux:
IMAGE: dist-s390x-linux
DEPLOY: 1
dist-x86_64-freebsd:
IMAGE: dist-x86_64-freebsd
DEPLOY: 1
dist-x86_64-musl:
IMAGE: dist-x86_64-musl
DEPLOY: 1
dist-x86_64-netbsd:
IMAGE: dist-x86_64-netbsd
DEPLOY: 1
i686-gnu:
IMAGE: i686-gnu
i686-gnu-nopt:
IMAGE: i686-gnu-nopt
test-various:
IMAGE: test-various
wasm32:
IMAGE: wasm32
x86_64-gnu:
IMAGE: x86_64-gnu
x86_64-gnu-full-bootstrap:
IMAGE: x86_64-gnu-full-bootstrap
x86_64-gnu-aux:
IMAGE: x86_64-gnu-aux
arm-android: {}
armhf-gnu: {}
dist-various-1: {}
dist-various-2: {}
dist-aarch64-linux: {}
dist-android: {}
dist-arm-linux: {}
dist-armhf-linux: {}
dist-armv7-linux: {}
dist-i586-gnu-i586-i686-musl: {}
dist-i686-freebsd: {}
dist-i686-linux: {}
dist-mips-linux: {}
dist-mips64-linux: {}
dist-mips64el-linux: {}
dist-mipsel-linux: {}
dist-powerpc-linux: {}
dist-powerpc64-linux: {}
dist-powerpc64le-linux: {}
dist-s390x-linux: {}
dist-x86_64-freebsd: {}
dist-x86_64-musl: {}
dist-x86_64-netbsd: {}
i686-gnu: {}
i686-gnu-nopt: {}
test-various: {}
wasm32: {}
x86_64-gnu: {}
x86_64-gnu-full-bootstrap: {}
x86_64-gnu-aux: {}
x86_64-gnu-tools:
IMAGE: x86_64-gnu-tools
DEPLOY_TOOLSTATES_JSON: toolstates-linux.json
x86_64-gnu-debug:
IMAGE: x86_64-gnu-debug
x86_64-gnu-nopt:
IMAGE: x86_64-gnu-nopt
x86_64-gnu-distcheck:
IMAGE: x86_64-gnu-distcheck
mingw-check:
IMAGE: mingw-check
x86_64-gnu-debug: {}
x86_64-gnu-nopt: {}
x86_64-gnu-distcheck: {}
mingw-check: {}
- job: macOS
timeoutInMinutes: 600
@ -176,7 +86,6 @@ jobs:
dist-x86_64-apple:
SCRIPT: ./x.py dist
RUST_CONFIGURE_ARGS: --target=aarch64-apple-ios,armv7-apple-ios,armv7s-apple-ios,i386-apple-ios,x86_64-apple-ios --enable-full-tools --enable-sanitizers --enable-profiler --set rust.jemalloc
DEPLOY: 1
RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
MACOSX_DEPLOYMENT_TARGET: 10.7
NO_LLVM_ASSERTIONS: 1
@ -186,7 +95,6 @@ jobs:
dist-x86_64-apple-alt:
SCRIPT: ./x.py dist
RUST_CONFIGURE_ARGS: --enable-extended --enable-profiler --set rust.jemalloc
DEPLOY_ALT: 1
RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
MACOSX_DEPLOYMENT_TARGET: 10.7
NO_LLVM_ASSERTIONS: 1
@ -204,7 +112,6 @@ jobs:
dist-i686-apple:
SCRIPT: ./x.py dist
RUST_CONFIGURE_ARGS: --build=i686-apple-darwin --enable-full-tools --enable-profiler --set rust.jemalloc
DEPLOY: 1
RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
MACOSX_DEPLOYMENT_TARGET: 10.7
NO_LLVM_ASSERTIONS: 1
@ -223,25 +130,21 @@ jobs:
matrix:
# 32/64 bit MSVC tests
x86_64-msvc-1:
MSYS_BITS: 64
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-profiler
SCRIPT: make ci-subset-1
# FIXME(#59637)
NO_DEBUG_ASSERTIONS: 1
NO_LLVM_ASSERTIONS: 1
x86_64-msvc-2:
MSYS_BITS: 64
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-profiler
SCRIPT: make ci-subset-2
i686-msvc-1:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
SCRIPT: make ci-subset-1
# FIXME(#59637)
NO_DEBUG_ASSERTIONS: 1
NO_LLVM_ASSERTIONS: 1
i686-msvc-2:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
SCRIPT: make ci-subset-2
# FIXME(#59637)
@ -249,11 +152,9 @@ jobs:
NO_LLVM_ASSERTIONS: 1
# MSVC aux tests
x86_64-msvc-aux:
MSYS_BITS: 64
RUST_CHECK_TARGET: check-aux EXCLUDE_CARGO=1
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc
x86_64-msvc-cargo:
MSYS_BITS: 64
SCRIPT: python x.py test src/tools/cargotest src/tools/cargo
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc
VCVARS_BAT: vcvars64.bat
@ -262,10 +163,8 @@ jobs:
NO_LLVM_ASSERTIONS: 1
# MSVC tools tests
x86_64-msvc-tools:
MSYS_BITS: 64
SCRIPT: src/ci/docker/x86_64-gnu-tools/checktools.sh x.py /tmp/toolstate/toolstates.json windows
SCRIPT: src/ci/docker/x86_64-gnu-tools/checktools.sh x.py
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --save-toolstates=/tmp/toolstate/toolstates.json
DEPLOY_TOOLSTATES_JSON: toolstates-windows.json
# 32/64-bit MinGW builds.
#
@ -281,83 +180,57 @@ jobs:
# came from the mingw-w64 SourceForge download site. Unfortunately
# SourceForge is notoriously flaky, so we mirror it on our own infrastructure.
i686-mingw-1:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
SCRIPT: make ci-mingw-subset-1
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
MINGW_DIR: mingw32
CUSTOM_MINGW: 1
# FIXME(#59637)
NO_DEBUG_ASSERTIONS: 1
NO_LLVM_ASSERTIONS: 1
i686-mingw-2:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
SCRIPT: make ci-mingw-subset-2
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
MINGW_DIR: mingw32
CUSTOM_MINGW: 1
x86_64-mingw-1:
MSYS_BITS: 64
SCRIPT: make ci-mingw-subset-1
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
MINGW_DIR: mingw64
CUSTOM_MINGW: 1
# FIXME(#59637)
NO_DEBUG_ASSERTIONS: 1
NO_LLVM_ASSERTIONS: 1
x86_64-mingw-2:
MSYS_BITS: 64
SCRIPT: make ci-mingw-subset-2
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
MINGW_DIR: mingw64
CUSTOM_MINGW: 1
# 32/64 bit MSVC and GNU deployment
dist-x86_64-msvc:
MSYS_BITS: 64
RUST_CONFIGURE_ARGS: >
RUST_CONFIGURE_ARGS: >-
--build=x86_64-pc-windows-msvc
--target=x86_64-pc-windows-msvc,aarch64-pc-windows-msvc
--enable-full-tools
--enable-profiler
SCRIPT: python x.py dist
DIST_REQUIRE_ALL_TOOLS: 1
DEPLOY: 1
dist-i686-msvc:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: >
RUST_CONFIGURE_ARGS: >-
--build=i686-pc-windows-msvc
--target=i586-pc-windows-msvc
--enable-full-tools
--enable-profiler
SCRIPT: python x.py dist
DIST_REQUIRE_ALL_TOOLS: 1
DEPLOY: 1
dist-i686-mingw:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu --enable-full-tools --enable-profiler
SCRIPT: python x.py dist
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
MINGW_DIR: mingw32
CUSTOM_MINGW: 1
DIST_REQUIRE_ALL_TOOLS: 1
DEPLOY: 1
dist-x86_64-mingw:
MSYS_BITS: 64
SCRIPT: python x.py dist
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu --enable-full-tools --enable-profiler
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
MINGW_DIR: mingw64
CUSTOM_MINGW: 1
DIST_REQUIRE_ALL_TOOLS: 1
DEPLOY: 1
# "alternate" deployment, see .travis.yml for more info
dist-x86_64-msvc-alt:
MSYS_BITS: 64
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-extended --enable-profiler
SCRIPT: python x.py dist
DEPLOY_ALT: 1

View File

@ -16,10 +16,7 @@ steps:
- checkout: self
fetchDepth: 2
- script: |
export MESSAGE_FILE=$(mktemp -t msg.XXXXXX)
. src/ci/docker/x86_64-gnu-tools/repo.sh
commit_toolstate_change "$MESSAGE_FILE" "$BUILD_SOURCESDIRECTORY/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" "$(git log --format=%s -n1 HEAD)" "$MESSAGE_FILE" "$TOOLSTATE_REPO_ACCESS_TOKEN"
- script: src/ci/publish_toolstate.sh
displayName: Publish toolstate
env:
TOOLSTATE_REPO_ACCESS_TOKEN: $(TOOLSTATE_REPO_ACCESS_TOKEN)

View File

@ -18,10 +18,7 @@ jobs:
- template: steps/run.yml
strategy:
matrix:
x86_64-gnu-llvm-6.0:
IMAGE: x86_64-gnu-llvm-6.0
mingw-check:
IMAGE: mingw-check
x86_64-gnu-llvm-7: {}
mingw-check: {}
x86_64-gnu-tools:
IMAGE: x86_64-gnu-tools
CI_ONLY_WHEN_SUBMODULES_CHANGED: 1

View File

@ -8,6 +8,13 @@
steps:
# Configure our CI_JOB_NAME variable which log analyzers can use for the main
# step to see what's going on.
- bash: |
builder=$(echo $AGENT_JOBNAME | cut -d ' ' -f 2)
echo "##vso[task.setvariable variable=CI_JOB_NAME]$builder"
displayName: Configure Job Name
# Disable automatic line ending conversion, which is enabled by default on
# Azure's Windows image. Having the conversion enabled caused regressions both
# in our test suite (it broke miri tests) and in the ecosystem, since we
@ -21,51 +28,42 @@ steps:
- checkout: self
fetchDepth: 2
- bash: src/ci/scripts/setup-environment.sh
displayName: Setup environment
- bash: src/ci/scripts/clean-disk.sh
displayName: Clean disk
- bash: src/ci/scripts/should-skip-this.sh
displayName: Decide whether to run this job
# Spawn a background process to collect CPU usage statistics which we'll upload
# at the end of the build. See the comments in the script here for more
# information.
- bash: python src/ci/cpu-usage-over-time.py &> cpu-usage.csv &
displayName: "Collect CPU-usage statistics in the background"
- bash: src/ci/scripts/collect-cpu-stats.sh
displayName: Collect CPU-usage statistics in the background
- bash: src/ci/scripts/dump-environment.sh
displayName: Show the current environment
- bash: src/ci/scripts/install-sccache.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Install sccache
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/install-clang.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Install clang
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/switch-xcode.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Switch to Xcode 9.3
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/install-wix.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Install wix
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/install-innosetup.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Install InnoSetup
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/windows-symlink-build-dir.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Ensure the build happens on C:\ instead of D:\
condition: and(succeeded(), not(variables.SKIP_JOB))
@ -74,35 +72,22 @@ steps:
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/install-msys2.sh
env:
AGENT_OS: $(Agent.OS)
SYSTEM_WORKFOLDER: $(System.Workfolder)
displayName: Install msys2
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/install-msys2-packages.sh
env:
AGENT_OS: $(Agent.OS)
SYSTEM_WORKFOLDER: $(System.Workfolder)
displayName: Install msys2 packages
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/install-mingw.sh
env:
AGENT_OS: $(Agent.OS)
SYSTEM_WORKFOLDER: $(System.Workfolder)
displayName: Install MinGW
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/install-ninja.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Install ninja
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/enable-docker-ipv6.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Enable IPv6 on Docker
condition: and(succeeded(), not(variables.SKIP_JOB))
@ -116,67 +101,22 @@ steps:
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/checkout-submodules.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Checkout submodules
condition: and(succeeded(), not(variables.SKIP_JOB))
- bash: src/ci/scripts/verify-line-endings.sh
env:
AGENT_OS: $(Agent.OS)
displayName: Verify line endings
condition: and(succeeded(), not(variables.SKIP_JOB))
# Ensure the `aws` CLI is installed so we can deploy later on, cache docker
# images, etc.
- bash: src/ci/scripts/install-awscli.sh
env:
AGENT_OS: $(Agent.OS)
condition: and(succeeded(), not(variables.SKIP_JOB))
displayName: Install awscli
# Configure our CI_JOB_NAME variable which log analyzers can use for the main
# step to see what's going on.
- bash: |
builder=$(echo $AGENT_JOBNAME | cut -d ' ' -f 2)
echo "##vso[task.setvariable variable=CI_JOB_NAME]$builder"
displayName: Configure Job Name
# As a quick smoke check on the otherwise very fast mingw-check linux builder
# check our own internal scripts.
- bash: |
set -e
git clone --depth=1 https://github.com/rust-lang-nursery/rust-toolstate.git
cd rust-toolstate
python2.7 "$BUILD_SOURCESDIRECTORY/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" "$(git log --format=%s -n1 HEAD)" "" ""
# Only check maintainers if this build is supposed to publish toolstate.
# Builds that are not supposed to publish don't have the access token.
if [ -n "${TOOLSTATE_PUBLISH+is_set}" ]; then
TOOLSTATE_VALIDATE_MAINTAINERS_REPO=rust-lang/rust python2.7 "${BUILD_SOURCESDIRECTORY}/src/tools/publish_toolstate.py"
fi
cd ..
rm -rf rust-toolstate
env:
TOOLSTATE_REPO_ACCESS_TOKEN: $(TOOLSTATE_REPO_ACCESS_TOKEN)
condition: and(succeeded(), not(variables.SKIP_JOB), eq(variables['IMAGE'], 'mingw-check'))
displayName: Verify the publish_toolstate script works
- bash: |
set -e
# Remove any preexisting rustup installation since it can interfere
# with the cargotest step and its auto-detection of things like Clippy in
# the environment
rustup self uninstall -y || true
if [ "$IMAGE" = "" ]; then
src/ci/run.sh
else
src/ci/docker/run.sh $IMAGE
fi
#timeoutInMinutes: 180
- bash: src/ci/scripts/run-build-from-ci.sh
timeoutInMinutes: 600
env:
CI: true
SRC: .
AWS_ACCESS_KEY_ID: $(SCCACHE_AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(SCCACHE_AWS_SECRET_ACCESS_KEY)
TOOLSTATE_REPO_ACCESS_TOKEN: $(TOOLSTATE_REPO_ACCESS_TOKEN)

View File

@ -14,13 +14,9 @@ jobs:
- template: steps/run.yml
strategy:
matrix:
dist-x86_64-linux:
IMAGE: dist-x86_64-linux
DEPLOY: 1
dist-x86_64-linux: {}
dist-x86_64-linux-alt:
IMAGE: dist-x86_64-linux
DEPLOY_ALT: 1
# The macOS and Windows builds here are currently disabled due to them not being
# overly necessary on `try` builds. We also don't actually have anything that
@ -72,7 +68,6 @@ jobs:
# DEPLOY: 1
#
# dist-x86_64-msvc-alt:
# MSYS_BITS: 64
# RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-extended --enable-profiler
# SCRIPT: python x.py dist
# DEPLOY_ALT: 1

View File

@ -16,6 +16,13 @@ for example:
Images will output artifacts in an `obj` dir at the root of a repository.
**NOTE**: Re-using the same `obj` dir with different docker images with
the same target triple (e.g. `dist-x86_64-linux` and `dist-various-1`)
may result in strange linker errors, due to shared library versions differing between platforms.
If you encounter any issues when using multiple Docker images, try deleting your `obj` directory
before running your command.
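For instance (a hypothetical sequence, not something this change adds), you
could clear the directory before switching to a different image:

```sh
rm -rf obj
./src/ci/docker/run.sh dist-x86_64-linux
```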
## Filesystem layout
- Each directory, excluding `scripts` and `disabled`, corresponds to a docker image

View File

@ -72,7 +72,7 @@ RUN arm-linux-gnueabihf-gcc addentropy.c -o rootfs/addentropy -static
# TODO: What is this?!
# Source of the file: https://github.com/vfdev-5/qemu-rpi2-vexpress/raw/master/vexpress-v2p-ca15-tc1.dtb
RUN curl -O https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/vexpress-v2p-ca15-tc1.dtb
RUN curl -O https://ci-mirrors.rust-lang.org/rustc/vexpress-v2p-ca15-tc1.dtb
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh

View File

@ -1,7 +1,7 @@
set -ex
# Mirrored from https://github.com/crosstool-ng/crosstool-ng/archive/crosstool-ng-1.24.0.tar.gz
url="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/crosstool-ng-1.24.0.tar.gz"
url="https://ci-mirrors.rust-lang.org/rustc/crosstool-ng-1.24.0.tar.gz"
curl -Lf $url | tar xzf -
cd crosstool-ng-crosstool-ng-1.24.0
./bootstrap

View File

@ -69,7 +69,7 @@ RUN ./build-python.sh
# Now build LLVM+Clang 7, afterwards configuring further compilations to use the
# clang/clang++ compilers.
COPY dist-x86_64-linux/build-clang.sh /tmp/
COPY dist-x86_64-linux/build-clang.sh dist-x86_64-linux/llvm-project-centos.patch /tmp/
RUN ./build-clang.sh
ENV CC=clang CXX=clang++

View File

@ -135,6 +135,9 @@ ENV TARGETS=$TARGETS,armv7r-none-eabi
ENV TARGETS=$TARGETS,armv7r-none-eabihf
ENV TARGETS=$TARGETS,thumbv7neon-unknown-linux-gnueabihf
# riscv targets currently do not need a C compiler, as compiler_builtins
# doesn't currently have it enabled, and the riscv gcc compiler is not
# installed.
ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
CC_mips_unknown_linux_musl=mips-openwrt-linux-gcc \
CC_mips64el_unknown_linux_muslabi64=mips64el-linux-gnuabi64-gcc \
@ -143,7 +146,12 @@ ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
CC_x86_64_unknown_redox=x86_64-unknown-redox-gcc \
CC_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-gcc \
AR_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-ar \
CXX_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-g++
CXX_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-g++ \
CC_riscv32i_unknown_none_elf=false \
CC_riscv32imc_unknown_none_elf=false \
CC_riscv32imac_unknown_none_elf=false \
CC_riscv64imac_unknown_none_elf=false \
CC_riscv64gc_unknown_none_elf=false
ENV RUST_CONFIGURE_ARGS \
--musl-root-armv5te=/musl-armv5te \

View File

@ -5,7 +5,7 @@ mkdir /usr/local/mips-linux-musl
# originally from
# https://downloads.openwrt.org/snapshots/trunk/ar71xx/generic/
# OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2
URL="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc"
URL="https://ci-mirrors.rust-lang.org/rustc"
FILE="OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mips-linux-musl --strip-components=2

View File

@ -5,7 +5,7 @@ mkdir /usr/local/mipsel-linux-musl
# Note that this originally came from:
# https://downloads.openwrt.org/snapshots/trunk/malta/generic/
# OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2
URL="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc"
URL="https://ci-mirrors.rust-lang.org/rustc"
FILE="OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mipsel-linux-musl --strip-components=2

View File

@ -4,17 +4,17 @@
set -ex
# Originally from https://releases.llvm.org/8.0.0/clang+llvm-8.0.0-x86_64-linux-gnu-ubuntu-14.04.tar.xz
curl https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/clang%2Bllvm-8.0.0-x86_64-linux-gnu-ubuntu-14.04.tar.xz | \
# Originally from https://releases.llvm.org/9.0.0/clang+llvm-9.0.0-x86_64-linux-gnu-ubuntu-14.04.tar.xz
curl https://ci-mirrors.rust-lang.org/rustc/clang%2Bllvm-9.0.0-x86_64-linux-gnu-ubuntu-14.04.tar.xz | \
tar xJf -
export PATH=`pwd`/clang+llvm-8.0.0-x86_64-linux-gnu-ubuntu-14.04/bin:$PATH
export PATH=`pwd`/clang+llvm-9.0.0-x86_64-linux-gnu-ubuntu-14.04/bin:$PATH
git clone https://github.com/CraneStation/wasi-sysroot
git clone https://github.com/CraneStation/wasi-libc
cd wasi-sysroot
git reset --hard e5f14be38362f1ab83302895a6e74b2ffd0e2302
cd wasi-libc
git reset --hard f645f498dfbbbc00a7a97874d33082d3605c3f21
make -j$(nproc) INSTALL_DIR=/wasm32-wasi install
cd ..
rm -rf reference-sysroot-wasi
rm -rf wasi-libc
rm -rf clang+llvm*

View File

@ -69,7 +69,7 @@ RUN ./build-python.sh
# Now build LLVM+Clang 7, afterwards configuring further compilations to use the
# clang/clang++ compilers.
COPY dist-x86_64-linux/build-clang.sh /tmp/
COPY dist-x86_64-linux/build-clang.sh dist-x86_64-linux/llvm-project-centos.patch /tmp/
RUN ./build-clang.sh
ENV CC=clang CXX=clang++

View File

@ -4,7 +4,7 @@ set -ex
source shared.sh
LLVM=llvmorg-8.0.0-rc2
LLVM=llvmorg-9.0.0
mkdir llvm-project
cd llvm-project
@ -12,6 +12,9 @@ cd llvm-project
curl -L https://github.com/llvm/llvm-project/archive/$LLVM.tar.gz | \
tar xzf - --strip-components=1
yum install -y patch
patch -Np1 < ../llvm-project-centos.patch
mkdir clang-build
cd clang-build

View File

@ -5,6 +5,9 @@ source shared.sh
VERSION=7.66.0
# This needs to be downloaded directly from S3, it can't go through the CDN.
# That's because the CDN is backed by CloudFront, which requires SNI and TLSv1
# (without paying an absurd amount of money).
curl https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/curl-$VERSION.tar.xz \
| xz --decompress \
| tar xf -

View File

@ -4,6 +4,10 @@ set -ex
source shared.sh
VERSION=1.0.2k
# This needs to be downloaded directly from S3, it can't go through the CDN.
# That's because the CDN is backed by CloudFront, which requires SNI and TLSv1
# (without paying an absurd amount of money).
URL=https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/openssl-$VERSION.tar.gz
curl $URL | tar xzf -

View File

@ -25,7 +25,7 @@ cd netbsd
mkdir -p /x-tools/x86_64-unknown-netbsd/sysroot
URL=https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
URL=https://ci-mirrors.rust-lang.org/rustc
# Originally from ftp://ftp.netbsd.org/pub/NetBSD/NetBSD-$BSD/source/sets/*.tgz
curl $URL/2018-03-01-netbsd-src.tgz | tar xzf -

View File

@ -19,7 +19,10 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
COPY mingw-check/validate-toolstate.sh /scripts/
ENV RUN_CHECK_WITH_PARALLEL_QUERIES 1
ENV SCRIPT python2.7 ../x.py check --target=i686-pc-windows-gnu --host=i686-pc-windows-gnu && \
python2.7 ../x.py build --stage 0 src/tools/build-manifest && \
python2.7 ../x.py test --stage 0 src/tools/compiletest
python2.7 ../x.py test --stage 0 src/tools/compiletest && \
/scripts/validate-toolstate.sh

View File

@ -0,0 +1,19 @@
#!/bin/bash
# A quick smoke test to make sure publish_toolstate.py works.
set -euo pipefail
IFS=$'\n\t'
rm -rf rust-toolstate
git clone --depth=1 https://github.com/rust-lang-nursery/rust-toolstate.git
cd rust-toolstate
python2.7 "../../src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" \
"$(git log --format=%s -n1 HEAD)" "" ""
# Only check maintainers if this build is supposed to publish toolstate.
# Builds that are not supposed to publish don't have the access token.
if [ -n "${TOOLSTATE_PUBLISH+is_set}" ]; then
TOOLSTATE_VALIDATE_MAINTAINERS_REPO=rust-lang/rust python2.7 \
"../../src/tools/publish_toolstate.py"
fi
cd ..
rm -rf rust-toolstate

View File

@ -172,6 +172,8 @@ docker \
--env CI \
--env TF_BUILD \
--env BUILD_SOURCEBRANCHNAME \
--env GITHUB_ACTIONS \
--env GITHUB_REF \
--env TOOLSTATE_REPO_ACCESS_TOKEN \
--env TOOLSTATE_REPO \
--env TOOLSTATE_PUBLISH \

View File

@ -59,7 +59,7 @@ done
# Originally downloaded from:
# https://download.freebsd.org/ftp/releases/${freebsd_arch}/${freebsd_version}-RELEASE/base.txz
URL=https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2019-04-04-freebsd-${freebsd_arch}-${freebsd_version}-RELEASE-base.txz
URL=https://ci-mirrors.rust-lang.org/rustc/2019-04-04-freebsd-${freebsd_arch}-${freebsd_version}-RELEASE-base.txz
curl "$URL" | tar xJf - -C "$sysroot" --wildcards "${files_to_extract[@]}"
# Fix up absolute symlinks from the system image. This can be removed

View File

@ -1,6 +1,6 @@
set -ex
curl -fo /usr/local/bin/sccache \
https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2018-04-02-sccache-x86_64-unknown-linux-musl
https://ci-mirrors.rust-lang.org/rustc/2018-04-02-sccache-x86_64-unknown-linux-musl
chmod +x /usr/local/bin/sccache

View File

@ -1,4 +1,4 @@
FROM ubuntu:16.04
FROM ubuntu:18.04
RUN apt-get update && apt-get install -y --no-install-recommends \
g++ \
@ -11,7 +11,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
cmake \
sudo \
gdb \
llvm-6.0-tools \
llvm-7-tools \
libedit-dev \
libssl-dev \
pkg-config \
@ -24,7 +24,7 @@ RUN sh /scripts/sccache.sh
# using llvm-link-shared due to libffi issues -- see #34486
ENV RUST_CONFIGURE_ARGS \
--build=x86_64-unknown-linux-gnu \
--llvm-root=/usr/lib/llvm-6.0 \
--llvm-root=/usr/lib/llvm-7 \
--enable-llvm-link-shared
ENV SCRIPT python2.7 ../x.py test src/tools/tidy && python2.7 ../x.py test

View File

@ -17,9 +17,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
COPY x86_64-gnu-tools/checkregression.py /tmp/
COPY x86_64-gnu-tools/checktools.sh /tmp/
COPY x86_64-gnu-tools/repo.sh /tmp/
# Run rustbook with `linkcheck` feature enabled
ENV CHECK_LINKS 1
@ -27,4 +25,4 @@ ENV CHECK_LINKS 1
ENV RUST_CONFIGURE_ARGS \
--build=x86_64-unknown-linux-gnu \
--save-toolstates=/tmp/toolstate/toolstates.json
ENV SCRIPT /tmp/checktools.sh ../x.py /tmp/toolstate/toolstates.json linux
ENV SCRIPT /tmp/checktools.sh ../x.py

View File

@ -1,48 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
## This script has two purposes: detect any tool that *regressed*, which is used
## during the week before the beta branches to reject PRs; and detect any tool
## that *changed* to see if we need to update the toolstate repo.
import sys
import json
# Regressions for these tools during the beta cutoff week do not cause failure.
# See `status_check` in `checktools.sh` for tools that have to pass on the
# beta/stable branches.
REGRESSION_OK = ["rustc-guide", "miri", "embedded-book"]
if __name__ == '__main__':
os_name = sys.argv[1]
toolstate_file = sys.argv[2]
current_state = sys.argv[3]
verb = sys.argv[4] # 'regressed' or 'changed'
with open(toolstate_file, 'r') as f:
toolstate = json.load(f)
with open(current_state, 'r') as f:
current = json.load(f)
regressed = False
for cur in current:
tool = cur['tool']
state = cur[os_name]
new_state = toolstate.get(tool, '')
if verb == 'regressed':
updated = new_state < state
elif verb == 'changed':
updated = new_state != state
else:
print('Unknown verb {}'.format(verb))
sys.exit(2)
if updated:
print(
'The state of "{}" has {} from "{}" to "{}"'
.format(tool, verb, state, new_state)
)
if not (verb == 'regressed' and tool in REGRESSION_OK):
regressed = True
if regressed:
sys.exit(1)

View File

@ -3,18 +3,6 @@
set -eu
X_PY="$1"
TOOLSTATE_FILE="$(realpath -m $2)"
OS="$3"
COMMIT="$(git rev-parse HEAD)"
CHANGED_FILES="$(git diff --name-status HEAD HEAD^)"
SIX_WEEK_CYCLE="$(( ($(date +%s) / 86400 - 20) % 42 ))"
# ^ Number of days after the last promotion of beta.
# Its value is 41 on the Tuesday where "Promote master to beta (T-2)" happens.
# The Wednesday after this has value 0.
# We track this value to prevent regressing tools in the last week of the 6-week cycle.
mkdir -p "$(dirname $TOOLSTATE_FILE)"
touch "$TOOLSTATE_FILE"
# Try to test all the tools and store the build/test success in the TOOLSTATE_FILE
@ -34,106 +22,4 @@ python2.7 "$X_PY" test --no-fail-fast \
set -e
cat "$TOOLSTATE_FILE"
echo
# This function checks if a particular tool is *not* in status "test-pass".
check_tool_failed() {
grep -vq '"'"$1"'":"test-pass"' "$TOOLSTATE_FILE"
}
# This function checks that if a tool's submodule changed, the tool's state must improve
verify_submodule_changed() {
echo "Verifying status of $1..."
if echo "$CHANGED_FILES" | grep -q "^M[[:blank:]]$2$"; then
echo "This PR updated '$2', verifying if status is 'test-pass'..."
if check_tool_failed "$1"; then
echo
echo "⚠️ We detected that this PR updated '$1', but its tests failed."
echo
echo "If you do intend to update '$1', please check the error messages above and"
echo "commit another update."
echo
echo "If you do NOT intend to update '$1', please ensure you did not accidentally"
echo "change the submodule at '$2'. You may ask your reviewer for the"
echo "proper steps."
exit 3
fi
fi
}
# deduplicates the submodule check and the assertion that on beta some tools MUST be passing.
# $1 should be "submodule_changed" to only check tools that got changed by this PR,
# or "beta_required" to check all tools that have $2 set to "beta".
check_dispatch() {
if [ "$1" = submodule_changed ]; then
# ignore $2 (branch id)
verify_submodule_changed $3 $4
elif [ "$2" = beta ]; then
echo "Requiring test passing for $3..."
if check_tool_failed "$3"; then
exit 4
fi
fi
}
# List all tools here.
# This function gets called with "submodule_changed" for each PR that changed a submodule,
# and with "beta_required" for each PR that lands on beta/stable.
# The purpose of this function is to *reject* PRs if a tool is not "test-pass" and
# (a) the tool's submodule has been updated, or (b) we landed on beta/stable and the
# tool has to "test-pass" on that branch.
status_check() {
check_dispatch $1 beta book src/doc/book
check_dispatch $1 beta nomicon src/doc/nomicon
check_dispatch $1 beta reference src/doc/reference
check_dispatch $1 beta rust-by-example src/doc/rust-by-example
check_dispatch $1 beta edition-guide src/doc/edition-guide
check_dispatch $1 beta rls src/tools/rls
check_dispatch $1 beta rustfmt src/tools/rustfmt
check_dispatch $1 beta clippy-driver src/tools/clippy
# These tools are not required on the beta/stable branches, but they *do* cause
# PRs to fail if a submodule update does not fix them.
# They will still cause failure during the beta cutoff week, unless `checkregression.py`
# exempts them from that.
check_dispatch $1 nightly miri src/tools/miri
check_dispatch $1 nightly embedded-book src/doc/embedded-book
check_dispatch $1 nightly rustc-guide src/doc/rustc-guide
}
# If this PR is intended to update one of these tools, do not let the build pass
# when they do not test-pass.
status_check "submodule_changed"
CHECK_NOT="$(readlink -f "$(dirname $0)/checkregression.py")"
# This callback is called by `commit_toolstate_change`, see `repo.sh`.
change_toolstate() {
# only update the history
if python2.7 "$CHECK_NOT" "$OS" "$TOOLSTATE_FILE" "_data/latest.json" changed; then
echo 'Toolstate is not changed. Not updating.'
else
if [ $SIX_WEEK_CYCLE -ge 35 ]; then
# Reject any regressions during the week before beta cutoff.
python2.7 "$CHECK_NOT" "$OS" "$TOOLSTATE_FILE" "_data/latest.json" regressed
fi
sed -i "1 a\\
$COMMIT\t$(cat "$TOOLSTATE_FILE")
" "history/$OS.tsv"
fi
}
if [ "$RUST_RELEASE_CHANNEL" = nightly ]; then
if [ -n "${TOOLSTATE_PUBLISH+is_set}" ]; then
. "$(dirname $0)/repo.sh"
MESSAGE_FILE=$(mktemp -t msg.XXXXXX)
echo "($OS CI update)" > "$MESSAGE_FILE"
commit_toolstate_change "$MESSAGE_FILE" change_toolstate
rm -f "$MESSAGE_FILE"
fi
exit 0
fi
# abort compilation if an important tool doesn't build
# (this code is reachable if not on the nightly channel)
status_check "beta_required"
python2.7 "$X_PY" test check-tools

View File

@ -1,68 +0,0 @@
#!/bin/sh
# This file provides the function `commit_toolstate_change` for pushing a change
# to the `rust-toolstate` repository.
#
# The function relies on a GitHub bot user, which should have a Personal access
# token defined in the environment variable $TOOLSTATE_REPO_ACCESS_TOKEN. If for
# some reason you need to change the token, please update the Azure Pipelines
# variable group.
#
# 1. Generate a new Personal access token:
#
# * Login to the bot account, and go to Settings -> Developer settings ->
# Personal access tokens
# * Click "Generate new token"
# * Enable the "public_repo" permission, then click "Generate token"
# * Copy the generated token (should be a 40-digit hexadecimal number).
# Save it somewhere secure, as the token would be gone once you leave
# the page.
#
# 2. Update the variable group in Azure Pipelines
#
# * Ping a member of the infrastructure team to do this.
#
# 4. Replace the email address below if the bot account identity is changed
#
# * See <https://help.github.com/articles/about-commit-email-addresses/>
# if a private email by GitHub is wanted.
commit_toolstate_change() {
OLDFLAGS="$-"
set -eu
git config --global user.email '7378925+rust-toolstate-update@users.noreply.github.com'
git config --global user.name 'Rust Toolstate Update'
git config --global credential.helper store
printf 'https://%s:x-oauth-basic@github.com\n' "$TOOLSTATE_REPO_ACCESS_TOKEN" \
> "$HOME/.git-credentials"
git clone --depth=1 $TOOLSTATE_REPO
cd rust-toolstate
FAILURE=1
MESSAGE_FILE="$1"
shift
for RETRY_COUNT in 1 2 3 4 5; do
# Call the callback.
# - If we are in the `auto` branch (pre-landing), this is called from `checktools.sh` and
# the callback is `change_toolstate` in that file. The purpose of this is to publish the
# test results (the new commit-to-toolstate mapping) in the toolstate repo.
# - If we are in the `master` branch (post-landing), this is called by the CI pipeline
# and the callback is `src/tools/publish_toolstate.py`. The purpose is to publish
# the new "current" toolstate in the toolstate repo.
"$@"
# `git commit` failing means nothing to commit.
FAILURE=0
git commit -a -F "$MESSAGE_FILE" || break
# On failure randomly sleep for 0 to 3 seconds as a crude way to introduce jittering.
git push origin master && break || sleep $(LC_ALL=C tr -cd 0-3 < /dev/urandom | head -c 1)
FAILURE=1
git fetch origin master
git reset --hard origin/master
done
cd ..
set +eu
set "-$OLDFLAGS"
return $FAILURE
}

src/ci/publish_toolstate.sh Executable file
View File

@ -0,0 +1,33 @@
#!/bin/sh
set -eu
# The following lines are also found in src/bootstrap/toolstate.rs,
# so if updating here, please also update that file.
export MESSAGE_FILE=$(mktemp -t msg.XXXXXX)
git config --global user.email '7378925+rust-toolstate-update@users.noreply.github.com'
git config --global user.name 'Rust Toolstate Update'
git config --global credential.helper store
printf 'https://%s:x-oauth-basic@github.com\n' "$TOOLSTATE_REPO_ACCESS_TOKEN" \
> "$HOME/.git-credentials"
git clone --depth=1 $TOOLSTATE_REPO
cd rust-toolstate
FAILURE=1
for RETRY_COUNT in 1 2 3 4 5; do
# The purpose is to publish the new "current" toolstate in the toolstate repo.
"$BUILD_SOURCESDIRECTORY/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" \
"$(git log --format=%s -n1 HEAD)" \
"$MESSAGE_FILE" \
"$TOOLSTATE_REPO_ACCESS_TOKEN"
# `git commit` failing means nothing to commit.
FAILURE=0
git commit -a -F "$MESSAGE_FILE" || break
# On failure randomly sleep for 0 to 3 seconds as a crude way to introduce jittering.
git push origin master && break || sleep $(LC_ALL=C tr -cd 0-3 < /dev/urandom | head -c 1)
FAILURE=1
git fetch origin master
git reset --hard origin/master
done

View File

@ -23,9 +23,7 @@ fi
ci_dir=`cd $(dirname $0) && pwd`
source "$ci_dir/shared.sh"
branch_name=$(getCIBranch)
if [ ! isCI ] || [ "$branch_name" = "auto" ] || [ "$branch_name" = "try" ]; then
if ! isCI || isCiBranch auto || isCiBranch beta; then
RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --set build.print-step-timings --enable-verbose-tests"
fi

src/ci/scripts/clean-disk.sh Executable file
View File

@ -0,0 +1,16 @@
#!/bin/bash
# This script deletes some of the Azure-provided artifacts. We don't use these,
# and disk space is at a premium on our builders.
set -euo pipefail
IFS=$'\n\t'
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
# All the Linux builds happen inside Docker.
if isLinux; then
# 6.7GB
sudo rm -rf /opt/ghc
# 16GB
sudo rm -rf /usr/share/dotnet
fi

View File

@ -0,0 +1,9 @@
#!/bin/bash
# Spawn a background process to collect CPU usage statistics which we'll upload
# at the end of the build. See the comments in the script here for more
# information.
set -euo pipefail
IFS=$'\n\t'
python src/ci/cpu-usage-over-time.py &> cpu-usage.csv &

View File

@ -9,15 +9,15 @@ IFS=$'\n\t'
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
if isMacOS; then
curl -f "${MIRRORS_BASE}/clang%2Bllvm-7.0.0-x86_64-apple-darwin.tar.xz" | tar xJf -
curl -f "${MIRRORS_BASE}/clang%2Bllvm-9.0.0-x86_64-darwin-apple.tar.xz" | tar xJf -
ciCommandSetEnv CC "$(pwd)/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang"
ciCommandSetEnv CXX "$(pwd)/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang++"
ciCommandSetEnv CC "$(pwd)/clang+llvm-9.0.0-x86_64-darwin-apple/bin/clang"
ciCommandSetEnv CXX "$(pwd)/clang+llvm-9.0.0-x86_64-darwin-apple/bin/clang++"
# Configure `AR` specifically so rustbuild doesn't try to infer it as
# `clang-ar` by accident.
ciCommandSetEnv AR "ar"
elif isWindows && [[ -z ${MINGW_URL+x} ]]; then
elif isWindows && [[ ${CUSTOM_MINGW-0} -ne 1 ]]; then
# If we're compiling for MSVC then we, like most other distribution builders,
# switch to clang as the compiler. This'll allow us eventually to enable LTO
# amongst LLVM and rustc. Note that we only do this on MSVC as I don't think
@ -27,17 +27,18 @@ elif isWindows && [[ -z ${MINGW_URL+x} ]]; then
# Note that the LLVM installer is an NSIS installer
#
# Original downloaded here came from
# http://releases.llvm.org/7.0.0/LLVM-7.0.0-win64.exe
# That installer was run through `wine` on Linux and then the resulting
# installation directory (found in `$HOME/.wine/drive_c/Program Files/LLVM`) was
# packaged up into a tarball. We've had issues otherwise that the installer will
# randomly hang, provide not a lot of useful information, pollute global state,
# etc. In general the tarball is just more confined and easier to deal with when
# working with various CI environments.
# http://releases.llvm.org/9.0.0/LLVM-9.0.0-win64.exe
# That installer was run through `wine ./installer.exe /S /NCRC` on Linux
# and then the resulting installation directory (found in
# `$HOME/.wine/drive_c/Program Files/LLVM`) was packaged up into a tarball.
# We've had issues otherwise that the installer will randomly hang, provide
# not a lot of useful information, pollute global state, etc. In general the
# tarball is just more confined and easier to deal with when working with
# various CI environments.
mkdir -p citools
cd citools
curl -f "${MIRRORS_BASE}/LLVM-7.0.0-win64.tar.gz" | tar xzf -
curl -f "${MIRRORS_BASE}/LLVM-9.0.0-win64.tar.gz" | tar xzf -
ciCommandSetEnv RUST_CONFIGURE_ARGS \
"${RUST_CONFIGURE_ARGS} --set llvm.clang-cl=$(pwd)/clang-rust/bin/clang-cl.exe"
fi

View File

@ -27,19 +27,38 @@ IFS=$'\n\t'
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
MINGW_ARCHIVE_32="i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z"
MINGW_ARCHIVE_64="x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z"
if isWindows; then
if [[ -z "${MINGW_URL+x}" ]]; then
arch=i686
if [ "$MSYS_BITS" = "64" ]; then
arch=x86_64
fi
case "${CI_JOB_NAME}" in
*i686*)
bits=32
arch=i686
mingw_archive="${MINGW_ARCHIVE_32}"
;;
*x86_64*)
bits=64
arch=x86_64
mingw_archive="${MINGW_ARCHIVE_64}"
;;
*)
echo "src/ci/scripts/install-mingw.sh can't detect the builder's architecture"
echo "please tweak it to recognize the builder named '${CI_JOB_NAME}'"
exit 1
;;
esac
if [[ "${CUSTOM_MINGW-0}" -ne 1 ]]; then
pacman -S --noconfirm --needed mingw-w64-$arch-toolchain mingw-w64-$arch-cmake \
mingw-w64-$arch-gcc mingw-w64-$arch-python2
ciCommandAddPath "${SYSTEM_WORKFOLDER}/msys2/mingw${MSYS_BITS}/bin"
ciCommandAddPath "$(ciCheckoutPath)/msys2/mingw${bits}/bin"
else
curl -o mingw.7z "${MINGW_URL}/${MINGW_ARCHIVE}"
mingw_dir="mingw${bits}"
curl -o mingw.7z "${MIRRORS_BASE}/${mingw_archive}"
7z x -y mingw.7z > /dev/null
curl -o "${MINGW_DIR}/bin/gdborig.exe" "${MINGW_URL}/2017-04-20-${MSYS_BITS}bit-gdborig.exe"
ciCommandAddPath "$(pwd)/${MINGW_DIR}/bin"
curl -o "${mingw_dir}/bin/gdborig.exe" "${MIRRORS_BASE}/2017-04-20-${bits}bit-gdborig.exe"
ciCommandAddPath "$(pwd)/${mingw_dir}/bin"
fi
fi

View File

@ -12,8 +12,8 @@ IFS=$'\n\t'
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
if isWindows; then
choco install msys2 --params="/InstallDir:${SYSTEM_WORKFOLDER}/msys2 /NoPath" -y --no-progress
mkdir -p "${SYSTEM_WORKFOLDER}/msys2/home/${USERNAME}"
choco install msys2 --params="/InstallDir:$(ciCheckoutPath)/msys2 /NoPath" -y --no-progress
mkdir -p "$(ciCheckoutPath)/msys2/home/${USERNAME}"
ciCommandAddPath "${SYSTEM_WORKFOLDER}/msys2/usr/bin"
ciCommandAddPath "$(ciCheckoutPath)/msys2/usr/bin"
fi

View File

@ -0,0 +1,21 @@
#!/bin/bash
# Start the CI build. You shouldn't run this locally: call either src/ci/run.sh
# or src/ci/docker/run.sh instead.
set -euo pipefail
IFS=$'\n\t'
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
export CI="true"
export SRC=.
# Remove any preexisting rustup installation since it can interfere
# with the cargotest step and its auto-detection of things like Clippy in
# the environment
rustup self uninstall -y || true
if [ -z "${IMAGE+x}" ]; then
src/ci/run.sh
else
src/ci/docker/run.sh "${IMAGE}"
fi

View File

@ -0,0 +1,31 @@
#!/bin/bash
# This script guesses some environment variables based on the builder name and
# the current platform, to reduce the amount of variables defined in the CI
# configuration.
set -euo pipefail
IFS=$'\n\t'
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
# Builders starting with `dist-` are dist builders, but if they also end with
# `-alt` they are alternate dist builders.
if [[ "${CI_JOB_NAME}" = dist-* ]]; then
if [[ "${CI_JOB_NAME}" = *-alt ]]; then
echo "alternate dist builder detected, setting DEPLOY_ALT=1"
ciCommandSetEnv DEPLOY_ALT 1
else
echo "normal dist builder detected, setting DEPLOY=1"
ciCommandSetEnv DEPLOY 1
fi
fi
# All the Linux builds happen inside Docker.
if isLinux; then
if [[ -z "${IMAGE+x}" ]]; then
echo "linux builder detected, using docker to run the build"
ciCommandSetEnv IMAGE "${CI_JOB_NAME}"
else
echo "a custom docker image is already set"
fi
fi

View File

@ -4,7 +4,7 @@
# `source shared.sh`, hence the invalid shebang and not being
# marked as an executable file in git.
export MIRRORS_BASE="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc"
export MIRRORS_BASE="https://ci-mirrors.rust-lang.org/rustc"
# See http://unix.stackexchange.com/questions/82598
# Duplicated in docker/dist-various-2/shared.sh
@ -27,27 +27,66 @@ function retry {
}
function isCI {
[ "$CI" = "true" ] || [ "$TF_BUILD" = "True" ]
[[ "${CI-false}" = "true" ]] || isAzurePipelines || isGitHubActions
}
function isAzurePipelines {
[[ "${TF_BUILD-False}" = "True" ]]
}
function isGitHubActions {
[[ "${GITHUB_ACTIONS-false}" = "true" ]]
}
function isMacOS {
[ "$AGENT_OS" = "Darwin" ]
[[ "${OSTYPE}" = "darwin"* ]]
}
function isWindows {
[ "$AGENT_OS" = "Windows_NT" ]
[[ "${OSTYPE}" = "cygwin" ]] || [[ "${OSTYPE}" = "msys" ]]
}
function isLinux {
[ "$AGENT_OS" = "Linux" ]
[[ "${OSTYPE}" = "linux-gnu" ]]
}
function getCIBranch {
echo "$BUILD_SOURCEBRANCHNAME"
function isCiBranch {
if [[ $# -ne 1 ]]; then
echo "usage: $0 <branch-name>"
exit 1
fi
name="$1"
if isAzurePipelines; then
[[ "${BUILD_SOURCEBRANCHNAME}" = "${name}" ]]
elif isGitHubActions; then
[[ "${GITHUB_REF}" = "refs/heads/${name}" ]]
else
echo "isCiBranch only works inside CI!"
exit 1
fi
}
function ciCommit {
echo "${BUILD_SOURCEVERSION}"
if isAzurePipelines; then
echo "${BUILD_SOURCEVERSION}"
elif isGitHubActions; then
echo "${GITHUB_SHA}"
else
echo "ciCommit only works inside CI!"
exit 1
fi
}
function ciCheckoutPath {
if isAzurePipelines; then
echo "${SYSTEM_WORKFOLDER}"
elif isGitHubActions; then
echo "${GITHUB_WORKSPACE}"
else
echo "ciCheckoutPath only works inside CI!"
exit 1
fi
}
function ciCommandAddPath {
@ -57,7 +96,14 @@ function ciCommandAddPath {
fi
path="$1"
echo "##vso[task.prependpath]${path}"
if isAzurePipelines; then
echo "##vso[task.prependpath]${path}"
elif isGitHubActions; then
echo "::add-path::${value}"
else
echo "ciCommandAddPath only works inside CI!"
exit 1
fi
}
function ciCommandSetEnv {
@ -68,5 +114,12 @@ function ciCommandSetEnv {
name="$1"
value="$2"
echo "##vso[task.setvariable variable=${name}]${value}"
if isAzurePipelines; then
echo "##vso[task.setvariable variable=${name}]${value}"
elif isGitHubActions; then
echo "::set-env name=${name}::${value}"
else
echo "ciCommandSetEnv only works inside CI!"
exit 1
fi
}

View File

@ -160,7 +160,6 @@ Filesystem
filesystem's
filesystems
Firefox
FnBox
FnMut
FnOnce
formatter

View File

@ -158,7 +158,7 @@ in the slice and `ending_index` is one more than the last position in the
slice. Internally, the slice data structure stores the starting position and
the length of the slice, which corresponds to `ending_index` minus
`starting_index`. So in the case of `let world = &s[6..11];`, `world` would be
a slice that contains a pointer to the 7th byte of `s` with a length value of 5.
a slice that contains a pointer to the 7th byte (counting from 1) of `s` with a length value of 5.
Figure 4-6 shows this in a diagram.

View File

@ -29,7 +29,7 @@ Listing 7-1 into *src/lib.rs* to define some modules and function signatures.
<span class="filename">Filename: src/lib.rs</span>
```rust
```rust,ignore
mod front_of_house {
mod hosting {
fn add_to_waitlist() {}

View File

@ -125,7 +125,7 @@ number in two different lists.
fn largest(list: &[i32]) -> i32 {
let mut largest = list[0];
for &item in list.iter() {
for &item in list {
if item > largest {
largest = item;
}

View File

@ -103,10 +103,9 @@ to change the inner `i32` to 6.
Now, let's try to share a value between multiple threads using `Mutex<T>`.
We'll spin up 10 threads and have them each increment a counter value by 1, so
the counter goes from 0 to 10. Note that the next few examples will have
compiler errors, and we'll use those errors to learn more about using
`Mutex<T>` and how Rust helps us use it correctly. Listing 16-13 has our
starting example:
the counter goes from 0 to 10. The next example in Listing 16-13 will have
a compiler error, and we'll use that error to learn more about using
`Mutex<T>` and how Rust helps us use it correctly.
<span class="filename">Filename: src/main.rs</span>
@ -154,110 +153,23 @@ program.
We hinted that this example wouldn't compile. Now let's find out why!
```text
error[E0382]: capture of moved value: `counter`
--> src/main.rs:10:27
error[E0382]: use of moved value: `counter`
--> src/main.rs:9:36
|
9 | let handle = thread::spawn(move || {
| ------- value moved (into closure) here
| ^^^^^^^ value moved into closure here,
in previous iteration of loop
10 | let mut num = counter.lock().unwrap();
| ^^^^^^^ value captured here after move
| ------- use occurs due to use in closure
|
= note: move occurs because `counter` has type `std::sync::Mutex<i32>`,
which does not implement the `Copy` trait
error[E0382]: use of moved value: `counter`
--> src/main.rs:21:29
|
9 | let handle = thread::spawn(move || {
| ------- value moved (into closure) here
...
21 | println!("Result: {}", *counter.lock().unwrap());
| ^^^^^^^ value used here after move
|
= note: move occurs because `counter` has type `std::sync::Mutex<i32>`,
which does not implement the `Copy` trait
error: aborting due to 2 previous errors
which does not implement the `Copy` trait
```
The error message states that the `counter` value is moved into the closure and
then captured when we call `lock`. That description sounds like what we wanted,
but it's not allowed!
Let's figure this out by simplifying the program. Instead of making 10 threads
in a `for` loop, let's just make two threads without a loop and see what
happens. Replace the first `for` loop in Listing 16-13 with this code instead:
```rust,ignore,does_not_compile
use std::sync::Mutex;
use std::thread;
fn main() {
let counter = Mutex::new(0);
let mut handles = vec![];
let handle = thread::spawn(move || {
let mut num = counter.lock().unwrap();
*num += 1;
});
handles.push(handle);
let handle2 = thread::spawn(move || {
let mut num2 = counter.lock().unwrap();
*num2 += 1;
});
handles.push(handle2);
for handle in handles {
handle.join().unwrap();
}
println!("Result: {}", *counter.lock().unwrap());
}
```
We make two threads and change the variable names used with the second thread
to `handle2` and `num2`. When we run the code this time, compiling gives us the
following:
```text
error[E0382]: capture of moved value: `counter`
--> src/main.rs:16:24
|
8 | let handle = thread::spawn(move || {
| ------- value moved (into closure) here
...
16 | let mut num2 = counter.lock().unwrap();
| ^^^^^^^ value captured here after move
|
= note: move occurs because `counter` has type `std::sync::Mutex<i32>`,
which does not implement the `Copy` trait
error[E0382]: use of moved value: `counter`
--> src/main.rs:26:29
|
8 | let handle = thread::spawn(move || {
| ------- value moved (into closure) here
...
26 | println!("Result: {}", *counter.lock().unwrap());
| ^^^^^^^ value used here after move
|
= note: move occurs because `counter` has type `std::sync::Mutex<i32>`,
which does not implement the `Copy` trait
error: aborting due to 2 previous errors
```
Aha! The first error message indicates that `counter` is moved into the closure
for the thread associated with `handle`. That move is preventing us from
capturing `counter` when we try to call `lock` on it and store the result in
`num2` in the second thread! So Rust is telling us that we can't move ownership
of `counter` into multiple threads. This was hard to see earlier because our
threads were in a loop, and Rust can't point to different threads in different
iterations of the loop. Let's fix the compiler error with a multiple-ownership
method we discussed in Chapter 15.
The error message states that the `counter` value was moved in the previous
iteration of the loop. So Rust is telling us that we can't move the ownership
of lock `counter` into multiple threads. Let's fix the compiler error with a
multiple-ownership method we discussed in Chapter 15.
#### Multiple Ownership with Multiple Threads
@ -304,30 +216,27 @@ Once again, we compile and get... different errors! The compiler is teaching us
a lot.
```text
error[E0277]: the trait bound `std::rc::Rc<std::sync::Mutex<i32>>:
std::marker::Send` is not satisfied in `[closure@src/main.rs:11:36:
15:10 counter:std::rc::Rc<std::sync::Mutex<i32>>]`
error[E0277]: `std::rc::Rc<std::sync::Mutex<i32>>` cannot be sent between threads safely
--> src/main.rs:11:22
|
11 | let handle = thread::spawn(move || {
| ^^^^^^^^^^^^^ `std::rc::Rc<std::sync::Mutex<i32>>`
cannot be sent between threads safely
|
= help: within `[closure@src/main.rs:11:36: 15:10
counter:std::rc::Rc<std::sync::Mutex<i32>>]`, the trait `std::marker::Send` is
not implemented for `std::rc::Rc<std::sync::Mutex<i32>>`
= help: within `[closure@src/main.rs:11:36: 14:10
counter:std::rc::Rc<std::sync::Mutex<i32>>]`, the trait `std::marker::Send`
is not implemented for `std::rc::Rc<std::sync::Mutex<i32>>`
= note: required because it appears within the type
`[closure@src/main.rs:11:36: 15:10 counter:std::rc::Rc<std::sync::Mutex<i32>>]`
`[closure@src/main.rs:11:36: 14:10 counter:std::rc::Rc<std::sync::Mutex<i32>>]`
= note: required by `std::thread::spawn`
```
Wow, that error message is very wordy! Here are some important parts to focus
on: the first inline error says `` `std::rc::Rc<std::sync::Mutex<i32>>` cannot
be sent between threads safely ``. The reason for this is in the next important
part to focus on, the error message. The distilled error message says `` the
trait bound `Send` is not satisfied ``. Well talk about `Send` in the next
section: its one of the traits that ensures the types we use with threads are
meant for use in concurrent situations.
Wow, that error message is very wordy! Heres the important part to focus
on: `` `Rc<Mutex<i32>>` cannot be sent between threads safely ``. The compiler
is also telling us the reason why: ``the trait `Send` is not implemented for
`Rc<Mutex<i32>>` ``. Well talk about `Send` in the next section: its one of
the traits that ensures the types we use with threads are meant for use in
concurrent situations.
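To make the distinction concrete, here is a minimal sketch that jumps ahead to the atomically reference counted `Arc<T>` this chapter eventually reaches; the single spawned thread and the `counter_for_thread` name are just for illustration:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // `Arc<Mutex<i32>>` is `Send`, so a closure that captures it can be
    // handed to `thread::spawn`; `Rc<Mutex<i32>>` would be rejected here.
    let counter = Arc::new(Mutex::new(0));
    let counter_for_thread = Arc::clone(&counter);

    let handle = thread::spawn(move || {
        let mut num = counter_for_thread.lock().unwrap();
        *num += 1;
    });

    handle.join().unwrap();
    println!("Result: {}", *counter.lock().unwrap());
}
```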
Unfortunately, `Rc<T>` is not safe to share across threads. When `Rc<T>`
manages the reference count, it adds to the count for each call to `clone` and
View File
@ -944,7 +944,7 @@ impl Worker {
println!("Worker {} got a job; executing.", id);
(*job)();
job();
}
});
@ -976,109 +976,6 @@ The call to `recv` blocks, so if there is no job yet, the current thread will
wait until a job becomes available. The `Mutex<T>` ensures that only one
`Worker` thread at a time is trying to request a job.
Theoretically, this code should compile. Unfortunately, the Rust compiler isnt
perfect yet, and we get this error:
```text
error[E0161]: cannot move a value of type std::ops::FnOnce() +
std::marker::Send: the size of std::ops::FnOnce() + std::marker::Send cannot be
statically determined
--> src/lib.rs:63:17
|
63 | (*job)();
| ^^^^^^
```
This error is fairly cryptic because the problem is fairly cryptic. To call a
`FnOnce` closure that is stored in a `Box<T>` (which is what our `Job` type
alias is), the closure needs to move itself *out* of the `Box<T>` because the
closure takes ownership of `self` when we call it. In general, Rust doesnt
allow us to move a value out of a `Box<T>` because Rust doesnt know how big
the value inside the `Box<T>` will be: recall in Chapter 15 that we used
`Box<T>` precisely because we had something of an unknown size that we wanted
to store in a `Box<T>` to get a value of a known size.
As you saw in Listing 17-15, we can write methods that use the syntax `self:
Box<Self>`, which allows the method to take ownership of a `Self` value stored
in a `Box<T>`. Thats exactly what we want to do here, but unfortunately Rust
wont let us: the part of Rust that implements behavior when a closure is
called isnt implemented using `self: Box<Self>`. So Rust doesnt yet
understand that it could use `self: Box<Self>` in this situation to take
ownership of the closure and move the closure out of the `Box<T>`.
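As a refresher on what that syntax does for traits we define ourselves, here is a minimal sketch; the `Consume` trait and `Task` struct are made up for illustration and are not part of the thread pool code:

```rust
trait Consume {
    // `self: Box<Self>` means the method takes ownership of the boxed value.
    fn consume(self: Box<Self>);
}

struct Task(String);

impl Consume for Task {
    fn consume(self: Box<Self>) {
        // `self` has been moved out of the `Box`; we own the `Task` now.
        println!("running task: {}", self.0);
    }
}

fn main() {
    let task: Box<dyn Consume> = Box::new(Task(String::from("hello")));
    task.consume();
}
```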
Rust is still a work in progress with places where the compiler could be
improved, but in the future, the code in Listing 20-20 should work just fine.
People just like you are working to fix this and other issues! After youve
finished this book, we would love for you to join in.
But for now, lets work around this problem using a handy trick. We can tell
Rust explicitly that in this case we can take ownership of the value inside the
`Box<T>` using `self: Box<Self>`; then, once we have ownership of the closure,
we can call it. This involves defining a new trait `FnBox` with the method
`call_box` that will use `self: Box<Self>` in its signature, defining `FnBox`
for any type that implements `FnOnce()`, changing our type alias to use the new
trait, and changing `Worker` to use the `call_box` method. These changes are
shown in Listing 20-21.
<span class="filename">Filename: src/lib.rs</span>
```rust,ignore
trait FnBox {
    fn call_box(self: Box<Self>);
}

impl<F: FnOnce()> FnBox for F {
    fn call_box(self: Box<F>) {
        (*self)()
    }
}

type Job = Box<dyn FnBox + Send + 'static>;

// --snip--

impl Worker {
    fn new(id: usize, receiver: Arc<Mutex<mpsc::Receiver<Job>>>) -> Worker {
        let thread = thread::spawn(move || {
            loop {
                let job = receiver.lock().unwrap().recv().unwrap();

                println!("Worker {} got a job; executing.", id);

                job.call_box();
            }
        });

        Worker {
            id,
            thread,
        }
    }
}
```
<span class="caption">Listing 20-21: Adding a new trait `FnBox` to work around
the current limitations of `Box<FnOnce()>`</span>
First, we create a new trait named `FnBox`. This trait has the one method
`call_box`, which is similar to the `call` methods on the other `Fn*` traits
except that it takes `self: Box<Self>` to take ownership of `self` and move the
value out of the `Box<T>`.
Next, we implement the `FnBox` trait for any type `F` that implements the
`FnOnce()` trait. Effectively, this means that any `FnOnce()` closures can use
our `call_box` method. The implementation of `call_box` uses `(*self)()` to
move the closure out of the `Box<T>` and call the closure.
We now need our `Job` type alias to be a `Box` of anything that implements our
new trait `FnBox`. This will allow us to use `call_box` in `Worker` when we get
a `Job` value instead of invoking the closure directly. Implementing the
`FnBox` trait for any `FnOnce()` closure means we dont have to change anything
about the actual values were sending down the channel. Now Rust is able to
recognize that what we want to do is fine.
This trick is very sneaky and complicated. Dont worry if it doesnt make
perfect sense; someday, it will be completely unnecessary.
With the implementation of this trick, our thread pool is in a working state!
Give it a `cargo run` and make some requests:
@ -1136,7 +1033,7 @@ thread run them.
> limitation is not caused by our web server.
After learning about the `while let` loop in Chapter 18, you might be wondering
why we didnt write the worker thread code as shown in Listing 20-22.
why we didnt write the worker thread code as shown in Listing 20-21.
<span class="filename">Filename: src/lib.rs</span>
@ -1149,7 +1046,7 @@ impl Worker {
while let Ok(job) = receiver.lock().unwrap().recv() {
println!("Worker {} got a job; executing.", id);
job.call_box();
job();
}
});
@ -1161,7 +1058,7 @@ impl Worker {
}
```
<span class="caption">Listing 20-22: An alternative implementation of
<span class="caption">Listing 20-21: An alternative implementation of
`Worker::new` using `while let`</span>
This code compiles and runs but doesnt result in the desired threading
@ -1175,13 +1072,13 @@ lock. But this implementation can also result in the lock being held longer
than intended if we dont think carefully about the lifetime of the
`MutexGuard<T>`. Because the values in the `while` expression remain in scope
for the duration of the block, the lock remains held for the duration of the
call to `job.call_box()`, meaning other workers cannot receive jobs.
call to `job()`, meaning other workers cannot receive jobs.
By using `loop` instead and acquiring the lock and a job within the block
rather than outside it, the `MutexGuard` returned from the `lock` method is
dropped as soon as the `let job` statement ends. This ensures that the lock is
held during the call to `recv`, but it is released before the call to
`job.call_box()`, allowing multiple requests to be serviced concurrently.
`job()`, allowing multiple requests to be serviced concurrently.
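The difference is easier to see in a small, single-threaded sketch (not the thread pool itself; the names here are made up for illustration):

```rust
use std::sync::Mutex;

fn main() {
    let jobs = Mutex::new(vec!["a", "b", "c"]);

    // `while let`: the temporary `MutexGuard` returned by `lock()` lives for
    // the whole loop body, so the lock is still held while the body runs.
    while let Some(job) = jobs.lock().unwrap().pop() {
        println!("while let got {}", job); // lock held here
    }

    *jobs.lock().unwrap() = vec!["d", "e", "f"];

    // `loop` + `let`: the guard is dropped when the `let` statement ends, so
    // the lock is released before the body does its work.
    loop {
        let job = jobs.lock().unwrap().pop(); // guard dropped at end of statement
        match job {
            Some(job) => println!("loop got {}", job), // lock already released
            None => break,
        }
    }
}
```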
[creating-type-synonyms-with-type-aliases]:
ch19-04-advanced-types.html#creating-type-synonyms-with-type-aliases
View File
@ -450,17 +450,7 @@ pub struct ThreadPool {
sender: mpsc::Sender<Message>,
}
trait FnBox {
    fn call_box(self: Box<Self>);
}

impl<F: FnOnce()> FnBox for F {
    fn call_box(self: Box<F>) {
        (*self)()
    }
}
type Job = Box<dyn FnBox + Send + 'static>;
type Job = Box<dyn FnOnce() + Send + 'static>;
impl ThreadPool {
/// Create a new ThreadPool.
@ -536,7 +526,7 @@ impl Worker {
Message::NewJob(job) => {
println!("Worker {} got a job; executing.", id);
job.call_box();
job();
},
Message::Terminate => {
println!("Worker {} was told to terminate.", id);
View File
@ -0,0 +1,28 @@
name: CI
on: [push, pull_request]

jobs:
  test:
    name: Test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: Update rustup
        run: rustup self update
      - name: Install Rust
        run: |
          rustup set profile minimal
          rustup toolchain install nightly -c rust-docs
          rustup default nightly
      - name: Install mdbook
        run: |
          mkdir bin
          curl -sSL https://github.com/rust-lang/mdBook/releases/download/v0.3.4/mdbook-v0.3.4-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=bin
          echo "##[add-path]$(pwd)/bin"
      - name: Report versions
        run: |
          rustup --version
          rustc -Vv
          mdbook --version
      - name: Run tests
        run: mdbook test
View File
@ -1,9 +0,0 @@
language: rust
cache: cargo
rust: nightly
before_script:
- cargo install mdbook -Z install-upgrade
- mdbook --version
script:
- mdbook build
- mdbook test
View File
@ -1,7 +1,5 @@
# The Rust Edition Guide
[![Build Status](https://api.travis-ci.com/rust-lang-nursery/edition-guide.svg?branch=master)](https://travis-ci.com/rust-lang-nursery/edition-guide)
This book explains the concept of "editions", major new eras in [Rust]'s
development. You can [read the book
online](https://doc.rust-lang.org/nightly/edition-guide/).
View File
@ -21,7 +21,7 @@ is a *workspace* that contains many related packages:
and more.
[the `futures` package]: https://github.com/rust-lang-nursery/futures-rs
[the `futures` package]: https://github.com/rust-lang/futures-rs
Workspaces allow these packages to be developed individually, but they share
a single set of dependencies, and therefore have a single target directory
View File
@ -133,7 +133,7 @@ contains a copy of Rust's documentation, so that you can read it offline.
This component cannot be removed for now; if that's of interest, please
comment on [this
issue](https://github.com/rust-lang-nursery/rustup.rs/issues/998).
issue](https://github.com/rust-lang/rustup.rs/issues/998).
### `rust-src` for a copy of Rust's source code
View File
@ -0,0 +1,28 @@
name: CI
on: [push, pull_request]

jobs:
  test:
    name: Test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: Update rustup
        run: rustup self update
      - name: Install Rust
        run: |
          rustup set profile minimal
          rustup toolchain install nightly -c rust-docs
          rustup default nightly
      - name: Install mdbook
        run: |
          mkdir bin
          curl -sSL https://github.com/rust-lang/mdBook/releases/download/v0.3.4/mdbook-v0.3.4-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=bin
          echo "##[add-path]$(pwd)/bin"
      - name: Report versions
        run: |
          rustup --version
          rustc -Vv
          mdbook --version
      - name: Run tests
        run: mdbook test
View File
@ -1,14 +0,0 @@
language: rust
cache: cargo
rust:
- nightly
before_script:
- (cargo install mdbook --force || true)
script:
- export PATH=$PATH:/home/travis/.cargo/bin;
- mdbook test
View File
@ -23,7 +23,7 @@ infinitesimal fragments of despair.
Building the Nomicon requires [mdBook]. To get it:
[mdBook]: https://github.com/rust-lang-nursery/mdBook
[mdBook]: https://github.com/rust-lang/mdBook
```bash
$ cargo install mdbook
View File
@ -1,7 +1,7 @@
# Alternative representations
Rust allows you to specify alternative data layout strategies from the default.
There's also the [reference].
There's also the [unsafe code guidelines] (note that it's **NOT** normative).
@ -143,7 +143,7 @@ This is a modifier on `repr(C)` and `repr(rust)`. It is incompatible with
[reference]: https://github.com/rust-rfcs/unsafe-code-guidelines/tree/master/reference/src/representation.md
[unsafe code guidelines]: https://rust-lang.github.io/unsafe-code-guidelines/layout.html
[drop flags]: drop-flags.html
[ub loads]: https://github.com/rust-lang/rust/issues/27060
[`UnsafeCell`]: ../std/cell/struct.UnsafeCell.html
View File
@ -0,0 +1,32 @@
name: CI
on: [push, pull_request]

jobs:
  test:
    name: Test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: Update rustup
        run: rustup self update
      - name: Install Rust
        run: |
          rustup set profile minimal
          rustup toolchain install nightly -c rust-docs
          rustup default nightly
      - name: Install mdbook
        run: |
          mkdir bin
          curl -sSL https://github.com/rust-lang/mdBook/releases/download/v0.3.4/mdbook-v0.3.4-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=bin
          echo "##[add-path]$(pwd)/bin"
      - name: Report versions
        run: |
          rustup --version
          rustc -Vv
          mdbook --version
      - name: Run tests
        run: mdbook test
      - name: Check for unstable features
        run: (cd stable-check && cargo run -- ../src)
      - name: Check for broken links
        run: tests/linkcheck.sh
View File
@ -1,13 +0,0 @@
language: shell
before_install:
- curl -sSL https://sh.rustup.rs | sh -s -- -y --default-toolchain=nightly --profile=minimal -c rust-docs
- export PATH="$HOME/.cargo/bin:$PATH"
install:
- travis_retry curl -Lf https://github.com/rust-lang-nursery/mdBook/releases/download/v0.3.1/mdbook-v0.3.1-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=$HOME/.cargo/bin
script:
- export PATH=$PATH:/home/travis/.cargo/bin && mdbook test
- (cd stable-check && cargo run -- ../src)
- tests/linkcheck.sh
View File
@ -63,7 +63,7 @@ This should include links to any relevant information, such as the
stabilization PR, the RFC, the tracking issue, and anything else that would be
helpful for writing the documentation.
[issue tracker]: https://github.com/rust-lang-nursery/reference/issues
[issue tracker]: https://github.com/rust-lang/reference/issues
[playpen]: https://play.rust-lang.org/
[rust-lang/rust]: https://github.com/rust-lang/rust/
[unstable]: https://doc.rust-lang.org/nightly/unstable-book/
View File
@ -5,4 +5,4 @@ author = "The Rust Project Developers"
[output.html]
additional-css = ["theme/reference.css"]
git-repository-url = "https://github.com/rust-lang-nursery/reference/"
git-repository-url = "https://github.com/rust-lang/reference/"
View File
@ -68,7 +68,8 @@ The *`link_section` attribute* specifies the section of the object file that a
[function] or [static]'s content will be placed into. It uses the
[_MetaNameValueStr_] syntax to specify the section name.
```rust,ignore
<!-- no_run: don't link. The format of the section name is platform-specific. -->
```rust,no_run
#[no_mangle]
#[link_section = ".example_section"]
pub static VAR1: u32 = 1;
@ -80,7 +81,7 @@ The *`export_name` attribute* specifies the name of the symbol that will be
exported on a [function] or [static]. It uses the [_MetaNameValueStr_] syntax
to specify the symbol name.
```rust,ignore
```rust
#[export_name = "exported_symbol_name"]
pub fn name_in_rust() { }
```
View File
@ -49,7 +49,8 @@ enable code generation of that function for specific platform architecture
features. It uses the [_MetaListNameValueStr_] syntax with a single key of
`enable` whose value is a string of comma-separated feature names to enable.
```rust,ignore
```rust
# #[cfg(target_feature = "avx2")]
#[target_feature(enable = "avx2")]
unsafe fn foo_avx2() {}
```
View File
@ -66,7 +66,8 @@ Non-exhaustive types cannot be constructed outside of the defining crate:
with a [_StructExpression_] \(including with [functional update syntax]).
- [`enum`][enum] instances can be constructed in an [_EnumerationVariantExpression_].
```rust,ignore (requires multiple crates)
<!-- ignore: requires external crates -->
```rust,ignore
// `Config`, `Error`, and `Message` are types defined in an upstream crate that have been
// annotated as `#[non_exhaustive]`.
use upstream::{Config, Error, Message};
@ -99,7 +100,8 @@ There are limitations when matching on non-exhaustive types outside of the defin
- When pattern matching on a non-exhaustive [`enum`][enum], matching on a variant does not
contribute towards the exhaustiveness of the arms.
```rust, ignore (requires multiple crates)
<!-- ignore: requires external crates -->
```rust, ignore
// `Config`, `Error`, and `Message` are types defined in an upstream crate that have been
// annotated as `#[non_exhaustive]`.
use upstream::{Config, Error, Message};
View File
@ -264,6 +264,7 @@ When the configuration predicate is true, this attribute expands out to the
attributes listed after the predicate. For example, the following module will
either be found at `linux.rs` or `windows.rs` based on the target.
<!-- ignore: `mod` needs multiple files -->
```rust,ignore
#[cfg_attr(linux, path = "linux.rs")]
#[cfg_attr(windows, path = "windows.rs")]
@ -273,6 +274,7 @@ mod os;
Zero, one, or more attributes may be listed. Multiple attributes will each be
expanded into separate attributes. For example:
<!-- ignore: fake attributes -->
```rust,ignore
#[cfg_attr(feature = "magic", sparkles, crackles)]
fn bewitched() {}
View File
@ -53,7 +53,7 @@ that apply to the containing module, most of which influence the behavior of
the compiler. The anonymous crate module can have additional attributes that
apply to the crate as a whole.
```rust,no_run
```rust
// Specify the crate name.
#![crate_name = "projx"]
@ -75,7 +75,8 @@ essentially to treat the source file as an executable script. The shebang
can only occur at the beginning of the file (but after the optional
_UTF8BOM_). It is ignored by the compiler. For example:
```text,ignore
<!-- ignore: tests don't like shebang -->
```rust,ignore
#!/usr/bin/env rustx
fn main() {
@ -136,7 +137,7 @@ other object being linked to defines `main`.
The *`crate_name` [attribute]* may be applied at the crate level to specify the
name of the crate with the [_MetaNameValueStr_] syntax.
```rust,ignore
```rust
#![crate_name = "mycrate"]
```
View File
@ -12,7 +12,7 @@
An _[array](../types/array.md) expression_ can be written by
enclosing zero or more comma-separated expressions of uniform type in square
brackets. This produces and array containing each of these values in the
brackets. This produces an array containing each of these values in the
order they are written.
Alternatively there can be exactly two expressions inside the brackets,
View File
@ -51,6 +51,7 @@ context, there must be some task context available.
Effectively, an `<expr>.await` expression is roughly
equivalent to the following (this desugaring is not normative):
<!-- ignore: example expansion -->
```rust,ignore
match /* <expr> */ {
mut pinned => loop {
View File
@ -10,6 +10,7 @@ A _field expression_ consists of an expression followed by a single dot and an
field of a [struct] or [union]. To call a function stored in a struct,
parentheses are needed around the field expression.
<!-- ignore: needs lots of support code -->
```rust,ignore
mystruct.myfield;
foo().x;
View File
@ -97,6 +97,7 @@ assert_eq!(a, 3);
An `if let` expression is equivalent to a [`match` expression] as follows:
<!-- ignore: expansion example -->
```rust,ignore
if let PATS = EXPR {
/* body */
@ -107,6 +108,7 @@ if let PATS = EXPR {
is equivalent to
<!-- ignore: expansion example -->
```rust,ignore
match EXPR {
PATS => { /* body */ },
@ -135,6 +137,7 @@ of the language (the implementation of if-let chains - see [eRFC 2947][_eRFCIfLe
When a lazy boolean operator expression is desired, this can be achieved
by using parentheses as below:
<!-- ignore: pseudo code -->
```rust,ignore
// Before...
if let PAT = EXPR && EXPR { .. }
View File
@ -93,6 +93,7 @@ while let _ = 5 {
A `while let` loop is equivalent to a `loop` expression containing a [`match`
expression] as follows.
<!-- ignore: expansion example -->
```rust,ignore
'label: while let PATS = EXPR {
/* loop body */
@ -101,6 +102,7 @@ expression] as follows.
is equivalent to
<!-- ignore: expansion example -->
```rust,ignore
'label: loop {
match EXPR {
@ -156,6 +158,7 @@ assert_eq!(sum, 55);
A for loop is equivalent to the following block expression.
<!-- ignore: expansion example -->
```rust,ignore
'label: for PATTERN in iter_expr {
/* loop body */
@ -164,6 +167,7 @@ A for loop is equivalent to the following block expression.
is equivalent to
<!-- ignore: expansion example -->
```rust,ignore
{
let result = match IntoIterator::into_iter(iter_expr) {
View File
@ -91,10 +91,11 @@ Every binding in each `|` separated pattern must appear in all of the patterns
in the arm. Every binding of the same name must have the same type, and have
the same binding mode.
## Match guards
Match arms can accept _match guards_ to further refine the
criteria for matching a case. Pattern guards appear after the pattern and
consist of a bool-typed expression following the `if` keyword. A pattern guard
may refer to the variables bound within the pattern they follow.
consist of a `bool`-typed expression following the `if` keyword.
When the pattern matches successfully, the pattern guard expression is executed.
If the expression evaluates to true, the pattern is successfully matched against.
@ -125,6 +126,16 @@ let message = match maybe_digit {
> assert_eq!(i.get(), 2);
> ```
A pattern guard may refer to the variables bound within the pattern it follows.
Before evaluating the guard, a shared reference is taken to the part of the
scrutinee the variable matches on. While evaluating the guard,
this shared reference is then used when accessing the variable.
Only when the guard evaluates to true is the value moved, or copied,
from the scrutinee into the variable. This allows shared borrows to be used
inside guards without moving out of the scrutinee in case the guard fails to match.
Moreover, by holding a shared reference while evaluating the guard,
mutation inside guards is also prevented.
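A small sketch of what this means in practice (the variable names are made up for illustration): binding `s` by move is fine even though the guard reads it, because the guard only sees a shared borrow:

```rust
fn main() {
    let maybe_name = Some(String::from("ferris"));

    match maybe_name {
        // While `s.len() > 3` runs, `s` is only a shared borrow of the
        // scrutinee; the `String` is moved into `s` only once the guard
        // evaluates to true.
        Some(s) if s.len() > 3 => println!("long name: {}", s),
        Some(s) => println!("short name: {}", s),
        None => println!("no name"),
    }
}
```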
## Attributes on match arms
Outer attributes are allowed on match arms. The only attributes that have
View File
@ -260,7 +260,9 @@ functions and macros in the standard library can then use that assumption
above, these operators implicitly take shared borrows of their operands,
evaluating them in [place expression context][place expression]:
```rust,ignore
```rust
# let a = 1;
# let b = 1;
a == b;
// is equivalent to
::std::cmp::PartialEq::eq(&a, &b);
View File
@ -30,6 +30,14 @@ items are defined in [implementations] and declared in [traits]. Only
functions, constants, and type aliases can be associated. Contrast to a [free
item].
### Blanket implementation
Any implementation where a type appears [uncovered](#uncovered-type). `impl<T> Foo
for T`, `impl<T> Bar<T> for T`, `impl<T> Bar<Vec<T>> for T`, and `impl<T> Bar<T>
for Vec<T>` are considered blanket impls. However, `impl<T> Bar<Vec<T>> for
Vec<T>` is not a blanket impl, as all instances of `T` which appear in this `impl`
are covered by `Vec`.
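For example, a sketch of a blanket implementation over a hypothetical `Summary` trait:

```rust
use std::fmt::Display;

trait Summary {
    fn summarize(&self) -> String;
}

// A blanket implementation: `T` appears uncovered, so this single impl
// applies to every type that implements `Display`.
impl<T: Display> Summary for T {
    fn summarize(&self) -> String {
        format!("(summary of {})", self)
    }
}

fn main() {
    println!("{}", 5.summarize());
    println!("{}", "hello".summarize());
}
```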
### Bound
Bounds are constraints on a type or trait. For example, if a bound
@ -65,6 +73,21 @@ For example, `2 + (3 * 4)` is an expression that returns the value 14.
An [item] that is not a member of an [implementation], such as a *free
function* or a *free const*. Contrast to an [associated item].
### Fundamental traits
A fundamental trait is one where adding an impl of it for an existing type is a breaking change.
The `Fn` traits and `Sized` are fundamental.
### Fundamental type constructors
A fundamental type constructor is a type where implementing a [blanket implementation](#blanket-implementation) over it
is a breaking change. `&`, `&mut`, `Box`, and `Pin` are fundamental.
Any time a type `T` is considered [local](#local-type), `&T`, `&mut T`, `Box<T>`, and `Pin<T>`
are also considered local. Fundamental type constructors cannot [cover](#uncovered-type) other types.
Any time the term "covered type" is used,
the `T` in `&T`, `&mut T`, `Box<T>`, and `Pin<T>` is not considered covered.
### Inhabited
A type is inhabited if it has constructors and therefore can be instantiated. An inhabited type is
@ -87,6 +110,19 @@ A variable is initialized if it has been assigned a value and hasn't since been
moved from. All other memory locations are assumed to be uninitialized. Only
unsafe Rust can create such a memory without initializing it.
### Local trait
A `trait` which was defined in the current crate. Whether a trait definition is
local or not is independent of the applied type arguments. Given `trait Foo<T, U>`,
`Foo` is always local, regardless of the types substituted for `T` and `U`.
### Local type
A `struct`, `enum`, or `union` which was defined in the current crate.
This is not affected by applied type arguments. `struct Foo` is considered local, but
`Vec<Foo>` is not. `LocalType<ForeignType>` is local. Type aliases do not
affect locality.
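A sketch of how locality plays out, using a hypothetical newtype: implementing a foreign trait (`Display`) is allowed because `Wrapper` is local, even though it wraps a foreign type:

```rust
use std::fmt;

// `Wrapper` is a local type; the `Vec<String>` inside it is foreign.
struct Wrapper(Vec<String>);

// Allowed: the trait is foreign, but the implementing type is local.
impl fmt::Display for Wrapper {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "[{}]", self.0.join(", "))
    }
}

fn main() {
    let w = Wrapper(vec![String::from("a"), String::from("b")]);
    println!("{}", w);
}
```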
### Nominal types
Types that can be referred to by a path directly. Specifically [enums],
@ -158,6 +194,12 @@ It allows a type to make certain promises about its behavior.
Generic functions and generic structs can use traits to constrain, or bound, the types they accept.
### Uncovered type
A type which does not appear as an argument to another type. For example,
`T` is uncovered, but the `T` in `Vec<T>` is covered. This is only relevant for
type arguments.
### Undefined behavior
Compile-time or run-time behavior that is not specified. This may result in,
View File
@ -132,9 +132,9 @@ hesitate to file an issue or ask about it in the `#docs` channels on
attention to making those sections the best that they can be.
[book]: ../book/index.html
[github issues]: https://github.com/rust-lang-nursery/reference/issues
[github issues]: https://github.com/rust-lang/reference/issues
[standard library]: ../std/index.html
[the Rust Reference repository]: https://github.com/rust-lang-nursery/reference/
[the Rust Reference repository]: https://github.com/rust-lang/reference/
[Unstable Book]: https://doc.rust-lang.org/nightly/unstable-book/
[_Expression_]: expressions.md
[cargo book]: ../cargo/index.html
View File
@ -18,7 +18,7 @@ Every associated item kind comes in two varieties: definitions that contain the
actual implementation and declarations that declare signatures for
definitions.
It is the declarations that make up the contract of traits and what it available
It is the declarations that make up the contract of traits and what is available
on generic types.
## Associated functions and methods
@ -360,4 +360,4 @@ fn main() {
[function item]: ../types/function-item.md
[method call operator]: ../expressions/method-call-expr.md
[path]: ../paths.md
[regular function parameters]: functions.md#attributes-on-function-parameters
[regular function parameters]: functions.md#attributes-on-function-parameters
View File
@ -12,7 +12,7 @@
> &nbsp;&nbsp; _EnumItem_ ( `,` _EnumItem_ )<sup>\*</sup> `,`<sup>?</sup>
>
> _EnumItem_ :\
> &nbsp;&nbsp; _OuterAttribute_<sup>\*</sup>\
> &nbsp;&nbsp; _OuterAttribute_<sup>\*</sup> [_Visibility_]<sup>?</sup>\
> &nbsp;&nbsp; [IDENTIFIER]&nbsp;( _EnumItemTuple_ | _EnumItemStruct_
> | _EnumItemDiscriminant_ )<sup>?</sup>
>
@ -91,7 +91,7 @@ using a [primitive representation] or the [`C` representation].
It is an error when two variants share the same discriminant.
```rust,ignore
```rust,compile_fail
enum SharedDiscriminantError {
SharedA = 1,
SharedB = 1
@ -107,7 +107,7 @@ enum SharedDiscriminantError2 {
It is also an error to have an unspecified discriminant where the previous
discriminant is the maximum value for the size of the discriminant.
```rust,ignore
```rust,compile_fail
#[repr(u8)]
enum OverflowingDiscriminantError {
Max = 255,
@ -131,14 +131,56 @@ no valid values, they cannot be instantiated.
enum ZeroVariants {}
```
Zero-variant enums are equivalent to the [never type], but they cannot be
coerced into other types.
```rust,compile_fail
# enum ZeroVariants {}
let x: ZeroVariants = panic!();
let y: u32 = x; // mismatched type error
```
## Variant visibility
Enum variants syntactically allow a [_Visibility_] annotation, but this is
rejected when the enum is validated. This allows items to be parsed with a
unified syntax across different contexts where they are used.
```rust
macro_rules! mac_variant {
    ($vis:vis $name:ident) => {
        enum $name {
            $vis Unit,
            $vis Tuple(u8, u16),
            $vis Struct { f: u8 },
        }
    }
}
// Empty `vis` is allowed.
mac_variant! { E }
// This is allowed, since it is removed before being validated.
#[cfg(FALSE)]
enum E {
    pub U,
    pub(crate) T(u8),
    pub(super) T { f: String }
}
```
[IDENTIFIER]: ../identifiers.md
[_Generics_]: generics.md
[_WhereClause_]: generics.md#where-clauses
[_Expression_]: ../expressions.md
[_TupleFields_]: structs.md
[_StructFields_]: structs.md
[_Visibility_]: ../visibility-and-privacy.md
[enumerated type]: ../types/enum.md
[`mem::discriminant`]: ../../std/mem/fn.discriminant.html
[never type]: ../types/never.md
[numeric cast]: ../expressions/operator-expr.md#semantics
[constant expression]: ../const_eval.md#constant-expressions
[default representation]: ../type-layout.md#the-default-representation
View File
@ -28,6 +28,7 @@ In this case the `as` clause must be used to specify the name to bind it to.
Three examples of `extern crate` declarations:
<!-- ignore: requires external crates -->
```rust,ignore
extern crate pcre;
@ -43,6 +44,7 @@ details).
Here is an example:
<!-- ignore: requires external crates -->
```rust,ignore
// Importing the Cargo package hello-world
extern crate hello_world; // hyphen replaced with an underscore
View File
@ -64,7 +64,7 @@ By default external blocks assume that the library they are calling uses the
standard C ABI on the specific platform. Other ABIs may be specified using an
`abi` string, as shown here:
```rust,ignore
```rust
// Interface to the Windows API
extern "stdcall" { }
```
@ -97,7 +97,7 @@ There are also some platform-specific ABI strings:
Functions within external blocks may be variadic by specifying `...` after one
or more named arguments in the argument list:
```rust,ignore
```rust
extern {
fn foo(x: i32, ...);
}
@ -128,6 +128,7 @@ name for the items within an `extern` block when importing symbols from the
host environment. The default module name is `env` if `wasm_import_module` is
not specified.
<!-- ignore: requires extern linking -->
```rust,ignore
#[link(name = "crypto")]
extern {
@ -156,7 +157,7 @@ The `link_name` attribute may be specified on declarations inside an `extern`
block to indicate the symbol to import for the given function or static. It
uses the [_MetaNameValueStr_] syntax to specify the name of the symbol.
```rust,ignore
```rust
extern {
#[link_name = "actual_symbol_name"]
fn name_in_rust();
@ -184,4 +185,4 @@ restrictions as [regular function parameters].
[_Visibility_]: ../visibility-and-privacy.md
[_WhereClause_]: generics.md#where-clauses
[attributes]: ../attributes.md
[regular function parameters]: functions.md#attributes-on-function-parameters
[regular function parameters]: functions.md#attributes-on-function-parameters
View File
@ -58,6 +58,7 @@ the body of the function will short-cut that implicit return, if reached.
For example, the function above behaves as if it was written as:
<!-- ignore: example expansion -->
```rust,ignore
// argument_0 is the actual first argument passed from the caller
let (value, _) = argument_0;
@ -115,14 +116,16 @@ sufficient context to determine the type parameters. For example,
The `extern` function qualifier allows providing function _definitions_ that can
be called with a particular ABI:
<!-- ignore: fake ABI -->
```rust,ignore
extern "ABI" fn foo() { ... }
extern "ABI" fn foo() { /* ... */ }
```
These are often used in combination with [external block] items which provide
function _declarations_ that can be used to call functions without providing
their _definition_:
<!-- ignore: fake ABI -->
```rust,ignore
extern "ABI" {
fn foo(); /* no body */
@ -376,6 +379,7 @@ For example, the following code defines an inert `some_inert_attribute` attribut
is not formally defined anywhere and the `some_proc_macro_attribute` procedural macro is
responsible for detecting its presence and removing it from the output token stream.
<!-- ignore: requires proc macro -->
```rust,ignore
#[some_proc_macro_attribute]
fn foo_oof(#[some_inert_attribute] arg: u8) {