New upstream version 1.39.0+dfsg1

Ximin Luo 2019-11-27 12:13:46 +00:00
parent 416331ca66
commit e1599b0ce6
3363 changed files with 140208 additions and 66286 deletions

Cargo.lock (generated): file diff suppressed because it is too large.


@ -68,6 +68,7 @@ rustc-workspace-hack = { path = 'src/tools/rustc-workspace-hack' }
# here
rustc-std-workspace-core = { path = 'src/tools/rustc-std-workspace-core' }
rustc-std-workspace-alloc = { path = 'src/tools/rustc-std-workspace-alloc' }
rustc-std-workspace-std = { path = 'src/tools/rustc-std-workspace-std' }
[patch."https://github.com/rust-lang/rust-clippy"]
clippy_lints = { path = "src/tools/clippy/clippy_lints" }


@ -26,12 +26,13 @@ or reading the [rustc guide][rustcguidebuild].
### Building on *nix
1. Make sure you have installed the dependencies:
* `g++` 4.7 or later or `clang++` 3.x or later
* `g++` 5.1 or later or `clang++` 3.5 or later
* `python` 2.7 (but not 3.x)
* GNU `make` 3.81 or later
* `cmake` 3.4.3 or later
* `curl`
* `git`
* `ssl` which comes in `libssl-dev` or `openssl-devel`
2. Clone the [source] with `git`:
@ -56,6 +57,8 @@ or reading the [rustc guide][rustcguidebuild].
an installation (using `./x.py install`) that you set the `prefix` value
in the `[install]` section to a directory for which you have write permissions.
Create the install directory if you are not installing in the default directory
4. Build and install:
```sh
@ -144,10 +147,21 @@ then you may need to force rustbuild to use an older version. This can be done
by manually calling the appropriate vcvars file before running the bootstrap.
```batch
> CALL "C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\VC\Auxiliary\Build\vcvars64.bat"
> CALL "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvars64.bat"
> python x.py build
```
### Building rustc with older host toolchains
It is still possible to build Rust with the older toolchain versions listed below, but only if the
LLVM_TEMPORARILY_ALLOW_OLD_TOOLCHAIN option is set to true in the config.toml file.
* Clang 3.1
* Apple Clang 3.1
* GCC 4.8
* Visual Studio 2015 (Update 3)
Toolchain versions older than what is listed above cannot be used to build rustc.
#### Specifying an ABI
Each specific ABI can also be used from either environment (for example, using


@ -1,3 +1,127 @@
Version 1.39.0 (2019-11-07)
===========================
Language
--------
- [You can now create `async` functions and blocks with `async fn`, `async move {}`, and
`async {}` respectively, and you can now call `.await` on async expressions.][63209] A short illustrative sketch follows this list.
- [You can now use certain attributes on function, closure, and function pointer
parameters.][64010] These attributes include `cfg`, `cfg_attr`, `allow`, `warn`,
`deny`, `forbid` as well as inert helper attributes used by procedural macro
attributes applied to items. e.g.
```rust
fn len(
#[cfg(windows)] slice: &[u16],
#[cfg(not(windows))] slice: &[u8],
) -> usize {
slice.len()
}
```
- [You can now take shared references to bind-by-move patterns in the `if` guards
of `match` arms.][63118] e.g.
```rust
fn main() {
let array: Box<[u8; 4]> = Box::new([1, 2, 3, 4]);
match array {
nums
// ---- `nums` is bound by move.
if nums.iter().sum::<u8>() == 10
// ^------ `.iter()` implicitly takes a reference to `nums`.
=> {
drop(nums);
// ----------- Legal as `nums` was bound by move and so we have ownership.
}
_ => unreachable!(),
}
}
```
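For illustration, a minimal sketch of the new `async`/`.await` syntax (the function names here are illustrative, not from the release notes):
```rust
// `async fn` and `async {}` blocks produce futures; `.await` suspends the
// enclosing async context until the awaited future completes.
async fn double(x: u32) -> u32 {
    x * 2
}

async fn compute() -> u32 {
    let doubled = double(21).await;
    let adjusted = async { doubled + 1 }.await; // async blocks work the same way
    adjusted
}

fn main() {
    // Constructing the future does no work by itself; driving it to completion
    // requires an executor (the standard library does not ship one), e.g.
    // `futures::executor::block_on(compute())` with the `futures` crate.
    let _future = compute();
}
```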
Compiler
--------
- [Added tier 3\* support for the `i686-unknown-uefi` target.][64334]
- [Added tier 3 support for the `sparc64-unknown-openbsd` target.][63595]
- [rustc will now trim code snippets in diagnostics to fit in your terminal.][63402]
**Note** Cargo currently doesn't use this feature. Refer to
[cargo#7315][cargo/7315] to track this feature's progress.
- [You can now pass the `--show-output` argument to test binaries to print the
output of successful tests.][62600]
\* Refer to Rust's [platform support page][forge-platform-support] for more
information on Rust's tiered platform support.
Libraries
---------
- [`Vec::new` and `String::new` are now `const` functions.][64028]
- [`LinkedList::new` is now a `const` function.][63684]
- [`str::len`, `[T]::len` and `str::as_bytes` are now `const` functions.][63770]
- [The `abs`, `wrapping_abs`, and `overflowing_abs` numeric functions are
now `const`.][63786]
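A short sketch of what the newly `const` constructors allow (the constant names are illustrative):
```rust
use std::collections::LinkedList;

// `Vec::new`, `String::new`, and `LinkedList::new` can now initialize
// constants and statics directly, with no lazy initialization.
const EMPTY_BYTES: Vec<u8> = Vec::new();
static EMPTY_NAME: String = String::new();

// `str::len` is likewise callable in const contexts.
const GREETING_LEN: usize = "hello".len();

fn main() {
    let list: LinkedList<u8> = LinkedList::new();
    assert_eq!(EMPTY_BYTES.len(), 0);
    assert!(EMPTY_NAME.is_empty());
    assert_eq!(GREETING_LEN, 5);
    assert!(list.is_empty());
}
```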
Stabilized APIs
---------------
- [`Pin::into_inner`]
- [`Instant::checked_duration_since`]
- [`Instant::saturating_duration_since`]
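The two `Instant` methods differ in how they handle an "earlier" instant that is in fact later; a brief sketch:
```rust
use std::thread::sleep;
use std::time::{Duration, Instant};

fn main() {
    let earlier = Instant::now();
    sleep(Duration::from_millis(10));
    let later = Instant::now();

    // Some(duration), because `later` is not before `earlier`.
    assert!(later.checked_duration_since(earlier).is_some());
    // None when the argument is actually later than `self` ...
    assert_eq!(earlier.checked_duration_since(later), None);
    // ... while the saturating variant clamps to a zero-length duration.
    assert_eq!(earlier.saturating_duration_since(later), Duration::new(0, 0));
}
```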
Cargo
-----
- [You can now publish git dependencies if supplied with a `version`.][cargo/7237]
- [The `--all` flag has been renamed to `--workspace`.][cargo/7241] Using
`--all` is now deprecated.
Misc
----
- [You can now pass `-Clinker` to rustdoc to control the linker used
for compiling doctests.][63834]
Compatibility Notes
-------------------
- [Code that was previously accepted by the old borrow checker, but rejected by
the NLL borrow checker is now a hard error in Rust 2018.][63565] This was
previously a warning, and will also become a hard error in the Rust 2015
edition in the 1.40.0 release.
- [`rustdoc` now requires `rustc` to be installed and in the same directory to
run tests.][63827] This should improve performance when running a large
number of doctests.
- [The `try!` macro will now issue a deprecation warning.][62672] It is
recommended to use the `?` operator instead (a short sketch follows these notes).
- [`asinh(-0.0)` now correctly returns `-0.0`.][63698] Previously this
returned `0.0`.
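A small before/after sketch for the `try!` deprecation (the function and path are illustrative):
```rust
use std::fs::File;
use std::io::{self, Read};

// Before: `let mut file = try!(File::open(path));`
// After: the `?` operator propagates the error in the same way.
fn read_to_string(path: &str) -> io::Result<String> {
    let mut file = File::open(path)?;
    let mut contents = String::new();
    file.read_to_string(&mut contents)?;
    Ok(contents)
}

fn main() {
    // Hypothetical call site, only here to exercise the function.
    match read_to_string("Cargo.toml") {
        Ok(text) => println!("read {} bytes", text.len()),
        Err(err) => eprintln!("error: {}", err),
    }
}
```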
[62600]: https://github.com/rust-lang/rust/pull/62600/
[62672]: https://github.com/rust-lang/rust/pull/62672/
[63118]: https://github.com/rust-lang/rust/pull/63118/
[63209]: https://github.com/rust-lang/rust/pull/63209/
[63402]: https://github.com/rust-lang/rust/pull/63402/
[63565]: https://github.com/rust-lang/rust/pull/63565/
[63595]: https://github.com/rust-lang/rust/pull/63595/
[63684]: https://github.com/rust-lang/rust/pull/63684/
[63698]: https://github.com/rust-lang/rust/pull/63698/
[63770]: https://github.com/rust-lang/rust/pull/63770/
[63786]: https://github.com/rust-lang/rust/pull/63786/
[63827]: https://github.com/rust-lang/rust/pull/63827/
[63834]: https://github.com/rust-lang/rust/pull/63834/
[63927]: https://github.com/rust-lang/rust/pull/63927/
[63933]: https://github.com/rust-lang/rust/pull/63933/
[63934]: https://github.com/rust-lang/rust/pull/63934/
[63938]: https://github.com/rust-lang/rust/pull/63938/
[63940]: https://github.com/rust-lang/rust/pull/63940/
[63941]: https://github.com/rust-lang/rust/pull/63941/
[63945]: https://github.com/rust-lang/rust/pull/63945/
[64010]: https://github.com/rust-lang/rust/pull/64010/
[64028]: https://github.com/rust-lang/rust/pull/64028/
[64334]: https://github.com/rust-lang/rust/pull/64334/
[cargo/7237]: https://github.com/rust-lang/cargo/pull/7237/
[cargo/7241]: https://github.com/rust-lang/cargo/pull/7241/
[cargo/7315]: https://github.com/rust-lang/cargo/pull/7315/
[`Pin::into_inner`]: https://doc.rust-lang.org/std/pin/struct.Pin.html#method.into_inner
[`Instant::checked_duration_since`]: https://doc.rust-lang.org/std/time/struct.Instant.html#method.checked_duration_since
[`Instant::saturating_duration_since`]: https://doc.rust-lang.org/std/time/struct.Instant.html#method.saturating_duration_since
Version 1.38.0 (2019-09-26)
==========================
@ -47,8 +171,6 @@ Stabilized APIs
- [`<*mut T>::cast`]
- [`Duration::as_secs_f32`]
- [`Duration::as_secs_f64`]
- [`Duration::div_duration_f32`]
- [`Duration::div_duration_f64`]
- [`Duration::div_f32`]
- [`Duration::div_f64`]
- [`Duration::from_secs_f32`]
@ -70,10 +192,10 @@ Misc
Compatibility Notes
-------------------
- Unfortunately the [`x86_64-unknown-uefi` platform can not be built][62785]
with rustc 1.39.0.
- The [`armv7-unknown-linux-gnueabihf` platform is also known to have
issues][62896] for certain crates such as libc.
- The [`x86_64-unknown-uefi` platform can not be built][62785] with rustc
1.38.0.
- The [`armv7-unknown-linux-gnueabihf` platform is known to have
issues][62896] with certain crates such as libc.
[60260]: https://github.com/rust-lang/rust/pull/60260/
[61457]: https://github.com/rust-lang/rust/pull/61457/
@ -100,8 +222,6 @@ Compatibility Notes
[`<*mut T>::cast`]: https://doc.rust-lang.org/std/primitive.pointer.html#method.cast
[`Duration::as_secs_f32`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.as_secs_f32
[`Duration::as_secs_f64`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.as_secs_f64
[`Duration::div_duration_f32`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.div_duration_f32
[`Duration::div_duration_f64`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.div_duration_f64
[`Duration::div_f32`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.div_f32
[`Duration::div_f64`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.div_f64
[`Duration::from_secs_f32`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.from_secs_f32


@ -141,10 +141,10 @@
# library and facade crates.
#compiler-docs = false
# Indicate whether submodules are managed and updated automatically.
# Indicate whether git submodules are managed and updated automatically.
#submodules = true
# Update submodules only when the checked out commit in the submodules differs
# Update git submodules only when the checked out commit in the submodules differs
# from what is committed in the main rustc repo.
#fast-submodules = true
@ -184,7 +184,7 @@
# default.
#extended = false
# Installs chosen set of extended tools if enables. By default builds all.
# Installs chosen set of extended tools if enabled. By default builds all.
# If chosen tool failed to build the installation fails.
#tools = ["cargo", "rls", "clippy", "rustfmt", "analysis", "src"]
@ -382,11 +382,6 @@
# This is the name of the directory in which codegen backends will get installed
#codegen-backends-dir = "codegen-backends"
# Flag indicating whether `libstd` calls an imported function to handle basic IO
# when targeting WebAssembly. Enable this to debug tests for the `wasm32-unknown-unknown`
# target, as without this option the test output will not be captured.
#wasm-syscall = false
# Indicates whether LLD will be compiled and made available in the sysroot for
# rustc to execute.
#lld = false


@ -1 +1 @@
625451e376bb2e5283fc4741caa0a3e8a2ca4d54
4560ea788cb760f0a34127156c78e2552949f734


@ -44,7 +44,7 @@ cc = "1.0.35"
libc = "0.2"
serde = { version = "1.0.8", features = ["derive"] }
serde_json = "1.0.2"
toml = "0.4"
toml = "0.5"
lazy_static = "1.3.0"
time = "0.1"
petgraph = "0.4.13"


@ -5,9 +5,6 @@
//! parent directory, and otherwise documentation can be found throughout the `build`
//! directory in each respective module.
// NO-RUSTC-WRAPPER
#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
use std::env;
use bootstrap::{Config, Build};


@ -15,11 +15,7 @@
//! switching compilers for the bootstrap and for build scripts will probably
//! never get replaced.
// NO-RUSTC-WRAPPER
#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
use std::env;
use std::ffi::OsString;
use std::io;
use std::path::PathBuf;
use std::process::Command;
@ -27,35 +23,7 @@ use std::str::FromStr;
use std::time::Instant;
fn main() {
let mut args = env::args_os().skip(1).collect::<Vec<_>>();
// Append metadata suffix for internal crates. See the corresponding entry
// in bootstrap/lib.rs for details.
if let Ok(s) = env::var("RUSTC_METADATA_SUFFIX") {
for i in 1..args.len() {
// Dirty code for borrowing issues
let mut new = None;
if let Some(current_as_str) = args[i].to_str() {
if (&*args[i - 1] == "-C" && current_as_str.starts_with("metadata")) ||
current_as_str.starts_with("-Cmetadata") {
new = Some(format!("{}-{}", current_as_str, s));
}
}
if let Some(new) = new { args[i] = new.into(); }
}
}
// Drop `--error-format json` because despite our desire for json messages
// from Cargo we don't want any from rustc itself.
if let Some(n) = args.iter().position(|n| n == "--error-format") {
args.remove(n);
args.remove(n);
}
if let Some(s) = env::var_os("RUSTC_ERROR_FORMAT") {
args.push("--error-format".into());
args.push(s);
}
let args = env::args_os().skip(1).collect::<Vec<_>>();
// Detect whether or not we're a build script depending on whether --target
// is passed (a bit janky...)
@ -101,47 +69,19 @@ fn main() {
if let Some(crate_name) = crate_name {
if let Some(target) = env::var_os("RUSTC_TIME") {
if target == "all" ||
target.into_string().unwrap().split(",").any(|c| c.trim() == crate_name)
target.into_string().unwrap().split(",").any(|c| c.trim() == crate_name)
{
cmd.arg("-Ztime");
}
}
}
// Non-zero stages must all be treated uniformly to avoid problems when attempting to uplift
// compiler libraries and such from stage 1 to 2.
if stage == "0" {
cmd.arg("--cfg").arg("bootstrap");
}
// Print backtrace in case of ICE
if env::var("RUSTC_BACKTRACE_ON_ICE").is_ok() && env::var("RUST_BACKTRACE").is_err() {
cmd.env("RUST_BACKTRACE", "1");
}
cmd.env("RUSTC_BREAK_ON_ICE", "1");
if let Ok(debuginfo_level) = env::var("RUSTC_DEBUGINFO_LEVEL") {
cmd.arg(format!("-Cdebuginfo={}", debuginfo_level));
}
if env::var_os("RUSTC_DENY_WARNINGS").is_some() &&
env::var_os("RUSTC_EXTERNAL_TOOL").is_none() {
// When extending this list, search for `NO-RUSTC-WRAPPER` and add the new lints
// there as well, some code doesn't go through this `rustc` wrapper.
cmd.arg("-Dwarnings");
cmd.arg("-Drust_2018_idioms");
cmd.arg("-Dunused_lifetimes");
// cfg(not(bootstrap)): Remove this during the next stage 0 compiler update.
// `-Drustc::internal` is a new feature and `rustc_version` mis-reports the `stage`.
let cfg_not_bootstrap = stage != "0" && crate_name != Some("rustc_version");
if cfg_not_bootstrap && use_internal_lints(crate_name) {
cmd.arg("-Zunstable-options");
cmd.arg("-Drustc::internal");
}
}
if let Some(target) = target {
if target.is_some() {
// The stage0 compiler has a special sysroot distinct from what we
// actually downloaded, so we just always pass the `--sysroot` option,
// unless one is already set.
@ -149,43 +89,6 @@ fn main() {
cmd.arg("--sysroot").arg(&sysroot);
}
cmd.arg("-Zexternal-macro-backtrace");
// Link crates to the proc macro crate for the target, but use a host proc macro crate
// to actually run the macros
if env::var_os("RUST_DUAL_PROC_MACROS").is_some() {
cmd.arg("-Zdual-proc-macros");
}
// When we build Rust dylibs they're all intended for intermediate
// usage, so make sure we pass the -Cprefer-dynamic flag instead of
// linking all deps statically into the dylib.
if env::var_os("RUSTC_NO_PREFER_DYNAMIC").is_none() {
cmd.arg("-Cprefer-dynamic");
}
// Help the libc crate compile by assisting it in finding various
// sysroot native libraries.
if let Some(s) = env::var_os("MUSL_ROOT") {
if target.contains("musl") {
let mut root = OsString::from("native=");
root.push(&s);
root.push("/lib");
cmd.arg("-L").arg(&root);
}
}
if let Some(s) = env::var_os("WASI_ROOT") {
let mut root = OsString::from("native=");
root.push(&s);
root.push("/lib/wasm32-wasi");
cmd.arg("-L").arg(&root);
}
// Override linker if necessary.
if let Ok(target_linker) = env::var("RUSTC_TARGET_LINKER") {
cmd.arg(format!("-Clinker={}", target_linker));
}
// If we're compiling specifically the `panic_abort` crate then we pass
// the `-C panic=abort` option. Note that we do not do this for any
// other crate intentionally as this is the only crate for now that we
@ -212,86 +115,18 @@ fn main() {
// The compiler builtins are pretty sensitive to symbols referenced in
// libcore and such, so we never compile them with debug assertions.
//
// FIXME(rust-lang/cargo#7253) we should be doing this in `builder.rs`
// with env vars instead of doing it here in this script.
if crate_name == Some("compiler_builtins") {
cmd.arg("-C").arg("debug-assertions=no");
} else {
cmd.arg("-C").arg(format!("debug-assertions={}", debug_assertions));
}
if let Ok(s) = env::var("RUSTC_CODEGEN_UNITS") {
cmd.arg("-C").arg(format!("codegen-units={}", s));
}
// Emit save-analysis info.
if env::var("RUSTC_SAVE_ANALYSIS") == Ok("api".to_string()) {
cmd.arg("-Zsave-analysis");
cmd.env("RUST_SAVE_ANALYSIS_CONFIG",
"{\"output_file\": null,\"full_docs\": false,\
\"pub_only\": true,\"reachable_only\": false,\
\"distro_crate\": true,\"signatures\": false,\"borrow_data\": false}");
}
// Dealing with rpath here is a little special, so let's go into some
// detail. First off, `-rpath` is a linker option on Unix platforms
// which adds to the runtime dynamic loader path when looking for
// dynamic libraries. We use this by default on Unix platforms to ensure
// that our nightlies behave the same on Windows, that is they work out
// of the box. This can be disabled, of course, but basically that's why
// we're gated on RUSTC_RPATH here.
//
// Ok, so the astute might be wondering "why isn't `-C rpath` used
// here?" and that is indeed a good question to task. This codegen
// option is the compiler's current interface to generating an rpath.
// Unfortunately it doesn't quite suffice for us. The flag currently
// takes no value as an argument, so the compiler calculates what it
// should pass to the linker as `-rpath`. This unfortunately is based on
// the **compile time** directory structure which when building with
// Cargo will be very different than the runtime directory structure.
//
// All that's a really long winded way of saying that if we use
// `-Crpath` then the executables generated have the wrong rpath of
// something like `$ORIGIN/deps` when in fact the way we distribute
// rustc requires the rpath to be `$ORIGIN/../lib`.
//
// So, all in all, to set up the correct rpath we pass the linker
// argument manually via `-C link-args=-Wl,-rpath,...`. Plus isn't it
// fun to pass a flag to a tool to pass a flag to pass a flag to a tool
// to change a flag in a binary?
if env::var("RUSTC_RPATH") == Ok("true".to_string()) {
let rpath = if target.contains("apple") {
// Note that we need to take one extra step on macOS to also pass
// `-Wl,-instal_name,@rpath/...` to get things to work right. To
// do that we pass a weird flag to the compiler to get it to do
// so. Note that this is definitely a hack, and we should likely
// flesh out rpath support more fully in the future.
cmd.arg("-Z").arg("osx-rpath-install-name");
Some("-Wl,-rpath,@loader_path/../lib")
} else if !target.contains("windows") &&
!target.contains("wasm32") &&
!target.contains("fuchsia") {
Some("-Wl,-rpath,$ORIGIN/../lib")
} else {
None
};
if let Some(rpath) = rpath {
cmd.arg("-C").arg(format!("link-args={}", rpath));
}
}
if let Ok(s) = env::var("RUSTC_CRT_STATIC") {
if s == "true" {
cmd.arg("-C").arg("target-feature=+crt-static");
}
if s == "false" {
cmd.arg("-C").arg("target-feature=-crt-static");
}
}
if let Ok(map) = env::var("RUSTC_DEBUGINFO_MAP") {
cmd.arg("--remap-path-prefix").arg(&map);
}
} else {
// FIXME(rust-lang/cargo#5754) we shouldn't be using special env vars
// here, but rather Cargo should know what flags to pass rustc itself.
// Override linker if necessary.
if let Ok(host_linker) = env::var("RUSTC_HOST_LINKER") {
cmd.arg(format!("-Clinker={}", host_linker));
@ -307,6 +142,10 @@ fn main() {
}
}
if let Ok(map) = env::var("RUSTC_DEBUGINFO_MAP") {
cmd.arg("--remap-path-prefix").arg(&map);
}
// Force all crates compiled by this compiler to (a) be unstable and (b)
// allow the `rustc_private` feature to link to other unstable crates
// also in the sysroot. We also do this for host crates, since those
@ -315,10 +154,6 @@ fn main() {
cmd.arg("-Z").arg("force-unstable-if-unmarked");
}
if env::var_os("RUSTC_PARALLEL_COMPILER").is_some() {
cmd.arg("--cfg").arg("parallel_compiler");
}
if verbose > 1 {
eprintln!(
"rustc command: {:?}={:?} {:?}",
@ -369,14 +204,6 @@ fn main() {
std::process::exit(code);
}
// Rustc crates for which internal lints are in effect.
fn use_internal_lints(crate_name: Option<&str>) -> bool {
crate_name.map_or(false, |crate_name| {
crate_name.starts_with("rustc") || crate_name.starts_with("syntax") ||
["arena", "fmt_macros"].contains(&crate_name)
})
}
#[cfg(unix)]
fn exec_cmd(cmd: &mut Command) -> io::Result<i32> {
use std::os::unix::process::CommandExt;


@ -2,12 +2,10 @@
//!
//! See comments in `src/bootstrap/rustc.rs` for more information.
// NO-RUSTC-WRAPPER
#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
use std::env;
use std::process::Command;
use std::path::PathBuf;
use std::ffi::OsString;
fn main() {
let args = env::args_os().skip(1).collect::<Vec<_>>();
@ -47,7 +45,9 @@ fn main() {
cmd.arg("-Z").arg("force-unstable-if-unmarked");
}
if let Some(linker) = env::var_os("RUSTC_TARGET_LINKER") {
cmd.arg("--linker").arg(linker).arg("-Z").arg("unstable-options");
let mut arg = OsString::from("-Clinker=");
arg.push(&linker);
cmd.arg(arg);
}
// Bootstrap's Cargo-command builder sets this variable to the current Rust version; let's pick


@ -320,7 +320,7 @@ class RustBuild(object):
def __init__(self):
self.cargo_channel = ''
self.date = ''
self._download_url = 'https://static.rust-lang.org'
self._download_url = ''
self.rustc_channel = ''
self.build = ''
self.build_dir = os.path.join(os.getcwd(), "build")
@ -523,6 +523,10 @@ class RustBuild(object):
'value2'
>>> rb.get_toml('key', 'c') is None
True
>>> rb.config_toml = 'key1 = true'
>>> rb.get_toml("key1")
'true'
"""
cur_section = None
@ -571,6 +575,12 @@ class RustBuild(object):
>>> RustBuild.get_string(' "devel" ')
'devel'
>>> RustBuild.get_string(" 'devel' ")
'devel'
>>> RustBuild.get_string('devel') is None
True
>>> RustBuild.get_string(' "devel ')
''
"""
start = line.find('"')
if start != -1:
@ -631,6 +641,9 @@ class RustBuild(object):
target_linker = self.get_toml("linker", build_section)
if target_linker is not None:
env["RUSTFLAGS"] += "-C linker=" + target_linker + " "
env["RUSTFLAGS"] += " -Wrust_2018_idioms -Wunused_lifetimes "
if self.get_toml("deny-warnings", "rust") != "false":
env["RUSTFLAGS"] += "-Dwarnings "
env["PATH"] = os.path.join(self.bin_root(), "bin") + \
os.pathsep + env["PATH"]
@ -666,7 +679,7 @@ class RustBuild(object):
def update_submodule(self, module, checked_out, recorded_submodules):
module_path = os.path.join(self.rust_root, module)
if checked_out != None:
if checked_out is not None:
default_encoding = sys.getdefaultencoding()
checked_out = checked_out.communicate()[0].decode(default_encoding).strip()
if recorded_submodules[module] == checked_out:
@ -695,6 +708,14 @@ class RustBuild(object):
if (not os.path.exists(os.path.join(self.rust_root, ".git"))) or \
self.get_toml('submodules') == "false":
return
# check the existence of 'git' command
try:
subprocess.check_output(['git', '--version'])
except (subprocess.CalledProcessError, OSError):
print("error: `git` is not found, please make sure it's installed and in the path.")
sys.exit(1)
slow_submodules = self.get_toml('fast-submodules') == "false"
start_time = time()
if slow_submodules:
@ -731,9 +752,19 @@ class RustBuild(object):
self.update_submodule(module[0], module[1], recorded_submodules)
print("Submodules updated in %.2f seconds" % (time() - start_time))
def set_normal_environment(self):
"""Set download URL for normal environment"""
if 'RUSTUP_DIST_SERVER' in os.environ:
self._download_url = os.environ['RUSTUP_DIST_SERVER']
else:
self._download_url = 'https://static.rust-lang.org'
def set_dev_environment(self):
"""Set download URL for development environment"""
self._download_url = 'https://dev-static.rust-lang.org'
if 'RUSTUP_DEV_DIST_SERVER' in os.environ:
self._download_url = os.environ['RUSTUP_DEV_DIST_SERVER']
else:
self._download_url = 'https://dev-static.rust-lang.org'
def check_vendored_status(self):
"""Check that vendoring is configured properly"""
@ -809,13 +840,13 @@ def bootstrap(help_triggered):
except (OSError, IOError):
pass
match = re.search(r'\nverbose = (\d+)', build.config_toml)
if match is not None:
build.verbose = max(build.verbose, int(match.group(1)))
config_verbose = build.get_toml('verbose', 'build')
if config_verbose is not None:
build.verbose = max(build.verbose, int(config_verbose))
build.use_vendored_sources = '\nvendor = true' in build.config_toml
build.use_vendored_sources = build.get_toml('vendor', 'build') == 'true'
build.use_locked_deps = '\nlocked-deps = true' in build.config_toml
build.use_locked_deps = build.get_toml('locked-deps', 'build') == 'true'
build.check_vendored_status()
@ -826,6 +857,8 @@ def bootstrap(help_triggered):
if 'dev' in data:
build.set_dev_environment()
else:
build.set_normal_environment()
build.update_submodules()


@ -3,6 +3,7 @@ use std::cell::{Cell, RefCell};
use std::collections::BTreeSet;
use std::collections::HashMap;
use std::env;
use std::ffi::OsStr;
use std::fmt::Debug;
use std::fs;
use std::hash::Hash;
@ -145,7 +146,7 @@ impl StepDescription {
only_hosts: S::ONLY_HOSTS,
should_run: S::should_run,
make_run: S::make_run,
name: unsafe { ::std::intrinsics::type_name::<S>() },
name: std::any::type_name::<S>(),
}
}
@ -337,7 +338,6 @@ impl<'a> Builder<'a> {
match kind {
Kind::Build => describe!(
compile::Std,
compile::Test,
compile::Rustc,
compile::CodegenBackend,
compile::StartupObjects,
@ -363,7 +363,6 @@ impl<'a> Builder<'a> {
),
Kind::Check | Kind::Clippy | Kind::Fix => describe!(
check::Std,
check::Test,
check::Rustc,
check::CodegenBackend,
check::Rustdoc
@ -425,8 +424,6 @@ impl<'a> Builder<'a> {
doc::TheBook,
doc::Standalone,
doc::Std,
doc::Test,
doc::WhitelistedRustc,
doc::Rustc,
doc::Rustdoc,
doc::ErrorIndex,
@ -618,13 +615,7 @@ impl<'a> Builder<'a> {
}
fn run(self, builder: &Builder<'_>) -> Interned<PathBuf> {
let compiler = self.compiler;
let config = &builder.build.config;
let lib = if compiler.stage >= 1 && config.libdir_relative().is_some() {
builder.build.config.libdir_relative().unwrap()
} else {
Path::new("lib")
};
let lib = builder.sysroot_libdir_relative(self.compiler);
let sysroot = builder
.sysroot(self.compiler)
.join(lib)
@ -678,9 +669,21 @@ impl<'a> Builder<'a> {
}
}
/// Returns the compiler's relative libdir where the standard library and other artifacts are
/// found for a compiler's sysroot.
///
/// For example this returns `lib` on Unix and Windows.
pub fn sysroot_libdir_relative(&self, compiler: Compiler) -> &Path {
match self.config.libdir_relative() {
Some(relative_libdir) if compiler.stage >= 1
=> relative_libdir,
_ => Path::new("lib")
}
}
/// Adds the compiler's directory of dynamic libraries to `cmd`'s dynamic
/// library lookup path.
pub fn add_rustc_lib_path(&self, compiler: Compiler, cmd: &mut Command) {
pub fn add_rustc_lib_path(&self, compiler: Compiler, cmd: &mut Cargo) {
// Windows doesn't need dylib path munging because the dlls for the
// compiler live next to the compiler and the system will find them
// automatically.
@ -688,7 +691,7 @@ impl<'a> Builder<'a> {
return;
}
add_lib_path(vec![self.rustc_libdir(compiler)], cmd);
add_lib_path(vec![self.rustc_libdir(compiler)], &mut cmd.command);
}
/// Gets a path to the compiler specified.
@ -750,85 +753,39 @@ impl<'a> Builder<'a> {
mode: Mode,
target: Interned<String>,
cmd: &str,
) -> Command {
) -> Cargo {
let mut cargo = Command::new(&self.initial_cargo);
let out_dir = self.stage_out(compiler, mode);
// command specific path, we call clear_if_dirty with this
let mut my_out = match cmd {
"build" => self.cargo_out(compiler, mode, target),
// This is the intended out directory for crate documentation.
"doc" | "rustdoc" => self.crate_doc_out(target),
_ => self.stage_out(compiler, mode),
};
// This is for the original compiler, but if we're forced to use stage 1, then
// std/test/rustc stamps won't exist in stage 2, so we need to get those from stage 1, since
// we copy the libs forward.
let cmp = self.compiler_for(compiler.stage, compiler.host, target);
let libstd_stamp = match cmd {
"check" | "clippy" | "fix" => check::libstd_stamp(self, cmp, target),
_ => compile::libstd_stamp(self, cmp, target),
};
let libtest_stamp = match cmd {
"check" | "clippy" | "fix" => check::libtest_stamp(self, cmp, target),
_ => compile::libtest_stamp(self, cmp, target),
};
let librustc_stamp = match cmd {
"check" | "clippy" | "fix" => check::librustc_stamp(self, cmp, target),
_ => compile::librustc_stamp(self, cmp, target),
};
// Codegen backends are not yet tracked by -Zbinary-dep-depinfo,
// so we need to explicitly clear out if they've been updated.
for backend in self.codegen_backends(compiler) {
self.clear_if_dirty(&out_dir, &backend);
}
if cmd == "doc" || cmd == "rustdoc" {
if mode == Mode::Rustc || mode == Mode::ToolRustc || mode == Mode::Codegen {
let my_out = match mode {
// This is the intended out directory for compiler documentation.
my_out = self.compiler_doc_out(target);
}
Mode::Rustc | Mode::ToolRustc | Mode::Codegen => self.compiler_doc_out(target),
_ => self.crate_doc_out(target),
};
let rustdoc = self.rustdoc(compiler);
self.clear_if_dirty(&my_out, &rustdoc);
} else if cmd != "test" {
match mode {
Mode::Std => {
self.clear_if_dirty(&my_out, &self.rustc(compiler));
for backend in self.codegen_backends(compiler) {
self.clear_if_dirty(&my_out, &backend);
}
},
Mode::Test => {
self.clear_if_dirty(&my_out, &libstd_stamp);
},
Mode::Rustc => {
self.clear_if_dirty(&my_out, &self.rustc(compiler));
self.clear_if_dirty(&my_out, &libstd_stamp);
self.clear_if_dirty(&my_out, &libtest_stamp);
},
Mode::Codegen => {
self.clear_if_dirty(&my_out, &librustc_stamp);
},
Mode::ToolBootstrap => { },
Mode::ToolStd => {
self.clear_if_dirty(&my_out, &libstd_stamp);
},
Mode::ToolTest => {
self.clear_if_dirty(&my_out, &libstd_stamp);
self.clear_if_dirty(&my_out, &libtest_stamp);
},
Mode::ToolRustc => {
self.clear_if_dirty(&my_out, &libstd_stamp);
self.clear_if_dirty(&my_out, &libtest_stamp);
self.clear_if_dirty(&my_out, &librustc_stamp);
},
}
}
cargo
.env("CARGO_TARGET_DIR", out_dir)
.arg(cmd);
.arg(cmd)
.arg("-Zconfig-profile");
let profile_var = |name: &str| {
let profile = if self.config.rust_optimize {
"RELEASE"
} else {
"DEV"
};
format!("CARGO_PROFILE_{}_{}", profile, name)
};
// See comment in librustc_llvm/build.rs for why this is necessary, largely llvm-config
// needs to not accidentally link to libLLVM in stage0/lib.
@ -850,17 +807,46 @@ impl<'a> Builder<'a> {
cargo.env("RUST_CHECK", "1");
}
let stage;
if compiler.stage == 0 && self.local_rebuild {
// Assume the local-rebuild rustc already has stage1 features.
stage = 1;
} else {
stage = compiler.stage;
}
let mut rustflags = Rustflags::new(&target);
if stage != 0 {
rustflags.env("RUSTFLAGS_NOT_BOOTSTRAP");
} else {
rustflags.env("RUSTFLAGS_BOOTSTRAP");
rustflags.arg("--cfg=bootstrap");
}
match mode {
Mode::Std | Mode::Test | Mode::ToolBootstrap | Mode::ToolStd | Mode::ToolTest=> {},
Mode::Std | Mode::ToolBootstrap | Mode::ToolStd => {},
Mode::Rustc | Mode::Codegen | Mode::ToolRustc => {
// Build proc macros both for the host and the target
if target != compiler.host && cmd != "check" {
cargo.arg("-Zdual-proc-macros");
cargo.env("RUST_DUAL_PROC_MACROS", "1");
rustflags.arg("-Zdual-proc-macros");
}
},
}
// This tells Cargo (and in turn, rustc) to output more complete
// dependency information. Most importantly for rustbuild, this
// includes sysroot artifacts, like libstd, which means that we don't
// need to track those in rustbuild (an error prone process!). This
// feature is currently unstable as there may be some bugs and such, but
// it represents a big improvement in rustbuild's reliability on
// rebuilds, so we're using it here.
//
// For some additional context, see #63470 (the PR originally adding
// this), as well as #63012 which is the tracking issue for this
// feature on the rustc side.
cargo.arg("-Zbinary-dep-depinfo");
cargo.arg("-j").arg(self.jobs().to_string());
// Remove make-related flags to ensure Cargo can correctly set things up
cargo.env_remove("MAKEFLAGS");
@ -889,43 +875,15 @@ impl<'a> Builder<'a> {
// things still build right, please do!
match mode {
Mode::Std => metadata.push_str("std"),
Mode::Test => metadata.push_str("test"),
_ => {},
}
cargo.env("__CARGO_DEFAULT_LIB_METADATA", &metadata);
let stage;
if compiler.stage == 0 && self.local_rebuild {
// Assume the local-rebuild rustc already has stage1 features.
stage = 1;
} else {
stage = compiler.stage;
}
let mut extra_args = env::var(&format!("RUSTFLAGS_STAGE_{}", stage)).unwrap_or_default();
if stage != 0 {
let s = env::var("RUSTFLAGS_STAGE_NOT_0").unwrap_or_default();
if !extra_args.is_empty() {
extra_args.push_str(" ");
}
extra_args.push_str(&s);
}
if cmd == "clippy" {
extra_args.push_str("-Zforce-unstable-if-unmarked -Zunstable-options \
--json-rendered=termcolor");
rustflags.arg("-Zforce-unstable-if-unmarked");
}
if !extra_args.is_empty() {
cargo.env(
"RUSTFLAGS",
format!(
"{} {}",
env::var("RUSTFLAGS").unwrap_or_default(),
extra_args
),
);
}
rustflags.arg("-Zexternal-macro-backtrace");
let want_rustdoc = self.doc_tests != DocTests::No;
@ -962,7 +920,6 @@ impl<'a> Builder<'a> {
)
.env("RUSTC_SYSROOT", &sysroot)
.env("RUSTC_LIBDIR", &libdir)
.env("RUSTC_RPATH", self.config.rust_rpath.to_string())
.env("RUSTDOC", self.out.join("bootstrap/debug/rustdoc"))
.env(
"RUSTDOC_REAL",
@ -972,16 +929,63 @@ impl<'a> Builder<'a> {
PathBuf::from("/path/to/nowhere/rustdoc/not/required")
},
)
.env("RUSTC_ERROR_METADATA_DST", self.extended_error_dir());
.env("RUSTC_ERROR_METADATA_DST", self.extended_error_dir())
.env("RUSTC_BREAK_ON_ICE", "1");
// Dealing with rpath here is a little special, so let's go into some
// detail. First off, `-rpath` is a linker option on Unix platforms
// which adds to the runtime dynamic loader path when looking for
// dynamic libraries. We use this by default on Unix platforms to ensure
// that our nightlies behave the same on Windows, that is they work out
// of the box. This can be disabled, of course, but basically that's why
// we're gated on RUSTC_RPATH here.
//
// Ok, so the astute might be wondering "why isn't `-C rpath` used
// here?" and that is indeed a good question to task. This codegen
// option is the compiler's current interface to generating an rpath.
// Unfortunately it doesn't quite suffice for us. The flag currently
// takes no value as an argument, so the compiler calculates what it
// should pass to the linker as `-rpath`. This unfortunately is based on
// the **compile time** directory structure which when building with
// Cargo will be very different than the runtime directory structure.
//
// All that's a really long winded way of saying that if we use
// `-Crpath` then the executables generated have the wrong rpath of
// something like `$ORIGIN/deps` when in fact the way we distribute
// rustc requires the rpath to be `$ORIGIN/../lib`.
//
// So, all in all, to set up the correct rpath we pass the linker
// argument manually via `-C link-args=-Wl,-rpath,...`. Plus isn't it
// fun to pass a flag to a tool to pass a flag to pass a flag to a tool
// to change a flag in a binary?
if self.config.rust_rpath {
let rpath = if target.contains("apple") {
// Note that we need to take one extra step on macOS to also pass
// `-Wl,-instal_name,@rpath/...` to get things to work right. To
// do that we pass a weird flag to the compiler to get it to do
// so. Note that this is definitely a hack, and we should likely
// flesh out rpath support more fully in the future.
rustflags.arg("-Zosx-rpath-install-name");
Some("-Wl,-rpath,@loader_path/../lib")
} else if !target.contains("windows") &&
!target.contains("wasm32") &&
!target.contains("fuchsia") {
Some("-Wl,-rpath,$ORIGIN/../lib")
} else {
None
};
if let Some(rpath) = rpath {
rustflags.arg(&format!("-Clink-args={}", rpath));
}
}
if let Some(host_linker) = self.linker(compiler.host) {
cargo.env("RUSTC_HOST_LINKER", host_linker);
}
if let Some(target_linker) = self.linker(target) {
cargo.env("RUSTC_TARGET_LINKER", target_linker);
}
if let Some(ref error_format) = self.config.rustc_error_format {
cargo.env("RUSTC_ERROR_FORMAT", error_format);
let target = crate::envify(&target);
cargo.env(&format!("CARGO_TARGET_{}_LINKER", target), target_linker);
}
if !(["build", "check", "clippy", "fix", "rustc"].contains(&cmd)) && want_rustdoc {
cargo.env("RUSTDOC_LIBDIR", self.rustc_libdir(compiler));
@ -989,36 +993,22 @@ impl<'a> Builder<'a> {
let debuginfo_level = match mode {
Mode::Rustc | Mode::Codegen => self.config.rust_debuginfo_level_rustc,
Mode::Std | Mode::Test => self.config.rust_debuginfo_level_std,
Mode::Std => self.config.rust_debuginfo_level_std,
Mode::ToolBootstrap | Mode::ToolStd |
Mode::ToolTest | Mode::ToolRustc => self.config.rust_debuginfo_level_tools,
Mode::ToolRustc => self.config.rust_debuginfo_level_tools,
};
cargo.env("RUSTC_DEBUGINFO_LEVEL", debuginfo_level.to_string());
cargo.env(profile_var("DEBUG"), debuginfo_level.to_string());
if !mode.is_tool() {
cargo.env("RUSTC_FORCE_UNSTABLE", "1");
// Currently the compiler depends on crates from crates.io, and
// then other crates can depend on the compiler (e.g., proc-macro
// crates). Let's say, for example that rustc itself depends on the
// bitflags crate. If an external crate then depends on the
// bitflags crate as well, we need to make sure they don't
// conflict, even if they pick the same version of bitflags. We'll
// want to make sure that e.g., a plugin and rustc each get their
// own copy of bitflags.
// Cargo ensures that this works in general through the -C metadata
// flag. This flag will frob the symbols in the binary to make sure
// they're different, even though the source code is the exact
// same. To solve this problem for the compiler we extend Cargo's
// already-passed -C metadata flag with our own. Our rustc.rs
// wrapper around the actual rustc will detect -C metadata being
// passed and frob it with this extra string we're passing in.
cargo.env("RUSTC_METADATA_SUFFIX", "rustc");
}
if let Some(x) = self.crt_static(target) {
cargo.env("RUSTC_CRT_STATIC", x.to_string());
if x {
rustflags.arg("-Ctarget-feature=+crt-static");
} else {
rustflags.arg("-Ctarget-feature=-crt-static");
}
}
if let Some(x) = self.crt_static(compiler.host) {
@ -1077,8 +1067,21 @@ impl<'a> Builder<'a> {
cargo.env("RUSTC_VERBOSE", self.verbosity.to_string());
if self.config.deny_warnings {
cargo.env("RUSTC_DENY_WARNINGS", "1");
if !mode.is_tool() {
// When extending this list, add the new lints to the RUSTFLAGS of the
// build_bootstrap function of src/bootstrap/bootstrap.py as well as
// some code doesn't go through this `rustc` wrapper.
rustflags.arg("-Wrust_2018_idioms");
rustflags.arg("-Wunused_lifetimes");
if self.config.deny_warnings {
rustflags.arg("-Dwarnings");
}
}
if let Mode::Rustc | Mode::Codegen = mode {
rustflags.arg("-Zunstable-options");
rustflags.arg("-Wrustc::internal");
}
// Throughout the build Cargo can execute a number of build scripts
@ -1131,12 +1134,15 @@ impl<'a> Builder<'a> {
}
}
if (cmd == "build" || cmd == "rustc")
&& mode == Mode::Std
if mode == Mode::Std
&& self.config.extended
&& compiler.is_final_stage(self)
{
cargo.env("RUSTC_SAVE_ANALYSIS", "api".to_string());
rustflags.arg("-Zsave-analysis");
cargo.env("RUST_SAVE_ANALYSIS_CONFIG",
"{\"output_file\": null,\"full_docs\": false,\
\"pub_only\": true,\"reachable_only\": false,\
\"distro_crate\": true,\"signatures\": false,\"borrow_data\": false}");
}
// For `cargo doc` invocations, make rustdoc print the Rust version into the docs
@ -1191,9 +1197,8 @@ impl<'a> Builder<'a> {
match (mode, self.config.rust_codegen_units_std, self.config.rust_codegen_units) {
(Mode::Std, Some(n), _) |
(Mode::Test, Some(n), _) |
(_, _, Some(n)) => {
cargo.env("RUSTC_CODEGEN_UNITS", n.to_string());
cargo.env(profile_var("CODEGEN_UNITS"), n.to_string());
}
_ => {
// Don't set anything
@ -1214,9 +1219,21 @@ impl<'a> Builder<'a> {
cargo.arg("--frozen");
}
cargo.env("RUSTC_INSTALL_BINDIR", &self.config.bindir);
self.ci_env.force_coloring_in_ci(&mut cargo);
cargo
// When we build Rust dylibs they're all intended for intermediate
// usage, so make sure we pass the -Cprefer-dynamic flag instead of
// linking all deps statically into the dylib.
if let Mode::Std | Mode::Rustc | Mode::Codegen = mode {
rustflags.arg("-Cprefer-dynamic");
}
Cargo {
command: cargo,
rustflags,
}
}
/// Ensure that a given step is built, returning its output. This will
@ -1316,3 +1333,78 @@ impl<'a> Builder<'a> {
#[cfg(test)]
mod tests;
#[derive(Debug)]
struct Rustflags(String);
impl Rustflags {
fn new(target: &str) -> Rustflags {
let mut ret = Rustflags(String::new());
// Inherit `RUSTFLAGS` by default ...
ret.env("RUSTFLAGS");
// ... and also handle target-specific env RUSTFLAGS if they're
// configured.
let target_specific = format!("CARGO_TARGET_{}_RUSTFLAGS", crate::envify(target));
ret.env(&target_specific);
ret
}
fn env(&mut self, env: &str) {
if let Ok(s) = env::var(env) {
for part in s.split_whitespace() {
self.arg(part);
}
}
}
fn arg(&mut self, arg: &str) -> &mut Self {
assert_eq!(arg.split_whitespace().count(), 1);
if self.0.len() > 0 {
self.0.push_str(" ");
}
self.0.push_str(arg);
self
}
}
#[derive(Debug)]
pub struct Cargo {
command: Command,
rustflags: Rustflags,
}
impl Cargo {
pub fn rustflag(&mut self, arg: &str) -> &mut Cargo {
self.rustflags.arg(arg);
self
}
pub fn arg(&mut self, arg: impl AsRef<OsStr>) -> &mut Cargo {
self.command.arg(arg.as_ref());
self
}
pub fn args<I, S>(&mut self, args: I) -> &mut Cargo
where I: IntoIterator<Item=S>, S: AsRef<OsStr>
{
for arg in args {
self.arg(arg.as_ref());
}
self
}
pub fn env(&mut self, key: impl AsRef<OsStr>, value: impl AsRef<OsStr>) -> &mut Cargo {
self.command.env(key.as_ref(), value.as_ref());
self
}
}
impl From<Cargo> for Command {
fn from(mut cargo: Cargo) -> Command {
cargo.command.env("RUSTFLAGS", &cargo.rustflags.0);
cargo.command
}
}


@ -365,27 +365,6 @@ fn dist_with_same_targets_and_hosts() {
},
]
);
assert_eq!(
first(builder.cache.all::<compile::Test>()),
&[
compile::Test {
compiler: Compiler { host: a, stage: 0 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 1 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 2 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
]
);
assert_eq!(
first(builder.cache.all::<compile::Assemble>()),
&[
@ -415,7 +394,47 @@ fn build_default() {
let b = INTERNER.intern_str("B");
let c = INTERNER.intern_str("C");
assert!(!builder.cache.all::<compile::Std>().is_empty());
assert_eq!(
first(builder.cache.all::<compile::Std>()),
&[
compile::Std {
compiler: Compiler { host: a, stage: 0 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 1 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: a,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: c,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: c,
},
]
);
assert!(!builder.cache.all::<compile::Assemble>().is_empty());
assert_eq!(
first(builder.cache.all::<compile::Rustc>()),
@ -450,48 +469,6 @@ fn build_default() {
},
]
);
assert_eq!(
first(builder.cache.all::<compile::Test>()),
&[
compile::Test {
compiler: Compiler { host: a, stage: 0 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 1 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 2 },
target: a,
},
compile::Test {
compiler: Compiler { host: b, stage: 2 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
compile::Test {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
compile::Test {
compiler: Compiler { host: b, stage: 2 },
target: b,
},
compile::Test {
compiler: Compiler { host: a, stage: 2 },
target: c,
},
compile::Test {
compiler: Compiler { host: b, stage: 2 },
target: c,
},
]
);
}
#[test]
@ -506,7 +483,47 @@ fn build_with_target_flag() {
let b = INTERNER.intern_str("B");
let c = INTERNER.intern_str("C");
assert!(!builder.cache.all::<compile::Std>().is_empty());
assert_eq!(
first(builder.cache.all::<compile::Std>()),
&[
compile::Std {
compiler: Compiler { host: a, stage: 0 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 1 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: a,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: c,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: c,
},
]
);
assert_eq!(
first(builder.cache.all::<compile::Assemble>()),
&[
@ -541,48 +558,6 @@ fn build_with_target_flag() {
},
]
);
assert_eq!(
first(builder.cache.all::<compile::Test>()),
&[
compile::Test {
compiler: Compiler { host: a, stage: 0 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 1 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 2 },
target: a,
},
compile::Test {
compiler: Compiler { host: b, stage: 2 },
target: a,
},
compile::Test {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
compile::Test {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
compile::Test {
compiler: Compiler { host: b, stage: 2 },
target: b,
},
compile::Test {
compiler: Compiler { host: a, stage: 2 },
target: c,
},
compile::Test {
compiler: Compiler { host: b, stage: 2 },
target: c,
},
]
);
}
#[test]


@ -46,7 +46,7 @@ fn cc2ar(cc: &Path, target: &str) -> Option<PathBuf> {
} else if target.contains("openbsd") {
Some(PathBuf::from("ar"))
} else if target.contains("vxworks") {
Some(PathBuf::from("vx-ar"))
Some(PathBuf::from("wr-ar"))
} else {
let parent = cc.parent().unwrap();
let file = cc.file_name().unwrap().to_str().unwrap();


@ -13,7 +13,7 @@ use build_helper::output;
use crate::Build;
// The version number
pub const CFG_RELEASE_NUM: &str = "1.38.0";
pub const CFG_RELEASE_NUM: &str = "1.39.0";
pub struct GitInfo {
inner: Option<Info>,


@ -1,6 +1,6 @@
//! Implementation of compiling the compiler and standard library, in "check"-based modes.
use crate::compile::{run_cargo, std_cargo, test_cargo, rustc_cargo, rustc_cargo_env,
use crate::compile::{run_cargo, std_cargo, rustc_cargo, rustc_cargo_env,
add_to_sysroot};
use crate::builder::{RunConfig, Builder, Kind, ShouldRun, Step};
use crate::tool::{prepare_tool_cargo, SourceType};
@ -34,7 +34,7 @@ impl Step for Std {
const DEFAULT: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("std")
run.all_krates("test")
}
fn make_run(run: RunConfig<'_>) {
@ -52,7 +52,7 @@ impl Step for Std {
builder.info(&format!("Checking std artifacts ({} -> {})", &compiler.host, target));
run_cargo(builder,
&mut cargo,
cargo,
args(builder.kind),
&libstd_stamp(builder, compiler, target),
true);
@ -92,7 +92,7 @@ impl Step for Rustc {
let compiler = builder.compiler(0, builder.config.build);
let target = self.target;
builder.ensure(Test { target });
builder.ensure(Std { target });
let mut cargo = builder.cargo(compiler, Mode::Rustc, target,
cargo_subcommand(builder.kind));
@ -100,7 +100,7 @@ impl Step for Rustc {
builder.info(&format!("Checking compiler artifacts ({} -> {})", &compiler.host, target));
run_cargo(builder,
&mut cargo,
cargo,
args(builder.kind),
&librustc_stamp(builder, compiler, target),
true);
@ -152,54 +152,13 @@ impl Step for CodegenBackend {
// We won't build LLVM if it's not available, as it shouldn't affect `check`.
run_cargo(builder,
&mut cargo,
cargo,
args(builder.kind),
&codegen_backend_stamp(builder, compiler, target, backend),
true);
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Test {
pub target: Interned<String>,
}
impl Step for Test {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("test")
}
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Test {
target: run.target,
});
}
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(0, builder.config.build);
let target = self.target;
builder.ensure(Std { target });
let mut cargo = builder.cargo(compiler, Mode::Test, target, cargo_subcommand(builder.kind));
test_cargo(builder, &compiler, target, &mut cargo);
builder.info(&format!("Checking test artifacts ({} -> {})", &compiler.host, target));
run_cargo(builder,
&mut cargo,
args(builder.kind),
&libtest_stamp(builder, compiler, target),
true);
let libdir = builder.sysroot_libdir(compiler, target);
let hostdir = builder.sysroot_libdir(compiler, compiler.host);
add_to_sysroot(builder, &libdir, &hostdir, &libtest_stamp(builder, compiler, target));
}
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Rustdoc {
pub target: Interned<String>,
@ -226,18 +185,18 @@ impl Step for Rustdoc {
builder.ensure(Rustc { target });
let mut cargo = prepare_tool_cargo(builder,
compiler,
Mode::ToolRustc,
target,
cargo_subcommand(builder.kind),
"src/tools/rustdoc",
SourceType::InTree,
&[]);
let cargo = prepare_tool_cargo(builder,
compiler,
Mode::ToolRustc,
target,
cargo_subcommand(builder.kind),
"src/tools/rustdoc",
SourceType::InTree,
&[]);
println!("Checking rustdoc artifacts ({} -> {})", &compiler.host, target);
run_cargo(builder,
&mut cargo,
cargo,
args(builder.kind),
&rustdoc_stamp(builder, compiler, target),
true);
@ -245,7 +204,6 @@ impl Step for Rustdoc {
let libdir = builder.sysroot_libdir(compiler, target);
let hostdir = builder.sysroot_libdir(compiler, compiler.host);
add_to_sysroot(&builder, &libdir, &hostdir, &rustdoc_stamp(builder, compiler, target));
builder.cargo(compiler, Mode::ToolRustc, target, "clean");
}
}
@ -259,16 +217,6 @@ pub fn libstd_stamp(
builder.cargo_out(compiler, Mode::Std, target).join(".libstd-check.stamp")
}
/// Cargo's output path for libtest in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn libtest_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Test, target).join(".libtest-check.stamp")
}
/// Cargo's output path for librustc in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn librustc_stamp(


@ -15,12 +15,13 @@ use std::path::{Path, PathBuf};
use std::process::{Command, Stdio, exit};
use std::str;
use build_helper::{output, mtime, t, up_to_date};
use build_helper::{output, t, up_to_date};
use filetime::FileTime;
use serde::Deserialize;
use serde_json;
use crate::dist;
use crate::builder::Cargo;
use crate::util::{exe, is_dylib};
use crate::{Compiler, Mode, GitRepo};
use crate::native;
@ -39,7 +40,7 @@ impl Step for Std {
const DEFAULT: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("std")
run.all_krates("test")
}
fn make_run(run: RunConfig<'_>) {
@ -98,7 +99,7 @@ impl Step for Std {
builder.info(&format!("Building stage{} std artifacts ({} -> {})", compiler.stage,
&compiler.host, target));
run_cargo(builder,
&mut cargo,
cargo,
vec![],
&libstd_stamp(builder, compiler, target),
false);
@ -156,7 +157,7 @@ fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target:
pub fn std_cargo(builder: &Builder<'_>,
compiler: &Compiler,
target: Interned<String>,
cargo: &mut Command) {
cargo: &mut Cargo) {
if let Some(target) = env::var_os("MACOSX_STD_DEPLOYMENT_TARGET") {
cargo.env("MACOSX_DEPLOYMENT_TARGET", target);
}
@ -212,21 +213,26 @@ pub fn std_cargo(builder: &Builder<'_>,
emscripten: false,
});
cargo.env("LLVM_CONFIG", llvm_config);
cargo.env("RUSTC_BUILD_SANITIZERS", "1");
}
cargo.arg("--features").arg(features)
.arg("--manifest-path")
.arg(builder.src.join("src/libstd/Cargo.toml"));
.arg(builder.src.join("src/libtest/Cargo.toml"));
// Help the libc crate compile by assisting it in finding various
// sysroot native libraries.
if target.contains("musl") {
if let Some(p) = builder.musl_root(target) {
cargo.env("MUSL_ROOT", p);
let root = format!("native={}/lib", p.to_str().unwrap());
cargo.rustflag("-L").rustflag(&root);
}
}
if target.ends_with("-wasi") {
if let Some(p) = builder.wasi_root(target) {
cargo.env("WASI_ROOT", p);
let root = format!("native={}/lib/wasm32-wasi", p.to_str().unwrap());
cargo.rustflag("-L").rustflag(&root);
}
}
}
@ -274,8 +280,6 @@ impl Step for StdLink {
// for reason why the sanitizers are not built in stage0.
copy_apple_sanitizer_dylibs(builder, &builder.native_dir(target), "osx", &libdir);
}
builder.cargo(target_compiler, Mode::ToolStd, target, "clean");
}
}
@ -360,131 +364,6 @@ impl Step for StartupObjects {
}
}
#[derive(Debug, PartialOrd, Ord, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Test {
pub target: Interned<String>,
pub compiler: Compiler,
}
impl Step for Test {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("test")
}
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Test {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
target: run.target,
});
}
/// Builds libtest.
///
/// This will build libtest and supporting libraries for a particular stage of
/// the build using the `compiler` targeting the `target` architecture. The
/// artifacts created will also be linked into the sysroot directory.
fn run(self, builder: &Builder<'_>) {
let target = self.target;
let compiler = self.compiler;
builder.ensure(Std { compiler, target });
if builder.config.keep_stage.contains(&compiler.stage) {
builder.info("Warning: Using a potentially old libtest. This may not behave well.");
builder.ensure(TestLink {
compiler,
target_compiler: compiler,
target,
});
return;
}
let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
if compiler_to_use != compiler {
builder.ensure(Test {
compiler: compiler_to_use,
target,
});
builder.info(
&format!("Uplifting stage1 test ({} -> {})", builder.config.build, target));
builder.ensure(TestLink {
compiler: compiler_to_use,
target_compiler: compiler,
target,
});
return;
}
let mut cargo = builder.cargo(compiler, Mode::Test, target, "build");
test_cargo(builder, &compiler, target, &mut cargo);
builder.info(&format!("Building stage{} test artifacts ({} -> {})", compiler.stage,
&compiler.host, target));
run_cargo(builder,
&mut cargo,
vec![],
&libtest_stamp(builder, compiler, target),
false);
builder.ensure(TestLink {
compiler: builder.compiler(compiler.stage, builder.config.build),
target_compiler: compiler,
target,
});
}
}
/// Same as `std_cargo`, but for libtest
pub fn test_cargo(builder: &Builder<'_>,
_compiler: &Compiler,
_target: Interned<String>,
cargo: &mut Command) {
if let Some(target) = env::var_os("MACOSX_STD_DEPLOYMENT_TARGET") {
cargo.env("MACOSX_DEPLOYMENT_TARGET", target);
}
cargo.arg("--manifest-path")
.arg(builder.src.join("src/libtest/Cargo.toml"));
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct TestLink {
pub compiler: Compiler,
pub target_compiler: Compiler,
pub target: Interned<String>,
}
impl Step for TestLink {
type Output = ();
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
/// Same as `std_link`, only for libtest
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target_compiler = self.target_compiler;
let target = self.target;
builder.info(&format!("Copying stage{} test from stage{} ({} -> {} / {})",
target_compiler.stage,
compiler.stage,
&compiler.host,
target_compiler.host,
target));
add_to_sysroot(
builder,
&builder.sysroot_libdir(target_compiler, target),
&builder.sysroot_libdir(target_compiler, compiler.host),
&libtest_stamp(builder, compiler, target)
);
builder.cargo(target_compiler, Mode::ToolTest, target, "clean");
}
}
#[derive(Debug, PartialOrd, Ord, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Rustc {
pub target: Interned<String>,
@ -516,7 +395,7 @@ impl Step for Rustc {
let compiler = self.compiler;
let target = self.target;
builder.ensure(Test { compiler, target });
builder.ensure(Std { compiler, target });
if builder.config.keep_stage.contains(&compiler.stage) {
builder.info("Warning: Using a potentially old librustc. This may not behave well.");
@ -545,7 +424,7 @@ impl Step for Rustc {
}
// Ensure that build scripts and proc macros have a std / libproc_macro to link against.
builder.ensure(Test {
builder.ensure(Std {
compiler: builder.compiler(self.compiler.stage, builder.config.build),
target: builder.config.build,
});
@ -556,7 +435,7 @@ impl Step for Rustc {
builder.info(&format!("Building stage{} compiler artifacts ({} -> {})",
compiler.stage, &compiler.host, target));
run_cargo(builder,
&mut cargo,
cargo,
vec![],
&librustc_stamp(builder, compiler, target),
false);
@ -569,14 +448,14 @@ impl Step for Rustc {
}
}
pub fn rustc_cargo(builder: &Builder<'_>, cargo: &mut Command) {
pub fn rustc_cargo(builder: &Builder<'_>, cargo: &mut Cargo) {
cargo.arg("--features").arg(builder.rustc_features())
.arg("--manifest-path")
.arg(builder.src.join("src/rustc/Cargo.toml"));
rustc_cargo_env(builder, cargo);
}
pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Command) {
pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Cargo) {
// Set some configuration variables picked up by build scripts and
// the compiler alike
cargo.env("CFG_RELEASE", builder.rust_release())
@ -601,7 +480,7 @@ pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Command) {
cargo.env("CFG_DEFAULT_LINKER", s);
}
if builder.config.rustc_parallel {
cargo.env("RUSTC_PARALLEL_COMPILER", "1");
cargo.rustflag("--cfg=parallel_compiler");
}
if builder.config.rust_verify_llvm_ir {
cargo.env("RUSTC_VERIFY_LLVM_IR", "1");
@ -639,7 +518,6 @@ impl Step for RustcLink {
&builder.sysroot_libdir(target_compiler, compiler.host),
&librustc_stamp(builder, compiler, target)
);
builder.cargo(target_compiler, Mode::ToolRustc, target, "clean");
}
}
@ -704,14 +582,11 @@ impl Step for CodegenBackend {
rustc_cargo_env(builder, &mut cargo);
let features = build_codegen_backend(&builder, &mut cargo, &compiler, target, backend);
cargo.arg("--features").arg(features);
let tmp_stamp = out_dir.join(".tmp.stamp");
let files = run_cargo(builder,
cargo.arg("--features").arg(features),
vec![],
&tmp_stamp,
false);
let files = run_cargo(builder, cargo, vec![], &tmp_stamp, false);
if builder.config.dry_run {
return;
}
@ -736,7 +611,7 @@ impl Step for CodegenBackend {
}
pub fn build_codegen_backend(builder: &Builder<'_>,
cargo: &mut Command,
cargo: &mut Cargo,
compiler: &Compiler,
target: Interned<String>,
backend: Interned<String>) -> String {
@ -795,6 +670,9 @@ pub fn build_codegen_backend(builder: &Builder<'_>,
if builder.config.llvm_use_libcxx {
cargo.env("LLVM_USE_LIBCXX", "1");
}
if builder.config.llvm_optimize && !builder.config.llvm_release_debuginfo {
cargo.env("LLVM_NDEBUG", "1");
}
}
_ => panic!("unknown backend: {}", backend),
}
@ -874,16 +752,6 @@ pub fn libstd_stamp(
builder.cargo_out(compiler, Mode::Std, target).join(".libstd.stamp")
}
/// Cargo's output path for libtest in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn libtest_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Test, target).join(".libtest.stamp")
}
/// Cargo's output path for librustc in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn librustc_stamp(
@ -1083,7 +951,7 @@ pub fn add_to_sysroot(
}
pub fn run_cargo(builder: &Builder<'_>,
cargo: &mut Command,
cargo: Cargo,
tail_args: Vec<String>,
stamp: &Path,
is_check: bool)
@ -1116,10 +984,6 @@ pub fn run_cargo(builder: &Builder<'_>,
},
..
} => (filenames, crate_types),
CargoMessage::CompilerMessage { message } => {
eprintln!("{}", message.rendered);
return;
}
_ => return,
};
for filename in filenames {
@ -1206,58 +1070,35 @@ pub fn run_cargo(builder: &Builder<'_>,
deps.push((path_to_add.into(), false));
}
// Now we want to update the contents of the stamp file, if necessary. First
// we read off the previous contents along with its mtime. If our new
// contents (the list of files to copy) is different or if any dep's mtime
// is newer then we rewrite the stamp file.
deps.sort();
let stamp_contents = fs::read(stamp);
let stamp_mtime = mtime(&stamp);
let mut new_contents = Vec::new();
let mut max = None;
let mut max_path = None;
for (dep, proc_macro) in deps.iter() {
let mtime = mtime(dep);
if Some(mtime) > max {
max = Some(mtime);
max_path = Some(dep.clone());
}
new_contents.extend(if *proc_macro { b"h" } else { b"t" });
new_contents.extend(dep.to_str().unwrap().as_bytes());
new_contents.extend(b"\0");
}
let max = max.unwrap();
let max_path = max_path.unwrap();
let contents_equal = stamp_contents
.map(|contents| contents == new_contents)
.unwrap_or_default();
if contents_equal && max <= stamp_mtime {
builder.verbose(&format!("not updating {:?}; contents equal and {:?} <= {:?}",
stamp, max, stamp_mtime));
return deps.into_iter().map(|(d, _)| d).collect()
}
if max > stamp_mtime {
builder.verbose(&format!("updating {:?} as {:?} changed", stamp, max_path));
} else {
builder.verbose(&format!("updating {:?} as deps changed", stamp));
}
t!(fs::write(&stamp, &new_contents));
deps.into_iter().map(|(d, _)| d).collect()
}
pub fn stream_cargo(
builder: &Builder<'_>,
cargo: &mut Command,
cargo: Cargo,
tail_args: Vec<String>,
cb: &mut dyn FnMut(CargoMessage<'_>),
) -> bool {
let mut cargo = Command::from(cargo);
if builder.config.dry_run {
return true;
}
// Instruct Cargo to give us json messages on stdout, critically leaving
// stderr as piped so we can get those pretty colors.
cargo.arg("--message-format").arg("json")
.stdout(Stdio::piped());
let mut message_format = String::from("json-render-diagnostics");
if let Some(s) = &builder.config.rustc_error_format {
message_format.push_str(",json-diagnostic-");
message_format.push_str(s);
}
cargo.arg("--message-format").arg(message_format).stdout(Stdio::piped());
for arg in tail_args {
cargo.arg(arg);
@ -1310,12 +1151,4 @@ pub enum CargoMessage<'a> {
BuildScriptExecuted {
package_id: Cow<'a, str>,
},
CompilerMessage {
message: ClippyMessage<'a>
}
}
#[derive(Deserialize)]
pub struct ClippyMessage<'a> {
rendered: Cow<'a, str>,
}
View File
@ -122,7 +122,6 @@ pub struct Config {
// libstd features
pub backtrace: bool, // support for RUST_BACKTRACE
pub wasm_syscall: bool,
// misc
pub low_priority: bool,
@ -138,7 +137,7 @@ pub struct Config {
pub sysconfdir: Option<PathBuf>,
pub datadir: Option<PathBuf>,
pub docdir: Option<PathBuf>,
pub bindir: Option<PathBuf>,
pub bindir: PathBuf,
pub libdir: Option<PathBuf>,
pub mandir: Option<PathBuf>,
pub codegen_tests: bool,
@ -318,7 +317,6 @@ struct Rust {
save_toolstates: Option<String>,
codegen_backends: Option<Vec<String>>,
codegen_backends_dir: Option<String>,
wasm_syscall: Option<bool>,
lld: Option<bool>,
lldb: Option<bool>,
llvm_tools: Option<bool>,
@ -402,6 +400,7 @@ impl Config {
config.incremental = flags.incremental;
config.dry_run = flags.dry_run;
config.keep_stage = flags.keep_stage;
config.bindir = "bin".into(); // default
if let Some(value) = flags.deny_warnings {
config.deny_warnings = value;
}
@ -484,7 +483,7 @@ impl Config {
config.sysconfdir = install.sysconfdir.clone().map(PathBuf::from);
config.datadir = install.datadir.clone().map(PathBuf::from);
config.docdir = install.docdir.clone().map(PathBuf::from);
config.bindir = install.bindir.clone().map(PathBuf::from);
set(&mut config.bindir, install.bindir.clone().map(PathBuf::from));
config.libdir = install.libdir.clone().map(PathBuf::from);
config.mandir = install.mandir.clone().map(PathBuf::from);
}
@ -558,7 +557,6 @@ impl Config {
if let Some(true) = rust.incremental {
config.incremental = true;
}
set(&mut config.wasm_syscall, rust.wasm_syscall);
set(&mut config.lld_enabled, rust.lld);
set(&mut config.lldb_enabled, rust.lldb);
set(&mut config.llvm_tools_enabled, rust.llvm_tools);
View File
@ -18,7 +18,7 @@ use build_helper::{output, t};
use crate::{Compiler, Mode, LLVM_TOOLS};
use crate::channel;
use crate::util::{is_dylib, exe};
use crate::util::{is_dylib, exe, timeit};
use crate::builder::{Builder, RunConfig, ShouldRun, Step};
use crate::compile;
use crate::tool::{self, Tool};
@ -91,14 +91,15 @@ impl Step for Docs {
let name = pkgname(builder, "rust-docs");
builder.info(&format!("Dist docs ({})", host));
if !builder.config.docs {
builder.info("\tskipping - docs disabled");
return distdir(builder).join(format!("{}-{}.tar.gz", name, host));
}
builder.default_doc(None);
builder.info(&format!("Dist docs ({})", host));
let _time = timeit(builder);
let image = tmpdir(builder).join(format!("{}-{}-image", name, host));
let _ = fs::remove_dir_all(&image);
@ -151,9 +152,7 @@ impl Step for RustcDocs {
let name = pkgname(builder, "rustc-docs");
builder.info(&format!("Dist compiler docs ({})", host));
if !builder.config.compiler_docs {
builder.info("\tskipping - compiler docs disabled");
return distdir(builder).join(format!("{}-{}.tar.gz", name, host));
}
@ -179,6 +178,9 @@ impl Step for RustcDocs {
.arg("--component-name=rustc-docs")
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg("--bulk-dirs=share/doc/rust/html");
builder.info(&format!("Dist compiler docs ({})", host));
let _time = timeit(builder);
builder.run(&mut cmd);
builder.remove_dir(&image);
@ -350,6 +352,7 @@ impl Step for Mingw {
}
builder.info(&format!("Dist mingw ({})", host));
let _time = timeit(builder);
let name = pkgname(builder, "rust-mingw");
let image = tmpdir(builder).join(format!("{}-{}-image", name, host));
let _ = fs::remove_dir_all(&image);
@ -403,7 +406,6 @@ impl Step for Rustc {
let compiler = self.compiler;
let host = self.compiler.host;
builder.info(&format!("Dist rustc stage{} ({})", compiler.stage, host));
let name = pkgname(builder, "rustc");
let image = tmpdir(builder).join(format!("{}-{}-image", name, host));
let _ = fs::remove_dir_all(&image);
@ -460,6 +462,9 @@ impl Step for Rustc {
.arg(format!("--package-name={}-{}", name, host))
.arg("--component-name=rustc")
.arg("--legacy-manifest-dirs=rustlib,cargo");
builder.info(&format!("Dist rustc stage{} ({})", compiler.stage, host));
let _time = timeit(builder);
builder.run(&mut cmd);
builder.remove_dir(&image);
builder.remove_dir(&overlay);
@ -469,7 +474,6 @@ impl Step for Rustc {
fn prepare_image(builder: &Builder<'_>, compiler: Compiler, image: &Path) {
let host = compiler.host;
let src = builder.sysroot(compiler);
let libdir = builder.rustc_libdir(compiler);
// Copy rustc/rustdoc binaries
t!(fs::create_dir_all(image.join("bin")));
@ -481,11 +485,14 @@ impl Step for Rustc {
// Copy runtime DLLs needed by the compiler
if libdir_relative.to_str() != Some("bin") {
let libdir = builder.rustc_libdir(compiler);
for entry in builder.read_dir(&libdir) {
let name = entry.file_name();
if let Some(s) = name.to_str() {
if is_dylib(s) {
builder.install(&entry.path(), &image.join(&libdir_relative), 0o644);
// Don't use custom libdir here because ^lib/ will be resolved again
// with installer
builder.install(&entry.path(), &image.join("lib"), 0o644);
}
}
}
@ -493,8 +500,11 @@ impl Step for Rustc {
// Copy over the codegen backends
let backends_src = builder.sysroot_codegen_backends(compiler);
let backends_rel = backends_src.strip_prefix(&src).unwrap();
let backends_dst = image.join(&backends_rel);
let backends_rel = backends_src.strip_prefix(&src).unwrap()
.strip_prefix(builder.sysroot_libdir_relative(compiler)).unwrap();
// Don't use custom libdir here because ^lib/ will be resolved again with installer
let backends_dst = image.join("lib").join(&backends_rel);
t!(fs::create_dir_all(&backends_dst));
builder.cp_r(&backends_src, &backends_dst);
@ -657,8 +667,6 @@ impl Step for Std {
let target = self.target;
let name = pkgname(builder, "rust-std");
builder.info(&format!("Dist std stage{} ({} -> {})",
compiler.stage, &compiler.host, target));
// The only true set of target libraries came from the build triple, so
// let's reduce redundant work by only producing archives from that host.
@ -673,12 +681,7 @@ impl Step for Std {
if builder.hosts.iter().any(|t| t == target) {
builder.ensure(compile::Rustc { compiler, target });
} else {
if builder.no_std(target) == Some(true) {
// the `test` doesn't compile for no-std targets
builder.ensure(compile::Std { compiler, target });
} else {
builder.ensure(compile::Test { compiler, target });
}
builder.ensure(compile::Std { compiler, target });
}
let image = tmpdir(builder).join(format!("{}-{}-image", name, target));
@ -714,6 +717,10 @@ impl Step for Std {
.arg(format!("--package-name={}-{}", name, target))
.arg(format!("--component-name=rust-std-{}", target))
.arg("--legacy-manifest-dirs=rustlib,cargo");
builder.info(&format!("Dist std stage{} ({} -> {})",
compiler.stage, &compiler.host, target));
let _time = timeit(builder);
builder.run(&mut cmd);
builder.remove_dir(&image);
distdir(builder).join(format!("{}-{}.tar.gz", name, target))
@ -754,15 +761,13 @@ impl Step for Analysis {
let compiler = self.compiler;
let target = self.target;
assert!(builder.config.extended);
builder.info("Dist analysis");
let name = pkgname(builder, "rust-analysis");
if &compiler.host != builder.config.build {
builder.info("\tskipping, not a build host");
return distdir(builder).join(format!("{}-{}.tar.gz", name, target));
}
builder.ensure(Std { compiler, target });
builder.ensure(compile::Std { compiler, target });
let image = tmpdir(builder).join(format!("{}-{}-image", name, target));
@ -786,6 +791,9 @@ impl Step for Analysis {
.arg(format!("--package-name={}-{}", name, target))
.arg(format!("--component-name=rust-analysis-{}", target))
.arg("--legacy-manifest-dirs=rustlib,cargo");
builder.info("Dist analysis");
let _time = timeit(builder);
builder.run(&mut cmd);
builder.remove_dir(&image);
distdir(builder).join(format!("{}-{}.tar.gz", name, target))
@ -874,8 +882,6 @@ impl Step for Src {
/// Creates the `rust-src` installer component
fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.info("Dist src");
let name = pkgname(builder, "rust-src");
let image = tmpdir(builder).join(format!("{}-image", name));
let _ = fs::remove_dir_all(&image);
@ -908,6 +914,7 @@ impl Step for Src {
"src/libproc_macro",
"src/tools/rustc-std-workspace-core",
"src/tools/rustc-std-workspace-alloc",
"src/tools/rustc-std-workspace-std",
"src/librustc",
"src/libsyntax",
];
@ -929,6 +936,9 @@ impl Step for Src {
.arg(format!("--package-name={}", name))
.arg("--component-name=rust-src")
.arg("--legacy-manifest-dirs=rustlib,cargo");
builder.info("Dist src");
let _time = timeit(builder);
builder.run(&mut cmd);
builder.remove_dir(&image);
@ -956,8 +966,6 @@ impl Step for PlainSourceTarball {
/// Creates the plain source tarball
fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.info("Create plain source tarball");
// Make sure that the root folder of tarball has the correct name
let plain_name = format!("{}-src", pkgname(builder, "rustc"));
let plain_dst_src = tmpdir(builder).join(&plain_name);
@ -1019,6 +1027,9 @@ impl Step for PlainSourceTarball {
.arg("--output").arg(&tarball)
.arg("--work-dir=.")
.current_dir(tmpdir(builder));
builder.info("Create plain source tarball");
let _time = timeit(builder);
builder.run(&mut cmd);
distdir(builder).join(&format!("{}.tar.gz", plain_name))
}
@ -1072,7 +1083,6 @@ impl Step for Cargo {
let compiler = self.compiler;
let target = self.target;
builder.info(&format!("Dist cargo stage{} ({})", compiler.stage, target));
let src = builder.src.join("src/tools/cargo");
let etc = src.join("src/etc");
let release_num = builder.release_num("cargo");
@ -1125,6 +1135,9 @@ impl Step for Cargo {
.arg(format!("--package-name={}-{}", name, target))
.arg("--component-name=cargo")
.arg("--legacy-manifest-dirs=rustlib,cargo");
builder.info(&format!("Dist cargo stage{} ({})", compiler.stage, target));
let _time = timeit(builder);
builder.run(&mut cmd);
distdir(builder).join(format!("{}-{}.tar.gz", name, target))
}
@ -1160,7 +1173,6 @@ impl Step for Rls {
let target = self.target;
assert!(builder.config.extended);
builder.info(&format!("Dist RLS stage{} ({})", compiler.stage, target));
let src = builder.src.join("src/tools/rls");
let release_num = builder.release_num("rls");
let name = pkgname(builder, "rls");
@ -1209,6 +1221,8 @@ impl Step for Rls {
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg("--component-name=rls-preview");
builder.info(&format!("Dist RLS stage{} ({})", compiler.stage, target));
let _time = timeit(builder);
builder.run(&mut cmd);
Some(distdir(builder).join(format!("{}-{}.tar.gz", name, target)))
}
@ -1244,7 +1258,6 @@ impl Step for Clippy {
let target = self.target;
assert!(builder.config.extended);
builder.info(&format!("Dist clippy stage{} ({})", compiler.stage, target));
let src = builder.src.join("src/tools/clippy");
let release_num = builder.release_num("clippy");
let name = pkgname(builder, "clippy");
@ -1298,6 +1311,8 @@ impl Step for Clippy {
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg("--component-name=clippy-preview");
builder.info(&format!("Dist clippy stage{} ({})", compiler.stage, target));
let _time = timeit(builder);
builder.run(&mut cmd);
Some(distdir(builder).join(format!("{}-{}.tar.gz", name, target)))
}
@ -1333,7 +1348,6 @@ impl Step for Miri {
let target = self.target;
assert!(builder.config.extended);
builder.info(&format!("Dist miri stage{} ({})", compiler.stage, target));
let src = builder.src.join("src/tools/miri");
let release_num = builder.release_num("miri");
let name = pkgname(builder, "miri");
@ -1388,6 +1402,8 @@ impl Step for Miri {
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg("--component-name=miri-preview");
builder.info(&format!("Dist miri stage{} ({})", compiler.stage, target));
let _time = timeit(builder);
builder.run(&mut cmd);
Some(distdir(builder).join(format!("{}-{}.tar.gz", name, target)))
}
@ -1422,7 +1438,6 @@ impl Step for Rustfmt {
let compiler = self.compiler;
let target = self.target;
builder.info(&format!("Dist Rustfmt stage{} ({})", compiler.stage, target));
let src = builder.src.join("src/tools/rustfmt");
let release_num = builder.release_num("rustfmt");
let name = pkgname(builder, "rustfmt");
@ -1475,6 +1490,8 @@ impl Step for Rustfmt {
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg("--component-name=rustfmt-preview");
builder.info(&format!("Dist Rustfmt stage{} ({})", compiler.stage, target));
let _time = timeit(builder);
builder.run(&mut cmd);
Some(distdir(builder).join(format!("{}-{}.tar.gz", name, target)))
}
@ -1575,6 +1592,7 @@ impl Step for Extended {
input_tarballs.push(tarball);
}
builder.info("building combined installer");
let mut cmd = rust_installer(builder);
cmd.arg("combine")
.arg("--product-name=Rust")
@ -1586,7 +1604,9 @@ impl Step for Extended {
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg("--input-tarballs").arg(input_tarballs)
.arg("--non-installed-overlay").arg(&overlay);
let time = timeit(&builder);
builder.run(&mut cmd);
drop(time);
let mut license = String::new();
license += &builder.read(&builder.src.join("COPYRIGHT"));
@ -1642,6 +1662,7 @@ impl Step for Extended {
};
if target.contains("apple-darwin") {
builder.info("building pkg installer");
let pkg = tmp.join("pkg");
let _ = fs::remove_dir_all(&pkg);
@ -1691,6 +1712,7 @@ impl Step for Extended {
pkgname(builder, "rust"),
target)))
.arg("--package-path").arg(&pkg);
let _time = timeit(builder);
builder.run(&mut cmd);
}
@ -1741,14 +1763,18 @@ impl Step for Extended {
builder.create(&exe.join("LICENSE.txt"), &license);
// Generate exe installer
builder.info("building `exe` installer with `iscc`");
let mut cmd = Command::new("iscc");
cmd.arg("rust.iss")
.arg("/Q")
.current_dir(&exe);
if target.contains("windows-gnu") {
cmd.arg("/dMINGW");
}
add_env(builder, &mut cmd, target);
let time = timeit(builder);
builder.run(&mut cmd);
drop(time);
builder.install(&exe.join(format!("{}-{}.exe", pkgname(builder, "rust"), target)),
&distdir(builder),
0o755);
@ -1913,6 +1939,7 @@ impl Step for Extended {
builder.install(&etc.join("gfx/banner.bmp"), &exe, 0o644);
builder.install(&etc.join("gfx/dialogbg.bmp"), &exe, 0o644);
builder.info(&format!("building `msi` installer with {:?}", light));
let filename = format!("{}-{}.msi", pkgname(builder, "rust"), target);
let mut cmd = Command::new(&light);
cmd.arg("-nologo")
@ -1945,6 +1972,7 @@ impl Step for Extended {
// ICE57 wrongly complains about the shortcuts
cmd.arg("-sice:ICE57");
let _time = timeit(builder);
builder.run(&mut cmd);
if !builder.config.dry_run {
@ -1999,6 +2027,8 @@ impl Step for HashSign {
}
fn run(self, builder: &Builder<'_>) {
// This gets called by `promote-release`
// (https://github.com/rust-lang/rust-central-station/tree/master/promote-release).
let mut cmd = builder.tool_cmd(Tool::BuildManifest);
if builder.config.dry_run {
return;
@ -2009,10 +2039,14 @@ impl Step for HashSign {
let addr = builder.config.dist_upload_addr.as_ref().unwrap_or_else(|| {
panic!("\n\nfailed to specify `dist.upload-addr` in `config.toml`\n\n")
});
let file = builder.config.dist_gpg_password_file.as_ref().unwrap_or_else(|| {
panic!("\n\nfailed to specify `dist.gpg-password-file` in `config.toml`\n\n")
});
let pass = t!(fs::read_to_string(&file));
let pass = if env::var("BUILD_MANIFEST_DISABLE_SIGNING").is_err() {
let file = builder.config.dist_gpg_password_file.as_ref().unwrap_or_else(|| {
panic!("\n\nfailed to specify `dist.gpg-password-file` in `config.toml`\n\n")
});
t!(fs::read_to_string(&file))
} else {
String::new()
};
let today = output(Command::new("date").arg("+%Y-%m-%d"));
@ -2107,6 +2141,7 @@ impl Step for LlvmTools {
}
builder.info(&format!("Dist LlvmTools ({})", target));
let _time = timeit(builder);
let src = builder.src.join("src/llvm-project/llvm");
let name = pkgname(builder, "llvm-tools");
View File
@ -375,7 +375,7 @@ impl Step for Standalone {
up_to_date(&footer, &html) &&
up_to_date(&favicon, &html) &&
up_to_date(&full_toc, &html) &&
up_to_date(&version_info, &html) &&
(builder.config.dry_run || up_to_date(&version_info, &html)) &&
(builder.config.dry_run || up_to_date(&rustdoc, &html)) {
continue
}
@ -413,7 +413,7 @@ impl Step for Std {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.all_krates("std").default_condition(builder.config.docs)
run.all_krates("test").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig<'_>) {
@ -475,137 +475,11 @@ impl Step for Std {
.arg("--resource-suffix").arg(crate::channel::CFG_RELEASE_NUM)
.arg("--index-page").arg(&builder.src.join("src/doc/index.md"));
builder.run(&mut cargo);
builder.cp_r(&my_out, &out);
builder.run(&mut cargo.into());
};
for krate in &["alloc", "core", "std"] {
for krate in &["alloc", "core", "std", "proc_macro", "test"] {
run_cargo_rustdoc_for(krate);
}
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Test {
stage: u32,
target: Interned<String>,
}
impl Step for Test {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.krate("test").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Test {
stage: run.builder.top_stage,
target: run.target,
});
}
/// Compile all libtest documentation.
///
/// This will generate all documentation for libtest and its dependencies. This
/// is largely just a wrapper around `cargo doc`.
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
builder.info(&format!("Documenting stage{} test ({})", stage, target));
let out = builder.doc_out(target);
t!(fs::create_dir_all(&out));
let compiler = builder.compiler_for(stage, builder.config.build, target);
// Build libstd docs so that we generate relative links
builder.ensure(Std { stage, target });
builder.ensure(compile::Test { compiler, target });
let out_dir = builder.stage_out(compiler, Mode::Test)
.join(target).join("doc");
// See docs in std above for why we symlink
let my_out = builder.crate_doc_out(target);
t!(symlink_dir_force(&builder.config, &my_out, &out_dir));
let mut cargo = builder.cargo(compiler, Mode::Test, target, "doc");
compile::test_cargo(builder, &compiler, target, &mut cargo);
cargo.arg("--no-deps")
.arg("-p").arg("test")
.env("RUSTDOC_RESOURCE_SUFFIX", crate::channel::CFG_RELEASE_NUM)
.env("RUSTDOC_GENERATE_REDIRECT_PAGES", "1");
builder.run(&mut cargo);
builder.cp_r(&my_out, &out);
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct WhitelistedRustc {
stage: u32,
target: Interned<String>,
}
impl Step for WhitelistedRustc {
type Output = ();
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.krate("rustc-main").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(WhitelistedRustc {
stage: run.builder.top_stage,
target: run.target,
});
}
/// Generates whitelisted compiler crate documentation.
///
/// This will generate all documentation for crates that are whitelisted
/// to be included in the standard documentation. This documentation is
/// included in the standard Rust documentation, so we should always
/// document it and symlink to merge with the rest of the std and test
/// documentation. We don't build other compiler documentation
/// here as we want to be able to keep it separate from the standard
/// documentation. This is largely just a wrapper around `cargo doc`.
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
builder.info(&format!("Documenting stage{} whitelisted compiler ({})", stage, target));
let out = builder.doc_out(target);
t!(fs::create_dir_all(&out));
let compiler = builder.compiler_for(stage, builder.config.build, target);
// Build libstd docs so that we generate relative links
builder.ensure(Std { stage, target });
builder.ensure(compile::Rustc { compiler, target });
let out_dir = builder.stage_out(compiler, Mode::Rustc)
.join(target).join("doc");
// See docs in std above for why we symlink
let my_out = builder.crate_doc_out(target);
t!(symlink_dir_force(&builder.config, &my_out, &out_dir));
let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "doc");
compile::rustc_cargo(builder, &mut cargo);
// We don't want to build docs for internal compiler dependencies in this
// step (there is another step for that). Therefore, we whitelist the crates
// for which docs must be built.
for krate in &["proc_macro"] {
cargo.arg("-p").arg(krate)
.env("RUSTDOC_RESOURCE_SUFFIX", crate::channel::CFG_RELEASE_NUM)
.env("RUSTDOC_GENERATE_REDIRECT_PAGES", "1");
}
builder.run(&mut cargo);
builder.cp_r(&my_out, &out);
}
}
@ -687,7 +561,7 @@ impl Step for Rustc {
cargo.arg("-p").arg(krate);
}
builder.run(&mut cargo);
builder.run(&mut cargo.into());
}
}
@ -782,7 +656,7 @@ impl Step for Rustdoc {
cargo.arg("-p").arg("rustdoc");
cargo.env("RUSTDOCFLAGS", "--document-private-items");
builder.run(&mut cargo);
builder.run(&mut cargo.into());
}
}
@ -825,8 +699,7 @@ impl Step for ErrorIndex {
index.arg(crate::channel::CFG_RELEASE_NUM);
// FIXME: shouldn't have to pass this env var
index.env("CFG_BUILD", &builder.config.build)
.env("RUSTC_ERROR_METADATA_DST", builder.extended_error_dir());
index.env("CFG_BUILD", &builder.config.build);
builder.run(&mut index);
}
View File
@ -36,7 +36,7 @@ pub struct Flags {
// This overrides the deny-warnings configuration option,
// which passes -Dwarnings to the compiler invocations.
//
// true => deny, false => allow
// true => deny, false => warn
pub deny_warnings: Option<bool>,
}
@ -556,10 +556,10 @@ fn split(s: &[String]) -> Vec<String> {
fn parse_deny_warnings(matches: &getopts::Matches) -> Option<bool> {
match matches.opt_str("warnings").as_ref().map(|v| v.as_str()) {
Some("deny") => Some(true),
Some("allow") => Some(false),
Some("warn") => Some(false),
Some(value) => {
eprintln!(
r#"invalid value for --warnings: {:?}, expected "allow" or "deny""#,
r#"invalid value for --warnings: {:?}, expected "warn" or "deny""#,
value,
);
process::exit(1);
View File
@ -67,7 +67,6 @@ fn install_sh(
let sysconfdir_default = PathBuf::from("/etc");
let datadir_default = PathBuf::from("share");
let docdir_default = datadir_default.join("doc/rust");
let bindir_default = PathBuf::from("bin");
let libdir_default = PathBuf::from("lib");
let mandir_default = datadir_default.join("man");
let prefix = builder.config.prefix.as_ref().map_or(prefix_default, |p| {
@ -76,7 +75,7 @@ fn install_sh(
let sysconfdir = builder.config.sysconfdir.as_ref().unwrap_or(&sysconfdir_default);
let datadir = builder.config.datadir.as_ref().unwrap_or(&datadir_default);
let docdir = builder.config.docdir.as_ref().unwrap_or(&docdir_default);
let bindir = builder.config.bindir.as_ref().unwrap_or(&bindir_default);
let bindir = &builder.config.bindir;
let libdir = builder.config.libdir.as_ref().unwrap_or(&libdir_default);
let mandir = builder.config.mandir.as_ref().unwrap_or(&mandir_default);
View File
@ -103,9 +103,6 @@
//! More documentation can be found in each respective module below, and you can
//! also check out the `src/bootstrap/README.md` file for more information.
// NO-RUSTC-WRAPPER
#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
#![feature(core_intrinsics)]
#![feature(drain_filter)]
@ -297,9 +294,6 @@ pub enum Mode {
/// Build the standard library, placing output in the "stageN-std" directory.
Std,
/// Build libtest, placing output in the "stageN-test" directory.
Test,
/// Build librustc, and compiler libraries, placing output in the "stageN-rustc" directory.
Rustc,
@ -315,7 +309,6 @@ pub enum Mode {
/// Compile a tool which uses all libraries we compile (up to rustc).
/// Doesn't use the stage0 compiler libraries like "other", and includes
/// tools like rustdoc, cargo, rls, etc.
ToolTest,
ToolStd,
ToolRustc,
}
@ -502,9 +495,6 @@ impl Build {
if self.config.profiler {
features.push_str(" profiler");
}
if self.config.wasm_syscall {
features.push_str(" wasm_syscall");
}
features
}
@ -536,11 +526,10 @@ impl Build {
fn stage_out(&self, compiler: Compiler, mode: Mode) -> PathBuf {
let suffix = match mode {
Mode::Std => "-std",
Mode::Test => "-test",
Mode::Rustc => "-rustc",
Mode::Codegen => "-codegen",
Mode::ToolBootstrap => "-bootstrap-tools",
Mode::ToolStd | Mode::ToolTest | Mode::ToolRustc => "-tools",
Mode::ToolStd | Mode::ToolRustc => "-tools",
};
self.out.join(&*compiler.host)
.join(format!("stage{}{}", compiler.stage, suffix))
@ -1331,3 +1320,13 @@ impl Compiler {
self.stage >= final_stage
}
}
fn envify(s: &str) -> String {
s.chars()
.map(|c| match c {
'-' => '_',
c => c,
})
.flat_map(|c| c.to_uppercase())
.collect()
}
View File
@ -81,5 +81,14 @@ ci-subset-1:
ci-subset-2:
$(Q)$(BOOTSTRAP) test $(TESTS_IN_2)
TESTS_IN_MINGW_2 := \
src/test/ui \
src/test/compile-fail
ci-mingw-subset-1:
$(Q)$(BOOTSTRAP) test $(TESTS_IN_MINGW_2:%=--exclude %)
ci-mingw-subset-2:
$(Q)$(BOOTSTRAP) test $(TESTS_IN_MINGW_2)
.PHONY: dist
View File
@ -81,26 +81,29 @@ impl Step for Llvm {
(info, "src/llvm-project/llvm", builder.llvm_out(target), dir.join("bin"))
};
if !llvm_info.is_git() {
println!(
"git could not determine the LLVM submodule commit hash. \
Assuming that an LLVM build is necessary.",
);
}
let build_llvm_config = llvm_config_ret_dir
.join(exe("llvm-config", &*builder.config.build));
let done_stamp = out_dir.join("llvm-finished-building");
if let Some(llvm_commit) = llvm_info.sha() {
if done_stamp.exists() {
if done_stamp.exists() {
if let Some(llvm_commit) = llvm_info.sha() {
let done_contents = t!(fs::read(&done_stamp));
// If LLVM was already built previously and the submodule's commit didn't change
// from the previous build, then no action is required.
if done_contents == llvm_commit.as_bytes() {
return build_llvm_config
return build_llvm_config;
}
} else {
builder.info(
"Could not determine the LLVM submodule commit hash. \
Assuming that an LLVM rebuild is not necessary.",
);
builder.info(&format!(
"To force LLVM to rebuild, remove the file `{}`",
done_stamp.display()
));
return build_llvm_config;
}
}
@ -303,9 +306,7 @@ impl Step for Llvm {
cfg.build();
if let Some(llvm_commit) = llvm_info.sha() {
t!(fs::write(&done_stamp, llvm_commit));
}
t!(fs::write(&done_stamp, llvm_info.sha().unwrap_or("")));
build_llvm_config
}
View File
@ -202,10 +202,6 @@ pub fn check(build: &mut Build) {
panic!("couldn't find libc.a in musl dir: {}",
root.join("lib").display());
}
if fs::metadata(root.join("lib/libunwind.a")).is_err() {
panic!("couldn't find libunwind.a in musl dir: {}",
root.join("lib").display());
}
}
None => {
panic!("when targeting MUSL either the rust.musl-root \
View File
@ -23,7 +23,7 @@ use crate::tool::{self, Tool, SourceType};
use crate::toolstate::ToolState;
use crate::util::{self, dylib_path, dylib_path_var};
use crate::Crate as CargoCrate;
use crate::{DocTests, Mode, GitRepo};
use crate::{DocTests, Mode, GitRepo, envify};
const ADB_TEST_DIR: &str = "/data/tmp/work";
@ -233,10 +233,9 @@ impl Step for Cargo {
// those features won't be able to land.
cargo.env("CARGO_TEST_DISABLE_NIGHTLY", "1");
try_run(
builder,
cargo.env("PATH", &path_for_cargo(builder, compiler)),
);
cargo.env("PATH", &path_for_cargo(builder, compiler));
try_run(builder, &mut cargo.into());
}
}
@ -290,7 +289,7 @@ impl Step for Rls {
cargo.arg("--")
.args(builder.config.cmd.test_args());
if try_run(builder, &mut cargo) {
if try_run(builder, &mut cargo.into()) {
builder.save_toolstate("rls", ToolState::TestPass);
}
}
@ -348,7 +347,7 @@ impl Step for Rustfmt {
builder.add_rustc_lib_path(compiler, &mut cargo);
if try_run(builder, &mut cargo) {
if try_run(builder, &mut cargo.into()) {
builder.save_toolstate("rustfmt", ToolState::TestPass);
}
}
@ -418,6 +417,7 @@ impl Step for Miri {
cargo.env("CARGO_INSTALL_ROOT", &builder.out); // cargo adds a `bin/`
cargo.env("XARGO", builder.out.join("bin").join("xargo"));
let mut cargo = Command::from(cargo);
if !try_run(builder, &mut cargo) {
return;
}
@ -467,7 +467,7 @@ impl Step for Miri {
builder.add_rustc_lib_path(compiler, &mut cargo);
if !try_run(builder, &mut cargo) {
if !try_run(builder, &mut cargo.into()) {
return;
}
@ -502,16 +502,16 @@ impl Step for CompiletestTest {
let host = self.host;
let compiler = builder.compiler(0, host);
let mut cargo = tool::prepare_tool_cargo(builder,
compiler,
Mode::ToolBootstrap,
host,
"test",
"src/tools/compiletest",
SourceType::InTree,
&[]);
let cargo = tool::prepare_tool_cargo(builder,
compiler,
Mode::ToolBootstrap,
host,
"test",
"src/tools/compiletest",
SourceType::InTree,
&[]);
try_run(builder, &mut cargo);
try_run(builder, &mut cargo.into());
}
}
@ -571,7 +571,7 @@ impl Step for Clippy {
builder.add_rustc_lib_path(compiler, &mut cargo);
if try_run(builder, &mut cargo) {
if try_run(builder, &mut cargo.into()) {
builder.save_toolstate("clippy-driver", ToolState::TestPass);
}
} else {
@ -1040,21 +1040,10 @@ impl Step for Compiletest {
builder.ensure(compile::Rustc { compiler, target });
}
if builder.no_std(target) == Some(true) {
// the `test` doesn't compile for no-std targets
builder.ensure(compile::Std { compiler, target });
} else {
builder.ensure(compile::Test { compiler, target });
}
builder.ensure(compile::Std { compiler, target });
// ensure that `libproc_macro` is available on the host.
builder.ensure(compile::Std { compiler, target: compiler.host });
if builder.no_std(target) == Some(true) {
// for no_std run-make (e.g., thumb*),
// we need a host compiler which is called by cargo.
builder.ensure(compile::Std { compiler, target: compiler.host });
}
// HACK(eddyb) ensure that `libproc_macro` is available on the host.
builder.ensure(compile::Test { compiler, target: compiler.host });
// Also provide `rust_test_helpers` for the host.
builder.ensure(native::TestHelpers { target: compiler.host });
@ -1338,7 +1327,10 @@ impl Step for Compiletest {
cmd.env("RUSTC_PROFILER_SUPPORT", "1");
}
cmd.env("RUST_TEST_TMPDIR", builder.out.join("tmp"));
let tmp = builder.out.join("tmp");
std::fs::create_dir_all(&tmp).unwrap();
cmd.env("RUST_TEST_TMPDIR", tmp);
cmd.arg("--adb-path").arg("adb");
cmd.arg("--adb-test-dir").arg(ADB_TEST_DIR);
@ -1399,7 +1391,7 @@ impl Step for DocTest {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
builder.ensure(compile::Test {
builder.ensure(compile::Std {
compiler,
target: compiler.host,
});
@ -1535,8 +1527,7 @@ impl Step for ErrorIndex {
);
tool.arg("markdown")
.arg(&output)
.env("CFG_BUILD", &builder.config.build)
.env("RUSTC_ERROR_METADATA_DST", builder.extended_error_dir());
.env("CFG_BUILD", &builder.config.build);
builder.info(&format!("Testing error-index stage{}", compiler.stage));
let _time = util::timeit(&builder);
@ -1710,8 +1701,7 @@ impl Step for Crate {
fn should_run(mut run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run = run.krate("test");
for krate in run.builder.in_tree_crates("std") {
for krate in run.builder.in_tree_crates("test") {
if !(krate.name.starts_with("rustc_") && krate.name.ends_with("san")) {
run = run.path(krate.local_path(&builder).to_str().unwrap());
}
@ -1735,14 +1725,9 @@ impl Step for Crate {
});
};
for krate in builder.in_tree_crates("std") {
if run.path.ends_with(&krate.local_path(&builder)) {
make(Mode::Std, krate);
}
}
for krate in builder.in_tree_crates("test") {
if run.path.ends_with(&krate.local_path(&builder)) {
make(Mode::Test, krate);
make(Mode::Std, krate);
}
}
}
@ -1762,7 +1747,7 @@ impl Step for Crate {
let test_kind = self.test_kind;
let krate = self.krate;
builder.ensure(compile::Test { compiler, target });
builder.ensure(compile::Std { compiler, target });
builder.ensure(RemoteCopyLibs { compiler, target });
// If we're not doing a full bootstrap but we're testing a stage2
@ -1776,9 +1761,6 @@ impl Step for Crate {
Mode::Std => {
compile::std_cargo(builder, &compiler, target, &mut cargo);
}
Mode::Test => {
compile::test_cargo(builder, &compiler, target, &mut cargo);
}
Mode::Rustc => {
builder.ensure(compile::Rustc { compiler, target });
compile::rustc_cargo(builder, &mut cargo);
@ -1832,20 +1814,6 @@ impl Step for Crate {
.expect("nodejs not configured"),
);
} else if target.starts_with("wasm32") {
// Warn about running tests without the `wasm_syscall` feature enabled.
// The javascript shim implements the syscall interface so that test
// output can be correctly reported.
if !builder.config.wasm_syscall {
builder.info(
"Libstd was built without `wasm_syscall` feature enabled: \
test output may not be visible."
);
}
// On the wasm32-unknown-unknown target we're using LTO which is
// incompatible with `-C prefer-dynamic`, so disable that here
cargo.env("RUSTC_NO_PREFER_DYNAMIC", "1");
let node = builder
.config
.nodejs
@ -1869,7 +1837,7 @@ impl Step for Crate {
test_kind, krate, compiler.stage, &compiler.host, target
));
let _time = util::timeit(&builder);
try_run(builder, &mut cargo);
try_run(builder, &mut cargo.into());
}
}
@ -1937,20 +1905,10 @@ impl Step for CrateRustdoc {
));
let _time = util::timeit(&builder);
try_run(builder, &mut cargo);
try_run(builder, &mut cargo.into());
}
}
fn envify(s: &str) -> String {
s.chars()
.map(|c| match c {
'-' => '_',
c => c,
})
.flat_map(|c| c.to_uppercase())
.collect()
}
/// Some test suites are run inside emulators or on remote devices, and most
/// of our test binaries are linked dynamically which means we need to ship
/// the standard library and such to the emulator ahead of time. This step
@ -1980,7 +1938,7 @@ impl Step for RemoteCopyLibs {
return;
}
builder.ensure(compile::Test { compiler, target });
builder.ensure(compile::Std { compiler, target });
builder.info(&format!("REMOTE copy libs to emulator ({})", target));
t!(fs::create_dir_all(builder.out.join("tmp")));
View File
@ -8,7 +8,7 @@ use build_helper::t;
use crate::Mode;
use crate::Compiler;
use crate::builder::{Step, RunConfig, ShouldRun, Builder};
use crate::builder::{Step, RunConfig, ShouldRun, Builder, Cargo as CargoCommand};
use crate::util::{exe, add_lib_path, CiEnv};
use crate::compile;
use crate::channel::GitInfo;
@ -63,7 +63,7 @@ impl Step for ToolBuild {
_ => panic!("unexpected Mode for tool build")
}
let mut cargo = prepare_tool_cargo(
let cargo = prepare_tool_cargo(
builder,
compiler,
self.mode,
@ -76,7 +76,7 @@ impl Step for ToolBuild {
builder.info(&format!("Building stage{} tool {} ({})", compiler.stage, tool, target));
let mut duplicates = Vec::new();
let is_expected = compile::stream_cargo(builder, &mut cargo, vec![], &mut |msg| {
let is_expected = compile::stream_cargo(builder, cargo, vec![], &mut |msg| {
// Only care about big things like the RLS/Cargo for now
match tool {
| "rls"
@ -229,15 +229,11 @@ pub fn prepare_tool_cargo(
path: &'static str,
source_type: SourceType,
extra_features: &[String],
) -> Command {
) -> CargoCommand {
let mut cargo = builder.cargo(compiler, mode, target, command);
let dir = builder.src.join(path);
cargo.arg("--manifest-path").arg(dir.join("Cargo.toml"));
// We don't want to build tools dynamically as they'll be running across
// stages and such and it's just easier if they're not dynamically linked.
cargo.env("RUSTC_NO_PREFER_DYNAMIC", "1");
if source_type == SourceType::Submodule {
cargo.env("RUSTC_EXTERNAL_TOOL", "1");
}
@ -517,7 +513,7 @@ impl Step for Rustdoc {
// libraries here. The intuition here is that if we've built a compiler, we should be able
// to build rustdoc.
let mut cargo = prepare_tool_cargo(
let cargo = prepare_tool_cargo(
builder,
build_compiler,
Mode::ToolRustc,
@ -530,7 +526,7 @@ impl Step for Rustdoc {
builder.info(&format!("Building rustdoc for stage{} ({})",
target_compiler.stage, target_compiler.host));
builder.run(&mut cargo);
builder.run(&mut cargo.into());
// Cargo adds a number of paths to the dylib search path on windows, which results in
// the wrong rustdoc being executed. To avoid the conflicting rustdocs, we name the "tool"
@ -577,12 +573,6 @@ impl Step for Cargo {
}
fn run(self, builder: &Builder<'_>) -> PathBuf {
// Cargo depends on procedural macros, so make sure the host
// libstd/libproc_macro is available.
builder.ensure(compile::Test {
compiler: self.compiler,
target: builder.config.build,
});
builder.ensure(ToolBuild {
compiler: self.compiler,
target: self.target,
@ -650,31 +640,10 @@ macro_rules! tool_extended {
tool_extended!((self, builder),
Cargofmt, rustfmt, "src/tools/rustfmt", "cargo-fmt", {};
CargoClippy, clippy, "src/tools/clippy", "cargo-clippy", {
// Clippy depends on procedural macros, so make sure that's built for
// the compiler itself.
builder.ensure(compile::Test {
compiler: self.compiler,
target: builder.config.build,
});
};
Clippy, clippy, "src/tools/clippy", "clippy-driver", {
// Clippy depends on procedural macros, so make sure that's built for
// the compiler itself.
builder.ensure(compile::Test {
compiler: self.compiler,
target: builder.config.build,
});
};
CargoClippy, clippy, "src/tools/clippy", "cargo-clippy", {};
Clippy, clippy, "src/tools/clippy", "clippy-driver", {};
Miri, miri, "src/tools/miri", "miri", {};
CargoMiri, miri, "src/tools/miri", "cargo-miri", {
// Miri depends on procedural macros, so make sure that's built for
// the compiler itself.
builder.ensure(compile::Test {
compiler: self.compiler,
target: builder.config.build,
});
};
CargoMiri, miri, "src/tools/miri", "cargo-miri", {};
Rls, rls, "src/tools/rls", "rls", {
let clippy = builder.ensure(Clippy {
compiler: self.compiler,
@ -684,12 +653,6 @@ tool_extended!((self, builder),
if clippy.is_some() {
self.extra_features.push("clippy".to_owned());
}
// RLS depends on procedural macros, so make sure that's built for
// the compiler itself.
builder.ensure(compile::Test {
compiler: self.compiler,
target: builder.config.build,
});
};
Rustfmt, rustfmt, "src/tools/rustfmt", "rustfmt", {};
);
View File
@ -1,6 +1,3 @@
// NO-RUSTC-WRAPPER
#![deny(warnings, rust_2018_idioms, unused_lifetimes)]
use std::fs::File;
use std::path::{Path, PathBuf};
use std::process::{Command, Stdio};
@ -262,7 +259,7 @@ pub fn native_lib_boilerplate(
if !up_to_date(Path::new("build.rs"), &timestamp) || !up_to_date(src_dir, &timestamp) {
Ok(NativeLibBoilerplate {
src_dir: src_dir.to_path_buf(),
out_dir: out_dir,
out_dir,
})
} else {
Err(())
View File
@ -124,8 +124,6 @@ jobs:
IMAGE: dist-x86_64-netbsd
DEPLOY: 1
asmjs:
IMAGE: asmjs
i686-gnu:
IMAGE: i686-gnu
i686-gnu-nopt:
@ -236,10 +234,16 @@ jobs:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
SCRIPT: make ci-subset-1
# FIXME(#59637)
NO_DEBUG_ASSERTIONS: 1
NO_LLVM_ASSERTIONS: 1
i686-msvc-2:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc
SCRIPT: make ci-subset-2
# FIXME(#59637)
NO_DEBUG_ASSERTIONS: 1
NO_LLVM_ASSERTIONS: 1
# MSVC aux tests
x86_64-msvc-aux:
MSYS_BITS: 64
@ -250,6 +254,9 @@ jobs:
SCRIPT: python x.py test src/tools/cargotest src/tools/cargo
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc
VCVARS_BAT: vcvars64.bat
# FIXME(#59637)
NO_DEBUG_ASSERTIONS: 1
NO_LLVM_ASSERTIONS: 1
# MSVC tools tests
x86_64-msvc-tools:
MSYS_BITS: 64
@ -272,7 +279,7 @@ jobs:
i686-mingw-1:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
SCRIPT: make ci-subset-1
SCRIPT: make ci-mingw-subset-1
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
MINGW_DIR: mingw32
@ -282,13 +289,13 @@ jobs:
i686-mingw-2:
MSYS_BITS: 32
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-gnu
SCRIPT: make ci-subset-2
SCRIPT: make ci-mingw-subset-2
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: i686-6.3.0-release-posix-dwarf-rt_v5-rev2.7z
MINGW_DIR: mingw32
x86_64-mingw-1:
MSYS_BITS: 64
SCRIPT: make ci-subset-1
SCRIPT: make ci-mingw-subset-1
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
@ -298,7 +305,7 @@ jobs:
NO_LLVM_ASSERTIONS: 1
x86_64-mingw-2:
MSYS_BITS: 64
SCRIPT: make ci-subset-2
SCRIPT: make ci-mingw-subset-2
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu
MINGW_URL: https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc
MINGW_ARCHIVE: x86_64-6.3.0-release-posix-seh-rt_v5-rev2.7z
View File
@ -18,9 +18,9 @@ steps:
# one is MSI installers and one is EXE, but they're not used so frequently at
# this point anyway so perhaps it's a wash!
- script: |
powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf is-install.exe https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-08-22-is.exe"
is-install.exe /VERYSILENT /SUPPRESSMSGBOXES /NORESTART /SP-
echo ##vso[task.prependpath]C:\Program Files (x86)\Inno Setup 5
curl.exe -o is-install.exe https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-08-22-is.exe
is-install.exe /VERYSILENT /SUPPRESSMSGBOXES /NORESTART /SP-
displayName: Install InnoSetup
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
@ -43,24 +43,18 @@ steps:
# FIXME: we should probe the default azure image and see if we can use the MSYS2
# toolchain there. (if there's even one there). For now though this gets the job
# done.
- script: |
set MSYS_PATH=%CD%\citools\msys64
choco install msys2 --params="/InstallDir:%MSYS_PATH% /NoPath" -y
set PATH=%MSYS_PATH%\usr\bin;%PATH%
pacman -S --noconfirm --needed base-devel ca-certificates make diffutils tar
IF "%MINGW_URL%"=="" (
IF "%MSYS_BITS%"=="32" pacman -S --noconfirm --needed mingw-w64-i686-toolchain mingw-w64-i686-cmake mingw-w64-i686-gcc mingw-w64-i686-python2
IF "%MSYS_BITS%"=="64" pacman -S --noconfirm --needed mingw-w64-x86_64-toolchain mingw-w64-x86_64-cmake mingw-w64-x86_64-gcc mingw-w64-x86_64-python2
)
where rev
rev --help
where make
echo ##vso[task.setvariable variable=MSYS_PATH]%MSYS_PATH%
echo ##vso[task.prependpath]%MSYS_PATH%\usr\bin
- bash: |
set -e
choco install msys2 --params="/InstallDir:$(System.Workfolder)/msys2 /NoPath" -y --no-progress
echo "##vso[task.prependpath]$(System.Workfolder)/msys2/usr/bin"
mkdir -p "$(System.Workfolder)/msys2/home/$USERNAME"
displayName: Install msys2
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
- bash: pacman -S --noconfirm --needed base-devel ca-certificates make diffutils tar
displayName: Install msys2 base deps
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
# If we need to download a custom MinGW, do so here and set the path
# appropriately.
#
@ -81,39 +75,46 @@ steps:
#
# Note that we don't literally overwrite the gdb.exe binary because it appears
# to just use gdborig.exe, so that's the binary we deal with instead.
- script: |
powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf %MINGW_ARCHIVE% %MINGW_URL%/%MINGW_ARCHIVE%"
7z x -y %MINGW_ARCHIVE% > nul
powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf 2017-04-20-%MSYS_BITS%bit-gdborig.exe %MINGW_URL%/2017-04-20-%MSYS_BITS%bit-gdborig.exe"
mv 2017-04-20-%MSYS_BITS%bit-gdborig.exe %MINGW_DIR%\bin\gdborig.exe
echo ##vso[task.prependpath]%CD%\%MINGW_DIR%\bin
- bash: |
set -e
curl -o mingw.7z $MINGW_URL/$MINGW_ARCHIVE
7z x -y mingw.7z > /dev/null
curl -o $MINGW_DIR/bin/gdborig.exe $MINGW_URL/2017-04-20-${MSYS_BITS}bit-gdborig.exe
echo "##vso[task.prependpath]`pwd`/$MINGW_DIR/bin"
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), ne(variables['MINGW_URL'],''))
displayName: Download custom MinGW
# Otherwise pull in the MinGW installed on appveyor
- script: |
echo ##vso[task.prependpath]%MSYS_PATH%\mingw%MSYS_BITS%\bin
# Otherwise install MinGW through `pacman`
- bash: |
set -e
arch=i686
if [ "$MSYS_BITS" = "64" ]; then
arch=x86_64
fi
pacman -S --noconfirm --needed mingw-w64-$arch-toolchain mingw-w64-$arch-cmake mingw-w64-$arch-gcc mingw-w64-$arch-python2
echo "##vso[task.prependpath]$(System.Workfolder)/msys2/mingw$MSYS_BITS/bin"
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), eq(variables['MINGW_URL'],''))
displayName: Add MinGW to path
displayName: Download standard MinGW
# Make sure we use the native python interpreter instead of some msys equivalent
# one way or another. The msys interpreters seem to have weird path conversions
# baked in which break LLVM's build system one way or another, so let's use the
# native version which keeps everything as native as possible.
- script: |
copy C:\Python27amd64\python.exe C:\Python27amd64\python2.7.exe
echo ##vso[task.prependpath]C:\Python27amd64
- bash: |
set -e
cp C:/Python27amd64/python.exe C:/Python27amd64/python2.7.exe
echo "##vso[task.prependpath]C:/Python27amd64"
displayName: Prefer the "native" Python as LLVM has trouble building with MSYS sometimes
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
# Note that this is originally from the GitHub releases page of Ninja
- script: |
md ninja
powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf 2017-03-15-ninja-win.zip https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-03-15-ninja-win.zip"
7z x -oninja 2017-03-15-ninja-win.zip
del 2017-03-15-ninja-win.zip
set RUST_CONFIGURE_ARGS=%RUST_CONFIGURE_ARGS% --enable-ninja
echo ##vso[task.setvariable variable=RUST_CONFIGURE_ARGS]%RUST_CONFIGURE_ARGS%
echo ##vso[task.prependpath]%CD%\ninja
- bash: |
set -e
mkdir ninja
curl -o ninja.zip https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-03-15-ninja-win.zip
7z x -oninja ninja.zip
rm ninja.zip
echo "##vso[task.setvariable variable=RUST_CONFIGURE_ARGS]$RUST_CONFIGURE_ARGS --enable-ninja"
echo "##vso[task.prependpath]`pwd`/ninja"
displayName: Download and install ninja
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
View File
@ -147,8 +147,15 @@ steps:
git clone --depth=1 https://github.com/rust-lang-nursery/rust-toolstate.git
cd rust-toolstate
python2.7 "$BUILD_SOURCESDIRECTORY/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" "$(git log --format=%s -n1 HEAD)" "" ""
# Only check maintainers if this build is supposed to publish toolstate.
# Builds that are not supposed to publish don't have the access token.
if [ -n "${TOOLSTATE_PUBLISH+is_set}" ]; then
TOOLSTATE_VALIDATE_MAINTAINERS_REPO=rust-lang/rust python2.7 "${BUILD_SOURCESDIRECTORY}/src/tools/publish_toolstate.py"
fi
cd ..
rm -rf rust-toolstate
env:
TOOLSTATE_REPO_ACCESS_TOKEN: $(TOOLSTATE_REPO_ACCESS_TOKEN)
condition: and(succeeded(), not(variables.SKIP_JOB), eq(variables['IMAGE'], 'mingw-check'))
displayName: Verify the publish_toolstate script works
@ -201,7 +208,7 @@ steps:
# Upload CPU usage statistics that we've been gathering this whole time. Always
# execute this step in case we want to inspect failed builds, but don't let
# errors here ever fail the build since this is just informational.
- bash: aws s3 cp --acl public-read cpu-usage.csv s3://$DEPLOY_BUCKET/rustc-builds/$BUILD_SOURCEVERSION/cpu-$SYSTEM_JOBNAME.csv
- bash: aws s3 cp --acl public-read cpu-usage.csv s3://$DEPLOY_BUCKET/rustc-builds/$BUILD_SOURCEVERSION/cpu-$CI_JOB_NAME.csv
env:
AWS_ACCESS_KEY_ID: $(UPLOAD_AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(UPLOAD_AWS_SECRET_ACCESS_KEY)
View File
@ -165,8 +165,7 @@ For targets: `arm-unknown-linux-gnueabihf`
For targets: `armv7-unknown-linux-gnueabihf`
- Path and misc options > Prefix directory = /x-tools/${CT\_TARGET}
- Path and misc options > Patches origin = Bundled, then local
- Path and misc options > Local patch directory = /tmp/patches
- Path and misc options > Patches origin = Bundled only
- Target options > Target Architecture = arm
- Target options > Suffix to the arch-part = v7
- Target options > Architecture level = armv7-a -- (+)
@ -174,9 +173,9 @@ For targets: `armv7-unknown-linux-gnueabihf`
- Target options > Floating point = hardware (FPU) -- (\*)
- Target options > Default instruction set mode = thumb -- (\*)
- Operating System > Target OS = linux
- Operating System > Linux kernel version = 3.2.72 -- Precise kernel
- C-library > glibc version = 2.16.0
- C compiler > gcc version = 5.2.0
- Operating System > Linux kernel version = 3.2.101
- C-library > glibc version = 2.17.0
- C compiler > gcc version = 8.3.0
- C compiler > C++ = ENABLE -- to cross compile LLVM
(\*) These options have been selected to match the configuration of the arm
View File
@ -3,12 +3,7 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/
RUN sh /scripts/make3.sh
COPY scripts/crosstool-ng.sh /scripts/
COPY dist-armv7-linux/crosstool-ng.sh /scripts/
RUN sh /scripts/crosstool-ng.sh
COPY scripts/rustbuild-setup.sh /scripts/
@ -16,7 +11,6 @@ RUN sh /scripts/rustbuild-setup.sh
USER rustbuild
WORKDIR /tmp
COPY dist-armv7-linux/patches/ /tmp/patches/
COPY dist-armv7-linux/build-toolchains.sh dist-armv7-linux/armv7-linux-gnueabihf.config /tmp/
RUN ./build-toolchains.sh
View File
@ -1,9 +1,32 @@
#
# Automatically generated file; DO NOT EDIT.
# Crosstool-NG Configuration
# crosstool-NG Configuration
#
CT_CONFIGURE_has_make381=y
CT_CONFIGURE_has_xz=y
CT_CONFIGURE_has_static_link=y
CT_CONFIGURE_has_cxx11=y
CT_CONFIGURE_has_wget=y
CT_CONFIGURE_has_curl=y
CT_CONFIGURE_has_make_3_81_or_newer=y
CT_CONFIGURE_has_make_4_0_or_newer=y
CT_CONFIGURE_has_libtool_2_4_or_newer=y
CT_CONFIGURE_has_libtoolize_2_4_or_newer=y
CT_CONFIGURE_has_autoconf_2_65_or_newer=y
CT_CONFIGURE_has_autoreconf_2_65_or_newer=y
CT_CONFIGURE_has_automake_1_15_or_newer=y
CT_CONFIGURE_has_gnu_m4_1_4_12_or_newer=y
CT_CONFIGURE_has_python_3_4_or_newer=y
CT_CONFIGURE_has_bison_2_7_or_newer=y
CT_CONFIGURE_has_python=y
CT_CONFIGURE_has_dtc=y
CT_CONFIGURE_has_svn=y
CT_CONFIGURE_has_git=y
CT_CONFIGURE_has_md5sum=y
CT_CONFIGURE_has_sha1sum=y
CT_CONFIGURE_has_sha256sum=y
CT_CONFIGURE_has_sha512sum=y
CT_CONFIGURE_has_install_with_strip_program=y
CT_CONFIG_VERSION_CURRENT="3"
CT_CONFIG_VERSION="3"
CT_MODULES=y
#
@ -21,40 +44,46 @@ CT_MODULES=y
# Paths
#
CT_LOCAL_TARBALLS_DIR=""
# CT_TARBALLS_BUILDROOT_LAYOUT is not set
CT_WORK_DIR="${CT_TOP_DIR}/.build"
CT_BUILD_TOP_DIR="${CT_WORK_DIR:-${CT_TOP_DIR}/.build}/${CT_HOST:+HOST-${CT_HOST}/}${CT_TARGET}"
CT_PREFIX_DIR="/x-tools/${CT_TARGET}"
CT_INSTALL_DIR="${CT_PREFIX_DIR}"
CT_RM_RF_PREFIX_DIR=y
CT_REMOVE_DOCS=y
CT_INSTALL_DIR_RO=y
CT_INSTALL_LICENSES=y
CT_PREFIX_DIR_RO=y
CT_STRIP_HOST_TOOLCHAIN_EXECUTABLES=y
# CT_STRIP_TARGET_TOOLCHAIN_EXECUTABLES is not set
#
# Downloading
#
CT_DOWNLOAD_AGENT_WGET=y
# CT_DOWNLOAD_AGENT_CURL is not set
# CT_DOWNLOAD_AGENT_NONE is not set
# CT_FORBID_DOWNLOAD is not set
# CT_FORCE_DOWNLOAD is not set
CT_CONNECT_TIMEOUT=10
CT_DOWNLOAD_WGET_OPTIONS="--passive-ftp --tries=3 -nc --progress=dot:binary"
# CT_ONLY_DOWNLOAD is not set
# CT_USE_MIRROR is not set
CT_VERIFY_DOWNLOAD_DIGEST=y
CT_VERIFY_DOWNLOAD_DIGEST_SHA512=y
# CT_VERIFY_DOWNLOAD_DIGEST_SHA256 is not set
# CT_VERIFY_DOWNLOAD_DIGEST_SHA1 is not set
# CT_VERIFY_DOWNLOAD_DIGEST_MD5 is not set
CT_VERIFY_DOWNLOAD_DIGEST_ALG="sha512"
# CT_VERIFY_DOWNLOAD_SIGNATURE is not set
#
# Extracting
#
# CT_FORCE_EXTRACT is not set
CT_OVERIDE_CONFIG_GUESS_SUB=y
CT_OVERRIDE_CONFIG_GUESS_SUB=y
# CT_ONLY_EXTRACT is not set
# CT_PATCH_BUNDLED is not set
# CT_PATCH_LOCAL is not set
CT_PATCH_BUNDLED_LOCAL=y
# CT_PATCH_LOCAL_BUNDLED is not set
# CT_PATCH_BUNDLED_FALLBACK_LOCAL is not set
# CT_PATCH_LOCAL_FALLBACK_BUNDLED is not set
# CT_PATCH_NONE is not set
CT_PATCH_ORDER="bundled,local"
CT_PATCH_USE_LOCAL=y
CT_LOCAL_PATCH_DIR="/tmp/patches"
CT_PATCH_BUNDLED=y
# CT_PATCH_BUNDLED_LOCAL is not set
CT_PATCH_ORDER="bundled"
#
# Build behavior
@ -90,78 +119,29 @@ CT_LOG_FILE_COMPRESS=y
#
# Target options
#
# CT_ARCH_ALPHA is not set
# CT_ARCH_ARC is not set
CT_ARCH_ARM=y
# CT_ARCH_AVR is not set
# CT_ARCH_M68K is not set
# CT_ARCH_MIPS is not set
# CT_ARCH_NIOS2 is not set
# CT_ARCH_POWERPC is not set
# CT_ARCH_S390 is not set
# CT_ARCH_SH is not set
# CT_ARCH_SPARC is not set
# CT_ARCH_X86 is not set
# CT_ARCH_XTENSA is not set
CT_ARCH="arm"
CT_ARCH_SUPPORTS_BOTH_MMU=y
CT_ARCH_SUPPORTS_BOTH_ENDIAN=y
CT_ARCH_SUPPORTS_32=y
CT_ARCH_SUPPORTS_64=y
CT_ARCH_SUPPORTS_WITH_ARCH=y
CT_ARCH_SUPPORTS_WITH_CPU=y
CT_ARCH_SUPPORTS_WITH_TUNE=y
CT_ARCH_SUPPORTS_WITH_FLOAT=y
CT_ARCH_SUPPORTS_WITH_FPU=y
CT_ARCH_SUPPORTS_SOFTFP=y
CT_ARCH_DEFAULT_HAS_MMU=y
CT_ARCH_DEFAULT_LE=y
CT_ARCH_DEFAULT_32=y
CT_ARCH_ARCH="armv7-a"
CT_ARCH_CHOICE_KSYM="ARM"
CT_ARCH_CPU=""
CT_ARCH_TUNE=""
CT_ARCH_FPU="vfpv3-d16"
# CT_ARCH_BE is not set
CT_ARCH_LE=y
CT_ARCH_32=y
# CT_ARCH_64 is not set
CT_ARCH_BITNESS=32
CT_ARCH_FLOAT_HW=y
# CT_ARCH_FLOAT_SW is not set
CT_TARGET_CFLAGS=""
CT_TARGET_LDFLAGS=""
# CT_ARCH_alpha is not set
CT_ARCH_arm=y
# CT_ARCH_avr is not set
# CT_ARCH_m68k is not set
# CT_ARCH_mips is not set
# CT_ARCH_nios2 is not set
# CT_ARCH_powerpc is not set
# CT_ARCH_s390 is not set
# CT_ARCH_sh is not set
# CT_ARCH_sparc is not set
# CT_ARCH_x86 is not set
# CT_ARCH_xtensa is not set
CT_ARCH_alpha_AVAILABLE=y
CT_ARCH_arm_AVAILABLE=y
CT_ARCH_avr_AVAILABLE=y
CT_ARCH_m68k_AVAILABLE=y
CT_ARCH_microblaze_AVAILABLE=y
CT_ARCH_mips_AVAILABLE=y
CT_ARCH_nios2_AVAILABLE=y
CT_ARCH_powerpc_AVAILABLE=y
CT_ARCH_s390_AVAILABLE=y
CT_ARCH_sh_AVAILABLE=y
CT_ARCH_sparc_AVAILABLE=y
CT_ARCH_x86_AVAILABLE=y
CT_ARCH_xtensa_AVAILABLE=y
CT_ARCH_SUFFIX="v7"
CT_ARCH_ARM_SHOW=y
#
# Generic target options
#
# CT_MULTILIB is not set
CT_ARCH_USE_MMU=y
CT_ARCH_ENDIAN="little"
#
# Target optimisations
#
CT_ARCH_EXCLUSIVE_WITH_CPU=y
# CT_ARCH_FLOAT_AUTO is not set
# CT_ARCH_FLOAT_SOFTFP is not set
CT_ARCH_FLOAT="hard"
#
# arm other options
# Options for arm
#
CT_ARCH_ARM_PKG_KSYM=""
CT_ARCH_ARM_MODE="thumb"
# CT_ARCH_ARM_MODE_ARM is not set
CT_ARCH_ARM_MODE_THUMB=y
@ -169,6 +149,50 @@ CT_ARCH_ARM_MODE_THUMB=y
CT_ARCH_ARM_EABI_FORCE=y
CT_ARCH_ARM_EABI=y
CT_ARCH_ARM_TUPLE_USE_EABIHF=y
CT_ALL_ARCH_CHOICES="ALPHA ARC ARM AVR M68K MICROBLAZE MIPS MOXIE MSP430 NIOS2 POWERPC RISCV S390 SH SPARC X86 XTENSA"
CT_ARCH_SUFFIX="v7"
# CT_OMIT_TARGET_VENDOR is not set
#
# Generic target options
#
# CT_MULTILIB is not set
CT_DEMULTILIB=y
CT_ARCH_SUPPORTS_BOTH_MMU=y
CT_ARCH_DEFAULT_HAS_MMU=y
CT_ARCH_USE_MMU=y
CT_ARCH_SUPPORTS_FLAT_FORMAT=y
CT_ARCH_SUPPORTS_EITHER_ENDIAN=y
CT_ARCH_DEFAULT_LE=y
# CT_ARCH_BE is not set
CT_ARCH_LE=y
CT_ARCH_ENDIAN="little"
CT_ARCH_SUPPORTS_32=y
CT_ARCH_SUPPORTS_64=y
CT_ARCH_DEFAULT_32=y
CT_ARCH_BITNESS=32
CT_ARCH_32=y
# CT_ARCH_64 is not set
#
# Target optimisations
#
CT_ARCH_SUPPORTS_WITH_ARCH=y
CT_ARCH_SUPPORTS_WITH_CPU=y
CT_ARCH_SUPPORTS_WITH_TUNE=y
CT_ARCH_SUPPORTS_WITH_FLOAT=y
CT_ARCH_SUPPORTS_WITH_FPU=y
CT_ARCH_SUPPORTS_SOFTFP=y
CT_ARCH_EXCLUSIVE_WITH_CPU=y
CT_ARCH_ARCH="armv7-a"
CT_ARCH_FPU="vfpv3-d16"
# CT_ARCH_FLOAT_AUTO is not set
CT_ARCH_FLOAT_HW=y
# CT_ARCH_FLOAT_SOFTFP is not set
# CT_ARCH_FLOAT_SW is not set
CT_TARGET_CFLAGS=""
CT_TARGET_LDFLAGS=""
CT_ARCH_FLOAT="hard"
#
# Toolchain options
@ -182,7 +206,9 @@ CT_USE_SYSROOT=y
CT_SYSROOT_NAME="sysroot"
CT_SYSROOT_DIR_PREFIX=""
CT_WANTS_STATIC_LINK=y
CT_WANTS_STATIC_LINK_CXX=y
# CT_STATIC_TOOLCHAIN is not set
CT_SHOW_CT_VERSION=y
CT_TOOLCHAIN_PKGVERSION=""
CT_TOOLCHAIN_BUGURL=""
@ -216,126 +242,207 @@ CT_BUILD_SUFFIX=""
# Operating System
#
CT_KERNEL_SUPPORTS_SHARED_LIBS=y
# CT_KERNEL_BARE_METAL is not set
CT_KERNEL_LINUX=y
CT_KERNEL="linux"
CT_KERNEL_VERSION="3.2.72"
# CT_KERNEL_bare_metal is not set
CT_KERNEL_linux=y
CT_KERNEL_bare_metal_AVAILABLE=y
CT_KERNEL_linux_AVAILABLE=y
# CT_KERNEL_V_4_3 is not set
# CT_KERNEL_V_4_2 is not set
# CT_KERNEL_V_4_1 is not set
# CT_KERNEL_V_3_18 is not set
# CT_KERNEL_V_3_14 is not set
# CT_KERNEL_V_3_12 is not set
# CT_KERNEL_V_3_10 is not set
# CT_KERNEL_V_3_4 is not set
CT_KERNEL_V_3_2=y
# CT_KERNEL_V_2_6_32 is not set
# CT_KERNEL_LINUX_CUSTOM is not set
CT_KERNEL_windows_AVAILABLE=y
CT_KERNEL_CHOICE_KSYM="LINUX"
CT_KERNEL_LINUX_SHOW=y
#
# Options for linux
#
CT_KERNEL_LINUX_PKG_KSYM="LINUX"
CT_LINUX_DIR_NAME="linux"
CT_LINUX_PKG_NAME="linux"
CT_LINUX_SRC_RELEASE=y
CT_LINUX_PATCH_ORDER="global"
# CT_LINUX_V_4_20 is not set
# CT_LINUX_V_4_19 is not set
# CT_LINUX_V_4_18 is not set
# CT_LINUX_V_4_17 is not set
# CT_LINUX_V_4_16 is not set
# CT_LINUX_V_4_15 is not set
# CT_LINUX_V_4_14 is not set
# CT_LINUX_V_4_13 is not set
# CT_LINUX_V_4_12 is not set
# CT_LINUX_V_4_11 is not set
# CT_LINUX_V_4_10 is not set
# CT_LINUX_V_4_9 is not set
# CT_LINUX_V_4_4 is not set
# CT_LINUX_V_4_1 is not set
# CT_LINUX_V_3_16 is not set
# CT_LINUX_V_3_13 is not set
# CT_LINUX_V_3_12 is not set
# CT_LINUX_V_3_10 is not set
# CT_LINUX_V_3_4 is not set
CT_LINUX_V_3_2=y
# CT_LINUX_V_2_6_32 is not set
# CT_LINUX_NO_VERSIONS is not set
CT_LINUX_VERSION="3.2.101"
CT_LINUX_MIRRORS="$(CT_Mirrors kernel.org linux ${CT_LINUX_VERSION})"
CT_LINUX_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_LINUX_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_LINUX_ARCHIVE_FORMATS=".tar.xz .tar.gz"
CT_LINUX_SIGNATURE_FORMAT="unpacked/.sign"
CT_LINUX_4_8_or_older=y
CT_LINUX_older_than_4_8=y
CT_LINUX_3_7_or_older=y
CT_LINUX_older_than_3_7=y
CT_LINUX_later_than_3_2=y
CT_LINUX_3_2_or_later=y
CT_KERNEL_LINUX_VERBOSITY_0=y
# CT_KERNEL_LINUX_VERBOSITY_1 is not set
# CT_KERNEL_LINUX_VERBOSITY_2 is not set
CT_KERNEL_LINUX_VERBOSE_LEVEL=0
CT_KERNEL_LINUX_INSTALL_CHECK=y
CT_ALL_KERNEL_CHOICES="BARE_METAL LINUX WINDOWS"
#
# Common kernel options
#
CT_SHARED_LIBS=y
#
# linux other options
#
CT_KERNEL_LINUX_VERBOSITY_0=y
# CT_KERNEL_LINUX_VERBOSITY_1 is not set
# CT_KERNEL_LINUX_VERBOSITY_2 is not set
CT_KERNEL_LINUX_VERBOSE_LEVEL=0
CT_KERNEL_LINUX_INSTALL_CHECK=y
#
# Binary utilities
#
CT_ARCH_BINFMT_ELF=y
CT_BINUTILS_BINUTILS=y
CT_BINUTILS="binutils"
CT_BINUTILS_binutils=y
CT_BINUTILS_CHOICE_KSYM="BINUTILS"
CT_BINUTILS_BINUTILS_SHOW=y
#
# Options for binutils
#
CT_BINUTILS_BINUTILS_PKG_KSYM="BINUTILS"
CT_BINUTILS_DIR_NAME="binutils"
CT_BINUTILS_USE_GNU=y
CT_BINUTILS_USE="BINUTILS"
CT_BINUTILS_PKG_NAME="binutils"
CT_BINUTILS_SRC_RELEASE=y
CT_BINUTILS_PATCH_ORDER="global"
CT_BINUTILS_V_2_32=y
# CT_BINUTILS_V_2_31 is not set
# CT_BINUTILS_V_2_30 is not set
# CT_BINUTILS_V_2_29 is not set
# CT_BINUTILS_V_2_28 is not set
# CT_BINUTILS_V_2_27 is not set
# CT_BINUTILS_V_2_26 is not set
# CT_BINUTILS_NO_VERSIONS is not set
CT_BINUTILS_VERSION="2.32"
CT_BINUTILS_MIRRORS="$(CT_Mirrors GNU binutils) $(CT_Mirrors sourceware binutils/releases)"
CT_BINUTILS_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_BINUTILS_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_BINUTILS_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz"
CT_BINUTILS_SIGNATURE_FORMAT="packed/.sig"
CT_BINUTILS_later_than_2_30=y
CT_BINUTILS_2_30_or_later=y
CT_BINUTILS_later_than_2_27=y
CT_BINUTILS_2_27_or_later=y
CT_BINUTILS_later_than_2_25=y
CT_BINUTILS_2_25_or_later=y
CT_BINUTILS_later_than_2_23=y
CT_BINUTILS_2_23_or_later=y
#
# GNU binutils
#
# CT_CC_BINUTILS_SHOW_LINARO is not set
CT_BINUTILS_V_2_25_1=y
# CT_BINUTILS_V_2_25 is not set
# CT_BINUTILS_V_2_24 is not set
# CT_BINUTILS_V_2_23_2 is not set
# CT_BINUTILS_V_2_23_1 is not set
# CT_BINUTILS_V_2_22 is not set
# CT_BINUTILS_V_2_21_53 is not set
# CT_BINUTILS_V_2_21_1a is not set
# CT_BINUTILS_V_2_20_1a is not set
# CT_BINUTILS_V_2_19_1a is not set
# CT_BINUTILS_V_2_18a is not set
CT_BINUTILS_VERSION="2.25.1"
CT_BINUTILS_2_25_1_or_later=y
CT_BINUTILS_2_25_or_later=y
CT_BINUTILS_2_24_or_later=y
CT_BINUTILS_2_23_or_later=y
CT_BINUTILS_2_22_or_later=y
CT_BINUTILS_2_21_or_later=y
CT_BINUTILS_2_20_or_later=y
CT_BINUTILS_2_19_or_later=y
CT_BINUTILS_2_18_or_later=y
CT_BINUTILS_HAS_HASH_STYLE=y
CT_BINUTILS_HAS_GOLD=y
CT_BINUTILS_GOLD_SUPPORTS_ARCH=y
CT_BINUTILS_GOLD_SUPPORT=y
CT_BINUTILS_HAS_PLUGINS=y
CT_BINUTILS_HAS_PKGVERSION_BUGURL=y
CT_BINUTILS_FORCE_LD_BFD=y
CT_BINUTILS_GOLD_SUPPORTS_ARCH=y
CT_BINUTILS_GOLD_SUPPORT=y
CT_BINUTILS_FORCE_LD_BFD_DEFAULT=y
CT_BINUTILS_LINKER_LD=y
# CT_BINUTILS_LINKER_LD_GOLD is not set
# CT_BINUTILS_LINKER_GOLD_LD is not set
CT_BINUTILS_LINKERS_LIST="ld"
CT_BINUTILS_LINKER_DEFAULT="bfd"
# CT_BINUTILS_PLUGINS is not set
CT_BINUTILS_RELRO=m
CT_BINUTILS_EXTRA_CONFIG_ARRAY=""
# CT_BINUTILS_FOR_TARGET is not set
#
# binutils other options
#
CT_ALL_BINUTILS_CHOICES="BINUTILS"
#
# C-library
#
CT_LIBC_GLIBC=y
# CT_LIBC_UCLIBC is not set
CT_LIBC="glibc"
CT_LIBC_VERSION="2.16.0"
CT_LIBC_glibc=y
# CT_LIBC_musl is not set
# CT_LIBC_uClibc is not set
CT_LIBC_avr_libc_AVAILABLE=y
CT_LIBC_glibc_AVAILABLE=y
CT_LIBC_CHOICE_KSYM="GLIBC"
CT_THREADS="nptl"
# CT_CC_GLIBC_SHOW_LINARO is not set
# CT_LIBC_GLIBC_V_2_22 is not set
# CT_LIBC_GLIBC_V_2_21 is not set
# CT_LIBC_GLIBC_V_2_20 is not set
# CT_LIBC_GLIBC_V_2_19 is not set
# CT_LIBC_GLIBC_V_2_18 is not set
# CT_LIBC_GLIBC_V_2_17 is not set
CT_LIBC_GLIBC_V_2_16_0=y
# CT_LIBC_GLIBC_V_2_15 is not set
# CT_LIBC_GLIBC_V_2_14_1 is not set
# CT_LIBC_GLIBC_V_2_14 is not set
# CT_LIBC_GLIBC_V_2_13 is not set
# CT_LIBC_GLIBC_V_2_12_2 is not set
# CT_LIBC_GLIBC_V_2_12_1 is not set
# CT_LIBC_GLIBC_V_2_11_1 is not set
# CT_LIBC_GLIBC_V_2_11 is not set
# CT_LIBC_GLIBC_V_2_10_1 is not set
# CT_LIBC_GLIBC_V_2_9 is not set
# CT_LIBC_GLIBC_V_2_8 is not set
CT_LIBC_mingw_AVAILABLE=y
CT_LIBC_musl_AVAILABLE=y
CT_LIBC_newlib_AVAILABLE=y
CT_LIBC_none_AVAILABLE=y
CT_LIBC_uClibc_AVAILABLE=y
CT_LIBC_GLIBC_SHOW=y
#
# Options for glibc
#
CT_LIBC_GLIBC_PKG_KSYM="GLIBC"
CT_GLIBC_DIR_NAME="glibc"
CT_GLIBC_USE_GNU=y
CT_GLIBC_USE="GLIBC"
CT_GLIBC_PKG_NAME="glibc"
CT_GLIBC_SRC_RELEASE=y
CT_GLIBC_PATCH_ORDER="global"
# CT_GLIBC_V_2_29 is not set
# CT_GLIBC_V_2_28 is not set
# CT_GLIBC_V_2_27 is not set
# CT_GLIBC_V_2_26 is not set
# CT_GLIBC_V_2_25 is not set
# CT_GLIBC_V_2_24 is not set
# CT_GLIBC_V_2_23 is not set
# CT_GLIBC_V_2_19 is not set
CT_GLIBC_V_2_17=y
# CT_GLIBC_V_2_12_1 is not set
# CT_GLIBC_NO_VERSIONS is not set
CT_GLIBC_VERSION="2.17"
CT_GLIBC_MIRRORS="$(CT_Mirrors GNU glibc)"
CT_GLIBC_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GLIBC_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GLIBC_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz"
CT_GLIBC_SIGNATURE_FORMAT="packed/.sig"
CT_GLIBC_2_29_or_older=y
CT_GLIBC_older_than_2_29=y
CT_GLIBC_2_27_or_older=y
CT_GLIBC_older_than_2_27=y
CT_GLIBC_2_26_or_older=y
CT_GLIBC_older_than_2_26=y
CT_GLIBC_2_25_or_older=y
CT_GLIBC_older_than_2_25=y
CT_GLIBC_2_24_or_older=y
CT_GLIBC_older_than_2_24=y
CT_GLIBC_2_23_or_older=y
CT_GLIBC_older_than_2_23=y
CT_GLIBC_2_20_or_older=y
CT_GLIBC_older_than_2_20=y
CT_GLIBC_2_17_or_later=y
CT_GLIBC_2_17_or_older=y
CT_GLIBC_later_than_2_14=y
CT_GLIBC_2_14_or_later=y
CT_GLIBC_DEP_KERNEL_HEADERS_VERSION=y
CT_GLIBC_DEP_BINUTILS=y
CT_GLIBC_DEP_GCC=y
CT_GLIBC_DEP_PYTHON=y
CT_GLIBC_HAS_NPTL_ADDON=y
CT_GLIBC_HAS_PORTS_ADDON=y
CT_GLIBC_HAS_LIBIDN_ADDON=y
CT_GLIBC_USE_PORTS_ADDON=y
CT_GLIBC_USE_NPTL_ADDON=y
# CT_GLIBC_USE_LIBIDN_ADDON is not set
CT_GLIBC_HAS_OBSOLETE_RPC=y
CT_GLIBC_EXTRA_CONFIG_ARRAY=""
CT_GLIBC_CONFIGPARMS=""
CT_GLIBC_EXTRA_CFLAGS=""
CT_GLIBC_ENABLE_OBSOLETE_RPC=y
# CT_GLIBC_DISABLE_VERSIONING is not set
CT_GLIBC_OLDEST_ABI=""
CT_GLIBC_FORCE_UNWIND=y
# CT_GLIBC_LOCALES is not set
# CT_GLIBC_KERNEL_VERSION_NONE is not set
CT_GLIBC_KERNEL_VERSION_AS_HEADERS=y
# CT_GLIBC_KERNEL_VERSION_CHOSEN is not set
CT_GLIBC_MIN_KERNEL="3.2.101"
CT_ALL_LIBC_CHOICES="AVR_LIBC BIONIC GLIBC MINGW_W64 MOXIEBOX MUSL NEWLIB NONE UCLIBC"
CT_LIBC_SUPPORT_THREADS_ANY=y
CT_LIBC_SUPPORT_THREADS_NATIVE=y
@ -343,100 +450,71 @@ CT_LIBC_SUPPORT_THREADS_NATIVE=y
# Common C library options
#
CT_THREADS_NATIVE=y
# CT_CREATE_LDSO_CONF is not set
CT_LIBC_XLDD=y
#
# glibc other options
#
CT_LIBC_GLIBC_PORTS_EXTERNAL=y
CT_LIBC_GLIBC_MAY_FORCE_PORTS=y
CT_LIBC_glibc_familly=y
CT_LIBC_GLIBC_EXTRA_CONFIG_ARRAY=""
CT_LIBC_GLIBC_CONFIGPARMS=""
CT_LIBC_GLIBC_EXTRA_CFLAGS=""
CT_LIBC_EXTRA_CC_ARGS=""
# CT_LIBC_DISABLE_VERSIONING is not set
CT_LIBC_OLDEST_ABI=""
CT_LIBC_GLIBC_FORCE_UNWIND=y
CT_LIBC_GLIBC_USE_PORTS=y
CT_LIBC_ADDONS_LIST=""
#
# WARNING !!!
#
#
# For glibc >= 2.8, it can happen that the tarballs
#
#
# for the addons are not available for download.
#
#
# If that happens, bad luck... Try a previous version
#
#
# or try again later... :-(
#
# CT_LIBC_LOCALES is not set
# CT_LIBC_GLIBC_KERNEL_VERSION_NONE is not set
CT_LIBC_GLIBC_KERNEL_VERSION_AS_HEADERS=y
# CT_LIBC_GLIBC_KERNEL_VERSION_CHOSEN is not set
CT_LIBC_GLIBC_MIN_KERNEL="3.2.72"
#
# C compiler
#
CT_CC="gcc"
CT_CC_CORE_PASSES_NEEDED=y
CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_gcc=y
# CT_CC_GCC_SHOW_LINARO is not set
CT_CC_GCC_V_5_2_0=y
# CT_CC_GCC_V_4_9_3 is not set
# CT_CC_GCC_V_4_8_5 is not set
# CT_CC_GCC_V_4_7_4 is not set
# CT_CC_GCC_V_4_6_4 is not set
# CT_CC_GCC_V_4_5_4 is not set
# CT_CC_GCC_V_4_4_7 is not set
# CT_CC_GCC_V_4_3_6 is not set
# CT_CC_GCC_V_4_2_4 is not set
CT_CC_GCC_4_2_or_later=y
CT_CC_GCC_4_3_or_later=y
CT_CC_GCC_4_4_or_later=y
CT_CC_GCC_4_5_or_later=y
CT_CC_GCC_4_6_or_later=y
CT_CC_GCC_4_7_or_later=y
CT_CC_GCC_4_8_or_later=y
CT_CC_GCC_4_9_or_later=y
CT_CC_GCC_5=y
CT_CC_GCC_5_or_later=y
CT_CC_GCC_HAS_GRAPHITE=y
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_HAS_LTO=y
CT_CC_GCC_USE_LTO=y
CT_CC_GCC_HAS_PKGVERSION_BUGURL=y
CT_CC_GCC_HAS_BUILD_ID=y
CT_CC_GCC_HAS_LNK_HASH_STYLE=y
CT_CC_GCC_USE_GMP_MPFR=y
CT_CC_GCC_USE_MPC=y
CT_CC_GCC_HAS_LIBQUADMATH=y
CT_CC_GCC_HAS_LIBSANITIZER=y
CT_CC_GCC_VERSION="5.2.0"
# CT_CC_LANG_FORTRAN is not set
CT_CC_SUPPORT_CXX=y
CT_CC_SUPPORT_FORTRAN=y
CT_CC_SUPPORT_ADA=y
CT_CC_SUPPORT_OBJC=y
CT_CC_SUPPORT_OBJCXX=y
CT_CC_SUPPORT_GOLANG=y
CT_CC_GCC=y
CT_CC="gcc"
CT_CC_CHOICE_KSYM="GCC"
CT_CC_GCC_SHOW=y
#
# Options for gcc
#
CT_CC_GCC_PKG_KSYM="GCC"
CT_GCC_DIR_NAME="gcc"
CT_GCC_USE_GNU=y
CT_GCC_USE="GCC"
CT_GCC_PKG_NAME="gcc"
CT_GCC_SRC_RELEASE=y
CT_GCC_PATCH_ORDER="global"
CT_GCC_V_8=y
# CT_GCC_V_7 is not set
# CT_GCC_V_6 is not set
# CT_GCC_V_5 is not set
# CT_GCC_V_4_9 is not set
# CT_GCC_NO_VERSIONS is not set
CT_GCC_VERSION="8.3.0"
CT_GCC_MIRRORS="$(CT_Mirrors GNU gcc/gcc-${CT_GCC_VERSION}) $(CT_Mirrors sourceware gcc/releases/gcc-${CT_GCC_VERSION})"
CT_GCC_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GCC_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GCC_ARCHIVE_FORMATS=".tar.xz .tar.gz"
CT_GCC_SIGNATURE_FORMAT=""
CT_GCC_later_than_7=y
CT_GCC_7_or_later=y
CT_GCC_later_than_6=y
CT_GCC_6_or_later=y
CT_GCC_later_than_5=y
CT_GCC_5_or_later=y
CT_GCC_later_than_4_9=y
CT_GCC_4_9_or_later=y
CT_GCC_later_than_4_8=y
CT_GCC_4_8_or_later=y
CT_CC_GCC_HAS_LIBMPX=y
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
CT_CC_GCC_EXTRA_CONFIG_ARRAY=""
CT_CC_GCC_EXTRA_ENV_ARRAY=""
CT_CC_GCC_STATIC_LIBSTDCXX=y
# CT_CC_GCC_SYSTEM_ZLIB is not set
CT_CC_GCC_CONFIG_TLS=m
#
# Optimisation features
#
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_USE_LTO=y
#
# Settings for libraries running on target
@ -465,97 +543,206 @@ CT_CC_GCC_DEC_FLOAT_AUTO=y
# CT_CC_GCC_DEC_FLOAT_BID is not set
# CT_CC_GCC_DEC_FLOAT_DPD is not set
# CT_CC_GCC_DEC_FLOATS_NO is not set
CT_CC_SUPPORT_CXX=y
CT_CC_SUPPORT_FORTRAN=y
CT_CC_SUPPORT_JAVA=y
CT_CC_SUPPORT_ADA=y
CT_CC_SUPPORT_OBJC=y
CT_CC_SUPPORT_OBJCXX=y
CT_CC_SUPPORT_GOLANG=y
CT_ALL_CC_CHOICES="GCC"
#
# Additional supported languages:
#
CT_CC_LANG_CXX=y
# CT_CC_LANG_JAVA is not set
# CT_CC_LANG_FORTRAN is not set
#
# Debug facilities
#
# CT_DEBUG_dmalloc is not set
# CT_DEBUG_duma is not set
# CT_DEBUG_gdb is not set
# CT_DEBUG_ltrace is not set
# CT_DEBUG_strace is not set
# CT_DEBUG_DUMA is not set
# CT_DEBUG_GDB is not set
# CT_DEBUG_LTRACE is not set
# CT_DEBUG_STRACE is not set
CT_ALL_DEBUG_CHOICES="DUMA GDB LTRACE STRACE"
#
# Companion libraries
#
CT_COMPLIBS_NEEDED=y
# CT_COMPLIBS_CHECK is not set
# CT_COMP_LIBS_CLOOG is not set
# CT_COMP_LIBS_EXPAT is not set
CT_COMP_LIBS_GETTEXT=y
CT_COMP_LIBS_GETTEXT_PKG_KSYM="GETTEXT"
CT_GETTEXT_DIR_NAME="gettext"
CT_GETTEXT_PKG_NAME="gettext"
CT_GETTEXT_SRC_RELEASE=y
CT_GETTEXT_PATCH_ORDER="global"
CT_GETTEXT_V_0_19_8_1=y
# CT_GETTEXT_NO_VERSIONS is not set
CT_GETTEXT_VERSION="0.19.8.1"
CT_GETTEXT_MIRRORS="$(CT_Mirrors GNU gettext)"
CT_GETTEXT_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GETTEXT_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GETTEXT_ARCHIVE_FORMATS=".tar.xz .tar.lz .tar.gz"
CT_GETTEXT_SIGNATURE_FORMAT="packed/.sig"
CT_COMP_LIBS_GMP=y
CT_COMP_LIBS_GMP_PKG_KSYM="GMP"
CT_GMP_DIR_NAME="gmp"
CT_GMP_PKG_NAME="gmp"
CT_GMP_SRC_RELEASE=y
CT_GMP_PATCH_ORDER="global"
CT_GMP_V_6_1=y
# CT_GMP_NO_VERSIONS is not set
CT_GMP_VERSION="6.1.2"
CT_GMP_MIRRORS="https://gmplib.org/download/gmp https://gmplib.org/download/gmp/archive $(CT_Mirrors GNU gmp)"
CT_GMP_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GMP_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GMP_ARCHIVE_FORMATS=".tar.xz .tar.lz .tar.bz2"
CT_GMP_SIGNATURE_FORMAT="packed/.sig"
CT_GMP_later_than_5_1_0=y
CT_GMP_5_1_0_or_later=y
CT_GMP_later_than_5_0_0=y
CT_GMP_5_0_0_or_later=y
CT_COMP_LIBS_ISL=y
CT_COMP_LIBS_ISL_PKG_KSYM="ISL"
CT_ISL_DIR_NAME="isl"
CT_ISL_PKG_NAME="isl"
CT_ISL_SRC_RELEASE=y
CT_ISL_PATCH_ORDER="global"
CT_ISL_V_0_20=y
# CT_ISL_V_0_19 is not set
# CT_ISL_V_0_18 is not set
# CT_ISL_V_0_17 is not set
# CT_ISL_V_0_16 is not set
# CT_ISL_V_0_15 is not set
# CT_ISL_NO_VERSIONS is not set
CT_ISL_VERSION="0.20"
CT_ISL_MIRRORS="http://isl.gforge.inria.fr"
CT_ISL_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_ISL_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_ISL_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz"
CT_ISL_SIGNATURE_FORMAT=""
CT_ISL_later_than_0_18=y
CT_ISL_0_18_or_later=y
CT_ISL_later_than_0_15=y
CT_ISL_0_15_or_later=y
CT_ISL_REQUIRE_0_15_or_later=y
CT_ISL_later_than_0_14=y
CT_ISL_0_14_or_later=y
CT_ISL_REQUIRE_0_14_or_later=y
CT_ISL_later_than_0_13=y
CT_ISL_0_13_or_later=y
CT_ISL_later_than_0_12=y
CT_ISL_0_12_or_later=y
CT_ISL_REQUIRE_0_12_or_later=y
# CT_COMP_LIBS_LIBELF is not set
CT_COMP_LIBS_LIBICONV=y
CT_COMP_LIBS_LIBICONV_PKG_KSYM="LIBICONV"
CT_LIBICONV_DIR_NAME="libiconv"
CT_LIBICONV_PKG_NAME="libiconv"
CT_LIBICONV_SRC_RELEASE=y
CT_LIBICONV_PATCH_ORDER="global"
CT_LIBICONV_V_1_15=y
# CT_LIBICONV_NO_VERSIONS is not set
CT_LIBICONV_VERSION="1.15"
CT_LIBICONV_MIRRORS="$(CT_Mirrors GNU libiconv)"
CT_LIBICONV_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_LIBICONV_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_LIBICONV_ARCHIVE_FORMATS=".tar.gz"
CT_LIBICONV_SIGNATURE_FORMAT="packed/.sig"
CT_COMP_LIBS_MPC=y
CT_COMP_LIBS_MPC_PKG_KSYM="MPC"
CT_MPC_DIR_NAME="mpc"
CT_MPC_PKG_NAME="mpc"
CT_MPC_SRC_RELEASE=y
CT_MPC_PATCH_ORDER="global"
# CT_MPC_V_1_1 is not set
CT_MPC_V_1_0=y
# CT_MPC_NO_VERSIONS is not set
CT_MPC_VERSION="1.0.3"
CT_MPC_MIRRORS="http://www.multiprecision.org/downloads $(CT_Mirrors GNU mpc)"
CT_MPC_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_MPC_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_MPC_ARCHIVE_FORMATS=".tar.gz"
CT_MPC_SIGNATURE_FORMAT="packed/.sig"
CT_MPC_1_1_0_or_older=y
CT_MPC_older_than_1_1_0=y
CT_COMP_LIBS_MPFR=y
CT_COMP_LIBS_MPFR_PKG_KSYM="MPFR"
CT_MPFR_DIR_NAME="mpfr"
CT_MPFR_PKG_NAME="mpfr"
CT_MPFR_SRC_RELEASE=y
CT_MPFR_PATCH_ORDER="global"
CT_MPFR_V_3_1=y
# CT_MPFR_NO_VERSIONS is not set
CT_MPFR_VERSION="3.1.6"
CT_MPFR_MIRRORS="http://www.mpfr.org/mpfr-${CT_MPFR_VERSION} $(CT_Mirrors GNU mpfr)"
CT_MPFR_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_MPFR_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_MPFR_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz .zip"
CT_MPFR_SIGNATURE_FORMAT="packed/.asc"
CT_MPFR_4_0_0_or_older=y
CT_MPFR_older_than_4_0_0=y
CT_MPFR_REQUIRE_older_than_4_0_0=y
CT_MPFR_later_than_3_0_0=y
CT_MPFR_3_0_0_or_later=y
CT_COMP_LIBS_NCURSES=y
CT_COMP_LIBS_NCURSES_PKG_KSYM="NCURSES"
CT_NCURSES_DIR_NAME="ncurses"
CT_NCURSES_PKG_NAME="ncurses"
CT_NCURSES_SRC_RELEASE=y
CT_NCURSES_PATCH_ORDER="global"
CT_NCURSES_V_6_1=y
# CT_NCURSES_V_6_0 is not set
# CT_NCURSES_NO_VERSIONS is not set
CT_NCURSES_VERSION="6.1"
CT_NCURSES_MIRRORS="ftp://invisible-island.net/ncurses $(CT_Mirrors GNU ncurses)"
CT_NCURSES_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_NCURSES_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_NCURSES_ARCHIVE_FORMATS=".tar.gz"
CT_NCURSES_SIGNATURE_FORMAT="packed/.sig"
CT_NCURSES_HOST_CONFIG_ARGS=""
CT_NCURSES_HOST_DISABLE_DB=y
CT_NCURSES_HOST_FALLBACKS="linux,xterm,xterm-color,xterm-256color,vt100"
CT_NCURSES_TARGET_CONFIG_ARGS=""
# CT_NCURSES_TARGET_DISABLE_DB is not set
CT_NCURSES_TARGET_FALLBACKS=""
CT_COMP_LIBS_ZLIB=y
CT_COMP_LIBS_ZLIB_PKG_KSYM="ZLIB"
CT_ZLIB_DIR_NAME="zlib"
CT_ZLIB_PKG_NAME="zlib"
CT_ZLIB_SRC_RELEASE=y
CT_ZLIB_PATCH_ORDER="global"
CT_ZLIB_V_1_2_11=y
# CT_ZLIB_NO_VERSIONS is not set
CT_ZLIB_VERSION="1.2.11"
CT_ZLIB_MIRRORS="http://downloads.sourceforge.net/project/libpng/zlib/${CT_ZLIB_VERSION}"
CT_ZLIB_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_ZLIB_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_ZLIB_ARCHIVE_FORMATS=".tar.xz .tar.gz"
CT_ZLIB_SIGNATURE_FORMAT="packed/.asc"
CT_ALL_COMP_LIBS_CHOICES="CLOOG EXPAT GETTEXT GMP ISL LIBELF LIBICONV MPC MPFR NCURSES ZLIB"
CT_LIBICONV_NEEDED=y
CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_MPC_NEEDED=y
CT_COMPLIBS=y
CT_NCURSES_NEEDED=y
CT_ZLIB_NEEDED=y
CT_LIBICONV=y
CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_MPC=y
CT_LIBICONV_V_1_14=y
CT_LIBICONV_VERSION="1.14"
CT_GETTEXT_V_0_19_6=y
CT_GETTEXT_VERSION="0.19.6"
CT_GMP_V_6_0_0=y
# CT_GMP_V_5_1_3 is not set
# CT_GMP_V_5_1_1 is not set
# CT_GMP_V_5_0_2 is not set
# CT_GMP_V_5_0_1 is not set
# CT_GMP_V_4_3_2 is not set
# CT_GMP_V_4_3_1 is not set
# CT_GMP_V_4_3_0 is not set
CT_GMP_5_0_2_or_later=y
CT_GMP_VERSION="6.0.0a"
CT_MPFR_V_3_1_3=y
# CT_MPFR_V_3_1_2 is not set
# CT_MPFR_V_3_1_0 is not set
# CT_MPFR_V_3_0_1 is not set
# CT_MPFR_V_3_0_0 is not set
# CT_MPFR_V_2_4_2 is not set
# CT_MPFR_V_2_4_1 is not set
# CT_MPFR_V_2_4_0 is not set
CT_MPFR_VERSION="3.1.3"
CT_ISL_V_0_14=y
# CT_ISL_V_0_12_2 is not set
CT_ISL_V_0_14_or_later=y
CT_ISL_V_0_12_or_later=y
CT_ISL_VERSION="0.14"
# CT_CLOOG_V_0_18_4 is not set
# CT_CLOOG_V_0_18_1 is not set
# CT_CLOOG_V_0_18_0 is not set
CT_MPC_V_1_0_3=y
# CT_MPC_V_1_0_2 is not set
# CT_MPC_V_1_0_1 is not set
# CT_MPC_V_1_0 is not set
# CT_MPC_V_0_9 is not set
# CT_MPC_V_0_8_2 is not set
# CT_MPC_V_0_8_1 is not set
# CT_MPC_V_0_7 is not set
CT_MPC_VERSION="1.0.3"
#
# Companion libraries common options
#
# CT_COMPLIBS_CHECK is not set
CT_NCURSES=y
CT_ZLIB=y
#
# Companion tools
#
#
# READ HELP before you say 'Y' below !!!
#
# CT_COMP_TOOLS is not set
# CT_COMP_TOOLS_FOR_HOST is not set
# CT_COMP_TOOLS_AUTOCONF is not set
# CT_COMP_TOOLS_AUTOMAKE is not set
# CT_COMP_TOOLS_BISON is not set
# CT_COMP_TOOLS_DTC is not set
# CT_COMP_TOOLS_LIBTOOL is not set
# CT_COMP_TOOLS_M4 is not set
# CT_COMP_TOOLS_MAKE is not set
CT_ALL_COMP_TOOLS_CHOICES="AUTOCONF AUTOMAKE BISON DTC LIBTOOL M4 MAKE"

View File

@ -0,0 +1,12 @@
set -ex
# Mirrored from https://github.com/crosstool-ng/crosstool-ng/archive/crosstool-ng-1.24.0.tar.gz
url="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/crosstool-ng-1.24.0.tar.gz"
curl -Lf $url | tar xzf -
cd crosstool-ng-crosstool-ng-1.24.0
./bootstrap
./configure --prefix=/usr/local
make -j$(nproc)
make install
cd ..
rm -rf crosstool-ng-crosstool-ng-1.24.0

View File

@ -1,48 +0,0 @@
commit bdb24c2851fd5f0ad9b82d7ea1db911d334b02d2
Author: Joseph Myers <joseph@codesourcery.com>
Date: Tue May 20 21:27:13 2014 +0000
Fix ARM build with GCC trunk.
sysdeps/unix/sysv/linux/arm/unwind-resume.c and
sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c have static
variables that are written in C code but only read from toplevel asms.
Current GCC trunk now optimizes away such apparently write-only static
variables, so causing a build failure. This patch marks those
variables with __attribute_used__ to avoid that optimization.
Tested that this fixes the build for ARM.
* sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c
(libgcc_s_resume): Use __attribute_used__.
* sysdeps/unix/sysv/linux/arm/unwind-resume.c (libgcc_s_resume):
Likewise.
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
index 29e2c2b00b04..e848bfeffdcb 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
@@ -22,7 +22,8 @@
#include <pthreadP.h>
static void *libgcc_s_handle;
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);
static _Unwind_Reason_Code (*libgcc_s_forcedunwind)
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
index 285b99b5ed0d..48d00fc83641 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
@@ -20,7 +20,8 @@
#include <stdio.h>
#include <unwind.h>
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);

View File

@ -32,7 +32,6 @@ RUN sh /scripts/sccache.sh
ENV RUST_CONFIGURE_ARGS \
--musl-root-i586=/musl-i586 \
--musl-root-i686=/musl-i686 \
--enable-extended \
--disable-docs
# Newer binutils broke things on some vms/distros (i.e., linking against

View File

@ -104,9 +104,7 @@ ENV TARGETS=$TARGETS,armv5te-unknown-linux-musleabi
ENV TARGETS=$TARGETS,armv7-unknown-linux-musleabihf
ENV TARGETS=$TARGETS,aarch64-unknown-linux-musl
ENV TARGETS=$TARGETS,sparc64-unknown-linux-gnu
# FIXME: temporarily disable the redox builder,
# see: https://github.com/rust-lang/rust/issues/63160
# ENV TARGETS=$TARGETS,x86_64-unknown-redox
ENV TARGETS=$TARGETS,x86_64-unknown-redox
ENV TARGETS=$TARGETS,thumbv6m-none-eabi
ENV TARGETS=$TARGETS,thumbv7m-none-eabi
ENV TARGETS=$TARGETS,thumbv7em-none-eabi
@ -132,7 +130,7 @@ ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
CC_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-gcc \
AR_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-ar \
CXX_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-g++
ENV RUST_CONFIGURE_ARGS \
--musl-root-armv5te=/musl-armv5te \
--musl-root-arm=/musl-arm \

View File

@ -3,9 +3,11 @@
set -ex
source shared.sh
VERSION=7.51.0
VERSION=7.66.0
curl http://cool.haxx.se/download/curl-$VERSION.tar.bz2 | tar xjf -
curl https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/curl-$VERSION.tar.xz \
| xz --decompress \
| tar xf -
mkdir curl-build
cd curl-build

View File

@ -19,3 +19,6 @@ RUN sh /scripts/sccache.sh
ENV RUST_CONFIGURE_ARGS --build=i686-unknown-linux-gnu --disable-optimize-tests
ENV SCRIPT python2.7 ../x.py test
# FIXME(#59637) takes too long on CI right now
ENV NO_LLVM_ASSERTIONS=1 NO_DEBUG_ASSERTIONS=1

View File

@ -25,3 +25,6 @@ ENV SCRIPT python2.7 ../x.py test \
--exclude src/test/rustdoc-js \
--exclude src/tools/error_index_generator \
--exclude src/tools/linkchecker
# FIXME(#59637) takes too long on CI right now
ENV NO_LLVM_ASSERTIONS=1 NO_DEBUG_ASSERTIONS=1

View File

@ -22,5 +22,6 @@ apt-get update && apt-get install -y --no-install-recommends \
python2.7 \
sudo \
texinfo \
unzip \
wget \
xz-utils

View File

@ -54,29 +54,3 @@ if [ "$REPLACE_CC" = "1" ]; then
ln -s $TARGET-g++ /usr/local/bin/$exec
done
fi
export CC=$TARGET-gcc
export CXX=$TARGET-g++
LLVM=70
# may have been downloaded in a previous run
if [ ! -d libunwind-release_$LLVM ]; then
curl -L https://github.com/llvm-mirror/llvm/archive/release_$LLVM.tar.gz | tar xzf -
curl -L https://github.com/llvm-mirror/libunwind/archive/release_$LLVM.tar.gz | tar xzf -
fi
# fixme(mati865): Replace it with https://github.com/rust-lang/rust/pull/59089
mkdir libunwind-build
cd libunwind-build
cmake ../libunwind-release_$LLVM \
-DLLVM_PATH=/build/llvm-release_$LLVM \
-DLIBUNWIND_ENABLE_SHARED=0 \
-DCMAKE_C_COMPILER=$CC \
-DCMAKE_CXX_COMPILER=$CXX \
-DCMAKE_C_FLAGS="$CFLAGS" \
-DCMAKE_CXX_FLAGS="$CXXFLAGS"
hide_output make -j$(nproc)
cp lib/libunwind.a $OUTPUT/$TARGET/lib
cd - && rm -rf libunwind-build

View File

@ -20,6 +20,8 @@ exit 1
TAG=$1
shift
# Ancient binutils versions don't understand debug symbols produced by more recent tools.
# Apparently applying `-fPIC` everywhere allows them to link successfully.
export CFLAGS="-fPIC $CFLAGS"
MUSL=musl-1.1.22
@ -38,27 +40,3 @@ else
fi
hide_output make install
hide_output make clean
cd ..
LLVM=70
# may have been downloaded in a previous run
if [ ! -d libunwind-release_$LLVM ]; then
curl -L https://github.com/llvm-mirror/llvm/archive/release_$LLVM.tar.gz | tar xzf -
curl -L https://github.com/llvm-mirror/libunwind/archive/release_$LLVM.tar.gz | tar xzf -
fi
mkdir libunwind-build
cd libunwind-build
cmake ../libunwind-release_$LLVM \
-DLLVM_PATH=/build/llvm-release_$LLVM \
-DLIBUNWIND_ENABLE_SHARED=0 \
-DCMAKE_C_COMPILER=$CC \
-DCMAKE_CXX_COMPILER=$CXX \
-DCMAKE_C_FLAGS="$CFLAGS" \
-DCMAKE_CXX_FLAGS="$CXXFLAGS"
hide_output make -j$(nproc)
cp lib/libunwind.a /musl-$TAG/lib
cd ../ && rm -rf libunwind-build

View File

@ -78,6 +78,21 @@ if [ "$RUST_RELEASE_CHANNEL" = "nightly" ] || [ "$DIST_REQUIRE_ALL_TOOLS" = "" ]
RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-missing-tools"
fi
# Print the date from the local machine and the date from an external source to
# check for clock drifts. An HTTP URL is used instead of HTTPS since on Azure
# Pipelines it happened that the certificates were marked as expired.
datecheck() {
echo "== clock drift check =="
echo -n " local time: "
date
echo -n " network time: "
curl -fs --head http://detectportal.firefox.com/success.txt | grep ^Date: \
| sed 's/Date: //g' || true
echo "== end clock drift check =="
}
datecheck
trap datecheck EXIT
# We've had problems in the past of shell scripts leaking fds into the sccache
# server (#48192) which causes Cargo to erroneously think that a build script
# hasn't finished yet. Try to solve that problem by starting a very long-lived

View File

@ -3,7 +3,7 @@ dist: trusty
language: rust
cache: cargo
rust:
- 1.31.1
- 1.37.0
branches:
only:
- master

View File

@ -19,7 +19,7 @@ releases are updated less frequently.
## Requirements
Building the book requires [mdBook], ideally the same 0.2.x version that
Building the book requires [mdBook], ideally the same 0.3.x version that
rust-lang/rust uses in [this file][rust-mdbook]. To get it:
[mdBook]: https://github.com/rust-lang-nursery/mdBook

View File

@ -60,6 +60,7 @@ for potential future use.
* `abstract`
* `async`
* `await`
* `become`
* `box`
* `do`

View File

@ -668,7 +668,7 @@ error[E0308]: mismatched types
--> src/main.rs:23:21
|
23 | match guess.cmp(&secret_number) {
| ^^^^^^^^^^^^^^ expected struct `std::string::String`, found integral variable
| ^^^^^^^^^^^^^^ expected struct `std::string::String`, found integer
|
= note: expected type `&std::string::String`
= note: found type `&{integer}`

View File

@ -65,7 +65,7 @@ But mutability can be very useful. Variables are immutable only by default; as
you did in Chapter 2, you can make them mutable by adding `mut` in front of the
variable name. In addition to allowing this value to change, `mut` conveys
intent to future readers of the code by indicating that other parts of the code
will be changing this variable value.
will be changing this variable's value.
For example, lets change *src/main.rs* to the following:

View File

@ -96,7 +96,7 @@ error[E0308]: mismatched types
--> src/main.rs:4:8
|
4 | if number {
| ^^^^^^ expected bool, found integral variable
| ^^^^^^ expected bool, found integer
|
= note: expected type `bool`
found type `{integer}`
@ -240,7 +240,7 @@ error[E0308]: if and else have incompatible types
6 | | } else {
7 | | "six"
8 | | };
| |_____^ expected integral variable, found &str
| |_____^ expected integer, found &str
|
= note: expected type `{integer}`
found type `&str`

View File

@ -146,9 +146,9 @@ that is stored on the heap and explore how Rust knows when to clean up that
data.
Well use `String` as the example here and concentrate on the parts of `String`
that relate to ownership. These aspects also apply to other complex data types
provided by the standard library and that you create. Well discuss `String` in
more depth in Chapter 8.
that relate to ownership. These aspects also apply to other complex data types,
whether they are provided by the standard library or created by you. Well
discuss `String` in more depth in Chapter 8.
Weve already seen string literals, where a string value is hardcoded into our
program. String literals are convenient, but they arent suitable for every

View File

@ -314,7 +314,7 @@ Lets take a closer look at exactly whats happening at each stage of our
<span class="filename">Filename: src/main.rs</span>
```rust,ignore
```rust,ignore,does_not_compile
fn dangle() -> &String { // dangle returns a reference to a String
let s = String::from("hello"); // s is a new String

View File

@ -165,7 +165,7 @@ fn main() {
<span class="caption">Listing 5-11: Attempting to print a `Rectangle`
instance</span>
When we run this code, we get an error with this core message:
When we compile this code, we get an error with this core message:
```text
error[E0277]: `Rectangle` doesn't implement `std::fmt::Display`
@ -195,7 +195,7 @@ Lets try it! The `println!` macro call will now look like `println!("rect1 is
enables us to print our struct in a way that is useful for developers so we can
see its value while were debugging our code.
Run the code with this change. Drat! We still get an error:
Compile the code with this change. Drat! We still get an error:
```text
error[E0277]: `Rectangle` doesn't implement `std::fmt::Debug`

View File

@ -41,10 +41,9 @@ root, `hosting` is now a valid name in that scope, just as though the `hosting`
module had been defined in the crate root. Paths brought into scope with `use`
also check privacy, like any other paths.
Specifying a relative path with `use` is slightly different. Instead of
starting from a name in the current scope, we must start the path given to
`use` with the keyword `self`. Listing 7-12 shows how to specify a relative
path to get the same behavior as in Listing 7-11.
You can also bring an item into scope with `use` and a relative path. Listing
7-12 shows how to specify a relative path to get the same behavior as in
Listing 7-11.
<span class="filename">Filename: src/lib.rs</span>
@ -55,7 +54,7 @@ mod front_of_house {
}
}
use self::front_of_house::hosting;
use front_of_house::hosting;
pub fn eat_at_restaurant() {
hosting::add_to_waitlist();
@ -66,10 +65,7 @@ pub fn eat_at_restaurant() {
```
<span class="caption">Listing 7-12: Bringing a module into scope with `use` and
a relative path starting with `self`</span>
Note that using `self` in this way might not be necessary in the future; its
an inconsistency in the language that Rust developers are working to eliminate.
a relative path</span>
### Creating Idiomatic `use` Paths

View File

@ -51,7 +51,7 @@ us that the types dont match. The error message will then tell us what the
type of `f` *is*. Lets try it! We know that the return type of `File::open`
isnt of type `u32`, so lets change the `let f` statement to this:
```rust,ignore
```rust,ignore,does_not_compile
let f: u32 = File::open("hello.txt");
```
@ -481,7 +481,7 @@ must be a `Result` to be compatible with this `return`.
Lets look at what happens if we use the `?` operator in the `main` function,
which youll recall has a return type of `()`:
```rust,ignore
```rust,ignore,does_not_compile
use std::fs::File;
fn main() {

View File

@ -207,8 +207,8 @@ error[E0308]: mismatched types
--> src/main.rs:7:38
|
7 | let wont_work = Point { x: 5, y: 4.0 };
| ^^^ expected integral variable, found
floating-point variable
| ^^^ expected integer, found
floating-point number
|
= note: expected type `{integer}`
found type `{float}`

View File

@ -132,7 +132,7 @@ fn it_adds_two() {
`adder` crate</span>
Weve added `use adder` at the top of the code, which we didnt need in the
unit tests. The reason is that each test in the `tests` directory is a separate
unit tests. The reason is that each file in the `tests` directory is a separate
crate, so we need to bring our library into each test crates scope.
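For reference, a minimal sketch of such an integration test file (the `adder` crate and its `add_two` function follow the chapter's running example and are assumed here):

```rust,ignore
// tests/integration_test.rs
// Each file under `tests/` is compiled as its own, separate crate, so the
// library under test must be brought into scope explicitly.
use adder;

#[test]
fn it_adds_two() {
    assert_eq!(4, adder::add_two(2));
}
```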
We dont need to annotate any code in *tests/integration_test.rs* with

View File

@ -402,7 +402,7 @@ error[E0308]: mismatched types
|
| let n = example_closure(5);
| ^ expected struct `std::string::String`, found
integral variable
integer
|
= note: expected type `std::string::String`
found type `{integer}`

View File

@ -66,4 +66,4 @@ Cargo will use the defaults for the `dev` profile plus our customization to
optimizations than the default, but not as many as in a release build.
For the full list of configuration options and defaults for each profile, see
[Cargos documentation](https://doc.rust-lang.org/cargo/).
[Cargos documentation](https://doc.rust-lang.org/cargo/reference/manifest.html#the-profile-sections).

View File

@ -10,7 +10,7 @@ Lets first look at how the dereference operator works with regular references
Then well try to define a custom type that behaves like `Box<T>`, and see why
the dereference operator doesnt work like a reference on our newly defined
type. Well explore how implementing the `Deref` trait makes it possible for
smart pointers to work in a similar way as references. Then well look at
smart pointers to work in ways similar to references. Then well look at
Rusts *deref coercion* feature and how it lets us work with either references
or smart pointers.

View File

@ -35,12 +35,11 @@ want to send over the channel.
<span class="filename">Filename: src/main.rs</span>
```rust
```rust,ignore,does_not_compile
use std::sync::mpsc;
fn main() {
let (tx, rx) = mpsc::channel();
# tx.send(()).unwrap();
}
```
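For contrast, a minimal sketch in which the channel's element type can be inferred because a value is actually sent (this mirrors the chapter's later listings and is only an illustration):

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel();

    thread::spawn(move || {
        // Sending a concrete `String` lets the compiler infer the
        // channel's type parameter.
        tx.send(String::from("hi")).unwrap();
    });

    // Block until the spawned thread's message arrives.
    let received = rx.recv().unwrap();
    println!("Got: {}", received);
}
```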

View File

@ -2,7 +2,7 @@
Message passing is a fine way of handling concurrency, but its not the only
one. Consider this part of the slogan from the Go language documentation again:
“communicate by sharing memory.”
do not communicate by sharing memory.”
What would communicating by sharing memory look like? In addition, why would
message-passing enthusiasts not use it and do the opposite instead?

View File

@ -380,7 +380,7 @@ otherwise, we want to return an empty string slice, as shown in Listing 17-17:
impl Post {
// --snip--
pub fn content(&self) -> &str {
self.state.as_ref().unwrap().content(&self)
self.state.as_ref().unwrap().content(self)
}
// --snip--
}

View File

@ -105,9 +105,9 @@ match x {
This code prints `one or two`.
### Matching Ranges of Values with `...`
### Matching Ranges of Values with `..=`
The `...` syntax allows us to match to an inclusive range of values. In the
The `..=` syntax allows us to match to an inclusive range of values. In the
following code, when a pattern matches any of the values within the range, that
arm will execute:
@ -115,14 +115,14 @@ arm will execute:
let x = 5;
match x {
1...5 => println!("one through five"),
1..=5 => println!("one through five"),
_ => println!("something else"),
}
```
If `x` is 1, 2, 3, 4, or 5, the first arm will match. This syntax is more
convenient than using the `|` operator to express the same idea; instead of
`1...5`, we would have to specify `1 | 2 | 3 | 4 | 5` if we used `|`.
`1..=5`, we would have to specify `1 | 2 | 3 | 4 | 5` if we used `|`.
Specifying a range is much shorter, especially if we want to match, say, any
number between 1 and 1,000!
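As a quick illustration of that last point, a sketch (not one of the book's listings) matching the 1-to-1,000 case with a single range pattern:

```rust
fn main() {
    let n = 547;
    match n {
        1..=1_000 => println!("one through one thousand"),
        _ => println!("something else"),
    }
}
```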
@ -136,8 +136,8 @@ Here is an example using ranges of `char` values:
let x = 'c';
match x {
'a'...'j' => println!("early ASCII letter"),
'k'...'z' => println!("late ASCII letter"),
'a'..='j' => println!("early ASCII letter"),
'k'..='z' => println!("late ASCII letter"),
_ => println!("something else"),
}
```
@ -783,7 +783,7 @@ were applied only to the final value in the list of values specified using the
The *at* operator (`@`) lets us create a variable that holds a value at the
same time were testing that value to see whether it matches a pattern. Listing
18-29 shows an example where we want to test that a `Message::Hello` `id` field
is within the range `3...7`. But we also want to bind the value to the variable
is within the range `3..=7`. But we also want to bind the value to the variable
`id_variable` so we can use it in the code associated with the arm. We could
name this variable `id`, the same as the field, but for this example well use
a different name.
@ -796,10 +796,10 @@ enum Message {
let msg = Message::Hello { id: 5 };
match msg {
Message::Hello { id: id_variable @ 3...7 } => {
Message::Hello { id: id_variable @ 3..=7 } => {
println!("Found an id in range: {}", id_variable)
},
Message::Hello { id: 10...12 } => {
Message::Hello { id: 10..=12 } => {
println!("Found an id in another range")
},
Message::Hello { id } => {
@ -812,7 +812,7 @@ match msg {
while also testing it</span>
This example will print `Found an id in range: 5`. By specifying `id_variable
@` before the range `3...7`, were capturing whatever value matched the range
@` before the range `3..=7`, were capturing whatever value matched the range
while also testing that the value matched the range pattern.
In the second arm, where we only have a range specified in the pattern, the code

View File

@ -33,6 +33,7 @@ the ability to:
* Call an unsafe function or method
* Access or modify a mutable static variable
* Implement an unsafe trait
* Access fields of `union`s
Its important to understand that `unsafe` doesnt turn off the borrow checker
or disable any other of Rusts safety checks: if you use a reference in unsafe

View File

@ -895,7 +895,7 @@ at Listing 20-19.
# use std::sync::mpsc;
# struct Worker {}
type Job = Box<FnOnce() + Send + 'static>;
type Job = Box<dyn FnOnce() + Send + 'static>;
impl ThreadPool {
// --snip--

View File

@ -2,7 +2,7 @@
*by Steve Klabnik and Carol Nichols, with contributions from the Rust Community*
This version of the text assumes youre using Rust 1.31.0 or later with
This version of the text assumes youre using Rust 1.37.0 or later with
`edition="2018"` in *Cargo.toml* of all projects to use Rust 2018 Edition
idioms. See the [“Installation” section of Chapter 1][install]<!-- ignore -->
to install or update Rust, and see the new [Appendix E][editions]<!-- ignore

View File

@ -7,7 +7,7 @@ extern crate walkdir;
use docopt::Docopt;
use std::{path, fs, io};
use std::io::{BufRead, Write};
use std::io::BufRead;
fn main () {
let args: Args = Docopt::new(USAGE)

View File

@ -4,11 +4,10 @@ main() {
local tag=$(git ls-remote --tags --refs --exit-code \
https://github.com/rust-lang-nursery/mdbook \
| cut -d/ -f3 \
| grep -E '^v[0.1.0-9.]+$' \
| grep -E '^v[0-9\.]+$' \
| sort --version-sort \
| tail -n1)
# Temporarily use older version until packages are available for 0.2.2 (or newer)
local tag="v0.2.1"
curl -LSfs https://japaric.github.io/trust/install.sh | \
sh -s -- --git rust-lang-nursery/mdbook --tag $tag

View File

@ -130,3 +130,16 @@ panicked at 'assertion failed: `(left == right)`
$ echo $?
1
```
**NOTE**: To enable this feature on `panic-semihosting`, edit your
`Cargo.toml` dependencies section where `panic-semihosting` is specified with:
``` toml
panic-semihosting = { version = "VERSION", features = ["exit"] }
```
where `VERSION` is the desired version. For more information on dependency
features, check the [`specifying dependencies`] section of the Cargo book.
[`specifying dependencies`]:
https://doc.rust-lang.org/cargo/reference/specifying-dependencies.html

View File

@ -51,7 +51,7 @@ and performing an *unsizing coercion*:
```rust
struct MySuperSliceable<T: ?Sized> {
info: u32,
data: T
data: T,
}
fn main() {
@ -111,10 +111,15 @@ support values.
Safe code need not worry about ZSTs, but *unsafe* code must be careful about the
consequence of types with no size. In particular, pointer offsets are no-ops,
and standard allocators may return `null` when a zero-sized allocation is
requested, which is indistinguishable from the out of memory result.
and allocators typically [require a non-zero size][alloc].
Note that references to ZSTs (including empty slices), just like all other
references, must be non-null and suitably aligned. Dereferencing a null or
unaligned pointer to a ZST is [undefined behavior][ub], just like for any other
type.
[alloc]: https://doc.rust-lang.org/std/alloc/trait.GlobalAlloc.html#tymethod.alloc
[ub]: what-unsafe-does.html
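As a small, self-contained illustration of these points (a sketch, not part of the book's text; the type name is made up):

```rust
use std::mem::size_of;

struct Zst; // a zero-sized type

fn main() {
    assert_eq!(size_of::<Zst>(), 0);

    // Offsets over a ZST are no-ops: every element of a ZST array
    // has the same address.
    let xs = [Zst, Zst, Zst];
    assert_eq!(&xs[0] as *const Zst, &xs[2] as *const Zst);

    // References to ZSTs still have to be non-null and suitably aligned;
    // the references taken above satisfy both.
}
```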

View File

@ -247,7 +247,7 @@ Second, and more seriously, lifetimes are only a part of the reference itself. T
type of the referent is shared knowledge, which is why adjusting that type in only
one place (the reference) can lead to issues. But if you shrink down a reference's
lifetime when you hand it to someone, that lifetime information isn't shared in
anyway. There are now two independent references with independent lifetimes.
any way. There are now two independent references with independent lifetimes.
There's no way to mess with the original reference's lifetime using the other one.
Or rather, the only way to mess with someone's lifetime is to build a meowing dog.

View File

@ -36,4 +36,4 @@ pointer casts.
[unbounded lifetime]: unbounded-lifetimes.html
[transmute]: ../std/mem/fn.transmute.html
[transmute_copy]: ../std/mem/fn.transmute.html
[transmute_copy]: ../std/mem/fn.transmute_copy.html

View File

@ -8,25 +8,89 @@ Unfortunately this is pretty rigid, especially if you need to initialize your
array in a more incremental or dynamic way.
Unsafe Rust gives us a powerful tool to handle this problem:
[`mem::uninitialized`][uninitialized]. This function pretends to return a value
when really it does nothing at all. Using it, we can convince Rust that we have
initialized a variable, allowing us to do trickier things with conditional and
incremental initialization.
[`MaybeUninit`]. This type can be used to handle memory that has not been fully
initialized yet.
Unfortunately, this opens us up to all kinds of problems. Assignment has a
different meaning to Rust based on whether it believes that a variable is
initialized or not. If it's believed uninitialized, then Rust will semantically
just memcopy the bits over the uninitialized ones, and do nothing else. However
if Rust believes a value to be initialized, it will try to `Drop` the old value!
Since we've tricked Rust into believing that the value is initialized, we can no
longer safely use normal assignment.
With `MaybeUninit`, we can initialize an array element-for-element as follows:
This is also a problem if you're working with a raw system allocator, which
returns a pointer to uninitialized memory.
```rust
use std::mem::{self, MaybeUninit};
To handle this, we must use the [`ptr`] module. In particular, it provides
three functions that allow us to assign bytes to a location in memory without
dropping the old value: [`write`], [`copy`], and [`copy_nonoverlapping`].
// Size of the array is hard-coded but easy to change (meaning, changing just
// the constant is sufficient). This means we can't use [a, b, c] syntax to
// initialize the array, though, as we would have to keep that in sync
// with `SIZE`!
const SIZE: usize = 10;
let x = {
// Create an uninitialized array of `MaybeUninit`. The `assume_init` is
// safe because the type we are claiming to have initialized here is a
// bunch of `MaybeUninit`s, which do not require initialization.
let mut x: [MaybeUninit<Box<u32>>; SIZE] = unsafe {
MaybeUninit::uninit().assume_init()
};
// Dropping a `MaybeUninit` does nothing. Thus using raw pointer
// assignment instead of `ptr::write` does not cause the old
// uninitialized value to be dropped.
// Exception safety is not a concern because Box can't panic
for i in 0..SIZE {
x[i] = MaybeUninit::new(Box::new(i as u32));
}
// Everything is initialized. Transmute the array to the
// initialized type.
unsafe { mem::transmute::<_, [Box<u32>; SIZE]>(x) }
};
dbg!(x);
```
This code proceeds in three steps:
1. Create an array of `MaybeUninit<T>`. With current stable Rust, we have to use
unsafe code for this: we take some uninitialized piece of memory
(`MaybeUninit::uninit()`) and claim we have fully initialized it
([`assume_init()`][assume_init]). This seems ridiculous, because we didn't!
The reason this is correct is that the array itself consists entirely of
`MaybeUninit`s, which do not actually require initialization. For most other
types, doing `MaybeUninit::uninit().assume_init()` produces an invalid
instance of said type, so you got yourself some Undefined Behavior.
2. Initialize the array. The subtle aspect of this is that usually, when we use
`=` to assign to a value that the Rust type checker considers to already be
initialized (like `x[i]`), the old value stored on the left-hand side gets
dropped. This would be a disaster. However, in this case, the type of the
left-hand side is `MaybeUninit<Box<u32>>`, and dropping that does not do
anything! See below for some more discussion of this `drop` issue.
3. Finally, we have to change the type of our array to remove the
`MaybeUninit`. With current stable Rust, this requires a `transmute`.
This transmute is legal because in memory, `MaybeUninit<T>` looks the same as `T`.
However, note that in general, `Container<MaybeUninit<T>>` does *not* look
the same as `Container<T>`! Imagine if `Container` was `Option`, and `T` was
`bool`, then `Option<bool>` exploits that `bool` only has two valid values,
but `Option<MaybeUninit<bool>>` cannot do that because the `bool` does not
have to be initialized.
So, it depends on `Container` whether transmuting away the `MaybeUninit` is
allowed. For arrays, it is (and eventually the standard library will
acknowledge that by providing appropriate methods).
It's worth spending a bit more time on the loop in the middle, and in particular
the assignment operator and its interaction with `drop`. If we had
written something like
```rust,ignore
*x[i].as_mut_ptr() = Box::new(i as u32); // WRONG!
```
we would actually overwrite a `Box<u32>`, leading to `drop` of uninitialized
data, which will cause much sadness and pain.
The correct alternative, if for some reason we cannot use `MaybeUninit::new`, is
to use the [`ptr`] module. In particular, it provides three functions that allow
us to assign bytes to a location in memory without dropping the old value:
[`write`], [`copy`], and [`copy_nonoverlapping`].
* `ptr::write(ptr, val)` takes a `val` and moves it into the address pointed
to by `ptr`.
@ -40,59 +104,53 @@ dropping the old value: [`write`], [`copy`], and [`copy_nonoverlapping`].
It should go without saying that these functions, if misused, will cause serious
havoc or just straight up Undefined Behavior. The only things that these
functions *themselves* require is that the locations you want to read and write
are allocated. However the ways writing arbitrary bits to arbitrary
locations of memory can break things are basically uncountable!
Putting this all together, we get the following:
```rust
use std::mem;
use std::ptr;
// size of the array is hard-coded but easy to change. This means we can't
// use [a, b, c] syntax to initialize the array, though!
const SIZE: usize = 10;
let mut x: [Box<u32>; SIZE];
unsafe {
// convince Rust that x is Totally Initialized
x = mem::uninitialized();
for i in 0..SIZE {
// very carefully overwrite each index without reading it
// NOTE: exception safety is not a concern; Box can't panic
ptr::write(&mut x[i], Box::new(i as u32));
}
}
println!("{:?}", x);
```
are allocated and properly aligned. However, the ways writing arbitrary bits to
arbitrary locations of memory can break things are basically uncountable!
It's worth noting that you don't need to worry about `ptr::write`-style
shenanigans with types which don't implement `Drop` or contain `Drop` types,
because Rust knows not to try to drop them. Similarly you should be able to
assign to fields of partially initialized structs directly if those fields don't
contain any `Drop` types.
because Rust knows not to try to drop them. This is what we relied on in the
above example.
However when working with uninitialized memory you need to be ever-vigilant for
Rust trying to drop values you make like this before they're fully initialized.
Every control path through that variable's scope must initialize the value
before it ends, if it has a destructor.
*[This includes code panicking](unwinding.html)*.
*[This includes code panicking](unwinding.html)*. `MaybeUninit` helps a bit
here, because it does not implicitly drop its content - but all this really
means in case of a panic is that instead of a double-free of the not yet
initialized parts, you end up with a memory leak of the already initialized
parts.
Not being careful about uninitialized memory often leads to bugs and it has been
decided the [`mem::uninitialized`][uninitialized] function should be deprecated.
The [`MaybeUninit`] type is supposed to replace it as its API wraps many common
operations needed to be done around initialized memory. This is nightly only for
now.
Note that, to use the `ptr` methods, you need to first obtain a *raw pointer* to
the data you want to initialize. It is illegal to construct a *reference* to
uninitialized data, which implies that you have to be careful when obtaining
said raw pointer:
* For an array of `T`, you can use `base_ptr.add(idx)` where `base_ptr: *mut T`
to compute the address of array index `idx`. This relies on
how arrays are laid out in memory.
* For a struct, however, in general we do not know how it is laid out, and we
also cannot use `&mut base_ptr.field` as that would be creating a
reference. Thus, it is currently not possible to create a raw pointer to a field
of a partially initialized struct, and also not possible to initialize a single
field of a partially initialized struct. (A
[solution to this problem](https://github.com/rust-lang/rfcs/pull/2582) is being
worked on.)
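A minimal sketch of the array case under those constraints, reusing the `Box<u32>` array from the example above (illustrative only; with `MaybeUninit` elements a plain assignment would also be fine, but the point here is the raw-pointer route):

```rust
use std::mem::{transmute, MaybeUninit};
use std::ptr;

fn main() {
    const SIZE: usize = 10;

    // Uninitialized storage, as in the example earlier in this section.
    let mut x: [MaybeUninit<Box<u32>>; SIZE] =
        unsafe { MaybeUninit::uninit().assume_init() };

    // Take a raw pointer to the first element and compute each element's
    // address with `add`, never forming a reference to uninitialized data.
    let base_ptr = x.as_mut_ptr();
    for i in 0..SIZE {
        unsafe {
            // `ptr::write` moves the new value in without dropping the
            // (uninitialized) old contents.
            ptr::write(base_ptr.add(i), MaybeUninit::new(Box::new(i as u32)));
        }
    }

    // Everything is initialized; strip the `MaybeUninit` wrapper.
    let x: [Box<u32>; SIZE] = unsafe { transmute(x) };
    dbg!(x);
}
```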
One last remark: when reading old Rust code, you might stumble upon the
deprecated `mem::uninitialized` function. That function used to be the only way
to deal with uninitialized memory on the stack, but it turned out to be
impossible to properly integrate with the rest of the language. Always use
`MaybeUninit` instead in new code, and port old code over when you get the
opportunity.
And that's about it for working with uninitialized memory! Basically nothing
anywhere expects to be handed uninitialized memory, so if you're going to pass
it around at all, be sure to be *really* careful.
[uninitialized]: ../std/mem/fn.uninitialized.html
[`ptr`]: ../std/ptr/index.html
[`write`]: ../std/ptr/fn.write.html
[`MaybeUninit`]: ../core/mem/union.MaybeUninit.html
[assume_init]: ../core/mem/union.MaybeUninit.html#method.assume_init
[`ptr`]: ../core/ptr/index.html
[`write`]: ../core/ptr/fn.write.html
[`copy`]: ../std/ptr/fn.copy.html
[`copy_nonoverlapping`]: ../std/ptr/fn.copy_nonoverlapping.html
[`MaybeUninit`]: ../std/mem/union.MaybeUninit.html

View File

@ -9,7 +9,7 @@ in a similar form as it is today.
However we will generally try to avoid unstable code where possible. In
particular we won't use any intrinsics that could make a code a little
bit nicer or efficient because intrinsics are permanently unstable. Although
many intrinsics *do* become stabilized elsewhere (`std::ptr` and `str::mem`
many intrinsics *do* become stabilized elsewhere (`std::ptr` and `std::mem`
consist of many intrinsics).
Ultimately this means our implementation may not take advantage of all

View File

@ -16,18 +16,44 @@ to your program. You definitely *should not* invoke Undefined Behavior.
Unlike C, Undefined Behavior is pretty limited in scope in Rust. All the core
language cares about is preventing the following things:
* Dereferencing null, dangling, or unaligned pointers
* Reading [uninitialized memory][]
* Dereferencing (using the `*` operator on) dangling or unaligned pointers (see below)
* Breaking the [pointer aliasing rules][]
* Producing invalid primitive values:
* dangling/null references
* null `fn` pointers
* a `bool` that isn't 0 or 1
* an undefined `enum` discriminant
* a `char` outside the ranges [0x0, 0xD7FF] and [0xE000, 0x10FFFF]
* A non-utf8 `str`
* Unwinding into another language
* Calling a function with the wrong call ABI or unwinding from a function with the wrong unwind ABI.
* Causing a [data race][race]
* Executing code compiled with [target features][] that the current thread of execution does
not support
* Producing invalid values (either alone or as a field of a compound type such
as `enum`/`struct`/array/tuple):
* a `bool` that isn't 0 or 1
* an `enum` with an invalid discriminant
* a null `fn` pointer
* a `char` outside the ranges [0x0, 0xD7FF] and [0xE000, 0x10FFFF]
* a `!` (all values are invalid for this type)
* an integer (`i*`/`u*`), floating point value (`f*`), or raw pointer read from
[uninitialized memory][]
* a reference/`Box` that is dangling, unaligned, or points to an invalid value.
* a wide reference, `Box`, or raw pointer that has invalid metadata:
* `dyn Trait` metadata is invalid if it is not a pointer to a vtable for
`Trait` that matches the actual dynamic trait the pointer or reference points to
* slice metadata is invalid if the length is not a valid `usize`
(i.e., it must not be read from uninitialized memory)
* a `str` that isn't valid UTF-8
* a type with custom invalid values that is one of those values, such as a
`NonNull` that is null. (Requesting custom invalid values is an unstable
feature, but some stable libstd types, like `NonNull`, make use of it.)
"Producing" a value happens any time a value is assigned, passed to a
function/primitive operation or returned from a function/primitive operation.
A reference/pointer is "dangling" if it is null or not all of the bytes it
points to are part of the same allocation (so in particular they all have to be
part of *some* allocation). The span of bytes it points to is determined by the
pointer value and the size of the pointee type. As a consequence, if the span is
empty, "dangling" is the same as "non-null". Note that slices point to their
entire range, so it's important that the length metadata is never too large
(in particular, allocations and therefore slices cannot be bigger than
`isize::MAX` bytes). If for some reason this is too cumbersome, consider using
raw pointers.
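As a hedged illustration of the invalid-value rule above (this snippet is not part of the original list and must never be executed):
```rust,no_run
use std::mem;

// Deliberately *wrong*: 3 is not a valid bit pattern for `bool`, so merely
// producing this value is Undefined Behavior, even if it is never inspected.
fn bad_bool() -> bool {
    unsafe { mem::transmute::<u8, bool>(3) }
}
```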
That's it. That's all the causes of Undefined Behavior baked into Rust. Of
course, unsafe functions and traits are free to declare arbitrary other
@ -58,3 +84,4 @@ these problems are considered impractical to categorically prevent.
[pointer aliasing rules]: references.html
[uninitialized memory]: uninitialized.html
[race]: races.html
[target features]: ../reference/attributes/codegen.html#the-target_feature-attribute

View File

@ -67,6 +67,7 @@
- [If and if let expressions](expressions/if-expr.md)
- [Match expressions](expressions/match-expr.md)
- [Return expressions](expressions/return-expr.md)
- [Await expressions](expressions/await-expr.md)
- [Patterns](patterns.md)

View File

@ -88,7 +88,7 @@ pub fn name_in_rust() { }
[_MetaNameValueStr_]: attributes.md#meta-item-attribute-syntax
[`static` items]: items/static-items.md
[attribute]: attributes.md
[extern functions]: items/functions.md#extern-functions
[extern functions]: items/functions.md#extern-function-qualifier
[external blocks]: items/external-blocks.md
[function]: items/functions.md
[item]: items.md

View File

@ -16,7 +16,7 @@
> &nbsp;&nbsp; | `=` [_LiteralExpression_]<sub>_without suffix_</sub>
An _attribute_ is a general, free-form metadatum that is interpreted according
to name, convention, and language and compiler version. Attributes are modeled
to name, convention, language, and compiler version. Attributes are modeled
on Attributes in [ECMA-335], with the syntax coming from [ECMA-334] \(C#).
_Inner attributes_, written with a bang (`!`) after the hash (`#`), apply to the

View File

@ -23,30 +23,49 @@ code.
</div>
* Data races.
* Dereferencing a null or dangling raw pointer.
* Unaligned pointer reading and writing outside of [`read_unaligned`]
and [`write_unaligned`].
* Reads of [undef] \(uninitialized) memory.
* Breaking the [pointer aliasing rules] on accesses through raw pointers;
a subset of the rules used by C.
* `&mut T` and `&T` follow LLVM's scoped [noalias] model, except if the `&T`
contains an [`UnsafeCell<U>`].
* Mutating non-mutable data &mdash; that is, data reached through a shared
reference or data owned by a `let` binding), unless that data is contained
within an [`UnsafeCell<U>`].
* Invoking undefined behavior via compiler intrinsics:
* Indexing outside of the bounds of an object with [`offset`] with
the exception of one byte past the end of the object.
* Using [`std::ptr::copy_nonoverlapping_memory`], a.k.a. the `memcpy32` and
`memcpy64` intrinsics, on overlapping buffers.
* Invalid values in primitive types, even in private fields and locals:
* Dangling or null references and boxes.
* A value other than `false` (`0`) or `true` (`1`) in a `bool`.
* A discriminant in an `enum` not included in the type definition.
* A value in a `char` which is a surrogate or above `char::MAX`.
* Non-UTF-8 byte sequences in a `str`.
* Dereferencing (using the `*` operator on) a dangling or unaligned raw pointer.
* Breaking the [pointer aliasing rules]. `&mut T` and `&T` follow LLVM's scoped
[noalias] model, except if the `&T` contains an [`UnsafeCell<U>`].
* Mutating immutable data. All data inside a [`const`] item is immutable. Moreover, all
data reached through a shared reference or data owned by an immutable binding
is immutable, unless that data is contained within an [`UnsafeCell<U>`].
* Invoking undefined behavior via compiler intrinsics.
* Executing code compiled with platform features that the current platform
does not support (see [`target_feature`]).
* Calling a function with the wrong call ABI or unwinding from a function with the wrong unwind ABI.
* Producing an invalid value, even in private fields and locals. "Producing" a
value happens any time a value is assigned to or read from a place, passed to
a function/primitive operation or returned from a function/primitive
operation.
The following values are invalid (at their respective type):
* A value other than `false` (`0`) or `true` (`1`) in a `bool`.
* A discriminant in an `enum` not included in the type definition.
* A null `fn` pointer.
* A value in a `char` which is a surrogate or above `char::MAX`.
* A `!` (all values are invalid for this type).
* An integer (`i*`/`u*`), floating point value (`f*`), or raw pointer obtained
from [uninitialized memory][undef].
* A reference or `Box<T>` that is dangling, unaligned, or points to an invalid value.
* Invalid metadata in a wide reference, `Box<T>`, or raw pointer:
* `dyn Trait` metadata is invalid if it is not a pointer to a vtable for
`Trait` that matches the actual dynamic trait the pointer or reference points to.
* Slice metadata is invalid if the length is not a valid `usize`
(i.e., it must not be read from uninitialized memory).
* Non-UTF-8 byte sequences in a `str`.
* Invalid values for a type with a custom definition of invalid values.
In the standard library, this affects [`NonNull<T>`] and [`NonZero*`].
> **Note**: `rustc` achieves this with the unstable
> `rustc_layout_scalar_valid_range_*` attributes.
A reference/pointer is "dangling" if it is null or not all of the bytes it
points to are part of the same allocation (so in particular they all have to be
part of *some* allocation). The span of bytes it points to is determined by the
pointer value and the size of the pointee type. As a consequence, if the span is
empty, "dangling" is the same as "non-null". Note that slices point to their
entire range, so it is important that the length metadata is never too
large. In particular, allocations and therefore slices cannot be bigger than
`isize::MAX` bytes.
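As an added, purely illustrative example of the same rule (never execute this): 0xD800 is a surrogate code point, so producing a `char` with that value is undefined behavior.
```rust,no_run
// Deliberately *wrong*: producing a surrogate `char` is undefined behavior.
fn bad_char() -> char {
    unsafe { std::char::from_u32_unchecked(0xD800) }
}
```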
> **Note**: Undefined behavior affects the entire program. For example, calling
> a function in C that exhibits undefined behavior of C means your entire
@ -54,13 +73,12 @@ code.
> vice versa, undefined behavior in Rust can cause adverse effects on code
> executed by any FFI calls to other languages.
[`const`]: items/constant-items.html
[noalias]: http://llvm.org/docs/LangRef.html#noalias
[pointer aliasing rules]: http://llvm.org/docs/LangRef.html#pointer-aliasing-rules
[undef]: http://llvm.org/docs/LangRef.html#undefined-values
[`offset`]: ../std/primitive.pointer.html#method.offset
[`std::ptr::copy_nonoverlapping_memory`]: ../std/ptr/fn.copy_nonoverlapping.html
[`target_feature`]: attributes/codegen.md#the-target_feature-attribute
[`UnsafeCell<U>`]: ../std/cell/struct.UnsafeCell.html
[`read_unaligned`]: ../std/ptr/fn.read_unaligned.html
[`write_unaligned`]: ../std/ptr/fn.write_unaligned.html
[Rustonomicon]: ../nomicon/index.html
[`NonNull<T>`]: ../core/ptr/struct.NonNull.html
[`NonZero*`]: ../core/num/index.html

View File

@ -25,7 +25,7 @@
*Conditionally compiled source code* is source code that may or may not be
considered a part of the source code depending on certain conditions. <!-- This
definition is sort of vacuous --> Source code can be conditionally compiled
using [attributes], [`cfg`] and [`cfg_attr`], and the built-in [`cfg` macro].
using the [attributes] [`cfg`] and [`cfg_attr`] and the built-in [`cfg` macro].
These conditions are based on the target architecture of the compiled crate,
arbitrary values passed to the compiler, and a few other miscellaneous things
further described below in detail.
@ -284,7 +284,7 @@ fn bewitched() {}
```
> **Note**: The `cfg_attr` can expand to another `cfg_attr`. For example,
> `#[cfg_attr(linux, cfg_attr(feature = "multithreaded", some_other_attribute))`
> `#[cfg_attr(linux, cfg_attr(feature = "multithreaded", some_other_attribute))]`
> is valid. This example would be equivalent to
> `#[cfg_attr(all(linux, feature = "multithreaded"), some_other_attribute)]`.

View File

@ -37,8 +37,8 @@ to be run.
* Index expressions, [array indexing] or [slice] with a `usize`.
* [Range expressions].
* [Closure expressions] which don't capture variables from the environment.
* Built in [negation], [arithmetic, logical], [comparison] or [lazy boolean]
operators used on integer and floating point types, `bool` and `char`.
* Built-in [negation], [arithmetic], [logical], [comparison] or [lazy boolean]
operators used on integer and floating point types, `bool`, and `char`.
* Shared [borrow]s, except if applied to a type with [interior mutability].
* The [dereference operator].
* [Grouped] expressions.
@ -57,7 +57,7 @@ A _const context_ is one of the following:
* [statics]
* [enum discriminants]
[arithmetic, logical]: expressions/operator-expr.md#arithmetic-and-logical-binary-operators
[arithmetic]: expressions/operator-expr.md#arithmetic-and-logical-binary-operators
[array expressions]: expressions/array-expr.md
[array indexing]: expressions/array-expr.md#array-and-slice-indexing-expressions
@ -84,6 +84,7 @@ A _const context_ is one of the following:
[lazy boolean]: expressions/operator-expr.md#lazy-boolean-operators
[let statements]: statements.md#let-statements
[literals]: expressions/literal-expr.md
[logical]: expressions/operator-expr.md#arithmetic-and-logical-binary-operators
[negation]: expressions/operator-expr.md#negation-operators
[overflow]: expressions/operator-expr.md#overflow
[paths]: expressions/path-expr.md
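As an illustrative sketch of several entries from the list of permitted constant expressions above (the names are arbitrary):
```rust
const LEN: usize = 2 + 2;          // built-in arithmetic on integers
const ARR: [u8; LEN] = [1; LEN];   // array expression with a constant operand
const FIRST: u8 = ARR[0];          // array indexing with a `usize`
const HALF: u8 = FIRST / 2;        // arithmetic again, still in a const context
```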

View File

@ -29,7 +29,7 @@ crate in binary form: either an executable or some sort of
library.[^cratesourcefile]
A _crate_ is a unit of compilation and linking, as well as versioning,
distribution and runtime loading. A crate contains a _tree_ of nested
distribution, and runtime loading. A crate contains a _tree_ of nested
[module] scopes. The top level of this tree is a module that is
anonymous (from the point of view of paths within the module) and any item
within a crate has a canonical [module path] denoting its location

View File

@ -20,7 +20,7 @@ types">DSTs</abbr>. Such types can only be used in certain cases:
last field, this makes the struct itself a
<abbr title="dynamically sized type">DST</abbr>.
Notably: [variables], function parameters, [const] and [static] items must be
> **Note**: [variables], function parameters, [const] items, and [static] items must be
`Sized`.
[sized]: special-types-and-traits.md#sized

View File

@ -13,6 +13,7 @@
> &nbsp;&nbsp; &nbsp;&nbsp; | [_OperatorExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_GroupedExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_ArrayExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_AwaitExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_IndexExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_TupleExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_TupleIndexingExpression_]\
@ -33,6 +34,7 @@
> &nbsp;&nbsp; [_OuterAttribute_]<sup>\*</sup>[†](#expression-attributes)\
> &nbsp;&nbsp; (\
> &nbsp;&nbsp; &nbsp;&nbsp; &nbsp;&nbsp; [_BlockExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_AsyncBlockExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_UnsafeBlockExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_LoopExpression_]\
> &nbsp;&nbsp; &nbsp;&nbsp; | [_IfExpression_]\
@ -263,7 +265,7 @@ a few specific cases:
* Before an expression used as a [statement].
* Elements of [array expressions], [tuple expressions], [call expressions],
tuple-style [struct] and [enum variant] expressions.
and tuple-style [struct] and [enum variant] expressions.
<!--
These were likely stabilized inadvertently.
See https://github.com/rust-lang/rust/issues/32796 and
@ -324,6 +326,8 @@ They are never allowed before:
[_ArithmeticOrLogicalExpression_]: expressions/operator-expr.md#arithmetic-and-logical-binary-operators
[_ArrayExpression_]: expressions/array-expr.md
[_AsyncBlockExpression_]: expressions/block-expr.md#async-blocks
[_AwaitExpression_]: expressions/await-expr.md
[_AssignmentExpression_]: expressions/operator-expr.md#assignment-expressions
[_BlockExpression_]: expressions/block-expr.md
[_BreakExpression_]: expressions/loop-expr.md#break-expressions

View File

@ -0,0 +1,68 @@
# Await expressions
> **<sup>Syntax</sup>**\
> _AwaitExpression_ :\
> &nbsp;&nbsp; [_Expression_] `.` `await`
Await expressions are legal only within an [async context], like an
[`async fn`] or an [`async` block]. They operate on a [future]. Their effect
is to suspend the current computation until the given future is ready
to produce a value.
More specifically, an `<expr>.await` expression has the following effect.
1. Evaluate `<expr>` to a [future] `tmp`;
2. Pin `tmp` using [`Pin::new_unchecked`];
3. This pinned future is then polled by calling the [`Future::poll`] method and
passing it the current [task context](#task-context);
4. If the call to `poll` returns [`Poll::Pending`], then the future
returns `Poll::Pending`, suspending its state so that, when the
surrounding async context is re-polled, execution returns to step
2;
5. Otherwise the call to `poll` must have returned [`Poll::Ready`], in which case the
value contained in the [`Poll::Ready`] variant is used as the result
of the `await` expression itself.
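For illustration only (this example is not part of the normative text), a small async function awaiting another:
```rust,edition2018
async fn add_one(x: u32) -> u32 { x + 1 }

async fn caller() -> u32 {
    // `add_one(41)` evaluates to a future; `.await` polls it with the current
    // task context and yields its output once it is ready.
    add_one(41).await
}
```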
[`async fn`]: ../items/functions.md#async-functions
[`async` block]: block-expr.md#async-blocks
[future]: ../../std/future/trait.Future.html
[_Expression_]: ../expressions.md
[`Future::poll`]: ../../std/future/trait.Future.html#tymethod.poll
[`Context`]: ../../std/task/struct.Context.html
[`Pin::new_unchecked`]: ../../std/pin/struct.Pin.html#method.new_unchecked
[`Poll::Pending`]: ../../std/task/enum.Poll.html#variant.Pending
[`Poll::Ready`]: ../../std/task/enum.Poll.html#variant.Ready
> **Edition differences**: Await expressions are only available beginning with
> Rust 2018.
## Task context
The task context refers to the [`Context`] which was supplied to the
current [async context] when the async context itself was
polled. Because `await` expressions are only legal in an async
context, there must be some task context available.
[`Context`]: ../../std/task/struct.Context.html
[async context]: ../expressions/block-expr.md#async-context
## Approximate desugaring
Effectively, an `<expr>.await` expression is roughly
equivalent to the following (this desugaring is not normative):
```rust,ignore
let mut future = /* <expr> */;
loop {
    let mut pin = unsafe { Pin::new_unchecked(&mut future) };
    match Future::poll(Pin::as_mut(&mut pin), &mut current_context) {
        Poll::Ready(r) => break r,
        Poll::Pending => yield Poll::Pending,
    }
}
```
where the `yield` pseudo-code returns `Poll::Pending` and, when
re-invoked, resumes execution from that point. The variable
`current_context` refers to the context taken from the async
environment.

View File

@ -80,6 +80,74 @@ fn move_by_block_expression() {
}
```
## `async` blocks
> **<sup>Syntax</sup>**\
> _AsyncBlockExpression_ :\
> &nbsp;&nbsp; `async` `move`<sup>?</sup> _BlockExpression_
An *async block* is a variant of a block expression which evaluates to
a *future*. The final expression of the block, if present, determines
the result value of the future.
Executing an async block is similar to executing a closure expression:
its immediate effect is to produce and return an anonymous type.
Whereas closures return a type that implements one or more of the
[`std::ops::Fn`] traits, however, the type returned for an async block
implements the [`std::future::Future`] trait. The actual data format for
this type is unspecified.
> **Note:** The future type that rustc generates is roughly equivalent
> to an enum with one variant per `await` point, where each variant
> stores the data needed to resume from its corresponding point.
> **Edition differences**: Async blocks are only available beginning with Rust 2018.
[`std::ops::Fn`]: ../../std/ops/trait.Fn.html
[`std::future::Future`]: ../../std/future/trait.Future.html
### Capture modes
Async blocks capture variables from their environment using the same
[capture modes] as closures. Like closures, when written `async {
.. }` the capture mode for each variable will be inferred from the
content of the block. `async move { .. }` blocks however will move all
referenced variables into the resulting future.
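A brief, illustrative sketch of the two capture modes (the variable names are arbitrary):
```rust,edition2018
fn capture_modes() {
    let message = String::from("hello");
    // Captured by reference, as a non-`move` closure would capture it.
    let borrowing = async { println!("{}", message) };

    let message2 = String::from("world");
    // Moved into the future, which now owns `message2`.
    let owning = async move { println!("{}", message2) };

    drop((borrowing, owning));
}
```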
[capture modes]: ../types/closure.md#capture-modes
[shared references]: ../types/pointer.md#shared-references-
[mutable reference]: ../types/pointer.md#mutables-references-
### Async context
Because async blocks construct a future, they define an **async
context** which can in turn contain [`await` expressions]. Async
contexts are established by async blocks as well as the bodies of
async functions, whose semantics are defined in terms of async blocks.
[`await` expressions]: await-expr.md
### Control-flow operators
Async blocks act like a function boundary, much like
closures. Therefore, the `?` operator and `return` expressions both
affect the output of the future, not the enclosing function or other
context. That is, `return <expr>` from within an async block will return
the result of `<expr>` as the output of the future. Similarly, if
`<expr>?` propagates an error, that error is propagated as the result
of the future.
Finally, the `break` and `continue` keywords cannot be used to branch
out from an async block. Therefore the following is illegal:
```rust,edition2018,compile_fail
loop {
    async move {
        break; // This would break out of the loop.
    }
}
```
## `unsafe` blocks
> **<sup>Syntax</sup>**\
@ -112,7 +180,7 @@ expression in the following situations:
* Loop bodies ([`loop`], [`while`], [`while let`], and [`for`]).
* Block expressions used as a [statement].
* Block expressions as elements of [array expressions], [tuple expressions],
[call expressions], tuple-style [struct] and [enum variant] expressions.
[call expressions], and tuple-style [struct] and [enum variant] expressions.
* A block expression as the tail expression of another block expression.
<!-- Keep list in sync with expressions.md -->

View File

@ -130,9 +130,9 @@ while let Some(v @ 1) | Some(v @ 2) = vals.pop() {
A `for` expression is a syntactic construct for looping over elements provided
by an implementation of `std::iter::IntoIterator`. If the iterator yields a
value, that value is given the specified name and the body of the loop is
executed, then control returns to the head of the `for` loop. If the iterator
is empty, the `for` expression completes.
value, that value is matched against the irrefutable pattern, the body of the
loop is executed, and then control returns to the head of the `for` loop. If the
iterator is empty, the `for` expression completes.
An example of a `for` loop over the contents of an array:
@ -181,9 +181,9 @@ is equivalent to
}
```
`IntoIterator`, `Iterator` and `Option` are always the standard library items
`IntoIterator`, `Iterator`, and `Option` are always the standard library items
here, not whatever those names resolve to in the current scope. The variable
names `next`, `iter` and `val` are for exposition only, they do not actually
names `next`, `iter`, and `val` are for exposition only; they do not actually
have names the user can type.
> **Note**: The outer `match` is used to ensure that any
@ -284,7 +284,7 @@ expression `()`.
[LIFETIME_OR_LABEL]: ../tokens.md#lifetimes-and-loop-labels
[_BlockExpression_]: block-expr.md
[_Expression_]: ../ expressions.md
[_Expression_]: ../expressions.md
[_MatchArmPatterns_]: match-expr.md
[_Pattern_]: ../patterns.md
[`match` expression]: match-expr.md

View File

@ -367,16 +367,15 @@ same trait object.
* **[NOTE: currently this will cause Undefined Behavior if the rounded
value cannot be represented by the target integer type][float-int]**.
This includes Inf and NaN. This is a bug and will be fixed.
* Casting from an integer to float will produce the floating point
representation of the integer, rounded if necessary (rounding strategy
unspecified)
* Casting from an integer to float will produce the closest possible float \*
* if necessary, rounding is according to `roundTiesToEven` mode \*\*\*
* on overflow, infinity (of the same sign as the input) is produced
* note: with the current set of numeric types, overflow can only happen
on `u128 as f32` for values greater than or equal to `f32::MAX + (0.5 ULP)`
* Casting from an f32 to an f64 is perfect and lossless
* Casting from an f64 to an f32 will produce the closest possible value
(rounding strategy unspecified)
* **[NOTE: currently this will cause Undefined Behavior if the value
is finite but larger or smaller than the largest or smallest finite
value representable by f32][float-float]**. This is a bug and will
be fixed.
* Casting from an f64 to an f32 will produce the closest possible f32 \*\*
* if necessary, rounding is according to `roundTiesToEven` mode \*\*\*
* on overflow, infinity (of the same sign as the input) is produced
* Enum cast
* Casts an enum to its discriminant, then uses a numeric cast if needed.
* Primitive to integer cast
@ -385,8 +384,19 @@ same trait object.
* `u8` to `char` cast
* Casts to the `char` with the corresponding code point.
\* if integer-to-float casts with this rounding mode and overflow behavior are
not supported natively by the hardware, these casts will likely be slower than
expected.
\*\* if f64-to-f32 casts with this rounding mode and overflow behavior are not
supported natively by the hardware, these casts will likely be slower than
expected.
\*\*\* as defined in IEEE 754-2008 &sect;4.3.1: pick the nearest floating point
number, preferring the one with an even least significant digit if exactly
halfway between two floating point numbers.
[float-int]: https://github.com/rust-lang/rust/issues/10184
[float-float]: https://github.com/rust-lang/rust/issues/15536
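A hedged illustration of the rules above (not part of the original text; the values are chosen only to exercise the edge cases):
```rust
fn main() {
    // Integer to float: rounds to the nearest representable value
    // (`roundTiesToEven`), and overflows to infinity.
    assert_eq!(16_777_217_u32 as f32, 16_777_216.0);
    assert_eq!(u128::max_value() as f32, std::f32::INFINITY);

    // f64 to f32: also rounds to the closest possible f32.
    let d: f64 = 1.0000000000000002; // the smallest f64 above 1.0
    assert_eq!(d as f32, 1.0);
}
```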
## Assignment expressions

View File

@ -16,7 +16,7 @@ that have since been removed):
* Swift: optional bindings
* Scheme: hygienic macros
* C#: attributes
* Ruby: <strike>block syntax</strike>
* Ruby: closure syntax, <strike>block syntax</strike>
* NIL, Hermes: <strike>typestate</strike>
* [Unicode Annex #31](http://www.unicode.org/reports/tr31/): identifier and
pattern syntax

View File

@ -5,7 +5,7 @@ provides three kinds of material:
- Chapters that informally describe each language construct and their use.
- Chapters that informally describe the memory model, concurrency model,
runtime services, linkage model and debugging facilities.
runtime services, linkage model, and debugging facilities.
- Appendix chapters providing rationale and references to languages that
influenced the design.

View File

@ -133,7 +133,7 @@ Shorthand | Equivalent
`&'lifetime self` | `self: &'lifetime Self`
`&'lifetime mut self` | `self: &'lifetime mut Self`
> Note: Lifetimes can be and usually are elided with this shorthand.
> **Note**: Lifetimes can be, and usually are, elided with this shorthand.
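An illustrative sketch of the shorthand (the type and method names are arbitrary):
```rust
struct Counter { value: u32 }

impl Counter {
    // `&self` is shorthand for `self: &Self`; the lifetime is elided.
    fn get(&self) -> u32 { self.value }

    // The fully spelled-out equivalent of `&'a mut self`.
    fn bump<'a>(self: &'a mut Self, by: u32) -> &'a mut Counter {
        self.value += by;
        self
    }
}
```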
If the `self` parameter is prefixed with `mut`, it becomes a mutable variable,
similar to regular parameters using a `mut` [identifier pattern]. For example:

View File

@ -29,9 +29,13 @@
> _NamedFunctionParametersWithVariadics_ :\
> &nbsp;&nbsp; ( _NamedFunctionParam_ `,` )<sup>\*</sup> _NamedFunctionParam_ `,` `...`
External blocks form the basis for Rust's foreign function interface.
Declarations in an external block describe symbols in external, non-Rust
libraries.
External blocks provide _declarations_ of items that are not _defined_ in the
current crate and are the basis of Rust's foreign function interface. These are
akin to unchecked imports.
Two kinds of item _declarations_ are allowed in external blocks: [functions] and
[statics]. Calling functions or accessing statics that are declared in external
blocks is only allowed in an `unsafe` context.
Functions within external blocks are declared in the same way as other Rust
functions, with the exception that they may not have a body and are instead
@ -48,6 +52,8 @@ extern "abi" for<'l1, ..., 'lm> fn(A1, ..., An) -> R`, where `'l1`, ... `'lm`
are its lifetime parameters, `A1`, ..., `An` are the declared types of its
parameters and `R` is the declared return type.
Statics within external blocks are declared in the same way as statics outside of external blocks,
except that they do not have an expression initializing their value.
It is `unsafe` to access a static item declared in an extern block, whether or
not it's mutable.
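A hedged sketch of such declarations (the foreign symbols are hypothetical and would have to be provided by a linked library):
```rust,ignore
extern "C" {
    // A function declaration: no body.
    fn frobnicate(x: i32) -> i32;
    // A static declaration: no initializing expression.
    static FROB_LIMIT: i32;
}

fn call_it() -> i32 {
    // Both uses require an `unsafe` context.
    unsafe { frobnicate(FROB_LIMIT) }
}
```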
@ -85,13 +91,6 @@ There are also some platform-specific ABI strings:
* `extern "vectorcall"` -- The `vectorcall` ABI -- corresponds to MSVC's
`__vectorcall` and clang's `__attribute__((vectorcall))`
Finally, there are some rustc-specific ABI strings:
* `extern "rust-intrinsic"` -- The ABI of rustc intrinsics.
* `extern "rust-call"` -- The ABI of the Fn::call trait functions.
* `extern "platform-intrinsic"` -- Specific platform intrinsics -- like, for
example, `sqrt` -- have this ABI. You should never have to deal with it.
## Variadic functions
Functions within external blocks may be variadic by specifying `...` after one
@ -165,6 +164,8 @@ extern {
[IDENTIFIER]: ../identifiers.md
[WebAssembly module]: https://webassembly.github.io/spec/core/syntax/modules.html
[functions]: functions.md
[statics]: static-items.md
[_Abi_]: functions.md
[_FunctionReturnType_]: functions.md
[_Generics_]: generics.md

View File

@ -8,7 +8,10 @@
> &nbsp;&nbsp; &nbsp;&nbsp; [_BlockExpression_]
>
> _FunctionQualifiers_ :\
> &nbsp;&nbsp; `const`<sup>?</sup> `unsafe`<sup>?</sup> (`extern` _Abi_<sup>?</sup>)<sup>?</sup>
> &nbsp;&nbsp; _AsyncConstQualifiers_<sup>?</sup> `unsafe`<sup>?</sup> (`extern` _Abi_<sup>?</sup>)<sup>?</sup>
>
> _AsyncConstQualifiers_ :\
> &nbsp;&nbsp; `async` | `const`
>
> _Abi_ :\
> &nbsp;&nbsp; [STRING_LITERAL] | [RAW_STRING_LITERAL]
@ -107,35 +110,73 @@ component after the function name. This might be necessary if there is not
sufficient context to determine the type parameters. For example,
`mem::size_of::<u32>() == 4`.
## Extern functions
## Extern function qualifier
Extern functions are part of Rust's foreign function interface, providing the
opposite functionality to [external blocks]. Whereas external
blocks allow Rust code to call foreign code, extern functions with bodies
defined in Rust code _can be called by foreign code_. They are defined in the
same way as any other Rust function, except that they have the `extern`
qualifier.
The `extern` function qualifier allows providing function _definitions_ that can
be called with a particular ABI:
```rust,ignore
extern "ABI" fn foo() { ... }
```
These are often used in combination with [external block] items which provide
function _declarations_ that can be used to call functions without providing
their _definition_:
```rust,ignore
extern "ABI" {
    fn foo(); /* no body */
}
unsafe { foo() }
```
When `"extern" Abi?*` is omitted from `FunctionQualifiers` in function items,
the ABI `"Rust"` is assigned. For example:
```rust
// Declares an extern fn, the ABI defaults to "C"
extern fn new_i32() -> i32 { 0 }
fn foo() {}
```
// Declares an extern fn with "stdcall" ABI
is equivalent to:
```rust
extern "Rust" fn foo() {}
```
Functions in Rust can be called by foreign code, and using an ABI that
differs from Rust makes it possible, for example, to provide functions that can be
called from other programming languages like C:
```rust
// Declares a function with the "C" ABI
extern "C" fn new_i32() -> i32 { 0 }
// Declares a function with the "stdcall" ABI
# #[cfg(target_arch = "x86_64")]
extern "stdcall" fn new_i32_stdcall() -> i32 { 0 }
```
Unlike normal functions, extern fns have type `extern "ABI" fn()`. This is the
same type as the functions declared in an extern block.
Just as with [external block] items, when the `extern` keyword is used and the `"ABI"`
is omitted, the ABI used defaults to `"C"`. That is, this:
```rust
# extern fn new_i32() -> i32 { 0 }
extern fn new_i32() -> i32 { 0 }
let fptr: extern fn() -> i32 = new_i32;
```
is equivalent to:
```rust
extern "C" fn new_i32() -> i32 { 0 }
let fptr: extern "C" fn() -> i32 = new_i32;
```
As non-Rust calling conventions do not support unwinding, unwinding past the end
of an extern function will cause the process to abort. In LLVM, this is
implemented by executing an illegal instruction.
Functions with an ABI that differs from `"Rust"` do not support unwinding in the
exact same way that Rust does. Therefore, unwinding past the end of functions
with such ABIs causes the process to abort.
> **Note**: The LLVM backend of the `rustc` implementation
aborts the process by executing an illegal instruction.
## Const functions
@ -169,7 +210,7 @@ Exhaustive list of permitted structures in const functions:
* lifetimes
* `Sized` or [`?Sized`]
This means that `<T: 'a + ?Sized>`, `<T: 'b + Sized>` and `<T>`
This means that `<T: 'a + ?Sized>`, `<T: 'b + Sized>`, and `<T>`
are all permitted.
This rule also applies to type parameters of impl blocks that
@ -189,6 +230,104 @@ Exhaustive list of permitted structures in const functions:
the following unsafe operations:
* calls to const unsafe functions
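As a small, illustrative example of a const function used in a const context (not taken from the original text):
```rust
const fn square(x: u32) -> u32 {
    x * x
}

// Usable at compile time as well as at runtime.
const AREA: u32 = square(4);

fn main() {
    assert_eq!(AREA, 16);
    assert_eq!(square(5), 25);
}
```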
## Async functions
Functions may be qualified as async, and this can also be combined with the
`unsafe` qualifier:
```rust,edition2018
async fn regular_example() { }
async unsafe fn unsafe_example() { }
```
Async functions do no work when called: instead, they
capture their arguments into a future. When polled, that future will
execute the function's body.
An async function is roughly equivalent to a function
that returns [`impl Future`] and with an [`async move` block][async-blocks] as
its body:
```rust,edition2018
// Source
async fn example(x: &str) -> usize {
    x.len()
}
```
is roughly equivalent to:
```rust,edition2018
# use std::future::Future;
// Desugared
fn example<'a>(x: &'a str) -> impl Future<Output = usize> + 'a {
    async move { x.len() }
}
```
The actual desugaring is more complex:
- The return type in the desugaring is assumed to capture all lifetime
parameters from the `async fn` declaration. This can be seen in the
desugared example above, which explicitly outlives, and hence
captures, `'a`.
- The [`async move` block][async-blocks] in the body captures all function
parameters, including those that are unused or bound to a `_`
pattern. This ensures that function parameters are dropped in the
same order as they would be if the function were not async, except
that the drop occurs when the returned future has been fully
awaited.
For more information on the effect of async, see [`async` blocks][async-blocks].
[async-blocks]: ../expressions/block-expr.md#async-blocks
[`impl Future`]: ../types/impl-trait.md
> **Edition differences**: Async functions are only available beginning with
> Rust 2018.
### Combining `async` and `unsafe`
It is legal to declare a function that is both async and unsafe. The
resulting function is unsafe to call and (like any async function)
returns a future. This future is just an ordinary future and thus an
`unsafe` context is not required to "await" it:
```rust,edition2018
// Returns a future that, when awaited, dereferences `x`.
//
// Soundness condition: `x` must be safe to dereference until
// the resulting future is complete.
async unsafe fn unsafe_example(x: *const i32) -> i32 {
    *x
}

async fn safe_example() {
    // An `unsafe` block is required to invoke the function initially:
    let p = 22;
    let future = unsafe { unsafe_example(&p) };

    // But no `unsafe` block required here. This will
    // read the value of `p`:
    let q = future.await;
}
```
Note that this behavior is a consequence of the desugaring to a
function that returns an `impl Future` -- in this case, the function
we desugar to is an `unsafe` function, but the return value remains
the same.
Unsafe is used on an async function in precisely the same way that it
is used on other functions: it indicates that the function imposes
some additional obligations on its caller to ensure soundness. As in any
other unsafe function, these conditions may extend beyond the initial
call itself -- in the snippet above, for example, the `unsafe_example`
function took a pointer `x` as argument, and then (when awaited)
dereferenced that pointer. This implies that `x` would have to be
valid until the future is finished executing, and it is the caller's
responsibility to ensure that.
## Attributes on functions
[Outer attributes][attributes] are allowed on functions. [Inner
@ -221,7 +360,7 @@ attributes macros.
[_Type_]: ../types.md#type-expressions
[_WhereClause_]: generics.md#where-clauses
[const context]: ../const_eval.md#const-context
[external blocks]: external-blocks.md
[external block]: external-blocks.md
[path]: ../paths.md
[block]: ../expressions/block-expr.md
[variables]: ../variables.md
@ -243,3 +382,4 @@ attributes macros.
[`export_name`]: ../abi.md#the-export_name-attribute
[`link_section`]: ../abi.md#the-link_section-attribute
[`no_mangle`]: ../abi.md#the-no_mangle-attribute
[external_block_abi]: external-blocks.md#abi

View File

@ -20,7 +20,7 @@
> _TypeParam_ :\
> &nbsp;&nbsp; [_OuterAttribute_]<sup>?</sup> [IDENTIFIER] ( `:` [_TypeParamBounds_]<sup>?</sup> )<sup>?</sup> ( `=` [_Type_] )<sup>?</sup>
Functions, type aliases, structs, enumerations, unions, traits and
Functions, type aliases, structs, enumerations, unions, traits, and
implementations may be *parameterized* by types and lifetimes. These parameters
are listed in angle <span class="parenthetical">brackets (`<...>`)</span>,
usually immediately after the name of the item and before its definition. For
@ -34,7 +34,7 @@ trait A<U> {}
struct Ref<'a, T> where T: 'a { r: &'a T }
```
[References], [raw pointers], [arrays], [slices][arrays], [tuples] and
[References], [raw pointers], [arrays], [slices][arrays], [tuples], and
[function pointers] have lifetime or type parameters as well, but are not
referred to with path syntax.
@ -64,7 +64,7 @@ parameters.
Bounds that don't use the item's parameters or higher-ranked lifetimes are
checked when the item is defined. It is an error for such a bound to be false.
[`Copy`], [`Clone`] and [`Sized`] bounds are also checked for certain generic
[`Copy`], [`Clone`], and [`Sized`] bounds are also checked for certain generic
types when defining the item. It is an error to have `Copy` or `Clone` as a
bound on a mutable reference, [trait object], or [slice][arrays], or `Sized` as a
bound on a trait object or slice.
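An illustrative sketch of a bound rejected at the definition site (the item is hypothetical; the bound mentions none of the item's own parameters, so it is checked immediately):
```rust,compile_fail
// `&mut i32` is never `Copy`, so this definition is an error even though
// the function is never called.
fn rejected() where &'static mut i32: Copy {}
```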

View File

@ -151,7 +151,7 @@ fn test() {
}
```
As you could see, in many aspects (except for layouts, safety and ownership)
As you can see, in many aspects (except for layouts, safety, and ownership)
unions behave exactly like structs, largely as a consequence of inheriting
their syntactic shape from structs. This is also true for many unmentioned
aspects of Rust language (such as privacy, name resolution, type inference,

View File

@ -129,8 +129,8 @@ fn main() {}
> use ::foo::baz::foobaz;
> ```
>
> The 2015 edition does not allow use declarations to reference the [extern
> prelude]. Thus [`extern crate`] declarations are still required in 2015 to
> The 2015 edition does not allow use declarations to reference the [extern prelude].
> Thus [`extern crate`] declarations are still required in 2015 to
> reference an external crate in a use declaration. Beginning with the 2018
> edition, use declarations can specify an external crate dependency the same
> way `extern crate` can.
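A hedged sketch of the 2015-edition requirement, assuming a dependency named `serde` is declared in `Cargo.toml`:
```rust,ignore
// 2015 edition: the crate must first be brought in with `extern crate`
// before a use declaration can reference it.
extern crate serde;
use serde::Serialize;
```
On the 2018 edition the `extern crate` item can simply be dropped, and `use serde::Serialize;` works on its own.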

View File

@ -6,7 +6,7 @@ compiler can infer a sensible default choice.
## Lifetime elision in functions
In order to make common patterns more ergonomic, lifetime arguments can be
*elided* in [function item], [function pointer] and [closure trait] signatures.
*elided* in [function item], [function pointer], and [closure trait] signatures.
The following rules are used to infer lifetime parameters for elided lifetimes.
It is an error to elide lifetime parameters that cannot be inferred. The
placeholder lifetime, `'_`, can also be used to have a lifetime inferred in the

Some files were not shown because too many files have changed in this diff Show More